[
  {
    "path": "Images_for_readme/README.md",
    "content": "\n"
  },
  {
    "path": "LICENSE",
    "content": "MIT License\n\nCopyright (c) 2018 陈潇凯\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
  },
  {
    "path": "README.md",
    "content": "# Action Recognition Zoo\nCodes for popular action recognition models, written based on pytorch, verified on the [something-something](https://www.twentybn.com/datasets/something-something) dataset. This code is built on top of the [TRN-pytorch](https://github.com/metalbubble/TRN-pytorch).\n\n**Note** The main purpose of this repositoriy is to go through several methods and get familiar with their pipelines.\n\n**Note**: always use git clone --recursive https://github.com/coderSkyChen/Action_Recognition_Zoo to clone this project Otherwise you will not be able to use the inception series CNN architecture.\n\n## Depencies\n- Opencv-2.4.13 or some greater version that has tvl1 api for the computing of optical flow.\n- Pytorch-0.2.0_3\n- Tensorflow-1.3.1，this is only for the using of tensorboard, it's ok without this, but you need to comment the corresponding codes.\n\n## Data preparation\n### Dataset\n- **Download** the [something-something](https://www.twentybn.com/datasets/something-something) dataset. Decompress them into some folder.\n- Note that this dataset contains 108,499 videos and each video is presented in JPG images. The JPG images were extracted from the orginal videos at 12 frames per seconds.\n- The temporal evolution in videos is important for this dataset, so it's hard for some classic models which pay attention to short motion such as: Two-Stream Convolutional Networks for Action Recognition in Videos, NIPS 2014.\n### Prepare optical flow using Opencv\nNote that optical flow is an important modal feature in two-stream series methods, which contains the motion information of videos.\n\nSince there only rgb frames in the official dataset, we need compute optical flow by ourselves.\n\nI apply a TV-L1 optical flow algorithm, pixel values are truncated to the range \\[-20, 20\\], then rescaled between 0 and 255, each optical flow has two channels representing horizontal and vertical components. 
Note that the fps of the original dataset is 12, which is too high for optical flow computation in practice, so I sample frames at 6 fps.\n\n- The command to compute optical flow:\n```\ncd optical_flow\nmake bin                          #for cpu\nmake -f gpu_makefile gpu_bin      #for gpu\n\n./bin          #for cpu\n./gpu_bin      #for gpu\n```\nBefore using the code you should modify the paths in main.cpp or gpu_main.cpp.\n\n### Generate the meta files\n```\npython process_dataset.py\n```\n\n# Models\nBefore using the code you should modify the paths to your own. The test time for one video is measured on one K80.\n## Two-stream action recognition\n**Main Reference Paper**: [Two-stream convolutional networks for action recognition in videos](http://papers.nips.cc/paper/5353-two-stream-convolutional)\n\n- Base CNN: BN-Inception pretrained on ImageNet.\n- Partial BN and cross-modality tricks are used in the code.\n- Spatial stream: its input is a single RGB frame.\n- Temporal stream: its input is a stack of optical flow frames.\n### Training\n- Spatial CNN: a single RGB frame is randomly selected from a video, which is equivalent to image classification; the input has 3 channels.\n- Temporal CNN: 5 consecutive stacked optical flow frames are selected from a video; the input has 5*2 channels (x and y components).\n\n- The command to train models:\n```\ntrain for spatial stream:\npython main.py TwoStream RGB two-stream-rgb --arch BNInception --batch_size 256 --lr 0.002\ntrain for temporal stream:\npython main.py TwoStream Flow two-stream-flow --arch BNInception --batch_size 256 --lr 0.0005\n```\n### Testing on validation set\nAt test time, given a video, I sample a fixed number of frames (25 for the spatial stream and 8 for the temporal stream in my experiments) with equal temporal spacing between them. From each of these frames I then obtain 10 ConvNet inputs by cropping and flipping the four corners and the center of the frame. 
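This 10-input procedure can be sketched as follows (a NumPy sketch assuming 224-pixel crops on H x W x C frames; the repository's transforms are the actual implementation, and ten_crop is a hypothetical helper):

```python
import numpy as np

def ten_crop(frame, crop=224):
    """Return 10 views of an H x W x C frame: the four corner crops and the
    center crop, each paired with its horizontal flip."""
    h, w = frame.shape[:2]
    corners = [(0, 0), (0, w - crop), (h - crop, 0), (h - crop, w - crop),
               ((h - crop) // 2, (w - crop) // 2)]  # 4 corners + center
    crops = []
    for top, left in corners:
        c = frame[top:top + crop, left:left + crop]
        crops.append(c)
        crops.append(c[:, ::-1])  # horizontal flip
    return crops

frame = np.zeros((256, 340, 3), dtype=np.uint8)  # illustrative frame size
crops = ten_crop(frame)
print(len(crops))  # 10
```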
The class scores for the whole video are then obtained by averaging the scores across the sampled frames and the crops therein.\n\n- The command to test models:\n```\ntest for spatial stream:\npython test_models.py --model TwoStream --modality RGB --weights TwoStream_RGB_BNInception_best.pth.tar --train_id two-stream-rgb --save_scores rgb.npz --arch BNInception --test_segments 25\n\ntest for temporal stream:\npython test_models.py --model TwoStream --modality Flow --weights TwoStream_Flow_BNInception_best.pth.tar --train_id two-stream-flow --save_scores flow.npz --arch BNInception --test_segments 25\n\n```\nAfter running the test code, we obtain the accuracy on the validation set, and the class probabilities are saved in npz files, which are used for late fusion.\n\n```\nfusion: combine spatial stream and temporal stream results.\npython average_scores.py\n```\n\n## Temporal Segment Networks\n**Main Reference Paper**: [Temporal Segment Networks: Towards Good Practices for Deep Action Recognition](https://arxiv.org/abs/1608.00859)\n\n\n- Base CNN: BN-Inception pretrained on ImageNet.\n- Partial BN and cross-modality tricks are used in the code.\n- Spatial stream: its input is k RGB frames, where k is the number of segments.\n- Temporal stream: its input is k stacks of optical flow frames.\n- The consensus function I have implemented is averaging.\n\n### Training\n\n```\ntrain spatial stream:\npython main.py TSN RGB tsn-rgb --arch BNInception --batch_size 128 --lr 0.001 --num_segments 3\n\ntrain temporal stream:\npython main.py TSN Flow tsn-flow --arch BNInception --batch_size 128 --lr 0.0007 --num_segments 3\n```\n### Testing on validation set\nNote that in the testing phase k equals 1, following the paper and its official code. 
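For intuition, the average consensus over k training segments can be sketched as follows (shapes are illustrative: k segments, 174 classes; this stands in for the consensus function mentioned above, not the repository's exact code):

```python
import numpy as np

# per-segment class scores for one video: one row per segment
k, num_class = 3, 174
rng = np.random.default_rng(0)
segment_scores = rng.normal(size=(k, num_class))

# average consensus: the video-level score is the mean over segments,
# and the prediction is the argmax over classes
video_score = segment_scores.mean(axis=0)
pred = int(np.argmax(video_score))
print(video_score.shape, pred)
```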
The segment mechanism is therefore only used in the training phase.\n```\ntest spatial stream:\npython test_models.py --model TSN --modality RGB --weights TSN_RGB_BNInception_best.pth.tar --train_id tsn-rgb --save_scores rgb.npz --arch BNInception --test_segments 25\n\ntest temporal stream:\npython test_models.py --model TSN --modality Flow --weights TSN_Flow_BNInception_best.pth.tar --train_id tsn-flow --save_scores flow.npz --arch BNInception --test_segments 25\n\nfusion:\npython average_scores.py   # modify the paths to your own\n```\n\n## Pretrained C3D: 3D Convolutional Networks\n**Main Reference Paper**: [Learning Spatiotemporal Features with 3D Convolutional Networks](https://arxiv.org/abs/1412.0767)\n\n- Fine-tune the model pretrained on Sports-1M; the pretrained model is uploaded to Baidu Cloud: [link](https://pan.baidu.com/s/1A-iAn4x45CHFgs7caOAFZw)\n\n### Training\n```\npython main.py C3D RGB c3d-rgb --arch BNInception --batch_size 32 --lr 0.0001 --num_segments 1 --lr_steps 2 5 10 20 --factor 0.5\n```\n\n### Testing\n```\npython test_models.py --model C3D --modality RGB --weights C3D_RGB_BNInception_best.pth.tar --train_id c3d-rgb --save_scores rgb.npz --test_segments 5 --test_crops 1\n```\n### Results on validation set\n- C3D seems faster than the previous methods, but its input size is `112*112` vs `224*224` for the two-stream models.\n- The result is not good. I have found it hard to train a 3D CNN on this difficult dataset, mainly because my slow GPU drags out the training phase, which makes it hard to choose proper hyperparameters on my machine. Still, this code works and will give you a **quick start**.\n\n## I3D\n**Main Reference Paper**: [Quo Vadis, Action Recognition? 
A New Model and the Kinetics Dataset](https://arxiv.org/abs/1705.07750)\n\n- The code for the I3D model is based on [hassony2](https://github.com/hassony2/kinetics_i3d_pytorch).\n- Training is too slow to report results on Something-Something, but this code is a useful starting point.\n- The Kinetics-pretrained model is uploaded to Baidu Cloud: [link](https://pan.baidu.com/s/18pfAM2fYVsA6KxhX4A_pMQ)\n### Training\n```\npython main.py I3D RGB i3d-rgb --arch I3D --batch_size 32 --lr 0.002 --num_segments 1 --lr_steps 2 10 20 --factor 0.5\n```\n\n"
  },
  {
    "path": "average_scores.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\nimport numpy as np\nimport pdb\n\ndef valid():\n    files_scores = ['/home/mcg/cxk/action-recognition-zoo/results/tsn-flow/output/flow.npz', '/home/mcg/cxk/action-recognition-zoo/results/tsn-rgb/output/rgb.npz']\n\n    allsum = np.zeros([11522, 174])\n    labels = []\n    for filename in files_scores:\n        print(filename)\n        data = np.load(filename)\n        scores = data['scores']\n        # print(scores.shape)\n        ss = scores[:, 0]\n        ll = scores[:, 1]\n        labels.append(ll)\n        ss = [x.reshape(174) for x in ss]\n\n        allsum += ss\n\n    preds = np.argmax(allsum,axis=1)\n\n    num_correct = np.sum(preds == labels)\n    acc = num_correct * 1.0 / preds.shape[0]\n    print('acc=%.3f' % (acc))\n\n\n\nif __name__ == '__main__':\n    valid()\n"
  },
  {
    "path": "dataset.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\nimport torch.utils.data as data\n\nfrom PIL import Image\nimport os\nimport os.path\nimport numpy as np\nfrom numpy.random import randint\nimport random\n\n\nclass VideoRecord(object):\n    def __init__(self, row):\n        self._data = row\n\n    @property\n    def path(self):\n        return self._data[0]\n\n    @property\n    def num_frames(self):\n        return int(self._data[1])\n\n    @property\n    def label(self):\n        if len(self._data) == 2:\n            return -1\n        else:\n            return int(self._data[2])\n\n\nclass TwoStreamDataSet(data.Dataset):\n    def __init__(self, root_path, list_file, num_segments=3,\n                 new_length=1, modality='RGB',\n                 image_tmpl='img_{:05d}.jpg', transform=None,\n                 random_shift=True, test_mode=False):\n\n        self.root_path = root_path\n        self.list_file = list_file\n        self.new_length = new_length\n        self.modality = modality\n        self.image_tmpl = image_tmpl\n        self.transform = transform\n        self.random_shift = random_shift\n        self.test_mode = test_mode\n        self.num_segments = num_segments\n\n        self._parse_list()\n\n    def _load_image(self, directory, idx):\n        if self.modality == 'RGB':\n            try:\n                return [Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format(idx))).convert('RGB')]\n            except Exception:\n                print('error loading image:', os.path.join(self.root_path, directory, self.image_tmpl.format(idx)))\n                return [Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format(1))).convert('RGB')]\n        elif self.modality == 'Flow':\n            x_img = Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format('x', idx))).convert('L')\n            y_img = 
Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format('y', idx))).convert('L')\n            return [x_img, y_img]\n\n    def _parse_list(self):\n        # keep only videos with at least 3 frames;\n        # usually each row is [video_id, num_frames, class_idx]\n        if not self.test_mode:\n            tmp = [x.strip().split(' ') for x in open(self.list_file)]\n            if self.modality == 'Flow':\n                for item in tmp:\n                    item[1] = int(item[1]) // 2\n            tmp = [item for item in tmp if int(item[1]) >= 3]\n            self.video_list = [VideoRecord(item) for item in tmp]\n            print('video number:%d' % (len(self.video_list)))\n        else:\n            tmp = [x.strip().split() for x in open(self.list_file)]\n            if self.modality == 'Flow':\n                for item in tmp:\n                    item[1] = int(item[1]) // 2\n            # tmp = [item for item in tmp if int(item[1]) >= 3]\n            self.video_list = [VideoRecord(item) for item in tmp]\n            print('video number:%d' % (len(self.video_list)))\n\n    def _get_val_indices(self, record):\n        if record.num_frames > self.num_segments + self.new_length - 1:\n            tick = (record.num_frames - self.new_length + 1) / float(self.num_segments)\n            offsets = np.array([int(tick / 2.0 + tick * x) for x in range(self.num_segments)])\n        else:\n            # offsets = np.zeros((self.num_segments,))\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, size=self.num_segments))\n        if self.modality == 'Flow':\n            offsets = offsets * 2 + 1\n        else:\n            offsets += 1\n        return offsets\n\n    def __getitem__(self, index):\n        record = self.video_list[index]\n        # check this is a legit video folder\n        if self.modality == 'RGB':\n            while not os.path.exists(os.path.join(self.root_path, record.path, self.image_tmpl.format(1))):\n                
print(os.path.join(self.root_path, record.path, self.image_tmpl.format(1)) + ' does not exist, skipping')\n                index = np.random.randint(len(self.video_list))\n                record = self.video_list[index]\n        else:\n            while not os.path.exists(os.path.join(self.root_path, record.path, self.image_tmpl.format('x', 1))):\n                print(\n                    os.path.join(self.root_path, record.path, self.image_tmpl.format('x', 1)) + ' does not exist, skipping')\n                index = np.random.randint(len(self.video_list))\n                record = self.video_list[index]\n\n        if not self.test_mode:\n            sample_indice = [randint(low=1, high=record.num_frames + 2 - self.new_length)]\n            if self.modality == 'Flow':\n                sample_indice = [x * 2 - 1 for x in sample_indice]  # flow index 1 3 5 7 ...\n        else:\n            sample_indice = self._get_val_indices(record)\n\n        return self.get(record, sample_indice)\n\n    def get(self, record, indice):\n\n        images = list()\n\n        for seg_ind in indice:\n            p = int(seg_ind)\n            for i in range(self.new_length):  # for optical flow, gather a volume starting at seg_ind\n                img = self._load_image(record.path, p)\n                images.extend(img)\n                if p < record.num_frames:\n                    if self.modality == 'RGB':\n                        p += 1\n                    else:\n                        p += 2\n\n        # one image: H*W*C\n        process_data = self.transform(images)\n        return process_data, record.label\n\n    def __len__(self):\n        return len(self.video_list)\n\n\nclass TSNDataSet(data.Dataset):\n    def __init__(self, root_path, list_file,\n                 num_segments=3, new_length=1, modality='RGB',\n                 image_tmpl='img_{:05d}.jpg', transform=None,\n                 random_shift=True, test_mode=False):\n\n        self.root_path = root_path\n        
self.list_file = list_file\n        self.num_segments = num_segments\n        self.new_length = new_length\n        self.modality = modality\n        self.image_tmpl = image_tmpl\n        self.transform = transform\n        self.random_shift = random_shift\n        self.test_mode = test_mode\n\n        self._parse_list()\n\n    def _load_image(self, directory, idx):\n        if self.modality == 'RGB':\n            try:\n                return [Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format(idx))).convert('RGB')]\n            except Exception:\n                print('error loading image:', os.path.join(self.root_path, directory, self.image_tmpl.format(idx)))\n                return [Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format(1))).convert('RGB')]\n        elif self.modality == 'Flow':\n            x_img = Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format('x', idx))).convert('L')\n            y_img = Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format('y', idx))).convert('L')\n            return [x_img, y_img]\n\n    def _parse_list(self):\n        # keep only videos with at least 3 frames;\n        # usually each row is [video_id, num_frames, class_idx]\n        if not self.test_mode:\n            tmp = [x.strip().split(' ') for x in open(self.list_file)]\n            if self.modality == 'Flow':\n                for item in tmp:\n                    item[1] = int(item[1]) // 2\n            tmp = [item for item in tmp if int(item[1]) >= 3]\n            self.video_list = [VideoRecord(item) for item in tmp]\n            print('video number:%d' % (len(self.video_list)))\n        else:\n            tmp = [x.strip().split() for x in open(self.list_file)]\n            if self.modality == 'Flow':\n                for item in tmp:\n                    item[1] = int(item[1]) // 2\n            # tmp = [item for item in tmp if int(item[1]) >= 3]\n            self.video_list = 
[VideoRecord(item) for item in tmp]\n            print('video number:%d' % (len(self.video_list)))\n\n    def _sample_indices(self, record):\n        \"\"\"\n\n        :param record: VideoRecord\n        :return: list\n        \"\"\"\n\n        average_duration = (record.num_frames - self.new_length + 1) // self.num_segments\n        if average_duration > 0:  # random sample\n            offsets = np.multiply(list(range(self.num_segments)), average_duration) + randint(average_duration,\n                                                                                              size=self.num_segments)\n        elif record.num_frames > self.num_segments:  # [0,0,1,1,1,2,2,3]     dense sample\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, size=self.num_segments))\n        else:\n            offsets = np.zeros((self.num_segments,))\n        if self.modality == 'Flow':\n            offsets = offsets * 2 + 1\n        else:\n            offsets += 1\n        return offsets\n\n    def _get_val_indices(self, record):\n        if record.num_frames > self.num_segments + self.new_length - 1:\n            tick = (record.num_frames - self.new_length + 1) / float(self.num_segments)\n            offsets = np.array([int(tick / 2.0 + tick * x) for x in range(self.num_segments)])\n        else:\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, size=self.num_segments))\n        if self.modality == 'Flow':\n            offsets = offsets * 2 + 1\n        else:\n            offsets += 1\n        return offsets\n\n    def _get_test_indices(self, record):\n        if record.num_frames > self.num_segments + self.new_length - 1:\n            tick = (record.num_frames - self.new_length + 1) / float(self.num_segments)\n            offsets = np.array([int(tick / 2.0 + tick * x) for x in range(self.num_segments)])\n        else:\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, 
size=self.num_segments))\n        if self.modality == 'Flow':\n            offsets = offsets * 2 + 1\n        else:\n            offsets += 1\n        return offsets\n\n    def __getitem__(self, index):\n        record = self.video_list[index]\n        # check this is a legit video folder\n        if self.modality == 'RGB':\n            while not os.path.exists(os.path.join(self.root_path, record.path, self.image_tmpl.format(1))):\n                print(os.path.join(self.root_path, record.path, self.image_tmpl.format(1)) + ' does not exist, skipping')\n                index = np.random.randint(len(self.video_list))\n                record = self.video_list[index]\n        else:\n            while not os.path.exists(os.path.join(self.root_path, record.path, self.image_tmpl.format('x', 1))):\n                print(\n                    os.path.join(self.root_path, record.path, self.image_tmpl.format('x', 1)) + ' does not exist, skipping')\n                index = np.random.randint(len(self.video_list))\n                record = self.video_list[index]\n\n        if not self.test_mode:\n            segment_indices = self._sample_indices(record) if self.random_shift else self._get_val_indices(record)\n        else:\n            segment_indices = self._get_test_indices(record)\n\n        return self.get(record, segment_indices)\n\n    def get(self, record, indices):\n\n        images = list()\n\n        for seg_ind in indices:\n            p = int(seg_ind)\n            for i in range(self.new_length):  # for optical flow, gather a volume starting at seg_ind\n                img = self._load_image(record.path, p)\n                images.extend(img)\n                if p < record.num_frames:\n                    if self.modality == 'RGB':\n                        p += 1\n                    else:\n                        p += 2\n\n        # one image: H*W*C\n        process_data = self.transform(images)\n        return process_data, record.label\n\n    def __len__(self):\n        
return len(self.video_list)\n\nclass C3DDataSet(data.Dataset):\n    def __init__(self, root_path, list_file,\n                 num_segments=3, new_length=1, modality='RGB',\n                 image_tmpl='img_{:05d}.jpg', transform=None,\n                 random_shift=True, test_mode=False):\n\n        self.root_path = root_path\n        self.list_file = list_file\n        self.num_segments = num_segments\n        self.new_length = new_length\n        self.modality = modality\n        self.image_tmpl = image_tmpl\n        self.transform = transform\n        self.random_shift = random_shift\n        self.test_mode = test_mode\n\n        self._parse_list()\n\n    def _load_image(self, directory, idx):\n        if self.modality == 'RGB':\n            try:\n                return [Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format(idx))).convert('RGB')]\n            except Exception:\n                print('error loading image:', os.path.join(self.root_path, directory, self.image_tmpl.format(idx)))\n                return [Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format(1))).convert('RGB')]\n        elif self.modality == 'Flow':\n            x_img = Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format('x', idx))).convert('L')\n            y_img = Image.open(os.path.join(self.root_path, directory, self.image_tmpl.format('y', idx))).convert('L')\n            return [x_img, y_img]\n\n    def _parse_list(self):\n        # keep only videos with at least 3 frames;\n        # usually each row is [video_id, num_frames, class_idx]\n        if not self.test_mode:\n            tmp = [x.strip().split(' ') for x in open(self.list_file)]\n            if self.modality == 'Flow':\n                for item in tmp:\n                    item[1] = int(item[1]) // 2\n            tmp = [item for item in tmp if int(item[1]) >= 3]\n            self.video_list = [VideoRecord(item) for item in tmp]\n            print('video number:%d' % 
(len(self.video_list)))\n        else:\n            tmp = [x.strip().split() for x in open(self.list_file)]\n            if self.modality == 'Flow':\n                for item in tmp:\n                    item[1] = int(item[1]) // 2\n            # tmp = [item for item in tmp if int(item[1]) >= 3]\n            self.video_list = [VideoRecord(item) for item in tmp]\n            print('video number:%d' % (len(self.video_list)))\n\n    def _sample_indices(self, record):\n        \"\"\"\n\n        :param record: VideoRecord\n        :return: list\n        \"\"\"\n\n        average_duration = (record.num_frames - self.new_length + 1) // self.num_segments\n        if average_duration > 0:  # random sample\n            offsets = np.multiply(list(range(self.num_segments)), average_duration) + randint(average_duration,\n                                                                                              size=self.num_segments)\n        elif record.num_frames > self.num_segments:  # [0,0,1,1,1,2,2,3]     dense sample\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, size=self.num_segments))\n        else:\n            offsets = np.zeros((self.num_segments,))\n        if self.modality == 'Flow':\n            offsets = offsets * 2 + 1\n        else:\n            offsets += 1\n        return offsets\n\n    def _get_val_indices(self, record):\n        if record.num_frames > self.num_segments + self.new_length - 1:\n            tick = (record.num_frames - self.new_length + 1) / float(self.num_segments)\n            offsets = np.array([int(tick / 2.0 + tick * x) for x in range(self.num_segments)])\n        else:\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, size=self.num_segments))\n        if self.modality == 'Flow':\n            offsets = offsets * 2 + 1\n        else:\n            offsets += 1\n        return offsets\n\n    def _get_test_indices(self, record):\n        if record.num_frames > self.num_segments 
+ self.new_length - 1:\n            tick = (record.num_frames - self.new_length + 1) / float(self.num_segments)\n            offsets = np.array([int(tick / 2.0 + tick * x) for x in range(self.num_segments)])\n        else:\n            offsets = np.sort(randint(record.num_frames - self.new_length + 1, size=self.num_segments))\n\n        offsets += 1\n        return offsets\n\n    def __getitem__(self, index):\n        record = self.video_list[index]\n        # check this is a legit video folder\n        if self.modality == 'RGB':\n            while not os.path.exists(os.path.join(self.root_path, record.path, self.image_tmpl.format(1))):\n                print(os.path.join(self.root_path, record.path, self.image_tmpl.format(1)) + ' does not exist, skipping')\n                index = np.random.randint(len(self.video_list))\n                record = self.video_list[index]\n        else:\n            while not os.path.exists(os.path.join(self.root_path, record.path, self.image_tmpl.format('x', 1))):\n                print(\n                    os.path.join(self.root_path, record.path, self.image_tmpl.format('x', 1)) + ' does not exist, skipping')\n                index = np.random.randint(len(self.video_list))\n                record = self.video_list[index]\n\n        if not self.test_mode:\n            if record.num_frames > self.new_length:\n                segment_indices = [randint(low=1, high=record.num_frames + 2 - self.new_length)]\n            else:\n                segment_indices = [1]\n\n        else:\n            segment_indices = self._get_test_indices(record)\n\n        return self.get(record, segment_indices)\n\n    def get(self, record, indices):\n\n        images = list()\n\n        for seg_ind in indices:\n            p = int(seg_ind)\n            for i in range(self.new_length):\n                img = self._load_image(record.path, p)\n                images.extend(img)\n                if p < record.num_frames:\n                    if self.modality == 
'RGB':\n                        p += 1\n\n\n        # one image: H*W*C\n        process_data = self.transform(images)\n        return process_data, record.label\n\n    def __len__(self):\n        return len(self.video_list)\n"
  },
  {
    "path": "main.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\ntry:\n    import tensorflow as tf\nexcept ImportError:\n    print(\"Tensorflow not installed; No tensorboard logging.\")\n    tf = None\n\nimport argparse\nimport os\nimport time\nimport shutil\nimport torch\nimport torchvision\nimport torch.nn.parallel\nimport torch.backends.cudnn as cudnn\nimport torch.optim\nfrom torch.nn.utils import clip_grad_norm\n\nfrom dataset import *\nfrom models import *\nfrom transforms import *\nfrom opts import parser\n\n\ndef add_summary_value(writer, key, value, iteration):\n    summary = tf.Summary(value=[tf.Summary.Value(tag=key, simple_value=value)])\n    writer.add_summary(summary, iteration)\n\n\ndef return_something_path(modality):\n    filename_categories = '/home/mcg/cxk/dataset/somthing-something/category.txt'\n    if modality == 'RGB':\n        root_data = '/home/mcg/cxk/dataset/somthing-something/something-rgb'\n        filename_imglist_train = '/home/mcg/cxk/dataset/somthing-something/train_videofolder_rgb.txt'\n        filename_imglist_val = '/home/mcg/cxk/dataset/somthing-something/val_videofolder_rgb.txt'\n\n        prefix = '{:05d}.jpg'\n    else:\n        root_data = '/home/mcg/cxk/dataset/somthing-something/something-optical-flow'\n        filename_imglist_train = '/home/mcg/cxk/dataset/somthing-something/train_videofolder_flow.txt'\n        filename_imglist_val = '/home/mcg/cxk/dataset/somthing-something/val_videofolder_flow.txt'\n\n        prefix = '{:s}_{:05d}.jpg'\n\n    with open(filename_categories) as f:\n        lines = f.readlines()\n    categories = [item.rstrip() for item in lines]\n    return categories, filename_imglist_train, filename_imglist_val, root_data, prefix\n\n\nbest_prec1 = 0\n\ndef main():\n    global args, best_prec1\n    args = parser.parse_args()\n    assert len(args.train_id) > 0\n\n    check_rootfolders(args.train_id)\n    summary_w = tf and 
tf.summary.FileWriter(os.path.join('results', args.train_id, args.root_log))  #tensorboard\n\n    categories, args.train_list, args.val_list, args.root_path, prefix = return_something_path(args.modality)\n    num_class = len(categories)\n\n    args.store_name = '_'.join([args.model, args.modality, args.arch])\n    print('storing name: ' + args.store_name)\n\n    policies = -1\n    if args.model == 'TwoStream':\n        model = TwoStream(num_class, args.modality,\n                     base_model=args.arch, dropout=args.dropout,\n                     crop_num=1, partial_bn=not args.no_partialbn)\n        policies = model.get_optim_policies()\n\n    elif args.model == 'TSN':\n        model = TSN(num_class, args.num_segments, args.modality,\n                          base_model=args.arch, dropout=args.dropout,\n                          crop_num=1, partial_bn=not args.no_partialbn)\n        policies = model.get_optim_policies()\n\n    elif args.model == 'C3D':\n        model = C3D()\n        model_dict = model.state_dict()\n\n        pretrained_dict = torch.load('./model_zoo/c3d.pickle')\n\n        # 1. filter out unnecessary keys\n        pretrained_dict = {k: v for k, v in pretrained_dict.items() if k in model_dict}\n        # 2. overwrite entries in the existing state dict\n        model_dict.update(pretrained_dict)\n        # 3. 
load the new state dict\n        model.load_state_dict(model_dict)\n\n        print('c3d pretrained model loaded~')\n    else:\n        print('error!')\n        exit()\n\n    crop_size = model.crop_size\n    scale_size = model.scale_size\n    input_mean = model.input_mean\n    input_std = model.input_std\n    train_augmentation = model.get_augmentation()\n\n    model = torch.nn.DataParallel(model, device_ids=args.gpus).cuda()\n\n    if args.resume:\n        if os.path.isfile(args.resume):\n            print((\"=> loading checkpoint '{}'\".format(args.resume)))\n            checkpoint = torch.load(args.resume)\n            args.start_epoch = checkpoint['epoch']\n            best_prec1 = checkpoint['best_prec1']\n            model.load_state_dict(checkpoint['state_dict'])\n            print((\"=> loaded checkpoint '{}' (epoch {})\"\n                  .format(args.evaluate, checkpoint['epoch'])))\n        else:\n            print((\"=> no checkpoint found at '{}'\".format(args.resume)))\n\n    cudnn.benchmark = True\n\n    # Data loading code\n    if args.modality != 'RGBDiff':\n        normalize = GroupNormalize(input_mean, input_std)\n    else:\n        normalize = IdentityTransform()\n\n    if args.modality == 'RGB':\n        data_length = 1\n    elif args.modality in ['Flow', 'RGBDiff']:\n        data_length = 5\n\n    if args.modality == 'RGB' and args.model == 'C3D':\n        data_length = 16  # clip\n\n    if args.model == 'TwoStream':\n        datasettrain = TwoStreamDataSet(args.root_path, args.train_list,\n                   new_length=data_length,\n                   modality=args.modality,\n                   image_tmpl=prefix,\n                   transform=torchvision.transforms.Compose([\n                       train_augmentation,\n                       Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                       ToTorchFormatTensor(div=(args.arch not in ['BNInception', 'InceptionV3'])),\n                       normalize,\n         
          ]))\n\n        datasetval = TwoStreamDataSet(args.root_path, args.val_list,\n                   new_length=data_length,\n                   modality=args.modality,\n                   image_tmpl=prefix,\n                   random_shift=False,\n                   transform=torchvision.transforms.Compose([\n                       GroupScale(int(scale_size)),\n                       GroupCenterCrop(crop_size),\n                       Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                       ToTorchFormatTensor(div=(args.arch not in ['BNInception', 'InceptionV3'])),\n                       normalize,\n                   ]))\n    elif args.model == 'TSN':\n        datasettrain = TSNDataSet(args.root_path, args.train_list, args.num_segments,\n                                        new_length=data_length,\n                                        modality=args.modality,\n                                        image_tmpl=prefix,\n                                        transform=torchvision.transforms.Compose([\n                                            train_augmentation,\n                                            Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                                            ToTorchFormatTensor(div=(args.arch not in ['BNInception', 'InceptionV3'])),\n                                            normalize,\n                                        ]))\n\n        datasetval = TSNDataSet(args.root_path, args.val_list, args.num_segments,\n                                      new_length=data_length,\n                                      modality=args.modality,\n                                      image_tmpl=prefix,\n                                      random_shift=False,\n                                      transform=torchvision.transforms.Compose([\n                                          GroupScale(int(scale_size)),\n                                          
GroupCenterCrop(crop_size),\n                                          Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                                          ToTorchFormatTensor(div=(args.arch not in ['BNInception', 'InceptionV3'])),\n                                          normalize,\n                                      ]))\n    elif args.model == 'C3D':\n        datasettrain = C3DDataSet(args.root_path, args.train_list, 1,\n                                        new_length=data_length,\n                                        modality=args.modality,\n                                        image_tmpl=prefix,\n                                        transform=torchvision.transforms.Compose([\n                                            train_augmentation,\n                                            Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                                            ToTorchFormatTensor(div=(args.arch not in ['BNInception', 'InceptionV3', 'C3D'])),\n                                            normalize,\n                                        ]))\n\n        datasetval = C3DDataSet(args.root_path, args.val_list, 1,\n                                        new_length=data_length,\n                                        modality=args.modality,\n                                        image_tmpl=prefix,\n                                        random_shift=False,\n                                        transform=torchvision.transforms.Compose([\n                                            GroupScale(int(scale_size)),\n                                            GroupCenterCrop(crop_size),\n                                            Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                                            ToTorchFormatTensor(\n                                                div=(args.arch not in ['BNInception', 'InceptionV3', 'C3D'])),\n                                            
normalize,\n                                        ]))\n\n    trainvidnum = len(datasettrain)\n    valvidnum = len(datasetval)\n\n    train_loader = torch.utils.data.DataLoader(\n        datasettrain,\n        batch_size=args.batch_size, shuffle=True,\n        num_workers=args.workers, pin_memory=True)\n\n    val_loader = torch.utils.data.DataLoader(\n        datasetval,\n        batch_size=args.batch_size, shuffle=False,\n        num_workers=args.workers, pin_memory=True)\n\n    # define loss function (criterion) and optimizer\n    criterion = torch.nn.CrossEntropyLoss().cuda()\n\n    if policies != -1:\n        for group in policies:\n            print(('group: {} has {} params, lr_mult: {}, decay_mult: {}'.format(\n                group['name'], len(group['params']), group['lr_mult'], group['decay_mult'])))\n\n        optimizer = torch.optim.SGD(policies, args.lr, momentum=args.momentum, weight_decay=args.weight_decay)\n    else:\n        optimizer = torch.optim.SGD(model.parameters(), args.lr, momentum=args.momentum, weight_decay=args.weight_decay)\n\n    # log_training = open(os.path.join(args.root_log, '%s.csv' % args.store_name), 'w')\n    for epoch in range(args.start_epoch, args.epochs):\n        adjust_learning_rate(optimizer, epoch, args.lr_steps, args.factor, policies != -1)\n\n        # train for one epoch\n        train(train_loader, model, criterion, optimizer, epoch, trainvidnum, summary_w)\n\n        # evaluate on validation set\n        if (epoch + 1) % args.eval_freq == 0 or epoch == args.epochs - 1:\n            prec1 = validate(val_loader, model, criterion, (epoch + 1) * trainvidnum, summary_w)\n            # prec1 = validate(val_loader, model, criterion, (epoch + 1) * len(train_loader), summary_w)\n\n            # remember best prec@1 and save checkpoint\n            is_best = prec1 > best_prec1\n            best_prec1 = max(prec1, best_prec1)\n            save_checkpoint({\n                'epoch': epoch + 1,\n                'arch': 
args.arch,\n                'state_dict': model.state_dict(),\n                'best_prec1': best_prec1,\n            }, is_best)\n\n\ndef train(train_loader, model, criterion, optimizer, epoch, vidnums, summary_w):\n    batch_time = AverageMeter()\n    data_time = AverageMeter()\n    losses = AverageMeter()\n    top1 = AverageMeter()\n    top5 = AverageMeter()\n\n    if args.no_partialbn:\n        model.module.partialBN(False)\n    else:\n        # model.partialBN(True)\n        model.module.partialBN(True)\n\n    # switch to train mode\n    model.train()\n\n    samples_have_seen = epoch*vidnums\n\n    end = time.time()\n    for i, (input, target) in enumerate(train_loader):\n        # if i>5:\n        #     break\n        # measure data loading time\n        data_time.update(time.time() - end)\n\n        target = target.cuda(async=True)\n\n        input_var = torch.autograd.Variable(input)\n        target_var = torch.autograd.Variable(target)\n\n        bs = input_var.size(0)\n\n        # compute output\n        output = model(input_var)\n        loss = criterion(output, target_var)\n\n        # measure accuracy and record loss\n        prec1, prec5 = accuracy(output.data, target, topk=(1,5))\n        losses.update(loss.data[0], input.size(0))\n        top1.update(prec1[0], input.size(0))\n        top5.update(prec5[0], input.size(0))\n\n        # compute gradient and do SGD step\n        optimizer.zero_grad()\n\n        loss.backward()\n\n        if args.clip_gradient is not None:\n            total_norm = clip_grad_norm(model.parameters(), args.clip_gradient)\n            if total_norm > args.clip_gradient:\n                print(\"clipping gradient: {} with coef {}\".format(total_norm, args.clip_gradient / total_norm))\n\n        optimizer.step()\n\n        # measure elapsed time\n        batch_time.update(time.time() - end)\n        end = time.time()\n\n        samples_have_seen += bs\n\n        if i % args.print_freq == 0:\n            output = ('Epoch: 
[{0}][{1}/{2}], lr: {lr:.5f}\\t'\n                    'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\\t'\n                    'Data {data_time.val:.3f} ({data_time.avg:.3f})\\t'\n                    'Loss {loss.val:.4f} ({loss.avg:.4f})\\t'\n                    'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\\t'\n                    'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format(\n                        epoch, i, len(train_loader), batch_time=batch_time,\n                        data_time=data_time, loss=losses, top1=top1, top5=top5, lr=optimizer.param_groups[-1]['lr']))\n            print(output)\n            add_summary_value(summary_w, 'train_loss', losses.val, samples_have_seen)\n            add_summary_value(summary_w, 'train_Prec@1', top1.val, samples_have_seen)\n            add_summary_value(summary_w, 'train_Prec@5', top5.val, samples_have_seen)\n            add_summary_value(summary_w, 'train_Prec@1_mean', top1.avg, samples_have_seen)\n            add_summary_value(summary_w, 'train_Prec@5_mean', top5.avg, samples_have_seen)\n            add_summary_value(summary_w, 'lr', optimizer.param_groups[-1]['lr'], samples_have_seen)\n\n            # log.write(output + '\\n')\n            # log.flush()\n\n\n\ndef validate(val_loader, model, criterion, iter, summary_w):\n    batch_time = AverageMeter()\n    losses = AverageMeter()\n    top1 = AverageMeter()\n    top5 = AverageMeter()\n\n    # switch to evaluate mode\n    model.eval()\n\n    end = time.time()\n    for i, (input, target) in enumerate(val_loader):\n        # if i>5:\n        #     break\n        target = target.cuda(async=True)\n        input_var = torch.autograd.Variable(input, volatile=True)\n        target_var = torch.autograd.Variable(target, volatile=True)\n\n        # compute output\n        output = model(input_var)\n        loss = criterion(output, target_var)\n\n        # measure accuracy and record loss\n        prec1, prec5 = accuracy(output.data, target, topk=(1,5))\n\n        
losses.update(loss.data[0], input.size(0))\n        top1.update(prec1[0], input.size(0))\n        top5.update(prec5[0], input.size(0))\n\n        # measure elapsed time\n        batch_time.update(time.time() - end)\n        end = time.time()\n        \n        if i % args.print_freq == 0:\n            output = ('Test: [{0}/{1}]\\t'\n                  'Time {batch_time.val:.3f} ({batch_time.avg:.3f})\\t'\n                  'Loss {loss.val:.4f} ({loss.avg:.4f})\\t'\n                  'Prec@1 {top1.val:.3f} ({top1.avg:.3f})\\t'\n                  'Prec@5 {top5.val:.3f} ({top5.avg:.3f})'.format(\n                   i, len(val_loader), batch_time=batch_time, loss=losses,\n                   top1=top1, top5=top5))\n            print(output)            \n            # log.write(output + '\\n')\n            # log.flush()\n\n    output = ('Testing Results: Prec@1 {top1.avg:.3f} Prec@5 {top5.avg:.3f} Loss {loss.avg:.5f}'\n          .format(top1=top1, top5=top5, loss=losses))\n    print(output)\n\n    add_summary_value(summary_w, 'val_loss', losses.avg, iter)\n    add_summary_value(summary_w, 'val_Prec@1', top1.avg, iter)\n    add_summary_value(summary_w, 'val_Prec@5', top5.avg, iter)\n    \n    output_best = '\\nBest Prec@1: %.3f'%(best_prec1)\n    print(output_best)\n    # log.write(output + ' ' + output_best + '\\n')\n    # log.flush()\n\n    return top1.avg\n\n\ndef save_checkpoint(state, is_best, filename='checkpoint.pth.tar'):\n    torch.save(state, './results/%s/%s/%s_checkpoint.pth.tar' % (args.train_id, args.root_model, args.store_name))\n    if is_best:\n        shutil.copyfile('./results/%s/%s/%s_checkpoint.pth.tar' % (args.train_id, args.root_model, args.store_name), './results/%s/%s/%s_best.pth.tar' % (args.train_id, args.root_model, args.store_name))\n\nclass AverageMeter(object):\n    \"\"\"Computes and stores the average and current value\"\"\"\n    def __init__(self):\n        self.reset()\n\n    def reset(self):\n        self.val = 0\n        self.avg = 0\n  
      self.sum = 0\n        self.count = 0\n\n    def update(self, val, n=1):\n        self.val = val\n        self.sum += val * n\n        self.count += n\n        self.avg = self.sum / self.count\n\n\ndef adjust_learning_rate(optimizer, epoch, lr_steps, factor, with_policies=True):\n    \"\"\"Decays the initial learning rate by factor at each milestone in lr_steps\"\"\"\n    decay = factor ** (sum(epoch >= np.array(lr_steps)))\n    lr = args.lr * decay\n    decay = args.weight_decay\n    if with_policies:\n        for param_group in optimizer.param_groups:\n            param_group['lr'] = lr * param_group['lr_mult']\n            param_group['weight_decay'] = decay * param_group['decay_mult']\n    else:\n        for param_group in optimizer.param_groups:\n            param_group['lr'] = lr\n            param_group['weight_decay'] = decay\n\n\ndef accuracy(output, target, topk=(1,)):\n    \"\"\"Computes the precision@k for the specified values of k\"\"\"\n    maxk = max(topk)\n    batch_size = target.size(0)\n\n    _, pred = output.topk(maxk, 1, True, True)\n    pred = pred.t()\n    correct = pred.eq(target.view(1, -1).expand_as(pred))\n\n    res = []\n    for k in topk:\n        correct_k = correct[:k].view(-1).float().sum(0)\n        res.append(correct_k.mul_(100.0 / batch_size))\n    return res\n\ndef check_rootfolders(trainid):\n    \"\"\"Create the log, model, and output folders under ./results\"\"\"\n    folders_util = [args.root_log, args.root_model, args.root_output]\n    if not os.path.exists('./results'):\n        os.makedirs('./results')\n    for folder in folders_util:\n        if not os.path.exists(os.path.join('./results', trainid, folder)):\n            print('creating folder ' + folder)\n            os.makedirs(os.path.join('./results', trainid, folder))\n\nif __name__ == '__main__':\n    main()\n"
  },
  {
    "path": "model_zoo/LICENSE",
    "content": "Copyright (c) 2017 LIP6 Lab\n\nPermission is hereby granted, free of charge, to any person obtaining\na copy of this software and associated documentation files (the\n\"Software\"), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:\n\nThe above copyright notice and this permission notice shall be\nincluded in all copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND,\nEXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND\nNONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE\nLIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION\nOF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION\nWITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\n\n-------------------------------------------------\n\nThis product contains portions of third party software provided under this license:\n\ndump_filters.py (x)\n===============\n\nCopyright 2015 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n   http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n\n\n(x) adapted from https://github.com/tensorflow/tensorflow/blob/411f57e/tensorflow/models/image/imagenet/classify_image.py\n"
  },
  {
    "path": "model_zoo/README.md",
    "content": "# TensorFlow Model Zoo for Torch7 and PyTorch\n\nThis is a port of TensorFlow pretrained models made by [Remi Cadene](http://remicadene.com) and [Micael Carvalho](http://micaelcarvalho.com). Special thanks to Moustapha Cissé. All models have been tested on ImageNet.\n\nThis work was inspired by [inception-v3.torch](https://github.com/Moodstocks/inception-v3.torch).\n\n\n## Using pretrained models\n\n### Torch7\n\n#### Requirements\n\nPlease install [torchnet-vision](https://github.com/Cadene/torchnet-vision).\n```\nluarocks install --server=http://luarocks.org/dev torchnet-vision\n```\n\nModels available:\n\n- inceptionv3\n- inceptionv4\n- inceptionresnetv2\n- resnet{18, 34, 50, 101, 152, 200}\n- overfeat\n- vggm\n- vgg16\n\n#### Simple example\n\n```lua\nrequire 'image'\ntnt = require 'torchnet'\nvision = require 'torchnet-vision'\nmodel = vision.models.inceptionresnetv2\nnet = model.load()\n\naugmentation = tnt.transform.compose{\n   vision.image.transformimage.randomScale{\n      minSize = 299, maxSize = 350\n   },\n   vision.image.transformimage.randomCrop(299),\n   vision.image.transformimage.colorNormalize{\n      mean = model.mean, std = model.std\n   },\n   function(img) return img:float() end\n}\n\nnet:evaluate()\noutput = net:forward(augmentation(image.lena()))\n```\n\n### PyTorch\n\nCurrently available in this repo only; it may be merged into pytorch/vision eventually.\n\nModels available:\n\n- inceptionv4\n- inceptionresnetv2\n\n#### Simple example\n\n```python\nimport torch\nfrom inceptionv4.pytorch_load import inceptionv4\nnet = inceptionv4()\ninput = torch.autograd.Variable(torch.ones(1, 3, 299, 299))\noutput = net(input)\n```\n\n\n## Reproducing the port\n\n### Requirements\n\n- TensorFlow\n- Torch7\n- PyTorch\n- hdf5 for python3\n- hdf5 for lua\n\n### Example of commands\n\nIn TensorFlow: download the pretrained parameters and extract them into the `./dump` directory.\n```\npython3 inceptionv4/tensorflow_dump.py\n```\n\nIn Torch7 or PyTorch: create the 
network, load the parameters, launch a few tests, and save the network in the `./save` directory.\n```\nth inceptionv4/torch_load.lua\npython3 inceptionv4/pytorch_load.py\n```\n"
  },
  {
    "path": "model_zoo/__init__.py",
    "content": "from .inceptionresnetv2.pytorch_load import inceptionresnetv2\nfrom .inceptionv4.pytorch_load import inceptionv4\nfrom .bninception.pytorch_load import BNInception, InceptionV3\n"
  },
  {
    "path": "model_zoo/bninception/__init__.py",
    "content": ""
  },
  {
    "path": "model_zoo/bninception/bn_inception.yaml",
    "content": "inputs: []\nlayers:\n- attrs: {kernel_size: 7, num_output: 64, pad: 3, stride: 2}\n  expr: conv1_7x7_s2<=Convolution<=data\n  id: conv1_7x7_s2\n- attrs: {frozen: true}\n  expr: conv1_7x7_s2_bn<=BN<=conv1_7x7_s2\n  id: conv1_7x7_s2_bn\n- {expr: conv1_7x7_s2_bn<=ReLU<=conv1_7x7_s2_bn, id: conv1_relu_7x7}\n- attrs: {kernel_size: 3, mode: max, stride: 2}\n  expr: pool1_3x3_s2<=Pooling<=conv1_7x7_s2_bn\n  id: pool1_3x3_s2\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: conv2_3x3_reduce<=Convolution<=pool1_3x3_s2\n  id: conv2_3x3_reduce\n- attrs: {frozen: true}\n  expr: conv2_3x3_reduce_bn<=BN<=conv2_3x3_reduce\n  id: conv2_3x3_reduce_bn\n- {expr: conv2_3x3_reduce_bn<=ReLU<=conv2_3x3_reduce_bn, id: conv2_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 192, pad: 1}\n  expr: conv2_3x3<=Convolution<=conv2_3x3_reduce_bn\n  id: conv2_3x3\n- attrs: {frozen: true}\n  expr: conv2_3x3_bn<=BN<=conv2_3x3\n  id: conv2_3x3_bn\n- {expr: conv2_3x3_bn<=ReLU<=conv2_3x3_bn, id: conv2_relu_3x3}\n- attrs: {kernel_size: 3, mode: max, stride: 2}\n  expr: pool2_3x3_s2<=Pooling<=conv2_3x3_bn\n  id: pool2_3x3_s2\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3a_1x1<=Convolution<=pool2_3x3_s2\n  id: inception_3a_1x1\n- attrs: {frozen: true}\n  expr: inception_3a_1x1_bn<=BN<=inception_3a_1x1\n  id: inception_3a_1x1_bn\n- {expr: inception_3a_1x1_bn<=ReLU<=inception_3a_1x1_bn, id: inception_3a_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3a_3x3_reduce<=Convolution<=pool2_3x3_s2\n  id: inception_3a_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_3a_3x3_reduce_bn<=BN<=inception_3a_3x3_reduce\n  id: inception_3a_3x3_reduce_bn\n- {expr: inception_3a_3x3_reduce_bn<=ReLU<=inception_3a_3x3_reduce_bn, id: inception_3a_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 64, pad: 1}\n  expr: inception_3a_3x3<=Convolution<=inception_3a_3x3_reduce_bn\n  id: inception_3a_3x3\n- attrs: {frozen: true}\n  expr: 
inception_3a_3x3_bn<=BN<=inception_3a_3x3\n  id: inception_3a_3x3_bn\n- {expr: inception_3a_3x3_bn<=ReLU<=inception_3a_3x3_bn, id: inception_3a_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3a_double_3x3_reduce<=Convolution<=pool2_3x3_s2\n  id: inception_3a_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_3a_double_3x3_reduce_bn<=BN<=inception_3a_double_3x3_reduce\n  id: inception_3a_double_3x3_reduce_bn\n- {expr: inception_3a_double_3x3_reduce_bn<=ReLU<=inception_3a_double_3x3_reduce_bn,\n  id: inception_3a_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_3a_double_3x3_1<=Convolution<=inception_3a_double_3x3_reduce_bn\n  id: inception_3a_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_3a_double_3x3_1_bn<=BN<=inception_3a_double_3x3_1\n  id: inception_3a_double_3x3_1_bn\n- {expr: inception_3a_double_3x3_1_bn<=ReLU<=inception_3a_double_3x3_1_bn, id: inception_3a_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_3a_double_3x3_2<=Convolution<=inception_3a_double_3x3_1_bn\n  id: inception_3a_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_3a_double_3x3_2_bn<=BN<=inception_3a_double_3x3_2\n  id: inception_3a_double_3x3_2_bn\n- {expr: inception_3a_double_3x3_2_bn<=ReLU<=inception_3a_double_3x3_2_bn, id: inception_3a_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_3a_pool<=Pooling<=pool2_3x3_s2\n  id: inception_3a_pool\n- attrs: {kernel_size: 1, num_output: 32}\n  expr: inception_3a_pool_proj<=Convolution<=inception_3a_pool\n  id: inception_3a_pool_proj\n- attrs: {frozen: true}\n  expr: inception_3a_pool_proj_bn<=BN<=inception_3a_pool_proj\n  id: inception_3a_pool_proj_bn\n- {expr: inception_3a_pool_proj_bn<=ReLU<=inception_3a_pool_proj_bn, id: inception_3a_relu_pool_proj}\n- {expr: 
'inception_3a_output<=Concat<=inception_3a_1x1_bn,inception_3a_3x3_bn,inception_3a_double_3x3_2_bn,inception_3a_pool_proj_bn',\n  id: inception_3a_output}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3b_1x1<=Convolution<=inception_3a_output\n  id: inception_3b_1x1\n- attrs: {frozen: true}\n  expr: inception_3b_1x1_bn<=BN<=inception_3b_1x1\n  id: inception_3b_1x1_bn\n- {expr: inception_3b_1x1_bn<=ReLU<=inception_3b_1x1_bn, id: inception_3b_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3b_3x3_reduce<=Convolution<=inception_3a_output\n  id: inception_3b_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_3b_3x3_reduce_bn<=BN<=inception_3b_3x3_reduce\n  id: inception_3b_3x3_reduce_bn\n- {expr: inception_3b_3x3_reduce_bn<=ReLU<=inception_3b_3x3_reduce_bn, id: inception_3b_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_3b_3x3<=Convolution<=inception_3b_3x3_reduce_bn\n  id: inception_3b_3x3\n- attrs: {frozen: true}\n  expr: inception_3b_3x3_bn<=BN<=inception_3b_3x3\n  id: inception_3b_3x3_bn\n- {expr: inception_3b_3x3_bn<=ReLU<=inception_3b_3x3_bn, id: inception_3b_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3b_double_3x3_reduce<=Convolution<=inception_3a_output\n  id: inception_3b_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_3b_double_3x3_reduce_bn<=BN<=inception_3b_double_3x3_reduce\n  id: inception_3b_double_3x3_reduce_bn\n- {expr: inception_3b_double_3x3_reduce_bn<=ReLU<=inception_3b_double_3x3_reduce_bn,\n  id: inception_3b_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_3b_double_3x3_1<=Convolution<=inception_3b_double_3x3_reduce_bn\n  id: inception_3b_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_3b_double_3x3_1_bn<=BN<=inception_3b_double_3x3_1\n  id: inception_3b_double_3x3_1_bn\n- {expr: inception_3b_double_3x3_1_bn<=ReLU<=inception_3b_double_3x3_1_bn, id: 
inception_3b_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_3b_double_3x3_2<=Convolution<=inception_3b_double_3x3_1_bn\n  id: inception_3b_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_3b_double_3x3_2_bn<=BN<=inception_3b_double_3x3_2\n  id: inception_3b_double_3x3_2_bn\n- {expr: inception_3b_double_3x3_2_bn<=ReLU<=inception_3b_double_3x3_2_bn, id: inception_3b_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_3b_pool<=Pooling<=inception_3a_output\n  id: inception_3b_pool\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3b_pool_proj<=Convolution<=inception_3b_pool\n  id: inception_3b_pool_proj\n- attrs: {frozen: true}\n  expr: inception_3b_pool_proj_bn<=BN<=inception_3b_pool_proj\n  id: inception_3b_pool_proj_bn\n- {expr: inception_3b_pool_proj_bn<=ReLU<=inception_3b_pool_proj_bn, id: inception_3b_relu_pool_proj}\n- {expr: 'inception_3b_output<=Concat<=inception_3b_1x1_bn,inception_3b_3x3_bn,inception_3b_double_3x3_2_bn,inception_3b_pool_proj_bn',\n  id: inception_3b_output}\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_3c_3x3_reduce<=Convolution<=inception_3b_output\n  id: inception_3c_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_3c_3x3_reduce_bn<=BN<=inception_3c_3x3_reduce\n  id: inception_3c_3x3_reduce_bn\n- {expr: inception_3c_3x3_reduce_bn<=ReLU<=inception_3c_3x3_reduce_bn, id: inception_3c_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 160, pad: 1, stride: 2}\n  expr: inception_3c_3x3<=Convolution<=inception_3c_3x3_reduce_bn\n  id: inception_3c_3x3\n- attrs: {frozen: true}\n  expr: inception_3c_3x3_bn<=BN<=inception_3c_3x3\n  id: inception_3c_3x3_bn\n- {expr: inception_3c_3x3_bn<=ReLU<=inception_3c_3x3_bn, id: inception_3c_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_3c_double_3x3_reduce<=Convolution<=inception_3b_output\n  id: inception_3c_double_3x3_reduce\n- attrs: {frozen: true}\n  
expr: inception_3c_double_3x3_reduce_bn<=BN<=inception_3c_double_3x3_reduce\n  id: inception_3c_double_3x3_reduce_bn\n- {expr: inception_3c_double_3x3_reduce_bn<=ReLU<=inception_3c_double_3x3_reduce_bn,\n  id: inception_3c_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_3c_double_3x3_1<=Convolution<=inception_3c_double_3x3_reduce_bn\n  id: inception_3c_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_3c_double_3x3_1_bn<=BN<=inception_3c_double_3x3_1\n  id: inception_3c_double_3x3_1_bn\n- {expr: inception_3c_double_3x3_1_bn<=ReLU<=inception_3c_double_3x3_1_bn, id: inception_3c_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1, stride: 2}\n  expr: inception_3c_double_3x3_2<=Convolution<=inception_3c_double_3x3_1_bn\n  id: inception_3c_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_3c_double_3x3_2_bn<=BN<=inception_3c_double_3x3_2\n  id: inception_3c_double_3x3_2_bn\n- {expr: inception_3c_double_3x3_2_bn<=ReLU<=inception_3c_double_3x3_2_bn, id: inception_3c_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: max, stride: 2}\n  expr: inception_3c_pool<=Pooling<=inception_3b_output\n  id: inception_3c_pool\n- {expr: 'inception_3c_output<=Concat<=inception_3c_3x3_bn,inception_3c_double_3x3_2_bn,inception_3c_pool',\n  id: inception_3c_output}\n- attrs: {kernel_size: 1, num_output: 224}\n  expr: inception_4a_1x1<=Convolution<=inception_3c_output\n  id: inception_4a_1x1\n- attrs: {frozen: true}\n  expr: inception_4a_1x1_bn<=BN<=inception_4a_1x1\n  id: inception_4a_1x1_bn\n- {expr: inception_4a_1x1_bn<=ReLU<=inception_4a_1x1_bn, id: inception_4a_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 64}\n  expr: inception_4a_3x3_reduce<=Convolution<=inception_3c_output\n  id: inception_4a_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4a_3x3_reduce_bn<=BN<=inception_4a_3x3_reduce\n  id: inception_4a_3x3_reduce_bn\n- {expr: inception_4a_3x3_reduce_bn<=ReLU<=inception_4a_3x3_reduce_bn, id: 
inception_4a_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 96, pad: 1}\n  expr: inception_4a_3x3<=Convolution<=inception_4a_3x3_reduce_bn\n  id: inception_4a_3x3\n- attrs: {frozen: true}\n  expr: inception_4a_3x3_bn<=BN<=inception_4a_3x3\n  id: inception_4a_3x3_bn\n- {expr: inception_4a_3x3_bn<=ReLU<=inception_4a_3x3_bn, id: inception_4a_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 96}\n  expr: inception_4a_double_3x3_reduce<=Convolution<=inception_3c_output\n  id: inception_4a_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4a_double_3x3_reduce_bn<=BN<=inception_4a_double_3x3_reduce\n  id: inception_4a_double_3x3_reduce_bn\n- {expr: inception_4a_double_3x3_reduce_bn<=ReLU<=inception_4a_double_3x3_reduce_bn,\n  id: inception_4a_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 128, pad: 1}\n  expr: inception_4a_double_3x3_1<=Convolution<=inception_4a_double_3x3_reduce_bn\n  id: inception_4a_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_4a_double_3x3_1_bn<=BN<=inception_4a_double_3x3_1\n  id: inception_4a_double_3x3_1_bn\n- {expr: inception_4a_double_3x3_1_bn<=ReLU<=inception_4a_double_3x3_1_bn, id: inception_4a_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 128, pad: 1}\n  expr: inception_4a_double_3x3_2<=Convolution<=inception_4a_double_3x3_1_bn\n  id: inception_4a_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_4a_double_3x3_2_bn<=BN<=inception_4a_double_3x3_2\n  id: inception_4a_double_3x3_2_bn\n- {expr: inception_4a_double_3x3_2_bn<=ReLU<=inception_4a_double_3x3_2_bn, id: inception_4a_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_4a_pool<=Pooling<=inception_3c_output\n  id: inception_4a_pool\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4a_pool_proj<=Convolution<=inception_4a_pool\n  id: inception_4a_pool_proj\n- attrs: {frozen: true}\n  expr: inception_4a_pool_proj_bn<=BN<=inception_4a_pool_proj\n  id: 
inception_4a_pool_proj_bn\n- {expr: inception_4a_pool_proj_bn<=ReLU<=inception_4a_pool_proj_bn, id: inception_4a_relu_pool_proj}\n- {expr: 'inception_4a_output<=Concat<=inception_4a_1x1_bn,inception_4a_3x3_bn,inception_4a_double_3x3_2_bn,inception_4a_pool_proj_bn',\n  id: inception_4a_output}\n- attrs: {kernel_size: 1, num_output: 192}\n  expr: inception_4b_1x1<=Convolution<=inception_4a_output\n  id: inception_4b_1x1\n- attrs: {frozen: true}\n  expr: inception_4b_1x1_bn<=BN<=inception_4b_1x1\n  id: inception_4b_1x1_bn\n- {expr: inception_4b_1x1_bn<=ReLU<=inception_4b_1x1_bn, id: inception_4b_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 96}\n  expr: inception_4b_3x3_reduce<=Convolution<=inception_4a_output\n  id: inception_4b_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4b_3x3_reduce_bn<=BN<=inception_4b_3x3_reduce\n  id: inception_4b_3x3_reduce_bn\n- {expr: inception_4b_3x3_reduce_bn<=ReLU<=inception_4b_3x3_reduce_bn, id: inception_4b_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 128, pad: 1}\n  expr: inception_4b_3x3<=Convolution<=inception_4b_3x3_reduce_bn\n  id: inception_4b_3x3\n- attrs: {frozen: true}\n  expr: inception_4b_3x3_bn<=BN<=inception_4b_3x3\n  id: inception_4b_3x3_bn\n- {expr: inception_4b_3x3_bn<=ReLU<=inception_4b_3x3_bn, id: inception_4b_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 96}\n  expr: inception_4b_double_3x3_reduce<=Convolution<=inception_4a_output\n  id: inception_4b_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4b_double_3x3_reduce_bn<=BN<=inception_4b_double_3x3_reduce\n  id: inception_4b_double_3x3_reduce_bn\n- {expr: inception_4b_double_3x3_reduce_bn<=ReLU<=inception_4b_double_3x3_reduce_bn,\n  id: inception_4b_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 128, pad: 1}\n  expr: inception_4b_double_3x3_1<=Convolution<=inception_4b_double_3x3_reduce_bn\n  id: inception_4b_double_3x3_1\n- attrs: {frozen: true}\n  expr: 
inception_4b_double_3x3_1_bn<=BN<=inception_4b_double_3x3_1\n  id: inception_4b_double_3x3_1_bn\n- {expr: inception_4b_double_3x3_1_bn<=ReLU<=inception_4b_double_3x3_1_bn, id: inception_4b_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 128, pad: 1}\n  expr: inception_4b_double_3x3_2<=Convolution<=inception_4b_double_3x3_1_bn\n  id: inception_4b_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_4b_double_3x3_2_bn<=BN<=inception_4b_double_3x3_2\n  id: inception_4b_double_3x3_2_bn\n- {expr: inception_4b_double_3x3_2_bn<=ReLU<=inception_4b_double_3x3_2_bn, id: inception_4b_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_4b_pool<=Pooling<=inception_4a_output\n  id: inception_4b_pool\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4b_pool_proj<=Convolution<=inception_4b_pool\n  id: inception_4b_pool_proj\n- attrs: {frozen: true}\n  expr: inception_4b_pool_proj_bn<=BN<=inception_4b_pool_proj\n  id: inception_4b_pool_proj_bn\n- {expr: inception_4b_pool_proj_bn<=ReLU<=inception_4b_pool_proj_bn, id: inception_4b_relu_pool_proj}\n- {expr: 'inception_4b_output<=Concat<=inception_4b_1x1_bn,inception_4b_3x3_bn,inception_4b_double_3x3_2_bn,inception_4b_pool_proj_bn',\n  id: inception_4b_output}\n- attrs: {kernel_size: 1, num_output: 160}\n  expr: inception_4c_1x1<=Convolution<=inception_4b_output\n  id: inception_4c_1x1\n- attrs: {frozen: true}\n  expr: inception_4c_1x1_bn<=BN<=inception_4c_1x1\n  id: inception_4c_1x1_bn\n- {expr: inception_4c_1x1_bn<=ReLU<=inception_4c_1x1_bn, id: inception_4c_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4c_3x3_reduce<=Convolution<=inception_4b_output\n  id: inception_4c_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4c_3x3_reduce_bn<=BN<=inception_4c_3x3_reduce\n  id: inception_4c_3x3_reduce_bn\n- {expr: inception_4c_3x3_reduce_bn<=ReLU<=inception_4c_3x3_reduce_bn, id: inception_4c_relu_3x3_reduce}\n- attrs: {kernel_size: 3, 
num_output: 160, pad: 1}\n  expr: inception_4c_3x3<=Convolution<=inception_4c_3x3_reduce_bn\n  id: inception_4c_3x3\n- attrs: {frozen: true}\n  expr: inception_4c_3x3_bn<=BN<=inception_4c_3x3\n  id: inception_4c_3x3_bn\n- {expr: inception_4c_3x3_bn<=ReLU<=inception_4c_3x3_bn, id: inception_4c_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4c_double_3x3_reduce<=Convolution<=inception_4b_output\n  id: inception_4c_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4c_double_3x3_reduce_bn<=BN<=inception_4c_double_3x3_reduce\n  id: inception_4c_double_3x3_reduce_bn\n- {expr: inception_4c_double_3x3_reduce_bn<=ReLU<=inception_4c_double_3x3_reduce_bn,\n  id: inception_4c_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 160, pad: 1}\n  expr: inception_4c_double_3x3_1<=Convolution<=inception_4c_double_3x3_reduce_bn\n  id: inception_4c_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_4c_double_3x3_1_bn<=BN<=inception_4c_double_3x3_1\n  id: inception_4c_double_3x3_1_bn\n- {expr: inception_4c_double_3x3_1_bn<=ReLU<=inception_4c_double_3x3_1_bn, id: inception_4c_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 160, pad: 1}\n  expr: inception_4c_double_3x3_2<=Convolution<=inception_4c_double_3x3_1_bn\n  id: inception_4c_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_4c_double_3x3_2_bn<=BN<=inception_4c_double_3x3_2\n  id: inception_4c_double_3x3_2_bn\n- {expr: inception_4c_double_3x3_2_bn<=ReLU<=inception_4c_double_3x3_2_bn, id: inception_4c_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_4c_pool<=Pooling<=inception_4b_output\n  id: inception_4c_pool\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4c_pool_proj<=Convolution<=inception_4c_pool\n  id: inception_4c_pool_proj\n- attrs: {frozen: true}\n  expr: inception_4c_pool_proj_bn<=BN<=inception_4c_pool_proj\n  id: inception_4c_pool_proj_bn\n- {expr: 
inception_4c_pool_proj_bn<=ReLU<=inception_4c_pool_proj_bn, id: inception_4c_relu_pool_proj}\n- {expr: 'inception_4c_output<=Concat<=inception_4c_1x1_bn,inception_4c_3x3_bn,inception_4c_double_3x3_2_bn,inception_4c_pool_proj_bn',\n  id: inception_4c_output}\n- attrs: {kernel_size: 1, num_output: 96}\n  expr: inception_4d_1x1<=Convolution<=inception_4c_output\n  id: inception_4d_1x1\n- attrs: {frozen: true}\n  expr: inception_4d_1x1_bn<=BN<=inception_4d_1x1\n  id: inception_4d_1x1_bn\n- {expr: inception_4d_1x1_bn<=ReLU<=inception_4d_1x1_bn, id: inception_4d_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4d_3x3_reduce<=Convolution<=inception_4c_output\n  id: inception_4d_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4d_3x3_reduce_bn<=BN<=inception_4d_3x3_reduce\n  id: inception_4d_3x3_reduce_bn\n- {expr: inception_4d_3x3_reduce_bn<=ReLU<=inception_4d_3x3_reduce_bn, id: inception_4d_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 192, pad: 1}\n  expr: inception_4d_3x3<=Convolution<=inception_4d_3x3_reduce_bn\n  id: inception_4d_3x3\n- attrs: {frozen: true}\n  expr: inception_4d_3x3_bn<=BN<=inception_4d_3x3\n  id: inception_4d_3x3_bn\n- {expr: inception_4d_3x3_bn<=ReLU<=inception_4d_3x3_bn, id: inception_4d_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 160}\n  expr: inception_4d_double_3x3_reduce<=Convolution<=inception_4c_output\n  id: inception_4d_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4d_double_3x3_reduce_bn<=BN<=inception_4d_double_3x3_reduce\n  id: inception_4d_double_3x3_reduce_bn\n- {expr: inception_4d_double_3x3_reduce_bn<=ReLU<=inception_4d_double_3x3_reduce_bn,\n  id: inception_4d_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 192, pad: 1}\n  expr: inception_4d_double_3x3_1<=Convolution<=inception_4d_double_3x3_reduce_bn\n  id: inception_4d_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_4d_double_3x3_1_bn<=BN<=inception_4d_double_3x3_1\n  id: 
inception_4d_double_3x3_1_bn\n- {expr: inception_4d_double_3x3_1_bn<=ReLU<=inception_4d_double_3x3_1_bn, id: inception_4d_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 192, pad: 1}\n  expr: inception_4d_double_3x3_2<=Convolution<=inception_4d_double_3x3_1_bn\n  id: inception_4d_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_4d_double_3x3_2_bn<=BN<=inception_4d_double_3x3_2\n  id: inception_4d_double_3x3_2_bn\n- {expr: inception_4d_double_3x3_2_bn<=ReLU<=inception_4d_double_3x3_2_bn, id: inception_4d_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_4d_pool<=Pooling<=inception_4c_output\n  id: inception_4d_pool\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4d_pool_proj<=Convolution<=inception_4d_pool\n  id: inception_4d_pool_proj\n- attrs: {frozen: true}\n  expr: inception_4d_pool_proj_bn<=BN<=inception_4d_pool_proj\n  id: inception_4d_pool_proj_bn\n- {expr: inception_4d_pool_proj_bn<=ReLU<=inception_4d_pool_proj_bn, id: inception_4d_relu_pool_proj}\n- {expr: 'inception_4d_output<=Concat<=inception_4d_1x1_bn,inception_4d_3x3_bn,inception_4d_double_3x3_2_bn,inception_4d_pool_proj_bn',\n  id: inception_4d_output}\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_4e_3x3_reduce<=Convolution<=inception_4d_output\n  id: inception_4e_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4e_3x3_reduce_bn<=BN<=inception_4e_3x3_reduce\n  id: inception_4e_3x3_reduce_bn\n- {expr: inception_4e_3x3_reduce_bn<=ReLU<=inception_4e_3x3_reduce_bn, id: inception_4e_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 192, pad: 1, stride: 2}\n  expr: inception_4e_3x3<=Convolution<=inception_4e_3x3_reduce_bn\n  id: inception_4e_3x3\n- attrs: {frozen: true}\n  expr: inception_4e_3x3_bn<=BN<=inception_4e_3x3\n  id: inception_4e_3x3_bn\n- {expr: inception_4e_3x3_bn<=ReLU<=inception_4e_3x3_bn, id: inception_4e_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 192}\n  expr: 
inception_4e_double_3x3_reduce<=Convolution<=inception_4d_output\n  id: inception_4e_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_4e_double_3x3_reduce_bn<=BN<=inception_4e_double_3x3_reduce\n  id: inception_4e_double_3x3_reduce_bn\n- {expr: inception_4e_double_3x3_reduce_bn<=ReLU<=inception_4e_double_3x3_reduce_bn,\n  id: inception_4e_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 256, pad: 1}\n  expr: inception_4e_double_3x3_1<=Convolution<=inception_4e_double_3x3_reduce_bn\n  id: inception_4e_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_4e_double_3x3_1_bn<=BN<=inception_4e_double_3x3_1\n  id: inception_4e_double_3x3_1_bn\n- {expr: inception_4e_double_3x3_1_bn<=ReLU<=inception_4e_double_3x3_1_bn, id: inception_4e_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 256, pad: 1, stride: 2}\n  expr: inception_4e_double_3x3_2<=Convolution<=inception_4e_double_3x3_1_bn\n  id: inception_4e_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_4e_double_3x3_2_bn<=BN<=inception_4e_double_3x3_2\n  id: inception_4e_double_3x3_2_bn\n- {expr: inception_4e_double_3x3_2_bn<=ReLU<=inception_4e_double_3x3_2_bn, id: inception_4e_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: max, stride: 2}\n  expr: inception_4e_pool<=Pooling<=inception_4d_output\n  id: inception_4e_pool\n- {expr: 'inception_4e_output<=Concat<=inception_4e_3x3_bn,inception_4e_double_3x3_2_bn,inception_4e_pool',\n  id: inception_4e_output}\n- attrs: {kernel_size: 1, num_output: 352}\n  expr: inception_5a_1x1<=Convolution<=inception_4e_output\n  id: inception_5a_1x1\n- attrs: {frozen: true}\n  expr: inception_5a_1x1_bn<=BN<=inception_5a_1x1\n  id: inception_5a_1x1_bn\n- {expr: inception_5a_1x1_bn<=ReLU<=inception_5a_1x1_bn, id: inception_5a_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 192}\n  expr: inception_5a_3x3_reduce<=Convolution<=inception_4e_output\n  id: inception_5a_3x3_reduce\n- attrs: {frozen: true}\n  expr: 
inception_5a_3x3_reduce_bn<=BN<=inception_5a_3x3_reduce\n  id: inception_5a_3x3_reduce_bn\n- {expr: inception_5a_3x3_reduce_bn<=ReLU<=inception_5a_3x3_reduce_bn, id: inception_5a_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 320, pad: 1}\n  expr: inception_5a_3x3<=Convolution<=inception_5a_3x3_reduce_bn\n  id: inception_5a_3x3\n- attrs: {frozen: true}\n  expr: inception_5a_3x3_bn<=BN<=inception_5a_3x3\n  id: inception_5a_3x3_bn\n- {expr: inception_5a_3x3_bn<=ReLU<=inception_5a_3x3_bn, id: inception_5a_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 160}\n  expr: inception_5a_double_3x3_reduce<=Convolution<=inception_4e_output\n  id: inception_5a_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_5a_double_3x3_reduce_bn<=BN<=inception_5a_double_3x3_reduce\n  id: inception_5a_double_3x3_reduce_bn\n- {expr: inception_5a_double_3x3_reduce_bn<=ReLU<=inception_5a_double_3x3_reduce_bn,\n  id: inception_5a_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 224, pad: 1}\n  expr: inception_5a_double_3x3_1<=Convolution<=inception_5a_double_3x3_reduce_bn\n  id: inception_5a_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_5a_double_3x3_1_bn<=BN<=inception_5a_double_3x3_1\n  id: inception_5a_double_3x3_1_bn\n- {expr: inception_5a_double_3x3_1_bn<=ReLU<=inception_5a_double_3x3_1_bn, id: inception_5a_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 224, pad: 1}\n  expr: inception_5a_double_3x3_2<=Convolution<=inception_5a_double_3x3_1_bn\n  id: inception_5a_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_5a_double_3x3_2_bn<=BN<=inception_5a_double_3x3_2\n  id: inception_5a_double_3x3_2_bn\n- {expr: inception_5a_double_3x3_2_bn<=ReLU<=inception_5a_double_3x3_2_bn, id: inception_5a_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: inception_5a_pool<=Pooling<=inception_4e_output\n  id: inception_5a_pool\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: 
inception_5a_pool_proj<=Convolution<=inception_5a_pool\n  id: inception_5a_pool_proj\n- attrs: {frozen: true}\n  expr: inception_5a_pool_proj_bn<=BN<=inception_5a_pool_proj\n  id: inception_5a_pool_proj_bn\n- {expr: inception_5a_pool_proj_bn<=ReLU<=inception_5a_pool_proj_bn, id: inception_5a_relu_pool_proj}\n- {expr: 'inception_5a_output<=Concat<=inception_5a_1x1_bn,inception_5a_3x3_bn,inception_5a_double_3x3_2_bn,inception_5a_pool_proj_bn',\n  id: inception_5a_output}\n- attrs: {kernel_size: 1, num_output: 352}\n  expr: inception_5b_1x1<=Convolution<=inception_5a_output\n  id: inception_5b_1x1\n- attrs: {frozen: true}\n  expr: inception_5b_1x1_bn<=BN<=inception_5b_1x1\n  id: inception_5b_1x1_bn\n- {expr: inception_5b_1x1_bn<=ReLU<=inception_5b_1x1_bn, id: inception_5b_relu_1x1}\n- attrs: {kernel_size: 1, num_output: 192}\n  expr: inception_5b_3x3_reduce<=Convolution<=inception_5a_output\n  id: inception_5b_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_5b_3x3_reduce_bn<=BN<=inception_5b_3x3_reduce\n  id: inception_5b_3x3_reduce_bn\n- {expr: inception_5b_3x3_reduce_bn<=ReLU<=inception_5b_3x3_reduce_bn, id: inception_5b_relu_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 320, pad: 1}\n  expr: inception_5b_3x3<=Convolution<=inception_5b_3x3_reduce_bn\n  id: inception_5b_3x3\n- attrs: {frozen: true}\n  expr: inception_5b_3x3_bn<=BN<=inception_5b_3x3\n  id: inception_5b_3x3_bn\n- {expr: inception_5b_3x3_bn<=ReLU<=inception_5b_3x3_bn, id: inception_5b_relu_3x3}\n- attrs: {kernel_size: 1, num_output: 192}\n  expr: inception_5b_double_3x3_reduce<=Convolution<=inception_5a_output\n  id: inception_5b_double_3x3_reduce\n- attrs: {frozen: true}\n  expr: inception_5b_double_3x3_reduce_bn<=BN<=inception_5b_double_3x3_reduce\n  id: inception_5b_double_3x3_reduce_bn\n- {expr: inception_5b_double_3x3_reduce_bn<=ReLU<=inception_5b_double_3x3_reduce_bn,\n  id: inception_5b_relu_double_3x3_reduce}\n- attrs: {kernel_size: 3, num_output: 224, pad: 1}\n  expr: 
inception_5b_double_3x3_1<=Convolution<=inception_5b_double_3x3_reduce_bn\n  id: inception_5b_double_3x3_1\n- attrs: {frozen: true}\n  expr: inception_5b_double_3x3_1_bn<=BN<=inception_5b_double_3x3_1\n  id: inception_5b_double_3x3_1_bn\n- {expr: inception_5b_double_3x3_1_bn<=ReLU<=inception_5b_double_3x3_1_bn, id: inception_5b_relu_double_3x3_1}\n- attrs: {kernel_size: 3, num_output: 224, pad: 1}\n  expr: inception_5b_double_3x3_2<=Convolution<=inception_5b_double_3x3_1_bn\n  id: inception_5b_double_3x3_2\n- attrs: {frozen: true}\n  expr: inception_5b_double_3x3_2_bn<=BN<=inception_5b_double_3x3_2\n  id: inception_5b_double_3x3_2_bn\n- {expr: inception_5b_double_3x3_2_bn<=ReLU<=inception_5b_double_3x3_2_bn, id: inception_5b_relu_double_3x3_2}\n- attrs: {kernel_size: 3, mode: max, pad: 1, stride: 1}\n  expr: inception_5b_pool<=Pooling<=inception_5a_output\n  id: inception_5b_pool\n- attrs: {kernel_size: 1, num_output: 128}\n  expr: inception_5b_pool_proj<=Convolution<=inception_5b_pool\n  id: inception_5b_pool_proj\n- attrs: {frozen: true}\n  expr: inception_5b_pool_proj_bn<=BN<=inception_5b_pool_proj\n  id: inception_5b_pool_proj_bn\n- {expr: inception_5b_pool_proj_bn<=ReLU<=inception_5b_pool_proj_bn, id: inception_5b_relu_pool_proj}\n- {expr: 'inception_5b_output<=Concat<=inception_5b_1x1_bn,inception_5b_3x3_bn,inception_5b_double_3x3_2_bn,inception_5b_pool_proj_bn',\n  id: inception_5b_output}\n- attrs: {kernel_size: 7, mode: ave, stride: 1}\n  expr: global_pool<=Pooling<=inception_5b_output\n  id: global_pool\n- attrs: {num_output: 1000}\n  expr: fc_action<=InnerProduct<=global_pool\n  id: fc\nname: BN-Inception\n"
  },
  {
    "path": "model_zoo/bninception/caffe_pb2.py",
    "content": "# Generated by the protocol buffer compiler.  DO NOT EDIT!\n# source: caffe.proto\n\nfrom google.protobuf.internal import enum_type_wrapper\nfrom google.protobuf import descriptor as _descriptor\nfrom google.protobuf import message as _message\nfrom google.protobuf import reflection as _reflection\nfrom google.protobuf import descriptor_pb2\n# @@protoc_insertion_point(imports)\n\n\n\n\nDESCRIPTOR = _descriptor.FileDescriptor(\n  name='caffe.proto',\n  package='caffe',\n  serialized_pb='\\n\\x0b\\x63\\x61\\x66\\x66\\x65.proto\\x12\\x05\\x63\\x61\\x66\\x66\\x65\\\"\\x1c\\n\\tBlobShape\\x12\\x0f\\n\\x03\\x64im\\x18\\x01 \\x03(\\x03\\x42\\x02\\x10\\x01\\\"\\x9a\\x01\\n\\tBlobProto\\x12\\x1f\\n\\x05shape\\x18\\x07 \\x01(\\x0b\\x32\\x10.caffe.BlobShape\\x12\\x10\\n\\x04\\x64\\x61ta\\x18\\x05 \\x03(\\x02\\x42\\x02\\x10\\x01\\x12\\x10\\n\\x04\\x64iff\\x18\\x06 \\x03(\\x02\\x42\\x02\\x10\\x01\\x12\\x0e\\n\\x03num\\x18\\x01 \\x01(\\x05:\\x01\\x30\\x12\\x13\\n\\x08\\x63hannels\\x18\\x02 \\x01(\\x05:\\x01\\x30\\x12\\x11\\n\\x06height\\x18\\x03 \\x01(\\x05:\\x01\\x30\\x12\\x10\\n\\x05width\\x18\\x04 \\x01(\\x05:\\x01\\x30\\\"2\\n\\x0f\\x42lobProtoVector\\x12\\x1f\\n\\x05\\x62lobs\\x18\\x01 \\x03(\\x0b\\x32\\x10.caffe.BlobProto\\\"\\x81\\x01\\n\\x05\\x44\\x61tum\\x12\\x10\\n\\x08\\x63hannels\\x18\\x01 \\x01(\\x05\\x12\\x0e\\n\\x06height\\x18\\x02 \\x01(\\x05\\x12\\r\\n\\x05width\\x18\\x03 \\x01(\\x05\\x12\\x0c\\n\\x04\\x64\\x61ta\\x18\\x04 \\x01(\\x0c\\x12\\r\\n\\x05label\\x18\\x05 \\x01(\\x05\\x12\\x12\\n\\nfloat_data\\x18\\x06 \\x03(\\x02\\x12\\x16\\n\\x07\\x65ncoded\\x18\\x07 \\x01(\\x08:\\x05\\x66\\x61lse\\\"\\x8a\\x02\\n\\x0f\\x46illerParameter\\x12\\x16\\n\\x04type\\x18\\x01 \\x01(\\t:\\x08\\x63onstant\\x12\\x10\\n\\x05value\\x18\\x02 \\x01(\\x02:\\x01\\x30\\x12\\x0e\\n\\x03min\\x18\\x03 \\x01(\\x02:\\x01\\x30\\x12\\x0e\\n\\x03max\\x18\\x04 \\x01(\\x02:\\x01\\x31\\x12\\x0f\\n\\x04mean\\x18\\x05 \\x01(\\x02:\\x01\\x30\\x12\\x0e\\n\\x03std\\x18\\x06 
\\x01(\\x02:\\x01\\x31\\x12\\x12\\n\\x06sparse\\x18\\x07 \\x01(\\x05:\\x02-1\\x12\\x42\\n\\rvariance_norm\\x18\\x08 \\x01(\\x0e\\x32#.caffe.FillerParameter.VarianceNorm:\\x06\\x46\\x41N_IN\\\"4\\n\\x0cVarianceNorm\\x12\\n\\n\\x06\\x46\\x41N_IN\\x10\\x00\\x12\\x0b\\n\\x07\\x46\\x41N_OUT\\x10\\x01\\x12\\x0b\\n\\x07\\x41VERAGE\\x10\\x02\\\"\\xc6\\x02\\n\\x0cNetParameter\\x12\\x0c\\n\\x04name\\x18\\x01 \\x01(\\t\\x12\\r\\n\\x05input\\x18\\x03 \\x03(\\t\\x12%\\n\\x0binput_shape\\x18\\x08 \\x03(\\x0b\\x32\\x10.caffe.BlobShape\\x12\\x11\\n\\tinput_dim\\x18\\x04 \\x03(\\x05\\x12\\x1d\\n\\x0e\\x66orce_backward\\x18\\x05 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x1e\\n\\x05state\\x18\\x06 \\x01(\\x0b\\x32\\x0f.caffe.NetState\\x12\\x19\\n\\ndebug_info\\x18\\x07 \\x01(\\x08:\\x05\\x66\\x61lse\\x12$\\n\\x05layer\\x18\\x64 \\x03(\\x0b\\x32\\x15.caffe.LayerParameter\\x12\\x36\\n\\tmem_param\\x18\\xc8\\x01 \\x01(\\x0b\\x32\\\".caffe.MemoryOptimizationParameter\\x12\\'\\n\\x06layers\\x18\\x02 \\x03(\\x0b\\x32\\x17.caffe.V1LayerParameter\\\"\\xe3\\x08\\n\\x0fSolverParameter\\x12\\x0b\\n\\x03net\\x18\\x18 \\x01(\\t\\x12&\\n\\tnet_param\\x18\\x19 \\x01(\\x0b\\x32\\x13.caffe.NetParameter\\x12\\x11\\n\\ttrain_net\\x18\\x01 \\x01(\\t\\x12\\x10\\n\\x08test_net\\x18\\x02 \\x03(\\t\\x12,\\n\\x0ftrain_net_param\\x18\\x15 \\x01(\\x0b\\x32\\x13.caffe.NetParameter\\x12+\\n\\x0etest_net_param\\x18\\x16 \\x03(\\x0b\\x32\\x13.caffe.NetParameter\\x12$\\n\\x0btrain_state\\x18\\x1a \\x01(\\x0b\\x32\\x0f.caffe.NetState\\x12#\\n\\ntest_state\\x18\\x1b \\x03(\\x0b\\x32\\x0f.caffe.NetState\\x12\\x11\\n\\ttest_iter\\x18\\x03 \\x03(\\x05\\x12\\x18\\n\\rtest_interval\\x18\\x04 \\x01(\\x05:\\x01\\x30\\x12 \\n\\x11test_compute_loss\\x18\\x13 \\x01(\\x08:\\x05\\x66\\x61lse\\x12!\\n\\x13test_initialization\\x18  \\x01(\\x08:\\x04true\\x12\\x0f\\n\\x07\\x62\\x61se_lr\\x18\\x05 \\x01(\\x02\\x12\\x0f\\n\\x07\\x64isplay\\x18\\x06 \\x01(\\x05\\x12\\x17\\n\\x0c\\x61verage_loss\\x18! 
\\x01(\\x05:\\x01\\x31\\x12\\x10\\n\\x08max_iter\\x18\\x07 \\x01(\\x05\\x12\\x14\\n\\titer_size\\x18$ \\x01(\\x05:\\x01\\x31\\x12\\x11\\n\\tlr_policy\\x18\\x08 \\x01(\\t\\x12\\r\\n\\x05gamma\\x18\\t \\x01(\\x02\\x12\\r\\n\\x05power\\x18\\n \\x01(\\x02\\x12\\x10\\n\\x08momentum\\x18\\x0b \\x01(\\x02\\x12\\x14\\n\\x0cweight_decay\\x18\\x0c \\x01(\\x02\\x12\\x1f\\n\\x13regularization_type\\x18\\x1d \\x01(\\t:\\x02L2\\x12\\x10\\n\\x08stepsize\\x18\\r \\x01(\\x05\\x12\\x11\\n\\tstepvalue\\x18\\\" \\x03(\\x05\\x12\\x1a\\n\\x0e\\x63lip_gradients\\x18# \\x01(\\x02:\\x02-1\\x12\\x13\\n\\x08snapshot\\x18\\x0e \\x01(\\x05:\\x01\\x30\\x12\\x17\\n\\x0fsnapshot_prefix\\x18\\x0f \\x01(\\t\\x12\\x1c\\n\\rsnapshot_diff\\x18\\x10 \\x01(\\x08:\\x05\\x66\\x61lse\\x12;\\n\\x0bsolver_mode\\x18\\x11 \\x01(\\x0e\\x32!.caffe.SolverParameter.SolverMode:\\x03GPU\\x12\\x11\\n\\tdevice_id\\x18\\x12 \\x03(\\x05\\x12\\x10\\n\\x08group_id\\x18& \\x03(\\x05\\x12\\x17\\n\\x0brandom_seed\\x18\\x14 \\x01(\\x03:\\x02-1\\x12;\\n\\x0bsolver_type\\x18\\x1e \\x01(\\x0e\\x32!.caffe.SolverParameter.SolverType:\\x03SGD\\x12\\x14\\n\\x05\\x64\\x65lta\\x18\\x1f \\x01(\\x02:\\x05\\x31\\x65-08\\x12\\x19\\n\\ndebug_info\\x18\\x17 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\\"\\n\\x14snapshot_after_train\\x18\\x1c \\x01(\\x08:\\x04true\\x12\\x15\\n\\x08richness\\x18% \\x01(\\x05:\\x03\\x33\\x30\\x30\\\"\\x1e\\n\\nSolverMode\\x12\\x07\\n\\x03\\x43PU\\x10\\x00\\x12\\x07\\n\\x03GPU\\x10\\x01\\\"0\\n\\nSolverType\\x12\\x07\\n\\x03SGD\\x10\\x00\\x12\\x0c\\n\\x08NESTEROV\\x10\\x01\\x12\\x0b\\n\\x07\\x41\\x44\\x41GRAD\\x10\\x02\\\"l\\n\\x0bSolverState\\x12\\x0c\\n\\x04iter\\x18\\x01 \\x01(\\x05\\x12\\x13\\n\\x0blearned_net\\x18\\x02 \\x01(\\t\\x12!\\n\\x07history\\x18\\x03 \\x03(\\x0b\\x32\\x10.caffe.BlobProto\\x12\\x17\\n\\x0c\\x63urrent_step\\x18\\x04 \\x01(\\x05:\\x01\\x30\\\"N\\n\\x08NetState\\x12!\\n\\x05phase\\x18\\x01 \\x01(\\x0e\\x32\\x0c.caffe.Phase:\\x04TEST\\x12\\x10\\n\\x05level\\x18\\x02 
\\x01(\\x05:\\x01\\x30\\x12\\r\\n\\x05stage\\x18\\x03 \\x03(\\t\\\"s\\n\\x0cNetStateRule\\x12\\x1b\\n\\x05phase\\x18\\x01 \\x01(\\x0e\\x32\\x0c.caffe.Phase\\x12\\x11\\n\\tmin_level\\x18\\x02 \\x01(\\x05\\x12\\x11\\n\\tmax_level\\x18\\x03 \\x01(\\x05\\x12\\r\\n\\x05stage\\x18\\x04 \\x03(\\t\\x12\\x11\\n\\tnot_stage\\x18\\x05 \\x03(\\t\\\"\\xa3\\x01\\n\\tParamSpec\\x12\\x0c\\n\\x04name\\x18\\x01 \\x01(\\t\\x12\\x31\\n\\nshare_mode\\x18\\x02 \\x01(\\x0e\\x32\\x1d.caffe.ParamSpec.DimCheckMode\\x12\\x12\\n\\x07lr_mult\\x18\\x03 \\x01(\\x02:\\x01\\x31\\x12\\x15\\n\\ndecay_mult\\x18\\x04 \\x01(\\x02:\\x01\\x31\\\"*\\n\\x0c\\x44imCheckMode\\x12\\n\\n\\x06STRICT\\x10\\x00\\x12\\x0e\\n\\nPERMISSIVE\\x10\\x01\\\"\\x90\\x13\\n\\x0eLayerParameter\\x12\\x0c\\n\\x04name\\x18\\x01 \\x01(\\t\\x12\\x0c\\n\\x04type\\x18\\x02 \\x01(\\t\\x12\\x0e\\n\\x06\\x62ottom\\x18\\x03 \\x03(\\t\\x12\\x0b\\n\\x03top\\x18\\x04 \\x03(\\t\\x12\\x1b\\n\\x05phase\\x18\\n \\x01(\\x0e\\x32\\x0c.caffe.Phase\\x12\\x13\\n\\x0bloss_weight\\x18\\x05 \\x03(\\x02\\x12\\x1f\\n\\x05param\\x18\\x06 \\x03(\\x0b\\x32\\x10.caffe.ParamSpec\\x12\\x1f\\n\\x05\\x62lobs\\x18\\x07 \\x03(\\x0b\\x32\\x10.caffe.BlobProto\\x12\\x16\\n\\x0epropagate_down\\x18\\x0b \\x03(\\x08\\x12$\\n\\x07include\\x18\\x08 \\x03(\\x0b\\x32\\x13.caffe.NetStateRule\\x12$\\n\\x07\\x65xclude\\x18\\t \\x03(\\x0b\\x32\\x13.caffe.NetStateRule\\x12\\x37\\n\\x0ftransform_param\\x18\\x64 \\x01(\\x0b\\x32\\x1e.caffe.TransformationParameter\\x12(\\n\\nloss_param\\x18\\x65 \\x01(\\x0b\\x32\\x14.caffe.LossParameter\\x12\\x30\\n\\x0e\\x61\\x63\\x63uracy_param\\x18\\x66 \\x01(\\x0b\\x32\\x18.caffe.AccuracyParameter\\x12,\\n\\x0c\\x61rgmax_param\\x18g \\x01(\\x0b\\x32\\x16.caffe.ArgMaxParameter\\x12%\\n\\x08\\x62n_param\\x18\\x89\\x01 \\x01(\\x0b\\x32\\x12.caffe.BNParameter\\x12,\\n\\x0c\\x63oncat_param\\x18h \\x01(\\x0b\\x32\\x16.caffe.ConcatParameter\\x12?\\n\\x16\\x63ontrastive_loss_param\\x18i 
\\x01(\\x0b\\x32\\x1f.caffe.ContrastiveLossParameter\\x12\\x36\\n\\x11\\x63onvolution_param\\x18j \\x01(\\x0b\\x32\\x1b.caffe.ConvolutionParameter\\x12(\\n\\ndata_param\\x18k \\x01(\\x0b\\x32\\x14.caffe.DataParameter\\x12.\\n\\rdropout_param\\x18l \\x01(\\x0b\\x32\\x17.caffe.DropoutParameter\\x12\\x33\\n\\x10\\x64ummy_data_param\\x18m \\x01(\\x0b\\x32\\x19.caffe.DummyDataParameter\\x12.\\n\\reltwise_param\\x18n \\x01(\\x0b\\x32\\x17.caffe.EltwiseParameter\\x12&\\n\\texp_param\\x18o \\x01(\\x0b\\x32\\x13.caffe.ExpParameter\\x12/\\n\\rflatten_param\\x18\\x87\\x01 \\x01(\\x0b\\x32\\x17.caffe.FlattenParameter\\x12\\x31\\n\\x0fhdf5_data_param\\x18p \\x01(\\x0b\\x32\\x18.caffe.HDF5DataParameter\\x12\\x35\\n\\x11hdf5_output_param\\x18q \\x01(\\x0b\\x32\\x1a.caffe.HDF5OutputParameter\\x12\\x33\\n\\x10hinge_loss_param\\x18r \\x01(\\x0b\\x32\\x19.caffe.HingeLossParameter\\x12\\x33\\n\\x10image_data_param\\x18s \\x01(\\x0b\\x32\\x19.caffe.ImageDataParameter\\x12\\x39\\n\\x13infogain_loss_param\\x18t \\x01(\\x0b\\x32\\x1c.caffe.InfogainLossParameter\\x12\\x39\\n\\x13inner_product_param\\x18u \\x01(\\x0b\\x32\\x1c.caffe.InnerProductParameter\\x12\\'\\n\\tlog_param\\x18\\x86\\x01 \\x01(\\x0b\\x32\\x13.caffe.LogParameter\\x12&\\n\\tlrn_param\\x18v \\x01(\\x0b\\x32\\x13.caffe.LRNParameter\\x12\\x35\\n\\x11memory_data_param\\x18w \\x01(\\x0b\\x32\\x1a.caffe.MemoryDataParameter\\x12&\\n\\tmvn_param\\x18x \\x01(\\x0b\\x32\\x13.caffe.MVNParameter\\x12.\\n\\rpooling_param\\x18y \\x01(\\x0b\\x32\\x17.caffe.PoolingParameter\\x12*\\n\\x0bpower_param\\x18z \\x01(\\x0b\\x32\\x15.caffe.PowerParameter\\x12+\\n\\x0bprelu_param\\x18\\x83\\x01 \\x01(\\x0b\\x32\\x15.caffe.PReLUParameter\\x12-\\n\\x0cpython_param\\x18\\x82\\x01 \\x01(\\x0b\\x32\\x16.caffe.PythonParameter\\x12\\x33\\n\\x0freduction_param\\x18\\x88\\x01 \\x01(\\x0b\\x32\\x19.caffe.ReductionParameter\\x12(\\n\\nrelu_param\\x18{ \\x01(\\x0b\\x32\\x14.caffe.ReLUParameter\\x12/\\n\\rreshape_param\\x18\\x85\\x01 
\\x01(\\x0b\\x32\\x17.caffe.ReshapeParameter\\x12\\x30\\n\\x0eseg_data_param\\x18\\x8d\\x01 \\x01(\\x0b\\x32\\x17.caffe.SegDataParameter\\x12.\\n\\rsigmoid_param\\x18| \\x01(\\x0b\\x32\\x17.caffe.SigmoidParameter\\x12.\\n\\rsoftmax_param\\x18} \\x01(\\x0b\\x32\\x17.caffe.SoftmaxParameter\\x12\\'\\n\\tspp_param\\x18\\x84\\x01 \\x01(\\x0b\\x32\\x13.caffe.SPPParameter\\x12*\\n\\x0bslice_param\\x18~ \\x01(\\x0b\\x32\\x15.caffe.SliceParameter\\x12(\\n\\ntanh_param\\x18\\x7f \\x01(\\x0b\\x32\\x14.caffe.TanHParameter\\x12\\x33\\n\\x0fthreshold_param\\x18\\x80\\x01 \\x01(\\x0b\\x32\\x19.caffe.ThresholdParameter\\x12\\x36\\n\\x11window_data_param\\x18\\x81\\x01 \\x01(\\x0b\\x32\\x1a.caffe.WindowDataParameter\\x12\\x34\\n\\x10video_data_param\\x18\\x8c\\x01 \\x01(\\x0b\\x32\\x19.caffe.VideoDataParameter\\x12\\x36\\n\\x11roi_pooling_param\\x18\\x96\\x01 \\x01(\\x0b\\x32\\x1a.caffe.ROIPoolingParameter\\x12+\\n\\x0bscale_param\\x18\\xa0\\x01 \\x01(\\x0b\\x32\\x15.caffe.ScaleParameter\\x12)\\n\\nbias_param\\x18\\xa1\\x01 \\x01(\\x0b\\x32\\x14.caffe.BiasParameter\\x12>\\n\\x15\\x62\\x61tch_reduction_param\\x18\\xa2\\x01 \\x01(\\x0b\\x32\\x1e.caffe.BatchReductionParameter\\\"\\xc0\\x03\\n\\x17TransformationParameter\\x12\\x10\\n\\x05scale\\x18\\x01 \\x01(\\x02:\\x01\\x31\\x12\\x15\\n\\x06mirror\\x18\\x02 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x14\\n\\tcrop_size\\x18\\x03 \\x01(\\r:\\x01\\x30\\x12\\x11\\n\\tmean_file\\x18\\x04 \\x01(\\t\\x12\\x12\\n\\nmean_value\\x18\\x05 \\x03(\\x02\\x12\\x1a\\n\\x0b\\x66orce_color\\x18\\x06 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x19\\n\\nforce_gray\\x18\\x07 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x17\\n\\x08\\x66ix_crop\\x18\\n \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x1c\\n\\rmore_fix_crop\\x18\\x0f \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x1a\\n\\x0bmulti_scale\\x18\\x0b \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x14\\n\\x0cscale_ratios\\x18\\x0c \\x03(\\x02\\x12\\x16\\n\\x0bmax_distort\\x18\\r \\x01(\\x05:\\x01\\x31\\x12\\x16\\n\\x07is_flow\\x18\\x0e 
\\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x1d\\n\\x0eoriginal_image\\x18\\x14 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x11\\n\\x06stride\\x18\\x10 \\x01(\\x05:\\x01\\x31\\x12\\x12\\n\\nupper_size\\x18\\x11 \\x01(\\x05\\x12\\x14\\n\\x0cupper_height\\x18\\x12 \\x01(\\x05\\x12\\x13\\n\\x0bupper_width\\x18\\x13 \\x01(\\x05\\\">\\n\\rLossParameter\\x12\\x14\\n\\x0cignore_label\\x18\\x01 \\x01(\\x05\\x12\\x17\\n\\tnormalize\\x18\\x02 \\x01(\\x08:\\x04true\\\"L\\n\\x11\\x41\\x63\\x63uracyParameter\\x12\\x10\\n\\x05top_k\\x18\\x01 \\x01(\\r:\\x01\\x31\\x12\\x0f\\n\\x04\\x61xis\\x18\\x02 \\x01(\\x05:\\x01\\x31\\x12\\x14\\n\\x0cignore_label\\x18\\x03 \\x01(\\x05\\\"?\\n\\x0f\\x41rgMaxParameter\\x12\\x1a\\n\\x0bout_max_val\\x18\\x01 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x10\\n\\x05top_k\\x18\\x02 \\x01(\\r:\\x01\\x31\\\"\\x8b\\x02\\n\\x0b\\x42NParameter\\x12,\\n\\x0cslope_filler\\x18\\x01 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12+\\n\\x0b\\x62ias_filler\\x18\\x02 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12\\x15\\n\\x08momentum\\x18\\x03 \\x01(\\x02:\\x03\\x30.9\\x12\\x12\\n\\x03\\x65ps\\x18\\x04 \\x01(\\x02:\\x05\\x31\\x65-05\\x12\\x15\\n\\x06\\x66rozen\\x18\\x05 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x32\\n\\x06\\x65ngine\\x18\\x06 \\x01(\\x0e\\x32\\x19.caffe.BNParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"9\\n\\x0f\\x43oncatParameter\\x12\\x0f\\n\\x04\\x61xis\\x18\\x02 \\x01(\\x05:\\x01\\x31\\x12\\x15\\n\\nconcat_dim\\x18\\x01 \\x01(\\r:\\x01\\x31\\\"L\\n\\x18\\x43ontrastiveLossParameter\\x12\\x11\\n\\x06margin\\x18\\x01 \\x01(\\x02:\\x01\\x31\\x12\\x1d\\n\\x0elegacy_version\\x18\\x02 \\x01(\\x08:\\x05\\x66\\x61lse\\\"\\xfc\\x03\\n\\x14\\x43onvolutionParameter\\x12\\x12\\n\\nnum_output\\x18\\x01 \\x01(\\r\\x12\\x17\\n\\tbias_term\\x18\\x02 \\x01(\\x08:\\x04true\\x12\\x0e\\n\\x03pad\\x18\\x03 
\\x01(\\r:\\x01\\x30\\x12\\x10\\n\\x05pad_h\\x18\\t \\x01(\\r:\\x01\\x30\\x12\\x10\\n\\x05pad_w\\x18\\n \\x01(\\r:\\x01\\x30\\x12\\x13\\n\\x0bkernel_size\\x18\\x04 \\x01(\\r\\x12\\x10\\n\\x08kernel_h\\x18\\x0b \\x01(\\r\\x12\\x10\\n\\x08kernel_w\\x18\\x0c \\x01(\\r\\x12\\x10\\n\\x05group\\x18\\x05 \\x01(\\r:\\x01\\x31\\x12\\x11\\n\\x06stride\\x18\\x06 \\x01(\\r:\\x01\\x31\\x12\\x10\\n\\x08stride_h\\x18\\r \\x01(\\r\\x12\\x10\\n\\x08stride_w\\x18\\x0e \\x01(\\r\\x12\\x13\\n\\x08\\x64ilation\\x18\\x10 \\x01(\\r:\\x01\\x31\\x12\\x12\\n\\ndilation_h\\x18\\x11 \\x01(\\r\\x12\\x12\\n\\ndilation_w\\x18\\x12 \\x01(\\r\\x12-\\n\\rweight_filler\\x18\\x07 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12+\\n\\x0b\\x62ias_filler\\x18\\x08 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12;\\n\\x06\\x65ngine\\x18\\x0f \\x01(\\x0e\\x32\\\".caffe.ConvolutionParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"\\xa7\\x02\\n\\rDataParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\x12\\x12\\n\\nbatch_size\\x18\\x04 \\x01(\\r\\x12\\x14\\n\\trand_skip\\x18\\x07 \\x01(\\r:\\x01\\x30\\x12\\x31\\n\\x07\\x62\\x61\\x63kend\\x18\\x08 \\x01(\\x0e\\x32\\x17.caffe.DataParameter.DB:\\x07LEVELDB\\x12\\x10\\n\\x05scale\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x11\\n\\tmean_file\\x18\\x03 \\x01(\\t\\x12\\x14\\n\\tcrop_size\\x18\\x05 \\x01(\\r:\\x01\\x30\\x12\\x15\\n\\x06mirror\\x18\\x06 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\\"\\n\\x13\\x66orce_encoded_color\\x18\\t \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x16\\n\\x07shuffle\\x18\\n \\x01(\\x08:\\x05\\x66\\x61lse\\\"\\x1b\\n\\x02\\x44\\x42\\x12\\x0b\\n\\x07LEVELDB\\x10\\x00\\x12\\x08\\n\\x04LMDB\\x10\\x01\\\".\\n\\x10\\x44ropoutParameter\\x12\\x1a\\n\\rdropout_ratio\\x18\\x01 \\x01(\\x02:\\x03\\x30.5\\\"\\xa0\\x01\\n\\x12\\x44ummyDataParameter\\x12+\\n\\x0b\\x64\\x61ta_filler\\x18\\x01 
\\x03(\\x0b\\x32\\x16.caffe.FillerParameter\\x12\\x1f\\n\\x05shape\\x18\\x06 \\x03(\\x0b\\x32\\x10.caffe.BlobShape\\x12\\x0b\\n\\x03num\\x18\\x02 \\x03(\\r\\x12\\x10\\n\\x08\\x63hannels\\x18\\x03 \\x03(\\r\\x12\\x0e\\n\\x06height\\x18\\x04 \\x03(\\r\\x12\\r\\n\\x05width\\x18\\x05 \\x03(\\r\\\"\\xb9\\x01\\n\\x10\\x45ltwiseParameter\\x12\\x39\\n\\toperation\\x18\\x01 \\x01(\\x0e\\x32!.caffe.EltwiseParameter.EltwiseOp:\\x03SUM\\x12\\r\\n\\x05\\x63oeff\\x18\\x02 \\x03(\\x02\\x12\\x1e\\n\\x10stable_prod_grad\\x18\\x03 \\x01(\\x08:\\x04true\\\";\\n\\tEltwiseOp\\x12\\x08\\n\\x04PROD\\x10\\x00\\x12\\x07\\n\\x03SUM\\x10\\x01\\x12\\x07\\n\\x03MAX\\x10\\x02\\x12\\x12\\n\\x0eSTOCHASTIC_SUM\\x10\\x03\\\"D\\n\\x0c\\x45xpParameter\\x12\\x10\\n\\x04\\x62\\x61se\\x18\\x01 \\x01(\\x02:\\x02-1\\x12\\x10\\n\\x05scale\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x10\\n\\x05shift\\x18\\x03 \\x01(\\x02:\\x01\\x30\\\"9\\n\\x10\\x46lattenParameter\\x12\\x0f\\n\\x04\\x61xis\\x18\\x01 \\x01(\\x05:\\x01\\x31\\x12\\x14\\n\\x08\\x65nd_axis\\x18\\x02 \\x01(\\x05:\\x02-1\\\"O\\n\\x11HDF5DataParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\x12\\x12\\n\\nbatch_size\\x18\\x02 \\x01(\\r\\x12\\x16\\n\\x07shuffle\\x18\\x03 \\x01(\\x08:\\x05\\x66\\x61lse\\\"(\\n\\x13HDF5OutputParameter\\x12\\x11\\n\\tfile_name\\x18\\x01 \\x01(\\t\\\"^\\n\\x12HingeLossParameter\\x12\\x30\\n\\x04norm\\x18\\x01 \\x01(\\x0e\\x32\\x1e.caffe.HingeLossParameter.Norm:\\x02L1\\\"\\x16\\n\\x04Norm\\x12\\x06\\n\\x02L1\\x10\\x01\\x12\\x06\\n\\x02L2\\x10\\x02\\\"\\x94\\x02\\n\\x12ImageDataParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\x12\\x12\\n\\nbatch_size\\x18\\x04 \\x01(\\r\\x12\\x14\\n\\trand_skip\\x18\\x07 \\x01(\\r:\\x01\\x30\\x12\\x16\\n\\x07shuffle\\x18\\x08 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x15\\n\\nnew_height\\x18\\t \\x01(\\r:\\x01\\x30\\x12\\x14\\n\\tnew_width\\x18\\n \\x01(\\r:\\x01\\x30\\x12\\x16\\n\\x08is_color\\x18\\x0b \\x01(\\x08:\\x04true\\x12\\x10\\n\\x05scale\\x18\\x02 
\\x01(\\x02:\\x01\\x31\\x12\\x11\\n\\tmean_file\\x18\\x03 \\x01(\\t\\x12\\x14\\n\\tcrop_size\\x18\\x05 \\x01(\\r:\\x01\\x30\\x12\\x15\\n\\x06mirror\\x18\\x06 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x15\\n\\x0broot_folder\\x18\\x0c \\x01(\\t:\\x00\\\"\\xb8\\x03\\n\\x12VideoDataParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\x12\\x12\\n\\nbatch_size\\x18\\x04 \\x01(\\r\\x12\\x14\\n\\trand_skip\\x18\\x07 \\x01(\\r:\\x01\\x30\\x12\\x16\\n\\x07shuffle\\x18\\x08 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x15\\n\\nnew_height\\x18\\t \\x01(\\r:\\x01\\x30\\x12\\x14\\n\\tnew_width\\x18\\n \\x01(\\r:\\x01\\x30\\x12\\x15\\n\\nnew_length\\x18\\x0b \\x01(\\r:\\x01\\x31\\x12\\x17\\n\\x0cnum_segments\\x18\\x0c \\x01(\\r:\\x01\\x31\\x12\\x10\\n\\x05scale\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x11\\n\\tmean_file\\x18\\x03 \\x01(\\t\\x12\\x14\\n\\tcrop_size\\x18\\x05 \\x01(\\r:\\x01\\x30\\x12\\x15\\n\\x06mirror\\x18\\x06 \\x01(\\x08:\\x05\\x66\\x61lse\\x12:\\n\\x08modality\\x18\\r \\x01(\\x0e\\x32\\\".caffe.VideoDataParameter.Modality:\\x04\\x46LOW\\x12\\x14\\n\\x0cname_pattern\\x18\\x0e \\x01(\\t\\x12\\x16\\n\\x07\\x65ncoded\\x18\\x0f \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x18\\n\\tgrayscale\\x18\\x10 \\x01(\\x08:\\x05\\x66\\x61lse\\\"\\x1d\\n\\x08Modality\\x12\\x07\\n\\x03RGB\\x10\\x00\\x12\\x08\\n\\x04\\x46LOW\\x10\\x01\\\"\\'\\n\\x15InfogainLossParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\\"\\xb1\\x01\\n\\x15InnerProductParameter\\x12\\x12\\n\\nnum_output\\x18\\x01 \\x01(\\r\\x12\\x17\\n\\tbias_term\\x18\\x02 \\x01(\\x08:\\x04true\\x12-\\n\\rweight_filler\\x18\\x03 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12+\\n\\x0b\\x62ias_filler\\x18\\x04 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12\\x0f\\n\\x04\\x61xis\\x18\\x05 \\x01(\\x05:\\x01\\x31\\\"D\\n\\x0cLogParameter\\x12\\x10\\n\\x04\\x62\\x61se\\x18\\x01 \\x01(\\x02:\\x02-1\\x12\\x10\\n\\x05scale\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x10\\n\\x05shift\\x18\\x03 
\\x01(\\x02:\\x01\\x30\\\"\\xd6\\x01\\n\\x0cLRNParameter\\x12\\x15\\n\\nlocal_size\\x18\\x01 \\x01(\\r:\\x01\\x35\\x12\\x10\\n\\x05\\x61lpha\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x12\\n\\x04\\x62\\x65ta\\x18\\x03 \\x01(\\x02:\\x04\\x30.75\\x12\\x44\\n\\x0bnorm_region\\x18\\x04 \\x01(\\x0e\\x32\\x1e.caffe.LRNParameter.NormRegion:\\x0f\\x41\\x43ROSS_CHANNELS\\x12\\x0c\\n\\x01k\\x18\\x05 \\x01(\\x02:\\x01\\x31\\\"5\\n\\nNormRegion\\x12\\x13\\n\\x0f\\x41\\x43ROSS_CHANNELS\\x10\\x00\\x12\\x12\\n\\x0eWITHIN_CHANNEL\\x10\\x01\\\"Z\\n\\x13MemoryDataParameter\\x12\\x12\\n\\nbatch_size\\x18\\x01 \\x01(\\r\\x12\\x10\\n\\x08\\x63hannels\\x18\\x02 \\x01(\\r\\x12\\x0e\\n\\x06height\\x18\\x03 \\x01(\\r\\x12\\r\\n\\x05width\\x18\\x04 \\x01(\\r\\\"d\\n\\x0cMVNParameter\\x12 \\n\\x12normalize_variance\\x18\\x01 \\x01(\\x08:\\x04true\\x12\\x1e\\n\\x0f\\x61\\x63ross_channels\\x18\\x02 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x12\\n\\x03\\x65ps\\x18\\x03 \\x01(\\x02:\\x05\\x31\\x65-09\\\"\\xa2\\x03\\n\\x10PoolingParameter\\x12\\x35\\n\\x04pool\\x18\\x01 \\x01(\\x0e\\x32\\\".caffe.PoolingParameter.PoolMethod:\\x03MAX\\x12\\x0e\\n\\x03pad\\x18\\x04 \\x01(\\r:\\x01\\x30\\x12\\x10\\n\\x05pad_h\\x18\\t \\x01(\\r:\\x01\\x30\\x12\\x10\\n\\x05pad_w\\x18\\n \\x01(\\r:\\x01\\x30\\x12\\x13\\n\\x0bkernel_size\\x18\\x02 \\x01(\\r\\x12\\x10\\n\\x08kernel_h\\x18\\x05 \\x01(\\r\\x12\\x10\\n\\x08kernel_w\\x18\\x06 \\x01(\\r\\x12\\x11\\n\\x06stride\\x18\\x03 \\x01(\\r:\\x01\\x31\\x12\\x10\\n\\x08stride_h\\x18\\x07 \\x01(\\r\\x12\\x10\\n\\x08stride_w\\x18\\x08 \\x01(\\r\\x12\\x37\\n\\x06\\x65ngine\\x18\\x0b \\x01(\\x0e\\x32\\x1e.caffe.PoolingParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\x12\\x1d\\n\\x0eglobal_pooling\\x18\\x0c 
\\x01(\\x08:\\x05\\x66\\x61lse\\\".\\n\\nPoolMethod\\x12\\x07\\n\\x03MAX\\x10\\x00\\x12\\x07\\n\\x03\\x41VE\\x10\\x01\\x12\\x0e\\n\\nSTOCHASTIC\\x10\\x02\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"F\\n\\x0ePowerParameter\\x12\\x10\\n\\x05power\\x18\\x01 \\x01(\\x02:\\x01\\x31\\x12\\x10\\n\\x05scale\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x10\\n\\x05shift\\x18\\x03 \\x01(\\x02:\\x01\\x30\\\"C\\n\\x0fPythonParameter\\x12\\x0e\\n\\x06module\\x18\\x01 \\x01(\\t\\x12\\r\\n\\x05layer\\x18\\x02 \\x01(\\t\\x12\\x11\\n\\tparam_str\\x18\\x03 \\x01(\\t\\\"\\xc5\\x01\\n\\x12ReductionParameter\\x12=\\n\\toperation\\x18\\x01 \\x01(\\x0e\\x32%.caffe.ReductionParameter.ReductionOp:\\x03SUM\\x12\\x0f\\n\\x04\\x61xis\\x18\\x02 \\x01(\\x05:\\x01\\x30\\x12\\x10\\n\\x05\\x63oeff\\x18\\x03 \\x01(\\x02:\\x01\\x31\\x12\\x0c\\n\\x01k\\x18\\x04 \\x01(\\x05:\\x01\\x31\\\"?\\n\\x0bReductionOp\\x12\\x07\\n\\x03SUM\\x10\\x01\\x12\\x08\\n\\x04\\x41SUM\\x10\\x02\\x12\\t\\n\\x05SUMSQ\\x10\\x03\\x12\\x08\\n\\x04MEAN\\x10\\x04\\x12\\x08\\n\\x04TOPK\\x10\\x05\\\"\\x8d\\x01\\n\\rReLUParameter\\x12\\x19\\n\\x0enegative_slope\\x18\\x01 \\x01(\\x02:\\x01\\x30\\x12\\x34\\n\\x06\\x65ngine\\x18\\x02 \\x01(\\x0e\\x32\\x1b.caffe.ReLUParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"Z\\n\\x10ReshapeParameter\\x12\\x1f\\n\\x05shape\\x18\\x01 \\x01(\\x0b\\x32\\x10.caffe.BlobShape\\x12\\x0f\\n\\x04\\x61xis\\x18\\x02 \\x01(\\x05:\\x01\\x30\\x12\\x14\\n\\x08num_axes\\x18\\x03 \\x01(\\x05:\\x02-1\\\"d\\n\\x10SegDataParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\x12\\x10\\n\\x08root_dir\\x18\\x02 \\x01(\\t\\x12\\x16\\n\\x07shuffle\\x18\\x03 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x16\\n\\x07\\x62\\x61lance\\x18\\x04 
\\x01(\\x08:\\x05\\x66\\x61lse\\\"x\\n\\x10SigmoidParameter\\x12\\x37\\n\\x06\\x65ngine\\x18\\x01 \\x01(\\x0e\\x32\\x1e.caffe.SigmoidParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"L\\n\\x0eSliceParameter\\x12\\x0f\\n\\x04\\x61xis\\x18\\x03 \\x01(\\x05:\\x01\\x31\\x12\\x13\\n\\x0bslice_point\\x18\\x02 \\x03(\\r\\x12\\x14\\n\\tslice_dim\\x18\\x01 \\x01(\\r:\\x01\\x31\\\"\\x89\\x01\\n\\x10SoftmaxParameter\\x12\\x37\\n\\x06\\x65ngine\\x18\\x01 \\x01(\\x0e\\x32\\x1e.caffe.SoftmaxParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\x12\\x0f\\n\\x04\\x61xis\\x18\\x02 \\x01(\\x05:\\x01\\x31\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"r\\n\\rTanHParameter\\x12\\x34\\n\\x06\\x65ngine\\x18\\x01 \\x01(\\x0e\\x32\\x1b.caffe.TanHParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"*\\n\\x12ThresholdParameter\\x12\\x14\\n\\tthreshold\\x18\\x01 \\x01(\\x02:\\x01\\x30\\\"\\xc1\\x02\\n\\x13WindowDataParameter\\x12\\x0e\\n\\x06source\\x18\\x01 \\x01(\\t\\x12\\x10\\n\\x05scale\\x18\\x02 \\x01(\\x02:\\x01\\x31\\x12\\x11\\n\\tmean_file\\x18\\x03 \\x01(\\t\\x12\\x12\\n\\nbatch_size\\x18\\x04 \\x01(\\r\\x12\\x14\\n\\tcrop_size\\x18\\x05 \\x01(\\r:\\x01\\x30\\x12\\x15\\n\\x06mirror\\x18\\x06 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x19\\n\\x0c\\x66g_threshold\\x18\\x07 \\x01(\\x02:\\x03\\x30.5\\x12\\x19\\n\\x0c\\x62g_threshold\\x18\\x08 \\x01(\\x02:\\x03\\x30.5\\x12\\x19\\n\\x0b\\x66g_fraction\\x18\\t \\x01(\\x02:\\x04\\x30.25\\x12\\x16\\n\\x0b\\x63ontext_pad\\x18\\n \\x01(\\r:\\x01\\x30\\x12\\x17\\n\\tcrop_mode\\x18\\x0b 
\\x01(\\t:\\x04warp\\x12\\x1b\\n\\x0c\\x63\\x61\\x63he_images\\x18\\x0c \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x15\\n\\x0broot_folder\\x18\\r \\x01(\\t:\\x00\\\"\\xeb\\x01\\n\\x0cSPPParameter\\x12\\x16\\n\\x0epyramid_height\\x18\\x01 \\x01(\\r\\x12\\x31\\n\\x04pool\\x18\\x02 \\x01(\\x0e\\x32\\x1e.caffe.SPPParameter.PoolMethod:\\x03MAX\\x12\\x33\\n\\x06\\x65ngine\\x18\\x06 \\x01(\\x0e\\x32\\x1a.caffe.SPPParameter.Engine:\\x07\\x44\\x45\\x46\\x41ULT\\\".\\n\\nPoolMethod\\x12\\x07\\n\\x03MAX\\x10\\x00\\x12\\x07\\n\\x03\\x41VE\\x10\\x01\\x12\\x0e\\n\\nSTOCHASTIC\\x10\\x02\\\"+\\n\\x06\\x45ngine\\x12\\x0b\\n\\x07\\x44\\x45\\x46\\x41ULT\\x10\\x00\\x12\\t\\n\\x05\\x43\\x41\\x46\\x46\\x45\\x10\\x01\\x12\\t\\n\\x05\\x43UDNN\\x10\\x02\\\"Y\\n\\x13ROIPoolingParameter\\x12\\x13\\n\\x08pooled_h\\x18\\x01 \\x01(\\r:\\x01\\x30\\x12\\x13\\n\\x08pooled_w\\x18\\x02 \\x01(\\r:\\x01\\x30\\x12\\x18\\n\\rspatial_scale\\x18\\x03 \\x01(\\x02:\\x01\\x31\\\"\\xe0\\x13\\n\\x10V1LayerParameter\\x12\\x0e\\n\\x06\\x62ottom\\x18\\x02 \\x03(\\t\\x12\\x0b\\n\\x03top\\x18\\x03 \\x03(\\t\\x12\\x0c\\n\\x04name\\x18\\x04 \\x01(\\t\\x12$\\n\\x07include\\x18  \\x03(\\x0b\\x32\\x13.caffe.NetStateRule\\x12$\\n\\x07\\x65xclude\\x18! 
\\x03(\\x0b\\x32\\x13.caffe.NetStateRule\\x12/\\n\\x04type\\x18\\x05 \\x01(\\x0e\\x32!.caffe.V1LayerParameter.LayerType\\x12\\x1f\\n\\x05\\x62lobs\\x18\\x06 \\x03(\\x0b\\x32\\x10.caffe.BlobProto\\x12\\x0e\\n\\x05param\\x18\\xe9\\x07 \\x03(\\t\\x12>\\n\\x0f\\x62lob_share_mode\\x18\\xea\\x07 \\x03(\\x0e\\x32$.caffe.V1LayerParameter.DimCheckMode\\x12\\x10\\n\\x08\\x62lobs_lr\\x18\\x07 \\x03(\\x02\\x12\\x14\\n\\x0cweight_decay\\x18\\x08 \\x03(\\x02\\x12\\x13\\n\\x0bloss_weight\\x18# \\x03(\\x02\\x12\\x30\\n\\x0e\\x61\\x63\\x63uracy_param\\x18\\x1b \\x01(\\x0b\\x32\\x18.caffe.AccuracyParameter\\x12,\\n\\x0c\\x61rgmax_param\\x18\\x17 \\x01(\\x0b\\x32\\x16.caffe.ArgMaxParameter\\x12,\\n\\x0c\\x63oncat_param\\x18\\t \\x01(\\x0b\\x32\\x16.caffe.ConcatParameter\\x12?\\n\\x16\\x63ontrastive_loss_param\\x18( \\x01(\\x0b\\x32\\x1f.caffe.ContrastiveLossParameter\\x12\\x36\\n\\x11\\x63onvolution_param\\x18\\n \\x01(\\x0b\\x32\\x1b.caffe.ConvolutionParameter\\x12(\\n\\ndata_param\\x18\\x0b \\x01(\\x0b\\x32\\x14.caffe.DataParameter\\x12.\\n\\rdropout_param\\x18\\x0c \\x01(\\x0b\\x32\\x17.caffe.DropoutParameter\\x12\\x33\\n\\x10\\x64ummy_data_param\\x18\\x1a \\x01(\\x0b\\x32\\x19.caffe.DummyDataParameter\\x12.\\n\\reltwise_param\\x18\\x18 \\x01(\\x0b\\x32\\x17.caffe.EltwiseParameter\\x12&\\n\\texp_param\\x18) \\x01(\\x0b\\x32\\x13.caffe.ExpParameter\\x12\\x31\\n\\x0fhdf5_data_param\\x18\\r \\x01(\\x0b\\x32\\x18.caffe.HDF5DataParameter\\x12\\x35\\n\\x11hdf5_output_param\\x18\\x0e \\x01(\\x0b\\x32\\x1a.caffe.HDF5OutputParameter\\x12\\x33\\n\\x10hinge_loss_param\\x18\\x1d \\x01(\\x0b\\x32\\x19.caffe.HingeLossParameter\\x12\\x33\\n\\x10image_data_param\\x18\\x0f \\x01(\\x0b\\x32\\x19.caffe.ImageDataParameter\\x12\\x39\\n\\x13infogain_loss_param\\x18\\x10 \\x01(\\x0b\\x32\\x1c.caffe.InfogainLossParameter\\x12\\x39\\n\\x13inner_product_param\\x18\\x11 \\x01(\\x0b\\x32\\x1c.caffe.InnerProductParameter\\x12&\\n\\tlrn_param\\x18\\x12 
\\x01(\\x0b\\x32\\x13.caffe.LRNParameter\\x12\\x35\\n\\x11memory_data_param\\x18\\x16 \\x01(\\x0b\\x32\\x1a.caffe.MemoryDataParameter\\x12&\\n\\tmvn_param\\x18\\\" \\x01(\\x0b\\x32\\x13.caffe.MVNParameter\\x12.\\n\\rpooling_param\\x18\\x13 \\x01(\\x0b\\x32\\x17.caffe.PoolingParameter\\x12*\\n\\x0bpower_param\\x18\\x15 \\x01(\\x0b\\x32\\x15.caffe.PowerParameter\\x12(\\n\\nrelu_param\\x18\\x1e \\x01(\\x0b\\x32\\x14.caffe.ReLUParameter\\x12.\\n\\rsigmoid_param\\x18& \\x01(\\x0b\\x32\\x17.caffe.SigmoidParameter\\x12.\\n\\rsoftmax_param\\x18\\' \\x01(\\x0b\\x32\\x17.caffe.SoftmaxParameter\\x12*\\n\\x0bslice_param\\x18\\x1f \\x01(\\x0b\\x32\\x15.caffe.SliceParameter\\x12(\\n\\ntanh_param\\x18% \\x01(\\x0b\\x32\\x14.caffe.TanHParameter\\x12\\x32\\n\\x0fthreshold_param\\x18\\x19 \\x01(\\x0b\\x32\\x19.caffe.ThresholdParameter\\x12\\x35\\n\\x11window_data_param\\x18\\x14 \\x01(\\x0b\\x32\\x1a.caffe.WindowDataParameter\\x12\\x37\\n\\x0ftransform_param\\x18$ \\x01(\\x0b\\x32\\x1e.caffe.TransformationParameter\\x12(\\n\\nloss_param\\x18* \\x01(\\x0b\\x32\\x14.caffe.LossParameter\\x12&\\n\\x05layer\\x18\\x01 \\x01(\\x0b\\x32\\x17.caffe.V0LayerParameter\\\"\\xd8\\x04\\n\\tLayerType\\x12\\x08\\n\\x04NONE\\x10\\x00\\x12\\n\\n\\x06\\x41\\x42SVAL\\x10#\\x12\\x0c\\n\\x08\\x41\\x43\\x43URACY\\x10\\x01\\x12\\n\\n\\x06\\x41RGMAX\\x10\\x1e\\x12\\x08\\n\\x04\\x42NLL\\x10\\x02\\x12\\n\\n\\x06\\x43ONCAT\\x10\\x03\\x12\\x14\\n\\x10\\x43ONTRASTIVE_LOSS\\x10%\\x12\\x0f\\n\\x0b\\x43ONVOLUTION\\x10\\x04\\x12\\x08\\n\\x04\\x44\\x41TA\\x10\\x05\\x12\\x11\\n\\rDECONVOLUTION\\x10\\'\\x12\\x0b\\n\\x07\\x44ROPOUT\\x10\\x06\\x12\\x0e\\n\\nDUMMY_DATA\\x10 
\\x12\\x12\\n\\x0e\\x45UCLIDEAN_LOSS\\x10\\x07\\x12\\x0b\\n\\x07\\x45LTWISE\\x10\\x19\\x12\\x07\\n\\x03\\x45XP\\x10&\\x12\\x0b\\n\\x07\\x46LATTEN\\x10\\x08\\x12\\r\\n\\tHDF5_DATA\\x10\\t\\x12\\x0f\\n\\x0bHDF5_OUTPUT\\x10\\n\\x12\\x0e\\n\\nHINGE_LOSS\\x10\\x1c\\x12\\n\\n\\x06IM2COL\\x10\\x0b\\x12\\x0e\\n\\nIMAGE_DATA\\x10\\x0c\\x12\\x11\\n\\rINFOGAIN_LOSS\\x10\\r\\x12\\x11\\n\\rINNER_PRODUCT\\x10\\x0e\\x12\\x07\\n\\x03LRN\\x10\\x0f\\x12\\x0f\\n\\x0bMEMORY_DATA\\x10\\x1d\\x12\\x1d\\n\\x19MULTINOMIAL_LOGISTIC_LOSS\\x10\\x10\\x12\\x07\\n\\x03MVN\\x10\\\"\\x12\\x0b\\n\\x07POOLING\\x10\\x11\\x12\\t\\n\\x05POWER\\x10\\x1a\\x12\\x08\\n\\x04RELU\\x10\\x12\\x12\\x0b\\n\\x07SIGMOID\\x10\\x13\\x12\\x1e\\n\\x1aSIGMOID_CROSS_ENTROPY_LOSS\\x10\\x1b\\x12\\x0b\\n\\x07SILENCE\\x10$\\x12\\x0b\\n\\x07SOFTMAX\\x10\\x14\\x12\\x10\\n\\x0cSOFTMAX_LOSS\\x10\\x15\\x12\\t\\n\\x05SPLIT\\x10\\x16\\x12\\t\\n\\x05SLICE\\x10!\\x12\\x08\\n\\x04TANH\\x10\\x17\\x12\\x0f\\n\\x0bWINDOW_DATA\\x10\\x18\\x12\\r\\n\\tTHRESHOLD\\x10\\x1f\\\"*\\n\\x0c\\x44imCheckMode\\x12\\n\\n\\x06STRICT\\x10\\x00\\x12\\x0e\\n\\nPERMISSIVE\\x10\\x01\\\"\\xfd\\x07\\n\\x10V0LayerParameter\\x12\\x0c\\n\\x04name\\x18\\x01 \\x01(\\t\\x12\\x0c\\n\\x04type\\x18\\x02 \\x01(\\t\\x12\\x12\\n\\nnum_output\\x18\\x03 \\x01(\\r\\x12\\x16\\n\\x08\\x62iasterm\\x18\\x04 \\x01(\\x08:\\x04true\\x12-\\n\\rweight_filler\\x18\\x05 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12+\\n\\x0b\\x62ias_filler\\x18\\x06 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12\\x0e\\n\\x03pad\\x18\\x07 \\x01(\\r:\\x01\\x30\\x12\\x12\\n\\nkernelsize\\x18\\x08 \\x01(\\r\\x12\\x10\\n\\x05group\\x18\\t \\x01(\\r:\\x01\\x31\\x12\\x11\\n\\x06stride\\x18\\n \\x01(\\r:\\x01\\x31\\x12\\x35\\n\\x04pool\\x18\\x0b \\x01(\\x0e\\x32\\\".caffe.V0LayerParameter.PoolMethod:\\x03MAX\\x12\\x1a\\n\\rdropout_ratio\\x18\\x0c \\x01(\\x02:\\x03\\x30.5\\x12\\x15\\n\\nlocal_size\\x18\\r \\x01(\\r:\\x01\\x35\\x12\\x10\\n\\x05\\x61lpha\\x18\\x0e 
\\x01(\\x02:\\x01\\x31\\x12\\x12\\n\\x04\\x62\\x65ta\\x18\\x0f \\x01(\\x02:\\x04\\x30.75\\x12\\x0c\\n\\x01k\\x18\\x16 \\x01(\\x02:\\x01\\x31\\x12\\x0e\\n\\x06source\\x18\\x10 \\x01(\\t\\x12\\x10\\n\\x05scale\\x18\\x11 \\x01(\\x02:\\x01\\x31\\x12\\x10\\n\\x08meanfile\\x18\\x12 \\x01(\\t\\x12\\x11\\n\\tbatchsize\\x18\\x13 \\x01(\\r\\x12\\x13\\n\\x08\\x63ropsize\\x18\\x14 \\x01(\\r:\\x01\\x30\\x12\\x15\\n\\x06mirror\\x18\\x15 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x1f\\n\\x05\\x62lobs\\x18\\x32 \\x03(\\x0b\\x32\\x10.caffe.BlobProto\\x12\\x10\\n\\x08\\x62lobs_lr\\x18\\x33 \\x03(\\x02\\x12\\x14\\n\\x0cweight_decay\\x18\\x34 \\x03(\\x02\\x12\\x14\\n\\trand_skip\\x18\\x35 \\x01(\\r:\\x01\\x30\\x12\\x1d\\n\\x10\\x64\\x65t_fg_threshold\\x18\\x36 \\x01(\\x02:\\x03\\x30.5\\x12\\x1d\\n\\x10\\x64\\x65t_bg_threshold\\x18\\x37 \\x01(\\x02:\\x03\\x30.5\\x12\\x1d\\n\\x0f\\x64\\x65t_fg_fraction\\x18\\x38 \\x01(\\x02:\\x04\\x30.25\\x12\\x1a\\n\\x0f\\x64\\x65t_context_pad\\x18: \\x01(\\r:\\x01\\x30\\x12\\x1b\\n\\rdet_crop_mode\\x18; \\x01(\\t:\\x04warp\\x12\\x12\\n\\x07new_num\\x18< \\x01(\\x05:\\x01\\x30\\x12\\x17\\n\\x0cnew_channels\\x18= \\x01(\\x05:\\x01\\x30\\x12\\x15\\n\\nnew_height\\x18> \\x01(\\x05:\\x01\\x30\\x12\\x14\\n\\tnew_width\\x18? 
\\x01(\\x05:\\x01\\x30\\x12\\x1d\\n\\x0eshuffle_images\\x18@ \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x15\\n\\nconcat_dim\\x18\\x41 \\x01(\\r:\\x01\\x31\\x12\\x36\\n\\x11hdf5_output_param\\x18\\xe9\\x07 \\x01(\\x0b\\x32\\x1a.caffe.HDF5OutputParameter\\\".\\n\\nPoolMethod\\x12\\x07\\n\\x03MAX\\x10\\x00\\x12\\x07\\n\\x03\\x41VE\\x10\\x01\\x12\\x0e\\n\\nSTOCHASTIC\\x10\\x02\\\"W\\n\\x0ePReLUParameter\\x12&\\n\\x06\\x66iller\\x18\\x01 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12\\x1d\\n\\x0e\\x63hannel_shared\\x18\\x02 \\x01(\\x08:\\x05\\x66\\x61lse\\\"\\xa5\\x01\\n\\x0eScaleParameter\\x12\\x0f\\n\\x04\\x61xis\\x18\\x01 \\x01(\\x05:\\x01\\x31\\x12\\x13\\n\\x08num_axes\\x18\\x02 \\x01(\\x05:\\x01\\x31\\x12&\\n\\x06\\x66iller\\x18\\x03 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\x12\\x18\\n\\tbias_term\\x18\\x04 \\x01(\\x08:\\x05\\x66\\x61lse\\x12+\\n\\x0b\\x62ias_filler\\x18\\x05 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\\"]\\n\\rBiasParameter\\x12\\x0f\\n\\x04\\x61xis\\x18\\x01 \\x01(\\x05:\\x01\\x31\\x12\\x13\\n\\x08num_axes\\x18\\x02 \\x01(\\x05:\\x01\\x31\\x12&\\n\\x06\\x66iller\\x18\\x03 \\x01(\\x0b\\x32\\x16.caffe.FillerParameter\\\"p\\n\\x17\\x42\\x61tchReductionParameter\\x12\\r\\n\\x05level\\x18\\x01 \\x03(\\x05\\x12\\x32\\n\\x0freduction_param\\x18\\x02 \\x01(\\x0b\\x32\\x19.caffe.ReductionParameter\\x12\\x12\\n\\x03pos\\x18\\x03 \\x01(\\x08:\\x05\\x66\\x61lse\\\"o\\n\\x1bMemoryOptimizationParameter\\x12\\x1c\\n\\x0eoptimize_train\\x18\\x01 \\x01(\\x08:\\x04true\\x12\\x1c\\n\\roptimize_test\\x18\\x02 \\x01(\\x08:\\x05\\x66\\x61lse\\x12\\x14\\n\\x0c\\x65xclude_blob\\x18\\x03 \\x03(\\t*\\x1c\\n\\x05Phase\\x12\\t\\n\\x05TRAIN\\x10\\x00\\x12\\x08\\n\\x04TEST\\x10\\x01')\n\n_PHASE = _descriptor.EnumDescriptor(\n  name='Phase',\n  full_name='caffe.Phase',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='TRAIN', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n    
  name='TEST', index=1, number=1,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=15473,\n  serialized_end=15501,\n)\n\nPhase = enum_type_wrapper.EnumTypeWrapper(_PHASE)\nTRAIN = 0\nTEST = 1\n\n\n_FILLERPARAMETER_VARIANCENORM = _descriptor.EnumDescriptor(\n  name='VarianceNorm',\n  full_name='caffe.FillerParameter.VarianceNorm',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='FAN_IN', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='FAN_OUT', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='AVERAGE', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=608,\n  serialized_end=660,\n)\n\n_SOLVERPARAMETER_SOLVERMODE = _descriptor.EnumDescriptor(\n  name='SolverMode',\n  full_name='caffe.SolverParameter.SolverMode',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='CPU', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='GPU', index=1, number=1,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=2035,\n  serialized_end=2065,\n)\n\n_SOLVERPARAMETER_SOLVERTYPE = _descriptor.EnumDescriptor(\n  name='SolverType',\n  full_name='caffe.SolverParameter.SolverType',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='SGD', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='NESTEROV', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='ADAGRAD', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  
options=None,\n  serialized_start=2067,\n  serialized_end=2115,\n)\n\n_PARAMSPEC_DIMCHECKMODE = _descriptor.EnumDescriptor(\n  name='DimCheckMode',\n  full_name='caffe.ParamSpec.DimCheckMode',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='STRICT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='PERMISSIVE', index=1, number=1,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=2546,\n  serialized_end=2588,\n)\n\n_BNPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.BNParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_CONVOLUTIONPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.ConvolutionParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_DATAPARAMETER_DB = _descriptor.EnumDescriptor(\n  name='DB',\n  full_name='caffe.DataParameter.DB',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n   
 _descriptor.EnumValueDescriptor(\n      name='LEVELDB', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='LMDB', index=1, number=1,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=6886,\n  serialized_end=6913,\n)\n\n_ELTWISEPARAMETER_ELTWISEOP = _descriptor.EnumDescriptor(\n  name='EltwiseOp',\n  full_name='caffe.EltwiseParameter.EltwiseOp',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='PROD', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SUM', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='MAX', index=2, number=2,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='STOCHASTIC_SUM', index=3, number=3,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=7253,\n  serialized_end=7312,\n)\n\n_HINGELOSSPARAMETER_NORM = _descriptor.EnumDescriptor(\n  name='Norm',\n  full_name='caffe.HingeLossParameter.Norm',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='L1', index=0, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='L2', index=1, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=7638,\n  serialized_end=7660,\n)\n\n_VIDEODATAPARAMETER_MODALITY = _descriptor.EnumDescriptor(\n  name='Modality',\n  full_name='caffe.VideoDataParameter.Modality',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='RGB', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='FLOW', index=1, number=1,\n      options=None,\n      
type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=8353,\n  serialized_end=8382,\n)\n\n_LRNPARAMETER_NORMREGION = _descriptor.EnumDescriptor(\n  name='NormRegion',\n  full_name='caffe.LRNParameter.NormRegion',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='ACROSS_CHANNELS', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='WITHIN_CHANNEL', index=1, number=1,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=8837,\n  serialized_end=8890,\n)\n\n_POOLINGPARAMETER_POOLMETHOD = _descriptor.EnumDescriptor(\n  name='PoolMethod',\n  full_name='caffe.PoolingParameter.PoolMethod',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='MAX', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='AVE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='STOCHASTIC', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=9414,\n  serialized_end=9460,\n)\n\n_POOLINGPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.PoolingParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_REDUCTIONPARAMETER_REDUCTIONOP = _descriptor.EnumDescriptor(\n  
name='ReductionOp',\n  full_name='caffe.ReductionParameter.ReductionOp',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='SUM', index=0, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='ASUM', index=1, number=2,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SUMSQ', index=2, number=3,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='MEAN', index=3, number=4,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='TOPK', index=4, number=5,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=9783,\n  serialized_end=9846,\n)\n\n_RELUPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.ReLUParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_SIGMOIDPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.SigmoidParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  
options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_SOFTMAXPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.SoftmaxParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_TANHPARAMETER_ENGINE = _descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.TanHParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_SPPPARAMETER_POOLMETHOD = _descriptor.EnumDescriptor(\n  name='PoolMethod',\n  full_name='caffe.SPPParameter.PoolMethod',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='MAX', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='AVE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='STOCHASTIC', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=9414,\n  serialized_end=9460,\n)\n\n_SPPPARAMETER_ENGINE = 
_descriptor.EnumDescriptor(\n  name='Engine',\n  full_name='caffe.SPPParameter.Engine',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='DEFAULT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CAFFE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CUDNN', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=5924,\n  serialized_end=5967,\n)\n\n_V1LAYERPARAMETER_LAYERTYPE = _descriptor.EnumDescriptor(\n  name='LayerType',\n  full_name='caffe.V1LayerParameter.LayerType',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='NONE', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='ABSVAL', index=1, number=35,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='ACCURACY', index=2, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='ARGMAX', index=3, number=30,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='BNLL', index=4, number=2,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CONCAT', index=5, number=3,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CONTRASTIVE_LOSS', index=6, number=37,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='CONVOLUTION', index=7, number=4,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='DATA', index=8, number=5,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='DECONVOLUTION', index=9, number=39,\n      options=None,\n      type=None),\n    
_descriptor.EnumValueDescriptor(\n      name='DROPOUT', index=10, number=6,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='DUMMY_DATA', index=11, number=32,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='EUCLIDEAN_LOSS', index=12, number=7,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='ELTWISE', index=13, number=25,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='EXP', index=14, number=38,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='FLATTEN', index=15, number=8,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='HDF5_DATA', index=16, number=9,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='HDF5_OUTPUT', index=17, number=10,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='HINGE_LOSS', index=18, number=28,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='IM2COL', index=19, number=11,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='IMAGE_DATA', index=20, number=12,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='INFOGAIN_LOSS', index=21, number=13,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='INNER_PRODUCT', index=22, number=14,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='LRN', index=23, number=15,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='MEMORY_DATA', index=24, number=29,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='MULTINOMIAL_LOGISTIC_LOSS', index=25, number=16,\n      options=None,\n      type=None),\n    
_descriptor.EnumValueDescriptor(\n      name='MVN', index=26, number=34,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='POOLING', index=27, number=17,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='POWER', index=28, number=26,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='RELU', index=29, number=18,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SIGMOID', index=30, number=19,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SIGMOID_CROSS_ENTROPY_LOSS', index=31, number=27,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SILENCE', index=32, number=36,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SOFTMAX', index=33, number=20,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SOFTMAX_LOSS', index=34, number=21,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SPLIT', index=35, number=22,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='SLICE', index=36, number=33,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='TANH', index=37, number=23,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='WINDOW_DATA', index=38, number=24,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='THRESHOLD', index=39, number=31,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=13224,\n  serialized_end=13824,\n)\n\n_V1LAYERPARAMETER_DIMCHECKMODE = _descriptor.EnumDescriptor(\n  name='DimCheckMode',\n  full_name='caffe.V1LayerParameter.DimCheckMode',\n  filename=None,\n  file=DESCRIPTOR,\n 
 values=[\n    _descriptor.EnumValueDescriptor(\n      name='STRICT', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='PERMISSIVE', index=1, number=1,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=2546,\n  serialized_end=2588,\n)\n\n_V0LAYERPARAMETER_POOLMETHOD = _descriptor.EnumDescriptor(\n  name='PoolMethod',\n  full_name='caffe.V0LayerParameter.PoolMethod',\n  filename=None,\n  file=DESCRIPTOR,\n  values=[\n    _descriptor.EnumValueDescriptor(\n      name='MAX', index=0, number=0,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='AVE', index=1, number=1,\n      options=None,\n      type=None),\n    _descriptor.EnumValueDescriptor(\n      name='STOCHASTIC', index=2, number=2,\n      options=None,\n      type=None),\n  ],\n  containing_type=None,\n  options=None,\n  serialized_start=9414,\n  serialized_end=9460,\n)\n\n\n_BLOBSHAPE = _descriptor.Descriptor(\n  name='BlobShape',\n  full_name='caffe.BlobShape',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='dim', full_name='caffe.BlobShape.dim', index=0,\n      number=1, type=3, cpp_type=2, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=_descriptor._ParseOptions(descriptor_pb2.FieldOptions(), '\\020\\001')),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=22,\n  serialized_end=50,\n)\n\n\n_BLOBPROTO = _descriptor.Descriptor(\n  name='BlobProto',\n  full_name='caffe.BlobProto',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='shape', full_name='caffe.BlobProto.shape', 
index=0,\n      number=7, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='data', full_name='caffe.BlobProto.data', index=1,\n      number=5, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=_descriptor._ParseOptions(descriptor_pb2.FieldOptions(), '\\020\\001')),\n    _descriptor.FieldDescriptor(\n      name='diff', full_name='caffe.BlobProto.diff', index=2,\n      number=6, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=_descriptor._ParseOptions(descriptor_pb2.FieldOptions(), '\\020\\001')),\n    _descriptor.FieldDescriptor(\n      name='num', full_name='caffe.BlobProto.num', index=3,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='channels', full_name='caffe.BlobProto.channels', index=4,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='height', full_name='caffe.BlobProto.height', index=5,\n      number=3, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n 
     options=None),\n    _descriptor.FieldDescriptor(\n      name='width', full_name='caffe.BlobProto.width', index=6,\n      number=4, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=53,\n  serialized_end=207,\n)\n\n\n_BLOBPROTOVECTOR = _descriptor.Descriptor(\n  name='BlobProtoVector',\n  full_name='caffe.BlobProtoVector',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='blobs', full_name='caffe.BlobProtoVector.blobs', index=0,\n      number=1, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=209,\n  serialized_end=259,\n)\n\n\n_DATUM = _descriptor.Descriptor(\n  name='Datum',\n  full_name='caffe.Datum',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='channels', full_name='caffe.Datum.channels', index=0,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='height', full_name='caffe.Datum.height', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      
is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='width', full_name='caffe.Datum.width', index=2,\n      number=3, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='data', full_name='caffe.Datum.data', index=3,\n      number=4, type=12, cpp_type=9, label=1,\n      has_default_value=False, default_value=\"\",\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='label', full_name='caffe.Datum.label', index=4,\n      number=5, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='float_data', full_name='caffe.Datum.float_data', index=5,\n      number=6, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='encoded', full_name='caffe.Datum.encoded', index=6,\n      number=7, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=262,\n  serialized_end=391,\n)\n\n\n_FILLERPARAMETER = _descriptor.Descriptor(\n  name='FillerParameter',\n  
full_name='caffe.FillerParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='type', full_name='caffe.FillerParameter.type', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=True, default_value=unicode(\"constant\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='value', full_name='caffe.FillerParameter.value', index=1,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='min', full_name='caffe.FillerParameter.min', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='max', full_name='caffe.FillerParameter.max', index=3,\n      number=4, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean', full_name='caffe.FillerParameter.mean', index=4,\n      number=5, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='std', full_name='caffe.FillerParameter.std', index=5,\n      number=6, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n     
 message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='sparse', full_name='caffe.FillerParameter.sparse', index=6,\n      number=7, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='variance_norm', full_name='caffe.FillerParameter.variance_norm', index=7,\n      number=8, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _FILLERPARAMETER_VARIANCENORM,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=394,\n  serialized_end=660,\n)\n\n\n_NETPARAMETER = _descriptor.Descriptor(\n  name='NetParameter',\n  full_name='caffe.NetParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='name', full_name='caffe.NetParameter.name', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='input', full_name='caffe.NetParameter.input', index=1,\n      number=3, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='input_shape', 
full_name='caffe.NetParameter.input_shape', index=2,\n      number=8, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='input_dim', full_name='caffe.NetParameter.input_dim', index=3,\n      number=4, type=5, cpp_type=1, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='force_backward', full_name='caffe.NetParameter.force_backward', index=4,\n      number=5, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='state', full_name='caffe.NetParameter.state', index=5,\n      number=6, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='debug_info', full_name='caffe.NetParameter.debug_info', index=6,\n      number=7, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='layer', full_name='caffe.NetParameter.layer', index=7,\n      number=100, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n    _descriptor.FieldDescriptor(\n      name='mem_param', full_name='caffe.NetParameter.mem_param', index=8,\n      number=200, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='layers', full_name='caffe.NetParameter.layers', index=9,\n      number=2, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=663,\n  serialized_end=989,\n)\n\n\n_SOLVERPARAMETER = _descriptor.Descriptor(\n  name='SolverParameter',\n  full_name='caffe.SolverParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='net', full_name='caffe.SolverParameter.net', index=0,\n      number=24, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='net_param', full_name='caffe.SolverParameter.net_param', index=1,\n      number=25, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='train_net', full_name='caffe.SolverParameter.train_net', index=2,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", 
\"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_net', full_name='caffe.SolverParameter.test_net', index=3,\n      number=2, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='train_net_param', full_name='caffe.SolverParameter.train_net_param', index=4,\n      number=21, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_net_param', full_name='caffe.SolverParameter.test_net_param', index=5,\n      number=22, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='train_state', full_name='caffe.SolverParameter.train_state', index=6,\n      number=26, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_state', full_name='caffe.SolverParameter.test_state', index=7,\n      number=27, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_iter', 
full_name='caffe.SolverParameter.test_iter', index=8,\n      number=3, type=5, cpp_type=1, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_interval', full_name='caffe.SolverParameter.test_interval', index=9,\n      number=4, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_compute_loss', full_name='caffe.SolverParameter.test_compute_loss', index=10,\n      number=19, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='test_initialization', full_name='caffe.SolverParameter.test_initialization', index=11,\n      number=32, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='base_lr', full_name='caffe.SolverParameter.base_lr', index=12,\n      number=5, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='display', full_name='caffe.SolverParameter.display', index=13,\n      number=6, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      
is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='average_loss', full_name='caffe.SolverParameter.average_loss', index=14,\n      number=33, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='max_iter', full_name='caffe.SolverParameter.max_iter', index=15,\n      number=7, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='iter_size', full_name='caffe.SolverParameter.iter_size', index=16,\n      number=36, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='lr_policy', full_name='caffe.SolverParameter.lr_policy', index=17,\n      number=8, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='gamma', full_name='caffe.SolverParameter.gamma', index=18,\n      number=9, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='power', full_name='caffe.SolverParameter.power', index=19,\n      number=10, type=2, cpp_type=6, label=1,\n      has_default_value=False, 
default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='momentum', full_name='caffe.SolverParameter.momentum', index=20,\n      number=11, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='weight_decay', full_name='caffe.SolverParameter.weight_decay', index=21,\n      number=12, type=2, cpp_type=6, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='regularization_type', full_name='caffe.SolverParameter.regularization_type', index=22,\n      number=29, type=9, cpp_type=9, label=1,\n      has_default_value=True, default_value=unicode(\"L2\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stepsize', full_name='caffe.SolverParameter.stepsize', index=23,\n      number=13, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stepvalue', full_name='caffe.SolverParameter.stepvalue', index=24,\n      number=34, type=5, cpp_type=1, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='clip_gradients', 
full_name='caffe.SolverParameter.clip_gradients', index=25,\n      number=35, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='snapshot', full_name='caffe.SolverParameter.snapshot', index=26,\n      number=14, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='snapshot_prefix', full_name='caffe.SolverParameter.snapshot_prefix', index=27,\n      number=15, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='snapshot_diff', full_name='caffe.SolverParameter.snapshot_diff', index=28,\n      number=16, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='solver_mode', full_name='caffe.SolverParameter.solver_mode', index=29,\n      number=17, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='device_id', full_name='caffe.SolverParameter.device_id', index=30,\n      number=18, type=5, cpp_type=1, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      
is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='group_id', full_name='caffe.SolverParameter.group_id', index=31,\n      number=38, type=5, cpp_type=1, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='random_seed', full_name='caffe.SolverParameter.random_seed', index=32,\n      number=20, type=3, cpp_type=2, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='solver_type', full_name='caffe.SolverParameter.solver_type', index=33,\n      number=30, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='delta', full_name='caffe.SolverParameter.delta', index=34,\n      number=31, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1e-08,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='debug_info', full_name='caffe.SolverParameter.debug_info', index=35,\n      number=23, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='snapshot_after_train', full_name='caffe.SolverParameter.snapshot_after_train', index=36,\n      number=28, type=8, cpp_type=7, label=1,\n      
has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='richness', full_name='caffe.SolverParameter.richness', index=37,\n      number=37, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=300,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _SOLVERPARAMETER_SOLVERMODE,\n    _SOLVERPARAMETER_SOLVERTYPE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=992,\n  serialized_end=2115,\n)\n\n\n_SOLVERSTATE = _descriptor.Descriptor(\n  name='SolverState',\n  full_name='caffe.SolverState',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='iter', full_name='caffe.SolverState.iter', index=0,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='learned_net', full_name='caffe.SolverState.learned_net', index=1,\n      number=2, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='history', full_name='caffe.SolverState.history', index=2,\n      number=3, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='current_step', full_name='caffe.SolverState.current_step', index=3,\n      number=4, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=2117,\n  serialized_end=2225,\n)\n\n\n_NETSTATE = _descriptor.Descriptor(\n  name='NetState',\n  full_name='caffe.NetState',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='phase', full_name='caffe.NetState.phase', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='level', full_name='caffe.NetState.level', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stage', full_name='caffe.NetState.stage', index=2,\n      number=3, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=2227,\n  serialized_end=2305,\n)\n\n\n_NETSTATERULE = _descriptor.Descriptor(\n  name='NetStateRule',\n  full_name='caffe.NetStateRule',\n  filename=None,\n  
file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='phase', full_name='caffe.NetStateRule.phase', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='min_level', full_name='caffe.NetStateRule.min_level', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='max_level', full_name='caffe.NetStateRule.max_level', index=2,\n      number=3, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stage', full_name='caffe.NetStateRule.stage', index=3,\n      number=4, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='not_stage', full_name='caffe.NetStateRule.not_stage', index=4,\n      number=5, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=2307,\n  serialized_end=2422,\n)\n\n\n_PARAMSPEC = _descriptor.Descriptor(\n  name='ParamSpec',\n  
full_name='caffe.ParamSpec',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='name', full_name='caffe.ParamSpec.name', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='share_mode', full_name='caffe.ParamSpec.share_mode', index=1,\n      number=2, type=14, cpp_type=8, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='lr_mult', full_name='caffe.ParamSpec.lr_mult', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='decay_mult', full_name='caffe.ParamSpec.decay_mult', index=3,\n      number=4, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _PARAMSPEC_DIMCHECKMODE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=2425,\n  serialized_end=2588,\n)\n\n\n_LAYERPARAMETER = _descriptor.Descriptor(\n  name='LayerParameter',\n  full_name='caffe.LayerParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='name', full_name='caffe.LayerParameter.name', index=0,\n      number=1, type=9, 
cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='type', full_name='caffe.LayerParameter.type', index=1,\n      number=2, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bottom', full_name='caffe.LayerParameter.bottom', index=2,\n      number=3, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='top', full_name='caffe.LayerParameter.top', index=3,\n      number=4, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='phase', full_name='caffe.LayerParameter.phase', index=4,\n      number=10, type=14, cpp_type=8, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='loss_weight', full_name='caffe.LayerParameter.loss_weight', index=5,\n      number=5, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='param', 
full_name='caffe.LayerParameter.param', index=6,\n      number=6, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='blobs', full_name='caffe.LayerParameter.blobs', index=7,\n      number=7, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='propagate_down', full_name='caffe.LayerParameter.propagate_down', index=8,\n      number=11, type=8, cpp_type=7, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='include', full_name='caffe.LayerParameter.include', index=9,\n      number=8, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='exclude', full_name='caffe.LayerParameter.exclude', index=10,\n      number=9, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='transform_param', full_name='caffe.LayerParameter.transform_param', index=11,\n      number=100, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n   
   options=None),\n    _descriptor.FieldDescriptor(\n      name='loss_param', full_name='caffe.LayerParameter.loss_param', index=12,\n      number=101, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='accuracy_param', full_name='caffe.LayerParameter.accuracy_param', index=13,\n      number=102, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='argmax_param', full_name='caffe.LayerParameter.argmax_param', index=14,\n      number=103, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bn_param', full_name='caffe.LayerParameter.bn_param', index=15,\n      number=137, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='concat_param', full_name='caffe.LayerParameter.concat_param', index=16,\n      number=104, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='contrastive_loss_param', full_name='caffe.LayerParameter.contrastive_loss_param', index=17,\n      number=105, type=11, cpp_type=10, label=1,\n      
has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='convolution_param', full_name='caffe.LayerParameter.convolution_param', index=18,\n      number=106, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='data_param', full_name='caffe.LayerParameter.data_param', index=19,\n      number=107, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dropout_param', full_name='caffe.LayerParameter.dropout_param', index=20,\n      number=108, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dummy_data_param', full_name='caffe.LayerParameter.dummy_data_param', index=21,\n      number=109, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='eltwise_param', full_name='caffe.LayerParameter.eltwise_param', index=22,\n      number=110, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='exp_param', full_name='caffe.LayerParameter.exp_param', index=23,\n      number=111, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='flatten_param', full_name='caffe.LayerParameter.flatten_param', index=24,\n      number=135, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hdf5_data_param', full_name='caffe.LayerParameter.hdf5_data_param', index=25,\n      number=112, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hdf5_output_param', full_name='caffe.LayerParameter.hdf5_output_param', index=26,\n      number=113, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hinge_loss_param', full_name='caffe.LayerParameter.hinge_loss_param', index=27,\n      number=114, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='image_data_param', full_name='caffe.LayerParameter.image_data_param', index=28,\n      number=115, type=11, cpp_type=10, label=1,\n      
has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='infogain_loss_param', full_name='caffe.LayerParameter.infogain_loss_param', index=29,\n      number=116, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='inner_product_param', full_name='caffe.LayerParameter.inner_product_param', index=30,\n      number=117, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='log_param', full_name='caffe.LayerParameter.log_param', index=31,\n      number=134, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='lrn_param', full_name='caffe.LayerParameter.lrn_param', index=32,\n      number=118, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='memory_data_param', full_name='caffe.LayerParameter.memory_data_param', index=33,\n      number=119, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='mvn_param', full_name='caffe.LayerParameter.mvn_param', index=34,\n      number=120, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pooling_param', full_name='caffe.LayerParameter.pooling_param', index=35,\n      number=121, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='power_param', full_name='caffe.LayerParameter.power_param', index=36,\n      number=122, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='prelu_param', full_name='caffe.LayerParameter.prelu_param', index=37,\n      number=131, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='python_param', full_name='caffe.LayerParameter.python_param', index=38,\n      number=130, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='reduction_param', full_name='caffe.LayerParameter.reduction_param', index=39,\n      number=136, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n     
 message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='relu_param', full_name='caffe.LayerParameter.relu_param', index=40,\n      number=123, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='reshape_param', full_name='caffe.LayerParameter.reshape_param', index=41,\n      number=133, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='seg_data_param', full_name='caffe.LayerParameter.seg_data_param', index=42,\n      number=141, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='sigmoid_param', full_name='caffe.LayerParameter.sigmoid_param', index=43,\n      number=124, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='softmax_param', full_name='caffe.LayerParameter.softmax_param', index=44,\n      number=125, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='spp_param', 
full_name='caffe.LayerParameter.spp_param', index=45,\n      number=132, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='slice_param', full_name='caffe.LayerParameter.slice_param', index=46,\n      number=126, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='tanh_param', full_name='caffe.LayerParameter.tanh_param', index=47,\n      number=127, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='threshold_param', full_name='caffe.LayerParameter.threshold_param', index=48,\n      number=128, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='window_data_param', full_name='caffe.LayerParameter.window_data_param', index=49,\n      number=129, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='video_data_param', full_name='caffe.LayerParameter.video_data_param', index=50,\n      number=140, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, 
containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='roi_pooling_param', full_name='caffe.LayerParameter.roi_pooling_param', index=51,\n      number=150, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale_param', full_name='caffe.LayerParameter.scale_param', index=52,\n      number=160, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_param', full_name='caffe.LayerParameter.bias_param', index=53,\n      number=161, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='batch_reduction_param', full_name='caffe.LayerParameter.batch_reduction_param', index=54,\n      number=162, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=2591,\n  serialized_end=5039,\n)\n\n\n_TRANSFORMATIONPARAMETER = _descriptor.Descriptor(\n  name='TransformationParameter',\n  full_name='caffe.TransformationParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='scale', 
full_name='caffe.TransformationParameter.scale', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mirror', full_name='caffe.TransformationParameter.mirror', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='crop_size', full_name='caffe.TransformationParameter.crop_size', index=2,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean_file', full_name='caffe.TransformationParameter.mean_file', index=3,\n      number=4, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean_value', full_name='caffe.TransformationParameter.mean_value', index=4,\n      number=5, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='force_color', full_name='caffe.TransformationParameter.force_color', index=5,\n      number=6, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, 
containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='force_gray', full_name='caffe.TransformationParameter.force_gray', index=6,\n      number=7, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='fix_crop', full_name='caffe.TransformationParameter.fix_crop', index=7,\n      number=10, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='more_fix_crop', full_name='caffe.TransformationParameter.more_fix_crop', index=8,\n      number=15, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='multi_scale', full_name='caffe.TransformationParameter.multi_scale', index=9,\n      number=11, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale_ratios', full_name='caffe.TransformationParameter.scale_ratios', index=10,\n      number=12, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='max_distort', full_name='caffe.TransformationParameter.max_distort', 
index=11,\n      number=13, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_flow', full_name='caffe.TransformationParameter.is_flow', index=12,\n      number=14, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='original_image', full_name='caffe.TransformationParameter.original_image', index=13,\n      number=20, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride', full_name='caffe.TransformationParameter.stride', index=14,\n      number=16, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='upper_size', full_name='caffe.TransformationParameter.upper_size', index=15,\n      number=17, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='upper_height', full_name='caffe.TransformationParameter.upper_height', index=16,\n      number=18, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n   
   options=None),\n    _descriptor.FieldDescriptor(\n      name='upper_width', full_name='caffe.TransformationParameter.upper_width', index=17,\n      number=19, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=5042,\n  serialized_end=5490,\n)\n\n\n_LOSSPARAMETER = _descriptor.Descriptor(\n  name='LossParameter',\n  full_name='caffe.LossParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='ignore_label', full_name='caffe.LossParameter.ignore_label', index=0,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='normalize', full_name='caffe.LossParameter.normalize', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=5492,\n  serialized_end=5554,\n)\n\n\n_ACCURACYPARAMETER = _descriptor.Descriptor(\n  name='AccuracyParameter',\n  full_name='caffe.AccuracyParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='top_k', full_name='caffe.AccuracyParameter.top_k', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      
has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.AccuracyParameter.axis', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='ignore_label', full_name='caffe.AccuracyParameter.ignore_label', index=2,\n      number=3, type=5, cpp_type=1, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=5556,\n  serialized_end=5632,\n)\n\n\n_ARGMAXPARAMETER = _descriptor.Descriptor(\n  name='ArgMaxParameter',\n  full_name='caffe.ArgMaxParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='out_max_val', full_name='caffe.ArgMaxParameter.out_max_val', index=0,\n      number=1, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='top_k', full_name='caffe.ArgMaxParameter.top_k', index=1,\n      number=2, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  
options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=5634,\n  serialized_end=5697,\n)\n\n\n_BNPARAMETER = _descriptor.Descriptor(\n  name='BNParameter',\n  full_name='caffe.BNParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='slope_filler', full_name='caffe.BNParameter.slope_filler', index=0,\n      number=1, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_filler', full_name='caffe.BNParameter.bias_filler', index=1,\n      number=2, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='momentum', full_name='caffe.BNParameter.momentum', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.9,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='eps', full_name='caffe.BNParameter.eps', index=3,\n      number=4, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1e-05,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='frozen', full_name='caffe.BNParameter.frozen', index=4,\n      number=5, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.BNParameter.engine', index=5,\n      number=6, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _BNPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=5700,\n  serialized_end=5967,\n)\n\n\n_CONCATPARAMETER = _descriptor.Descriptor(\n  name='ConcatParameter',\n  full_name='caffe.ConcatParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.ConcatParameter.axis', index=0,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='concat_dim', full_name='caffe.ConcatParameter.concat_dim', index=1,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=5969,\n  serialized_end=6026,\n)\n\n\n_CONTRASTIVELOSSPARAMETER = _descriptor.Descriptor(\n  name='ContrastiveLossParameter',\n  full_name='caffe.ContrastiveLossParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='margin', full_name='caffe.ContrastiveLossParameter.margin', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      
has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='legacy_version', full_name='caffe.ContrastiveLossParameter.legacy_version', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=6028,\n  serialized_end=6104,\n)\n\n\n_CONVOLUTIONPARAMETER = _descriptor.Descriptor(\n  name='ConvolutionParameter',\n  full_name='caffe.ConvolutionParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='num_output', full_name='caffe.ConvolutionParameter.num_output', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_term', full_name='caffe.ConvolutionParameter.bias_term', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pad', full_name='caffe.ConvolutionParameter.pad', index=2,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='pad_h', full_name='caffe.ConvolutionParameter.pad_h', index=3,\n      number=9, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pad_w', full_name='caffe.ConvolutionParameter.pad_w', index=4,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernel_size', full_name='caffe.ConvolutionParameter.kernel_size', index=5,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernel_h', full_name='caffe.ConvolutionParameter.kernel_h', index=6,\n      number=11, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernel_w', full_name='caffe.ConvolutionParameter.kernel_w', index=7,\n      number=12, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='group', full_name='caffe.ConvolutionParameter.group', index=8,\n      number=5, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n     
 is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride', full_name='caffe.ConvolutionParameter.stride', index=9,\n      number=6, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride_h', full_name='caffe.ConvolutionParameter.stride_h', index=10,\n      number=13, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride_w', full_name='caffe.ConvolutionParameter.stride_w', index=11,\n      number=14, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dilation', full_name='caffe.ConvolutionParameter.dilation', index=12,\n      number=16, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dilation_h', full_name='caffe.ConvolutionParameter.dilation_h', index=13,\n      number=17, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dilation_w', full_name='caffe.ConvolutionParameter.dilation_w', index=14,\n      number=18, type=13, cpp_type=3, label=1,\n      
has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='weight_filler', full_name='caffe.ConvolutionParameter.weight_filler', index=15,\n      number=7, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_filler', full_name='caffe.ConvolutionParameter.bias_filler', index=16,\n      number=8, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.ConvolutionParameter.engine', index=17,\n      number=15, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _CONVOLUTIONPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=6107,\n  serialized_end=6615,\n)\n\n\n_DATAPARAMETER = _descriptor.Descriptor(\n  name='DataParameter',\n  full_name='caffe.DataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.DataParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n    _descriptor.FieldDescriptor(\n      name='batch_size', full_name='caffe.DataParameter.batch_size', index=1,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='rand_skip', full_name='caffe.DataParameter.rand_skip', index=2,\n      number=7, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='backend', full_name='caffe.DataParameter.backend', index=3,\n      number=8, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.DataParameter.scale', index=4,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean_file', full_name='caffe.DataParameter.mean_file', index=5,\n      number=3, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='crop_size', full_name='caffe.DataParameter.crop_size', index=6,\n      number=5, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, 
containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mirror', full_name='caffe.DataParameter.mirror', index=7,\n      number=6, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='force_encoded_color', full_name='caffe.DataParameter.force_encoded_color', index=8,\n      number=9, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shuffle', full_name='caffe.DataParameter.shuffle', index=9,\n      number=10, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _DATAPARAMETER_DB,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=6618,\n  serialized_end=6913,\n)\n\n\n_DROPOUTPARAMETER = _descriptor.Descriptor(\n  name='DropoutParameter',\n  full_name='caffe.DropoutParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='dropout_ratio', full_name='caffe.DropoutParameter.dropout_ratio', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  
is_extendable=False,\n  extension_ranges=[],\n  serialized_start=6915,\n  serialized_end=6961,\n)\n\n\n_DUMMYDATAPARAMETER = _descriptor.Descriptor(\n  name='DummyDataParameter',\n  full_name='caffe.DummyDataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='data_filler', full_name='caffe.DummyDataParameter.data_filler', index=0,\n      number=1, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shape', full_name='caffe.DummyDataParameter.shape', index=1,\n      number=6, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='num', full_name='caffe.DummyDataParameter.num', index=2,\n      number=2, type=13, cpp_type=3, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='channels', full_name='caffe.DummyDataParameter.channels', index=3,\n      number=3, type=13, cpp_type=3, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='height', full_name='caffe.DummyDataParameter.height', index=4,\n      number=4, type=13, cpp_type=3, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n    _descriptor.FieldDescriptor(\n      name='width', full_name='caffe.DummyDataParameter.width', index=5,\n      number=5, type=13, cpp_type=3, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=6964,\n  serialized_end=7124,\n)\n\n\n_ELTWISEPARAMETER = _descriptor.Descriptor(\n  name='EltwiseParameter',\n  full_name='caffe.EltwiseParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='operation', full_name='caffe.EltwiseParameter.operation', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='coeff', full_name='caffe.EltwiseParameter.coeff', index=1,\n      number=2, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stable_prod_grad', full_name='caffe.EltwiseParameter.stable_prod_grad', index=2,\n      number=3, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _ELTWISEPARAMETER_ELTWISEOP,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7127,\n  
serialized_end=7312,\n)\n\n\n_EXPPARAMETER = _descriptor.Descriptor(\n  name='ExpParameter',\n  full_name='caffe.ExpParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='base', full_name='caffe.ExpParameter.base', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.ExpParameter.scale', index=1,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shift', full_name='caffe.ExpParameter.shift', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7314,\n  serialized_end=7382,\n)\n\n\n_FLATTENPARAMETER = _descriptor.Descriptor(\n  name='FlattenParameter',\n  full_name='caffe.FlattenParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.FlattenParameter.axis', index=0,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='end_axis', 
full_name='caffe.FlattenParameter.end_axis', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7384,\n  serialized_end=7441,\n)\n\n\n_HDF5DATAPARAMETER = _descriptor.Descriptor(\n  name='HDF5DataParameter',\n  full_name='caffe.HDF5DataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.HDF5DataParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='batch_size', full_name='caffe.HDF5DataParameter.batch_size', index=1,\n      number=2, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shuffle', full_name='caffe.HDF5DataParameter.shuffle', index=2,\n      number=3, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7443,\n  serialized_end=7522,\n)\n\n\n_HDF5OUTPUTPARAMETER = _descriptor.Descriptor(\n  name='HDF5OutputParameter',\n  
full_name='caffe.HDF5OutputParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='file_name', full_name='caffe.HDF5OutputParameter.file_name', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7524,\n  serialized_end=7564,\n)\n\n\n_HINGELOSSPARAMETER = _descriptor.Descriptor(\n  name='HingeLossParameter',\n  full_name='caffe.HingeLossParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='norm', full_name='caffe.HingeLossParameter.norm', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _HINGELOSSPARAMETER_NORM,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7566,\n  serialized_end=7660,\n)\n\n\n_IMAGEDATAPARAMETER = _descriptor.Descriptor(\n  name='ImageDataParameter',\n  full_name='caffe.ImageDataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.ImageDataParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='batch_size', full_name='caffe.ImageDataParameter.batch_size', index=1,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='rand_skip', full_name='caffe.ImageDataParameter.rand_skip', index=2,\n      number=7, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shuffle', full_name='caffe.ImageDataParameter.shuffle', index=3,\n      number=8, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_height', full_name='caffe.ImageDataParameter.new_height', index=4,\n      number=9, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_width', full_name='caffe.ImageDataParameter.new_width', index=5,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='is_color', full_name='caffe.ImageDataParameter.is_color', index=6,\n      number=11, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, 
containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.ImageDataParameter.scale', index=7,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean_file', full_name='caffe.ImageDataParameter.mean_file', index=8,\n      number=3, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='crop_size', full_name='caffe.ImageDataParameter.crop_size', index=9,\n      number=5, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mirror', full_name='caffe.ImageDataParameter.mirror', index=10,\n      number=6, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='root_folder', full_name='caffe.ImageDataParameter.root_folder', index=11,\n      number=12, type=9, cpp_type=9, label=1,\n      has_default_value=True, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n 
 serialized_start=7663,\n  serialized_end=7939,\n)\n\n\n_VIDEODATAPARAMETER = _descriptor.Descriptor(\n  name='VideoDataParameter',\n  full_name='caffe.VideoDataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.VideoDataParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='batch_size', full_name='caffe.VideoDataParameter.batch_size', index=1,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='rand_skip', full_name='caffe.VideoDataParameter.rand_skip', index=2,\n      number=7, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shuffle', full_name='caffe.VideoDataParameter.shuffle', index=3,\n      number=8, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_height', full_name='caffe.VideoDataParameter.new_height', index=4,\n      number=9, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n    _descriptor.FieldDescriptor(\n      name='new_width', full_name='caffe.VideoDataParameter.new_width', index=5,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_length', full_name='caffe.VideoDataParameter.new_length', index=6,\n      number=11, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='num_segments', full_name='caffe.VideoDataParameter.num_segments', index=7,\n      number=12, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.VideoDataParameter.scale', index=8,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean_file', full_name='caffe.VideoDataParameter.mean_file', index=9,\n      number=3, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='crop_size', full_name='caffe.VideoDataParameter.crop_size', index=10,\n      number=5, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      
message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mirror', full_name='caffe.VideoDataParameter.mirror', index=11,\n      number=6, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='modality', full_name='caffe.VideoDataParameter.modality', index=12,\n      number=13, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='name_pattern', full_name='caffe.VideoDataParameter.name_pattern', index=13,\n      number=14, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='encoded', full_name='caffe.VideoDataParameter.encoded', index=14,\n      number=15, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='grayscale', full_name='caffe.VideoDataParameter.grayscale', index=15,\n      number=16, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _VIDEODATAPARAMETER_MODALITY,\n  ],\n 
 options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=7942,\n  serialized_end=8382,\n)\n\n\n_INFOGAINLOSSPARAMETER = _descriptor.Descriptor(\n  name='InfogainLossParameter',\n  full_name='caffe.InfogainLossParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.InfogainLossParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=8384,\n  serialized_end=8423,\n)\n\n\n_INNERPRODUCTPARAMETER = _descriptor.Descriptor(\n  name='InnerProductParameter',\n  full_name='caffe.InnerProductParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='num_output', full_name='caffe.InnerProductParameter.num_output', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_term', full_name='caffe.InnerProductParameter.bias_term', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='weight_filler', full_name='caffe.InnerProductParameter.weight_filler', index=2,\n      number=3, type=11, cpp_type=10, label=1,\n      has_default_value=False, 
default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_filler', full_name='caffe.InnerProductParameter.bias_filler', index=3,\n      number=4, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.InnerProductParameter.axis', index=4,\n      number=5, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=8426,\n  serialized_end=8603,\n)\n\n\n_LOGPARAMETER = _descriptor.Descriptor(\n  name='LogParameter',\n  full_name='caffe.LogParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='base', full_name='caffe.LogParameter.base', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.LogParameter.scale', index=1,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shift', full_name='caffe.LogParameter.shift', index=2,\n      number=3, 
type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=8605,\n  serialized_end=8673,\n)\n\n\n_LRNPARAMETER = _descriptor.Descriptor(\n  name='LRNParameter',\n  full_name='caffe.LRNParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='local_size', full_name='caffe.LRNParameter.local_size', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='alpha', full_name='caffe.LRNParameter.alpha', index=1,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='beta', full_name='caffe.LRNParameter.beta', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.75,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='norm_region', full_name='caffe.LRNParameter.norm_region', index=3,\n      number=4, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='k', 
full_name='caffe.LRNParameter.k', index=4,\n      number=5, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _LRNPARAMETER_NORMREGION,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=8676,\n  serialized_end=8890,\n)\n\n\n_MEMORYDATAPARAMETER = _descriptor.Descriptor(\n  name='MemoryDataParameter',\n  full_name='caffe.MemoryDataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='batch_size', full_name='caffe.MemoryDataParameter.batch_size', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='channels', full_name='caffe.MemoryDataParameter.channels', index=1,\n      number=2, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='height', full_name='caffe.MemoryDataParameter.height', index=2,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='width', full_name='caffe.MemoryDataParameter.width', index=3,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n 
     is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=8892,\n  serialized_end=8982,\n)\n\n\n_MVNPARAMETER = _descriptor.Descriptor(\n  name='MVNParameter',\n  full_name='caffe.MVNParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='normalize_variance', full_name='caffe.MVNParameter.normalize_variance', index=0,\n      number=1, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='across_channels', full_name='caffe.MVNParameter.across_channels', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='eps', full_name='caffe.MVNParameter.eps', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1e-09,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=8984,\n  serialized_end=9084,\n)\n\n\n_POOLINGPARAMETER = _descriptor.Descriptor(\n  name='PoolingParameter',\n  full_name='caffe.PoolingParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='pool', full_name='caffe.PoolingParameter.pool', index=0,\n      number=1, type=14, 
cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pad', full_name='caffe.PoolingParameter.pad', index=1,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pad_h', full_name='caffe.PoolingParameter.pad_h', index=2,\n      number=9, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pad_w', full_name='caffe.PoolingParameter.pad_w', index=3,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernel_size', full_name='caffe.PoolingParameter.kernel_size', index=4,\n      number=2, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernel_h', full_name='caffe.PoolingParameter.kernel_h', index=5,\n      number=5, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernel_w', 
full_name='caffe.PoolingParameter.kernel_w', index=6,\n      number=6, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride', full_name='caffe.PoolingParameter.stride', index=7,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride_h', full_name='caffe.PoolingParameter.stride_h', index=8,\n      number=7, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride_w', full_name='caffe.PoolingParameter.stride_w', index=9,\n      number=8, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.PoolingParameter.engine', index=10,\n      number=11, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='global_pooling', full_name='caffe.PoolingParameter.global_pooling', index=11,\n      number=12, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _POOLINGPARAMETER_POOLMETHOD,\n    _POOLINGPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=9087,\n  serialized_end=9505,\n)\n\n\n_POWERPARAMETER = _descriptor.Descriptor(\n  name='PowerParameter',\n  full_name='caffe.PowerParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='power', full_name='caffe.PowerParameter.power', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.PowerParameter.scale', index=1,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shift', full_name='caffe.PowerParameter.shift', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=9507,\n  serialized_end=9577,\n)\n\n\n_PYTHONPARAMETER = _descriptor.Descriptor(\n  name='PythonParameter',\n  full_name='caffe.PythonParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='module', full_name='caffe.PythonParameter.module', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      
has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='layer', full_name='caffe.PythonParameter.layer', index=1,\n      number=2, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='param_str', full_name='caffe.PythonParameter.param_str', index=2,\n      number=3, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=9579,\n  serialized_end=9646,\n)\n\n\n_REDUCTIONPARAMETER = _descriptor.Descriptor(\n  name='ReductionParameter',\n  full_name='caffe.ReductionParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='operation', full_name='caffe.ReductionParameter.operation', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.ReductionParameter.axis', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='coeff', full_name='caffe.ReductionParameter.coeff', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='k', full_name='caffe.ReductionParameter.k', index=3,\n      number=4, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _REDUCTIONPARAMETER_REDUCTIONOP,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=9649,\n  serialized_end=9846,\n)\n\n\n_RELUPARAMETER = _descriptor.Descriptor(\n  name='ReLUParameter',\n  full_name='caffe.ReLUParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='negative_slope', full_name='caffe.ReLUParameter.negative_slope', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.ReLUParameter.engine', index=1,\n      number=2, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _RELUPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=9849,\n  serialized_end=9990,\n)\n\n\n_RESHAPEPARAMETER = 
_descriptor.Descriptor(\n  name='ReshapeParameter',\n  full_name='caffe.ReshapeParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='shape', full_name='caffe.ReshapeParameter.shape', index=0,\n      number=1, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.ReshapeParameter.axis', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='num_axes', full_name='caffe.ReshapeParameter.num_axes', index=2,\n      number=3, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=-1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=9992,\n  serialized_end=10082,\n)\n\n\n_SEGDATAPARAMETER = _descriptor.Descriptor(\n  name='SegDataParameter',\n  full_name='caffe.SegDataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.SegDataParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='root_dir', 
full_name='caffe.SegDataParameter.root_dir', index=1,\n      number=2, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shuffle', full_name='caffe.SegDataParameter.shuffle', index=2,\n      number=3, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='balance', full_name='caffe.SegDataParameter.balance', index=3,\n      number=4, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10084,\n  serialized_end=10184,\n)\n\n\n_SIGMOIDPARAMETER = _descriptor.Descriptor(\n  name='SigmoidParameter',\n  full_name='caffe.SigmoidParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.SigmoidParameter.engine', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _SIGMOIDPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10186,\n  serialized_end=10306,\n)\n\n\n_SLICEPARAMETER = _descriptor.Descriptor(\n  
name='SliceParameter',\n  full_name='caffe.SliceParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.SliceParameter.axis', index=0,\n      number=3, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='slice_point', full_name='caffe.SliceParameter.slice_point', index=1,\n      number=2, type=13, cpp_type=3, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='slice_dim', full_name='caffe.SliceParameter.slice_dim', index=2,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10308,\n  serialized_end=10384,\n)\n\n\n_SOFTMAXPARAMETER = _descriptor.Descriptor(\n  name='SoftmaxParameter',\n  full_name='caffe.SoftmaxParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.SoftmaxParameter.engine', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.SoftmaxParameter.axis', index=1,\n      number=2, 
type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _SOFTMAXPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10387,\n  serialized_end=10524,\n)\n\n\n_TANHPARAMETER = _descriptor.Descriptor(\n  name='TanHParameter',\n  full_name='caffe.TanHParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.TanHParameter.engine', index=0,\n      number=1, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _TANHPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10526,\n  serialized_end=10640,\n)\n\n\n_THRESHOLDPARAMETER = _descriptor.Descriptor(\n  name='ThresholdParameter',\n  full_name='caffe.ThresholdParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='threshold', full_name='caffe.ThresholdParameter.threshold', index=0,\n      number=1, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10642,\n  serialized_end=10684,\n)\n\n\n_WINDOWDATAPARAMETER = _descriptor.Descriptor(\n  name='WindowDataParameter',\n  
full_name='caffe.WindowDataParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.WindowDataParameter.source', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.WindowDataParameter.scale', index=1,\n      number=2, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mean_file', full_name='caffe.WindowDataParameter.mean_file', index=2,\n      number=3, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='batch_size', full_name='caffe.WindowDataParameter.batch_size', index=3,\n      number=4, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='crop_size', full_name='caffe.WindowDataParameter.crop_size', index=4,\n      number=5, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mirror', full_name='caffe.WindowDataParameter.mirror', index=5,\n     
 number=6, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='fg_threshold', full_name='caffe.WindowDataParameter.fg_threshold', index=6,\n      number=7, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bg_threshold', full_name='caffe.WindowDataParameter.bg_threshold', index=7,\n      number=8, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='fg_fraction', full_name='caffe.WindowDataParameter.fg_fraction', index=8,\n      number=9, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.25,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='context_pad', full_name='caffe.WindowDataParameter.context_pad', index=9,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='crop_mode', full_name='caffe.WindowDataParameter.crop_mode', index=10,\n      number=11, type=9, cpp_type=9, label=1,\n      has_default_value=True, default_value=unicode(\"warp\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n    _descriptor.FieldDescriptor(\n      name='cache_images', full_name='caffe.WindowDataParameter.cache_images', index=11,\n      number=12, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='root_folder', full_name='caffe.WindowDataParameter.root_folder', index=12,\n      number=13, type=9, cpp_type=9, label=1,\n      has_default_value=True, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=10687,\n  serialized_end=11008,\n)\n\n\n_SPPPARAMETER = _descriptor.Descriptor(\n  name='SPPParameter',\n  full_name='caffe.SPPParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='pyramid_height', full_name='caffe.SPPParameter.pyramid_height', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pool', full_name='caffe.SPPParameter.pool', index=1,\n      number=2, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='engine', full_name='caffe.SPPParameter.engine', index=2,\n      number=6, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n  
    message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _SPPPARAMETER_POOLMETHOD,\n    _SPPPARAMETER_ENGINE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=11011,\n  serialized_end=11246,\n)\n\n\n_ROIPOOLINGPARAMETER = _descriptor.Descriptor(\n  name='ROIPoolingParameter',\n  full_name='caffe.ROIPoolingParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='pooled_h', full_name='caffe.ROIPoolingParameter.pooled_h', index=0,\n      number=1, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pooled_w', full_name='caffe.ROIPoolingParameter.pooled_w', index=1,\n      number=2, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='spatial_scale', full_name='caffe.ROIPoolingParameter.spatial_scale', index=2,\n      number=3, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=11248,\n  serialized_end=11337,\n)\n\n\n_V1LAYERPARAMETER = _descriptor.Descriptor(\n  name='V1LayerParameter',\n  full_name='caffe.V1LayerParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  
fields=[\n    _descriptor.FieldDescriptor(\n      name='bottom', full_name='caffe.V1LayerParameter.bottom', index=0,\n      number=2, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='top', full_name='caffe.V1LayerParameter.top', index=1,\n      number=3, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='name', full_name='caffe.V1LayerParameter.name', index=2,\n      number=4, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='include', full_name='caffe.V1LayerParameter.include', index=3,\n      number=32, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='exclude', full_name='caffe.V1LayerParameter.exclude', index=4,\n      number=33, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='type', full_name='caffe.V1LayerParameter.type', index=5,\n      number=5, type=14, cpp_type=8, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      
is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='blobs', full_name='caffe.V1LayerParameter.blobs', index=6,\n      number=6, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='param', full_name='caffe.V1LayerParameter.param', index=7,\n      number=1001, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='blob_share_mode', full_name='caffe.V1LayerParameter.blob_share_mode', index=8,\n      number=1002, type=14, cpp_type=8, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='blobs_lr', full_name='caffe.V1LayerParameter.blobs_lr', index=9,\n      number=7, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='weight_decay', full_name='caffe.V1LayerParameter.weight_decay', index=10,\n      number=8, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='loss_weight', full_name='caffe.V1LayerParameter.loss_weight', index=11,\n      number=35, type=2, cpp_type=6, label=3,\n      has_default_value=False, 
default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='accuracy_param', full_name='caffe.V1LayerParameter.accuracy_param', index=12,\n      number=27, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='argmax_param', full_name='caffe.V1LayerParameter.argmax_param', index=13,\n      number=23, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='concat_param', full_name='caffe.V1LayerParameter.concat_param', index=14,\n      number=9, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='contrastive_loss_param', full_name='caffe.V1LayerParameter.contrastive_loss_param', index=15,\n      number=40, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='convolution_param', full_name='caffe.V1LayerParameter.convolution_param', index=16,\n      number=10, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='data_param', full_name='caffe.V1LayerParameter.data_param', index=17,\n      number=11, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dropout_param', full_name='caffe.V1LayerParameter.dropout_param', index=18,\n      number=12, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dummy_data_param', full_name='caffe.V1LayerParameter.dummy_data_param', index=19,\n      number=26, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='eltwise_param', full_name='caffe.V1LayerParameter.eltwise_param', index=20,\n      number=24, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='exp_param', full_name='caffe.V1LayerParameter.exp_param', index=21,\n      number=41, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hdf5_data_param', full_name='caffe.V1LayerParameter.hdf5_data_param', index=22,\n      number=13, type=11, cpp_type=10, label=1,\n      has_default_value=False, 
default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hdf5_output_param', full_name='caffe.V1LayerParameter.hdf5_output_param', index=23,\n      number=14, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hinge_loss_param', full_name='caffe.V1LayerParameter.hinge_loss_param', index=24,\n      number=29, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='image_data_param', full_name='caffe.V1LayerParameter.image_data_param', index=25,\n      number=15, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='infogain_loss_param', full_name='caffe.V1LayerParameter.infogain_loss_param', index=26,\n      number=16, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='inner_product_param', full_name='caffe.V1LayerParameter.inner_product_param', index=27,\n      number=17, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='lrn_param', full_name='caffe.V1LayerParameter.lrn_param', index=28,\n      number=18, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='memory_data_param', full_name='caffe.V1LayerParameter.memory_data_param', index=29,\n      number=22, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mvn_param', full_name='caffe.V1LayerParameter.mvn_param', index=30,\n      number=34, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pooling_param', full_name='caffe.V1LayerParameter.pooling_param', index=31,\n      number=19, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='power_param', full_name='caffe.V1LayerParameter.power_param', index=32,\n      number=21, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='relu_param', full_name='caffe.V1LayerParameter.relu_param', index=33,\n      number=30, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n   
   message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='sigmoid_param', full_name='caffe.V1LayerParameter.sigmoid_param', index=34,\n      number=38, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='softmax_param', full_name='caffe.V1LayerParameter.softmax_param', index=35,\n      number=39, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='slice_param', full_name='caffe.V1LayerParameter.slice_param', index=36,\n      number=31, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='tanh_param', full_name='caffe.V1LayerParameter.tanh_param', index=37,\n      number=37, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='threshold_param', full_name='caffe.V1LayerParameter.threshold_param', index=38,\n      number=25, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='window_data_param', 
full_name='caffe.V1LayerParameter.window_data_param', index=39,\n      number=20, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='transform_param', full_name='caffe.V1LayerParameter.transform_param', index=40,\n      number=36, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='loss_param', full_name='caffe.V1LayerParameter.loss_param', index=41,\n      number=42, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='layer', full_name='caffe.V1LayerParameter.layer', index=42,\n      number=1, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _V1LAYERPARAMETER_LAYERTYPE,\n    _V1LAYERPARAMETER_DIMCHECKMODE,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=11340,\n  serialized_end=13868,\n)\n\n\n_V0LAYERPARAMETER = _descriptor.Descriptor(\n  name='V0LayerParameter',\n  full_name='caffe.V0LayerParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='name', full_name='caffe.V0LayerParameter.name', index=0,\n      number=1, type=9, cpp_type=9, label=1,\n      has_default_value=False, 
default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='type', full_name='caffe.V0LayerParameter.type', index=1,\n      number=2, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='num_output', full_name='caffe.V0LayerParameter.num_output', index=2,\n      number=3, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='biasterm', full_name='caffe.V0LayerParameter.biasterm', index=3,\n      number=4, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='weight_filler', full_name='caffe.V0LayerParameter.weight_filler', index=4,\n      number=5, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_filler', full_name='caffe.V0LayerParameter.bias_filler', index=5,\n      number=6, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pad', 
full_name='caffe.V0LayerParameter.pad', index=6,\n      number=7, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='kernelsize', full_name='caffe.V0LayerParameter.kernelsize', index=7,\n      number=8, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='group', full_name='caffe.V0LayerParameter.group', index=8,\n      number=9, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='stride', full_name='caffe.V0LayerParameter.stride', index=9,\n      number=10, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pool', full_name='caffe.V0LayerParameter.pool', index=10,\n      number=11, type=14, cpp_type=8, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='dropout_ratio', full_name='caffe.V0LayerParameter.dropout_ratio', index=11,\n      number=12, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    
_descriptor.FieldDescriptor(\n      name='local_size', full_name='caffe.V0LayerParameter.local_size', index=12,\n      number=13, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='alpha', full_name='caffe.V0LayerParameter.alpha', index=13,\n      number=14, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='beta', full_name='caffe.V0LayerParameter.beta', index=14,\n      number=15, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.75,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='k', full_name='caffe.V0LayerParameter.k', index=15,\n      number=22, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='source', full_name='caffe.V0LayerParameter.source', index=16,\n      number=16, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='scale', full_name='caffe.V0LayerParameter.scale', index=17,\n      number=17, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      
is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='meanfile', full_name='caffe.V0LayerParameter.meanfile', index=18,\n      number=18, type=9, cpp_type=9, label=1,\n      has_default_value=False, default_value=unicode(\"\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='batchsize', full_name='caffe.V0LayerParameter.batchsize', index=19,\n      number=19, type=13, cpp_type=3, label=1,\n      has_default_value=False, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='cropsize', full_name='caffe.V0LayerParameter.cropsize', index=20,\n      number=20, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='mirror', full_name='caffe.V0LayerParameter.mirror', index=21,\n      number=21, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='blobs', full_name='caffe.V0LayerParameter.blobs', index=22,\n      number=50, type=11, cpp_type=10, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='blobs_lr', full_name='caffe.V0LayerParameter.blobs_lr', index=23,\n      number=51, type=2, cpp_type=6, label=3,\n      has_default_value=False, 
default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='weight_decay', full_name='caffe.V0LayerParameter.weight_decay', index=24,\n      number=52, type=2, cpp_type=6, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='rand_skip', full_name='caffe.V0LayerParameter.rand_skip', index=25,\n      number=53, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='det_fg_threshold', full_name='caffe.V0LayerParameter.det_fg_threshold', index=26,\n      number=54, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='det_bg_threshold', full_name='caffe.V0LayerParameter.det_bg_threshold', index=27,\n      number=55, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.5,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='det_fg_fraction', full_name='caffe.V0LayerParameter.det_fg_fraction', index=28,\n      number=56, type=2, cpp_type=6, label=1,\n      has_default_value=True, default_value=0.25,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='det_context_pad', 
full_name='caffe.V0LayerParameter.det_context_pad', index=29,\n      number=58, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='det_crop_mode', full_name='caffe.V0LayerParameter.det_crop_mode', index=30,\n      number=59, type=9, cpp_type=9, label=1,\n      has_default_value=True, default_value=unicode(\"warp\", \"utf-8\"),\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_num', full_name='caffe.V0LayerParameter.new_num', index=31,\n      number=60, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_channels', full_name='caffe.V0LayerParameter.new_channels', index=32,\n      number=61, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_height', full_name='caffe.V0LayerParameter.new_height', index=33,\n      number=62, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='new_width', full_name='caffe.V0LayerParameter.new_width', index=34,\n      number=63, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=0,\n      message_type=None, enum_type=None, containing_type=None,\n      
is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='shuffle_images', full_name='caffe.V0LayerParameter.shuffle_images', index=35,\n      number=64, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='concat_dim', full_name='caffe.V0LayerParameter.concat_dim', index=36,\n      number=65, type=13, cpp_type=3, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='hdf5_output_param', full_name='caffe.V0LayerParameter.hdf5_output_param', index=37,\n      number=1001, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n    _V0LAYERPARAMETER_POOLMETHOD,\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=13871,\n  serialized_end=14892,\n)\n\n\n_PRELUPARAMETER = _descriptor.Descriptor(\n  name='PReLUParameter',\n  full_name='caffe.PReLUParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='filler', full_name='caffe.PReLUParameter.filler', index=0,\n      number=1, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='channel_shared', 
full_name='caffe.PReLUParameter.channel_shared', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=14894,\n  serialized_end=14981,\n)\n\n\n_SCALEPARAMETER = _descriptor.Descriptor(\n  name='ScaleParameter',\n  full_name='caffe.ScaleParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.ScaleParameter.axis', index=0,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='num_axes', full_name='caffe.ScaleParameter.num_axes', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='filler', full_name='caffe.ScaleParameter.filler', index=2,\n      number=3, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_term', full_name='caffe.ScaleParameter.bias_term', index=3,\n      number=4, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, 
extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='bias_filler', full_name='caffe.ScaleParameter.bias_filler', index=4,\n      number=5, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=14984,\n  serialized_end=15149,\n)\n\n\n_BIASPARAMETER = _descriptor.Descriptor(\n  name='BiasParameter',\n  full_name='caffe.BiasParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='axis', full_name='caffe.BiasParameter.axis', index=0,\n      number=1, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='num_axes', full_name='caffe.BiasParameter.num_axes', index=1,\n      number=2, type=5, cpp_type=1, label=1,\n      has_default_value=True, default_value=1,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='filler', full_name='caffe.BiasParameter.filler', index=2,\n      number=3, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=15151,\n  serialized_end=15244,\n)\n\n\n_BATCHREDUCTIONPARAMETER = 
_descriptor.Descriptor(\n  name='BatchReductionParameter',\n  full_name='caffe.BatchReductionParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='level', full_name='caffe.BatchReductionParameter.level', index=0,\n      number=1, type=5, cpp_type=1, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='reduction_param', full_name='caffe.BatchReductionParameter.reduction_param', index=1,\n      number=2, type=11, cpp_type=10, label=1,\n      has_default_value=False, default_value=None,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='pos', full_name='caffe.BatchReductionParameter.pos', index=2,\n      number=3, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=15246,\n  serialized_end=15358,\n)\n\n\n_MEMORYOPTIMIZATIONPARAMETER = _descriptor.Descriptor(\n  name='MemoryOptimizationParameter',\n  full_name='caffe.MemoryOptimizationParameter',\n  filename=None,\n  file=DESCRIPTOR,\n  containing_type=None,\n  fields=[\n    _descriptor.FieldDescriptor(\n      name='optimize_train', full_name='caffe.MemoryOptimizationParameter.optimize_train', index=0,\n      number=1, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=True,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      
options=None),\n    _descriptor.FieldDescriptor(\n      name='optimize_test', full_name='caffe.MemoryOptimizationParameter.optimize_test', index=1,\n      number=2, type=8, cpp_type=7, label=1,\n      has_default_value=True, default_value=False,\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n    _descriptor.FieldDescriptor(\n      name='exclude_blob', full_name='caffe.MemoryOptimizationParameter.exclude_blob', index=2,\n      number=3, type=9, cpp_type=9, label=3,\n      has_default_value=False, default_value=[],\n      message_type=None, enum_type=None, containing_type=None,\n      is_extension=False, extension_scope=None,\n      options=None),\n  ],\n  extensions=[\n  ],\n  nested_types=[],\n  enum_types=[\n  ],\n  options=None,\n  is_extendable=False,\n  extension_ranges=[],\n  serialized_start=15360,\n  serialized_end=15471,\n)\n\n_BLOBPROTO.fields_by_name['shape'].message_type = _BLOBSHAPE\n_BLOBPROTOVECTOR.fields_by_name['blobs'].message_type = _BLOBPROTO\n_FILLERPARAMETER.fields_by_name['variance_norm'].enum_type = _FILLERPARAMETER_VARIANCENORM\n_FILLERPARAMETER_VARIANCENORM.containing_type = _FILLERPARAMETER;\n_NETPARAMETER.fields_by_name['input_shape'].message_type = _BLOBSHAPE\n_NETPARAMETER.fields_by_name['state'].message_type = _NETSTATE\n_NETPARAMETER.fields_by_name['layer'].message_type = _LAYERPARAMETER\n_NETPARAMETER.fields_by_name['mem_param'].message_type = _MEMORYOPTIMIZATIONPARAMETER\n_NETPARAMETER.fields_by_name['layers'].message_type = _V1LAYERPARAMETER\n_SOLVERPARAMETER.fields_by_name['net_param'].message_type = _NETPARAMETER\n_SOLVERPARAMETER.fields_by_name['train_net_param'].message_type = _NETPARAMETER\n_SOLVERPARAMETER.fields_by_name['test_net_param'].message_type = _NETPARAMETER\n_SOLVERPARAMETER.fields_by_name['train_state'].message_type = _NETSTATE\n_SOLVERPARAMETER.fields_by_name['test_state'].message_type = 
_NETSTATE\n_SOLVERPARAMETER.fields_by_name['solver_mode'].enum_type = _SOLVERPARAMETER_SOLVERMODE\n_SOLVERPARAMETER.fields_by_name['solver_type'].enum_type = _SOLVERPARAMETER_SOLVERTYPE\n_SOLVERPARAMETER_SOLVERMODE.containing_type = _SOLVERPARAMETER;\n_SOLVERPARAMETER_SOLVERTYPE.containing_type = _SOLVERPARAMETER;\n_SOLVERSTATE.fields_by_name['history'].message_type = _BLOBPROTO\n_NETSTATE.fields_by_name['phase'].enum_type = _PHASE\n_NETSTATERULE.fields_by_name['phase'].enum_type = _PHASE\n_PARAMSPEC.fields_by_name['share_mode'].enum_type = _PARAMSPEC_DIMCHECKMODE\n_PARAMSPEC_DIMCHECKMODE.containing_type = _PARAMSPEC;\n_LAYERPARAMETER.fields_by_name['phase'].enum_type = _PHASE\n_LAYERPARAMETER.fields_by_name['param'].message_type = _PARAMSPEC\n_LAYERPARAMETER.fields_by_name['blobs'].message_type = _BLOBPROTO\n_LAYERPARAMETER.fields_by_name['include'].message_type = _NETSTATERULE\n_LAYERPARAMETER.fields_by_name['exclude'].message_type = _NETSTATERULE\n_LAYERPARAMETER.fields_by_name['transform_param'].message_type = _TRANSFORMATIONPARAMETER\n_LAYERPARAMETER.fields_by_name['loss_param'].message_type = _LOSSPARAMETER\n_LAYERPARAMETER.fields_by_name['accuracy_param'].message_type = _ACCURACYPARAMETER\n_LAYERPARAMETER.fields_by_name['argmax_param'].message_type = _ARGMAXPARAMETER\n_LAYERPARAMETER.fields_by_name['bn_param'].message_type = _BNPARAMETER\n_LAYERPARAMETER.fields_by_name['concat_param'].message_type = _CONCATPARAMETER\n_LAYERPARAMETER.fields_by_name['contrastive_loss_param'].message_type = _CONTRASTIVELOSSPARAMETER\n_LAYERPARAMETER.fields_by_name['convolution_param'].message_type = _CONVOLUTIONPARAMETER\n_LAYERPARAMETER.fields_by_name['data_param'].message_type = _DATAPARAMETER\n_LAYERPARAMETER.fields_by_name['dropout_param'].message_type = _DROPOUTPARAMETER\n_LAYERPARAMETER.fields_by_name['dummy_data_param'].message_type = _DUMMYDATAPARAMETER\n_LAYERPARAMETER.fields_by_name['eltwise_param'].message_type = 
_ELTWISEPARAMETER\n_LAYERPARAMETER.fields_by_name['exp_param'].message_type = _EXPPARAMETER\n_LAYERPARAMETER.fields_by_name['flatten_param'].message_type = _FLATTENPARAMETER\n_LAYERPARAMETER.fields_by_name['hdf5_data_param'].message_type = _HDF5DATAPARAMETER\n_LAYERPARAMETER.fields_by_name['hdf5_output_param'].message_type = _HDF5OUTPUTPARAMETER\n_LAYERPARAMETER.fields_by_name['hinge_loss_param'].message_type = _HINGELOSSPARAMETER\n_LAYERPARAMETER.fields_by_name['image_data_param'].message_type = _IMAGEDATAPARAMETER\n_LAYERPARAMETER.fields_by_name['infogain_loss_param'].message_type = _INFOGAINLOSSPARAMETER\n_LAYERPARAMETER.fields_by_name['inner_product_param'].message_type = _INNERPRODUCTPARAMETER\n_LAYERPARAMETER.fields_by_name['log_param'].message_type = _LOGPARAMETER\n_LAYERPARAMETER.fields_by_name['lrn_param'].message_type = _LRNPARAMETER\n_LAYERPARAMETER.fields_by_name['memory_data_param'].message_type = _MEMORYDATAPARAMETER\n_LAYERPARAMETER.fields_by_name['mvn_param'].message_type = _MVNPARAMETER\n_LAYERPARAMETER.fields_by_name['pooling_param'].message_type = _POOLINGPARAMETER\n_LAYERPARAMETER.fields_by_name['power_param'].message_type = _POWERPARAMETER\n_LAYERPARAMETER.fields_by_name['prelu_param'].message_type = _PRELUPARAMETER\n_LAYERPARAMETER.fields_by_name['python_param'].message_type = _PYTHONPARAMETER\n_LAYERPARAMETER.fields_by_name['reduction_param'].message_type = _REDUCTIONPARAMETER\n_LAYERPARAMETER.fields_by_name['relu_param'].message_type = _RELUPARAMETER\n_LAYERPARAMETER.fields_by_name['reshape_param'].message_type = _RESHAPEPARAMETER\n_LAYERPARAMETER.fields_by_name['seg_data_param'].message_type = _SEGDATAPARAMETER\n_LAYERPARAMETER.fields_by_name['sigmoid_param'].message_type = _SIGMOIDPARAMETER\n_LAYERPARAMETER.fields_by_name['softmax_param'].message_type = _SOFTMAXPARAMETER\n_LAYERPARAMETER.fields_by_name['spp_param'].message_type = _SPPPARAMETER\n_LAYERPARAMETER.fields_by_name['slice_param'].message_type = 
_SLICEPARAMETER\n_LAYERPARAMETER.fields_by_name['tanh_param'].message_type = _TANHPARAMETER\n_LAYERPARAMETER.fields_by_name['threshold_param'].message_type = _THRESHOLDPARAMETER\n_LAYERPARAMETER.fields_by_name['window_data_param'].message_type = _WINDOWDATAPARAMETER\n_LAYERPARAMETER.fields_by_name['video_data_param'].message_type = _VIDEODATAPARAMETER\n_LAYERPARAMETER.fields_by_name['roi_pooling_param'].message_type = _ROIPOOLINGPARAMETER\n_LAYERPARAMETER.fields_by_name['scale_param'].message_type = _SCALEPARAMETER\n_LAYERPARAMETER.fields_by_name['bias_param'].message_type = _BIASPARAMETER\n_LAYERPARAMETER.fields_by_name['batch_reduction_param'].message_type = _BATCHREDUCTIONPARAMETER\n_BNPARAMETER.fields_by_name['slope_filler'].message_type = _FILLERPARAMETER\n_BNPARAMETER.fields_by_name['bias_filler'].message_type = _FILLERPARAMETER\n_BNPARAMETER.fields_by_name['engine'].enum_type = _BNPARAMETER_ENGINE\n_BNPARAMETER_ENGINE.containing_type = _BNPARAMETER;\n_CONVOLUTIONPARAMETER.fields_by_name['weight_filler'].message_type = _FILLERPARAMETER\n_CONVOLUTIONPARAMETER.fields_by_name['bias_filler'].message_type = _FILLERPARAMETER\n_CONVOLUTIONPARAMETER.fields_by_name['engine'].enum_type = _CONVOLUTIONPARAMETER_ENGINE\n_CONVOLUTIONPARAMETER_ENGINE.containing_type = _CONVOLUTIONPARAMETER;\n_DATAPARAMETER.fields_by_name['backend'].enum_type = _DATAPARAMETER_DB\n_DATAPARAMETER_DB.containing_type = _DATAPARAMETER;\n_DUMMYDATAPARAMETER.fields_by_name['data_filler'].message_type = _FILLERPARAMETER\n_DUMMYDATAPARAMETER.fields_by_name['shape'].message_type = _BLOBSHAPE\n_ELTWISEPARAMETER.fields_by_name['operation'].enum_type = _ELTWISEPARAMETER_ELTWISEOP\n_ELTWISEPARAMETER_ELTWISEOP.containing_type = _ELTWISEPARAMETER;\n_HINGELOSSPARAMETER.fields_by_name['norm'].enum_type = _HINGELOSSPARAMETER_NORM\n_HINGELOSSPARAMETER_NORM.containing_type = _HINGELOSSPARAMETER;\n_VIDEODATAPARAMETER.fields_by_name['modality'].enum_type = 
_VIDEODATAPARAMETER_MODALITY\n_VIDEODATAPARAMETER_MODALITY.containing_type = _VIDEODATAPARAMETER;\n_INNERPRODUCTPARAMETER.fields_by_name['weight_filler'].message_type = _FILLERPARAMETER\n_INNERPRODUCTPARAMETER.fields_by_name['bias_filler'].message_type = _FILLERPARAMETER\n_LRNPARAMETER.fields_by_name['norm_region'].enum_type = _LRNPARAMETER_NORMREGION\n_LRNPARAMETER_NORMREGION.containing_type = _LRNPARAMETER;\n_POOLINGPARAMETER.fields_by_name['pool'].enum_type = _POOLINGPARAMETER_POOLMETHOD\n_POOLINGPARAMETER.fields_by_name['engine'].enum_type = _POOLINGPARAMETER_ENGINE\n_POOLINGPARAMETER_POOLMETHOD.containing_type = _POOLINGPARAMETER;\n_POOLINGPARAMETER_ENGINE.containing_type = _POOLINGPARAMETER;\n_REDUCTIONPARAMETER.fields_by_name['operation'].enum_type = _REDUCTIONPARAMETER_REDUCTIONOP\n_REDUCTIONPARAMETER_REDUCTIONOP.containing_type = _REDUCTIONPARAMETER;\n_RELUPARAMETER.fields_by_name['engine'].enum_type = _RELUPARAMETER_ENGINE\n_RELUPARAMETER_ENGINE.containing_type = _RELUPARAMETER;\n_RESHAPEPARAMETER.fields_by_name['shape'].message_type = _BLOBSHAPE\n_SIGMOIDPARAMETER.fields_by_name['engine'].enum_type = _SIGMOIDPARAMETER_ENGINE\n_SIGMOIDPARAMETER_ENGINE.containing_type = _SIGMOIDPARAMETER;\n_SOFTMAXPARAMETER.fields_by_name['engine'].enum_type = _SOFTMAXPARAMETER_ENGINE\n_SOFTMAXPARAMETER_ENGINE.containing_type = _SOFTMAXPARAMETER;\n_TANHPARAMETER.fields_by_name['engine'].enum_type = _TANHPARAMETER_ENGINE\n_TANHPARAMETER_ENGINE.containing_type = _TANHPARAMETER;\n_SPPPARAMETER.fields_by_name['pool'].enum_type = _SPPPARAMETER_POOLMETHOD\n_SPPPARAMETER.fields_by_name['engine'].enum_type = _SPPPARAMETER_ENGINE\n_SPPPARAMETER_POOLMETHOD.containing_type = _SPPPARAMETER;\n_SPPPARAMETER_ENGINE.containing_type = _SPPPARAMETER;\n_V1LAYERPARAMETER.fields_by_name['include'].message_type = _NETSTATERULE\n_V1LAYERPARAMETER.fields_by_name['exclude'].message_type = _NETSTATERULE\n_V1LAYERPARAMETER.fields_by_name['type'].enum_type = 
_V1LAYERPARAMETER_LAYERTYPE\n_V1LAYERPARAMETER.fields_by_name['blobs'].message_type = _BLOBPROTO\n_V1LAYERPARAMETER.fields_by_name['blob_share_mode'].enum_type = _V1LAYERPARAMETER_DIMCHECKMODE\n_V1LAYERPARAMETER.fields_by_name['accuracy_param'].message_type = _ACCURACYPARAMETER\n_V1LAYERPARAMETER.fields_by_name['argmax_param'].message_type = _ARGMAXPARAMETER\n_V1LAYERPARAMETER.fields_by_name['concat_param'].message_type = _CONCATPARAMETER\n_V1LAYERPARAMETER.fields_by_name['contrastive_loss_param'].message_type = _CONTRASTIVELOSSPARAMETER\n_V1LAYERPARAMETER.fields_by_name['convolution_param'].message_type = _CONVOLUTIONPARAMETER\n_V1LAYERPARAMETER.fields_by_name['data_param'].message_type = _DATAPARAMETER\n_V1LAYERPARAMETER.fields_by_name['dropout_param'].message_type = _DROPOUTPARAMETER\n_V1LAYERPARAMETER.fields_by_name['dummy_data_param'].message_type = _DUMMYDATAPARAMETER\n_V1LAYERPARAMETER.fields_by_name['eltwise_param'].message_type = _ELTWISEPARAMETER\n_V1LAYERPARAMETER.fields_by_name['exp_param'].message_type = _EXPPARAMETER\n_V1LAYERPARAMETER.fields_by_name['hdf5_data_param'].message_type = _HDF5DATAPARAMETER\n_V1LAYERPARAMETER.fields_by_name['hdf5_output_param'].message_type = _HDF5OUTPUTPARAMETER\n_V1LAYERPARAMETER.fields_by_name['hinge_loss_param'].message_type = _HINGELOSSPARAMETER\n_V1LAYERPARAMETER.fields_by_name['image_data_param'].message_type = _IMAGEDATAPARAMETER\n_V1LAYERPARAMETER.fields_by_name['infogain_loss_param'].message_type = _INFOGAINLOSSPARAMETER\n_V1LAYERPARAMETER.fields_by_name['inner_product_param'].message_type = _INNERPRODUCTPARAMETER\n_V1LAYERPARAMETER.fields_by_name['lrn_param'].message_type = _LRNPARAMETER\n_V1LAYERPARAMETER.fields_by_name['memory_data_param'].message_type = _MEMORYDATAPARAMETER\n_V1LAYERPARAMETER.fields_by_name['mvn_param'].message_type = _MVNPARAMETER\n_V1LAYERPARAMETER.fields_by_name['pooling_param'].message_type = _POOLINGPARAMETER\n_V1LAYERPARAMETER.fields_by_name['power_param'].message_type = 
_POWERPARAMETER\n_V1LAYERPARAMETER.fields_by_name['relu_param'].message_type = _RELUPARAMETER\n_V1LAYERPARAMETER.fields_by_name['sigmoid_param'].message_type = _SIGMOIDPARAMETER\n_V1LAYERPARAMETER.fields_by_name['softmax_param'].message_type = _SOFTMAXPARAMETER\n_V1LAYERPARAMETER.fields_by_name['slice_param'].message_type = _SLICEPARAMETER\n_V1LAYERPARAMETER.fields_by_name['tanh_param'].message_type = _TANHPARAMETER\n_V1LAYERPARAMETER.fields_by_name['threshold_param'].message_type = _THRESHOLDPARAMETER\n_V1LAYERPARAMETER.fields_by_name['window_data_param'].message_type = _WINDOWDATAPARAMETER\n_V1LAYERPARAMETER.fields_by_name['transform_param'].message_type = _TRANSFORMATIONPARAMETER\n_V1LAYERPARAMETER.fields_by_name['loss_param'].message_type = _LOSSPARAMETER\n_V1LAYERPARAMETER.fields_by_name['layer'].message_type = _V0LAYERPARAMETER\n_V1LAYERPARAMETER_LAYERTYPE.containing_type = _V1LAYERPARAMETER;\n_V1LAYERPARAMETER_DIMCHECKMODE.containing_type = _V1LAYERPARAMETER;\n_V0LAYERPARAMETER.fields_by_name['weight_filler'].message_type = _FILLERPARAMETER\n_V0LAYERPARAMETER.fields_by_name['bias_filler'].message_type = _FILLERPARAMETER\n_V0LAYERPARAMETER.fields_by_name['pool'].enum_type = _V0LAYERPARAMETER_POOLMETHOD\n_V0LAYERPARAMETER.fields_by_name['blobs'].message_type = _BLOBPROTO\n_V0LAYERPARAMETER.fields_by_name['hdf5_output_param'].message_type = _HDF5OUTPUTPARAMETER\n_V0LAYERPARAMETER_POOLMETHOD.containing_type = _V0LAYERPARAMETER;\n_PRELUPARAMETER.fields_by_name['filler'].message_type = _FILLERPARAMETER\n_SCALEPARAMETER.fields_by_name['filler'].message_type = _FILLERPARAMETER\n_SCALEPARAMETER.fields_by_name['bias_filler'].message_type = _FILLERPARAMETER\n_BIASPARAMETER.fields_by_name['filler'].message_type = _FILLERPARAMETER\n_BATCHREDUCTIONPARAMETER.fields_by_name['reduction_param'].message_type = _REDUCTIONPARAMETER\nDESCRIPTOR.message_types_by_name['BlobShape'] = _BLOBSHAPE\nDESCRIPTOR.message_types_by_name['BlobProto'] = 
_BLOBPROTO\nDESCRIPTOR.message_types_by_name['BlobProtoVector'] = _BLOBPROTOVECTOR\nDESCRIPTOR.message_types_by_name['Datum'] = _DATUM\nDESCRIPTOR.message_types_by_name['FillerParameter'] = _FILLERPARAMETER\nDESCRIPTOR.message_types_by_name['NetParameter'] = _NETPARAMETER\nDESCRIPTOR.message_types_by_name['SolverParameter'] = _SOLVERPARAMETER\nDESCRIPTOR.message_types_by_name['SolverState'] = _SOLVERSTATE\nDESCRIPTOR.message_types_by_name['NetState'] = _NETSTATE\nDESCRIPTOR.message_types_by_name['NetStateRule'] = _NETSTATERULE\nDESCRIPTOR.message_types_by_name['ParamSpec'] = _PARAMSPEC\nDESCRIPTOR.message_types_by_name['LayerParameter'] = _LAYERPARAMETER\nDESCRIPTOR.message_types_by_name['TransformationParameter'] = _TRANSFORMATIONPARAMETER\nDESCRIPTOR.message_types_by_name['LossParameter'] = _LOSSPARAMETER\nDESCRIPTOR.message_types_by_name['AccuracyParameter'] = _ACCURACYPARAMETER\nDESCRIPTOR.message_types_by_name['ArgMaxParameter'] = _ARGMAXPARAMETER\nDESCRIPTOR.message_types_by_name['BNParameter'] = _BNPARAMETER\nDESCRIPTOR.message_types_by_name['ConcatParameter'] = _CONCATPARAMETER\nDESCRIPTOR.message_types_by_name['ContrastiveLossParameter'] = _CONTRASTIVELOSSPARAMETER\nDESCRIPTOR.message_types_by_name['ConvolutionParameter'] = _CONVOLUTIONPARAMETER\nDESCRIPTOR.message_types_by_name['DataParameter'] = _DATAPARAMETER\nDESCRIPTOR.message_types_by_name['DropoutParameter'] = _DROPOUTPARAMETER\nDESCRIPTOR.message_types_by_name['DummyDataParameter'] = _DUMMYDATAPARAMETER\nDESCRIPTOR.message_types_by_name['EltwiseParameter'] = _ELTWISEPARAMETER\nDESCRIPTOR.message_types_by_name['ExpParameter'] = _EXPPARAMETER\nDESCRIPTOR.message_types_by_name['FlattenParameter'] = _FLATTENPARAMETER\nDESCRIPTOR.message_types_by_name['HDF5DataParameter'] = _HDF5DATAPARAMETER\nDESCRIPTOR.message_types_by_name['HDF5OutputParameter'] = _HDF5OUTPUTPARAMETER\nDESCRIPTOR.message_types_by_name['HingeLossParameter'] = _HINGELOSSPARAMETER\nDESCRIPTOR.message_types_by_name['ImageDataParameter'] 
= _IMAGEDATAPARAMETER\nDESCRIPTOR.message_types_by_name['VideoDataParameter'] = _VIDEODATAPARAMETER\nDESCRIPTOR.message_types_by_name['InfogainLossParameter'] = _INFOGAINLOSSPARAMETER\nDESCRIPTOR.message_types_by_name['InnerProductParameter'] = _INNERPRODUCTPARAMETER\nDESCRIPTOR.message_types_by_name['LogParameter'] = _LOGPARAMETER\nDESCRIPTOR.message_types_by_name['LRNParameter'] = _LRNPARAMETER\nDESCRIPTOR.message_types_by_name['MemoryDataParameter'] = _MEMORYDATAPARAMETER\nDESCRIPTOR.message_types_by_name['MVNParameter'] = _MVNPARAMETER\nDESCRIPTOR.message_types_by_name['PoolingParameter'] = _POOLINGPARAMETER\nDESCRIPTOR.message_types_by_name['PowerParameter'] = _POWERPARAMETER\nDESCRIPTOR.message_types_by_name['PythonParameter'] = _PYTHONPARAMETER\nDESCRIPTOR.message_types_by_name['ReductionParameter'] = _REDUCTIONPARAMETER\nDESCRIPTOR.message_types_by_name['ReLUParameter'] = _RELUPARAMETER\nDESCRIPTOR.message_types_by_name['ReshapeParameter'] = _RESHAPEPARAMETER\nDESCRIPTOR.message_types_by_name['SegDataParameter'] = _SEGDATAPARAMETER\nDESCRIPTOR.message_types_by_name['SigmoidParameter'] = _SIGMOIDPARAMETER\nDESCRIPTOR.message_types_by_name['SliceParameter'] = _SLICEPARAMETER\nDESCRIPTOR.message_types_by_name['SoftmaxParameter'] = _SOFTMAXPARAMETER\nDESCRIPTOR.message_types_by_name['TanHParameter'] = _TANHPARAMETER\nDESCRIPTOR.message_types_by_name['ThresholdParameter'] = _THRESHOLDPARAMETER\nDESCRIPTOR.message_types_by_name['WindowDataParameter'] = _WINDOWDATAPARAMETER\nDESCRIPTOR.message_types_by_name['SPPParameter'] = _SPPPARAMETER\nDESCRIPTOR.message_types_by_name['ROIPoolingParameter'] = _ROIPOOLINGPARAMETER\nDESCRIPTOR.message_types_by_name['V1LayerParameter'] = _V1LAYERPARAMETER\nDESCRIPTOR.message_types_by_name['V0LayerParameter'] = _V0LAYERPARAMETER\nDESCRIPTOR.message_types_by_name['PReLUParameter'] = _PRELUPARAMETER\nDESCRIPTOR.message_types_by_name['ScaleParameter'] = _SCALEPARAMETER\nDESCRIPTOR.message_types_by_name['BiasParameter'] = 
_BIASPARAMETER\nDESCRIPTOR.message_types_by_name['BatchReductionParameter'] = _BATCHREDUCTIONPARAMETER\nDESCRIPTOR.message_types_by_name['MemoryOptimizationParameter'] = _MEMORYOPTIMIZATIONPARAMETER\n\nclass BlobShape(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _BLOBSHAPE\n\n  # @@protoc_insertion_point(class_scope:caffe.BlobShape)\n\nclass BlobProto(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _BLOBPROTO\n\n  # @@protoc_insertion_point(class_scope:caffe.BlobProto)\n\nclass BlobProtoVector(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _BLOBPROTOVECTOR\n\n  # @@protoc_insertion_point(class_scope:caffe.BlobProtoVector)\n\nclass Datum(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _DATUM\n\n  # @@protoc_insertion_point(class_scope:caffe.Datum)\n\nclass FillerParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _FILLERPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.FillerParameter)\n\nclass NetParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _NETPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.NetParameter)\n\nclass SolverParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SOLVERPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.SolverParameter)\n\nclass SolverState(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SOLVERSTATE\n\n  # @@protoc_insertion_point(class_scope:caffe.SolverState)\n\nclass NetState(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _NETSTATE\n\n  # @@protoc_insertion_point(class_scope:caffe.NetState)\n\nclass NetStateRule(_message.Message):\n  __metaclass__ = 
_reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _NETSTATERULE\n\n  # @@protoc_insertion_point(class_scope:caffe.NetStateRule)\n\nclass ParamSpec(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _PARAMSPEC\n\n  # @@protoc_insertion_point(class_scope:caffe.ParamSpec)\n\nclass LayerParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _LAYERPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.LayerParameter)\n\nclass TransformationParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _TRANSFORMATIONPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.TransformationParameter)\n\nclass LossParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _LOSSPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.LossParameter)\n\nclass AccuracyParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _ACCURACYPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.AccuracyParameter)\n\nclass ArgMaxParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _ARGMAXPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ArgMaxParameter)\n\nclass BNParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _BNPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.BNParameter)\n\nclass ConcatParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _CONCATPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ConcatParameter)\n\nclass ContrastiveLossParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _CONTRASTIVELOSSPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ContrastiveLossParameter)\n\nclass 
ConvolutionParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _CONVOLUTIONPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ConvolutionParameter)\n\nclass DataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _DATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.DataParameter)\n\nclass DropoutParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _DROPOUTPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.DropoutParameter)\n\nclass DummyDataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _DUMMYDATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.DummyDataParameter)\n\nclass EltwiseParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _ELTWISEPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.EltwiseParameter)\n\nclass ExpParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _EXPPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ExpParameter)\n\nclass FlattenParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _FLATTENPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.FlattenParameter)\n\nclass HDF5DataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _HDF5DATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.HDF5DataParameter)\n\nclass HDF5OutputParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _HDF5OUTPUTPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.HDF5OutputParameter)\n\nclass HingeLossParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = 
_HINGELOSSPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.HingeLossParameter)\n\nclass ImageDataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _IMAGEDATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ImageDataParameter)\n\nclass VideoDataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _VIDEODATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.VideoDataParameter)\n\nclass InfogainLossParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _INFOGAINLOSSPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.InfogainLossParameter)\n\nclass InnerProductParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _INNERPRODUCTPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.InnerProductParameter)\n\nclass LogParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _LOGPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.LogParameter)\n\nclass LRNParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _LRNPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.LRNParameter)\n\nclass MemoryDataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _MEMORYDATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.MemoryDataParameter)\n\nclass MVNParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _MVNPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.MVNParameter)\n\nclass PoolingParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _POOLINGPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.PoolingParameter)\n\nclass 
PowerParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _POWERPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.PowerParameter)\n\nclass PythonParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _PYTHONPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.PythonParameter)\n\nclass ReductionParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _REDUCTIONPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ReductionParameter)\n\nclass ReLUParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _RELUPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ReLUParameter)\n\nclass ReshapeParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _RESHAPEPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ReshapeParameter)\n\nclass SegDataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SEGDATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.SegDataParameter)\n\nclass SigmoidParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SIGMOIDPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.SigmoidParameter)\n\nclass SliceParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SLICEPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.SliceParameter)\n\nclass SoftmaxParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SOFTMAXPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.SoftmaxParameter)\n\nclass TanHParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _TANHPARAMETER\n\n  # 
@@protoc_insertion_point(class_scope:caffe.TanHParameter)\n\nclass ThresholdParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _THRESHOLDPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ThresholdParameter)\n\nclass WindowDataParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _WINDOWDATAPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.WindowDataParameter)\n\nclass SPPParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SPPPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.SPPParameter)\n\nclass ROIPoolingParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _ROIPOOLINGPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ROIPoolingParameter)\n\nclass V1LayerParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _V1LAYERPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.V1LayerParameter)\n\nclass V0LayerParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _V0LAYERPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.V0LayerParameter)\n\nclass PReLUParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _PRELUPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.PReLUParameter)\n\nclass ScaleParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _SCALEPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.ScaleParameter)\n\nclass BiasParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _BIASPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.BiasParameter)\n\nclass BatchReductionParameter(_message.Message):\n  __metaclass__ = 
_reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _BATCHREDUCTIONPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.BatchReductionParameter)\n\nclass MemoryOptimizationParameter(_message.Message):\n  __metaclass__ = _reflection.GeneratedProtocolMessageType\n  DESCRIPTOR = _MEMORYOPTIMIZATIONPARAMETER\n\n  # @@protoc_insertion_point(class_scope:caffe.MemoryOptimizationParameter)\n\n\n_BLOBSHAPE.fields_by_name['dim'].has_options = True\n_BLOBSHAPE.fields_by_name['dim']._options = _descriptor._ParseOptions(descriptor_pb2.FieldOptions(), '\\020\\001')\n_BLOBPROTO.fields_by_name['data'].has_options = True\n_BLOBPROTO.fields_by_name['data']._options = _descriptor._ParseOptions(descriptor_pb2.FieldOptions(), '\\020\\001')\n_BLOBPROTO.fields_by_name['diff'].has_options = True\n_BLOBPROTO.fields_by_name['diff']._options = _descriptor._ParseOptions(descriptor_pb2.FieldOptions(), '\\020\\001')\n# @@protoc_insertion_point(module_scope)\n"
  },
  {
    "path": "model_zoo/bninception/inceptionv3.yaml",
    "content": "inputs: []\nlayers:\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 32, pad_h: 0, pad_w: 0, stride_h: 2,\n    stride_w: 2}\n  expr: conv_Conv2D<=Convolution<=data\n  id: conv_Conv2D\n- attrs: {}\n  expr: conv<=BN<=conv_Conv2D\n  id: conv_batchnorm\n- {expr: conv<=ReLU<=conv, id: conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 32, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: conv_1_Conv2D<=Convolution<=conv\n  id: conv_1_Conv2D\n- attrs: {}\n  expr: conv_1<=BN<=conv_1_Conv2D\n  id: conv_1_batchnorm\n- {expr: conv_1<=ReLU<=conv_1, id: conv_1}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 64, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: conv_2_Conv2D<=Convolution<=conv_1\n  id: conv_2_Conv2D\n- attrs: {}\n  expr: conv_2<=BN<=conv_2_Conv2D\n  id: conv_2_batchnorm\n- {expr: conv_2<=ReLU<=conv_2, id: conv_2}\n- attrs: {kernel_size: 3, mode: max, pad: 0, stride: 2}\n  expr: pool<=Pooling<=conv_2\n  id: pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 80, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: conv_3_Conv2D<=Convolution<=pool\n  id: conv_3_Conv2D\n- attrs: {}\n  expr: conv_3<=BN<=conv_3_Conv2D\n  id: conv_3_batchnorm\n- {expr: conv_3<=ReLU<=conv_3, id: conv_3}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: conv_4_Conv2D<=Convolution<=conv_3\n  id: conv_4_Conv2D\n- attrs: {}\n  expr: conv_4<=BN<=conv_4_Conv2D\n  id: conv_4_batchnorm\n- {expr: conv_4<=ReLU<=conv_4, id: conv_4}\n- attrs: {kernel_size: 3, mode: max, pad: 0, stride: 2}\n  expr: pool_1<=Pooling<=conv_4\n  id: pool_1\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_conv_Conv2D<=Convolution<=pool_1\n  id: mixed_conv_Conv2D\n- attrs: {}\n  expr: mixed_conv<=BN<=mixed_conv_Conv2D\n  id: mixed_conv_batchnorm\n- {expr: mixed_conv<=ReLU<=mixed_conv, id: mixed_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, 
num_output: 48, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_tower_conv_Conv2D<=Convolution<=pool_1\n  id: mixed_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_tower_conv<=BN<=mixed_tower_conv_Conv2D\n  id: mixed_tower_conv_batchnorm\n- {expr: mixed_tower_conv<=ReLU<=mixed_tower_conv, id: mixed_tower_conv}\n- attrs: {kernel_h: 5, kernel_w: 5, num_output: 64, pad_h: 2, pad_w: 2, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_tower_conv_1_Conv2D<=Convolution<=mixed_tower_conv\n  id: mixed_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_tower_conv_1<=BN<=mixed_tower_conv_1_Conv2D\n  id: mixed_tower_conv_1_batchnorm\n- {expr: mixed_tower_conv_1<=ReLU<=mixed_tower_conv_1, id: mixed_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_tower_1_conv_Conv2D<=Convolution<=pool_1\n  id: mixed_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_tower_1_conv<=BN<=mixed_tower_1_conv_Conv2D\n  id: mixed_tower_1_conv_batchnorm\n- {expr: mixed_tower_1_conv<=ReLU<=mixed_tower_1_conv, id: mixed_tower_1_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_tower_1_conv_1_Conv2D<=Convolution<=mixed_tower_1_conv\n  id: mixed_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_tower_1_conv_1<=BN<=mixed_tower_1_conv_1_Conv2D\n  id: mixed_tower_1_conv_1_batchnorm\n- {expr: mixed_tower_1_conv_1<=ReLU<=mixed_tower_1_conv_1, id: mixed_tower_1_conv_1}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_tower_1_conv_2_Conv2D<=Convolution<=mixed_tower_1_conv_1\n  id: mixed_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_tower_1_conv_2<=BN<=mixed_tower_1_conv_2_Conv2D\n  id: mixed_tower_1_conv_2_batchnorm\n- {expr: mixed_tower_1_conv_2<=ReLU<=mixed_tower_1_conv_2, id: mixed_tower_1_conv_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: 
mixed_tower_2_pool<=Pooling<=pool_1\n  id: mixed_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 32, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_tower_2_conv_Conv2D<=Convolution<=mixed_tower_2_pool\n  id: mixed_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_tower_2_conv<=BN<=mixed_tower_2_conv_Conv2D\n  id: mixed_tower_2_conv_batchnorm\n- {expr: mixed_tower_2_conv<=ReLU<=mixed_tower_2_conv, id: mixed_tower_2_conv}\n- {expr: 'mixed_join<=Concat<=mixed_conv,mixed_tower_conv_1,mixed_tower_1_conv_2,mixed_tower_2_conv',\n  id: mixed_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_conv_Conv2D<=Convolution<=mixed_join\n  id: mixed_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_1_conv<=BN<=mixed_1_conv_Conv2D\n  id: mixed_1_conv_batchnorm\n- {expr: mixed_1_conv<=ReLU<=mixed_1_conv, id: mixed_1_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 48, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_tower_conv_Conv2D<=Convolution<=mixed_join\n  id: mixed_1_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_1_tower_conv<=BN<=mixed_1_tower_conv_Conv2D\n  id: mixed_1_tower_conv_batchnorm\n- {expr: mixed_1_tower_conv<=ReLU<=mixed_1_tower_conv, id: mixed_1_tower_conv}\n- attrs: {kernel_h: 5, kernel_w: 5, num_output: 64, pad_h: 2, pad_w: 2, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_tower_conv_1_Conv2D<=Convolution<=mixed_1_tower_conv\n  id: mixed_1_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_1_tower_conv_1<=BN<=mixed_1_tower_conv_1_Conv2D\n  id: mixed_1_tower_conv_1_batchnorm\n- {expr: mixed_1_tower_conv_1<=ReLU<=mixed_1_tower_conv_1, id: mixed_1_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_tower_1_conv_Conv2D<=Convolution<=mixed_join\n  id: mixed_1_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_1_tower_1_conv<=BN<=mixed_1_tower_1_conv_Conv2D\n  id: 
mixed_1_tower_1_conv_batchnorm\n- {expr: mixed_1_tower_1_conv<=ReLU<=mixed_1_tower_1_conv, id: mixed_1_tower_1_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_tower_1_conv_1_Conv2D<=Convolution<=mixed_1_tower_1_conv\n  id: mixed_1_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_1_tower_1_conv_1<=BN<=mixed_1_tower_1_conv_1_Conv2D\n  id: mixed_1_tower_1_conv_1_batchnorm\n- {expr: mixed_1_tower_1_conv_1<=ReLU<=mixed_1_tower_1_conv_1, id: mixed_1_tower_1_conv_1}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_tower_1_conv_2_Conv2D<=Convolution<=mixed_1_tower_1_conv_1\n  id: mixed_1_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_1_tower_1_conv_2<=BN<=mixed_1_tower_1_conv_2_Conv2D\n  id: mixed_1_tower_1_conv_2_batchnorm\n- {expr: mixed_1_tower_1_conv_2<=ReLU<=mixed_1_tower_1_conv_2, id: mixed_1_tower_1_conv_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_1_tower_2_pool<=Pooling<=mixed_join\n  id: mixed_1_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_1_tower_2_conv_Conv2D<=Convolution<=mixed_1_tower_2_pool\n  id: mixed_1_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_1_tower_2_conv<=BN<=mixed_1_tower_2_conv_Conv2D\n  id: mixed_1_tower_2_conv_batchnorm\n- {expr: mixed_1_tower_2_conv<=ReLU<=mixed_1_tower_2_conv, id: mixed_1_tower_2_conv}\n- {expr: 'mixed_1_join<=Concat<=mixed_1_conv,mixed_1_tower_conv_1,mixed_1_tower_1_conv_2,mixed_1_tower_2_conv',\n  id: mixed_1_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_conv_Conv2D<=Convolution<=mixed_1_join\n  id: mixed_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_2_conv<=BN<=mixed_2_conv_Conv2D\n  id: mixed_2_conv_batchnorm\n- {expr: mixed_2_conv<=ReLU<=mixed_2_conv, id: mixed_2_conv}\n- 
attrs: {kernel_h: 1, kernel_w: 1, num_output: 48, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_tower_conv_Conv2D<=Convolution<=mixed_1_join\n  id: mixed_2_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_2_tower_conv<=BN<=mixed_2_tower_conv_Conv2D\n  id: mixed_2_tower_conv_batchnorm\n- {expr: mixed_2_tower_conv<=ReLU<=mixed_2_tower_conv, id: mixed_2_tower_conv}\n- attrs: {kernel_h: 5, kernel_w: 5, num_output: 64, pad_h: 2, pad_w: 2, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_tower_conv_1_Conv2D<=Convolution<=mixed_2_tower_conv\n  id: mixed_2_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_2_tower_conv_1<=BN<=mixed_2_tower_conv_1_Conv2D\n  id: mixed_2_tower_conv_1_batchnorm\n- {expr: mixed_2_tower_conv_1<=ReLU<=mixed_2_tower_conv_1, id: mixed_2_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_tower_1_conv_Conv2D<=Convolution<=mixed_1_join\n  id: mixed_2_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_2_tower_1_conv<=BN<=mixed_2_tower_1_conv_Conv2D\n  id: mixed_2_tower_1_conv_batchnorm\n- {expr: mixed_2_tower_1_conv<=ReLU<=mixed_2_tower_1_conv, id: mixed_2_tower_1_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_tower_1_conv_1_Conv2D<=Convolution<=mixed_2_tower_1_conv\n  id: mixed_2_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_2_tower_1_conv_1<=BN<=mixed_2_tower_1_conv_1_Conv2D\n  id: mixed_2_tower_1_conv_1_batchnorm\n- {expr: mixed_2_tower_1_conv_1<=ReLU<=mixed_2_tower_1_conv_1, id: mixed_2_tower_1_conv_1}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_tower_1_conv_2_Conv2D<=Convolution<=mixed_2_tower_1_conv_1\n  id: mixed_2_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_2_tower_1_conv_2<=BN<=mixed_2_tower_1_conv_2_Conv2D\n  id: mixed_2_tower_1_conv_2_batchnorm\n- {expr: 
mixed_2_tower_1_conv_2<=ReLU<=mixed_2_tower_1_conv_2, id: mixed_2_tower_1_conv_2}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_2_tower_2_pool<=Pooling<=mixed_1_join\n  id: mixed_2_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_2_tower_2_conv_Conv2D<=Convolution<=mixed_2_tower_2_pool\n  id: mixed_2_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_2_tower_2_conv<=BN<=mixed_2_tower_2_conv_Conv2D\n  id: mixed_2_tower_2_conv_batchnorm\n- {expr: mixed_2_tower_2_conv<=ReLU<=mixed_2_tower_2_conv, id: mixed_2_tower_2_conv}\n- {expr: 'mixed_2_join<=Concat<=mixed_2_conv,mixed_2_tower_conv_1,mixed_2_tower_1_conv_2,mixed_2_tower_2_conv',\n  id: mixed_2_join}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 384, pad_h: 0, pad_w: 0, stride_h: 2,\n    stride_w: 2}\n  expr: mixed_3_conv_Conv2D<=Convolution<=mixed_2_join\n  id: mixed_3_conv_Conv2D\n- attrs: {}\n  expr: mixed_3_conv<=BN<=mixed_3_conv_Conv2D\n  id: mixed_3_conv_batchnorm\n- {expr: mixed_3_conv<=ReLU<=mixed_3_conv, id: mixed_3_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 64, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_3_tower_conv_Conv2D<=Convolution<=mixed_2_join\n  id: mixed_3_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_3_tower_conv<=BN<=mixed_3_tower_conv_Conv2D\n  id: mixed_3_tower_conv_batchnorm\n- {expr: mixed_3_tower_conv<=ReLU<=mixed_3_tower_conv, id: mixed_3_tower_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_3_tower_conv_1_Conv2D<=Convolution<=mixed_3_tower_conv\n  id: mixed_3_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_3_tower_conv_1<=BN<=mixed_3_tower_conv_1_Conv2D\n  id: mixed_3_tower_conv_1_batchnorm\n- {expr: mixed_3_tower_conv_1<=ReLU<=mixed_3_tower_conv_1, id: mixed_3_tower_conv_1}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 96, pad_h: 0, pad_w: 0, stride_h: 2,\n    
stride_w: 2}\n  expr: mixed_3_tower_conv_2_Conv2D<=Convolution<=mixed_3_tower_conv_1\n  id: mixed_3_tower_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_3_tower_conv_2<=BN<=mixed_3_tower_conv_2_Conv2D\n  id: mixed_3_tower_conv_2_batchnorm\n- {expr: mixed_3_tower_conv_2<=ReLU<=mixed_3_tower_conv_2, id: mixed_3_tower_conv_2}\n- attrs: {kernel_size: 3, mode: max, pad: 0, stride: 2}\n  expr: mixed_3_pool<=Pooling<=mixed_2_join\n  id: mixed_3_pool\n- {expr: 'mixed_3_join<=Concat<=mixed_3_conv,mixed_3_tower_conv_2,mixed_3_pool', id: mixed_3_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_conv_Conv2D<=Convolution<=mixed_3_join\n  id: mixed_4_conv_Conv2D\n- attrs: {}\n  expr: mixed_4_conv<=BN<=mixed_4_conv_Conv2D\n  id: mixed_4_conv_batchnorm\n- {expr: mixed_4_conv<=ReLU<=mixed_4_conv, id: mixed_4_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 128, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_conv_Conv2D<=Convolution<=mixed_3_join\n  id: mixed_4_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_conv<=BN<=mixed_4_tower_conv_Conv2D\n  id: mixed_4_tower_conv_batchnorm\n- {expr: mixed_4_tower_conv<=ReLU<=mixed_4_tower_conv, id: mixed_4_tower_conv}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 128, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_conv_1_Conv2D<=Convolution<=mixed_4_tower_conv\n  id: mixed_4_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_conv_1<=BN<=mixed_4_tower_conv_1_Conv2D\n  id: mixed_4_tower_conv_1_batchnorm\n- {expr: mixed_4_tower_conv_1<=ReLU<=mixed_4_tower_conv_1, id: mixed_4_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_conv_2_Conv2D<=Convolution<=mixed_4_tower_conv_1\n  id: mixed_4_tower_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_conv_2<=BN<=mixed_4_tower_conv_2_Conv2D\n  id: 
mixed_4_tower_conv_2_batchnorm\n- {expr: mixed_4_tower_conv_2<=ReLU<=mixed_4_tower_conv_2, id: mixed_4_tower_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 128, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_1_conv_Conv2D<=Convolution<=mixed_3_join\n  id: mixed_4_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_1_conv<=BN<=mixed_4_tower_1_conv_Conv2D\n  id: mixed_4_tower_1_conv_batchnorm\n- {expr: mixed_4_tower_1_conv<=ReLU<=mixed_4_tower_1_conv, id: mixed_4_tower_1_conv}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 128, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_1_conv_1_Conv2D<=Convolution<=mixed_4_tower_1_conv\n  id: mixed_4_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_1_conv_1<=BN<=mixed_4_tower_1_conv_1_Conv2D\n  id: mixed_4_tower_1_conv_1_batchnorm\n- {expr: mixed_4_tower_1_conv_1<=ReLU<=mixed_4_tower_1_conv_1, id: mixed_4_tower_1_conv_1}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 128, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_1_conv_2_Conv2D<=Convolution<=mixed_4_tower_1_conv_1\n  id: mixed_4_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_1_conv_2<=BN<=mixed_4_tower_1_conv_2_Conv2D\n  id: mixed_4_tower_1_conv_2_batchnorm\n- {expr: mixed_4_tower_1_conv_2<=ReLU<=mixed_4_tower_1_conv_2, id: mixed_4_tower_1_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 128, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_1_conv_3_Conv2D<=Convolution<=mixed_4_tower_1_conv_2\n  id: mixed_4_tower_1_conv_3_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_1_conv_3<=BN<=mixed_4_tower_1_conv_3_Conv2D\n  id: mixed_4_tower_1_conv_3_batchnorm\n- {expr: mixed_4_tower_1_conv_3<=ReLU<=mixed_4_tower_1_conv_3, id: mixed_4_tower_1_conv_3}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_1_conv_4_Conv2D<=Convolution<=mixed_4_tower_1_conv_3\n  id: 
mixed_4_tower_1_conv_4_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_1_conv_4<=BN<=mixed_4_tower_1_conv_4_Conv2D\n  id: mixed_4_tower_1_conv_4_batchnorm\n- {expr: mixed_4_tower_1_conv_4<=ReLU<=mixed_4_tower_1_conv_4, id: mixed_4_tower_1_conv_4}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_4_tower_2_pool<=Pooling<=mixed_3_join\n  id: mixed_4_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_4_tower_2_conv_Conv2D<=Convolution<=mixed_4_tower_2_pool\n  id: mixed_4_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_4_tower_2_conv<=BN<=mixed_4_tower_2_conv_Conv2D\n  id: mixed_4_tower_2_conv_batchnorm\n- {expr: mixed_4_tower_2_conv<=ReLU<=mixed_4_tower_2_conv, id: mixed_4_tower_2_conv}\n- {expr: 'mixed_4_join<=Concat<=mixed_4_conv,mixed_4_tower_conv_2,mixed_4_tower_1_conv_4,mixed_4_tower_2_conv',\n  id: mixed_4_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_conv_Conv2D<=Convolution<=mixed_4_join\n  id: mixed_5_conv_Conv2D\n- attrs: {}\n  expr: mixed_5_conv<=BN<=mixed_5_conv_Conv2D\n  id: mixed_5_conv_batchnorm\n- {expr: mixed_5_conv<=ReLU<=mixed_5_conv, id: mixed_5_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 160, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_conv_Conv2D<=Convolution<=mixed_4_join\n  id: mixed_5_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_conv<=BN<=mixed_5_tower_conv_Conv2D\n  id: mixed_5_tower_conv_batchnorm\n- {expr: mixed_5_tower_conv<=ReLU<=mixed_5_tower_conv, id: mixed_5_tower_conv}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 160, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_conv_1_Conv2D<=Convolution<=mixed_5_tower_conv\n  id: mixed_5_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_conv_1<=BN<=mixed_5_tower_conv_1_Conv2D\n  id: mixed_5_tower_conv_1_batchnorm\n- {expr: 
mixed_5_tower_conv_1<=ReLU<=mixed_5_tower_conv_1, id: mixed_5_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_conv_2_Conv2D<=Convolution<=mixed_5_tower_conv_1\n  id: mixed_5_tower_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_conv_2<=BN<=mixed_5_tower_conv_2_Conv2D\n  id: mixed_5_tower_conv_2_batchnorm\n- {expr: mixed_5_tower_conv_2<=ReLU<=mixed_5_tower_conv_2, id: mixed_5_tower_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 160, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_1_conv_Conv2D<=Convolution<=mixed_4_join\n  id: mixed_5_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_1_conv<=BN<=mixed_5_tower_1_conv_Conv2D\n  id: mixed_5_tower_1_conv_batchnorm\n- {expr: mixed_5_tower_1_conv<=ReLU<=mixed_5_tower_1_conv, id: mixed_5_tower_1_conv}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 160, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_1_conv_1_Conv2D<=Convolution<=mixed_5_tower_1_conv\n  id: mixed_5_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_1_conv_1<=BN<=mixed_5_tower_1_conv_1_Conv2D\n  id: mixed_5_tower_1_conv_1_batchnorm\n- {expr: mixed_5_tower_1_conv_1<=ReLU<=mixed_5_tower_1_conv_1, id: mixed_5_tower_1_conv_1}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 160, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_1_conv_2_Conv2D<=Convolution<=mixed_5_tower_1_conv_1\n  id: mixed_5_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_1_conv_2<=BN<=mixed_5_tower_1_conv_2_Conv2D\n  id: mixed_5_tower_1_conv_2_batchnorm\n- {expr: mixed_5_tower_1_conv_2<=ReLU<=mixed_5_tower_1_conv_2, id: mixed_5_tower_1_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 160, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_1_conv_3_Conv2D<=Convolution<=mixed_5_tower_1_conv_2\n  id: mixed_5_tower_1_conv_3_Conv2D\n- attrs: {}\n  expr: 
mixed_5_tower_1_conv_3<=BN<=mixed_5_tower_1_conv_3_Conv2D\n  id: mixed_5_tower_1_conv_3_batchnorm\n- {expr: mixed_5_tower_1_conv_3<=ReLU<=mixed_5_tower_1_conv_3, id: mixed_5_tower_1_conv_3}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_1_conv_4_Conv2D<=Convolution<=mixed_5_tower_1_conv_3\n  id: mixed_5_tower_1_conv_4_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_1_conv_4<=BN<=mixed_5_tower_1_conv_4_Conv2D\n  id: mixed_5_tower_1_conv_4_batchnorm\n- {expr: mixed_5_tower_1_conv_4<=ReLU<=mixed_5_tower_1_conv_4, id: mixed_5_tower_1_conv_4}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_5_tower_2_pool<=Pooling<=mixed_4_join\n  id: mixed_5_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_5_tower_2_conv_Conv2D<=Convolution<=mixed_5_tower_2_pool\n  id: mixed_5_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_5_tower_2_conv<=BN<=mixed_5_tower_2_conv_Conv2D\n  id: mixed_5_tower_2_conv_batchnorm\n- {expr: mixed_5_tower_2_conv<=ReLU<=mixed_5_tower_2_conv, id: mixed_5_tower_2_conv}\n- {expr: 'mixed_5_join<=Concat<=mixed_5_conv,mixed_5_tower_conv_2,mixed_5_tower_1_conv_4,mixed_5_tower_2_conv',\n  id: mixed_5_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_conv_Conv2D<=Convolution<=mixed_5_join\n  id: mixed_6_conv_Conv2D\n- attrs: {}\n  expr: mixed_6_conv<=BN<=mixed_6_conv_Conv2D\n  id: mixed_6_conv_batchnorm\n- {expr: mixed_6_conv<=ReLU<=mixed_6_conv, id: mixed_6_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 160, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_conv_Conv2D<=Convolution<=mixed_5_join\n  id: mixed_6_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_conv<=BN<=mixed_6_tower_conv_Conv2D\n  id: mixed_6_tower_conv_batchnorm\n- {expr: 
mixed_6_tower_conv<=ReLU<=mixed_6_tower_conv, id: mixed_6_tower_conv}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 160, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_conv_1_Conv2D<=Convolution<=mixed_6_tower_conv\n  id: mixed_6_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_conv_1<=BN<=mixed_6_tower_conv_1_Conv2D\n  id: mixed_6_tower_conv_1_batchnorm\n- {expr: mixed_6_tower_conv_1<=ReLU<=mixed_6_tower_conv_1, id: mixed_6_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_conv_2_Conv2D<=Convolution<=mixed_6_tower_conv_1\n  id: mixed_6_tower_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_conv_2<=BN<=mixed_6_tower_conv_2_Conv2D\n  id: mixed_6_tower_conv_2_batchnorm\n- {expr: mixed_6_tower_conv_2<=ReLU<=mixed_6_tower_conv_2, id: mixed_6_tower_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 160, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_1_conv_Conv2D<=Convolution<=mixed_5_join\n  id: mixed_6_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_1_conv<=BN<=mixed_6_tower_1_conv_Conv2D\n  id: mixed_6_tower_1_conv_batchnorm\n- {expr: mixed_6_tower_1_conv<=ReLU<=mixed_6_tower_1_conv, id: mixed_6_tower_1_conv}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 160, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_1_conv_1_Conv2D<=Convolution<=mixed_6_tower_1_conv\n  id: mixed_6_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_1_conv_1<=BN<=mixed_6_tower_1_conv_1_Conv2D\n  id: mixed_6_tower_1_conv_1_batchnorm\n- {expr: mixed_6_tower_1_conv_1<=ReLU<=mixed_6_tower_1_conv_1, id: mixed_6_tower_1_conv_1}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 160, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_1_conv_2_Conv2D<=Convolution<=mixed_6_tower_1_conv_1\n  id: mixed_6_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: 
mixed_6_tower_1_conv_2<=BN<=mixed_6_tower_1_conv_2_Conv2D\n  id: mixed_6_tower_1_conv_2_batchnorm\n- {expr: mixed_6_tower_1_conv_2<=ReLU<=mixed_6_tower_1_conv_2, id: mixed_6_tower_1_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 160, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_1_conv_3_Conv2D<=Convolution<=mixed_6_tower_1_conv_2\n  id: mixed_6_tower_1_conv_3_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_1_conv_3<=BN<=mixed_6_tower_1_conv_3_Conv2D\n  id: mixed_6_tower_1_conv_3_batchnorm\n- {expr: mixed_6_tower_1_conv_3<=ReLU<=mixed_6_tower_1_conv_3, id: mixed_6_tower_1_conv_3}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_1_conv_4_Conv2D<=Convolution<=mixed_6_tower_1_conv_3\n  id: mixed_6_tower_1_conv_4_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_1_conv_4<=BN<=mixed_6_tower_1_conv_4_Conv2D\n  id: mixed_6_tower_1_conv_4_batchnorm\n- {expr: mixed_6_tower_1_conv_4<=ReLU<=mixed_6_tower_1_conv_4, id: mixed_6_tower_1_conv_4}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_6_tower_2_pool<=Pooling<=mixed_5_join\n  id: mixed_6_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_6_tower_2_conv_Conv2D<=Convolution<=mixed_6_tower_2_pool\n  id: mixed_6_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_6_tower_2_conv<=BN<=mixed_6_tower_2_conv_Conv2D\n  id: mixed_6_tower_2_conv_batchnorm\n- {expr: mixed_6_tower_2_conv<=ReLU<=mixed_6_tower_2_conv, id: mixed_6_tower_2_conv}\n- {expr: 'mixed_6_join<=Concat<=mixed_6_conv,mixed_6_tower_conv_2,mixed_6_tower_1_conv_4,mixed_6_tower_2_conv',\n  id: mixed_6_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_conv_Conv2D<=Convolution<=mixed_6_join\n  id: mixed_7_conv_Conv2D\n- attrs: {}\n  expr: mixed_7_conv<=BN<=mixed_7_conv_Conv2D\n  id: 
mixed_7_conv_batchnorm\n- {expr: mixed_7_conv<=ReLU<=mixed_7_conv, id: mixed_7_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_conv_Conv2D<=Convolution<=mixed_6_join\n  id: mixed_7_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_conv<=BN<=mixed_7_tower_conv_Conv2D\n  id: mixed_7_tower_conv_batchnorm\n- {expr: mixed_7_tower_conv<=ReLU<=mixed_7_tower_conv, id: mixed_7_tower_conv}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_conv_1_Conv2D<=Convolution<=mixed_7_tower_conv\n  id: mixed_7_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_conv_1<=BN<=mixed_7_tower_conv_1_Conv2D\n  id: mixed_7_tower_conv_1_batchnorm\n- {expr: mixed_7_tower_conv_1<=ReLU<=mixed_7_tower_conv_1, id: mixed_7_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_conv_2_Conv2D<=Convolution<=mixed_7_tower_conv_1\n  id: mixed_7_tower_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_conv_2<=BN<=mixed_7_tower_conv_2_Conv2D\n  id: mixed_7_tower_conv_2_batchnorm\n- {expr: mixed_7_tower_conv_2<=ReLU<=mixed_7_tower_conv_2, id: mixed_7_tower_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_1_conv_Conv2D<=Convolution<=mixed_6_join\n  id: mixed_7_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_1_conv<=BN<=mixed_7_tower_1_conv_Conv2D\n  id: mixed_7_tower_1_conv_batchnorm\n- {expr: mixed_7_tower_1_conv<=ReLU<=mixed_7_tower_1_conv, id: mixed_7_tower_1_conv}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_1_conv_1_Conv2D<=Convolution<=mixed_7_tower_1_conv\n  id: mixed_7_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_1_conv_1<=BN<=mixed_7_tower_1_conv_1_Conv2D\n  
id: mixed_7_tower_1_conv_1_batchnorm\n- {expr: mixed_7_tower_1_conv_1<=ReLU<=mixed_7_tower_1_conv_1, id: mixed_7_tower_1_conv_1}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_1_conv_2_Conv2D<=Convolution<=mixed_7_tower_1_conv_1\n  id: mixed_7_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_1_conv_2<=BN<=mixed_7_tower_1_conv_2_Conv2D\n  id: mixed_7_tower_1_conv_2_batchnorm\n- {expr: mixed_7_tower_1_conv_2<=ReLU<=mixed_7_tower_1_conv_2, id: mixed_7_tower_1_conv_2}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_1_conv_3_Conv2D<=Convolution<=mixed_7_tower_1_conv_2\n  id: mixed_7_tower_1_conv_3_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_1_conv_3<=BN<=mixed_7_tower_1_conv_3_Conv2D\n  id: mixed_7_tower_1_conv_3_batchnorm\n- {expr: mixed_7_tower_1_conv_3<=ReLU<=mixed_7_tower_1_conv_3, id: mixed_7_tower_1_conv_3}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_1_conv_4_Conv2D<=Convolution<=mixed_7_tower_1_conv_3\n  id: mixed_7_tower_1_conv_4_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_1_conv_4<=BN<=mixed_7_tower_1_conv_4_Conv2D\n  id: mixed_7_tower_1_conv_4_batchnorm\n- {expr: mixed_7_tower_1_conv_4<=ReLU<=mixed_7_tower_1_conv_4, id: mixed_7_tower_1_conv_4}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_7_tower_2_pool<=Pooling<=mixed_6_join\n  id: mixed_7_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_7_tower_2_conv_Conv2D<=Convolution<=mixed_7_tower_2_pool\n  id: mixed_7_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_7_tower_2_conv<=BN<=mixed_7_tower_2_conv_Conv2D\n  id: mixed_7_tower_2_conv_batchnorm\n- {expr: mixed_7_tower_2_conv<=ReLU<=mixed_7_tower_2_conv, id: mixed_7_tower_2_conv}\n- {expr: 
'mixed_7_join<=Concat<=mixed_7_conv,mixed_7_tower_conv_2,mixed_7_tower_1_conv_4,mixed_7_tower_2_conv',\n  id: mixed_7_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_8_tower_conv_Conv2D<=Convolution<=mixed_7_join\n  id: mixed_8_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_8_tower_conv<=BN<=mixed_8_tower_conv_Conv2D\n  id: mixed_8_tower_conv_batchnorm\n- {expr: mixed_8_tower_conv<=ReLU<=mixed_8_tower_conv, id: mixed_8_tower_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 320, pad_h: 0, pad_w: 0, stride_h: 2,\n    stride_w: 2}\n  expr: mixed_8_tower_conv_1_Conv2D<=Convolution<=mixed_8_tower_conv\n  id: mixed_8_tower_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_8_tower_conv_1<=BN<=mixed_8_tower_conv_1_Conv2D\n  id: mixed_8_tower_conv_1_batchnorm\n- {expr: mixed_8_tower_conv_1<=ReLU<=mixed_8_tower_conv_1, id: mixed_8_tower_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_8_tower_1_conv_Conv2D<=Convolution<=mixed_7_join\n  id: mixed_8_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_8_tower_1_conv<=BN<=mixed_8_tower_1_conv_Conv2D\n  id: mixed_8_tower_1_conv_batchnorm\n- {expr: mixed_8_tower_1_conv<=ReLU<=mixed_8_tower_1_conv, id: mixed_8_tower_1_conv}\n- attrs: {kernel_h: 7, kernel_w: 1, num_output: 192, pad_h: 3, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_8_tower_1_conv_1_Conv2D<=Convolution<=mixed_8_tower_1_conv\n  id: mixed_8_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_8_tower_1_conv_1<=BN<=mixed_8_tower_1_conv_1_Conv2D\n  id: mixed_8_tower_1_conv_1_batchnorm\n- {expr: mixed_8_tower_1_conv_1<=ReLU<=mixed_8_tower_1_conv_1, id: mixed_8_tower_1_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 7, num_output: 192, pad_h: 0, pad_w: 3, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_8_tower_1_conv_2_Conv2D<=Convolution<=mixed_8_tower_1_conv_1\n  id: mixed_8_tower_1_conv_2_Conv2D\n- attrs: {}\n  expr: 
mixed_8_tower_1_conv_2<=BN<=mixed_8_tower_1_conv_2_Conv2D\n  id: mixed_8_tower_1_conv_2_batchnorm\n- {expr: mixed_8_tower_1_conv_2<=ReLU<=mixed_8_tower_1_conv_2, id: mixed_8_tower_1_conv_2}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 2,\n    stride_w: 2}\n  expr: mixed_8_tower_1_conv_3_Conv2D<=Convolution<=mixed_8_tower_1_conv_2\n  id: mixed_8_tower_1_conv_3_Conv2D\n- attrs: {}\n  expr: mixed_8_tower_1_conv_3<=BN<=mixed_8_tower_1_conv_3_Conv2D\n  id: mixed_8_tower_1_conv_3_batchnorm\n- {expr: mixed_8_tower_1_conv_3<=ReLU<=mixed_8_tower_1_conv_3, id: mixed_8_tower_1_conv_3}\n- attrs: {kernel_size: 3, mode: max, pad: 0, stride: 2}\n  expr: mixed_8_pool<=Pooling<=mixed_7_join\n  id: mixed_8_pool\n- {expr: 'mixed_8_join<=Concat<=mixed_8_tower_conv_1,mixed_8_tower_1_conv_3,mixed_8_pool',\n  id: mixed_8_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 320, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_conv_Conv2D<=Convolution<=mixed_8_join\n  id: mixed_9_conv_Conv2D\n- attrs: {}\n  expr: mixed_9_conv<=BN<=mixed_9_conv_Conv2D\n  id: mixed_9_conv_batchnorm\n- {expr: mixed_9_conv<=ReLU<=mixed_9_conv, id: mixed_9_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 384, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_conv_Conv2D<=Convolution<=mixed_8_join\n  id: mixed_9_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_conv<=BN<=mixed_9_tower_conv_Conv2D\n  id: mixed_9_tower_conv_batchnorm\n- {expr: mixed_9_tower_conv<=ReLU<=mixed_9_tower_conv, id: mixed_9_tower_conv}\n- attrs: {kernel_h: 3, kernel_w: 1, num_output: 384, pad_h: 1, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_mixed_conv_Conv2D<=Convolution<=mixed_9_tower_conv\n  id: mixed_9_tower_mixed_conv_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_mixed_conv<=BN<=mixed_9_tower_mixed_conv_Conv2D\n  id: mixed_9_tower_mixed_conv_batchnorm\n- {expr: mixed_9_tower_mixed_conv<=ReLU<=mixed_9_tower_mixed_conv, id: 
mixed_9_tower_mixed_conv}\n- attrs: {kernel_h: 1, kernel_w: 3, num_output: 384, pad_h: 0, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_mixed_conv_1_Conv2D<=Convolution<=mixed_9_tower_conv\n  id: mixed_9_tower_mixed_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_mixed_conv_1<=BN<=mixed_9_tower_mixed_conv_1_Conv2D\n  id: mixed_9_tower_mixed_conv_1_batchnorm\n- {expr: mixed_9_tower_mixed_conv_1<=ReLU<=mixed_9_tower_mixed_conv_1, id: mixed_9_tower_mixed_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 448, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_1_conv_Conv2D<=Convolution<=mixed_8_join\n  id: mixed_9_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_1_conv<=BN<=mixed_9_tower_1_conv_Conv2D\n  id: mixed_9_tower_1_conv_batchnorm\n- {expr: mixed_9_tower_1_conv<=ReLU<=mixed_9_tower_1_conv, id: mixed_9_tower_1_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 384, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_1_conv_1_Conv2D<=Convolution<=mixed_9_tower_1_conv\n  id: mixed_9_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_1_conv_1<=BN<=mixed_9_tower_1_conv_1_Conv2D\n  id: mixed_9_tower_1_conv_1_batchnorm\n- {expr: mixed_9_tower_1_conv_1<=ReLU<=mixed_9_tower_1_conv_1, id: mixed_9_tower_1_conv_1}\n- attrs: {kernel_h: 3, kernel_w: 1, num_output: 384, pad_h: 1, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_1_mixed_conv_Conv2D<=Convolution<=mixed_9_tower_1_conv_1\n  id: mixed_9_tower_1_mixed_conv_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_1_mixed_conv<=BN<=mixed_9_tower_1_mixed_conv_Conv2D\n  id: mixed_9_tower_1_mixed_conv_batchnorm\n- {expr: mixed_9_tower_1_mixed_conv<=ReLU<=mixed_9_tower_1_mixed_conv, id: mixed_9_tower_1_mixed_conv}\n- attrs: {kernel_h: 1, kernel_w: 3, num_output: 384, pad_h: 0, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_1_mixed_conv_1_Conv2D<=Convolution<=mixed_9_tower_1_conv_1\n  id: 
mixed_9_tower_1_mixed_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_1_mixed_conv_1<=BN<=mixed_9_tower_1_mixed_conv_1_Conv2D\n  id: mixed_9_tower_1_mixed_conv_1_batchnorm\n- {expr: mixed_9_tower_1_mixed_conv_1<=ReLU<=mixed_9_tower_1_mixed_conv_1, id: mixed_9_tower_1_mixed_conv_1}\n- attrs: {kernel_size: 3, mode: ave, pad: 1, stride: 1}\n  expr: mixed_9_tower_2_pool<=Pooling<=mixed_8_join\n  id: mixed_9_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_9_tower_2_conv_Conv2D<=Convolution<=mixed_9_tower_2_pool\n  id: mixed_9_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_9_tower_2_conv<=BN<=mixed_9_tower_2_conv_Conv2D\n  id: mixed_9_tower_2_conv_batchnorm\n- {expr: mixed_9_tower_2_conv<=ReLU<=mixed_9_tower_2_conv, id: mixed_9_tower_2_conv}\n- {expr: 'mixed_9_join<=Concat<=mixed_9_conv,mixed_9_tower_mixed_conv,mixed_9_tower_mixed_conv_1,mixed_9_tower_1_mixed_conv,mixed_9_tower_1_mixed_conv_1,mixed_9_tower_2_conv',\n  id: mixed_9_join}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 320, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_conv_Conv2D<=Convolution<=mixed_9_join\n  id: mixed_10_conv_Conv2D\n- attrs: {}\n  expr: mixed_10_conv<=BN<=mixed_10_conv_Conv2D\n  id: mixed_10_conv_batchnorm\n- {expr: mixed_10_conv<=ReLU<=mixed_10_conv, id: mixed_10_conv}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 384, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_conv_Conv2D<=Convolution<=mixed_9_join\n  id: mixed_10_tower_conv_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_conv<=BN<=mixed_10_tower_conv_Conv2D\n  id: mixed_10_tower_conv_batchnorm\n- {expr: mixed_10_tower_conv<=ReLU<=mixed_10_tower_conv, id: mixed_10_tower_conv}\n- attrs: {kernel_h: 3, kernel_w: 1, num_output: 384, pad_h: 1, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_mixed_conv_Conv2D<=Convolution<=mixed_10_tower_conv\n  id: mixed_10_tower_mixed_conv_Conv2D\n- 
attrs: {}\n  expr: mixed_10_tower_mixed_conv<=BN<=mixed_10_tower_mixed_conv_Conv2D\n  id: mixed_10_tower_mixed_conv_batchnorm\n- {expr: mixed_10_tower_mixed_conv<=ReLU<=mixed_10_tower_mixed_conv, id: mixed_10_tower_mixed_conv}\n- attrs: {kernel_h: 1, kernel_w: 3, num_output: 384, pad_h: 0, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_mixed_conv_1_Conv2D<=Convolution<=mixed_10_tower_conv\n  id: mixed_10_tower_mixed_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_mixed_conv_1<=BN<=mixed_10_tower_mixed_conv_1_Conv2D\n  id: mixed_10_tower_mixed_conv_1_batchnorm\n- {expr: mixed_10_tower_mixed_conv_1<=ReLU<=mixed_10_tower_mixed_conv_1, id: mixed_10_tower_mixed_conv_1}\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 448, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_1_conv_Conv2D<=Convolution<=mixed_9_join\n  id: mixed_10_tower_1_conv_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_1_conv<=BN<=mixed_10_tower_1_conv_Conv2D\n  id: mixed_10_tower_1_conv_batchnorm\n- {expr: mixed_10_tower_1_conv<=ReLU<=mixed_10_tower_1_conv, id: mixed_10_tower_1_conv}\n- attrs: {kernel_h: 3, kernel_w: 3, num_output: 384, pad_h: 1, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_1_conv_1_Conv2D<=Convolution<=mixed_10_tower_1_conv\n  id: mixed_10_tower_1_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_1_conv_1<=BN<=mixed_10_tower_1_conv_1_Conv2D\n  id: mixed_10_tower_1_conv_1_batchnorm\n- {expr: mixed_10_tower_1_conv_1<=ReLU<=mixed_10_tower_1_conv_1, id: mixed_10_tower_1_conv_1}\n- attrs: {kernel_h: 3, kernel_w: 1, num_output: 384, pad_h: 1, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_1_mixed_conv_Conv2D<=Convolution<=mixed_10_tower_1_conv_1\n  id: mixed_10_tower_1_mixed_conv_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_1_mixed_conv<=BN<=mixed_10_tower_1_mixed_conv_Conv2D\n  id: mixed_10_tower_1_mixed_conv_batchnorm\n- {expr: mixed_10_tower_1_mixed_conv<=ReLU<=mixed_10_tower_1_mixed_conv, id: 
mixed_10_tower_1_mixed_conv}\n- attrs: {kernel_h: 1, kernel_w: 3, num_output: 384, pad_h: 0, pad_w: 1, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_1_mixed_conv_1_Conv2D<=Convolution<=mixed_10_tower_1_conv_1\n  id: mixed_10_tower_1_mixed_conv_1_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_1_mixed_conv_1<=BN<=mixed_10_tower_1_mixed_conv_1_Conv2D\n  id: mixed_10_tower_1_mixed_conv_1_batchnorm\n- {expr: mixed_10_tower_1_mixed_conv_1<=ReLU<=mixed_10_tower_1_mixed_conv_1, id: mixed_10_tower_1_mixed_conv_1}\n- attrs: {kernel_size: 3, mode: max, pad: 1, stride: 1}\n  expr: mixed_10_tower_2_pool<=Pooling<=mixed_9_join\n  id: mixed_10_tower_2_pool\n- attrs: {kernel_h: 1, kernel_w: 1, num_output: 192, pad_h: 0, pad_w: 0, stride_h: 1,\n    stride_w: 1}\n  expr: mixed_10_tower_2_conv_Conv2D<=Convolution<=mixed_10_tower_2_pool\n  id: mixed_10_tower_2_conv_Conv2D\n- attrs: {}\n  expr: mixed_10_tower_2_conv<=BN<=mixed_10_tower_2_conv_Conv2D\n  id: mixed_10_tower_2_conv_batchnorm\n- {expr: mixed_10_tower_2_conv<=ReLU<=mixed_10_tower_2_conv, id: mixed_10_tower_2_conv}\n- {expr: 'mixed_10_join<=Concat<=mixed_10_conv,mixed_10_tower_mixed_conv,mixed_10_tower_mixed_conv_1,mixed_10_tower_1_mixed_conv,mixed_10_tower_1_mixed_conv_1,mixed_10_tower_2_conv',\n  id: mixed_10_join}\n- attrs: {kernel_size: 8, mode: ave, pad: 0, stride: 1}\n  expr: top_cls_global_pool<=Pooling<=mixed_10_join\n  id: top_cls_pool\n- attrs: {num_output: 1000}\n  expr: fc<=InnerProduct<=top_cls_global_pool\n  id: top_cls_fc\nname: inception2\n"
  },
  {
    "path": "model_zoo/bninception/layer_factory.py",
    "content": "import torch\nfrom torch import nn\n\n\nLAYER_BUILDER_DICT = dict()\n\n\ndef parse_expr(expr):\n    parts = expr.split('<=')\n    return parts[0].split(','), parts[1], parts[2].split(',')\n\n\ndef get_basic_layer(info, channels=None, conv_bias=False):\n    id = info['id']\n    attr = info['attrs'] if 'attrs' in info else dict()\n\n    out, op, in_vars = parse_expr(info['expr'])\n    assert(len(out) == 1)\n    assert(len(in_vars) == 1)\n    mod, out_channel = LAYER_BUILDER_DICT[op](attr, channels, conv_bias)\n\n    return id, out[0], mod, out_channel, in_vars[0]\n\n\ndef build_conv(attr, channels=None, conv_bias=False):\n    out_channels = attr['num_output']\n    ks = attr['kernel_size'] if 'kernel_size' in attr else (attr['kernel_h'], attr['kernel_w'])\n    if 'pad' in attr or ('pad_w' in attr and 'pad_h' in attr):\n        padding = attr['pad'] if 'pad' in attr else (attr['pad_h'], attr['pad_w'])\n    else:\n        padding = 0\n    if 'stride' in attr or ('stride_w' in attr and 'stride_h' in attr):\n        stride = attr['stride'] if 'stride' in attr else (attr['stride_h'], attr['stride_w'])\n    else:\n        stride = 1\n\n    conv = nn.Conv2d(channels, out_channels, ks, stride, padding, bias=conv_bias)\n\n    return conv, out_channels\n\n\ndef build_pooling(attr, channels=None, conv_bias=False):\n    method = attr['mode']\n    pad = attr['pad'] if 'pad' in attr else 0\n    if method == 'max':\n        pool = nn.MaxPool2d(attr['kernel_size'], attr['stride'], pad,\n                            ceil_mode=True)  # all Caffe pooling uses ceil mode\n    elif method == 'ave':\n        pool = nn.AvgPool2d(attr['kernel_size'], attr['stride'], pad,\n                            ceil_mode=True)  # all Caffe pooling uses ceil mode\n    else:\n        raise ValueError(\"Unknown pooling method: {}\".format(method))\n\n    return pool, channels\n\n\ndef build_relu(attr, channels=None, conv_bias=False):\n    return nn.ReLU(inplace=True), channels\n\n\n
def build_bn(attr, channels=None, conv_bias=False):\n    return nn.BatchNorm2d(channels, momentum=0.1), channels\n\n\ndef build_linear(attr, channels=None, conv_bias=False):\n    return nn.Linear(channels, attr['num_output']), channels\n\n\ndef build_dropout(attr, channels=None, conv_bias=False):\n    return nn.Dropout(p=attr['dropout_ratio']), channels\n\n\nLAYER_BUILDER_DICT['Convolution'] = build_conv\n\nLAYER_BUILDER_DICT['Pooling'] = build_pooling\n\nLAYER_BUILDER_DICT['ReLU'] = build_relu\n\nLAYER_BUILDER_DICT['Dropout'] = build_dropout\n\nLAYER_BUILDER_DICT['BN'] = build_bn\n\nLAYER_BUILDER_DICT['InnerProduct'] = build_linear\n\n"
  },
  {
    "path": "model_zoo/bninception/parse_caffe.py",
    "content": "#!/usr/bin/env python\n\nimport argparse\n\nparser = argparse.ArgumentParser(description=\"Convert a Caffe model and its learned parameters to torch\")\nparser.add_argument('model', help='network spec, usually a ProtoBuf text message')\nparser.add_argument('weights', help='network parameters, usually in a name like *.caffemodel ')\nparser.add_argument('--model_yaml', help=\"translated model spec yaml file\")\nparser.add_argument('--dump_weights', help=\"translated model parameters to be used by torch\")\nparser.add_argument('--model_version', help=\"the version of Caffe's model spec, usually 2\", default=2)\n\nargs = parser.parse_args()\n\nfrom . import caffe_pb2\nfrom google.protobuf import text_format\nfrom pprint import pprint\nimport yaml\nimport numpy as np\nimport torch\n\n\nclass CaffeVendor(object):\n    def __init__(self, net_name, weight_name, version=2):\n        print(\"loading model spec...\")\n        self._net_pb = caffe_pb2.NetParameter()\n        text_format.Merge(open(net_name).read(), self._net_pb)\n        self._weight_dict = {}\n        self._init_dict = []\n\n        if weight_name is not None:\n            print(\"loading weights...\")\n            self._weight_pb = caffe_pb2.NetParameter()\n            self._weight_pb.ParseFromString(open(weight_name, 'rb').read())\n            for l in self._weight_pb.layer:\n                self._weight_dict[l.name] = l\n\n        print(\"parsing...\")\n        self._parse_net(version)\n\n    def _parse_net(self, version):\n        self._name = str(self._net_pb.name)\n        self._layers = self._net_pb.layer if version == 2 else self._net_pb.layers\n        self._parsed_layers = [self._layer2dict(x, version) for x in self._layers]\n\n        self._net_dict = {\n            'name': self._name,\n            'inputs': [],\n            'layers': [],\n        }\n\n        self._weight_array_dict = {}\n\n        for info, blob, is_data in self._parsed_layers:\n            if not is_data and info 
is not None:\n                self._net_dict['layers'].append(info)\n\n            self._weight_array_dict.update(blob)\n\n    @staticmethod\n    def _parse_blob(blob):\n        flat_data = np.array(blob.data)\n        shaped_data = flat_data.reshape(list(blob.shape.dim))\n        return shaped_data\n\n    def _layer2dict(self, layer, version):\n        attr_dict = {}\n        params = []\n        weight_params = []\n        fillers = []\n\n        for field, value in layer.ListFields():\n            if field.name == 'top':\n                tops = [v.replace('-', '_').replace('/', '_') for v in value]\n            elif field.name == 'name':\n                layer_name = str(value).replace('-', '_').replace('/', '_')\n            elif field.name == 'bottom':\n                bottoms = [v.replace('-', '_').replace('/', '_') for v in value]\n            elif field.name == 'include':\n                if value[0].phase == 1 and op == 'Data':\n                    print('found 1 testing data layer')\n                    # skip test-phase data layers: no layer info, no blobs, is_data=True\n                    return None, dict(), True\n            elif field.name == 'type':\n                if version == 2:\n                    op = value\n                else:\n                    raise NotImplementedError\n            elif field.name == 'loss_weight':\n                pass\n            elif field.name == 'param':\n                pass\n            else:\n                # other params\n                try:\n                    for f, v in value.ListFields():\n                        if 'filler' in f.name:\n                            pass\n                        elif f.name == 'pool':\n                            attr_dict['mode'] = 'max' if v == 0 else 'ave'\n                        else:\n                            attr_dict[f.name] = v\n\n                except:\n                    print(field.name, value)\n                    raise\n\n        expr_temp = '{top}<={op}<={input}'\n\n        if layer.name in self._weight_dict:\n            blobs = 
[self._parse_blob(x) for x in self._weight_dict[layer.name].blobs]\n        else:\n            blobs = []\n\n        blob_dict = dict()\n        if len(blobs) > 0:\n            blob_dict['{}.weight'.format(layer_name)] = torch.from_numpy(blobs[0])\n            blob_dict['{}.bias'.format(layer_name)] = torch.from_numpy(blobs[1])\n            if op == 'BN':\n                blob_dict['{}.running_mean'.format(layer_name)] = torch.from_numpy(blobs[2])\n                blob_dict['{}.running_var'.format(layer_name)] = torch.from_numpy(blobs[3])\n\n        expr = expr_temp.format(top=','.join(tops), input=','.join(bottoms), op=op)\n\n        out_dict = {\n            'id': layer_name,\n            'expr': expr,\n        }\n\n        if len(attr_dict) > 0:\n            out_dict['attrs'] = attr_dict\n\n        return out_dict, blob_dict, False\n\n    @property\n    def text_form(self):\n        return str(self._net_pb)\n\n    @property\n    def info(self):\n        return {\n            'name': self._name,\n            'layers': [x.name for x in self._layers]\n        }\n\n    @property\n    def yaml(self):\n        return yaml.dump(self._net_dict)\n\n    def dump_weights(self, filename):\n        # print self._weight_array_dict.keys()\n        torch.save(self._weight_array_dict, open(filename, 'wb'))\n\n# build output\ncv = CaffeVendor(args.model, args.weights, int(args.model_version))\n\nif args.model_yaml is not None:\n    open(args.model_yaml, 'w').write(cv.yaml)\n\nif args.dump_weights is not None:\n    cv.dump_weights(args.dump_weights)\n"
  },
  {
    "path": "model_zoo/bninception/pytorch_load.py",
    "content": "import torch\nfrom torch import nn\nfrom .layer_factory import get_basic_layer, parse_expr\nimport torch.utils.model_zoo as model_zoo\nimport yaml\n\n\nclass BNInception(nn.Module):\n    def __init__(self, model_path='model_zoo/bninception/bn_inception.yaml', num_classes=101,\n                       weight_url='https://yjxiong.blob.core.windows.net/models/bn_inception-9f5701afb96c8044.pth'):\n        super(BNInception, self).__init__()\n\n        manifest = yaml.load(open(model_path))\n\n        layers = manifest['layers']\n\n        self._channel_dict = dict()\n\n        self._op_list = list()\n        for l in layers:\n            out_var, op, in_var = parse_expr(l['expr'])\n            if op != 'Concat':\n                id, out_name, module, out_channel, in_name = get_basic_layer(l,\n                                                                3 if len(self._channel_dict) == 0 else self._channel_dict[in_var[0]],\n                                                                             conv_bias=True)\n\n                self._channel_dict[out_name] = out_channel\n                setattr(self, id, module)\n                self._op_list.append((id, op, out_name, in_name))\n            else:\n                # Concat has no module of its own; use the output name as its id\n                self._op_list.append((out_var[0], op, out_var[0], in_var))\n                channel = sum([self._channel_dict[x] for x in in_var])\n                self._channel_dict[out_var[0]] = channel\n\n        self.load_state_dict(torch.utils.model_zoo.load_url(weight_url))\n\n    def forward(self, input):\n        data_dict = dict()\n        data_dict[self._op_list[0][-1]] = input\n\n        def get_hook(name):\n\n            def hook(m, grad_in, grad_out):\n                print(name, grad_out[0].data.abs().mean())\n\n            return hook\n        for op in self._op_list:\n            if op[1] != 'Concat' and op[1] != 'InnerProduct':\n                data_dict[op[2]] = getattr(self, op[0])(data_dict[op[-1]])\n                # getattr(self, 
op[0]).register_backward_hook(get_hook(op[0]))\n            elif op[1] == 'InnerProduct':\n                x = data_dict[op[-1]]\n                data_dict[op[2]] = getattr(self, op[0])(x.view(x.size(0), -1))\n            else:\n                try:\n                    data_dict[op[2]] = torch.cat(tuple(data_dict[x] for x in op[-1]), 1)\n                except:\n                    for x in op[-1]:\n                        print(x,data_dict[x].size())\n                    raise\n        return data_dict[self._op_list[-1][2]]\n\n\nclass InceptionV3(BNInception):\n    def __init__(self, model_path='model_zoo/bninception/inceptionv3.yaml', num_classes=101,\n                 weight_url='https://yjxiong.blob.core.windows.net/models/inceptionv3-cuhk-0e09b300b493bc74c.pth'):\n        super(InceptionV3, self).__init__(model_path=model_path, weight_url=weight_url, num_classes=num_classes)\n"
  },
  {
    "path": "model_zoo/inceptionresnetv2/__init__.py",
    "content": ""
  },
  {
    "path": "model_zoo/inceptionresnetv2/pytorch_load.py",
    "content": "import torch\nimport torch.nn as nn\nimport torch.utils.model_zoo as model_zoo\nimport os\nimport sys\n\nmodel_urls = {\n    'imagenet': 'http://webia.lip6.fr/~cadene/Downloads/inceptionresnetv2-d579a627.pth'\n}\n\nclass BasicConv2d(nn.Module):\n\n    def __init__(self, in_planes, out_planes, kernel_size, stride, padding=0):\n        super(BasicConv2d, self).__init__()\n        self.conv = nn.Conv2d(in_planes, out_planes, kernel_size=kernel_size, stride=stride, padding=padding, bias=False) # verify bias false\n        self.bn = nn.BatchNorm2d(out_planes, eps=0.001, momentum=0, affine=True)\n        self.relu = nn.ReLU(inplace=False)\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.relu(x)\n        return x\n\nclass Mixed_5b(nn.Module):\n\n    def __init__(self):\n        super(Mixed_5b, self).__init__()\n\n        self.branch0 = BasicConv2d(192, 96, kernel_size=1, stride=1)\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(192, 48, kernel_size=1, stride=1),\n            BasicConv2d(48, 64, kernel_size=5, stride=1, padding=2)\n        ) \n\n        self.branch2 = nn.Sequential(\n            BasicConv2d(192, 64, kernel_size=1, stride=1),\n            BasicConv2d(64, 96, kernel_size=3, stride=1, padding=1),\n            BasicConv2d(96, 96, kernel_size=3, stride=1, padding=1)\n        )\n\n        self.branch3 = nn.Sequential(\n            nn.AvgPool2d(3, stride=1, padding=1, count_include_pad=False),\n            BasicConv2d(192, 64, kernel_size=1, stride=1)\n        )\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        x3 = self.branch3(x)\n        out = torch.cat((x0, x1, x2, x3), 1)\n        return out\n\nclass Block35(nn.Module):\n\n    def __init__(self, scale=1.0):\n        super(Block35, self).__init__()\n\n        self.scale = scale\n\n        self.branch0 = BasicConv2d(320, 32, kernel_size=1, stride=1)\n\n 
       self.branch1 = nn.Sequential(\n            BasicConv2d(320, 32, kernel_size=1, stride=1),\n            BasicConv2d(32, 32, kernel_size=3, stride=1, padding=1)\n        )\n\n        self.branch2 = nn.Sequential(\n            BasicConv2d(320, 32, kernel_size=1, stride=1),\n            BasicConv2d(32, 48, kernel_size=3, stride=1, padding=1),\n            BasicConv2d(48, 64, kernel_size=3, stride=1, padding=1)\n        )\n\n        self.conv2d = nn.Conv2d(128, 320, kernel_size=1, stride=1)\n        self.relu = nn.ReLU(inplace=False)\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        out = torch.cat((x0, x1, x2), 1)\n        out = self.conv2d(out)\n        out = out * self.scale + x\n        out = self.relu(out)\n        return out\n\nclass Mixed_6a(nn.Module):\n\n    def __init__(self):\n        super(Mixed_6a, self).__init__()\n        \n        self.branch0 = BasicConv2d(320, 384, kernel_size=3, stride=2)\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(320, 256, kernel_size=1, stride=1),\n            BasicConv2d(256, 256, kernel_size=3, stride=1, padding=1),\n            BasicConv2d(256, 384, kernel_size=3, stride=2)\n        )\n\n        self.branch2 = nn.MaxPool2d(3, stride=2)\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        out = torch.cat((x0, x1, x2), 1)\n        return out\n\nclass Block17(nn.Module):\n\n    def __init__(self, scale=1.0):\n        super(Block17, self).__init__()\n\n        self.scale = scale\n\n        self.branch0 = BasicConv2d(1088, 192, kernel_size=1, stride=1)\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(1088, 128, kernel_size=1, stride=1),\n            BasicConv2d(128, 160, kernel_size=(1,7), stride=1, padding=(0,3)),\n            BasicConv2d(160, 192, kernel_size=(7,1), stride=1, padding=(3,0))\n        )\n\n        self.conv2d = 
nn.Conv2d(384, 1088, kernel_size=1, stride=1)\n        self.relu = nn.ReLU(inplace=False)\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        out = torch.cat((x0, x1), 1)\n        out = self.conv2d(out)\n        out = out * self.scale + x\n        out = self.relu(out)\n        return out\n\nclass Mixed_7a(nn.Module):\n\n    def __init__(self):\n        super(Mixed_7a, self).__init__()\n        \n        self.branch0 = nn.Sequential(\n            BasicConv2d(1088, 256, kernel_size=1, stride=1),\n            BasicConv2d(256, 384, kernel_size=3, stride=2)\n        )\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(1088, 256, kernel_size=1, stride=1),\n            BasicConv2d(256, 288, kernel_size=3, stride=2)\n        )\n\n        self.branch2 = nn.Sequential(\n            BasicConv2d(1088, 256, kernel_size=1, stride=1),\n            BasicConv2d(256, 288, kernel_size=3, stride=1, padding=1),\n            BasicConv2d(288, 320, kernel_size=3, stride=2)\n        )\n\n        self.branch3 = nn.MaxPool2d(3, stride=2)\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        x3 = self.branch3(x)\n        out = torch.cat((x0, x1, x2, x3), 1)\n        return out\n\nclass Block8(nn.Module):\n\n    def __init__(self, scale=1.0, noReLU=False):\n        super(Block8, self).__init__()\n\n        self.scale = scale\n        self.noReLU = noReLU\n\n        self.branch0 = BasicConv2d(2080, 192, kernel_size=1, stride=1)\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(2080, 192, kernel_size=1, stride=1),\n            BasicConv2d(192, 224, kernel_size=(1,3), stride=1, padding=(0,1)),\n            BasicConv2d(224, 256, kernel_size=(3,1), stride=1, padding=(1,0))\n        )\n\n        self.conv2d = nn.Conv2d(448, 2080, kernel_size=1, stride=1)\n        if not self.noReLU:\n            self.relu = nn.ReLU(inplace=False)\n\n    def 
forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        out = torch.cat((x0, x1), 1)\n        out = self.conv2d(out)\n        out = out * self.scale + x\n        if not self.noReLU:\n            out = self.relu(out)\n        return out\n\n\nclass InceptionResnetV2(nn.Module):\n\n    def __init__(self, num_classes=1001):\n        super(InceptionResnetV2, self).__init__()\n        self.conv2d_1a = BasicConv2d(3, 32, kernel_size=3, stride=2)\n        self.conv2d_2a = BasicConv2d(32, 32, kernel_size=3, stride=1)\n        self.conv2d_2b = BasicConv2d(32, 64, kernel_size=3, stride=1, padding=1)\n        self.maxpool_3a = nn.MaxPool2d(3, stride=2)\n        self.conv2d_3b = BasicConv2d(64, 80, kernel_size=1, stride=1)\n        self.conv2d_4a = BasicConv2d(80, 192, kernel_size=3, stride=1)\n        self.maxpool_5a = nn.MaxPool2d(3, stride=2)\n        self.mixed_5b = Mixed_5b()\n        self.repeat = nn.Sequential(\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17),\n            Block35(scale=0.17)\n        )\n        self.mixed_6a = Mixed_6a()\n        self.repeat_1 = nn.Sequential(\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n            Block17(scale=0.10),\n           
 Block17(scale=0.10),\n            Block17(scale=0.10)\n        )\n        self.mixed_7a = Mixed_7a()\n        self.repeat_2 = nn.Sequential(\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20),\n            Block8(scale=0.20)\n        )\n        self.block8 = Block8(noReLU=True)\n        self.conv2d_7b = BasicConv2d(2080, 1536, kernel_size=1, stride=1)\n        self.avgpool_1a = nn.AvgPool2d(8, count_include_pad=False)\n        self.classif = nn.Linear(1536, num_classes)\n\n    def forward(self, x):\n        x = self.conv2d_1a(x)\n        x = self.conv2d_2a(x)\n        x = self.conv2d_2b(x)\n        x = self.maxpool_3a(x)\n        x = self.conv2d_3b(x)\n        x = self.conv2d_4a(x)\n        x = self.maxpool_5a(x)\n        x = self.mixed_5b(x)\n        x = self.repeat(x)\n        x = self.mixed_6a(x)\n        x = self.repeat_1(x)\n        x = self.mixed_7a(x)\n        x = self.repeat_2(x)\n        x = self.block8(x)\n        x = self.conv2d_7b(x)\n        x = self.avgpool_1a(x)\n        x = x.view(x.size(0), -1)\n        x = self.classif(x)\n        return x\n\ndef inceptionresnetv2(pretrained=True):\n    r\"\"\"InceptionResnetV2 model architecture from the\n    `\"InceptionV4, Inception-ResNet...\" <https://arxiv.org/abs/1602.07261>`_ paper.\n\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = InceptionResnetV2()\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(model_urls['imagenet']))\n    return model\n\n\n######################################################################\n## Load parameters from HDF5 to Dict\n######################################################################\n\nimport h5py\n\ndef load_conv2d(state_dict, name_pth, name_tf):\n    h5f = 
h5py.File('dump/InceptionResnetV2/'+name_tf+'.h5', 'r')\n    state_dict[name_pth+'.conv.weight'] = torch.from_numpy(h5f['weights'][()]).permute(3, 2, 0, 1)\n    out_planes = state_dict[name_pth+'.conv.weight'].size(0)\n    state_dict[name_pth+'.bn.weight'] = torch.ones(out_planes)\n    state_dict[name_pth+'.bn.bias'] = torch.from_numpy(h5f['beta'][()])\n    state_dict[name_pth+'.bn.running_mean'] = torch.from_numpy(h5f['mean'][()])\n    state_dict[name_pth+'.bn.running_var'] = torch.from_numpy(h5f['var'][()])\n    h5f.close()\n\ndef load_conv2d_nobn(state_dict, name_pth, name_tf):\n    h5f = h5py.File('dump/InceptionResnetV2/'+name_tf+'.h5', 'r')\n    state_dict[name_pth+'.weight'] = torch.from_numpy(h5f['weights'][()]).permute(3, 2, 0, 1)\n    state_dict[name_pth+'.bias'] = torch.from_numpy(h5f['biases'][()])\n    h5f.close()\n\ndef load_linear(state_dict, name_pth, name_tf):\n    h5f = h5py.File('dump/InceptionResnetV2/'+name_tf+'.h5', 'r')\n    state_dict[name_pth+'.weight'] = torch.from_numpy(h5f['weights'][()]).t()\n    state_dict[name_pth+'.bias'] = torch.from_numpy(h5f['biases'][()])\n    h5f.close()\n\ndef load_mixed_5b(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_5x5')\n    load_conv2d(state_dict, name_pth+'.branch2.0', name_tf+'/Branch_2/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch2.1', name_tf+'/Branch_2/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.2', name_tf+'/Branch_2/Conv2d_0c_3x3')\n    load_conv2d(state_dict, name_pth+'.branch3.1', name_tf+'/Branch_3/Conv2d_0b_1x1')\n\ndef load_block35(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.0', 
name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.0', name_tf+'/Branch_2/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch2.1', name_tf+'/Branch_2/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.2', name_tf+'/Branch_2/Conv2d_0c_3x3')\n    load_conv2d_nobn(state_dict, name_pth+'.conv2d', name_tf+'/Conv2d_1x1')\n\ndef load_mixed_6a(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch1.2', name_tf+'/Branch_1/Conv2d_1a_3x3')\n\ndef load_block17(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_1x7')\n    load_conv2d(state_dict, name_pth+'.branch1.2', name_tf+'/Branch_1/Conv2d_0c_7x1')\n    load_conv2d_nobn(state_dict, name_pth+'.conv2d', name_tf+'/Conv2d_1x1')\n\ndef load_mixed_7a(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0.0', name_tf+'/Branch_0/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch0.1', name_tf+'/Branch_0/Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.0', name_tf+'/Branch_2/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch2.1', name_tf+'/Branch_2/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.2', name_tf+'/Branch_2/Conv2d_1a_3x3')\n\ndef 
load_block8(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_1x3')\n    load_conv2d(state_dict, name_pth+'.branch1.2', name_tf+'/Branch_1/Conv2d_0c_3x1')\n    load_conv2d_nobn(state_dict, name_pth+'.conv2d', name_tf+'/Conv2d_1x1')\n\n\n\ndef load():\n    state_dict={}\n    \n    load_conv2d(state_dict, name_pth='conv2d_1a', name_tf='Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth='conv2d_2a', name_tf='Conv2d_2a_3x3')\n    load_conv2d(state_dict, name_pth='conv2d_2b', name_tf='Conv2d_2b_3x3')\n    \n    load_conv2d(state_dict, name_pth='conv2d_3b', name_tf='Conv2d_3b_1x1')\n    load_conv2d(state_dict, name_pth='conv2d_4a', name_tf='Conv2d_4a_3x3')\n\n    load_mixed_5b(state_dict, name_pth='mixed_5b', name_tf='Mixed_5b')\n\n    for i in range(10):\n        load_block35(state_dict, name_pth='repeat.'+str(i), name_tf='Repeat/block35_'+str(i+1))\n\n    load_mixed_6a(state_dict, name_pth='mixed_6a', name_tf='Mixed_6a')\n\n    for i in range(20):\n        load_block17(state_dict, name_pth='repeat_1.'+str(i), name_tf='Repeat_1/block17_'+str(i+1))\n\n    load_mixed_7a(state_dict, name_pth='mixed_7a', name_tf='Mixed_7a')\n\n    for i in range(9):\n        load_block8(state_dict, name_pth='repeat_2.'+str(i), name_tf='Repeat_2/block8_'+str(i+1))\n\n    load_block8(state_dict, name_pth='block8', name_tf='Block8')\n    load_conv2d(state_dict, name_pth='conv2d_7b', name_tf='Conv2d_7b_1x1')\n    load_linear(state_dict, name_pth='classif', name_tf='Logits')\n\n    return state_dict\n\n######################################################################\n## Test\n######################################################################\n\ndef test(model):\n    from scipy import misc\n    img = misc.imread('lena_299.png')\n    inputs = 
torch.ones(1,299,299,3)\n    #inputs[0] = torch.from_numpy(img)\n\n    inputs[0,0,0,0] = -1\n    inputs.transpose_(1,3)\n    inputs.transpose_(2,3)\n\n    print(inputs.mean())\n    print(inputs.std())\n\n    #inputs.sub_(0.5).div_(0.5)\n    #inputs.sub_(inputs)\n    # 1, 3, 299, 299\n\n    outputs = model.forward(torch.autograd.Variable(inputs))\n    h5f = h5py.File('dump/InceptionResnetV2/Logits.h5', 'r')\n    outputs_tf = torch.from_numpy(h5f['out'][()])\n    h5f.close()\n    outputs = torch.nn.functional.softmax(outputs)\n    print(outputs.sum())\n    print(outputs[0])\n    print(outputs_tf.sum())\n    print(outputs_tf[0])\n    print(torch.dist(outputs.data, outputs_tf))\n    return outputs\n \ndef test_conv2d(module, name):\n    #global output_tf\n    h5f = h5py.File('dump/InceptionResnetV2/'+name+'.h5', 'r')\n    output_tf_conv = torch.from_numpy(h5f['conv_out'][()])\n    output_tf_conv.transpose_(1,3)\n    output_tf_conv.transpose_(2,3)\n    output_tf_relu = torch.from_numpy(h5f['relu_out'][()])\n    output_tf_relu.transpose_(1,3)\n    output_tf_relu.transpose_(2,3)\n    h5f.close()\n    def test_dist_conv(self, input, output):\n        print(name, 'conv', torch.dist(output.data, output_tf_conv))\n    module.conv.register_forward_hook(test_dist_conv)\n    def test_dist_relu(self, input, output):\n        print(name, 'relu', torch.dist(output.data, output_tf_relu))\n    module.relu.register_forward_hook(test_dist_relu)\n\ndef test_conv2d_nobn(module, name):\n    #global output_tf\n    h5f = h5py.File('dump/InceptionResnetV2/'+name+'.h5', 'r')\n    output_tf = torch.from_numpy(h5f['conv_out'][()])\n    output_tf.transpose_(1,3)\n    output_tf.transpose_(2,3)\n    h5f.close()\n    def test_dist(self, input, output):\n        print(name, 'conv+bias', torch.dist(output.data, output_tf))\n    module.register_forward_hook(test_dist)\n\ndef test_mixed_5b(module, name):\n    test_conv2d(module.branch0, name+'/Branch_0/Conv2d_1x1')\n    test_conv2d(module.branch1[0], 
name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_0b_5x5')\n    test_conv2d(module.branch2[0], name+'/Branch_2/Conv2d_0a_1x1')\n    test_conv2d(module.branch2[1], name+'/Branch_2/Conv2d_0b_3x3')\n    test_conv2d(module.branch2[2], name+'/Branch_2/Conv2d_0c_3x3')\n    test_conv2d(module.branch3[1], name+'/Branch_3/Conv2d_0b_1x1')\n\ndef test_block35(module, name):\n    test_conv2d(module.branch0, name+'/Branch_0/Conv2d_1x1')\n    test_conv2d(module.branch1[0], name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_0b_3x3')\n    test_conv2d(module.branch2[0], name+'/Branch_2/Conv2d_0a_1x1')\n    test_conv2d(module.branch2[1], name+'/Branch_2/Conv2d_0b_3x3')\n    test_conv2d(module.branch2[2], name+'/Branch_2/Conv2d_0c_3x3')\n    test_conv2d_nobn(module.conv2d, name+'/Conv2d_1x1')\n\ndef test_mixed_6a(module, name):\n    test_conv2d(module.branch0, name+'/Branch_0/Conv2d_1a_3x3')\n    test_conv2d(module.branch1[0], name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_0b_3x3')\n    test_conv2d(module.branch1[2], name+'/Branch_1/Conv2d_1a_3x3')\n\ndef test_block17(module, name):\n    test_conv2d(module.branch0, name+'/Branch_0/Conv2d_1x1')\n    test_conv2d(module.branch1[0], name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_0b_1x7')\n    test_conv2d(module.branch1[2], name+'/Branch_1/Conv2d_0c_7x1')\n    test_conv2d_nobn(module.conv2d, name+'/Conv2d_1x1')\n\ndef test_mixed_7a(module, name):\n    test_conv2d(module.branch0[0], name+'/Branch_0/Conv2d_0a_1x1')\n    test_conv2d(module.branch0[1], name+'/Branch_0/Conv2d_1a_3x3')\n    test_conv2d(module.branch1[0], name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_1a_3x3')\n    test_conv2d(module.branch2[0], name+'/Branch_2/Conv2d_0a_1x1')\n    test_conv2d(module.branch2[1], name+'/Branch_2/Conv2d_0b_3x3')\n    
test_conv2d(module.branch2[2], name+'/Branch_2/Conv2d_1a_3x3')\n\ndef test_block8(module, name):\n    test_conv2d(module.branch0, name+'/Branch_0/Conv2d_1x1')\n    test_conv2d(module.branch1[0], name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_0b_1x3')\n    test_conv2d(module.branch1[2], name+'/Branch_1/Conv2d_0c_3x1')\n    test_conv2d_nobn(module.conv2d, name+'/Conv2d_1x1')\n\n######################################################################\n## Main\n######################################################################\n\nif __name__ == \"__main__\":\n\n    import h5py\n\n    model = InceptionResnetV2()\n    state_dict = load()\n    model.load_state_dict(state_dict)\n    model.eval()\n\n    os.system('mkdir -p save')\n    torch.save(model, 'save/inceptionresnetv2.pth')\n    torch.save(state_dict, 'save/inceptionresnetv2_state.pth')\n\n    test_conv2d(model.conv2d_1a, 'Conv2d_1a_3x3')\n    test_conv2d(model.conv2d_2a, 'Conv2d_2a_3x3')\n    test_conv2d(model.conv2d_2b, 'Conv2d_2b_3x3')\n    test_conv2d(model.conv2d_3b, 'Conv2d_3b_1x1')\n    test_conv2d(model.conv2d_4a, 'Conv2d_4a_3x3')\n\n    test_mixed_5b(model.mixed_5b, 'Mixed_5b')\n\n    for i in range(len(model.repeat._modules)):\n        test_block35(model.repeat[i], 'Repeat/block35_'+str(i+1))\n\n    test_mixed_6a(model.mixed_6a, 'Mixed_6a')\n\n    for i in range(len(model.repeat_1._modules)):\n        test_block17(model.repeat_1[i], 'Repeat_1/block17_'+str(i+1))\n\n    test_mixed_7a(model.mixed_7a, 'Mixed_7a')\n\n    for i in range(len(model.repeat_2._modules)):\n        test_block8(model.repeat_2[i], 'Repeat_2/block8_'+str(i+1))\n\n    test_block8(model.block8, 'Block8')\n\n    test_conv2d(model.conv2d_7b, 'Conv2d_7b_1x1')\n\n    outputs = test(model)\n    # test_conv2d(model.features[1], 'Conv2d_2a_3x3')\n    # test_conv2d(model.features[2], 'Conv2d_2b_3x3')\n    # test_conv2d(model.features[3].conv, 'Mixed_3a/Branch_1/Conv2d_0a_3x3')\n    
#test_mixed_4a_7a(model.features[4], 'Mixed_4a')\n\n"
  },
  {
    "path": "model_zoo/inceptionresnetv2/tensorflow_dump.py",
    "content": "# python3\n\n# TensorBoard\n# python ~/.local/lib/python3.5/site-packages/tensorflow/tensorboard/tensorboard.py --logdir=logs --port=6007\n\n# python /home/cadene/anaconda3/envs/tensorflow/lib/python3.5/site-packages/tensorflow/tensorboard/tensorboard.py  --logdir=logs --port=6007\n\n\nimport os\nimport sys\nimport h5py\nimport math\nimport urllib.request\nimport numpy as np\nimport tensorflow as tf\n\nsys.path.append('models/slim')\nfrom datasets import dataset_utils\nfrom datasets import imagenet\nfrom nets import inception\nfrom preprocessing import inception_preprocessing\n\nslim = tf.contrib.slim\n\nimage_size = inception.inception_v3.default_image_size\n\nurl = 'http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz'\ncheckpoints_dir = '/tmp/checkpoints/'\n\ndef make_padding(padding_name, conv_shape):\n  padding_name = padding_name.decode(\"utf-8\")\n  if padding_name == \"VALID\":\n    return [0, 0]\n  elif padding_name == \"SAME\":\n    #return [math.ceil(int(conv_shape[0])/2), math.ceil(int(conv_shape[1])/2)]\n    return [math.floor(int(conv_shape[0])/2), math.floor(int(conv_shape[1])/2)]\n  else:\n    sys.exit('Invalid padding name '+padding_name)\n\ndef dump_conv2d(name='Conv2d_1a_3x3'):\n  conv_operation = sess.graph.get_operation_by_name('InceptionResnetV2/'+name+'/Conv2D') # replace 'convolution' with 'Conv2D' if this raises an error\n  weights_tensor = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/weights:0')\n  weights = weights_tensor.eval()\n  padding = make_padding(conv_operation.get_attr('padding'), weights_tensor.get_shape())\n  strides = conv_operation.get_attr('strides')\n  conv_out = sess.graph.get_operation_by_name('InceptionResnetV2/'+name+'/Conv2D').outputs[0].eval() # replace 'convolution' with 'Conv2D' if this raises an error\n\n  beta = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/BatchNorm/beta:0').eval()\n  #gamma = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/BatchNorm/gamma:0').eval()\n  
mean = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/BatchNorm/moving_mean:0').eval()\n  var = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/BatchNorm/moving_variance:0').eval()\n  \n  relu_out = sess.graph.get_operation_by_name('InceptionResnetV2/'+name+'/Relu').outputs[0].eval()\n\n  os.system('mkdir -p dump/InceptionResnetV2/'+name)\n  h5f = h5py.File('dump/InceptionResnetV2/'+name+'.h5', 'w')\n  # conv\n  h5f.create_dataset(\"weights\", data=weights)\n  h5f.create_dataset(\"strides\", data=strides)\n  h5f.create_dataset(\"padding\", data=padding)\n  h5f.create_dataset(\"conv_out\", data=conv_out)\n  # batch norm\n  h5f.create_dataset(\"beta\", data=beta)\n  #h5f.create_dataset(\"gamma\", data=gamma)\n  h5f.create_dataset(\"mean\", data=mean)\n  h5f.create_dataset(\"var\", data=var)\n  h5f.create_dataset(\"relu_out\", data=relu_out)\n  h5f.close()\n\ndef dump_conv2d_nobn(name='Conv2d_1x1'):\n  conv_operation = sess.graph.get_operation_by_name('InceptionResnetV2/'+name+'/Conv2D') # replace 'convolution' with 'Conv2D' if this raises an error\n  weights_tensor = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/weights:0')\n  weights = weights_tensor.eval()\n  biases_tensor = sess.graph.get_tensor_by_name('InceptionResnetV2/'+name+'/biases:0')\n  biases = biases_tensor.eval()\n  padding = make_padding(conv_operation.get_attr('padding'), weights_tensor.get_shape())\n  strides = conv_operation.get_attr('strides')\n  conv_out = sess.graph.get_operation_by_name('InceptionResnetV2/'+name+'/BiasAdd').outputs[0].eval() # replace 'convolution' with 'Conv2D' if this raises an error\n\n  os.system('mkdir -p dump/InceptionResnetV2/'+name)\n  h5f = h5py.File('dump/InceptionResnetV2/'+name+'.h5', 'w')\n  # conv\n  h5f.create_dataset(\"weights\", data=weights)\n  h5f.create_dataset(\"biases\", data=biases)\n  h5f.create_dataset(\"strides\", data=strides)\n  h5f.create_dataset(\"padding\", data=padding)\n  h5f.create_dataset(\"conv_out\", data=conv_out)\n  h5f.close()\n\ndef 
dump_logits():\n  operation = sess.graph.get_operation_by_name('InceptionResnetV2/Logits/Predictions')\n\n  weights_tensor = sess.graph.get_tensor_by_name('InceptionResnetV2/Logits/Logits/weights:0')\n  weights = weights_tensor.eval()\n\n  biases_tensor = sess.graph.get_tensor_by_name('InceptionResnetV2/Logits/Logits/biases:0')\n  biases = biases_tensor.eval()\n  \n  out = operation.outputs[0].eval()\n  print(out)\n\n  h5f = h5py.File('dump/InceptionResnetV2/Logits.h5', 'w')\n  h5f.create_dataset(\"weights\", data=weights)\n  h5f.create_dataset(\"biases\", data=biases)\n  h5f.create_dataset(\"out\", data=out)\n  h5f.close()\n\n\n# def dump_avgpool(name='Mixed_5b/Branch_3/AvgPool_0a_3x3'):\n#   operation = sess.graph.get_operation_by_name('InceptionResnetV2/InceptionResnetV2/'+name+'/AvgPool')\n#   out = operation.outputs[0].eval()\n#   os.system('mkdir -p dump/InceptionResnetV2/'+name)\n#   h5f = h5py.File('dump/InceptionResnetV2/'+name+'.h5', 'w')\n#   h5f.create_dataset(\"out\", data=out)\n#   h5f.close()\n\n# def dump_concats():\n#   operation1 = sess.graph.get_operation_by_name('InceptionResnetV2/InceptionResnetV2/Mixed_7b/Branch_1/concat')\n#   operation2 = sess.graph.get_operation_by_name('InceptionResnetV2/InceptionResnetV2/Mixed_7b/Branch_2/concat')\n#   out1 = operation1.outputs[0].eval()\n#   out2 = operation2.outputs[0].eval()\n#   os.system('mkdir -p dump/InceptionResnetV2/Mixed_7b/Branch_1/concat')\n#   h5f = h5py.File('dump/InceptionResnetV2/Mixed_7b/Branch_1/concat.h5', 'w')\n#   h5f.create_dataset(\"out\", data=out1)\n#   h5f.close()\n#   os.system('mkdir -p dump/InceptionResnetV2/Mixed_7b/Branch_2/concat')\n#   h5f = h5py.File('dump/InceptionResnetV2/Mixed_7b/Branch_2/concat.h5', 'w')\n#   h5f.create_dataset(\"out\", data=out2)\n#   h5f.close()\n\n\ndef dump_mixed_5b(name='Mixed_5b'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_5x5')\n  
dump_conv2d(name=name+'/Branch_2/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0c_3x3')\n  dump_conv2d(name=name+'/Branch_3/Conv2d_0b_1x1')\n\ndef dump_block35(name='Repeat/block35_1'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0c_3x3')\n  dump_conv2d_nobn(name=name+'/Conv2d_1x1')\n\ndef dump_mixed_6a(name='Mixed_6a'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1a_3x3')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_1a_3x3')\n\ndef dump_block17(name='Repeat_1/block17_1'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_1x7')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0c_7x1')\n  dump_conv2d_nobn(name=name+'/Conv2d_1x1')\n\ndef dump_mixed_7a(name='Mixed_7a'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1a_3x3')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_1a_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_1a_3x3')\n\ndef dump_block8(name='Repeat_2/block8_1'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_1x3')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0c_3x1')\n  dump_conv2d_nobn(name=name+'/Conv2d_1x1')\n\n\nif not tf.gfile.Exists(checkpoints_dir+'inception_resnet_v2_2016_08_30.ckpt'):\n  tf.gfile.MakeDirs(checkpoints_dir)\n  
dataset_utils.download_and_uncompress_tarball(url, checkpoints_dir)\n\nwith tf.Graph().as_default():\n\n  # Create model architecture\n\n  from scipy import misc\n  img = misc.imread('lena_299.png')\n  print(img.shape)\n\n  inputs = np.ones((1,299,299,3), dtype=np.float32)\n  inputs[0,0,0,0] = -1\n  #inputs[0] = img\n  print(inputs.mean())\n  print(inputs.std())\n  inputs = tf.stack(inputs)  # tf.pack was renamed tf.stack in TF 1.0\n  # tensorflow normalization\n  # https://github.com/tensorflow/models/blob/master/slim/preprocessing/inception_preprocessing.py#L273\n  #inputs = tf.subtract(inputs, 0.5)\n  #inputs = tf.multiply(inputs, 2.0)\n\n\n  with slim.arg_scope(inception.inception_resnet_v2_arg_scope()):\n    logits, _ = inception.inception_resnet_v2(inputs, num_classes=1001, is_training=False)\n\n  with tf.Session() as sess:\n\n    # Initialize model\n    init_fn = slim.assign_from_checkpoint_fn(\n    os.path.join(checkpoints_dir, 'inception_resnet_v2_2016_08_30.ckpt'),\n    slim.get_model_variables('InceptionResnetV2'))\n\n    init_fn(sess)\n\n    # Display model variables\n    for v in slim.get_model_variables():\n      print('name = {}, shape = {}'.format(v.name, v.get_shape()))\n\n    # Create graph\n    os.system(\"rm -rf logs\")\n    os.system(\"mkdir -p logs\")\n\n    # TF 1.x summary API (tf.scalar_summary etc. were removed in TF 1.0)\n    tf.summary.scalar('logs', logits[0][0])\n    summary_op = tf.summary.merge_all()\n    summary_writer = tf.summary.FileWriter(\"logs\", sess.graph)\n\n    out = sess.run(summary_op)\n    summary_writer.add_summary(out, 0)\n\n    ###############################\n    # Dump parameters and outputs\n\n    dump_conv2d(name='Conv2d_1a_3x3')\n    dump_conv2d(name='Conv2d_2a_3x3')\n    dump_conv2d(name='Conv2d_2b_3x3')\n    # MaxPooling\n\n    dump_conv2d(name='Conv2d_3b_1x1')\n    dump_conv2d(name='Conv2d_4a_3x3')\n    # MaxPooling\n\n    dump_mixed_5b()\n    for i in range(1,11):\n      dump_block35(name='Repeat/block35_'+str(i))\n\n    dump_mixed_6a()\n    for i in range(1,21):\n      dump_block17(name='Repeat_1/block17_'+str(i))\n\n    
dump_mixed_7a()\n    for i in range(1,10):\n      dump_block8(name='Repeat_2/block8_'+str(i))\n    \n    dump_block8(name='Block8')\n    dump_conv2d(name='Conv2d_7b_1x1')\n    # AvgPooling\n    \n    dump_logits()"
  },
  {
    "path": "model_zoo/inceptionresnetv2/torch_load.lua",
    "content": "require 'nn'\nlocal hdf5 = require 'hdf5'\ntorch.setdefaulttensortype('torch.FloatTensor')\nrequire 'image'\n\nlocal function SpatialConvBatchNormReLU(nInputPlane, nOutputPlane, kW, kH, dW, dH, padW, padH)\n   local std_epsilon = 0.001\n   local conv = nn.SpatialConvolution(nInputPlane, nOutputPlane, kW, kH, dW, dH, padW, padH)\n   conv:noBias()\n   local module = nn.Sequential()\n   module:add(conv)\n   module:add(nn.SpatialBatchNormalization(nOutputPlane, std_epsilon, nil, true))\n   module:add(nn.ReLU(true))\n   return module\nend\n\n-- average pooling without count_include_pad; takes the usual pooling args\nlocal function SpatialAveragePoolingNoCIP(kW, kH, dW, dH, padW, padH)\n   local module = nn.SpatialAveragePooling(kW, kH, dW, dH, padW, padH)\n   module.count_include_pad = false\n   return module\nend\n\nlocal function Tower(layers)\n   local tower = nn.Sequential()\n   for i=1,#layers do\n      tower:add(layers[i])\n   end\n   return tower\nend\n\nlocal function FilterConcat(towers)\n   local concat = nn.DepthConcat(2)\n   for i=1,#towers do\n      concat:add(towers[i])\n   end\n   return concat\nend\n\nlocal function Mixed_5b()\n   local module = FilterConcat({\n      SpatialConvBatchNormReLU(192, 96, 1, 1, 1, 1),\n      Tower({\n         SpatialConvBatchNormReLU(192, 48, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(48, 64, 5, 5, 1, 1, 2, 2)\n      }),\n      Tower({\n         SpatialConvBatchNormReLU(192, 64, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(64, 96, 3, 3, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(96, 96, 3, 3, 1, 1, 1, 1)\n      }),\n      Tower({\n         SpatialAveragePoolingNoCIP(3, 3, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(192, 64, 1, 1, 1, 1)\n      })\n   })\n   return module\nend\n\nlocal function Block35(scale)\n   local scale = scale or 0.17\n\n   local branchs = FilterConcat({\n      SpatialConvBatchNormReLU(320, 32, 1, 1, 1, 1),\n      Tower({\n         SpatialConvBatchNormReLU(320, 32, 1, 1, 1, 1),\n         
SpatialConvBatchNormReLU(32, 32, 3, 3, 1, 1, 1, 1)\n      }),\n      Tower({\n         SpatialConvBatchNormReLU(320, 32, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(32, 48, 3, 3, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(48, 64, 3, 3, 1, 1, 1, 1)\n      })\n   })\n\n   local shortcut = nn.ConcatTable(2)\n   shortcut:add(nn.Identity())\n   shortcut:add(\n      Tower({\n         branchs,\n         nn.SpatialConvolution(128, 320, 1, 1, 1, 1),\n         nn.MulConstant(scale)\n      })\n   )\n\n   local module = nn.Sequential()\n   module:add(shortcut)\n   module:add(nn.CAddTable(true))\n   module:add(nn.ReLU(true))\n   return module\nend\n\nlocal function Mixed_6a()\n   local module = FilterConcat({\n      SpatialConvBatchNormReLU(320, 384, 3, 3, 2, 2),\n      Tower({\n         SpatialConvBatchNormReLU(320, 256, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(256, 256, 3, 3, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(256, 384, 3, 3, 2, 2)\n      }),\n      nn.SpatialMaxPooling(3, 3, 2, 2)\n   })\n   return module\nend\n\nlocal function Block17(scale)\n   local scale = scale or 0.10\n\n   local branchs = FilterConcat({\n      SpatialConvBatchNormReLU(1088, 192, 1, 1, 1, 1),\n      Tower({\n         SpatialConvBatchNormReLU(1088, 128, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(128, 160, 7, 1, 1, 1, 3, 0),\n         SpatialConvBatchNormReLU(160, 192, 1, 7, 1, 1, 0, 3)\n      })\n   })\n\n   local shortcut = nn.ConcatTable(2)\n   shortcut:add(nn.Identity())\n   shortcut:add(\n      Tower({\n         branchs,\n         nn.SpatialConvolution(384, 1088, 1, 1, 1, 1),\n         nn.MulConstant(scale)\n      })\n   )\n\n   local module = nn.Sequential()\n   module:add(shortcut)\n   module:add(nn.CAddTable(true))\n   module:add(nn.ReLU(true))\n   return module\nend\n\nlocal function Mixed_7a()\n   local module = FilterConcat({\n      Tower({\n         SpatialConvBatchNormReLU(1088, 256, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(256, 384, 3, 3, 2, 2)\n     
 }),\n      Tower({\n         SpatialConvBatchNormReLU(1088, 256, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(256, 288, 3, 3, 2, 2)\n      }),\n      Tower({\n         SpatialConvBatchNormReLU(1088, 256, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(256, 288, 3, 3, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(288, 320, 3, 3, 2, 2)\n      }),\n      nn.SpatialMaxPooling(3, 3, 2, 2)\n   })\n   return module\nend\n\nlocal function Block8(scale, noReLU)\n   local scale = scale or 0.20\n\n   local branchs = FilterConcat({\n      SpatialConvBatchNormReLU(2080, 192, 1, 1, 1, 1),\n      Tower({\n         SpatialConvBatchNormReLU(2080, 192, 1, 1, 1, 1),\n         SpatialConvBatchNormReLU(192, 224, 3, 1, 1, 1, 1, 0),\n         SpatialConvBatchNormReLU(224, 256, 1, 3, 1, 1, 0, 1)\n      })\n   })\n\n   local shortcut = nn.ConcatTable(2)\n   shortcut:add(nn.Identity())\n   shortcut:add(\n      Tower({\n         branchs,\n         nn.SpatialConvolution(448, 2080, 1, 1, 1, 1),\n         nn.MulConstant(scale)\n      })\n   )\n\n   local module = nn.Sequential()\n   module:add(shortcut)\n   module:add(nn.CAddTable(true))\n   if not noReLU then\n      module:add(nn.ReLU(true))\n   end\n   return module\nend\n\nlocal function InceptionResnetV2(nclass)\n   local nclass = nclass or 1001\n   local net = nn.Sequential()\n   net:add(SpatialConvBatchNormReLU(3, 32, 3, 3, 2, 2))\n   net:add(SpatialConvBatchNormReLU(32, 32, 3, 3, 1, 1))\n   net:add(SpatialConvBatchNormReLU(32, 64, 3, 3, 1, 1, 1, 1))\n   net:add(nn.SpatialMaxPooling(3, 3, 2, 2))\n   net:add(SpatialConvBatchNormReLU(64, 80, 1, 1, 1, 1))\n   net:add(SpatialConvBatchNormReLU(80, 192, 3, 3, 1, 1))\n   net:add(nn.SpatialMaxPooling(3, 3, 2, 2))\n   net:add(Mixed_5b())\n   net:add(Tower({\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35(),\n      Block35() -- 10th\n   }))\n   net:add(Mixed_6a())\n   
net:add(Tower({\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17(),\n      Block17() -- 20th\n   }))\n   net:add(Mixed_7a())\n   net:add(Tower({\n      Block8(),\n      Block8(),\n      Block8(),\n      Block8(),\n      Block8(),\n      Block8(),\n      Block8(),\n      Block8(),\n      Block8() -- 9th\n   }))\n   net:add(Block8(1.0, true))\n   net:add(SpatialConvBatchNormReLU(2080, 1536, 1, 1, 1, 1))\n   net:add(nn.SpatialAveragePooling(8, 8))\n   net:add(nn.View(1536))\n   net:add(nn.Linear(1536, nclass))\n   return net\nend\n\n----------------\n-- Load --\n----------------\n\nlocal function load_conv2d(module, name)\n   -- local name = name or 'Conv2d_1a_3x3'\n   local h5f = hdf5.open('dump/InceptionResnetV2/'..name..'.h5', 'r')\n  \n   local conv = module:get(1) -- Spatial Convolution\n   local weights = h5f:read(\"weights\"):all():permute(4, 3, 1, 2)\n   conv.weight:copy(weights)\n\n   local bn = module:get(2) -- Spatial Batch Normalization\n   --local gamma = h5f:read(\"gamma\"):all() \n   bn.weight:copy(torch.ones(bn.weight:size(1))) -- gamma is set to 1\n   local beta = h5f:read(\"beta\"):all()\n   bn.bias:copy(beta)\n   local mean = h5f:read(\"mean\"):all()\n   bn.running_mean:copy(mean)\n   local var = h5f:read(\"var\"):all()\n   bn.running_var:copy(var)\n\n   h5f:close()\nend\n\nlocal function load_conv2d_nobn(module, name)\n   local name = name or 'Conv2d_1a_3x3'\n   local h5f = hdf5.open('dump/InceptionResnetV2/'..name..'.h5', 'r')\n   local conv = module -- Spatial Convolution\n\n   local weights = h5f:read(\"weights\"):all():permute(4, 3, 1, 2)\n   conv.weight:copy(weights)\n\n   local biases = h5f:read(\"biases\"):all()\n   conv.bias:copy(biases)\n\n   
h5f:close()\nend\n\nlocal function load_linear(module, name)\n  local h5f = hdf5.open('dump/InceptionResnetV2/'..name..'.h5', 'r')\n  local weights = h5f:read('weights'):all():t()\n  local biases = h5f:read('biases'):all()\n  module.weight:copy(weights)\n  module.bias:copy(biases)\n  h5f:close()\nend\n\nlocal function load_mixed_5b(module)\n   load_conv2d(module:get(1), 'Mixed_5b/Branch_0/Conv2d_1x1')\n   load_conv2d(module:get(2):get(1), 'Mixed_5b/Branch_1/Conv2d_0a_1x1')\n   load_conv2d(module:get(2):get(2), 'Mixed_5b/Branch_1/Conv2d_0b_5x5')\n   load_conv2d(module:get(3):get(1), 'Mixed_5b/Branch_2/Conv2d_0a_1x1')\n   load_conv2d(module:get(3):get(2), 'Mixed_5b/Branch_2/Conv2d_0b_3x3')\n   load_conv2d(module:get(3):get(3), 'Mixed_5b/Branch_2/Conv2d_0c_3x3')\n   load_conv2d(module:get(4):get(2), 'Mixed_5b/Branch_3/Conv2d_0b_1x1')\nend\n\nlocal function load_block35(module, name)\n   load_conv2d(module:get(1):get(2):get(1):get(1), name..'/Branch_0/Conv2d_1x1')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(2), name..'/Branch_1/Conv2d_0b_3x3')\n   load_conv2d(module:get(1):get(2):get(1):get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n   load_conv2d(module:get(1):get(2):get(1):get(3):get(2), name..'/Branch_2/Conv2d_0b_3x3')\n   load_conv2d(module:get(1):get(2):get(1):get(3):get(3), name..'/Branch_2/Conv2d_0c_3x3')\n   load_conv2d_nobn(module:get(1):get(2):get(2), name..'/Conv2d_1x1')\nend\n\nlocal function load_mixed_6a(module)\n   load_conv2d(module:get(1), 'Mixed_6a/Branch_0/Conv2d_1a_3x3')\n   load_conv2d(module:get(2):get(1), 'Mixed_6a/Branch_1/Conv2d_0a_1x1')\n   load_conv2d(module:get(2):get(2), 'Mixed_6a/Branch_1/Conv2d_0b_3x3')\n   load_conv2d(module:get(2):get(3), 'Mixed_6a/Branch_1/Conv2d_1a_3x3')\nend\n\nlocal function load_block17(module, name)\n   load_conv2d(module:get(1):get(2):get(1):get(1), name..'/Branch_0/Conv2d_1x1')\n   
load_conv2d(module:get(1):get(2):get(1):get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(2), name..'/Branch_1/Conv2d_0b_1x7')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(3), name..'/Branch_1/Conv2d_0c_7x1')\n   load_conv2d_nobn(module:get(1):get(2):get(2), name..'/Conv2d_1x1')\nend\n\nlocal function load_mixed_7a(module)\n   load_conv2d(module:get(1):get(1), 'Mixed_7a/Branch_0/Conv2d_0a_1x1')\n   load_conv2d(module:get(1):get(2), 'Mixed_7a/Branch_0/Conv2d_1a_3x3')\n   load_conv2d(module:get(2):get(1), 'Mixed_7a/Branch_1/Conv2d_0a_1x1')\n   load_conv2d(module:get(2):get(2), 'Mixed_7a/Branch_1/Conv2d_1a_3x3')\n   load_conv2d(module:get(3):get(1), 'Mixed_7a/Branch_2/Conv2d_0a_1x1')\n   load_conv2d(module:get(3):get(2), 'Mixed_7a/Branch_2/Conv2d_0b_3x3')\n   load_conv2d(module:get(3):get(3), 'Mixed_7a/Branch_2/Conv2d_1a_3x3')\nend\n\nlocal function load_block8(module, name)\n   load_conv2d(module:get(1):get(2):get(1):get(1), name..'/Branch_0/Conv2d_1x1')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(2), name..'/Branch_1/Conv2d_0b_1x3')\n   load_conv2d(module:get(1):get(2):get(1):get(2):get(3), name..'/Branch_1/Conv2d_0c_3x1')\n   load_conv2d_nobn(module:get(1):get(2):get(2), name..'/Conv2d_1x1')\nend\n\nlocal function load(net)\n   load_conv2d(net:get(1), 'Conv2d_1a_3x3')\n   load_conv2d(net:get(2), 'Conv2d_2a_3x3')\n   load_conv2d(net:get(3), 'Conv2d_2b_3x3')\n\n   load_conv2d(net:get(5), 'Conv2d_3b_1x1')\n   load_conv2d(net:get(6), 'Conv2d_4a_3x3')\n\n   load_mixed_5b(net:get(8))\n\n   for i=1, 10 do\n      load_block35(net:get(9):get(i), 'Repeat/block35_'..i)\n   end\n\n   load_mixed_6a(net:get(10))\n\n   for i=1, 20 do\n      load_block17(net:get(11):get(i), 'Repeat_1/block17_'..i)\n   end\n\n   load_mixed_7a(net:get(12))\n\n   for i=1, 9 do\n      load_block8(net:get(13):get(i), 
'Repeat_2/block8_'..i)\n   end\n\n   load_block8(net:get(14), 'Block8')\n   load_conv2d(net:get(15), 'Conv2d_7b_1x1')\n   load_linear(net:get(18), 'Logits')\nend\n\n----------\n-- Test --\n----------\n\nlocal function test_conv2d(module, name, opt)\n   local name = name or 'Conv2d_1a_3x3'\n   local h5f = hdf5.open('dump/InceptionResnetV2/'..name..'.h5', 'r')\n\n   local conv_out = h5f:read(\"conv_out\"):all()\n   conv_out = conv_out:transpose(2,4)\n   conv_out = conv_out:transpose(3,4)\n \n   local relu_out = h5f:read(\"relu_out\"):all()\n   relu_out = relu_out:transpose(2,4)\n   relu_out = relu_out:transpose(3,4)\n \n   h5f:close()\n \n   if opt.cuda then\n     conv_out = conv_out:cuda()\n     relu_out = relu_out:cuda()\n   end\n \n   print(name..' conv_out', torch.dist(module:get(1).output, conv_out))\n   print(name..' relu_out', torch.dist(module:get(3).output, relu_out))\n   print('')\nend\n\nlocal function test_conv2d_nobn(module, name, opt)\n   local name = name or 'Conv2d_1a_3x3'\n   local h5f = hdf5.open('dump/InceptionResnetV2/'..name..'.h5', 'r')\n\n   local conv_out = h5f:read(\"conv_out\"):all()\n   conv_out = conv_out:transpose(2,4)\n   conv_out = conv_out:transpose(3,4)\n \n   h5f:close()\n \n   if opt.cuda then\n     conv_out = conv_out:cuda()\n   end\n \n   print(name..' conv_out', torch.dist(module.output, conv_out))\n   print('')\nend\n\nlocal function test_linear(module, name, opt)\n   local h5f = hdf5.open('dump/InceptionResnetV2/'..name..'.h5', 'r')\n   local out = h5f:read(\"out\"):all()\n   h5f:close()\n   local softmax = nn.SoftMax()\n   if opt.cuda then\n      softmax:cuda()\n      out = out:cuda()\n   end\n   local output = softmax:forward(module.output)\n   print(name..' 
linear_out', torch.dist(output, out))\n   print('')\nend\n\nlocal function test_mixed_5b(module, opt)\n   test_conv2d(module:get(1), 'Mixed_5b/Branch_0/Conv2d_1x1', opt)\n   test_conv2d(module:get(2):get(1), 'Mixed_5b/Branch_1/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(2):get(2), 'Mixed_5b/Branch_1/Conv2d_0b_5x5', opt)\n   test_conv2d(module:get(3):get(1), 'Mixed_5b/Branch_2/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(3):get(2), 'Mixed_5b/Branch_2/Conv2d_0b_3x3', opt)\n   test_conv2d(module:get(3):get(3), 'Mixed_5b/Branch_2/Conv2d_0c_3x3', opt)\n   test_conv2d(module:get(4):get(2), 'Mixed_5b/Branch_3/Conv2d_0b_1x1', opt)\nend\n\nlocal function test_block35(module, name, opt)\n   test_conv2d(module:get(1):get(2):get(1):get(1), name..'/Branch_0/Conv2d_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(2), name..'/Branch_1/Conv2d_0b_3x3', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(3):get(2), name..'/Branch_2/Conv2d_0b_3x3', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(3):get(3), name..'/Branch_2/Conv2d_0c_3x3', opt)\n   test_conv2d_nobn(module:get(1):get(2):get(2), name..'/Conv2d_1x1', opt)\nend\n\nlocal function test_mixed_6a(module, opt)\n   test_conv2d(module:get(1), 'Mixed_6a/Branch_0/Conv2d_1a_3x3', opt)\n   test_conv2d(module:get(2):get(1), 'Mixed_6a/Branch_1/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(2):get(2), 'Mixed_6a/Branch_1/Conv2d_0b_3x3', opt)\n   test_conv2d(module:get(2):get(3), 'Mixed_6a/Branch_1/Conv2d_1a_3x3', opt)\nend\n\nlocal function test_block17(module, name, opt)\n   test_conv2d(module:get(1):get(2):get(1):get(1), name..'/Branch_0/Conv2d_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(2), 
name..'/Branch_1/Conv2d_0b_1x7', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(3), name..'/Branch_1/Conv2d_0c_7x1', opt)\n   test_conv2d_nobn(module:get(1):get(2):get(2), name..'/Conv2d_1x1', opt)\nend\n\nlocal function test_mixed_7a(module, opt)\n   test_conv2d(module:get(1):get(1), 'Mixed_7a/Branch_0/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(1):get(2), 'Mixed_7a/Branch_0/Conv2d_1a_3x3', opt)\n   test_conv2d(module:get(2):get(1), 'Mixed_7a/Branch_1/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(2):get(2), 'Mixed_7a/Branch_1/Conv2d_1a_3x3', opt)\n   test_conv2d(module:get(3):get(1), 'Mixed_7a/Branch_2/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(3):get(2), 'Mixed_7a/Branch_2/Conv2d_0b_3x3', opt)\n   test_conv2d(module:get(3):get(3), 'Mixed_7a/Branch_2/Conv2d_1a_3x3', opt)\nend\n\n-- fixed copy-paste bug: this function previously called load_conv2d/load_conv2d_nobn\n-- (reloading weights) instead of test_conv2d/test_conv2d_nobn, and ignored opt\nlocal function test_block8(module, name, opt)\n   test_conv2d(module:get(1):get(2):get(1):get(1), name..'/Branch_0/Conv2d_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(2), name..'/Branch_1/Conv2d_0b_1x3', opt)\n   test_conv2d(module:get(1):get(2):get(1):get(2):get(3), name..'/Branch_1/Conv2d_0c_3x1', opt)\n   test_conv2d_nobn(module:get(1):get(2):get(2), name..'/Conv2d_1x1', opt)\nend\n\nlocal function test(net, opt)\n   net:evaluate()\n   local input = torch.ones(1,299,299,3) -- [0,1]\n   --local img = image.load('lena_299.png') * 255.0 -- [0,255]\n   --input[1] = img:float()\n   input[{1,1,1,1}] = -1\n   input = input:transpose(2,4)\n   input = input:transpose(3,4)\n   local softmax = nn.SoftMax()\n   if opt.cuda then\n      input = input:cuda()\n      softmax:cuda()\n   end\n   local output = net:forward(input)\n\n   test_conv2d(net:get(1), 'Conv2d_1a_3x3', opt)\n   test_conv2d(net:get(2), 'Conv2d_2a_3x3', opt)\n   test_conv2d(net:get(3), 'Conv2d_2b_3x3', opt)\n\n   test_conv2d(net:get(5), 'Conv2d_3b_1x1', opt)\n   test_conv2d(net:get(6), 'Conv2d_4a_3x3', opt)\n\n   
test_mixed_5b(net:get(8), opt)\n\n   for i=1, 10 do\n      test_block35(net:get(9):get(i), 'Repeat/block35_'..i, opt)\n   end\n\n   test_mixed_6a(net:get(10), opt)\n\n   for i=1, 20 do\n      test_block17(net:get(11):get(i), 'Repeat_1/block17_'..i, opt)\n   end\n\n   test_mixed_7a(net:get(12), opt)\n\n   for i=1, 9 do\n      test_block8(net:get(13):get(i), 'Repeat_2/block8_'..i, opt)\n   end\n\n   test_block8(net:get(14), 'Block8', opt)\n   test_conv2d(net:get(15), 'Conv2d_7b_1x1', opt)\n   test_linear(net:get(18), 'Logits', opt)\nend\n\n------------------------------------------------------------------\n-- Main\n------------------------------------------------------------------\n\nlocal function main()\n   local opt = {\n      cuda = true\n   }\n   local net = InceptionResnetV2()\n   print(net)\n   load(net)\n   print('loaded')\n\n   if opt.cuda then\n      require 'cunn'\n      require 'cutorch'\n      --require 'cudnn'\n      net:cuda()\n   end\n\n   test(net, opt)\n\n   os.execute('mkdir -p save')\n   torch.save('save/inceptionresnetv2.t7', net:clearState():float())\nend\n\nmain()"
  },
  {
    "path": "model_zoo/inceptionv4/__init__.py",
    "content": ""
  },
  {
    "path": "model_zoo/inceptionv4/pytorch_load.py",
    "content": "import torch\nimport torch.nn as nn\nimport torch.utils.model_zoo as model_zoo\nimport os\nimport sys\n\nmodel_urls = {\n    'imagenet': 'http://webia.lip6.fr/~cadene/Downloads/inceptionv4-97ef9c30.pth'\n}\n\nclass BasicConv2d(nn.Module):\n\n    def __init__(self, in_planes, out_planes, kernel_size, stride, padding=0):\n        super(BasicConv2d, self).__init__()\n        self.conv = nn.Conv2d(in_planes, out_planes, kernel_size=kernel_size, stride=stride, padding=padding, bias=False) # verify bias false\n        self.bn = nn.BatchNorm2d(out_planes, eps=0.001, momentum=0, affine=True)\n        self.relu = nn.ReLU(inplace=True)\n\n    def forward(self, x):\n        x = self.conv(x)\n        x = self.bn(x)\n        x = self.relu(x)\n        return x\n\nclass Mixed_3a(nn.Module):\n\n    def __init__(self):\n        super(Mixed_3a, self).__init__()\n        self.maxpool = nn.MaxPool2d(3, stride=2)\n        self.conv = BasicConv2d(64, 96, kernel_size=3, stride=2)\n\n    def forward(self, x):\n        x0 = self.maxpool(x)\n        x1 = self.conv(x)\n        out = torch.cat((x0, x1), 1)\n        return out\n\nclass Mixed_4a(nn.Module):\n\n    def __init__(self):\n        super(Mixed_4a, self).__init__()\n\n        self.branch0 = nn.Sequential(\n            BasicConv2d(160, 64, kernel_size=1, stride=1),\n            BasicConv2d(64, 96, kernel_size=3, stride=1)\n        )\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(160, 64, kernel_size=1, stride=1),\n            BasicConv2d(64, 64, kernel_size=(1,7), stride=1, padding=(0,3)),\n            BasicConv2d(64, 64, kernel_size=(7,1), stride=1, padding=(3,0)),\n            BasicConv2d(64, 96, kernel_size=(3,3), stride=1)\n        )\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        out = torch.cat((x0, x1), 1)\n        return out\n\nclass Mixed_5a(nn.Module):\n\n    def __init__(self):\n        super(Mixed_5a, self).__init__()\n        self.conv = 
BasicConv2d(192, 192, kernel_size=3, stride=2)\n        self.maxpool = nn.MaxPool2d(3, stride=2)\n\n    def forward(self, x):\n        x0 = self.conv(x)\n        x1 = self.maxpool(x)\n        out = torch.cat((x0, x1), 1)\n        return out\n\nclass Inception_A(nn.Module):\n\n    def __init__(self):\n        super(Inception_A, self).__init__()\n        self.branch0 = BasicConv2d(384, 96, kernel_size=1, stride=1)\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(384, 64, kernel_size=1, stride=1),\n            BasicConv2d(64, 96, kernel_size=3, stride=1, padding=1)\n        )\n\n        self.branch2 = nn.Sequential(\n            BasicConv2d(384, 64, kernel_size=1, stride=1),\n            BasicConv2d(64, 96, kernel_size=3, stride=1, padding=1),\n            BasicConv2d(96, 96, kernel_size=3, stride=1, padding=1)\n        )\n\n        self.branch3 = nn.Sequential(\n            nn.AvgPool2d(3, stride=1, padding=1, count_include_pad=False),\n            BasicConv2d(384, 96, kernel_size=1, stride=1)\n        )\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        x3 = self.branch3(x)\n        out = torch.cat((x0, x1, x2, x3), 1)\n        return out\n\nclass Reduction_A(nn.Module):\n\n    def __init__(self):\n        super(Reduction_A, self).__init__()\n        self.branch0 = BasicConv2d(384, 384, kernel_size=3, stride=2)\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(384, 192, kernel_size=1, stride=1),\n            BasicConv2d(192, 224, kernel_size=3, stride=1, padding=1),\n            BasicConv2d(224, 256, kernel_size=3, stride=2)\n        )\n        \n        self.branch2 = nn.MaxPool2d(3, stride=2)\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        out = torch.cat((x0, x1, x2), 1)\n        return out\n\nclass Inception_B(nn.Module):\n\n    def __init__(self):\n        super(Inception_B, 
self).__init__()\n        self.branch0 = BasicConv2d(1024, 384, kernel_size=1, stride=1)\n        \n        self.branch1 = nn.Sequential(\n            BasicConv2d(1024, 192, kernel_size=1, stride=1),\n            BasicConv2d(192, 224, kernel_size=(1,7), stride=1, padding=(0,3)),\n            BasicConv2d(224, 256, kernel_size=(7,1), stride=1, padding=(3,0))\n        )\n\n        self.branch2 = nn.Sequential(\n            BasicConv2d(1024, 192, kernel_size=1, stride=1),\n            BasicConv2d(192, 192, kernel_size=(7,1), stride=1, padding=(3,0)),\n            BasicConv2d(192, 224, kernel_size=(1,7), stride=1, padding=(0,3)),\n            BasicConv2d(224, 224, kernel_size=(7,1), stride=1, padding=(3,0)),\n            BasicConv2d(224, 256, kernel_size=(1,7), stride=1, padding=(0,3))\n        )\n\n        self.branch3 = nn.Sequential(\n            nn.AvgPool2d(3, stride=1, padding=1, count_include_pad=False),\n            BasicConv2d(1024, 128, kernel_size=1, stride=1)\n        )\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        x3 = self.branch3(x)\n        out = torch.cat((x0, x1, x2, x3), 1)\n        return out\n\nclass Reduction_B(nn.Module):\n\n    def __init__(self):\n        super(Reduction_B, self).__init__()\n\n        self.branch0 = nn.Sequential(\n            BasicConv2d(1024, 192, kernel_size=1, stride=1),\n            BasicConv2d(192, 192, kernel_size=3, stride=2)\n        )\n\n        self.branch1 = nn.Sequential(\n            BasicConv2d(1024, 256, kernel_size=1, stride=1),\n            BasicConv2d(256, 256, kernel_size=(1,7), stride=1, padding=(0,3)),\n            BasicConv2d(256, 320, kernel_size=(7,1), stride=1, padding=(3,0)),\n            BasicConv2d(320, 320, kernel_size=3, stride=2)\n        )\n\n        self.branch2 = nn.MaxPool2d(3, stride=2)\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        x1 = self.branch1(x)\n        x2 = self.branch2(x)\n        
out = torch.cat((x0, x1, x2), 1)\n        return out\n\nclass Inception_C(nn.Module):\n\n    def __init__(self):\n        super(Inception_C, self).__init__()\n\n        self.branch0 = BasicConv2d(1536, 256, kernel_size=1, stride=1)\n        \n        self.branch1_0 = BasicConv2d(1536, 384, kernel_size=1, stride=1)\n        self.branch1_1a = BasicConv2d(384, 256, kernel_size=(1,3), stride=1, padding=(0,1))\n        self.branch1_1b = BasicConv2d(384, 256, kernel_size=(3,1), stride=1, padding=(1,0))\n        \n        self.branch2_0 = BasicConv2d(1536, 384, kernel_size=1, stride=1)\n        self.branch2_1 = BasicConv2d(384, 448, kernel_size=(3,1), stride=1, padding=(1,0))\n        self.branch2_2 = BasicConv2d(448, 512, kernel_size=(1,3), stride=1, padding=(0,1))\n        self.branch2_3a = BasicConv2d(512, 256, kernel_size=(1,3), stride=1, padding=(0,1))\n        self.branch2_3b = BasicConv2d(512, 256, kernel_size=(3,1), stride=1, padding=(1,0))\n        \n        self.branch3 = nn.Sequential(\n            nn.AvgPool2d(3, stride=1, padding=1, count_include_pad=False),\n            BasicConv2d(1536, 256, kernel_size=1, stride=1)\n        )\n\n    def forward(self, x):\n        x0 = self.branch0(x)\n        \n        x1_0 = self.branch1_0(x)\n        x1_1a = self.branch1_1a(x1_0)\n        x1_1b = self.branch1_1b(x1_0)\n        x1 = torch.cat((x1_1a, x1_1b), 1)\n\n        x2_0 = self.branch2_0(x)\n        x2_1 = self.branch2_1(x2_0)\n        x2_2 = self.branch2_2(x2_1)\n        x2_3a = self.branch2_3a(x2_2)\n        x2_3b = self.branch2_3b(x2_2)\n        x2 = torch.cat((x2_3a, x2_3b), 1)\n\n        x3 = self.branch3(x)\n\n        out = torch.cat((x0, x1, x2, x3), 1)\n        return out\n\nclass InceptionV4(nn.Module):\n\n    def __init__(self, num_classes=1001):\n        super(InceptionV4, self).__init__()\n        self.features = nn.Sequential(\n            BasicConv2d(3, 32, kernel_size=3, stride=2),\n            BasicConv2d(32, 32, kernel_size=3, stride=1),\n           
 BasicConv2d(32, 64, kernel_size=3, stride=1, padding=1),\n            Mixed_3a(),\n            Mixed_4a(),\n            Mixed_5a(),\n            Inception_A(),\n            Inception_A(),\n            Inception_A(),\n            Inception_A(),\n            Reduction_A(), # Mixed_6a\n            Inception_B(),\n            Inception_B(),\n            Inception_B(),\n            Inception_B(),\n            Inception_B(),\n            Inception_B(),\n            Inception_B(),\n            Reduction_B(), # Mixed_7a\n            Inception_C(),\n            Inception_C(),\n            Inception_C(),\n            nn.AvgPool2d(8, count_include_pad=False)\n        )\n        self.classif = nn.Linear(1536, num_classes)\n\n    def forward(self, x):\n        x = self.features(x)\n        x = x.view(x.size(0), -1)\n        x = self.classif(x) \n        return x\n\ndef inceptionv4(pretrained=True):\n    model = InceptionV4()\n    if pretrained:\n        model.load_state_dict(model_zoo.load_url(model_urls['imagenet']))\n    return model\n\n######################################################################\n## Load parameters from HDF5 to Dict\n######################################################################\n\ndef load_conv2d(state_dict, name_pth, name_tf):\n    h5f = h5py.File('dump/InceptionV4/'+name_tf+'.h5', 'r')\n    state_dict[name_pth+'.conv.weight'] = torch.from_numpy(h5f['weights'][()]).permute(3, 2, 0, 1)\n    out_planes = state_dict[name_pth+'.conv.weight'].size(0)\n    state_dict[name_pth+'.bn.weight'] = torch.ones(out_planes)\n    state_dict[name_pth+'.bn.bias'] = torch.from_numpy(h5f['beta'][()])\n    state_dict[name_pth+'.bn.running_mean'] = torch.from_numpy(h5f['mean'][()])\n    state_dict[name_pth+'.bn.running_var'] = torch.from_numpy(h5f['var'][()])\n    h5f.close()\n\ndef load_linear(state_dict, name_pth, name_tf):\n    h5f = h5py.File('dump/InceptionV4/'+name_tf+'.h5', 'r')\n    state_dict[name_pth+'.weight'] = 
torch.from_numpy(h5f['weights'][()]).t()\n    state_dict[name_pth+'.bias'] = torch.from_numpy(h5f['biases'][()])\n    h5f.close()\n\ndef load_mixed_4a_7a(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0.0', name_tf+'/Branch_0/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch0.1', name_tf+'/Branch_0/Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_1x7')\n    load_conv2d(state_dict, name_pth+'.branch1.2', name_tf+'/Branch_1/Conv2d_0c_7x1')\n    load_conv2d(state_dict, name_pth+'.branch1.3', name_tf+'/Branch_1/Conv2d_1a_3x3')\n\ndef load_mixed_5(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.0', name_tf+'/Branch_2/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch2.1', name_tf+'/Branch_2/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth+'.branch2.2', name_tf+'/Branch_2/Conv2d_0c_3x3')\n    load_conv2d(state_dict, name_pth+'.branch3.1', name_tf+'/Branch_3/Conv2d_0b_1x1')\n\ndef load_mixed_6(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1.1', name_tf+'/Branch_1/Conv2d_0b_1x7')\n    load_conv2d(state_dict, name_pth+'.branch1.2', name_tf+'/Branch_1/Conv2d_0c_7x1')\n    load_conv2d(state_dict, name_pth+'.branch2.0', name_tf+'/Branch_2/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch2.1', name_tf+'/Branch_2/Conv2d_0b_7x1')\n    load_conv2d(state_dict, name_pth+'.branch2.2', 
name_tf+'/Branch_2/Conv2d_0c_1x7')\n    load_conv2d(state_dict, name_pth+'.branch2.3', name_tf+'/Branch_2/Conv2d_0d_7x1')\n    load_conv2d(state_dict, name_pth+'.branch2.4', name_tf+'/Branch_2/Conv2d_0e_1x7')\n    load_conv2d(state_dict, name_pth+'.branch3.1', name_tf+'/Branch_3/Conv2d_0b_1x1')\n\ndef load_mixed_7(state_dict, name_pth, name_tf):\n    load_conv2d(state_dict, name_pth+'.branch0', name_tf+'/Branch_0/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1_0', name_tf+'/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch1_1a', name_tf+'/Branch_1/Conv2d_0b_1x3')\n    load_conv2d(state_dict, name_pth+'.branch1_1b', name_tf+'/Branch_1/Conv2d_0c_3x1')\n    load_conv2d(state_dict, name_pth+'.branch2_0', name_tf+'/Branch_2/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth+'.branch2_1', name_tf+'/Branch_2/Conv2d_0b_3x1')\n    load_conv2d(state_dict, name_pth+'.branch2_2', name_tf+'/Branch_2/Conv2d_0c_1x3')\n    load_conv2d(state_dict, name_pth+'.branch2_3a', name_tf+'/Branch_2/Conv2d_0d_1x3')\n    load_conv2d(state_dict, name_pth+'.branch2_3b', name_tf+'/Branch_2/Conv2d_0e_3x1')\n    load_conv2d(state_dict, name_pth+'.branch3.1', name_tf+'/Branch_3/Conv2d_0b_1x1')\n\n\ndef load():\n    state_dict={}\n    \n    load_conv2d(state_dict, name_pth='features.0', name_tf='Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth='features.1', name_tf='Conv2d_2a_3x3')\n    load_conv2d(state_dict, name_pth='features.2', name_tf='Conv2d_2b_3x3')\n    \n    load_conv2d(state_dict, name_pth='features.3.conv', name_tf='Mixed_3a/Branch_1/Conv2d_0a_3x3')\n\n    load_mixed_4a_7a(state_dict, name_pth='features.4', name_tf='Mixed_4a')\n\n    load_conv2d(state_dict, name_pth='features.5.conv', name_tf='Mixed_5a/Branch_0/Conv2d_1a_3x3')\n\n    load_mixed_5(state_dict, name_pth='features.6', name_tf='Mixed_5b')\n    load_mixed_5(state_dict, name_pth='features.7', name_tf='Mixed_5c')\n    load_mixed_5(state_dict, name_pth='features.8', 
name_tf='Mixed_5d')\n    load_mixed_5(state_dict, name_pth='features.9', name_tf='Mixed_5e')\n\n    load_conv2d(state_dict, name_pth='features.10.branch0', name_tf='Mixed_6a/Branch_0/Conv2d_1a_3x3')\n    load_conv2d(state_dict, name_pth='features.10.branch1.0', name_tf='Mixed_6a/Branch_1/Conv2d_0a_1x1')\n    load_conv2d(state_dict, name_pth='features.10.branch1.1', name_tf='Mixed_6a/Branch_1/Conv2d_0b_3x3')\n    load_conv2d(state_dict, name_pth='features.10.branch1.2', name_tf='Mixed_6a/Branch_1/Conv2d_1a_3x3')\n\n    load_mixed_6(state_dict, name_pth='features.11', name_tf='Mixed_6b')\n    load_mixed_6(state_dict, name_pth='features.12', name_tf='Mixed_6c')\n    load_mixed_6(state_dict, name_pth='features.13', name_tf='Mixed_6d')\n    load_mixed_6(state_dict, name_pth='features.14', name_tf='Mixed_6e')\n    load_mixed_6(state_dict, name_pth='features.15', name_tf='Mixed_6f')\n    load_mixed_6(state_dict, name_pth='features.16', name_tf='Mixed_6g')\n    load_mixed_6(state_dict, name_pth='features.17', name_tf='Mixed_6h')\n\n    load_mixed_4a_7a(state_dict, name_pth='features.18', name_tf='Mixed_7a')\n\n    load_mixed_7(state_dict, name_pth='features.19', name_tf='Mixed_7b')\n    load_mixed_7(state_dict, name_pth='features.20', name_tf='Mixed_7c')\n    load_mixed_7(state_dict, name_pth='features.21', name_tf='Mixed_7d')\n\n    load_linear(state_dict, name_pth='classif', name_tf='Logits')\n\n    return state_dict\n\n######################################################################\n## Test\n######################################################################\n\ndef test(model):\n    model.eval()\n    from scipy import misc\n    img = misc.imread('lena_299.png')\n    inputs = torch.zeros(1,299,299,3)\n    inputs[0] = torch.from_numpy(img)\n    inputs.transpose_(1,3)\n    inputs.transpose_(2,3)\n    # 1, 3, 299, 299\n    outputs = model.forward(torch.autograd.Variable(inputs))\n    h5f = h5py.File('dump/InceptionV4/Logits.h5', 'r')\n    outputs_tf = 
torch.from_numpy(h5f['out'][()])\n    h5f.close()\n    outputs = torch.nn.functional.softmax(outputs)\n    print(torch.dist(outputs.data, outputs_tf))\n    return outputs\n \ndef test_conv2d(module, name):\n    #global output_tf\n    h5f = h5py.File('dump/InceptionV4/'+name+'.h5', 'r')\n    output_tf = torch.from_numpy(h5f['relu_out'][()])\n    output_tf.transpose_(1,3)\n    output_tf.transpose_(2,3)\n    h5f.close()\n    def test_dist(self, input, output):\n        print(name, torch.dist(output.data, output_tf))\n    module.register_forward_hook(test_dist)\n\ndef test_mixed_4a_7a(module, name):\n    test_conv2d(module.branch0[0], name+'/Branch_0/Conv2d_0a_1x1')\n    test_conv2d(module.branch0[1], name+'/Branch_0/Conv2d_1a_3x3')\n    test_conv2d(module.branch1[0], name+'/Branch_1/Conv2d_0a_1x1')\n    test_conv2d(module.branch1[1], name+'/Branch_1/Conv2d_0b_1x7')\n    test_conv2d(module.branch1[2], name+'/Branch_1/Conv2d_0c_7x1')\n    test_conv2d(module.branch1[3], name+'/Branch_1/Conv2d_1a_3x3')\n\n######################################################################\n## Main\n######################################################################\n\nif __name__ == \"__main__\":\n\n    import h5py\n\n    model = InceptionV4()\n    state_dict = load()\n    model.load_state_dict(state_dict)\n\n    # test_conv2d(model.features[0], 'Conv2d_1a_3x3')\n    # test_conv2d(model.features[1], 'Conv2d_2a_3x3')\n    # test_conv2d(model.features[2], 'Conv2d_2b_3x3')\n    # test_conv2d(model.features[3].conv, 'Mixed_3a/Branch_1/Conv2d_0a_3x3')\n    # test_mixed_4a_7a(model.features[4], 'Mixed_4a')\n    \n    os.system('mkdir -p save')\n    torch.save(model, 'save/inceptionv4.pth')\n    torch.save(state_dict, 'save/inceptionv4_state.pth')\n\n    outputs = test(model)\n\n\n"
  },
  {
    "path": "model_zoo/inceptionv4/tensorflow_dump.py",
    "content": "# python3\n\n# TensorBoard\n# python3 ~/.local/lib/python3.5/site-packages/tensorflow/tensorboard/tensorboard.py --logdir=logs --port=6007\n\nimport os\nimport sys\nimport h5py\nimport math\nimport urllib.request\nimport numpy as np\nimport tensorflow as tf\n\nsys.path.append('models/slim')\nfrom datasets import dataset_utils\nfrom datasets import imagenet\nfrom nets import inception\nfrom preprocessing import inception_preprocessing\n\nslim = tf.contrib.slim\n\nimage_size = inception.inception_v3.default_image_size\n\nurl = 'http://download.tensorflow.org/models/inception_v4_2016_09_09.tar.gz'\ncheckpoints_dir = '/tmp/checkpoints/'\n\ndef make_padding(padding_name, conv_shape):\n  padding_name = padding_name.decode(\"utf-8\")\n  if padding_name == \"VALID\":\n    return [0, 0]\n  elif padding_name == \"SAME\":\n    #return [math.ceil(int(conv_shape[0])/2), math.ceil(int(conv_shape[1])/2)]\n    return [math.floor(int(conv_shape[0])/2), math.floor(int(conv_shape[1])/2)]\n  else:\n    sys.exit('Invalid padding name '+padding_name)\n\ndef dump_conv2d(name='Conv2d_1a_3x3'):\n  \n  conv_operation = sess.graph.get_operation_by_name('InceptionV4/InceptionV4/'+name+'/Conv2D') # remplacer convolution par Conv2D si erreur\n\n  weights_tensor = sess.graph.get_tensor_by_name('InceptionV4/'+name+'/weights:0')\n  weights = weights_tensor.eval()\n\n  padding = make_padding(conv_operation.get_attr('padding'), weights_tensor.get_shape())\n  strides = conv_operation.get_attr('strides')\n\n  conv_out = sess.graph.get_operation_by_name('InceptionV4/InceptionV4/'+name+'/Conv2D').outputs[0].eval() # remplacer convolution par Conv2D si erreur\n  \n  beta = sess.graph.get_tensor_by_name('InceptionV4/'+name+'/BatchNorm/beta:0').eval()\n  #gamma = sess.graph.get_tensor_by_name('InceptionV4/'+name+'/BatchNorm/gamma:0').eval()\n  mean = sess.graph.get_tensor_by_name('InceptionV4/'+name+'/BatchNorm/moving_mean:0').eval()\n  var = 
sess.graph.get_tensor_by_name('InceptionV4/'+name+'/BatchNorm/moving_variance:0').eval()\n  \n  relu_out = sess.graph.get_operation_by_name('InceptionV4/InceptionV4/'+name+'/Relu').outputs[0].eval()\n\n  os.system('mkdir -p dump/InceptionV4/'+name)\n  h5f = h5py.File('dump/InceptionV4/'+name+'.h5', 'w')\n  # conv\n  h5f.create_dataset(\"weights\", data=weights)\n  h5f.create_dataset(\"strides\", data=strides)\n  h5f.create_dataset(\"padding\", data=padding)\n  h5f.create_dataset(\"conv_out\", data=conv_out)\n  # batch norm\n  h5f.create_dataset(\"beta\", data=beta)\n  #h5f.create_dataset(\"gamma\", data=gamma)\n  h5f.create_dataset(\"mean\", data=mean)\n  h5f.create_dataset(\"var\", data=var)\n  h5f.create_dataset(\"relu_out\", data=relu_out)\n  h5f.close()\n\ndef dump_logits():\n  operation = sess.graph.get_operation_by_name('InceptionV4/Logits/Predictions')\n\n  weights_tensor = sess.graph.get_tensor_by_name('InceptionV4/Logits/Logits/weights:0')\n  weights = weights_tensor.eval()\n\n  biases_tensor = sess.graph.get_tensor_by_name('InceptionV4/Logits/Logits/biases:0')\n  biases = biases_tensor.eval()\n\n  out = operation.outputs[0].eval()\n\n  h5f = h5py.File('dump/InceptionV4/Logits.h5', 'w')\n  # conv\n  h5f.create_dataset(\"weights\", data=weights)\n  h5f.create_dataset(\"biases\", data=biases)\n  h5f.create_dataset(\"out\", data=out)\n  h5f.close()\n\n\n# def dump_avgpool(name='Mixed_5b/Branch_3/AvgPool_0a_3x3'):\n#   operation = sess.graph.get_operation_by_name('InceptionV4/InceptionV4/'+name+'/AvgPool')\n#   out = operation.outputs[0].eval()\n#   os.system('mkdir -p dump/InceptionV4/'+name)\n#   h5f = h5py.File('dump/InceptionV4/'+name+'.h5', 'w')\n#   h5f.create_dataset(\"out\", data=out)\n#   h5f.close()\n\n# def dump_concats():\n#   operation1 = sess.graph.get_operation_by_name('InceptionV4/InceptionV4/Mixed_7b/Branch_1/concat')\n#   operation2 = sess.graph.get_operation_by_name('InceptionV4/InceptionV4/Mixed_7b/Branch_2/concat')\n#   out1 = 
operation1.outputs[0].eval()\n#   out2 = operation2.outputs[0].eval()\n#   os.system('mkdir -p dump/InceptionV4/Mixed_7b/Branch_1/concat')\n#   h5f = h5py.File('dump/InceptionV4/Mixed_7b/Branch_1/concat.h5', 'w')\n#   h5f.create_dataset(\"out\", data=out1)\n#   h5f.close()\n#   os.system('mkdir -p dump/InceptionV4/Mixed_7b/Branch_2/concat')\n#   h5f = h5py.File('dump/InceptionV4/Mixed_7b/Branch_2/concat.h5', 'w')\n#   h5f.create_dataset(\"out\", data=out2)\n#   h5f.close()\n\ndef dump_mixed_4a_7a(name='Mixed_4a'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_0/Conv2d_1a_3x3')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_1x7')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0c_7x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_1a_3x3')\n\ndef dump_mixed_5(name='Mixed_5b'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0b_3x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0c_3x3')\n  dump_conv2d(name=name+'/Branch_3/Conv2d_0b_1x1')\n\ndef dump_mixed_6(name='Mixed_6b'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_1x7')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0c_7x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0b_7x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0c_1x7')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0d_7x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0e_1x7')\n  dump_conv2d(name=name+'/Branch_3/Conv2d_0b_1x1')\n\ndef dump_mixed_7(name='Mixed_7b'):\n  dump_conv2d(name=name+'/Branch_0/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_1/Conv2d_0b_1x3')\n  
dump_conv2d(name=name+'/Branch_1/Conv2d_0c_3x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0a_1x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0b_3x1')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0c_1x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0d_1x3')\n  dump_conv2d(name=name+'/Branch_2/Conv2d_0e_3x1')\n  dump_conv2d(name=name+'/Branch_3/Conv2d_0b_1x1')\n\n\nif not tf.gfile.Exists(checkpoints_dir+'inception_v4.ckpt'):\n  tf.gfile.MakeDirs(checkpoints_dir)\n  dataset_utils.download_and_uncompress_tarball(url, checkpoints_dir)\n\nwith tf.Graph().as_default():\n\n  # Create model architecture\n\n  from scipy import misc\n  img = misc.imread('lena_299.png')\n  print(img.shape)\n\n  inputs = np.zeros((1,299,299,3), dtype=np.float32)\n\n  inputs[0] = img\n  inputs = tf.pack(inputs)\n\n  with slim.arg_scope(inception.inception_v4_arg_scope()):\n    logits, _ = inception.inception_v4(inputs, num_classes=1001, is_training=False)\n\n  with tf.Session() as sess:\n\n    # Initialize model\n    init_fn = slim.assign_from_checkpoint_fn(\n    os.path.join(checkpoints_dir, 'inception_v4.ckpt'),\n    slim.get_model_variables('InceptionV4'))  \n\n    init_fn(sess)\n\n    # Display model variables\n    for v in slim.get_model_variables():\n      print('name = {}, shape = {}'.format(v.name, v.get_shape()))\n\n    # Create graph\n    os.system(\"rm -rf logs\")\n    os.system(\"mkdir -p logs\")\n\n    tf.scalar_summary('logs', logits[0][0])\n    summary_op = tf.merge_all_summaries()\n    summary_writer = tf.train.SummaryWriter(\"logs\", sess.graph)\n\n    out = sess.run(summary_op)\n    summary_writer.add_summary(out, 0)\n\n    # Stem\n    dump_conv2d(name='Conv2d_1a_3x3')\n    dump_conv2d(name='Conv2d_2a_3x3')\n    dump_conv2d(name='Conv2d_2b_3x3')\n\n    dump_conv2d(name='Mixed_3a/Branch_1/Conv2d_0a_3x3')\n    dump_mixed_4a_7a(name='Mixed_4a')\n    dump_conv2d(name='Mixed_5a/Branch_0/Conv2d_1a_3x3')\n\n    # Inception A\n    dump_mixed_5(name='Mixed_5b')\n    
dump_mixed_5(name='Mixed_5c')\n    dump_mixed_5(name='Mixed_5d')\n    dump_mixed_5(name='Mixed_5e')\n\n    # Reduction A\n    dump_conv2d(name='Mixed_6a/Branch_0/Conv2d_1a_3x3')\n    dump_conv2d(name='Mixed_6a/Branch_1/Conv2d_0a_1x1')\n    dump_conv2d(name='Mixed_6a/Branch_1/Conv2d_0b_3x3')\n    dump_conv2d(name='Mixed_6a/Branch_1/Conv2d_1a_3x3')\n\n    # Inception B\n    dump_mixed_6(name='Mixed_6b')\n    dump_mixed_6(name='Mixed_6c')\n    dump_mixed_6(name='Mixed_6d')\n    dump_mixed_6(name='Mixed_6e')\n    dump_mixed_6(name='Mixed_6f')\n    dump_mixed_6(name='Mixed_6g')\n    dump_mixed_6(name='Mixed_6h')\n\n    # Reduction B\n    dump_mixed_4a_7a(name='Mixed_7a')\n\n    # Inception C\n    dump_mixed_7(name='Mixed_7b')\n    dump_mixed_7(name='Mixed_7c')\n    dump_mixed_7(name='Mixed_7d')\n\n    dump_logits()\n\n\n    # #dump_concats()\n    # #dump_avgpool(name='Mixed_5b/Branch_3/AvgPool_0a_3x3')"
  },
  {
    "path": "model_zoo/inceptionv4/torch_load.lua",
    "content": "require 'nn'\nlocal hdf5 = require 'hdf5'\ntorch.setdefaulttensortype('torch.FloatTensor')\nrequire 'image'\n\nlocal function SpatialConvolution(nInputPlane, nOutputPlane, kW, kH, dW, dH, padW, padH)\n  local std_epsilon = 0.001\n  local m = nn.Sequential()\n  m:add(nn.SpatialConvolution(nInputPlane, nOutputPlane, kW, kH, dW, dH, padW, padH))\n  m:add(nn.SpatialBatchNormalization(nOutputPlane, std_epsilon, nil, true))\n  m:add(nn.ReLU())\n  return m\nend\n\nlocal function Tower(layers)\n  local tower = nn.Sequential()\n  for i=1,#layers do\n    tower:add(layers[i])\n  end\n  return tower\nend\n\nlocal function FilterConcat(towers)\n  local concat = nn.DepthConcat(2)\n  for i=1,#towers do\n    concat:add(towers[i])\n  end\n  return concat\nend\n\nlocal function Stem()\n  local stem = nn.Sequential()\n  stem:add(SpatialConvolution(3, 32, 3, 3, 2, 2)) -- 32x149x149\n  stem:add(SpatialConvolution(32, 32, 3, 3, 1, 1)) -- 32x147x147\n  stem:add(SpatialConvolution(32, 64, 3, 3, 1, 1, 1, 1)) -- 64x147x147\n  stem:add(FilterConcat(\n    {\n      nn.SpatialMaxPooling(3, 3, 2, 2), -- 64x73x73\n      SpatialConvolution(64, 96, 3, 3, 2, 2) -- 96x73x73\n    }\n  )) -- 160x73x73\n  stem:add(FilterConcat(\n    {\n      Tower(\n        {\n          SpatialConvolution(160, 64, 1, 1, 1, 1), -- 64x73x73\n          SpatialConvolution(64, 96, 3, 3, 1, 1) -- 96x71x71\n        }\n      ),\n      Tower(\n        {\n          SpatialConvolution(160, 64, 1, 1, 1, 1), -- 64x73x73\n          SpatialConvolution(64, 64, 7, 1, 1, 1, 3, 0), -- 64x73x73\n          SpatialConvolution(64, 64, 1, 7, 1, 1, 0, 3), -- 64x73x73\n          SpatialConvolution(64, 96, 3, 3, 1, 1) -- 96x71x71\n        }\n      )\n    }\n  )) -- 192x71x71\n  stem:add(FilterConcat(\n    {\n      SpatialConvolution(192, 192, 3, 3, 2, 2), -- 192x35x35\n      nn.SpatialMaxPooling(3, 3, 2, 2) -- 192x35x35\n    }\n  )) -- 384x35x35\n  return stem\nend\n\nlocal function Inception_A()\n  local avgpool = 
nn.SpatialAveragePooling(3, 3, 1, 1, 1, 1)\n  avgpool.count_include_pad = false\n  local inception = FilterConcat(\n    {\n      SpatialConvolution(384, 96, 1, 1, 1, 1), -- 96x35x35\n      Tower(\n        {\n          SpatialConvolution(384, 64, 1, 1, 1, 1), -- 64x35x35\n          SpatialConvolution(64, 96, 3, 3, 1, 1, 1, 1) -- 96x35x35\n        }\n      ),\n      Tower(\n        {\n          SpatialConvolution(384, 64, 1, 1, 1, 1), -- 64x35x35\n          SpatialConvolution(64, 96, 3, 3, 1, 1, 1, 1), -- 96x35x35\n          SpatialConvolution(96, 96, 3, 3, 1, 1, 1, 1), -- 96x35x35\n        }\n      ),\n      Tower(\n        {\n          avgpool, -- 384x35x35\n          SpatialConvolution(384, 96, 1, 1, 1, 1) -- 96x35x35\n        }\n      )\n    }\n  ) -- 384x35x35\n  -- 384 ifms / ofms\n  return inception\nend\n\nlocal function Reduction_A()\n  local inception = FilterConcat(\n    {\n      SpatialConvolution(384, 384, 3, 3, 2, 2), -- 384x17x17\n      Tower(\n        {\n          SpatialConvolution(384, 192, 1, 1, 1, 1), -- 192x35x35\n          SpatialConvolution(192, 224, 3, 3, 1, 1, 1, 1), -- 224x35x35\n          SpatialConvolution(224, 256, 3, 3, 2, 2), -- 256x17x17\n        }\n      ),\n      nn.SpatialMaxPooling(3, 3, 2, 2) -- 384x17x17\n    }\n  ) -- 1024x17x17\n  -- 384 ifms, 1024 ofms\n  return inception\nend\n\nlocal function Inception_B()\n  local avgpool = nn.SpatialAveragePooling(3, 3, 1, 1, 1, 1)\n  avgpool.count_include_pad = false\n  local inception = FilterConcat(\n    {\n      SpatialConvolution(1024, 384, 1, 1, 1, 1), -- 384x17x17\n      Tower(\n        {\n          SpatialConvolution(1024, 192, 1, 1, 1, 1), -- 192x17x17\n          SpatialConvolution(192, 224, 7, 1, 1, 1, 3, 0), -- 224x17x17\n          SpatialConvolution(224, 256, 1, 7, 1, 1, 0, 3) -- 256x17x17\n        }\n      ),\n      Tower(\n        {\n          SpatialConvolution(1024, 192, 1, 1, 1, 1), -- 192x17x17\n          SpatialConvolution(192, 192, 1, 7, 1, 1, 0, 3), -- 192x17x17\n      
    SpatialConvolution(192, 224, 7, 1, 1, 1, 3, 0), -- 224x17x17\n          SpatialConvolution(224, 224, 1, 7, 1, 1, 0, 3), -- 224x17x17\n          SpatialConvolution(224, 256, 7, 1, 1, 1, 3, 0), -- 256x17x17\n        }\n      ),\n      Tower(\n        {\n          avgpool, -- 1024x17x17\n          SpatialConvolution(1024, 128, 1, 1, 1, 1) -- 128x17x17\n        }\n      )\n    }\n  ) -- 1024x17x17\n  -- 1024 ifms / ofms\n  return inception\nend\n\nlocal function Reduction_B()\n  local inception = FilterConcat(\n    {\n      Tower(\n        {\n          SpatialConvolution(1024, 192, 1, 1, 1, 1), -- 192x17x17\n          SpatialConvolution(192, 192, 3, 3, 2, 2) -- 192x8x8\n        }\n      ),\n      Tower(\n        {\n          SpatialConvolution(1024, 256, 1, 1, 1, 1), -- 256x17x17\n          SpatialConvolution(256, 256, 7, 1, 1, 1, 3, 0), -- 256x17x17\n          SpatialConvolution(256, 320, 1, 7, 1, 1, 0, 3), -- 320x17x17\n          SpatialConvolution(320, 320, 3, 3, 2, 2) -- 320x8x8\n        }\n      ),\n      nn.SpatialMaxPooling(3, 3, 2, 2) -- 1024x8x8\n    }\n  ) -- 1536x8x8\n  -- 1024 ifms, 1536 ofms\n  return inception\nend\n\nlocal function Inception_C()\n  local avgpool = nn.SpatialAveragePooling(3, 3, 1, 1, 1, 1)\n  avgpool.count_include_pad = false\n  local inception = FilterConcat(\n    {\n      SpatialConvolution(1536, 256, 1, 1, 1, 1), -- 256x8x8\n      Tower(\n        {\n          SpatialConvolution(1536, 384, 1, 1, 1, 1), -- 384x8x8\n          FilterConcat(\n            {\n              SpatialConvolution(384, 256, 3, 1, 1, 1, 1, 0), -- 256x8x8\n              SpatialConvolution(384, 256, 1, 3, 1, 1, 0, 1) -- 256x8x8\n            }\n          ) -- 512x8x8\n        }\n      ),\n      Tower(\n        {\n          SpatialConvolution(1536, 384, 1, 1, 1, 1), -- 384x8x8\n          SpatialConvolution(384, 448, 1, 3, 1, 1, 0, 1), -- 448x8x8\n          SpatialConvolution(448, 512, 3, 1, 1, 1, 1, 0), -- 512x8x8\n          FilterConcat(\n            {\n           
   SpatialConvolution(512, 256, 3, 1, 1, 1, 1, 0), -- 256x8x8\n              SpatialConvolution(512, 256, 1, 3, 1, 1, 0, 1) -- 256x8x8\n            }\n          ) -- 512x8x8\n        }\n      ),\n      Tower(\n        {\n          avgpool, -- 1536x8x8\n          SpatialConvolution(1536, 256, 1, 1, 1, 1) -- 256x8x8\n        }\n      )\n    }\n  ) -- 1536x8x8\n  -- 1536 ifms / ofms\n  return inception\nend\n\n----------------\n-- Load --\n----------------\n\nlocal function load_conv2d(module, name)\n  local name = name or 'Conv2d_1a_3x3'\n  local h5f = hdf5.open('dump/InceptionV4/'..name..'.h5', 'r')\n  \n  local conv = module:get(1) -- Spatial Convolution\n  local weights = h5f:read(\"weights\"):all():permute(4, 3, 1, 2)\n  conv.weight:copy(weights)\n  conv:noBias()\n\n  local bn = module:get(2) -- Spatial Batch Normalization\n  --local gamma = h5f:read(\"gamma\"):all() \n  bn.weight:copy(torch.ones(bn.weight:size(1))) -- gamma is set to 1\n  local beta = h5f:read(\"beta\"):all()\n  bn.bias:copy(beta)\n  local mean = h5f:read(\"mean\"):all()\n  bn.running_mean:copy(mean)\n  local var = h5f:read(\"var\"):all()\n  bn.running_var:copy(var)\n\n  h5f:close()\nend\n\nlocal function load_linear(module, name)\n  print(module.weight:size())\n  local h5f = hdf5.open('dump/InceptionV4/'..name..'.h5', 'r')\n  local weights = h5f:read('weights'):all():t()\n  local biases = h5f:read('biases'):all()\n  module.weight:copy(weights)\n  module.bias:copy(biases)\n\n  print(weights:size())\n  print(biases:size())\n  print(module.bias:size())\n\n  h5f:close()\nend\n\nlocal function load_mixed_4a_7a(module, name)\n  load_conv2d(module:get(1):get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  load_conv2d(module:get(1):get(2), name..'/Branch_0/Conv2d_1a_3x3')\n  load_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(2), name..'/Branch_1/Conv2d_0b_1x7')\n  load_conv2d(module:get(2):get(3), name..'/Branch_1/Conv2d_0c_7x1')\n  
load_conv2d(module:get(2):get(4), name..'/Branch_1/Conv2d_1a_3x3')\nend\n\nlocal function load_mixed_5(module, name)\n  local name = name or 'Mixed_5b'\n  load_conv2d(module:get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(2), name..'/Branch_1/Conv2d_0b_3x3')\n  load_conv2d(module:get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n  load_conv2d(module:get(3):get(2), name..'/Branch_2/Conv2d_0b_3x3')\n  load_conv2d(module:get(3):get(3), name..'/Branch_2/Conv2d_0c_3x3')\n  load_conv2d(module:get(4):get(2), name..'/Branch_3/Conv2d_0b_1x1') -- pb\nend\n\nlocal function load_mixed_6(module, name)\n  local name = name or 'Mixed_6b'\n  load_conv2d(module:get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(2), name..'/Branch_1/Conv2d_0b_1x7')\n  load_conv2d(module:get(2):get(3), name..'/Branch_1/Conv2d_0c_7x1')\n  load_conv2d(module:get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n  load_conv2d(module:get(3):get(2), name..'/Branch_2/Conv2d_0b_7x1')\n  load_conv2d(module:get(3):get(3), name..'/Branch_2/Conv2d_0c_1x7')\n  load_conv2d(module:get(3):get(4), name..'/Branch_2/Conv2d_0d_7x1')\n  load_conv2d(module:get(3):get(5), name..'/Branch_2/Conv2d_0e_1x7')\n  load_conv2d(module:get(4):get(2), name..'/Branch_3/Conv2d_0b_1x1')\nend\n\nlocal function load_mixed_7(module, name)\n  local name = name or 'Mixed_7b'\n  load_conv2d(module:get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  load_conv2d(module:get(2):get(2):get(1), name..'/Branch_1/Conv2d_0b_1x3') -- Beware if inverse ??? 
TODO\n  load_conv2d(module:get(2):get(2):get(2), name..'/Branch_1/Conv2d_0c_3x1')\n  load_conv2d(module:get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n  load_conv2d(module:get(3):get(2), name..'/Branch_2/Conv2d_0b_3x1')\n  load_conv2d(module:get(3):get(3), name..'/Branch_2/Conv2d_0c_1x3')\n  load_conv2d(module:get(3):get(4):get(1), name..'/Branch_2/Conv2d_0d_1x3') -- Beware\n  load_conv2d(module:get(3):get(4):get(2), name..'/Branch_2/Conv2d_0e_3x1')\n  load_conv2d(module:get(4):get(2), name..'/Branch_3/Conv2d_0b_1x1')\nend\n\nlocal function load(net)\n  load_conv2d(net:get(1):get(1), 'Conv2d_1a_3x3')\n  load_conv2d(net:get(1):get(2), 'Conv2d_2a_3x3')\n  load_conv2d(net:get(1):get(3), 'Conv2d_2b_3x3')\n  load_conv2d(net:get(1):get(4):get(2) ,'Mixed_3a/Branch_1/Conv2d_0a_3x3')\n  \n  load_mixed_4a_7a(net:get(1):get(5), 'Mixed_4a')\n  \n  load_conv2d(net:get(1):get(6):get(1) ,'Mixed_5a/Branch_0/Conv2d_1a_3x3')\n\n  load_mixed_5(net:get(2), 'Mixed_5b') -- pb\n  load_mixed_5(net:get(3), 'Mixed_5c')\n  load_mixed_5(net:get(4), 'Mixed_5d')\n  load_mixed_5(net:get(5), 'Mixed_5e')\n\n  load_conv2d(net:get(6):get(1) ,'Mixed_6a/Branch_0/Conv2d_1a_3x3')\n  load_conv2d(net:get(6):get(2):get(1) ,'Mixed_6a/Branch_1/Conv2d_0a_1x1')\n  load_conv2d(net:get(6):get(2):get(2) ,'Mixed_6a/Branch_1/Conv2d_0b_3x3')\n  load_conv2d(net:get(6):get(2):get(3) ,'Mixed_6a/Branch_1/Conv2d_1a_3x3')\n\n  load_mixed_6(net:get(7), 'Mixed_6b')\n  load_mixed_6(net:get(8), 'Mixed_6c')\n  load_mixed_6(net:get(9), 'Mixed_6d')\n  load_mixed_6(net:get(10), 'Mixed_6e')\n  load_mixed_6(net:get(11), 'Mixed_6f')\n  load_mixed_6(net:get(12), 'Mixed_6g')\n  load_mixed_6(net:get(13), 'Mixed_6h')\n\n  load_mixed_4a_7a(net:get(14), 'Mixed_7a')\n\n  load_mixed_7(net:get(15), 'Mixed_7b')\n  load_mixed_7(net:get(16), 'Mixed_7c')\n  load_mixed_7(net:get(17), 'Mixed_7d')\n\n  load_linear(net:get(20), 'Logits')\nend\n\n----------\n-- Test --\n----------\n\nlocal function test_conv2d(module, name)\n  local name = name or 
'Conv2d_1a_3x3'\n  local h5f = hdf5.open('dump/InceptionV4/'..name..'.h5', 'r')\n\n  local conv_out = h5f:read(\"conv_out\"):all()\n  conv_out = conv_out:transpose(2,4)\n  conv_out = conv_out:transpose(3,4)\n\n  local relu_out = h5f:read(\"relu_out\"):all()\n  relu_out = relu_out:transpose(2,4)\n  relu_out = relu_out:transpose(3,4)\n\n  h5f:close()\n\n  if opt.cuda then\n    conv_out = conv_out:cuda()\n    relu_out = relu_out:cuda()\n  end\n\n  print(name..' conv_out', torch.dist(module:get(1).output, conv_out))\n  print(name..' relu_out', torch.dist(module:get(3).output, relu_out))\n  print('')\nend\n\nlocal function test_mixed_4a_7a(module, name)\n  test_conv2d(module:get(1):get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  test_conv2d(module:get(1):get(2), name..'/Branch_0/Conv2d_1a_3x3')\n  test_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(2), name..'/Branch_1/Conv2d_0b_1x7')\n  test_conv2d(module:get(2):get(3), name..'/Branch_1/Conv2d_0c_7x1')\n  test_conv2d(module:get(2):get(4), name..'/Branch_1/Conv2d_1a_3x3')\nend\n\nlocal function test_mixed_5(module, name)\n  local name = name or 'Mixed_5b'\n  test_conv2d(module:get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(2), name..'/Branch_1/Conv2d_0b_3x3')\n  test_conv2d(module:get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n  test_conv2d(module:get(3):get(2), name..'/Branch_2/Conv2d_0b_3x3')\n  test_conv2d(module:get(3):get(3), name..'/Branch_2/Conv2d_0c_3x3')\n  test_conv2d(module:get(4):get(2), name..'/Branch_3/Conv2d_0b_1x1')\nend\n\nlocal function test_mixed_6(module, name)\n  local name = name or 'Mixed_6b'\n  test_conv2d(module:get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(2), name..'/Branch_1/Conv2d_0b_1x7')\n  test_conv2d(module:get(2):get(3), 
name..'/Branch_1/Conv2d_0c_7x1')\n  test_conv2d(module:get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n  test_conv2d(module:get(3):get(2), name..'/Branch_2/Conv2d_0b_7x1')\n  test_conv2d(module:get(3):get(3), name..'/Branch_2/Conv2d_0c_1x7')\n  test_conv2d(module:get(3):get(4), name..'/Branch_2/Conv2d_0d_7x1')\n  test_conv2d(module:get(3):get(5), name..'/Branch_2/Conv2d_0e_1x7')\n  test_conv2d(module:get(4):get(2), name..'/Branch_3/Conv2d_0b_1x1')\nend\n\nlocal function test_mixed_7(module, name)\n  local name = name or 'Mixed_7b'\n  test_conv2d(module:get(1), name..'/Branch_0/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(1), name..'/Branch_1/Conv2d_0a_1x1')\n  test_conv2d(module:get(2):get(2):get(1), name..'/Branch_1/Conv2d_0b_1x3') -- Beware if inverse ??? TODO\n  test_conv2d(module:get(2):get(2):get(2), name..'/Branch_1/Conv2d_0c_3x1')\n  test_conv2d(module:get(3):get(1), name..'/Branch_2/Conv2d_0a_1x1')\n  test_conv2d(module:get(3):get(2), name..'/Branch_2/Conv2d_0b_3x1')\n  test_conv2d(module:get(3):get(3), name..'/Branch_2/Conv2d_0c_1x3')\n  test_conv2d(module:get(3):get(4):get(1), name..'/Branch_2/Conv2d_0d_1x3') -- Beware\n  test_conv2d(module:get(3):get(4):get(2), name..'/Branch_2/Conv2d_0e_3x1')\n  test_conv2d(module:get(4):get(2), name..'/Branch_3/Conv2d_0b_1x1')\nend\n\nlocal function test_module(module, name)\n  local name = name or 'Mixed_5b/Branch_3/AvgPool_0a_3x3'\n  local h5f = hdf5.open('dump/InceptionV4/'..name..'.h5', 'r')\n  local out = h5f:read(\"out\"):all()\n  out = out:transpose(2,4)\n  out = out:transpose(3,4)\n  h5f:close()\n  if opt.cuda then\n    out = out:cuda()\n  end\n  print(name..' test_out', torch.dist(module.output, out))\n  print('')\nend\n\nlocal function test_linear(module, name)\n  local h5f = hdf5.open('dump/InceptionV4/'..name..'.h5', 'r')\n  local out = h5f:read(\"out\"):all()\n  h5f:close()\n  if opt.cuda then\n    out = out:cuda()\n  end\n  print(name..' 
linear_out', torch.dist(module.output, out))\n  print('')\nend\n\nlocal function test(net)\n  net:evaluate()\n  local input = torch.zeros(1,3,299,299) -- [0,1]\n  local img = image.load('lena_299.png') * 255.0 -- [0,255]\n  input[1] = img:float()\n  if opt.cuda then\n    input = input:cuda()\n  end\n  local output = net:forward(input)\n\n  test_conv2d(net:get(1):get(1), 'Conv2d_1a_3x3')\n  test_conv2d(net:get(1):get(2), 'Conv2d_2a_3x3')\n  test_conv2d(net:get(1):get(3), 'Conv2d_2b_3x3')\n  test_conv2d(net:get(1):get(4):get(2) ,'Mixed_3a/Branch_1/Conv2d_0a_3x3')\n\n  test_mixed_4a_7a(net:get(1):get(5), 'Mixed_4a')\n\n  test_conv2d(net:get(1):get(6):get(1) ,'Mixed_5a/Branch_0/Conv2d_1a_3x3')\n\n  -- test_module(net:get(2):get(4):get(1), 'Mixed_5b/Branch_3/AvgPool_0a_3x3')\n  test_mixed_5(net:get(2), 'Mixed_5b')\n  test_mixed_5(net:get(3), 'Mixed_5c')\n  test_mixed_5(net:get(4), 'Mixed_5d')\n  test_mixed_5(net:get(5), 'Mixed_5e')\n\n  test_conv2d(net:get(6):get(1) ,'Mixed_6a/Branch_0/Conv2d_1a_3x3')\n  test_conv2d(net:get(6):get(2):get(1) ,'Mixed_6a/Branch_1/Conv2d_0a_1x1')\n  test_conv2d(net:get(6):get(2):get(2) ,'Mixed_6a/Branch_1/Conv2d_0b_3x3')\n  test_conv2d(net:get(6):get(2):get(3) ,'Mixed_6a/Branch_1/Conv2d_1a_3x3')\n\n  test_mixed_6(net:get(7), 'Mixed_6b')\n  test_mixed_6(net:get(8), 'Mixed_6c')\n  test_mixed_6(net:get(9), 'Mixed_6d')\n  test_mixed_6(net:get(10), 'Mixed_6e')\n  test_mixed_6(net:get(11), 'Mixed_6f')\n  test_mixed_6(net:get(12), 'Mixed_6g')\n  test_mixed_6(net:get(13), 'Mixed_6h')\n\n  test_mixed_4a_7a(net:get(14), 'Mixed_7a')\n\n  test_mixed_7(net:get(15), 'Mixed_7b')\n  -- test_module(net:get(15):get(2), 'Mixed_7b/Branch_1/concat')\n  -- test_module(net:get(15):get(3), 'Mixed_7b/Branch_2/concat')\n  test_mixed_7(net:get(16), 'Mixed_7c')\n  test_mixed_7(net:get(17), 'Mixed_7d')\n\n  test_linear(net:get(21), 'Logits')\nend\n\n------------------------------------------------------------------\n-- 
Main\n------------------------------------------------------------------\n\nopt = {\n  cuda = true\n}\n\nnet = nn.Sequential()\nprint(\"-- Stem\")\nnet:add(Stem())           -- 3x299x299 ==> 384x35x35\nprint(\"-- Inception-A x 4\")\nfor i=1,4 do\n  net:add(Inception_A())  -- 384x35x35 ==> 384x35x35\nend\nprint(\"-- Reduction-A\")\nnet:add(Reduction_A())    -- 384x35x35 ==> 1024x17x17\nprint(\"-- Inception-B x 7\")\nfor i=1,7 do\n  net:add(Inception_B())  -- 1024x17x17 ==> 1024x17x17\nend\nprint(\"-- Reduction-B\")\nnet:add(Reduction_B())    -- 1024x17x17 ==> 1536x8x8\nprint(\"-- Inception-C x 3\")\nfor i=1,3 do\n  net:add(Inception_C())  -- 1536x8x8 ==> 1536x8x8\nend\nprint(\"-- Average Pooling\")\nlocal avgpool = nn.SpatialAveragePooling(8, 8)\navgpool.count_include_pad = false\nnet:add(avgpool) -- 1536x8x8 ==> 1536x1x1\nnet:add(nn.View(1536))\nprint(\"-- Fully Connected\")\nnet:add(nn.Linear(1536, 1001))  -- 1536 ==> 1001\nnet:add(nn.SoftMax())\nprint(net)\n\nload(net)\n\nif opt.cuda then\n  require 'cunn'\n  require 'cutorch'\n  --require 'cudnn'\n  net:cuda()\nend\n\ntest(net)\n\nos.execute('mkdir -p save')\ntorch.save('save/inceptionv4.t7', net:clearState():float())"
  },
  {
    "path": "model_zoo/models/.github/ISSUE_TEMPLATE.md",
    "content": "## Please let us know which model this issue is about (specify the top-level directory)\n"
  },
  {
    "path": "model_zoo/models/.gitignore",
    "content": "# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n*$py.class\n\n# C extensions\n*.so\n\n# Distribution / packaging\n.Python\nenv/\nbuild/\ndevelop-eggs/\ndist/\ndownloads/\neggs/\n.eggs/\nlib/\nlib64/\nparts/\nsdist/\nvar/\n*.egg-info/\n.installed.cfg\n*.egg\n\n# PyInstaller\n#  Usually these files are written by a python script from a template\n#  before PyInstaller builds the exe, so as to inject date/other infos into it.\n*.manifest\n*.spec\n\n# Installer logs\npip-log.txt\npip-delete-this-directory.txt\n\n# Unit test / coverage reports\nhtmlcov/\n.tox/\n.coverage\n.coverage.*\n.cache\nnosetests.xml\ncoverage.xml\n*,cover\n.hypothesis/\n\n# Translations\n*.mo\n*.pot\n\n# Django stuff:\n*.log\nlocal_settings.py\n\n# Flask stuff:\ninstance/\n.webassets-cache\n\n# Scrapy stuff:\n.scrapy\n\n# Sphinx documentation\ndocs/_build/\n\n# PyBuilder\ntarget/\n\n# IPython Notebook\n.ipynb_checkpoints\n\n# pyenv\n.python-version\n\n# celery beat schedule file\ncelerybeat-schedule\n\n# dotenv\n.env\n\n# virtualenv\nvenv/\nENV/\n\n# Spyder project settings\n.spyderproject\n\n# Rope project settings\n.ropeproject\n"
  },
  {
    "path": "model_zoo/models/.gitmodules",
    "content": "[submodule \"tensorflow\"]\n\tpath = syntaxnet/tensorflow\n\turl = https://github.com/tensorflow/tensorflow.git\n"
  },
  {
    "path": "model_zoo/models/AUTHORS",
    "content": "# This is the official list of authors for copyright purposes.\n# This file is distinct from the CONTRIBUTORS files.\n# See the latter for an explanation.\n\n# Names should be added to this file as:\n# Name or Organization <email address>\n# The email address is not required for organizations.\n\nGoogle Inc.\nDavid Dao <daviddao@broad.mit.edu>\n"
  },
  {
    "path": "model_zoo/models/CONTRIBUTING.md",
    "content": "# Contributing guidelines\n\nIf you have created a model and would like to publish it here, please send us a\npull request. For those just getting started with pull requests, GitHub has a\n[howto](https://help.github.com/articles/using-pull-requests/).\n\nThe code for any model in this repository is licensed under the Apache License\n2.0.\n\nIn order to accept our code, we have to make sure that we can publish your code:\nYou have to sign a Contributor License Agreement (CLA).\n\n### Contributor License Agreements\n\nPlease fill out either the individual or corporate Contributor License Agreement (CLA).\n\n  * If you are an individual writing original source code and you're sure you own the intellectual property, then you'll need to sign an [individual CLA](http://code.google.com/legal/individual-cla-v1.0.html).\n  * If you work for a company that wants to allow you to contribute your work, then you'll need to sign a [corporate CLA](http://code.google.com/legal/corporate-cla-v1.0.html).\n\nFollow either of the two links above to access the appropriate CLA and instructions for how to sign and return it. Once we receive it, we'll be able to accept your pull requests.\n\n***NOTE***: Only original source code from you and other people that have signed the CLA can be accepted into the repository.\n\n"
  },
  {
    "path": "model_zoo/models/LICENSE",
    "content": "Copyright 2016 The TensorFlow Authors.  All rights reserved.\n\n                                 Apache License\n                           Version 2.0, January 2004\n                        http://www.apache.org/licenses/\n\n   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\n\n   1. Definitions.\n\n      \"License\" shall mean the terms and conditions for use, reproduction,\n      and distribution as defined by Sections 1 through 9 of this document.\n\n      \"Licensor\" shall mean the copyright owner or entity authorized by\n      the copyright owner that is granting the License.\n\n      \"Legal Entity\" shall mean the union of the acting entity and all\n      other entities that control, are controlled by, or are under common\n      control with that entity. For the purposes of this definition,\n      \"control\" means (i) the power, direct or indirect, to cause the\n      direction or management of such entity, whether by contract or\n      otherwise, or (ii) ownership of fifty percent (50%) or more of the\n      outstanding shares, or (iii) beneficial ownership of such entity.\n\n      \"You\" (or \"Your\") shall mean an individual or Legal Entity\n      exercising permissions granted by this License.\n\n      \"Source\" form shall mean the preferred form for making modifications,\n      including but not limited to software source code, documentation\n      source, and configuration files.\n\n      \"Object\" form shall mean any form resulting from mechanical\n      transformation or translation of a Source form, including but\n      not limited to compiled object code, generated documentation,\n      and conversions to other media types.\n\n      \"Work\" shall mean the work of authorship, whether in Source or\n      Object form, made available under the License, as indicated by a\n      copyright notice that is included in or attached to the work\n      (an example is provided in the Appendix below).\n\n      \"Derivative 
Works\" shall mean any work, whether in Source or Object\n      form, that is based on (or derived from) the Work and for which the\n      editorial revisions, annotations, elaborations, or other modifications\n      represent, as a whole, an original work of authorship. For the purposes\n      of this License, Derivative Works shall not include works that remain\n      separable from, or merely link (or bind by name) to the interfaces of,\n      the Work and Derivative Works thereof.\n\n      \"Contribution\" shall mean any work of authorship, including\n      the original version of the Work and any modifications or additions\n      to that Work or Derivative Works thereof, that is intentionally\n      submitted to Licensor for inclusion in the Work by the copyright owner\n      or by an individual or Legal Entity authorized to submit on behalf of\n      the copyright owner. For the purposes of this definition, \"submitted\"\n      means any form of electronic, verbal, or written communication sent\n      to the Licensor or its representatives, including but not limited to\n      communication on electronic mailing lists, source code control systems,\n      and issue tracking systems that are managed by, or on behalf of, the\n      Licensor for the purpose of discussing and improving the Work, but\n      excluding communication that is conspicuously marked or otherwise\n      designated in writing by the copyright owner as \"Not a Contribution.\"\n\n      \"Contributor\" shall mean Licensor and any individual or Legal Entity\n      on behalf of whom a Contribution has been received by Licensor and\n      subsequently incorporated within the Work.\n\n   2. Grant of Copyright License. 
Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      copyright license to reproduce, prepare Derivative Works of,\n      publicly display, publicly perform, sublicense, and distribute the\n      Work and such Derivative Works in Source or Object form.\n\n   3. Grant of Patent License. Subject to the terms and conditions of\n      this License, each Contributor hereby grants to You a perpetual,\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\n      (except as stated in this section) patent license to make, have made,\n      use, offer to sell, sell, import, and otherwise transfer the Work,\n      where such license applies only to those patent claims licensable\n      by such Contributor that are necessarily infringed by their\n      Contribution(s) alone or by combination of their Contribution(s)\n      with the Work to which such Contribution(s) was submitted. If You\n      institute patent litigation against any entity (including a\n      cross-claim or counterclaim in a lawsuit) alleging that the Work\n      or a Contribution incorporated within the Work constitutes direct\n      or contributory patent infringement, then any patent licenses\n      granted to You under this License for that Work shall terminate\n      as of the date such litigation is filed.\n\n   4. Redistribution. 
You may reproduce and distribute copies of the\n      Work or Derivative Works thereof in any medium, with or without\n      modifications, and in Source or Object form, provided that You\n      meet the following conditions:\n\n      (a) You must give any other recipients of the Work or\n          Derivative Works a copy of this License; and\n\n      (b) You must cause any modified files to carry prominent notices\n          stating that You changed the files; and\n\n      (c) You must retain, in the Source form of any Derivative Works\n          that You distribute, all copyright, patent, trademark, and\n          attribution notices from the Source form of the Work,\n          excluding those notices that do not pertain to any part of\n          the Derivative Works; and\n\n      (d) If the Work includes a \"NOTICE\" text file as part of its\n          distribution, then any Derivative Works that You distribute must\n          include a readable copy of the attribution notices contained\n          within such NOTICE file, excluding those notices that do not\n          pertain to any part of the Derivative Works, in at least one\n          of the following places: within a NOTICE text file distributed\n          as part of the Derivative Works; within the Source form or\n          documentation, if provided along with the Derivative Works; or,\n          within a display generated by the Derivative Works, if and\n          wherever such third-party notices normally appear. The contents\n          of the NOTICE file are for informational purposes only and\n          do not modify the License. 
You may add Your own attribution\n          notices within Derivative Works that You distribute, alongside\n          or as an addendum to the NOTICE text from the Work, provided\n          that such additional attribution notices cannot be construed\n          as modifying the License.\n\n      You may add Your own copyright statement to Your modifications and\n      may provide additional or different license terms and conditions\n      for use, reproduction, or distribution of Your modifications, or\n      for any such Derivative Works as a whole, provided Your use,\n      reproduction, and distribution of the Work otherwise complies with\n      the conditions stated in this License.\n\n   5. Submission of Contributions. Unless You explicitly state otherwise,\n      any Contribution intentionally submitted for inclusion in the Work\n      by You to the Licensor shall be under the terms and conditions of\n      this License, without any additional terms or conditions.\n      Notwithstanding the above, nothing herein shall supersede or modify\n      the terms of any separate license agreement you may have executed\n      with Licensor regarding such Contributions.\n\n   6. Trademarks. This License does not grant permission to use the trade\n      names, trademarks, service marks, or product names of the Licensor,\n      except as required for reasonable and customary use in describing the\n      origin of the Work and reproducing the content of the NOTICE file.\n\n   7. Disclaimer of Warranty. Unless required by applicable law or\n      agreed to in writing, Licensor provides the Work (and each\n      Contributor provides its Contributions) on an \"AS IS\" BASIS,\n      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\n      implied, including, without limitation, any warranties or conditions\n      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\n      PARTICULAR PURPOSE. 
You are solely responsible for determining the\n      appropriateness of using or redistributing the Work and assume any\n      risks associated with Your exercise of permissions under this License.\n\n   8. Limitation of Liability. In no event and under no legal theory,\n      whether in tort (including negligence), contract, or otherwise,\n      unless required by applicable law (such as deliberate and grossly\n      negligent acts) or agreed to in writing, shall any Contributor be\n      liable to You for damages, including any direct, indirect, special,\n      incidental, or consequential damages of any character arising as a\n      result of this License or out of the use or inability to use the\n      Work (including but not limited to damages for loss of goodwill,\n      work stoppage, computer failure or malfunction, or any and all\n      other commercial damages or losses), even if such Contributor\n      has been advised of the possibility of such damages.\n\n   9. Accepting Warranty or Additional Liability. While redistributing\n      the Work or Derivative Works thereof, You may choose to offer,\n      and charge a fee for, acceptance of support, warranty, indemnity,\n      or other liability obligations and/or rights consistent with this\n      License. However, in accepting such obligations, You may act only\n      on Your own behalf and on Your sole responsibility, not on behalf\n      of any other Contributor, and only if You agree to indemnify,\n      defend, and hold each Contributor harmless for any liability\n      incurred by, or claims asserted against, such Contributor by reason\n      of your accepting any such warranty or additional liability.\n\n   END OF TERMS AND CONDITIONS\n\n   APPENDIX: How to apply the Apache License to your work.\n\n      To apply the Apache License to your work, attach the following\n      boilerplate notice, with the fields enclosed by brackets \"[]\"\n      replaced with your own identifying information. 
(Don't include\n      the brackets!)  The text should be enclosed in the appropriate\n      comment syntax for the file format. We also recommend that a\n      file or class name and description of purpose be included on the\n      same \"printed page\" as the copyright notice for easier\n      identification within third-party archives.\n\n   Copyright 2016, The Authors.\n\n   Licensed under the Apache License, Version 2.0 (the \"License\");\n   you may not use this file except in compliance with the License.\n   You may obtain a copy of the License at\n\n       http://www.apache.org/licenses/LICENSE-2.0\n\n   Unless required by applicable law or agreed to in writing, software\n   distributed under the License is distributed on an \"AS IS\" BASIS,\n   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n   See the License for the specific language governing permissions and\n   limitations under the License.\n"
  },
  {
    "path": "model_zoo/models/README.md",
    "content": "# TensorFlow Models\n\nThis repository contains machine learning models implemented in\n[TensorFlow](https://tensorflow.org). The models are maintained by their\nrespective authors.\n\nTo propose a model for inclusion, please submit a pull request.\n\n\n## Models\n- [autoencoder](autoencoder) -- various autoencoders\n- [inception](inception) -- deep convolutional networks for computer vision\n- [namignizer](namignizer) -- recognize and generate names\n- [neural_gpu](neural_gpu) -- highly parallel neural computer\n- [privacy](privacy) -- privacy-preserving student models from multiple teachers\n- [resnet](resnet) -- deep and wide residual networks\n- [slim](slim) -- image classification models in TF-Slim\n- [swivel](swivel) -- the Swivel algorithm for generating word embeddings\n- [syntaxnet](syntaxnet) -- neural models of natural language syntax\n- [textsum](textsum) -- sequence-to-sequence with attention model for text summarization\n- [transformer](transformer) -- spatial transformer network, which allows the spatial manipulation of data within the network\n- [im2txt](im2txt) -- image-to-text neural network for image captioning\n- [neural_programmer](neural_programmer) -- neural network augmented with logic and mathematical operations\n"
  },
  {
    "path": "model_zoo/models/WORKSPACE",
    "content": ""
  },
  {
    "path": "model_zoo/models/autoencoder/AdditiveGaussianNoiseAutoencoderRunner.py",
    "content": "import numpy as np\n\nimport sklearn.preprocessing as prep\nimport tensorflow as tf\nfrom tensorflow.examples.tutorials.mnist import input_data\n\nfrom autoencoder.autoencoder_models.DenoisingAutoencoder import AdditiveGaussianNoiseAutoencoder\n\nmnist = input_data.read_data_sets('MNIST_data', one_hot = True)\n\ndef standard_scale(X_train, X_test):\n    preprocessor = prep.StandardScaler().fit(X_train)\n    X_train = preprocessor.transform(X_train)\n    X_test = preprocessor.transform(X_test)\n    return X_train, X_test\n\ndef get_random_block_from_data(data, batch_size):\n    start_index = np.random.randint(0, len(data) - batch_size)\n    return data[start_index:(start_index + batch_size)]\n\nX_train, X_test = standard_scale(mnist.train.images, mnist.test.images)\n\nn_samples = int(mnist.train.num_examples)\ntraining_epochs = 20\nbatch_size = 128\ndisplay_step = 1\n\nautoencoder = AdditiveGaussianNoiseAutoencoder(n_input = 784,\n                                               n_hidden = 200,\n                                               transfer_function = tf.nn.softplus,\n                                               optimizer = tf.train.AdamOptimizer(learning_rate = 0.001),\n                                               scale = 0.01)\n\nfor epoch in range(training_epochs):\n    avg_cost = 0.\n    total_batch = int(n_samples / batch_size)\n    # Loop over all batches\n    for i in range(total_batch):\n        batch_xs = get_random_block_from_data(X_train, batch_size)\n\n        # Fit training using batch data\n        cost = autoencoder.partial_fit(batch_xs)\n        # Compute average loss\n        avg_cost += cost / n_samples * batch_size\n\n    # Display logs per epoch step\n    if epoch % display_step == 0:\n        print \"Epoch:\", '%04d' % (epoch + 1), \\\n            \"cost=\", \"{:.9f}\".format(avg_cost)\n\nprint \"Total cost: \" + str(autoencoder.calc_total_cost(X_test))\n"
  },
  {
    "path": "model_zoo/models/autoencoder/AutoencoderRunner.py",
    "content": "import numpy as np\n\nimport sklearn.preprocessing as prep\nimport tensorflow as tf\nfrom tensorflow.examples.tutorials.mnist import input_data\n\nfrom autoencoder.autoencoder_models.Autoencoder import Autoencoder\n\nmnist = input_data.read_data_sets('MNIST_data', one_hot = True)\n\ndef standard_scale(X_train, X_test):\n    preprocessor = prep.StandardScaler().fit(X_train)\n    X_train = preprocessor.transform(X_train)\n    X_test = preprocessor.transform(X_test)\n    return X_train, X_test\n\ndef get_random_block_from_data(data, batch_size):\n    start_index = np.random.randint(0, len(data) - batch_size)\n    return data[start_index:(start_index + batch_size)]\n\nX_train, X_test = standard_scale(mnist.train.images, mnist.test.images)\n\nn_samples = int(mnist.train.num_examples)\ntraining_epochs = 20\nbatch_size = 128\ndisplay_step = 1\n\nautoencoder = Autoencoder(n_input = 784,\n                          n_hidden = 200,\n                          transfer_function = tf.nn.softplus,\n                          optimizer = tf.train.AdamOptimizer(learning_rate = 0.001))\n\nfor epoch in range(training_epochs):\n    avg_cost = 0.\n    total_batch = int(n_samples / batch_size)\n    # Loop over all batches\n    for i in range(total_batch):\n        batch_xs = get_random_block_from_data(X_train, batch_size)\n\n        # Fit training using batch data\n        cost = autoencoder.partial_fit(batch_xs)\n        # Compute average loss\n        avg_cost += cost / n_samples * batch_size\n\n    # Display logs per epoch step\n    if epoch % display_step == 0:\n        print \"Epoch:\", '%04d' % (epoch + 1), \\\n            \"cost=\", \"{:.9f}\".format(avg_cost)\n\nprint \"Total cost: \" + str(autoencoder.calc_total_cost(X_test))\n"
  },
  {
    "path": "model_zoo/models/autoencoder/MaskingNoiseAutoencoderRunner.py",
    "content": "import numpy as np\n\nimport sklearn.preprocessing as prep\nimport tensorflow as tf\nfrom tensorflow.examples.tutorials.mnist import input_data\n\nfrom autoencoder.autoencoder_models.DenoisingAutoencoder import MaskingNoiseAutoencoder\n\nmnist = input_data.read_data_sets('MNIST_data', one_hot = True)\n\ndef standard_scale(X_train, X_test):\n    preprocessor = prep.StandardScaler().fit(X_train)\n    X_train = preprocessor.transform(X_train)\n    X_test = preprocessor.transform(X_test)\n    return X_train, X_test\n\ndef get_random_block_from_data(data, batch_size):\n    start_index = np.random.randint(0, len(data) - batch_size)\n    return data[start_index:(start_index + batch_size)]\n\nX_train, X_test = standard_scale(mnist.train.images, mnist.test.images)\n\n\nn_samples = int(mnist.train.num_examples)\ntraining_epochs = 100\nbatch_size = 128\ndisplay_step = 1\n\nautoencoder = MaskingNoiseAutoencoder(n_input = 784,\n                                      n_hidden = 200,\n                                      transfer_function = tf.nn.softplus,\n                                      optimizer = tf.train.AdamOptimizer(learning_rate = 0.001),\n                                      dropout_probability = 0.95)\n\nfor epoch in range(training_epochs):\n    avg_cost = 0.\n    total_batch = int(n_samples / batch_size)\n    for i in range(total_batch):\n        batch_xs = get_random_block_from_data(X_train, batch_size)\n\n        cost = autoencoder.partial_fit(batch_xs)\n\n        avg_cost += cost / n_samples * batch_size\n\n    if epoch % display_step == 0:\n        print \"Epoch:\", '%04d' % (epoch + 1), \\\n            \"cost=\", \"{:.9f}\".format(avg_cost)\n\nprint \"Total cost: \" + str(autoencoder.calc_total_cost(X_test))\n"
  },
  {
    "path": "model_zoo/models/autoencoder/Utils.py",
    "content": "import numpy as np\nimport tensorflow as tf\n\ndef xavier_init(fan_in, fan_out, constant = 1):\n    low = -constant * np.sqrt(6.0 / (fan_in + fan_out))\n    high = constant * np.sqrt(6.0 / (fan_in + fan_out))\n    return tf.random_uniform((fan_in, fan_out),\n                             minval = low, maxval = high,\n                             dtype = tf.float32)\n"
  },
  {
    "path": "model_zoo/models/autoencoder/VariationalAutoencoderRunner.py",
    "content": "import numpy as np\n\nimport sklearn.preprocessing as prep\nimport tensorflow as tf\nfrom tensorflow.examples.tutorials.mnist import input_data\n\nfrom autoencoder.autoencoder_models.VariationalAutoencoder import VariationalAutoencoder\n\nmnist = input_data.read_data_sets('MNIST_data', one_hot = True)\n\n\n\ndef min_max_scale(X_train, X_test):\n    preprocessor = prep.MinMaxScaler().fit(X_train)\n    X_train = preprocessor.transform(X_train)\n    X_test = preprocessor.transform(X_test)\n    return X_train, X_test\n\n\ndef get_random_block_from_data(data, batch_size):\n    start_index = np.random.randint(0, len(data) - batch_size)\n    return data[start_index:(start_index + batch_size)]\n\n\nX_train, X_test = min_max_scale(mnist.train.images, mnist.test.images)\n\nn_samples = int(mnist.train.num_examples)\ntraining_epochs = 20\nbatch_size = 128\ndisplay_step = 1\n\nautoencoder = VariationalAutoencoder(n_input = 784,\n                                     n_hidden = 200,\n                                     optimizer = tf.train.AdamOptimizer(learning_rate = 0.001))\n\nfor epoch in range(training_epochs):\n    avg_cost = 0.\n    total_batch = int(n_samples / batch_size)\n    # Loop over all batches\n    for i in range(total_batch):\n        batch_xs = get_random_block_from_data(X_train, batch_size)\n\n        # Fit training using batch data\n        cost = autoencoder.partial_fit(batch_xs)\n        # Compute average loss\n        avg_cost += cost / n_samples * batch_size\n\n    # Display logs per epoch step\n    if epoch % display_step == 0:\n        print \"Epoch:\", '%04d' % (epoch + 1), \\\n            \"cost=\", \"{:.9f}\".format(avg_cost)\n\nprint \"Total cost: \" + str(autoencoder.calc_total_cost(X_test))\n"
  },
  {
    "path": "model_zoo/models/autoencoder/__init__.py",
    "content": ""
  },
  {
    "path": "model_zoo/models/autoencoder/autoencoder_models/Autoencoder.py",
    "content": "import tensorflow as tf\nimport numpy as np\nimport autoencoder.Utils\n\nclass Autoencoder(object):\n\n    def __init__(self, n_input, n_hidden, transfer_function=tf.nn.softplus, optimizer = tf.train.AdamOptimizer()):\n        self.n_input = n_input\n        self.n_hidden = n_hidden\n        self.transfer = transfer_function\n\n        network_weights = self._initialize_weights()\n        self.weights = network_weights\n\n        # model\n        self.x = tf.placeholder(tf.float32, [None, self.n_input])\n        self.hidden = self.transfer(tf.add(tf.matmul(self.x, self.weights['w1']), self.weights['b1']))\n        self.reconstruction = tf.add(tf.matmul(self.hidden, self.weights['w2']), self.weights['b2'])\n\n        # cost\n        self.cost = 0.5 * tf.reduce_sum(tf.pow(tf.sub(self.reconstruction, self.x), 2.0))\n        self.optimizer = optimizer.minimize(self.cost)\n\n        init = tf.initialize_all_variables()\n        self.sess = tf.Session()\n        self.sess.run(init)\n\n\n    def _initialize_weights(self):\n        all_weights = dict()\n        all_weights['w1'] = tf.Variable(autoencoder.Utils.xavier_init(self.n_input, self.n_hidden))\n        all_weights['b1'] = tf.Variable(tf.zeros([self.n_hidden], dtype=tf.float32))\n        all_weights['w2'] = tf.Variable(tf.zeros([self.n_hidden, self.n_input], dtype=tf.float32))\n        all_weights['b2'] = tf.Variable(tf.zeros([self.n_input], dtype=tf.float32))\n        return all_weights\n\n    def partial_fit(self, X):\n        cost, opt = self.sess.run((self.cost, self.optimizer), feed_dict={self.x: X})\n        return cost\n\n    def calc_total_cost(self, X):\n        return self.sess.run(self.cost, feed_dict = {self.x: X})\n\n    def transform(self, X):\n        return self.sess.run(self.hidden, feed_dict={self.x: X})\n\n    def generate(self, hidden = None):\n        if hidden is None:\n            hidden = np.random.normal(size=self.weights[\"b1\"])\n        return 
self.sess.run(self.reconstruction, feed_dict={self.hidden: hidden})\n\n    def reconstruct(self, X):\n        return self.sess.run(self.reconstruction, feed_dict={self.x: X})\n\n    def getWeights(self):\n        return self.sess.run(self.weights['w1'])\n\n    def getBiases(self):\n        return self.sess.run(self.weights['b1'])\n\n"
  },
  {
    "path": "model_zoo/models/autoencoder/autoencoder_models/DenoisingAutoencoder.py",
    "content": "import tensorflow as tf\nimport numpy as np\nimport autoencoder.Utils\n\n\nclass AdditiveGaussianNoiseAutoencoder(object):\n    def __init__(self, n_input, n_hidden, transfer_function = tf.nn.softplus, optimizer = tf.train.AdamOptimizer(),\n                 scale = 0.1):\n        self.n_input = n_input\n        self.n_hidden = n_hidden\n        self.transfer = transfer_function\n        self.scale = tf.placeholder(tf.float32)\n        self.training_scale = scale\n        network_weights = self._initialize_weights()\n        self.weights = network_weights\n\n        # model\n        self.x = tf.placeholder(tf.float32, [None, self.n_input])\n        self.hidden = self.transfer(tf.add(tf.matmul(self.x + scale * tf.random_normal((n_input,)),\n                self.weights['w1']),\n                self.weights['b1']))\n        self.reconstruction = tf.add(tf.matmul(self.hidden, self.weights['w2']), self.weights['b2'])\n\n        # cost\n        self.cost = 0.5 * tf.reduce_sum(tf.pow(tf.sub(self.reconstruction, self.x), 2.0))\n        self.optimizer = optimizer.minimize(self.cost)\n\n        init = tf.initialize_all_variables()\n        self.sess = tf.Session()\n        self.sess.run(init)\n\n    def _initialize_weights(self):\n        all_weights = dict()\n        all_weights['w1'] = tf.Variable(autoencoder.Utils.xavier_init(self.n_input, self.n_hidden))\n        all_weights['b1'] = tf.Variable(tf.zeros([self.n_hidden], dtype = tf.float32))\n        all_weights['w2'] = tf.Variable(tf.zeros([self.n_hidden, self.n_input], dtype = tf.float32))\n        all_weights['b2'] = tf.Variable(tf.zeros([self.n_input], dtype = tf.float32))\n        return all_weights\n\n    def partial_fit(self, X):\n        cost, opt = self.sess.run((self.cost, self.optimizer), feed_dict = {self.x: X,\n                                                                            self.scale: self.training_scale\n                                                                            
})\n        return cost\n\n    def calc_total_cost(self, X):\n        return self.sess.run(self.cost, feed_dict = {self.x: X,\n                                                     self.scale: self.training_scale\n                                                     })\n\n    def transform(self, X):\n        return self.sess.run(self.hidden, feed_dict = {self.x: X,\n                                                       self.scale: self.training_scale\n                                                       })\n\n    def generate(self, hidden = None):\n        if hidden is None:\n            hidden = np.random.normal(size = self.weights[\"b1\"])\n        return self.sess.run(self.reconstruction, feed_dict = {self.hidden: hidden})\n\n    def reconstruct(self, X):\n        return self.sess.run(self.reconstruction, feed_dict = {self.x: X,\n                                                               self.scale: self.training_scale\n                                                               })\n\n    def getWeights(self):\n        return self.sess.run(self.weights['w1'])\n\n    def getBiases(self):\n        return self.sess.run(self.weights['b1'])\n\n\nclass MaskingNoiseAutoencoder(object):\n    def __init__(self, n_input, n_hidden, transfer_function = tf.nn.softplus, optimizer = tf.train.AdamOptimizer(),\n                 dropout_probability = 0.95):\n        self.n_input = n_input\n        self.n_hidden = n_hidden\n        self.transfer = transfer_function\n        self.dropout_probability = dropout_probability\n        self.keep_prob = tf.placeholder(tf.float32)\n\n        network_weights = self._initialize_weights()\n        self.weights = network_weights\n\n        # model\n        self.x = tf.placeholder(tf.float32, [None, self.n_input])\n        self.hidden = self.transfer(tf.add(tf.matmul(tf.nn.dropout(self.x, self.keep_prob), self.weights['w1']),\n                                           self.weights['b1']))\n        self.reconstruction = 
tf.add(tf.matmul(self.hidden, self.weights['w2']), self.weights['b2'])\n\n        # cost\n        self.cost = 0.5 * tf.reduce_sum(tf.pow(tf.sub(self.reconstruction, self.x), 2.0))\n        self.optimizer = optimizer.minimize(self.cost)\n\n        init = tf.initialize_all_variables()\n        self.sess = tf.Session()\n        self.sess.run(init)\n\n    def _initialize_weights(self):\n        all_weights = dict()\n        all_weights['w1'] = tf.Variable(autoencoder.Utils.xavier_init(self.n_input, self.n_hidden))\n        all_weights['b1'] = tf.Variable(tf.zeros([self.n_hidden], dtype = tf.float32))\n        all_weights['w2'] = tf.Variable(tf.zeros([self.n_hidden, self.n_input], dtype = tf.float32))\n        all_weights['b2'] = tf.Variable(tf.zeros([self.n_input], dtype = tf.float32))\n        return all_weights\n\n    def partial_fit(self, X):\n        cost, opt = self.sess.run((self.cost, self.optimizer),\n                                  feed_dict = {self.x: X, self.keep_prob: self.dropout_probability})\n        return cost\n\n    def calc_total_cost(self, X):\n        return self.sess.run(self.cost, feed_dict = {self.x: X, self.keep_prob: 1.0})\n\n    def transform(self, X):\n        return self.sess.run(self.hidden, feed_dict = {self.x: X, self.keep_prob: 1.0})\n\n    def generate(self, hidden = None):\n        if hidden is None:\n            hidden = np.random.normal(size = self.weights[\"b1\"])\n        return self.sess.run(self.reconstruction, feed_dict = {self.hidden: hidden})\n\n    def reconstruct(self, X):\n        return self.sess.run(self.reconstruction, feed_dict = {self.x: X, self.keep_prob: 1.0})\n\n    def getWeights(self):\n        return self.sess.run(self.weights['w1'])\n\n    def getBiases(self):\n        return self.sess.run(self.weights['b1'])\n"
  },
  {
    "path": "model_zoo/models/autoencoder/autoencoder_models/VariationalAutoencoder.py",
    "content": "import tensorflow as tf\nimport numpy as np\nimport autoencoder.Utils\n\nclass VariationalAutoencoder(object):\n\n    def __init__(self, n_input, n_hidden, optimizer = tf.train.AdamOptimizer()):\n        self.n_input = n_input\n        self.n_hidden = n_hidden\n\n        network_weights = self._initialize_weights()\n        self.weights = network_weights\n\n        # model\n        self.x = tf.placeholder(tf.float32, [None, self.n_input])\n        self.z_mean = tf.add(tf.matmul(self.x, self.weights['w1']), self.weights['b1'])\n        self.z_log_sigma_sq = tf.add(tf.matmul(self.x, self.weights['log_sigma_w1']), self.weights['log_sigma_b1'])\n\n        # sample from gaussian distribution\n        eps = tf.random_normal(tf.pack([tf.shape(self.x)[0], self.n_hidden]), 0, 1, dtype = tf.float32)\n        self.z = tf.add(self.z_mean, tf.mul(tf.sqrt(tf.exp(self.z_log_sigma_sq)), eps))\n\n        self.reconstruction = tf.add(tf.matmul(self.z, self.weights['w2']), self.weights['b2'])\n\n        # cost\n        reconstr_loss = 0.5 * tf.reduce_sum(tf.pow(tf.sub(self.reconstruction, self.x), 2.0))\n        latent_loss = -0.5 * tf.reduce_sum(1 + self.z_log_sigma_sq\n                                           - tf.square(self.z_mean)\n                                           - tf.exp(self.z_log_sigma_sq), 1)\n        self.cost = tf.reduce_mean(reconstr_loss + latent_loss)\n        self.optimizer = optimizer.minimize(self.cost)\n\n        init = tf.initialize_all_variables()\n        self.sess = tf.Session()\n        self.sess.run(init)\n\n    def _initialize_weights(self):\n        all_weights = dict()\n        all_weights['w1'] = tf.Variable(autoencoder.Utils.xavier_init(self.n_input, self.n_hidden))\n        all_weights['log_sigma_w1'] = tf.Variable(autoencoder.Utils.xavier_init(self.n_input, self.n_hidden))\n        all_weights['b1'] = tf.Variable(tf.zeros([self.n_hidden], dtype=tf.float32))\n        all_weights['log_sigma_b1'] = 
tf.Variable(tf.zeros([self.n_hidden], dtype=tf.float32))\n        all_weights['w2'] = tf.Variable(tf.zeros([self.n_hidden, self.n_input], dtype=tf.float32))\n        all_weights['b2'] = tf.Variable(tf.zeros([self.n_input], dtype=tf.float32))\n        return all_weights\n\n    def partial_fit(self, X):\n        cost, opt = self.sess.run((self.cost, self.optimizer), feed_dict={self.x: X})\n        return cost\n\n    def calc_total_cost(self, X):\n        return self.sess.run(self.cost, feed_dict = {self.x: X})\n\n    def transform(self, X):\n        return self.sess.run(self.z_mean, feed_dict={self.x: X})\n\n    def generate(self, hidden = None):\n        if hidden is None:\n            hidden = np.random.normal(size=self.weights[\"b1\"])\n        return self.sess.run(self.reconstruction, feed_dict={self.z_mean: hidden})\n\n    def reconstruct(self, X):\n        return self.sess.run(self.reconstruction, feed_dict={self.x: X})\n\n    def getWeights(self):\n        return self.sess.run(self.weights['w1'])\n\n    def getBiases(self):\n        return self.sess.run(self.weights['b1'])\n\n"
  },
  {
    "path": "model_zoo/models/autoencoder/autoencoder_models/__init__.py",
    "content": ""
  },
  {
    "path": "model_zoo/models/compression/README.md",
    "content": "# Image Compression with Neural Networks\n\nThis is a [TensorFlow](http://www.tensorflow.org/) model for compressing and\ndecompressing images using an already trained Residual GRU model, as described\nin [Full Resolution Image Compression with Recurrent Neural Networks](https://arxiv.org/abs/1608.05148).\nPlease consult the paper for more details on the architecture and compression\nresults.\n\nThis code allows you to perform lossy compression with a model already trained\nfor compression. It does not currently contain the entropy coding portion of\nour paper.\n\n\n## Prerequisites\nThe only software requirement for running the encoder and decoder is having\nTensorFlow installed. You will also need to\n[download](http://download.tensorflow.org/models/compression_residual_gru-2016-08-23.tar.gz)\nand extract the model residual_gru.pb.\n\nIf you want to measure perceptual similarity under MS-SSIM, you will also\nneed to [install SciPy](https://www.scipy.org/install.html).\n\n## Encoding\nThe Residual GRU network is fully convolutional, but requires the image's\nheight and width in pixels to be multiples of 32. There is an image in this\nfolder called example.png that is 768x1024, in case one is needed for testing.\nWe also rely on TensorFlow's built-in decoding ops, which support only PNG and\nJPEG at the time of release.\n\nTo encode an image, simply run the following command:\n\n`python encoder.py --input_image=/your/image/here.png\n--output_codes=output_codes.npz --iteration=15\n--model=/path/to/model/residual_gru.pb\n`\n\nThe iteration parameter specifies the lossy quality to target for compression.\nThe quality can be [0-15], where 0 corresponds to a target of 1/8 bits per\npixel (bpp) and every increment results in an additional 1/8 bpp.\n\n| Iteration | BPP | Compression Ratio |\n|---: |---: |---: |\n|0 | 0.125 | 192:1|\n|1 | 0.250 | 96:1|\n|2 | 0.375 | 64:1|\n|3 | 0.500 | 48:1|\n|4 | 0.625 | 38.4:1|\n|5 | 0.750 | 32:1|\n|6 | 0.875 | 27.4:1|\n|7 | 1.000 | 24:1|\n|8 | 1.125 | 21.3:1|\n|9 | 1.250 | 19.2:1|\n|10 | 1.375 | 17.4:1|\n|11 | 1.500 | 16:1|\n|12 | 1.625 | 14.7:1|\n|13 | 1.750 | 13.7:1|\n|14 | 1.875 | 12.8:1|\n|15 | 2.000 | 12:1|\n\nThe output_codes file contains the numpy shape and a flattened, bit-packed\narray of the codes. These can be inspected in Python using numpy.load().\n\n\n## Decoding\nAfter generating codes for an image, the lossy reconstructions for that image\ncan be produced as follows:\n\n`python decoder.py --input_codes=codes.npz --output_directory=/tmp/decoded/\n--model=residual_gru.pb`\n\nThe output_directory will contain images decoded at each quality level.\n\n\n## Comparing Similarity\nOne of our primary metrics for comparing how similar two images are\nis MS-SSIM.\n\nTo generate these metrics on your images you can run:\n`python msssim.py --original_image=/path/to/your/image.png\n--compared_image=/tmp/decoded/image_15.png`\n\n\n## Results\nCSV results containing the post-entropy bitrates and MS-SSIM over Kodak are\navailable for reference. Each row of the CSV represents one of the Kodak\nimages, in dataset order (1-24). Each column of the CSV represents one\niteration of the model (1-16).\n\n[Post Entropy Bitrates](https://storage.googleapis.com/compression-ml/residual_gru_results/bitrate.csv)\n\n[MS-SSIM](https://storage.googleapis.com/compression-ml/residual_gru_results/msssim.csv)\n\n\n## FAQ\n\n#### How do I train my own compression network?\nWe currently don't provide the code to build and train a compression\ngraph from scratch.\n\n#### I get an InvalidArgumentError: Incompatible shapes.\nThis is usually because our network only supports images whose height and\nwidth are both divisible by 32 pixels. Try padding your images to 32-pixel\nboundaries.\n\n\n## Contact Info\nModel repository maintained by Nick Johnston ([nickj-google](https://github.com/nickj-google)).\n"
  },
  {
    "path": "model_zoo/models/compression/decoder.py",
    "content": "#!/usr/bin/python\n#\n# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Neural Network Image Compression Decoder.\n\nDecompress an image from the npz format generated by the encoder.\n\nExample usage:\npython decoder.py --input_codes=output_codes.pkl --iteration=15 \\\n--output_directory=/tmp/compression_output/ --model=residual_gru.pb\n\"\"\"\nimport io\nimport os\n\nimport numpy as np\nimport tensorflow as tf\n\ntf.flags.DEFINE_string('input_codes', None, 'Location of binary code file.')\ntf.flags.DEFINE_integer('iteration', -1, 'The max quality level of '\n                        'the images to output. Use -1 to infer from loaded '\n                        'codes.')\ntf.flags.DEFINE_string('output_directory', None, 'Directory to save decoded '\n                       'images.')\ntf.flags.DEFINE_string('model', None, 'Location of compression model.')\n\nFLAGS = tf.flags.FLAGS\n\n\ndef get_input_tensor_names():\n  name_list = ['GruBinarizer/SignBinarizer/Sign:0']\n  for i in xrange(1, 16):\n    name_list.append('GruBinarizer/SignBinarizer/Sign_{}:0'.format(i))\n  return name_list\n\n\ndef get_output_tensor_names():\n  return ['loop_{0:02d}/add:0'.format(i) for i in xrange(0, 16)]\n\n\ndef main(_):\n  if (FLAGS.input_codes is None or FLAGS.output_directory is None or\n      FLAGS.model is None):\n    print ('\\nUsage: python decoder.py --input_codes=output_codes.pkl '\n           '--iteration=15 --output_directory=/tmp/compression_output/ '\n           '--model=residual_gru.pb\\n\\n')\n    return\n\n  if FLAGS.iteration < -1 or FLAGS.iteration > 15:\n    print ('\\n--iteration must be between 0 and 15 inclusive, or -1 to infer '\n           'from file.\\n')\n    return\n  iteration = FLAGS.iteration\n\n  if not tf.gfile.Exists(FLAGS.output_directory):\n    tf.gfile.MkDir(FLAGS.output_directory)\n\n  if not tf.gfile.Exists(FLAGS.input_codes):\n    print '\\nInput codes not found.\\n'\n    return\n\n  contents = ''\n  with tf.gfile.FastGFile(FLAGS.input_codes, 'r') as code_file:\n    contents = code_file.read()\n    loaded_codes = np.load(io.BytesIO(contents))\n    # Sanity check: both the 'codes' and 'shape' arrays must be present.\n    assert 'codes' in loaded_codes.files and 'shape' in loaded_codes.files\n    loaded_shape = loaded_codes['shape']\n    loaded_array = loaded_codes['codes']\n\n    # Unpack and recover code shapes.\n    unpacked_codes = np.reshape(np.unpackbits(loaded_array)\n                                [:np.prod(loaded_shape)],\n                                loaded_shape)\n\n    numpy_int_codes = np.split(unpacked_codes, len(unpacked_codes))\n    if iteration == -1:\n      iteration = len(unpacked_codes) - 1\n    # Convert back to float 
and recover scale.\n    numpy_codes = [np.squeeze(x.astype(np.float32), 0) * 2 - 1 for x in\n                   numpy_int_codes]\n\n  with tf.Graph().as_default() as graph:\n    # Load the inference model for decoding.\n    with tf.gfile.FastGFile(FLAGS.model, 'rb') as model_file:\n      graph_def = tf.GraphDef()\n      graph_def.ParseFromString(model_file.read())\n    _ = tf.import_graph_def(graph_def, name='')\n\n    # For encoding the tensors into PNGs.\n    input_image = tf.placeholder(tf.uint8)\n    encoded_image = tf.image.encode_png(input_image)\n\n    input_tensors = [graph.get_tensor_by_name(name) for name in\n                     get_input_tensor_names()][0:iteration+1]\n    outputs = [graph.get_tensor_by_name(name) for name in\n               get_output_tensor_names()][0:iteration+1]\n\n  feed_dict = {key: value for (key, value) in zip(input_tensors,\n                                                  numpy_codes)}\n\n  with tf.Session(graph=graph) as sess:\n    results = sess.run(outputs, feed_dict=feed_dict)\n\n    for index, result in enumerate(results):\n      img = np.uint8(np.clip(result + 0.5, 0, 255))\n      img = img.squeeze()\n      png_img = sess.run(encoded_image, feed_dict={input_image: img})\n\n      with tf.gfile.FastGFile(os.path.join(FLAGS.output_directory,\n                                           'image_{0:02d}.png'.format(index)),\n                              'w') as output_image:\n        output_image.write(png_img)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/compression/encoder.py",
    "content": "#!/usr/bin/python\n#\n# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Neural Network Image Compression Encoder.\n\nCompresses an image to a binarized numpy array. The image must be padded to a\nmultiple of 32 pixels in height and width.\n\nExample usage:\npython encoder.py --input_image=/your/image/here.png \\\n--output_codes=output_codes.pkl --iteration=15 --model=residual_gru.pb\n\"\"\"\nimport io\nimport os\n\nimport numpy as np\nimport tensorflow as tf\n\ntf.flags.DEFINE_string('input_image', None, 'Location of input image. We rely '\n                       'on tf.image to decode the image, so only PNG and JPEG '\n                       'formats are currently supported.')\ntf.flags.DEFINE_integer('iteration', 15, 'Quality level for encoding image. 
'\n                        'Must be between 0 and 15 inclusive.')\ntf.flags.DEFINE_string('output_codes', None, 'File to save output encoding.')\ntf.flags.DEFINE_string('model', None, 'Location of compression model.')\n\nFLAGS = tf.flags.FLAGS\n\n\ndef get_output_tensor_names():\n  name_list = ['GruBinarizer/SignBinarizer/Sign:0']\n  for i in xrange(1, 16):\n    name_list.append('GruBinarizer/SignBinarizer/Sign_{}:0'.format(i))\n  return name_list\n\n\ndef main(_):\n  if (FLAGS.input_image is None or FLAGS.output_codes is None or\n      FLAGS.model is None):\n    print ('\\nUsage: python encoder.py --input_image=/your/image/here.png '\n           '--output_codes=output_codes.pkl --iteration=15 '\n           '--model=residual_gru.pb\\n\\n')\n    return\n\n  if FLAGS.iteration < 0 or FLAGS.iteration > 15:\n    print '\\n--iteration must be between 0 and 15 inclusive.\\n'\n    return\n\n  with tf.gfile.FastGFile(FLAGS.input_image) as input_image:\n    input_image_str = input_image.read()\n\n  with tf.Graph().as_default() as graph:\n    # Load the inference model for encoding.\n    with tf.gfile.FastGFile(FLAGS.model, 'rb') as model_file:\n      graph_def = tf.GraphDef()\n      graph_def.ParseFromString(model_file.read())\n    _ = tf.import_graph_def(graph_def, name='')\n\n    input_tensor = graph.get_tensor_by_name('Placeholder:0')\n    outputs = [graph.get_tensor_by_name(name) for name in\n               get_output_tensor_names()]\n\n    input_image = tf.placeholder(tf.string)\n    _, ext = os.path.splitext(FLAGS.input_image)\n    if ext == '.png':\n      decoded_image = tf.image.decode_png(input_image, channels=3)\n    elif ext == '.jpeg' or ext == '.jpg':\n      decoded_image = tf.image.decode_jpeg(input_image, channels=3)\n    else:\n      assert False, 'Unsupported file format {}'.format(ext)\n    decoded_image = tf.expand_dims(decoded_image, 0)\n\n  with tf.Session(graph=graph) as sess:\n    img_array = sess.run(decoded_image, feed_dict={input_image:\n           
                                        input_image_str})\n    results = sess.run(outputs, feed_dict={input_tensor: img_array})\n\n  results = results[0:FLAGS.iteration + 1]\n  int_codes = np.asarray([x.astype(np.int8) for x in results])\n\n  # Convert int codes to binary.\n  int_codes = (int_codes + 1)/2\n  export = np.packbits(int_codes.reshape(-1))\n\n  output = io.BytesIO()\n  np.savez_compressed(output, shape=int_codes.shape, codes=export)\n  with tf.gfile.FastGFile(FLAGS.output_codes, 'w') as code_file:\n    code_file.write(output.getvalue())\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/compression/msssim.py",
    "content": "#!/usr/bin/python\n#\n# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Python implementation of MS-SSIM.\n\nUsage:\n\npython msssim.py --original_image=original.png --compared_image=distorted.png\n\"\"\"\nimport numpy as np\nfrom scipy import signal\nfrom scipy.ndimage.filters import convolve\nimport tensorflow as tf\n\n\ntf.flags.DEFINE_string('original_image', None, 'Path to PNG image.')\ntf.flags.DEFINE_string('compared_image', None, 'Path to PNG image.')\nFLAGS = tf.flags.FLAGS\n\n\ndef _FSpecialGauss(size, sigma):\n  \"\"\"Function to mimic the 'fspecial' gaussian MATLAB function.\"\"\"\n  radius = size // 2\n  offset = 0.0\n  start, stop = -radius, radius + 1\n  if size % 2 == 0:\n    offset = 0.5\n    stop -= 1\n  x, y = np.mgrid[offset + start:stop, offset + start:stop]\n  assert len(x) == size\n  g = np.exp(-((x**2 + y**2)/(2.0 * sigma**2)))\n  return g / g.sum()\n\n\ndef _SSIMForMultiScale(img1, img2, max_val=255, filter_size=11,\n                       filter_sigma=1.5, k1=0.01, k2=0.03):\n  \"\"\"Return the Structural Similarity Map between `img1` and `img2`.\n\n  This function attempts to match the functionality of ssim_index_new.m by\n  Zhou Wang: http://www.cns.nyu.edu/~lcv/ssim/msssim.zip\n\n  Arguments:\n    img1: Numpy array holding the first RGB image batch.\n    img2: Numpy 
array holding the second RGB image batch.\n    max_val: the dynamic range of the images (i.e., the difference between the\n      maximum and minimum allowed values).\n    filter_size: Size of blur kernel to use (will be reduced for small images).\n    filter_sigma: Standard deviation for Gaussian blur kernel (will be reduced\n      for small images).\n    k1: Constant used to maintain stability in the SSIM calculation (0.01 in\n      the original paper).\n    k2: Constant used to maintain stability in the SSIM calculation (0.03 in\n      the original paper).\n\n  Returns:\n    Pair containing the mean SSIM and contrast sensitivity between `img1` and\n    `img2`.\n\n  Raises:\n    RuntimeError: If input images don't have the same shape or don't have four\n      dimensions: [batch_size, height, width, depth].\n  \"\"\"\n  if img1.shape != img2.shape:\n    raise RuntimeError('Input images must have the same shape (%s vs. %s).',\n                       img1.shape, img2.shape)\n  if img1.ndim != 4:\n    raise RuntimeError('Input images must have four dimensions, not %d',\n                       img1.ndim)\n\n  img1 = img1.astype(np.float64)\n  img2 = img2.astype(np.float64)\n  _, height, width, _ = img1.shape\n\n  # Filter size can't be larger than height or width of images.\n  size = min(filter_size, height, width)\n\n  # Scale down sigma if a smaller filter size is used.\n  sigma = size * filter_sigma / filter_size if filter_size else 0\n\n  if filter_size:\n    window = np.reshape(_FSpecialGauss(size, sigma), (1, size, size, 1))\n    mu1 = signal.fftconvolve(img1, window, mode='valid')\n    mu2 = signal.fftconvolve(img2, window, mode='valid')\n    sigma11 = signal.fftconvolve(img1 * img1, window, mode='valid')\n    sigma22 = signal.fftconvolve(img2 * img2, window, mode='valid')\n    sigma12 = signal.fftconvolve(img1 * img2, window, mode='valid')\n  else:\n    # Empty blur kernel so no need to convolve.\n    mu1, mu2 = img1, img2\n    sigma11 = img1 * img1\n    
sigma22 = img2 * img2\n    sigma12 = img1 * img2\n\n  mu11 = mu1 * mu1\n  mu22 = mu2 * mu2\n  mu12 = mu1 * mu2\n  sigma11 -= mu11\n  sigma22 -= mu22\n  sigma12 -= mu12\n\n  # Calculate intermediate values used by both ssim and cs_map.\n  c1 = (k1 * max_val) ** 2\n  c2 = (k2 * max_val) ** 2\n  v1 = 2.0 * sigma12 + c2\n  v2 = sigma11 + sigma22 + c2\n  ssim = np.mean((((2.0 * mu12 + c1) * v1) / ((mu11 + mu22 + c1) * v2)))\n  cs = np.mean(v1 / v2)\n  return ssim, cs\n\n\ndef MultiScaleSSIM(img1, img2, max_val=255, filter_size=11, filter_sigma=1.5,\n                   k1=0.01, k2=0.03, weights=None):\n  \"\"\"Return the MS-SSIM score between `img1` and `img2`.\n\n  This function implements Multi-Scale Structural Similarity (MS-SSIM) Image\n  Quality Assessment according to Zhou Wang's paper, \"Multi-scale structural\n  similarity for image quality assessment\" (2003).\n  Link: https://ece.uwaterloo.ca/~z70wang/publications/msssim.pdf\n\n  Author's MATLAB implementation:\n  http://www.cns.nyu.edu/~lcv/ssim/msssim.zip\n\n  Arguments:\n    img1: Numpy array holding the first RGB image batch.\n    img2: Numpy array holding the second RGB image batch.\n    max_val: the dynamic range of the images (i.e., the difference between the\n      maximum and minimum allowed values).\n    filter_size: Size of blur kernel to use (will be reduced for small images).\n    filter_sigma: Standard deviation for Gaussian blur kernel (will be reduced\n      for small images).\n    k1: Constant used to maintain stability in the SSIM calculation (0.01 in\n      the original paper).\n    k2: Constant used to maintain stability in the SSIM calculation (0.03 in\n      the original paper).\n    weights: List of weights for each level; if none, use five levels and the\n      weights from the original paper.\n\n  Returns:\n    MS-SSIM score between `img1` and `img2`.\n\n  Raises:\n    RuntimeError: If input images don't have the same shape or don't have four\n      dimensions: [batch_size, height, 
width, depth].\n  \"\"\"\n  if img1.shape != img2.shape:\n    raise RuntimeError('Input images must have the same shape (%s vs. %s).',\n                       img1.shape, img2.shape)\n  if img1.ndim != 4:\n    raise RuntimeError('Input images must have four dimensions, not %d',\n                       img1.ndim)\n\n  # Note: default weights don't sum to 1.0 but do match the paper / matlab code.\n  weights = np.array(weights if weights else\n                     [0.0448, 0.2856, 0.3001, 0.2363, 0.1333])\n  levels = weights.size\n  downsample_filter = np.ones((1, 2, 2, 1)) / 4.0\n  im1, im2 = [x.astype(np.float64) for x in [img1, img2]]\n  mssim = np.array([])\n  mcs = np.array([])\n  for _ in xrange(levels):\n    ssim, cs = _SSIMForMultiScale(\n        im1, im2, max_val=max_val, filter_size=filter_size,\n        filter_sigma=filter_sigma, k1=k1, k2=k2)\n    mssim = np.append(mssim, ssim)\n    mcs = np.append(mcs, cs)\n    filtered = [convolve(im, downsample_filter, mode='reflect')\n                for im in [im1, im2]]\n    im1, im2 = [x[:, ::2, ::2, :] for x in filtered]\n  return (np.prod(mcs[0:levels-1] ** weights[0:levels-1]) *\n          (mssim[levels-1] ** weights[levels-1]))\n\n\ndef main(_):\n  if FLAGS.original_image is None or FLAGS.compared_image is None:\n    print ('\\nUsage: python msssim.py --original_image=original.png '\n           '--compared_image=distorted.png\\n\\n')\n    return\n\n  if not tf.gfile.Exists(FLAGS.original_image):\n    print '\\nCannot find --original_image.\\n'\n    return\n\n  if not tf.gfile.Exists(FLAGS.compared_image):\n    print '\\nCannot find --compared_image.\\n'\n    return\n\n  with tf.gfile.FastGFile(FLAGS.original_image) as image_file:\n    img1_str = image_file.read()\n  with tf.gfile.FastGFile(FLAGS.compared_image) as image_file:\n    img2_str = image_file.read()\n\n  input_img = tf.placeholder(tf.string)\n  decoded_image = tf.expand_dims(tf.image.decode_png(input_img, channels=3), 0)\n\n  with tf.Session() as 
sess:\n    img1 = sess.run(decoded_image, feed_dict={input_img: img1_str})\n    img2 = sess.run(decoded_image, feed_dict={input_img: img2_str})\n\n  print MultiScaleSSIM(img1, img2, max_val=255)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/README.md",
    "content": "<font size=4><b>Deep Learning with Differential Privacy</b></font>\n\nOpen Sourced By: Xin Pan (xpan@google.com, github: panyx0718)\n\n\n### Introduction for dp_sgd/README.md\n\nMachine learning techniques based on neural networks are achieving remarkable \nresults in a wide variety of domains. Often, the training of models requires \nlarge, representative datasets, which may be crowdsourced and contain sensitive \ninformation. The models should not expose private information in these datasets. \nAddressing this goal, we develop new algorithmic techniques for learning and a \nrefined analysis of privacy costs within the framework of differential privacy. \nOur implementation and experiments demonstrate that we can train deep neural \nnetworks with non-convex objectives, under a modest privacy budget, and at a \nmanageable cost in software complexity, training efficiency, and model quality.\n\npaper: https://arxiv.org/abs/1607.00133\n\n\n### Introduction for multiple_teachers/README.md\n\nThis repository contains code to create a setup for learning privacy-preserving \nstudent models by transferring knowledge from an ensemble of teachers trained \non disjoint subsets of the data for which privacy guarantees are to be provided.\n\nKnowledge acquired by teachers is transferred to the student in a differentially\nprivate manner by noisily aggregating the teacher decisions before feeding them\nto the student during training.\n\npaper: https://arxiv.org/abs/1610.05755\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/__init__.py",
    "content": ""
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/README.md",
    "content": "<font size=4><b>Deep Learning with Differential Privacy</b></font>\n\nAuthors:\nMartín Abadi, Andy Chu, Ian Goodfellow, H. Brendan McMahan, Ilya Mironov, Kunal Talwar, Li Zhang\n\nOpen Sourced By: Xin Pan (xpan@google.com, github: panyx0718)\n\n\n<Introduction>\n\nMachine learning techniques based on neural networks are achieving remarkable \nresults in a wide variety of domains. Often, the training of models requires \nlarge, representative datasets, which may be crowdsourced and contain sensitive \ninformation. The models should not expose private information in these datasets. \nAddressing this goal, we develop new algorithmic techniques for learning and a \nrefined analysis of privacy costs within the framework of differential privacy. \nOur implementation and experiments demonstrate that we can train deep neural \nnetworks with non-convex objectives, under a modest privacy budget, and at a \nmanageable cost in software complexity, training efficiency, and model quality.\n\npaper: https://arxiv.org/abs/1607.00133\n\n\n<b>Requirements:</b>\n\n1. Tensorflow 0.10.0 (master branch)\n\nNote: r0.11 might experience some problems\n\n2. Bazel 0.3.1\n\n3. 
Download MNIST data\n\nTODO(xpan): Complete the link:\n[train](http://download.tensorflow.org/models/)\n[test](http://download.tensorflow.org/models/)\n\nAlternatively, download the tfrecord format MNIST from:\nhttps://github.com/panyx0718/models/tree/master/slim\n\n<b>How to run:</b>\n\n```shell\n# Clone the codes under differential_privacy.\n# Create an empty WORKSPACE file.\n# Download the data to the data/ directory.\n\n# List the codes.\nls -R differential_privacy/\ndifferential_privacy/:\ndp_sgd  __init__.py  privacy_accountant  README.md\n\ndifferential_privacy/dp_sgd:\ndp_mnist  dp_optimizer  per_example_gradients  README.md\n\ndifferential_privacy/dp_sgd/dp_mnist:\nBUILD  dp_mnist.py\n\ndifferential_privacy/dp_sgd/dp_optimizer:\nBUILD  dp_optimizer.py  dp_pca.py  sanitizer.py  utils.py\n\ndifferential_privacy/dp_sgd/per_example_gradients:\nBUILD  per_example_gradients.py\n\ndifferential_privacy/privacy_accountant:\npython  tf\n\ndifferential_privacy/privacy_accountant/python:\nBUILD  gaussian_moments.py\n\ndifferential_privacy/privacy_accountant/tf:\naccountant.py  accountant_test.py  BUILD\n\n# List the data.\nls -R data/\n\n./data:\nmnist_test.tfrecord  mnist_train.tfrecord\n\n# Build the codes.\nbazel build -c opt differential_privacy/...\n\n# Run the mnist differential privacy training codes.\nbazel-bin/differential_privacy/dp_sgd/dp_mnist/dp_mnist \\\n    --training_data_path=data/mnist_train.tfrecord \\\n    --eval_data_path=data/mnist_test.tfrecord \\\n    --save_path=/tmp/mnist_dir\n\n...\nstep: 1\nstep: 2\n...\nstep: 9\nspent privacy: eps 0.1250 delta 0.72709\nspent privacy: eps 0.2500 delta 0.24708\nspent privacy: eps 0.5000 delta 0.0029139\nspent privacy: eps 1.0000 delta 6.494e-10\nspent privacy: eps 2.0000 delta 8.2242e-24\nspent privacy: eps 4.0000 delta 1.319e-51\nspent privacy: eps 8.0000 delta 3.3927e-107\ntrain_accuracy: 0.53\neval_accuracy: 0.53\n...\n\nls /tmp/mnist_dir/\ncheckpoint  ckpt  ckpt.meta  results-0.json\n```\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_mnist/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//differential_privacy/...\",\n    ],\n)\n\npy_binary(\n    name = \"dp_mnist\",\n    srcs = [\n        \"dp_mnist.py\",\n    ],\n    deps = [\n        \"//differential_privacy/dp_sgd/dp_optimizer\",\n        \"//differential_privacy/dp_sgd/dp_optimizer:dp_pca\",\n        \"//differential_privacy/dp_sgd/dp_optimizer:utils\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_mnist/dp_mnist.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Example differentially private trainer and evaluator for MNIST.\n\"\"\"\nfrom __future__ import division\n\nimport json\nimport os\nimport sys\nimport time\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom differential_privacy.dp_sgd.dp_optimizer import dp_optimizer\nfrom differential_privacy.dp_sgd.dp_optimizer import dp_pca\nfrom differential_privacy.dp_sgd.dp_optimizer import sanitizer\nfrom differential_privacy.dp_sgd.dp_optimizer import utils\nfrom differential_privacy.privacy_accountant.tf import accountant\n\n# parameters for the training\ntf.flags.DEFINE_integer(\"batch_size\", 600,\n                        \"The training batch size.\")\ntf.flags.DEFINE_integer(\"batches_per_lot\", 1,\n                        \"Number of batches per lot.\")\n# Together, batch_size and batches_per_lot determine lot_size.\ntf.flags.DEFINE_integer(\"num_training_steps\", 50000,\n                        \"The number of training steps.\"\n                        \"This counts number of lots.\")\n\ntf.flags.DEFINE_bool(\"randomize\", True,\n                     \"If true, randomize the input data; otherwise use a fixed \"\n                     \"seed and non-randomized input.\")\ntf.flags.DEFINE_bool(\"freeze_bottom_layers\", False,\n                     \"If 
true, only train on the logit layer.\")\ntf.flags.DEFINE_bool(\"save_mistakes\", False,\n                     \"If true, save the mistakes made during testing.\")\ntf.flags.DEFINE_float(\"lr\", 0.05, \"start learning rate\")\ntf.flags.DEFINE_float(\"end_lr\", 0.05, \"end learning rate\")\ntf.flags.DEFINE_float(\"lr_saturate_epochs\", 0,\n                      \"learning rate saturate epochs; set to 0 for a constant \"\n                      \"learning rate of --lr.\")\n\n# For searching parameters\ntf.flags.DEFINE_integer(\"projection_dimensions\", 60,\n                        \"PCA projection dimensions, or 0 for no projection.\")\ntf.flags.DEFINE_integer(\"num_hidden_layers\", 1,\n                        \"Number of hidden layers in the network\")\ntf.flags.DEFINE_integer(\"hidden_layer_num_units\", 1000,\n                        \"Number of units per hidden layer\")\ntf.flags.DEFINE_float(\"default_gradient_l2norm_bound\", 4.0, \"norm clipping\")\ntf.flags.DEFINE_integer(\"num_conv_layers\", 0,\n                        \"Number of convolutional layers to use.\")\n\ntf.flags.DEFINE_string(\"training_data_path\",\n                       \"/tmp/mnist/mnist_train.tfrecord\",\n                       \"Location of the training data.\")\ntf.flags.DEFINE_string(\"eval_data_path\",\n                       \"/tmp/mnist/mnist_test.tfrecord\",\n                       \"Location of the eval data.\")\ntf.flags.DEFINE_integer(\"eval_steps\", 10,\n                        \"Evaluate the model every eval_steps\")\n\n# Parameters for privacy spending. 
We allow linearly varying eps during\n# training.\ntf.flags.DEFINE_string(\"accountant_type\", \"Moments\", \"Moments, Amortized.\")\n\n# Flags that control privacy spending during training.\ntf.flags.DEFINE_float(\"eps\", 1.0,\n                      \"Start privacy spending for one epoch of training, \"\n                      \"used if accountant_type is Amortized.\")\ntf.flags.DEFINE_float(\"end_eps\", 1.0,\n                      \"End privacy spending for one epoch of training, \"\n                      \"used if accountant_type is Amortized.\")\ntf.flags.DEFINE_float(\"eps_saturate_epochs\", 0,\n                      \"Stop varying epsilon after eps_saturate_epochs. Set to \"\n                      \"0 for constant eps of --eps. \"\n                      \"Used if accountant_type is Amortized.\")\ntf.flags.DEFINE_float(\"delta\", 1e-5,\n                      \"Privacy spending for training. Constant through \"\n                      \"training, used if accountant_type is Amortized.\")\ntf.flags.DEFINE_float(\"sigma\", 4.0,\n                      \"Noise sigma, used only if accountant_type is Moments\")\n\n\n# Flags that control privacy spending for the pca projection\n# (only used if --projection_dimensions > 0).\ntf.flags.DEFINE_float(\"pca_eps\", 0.5,\n                      \"Privacy spending for PCA, used if accountant_type is \"\n                      \"Amortized.\")\ntf.flags.DEFINE_float(\"pca_delta\", 0.005,\n                      \"Privacy spending for PCA, used if accountant_type is \"\n                      \"Amortized.\")\n\ntf.flags.DEFINE_float(\"pca_sigma\", 7.0,\n                      \"Noise sigma for PCA, used if accountant_type is Moments\")\n\ntf.flags.DEFINE_string(\"target_eps\", \"0.125,0.25,0.5,1,2,4,8\",\n                       \"Log the privacy loss for the target epsilon's. 
Only \"\n                       \"used when accountant_type is Moments.\")\ntf.flags.DEFINE_float(\"target_delta\", 1e-5,\n                      \"Maximum delta for --terminate_based_on_privacy.\")\ntf.flags.DEFINE_bool(\"terminate_based_on_privacy\", False,\n                     \"Stop training if privacy spent exceeds \"\n                     \"(max(--target_eps), --target_delta), even \"\n                     \"if --num_training_steps have not yet been completed.\")\n\ntf.flags.DEFINE_string(\"save_path\", \"/tmp/mnist_dir\",\n                       \"Directory for saving model outputs.\")\n\nFLAGS = tf.flags.FLAGS\nNUM_TRAINING_IMAGES = 60000\nNUM_TESTING_IMAGES = 10000\nIMAGE_SIZE = 28\n\n\ndef MnistInput(mnist_data_file, batch_size, randomize):\n  \"\"\"Create operations to read the MNIST input file.\n\n  Args:\n    mnist_data_file: Path of a file containing the MNIST images to process.\n    batch_size: size of the mini batches to generate.\n    randomize: If true, randomize the dataset.\n\n  Returns:\n    images: A tensor with the formatted image data. shape [batch_size, 28*28]\n    labels: A tensor with the labels for each image.  
shape [batch_size]\n  \"\"\"\n  file_queue = tf.train.string_input_producer([mnist_data_file])\n  reader = tf.TFRecordReader()\n  _, value = reader.read(file_queue)\n  example = tf.parse_single_example(\n      value,\n      features={\"image/encoded\": tf.FixedLenFeature(shape=(), dtype=tf.string),\n                \"image/class/label\": tf.FixedLenFeature([1], tf.int64)})\n\n  image = tf.cast(tf.image.decode_png(example[\"image/encoded\"], channels=1),\n                  tf.float32)\n  image = tf.reshape(image, [IMAGE_SIZE * IMAGE_SIZE])\n  image /= 255\n  label = tf.cast(example[\"image/class/label\"], dtype=tf.int32)\n  label = tf.reshape(label, [])\n\n  if randomize:\n    images, labels = tf.train.shuffle_batch(\n        [image, label], batch_size=batch_size,\n        capacity=(batch_size * 100),\n        min_after_dequeue=(batch_size * 10))\n  else:\n    images, labels = tf.train.batch([image, label], batch_size=batch_size)\n\n  return images, labels\n\n\ndef Eval(mnist_data_file, network_parameters, num_testing_images,\n         randomize, load_path, save_mistakes=False):\n  \"\"\"Evaluate MNIST for a number of steps.\n\n  Args:\n    mnist_data_file: Path of a file containing the MNIST images to process.\n    network_parameters: parameters for defining and training the network.\n    num_testing_images: the number of images we will evaluate on.\n    randomize: if true, randomize the testing images; otherwise, read them\n      sequentially.\n    load_path: path where to load trained parameters from.\n    save_mistakes: save the mistakes if True.\n\n  Returns:\n    A pair of the evaluation accuracy (as a float) and the list of mistakes\n    (or None if save_mistakes is False).\n  \"\"\"\n  batch_size = 100\n  # Like for training, we need a session for executing the TensorFlow graph.\n  with tf.Graph().as_default(), tf.Session() as sess:\n    # Create the basic Mnist model.\n    images, labels = MnistInput(mnist_data_file, batch_size, randomize)\n    logits, _, _ = utils.BuildNetwork(images, network_parameters)\n    softmax = 
tf.nn.softmax(logits)\n\n    # Load the variables.\n    ckpt_state = tf.train.get_checkpoint_state(load_path)\n    if not (ckpt_state and ckpt_state.model_checkpoint_path):\n      raise ValueError(\"No model checkpoint to eval at %s\\n\" % load_path)\n\n    saver = tf.train.Saver()\n    saver.restore(sess, ckpt_state.model_checkpoint_path)\n    coord = tf.train.Coordinator()\n    _ = tf.train.start_queue_runners(sess=sess, coord=coord)\n\n    total_examples = 0\n    correct_predictions = 0\n    image_index = 0\n    mistakes = []\n    for _ in xrange((num_testing_images + batch_size - 1) // batch_size):\n      predictions, label_values = sess.run([softmax, labels])\n\n      # Count how many were predicted correctly.\n      for prediction, label_value in zip(predictions, label_values):\n        total_examples += 1\n        if np.argmax(prediction) == label_value:\n          correct_predictions += 1\n        elif save_mistakes:\n          mistakes.append({\"index\": image_index,\n                           \"label\": label_value,\n                           \"pred\": np.argmax(prediction)})\n        image_index += 1\n\n  return (correct_predictions / total_examples,\n          mistakes if save_mistakes else None)\n\n\ndef Train(mnist_train_file, mnist_test_file, network_parameters, num_steps,\n          save_path, eval_steps=0):\n  \"\"\"Train MNIST for a number of steps.\n\n  Args:\n    mnist_train_file: path of MNIST train data file.\n    mnist_test_file: path of MNIST test data file.\n    network_parameters: parameters for defining and training the network.\n    num_steps: number of steps to run. 
Here steps = lots\n    save_path: path where to save trained parameters.\n    eval_steps: evaluate the model every eval_steps.\n\n  Returns:\n    the result after the final training step.\n\n  Raises:\n    ValueError: if the accountant_type is not supported.\n  \"\"\"\n  batch_size = FLAGS.batch_size\n\n  params = {\"accountant_type\": FLAGS.accountant_type,\n            \"task_id\": 0,\n            \"batch_size\": FLAGS.batch_size,\n            \"projection_dimensions\": FLAGS.projection_dimensions,\n            \"default_gradient_l2norm_bound\":\n            network_parameters.default_gradient_l2norm_bound,\n            \"num_hidden_layers\": FLAGS.num_hidden_layers,\n            \"hidden_layer_num_units\": FLAGS.hidden_layer_num_units,\n            \"num_examples\": NUM_TRAINING_IMAGES,\n            \"learning_rate\": FLAGS.lr,\n            \"end_learning_rate\": FLAGS.end_lr,\n            \"learning_rate_saturate_epochs\": FLAGS.lr_saturate_epochs\n           }\n  # Log different privacy parameters dependent on the accountant type.\n  if FLAGS.accountant_type == \"Amortized\":\n    params.update({\"flag_eps\": FLAGS.eps,\n                   \"flag_delta\": FLAGS.delta,\n                   \"flag_pca_eps\": FLAGS.pca_eps,\n                   \"flag_pca_delta\": FLAGS.pca_delta,\n                  })\n  elif FLAGS.accountant_type == \"Moments\":\n    params.update({\"sigma\": FLAGS.sigma,\n                   \"pca_sigma\": FLAGS.pca_sigma,\n                  })\n\n  with tf.Graph().as_default(), tf.Session() as sess, tf.device('/cpu:0'):\n    # Create the basic Mnist model.\n    images, labels = MnistInput(mnist_train_file, batch_size, FLAGS.randomize)\n\n    logits, projection, training_params = utils.BuildNetwork(\n        images, network_parameters)\n\n    cost = tf.nn.softmax_cross_entropy_with_logits(\n        logits, tf.one_hot(labels, 10))\n\n    # The actual cost is the average across the examples.\n    cost = tf.reduce_sum(cost, [0]) / batch_size\n\n    
if FLAGS.accountant_type == \"Amortized\":\n      priv_accountant = accountant.AmortizedAccountant(NUM_TRAINING_IMAGES)\n      sigma = None\n      pca_sigma = None\n      with_privacy = FLAGS.eps > 0\n    elif FLAGS.accountant_type == \"Moments\":\n      priv_accountant = accountant.GaussianMomentsAccountant(\n          NUM_TRAINING_IMAGES)\n      sigma = FLAGS.sigma\n      pca_sigma = FLAGS.pca_sigma\n      with_privacy = FLAGS.sigma > 0\n    else:\n      raise ValueError(\"Undefined accountant type, needs to be \"\n                       \"Amortized or Moments, but got %s\" % FLAGS.accountant)\n    # Note: Here and below, we scale down the l2norm_bound by\n    # batch_size. This is because per_example_gradients computes the\n    # gradient of the minibatch loss with respect to each individual\n    # example, and the minibatch loss (for our model) is the *average*\n    # loss over examples in the minibatch. Hence, the scale of the\n    # per-example gradients goes like 1 / batch_size.\n    gaussian_sanitizer = sanitizer.AmortizedGaussianSanitizer(\n        priv_accountant,\n        [network_parameters.default_gradient_l2norm_bound / batch_size, True])\n\n    for var in training_params:\n      if \"gradient_l2norm_bound\" in training_params[var]:\n        l2bound = training_params[var][\"gradient_l2norm_bound\"] / batch_size\n        gaussian_sanitizer.set_option(var,\n                                      sanitizer.ClipOption(l2bound, True))\n    lr = tf.placeholder(tf.float32)\n    eps = tf.placeholder(tf.float32)\n    delta = tf.placeholder(tf.float32)\n\n    init_ops = []\n    if network_parameters.projection_type == \"PCA\":\n      with tf.variable_scope(\"pca\"):\n        # Compute differentially private PCA.\n        all_data, _ = MnistInput(mnist_train_file, NUM_TRAINING_IMAGES, False)\n        pca_projection = dp_pca.ComputeDPPrincipalProjection(\n            all_data, network_parameters.projection_dimensions,\n            gaussian_sanitizer, 
[FLAGS.pca_eps, FLAGS.pca_delta], pca_sigma)\n        assign_pca_proj = tf.assign(projection, pca_projection)\n        init_ops.append(assign_pca_proj)\n\n    # Add global_step\n    global_step = tf.Variable(0, dtype=tf.int32, trainable=False,\n                              name=\"global_step\")\n\n    if with_privacy:\n      gd_op = dp_optimizer.DPGradientDescentOptimizer(\n          lr,\n          [eps, delta],\n          gaussian_sanitizer,\n          sigma=sigma,\n          batches_per_lot=FLAGS.batches_per_lot).minimize(\n              cost, global_step=global_step)\n    else:\n      gd_op = tf.train.GradientDescentOptimizer(lr).minimize(cost)\n\n    saver = tf.train.Saver()\n    coord = tf.train.Coordinator()\n    _ = tf.train.start_queue_runners(sess=sess, coord=coord)\n\n    # We need to maintain the initialization sequence.\n    for v in tf.trainable_variables():\n      sess.run(tf.initialize_variables([v]))\n    sess.run(tf.initialize_all_variables())\n    sess.run(init_ops)\n\n    results = []\n    start_time = time.time()\n    prev_time = start_time\n    filename = \"results-0.json\"\n    log_path = os.path.join(save_path, filename)\n\n    target_eps = [float(s) for s in FLAGS.target_eps.split(\",\")]\n    if FLAGS.accountant_type == \"Amortized\":\n      # Only matters if --terminate_based_on_privacy is true.\n      target_eps = [max(target_eps)]\n    max_target_eps = max(target_eps)\n\n    lot_size = FLAGS.batches_per_lot * FLAGS.batch_size\n    lots_per_epoch = NUM_TRAINING_IMAGES / lot_size\n    for step in xrange(num_steps):\n      epoch = step / lots_per_epoch\n      curr_lr = utils.VaryRate(FLAGS.lr, FLAGS.end_lr,\n                               FLAGS.lr_saturate_epochs, epoch)\n      curr_eps = utils.VaryRate(FLAGS.eps, FLAGS.end_eps,\n                                FLAGS.eps_saturate_epochs, epoch)\n      for _ in xrange(FLAGS.batches_per_lot):\n        _ = sess.run(\n            [gd_op], feed_dict={lr: curr_lr, eps: curr_eps, delta: 
FLAGS.delta})\n      sys.stderr.write(\"step: %d\\n\" % step)\n\n      # See if we should stop training due to exceeded privacy budget:\n      should_terminate = False\n      terminate_spent_eps_delta = None\n      if with_privacy and FLAGS.terminate_based_on_privacy:\n        terminate_spent_eps_delta = priv_accountant.get_privacy_spent(\n            sess, target_eps=[max_target_eps])[0]\n        # For the Moments accountant, we should always have\n        # spent_eps == max_target_eps.\n        if (terminate_spent_eps_delta.spent_delta > FLAGS.target_delta or\n            terminate_spent_eps_delta.spent_eps > max_target_eps):\n          should_terminate = True\n\n      if (eval_steps > 0 and (step + 1) % eval_steps == 0) or should_terminate:\n        if with_privacy:\n          spent_eps_deltas = priv_accountant.get_privacy_spent(\n              sess, target_eps=target_eps)\n        else:\n          spent_eps_deltas = [accountant.EpsDelta(0, 0)]\n        for spent_eps, spent_delta in spent_eps_deltas:\n          sys.stderr.write(\"spent privacy: eps %.4f delta %.5g\\n\" % (\n              spent_eps, spent_delta))\n\n        saver.save(sess, save_path=save_path + \"/ckpt\")\n        train_accuracy, _ = Eval(mnist_train_file, network_parameters,\n                                 num_testing_images=NUM_TESTING_IMAGES,\n                                 randomize=True, load_path=save_path)\n        sys.stderr.write(\"train_accuracy: %.2f\\n\" % train_accuracy)\n        test_accuracy, mistakes = Eval(mnist_test_file, network_parameters,\n                                       num_testing_images=NUM_TESTING_IMAGES,\n                                       randomize=False, load_path=save_path,\n                                       save_mistakes=FLAGS.save_mistakes)\n        sys.stderr.write(\"eval_accuracy: %.2f\\n\" % test_accuracy)\n\n        curr_time = time.time()\n        elapsed_time = curr_time - prev_time\n        prev_time = curr_time\n\n        
results.append({\"step\": step+1,  # Number of lots trained so far.\n                        \"elapsed_secs\": elapsed_time,\n                        \"spent_eps_deltas\": spent_eps_deltas,\n                        \"train_accuracy\": train_accuracy,\n                        \"test_accuracy\": test_accuracy,\n                        \"mistakes\": mistakes})\n        loginfo = {\"elapsed_secs\": curr_time-start_time,\n                   \"spent_eps_deltas\": spent_eps_deltas,\n                   \"train_accuracy\": train_accuracy,\n                   \"test_accuracy\": test_accuracy,\n                   \"num_training_steps\": step+1,  # Steps so far.\n                   \"mistakes\": mistakes,\n                   \"result_series\": results}\n        loginfo.update(params)\n        if log_path:\n          with tf.gfile.Open(log_path, \"w\") as f:\n            json.dump(loginfo, f, indent=2)\n            f.write(\"\\n\")\n\n      if should_terminate:\n        break\n\n\ndef main(_):\n  network_parameters = utils.NetworkParameters()\n\n  # If the ASCII proto isn't specified, then construct a config protobuf based\n  # on 3 flags.\n  network_parameters.input_size = IMAGE_SIZE ** 2\n  network_parameters.default_gradient_l2norm_bound = (\n      FLAGS.default_gradient_l2norm_bound)\n  if FLAGS.projection_dimensions > 0 and FLAGS.num_conv_layers > 0:\n    raise ValueError(\"Currently you can't do PCA and have convolutions \"\n                     \"at the same time. Pick one\")\n\n    # could add support for PCA after convolutions.\n    # Currently BuildNetwork can build the network with conv followed by\n    # projection, but the PCA training works on data, rather than data run\n    # through a few layers. 
Will need to init the convs before running the\n    # PCA, and need to change the PCA subroutine to take a network and perhaps\n    # allow for batched inputs, to handle larger datasets.\n  if FLAGS.num_conv_layers > 0:\n    conv = utils.ConvParameters()\n    conv.name = \"conv1\"\n    conv.in_channels = 1\n    conv.out_channels = 128\n    conv.num_outputs = 128 * 14 * 14\n    network_parameters.conv_parameters.append(conv)\n    # defaults for the rest: 5x5,stride 1, relu, maxpool 2x2,stride 2.\n    # insize 28x28, bias, stddev 0.1, non-trainable.\n  if FLAGS.num_conv_layers > 1:\n    conv = utils.ConvParameters()\n    conv.name = \"conv2\"\n    conv.in_channels = 128\n    conv.out_channels = 128\n    conv.num_outputs = 128 * 7 * 7\n    conv.in_size = 14\n    # defaults for the rest: 5x5,stride 1, relu, maxpool 2x2,stride 2.\n    # bias, stddev 0.1, non-trainable.\n    network_parameters.conv_parameters.append(conv)\n\n  if FLAGS.num_conv_layers > 2:\n    raise ValueError(\"Currently --num_conv_layers must be 0, 1 or 2. \"\n                     \"Manually create a network_parameters proto for more.\")\n\n  if FLAGS.projection_dimensions > 0:\n    network_parameters.projection_type = \"PCA\"\n    network_parameters.projection_dimensions = FLAGS.projection_dimensions\n  for i in xrange(FLAGS.num_hidden_layers):\n    hidden = utils.LayerParameters()\n    hidden.name = \"hidden%d\" % i\n    hidden.num_units = FLAGS.hidden_layer_num_units\n    hidden.relu = True\n    hidden.with_bias = False\n    hidden.trainable = not FLAGS.freeze_bottom_layers\n    network_parameters.layer_parameters.append(hidden)\n\n  logits = utils.LayerParameters()\n  logits.name = \"logits\"\n  logits.num_units = 10\n  logits.relu = False\n  logits.with_bias = False\n  network_parameters.layer_parameters.append(logits)\n\n  Train(FLAGS.training_data_path,\n        FLAGS.eval_data_path,\n        network_parameters,\n        FLAGS.num_training_steps,\n        FLAGS.save_path,\n        
eval_steps=FLAGS.eval_steps)\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_optimizer/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//differential_privacy/...\",\n    ],\n)\n\npy_library(\n    name = \"utils\",\n    srcs = [\n        \"utils.py\",\n    ],\n    deps = [\n    ],\n)\n\npy_library(\n    name = \"dp_pca\",\n    srcs = [\n        \"dp_pca.py\",\n    ],\n    deps = [\n    ],\n)\n\npy_library(\n    name = \"dp_optimizer\",\n    srcs = [\n        \"dp_optimizer.py\",\n        \"sanitizer.py\",\n    ],\n    deps = [\n        \":utils\",\n        \"//differential_privacy/dp_sgd/per_example_gradients\",\n        \"//differential_privacy/privacy_accountant/tf:accountant\",\n    ],\n)\n\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_optimizer/dp_optimizer.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Differentially private optimizers.\n\"\"\"\nfrom __future__ import division\n\nimport tensorflow as tf\n\nfrom differential_privacy.dp_sgd.dp_optimizer import utils\nfrom differential_privacy.dp_sgd.per_example_gradients import per_example_gradients\n\n\nclass DPGradientDescentOptimizer(tf.train.GradientDescentOptimizer):\n  \"\"\"Differentially private gradient descent optimizer.\n  \"\"\"\n\n  def __init__(self, learning_rate, eps_delta, sanitizer,\n               sigma=None, use_locking=False, name=\"DPGradientDescent\",\n               batches_per_lot=1):\n    \"\"\"Construct a differentially private gradient descent optimizer.\n\n    The optimizer uses fixed privacy budget for each batch of training.\n\n    Args:\n      learning_rate: for GradientDescentOptimizer.\n      eps_delta: EpsDelta pair for each epoch.\n      sanitizer: for sanitizing the gradient.\n      sigma: noise sigma. 
If None, use eps_delta pair to compute sigma;\n        otherwise use supplied sigma directly.\n      use_locking: use locking.\n      name: name for the object.\n      batches_per_lot: Number of batches in a lot.\n    \"\"\"\n\n    super(DPGradientDescentOptimizer, self).__init__(learning_rate,\n                                                     use_locking, name)\n\n    # Also, if needed, define the gradient accumulators\n    self._batches_per_lot = batches_per_lot\n    self._grad_accum_dict = {}\n    if batches_per_lot > 1:\n      self._batch_count = tf.Variable(1, dtype=tf.int32, trainable=False,\n                                      name=\"batch_count\")\n      var_list = tf.trainable_variables()\n      with tf.variable_scope(\"grad_acc_for\"):\n        for var in var_list:\n          v_grad_accum = tf.Variable(tf.zeros_like(var),\n                                     trainable=False,\n                                     name=utils.GetTensorOpName(var))\n          self._grad_accum_dict[var.name] = v_grad_accum\n\n    self._eps_delta = eps_delta\n    self._sanitizer = sanitizer\n    self._sigma = sigma\n\n  def compute_sanitized_gradients(self, loss, var_list=None,\n                                  add_noise=True):\n    \"\"\"Compute the sanitized gradients.\n\n    Args:\n      loss: the loss tensor.\n      var_list: the optional variables.\n      add_noise: if true, then add noise. 
Always clip.\n    Returns:\n      a pair of (list of sanitized gradients) and privacy spending accumulation\n      operations.\n    Raises:\n      TypeError: if var_list contains non-variable.\n    \"\"\"\n\n    self._assert_valid_dtypes([loss])\n\n    xs = [tf.convert_to_tensor(x) for x in var_list]\n    px_grads = per_example_gradients.PerExampleGradients(loss, xs)\n    sanitized_grads = []\n    for px_grad, v in zip(px_grads, var_list):\n      tensor_name = utils.GetTensorOpName(v)\n      sanitized_grad = self._sanitizer.sanitize(\n          px_grad, self._eps_delta, sigma=self._sigma,\n          tensor_name=tensor_name, add_noise=add_noise,\n          num_examples=self._batches_per_lot * tf.slice(\n              tf.shape(px_grad), [0], [1]))\n      sanitized_grads.append(sanitized_grad)\n\n    return sanitized_grads\n\n  def minimize(self, loss, global_step=None, var_list=None,\n               name=None):\n    \"\"\"Minimize using sanitized gradients.\n\n    This gets a var_list which is the list of trainable variables.\n    For each var in var_list, we defined a grad_accumulator variable\n    during init. When batches_per_lot > 1, we accumulate the gradient\n    update in those. At the end of each lot, we apply the update back to\n    the variable. This has the effect that for each lot we compute\n    gradients at the point at the beginning of the lot, and then apply one\n    update at the end of the lot. 
In other words, semantically, we are doing\n    SGD with one lot being the equivalent of one usual batch of size\n    batch_size * batches_per_lot.\n    This allows us to simulate larger batches than our memory size would permit.\n\n    The lr and the num_steps are in the lot world.\n\n    Args:\n      loss: the loss tensor.\n      global_step: the optional global step.\n      var_list: the optional variables.\n      name: the optional name.\n    Returns:\n      the operation that runs one step of DP gradient descent.\n    \"\"\"\n\n    # First validate the var_list\n\n    if var_list is None:\n      var_list = tf.trainable_variables()\n    for var in var_list:\n      if not isinstance(var, tf.Variable):\n        raise TypeError(\"Argument is not a variable.Variable: %s\" % var)\n\n    # Modification: apply gradient once every batches_per_lot many steps.\n    # This may lead to smaller error\n\n    if self._batches_per_lot == 1:\n      sanitized_grads = self.compute_sanitized_gradients(\n          loss, var_list=var_list)\n\n      grads_and_vars = zip(sanitized_grads, var_list)\n      self._assert_valid_dtypes([v for g, v in grads_and_vars if g is not None])\n\n      apply_grads = self.apply_gradients(grads_and_vars,\n                                         global_step=global_step, name=name)\n      return apply_grads\n\n    # Condition for deciding whether to accumulate the gradient\n    # or actually apply it.\n    # we use a private self_batch_count to keep track of number of batches.\n    # global step will count number of lots processed.\n\n    update_cond = tf.equal(tf.constant(0),\n                           tf.mod(self._batch_count,\n                                  tf.constant(self._batches_per_lot)))\n\n    # Things to do for batches other than last of the lot.\n    # Add non-noisy clipped grads to shadow variables.\n\n    def non_last_in_lot_op(loss, var_list):\n      \"\"\"Ops to do for a typical batch.\n\n      For a batch that is not the last one in 
the lot, we simply compute the\n      sanitized gradients and apply them to the grad_acc variables.\n\n      Args:\n        loss: loss function tensor\n        var_list: list of variables\n      Returns:\n        A tensorflow op to do the updates to the gradient accumulators\n      \"\"\"\n      sanitized_grads = self.compute_sanitized_gradients(\n          loss, var_list=var_list, add_noise=False)\n\n      update_ops_list = []\n      for var, grad in zip(var_list, sanitized_grads):\n        grad_acc_v = self._grad_accum_dict[var.name]\n        update_ops_list.append(grad_acc_v.assign_add(grad))\n      update_ops_list.append(self._batch_count.assign_add(1))\n      return tf.group(*update_ops_list)\n\n    # Things to do for last batch of a lot.\n    # Add noisy clipped grads to accumulator.\n    # Apply accumulated grads to vars.\n\n    def last_in_lot_op(loss, var_list, global_step):\n      \"\"\"Ops to do for last batch in a lot.\n\n      For the last batch in the lot, we first add the sanitized gradients to\n      the gradient acc variables, and then apply these\n      values over to the original variables (via an apply gradient)\n\n      Args:\n        loss: loss function tensor\n        var_list: list of variables\n        global_step: optional global step to be passed to apply_gradients\n      Returns:\n        A tensorflow op to push updates from shadow vars to real vars.\n      \"\"\"\n\n      # We add noise in the last lot. 
This is why we need this code snippet\n      # that looks almost identical to the non_last_op case here.\n      sanitized_grads = self.compute_sanitized_gradients(\n          loss, var_list=var_list, add_noise=True)\n\n      normalized_grads = []\n      for var, grad in zip(var_list, sanitized_grads):\n        grad_acc_v = self._grad_accum_dict[var.name]\n        # To handle the lr difference per lot vs per batch, we divide the\n        # update by number of batches per lot.\n        normalized_grad = tf.div(grad_acc_v.assign_add(grad),\n                                 tf.to_float(self._batches_per_lot))\n\n        normalized_grads.append(normalized_grad)\n\n      with tf.control_dependencies(normalized_grads):\n        grads_and_vars = zip(normalized_grads, var_list)\n        self._assert_valid_dtypes(\n            [v for g, v in grads_and_vars if g is not None])\n        apply_san_grads = self.apply_gradients(grads_and_vars,\n                                               global_step=global_step,\n                                               name=\"apply_grads\")\n\n      # Now reset the accumulators to zero\n      resets_list = []\n      with tf.control_dependencies([apply_san_grads]):\n        for _, acc in self._grad_accum_dict.items():\n          reset = tf.assign(acc, tf.zeros_like(acc))\n          resets_list.append(reset)\n      resets_list.append(self._batch_count.assign_add(1))\n\n      last_step_update = tf.group(*([apply_san_grads] + resets_list))\n      return last_step_update\n    # pylint: disable=g-long-lambda\n    update_op = tf.cond(update_cond,\n                        lambda: last_in_lot_op(\n                            loss, var_list,\n                            global_step),\n                        lambda: non_last_in_lot_op(\n                            loss, var_list))\n    return tf.group(update_op)\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_optimizer/dp_pca.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Differentially private PCA.\n\"\"\"\nimport tensorflow as tf\n\nfrom differential_privacy.dp_sgd.dp_optimizer import sanitizer as san\n\n\ndef ComputeDPPrincipalProjection(data, projection_dims,\n                                 sanitizer, eps_delta, sigma):\n  \"\"\"Compute differentially private projection.\n\n  Args:\n    data: the input data, each row is a data vector.\n    projection_dims: the projection dimension.\n    sanitizer: the sanitizer used for achieving privacy.\n    eps_delta: (eps, delta) pair.\n    sigma: if not None, use noise sigma; otherwise compute it using\n      eps_delta pair.\n  Returns:\n    A projection matrix with projection_dims columns.\n  \"\"\"\n\n  eps, delta = eps_delta\n  # Normalize each row.\n  normalized_data = tf.nn.l2_normalize(data, 1)\n  covar = tf.matmul(tf.transpose(normalized_data), normalized_data)\n  saved_shape = tf.shape(covar)\n  num_examples = tf.slice(tf.shape(data), [0], [1])\n  if eps > 0:\n    # Since the data is already normalized, there is no need to clip\n    # the covariance matrix.\n    assert delta > 0\n    saned_covar = sanitizer.sanitize(\n        tf.reshape(covar, [1, -1]), eps_delta, sigma=sigma,\n        option=san.ClipOption(1.0, False), num_examples=num_examples)\n    
saned_covar = tf.reshape(saned_covar, saved_shape)\n    # Symmetrize saned_covar. This also reduces the noise variance.\n    saned_covar = 0.5 * (saned_covar + tf.transpose(saned_covar))\n  else:\n    saned_covar = covar\n\n  # Compute the eigen decomposition of the covariance matrix, and\n  # return the top projection_dims eigen vectors, represented as columns of\n  # the projection matrix.\n  eigvals, eigvecs = tf.self_adjoint_eig(saned_covar)\n  _, topk_indices = tf.nn.top_k(eigvals, projection_dims)\n  topk_indices = tf.reshape(topk_indices, [projection_dims])\n  # Gather and return the corresponding eigenvectors.\n  return tf.transpose(tf.gather(tf.transpose(eigvecs), topk_indices))\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_optimizer/sanitizer.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Defines Sanitizer class for sanitizing tensors.\n\nA sanitizer first limits the sensitivity of a tensor and then adds noise\nto the tensor. The parameters are determined by the privacy_spending and the\nother parameters. It also uses an accountant to keep track of the privacy\nspending.\n\"\"\"\nfrom __future__ import division\n\nimport collections\n\nimport tensorflow as tf\n\nfrom differential_privacy.dp_sgd.dp_optimizer import utils\n\n\nClipOption = collections.namedtuple(\"ClipOption\",\n                                    [\"l2norm_bound\", \"clip\"])\n\n\nclass AmortizedGaussianSanitizer(object):\n  \"\"\"Sanitizer with Gaussian noise and amortized privacy spending accounting.\n\n  This sanitizes a tensor by first clipping the tensor, summing the tensor\n  and then adding appropriate amount of noise. It also uses an amortized\n  accountant to keep track of privacy spending.\n  \"\"\"\n\n  def __init__(self, accountant, default_option):\n    \"\"\"Construct an AmortizedGaussianSanitizer.\n\n    Args:\n      accountant: the privacy accountant. 
Expect an amortized one.\n      default_option: the default ClipOption.\n    \"\"\"\n\n    self._accountant = accountant\n    self._default_option = default_option\n    self._options = {}\n\n  def set_option(self, tensor_name, option):\n    \"\"\"Set options for an individual tensor.\n\n    Args:\n      tensor_name: the name of the tensor.\n      option: clip option.\n    \"\"\"\n\n    self._options[tensor_name] = option\n\n  def sanitize(self, x, eps_delta, sigma=None,\n               option=ClipOption(None, None), tensor_name=None,\n               num_examples=None, add_noise=True):\n    \"\"\"Sanitize the given tensor.\n\n    This sanitizes a given tensor by first applying l2 norm clipping and then\n    adding Gaussian noise. It calls the privacy accountant for updating the\n    privacy spending.\n\n    Args:\n      x: the tensor to sanitize.\n      eps_delta: a pair of eps, delta for (eps,delta)-DP. Use it to\n        compute sigma if sigma is None.\n      sigma: if sigma is not None, use sigma.\n      option: a ClipOption which, if supplied, is used for\n        clipping and adding noise.\n      tensor_name: the name of the tensor.\n      num_examples: if None, use the number of \"rows\" of x.\n      add_noise: if True, then add noise, else just clip.\n    Returns:\n      a pair of sanitized tensor and the operation to accumulate privacy\n      spending.\n    \"\"\"\n\n    if sigma is None:\n      # pylint: disable=unpacking-non-sequence\n      eps, delta = eps_delta\n      with tf.control_dependencies(\n          [tf.Assert(tf.greater(eps, 0),\n                     [\"eps needs to be greater than 0\"]),\n           tf.Assert(tf.greater(delta, 0),\n                     [\"delta needs to be greater than 0\"])]):\n        # The following formula is taken from\n        #   Dwork and Roth, The Algorithmic Foundations of Differential\n        #   Privacy, Appendix A.\n        #   http://www.cis.upenn.edu/~aaroth/Papers/privacybook.pdf\n        sigma = tf.sqrt(2.0 * 
tf.log(1.25 / delta)) / eps\n\n    l2norm_bound, clip = option\n    if l2norm_bound is None:\n      l2norm_bound, clip = self._default_option\n      if ((tensor_name is not None) and\n          (tensor_name in self._options)):\n        l2norm_bound, clip = self._options[tensor_name]\n    if clip:\n      x = utils.BatchClipByL2norm(x, l2norm_bound)\n\n    if add_noise:\n      if num_examples is None:\n        num_examples = tf.slice(tf.shape(x), [0], [1])\n      privacy_accum_op = self._accountant.accumulate_privacy_spending(\n          eps_delta, sigma, num_examples)\n      with tf.control_dependencies([privacy_accum_op]):\n        saned_x = utils.AddGaussianNoise(tf.reduce_sum(x, 0),\n                                         sigma * l2norm_bound)\n    else:\n      saned_x = tf.reduce_sum(x, 0)\n    return saned_x\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/dp_optimizer/utils.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Utils for building and training NN models.\n\"\"\"\nfrom __future__ import division\n\nimport math\n\nimport numpy\nimport tensorflow as tf\n\n\nclass LayerParameters(object):\n  \"\"\"class that defines a non-conv layer.\"\"\"\n  def __init__(self):\n    self.name = \"\"\n    self.num_units = 0\n    self.with_bias = False\n    self.relu = False\n    self.gradient_l2norm_bound = 0.0\n    self.bias_gradient_l2norm_bound = 0.0\n    self.trainable = True\n    self.weight_decay = 0.0\n\n\nclass ConvParameters(object):\n  \"\"\"class that defines a conv layer.\"\"\"\n  def __init__(self):\n    self.patch_size = 5\n    self.stride = 1\n    self.in_channels = 1\n    self.out_channels = 0\n    self.with_bias = True\n    self.relu = True\n    self.max_pool = True\n    self.max_pool_size = 2\n    self.max_pool_stride = 2\n    self.trainable = False\n    self.in_size = 28\n    self.name = \"\"\n    self.num_outputs = 0\n    self.bias_stddev = 0.1\n\n\n# Parameters for a layered neural network.\nclass NetworkParameters(object):\n  \"\"\"class that defines the overall model structure.\"\"\"\n  def __init__(self):\n    self.input_size = 0\n    self.projection_type = 'NONE'  # NONE, RANDOM, PCA\n    self.projection_dimensions = 0\n    
self.default_gradient_l2norm_bound = 0.0\n    self.layer_parameters = []  # List of LayerParameters\n    self.conv_parameters = []  # List of ConvParameters\n\n\ndef GetTensorOpName(x):\n  \"\"\"Get the name of the op that created a tensor.\n\n  Useful for naming related tensors, as ':' in name field of op is not permitted\n\n  Args:\n    x: the input tensor.\n  Returns:\n    the name of the op.\n  \"\"\"\n\n  t = x.name.rsplit(\":\", 1)\n  if len(t) == 1:\n    return x.name\n  else:\n    return t[0]\n\n\ndef BuildNetwork(inputs, network_parameters):\n  \"\"\"Build a network using the given parameters.\n\n  Args:\n    inputs: a Tensor of floats containing the input data.\n    network_parameters: NetworkParameters object\n      that describes the parameters for the network.\n  Returns:\n    output, training_parameters: where the outputs (a tensor) is the output\n      of the network, and training_parameters (a dictionary that maps the\n      name of each variable to a dictionary of parameters) is the parameters\n      used during training.\n  \"\"\"\n\n  training_parameters = {}\n  num_inputs = network_parameters.input_size\n  outputs = inputs\n  projection = None\n\n  # First apply convolutions, if needed\n  for conv_param in network_parameters.conv_parameters:\n    outputs = tf.reshape(\n        outputs,\n        [-1, conv_param.in_size, conv_param.in_size,\n         conv_param.in_channels])\n    conv_weights_name = \"%s_conv_weight\" % (conv_param.name)\n    conv_bias_name = \"%s_conv_bias\" % (conv_param.name)\n    conv_std_dev = 1.0 / (conv_param.patch_size\n                          * math.sqrt(conv_param.in_channels))\n    conv_weights = tf.Variable(\n        tf.truncated_normal([conv_param.patch_size,\n                             conv_param.patch_size,\n                             conv_param.in_channels,\n                             conv_param.out_channels],\n                            stddev=conv_std_dev),\n        trainable=conv_param.trainable,\n      
  name=conv_weights_name)\n    conv_bias = tf.Variable(\n        tf.truncated_normal([conv_param.out_channels],\n                            stddev=conv_param.bias_stddev),\n        trainable=conv_param.trainable,\n        name=conv_bias_name)\n    training_parameters[conv_weights_name] = {}\n    training_parameters[conv_bias_name] = {}\n    conv = tf.nn.conv2d(outputs, conv_weights,\n                        strides=[1, conv_param.stride,\n                                 conv_param.stride, 1],\n                        padding=\"SAME\")\n    relud = tf.nn.relu(conv + conv_bias)\n    mpd = tf.nn.max_pool(relud, ksize=[1,\n                                       conv_param.max_pool_size,\n                                       conv_param.max_pool_size, 1],\n                         strides=[1, conv_param.max_pool_stride,\n                                  conv_param.max_pool_stride, 1],\n                         padding=\"SAME\")\n    outputs = mpd\n    num_inputs = conv_param.num_outputs\n    # this should equal\n    # in_size * in_size * out_channels / (stride * max_pool_stride)\n\n  # once all the convs are done, reshape to make it flat\n  outputs = tf.reshape(outputs, [-1, num_inputs])\n\n  # Now project, if needed\n  if network_parameters.projection_type != \"NONE\":\n    projection = tf.Variable(tf.truncated_normal(\n        [num_inputs, network_parameters.projection_dimensions],\n        stddev=1.0 / math.sqrt(num_inputs)), trainable=False, name=\"projection\")\n    num_inputs = network_parameters.projection_dimensions\n    outputs = tf.matmul(outputs, projection)\n\n  # Now apply any other layers\n\n  for layer_parameters in network_parameters.layer_parameters:\n    num_units = layer_parameters.num_units\n    hidden_weights_name = \"%s_weight\" % (layer_parameters.name)\n    hidden_weights = tf.Variable(\n        tf.truncated_normal([num_inputs, num_units],\n                            stddev=1.0 / math.sqrt(num_inputs)),\n        name=hidden_weights_name, 
trainable=layer_parameters.trainable)\n    training_parameters[hidden_weights_name] = {}\n    if layer_parameters.gradient_l2norm_bound:\n      training_parameters[hidden_weights_name][\"gradient_l2norm_bound\"] = (\n          layer_parameters.gradient_l2norm_bound)\n    if layer_parameters.weight_decay:\n      training_parameters[hidden_weights_name][\"weight_decay\"] = (\n          layer_parameters.weight_decay)\n\n    outputs = tf.matmul(outputs, hidden_weights)\n    if layer_parameters.with_bias:\n      hidden_biases_name = \"%s_bias\" % (layer_parameters.name)\n      hidden_biases = tf.Variable(tf.zeros([num_units]),\n                                  name=hidden_biases_name)\n      training_parameters[hidden_biases_name] = {}\n      if layer_parameters.bias_gradient_l2norm_bound:\n        training_parameters[hidden_biases_name][\n            \"bias_gradient_l2norm_bound\"] = (\n                layer_parameters.bias_gradient_l2norm_bound)\n\n      outputs += hidden_biases\n    if layer_parameters.relu:\n      outputs = tf.nn.relu(outputs)\n    # num_inputs for the next layer is num_units in the current layer.\n    num_inputs = num_units\n\n  return outputs, projection, training_parameters\n\n\ndef VaryRate(start, end, saturate_epochs, epoch):\n  \"\"\"Compute a linearly varying number.\n\n  Decrease linearly from start to end until epoch saturate_epochs.\n\n  Args:\n    start: the initial number.\n    end: the end number.\n    saturate_epochs: after this we do not reduce the number; if less than\n      or equal to zero, just return start.\n    epoch: the current learning epoch.\n  Returns:\n    the calculated number.\n  \"\"\"\n  if saturate_epochs <= 0:\n    return start\n\n  step = (start - end) / (saturate_epochs - 1)\n  if epoch < saturate_epochs:\n    return start - step * epoch\n  else:\n    return end\n\n\ndef BatchClipByL2norm(t, upper_bound, name=None):\n  \"\"\"Clip an array of tensors by L2 norm.\n\n  Shrink each dimension-0 slice of tensor (for 
matrix it is each row) such\n  that the l2 norm is at most upper_bound. Here we clip each row as it\n  corresponds to each example in the batch.\n\n  Args:\n    t: the input tensor.\n    upper_bound: the upper bound of the L2 norm.\n    name: optional name.\n  Returns:\n    the clipped tensor.\n  \"\"\"\n\n  assert upper_bound > 0\n  with tf.op_scope([t, upper_bound], name, \"batch_clip_by_l2norm\") as name:\n    saved_shape = tf.shape(t)\n    batch_size = tf.slice(saved_shape, [0], [1])\n    t2 = tf.reshape(t, tf.concat(0, [batch_size, [-1]]))\n    upper_bound_inv = tf.fill(tf.slice(saved_shape, [0], [1]),\n                              tf.constant(1.0/upper_bound))\n    # Add a small number to avoid divide by 0\n    l2norm_inv = tf.rsqrt(tf.reduce_sum(t2 * t2, [1]) + 0.000001)\n    scale = tf.minimum(l2norm_inv, upper_bound_inv) * upper_bound\n    clipped_t = tf.matmul(tf.diag(scale), t2)\n    clipped_t = tf.reshape(clipped_t, saved_shape, name=name)\n  return clipped_t\n\n\ndef SoftThreshold(t, threshold_ratio, name=None):\n  \"\"\"Soft-threshold a tensor by the mean value.\n\n  Soft-threshold each dimension-0 vector (for matrix it is each column) by\n  the mean of absolute value multiplied by the threshold_ratio factor. 
Here\n  we soft-threshold each column as it corresponds to each unit in a layer.\n\n  Args:\n    t: the input tensor.\n    threshold_ratio: the threshold ratio.\n    name: the optional name for the returned tensor.\n  Returns:\n    the thresholded tensor, where each entry is soft-thresholded by\n    threshold_ratio times the mean of the absolute value of each column.\n  \"\"\"\n\n  assert threshold_ratio >= 0\n  with tf.op_scope([t, threshold_ratio], name, \"soft_thresholding\") as name:\n    saved_shape = tf.shape(t)\n    t2 = tf.reshape(t, tf.concat(0, [tf.slice(saved_shape, [0], [1]), [-1]]))\n    t_abs = tf.abs(t2)\n    t_x = tf.sign(t2) * tf.nn.relu(t_abs -\n                                   (tf.reduce_mean(t_abs, [0],\n                                                   keep_dims=True) *\n                                    threshold_ratio))\n    return tf.reshape(t_x, saved_shape, name=name)\n\n\ndef AddGaussianNoise(t, sigma, name=None):\n  \"\"\"Add i.i.d. Gaussian noise (0, sigma^2) to every entry of t.\n\n  Args:\n    t: the input tensor.\n    sigma: the stddev of the Gaussian noise.\n    name: optional name.\n  Returns:\n    the noisy tensor.\n  \"\"\"\n\n  with tf.op_scope([t, sigma], name, \"add_gaussian_noise\") as name:\n    noisy_t = t + tf.random_normal(tf.shape(t), stddev=sigma)\n  return noisy_t\n\n\ndef GenerateBinomialTable(m):\n  \"\"\"Generate binomial table.\n\n  Args:\n    m: the size of the table.\n  Returns:\n    A two dimensional array T where T[i][j] = (i choose j),\n    for 0 <= i, j <= m.\n  \"\"\"\n\n  table = numpy.zeros((m + 1, m + 1), dtype=numpy.float64)\n  for i in range(m + 1):\n    table[i, 0] = 1\n  for i in range(1, m + 1):\n    for j in range(1, m + 1):\n      v = table[i - 1, j] + table[i - 1, j - 1]\n      assert not math.isnan(v) and not math.isinf(v)\n      table[i, j] = v\n  return tf.convert_to_tensor(table)\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/per_example_gradients/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//differential_privacy/...\",\n    ],\n)\n\npy_library(\n    name = \"per_example_gradients\",\n    srcs = [\n        \"per_example_gradients.py\",\n    ],\n    deps = [\n    ],\n)\n\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/dp_sgd/per_example_gradients/per_example_gradients.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Per-example gradients for selected ops.\"\"\"\n\nimport collections\n\nimport tensorflow as tf\n\nOrderedDict = collections.OrderedDict\n\n\ndef _ListUnion(list_1, list_2):\n  \"\"\"Returns the union of two lists.\n\n  Python sets can have a non-deterministic iteration order. In some\n  contexts, this could lead to TensorFlow producing two different\n  programs when the same Python script is run twice. In these contexts\n  we use lists instead of sets.\n\n  This function is not designed to be especially fast and should only\n  be used with small lists.\n\n  Args:\n    list_1: A list\n    list_2: Another list\n\n  Returns:\n    A new list containing one copy of each unique element of list_1 and\n    list_2. Uniqueness is determined by \"x in union\" logic; e.g. 
two\n    equal strings will result in one copy of that value appearing in the\n    union.\n\n  Raises:\n    TypeError: The arguments are not lists.\n  \"\"\"\n\n  if not (isinstance(list_1, list) and isinstance(list_2, list)):\n    raise TypeError(\"Arguments must be lists.\")\n\n  union = []\n  for x in list_1 + list_2:\n    if x not in union:\n      union.append(x)\n\n  return union\n\n\ndef Interface(ys, xs):\n  \"\"\"Maps xs to consumers.\n\n  Returns a dict mapping each element of xs to any of its consumers that are\n  indirectly consumed by ys.\n\n  Args:\n    ys: The outputs\n    xs: The inputs\n  Returns:\n    out: Dict mapping each member x of `xs` to a list of all Tensors that are\n         direct consumers of x and are eventually consumed by a member of\n         `ys`.\n  \"\"\"\n\n  if isinstance(ys, (list, tuple)):\n    queue = list(ys)\n  else:\n    queue = [ys]\n\n  out = OrderedDict()\n  if isinstance(xs, (list, tuple)):\n    for x in xs:\n      out[x] = []\n  else:\n    out[xs] = []\n\n  done = set()\n\n  while queue:\n    y = queue.pop()\n    if y in done:\n      continue\n    done = done.union(set([y]))\n    for x in y.op.inputs:\n      if x in out:\n        out[x].append(y)\n      else:\n        assert id(x) not in [id(foo) for foo in out]\n    queue.extend(y.op.inputs)\n\n  return out\n\n\nclass PXGRegistry(object):\n  \"\"\"Per-Example Gradient registry.\n\n  Maps names of ops to per-example gradient rules for those ops.\n  These rules are only needed for ops that directly touch values that\n  are shared between examples. 
For most machine learning applications,\n  this means only ops that directly operate on the parameters.\n\n\n  See http://arxiv.org/abs/1510.01799 for more information, and please\n  consider citing that tech report if you use this function in published\n  research.\n  \"\"\"\n\n  def __init__(self):\n    self.d = OrderedDict()\n\n  def __call__(self, op,\n               colocate_gradients_with_ops=False,\n               gate_gradients=False):\n    if op.node_def.op not in self.d:\n      raise NotImplementedError(\"No per-example gradient rule registered \"\n                                \"for \" + op.node_def.op + \" in pxg_registry.\")\n    return self.d[op.node_def.op](op,\n                                  colocate_gradients_with_ops,\n                                  gate_gradients)\n\n  def Register(self, op_name, pxg_class):\n    \"\"\"Associates `op_name` key with `pxg_class` value.\n\n    Registers `pxg_class` as the class that will be called to perform\n    per-example differentiation through ops with `op_name`.\n\n    Args:\n      op_name: String op name.\n      pxg_class: An instance of any class with the same signature as MatMulPXG.\n    \"\"\"\n    self.d[op_name] = pxg_class\n\n\npxg_registry = PXGRegistry()\n\n\nclass MatMulPXG(object):\n  \"\"\"Per-example gradient rule for MatMul op.\n  \"\"\"\n\n  def __init__(self, op,\n               colocate_gradients_with_ops=False,\n               gate_gradients=False):\n    \"\"\"Construct an instance of the rule for `op`.\n\n    Args:\n      op: The Operation to differentiate through.\n      colocate_gradients_with_ops: currently unsupported\n      gate_gradients: currently unsupported\n    \"\"\"\n    assert op.node_def.op == \"MatMul\"\n    self.op = op\n    self.colocate_gradients_with_ops = colocate_gradients_with_ops\n    self.gate_gradients = gate_gradients\n\n  def __call__(self, x, z_grads):\n    \"\"\"Build the graph for the per-example gradient through the op.\n\n    Assumes that the MatMul 
was called with a design matrix with examples\n    in rows as the first argument and parameters as the second argument.\n\n    Args:\n      x: The Tensor to differentiate with respect to. This tensor must\n         represent the weights.\n      z_grads: The list of gradients on the output of the op.\n\n    Returns:\n      x_grads: A Tensor containing the gradient with respect to `x` for\n       each example. This is a 3-D tensor, with the first axis corresponding\n       to examples and the remaining axes matching the shape of x.\n    \"\"\"\n    idx = list(self.op.inputs).index(x)\n    assert idx != -1\n    assert len(z_grads) == len(self.op.outputs)\n    assert idx == 1  # We expect weights to be arg 1\n    # We don't expect anyone to per-example differentiate with respect\n    # to anything other than the weights.\n    x, _ = self.op.inputs\n    z_grads, = z_grads\n    x_expanded = tf.expand_dims(x, 2)\n    z_grads_expanded = tf.expand_dims(z_grads, 1)\n    return tf.mul(x_expanded, z_grads_expanded)\n\n\npxg_registry.Register(\"MatMul\", MatMulPXG)\n\n\nclass Conv2DPXG(object):\n  \"\"\"Per-example gradient rule for Conv2D op.\n\n  Same interface as MatMulPXG.\n  \"\"\"\n\n  def __init__(self, op,\n               colocate_gradients_with_ops=False,\n               gate_gradients=False):\n\n    assert op.node_def.op == \"Conv2D\"\n    self.op = op\n    self.colocate_gradients_with_ops = colocate_gradients_with_ops\n    self.gate_gradients = gate_gradients\n\n  def _PxConv2DBuilder(self, input_, w, strides, padding):\n    \"\"\"conv2d run separately per example, to help compute per-example gradients.\n\n    Args:\n      input_: tensor containing a minibatch of images / feature maps.\n              Shape [batch_size, rows, columns, channels]\n      w: convolution kernels. 
Shape\n        [kernel rows, kernel columns, input channels, output channels]\n      strides: passed through to regular conv_2d\n      padding: passed through to regular conv_2d\n\n    Returns:\n      conv: the output of the convolution: a single tensor, the same as\n          what regular conv_2d produces.\n      w_px: a list of batch_size copies of w. Each copy was used for the\n          corresponding example in the minibatch, so calling tf.gradients\n          on a copy gives the gradient for just that example.\n    \"\"\"\n    input_shape = [int(e) for e in input_.get_shape()]\n    batch_size = input_shape[0]\n    input_px = [tf.slice(\n        input_, [example] + [0] * 3, [1] + input_shape[1:]) for example\n                in xrange(batch_size)]\n    for input_x in input_px:\n      assert int(input_x.get_shape()[0]) == 1\n    w_px = [tf.identity(w) for example in xrange(batch_size)]\n    conv_px = [tf.nn.conv2d(input_x, w_x,\n                            strides=strides,\n                            padding=padding)\n               for input_x, w_x in zip(input_px, w_px)]\n    for conv_x in conv_px:\n      num_x = int(conv_x.get_shape()[0])\n      assert num_x == 1, num_x\n    assert len(conv_px) == batch_size\n    conv = tf.concat(0, conv_px)\n    assert int(conv.get_shape()[0]) == batch_size\n    return conv, w_px\n\n  def __call__(self, w, z_grads):\n    idx = list(self.op.inputs).index(w)\n    # Make sure that `op` was actually applied to `w`\n    assert idx != -1\n    assert len(z_grads) == len(self.op.outputs)\n    # The following assert may be removed when we are ready to use this\n    # for general purpose code.\n    # This assert is only expected to hold in the context of our preliminary\n    # MNIST experiments.\n    assert idx == 1  # We expect convolution weights to be arg 1\n\n    images, filters = self.op.inputs\n    strides = self.op.get_attr(\"strides\")\n    padding = self.op.get_attr(\"padding\")\n    # Currently assuming that one 
specifies at most these four arguments and\n    # that all other arguments to conv2d are set to default.\n\n    conv, w_px = self._PxConv2DBuilder(images, filters, strides, padding)\n    z_grads, = z_grads\n\n    gradients_list = tf.gradients(conv, w_px, z_grads,\n                                  colocate_gradients_with_ops=\n                                  self.colocate_gradients_with_ops,\n                                  gate_gradients=self.gate_gradients)\n\n    return tf.pack(gradients_list)\n\npxg_registry.Register(\"Conv2D\", Conv2DPXG)\n\n\nclass AddPXG(object):\n  \"\"\"Per-example gradient rule for Add op.\n\n  Same interface as MatMulPXG.\n  \"\"\"\n\n  def __init__(self, op,\n               colocate_gradients_with_ops=False,\n               gate_gradients=False):\n\n    assert op.node_def.op == \"Add\"\n    self.op = op\n    self.colocate_gradients_with_ops = colocate_gradients_with_ops\n    self.gate_gradients = gate_gradients\n\n  def __call__(self, x, z_grads):\n    idx = list(self.op.inputs).index(x)\n    # Make sure that `op` was actually applied to `x`\n    assert idx != -1\n    assert len(z_grads) == len(self.op.outputs)\n    # The following assert may be removed when we are ready to use this\n    # for general purpose code.\n    # This assert is only expected to hold in the context of our preliminary\n    # MNIST experiments.\n    assert idx == 1  # We expect biases to be arg 1\n    # We don't expect anyone to per-example differentiate with respect\n    # to anything other than the biases.\n    x, _ = self.op.inputs\n    z_grads, = z_grads\n    return z_grads\n\n\npxg_registry.Register(\"Add\", AddPXG)\n\n\ndef PerExampleGradients(ys, xs, grad_ys=None, name=\"gradients\",\n                        colocate_gradients_with_ops=False,\n                        gate_gradients=False):\n  \"\"\"Symbolic differentiation, separately for each example.\n\n  Matches the interface of tf.gradients, but the return values each have an\n  additional axis 
corresponding to the examples.\n\n  Assumes that the cost in `ys` is additive across examples,\n  e.g., no batch normalization.\n  Individual rules for each op specify their own assumptions about how\n  examples are put into tensors.\n  \"\"\"\n\n  # Find the interface between the xs and the cost\n  for x in xs:\n    assert isinstance(x, tf.Tensor), type(x)\n  interface = Interface(ys, xs)\n  merged_interface = []\n  for x in xs:\n    merged_interface = _ListUnion(merged_interface, interface[x])\n  # Differentiate with respect to the interface\n  interface_gradients = tf.gradients(ys, merged_interface, grad_ys=grad_ys,\n                                     name=name,\n                                     colocate_gradients_with_ops=\n                                     colocate_gradients_with_ops,\n                                     gate_gradients=gate_gradients)\n  grad_dict = OrderedDict(zip(merged_interface, interface_gradients))\n  # Build the per-example gradients with respect to the xs\n  if colocate_gradients_with_ops:\n    raise NotImplementedError(\"The per-example gradients are not yet \"\n                              \"colocated with ops.\")\n  if gate_gradients:\n    raise NotImplementedError(\"The per-example gradients are not yet \"\n                              \"gated.\")\n  out = []\n  for x in xs:\n    zs = interface[x]\n    ops = []\n    for z in zs:\n      ops = _ListUnion(ops, [z.op])\n    if len(ops) != 1:\n      raise NotImplementedError(\"Currently we only support the case \"\n                                \"where each x is consumed by exactly \"\n                                \"one op, but %s is consumed by %d ops.\"\n                                % (x.name, len(ops)))\n    op = ops[0]\n    pxg_rule = pxg_registry(op, colocate_gradients_with_ops, gate_gradients)\n    x_grad = pxg_rule(x, [grad_dict[z] for z in zs])\n    out.append(x_grad)\n  return out\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//differential_privacy/...\",\n    ],\n)\n\npy_library(\n    name = \"aggregation\",\n    srcs = [\n        \"aggregation.py\",\n    ],\n    deps = [\n    ],\n)\n\npy_library(\n    name = \"deep_cnn\",\n    srcs = [\n        \"deep_cnn.py\",\n    ],\n    deps = [\n        \":utils\",\n    ],\n)\n\npy_library(\n    name = \"input\",\n    srcs = [\n        \"input.py\",\n    ],\n    deps = [\n    ],\n)\n\npy_library(\n    name = \"metrics\",\n    srcs = [\n        \"metrics.py\",\n    ],\n    deps = [\n    ],\n)\n\npy_library(\n    name = \"utils\",\n    srcs = [\n        \"utils.py\",\n    ],\n    deps = [\n    ],\n)\n\npy_binary(\n    name = \"train_student\",\n    srcs = [\n        \"train_student.py\",\n    ],\n    deps = [\n        \":aggregation\",\n        \":deep_cnn\",\n        \":input\",\n        \":metrics\",\n    ],\n)\n\npy_binary(\n    name = \"train_teachers\",\n    srcs = [\n        \"train_teachers.py\",\n        \":deep_cnn\",\n        \":input\",\n        \":metrics\",\n    ],\n    deps = [\n    ],\n)\n\npy_library(\n    name = \"analysis\",\n    srcs = [\n        \"analysis.py\",\n    ],\n    deps = [\n        \"//differential_privacy/multiple_teachers:input\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/README.md",
    "content": "# Learning private models with multiple teachers\n\nThis repository contains code to create a setup for learning privacy-preserving\nstudent models by transferring knowledge from an ensemble of teachers trained\non disjoint subsets of the data for which privacy guarantees are to be provided.\n\nKnowledge acquired by teachers is transferred to the student in a differentially\nprivate manner by noisily aggregating the teacher decisions before feeding them\nto the student during training.\n\nThe paper describing the approach is [arXiv:1610.05755](https://arxiv.org/abs/1610.05755)\n\n## Dependencies\n\nThis model uses `TensorFlow` to perform numerical computations associated with\nmachine learning models, as well as common Python libraries like: `numpy`,\n`scipy`, and `six`. Instructions to install these can be found in their\nrespective documentations.\n\n## How to run\n\nThis repository supports the MNIST and SVHN datasets. The following\ninstructions are given for MNIST but can easily be adapted by replacing the\nflag `--dataset=mnist` by `--dataset=svhn`.\nThere are 2 steps: teacher training and student training. 
Data will be\nautomatically downloaded when you start the teacher training.\n\nThe following is a two-step process: first we train an ensemble of teacher\nmodels, and second we train a student using predictions made by this ensemble.\n\n**Training the teachers:** First run the `train_teachers.py` file with at least\nthree flags specifying (1) the number of teachers, (2) the ID of the teacher\nyou are training among these teachers, and (3) the dataset on which to train.\nFor instance, to train teacher number 10 among an ensemble of 100 teachers for\nMNIST, use the following command:\n\n```\npython train_teachers.py --nb_teachers=100 --teacher_id=10 --dataset=mnist\n```\n\nThe optional flags `train_dir` and `data_dir` point to the directories\nwhere model checkpoints and temporary data (like the dataset) are saved,\nrespectively. The flag `max_steps` (default: 3000)\ncontrols the length of training. See `train_teachers.py` and `deep_cnn.py`\nto find available flags and their descriptions.\n\n**Training the student:** Once the teachers are all trained, e.g., teachers\nwith IDs `0` to `99` are trained for `nb_teachers=100`, we are ready to train\nthe student. The student is trained by labeling some of the test data with\npredictions from the teachers. The predictions are aggregated by counting the\nvotes assigned to each class among the ensemble of teachers, adding Laplacian\nnoise to these votes, and assigning the label with the maximum noisy vote count\nto the sample. This is detailed in function `noisy_max` in the file\n`aggregation.py`. To train the student, use the following command:\n\n```\npython train_student.py --nb_teachers=100 --dataset=mnist --stdnt_share=5000\n```\n\nThe flag `--stdnt_share=5000` indicates that the student should be able to\nuse the first `5000` samples of the dataset's test subset as unlabeled\ntraining points (they will be labeled using the teacher predictions). 
The\nremaining samples are used for evaluation of the student's accuracy, which\nis displayed upon completion of training.\n\n## Using semi-supervised GANs to train the student\n\nIn the paper, we describe how to train the student in a semi-supervised\nfashion using Generative Adversarial Networks. This can be reproduced for MNIST\nby cloning the [improved-gan](https://github.com/openai/improved-gan)\nrepository and adding it to your `PATH` variable before running the shell\nscript `train_student_mnist_250_lap_20_count_50_epochs_600.sh`.\n\n```\nexport PATH=\"/path/to/improved-gan/mnist_svhn_cifar10\":$PATH\nsh train_student_mnist_250_lap_20_count_50_epochs_600.sh\n```\n\n\n## Alternative deeper convolutional architecture\n\nNote that a deeper convolutional model is available. Both the default and\ndeeper model graphs are defined in `deep_cnn.py`, by the functions\n`inference` and `inference_deeper` respectively. Use the flag `--deeper=true`\nto switch to that model when launching `train_teachers.py` and\n`train_student.py`.\n\n## Privacy analysis\n\nIn the paper, we detail how data-dependent differential privacy bounds can be\ncomputed to estimate the cost of training the student. In order to reproduce\nthe bounds given in the paper, we include the labels predicted by our two\nteacher ensembles (MNIST and SVHN). You can run the privacy analysis for each\ndataset with the following commands:\n\n```\npython analysis.py --counts_file=mnist_250_teachers_labels.npy --indices_file=mnist_250_teachers_100_indices_used_by_student.npy\n\npython analysis.py --counts_file=svhn_250_teachers_labels.npy --max_examples=1000 --delta=1e-6\n```\n\nTo expedite experimentation with the privacy analysis of student training,\nthe `analysis.py` file is configured to download the labels produced by 250\nteacher models, for MNIST and SVHN, when running the two commands included\nabove. 
These 250 teacher models were trained using the following command lines,\nwhere `XXX` takes values between `0` and `249`:\n\n```\npython train_teachers.py --nb_teachers=250 --teacher_id=XXX --dataset=mnist\npython train_teachers.py --nb_teachers=250 --teacher_id=XXX --dataset=svhn\n```\n\nNote that these labels may also be used in lieu of the function `ensemble_preds`\nin `train_student.py`, to compare the performance of alternative student model\narchitectures and learning techniques. This facilitates future work by\nremoving the need to train the MNIST and SVHN teacher ensembles when\nproposing new student training approaches.\n\n## Contact\n\nTo ask questions, please email `nicolas@papernot.fr` or open an issue on\nthe `tensorflow/models` issues tracker. Please assign issues to\n[@npapernot](https://github.com/npapernot).\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/aggregation.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\n\n\ndef labels_from_probs(probs):\n  \"\"\"\n  Helper function: computes argmax along last dimension of array to obtain\n  labels (max prob or max logit value)\n  :param probs: numpy array where probabilities or logits are on last dimension\n  :return: array with same shape as input besides last dimension with shape 1\n          now containing the labels\n  \"\"\"\n  # Compute last axis index\n  last_axis = len(np.shape(probs)) - 1\n\n  # Label is argmax over last dimension\n  labels = np.argmax(probs, axis=last_axis)\n\n  # Return as np.int32\n  return np.asarray(labels, dtype=np.int32)\n\n\ndef noisy_max(logits, lap_scale, return_clean_votes=False):\n  \"\"\"\n  This aggregation mechanism takes the softmax/logit output of several models\n  resulting from inference on identical inputs and computes the noisy-max of\n  the votes for candidate classes to select a label for each sample: it\n  adds Laplacian noise to label counts and returns the most frequent label.\n  :param logits: logits or probabilities for each sample\n  :param lap_scale: scale of the Laplacian noise to be added to counts\n  :param return_clean_votes: 
if set to True, also returns clean votes (without\n                      Laplacian noise). This can be used to perform the\n                      privacy analysis of this aggregation mechanism.\n  :return: pair of result and (if return_clean_votes is True) the clean counts\n           for each class per sample and the original labels produced by\n           the teachers.\n  \"\"\"\n\n  # Compute labels from logits/probs and reshape array properly\n  labels = labels_from_probs(logits)\n  labels_shape = np.shape(labels)\n  labels = labels.reshape((labels_shape[0], labels_shape[1]))\n\n  # Initialize array to hold final labels\n  result = np.zeros(int(labels_shape[1]))\n\n  if return_clean_votes:\n    # Initialize array to hold clean votes for each sample\n    clean_votes = np.zeros((int(labels_shape[1]), 10))\n\n  # Parse each sample\n  for i in xrange(int(labels_shape[1])):\n    # Count number of votes assigned to each class\n    label_counts = np.bincount(labels[:, i], minlength=10)\n\n    if return_clean_votes:\n      # Store vote counts for export\n      clean_votes[i] = label_counts\n\n    # Cast in float32 to prepare before addition of Laplacian noise\n    label_counts = np.asarray(label_counts, dtype=np.float32)\n\n    # Sample independent Laplacian noise for each class\n    for item in xrange(10):\n      label_counts[item] += np.random.laplace(loc=0.0, scale=float(lap_scale))\n\n    # Result is the most frequent label\n    result[i] = np.argmax(label_counts)\n\n  # Cast labels to np.int32 for compatibility with deep_cnn.py feed dictionaries\n  result = np.asarray(result, dtype=np.int32)\n\n  if return_clean_votes:\n    # Returns several arrays, which are later saved:\n    # result: labels obtained from the noisy aggregation\n    # clean_votes: the number of teacher votes assigned to each sample and class\n    # labels: the labels assigned by teachers (before the noisy aggregation)\n    return result, clean_votes, labels\n  else:\n    # Only return labels 
resulting from noisy aggregation\n    return result\n\n\ndef aggregation_most_frequent(logits):\n  \"\"\"\n  This aggregation mechanism takes the softmax/logit output of several models\n  resulting from inference on identical inputs and computes the most frequent\n  label. It is deterministic (no noise injection like noisy_max() above).\n  :param logits: logits or probabilities for each sample\n  :return: the most frequent label for each sample\n  \"\"\"\n  # Compute labels from logits/probs and reshape array properly\n  labels = labels_from_probs(logits)\n  labels_shape = np.shape(labels)\n  labels = labels.reshape((labels_shape[0], labels_shape[1]))\n\n  # Initialize array to hold final labels\n  result = np.zeros(int(labels_shape[1]))\n\n  # Parse each sample\n  for i in xrange(int(labels_shape[1])):\n    # Count number of votes assigned to each class\n    label_counts = np.bincount(labels[:, i], minlength=10)\n\n    label_counts = np.asarray(label_counts, dtype=np.int32)\n\n    # Result is the most frequent label\n    result[i] = np.argmax(label_counts)\n\n  return np.asarray(result, dtype=np.int32)\n\n\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/analysis.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"\nThis script computes bounds on the privacy cost of training the\nstudent model from noisy aggregation of labels predicted by teachers.\nIt should be used only after training the student (and therefore the\nteachers as well). We however include the label files required to\nreproduce key results from our paper (https://arxiv.org/abs/1610.05755):\nthe epsilon bounds for MNIST and SVHN students.\n\nThe command that computes the epsilon bound associated\nwith the training of the MNIST student model (100 label queries\nwith a (1/20)*2=0.1 epsilon bound each) is:\n\npython analysis.py\n  --counts_file=mnist_250_teachers_labels.npy\n  --indices_file=mnist_250_teachers_100_indices_used_by_student.npy\n\nThe command that computes the epsilon bound associated\nwith the training of the SVHN student model (1000 label queries\nwith a (1/20)*2=0.1 epsilon bound each) is:\n\npython analysis.py\n  --counts_file=svhn_250_teachers_labels.npy\n  --max_examples=1000\n  --delta=1e-6\n\"\"\"\nimport os\nimport math\nimport numpy as np\nimport tensorflow as tf\n\nfrom differential_privacy.multiple_teachers.input import maybe_download\n\n# These parameters can be changed to compute bounds for different failure rates\n# or different model 
predictions.\n\ntf.flags.DEFINE_integer(\"moments\", 8, \"Number of moments\")\ntf.flags.DEFINE_float(\"noise_eps\", 0.1, \"Eps value for each call to noisymax.\")\ntf.flags.DEFINE_float(\"delta\", 1e-5, \"Target value of delta.\")\ntf.flags.DEFINE_float(\"beta\", 0.09, \"Value of beta for smooth sensitivity\")\ntf.flags.DEFINE_string(\"counts_file\", \"\", \"Numpy matrix with raw counts\")\ntf.flags.DEFINE_string(\"indices_file\", \"\",\n    \"File containing a numpy matrix with indices used.\"\n    \"Optional. Use the first max_examples indices if this is not provided.\")\ntf.flags.DEFINE_integer(\"max_examples\", 1000,\n    \"Number of examples to use. We will use the first\"\n    \" max_examples many examples from the counts_file\"\n    \" or indices_file to do the privacy cost estimate\")\ntf.flags.DEFINE_float(\"too_small\", 1e-10, \"Small threshold to avoid log of 0\")\ntf.flags.DEFINE_bool(\"input_is_counts\", False, \"False if labels, True if counts\")\n\nFLAGS = tf.flags.FLAGS\n\n\ndef compute_q_noisy_max(counts, noise_eps):\n  \"\"\"returns ~ Pr[outcome != winner].\n\n  Args:\n    counts: a list of scores\n    noise_eps: privacy parameter for noisy_max\n  Returns:\n    q: the probability that outcome is different from true winner.\n  \"\"\"\n  # For noisy max, we only get an upper bound.\n  # Pr[ j beats i*] \\leq (2+gap(j,i*)) / (4 exp(gap(j,i*)))\n  # proof at http://mathoverflow.net/questions/66763/\n  # tight-bounds-on-probability-of-sum-of-laplace-random-variables\n\n  winner = np.argmax(counts)\n  counts_normalized = noise_eps * (counts - counts[winner])\n  counts_rest = np.array(\n      [counts_normalized[i] for i in xrange(len(counts)) if i != winner])\n  q = 0.0\n  for c in counts_rest:\n    gap = -c\n    q += (gap + 2.0) / (4.0 * math.exp(gap))\n  return min(q, 1.0 - (1.0/len(counts)))\n\n\ndef compute_q_noisy_max_approx(counts, noise_eps):\n  \"\"\"returns ~ Pr[outcome != winner].\n\n  Args:\n    counts: a list of scores\n    noise_eps: privacy 
parameter for noisy_max\n  Returns:\n    q: the probability that outcome is different from true winner.\n  \"\"\"\n  # For noisy max, we only get an upper bound.\n  # Pr[ j beats i*] \\leq (2+gap(j,i*)) / (4 exp(gap(j,i*)))\n  # proof at http://mathoverflow.net/questions/66763/\n  # tight-bounds-on-probability-of-sum-of-laplace-random-variables\n  # This code uses an approximation that is faster and easier\n  # to get a local sensitivity bound on.\n\n  winner = np.argmax(counts)\n  counts_normalized = noise_eps * (counts - counts[winner])\n  counts_rest = np.array(\n      [counts_normalized[i] for i in xrange(len(counts)) if i != winner])\n  gap = -max(counts_rest)\n  q = (len(counts) - 1) * (gap + 2.0) / (4.0 * math.exp(gap))\n  return min(q, 1.0 - (1.0/len(counts)))\n\n\ndef logmgf_exact(q, priv_eps, l):\n  \"\"\"Computes the logmgf value given q and privacy eps.\n\n  The bound used is the min of three terms. The first term is from\n  https://arxiv.org/pdf/1605.02065.pdf.\n  The second term is based on the fact that when an event has probability\n  (1-q) for q close to zero, q can only change by exp(eps), which corresponds\n  to a much smaller multiplicative change in (1-q).\n  The third term comes directly from the privacy guarantee.\n  Args:\n    q: pr of non-optimal outcome\n    priv_eps: eps parameter for DP\n    l: moment to compute.\n  Returns:\n    Upper bound on logmgf\n  \"\"\"\n  if q < 0.5:\n    t_one = (1-q) * math.pow((1-q) / (1 - math.exp(priv_eps) * q), l)\n    t_two = q * math.exp(priv_eps * l)\n    t = t_one + t_two\n    try:\n      log_t = math.log(t)\n    except ValueError:\n      print \"Got ValueError in math.log for values :\" + str((q, priv_eps, l, t))\n      log_t = priv_eps * l\n  else:\n    log_t = priv_eps * l\n\n  return min(0.5 * priv_eps * priv_eps * l * (l + 1), log_t, priv_eps * l)\n\n\ndef logmgf_from_counts(counts, noise_eps, l):\n  \"\"\"\n  The ReportNoisyMax mechanism with noise_eps is 2*noise_eps-DP\n  in our setting, where one count can 
go up by one and another\n  can go down by 1.\n  \"\"\"\n\n  q = compute_q_noisy_max(counts, noise_eps)\n  return logmgf_exact(q, 2.0 * noise_eps, l)\n\n\ndef sens_at_k(counts, noise_eps, l, k):\n  \"\"\"Return sensitivity at distance k.\n\n  Args:\n    counts: an array of scores\n    noise_eps: noise parameter used\n    l: moment whose sensitivity is being computed\n    k: distance\n  Returns:\n    sensitivity: at distance k\n  \"\"\"\n  counts_sorted = sorted(counts, reverse=True)\n  if 0.5 * noise_eps * l > 1:\n    print \"l too large to compute sensitivity\"\n    return 0\n  # Now we can assume that at k, gap remains positive\n  # or we have reached the point where logmgf_exact is\n  # determined by the first term and independent of q.\n  if counts_sorted[0] < counts_sorted[1] + k:\n    return 0\n  counts_sorted[0] -= k\n  counts_sorted[1] += k\n  val = logmgf_from_counts(counts_sorted, noise_eps, l)\n  counts_sorted[0] -= 1\n  counts_sorted[1] += 1\n  val_changed = logmgf_from_counts(counts_sorted, noise_eps, l)\n  return val_changed - val\n\n\ndef smoothed_sens(counts, noise_eps, l, beta):\n  \"\"\"Compute beta-smooth sensitivity.\n\n  Args:\n    counts: array of scores\n    noise_eps: noise parameter\n    l: moment of interest\n    beta: smoothness parameter\n  Returns:\n    smooth_sensitivity: a beta smooth upper bound\n  \"\"\"\n  k = 0\n  smoothed_sensitivity = sens_at_k(counts, noise_eps, l, k)\n  while k < max(counts):\n    k += 1\n    sensitivity_at_k = sens_at_k(counts, noise_eps, l, k)\n    smoothed_sensitivity = max(\n        smoothed_sensitivity,\n        math.exp(-beta * k) * sensitivity_at_k)\n    if sensitivity_at_k == 0.0:\n      break\n  return smoothed_sensitivity\n\n\ndef main(unused_argv):\n  ##################################################################\n  # If we are reproducing results from paper https://arxiv.org/abs/1610.05755,\n  # download the required binaries with label information.\n  
##################################################################\n  \n  # Binaries for MNIST results\n  paper_binaries_mnist = \\\n    [\"https://github.com/npapernot/multiple-teachers-for-privacy/blob/master/mnist_250_teachers_labels.npy?raw=true\", \n    \"https://github.com/npapernot/multiple-teachers-for-privacy/blob/master/mnist_250_teachers_100_indices_used_by_student.npy?raw=true\"]\n  if FLAGS.counts_file == \"mnist_250_teachers_labels.npy\" \\\n    or FLAGS.indices_file == \"mnist_250_teachers_100_indices_used_by_student.npy\":\n    maybe_download(paper_binaries_mnist, os.getcwd())\n\n  # Binaries for SVHN results\n  paper_binaries_svhn = [\"https://github.com/npapernot/multiple-teachers-for-privacy/blob/master/svhn_250_teachers_labels.npy?raw=true\"]\n  if FLAGS.counts_file == \"svhn_250_teachers_labels.npy\":\n    maybe_download(paper_binaries_svhn, os.getcwd())\n\n  input_mat = np.load(FLAGS.counts_file)\n  if FLAGS.input_is_counts:\n    counts_mat = input_mat\n  else:\n    # In this case, the input is the raw predictions. 
Transform them into counts.\n    num_teachers, n = input_mat.shape\n    counts_mat = np.zeros((n, 10)).astype(np.int32)\n    for i in range(n):\n      for j in range(num_teachers):\n        counts_mat[i, input_mat[j, i]] += 1\n  n = counts_mat.shape[0]\n  num_examples = min(n, FLAGS.max_examples)\n\n  if not FLAGS.indices_file:\n    indices = np.array(range(num_examples))\n  else:\n    index_list = np.load(FLAGS.indices_file)\n    indices = index_list[:num_examples]\n\n  l_list = 1.0 + np.array(xrange(FLAGS.moments))\n  beta = FLAGS.beta\n  total_log_mgf_nm = np.array([0.0 for _ in l_list])\n  total_ss_nm = np.array([0.0 for _ in l_list])\n  noise_eps = FLAGS.noise_eps\n\n  for i in indices:\n    total_log_mgf_nm += np.array(\n        [logmgf_from_counts(counts_mat[i], noise_eps, l)\n         for l in l_list])\n    total_ss_nm += np.array(\n        [smoothed_sens(counts_mat[i], noise_eps, l, beta)\n         for l in l_list])\n  delta = FLAGS.delta\n\n  # We want delta = exp(alpha - eps l).\n  # Solving gives eps = (alpha - ln(delta)) / l\n  eps_list_nm = (total_log_mgf_nm - math.log(delta)) / l_list\n\n  print \"Epsilons (Noisy Max): \" + str(eps_list_nm)\n  print \"Smoothed sensitivities (Noisy Max): \" + str(total_ss_nm / l_list)\n\n  # If beta < eps / (2 ln(1/delta)), then adding noise Lap(1) * 2 SS/eps\n  # is (eps, delta)-DP.\n  # Also if beta < eps / (2 (gamma+1)), then adding noise 2 (gamma+1) SS eta / eps\n  # where eta has density proportional to 1 / (1+|z|^gamma) is eps-DP.\n  # Both from Corollary 2.4 in\n  # http://www.cse.psu.edu/~ads22/pubs/NRS07/NRS07-full-draft-v1.pdf\n  # Print the first one's scale\n  ss_eps = 2.0 * beta * math.log(1/delta)\n  ss_scale = 2.0 / ss_eps\n  print \"To get an \" + str(ss_eps) + \"-DP estimate of epsilon, \"\n  print \"..add noise ~ \" + str(ss_scale)\n  print \"... 
times \" + str(total_ss_nm / l_list)\n  print \"Epsilon = \" + str(min(eps_list_nm)) + \".\"\n  if min(eps_list_nm) == eps_list_nm[-1]:\n    print \"Warning: May not have used enough values of l\"\n\n  # Data independent bound, as mechanism is\n  # 2*noise_eps DP.\n  data_ind_log_mgf = np.array([0.0 for _ in l_list])\n  data_ind_log_mgf += num_examples * np.array(\n      [logmgf_exact(1.0, 2.0 * noise_eps, l) for l in l_list])\n\n  data_ind_eps_list = (data_ind_log_mgf - math.log(delta)) / l_list\n  print \"Data independent bound = \" + str(min(data_ind_eps_list)) + \".\"\n\n  return\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/deep_cnn.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom datetime import datetime\nimport math\nimport numpy as np\nimport tensorflow as tf\nimport time\n\nfrom differential_privacy.multiple_teachers import utils\n\nFLAGS = tf.app.flags.FLAGS\n\n# Basic model parameters.\ntf.app.flags.DEFINE_integer('dropout_seed', 123, \"\"\"seed for dropout.\"\"\")\ntf.app.flags.DEFINE_integer('batch_size', 128, \"\"\"Nb of images in a batch.\"\"\")\ntf.app.flags.DEFINE_integer('epochs_per_decay', 350, \"\"\"Nb epochs per decay\"\"\")\ntf.app.flags.DEFINE_integer('learning_rate', 5, \"\"\"100 * learning rate\"\"\")\ntf.app.flags.DEFINE_boolean('log_device_placement', False, \"\"\"see TF doc\"\"\")\n\n\n# Constants describing the training process.\nMOVING_AVERAGE_DECAY = 0.9999     # The decay to use for the moving average.\nLEARNING_RATE_DECAY_FACTOR = 0.1  # Learning rate decay factor.\n\n\ndef _variable_on_cpu(name, shape, initializer):\n  \"\"\"Helper to create a Variable stored on CPU memory.\n\n  Args:\n    name: name of the variable\n    shape: list of ints\n    initializer: initializer for Variable\n\n  Returns:\n    Variable Tensor\n  \"\"\"\n  with tf.device('/cpu:0'):\n    var = 
tf.get_variable(name, shape, initializer=initializer)\n  return var\n\n\ndef _variable_with_weight_decay(name, shape, stddev, wd):\n  \"\"\"Helper to create an initialized Variable with weight decay.\n\n  Note that the Variable is initialized with a truncated normal distribution.\n  A weight decay is added only if one is specified.\n\n  Args:\n    name: name of the variable\n    shape: list of ints\n    stddev: standard deviation of a truncated Gaussian\n    wd: add L2Loss weight decay multiplied by this float. If None, weight\n        decay is not added for this Variable.\n\n  Returns:\n    Variable Tensor\n  \"\"\"\n  var = _variable_on_cpu(name, shape,\n                         tf.truncated_normal_initializer(stddev=stddev))\n  if wd is not None:\n    weight_decay = tf.mul(tf.nn.l2_loss(var), wd, name='weight_loss')\n    tf.add_to_collection('losses', weight_decay)\n  return var\n\n\ndef inference(images, dropout=False):\n  \"\"\"Build the CNN model.\n  Args:\n    images: Images returned from distorted_inputs() or inputs().\n    dropout: Boolean controlling whether to use dropout or not\n  Returns:\n    Logits\n  \"\"\"\n  if FLAGS.dataset == 'mnist':\n    first_conv_shape = [5, 5, 1, 64]\n  else:\n    first_conv_shape = [5, 5, 3, 64]\n\n  # conv1\n  with tf.variable_scope('conv1') as scope:\n    kernel = _variable_with_weight_decay('weights', \n                                         shape=first_conv_shape,\n                                         stddev=1e-4, \n                                         wd=0.0)\n    conv = tf.nn.conv2d(images, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [64], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv1 = tf.nn.relu(bias, name=scope.name)\n    if dropout:\n      conv1 = tf.nn.dropout(conv1, 0.3, seed=FLAGS.dropout_seed)\n\n\n  # pool1\n  pool1 = tf.nn.max_pool(conv1, \n                         ksize=[1, 3, 3, 1], \n                         strides=[1, 2, 2, 
1],\n                         padding='SAME', \n                         name='pool1')\n  \n  # norm1\n  norm1 = tf.nn.lrn(pool1, \n                    4, \n                    bias=1.0, \n                    alpha=0.001 / 9.0, \n                    beta=0.75,\n                    name='norm1')\n\n  # conv2\n  with tf.variable_scope('conv2') as scope:\n    kernel = _variable_with_weight_decay('weights', \n                                         shape=[5, 5, 64, 128],\n                                         stddev=1e-4, \n                                         wd=0.0)\n    conv = tf.nn.conv2d(norm1, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [128], tf.constant_initializer(0.1))\n    bias = tf.nn.bias_add(conv, biases)\n    conv2 = tf.nn.relu(bias, name=scope.name)\n    if dropout:\n      conv2 = tf.nn.dropout(conv2, 0.3, seed=FLAGS.dropout_seed)\n\n\n  # norm2\n  norm2 = tf.nn.lrn(conv2, \n                    4, \n                    bias=1.0, \n                    alpha=0.001 / 9.0, \n                    beta=0.75,\n                    name='norm2')\n  \n  # pool2\n  pool2 = tf.nn.max_pool(norm2, \n                         ksize=[1, 3, 3, 1],\n                         strides=[1, 2, 2, 1], \n                         padding='SAME', \n                         name='pool2')\n\n  # local3\n  with tf.variable_scope('local3') as scope:\n    # Move everything into depth so we can perform a single matrix multiply.\n    reshape = tf.reshape(pool2, [FLAGS.batch_size, -1])\n    dim = reshape.get_shape()[1].value\n    weights = _variable_with_weight_decay('weights', \n                                          shape=[dim, 384],\n                                          stddev=0.04, \n                                          wd=0.004)\n    biases = _variable_on_cpu('biases', [384], tf.constant_initializer(0.1))\n    local3 = tf.nn.relu(tf.matmul(reshape, weights) + biases, name=scope.name)\n    if dropout:\n      local3 = 
tf.nn.dropout(local3, 0.5, seed=FLAGS.dropout_seed)\n\n  # local4\n  with tf.variable_scope('local4') as scope:\n    weights = _variable_with_weight_decay('weights', \n                                          shape=[384, 192],\n                                          stddev=0.04, \n                                          wd=0.004)\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.1))\n    local4 = tf.nn.relu(tf.matmul(local3, weights) + biases, name=scope.name)\n    if dropout:\n      local4 = tf.nn.dropout(local4, 0.5, seed=FLAGS.dropout_seed)\n\n  # compute logits\n  with tf.variable_scope('softmax_linear') as scope:\n    weights = _variable_with_weight_decay('weights', \n                                          [192, FLAGS.nb_labels],\n                                          stddev=1/192.0, \n                                          wd=0.0)\n    biases = _variable_on_cpu('biases', \n                              [FLAGS.nb_labels],\n                              tf.constant_initializer(0.0))\n    logits = tf.add(tf.matmul(local4, weights), biases, name=scope.name)\n\n  return logits\n\n\ndef inference_deeper(images, dropout=False):\n  \"\"\"Build a deeper CNN model.\n  Args:\n    images: Images returned from distorted_inputs() or inputs().\n    dropout: Boolean controlling whether to use dropout or not\n  Returns:\n    Logits\n  \"\"\"\n  if FLAGS.dataset == 'mnist':\n    first_conv_shape = [3, 3, 1, 96]\n  else:\n    first_conv_shape = [3, 3, 3, 96]\n\n  # conv1\n  with tf.variable_scope('conv1') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=first_conv_shape,\n                                         stddev=0.05,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(images, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [96], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv1 = 
tf.nn.relu(bias, name=scope.name)\n\n  # conv2\n  with tf.variable_scope('conv2') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=[3, 3, 96, 96],\n                                         stddev=0.05,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(conv1, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [96], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv2 = tf.nn.relu(bias, name=scope.name)\n\n  # conv3\n  with tf.variable_scope('conv3') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=[3, 3, 96, 96],\n                                         stddev=0.05,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(conv2, kernel, [1, 2, 2, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [96], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv3 = tf.nn.relu(bias, name=scope.name)\n    if dropout:\n      conv3 = tf.nn.dropout(conv3, 0.5, seed=FLAGS.dropout_seed)\n\n  # conv4\n  with tf.variable_scope('conv4') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=[3, 3, 96, 192],\n                                         stddev=0.05,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(conv3, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv4 = tf.nn.relu(bias, name=scope.name)\n\n  # conv5\n  with tf.variable_scope('conv5') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=[3, 3, 192, 192],\n                                         stddev=0.05,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(conv4, kernel, [1, 1, 1, 1], 
padding='SAME')\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv5 = tf.nn.relu(bias, name=scope.name)\n\n  # conv6\n  with tf.variable_scope('conv6') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=[3, 3, 192, 192],\n                                         stddev=0.05,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(conv5, kernel, [1, 2, 2, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.0))\n    bias = tf.nn.bias_add(conv, biases)\n    conv6 = tf.nn.relu(bias, name=scope.name)\n    if dropout:\n      conv6 = tf.nn.dropout(conv6, 0.5, seed=FLAGS.dropout_seed)\n\n\n  # conv7\n  with tf.variable_scope('conv7') as scope:\n    kernel = _variable_with_weight_decay('weights',\n                                         shape=[5, 5, 192, 192],\n                                         stddev=1e-4,\n                                         wd=0.0)\n    conv = tf.nn.conv2d(conv6, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.1))\n    bias = tf.nn.bias_add(conv, biases)\n    conv7 = tf.nn.relu(bias, name=scope.name)\n\n\n  # local1\n  with tf.variable_scope('local1') as scope:\n    # Move everything into depth so we can perform a single matrix multiply.\n    reshape = tf.reshape(conv7, [FLAGS.batch_size, -1])\n    dim = reshape.get_shape()[1].value\n    weights = _variable_with_weight_decay('weights',\n                                          shape=[dim, 192],\n                                          stddev=0.05,\n                                          wd=0)\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.1))\n    local1 = tf.nn.relu(tf.matmul(reshape, weights) + biases, name=scope.name)\n\n  # local2\n  with tf.variable_scope('local2') as scope:\n    weights = 
_variable_with_weight_decay('weights',\n                                          shape=[192, 192],\n                                          stddev=0.05,\n                                          wd=0)\n    biases = _variable_on_cpu('biases', [192], tf.constant_initializer(0.1))\n    local2 = tf.nn.relu(tf.matmul(local1, weights) + biases, name=scope.name)\n    if dropout:\n      local2 = tf.nn.dropout(local2, 0.5, seed=FLAGS.dropout_seed)\n\n  # compute logits\n  with tf.variable_scope('softmax_linear') as scope:\n    weights = _variable_with_weight_decay('weights',\n                                          [192, FLAGS.nb_labels],\n                                          stddev=0.05,\n                                          wd=0.0)\n    biases = _variable_on_cpu('biases',\n                              [FLAGS.nb_labels],\n                              tf.constant_initializer(0.0))\n    logits = tf.add(tf.matmul(local2, weights), biases, name=scope.name)\n\n  return logits\n\n\ndef loss_fun(logits, labels):\n  \"\"\"Add L2Loss to all the trainable variables.\n\n  Add summary for \"Loss\" and \"Loss/avg\".\n  Args:\n    logits: Logits from inference().\n    labels: Labels from distorted_inputs or inputs(). 
1-D tensor\n            of shape [batch_size]\n\n  Returns:\n    Loss tensor of type float.\n  \"\"\"\n\n  # Calculate the cross entropy between labels and predictions\n  labels = tf.cast(labels, tf.int64)\n  cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(\n      logits, labels, name='cross_entropy_per_example')\n\n  # Calculate the average cross entropy loss across the batch.\n  cross_entropy_mean = tf.reduce_mean(cross_entropy, name='cross_entropy')\n\n  # Add to TF collection for losses\n  tf.add_to_collection('losses', cross_entropy_mean)\n\n  # The total loss is defined as the cross entropy loss plus all of the weight\n  # decay terms (L2 loss).\n  return tf.add_n(tf.get_collection('losses'), name='total_loss')\n\n\ndef moving_av(total_loss):\n  \"\"\"\n  Generates moving average for all losses\n\n  Args:\n    total_loss: Total loss from loss().\n  Returns:\n    loss_averages_op: op for generating moving averages of losses.\n  \"\"\"\n  # Compute the moving average of all individual losses and the total loss.\n  loss_averages = tf.train.ExponentialMovingAverage(0.9, name='avg')\n  losses = tf.get_collection('losses')\n  loss_averages_op = loss_averages.apply(losses + [total_loss])\n\n  return loss_averages_op\n\n\ndef train_op_fun(total_loss, global_step):\n  \"\"\"Train model.\n\n  Create an optimizer and apply to all trainable variables. 
Add moving\n  average for all trainable variables.\n\n  Args:\n    total_loss: Total loss from loss().\n    global_step: Integer Variable counting the number of training steps\n      processed.\n  Returns:\n    train_op: op for training.\n  \"\"\"\n  # Variables that affect learning rate.\n  nb_ex_per_train_epoch = int(60000 / FLAGS.nb_teachers)\n  \n  num_batches_per_epoch = nb_ex_per_train_epoch / FLAGS.batch_size\n  decay_steps = int(num_batches_per_epoch * FLAGS.epochs_per_decay)\n\n  initial_learning_rate = float(FLAGS.learning_rate) / 100.0\n\n  # Decay the learning rate exponentially based on the number of steps.\n  lr = tf.train.exponential_decay(initial_learning_rate,\n                                  global_step,\n                                  decay_steps,\n                                  LEARNING_RATE_DECAY_FACTOR,\n                                  staircase=True)\n  tf.scalar_summary('learning_rate', lr)\n\n  # Generate moving averages of all losses and associated summaries.\n  loss_averages_op = moving_av(total_loss)\n\n  # Compute gradients.\n  with tf.control_dependencies([loss_averages_op]):\n    opt = tf.train.GradientDescentOptimizer(lr)\n    grads = opt.compute_gradients(total_loss)\n\n  # Apply gradients.\n  apply_gradient_op = opt.apply_gradients(grads, global_step=global_step)\n\n  # Add histograms for trainable variables.\n  for var in tf.trainable_variables():\n    tf.histogram_summary(var.op.name, var)\n\n  # Track the moving averages of all trainable variables.\n  variable_averages = tf.train.ExponentialMovingAverage(\n      MOVING_AVERAGE_DECAY, global_step)\n  variables_averages_op = variable_averages.apply(tf.trainable_variables())\n\n  with tf.control_dependencies([apply_gradient_op, variables_averages_op]):\n    train_op = tf.no_op(name='train')\n\n  return train_op\n\n\ndef _input_placeholder():\n  \"\"\"\n  This helper function declares a TF placeholder for the graph input data\n  :return: TF placeholder for the graph input 
data\n  \"\"\"\n  if FLAGS.dataset == 'mnist':\n    image_size = 28\n    num_channels = 1\n  else:\n    image_size = 32\n    num_channels = 3\n\n  # Declare data placeholder\n  train_node_shape = (FLAGS.batch_size, image_size, image_size, num_channels)\n  return tf.placeholder(tf.float32, shape=train_node_shape)\n\n\ndef train(images, labels, ckpt_path, dropout=False):\n  \"\"\"\n  This function contains the loop that actually trains the model.\n  :param images: a numpy array with the input data\n  :param labels: a numpy array with the output labels\n  :param ckpt_path: a path (including name) where model checkpoints are saved\n  :param dropout: Boolean, whether to use dropout or not\n  :return: True if everything went well\n  \"\"\"\n\n  # Check training data\n  assert len(images) == len(labels)\n  assert images.dtype == np.float32\n  assert labels.dtype == np.int32\n\n  # Set default TF graph\n  with tf.Graph().as_default():\n    global_step = tf.Variable(0, trainable=False)\n\n    # Declare data placeholder\n    train_data_node = _input_placeholder()\n\n    # Create a placeholder to hold labels\n    train_labels_shape = (FLAGS.batch_size,)\n    train_labels_node = tf.placeholder(tf.int32, shape=train_labels_shape)\n\n    print(\"Done Initializing Training Placeholders\")\n\n    # Build a Graph that computes the logits predictions from the placeholder\n    if FLAGS.deeper:\n      logits = inference_deeper(train_data_node, dropout=dropout)\n    else:\n      logits = inference(train_data_node, dropout=dropout)\n\n    # Calculate loss\n    loss = loss_fun(logits, train_labels_node)\n\n    # Build a Graph that trains the model with one batch of examples and\n    # updates the model parameters.\n    train_op = train_op_fun(loss, global_step)\n\n    # Create a saver.\n    saver = tf.train.Saver(tf.all_variables())\n\n    print(\"Graph constructed and saver created\")\n\n    # Build an initialization operation to run below.\n    init = tf.initialize_all_variables()\n\n  
  # Create and init sessions\n    sess = tf.Session(config=tf.ConfigProto(log_device_placement=FLAGS.log_device_placement)) #NOLINT(long-line)\n    sess.run(init)\n\n    print(\"Session ready, beginning training loop\")\n\n    # Initialize the number of batches\n    data_length = len(images)\n    nb_batches = math.ceil(data_length / FLAGS.batch_size)\n\n    for step in xrange(FLAGS.max_steps):\n      # for debug, save start time\n      start_time = time.time()\n\n      # Current batch number\n      batch_nb = step % nb_batches\n\n      # Current batch start and end indices\n      start, end = utils.batch_indices(batch_nb, data_length, FLAGS.batch_size)\n\n      # Prepare dictionary to feed the session with\n      feed_dict = {train_data_node: images[start:end],\n                   train_labels_node: labels[start:end]}\n\n      # Run training step\n      _, loss_value = sess.run([train_op, loss], feed_dict=feed_dict)\n\n      # Compute duration of training step\n      duration = time.time() - start_time\n\n      # Sanity check\n      assert not np.isnan(loss_value), 'Model diverged with loss = NaN'\n\n      # Echo loss once in a while\n      if step % 100 == 0:\n        num_examples_per_step = FLAGS.batch_size\n        examples_per_sec = num_examples_per_step / duration\n        sec_per_batch = float(duration)\n\n        format_str = ('%s: step %d, loss = %.2f (%.1f examples/sec; %.3f '\n                      'sec/batch)')\n        print (format_str % (datetime.now(), step, loss_value,\n                             examples_per_sec, sec_per_batch))\n\n      # Save the model checkpoint periodically.\n      if step % 1000 == 0 or (step + 1) == FLAGS.max_steps:\n        saver.save(sess, ckpt_path, global_step=step)\n\n  return True\n\n\ndef softmax_preds(images, ckpt_path, return_logits=False):\n  \"\"\"\n  Compute softmax activations (probabilities) with the model saved in the path\n  specified as an argument\n  :param images: a np array of images\n  :param 
ckpt_path: a TF model checkpoint\n  :param return_logits: if set to True, return logits instead of probabilities\n  :return: probabilities (or logits if return_logits is set to True)\n  \"\"\"\n  # Compute nb samples and deduce nb of batches\n  data_length = len(images)\n  nb_batches = math.ceil(len(images) / FLAGS.batch_size)\n\n  # Declare data placeholder\n  train_data_node = _input_placeholder()\n\n  # Build a Graph that computes the logits predictions from the placeholder\n  if FLAGS.deeper:\n    logits = inference_deeper(train_data_node)\n  else:\n    logits = inference(train_data_node)\n\n  if return_logits:\n    # We are returning the logits directly (no need to apply softmax)\n    output = logits\n  else:\n    # Add softmax predictions to graph: will return probabilities\n    output = tf.nn.softmax(logits)\n\n  # Restore the moving average version of the learned variables for eval.\n  variable_averages = tf.train.ExponentialMovingAverage(MOVING_AVERAGE_DECAY)\n  variables_to_restore = variable_averages.variables_to_restore()\n  saver = tf.train.Saver(variables_to_restore)\n\n  # Will hold the result\n  preds = np.zeros((data_length, FLAGS.nb_labels), dtype=np.float32)\n\n  # Create TF session\n  with tf.Session() as sess:\n    # Restore TF session from checkpoint file\n    saver.restore(sess, ckpt_path)\n\n    # Parse data by batch\n    for batch_nb in xrange(0, int(nb_batches+1)):\n      # Compute batch start and end indices\n      start, end = utils.batch_indices(batch_nb, data_length, FLAGS.batch_size)\n\n      # Prepare feed dictionary\n      feed_dict = {train_data_node: images[start:end]}\n\n      # Run session ([0] because run returns a batch with len 1st dim == 1)\n      preds[start:end, :] = sess.run([output], feed_dict=feed_dict)[0]\n\n  # Reset graph to allow multiple calls\n  tf.reset_default_graph()\n\n  return preds\n\n\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/input.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport cPickle\nimport gzip\nimport math\nimport numpy as np\nimport os\nfrom scipy.io import loadmat\nfrom six.moves import urllib\nimport sys\nimport tarfile\n\nimport tensorflow as tf\n\nFLAGS = tf.flags.FLAGS\n\n\ndef create_dir_if_needed(dest_directory):\n  \"\"\"\n  Create directory if it doesn't exist\n  :param dest_directory: the directory to create if it is missing\n  :return: True if everything went well\n  \"\"\"\n  if not tf.gfile.IsDirectory(dest_directory):\n    tf.gfile.MakeDirs(dest_directory)\n\n  return True\n\n\ndef maybe_download(file_urls, directory):\n  \"\"\"\n  Download a set of files into a temporary local folder\n  :param file_urls: the URLs of the files to download\n  :param directory: the directory where to download\n  :return: a tuple of filepaths corresponding to the files given as input\n  \"\"\"\n  # Create directory if doesn't exist\n  assert create_dir_if_needed(directory)\n\n  # This list will include all URLS of the local copy of downloaded files\n  result = []\n\n  # For each file of the dataset\n  for file_url in file_urls:\n    # Extract filename\n    filename = file_url.split('/')[-1]\n\n    # If downloading from GitHub, remove suffix ?raw=True from local filename\n    if 
filename.endswith(\"?raw=true\"):\n      filename = filename[:-9]\n\n    # Deduce local file url\n    filepath = os.path.join(directory, filename)\n\n    # Add to result list\n    result.append(filepath)\n\n    # Test if file already exists\n    if not tf.gfile.Exists(filepath):\n      def _progress(count, block_size, total_size):\n        sys.stdout.write('\\r>> Downloading %s %.1f%%' % (filename,\n            float(count * block_size) / float(total_size) * 100.0))\n        sys.stdout.flush()\n      filepath, _ = urllib.request.urlretrieve(file_url, filepath, _progress)\n      print()\n      statinfo = os.stat(filepath)\n      print('Successfully downloaded', filename, statinfo.st_size, 'bytes.')\n\n  return result\n\n\ndef image_whitening(data):\n  \"\"\"\n  Subtracts mean of image and divides by adjusted standard variance (for\n  stability). Operations are per image but performed for the entire array.\n  :param data: 4D array (ID, Height, Width, Channel)\n  :return: 4D array (ID, Height, Width, Channel)\n  \"\"\"\n  assert len(np.shape(data)) == 4\n\n  # Compute number of pixels in image\n  nb_pixels = np.shape(data)[1] * np.shape(data)[2] * np.shape(data)[3]\n\n  # Subtract mean\n  mean = np.mean(data, axis=(1,2,3))\n\n  ones = np.ones(np.shape(data)[1:4], dtype=np.float32)\n  for i in xrange(len(data)):\n    data[i, :, :, :] -= mean[i] * ones\n\n  # Compute adjusted standard variance\n  adj_std_var = np.maximum(np.ones(len(data), dtype=np.float32) / math.sqrt(nb_pixels), np.std(data, axis=(1,2,3))) #NOLINT(long-line)\n\n  # Divide image\n  for i in xrange(len(data)):\n    data[i, :, :, :] = data[i, :, :, :] / adj_std_var[i]\n\n  return data\n\n\ndef extract_svhn(local_url):\n  \"\"\"\n  Extract a MATLAB matrix into two numpy arrays with data and labels\n  :param local_url: the path of the SVHN .mat file stored locally\n  :return: a tuple of (data, labels)\n  \"\"\"\n\n  with tf.gfile.Open(local_url, mode='r') as file_obj:\n    # Load MATLAB matrix using scipy IO\n 
   dict = loadmat(file_obj)\n\n    # Extract each dictionary (one for data, one for labels)\n    data, labels = dict[\"X\"], dict[\"y\"]\n\n    # Set np type\n    data = np.asarray(data, dtype=np.float32)\n    labels = np.asarray(labels, dtype=np.int32)\n\n    # Transpose data to match TF model input format\n    data = data.transpose(3, 0, 1, 2)\n\n    # Fix the SVHN labels which label 0s as 10s\n    labels[labels == 10] = 0\n\n    # Fix label dimensions\n    labels = labels.reshape(len(labels))\n\n    return data, labels\n\n\ndef unpickle_cifar_dic(file):\n  \"\"\"\n  Helper function: unpickles a dictionary (used for loading CIFAR)\n  :param file: filename of the pickle\n  :return: tuple of (images, labels)\n  \"\"\"\n  fo = open(file, 'rb')\n  dict = cPickle.load(fo)\n  fo.close()\n  return dict['data'], dict['labels']\n\n\ndef extract_cifar10(local_url, data_dir):\n  \"\"\"\n  Extracts the CIFAR-10 dataset and return numpy arrays with the different sets\n  :param local_url: where the tar.gz archive is located locally\n  :param data_dir: where to extract the archive's file\n  :return: a tuple (train data, train labels, test data, test labels)\n  \"\"\"\n  # These numpy dumps can be reloaded to avoid performing the pre-processing\n  # if they exist in the working directory.\n  # Changing the order of this list will ruin the indices below.\n  preprocessed_files = ['/cifar10_train.npy',\n                        '/cifar10_train_labels.npy',\n                        '/cifar10_test.npy',\n                        '/cifar10_test_labels.npy']\n\n  all_preprocessed = True\n  for file in preprocessed_files:\n    if not tf.gfile.Exists(data_dir + file):\n      all_preprocessed = False\n      break\n\n  if all_preprocessed:\n    # Reload pre-processed training data from numpy dumps\n    with tf.gfile.Open(data_dir + preprocessed_files[0], mode='r') as file_obj:\n      train_data = np.load(file_obj)\n    with tf.gfile.Open(data_dir + preprocessed_files[1], mode='r') as 
file_obj:\n      train_labels = np.load(file_obj)\n\n    # Reload pre-processed testing data from numpy dumps\n    with tf.gfile.Open(data_dir + preprocessed_files[2], mode='r') as file_obj:\n      test_data = np.load(file_obj)\n    with tf.gfile.Open(data_dir + preprocessed_files[3], mode='r') as file_obj:\n      test_labels = np.load(file_obj)\n\n  else:\n    # Do everything from scratch\n    # Define lists of all files we should extract\n    train_files = [\"data_batch_\" + str(i) for i in xrange(1,6)]\n    test_file = [\"test_batch\"]\n    cifar10_files = train_files + test_file\n\n    # Check if all files have already been extracted\n    need_to_unpack = False\n    for file in cifar10_files:\n      if not tf.gfile.Exists(file):\n        need_to_unpack = True\n        break\n\n    # We have to unpack the archive\n    if need_to_unpack:\n      tarfile.open(local_url, 'r:gz').extractall(data_dir)\n\n    # Load training images and labels\n    images = []\n    labels = []\n    for file in train_files:\n      # Construct filename\n      filename = data_dir + \"/cifar-10-batches-py/\" + file\n\n      # Unpickle dictionary and extract images and labels\n      images_tmp, labels_tmp = unpickle_cifar_dic(filename)\n\n      # Append to lists\n      images.append(images_tmp)\n      labels.append(labels_tmp)\n\n    # Convert to numpy arrays and reshape in the expected format\n    train_data = np.asarray(images, dtype=np.float32).reshape((50000,3,32,32))\n    train_data = np.swapaxes(train_data, 1, 3)\n    train_labels = np.asarray(labels, dtype=np.int32).reshape(50000)\n\n    # Save so we don't have to do this again\n    np.save(data_dir + preprocessed_files[0], train_data)\n    np.save(data_dir + preprocessed_files[1], train_labels)\n\n    # Construct filename for test file\n    filename = data_dir + \"/cifar-10-batches-py/\" + test_file[0]\n\n    # Load test images and labels\n    test_data, test_images = unpickle_cifar_dic(filename)\n\n    # Convert to numpy arrays and 
reshape in the expected format\n    test_data = np.asarray(test_data, dtype=np.float32).reshape((10000,3,32,32))\n    test_data = np.swapaxes(test_data, 1, 3)\n    test_labels = np.asarray(test_images, dtype=np.int32).reshape(10000)\n\n    # Save so we don't have to do this again\n    np.save(data_dir + preprocessed_files[2], test_data)\n    np.save(data_dir + preprocessed_files[3], test_labels)\n\n  return train_data, train_labels, test_data, test_labels\n\n\ndef extract_mnist_data(filename, num_images, image_size, pixel_depth):\n  \"\"\"\n  Extract the images into a 4D tensor [image index, y, x, channels].\n\n  Values are rescaled from [0, 255] down to [-0.5, 0.5].\n  \"\"\"\n  if not tf.gfile.Exists(filename+\".npy\"):\n    with gzip.open(filename) as bytestream:\n      bytestream.read(16)\n      buf = bytestream.read(image_size * image_size * num_images)\n      data = np.frombuffer(buf, dtype=np.uint8).astype(np.float32)\n      data = (data - (pixel_depth / 2.0)) / pixel_depth\n      data = data.reshape(num_images, image_size, image_size, 1)\n      np.save(filename, data)\n      return data\n  else:\n    with tf.gfile.Open(filename+\".npy\", mode='r') as file_obj:\n      return np.load(file_obj)\n\n\ndef extract_mnist_labels(filename, num_images):\n  \"\"\"\n  Extract the labels into a vector of int32 label IDs.\n  \"\"\"\n  if not tf.gfile.Exists(filename+\".npy\"):\n    with gzip.open(filename) as bytestream:\n      bytestream.read(8)\n      buf = bytestream.read(1 * num_images)\n      labels = np.frombuffer(buf, dtype=np.uint8).astype(np.int32)\n      np.save(filename, labels)\n    return labels\n  else:\n    with tf.gfile.Open(filename+\".npy\", mode='r') as file_obj:\n      return np.load(file_obj)\n\n\ndef ld_svhn(extended=False, test_only=False):\n  \"\"\"\n  Load the original SVHN data\n  :param extended: include extended training data in the returned array\n  :param test_only: disables 
loading of both train and extra -> large speed up\n  :return: tuple of arrays which depend on the parameters\n  \"\"\"\n  # Define files to be downloaded\n  # WARNING: changing the order of this list will break indices (cf. below)\n  file_urls = ['http://ufldl.stanford.edu/housenumbers/train_32x32.mat',\n               'http://ufldl.stanford.edu/housenumbers/test_32x32.mat',\n               'http://ufldl.stanford.edu/housenumbers/extra_32x32.mat']\n\n  # Maybe download data and retrieve local storage urls\n  local_urls = maybe_download(file_urls, FLAGS.data_dir)\n\n  # Extract Train, Test, and Extended Train data\n  if not test_only:\n    # Load and apply whitening to train data\n    train_data, train_labels = extract_svhn(local_urls[0])\n    train_data = image_whitening(train_data)\n\n    # Load and apply whitening to extended train data\n    ext_data, ext_labels = extract_svhn(local_urls[2])\n    ext_data = image_whitening(ext_data)\n\n  # Load and apply whitening to test data\n  test_data, test_labels = extract_svhn(local_urls[1])\n  test_data = image_whitening(test_data)\n\n  if test_only:\n    return test_data, test_labels\n  else:\n    if extended:\n      # Stack train data with the extended training data\n      train_data = np.vstack((train_data, ext_data))\n      train_labels = np.hstack((train_labels, ext_labels))\n\n      return train_data, train_labels, test_data, test_labels\n    else:\n      # Return training and extended training data separately\n      return train_data,train_labels, test_data,test_labels, ext_data,ext_labels\n\n\ndef ld_cifar10(test_only=False):\n  \"\"\"\n  Load the original CIFAR10 data\n  :param test_only: disables loading of the train data -> large speed up\n  :return: tuple of arrays which depend on the parameters\n  \"\"\"\n  # Define files to be downloaded\n  file_urls = ['https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz']\n\n  # Maybe 
download data and retrieve local storage urls\n  local_urls = maybe_download(file_urls, FLAGS.data_dir)\n\n  # Extract archives and return different sets\n  dataset = extract_cifar10(local_urls[0], FLAGS.data_dir)\n\n  # Unpack tuple\n  train_data, train_labels, test_data, test_labels = dataset\n\n  # Apply whitening to input data\n  train_data = image_whitening(train_data)\n  test_data = image_whitening(test_data)\n\n  if test_only:\n    return test_data, test_labels\n  else:\n    return train_data, train_labels, test_data, test_labels\n\n\ndef ld_mnist(test_only=False):\n  \"\"\"\n  Load the MNIST dataset\n  :param test_only: disables loading of the train data -> large speed up\n  :return: tuple of arrays which depend on the parameters\n  \"\"\"\n  # Define files to be downloaded\n  # WARNING: changing the order of this list will break indices (cf. below)\n  file_urls = ['http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz',\n               'http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz',\n               'http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz',\n               'http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz',\n               ]\n\n  # Maybe download data and retrieve local storage urls\n  local_urls = maybe_download(file_urls, FLAGS.data_dir)\n\n  # Extract into np arrays; pixel depth 255 rescales values to [-0.5, 0.5]\n  train_data = extract_mnist_data(local_urls[0], 60000, 28, 255)\n  train_labels = extract_mnist_labels(local_urls[1], 60000)\n  test_data = extract_mnist_data(local_urls[2], 10000, 28, 255)\n  test_labels = extract_mnist_labels(local_urls[3], 10000)\n\n  if test_only:\n    return test_data, test_labels\n  else:\n    return train_data, train_labels, test_data, test_labels\n\n\ndef partition_dataset(data, labels, nb_teachers, teacher_id):\n  \"\"\"\n  Simple partitioning algorithm that returns the right portion of the data\n  needed by a given teacher out of a 
certain nb of teachers\n  :param data: input data to be partitioned\n  :param labels: output data to be partitioned\n  :param nb_teachers: number of teachers in the ensemble (affects size of each\n                      partition)\n  :param teacher_id: id of partition to retrieve\n  :return: a pair of (data, labels) for the teacher's partition\n  \"\"\"\n\n  # Sanity check\n  assert len(data) == len(labels)\n  assert int(teacher_id) < int(nb_teachers)\n\n  # Floor division gives the number of samples in each partition\n  batch_len = int(len(data) / nb_teachers)\n\n  # Compute start, end indices of partition\n  start = teacher_id * batch_len\n  end = (teacher_id+1) * batch_len\n\n  # Slice partition off\n  partition_data = data[start:end]\n  partition_labels = labels[start:end]\n\n  return partition_data, partition_labels\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/metrics.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\n\n\ndef accuracy(logits, labels):\n  \"\"\"\n  Return accuracy of the array of logits (or label predictions) wrt the labels\n  :param logits: this can either be logits, probabilities, or a single label\n  :param labels: the correct labels to match against\n  :return: the accuracy as a float\n  \"\"\"\n  assert len(logits) == len(labels)\n\n  if len(np.shape(logits)) > 1:\n    # Predicted labels are the argmax over axis 1\n    predicted_labels = np.argmax(logits, axis=1)\n  else:\n    # Input was already labels\n    assert len(np.shape(logits)) == 1\n    predicted_labels = logits\n\n  # Check against correct labels to compute correct guesses\n  correct = np.sum(predicted_labels == labels.reshape(len(labels)))\n\n  # Divide by number of labels to obtain accuracy\n  accuracy = float(correct) / len(labels)\n\n  # Return float value\n  return accuracy\n\n\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/train_student.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom differential_privacy.multiple_teachers import aggregation\nfrom differential_privacy.multiple_teachers import deep_cnn\nfrom differential_privacy.multiple_teachers import input\nfrom differential_privacy.multiple_teachers import metrics\n\nFLAGS = tf.flags.FLAGS\n\ntf.flags.DEFINE_string('dataset', 'svhn', 'The name of the dataset to use')\ntf.flags.DEFINE_integer('nb_labels', 10, 'Number of output classes')\n\ntf.flags.DEFINE_string('data_dir', '/tmp', 'Temporary storage')\ntf.flags.DEFINE_string('train_dir', '/tmp/train_dir',\n                       'Where model checkpoints are saved')\ntf.flags.DEFINE_string('teachers_dir', '/tmp/train_dir',\n                       'Directory where teachers checkpoints are stored.')\n\ntf.flags.DEFINE_integer('teachers_max_steps', 3000,\n                        'Number of steps teachers were run.')\ntf.flags.DEFINE_integer('max_steps', 3000, 'Number of steps to run student.')\ntf.flags.DEFINE_integer('nb_teachers', 10, 'Teachers in the ensemble.')\ntf.flags.DEFINE_integer('stdnt_share', 1000,\n                        'Student share (last index) of the test 
data')\ntf.flags.DEFINE_integer('lap_scale', 10,\n                        'Scale of the Laplacian noise added for privacy')\ntf.flags.DEFINE_boolean('save_labels', False,\n                        'Dump numpy arrays of labels and clean teacher votes')\ntf.flags.DEFINE_boolean('deeper', False, 'Activate deeper CNN model')\n\n\ndef ensemble_preds(dataset, nb_teachers, stdnt_data):\n  \"\"\"\n  Given a dataset, a number of teachers, and some input data, this helper\n  function queries each teacher for predictions on the data and returns\n  all predictions in a single array. (That can then be aggregated into\n  one single prediction per input using aggregation.py (cf. function\n  prepare_student_data() below)\n  :param dataset: string corresponding to mnist, cifar10, or svhn\n  :param nb_teachers: number of teachers (in the ensemble) to learn from\n  :param stdnt_data: unlabeled student training data\n  :return: 3d array (teacher id, sample id, probability per class)\n  \"\"\"\n\n  # Compute shape of array that will hold probabilities produced by each\n  # teacher, for each training point, and each output class\n  result_shape = (nb_teachers, len(stdnt_data), FLAGS.nb_labels)\n\n  # Create array that will hold result\n  result = np.zeros(result_shape, dtype=np.float32)\n\n  # Get predictions from each teacher\n  for teacher_id in xrange(nb_teachers):\n    # Compute path of checkpoint file for teacher model with ID teacher_id\n    if FLAGS.deeper:\n      ckpt_path = FLAGS.teachers_dir + '/' + str(dataset) + '_' + str(nb_teachers) + '_teachers_' + str(teacher_id) + '_deep.ckpt-' + str(FLAGS.teachers_max_steps - 1) #NOLINT(long-line)\n    else:\n      ckpt_path = FLAGS.teachers_dir + '/' + str(dataset) + '_' + str(nb_teachers) + '_teachers_' + str(teacher_id) + '.ckpt-' + str(FLAGS.teachers_max_steps - 1)  # NOLINT(long-line)\n\n    # Get predictions on our training data and store in result array\n    result[teacher_id] = deep_cnn.softmax_preds(stdnt_data, ckpt_path)\n\n   
 # This can take a while when there are a lot of teachers so output status\n    print(\"Computed Teacher \" + str(teacher_id) + \" softmax predictions\")\n\n  return result\n\n\ndef prepare_student_data(dataset, nb_teachers, save=False):\n  \"\"\"\n  Takes a dataset name and the size of the teacher ensemble and prepares\n  training data for the student model, according to parameters indicated\n  in flags above.\n  :param dataset: string corresponding to mnist, cifar10, or svhn\n  :param nb_teachers: number of teachers (in the ensemble) to learn from\n  :param save: if set to True, will dump student training labels predicted by\n               the ensemble of teachers (with Laplacian noise) as npy files.\n               It also dumps the clean votes for each class (without noise) and\n               the labels assigned by teachers\n  :return: pairs of (data, labels) to be used for student training and testing\n  \"\"\"\n  assert input.create_dir_if_needed(FLAGS.train_dir)\n\n  # Load the dataset\n  if dataset == 'svhn':\n    test_data, test_labels = input.ld_svhn(test_only=True)\n  elif dataset == 'cifar10':\n    test_data, test_labels = input.ld_cifar10(test_only=True)\n  elif dataset == 'mnist':\n    test_data, test_labels = input.ld_mnist(test_only=True)\n  else:\n    print(\"Check value of dataset flag\")\n    return False\n\n  # Make sure there is data leftover to be used as a test set\n  assert FLAGS.stdnt_share < len(test_data)\n\n  # Prepare [unlabeled] student training data (subset of test set)\n  stdnt_data = test_data[:FLAGS.stdnt_share]\n\n  # Compute teacher predictions for student training data\n  teachers_preds = ensemble_preds(dataset, nb_teachers, stdnt_data)\n\n  # Aggregate teacher predictions to get student training labels\n  if not save:\n    stdnt_labels = aggregation.noisy_max(teachers_preds, FLAGS.lap_scale)\n  else:\n    # Request clean votes and clean labels as well\n    stdnt_labels, clean_votes, labels_for_dump = 
aggregation.noisy_max(teachers_preds, FLAGS.lap_scale, return_clean_votes=True) #NOLINT(long-line)\n\n    # Prepare filepath for numpy dump of clean votes\n    filepath = FLAGS.data_dir + \"/\" + str(dataset) + '_' + str(nb_teachers) + '_student_clean_votes_lap_' + str(FLAGS.lap_scale) + '.npy'  # NOLINT(long-line)\n\n    # Prepare filepath for numpy dump of clean labels\n    filepath_labels = FLAGS.data_dir + \"/\" + str(dataset) + '_' + str(nb_teachers) + '_teachers_labels_lap_' + str(FLAGS.lap_scale) + '.npy'  # NOLINT(long-line)\n\n    # Dump clean_votes array\n    with tf.gfile.Open(filepath, mode='w') as file_obj:\n      np.save(file_obj, clean_votes)\n\n    # Dump labels_for_dump array\n    with tf.gfile.Open(filepath_labels, mode='w') as file_obj:\n      np.save(file_obj, labels_for_dump)\n\n  # Print accuracy of aggregated labels\n  ac_ag_labels = metrics.accuracy(stdnt_labels, test_labels[:FLAGS.stdnt_share])\n  print(\"Accuracy of the aggregated labels: \" + str(ac_ag_labels))\n\n  # Store unused part of test set for use as a test set after student training\n  stdnt_test_data = test_data[FLAGS.stdnt_share:]\n  stdnt_test_labels = test_labels[FLAGS.stdnt_share:]\n\n  if save:\n    # Prepare filepath for numpy dump of labels produced by noisy aggregation\n    filepath = FLAGS.data_dir + \"/\" + str(dataset) + '_' + str(nb_teachers) + '_student_labels_lap_' + str(FLAGS.lap_scale) + '.npy' #NOLINT(long-line)\n\n    # Dump student noisy labels array\n    with tf.gfile.Open(filepath, mode='w') as file_obj:\n      np.save(file_obj, stdnt_labels)\n\n  return stdnt_data, stdnt_labels, stdnt_test_data, stdnt_test_labels\n\n\ndef train_student(dataset, nb_teachers):\n  \"\"\"\n  This function trains a student using predictions made by an ensemble of\n  teachers. 
The student and teacher models are trained using the same\n  neural network architecture.\n  :param dataset: string corresponding to mnist, cifar10, or svhn\n  :param nb_teachers: number of teachers (in the ensemble) to learn from\n  :return: True if student training went well\n  \"\"\"\n  assert input.create_dir_if_needed(FLAGS.train_dir)\n\n  # Call helper function to prepare student data using teacher predictions\n  stdnt_dataset = prepare_student_data(dataset, nb_teachers, save=True)\n\n  # Unpack the student dataset\n  stdnt_data, stdnt_labels, stdnt_test_data, stdnt_test_labels = stdnt_dataset\n\n  # Prepare checkpoint filename and path\n  if FLAGS.deeper:\n    ckpt_path = FLAGS.train_dir + '/' + str(dataset) + '_' + str(nb_teachers) + '_student_deeper.ckpt' #NOLINT(long-line)\n  else:\n    ckpt_path = FLAGS.train_dir + '/' + str(dataset) + '_' + str(nb_teachers) + '_student.ckpt'  # NOLINT(long-line)\n\n  # Start student training\n  assert deep_cnn.train(stdnt_data, stdnt_labels, ckpt_path)\n\n  # Compute final checkpoint name for student (with max number of steps)\n  ckpt_path_final = ckpt_path + '-' + str(FLAGS.max_steps - 1)\n\n  # Compute student label predictions on remaining chunk of test set\n  student_preds = deep_cnn.softmax_preds(stdnt_test_data, ckpt_path_final)\n\n  # Compute student accuracy\n  precision = metrics.accuracy(student_preds, stdnt_test_labels)\n  print('Precision of student after training: ' + str(precision))\n\n  return True\n\n\ndef main(argv=None): # pylint: disable=unused-argument\n  # Run student training according to values specified in flags\n  assert train_student(FLAGS.dataset, FLAGS.nb_teachers)\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/train_student_mnist_250_lap_20_count_50_epochs_600.sh",
    "content": "#!/bin/bash\n# Be sure to clone https://github.com/openai/improved-gan\n# and add improved-gan/mnist_svhn_cifar10 to your PATH variable\n\n# Download labels used to train the student (?raw=true fetches the raw numpy\n# file rather than the GitHub HTML page)\nwget https://github.com/npapernot/multiple-teachers-for-privacy/blob/master/mnist_250_student_labels_lap_20.npy?raw=true -O mnist_250_student_labels_lap_20.npy\n\n# Train the student using improved-gan\nTHEANO_FLAGS='floatX=float32,device=gpu,lib.cnmem=1' train_mnist_fm_custom_labels.py --labels mnist_250_student_labels_lap_20.npy --count 50 --epochs 600\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/train_teachers.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom differential_privacy.multiple_teachers import deep_cnn\nfrom differential_privacy.multiple_teachers import input\nfrom differential_privacy.multiple_teachers import metrics\n\n\ntf.flags.DEFINE_string('dataset', 'svhn', 'The name of the dataset to use')\ntf.flags.DEFINE_integer('nb_labels', 10, 'Number of output classes')\n\ntf.flags.DEFINE_string('data_dir', '/tmp', 'Temporary storage')\ntf.flags.DEFINE_string('train_dir', '/tmp/train_dir',\n                       'Where model checkpoints are saved')\n\ntf.flags.DEFINE_integer('max_steps', 3000, 'Number of training steps to run.')\ntf.flags.DEFINE_integer('nb_teachers', 50, 'Teachers in the ensemble.')\ntf.flags.DEFINE_integer('teacher_id', 0, 'ID of teacher being trained.')\n\ntf.flags.DEFINE_boolean('deeper', False, 'Activate deeper CNN model')\n\nFLAGS = tf.flags.FLAGS\n\n\ndef train_teacher(dataset, nb_teachers, teacher_id):\n  \"\"\"\n  This function trains a teacher (teacher id) among an ensemble of nb_teachers\n  models for the dataset specified.\n  :param dataset: string corresponding to dataset (svhn, cifar10, mnist)\n  :param nb_teachers: total number of teachers 
in the ensemble\n  :param teacher_id: id of the teacher being trained\n  :return: True if everything went well\n  \"\"\"\n  # If working directories do not exist, create them\n  assert input.create_dir_if_needed(FLAGS.data_dir)\n  assert input.create_dir_if_needed(FLAGS.train_dir)\n\n  # Load the dataset\n  if dataset == 'svhn':\n    train_data,train_labels,test_data,test_labels = input.ld_svhn(extended=True)\n  elif dataset == 'cifar10':\n    train_data, train_labels, test_data, test_labels = input.ld_cifar10()\n  elif dataset == 'mnist':\n    train_data, train_labels, test_data, test_labels = input.ld_mnist()\n  else:\n    print(\"Check value of dataset flag\")\n    return False\n    \n  # Retrieve subset of data for this teacher\n  data, labels = input.partition_dataset(train_data, \n                                         train_labels, \n                                         nb_teachers, \n                                         teacher_id)\n\n  print(\"Length of training data: \" + str(len(labels)))\n\n  # Define teacher checkpoint filename and full path\n  if FLAGS.deeper:\n    filename = str(nb_teachers) + '_teachers_' + str(teacher_id) + '_deep.ckpt'\n  else:\n    filename = str(nb_teachers) + '_teachers_' + str(teacher_id) + '.ckpt'\n  ckpt_path = FLAGS.train_dir + '/' + str(dataset) + '_' + filename\n\n  # Perform teacher training\n  assert deep_cnn.train(data, labels, ckpt_path)\n\n  # Append final step value to checkpoint for evaluation\n  ckpt_path_final = ckpt_path + '-' + str(FLAGS.max_steps - 1)\n\n  # Retrieve teacher probability estimates on the test data\n  teacher_preds = deep_cnn.softmax_preds(test_data, ckpt_path_final)\n\n  # Compute teacher accuracy\n  precision = metrics.accuracy(teacher_preds, test_labels)\n  print('Precision of teacher after training: ' + str(precision))\n\n  return True\n\n\ndef main(argv=None):  # pylint: disable=unused-argument\n  # Make a call to train_teachers with values specified in flags\n  assert 
train_teacher(FLAGS.dataset, FLAGS.nb_teachers, FLAGS.teacher_id)\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/multiple_teachers/utils.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\ndef batch_indices(batch_nb, data_length, batch_size):\n  \"\"\"\n  This helper function computes a batch start and end index\n  :param batch_nb: the batch number\n  :param data_length: the total length of the data being parsed by batches\n  :param batch_size: the number of inputs in each batch\n  :return: pair of (start, end) indices\n  \"\"\"\n  # Batch start and end index\n  start = int(batch_nb * batch_size)\n  end = int((batch_nb + 1) * batch_size)\n\n  # When there are not enough inputs left, we reuse some to complete the batch\n  if end > data_length:\n    shift = end - data_length\n    start -= shift\n    end -= shift\n\n  return start, end\n"
  },
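The shift-back behavior of `batch_indices` above (keeping the final batch full by reusing some inputs) can be exercised standalone; the function is re-declared here for illustration:

```python
def batch_indices(batch_nb, data_length, batch_size):
    """Compute (start, end) for a batch; if the last batch would overrun
    the data, shift it backwards so it stays full (reusing some inputs)."""
    start = int(batch_nb * batch_size)
    end = int((batch_nb + 1) * batch_size)
    if end > data_length:
        shift = end - data_length
        start -= shift
        end -= shift
    return start, end

print(batch_indices(0, 10, 4))  # (0, 4)
print(batch_indices(1, 10, 4))  # (4, 8)
print(batch_indices(2, 10, 4))  # (6, 10): shifted back by 2, reusing inputs 6-7
```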
  {
    "path": "model_zoo/models/differential_privacy/privacy_accountant/python/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//third_party/tensorflow_models/...\",\n    ],\n)\n\npy_binary(\n    name = \"gaussian_moments\",\n    srcs = [\n        \"gaussian_moments.py\",\n    ],\n    deps = [\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/privacy_accountant/python/gaussian_moments.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"A standalone utility for computing the log moments.\n\nThe utility for computing the log moments. It consists of two methods.\ncompute_log_moment(q, sigma, T, lmbd) computes the log moment with sampling\nprobability q, noise sigma, order lmbd, and T steps. get_privacy_spent computes\ndelta (or eps) given log moments and eps (or delta).\n\nExample use:\n\nSuppose that we have run an algorithm with parameters, an array of\n(q1, sigma1, T1) ... (qk, sigmak, Tk), and we wish to compute eps for a given\ndelta. 
The example code would be:\n\n  max_lmbd = 32\n  lmbds = xrange(1, max_lmbd + 1)\n  log_moments = []\n  for lmbd in lmbds:\n    log_moment = 0\n    for q, sigma, T in parameters:\n      log_moment += compute_log_moment(q, sigma, T, lmbd)\n    log_moments.append((lmbd, log_moment))\n  eps, delta = get_privacy_spent(log_moments, target_delta=delta)\n\nTo verify that I1 >= I2 (see comments in GaussianMomentsAccountant in\naccountant.py for the context), run the same loop above with verify=True\npassed to compute_log_moment.\n\"\"\"\nimport math\nimport sys\n\nimport numpy as np\nimport scipy.integrate as integrate\nimport scipy.special\nimport scipy.stats\nfrom sympy.mpmath import mp\n\n\ndef _to_np_float64(v):\n  if math.isnan(v) or math.isinf(v):\n    return np.inf\n  return np.float64(v)\n\n\n######################\n# FLOAT64 ARITHMETIC #\n######################\n\n\ndef pdf_gauss(x, sigma, mean=0):\n  return scipy.stats.norm.pdf(x, loc=mean, scale=sigma)\n\n\ndef cropped_ratio(a, b):\n  if a < 1E-50 and b < 1E-50:\n    return 1.\n  else:\n    return a / b\n\n\ndef integral_inf(fn):\n  integral, _ = integrate.quad(fn, -np.inf, np.inf)\n  return integral\n\n\ndef integral_bounded(fn, lb, ub):\n  integral, _ = integrate.quad(fn, lb, ub)\n  return integral\n\n\ndef distributions(sigma, q):\n  mu0 = lambda y: pdf_gauss(y, sigma=sigma, mean=0.0)\n  mu1 = lambda y: pdf_gauss(y, sigma=sigma, mean=1.0)\n  mu = lambda y: (1 - q) * mu0(y) + q * mu1(y)\n  return mu0, mu1, mu\n\n\ndef compute_a(sigma, q, lmbd, verbose=False):\n  lmbd_int = int(math.ceil(lmbd))\n  if lmbd_int == 0:\n    return 1.0\n\n  a_lambda_first_term_exact = 0\n  a_lambda_second_term_exact = 0\n  for i in xrange(lmbd_int + 1):\n    coef_i = scipy.special.binom(lmbd_int, i) * (q ** i)\n    s1, s2 = 0, 0\n    for j in xrange(i + 1):\n      coef_j = scipy.special.binom(i, j) * (-1) ** (i - j)\n      s1 += coef_j * np.exp((j * j - j) / (2.0 * (sigma ** 2)))\n      s2 += coef_j * np.exp((j * j + j) / (2.0 * (sigma ** 2)))\n    
a_lambda_first_term_exact += coef_i * s1\n    a_lambda_second_term_exact += coef_i * s2\n\n  a_lambda_exact = ((1.0 - q) * a_lambda_first_term_exact +\n                    q * a_lambda_second_term_exact)\n  if verbose:\n    print \"A: by binomial expansion    {} = {} + {}\".format(\n        a_lambda_exact,\n        (1.0 - q) * a_lambda_first_term_exact,\n        q * a_lambda_second_term_exact)\n  return _to_np_float64(a_lambda_exact)\n\n\ndef compute_b(sigma, q, lmbd, verbose=False):\n  mu0, _, mu = distributions(sigma, q)\n\n  b_lambda_fn = lambda z: mu0(z) * np.power(cropped_ratio(mu0(z), mu(z)), lmbd)\n  b_lambda = integral_inf(b_lambda_fn)\n  m = sigma ** 2 * (np.log((2. - q) / (1. - q)) + 1. / (2 * sigma ** 2))\n\n  b_fn = lambda z: (np.power(mu0(z) / mu(z), lmbd) -\n                    np.power(mu(-z) / mu0(z), lmbd))\n  if verbose:\n    print \"M =\", m\n    print \"f(-M) = {} f(M) = {}\".format(b_fn(-m), b_fn(m))\n    assert b_fn(-m) < 0 and b_fn(m) < 0\n\n  b_lambda_int1_fn = lambda z: (mu0(z) *\n                                np.power(cropped_ratio(mu0(z), mu(z)), lmbd))\n  b_lambda_int2_fn = lambda z: (mu0(z) *\n                                np.power(cropped_ratio(mu(z), mu0(z)), lmbd))\n  b_int1 = integral_bounded(b_lambda_int1_fn, -m, m)\n  b_int2 = integral_bounded(b_lambda_int2_fn, -m, m)\n\n  a_lambda_m1 = compute_a(sigma, q, lmbd - 1)\n  b_bound = a_lambda_m1 + b_int1 - b_int2\n\n  if verbose:\n    print \"B: by numerical integration\", b_lambda\n    print \"B must be no more than     \", b_bound\n  print b_lambda, b_bound\n  return _to_np_float64(b_lambda)\n\n\n###########################\n# MULTIPRECISION ROUTINES #\n###########################\n\n\ndef pdf_gauss_mp(x, sigma, mean):\n  return mp.mpf(1.) 
/ mp.sqrt(mp.mpf(\"2.\") * sigma ** 2 * mp.pi) * mp.exp(\n      - (x - mean) ** 2 / (mp.mpf(\"2.\") * sigma ** 2))\n\n\ndef integral_inf_mp(fn):\n  integral, _ = mp.quad(fn, [-mp.inf, mp.inf], error=True)\n  return integral\n\n\ndef integral_bounded_mp(fn, lb, ub):\n  integral, _ = mp.quad(fn, [lb, ub], error=True)\n  return integral\n\n\ndef distributions_mp(sigma, q):\n  mu0 = lambda y: pdf_gauss_mp(y, sigma=sigma, mean=mp.mpf(0))\n  mu1 = lambda y: pdf_gauss_mp(y, sigma=sigma, mean=mp.mpf(1))\n  mu = lambda y: (1 - q) * mu0(y) + q * mu1(y)\n  return mu0, mu1, mu\n\n\ndef compute_a_mp(sigma, q, lmbd, verbose=False):\n  lmbd_int = int(math.ceil(lmbd))\n  if lmbd_int == 0:\n    return 1.0\n\n  mu0, mu1, mu = distributions_mp(sigma, q)\n  a_lambda_fn = lambda z: mu(z) * (mu(z) / mu0(z)) ** lmbd_int\n  a_lambda_first_term_fn = lambda z: mu0(z) * (mu(z) / mu0(z)) ** lmbd_int\n  a_lambda_second_term_fn = lambda z: mu1(z) * (mu(z) / mu0(z)) ** lmbd_int\n\n  a_lambda = integral_inf_mp(a_lambda_fn)\n  a_lambda_first_term = integral_inf_mp(a_lambda_first_term_fn)\n  a_lambda_second_term = integral_inf_mp(a_lambda_second_term_fn)\n\n  if verbose:\n    print \"A: by numerical integration {} = {} + {}\".format(\n        a_lambda,\n        (1 - q) * a_lambda_first_term,\n        q * a_lambda_second_term)\n\n  return _to_np_float64(a_lambda)\n\n\ndef compute_b_mp(sigma, q, lmbd, verbose=False):\n  lmbd_int = int(math.ceil(lmbd))\n  if lmbd_int == 0:\n    return 1.0\n\n  mu0, _, mu = distributions_mp(sigma, q)\n\n  b_lambda_fn = lambda z: mu0(z) * (mu0(z) / mu(z)) ** lmbd_int\n  b_lambda = integral_inf_mp(b_lambda_fn)\n\n  m = sigma ** 2 * (mp.log((2 - q) / (1 - q)) + 1 / (2 * (sigma ** 2)))\n  b_fn = lambda z: ((mu0(z) / mu(z)) ** lmbd_int -\n                    (mu(-z) / mu0(z)) ** lmbd_int)\n  if verbose:\n    print \"M =\", m\n    print \"f(-M) = {} f(M) = {}\".format(b_fn(-m), b_fn(m))\n    assert b_fn(-m) < 0 and b_fn(m) < 0\n\n  b_lambda_int1_fn = lambda z: mu0(z) * 
(mu0(z) / mu(z)) ** lmbd_int\n  b_lambda_int2_fn = lambda z: mu0(z) * (mu(z) / mu0(z)) ** lmbd_int\n  b_int1 = integral_bounded_mp(b_lambda_int1_fn, -m, m)\n  b_int2 = integral_bounded_mp(b_lambda_int2_fn, -m, m)\n\n  a_lambda_m1 = compute_a_mp(sigma, q, lmbd - 1)\n  b_bound = a_lambda_m1 + b_int1 - b_int2\n\n  if verbose:\n    print \"B by numerical integration\", b_lambda\n    print \"B must be no more than    \", b_bound\n  assert b_lambda < b_bound + 1e-5\n  return _to_np_float64(b_lambda)\n\n\ndef _compute_delta(log_moments, eps):\n  \"\"\"Compute delta for given log_moments and eps.\n\n  Args:\n    log_moments: the log moments of privacy loss, in the form of pairs\n      of (moment_order, log_moment)\n    eps: the target epsilon.\n  Returns:\n    delta\n  \"\"\"\n  min_delta = 1.0\n  for moment_order, log_moment in log_moments:\n    if moment_order == 0:\n      continue\n    if math.isinf(log_moment) or math.isnan(log_moment):\n      sys.stderr.write(\"The %d-th order is inf or Nan\\n\" % moment_order)\n      continue\n    if log_moment < moment_order * eps:\n      min_delta = min(min_delta,\n                      math.exp(log_moment - moment_order * eps))\n  return min_delta\n\n\ndef _compute_eps(log_moments, delta):\n  \"\"\"Compute epsilon for given log_moments and delta.\n\n  Args:\n    log_moments: the log moments of privacy loss, in the form of pairs\n      of (moment_order, log_moment)\n    delta: the target delta.\n  Returns:\n    epsilon\n  \"\"\"\n  min_eps = float(\"inf\")\n  for moment_order, log_moment in log_moments:\n    if moment_order == 0:\n      continue\n    if math.isinf(log_moment) or math.isnan(log_moment):\n      sys.stderr.write(\"The %d-th order is inf or Nan\\n\" % moment_order)\n      continue\n    min_eps = min(min_eps, (log_moment - math.log(delta)) / moment_order)\n  return min_eps\n\n\ndef compute_log_moment(q, sigma, steps, lmbd, verify=False, verbose=False):\n  \"\"\"Compute the log moment of Gaussian mechanism for given 
parameters.\n\n  Args:\n    q: the sampling ratio.\n    sigma: the noise sigma.\n    steps: the number of steps.\n    lmbd: the moment order.\n    verify: if False, only compute the symbolic version. If True, computes\n      both symbolic and numerical solutions and verifies the results match.\n    verbose: if True, print out debug information.\n  Returns:\n    the log moment with type np.float64, could be np.inf.\n  \"\"\"\n  moment = compute_a(sigma, q, lmbd, verbose=verbose)\n  if verify:\n    mp.dps = 50\n    moment_a_mp = compute_a_mp(sigma, q, lmbd, verbose=verbose)\n    moment_b_mp = compute_b_mp(sigma, q, lmbd, verbose=verbose)\n    np.testing.assert_allclose(moment, moment_a_mp, rtol=1e-10)\n    if not np.isinf(moment_a_mp):\n      # The following test fails for (1, np.inf)!\n      np.testing.assert_array_less(moment_b_mp, moment_a_mp)\n  if np.isinf(moment):\n    return np.inf\n  else:\n    return np.log(moment) * steps\n\n\ndef get_privacy_spent(log_moments, target_eps=None, target_delta=None):\n  \"\"\"Compute delta (or eps) for given eps (or delta) from log moments.\n\n  Args:\n    log_moments: array of (moment_order, log_moment) pairs.\n    target_eps: if not None, the epsilon for which we would like to compute\n      corresponding delta value.\n    target_delta: if not None, the delta for which we would like to compute\n      corresponding epsilon value. Exactly one of target_eps and target_delta\n      is None.\n  Returns:\n    eps, delta pair\n  \"\"\"\n  assert (target_eps is None) ^ (target_delta is None)\n  assert not ((target_eps is None) and (target_delta is None))\n  if target_eps is not None:\n    return (target_eps, _compute_delta(log_moments, target_eps))\n  else:\n    return (_compute_eps(log_moments, target_delta), target_delta)\n"
  },
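The epsilon computation in `get_privacy_spent` / `_compute_eps` above reduces to a minimum over moment orders. A standalone sketch of that conversion, with made-up log moments (the values are illustrative only):

```python
import math

def compute_eps(log_moments, delta):
    """eps = min over moment orders L of (log_moment(L) - log(delta)) / L,
    mirroring _compute_eps in gaussian_moments.py."""
    min_eps = float("inf")
    for moment_order, log_moment in log_moments:
        # Skip the zeroth order and degenerate moments, as the original does.
        if moment_order == 0 or math.isinf(log_moment) or math.isnan(log_moment):
            continue
        min_eps = min(min_eps, (log_moment - math.log(delta)) / moment_order)
    return min_eps

# Hypothetical (moment_order, log_moment) pairs, for illustration only.
log_moments = [(1, 0.1), (2, 0.3), (4, 1.2)]
eps = compute_eps(log_moments, delta=1e-5)  # order 4 gives the tightest bound
```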
  {
    "path": "model_zoo/models/differential_privacy/privacy_accountant/tf/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//differential_privacy/...\",\n    ],\n)\n\npy_library(\n    name = \"accountant\",\n    srcs = [\n        \"accountant.py\",\n    ],\n    deps = [\n        \"//differential_privacy/dp_sgd/dp_optimizer:utils\",\n    ],\n)\n\n"
  },
  {
    "path": "model_zoo/models/differential_privacy/privacy_accountant/tf/accountant.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Defines Accountant class for keeping track of privacy spending.\n\nA privacy accountant keeps track of privacy spendings. It has methods\naccumulate_privacy_spending and get_privacy_spent. Here we only define\nAmortizedAccountant which tracks the privacy spending in the amortized\nway. It uses privacy amplication via sampling to compute the privacy\nspending for each batch and strong composition (specialized for Gaussian\nnoise) for accumulate the privacy spending.\n\"\"\"\nfrom __future__ import division\n\nimport abc\nimport collections\nimport math\nimport sys\n\nimport numpy\nimport tensorflow as tf\n\nfrom differential_privacy.dp_sgd.dp_optimizer import utils\n\nEpsDelta = collections.namedtuple(\"EpsDelta\", [\"spent_eps\", \"spent_delta\"])\n\n\n# TODO(liqzhang) To ensure the same API for AmortizedAccountant and\n# MomentsAccountant, we pass the union of arguments to both, so we\n# have unused_sigma for AmortizedAccountant and unused_eps_delta for\n# MomentsAccountant. Consider to revise the API to avoid the unused\n# arguments.  
It would be good to use @abc.abstractmethod, etc., to\n# define the common interface as a base class.\nclass AmortizedAccountant(object):\n  \"\"\"Keep track of privacy spending in an amortized way.\n\n  AmortizedAccountant accumulates the privacy spending by assuming\n  all the examples are processed uniformly at random so the spending is\n  amortized among all the examples. And we assume that we use Gaussian noise\n  so the accumulation is on eps^2 and delta, using advanced composition.\n  \"\"\"\n\n  def __init__(self, total_examples):\n    \"\"\"Initialization. Currently only supports amortized tracking.\n\n    Args:\n      total_examples: total number of examples.\n    \"\"\"\n\n    assert total_examples > 0\n    self._total_examples = total_examples\n    self._eps_squared_sum = tf.Variable(tf.zeros([1]), trainable=False,\n                                        name=\"eps_squared_sum\")\n    self._delta_sum = tf.Variable(tf.zeros([1]), trainable=False,\n                                  name=\"delta_sum\")\n\n  def accumulate_privacy_spending(self, eps_delta, unused_sigma,\n                                  num_examples):\n    \"\"\"Accumulate the privacy spending.\n\n    Currently only supports approximate privacy. Here we assume we use Gaussian\n    noise on a randomly sampled batch so we get better composition: 1. the per\n    batch privacy is computed using the privacy amplification via sampling\n    bound; 2. the composition is done using the composition with Gaussian\n    noise.\n    TODO(liqzhang) Add a link to a document that describes the bounds used.\n\n    Args:\n      eps_delta: EpsDelta pair which can be tensors.\n      unused_sigma: the noise sigma. 
Unused for this accountant.\n      num_examples: the number of examples involved.\n    Returns:\n      a TensorFlow operation for updating the privacy spending.\n    \"\"\"\n\n    eps, delta = eps_delta\n    with tf.control_dependencies(\n        [tf.Assert(tf.greater(delta, 0),\n                   [\"delta needs to be greater than 0\"])]):\n      amortize_ratio = (tf.cast(num_examples, tf.float32) * 1.0 /\n                        self._total_examples)\n      # Use privacy amplification via sampling bound.\n      # See Lemma 2.2 in http://arxiv.org/pdf/1405.7085v2.pdf\n      # TODO(liqzhang) Add a link to a document with formal statement\n      # and proof.\n      amortize_eps = tf.reshape(tf.log(1.0 + amortize_ratio * (\n          tf.exp(eps) - 1.0)), [1])\n      amortize_delta = tf.reshape(amortize_ratio * delta, [1])\n      return tf.group(*[tf.assign_add(self._eps_squared_sum,\n                                      tf.square(amortize_eps)),\n                        tf.assign_add(self._delta_sum, amortize_delta)])\n\n  def get_privacy_spent(self, sess, target_eps=None):\n    \"\"\"Report the spending so far.\n\n    Args:\n      sess: the session to run the tensor.\n      target_eps: the target epsilon. Unused.\n    Returns:\n      the list containing a single EpsDelta, with values as Python floats (as\n      opposed to numpy.float64). 
This is to be consistent with\n      MomentsAccountant, which can return a list of (eps, delta) pairs.\n    \"\"\"\n\n    # pylint: disable=unused-argument\n    unused_target_eps = target_eps\n    eps_squared_sum, delta_sum = sess.run([self._eps_squared_sum,\n                                           self._delta_sum])\n    return [EpsDelta(math.sqrt(eps_squared_sum), float(delta_sum))]\n\n\nclass MomentsAccountant(object):\n  \"\"\"Privacy accountant which keeps track of moments of privacy loss.\n\n  Note: The constructor of this class creates tf.Variables that must\n  be initialized with tf.initialize_all_variables() or similar calls.\n\n  MomentsAccountant accumulates the high moments of the privacy loss. It\n  requires a method for computing differential moments of the noise (see\n  below for the definition). So every specific accountant should subclass\n  this class by implementing the _differential_moments method.\n\n  Denote by X_i the random variable of privacy loss at the i-th step.\n  Consider two databases D, D' which differ by one item. X_i takes value\n  log Pr[M(D')==x]/Pr[M(D)==x] with probability Pr[M(D)==x].\n  In MomentsAccountant, we keep track of y_i(L) = log E[exp(L X_i)] for some\n  large enough L. To compute the final privacy spending, we apply the Chernoff\n  bound (assuming the random noise added at each step is independent) to\n  bound the total privacy loss Z = sum X_i as follows:\n    Pr[Z > e] = Pr[exp(L Z) > exp(L e)]\n              < E[exp(L Z)] / exp(L e)\n              = Prod_i E[exp(L X_i)] / exp(L e)\n              = exp(sum_i log E[exp(L X_i)]) / exp(L e)\n              = exp(sum_i y_i(L) - L e)\n  Hence the mechanism is (e, d)-differentially private for\n    d = exp(sum_i y_i(L) - L e).\n  We require d < 1, i.e. e > sum_i y_i(L) / L. 
We maintain y_i(L) for several\n  L to compute the best d for any given e (normally it should be the lowest L\n  such that 2 * sum_i y_i(L) / L < e).\n\n  We further assume that at each step, the mechanism operates on a random\n  sample with sampling probability q = batch_size / total_examples. Then\n    E[exp(L X)] = E[(Pr[M(D)==x] / Pr[M(D')==x])^L]\n  By distinguishing the two cases of whether D < D' or D' < D, we have\n  that\n    E[exp(L X)] <= max (I1, I2)\n  where\n    I1 = (1-q) E ((1-q) + q P(X+1) / P(X))^L + q E ((1-q) + q P(X) / P(X-1))^L\n    I2 = E (P(X) / ((1-q) + q P(X+1)))^L\n\n  In order to compute I1 and I2, one can:\n    1. use an asymptotic bound, which recovers the advanced composition theorem;\n    2. use the closed formula (like GaussianMomentsAccountant);\n    3. use numerical integration or random sample estimation.\n\n  Depending on the distribution, we can often obtain a tighter estimation on\n  the moments and hence a more accurate estimation of the privacy loss than\n  obtained using generic composition theorems.\n\n  \"\"\"\n\n  __metaclass__ = abc.ABCMeta\n\n  def __init__(self, total_examples, moment_orders=32):\n    \"\"\"Initialize a MomentsAccountant.\n\n    Args:\n      total_examples: total number of examples.\n      moment_orders: the order of moments to keep.\n    \"\"\"\n\n    assert total_examples > 0\n    self._total_examples = total_examples\n    self._moment_orders = (moment_orders\n                           if isinstance(moment_orders, (list, tuple))\n                           else range(1, moment_orders + 1))\n    self._max_moment_order = max(self._moment_orders)\n    assert self._max_moment_order < 100, \"The moment order is too large.\"\n    self._log_moments = [tf.Variable(numpy.float64(0.0),\n                                     trainable=False,\n                                     name=(\"log_moments-%d\" % moment_order))\n                         for moment_order in self._moment_orders]\n\n  @abc.abstractmethod\n 
 def _compute_log_moment(self, sigma, q, moment_order):\n    \"\"\"Compute high moment of privacy loss.\n\n    Args:\n      sigma: the noise sigma, in the multiples of the sensitivity.\n      q: the sampling ratio.\n      moment_order: the order of moment.\n    Returns:\n      log E[exp(moment_order * X)]\n    \"\"\"\n    pass\n\n  def accumulate_privacy_spending(self, unused_eps_delta,\n                                  sigma, num_examples):\n    \"\"\"Accumulate privacy spending.\n\n    In particular, accounts for privacy spending when we assume there\n    are num_examples, and we are releasing the vector\n    (sum_{i=1}^{num_examples} x_i) + Normal(0, stddev=l2norm_bound*sigma)\n    where l2norm_bound is the maximum l2_norm of each example x_i, and\n    the num_examples have been randomly selected out of a pool of\n    self.total_examples.\n\n    Args:\n      unused_eps_delta: EpsDelta pair which can be tensors. Unused\n        in this accountant.\n      sigma: the noise sigma, in the multiples of the sensitivity (that is,\n        if the l2norm sensitivity is k, then the caller must have added\n        Gaussian noise with stddev=k*sigma to the result of the query).\n      num_examples: the number of examples involved.\n    Returns:\n      a TensorFlow operation for updating the privacy spending.\n    \"\"\"\n    q = tf.cast(num_examples, tf.float64) * 1.0 / self._total_examples\n\n    moments_accum_ops = []\n    for i in range(len(self._log_moments)):\n      moment = self._compute_log_moment(sigma, q, self._moment_orders[i])\n      moments_accum_ops.append(tf.assign_add(self._log_moments[i], moment))\n    return tf.group(*moments_accum_ops)\n\n  def _compute_delta(self, log_moments, eps):\n    \"\"\"Compute delta for given log_moments and eps.\n\n    Args:\n      log_moments: the log moments of privacy loss, in the form of pairs\n        of (moment_order, log_moment)\n      eps: the target epsilon.\n    Returns:\n      delta\n    \"\"\"\n    min_delta = 1.0\n   
 for moment_order, log_moment in log_moments:\n      if math.isinf(log_moment) or math.isnan(log_moment):\n        sys.stderr.write(\"The %d-th order is inf or Nan\\n\" % moment_order)\n        continue\n      if log_moment < moment_order * eps:\n        min_delta = min(min_delta,\n                        math.exp(log_moment - moment_order * eps))\n    return min_delta\n\n  def _compute_eps(self, log_moments, delta):\n    min_eps = float(\"inf\")\n    for moment_order, log_moment in log_moments:\n      if math.isinf(log_moment) or math.isnan(log_moment):\n        sys.stderr.write(\"The %d-th order is inf or Nan\\n\" % moment_order)\n        continue\n      min_eps = min(min_eps, (log_moment - math.log(delta)) / moment_order)\n    return min_eps\n\n  def get_privacy_spent(self, sess, target_eps=None, target_deltas=None):\n    \"\"\"Compute privacy spending in (e, d)-DP form for a single or list of eps.\n\n    Args:\n      sess: the session to run the tensor.\n      target_eps: a list of target epsilon's for which we would like to\n        compute corresponding delta value.\n      target_deltas: a list of target deltas for which we would like to\n        compute the corresponding eps value. 
Caller must specify\n        either target_eps or target_deltas.\n    Returns:\n      A list of EpsDelta pairs.\n    \"\"\"\n    assert (target_eps is None) ^ (target_deltas is None)\n    eps_deltas = []\n    log_moments = sess.run(self._log_moments)\n    log_moments_with_order = zip(self._moment_orders, log_moments)\n    if target_eps is not None:\n      for eps in target_eps:\n        eps_deltas.append(\n            EpsDelta(eps, self._compute_delta(log_moments_with_order, eps)))\n    else:\n      assert target_deltas\n      for delta in target_deltas:\n        eps_deltas.append(\n            EpsDelta(self._compute_eps(log_moments_with_order, delta), delta))\n    return eps_deltas\n\n\nclass GaussianMomentsAccountant(MomentsAccountant):\n  \"\"\"MomentsAccountant which assumes Gaussian noise.\n\n  GaussianMomentsAccountant assumes the noise added is centered Gaussian\n  noise N(0, sigma^2 I). In this case, we can compute the differential moments\n  accurately using a formula.\n\n  For the asymptotic bound, for Gaussian noise with variance sigma^2, we can\n  show for L < sigma^2,  q L < sigma,\n    log E[exp(L X)] = O(q^2 L^2 / sigma^2).\n  Using this we derive that for training T epochs, with batch ratio q,\n  the Gaussian mechanism with variance sigma^2 (with q < 1/sigma) is (e, d)\n  private for d = exp(T/q q^2 L^2 / sigma^2 - L e). Setting L = sigma^2,\n  Tq = e/2, the mechanism is (e, exp(-e sigma^2/2))-DP. Equivalently, the\n  mechanism is (e, d)-DP if sigma = sqrt{2 log(1/d)}/e, q < 1/sigma,\n  and T < e/(2q). This bound is better than the bound obtained using general\n  composition theorems, by an Omega(sqrt{log k}) factor on epsilon, if we run\n  k steps. Since we use a direct estimate, the obtained privacy bound has a\n  tight constant.\n\n  For GaussianMomentsAccountant, it suffices to compute I1, as I1 >= I2,\n  which reduces to computing E(P(x+s)/P(x+s-1) - 1)^i for s = 0 and 1. 
In the\n  companion gaussian_moments.py file, we supply a procedure for computing both\n  I1 and I2 (the computation of I2 is through a multi-precision integration\n  package). It can be verified that indeed I1 >= I2 for a wide range of\n  parameters we have tried, though at the moment we are unable to prove this\n  claim.\n\n  We recommend that when using this accountant, users independently verify\n  using gaussian_moments.py that for their parameters, I1 is indeed larger\n  than I2. This can be done by following the instructions in\n  gaussian_moments.py.\n  \"\"\"\n\n  def __init__(self, total_examples, moment_orders=32):\n    \"\"\"Initialization.\n\n    Args:\n      total_examples: total number of examples.\n      moment_orders: the order of moments to keep.\n    \"\"\"\n    super(GaussianMomentsAccountant, self).__init__(total_examples,\n                                                    moment_orders)\n    self._binomial_table = utils.GenerateBinomialTable(self._max_moment_order)\n\n  def _differential_moments(self, sigma, s, t):\n    \"\"\"Compute 0 to t-th differential moments for Gaussian variable.\n\n        E[(P(x+s)/P(x+s-1)-1)^t]\n      = sum_{i=0}^t (t choose i) (-1)^{t-i} E[(P(x+s)/P(x+s-1))^i]\n      = sum_{i=0}^t (t choose i) (-1)^{t-i} E[exp(-i*(2*x+2*s-1)/(2*sigma^2))]\n      = sum_{i=0}^t (t choose i) (-1)^{t-i} exp(i(i+1-2*s)/(2 sigma^2))\n    Args:\n      sigma: the noise sigma, in the multiples of the sensitivity.\n      s: the shift.\n      t: 0 to t-th moment.\n    Returns:\n      0 to t-th moment as a tensor of shape [t+1].\n    \"\"\"\n    assert t <= self._max_moment_order, (\"The order of %d is out \"\n                                         \"of the upper bound %d.\"\n                                         % (t, self._max_moment_order))\n    binomial = tf.slice(self._binomial_table, [0, 0],\n                        [t + 1, t + 1])\n    signs = numpy.zeros((t + 1, t + 1), dtype=numpy.float64)\n    for i in range(t + 1):\n      for j in range(t + 1):\n        signs[i, j] = 1.0 - 2 * ((i - j) % 2)\n    
exponents = tf.constant([j * (j + 1.0 - 2.0 * s) / (2.0 * sigma * sigma)\n                             for j in range(t + 1)], dtype=tf.float64)\n    # x[i, j] = binomial[i, j] * signs[i, j] = (i choose j) * (-1)^{i-j}\n    x = tf.mul(binomial, signs)\n    # y[i, j] = x[i, j] * exp(exponents[j])\n    #         = (i choose j) * (-1)^{i-j} * exp(j(j-1)/(2 sigma^2))\n    # Note: this computation is done by broadcasting pointwise multiplication\n    # between [t+1, t+1] tensor and [t+1] tensor.\n    y = tf.mul(x, tf.exp(exponents))\n    # z[i] = sum_j y[i, j]\n    #      = sum_j (i choose j) * (-1)^{i-j} * exp(j(j-1)/(2 sigma^2))\n    z = tf.reduce_sum(y, 1)\n    return z\n\n  def _compute_log_moment(self, sigma, q, moment_order):\n    \"\"\"Compute high moment of privacy loss.\n\n    Args:\n      sigma: the noise sigma, in the multiples of the sensitivity.\n      q: the sampling ratio.\n      moment_order: the order of moment.\n    Returns:\n      log E[exp(moment_order * X)]\n    \"\"\"\n    assert moment_order <= self._max_moment_order, (\"The order of %d is out \"\n                                                    \"of the upper bound %d.\"\n                                                    % (moment_order,\n                                                       self._max_moment_order))\n    binomial_table = tf.slice(self._binomial_table, [moment_order, 0],\n                              [1, moment_order + 1])\n    # qs = [1 q q^2 ... q^L] = exp([0 1 2 ... 
L] * log(q))\n    qs = tf.exp(tf.constant([i * 1.0 for i in range(moment_order + 1)],\n                            dtype=tf.float64) * tf.cast(\n                                tf.log(q), dtype=tf.float64))\n    moments0 = self._differential_moments(sigma, 0.0, moment_order)\n    term0 = tf.reduce_sum(binomial_table * qs * moments0)\n    moments1 = self._differential_moments(sigma, 1.0, moment_order)\n    term1 = tf.reduce_sum(binomial_table * qs * moments1)\n    return tf.squeeze(tf.log(tf.cast(q * term0 + (1.0 - q) * term1,\n                                     tf.float64)))\n\n\nclass DummyAccountant(object):\n  \"\"\"An accountant that does no accounting.\"\"\"\n\n  def accumulate_privacy_spending(self, *unused_args):\n    return tf.no_op()\n\n  def get_privacy_spent(self, unused_sess, **unused_kwargs):\n    return [EpsDelta(numpy.inf, 1.0)]\n"
  },
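The amortized accounting in `AmortizedAccountant` above is simple arithmetic once the TensorFlow plumbing is removed: for each batch, privacy amplification gives eps' = log(1 + q*(e^eps - 1)), then eps'^2 and q*delta are accumulated, and the spent epsilon is the square root of the running sum. A plain-Python sketch with made-up parameters (not the repository's API):

```python
import math

def amortized_spending(eps, delta, batch_sizes, total_examples):
    """Accumulate amortized (eps, delta) over batches, mirroring the
    arithmetic of AmortizedAccountant: per-batch eps via
    log(1 + q * (e^eps - 1)), summed in squares; delta amortized by q."""
    eps_squared_sum = 0.0
    delta_sum = 0.0
    for num_examples in batch_sizes:
        q = num_examples / total_examples  # amortization ratio
        amortize_eps = math.log(1.0 + q * (math.exp(eps) - 1.0))
        eps_squared_sum += amortize_eps ** 2
        delta_sum += q * delta
    return math.sqrt(eps_squared_sum), delta_sum

# Hypothetical run: 100 batches of 600 examples from a 60000-example pool.
spent_eps, spent_delta = amortized_spending(
    eps=1.0, delta=1e-7, batch_sizes=[600] * 100, total_examples=60000)
```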
  {
    "path": "model_zoo/models/im2txt/.gitignore",
    "content": "/bazel-bin\n/bazel-ci_build-cache\n/bazel-genfiles\n/bazel-out\n/bazel-im2txt\n/bazel-testlogs\n/bazel-tf\n"
  },
  {
    "path": "model_zoo/models/im2txt/README.md",
    "content": "# Show and Tell: A Neural Image Caption Generator\n\nA TensorFlow implementation of the image-to-text model described in the paper:\n\n\"Show and Tell: Lessons learned from the 2015 MSCOCO Image Captioning\nChallenge.\"\n\nOriol Vinyals, Alexander Toshev, Samy Bengio, Dumitru Erhan.\n\n*IEEE transactions on pattern analysis and machine intelligence (2016).*\n\nFull text available at: http://arxiv.org/abs/1609.06647\n\n## Contact\n***Author:*** Chris Shallue (shallue@google.com).\n\n***Pull requests and issues:*** @cshallue.\n\n## Contents\n* [Model Overview](#model-overview)\n    * [Introduction](#introduction)\n    * [Architecture](#architecture)\n* [Getting Started](#getting-started)\n    * [A Note on Hardware and Training Time](#a-note-on-hardware-and-training-time)\n    * [Install Required Packages](#install-required-packages)\n    * [Prepare the Training Data](#prepare-the-training-data)\n    * [Download the Inception v3 Checkpoint](#download-the-inception-v3-checkpoint)\n* [Training a Model](#training-a-model)\n    * [Initial Training](#initial-training)\n    * [Fine Tune the Inception v3 Model](#fine-tune-the-inception-v3-model)\n* [Generating Captions](#generating-captions)\n\n## Model Overview\n\n### Introduction\n\nThe *Show and Tell* model is a deep neural network that learns how to describe\nthe content of images. For example:\n\n<center>\n![Example captions](g3doc/example_captions.jpg)\n</center>\n\n### Architecture\n\nThe *Show and Tell* model is an example of an *encoder-decoder* neural network.\nIt works by first \"encoding\" an image into a fixed-length vector representation,\nand then \"decoding\" the representation into a natural language description.\n\nThe image encoder is a deep convolutional neural network. This type of\nnetwork is widely used for image tasks and is currently state-of-the-art for\nobject recognition and detection. 
Our particular choice of network is the\n[*Inception v3*](http://arxiv.org/abs/1512.00567) image recognition model\npretrained on the\n[ILSVRC-2012-CLS](http://www.image-net.org/challenges/LSVRC/2012/) image\nclassification dataset.\n\nThe decoder is a long short-term memory (LSTM) network. This type of network is\ncommonly used for sequence modeling tasks such as language modeling and machine\ntranslation. In the *Show and Tell* model, the LSTM network is trained as a\nlanguage model conditioned on the image encoding.\n\nWords in the captions are represented with an embedding model. Each word in the\nvocabulary is associated with a fixed-length vector representation that is\nlearned during training.\n\nThe following diagram illustrates the model architecture.\n\n<center>\n![Show and Tell Architecture](g3doc/show_and_tell_architecture.png)\n</center>\n\nIn this diagram, \\{*s*<sub>0</sub>, *s*<sub>1</sub>, ..., *s*<sub>*N*-1</sub>\\}\nare the words of the caption and \\{*w*<sub>*e*</sub>*s*<sub>0</sub>,\n*w*<sub>*e*</sub>*s*<sub>1</sub>, ..., *w*<sub>*e*</sub>*s*<sub>*N*-1</sub>\\}\nare their corresponding word embedding vectors. The outputs \\{*p*<sub>1</sub>,\n*p*<sub>2</sub>, ..., *p*<sub>*N*</sub>\\} of the LSTM are probability\ndistributions generated by the model for the next word in the sentence. The\nterms \\{log *p*<sub>1</sub>(*s*<sub>1</sub>),\nlog *p*<sub>2</sub>(*s*<sub>2</sub>), ...,\nlog *p*<sub>*N*</sub>(*s*<sub>*N*</sub>)\\} are the log-likelihoods of the\ncorrect word at each step; the negated sum of these terms is the minimization\nobjective of the model.\n\nDuring the first phase of training the parameters of the *Inception v3* model\nare kept fixed: it is simply a static image encoder function. A single trainable\nlayer is added on top of the *Inception v3* model to transform the image\nembedding into the word embedding vector space. 
The model is trained with\nrespect to the parameters of the word embeddings, the parameters of the layer on\ntop of *Inception v3* and the parameters of the LSTM. In the second phase of\ntraining, all parameters - including the parameters of *Inception v3* - are\ntrained to jointly fine-tune the image encoder and the LSTM.\n\nGiven a trained model and an image we use *beam search* to generate captions for\nthat image. Captions are generated word-by-word, where at each step *t* we use\nthe set of sentences already generated with length *t* - 1 to generate a new set\nof sentences with length *t*. We keep only the top *k* candidates at each step,\nwhere the hyperparameter *k* is called the *beam size*. We have found the best\nperformance with *k* = 3.\n\n## Getting Started\n\n### A Note on Hardware and Training Time\n\nThe time required to train the *Show and Tell* model depends on your specific\nhardware and computational capacity. In this guide we assume you will be running\ntraining on a single machine with a GPU. In our experience on an NVIDIA Tesla\nK20m GPU the initial training phase takes 1-2 weeks. 
The second training phase\nmay take several additional weeks to achieve peak performance (but you can stop\nthis phase early and still get reasonable results).\n\nIt is possible to achieve a speed-up by implementing distributed training across\na cluster of machines with GPUs, but that is not covered in this guide.\n\nWhilst it is possible to run this code on a CPU, beware that this may be\napproximately 10 times slower.\n\n### Install Required Packages\nFirst ensure that you have installed the following required packages:\n\n* **Bazel** ([instructions](http://bazel.io/docs/install.html)).\n* **TensorFlow** ([instructions](https://www.tensorflow.org/versions/master/get_started/os_setup.html)).\n* **NumPy** ([instructions](http://www.scipy.org/install.html)).\n* **Natural Language Toolkit (NLTK)**:\n    * First install NLTK ([instructions](http://www.nltk.org/install.html)).\n    * Then install the NLTK data ([instructions](http://www.nltk.org/data.html)).\n\n### Prepare the Training Data\n\nTo train the model you will need to provide training data in native TFRecord\nformat. The TFRecord format consists of a set of sharded files containing\nserialized `tf.SequenceExample` protocol buffers. Each `tf.SequenceExample`\nproto contains an image (JPEG format), a caption and metadata such as the image\nid.\n\nEach caption is a list of words. During preprocessing, a dictionary is created\nthat assigns each word in the vocabulary to an integer-valued id. Each caption\nis encoded as a list of integer word ids in the `tf.SequenceExample` protos.\n\nWe have provided a script to download and preprocess the\n[MSCOCO](http://mscoco.org/) image captioning data set into this format. Downloading\nand preprocessing the data may take several hours depending on your network and\ncomputer speed. 
Please be patient.\n\nBefore running the script, ensure that your hard disk has at least 150GB of\navailable space for storing the downloaded and processed data.\n\n```shell\n# Location to save the MSCOCO data.\nMSCOCO_DIR=\"${HOME}/im2txt/data/mscoco\"\n\n# Build the preprocessing script.\nbazel build im2txt/download_and_preprocess_mscoco\n\n# Run the preprocessing script.\nbazel-bin/im2txt/download_and_preprocess_mscoco \"${MSCOCO_DIR}\"\n```\n\nThe final line of the output should read:\n\n```\n2016-09-01 16:47:47.296630: Finished processing all 20267 image-caption pairs in data set 'test'.\n```\n\nWhen the script finishes you will find 256 training, 4 validation and 8 testing\nfiles in `${MSCOCO_DIR}`. The files will match the patterns `train-?????-of-00256`,\n`val-?????-of-00004` and `test-?????-of-00008`, respectively.\n\n### Download the Inception v3 Checkpoint\n\nThe *Show and Tell* model requires a pretrained *Inception v3* checkpoint file\nto initialize the parameters of its image encoder submodel.\n\nThis checkpoint file is provided by the\n[TensorFlow-Slim image classification library](https://github.com/tensorflow/models/tree/master/slim#tensorflow-slim-image-classification-library)\nwhich provides a suite of pre-trained image classification models. You can read\nmore about the models provided by the library\n[here](https://github.com/tensorflow/models/tree/master/slim#pre-trained-models).\n\n\nRun the following commands to download the *Inception v3* checkpoint.\n\n```shell\n# Location to save the Inception v3 checkpoint.\nINCEPTION_DIR=\"${HOME}/im2txt/data\"\nmkdir -p ${INCEPTION_DIR}\n\nwget \"http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz\"\ntar -xvf \"inception_v3_2016_08_28.tar.gz\" -C ${INCEPTION_DIR}\nrm \"inception_v3_2016_08_28.tar.gz\"\n```\n\nNote that the *Inception v3* checkpoint will only be used for initializing the\nparameters of the *Show and Tell* model. 
Once the *Show and Tell* model starts\ntraining it will save its own checkpoint files containing the values of all its\nparameters (including copies of the *Inception v3* parameters). If training is\nstopped and restarted, the parameter values will be restored from the latest\n*Show and Tell* checkpoint and the *Inception v3* checkpoint will be ignored. In\nother words, the *Inception v3* checkpoint is only used in the 0-th global step\n(initialization) of training the *Show and Tell* model.\n\n## Training a Model\n\n### Initial Training\n\nRun the training script.\n\n```shell\n# Directory containing preprocessed MSCOCO data.\nMSCOCO_DIR=\"${HOME}/im2txt/data/mscoco\"\n\n# Inception v3 checkpoint file.\nINCEPTION_CHECKPOINT=\"${HOME}/im2txt/data/inception_v3.ckpt\"\n\n# Directory to save the model.\nMODEL_DIR=\"${HOME}/im2txt/model\"\n\n# Build the model.\nbazel build -c opt im2txt/...\n\n# Run the training script.\nbazel-bin/im2txt/train \\\n  --input_file_pattern=\"${MSCOCO_DIR}/train-?????-of-00256\" \\\n  --inception_checkpoint_file=\"${INCEPTION_CHECKPOINT}\" \\\n  --train_dir=\"${MODEL_DIR}/train\" \\\n  --train_inception=false \\\n  --number_of_steps=1000000\n```\n\nRun the evaluation script in a separate process. This will log evaluation\nmetrics to TensorBoard which allows training progress to be monitored in\nreal-time.\n\nNote that you may run out of memory if you run the evaluation script on the same\nGPU as the training script. You can run the command\n`export CUDA_VISIBLE_DEVICES=\"\"` to force the evaluation script to run on CPU.\nIf evaluation runs too slowly on CPU, you can decrease the value of\n`--num_eval_examples`.\n\n```shell\nMSCOCO_DIR=\"${HOME}/im2txt/data/mscoco\"\nMODEL_DIR=\"${HOME}/im2txt/model\"\n\n# Ignore GPU devices (only necessary if your GPU is currently memory\n# constrained, for example, by running the training script).\nexport CUDA_VISIBLE_DEVICES=\"\"\n\n# Run the evaluation script. 
This will run in a loop, periodically loading the\n# latest model checkpoint file and computing evaluation metrics.\nbazel-bin/im2txt/evaluate \\\n  --input_file_pattern=\"${MSCOCO_DIR}/val-?????-of-00004\" \\\n  --checkpoint_dir=\"${MODEL_DIR}/train\" \\\n  --eval_dir=\"${MODEL_DIR}/eval\"\n```\n\nRun a TensorBoard server in a separate process for real-time monitoring of\ntraining progress and evaluation metrics.\n\n```shell\nMODEL_DIR=\"${HOME}/im2txt/model\"\n\n# Run a TensorBoard server.\ntensorboard --logdir=\"${MODEL_DIR}\"\n```\n\n### Fine Tune the Inception v3 Model\n\nYour model will already be able to generate reasonable captions after the first\nphase of training. Try it out! (See\n[Generating Captions](#generating-captions)).\n\nYou can further improve the performance of the model by running a\nsecond training phase to jointly fine-tune the parameters of the *Inception v3*\nimage submodel and the LSTM.\n\n```shell\n# Restart the training script with --train_inception=true.\nbazel-bin/im2txt/train \\\n  --input_file_pattern=\"${MSCOCO_DIR}/train-?????-of-00256\" \\\n  --train_dir=\"${MODEL_DIR}/train\" \\\n  --train_inception=true \\\n  --number_of_steps=3000000  # Additional 2M steps (assuming 1M in initial training).\n```\n\nNote that training will proceed much slower now, and the model will continue to\nimprove by a small amount for a long time. We have found that it will improve\nslowly for an additional 2-2.5 million steps before it begins to overfit. This\nmay take several weeks on a single GPU. If you don't care about absolutely\noptimal performance then feel free to halt training sooner by stopping the\ntraining script or passing a smaller value to the flag `--number_of_steps`. Your\nmodel will still work reasonably well.\n\n## Generating Captions\n\nYour trained *Show and Tell* model can generate captions for any JPEG image! 
The\nfollowing command line will generate captions for an image from the test set.\n\n```shell\n# Directory containing model checkpoints.\nCHECKPOINT_DIR=\"${HOME}/im2txt/model/train\"\n\n# Vocabulary file generated by the preprocessing script.\nVOCAB_FILE=\"${HOME}/im2txt/data/mscoco/word_counts.txt\"\n\n# JPEG image file to caption.\nIMAGE_FILE=\"${HOME}/im2txt/data/mscoco/raw-data/val2014/COCO_val2014_000000224477.jpg\"\n\n# Build the inference binary.\nbazel build -c opt im2txt/run_inference\n\n# Ignore GPU devices (only necessary if your GPU is currently memory\n# constrained, for example, by running the training script).\nexport CUDA_VISIBLE_DEVICES=\"\"\n\n# Run inference to generate captions.\nbazel-bin/im2txt/run_inference \\\n  --checkpoint_path=${CHECKPOINT_DIR} \\\n  --vocab_file=${VOCAB_FILE} \\\n  --input_files=${IMAGE_FILE}\n```\n\nExample output:\n\n```shell\nCaptions for image COCO_val2014_000000224477.jpg:\n  0) a man riding a wave on top of a surfboard . (p=0.040413)\n  1) a person riding a surf board on a wave (p=0.017452)\n  2) a man riding a wave on a surfboard in the ocean . (p=0.005743)\n```\n\nNote: you may get different results. Some variation between different models is\nexpected.\n\nHere is the image:\n\n<center>\n![Surfer](g3doc/COCO_val2014_000000224477.jpg)\n</center>\n"
  },
  {
    "path": "model_zoo/models/im2txt/WORKSPACE",
    "content": "workspace(name = \"im2txt\")\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//im2txt/...\",\n    ],\n)\n\npy_binary(\n    name = \"build_mscoco_data\",\n    srcs = [\n        \"data/build_mscoco_data.py\",\n    ],\n)\n\nsh_binary(\n    name = \"download_and_preprocess_mscoco\",\n    srcs = [\"data/download_and_preprocess_mscoco.sh\"],\n    data = [\n        \":build_mscoco_data\",\n    ],\n)\n\npy_library(\n    name = \"configuration\",\n    srcs = [\"configuration.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_library(\n    name = \"show_and_tell_model\",\n    srcs = [\"show_and_tell_model.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \"//im2txt/ops:image_embedding\",\n        \"//im2txt/ops:image_processing\",\n        \"//im2txt/ops:inputs\",\n    ],\n)\n\npy_test(\n    name = \"show_and_tell_model_test\",\n    size = \"large\",\n    srcs = [\"show_and_tell_model_test.py\"],\n    deps = [\n        \":configuration\",\n        \":show_and_tell_model\",\n    ],\n)\n\npy_library(\n    name = \"inference_wrapper\",\n    srcs = [\"inference_wrapper.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":show_and_tell_model\",\n        \"//im2txt/inference_utils:inference_wrapper_base\",\n    ],\n)\n\npy_binary(\n    name = \"train\",\n    srcs = [\"train.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":configuration\",\n        \":show_and_tell_model\",\n    ],\n)\n\npy_binary(\n    name = \"evaluate\",\n    srcs = [\"evaluate.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":configuration\",\n        \":show_and_tell_model\",\n    ],\n)\n\npy_binary(\n    name = \"run_inference\",\n    srcs = [\"run_inference.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":configuration\",\n        \":inference_wrapper\",\n        
\"//im2txt/inference_utils:caption_generator\",\n        \"//im2txt/inference_utils:vocabulary\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/configuration.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Image-to-text model and training configurations.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nclass ModelConfig(object):\n  \"\"\"Wrapper class for model hyperparameters.\"\"\"\n\n  def __init__(self):\n    \"\"\"Sets the default model hyperparameters.\"\"\"\n    # File pattern of sharded TFRecord file containing SequenceExample protos.\n    # Must be provided in training and evaluation modes.\n    self.input_file_pattern = None\n\n    # Image format (\"jpeg\" or \"png\").\n    self.image_format = \"jpeg\"\n\n    # Approximate number of values per input shard. 
Used to ensure sufficient\n    # mixing between shards in training.\n    self.values_per_input_shard = 2300\n    # Minimum number of shards to keep in the input queue.\n    self.input_queue_capacity_factor = 2\n    # Number of threads for prefetching SequenceExample protos.\n    self.num_input_reader_threads = 1\n\n    # Name of the SequenceExample context feature containing image data.\n    self.image_feature_name = \"image/data\"\n    # Name of the SequenceExample feature list containing integer captions.\n    self.caption_feature_name = \"image/caption_ids\"\n\n    # Number of unique words in the vocab (plus 1, for <UNK>).\n    # The default value is larger than the expected actual vocab size to allow\n    # for differences between tokenizer versions used in preprocessing. There is\n    # no harm in using a value greater than the actual vocab size, but using a\n    # value less than the actual vocab size will result in an error.\n    self.vocab_size = 12000\n\n    # Number of threads for image preprocessing. Should be a multiple of 2.\n    self.num_preprocess_threads = 4\n\n    # Batch size.\n    self.batch_size = 32\n\n    # File containing an Inception v3 checkpoint to initialize the variables\n    # of the Inception model. 
Must be provided when starting training for the\n    # first time.\n    self.inception_checkpoint_file = None\n\n    # Dimensions of Inception v3 input images.\n    self.image_height = 299\n    self.image_width = 299\n\n    # Scale used to initialize model variables.\n    self.initializer_scale = 0.08\n\n    # LSTM input and output dimensionality, respectively.\n    self.embedding_size = 512\n    self.num_lstm_units = 512\n\n    # If < 1.0, the dropout keep probability applied to LSTM variables.\n    self.lstm_dropout_keep_prob = 0.7\n\n\nclass TrainingConfig(object):\n  \"\"\"Wrapper class for training hyperparameters.\"\"\"\n\n  def __init__(self):\n    \"\"\"Sets the default training hyperparameters.\"\"\"\n    # Number of examples per epoch of training data.\n    self.num_examples_per_epoch = 586363\n\n    # Optimizer for training the model.\n    self.optimizer = \"SGD\"\n\n    # Learning rate for the initial phase of training.\n    self.initial_learning_rate = 2.0\n    self.learning_rate_decay_factor = 0.5\n    self.num_epochs_per_decay = 8.0\n\n    # Learning rate when fine tuning the Inception v3 parameters.\n    self.train_inception_learning_rate = 0.0005\n\n    # If not None, clip gradients to this value.\n    self.clip_gradients = 5.0\n\n    # How many model checkpoints to keep.\n    self.max_checkpoints_to_keep = 5\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/data/build_mscoco_data.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Converts MSCOCO data to TFRecord file format with SequenceExample protos.\n\nThe MSCOCO images are expected to reside in JPEG files located in the following\ndirectory structure:\n\n  train_image_dir/COCO_train2014_000000000151.jpg\n  train_image_dir/COCO_train2014_000000000260.jpg\n  ...\n\nand\n\n  val_image_dir/COCO_val2014_000000000042.jpg\n  val_image_dir/COCO_val2014_000000000073.jpg\n  ...\n\nThe MSCOCO annotations JSON files are expected to reside in train_captions_file\nand val_captions_file respectively.\n\nThis script converts the combined MSCOCO data into sharded data files consisting\nof 256, 4 and 8 TFRecord files, respectively:\n\n  output_dir/train-00000-of-00256\n  output_dir/train-00001-of-00256\n  ...\n  output_dir/train-00255-of-00256\n\nand\n\n  output_dir/val-00000-of-00004\n  ...\n  output_dir/val-00003-of-00004\n\nand\n\n  output_dir/test-00000-of-00008\n  ...\n  output_dir/test-00007-of-00008\n\nEach TFRecord file contains ~2300 records. Each record within the TFRecord file\nis a serialized SequenceExample proto consisting of precisely one image-caption\npair. 
Note that each image has multiple captions (usually 5) and therefore each\nimage is replicated multiple times in the TFRecord files.\n\nThe SequenceExample proto contains the following fields:\n\n  context:\n    image/image_id: integer MSCOCO image identifier\n    image/data: string containing JPEG encoded image in RGB colorspace\n\n  feature_lists:\n    image/caption: list of strings containing the (tokenized) caption words\n    image/caption_ids: list of integer ids corresponding to the caption words\n\nThe captions are tokenized using the NLTK (http://www.nltk.org/) word tokenizer.\nThe vocabulary of word identifiers is constructed from the sorted list (by\ndescending frequency) of word tokens in the training set. Only tokens appearing\nat least 4 times are considered; all other words get the \"unknown\" word id.\n\nNOTE: This script will consume around 100GB of disk space because each image\nin the MSCOCO dataset is replicated ~5 times (once per caption) in the output.\nThis is done for two reasons:\n  1. In order to better shuffle the training data.\n  2. 
It makes it easier to perform asynchronous preprocessing of each image in\n     TensorFlow.\n\nRunning this script using 16 threads may take around 1 hour on a HP Z420.\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom collections import Counter\nfrom collections import namedtuple\nfrom datetime import datetime\nimport json\nimport os.path\nimport random\nimport sys\nimport threading\n\n\n\nimport nltk.tokenize\nimport numpy as np\nimport tensorflow as tf\n\ntf.flags.DEFINE_string(\"train_image_dir\", \"/tmp/train2014/\",\n                       \"Training image directory.\")\ntf.flags.DEFINE_string(\"val_image_dir\", \"/tmp/val2014\",\n                       \"Validation image directory.\")\n\ntf.flags.DEFINE_string(\"train_captions_file\", \"/tmp/captions_train2014.json\",\n                       \"Training captions JSON file.\")\ntf.flags.DEFINE_string(\"val_captions_file\", \"/tmp/captions_val2014.json\",\n                       \"Validation captions JSON file.\")\n\ntf.flags.DEFINE_string(\"output_dir\", \"/tmp/\", \"Output data directory.\")\n\ntf.flags.DEFINE_integer(\"train_shards\", 256,\n                        \"Number of shards in training TFRecord files.\")\ntf.flags.DEFINE_integer(\"val_shards\", 4,\n                        \"Number of shards in validation TFRecord files.\")\ntf.flags.DEFINE_integer(\"test_shards\", 8,\n                        \"Number of shards in testing TFRecord files.\")\n\ntf.flags.DEFINE_string(\"start_word\", \"<S>\",\n                       \"Special word added to the beginning of each sentence.\")\ntf.flags.DEFINE_string(\"end_word\", \"</S>\",\n                       \"Special word added to the end of each sentence.\")\ntf.flags.DEFINE_string(\"unknown_word\", \"<UNK>\",\n                       \"Special word meaning 'unknown'.\")\ntf.flags.DEFINE_integer(\"min_word_count\", 4,\n                        \"The minimum number of occurrences of each 
word in the \"\n                        \"training set for inclusion in the vocabulary.\")\ntf.flags.DEFINE_string(\"word_counts_output_file\", \"/tmp/word_counts.txt\",\n                       \"Output vocabulary file of word counts.\")\n\ntf.flags.DEFINE_integer(\"num_threads\", 8,\n                        \"Number of threads to preprocess the images.\")\n\nFLAGS = tf.flags.FLAGS\n\nImageMetadata = namedtuple(\"ImageMetadata\",\n                           [\"image_id\", \"filename\", \"captions\"])\n\n\nclass Vocabulary(object):\n  \"\"\"Simple vocabulary wrapper.\"\"\"\n\n  def __init__(self, vocab, unk_id):\n    \"\"\"Initializes the vocabulary.\n\n    Args:\n      vocab: A dictionary of word to word_id.\n      unk_id: Id of the special 'unknown' word.\n    \"\"\"\n    self._vocab = vocab\n    self._unk_id = unk_id\n\n  def word_to_id(self, word):\n    \"\"\"Returns the integer id of a word string.\"\"\"\n    if word in self._vocab:\n      return self._vocab[word]\n    else:\n      return self._unk_id\n\n\nclass ImageDecoder(object):\n  \"\"\"Helper class for decoding images in TensorFlow.\"\"\"\n\n  def __init__(self):\n    # Create a single TensorFlow Session for all image decoding calls.\n    self._sess = tf.Session()\n\n    # TensorFlow ops for JPEG decoding.\n    self._encoded_jpeg = tf.placeholder(dtype=tf.string)\n    self._decode_jpeg = tf.image.decode_jpeg(self._encoded_jpeg, channels=3)\n\n  def decode_jpeg(self, encoded_jpeg):\n    image = self._sess.run(self._decode_jpeg,\n                           feed_dict={self._encoded_jpeg: encoded_jpeg})\n    assert len(image.shape) == 3\n    assert image.shape[2] == 3\n    return image\n\n\ndef _int64_feature(value):\n  \"\"\"Wrapper for inserting an int64 Feature into a SequenceExample proto.\"\"\"\n  return tf.train.Feature(int64_list=tf.train.Int64List(value=[value]))\n\n\ndef _bytes_feature(value):\n  \"\"\"Wrapper for inserting a bytes Feature into a SequenceExample proto.\"\"\"\n  return 
tf.train.Feature(bytes_list=tf.train.BytesList(value=[str(value)]))\n\n\ndef _int64_feature_list(values):\n  \"\"\"Wrapper for inserting an int64 FeatureList into a SequenceExample proto.\"\"\"\n  return tf.train.FeatureList(feature=[_int64_feature(v) for v in values])\n\n\ndef _bytes_feature_list(values):\n  \"\"\"Wrapper for inserting a bytes FeatureList into a SequenceExample proto.\"\"\"\n  return tf.train.FeatureList(feature=[_bytes_feature(v) for v in values])\n\n\ndef _to_sequence_example(image, decoder, vocab):\n  \"\"\"Builds a SequenceExample proto for an image-caption pair.\n\n  Args:\n    image: An ImageMetadata object.\n    decoder: An ImageDecoder object.\n    vocab: A Vocabulary object.\n\n  Returns:\n    A SequenceExample proto.\n  \"\"\"\n  with tf.gfile.FastGFile(image.filename, \"r\") as f:\n    encoded_image = f.read()\n\n  try:\n    decoder.decode_jpeg(encoded_image)\n  except (tf.errors.InvalidArgumentError, AssertionError):\n    print(\"Skipping file with invalid JPEG data: %s\" % image.filename)\n    return\n\n  context = tf.train.Features(feature={\n      \"image/image_id\": _int64_feature(image.image_id),\n      \"image/data\": _bytes_feature(encoded_image),\n  })\n\n  assert len(image.captions) == 1\n  caption = image.captions[0]\n  caption_ids = [vocab.word_to_id(word) for word in caption]\n  feature_lists = tf.train.FeatureLists(feature_list={\n      \"image/caption\": _bytes_feature_list(caption),\n      \"image/caption_ids\": _int64_feature_list(caption_ids)\n  })\n  sequence_example = tf.train.SequenceExample(\n      context=context, feature_lists=feature_lists)\n\n  return sequence_example\n\n\ndef _process_image_files(thread_index, ranges, name, images, decoder, vocab,\n                         num_shards):\n  \"\"\"Processes and saves a subset of images as TFRecord files in one thread.\n\n  Args:\n    thread_index: Integer thread identifier within [0, len(ranges)].\n    ranges: A list of pairs of integers specifying the ranges of 
the dataset to\n      process in parallel.\n    name: Unique identifier specifying the dataset.\n    images: List of ImageMetadata.\n    decoder: An ImageDecoder object.\n    vocab: A Vocabulary object.\n    num_shards: Integer number of shards for the output files.\n  \"\"\"\n  # Each thread produces N shards where N = num_shards / num_threads. For\n  # instance, if num_shards = 128, and num_threads = 2, then the first thread\n  # would produce shards [0, 64).\n  num_threads = len(ranges)\n  assert not num_shards % num_threads\n  num_shards_per_batch = int(num_shards / num_threads)\n\n  shard_ranges = np.linspace(ranges[thread_index][0], ranges[thread_index][1],\n                             num_shards_per_batch + 1).astype(int)\n  num_images_in_thread = ranges[thread_index][1] - ranges[thread_index][0]\n\n  counter = 0\n  for s in xrange(num_shards_per_batch):\n    # Generate a sharded version of the file name, e.g. 'train-00002-of-00010'\n    shard = thread_index * num_shards_per_batch + s\n    output_filename = \"%s-%.5d-of-%.5d\" % (name, shard, num_shards)\n    output_file = os.path.join(FLAGS.output_dir, output_filename)\n    writer = tf.python_io.TFRecordWriter(output_file)\n\n    shard_counter = 0\n    images_in_shard = np.arange(shard_ranges[s], shard_ranges[s + 1], dtype=int)\n    for i in images_in_shard:\n      image = images[i]\n\n      sequence_example = _to_sequence_example(image, decoder, vocab)\n      if sequence_example is not None:\n        writer.write(sequence_example.SerializeToString())\n        shard_counter += 1\n        counter += 1\n\n      if not counter % 1000:\n        print(\"%s [thread %d]: Processed %d of %d items in thread batch.\" %\n              (datetime.now(), thread_index, counter, num_images_in_thread))\n        sys.stdout.flush()\n\n    writer.close()\n    print(\"%s [thread %d]: Wrote %d image-caption pairs to %s\" %\n          (datetime.now(), thread_index, shard_counter, output_file))\n    sys.stdout.flush()\n    
shard_counter = 0\n  print(\"%s [thread %d]: Wrote %d image-caption pairs to %d shards.\" %\n        (datetime.now(), thread_index, counter, num_shards_per_batch))\n  sys.stdout.flush()\n\n\ndef _process_dataset(name, images, vocab, num_shards):\n  \"\"\"Processes a complete data set and saves it as a TFRecord.\n\n  Args:\n    name: Unique identifier specifying the dataset.\n    images: List of ImageMetadata.\n    vocab: A Vocabulary object.\n    num_shards: Integer number of shards for the output files.\n  \"\"\"\n  # Break up each image into a separate entity for each caption.\n  images = [ImageMetadata(image.image_id, image.filename, [caption])\n            for image in images for caption in image.captions]\n\n  # Shuffle the ordering of images. Make the randomization repeatable.\n  random.seed(12345)\n  random.shuffle(images)\n\n  # Break the images into num_threads batches. Batch i is defined as\n  # images[ranges[i][0]:ranges[i][1]].\n  num_threads = min(num_shards, FLAGS.num_threads)\n  spacing = np.linspace(0, len(images), num_threads + 1).astype(np.int)\n  ranges = []\n  threads = []\n  for i in xrange(len(spacing) - 1):\n    ranges.append([spacing[i], spacing[i + 1]])\n\n  # Create a mechanism for monitoring when all threads are finished.\n  coord = tf.train.Coordinator()\n\n  # Create a utility for decoding JPEG images to run sanity checks.\n  decoder = ImageDecoder()\n\n  # Launch a thread for each batch.\n  print(\"Launching %d threads for spacings: %s\" % (num_threads, ranges))\n  for thread_index in xrange(len(ranges)):\n    args = (thread_index, ranges, name, images, decoder, vocab, num_shards)\n    t = threading.Thread(target=_process_image_files, args=args)\n    t.start()\n    threads.append(t)\n\n  # Wait for all the threads to terminate.\n  coord.join(threads)\n  print(\"%s: Finished processing all %d image-caption pairs in data set '%s'.\" %\n        (datetime.now(), len(images), name))\n\n\ndef _create_vocab(captions):\n  \"\"\"Creates the 
vocabulary of word to word_id.\n\n  The vocabulary is saved to disk in a text file of word counts. The id of each\n  word in the file is its corresponding 0-based line number.\n\n  Args:\n    captions: A list of lists of strings.\n\n  Returns:\n    A Vocabulary object.\n  \"\"\"\n  print(\"Creating vocabulary.\")\n  counter = Counter()\n  for c in captions:\n    counter.update(c)\n  print(\"Total words:\", len(counter))\n\n  # Filter uncommon words and sort by descending count.\n  word_counts = [x for x in counter.items() if x[1] >= FLAGS.min_word_count]\n  word_counts.sort(key=lambda x: x[1], reverse=True)\n  print(\"Words in vocabulary:\", len(word_counts))\n\n  # Write out the word counts file.\n  with tf.gfile.FastGFile(FLAGS.word_counts_output_file, \"w\") as f:\n    f.write(\"\\n\".join([\"%s %d\" % (w, c) for w, c in word_counts]))\n  print(\"Wrote vocabulary file:\", FLAGS.word_counts_output_file)\n\n  # Create the vocabulary dictionary.\n  reverse_vocab = [x[0] for x in word_counts]\n  unk_id = len(reverse_vocab)\n  vocab_dict = dict([(x, y) for (y, x) in enumerate(reverse_vocab)])\n  vocab = Vocabulary(vocab_dict, unk_id)\n\n  return vocab\n\n\ndef _process_caption(caption):\n  \"\"\"Processes a caption string into a list of tokenized words.\n\n  Args:\n    caption: A string caption.\n\n  Returns:\n    A list of strings; the tokenized caption.\n  \"\"\"\n  tokenized_caption = [FLAGS.start_word]\n  tokenized_caption.extend(nltk.tokenize.word_tokenize(caption.lower()))\n  tokenized_caption.append(FLAGS.end_word)\n  return tokenized_caption\n\n\ndef _load_and_process_metadata(captions_file, image_dir):\n  \"\"\"Loads image metadata from a JSON file and processes the captions.\n\n  Args:\n    captions_file: JSON file containing caption annotations.\n    image_dir: Directory containing the image files.\n\n  Returns:\n    A list of ImageMetadata.\n  \"\"\"\n  with tf.gfile.FastGFile(captions_file, \"r\") as f:\n    caption_data = json.load(f)\n\n  # Extract the 
filenames.\n  id_to_filename = [(x[\"id\"], x[\"file_name\"]) for x in caption_data[\"images\"]]\n\n  # Extract the captions. Each image_id is associated with multiple captions.\n  id_to_captions = {}\n  for annotation in caption_data[\"annotations\"]:\n    image_id = annotation[\"image_id\"]\n    caption = annotation[\"caption\"]\n    id_to_captions.setdefault(image_id, [])\n    id_to_captions[image_id].append(caption)\n\n  assert len(id_to_filename) == len(id_to_captions)\n  assert set([x[0] for x in id_to_filename]) == set(id_to_captions.keys())\n  print(\"Loaded caption metadata for %d images from %s\" %\n        (len(id_to_filename), captions_file))\n\n  # Process the captions and combine the data into a list of ImageMetadata.\n  print(\"Processing captions.\")\n  image_metadata = []\n  num_captions = 0\n  for image_id, base_filename in id_to_filename:\n    filename = os.path.join(image_dir, base_filename)\n    captions = [_process_caption(c) for c in id_to_captions[image_id]]\n    image_metadata.append(ImageMetadata(image_id, filename, captions))\n    num_captions += len(captions)\n  print(\"Finished processing %d captions for %d images in %s\" %\n        (num_captions, len(id_to_filename), captions_file))\n\n  return image_metadata\n\n\ndef main(unused_argv):\n  def _is_valid_num_shards(num_shards):\n    \"\"\"Returns True if num_shards is compatible with FLAGS.num_threads.\"\"\"\n    return num_shards < FLAGS.num_threads or not num_shards % FLAGS.num_threads\n\n  assert _is_valid_num_shards(FLAGS.train_shards), (\n      \"Please make the FLAGS.num_threads commensurate with FLAGS.train_shards\")\n  assert _is_valid_num_shards(FLAGS.val_shards), (\n      \"Please make the FLAGS.num_threads commensurate with FLAGS.val_shards\")\n  assert _is_valid_num_shards(FLAGS.test_shards), (\n      \"Please make the FLAGS.num_threads commensurate with FLAGS.test_shards\")\n\n  if not tf.gfile.IsDirectory(FLAGS.output_dir):\n    tf.gfile.MakeDirs(FLAGS.output_dir)\n\n  # 
Load image metadata from caption files.\n  mscoco_train_dataset = _load_and_process_metadata(FLAGS.train_captions_file,\n                                                    FLAGS.train_image_dir)\n  mscoco_val_dataset = _load_and_process_metadata(FLAGS.val_captions_file,\n                                                  FLAGS.val_image_dir)\n\n  # Redistribute the MSCOCO data as follows:\n  #   train_dataset = 100% of mscoco_train_dataset + 85% of mscoco_val_dataset.\n  #   val_dataset = 5% of mscoco_val_dataset (for validation during training).\n  #   test_dataset = 10% of mscoco_val_dataset (for final evaluation).\n  train_cutoff = int(0.85 * len(mscoco_val_dataset))\n  val_cutoff = int(0.90 * len(mscoco_val_dataset))\n  train_dataset = mscoco_train_dataset + mscoco_val_dataset[0:train_cutoff]\n  val_dataset = mscoco_val_dataset[train_cutoff:val_cutoff]\n  test_dataset = mscoco_val_dataset[val_cutoff:]\n\n  # Create vocabulary from the training captions.\n  train_captions = [c for image in train_dataset for c in image.captions]\n  vocab = _create_vocab(train_captions)\n\n  _process_dataset(\"train\", train_dataset, vocab, FLAGS.train_shards)\n  _process_dataset(\"val\", val_dataset, vocab, FLAGS.val_shards)\n  _process_dataset(\"test\", test_dataset, vocab, FLAGS.test_shards)\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/data/download_and_preprocess_mscoco.sh",
    "content": "#!/bin/bash\n# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# Script to download and preprocess the MSCOCO data set.\n#\n# The outputs of this script are sharded TFRecord files containing serialized\n# SequenceExample protocol buffers. See build_mscoco_data.py for details of how\n# the SequenceExample protocol buffers are constructed.\n#\n# usage:\n#  ./download_and_preprocess_mscoco.sh\nset -e\n\nif [ -z \"$1\" ]; then\n  echo \"usage download_and_preproces_mscoco.sh [data dir]\"\n  exit\nfi\n\nif [ \"$(uname)\" == \"Darwin\" ]; then\n  UNZIP=\"tar -xf\"\nelse\n  UNZIP=\"unzip -nq\"\nfi\n\n# Create the output directories.\nOUTPUT_DIR=\"${1%/}\"\nSCRATCH_DIR=\"${OUTPUT_DIR}/raw-data\"\nmkdir -p \"${OUTPUT_DIR}\"\nmkdir -p \"${SCRATCH_DIR}\"\nCURRENT_DIR=$(pwd)\nWORK_DIR=\"$0.runfiles/im2txt/im2txt\"\n\n# Helper function to download and unpack a .zip file.\nfunction download_and_unzip() {\n  local BASE_URL=${1}\n  local FILENAME=${2}\n\n  if [ ! 
-f ${FILENAME} ]; then\n    echo \"Downloading ${FILENAME} to $(pwd)\"\n    wget -nd -c \"${BASE_URL}/${FILENAME}\"\n  else\n    echo \"Skipping download of ${FILENAME}\"\n  fi\n  echo \"Unzipping ${FILENAME}\"\n  ${UNZIP} ${FILENAME}\n}\n\ncd ${SCRATCH_DIR}\n\n# Download the images.\nBASE_IMAGE_URL=\"http://msvocds.blob.core.windows.net/coco2014\"\n\nTRAIN_IMAGE_FILE=\"train2014.zip\"\ndownload_and_unzip ${BASE_IMAGE_URL} ${TRAIN_IMAGE_FILE}\nTRAIN_IMAGE_DIR=\"${SCRATCH_DIR}/train2014\"\n\nVAL_IMAGE_FILE=\"val2014.zip\"\ndownload_and_unzip ${BASE_IMAGE_URL} ${VAL_IMAGE_FILE}\nVAL_IMAGE_DIR=\"${SCRATCH_DIR}/val2014\"\n\n# Download the captions.\nBASE_CAPTIONS_URL=\"http://msvocds.blob.core.windows.net/annotations-1-0-3\"\nCAPTIONS_FILE=\"captions_train-val2014.zip\"\ndownload_and_unzip ${BASE_CAPTIONS_URL} ${CAPTIONS_FILE}\nTRAIN_CAPTIONS_FILE=\"${SCRATCH_DIR}/annotations/captions_train2014.json\"\nVAL_CAPTIONS_FILE=\"${SCRATCH_DIR}/annotations/captions_val2014.json\"\n\n# Build TFRecords of the image data.\ncd \"${CURRENT_DIR}\"\nBUILD_SCRIPT=\"${WORK_DIR}/build_mscoco_data\"\n\"${BUILD_SCRIPT}\" \\\n  --train_image_dir=\"${TRAIN_IMAGE_DIR}\" \\\n  --val_image_dir=\"${VAL_IMAGE_DIR}\" \\\n  --train_captions_file=\"${TRAIN_CAPTIONS_FILE}\" \\\n  --val_captions_file=\"${VAL_CAPTIONS_FILE}\" \\\n  --output_dir=\"${OUTPUT_DIR}\" \\\n  --word_counts_output_file=\"${OUTPUT_DIR}/word_counts.txt\" \\\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/evaluate.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Evaluate the model.\n\nThis script should be run concurrently with training so that summaries show up\nin TensorBoard.\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nimport os.path\nimport time\n\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom im2txt import configuration\nfrom im2txt import show_and_tell_model\n\nFLAGS = tf.flags.FLAGS\n\ntf.flags.DEFINE_string(\"input_file_pattern\", \"\",\n                       \"File pattern of sharded TFRecord input files.\")\ntf.flags.DEFINE_string(\"checkpoint_dir\", \"\",\n                       \"Directory containing model checkpoints.\")\ntf.flags.DEFINE_string(\"eval_dir\", \"\", \"Directory to write event logs.\")\n\ntf.flags.DEFINE_integer(\"eval_interval_secs\", 600,\n                        \"Interval between evaluation runs.\")\ntf.flags.DEFINE_integer(\"num_eval_examples\", 10132,\n                        \"Number of examples for evaluation.\")\n\ntf.flags.DEFINE_integer(\"min_global_step\", 5000,\n                        \"Minimum global step to run evaluation.\")\n\ntf.logging.set_verbosity(tf.logging.INFO)\n\n\ndef evaluate_model(sess, model, global_step, summary_writer, summary_op):\n  
\"\"\"Computes perplexity-per-word over the evaluation dataset.\n\n  Summaries and perplexity-per-word are written out to the eval directory.\n\n  Args:\n    sess: Session object.\n    model: Instance of ShowAndTellModel; the model to evaluate.\n    global_step: Integer; global step of the model checkpoint.\n    summary_writer: Instance of SummaryWriter.\n    summary_op: Op for generating model summaries.\n  \"\"\"\n  # Log model summaries on a single batch.\n  summary_str = sess.run(summary_op)\n  summary_writer.add_summary(summary_str, global_step)\n\n  # Compute perplexity over the entire dataset.\n  num_eval_batches = int(\n      math.ceil(FLAGS.num_eval_examples / model.config.batch_size))\n\n  start_time = time.time()\n  sum_losses = 0.\n  sum_weights = 0.\n  for i in xrange(num_eval_batches):\n    cross_entropy_losses, weights = sess.run([\n        model.target_cross_entropy_losses,\n        model.target_cross_entropy_loss_weights\n    ])\n    sum_losses += np.sum(cross_entropy_losses * weights)\n    sum_weights += np.sum(weights)\n    if not i % 100:\n      tf.logging.info(\"Computed losses for %d of %d batches.\", i + 1,\n                      num_eval_batches)\n  eval_time = time.time() - start_time\n\n  perplexity = math.exp(sum_losses / sum_weights)\n  tf.logging.info(\"Perplexity = %f (%.2g sec)\", perplexity, eval_time)\n\n  # Log perplexity to the SummaryWriter.\n  summary = tf.Summary()\n  value = summary.value.add()\n  value.simple_value = perplexity\n  value.tag = \"Perplexity\"\n  summary_writer.add_summary(summary, global_step)\n\n  # Write the Events file to the eval directory.\n  summary_writer.flush()\n  tf.logging.info(\"Finished processing evaluation at global step %d.\",\n                  global_step)\n\n\ndef run_once(model, saver, summary_writer, summary_op):\n  \"\"\"Evaluates the latest model checkpoint.\n\n  Args:\n    model: Instance of ShowAndTellModel; the model to evaluate.\n    saver: Instance of tf.train.Saver for restoring 
model Variables.\n    summary_writer: Instance of SummaryWriter.\n    summary_op: Op for generating model summaries.\n  \"\"\"\n  model_path = tf.train.latest_checkpoint(FLAGS.checkpoint_dir)\n  if not model_path:\n    tf.logging.info(\"Skipping evaluation. No checkpoint found in: %s\",\n                    FLAGS.checkpoint_dir)\n    return\n\n  with tf.Session() as sess:\n    # Load model from checkpoint.\n    tf.logging.info(\"Loading model from checkpoint: %s\", model_path)\n    saver.restore(sess, model_path)\n    global_step = tf.train.global_step(sess, model.global_step.name)\n    tf.logging.info(\"Successfully loaded %s at global step = %d.\",\n                    os.path.basename(model_path), global_step)\n    if global_step < FLAGS.min_global_step:\n      tf.logging.info(\"Skipping evaluation. Global step = %d < %d\", global_step,\n                      FLAGS.min_global_step)\n      return\n\n    # Start the queue runners.\n    coord = tf.train.Coordinator()\n    threads = tf.train.start_queue_runners(coord=coord)\n\n    # Run evaluation on the latest checkpoint.\n    try:\n      evaluate_model(\n          sess=sess,\n          model=model,\n          global_step=global_step,\n          summary_writer=summary_writer,\n          summary_op=summary_op)\n    except Exception as e:  # pylint: disable=broad-except\n      tf.logging.error(\"Evaluation failed.\")\n      coord.request_stop(e)\n\n    coord.request_stop()\n    coord.join(threads, stop_grace_period_secs=10)\n\n\ndef run():\n  \"\"\"Runs evaluation in a loop, and logs summaries to TensorBoard.\"\"\"\n  # Create the evaluation directory if it doesn't exist.\n  eval_dir = FLAGS.eval_dir\n  if not tf.gfile.IsDirectory(eval_dir):\n    tf.logging.info(\"Creating eval directory: %s\", eval_dir)\n    tf.gfile.MakeDirs(eval_dir)\n\n  g = tf.Graph()\n  with g.as_default():\n    # Build the model for evaluation.\n    model_config = configuration.ModelConfig()\n    model_config.input_file_pattern = 
FLAGS.input_file_pattern\n    model = show_and_tell_model.ShowAndTellModel(model_config, mode=\"eval\")\n    model.build()\n\n    # Create the Saver to restore model Variables.\n    saver = tf.train.Saver()\n\n    # Create the summary operation and the summary writer.\n    summary_op = tf.merge_all_summaries()\n    summary_writer = tf.train.SummaryWriter(eval_dir)\n\n    g.finalize()\n\n    # Run a new evaluation run every eval_interval_secs.\n    while True:\n      start = time.time()\n      tf.logging.info(\"Starting evaluation at \" + time.strftime(\n          \"%Y-%m-%d-%H:%M:%S\", time.localtime()))\n      run_once(model, saver, summary_writer, summary_op)\n      time_to_next_eval = start + FLAGS.eval_interval_secs - time.time()\n      if time_to_next_eval > 0:\n        time.sleep(time_to_next_eval)\n\n\ndef main(unused_argv):\n  assert FLAGS.input_file_pattern, \"--input_file_pattern is required\"\n  assert FLAGS.checkpoint_dir, \"--checkpoint_dir is required\"\n  assert FLAGS.eval_dir, \"--eval_dir is required\"\n  run()\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/inference_utils/BUILD",
    "content": "package(default_visibility = [\"//im2txt:internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npy_library(\n    name = \"inference_wrapper_base\",\n    srcs = [\"inference_wrapper_base.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_library(\n    name = \"vocabulary\",\n    srcs = [\"vocabulary.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_library(\n    name = \"caption_generator\",\n    srcs = [\"caption_generator.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_test(\n    name = \"caption_generator_test\",\n    srcs = [\"caption_generator_test.py\"],\n    deps = [\n        \":caption_generator\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/inference_utils/caption_generator.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Class for generating captions from an image-to-text model.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport heapq\nimport math\n\n\nimport numpy as np\n\n\nclass Caption(object):\n  \"\"\"Represents a complete or partial caption.\"\"\"\n\n  def __init__(self, sentence, state, logprob, score, metadata=None):\n    \"\"\"Initializes the Caption.\n\n    Args:\n      sentence: List of word ids in the caption.\n      state: Model state after generating the previous word.\n      logprob: Log-probability of the caption.\n      score: Score of the caption.\n      metadata: Optional metadata associated with the partial sentence. 
If not\n        None, a list of strings with the same length as 'sentence'.\n    \"\"\"\n    self.sentence = sentence\n    self.state = state\n    self.logprob = logprob\n    self.score = score\n    self.metadata = metadata\n\n  def __cmp__(self, other):\n    \"\"\"Compares Captions by score.\"\"\"\n    assert isinstance(other, Caption)\n    if self.score == other.score:\n      return 0\n    elif self.score < other.score:\n      return -1\n    else:\n      return 1\n\n\nclass TopN(object):\n  \"\"\"Maintains the top n elements of an incrementally provided set.\"\"\"\n\n  def __init__(self, n):\n    self._n = n\n    self._data = []\n\n  def size(self):\n    assert self._data is not None\n    return len(self._data)\n\n  def push(self, x):\n    \"\"\"Pushes a new element.\"\"\"\n    assert self._data is not None\n    if len(self._data) < self._n:\n      heapq.heappush(self._data, x)\n    else:\n      heapq.heappushpop(self._data, x)\n\n  def extract(self, sort=False):\n    \"\"\"Extracts all elements from the TopN. This is a destructive operation.\n\n    The only method that can be called immediately after extract() is reset().\n\n    Args:\n      sort: Whether to return the elements in descending sorted order.\n\n    Returns:\n      A list of data; the top n elements provided to the set.\n    \"\"\"\n    assert self._data is not None\n    data = self._data\n    self._data = None\n    if sort:\n      data.sort(reverse=True)\n    return data\n\n  def reset(self):\n    \"\"\"Returns the TopN to an empty state.\"\"\"\n    self._data = []\n\n\nclass CaptionGenerator(object):\n  \"\"\"Class to generate captions from an image-to-text model.\"\"\"\n\n  def __init__(self,\n               model,\n               vocab,\n               beam_size=3,\n               max_caption_length=20,\n               length_normalization_factor=0.0):\n    \"\"\"Initializes the generator.\n\n    Args:\n      model: Object encapsulating a trained image-to-text model. 
Must have\n        methods feed_image() and inference_step(). For example, an instance of\n        InferenceWrapperBase.\n      vocab: A Vocabulary object.\n      beam_size: Beam size to use when generating captions.\n      max_caption_length: The maximum caption length before stopping the search.\n      length_normalization_factor: If != 0, a number x such that captions are\n        scored by logprob/length^x, rather than logprob. This changes the\n        relative scores of captions depending on their lengths. For example, if\n        x > 0 then longer captions will be favored.\n    \"\"\"\n    self.vocab = vocab\n    self.model = model\n\n    self.beam_size = beam_size\n    self.max_caption_length = max_caption_length\n    self.length_normalization_factor = length_normalization_factor\n\n  def beam_search(self, sess, encoded_image):\n    \"\"\"Runs beam search caption generation on a single image.\n\n    Args:\n      sess: TensorFlow Session object.\n      encoded_image: An encoded image string.\n\n    Returns:\n      A list of Caption sorted by descending score.\n    \"\"\"\n    # Feed in the image to get the initial state.\n    initial_state = self.model.feed_image(sess, encoded_image)\n\n    initial_beam = Caption(\n        sentence=[self.vocab.start_id],\n        state=initial_state[0],\n        logprob=0.0,\n        score=0.0,\n        metadata=[\"\"])\n    partial_captions = TopN(self.beam_size)\n    partial_captions.push(initial_beam)\n    complete_captions = TopN(self.beam_size)\n\n    # Run beam search.\n    for _ in range(self.max_caption_length - 1):\n      partial_captions_list = partial_captions.extract()\n      partial_captions.reset()\n      input_feed = np.array([c.sentence[-1] for c in partial_captions_list])\n      state_feed = np.array([c.state for c in partial_captions_list])\n\n      softmax, new_states, metadata = self.model.inference_step(sess,\n                                                                input_feed,\n                   
                                             state_feed)\n\n      for i, partial_caption in enumerate(partial_captions_list):\n        word_probabilities = softmax[i]\n        state = new_states[i]\n        # For this partial caption, get the beam_size most probable next words.\n        words_and_probs = list(enumerate(word_probabilities))\n        words_and_probs.sort(key=lambda x: -x[1])\n        words_and_probs = words_and_probs[0:self.beam_size]\n        # Each next word gives a new partial caption.\n        for w, p in words_and_probs:\n          if p < 1e-12:\n            continue  # Avoid log(0).\n          sentence = partial_caption.sentence + [w]\n          logprob = partial_caption.logprob + math.log(p)\n          score = logprob\n          if metadata:\n            metadata_list = partial_caption.metadata + [metadata[i]]\n          else:\n            metadata_list = None\n          if w == self.vocab.end_id:\n            if self.length_normalization_factor > 0:\n              score /= len(sentence)**self.length_normalization_factor\n            beam = Caption(sentence, state, logprob, score, metadata_list)\n            complete_captions.push(beam)\n          else:\n            beam = Caption(sentence, state, logprob, score, metadata_list)\n            partial_captions.push(beam)\n      if partial_captions.size() == 0:\n        # We have run out of partial candidates; happens when beam_size = 1.\n        break\n\n    # If we have no complete captions then fall back to the partial captions.\n    # But never output a mixture of complete and partial captions because a\n    # partial caption could have a higher score than all the complete captions.\n    if not complete_captions.size():\n      complete_captions = partial_captions\n\n    return complete_captions.extract(sort=True)\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/inference_utils/caption_generator_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Unit tests for CaptionGenerator.\"\"\"\n\nimport math\n\n\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom im2txt.inference_utils import caption_generator\n\n\nclass FakeVocab(object):\n  \"\"\"Fake Vocabulary for testing purposes.\"\"\"\n\n  def __init__(self):\n    self.start_id = 0  # Word id denoting sentence start.\n    self.end_id = 1  # Word id denoting sentence end.\n\n\nclass FakeModel(object):\n  \"\"\"Fake model for testing purposes.\"\"\"\n\n  def __init__(self):\n    # Number of words in the vocab.\n    self._vocab_size = 12\n\n    # Dimensionality of the nominal model state.\n    self._state_size = 1\n\n    # Map of previous word to the probability distribution of the next word.\n    self._probabilities = {\n        0: {1: 0.1,\n            2: 0.2,\n            3: 0.3,\n            4: 0.4},\n        2: {5: 0.1,\n            6: 0.9},\n        3: {1: 0.1,\n            7: 0.4,\n            8: 0.5},\n        4: {1: 0.3,\n            9: 0.3,\n            10: 0.4},\n        5: {1: 1.0},\n        6: {1: 1.0},\n        7: {1: 1.0},\n        8: {1: 1.0},\n        9: {1: 0.5,\n            11: 0.5},\n        10: {1: 1.0},\n        11: {1: 1.0},\n    }\n\n  # pylint: disable=unused-argument\n\n  def feed_image(self, sess, encoded_image):\n    
# Return a nominal model state.\n    return np.zeros([1, self._state_size])\n\n  def inference_step(self, sess, input_feed, state_feed):\n    # Compute the matrix of softmax distributions for the next batch of words.\n    batch_size = input_feed.shape[0]\n    softmax_output = np.zeros([batch_size, self._vocab_size])\n    for batch_index, word_id in enumerate(input_feed):\n      for next_word, probability in self._probabilities[word_id].items():\n        softmax_output[batch_index, next_word] = probability\n\n    # Nominal state and metadata.\n    new_state = np.zeros([batch_size, self._state_size])\n    metadata = None\n\n    return softmax_output, new_state, metadata\n\n  # pylint: enable=unused-argument\n\n\nclass CaptionGeneratorTest(tf.test.TestCase):\n\n  def _assertExpectedCaptions(self,\n                              expected_captions,\n                              beam_size=3,\n                              max_caption_length=20,\n                              length_normalization_factor=0):\n    \"\"\"Tests that beam search generates the expected captions.\n\n    Args:\n      expected_captions: A sequence of pairs (sentence, probability), where\n        sentence is a list of integer ids and probability is a float in [0, 1].\n      beam_size: Parameter passed to beam_search().\n      max_caption_length: Parameter passed to beam_search().\n      length_normalization_factor: Parameter passed to beam_search().\n    \"\"\"\n    expected_sentences = [c[0] for c in expected_captions]\n    expected_probabilities = [c[1] for c in expected_captions]\n\n    # Generate captions.\n    generator = caption_generator.CaptionGenerator(\n        model=FakeModel(),\n        vocab=FakeVocab(),\n        beam_size=beam_size,\n        max_caption_length=max_caption_length,\n        length_normalization_factor=length_normalization_factor)\n    actual_captions = generator.beam_search(sess=None, encoded_image=None)\n\n    actual_sentences = [c.sentence for c in actual_captions]\n  
  actual_probabilities = [math.exp(c.logprob) for c in actual_captions]\n\n    self.assertEqual(expected_sentences, actual_sentences)\n    self.assertAllClose(expected_probabilities, actual_probabilities)\n\n  def testBeamSize(self):\n    # Beam size = 1.\n    expected = [([0, 4, 10, 1], 0.16)]\n    self._assertExpectedCaptions(expected, beam_size=1)\n\n    # Beam size = 2.\n    expected = [([0, 4, 10, 1], 0.16), ([0, 3, 8, 1], 0.15)]\n    self._assertExpectedCaptions(expected, beam_size=2)\n\n    # Beam size = 3.\n    expected = [\n        ([0, 2, 6, 1], 0.18), ([0, 4, 10, 1], 0.16), ([0, 3, 8, 1], 0.15)\n    ]\n    self._assertExpectedCaptions(expected, beam_size=3)\n\n  def testMaxLength(self):\n    # Max length = 1.\n    expected = [([0], 1.0)]\n    self._assertExpectedCaptions(expected, max_caption_length=1)\n\n    # Max length = 2.\n    # There are no complete sentences, so partial sentences are returned.\n    expected = [([0, 4], 0.4), ([0, 3], 0.3), ([0, 2], 0.2)]\n    self._assertExpectedCaptions(expected, max_caption_length=2)\n\n    # Max length = 3.\n    # There is at least one complete sentence, so only complete sentences are\n    # returned.\n    expected = [([0, 4, 1], 0.12), ([0, 3, 1], 0.03)]\n    self._assertExpectedCaptions(expected, max_caption_length=3)\n\n    # Max length = 4.\n    expected = [\n        ([0, 2, 6, 1], 0.18), ([0, 4, 10, 1], 0.16), ([0, 3, 8, 1], 0.15)\n    ]\n    self._assertExpectedCaptions(expected, max_caption_length=4)\n\n  def testLengthNormalization(self):\n    # Length normalization factor = 3.\n    # The longest caption is returned first, despite having low probability,\n    # because it has the highest log(probability)/length**3.\n    expected = [\n        ([0, 4, 9, 11, 1], 0.06),\n        ([0, 2, 6, 1], 0.18),\n        ([0, 4, 10, 1], 0.16),\n        ([0, 3, 8, 1], 0.15),\n    ]\n    self._assertExpectedCaptions(\n        expected, beam_size=4, length_normalization_factor=3)\n\n\nif __name__ == '__main__':\n  
tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/inference_utils/inference_wrapper_base.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Base wrapper class for performing inference with an image-to-text model.\n\nSubclasses must implement the following methods:\n\n  build_model():\n    Builds the model for inference and returns the model object.\n\n  feed_image():\n    Takes an encoded image and returns the initial model state, where \"state\"\n    is a numpy array whose specifics are defined by the subclass, e.g.\n    concatenated LSTM state. It's assumed that feed_image() will be called\n    precisely once at the start of inference for each image. Subclasses may\n    compute and/or save per-image internal context in this method.\n\n  inference_step():\n    Takes a batch of inputs and states at a single time-step. Returns the\n    softmax output corresponding to the inputs, and the new states of the batch.\n    Optionally also returns metadata about the current inference step, e.g. a\n    serialized numpy array containing activations from a particular model layer.\n\nClient usage:\n  1. Build the model inference graph via build_graph_from_config() or\n     build_graph_from_proto().\n  2. Call the resulting restore_fn to load the model checkpoint.\n  3. 
For each image in a batch of images:\n     a) Call feed_image() once to get the initial state.\n     b) For each step of caption generation, call inference_step().\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os.path\n\n\nimport tensorflow as tf\n\n# pylint: disable=unused-argument\n\n\nclass InferenceWrapperBase(object):\n  \"\"\"Base wrapper class for performing inference with an image-to-text model.\"\"\"\n\n  def __init__(self):\n    pass\n\n  def build_model(self, model_config):\n    \"\"\"Builds the model for inference.\n\n    Args:\n      model_config: Object containing configuration for building the model.\n\n    Returns:\n      model: The model object.\n    \"\"\"\n    tf.logging.fatal(\"Please implement build_model in subclass\")\n\n  def _create_restore_fn(self, checkpoint_path, saver):\n    \"\"\"Creates a function that restores a model from checkpoint.\n\n    Args:\n      checkpoint_path: Checkpoint file or a directory containing a checkpoint\n        file.\n      saver: Saver for restoring variables from the checkpoint file.\n\n    Returns:\n      restore_fn: A function such that restore_fn(sess) loads model variables\n        from the checkpoint file.\n\n    Raises:\n      ValueError: If checkpoint_path does not refer to a checkpoint file or a\n        directory containing a checkpoint file.\n    \"\"\"\n    if tf.gfile.IsDirectory(checkpoint_path):\n      checkpoint_path = tf.train.latest_checkpoint(checkpoint_path)\n      if not checkpoint_path:\n        raise ValueError(\"No checkpoint file found in: %s\" % checkpoint_path)\n\n    def _restore_fn(sess):\n      tf.logging.info(\"Loading model from checkpoint: %s\", checkpoint_path)\n      saver.restore(sess, checkpoint_path)\n      tf.logging.info(\"Successfully loaded checkpoint: %s\",\n                      os.path.basename(checkpoint_path))\n\n    return _restore_fn\n\n  def build_graph_from_config(self, 
model_config, checkpoint_path):\n    \"\"\"Builds the inference graph from a configuration object.\n\n    Args:\n      model_config: Object containing configuration for building the model.\n      checkpoint_path: Checkpoint file or a directory containing a checkpoint\n        file.\n\n    Returns:\n      restore_fn: A function such that restore_fn(sess) loads model variables\n        from the checkpoint file.\n    \"\"\"\n    tf.logging.info(\"Building model.\")\n    self.build_model(model_config)\n    saver = tf.train.Saver()\n\n    return self._create_restore_fn(checkpoint_path, saver)\n\n  def build_graph_from_proto(self, graph_def_file, saver_def_file,\n                             checkpoint_path):\n    \"\"\"Builds the inference graph from serialized GraphDef and SaverDef protos.\n\n    Args:\n      graph_def_file: File containing a serialized GraphDef proto.\n      saver_def_file: File containing a serialized SaverDef proto.\n      checkpoint_path: Checkpoint file or a directory containing a checkpoint\n        file.\n\n    Returns:\n      restore_fn: A function such that restore_fn(sess) loads model variables\n        from the checkpoint file.\n    \"\"\"\n    # Load the Graph.\n    tf.logging.info(\"Loading GraphDef from file: %s\", graph_def_file)\n    graph_def = tf.GraphDef()\n    with tf.gfile.FastGFile(graph_def_file, \"rb\") as f:\n      graph_def.ParseFromString(f.read())\n    tf.import_graph_def(graph_def, name=\"\")\n\n    # Load the Saver.\n    tf.logging.info(\"Loading SaverDef from file: %s\", saver_def_file)\n    saver_def = tf.train.SaverDef()\n    with tf.gfile.FastGFile(saver_def_file, \"rb\") as f:\n      saver_def.ParseFromString(f.read())\n    saver = tf.train.Saver(saver_def=saver_def)\n\n    return self._create_restore_fn(checkpoint_path, saver)\n\n  def feed_image(self, sess, encoded_image):\n    \"\"\"Feeds an image and returns the initial model state.\n\n    See comments at the top of file.\n\n    Args:\n      sess: TensorFlow 
Session object.\n      encoded_image: An encoded image string.\n\n    Returns:\n      state: A numpy array of shape [1, state_size].\n    \"\"\"\n    tf.logging.fatal(\"Please implement feed_image in subclass\")\n\n  def inference_step(self, sess, input_feed, state_feed):\n    \"\"\"Runs one step of inference.\n\n    Args:\n      sess: TensorFlow Session object.\n      input_feed: A numpy array of shape [batch_size].\n      state_feed: A numpy array of shape [batch_size, state_size].\n\n    Returns:\n      softmax_output: A numpy array of shape [batch_size, vocab_size].\n      new_state: A numpy array of shape [batch_size, state_size].\n      metadata: Optional. If not None, a string containing metadata about the\n        current inference step (e.g. serialized numpy array containing\n        activations from a particular model layer.).\n    \"\"\"\n    tf.logging.fatal(\"Please implement inference_step in subclass\")\n\n# pylint: enable=unused-argument\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/inference_utils/vocabulary.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Vocabulary class for an image-to-text model.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\n\nclass Vocabulary(object):\n  \"\"\"Vocabulary class for an image-to-text model.\"\"\"\n\n  def __init__(self,\n               vocab_file,\n               start_word=\"<S>\",\n               end_word=\"</S>\",\n               unk_word=\"<UNK>\"):\n    \"\"\"Initializes the vocabulary.\n\n    Args:\n      vocab_file: File containing the vocabulary, where the words are the first\n        whitespace-separated token on each line (other tokens are ignored) and\n        the word ids are the corresponding line numbers.\n      start_word: Special word denoting sentence start.\n      end_word: Special word denoting sentence end.\n      unk_word: Special word denoting unknown words.\n    \"\"\"\n    if not tf.gfile.Exists(vocab_file):\n      tf.logging.fatal(\"Vocab file %s not found.\", vocab_file)\n    tf.logging.info(\"Initializing vocabulary from file: %s\", vocab_file)\n\n    with tf.gfile.GFile(vocab_file, mode=\"r\") as f:\n      reverse_vocab = list(f.readlines())\n    reverse_vocab = [line.split()[0] for line in reverse_vocab]\n    assert start_word in 
reverse_vocab\n    assert end_word in reverse_vocab\n    if unk_word not in reverse_vocab:\n      reverse_vocab.append(unk_word)\n    vocab = dict([(x, y) for (y, x) in enumerate(reverse_vocab)])\n\n    tf.logging.info(\"Created vocabulary with %d words\" % len(vocab))\n\n    self.vocab = vocab  # vocab[word] = id\n    self.reverse_vocab = reverse_vocab  # reverse_vocab[id] = word\n\n    # Save special word ids.\n    self.start_id = vocab[start_word]\n    self.end_id = vocab[end_word]\n    self.unk_id = vocab[unk_word]\n\n  def word_to_id(self, word):\n    \"\"\"Returns the integer word id of a word string.\"\"\"\n    if word in self.vocab:\n      return self.vocab[word]\n    else:\n      return self.unk_id\n\n  def id_to_word(self, word_id):\n    \"\"\"Returns the word string of an integer word id.\"\"\"\n    if word_id >= len(self.reverse_vocab):\n      return self.reverse_vocab[self.unk_id]\n    else:\n      return self.reverse_vocab[word_id]\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/inference_wrapper.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Model wrapper class for performing inference with a ShowAndTellModel.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\n\nfrom im2txt import show_and_tell_model\nfrom im2txt.inference_utils import inference_wrapper_base\n\n\nclass InferenceWrapper(inference_wrapper_base.InferenceWrapperBase):\n  \"\"\"Model wrapper class for performing inference with a ShowAndTellModel.\"\"\"\n\n  def __init__(self):\n    super(InferenceWrapper, self).__init__()\n\n  def build_model(self, model_config):\n    model = show_and_tell_model.ShowAndTellModel(model_config, mode=\"inference\")\n    model.build()\n    return model\n\n  def feed_image(self, sess, encoded_image):\n    initial_state = sess.run(fetches=\"lstm/initial_state:0\",\n                             feed_dict={\"image_feed:0\": encoded_image})\n    return initial_state\n\n  def inference_step(self, sess, input_feed, state_feed):\n    softmax_output, state_output = sess.run(\n        fetches=[\"softmax:0\", \"lstm/state:0\"],\n        feed_dict={\n            \"input_feed:0\": input_feed,\n            \"lstm/state_feed:0\": state_feed,\n        })\n    return softmax_output, state_output, None\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/ops/BUILD",
    "content": "package(default_visibility = [\"//im2txt:internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npy_library(\n    name = \"image_processing\",\n    srcs = [\"image_processing.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_library(\n    name = \"image_embedding\",\n    srcs = [\"image_embedding.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_test(\n    name = \"image_embedding_test\",\n    size = \"small\",\n    srcs = [\"image_embedding_test.py\"],\n    deps = [\n        \":image_embedding\",\n    ],\n)\n\npy_library(\n    name = \"inputs\",\n    srcs = [\"inputs.py\"],\n    srcs_version = \"PY2AND3\",\n)\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/ops/image_embedding.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Image embedding ops.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom tensorflow.contrib.slim.python.slim.nets.inception_v3 import inception_v3_base\n\nslim = tf.contrib.slim\n\n\ndef inception_v3(images,\n                 trainable=True,\n                 is_training=True,\n                 weight_decay=0.00004,\n                 stddev=0.1,\n                 dropout_keep_prob=0.8,\n                 use_batch_norm=True,\n                 batch_norm_params=None,\n                 add_summaries=True,\n                 scope=\"InceptionV3\"):\n  \"\"\"Builds an Inception V3 subgraph for image embeddings.\n\n  Args:\n    images: A float32 Tensor of shape [batch, height, width, channels].\n    trainable: Whether the inception submodel should be trainable or not.\n    is_training: Boolean indicating training mode or not.\n    weight_decay: Coefficient for weight regularization.\n    stddev: The standard deviation of the trunctated normal weight initializer.\n    dropout_keep_prob: Dropout keep probability.\n    use_batch_norm: Whether to use batch normalization.\n    batch_norm_params: Parameters for batch normalization. 
See\n      tf.contrib.layers.batch_norm for details.\n    add_summaries: Whether to add activation summaries.\n    scope: Optional Variable scope.\n\n  Returns:\n    end_points: A dictionary of activations from inception_v3 layers.\n  \"\"\"\n  # Only consider the inception model to be in training mode if it's trainable.\n  is_inception_model_training = trainable and is_training\n\n  if use_batch_norm:\n    # Default parameters for batch normalization.\n    if not batch_norm_params:\n      batch_norm_params = {\n          \"is_training\": is_inception_model_training,\n          \"trainable\": trainable,\n          # Decay for the moving averages.\n          \"decay\": 0.9997,\n          # Epsilon to prevent 0s in variance.\n          \"epsilon\": 0.001,\n          # Collection containing the moving mean and moving variance.\n          \"variables_collections\": {\n              \"beta\": None,\n              \"gamma\": None,\n              \"moving_mean\": [\"moving_vars\"],\n              \"moving_variance\": [\"moving_vars\"],\n          }\n      }\n  else:\n    batch_norm_params = None\n\n  if trainable:\n    weights_regularizer = tf.contrib.layers.l2_regularizer(weight_decay)\n  else:\n    weights_regularizer = None\n\n  with tf.variable_scope(scope, \"InceptionV3\", [images]) as scope:\n    with slim.arg_scope(\n        [slim.conv2d, slim.fully_connected],\n        weights_regularizer=weights_regularizer,\n        trainable=trainable):\n      with slim.arg_scope(\n          [slim.conv2d],\n          weights_initializer=tf.truncated_normal_initializer(stddev=stddev),\n          activation_fn=tf.nn.relu,\n          normalizer_fn=slim.batch_norm,\n          normalizer_params=batch_norm_params):\n        net, end_points = inception_v3_base(images, scope=scope)\n        with tf.variable_scope(\"logits\"):\n          shape = net.get_shape()\n          net = slim.avg_pool2d(net, shape[1:3], padding=\"VALID\", scope=\"pool\")\n          net = slim.dropout(\n           
   net,\n              keep_prob=dropout_keep_prob,\n              is_training=is_inception_model_training,\n              scope=\"dropout\")\n          net = slim.flatten(net, scope=\"flatten\")\n\n  # Add summaries.\n  if add_summaries:\n    for v in end_points.values():\n      tf.contrib.layers.summaries.summarize_activation(v)\n\n  return net\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/ops/image_embedding_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for tensorflow_models.im2txt.ops.image_embedding.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom im2txt.ops import image_embedding\n\n\nclass InceptionV3Test(tf.test.TestCase):\n\n  def setUp(self):\n    super(InceptionV3Test, self).setUp()\n\n    batch_size = 4\n    height = 299\n    width = 299\n    num_channels = 3\n    self._images = tf.placeholder(tf.float32,\n                                  [batch_size, height, width, num_channels])\n    self._batch_size = batch_size\n\n  def _countInceptionParameters(self):\n    \"\"\"Counts the number of parameters in the inception model at top scope.\"\"\"\n    counter = {}\n    for v in tf.all_variables():\n      name_tokens = v.op.name.split(\"/\")\n      if name_tokens[0] == \"InceptionV3\":\n        name = \"InceptionV3/\" + name_tokens[1]\n        num_params = v.get_shape().num_elements()\n        assert num_params\n        counter[name] = counter.get(name, 0) + num_params\n    return counter\n\n  def _verifyParameterCounts(self):\n    \"\"\"Verifies the number of parameters in the inception model.\"\"\"\n    param_counts = self._countInceptionParameters()\n    expected_param_counts = 
{\n        \"InceptionV3/Conv2d_1a_3x3\": 960,\n        \"InceptionV3/Conv2d_2a_3x3\": 9312,\n        \"InceptionV3/Conv2d_2b_3x3\": 18624,\n        \"InceptionV3/Conv2d_3b_1x1\": 5360,\n        \"InceptionV3/Conv2d_4a_3x3\": 138816,\n        \"InceptionV3/Mixed_5b\": 256368,\n        \"InceptionV3/Mixed_5c\": 277968,\n        \"InceptionV3/Mixed_5d\": 285648,\n        \"InceptionV3/Mixed_6a\": 1153920,\n        \"InceptionV3/Mixed_6b\": 1298944,\n        \"InceptionV3/Mixed_6c\": 1692736,\n        \"InceptionV3/Mixed_6d\": 1692736,\n        \"InceptionV3/Mixed_6e\": 2143872,\n        \"InceptionV3/Mixed_7a\": 1699584,\n        \"InceptionV3/Mixed_7b\": 5047872,\n        \"InceptionV3/Mixed_7c\": 6080064,\n    }\n    self.assertDictEqual(expected_param_counts, param_counts)\n\n  def _assertCollectionSize(self, expected_size, collection):\n    actual_size = len(tf.get_collection(collection))\n    if expected_size != actual_size:\n      self.fail(\"Found %d items in collection %s (expected %d).\" %\n                (actual_size, collection, expected_size))\n\n  def testTrainableTrueIsTrainingTrue(self):\n    embeddings = image_embedding.inception_v3(\n        self._images, trainable=True, is_training=True)\n    self.assertEqual([self._batch_size, 2048], embeddings.get_shape().as_list())\n\n    self._verifyParameterCounts()\n    self._assertCollectionSize(376, tf.GraphKeys.VARIABLES)\n    self._assertCollectionSize(188, tf.GraphKeys.TRAINABLE_VARIABLES)\n    self._assertCollectionSize(188, tf.GraphKeys.UPDATE_OPS)\n    self._assertCollectionSize(94, tf.GraphKeys.REGULARIZATION_LOSSES)\n    self._assertCollectionSize(0, tf.GraphKeys.LOSSES)\n    self._assertCollectionSize(23, tf.GraphKeys.SUMMARIES)\n\n  def testTrainableTrueIsTrainingFalse(self):\n    embeddings = image_embedding.inception_v3(\n        self._images, trainable=True, is_training=False)\n    self.assertEqual([self._batch_size, 2048], embeddings.get_shape().as_list())\n\n    
self._verifyParameterCounts()\n    self._assertCollectionSize(376, tf.GraphKeys.VARIABLES)\n    self._assertCollectionSize(188, tf.GraphKeys.TRAINABLE_VARIABLES)\n    self._assertCollectionSize(0, tf.GraphKeys.UPDATE_OPS)\n    self._assertCollectionSize(94, tf.GraphKeys.REGULARIZATION_LOSSES)\n    self._assertCollectionSize(0, tf.GraphKeys.LOSSES)\n    self._assertCollectionSize(23, tf.GraphKeys.SUMMARIES)\n\n  def testTrainableFalseIsTrainingTrue(self):\n    embeddings = image_embedding.inception_v3(\n        self._images, trainable=False, is_training=True)\n    self.assertEqual([self._batch_size, 2048], embeddings.get_shape().as_list())\n\n    self._verifyParameterCounts()\n    self._assertCollectionSize(376, tf.GraphKeys.VARIABLES)\n    self._assertCollectionSize(0, tf.GraphKeys.TRAINABLE_VARIABLES)\n    self._assertCollectionSize(0, tf.GraphKeys.UPDATE_OPS)\n    self._assertCollectionSize(0, tf.GraphKeys.REGULARIZATION_LOSSES)\n    self._assertCollectionSize(0, tf.GraphKeys.LOSSES)\n    self._assertCollectionSize(23, tf.GraphKeys.SUMMARIES)\n\n  def testTrainableFalseIsTrainingFalse(self):\n    embeddings = image_embedding.inception_v3(\n        self._images, trainable=False, is_training=False)\n    self.assertEqual([self._batch_size, 2048], embeddings.get_shape().as_list())\n\n    self._verifyParameterCounts()\n    self._assertCollectionSize(376, tf.GraphKeys.VARIABLES)\n    self._assertCollectionSize(0, tf.GraphKeys.TRAINABLE_VARIABLES)\n    self._assertCollectionSize(0, tf.GraphKeys.UPDATE_OPS)\n    self._assertCollectionSize(0, tf.GraphKeys.REGULARIZATION_LOSSES)\n    self._assertCollectionSize(0, tf.GraphKeys.LOSSES)\n    self._assertCollectionSize(23, tf.GraphKeys.SUMMARIES)\n\n\nif __name__ == \"__main__\":\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/ops/image_processing.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Helper functions for image preprocessing.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\n\ndef distort_image(image, thread_id):\n  \"\"\"Perform random distortions on an image.\n\n  Args:\n    image: A float32 Tensor of shape [height, width, 3] with values in [0, 1).\n    thread_id: Preprocessing thread id used to select the ordering of color\n      distortions. There should be a multiple of 2 preprocessing threads.\n\n  Returns:\n    distorted_image: A float32 Tensor of shape [height, width, 3] with values in\n      [0, 1].\n  \"\"\"\n  # Randomly flip horizontally.\n  with tf.name_scope(\"flip_horizontal\", values=[image]):\n    image = tf.image.random_flip_left_right(image)\n\n  # Randomly distort the colors based on thread id.\n  color_ordering = thread_id % 2\n  with tf.name_scope(\"distort_color\", values=[image]):\n    if color_ordering == 0:\n      image = tf.image.random_brightness(image, max_delta=32. 
/ 255.)\n      image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n      image = tf.image.random_hue(image, max_delta=0.032)\n      image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n    elif color_ordering == 1:\n      image = tf.image.random_brightness(image, max_delta=32. / 255.)\n      image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n      image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n      image = tf.image.random_hue(image, max_delta=0.032)\n\n    # The random_* ops do not necessarily clamp.\n    image = tf.clip_by_value(image, 0.0, 1.0)\n\n  return image\n\n\ndef process_image(encoded_image,\n                  is_training,\n                  height,\n                  width,\n                  resize_height=346,\n                  resize_width=346,\n                  thread_id=0,\n                  image_format=\"jpeg\"):\n  \"\"\"Decode an image, resize and apply random distortions.\n\n  In training, images are distorted slightly differently depending on thread_id.\n\n  Args:\n    encoded_image: String Tensor containing the image.\n    is_training: Boolean; whether preprocessing for training or eval.\n    height: Height of the output image.\n    width: Width of the output image.\n    resize_height: If > 0, resize height before crop to final dimensions.\n    resize_width: If > 0, resize width before crop to final dimensions.\n    thread_id: Preprocessing thread id used to select the ordering of color\n      distortions. There should be a multiple of 2 preprocessing threads.\n    image_format: \"jpeg\" or \"png\".\n\n  Returns:\n    A float32 Tensor of shape [height, width, 3] with values in [-1, 1].\n\n  Raises:\n    ValueError: If image_format is invalid.\n  \"\"\"\n  # Helper function to log an image summary to the visualizer. 
Summaries are\n  # only logged in thread 0.\n  def image_summary(name, image):\n    if not thread_id:\n      tf.image_summary(name, tf.expand_dims(image, 0))\n\n  # Decode image into a float32 Tensor of shape [?, ?, 3] with values in [0, 1).\n  with tf.name_scope(\"decode\", values=[encoded_image]):\n    if image_format == \"jpeg\":\n      image = tf.image.decode_jpeg(encoded_image, channels=3)\n    elif image_format == \"png\":\n      image = tf.image.decode_png(encoded_image, channels=3)\n    else:\n      raise ValueError(\"Invalid image format: %s\" % image_format)\n  image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n  image_summary(\"original_image\", image)\n\n  # Resize image.\n  assert (resize_height > 0) == (resize_width > 0)\n  if resize_height:\n    image = tf.image.resize_images(image,\n                                   size=[resize_height, resize_width],\n                                   method=tf.image.ResizeMethod.BILINEAR)\n\n  # Crop to final dimensions.\n  if is_training:\n    image = tf.random_crop(image, [height, width, 3])\n  else:\n    # Central crop, assuming resize_height > height, resize_width > width.\n    image = tf.image.resize_image_with_crop_or_pad(image, height, width)\n\n  image_summary(\"resized_image\", image)\n\n  # Randomly distort the image.\n  if is_training:\n    image = distort_image(image, thread_id)\n\n  image_summary(\"final_image\", image)\n\n  # Rescale to [-1,1] instead of [0, 1]\n  image = tf.sub(image, 0.5)\n  image = tf.mul(image, 2.0)\n  return image\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/ops/inputs.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Input ops.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\n\ndef parse_sequence_example(serialized, image_feature, caption_feature):\n  \"\"\"Parses a tensorflow.SequenceExample into an image and caption.\n\n  Args:\n    serialized: A scalar string Tensor; a single serialized SequenceExample.\n    image_feature: Name of SequenceExample context feature containing image\n      data.\n    caption_feature: Name of SequenceExample feature list containing integer\n      captions.\n\n  Returns:\n    encoded_image: A scalar string Tensor containing a JPEG encoded image.\n    caption: A 1-D uint64 Tensor with dynamically specified length.\n  \"\"\"\n  context, sequence = tf.parse_single_sequence_example(\n      serialized,\n      context_features={\n          image_feature: tf.FixedLenFeature([], dtype=tf.string)\n      },\n      sequence_features={\n          caption_feature: tf.FixedLenSequenceFeature([], dtype=tf.int64),\n      })\n\n  encoded_image = context[image_feature]\n  caption = sequence[caption_feature]\n  return encoded_image, caption\n\n\ndef prefetch_input_data(reader,\n                        file_pattern,\n                        
is_training,\n                        batch_size,\n                        values_per_shard,\n                        input_queue_capacity_factor=16,\n                        num_reader_threads=1,\n                        shard_queue_name=\"filename_queue\",\n                        value_queue_name=\"input_queue\"):\n  \"\"\"Prefetches string values from disk into an input queue.\n\n  In training the capacity of the queue is important because a larger queue\n  means better mixing of training examples between shards. The minimum number of\n  values kept in the queue is values_per_shard * input_queue_capacity_factor,\n  where input_queue_capacity_factor should be chosen to trade-off better mixing\n  with memory usage.\n\n  Args:\n    reader: Instance of tf.ReaderBase.\n    file_pattern: Comma-separated list of file patterns (e.g.\n        /tmp/train_data-?????-of-00100).\n    is_training: Boolean; whether prefetching for training or eval.\n    batch_size: Model batch size used to determine queue capacity.\n    values_per_shard: Approximate number of values per shard.\n    input_queue_capacity_factor: Minimum number of values to keep in the queue\n      in multiples of values_per_shard. 
See comments above.\n    num_reader_threads: Number of reader threads to fill the queue.\n    shard_queue_name: Name for the shards filename queue.\n    value_queue_name: Name for the values input queue.\n\n  Returns:\n    A Queue containing prefetched string values.\n  \"\"\"\n  data_files = []\n  for pattern in file_pattern.split(\",\"):\n    data_files.extend(tf.gfile.Glob(pattern))\n  if not data_files:\n    tf.logging.fatal(\"Found no input files matching %s\", file_pattern)\n  else:\n    tf.logging.info(\"Prefetching values from %d files matching %s\",\n                    len(data_files), file_pattern)\n\n  if is_training:\n    filename_queue = tf.train.string_input_producer(\n        data_files, shuffle=True, capacity=16, name=shard_queue_name)\n    min_queue_examples = values_per_shard * input_queue_capacity_factor\n    capacity = min_queue_examples + 100 * batch_size\n    values_queue = tf.RandomShuffleQueue(\n        capacity=capacity,\n        min_after_dequeue=min_queue_examples,\n        dtypes=[tf.string],\n        name=\"random_\" + value_queue_name)\n  else:\n    filename_queue = tf.train.string_input_producer(\n        data_files, shuffle=False, capacity=1, name=shard_queue_name)\n    capacity = values_per_shard + 3 * batch_size\n    values_queue = tf.FIFOQueue(\n        capacity=capacity, dtypes=[tf.string], name=\"fifo_\" + value_queue_name)\n\n  enqueue_ops = []\n  for _ in range(num_reader_threads):\n    _, value = reader.read(filename_queue)\n    enqueue_ops.append(values_queue.enqueue([value]))\n  tf.train.queue_runner.add_queue_runner(tf.train.queue_runner.QueueRunner(\n      values_queue, enqueue_ops))\n  tf.scalar_summary(\n      \"queue/%s/fraction_of_%d_full\" % (values_queue.name, capacity),\n      tf.cast(values_queue.size(), tf.float32) * (1. 
/ capacity))\n\n  return values_queue\n\n\ndef batch_with_dynamic_pad(images_and_captions,\n                           batch_size,\n                           queue_capacity,\n                           add_summaries=True):\n  \"\"\"Batches input images and captions.\n\n  This function splits the caption into an input sequence and a target sequence,\n  where the target sequence is the input sequence right-shifted by 1. Input and\n  target sequences are batched and padded up to the maximum length of sequences\n  in the batch. A mask is created to distinguish real words from padding words.\n\n  Example:\n    Actual captions in the batch ('-' denotes padded character):\n      [\n        [ 1 2 5 4 5 ],\n        [ 1 2 3 4 - ],\n        [ 1 2 3 - - ],\n      ]\n\n    input_seqs:\n      [\n        [ 1 2 3 4 ],\n        [ 1 2 3 - ],\n        [ 1 2 - - ],\n      ]\n\n    target_seqs:\n      [\n        [ 2 3 4 5 ],\n        [ 2 3 4 - ],\n        [ 2 3 - - ],\n      ]\n\n    mask:\n      [\n        [ 1 1 1 1 ],\n        [ 1 1 1 0 ],\n        [ 1 1 0 0 ],\n      ]\n\n  Args:\n    images_and_captions: A list of pairs [image, caption], where image is a\n      Tensor of shape [height, width, channels] and caption is a 1-D Tensor of\n      any length. 
Each pair will be processed and added to the queue in a\n      separate thread.\n    batch_size: Batch size.\n    queue_capacity: Queue capacity.\n    add_summaries: If true, add caption length summaries.\n\n  Returns:\n    images: A Tensor of shape [batch_size, height, width, channels].\n    input_seqs: An int32 Tensor of shape [batch_size, padded_length].\n    target_seqs: An int32 Tensor of shape [batch_size, padded_length].\n    mask: An int32 0/1 Tensor of shape [batch_size, padded_length].\n  \"\"\"\n  enqueue_list = []\n  for image, caption in images_and_captions:\n    caption_length = tf.shape(caption)[0]\n    input_length = tf.expand_dims(tf.sub(caption_length, 1), 0)\n\n    input_seq = tf.slice(caption, [0], input_length)\n    target_seq = tf.slice(caption, [1], input_length)\n    indicator = tf.ones(input_length, dtype=tf.int32)\n    enqueue_list.append([image, input_seq, target_seq, indicator])\n\n  images, input_seqs, target_seqs, mask = tf.train.batch_join(\n      enqueue_list,\n      batch_size=batch_size,\n      capacity=queue_capacity,\n      dynamic_pad=True,\n      name=\"batch_and_pad\")\n\n  if add_summaries:\n    lengths = tf.add(tf.reduce_sum(mask, 1), 1)\n    tf.scalar_summary(\"caption_length/batch_min\", tf.reduce_min(lengths))\n    tf.scalar_summary(\"caption_length/batch_max\", tf.reduce_max(lengths))\n    tf.scalar_summary(\"caption_length/batch_mean\", tf.reduce_mean(lengths))\n\n  return images, input_seqs, target_seqs, mask\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/run_inference.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Generate captions for images using default beam search parameters.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nimport os\n\n\nimport tensorflow as tf\n\nfrom im2txt import configuration\nfrom im2txt import inference_wrapper\nfrom im2txt.inference_utils import caption_generator\nfrom im2txt.inference_utils import vocabulary\n\nFLAGS = tf.flags.FLAGS\n\ntf.flags.DEFINE_string(\"checkpoint_path\", \"\",\n                       \"Model checkpoint file or directory containing a \"\n                       \"model checkpoint file.\")\ntf.flags.DEFINE_string(\"vocab_file\", \"\", \"Text file containing the vocabulary.\")\ntf.flags.DEFINE_string(\"input_files\", \"\",\n                       \"File pattern or comma-separated list of file patterns \"\n                       \"of image files.\")\n\n\ndef main(_):\n  # Build the inference graph.\n  g = tf.Graph()\n  with g.as_default():\n    model = inference_wrapper.InferenceWrapper()\n    restore_fn = model.build_graph_from_config(configuration.ModelConfig(),\n                                               FLAGS.checkpoint_path)\n  g.finalize()\n\n  # Create the vocabulary.\n  vocab = 
vocabulary.Vocabulary(FLAGS.vocab_file)\n\n  filenames = []\n  for file_pattern in FLAGS.input_files.split(\",\"):\n    filenames.extend(tf.gfile.Glob(file_pattern))\n  tf.logging.info(\"Running caption generation on %d files matching %s\",\n                  len(filenames), FLAGS.input_files)\n\n  with tf.Session(graph=g) as sess:\n    # Load the model from checkpoint.\n    restore_fn(sess)\n\n    # Prepare the caption generator. Here we are implicitly using the default\n    # beam search parameters. See caption_generator.py for a description of the\n    # available beam search parameters.\n    generator = caption_generator.CaptionGenerator(model, vocab)\n\n    for filename in filenames:\n      with tf.gfile.GFile(filename, \"r\") as f:\n        image = f.read()\n      captions = generator.beam_search(sess, image)\n      print(\"Captions for image %s:\" % os.path.basename(filename))\n      for i, caption in enumerate(captions):\n        # Ignore begin and end words.\n        sentence = [vocab.id_to_word(w) for w in caption.sentence[1:-1]]\n        sentence = \" \".join(sentence)\n        print(\"  %d) %s (p=%f)\" % (i, sentence, math.exp(caption.logprob)))\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/show_and_tell_model.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Image-to-text implementation based on http://arxiv.org/abs/1411.4555.\n\n\"Show and Tell: A Neural Image Caption Generator\"\nOriol Vinyals, Alexander Toshev, Samy Bengio, Dumitru Erhan\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom im2txt.ops import image_embedding\nfrom im2txt.ops import image_processing\nfrom im2txt.ops import inputs as input_ops\n\n\nclass ShowAndTellModel(object):\n  \"\"\"Image-to-text implementation based on http://arxiv.org/abs/1411.4555.\n\n  \"Show and Tell: A Neural Image Caption Generator\"\n  Oriol Vinyals, Alexander Toshev, Samy Bengio, Dumitru Erhan\n  \"\"\"\n\n  def __init__(self, config, mode, train_inception=False):\n    \"\"\"Basic setup.\n\n    Args:\n      config: Object containing configuration parameters.\n      mode: \"train\", \"eval\" or \"inference\".\n      train_inception: Whether the inception submodel variables are trainable.\n    \"\"\"\n    assert mode in [\"train\", \"eval\", \"inference\"]\n    self.config = config\n    self.mode = mode\n    self.train_inception = train_inception\n\n    # Reader for the input data.\n    self.reader = tf.TFRecordReader()\n\n    # To match the \"Show and 
Tell\" paper we initialize all variables with a\n    # random uniform initializer.\n    self.initializer = tf.random_uniform_initializer(\n        minval=-self.config.initializer_scale,\n        maxval=self.config.initializer_scale)\n\n    # A float32 Tensor with shape [batch_size, height, width, channels].\n    self.images = None\n\n    # An int32 Tensor with shape [batch_size, padded_length].\n    self.input_seqs = None\n\n    # An int32 Tensor with shape [batch_size, padded_length].\n    self.target_seqs = None\n\n    # An int32 0/1 Tensor with shape [batch_size, padded_length].\n    self.input_mask = None\n\n    # A float32 Tensor with shape [batch_size, embedding_size].\n    self.image_embeddings = None\n\n    # A float32 Tensor with shape [batch_size, padded_length, embedding_size].\n    self.seq_embeddings = None\n\n    # A float32 scalar Tensor; the total loss for the trainer to optimize.\n    self.total_loss = None\n\n    # A float32 Tensor with shape [batch_size * padded_length].\n    self.target_cross_entropy_losses = None\n\n    # A float32 Tensor with shape [batch_size * padded_length].\n    self.target_cross_entropy_loss_weights = None\n\n    # Collection of variables from the inception submodel.\n    self.inception_variables = []\n\n    # Function to restore the inception submodel from checkpoint.\n    self.init_fn = None\n\n    # Global step Tensor.\n    self.global_step = None\n\n  def is_training(self):\n    \"\"\"Returns true if the model is built for training mode.\"\"\"\n    return self.mode == \"train\"\n\n  def process_image(self, encoded_image, thread_id=0):\n    \"\"\"Decodes and processes an image string.\n\n    Args:\n      encoded_image: A scalar string Tensor; the encoded image.\n      thread_id: Preprocessing thread id used to select the ordering of color\n        distortions.\n\n    Returns:\n      A float32 Tensor of shape [height, width, 3]; the processed image.\n    \"\"\"\n    return image_processing.process_image(encoded_image,\n 
                                         is_training=self.is_training(),\n                                          height=self.config.image_height,\n                                          width=self.config.image_width,\n                                          thread_id=thread_id,\n                                          image_format=self.config.image_format)\n\n  def build_inputs(self):\n    \"\"\"Input prefetching, preprocessing and batching.\n\n    Outputs:\n      self.images\n      self.input_seqs\n      self.target_seqs (training and eval only)\n      self.input_mask (training and eval only)\n    \"\"\"\n    if self.mode == \"inference\":\n      # In inference mode, images and inputs are fed via placeholders.\n      image_feed = tf.placeholder(dtype=tf.string, shape=[], name=\"image_feed\")\n      input_feed = tf.placeholder(dtype=tf.int64,\n                                  shape=[None],  # batch_size\n                                  name=\"input_feed\")\n\n      # Process image and insert batch dimensions.\n      images = tf.expand_dims(self.process_image(image_feed), 0)\n      input_seqs = tf.expand_dims(input_feed, 1)\n\n      # No target sequences or input mask in inference mode.\n      target_seqs = None\n      input_mask = None\n    else:\n      # Prefetch serialized SequenceExample protos.\n      input_queue = input_ops.prefetch_input_data(\n          self.reader,\n          self.config.input_file_pattern,\n          is_training=self.is_training(),\n          batch_size=self.config.batch_size,\n          values_per_shard=self.config.values_per_input_shard,\n          input_queue_capacity_factor=self.config.input_queue_capacity_factor,\n          num_reader_threads=self.config.num_input_reader_threads)\n\n      # Image processing and random distortion. 
Split across multiple threads\n      # with each thread applying a slightly different distortion.\n      assert self.config.num_preprocess_threads % 2 == 0\n      images_and_captions = []\n      for thread_id in range(self.config.num_preprocess_threads):\n        serialized_sequence_example = input_queue.dequeue()\n        encoded_image, caption = input_ops.parse_sequence_example(\n            serialized_sequence_example,\n            image_feature=self.config.image_feature_name,\n            caption_feature=self.config.caption_feature_name)\n        image = self.process_image(encoded_image, thread_id=thread_id)\n        images_and_captions.append([image, caption])\n\n      # Batch inputs.\n      queue_capacity = (2 * self.config.num_preprocess_threads *\n                        self.config.batch_size)\n      images, input_seqs, target_seqs, input_mask = (\n          input_ops.batch_with_dynamic_pad(images_and_captions,\n                                           batch_size=self.config.batch_size,\n                                           queue_capacity=queue_capacity))\n\n    self.images = images\n    self.input_seqs = input_seqs\n    self.target_seqs = target_seqs\n    self.input_mask = input_mask\n\n  def build_image_embeddings(self):\n    \"\"\"Builds the image model subgraph and generates image embeddings.\n\n    Inputs:\n      self.images\n\n    Outputs:\n      self.image_embeddings\n    \"\"\"\n    inception_output = image_embedding.inception_v3(\n        self.images,\n        trainable=self.train_inception,\n        is_training=self.is_training())\n    self.inception_variables = tf.get_collection(\n        tf.GraphKeys.VARIABLES, scope=\"InceptionV3\")\n\n    # Map inception output into embedding space.\n    with tf.variable_scope(\"image_embedding\") as scope:\n      image_embeddings = tf.contrib.layers.fully_connected(\n          inputs=inception_output,\n          num_outputs=self.config.embedding_size,\n          activation_fn=None,\n          
weights_initializer=self.initializer,\n          biases_initializer=None,\n          scope=scope)\n\n    # Save the embedding size in the graph.\n    tf.constant(self.config.embedding_size, name=\"embedding_size\")\n\n    self.image_embeddings = image_embeddings\n\n  def build_seq_embeddings(self):\n    \"\"\"Builds the input sequence embeddings.\n\n    Inputs:\n      self.input_seqs\n\n    Outputs:\n      self.seq_embeddings\n    \"\"\"\n    with tf.variable_scope(\"seq_embedding\"), tf.device(\"/cpu:0\"):\n      embedding_map = tf.get_variable(\n          name=\"map\",\n          shape=[self.config.vocab_size, self.config.embedding_size],\n          initializer=self.initializer)\n      seq_embeddings = tf.nn.embedding_lookup(embedding_map, self.input_seqs)\n\n    self.seq_embeddings = seq_embeddings\n\n  def build_model(self):\n    \"\"\"Builds the model.\n\n    Inputs:\n      self.image_embeddings\n      self.seq_embeddings\n      self.target_seqs (training and eval only)\n      self.input_mask (training and eval only)\n\n    Outputs:\n      self.total_loss (training and eval only)\n      self.target_cross_entropy_losses (training and eval only)\n      self.target_cross_entropy_loss_weights (training and eval only)\n    \"\"\"\n    # This LSTM cell has biases and outputs tanh(new_c) * sigmoid(o), but the\n    # modified LSTM in the \"Show and Tell\" paper has no biases and outputs\n    # new_c * sigmoid(o).\n    lstm_cell = tf.nn.rnn_cell.BasicLSTMCell(\n        num_units=self.config.num_lstm_units, state_is_tuple=True)\n    if self.mode == \"train\":\n      lstm_cell = tf.nn.rnn_cell.DropoutWrapper(\n          lstm_cell,\n          input_keep_prob=self.config.lstm_dropout_keep_prob,\n          output_keep_prob=self.config.lstm_dropout_keep_prob)\n\n    with tf.variable_scope(\"lstm\", initializer=self.initializer) as lstm_scope:\n      # Feed the image embeddings to set the initial LSTM state.\n      zero_state = lstm_cell.zero_state(\n          
batch_size=self.image_embeddings.get_shape()[0], dtype=tf.float32)\n      _, initial_state = lstm_cell(self.image_embeddings, zero_state)\n\n      # Allow the LSTM variables to be reused.\n      lstm_scope.reuse_variables()\n\n      if self.mode == \"inference\":\n        # In inference mode, use concatenated states for convenient feeding and\n        # fetching.\n        tf.concat(1, initial_state, name=\"initial_state\")\n\n        # Placeholder for feeding a batch of concatenated states.\n        state_feed = tf.placeholder(dtype=tf.float32,\n                                    shape=[None, sum(lstm_cell.state_size)],\n                                    name=\"state_feed\")\n        state_tuple = tf.split(1, 2, state_feed)\n\n        # Run a single LSTM step.\n        lstm_outputs, state_tuple = lstm_cell(\n            inputs=tf.squeeze(self.seq_embeddings, squeeze_dims=[1]),\n            state=state_tuple)\n\n        # Concatentate the resulting state.\n        tf.concat(1, state_tuple, name=\"state\")\n      else:\n        # Run the batch of sequence embeddings through the LSTM.\n        sequence_length = tf.reduce_sum(self.input_mask, 1)\n        lstm_outputs, _ = tf.nn.dynamic_rnn(cell=lstm_cell,\n                                            inputs=self.seq_embeddings,\n                                            sequence_length=sequence_length,\n                                            initial_state=initial_state,\n                                            dtype=tf.float32,\n                                            scope=lstm_scope)\n\n    # Stack batches vertically.\n    lstm_outputs = tf.reshape(lstm_outputs, [-1, lstm_cell.output_size])\n\n    with tf.variable_scope(\"logits\") as logits_scope:\n      logits = tf.contrib.layers.fully_connected(\n          inputs=lstm_outputs,\n          num_outputs=self.config.vocab_size,\n          activation_fn=None,\n          weights_initializer=self.initializer,\n          scope=logits_scope)\n\n    if 
self.mode == \"inference\":\n      tf.nn.softmax(logits, name=\"softmax\")\n    else:\n      targets = tf.reshape(self.target_seqs, [-1])\n      weights = tf.to_float(tf.reshape(self.input_mask, [-1]))\n\n      # Compute losses.\n      losses = tf.nn.sparse_softmax_cross_entropy_with_logits(logits, targets)\n      batch_loss = tf.div(tf.reduce_sum(tf.mul(losses, weights)),\n                          tf.reduce_sum(weights),\n                          name=\"batch_loss\")\n      tf.contrib.losses.add_loss(batch_loss)\n      total_loss = tf.contrib.losses.get_total_loss()\n\n      # Add summaries.\n      tf.scalar_summary(\"batch_loss\", batch_loss)\n      tf.scalar_summary(\"total_loss\", total_loss)\n      for var in tf.trainable_variables():\n        tf.histogram_summary(var.op.name, var)\n\n      self.total_loss = total_loss\n      self.target_cross_entropy_losses = losses  # Used in evaluation.\n      self.target_cross_entropy_loss_weights = weights  # Used in evaluation.\n\n  def setup_inception_initializer(self):\n    \"\"\"Sets up the function to restore inception variables from checkpoint.\"\"\"\n    if self.mode != \"inference\":\n      # Restore inception variables only.\n      saver = tf.train.Saver(self.inception_variables)\n\n      def restore_fn(sess):\n        tf.logging.info(\"Restoring Inception variables from checkpoint file %s\",\n                        self.config.inception_checkpoint_file)\n        saver.restore(sess, self.config.inception_checkpoint_file)\n\n      self.init_fn = restore_fn\n\n  def setup_global_step(self):\n    \"\"\"Sets up the global step Tensor.\"\"\"\n    global_step = tf.Variable(\n        initial_value=0,\n        name=\"global_step\",\n        trainable=False,\n        collections=[tf.GraphKeys.GLOBAL_STEP, tf.GraphKeys.VARIABLES])\n\n    self.global_step = global_step\n\n  def build(self):\n    \"\"\"Creates all ops for training and evaluation.\"\"\"\n    self.build_inputs()\n    self.build_image_embeddings()\n    
self.build_seq_embeddings()\n    self.build_model()\n    self.setup_inception_initializer()\n    self.setup_global_step()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/show_and_tell_model_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for tensorflow_models.im2txt.show_and_tell_model.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom im2txt import configuration\nfrom im2txt import show_and_tell_model\n\n\nclass ShowAndTellModel(show_and_tell_model.ShowAndTellModel):\n  \"\"\"Subclass of ShowAndTellModel without the disk I/O.\"\"\"\n\n  def build_inputs(self):\n    if self.mode == \"inference\":\n      # Inference mode doesn't read from disk, so defer to parent.\n      return super(ShowAndTellModel, self).build_inputs()\n    else:\n      # Replace disk I/O with random Tensors.\n      self.images = tf.random_uniform(\n          shape=[self.config.batch_size, self.config.image_height,\n                 self.config.image_width, 3],\n          minval=-1,\n          maxval=1)\n      self.input_seqs = tf.random_uniform(\n          [self.config.batch_size, 15],\n          minval=0,\n          maxval=self.config.vocab_size,\n          dtype=tf.int64)\n      self.target_seqs = tf.random_uniform(\n          [self.config.batch_size, 15],\n          minval=0,\n          maxval=self.config.vocab_size,\n          dtype=tf.int64)\n      self.input_mask = 
tf.ones_like(self.input_seqs)\n\n\nclass ShowAndTellModelTest(tf.test.TestCase):\n\n  def setUp(self):\n    super(ShowAndTellModelTest, self).setUp()\n    self._model_config = configuration.ModelConfig()\n\n  def _countModelParameters(self):\n    \"\"\"Counts the number of parameters in the model at top level scope.\"\"\"\n    counter = {}\n    for v in tf.all_variables():\n      name = v.op.name.split(\"/\")[0]\n      num_params = v.get_shape().num_elements()\n      assert num_params\n      counter[name] = counter.get(name, 0) + num_params\n    return counter\n\n  def _checkModelParameters(self):\n    \"\"\"Verifies the number of parameters in the model.\"\"\"\n    param_counts = self._countModelParameters()\n    expected_param_counts = {\n        \"InceptionV3\": 21802784,\n        # inception_output_size * embedding_size\n        \"image_embedding\": 1048576,\n        # vocab_size * embedding_size\n        \"seq_embedding\": 6144000,\n        # (embedding_size + num_lstm_units + 1) * 4 * num_lstm_units\n        \"lstm\": 2099200,\n        # (num_lstm_units + 1) * vocab_size\n        \"logits\": 6156000,\n        \"global_step\": 1,\n    }\n    self.assertDictEqual(expected_param_counts, param_counts)\n\n  def _checkOutputs(self, expected_shapes, feed_dict=None):\n    \"\"\"Verifies that the model produces expected outputs.\n\n    Args:\n      expected_shapes: A dict mapping Tensor or Tensor name to expected output\n        shape.\n      feed_dict: Values of Tensors to feed into Session.run().\n    \"\"\"\n    fetches = expected_shapes.keys()\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      outputs = sess.run(fetches, feed_dict)\n\n    for index, output in enumerate(outputs):\n      tensor = fetches[index]\n      expected = expected_shapes[tensor]\n      actual = output.shape\n      if expected != actual:\n        self.fail(\"Tensor %s has shape %s (expected %s).\" %\n                  (tensor, actual, expected))\n\n  
def testBuildForTraining(self):\n    model = ShowAndTellModel(self._model_config, mode=\"train\")\n    model.build()\n\n    self._checkModelParameters()\n\n    expected_shapes = {\n        # [batch_size, image_height, image_width, 3]\n        model.images: (32, 299, 299, 3),\n        # [batch_size, sequence_length]\n        model.input_seqs: (32, 15),\n        # [batch_size, sequence_length]\n        model.target_seqs: (32, 15),\n        # [batch_size, sequence_length]\n        model.input_mask: (32, 15),\n        # [batch_size, embedding_size]\n        model.image_embeddings: (32, 512),\n        # [batch_size, sequence_length, embedding_size]\n        model.seq_embeddings: (32, 15, 512),\n        # Scalar\n        model.total_loss: (),\n        # [batch_size * sequence_length]\n        model.target_cross_entropy_losses: (480,),\n        # [batch_size * sequence_length]\n        model.target_cross_entropy_loss_weights: (480,),\n    }\n    self._checkOutputs(expected_shapes)\n\n  def testBuildForEval(self):\n    model = ShowAndTellModel(self._model_config, mode=\"eval\")\n    model.build()\n\n    self._checkModelParameters()\n\n    expected_shapes = {\n        # [batch_size, image_height, image_width, 3]\n        model.images: (32, 299, 299, 3),\n        # [batch_size, sequence_length]\n        model.input_seqs: (32, 15),\n        # [batch_size, sequence_length]\n        model.target_seqs: (32, 15),\n        # [batch_size, sequence_length]\n        model.input_mask: (32, 15),\n        # [batch_size, embedding_size]\n        model.image_embeddings: (32, 512),\n        # [batch_size, sequence_length, embedding_size]\n        model.seq_embeddings: (32, 15, 512),\n        # Scalar\n        model.total_loss: (),\n        # [batch_size * sequence_length]\n        model.target_cross_entropy_losses: (480,),\n        # [batch_size * sequence_length]\n        model.target_cross_entropy_loss_weights: (480,),\n    }\n    self._checkOutputs(expected_shapes)\n\n  def 
testBuildForInference(self):\n    model = ShowAndTellModel(self._model_config, mode=\"inference\")\n    model.build()\n\n    self._checkModelParameters()\n\n    # Test feeding an image to get the initial LSTM state.\n    images_feed = np.random.rand(1, 299, 299, 3)\n    feed_dict = {model.images: images_feed}\n    expected_shapes = {\n        # [batch_size, embedding_size]\n        model.image_embeddings: (1, 512),\n        # [batch_size, 2 * num_lstm_units]\n        \"lstm/initial_state:0\": (1, 1024),\n    }\n    self._checkOutputs(expected_shapes, feed_dict)\n\n    # Test feeding a batch of inputs and LSTM states to get softmax output and\n    # LSTM states.\n    input_feed = np.random.randint(0, 10, size=3)\n    state_feed = np.random.rand(3, 1024)\n    feed_dict = {\"input_feed:0\": input_feed, \"lstm/state_feed:0\": state_feed}\n    expected_shapes = {\n        # [batch_size, 2 * num_lstm_units]\n        \"lstm/state:0\": (3, 1024),\n        # [batch_size, vocab_size]\n        \"softmax:0\": (3, 12000),\n    }\n    self._checkOutputs(expected_shapes, feed_dict)\n\n\nif __name__ == \"__main__\":\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/im2txt/im2txt/train.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Train the model.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom im2txt import configuration\nfrom im2txt import show_and_tell_model\n\nFLAGS = tf.app.flags.FLAGS\n\ntf.flags.DEFINE_string(\"input_file_pattern\", \"\",\n                       \"File pattern of sharded TFRecord input files.\")\ntf.flags.DEFINE_string(\"inception_checkpoint_file\", \"\",\n                       \"Path to a pretrained inception_v3 model.\")\ntf.flags.DEFINE_string(\"train_dir\", \"\",\n                       \"Directory for saving and loading model checkpoints.\")\ntf.flags.DEFINE_boolean(\"train_inception\", False,\n                        \"Whether to train inception submodel variables.\")\ntf.flags.DEFINE_integer(\"number_of_steps\", 1000000, \"Number of training steps.\")\ntf.flags.DEFINE_integer(\"log_every_n_steps\", 1,\n                        \"Frequency at which loss and global step are logged.\")\n\ntf.logging.set_verbosity(tf.logging.INFO)\n\n\ndef main(unused_argv):\n  assert FLAGS.input_file_pattern, \"--input_file_pattern is required\"\n  assert FLAGS.train_dir, \"--train_dir is required\"\n\n  model_config = configuration.ModelConfig()\n  
model_config.input_file_pattern = FLAGS.input_file_pattern\n  model_config.inception_checkpoint_file = FLAGS.inception_checkpoint_file\n  training_config = configuration.TrainingConfig()\n\n  # Create training directory.\n  train_dir = FLAGS.train_dir\n  if not tf.gfile.IsDirectory(train_dir):\n    tf.logging.info(\"Creating training directory: %s\", train_dir)\n    tf.gfile.MakeDirs(train_dir)\n\n  # Build the TensorFlow graph.\n  g = tf.Graph()\n  with g.as_default():\n    # Build the model.\n    model = show_and_tell_model.ShowAndTellModel(\n        model_config, mode=\"train\", train_inception=FLAGS.train_inception)\n    model.build()\n\n    # Set up the learning rate.\n    learning_rate_decay_fn = None\n    if FLAGS.train_inception:\n      learning_rate = tf.constant(training_config.train_inception_learning_rate)\n    else:\n      learning_rate = tf.constant(training_config.initial_learning_rate)\n      if training_config.learning_rate_decay_factor > 0:\n        num_batches_per_epoch = (training_config.num_examples_per_epoch /\n                                 model_config.batch_size)\n        decay_steps = int(num_batches_per_epoch *\n                          training_config.num_epochs_per_decay)\n\n        def _learning_rate_decay_fn(learning_rate, global_step):\n          return tf.train.exponential_decay(\n              learning_rate,\n              global_step,\n              decay_steps=decay_steps,\n              decay_rate=training_config.learning_rate_decay_factor,\n              staircase=True)\n\n        learning_rate_decay_fn = _learning_rate_decay_fn\n\n    # Set up the training ops.\n    train_op = tf.contrib.layers.optimize_loss(\n        loss=model.total_loss,\n        global_step=model.global_step,\n        learning_rate=learning_rate,\n        optimizer=training_config.optimizer,\n        clip_gradients=training_config.clip_gradients,\n        learning_rate_decay_fn=learning_rate_decay_fn)\n\n    # Set up the Saver for saving and restoring 
model checkpoints.\n    saver = tf.train.Saver(max_to_keep=training_config.max_checkpoints_to_keep)\n\n  # Run training.\n  tf.contrib.slim.learning.train(\n      train_op,\n      train_dir,\n      log_every_n_steps=FLAGS.log_every_n_steps,\n      graph=g,\n      global_step=model.global_step,\n      number_of_steps=FLAGS.number_of_steps,\n      init_fn=model.init_fn,\n      saver=saver)\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/.gitignore",
    "content": "/bazel-bin\n/bazel-ci_build-cache\n/bazel-genfiles\n/bazel-out\n/bazel-inception\n/bazel-testlogs\n/bazel-tf\n"
  },
  {
    "path": "model_zoo/models/inception/README.md",
    "content": "# Inception in TensorFlow\n\n[ImageNet](http://www.image-net.org/) is a common academic data set in machine\nlearning for training an image recognition system. Code in this directory\ndemonstrates how to use TensorFlow to train and evaluate a type of convolutional\nneural network (CNN) on this academic data set. In particular, we demonstrate\nhow to train the Inception v3 architecture as specified in:\n\n_Rethinking the Inception Architecture for Computer Vision_\n\nChristian Szegedy, Vincent Vanhoucke, Sergey Ioffe, Jonathon Shlens, Zbigniew\nWojna\n\nhttp://arxiv.org/abs/1512.00567\n\nThis network achieves 21.2% top-1 and 5.6% top-5 error for single frame\nevaluation with a computational cost of 5 billion multiply-adds per inference\nand with using less than 25 million parameters. Below is a visualization of the\nmodel architecture.\n\n<center>\n![Inception-v3 Architecture](g3doc/inception_v3_architecture.png)\n</center>\n\n## Description of Code\n\nThe code base provides three core binaries for:\n\n*   Training an Inception v3 network from scratch across multiple GPUs and/or\n    multiple machines using the ImageNet 2012 Challenge training data set.\n*   Evaluating an Inception v3 network using the ImageNet 2012 Challenge\n    validation data set.\n*   Retraining an Inception v3 network on a novel task and back-propagating the\n    errors to fine tune the network weights.\n\nThe training procedure employs synchronous stochastic gradient descent across\nmultiple GPUs. The user may specify the number of GPUs they wish harness. The\nsynchronous training performs *batch-splitting* by dividing a given batch across\nmultiple GPUs.\n\nThe training set up is nearly identical to the section [Training a Model Using\nMultiple GPU Cards]\n(https://www.tensorflow.org/tutorials/deep_cnn/index.html#training-a-model-using-multiple-gpu-cards)\nwhere we have substituted the CIFAR-10 model architecture with Inception v3. 
The\nprimary differences with that setup are:\n\n*   Calculate and update the batch-norm statistics during training so that they\n    may be substituted in during evaluation.\n*   Specify the model architecture using a (still experimental) higher level\n    language called TensorFlow-Slim.\n\nFor more details about TensorFlow-Slim, please see the [Slim README]\n(inception/slim/README.md). Please note that this higher-level language is still\n*experimental* and the API may change over time depending on usage and\nsubsequent research.\n\n## Getting Started\n\n**NOTE** Before doing anything, we first need to build TensorFlow from source,\nand installed as a PIP package. Please follow the instructions at [Installing\nFrom Source]\n(https://www.tensorflow.org/get_started/os_setup.html#create-the-pip-package-and-install).\n\nBefore you run the training script for the first time, you will need to download\nand convert the ImageNet data to native TFRecord format. The TFRecord format\nconsists of a set of sharded files where each entry is a serialized `tf.Example`\nproto. Each `tf.Example` proto contains the ImageNet image (JPEG encoded) as\nwell as metadata such as label and bounding box information. See\n[`parse_example_proto`](inception/image_processing.py) for details.\n\nWe provide a single [script](inception/data/download_and_preprocess_imagenet.sh) for\ndownloading and converting ImageNet data to TFRecord format. Downloading and\npreprocessing the data may take several hours (up to half a day) depending on\nyour network and computer speed. Please be patient.\n\nTo begin, you will need to sign up for an account with [ImageNet]\n(http://image-net.org) to gain access to the data. Look for the sign up page,\ncreate an account and request an access key to download the data.\n\nAfter you have `USERNAME` and `PASSWORD`, you are ready to run our script. Make\nsure that your hard disk has at least 500 GB of free space for downloading and\nstoring the data. 
Here we select `DATA_DIR=$HOME/imagenet-data` as such a\nlocation but feel free to edit accordingly.\n\nWhen you run the script below, please enter *USERNAME* and *PASSWORD* when\nprompted. This will occur at the very beginning. Once these values are entered,\nyou will not need to interact with the script again.\n\n```shell\n# location of where to place the ImageNet data\nDATA_DIR=$HOME/imagenet-data\n\n# build the preprocessing script.\nbazel build inception/download_and_preprocess_imagenet\n\n# run it\nbazel-bin/inception/download_and_preprocess_imagenet \"${DATA_DIR}\"\n```\n\nThe final line of the script's output should read:\n\n```shell\n2016-02-17 14:30:17.287989: Finished writing all 1281167 images in data set.\n```\n\nWhen the script finishes you will find 1024 training files and 128 validation\nfiles in the `DATA_DIR`. The files will match the patterns `train-?????-of-01024`\nand `validation-?????-of-00128`, respectively.\n\n[Congratulations!](https://www.youtube.com/watch?v=9bZkp7q19f0) You are now\nready to train or evaluate with the ImageNet data set.\n\n## How to Train from Scratch\n\n**WARNING** Training an Inception v3 network from scratch is a computationally\nintensive task and, depending on your compute setup, may take several days or even\nweeks.\n\n*Before proceeding* please read the [Convolutional Neural Networks]\n(https://www.tensorflow.org/tutorials/deep_cnn/index.html) tutorial; in\nparticular, focus on [Training a Model Using Multiple GPU Cards]\n(https://www.tensorflow.org/tutorials/deep_cnn/index.html#training-a-model-using-multiple-gpu-cards)\n. The model training method is nearly identical to that described in the\nCIFAR-10 multi-GPU model training. Briefly, the model training\n\n*   Places an individual model replica on each GPU. 
Splits the batch across the\n    GPUs.\n*   Updates model parameters synchronously by waiting for all GPUs to finish\n    processing a batch of data.\n\nThe training procedure is encapsulated by this diagram of how operations and\nvariables are placed on the CPU and GPUs, respectively.\n\n<div style=\"width:40%; margin:auto; margin-bottom:10px; margin-top:20px;\">\n  <img style=\"width:100%\" src=\"https://www.tensorflow.org/images/Parallelism.png\">\n</div>\n\nEach tower computes the gradients for a portion of the batch and the gradients\nare combined and averaged across the multiple towers in order to provide a\nsingle update of the Variables stored on the CPU.\n\nA crucial aspect of training a network of this size is *training speed* in terms\nof wall-clock time. The training speed is dictated by many factors -- most\nimportantly the batch size and the learning rate schedule. Both of these\nparameters are heavily coupled to the hardware setup.\n\nGenerally speaking, batch size is a difficult parameter to tune as it requires\nbalancing the memory demands of the model, the memory available on the GPU and\nthe speed of computation. Employing larger batch sizes leads to more\nefficient computation and potentially more efficient training steps.\n\nWe have tested several hardware setups for training this model from scratch but\nwe emphasize that depending on your hardware setup, you may need to adapt the\nbatch size and learning rate schedule.\n\nPlease see the comments in `inception_train.py` for a few learning rate\nplans based on selected hardware setups.\n\nTo train this model, you simply need to specify the following:\n\n```shell\n# Build the model. 
Note that we need to make sure that TensorFlow is ready to\n# use before this, as this command will not build TensorFlow.\nbazel build inception/imagenet_train\n\n# run it\nbazel-bin/inception/imagenet_train --num_gpus=1 --batch_size=32 --train_dir=/tmp/imagenet_train --data_dir=/tmp/imagenet_data\n```\n\nThe model reads in the ImageNet training data from `--data_dir`. If you followed\nthe instructions in [Getting Started](#getting-started), then set\n`--data_dir=\"${DATA_DIR}\"`. The script assumes that there exists a set of\nsharded TFRecord files containing the ImageNet data. If you have not created\nTFRecord files, please refer to [Getting Started](#getting-started).\n\nHere is the output of the above command line when running on a Tesla K40c:\n\n```shell\n2016-03-07 12:24:59.922898: step 0, loss = 13.11 (5.3 examples/sec; 6.064 sec/batch)\n2016-03-07 12:25:55.206783: step 10, loss = 13.71 (9.4 examples/sec; 3.394 sec/batch)\n2016-03-07 12:26:28.905231: step 20, loss = 14.81 (9.5 examples/sec; 3.380 sec/batch)\n2016-03-07 12:27:02.699719: step 30, loss = 14.45 (9.5 examples/sec; 3.378 sec/batch)\n2016-03-07 12:27:36.515699: step 40, loss = 13.98 (9.5 examples/sec; 3.376 sec/batch)\n2016-03-07 12:28:10.220956: step 50, loss = 13.92 (9.6 examples/sec; 3.327 sec/batch)\n2016-03-07 12:28:43.658223: step 60, loss = 13.28 (9.6 examples/sec; 3.350 sec/batch)\n...\n```\n\nIn this example, a log entry is printed every 10 steps and the line includes the\ntotal loss (starts around 13.0-14.0) and the speed of processing in terms of\nthroughput (examples/sec) and batch speed (sec/batch).\n\nThe number of GPU devices is specified by `--num_gpus` (which defaults to 1).\nSpecifying `--num_gpus` greater than 1 splits the batch evenly across the\nGPU cards.\n\n```shell\n# Build the model. 
Note that we need to make sure that TensorFlow is ready to\n# use before this, as this command will not build TensorFlow.\nbazel build inception/imagenet_train\n\n# run it\nbazel-bin/inception/imagenet_train --num_gpus=2 --batch_size=64 --train_dir=/tmp/imagenet_train\n```\n\nThis model splits the batch of 64 images across 2 GPUs and calculates the\naverage gradient by waiting for both GPUs to finish calculating the gradients\nfrom their respective data (see diagram above). Generally speaking, using larger\nnumbers of GPUs leads to higher throughput as well as the opportunity to use\nlarger batch sizes. In turn, larger batch sizes imply better estimates of the\ngradient, enabling the usage of higher learning rates. In summary, using more\nGPUs simply results in faster training.\n\nAs noted above, selecting a batch size requires balancing the memory demands of\nthe model, the memory available on the GPU and the speed of computation.\n\nNote that there is considerable noise in the loss function on individual steps\nin the previous log. Because of this noise, it is difficult to discern how well\na model is learning. The solution to this problem is to launch TensorBoard\npointing to the directory containing the events log.\n\n```shell\ntensorboard --logdir=/tmp/imagenet_train\n```\n\nTensorBoard has access to the many summaries produced by the model that describe\nmultitudes of statistics tracking the model behavior and the quality of the\nlearned model. In particular, TensorBoard tracks an exponentially smoothed\nversion of the loss. 
In practice, it is far easier to judge how well a model\nlearns by monitoring the smoothed version of the loss.\n\n## How to Train from Scratch in a Distributed Setting\n\n**NOTE** Distributed TensorFlow requires version 0.8 or later.\n\nDistributed TensorFlow lets us use multiple machines to train a model faster.\nThis is quite different from training with multiple GPU towers on a single\nmachine, where all parameter and gradient computation happens in the same place. We\ncoordinate the computation across multiple machines by employing a centralized\nrepository for parameters that maintains a unified, single copy of model\nparameters. Each individual machine sends gradient updates to the centralized\nparameter repository, which coordinates these updates and sends back updated\nparameters to the individual machines running the model training.\n\nWe term each machine that runs a copy of the training a `worker` or `replica`.\nWe term each machine that maintains model parameters a `ps`, short for\n`parameter server`. Note that we might have more than one machine acting as a\n`ps` as the model parameters may be sharded across multiple machines.\n\nVariables may be updated with synchronous or asynchronous gradient updates. 
One\nmay construct an\n[`Optimizer`](https://www.tensorflow.org/api_docs/python/train.html#optimizers) in TensorFlow\nthat constructs the necessary graph for either case, as diagrammed below (from the\nTensorFlow [whitepaper](http://download.tensorflow.org/paper/whitepaper2015.pdf)):\n\n<div style=\"width:40%; margin:auto; margin-bottom:10px; margin-top:20px;\">\n  <img style=\"width:100%\"\n  src=\"https://www.tensorflow.org/images/tensorflow_figure7.png\">\n</div>\n\nIn [a recent paper](https://arxiv.org/abs/1604.00981), synchronous gradient\nupdates have been demonstrated to reach higher accuracy in a shorter amount of time.\nIn this distributed Inception example we employ synchronous gradient updates.\n\nNote that in this example each replica has a single tower that uses one GPU.\n\nThe command-line flags `worker_hosts` and `ps_hosts` specify available servers.\nThe same binary will be used for both the `worker` jobs and the `ps` jobs.\nThe command-line flag `job_name` specifies what role a task will be\nplaying and `task_id` identifies which one of the jobs it is\nrunning. Several things to note here:\n\n*   The numbers of `ps` and `worker` tasks are inferred from the lists of hosts\n    specified in the flags. The `task_id` should be within the range `[0,\n    num_ps_tasks)` for `ps` tasks and `[0, num_worker_tasks)` for `worker`\n    tasks.\n*   `ps` and `worker` tasks can run on the same machine, as long as that machine\n    has sufficient resources to handle both tasks. 
Note that the `ps` task does\n    not benefit from a GPU, so it should not attempt to use one (see below).\n*   Multiple `worker` tasks can run on the same machine with multiple GPUs, so\n    machine_A with 2 GPUs may have 2 workers while machine_B with 1 GPU just has\n    1 worker.\n*   The default learning rate schedule works well for a wide range of replica\n    counts (25, 50, 100) but feel free to tune it for even better results.\n*   The command line of both `ps` and `worker` tasks should include the complete\n    list of `ps_hosts` and `worker_hosts`.\n*   There is a chief `worker` among all workers, which defaults to `worker` 0.\n    The chief will be in charge of initializing all the parameters, writing out\n    the summaries and the checkpoint. The checkpoint and summary will be in the\n    `train_dir` of the host for `worker` 0.\n*   Each worker processes `batch_size` examples but each gradient\n    update is computed from all replicas. Hence, the effective batch size of\n    this model is `batch_size * num_workers`.\n\n```shell\n# Build the model. 
Note that we need to make sure that TensorFlow is ready to\n# use before this, as this command will not build TensorFlow.\nbazel build inception/imagenet_distributed_train\n\n# To start worker 0, go to the worker0 host and run the following (note that\n# task_id should be in the range [0, num_worker_tasks)):\nbazel-bin/inception/imagenet_distributed_train \\\n--batch_size=32 \\\n--data_dir=$HOME/imagenet-data \\\n--job_name='worker' \\\n--task_id=0 \\\n--ps_hosts='ps0.example.com:2222' \\\n--worker_hosts='worker0.example.com:2222,worker1.example.com:2222'\n\n# To start worker 1, go to the worker1 host and run the following (note that\n# task_id should be in the range [0, num_worker_tasks)):\nbazel-bin/inception/imagenet_distributed_train \\\n--batch_size=32 \\\n--data_dir=$HOME/imagenet-data \\\n--job_name='worker' \\\n--task_id=1 \\\n--ps_hosts='ps0.example.com:2222' \\\n--worker_hosts='worker0.example.com:2222,worker1.example.com:2222'\n\n# To start the parameter server (ps), go to the ps host and run the following (note\n# that task_id should be in the range [0, num_ps_tasks)):\nbazel-bin/inception/imagenet_distributed_train \\\n--job_name='ps' \\\n--task_id=0 \\\n--ps_hosts='ps0.example.com:2222' \\\n--worker_hosts='worker0.example.com:2222,worker1.example.com:2222'\n```\n\nIf you have installed a GPU-compatible version of TensorFlow, the `ps` will also\ntry to allocate GPU memory although it is not helpful. This could potentially\ncrash a worker on the same machine by leaving it little to no GPU memory to\nallocate. To avoid this, you can prefix the previous `ps` command\nwith `CUDA_VISIBLE_DEVICES=''`:\n\n```shell\nCUDA_VISIBLE_DEVICES='' bazel-bin/inception/imagenet_distributed_train \\\n--job_name='ps' \\\n--task_id=0 \\\n--ps_hosts='ps0.example.com:2222' \\\n--worker_hosts='worker0.example.com:2222,worker1.example.com:2222'\n```\n\nIf you have run everything correctly, you should see a log in each `worker` job\nthat looks like the following. 
Note the training speed varies depending on your\nhardware and the first several steps could take much longer.\n\n```shell\nINFO:tensorflow:PS hosts are: ['ps0.example.com:2222', 'ps1.example.com:2222']\nINFO:tensorflow:Worker hosts are: ['worker0.example.com:2222', 'worker1.example.com:2222']\nI tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:206] Initialize HostPortsGrpcChannelCache for job ps -> {ps0.example.com:2222, ps1.example.com:2222}\nI tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:206] Initialize HostPortsGrpcChannelCache for job worker -> {localhost:2222, worker1.example.com:2222}\nI tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:202] Started server with target: grpc://localhost:2222\nINFO:tensorflow:Created variable global_step:0 with shape () and init <function zeros_initializer at 0x7f6aa014b140>\n\n...\n\nINFO:tensorflow:Created variable logits/logits/biases:0 with shape (1001,) and init <function _initializer at 0x7f6a77f3cf50>\nINFO:tensorflow:SyncReplicas enabled: replicas_to_aggregate=2; total_num_replicas=2\nINFO:tensorflow:2016-04-13 01:56:26.405639 Supervisor\nINFO:tensorflow:Started 2 queues for processing input data.\nINFO:tensorflow:global_step/sec: 0\nINFO:tensorflow:Worker 0: 2016-04-13 01:58:40.342404: step 0, loss = 12.97(0.0 examples/sec; 65.428  sec/batch)\nINFO:tensorflow:global_step/sec: 0.0172907\n...\n```\n\nand a log in each `ps` job that looks like the following:\n\n```shell\nINFO:tensorflow:PS hosts are: ['ps0.example.com:2222', 'ps1.example.com:2222']\nINFO:tensorflow:Worker hosts are: ['worker0.example.com:2222', 'worker1.example.com:2222']\nI tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:206] Initialize HostPortsGrpcChannelCache for job ps -> {localhost:2222, ps1.example.com:2222}\nI tensorflow/core/distributed_runtime/rpc/grpc_channel.cc:206] Initialize HostPortsGrpcChannelCache for job worker -> {worker0.example.com:2222, worker1.example.com:2222}\nI 
tensorflow/core/distributed_runtime/rpc/grpc_server_lib.cc:202] Started server with target: grpc://localhost:2222\n```\n\n[Congratulations!](https://www.youtube.com/watch?v=9bZkp7q19f0) You are now\ntraining Inception in a distributed manner.\n\n## How to Evaluate\n\nEvaluating an Inception v3 model on the ImageNet 2012 validation data set\nrequires running a separate binary.\n\nThe evaluation procedure is nearly identical to\n[Evaluating a Model](https://www.tensorflow.org/tutorials/deep_cnn/index.html#evaluating-a-model)\ndescribed in the\n[Convolutional Neural Networks](https://www.tensorflow.org/tutorials/deep_cnn/index.html) tutorial.\n\n**WARNING** Be careful not to run the evaluation and training binary on the same\nGPU or else you might run out of memory. Consider running the evaluation on a\nseparate GPU if available or suspending the training binary while running the\nevaluation on the same GPU.\n\nBriefly, one can evaluate the model by running:\n\n```shell\n# Build the model. Note that we need to make sure that TensorFlow is ready to\n# use before this, as this command will not build TensorFlow.\nbazel build inception/imagenet_eval\n\n# run it\nbazel-bin/inception/imagenet_eval --checkpoint_dir=/tmp/imagenet_train --eval_dir=/tmp/imagenet_eval\n```\n\nNote that we point `--checkpoint_dir` to the location of the checkpoints saved\nby `inception_train.py` above. Running the above command results in the\nfollowing output:\n\n```shell\n2016-02-17 22:32:50.391206: precision @ 1 = 0.735\n...\n```\n\nThe script calculates the precision @ 1 over the entire validation data\nperiodically. The precision @ 1 measures how often the highest-scoring\nprediction from the model matched the ImageNet label -- in this case, 73.5%. If\nyou wish to run the eval just once and not periodically, append the `--run_once`\noption.\n\nMuch like the training script, `imagenet_eval.py` also exports summaries that\nmay be visualized in TensorBoard. 
These summaries calculate additional\nstatistics on the predictions (e.g. recall @ 5) as well as monitor the\nstatistics of the model activations and weights during evaluation.\n\n## How to Fine-Tune a Pre-Trained Model on a New Task\n\n### Getting Started\n\nMuch like training the ImageNet model, we must first convert a new data set to\nthe sharded TFRecord format in which each entry is a serialized `tf.Example` proto.\n\nWe have provided a script demonstrating how to do this for a small data set of\na few thousand flower images spread across 5 labels:\n\n```shell\ndaisy, dandelion, roses, sunflowers, tulips\n```\n\nThere is a single automated script that downloads the data set and converts it\nto the TFRecord format. Much like the ImageNet data set, each record in the\nTFRecord format is a serialized `tf.Example` proto whose entries include a\nJPEG-encoded string and an integer label. Please see\n[`parse_example_proto`](inception/image_processing.py) for details.\n\nThe script just takes a few minutes to run depending on your network connection\nspeed for downloading and processing the images. You will need 200MB\nof free disk space. Here we select `FLOWERS_DATA_DIR=$HOME/flowers-data` as such a location\nbut feel free to edit accordingly.\n\n```shell\n# location of where to place the flowers data\nFLOWERS_DATA_DIR=$HOME/flowers-data\n\n# build the preprocessing script.\nbazel build inception/download_and_preprocess_flowers\n\n# run it\nbazel-bin/inception/download_and_preprocess_flowers \"${FLOWERS_DATA_DIR}\"\n```\n\nIf the script runs successfully, the final line of the terminal output should\nlook like:\n\n```shell\n2016-02-24 20:42:25.067551: Finished writing all 3170 images in data set.\n```\n\nWhen the script finishes you will find 2 shards each for the training and validation\nfiles in the `FLOWERS_DATA_DIR`. 
The files will match the patterns `train-?????-of-00002`\nand `validation-?????-of-00002`, respectively.\n\n**NOTE** If you wish to prepare a custom image data set for transfer learning,\nyou will need to invoke [`build_image_data.py`](inception/data/build_image_data.py) on\nyour custom data set. Please see the associated options and assumptions behind\nthis script by reading the comments section of\n[`build_image_data.py`](inception/data/build_image_data.py). Also, if your custom data has a different\nnumber of examples or classes, you need to change the appropriate values in\n[`imagenet_data.py`](inception/imagenet_data.py).\n\nThe second piece you will need is a trained Inception v3 image model. You have\nthe option of either training one yourself (see\n[How to Train from Scratch](#how-to-train-from-scratch) for details) or you can download a pre-trained\nmodel like so:\n\n```shell\n# location of where to place the Inception v3 model\nDATA_DIR=$HOME/inception-v3-model\ncd ${DATA_DIR}\n\n# download the Inception v3 model\ncurl -O http://download.tensorflow.org/models/image/imagenet/inception-v3-2016-03-01.tar.gz\ntar xzf inception-v3-2016-03-01.tar.gz\n\n# this will create a directory called inception-v3 which contains the following files.\n> ls inception-v3\nREADME.txt\ncheckpoint\nmodel.ckpt-157585\n```\n\n[Congratulations!](https://www.youtube.com/watch?v=9bZkp7q19f0) You are now\nready to fine-tune your pre-trained Inception v3 model with the flower data set.\n\n### How to Retrain a Trained Model on the Flowers Data\n\nWe are now ready to fine-tune a pre-trained Inception-v3 model on the flowers\ndata set. This requires two distinct changes to our training procedure:\n\n1.  Build the exact same model as before, except that we change the number of\n    labels in the final classification layer.\n\n2.  
Restore all weights from the pre-trained Inception-v3 except for the final\n    classification layer; this will get randomly initialized instead.\n\nWe can perform these two operations by specifying two flags:\n`--pretrained_model_checkpoint_path` and `--fine_tune`. The first flag is a\nstring that points to the path of a pre-trained Inception-v3 model. If this flag\nis specified, it will load the entire model from the checkpoint before the\nscript begins training.\n\nThe second flag `--fine_tune` is a boolean that indicates whether the last\nclassification layer should be randomly initialized or restored. You may set\nthis flag to false if you wish to continue training a pre-trained model from a\ncheckpoint. If you set this flag to true, you can train a new classification\nlayer from scratch.\n\nIn order to understand how `--fine_tune` works, please see the discussion on\n`Variables` in the TensorFlow-Slim [`README.md`](inception/slim/README.md).\n\nPutting this all together you can retrain a pre-trained Inception-v3 model on\nthe flowers data set with the following command.\n\n```shell\n# Build the model. 
Note that we need to make sure that TensorFlow is ready to\n# use before this, as this command will not build TensorFlow.\nbazel build inception/flowers_train\n\n# Path to the downloaded Inception-v3 model.\nMODEL_PATH=\"${INCEPTION_MODEL_DIR}/model.ckpt-157585\"\n\n# Directory where the flowers data resides.\nFLOWERS_DATA_DIR=/tmp/flowers-data/\n\n# Directory where to save the checkpoint and events files.\nTRAIN_DIR=/tmp/flowers_train/\n\n# Run the fine-tuning on the flowers data set starting from the pre-trained\n# Inception-v3 model.\nbazel-bin/inception/flowers_train \\\n  --train_dir=\"${TRAIN_DIR}\" \\\n  --data_dir=\"${FLOWERS_DATA_DIR}\" \\\n  --pretrained_model_checkpoint_path=\"${MODEL_PATH}\" \\\n  --fine_tune=True \\\n  --initial_learning_rate=0.001 \\\n  --input_queue_memory_factor=1\n```\n\nWe have added a few extra options to the training procedure.\n\n*   Fine-tuning a model on a separate data set requires significantly lowering the\n    initial learning rate. We set the initial learning rate to 0.001.\n*   The flowers data set is quite small so we shrink the size of the shuffling\n    queue of examples. See [Adjusting Memory Demands](#adjusting-memory-demands)\n    for more details.\n\nThe training script only reports the loss. To evaluate the quality of the\nfine-tuned model, you will need to run `flowers_eval`:\n\n```shell\n# Build the model. 
Note that we need to make sure that TensorFlow is ready to\n# use before this, as this command will not build TensorFlow.\nbazel build inception/flowers_eval\n\n# Directory where we saved the fine-tuned checkpoint and events files.\nTRAIN_DIR=/tmp/flowers_train/\n\n# Directory where the flowers data resides.\nFLOWERS_DATA_DIR=/tmp/flowers-data/\n\n# Directory where to save the evaluation events files.\nEVAL_DIR=/tmp/flowers_eval/\n\n# Evaluate the fine-tuned model on a hold-out of the flower data set.\nbazel-bin/inception/flowers_eval \\\n  --eval_dir=\"${EVAL_DIR}\" \\\n  --data_dir=\"${FLOWERS_DATA_DIR}\" \\\n  --subset=validation \\\n  --num_examples=500 \\\n  --checkpoint_dir=\"${TRAIN_DIR}\" \\\n  --input_queue_memory_factor=1 \\\n  --run_once\n```\n\nWe find that the evaluation arrives at roughly 93.4% precision@1 after the model\nhas been running for 2000 steps.\n\n```shell\nSuccesfully loaded model from /tmp/flowers/model.ckpt-1999 at step=1999.\n2016-03-01 16:52:51.761219: starting evaluation on (validation).\n2016-03-01 16:53:05.450419: [20 batches out of 20] (36.5 examples/sec; 0.684sec/batch)\n2016-03-01 16:53:05.450471: precision @ 1 = 0.9340 recall @ 5 = 0.9960 [500 examples]\n```\n\n## How to Construct a New Dataset for Retraining\n\nOne can use the existing scripts supplied with this model to build a new dataset\nfor training or fine-tuning. The main script to employ is\n[`build_image_data.py`](inception/data/build_image_data.py). 
Briefly, this script takes a\nstructured directory of images and converts it to a sharded `TFRecord` that can\nbe read by the Inception model.\n\nIn particular, you will need to create a directory of training images that\nreside within `$TRAIN_DIR` and `$VALIDATION_DIR` arranged as follows:\n\n```shell\n  $TRAIN_DIR/dog/image0.jpeg\n  $TRAIN_DIR/dog/image1.jpg\n  $TRAIN_DIR/dog/image2.png\n  ...\n  $TRAIN_DIR/cat/weird-image.jpeg\n  $TRAIN_DIR/cat/my-image.jpeg\n  $TRAIN_DIR/cat/my-image.JPG\n  ...\n  $VALIDATION_DIR/dog/imageA.jpeg\n  $VALIDATION_DIR/dog/imageB.jpg\n  $VALIDATION_DIR/dog/imageC.png\n  ...\n  $VALIDATION_DIR/cat/weird-image.PNG\n  $VALIDATION_DIR/cat/that-image.jpg\n  $VALIDATION_DIR/cat/cat.JPG\n  ...\n```\n\nEach sub-directory in `$TRAIN_DIR` and `$VALIDATION_DIR` corresponds to a unique\nlabel for the images that reside within that sub-directory. The images may be\nJPEG or PNG images. We do not support other image types currently.\n\nOnce the data is arranged in this directory structure, we can run\n`build_image_data.py` on the data to generate the sharded `TFRecord` dataset.\nEach entry of the `TFRecord` is a serialized `tf.Example` protocol buffer. A\ncomplete list of information contained in the `tf.Example` is described in the\ncomments of `build_image_data.py`.\n\nYou can run `build_image_data.py` with the following command line:\n\n```shell\n# location where to save the TFRecord data.\nOUTPUT_DIRECTORY=$HOME/my-custom-data/\n\n# build the preprocessing script.\nbazel build inception/build_image_data\n\n# convert the data.\nbazel-bin/inception/build_image_data \\\n  --train_directory=\"${TRAIN_DIR}\" \\\n  --validation_directory=\"${VALIDATION_DIR}\" \\\n  --output_directory=\"${OUTPUT_DIRECTORY}\" \\\n  --labels_file=\"${LABELS_FILE}\" \\\n  --train_shards=128 \\\n  --validation_shards=24 \\\n  --num_threads=8\n```\n\nwhere `$OUTPUT_DIRECTORY` is the location of the sharded `TFRecords`. 
The\n`$LABELS_FILE` is a text file read by the script that provides\na list of all of the labels. For instance, in the case of the flowers data set, the\n`$LABELS_FILE` contained the following data:\n\n```shell\ndaisy\ndandelion\nroses\nsunflowers\ntulips\n```\n\nNote that each row corresponds to an entry in the final\nclassifier in the model. That is, `daisy` corresponds to the classifier for\nentry `1`; `dandelion` is entry `2`, etc. We skip label `0` as a background\nclass.\n\nRunning this script produces files that look like the following:\n\n```shell\n  $OUTPUT_DIRECTORY/train-00000-of-00128\n  $OUTPUT_DIRECTORY/train-00001-of-00128\n  ...\n  $OUTPUT_DIRECTORY/train-00127-of-00128\n\nand\n\n  $OUTPUT_DIRECTORY/validation-00000-of-00024\n  $OUTPUT_DIRECTORY/validation-00001-of-00024\n  ...\n  $OUTPUT_DIRECTORY/validation-00023-of-00024\n```\n\nwhere 128 and 24 are the number of shards specified for each dataset,\nrespectively. Generally speaking, we aim to select the number of shards such\nthat roughly 1024 images reside in each shard. Once this data set is built, you\nare ready to train or fine-tune an Inception model on this data set.\n\nNote, if you are piggybacking on the flowers retraining scripts, be sure to\nupdate `num_classes()` and `num_examples_per_epoch()` in `flowers_data.py`\nto correspond with your data.\n\n## Practical Considerations for Training a Model\n\nThe model architecture and training procedure is heavily dependent on the\nhardware used to train the model. If you wish to train or fine-tune this model\non your machine **you will need to adjust and empirically determine a good set\nof training hyper-parameters for your setup**. What follows are some general\nconsiderations for novices.\n\n### Finding Good Hyperparameters\n\nRoughly 5-10 hyper-parameters govern the speed at which a network is trained. 
In\naddition to `--batch_size` and `--num_gpus`, there are several constants defined\nin [inception_train.py](inception/inception_train.py) which dictate the learning\nschedule.\n\n```python\nRMSPROP_DECAY = 0.9                # Decay term for RMSProp.\nMOMENTUM = 0.9                     # Momentum in RMSProp.\nRMSPROP_EPSILON = 1.0              # Epsilon term for RMSProp.\nINITIAL_LEARNING_RATE = 0.1        # Initial learning rate.\nNUM_EPOCHS_PER_DECAY = 30.0        # Epochs after which learning rate decays.\nLEARNING_RATE_DECAY_FACTOR = 0.16  # Learning rate decay factor.\n```\n\nThere are many papers that discuss the various tricks and trade-offs associated\nwith training a model with stochastic gradient descent. For those new to the\nfield, some great references are:\n\n*   Y Bengio, [Practical recommendations for gradient-based training of deep\n    architectures](http://arxiv.org/abs/1206.5533)\n*   I Goodfellow, Y Bengio and A Courville,\n    [Deep Learning](http://www.deeplearningbook.org/)\n\nWhat follows is a summary of some general advice for identifying appropriate\nmodel hyper-parameters in the context of this particular model training setup.\nNamely, this library provides *synchronous* updates to model parameters based on\nbatch-splitting the model across multiple GPUs.\n\n*   Higher learning rates lead to faster training. Too high a learning rate\n    leads to instability and will cause model parameters to diverge to infinity\n    or NaN.\n\n*   Larger batch sizes lead to higher quality estimates of the gradient and\n    permit training the model with higher learning rates.\n\n*   Often the GPU memory is a bottleneck that prevents employing larger batch\n    sizes. 
Employing more GPUs allows one to use larger batch sizes because\n    this model splits the batch across the GPUs.\n\n**NOTE** If one wishes to train this model with *asynchronous* gradient updates,\none will need to substantially alter this model and new considerations need to\nbe factored into hyperparameter tuning. See [Large Scale Distributed Deep\nNetworks](http://research.google.com/archive/large_deep_networks_nips2012.html)\nfor a discussion in this domain.\n\n### Adjusting Memory Demands\n\nTraining this model has large memory demands in terms of the CPU and GPU. Let's\ndiscuss each item in turn.\n\nGPU memory is relatively small compared to CPU memory. Two items dictate the\namount of GPU memory employed -- model architecture and batch size. Assuming\nthat you keep the model architecture fixed, the sole parameter governing the GPU\ndemand is the batch size. A good rule of thumb is to try to employ as large a\nbatch size as will fit on the GPU.\n\nIf you run out of GPU memory, either lower the `--batch_size` or employ more\nGPUs on your desktop. The model performs batch-splitting across GPUs, thus N\nGPUs can handle N times the batch size of 1 GPU.\n\nThe model requires a large amount of CPU memory as well. We have tuned the model\nto employ about 20GB of CPU memory. Thus, having access to about 40 GB of CPU\nmemory would be ideal.\n\nIf that is not possible, you can tune down the memory demands of the model by\nlowering `--input_queue_memory_factor`. Images are preprocessed asynchronously\nwith respect to the main training across `--num_preprocess_threads` threads. The\npreprocessed images are stored in a shuffling queue from which each GPU performs a\ndequeue operation in order to receive a `batch_size` worth of images.\n\nIn order to guarantee good shuffling across the data, we maintain a large\nshuffling queue of 1024 x `input_queue_memory_factor` images. For the current\nmodel architecture, this corresponds to about 4GB of CPU memory. 
You may lower\n`input_queue_memory_factor` in order to decrease the memory footprint. Keep in\nmind though that lowering this value drastically may result in a model with\nslightly lower predictive accuracy when training from scratch. Please see\ncomments in [`image_processing.py`](inception/image_processing.py) for more details.\n\n## Troubleshooting\n\n#### The model runs out of CPU memory.\n\nIn lieu of buying more CPU memory, an easy fix is to decrease\n`--input_queue_memory_factor`. See\n[Adjusting Memory Demands](#adjusting-memory-demands).\n\n#### The model runs out of GPU memory.\n\nThe data is not able to fit on the GPU card. The simplest solution is to\ndecrease the batch size of the model. Otherwise, you will need to think about a\nmore sophisticated method for specifying the training which cuts up the model\nacross multiple `session.run()` calls or partitions the model across multiple\nGPUs. See [Using GPUs](https://www.tensorflow.org/how_tos/using_gpu/index.html)\nand [Adjusting Memory Demands](#adjusting-memory-demands) for more information.\n\n#### The model training results in NaNs.\n\nThe learning rate of the model is too high. Turn down your learning rate.\n\n#### I wish to train a model with a different image size.\n\nThe simplest solution is to artificially resize your images to `299x299` pixels.\nSee the [Images](https://www.tensorflow.org/api_docs/python/image.html) section for\nmany resizing, cropping and padding methods. Note that the entire model\narchitecture is predicated on a `299x299` image, thus if you wish to change the\ninput image size, then you may need to redesign the entire model architecture.\n\n#### What hardware specifications are these hyper-parameters targeted for?\n\nWe targeted a desktop with 128GB of CPU RAM connected to 8 NVIDIA Tesla K40 GPU\ncards but we have run this on desktops with 32GB of CPU RAM and 1 NVIDIA Tesla\nK40. 
You can get a sense of the various training configurations we tested by\nreading the comments in [`inception_train.py`](inception/inception_train.py).\n\n#### How do I continue training from a checkpoint in a distributed setting?\n\nYou only need to make sure that the checkpoint is in a location that can be\nreached by all of the `ps` tasks. By specifying the checkpoint location with\n`--train_dir`, the `ps` servers will load the checkpoint before commencing\ntraining.\n"
  },
  {
    "path": "model_zoo/models/inception/WORKSPACE",
    "content": "workspace(name = \"inception\")\n"
  },
  {
    "path": "model_zoo/models/inception/inception/BUILD",
    "content": "# Description:\n# Example TensorFlow models for ImageNet.\n\npackage(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\"//inception/...\"],\n)\n\npy_library(\n    name = \"dataset\",\n    srcs = [\n        \"dataset.py\",\n    ],\n)\n\npy_library(\n    name = \"imagenet_data\",\n    srcs = [\n        \"imagenet_data.py\",\n    ],\n    deps = [\n        \":dataset\",\n    ],\n)\n\npy_library(\n    name = \"flowers_data\",\n    srcs = [\n        \"flowers_data.py\",\n    ],\n    deps = [\n        \":dataset\",\n    ],\n)\n\npy_library(\n    name = \"image_processing\",\n    srcs = [\n        \"image_processing.py\",\n    ],\n)\n\npy_library(\n    name = \"inception\",\n    srcs = [\n        \"inception_model.py\",\n    ],\n    visibility = [\"//visibility:public\"],\n    deps = [\n        \":dataset\",\n        \"//inception/slim\",\n    ],\n)\n\npy_binary(\n    name = \"imagenet_eval\",\n    srcs = [\n        \"imagenet_eval.py\",\n    ],\n    deps = [\n        \":imagenet_data\",\n        \":inception_eval\",\n    ],\n)\n\npy_binary(\n    name = \"flowers_eval\",\n    srcs = [\n        \"flowers_eval.py\",\n    ],\n    deps = [\n        \":flowers_data\",\n        \":inception_eval\",\n    ],\n)\n\npy_library(\n    name = \"inception_eval\",\n    srcs = [\n        \"inception_eval.py\",\n    ],\n    deps = [\n        \":image_processing\",\n        \":inception\",\n    ],\n)\n\npy_binary(\n    name = \"imagenet_train\",\n    srcs = [\n        \"imagenet_train.py\",\n    ],\n    deps = [\n        \":imagenet_data\",\n        \":inception_train\",\n    ],\n)\n\npy_binary(\n    name = \"imagenet_distributed_train\",\n    srcs = [\n        \"imagenet_distributed_train.py\",\n    ],\n    deps = [\n        \":imagenet_data\",\n        \":inception_distributed_train\",\n    ],\n)\n\npy_binary(\n    name = \"flowers_train\",\n    srcs = 
[\n        \"flowers_train.py\",\n    ],\n    deps = [\n        \":flowers_data\",\n        \":inception_train\",\n    ],\n)\n\npy_library(\n    name = \"inception_train\",\n    srcs = [\n        \"inception_train.py\",\n    ],\n    deps = [\n        \":image_processing\",\n        \":inception\",\n    ],\n)\n\npy_library(\n    name = \"inception_distributed_train\",\n    srcs = [\n        \"inception_distributed_train.py\",\n    ],\n    deps = [\n        \":image_processing\",\n        \":inception\",\n    ],\n)\n\npy_binary(\n    name = \"build_image_data\",\n    srcs = [\"data/build_image_data.py\"],\n)\n\nsh_binary(\n    name = \"download_and_preprocess_flowers\",\n    srcs = [\"data/download_and_preprocess_flowers.sh\"],\n    data = [\n        \":build_image_data\",\n    ],\n)\n\nsh_binary(\n    name = \"download_and_preprocess_imagenet\",\n    srcs = [\"data/download_and_preprocess_imagenet.sh\"],\n    data = [\n        \"data/download_imagenet.sh\",\n        \"data/imagenet_2012_validation_synset_labels.txt\",\n        \"data/imagenet_lsvrc_2015_synsets.txt\",\n        \"data/imagenet_metadata.txt\",\n        \"data/preprocess_imagenet_validation_data.py\",\n        \"data/process_bounding_boxes.py\",\n        \":build_imagenet_data\",\n    ],\n)\n\npy_binary(\n    name = \"build_imagenet_data\",\n    srcs = [\"data/build_imagenet_data.py\"],\n)\n\nfilegroup(\n    name = \"srcs\",\n    srcs = glob(\n        [\n            \"**/*.py\",\n            \"BUILD\",\n        ],\n    ),\n)\n\nfilegroup(\n    name = \"imagenet_metadata\",\n    srcs = [\n        \"data/imagenet_lsvrc_2015_synsets.txt\",\n        \"data/imagenet_metadata.txt\",\n    ],\n    visibility = [\"//visibility:public\"],\n)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/build_image_data.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Converts image data to TFRecords file format with Example protos.\n\nThe image data set is expected to reside in JPEG files located in the\nfollowing directory structure.\n\n  data_dir/label_0/image0.jpeg\n  data_dir/label_0/image1.jpg\n  ...\n  data_dir/label_1/weird-image.jpeg\n  data_dir/label_1/my-image.jpeg\n  ...\n\nwhere the sub-directory is the unique label associated with these images.\n\nThis TensorFlow script converts the training and evaluation data into\na sharded data set consisting of TFRecord files\n\n  train_directory/train-00000-of-01024\n  train_directory/train-00001-of-01024\n  ...\n  train_directory/train-00127-of-01024\n\nand\n\n  validation_directory/validation-00000-of-00128\n  validation_directory/validation-00001-of-00128\n  ...\n  validation_directory/validation-00127-of-00128\n\nwhere we have selected 1024 and 128 shards for each data set. Each record\nwithin the TFRecord file is a serialized Example proto. 
The Example proto\ncontains the following fields:\n\n  image/encoded: string containing JPEG encoded image in RGB colorspace\n  image/height: integer, image height in pixels\n  image/width: integer, image width in pixels\n  image/colorspace: string, specifying the colorspace, always 'RGB'\n  image/channels: integer, specifying the number of channels, always 3\n  image/format: string, specifying the format, always 'JPEG'\n\n  image/filename: string containing the basename of the image file\n            e.g. 'n01440764_10026.JPEG' or 'ILSVRC2012_val_00000293.JPEG'\n  image/class/label: integer specifying the index in a classification layer.\n    The label ranges from [0, num_labels] where 0 is unused and left as\n    the background class.\n  image/class/text: string specifying the human-readable version of the label\n    e.g. 'dog'\n\nIf your data set involves bounding boxes, please look at build_imagenet_data.py.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom datetime import datetime\nimport os\nimport random\nimport sys\nimport threading\n\n\nimport numpy as np\nimport tensorflow as tf\n\ntf.app.flags.DEFINE_string('train_directory', '/tmp/',\n                           'Training data directory')\ntf.app.flags.DEFINE_string('validation_directory', '/tmp/',\n                           'Validation data directory')\ntf.app.flags.DEFINE_string('output_directory', '/tmp/',\n                           'Output data directory')\n\ntf.app.flags.DEFINE_integer('train_shards', 2,\n                            'Number of shards in training TFRecord files.')\ntf.app.flags.DEFINE_integer('validation_shards', 2,\n                            'Number of shards in validation TFRecord files.')\n\ntf.app.flags.DEFINE_integer('num_threads', 2,\n                            'Number of threads to preprocess the images.')\n\n# The labels file contains a list of valid labels.\n# Assumes that 
the file contains entries as such:\n#   dog\n#   cat\n#   flower\n# where each line corresponds to a label. We map each label contained in\n# the file to an integer corresponding to the line number starting from 0.\ntf.app.flags.DEFINE_string('labels_file', '', 'Labels file')\n\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef _int64_feature(value):\n  \"\"\"Wrapper for inserting int64 features into Example proto.\"\"\"\n  if not isinstance(value, list):\n    value = [value]\n  return tf.train.Feature(int64_list=tf.train.Int64List(value=value))\n\n\ndef _bytes_feature(value):\n  \"\"\"Wrapper for inserting bytes features into Example proto.\"\"\"\n  return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))\n\n\ndef _convert_to_example(filename, image_buffer, label, text, height, width):\n  \"\"\"Build an Example proto for an example.\n\n  Args:\n    filename: string, path to an image file, e.g., '/path/to/example.JPG'\n    image_buffer: string, JPEG encoding of RGB image\n    label: integer, identifier for the ground truth for the network\n    text: string, unique human-readable, e.g. 
'dog'\n    height: integer, image height in pixels\n    width: integer, image width in pixels\n  Returns:\n    Example proto\n  \"\"\"\n\n  colorspace = 'RGB'\n  channels = 3\n  image_format = 'JPEG'\n\n  example = tf.train.Example(features=tf.train.Features(feature={\n      'image/height': _int64_feature(height),\n      'image/width': _int64_feature(width),\n      'image/colorspace': _bytes_feature(colorspace),\n      'image/channels': _int64_feature(channels),\n      'image/class/label': _int64_feature(label),\n      'image/class/text': _bytes_feature(text),\n      'image/format': _bytes_feature(image_format),\n      'image/filename': _bytes_feature(os.path.basename(filename)),\n      'image/encoded': _bytes_feature(image_buffer)}))\n  return example\n\n\nclass ImageCoder(object):\n  \"\"\"Helper class that provides TensorFlow image coding utilities.\"\"\"\n\n  def __init__(self):\n    # Create a single Session to run all image coding calls.\n    self._sess = tf.Session()\n\n    # Initializes function that converts PNG to JPEG data.\n    self._png_data = tf.placeholder(dtype=tf.string)\n    image = tf.image.decode_png(self._png_data, channels=3)\n    self._png_to_jpeg = tf.image.encode_jpeg(image, format='rgb', quality=100)\n\n    # Initializes function that decodes RGB JPEG data.\n    self._decode_jpeg_data = tf.placeholder(dtype=tf.string)\n    self._decode_jpeg = tf.image.decode_jpeg(self._decode_jpeg_data, channels=3)\n\n  def png_to_jpeg(self, image_data):\n    return self._sess.run(self._png_to_jpeg,\n                          feed_dict={self._png_data: image_data})\n\n  def decode_jpeg(self, image_data):\n    image = self._sess.run(self._decode_jpeg,\n                           feed_dict={self._decode_jpeg_data: image_data})\n    assert len(image.shape) == 3\n    assert image.shape[2] == 3\n    return image\n\n\ndef _is_png(filename):\n  \"\"\"Determine if a file contains a PNG format image.\n\n  Args:\n    filename: string, path of the image file.\n\n  
Returns:\n    boolean indicating if the image is a PNG.\n  \"\"\"\n  return '.png' in filename\n\n\ndef _process_image(filename, coder):\n  \"\"\"Process a single image file.\n\n  Args:\n    filename: string, path to an image file e.g., '/path/to/example.JPG'.\n    coder: instance of ImageCoder to provide TensorFlow image coding utils.\n  Returns:\n    image_buffer: string, JPEG encoding of RGB image.\n    height: integer, image height in pixels.\n    width: integer, image width in pixels.\n  \"\"\"\n  # Read the image file.\n  with tf.gfile.FastGFile(filename, 'r') as f:\n    image_data = f.read()\n\n  # Convert any PNG to JPEG's for consistency.\n  if _is_png(filename):\n    print('Converting PNG to JPEG for %s' % filename)\n    image_data = coder.png_to_jpeg(image_data)\n\n  # Decode the RGB JPEG.\n  image = coder.decode_jpeg(image_data)\n\n  # Check that image converted to RGB\n  assert len(image.shape) == 3\n  height = image.shape[0]\n  width = image.shape[1]\n  assert image.shape[2] == 3\n\n  return image_data, height, width\n\n\ndef _process_image_files_batch(coder, thread_index, ranges, name, filenames,\n                               texts, labels, num_shards):\n  \"\"\"Processes and saves list of images as TFRecord in 1 thread.\n\n  Args:\n    coder: instance of ImageCoder to provide TensorFlow image coding utils.\n    thread_index: integer, unique batch to run index is within [0, len(ranges)).\n    ranges: list of pairs of integers specifying ranges of each batches to\n      analyze in parallel.\n    name: string, unique identifier specifying the data set\n    filenames: list of strings; each string is a path to an image file\n    texts: list of strings; each string is human readable, e.g. 
'dog'\n    labels: list of integer; each integer identifies the ground truth\n    num_shards: integer number of shards for this data set.\n  \"\"\"\n  # Each thread produces N shards where N = int(num_shards / num_threads).\n  # For instance, if num_shards = 128, and the num_threads = 2, then the first\n  # thread would produce shards [0, 64).\n  num_threads = len(ranges)\n  assert not num_shards % num_threads\n  num_shards_per_batch = int(num_shards / num_threads)\n\n  shard_ranges = np.linspace(ranges[thread_index][0],\n                             ranges[thread_index][1],\n                             num_shards_per_batch + 1).astype(int)\n  num_files_in_thread = ranges[thread_index][1] - ranges[thread_index][0]\n\n  counter = 0\n  for s in xrange(num_shards_per_batch):\n    # Generate a sharded version of the file name, e.g. 'train-00002-of-00010'\n    shard = thread_index * num_shards_per_batch + s\n    output_filename = '%s-%.5d-of-%.5d' % (name, shard, num_shards)\n    output_file = os.path.join(FLAGS.output_directory, output_filename)\n    writer = tf.python_io.TFRecordWriter(output_file)\n\n    shard_counter = 0\n    files_in_shard = np.arange(shard_ranges[s], shard_ranges[s + 1], dtype=int)\n    for i in files_in_shard:\n      filename = filenames[i]\n      label = labels[i]\n      text = texts[i]\n\n      image_buffer, height, width = _process_image(filename, coder)\n\n      example = _convert_to_example(filename, image_buffer, label,\n                                    text, height, width)\n      writer.write(example.SerializeToString())\n      shard_counter += 1\n      counter += 1\n\n      if not counter % 1000:\n        print('%s [thread %d]: Processed %d of %d images in thread batch.' 
%\n              (datetime.now(), thread_index, counter, num_files_in_thread))\n        sys.stdout.flush()\n\n    writer.close()\n    print('%s [thread %d]: Wrote %d images to %s' %\n          (datetime.now(), thread_index, shard_counter, output_file))\n    sys.stdout.flush()\n    shard_counter = 0\n  print('%s [thread %d]: Wrote %d images to %d shards.' %\n        (datetime.now(), thread_index, counter, num_files_in_thread))\n  sys.stdout.flush()\n\n\ndef _process_image_files(name, filenames, texts, labels, num_shards):\n  \"\"\"Process and save list of images as TFRecord of Example protos.\n\n  Args:\n    name: string, unique identifier specifying the data set\n    filenames: list of strings; each string is a path to an image file\n    texts: list of strings; each string is human readable, e.g. 'dog'\n    labels: list of integer; each integer identifies the ground truth\n    num_shards: integer number of shards for this data set.\n  \"\"\"\n  assert len(filenames) == len(texts)\n  assert len(filenames) == len(labels)\n\n  # Break all images into batches with a [ranges[i][0], ranges[i][1]].\n  spacing = np.linspace(0, len(filenames), FLAGS.num_threads + 1).astype(np.int)\n  ranges = []\n  for i in xrange(len(spacing) - 1):\n    ranges.append([spacing[i], spacing[i+1]])\n\n  # Launch a thread for each batch.\n  print('Launching %d threads for spacings: %s' % (FLAGS.num_threads, ranges))\n  sys.stdout.flush()\n\n  # Create a mechanism for monitoring when all threads are finished.\n  coord = tf.train.Coordinator()\n\n  # Create a generic TensorFlow-based utility for converting all image codings.\n  coder = ImageCoder()\n\n  threads = []\n  for thread_index in xrange(len(ranges)):\n    args = (coder, thread_index, ranges, name, filenames,\n            texts, labels, num_shards)\n    t = threading.Thread(target=_process_image_files_batch, args=args)\n    t.start()\n    threads.append(t)\n\n  # Wait for all the threads to terminate.\n  coord.join(threads)\n  print('%s: 
Finished writing all %d images in data set.' %\n        (datetime.now(), len(filenames)))\n  sys.stdout.flush()\n\n\ndef _find_image_files(data_dir, labels_file):\n  \"\"\"Build a list of all images files and labels in the data set.\n\n  Args:\n    data_dir: string, path to the root directory of images.\n\n      Assumes that the image data set resides in JPEG files located in\n      the following directory structure.\n\n        data_dir/dog/another-image.JPEG\n        data_dir/dog/my-image.jpg\n\n      where 'dog' is the label associated with these images.\n\n    labels_file: string, path to the labels file.\n\n      The list of valid labels are held in this file. Assumes that the file\n      contains entries as such:\n        dog\n        cat\n        flower\n      where each line corresponds to a label. We map each label contained in\n      the file to an integer starting with the integer 0 corresponding to the\n      label contained in the first line.\n\n  Returns:\n    filenames: list of strings; each string is a path to an image file.\n    texts: list of strings; each string is the class, e.g. 'dog'\n    labels: list of integer; each integer identifies the ground truth.\n  \"\"\"\n  print('Determining list of input files and labels from %s.' % data_dir)\n  unique_labels = [l.strip() for l in tf.gfile.FastGFile(\n      labels_file, 'r').readlines()]\n\n  labels = []\n  filenames = []\n  texts = []\n\n  # Leave label index 0 empty as a background class.\n  label_index = 1\n\n  # Construct the list of JPEG files and labels.\n  for text in unique_labels:\n    jpeg_file_path = '%s/%s/*' % (data_dir, text)\n    matching_files = tf.gfile.Glob(jpeg_file_path)\n\n    labels.extend([label_index] * len(matching_files))\n    texts.extend([text] * len(matching_files))\n    filenames.extend(matching_files)\n\n    if not label_index % 100:\n      print('Finished finding files in %d of %d classes.' 
% (\n          label_index, len(labels)))\n    label_index += 1\n\n  # Shuffle the ordering of all image files in order to guarantee\n  # random ordering of the images with respect to label in the\n  # saved TFRecord files. Make the randomization repeatable.\n  shuffled_index = range(len(filenames))\n  random.seed(12345)\n  random.shuffle(shuffled_index)\n\n  filenames = [filenames[i] for i in shuffled_index]\n  texts = [texts[i] for i in shuffled_index]\n  labels = [labels[i] for i in shuffled_index]\n\n  print('Found %d JPEG files across %d labels inside %s.' %\n        (len(filenames), len(unique_labels), data_dir))\n  return filenames, texts, labels\n\n\ndef _process_dataset(name, directory, num_shards, labels_file):\n  \"\"\"Process a complete data set and save it as a TFRecord.\n\n  Args:\n    name: string, unique identifier specifying the data set.\n    directory: string, root path to the data set.\n    num_shards: integer number of shards for this data set.\n    labels_file: string, path to the labels file.\n  \"\"\"\n  filenames, texts, labels = _find_image_files(directory, labels_file)\n  _process_image_files(name, filenames, texts, labels, num_shards)\n\n\ndef main(unused_argv):\n  assert not FLAGS.train_shards % FLAGS.num_threads, (\n      'Please make the FLAGS.num_threads commensurate with FLAGS.train_shards')\n  assert not FLAGS.validation_shards % FLAGS.num_threads, (\n      'Please make the FLAGS.num_threads commensurate with '\n      'FLAGS.validation_shards')\n  print('Saving results to %s' % FLAGS.output_directory)\n\n  # Run it!\n  _process_dataset('validation', FLAGS.validation_directory,\n                   FLAGS.validation_shards, FLAGS.labels_file)\n  _process_dataset('train', FLAGS.train_directory,\n                   FLAGS.train_shards, FLAGS.labels_file)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/build_imagenet_data.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Converts ImageNet data to TFRecords file format with Example protos.\n\nThe raw ImageNet data set is expected to reside in JPEG files located in the\nfollowing directory structure.\n\n  data_dir/n01440764/ILSVRC2012_val_00000293.JPEG\n  data_dir/n01440764/ILSVRC2012_val_00000543.JPEG\n  ...\n\nwhere 'n01440764' is the unique synset label associated with\nthese images.\n\nThe training data set consists of 1000 sub-directories (i.e. labels)\neach containing 1200 JPEG images for a total of 1.2M JPEG images.\n\nThe evaluation data set consists of 1000 sub-directories (i.e. labels)\neach containing 50 JPEG images for a total of 50K JPEG images.\n\nThis TensorFlow script converts the training and evaluation data into\na sharded data set consisting of 1024 and 128 TFRecord files, respectively.\n\n  train_directory/train-00000-of-01024\n  train_directory/train-00001-of-01024\n  ...\n  train_directory/train-00127-of-01024\n\nand\n\n  validation_directory/validation-00000-of-00128\n  validation_directory/validation-00001-of-00128\n  ...\n  validation_directory/validation-00127-of-00128\n\nEach validation TFRecord file contains ~390 records. Each training TFREcord\nfile contains ~1250 records. Each record within the TFRecord file is a\nserialized Example proto. 
The Example proto contains the following fields:\n\n  image/encoded: string containing JPEG encoded image in RGB colorspace\n  image/height: integer, image height in pixels\n  image/width: integer, image width in pixels\n  image/colorspace: string, specifying the colorspace, always 'RGB'\n  image/channels: integer, specifying the number of channels, always 3\n  image/format: string, specifying the format, always 'JPEG'\n\n  image/filename: string containing the basename of the image file\n            e.g. 'n01440764_10026.JPEG' or 'ILSVRC2012_val_00000293.JPEG'\n  image/class/label: integer specifying the index in a classification layer.\n    The label ranges from [1, 1000] where 0 is not used.\n  image/class/synset: string specifying the unique ID of the label,\n    e.g. 'n01440764'\n  image/class/text: string specifying the human-readable version of the label\n    e.g. 'red fox, Vulpes vulpes'\n\n  image/object/bbox/xmin: list of integers specifying the 0+ human annotated\n    bounding boxes\n  image/object/bbox/xmax: list of integers specifying the 0+ human annotated\n    bounding boxes\n  image/object/bbox/ymin: list of integers specifying the 0+ human annotated\n    bounding boxes\n  image/object/bbox/ymax: list of integers specifying the 0+ human annotated\n    bounding boxes\n  image/object/bbox/label: integer specifying the index in a classification\n    layer. The label ranges from [1, 1000] where 0 is not used. 
Note this is\n    always identical to the image label.\n\nNote that the length of xmin is identical to the length of xmax, ymin and ymax\nfor each example.\n\nRunning this script using 16 threads may take around 2.5 hours on an HP Z420.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom datetime import datetime\nimport os\nimport random\nimport sys\nimport threading\n\n\nimport numpy as np\nimport tensorflow as tf\n\ntf.app.flags.DEFINE_string('train_directory', '/tmp/',\n                           'Training data directory')\ntf.app.flags.DEFINE_string('validation_directory', '/tmp/',\n                           'Validation data directory')\ntf.app.flags.DEFINE_string('output_directory', '/tmp/',\n                           'Output data directory')\n\ntf.app.flags.DEFINE_integer('train_shards', 1024,\n                            'Number of shards in training TFRecord files.')\ntf.app.flags.DEFINE_integer('validation_shards', 128,\n                            'Number of shards in validation TFRecord files.')\n\ntf.app.flags.DEFINE_integer('num_threads', 8,\n                            'Number of threads to preprocess the images.')\n\n# The labels file contains a list of valid labels.\n# Assumes that the file contains entries as such:\n#   n01440764\n#   n01443537\n#   n01484850\n# where each line corresponds to a label expressed as a synset. We map\n# each synset contained in the file to an integer (based on the alphabetical\n# ordering). 
See below for details.\ntf.app.flags.DEFINE_string('labels_file',\n                           'imagenet_lsvrc_2015_synsets.txt',\n                           'Labels file')\n\n# This file contains the mapping from synset to human-readable label.\n# Assumes each line of the file looks like:\n#\n#   n02119247    black fox\n#   n02119359    silver fox\n#   n02119477    red fox, Vulpes fulva\n#\n# where each line corresponds to a unique mapping. Note that each line is\n# formatted as <synset>\\t<human readable label>.\ntf.app.flags.DEFINE_string('imagenet_metadata_file',\n                           'imagenet_metadata.txt',\n                           'ImageNet metadata file')\n\n# This file is the output of process_bounding_boxes.py\n# Assumes each line of the file looks like:\n#\n#   n00007846_64193.JPEG,0.0060,0.2620,0.7545,0.9940\n#\n# where each line corresponds to one bounding box annotation associated\n# with an image. Each line can be parsed as:\n#\n#   <JPEG file name>, <xmin>, <ymin>, <xmax>, <ymax>\n#\n# Note that there might exist multiple bounding box annotations associated\n# with an image file.\ntf.app.flags.DEFINE_string('bounding_box_file',\n                           './imagenet_2012_bounding_boxes.csv',\n                           'Bounding box file')\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef _int64_feature(value):\n  \"\"\"Wrapper for inserting int64 features into Example proto.\"\"\"\n  if not isinstance(value, list):\n    value = [value]\n  return tf.train.Feature(int64_list=tf.train.Int64List(value=value))\n\n\ndef _float_feature(value):\n  \"\"\"Wrapper for inserting float features into Example proto.\"\"\"\n  if not isinstance(value, list):\n    value = [value]\n  return tf.train.Feature(float_list=tf.train.FloatList(value=value))\n\n\ndef _bytes_feature(value):\n  \"\"\"Wrapper for inserting bytes features into Example proto.\"\"\"\n  return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))\n\n\ndef _convert_to_example(filename, 
image_buffer, label, synset, human, bbox,\n                        height, width):\n  \"\"\"Build an Example proto for an example.\n\n  Args:\n    filename: string, path to an image file, e.g., '/path/to/example.JPG'\n    image_buffer: string, JPEG encoding of RGB image\n    label: integer, identifier for the ground truth for the network\n    synset: string, unique WordNet ID specifying the label, e.g., 'n02323233'\n    human: string, human-readable label, e.g., 'red fox, Vulpes vulpes'\n    bbox: list of bounding boxes; each box is a list of integers\n      specifying [xmin, ymin, xmax, ymax]. All boxes are assumed to belong to\n      the same label as the image label.\n    height: integer, image height in pixels\n    width: integer, image width in pixels\n  Returns:\n    Example proto\n  \"\"\"\n  xmin = []\n  ymin = []\n  xmax = []\n  ymax = []\n  for b in bbox:\n    assert len(b) == 4\n    # pylint: disable=expression-not-assigned\n    [l.append(point) for l, point in zip([xmin, ymin, xmax, ymax], b)]\n    # pylint: enable=expression-not-assigned\n\n  colorspace = 'RGB'\n  channels = 3\n  image_format = 'JPEG'\n\n  example = tf.train.Example(features=tf.train.Features(feature={\n      'image/height': _int64_feature(height),\n      'image/width': _int64_feature(width),\n      'image/colorspace': _bytes_feature(colorspace),\n      'image/channels': _int64_feature(channels),\n      'image/class/label': _int64_feature(label),\n      'image/class/synset': _bytes_feature(synset),\n      'image/class/text': _bytes_feature(human),\n      'image/object/bbox/xmin': _float_feature(xmin),\n      'image/object/bbox/xmax': _float_feature(xmax),\n      'image/object/bbox/ymin': _float_feature(ymin),\n      'image/object/bbox/ymax': _float_feature(ymax),\n      'image/object/bbox/label': _int64_feature([label] * len(xmin)),\n      'image/format': _bytes_feature(image_format),\n      'image/filename': _bytes_feature(os.path.basename(filename)),\n      'image/encoded': 
_bytes_feature(image_buffer)}))\n  return example\n\n\nclass ImageCoder(object):\n  \"\"\"Helper class that provides TensorFlow image coding utilities.\"\"\"\n\n  def __init__(self):\n    # Create a single Session to run all image coding calls.\n    self._sess = tf.Session()\n\n    # Initializes function that converts PNG to JPEG data.\n    self._png_data = tf.placeholder(dtype=tf.string)\n    image = tf.image.decode_png(self._png_data, channels=3)\n    self._png_to_jpeg = tf.image.encode_jpeg(image, format='rgb', quality=100)\n\n    # Initializes function that converts CMYK JPEG data to RGB JPEG data.\n    self._cmyk_data = tf.placeholder(dtype=tf.string)\n    image = tf.image.decode_jpeg(self._cmyk_data, channels=0)\n    self._cmyk_to_rgb = tf.image.encode_jpeg(image, format='rgb', quality=100)\n\n    # Initializes function that decodes RGB JPEG data.\n    self._decode_jpeg_data = tf.placeholder(dtype=tf.string)\n    self._decode_jpeg = tf.image.decode_jpeg(self._decode_jpeg_data, channels=3)\n\n  def png_to_jpeg(self, image_data):\n    return self._sess.run(self._png_to_jpeg,\n                          feed_dict={self._png_data: image_data})\n\n  def cmyk_to_rgb(self, image_data):\n    return self._sess.run(self._cmyk_to_rgb,\n                          feed_dict={self._cmyk_data: image_data})\n\n  def decode_jpeg(self, image_data):\n    image = self._sess.run(self._decode_jpeg,\n                           feed_dict={self._decode_jpeg_data: image_data})\n    assert len(image.shape) == 3\n    assert image.shape[2] == 3\n    return image\n\n\ndef _is_png(filename):\n  \"\"\"Determine if a file contains a PNG format image.\n\n  Args:\n    filename: string, path of the image file.\n\n  Returns:\n    boolean indicating if the image is a PNG.\n  \"\"\"\n  # File list from:\n  # https://groups.google.com/forum/embed/?place=forum/torch7#!topic/torch7/fOSTXHIESSU\n  return 'n02105855_2933.JPEG' in filename\n\n\ndef _is_cmyk(filename):\n  \"\"\"Determine if file contains a 
CMYK JPEG format image.\n\n  Args:\n    filename: string, path of the image file.\n\n  Returns:\n    boolean indicating if the image is a JPEG encoded with CMYK color space.\n  \"\"\"\n  # File list from:\n  # https://github.com/cytsai/ilsvrc-cmyk-image-list\n  blacklist = ['n01739381_1309.JPEG', 'n02077923_14822.JPEG',\n               'n02447366_23489.JPEG', 'n02492035_15739.JPEG',\n               'n02747177_10752.JPEG', 'n03018349_4028.JPEG',\n               'n03062245_4620.JPEG', 'n03347037_9675.JPEG',\n               'n03467068_12171.JPEG', 'n03529860_11437.JPEG',\n               'n03544143_17228.JPEG', 'n03633091_5218.JPEG',\n               'n03710637_5125.JPEG', 'n03961711_5286.JPEG',\n               'n04033995_2932.JPEG', 'n04258138_17003.JPEG',\n               'n04264628_27969.JPEG', 'n04336792_7448.JPEG',\n               'n04371774_5854.JPEG', 'n04596742_4225.JPEG',\n               'n07583066_647.JPEG', 'n13037406_4650.JPEG']\n  return filename.split('/')[-1] in blacklist\n\n\ndef _process_image(filename, coder):\n  \"\"\"Process a single image file.\n\n  Args:\n    filename: string, path to an image file e.g., '/path/to/example.JPG'.\n    coder: instance of ImageCoder to provide TensorFlow image coding utils.\n  Returns:\n    image_buffer: string, JPEG encoding of RGB image.\n    height: integer, image height in pixels.\n    width: integer, image width in pixels.\n  \"\"\"\n  # Read the image file.\n  with tf.gfile.FastGFile(filename, 'r') as f:\n    image_data = f.read()\n\n  # Clean the dirty data.\n  if _is_png(filename):\n    # 1 image is a PNG.\n    print('Converting PNG to JPEG for %s' % filename)\n    image_data = coder.png_to_jpeg(image_data)\n  elif _is_cmyk(filename):\n    # 22 JPEG images are in CMYK colorspace.\n    print('Converting CMYK to RGB for %s' % filename)\n    image_data = coder.cmyk_to_rgb(image_data)\n\n  # Decode the RGB JPEG.\n  image = coder.decode_jpeg(image_data)\n\n  # Check that image converted to RGB\n  assert 
len(image.shape) == 3\n  height = image.shape[0]\n  width = image.shape[1]\n  assert image.shape[2] == 3\n\n  return image_data, height, width\n\n\ndef _process_image_files_batch(coder, thread_index, ranges, name, filenames,\n                               synsets, labels, humans, bboxes, num_shards):\n  \"\"\"Processes and saves list of images as TFRecord in 1 thread.\n\n  Args:\n    coder: instance of ImageCoder to provide TensorFlow image coding utils.\n    thread_index: integer, unique batch to run index is within [0, len(ranges)).\n    ranges: list of pairs of integers specifying ranges of each batches to\n      analyze in parallel.\n    name: string, unique identifier specifying the data set\n    filenames: list of strings; each string is a path to an image file\n    synsets: list of strings; each string is a unique WordNet ID\n    labels: list of integer; each integer identifies the ground truth\n    humans: list of strings; each string is a human-readable label\n    bboxes: list of bounding boxes for each image. Note that each entry in this\n      list might contain from 0+ entries corresponding to the number of bounding\n      box annotations for the image.\n    num_shards: integer number of shards for this data set.\n  \"\"\"\n  # Each thread produces N shards where N = int(num_shards / num_threads).\n  # For instance, if num_shards = 128, and the num_threads = 2, then the first\n  # thread would produce shards [0, 64).\n  num_threads = len(ranges)\n  assert not num_shards % num_threads\n  num_shards_per_batch = int(num_shards / num_threads)\n\n  shard_ranges = np.linspace(ranges[thread_index][0],\n                             ranges[thread_index][1],\n                             num_shards_per_batch + 1).astype(int)\n  num_files_in_thread = ranges[thread_index][1] - ranges[thread_index][0]\n\n  counter = 0\n  for s in xrange(num_shards_per_batch):\n    # Generate a sharded version of the file name, e.g. 
'train-00002-of-00010'\n    shard = thread_index * num_shards_per_batch + s\n    output_filename = '%s-%.5d-of-%.5d' % (name, shard, num_shards)\n    output_file = os.path.join(FLAGS.output_directory, output_filename)\n    writer = tf.python_io.TFRecordWriter(output_file)\n\n    shard_counter = 0\n    files_in_shard = np.arange(shard_ranges[s], shard_ranges[s + 1], dtype=int)\n    for i in files_in_shard:\n      filename = filenames[i]\n      label = labels[i]\n      synset = synsets[i]\n      human = humans[i]\n      bbox = bboxes[i]\n\n      image_buffer, height, width = _process_image(filename, coder)\n\n      example = _convert_to_example(filename, image_buffer, label,\n                                    synset, human, bbox,\n                                    height, width)\n      writer.write(example.SerializeToString())\n      shard_counter += 1\n      counter += 1\n\n      if not counter % 1000:\n        print('%s [thread %d]: Processed %d of %d images in thread batch.' %\n              (datetime.now(), thread_index, counter, num_files_in_thread))\n        sys.stdout.flush()\n\n    writer.close()\n    print('%s [thread %d]: Wrote %d images to %s' %\n          (datetime.now(), thread_index, shard_counter, output_file))\n    sys.stdout.flush()\n    shard_counter = 0\n  print('%s [thread %d]: Wrote %d images to %d shards.' 
%\n        (datetime.now(), thread_index, counter, num_files_in_thread))\n  sys.stdout.flush()\n\n\ndef _process_image_files(name, filenames, synsets, labels, humans,\n                         bboxes, num_shards):\n  \"\"\"Process and save list of images as TFRecord of Example protos.\n\n  Args:\n    name: string, unique identifier specifying the data set\n    filenames: list of strings; each string is a path to an image file\n    synsets: list of strings; each string is a unique WordNet ID\n    labels: list of integer; each integer identifies the ground truth\n    humans: list of strings; each string is a human-readable label\n    bboxes: list of bounding boxes for each image. Note that each entry in this\n      list might contain from 0+ entries corresponding to the number of bounding\n      box annotations for the image.\n    num_shards: integer number of shards for this data set.\n  \"\"\"\n  assert len(filenames) == len(synsets)\n  assert len(filenames) == len(labels)\n  assert len(filenames) == len(humans)\n  assert len(filenames) == len(bboxes)\n\n  # Break all images into batches with a [ranges[i][0], ranges[i][1]].\n  spacing = np.linspace(0, len(filenames), FLAGS.num_threads + 1).astype(np.int)\n  ranges = []\n  threads = []\n  for i in xrange(len(spacing) - 1):\n    ranges.append([spacing[i], spacing[i+1]])\n\n  # Launch a thread for each batch.\n  print('Launching %d threads for spacings: %s' % (FLAGS.num_threads, ranges))\n  sys.stdout.flush()\n\n  # Create a mechanism for monitoring when all threads are finished.\n  coord = tf.train.Coordinator()\n\n  # Create a generic TensorFlow-based utility for converting all image codings.\n  coder = ImageCoder()\n\n  threads = []\n  for thread_index in xrange(len(ranges)):\n    args = (coder, thread_index, ranges, name, filenames,\n            synsets, labels, humans, bboxes, num_shards)\n    t = threading.Thread(target=_process_image_files_batch, args=args)\n    t.start()\n    threads.append(t)\n\n  # Wait for 
all the threads to terminate.\n  coord.join(threads)\n  print('%s: Finished writing all %d images in data set.' %\n        (datetime.now(), len(filenames)))\n  sys.stdout.flush()\n\n\ndef _find_image_files(data_dir, labels_file):\n  \"\"\"Build a list of all images files and labels in the data set.\n\n  Args:\n    data_dir: string, path to the root directory of images.\n\n      Assumes that the ImageNet data set resides in JPEG files located in\n      the following directory structure.\n\n        data_dir/n01440764/ILSVRC2012_val_00000293.JPEG\n        data_dir/n01440764/ILSVRC2012_val_00000543.JPEG\n\n      where 'n01440764' is the unique synset label associated with these images.\n\n    labels_file: string, path to the labels file.\n\n      The list of valid labels are held in this file. Assumes that the file\n      contains entries as such:\n        n01440764\n        n01443537\n        n01484850\n      where each line corresponds to a label expressed as a synset. We map\n      each synset contained in the file to an integer (based on the alphabetical\n      ordering) starting with the integer 1 corresponding to the synset\n      contained in the first line.\n\n      The reason we start the integer labels at 1 is to reserve label 0 as an\n      unused background class.\n\n  Returns:\n    filenames: list of strings; each string is a path to an image file.\n    synsets: list of strings; each string is a unique WordNet ID.\n    labels: list of integer; each integer identifies the ground truth.\n  \"\"\"\n  print('Determining list of input files and labels from %s.' 
% data_dir)\n  challenge_synsets = [l.strip() for l in\n                       tf.gfile.FastGFile(labels_file, 'r').readlines()]\n\n  labels = []\n  filenames = []\n  synsets = []\n\n  # Leave label index 0 empty as a background class.\n  label_index = 1\n\n  # Construct the list of JPEG files and labels.\n  for synset in challenge_synsets:\n    jpeg_file_path = '%s/%s/*.JPEG' % (data_dir, synset)\n    matching_files = tf.gfile.Glob(jpeg_file_path)\n\n    labels.extend([label_index] * len(matching_files))\n    synsets.extend([synset] * len(matching_files))\n    filenames.extend(matching_files)\n\n    if not label_index % 100:\n      print('Finished finding files in %d of %d classes.' % (\n          label_index, len(challenge_synsets)))\n    label_index += 1\n\n  # Shuffle the ordering of all image files in order to guarantee\n  # random ordering of the images with respect to label in the\n  # saved TFRecord files. Make the randomization repeatable.\n  shuffled_index = range(len(filenames))\n  random.seed(12345)\n  random.shuffle(shuffled_index)\n\n  filenames = [filenames[i] for i in shuffled_index]\n  synsets = [synsets[i] for i in shuffled_index]\n  labels = [labels[i] for i in shuffled_index]\n\n  print('Found %d JPEG files across %d labels inside %s.' 
%\n        (len(filenames), len(challenge_synsets), data_dir))\n  return filenames, synsets, labels\n\n\ndef _find_human_readable_labels(synsets, synset_to_human):\n  \"\"\"Build a list of human-readable labels.\n\n  Args:\n    synsets: list of strings; each string is a unique WordNet ID.\n    synset_to_human: dict of synset to human labels, e.g.,\n      'n02119022' --> 'red fox, Vulpes vulpes'\n\n  Returns:\n    List of human-readable strings corresponding to each synset.\n  \"\"\"\n  humans = []\n  for s in synsets:\n    assert s in synset_to_human, ('Failed to find: %s' % s)\n    humans.append(synset_to_human[s])\n  return humans\n\n\ndef _find_image_bounding_boxes(filenames, image_to_bboxes):\n  \"\"\"Find the bounding boxes for a given image file.\n\n  Args:\n    filenames: list of strings; each string is a path to an image file.\n    image_to_bboxes: dictionary mapping image file names to a list of\n      bounding boxes. This list contains 0+ bounding boxes.\n  Returns:\n    List of bounding boxes for each image. 
Note that each entry in this\n    list might contain from 0+ entries corresponding to the number of bounding\n    box annotations for the image.\n  \"\"\"\n  num_image_bbox = 0\n  bboxes = []\n  for f in filenames:\n    basename = os.path.basename(f)\n    if basename in image_to_bboxes:\n      bboxes.append(image_to_bboxes[basename])\n      num_image_bbox += 1\n    else:\n      bboxes.append([])\n  print('Found %d images with bboxes out of %d images' % (\n      num_image_bbox, len(filenames)))\n  return bboxes\n\n\ndef _process_dataset(name, directory, num_shards, synset_to_human,\n                     image_to_bboxes):\n  \"\"\"Process a complete data set and save it as a TFRecord.\n\n  Args:\n    name: string, unique identifier specifying the data set.\n    directory: string, root path to the data set.\n    num_shards: integer number of shards for this data set.\n    synset_to_human: dict of synset to human labels, e.g.,\n      'n02119022' --> 'red fox, Vulpes vulpes'\n    image_to_bboxes: dictionary mapping image file names to a list of\n      bounding boxes. This list contains 0+ bounding boxes.\n  \"\"\"\n  filenames, synsets, labels = _find_image_files(directory, FLAGS.labels_file)\n  humans = _find_human_readable_labels(synsets, synset_to_human)\n  bboxes = _find_image_bounding_boxes(filenames, image_to_bboxes)\n  _process_image_files(name, filenames, synsets, labels,\n                       humans, bboxes, num_shards)\n\n\ndef _build_synset_lookup(imagenet_metadata_file):\n  \"\"\"Build lookup for synset to human-readable label.\n\n  Args:\n    imagenet_metadata_file: string, path to file containing mapping from\n      synset to human-readable label.\n\n      Assumes each line of the file looks like:\n\n        n02119247    black fox\n        n02119359    silver fox\n        n02119477    red fox, Vulpes fulva\n\n      where each line corresponds to a unique mapping. 
Note that each line is\n      formatted as <synset>\\t<human readable label>.\n\n  Returns:\n    Dictionary of synset to human labels, such as:\n      'n02119022' --> 'red fox, Vulpes vulpes'\n  \"\"\"\n  lines = tf.gfile.FastGFile(imagenet_metadata_file, 'r').readlines()\n  synset_to_human = {}\n  for l in lines:\n    if l:\n      parts = l.strip().split('\\t')\n      assert len(parts) == 2\n      synset = parts[0]\n      human = parts[1]\n      synset_to_human[synset] = human\n  return synset_to_human\n\n\ndef _build_bounding_box_lookup(bounding_box_file):\n  \"\"\"Build a lookup from image file to bounding boxes.\n\n  Args:\n    bounding_box_file: string, path to file with bounding box annotations.\n\n      Assumes each line of the file looks like:\n\n        n00007846_64193.JPEG,0.0060,0.2620,0.7545,0.9940\n\n      where each line corresponds to one bounding box annotation associated\n      with an image. Each line can be parsed as:\n\n        <JPEG file name>, <xmin>, <ymin>, <xmax>, <ymax>\n\n      Note that there might exist multiple bounding box annotations associated\n      with an image file. This file is the output of process_bounding_boxes.py.\n\n  Returns:\n    Dictionary mapping image file names to a list of bounding boxes. This list\n    contains 0+ bounding boxes.\n  \"\"\"\n  lines = tf.gfile.FastGFile(bounding_box_file, 'r').readlines()\n  images_to_bboxes = {}\n  num_bbox = 0\n  num_image = 0\n  for l in lines:\n    if l:\n      parts = l.split(',')\n      assert len(parts) == 5, ('Failed to parse: %s' % l)\n      filename = parts[0]\n      xmin = float(parts[1])\n      ymin = float(parts[2])\n      xmax = float(parts[3])\n      ymax = float(parts[4])\n      box = [xmin, ymin, xmax, ymax]\n\n      if filename not in images_to_bboxes:\n        images_to_bboxes[filename] = []\n        num_image += 1\n      images_to_bboxes[filename].append(box)\n      num_bbox += 1\n\n  print('Successfully read %d bounding boxes '\n        'across %d images.' 
% (num_bbox, num_image))\n  return images_to_bboxes\n\n\ndef main(unused_argv):\n  assert not FLAGS.train_shards % FLAGS.num_threads, (\n      'Please make the FLAGS.num_threads commensurate with FLAGS.train_shards')\n  assert not FLAGS.validation_shards % FLAGS.num_threads, (\n      'Please make the FLAGS.num_threads commensurate with '\n      'FLAGS.validation_shards')\n  print('Saving results to %s' % FLAGS.output_directory)\n\n  # Build a map from synset to human-readable label.\n  synset_to_human = _build_synset_lookup(FLAGS.imagenet_metadata_file)\n  image_to_bboxes = _build_bounding_box_lookup(FLAGS.bounding_box_file)\n\n  # Run it!\n  _process_dataset('validation', FLAGS.validation_directory,\n                   FLAGS.validation_shards, synset_to_human, image_to_bboxes)\n  _process_dataset('train', FLAGS.train_directory, FLAGS.train_shards,\n                   synset_to_human, image_to_bboxes)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/download_and_preprocess_flowers.sh",
    "content": "#!/bin/bash\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# Script to download and preprocess the flowers data set. This data set\n# provides a demonstration for how to perform fine-tuning (i.e. transfer\n# learning) from one model to a new data set.\n#\n# This script provides a demonstration for how to prepare an arbitrary\n# data set for training an Inception v3 model.\n#\n# We demonstrate this with the flowers data set, which consists of labeled\n# flower images from 5 classes:\n#\n# daisy, dandelion, roses, sunflowers, tulips\n#\n# The final output of this script are sharded TFRecord files containing\n# serialized Example protocol buffers. See build_image_data.py for\n# details of how the Example protocol buffer contains image data.\n#\n# usage:\n#  ./download_and_preprocess_flowers.sh [data-dir]\nset -e\n\nif [ -z \"$1\" ]; then\n  echo \"usage download_and_preprocess_flowers.sh [data dir]\"\n  exit\nfi\n\n# Create the output and temporary directories.\nDATA_DIR=\"${1%/}\"\nSCRATCH_DIR=\"${DATA_DIR}/raw-data/\"\nmkdir -p \"${DATA_DIR}\"\nmkdir -p \"${SCRATCH_DIR}\"\nWORK_DIR=\"$0.runfiles/inception/inception\"\n\n# Download the flowers data.\nDATA_URL=\"http://download.tensorflow.org/example_images/flower_photos.tgz\"\nCURRENT_DIR=$(pwd)\ncd \"${DATA_DIR}\"\nTARBALL=\"flower_photos.tgz\"\nif [ ! 
-f ${TARBALL} ]; then\n  echo \"Downloading flower data set.\"\n  wget -O ${TARBALL} \"${DATA_URL}\"\nelse\n  echo \"Skipping download of flower data.\"\nfi\n\n# Note the locations of the train and validation data.\nTRAIN_DIRECTORY=\"${SCRATCH_DIR}train/\"\nVALIDATION_DIRECTORY=\"${SCRATCH_DIR}validation/\"\n\n# Expands the data into the flower_photos/ directory and rename it as the\n# train directory.\ntar xf flower_photos.tgz\nrm -rf \"${TRAIN_DIRECTORY}\" \"${VALIDATION_DIRECTORY}\"\nmv flower_photos \"${TRAIN_DIRECTORY}\"\n\n# Generate a list of 5 labels: daisy, dandelion, roses, sunflowers, tulips\nLABELS_FILE=\"${SCRATCH_DIR}/labels.txt\"\nls -1 \"${TRAIN_DIRECTORY}\" | grep -v 'LICENSE' | sed 's/\\///' | sort > \"${LABELS_FILE}\"\n\n# Generate the validation data set.\nwhile read LABEL; do\n  VALIDATION_DIR_FOR_LABEL=\"${VALIDATION_DIRECTORY}${LABEL}\"\n  TRAIN_DIR_FOR_LABEL=\"${TRAIN_DIRECTORY}${LABEL}\"\n\n  # Move the first randomly selected 100 images to the validation set.\n  mkdir -p \"${VALIDATION_DIR_FOR_LABEL}\"\n  VALIDATION_IMAGES=$(ls -1 \"${TRAIN_DIR_FOR_LABEL}\" | shuf | head -100)\n  for IMAGE in ${VALIDATION_IMAGES}; do\n    mv -f \"${TRAIN_DIRECTORY}${LABEL}/${IMAGE}\" \"${VALIDATION_DIR_FOR_LABEL}\"\n  done\ndone < \"${LABELS_FILE}\"\n\n# Build the TFRecords version of the image data.\ncd \"${CURRENT_DIR}\"\nBUILD_SCRIPT=\"${WORK_DIR}/build_image_data\"\nOUTPUT_DIRECTORY=\"${DATA_DIR}\"\n\"${BUILD_SCRIPT}\" \\\n  --train_directory=\"${TRAIN_DIRECTORY}\" \\\n  --validation_directory=\"${VALIDATION_DIRECTORY}\" \\\n  --output_directory=\"${OUTPUT_DIRECTORY}\" \\\n  --labels_file=\"${LABELS_FILE}\"\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/download_and_preprocess_flowers_mac.sh",
    "content": "#!/bin/bash\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# Script to download and preprocess the flowers data set. This data set\n# provides a demonstration for how to perform fine-tuning (i.e. transfer\n# learning) from one model to a new data set.\n#\n# This script provides a demonstration for how to prepare an arbitrary\n# data set for training an Inception v3 model.\n#\n# We demonstrate this with the flowers data set, which consists of labeled\n# flower images from 5 classes:\n#\n# daisy, dandelion, roses, sunflowers, tulips\n#\n# The final output of this script are sharded TFRecord files containing\n# serialized Example protocol buffers. See build_image_data.py for\n# details of how the Example protocol buffer contains image data.\n#\n# usage:\n#  ./download_and_preprocess_flowers.sh [data-dir]\nset -e\n\nif [ -z \"$1\" ]; then\n  echo \"usage download_and_preprocess_flowers.sh [data dir]\"\n  exit\nfi\n\n# Create the output and temporary directories.\nDATA_DIR=\"${1%/}\"\nSCRATCH_DIR=\"${DATA_DIR}/raw-data/\"\nmkdir -p \"${DATA_DIR}\"\nmkdir -p \"${SCRATCH_DIR}\"\nWORK_DIR=\"$0.runfiles/inception/inception\"\n\n# Download the flowers data.\nDATA_URL=\"http://download.tensorflow.org/example_images/flower_photos.tgz\"\nCURRENT_DIR=$(pwd)\ncd \"${DATA_DIR}\"\nTARBALL=\"flower_photos.tgz\"\nif [ ! 
-f ${TARBALL} ]; then\n  echo \"Downloading flower data set.\"\n  wget -O ${TARBALL} \"${DATA_URL}\"\nelse\n  echo \"Skipping download of flower data.\"\nfi\n\n# Note the locations of the train and validation data.\nTRAIN_DIRECTORY=\"${SCRATCH_DIR}train/\"\nVALIDATION_DIRECTORY=\"${SCRATCH_DIR}validation/\"\n\n# Expands the data into the flower_photos/ directory and rename it as the\n# train directory.\ntar xf flower_photos.tgz\nrm -rf \"${TRAIN_DIRECTORY}\" \"${VALIDATION_DIRECTORY}\"\nmv flower_photos \"${TRAIN_DIRECTORY}\"\n\n# Generate a list of 5 labels: daisy, dandelion, roses, sunflowers, tulips\nLABELS_FILE=\"${SCRATCH_DIR}/labels.txt\"\nls -1 \"${TRAIN_DIRECTORY}\" | grep -v 'LICENSE' | sed 's/\\///' | sort > \"${LABELS_FILE}\"\n\n# Generate the validation data set.\nwhile read LABEL; do\n  VALIDATION_DIR_FOR_LABEL=\"${VALIDATION_DIRECTORY}${LABEL}\"\n  TRAIN_DIR_FOR_LABEL=\"${TRAIN_DIRECTORY}${LABEL}\"\n\n  # Move the first randomly selected 100 images to the validation set.\n  mkdir -p \"${VALIDATION_DIR_FOR_LABEL}\"\n  VALIDATION_IMAGES=$(ls -1 \"${TRAIN_DIR_FOR_LABEL}\" | gshuf | head -100)\n  for IMAGE in ${VALIDATION_IMAGES}; do\n    mv -f \"${TRAIN_DIRECTORY}${LABEL}/${IMAGE}\" \"${VALIDATION_DIR_FOR_LABEL}\"\n  done\ndone < \"${LABELS_FILE}\"\n\n# Build the TFRecords version of the image data.\ncd \"${CURRENT_DIR}\"\nBUILD_SCRIPT=\"${WORK_DIR}/build_image_data\"\nOUTPUT_DIRECTORY=\"${DATA_DIR}\"\n\"${BUILD_SCRIPT}\" \\\n  --train_directory=\"${TRAIN_DIRECTORY}\" \\\n  --validation_directory=\"${VALIDATION_DIRECTORY}\" \\\n  --output_directory=\"${OUTPUT_DIRECTORY}\" \\\n  --labels_file=\"${LABELS_FILE}\"\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/download_and_preprocess_imagenet.sh",
    "content": "#!/bin/bash\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# Script to download and preprocess ImageNet Challenge 2012\n# training and validation data set.\n#\n# The final output of this script are sharded TFRecord files containing\n# serialized Example protocol buffers. See build_imagenet_data.py for\n# details of how the Example protocol buffers contain the ImageNet data.\n#\n# The final output of this script appears as such:\n#\n#   data_dir/train-00000-of-01024\n#   data_dir/train-00001-of-01024\n#    ...\n#   data_dir/train-00127-of-01024\n#\n# and\n#\n#   data_dir/validation-00000-of-00128\n#   data_dir/validation-00001-of-00128\n#   ...\n#   data_dir/validation-00127-of-00128\n#\n# Note that this script may take several hours to run to completion. The\n# conversion of the ImageNet data to TFRecords alone takes 2-3 hours depending\n# on the speed of your machine. Please be patient.\n#\n# **IMPORTANT**\n# To download the raw images, the user must create an account with image-net.org\n# and generate a username and access_key. 
The latter two are required for\n# downloading the raw images.\n#\n# usage:\n#  ./download_and_preprocess_imagenet.sh [data-dir]\nset -e\n\nif [ -z \"$1\" ]; then\n  echo \"usage download_and_preprocess_imagenet.sh [data dir]\"\n  exit\nfi\n\n# Create the output and temporary directories.\nDATA_DIR=\"${1%/}\"\nSCRATCH_DIR=\"${DATA_DIR}/raw-data/\"\nmkdir -p \"${DATA_DIR}\"\nmkdir -p \"${SCRATCH_DIR}\"\nWORK_DIR=\"$0.runfiles/inception/inception\"\n\n# Download the ImageNet data.\nLABELS_FILE=\"${WORK_DIR}/data/imagenet_lsvrc_2015_synsets.txt\"\nDOWNLOAD_SCRIPT=\"${WORK_DIR}/data/download_imagenet.sh\"\n\"${DOWNLOAD_SCRIPT}\" \"${SCRATCH_DIR}\" \"${LABELS_FILE}\"\n\n# Note the locations of the train and validation data.\nTRAIN_DIRECTORY=\"${SCRATCH_DIR}train/\"\nVALIDATION_DIRECTORY=\"${SCRATCH_DIR}validation/\"\n\n# Preprocess the validation data by moving the images into the appropriate\n# sub-directory based on the label (synset) of the image.\necho \"Organizing the validation data into sub-directories.\"\nPREPROCESS_VAL_SCRIPT=\"${WORK_DIR}/data/preprocess_imagenet_validation_data.py\"\nVAL_LABELS_FILE=\"${WORK_DIR}/data/imagenet_2012_validation_synset_labels.txt\"\n\n\"${PREPROCESS_VAL_SCRIPT}\" \"${VALIDATION_DIRECTORY}\" \"${VAL_LABELS_FILE}\"\n\n# Convert the XML files for bounding box annotations into a single CSV.\necho \"Extracting bounding box information from XML.\"\nBOUNDING_BOX_SCRIPT=\"${WORK_DIR}/data/process_bounding_boxes.py\"\nBOUNDING_BOX_FILE=\"${SCRATCH_DIR}/imagenet_2012_bounding_boxes.csv\"\nBOUNDING_BOX_DIR=\"${SCRATCH_DIR}bounding_boxes/\"\n\n\"${BOUNDING_BOX_SCRIPT}\" \"${BOUNDING_BOX_DIR}\" \"${LABELS_FILE}\" \\\n | sort >\"${BOUNDING_BOX_FILE}\"\necho \"Finished downloading and preprocessing the ImageNet data.\"\n\n# Build the TFRecords version of the ImageNet data.\nBUILD_SCRIPT=\"${WORK_DIR}/build_imagenet_data\"\nOUTPUT_DIRECTORY=\"${DATA_DIR}\"\nIMAGENET_METADATA_FILE=\"${WORK_DIR}/data/imagenet_metadata.txt\"\n\n\"${BUILD_SCRIPT}\" 
\\\n  --train_directory=\"${TRAIN_DIRECTORY}\" \\\n  --validation_directory=\"${VALIDATION_DIRECTORY}\" \\\n  --output_directory=\"${OUTPUT_DIRECTORY}\" \\\n  --imagenet_metadata_file=\"${IMAGENET_METADATA_FILE}\" \\\n  --labels_file=\"${LABELS_FILE}\" \\\n  --bounding_box_file=\"${BOUNDING_BOX_FILE}\"\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/download_imagenet.sh",
    "content": "#!/bin/bash\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# Script to download ImageNet Challenge 2012 training and validation data set.\n#\n# Downloads and decompresses raw images and bounding boxes.\n#\n# **IMPORTANT**\n# To download the raw images, the user must create an account with image-net.org\n# and generate a username and access_key. The latter two are required for\n# downloading the raw images.\n#\n# usage:\n#  ./download_imagenet.sh [dirname]\nset -e\n\nif [ \"x$IMAGENET_ACCESS_KEY\" == x -o \"x$IMAGENET_USERNAME\" == x ]; then\n  cat <<END\nIn order to download the imagenet data, you have to create an account with\nimage-net.org. This will get you a username and an access key. 
You can set the\nIMAGENET_USERNAME and IMAGENET_ACCESS_KEY environment variables, or you can\nenter the credentials here.\nEND\n  read -p \"Username: \" IMAGENET_USERNAME\n  read -p \"Access key: \" IMAGENET_ACCESS_KEY\nfi\n\nOUTDIR=\"${1:-./imagenet-data}\"\nSYNSETS_FILE=\"${2:-./synsets.txt}\"\nSYNSETS_FILE=\"${PWD}/${SYNSETS_FILE}\"\n\necho \"Saving downloaded files to $OUTDIR\"\nmkdir -p \"${OUTDIR}\"\nCURRENT_DIR=$(pwd)\nBBOX_DIR=\"${OUTDIR}bounding_boxes\"\nmkdir -p \"${BBOX_DIR}\"\ncd \"${OUTDIR}\"\n\n# Download and process all of the ImageNet bounding boxes.\nBASE_URL=\"http://www.image-net.org/challenges/LSVRC/2012/nonpub\"\n\n# See here for details: http://www.image-net.org/download-bboxes\nBOUNDING_BOX_ANNOTATIONS=\"${BASE_URL}/ILSVRC2012_bbox_train_v2.tar.gz\"\nBBOX_TAR_BALL=\"${BBOX_DIR}/annotations.tar.gz\"\necho \"Downloading bounding box annotations.\"\nwget \"${BOUNDING_BOX_ANNOTATIONS}\" -O \"${BBOX_TAR_BALL}\" || BASE_URL_CHANGE=1\nif [ $BASE_URL_CHANGE ]; then\n  BASE_URL=\"http://www.image-net.org/challenges/LSVRC/2012/nnoupb\"\n  BOUNDING_BOX_ANNOTATIONS=\"${BASE_URL}/ILSVRC2012_bbox_train_v2.tar.gz\"\n  BBOX_TAR_BALL=\"${BBOX_DIR}/annotations.tar.gz\"\nfi\nwget \"${BOUNDING_BOX_ANNOTATIONS}\" -O \"${BBOX_TAR_BALL}\"\necho \"Uncompressing bounding box annotations ...\"\ntar xzf \"${BBOX_TAR_BALL}\" -C \"${BBOX_DIR}\"\n\nLABELS_ANNOTATED=\"${BBOX_DIR}/*\"\nNUM_XML=$(ls -1 ${LABELS_ANNOTATED} | wc -l)\necho \"Identified ${NUM_XML} bounding box annotations.\"\n\n# Download and uncompress all images from the ImageNet 2012 validation dataset.\nVALIDATION_TARBALL=\"ILSVRC2012_img_val.tar\"\nOUTPUT_PATH=\"${OUTDIR}validation/\"\nmkdir -p \"${OUTPUT_PATH}\"\ncd \"${OUTDIR}/..\"\necho \"Downloading ${VALIDATION_TARBALL} to ${OUTPUT_PATH}.\"\nwget -nd -c \"${BASE_URL}/${VALIDATION_TARBALL}\"\ntar xf \"${VALIDATION_TARBALL}\" -C \"${OUTPUT_PATH}\"\n\n# Download all images from the ImageNet 2012 train 
dataset.\nTRAIN_TARBALL=\"ILSVRC2012_img_train.tar\"\nOUTPUT_PATH=\"${OUTDIR}train/\"\nmkdir -p \"${OUTPUT_PATH}\"\ncd \"${OUTDIR}/..\"\necho \"Downloading ${TRAIN_TARBALL} to ${OUTPUT_PATH}.\"\nwget -nd -c \"${BASE_URL}/${TRAIN_TARBALL}\"\n\n# Un-compress the individual tar-files within the train tar-file.\necho \"Uncompressing individual train tar-balls in the training data.\"\n\nwhile read SYNSET; do\n  echo \"Processing: ${SYNSET}\"\n\n  # Create a directory and delete anything there.\n  mkdir -p \"${OUTPUT_PATH}/${SYNSET}\"\n  rm -rf \"${OUTPUT_PATH}/${SYNSET}/*\"\n\n  # Uncompress into the directory.\n  tar xf \"${TRAIN_TARBALL}\" \"${SYNSET}.tar\"\n  tar xf \"${SYNSET}.tar\" -C \"${OUTPUT_PATH}/${SYNSET}/\"\n  rm -f \"${SYNSET}.tar\"\n\n  echo \"Finished processing: ${SYNSET}\"\ndone < \"${SYNSETS_FILE}\"\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/imagenet_2012_validation_synset_labels.txt",
    "content": "n01751748\nn09193705\nn02105855\nn04263257\nn03125729\nn01735189\nn02346627\nn02776631\nn03794056\nn02328150\nn01917289\nn02125311\nn02484975\nn04065272\nn03496892\nn02066245\nn01914609\nn01616318\nn02971356\nn03126707\nn02346627\nn02091244\nn07742313\nn03956157\nn01616318\nn04380533\nn02114548\nn02089973\nn01729977\nn04435653\nn02280649\nn03444034\nn02077923\nn09835506\nn03478589\nn04532106\nn01644900\nn02666196\nn04141327\nn01773797\nn03125729\nn04049303\nn02006656\nn02097209\nn02111277\nn03950228\nn03393912\nn02089973\nn03930630\nn02640242\nn01828970\nn01632777\nn04372370\nn03485794\nn02443114\nn02930766\nn02112018\nn13040303\nn04485082\nn03482405\nn02963159\nn02093859\nn01910747\nn01693334\nn04371430\nn02526121\nn01871265\nn04532106\nn04482393\nn04370456\nn02927161\nn02074367\nn01608432\nn02966193\nn01795545\nn02791270\nn02087394\nn02116738\nn02091635\nn02895154\nn09193705\nn02088094\nn04200800\nn01737021\nn02974003\nn03032252\nn02483708\nn01632458\nn02992529\nn01698640\nn02114548\nn02497673\nn02480855\nn04147183\nn02487347\nn03895866\nn02325366\nn02033041\nn07745940\nn02415577\nn02951585\nn02087394\nn04485082\nn04505470\nn02097658\nn04591157\nn01770081\nn02992211\nn03691459\nn03594734\nn01983481\nn03937543\nn02105412\nn03843555\nn02091244\nn07831146\nn03710637\nn03733281\nn03782006\nn03733131\nn03933933\nn02980441\nn04409515\nn02606052\nn02226429\nn02883205\nn02422699\nn01614925\nn07697537\nn02123394\nn04252077\nn03337140\nn02117135\nn02107142\nn04037443\nn02397096\nn03187595\nn02319095\nn07932039\nn03372029\nn02088466\nn02319095\nn04125021\nn03954731\nn09421951\nn04487394\nn02113624\nn03843555\nn03485407\nn09332890\nn03642806\nn03710193\nn01677366\nn01950731\nn07714990\nn02114855\nn02119022\nn04086273\nn04201297\nn03733281\nn02100877\nn03016953\nn03733805\nn03063599\nn07714990\nn03854065\nn04149813\nn03786901\nn03467068\nn02087046\nn04326547\nn02100735\nn03775546\nn02111500\nn02814533\nn02097047\nn02027492\nn02109961\nn02389026\nn02105855\nn024
45715\nn03259280\nn07711569\nn03710637\nn03670208\nn02128757\nn04467665\nn02114855\nn01873310\nn03476684\nn02093428\nn03891251\nn02859443\nn04125021\nn01978287\nn02643566\nn07697537\nn01560419\nn03290653\nn13037406\nn03891332\nn02883205\nn02106382\nn02672831\nn04330267\nn02489166\nn02058221\nn03584829\nn07565083\nn03125729\nn02123597\nn04536866\nn02965783\nn09428293\nn02965783\nn11879895\nn01560419\nn01775062\nn03595614\nn02110958\nn03709823\nn03777754\nn02951585\nn02100877\nn01629819\nn02909870\nn02101388\nn02091244\nn01667114\nn03998194\nn01986214\nn04192698\nn02128757\nn02793495\nn09256479\nn01443537\nn02089973\nn01981276\nn02837789\nn03888605\nn03201208\nn02480855\nn03814639\nn04090263\nn01986214\nn02415577\nn01534433\nn02093256\nn03134739\nn03016953\nn12620546\nn03937543\nn02815834\nn03776460\nn10565667\nn03207743\nn02992529\nn01631663\nn03729826\nn04033995\nn04462240\nn01443537\nn02091831\nn03874293\nn03874599\nn04238763\nn07584110\nn02749479\nn02110185\nn09193705\nn04311004\nn02788148\nn02445715\nn06874185\nn04074963\nn01631663\nn03803284\nn01828970\nn02096437\nn04554684\nn03599486\nn03595614\nn02123394\nn04515003\nn04591157\nn04560804\nn02794156\nn03344393\nn02687172\nn04328186\nn04479046\nn03967562\nn01440764\nn04465501\nn03457902\nn04532670\nn01688243\nn01749939\nn01768244\nn02091831\nn02321529\nn02939185\nn02129604\nn12985857\nn03485794\nn02408429\nn01443537\nn03590841\nn07697537\nn04154565\nn03443371\nn02514041\nn09468604\nn03769881\nn02787622\nn02526121\nn03888605\nn01622779\nn01872401\nn07745940\nn03085013\nn02445715\nn02120505\nn01751748\nn04141327\nn02443484\nn02089078\nn01608432\nn01514668\nn03160309\nn04070727\nn07715103\nn02110958\nn03976657\nn03902125\nn02909870\nn01740131\nn04532106\nn03197337\nn02493509\nn10148035\nn02172182\nn02437616\nn03062245\nn04286575\nn03018349\nn02951358\nn02130308\nn04277352\nn02096585\nn04589890\nn02965783\nn02978881\nn02804414\nn02112137\nn02007558\nn03670208\nn02894605\nn03657121\nn03876231\nn02165105\nn01669191\nn0
2011460\nn03710193\nn03796401\nn02916936\nn03492542\nn03998194\nn04552348\nn01824575\nn01917289\nn03461385\nn03874293\nn03272010\nn02099712\nn02999410\nn04179913\nn07831146\nn02096177\nn04350905\nn04507155\nn03743016\nn02105505\nn03649909\nn03680355\nn01910747\nn03529860\nn02787622\nn02012849\nn02011460\nn02094114\nn02950826\nn02105855\nn09288635\nn01773797\nn01774750\nn04409515\nn02497673\nn02113799\nn02786058\nn02443484\nn02981792\nn03095699\nn01664065\nn02092002\nn07711569\nn02219486\nn13133613\nn02114548\nn03529860\nn02097298\nn13133613\nn04355933\nn01537544\nn01847000\nn04428191\nn02666196\nn02268443\nn03291819\nn01828970\nn04099969\nn02747177\nn07720875\nn02088094\nn02113624\nn03710637\nn03637318\nn03942813\nn02093859\nn03794056\nn02930766\nn02930766\nn04525038\nn03796401\nn03709823\nn02097047\nn04604644\nn03938244\nn01560419\nn02097298\nn02091635\nn04136333\nn07718747\nn02417914\nn03355925\nn02445715\nn02445715\nn03495258\nn04447861\nn02111500\nn03584829\nn03977966\nn04116512\nn04019541\nn04200800\nn02408429\nn02085936\nn03992509\nn02769748\nn04613696\nn07716906\nn02085782\nn07718472\nn04398044\nn03920288\nn01860187\nn03272010\nn04008634\nn04090263\nn02028035\nn01677366\nn13037406\nn04067472\nn02095889\nn04532670\nn01582220\nn03476684\nn02395406\nn04487394\nn02443484\nn02510455\nn04550184\nn02814860\nn12144580\nn03126707\nn02486410\nn02125311\nn03777754\nn03924679\nn04613696\nn07875152\nn02058221\nn03188531\nn02777292\nn02489166\nn02066245\nn04579432\nn01630670\nn02666196\nn02091635\nn02114548\nn02356798\nn03201208\nn03240683\nn03590841\nn03018349\nn02104029\nn04251144\nn10148035\nn02169497\nn02089867\nn01734418\nn04476259\nn02843684\nn04008634\nn03400231\nn02119022\nn02137549\nn03761084\nn02490219\nn03840681\nn04346328\nn01677366\nn02102318\nn04458633\nn04476259\nn04209239\nn01795545\nn10565667\nn02114367\nn02107574\nn03032252\nn02104365\nn03133878\nn04336792\nn02112137\nn03000684\nn04553703\nn02102480\nn03825788\nn01695060\nn03250847\nn07860988\nn04310018\n
n02071294\nn01945685\nn01855672\nn02037110\nn03868863\nn04229816\nn12057211\nn02408429\nn02481823\nn07716358\nn04487394\nn03662601\nn02979186\nn02910353\nn04266014\nn03895866\nn04443257\nn02917067\nn04149813\nn03041632\nn02364673\nn02999410\nn04435653\nn04228054\nn02814860\nn01531178\nn03662601\nn07880968\nn04487081\nn07614500\nn03532672\nn01807496\nn02011460\nn02074367\nn04462240\nn02977058\nn02281406\nn03041632\nn04350905\nn02788148\nn02137549\nn04562935\nn04590129\nn02093991\nn03995372\nn02111889\nn04081281\nn02133161\nn02006656\nn02107908\nn04347754\nn02950826\nn02504013\nn04560804\nn02088364\nn02128385\nn02860847\nn04399382\nn02105412\nn02115641\nn07753592\nn07880968\nn03598930\nn03724870\nn02066245\nn02128925\nn04465501\nn02094258\nn02086646\nn04141076\nn04136333\nn13133613\nn02342885\nn02281406\nn03443371\nn07613480\nn04008634\nn04141327\nn04347754\nn03314780\nn02165456\nn03930313\nn04392985\nn01872401\nn04204238\nn07831146\nn02690373\nn12144580\nn02776631\nn02877765\nn02108089\nn03532672\nn03126707\nn01560419\nn02268853\nn03691459\nn03404251\nn02364673\nn02101556\nn02326432\nn03954731\nn07831146\nn03584254\nn02012849\nn03804744\nn02128385\nn01530575\nn03933933\nn04409515\nn02823428\nn01877812\nn03920288\nn02510455\nn02112350\nn03594945\nn03642806\nn02395406\nn03452741\nn02860847\nn03673027\nn02102040\nn04505470\nn04086273\nn02099849\nn01990800\nn03781244\nn04461696\nn02106166\nn04141076\nn07717556\nn02361337\nn03976657\nn03832673\nn03109150\nn01776313\nn03788195\nn03884397\nn04019541\nn01693334\nn03633091\nn02325366\nn03623198\nn02795169\nn01744401\nn01955084\nn02002556\nn07754684\nn02174001\nn02793495\nn02095889\nn02484975\nn02094433\nn09229709\nn03207941\nn02655020\nn03773504\nn04367480\nn03933933\nn01955084\nn04355933\nn13040303\nn02786058\nn04090263\nn02101006\nn02124075\nn03720891\nn07749582\nn04517823\nn01534433\nn04335435\nn03661043\nn02101556\nn03785016\nn03133878\nn02113978\nn02930766\nn02783161\nn03958227\nn02441942\nn02859443\nn02096437\nn02447366
\nn07742313\nn07583066\nn02110063\nn03146219\nn12998815\nn03425413\nn02123394\nn03594734\nn02006656\nn02992211\nn04442312\nn03032252\nn01608432\nn02927161\nn03485794\nn07583066\nn03347037\nn01847000\nn04557648\nn03478589\nn01530575\nn02098105\nn01755581\nn03045698\nn02028035\nn03538406\nn03956157\nn01871265\nn13044778\nn02119789\nn07875152\nn02107908\nn02791124\nn03697007\nn03207743\nn02791270\nn02865351\nn03345487\nn03976467\nn03124043\nn04252225\nn02165105\nn03314780\nn04040759\nn02730930\nn02236044\nn07873807\nn02006656\nn02514041\nn03534580\nn03179701\nn04366367\nn02138441\nn03450230\nn01943899\nn07836838\nn03691459\nn04467665\nn02115641\nn01742172\nn02795169\nn02481823\nn07583066\nn02749479\nn01665541\nn04131690\nn03769881\nn02009229\nn04487081\nn02123159\nn04542943\nn07760859\nn02097658\nn02113799\nn07932039\nn02097474\nn03793489\nn02791124\nn04591713\nn01735189\nn01631663\nn02892767\nn04458633\nn02277742\nn07697537\nn03781244\nn02791270\nn03854065\nn04356056\nn07802026\nn03733131\nn01980166\nn02174001\nn07684084\nn01981276\nn03874293\nn03146219\nn02099267\nn02018207\nn04398044\nn03832673\nn02493509\nn03478589\nn06359193\nn02971356\nn02093754\nn04487081\nn03929855\nn03485407\nn01930112\nn01592084\nn02088238\nn04613696\nn03967562\nn03814639\nn04311174\nn04286575\nn03884397\nn03534580\nn03793489\nn02106382\nn03045698\nn03661043\nn03814906\nn02669723\nn03459775\nn03785016\nn04584207\nn03657121\nn03476991\nn04243546\nn04560804\nn03788365\nn01796340\nn04019541\nn03496892\nn07711569\nn03788195\nn02133161\nn04548362\nn02113712\nn03673027\nn12144580\nn02481823\nn02132136\nn03956157\nn01532829\nn04493381\nn02094258\nn03483316\nn01770081\nn02006656\nn02871525\nn01580077\nn07730033\nn02097474\nn02093647\nn02088466\nn01795545\nn07716906\nn03481172\nn01608432\nn02097209\nn01629819\nn07695742\nn02389026\nn02977058\nn04090263\nn04522168\nn02871525\nn04258138\nn02127052\nn04476259\nn03617480\nn04273569\nn03485794\nn06794110\nn03085013\nn02974003\nn02869837\nn02086240\nn016858
08\nn02088466\nn03584829\nn01514668\nn02114367\nn03447447\nn04435653\nn03065424\nn01616318\nn02841315\nn02655020\nn03496892\nn04040759\nn01496331\nn02094258\nn03787032\nn02172182\nn01693334\nn02168699\nn03793489\nn07613480\nn01824575\nn01665541\nn04065272\nn02699494\nn02526121\nn01774750\nn03126707\nn04254777\nn02325366\nn01665541\nn02007558\nn01873310\nn01734418\nn03271574\nn01776313\nn01644373\nn02486410\nn02106662\nn03125729\nn02087394\nn02094433\nn07684084\nn04532670\nn01843383\nn02835271\nn12985857\nn04485082\nn02167151\nn03394916\nn01664065\nn04286575\nn03874293\nn02699494\nn01601694\nn01582220\nn02486261\nn02268853\nn03947888\nn13040303\nn03967562\nn03602883\nn01882714\nn04505470\nn02226429\nn04522168\nn02481823\nn02108422\nn03670208\nn07718747\nn01688243\nn02747177\nn07248320\nn02328150\nn02963159\nn02117135\nn03676483\nn06596364\nn01775062\nn03724870\nn03347037\nn13133613\nn02319095\nn03944341\nn02088238\nn02110185\nn01443537\nn06794110\nn02606052\nn02113186\nn02704792\nn03692522\nn03018349\nn02095314\nn04523525\nn02356798\nn04228054\nn02108000\nn04371430\nn01770393\nn04456115\nn02110958\nn01631663\nn02708093\nn02835271\nn02807133\nn02280649\nn02277742\nn03857828\nn03452741\nn03388043\nn06596364\nn04252225\nn04458633\nn01689811\nn03935335\nn01560419\nn02500267\nn02319095\nn02412080\nn02096437\nn03814639\nn03494278\nn01518878\nn02486261\nn01629819\nn04606251\nn03787032\nn01877812\nn01773157\nn02104365\nn02113978\nn02123394\nn02966687\nn01728920\nn02916936\nn01860187\nn03255030\nn02011460\nn02087394\nn02817516\nn02085620\nn02437616\nn02606052\nn03447721\nn01773157\nn02497673\nn04380533\nn02056570\nn01917289\nn12267677\nn04325704\nn02130308\nn02730930\nn03933933\nn02981792\nn07892512\nn02112018\nn02398521\nn02009912\nn02002724\nn02086079\nn02100236\nn03085013\nn02837789\nn02018795\nn02106382\nn02489166\nn03937543\nn02910353\nn07836838\nn15075141\nn02877765\nn03602883\nn02233338\nn13037406\nn01580077\nn04069434\nn04371774\nn03938244\nn02326432\nn03085013\nn0280
4610\nn04141975\nn02484975\nn02930766\nn03000134\nn02488702\nn02113023\nn02088632\nn02783161\nn02490219\nn04505470\nn02123394\nn04357314\nn02825657\nn02493509\nn03720891\nn03673027\nn03492542\nn01739381\nn02105056\nn03481172\nn03947888\nn02099601\nn02105505\nn01514859\nn07871810\nn03445924\nn12267677\nn04536866\nn03314780\nn12768682\nn02028035\nn01980166\nn02099601\nn01981276\nn07730033\nn02909870\nn04179913\nn02089973\nn02111277\nn12057211\nn01632458\nn02123394\nn04350905\nn03937543\nn02730930\nn01795545\nn02091244\nn01632777\nn03584829\nn03709823\nn02086646\nn01824575\nn03977966\nn03417042\nn02892201\nn01806143\nn02105855\nn02115913\nn03902125\nn01774384\nn07880968\nn02112137\nn09428293\nn04116512\nn02486410\nn03930630\nn04090263\nn01843383\nn07802026\nn04429376\nn02317335\nn02027492\nn01818515\nn02086646\nn02018207\nn04371430\nn03347037\nn03014705\nn04125021\nn03764736\nn02981792\nn02114367\nn04192698\nn04330267\nn03729826\nn02607072\nn02504458\nn03769881\nn02018207\nn03929855\nn04591157\nn03947888\nn04317175\nn03125729\nn01749939\nn04399382\nn02276258\nn03598930\nn02606052\nn03089624\nn02099601\nn03770439\nn02655020\nn07745940\nn02095314\nn04336792\nn04033995\nn02112018\nn02132136\nn02860847\nn03100240\nn02966687\nn02111129\nn04273569\nn04149813\nn02092002\nn03769881\nn04599235\nn03825788\nn04118776\nn04336792\nn02115641\nn01622779\nn02909870\nn02276258\nn02977058\nn02326432\nn01608432\nn03347037\nn02978881\nn02787622\nn02093256\nn02101556\nn02100735\nn02085782\nn02342885\nn03733281\nn02085782\nn03706229\nn02002724\nn13037406\nn02422106\nn07614500\nn02113712\nn04336792\nn02486261\nn02356798\nn02268443\nn04179913\nn04277352\nn02346627\nn03089624\nn02835271\nn02086240\nn04579432\nn03180011\nn04285008\nn02408429\nn04392985\nn02091244\nn02815834\nn02834397\nn04009552\nn02488291\nn03290653\nn03325584\nn03637318\nn02730930\nn02865351\nn02119789\nn03929855\nn03676483\nn04423845\nn03874293\nn03908618\nn03598930\nn02090379\nn01944390\nn04152593\nn09288635\nn02066245\nn01
768244\nn03272010\nn01531178\nn03255030\nn03676483\nn02002556\nn02749479\nn02415577\nn02403003\nn07565083\nn02981792\nn01776313\nn02097474\nn02667093\nn02096177\nn03255030\nn01819313\nn02791124\nn02279972\nn04090263\nn09193705\nn04335435\nn03733131\nn03250847\nn04263257\nn02096585\nn03976467\nn02963159\nn04613696\nn04310018\nn02107574\nn03724870\nn09428293\nn02101006\nn04372370\nn03930630\nn07584110\nn01735189\nn04599235\nn02835271\nn04330267\nn02108915\nn02110185\nn07684084\nn04204347\nn02672831\nn03742115\nn04131690\nn09428293\nn04487394\nn03710193\nn09332890\nn03478589\nn04486054\nn02951358\nn09428293\nn04596742\nn01872401\nn04505470\nn04154565\nn02666196\nn02437616\nn03724870\nn02120079\nn01828970\nn03141823\nn01698640\nn03095699\nn04099969\nn02123045\nn04482393\nn04026417\nn02110806\nn04033901\nn04041544\nn02869837\nn04136333\nn02112350\nn03388043\nn03065424\nn02128757\nn04330267\nn02879718\nn02859443\nn01968897\nn01847000\nn01871265\nn02129165\nn02408429\nn04263257\nn13054560\nn02090379\nn04553703\nn03929660\nn01990800\nn03494278\nn01514859\nn02804610\nn01773157\nn02087046\nn07802026\nn03777754\nn07720875\nn01694178\nn06794110\nn02795169\nn07583066\nn02094114\nn03841143\nn01985128\nn03776460\nn02859443\nn02808304\nn02092339\nn02441942\nn02002724\nn04296562\nn02086910\nn02690373\nn01616318\nn07718472\nn02086240\nn04049303\nn04235860\nn06359193\nn02110958\nn01518878\nn02950826\nn03447721\nn02111129\nn04517823\nn03769881\nn02112350\nn07693725\nn07747607\nn02444819\nn02109047\nn04485082\nn10148035\nn03127925\nn04328186\nn03347037\nn02102480\nn07614500\nn02676566\nn04599235\nn03534580\nn02093256\nn03710721\nn02167151\nn04116512\nn04141975\nn03877472\nn02092339\nn03042490\nn04604644\nn03355925\nn04009552\nn03598930\nn02672831\nn03425413\nn03649909\nn02099429\nn01819313\nn02640242\nn02978881\nn03670208\nn02342885\nn03888257\nn03729826\nn02457408\nn02860847\nn09246464\nn02097298\nn03649909\nn04228054\nn02113624\nn01978287\nn03895866\nn03393912\nn03127925\nn03720891\nn
01774384\nn04065272\nn03485407\nn04033901\nn02488291\nn12057211\nn01774750\nn01798484\nn01537544\nn07720875\nn03838899\nn04120489\nn02264363\nn02113978\nn02799071\nn02114367\nn04332243\nn03062245\nn02077923\nn02398521\nn04435653\nn01692333\nn07831146\nn04523525\nn02342885\nn07753275\nn01807496\nn02098413\nn01744401\nn07836838\nn02104029\nn02092339\nn02092339\nn02115913\nn01608432\nn03325584\nn02066245\nn03345487\nn03394916\nn01773797\nn02113186\nn02667093\nn02124075\nn04118538\nn02134084\nn02317335\nn03047690\nn03938244\nn02219486\nn07718747\nn02490219\nn04326547\nn02690373\nn07717556\nn01580077\nn02443484\nn04443257\nn04033995\nn07590611\nn02403003\nn07768694\nn03803284\nn04371774\nn02802426\nn06794110\nn04483307\nn02791270\nn02028035\nn03764736\nn07860988\nn09421951\nn03773504\nn04152593\nn04367480\nn02950826\nn02168699\nn04458633\nn01983481\nn04404412\nn04252225\nn04596742\nn02480495\nn02281787\nn01795545\nn02089867\nn02169497\nn02666196\nn04311004\nn02879718\nn03457902\nn02074367\nn03297495\nn02481823\nn04485082\nn02091244\nn07718747\nn02102480\nn04147183\nn03014705\nn02814860\nn04532670\nn02094114\nn01532829\nn01664065\nn04090263\nn03995372\nn03134739\nn06596364\nn03710637\nn01807496\nn02096294\nn04026417\nn02165105\nn03998194\nn02112706\nn04366367\nn02177972\nn04152593\nn04442312\nn01697457\nn03775071\nn07892512\nn02091831\nn02101388\nn01749939\nn03384352\nn02484975\nn03868242\nn01753488\nn02687172\nn02807133\nn02231487\nn02018795\nn04270147\nn03063599\nn04591713\nn03895866\nn03481172\nn04456115\nn01755581\nn02319095\nn02526121\nn01796340\nn02094433\nn01558993\nn04238763\nn03127925\nn03017168\nn02692877\nn04179913\nn02791124\nn03494278\nn06596364\nn01751748\nn02074367\nn03249569\nn04357314\nn07579787\nn04550184\nn06596364\nn03761084\nn07718472\nn03376595\nn04428191\nn01773157\nn07248320\nn03400231\nn04447861\nn03854065\nn01694178\nn02111500\nn04111531\nn02090622\nn03450230\nn04536866\nn01817953\nn02843684\nn03776460\nn04201297\nn04204238\nn02094114\nn04238763\
nn01667114\nn02116738\nn03709823\nn04153751\nn02422699\nn01796340\nn07836838\nn02027492\nn03478589\nn01689811\nn02110958\nn03538406\nn03207743\nn01669191\nn06794110\nn02087394\nn01641577\nn07873807\nn03314780\nn04591157\nn02487347\nn04277352\nn07749582\nn03792782\nn03947888\nn03792782\nn01669191\nn02102318\nn03788365\nn03899768\nn04392985\nn01629819\nn04557648\nn02640242\nn02325366\nn07749582\nn04264628\nn04487081\nn02978881\nn03720891\nn01494475\nn02951358\nn01828970\nn04286575\nn04540053\nn04332243\nn04367480\nn03840681\nn02106662\nn03376595\nn02113186\nn03085013\nn09246464\nn03127747\nn04367480\nn03290653\nn07760859\nn02102973\nn03290653\nn01751748\nn02089973\nn02086910\nn02112350\nn03272562\nn04456115\nn03785016\nn02110341\nn01728920\nn04554684\nn02417914\nn01756291\nn03590841\nn01877812\nn02113186\nn02093256\nn02099849\nn02397096\nn03642806\nn02231487\nn04179913\nn02012849\nn02279972\nn04447861\nn04355933\nn01560419\nn02445715\nn03770679\nn03929855\nn01688243\nn06596364\nn07930864\nn01945685\nn01631663\nn03216828\nn03995372\nn02782093\nn01860187\nn04443257\nn04579432\nn07745940\nn04146614\nn02177972\nn04392985\nn01644373\nn02317335\nn04553703\nn02138441\nn13040303\nn01985128\nn02134418\nn01945685\nn02526121\nn02317335\nn01820546\nn04501370\nn01560419\nn02268443\nn03796401\nn03916031\nn02992211\nn03127747\nn03180011\nn02102480\nn04277352\nn01776313\nn03017168\nn02111129\nn02190166\nn02098413\nn02090721\nn01776313\nn09421951\nn02113023\nn02672831\nn03764736\nn04146614\nn03347037\nn03868242\nn02667093\nn02093647\nn02169497\nn02089973\nn07747607\nn02085782\nn02815834\nn02105412\nn02086910\nn04204238\nn03530642\nn07583066\nn04039381\nn02965783\nn04501370\nn04086273\nn04263257\nn02443484\nn04162706\nn07613480\nn04525038\nn04266014\nn03721384\nn04467665\nn04523525\nn04162706\nn02025239\nn04146614\nn01677366\nn04179913\nn04125021\nn02917067\nn04392985\nn04550184\nn02090721\nn03796401\nn03014705\nn04344873\nn02091635\nn01608432\nn03690938\nn04141975\nn01629819\nn0452352
5\nn01955084\nn01756291\nn04443257\nn02927161\nn07880968\nn07836838\nn02484975\nn02091032\nn07714571\nn03535780\nn04149813\nn09468604\nn02033041\nn03584254\nn04550184\nn03887697\nn03838899\nn02174001\nn03272010\nn03297495\nn04074963\nn03649909\nn03496892\nn03467068\nn02268853\nn03400231\nn02093256\nn04367480\nn02091134\nn04118776\nn02086646\nn07753592\nn02504013\nn02104365\nn02096177\nn03961711\nn04069434\nn03376595\nn01817953\nn01955084\nn02107142\nn03344393\nn03709823\nn02974003\nn02090379\nn04332243\nn03125729\nn03935335\nn02814860\nn01860187\nn03220513\nn02094114\nn03877472\nn02009912\nn02108000\nn02229544\nn03697007\nn03124170\nn02206856\nn03841143\nn04153751\nn01742172\nn13133613\nn04525305\nn01930112\nn02795169\nn02233338\nn02417914\nn03935335\nn01770393\nn02125311\nn03482405\nn04604644\nn02009912\nn03791053\nn03223299\nn03032252\nn04501370\nn03372029\nn03485794\nn02110341\nn04200800\nn02106166\nn04592741\nn02950826\nn04041544\nn07831146\nn04116512\nn01514859\nn03868242\nn03026506\nn02443484\nn02701002\nn04116512\nn02815834\nn03929855\nn03676483\nn01534433\nn02701002\nn02113978\nn04371430\nn03991062\nn07718472\nn02268853\nn04264628\nn02098105\nn07565083\nn02112706\nn02094114\nn02093991\nn02488291\nn02093859\nn03047690\nn01682714\nn07717410\nn01883070\nn04562935\nn01498041\nn07745940\nn02109525\nn01644900\nn01694178\nn03063689\nn02894605\nn01682714\nn03544143\nn02101556\nn02966687\nn03485407\nn03657121\nn02236044\nn07860988\nn01677366\nn07718747\nn02690373\nn04099969\nn03814639\nn02098413\nn01985128\nn02093647\nn02504458\nn01944390\nn03445924\nn03866082\nn03355925\nn02105855\nn03041632\nn03791053\nn03954731\nn07695742\nn02102040\nn03956157\nn03983396\nn02105855\nn03249569\nn03976467\nn03843555\nn02641379\nn03272562\nn03658185\nn03976467\nn02398521\nn03791053\nn03065424\nn03759954\nn03216828\nn03796401\nn01980166\nn09193705\nn01773797\nn02129604\nn04009552\nn02980441\nn03188531\nn02100735\nn07860988\nn03929855\nn04037443\nn03467068\nn02094114\nn03899768\nn04525
038\nn02074367\nn04033901\nn02012849\nn02009229\nn02109961\nn03804744\nn02396427\nn02233338\nn03240683\nn03393912\nn03777568\nn02494079\nn02106662\nn04033995\nn02231487\nn04355338\nn04550184\nn02699494\nn04118538\nn03388043\nn02869837\nn02097047\nn03063689\nn01530575\nn02091032\nn03042490\nn03930313\nn02264363\nn02442845\nn02325366\nn01883070\nn01614925\nn03447721\nn03444034\nn02979186\nn02815834\nn02123394\nn03250847\nn02883205\nn04554684\nn03047690\nn01773157\nn02172182\nn03249569\nn04613696\nn03692522\nn04044716\nn12985857\nn02342885\nn03425413\nn02895154\nn01704323\nn01560419\nn02974003\nn07695742\nn03016953\nn03729826\nn03250847\nn02927161\nn02091635\nn01990800\nn02980441\nn02676566\nn02114548\nn02422699\nn04208210\nn02109961\nn04332243\nn04127249\nn03871628\nn02391049\nn01537544\nn02124075\nn02422106\nn01775062\nn03188531\nn02443114\nn01694178\nn03063689\nn02088364\nn04476259\nn04442312\nn03792972\nn07831146\nn02483708\nn04346328\nn04591713\nn03794056\nn04153751\nn03782006\nn02058221\nn04162706\nn04522168\nn03673027\nn04483307\nn03691459\nn03478589\nn02102318\nn07749582\nn07730033\nn01829413\nn01729977\nn04501370\nn09472597\nn03781244\nn02134084\nn01742172\nn03782006\nn04553703\nn09835506\nn03804744\nn02088238\nn04067472\nn03764736\nn02992529\nn03874599\nn03124043\nn04065272\nn02782093\nn03788195\nn04389033\nn03673027\nn04389033\nn03775071\nn07753113\nn12144580\nn02013706\nn02190166\nn04275548\nn03250847\nn03947888\nn01729977\nn02138441\nn04264628\nn03967562\nn03445924\nn04355338\nn02640242\nn01440764\nn12267677\nn02489166\nn02165105\nn03599486\nn03272010\nn02018207\nn02747177\nn04487081\nn02119789\nn02666196\nn02606052\nn02086646\nn04040759\nn01984695\nn12998815\nn01751748\nn04584207\nn04149813\nn01981276\nn02841315\nn03777754\nn04376876\nn02859443\nn04389033\nn01665541\nn04208210\nn04041544\nn02071294\nn13052670\nn01616318\nn03871628\nn02028035\nn03110669\nn01819313\nn04229816\nn02769748\nn03832673\nn02095889\nn01806143\nn02708093\nn07753113\nn02804610\nn028
79718\nn03595614\nn02769748\nn07802026\nn04357314\nn09288635\nn07753592\nn04525038\nn04590129\nn01981276\nn01530575\nn02006656\nn03903868\nn02095570\nn03602883\nn03476991\nn04328186\nn03617480\nn03272562\nn02328150\nn04536866\nn02814860\nn03710193\nn04263257\nn02699494\nn04418357\nn01496331\nn02086079\nn03495258\nn03417042\nn03065424\nn03041632\nn04467665\nn02085936\nn03956157\nn02110341\nn07760859\nn03467068\nn02825657\nn02669723\nn07579787\nn02097658\nn03717622\nn03590841\nn02268443\nn07697313\nn02859443\nn01622779\nn02999410\nn01877812\nn01744401\nn01669191\nn04507155\nn02108000\nn10148035\nn04009552\nn09421951\nn03457902\nn02091032\nn03759954\nn01443537\nn02011460\nn01984695\nn02791270\nn03617480\nn02089973\nn02105641\nn03595614\nn03207941\nn03146219\nn04367480\nn07695742\nn03376595\nn09835506\nn02342885\nn03393912\nn04311004\nn04589890\nn02114367\nn02104029\nn01945685\nn02094114\nn01824575\nn04380533\nn02025239\nn03218198\nn02110627\nn04026417\nn02749479\nn07613480\nn02437312\nn03347037\nn02403003\nn03942813\nn03450230\nn04252225\nn02108000\nn03837869\nn02165105\nn03000247\nn04344873\nn02504458\nn02110185\nn01498041\nn04270147\nn04239074\nn03924679\nn02086646\nn09835506\nn03424325\nn04370456\nn03777754\nn03529860\nn02102040\nn01688243\nn02110627\nn02100735\nn02102177\nn04086273\nn01883070\nn04366367\nn02107574\nn02102480\nn04008634\nn02169497\nn04141327\nn02442845\nn03662601\nn01855032\nn04589890\nn02018795\nn03271574\nn02097298\nn03445777\nn02102040\nn03617480\nn02108422\nn02097474\nn02109525\nn02097474\nn11879895\nn03223299\nn02100583\nn03840681\nn02091032\nn01843065\nn03769881\nn02091467\nn02134418\nn02109047\nn04456115\nn03866082\nn04239074\nn02484975\nn04259630\nn07760859\nn09246464\nn01484850\nn02443114\nn04251144\nn03843555\nn04131690\nn07716906\nn03584254\nn04033901\nn04146614\nn03633091\nn13037406\nn04254680\nn07583066\nn03483316\nn02056570\nn02102177\nn04355338\nn01669191\nn04039381\nn01532829\nn02978881\nn03691459\nn04118776\nn02672831\nn06785654\nn0
7749582\nn02536864\nn02116738\nn04239074\nn02483708\nn03124170\nn07930864\nn02018207\nn04074963\nn01514859\nn02089867\nn03804744\nn04116512\nn02802426\nn03627232\nn03787032\nn02281406\nn07613480\nn02526121\nn02860847\nn01806143\nn03706229\nn03982430\nn04009552\nn01616318\nn01828970\nn03920288\nn03680355\nn02727426\nn02963159\nn02102973\nn04209133\nn01798484\nn02190166\nn02091635\nn02089078\nn04371774\nn04515003\nn02655020\nn02104029\nn01877812\nn02794156\nn02974003\nn02096585\nn04525305\nn02672831\nn02113712\nn02917067\nn02096437\nn07745940\nn02326432\nn03314780\nn02236044\nn02102973\nn02093428\nn03297495\nn03676483\nn03775071\nn04536866\nn04554684\nn03400231\nn04346328\nn01530575\nn04133789\nn03160309\nn01930112\nn03494278\nn03063599\nn03891332\nn04476259\nn02410509\nn03417042\nn07753113\nn03498962\nn03991062\nn04086273\nn01739381\nn07753275\nn03065424\nn03476991\nn07565083\nn01608432\nn04258138\nn03803284\nn02120079\nn02454379\nn01537544\nn02492035\nn02219486\nn01735189\nn03594734\nn02442845\nn04485082\nn03599486\nn02086079\nn03995372\nn04501370\nn02113712\nn02102480\nn03599486\nn04162706\nn03868242\nn04209133\nn02791124\nn01819313\nn02116738\nn02894605\nn03764736\nn03476684\nn02123159\nn02325366\nn03457902\nn02123597\nn09399592\nn02488291\nn03788365\nn01770081\nn01498041\nn02110341\nn02834397\nn02391049\nn02113023\nn02099712\nn01739381\nn02980441\nn02027492\nn03208938\nn07734744\nn02027492\nn02108000\nn03902125\nn04044716\nn09428293\nn01981276\nn02869837\nn03425413\nn03085013\nn03804744\nn02443114\nn01983481\nn02088466\nn02077923\nn01740131\nn09468604\nn02783161\nn03888257\nn02797295\nn04252225\nn01622779\nn01669191\nn03710637\nn01669191\nn01983481\nn02108422\nn04111531\nn04179913\nn04204238\nn04389033\nn02087046\nn01872401\nn02692877\nn01632777\nn02640242\nn02927161\nn02814860\nn03792972\nn04039381\nn02480855\nn03599486\nn04326547\nn03691459\nn04592741\nn03014705\nn01582220\nn13052670\nn02802426\nn01797886\nn04263257\nn04350905\nn03372029\nn02484975\nn09428293\n
n03887697\nn02112350\nn03110669\nn02910353\nn02096294\nn02102177\nn02115913\nn02804610\nn04239074\nn04005630\nn04118538\nn04067472\nn02128757\nn02097658\nn02099849\nn01882714\nn02494079\nn03379051\nn02808440\nn04392985\nn02114548\nn02206856\nn03976657\nn01729322\nn07831146\nn01883070\nn02361337\nn02128757\nn02097130\nn04447861\nn13052670\nn02096177\nn03691459\nn02134084\nn02494079\nn03642806\nn04136333\nn02268853\nn02417914\nn03891332\nn09246464\nn03032252\nn02825657\nn03498962\nn03160309\nn04026417\nn04296562\nn03534580\nn03216828\nn07880968\nn03393912\nn02948072\nn04560804\nn04152593\nn04509417\nn03884397\nn02129604\nn01944390\nn04310018\nn04086273\nn07584110\nn04258138\nn04264628\nn13040303\nn02109525\nn04462240\nn02791270\nn03384352\nn04070727\nn02108422\nn03485407\nn02093647\nn03000134\nn03089624\nn07615774\nn03956157\nn02776631\nn01729977\nn03868242\nn03899768\nn01871265\nn03180011\nn03630383\nn01968897\nn02939185\nn02097474\nn04154565\nn04462240\nn02028035\nn04041544\nn02111129\nn03026506\nn04389033\nn02808440\nn03124170\nn02129165\nn02776631\nn04259630\nn03902125\nn07760859\nn01744401\nn02128757\nn02843684\nn02091134\nn02256656\nn03814639\nn02666196\nn02497673\nn13054560\nn01914609\nn01580077\nn02089867\nn03630383\nn02025239\nn02123597\nn02807133\nn03673027\nn04317175\nn15075141\nn01795545\nn03888257\nn03062245\nn04209133\nn01531178\nn02410509\nn04162706\nn03814639\nn02102177\nn04399382\nn03220513\nn06874185\nn04152593\nn07880968\nn02066245\nn01735189\nn03271574\nn01592084\nn04355933\nn02085936\nn01978455\nn04597913\nn07871810\nn02093859\nn01773549\nn03126707\nn03452741\nn02027492\nn02408429\nn01985128\nn03670208\nn04458633\nn04273569\nn03785016\nn01751748\nn03188531\nn02917067\nn02086240\nn03770439\nn03240683\nn03920288\nn03954731\nn02109525\nn03016953\nn02107683\nn01665541\nn04310018\nn03485407\nn03187595\nn03814639\nn02095570\nn01968897\nn03874599\nn02493509\nn02130308\nn02749479\nn01945685\nn02536864\nn04154565\nn02328150\nn03908618\nn01737021\nn02408429
\nn02231487\nn04131690\nn03970156\nn01530575\nn04336792\nn02951358\nn02879718\nn03944341\nn03788195\nn02895154\nn03838899\nn02037110\nn04009552\nn03141823\nn02102973\nn07730033\nn01984695\nn07693725\nn04065272\nn01631663\nn02699494\nn03095699\nn02112350\nn04019541\nn09835506\nn01484850\nn07697313\nn01729322\nn03085013\nn04041544\nn02396427\nn02879718\nn03891332\nn04590129\nn03271574\nn02454379\nn01944390\nn02099267\nn02097658\nn07720875\nn02484975\nn03733805\nn02086240\nn04204238\nn03483316\nn03201208\nn02095570\nn01630670\nn03201208\nn01755581\nn02879718\nn03065424\nn02037110\nn02108915\nn02807133\nn04023962\nn01669191\nn02098286\nn04252225\nn02115641\nn02281787\nn06794110\nn02391049\nn04486054\nn01817953\nn04041544\nn04277352\nn02107574\nn09193705\nn04371774\nn04372370\nn03724870\nn03388183\nn04371430\nn02788148\nn01817953\nn02699494\nn07730033\nn09468604\nn04254777\nn04501370\nn03637318\nn02782093\nn04152593\nn01882714\nn02916936\nn03661043\nn04336792\nn02422699\nn04019541\nn01664065\nn03325584\nn03976657\nn04423845\nn04404412\nn03527444\nn02123045\nn02094114\nn01558993\nn03062245\nn02113712\nn03662601\nn03065424\nn03388183\nn03447721\nn01667778\nn03584254\nn03000247\nn07718747\nn01737021\nn02676566\nn01795545\nn07860988\nn04086273\nn04332243\nn03447721\nn01829413\nn02236044\nn02165105\nn01796340\nn02092339\nn01443537\nn04370456\nn03961711\nn07579787\nn01753488\nn02708093\nn02111277\nn01774750\nn04286575\nn02483708\nn02002724\nn02536864\nn03400231\nn03485794\nn02480495\nn02509815\nn04111531\nn07716358\nn01968897\nn04579145\nn02892201\nn02091134\nn04118776\nn03249569\nn01601694\nn04522168\nn02441942\nn03271574\nn02692877\nn03930313\nn02100735\nn04428191\nn03706229\nn02119789\nn02111277\nn01629819\nn04476259\nn03958227\nn03240683\nn02504458\nn04461696\nn09229709\nn01728920\nn02422106\nn03450230\nn02268853\nn03902125\nn03868863\nn09428293\nn04482393\nn03680355\nn01744401\nn12620546\nn02002556\nn04136333\nn02447366\nn02226429\nn03249569\nn02281406\nn03721384\nn038745
57121\nn02879718\nn02119789\nn03947888\nn02342885\nn04152593\nn04370456\nn03032252\nn07880968\nn04328186\nn02107574\nn02017213\nn01945685\nn04550184\nn01514859\nn04479046\nn07695742\nn03481172\nn07747607\nn02437312\nn03742115\nn01924916\nn01608432\nn04584207\nn02825657\nn12144580\nn01689811\nn04228054\nn02113624\nn07697313\nn04367480\nn04026417\nn01616318\nn02643566\nn04228054\nn01443537\nn04252077\nn01734418\nn02490219\nn02814533\nn01796340\nn03160309\nn04355933\nn03666591\nn02443114\nn03595614\nn02948072\nn03786901\nn04380533\nn01824575\nn02018207\nn02111500\nn03188531\nn03417042\nn13037406\nn02869837\nn03627232\nn07716906\nn02130308\nn02422106\nn03544143\nn02108551\nn03314780\nn01694178\nn02437312\nn02978881\nn04243546\nn02823428\nn03916031\nn01616318\nn01496331\nn15075141\nn02071294\nn03095699\nn04525305\nn02483362\nn02109047\nn02930766\nn03792972\nn04507155\nn02091032\nn01744401\nn03929660\nn01632458\nn02090622\nn13037406\nn01580077\nn03028079\nn04366367\nn03000247\nn02088094\nn04376876\nn02110341\nn03983396\nn02791124\nn02977058\nn03384352\nn03042490\nn02643566\nn04522168\nn02804414\nn07760859\nn02445715\nn01728920\nn04285008\nn01697457\nn03961711\nn03134739\nn01882714\nn07716358\nn02364673\nn02536864\nn07880968\nn03662601\nn02699494\nn04133789\nn04141076\nn04366367\nn02892201\nn02100877\nn01695060\nn07747607\nn02971356\nn02804414\nn01665541\nn02422699\nn03065424\nn07693725\nn04336792\nn07932039\nn04311174\nn07715103\nn02268853\nn02096585\nn01981276\nn04133789\nn02814860\nn03388183\nn01631663\nn02447366\nn01560419\nn02319095\nn04370456\nn04152593\nn02939185\nn01534433\nn02909870\nn01537544\nn07565083\nn02106030\nn01630670\nn02837789\nn03633091\nn01614925\nn13052670\nn02104029\nn02877765\nn02106166\nn02011460\nn03590841\nn02130308\nn01968897\nn02397096\nn02966193\nn02129165\nn03393912\nn03133878\nn03743016\nn03947888\nn02133161\nn02102480\nn02457408\nn02111889\nn02364673\nn02980441\nn02138441\nn03908714\nn04599235\nn03220513\nn01729977\nn02808304\nn03223299\nn0
3444034\nn03538406\nn03384352\nn02607072\nn07684084\nn07697537\nn07565083\nn02939185\nn04483307\nn01843065\nn03272010\nn04370456\nn03627232\nn03259280\nn01698640\nn01775062\nn02769748\nn04428191\nn04326547\nn02090721\nn02051845\nn03124170\nn02422106\nn02134418\nn09399592\nn03447721\nn04090263\nn04584207\nn03884397\nn02356798\nn02105641\nn03786901\nn02835271\nn02090379\nn03379051\nn04389033\nn01847000\nn02125311\nn02089078\nn01498041\nn01749939\nn02102177\nn04023962\nn03788365\nn02127052\nn04326547\nn01641577\nn02484975\nn07768694\nn03777754\nn04487394\nn07873807\nn02089078\nn02112137\nn03733281\nn04141975\nn02105251\nn04040759\nn13052670\nn07684084\nn03179701\nn03804744\nn03127747\nn01748264\nn02408429\nn03126707\nn03595614\nn04235860\nn02117135\nn03938244\nn02497673\nn03425413\nn04192698\nn03980874\nn01774384\nn04591157\nn02403003\nn01729322\nn02834397\nn03527444\nn03763968\nn04120489\nn02100735\nn01955084\nn02483362\nn02510455\nn01817953\nn03868242\nn02483362\nn04418357\nn01968897\nn03691459\nn01882714\nn02883205\nn01829413\nn02870880\nn02396427\nn01843383\nn10148035\nn02699494\nn01580077\nn04238763\nn03496892\nn07684084\nn02950826\nn03445777\nn01798484\nn03877845\nn04239074\nn01622779\nn02099712\nn02837789\nn07730033\nn09835506\nn04532106\nn03976467\nn03854065\nn01756291\nn07892512\nn15075141\nn02971356\nn02113023\nn04023962\nn02108551\nn02002724\nn09288635\nn03457902\nn03124170\nn01484850\nn04548362\nn03201208\nn01734418\nn02090622\nn03929660\nn03868863\nn02480855\nn02028035\nn01692333\nn02206856\nn03970156\nn07768694\nn04376876\nn02089973\nn03976467\nn03134739\nn03788195\nn04399382\nn04023962\nn03393912\nn12620546\nn03085013\nn02277742\nn03272562\nn01698640\nn04039381\nn02877765\nn03680355\nn01873310\nn04039381\nn02980441\nn04376876\nn01729322\nn02795169\nn01530575\nn04515003\nn02794156\nn02165105\nn03594945\nn02093991\nn02256656\nn02105412\nn03216828\nn02110806\nn03297495\nn02112137\nn03710721\nn02110185\nn09421951\nn02480855\nn04336792\nn02510455\nn02087046\n
n02110627\nn04005630\nn02536864\nn04277352\nn01774750\nn02667093\nn04554684\nn02823750\nn03196217\nn01496331\nn01855032\nn02128757\nn03764736\nn02981792\nn03876231\nn04458633\nn03888257\nn01860187\nn04326547\nn09421951\nn07880968\nn02500267\nn01770081\nn03584254\nn07711569\nn09468604\nn01614925\nn03788365\nn04560804\nn01729977\nn03717622\nn02410509\nn02437312\nn03000684\nn01632777\nn02028035\nn07873807\nn01630670\nn03388183\nn02110185\nn02098413\nn02107142\nn04209133\nn07932039\nn03992509\nn04612504\nn01986214\nn04270147\nn06874185\nn02909870\nn02168699\nn03785016\nn01532829\nn04264628\nn02484975\nn02799071\nn04209133\nn07584110\nn01560419\nn02117135\nn07684084\nn03814906\nn03908618\nn02279972\nn02098413\nn02097658\nn04154565\nn02125311\nn02018795\nn02168699\nn02096177\nn03047690\nn02747177\nn03788365\nn02128385\nn03000134\nn03775546\nn04204238\nn04604644\nn03980874\nn03598930\nn01855672\nn02090721\nn07715103\nn02443114\nn02102177\nn04258138\nn04591713\nn03297495\nn01667778\nn04350905\nn04589890\nn06794110\nn03884397\nn04367480\nn03877845\nn10148035\nn03492542\nn04116512\nn03785016\nn01968897\nn02111889\nn04579432\nn03492542\nn02111277\nn03535780\nn03786901\nn02113799\nn04347754\nn03535780\nn02963159\nn03249569\nn03617480\nn04070727\nn02108000\nn03075370\nn03355925\nn04418357\nn02783161\nn02112137\nn03179701\nn02114367\nn02098286\nn02119022\nn03000684\nn01695060\nn15075141\nn02877765\nn02107683\nn03721384\nn02107142\nn02092339\nn02687172\nn02396427\nn01629819\nn03272010\nn10148035\nn04141076\nn04044716\nn04277352\nn02364673\nn04141975\nn01819313\nn03775546\nn03379051\nn01756291\nn03785016\nn04476259\nn04612504\nn01632777\nn03838899\nn02007558\nn01440764\nn02088094\nn01735189\nn02356798\nn02095889\nn09229709\nn02132136\nn02091635\nn07754684\nn03146219\nn03467068\nn03047690\nn02408429\nn02086910\nn02012849\nn04522168\nn01943899\nn12144580\nn01820546\nn01824575\nn01677366\nn03868242\nn03814639\nn02091635\nn04033901\nn02074367\nn04597913\nn07880968\nn01871265\nn03000684
\nn01983481\nn07753592\nn04235860\nn02229544\nn03814906\nn03527444\nn04532106\nn02447366\nn04179913\nn04116512\nn01631663\nn04037443\nn03947888\nn02708093\nn03874293\nn04612504\nn04589890\nn02097130\nn03089624\nn03670208\nn04579145\nn03344393\nn07614500\nn04462240\nn01751748\nn04201297\nn07802026\nn02795169\nn07613480\nn07747607\nn02115913\nn02493793\nn03770679\nn02268443\nn02009912\nn04423845\nn01530575\nn01685808\nn07715103\nn03016953\nn03355925\nn04554684\nn04366367\nn03207941\nn03887697\nn04336792\nn03759954\nn03595614\nn02480855\nn04525038\nn04355338\nn02129165\nn03255030\nn02843684\nn04493381\nn02992211\nn03814906\nn04239074\nn06794110\nn03977966\nn02979186\nn03207941\nn07875152\nn01798484\nn02484975\nn02127052\nn02133161\nn03929660\nn02966687\nn12985857\nn01873310\nn07584110\nn02088094\nn01748264\nn02101006\nn03450230\nn03657121\nn03991062\nn02013706\nn03742115\nn03595614\nn04591713\nn03891251\nn01943899\nn03065424\nn04127249\nn03584829\nn02018207\nn02089973\nn03773504\nn01751748\nn02119022\nn02276258\nn04086273\nn01877812\nn02917067\nn02168699\nn02107574\nn03954731\nn02443114\nn02101556\nn01943899\nn03457902\nn01644900\nn01770081\nn03495258\nn02606052\nn02109047\nn01532829\nn02099429\nn02100735\nn03216828\nn04204347\nn02095889\nn03794056\nn02104365\nn03595614\nn01630670\nn03223299\nn04389033\nn01796340\nn02098286\nn02109525\nn04509417\nn01580077\nn04209239\nn01675722\nn07718747\nn02787622\nn04553703\nn02100877\nn02708093\nn01687978\nn01944390\nn02807133\nn03908714\nn12620546\nn04009552\nn04591713\nn02112350\nn02168699\nn03773504\nn03127747\nn03393912\nn03617480\nn02704792\nn03590841\nn03445924\nn02486261\nn03803284\nn03954731\nn02971356\nn03000247\nn03887697\nn02894605\nn04286575\nn02172182\nn01873310\nn04118538\nn04357314\nn02113624\nn02667093\nn03141823\nn04423845\nn03742115\nn02085620\nn02727426\nn04606251\nn02088466\nn03109150\nn03134739\nn02361337\nn03832673\nn02087394\nn02177972\nn04347754\nn07718747\nn03710721\nn03970156\nn04229816\nn01601694\nn026060
52\nn03425413\nn03447447\nn04336792\nn04486054\nn04201297\nn07614500\nn02226429\nn01622779\nn04435653\nn09288635\nn02790996\nn02108000\nn03961711\nn03417042\nn03017168\nn03840681\nn02509815\nn04019541\nn01692333\nn01843065\nn03461385\nn04296562\nn02493509\nn03133878\nn02110627\nn07932039\nn02091831\nn03249569\nn02091467\nn03680355\nn07714990\nn02412080\nn03250847\nn03447721\nn02916936\nn02107683\nn02492035\nn03404251\nn02102177\nn07932039\nn04557648\nn04372370\nn03891251\nn02974003\nn15075141\nn02444819\nn04462240\nn02100236\nn02108551\nn04515003\nn02002556\nn02794156\nn04204238\nn04090263\nn04584207\nn02120505\nn03773504\nn02165456\nn07684084\nn04311174\nn02002556\nn02106382\nn01695060\nn02783161\nn02422699\nn03982430\nn02397096\nn03976657\nn02692877\nn03841143\nn03710637\nn04259630\nn02099601\nn03942813\nn12998815\nn11939491\nn04399382\nn03065424\nn01644373\nn04462240\nn03992509\nn03534580\nn02398521\nn02095889\nn02808440\nn04264628\nn02786058\nn04399382\nn03933933\nn04487081\nn01873310\nn04409515\nn02108089\nn02091831\nn07734744\nn04552348\nn04162706\nn02123045\nn13040303\nn02492035\nn03657121\nn02488291\nn02027492\nn02769748\nn07753113\nn03814639\nn01704323\nn02276258\nn04557648\nn03478589\nn04435653\nn03535780\nn04371774\nn02823750\nn02124075\nn07695742\nn03337140\nn03884397\nn01917289\nn07720875\nn07742313\nn04019541\nn02130308\nn02102040\nn02104365\nn02963159\nn01687978\nn07754684\nn02328150\nn02791124\nn04286575\nn04606251\nn03814639\nn09246464\nn02009229\nn01665541\nn04399382\nn04429376\nn04033995\nn04238763\nn09256479\nn01632458\nn04004767\nn04111531\nn03710637\nn02107908\nn04008634\nn02106382\nn02086079\nn07871810\nn02105505\nn02013706\nn03733131\nn07875152\nn03376595\nn03594945\nn01776313\nn03016953\nn04243546\nn04252225\nn03709823\nn02939185\nn02107574\nn02097047\nn02109525\nn03916031\nn02116738\nn07579787\nn02018795\nn03967562\nn03075370\nn12998815\nn01818515\nn02190166\nn02701002\nn01685808\nn12267677\nn02107683\nn07695742\nn02085782\nn03692522\nn0208
6646\nn03623198\nn03534580\nn02133161\nn07584110\nn03980874\nn03710721\nn03838899\nn04311174\nn03976467\nn02966687\nn03785016\nn02097658\nn04442312\nn04380533\nn03042490\nn03982430\nn02510455\nn02408429\nn02093859\nn07718472\nn02086079\nn02834397\nn03670208\nn01728572\nn02444819\nn02091467\nn04325704\nn04332243\nn03223299\nn01734418\nn03496892\nn01697457\nn03884397\nn03483316\nn04285008\nn01795545\nn03220513\nn02007558\nn01532829\nn02236044\nn06596364\nn04111531\nn03032252\nn03814639\nn04317175\nn04033995\nn02086079\nn07684084\nn01829413\nn02128757\nn03983396\nn04487081\nn02190166\nn04523525\nn04328186\nn04116512\nn03450230\nn04228054\nn02102177\nn03873416\nn02488702\nn02226429\nn02018207\nn04044716\nn03394916\nn01818515\nn01910747\nn03584829\nn03240683\nn04133789\nn03095699\nn04325704\nn02606052\nn02102318\nn02106382\nn03424325\nn02906734\nn01818515\nn04548362\nn04086273\nn07590611\nn02033041\nn04501370\nn02486261\nn03793489\nn02974003\nn09428293\nn02088466\nn04355933\nn02113712\nn02777292\nn02490219\nn02105056\nn02071294\nn02655020\nn03425413\nn02808440\nn02493509\nn03384352\nn02108422\nn04350905\nn07695742\nn02077923\nn03476991\nn03857828\nn02494079\nn01440764\nn02277742\nn02509815\nn07730033\nn01774384\nn02951585\nn02892201\nn02488702\nn02782093\nn03854065\nn04517823\nn03467068\nn07920052\nn03180011\nn02111129\nn02361337\nn03544143\nn07717556\nn03291819\nn02110063\nn03825788\nn02110185\nn02108422\nn01744401\nn04204347\nn01744401\nn02086079\nn01773549\nn03498962\nn02979186\nn01694178\nn04265275\nn04371774\nn01669191\nn01582220\nn02128925\nn02747177\nn02108551\nn02105056\nn02107312\nn01532829\nn01698640\nn03661043\nn02834397\nn03956157\nn01739381\nn02500267\nn02317335\nn02951358\nn02105505\nn07718747\nn04192698\nn04536866\nn03710637\nn02346627\nn03476684\nn02086910\nn02747177\nn02096177\nn04548280\nn01630670\nn01682714\nn04275548\nn03538406\nn02113712\nn09421951\nn01560419\nn04252225\nn02423022\nn01697457\nn02389026\nn03595614\nn02415577\nn04004767\nn02672831\nn03
018349\nn03998194\nn03089624\nn04273569\nn02058221\nn03544143\nn02395406\nn03535780\nn03450230\nn03888605\nn13052670\nn01910747\nn01843065\nn03982430\nn03447721\nn01955084\nn01630670\nn03803284\nn02120079\nn03372029\nn02504458\nn03874599\nn02011460\nn02108089\nn03627232\nn02492660\nn04399382\nn02412080\nn03325584\nn03706229\nn02500267\nn02123159\nn04238763\nn02883205\nn13044778\nn07836838\nn02799071\nn01917289\nn04273569\nn04552348\nn01795545\nn02011460\nn03944341\nn02356798\nn04264628\nn02859443\nn02108915\nn02108422\nn04591713\nn02099849\nn07693725\nn01795545\nn04596742\nn03868242\nn03958227\nn02093991\nn03134739\nn01917289\nn02099712\nn03314780\nn11879895\nn10148035\nn02018795\nn02747177\nn04542943\nn03141823\nn02797295\nn01704323\nn02777292\nn02769748\nn04033995\nn01860187\nn02321529\nn01917289\nn03785016\nn03956157\nn03100240\nn04041544\nn02165105\nn03947888\nn03891251\nn03709823\nn02988304\nn02106030\nn02095570\nn02814860\nn03649909\nn03110669\nn02444819\nn04044716\nn04487394\nn02422106\nn04069434\nn02165456\nn02098105\nn02106382\nn02280649\nn02002556\nn01980166\nn02091032\nn09229709\nn03642806\nn03770679\nn02172182\nn07892512\nn01944390\nn04462240\nn02114548\nn02403003\nn03899768\nn09472597\nn03530642\nn02974003\nn02777292\nn02093428\nn01829413\nn02097298\nn01882714\nn01833805\nn03481172\nn02094114\nn03218198\nn02640242\nn02422699\nn03297495\nn04592741\nn01644373\nn02066245\nn03028079\nn04399382\nn03355925\nn03187595\nn02071294\nn01494475\nn02119789\nn02963159\nn03976657\nn03759954\nn02916936\nn02120079\nn03109150\nn04370456\nn02817516\nn01734418\nn02415577\nn03691459\nn04023962\nn02114712\nn03995372\nn06359193\nn01943899\nn01860187\nn02859443\nn02268443\nn02488702\nn03110669\nn03250847\nn02165105\nn02102480\nn03026506\nn04465501\nn03733131\nn01910747\nn04277352\nn03065424\nn01644900\nn02951358\nn04399382\nn02326432\nn03529860\nn03764736\nn02444819\nn02093256\nn02091134\nn02091635\nn11879895\nn03657121\nn04613696\nn03452741\nn04596742\nn02097474\nn02672831\nn
01968897\nn02486410\nn02488291\nn02356798\nn07749582\nn04033995\nn03000684\nn04428191\nn02089078\nn04005630\nn03476991\nn02817516\nn04371774\nn12144580\nn12144580\nn03950228\nn02009912\nn03425413\nn04141975\nn02790996\nn01818515\nn07583066\nn04116512\nn03417042\nn01739381\nn01944390\nn03447721\nn03891332\nn01689811\nn04081281\nn02892767\nn04590129\nn01632777\nn02086910\nn01742172\nn04579145\nn02814860\nn04458633\nn04487394\nn02088632\nn03942813\nn04162706\nn07613480\nn02098413\nn04037443\nn02457408\nn04461696\nn02110185\nn03887697\nn03344393\nn04336792\nn04209239\nn02480495\nn02102480\nn04040759\nn03372029\nn03017168\nn02087046\nn02110185\nn04131690\nn02133161\nn02749479\nn02092002\nn04612504\nn03388183\nn03417042\nn02168699\nn07248320\nn02012849\nn03791053\nn02027492\nn07768694\nn02115913\nn02093428\nn01630670\nn02226429\nn01514859\nn07716358\nn02860847\nn04041544\nn02105505\nn02107683\nn03394916\nn03384352\nn04536866\nn02107312\nn04487081\nn02447366\nn02113186\nn03777754\nn03496892\nn09421951\nn02097298\nn02112706\nn02128757\nn02169497\nn03933933\nn02109961\nn04254120\nn04562935\nn02457408\nn02093754\nn15075141\nn02788148\nn01751748\nn02837789\nn06359193\nn01630670\nn03908618\nn07754684\nn02013706\nn03680355\nn02788148\nn06794110\nn02102040\nn01496331\nn03482405\nn02107312\nn13054560\nn03843555\nn01644373\nn02894605\nn01818515\nn03899768\nn02134084\nn01692333\nn02948072\nn03743016\nn07583066\nn02279972\nn07760859\nn03868863\nn02422699\nn02825657\nn02480855\nn02226429\nn04033901\nn01817953\nn04285008\nn04550184\nn04476259\nn02100877\nn09835506\nn02410509\nn03207743\nn03877845\nn03947888\nn01774750\nn02641379\nn04584207\nn02481823\nn07768694\nn02130308\nn04147183\nn04596742\nn02395406\nn07754684\nn04252225\nn04118538\nn09256479\nn07742313\nn02769748\nn03888257\nn03658185\nn04067472\nn02481823\nn03255030\nn03903868\nn03124043\nn03874599\nn06596364\nn04355933\nn04613696\nn04357314\nn02814860\nn02099601\nn01806567\nn02396427\nn02106166\nn03769881\nn02113023\nn04146614\
nn02640242\nn02966193\nn02841315\nn02481823\nn03724870\nn03998194\nn04522168\nn02747177\nn02317335\nn04067472\nn02129165\nn07714571\nn03992509\nn03379051\nn04141975\nn02028035\nn02085936\nn04540053\nn02112137\nn03977966\nn03637318\nn03887697\nn09468604\nn03424325\nn04584207\nn01917289\nn07579787\nn03325584\nn01829413\nn04540053\nn03127925\nn01558993\nn02027492\nn03424325\nn03109150\nn06794110\nn01773797\nn03188531\nn02106382\nn03788365\nn02123159\nn01773797\nn02229544\nn02727426\nn02823428\nn02454379\nn02106030\nn01924916\nn12998815\nn04179913\nn04099969\nn07684084\nn03450230\nn04435653\nn02422106\nn03637318\nn03018349\nn04429376\nn03868863\nn02110806\nn02226429\nn02006656\nn03843555\nn06359193\nn01860187\nn01694178\nn02138441\nn03630383\nn04009552\nn02101006\nn03496892\nn03447721\nn07920052\nn07873807\nn01729977\nn03220513\nn01614925\nn02134084\nn03908618\nn03763968\nn03544143\nn02797295\nn04392985\nn01728920\nn03876231\nn03259280\nn03325584\nn04296562\nn02909870\nn02493793\nn02112706\nn02776631\nn02447366\nn01514859\nn03954731\nn03344393\nn04125021\nn03930630\nn04116512\nn02441942\nn03344393\nn02125311\nn02643566\nn03840681\nn02106662\nn03325584\nn07695742\nn01491361\nn03814906\nn03075370\nn02098286\nn02666196\nn07718472\nn02948072\nn01698640\nn03777754\nn07714571\nn01945685\nn03085013\nn03445777\nn04380533\nn01986214\nn03673027\nn03710193\nn02441942\nn01734418\nn02105412\nn03447447\nn04591157\nn02727426\nn04486054\nn02510455\nn03958227\nn01978455\nn04461696\nn03908618\nn04522168\nn02107908\nn07715103\nn04009552\nn03457902\nn03447447\nn01820546\nn02692877\nn03874599\nn02101388\nn02115641\nn03532672\nn03127925\nn04081281\nn02814533\nn02916936\nn02483708\nn02791124\nn04505470\nn04417672\nn03876231\nn01829413\nn09246464\nn01728920\nn02363005\nn07754684\nn07717556\nn03000247\nn01873310\nn02091635\nn07831146\nn02794156\nn03825788\nn03476991\nn04033901\nn02607072\nn02123394\nn03534580\nn01770081\nn02011460\nn02843684\nn02109525\nn03916031\nn04418357\nn03710637\nn0307537
0\nn01644900\nn04254680\nn07768694\nn04228054\nn04258138\nn04357314\nn07836838\nn03000134\nn04310018\nn03000134\nn02098413\nn02108000\nn04252077\nn02457408\nn04483307\nn02105505\nn03125729\nn02091467\nn03868242\nn02106166\nn03240683\nn02917067\nn02105056\nn04525305\nn01753488\nn02978881\nn03977966\nn02486261\nn04162706\nn02120079\nn03709823\nn03127747\nn02089973\nn03089624\nn03814906\nn01534433\nn04613696\nn03325584\nn04505470\nn03325584\nn02115641\nn03630383\nn01930112\nn04204238\nn03063689\nn02233338\nn03916031\nn02786058\nn02113799\nn03935335\nn04179913\nn03690938\nn02442845\nn01819313\nn01534433\nn01753488\nn02823750\nn01491361\nn03124043\nn01749939\nn02328150\nn03272562\nn02094258\nn04597913\nn01773549\nn03724870\nn01871265\nn01751748\nn04039381\nn03733805\nn02783161\nn02948072\nn02397096\nn02233338\nn02093647\nn03016953\nn04344873\nn02640242\nn01677366\nn02106166\nn07745940\nn03710637\nn03529860\nn02988304\nn04350905\nn02105056\nn01630670\nn12998815\nn02094258\nn03481172\nn04515003\nn04418357\nn03075370\nn04273569\nn01592084\nn03290653\nn04487394\nn02109047\nn02259212\nn04604644\nn03976467\nn04023962\nn02910353\nn03394916\nn02106662\nn01882714\nn03494278\nn01770393\nn03445924\nn02102177\nn02110958\nn02089973\nn01924916\nn02113799\nn01817953\nn02091134\nn01697457\nn03443371\nn04482393\nn01749939\nn01985128\nn04116512\nn03452741\nn03220513\nn02510455\nn03761084\nn02916936\nn02089867\nn02281406\nn03445777\nn03642806\nn03255030\nn09428293\nn01774750\nn03220513\nn04254777\nn13037406\nn04235860\nn07875152\nn01877812\nn02086240\nn03876231\nn02484975\nn03595614\nn03733805\nn02099712\nn03884397\nn03016953\nn02088632\nn04086273\nn02797295\nn04392985\nn03124043\nn02102480\nn02100583\nn01855032\nn02667093\nn01945685\nn03250847\nn01644373\nn04147183\nn02641379\nn02342885\nn03666591\nn03000134\nn03197337\nn02807133\nn03394916\nn01797886\nn02443114\nn02056570\nn02916936\nn04090263\nn01756291\nn03724870\nn02747177\nn04553703\nn01983481\nn04479046\nn07920052\nn01631663\nn01981
276\nn02097474\nn02268443\nn01944390\nn02108422\nn04487081\nn07734744\nn02091244\nn02835271\nn01824575\nn02056570\nn03773504\nn01688243\nn03345487\nn03345487\nn02486410\nn03271574\nn03485407\nn02483362\nn02113712\nn02786058\nn04579145\nn02948072\nn03595614\nn03594734\nn01491361\nn01729977\nn04033995\nn04597913\nn01871265\nn02992211\nn02361337\nn04070727\nn02007558\nn03110669\nn09399592\nn02009912\nn03249569\nn02415577\nn02190166\nn02701002\nn03042490\nn01871265\nn02091467\nn03208938\nn02105505\nn04589890\nn02138441\nn04591157\nn03344393\nn01622779\nn01924916\nn02137549\nn04328186\nn07590611\nn01776313\nn04389033\nn02058221\nn03786901\nn02865351\nn02536864\nn04154565\nn02108422\nn07583066\nn03770439\nn04235860\nn03594945\nn02096051\nn03590841\nn04525038\nn02264363\nn04592741\nn02364673\nn01735189\nn02977058\nn02488291\nn07871810\nn03062245\nn04557648\nn03837869\nn01770081\nn04273569\nn03290653\nn03124043\nn02971356\nn02423022\nn02094114\nn01695060\nn01917289\nn02814533\nn03250847\nn02110063\nn02666196\nn02488291\nn02504013\nn02130308\nn01695060\nn03089624\nn02906734\nn02791124\nn09835506\nn07695742\nn06874185\nn04229816\nn02408429\nn02087394\nn03297495\nn02058221\nn03763968\nn01491361\nn03781244\nn03873416\nn02111277\nn13052670\nn02119022\nn02108000\nn02791124\nn03028079\nn02906734\nn02112350\nn02102318\nn04118776\nn02823428\nn04435653\nn03786901\nn02105505\nn01514859\nn02860847\nn01871265\nn07742313\nn01695060\nn01735189\nn03141823\nn02692877\nn04254680\nn02483708\nn02011460\nn02927161\nn02113978\nn02106166\nn03770679\nn02169497\nn04482393\nn02277742\nn04485082\nn01984695\nn03658185\nn01697457\nn09428293\nn02102480\nn04501370\nn04141975\nn01614925\nn02089078\nn03935335\nn02486410\nn01843065\nn01984695\nn02363005\nn04536866\nn04141076\nn01950731\nn03445777\nn02102040\nn07715103\nn09256479\nn03781244\nn02090379\nn02129165\nn04532670\nn02939185\nn04259630\nn03788365\nn03461385\nn04606251\nn04428191\nn02488702\nn01518878\nn02107142\nn01622779\nn02483708\nn07753113\nn079
30864\nn01984695\nn03476684\nn02655020\nn03376595\nn01806143\nn04286575\nn02490219\nn02640242\nn04141975\nn03938244\nn02100735\nn04041544\nn02108915\nn03769881\nn02108551\nn02110185\nn02086646\nn03388043\nn07697313\nn02098105\nn04597913\nn04090263\nn02492660\nn02795169\nn02086240\nn02097130\nn02346627\nn01622779\nn01978287\nn01924916\nn02655020\nn02787622\nn02108551\nn03717622\nn07697313\nn02105505\nn07753113\nn04204347\nn02909870\nn01828970\nn02018795\nn07836838\nn01775062\nn07716358\nn01675722\nn02807133\nn02493793\nn02091467\nn02804414\nn12144580\nn02823428\nn09229709\nn03379051\nn02791270\nn01828970\nn03832673\nn04366367\nn03877845\nn03372029\nn03961711\nn03916031\nn03788365\nn04265275\nn01806143\nn04008634\nn02794156\nn03777754\nn01630670\nn07860988\nn04239074\nn04270147\nn03761084\nn04270147\nn04487081\nn02481823\nn02395406\nn02093859\nn03991062\nn04264628\nn04258138\nn06359193\nn02074367\nn07614500\nn02865351\nn07718747\nn04074963\nn04482393\nn03347037\nn02110063\nn07836838\nn02090379\nn03595614\nn03482405\nn13052670\nn04023962\nn03991062\nn04548280\nn02056570\nn02794156\nn13133613\nn02100877\nn03272010\nn02107683\nn04149813\nn04152593\nn02002556\nn03954731\nn01968897\nn03388043\nn03764736\nn02690373\nn02966193\nn01518878\nn02128385\nn03197337\nn02092002\nn03110669\nn03478589\nn02457408\nn02870880\nn02011460\nn02093428\nn03063689\nn03337140\nn04356056\nn02963159\nn04435653\nn03871628\nn02110627\nn02088238\nn03160309\nn03983396\nn02992529\nn03843555\nn01773549\nn02389026\nn09468604\nn04505470\nn02109961\nn02794156\nn03854065\nn04355338\nn02094433\nn13133613\nn03272010\nn01667778\nn03494278\nn12768682\nn02481823\nn03085013\nn03179701\nn01667778\nn02102040\nn02112706\nn02951585\nn02108089\nn02099601\nn07860988\nn04033995\nn03388183\nn02127052\nn02107142\nn03814639\nn04004767\nn02099712\nn01582220\nn02102177\nn02100735\nn03958227\nn02481823\nn01773549\nn03131574\nn04540053\nn03424325\nn03871628\nn02116738\nn09229709\nn02797295\nn02704792\nn02825657\nn02115913\nn0
3888605\nn02009229\nn03063689\nn07734744\nn02669723\nn02101556\nn03045698\nn04532106\nn03961711\nn04372370\nn02655020\nn02094433\nn02088466\nn04005630\nn12144580\nn02892767\nn02091244\nn03110669\nn03759954\nn03594945\nn03594945\nn04462240\nn07711569\nn03259280\nn04482393\nn02018207\nn03134739\nn03832673\nn04467665\nn04285008\nn02169497\nn03796401\nn02099267\nn02909870\nn02105412\nn04265275\nn01728572\nn04336792\nn02834397\nn02804414\nn04548362\nn03109150\nn02895154\nn03929660\nn01685808\nn02111500\nn04033995\nn01768244\nn02002556\nn03887697\nn04069434\nn03594734\nn02500267\nn07714990\nn02137549\nn03014705\nn02447366\nn01537544\nn07802026\nn03895866\nn04330267\nn03602883\nn02795169\nn04153751\nn03782006\nn02489166\nn03447721\nn03417042\nn04550184\nn02500267\nn02112706\nn03347037\nn02088364\nn02640242\nn03983396\nn02817516\nn01695060\nn13133613\nn02095314\nn03887697\nn02892767\nn07697313\nn11939491\nn04332243\nn02667093\nn02643566\nn02493509\nn04251144\nn02730930\nn04118776\nn02097209\nn04335435\nn03016953\nn03691459\nn04037443\nn02100583\nn02104029\nn02088466\nn09193705\nn03495258\nn02095314\nn03355925\nn07613480\nn02971356\nn04153751\nn01945685\nn01697457\nn04532106\nn02895154\nn04548362\nn04485082\nn02002724\nn02999410\nn03976467\nn02951358\nn03874293\nn02442845\nn04229816\nn01614925\nn02769748\nn04461696\nn02486410\nn03916031\nn04562935\nn02098413\nn02097474\nn03584829\nn02606052\nn02123394\nn03871628\nn04311004\nn02865351\nn01601694\nn02111129\nn04509417\nn01882714\nn03908714\nn02102973\nn03983396\nn02093859\nn03775071\nn02667093\nn02906734\nn07873807\nn04277352\nn04153751\nn01675722\nn01601694\nn04263257\nn01582220\nn03000134\nn04263257\nn04286575\nn06359193\nn02445715\nn03179701\nn04275548\nn02444819\nn02002724\nn03124170\nn02018795\nn02776631\nn12144580\nn03041632\nn02101556\nn04435653\nn04254120\nn04505470\nn03297495\nn02093256\nn03529860\nn01734418\nn04462240\nn02089867\nn03259280\nn03804744\nn02484975\nn03372029\nn02992529\nn01629819\nn03814639\nn04004767\n
n02280649\nn04275548\nn04023962\nn03476684\nn01843383\nn02490219\nn03450230\nn02088238\nn02129165\nn07716906\nn02006656\nn07615774\nn04033901\nn02101388\nn02412080\nn02871525\nn01689811\nn02447366\nn02951585\nn03325584\nn04238763\nn01817953\nn07753275\nn03803284\nn03724870\nn01694178\nn04613696\nn03961711\nn04553703\nn04493381\nn04507155\nn03388183\nn04483307\nn02840245\nn01739381\nn03837869\nn03980874\nn02093647\nn02992529\nn03983396\nn02110958\nn01688243\nn02100236\nn01873310\nn04525038\nn03496892\nn04350905\nn02115913\nn01824575\nn04443257\nn01729322\nn03197337\nn09421951\nn07614500\nn03445777\nn03680355\nn04579145\nn03345487\nn03062245\nn02655020\nn02769748\nn03930630\nn03956157\nn04332243\nn03690938\nn04153751\nn04456115\nn02883205\nn01631663\nn02841315\nn02480495\nn02396427\nn04357314\nn01695060\nn02101556\nn03947888\nn04367480\nn03958227\nn01924916\nn02111129\nn02939185\nn01829413\nn02108915\nn03388183\nn02410509\nn04273569\nn02119789\nn04505470\nn02094258\nn02231487\nn02916936\nn02441942\nn04039381\nn02883205\nn02098413\nn01496331\nn03534580\nn07714990\nn04286575\nn03000247\nn03691459\nn03376595\nn01729322\nn12144580\nn04192698\nn03998194\nn02979186\nn02102973\nn02110627\nn01728572\nn03272010\nn03786901\nn04033901\nn02097047\nn03947888\nn07873807\nn02097047\nn07754684\nn02276258\nn02104365\nn01734418\nn03976467\nn02825657\nn01694178\nn01682714\nn02747177\nn03710193\nn09288635\nn02510455\nn02319095\nn02088364\nn02129604\nn04326547\nn03871628\nn02096177\nn09246464\nn03127925\nn02488702\nn06785654\nn02066245\nn12998815\nn01632777\nn02091244\nn01742172\nn03908618\nn04536866\nn03841143\nn01917289\nn02276258\nn03457902\nn04041544\nn03259280\nn02236044\nn02090379\nn04127249\nn03873416\nn02415577\nn03590841\nn02094258\nn03884397\nn01978287\nn02172182\nn01990800\nn04476259\nn03871628\nn03584829\nn04118776\nn02509815\nn02102480\nn01729977\nn02776631\nn03125729\nn02948072\nn01774384\nn01695060\nn07734744\nn01990800\nn02445715\nn03017168\nn02606052\nn04612504\nn02119789
352\nn02443484\nn03976657\nn04540053\nn01817953\nn02098105\nn02655020\nn01756291\nn02099267\nn04141327\nn07734744\nn03690938\nn02133161\nn10148035\nn03461385\nn03840681\nn02099267\nn03908618\nn02483708\nn03710637\nn02804610\nn02906734\nn07836838\nn03930313\nn02786058\nn01795545\nn02804610\nn02095570\nn03447721\nn04311004\nn04229816\nn04208210\nn03710193\nn03584829\nn04355338\nn03146219\nn02085620\nn04522168\nn02106030\nn03908618\nn02113624\nn04429376\nn02100877\nn02894605\nn02088632\nn02490219\nn02264363\nn04204238\nn07717556\nn02699494\nn13040303\nn02782093\nn04238763\nn03935335\nn02111889\nn04147183\nn02089078\nn03598930\nn04131690\nn01534433\nn04039381\nn02113023\nn03649909\nn02804610\nn02950826\nn07695742\nn03899768\nn03662601\nn02100877\nn06359193\nn04270147\nn03527444\nn04023962\nn03207743\nn03691459\nn02086646\nn04456115\nn04335435\nn04493381\nn03355925\nn02128757\nn03710637\nn02749479\nn04111531\nn02669723\nn04591157\nn02106550\nn04069434\nn01669191\nn03496892\nn01855672\nn03803284\nn04371774\nn02965783\nn01955084\nn03710637\nn04147183\nn03792782\nn04597913\nn04266014\nn02790996\nn02099601\nn03627232\nn02219486\nn07760859\nn02877765\nn07715103\nn02259212\nn07747607\nn04376876\nn01748264\nn04317175\nn02687172\nn13037406\nn02321529\nn02981792\nn02992211\nn03891332\nn01944390\nn02398521\nn07753275\nn01687978\nn03325584\nn01806143\nn01795545\nn02256656\nn13133613\nn06785654\nn02236044\nn04033901\nn02892767\nn03792972\nn07753592\nn01580077\nn03535780\nn03602883\nn02423022\nn03599486\nn02279972\nn02655020\nn03637318\nn02108000\nn03355925\nn04486054\nn01986214\nn03014705\nn04599235\nn02107312\nn04522168\nn03782006\nn02091244\nn04238763\nn01641577\nn02268853\nn07711569\nn03662601\nn02102318\nn01677366\nn02097209\nn03763968\nn03786901\nn02509815\nn02086910\nn06794110\nn07920052\nn03379051\nn02346627\nn02018795\nn02480495\nn07711569\nn04532670\nn02099712\nn02110806\nn03759954\nn02123597\nn04154565\nn03347037\nn02077923\nn02514041\nn01616318\nn02641379\nn04086273\nn020
97298\nn02930766\nn01983481\nn03995372\nn03891332\nn03218198\nn02058221\nn01729322\nn02799071\nn01820546\nn04127249\nn02834397\nn02097209\nn03196217\nn03216828\nn02096585\nn04229816\nn11879895\nn03977966\nn03876231\nn03908618\nn03255030\nn02106662\nn02488702\nn02978881\nn03868242\nn03710721\nn03494278\nn02363005\nn02939185\nn07768694\nn04505470\nn02028035\nn02894605\nn07717410\nn07745940\nn04429376\nn04344873\nn02727426\nn01753488\nn02110806\nn03661043\nn01806567\nn01955084\nn03467068\nn02110063\nn03902125\nn03450230\nn01692333\nn02114855\nn01644900\nn07742313\nn07565083\nn04505470\nn02088364\nn03733131\nn02105056\nn02606052\nn03179701\nn07715103\nn02641379\nn03259280\nn07873807\nn04584207\nn02110063\nn03218198\nn02494079\nn01644373\nn04332243\nn02115913\nn02120079\nn09229709\nn02481823\nn04235860\nn02113799\nn02823428\nn04371774\nn02442845\nn01498041\nn03944341\nn09332890\nn02091134\nn02690373\nn02788148\nn02869837\nn04204238\nn01675722\nn02236044\nn02280649\nn12144580\nn01882714\nn04120489\nn02999410\nn03692522\nn01729322\nn04532670\nn03337140\nn02966193\nn07742313\nn03793489\nn04355933\nn03220513\nn02445715\nn04443257\nn04026417\nn02823428\nn03976467\nn02102177\nn03773504\nn04487394\nn02085936\nn07614500\nn02089078\nn02206856\nn04147183\nn04501370\nn02422699\nn02085782\nn02097130\nn03929660\nn01751748\nn02099849\nn01924916\nn01692333\nn04275548\nn03991062\nn01824575\nn03218198\nn02018207\nn03530642\nn03782006\nn03697007\nn07734744\nn01820546\nn02280649\nn02115913\nn04325704\nn02104029\nn03250847\nn11879895\nn03709823\nn03271574\nn04483307\nn04525038\nn02835271\nn02102318\nn04285008\nn01491361\nn01742172\nn02077923\nn01728572\nn01914609\nn03388549\nn03085013\nn02395406\nn03868863\nn04033901\nn02011460\nn02123159\nn02391049\nn04039381\nn01695060\nn02129165\nn03944341\nn04462240\nn02403003\nn03920288\nn03649909\nn04515003\nn03372029\nn02091467\nn04372370\nn02129165\nn01753488\nn02113712\nn03445777\nn04525305\nn01768244\nn02493509\nn03743016\nn12998815\nn03770439\nn0
2777292\nn02097298\nn01687978\nn04179913\nn02749479\nn03627232\nn03207743\nn03476991\nn07745940\nn01883070\nn03792972\nn03769881\nn02011460\nn02870880\nn02123045\nn04040759\nn07684084\nn02111277\nn01877812\nn04019541\nn03197337\nn02494079\nn03187595\nn02687172\nn02883205\nn07754684\nn09399592\nn02791270\nn03063689\nn03902125\nn02415577\nn02086240\nn02093991\nn02802426\nn03782006\nn03478589\nn02128385\nn02894605\nn02115641\nn02011460\nn02951358\nn02128757\nn02871525\nn02346627\nn03450230\nn09229709\nn02417914\nn01796340\nn02128925\nn04486054\nn02749479\nn02346627\nn01930112\nn02091032\nn02963159\nn01944390\nn02793495\nn02018207\nn04153751\nn02790996\nn02129165\nn03538406\nn02965783\nn03179701\nn03160309\nn01644373\nn01770393\nn02109961\nn01873310\nn03085013\nn01735189\nn04370456\nn02018207\nn02018795\nn02110627\nn03804744\nn03534580\nn07760859\nn01631663\nn04482393\nn02917067\nn07753592\nn03447447\nn02112706\nn03947888\nn02927161\nn04228054\nn03259280\nn07753275\nn07753592\nn02948072\nn07697313\nn01984695\nn11879895\nn02125311\nn12998815\nn03976657\nn02096294\nn04264628\nn04548362\nn02276258\nn03891251\nn03127925\nn02834397\nn03854065\nn02979186\nn07920052\nn02110627\nn02095314\nn04049303\nn02965783\nn02895154\nn02013706\nn04044716\nn03709823\nn02138441\nn02777292\nn01943899\nn07892512\nn02091831\nn03743016\nn01514668\nn04243546\nn02105251\nn03032252\nn01855032\nn04612504\nn03770679\nn03866082\nn02091134\nn03443371\nn03777568\nn03773504\nn02480855\nn07745940\nn02391049\nn01910747\nn02277742\nn03938244\nn02788148\nn01440764\nn03425413\nn03895866\nn03950228\nn02133161\nn01843065\nn02992211\nn02834397\nn02066245\nn03337140\nn07716358\nn03584829\nn02095314\nn02093991\nn02974003\nn02025239\nn04596742\nn02916936\nn01768244\nn03720891\nn02056570\nn02102177\nn04557648\nn02268853\nn02098105\nn01514859\nn04141975\nn02071294\nn03188531\nn04254777\nn03709823\nn03095699\nn04517823\nn03733131\nn07693725\nn03476684\nn03724870\nn03983396\nn02342885\nn02510455\nn03874293\nn02823428\n
n04356056\nn01494475\nn04251144\nn02894605\nn02097658\nn04273569\nn02123045\nn03250847\nn01687978\nn02012849\nn03733131\nn02096294\nn02279972\nn01641577\nn03804744\nn02871525\nn04479046\nn07697313\nn02786058\nn01924916\nn07932039\nn02099712\nn03271574\nn02488702\nn02927161\nn02815834\nn02877765\nn04560804\nn03297495\nn04590129\nn03944341\nn03980874\nn02105056\nn01734418\nn03947888\nn02363005\nn06596364\nn07753275\nn02930766\nn02093859\nn03207941\nn01818515\nn03657121\nn01629819\nn03063689\nn03255030\nn02808440\nn02981792\nn09246464\nn04591713\nn03492542\nn04517823\nn03240683\nn07716358\nn07717556\nn02814533\nn01843383\nn03691459\nn02134418\nn02110185\nn02093754\nn02807133\nn07684084\nn02091244\nn03873416\nn02113624\nn02094433\nn02917067\nn03450230\nn03888605\nn01616318\nn04435653\nn02111277\nn02006656\nn02363005\nn02497673\nn07753592\nn07711569\nn01693334\nn03954731\nn04033995\nn04208210\nn02817516\nn07754684\nn02256656\nn13052670\nn04417672\nn11939491\nn02443114\nn03445777\nn02093859\nn07684084\nn03026506\nn04081281\nn02002724\nn02317335\nn03584829\nn04039381\nn03062245\nn02091134\nn07745940\nn02092002\nn03991062\nn02843684\nn03961711\nn04069434\nn01558993\nn07745940\nn04486054\nn04347754\nn02011460\nn02808304\nn02109961\nn04229816\nn04409515\nn04116512\nn03857828\nn02445715\nn03920288\nn02488702\nn03126707\nn07932039\nn02835271\nn03445924\nn01797886\nn03476684\nn03658185\nn01943899\nn02951358\nn03532672\nn02966193\nn02988304\nn02229544\nn02095570\nn02841315\nn04536866\nn02268853\nn03445924\nn03803284\nn04254777\nn02443484\nn03133878\nn02799071\nn13133613\nn02102040\nn02107908\nn03947888\nn04487394\nn03599486\nn03452741\nn02097298\nn04417672\nn02493793\nn02325366\nn07747607\nn03188531\nn04482393\nn02088632\nn04461696\nn03249569\nn07693725\nn02096437\nn01773797\nn02105162\nn02843684\nn02950826\nn02492660\nn04366367\nn01981276\nn03207941\nn02966193\nn03534580\nn02112018\nn01688243\nn04584207\nn02415577\nn01847000\nn02514041\nn02488291\nn02749479\nn04380533\nn02510455
\nn02526121\nn07745940\nn03930313\nn03877845\nn01755581\nn01667114\nn02108000\nn02699494\nn02363005\nn02100877\nn03770439\nn02114712\nn02100735\nn02108000\nn02028035\nn02108551\nn02484975\nn07718747\nn03498962\nn01665541\nn02894605\nn04118776\nn02119022\nn04258138\nn04604644\nn02115641\nn07768694\nn12267677\nn03908714\nn03876231\nn07717556\nn11879895\nn01688243\nn03208938\nn12267677\nn02669723\nn02965783\nn02276258\nn01631663\nn04487394\nn02825657\nn01749939\nn04037443\nn04041544\nn03376595\nn04532670\nn02104365\nn02233338\nn02793495\nn03770439\nn01910747\nn04154565\nn01980166\nn03793489\nn02025239\nn02480495\nn03781244\nn04399382\nn07871810\nn04065272\nn02017213\nn01943899\nn04067472\nn03761084\nn02094433\nn03538406\nn02494079\nn04147183\nn04141076\nn04589890\nn01601694\nn02123394\nn06874185\nn02114548\nn03637318\nn03710193\nn04536866\nn09399592\nn03452741\nn03594945\nn07860988\nn03085013\nn02814533\nn03461385\nn04252077\nn02859443\nn04033901\nn01530575\nn03476684\nn04069434\nn02105056\nn02128385\nn01694178\nn01688243\nn03372029\nn04465501\nn02808440\nn04235860\nn02177972\nn13044778\nn02096177\nn01770081\nn01669191\nn02481823\nn07880968\nn03888605\nn02117135\nn02096437\nn02397096\nn01592084\nn03769881\nn03026506\nn02107574\nn02114367\nn03124170\nn03733281\nn03692522\nn02037110\nn02167151\nn01930112\nn03995372\nn03355925\nn03676483\nn03000247\nn02966193\nn02910353\nn01682714\nn02910353\nn02510455\nn02106550\nn02120079\nn03841143\nn04229816\nn02447366\nn02091467\nn04456115\nn03937543\nn01818515\nn04086273\nn02865351\nn03109150\nn02808304\nn03483316\nn01560419\nn07930864\nn04392985\nn04592741\nn04192698\nn02089973\nn03485794\nn07613480\nn02951585\nn01494475\nn01443537\nn02097298\nn02877765\nn02101388\nn03271574\nn03041632\nn03895866\nn02865351\nn02091134\nn02027492\nn03201208\nn03983396\nn02364673\nn02134084\nn02165105\nn01773549\nn04127249\nn04275548\nn01883070\nn02112706\nn03776460\nn02108000\nn02397096\nn04525305\nn02113624\nn02268853\nn02091134\nn03476991\nn028158
34\nn04525305\nn03857828\nn03272010\nn04523525\nn04335435\nn03595614\nn07932039\nn03345487\nn03877472\nn04485082\nn02794156\nn03877472\nn03492542\nn02114712\nn02883205\nn02106662\nn03417042\nn03617480\nn02978881\nn02101556\nn04039381\nn02105641\nn02098413\nn04552348\nn02823750\nn07753113\nn02110063\nn09332890\nn09468604\nn02457408\nn01537544\nn02497673\nn09229709\nn04311004\nn02776631\nn02692877\nn03623198\nn04328186\nn03697007\nn02102177\nn01687978\nn03207743\nn03733131\nn02099429\nn03769881\nn02099601\nn02787622\nn03000134\nn03895866\nn02127052\nn04136333\nn02106662\nn13044778\nn01981276\nn03680355\nn03372029\nn03908618\nn03877472\nn04346328\nn04557648\nn04270147\nn04428191\nn02870880\nn03297495\nn02871525\nn02391049\nn02123045\nn01871265\nn02071294\nn02119022\nn04592741\nn02509815\nn03424325\nn02514041\nn02101006\nn02747177\nn01950731\nn02172182\nn04336792\nn04356056\nn04252077\nn01740131\nn04613696\nn04023962\nn04485082\nn02128925\nn02086079\nn03983396\nn02134084\nn02133161\nn02128925\nn04517823\nn07875152\nn02128385\nn04204347\nn02077923\nn03272010\nn02840245\nn02105641\nn01817953\nn04146614\nn04554684\nn03796401\nn04039381\nn02788148\nn04483307\nn02493793\nn03692522\nn03075370\nn03733281\nn04238763\nn02815834\nn03065424\nn02672831\nn03602883\nn04346328\nn02066245\nn03444034\nn03594734\nn15075141\nn12144580\nn07579787\nn02992529\nn04515003\nn02107142\nn02117135\nn01734418\nn01693334\nn02105505\nn02992211\nn02869837\nn13133613\nn02666196\nn04041544\nn03857828\nn04418357\nn02113978\nn01744401\nn02797295\nn02699494\nn02489166\nn02098286\nn04243546\nn02134418\nn02106662\nn03670208\nn04090263\nn02692877\nn03467068\nn04238763\nn03788365\nn03657121\nn02906734\nn02326432\nn02676566\nn02607072\nn03627232\nn02894605\nn03538406\nn04136333\nn01632458\nn04125021\nn03134739\nn01697457\nn03924679\nn04243546\nn09256479\nn02493793\nn07871810\nn02177972\nn01917289\nn02088466\nn04069434\nn03891251\nn02113799\nn07711569\nn01833805\nn04270147\nn04259630\nn02859443\nn04270147\nn0211
0063\nn03042490\nn03290653\nn02002724\nn02100583\nn01608432\nn03710193\nn03777754\nn02971356\nn04482393\nn13037406\nn01768244\nn03929855\nn03016953\nn07584110\nn02113023\nn04447861\nn02128925\nn02988304\nn04201297\nn02006656\nn01807496\nn03658185\nn03394916\nn07716358\nn07579787\nn02102177\nn01729322\nn03775071\nn04482393\nn02415577\nn02607072\nn02909870\nn03255030\nn03344393\nn02325366\nn02102480\nn02102177\nn04423845\nn02130308\nn03785016\nn02787622\nn04200800\nn02087046\nn04487394\nn04152593\nn04065272\nn07831146\nn02843684\nn07248320\nn03498962\nn02128757\nn04523525\nn02999410\nn03697007\nn02097209\nn11939491\nn04141327\nn07248320\nn04461696\nn02110185\nn02483708\nn03902125\nn02168699\nn02834397\nn02108915\nn02963159\nn03841143\nn02120505\nn02111129\nn02112350\nn03793489\nn03649909\nn04090263\nn02727426\nn04033995\nn01608432\nn02364673\nn02895154\nn07730033\nn02423022\nn02999410\nn07579787\nn02086079\nn01631663\nn02494079\nn04118776\nn03467068\nn03476684\nn03954731\nn03775546\nn02981792\nn01873310\nn01980166\nn04049303\nn04099969\nn02965783\nn02281787\nn02823750\nn02655020\nn02403003\nn02951358\nn02028035\nn02504458\nn03814639\nn02085620\nn04486054\nn03761084\nn07930864\nn04522168\nn04347754\nn01644373\nn02992211\nn04483307\nn02102973\nn04467665\nn03026506\nn03026506\nn07697537\nn01532829\nn04442312\nn02108551\nn01824575\nn04254777\nn03109150\nn01728920\nn04380533\nn02795169\nn04493381\nn03141823\nn01817953\nn04026417\nn02909870\nn01601694\nn02834397\nn03376595\nn02909870\nn07711569\nn03891251\nn01806567\nn03854065\nn03814906\nn02808304\nn04153751\nn07768694\nn04532106\nn02102973\nn02346627\nn13133613\nn02129604\nn02443484\nn03792972\nn02804414\nn02097298\nn02708093\nn01748264\nn03992509\nn04591713\nn02105162\nn03840681\nn02276258\nn02100583\nn02408429\nn03770679\nn07717556\nn02280649\nn02006656\nn04560804\nn04285008\nn03868863\nn02088238\nn02799071\nn04560804\nn02108551\nn02487347\nn01614925\nn04505470\nn04090263\nn03661043\nn01675722\nn01531178\nn01632458\nn01
695060\nn04254777\nn04355933\nn03743016\nn04259630\nn01534433\nn02110958\nn02112350\nn02488702\nn02687172\nn09246464\nn02071294\nn02497673\nn03871628\nn07717556\nn02105412\nn02999410\nn02105412\nn04208210\nn04589890\nn03379051\nn03404251\nn03014705\nn04146614\nn03938244\nn02107142\nn03452741\nn01667114\nn04311174\nn01667778\nn03127747\nn02105412\nn09399592\nn07716906\nn03673027\nn03197337\nn03450230\nn02113186\nn01775062\nn04380533\nn06359193\nn03483316\nn02172182\nn03496892\nn03843555\nn04476259\nn02110806\nn04467665\nn04548280\nn01518878\nn02281787\nn02093647\nn04404412\nn04356056\nn03840681\nn03995372\nn02326432\nn02777292\nn01776313\nn03220513\nn02795169\nn02074367\nn01968897\nn07693725\nn02906734\nn03777754\nn02497673\nn03126707\nn04259630\nn03729826\nn04026417\nn01855032\nn02808440\nn04346328\nn03930313\nn04560804\nn03127925\nn07684084\nn04417672\nn02172182\nn02325366\nn03899768\nn01644900\nn02113186\nn03710637\nn03857828\nn02114548\nn04326547\nn02643566\nn02092002\nn03124170\nn02281406\nn01806567\nn04254680\nn03344393\nn01532829\nn02116738\nn02116738\nn02094258\nn03690938\nn03272562\nn03110669\nn03786901\nn07920052\nn04355933\nn01978455\nn01806143\nn01944390\nn03450230\nn02088364\nn03956157\nn02437312\nn03590841\nn04344873\nn02277742\nn02111277\nn01784675\nn04483307\nn02132136\nn04019541\nn01693334\nn01608432\nn01667114\nn02236044\nn03775546\nn01739381\nn02100583\nn02090622\nn01729322\nn04350905\nn02056570\nn04612504\nn04505470\nn12057211\nn03837869\nn01531178\nn04376876\nn02454379\nn02124075\nn02395406\nn02114367\nn03481172\nn02109047\nn07715103\nn04154565\nn02423022\nn01756291\nn02108089\nn02493793\nn03602883\nn02168699\nn01978455\nn02097298\nn02447366\nn04229816\nn07583066\nn03207743\nn07248320\nn02100583\nn02823750\nn01608432\nn04418357\nn01833805\nn03930630\nn03425413\nn02788148\nn03637318\nn04265275\nn02281787\nn04335435\nn02093428\nn06359193\nn03944341\nn04041544\nn04515003\nn02106550\nn02097130\nn02837789\nn07753275\nn04026417\nn03673027\nn03887697\nn
03110669\nn03769881\nn01532829\nn02006656\nn04296562\nn04347754\nn01828970\nn03125729\nn03877472\nn02096051\nn04483307\nn02398521\nn03770679\nn02106662\nn03775546\nn04347754\nn02676566\nn03690938\nn07831146\nn04398044\nn01985128\nn02109047\nn03785016\nn03494278\nn03792972\nn02114367\nn03777754\nn04090263\nn02132136\nn03134739\nn01491361\nn09332890\nn03803284\nn02120079\nn03075370\nn02104365\nn03884397\nn02790996\nn01751748\nn07695742\nn02123045\nn03759954\nn03733131\nn12998815\nn03223299\nn07745940\nn04532106\nn02111889\nn02708093\nn01944390\nn01534433\nn02361337\nn02113624\nn02090721\nn02093256\nn02025239\nn04355933\nn03452741\nn01530575\nn01443537\nn04209239\nn02037110\nn04154565\nn03594945\nn04465501\nn07714990\nn03868863\nn01819313\nn04026417\nn04553703\nn02112706\nn01980166\nn02797295\nn03888257\nn02342885\nn03216828\nn03388043\nn03804744\nn02138441\nn01689811\nn04553703\nn02231487\nn04208210\nn03372029\nn02096177\nn04429376\nn03272010\nn02493509\nn03127747\nn02786058\nn03777568\nn04238763\nn03535780\nn03938244\nn02408429\nn02097658\nn02123159\nn03891251\nn02165105\nn02437312\nn02114712\nn04540053\nn04270147\nn02113186\nn02281406\nn03899768\nn04442312\nn04023962\nn02963159\nn02102973\nn01860187\nn03297495\nn03733805\nn03980874\nn04336792\nn04366367\nn02412080\nn02966687\nn03763968\nn02098286\nn01756291\nn03929855\nn03944341\nn03271574\nn04026417\nn07754684\nn01985128\nn07753113\nn01675722\nn02106166\nn02116738\nn03916031\nn04065272\nn03110669\nn07747607\nn02009912\nn03950228\nn03483316\nn07716358\nn03216828\nn09835506\nn03393912\nn02526121\nn03770439\nn02002724\nn02871525\nn01776313\nn04355933\nn03450230\nn02025239\nn02107312\nn04606251\nn03063599\nn01795545\nn04254777\nn02120079\nn01833805\nn02099601\nn13052670\nn02676566\nn03457902\nn03720891\nn03793489\nn01775062\nn01978287\nn10565667\nn02916936\nn03599486\nn02110958\nn01443537\nn04204238\nn02672831\nn07717410\nn04209239\nn01491361\nn02963159\nn03424325\nn03697007\nn03344393\nn03445777\nn02999410\nn02441942\
nn04525038\nn02403003\nn07684084\nn03125729\nn02095570\nn01796340\nn03599486\nn07747607\nn04507155\nn07768694\nn04501370\nn07734744\nn02676566\nn01871265\nn03680355\nn02088466\nn10565667\nn02110958\nn02096437\nn01498041\nn02130308\nn07836838\nn03884397\nn04065272\nn02033041\nn02607072\nn13040303\nn02808304\nn03095699\nn03485407\nn02395406\nn04560804\nn02676566\nn04589890\nn02110958\nn02837789\nn01669191\nn02123045\nn07579787\nn01667778\nn12998815\nn04613696\nn02951585\nn03623198\nn03764736\nn02892767\nn02102318\nn04040759\nn02123045\nn03062245\nn02701002\nn03201208\nn04266014\nn01873310\nn04597913\nn03595614\nn07716906\nn02988304\nn03445924\nn02860847\nn02095889\nn02115913\nn01756291\nn02114548\nn02457408\nn03995372\nn01614925\nn02107312\nn03930630\nn03017168\nn03535780\nn01985128\nn02177972\nn03045698\nn13133613\nn04398044\nn02099267\nn01829413\nn02114712\nn02104029\nn01440764\nn04263257\nn04251144\nn03584254\nn03874599\nn06359193\nn04070727\nn04209133\nn04065272\nn01748264\nn02980441\nn02093754\nn02097658\nn03187595\nn01742172\nn04590129\nn03188531\nn02504013\nn02017213\nn02979186\nn02843684\nn04040759\nn01667778\nn01820546\nn02116738\nn04243546\nn04090263\nn03888605\nn01985128\nn02823750\nn04141975\nn03376595\nn02108915\nn03372029\nn02423022\nn01728920\nn02102973\nn01580077\nn02492660\nn07716906\nn02096294\nn03259280\nn03884397\nn02102973\nn03666591\nn02486410\nn02102480\nn02105162\nn09246464\nn02823750\nn04152593\nn03196217\nn01818515\nn04591157\nn04328186\nn01742172\nn01753488\nn02971356\nn09428293\nn02927161\nn03180011\nn04099969\nn02795169\nn02895154\nn03929660\nn01910747\nn03854065\nn02747177\nn03803284\nn02123394\nn04264628\nn04243546\nn02123159\nn01983481\nn02526121\nn12267677\nn06785654\nn04606251\nn01855672\nn02281406\nn04296562\nn01773549\nn02127052\nn02090622\nn02088094\nn04125021\nn01728920\nn03595614\nn02090622\nn04285008\nn03874293\nn02823428\nn02028035\nn02077923\nn02017213\nn03903868\nn02127052\nn04317175\nn02107683\nn01984695\nn03995372\nn0209072
1\nn02089867\nn10148035\nn01737021\nn01883070\nn01819313\nn03958227\nn03841143\nn03459775\nn03777568\nn03417042\nn02110185\nn03388549\nn03924679\nn02672831\nn02165456\nn03207743\nn04136333\nn02971356\nn04039381\nn04162706\nn02791124\nn03124170\nn01843065\nn04428191\nn03874599\nn02102480\nn04487394\nn01883070\nn02966193\nn01494475\nn02110341\nn07716358\nn07248320\nn02814860\nn04133789\nn02443114\nn02110063\nn04509417\nn02108089\nn04548362\nn01748264\nn03710637\nn02091467\nn02110341\nn02113624\nn01819313\nn02939185\nn03272562\nn02787622\nn12267677\nn04141327\nn02110958\nn01687978\nn04429376\nn01729322\nn02093647\nn07920052\nn01910747\nn02107908\nn03895866\nn02086079\nn02895154\nn13037406\nn03876231\nn04590129\nn01692333\nn03717622\nn02109525\nn04355338\nn03777568\nn03314780\nn03887697\nn04141975\nn01978287\nn04597913\nn04141975\nn02782093\nn03868242\nn02002724\nn03196217\nn04153751\nn01629819\nn02808440\nn02058221\nn01531178\nn02114712\nn03494278\nn04204347\nn03793489\nn03483316\nn04209239\nn03776460\nn04336792\nn02114548\nn02667093\nn02834397\nn04456115\nn03394916\nn04346328\nn01776313\nn02124075\nn02356798\nn03895866\nn02963159\nn01883070\nn03355925\nn02226429\nn03417042\nn02106550\nn02101388\nn04200800\nn02011460\nn02112706\nn04326547\nn01985128\nn03110669\nn03804744\nn04141327\nn11939491\nn02105251\nn03201208\nn07754684\nn01632777\nn04553703\nn04149813\nn02481823\nn03947888\nn01534433\nn03457902\nn02776631\nn04209239\nn04523525\nn04074963\nn02233338\nn03930313\nn03249569\nn03884397\nn01601694\nn04560804\nn02514041\nn03417042\nn07880968\nn03594734\nn03344393\nn02088632\nn02106662\nn02108551\nn01744401\nn02483708\nn02971356\nn02909870\nn02841315\nn03496892\nn02100583\nn03476684\nn07718472\nn01641577\nn06596364\nn03954731\nn04357314\nn04259630\nn07695742\nn04423845\nn03249569\nn04111531\nn02895154\nn04149813\nn02114712\nn04252225\nn03770679\nn02837789\nn04428191\nn02361337\nn02100236\nn01728920\nn03594945\nn02268443\nn07875152\nn07695742\nn02108551\nn01531178\nn01980
166\nn02106382\nn03658185\nn02988304\nn04141076\nn02906734\nn02012849\nn02786058\nn01614925\nn02206856\nn01631663\nn03100240\nn03047690\nn03180011\nn02895154\nn02782093\nn03595614\nn09332890\nn07749582\nn04258138\nn03095699\nn02096177\nn01728920\nn03538406\nn01806143\nn02088238\nn04501370\nn09229709\nn04423845\nn02397096\nn02133161\nn02088238\nn02264363\nn02101006\nn04515003\nn02870880\nn04548280\nn04461696\nn03028079\nn02268853\nn03874599\nn01877812\nn02699494\nn12985857\nn02454379\nn04326547\nn02089867\nn01560419\nn02093256\nn04204347\nn04347754\nn02086240\nn04286575\nn04482393\nn03840681\nn04065272\nn02480855\nn02749479\nn03492542\nn02096437\nn02317335\nn02174001\nn04525305\nn04039381\nn07753592\nn13037406\nn02494079\nn04258138\nn02229544\nn01843383\nn01728920\nn04330267\nn02325366\nn02808304\nn04462240\nn03874293\nn03482405\nn01629819\nn03781244\nn04392985\nn04258138\nn03160309\nn02096585\nn01614925\nn02017213\nn04133789\nn04277352\nn02106030\nn04428191\nn03400231\nn03249569\nn01514668\nn10148035\nn02397096\nn07697313\nn07802026\nn03887697\nn07248320\nn01855032\nn03908618\nn02086910\nn04254680\nn02104365\nn03445777\nn02011460\nn07695742\nn04344873\nn01667778\nn02091244\nn01534433\nn02097474\nn02701002\nn03208938\nn03676483\nn03770439\nn01755581\nn02108915\nn01753488\nn02102480\nn03633091\nn03662601\nn01770393\nn07590611\nn04264628\nn03998194\nn02396427\nn02102040\nn01770393\nn04162706\nn02281406\nn12768682\nn01945685\nn03483316\nn01978287\nn02119022\nn02169497\nn03991062\nn04465501\nn07614500\nn01990800\nn01534433\nn03770679\nn09288635\nn03188531\nn09256479\nn04259630\nn02110627\nn04560804\nn02113978\nn02095889\nn04599235\nn03259280\nn02111277\nn02794156\nn04328186\nn04254680\nn03661043\nn03599486\nn02097130\nn02033041\nn02071294\nn03937543\nn09288635\nn03709823\nn02489166\nn03673027\nn01828970\nn04532106\nn03496892\nn01924916\nn04548280\nn02319095\nn02395406\nn02782093\nn04554684\nn02086240\nn03916031\nn02791270\nn07717410\nn04238763\nn02730930\nn01514859\nn017
48264\nn02988304\nn03461385\nn03272562\nn04330267\nn07860988\nn02276258\nn07871810\nn02097474\nn02999410\nn04037443\nn01614925\nn04033901\nn03944341\nn02655020\nn01608432\nn03874599\nn03594945\nn04252225\nn07892512\nn03717622\nn03763968\nn02110627\nn02795169\nn03000134\nn02494079\nn03042490\nn03100240\nn07875152\nn02802426\nn02484975\nn09229709\nn02747177\nn06596364\nn04557648\nn02123394\nn02002724\nn02167151\nn02504013\nn01616318\nn03770439\nn04428191\nn02051845\nn04579145\nn02093754\nn12267677\nn01641577\nn02963159\nn02807133\nn04590129\nn03467068\nn01629819\nn02443484\nn02088238\nn02412080\nn03532672\nn04591157\nn04486054\nn02692877\nn02727426\nn04371774\nn04273569\nn03733131\nn03544143\nn02104365\nn02109961\nn03447447\nn01872401\nn03961711\nn02116738\nn01688243\nn01749939\nn03141823\nn02509815\nn12985857\nn01829413\nn02109047\nn02526121\nn02097658\nn03216828\nn02870880\nn04266014\nn04355338\nn03633091\nn01910747\nn02006656\nn03445924\nn02906734\nn04099969\nn02099712\nn02229544\nn04443257\nn02687172\nn04273569\nn02489166\nn03924679\nn12985857\nn02167151\nn02321529\nn02102040\nn02870880\nn01693334\nn02097298\nn01882714\nn04040759\nn03791053\nn02979186\nn02454379\nn03131574\nn04141327\nn02981792\nn02974003\nn02090721\nn04131690\nn02106030\nn02493793\nn02963159\nn04596742\nn11879895\nn03457902\nn02823750\nn01774750\nn03788365\nn02389026\nn02823750\nn02493509\nn07583066\nn01682714\nn03899768\nn02279972\nn07747607\nn01692333\nn04243546\nn04317175\nn02106550\nn01664065\nn01677366\nn02093754\nn04346328\nn02106550\nn02127052\nn03666591\nn03877845\nn03125729\nn03786901\nn03775071\nn02412080\nn01518878\nn03720891\nn01735189\nn02356798\nn02110806\nn03047690\nn04462240\nn02951585\nn01558993\nn03065424\nn02860847\nn02486410\nn02398521\nn04346328\nn02106030\nn02445715\nn04153751\nn02509815\nn01828970\nn04069434\nn07714571\nn13044778\nn01955084\nn03662601\nn01664065\nn02708093\nn02408429\nn03920288\nn02190166\nn02091635\nn04229816\nn01773549\nn02106662\nn02009912\nn01558993\nn0
2127052\nn02843684\nn02174001\nn03345487\nn01990800\nn03584254\nn02389026\nn02389026\nn04069434\nn03032252\nn07749582\nn02110627\nn02807133\nn02012849\nn03208938\nn02107142\nn03995372\nn02927161\nn03888257\nn02802426\nn09193705\nn07716906\nn03345487\nn02088094\nn03297495\nn02871525\nn02363005\nn02206856\nn02445715\nn02783161\nn02948072\nn09421951\nn02410509\nn02808304\nn03903868\nn02110063\nn03724870\nn07836838\nn04141975\nn02487347\nn02112137\nn02804610\nn07734744\nn04462240\nn03372029\nn02177972\nn02085620\nn01917289\nn04070727\nn02823428\nn02860847\nn04392985\nn02791124\nn01847000\nn01784675\nn02093991\nn03457902\nn02939185\nn04493381\nn03271574\nn02509815\nn03793489\nn02690373\nn03983396\nn02927161\nn03018349\nn03908618\nn02110341\nn03776460\nn02124075\nn04335435\nn03127747\nn02948072\nn03085013\nn02442845\nn02916936\nn01688243\nn02879718\nn02097298\nn04589890\nn02607072\nn02948072\nn04525038\nn02100735\nn02814533\nn03000134\nn03478589\nn02037110\nn04235860\nn02112137\nn04435653\nn04273569\nn03794056\nn01910747\nn01748264\nn01883070\nn04200800\nn04590129\nn03443371\nn02791124\nn03075370\nn03673027\nn01742172\nn03476684\nn01484850\nn01675722\nn02978881\nn03938244\nn02106166\nn01729977\nn04118776\nn04209239\nn03376595\nn04008634\nn02095889\nn01855032\nn03376595\nn04456115\nn02879718\nn04238763\nn02268443\nn02794156\nn02105505\nn01914609\nn03899768\nn02676566\nn02099601\nn02106382\nn04264628\nn04501370\nn03594734\nn03895866\nn04332243\nn04008634\nn02492035\nn01773797\nn04228054\nn02110958\nn06359193\nn02403003\nn04409515\nn03337140\nn02483708\nn02106166\nn04209133\nn02114367\nn03743016\nn03201208\nn03207941\nn02804414\nn04487081\nn01945685\nn02606052\nn03388043\nn03661043\nn02804610\nn04235860\nn02795169\nn03476991\nn03444034\nn03942813\nn04026417\nn03337140\nn02108422\nn04033995\nn03041632\nn02134418\nn04554684\nn03733131\nn02116738\nn03786901\nn03937543\nn04147183\nn04131690\nn03400231\nn02125311\nn02410509\nn01775062\nn02814533\nn02110185\nn04008634\nn04597913\n
n01883070\nn07714990\nn02112350\nn02437616\nn03662601\nn02074367\nn04239074\nn03063689\nn07831146\nn02869837\nn03920288\nn13052670\nn03016953\nn02788148\nn04613696\nn02113023\nn03866082\nn02992529\nn04479046\nn04467665\nn04540053\nn02927161\nn03992509\nn04347754\nn03495258\nn03633091\nn02105251\nn02231487\nn02102318\nn02667093\nn01749939\nn02133161\nn03372029\nn02486261\nn04004767\nn02088466\nn07579787\nn02791270\nn03131574\nn02391049\nn01664065\nn02099429\nn01776313\nn03920288\nn02109047\nn02317335\nn04612504\nn03584254\nn03457902\nn02051845\nn03047690\nn04507155\nn02704792\nn01748264\nn02017213\nn03450230\nn02841315\nn04070727\nn02992211\nn03404251\nn02092339\nn12768682\nn07873807\nn03041632\nn03379051\nn04435653\nn04146614\nn02012849\nn03443371\nn04152593\nn04507155\nn03447447\nn04252225\nn03770439\nn13037406\nn01748264\nn04550184\nn03207941\nn07716906\nn03595614\nn07875152\nn04560804\nn04479046\nn03127925\nn07248320\nn02342885\nn02088466\nn03485407\nn09399592\nn04039381\nn04548280\nn02099267\nn04254777\nn06785654\nn02190166\nn03868242\nn04141076\nn02980441\nn03868863\nn02437312\nn02096177\nn02701002\nn03259280\nn02834397\nn15075141\nn07880968\nn02096585\nn09256479\nn02091032\nn03457902\nn02099849\nn02398521\nn02129165\nn03404251\nn01774384\nn03977966\nn02980441\nn02137549\nn03920288\nn01770081\nn03891332\nn03196217\nn02782093\nn02510455\nn03535780\nn04263257\nn02790996\nn03146219\nn01601694\nn03379051\nn03188531\nn02790996\nn04596742\nn01560419\nn03376595\nn12768682\nn02504013\nn03388043\nn02231487\nn03134739\nn03775071\nn02509815\nn07695742\nn02325366\nn09835506\nn04418357\nn04483307\nn04069434\nn03991062\nn02487347\nn03223299\nn02817516\nn03207743\nn02110627\nn04604644\nn02112350\nn02109961\nn03534580\nn03208938\nn03125729\nn03947888\nn04154565\nn01860187\nn02328150\nn02777292\nn02112018\nn02113978\nn02033041\nn07871810\nn10148035\nn01981276\nn07860988\nn03492542\nn04005630\nn02093428\nn04355933\nn02108089\nn03841143\nn02704792\nn02277742\nn03874599\nn04371774
\nn01775062\nn03461385\nn02096585\nn02093754\nn02011460\nn02814533\nn02787622\nn02114367\nn01641577\nn03992509\nn04265275\nn02096051\nn07745940\nn02422106\nn01496331\nn03188531\nn07614500\nn02101006\nn02101006\nn13040303\nn02085936\nn03961711\nn02093991\nn07714571\nn01986214\nn01669191\nn01984695\nn03297495\nn02108422\nn03249569\nn04398044\nn03775546\nn01986214\nn04579432\nn07714571\nn01945685\nn02640242\nn06785654\nn04116512\nn02099429\nn09229709\nn01682714\nn01749939\nn02007558\nn01498041\nn04507155\nn02124075\nn02101006\nn02104029\nn02676566\nn02606052\nn04238763\nn02101388\nn02107312\nn03347037\nn02493509\nn02396427\nn04065272\nn03840681\nn04515003\nn02091635\nn02325366\nn04033901\nn01675722\nn03788365\nn13037406\nn03527444\nn01695060\nn04328186\nn07590611\nn01728572\nn02119022\nn02974003\nn02410509\nn07892512\nn07730033\nn04330267\nn03868863\nn02018207\nn02500267\nn02980441\nn01843065\nn02093859\nn02094114\nn07768694\nn04154565\nn02123394\nn03843555\nn02123159\nn02107574\nn01795545\nn02917067\nn02071294\nn03895866\nn03179701\nn03950228\nn04259630\nn02165105\nn02120079\nn02804610\nn02279972\nn01728920\nn02978881\nn03710637\nn01872401\nn03160309\nn02442845\nn09256479\nn02950826\nn02841315\nn04357314\nn02865351\nn04111531\nn07747607\nn03594945\nn03763968\nn04606251\nn03895866\nn02113978\nn04554684\nn04344873\nn04254120\nn01740131\nn03976467\nn07753275\nn02443484\nn02939185\nn02977058\nn13037406\nn07747607\nn04467665\nn01784675\nn04536866\nn02123159\nn02119789\nn04548362\nn02111129\nn06794110\nn04239074\nn03733805\nn02088466\nn03764736\nn01914609\nn02105505\nn02412080\nn04254680\nn04523525\nn07697537\nn01728920\nn02794156\nn02113978\nn13040303\nn01514859\nn04398044\nn02364673\nn01924916\nn02007558\nn03803284\nn02795169\nn03916031\nn02088238\nn02086646\nn03063689\nn01806143\nn04366367\nn03109150\nn04523525\nn04208210\nn01978287\nn03272010\nn03146219\nn03933933\nn04525305\nn03124043\nn02510455\nn01687978\nn01824575\nn04613696\nn06359193\nn03110669\nn03388183\nn036914
59\nn02280649\nn03133878\nn02085782\nn02087046\nn02090721\nn02497673\nn04344873\nn04330267\nn01514859\nn02488702\nn04525038\nn07711569\nn01978455\nn01768244\nn02105855\nn04604644\nn02281406\nn01739381\nn01693334\nn02113978\nn07749582\nn03786901\nn01883070\nn09246464\nn03841143\nn03482405\nn12998815\nn03938244\nn04238763\nn03929855\nn02892201\nn02486261\nn02676566\nn01843065\nn01728920\nn03379051\nn02823750\nn02776631\nn02488291\nn02317335\nn02002724\nn01755581\nn03110669\nn04019541\nn03095699\nn04004767\nn03877845\nn02120505\nn02113624\nn07695742\nn03127747\nn03041632\nn01744401\nn02098286\nn02100735\nn02264363\nn04456115\nn02219486\nn02129165\nn04275548\nn03874599\nn03706229\nn01770081\nn02988304\nn02105505\nn02130308\nn02113799\nn06596364\nn02028035\nn01784675\nn04266014\nn02422106\nn03271574\nn01622779\nn04229816\nn02988304\nn02977058\nn03594734\nn03196217\nn04008634\nn03947888\nn03032252\nn02037110\nn03424325\nn03873416\nn03379051\nn02096437\nn03887697\nn04154565\nn03803284\nn06794110\nn03956157\nn03297495\nn03444034\nn09256479\nn02317335\nn03871628\nn04192698\nn07873807\nn02793495\nn03764736\nn02483362\nn01773797\nn03788195\nn03032252\nn04311174\nn02111889\nn03970156\nn04447861\nn02018795\nn03666591\nn03314780\nn02229544\nn02172182\nn02486410\nn02607072\nn02276258\nn04254777\nn02403003\nn02094114\nn09246464\nn02114367\nn03788365\nn03297495\nn02492660\nn04326547\nn03201208\nn04286575\nn03492542\nn03877472\nn01910747\nn01608432\nn02490219\nn03710637\nn04344873\nn02951358\nn01498041\nn01729322\nn04409515\nn04146614\nn03873416\nn02090721\nn04081281\nn03976467\nn02837789\nn04409515\nn03759954\nn02168699\nn03127925\nn03970156\nn01665541\nn03160309\nn04251144\nn04311174\nn02098413\nn02480855\nn01773549\nn02489166\nn03494278\nn02229544\nn01729977\nn04552348\nn04033995\nn01882714\nn04366367\nn03271574\nn03666591\nn02093428\nn02791124\nn03384352\nn03498962\nn03709823\nn02422699\nn02085782\nn04133789\nn02486261\nn12985857\nn04372370\nn03857828\nn04367480\nn04612504\nn0439
9382\nn01632458\nn03717622\nn02514041\nn02018207\nn07615774\nn02098413\nn03691459\nn02108915\nn07920052\nn04228054\nn04493381\nn04081281\nn03832673\nn13052670\nn04584207\nn04252225\nn01608432\nn02708093\nn04398044\nn02087046\nn04599235\nn02177972\nn02326432\nn02490219\nn03761084\nn02101556\nn04599235\nn04467665\nn02097658\nn01978287\nn04612504\nn02397096\nn03018349\nn02391049\nn07584110\nn02457408\nn01776313\nn02120079\nn02727426\nn02791270\nn04590129\nn02058221\nn03599486\nn03788365\nn02098105\nn02097047\nn03794056\nn02966193\nn01494475\nn02514041\nn01773157\nn07613480\nn09332890\nn02086910\nn02071294\nn02105412\nn02966193\nn02481823\nn04228054\nn02825657\nn03775071\nn02096177\nn02328150\nn01768244\nn03028079\nn03534580\nn01484850\nn09428293\nn03788365\nn02106550\nn03782006\nn04258138\nn03710637\nn02097298\nn03721384\nn02391049\nn02013706\nn02840245\nn03249569\nn02454379\nn02865351\nn02206856\nn02093991\nn01877812\nn03485407\nn02101388\nn03014705\nn04456115\nn03976657\nn03188531\nn02342885\nn02096437\nn02102318\nn03376595\nn03271574\nn02177972\nn03594945\nn03126707\nn02099712\nn01692333\nn02966687\nn03930313\nn01667778\nn07716906\nn01580077\nn03804744\nn02111277\nn03100240\nn04548280\nn02814533\nn04204347\nn04141327\nn02066245\nn02096585\nn02102480\nn03125729\nn03272010\nn03980874\nn07753592\nn02105412\nn02443114\nn04579432\nn02101556\nn03995372\nn02950826\nn01534433\nn02088238\nn07715103\nn02795169\nn01484850\nn01753488\nn02607072\nn01530575\nn01692333\nn04153751\nn02111500\nn03131574\nn03803284\nn02437312\nn02974003\nn02776631\nn04125021\nn09428293\nn02843684\nn03047690\nn02417914\nn03998194\nn03110669\nn02445715\nn04525305\nn03998194\nn01514668\nn02321529\nn02088466\nn01644373\nn07714571\nn04357314\nn03991062\nn02088094\nn02687172\nn02110185\nn02089078\nn09468604\nn02408429\nn04389033\nn03706229\nn02488702\nn03992509\nn02417914\nn04086273\nn07613480\nn04270147\nn03887697\nn01601694\nn02123159\nn01518878\nn07836838\nn04443257\nn01592084\nn03109150\nn02264363\nn02
808304\nn04252225\nn01630670\nn04507155\nn03047690\nn03344393\nn02981792\nn03680355\nn07579787\nn02526121\nn01984695\nn04485082\nn03814639\nn02977058\nn03866082\nn04404412\nn04116512\nn03100240\nn03127925\nn01847000\nn02051845\nn02177972\nn02106030\nn03770679\nn03535780\nn03676483\nn01843383\nn01873310\nn02085936\nn02328150\nn03089624\nn02102318\nn02500267\nn04040759\nn04552348\nn02101006\nn07749582\nn03884397\nn02111129\nn03662601\nn03250847\nn02129604\nn03461385\nn03970156\nn04317175\nn03958227\nn07714990\nn01980166\nn03929660\nn03314780\nn01855032\nn03630383\nn01817953\nn02095889\nn04505470\nn02727426\nn03598930\nn02105855\nn02115913\nn03110669\nn10148035\nn02106550\nn02086079\nn04380533\nn10565667\nn03249569\nn02095889\nn02492660\nn07873807\nn02797295\nn04209239\nn02786058\nn02837789\nn02841315\nn02704792\nn03935335\nn04562935\nn02099429\nn02112137\nn03325584\nn04442312\nn04033995\nn07614500\nn02108089\nn03710721\nn03100240\nn02093859\nn02906734\nn04254777\nn07871810\nn02422106\nn04049303\nn03961711\nn02777292\nn04443257\nn04597913\nn02927161\nn03424325\nn03032252\nn02795169\nn02123394\nn01498041\nn01751748\nn03793489\nn03345487\nn02091635\nn02123159\nn02107142\nn02484975\nn03666591\nn03085013\nn04325704\nn03208938\nn04562935\nn04152593\nn09472597\nn07875152\nn04597913\nn04099969\nn03976657\nn02028035\nn03796401\nn02917067\nn02110958\nn02730930\nn02802426\nn02917067\nn02704792\nn07760859\nn02123597\nn01981276\nn01688243\nn03400231\nn02088238\nn07753275\nn02100583\nn01955084\nn02777292\nn01534433\nn03908714\nn02120079\nn04465501\nn02641379\nn02098286\nn01534433\nn02917067\nn04371774\nn02110958\nn03538406\nn03443371\nn03902125\nn03075370\nn04336792\nn02091831\nn02510455\nn02097047\nn03908618\nn02817516\nn02111889\nn01531178\nn02481823\nn03110669\nn02095570\nn03982430\nn03444034\nn07714571\nn07932039\nn01768244\nn02837789\nn03637318\nn04141975\nn01910747\nn03873416\nn03018349\nn02114548\nn07717556\nn03494278\nn03924679\nn02012849\nn02361337\nn02398521\nn03443371\nn
07615774\nn02009912\nn02395406\nn02777292\nn02783161\nn02445715\nn03743016\nn03891332\nn04542943\nn15075141\nn02091244\nn02114367\nn03404251\nn03000134\nn01667114\nn03763968\nn02233338\nn09428293\nn03793489\nn04258138\nn04023962\nn01667778\nn03899768\nn13133613\nn03599486\nn03042490\nn04467665\nn03633091\nn02437616\nn02835271\nn03791053\nn04486054\nn07717410\nn07613480\nn01728920\nn03400231\nn02790996\nn02676566\nn04562935\nn02264363\nn04141975\nn03089624\nn03954731\nn03467068\nn02690373\nn02102040\nn01985128\nn04116512\nn02497673\nn04392985\nn03937543\nn02006656\nn01773549\nn02704792\nn02999410\nn07930864\nn02011460\nn02107312\nn02910353\nn01795545\nn04111531\nn02894605\nn01614925\nn02793495\nn02100877\nn03761084\nn02504013\nn02408429\nn07583066\nn01744401\nn03447447\nn03125729\nn01978287\nn04346328\nn03742115\nn02483708\nn13054560\nn02096177\nn03920288\nn02837789\nn03877472\nn02165105\nn03937543\nn03982430\nn03787032\nn07880968\nn04371774\nn04146614\nn03394916\nn03903868\nn02687172\nn01494475\nn02536864\nn02129165\nn07920052\nn01496331\nn02009912\nn02692877\nn02101006\nn03271574\nn04371774\nn01496331\nn04557648\nn02027492\nn02125311\nn03376595\nn01872401\nn04346328\nn02091134\nn04238763\nn01776313\nn01796340\nn01770081\nn03141823\nn01665541\nn04133789\nn02096437\nn02096051\nn10565667\nn04542943\nn03447447\nn09421951\nn02113624\nn03160309\nn02504458\nn01774750\nn03871628\nn04590129\nn12057211\nn03481172\nn03000247\nn04090263\nn04141076\nn01914609\nn03775071\nn02869837\nn04509417\nn04371430\nn02097209\nn04613696\nn02669723\nn02883205\nn01748264\nn01955084\nn04204238\nn03743016\nn02177972\nn03868863\nn04133789\nn02168699\nn04041544\nn02115913\nn02259212\nn02096177\nn02277742\nn04493381\nn02093859\nn03160309\nn04120489\nn09246464\nn04005630\nn03938244\nn03208938\nn04033901\nn02835271\nn04049303\nn02951585\nn04229816\nn01755581\nn01734418\nn01843065\nn02114367\nn09288635\nn04147183\nn03196217\nn04367480\nn03467068\nn01491361\nn02091831\nn04154565\nn07875152\nn07873807\
nn02690373\nn02730930\nn04389033\nn02879718\nn03223299\nn01784675\nn03447721\nn01742172\nn01728572\nn12985857\nn03376595\nn03089624\nn03887697\nn04270147\nn01930112\nn02814533\nn07802026\nn07920052\nn03425413\nn06596364\nn03134739\nn02108422\nn12998815\nn07753113\nn02056570\nn09256479\nn04238763\nn02951585\nn04033901\nn01833805\nn01737021\nn01694178\nn06785654\nn02500267\nn02085782\nn03825788\nn03899768\nn01843383\nn02782093\nn01855672\nn04239074\nn04604644\nn07583066\nn03041632\nn02777292\nn03627232\nn03884397\nn02328150\nn04005630\nn02093859\nn01749939\nn03000134\nn04037443\nn03888257\nn01824575\nn07875152\nn02526121\nn07920052\nn02102040\nn02869837\nn02099849\nn04356056\nn01749939\nn02442845\nn04487081\nn02087046\nn04201297\nn02094433\nn02480495\nn02096585\nn01518878\nn04141975\nn02981792\nn01632458\nn02093647\nn02018207\nn04040759\nn01820546\nn03840681\nn03832673\nn02051845\nn01883070\nn03534580\nn02028035\nn03857828\nn01682714\nn04049303\nn02096585\nn04254120\nn02071294\nn03868863\nn02206856\nn04086273\nn02177972\nn02085782\nn03942813\nn01496331\nn04355933\nn02790996\nn04265275\nn03976467\nn02279972\nn02086240\nn01824575\nn09421951\nn02123159\nn02086079\nn07717410\nn02422106\nn02236044\nn01608432\nn03062245\nn07734744\nn01983481\nn04542943\nn01773797\nn02526121\nn01688243\nn01990800\nn02169497\nn01768244\nn01770393\nn03977966\nn02096585\nn03532672\nn07711569\nn01734418\nn04326547\nn09332890\nn04584207\nn02114712\nn02093754\nn03495258\nn01616318\nn02326432\nn04507155\nn03527444\nn01981276\nn02097298\nn03958227\nn02165105\nn07718472\nn04591157\nn04286575\nn04208210\nn02120505\nn04265275\nn04147183\nn03271574\nn02128385\nn02110958\nn03888257\nn02730930\nn01978455\nn02843684\nn03590841\nn03065424\nn03854065\nn01739381\nn01773797\nn03976657\nn04116512\nn02092339\nn01817953\nn02119789\nn01748264\nn02169497\nn03125729\nn02091467\nn07714571\nn02704792\nn02085936\nn02108915\nn03314780\nn02086646\nn07697537\nn03584829\nn03773504\nn04204347\nn01796340\nn03930313\nn0203304
1\nn02236044\nn02895154\nn02708093\nn02115641\nn04209239\nn01735189\nn03201208\nn09468604\nn03047690\nn04254777\nn06596364\nn03627232\nn01532829\nn01694178\nn04081281\nn03495258\nn02788148\nn01775062\nn04355933\nn03017168\nn04599235\nn03785016\nn07871810\nn03980874\nn02071294\nn04493381\nn04372370\nn02087046\nn04584207\nn04086273\nn02092339\nn02817516\nn03240683\nn12998815\nn03075370\nn02804414\nn01833805\nn01695060\nn04596742\nn04398044\nn02106382\nn04204238\nn02219486\nn02437312\nn04335435\nn01531178\nn04201297\nn03920288\nn03759954\nn03792782\nn02412080\nn04536866\nn03874293\nn02708093\nn02437312\nn04509417\nn01990800\nn04579145\nn02480495\nn04371430\nn02105056\nn03930630\nn03481172\nn02808440\nn07932039\nn04428191\nn02971356\nn02090379\nn03857828\nn02988304\nn02115913\nn04599235\nn04033901\nn11879895\nn03014705\nn02002724\nn02445715\nn02870880\nn02951585\nn02129604\nn02123394\nn01860187\nn03788195\nn03729826\nn01665541\nn01531178\nn04442312\nn02777292\nn13044778\nn07720875\nn02027492\nn02480855\nn04447861\nn02403003\nn03874599\nn01622779\nn02860847\nn03884397\nn13040303\nn03796401\nn03388549\nn03970156\nn02112137\nn03775071\nn01601694\nn02093991\nn01664065\nn02077923\nn02487347\nn02444819\nn02480855\nn04505470\nn03980874\nn03447447\nn01955084\nn02056570\nn03127747\nn02692877\nn06596364\nn03400231\nn03482405\nn03920288\nn03871628\nn03496892\nn12267677\nn04310018\nn02865351\nn01924916\nn03000247\nn03393912\nn02825657\nn06785654\nn02097474\nn04179913\nn02112350\nn03444034\nn03133878\nn02132136\nn02843684\nn01770393\nn01871265\nn03290653\nn03207941\nn03476991\nn03481172\nn04590129\nn01532829\nn03642806\nn03388183\nn02094258\nn03496892\nn04467665\nn02963159\nn02328150\nn02101388\nn09256479\nn03777568\nn02165456\nn03042490\nn02363005\nn13054560\nn02808440\nn04532670\nn01688243\nn03602883\nn02206856\nn03400231\nn02346627\nn01871265\nn01806567\nn02727426\nn04067472\nn02088094\nn04553703\nn13037406\nn07718472\nn04252077\nn04258138\nn02808440\nn02328150\nn03325584\nn01774
750\nn02123159\nn02111277\nn04591157\nn03871628\nn03775071\nn04136333\nn03976467\nn03908618\nn03483316\nn04487394\nn02769748\nn04523525\nn12998815\nn04553703\nn04152593\nn02346627\nn02007558\nn03110669\nn01440764\nn09472597\nn02730930\nn02782093\nn04483307\nn02028035\nn04040759\nn03372029\nn02808440\nn02120505\nn03141823\nn02100236\nn01770393\nn01739381\nn03208938\nn03954731\nn04536866\nn04456115\nn03000247\nn04612504\nn02837789\nn03538406\nn02699494\nn03967562\nn04398044\nn03710721\nn04356056\nn04033995\nn02415577\nn04270147\nn03866082\nn03271574\nn02133161\nn03483316\nn01514668\nn03770679\nn04532670\nn03720891\nn02096437\nn03444034\nn02088632\nn02328150\nn02787622\nn12998815\nn07716358\nn02817516\nn03961711\nn02823428\nn01753488\nn02443114\nn04370456\nn04542943\nn03876231\nn02509815\nn04371430\nn04141975\nn02112350\nn02321529\nn02097474\nn04461696\nn03804744\nn02786058\nn12768682\nn01855032\nn03992509\nn01773797\nn02443484\nn02101006\nn09421951\nn03837869\nn04356056\nn01744401\nn02701002\nn03977966\nn02105056\nn02102318\nn03095699\nn01728572\nn01873310\nn03930313\nn03930630\nn06359193\nn02033041\nn04604644\nn03781244\nn04599235\nn02114548\nn02356798\nn03271574\nn07932039\nn02100735\nn04069434\nn04346328\nn09332890\nn12768682\nn02795169\nn04049303\nn02403003\nn04239074\nn02493793\nn02127052\nn04317175\nn02363005\nn03832673\nn04296562\nn03630383\nn01739381\nn02107683\nn02012849\nn03786901\nn04033995\nn03782006\nn02113624\nn02783161\nn02134418\nn03532672\nn02012849\nn02415577\nn02096437\nn03220513\nn01945685\nn02892201\nn04044716\nn07742313\nn03376595\nn02643566\nn01735189\nn01729977\nn02105251\nn09421951\nn02099712\nn03388043\nn02174001\nn04147183\nn02013706\nn13054560\nn02978881\nn09246464\nn02699494\nn02107312\nn03017168\nn07745940\nn02233338\nn02791270\nn01950731\nn03857828\nn02025239\nn03452741\nn02101388\nn03388549\nn01484850\nn02111277\nn01950731\nn02174001\nn02105162\nn02480855\nn03325584\nn03272562\nn03876231\nn01644373\nn04380533\nn07697537\nn04380533\nn021
90166\nn07753592\nn01630670\nn02730930\nn03788195\nn02669723\nn02100735\nn03271574\nn03179701\nn02486261\nn02105412\nn02417914\nn01770081\nn02123394\nn01855672\nn02480495\nn02692877\nn01532829\nn04372370\nn01910747\nn03400231\nn02444819\nn04099969\nn03498962\nn04154565\nn02783161\nn03124170\nn03417042\nn04254120\nn07717410\nn04372370\nn07565083\nn03661043\nn04074963\nn02504458\nn03720891\nn03445924\nn03873416\nn03775071\nn02443114\nn03623198\nn03000247\nn02423022\nn03929660\nn02782093\nn01930112\nn01776313\nn03388183\nn02133161\nn02782093\nn03393912\nn03794056\nn09256479\nn07920052\nn03384352\nn02666196\nn02894605\nn03476684\nn02526121\nn02123045\nn03673027\nn03197337\nn02114548\nn04599235\nn02085936\nn02963159\nn04258138\nn03983396\nn03187595\nn03290653\nn03179701\nn01531178\nn02398521\nn02119789\nn02089867\nn04548362\nn02486410\nn01704323\nn01494475\nn04141327\nn02790996\nn02056570\nn02106166\nn02018795\nn04523525\nn03598930\nn04118776\nn03662601\nn04509417\nn02606052\nn02966193\nn03775071\nn02317335\nn03146219\nn03355925\nn02229544\nn02443114\nn03355925\nn04590129\nn02804414\nn02114367\nn03379051\nn02138441\nn03461385\nn04200800\nn03584829\nn01755581\nn04335435\nn03127747\nn04263257\nn04192698\nn01622779\nn02422699\nn02107683\nn04532670\nn02906734\nn02804414\nn12768682\nn02108089\nn02909870\nn03837869\nn02113186\nn02112350\nn01677366\nn03630383\nn02526121\nn02840245\nn01687978\nn04515003\nn15075141\nn02841315\nn02422106\nn02783161\nn02814533\nn02102177\nn02415577\nn03782006\nn01770081\nn02114548\nn03958227\nn01728920\nn03494278\nn01873310\nn02894605\nn01833805\nn03160309\nn04458633\nn03223299\nn12620546\nn12998815\nn01496331\nn04461696\nn01981276\nn03595614\nn02101388\nn03937543\nn03100240\nn03791053\nn04613696\nn02134084\nn04141975\nn02093859\nn03125729\nn02326432\nn03680355\nn03998194\nn01494475\nn02342885\nn03976657\nn01819313\nn04606251\nn01740131\nn02797295\nn02123394\nn02169497\nn03630383\nn01689811\nn03950228\nn07584110\nn04591713\nn04127249\nn12144580\nn0
7831146\nn03791053\nn02808440\nn02793495\nn02437312\nn02138441\nn02111500\nn02109961\nn03459775\nn03126707\nn03388549\nn02096294\nn03961711\nn04209133\nn04243546\nn02791270\nn01685808\nn02965783\nn03775546\nn02074367\nn03775546\nn03584254\nn02119789\nn02437312\nn03888257\nn03187595\nn02123045\nn03937543\nn02412080\nn01729322\nn03908714\nn02125311\nn01494475\nn02894605\nn03908618\nn02114855\nn02123159\nn03598930\nn02107142\nn03290653\nn02791124\nn03803284\nn03937543\nn03388043\nn03131574\nn02788148\nn02106382\nn04467665\nn02100877\nn04330267\nn03697007\nn03710721\nn02403003\nn02108089\nn03017168\nn03733281\nn03792972\nn02105056\nn01806567\nn01630670\nn03337140\nn03467068\nn01873310\nn02398521\nn02013706\nn04120489\nn02708093\nn02110341\nn03770679\nn02480495\nn03450230\nn03584254\nn02823750\nn04127249\nn02410509\nn04562935\nn04019541\nn04613696\nn01632777\nn07836838\nn02114855\nn02100236\nn02102318\nn07831146\nn03742115\nn03662601\nn03720891\nn02804610\nn02107142\nn03733131\nn03791053\nn03991062\nn02808304\nn03594945\nn02749479\nn04562935\nn02134084\nn02342885\nn03538406\nn02107683\nn02012849\nn01682714\nn02988304\nn07932039\nn02206856\nn03447447\nn01753488\nn01755581\nn02119022\nn04597913\nn03314780\nn02865351\nn03459775\nn01530575\nn04335435\nn09288635\nn02769748\nn02256656\nn03131574\nn03770439\nn02123045\nn02096177\nn04131690\nn02397096\nn01798484\nn02107574\nn02113186\nn01855672\nn03791053\nn03770679\nn01983481\nn02093256\nn01968897\nn02692877\nn02356798\nn07875152\nn02107312\nn02837789\nn03042490\nn03188531\nn03447721\nn02825657\nn03868242\nn04552348\nn01770081\nn02095314\nn04204347\nn02087394\nn04065272\nn02132136\nn02134418\nn01632777\nn04325704\nn03776460\nn01955084\nn02129604\nn01644900\nn02101006\nn04357314\nn12985857\nn03670208\nn07760859\nn04067472\nn02099849\nn03770679\nn02978881\nn03623198\nn03717622\nn04536866\nn02835271\nn07717410\nn04429376\nn02869837\nn03124170\nn01632458\nn01531178\nn03127925\nn02097047\nn03950228\nn03028079\nn02107312\nn13052670\n
n02090721\nn07711569\nn02091831\nn01530575\nn04146614\nn01667114\nn03958227\nn02098286\nn07871810\nn01980166\nn02412080\nn02500267\nn01924916\nn04254680\nn02480495\nn01774384\nn03216828\nn07711569\nn03026506\nn01749939\nn03344393\nn03938244\nn02098105\nn01986214\nn01917289\nn04418357\nn02058221\nn02106030\nn02966193\nn03032252\nn02206856\nn03063599\nn02107312\nn03843555\nn02108551\nn01855672\nn02107142\nn02102040\nn04357314\nn04505470\nn03529860\nn02437312\nn02129604\nn03773504\nn02100877\nn03877472\nn04501370\nn07880968\nn04458633\nn02167151\nn03721384\nn02102480\nn07579787\nn02123394\nn02484975\nn03942813\nn04270147\nn03777568\nn02085782\nn01729977\nn04404412\nn04311174\nn03160309\nn02454379\nn02096294\nn04065272\nn02483362\nn02364673\nn03100240\nn07873807\nn03594734\nn04344873\nn07590611\nn01883070\nn03770439\nn03141823\nn02133161\nn01689811\nn01833805\nn02814860\nn04367480\nn03710637\nn07714571\nn02071294\nn01768244\nn03388183\nn01847000\nn03325584\nn01667114\nn02236044\nn04141327\nn03467068\nn01687978\nn04285008\nn03483316\nn03447447\nn02264363\nn02097209\nn04501370\nn09468604\nn02930766\nn01917289\nn04554684\nn02979186\nn02442845\nn03345487\nn02486410\nn02841315\nn03899768\nn09399592\nn03344393\nn02088364\nn03763968\nn02105162\nn04235860\nn03903868\nn09428293\nn03661043\nn03249569\nn02268443\nn02444819\nn02116738\nn03902125\nn02093991\nn02110185\nn03832673\nn03983396\nn07716358\nn02113712\nn03887697\nn03424325\nn03958227\nn01534433\nn02086646\nn04591713\nn07753113\nn03841143\nn02790996\nn02165456\nn02009229\nn02814860\nn04462240\nn02730930\nn02085620\nn02098413\nn03337140\nn02807133\nn04263257\nn02108422\nn02138441\nn01630670\nn04008634\nn02113799\nn02643566\nn12057211\nn01665541\nn04404412\nn03691459\nn01729977\nn03290653\nn01924916\nn02486410\nn04332243\nn13052670\nn03598930\nn02437616\nn02093991\nn01729977\nn02115641\nn02825657\nn02786058\nn02788148\nn02094258\nn02793495\nn03388043\nn02128757\nn02443484\nn02088094\nn03110669\nn01985128\nn07714990\nn02869837
\nn03595614\nn04592741\nn02127052\nn07880968\nn02643566\nn09256479\nn02356798\nn02509815\nn04487394\nn03721384\nn01728572\nn02992211\nn03877845\nn02231487\nn02445715\nn02095570\nn04579145\nn03706229\nn02107574\nn01833805\nn01629819\nn03445777\nn03710721\nn03014705\nn04336792\nn04311174\nn03724870\nn03920288\nn03063689\nn03908618\nn02085620\nn02699494\nn02096437\nn03804744\nn04209239\nn03249569\nn11939491\nn01882714\nn02129165\nn03773504\nn04346328\nn02102040\nn12620546\nn02177972\nn02066245\nn03492542\nn02090721\nn04482393\nn01914609\nn02174001\nn02233338\nn01693334\nn01665541\nn02280649\nn01514668\nn01641577\nn02107683\nn04040759\nn03355925\nn04579432\nn02280649\nn02361337\nn03937543\nn03891251\nn02492035\nn03759954\nn03763968\nn01582220\nn03866082\nn04086273\nn04330267\nn04476259\nn04118776\nn03180011\nn03838899\nn03627232\nn04264628\nn02101006\nn02113624\nn02395406\nn01675722\nn04090263\nn03785016\nn02137549\nn02277742\nn03642806\nn07718472\nn03447447\nn03792782\nn04008634\nn04254777\nn01631663\nn04254680\nn02074367\nn01744401\nn03127747\nn02190166\nn03623198\nn02607072\nn02877765\nn02790996\nn02992529\nn02492660\nn02117135\nn01580077\nn03028079\nn02102040\nn01494475\nn04461696\nn01917289\nn04146614\nn04004767\nn02906734\nn01560419\nn02085936\nn12267677\nn03075370\nn01682714\nn02669723\nn01751748\nn02999410\nn10148035\nn02797295\nn03958227\nn03134739\nn01860187\nn02443114\nn03028079\nn03495258\nn03787032\nn02108089\nn01687978\nn01484850\nn02098105\nn03942813\nn02109525\nn04613696\nn01631663\nn09835506\nn01784675\nn02137549\nn09472597\nn02895154\nn03676483\nn04209239\nn01784675\nn03028079\nn03355925\nn03483316\nn03337140\nn03495258\nn04311004\nn04270147\nn03791053\nn02488702\nn02895154\nn02100583\nn10565667\nn04548280\nn02091134\nn01806567\nn02264363\nn02708093\nn02111277\nn02692877\nn03837869\nn03240683\nn03773504\nn03706229\nn03742115\nn01734418\nn12998815\nn03452741\nn06596364\nn03041632\nn02096585\nn04317175\nn07892512\nn01755581\nn03777568\nn03457902\nn021063
82\nn01601694\nn03691459\nn02114855\nn03461385\nn02096294\nn03498962\nn04482393\nn02412080\nn03857828\nn02124075\nn02106550\nn03950228\nn07730033\nn02093991\nn07768694\nn02870880\nn02672831\nn02268443\nn03773504\nn09332890\nn02025239\nn04562935\nn07742313\nn04192698\nn04049303\nn01644900\nn02769748\nn01774384\nn02894605\nn03127747\nn03045698\nn03388549\nn03724870\nn03706229\nn03825788\nn01775062\nn03670208\nn02492035\nn01983481\nn04435653\nn03028079\nn03445924\nn02108000\nn01882714\nn02346627\nn09399592\nn12620546\nn03047690\nn02807133\nn03630383\nn03325584\nn02110063\nn07860988\nn01443537\nn04523525\nn02112706\nn02815834\nn03720891\nn03843555\nn02992211\nn02107908\nn03662601\nn03207743\nn04507155\nn02094433\nn02791270\nn02788148\nn02094258\nn02105162\nn04179913\nn07930864\nn03873416\nn02027492\nn02790996\nn03924679\nn07753275\nn03658185\nn02444819\nn07802026\nn01484850\nn02113186\nn02110341\nn02090622\nn04366367\nn01773157\nn03792972\nn02690373\nn02090622\nn06794110\nn02101388\nn07697313\nn03297495\nn03032252\nn01688243\nn02090379\nn02017213\nn04152593\nn02108551\nn03658185\nn02643566\nn04049303\nn03544143\nn03709823\nn01632458\nn02111500\nn07717556\nn01688243\nn07747607\nn01592084\nn03485794\nn02443114\nn03888257\nn07753592\nn01930112\nn03127747\nn01580077\nn12057211\nn03344393\nn03697007\nn01601694\nn01818515\nn04517823\nn04584207\nn02002724\nn03424325\nn03895866\nn03787032\nn02100236\nn03110669\nn04523525\nn01983481\nn04465501\nn02090721\nn02980441\nn02088094\nn02492035\nn03109150\nn02091635\nn07695742\nn02074367\nn07754684\nn02783161\nn03761084\nn02096585\nn04099969\nn01930112\nn03379051\nn02105412\nn02097298\nn04026417\nn03866082\nn04004767\nn01704323\nn04286575\nn02321529\nn04417672\nn04389033\nn02909870\nn01685808\nn01806143\nn02006656\nn03832673\nn07697313\nn07932039\nn02206856\nn12144580\nn02108422\nn07753113\nn03777754\nn04259630\nn02641379\nn13052670\nn03788365\nn02870880\nn02799071\nn02137549\nn02999410\nn04317175\nn02094114\nn03529860\nn03188531\nn0316
0309\nn03697007\nn02091831\nn03594734\nn04389033\nn02799071\nn07747607\nn02504458\nn04277352\nn01914609\nn02281787\nn03868863\nn09421951\nn03792782\nn02102318\nn01484850\nn04192698\nn02089867\nn03584254\nn01728572\nn03062245\nn02109047\nn02108422\nn02088632\nn02447366\nn02236044\nn02910353\nn02105056\nn03498962\nn03250847\nn04120489\nn02999410\nn03467068\nn03187595\nn03255030\nn04004767\nn02091635\nn04507155\nn03782006\nn02317335\nn02165456\nn04243546\nn02099849\nn04239074\nn09246464\nn04335435\nn03770439\nn01978455\nn01644373\nn02256656\nn02509815\nn03584254\nn03710721\nn01795545\nn07753592\nn02412080\nn07892512\nn02091032\nn04074963\nn03197337\nn03075370\nn02111129\nn03930630\nn01770081\nn04235860\nn02132136\nn02100735\nn01978287\nn02097658\nn04540053\nn04149813\nn02105251\nn01984695\nn03314780\nn02115641\nn04235860\nn02843684\nn04311004\nn04118776\nn02276258\nn02909870\nn02701002\nn02051845\nn04599235\nn01689811\nn03637318\nn03344393\nn04591713\nn02018795\nn02795169\nn04462240\nn03776460\nn03404251\nn03188531\nn07749582\nn01631663\nn02123597\nn02328150\nn02110958\nn02125311\nn04023962\nn03133878\nn03131574\nn02091467\nn01484850\nn02096177\nn01496331\nn02058221\nn03028079\nn02113023\nn02480855\nn02892201\nn04418357\nn03042490\nn03124170\nn12985857\nn04141975\nn01860187\nn02130308\nn04037443\nn13052670\nn07714571\nn02391049\nn04149813\nn04099969\nn01729977\nn04243546\nn02978881\nn03131574\nn02127052\nn04366367\nn02229544\nn01669191\nn02489166\nn07716906\nn03208938\nn02088466\nn02093754\nn01632777\nn04118538\nn02363005\nn02114855\nn09256479\nn02787622\nn02105412\nn03498962\nn12768682\nn03216828\nn03598930\nn02643566\nn03837869\nn07695742\nn01817953\nn01667778\nn04251144\nn02231487\nn04005630\nn03445777\nn04597913\nn07615774\nn02769748\nn01833805\nn01828970\nn01796340\nn01694178\nn03995372\nn03494278\nn03271574\nn03014705\nn02088632\nn03788195\nn02328150\nn02992529\nn03498962\nn02169497\nn02112137\nn02483362\nn07836838\nn02086240\nn01739381\nn02325366\nn03877472\nn04
589890\nn02133161\nn01632777\nn02105162\nn04019541\nn01775062\nn02107574\nn04509417\nn01860187\nn02088632\nn03459775\nn03133878\nn04254680\nn01755581\nn02939185\nn02091134\nn02114712\nn07714990\nn02484975\nn03445924\nn03018349\nn02802426\nn01774384\nn03124043\nn03355925\nn03146219\nn03388183\nn02226429\nn07860988\nn03388183\nn04009552\nn02488291\nn03899768\nn03649909\nn03393912\nn02797295\nn03014705\nn03729826\nn01560419\nn02114367\nn03637318\nn02115641\nn04517823\nn02346627\nn02033041\nn02804414\nn07714990\nn04120489\nn03481172\nn02099267\nn10565667\nn03825788\nn03240683\nn02123597\nn02097130\nn02090721\nn02094433\nn02667093\nn03461385\nn02101388\nn09399592\nn02109047\nn04153751\nn04479046\nn03223299\nn13133613\nn01688243\nn02363005\nn04493381\nn02445715\nn02280649\nn03804744\nn04596742\nn04597913\nn01729322\nn02793495\nn04604644\nn04592741\nn03425413\nn04332243\nn04562935\nn02494079\nn07693725\nn07717410\nn06874185\nn03063689\nn02389026\nn02110627\nn03930630\nn01871265\nn07716358\nn02114712\nn03216828\nn06596364\nn03494278\nn07579787\nn04548280\nn04409515\nn02102040\nn07753113\nn01632777\nn02843684\nn02395406\nn02100583\nn03481172\nn02099849\nn02708093\nn01980166\nn02096294\nn01744401\nn03291819\nn04004767\nn01534433\nn03223299\nn03773504\nn04090263\nn02002724\nn02422106\nn04325704\nn01531178\nn02948072\nn02281787\nn04239074\nn04399382\nn03400231\nn02802426\nn02165456\nn02256656\nn02104029\nn06794110\nn07932039\nn02793495\nn02093754\nn02834397\nn02165456\nn03394916\nn02138441\nn01729977\nn02138441\nn04311174\nn03388043\nn03344393\nn03445924\nn02504013\nn13040303\nn02363005\nn02206856\nn03982430\nn03661043\nn02107574\nn03785016\nn02231487\nn04487394\nn04376876\nn04277352\nn07718472\nn04118776\nn01914609\nn01798484\nn01944390\nn03355925\nn03742115\nn02108089\nn03924679\nn03134739\nn02011460\nn02974003\nn02100583\nn01496331\nn01860187\nn02100236\nn04596742\nn02119789\nn02342885\nn04044716\nn04099969\nn03602883\nn07717556\nn04548280\nn03843555\nn04409515\nn02093647\nn
(Data file contents omitted: a list of ImageNet WordNet synset IDs, one ID per line in the original file — beginning n01797886, n04429376, n03063599, … — whose newlines were flattened to literal "\n" sequences during extraction.)
\nn02486261\nn02112137\nn03075370\nn01601694\nn04004767\nn04273569\nn04275548\nn02966193\nn03443371\nn01755581\nn02100877\nn04325704\nn02090379\nn02088466\nn03347037\nn03691459\nn01616318\nn01820546\nn04009552\nn03637318\nn01795545\nn02108000\nn01843383\nn03908618\nn07753275\nn02950826\nn04069434\nn02701002\nn02799071\nn02786058\nn02526121\nn03459775\nn04552348\nn04462240\nn02108915\nn02088364\nn02791270\nn01682714\nn02123394\nn02101388\nn02840245\nn04493381\nn01990800\nn04162706\nn13054560\nn01632777\nn02093859\nn02025239\nn02797295\nn03179701\nn02980441\nn04596742\nn01980166\nn09835506\nn03445777\nn03110669\nn02094114\nn02086079\nn01443537\nn02110063\nn04355338\nn01560419\nn03355925\nn02119022\nn03447447\nn02219486\nn02113624\nn04523525\nn01983481\nn10565667\nn03803284\nn04367480\nn03400231\nn01980166\nn04596742\nn02417914\nn02514041\nn02033041\nn02094114\nn02134084\nn13040303\nn03763968\nn04111531\nn02090622\nn02486261\nn03452741\nn04458633\nn02094114\nn02097658\nn01978455\nn02988304\nn04229816\nn02892767\nn02804414\nn03240683\nn01443537\nn02088632\nn02172182\nn02786058\nn02701002\nn04515003\nn07693725\nn03594945\nn02100735\nn04204347\nn02093754\nn09428293\nn03958227\nn03042490\nn06359193\nn02102177\nn03445924\nn04141975\nn03690938\nn02108089\nn03075370\nn04517823\nn03208938\nn03958227\nn10148035\nn02444819\nn02092002\nn10565667\nn02437312\nn02280649\nn02909870\nn03977966\nn03110669\nn03777568\nn07930864\nn04560804\nn03888605\nn02120505\nn03014705\nn01744401\nn03770439\nn03393912\nn02727426\nn02093754\nn03379051\nn03788195\nn02099601\nn02481823\nn03291819\nn04127249\nn03803284\nn03794056\nn03478589\nn02009912\nn07579787\nn02951358\nn03297495\nn04517823\nn03794056\nn03854065\nn04325704\nn03902125\nn03207941\nn03160309\nn02727426\nn03498962\nn02056570\nn01530575\nn03290653\nn03133878\nn02099267\nn03742115\nn04273569\nn02977058\nn03724870\nn04597913\nn03763968\nn03201208\nn02672831\nn02096437\nn02916936\nn04398044\nn03110669\nn01580077\nn03775546\nn01665541\nn031091
50\nn01843383\nn01751748\nn04487394\nn02804414\nn04200800\nn03661043\nn01806143\nn01641577\nn02325366\nn03976467\nn02917067\nn01819313\nn04465501\nn01955084\nn03063599\nn04099969\nn02793495\nn02086079\nn02859443\nn03690938\nn13052670\nn02088238\nn02699494\nn03721384\nn02006656\nn02415577\nn02981792\nn02492035\nn03379051\nn02280649\nn03095699\nn03720891\nn03459775\nn02422106\nn01644373\nn03347037\nn02834397\nn03218198\nn03627232\nn04557648\nn02423022\nn01784675\nn03425413\nn04579432\nn07875152\nn03461385\nn03404251\nn03658185\nn07720875\nn01943899\nn12620546\nn03967562\nn02102480\nn02500267\nn02087046\nn03595614\nn02100236\nn07892512\nn04505470\nn01986214\nn02447366\nn01978455\nn03942813\nn02917067\nn02125311\nn04275548\nn02077923\nn01829413\nn04557648\nn02483362\nn03250847\nn02454379\nn02793495\nn03891251\nn03938244\nn03467068\nn02226429\nn02106166\nn04465501\nn04423845\nn02108422\nn02776631\nn01773797\nn03250847\nn04606251\nn01664065\nn04127249\nn04254777\nn02483362\nn03041632\nn01729322\nn02093859\nn02977058\nn04252225\nn02116738\nn02950826\nn03494278\nn02130308\nn03786901\nn04462240\nn03617480\nn04418357\nn02879718\nn03018349\nn03272010\nn03379051\nn01614925\nn02102040\nn01630670\nn03627232\nn13037406\nn09288635\nn07584110\nn02102177\nn03347037\nn01632458\nn01768244\nn03584254\nn04346328\nn03599486\nn03109150\nn03692522\nn15075141\nn01742172\nn02841315\nn13040303\nn02117135\nn02107142\nn04266014\nn03724870\nn07248320\nn02704792\nn03871628\nn01990800\nn02129604\nn02119789\nn02125311\nn04606251\nn07768694\nn03187595\nn04376876\nn04483307\nn02110063\nn02107142\nn02782093\nn04487081\nn01675722\nn01608432\nn03297495\nn02098105\nn01950731\nn04238763\nn02105855\nn04552348\nn02051845\nn02128925\nn02877765\nn02128385\nn02877765\nn01872401\nn01682714\nn03481172\nn02509815\nn02236044\nn02280649\nn02488702\nn03492542\nn01749939\nn03207743\nn03179701\nn02100877\nn01981276\nn03710637\nn03223299\nn01630670\nn03877472\nn01560419\nn02259212\nn04127249\nn03796401\nn04486054\nn0180
7496\nn03492542\nn01694178\nn01740131\nn01985128\nn03637318\nn03584254\nn07717556\nn07753592\nn02791124\nn03786901\nn02965783\nn03733131\nn04458633\nn01614925\nn04435653\nn03534580\nn04532106\nn02276258\nn01697457\nn03187595\nn04590129\nn04004767\nn03877472\nn07248320\nn03207743\nn02892767\nn03976467\nn03133878\nn03594734\nn01877812\nn03785016\nn04613696\nn03534580\nn02013706\nn01985128\nn02110806\nn02441942\nn04554684\nn03916031\nn01748264\nn04204347\nn03450230\nn01622779\nn02799071\nn02017213\nn03201208\nn02487347\nn02497673\nn01795545\nn02487347\nn04487081\nn03710637\nn04026417\nn07747607\nn02092002\nn02701002\nn02492660\nn03995372\nn02415577\nn02091831\nn02423022\nn02165456\nn03666591\nn04604644\nn02107142\nn02951358\nn02219486\nn04542943\nn03777568\nn03787032\nn04332243\nn02927161\nn09288635\nn01704323\nn02091244\nn02894605\nn04554684\nn02085936\nn03014705\nn01871265\nn02113799\nn02107683\nn03347037\nn04296562\nn09256479\nn02110341\nn06874185\nn03967562\nn02708093\nn04344873\nn02437616\nn04523525\nn02099712\nn04404412\nn04277352\nn02948072\nn04111531\nn03452741\nn02966193\nn03452741\nn02100735\nn04597913\nn07747607\nn03764736\nn02123159\nn02107574\nn01729977\nn03976467\nn03788195\nn07717556\nn15075141\nn04596742\nn01729977\nn03042490\nn02102040\nn02093991\nn12144580\nn02107908\nn04612504\nn02981792\nn01644900\nn02128385\nn02128925\nn02110806\nn01748264\nn02777292\nn04209239\nn02112350\nn02361337\nn04141327\nn02229544\nn02281406\nn03895866\nn02108915\nn12768682\nn02106030\nn03218198\nn04133789\nn02093428\nn03461385\nn02119789\nn03444034\nn02877765\nn03724870\nn03773504\nn01698640\nn02504013\nn02231487\nn01558993\nn06785654\nn01981276\nn02389026\nn04277352\nn02687172\nn03291819\nn04447861\nn04310018\nn02486410\nn02105855\nn02948072\nn03785016\nn02002724\nn03417042\nn03188531\nn02259212\nn02776631\nn02951585\nn03337140\nn01751748\nn02879718\nn04277352\nn12057211\nn02951585\nn03967562\nn07714571\nn02085620\nn02510455\nn02869837\nn01980166\nn01756291\nn03792972\nn02
112137\nn03680355\nn03841143\nn07565083\nn07693725\nn07715103\nn01820546\nn01873310\nn03777568\nn01833805\nn02676566\nn03447721\nn02500267\nn03602883\nn04239074\nn04118538\nn04536866\nn04548362\nn02776631\nn01667778\nn03825788\nn03891332\nn04258138\nn04542943\nn02099849\nn03041632\nn04179913\nn01632458\nn01537544\nn02930766\nn03814639\nn02643566\nn03498962\nn01798484\nn02692877\nn03134739\nn03314780\nn02870880\nn07768694\nn04141076\nn03786901\nn03314780\nn02172182\nn02092339\nn03259280\nn07880968\nn02115641\nn01990800\nn12768682\nn07930864\nn03527444\nn02091244\nn03769881\nn01494475\nn03249569\nn02395406\nn03776460\nn12985857\nn02056570\nn02486410\nn01737021\nn02488702\nn01978455\nn01622779\nn02510455\nn01776313\nn07831146\nn02018207\nn02808304\nn01855032\nn03803284\nn02514041\nn02099849\nn01806143\nn03837869\nn03902125\nn02895154\nn04208210\nn02107142\nn01855672\nn02480495\nn04065272\nn03761084\nn02100236\nn02111277\nn02089867\nn04552348\nn02791124\nn02101556\nn02480855\nn02097658\nn03180011\nn03899768\nn02087394\nn02236044\nn02794156\nn04550184\nn02099849\nn02111129\nn03976657\nn01847000\nn04465501\nn03063599\nn03733131\nn09332890\nn02892767\nn01978455\nn02111129\nn03832673\nn04141327\nn02276258\nn03786901\nn02672831\nn01978455\nn02807133\nn03290653\nn03297495\nn02112350\nn02894605\nn03763968\nn02776631\nn04606251\nn03498962\nn04443257\nn04355933\nn02727426\nn12057211\nn04376876\nn02403003\nn03495258\nn04584207\nn04462240\nn01729322\nn03207941\nn02483708\nn10565667\nn03866082\nn04019541\nn04154565\nn13052670\nn02992211\nn03642806\nn03372029\nn03832673\nn03617480\nn01797886\nn04591157\nn04443257\nn03045698\nn03207941\nn04081281\nn02165105\nn02105412\nn02980441\nn02097658\nn02823750\nn02397096\nn03662601\nn01514859\nn03759954\nn02859443\nn02011460\nn03467068\nn04458633\nn02111277\nn01751748\nn03127747\nn03838899\nn07715103\nn02894605\nn02793495\nn07248320\nn03995372\nn02094258\nn03937543\nn03642806\nn02607072\nn03483316\nn02090622\nn04525305\nn02085936\nn03920288\nn
03063599\nn01843065\nn02099267\nn01739381\nn03793489\nn02018207\nn03775071\nn01496331\nn06785654\nn03935335\nn03887697\nn07747607\nn03773504\nn07860988\nn04456115\nn02492035\nn03874293\nn04275548\nn03063689\nn02101006\nn01807496\nn02113978\nn02655020\nn02488702\nn02174001\nn04004767\nn04579432\nn04141975\nn03584254\nn02112706\nn03127747\nn02097047\nn04458633\nn02814533\nn02510455\nn02106166\nn02492035\nn13054560\nn04090263\nn02110341\nn02965783\nn04235860\nn01735189\nn01698640\nn07697313\nn02276258\nn03868242\nn02321529\nn03042490\nn04418357\nn03814906\nn02607072\nn04517823\nn03496892\nn07717556\nn02051845\nn03291819\nn09399592\nn02791124\nn02259212\nn02233338\nn07802026\nn03047690\nn03995372\nn03530642\nn02966687\nn02492035\nn02229544\nn01689811\nn01532829\nn03733805\nn01776313\nn02112137\nn04200800\nn07747607\nn03016953\nn03729826\nn07734744\nn02088094\nn04542943\nn02667093\nn03400231\nn04355933\nn03544143\nn02128385\nn04356056\nn02112018\nn02859443\nn02128925\nn02091032\nn04004767\nn02096051\nn02113712\nn02927161\nn03476991\nn02423022\nn12144580\nn04548280\nn03724870\nn04335435\nn07583066\nn02871525\nn03272010\nn02484975\nn02786058\nn09472597\nn04209133\nn03717622\nn03598930\nn02417914\nn01824575\nn04204238\nn02999410\nn04467665\nn04239074\nn03444034\nn04263257\nn03903868\nn02492035\nn02110627\nn02007558\nn02090379\nn03995372\nn04325704\nn04277352\nn02494079\nn02321529\nn12144580\nn01687978\nn03095699\nn02074367\nn02128925\nn02363005\nn02346627\nn04579145\nn03133878\nn02776631\nn03787032\nn03127747\nn01749939\nn01860187\nn04317175\nn12768682\nn02219486\nn03630383\nn02097130\nn02859443\nn03529860\nn02229544\nn03272562\nn04116512\nn01685808\nn03902125\nn02174001\nn02112706\nn02840245\nn04141975\nn01641577\nn02326432\nn07749582\nn02797295\nn04596742\nn02974003\nn01729977\nn02504013\nn02843684\nn03825788\nn04517823\nn03216828\nn04346328\nn02408429\nn01797886\nn02493509\nn02799071\nn04204347\nn07716906\nn06874185\nn02093647\nn02111889\nn04254777\nn02966687\nn03938244\
nn02321529\nn03089624\nn02096585\nn02877765\nn03259280\nn02895154\nn02107574\nn07615774\nn03131574\nn02497673\nn01688243\nn04273569\nn03873416\nn03763968\nn01534433\nn03187595\nn02786058\nn02165105\nn02099601\nn02782093\nn01601694\nn03459775\nn01770081\nn04019541\nn01742172\nn03452741\nn03891251\nn01818515\nn03825788\nn04141975\nn02087394\nn02325366\nn02092339\nn07584110\nn03649909\nn02113712\nn04579145\nn03908714\nn04392985\nn02124075\nn13040303\nn02051845\nn02231487\nn02493509\nn01748264\nn03457902\nn03146219\nn01675722\nn03787032\nn02361337\nn07579787\nn04479046\nn02168699\nn02992211\nn02113624\nn02974003\nn04357314\nn07920052\nn07615774\nn03452741\nn03534580\nn02094258\nn04505470\nn02641379\nn03868863\nn02422699\nn03249569\nn02123394\nn02106662\nn01784675\nn04371430\nn04557648\nn02514041\nn02051845\nn03916031\nn01751748\nn02504458\nn07734744\nn02494079\nn03902125\nn02930766\nn03977966\nn03724870\nn04116512\nn03272010\nn04049303\nn03590841\nn02361337\nn04044716\nn03680355\nn03637318\nn11939491\nn03866082\nn03272010\nn02119789\nn07615774\nn03602883\nn03492542\nn04310018\nn02231487\nn02110185\nn03544143\nn03995372\nn02268443\nn01440764\nn02480855\nn02317335\nn01692333\nn02109961\nn03379051\nn03075370\nn02687172\nn04442312\nn03584254\nn01729977\nn02727426\nn03134739\nn01828970\nn02093428\nn02233338\nn02091831\nn02939185\nn04579432\nn04266014\nn03291819\nn03954731\nn03838899\nn07871810\nn02077923\nn12057211\nn02415577\nn02115641\nn03781244\nn07880968\nn07711569\nn03838899\nn03180011\nn02114712\nn03887697\nn02930766\nn01644900\nn02111277\nn02999410\nn03534580\nn02497673\nn02410509\nn02777292\nn03461385\nn04086273\nn03627232\nn01689811\nn09193705\nn01955084\nn03916031\nn04355338\nn04259630\nn03617480\nn01498041\nn02169497\nn02423022\nn02422106\nn02699494\nn02494079\nn04515003\nn03724870\nn02113799\nn03930630\nn04458633\nn04065272\nn02939185\nn02281787\nn02504458\nn02190166\nn03691459\nn02408429\nn07579787\nn02114712\nn04125021\nn04461696\nn03384352\nn03388183\nn0383786
9\nn03485407\nn01986214\nn03255030\nn02804610\nn03255030\nn01924916\nn04398044\nn04540053\nn02667093\nn03146219\nn02483708\nn03125729\nn09256479\nn02089078\nn02607072\nn03742115\nn04067472\nn02114712\nn03196217\nn04254120\nn02105412\nn03250847\nn02111500\nn07565083\nn04162706\nn01917289\nn03018349\nn03530642\nn02107908\nn02169497\nn02018795\nn03658185\nn03424325\nn02018207\nn03630383\nn03903868\nn07745940\nn02138441\nn03372029\nn02319095\nn01855672\nn03062245\nn07753592\nn04147183\nn04254777\nn03838899\nn02219486\nn04270147\nn07871810\nn01910747\nn02999410\nn12768682\nn03649909\nn04120489\nn02002724\nn01756291\nn02445715\nn02009912\nn01798484\nn04532670\nn04604644\nn04044716\nn02169497\nn02669723\nn04461696\nn02134084\nn03743016\nn01798484\nn03404251\nn02783161\nn03201208\nn02134084\nn02607072\nn03180011\nn02094433\nn03388549\nn07590611\nn02640242\nn02085782\nn02871525\nn03967562\nn02119789\nn04507155\nn04149813\nn03492542\nn02437312\nn02098105\nn01443537\nn01632458\nn02860847\nn02113023\nn03337140\nn12620546\nn03459775\nn11879895\nn03085013\nn02096585\nn02088466\nn01751748\nn02497673\nn02236044\nn03109150\nn02130308\nn04325704\nn03676483\nn02105412\nn03180011\nn02787622\nn02025239\nn01693334\nn02325366\nn02281787\nn04597913\nn04346328\nn04404412\nn02006656\nn02107312\nn02165456\nn03042490\nn04418357\nn02093428\nn04133789\nn07754684\nn03075370\nn03916031\nn04536866\nn07711569\nn02895154\nn02105251\nn02692877\nn03344393\nn04493381\nn04579145\nn03201208\nn04243546\nn02167151\nn01797886\nn09256479\nn01582220\nn04548362\nn03476684\nn04606251\nn04579432\nn02086910\nn02134084\nn02109525\nn04238763\nn03764736\nn04044716\nn04548362\nn02692877\nn03207941\nn04229816\nn03598930\nn04591157\nn02317335\nn01734418\nn15075141\nn03825788\nn04536866\nn04254777\nn02277742\nn03877845\nn02747177\nn01667778\nn01664065\nn03180011\nn02701002\nn13040303\nn03388549\nn04591713\nn04389033\nn02699494\nn02105162\nn02280649\nn04254777\nn02607072\nn01985128\nn03045698\nn03717622\nn02086240\nn03903
868\nn02326432\nn02229544\nn03530642\nn01685808\nn02091467\nn03544143\nn03902125\nn02125311\nn09399592\nn04070727\nn07730033\nn07684084\nn04398044\nn03372029\nn03483316\nn03495258\nn01728572\nn04037443\nn02395406\nn03457902\nn03761084\nn01734418\nn02090721\nn03976657\nn03785016\nn01514668\nn04357314\nn02835271\nn02504013\nn02489166\nn03530642\nn02950826\nn02111889\nn04371774\nn04560804\nn03445924\nn02091831\nn07753592\nn03447721\nn01770081\nn02487347\nn02794156\nn02097209\nn03891251\nn02790996\nn03109150\nn04380533\nn03595614\nn04153751\nn04591713\nn02108915\nn04429376\nn01641577\nn04264628\nn03271574\nn02114367\nn07930864\nn02105641\nn02104365\nn03717622\nn04423845\nn02094258\nn02116738\nn01692333\nn02909870\nn02606052\nn02099849\nn02363005\nn07734744\nn02841315\nn01860187\nn02090721\nn03841143\nn02892201\nn04125021\nn04612504\nn01537544\nn04505470\nn02281406\nn03983396\nn02123045\nn01784675\nn02493509\nn03476991\nn03534580\nn02123159\nn02808440\nn04074963\nn01616318\nn03786901\nn03721384\nn02086240\nn02488702\nn03642806\nn03160309\nn01796340\nn13044778\nn09256479\nn03089624\nn02086910\nn04604644\nn04040759\nn07584110\nn04552348\nn04149813\nn02066245\nn01580077\nn04443257\nn04336792\nn02107683\nn01797886\nn02134418\nn02134418\nn01632777\nn06359193\nn01797886\nn03485407\nn04259630\nn03992509\nn07248320\nn04486054\nn03026506\nn02088632\nn03124043\nn02442845\nn02091467\nn03376595\nn04310018\nn02966687\nn03777568\nn03100240\nn04350905\nn02843684\nn02109961\nn01631663\nn03240683\nn03141823\nn02091635\nn01443537\nn11939491\nn02002724\nn03733281\nn02106662\nn03942813\nn03337140\nn03777568\nn04251144\nn07716906\nn01820546\nn03929660\nn03478589\nn02441942\nn02364673\nn09835506\nn04515003\nn02264363\nn01773157\nn01770393\nn03777568\nn04049303\nn02219486\nn02130308\nn02437312\nn02815834\nn02093647\nn01616318\nn04332243\nn12620546\nn10148035\nn02927161\nn02128757\nn03496892\nn03417042\nn04200800\nn02484975\nn01689811\nn02107574\nn03976657\nn03998194\nn02088632\nn04243546\nn037
88365\nn02087046\nn10565667\nn03832673\nn02412080\nn01558993\nn03492542\nn04540053\nn01796340\nn04376876\nn02395406\nn03075370\nn07753592\nn02481823\nn02457408\nn02110806\nn03877472\nn01667778\nn03131574\nn03956157\nn02108422\nn02114548\nn03272010\nn03394916\nn01774384\nn03623198\nn02027492\nn04099969\nn02106662\nn02951358\nn01798484\nn13133613\nn03207743\nn04560804\nn02268443\nn03775071\nn04346328\nn01930112\nn03584254\nn02790996\nn09256479\nn01985128\nn02480495\nn02268853\nn03627232\nn03180011\nn02233338\nn03982430\nn02841315\nn03649909\nn04336792\nn09468604\nn02056570\nn02787622\nn03764736\nn02442845\nn02437616\nn03445924\nn01917289\nn02107312\nn02137549\nn03599486\nn03721384\nn04041544\nn01824575\nn04285008\nn01687978\nn01514668\nn04554684\nn04209239\nn03272562\nn03425413\nn02797295\nn02106382\nn06359193\nn03642806\nn01677366\nn03134739\nn02105641\nn01985128\nn03594945\nn07583066\nn02667093\nn02086646\nn07590611\nn02111889\nn03857828\nn04259630\nn02730930\nn04285008\nn03095699\nn03761084\nn02167151\nn04404412\nn04254120\nn04461696\nn04192698\nn01873310\nn03763968\nn02804414\nn04325704\nn01682714\nn02120505\nn03584829\nn04356056\nn04476259\nn09332890\nn04399382\nn03676483\nn03961711\nn09332890\nn02096294\nn04532106\nn04149813\nn03891251\nn06874185\nn02769748\nn04485082\nn04277352\nn03793489\nn03788365\nn02389026\nn03709823\nn03032252\nn02606052\nn03271574\nn03492542\nn01665541\nn01675722\nn03691459\nn07892512\nn02799071\nn02007558\nn02510455\nn03742115\nn04136333\nn03630383\nn02910353\nn02111129\nn02488702\nn01950731\nn04204238\nn04461696\nn02102318\nn03538406\nn03916031\nn02130308\nn04311174\nn01667114\nn02115641\nn04487394\nn02233338\nn02099267\nn01797886\nn02051845\nn04428191\nn02124075\nn04532670\nn03775546\nn07892512\nn02100877\nn04398044\nn04590129\nn02101388\nn04254680\nn04485082\nn03026506\nn04111531\nn03924679\nn01667778\nn02169497\nn04311004\nn03947888\nn02093754\nn01818515\nn03763968\nn04380533\nn02077923\nn02488702\nn01770393\nn02226429\nn07932039\nn0
2095314\nn01847000\nn03250847\nn04296562\nn02100236\nn03045698\nn07590611\nn03787032\nn02101006\nn01873310\nn02009912\nn02096051\nn07749582\nn02112018\nn03000134\nn03447721\nn04118776\nn03970156\nn01944390\nn07613480\nn02879718\nn01873310\nn03187595\nn03325584\nn01496331\nn02097298\nn03793489\nn02111500\nn04311174\nn01739381\nn02114548\nn02165105\nn01930112\nn02823428\nn04111531\nn02137549\nn04355338\nn03916031\nn03791053\nn02113186\nn04081281\nn02104029\nn03483316\nn04579145\nn01558993\nn01748264\nn02791270\nn03929660\nn02129604\nn02102040\nn03796401\nn02007558\nn11879895\nn06794110\nn07614500\nn02006656\nn04065272\nn02486261\nn02640242\nn01806143\nn03991062\nn02788148\nn09472597\nn03935335\nn02510455\nn03958227\nn02105641\nn04428191\nn03018349\nn02116738\nn03773504\nn02087046\nn03709823\nn01749939\nn02190166\nn02085782\nn01843065\nn03743016\nn01828970\nn01828970\nn03908714\nn03937543\nn02817516\nn04592741\nn02869837\nn03874293\nn04540053\nn03250847\nn02971356\nn02114548\nn02113023\nn04081281\nn03857828\nn03450230\nn04127249\nn02108089\nn02093428\nn04392985\nn04254120\nn02782093\nn02012849\nn03179701\nn04357314\nn13133613\nn02992211\nn04243546\nn01664065\nn01695060\nn04005630\nn03400231\nn03733131\nn02107142\nn02104365\nn04597913\nn04238763\nn04371430\nn03877472\nn04589890\nn04154565\nn01734418\nn03781244\nn07745940\nn02109961\nn01755581\nn07742313\nn04118776\nn01734418\nn02085782\nn03100240\nn02013706\nn03658185\nn03290653\nn02105505\nn03888257\nn02865351\nn02277742\nn02099849\nn03131574\nn02102177\nn02093428\nn02814860\nn01734418\nn01580077\nn04136333\nn04483307\nn01774384\nn02364673\nn06874185\nn07754684\nn07734744\nn04487081\nn07802026\nn09399592\nn03602883\nn04435653\nn02096437\nn02672831\nn02107683\nn02086646\nn01698640\nn03485794\nn03967562\nn01664065\nn03837869\nn01950731\nn02909870\nn01756291\nn02091467\nn03658185\nn02690373\nn02012849\nn03709823\nn02123597\nn13044778\nn02167151\nn03425413\nn07730033\nn03721384\nn03126707\nn02883205\nn02111889\nn03866082\n
n01698640\nn04584207\nn03485407\nn02105251\nn03743016\nn03314780\nn03769881\nn01494475\nn04005630\nn03291819\nn03721384\nn04118776\nn03868242\nn04265275\nn09835506\nn03443371\nn03459775\nn04501370\nn01688243\nn03494278\nn02486410\nn02105251\nn03956157\nn02410509\nn02116738\nn04532106\nn02100236\nn04591157\nn02398521\nn04131690\nn03935335\nn02098105\nn04428191\nn02110627\nn03970156\nn03950228\nn02110341\nn04201297\nn07932039\nn07920052\nn03063689\nn02137549\nn03100240\nn01665541\nn04099969\nn02106382\nn02009912\nn03223299\nn02091635\nn03982430\nn04548362\nn01978455\nn01614925\nn02841315\nn07711569\nn04335435\nn02892767\nn03345487\nn02948072\nn04127249\nn02909870\nn02099712\nn04162706\nn01981276\nn02085620\nn02917067\nn07716358\nn04332243\nn03724870\nn04074963\nn01984695\nn03794056\nn03929855\nn01773157\nn01806567\nn04350905\nn03804744\nn10565667\nn07747607\nn03218198\nn03942813\nn01877812\nn03924679\nn07753592\nn02113799\nn02086079\nn03814639\nn02834397\nn02109525\nn07720875\nn04273569\nn03018349\nn03404251\nn03888257\nn03485407\nn07730033\nn13052670\nn02095889\nn01739381\nn01514859\nn02106030\nn07860988\nn03775546\nn04263257\nn03485794\nn03924679\nn04228054\nn02319095\nn02747177\nn03770679\nn03980874\nn02097658\nn02988304\nn07579787\nn02137549\nn01644373\nn02870880\nn04069434\nn13040303\nn02106550\nn02804414\nn07565083\nn03877845\nn03187595\nn02074367\nn02099712\nn01950731\nn03884397\nn03776460\nn04209133\nn03697007\nn01978287\nn03792972\nn07716906\nn04146614\nn03887697\nn02095889\nn02096177\nn04435653\nn02091032\nn02840245\nn02097658\nn02002724\nn02058221\nn03127747\nn04501370\nn01817953\nn02113186\nn01877812\nn04004767\nn02441942\nn02408429\nn04116512\nn02134418\nn03529860\nn03041632\nn03447447\nn03188531\nn03770439\nn03633091\nn02086646\nn02011460\nn04209133\nn04229816\nn01622779\nn01667114\nn01685808\nn02113186\nn02097047\nn03876231\nn02699494\nn03961711\nn03530642\nn03452741\nn02708093\nn01985128\nn02894605\nn03124170\nn03633091\nn13054560\nn02112137\nn02120505
\nn01532829\nn03929660\nn04589890\nn04507155\nn01685808\nn02077923\nn04523525\nn04592741\nn02056570\nn03841143\nn02226429\nn04243546\nn04285008\nn02483708\nn03944341\nn04553703\nn03977966\nn02441942\nn01818515\nn03871628\nn03692522\nn07768694\nn02607072\nn04456115\nn04590129\nn03476991\nn02091134\nn03394916\nn01990800\nn02066245\nn02279972\nn01944390\nn02105251\nn04273569\nn03857828\nn02110185\nn02096051\nn01770081\nn02259212\nn02799071\nn01806143\nn03476684\nn01796340\nn03100240\nn01632777\nn02190166\nn02066245\nn03976657\nn03788365\nn02108422\nn03400231\nn04589890\nn04435653\nn02326432\nn03954731\nn04591157\nn02823428\nn07716358\nn02088632\nn01824575\nn01631663\nn02086079\nn03995372\nn04517823\nn02480855\nn03445777\nn04357314\nn03884397\nn03445924\nn03777754\nn03133878\nn03873416\nn02086240\nn04553703\nn04133789\nn07693725\nn02895154\nn02317335\nn04613696\nn01819313\nn03977966\nn02109047\nn03000247\nn02443114\nn03272010\nn01697457\nn04200800\nn02109047\nn02840245\nn01739381\nn06794110\nn01756291\nn01748264\nn03950228\nn02971356\nn02123159\nn04346328\nn02092339\nn01729977\nn03187595\nn02454379\nn03794056\nn03967562\nn04039381\nn02879718\nn02441942\nn04515003\nn04311174\nn03100240\nn03868242\nn03126707\nn04461696\nn13054560\nn04398044\nn01667114\nn01664065\nn02106382\nn04613696\nn02948072\nn12144580\nn03877472\nn02096585\nn03935335\nn04429376\nn02110185\nn03207941\nn02123045\nn03788195\nn04259630\nn02097209\nn02092002\nn01877812\nn03529860\nn02966687\nn03980874\nn02013706\nn02776631\nn02445715\nn01496331\nn01807496\nn02112137\nn02086646\nn04118776\nn03658185\nn01985128\nn02504013\nn12998815\nn02233338\nn12057211\nn07875152\nn03840681\nn03721384\nn03908714\nn02412080\nn02113799\nn02096437\nn02669723\nn03775546\nn03393912\nn07718472\nn01883070\nn02120079\nn01532829\nn04443257\nn02917067\nn02877765\nn02115913\nn07920052\nn01773797\nn02123159\nn03447447\nn04613696\nn03933933\nn04380533\nn01728572\nn03535780\nn04599235\nn02877765\nn13037406\nn02971356\nn02504458\nn021013
88\nn04370456\nn09229709\nn02113624\nn02492035\nn02089867\nn09421951\nn02219486\nn02494079\nn02963159\nn03930630\nn02206856\nn02091831\nn02504013\nn02097298\nn09428293\nn04596742\nn01632777\nn02018207\nn03344393\nn03388549\nn03791053\nn01729322\nn02018207\nn03599486\nn03297495\nn02093859\nn01629819\nn04037443\nn01693334\nn02058221\nn03141823\nn04252225\nn04418357\nn01774384\nn03871628\nn03598930\nn03032252\nn02321529\nn02117135\nn02206856\nn03944341\nn02111129\nn02346627\nn03404251\nn02113023\nn02009229\nn02879718\nn01748264\nn01773549\nn04252077\nn02825657\nn03476991\nn03584254\nn04350905\nn13052670\nn04141076\nn03388549\nn02415577\nn02607072\nn04346328\nn01914609\nn02641379\nn03782006\nn01601694\nn03388183\nn03803284\nn02690373\nn02106662\nn02097047\nn07892512\nn02277742\nn10148035\nn02412080\nn02091635\nn01917289\nn03742115\nn04074963\nn03124043\nn02669723\nn04507155\nn02808304\nn02111500\nn03761084\nn01797886\nn03874599\nn03476991\nn04404412\nn02108915\nn01694178\nn02802426\nn02974003\nn03028079\nn03944341\nn03742115\nn02111500\nn02117135\nn02092339\nn04133789\nn03868242\nn07714990\nn07579787\nn04252077\nn02096051\nn02102480\nn02174001\nn03085013\nn01740131\nn02107312\nn04162706\nn02869837\nn02412080\nn04612504\nn01807496\nn04041544\nn03459775\nn02017213\nn02101006\nn07749582\nn02109047\nn07718472\nn02877765\nn01622779\nn01882714\nn03781244\nn02137549\nn02342885\nn03498962\nn04127249\nn06785654\nn02105412\nn03447447\nn09193705\nn02326432\nn04590129\nn02892201\nn03425413\nn04235860\nn03000247\nn03272562\nn03598930\nn02174001\nn03347037\nn07920052\nn01784675\nn07718747\nn02279972\nn02097298\nn03394916\nn03977966\nn03692522\nn03825788\nn07717556\nn02727426\nn02396427\nn07747607\nn04330267\nn03062245\nn02389026\nn02871525\nn02107142\nn02012849\nn02077923\nn03532672\nn03216828\nn02486261\nn01494475\nn04251144\nn02109047\nn03649909\nn01873310\nn03710637\nn01632458\nn02077923\nn04263257\nn04423845\nn02279972\nn01728572\nn02128757\nn04552348\nn07747607\nn07932039\nn0207
1294\nn02951585\nn02123159\nn04201297\nn03680355\nn02892767\nn03930630\nn01798484\nn01729977\nn01798484\nn04371430\nn02090379\nn03347037\nn03998194\nn03947888\nn02108422\nn02837789\nn03888257\nn01739381\nn04179913\nn07590611\nn02279972\nn03063599\nn02113712\nn02444819\nn03532672\nn02687172\nn07720875\nn01819313\nn02445715\nn03793489\nn02092002\nn03899768\nn03424325\nn02978881\nn01534433\nn02999410\nn04557648\nn01608432\nn02391049\nn03929660\nn02835271\nn03876231\nn02102318\nn02777292\nn04004767\nn03933933\nn07836838\nn01751748\nn07718472\nn04254777\nn03424325\nn03063599\nn02095570\nn01824575\nn04311004\nn01677366\nn03062245\nn03627232\nn03134739\nn04372370\nn03075370\nn02802426\nn03447721\nn01829413\nn02090379\nn04192698\nn03743016\nn01692333\nn02099601\nn03720891\nn02951585\nn01532829\nn02281406\nn02096177\nn03920288\nn02927161\nn04179913\nn02100236\nn04515003\nn07802026\nn02088632\nn03950228\nn09193705\nn03841143\nn02093647\nn04336792\nn04357314\nn03929660\nn02093647\nn02093428\nn04049303\nn01873310\nn02268853\nn03838899\nn01484850\nn03337140\nn01537544\nn02174001\nn03063599\nn02640242\nn03721384\nn04596742\nn02795169\nn02492660\nn02892201\nn02361337\nn04417672\nn02113624\nn02028035\nn02999410\nn01629819\nn02115913\nn02089078\nn01768244\nn04263257\nn01944390\nn01945685\nn02071294\nn03937543\nn02391049\nn02018207\nn02129165\nn02074367\nn01518878\nn03445777\nn04149813\nn02669723\nn02097047\nn02865351\nn07753592\nn02814533\nn03874599\nn07720875\nn04116512\nn02417914\nn02027492\nn03877845\nn02123159\nn04264628\nn02236044\nn02108089\nn04133789\nn04147183\nn02085620\nn02091134\nn03944341\nn13037406\nn02422106\nn01498041\nn03775071\nn04357314\nn02102040\nn01682714\nn01775062\nn03014705\nn01693334\nn01616318\nn04604644\nn03109150\nn02088238\nn01981276\nn02422106\nn01985128\nn04026417\nn01644900\nn02095570\nn04266014\nn02236044\nn02115913\nn01883070\nn03840681\nn02481823\nn03447721\nn01981276\nn03673027\nn02835271\nn02123159\nn02113186\nn03947888\nn02100877\nn03814639\nn02
[data file: a plain-text list of ImageNet WordNet synset IDs, one ID per line (e.g. n04037443, n03929660, n03837869, …); the raw dump contained literal \n escapes and IDs split across wrapped lines, and is not reproduced here]
n02129165\nn02120079\nn02113712\nn01728920\nn03160309\nn07871810\nn04258138\nn03045698\nn04552348\nn13044778\nn03717622\nn02025239\nn02268443\nn02108915\nn04542943\nn03240683\nn02966687\nn07754684\nn03991062\nn02769748\nn03187595\nn03271574\nn02256656\nn03637318\nn04357314\nn03207941\nn01728920\nn04074963\nn03000684\nn04118538\nn03888257\nn03000134\nn02930766\nn02437616\nn01622779\nn03954731\nn04266014\nn02108915\nn01729977\nn04553703\nn02328150\nn07715103\nn03617480\nn02441942\nn01734418\nn02229544\nn02259212\nn03017168\nn02077923\nn03871628\nn02025239\nn02992211\nn01978287\nn01755581\nn04008634\nn01773797\nn04209239\nn04584207\nn02493793\nn01616318\nn04127249\nn01877812\nn02814860\nn03535780\nn04040759\nn02879718\nn02514041\nn04592741\nn03854065\nn01614925\nn04026417\nn03837869\nn02865351\nn04239074\nn06794110\nn02190166\nn04208210\nn02088238\nn02497673\nn03179701\nn04613696\nn01693334\nn02672831\nn02817516\nn02106662\nn04392985\nn03777754\nn03649909\nn04311004\nn01664065\nn04389033\nn02807133\nn03476991\nn03141823\nn03793489\nn02988304\nn03325584\nn01871265\nn09288635\nn04326547\nn02110063\nn03220513\nn02093859\nn01693334\nn02815834\nn02107574\nn04487081\nn04347754\nn07695742\nn04086273\nn04493381\nn01580077\nn02910353\nn07754684\nn04067472\nn12768682\nn01675722\nn02437312\nn04417672\nn03868863\nn13054560\nn02100735\nn03888605\nn04009552\nn04238763\nn03876231\nn03706229\nn02859443\nn01530575\nn01824575\nn02096437\nn04486054\nn02704792\nn02110185\nn01824575\nn12620546\nn03814906\nn04154565\nn02058221\nn02111129\nn03690938\nn03857828\nn01534433\nn09229709\nn02086910\nn04507155\nn02098105\nn02089078\nn04355933\nn02930766\nn03384352\nn02892201\nn03992509\nn02109961\nn04479046\nn03000247\nn03047690\nn04258138\nn04005630\nn02281787\nn01693334\nn03379051\nn01614925\nn04479046\nn04591713\nn03920288\nn02051845\nn01756291\nn02107312\nn04435653\nn03325584\nn02058221\nn02107683\nn02111277\nn03786901\nn07768694\nn03891332\nn04204347\nn03400231\nn03961711\nn02490219\nn03347037
\nn04597913\nn02090721\nn03450230\nn02112137\nn03250847\nn03868242\nn02058221\nn04141327\nn03761084\nn02090379\nn02486261\nn02095570\nn01749939\nn02804610\nn04273569\nn02777292\nn03930630\nn03775546\nn07716906\nn02916936\nn02930766\nn03709823\nn02056570\nn02412080\nn02666196\nn03196217\nn04479046\nn04509417\nn01532829\nn07697313\nn02493793\nn02058221\nn04252077\nn02002556\nn02085936\nn03063599\nn04273569\nn04550184\nn03710193\nn01742172\nn02443484\nn03720891\nn03706229\nn02643566\nn03218198\nn03877845\nn01630670\nn07714990\nn02264363\nn01532829\nn04540053\nn02113712\nn04259630\nn03661043\nn03220513\nn03445924\nn07831146\nn01530575\nn03691459\nn01773157\nn06785654\nn03290653\nn03995372\nn03866082\nn02276258\nn03777568\nn01675722\nn12985857\nn02835271\nn03444034\nn02101006\nn03637318\nn03787032\nn04258138\nn03535780\nn04065272\nn02099267\nn03347037\nn01755581\nn03908714\nn02056570\nn02093647\nn01729977\nn04344873\nn01847000\nn02112350\nn01632458\nn04562935\nn03325584\nn04127249\nn04141076\nn04554684\nn07714571\nn02027492\nn03532672\nn02992529\nn02321529\nn03538406\nn03721384\nn02013706\nn04599235\nn02093991\nn02777292\nn02123394\nn07747607\nn03424325\nn03976657\nn04209239\nn02951585\nn07753592\nn04443257\nn03388183\nn10148035\nn03344393\nn04336792\nn02120505\nn01981276\nn03933933\nn01829413\nn03916031\nn02776631\nn01775062\nn04286575\nn04209239\nn07730033\nn02099712\nn07613480\nn02100583\nn03733805\nn03873416\nn04476259\nn02113799\nn02690373\nn09468604\nn02009912\nn01980166\nn02096294\nn03764736\nn03417042\nn03000134\nn10565667\nn04120489\nn02114855\nn04039381\nn04376876\nn02843684\nn02643566\nn03924679\nn03958227\nn03773504\nn02276258\nn03776460\nn03000684\nn02129165\nn03445924\nn02108089\nn04310018\nn03873416\nn02236044\nn03483316\nn02099601\nn02115913\nn02441942\nn03967562\nn04479046\nn04344873\nn02123597\nn02229544\nn03179701\nn02791124\nn04525305\nn03976657\nn04147183\nn02835271\nn01685808\nn02280649\nn01768244\nn02489166\nn04355338\nn02279972\nn03770679\nn014980
41\nn04041544\nn02085620\nn02086240\nn03532672\nn02268853\nn02978881\nn02363005\nn04442312\nn02280649\nn02108915\nn04380533\nn04462240\nn03271574\nn03930630\nn02892767\nn01797886\nn01978287\nn02437616\nn03920288\nn03160309\nn01560419\nn02666196\nn03424325\nn02514041\nn02790996\nn02397096\nn01775062\nn02071294\nn02100583\nn04380533\nn01990800\nn03903868\nn07583066\nn02013706\nn02130308\nn02113023\nn03884397\nn03000684\nn04037443\nn01687978\nn02058221\nn02704792\nn07693725\nn04039381\nn03461385\nn01950731\nn03773504\nn02104365\nn04536866\nn02328150\nn07871810\nn03372029\nn04462240\nn02133161\nn02808304\nn03443371\nn01843065\nn01914609\nn01855032\nn04380533\nn02086646\nn02363005\nn04296562\nn04033995\nn02871525\nn03742115\nn02704792\nn02108915\nn03670208\nn02093428\nn04428191\nn09421951\nn01984695\nn02128757\nn01917289\nn04033901\nn02092002\nn03840681\nn03476684\nn04286575\nn04423845\nn02951358\nn03877845\nn01728572\nn03481172\nn03208938\nn02487347\nn02107908\nn07565083\nn04479046\nn03832673\nn02948072\nn02950826\nn03929660\nn04370456\nn02978881\nn01498041\nn02783161\nn03697007\nn01820546\nn03026506\nn04584207\nn02091467\nn02422699\nn02123045\nn03793489\nn03958227\nn02443484\nn02098286\nn02788148\nn04392985\nn12768682\nn03843555\nn02894605\nn04372370\nn02077923\nn02111889\nn01770393\nn02840245\nn01631663\nn02786058\nn04462240\nn02264363\nn03942813\nn02457408\nn03476991\nn02107312\nn02917067\nn04612504\nn02100583\nn04239074\nn04476259\nn02105855\nn03929855\nn02389026\nn04389033\nn03876231\nn04041544\nn01806143\nn07584110\nn02814533\nn03868863\nn02104365\nn02128925\nn02105251\nn04447861\nn04517823\nn02395406\nn04208210\nn02091831\nn04330267\nn02444819\nn02815834\nn02264363\nn01484850\nn02105641\nn02808440\nn02116738\nn01873310\nn03792972\nn02125311\nn01855032\nn02704792\nn07717556\nn03814906\nn01667114\nn03857828\nn01784675\nn02091032\nn04409515\nn01614925\nn03769881\nn02814533\nn02093754\nn07747607\nn03857828\nn04277352\nn02104029\nn04131690\nn02951358\nn02134084\nn0774
9582\nn03126707\nn04325704\nn02497673\nn02105412\nn01685808\nn07871810\nn02927161\nn04380533\nn04152593\nn02106382\nn04350905\nn01795545\nn03871628\nn02965783\nn07614500\nn03884397\nn03980874\nn02492035\nn02113712\nn03417042\nn04259630\nn03483316\nn01494475\nn02088238\nn07565083\nn07753113\nn04366367\nn04120489\nn04429376\nn02091467\nn02112350\nn02699494\nn03995372\nn02113186\nn01685808\nn03347037\nn02843684\nn02108089\nn03825788\nn03773504\nn02787622\nn04325704\nn03796401\nn01698640\nn03045698\nn02422699\nn04417672\nn04141327\nn04118538\nn02113624\nn04550184\nn01728572\nn04380533\nn04209133\nn01537544\nn07920052\nn04317175\nn01742172\nn02786058\nn03417042\nn03770679\nn02804414\nn02236044\nn03085013\nn04019541\nn03661043\nn03769881\nn01773797\nn02835271\nn01494475\nn01773797\nn02097298\nn01667114\nn02106030\nn02106030\nn03146219\nn01930112\nn02102177\nn13040303\nn04357314\nn04264628\nn07875152\nn04371774\nn02099849\nn03127925\nn02869837\nn03710193\nn02097130\nn07730033\nn04311004\nn03085013\nn02102040\nn04486054\nn02111889\nn04204238\nn03792972\nn03450230\nn03617480\nn02124075\nn03495258\nn03769881\nn02916936\nn01704323\nn03063599\nn01883070\nn01614925\nn04311004\nn01692333\nn03125729\nn04192698\nn03874293\nn03496892\nn04118776\nn02454379\nn04116512\nn01677366\nn01514668\nn03476991\nn03733805\nn03942813\nn03095699\nn02883205\nn02091467\nn02817516\nn06794110\nn03131574\nn02101388\nn01978455\nn02106382\nn02108915\nn03216828\nn07615774\nn07730033\nn01770393\nn04371430\nn02123159\nn01984695\nn01737021\nn02825657\nn02099267\nn03658185\nn02815834\nn02120079\nn03908714\nn04554684\nn04604644\nn03109150\nn03866082\nn03908714\nn03617480\nn02093647\nn02510455\nn04074963\nn03089624\nn02095314\nn03218198\nn02817516\nn01943899\nn03854065\nn03891251\nn04423845\nn04131690\nn04442312\nn01537544\nn03325584\nn02095889\nn03291819\nn03042490\nn02504013\nn03146219\nn04252077\nn02328150\nn01697457\nn02655020\nn04606251\nn07720875\nn02091831\nn02097209\nn01630670\nn01950731\nn01910747\nn07
695742\nn03063689\nn01871265\nn03478589\nn07583066\nn02109525\nn03982430\nn04270147\nn01871265\nn02033041\nn03476991\nn01494475\nn09229709\nn03967562\nn03902125\nn02837789\nn04311004\nn04228054\nn02087394\nn04147183\nn02133161\nn03100240\nn04204238\nn02445715\nn03481172\nn04487394\nn03796401\nn02978881\nn01877812\nn01496331\nn07717410\nn02871525\nn02442845\nn02112706\nn02879718\nn03085013\nn02799071\nn03902125\nn02965783\nn02281406\nn04404412\nn02123159\nn02747177\nn04548280\nn04591713\nn04044716\nn03742115\nn02992211\nn07717410\nn10148035\nn02099429\nn02486261\nn04447861\nn03843555\nn04263257\nn04330267\nn02787622\nn02823750\nn01740131\nn04235860\nn03498962\nn02492660\nn02437312\nn07718747\nn03803284\nn02364673\nn02906734\nn07684084\nn03970156\nn03825788\nn03814906\nn07715103\nn02749479\nn02815834\nn02877765\nn02088364\nn02088632\nn04270147\nn07248320\nn01514668\nn01883070\nn02276258\nn04554684\nn02009229\nn07248320\nn01924916\nn03376595\nn03983396\nn02112018\nn01770393\nn02403003\nn02051845\nn02870880\nn02484975\nn02113799\nn03717622\nn07930864\nn07717410\nn02730930\nn03874599\nn02105162\nn02099712\nn01530575\nn03891332\nn01773157\nn02808440\nn02177972\nn03759954\nn07579787\nn02877765\nn03958227\nn03977966\nn03825788\nn03028079\nn04501370\nn02259212\nn03961711\nn03496892\nn03706229\nn04409515\nn12144580\nn03769881\nn09193705\nn02782093\nn01734418\nn04285008\nn02120505\nn02111277\nn02640242\nn02790996\nn02099267\nn07871810\nn01986214\nn01984695\nn12985857\nn04542943\nn03888605\nn04074963\nn10565667\nn04483307\nn09835506\nn02129165\nn03538406\nn01498041\nn04461696\nn03944341\nn03259280\nn01484850\nn04486054\nn03788195\nn09193705\nn03530642\nn04557648\nn02892201\nn04509417\nn03041632\nn02093256\nn02391049\nn04479046\nn03961711\nn15075141\nn02108915\nn01847000\nn02325366\nn03770439\nn03676483\nn06794110\nn01770393\nn02788148\nn03127925\nn03710721\nn02484975\nn02536864\nn02105855\nn03733131\nn04435653\nn02124075\nn03792782\nn04465501\nn01644373\nn02085620\nn03720891\nn
03814639\nn03133878\nn02892201\nn02077923\nn02992211\nn02114712\nn02410509\nn03733131\nn03843555\nn02917067\nn02128385\nn04009552\nn03888605\nn03388043\nn04596742\nn03935335\nn06785654\nn02356798\nn02398521\nn03445924\nn03041632\nn03535780\nn07753113\nn02834397\nn01824575\nn07697313\nn04487081\nn02509815\nn02106550\nn01704323\nn01742172\nn02094433\nn01817953\nn03032252\nn01742172\nn02483362\nn02096437\nn02487347\nn02096294\nn04465501\nn02948072\nn03424325\nn02111500\nn02114367\nn01537544\nn01945685\nn02607072\nn04005630\nn04127249\nn07714990\nn03662601\nn03179701\nn09468604\nn01530575\nn03100240\nn06359193\nn02510455\nn02120079\nn02096437\nn03141823\nn01484850\nn04579432\nn04118538\nn02094433\nn02086910\nn01622779\nn07747607\nn07718747\nn02106030\nn02363005\nn03599486\nn03637318\nn02101388\nn03662601\nn03188531\nn02104029\nn11939491\nn04238763\nn01945685\nn02834397\nn02099712\nn01558993\nn03450230\nn03838899\nn04243546\nn02123159\nn04536866\nn02808304\nn04120489\nn03127925\nn04505470\nn03782006\nn02281406\nn04252225\nn02776631\nn02444819\nn04005630\nn03717622\nn03961711\nn03444034\nn03970156\nn01824575\nn02396427\nn02165456\nn02226429\nn02056570\nn07693725\nn04599235\nn03944341\nn02134418\nn03788365\nn07717410\nn04264628\nn03967562\nn04265275\nn03584254\nn01614925\nn07720875\nn03814639\nn04370456\nn04037443\nn03297495\nn02129604\nn03131574\nn04243546\nn02105855\nn03895866\nn03216828\nn02317335\nn02106030\nn03661043\nn01924916\nn02165456\nn04536866\nn01616318\nn02799071\nn03788195\nn02363005\nn01924916\nn04461696\nn04270147\nn02843684\nn04258138\nn03944341\nn01737021\nn01882714\nn02817516\nn02097298\nn01843383\nn04019541\nn04118776\nn02799071\nn03967562\nn03494278\nn02229544\nn04325704\nn03967562\nn13044778\nn03344393\nn04557648\nn03447721\nn09472597\nn04118538\nn03424325\nn04599235\nn01530575\nn02835271\nn09472597\nn02092002\nn02730930\nn04599235\nn02422699\nn03657121\nn01622779\nn03903868\nn02090721\nn04443257\nn01734418\nn07714571\nn01496331\nn02264363\nn03483316\
nn03742115\nn07714990\nn03590841\nn03871628\nn04311174\nn02114548\nn03255030\nn02105505\nn07579787\nn07697313\nn03400231\nn06874185\nn04591713\nn04509417\nn03255030\nn03404251\nn02268853\nn07613480\nn07768694\nn02321529\nn01818515\nn01877812\nn02895154\nn03485794\nn04553703\nn02364673\nn09229709\nn02916936\nn04235860\nn07932039\nn15075141\nn02006656\nn02487347\nn02087394\nn02480855\nn04372370\nn03733805\nn02979186\nn02033041\nn10565667\nn02006656\nn02099267\nn02108915\nn03930630\nn01728572\nn04552348\nn02090721\nn02870880\nn02951585\nn04259630\nn02328150\nn04435653\nn02843684\nn03788195\nn03887697\nn04335435\nn04228054\nn01608432\nn04355933\nn02123045\nn04589890\nn04086273\nn03832673\nn02111277\nn01704323\nn03599486\nn04254680\nn02086240\nn02817516\nn02487347\nn04592741\nn03272010\nn02018795\nn01930112\nn03223299\nn03388043\nn03888605\nn04040759\nn02169497\nn02793495\nn04376876\nn02177972\nn04485082\nn07717410\nn04081281\nn03109150\nn02090622\nn03482405\nn01664065\nn03032252\nn03355925\nn01910747\nn04536866\nn03000247\nn03527444\nn02025239\nn04254777\nn04141975\nn03793489\nn02979186\nn02127052\nn01847000\nn02328150\nn02909870\nn10565667\nn03709823\nn02992211\nn02093859\nn07747607\nn07717410\nn03249569\nn01734418\nn03944341\nn04344873\nn01677366\nn02108000\nn03876231\nn04461696\nn06596364\nn09428293\nn03482405\nn02088094\nn04136333\nn04204238\nn01697457\nn04074963\nn01514859\nn02106662\nn04252225\nn02117135\nn03476684\nn01770393\nn02795169\nn03733131\nn03676483\nn04133789\nn04435653\nn01728920\nn04033995\nn04355933\nn01675722\nn03717622\nn04428191\nn03535780\nn02105162\nn07753275\nn04483307\nn02917067\nn04118776\nn03000684\nn03000134\nn02281787\nn01770393\nn02326432\nn01753488\nn02167151\nn02808304\nn04392985\nn03197337\nn03100240\nn04286575\nn03127925\nn01945685\nn02536864\nn02799071\nn02783161\nn02346627\nn02264363\nn02088364\nn02093754\nn03617480\nn02105162\nn02966687\nn01795545\nn02091831\nn01537544\nn03041632\nn02834397\nn02699494\nn03404251\nn01860187\nn0455018
4\nn02992211\nn02437312\nn02098105\nn07590611\nn03527444\nn07583066\nn01748264\nn02966687\nn03803284\nn04366367\nn02119022\nn01740131\nn02099601\nn01534433\nn04606251\nn02099601\nn02488702\nn04336792\nn02391049\nn02086646\nn02086079\nn02110806\nn02110341\nn04447861\nn02119789\nn04162706\nn02259212\nn03124043\nn02101388\nn03630383\nn02980441\nn02494079\nn03602883\nn01695060\nn04141327\nn04266014\nn03047690\nn02097209\nn02113023\nn02174001\nn01669191\nn01667778\nn02096051\nn04251144\nn02112706\nn02988304\nn03461385\nn03447447\nn02077923\nn03887697\nn02342885\nn01641577\nn01616318\nn02007558\nn01698640\nn04033995\nn03804744\nn02110063\nn03355925\nn01667114\nn01914609\nn03804744\nn02669723\nn07836838\nn02412080\nn03743016\nn04336792\nn13052670\nn03791053\nn03776460\nn03017168\nn04404412\nn03777754\nn04037443\nn03796401\nn04404412\nn06596364\nn02105412\nn04023962\nn01734418\nn02328150\nn02101006\nn07684084\nn02002556\nn13133613\nn07248320\nn01753488\nn02107908\nn02123394\nn04154565\nn02504458\nn13052670\nn04008634\nn02916936\nn02107683\nn02134084\nn02443484\nn07720875\nn04493381\nn03761084\nn02102040\nn03089624\nn01985128\nn01753488\nn02137549\nn09835506\nn03443371\nn02346627\nn02002556\nn04589890\nn04562935\nn01632777\nn02317335\nn01632458\nn02493509\nn02398521\nn03970156\nn02667093\nn03825788\nn02086646\nn13044778\nn02088238\nn01776313\nn02481823\nn04423845\nn03047690\nn07749582\nn02977058\nn01796340\nn02110627\nn02910353\nn03201208\nn01728572\nn02114367\nn03980874\nn02776631\nn02165456\nn02437312\nn02364673\nn03764736\nn04041544\nn12998815\nn03388043\nn03803284\nn02113624\nn02102318\nn03424325\nn03250847\nn09288635\nn03924679\nn03956157\nn01910747\nn04560804\nn07714990\nn04542943\nn07716906\nn02128925\nn04487394\nn04399382\nn04044716\nn04465501\nn03854065\nn02398521\nn02823750\nn07583066\nn02107312\nn04584207\nn01829413\nn01833805\nn02417914\nn04081281\nn02088364\nn02113799\nn04376876\nn02093991\nn02730930\nn04133789\nn02442845\nn02018207\nn03930630\nn02910353\nn02730
930\nn03776460\nn02088364\nn04264628\nn07714990\nn04461696\nn03372029\nn02090379\nn01819313\nn03657121\nn02106662\nn02109525\nn02500267\nn04376876\nn04483307\nn03843555\nn13037406\nn02097047\nn02403003\nn03290653\nn02690373\nn02536864\nn02091467\nn03843555\nn04044716\nn01537544\nn02037110\nn04146614\nn04612504\nn01484850\nn07684084\nn03220513\nn04326547\nn03127925\nn02971356\nn03476991\nn01774384\nn07565083\nn02672831\nn03967562\nn03998194\nn09229709\nn01641577\nn01682714\nn04204347\nn03160309\nn03478589\nn03792972\nn04458633\nn04392985\nn02480855\nn02099429\nn07714571\nn02098105\nn02963159\nn02777292\nn03529860\nn03706229\nn12057211\nn04612504\nn04554684\nn03590841\nn03661043\nn04065272\nn01531178\nn07614500\nn02017213\nn02859443\nn04235860\nn02256656\nn03481172\nn02110063\nn02281787\nn04579432\nn01985128\nn02363005\nn04317175\nn01737021\nn03216828\nn02095570\nn07714571\nn04525305\nn07565083\nn03494278\nn04525038\nn01494475\nn04404412\nn07718747\nn03903868\nn04376876\nn02088632\nn07720875\nn02111277\nn01728920\nn04311004\nn02877765\nn06785654\nn01978455\nn01729977\nn02906734\nn01601694\nn04429376\nn02676566\nn03733281\nn02106382\nn02817516\nn04039381\nn04356056\nn01514859\nn03791053\nn04376876\nn03630383\nn04252077\nn04417672\nn01641577\nn04141076\nn02025239\nn02992529\nn02672831\nn02088466\nn01797886\nn04501370\nn04149813\nn02172182\nn04336792\nn04417672\nn03944341\nn03961711\nn04493381\nn04258138\nn04523525\nn02423022\nn02102177\nn02865351\nn04507155\nn07930864\nn02097047\nn03916031\nn02892201\nn04254680\nn01608432\nn04461696\nn03483316\nn02500267\nn02916936\nn03452741\nn02892201\nn02113186\nn03775546\nn03478589\nn03633091\nn04599235\nn03065424\nn02097209\nn01873310\nn04604644\nn04418357\nn03794056\nn03179701\nn01440764\nn01806143\nn02093859\nn01496331\nn01669191\nn04367480\nn02971356\nn02114548\nn03249569\nn01796340\nn07613480\nn04505470\nn03804744\nn02950826\nn03743016\nn02777292\nn03089624\nn02110341\nn03485407\nn02480855\nn02356798\nn02910353\nn03662601\nn016
01694\nn04141076\nn03384352\nn02492660\nn03376595\nn02776631\nn02025239\nn04065272\nn02033041\nn03417042\nn09332890\nn02097658\nn04552348\nn03447447\nn03781244\nn03000684\nn01749939\nn01677366\nn02094114\nn04465501\nn04372370\nn02281787\nn03196217\nn02277742\nn02701002\nn03290653\nn03452741\nn01806143\nn04037443\nn03825788\nn04266014\nn07716906\nn02123597\nn02110063\nn02981792\nn03804744\nn02134418\nn03970156\nn02483362\nn02486261\nn01514668\nn02134084\nn03970156\nn01558993\nn01644373\nn03692522\nn03804744\nn02804414\nn02108551\nn01560419\nn02490219\nn03710637\nn03673027\nn04552348\nn02094114\nn03967562\nn03776460\nn02447366\nn03733805\nn03127925\nn02279972\nn09428293\nn03089624\nn03938244\nn04041544\nn02113712\nn03594734\nn02206856\nn03485794\nn02256656\nn02981792\nn03347037\nn03026506\nn04356056\nn09332890\nn07565083\nn07760859\nn04286575\nn02790996\nn01873310\nn03337140\nn04483307\nn02281787\nn02114548\nn12057211\nn02971356\nn04591713\nn04371774\nn03841143\nn02229544\nn02794156\nn04270147\nn04090263\nn04592741\nn02120505\nn02120505\nn03532672\nn03062245\nn03089624\nn03710193\nn03792972\nn02085936\nn01924916\nn01692333\nn04428191\nn13044778\nn06359193\nn07693725\nn02916936\nn02488702\nn02489166\nn02102318\nn03980874\nn04265275\nn04429376\nn02480855\nn07873807\nn03478589\nn02071294\nn02097298\nn01734418\nn02123159\nn02951585\nn07714990\nn02859443\nn04447861\nn02096585\nn03902125\nn04525038\nn03028079\nn03866082\nn03891332\nn03220513\nn03207743\nn04589890\nn03871628\nn01774750\nn02125311\nn02747177\nn04153751\nn02101556\nn02095570\nn01629819\nn03042490\nn01872401\nn04311004\nn04228054\nn03983396\nn04456115\nn04070727\nn02490219\nn02093256\nn03710193\nn03742115\nn03841143\nn04285008\nn02074367\nn02526121\nn02116738\nn03666591\nn02363005\nn02910353\nn02219486\nn03063599\nn01955084\nn02104029\nn02114855\nn04023962\nn04376876\nn04275548\nn01682714\nn01641577\nn02676566\nn07892512\nn01775062\nn03457902\nn04486054\nn03457902\nn02843684\nn07768694\nn04026417\nn03355925\nn0
2025239\nn03781244\nn03947888\nn02280649\nn03450230\nn02098286\nn03776460\nn03594945\nn07734744\nn02276258\nn07720875\nn02988304\nn03595614\nn02951358\nn03764736\nn02939185\nn02091134\nn01978287\nn02268443\nn03127747\nn03814639\nn03874293\nn04081281\nn07768694\nn07715103\nn02790996\nn03160309\nn04525038\nn02013706\nn04540053\nn02105056\nn07715103\nn01860187\nn07920052\nn01687978\nn07590611\nn03394916\nn03947888\nn01945685\nn02110063\nn04074963\nn04606251\nn03594945\nn04254120\nn03187595\nn02110958\nn02977058\nn07930864\nn02099601\nn03590841\nn02441942\nn01806567\nn02643566\nn03874293\nn03255030\nn04487394\nn07760859\nn02112137\nn04486054\nn01496331\nn03337140\nn01882714\nn02113978\nn07615774\nn02168699\nn04465501\nn02086910\nn04136333\nn04254120\nn03530642\nn03187595\nn01770393\nn02422106\nn03709823\nn02910353\nn01855672\nn02361337\nn01580077\nn01694178\nn04120489\nn04517823\nn03775546\nn01773157\nn03775546\nn03777568\nn04355933\nn01784675\nn01498041\nn02422699\nn04447861\nn02177972\nn02319095\nn03935335\nn03980874\nn03976657\nn02442845\nn02085782\nn03976467\nn07583066\nn04461696\nn04467665\nn02105641\nn04501370\nn03777754\nn04065272\nn03447721\nn02206856\nn03459775\nn03947888\nn04111531\nn02807133\nn03481172\nn01983481\nn03733131\nn02105641\nn03841143\nn03976467\nn02391049\nn03196217\nn02422699\nn04462240\nn04328186\nn04310018\nn04417672\nn03018349\nn02965783\nn01629819\nn03207941\nn04311174\nn02226429\nn02363005\nn03041632\nn04033901\nn02410509\nn02112137\nn02747177\nn02825657\nn02097298\nn02992529\nn03032252\nn01734418\nn04090263\nn04201297\nn02094258\nn04111531\nn04265275\nn04065272\nn02676566\nn03388043\nn07930864\nn02423022\nn02108551\nn03424325\nn02815834\nn04228054\nn02097209\nn02137549\nn03314780\nn01608432\nn01820546\nn02109961\nn01580077\nn07579787\nn03788365\nn02749479\nn03930313\nn01806567\nn02927161\nn04447861\nn04548362\nn02259212\nn04252225\nn02105162\nn03345487\nn02727426\nn07584110\nn04005630\nn02096294\nn04273569\nn02422106\nn03534580\nn09288635\n
n01795545\nn02397096\nn02730930\nn01806143\nn03661043\nn02807133\nn02277742\nn07613480\nn03297495\nn03761084\nn03109150\nn07716906\nn12267677\nn04204238\nn04204347\nn04596742\nn03710637\nn02481823\nn02669723\nn01491361\nn01629819\nn03982430\nn02869837\nn01843065\nn04311174\nn01820546\nn01677366\nn02108089\nn01807496\nn03710721\nn03063599\nn03498962\nn01729322\nn02769748\nn02268853\nn04081281\nn03983396\nn06359193\nn02127052\nn02107142\nn02488702\nn02006656\nn07831146\nn02676566\nn04277352\nn03527444\nn03372029\nn03314780\nn02114712\nn01978287\nn03337140\nn03538406\nn02917067\nn01756291\nn01667778\nn01795545\nn01631663\nn02088364\nn02808304\nn01797886\nn02104029\nn03201208\nn01558993\nn03967562\nn04428191\nn02494079\nn04162706\nn04515003\nn04040759\nn01774750\nn01943899\nn02098413\nn02099601\nn04270147\nn02417914\nn03065424\nn07734744\nn02007558\nn02119789\nn07695742\nn02364673\nn01689811\nn02672831\nn02124075\nn01644900\nn04335435\nn02086646\nn02095889\nn02105251\nn02391049\nn01955084\nn02480495\nn03032252\nn02808440\nn03637318\nn02877765\nn04597913\nn02112706\nn04590129\nn01910747\nn02895154\nn03062245\nn03775546\nn03372029\nn04228054\nn04258138\nn04074963\nn11879895\nn01986214\nn01943899\nn02138441\nn01806143\nn01983481\nn03478589\nn04389033\nn02951358\nn02102318\nn03763968\nn03594734\nn01689811\nn07753113\nn02074367\nn01819313\nn03467068\nn03393912\nn02056570\nn04008634\nn04254777\nn01644900\nn02106166\nn03891251\nn04435653\nn01773549\nn03729826\nn01770081\nn03529860\nn03110669\nn03841143\nn02091244\nn04067472\nn04371430\nn03796401\nn03782006\nn04238763\nn01784675\nn04019541\nn02097209\nn02259212\nn03956157\nn02112706\nn02111889\nn03527444\nn02167151\nn04442312\nn07695742\nn03710193\nn04074963\nn02099849\nn02134418\nn02825657\nn13037406\nn02085782\nn02417914\nn12620546\nn04275548\nn02804610\nn04146614\nn01514668\nn01443537\nn04509417\nn02892201\nn02088466\nn03065424\nn04254120\nn03792972\nn01924916\nn02037110\nn07697537\nn03394916\nn02101006\nn02110806\nn03146219
\nn02814860\nn03649909\nn03127747\nn01980166\nn02092002\nn03787032\nn02133161\nn03874599\nn04201297\nn02106550\nn07615774\nn03710637\nn03527444\nn07714990\nn03017168\nn02111500\nn01744401\nn03950228\nn02410509\nn02483708\nn07583066\nn04589890\nn02655020\nn02259212\nn01990800\nn03457902\nn07920052\nn04505470\nn02111129\nn03216828\nn02892767\nn02095314\nn02092002\nn01664065\nn03944341\nn03495258\nn01737021\nn01677366\nn01806567\nn02097298\nn04532670\nn04522168\nn02708093\nn02066245\nn02971356\nn02906734\nn03492542\nn03930313\nn02396427\nn02037110\nn03297495\nn03017168\nn01773797\nn03786901\nn02910353\nn02102177\nn02730930\nn02480495\nn04562935\nn02109525\nn02988304\nn02091467\nn04204238\nn04476259\nn01532829\nn03208938\nn04532106\nn02165105\nn01677366\nn07715103\nn02795169\nn02127052\nn02098286\nn01728572\nn01833805\nn02445715\nn02259212\nn04209133\nn07711569\nn07860988\nn09421951\nn03125729\nn04141076\nn01742172\nn03063689\nn01704323\nn01748264\nn01770393\nn01955084\nn02894605\nn03792972\nn04141975\nn02672831\nn03018349\nn02971356\nn02859443\nn07749582\nn03792782\nn02398521\nn04254777\nn02326432\nn03877472\nn02123045\nn03623198\nn02342885\nn03187595\nn03884397\nn04330267\nn04266014\nn02138441\nn03538406\nn03000247\nn02363005\nn02883205\nn07753592\nn04371430\nn03871628\nn03633091\nn04023962\nn01740131\nn04251144\nn02870880\nn02009912\nn03461385\nn02328150\nn01945685\nn02280649\nn02012849\nn02112137\nn04326547\nn02117135\nn07930864\nn04136333\nn04370456\nn01737021\nn01817953\nn03888605\nn03452741\nn04330267\nn07932039\nn02398521\nn07930864\nn03787032\nn02112350\nn12267677\nn03494278\nn07693725\nn03857828\nn02815834\nn04376876\nn03874293\nn04371774\nn03929855\nn02841315\nn02090721\nn09468604\nn02488291\nn02106662\nn03461385\nn04485082\nn03995372\nn02493793\nn01914609\nn02002556\nn07711569\nn02098286\nn07693725\nn02422106\nn02110958\nn04613696\nn03692522\nn07920052\nn02799071\nn04037443\nn02113978\nn01530575\nn10565667\nn10148035\nn03773504\nn03347037\nn09193705\nn021139
78\nn01882714\nn03527444\nn02979186\nn01877812\nn02111129\nn03417042\nn03461385\nn02114855\nn12768682\nn01950731\nn02667093\nn02011460\nn03290653\nn02108000\nn04229816\nn01930112\nn02486261\nn04542943\nn04235860\nn07768694\nn02403003\nn03786901\nn02396427\nn02109047\nn01968897\nn03388043\nn04258138\nn02112137\nn02607072\nn02134084\nn03837869\nn04200800\nn02071294\nn04141076\nn02085620\nn03218198\nn02098286\nn02099601\nn04099969\nn03216828\nn02892767\nn03482405\nn03838899\nn03018349\nn04487394\nn04141076\nn02106382\nn11939491\nn03100240\nn03908714\nn07831146\nn09256479\nn12267677\nn04152593\nn02093428\nn02791270\nn02099429\nn02105056\nn03223299\nn02643566\nn07720875\nn02124075\nn02699494\nn03888605\nn03249569\nn03584254\nn02981792\nn04133789\nn03534580\nn01518878\nn02704792\nn07747607\nn13037406\nn02488291\nn03538406\nn03627232\nn02099429\nn02704792\nn07684084\nn03733805\nn02397096\nn02114367\nn02319095\nn02086646\nn02094433\nn04133789\nn04483307\nn02504013\nn04525038\nn04265275\nn04209239\nn03967562\nn02129165\nn03777754\nn09835506\nn02727426\nn01693334\nn02457408\nn02128925\nn03903868\nn04409515\nn01950731\nn06359193\nn03187595\nn01950731\nn04041544\nn02892767\nn02363005\nn04355338\nn02277742\nn04090263\nn03314780\nn04285008\nn01847000\nn02094433\nn02098105\nn07892512\nn09229709\nn03527444\nn03530642\nn01774384\nn01773157\nn04366367\nn03676483\nn01930112\nn03933933\nn03877845\nn02104365\nn07697537\nn02444819\nn13037406\nn04296562\nn02457408\nn11879895\nn04120489\nn03958227\nn03187595\nn03930630\nn02277742\nn01774750\nn04550184\nn02837789\nn04479046\nn02500267\nn04317175\nn07875152\nn01687978\nn02088094\nn02814533\nn02109961\nn02117135\nn04579145\nn07880968\nn02190166\nn02396427\nn04542943\nn04357314\nn02114855\nn03920288\nn02120079\nn01776313\nn01847000\nn04447861\nn04019541\nn03201208\nn03857828\nn03404251\nn07754684\nn09256479\nn02442845\nn06794110\nn02917067\nn04592741\nn02389026\nn03444034\nn03724870\nn02895154\nn02165456\nn03804744\nn01742172\nn02037110\nn0208
7046\nn02865351\nn02025239\nn03887697\nn02814533\nn04133789\nn03891332\nn02483708\nn07714571\nn03982430\nn04579145\nn02127052\nn07932039\nn04238763\nn03710637\nn02825657\nn03977966\nn02321529\nn02493509\nn02219486\nn09193705\nn01950731\nn03457902\nn03908714\nn03980874\nn02113624\nn03393912\nn03379051\nn01688243\nn02971356\nn04243546\nn02510455\nn02092002\nn02116738\nn02391049\nn04111531\nn02128925\nn02097047\nn02071294\nn04462240\nn01748264\nn02086910\nn04326547\nn02107908\nn06874185\nn03773504\nn04039381\nn03874293\nn04482393\nn04371774\nn02088094\nn03887697\nn03452741\nn07802026\nn02509815\nn03347037\nn03983396\nn01774750\nn02879718\nn03888257\nn01796340\nn07717556\nn02112706\nn01742172\nn12998815\nn03271574\nn01775062\nn02112706\nn04153751\nn04350905\nn02481823\nn02487347\nn01950731\nn02667093\nn02089973\nn04592741\nn03393912\nn02840245\nn02006656\nn01498041\nn04548362\nn02782093\nn09193705\nn02443114\nn01773549\nn02093428\nn04116512\nn01770393\nn02128925\nn02939185\nn04133789\nn02777292\nn03976657\nn03876231\nn02443114\nn04590129\nn02114855\nn04335435\nn03372029\nn04418357\nn02109961\nn02088094\nn02279972\nn03657121\nn04482393\nn04229816\nn02264363\nn04136333\nn02027492\nn03617480\nn07753592\nn03459775\nn04154565\nn03425413\nn01955084\nn03127925\nn02017213\nn02437616\nn01774384\nn07760859\nn01818515\nn03000684\nn02128385\nn04487081\nn02105505\nn03376595\nn02130308\nn02108000\nn03042490\nn02992211\nn07718472\nn02417914\nn02701002\nn02058221\nn03888605\nn01694178\nn01855672\nn02168699\nn02676566\nn04507155\nn03777754\nn01704323\nn02088094\nn03444034\nn02883205\nn02909870\nn02787622\nn02102973\nn02514041\nn03085013\nn04328186\nn02494079\nn02093428\nn01986214\nn03594945\nn01847000\nn02110958\nn04252077\nn03041632\nn09421951\nn03776460\nn03676483\nn02804610\nn02112350\nn02096294\nn02108089\nn03690938\nn04372370\nn03877845\nn02111500\nn04476259\nn02104029\nn02085782\nn03424325\nn01943899\nn02443114\nn02865351\nn02129604\nn04487394\nn02493509\nn03026506\nn04136333\nn04
507155\nn04356056\nn04039381\nn03944341\nn03947888\nn02098105\nn02133161\nn02841315\nn04251144\nn02094114\nn04505470\nn01829413\nn02493509\nn11879895\nn07875152\nn01983481\nn02500267\nn02085620\nn13040303\nn03902125\nn12620546\nn03599486\nn03891332\nn02102480\nn04118538\nn01807496\nn01860187\nn03444034\nn01491361\nn07831146\nn02666196\nn02892767\nn13040303\nn03032252\nn02125311\nn02168699\nn02117135\nn02395406\nn01537544\nn07753275\nn04428191\nn02109961\nn04235860\nn02417914\nn04584207\nn04070727\nn01873310\nn02749479\nn02769748\nn07714571\nn04367480\nn02012849\nn01665541\nn02167151\nn02088466\nn03527444\nn04409515\nn02013706\nn03325584\nn02441942\nn07613480\nn02101006\nn02088632\nn02129604\nn01685808\nn02966687\nn04367480\nn03908618\nn02977058\nn04111531\nn03042490\nn03717622\nn06785654\nn02980441\nn01968897\nn01843065\nn04554684\nn04523525\nn04417672\nn01855672\nn03873416\nn02100877\nn02105505\nn03492542\nn01833805\nn04116512\nn04487394\nn02105505\nn03297495\nn02119022\nn04392985\nn02108422\nn02098413\nn02012849\nn04487394\nn01990800\nn02817516\nn03216828\nn03187595\nn07871810\nn02669723\nn02229544\nn02966687\nn02113712\nn03930313\nn03417042\nn02389026\nn03249569\nn03633091\nn02096294\nn02110627\nn03916031\nn07920052\nn04146614\nn03207743\nn02325366\nn03954731\nn04133789\nn03788195\nn03982430\nn02112706\nn02017213\nn02492660\nn03976467\nn03792782\nn02123159\nn07754684\nn03444034\nn03063599\nn02326432\nn02009912\nn04154565\nn03492542\nn03649909\nn02101388\nn02091134\nn02892201\nn02077923\nn02168699\nn04239074\nn03899768\nn04461696\nn03124170\nn09428293\nn03000247\nn01558993\nn02104365\nn02093991\nn03837869\nn02169497\nn03492542\nn03706229\nn02129165\nn03216828\nn03662601\nn02444819\nn03930313\nn04039381\nn01601694\nn04228054\nn02788148\nn03133878\nn01983481\nn02093859\nn02106166\nn02102973\nn03982430\nn02667093\nn03891332\nn01592084\nn02172182\nn03404251\nn02259212\nn03250847\nn02817516\nn07747607\nn03063599\nn03935335\nn02085620\nn02092002\nn02999410\nn02504458\nn
03100240\nn04392985\nn02105855\nn07718747\nn03721384\nn02483362\nn01629819\nn02107683\nn02951358\nn07920052\nn03733805\nn02483362\nn01798484\nn04418357\nn04251144\nn03197337\nn03908618\nn01978287\nn01817953\nn04486054\nn04127249\nn01945685\nn07711569\nn02088238\nn02105641\nn02910353\nn07892512\nn01484850\nn03657121\nn02859443\nn07860988\nn04141327\nn03868863\nn01768244\nn03657121\nn02102973\nn02111500\nn01632458\nn02319095\nn04328186\nn04311004\nn01558993\nn01773549\nn01622779\nn02442845\nn07768694\nn01632777\nn03733805\nn03133878\nn02012849\nn03496892\nn02066245\nn02094433\nn03271574\nn02128757\nn03792782\nn02018795\nn01630670\nn02101006\nn04067472\nn02100583\nn04317175\nn03602883\nn04141327\nn02102040\nn07875152\nn02892201\nn04127249\nn07753275\nn04355338\nn02236044\nn01749939\nn07717556\nn02317335\nn02606052\nn04483307\nn04435653\nn04264628\nn04347754\nn04179913\nn07583066\nn04146614\nn03478589\nn03599486\nn02676566\nn02264363\nn04371430\nn03782006\nn04604644\nn03180011\nn03045698\nn03887697\nn02085936\nn07614500\nn04296562\nn02074367\nn01729977\nn02018795\nn01735189\nn03777568\nn03775546\nn02091244\nn03838899\nn04357314\nn01945685\nn03788365\nn02441942\nn04429376\nn02119022\nn01945685\nn03627232\nn02056570\nn02437616\nn03590841\nn01491361\nn01871265\nn04442312\nn01833805\nn04596742\nn04553703\nn04487394\nn03763968\nn02514041\nn11879895\nn04525038\nn02510455\nn04275548\nn01531178\nn04162706\nn03240683\nn04589890\nn03871628\nn04443257\nn02655020\nn04264628\nn01843383\nn02138441\nn02091032\nn02281406\nn03272010\nn03775546\nn03345487\nn03532672\nn02814860\nn07714571\nn02423022\nn03187595\nn03992509\nn03933933\nn03956157\nn07920052\nn01981276\nn03710721\nn04201297\nn09472597\nn02097130\nn02111889\nn03929660\nn02804610\nn03961711\nn07613480\nn01755581\nn02277742\nn03452741\nn02396427\nn01514859\nn04590129\nn04116512\nn01631663\nn07711569\nn02134084\nn04332243\nn04517823\nn01558993\nn02817516\nn02088632\nn03457902\nn01775062\nn02328150\nn02804610\nn02077923\nn02129604\
nn02095314\nn03388183\nn02536864\nn03134739\nn03014705\nn02423022\nn04254120\nn03776460\nn03788195\nn03637318\nn02112706\nn03777568\nn02089078\nn03838899\nn03661043\nn02687172\nn02097658\nn02395406\nn01820546\nn03788365\nn02963159\nn02097298\nn07717556\nn02114367\nn02219486\nn04442312\nn04536866\nn02979186\nn04458633\nn07584110\nn03633091\nn04501370\nn03000684\nn02417914\nn02093859\nn04228054\nn03478589\nn02112137\nn03642806\nn02113712\nn02817516\nn03980874\nn01644900\nn11879895\nn04347754\nn03788195\nn02825657\nn02119789\nn02128925\nn02129604\nn04523525\nn04162706\nn03000247\nn04347754\nn02447366\nn02096294\nn02002724\nn02098413\nn03467068\nn01582220\nn02002556\nn03063689\nn01855672\nn02971356\nn02086240\nn02817516\nn01930112\nn02490219\nn09428293\nn02091467\nn03710637\nn02917067\nn06596364\nn01532829\nn02056570\nn04560804\nn01735189\nn04557648\nn07711569\nn06785654\nn04118776\nn02860847\nn02007558\nn02356798\nn04070727\nn02489166\nn07714990\nn02104365\nn02007558\nn03649909\nn01667114\nn01641577\nn03028079\nn03494278\nn07880968\nn03775071\nn01632458\nn01990800\nn02442845\nn02119022\nn02006656\nn02701002\nn02483362\nn03124170\nn01531178\nn02704792\nn02099849\nn01873310\nn01735189\nn04462240\nn03065424\nn04398044\nn04120489\nn04330267\nn03967562\nn02099601\nn03388043\nn02100583\nn02093991\nn09399592\nn01773797\nn03761084\nn02342885\nn02206856\nn02098286\nn03207743\nn13040303\nn01629819\nn02927161\nn04125021\nn04554684\nn02328150\nn03476684\nn02114367\nn03793489\nn03633091\nn03930630\nn02871525\nn02097474\nn02113799\nn02408429\nn03899768\nn07831146\nn04525038\nn02808304\nn03724870\nn02033041\nn02110063\nn03063689\nn01855672\nn02395406\nn04254680\nn03063689\nn02487347\nn02640242\nn03457902\nn12267677\nn04482393\nn04009552\nn02174001\nn01990800\nn04209133\nn01950731\nn02113186\nn03095699\nn01770081\nn04127249\nn02971356\nn02490219\nn04044716\nn01667778\nn03710721\nn03141823\nn04099969\nn02325366\nn04599235\nn01978455\nn03599486\nn02090622\nn03630383\nn02117135\nn0203711
0\nn02219486\nn03297495\nn02105505\nn04263257\nn02442845\nn04266014\nn03393912\nn02115641\nn02883205\nn01729977\nn03047690\nn02361337\nn04560804\nn02106662\nn03876231\nn03041632\nn02098105\nn01560419\nn02089078\nn03218198\nn04153751\nn02123597\nn03584829\nn02930766\nn03781244\nn02264363\nn07711569\nn04418357\nn06596364\nn03345487\nn02835271\nn04467665\nn03450230\nn03692522\nn03929660\nn03935335\nn01630670\nn02120505\nn02172182\nn03777754\nn04209133\nn01687978\nn03481172\nn02088094\nn02112350\nn03982430\nn02124075\nn03854065\nn04141076\nn06785654\nn02981792\nn03207941\nn03028079\nn13133613\nn02423022\nn03777568\nn02328150\nn02037110\nn02092002\nn02655020\nn04443257\nn02963159\nn01687978\nn09193705\nn10148035\nn03065424\nn03792972\nn02013706\nn01494475\nn07860988\nn02099267\nn04355933\nn02457408\nn01943899\nn03733131\nn04252077\nn02978881\nn03868863\nn03544143\nn03692522\nn12768682\nn02088094\nn04023962\nn02793495\nn03840681\nn01773549\nn03843555\nn04482393\nn07753592\nn03673027\nn07930864\nn01685808\nn02037110\nn02787622\nn06596364\nn02033041\nn04204238\nn12267677\nn02321529\nn03404251\nn03000684\nn07753592\nn03804744\nn01514668\nn03594945\nn02110627\nn03793489\nn04243546\nn02490219\nn02817516\nn03291819\nn02100877\nn01440764\nn04209239\nn02088364\nn04590129\nn02110806\nn09229709\nn02447366\nn04606251\nn04562935\nn02128385\nn02837789\nn02363005\nn04133789\nn02165456\nn03649909\nn03661043\nn02107683\nn01688243\nn01843383\nn03891251\nn12620546\nn03832673\nn03452741\nn04074963\nn04228054\nn03982430\nn01795545\nn02877765\nn03196217\nn04435653\nn02105505\nn04467665\nn07695742\nn02672831\nn03690938\nn04456115\nn04125021\nn15075141\nn03761084\nn04487394\nn02108089\nn07932039\nn01806567\nn02089078\nn02028035\nn03623198\nn02108551\nn01632458\nn03445924\nn01739381\nn03887697\nn07836838\nn02364673\nn03355925\nn02113799\nn04476259\nn02437312\nn03534580\nn03841143\nn03131574\nn07697537\nn01818515\nn03929660\nn02093647\nn02892767\nn03916031\nn04081281\nn04443257\nn02441942\nn01534
433\nn01843383\nn02951358\nn02089078\nn03874293\nn03127925\nn02094258\nn04366367\nn03485407\nn04597913\nn01755581\nn01795545\nn01601694\nn01944390\nn03124170\nn02395406\nn03594734\nn01685808\nn01582220\nn02110627\nn03991062\nn02699494\nn09472597\nn02500267\nn03476991\nn02963159\nn02089867\nn01697457\nn03347037\nn01806143\nn02074367\nn02699494\nn04090263\nn03763968\nn02422699\nn04070727\nn01694178\nn01797886\nn03459775\nn03977966\nn01751748\nn03803284\nn01950731\nn01532829\nn02454379\nn02051845\nn03976657\nn07248320\nn07753275\nn09332890\nn02002556\nn03602883\nn12057211\nn02123045\nn02950826\nn02219486\nn02115641\nn02085936\nn02951585\nn02111889\nn02102480\nn01443537\nn02105162\nn02794156\nn04479046\nn03047690\nn02105412\nn02692877\nn01739381\nn07930864\nn04552348\nn02835271\nn01531178\nn04120489\nn01582220\nn02840245\nn02422106\nn01697457\nn03075370\nn04136333\nn03874599\nn03492542\nn02389026\nn03207743\nn02089867\nn04136333\nn06359193\nn02106382\nn02101006\nn02091467\nn03325584\nn01616318\nn02804610\nn07717556\nn02111500\nn01608432\nn02007558\nn03887697\nn02107142\nn02641379\nn07734744\nn03710193\nn02231487\nn02028035\nn04296562\nn04009552\nn02977058\nn03710721\nn03884397\nn03775546\nn07892512\nn04254777\nn07697537\nn03792782\nn02102480\nn03000247\nn02117135\nn01796340\nn02892201\nn04254680\nn04040759\nn01773549\nn04040759\nn03124170\nn02790996\nn04037443\nn02033041\nn04509417\nn01484850\nn03697007\nn04208210\nn04209133\nn02497673\nn03840681\nn03785016\nn04086273\nn02085936\nn02134084\nn03404251\nn02098286\nn07734744\nn03998194\nn02086910\nn03250847\nn03983396\nn04336792\nn03457902\nn03026506\nn03980874\nn01818515\nn04507155\nn03933933\nn13037406\nn04235860\nn02504013\nn03297495\nn02802426\nn01491361\nn02916936\nn01755581\nn02727426\nn04228054\nn03584254\nn04317175\nn01667114\nn04486054\nn02110341\nn04465501\nn02974003\nn12768682\nn12998815\nn02111129\nn11879895\nn03775546\nn03496892\nn03791053\nn01768244\nn09421951\nn04192698\nn04517823\nn02514041\nn12985857\nn130
54560\nn04330267\nn03388549\nn04254120\nn04423845\nn11879895\nn02776631\nn02137549\nn03495258\nn03355925\nn02486410\nn02749479\nn03187595\nn03388043\nn04005630\nn02100877\nn07714990\nn06359193\nn02096051\nn02105641\nn07579787\nn09472597\nn04355338\nn03680355\nn02730930\nn03874599\nn02730930\nn04552348\nn03535780\nn01753488\nn02012849\nn01704323\nn02097209\nn03908714\nn04589890\nn04372370\nn01443537\nn03457902\nn04238763\nn09246464\nn01739381\nn02488702\nn04026417\nn01530575\nn07749582\nn02102480\nn04557648\nn02096585\nn01740131\nn04389033\nn03314780\nn07875152\nn02492660\nn12057211\nn04371430\nn02099267\nn03495258\nn02096051\nn02105162\nn02105641\nn03016953\nn02808440\nn03598930\nn04542943\nn01855672\nn03733281\nn07717410\nn02504013\nn02091831\nn04133789\nn04356056\nn02879718\nn03891251\nn03379051\nn02113978\nn09288635\nn02444819\nn01945685\nn03980874\nn02526121\nn02101556\nn04040759\nn02009229\nn03837869\nn04311174\nn07583066\nn02777292\nn03950228\nn02129165\nn02114548\nn02100735\nn04590129\nn03400231\nn03868242\nn02074367\nn06874185\nn04141327\nn01833805\nn09288635\nn04070727\nn02795169\nn03944341\nn01560419\nn03187595\nn02092339\nn03388043\nn03255030\nn04532670\nn02120505\nn02894605\nn02101388\nn01608432\nn03995372\nn02259212\nn03908618\nn03223299\nn02107683\nn07932039\nn03063689\nn01629819\nn03982430\nn03188531\nn01748264\nn03877472\nn02115913\nn01748264\nn04350905\nn04070727\nn02643566\nn02966193\nn01770393\nn02672831\nn02494079\nn02930766\nn03259280\nn02442845\nn03903868\nn03710721\nn02690373\nn01531178\nn01496331\nn03710721\nn02088094\nn07717556\nn03920288\nn02089078\nn02109525\nn02808304\nn03447447\nn04548280\nn02906734\nn07716358\nn01774384\nn03637318\nn02909870\nn03788195\nn02699494\nn04355338\nn02095889\nn02606052\nn03623198\nn01641577\nn01669191\nn02457408\nn03627232\nn02769748\nn04311004\nn03584254\nn03220513\nn03530642\nn04285008\nn01644373\nn09421951\nn03733281\nn03047690\nn02808304\nn03720891\nn02437616\nn07684084\nn01749939\nn04409515\nn02494079\nn0
2948072\nn02110806\nn02077923\nn01924916\nn01496331\nn04604644\nn02667093\nn02107142\nn01692333\nn04277352\nn04254777\nn02676566\nn12144580\nn03630383\nn02095889\nn03666591\nn03937543\nn01498041\nn03272562\nn09472597\nn03223299\nn04456115\nn02099601\nn03000134\nn02951585\nn03717622\nn01910747\nn06596364\nn01820546\nn02018795\nn04264628\nn02096177\nn01944390\nn01978287\nn01818515\nn03125729\nn02093256\nn01855032\nn02009912\nn02097047\nn02113712\nn01883070\nn01774750\nn01665541\nn02093428\nn01980166\nn04392985\nn03947888\nn02690373\nn02090721\nn04023962\nn03476684\nn04389033\nn03729826\nn02910353\nn01632458\nn02167151\nn02676566\nn03045698\nn01770081\nn04238763\nn10148035\nn04344873\nn02481823\nn04467665\nn02013706\nn02088238\nn02877765\nn01833805\nn07718747\nn02091467\nn03627232\nn04141076\nn04209239\nn01950731\nn04467665\nn03976657\nn03729826\nn04398044\nn07754684\nn04465501\nn01776313\nn02111129\nn03207743\nn03201208\nn01847000\nn02085936\nn03710721\nn04599235\nn02817516\nn02807133\nn04389033\nn02840245\nn04423845\nn07718472\nn02356798\nn02167151\nn02966687\nn02790996\nn02840245\nn02342885\nn02437312\nn07716906\nn02233338\nn03379051\nn01990800\nn02443114\nn01498041\nn03337140\nn02165105\nn04525305\nn02226429\nn01558993\nn02110341\nn04069434\nn01644900\nn02096177\nn04347754\nn03127747\nn02106382\nn01608432\nn02412080\nn02134084\nn04486054\nn04026417\nn02437616\nn04081281\nn04417672\nn02018207\nn03018349\nn03595614\nn02120079\nn03388183\nn03902125\nn02403003\nn03933933\nn09193705\nn01872401\nn03534580\nn02129165\nn03710193\nn01981276\nn02259212\nn07873807\nn01843065\nn02457408\nn02837789\nn02177972\nn02951585\nn02101006\nn02965783\nn04482393\nn01616318\nn04465501\nn03485407\nn02086646\nn02085620\nn02361337\nn01753488\nn04579145\nn01682714\nn02105641\nn04065272\nn01968897\nn02102973\nn12144580\nn04372370\nn02127052\nn02690373\nn02895154\nn04049303\nn03676483\nn02268443\nn02869837\nn02206856\nn04201297\nn02091244\nn02101556\nn02843684\nn04380533\nn07753275\nn01534433\n
n02027492\nn02971356\nn04118538\nn03384352\nn03444034\nn03676483\nn03495258\nn02666196\nn01756291\nn03482405\nn02098413\nn04355933\nn03841143\nn02120079\nn02417914\nn03857828\nn02114712\nn01729977\nn01770081\nn03733131\nn03793489\nn03590841\nn02088364\nn01847000\nn11939491\nn03724870\nn02025239\nn07717556\nn02119789\nn03016953\nn02129165\nn04033901\nn02790996\nn02012849\nn02099429\nn03691459\nn04330267\nn10148035\nn03888257\nn07584110\nn02096437\nn04515003\nn02804610\nn02096437\nn04418357\nn02033041\nn02092339\nn12620546\nn01669191\nn03160309\nn02112137\nn02172182\nn03110669\nn04380533\nn03673027\nn03347037\nn04201297\nn02492660\nn02110958\nn02783161\nn02483708\nn02110958\nn04120489\nn03908618\nn02423022\nn04350905\nn04153751\nn02444819\nn02114548\nn07747607\nn07614500\nn04070727\nn04074963\nn01616318\nn02112706\nn02096437\nn04228054\nn01644900\nn01756291\nn02442845\nn03980874\nn02441942\nn04149813\nn03950228\nn01843383\nn02910353\nn03207743\nn04263257\nn02099429\nn04486054\nn02606052\nn04238763\nn02099601\nn02177972\nn03584829\nn04356056\nn03673027\nn02086646\nn04485082\nn02692877\nn03761084\nn03249569\nn04252077\nn02092339\nn01770081\nn02877765\nn02129604\nn03032252\nn13044778\nn02607072\nn03498962\nn02120505\nn01534433\nn01491361\nn07730033\nn02098413\nn02793495\nn02017213\nn02100877\nn02948072\nn02398521\nn03498962\nn02494079\nn04026417\nn03259280\nn04209133\nn02094258\nn02028035\nn03627232\nn03529860\nn02077923\nn03843555\nn03873416\nn02116738\nn03995372\nn02104365\nn04347754\nn04590129\nn03657121\nn01774384\nn03937543\nn07836838\nn04127249\nn02391049\nn04296562\nn02492035\nn04254120\nn04201297\nn02115641\nn02094258\nn03729826\nn02090379\nn02165456\nn02107142\nn01518878\nn03649909\nn01558993\nn01843383\nn01695060\nn02134084\nn02101556\nn02123045\nn03929855\nn02110185\nn03291819\nn02099601\nn04443257\nn02487347\nn01795545\nn04458633\nn02229544\nn03325584\nn04086273\nn03017168\nn01729977\nn03388043\nn01675722\nn02009229\nn03126707\nn02117135\nn03873416\nn04332243
\nn02486410\nn03394916\nn02480855\nn02837789\nn03018349\nn03998194\nn04317175\nn01819313\nn03291819\nn01664065\nn02128385\nn02417914\nn04040759\nn01440764\nn09468604\nn03240683\nn07248320\nn11939491\nn02971356\nn02096437\nn02101556\nn04467665\nn03983396\nn04146614\nn04252077\nn03476684\nn02777292\nn03617480\nn04004767\nn02102177\nn02088632\nn07749582\nn04264628\nn04487081\nn02808440\nn04399382\nn03961711\nn04229816\nn03977966\nn03133878\nn03877845\nn03995372\nn04131690\nn02093754\nn02110806\nn01872401\nn02106662\nn07836838\nn04553703\nn02095314\nn12620546\nn02231487\nn02277742\nn04456115\nn02643566\nn02317335\nn04008634\nn04476259\nn04550184\nn02107908\nn02125311\nn03355925\nn03769881\nn07615774\nn02443114\nn02167151\nn04590129\nn12620546\nn02177972\nn03866082\nn07718472\nn02102318\nn07697313\nn03384352\nn04330267\nn03874293\nn03895866\nn02444819\nn03908714\nn02395406\nn04355933\nn03220513\nn04147183\nn02099267\nn01983481\nn01770081\nn02095570\nn01695060\nn02115641\nn04355338\nn07584110\nn02843684\nn04023962\nn02102480\nn04116512\nn02094258\nn04326547\nn02951358\nn01784675\nn03494278\nn03935335\nn02106662\nn02256656\nn03944341\nn02105641\nn02666196\nn03982430\nn02814533\nn04204238\nn07730033\nn01807496\nn03042490\nn02963159\nn02504458\nn03535780\nn04355933\nn02009229\nn02423022\nn01582220\nn07614500\nn02321529\nn03272562\nn03642806\nn04251144\nn02115913\nn02107312\nn03924679\nn02699494\nn03908714\nn04522168\nn09246464\nn03617480\nn02231487\nn02127052\nn04335435\nn02804610\nn02437616\nn03249569\nn01682714\nn02790996\nn03742115\nn02112350\nn02837789\nn04371774\nn03443371\nn02992529\nn01688243\nn03733281\nn07875152\nn02105641\nn02110958\nn02018795\nn04482393\nn03063689\nn02328150\nn02109525\nn02071294\nn02808304\nn03530642\nn03970156\nn01860187\nn02102973\nn03220513\nn03032252\nn01797886\nn03792782\nn02085936\nn04487394\nn02790996\nn01773157\nn04367480\nn03290653\nn03478589\nn04542943\nn07579787\nn02190166\nn06785654\nn02002724\nn01740131\nn04033995\nn01978287\nn020114
60\nn03937543\nn02096437\nn01534433\nn02978881\nn03445924\nn07716358\nn02093428\nn01776313\nn02704792\nn01687978\nn04550184\nn02102973\nn02165456\nn03347037\nn01755581\nn02111889\nn03967562\nn01491361\nn02437616\nn02089078\nn02123597\nn04507155\nn03110669\nn03868242\nn03874599\nn02120505\nn03930313\nn02165105\nn04604644\nn03445777\nn02099712\nn02009229\nn04389033\nn04371774\nn02437616\nn04243546\nn03794056\nn03775071\nn04479046\nn03796401\nn02892767\nn03929660\nn02133161\nn03944341\nn03884397\nn04589890\nn03590841\nn02071294\nn04263257\nn01768244\nn02410509\nn04465501\nn02098286\nn02747177\nn02105162\nn01667114\nn02999410\nn01560419\nn07749582\nn01968897\nn02130308\nn02110806\nn02106382\nn07590611\nn07697537\nn04591157\nn04462240\nn02988304\nn03126707\nn02727426\nn04127249\nn02843684\nn03179701\nn02443484\nn04344873\nn02280649\nn03216828\nn12985857\nn04548280\nn03602883\nn03447721\nn01694178\nn02415577\nn02699494\nn03085013\nn02895154\nn04371774\nn03495258\nn03791053\nn02641379\nn02980441\nn02950826\nn02110063\nn03788195\nn01693334\nn02606052\nn07742313\nn02113624\nn03874293\nn04209239\nn03388043\nn02927161\nn03944341\nn04579432\nn03759954\nn02101388\nn01978287\nn03443371\nn02129604\nn01693334\nn07742313\nn01770393\nn06785654\nn03126707\nn02058221\nn03721384\nn02093647\nn07684084\nn03775546\nn03494278\nn03131574\nn02823428\nn02111889\nn04208210\nn02190166\nn04228054\nn03888257\nn02169497\nn01770081\nn02974003\nn03637318\nn02089078\nn02117135\nn02457408\nn02606052\nn03877845\nn02776631\nn01882714\nn03325584\nn02095314\nn02102973\nn02236044\nn02090622\nn02797295\nn01775062\nn02098286\nn03498962\nn02128385\nn02783161\nn07768694\nn03337140\nn01751748\nn04447861\nn02172182\nn03743016\nn03599486\nn04380533\nn07892512\nn03598930\nn02085782\nn01685808\nn02879718\nn01491361\nn04273569\nn02441942\nn04553703\nn03649909\nn03141823\nn02115641\nn04372370\nn04265275\nn04493381\nn06596364\nn02825657\nn02480495\nn02097298\nn03532672\nn01531178\nn03843555\nn03770679\nn02346627\nn0212
7052\nn03297495\nn02869837\nn02106166\nn01440764\nn02510455\nn02095570\nn02177972\nn03347037\nn01978455\nn02488702\nn02791124\nn04229816\nn01675722\nn03630383\nn01930112\nn04005630\nn04039381\nn03950228\nn04592741\nn01914609\nn02129165\nn01871265\nn03902125\nn01689811\nn03534580\nn01945685\nn01773549\nn02089867\nn03788195\nn02788148\nn02113023\nn03534580\nn04592741\nn02797295\nn03017168\nn04355933\nn02097209\nn02167151\nn04026417\nn03271574\nn02105251\nn04004767\nn02108000\nn04350905\nn02106662\nn03201208\nn03126707\nn01443537\nn02837789\nn02165456\nn03796401\nn02870880\nn02641379\nn01622779\nn02113023\nn07880968\nn02165456\nn03840681\nn03372029\nn04044716\nn03840681\nn03692522\nn03992509\nn02085620\nn03530642\nn02113186\nn02086079\nn07614500\nn09468604\nn03602883\nn09468604\nn04270147\nn04146614\nn02892201\nn03958227\nn03832673\nn02268443\nn02236044\nn01494475\nn02009912\nn01532829\nn02093754\nn03404251\nn03770439\nn07734744\nn04252077\nn07714571\nn02120079\nn01665541\nn02123394\nn03240683\nn04264628\nn02457408\nn07614500\nn02124075\nn03425413\nn03133878\nn07930864\nn03160309\nn02484975\nn02086240\nn02978881\nn04404412\nn02643566\nn02494079\nn02749479\nn02114855\nn02106166\nn02114712\nn03662601\nn07583066\nn02396427\nn02108089\nn04335435\nn03017168\nn02113186\nn04493381\nn02909870\nn03075370\nn03627232\nn03794056\nn01734418\nn02951358\nn02457408\nn02883205\nn02917067\nn03250847\nn02804610\nn02110958\nn02088364\nn03891251\nn02641379\nn02098105\nn02113624\nn02027492\nn02066245\nn02168699\nn06359193\nn03627232\nn09229709\nn02749479\nn04355338\nn04252225\nn02939185\nn01632777\nn02395406\nn02219486\nn02988304\nn01518878\nn03891332\nn02114548\nn02892767\nn01491361\nn03933933\nn02795169\nn09472597\nn07579787\nn03032252\nn02093754\nn13054560\nn03891251\nn02105505\nn02132136\nn07873807\nn02640242\nn04461696\nn04613696\nn09468604\nn02113186\nn02493509\nn04553703\nn01968897\nn04296562\nn03467068\nn03763968\nn04209239\nn02219486\nn03888257\nn01871265\nn03325584\nn03272562\nn03
854065\nn01558993\nn03670208\nn01665541\nn03325584\nn01695060\nn02457408\nn02797295\nn02950826\nn02099429\nn03291819\nn02939185\nn03976467\nn02120079\nn02879718\nn04579145\nn04120489\nn01632458\nn02009912\nn04328186\nn06874185\nn02398521\nn02488291\nn02107312\nn03026506\nn02119022\nn01843383\nn03657121\nn03062245\nn07584110\nn02091032\nn03476991\nn02013706\nn02607072\nn02113712\nn03788365\nn04355338\nn04428191\nn04442312\nn01753488\nn12620546\nn03417042\nn02108089\nn07871810\nn03930313\nn04019541\nn04074963\nn02408429\nn02817516\nn01955084\nn02747177\nn09472597\nn03866082\nn02099267\nn03782006\nn03998194\nn02823428\nn04487081\nn03956157\nn03854065\nn02002556\nn01440764\nn02093256\nn02229544\nn02109047\nn03160309\nn02825657\nn02423022\nn03016953\nn04179913\nn01860187\nn02107574\nn06359193\nn02088094\nn04065272\nn02088632\nn02130308\nn03769881\nn02966193\nn06794110\nn07590611\nn03924679\nn04153751\nn02112706\nn02509815\nn04335435\nn04579432\nn02815834\nn02361337\nn02123159\nn03133878\nn02457408\nn02092002\nn04347754\nn03775071\nn03498962\nn02101388\nn03447447\nn02443114\nn04039381\nn02791124\nn02104365\nn01776313\nn04442312\nn03584254\nn02094258\nn02086646\nn04370456\nn01797886\nn03724870\nn01775062\nn02687172\nn02091244\nn03124043\nn01632777\nn02787622\nn01930112\nn01664065\nn01734418\nn02110063\nn01818515\nn04336792\nn03793489\nn02097298\nn02017213\nn04273569\nn03485794\nn02002724\nn04507155\nn11879895\nn02087046\nn02486410\nn04033995\nn03345487\nn03692522\nn04347754\nn01986214\nn03873416\nn03483316\nn02101556\nn03425413\nn03000684\nn02114367\nn02113712\nn03535780\nn02454379\nn03788195\nn02086240\nn02095889\nn02422699\nn03400231\nn03690938\nn01494475\nn02099601\nn04612504\nn07753275\nn03814639\nn02165105\nn03314780\nn03478589\nn01796340\nn02105641\nn01847000\nn01877812\nn02447366\nn03929660\nn02992529\nn02088094\nn07745940\nn04522168\nn04069434\nn12620546\nn03673027\nn03998194\nn03028079\nn04252225\nn02033041\nn01843065\nn07720875\nn02099712\nn02939185\nn02098413\nn
04296562\nn03796401\nn01729977\nn02859443\nn02105251\nn02860847\nn04209133\nn02108000\nn04235860\nn02782093\nn02814533\nn01614925\nn01484850\nn01669191\nn04525305\nn07716906\nn02119022\nn03721384\nn02259212\nn03976657\nn02415577\nn04392985\nn04023962\nn02793495\nn04592741\nn02233338\nn02777292\nn01514859\nn03127747\nn04548362\nn03947888\nn03792782\nn03445777\nn04592741\nn02165105\nn02105056\nn04525038\nn02395406\nn02129604\nn09399592\nn09229709\nn06785654\nn03045698\nn04380533\nn02835271\nn07715103\nn03692522\nn02950826\nn02259212\nn03773504\nn04560804\nn04355933\nn02167151\nn01695060\nn02091635\nn07745940\nn03958227\nn03642806\nn01537544\nn03733131\nn02028035\nn02667093\nn03617480\nn02443484\nn04532106\nn06874185\nn02730930\nn01632458\nn04067472\nn09246464\nn02264363\nn09229709\nn02708093\nn03804744\nn03042490\nn03347037\nn02120079\nn02098105\nn02092339\nn03017168\nn02099429\nn03160309\nn12267677\nn03642806\nn07579787\nn02817516\nn01770393\nn01667114\nn04417672\nn04515003\nn02091134\nn02090721\nn04428191\nn02086646\nn04536866\nn03000684\nn01692333\nn04591157\nn03967562\nn03743016\nn04579145\nn02110063\nn04040759\nn02074367\nn03100240\nn04552348\nn02916936\nn03485407\nn02489166\nn03271574\nn01677366\nn02457408\nn02966193\nn04152593\nn01491361\nn01748264\nn03530642\nn03840681\nn01768244\nn02226429\nn03642806\nn02002556\nn03598930\nn01631663\nn03787032\nn03954731\nn04462240\nn03680355\nn02013706\nn03271574\nn04357314\nn02397096\nn01697457\nn02441942\nn03661043\nn01985128\nn03658185\nn02099267\nn04522168\nn13037406\nn02108422\nn04111531\nn01728920\nn02085620\nn01644373\nn02101388\nn02795169\nn02100877\nn04509417\nn02088466\nn02769748\nn02965783\nn03649909\nn03179701\nn01742172\nn01877812\nn03769881\nn03000247\nn02106662\nn03888605\nn03937543\nn04346328\nn03976467\nn03187595\nn15075141\nn03062245\nn03710721\nn04009552\nn02447366\nn02107574\nn03970156\nn03991062\nn02098413\nn07892512\nn03529860\nn03935335\nn01531178\nn02835271\nn03787032\nn02101388\nn02085620\nn02701002\
nn11939491\nn01698640\nn02233338\nn11879895\nn02101556\nn07753592\nn02441942\nn07871810\nn01914609\nn02132136\nn02097658\nn07720875\nn02259212\nn01560419\nn02510455\nn04200800\nn04254777\nn01616318\nn04522168\nn02100236\nn04356056\nn07615774\nn03160309\nn02666196\nn02169497\nn03207941\nn07831146\nn04131690\nn04136333\nn02895154\nn02002556\nn04311174\nn04243546\nn13052670\nn02895154\nn03527444\nn02090622\nn04429376\nn01667778\nn01871265\nn01608432\nn03424325\nn02111129\nn02094114\nn03706229\nn02883205\nn07590611\nn02948072\nn01770393\nn03290653\nn02128925\nn02110185\nn02110341\nn01796340\nn02342885\nn02487347\nn04310018\nn02091635\nn02708093\nn03016953\nn02264363\nn04372370\nn03272562\nn02089078\nn03764736\nn02963159\nn03874599\nn02641379\nn01984695\nn02802426\nn02346627\nn03773504\nn04273569\nn02111889\nn03498962\nn03141823\nn04350905\nn02095314\nn04335435\nn03388183\nn01537544\nn03947888\nn02106662\nn03854065\nn01484850\nn02086079\nn07714571\nn01768244\nn04070727\nn03494278\nn03584829\nn03837869\nn01945685\nn03733281\nn04429376\nn02099601\nn04554684\nn04509417\nn01943899\nn07565083\nn04515003\nn03777754\nn03594734\nn03777568\nn03840681\nn02536864\nn04442312\nn03127747\nn03445777\nn04579432\nn03063599\nn02113978\nn03787032\nn01742172\nn02487347\nn04486054\nn02093859\nn04162706\nn02328150\nn03482405\nn04517823\nn07615774\nn04192698\nn02808304\nn02037110\nn04254120\nn02490219\nn07684084\nn02094258\nn02814533\nn02174001\nn07753275\nn04033901\nn02481823\nn03770679\nn03134739\nn01560419\nn04275548\nn01667778\nn01737021\nn01806567\nn04456115\nn07613480\nn01737021\nn03761084\nn07753592\nn04461696\nn04336792\nn02137549\nn02100735\nn04005630\nn02112706\nn12144580\nn03785016\nn03372029\nn04486054\nn02117135\nn01667778\nn02927161\nn07760859\nn03924679\nn04040759\nn07742313\nn02106030\nn03388549\nn03950228\nn01768244\nn07734744\nn04479046\nn02791124\nn01807496\nn04357314\nn01484850\nn03888605\nn04277352\nn04326547\nn03876231\nn07584110\nn02092002\nn01667778\nn01682714\nn0209183
1\nn02108089\nn02951585\nn02219486\nn02090379\nn01950731\nn02089867\nn01828970\nn03837869\nn01978287\nn02092002\nn02814533\nn01664065\nn12768682\nn07930864\nn04357314\nn02802426\nn02089867\nn03063689\nn03535780\nn04591713\nn03796401\nn02877765\nn02823428\nn07717410\nn04612504\nn03642806\nn04033995\nn02095889\nn04074963\nn01855032\nn04270147\nn03110669\nn03255030\nn03530642\nn10148035\nn07745940\nn02490219\nn02074367\nn02097130\nn02106662\nn03891332\nn02089973\nn04209239\nn04548280\nn04154565\nn02037110\nn02113978\nn02115913\nn02018795\nn02823428\nn02091032\nn03874293\nn04146614\nn04560804\nn04522168\nn07717556\nn04311004\nn02105855\nn02109961\nn02134084\nn02930766\nn01855032\nn02480495\nn02509815\nn02100877\nn02795169\nn02125311\nn01734418\nn03124043\nn02165105\nn02840245\nn03759954\nn01622779\nn02442845\nn04328186\nn04152593\nn04554684\nn02965783\nn02510455\nn03445777\nn07615774\nn12998815\nn07717410\nn03742115\nn04264628\nn02165456\nn04074963\nn02098105\nn02132136\nn01872401\nn02441942\nn04560804\nn02422699\nn02802426\nn07768694\nn01518878\nn02096051\nn02786058\nn02483708\nn02099601\nn04435653\nn01630670\nn02177972\nn13052670\nn02028035\nn01978455\nn13054560\nn02165105\nn04317175\nn01739381\nn02168699\nn02483362\nn02342885\nn02007558\nn01798484\nn04579145\nn02361337\nn02643566\nn04147183\nn04208210\nn01798484\nn02488291\nn03773504\nn03662601\nn02483708\nn01986214\nn04005630\nn02165105\nn02009229\nn03814639\nn04462240\nn02090379\nn03786901\nn01734418\nn01770081\nn02814533\nn03445777\nn03196217\nn02747177\nn02493793\nn03970156\nn02165105\nn03930313\nn02169497\nn04204347\nn02113712\nn02979186\nn02085782\nn04265275\nn01694178\nn09229709\nn04317175\nn07760859\nn02865351\nn03841143\nn01601694\nn02128925\nn03908714\nn01775062\nn01770393\nn02877765\nn03902125\nn01744401\nn02094114\nn03271574\nn04372370\nn07697313\nn04229816\nn02692877\nn01537544\nn04153751\nn02490219\nn09193705\nn02951585\nn01986214\nn02865351\nn02105855\nn04392985\nn03825788\nn04265275\nn12267677\nn03787
032\nn02088632\nn04507155\nn03481172\nn03868242\nn02797295\nn02500267\nn02480855\nn03956157\nn02948072\nn03792782\nn03478589\nn04590129\nn01729322\nn02105056\nn02837789\nn03393912\nn02319095\nn02100735\nn02093256\nn03782006\nn03388043\nn03891251\nn02391049\nn02167151\nn03045698\nn01534433\nn04067472\nn02105641\nn04423845\nn01983481\nn03160309\nn02802426\nn09428293\nn02106382\nn04325704\nn02444819\nn01755581\nn02895154\nn02129604\nn02910353\nn07873807\nn07716358\nn03325584\nn02104029\nn01883070\nn02408429\nn02992529\nn02111277\nn04141327\nn02098105\nn12998815\nn04133789\nn02837789\nn02321529\nn04041544\nn03131574\nn01968897\nn03721384\nn09428293\nn03637318\nn04536866\nn01641577\nn01828970\nn02794156\nn02105855\nn02825657\nn02100735\nn02487347\nn02281406\nn04550184\nn02804414\nn03594734\nn01806143\nn09256479\nn04204238\nn03544143\nn04350905\nn04380533\nn03459775\nn04509417\nn02480495\nn04204347\nn03967562\nn03666591\nn03481172\nn03179701\nn01728920\nn09835506\nn02509815\nn11939491\nn02125311\nn01774750\nn01924916\nn04380533\nn03496892\nn02510455\nn02808304\nn04328186\nn04009552\nn02105505\nn02454379\nn04507155\nn01592084\nn04118538\nn01644373\nn02965783\nn03742115\nn07715103\nn03733281\nn02268853\nn03967562\nn02107574\nn04597913\nn01798484\nn04562935\nn04584207\nn07717556\nn02110958\nn04597913\nn07693725\nn02086910\nn04136333\nn01843383\nn02794156\nn02101556\nn04192698\nn02389026\nn03250847\nn01817953\nn01682714\nn01491361\nn06874185\nn02093647\nn02483362\nn04435653\nn01667778\nn04548280\nn03133878\nn02840245\nn01950731\nn04229816\nn01817953\nn04346328\nn07871810\nn04493381\nn03476684\nn01882714\nn03100240\nn02105505\nn03623198\nn02128925\nn07749582\nn03124170\nn03042490\nn01531178\nn03180011\nn02276258\nn03538406\nn01843383\nn01833805\nn02109047\nn01735189\nn01514859\nn02396427\nn01537544\nn07920052\nn02077923\nn03661043\nn03445924\nn01514859\nn04418357\nn01630670\nn02256656\nn02980441\nn01985128\nn03787032\nn09399592\nn02096177\nn03095699\nn02791270\nn02002556\nn020
99429\nn02687172\nn04487081\nn03775071\nn04120489\nn02100877\nn04131690\nn02111277\nn04008634\nn03796401\nn03690938\nn03496892\nn02487347\nn02098286\nn04398044\nn02281787\nn02641379\nn03179701\nn03110669\nn03314780\nn03388549\nn02441942\nn02091831\nn03933933\nn07584110\nn02510455\nn02437312\nn02417914\nn02110806\nn02667093\nn03384352\nn03529860\nn04209239\nn04254120\nn04310018\nn07615774\nn01984695\nn03188531\nn02701002\nn01749939\nn03494278\nn04317175\nn02480855\nn04553703\nn04591713\nn02093991\nn03496892\nn03498962\nn02870880\nn07734744\nn02090622\nn02095889\nn03089624\nn03814906\nn01443537\nn03775546\nn03895866\nn04254680\nn02093991\nn02094433\nn03709823\nn04133789\nn04356056\nn09421951\nn03781244\nn03970156\nn03709823\nn03873416\nn03950228\nn03425413\nn09229709\nn03141823\nn03290653\nn01675722\nn04259630\nn04613696\nn03838899\nn01443537\nn03617480\nn02112350\nn01774384\nn02108915\nn03876231\nn02099429\nn02226429\nn01770393\nn01694178\nn06794110\nn03220513\nn11879895\nn03124043\nn02105855\nn02486410\nn04004767\nn09835506\nn07745940\nn02097047\nn03721384\nn03133878\nn02093647\nn06794110\nn04317175\nn02134418\nn02692877\nn02128757\nn03794056\nn02727426\nn01484850\nn02514041\nn02106382\nn02097298\nn04613696\nn02701002\nn03770439\nn01855672\nn02328150\nn03944341\nn09468604\nn02281787\nn04554684\nn02098105\nn03179701\nn02174001\nn02109961\nn03742115\nn04562935\nn03729826\nn04133789\nn04086273\nn01514859\nn04597913\nn04476259\nn01914609\nn02095889\nn03125729\nn04366367\nn02443114\nn02098413\nn03599486\nn01614925\nn04483307\nn02105412\nn01631663\nn02500267\nn02095889\nn04264628\nn07753592\nn02123597\nn03884397\nn04579432\nn03938244\nn07831146\nn02101006\nn02092002\nn02006656\nn02106166\nn04596742\nn03770679\nn04149813\nn04599235\nn04332243\nn03379051\nn01776313\nn01806567\nn09468604\nn04554684\nn02747177\nn04243546\nn03838899\nn01855032\nn01917289\nn02226429\nn03706229\nn03843555\nn07615774\nn02268853\nn04141975\nn01728920\nn01531178\nn03838899\nn09472597\nn01847000\nn1
3133613\nn04522168\nn02088466\nn09193705\nn03445924\nn02092002\nn02640242\nn07742313\nn04612504\nn01986214\nn09229709\nn02488291\nn02643566\nn03891251\nn09468604\nn01983481\nn07920052\nn03770679\nn02097130\nn03769881\nn03498962\nn07697537\nn02422699\nn04254777\nn03452741\nn04152593\nn01616318\nn02259212\nn03690938\nn04501370\nn04355933\nn01498041\nn04023962\nn02488702\nn04443257\nn02091134\nn02978881\nn02091244\nn01756291\nn04120489\nn04141327\nn02504458\nn01667778\nn02108089\nn03843555\nn02951358\nn01807496\nn02102318\nn07745940\nn06794110\nn02363005\nn07753113\nn01644900\nn02363005\nn01484850\nn02105056\nn02107312\nn03482405\nn01945685\nn02823750\nn02090622\nn03710193\nn03379051\nn07873807\nn04263257\nn03062245\nn02088632\nn04208210\nn04141327\nn07932039\nn02951358\nn02790996\nn02777292\nn02804414\nn03970156\nn04501370\nn02641379\nn01774750\nn01498041\nn04116512\nn02233338\nn03706229\nn02097047\nn07697537\nn02444819\nn04153751\nn02398521\nn03908714\nn02088632\nn02113712\nn02132136\nn04258138\nn03425413\nn02397096\nn02443484\nn06785654\nn04367480\nn03717622\nn03721384\nn02981792\nn01955084\nn02090721\nn02879718\nn02113712\nn02417914\nn02093859\nn02009912\nn02006656\nn01770393\nn02701002\nn01818515\nn12998815\nn03532672\nn03666591\nn06794110\nn03110669\nn03220513\nn03976467\nn02396427\nn03888257\nn02514041\nn02837789\nn07711569\nn07613480\nn03075370\nn07684084\nn02708093\nn02099267\nn03131574\nn01843383\nn02091032\nn03796401\nn04243546\nn04389033\nn03014705\nn03868863\nn01883070\nn01744401\nn12267677\nn03876231\nn01847000\nn02219486\nn01955084\nn03089624\nn04350905\nn02119022\nn04004767\nn02793495\nn03404251\nn03014705\nn01677366\nn03690938\nn04162706\nn04552348\nn01985128\nn07873807\nn02526121\nn07932039\nn02102973\nn02108000\nn04493381\nn02097130\nn04086273\nn03832673\nn02088364\nn02119789\nn02113712\nn07716906\nn03792972\nn02097658\nn02226429\nn09428293\nn02116738\nn07753113\nn02777292\nn02017213\nn04209239\nn02077923\nn02509815\nn07716906\nn02843684\nn02417914\n
n07920052\nn09288635\nn01980166\nn09193705\nn03124043\nn03944341\nn02219486\nn02127052\nn04147183\nn02106550\nn04550184\nn01728572\nn02102480\nn04371430\nn03983396\nn02815834\nn04264628\nn04356056\nn02096294\nn02106382\nn07579787\nn02536864\nn03630383\nn02114367\nn03781244\nn03271574\nn01739381\nn04008634\nn03594734\nn03201208\nn02058221\nn02134418\nn10148035\nn01631663\nn02526121\nn02002556\nn02095314\nn02098105\nn04509417\nn04612504\nn02497673\nn01580077\nn01697457\nn03109150\nn09468604\nn03874293\nn02109961\nn02110627\nn02892201\nn02088364\nn03100240\nn03532672\nn02892767\nn07860988\nn03337140\nn02951358\nn03691459\nn03134739\nn02422106\nn02788148\nn03814906\nn02444819\nn06785654\nn04612504\nn02123394\nn03042490\nn04116512\nn03527444\nn09288635\nn01983481\nn09332890\nn07715103\nn01828970\nn04037443\nn03089624\nn02504458\nn01917289\nn03223299\nn02119022\nn02206856\nn04252077\nn02012849\nn02037110\nn01751748\nn07930864\nn04131690\nn07697313\nn02841315\nn03950228\nn04254680\nn04141975\nn03983396\nn02124075\nn12998815\nn03709823\nn01689811\nn02966687\nn03590841\nn02002556\nn01770393\nn04532106\nn02109961\nn04286575\nn02910353\nn03785016\nn04125021\nn04370456\nn02115641\nn03874293\nn13054560\nn02480855\nn02105855\nn01773157\nn02108915\nn02108000\nn03764736\nn02231487\nn04507155\nn01744401\nn04325704\nn02526121\nn04371774\nn01582220\nn02088094\nn12267677\nn07880968\nn04266014\nn02417914\nn04270147\nn07684084\nn01443537\nn03866082\nn04179913\nn02422106\nn07697537\nn02687172\nn03803284\nn01692333\nn04192698\nn02481823\nn02115913\nn03404251\nn02138441\nn02999410\nn03388183\nn02317335\nn03759954\nn04335435\nn03814906\nn03692522\nn13052670\nn03729826\nn02790996\nn02012849\nn03935335\nn01667114\nn07836838\nn01580077\nn07615774\nn03535780\nn02226429\nn03903868\nn02999410\nn03532672\nn03498962\nn01531178\nn03868242\nn02128757\nn03793489\nn01755581\nn09332890\nn02087394\nn03920288\nn02128385\nn03495258\nn02114712\nn03976467\nn04259630\nn02794156\nn01774384\nn02091467\nn04467665
\nn02091635\nn04579432\nn03599486\nn02328150\nn04147183\nn02486410\nn04252077\nn02395406\nn07584110\nn03075370\nn02138441\nn02105505\nn04311004\nn04086273\nn04435653\nn04467665\nn04201297\nn01689811\nn03345487\nn02090379\nn02776631\nn04023962\nn02114367\nn13044778\nn02917067\nn07711569\nn03452741\nn01734418\nn03272010\nn01744401\nn09399592\nn02114855\nn03594734\nn02860847\nn04141076\nn02133161\nn03804744\nn01924916\nn04532106\nn01770081\nn02096177\nn02797295\nn03188531\nn04204347\nn03063689\nn02841315\nn02276258\nn02086646\nn03775071\nn03947888\nn02137549\nn03063599\nn02074367\nn02051845\nn03832673\nn03982430\nn01776313\nn02102177\nn02106550\nn03929855\nn04201297\nn01592084\nn02906734\nn03124043\nn03598930\nn07590611\nn02091635\nn02128757\nn04204347\nn01698640\nn01955084\nn03891251\nn02823428\nn03417042\nn03666591\nn03958227\nn03895866\nn02690373\nn01667778\nn02692877\nn03532672\nn07920052\nn03924679\nn03085013\nn07697313\nn02444819\nn02992211\nn07248320\nn02950826\nn02077923\nn03786901\nn03016953\nn02111889\nn02892201\nn02786058\nn02106382\nn02877765\nn02687172\nn02747177\nn02105412\nn07753113\nn03207743\nn04418357\nn02009912\nn01580077\nn01616318\nn04273569\nn01945685\nn03706229\nn04326547\nn02105056\nn13037406\nn03459775\nn02526121\nn02837789\nn04346328\nn01819313\nn02321529\nn03916031\nn03026506\nn02105251\nn04599235\nn01518878\nn02110627\nn01984695\nn01943899\nn04069434\nn02113023\nn01531178\nn03947888\nn03733805\nn03873416\nn02087394\nn04273569\nn03690938\nn02281787\nn04515003\nn01630670\nn03445924\nn04317175\nn02395406\nn02018207\nn02128385\nn03255030\nn02169497\nn03717622\nn03602883\nn02488291\nn01622779\nn03992509\nn02877765\nn03873416\nn01855672\nn03478589\nn03404251\nn07584110\nn03980874\nn03476684\nn02138441\nn02977058\nn02105162\nn03485407\nn01616318\nn02051845\nn03793489\nn01768244\nn04209239\nn03930630\nn04532106\nn03259280\nn02841315\nn02966193\nn03980874\nn04532106\nn02981792\nn01776313\nn04355338\nn02110341\nn03697007\nn02454379\nn02655020\nn038411
43\nn07584110\nn02123394\nn03255030\nn07711569\nn03724870\nn03110669\nn03133878\nn01641577\nn01644373\nn04049303\nn07768694\nn03075370\nn02823428\nn02640242\nn02104365\nn04009552\nn02129604\nn03733805\nn02281787\nn04208210\nn04067472\nn01514859\nn03384352\nn03544143\nn03355925\nn01694178\nn03950228\nn07717556\nn02317335\nn02113799\nn07583066\nn02999410\nn07760859\nn02410509\nn02013706\nn04285008\nn04296562\nn03196217\nn03000134\nn02110627\nn04442312\nn02787622\nn02443484\nn02137549\nn03337140\nn03594734\nn02879718\nn02415577\nn02092339\nn03450230\nn02102040\nn07747607\nn03085013\nn03026506\nn06874185\nn02493793\nn03532672\nn01644900\nn03792782\nn04004767\nn02966193\nn01784675\nn13037406\nn03481172\nn03775546\nn04033995\nn02101556\nn03666591\nn04317175\nn01882714\nn02640242\nn03063689\nn04560804\nn01860187\nn04376876\nn04523525\nn01833805\nn02169497\nn03314780\nn02988304\nn02168699\nn04044716\nn02109961\nn01770393\nn01531178\nn04152593\nn02106662\nn04389033\nn01735189\nn07871810\nn04277352\nn02077923\nn03347037\nn02111500\nn02088238\nn03534580\nn03314780\nn02791270\nn04548280\nn03109150\nn03944341\nn02137549\nn04523525\nn04592741\nn04266014\nn01978455\nn02091032\nn04398044\nn02113624\nn02408429\nn04417672\nn04009552\nn02231487\nn04599235\nn07248320\nn04086273\nn04606251\nn03532672\nn02112137\nn09256479\nn04523525\nn01697457\nn03662601\nn04070727\nn02098286\nn02017213\nn02177972\nn01689811\nn03697007\nn03874599\nn02110185\nn04417672\nn04310018\nn02130308\nn04252077\nn03534580\nn01860187\nn03814906\nn02442845\nn04487394\nn02090379\nn01930112\nn07860988\nn02869837\nn02231487\nn03956157\nn03482405\nn02489166\nn02107683\nn01677366\nn01806143\nn03775071\nn02825657\nn02783161\nn01622779\nn02268853\nn04044716\nn04540053\nn02107142\nn04487394\nn03376595\nn01496331\nn02815834\nn02099267\nn04229816\nn07615774\nn03272562\nn01855672\nn02804414\nn01818515\nn02704792\nn02483708\nn01629819\nn03393912\nn03794056\nn01644373\nn02951585\nn02497673\nn02415577\nn01871265\nn07718747\nn0296
6193\nn03017168\nn01530575\nn02319095\nn02090379\nn03297495\nn03388183\nn03825788\nn01798484\nn03814906\nn02027492\nn02111889\nn04118538\nn02356798\nn01983481\nn01986214\nn02808440\nn02486261\nn01751748\nn03777568\nn04335435\nn07720875\nn03633091\nn03534580\nn04141975\nn04162706\nn03998194\nn07579787\nn02676566\nn03483316\nn01693334\nn04238763\nn02071294\nn04493381\nn07875152\nn01753488\nn02091635\nn03314780\nn03291819\nn03924679\nn12768682\nn06794110\nn03291819\nn03544143\nn01698640\nn06785654\nn03782006\nn04154565\nn02012849\nn07930864\nn03017168\nn04133789\nn02138441\nn03769881\nn03773504\nn07930864\nn04589890\nn01806143\nn03207743\nn02097474\nn01582220\nn02939185\nn02640242\nn02981792\nn03657121\nn02106166\nn02666196\nn01751748\nn03188531\nn01768244\nn04429376\nn02690373\nn01806567\nn02319095\nn02107683\nn04550184\nn04350905\nn01797886\nn04447861\nn04485082\nn03443371\nn04229816\nn03443371\nn04579145\nn03125729\nn03942813\nn03649909\nn02119022\nn02105251\nn12144580\nn02992529\nn01518878\nn02977058\nn01968897\nn02233338\nn03642806\nn01833805\nn09421951\nn01985128\nn01824575\nn04286575\nn04330267\nn02106166\nn07875152\nn02094258\nn02123394\nn01537544\nn04493381\nn02102480\nn02086240\nn02085782\nn03786901\nn04254680\nn03721384\nn04311174\nn04487394\nn02099267\nn03207941\nn02883205\nn02672831\nn04008634\nn03868863\nn04251144\nn03529860\nn01608432\nn02093647\nn02028035\nn03982430\nn01687978\nn01632458\nn03125729\nn02389026\nn02085782\nn06359193\nn03459775\nn01773797\nn02093754\nn04275548\nn02120505\nn03450230\nn03854065\nn02096177\nn02112706\nn02089867\nn02138441\nn02504458\nn02865351\nn04479046\nn03180011\nn03223299\nn02804414\nn02134418\nn01751748\nn02483708\nn01692333\nn02992211\nn03404251\nn07716906\nn01924916\nn07695742\nn02112137\nn02692877\nn02423022\nn02860847\nn01877812\nn04326547\nn02051845\nn01855672\nn02667093\nn01829413\nn07760859\nn01630670\nn02869837\nn02086910\nn01740131\nn02398521\nn03016953\nn02091134\nn02096585\nn02093647\nn03220513\nn07716906\nn03
188531\nn03627232\nn03690938\nn02788148\nn04254680\nn02493509\nn02098413\nn03532672\nn02111889\nn01843065\nn02666196\nn02457408\nn03785016\nn02097474\nn02704792\nn03868863\nn04540053\nn03529860\nn04238763\nn03658185\nn03970156\nn04285008\nn02526121\nn02096585\nn03814639\nn03180011\nn02480855\nn03594945\nn02101006\nn04517823\nn12985857\nn02104029\nn04111531\nn01729322\nn03773504\nn01580077\nn02098413\nn04065272\nn02085936\nn02093859\nn02104365\nn09472597\nn02865351\nn04254680\nn02951358\nn02281787\nn01496331\nn02093256\nn01910747\nn04509417\nn02417914\nn02389026\nn03666591\nn06794110\nn03786901\nn07695742\nn02133161\nn04540053\nn02782093\nn01871265\nn03690938\nn02028035\nn02106550\nn02494079\nn07831146\nn01498041\nn02130308\nn04483307\nn01820546\nn02105056\nn04487081\nn09332890\nn02437312\nn03692522\nn02871525\nn02326432\nn07749582\nn02992211\nn02497673\nn03544143\nn13052670\nn13133613\nn07714571\nn03868863\nn02606052\nn02111129\nn03874293\nn02190166\nn02226429\nn02363005\nn02443484\nn04579145\nn03425413\nn03018349\nn03452741\nn02791124\nn02346627\nn02128757\nn03998194\nn03530642\nn01592084\nn01917289\nn03764736\nn07615774\nn03977966\nn02877765\nn02089973\nn01986214\nn01872401\nn03942813\nn01689811\nn02834397\nn07714990\nn02486261\nn02397096\nn04467665\nn02909870\nn04517823\nn04131690\nn01728572\nn01729322\nn01797886\nn02108551\nn03866082\nn01677366\nn02979186\nn03710637\nn03933933\nn03930313\nn03899768\nn03763968\nn02326432\nn02107142\nn02066245\nn04099969\nn07860988\nn07695742\nn01924916\nn03895866\nn03788365\nn01632777\nn02787622\nn01768244\nn01768244\nn03146219\nn06785654\nn02110341\nn03400231\nn02123045\nn02025239\nn03670208\nn01784675\nn03982430\nn04485082\nn03208938\nn01990800\nn03930313\nn02708093\nn04597913\nn01796340\nn02100236\nn01608432\nn01828970\nn01614925\nn03400231\nn01631663\nn03759954\nn01872401\nn01917289\nn02690373\nn01664065\nn03016953\nn04376876\nn01664065\nn02950826\nn04557648\nn02793495\nn02111129\nn01968897\nn03781244\nn07871810\nn02641379\nn
02097209\nn02109047\nn03065424\nn03838899\nn04501370\nn01753488\nn04049303\nn02097047\nn04311004\nn03538406\nn03666591\nn02017213\nn02093647\nn04409515\nn03207743\nn01843065\nn03697007\nn03291819\nn03197337\nn03000247\nn02443484\nn03891251\nn02085782\nn04033901\nn03658185\nn01819313\nn03388549\nn02606052\nn04612504\nn01582220\nn02883205\nn04467665\nn03535780\nn04326547\nn03895866\nn02095889\nn02123045\nn03777568\nn01631663\nn02999410\nn07717410\nn02837789\nn04461696\nn07720875\nn03141823\nn03216828\nn04589890\nn02105641\nn03196217\nn01797886\nn07742313\nn02396427\nn04532106\nn02655020\nn02437312\nn03028079\nn02037110\nn03788365\nn01978455\nn02483362\nn02444819\nn01580077\nn04347754\nn01728572\nn03063689\nn02106662\nn02672831\nn03895866\nn04560804\nn04540053\nn02233338\nn03777754\nn02788148\nn09472597\nn02484975\nn04404412\nn02087046\nn02089078\nn03255030\nn03095699\nn07714990\nn02641379\nn03218198\nn02481823\nn01514859\nn03337140\nn04399382\nn02641379\nn02129604\nn03982430\nn04127249\nn04125021\nn01774384\nn01740131\nn02325366\nn04041544\nn02667093\nn07836838\nn01739381\nn02108000\nn02277742\nn01950731\nn03777754\nn04310018\nn02917067\nn02835271\nn04515003\nn02119789\nn02966687\nn03085013\nn12144580\nn02071294\nn12998815\nn04162706\nn03028079\nn03218198\nn02895154\nn04562935\nn07613480\nn02128925\nn03649909\nn01629819\nn01883070\nn02098413\nn02002724\nn02106382\nn01530575\nn02113978\nn02124075\nn04332243\nn02655020\nn04239074\nn01910747\nn09399592\nn02096051\nn03930630\nn07693725\nn03933933\nn03187595\nn02281787\nn02892201\nn02108000\nn01687978\nn03803284\nn07892512\nn02074367\nn03891251\nn03384352\nn04409515\nn02107574\nn01860187\nn03529860\nn02280649\nn02860847\nn03325584\nn04409515\nn03692522\nn02089973\nn02782093\nn03208938\nn02980441\nn01693334\nn01773157\nn01729977\nn03063689\nn02865351\nn03459775\nn03637318\nn04263257\nn04604644\nn04311004\nn02120079\nn02112018\nn03196217\nn01871265\nn02804610\nn07892512\nn03124043\nn02219486\nn02089973\nn02109047\nn04040759\
nn07711569\nn04458633\nn07720875\nn02277742\nn01675722\nn02119022\nn02106030\nn03763968\nn02105412\nn03017168\nn03857828\nn04346328\nn04005630\nn03492542\nn02480495\nn02090622\nn03814906\nn04004767\nn02992529\nn02692877\nn09332890\nn02979186\nn01770393\nn02129165\nn02391049\nn07871810\nn03355925\nn04398044\nn07860988\nn03961711\nn02089973\nn03404251\nn02395406\nn03063689\nn04070727\nn04552348\nn02112137\nn02110958\nn01753488\nn07697537\nn04389033\nn02783161\nn07693725\nn04286575\nn07753113\nn07716358\nn03394916\nn02093256\nn01737021\nn07836838\nn02268853\nn02130308\nn02906734\nn02134418\nn02108000\nn01560419\nn03131574\nn02133161\nn03000247\nn02279972\nn02951585\nn03733805\nn01677366\nn03976467\nn03535780\nn03938244\nn01644373\nn02109525\nn03649909\nn02190166\nn01692333\nn02910353\nn01807496\nn03982430\nn02974003\nn03950228\nn01978287\nn03720891\nn02892767\nn02504013\nn01855032\nn02483362\nn02025239\nn03868242\nn02094114\nn02109047\nn07749582\nn01669191\nn03785016\nn04041544\nn02087046\nn03272010\nn03447447\nn02783161\nn03976657\nn02087394\nn04548280\nn01860187\nn01689811\nn04584207\nn04251144\nn02113023\nn03977966\nn03792972\nn13054560\nn06785654\nn07734744\nn02115641\nn04606251\nn02277742\nn02794156\nn02137549\nn04479046\nn01753488\nn04485082\nn02100735\nn02869837\nn03534580\nn02879718\nn04525305\nn01829413\nn03792782\nn02109961\nn03443371\nn02009229\nn01744401\nn01728572\nn02098413\nn04311004\nn03272010\nn02095570\nn01632458\nn02783161\nn01644900\nn01601694\nn01608432\nn04335435\nn02086910\nn04418357\nn02097658\nn03124170\nn04228054\nn02494079\nn07754684\nn02493793\nn02165105\nn02133161\nn01847000\nn03394916\nn02105162\nn01950731\nn03970156\nn02233338\nn03045698\nn02099601\nn11939491\nn04467665\nn04346328\nn04347754\nn03063689\nn03100240\nn02127052\nn03887697\nn09428293\nn02361337\nn02606052\nn04590129\nn02692877\nn03796401\nn04532106\nn03538406\nn07747607\nn01978455\nn07717556\nn02894605\nn03134739\nn04243546\nn03903868\nn02879718\nn01824575\nn01877812\nn0177008
1\nn04525305\nn01773549\nn02099712\nn01774384\nn02823428\nn01860187\nn03461385\nn04366367\nn02167151\nn02454379\nn03777568\nn01833805\nn03761084\nn04542943\nn02504458\nn02033041\nn02095314\nn03527444\nn02280649\nn02123045\nn01644373\nn12998815\nn03792972\nn02480495\nn03417042\nn02091467\nn02415577\nn12985857\nn03544143\nn04370456\nn02110806\nn03676483\nn03602883\nn03538406\nn04201297\nn03929855\nn02504013\nn10565667\nn02097130\nn03950228\nn01675722\nn04523525\nn02966687\nn02504458\nn02089973\nn01641577\nn04330267\nn04146614\nn01631663\nn02978881\nn07802026\nn04039381\nn03485794\nn03825788\nn04265275\nn03141823\nn04033995\nn03179701\nn01986214\nn04604644\nn02730930\nn03920288\nn02799071\nn04399382\nn04023962\nn02951358\nn02114367\nn02074367\nn03992509\nn03000134\nn01824575\nn04525305\nn02119789\nn03899768\nn03617480\nn02012849\nn03814639\nn04347754\nn04597913\nn02113799\nn04562935\nn03777754\nn02687172\nn02066245\nn02704792\nn01751748\nn02090622\nn03857828\nn03777754\nn02130308\nn02606052\nn03483316\nn02808440\nn02114712\nn01774384\nn09468604\nn03045698\nn02107574\nn02112706\nn03777754\nn04209239\nn07745940\nn02690373\nn07584110\nn03388549\nn03977966\nn04584207\nn02279972\nn02443114\nn02493509\nn02494079\nn03063599\nn01774750\nn01968897\nn01695060\nn04380533\nn02128757\nn09256479\nn02909870\nn04501370\nn03935335\nn07693725\nn04591713\nn03787032\nn01498041\nn03042490\nn02086910\nn01855672\nn04596742\nn02445715\nn02859443\nn02804610\nn03709823\nn02488291\nn02410509\nn03393912\nn03498962\nn03131574\nn03791053\nn03763968\nn02097130\nn03042490\nn01641577\nn01677366\nn01828970\nn02096051\nn03888605\nn02094114\nn02892201\nn02486261\nn03983396\nn02133161\nn03602883\nn03065424\nn02749479\nn02791124\nn01968897\nn02797295\nn02877765\nn01843065\nn02892201\nn03786901\nn02174001\nn03133878\nn02107908\nn04136333\nn02437616\nn04592741\nn04044716\nn01773157\nn02130308\nn02325366\nn04591713\nn04090263\nn03902125\nn03670208\nn07753113\nn03866082\nn04201297\nn02093859\nn02410509\nn02823
750\nn01740131\nn03417042\nn03874293\nn03710193\nn02871525\nn02091467\nn04254120\nn02109525\nn04404412\nn02094433\nn11939491\nn02107683\nn04356056\nn02002556\nn02168699\nn01945685\nn04376876\nn04033901\nn01530575\nn03838899\nn01776313\nn03028079\nn03658185\nn04310018\nn02090379\nn02109525\nn04376876\nn04418357\nn04409515\nn07583066\nn03841143\nn02837789\nn03494278\nn03457902\nn02497673\nn02504013\nn02110063\nn02835271\nn01491361\nn02807133\nn02085782\nn02088364\nn02607072\nn02120505\nn07718472\nn03781244\nn02389026\nn03026506\nn02769748\nn02096177\nn02840245\nn02606052\nn03857828\nn03837869\nn01735189\nn02093256\nn02112706\nn02749479\nn04525038\nn03982430\nn02510455\nn02410509\nn03680355\nn02105505\nn03017168\nn02120079\nn03532672\nn03992509\nn02009229\nn02106166\nn02105056\nn02422699\nn03770439\nn03794056\nn03777568\nn02110806\nn01950731\nn04371430\nn03417042\nn03743016\nn01729977\nn02669723\nn02094433\nn04251144\nn02119022\nn01697457\nn01682714\nn07614500\nn02127052\nn03042490\nn02113799\nn04399382\nn03794056\nn02963159\nn02730930\nn01592084\nn04067472\nn02815834\nn07753592\nn13052670\nn07875152\nn06785654\nn04509417\nn03977966\nn03345487\nn03223299\nn04277352\nn06794110\nn02389026\nn07920052\nn02100877\nn04435653\nn04239074\nn04069434\nn03617480\nn01494475\nn02672831\nn07831146\nn02097047\nn03814639\nn02514041\nn02091635\nn01687978\nn02116738\nn01630670\nn01695060\nn04204238\nn04090263\nn04081281\nn01819313\nn02132136\nn03787032\nn04044716\nn15075141\nn03954731\nn04389033\nn02002556\nn04591157\nn04133789\nn04277352\nn02641379\nn03733805\nn04417672\nn02403003\nn01580077\nn03920288\nn03673027\nn07697537\nn07836838\nn04243546\nn02977058\nn07684084\nn07697537\nn02132136\nn03131574\nn02093647\nn03443371\nn03134739\nn04550184\nn03891251\nn02087394\nn07697537\nn07583066\nn04522168\nn04493381\nn04065272\nn02097130\nn04467665\nn01614925\nn03961711\nn02802426\nn02089078\nn02018207\nn03947888\nn01748264\nn02280649\nn02002556\nn03709823\nn01494475\nn03485794\nn04479046\nn021
08551\nn03325584\nn03188531\nn02091032\nn02259212\nn02033041\nn03290653\nn04033995\nn07614500\nn02169497\nn04553703\nn02268443\nn09288635\nn01843383\nn04428191\nn03717622\nn02268853\nn02012849\nn02894605\nn02134418\nn01751748\nn02823750\nn02177972\nn03424325\nn02397096\nn07753275\nn02417914\nn03379051\nn02096585\nn03814639\nn03355925\nn03127747\nn02264363\nn03733131\nn02481823\nn03447447\nn04409515\nn02066245\nn02102318\nn03028079\nn02107574\nn04026417\nn02058221\nn02106662\nn02607072\nn01641577\nn03376595\nn07892512\nn11939491\nn02488702\nn09421951\nn01910747\nn02364673\nn07248320\nn03908714\nn02939185\nn02099601\nn03680355\nn02095889\nn02917067\nn04380533\nn01592084\nn02109525\nn02123394\nn02236044\nn02346627\nn12057211\nn12620546\nn04346328\nn01531178\nn01735189\nn04152593\nn04487394\nn02123597\nn01768244\nn02129604\nn09193705\nn04131690\nn02085936\nn02088238\nn03538406\nn03131574\nn02110185\nn03124043\nn03000247\nn02107574\nn02110958\nn03018349\nn02930766\nn02229544\nn02483362\nn03887697\nn01773797\nn02264363\nn02088364\nn04127249\nn02113023\nn03146219\nn02114855\nn04536866\nn03770679\nn01796340\nn03866082\nn04380533\nn03764736\nn07749582\nn03658185\nn04579145\nn01784675\nn01644373\nn02110063\nn02971356\nn02494079\nn02361337\nn02490219\nn03803284\nn02113624\nn02106550\nn03814906\nn03180011\nn01872401\nn02730930\nn04548280\nn02814860\nn02105162\nn03676483\nn01871265\nn07716358\nn04476259\nn03887697\nn07697537\nn02514041\nn04004767\nn04371774\nn01855032\nn01518878\nn09835506\nn01943899\nn03908714\nn03400231\nn02129604\nn02492035\nn04252225\nn02107312\nn03443371\nn02950826\nn03814639\nn02951585\nn04265275\nn01806567\nn03482405\nn01882714\nn01580077\nn02091831\nn04266014\nn02895154\nn04532106\nn02999410\nn03729826\nn03345487\nn02105162\nn02690373\nn04597913\nn04325704\nn03461385\nn01695060\nn01818515\nn09472597\nn01806567\nn07754684\nn04326547\nn02093859\nn04049303\nn02641379\nn03196217\nn02088466\nn04376876\nn02009229\nn03929855\nn02025239\nn03814906\nn03291819\nn0
4612504\nn03000134\nn02837789\nn07718747\nn03459775\nn02281406\nn01693334\nn02219486\nn04266014\nn04399382\nn01774750\nn02980441\nn03062245\nn04418357\nn02841315\nn04239074\nn02117135\nn03908714\nn04429376\nn02089867\nn01641577\nn02444819\nn04277352\nn01443537\nn04522168\nn02137549\nn03770439\nn03697007\nn07248320\nn04523525\nn04141975\nn04442312\nn02979186\nn03929855\nn03160309\nn07613480\nn04154565\nn03452741\nn03063689\nn01983481\nn03884397\nn02687172\nn01622779\nn01774750\nn02096051\nn04074963\nn03207941\nn02107908\nn03180011\nn04557648\nn01491361\nn04209239\nn02091467\nn03930313\nn03417042\nn02395406\nn02112350\nn02108915\nn02123597\nn04125021\nn03777754\nn09288635\nn02066245\nn03196217\nn04118538\nn03733281\nn02106550\nn02111889\nn03720891\nn04604644\nn03016953\nn03249569\nn04039381\nn02100735\nn01582220\nn02423022\nn03764736\nn03109150\nn02028035\nn02510455\nn01735189\nn02666196\nn02992211\nn04356056\nn03240683\nn01978455\nn04579145\nn02963159\nn09288635\nn02442845\nn04606251\nn02087046\nn03344393\nn01883070\nn03697007\nn03891251\nn03662601\nn02138441\nn01753488\nn04613696\nn01950731\nn03485794\nn02110341\nn02892767\nn02492035\nn04273569\nn04008634\nn02095314\nn03794056\nn09472597\nn02802426\nn07716906\nn03792972\nn01872401\nn03673027\nn02279972\nn02910353\nn03933933\nn03938244\nn01558993\nn03908714\nn01914609\nn02101006\nn02672831\nn04067472\nn02526121\nn07836838\nn02817516\nn07742313\nn01828970\nn04286575\nn03649909\nn02107683\nn02988304\nn02165456\nn04560804\nn01629819\nn03814906\nn03782006\nn02264363\nn02909870\nn09246464\nn02328150\nn02730930\nn04596742\nn03095699\nn03146219\nn01824575\nn03977966\nn01807496\nn02500267\nn02098105\nn01796340\nn02113978\nn02948072\nn03089624\nn04550184\nn07565083\nn03529860\nn03544143\nn02791270\nn03775071\nn03710721\nn13044778\nn02504458\nn02514041\nn03743016\nn03483316\nn12985857\nn03709823\nn04465501\nn03028079\nn04209239\nn01807496\nn02859443\nn04398044\nn03337140\nn02783161\nn02500267\nn01644373\nn07711569\nn03888257\n
n02655020\nn09399592\nn03197337\nn02007558\nn03961711\nn04542943\nn02116738\nn01580077\nn02088632\nn02096294\nn03388183\nn02099267\nn03445924\nn04133789\nn04332243\nn03201208\nn03032252\nn02504458\nn02979186\nn04584207\nn03535780\nn02229544\nn02111500\nn04525305\nn03197337\nn02398521\nn02088238\nn02364673\nn04146614\nn02113186\nn02391049\nn02098286\nn04548362\nn02009229\nn07802026\nn07716906\nn02111889\nn02730930\nn01632777\nn02099601\nn02981792\nn03637318\nn01735189\nn04049303\nn02129165\nn02443484\nn03770679\nn04149813\nn01622779\nn03110669\nn01945685\nn03937543\nn02977058\nn02457408\nn03041632\nn01694178\nn03095699\nn02085936\nn04252077\nn03529860\nn01978455\nn01768244\nn06359193\nn02107908\nn04162706\nn03494278\nn02009912\nn01740131\nn03717622\nn13054560\nn03014705\nn02087394\nn02093991\nn03063689\nn02113023\nn03733131\nn04493381\nn03825788\nn02643566\nn03495258\nn06794110\nn02280649\nn04065272\nn02110958\nn03452741\nn03314780\nn01828970\nn02871525\nn04447861\nn02815834\nn04417672\nn04328186\nn02134418\nn03788365\nn03877845\nn04487081\nn02500267\nn03372029\nn03837869\nn01968897\nn03443371\nn12768682\nn01685808\nn03584829\nn02814860\nn03485407\nn03670208\nn01817953\nn03026506\nn01440764\nn01685808\nn03691459\nn04141076\nn04179913\nn03670208\nn01755581\nn03958227\nn03388043\nn03223299\nn02504013\nn01773549\nn01694178\nn02112018\nn01739381\nn01695060\nn01980166\nn03788365\nn03187595\nn02277742\nn01669191\nn02892201\nn02123045\nn07747607\nn04604644\nn04149813\nn04074963\nn02111277\nn02101006\nn03961711\nn01978287\nn03127747\nn02129604\nn07717410\nn02264363\nn07802026\nn02089973\nn02096585\nn04243546\nn01688243\nn02817516\nn04596742\nn03673027\nn02797295\nn07753113\nn01685808\nn02871525\nn02093991\nn01984695\nn07760859\nn03032252\nn07711569\nn02280649\nn03761084\nn03160309\nn03891332\nn02883205\nn04372370\nn04041544\nn04552348\nn04264628\nn04041544\nn01910747\nn03950228\nn02666196\nn04204347\nn01560419\nn04204238\nn02236044\nn03131574\nn04487081\nn02018795\nn02843684
\nn03000684\nn01667778\nn02115641\nn04548362\nn01943899\nn02100877\nn02093256\nn02018207\nn02112137\nn03141823\nn02093754\nn02174001\nn04476259\nn02480495\nn03887697\nn02769748\nn02002724\nn02113978\nn02110627\nn03874293\nn02107574\nn02109047\nn01855032\nn02794156\nn03134739\nn07742313\nn03124043\nn02486261\nn02992529\nn01734418\nn02321529\nn03047690\nn02879718\nn02025239\nn03131574\nn04347754\nn03216828\nn02264363\nn03041632\nn02071294\nn01914609\nn02497673\nn02172182\nn01667778\nn02106550\nn02814860\nn01773549\nn01986214\nn02236044\nn02009912\nn02487347\nn01755581\nn03623198\nn02445715\nn06794110\nn02085620\nn04482393\nn01820546\nn04579145\nn02326432\nn07754684\nn04111531\nn03724870\nn02093256\nn07711569\nn02017213\nn01688243\nn01669191\nn01664065\nn02092339\nn02108551\nn04525305\nn03950228\nn03929660\nn03956157\nn03891332\nn04493381\nn02102973\nn03255030\nn01990800\nn02500267\nn02281406\nn01824575\nn03032252\nn02129165\nn02356798\nn03538406\nn02009229\nn02097658\nn03095699\nn03786901\nn03743016\nn02980441\nn07742313\nn02106166\nn03314780\nn02097209\nn04037443\nn04086273\nn03394916\nn02037110\nn02112018\nn03379051\nn02951585\nn04501370\nn04355338\nn03874293\nn04153751\nn07930864\nn02930766\nn01496331\nn04265275\nn02256656\nn01667114\nn03630383\nn04591713\nn02704792\nn03207743\nn03854065\nn03720891\nn07873807\nn02120505\nn02099849\nn04152593\nn02100877\nn04560804\nn03792972\nn03733131\nn13133613\nn02114548\nn03000247\nn04146614\nn04398044\nn02325366\nn03633091\nn09256479\nn03617480\nn01530575\nn03633091\nn03018349\nn01768244\nn02871525\nn04040759\nn03658185\nn03272562\nn02447366\nn04392985\nn02797295\nn03903868\nn04548362\nn07714571\nn03884397\nn03888605\nn02105505\nn03666591\nn03063599\nn03530642\nn02097474\nn04483307\nn04554684\nn02978881\nn02492660\nn03692522\nn04589890\nn04579432\nn02127052\nn02112706\nn02804610\nn02190166\nn11939491\nn03000134\nn01697457\nn12620546\nn02256656\nn01968897\nn02950826\nn03127925\nn02939185\nn06596364\nn02091134\nn03877472\nn021137
99\nn02102973\nn02027492\nn03498962\nn02834397\nn07248320\nn04286575\nn01735189\nn02417914\nn03690938\nn03404251\nn01739381\nn02099267\nn02219486\nn02108089\nn02206856\nn03208938\nn03127747\nn02279972\nn02281406\nn02113023\nn01601694\nn07715103\nn02107908\nn02120079\nn02102318\nn02096051\nn01990800\nn02917067\nn03372029\nn03538406\nn12267677\nn03314780\nn03903868\nn02009229\nn02100236\nn03759954\nn02277742\nn03804744\nn02966687\nn02102318\nn09835506\nn01484850\nn02097047\nn02795169\nn03673027\nn02169497\nn03532672\nn04067472\nn01944390\nn02786058\nn04019541\nn01665541\nn04162706\nn01695060\nn04116512\nn03680355\nn04548280\nn04517823\nn02883205\nn02869837\nn01871265\nn01737021\nn01496331\nn01773797\nn04562935\nn03617480\nn03930630\nn04033901\nn04270147\nn03388183\nn02823428\nn02090622\nn02504013\nn04356056\nn02510455\nn01860187\nn02492660\nn02879718\nn02669723\nn15075141\nn04263257\nn02422106\nn04350905\nn02105056\nn02102973\nn03776460\nn03857828\nn02120505\nn02105412\nn02643566\nn03291819\nn04447861\nn03938244\nn07717556\nn02423022\nn03450230\nn01770393\nn04254680\nn03530642\nn03476991\nn03710721\nn04116512\nn04398044\nn02930766\nn04370456\nn02231487\nn04019541\nn03476991\nn04366367\nn02930766\nn01728920\nn03908618\nn07615774\nn06794110\nn01744401\nn04153751\nn03187595\nn02009912\nn02096437\nn02018207\nn02363005\nn07717410\nn02939185\nn03495258\nn03787032\nn03920288\nn04392985\nn02109961\nn04325704\nn03240683\nn01773157\nn02317335\nn03929660\nn02493509\nn03920288\nn03447721\nn02486261\nn04562935\nn01829413\nn01930112\nn02104365\nn02992211\nn04033901\nn03710193\nn02797295\nn01847000\nn02100583\nn04483307\nn03874599\nn04275548\nn04540053\nn01558993\nn04560804\nn04542943\nn01773549\nn04317175\nn03935335\nn07717410\nn02165456\nn03832673\nn01692333\nn03788195\nn07831146\nn03590841\nn03840681\nn02277742\nn09472597\nn07614500\nn04548280\nn03443371\nn04532670\nn01774750\nn04486054\nn03127747\nn03676483\nn02669723\nn02017213\nn01945685\nn02219486\nn04599235\nn03530642\nn0425
4777\nn02111500\nn03125729\nn01631663\nn07880968\nn02111277\nn01817953\nn03776460\nn01622779\nn03240683\nn02906734\nn02391049\nn01695060\nn04023962\nn01514668\nn04133789\nn02871525\nn02277742\nn02090721\nn01693334\nn04074963\nn07693725\nn01873310\nn02279972\nn02971356\nn02071294\nn03991062\nn02088238\nn03538406\nn04552348\nn02112706\nn04229816\nn03126707\nn01518878\nn03903868\nn13054560\nn04149813\nn01828970\nn03197337\nn02443114\nn03255030\nn01558993\nn03529860\nn04069434\nn02396427\nn03197337\nn02356798\nn02504013\nn02641379\nn02017213\nn01882714\nn01514859\nn04429376\nn04366367\nn04443257\nn03075370\nn03782006\nn02927161\nn03899768\nn07715103\nn03980874\nn01514668\nn03761084\nn01773797\nn02120079\nn04131690\nn07248320\nn02133161\nn02096051\nn13052670\nn02979186\nn02113023\nn03594945\nn02123045\nn02120505\nn02119022\nn02493793\nn01728572\nn03482405\nn01980166\nn07745940\nn01773549\nn02123394\nn02093754\nn03534580\nn02174001\nn02641379\nn01693334\nn01983481\nn02793495\nn04456115\nn04141327\nn02096585\nn01855672\nn03223299\nn03544143\nn02321529\nn09193705\nn04409515\nn02105162\nn03775546\nn01990800\nn02128757\nn03769881\nn03314780\nn03598930\nn03452741\nn03388183\nn03958227\nn02236044\nn04208210\nn07693725\nn01945685\nn04579432\nn02486410\nn02791270\nn02099429\nn02074367\nn04208210\nn01981276\nn03240683\nn03425413\nn02115913\nn03124043\nn02002724\nn02667093\nn03724870\nn07730033\nn03733281\nn04522168\nn07717556\nn03977966\nn03788365\nn01484850\nn03482405\nn03623198\nn07892512\nn07711569\nn03710637\nn03376595\nn04141975\nn02981792\nn03804744\nn02107312\nn03733131\nn01739381\nn04252077\nn03445924\nn04599235\nn02422699\nn03637318\nn03673027\nn03425413\nn02442845\nn02325366\nn02410509\nn02641379\nn02165105\nn02769748\nn02859443\nn01806567\nn03527444\nn02099601\nn07715103\nn01531178\nn04599235\nn07697313\nn02091244\nn04317175\nn02823428\nn02096437\nn02236044\nn02190166\nn02948072\nn01728920\nn01728572\nn03000684\nn03133878\nn02017213\nn01978287\nn03775071\nn04479046\nn07
720875\nn06785654\nn01843383\nn02108089\nn02606052\nn02794156\nn02100583\nn12620546\nn02412080\nn01677366\nn03710637\nn07753275\nn02417914\nn04019541\nn01697457\nn01806143\nn03759954\nn02115913\nn12985857\nn03530642\nn02133161\nn02086240\nn02782093\nn02259212\nn02110806\nn03733131\nn02096294\nn04229816\nn06794110\nn02699494\nn03761084\nn01592084\nn07695742\nn01631663\nn03017168\nn04350905\nn02256656\nn04285008\nn01984695\nn04275548\nn01883070\nn03047690\nn02445715\nn02088094\nn03223299\nn01729322\nn03837869\nn02102480\nn02088364\nn02102177\nn04265275\nn02319095\nn02229544\nn03759954\nn02869837\nn04209133\nn03291819\nn04371774\nn02138441\nn02417914\nn02128757\nn02098286\nn04591157\nn03443371\nn03902125\nn02422106\nn04423845\nn04465501\nn13052670\nn02087394\nn04367480\nn07742313\nn03538406\nn03492542\nn03868863\nn02088632\nn01582220\nn03876231\nn03770439\nn02977058\nn03457902\nn03874293\nn03902125\nn03929855\nn02391049\nn03180011\nn03956157\nn02790996\nn02099712\nn01980166\nn04041544\nn02033041\nn03976657\nn01751748\nn02127052\nn01494475\nn02128385\nn04204347\nn03690938\nn03759954\nn02412080\nn04204238\nn03662601\nn02114855\nn03788365\nn02104029\nn02101556\nn01737021\nn09288635\nn02096177\nn02492035\nn04238763\nn03393912\nn04149813\nn02398521\nn01742172\nn02130308\nn01534433\nn04404412\nn02107683\nn02708093\nn04209239\nn07715103\nn07718747\nn04462240\nn02510455\nn02098105\nn02277742\nn02096437\nn02802426\nn02486261\nn02091134\nn03272010\nn01491361\nn04604644\nn02640242\nn03692522\nn02229544\nn07720875\nn04606251\nn04201297\nn11939491\nn02088364\nn02655020\nn03657121\nn02112350\nn02326432\nn03445777\nn02028035\nn04326547\nn03400231\nn02091032\nn03710193\nn01742172\nn01806567\nn03485407\nn03450230\nn01735189\nn02319095\nn03467068\nn04458633\nn03394916\nn02500267\nn04525038\nn02112137\nn02107908\nn12768682\nn02119789\nn03662601\nn07860988\nn04584207\nn07932039\nn03062245\nn07745940\nn03085013\nn04465501\nn02483708\nn03379051\nn01631663\nn01773157\nn02364673\nn02917067\nn
02488702\nn02105412\nn02423022\nn03868242\nn02018207\nn02113624\nn04041544\nn04548280\nn03483316\nn03444034\nn02125311\nn02281406\nn04041544\nn03223299\nn03602883\nn12144580\nn04192698\nn07831146\nn01748264\nn02096177\nn01798484\nn03075370\nn01807496\nn04479046\nn03457902\nn02504013\nn02097047\nn07583066\nn02979186\nn03595614\nn04286575\nn09246464\nn02981792\nn03220513\nn02090379\nn02037110\nn02009912\nn07860988\nn04435653\nn02486261\nn02129604\nn01491361\nn04579432\nn02165456\nn03259280\nn01860187\nn03796401\nn02356798\nn01828970\nn02206856\nn03983396\nn02783161\nn03134739\nn02823428\nn04371430\nn04118776\nn02106166\nn02988304\nn01770081\nn04465501\nn03447447\nn03976467\nn02977058\nn02058221\nn02280649\nn03445777\nn03884397\nn01797886\nn03240683\nn03485794\nn02974003\nn04548280\nn02168699\nn07716906\nn02002556\nn01632777\nn02111129\nn02492035\nn02123159\nn03424325\nn02231487\nn01641577\nn07873807\nn02363005\nn02100877\nn03777568\nn01530575\nn03998194\nn01829413\nn02480855\nn09288635\nn02321529\nn02509815\nn03482405\nn04493381\nn02319095\nn03223299\nn03388549\nn02113186\nn02093859\nn07718747\nn01855032\nn10148035\nn07753113\nn04154565\nn02423022\nn04179913\nn02486410\nn02106382\nn02033041\nn02483708\nn01537544\nn02123597\nn03240683\nn04026417\nn02108422\nn09399592\nn02104365\nn03794056\nn01776313\nn02787622\nn03854065\nn01729977\nn02127052\nn03942813\nn02109047\nn03133878\nn03775071\nn02268443\nn04118776\nn02009912\nn02111889\nn04542943\nn03759954\nn03633091\nn03124043\nn03016953\nn02133161\nn02106030\nn01773797\nn03887697\nn04501370\nn04120489\nn02096051\nn01682714\nn03133878\nn02992211\nn01795545\nn02033041\nn04285008\nn02113978\nn02006656\nn01768244\nn02837789\nn01622779\nn02091831\nn02992529\nn03929660\nn02493793\nn03447447\nn02013706\nn03478589\nn07615774\nn03530642\nn02410509\nn01968897\nn04252077\nn03976467\nn07871810\nn01697457\nn04200800\nn01806567\nn03998194\nn03721384\nn02107683\nn02950826\nn02834397\nn02978881\nn02106166\nn02098413\nn04204238\nn04328186\
nn01943899\nn03494278\nn01798484\nn07714990\nn02105056\nn04033995\nn03207743\nn03459775\nn02704792\nn03379051\nn04372370\nn01855032\nn03124170\nn04039381\nn04355338\nn01774384\nn03016953\nn02486261\nn01632777\nn02319095\nn02106550\nn03476684\nn01644900\nn03729826\nn03047690\nn04179913\nn02437312\nn03769881\nn01664065\nn02107683\nn09835506\nn01784675\nn02483362\nn02089867\nn04356056\nn03666591\nn06359193\nn02277742\nn04456115\nn02099267\nn03657121\nn04149813\nn07579787\nn04372370\nn02095314\nn03496892\nn02483708\nn04417672\nn04447861\nn02804610\nn03126707\nn01704323\nn09332890\nn02090379\nn03837869\nn11939491\nn03866082\nn03733131\nn02165456\nn04443257\nn02281787\nn02398521\nn07718472\nn02106382\nn02066245\nn04428191\nn03527444\nn03085013\nn02112350\nn02094433\nn03942813\nn02398521\nn02865351\nn03908618\nn02229544\nn01981276\nn03208938\nn02236044\nn04542943\nn02804610\nn02843684\nn01687978\nn02447366\nn02099849\nn03017168\nn02999410\nn02013706\nn02102040\nn02825657\nn02091831\nn01833805\nn02117135\nn01910747\nn03724870\nn04209133\nn04328186\nn03761084\nn04509417\nn04612504\nn01537544\nn01748264\nn04542943\nn02892767\nn04332243\nn04591713\nn02116738\nn07714990\nn03782006\nn07697313\nn03692522\nn02776631\nn03197337\nn06874185\nn02089867\nn02790996\nn02979186\nn03938244\nn03028079\nn02823428\nn04133789\nn02794156\nn02815834\nn03063599\nn10148035\nn02486261\nn04435653\nn01943899\nn02391049\nn02090622\nn04542943\nn02058221\nn02089867\nn02115641\nn03930313\nn02105412\nn03691459\nn03781244\nn03721384\nn01484850\nn03201208\nn03710721\nn03384352\nn02410509\nn03787032\nn03970156\nn02105251\nn03958227\nn02690373\nn01729322\nn01518878\nn04254680\nn02988304\nn03670208\nn04033901\nn02018795\nn02749479\nn03447721\nn02093428\nn02099712\nn02094114\nn02814860\nn02167151\nn04525305\nn02483362\nn02105251\nn02817516\nn04125021\nn02979186\nn01829413\nn02097658\nn02909870\nn01558993\nn03216828\nn02280649\nn02051845\nn02115913\nn03938244\nn04522168\nn01632458\nn02106382\nn02939185\nn0411153
1\nn01693334\nn02268853\nn02109525\nn02125311\nn03617480\nn02437616\nn04146614\nn03832673\nn02870880\nn04554684\nn02071294\nn02971356\nn03775071\nn04326547\nn11879895\nn01531178\nn02667093\nn04317175\nn02027492\nn02002556\nn02206856\nn03527444\nn04557648\nn04467665\nn01742172\nn02100236\nn02096437\nn13054560\nn02389026\nn02098105\nn07871810\nn02488291\nn04251144\nn12057211\nn04483307\nn01917289\nn03637318\nn01950731\nn01955084\nn02869837\nn04037443\nn02099267\nn04254120\nn02493793\nn12144580\nn01968897\nn03770679\nn02910353\nn04146614\nn04154565\nn02128757\nn04380533\nn03530642\nn02640242\nn01530575\nn04325704\nn04562935\nn03838899\nn02692877\nn03692522\nn03916031\nn02486261\nn03724870\nn02099267\nn03207941\nn02128925\nn03461385\nn01950731\nn02492660\nn02102973\nn07749582\nn04310018\nn02110806\nn02105056\nn09428293\nn02087394\nn15075141\nn03141823\nn03709823\nn03930630\nn02280649\nn04069434\nn07718747\nn02480495\nn07754684\nn12985857\nn03602883\nn01665541\nn04465501\nn02788148\nn02114548\nn07753275\nn03788195\nn02814860\nn02090379\nn03425413\nn01751748\nn04311174\nn01796340\nn07613480\nn03445777\nn04404412\nn03124170\nn02364673\nn01829413\nn03134739\nn07730033\nn03379051\nn04485082\nn03250847\nn07730033\nn07714571\nn02790996\nn03160309\nn02268443\nn02093859\nn13052670\nn02086910\nn01632458\nn04259630\nn01806567\nn02094433\nn02093647\nn02111500\nn03876231\nn01883070\nn02098286\nn04483307\nn03344393\nn01592084\nn04579432\nn04152593\nn04579145\nn03998194\nn02093256\nn01616318\nn03085013\nn03527444\nn04116512\nn02514041\nn03627232\nn03376595\nn04443257\nn03095699\nn02403003\nn04589890\nn01910747\nn02978881\nn02727426\nn01985128\nn03482405\nn02132136\nn04277352\nn13133613\nn02033041\nn02100877\nn01806143\nn03733805\nn01748264\nn02483362\nn03776460\nn02105412\nn03887697\nn01773157\nn02056570\nn02808440\nn02007558\nn04146614\nn02097130\nn03888605\nn02412080\nn01806567\nn02457408\nn03935335\nn03775071\nn07697313\nn01774750\nn07873807\nn07749582\nn02091134\nn02871525\nn02117
135\nn03657121\nn03661043\nn02088632\nn03776460\nn02120505\nn02165456\nn03089624\nn03485794\nn01534433\nn02835271\nn03240683\nn04251144\nn02086910\nn03447447\nn04200800\nn01582220\nn02655020\nn04458633\nn04371430\nn02097047\nn03970156\nn04418357\nn04243546\nn02098413\nn02992529\nn03384352\nn02640242\nn02894605\nn03920288\nn03250847\nn02607072\nn04326547\nn04485082\nn03868863\nn09472597\nn02027492\nn02692877\nn03388549\nn03874599\nn02096051\nn01847000\nn02328150\nn01534433\nn02910353\nn01829413\nn02107142\nn03977966\nn02090622\nn03444034\nn04418357\nn04254680\nn02692877\nn02002724\nn03535780\nn02108551\nn02112350\nn15075141\nn04141975\nn04507155\nn04509417\nn11939491\nn02112706\nn02110627\nn03125729\nn03680355\nn01644373\nn01644373\nn01756291\nn01753488\nn02098105\nn02342885\nn03759954\nn02110958\nn02797295\nn02006656\nn02111500\nn04033901\nn01784675\nn04277352\nn02489166\nn02481823\nn02398521\nn01739381\nn02823428\nn02939185\nn12985857\nn04275548\nn04127249\nn02087394\nn03920288\nn04482393\nn03100240\nn03000684\nn07248320\nn02454379\nn02361337\nn03218198\nn02106030\nn03544143\nn04456115\nn02165105\nn03188531\nn01641577\nn07742313\nn03761084\nn01518878\nn04376876\nn03782006\nn02422699\nn01773797\nn02106550\nn04590129\nn03902125\nn02823750\nn03393912\nn04090263\nn01737021\nn02129165\nn01498041\nn03792782\nn02966687\nn02504458\nn03838899\nn01689811\nn04347754\nn01608432\nn01817953\nn02536864\nn01729977\nn02096437\nn03924679\nn02096437\nn01798484\nn02869837\nn04336792\nn03485407\nn03868863\nn04376876\nn03602883\nn02128925\nn02102973\nn02447366\nn07716358\nn03857828\nn04517823\nn03837869\nn07749582\nn02105162\nn02281787\nn02769748\nn02085620\nn01751748\nn02093647\nn04423845\nn02488702\nn03485794\nn03908714\nn01498041\nn02231487\nn02108551\nn03179701\nn02786058\nn01855032\nn04147183\nn04254680\nn04557648\nn01728572\nn04325704\nn07860988\nn01847000\nn13044778\nn03445777\nn03447447\nn02169497\nn03290653\nn03376595\nn02094114\nn03854065\nn02422699\nn01796340\nn03459775\nn020
91244\nn04399382\nn03476684\nn02951585\nn03207941\nn02174001\nn03445777\nn01950731\nn04562935\nn01728572\nn02089973\nn01945685\nn02791270\nn04090263\nn01665541\nn02264363\nn04228054\nn03345487\nn03947888\nn01944390\nn04153751\nn01664065\nn03223299\nn02930766\nn04404412\nn03992509\nn01877812\nn02977058\nn09835506\nn12267677\nn03127747\nn01980166\nn09835506\nn07753113\nn02860847\nn02840245\nn01748264\nn03891251\nn02484975\nn02095314\nn03063689\nn04372370\nn11879895\nn02447366\nn01795545\nn03201208\nn01797886\nn04548362\nn03028079\nn03201208\nn02109047\nn03804744\nn03417042\nn02111500\nn02109047\nn02415577\nn04456115\nn02486410\nn03976657\nn02109525\nn03602883\nn03937543\nn02492660\nn02127052\nn02641379\nn03146219\nn02091635\nn02110185\nn04389033\nn04330267\nn02165456\nn04152593\nn04548362\nn02094433\nn04372370\nn03208938\nn02356798\nn02666196\nn02279972\nn03661043\nn03187595\nn03131574\nn07742313\nn02104029\nn02172182\nn02090622\nn02085782\nn02123159\nn02105855\nn02422106\nn01667114\nn01943899\nn03692522\nn03788195\nn07718472\nn03146219\nn04553703\nn09472597\nn04447861\nn02790996\nn03673027\nn02102040\nn07565083\nn01532829\nn02276258\nn04141327\nn01817953\nn04118538\nn01990800\nn02123597\nn01751748\nn02025239\nn01644373\nn03355925\nn02177972\nn04286575\nn04009552\nn03899768\nn03857828\nn04613696\nn02120079\nn02007558\nn04311174\nn03594945\nn04355338\nn03325584\nn07590611\nn07831146\nn03899768\nn02165105\nn06359193\nn06874185\nn03657121\nn02056570\nn09428293\nn04597913\nn02114855\nn04548280\nn03065424\nn01986214\nn03623198\nn04485082\nn03888605\nn02114855\nn02917067\nn04067472\nn03457902\nn03775071\nn07579787\nn02509815\nn04458633\nn03347037\nn02098105\nn12985857\nn03691459\nn04525305\nn01817953\nn03393912\nn04251144\nn02088364\nn02526121\nn02444819\nn02088238\nn02051845\nn01667114\nn04487394\nn04125021\nn02883205\nn04162706\nn02085936\nn02807133\nn02978881\nn04350905\nn01843383\nn02906734\nn01608432\nn02950826\nn04131690\nn02823428\nn02106030\nn01818515\nn03840681\nn0
3443371\nn03447447\nn02492660\nn11879895\nn02981792\nn01514668\nn02701002\nn04192698\nn02106030\nn07717410\nn03492542\nn06794110\nn03977966\nn04008634\nn07768694\nn04515003\nn02111889\nn02363005\nn01930112\nn04447861\nn07684084\nn01883070\nn03250847\nn02825657\nn03793489\nn01616318\nn02110341\nn06596364\nn04456115\nn01749939\nn03180011\nn02690373\nn02088094\nn01984695\nn02493793\nn09428293\nn03888605\nn09229709\nn02128757\nn04239074\nn04040759\nn03062245\nn02168699\nn02977058\nn01773157\nn02101388\nn03459775\nn04532106\nn04026417\nn02870880\nn04179913\nn02115913\nn04525038\nn11939491\nn02165105\nn04258138\nn09472597\nn01491361\nn03706229\nn03937543\nn01855672\nn03673027\nn02443484\nn03706229\nn04149813\nn03599486\nn03272562\nn01704323\nn01537544\nn03424325\nn02085782\nn02190166\nn04592741\nn02504458\nn04086273\nn07754684\nn02443484\nn02086910\nn01756291\nn01873310\nn02096437\nn02870880\nn02106166\nn07613480\nn03018349\nn03447721\nn04335435\nn02114855\nn07760859\nn03825788\nn02107142\nn02095570\nn01697457\nn03837869\nn02018795\nn02113624\nn03781244\nn03942813\nn02445715\nn02111129\nn04372370\nn02115641\nn07802026\nn02137549\nn02099429\nn03998194\nn04162706\nn03208938\nn02486410\nn02536864\nn02437616\nn02128757\nn04604644\nn03016953\nn04404412\nn02096585\nn01494475\nn03657121\nn04259630\nn04423845\nn03388549\nn02640242\nn02988304\nn02165456\nn03924679\nn04086273\nn02492660\nn02113624\nn02093859\nn02089867\nn04192698\nn01944390\nn01632777\nn02966687\nn02107908\nn02098286\nn07831146\nn02007558\nn04536866\nn02808304\nn07718472\nn03930630\nn07754684\nn01774750\nn03980874\nn03384352\nn02104029\nn02769748\nn02058221\nn01695060\nn03929660\nn13040303\nn03089624\nn04443257\nn04428191\nn03775546\nn04517823\nn01945685\nn03216828\nn02965783\nn02088466\nn04133789\nn03838899\nn02123597\nn02128385\nn02486410\nn03124170\nn03530642\nn02500267\nn12768682\nn02128385\nn01592084\nn02526121\nn04356056\nn02137549\nn03854065\nn07684084\nn01855032\nn02992211\nn02484975\nn02106030\nn09421951\n
n04367480\nn09256479\nn02119022\nn02493509\nn03803284\nn01685808\nn07697537\nn01807496\nn03733281\nn03417042\nn02219486\nn09229709\nn02526121\nn03908714\nn04204347\nn03527444\nn01740131\nn02492035\nn02094258\nn03769881\nn03026506\nn02804414\nn02489166\nn02883205\nn03482405\nn04366367\nn03868863\nn03891332\nn01797886\nn03447447\nn04399382\nn04146614\nn02423022\nn02268443\nn03250847\nn07753592\nn01984695\nn03709823\nn03884397\nn03630383\nn03814639\nn02834397\nn01737021\nn03786901\nn01775062\nn01883070\nn09428293\nn03977966\nn07754684\nn03384352\nn02794156\nn13054560\nn02132136\nn02769748\nn07718747\nn02950826\nn01930112\nn02086240\nn02125311\nn03947888\nn02840245\nn03220513\nn03720891\nn02791270\nn02802426\nn03866082\nn03825788\nn02487347\nn02169497\nn02860847\nn01728920\nn03535780\nn03710193\nn02091467\nn04243546\nn01616318\nn03942813\nn02128757\nn04049303\nn04417672\nn02127052\nn03838899\nn03729826\nn02909870\nn09421951\nn04515003\nn02165105\nn03146219\nn04423845\nn03602883\nn01930112\nn04208210\nn03887697\nn03761084\nn02268853\nn04392985\nn03649909\nn03447721\nn02692877\nn12267677\nn07715103\nn04392985\nn04509417\nn04041544\nn03538406\nn01664065\nn03179701\nn01820546\nn04204347\nn03929660\nn02102973\nn03903868\nn01742172\nn01770081\nn03109150\nn04273569\nn02123045\nn07590611\nn13037406\nn02102177\nn03000247\nn02410509\nn02088632\nn07768694\nn06785654\nn03393912\nn03496892\nn04275548\nn03854065\nn04355933\nn01807496\nn07720875\nn04584207\nn03792782\nn03208938\nn02666196\nn04149813\nn02107683\nn04049303\nn04118538\nn04418357\nn02877765\nn01883070\nn02509815\nn10565667\nn02497673\nn02115913\nn03837869\nn02190166\nn04592741\nn04285008\nn04606251\nn03075370\nn04125021\nn03796401\nn02091134\nn03792972\nn01824575\nn02086079\nn01855032\nn07742313\nn03393912\nn03958227\nn02137549\nn02113978\nn02356798\nn02808440\nn02105412\nn01797886\nn04204347\nn03837869\nn02111277\nn02777292\nn02129604\nn07930864\nn02489166\nn03459775\nn01644900\nn04149813\nn03854065\nn03125729\nn04141076
\nn04505470\nn02089973\nn02172182\nn04266014\nn04606251\nn07768694\nn09472597\nn02134418\nn03623198\nn02793495\nn01484850\nn02276258\nn02095889\nn03733281\nn03535780\nn03983396\nn02640242\nn01818515\nn02051845\nn03544143\nn02092002\nn02906734\nn01518878\nn03769881\nn02087046\nn03891332\nn04392985\nn03485794\nn03445777\nn02115913\nn02321529\nn03633091\nn01984695\nn04590129\nn02268443\nn02676566\nn02134084\nn03658185\nn02091134\nn03733805\nn02488702\nn02869837\nn02640242\nn03160309\nn02443484\nn02441942\nn01775062\nn02825657\nn12144580\nn04591713\nn02783161\nn01882714\nn02815834\nn02814860\nn02102177\nn02988304\nn03376595\nn02165105\nn04081281\nn03495258\nn09193705\nn04493381\nn02815834\nn11939491\nn02883205\nn03063689\nn02095570\nn04033901\nn03937543\nn02107908\nn07742313\nn02114712\nn02971356\nn02906734\nn02814860\nn01692333\nn02808440\nn03706229\nn04335435\nn03791053\nn03742115\nn02099429\nn02877765\nn02321529\nn03814639\nn01592084\nn03272562\nn02786058\nn01667114\nn03947888\nn02100735\nn04409515\nn01601694\nn03777568\nn12620546\nn06794110\nn02483708\nn03666591\nn03759954\nn01871265\nn02790996\nn01955084\nn03868863\nn03026506\nn04070727\nn02233338\nn01983481\nn02640242\nn01819313\nn02794156\nn03017168\nn02486261\nn04118776\nn02769748\nn03250847\nn02113799\nn02105056\nn02108422\nn01806567\nn04229816\nn09256479\nn04141327\nn01692333\nn01644373\nn02493509\nn02892201\nn02346627\nn07747607\nn04120489\nn03032252\nn04081281\nn09468604\nn02108422\nn07753113\nn02441942\nn03775071\nn02319095\nn04579145\nn02097474\nn03697007\nn02769748\nn02129604\nn04141076\nn04476259\nn02442845\nn04442312\nn02012849\nn01806567\nn03337140\nn02097209\nn03207941\nn01632458\nn01818515\nn02233338\nn02088094\nn02727426\nn04239074\nn03095699\nn04606251\nn03902125\nn02099267\nn02086240\nn03337140\nn02085782\nn02412080\nn03637318\nn01734418\nn02113023\nn04251144\nn03764736\nn02114855\nn02799071\nn01675722\nn02843684\nn01756291\nn04417672\nn02835271\nn04141076\nn04389033\nn04482393\nn02087394\nn021156
41\nn03017168\nn01753488\nn02514041\nn04509417\nn02089973\nn03075370\nn01644373\nn03791053\nn04265275\nn02111500\nn02097209\nn04458633\nn07802026\nn04141076\nn04597913\nn02281787\nn12057211\nn02277742\nn07716906\nn03920288\nn04326547\nn03127747\nn03404251\nn02108915\nn02127052\nn02391049\nn04229816\nn02837789\nn03314780\nn02089973\nn04296562\nn02791270\nn03000134\nn01644900\nn04209133\nn01669191\nn02107142\nn03908714\nn03045698\nn03485794\nn02108551\nn02807133\nn02892767\nn04525305\nn02493509\nn10148035\nn03201208\nn03690938\nn04505470\nn02206856\nn02098105\nn03478589\nn02123597\nn02783161\nn01667114\nn02106550\nn03733805\nn03424325\nn01882714\nn01855672\nn01855672\nn01983481\nn01695060\nn01847000\nn02799071\nn04428191\nn03223299\nn13052670\nn02101556\nn04265275\nn03016953\nn01775062\nn04033901\nn01753488\nn03146219\nn04235860\nn03759954\nn03788195\nn07749582\nn01829413\nn02093256\nn02231487\nn04536866\nn03146219\nn04004767\nn02493793\nn04371774\nn02395406\nn02114712\nn02747177\nn01560419\nn03814906\nn04141327\nn01833805\nn03825788\nn02128925\nn02120079\nn03658185\nn03935335\nn03530642\nn01968897\nn02114548\nn03873416\nn01985128\nn01514859\nn02669723\nn04311174\nn03141823\nn01872401\nn03920288\nn02927161\nn02397096\nn04357314\nn03535780\nn03127925\nn01807496\nn02895154\nn02794156\nn03666591\nn04004767\nn04039381\nn04179913\nn01828970\nn02128385\nn02095570\nn04592741\nn02793495\nn02096177\nn01631663\nn02111500\nn12057211\nn04356056\nn02894605\nn02226429\nn04482393\nn01950731\nn03452741\nn01632777\nn03197337\nn04505470\nn04599235\nn01484850\nn04501370\nn02095570\nn02276258\nn02410509\nn04037443\nn02276258\nn04418357\nn02892767\nn02099267\nn03791053\nn04599235\nn03642806\nn03530642\nn07718472\nn07693725\nn11939491\nn02793495\nn02988304\nn02096051\nn01514668\nn01616318\nn04243546\nn02808440\nn04270147\nn02106030\nn04344873\nn07930864\nn03444034\nn07860988\nn02119022\nn02108000\nn04562935\nn02105162\nn02492035\nn02823750\nn03481172\nn02108000\nn04310018\nn02107142\nn0222
6429\nn02074367\nn03785016\nn04553703\nn03495258\nn07579787\nn07745940\nn02111277\nn04476259\nn03476684\nn04487081\nn02091134\nn07714571\nn02105251\nn04404412\nn04398044\nn01924916\nn02487347\nn12620546\nn03255030\nn04325704\nn02093647\nn02814533\nn03125729\nn03000247\nn02492035\nn01530575\nn02108915\nn02114367\nn01796340\nn13044778\nn04522168\nn02443114\nn04589890\nn04201297\nn03733805\nn02168699\nn01616318\nn03594945\nn04479046\nn02391049\nn02892201\nn04447861\nn02134084\nn02096294\nn01484850\nn03930630\nn02090721\nn04118538\nn02445715\nn06596364\nn03599486\nn04579145\nn09468604\nn01986214\nn01820546\nn02526121\nn02408429\nn03854065\nn01855032\nn03272562\nn09288635\nn02106550\nn02095314\nn01667778\nn02137549\nn02483708\nn02804610\nn04125021\nn03769881\nn02814533\nn07718472\nn04263257\nn03877472\nn02107312\nn03042490\nn01697457\nn09468604\nn03146219\nn02799071\nn03764736\nn02493793\nn03787032\nn02808304\nn03485407\nn01740131\nn04589890\nn01914609\nn02883205\nn04254680\nn03777568\nn02280649\nn02102040\nn02823750\nn04147183\nn02091467\nn04069434\nn01729977\nn01818515\nn04023962\nn03584254\nn02095314\nn03983396\nn03956157\nn02097209\nn02095314\nn02825657\nn02107142\nn02219486\nn03796401\nn01687978\nn03944341\nn02097658\nn07718747\nn04552348\nn04263257\nn03942813\nn02037110\nn03787032\nn03642806\nn01689811\nn02102973\nn02480495\nn07684084\nn02408429\nn04356056\nn02117135\nn07584110\nn04265275\nn02493793\nn01682714\nn01981276\nn04592741\nn03976467\nn02948072\nn04086273\nn04277352\nn13054560\nn02480495\nn01983481\nn02085782\nn03598930\nn03345487\nn02017213\nn03179701\nn01984695\nn04296562\nn04507155\nn04328186\nn01534433\nn02494079\nn03916031\nn04376876\nn02093428\nn01843383\nn01924916\nn03207743\nn07747607\nn03785016\nn03388549\nn02113624\nn03961711\nn02086646\nn02134084\nn04606251\nn04493381\nn02096585\nn02992529\nn03891332\nn01616318\nn01496331\nn01694178\nn01695060\nn04026417\nn01695060\nn02117135\nn03584254\nn04336792\nn01698640\nn02177972\nn04532670\nn02859443\nn02
095889\nn01682714\nn11879895\nn02114855\nn02484975\nn02097047\nn04204238\nn04604644\nn01775062\nn03775071\nn01773549\nn03956157\nn03792972\nn04404412\nn09835506\nn07717556\nn02037110\nn02361337\nn02105412\nn04447861\nn02835271\nn03240683\nn07613480\nn02422699\nn02488702\nn01776313\nn04579432\nn04116512\nn03857828\nn02676566\nn03063599\nn02397096\nn02977058\nn02089867\nn04429376\nn03018349\nn13037406\nn03998194\nn01693334\nn01770081\nn03991062\nn03141823\nn03691459\nn04039381\nn02894605\nn02096177\nn02093256\nn02917067\nn03791053\nn03976467\nn02795169\nn02112706\nn01692333\nn02111129\nn03110669\nn03803284\nn01592084\nn02514041\nn02104365\nn02089867\nn07860988\nn02093256\nn02403003\nn04522168\nn02837789\nn01855032\nn02793495\nn02093991\nn02437312\nn02980441\nn04116512\nn02120079\nn04371774\nn02104365\nn04153751\nn02091635\nn01775062\nn04310018\nn03529860\nn02105162\nn02814860\nn02088364\nn02116738\nn03630383\nn02229544\nn04111531\nn01882714\nn01917289\nn03877472\nn02346627\nn03476991\nn02115641\nn03110669\nn02799071\nn03272562\nn01729322\nn03599486\nn03445777\nn04099969\nn02536864\nn03026506\nn03899768\nn04485082\nn01440764\nn04370456\nn04125021\nn07565083\nn02012849\nn02437616\nn02281406\nn03141823\nn01440764\nn04548362\nn03584254\nn04366367\nn04069434\nn02108551\nn07697313\nn02916936\nn03124043\nn01697457\nn02095570\nn03016953\nn02441942\nn02106382\nn01833805\nn03045698\nn04404412\nn03888605\nn04259630\nn03075370\nn03124170\nn03534580\nn04277352\nn03717622\nn02526121\nn01797886\nn04133789\nn02105855\nn03530642\nn02130308\nn01980166\nn04192698\nn04336792\nn07742313\nn01692333\nn02279972\nn04371430\nn01592084\nn09332890\nn04332243\nn04392985\nn07720875\nn03478589\nn03291819\nn04560804\nn02106030\nn04049303\nn02927161\nn07753113\nn04065272\nn02835271\nn03047690\nn03538406\nn01582220\nn02113624\nn03792782\nn04116512\nn02093859\nn03961711\nn02109047\nn07831146\nn02825657\nn13054560\nn02951585\nn02442845\nn02817516\nn03874599\nn02093859\nn01755581\nn02860847\nn02167151\nn
01537544\nn02099601\nn02111500\nn03670208\nn03179701\nn02093647\nn03444034\nn03131574\nn02111500\nn04069434\nn01744401\nn03220513\nn03393912\nn02486261\nn03372029\nn01728572\nn02422106\nn01833805\nn03594734\nn13044778\nn02074367\nn02391049\nn07873807\nn09468604\nn02799071\nn03832673\nn02361337\nn02111277\nn04204238\nn02172182\nn04562935\nn02100735\nn02007558\nn03630383\nn01484850\nn02484975\nn02096051\nn02206856\nn03770679\nn04265275\nn09246464\nn09835506\nn07614500\nn09472597\nn03379051\nn03457902\nn01855032\nn04201297\nn02951585\nn13133613\nn03770439\nn02172182\nn03992509\nn03617480\nn02802426\nn02676566\nn01687978\nn07711569\nn03690938\nn02869837\nn03942813\nn04332243\nn01491361\nn12768682\nn01910747\nn04179913\nn03627232\nn13037406\nn07745940\nn04152593\nn01806143\nn07565083\nn03627232\nn12267677\nn03837869\nn02094433\nn04238763\nn03496892\nn04612504\nn02807133\nn02106166\nn02484975\nn03208938\nn04065272\nn02107574\nn07715103\nn04517823\nn10565667\nn02807133\nn03717622\nn04557648\nn04591157\nn02326432\nn06874185\nn04442312\nn03042490\nn03188531\nn04487394\nn02006656\nn01729322\nn03929660\nn03425413\nn03216828\nn02346627\nn02526121\nn02089078\nn01669191\nn10565667\nn04376876\nn04258138\nn02489166\nn02493793\nn03584829\nn03379051\nn02094114\nn01514668\nn03770439\nn02231487\nn01855032\nn03180011\nn04606251\nn03916031\nn01774750\nn02087394\nn03297495\nn01968897\nn02105056\nn01491361\nn02114712\nn02097130\nn02692877\nn04125021\nn03476684\nn03658185\nn02966687\nn02259212\nn03355925\nn13133613\nn03394916\nn02107312\nn02788148\nn02109961\nn01440764\nn03124043\nn06359193\nn04133789\nn02500267\nn04209133\nn03344393\nn03494278\nn02977058\nn03710637\nn01622779\nn09421951\nn02790996\nn02089078\nn02256656\nn01531178\nn04479046\nn04141327\nn03000134\nn02504013\nn03627232\nn02114712\nn03325584\nn03773504\nn04004767\nn04266014\nn02977058\nn02125311\nn02281406\nn03291819\nn01675722\nn02138441\nn03804744\nn03000684\nn02114367\nn03187595\nn01943899\nn02125311\nn02113624\nn02823428\
nn02233338\nn03110669\nn02500267\nn03594734\nn03347037\nn01990800\nn02074367\nn02396427\nn03954731\nn02687172\nn02883205\nn03127925\nn02111500\nn07718747\nn02447366\nn04286575\nn02930766\nn01664065\nn04153751\nn01687978\nn02422699\nn02791270\nn02835271\nn02504458\nn01917289\nn04252077\nn04548280\nn03089624\nn07590611\nn07754684\nn01739381\nn04483307\nn01914609\nn02087046\nn03697007\nn04039381\nn01820546\nn04355338\nn02100735\nn03032252\nn02091467\nn01728572\nn02002556\nn03874599\nn02859443\nn04146614\nn03534580\nn04532106\nn01981276\nn03814639\nn01689811\nn06359193\nn01675722\nn03888605\nn07714990\nn04476259\nn02536864\nn02492035\nn04265275\nn02948072\nn03804744\nn04380533\nn01518878\nn04005630\nn07590611\nn04417672\nn03709823\nn02105412\nn02363005\nn01494475\nn03680355\nn02951358\nn04597913\nn03998194\nn01855032\nn02018795\nn03271574\nn02167151\nn02009912\nn03825788\nn04482393\nn01774750\nn02500267\nn01514859\nn03908618\nn03761084\nn03633091\nn02096177\nn03729826\nn07717556\nn03670208\nn01773797\nn04554684\nn01697457\nn03691459\nn02138441\nn03764736\nn02123394\nn04192698\nn04120489\nn07615774\nn03929855\nn02494079\nn01669191\nn01498041\nn03250847\nn03924679\nn02356798\nn02823750\nn03447721\nn02058221\nn07930864\nn01530575\nn04428191\nn04372370\nn03840681\nn02027492\nn01498041\nn07718472\nn03954731\nn04099969\nn03954731\nn01770081\nn03445924\nn03045698\nn03527444\nn02840245\nn04201297\nn01735189\nn01986214\nn02002724\nn02113978\nn02177972\nn03908714\nn03888257\nn02100236\nn02437312\nn02236044\nn07871810\nn03775071\nn03947888\nn03933933\nn02066245\nn02128385\nn01491361\nn02493509\nn07717556\nn02865351\nn03187595\nn02666196\nn01917289\nn01770081\nn02788148\nn03661043\nn02481823\nn02085620\nn02799071\nn03590841\nn01749939\nn01614925\nn02950826\nn02088632\nn01498041\nn02105162\nn01737021\nn02690373\nn03584254\nn02791124\nn02088238\nn04328186\nn01582220\nn02231487\nn03717622\nn01751748\nn03721384\nn02108422\nn01669191\nn02980441\nn04243546\nn03982430\nn02422106\nn0301470
[Data file: a long list of ImageNet WordNet synset IDs (e.g. n04371774, n04125021, n02090622, …), one ID per line, preceded by the value 5.]
7093\nn03866082\nn02113978\nn02108000\nn03832673\nn04039381\nn01677366\nn01955084\nn02113023\nn04371430\nn03134739\nn03840681\nn07714571\nn01955084\nn03785016\nn03924679\nn04443257\nn03709823\nn04204347\nn02086079\nn02361337\nn04317175\nn09229709\nn04270147\nn01518878\nn02105412\nn07720875\nn02177972\nn02098105\nn03534580\nn02492660\nn03954731\nn03874599\nn04243546\nn04344873\nn04252077\nn02009229\nn01774384\nn03843555\nn02988304\nn02422699\nn03045698\nn03775071\nn02098105\nn04099969\nn01582220\nn03026506\nn02099849\nn02814860\nn02980441\nn07875152\nn01873310\nn02117135\nn02510455\nn02108422\nn04599235\nn03450230\nn02105505\nn04239074\nn04131690\nn04033995\nn03445924\nn01558993\nn02791270\nn03770679\nn02480855\nn02134084\nn02098286\nn03478589\nn01744401\nn04532670\nn02105412\nn03874599\nn04125021\nn01682714\nn02747177\nn02992211\nn03710193\nn01514859\nn01687978\nn04418357\nn02017213\nn01677366\nn02281406\nn02138441\nn03594945\nn02106030\nn03017168\nn02105251\nn04273569\nn02488291\nn09332890\nn03873416\nn02895154\nn02494079\nn02437616\nn01692333\nn04311004\nn03218198\nn02110185\nn02256656\nn07880968\nn02666196\nn03337140\nn04399382\nn04265275\nn04254120\nn01798484\nn03602883\nn03825788\nn01833805\nn02704792\nn01734418\nn03594734\nn02701002\nn02085620\nn01582220\nn03623198\nn03000134\nn02992211\nn03691459\nn02526121\nn03998194\nn01990800\nn03933933\nn02950826\nn01748264\nn15075141\nn10565667\nn15075141\nn02116738\nn02643566\nn02837789\nn04005630\nn02091134\nn02071294\nn10148035\nn02951358\nn04127249\nn03866082\nn04579145\nn04239074\nn02492035\nn02107683\nn04239074\nn04004767\nn04550184\nn03961711\nn03201208\nn03207941\nn03134739\nn02892767\nn03394916\nn02398521\nn03868863\nn02486410\nn04487394\nn03394916\nn01496331\nn04418357\nn02168699\nn02097209\nn01537544\nn01687978\nn02799071\nn04009552\nn03345487\nn04346328\nn12057211\nn03485794\nn02443484\nn02229544\nn02840245\nn02415577\nn02104029\nn03792782\nn03888605\nn02128925\nn03045698\nn03837869\nn02749479\nn04033995\nn02
422106\nn03404251\nn04208210\nn02113712\nn03459775\nn02514041\nn04371430\nn01644373\nn03447721\nn13052670\nn03492542\nn04366367\nn01968897\nn02033041\nn02114712\nn02804414\nn01796340\nn04009552\nn04597913\nn03141823\nn04612504\nn01729322\nn02492660\nn03792972\nn02130308\nn03400231\nn01632777\nn03085013\nn01729322\nn02095570\nn03970156\nn04009552\nn03950228\nn02086646\nn02108000\nn03196217\nn01580077\nn04275548\nn04599235\nn01774750\nn03498962\nn03457902\nn03930630\nn04590129\nn01968897\nn04462240\nn04554684\nn02840245\nn02804414\nn07614500\nn03482405\nn02871525\nn04192698\nn02699494\nn03388183\nn04153751\nn03733281\nn01797886\nn01689811\nn02777292\nn02389026\nn03788365\nn01514859\nn02102480\nn03942813\nn02111129\nn03017168\nn02105855\nn04328186\nn02115641\nn02093647\nn02415577\nn02536864\nn13044778\nn02113712\nn02123394\nn01735189\nn03085013\nn03127747\nn02105641\nn04606251\nn02814533\nn02980441\nn02910353\nn02098105\nn04380533\nn02098286\nn02018795\nn02788148\nn01807496\nn03908714\nn03388549\nn02100877\nn03982430\nn01986214\nn04201297\nn03347037\nn04008634\nn04557648\nn03445924\nn02980441\nn03131574\nn02948072\nn01797886\nn04005630\nn02111889\nn02325366\nn01728920\nn02129165\nn02168699\nn04465501\nn01728572\nn02105641\nn01774384\nn04418357\nn02325366\nn03888605\nn04149813\nn02281406\nn03599486\nn03124170\nn02100583\nn03956157\nn03788195\nn04286575\nn04136333\nn04344873\nn03743016\nn01494475\nn01910747\nn02787622\nn04562935\nn02909870\nn02974003\nn02111500\nn03388549\nn04550184\nn07745940\nn03673027\nn02727426\nn03207743\nn04487081\nn04009552\nn02130308\nn02105412\nn03476991\nn01632458\nn02790996\nn04505470\nn04380533\nn02108422\nn07920052\nn03467068\nn03249569\nn03633091\nn02124075\nn03763968\nn03710637\nn03100240\nn02256656\nn03461385\nn02869837\nn02948072\nn03991062\nn02091244\nn04476259\nn02099429\nn02346627\nn02782093\nn02457408\nn02009229\nn02910353\nn02087046\nn01877812\nn03787032\nn02281406\nn04461696\nn03782006\nn01924916\nn03223299\nn01768244\nn04023962\nn
07717410\nn03062245\nn07875152\nn03393912\nn02364673\nn03937543\nn02101388\nn04548280\nn12620546\nn03584829\nn04606251\nn02776631\nn04443257\nn02788148\nn03838899\nn02051845\nn07768694\nn03498962\nn02100583\nn02102177\nn07716358\nn04589890\nn02128757\nn02489166\nn03417042\nn03355925\nn02111889\nn03297495\nn03180011\nn03196217\nn02859443\nn02321529\nn04443257\nn03089624\nn07730033\nn03874293\nn03594945\nn02423022\nn11879895\nn02104029\nn02916936\nn02403003\nn03709823\nn04467665\nn01833805\nn02119022\nn02687172\nn02492660\nn02877765\nn02099429\nn03942813\nn02105855\nn02168699\nn07565083\nn03895866\nn03126707\nn02346627\nn02606052\nn03670208\nn02114548\nn02109047\nn03916031\nn01871265\nn04523525\nn02690373\nn03014705\nn02356798\nn02128385\nn02133161\nn03884397\nn02108915\nn03759954\nn03630383\nn02106382\nn02256656\nn02085936\nn03197337\nn03661043\nn04590129\nn03958227\nn04525038\nn02037110\nn03956157\nn03717622\nn02326432\nn03249569\nn01631663\nn01687978\nn12144580\nn02277742\nn03692522\nn04507155\nn04389033\nn04548280\nn01914609\nn01776313\nn03125729\nn02096051\nn02769748\nn04131690\nn02669723\nn04376876\nn01818515\nn02091244\nn03207743\nn03134739\nn03838899\nn02641379\nn02666196\nn02397096\nn02009229\nn02410509\nn02276258\nn03062245\nn02097130\nn02093754\nn02123045\nn04357314\nn03089624\nn02091244\nn01685808\nn02412080\nn03841143\nn01807496\nn02098286\nn02124075\nn02086646\nn03627232\nn09468604\nn01768244\nn07920052\nn03976467\nn03534580\nn03617480\nn04467665\nn07584110\nn04040759\nn02090379\nn03393912\nn01945685\nn04482393\nn01537544\nn02231487\nn02137549\nn03045698\nn04346328\nn04597913\nn02114367\nn07613480\nn02892767\nn04209133\nn02097047\nn02100877\nn02480855\nn03259280\nn03272010\nn07684084\nn03743016\nn01773549\nn02708093\nn02939185\nn03617480\nn01753488\nn07880968\nn03218198\nn02871525\nn02093256\nn01798484\nn02417914\nn02108915\nn04125021\nn03126707\nn04285008\nn02526121\nn04111531\nn02089078\nn02927161\nn02971356\nn04553703\nn02442845\nn01945685\nn01491361\
nn04347754\nn04371774\nn09428293\nn04370456\nn01682714\nn01664065\nn02085620\nn02114855\nn03255030\nn02130308\nn04200800\nn02447366\nn04127249\nn02110185\nn02793495\nn03944341\nn03196217\nn02096294\nn04133789\nn07754684\nn03384352\nn03459775\nn04579145\nn01682714\nn03041632\nn07860988\nn06596364\nn04296562\nn04152593\nn01698640\nn03792972\nn04067472\nn03394916\nn01728920\nn04597913\nn04090263\nn03445777\nn13040303\nn07717556\nn01914609\nn07730033\nn02108089\nn04597913\nn02786058\nn06785654\nn03956157\nn04584207\nn03697007\nn02114712\nn02749479\nn07248320\nn03673027\nn02090379\nn04501370\nn01917289\nn04265275\nn04515003\nn03710721\nn03495258\nn04532670\nn04040759\nn01829413\nn02840245\nn02699494\nn02106550\nn03089624\nn02105056\nn02860847\nn02487347\nn02085782\nn03888257\nn03691459\nn02398521\nn04398044\nn01687978\nn04371774\nn02777292\nn01664065\nn04476259\nn04548280\nn12144580\nn02669723\nn02095314\nn02877765\nn04429376\nn03400231\nn03729826\nn02825657\nn02802426\nn03733281\nn03124043\nn07871810\nn02169497\nn04263257\nn01689811\nn04485082\nn04099969\nn03902125\nn04371430\nn02091635\nn03344393\nn02815834\nn13044778\nn02100877\nn02130308\nn09246464\nn02843684\nn01735189\nn06874185\nn02100583\nn02100877\nn15075141\nn02109525\nn02486410\nn02950826\nn01871265\nn02823750\nn07583066\nn02051845\nn01751748\nn02483362\nn03908618\nn02977058\nn02111889\nn04447861\nn02114855\nn02095314\nn02804414\nn02489166\nn04277352\nn02236044\nn02408429\nn02655020\nn01693334\nn03447721\nn02093647\nn02791124\nn02077923\nn04536866\nn03291819\nn02093859\nn02115641\nn04254680\nn04501370\nn04019541\nn02795169\nn03459775\nn04209133\nn07860988\nn04553703\nn02484975\nn03530642\nn02906734\nn04325704\nn04008634\nn12057211\nn02342885\nn04344873\nn03794056\nn02107142\nn04090263\nn02009229\nn02971356\nn02504458\nn04273569\nn09399592\nn03272562\nn02277742\nn02279972\nn07930864\nn02917067\nn04004767\nn04392985\nn07718747\nn02089078\nn03903868\nn03208938\nn02133161\nn03376595\nn02978881\nn03201208\nn0283439
7\nn02443484\nn02085620\nn02111889\nn03532672\nn04263257\nn03661043\nn15075141\nn04200800\nn03786901\nn01873310\nn04423845\nn01737021\nn02951358\nn02116738\nn01798484\nn03980874\nn02834397\nn02398521\nn01531178\nn07734744\nn01847000\nn03841143\nn02110185\nn13044778\nn02727426\nn02799071\nn02107908\nn01806143\nn03770679\nn03967562\nn02086646\nn02892767\nn01855032\nn02165105\nn01514859\nn04037443\nn03877472\nn03729826\nn01728920\nn02676566\nn03627232\nn04069434\nn04192698\nn02486261\nn02795169\nn04033901\nn01824575\nn02105641\nn02444819\nn01824575\nn03908714\nn04239074\nn02102480\nn02264363\nn01498041\nn02930766\nn04355933\nn04125021\nn03481172\nn02123159\nn02099712\nn04209239\nn02111889\nn02002556\nn03690938\nn04429376\nn03814906\nn04525305\nn02107908\nn01692333\nn04127249\nn01914609\nn04201297\nn02807133\nn01985128\nn02979186\nn02088238\nn03594945\nn03388043\nn09468604\nn03729826\nn02704792\nn07930864\nn03355925\nn04554684\nn04131690\nn04026417\nn02437616\nn03769881\nn04330267\nn02091831\nn01797886\nn02687172\nn02906734\nn02091635\nn02814533\nn02114712\nn03770439\nn04099969\nn04033995\nn02085936\nn01644900\nn02930766\nn01917289\nn01704323\nn04515003\nn01950731\nn03888257\nn07836838\nn02687172\nn02102318\nn02106030\nn02676566\nn01749939\nn03314780\nn03690938\nn02823750\nn03344393\nn03666591\nn04458633\nn04398044\nn01440764\nn04482393\nn03075370\nn02701002\nn04023962\nn01558993\nn07716358\nn02325366\nn02106382\nn04590129\nn10148035\nn02236044\nn04252077\nn12144580\nn02110627\nn03000134\nn02086079\nn03032252\nn02408429\nn03394916\nn02871525\nn01806567\nn02127052\nn02879718\nn03032252\nn03935335\nn04482393\nn03710721\nn04522168\nn04371430\nn04579145\nn03967562\nn03201208\nn04355338\nn04328186\nn04111531\nn01968897\nn02115913\nn01518878\nn04344873\nn02814533\nn01697457\nn04371430\nn01855032\nn01806143\nn03598930\nn02971356\nn03372029\nn02101388\nn02963159\nn02391049\nn01560419\nn02114367\nn03933933\nn03259280\nn01756291\nn04479046\nn07583066\nn03792972\nn02100877\nn07768
694\nn02007558\nn03937543\nn03666591\nn02104029\nn01910747\nn02095889\nn04417672\nn03769881\nn03929855\nn02641379\nn02229544\nn07614500\nn04311174\nn02361337\nn07753592\nn02206856\nn04090263\nn03444034\nn04525305\nn02281406\nn02526121\nn01807496\nn02096294\nn01667778\nn02480855\nn07711569\nn02009229\nn01697457\nn03271574\nn01687978\nn02100236\nn03908714\nn01531178\nn02364673\nn03773504\nn03000684\nn02981792\nn04485082\nn01797886\nn03498962\nn03538406\nn03530642\nn01872401\nn02342885\nn02457408\nn02480495\nn02480855\nn01770393\nn01560419\nn01665541\nn04540053\nn04346328\nn04485082\nn02091635\nn03733805\nn02120505\nn02988304\nn04049303\nn02607072\nn02488702\nn03026506\nn07718472\nn03627232\nn03388043\nn02403003\nn03627232\nn03877845\nn03388043\nn02487347\nn04005630\nn01682714\nn01818515\nn04311174\nn01664065\nn04509417\nn02086910\nn02219486\nn04392985\nn04344873\nn01685808\nn07717410\nn03384352\nn01728920\nn02027492\nn02012849\nn04336792\nn02481823\nn07565083\nn03868863\nn03179701\nn02109525\nn04330267\nn03982430\nn03272010\nn04005630\nn02112137\nn03770439\nn02088094\nn02114548\nn02091032\nn01728572\nn03240683\nn02808440\nn02486410\nn02930766\nn01737021\nn03733805\nn03110669\nn03016953\nn01748264\nn02325366\nn01748264\nn02364673\nn02017213\nn04252077\nn02860847\nn03124043\nn03461385\nn02090721\nn03998194\nn02095570\nn07753113\nn04423845\nn04044716\nn01695060\nn01632458\nn02643566\nn02167151\nn01860187\nn02403003\nn02840245\nn03658185\nn04116512\nn02096294\nn01735189\nn01514859\nn04131690\nn02978881\nn03461385\nn03944341\nn02441942\nn07753113\nn01693334\nn09399592\nn02105412\nn03400231\nn04550184\nn02823428\nn02112137\nn03920288\nn04509417\nn03785016\nn03534580\nn02066245\nn02807133\nn01924916\nn02017213\nn03796401\nn02090721\nn01981276\nn02497673\nn09399592\nn01749939\nn03344393\nn03344393\nn02490219\nn04335435\nn04065272\nn07873807\nn03314780\nn03530642\nn02783161\nn02114548\nn02319095\nn03018349\nn01498041\nn02859443\nn02096051\nn04251144\nn03042490\nn02167151\nn020
96294\nn09246464\nn12985857\nn02100583\nn03240683\nn02236044\nn02356798\nn02317335\nn02859443\nn02510455\nn01945685\nn03792972\nn02011460\nn03220513\nn04141076\nn03662601\nn07745940\nn02747177\nn12998815\nn04209133\nn02097130\nn01685808\nn04273569\nn04515003\nn02094258\nn02109047\nn03028079\nn02408429\nn03777754\nn02113186\nn02500267\nn03891251\nn02112018\nn04487081\nn02927161\nn01664065\nn03534580\nn03729826\nn03187595\nn02105505\nn07718747\nn02802426\nn02226429\nn04116512\nn01756291\nn01817953\nn07714990\nn02457408\nn03109150\nn04026417\nn02437312\nn02124075\nn02113978\nn03109150\nn02389026\nn06785654\nn03089624\nn03444034\nn04149813\nn02091032\nn04376876\nn02606052\nn03492542\nn04579145\nn01496331\nn01592084\nn04141975\nn01580077\nn02112706\nn03388043\nn02256656\nn02087394\nn04179913\nn07930864\nn04355338\nn03874293\nn04033995\nn02088364\nn03535780\nn03476991\nn04336792\nn03888257\nn07836838\nn03028079\nn03877845\nn03982430\nn02116738\nn04596742\nn03843555\nn15075141\nn04325704\nn04398044\nn02134084\nn02132136\nn03602883\nn01955084\nn02268853\nn02490219\nn04044716\nn02492660\nn01770393\nn03447447\nn07871810\nn01739381\nn03933933\nn02110958\nn04517823\nn10565667\nn02087046\nn02909870\nn07747607\nn13037406\nn03743016\nn02113023\nn07716358\nn01828970\nn04579145\nn04482393\nn02169497\nn04371430\nn01751748\nn01632777\nn02106382\nn01697457\nn04074963\nn03062245\nn02607072\nn03868863\nn04409515\nn01829413\nn04254680\nn01728920\nn02802426\nn03666591\nn01984695\nn02708093\nn02090721\nn02089973\nn02099849\nn02134084\nn13133613\nn03733281\nn02268853\nn04347754\nn02115641\nn04346328\nn02769748\nn01665541\nn03961711\nn02391049\nn01675722\nn02017213\nn03045698\nn02356798\nn02977058\nn01873310\nn02276258\nn03692522\nn02107908\nn03954731\nn04389033\nn02226429\nn03676483\nn02107908\nn01484850\nn01774750\nn02979186\nn03761084\nn03623198\nn03445777\nn03770679\nn01728572\nn03495258\nn04613696\nn02441942\nn03594734\nn02114855\nn02883205\nn04311174\nn04532670\nn02134418\nn03717622\nn0
2859443\nn03930313\nn03126707\nn03977966\nn03983396\nn04456115\nn07760859\nn01532829\nn04208210\nn03991062\nn04131690\nn03649909\nn03425413\nn02017213\nn02974003\nn03958227\nn02408429\nn01614925\nn03884397\nn04429376\nn01749939\nn01756291\nn01498041\nn03992509\nn03532672\nn04286575\nn03376595\nn02108000\nn02108551\nn07565083\nn03792782\nn02089867\nn07684084\nn03404251\nn03871628\nn04311004\nn13040303\nn02111129\nn02422699\nn03733281\nn04153751\nn04179913\nn02268443\nn02443114\nn03485794\nn07579787\nn02110063\nn01616318\nn03871628\nn07697537\nn02114367\nn02091134\nn02883205\nn02814533\nn03871628\nn02105056\nn02865351\nn03991062\nn02104365\nn04275548\nn03929660\nn03814639\nn02834397\nn03792782\nn07730033\nn02445715\nn02804610\nn02119789\nn04040759\nn02415577\nn02206856\nn02114367\nn04493381\nn02276258\nn03991062\nn02236044\nn04332243\nn07760859\nn02504013\nn02090379\nn02445715\nn10565667\nn04487081\nn09472597\nn04398044\nn01873310\nn02087046\nn03788365\nn02097658\nn03467068\nn07717410\nn03642806\nn03063689\nn01914609\nn03792782\nn12267677\nn03220513\nn02119789\nn02950826\nn02113712\nn03697007\nn04009552\nn03876231\nn10148035\nn03590841\nn03461385\nn02814860\nn03729826\nn03255030\nn09288635\nn02094114\nn04550184\nn02115913\nn01990800\nn02112350\nn12998815\nn02672831\nn01860187\nn04493381\nn02979186\nn02441942\nn02128757\nn01883070\nn03803284\nn03417042\nn02992211\nn04462240\nn03759954\nn01984695\nn07584110\nn04118538\nn02105412\nn03218198\nn02835271\nn03314780\nn04070727\nn03325584\nn01742172\nn04266014\nn03447447\nn02701002\nn01877812\nn03062245\nn01592084\nn01924916\nn03781244\nn01798484\nn02730930\nn02417914\nn02791124\nn02412080\nn09256479\nn04008634\nn02493793\nn07753275\nn03980874\nn02280649\nn03400231\nn03476991\nn02787622\nn02086240\nn04041544\nn04370456\nn04591713\nn03062245\nn04254120\nn02125311\nn03920288\nn02088364\nn02002724\nn02107683\nn01498041\nn04550184\nn01984695\nn04584207\nn02971356\nn03961711\nn02447366\nn01855672\nn03126707\nn03481172\nn02640242\n
n03376595\nn02814860\nn01498041\nn04442312\nn03776460\nn01882714\nn04485082\nn03201208\nn01978455\nn04456115\nn03467068\nn02086240\nn02256656\nn04517823\nn03291819\nn04263257\nn02106662\nn02823750\nn03527444\nn01807496\nn02112018\nn02860847\nn01980166\nn01514859\nn02879718\nn02128925\nn03944341\nn07831146\nn04049303\nn04004767\nn04254120\nn02108422\nn07871810\nn01775062\nn02808304\nn03929660\nn02667093\nn07716906\nn03697007\nn12057211\nn03196217\nn01855032\nn02097047\nn02444819\nn07711569\nn02071294\nn06596364\nn03584829\nn02025239\nn09256479\nn02484975\nn02840245\nn02814533\nn03188531\nn03891332\nn01560419\nn02110185\nn01685808\nn03207941\nn02096294\nn02672831\nn04311004\nn04265275\nn07730033\nn04296562\nn02167151\nn02110341\nn03832673\nn03709823\nn02115641\nn02510455\nn04325704\nn02129604\nn04296562\nn13037406\nn04554684\nn03706229\nn02500267\nn02101388\nn02206856\nn02111889\nn04442312\nn02102973\nn02098105\nn02906734\nn01770081\nn13054560\nn04325704\nn02909870\nn02927161\nn03976467\nn03014705\nn02483362\nn02012849\nn02321529\nn03841143\nn04389033\nn02094258\nn15075141\nn03733805\nn03958227\nn03792972\nn04542943\nn02979186\nn07614500\nn03666591\nn03929855\nn07802026\nn02974003\nn02319095\nn02804414\nn04325704\nn02109525\nn02999410\nn02120079\nn04404412\nn01871265\nn03871628\nn03337140\nn01667778\nn01819313\nn04532670\nn02319095\nn03457902\nn02978881\nn02119789\nn04026417\nn01693334\nn01744401\nn03825788\nn04273569\nn03942813\nn01984695\nn02727426\nn01820546\nn04487081\nn03956157\nn04465501\nn04579145\nn02117135\nn04447861\nn03085013\nn02134084\nn03769881\nn03717622\nn02105251\nn03761084\nn02088466\nn01872401\nn02807133\nn03775546\nn03590841\nn03617480\nn01677366\nn02119789\nn02226429\nn04409515\nn03995372\nn02013706\nn07697537\nn02025239\nn02114712\nn03394916\nn02494079\nn01968897\nn03977966\nn11879895\nn03492542\nn03843555\nn03742115\nn04208210\nn02423022\nn04515003\nn13054560\nn02483708\nn04507155\nn07717410\nn03255030\nn03133878\nn03877845\nn04344873\nn04540053
\nn09399592\nn04517823\nn04086273\nn02978881\nn02115641\nn04461696\nn02102973\nn02277742\nn04399382\nn04330267\nn03661043\nn13037406\nn04604644\nn03958227\nn02397096\nn04125021\nn03445924\nn03492542\nn02092339\nn03787032\nn03791053\nn02804414\nn01753488\nn07754684\nn01496331\nn01990800\nn04356056\nn04065272\nn01756291\nn04136333\nn03662601\nn02006656\nn02326432\nn02018795\nn03777568\nn07932039\nn04265275\nn02268853\nn03649909\nn04548362\nn03538406\nn02104365\nn03062245\nn04131690\nn01955084\nn04606251\nn04037443\nn01990800\nn02892767\nn02113023\nn03873416\nn04254680\nn02444819\nn04606251\nn02091032\nn03623198\nn01693334\nn04162706\nn04476259\nn01773157\nn02510455\nn01616318\nn02782093\nn04209133\nn03777568\nn12998815\nn04417672\nn12620546\nn04517823\nn02259212\nn02727426\nn02797295\nn03062245\nn02794156\nn04347754\nn03417042\nn02123159\nn03530642\nn07715103\nn07716906\nn03874599\nn04179913\nn01877812\nn02101388\nn02233338\nn04141327\nn02666196\nn04131690\nn03032252\nn02114367\nn03045698\nn02090721\nn02815834\nn07873807\nn02965783\nn04429376\nn04604644\nn01855032\nn02018795\nn03729826\nn04404412\nn07615774\nn02013706\nn01955084\nn01774750\nn01644373\nn02096177\nn02114712\nn03891332\nn03482405\nn03916031\nn02099849\nn02480855\nn13044778\nn02226429\nn03670208\nn13133613\nn03670208\nn04125021\nn02276258\nn03131574\nn03929855\nn02687172\nn02443484\nn02101006\nn04367480\nn02109525\nn04049303\nn02096051\nn03929660\nn02776631\nn02027492\nn01795545\nn02109525\nn03584829\nn03595614\nn02992211\nn04243546\nn03404251\nn04023962\nn03085013\nn02128385\nn02111129\nn04613696\nn04152593\nn02978881\nn02909870\nn10565667\nn03467068\nn02280649\nn03763968\nn02056570\nn02504458\nn03958227\nn03874599\nn02133161\nn03871628\nn02099849\nn03179701\nn01985128\nn02112137\nn02098413\nn01945685\nn02105505\nn03796401\nn04152593\nn02410509\nn01665541\nn04147183\nn02655020\nn02233338\nn03297495\nn01776313\nn01945685\nn03710193\nn04462240\nn03956157\nn02229544\nn02782093\nn04355338\nn03000684\nn045429
43\nn02111277\nn04505470\nn03196217\nn02112706\nn03590841\nn03197337\nn02526121\nn04522168\nn01877812\nn03617480\nn02870880\nn04591713\nn06359193\nn02110958\nn07892512\nn03796401\nn03047690\nn01518878\nn04263257\nn01910747\nn07753275\nn01882714\nn04033901\nn01784675\nn02489166\nn03534580\nn04447861\nn02403003\nn07717556\nn02027492\nn03710721\nn02281787\nn02807133\nn03124170\nn02396427\nn02981792\nn04613696\nn02481823\nn04522168\nn03930313\nn10565667\nn03776460\nn03180011\nn04235860\nn02397096\nn03016953\nn03838899\nn09193705\nn04404412\nn04336792\nn02978881\nn07720875\nn04286575\nn12985857\nn07613480\nn03063689\nn02206856\nn02011460\nn02769748\nn02317335\nn02749479\nn01770081\nn02422699\nn02088094\nn02906734\nn06785654\nn04152593\nn03916031\nn02113186\nn02115913\nn02791124\nn03764736\nn02356798\nn02979186\nn02749479\nn03630383\nn03259280\nn04023962\nn04026417\nn02909870\nn03404251\nn03868863\nn03495258\nn03899768\nn03733805\nn02823750\nn02086079\nn04356056\nn03196217\nn01806143\nn07718472\nn04335435\nn03937543\nn04070727\nn01631663\nn02643566\nn11879895\nn03690938\nn02093428\nn02105641\nn02091134\nn03131574\nn03485407\nn01677366\nn02099601\nn02123045\nn02443114\nn02134418\nn04370456\nn01883070\nn04141076\nn03467068\nn02105162\nn02226429\nn02397096\nn02692877\nn02447366\nn13037406\nn09332890\nn04482393\nn03877845\nn02102480\nn10565667\nn02791270\nn02669723\nn02808304\nn04548362\nn03658185\nn02489166\nn02098286\nn07615774\nn04532106\nn01807496\nn02992529\nn01694178\nn04428191\nn03445924\nn07742313\nn04037443\nn03887697\nn01630670\nn02099267\nn02123597\nn01981276\nn02825657\nn02106662\nn03657121\nn03249569\nn03218198\nn04152593\nn12985857\nn03160309\nn02939185\nn01817953\nn01773157\nn02999410\nn03482405\nn04200800\nn02488702\nn03272562\nn03992509\nn03544143\nn04141327\nn02099712\nn03016953\nn02107142\nn01751748\nn02009912\nn02087394\nn04355933\nn02117135\nn13054560\nn02006656\nn03733805\nn03710193\nn04141076\nn01608432\nn09835506\nn04398044\nn07579787\nn02099712\nn0212
3597\nn07836838\nn04131690\nn04090263\nn02981792\nn02018795\nn03602883\nn02074367\nn02443484\nn02871525\nn02457408\nn02799071\nn03764736\nn03804744\nn02190166\nn03769881\nn04399382\nn04553703\nn02058221\nn02981792\nn01692333\nn01631663\nn03868242\nn06785654\nn03977966\nn04423845\nn02791124\nn02128385\nn01664065\nn01756291\nn07802026\nn02979186\nn02814533\nn12768682\nn04201297\nn07742313\nn02489166\nn02120079\nn03743016\nn03482405\nn01795545\nn02108551\nn02096051\nn02951358\nn02169497\nn04532106\nn02268443\nn03676483\nn01798484\nn02113712\nn07697313\nn02112018\nn04525038\nn03982430\nn04239074\nn02123597\nn03063689\nn02091134\nn02138441\nn03255030\nn02012849\nn02879718\nn02111277\nn02088466\nn02105056\nn01776313\nn04584207\nn02095314\nn01806567\nn01770393\nn03271574\nn03599486\nn10148035\nn03627232\nn04275548\nn03063689\nn03016953\nn01990800\nn04141076\nn03131574\nn01968897\nn02093256\nn01774750\nn01855672\nn04435653\nn03127747\nn03657121\nn03529860\nn07730033\nn02837789\nn01828970\nn02002556\nn02132136\nn03873416\nn03424325\nn04259630\nn02097130\nn03272562\nn03496892\nn04525305\nn03916031\nn01644373\nn04591713\nn02504013\nn02091831\nn01847000\nn03000684\nn01770393\nn03763968\nn02093754\nn03063689\nn02085782\nn03290653\nn03777568\nn07718472\nn02090721\nn02089078\nn03792782\nn13037406\nn02111889\nn04550184\nn03063599\nn04229816\nn04238763\nn01693334\nn03743016\nn02108551\nn04604644\nn02281787\nn02119789\nn02808304\nn09332890\nn02106550\nn07802026\nn03249569\nn07836838\nn03775546\nn04204347\nn04592741\nn01498041\nn03929660\nn02077923\nn02108089\nn02094433\nn02107574\nn13133613\nn02749479\nn03249569\nn02641379\nn03804744\nn02321529\nn01797886\nn02690373\nn13054560\nn02950826\nn01737021\nn01689811\nn01664065\nn07693725\nn02342885\nn02169497\nn09288635\nn02087394\nn03376595\nn02120505\nn03938244\nn03345487\nn02500267\nn01797886\nn04443257\nn03492542\nn02094258\nn03721384\nn13044778\nn03868863\nn07711569\nn02236044\nn04081281\nn03838899\nn04596742\nn02111500\nn04251144\nn02
100583\nn07714571\nn04238763\nn02105412\nn02443484\nn04019541\nn03394916\nn03776460\nn03000134\nn02109525\nn02109525\nn02870880\nn03393912\nn03197337\nn04081281\nn03763968\nn01688243\nn02110806\nn02834397\nn02939185\nn02279972\nn03888605\nn02268443\nn02988304\nn04310018\nn04285008\nn09246464\nn02389026\nn01558993\nn01955084\nn01930112\nn01644373\nn12620546\nn02093256\nn09256479\nn02002724\nn03160309\nn04204238\nn01753488\nn03393912\nn01641577\nn02100735\nn04584207\nn02100236\nn02879718\nn02988304\nn02105162\nn02110806\nn04258138\nn03590841\nn02927161\nn01498041\nn03720891\nn04515003\nn02134418\nn03014705\nn03344393\nn02783161\nn04443257\nn02492660\nn03218198\nn01755581\nn02090622\nn03179701\nn04252225\nn04417672\nn04037443\nn04065272\nn03721384\nn02089973\nn02091635\nn03804744\nn09288635\nn04613696\nn03796401\nn07714990\nn01770393\nn01742172\nn02128385\nn03492542\nn03916031\nn01883070\nn01739381\nn02980441\nn02966687\nn04486054\nn04443257\nn01984695\nn03026506\nn02808440\nn02977058\nn02114367\nn02094114\nn02326432\nn03016953\nn02106166\nn03710193\nn01644373\nn02091134\nn03259280\nn03018349\nn03791053\nn04008634\nn02095570\nn07718747\nn03376595\nn07717410\nn02894605\nn07583066\nn02281787\nn03483316\nn02105505\nn03837869\nn04591713\nn02749479\nn01514668\nn02090379\nn03424325\nn03642806\nn02089973\nn01532829\nn02105641\nn04591713\nn01819313\nn02127052\nn03124043\nn03649909\nn02113186\nn04067472\nn02114548\nn03791053\nn03792782\nn02093991\nn03530642\nn02397096\nn02281787\nn03661043\nn03495258\nn02174001\nn07880968\nn03459775\nn02100236\nn02727426\nn01820546\nn02988304\nn02112350\nn03476684\nn04238763\nn02028035\nn02120505\nn01704323\nn03047690\nn02268443\nn02443114\nn02112137\nn02879718\nn01697457\nn04264628\nn03314780\nn03649909\nn02133161\nn07730033\nn03670208\nn02835271\nn03584829\nn02326432\nn03916031\nn03485794\nn03314780\nn02342885\nn02105412\nn02321529\nn01669191\nn07742313\nn03045698\nn02510455\nn04201297\nn03710721\nn02966687\nn02094258\nn02109047\nn03376595\nn
03017168\nn01924916\nn02017213\nn02086079\nn03666591\nn04465501\nn02981792\nn03832673\nn01806567\nn02793495\nn02110806\nn01833805\nn01622779\nn02493509\nn03495258\nn03485407\nn02051845\nn04141975\nn02909870\nn01698640\nn02096294\nn02009912\nn02097658\nn02018207\nn02804414\nn03095699\nn01665541\nn03532672\nn02102177\nn01806143\nn01847000\nn07693725\nn02268853\nn03530642\nn03908618\nn03781244\nn04286575\nn02111129\nn04273569\nn04590129\nn02100583\nn03916031\nn04404412\nn02708093\nn03160309\nn07579787\nn03476991\nn04204238\nn03344393\nn09193705\nn01665541\nn01968897\nn03180011\nn02948072\nn01871265\nn01843383\nn02494079\nn02105505\nn02356798\nn02769748\nn01955084\nn01990800\nn02113712\nn03976657\nn03633091\nn03937543\nn04252225\nn02442845\nn03461385\nn03014705\nn01644900\nn03924679\nn04152593\nn02974003\nn02804414\nn03290653\nn04344873\nn02326432\nn04371430\nn03485794\nn02107142\nn03483316\nn04330267\nn01883070\nn02105505\nn03062245\nn03924679\nn02326432\nn03761084\nn02104029\nn02074367\nn04023962\nn02123597\nn04264628\nn03902125\nn02077923\nn02927161\nn03272562\nn04399382\nn07875152\nn03478589\nn03680355\nn02093428\nn03903868\nn02396427\nn01753488\nn01914609\nn04487081\nn03372029\nn01753488\nn02096585\nn07747607\nn01601694\nn03146219\nn03733131\nn03124043\nn02090622\nn03063599\nn03599486\nn03976657\nn07880968\nn02086910\nn02494079\nn02100735\nn01693334\nn02966193\nn02089973\nn03866082\nn02640242\nn02094433\nn03947888\nn01592084\nn04039381\nn04263257\nn04326547\nn02841315\nn04009552\nn02099712\nn03271574\nn02701002\nn03791053\nn04252077\nn07717410\nn02027492\nn02097474\nn02113799\nn01773797\nn11939491\nn03494278\nn02971356\nn02509815\nn02107683\nn04328186\nn03998194\nn03938244\nn03721384\nn02089973\nn07684084\nn04613696\nn03476991\nn03444034\nn03272010\nn02219486\nn07613480\nn03899768\nn01770393\nn04532106\nn04264628\nn03314780\nn02422106\nn01689811\nn04154565\nn03991062\nn02088094\nn03384352\nn02088632\nn03146219\nn02017213\nn02123597\nn01806567\nn01740131\nn01829413\
nn04004767\nn04355338\nn04044716\nn01735189\nn03218198\nn02108422\nn07831146\nn02110185\nn07932039\nn03658185\nn01773797\nn09288635\nn02133161\nn01820546\nn09332890\nn09468604\nn03935335\nn04562935\nn03908714\nn02167151\nn03216828\nn02497673\nn04493381\nn03452741\nn02117135\nn04131690\nn02120505\nn03743016\nn02364673\nn03980874\nn04462240\nn02804414\nn02051845\nn02808440\nn02172182\nn09428293\nn02093428\nn03220513\nn02699494\nn03803284\nn03804744\nn02514041\nn04099969\nn04296562\nn03388549\nn12998815\nn03933933\nn04208210\nn02410509\nn04482393\nn04487081\nn02486261\nn02113799\nn04228054\nn09835506\nn04067472\nn01664065\nn04428191\nn01740131\nn02493509\nn11939491\nn03042490\nn03584254\nn09468604\nn04120489\nn02483708\nn01498041\nn03786901\nn04523525\nn02165105\nn03888605\nn02115913\nn04201297\nn04501370\nn04037443\nn02172182\nn03793489\nn03724870\nn02391049\nn04069434\nn02807133\nn02056570\nn07584110\nn04398044\nn04398044\nn03854065\nn02655020\nn02107312\nn04366367\nn04086273\nn03485407\nn02104029\nn04251144\nn03627232\nn02132136\nn02979186\nn02317335\nn03201208\nn04479046\nn03452741\nn04258138\nn07590611\nn04149813\nn04355933\nn03207941\nn04479046\nn02441942\nn03866082\nn07583066\nn03445777\nn03017168\nn02672831\nn04204238\nn04326547\nn02113712\nn01514668\nn02415577\nn03706229\nn02981792\nn02840245\nn04389033\nn03992509\nn02403003\nn04005630\nn03637318\nn04371430\nn04347754\nn02100583\nn01518878\nn02319095\nn02492035\nn04597913\nn02206856\nn02025239\nn04591157\nn01773549\nn04081281\nn07697537\nn01682714\nn04069434\nn02085782\nn02655020\nn07714571\nn01614925\nn04008634\nn07873807\nn04131690\nn03680355\nn02422699\nn07753592\nn03840681\nn06785654\nn01530575\nn02096051\nn03764736\nn02108089\nn04044716\nn03384352\nn01818515\nn02056570\nn02097130\nn01665541\nn01688243\nn04131690\nn04606251\nn01616318\nn01688243\nn02113186\nn04613696\nn01737021\nn02776631\nn03995372\nn01806143\nn01753488\nn04037443\nn02879718\nn04009552\nn02110806\nn04332243\nn04560804\nn03884397\nn0211095
8\nn03888605\nn01685808\nn07565083\nn02883205\nn02492660\nn01798484\nn03100240\nn02088094\nn04229816\nn02098286\nn02841315\nn03017168\nn04120489\nn07718747\nn03933933\nn04355933\nn04483307\nn02107142\nn01744401\nn02093991\nn02112137\nn02085936\nn03929855\nn02051845\nn02091831\nn01740131\nn02948072\nn02112706\nn04584207\nn04070727\nn03584254\nn04235860\nn01749939\nn02086079\nn03424325\nn04485082\nn02165456\nn03259280\nn02132136\nn03445924\nn12768682\nn03325584\nn01644373\nn02361337\nn04523525\nn07753592\nn04067472\nn04579145\nn07880968\nn02231487\nn04486054\nn03658185\nn04429376\nn03126707\nn02085620\nn02104365\nn02692877\nn04557648\nn04606251\nn03888605\nn02105412\nn06785654\nn02101388\nn03393912\nn04370456\nn12985857\nn07871810\nn03742115\nn04238763\nn02101006\nn02090379\nn09399592\nn07930864\nn02123597\nn03494278\nn02363005\nn07892512\nn02776631\nn03785016\nn07930864\nn02123394\nn01855032\nn02883205\nn02091831\nn03868242\nn02930766\nn01945685\nn03594734\nn02493793\nn02398521\nn04501370\nn03417042\nn02815834\nn03710637\nn02100583\nn02497673\nn02894605\nn03895866\nn01756291\nn02091032\nn02120505\nn03980874\nn07745940\nn02769748\nn04208210\nn01990800\nn02397096\nn01692333\nn03814639\nn01855672\nn04154565\nn02317335\nn02815834\nn07693725\nn03720891\nn02110627\nn13037406\nn02391049\nn04131690\nn01930112\nn07760859\nn03770679\nn02111500\nn04252225\nn01877812\nn03180011\nn13044778\nn02492660\nn04273569\nn04004767\nn04238763\nn03706229\nn04357314\nn01641577\nn04311174\nn03109150\nn03866082\nn03933933\nn02412080\nn03207743\nn03218198\nn07716906\nn03218198\nn02667093\nn02799071\nn02346627\nn03874293\nn01537544\nn01728572\nn03804744\nn01855672\nn01744401\nn02747177\nn02939185\nn02676566\nn02950826\nn02097298\nn01819313\nn02276258\nn09428293\nn01682714\nn03710637\nn03920288\nn02672831\nn02447366\nn02860847\nn02412080\nn04254680\nn01692333\nn02807133\nn03394916\nn13133613\nn01806567\nn07720875\nn07836838\nn02088094\nn02102040\nn01580077\nn03775546\nn04238763\nn04118776\nn04540
053\nn02096294\nn02441942\nn03781244\nn02093256\nn02988304\nn02423022\nn07871810\nn01704323\nn02132136\nn01560419\nn02206856\nn01833805\nn02980441\nn11879895\nn07875152\nn03930313\nn03042490\nn03954731\nn03933933\nn03126707\nn03461385\nn02114855\nn03929660\nn04550184\nn02783161\nn03944341\nn07693725\nn02123045\nn09288635\nn03196217\nn03297495\nn02091831\nn03670208\nn04487394\nn02105251\nn02454379\nn02099849\nn04409515\nn01592084\nn02092002\nn07590611\nn03992509\nn02412080\nn03075370\nn02447366\nn02669723\nn12985857\nn03584254\nn01753488\nn02708093\nn02497673\nn04069434\nn01484850\nn07873807\nn03492542\nn03457902\nn03670208\nn04376876\nn01697457\nn02101556\nn11879895\nn02071294\nn03710193\nn03961711\nn03930313\nn02793495\nn12768682\nn03657121\nn04596742\nn04204238\nn02093754\nn03961711\nn09472597\nn03379051\nn02417914\nn02107312\nn02489166\nn01828970\nn03884397\nn04251144\nn03792782\nn02782093\nn01820546\nn02981792\nn06359193\nn03443371\nn01735189\nn04501370\nn03673027\nn03770679\nn03085013\nn02112706\nn01978287\nn02794156\nn02087394\nn01443537\nn04286575\nn02123394\nn04264628\nn03337140\nn03710721\nn03947888\nn02514041\nn02328150\nn02110185\nn03992509\nn02965783\nn02096177\nn01824575\nn03929855\nn02815834\nn02643566\nn01744401\nn02672831\nn02447366\nn06874185\nn04325704\nn02317335\nn03126707\nn02056570\nn02457408\nn03443371\nn04125021\nn03866082\nn03127747\nn04311004\nn02134084\nn01910747\nn07716358\nn02134418\nn02071294\nn04335435\nn03594734\nn06359193\nn04336792\nn02097474\nn07717410\nn02092339\nn04376876\nn03785016\nn02087394\nn02825657\nn03208938\nn03720891\nn04366367\nn02480855\nn03124043\nn04067472\nn03180011\nn04049303\nn04243546\nn04423845\nn03127747\nn02259212\nn03697007\nn04136333\nn04590129\nn03942813\nn02268443\nn04008634\nn04254680\nn04125021\nn04040759\nn03924679\nn04485082\nn02410509\nn04259630\nn03584829\nn03196217\nn03776460\nn01774750\nn09421951\nn07802026\nn04399382\nn04536866\nn04525038\nn02091467\nn03902125\nn03544143\nn02791270\nn03888605\nn033
76595\nn02397096\nn03777754\nn04592741\nn03047690\nn07693725\nn02113978\nn04398044\nn02783161\nn04596742\nn03785016\nn01582220\nn02791270\nn02791124\nn02129165\nn03404251\nn03670208\nn03903868\nn02978881\nn02094433\nn04252225\nn02096177\nn03496892\nn03000684\nn03983396\nn02111277\nn03720891\nn03782006\nn01829413\nn04153751\nn03271574\nn03538406\nn03970156\nn03924679\nn02088094\nn01806143\nn02113978\nn03207941\nn03347037\nn03633091\nn03404251\nn04579145\nn02276258\nn02086240\nn02799071\nn03871628\nn02087394\nn02264363\nn03478589\nn03788365\nn02097658\nn02093647\nn07920052\nn03788195\nn03720891\nn07717556\nn02113023\nn01855032\nn07802026\nn02037110\nn03832673\nn04350905\nn07613480\nn02814860\nn03777754\nn03218198\nn02441942\nn02115913\nn02109961\nn04347754\nn03841143\nn02786058\nn02690373\nn07697313\nn07613480\nn01873310\nn03874599\nn02113624\nn02992211\nn07871810\nn03388183\nn01644900\nn04067472\nn04039381\nn02361337\nn04039381\nn04370456\nn01843065\nn01877812\nn02488291\nn03692522\nn02669723\nn03018349\nn03207743\nn02096177\nn01514859\nn02105056\nn03495258\nn03207743\nn04523525\nn03259280\nn03127747\nn02988304\nn02096437\nn02087394\nn04370456\nn01882714\nn01644900\nn11879895\nn03814639\nn03763968\nn03788365\nn04579145\nn03837869\nn04429376\nn02219486\nn03983396\nn04591157\nn07693725\nn02281787\nn01829413\nn04606251\nn02795169\nn03467068\nn02486410\nn04505470\nn02488702\nn02108089\nn02783161\nn06596364\nn01558993\nn07871810\nn02655020\nn02256656\nn03290653\nn03131574\nn01829413\nn02930766\nn03529860\nn01871265\nn01675722\nn02840245\nn04392985\nn04286575\nn03404251\nn02823428\nn02951585\nn02077923\nn03000247\nn01843065\nn02804414\nn04525038\nn01749939\nn03095699\nn04552348\nn03532672\nn03527444\nn03947888\nn02667093\nn02346627\nn01667114\nn07749582\nn02128385\nn02093754\nn02092002\nn02782093\nn04310018\nn02104365\nn02134418\nn03769881\nn02776631\nn01984695\nn02097658\nn02095570\nn02321529\nn02108000\nn02098413\nn03623198\nn03100240\nn03109150\nn02168699\nn03017168\nn0
1819313\nn02117135\nn03871628\nn03924679\nn04399382\nn15075141\nn03884397\nn03425413\nn03584829\nn03976467\nn02979186\nn02124075\nn02869837\nn03998194\nn02025239\nn01558993\nn04044716\nn02107908\nn04404412\nn04266014\nn03944341\nn01751748\nn02025239\nn04040759\nn02102973\nn03930630\nn09246464\nn02174001\nn02389026\nn03764736\nn01795545\nn02790996\nn02526121\nn03133878\nn03124043\nn02979186\nn02093754\nn03598930\nn03250847\nn02134084\nn03733281\nn02226429\nn04019541\nn02105855\nn02256656\nn02787622\nn04435653\nn03599486\nn03733131\nn02325366\nn03259280\nn03028079\nn03476684\nn03133878\nn03590841\nn03197337\nn04525038\nn03494278\nn04270147\nn01860187\nn02086910\nn02457408\nn03627232\nn03133878\nn03947888\nn02823428\nn02097298\nn02108000\nn04540053\nn03141823\nn03201208\nn03476991\nn02113023\nn03777754\nn03854065\nn02415577\nn02974003\nn01820546\nn02087046\nn04149813\nn04332243\nn02090379\nn04509417\nn07760859\nn03637318\nn02672831\nn03141823\nn03538406\nn03201208\nn04286575\nn02097658\nn03873416\nn04515003\nn09193705\nn02939185\nn03933933\nn01749939\nn03483316\nn02098105\nn02107908\nn02130308\nn02105641\nn04458633\nn03692522\nn02777292\nn07565083\nn02708093\nn02783161\nn04037443\nn04259630\nn02112706\nn07802026\nn01729977\nn02168699\nn04192698\nn04209133\nn07590611\nn01729322\nn02028035\nn04579432\nn01518878\nn02443484\nn07742313\nn04376876\nn04019541\nn02791270\nn02906734\nn02264363\nn02233338\nn06874185\nn04069434\nn13044778\nn02981792\nn02117135\nn03775071\nn03249569\nn04239074\nn03868242\nn02099267\nn03467068\nn02791270\nn01632777\nn01817953\nn04325704\nn01582220\nn04081281\nn03838899\nn02865351\nn02445715\nn04009552\nn02089867\nn02256656\nn01860187\nn02815834\nn04447861\nn03786901\nn04120489\nn03584254\nn03255030\nn02006656\nn03187595\nn04152593\nn03467068\nn03942813\nn03947888\nn07831146\nn02090721\nn04532670\nn03018349\nn02093991\nn01917289\nn01729322\nn02108422\nn03197337\nn02951585\nn04263257\nn07932039\nn01537544\nn03495258\nn01755581\nn02096051\nn01737021\n
n04120489\nn02111500\nn03895866\nn02106166\nn04350905\nn04081281\nn02791124\nn04501370\nn02115913\nn02088466\nn07614500\nn02410509\nn01740131\nn03483316\nn02701002\nn03792782\nn03995372\nn03016953\nn02536864\nn12144580\nn02011460\nn04355933\nn02423022\nn03658185\nn03344393\nn02096177\nn03692522\nn04423845\nn02110185\nn02177972\nn03197337\nn03924679\nn01749939\nn02229544\nn03000247\nn01744401\nn02321529\nn03874293\nn03481172\nn01872401\nn02112018\nn02492035\nn03670208\nn04372370\nn01697457\nn02788148\nn01796340\nn03272562\nn02098286\nn03781244\nn03666591\nn13037406\nn04532670\nn03394916\nn01744401\nn02114855\nn04542943\nn02860847\nn02268443\nn04254120\nn02088466\nn11939491\nn03788195\nn07860988\nn03832673\nn02134084\nn02092339\nn02797295\nn04252077\nn04591713\nn02096177\nn03134739\nn03982430\nn02107574\nn02233338\nn07697313\nn03891332\nn03325584\nn03208938\nn01518878\nn02509815\nn03710721\nn04487394\nn03014705\nn02099429\nn02834397\nn04141975\nn01978455\nn03891332\nn02870880\nn04265275\nn02497673\nn01955084\nn02963159\nn02099712\nn02793495\nn03691459\nn02085782\nn03991062\nn02088094\nn07711569\nn02346627\nn07695742\nn03218198\nn01784675\nn02799071\nn03944341\nn03179701\nn02415577\nn04370456\nn04443257\nn04254777\nn01496331\nn02699494\nn01677366\nn02514041\nn02086240\nn02107908\nn11879895\nn03770679\nn02749479\nn03803284\nn04485082\nn03201208\nn03045698\nn03944341\nn01930112\nn02113186\nn04286575\nn03706229\nn02871525\nn01774384\nn01855032\nn02109047\nn02114548\nn12998815\nn03218198\nn03216828\nn04371774\nn02114712\nn04548280\nn02276258\nn04033995\nn03393912\nn03980874\nn04389033\nn07583066\nn01704323\nn03445924\nn02018795\nn03445777\nn02098286\nn03838899\nn01689811\nn03666591\nn03000247\nn02099712\nn03483316\nn04505470\nn02490219\nn04239074\nn01531178\nn02116738\nn01950731\nn02113624\nn04204238\nn02276258\nn07715103\nn03026506\nn02108551\nn02127052\nn02088466\nn02093256\nn02102040\nn03976657\nn04532670\nn03776460\nn03220513\nn03903868\nn03792972\nn03529860\nn02009229
\nn02113624\nn02447366\nn03461385\nn02102318\nn04263257\nn02114855\nn02676566\nn03425413\nn03538406\nn03666591\nn03272010\nn07768694\nn04392985\nn04330267\nn03026506\nn07730033\nn02094258\nn04515003\nn04265275\nn13044778\nn02965783\nn02120505\nn02058221\nn03314780\nn02793495\nn02708093\nn03633091\nn03014705\nn01665541\nn02526121\nn04067472\nn04428191\nn07836838\nn02177972\nn01817953\nn04296562\nn04099969\nn03956157\nn02114367\nn02091635\nn02113978\nn03838899\nn02437616\nn04370456\nn02423022\nn02112706\nn02096585\nn02497673\nn04505470\nn02098286\nn02319095\nn04560804\nn03976657\nn04330267\nn02481823\nn04532670\nn12057211\nn03584254\nn04065272\nn04596742\nn02823428\nn01494475\nn03133878\nn07579787\nn04141975\nn03794056\nn03000684\nn04067472\nn02108422\nn04254777\nn01616318\nn03814906\nn03444034\nn04277352\nn04612504\nn02917067\nn03729826\nn02095314\nn03796401\nn04486054\nn03637318\nn02786058\nn03661043\nn03400231\nn02112350\nn03980874\nn04251144\nn01978287\nn03483316\nn03633091\nn04597913\nn02093647\nn02097474\nn02097130\nn03998194\nn01689811\nn04482393\nn02231487\nn04328186\nn03188531\nn02490219\nn04579432\nn09256479\nn03770439\nn07697537\nn02389026\nn04252225\nn03594945\nn04310018\nn01978455\nn03803284\nn03063689\nn01924916\nn03240683\nn03837869\nn02114712\nn02999410\nn04371774\nn03676483\nn02091467\nn03196217\nn03347037\nn04487081\nn03888257\nn03787032\nn01631663\nn03447721\nn02086079\nn01644373\nn09468604\nn07613480\nn04356056\nn04493381\nn06785654\nn03179701\nn01675722\nn04429376\nn02966193\nn03584254\nn03673027\nn03223299\nn03443371\nn02106382\nn04125021\nn03786901\nn04467665\nn03498962\nn03662601\nn02088632\nn02510455\nn12998815\nn02747177\nn04252077\nn12267677\nn04501370\nn02113978\nn03141823\nn01817953\nn03126707\nn03110669\nn02910353\nn03417042\nn09193705\nn02102318\nn01807496\nn02268443\nn01632777\nn02814533\nn07875152\nn01484850\nn02092339\nn02791124\nn04417672\nn03160309\nn02134418\nn03483316\nn01829413\nn02095889\nn07693725\nn04579145\nn03942813\nn020911
34\nn04209239\nn07584110\nn04590129\nn03873416\nn02105056\nn02488291\nn04136333\nn01855032\nn04525305\nn04039381\nn02025239\nn03476991\nn01614925\nn01735189\nn02894605\nn04505470\nn02127052\nn12267677\nn02865351\nn03481172\nn02445715\nn02892767\nn02974003\nn03249569\nn01860187\nn01687978\nn03733805\nn03445777\nn02676566\nn07734744\nn03544143\nn03676483\nn03877845\nn03372029\nn03977966\nn02090721\nn03676483\nn02655020\nn02134418\nn02364673\nn02110627\nn03527444\nn04317175\nn02280649\nn02788148\nn02119789\nn02804610\nn04435653\nn02120505\nn02802426\nn02606052\nn07717410\nn03290653\nn03017168\nn02087046\nn02093647\nn04259630\nn01819313\nn03467068\nn02113712\nn03935335\nn02927161\nn02113186\nn03673027\nn04200800\nn04192698\nn01518878\nn03417042\nn02093754\nn02088364\nn02749479\nn01688243\nn04070727\nn04604644\nn02457408\nn06874185\nn04483307\nn02422106\nn01692333\nn02834397\nn03485794\nn02219486\nn01950731\nn02028035\nn01644900\nn03125729\nn12144580\nn01682714\nn03843555\nn03602883\nn02018795\nn03447447\nn02865351\nn03223299\nn03355925\nn04592741\nn02106662\nn02033041\nn01820546\nn03761084\nn02165105\nn02397096\nn02101556\nn04328186\nn03933933\nn03355925\nn04328186\nn03950228\nn03134739\nn03535780\nn01748264\nn04330267\nn02699494\nn01985128\nn02978881\nn04141327\nn02403003\nn02120079\nn07579787\nn02317335\nn02509815\nn04146614\nn01944390\nn04467665\nn02927161\nn12620546\nn02098286\nn01914609\nn02486410\nn02963159\nn03085013\nn04525305\nn04141076\nn01742172\nn01798484\nn02102480\nn01729322\nn03938244\nn02096585\nn04099969\nn02437616\nn03729826\nn01829413\nn03527444\nn04086273\nn02013706\nn03594734\nn02105855\nn04536866\nn02489166\nn02093991\nn02109525\nn01930112\nn01580077\nn02457408\nn04328186\nn01751748\nn03026506\nn04235860\nn02113023\nn03063689\nn01882714\nn03930630\nn03710721\nn04264628\nn04081281\nn04116512\nn04044716\nn01697457\nn04330267\nn02860847\nn02107908\nn04399382\nn03873416\nn04509417\nn03792972\nn02102318\nn01883070\nn07742313\nn02033041\nn12620546\nn0399
5372\nn02086646\nn03485794\nn07747607\nn02098413\nn03877472\nn02106550\nn04263257\nn02134418\nn04263257\nn04606251\nn01630670\nn02280649\nn02504013\nn02871525\nn04081281\nn03782006\nn01514668\nn02396427\nn02093428\nn02979186\nn04254777\nn04009552\nn03602883\nn07747607\nn04562935\nn02033041\nn04505470\nn02906734\nn03045698\nn01629819\nn04613696\nn07717556\nn02487347\nn01917289\nn01817953\nn07753275\nn02457408\nn02992529\nn01742172\nn03950228\nn03584254\nn02526121\nn01494475\nn02085936\nn02391049\nn04355933\nn03950228\nn03584829\nn02128385\nn01872401\nn02091467\nn03481172\nn04204347\nn03899768\nn02107312\nn02692877\nn04606251\nn03770679\nn07749582\nn01558993\nn02099712\nn03792782\nn03791053\nn04317175\nn02086079\nn02480855\nn01682714\nn04509417\nn03792972\nn02108551\nn02606052\nn03995372\nn04336792\nn02490219\nn07695742\nn12998815\nn03759954\nn04265275\nn02971356\nn03661043\nn02120505\nn01530575\nn03690938\nn02422106\nn02120079\nn07873807\nn04579432\nn03930313\nn09288635\nn02509815\nn03998194\nn03791053\nn01930112\nn03991062\nn02125311\nn02909870\nn07718747\nn01729322\nn02133161\nn03763968\nn03944341\nn01943899\nn02445715\nn04443257\nn02109047\nn04141327\nn03041632\nn01592084\nn02906734\nn01828970\nn03388549\nn01917289\nn02859443\nn02110958\nn03956157\nn02797295\nn02100583\nn02776631\nn03485407\nn04285008\nn03623198\nn01753488\nn03146219\nn03535780\nn12768682\nn12768682\nn02100583\nn03976657\nn04251144\nn03444034\nn03980874\nn02066245\nn01692333\nn03223299\nn04461696\nn09835506\nn02206856\nn13040303\nn02088094\nn02487347\nn03781244\nn03832673\nn02917067\nn01806567\nn03776460\nn04208210\nn04462240\nn02093428\nn02123045\nn03047690\nn04201297\nn02895154\nn04252225\nn03837869\nn01877812\nn03961711\nn01753488\nn02105505\nn02112018\nn02110627\nn02389026\nn02782093\nn02099712\nn03742115\nn04141076\nn01735189\nn02879718\nn03594734\nn04462240\nn02788148\nn02106166\nn03991062\nn01820546\nn04259630\nn04310018\nn15075141\nn03717622\nn03595614\nn03598930\nn02132136\nn03630383\nn03
692522\nn04591157\nn04154565\nn02346627\nn02687172\nn07693725\nn02514041\nn02128757\nn02095314\nn01855032\nn03942813\nn03485407\nn13133613\nn03062245\nn03447447\nn02895154\nn04380533\nn02364673\nn03146219\nn02109961\nn02113799\nn02859443\nn01558993\nn02119789\nn01930112\nn04275548\nn03602883\nn02497673\nn02037110\nn03026506\nn07930864\nn04330267\nn02480495\nn02107683\nn03786901\nn01917289\nn03133878\nn04532670\nn01775062\nn03633091\nn03777568\nn01945685\nn03109150\nn03792972\nn02895154\nn04548362\nn02114855\nn03775071\nn07717556\nn02483362\nn02909870\nn02027492\nn07584110\nn03594734\nn03642806\nn03877845\nn03379051\nn02927161\nn04417672\nn04009552\nn04004767\nn02799071\nn03874599\nn01883070\nn03933933\nn03450230\nn01698640\nn03146219\nn02113023\nn03379051\nn03160309\nn01968897\nn03976467\nn04328186\nn02018207\nn02123597\nn02791124\nn01729977\nn04228054\nn02966687\nn02094258\nn03425413\nn01819313\nn02100236\nn02389026\nn02108551\nn02085620\nn03791053\nn03916031\nn01871265\nn01698640\nn02100877\nn03146219\nn03903868\nn03803284\nn04204238\nn04037443\nn02128925\nn03131574\nn02823428\nn09421951\nn03884397\nn07742313\nn03871628\nn01770081\nn04540053\nn03000134\nn02443114\nn04476259\nn04317175\nn02091032\nn07248320\nn04146614\nn04532106\nn07920052\nn02484975\nn04612504\nn01530575\nn03929660\nn04540053\nn01796340\nn01828970\nn04162706\nn03481172\nn03983396\nn02777292\nn02018795\nn02869837\nn02835271\nn03201208\nn01518878\nn12057211\nn03787032\nn02641379\nn04554684\nn02791124\nn01819313\nn02389026\nn04090263\nn03908618\nn03792972\nn02484975\nn07590611\nn01530575\nn12985857\nn09229709\nn01755581\nn03627232\nn02123159\nn03775546\nn04596742\nn04346328\nn02669723\nn07753592\nn07613480\nn03884397\nn02892201\nn01924916\nn04467665\nn02488291\nn03868242\nn02356798\nn04265275\nn02077923\nn02102973\nn03457902\nn02190166\nn03259280\nn02105162\nn02091831\nn02256656\nn01872401\nn02493793\nn02408429\nn02106550\nn03929660\nn03325584\nn04332243\nn04270147\nn01630670\nn03250847\nn02114367\nn
02106166\nn03134739\nn02814860\nn02110063\nn03903868\nn02395406\nn04311174\nn03532672\nn02840245\nn01986214\nn04429376\nn02119022\nn03218198\nn02783161\nn03770439\nn02089867\nn02966687\nn03658185\nn09193705\nn03085013\nn02971356\nn04049303\nn11939491\nn02105641\nn03494278\nn02364673\nn01534433\nn01735189\nn02105855\nn03743016\nn07718472\nn02113799\nn04443257\nn02096294\nn02128925\nn02264363\nn03796401\nn02444819\nn03770679\nn02093647\nn03483316\nn02107574\nn04127249\nn02978881\nn13054560\nn02823750\nn03794056\nn03000684\nn01496331\nn01807496\nn02791270\nn01860187\nn03218198\nn02364673\nn03498962\nn04153751\nn01688243\nn03388183\nn01968897\nn02172182\nn02112018\nn02883205\nn03854065\nn12267677\nn02094258\nn04254120\nn01855672\nn02100877\nn03344393\nn07693725\nn02669723\nn02264363\nn03763968\nn03637318\nn04447861\nn01984695\nn12267677\nn04335435\nn02120505\nn02104365\nn03450230\nn04286575\nn03207941\nn02106166\nn03325584\nn03793489\nn03788365\nn03877845\nn02190166\nn02051845\nn02100583\nn02104029\nn06359193\nn01514859\nn02106550\nn02165456\nn02276258\nn01514859\nn03485407\nn01632777\nn02408429\nn03124043\nn03717622\nn04252225\nn04517823\nn03425413\nn04310018\nn03017168\nn03832673\nn01770081\nn03127925\nn02089867\nn03461385\nn03485407\nn01592084\nn02256656\nn03146219\nn01795545\nn03947888\nn07693725\nn04483307\nn02002556\nn04532670\nn04049303\nn02892201\nn03857828\nn01494475\nn01601694\nn04131690\nn02666196\nn02098286\nn02641379\nn04228054\nn03980874\nn04590129\nn01616318\nn03690938\nn04127249\nn03345487\nn02113023\nn01749939\nn04229816\nn02927161\nn03956157\nn02111500\nn01756291\nn02492035\nn02119022\nn02443114\nn02950826\nn02319095\nn04346328\nn02128757\nn03998194\nn02667093\nn01943899\nn04467665\nn01530575\nn01614925\nn04346328\nn02093754\nn03733805\nn03742115\nn03197337\nn02107908\nn01737021\nn02281787\nn03141823\nn04254120\nn01532829\nn02526121\nn02966687\nn02484975\nn03832673\nn02113799\nn03958227\nn04350905\nn03623198\nn06874185\nn03337140\nn02097658\nn04311174\
nn04201297\nn03908714\nn01740131\nn03929855\nn02509815\nn03903868\nn03658185\nn01843065\nn04557648\nn04392985\nn02454379\nn02493793\nn04275548\nn03220513\nn02606052\nn04118776\nn02514041\nn07684084\nn03388183\nn02794156\nn01632777\nn04238763\nn04372370\nn03876231\nn02948072\nn02096437\nn02497673\nn03843555\nn07565083\nn02097130\nn04509417\nn03255030\nn02129165\nn01682714\nn07753275\nn09472597\nn02134418\nn02219486\nn02097047\nn03063689\nn02091467\nn03781244\nn02807133\nn03814906\nn04355338\nn04579145\nn03272010\nn02086646\nn02106662\nn03956157\nn02783161\nn02112137\nn03188531\nn03126707\nn01608432\nn03337140\nn01847000\nn04125021\nn04147183\nn07720875\nn02319095\nn02510455\nn04311174\nn03584254\nn04542943\nn02102480\nn02114712\nn02268443\nn07718472\nn03792972\nn03724870\nn04239074\nn02091134\nn02129604\nn03127925\nn02086646\nn03207941\nn01819313\nn04522168\nn03271574\nn04487394\nn03710193\nn02105855\nn03131574\nn02105251\nn02095889\nn03384352\nn07880968\nn02259212\nn04069434\nn01669191\nn03710193\nn01855672\nn13037406\nn01484850\nn04476259\nn03871628\nn01774750\nn02108551\nn02090622\nn03733281\nn03724870\nn03976657\nn02099267\nn04127249\nn02097474\nn02056570\nn01795545\nn07714571\nn02107142\nn01608432\nn02113023\nn04486054\nn03876231\nn04270147\nn03461385\nn13040303\nn02102318\nn02910353\nn02094114\nn02786058\nn02992211\nn02396427\nn04344873\nn02097130\nn01443537\nn04325704\nn02093428\nn04258138\nn07584110\nn03443371\nn03481172\nn02110341\nn04141975\nn02226429\nn02281406\nn04141327\nn04118538\nn02037110\nn02226429\nn01692333\nn03916031\nn02787622\nn03594945\nn07860988\nn03729826\nn04515003\nn04612504\nn02007558\nn01560419\nn02951358\nn02837789\nn04456115\nn04239074\nn02094433\nn04553703\nn03045698\nn03874599\nn03595614\nn02514041\nn03876231\nn04467665\nn04146614\nn02089973\nn04005630\nn04266014\nn04074963\nn03527444\nn04355338\nn09246464\nn03980874\nn01990800\nn03697007\nn13133613\nn07613480\nn02655020\nn03240683\nn04111531\nn01871265\nn01695060\nn03478589\nn0426527
5\nn02094433\nn02009229\nn02708093\nn03447447\nn03216828\nn04371430\nn03991062\nn02607072\nn02481823\nn02102318\nn09256479\nn02123597\nn02927161\nn01737021\nn01675722\nn11939491\nn03937543\nn03729826\nn01820546\nn01847000\nn02112137\nn01675722\nn04613696\nn02974003\nn03384352\nn03627232\nn04429376\nn01756291\nn03496892\nn02398521\nn02168699\nn03000247\nn01739381\nn04371430\nn04335435\nn03532672\nn02441942\nn03400231\nn03793489\nn01795545\nn01740131\nn02110806\nn03063599\nn02095314\nn04579432\nn04591157\nn02321529\nn03661043\nn01440764\nn04228054\nn04462240\nn03877472\nn03720891\nn02514041\nn03272562\nn01601694\nn02091467\nn04041544\nn03796401\nn03594734\nn02089078\nn02493793\nn01440764\nn09399592\nn03775071\nn04296562\nn02099849\nn02804610\nn03384352\nn02088632\nn04026417\nn02794156\nn01968897\nn02133161\nn03777754\nn02494079\nn02107142\nn03710193\nn02640242\nn04209133\nn02443114\nn03259280\nn02172182\nn02089078\nn04049303\nn02093647\nn06785654\nn03733131\nn03476991\nn04259630\nn01768244\nn13037406\nn02168699\nn02013706\nn02089078\nn01817953\nn02280649\nn02877765\nn04273569\nn02097209\nn06785654\nn02104365\nn02107908\nn02484975\nn02906734\nn09468604\nn01632777\nn01494475\nn01983481\nn04372370\nn02364673\nn02730930\nn02100583\nn04127249\nn03355925\nn02108089\nn03197337\nn03857828\nn01496331\nn02110341\nn04074963\nn02087046\nn03000684\nn03485794\nn02500267\nn02105162\nn03425413\nn01944390\nn02112018\nn04005630\nn01582220\nn04275548\nn07754684\nn02011460\nn02132136\nn01748264\nn04228054\nn02980441\nn02113624\nn04597913\nn02123159\nn02027492\nn04590129\nn02114548\nn03208938\nn02099267\nn03538406\nn03218198\nn04254120\nn03337140\nn02089078\nn02701002\nn02086240\nn02088632\nn01943899\nn13052670\nn04606251\nn09229709\nn01687978\nn03929660\nn02093754\nn01729322\nn02107908\nn07715103\nn03773504\nn04592741\nn02107908\nn02264363\nn04154565\nn02098105\nn03485794\nn02791270\nn06874185\nn02488702\nn03014705\nn03657121\nn03854065\nn02107574\nn02669723\nn03950228\nn02317335\nn04133
789\nn01685808\nn03933933\nn02097047\nn02011460\nn01819313\nn03982430\nn01784675\nn03670208\nn03220513\nn04118538\nn02782093\nn02783161\nn03496892\nn02107574\nn04040759\nn02013706\nn02777292\nn01775062\nn01748264\nn03018349\nn04111531\nn02089867\nn09246464\nn04548280\nn07734744\nn03291819\nn04552348\nn03871628\nn07753113\nn01729322\nn07715103\nn04596742\nn02128385\nn03976467\nn04548280\nn02497673\nn02134418\nn02105251\nn03970156\nn01749939\nn01795545\nn01855032\nn02395406\nn02098413\nn02111500\nn02895154\nn07565083\nn03742115\nn02108089\nn02321529\nn02971356\nn02437616\nn03208938\nn01667114\nn02226429\nn03877845\nn02910353\nn04070727\nn04152593\nn01883070\nn02870880\nn02504458\nn04243546\nn02096051\nn03899768\nn02321529\nn03877845\nn03450230\nn03290653\nn01664065\nn03908714\nn01537544\nn02088238\nn01882714\nn01773549\nn04418357\nn02727426\nn01872401\nn02106382\nn03991062\nn02017213\nn02018207\nn04370456\nn02219486\nn02669723\nn01694178\nn01784675\nn03443371\nn02114548\nn01806567\nn04090263\nn07932039\nn01608432\nn02281406\nn04238763\nn01664065\nn02028035\nn01917289\nn03793489\nn04209239\nn03042490\nn03400231\nn02356798\nn03065424\nn04335435\nn01664065\nn01692333\nn07880968\nn03297495\nn02841315\nn03095699\nn07697313\nn09399592\nn01917289\nn03724870\nn13133613\nn03787032\nn02493793\nn03843555\nn01629819\nn03843555\nn04461696\nn01669191\nn03976657\nn02097047\nn03773504\nn02951585\nn04398044\nn03599486\nn03250847\nn03796401\nn01737021\nn02776631\nn03599486\nn02110806\nn04254680\nn02138441\nn02483362\nn02747177\nn03733805\nn04118538\nn01829413\nn02112137\nn02102318\nn02097474\nn02119789\nn04136333\nn04579432\nn02493509\nn01667778\nn02442845\nn02097209\nn03404251\nn02488291\nn02091032\nn01882714\nn04081281\nn02963159\nn02088632\nn01491361\nn04380533\nn04423845\nn01629819\nn03956157\nn04548362\nn02804610\nn04310018\nn04251144\nn07860988\nn02692877\nn03938244\nn01484850\nn04325704\nn01560419\nn02916936\nn02442845\nn03998194\nn04330267\nn03425413\nn07932039\nn01984695\nn033
45487\nn03259280\nn07768694\nn02444819\nn01675722\nn02328150\nn04070727\nn04423845\nn03729826\nn07684084\nn03485794\nn03498962\nn01753488\nn03958227\nn02895154\nn03100240\nn02110806\nn04118776\nn02105056\nn03874293\nn04037443\nn03496892\nn07745940\nn03871628\nn03372029\nn02100735\nn02132136\nn03623198\nn03666591\nn02823750\nn01735189\nn02106382\nn07697537\nn02454379\nn04311004\nn03110669\nn04009552\nn02074367\nn02442845\nn02099601\nn09246464\nn03814906\nn04049303\nn01749939\nn03803284\nn02667093\nn03908714\nn04409515\nn03290653\nn07730033\nn02268443\nn03028079\nn02514041\nn04592741\nn07720875\nn02988304\nn02606052\nn03877472\nn01798484\nn03742115\nn04461696\nn02917067\nn01629819\nn04486054\nn04548362\nn02860847\nn02107683\nn01944390\nn03786901\nn04044716\nn01824575\nn01440764\nn02279972\nn01914609\nn03272562\nn07590611\nn01728572\nn01687978\nn03791053\nn01518878\nn02950826\nn03982430\nn02966193\nn03841143\nn02672831\nn02787622\nn02165105\nn04525038\nn03662601\nn12057211\nn04522168\nn04613696\nn02088632\nn01985128\nn09472597\nn03271574\nn01687978\nn04147183\nn07875152\nn01580077\nn03393912\nn03903868\nn04074963\nn03788365\nn01843065\nn03690938\nn02105056\nn04525305\nn01631663\nn02097047\nn02486410\nn04152593\nn02879718\nn04443257\nn02102040\nn02093859\nn02127052\nn09332890\nn01770393\nn03527444\nn03697007\nn04515003\nn07873807\nn04429376\nn03991062\nn03085013\nn01828970\nn01608432\nn03930313\nn02105641\nn01756291\nn02500267\nn04039381\nn02168699\nn03259280\nn01855032\nn10565667\nn02115641\nn04515003\nn02669723\nn02988304\nn03825788\nn02025239\nn03706229\nn01914609\nn03344393\nn04049303\nn03259280\nn02091244\nn02514041\nn03065424\nn12057211\nn02027492\nn04118538\nn04141076\nn03899768\nn04462240\nn02096051\nn02978881\nn02114855\nn04509417\nn04505470\nn03201208\nn01986214\nn02417914\nn01677366\nn07747607\nn04409515\nn01685808\nn04599235\nn03187595\nn03657121\nn15075141\nn04372370\nn02966687\nn01820546\nn03344393\nn03476991\nn03763968\nn04070727\nn03041632\nn01877812\nn0
7248320\nn07875152\nn02892767\nn03355925\nn01685808\nn04228054\nn03843555\nn01755581\nn04347754\nn02277742\nn03000247\nn07742313\nn07875152\nn03075370\nn02799071\nn03133878\nn06596364\nn01806143\nn03930313\nn03930313\nn02730930\nn01773797\nn03902125\nn03721384\nn02951358\nn02119022\nn01744401\nn02112706\nn02396427\nn03633091\nn01514668\nn03791053\nn02395406\nn04370456\nn03657121\nn02096585\nn02107312\nn03970156\nn03126707\nn02105251\nn02442845\nn04461696\nn07715103\nn03873416\nn01677366\nn02012849\nn03527444\nn01798484\nn04562935\nn02279972\nn02423022\nn03992509\nn01592084\nn03788195\nn02259212\nn04462240\nn03929660\nn02090622\nn04254120\nn01592084\nn02109961\nn03769881\nn02268443\nn02909870\nn01641577\nn04550184\nn04507155\nn01630670\nn04152593\nn02090379\nn01983481\nn09421951\nn04517823\nn01744401\nn07745940\nn01843383\nn03476684\nn01735189\nn03930313\nn03916031\nn02093991\nn03207743\nn02787622\nn02106166\nn04398044\nn04428191\nn04209133\nn02085620\nn09835506\nn01871265\nn03459775\nn02089973\nn02643566\nn02481823\nn02123159\nn07875152\nn04557648\nn03196217\nn04033995\nn02037110\nn01955084\nn03089624\nn01751748\nn02099429\nn03325584\nn03445777\nn03902125\nn02116738\nn02799071\nn02843684\nn03109150\nn02869837\nn06794110\nn03908618\nn02105251\nn02790996\nn02966687\nn09256479\nn02939185\nn04417672\nn02113624\nn04266014\nn02174001\nn02483362\nn03127925\nn03717622\nn01744401\nn01739381\nn02606052\nn03290653\nn04330267\nn02486410\nn02457408\nn04355338\nn01498041\nn02134418\nn01440764\nn04552348\nn02319095\nn03781244\nn07730033\nn04525038\nn02018795\nn03494278\nn04589890\nn01829413\nn04456115\nn04118776\nn02687172\nn02992529\nn07932039\nn03075370\nn04557648\nn01728920\nn01688243\nn02443484\nn03843555\nn03786901\nn03016953\nn02536864\nn04125021\nn01514668\nn04461696\nn01983481\nn02493509\nn07614500\nn01776313\nn02091467\nn02106030\nn02814860\nn02002556\nn01818515\nn03160309\nn02092339\nn02013706\nn01753488\nn01739381\nn02981792\nn01753488\nn02704792\nn09332890\nn02317335\n
n03255030\nn04201297\nn02093256\nn01688243\nn03792782\nn03028079\nn01944390\nn02107908\nn03803284\nn03775546\nn02128757\nn04542943\nn04560804\nn02514041\nn04204347\nn02916936\nn03344393\nn02364673\nn03942813\nn01614925\nn02494079\nn04542943\nn07742313\nn02490219\nn03843555\nn02281406\nn02493793\nn02123597\nn04613696\nn01796340\nn07753592\nn03384352\nn03916031\nn03908714\nn03992509\nn04201297\nn03637318\nn02977058\nn02091032\nn02494079\nn03673027\nn04548362\nn01950731\nn03721384\nn02999410\nn02483362\nn02111277\nn03709823\nn02087046\nn03929660\nn07930864\nn03954731\nn03063599\nn03692522\nn02018207\nn03788195\nn04040759\nn02011460\nn07871810\nn03690938\nn04486054\nn01986214\nn04591713\nn04127249\nn01807496\nn02095570\nn01981276\nn02128925\nn02992529\nn02815834\nn01698640\nn01632458\nn02492660\nn02319095\nn03938244\nn03876231\nn01798484\nn03666591\nn02110806\nn03782006\nn01943899\nn02643566\nn04120489\nn04399382\nn02085782\nn04389033\nn07714571\nn01614925\nn03494278\nn04141076\nn03388043\nn04118776\nn03291819\nn02389026\nn04209133\nn01685808\nn03769881\nn04074963\nn04458633\nn04532670\nn02484975\nn07579787\nn02058221\nn03000134\nn01704323\nn04044716\nn03000684\nn03179701\nn07716906\nn01518878\nn02497673\nn03445924\nn02093647\nn02410509\nn03026506\nn04153751\nn04141076\nn03532672\nn04201297\nn07836838\nn03188531\nn02486410\nn04275548\nn02133161\nn03394916\nn02098105\nn04376876\nn02106382\nn03483316\nn02490219\nn03032252\nn03770439\nn02025239\nn03840681\nn03496892\nn03633091\nn02837789\nn03126707\nn02104365\nn04584207\nn04347754\nn04243546\nn02110185\nn02865351\nn02167151\nn02871525\nn02088466\nn02138441\nn02804610\nn03935335\nn02782093\nn01744401\nn09472597\nn03445924\nn01737021\nn02102480\nn02086646\nn02137549\nn02481823\nn02107574\nn02096437\nn02701002\nn03272562\nn02978881\nn01737021\nn01824575\nn03887697\nn02097298\nn03692522\nn02437312\nn03814639\nn02236044\nn02094433\nn07742313\nn04398044\nn03255030\nn04258138\nn02422106\nn06785654\nn02319095\nn03692522\nn04350905
\nn04252077\nn03804744\nn03131574\nn02107312\nn07583066\nn02006656\nn01608432\nn04428191\nn04346328\nn02493793\nn04040759\nn03733281\nn02093754\nn01677366\nn02481823\nn11939491\nn13044778\nn04070727\nn02500267\nn03347037\nn03942813\nn03218198\nn02747177\nn04286575\nn01530575\nn02437312\nn02090379\nn04447861\nn01843383\nn01629819\nn01871265\nn02077923\nn02105162\nn03873416\nn02106662\nn02096437\nn02132136\nn03000684\nn01917289\nn02777292\nn02077923\nn02110063\nn02027492\nn02124075\nn04467665\nn04192698\nn04525305\nn12057211\nn02894605\nn02108551\nn04392985\nn01742172\nn02825657\nn04336792\nn04265275\nn02172182\nn02483362\nn02168699\nn02088094\nn02128925\nn03764736\nn02113712\nn03197337\nn03393912\nn03804744\nn07697313\nn03770679\nn02795169\nn02104365\nn10148035\nn01534433\nn03089624\nn10565667\nn04536866\nn02259212\nn01828970\nn01667114\nn02110958\nn03841143\nn03325584\nn03450230\nn04423845\nn04149813\nn02802426\nn03876231\nn03868242\nn07614500\nn04356056\nn02128925\nn03379051\nn02099712\nn02870880\nn02085936\nn13044778\nn03388043\nn02113712\nn02113624\nn03141823\nn02110627\nn03394916\nn04548362\nn02927161\nn01914609\nn04275548\nn03271574\nn03527444\nn01530575\nn03775546\nn02965783\nn02105505\nn03982430\nn04258138\nn03201208\nn07684084\nn02437616\nn03388043\nn04389033\nn02841315\nn03250847\nn02480495\nn01749939\nn12998815\nn02114712\nn02056570\nn03602883\nn02281406\nn02086079\nn03769881\nn03791053\nn02165456\nn02747177\nn13040303\nn04023962\nn02948072\nn04243546\nn02690373\nn04442312\nn03837869\nn04417672\nn13054560\nn02106166\nn01776313\nn02667093\nn07565083\nn13133613\nn07730033\nn02488291\nn04423845\nn03623198\nn03977966\nn03866082\nn02100735\nn02834397\nn04461696\nn02089078\nn01694178\nn01944390\nn03706229\nn03223299\nn03980874\nn03991062\nn04004767\nn04201297\nn03761084\nn03443371\nn02033041\nn02138441\nn01924916\nn04133789\nn06359193\nn02091032\nn02981792\nn03180011\nn04522168\nn04317175\nn02106662\nn01847000\nn12768682\nn03496892\nn02892767\nn07684084\nn018778
12\nn03345487\nn03495258\nn03661043\nn01990800\nn03417042\nn04330267\nn01443537\nn02397096\nn01582220\nn01910747\nn02025239\nn03724870\nn02787622\nn02892201\nn02086079\nn04417672\nn04550184\nn04525305\nn03877845\nn07718472\nn04266014\nn02396427\nn01773797\nn02009912\nn01795545\nn02120079\nn02105505\nn04252077\nn07734744\nn02793495\nn04372370\nn02667093\nn01629819\nn02493793\nn02640242\nn01748264\nn02134418\nn04335435\nn02966687\nn01608432\nn03325584\nn02013706\nn02364673\nn02791124\nn02979186\nn04493381\nn03045698\nn03032252\nn02092339\nn01806143\nn03535780\nn02319095\nn04562935\nn01873310\nn02279972\nn02124075\nn03482405\nn02056570\nn02823750\nn02823428\nn01443537\nn02860847\nn02690373\nn03825788\nn04461696\nn02106030\nn01983481\nn01632777\nn04562935\nn01847000\nn03661043\nn03272010\nn02113978\nn04550184\nn02699494\nn04505470\nn01629819\nn03944341\nn03792782\nn02071294\nn02114367\nn04536866\nn02910353\nn03355925\nn03908618\nn02786058\nn02097047\nn02088094\nn02089867\nn04356056\nn02095570\nn01756291\nn02441942\nn04208210\nn07693725\nn02088094\nn06596364\nn02992529\nn04081281\nn03467068\nn01847000\nn01693334\nn03680355\nn04501370\nn03763968\nn01917289\nn02669723\nn01924916\nn02110958\nn04041544\nn02110806\nn02134084\nn02130308\nn02443484\nn02843684\nn01968897\nn01855672\nn02113799\nn03584829\nn12768682\nn01531178\nn03197337\nn01784675\nn03075370\nn04252077\nn03935335\nn02999410\nn07716358\nn04238763\nn07753275\nn02279972\nn02666196\nn02007558\nn02105251\nn02226429\nn01751748\nn02127052\nn04579145\nn02051845\nn02445715\nn02102177\nn03759954\nn03179701\nn02007558\nn03649909\nn03992509\nn03447721\nn02916936\nn03196217\nn01883070\nn01983481\nn03000684\nn01756291\nn02111277\nn03857828\nn04479046\nn02177972\nn04067472\nn03444034\nn03854065\nn03720891\nn04208210\nn01740131\nn04423845\nn01855672\nn03388549\nn02206856\nn04606251\nn03887697\nn02865351\nn04579145\nn01496331\nn02804414\nn02787622\nn04004767\nn02097047\nn02490219\nn03529860\nn03680355\nn03942813\nn01632458\nn0373
3281\nn03584829\nn02797295\nn02966687\nn01824575\nn07831146\nn04366367\nn03666591\nn03788195\nn02966193\nn03042490\nn06874185\nn03345487\nn02123597\nn02895154\nn01664065\nn01819313\nn12985857\nn01855672\nn02095314\nn02102973\nn02966193\nn02115913\nn03590841\nn02093991\nn02169497\nn02814860\nn02089078\nn02138441\nn02113712\nn02883205\nn01601694\nn01774384\nn04111531\nn03000134\nn02088364\nn02489166\nn01914609\nn04009552\nn03680355\nn03843555\nn03950228\nn03680355\nn04597913\nn04347754\nn04116512\nn02747177\nn01514668\nn02840245\nn03483316\nn07715103\nn04153751\nn02500267\nn03998194\nn15075141\nn03930313\nn02112706\nn03888257\nn02110063\nn02108000\nn02102973\nn02483708\nn02097474\nn02011460\nn02492035\nn02814860\nn02009229\nn03877845\nn06596364\nn07248320\nn04344873\nn04536866\nn02823750\nn03291819\nn01770081\nn02892767\nn03481172\nn02066245\nn04370456\nn02264363\nn03670208\nn02397096\nn03075370\nn02087394\nn02536864\nn04599235\nn03982430\nn04523525\nn04522168\nn13052670\nn03633091\nn04067472\nn02988304\nn04486054\nn01677366\nn02492660\nn03127747\nn02112350\nn04336792\nn03417042\nn13133613\nn01608432\nn02865351\nn02129165\nn01773157\nn04258138\nn04041544\nn04252077\nn03197337\nn03794056\nn03877845\nn04346328\nn02086910\nn01694178\nn03445924\nn04532670\nn03781244\nn04141975\nn03124170\nn03874293\nn03498962\nn01739381\nn02791270\nn07892512\nn03444034\nn02105162\nn01734418\nn04070727\nn02916936\nn03840681\nn04399382\nn07749582\nn02480495\nn04515003\nn01688243\nn02107142\nn01914609\nn01742172\nn07753113\nn01828970\nn01797886\nn04606251\nn03062245\nn03400231\nn03483316\nn02978881\nn02109047\nn02795169\nn01728920\nn03530642\nn04209133\nn02105641\nn02111277\nn01737021\nn02092339\nn04589890\nn02454379\nn12267677\nn03627232\nn01990800\nn02109047\nn03314780\nn01798484\nn03691459\nn02669723\nn03781244\nn03467068\nn01770081\nn01796340\nn03930313\nn02226429\nn02514041\nn02356798\nn07880968\nn04131690\nn02807133\nn03841143\nn02346627\nn02397096\nn02963159\nn02641379\nn02093428\nn01
537544\nn02814860\nn04074963\nn02109525\nn02085782\nn02102973\nn02319095\nn02437616\nn02395406\nn02488291\nn03777568\nn03710193\nn09421951\nn03838899\nn04004767\nn02011460\nn02526121\nn02112018\nn02687172\nn02825657\nn01882714\nn01968897\nn03196217\nn02101556\nn04389033\nn04127249\nn04254680\nn03063689\nn04125021\nn01689811\nn04325704\nn02137549\nn10565667\nn02391049\nn07836838\nn04584207\nn02423022\nn02088364\nn03961711\nn02457408\nn03535780\nn02412080\nn03017168\nn02979186\nn02676566\nn01860187\nn02423022\nn03891332\nn01494475\nn01704323\nn04423845\nn03976467\nn02091831\nn02101006\nn01491361\nn03063689\nn01910747\nn01784675\nn03967562\nn02094114\nn04065272\nn01534433\nn04372370\nn02879718\nn02871525\nn02168699\nn01784675\nn03492542\nn02101388\nn07718472\nn02110185\nn12998815\nn03127925\nn03207743\nn12057211\nn07565083\nn04525038\nn04118776\nn01616318\nn02965783\nn02206856\nn03899768\nn01687978\nn03379051\nn02104029\nn04229816\nn03124170\nn02281406\nn03032252\nn02101556\nn02980441\nn03485794\nn04366367\nn02492035\nn03599486\nn04548362\nn03764736\nn07760859\nn01978287\nn04505470\nn02488291\nn02782093\nn03417042\nn02486261\nn03843555\nn02319095\nn02493509\nn01798484\nn03857828\nn03950228\nn02791124\nn03207941\nn01751748\nn03916031\nn04074963\nn03724870\nn13133613\nn03937543\nn03255030\nn04372370\nn02168699\nn03920288\nn02514041\nn02112350\nn01443537\nn01807496\nn04070727\nn01675722\nn01518878\nn03599486\nn04162706\nn04147183\nn01795545\nn01698640\nn01873310\nn07718472\nn04033995\nn04418357\nn04429376\nn02110806\nn01944390\nn09835506\nn02092339\nn02948072\nn01978455\nn02100236\nn03710193\nn04517823\nn04154565\nn03761084\nn02346627\nn02672831\nn02422106\nn01664065\nn04125021\nn03450230\nn03980874\nn03642806\nn03866082\nn01494475\nn01910747\nn02229544\nn01770393\nn02114367\nn07920052\nn01872401\nn02109047\nn03884397\nn02704792\nn07716906\nn03843555\nn03095699\nn04532106\nn02093754\nn02879718\nn04515003\nn07718747\nn02094258\nn03838899\nn03126707\nn07730033\nn03085013\nn
03680355\nn02123045\nn02279972\nn02086240\nn02134418\nn03388549\nn03637318\nn03345487\nn04517823\nn03476991\nn07734744\nn03602883\nn04371774\nn04229816\nn03249569\nn02676566\nn02011460\nn02916936\nn01806567\nn02814533\nn01560419\nn03970156\nn01978455\nn02823750\nn02883205\nn02110627\nn03787032\nn10148035\nn04596742\nn04033995\nn02444819\nn03954731\nn04311174\nn02095889\nn01914609\nn03710193\nn02782093\nn01820546\nn02091134\nn04355933\nn02389026\nn04090263\nn04254120\nn01820546\nn01641577\nn02106550\nn02326432\nn03532672\nn03065424\nn07836838\nn02786058\nn04235860\nn04264628\nn02091244\nn03773504\nn02013706\nn04458633\nn04270147\nn07711569\nn04325704\nn03017168\nn02112350\nn04192698\nn02769748\nn02096051\nn04149813\nn02483708\nn04040759\nn04265275\nn02071294\nn07873807\nn02488702\nn04200800\nn02134084\nn04418357\nn04552348\nn02999410\nn02817516\nn01981276\nn02233338\nn02504458\nn02116738\nn03633091\nn03372029\nn07714990\nn04552348\nn02504458\nn02172182\nn03691459\nn02089078\nn03594734\nn02643566\nn01665541\nn01818515\nn02802426\nn03662601\nn03495258\nn01773797\nn02206856\nn03710721\nn04442312\nn02137549\nn03657121\nn04311004\nn03775071\nn03630383\nn02412080\nn01443537\nn03874293\nn03874599\nn07590611\nn04162706\nn02108551\nn07749582\nn02804414\nn03777754\nn03584829\nn02699494\nn02097298\nn03661043\nn01774750\nn03594945\nn04005630\nn07697313\nn02009229\nn03529860\nn04355933\nn03899768\nn03337140\nn02110958\nn02092339\nn02097130\nn03337140\nn01818515\nn03345487\nn01496331\nn03124043\nn02095570\nn01558993\nn03814906\nn03216828\nn03930630\nn06874185\nn02113799\nn07720875\nn03887697\nn03697007\nn02231487\nn02669723\nn02480855\nn04366367\nn03706229\nn03529860\nn03924679\nn03527444\nn01770393\nn04493381\nn04532670\nn02883205\nn04192698\nn02129604\nn02669723\nn04259630\nn02091831\nn09332890\nn01883070\nn04026417\nn03485407\nn01877812\nn01644900\nn09256479\nn04286575\nn01601694\nn04428191\nn03065424\nn03770439\nn02174001\nn02110341\nn02916936\nn04086273\nn03393912\nn02701002\
nn03991062\nn01608432\nn04273569\nn04522168\nn07760859\nn02493793\nn02804414\nn02229544\nn04009552\nn03874599\nn03649909\nn07614500\nn02094433\nn02097298\nn03662601\nn03450230\nn02093256\nn04033995\nn02113023\nn09246464\nn01704323\nn02488702\nn02096294\nn04536866\nn07873807\nn03770439\nn04409515\nn04532106\nn04542943\nn07584110\nn02808304\nn03903868\nn03888605\nn02051845\nn02115641\nn02099267\nn03452741\nn03498962\nn01945685\nn01692333\nn03930630\nn02794156\nn04311004\nn03482405\nn04540053\nn09256479\nn02607072\nn02281406\nn03991062\nn02056570\nn04243546\nn03100240\nn01532829\nn03127747\nn02119022\nn02666196\nn03379051\nn04417672\nn07920052\nn03617480\nn01818515\nn03998194\nn03388183\nn02113799\nn04344873\nn03590841\nn04228054\nn04228054\nn02231487\nn03888257\nn04086273\nn02090622\nn03933933\nn02422106\nn03720891\nn02093991\nn04347754\nn01630670\nn03843555\nn03729826\nn01644900\nn02264363\nn03126707\nn12057211\nn04461696\nn02098286\nn02276258\nn04552348\nn01514668\nn04243546\nn02871525\nn02106382\nn02100583\nn02085936\nn04487081\nn03995372\nn01601694\nn02279972\nn03444034\nn07730033\nn02011460\nn02099601\nn04536866\nn03014705\nn02486261\nn04590129\nn04265275\nn03447447\nn02102177\nn03388043\nn01665541\nn03924679\nn06874185\nn03018349\nn02403003\nn03196217\nn02132136\nn01514859\nn02397096\nn02113186\nn03924679\nn02096437\nn07831146\nn04584207\nn03777568\nn02276258\nn02108915\nn04540053\nn03874293\nn02033041\nn04270147\nn02114367\nn07730033\nn02342885\nn03929660\nn03032252\nn02992211\nn03658185\nn02777292\nn02879718\nn02319095\nn07760859\nn03888257\nn02910353\nn03868863\nn04133789\nn04136333\nn04356056\nn02028035\nn03000134\nn03355925\nn04326547\nn02494079\nn04099969\nn02966193\nn04147183\nn02966193\nn07697313\nn03877472\nn02486261\nn02510455\nn07720875\nn03764736\nn04239074\nn02443484\nn07720875\nn02840245\nn03782006\nn02119789\nn04328186\nn02417914\nn03216828\nn02108551\nn02013706\nn01734418\nn03729826\nn01689811\nn04522168\nn02422106\nn04004767\nn12620546\nn0404154
4\nn04116512\nn03478589\nn02174001\nn04486054\nn02107142\nn02422699\nn03400231\nn07930864\nn04200800\nn01582220\nn07753592\nn02690373\nn07880968\nn03958227\nn01665541\nn01847000\nn12768682\nn03478589\nn02091467\nn02787622\nn02776631\nn03000247\nn04074963\nn03743016\nn03325584\nn09246464\nn03871628\nn01740131\nn09288635\nn02730930\nn03884397\nn03775546\nn02114712\nn07718472\nn01728920\nn02494079\nn01774750\nn03967562\nn07718747\nn02906734\nn03444034\nn02408429\nn02319095\nn04330267\nn02113624\nn02231487\nn04141076\nn04552348\nn03759954\nn04120489\nn02869837\nn03838899\nn02268443\nn02321529\nn04023962\nn03843555\nn04525038\nn02361337\nn03924679\nn02236044\nn01530575\nn02877765\nn01980166\nn03777568\nn04008634\nn04579145\nn07873807\nn03207743\nn03970156\nn04254680\nn03345487\nn02454379\nn03110669\nn01980166\nn02536864\nn04285008\nn07684084\nn01924916\nn02108915\nn04074963\nn03837869\nn01882714\nn03873416\nn02169497\nn02687172\nn02268853\nn02906734\nn03018349\nn04310018\nn02978881\nn01693334\nn04542943\nn03770679\nn02123045\nn02974003\nn02086646\nn01530575\nn03786901\nn03710193\nn03388183\nn02112350\nn02113186\nn01883070\nn04552348\nn04344873\nn01773157\nn02109961\nn02123159\nn04404412\nn01917289\nn02169497\nn03899768\nn03697007\nn03874599\nn02669723\nn07717556\nn04147183\nn03424325\nn03498962\nn07715103\nn01632777\nn02264363\nn03018349\nn01669191\nn04204238\nn01829413\nn03785016\nn01871265\nn02992529\nn04127249\nn01774384\nn13040303\nn02090721\nn07615774\nn02231487\nn03126707\nn04399382\nn02127052\nn02480495\nn04357314\nn04597913\nn04311174\nn04376876\nn03344393\nn04146614\nn01622779\nn04325704\nn03527444\nn07753275\nn02422699\nn03759954\nn01824575\nn01704323\nn04067472\nn01872401\nn02114712\nn02979186\nn07615774\nn02094433\nn02106550\nn01930112\nn02086079\nn07754684\nn02088238\nn03764736\nn02077923\nn01770081\nn03763968\nn03544143\nn03777568\nn03706229\nn07871810\nn02100583\nn02096585\nn03538406\nn02794156\nn04325704\nn04127249\nn02277742\nn03314780\nn13037406\nn02607
072\nn07720875\nn02277742\nn02412080\nn13054560\nn02865351\nn03467068\nn03891251\nn02089973\nn02002724\nn02017213\nn02917067\nn01665541\nn07714990\nn03372029\nn03584254\nn03662601\nn03337140\nn02692877\nn02110627\nn04201297\nn04154565\nn03637318\nn03255030\nn07745940\nn02056570\nn03895866\nn02169497\nn01818515\nn04493381\nn03041632\nn02110627\nn04553703\nn02099429\nn09428293\nn03495258\nn02483708\nn04336792\nn02825657\nn03891251\nn01860187\nn09472597\nn01753488\nn04540053\nn02895154\nn02321529\nn03259280\nn01630670\nn03000134\nn03866082\nn01514859\nn07873807\nn02105056\nn01978455\nn02009912\nn03794056\nn03720891\nn03995372\nn02869837\nn02169497\nn03425413\nn04355338\nn02977058\nn02916936\nn03840681\nn04560804\nn03042490\nn07734744\nn03706229\nn01774384\nn03530642\nn02346627\nn02105251\nn02229544\nn04522168\nn03535780\nn02105505\nn02168699\nn02138441\nn04131690\nn02172182\nn02111129\nn02776631\nn03785016\nn03895866\nn02457408\nn03146219\nn02134084\nn02097130\nn02361337\nn07720875\nn01871265\nn02231487\nn07717556\nn04328186\nn04317175\nn03065424\nn02442845\nn03729826\nn02892201\nn02489166\nn03721384\nn02096437\nn02093647\nn03376595\nn01692333\nn02134084\nn01978287\nn01592084\nn02504458\nn03544143\nn04039381\nn02690373\nn01756291\nn03814639\nn03443371\nn03633091\nn02066245\nn03868242\nn02133161\nn01496331\nn02108915\nn03325584\nn03372029\nn02085782\nn04026417\nn02111500\nn03482405\nn04149813\nn02108551\nn03337140\nn03970156\nn02443484\nn03657121\nn03633091\nn01675722\nn02965783\nn03908714\nn03777754\nn03394916\nn06794110\nn02492660\nn02099429\nn01828970\nn04404412\nn01532829\nn02109047\nn07768694\nn02104365\nn01632777\nn02794156\nn02807133\nn07615774\nn01532829\nn13040303\nn04149813\nn01828970\nn03345487\nn02096585\nn03291819\nn07754684\nn02123597\nn04266014\nn02114855\nn02018207\nn04532106\nn04579432\nn09246464\nn02088364\nn07615774\nn04487394\nn04612504\nn07613480\nn02058221\nn03980874\nn02134418\nn01622779\nn04209239\nn02692877\nn01560419\nn02870880\nn03445924\nn021
17135\nn04356056\nn02097047\nn02281406\nn04243546\nn02129604\nn02395406\nn02089973\nn09332890\nn07747607\nn09246464\nn04417672\nn02859443\nn02105251\nn02012849\nn03724870\nn04562935\nn02790996\nn02825657\nn02510455\nn03884397\nn04069434\nn01843383\nn01440764\nn02909870\nn04344873\nn13054560\nn03976657\nn04270147\nn02804610\nn03792972\nn01704323\nn01689811\nn03908714\nn03062245\nn03376595\nn02442845\nn04589890\nn02114855\nn04465501\nn01664065\nn07711569\nn02457408\nn02165105\nn02389026\nn03207743\nn04081281\nn04458633\nn01843065\nn04335435\nn03444034\nn04311174\nn02128385\nn01819313\nn02098413\nn02110341\nn06874185\nn02098413\nn02007558\nn02077923\nn04461696\nn01514859\nn03388549\nn03447721\nn03207743\nn02443114\nn01664065\nn03825788\nn02799071\nn01753488\nn03642806\nn01847000\nn09421951\nn02086910\nn02441942\nn03141823\nn01664065\nn03642806\nn02364673\nn03884397\nn02033041\nn04019541\nn04266014\nn07749582\nn01818515\nn02415577\nn02804414\nn04599235\nn01910747\nn02965783\nn04111531\nn03794056\nn02088364\nn03733805\nn02497673\nn04296562\nn01983481\nn04041544\nn07892512\nn02085936\nn03929855\nn02396427\nn03854065\nn02802426\nn01751748\nn01632458\nn03207941\nn02110627\nn04554684\nn03729826\nn02480495\nn01914609\nn04200800\nn02480495\nn01630670\nn03825788\nn04458633\nn07754684\nn01756291\nn02807133\nn02099712\nn03223299\nn03394916\nn02100735\nn04548362\nn01774750\nn03085013\nn02974003\nn04004767\nn02111129\nn02113799\nn02963159\nn04275548\nn06874185\nn02105855\nn03710193\nn02916936\nn03125729\nn04209239\nn04033995\nn07930864\nn03443371\nn04604644\nn03788195\nn04238763\nn02174001\nn03637318\nn07615774\nn04200800\nn02107142\nn03709823\nn03786901\nn02086079\nn03201208\nn03000684\nn04099969\nn02102480\nn01950731\nn07753113\nn02013706\nn04536866\nn02423022\nn02687172\nn04208210\nn04596742\nn02051845\nn01833805\nn02058221\nn03344393\nn03857828\nn01978287\nn04118538\nn03976657\nn03717622\nn02097130\nn09399592\nn01768244\nn02317335\nn04204238\nn01580077\nn02097298\nn03673027\nn0
2013706\nn02105251\nn07697313\nn03980874\nn02804610\nn02125311\nn03781244\nn02095570\nn03344393\nn02408429\nn02110627\nn02807133\nn02129604\nn04332243\nn04398044\nn13044778\nn02098413\nn02129604\nn03763968\nn03028079\nn02108000\nn03825788\nn02116738\nn04344873\nn03924679\nn02486261\nn02667093\nn03584254\nn04554684\nn07932039\nn01872401\nn02128757\nn02966687\nn02101556\nn03207941\nn04476259\nn07684084\nn02109525\nn02268443\nn03793489\nn02106662\nn04335435\nn03146219\nn01774384\nn03980874\nn01930112\nn03485794\nn03710193\nn04525305\nn03916031\nn07565083\nn02264363\nn03676483\nn04235860\nn02808304\nn03796401\nn12620546\nn02098286\nn02091831\nn02319095\nn02264363\nn04317175\nn04120489\nn02788148\nn02110341\nn04252077\nn07715103\nn04540053\nn03016953\nn02091244\nn02640242\nn04612504\nn03000134\nn02112706\nn01532829\nn02115913\nn02101556\nn02119789\nn04252225\nn03492542\nn03272010\nn03770679\nn01629819\nn04517823\nn04366367\nn02410509\nn03623198\nn03777754\nn03899768\nn04367480\nn04525305\nn03208938\nn02951358\nn03110669\nn04483307\nn04517823\nn02422699\nn04509417\nn03590841\nn09332890\nn01629819\nn04557648\nn09421951\nn13052670\nn01677366\nn02058221\nn02102318\nn03126707\nn04548280\nn03187595\nn02966687\nn03938244\nn02486261\nn02096177\nn02165105\nn02979186\nn04310018\nn01669191\nn04356056\nn01644373\nn03676483\nn04311174\nn03617480\nn02107908\nn04310018\nn02100236\nn03623198\nn03841143\nn02488702\nn04507155\nn02097130\nn02769748\nn03781244\nn02441942\nn03240683\nn02115641\nn02117135\nn02137549\nn02113023\nn02129165\nn04532106\nn04118538\nn01774750\nn02917067\nn03394916\nn04458633\nn01704323\nn04399382\nn02410509\nn02111277\nn02102177\nn03000247\nn02107683\nn04037443\nn03445777\nn04296562\nn02971356\nn04418357\nn02730930\nn03841143\nn01774384\nn03271574\nn02443114\nn12144580\nn02097298\nn02948072\nn04179913\nn02105251\nn03888605\nn03208938\nn04265275\nn09421951\nn02408429\nn02101388\nn02105056\nn07836838\nn04591713\nn02011460\nn04532106\nn01698640\nn04330267\nn04039381\n
n04542943\nn02317335\nn02504013\nn01704323\nn01829413\nn04357314\nn04252077\nn01601694\nn02006656\nn03124043\nn02965783\nn02814533\nn03347037\nn03920288\nn03874599\nn02364673\nn03496892\nn01978455\nn03544143\nn04252077\nn03630383\nn03717622\nn03141823\nn04259630\nn03785016\nn02174001\nn02869837\nn04335435\nn02687172\nn01729977\nn02018795\nn01494475\nn03529860\nn02106166\nn04553703\nn04523525\nn02445715\nn03891332\nn02747177\nn03676483\nn02667093\nn07920052\nn02910353\nn02097209\nn03991062\nn04204238\nn02110341\nn02089867\nn01776313\nn02328150\nn03180011\nn07717410\nn03047690\nn04505470\nn03014705\nn01518878\nn01807496\nn04591713\nn02999410\nn04254777\nn02870880\nn02002556\nn02095889\nn02487347\nn03944341\nn03770679\nn03794056\nn03759954\nn02093991\nn01968897\nn03743016\nn03388183\nn03775546\nn02437312\nn04120489\nn03642806\nn02808440\nn04099969\nn03891332\nn03958227\nn02113799\nn03998194\nn02104029\nn03250847\nn02100877\nn07714990\nn03110669\nn02676566\nn03347037\nn03530642\nn10565667\nn02108000\nn03110669\nn03690938\nn02095314\nn02012849\nn02277742\nn01532829\nn04553703\nn02051845\nn04456115\nn03998194\nn02417914\nn03594734\nn01775062\nn02105855\nn03903868\nn02096294\nn04371774\nn02927161\nn03657121\nn03937543\nn04532106\nn01883070\nn01537544\nn02667093\nn02104029\nn02487347\nn02104365\nn02051845\nn04243546\nn02006656\nn02808304\nn04251144\nn02356798\nn02391049\nn07753275\nn02974003\nn03482405\nn09193705\nn01694178\nn02168699\nn12768682\nn03272562\nn03710193\nn03843555\nn03126707\nn03196217\nn06785654\nn04350905\nn07873807\nn04310018\nn02264363\nn02492660\nn10565667\nn04275548\nn04147183\nn04366367\nn02114855\nn02100236\nn04154565\nn02276258\nn03424325\nn03777568\nn03494278\nn01806143\nn03459775\nn03598930\nn03967562\nn03775546\nn04418357\nn02412080\nn04591157\nn01770081\nn03877472\nn01531178\nn03794056\nn04485082\nn03786901\nn01773797\nn04254680\nn02128925\nn02128757\nn02442845\nn02606052\nn02099429\nn04442312\nn01807496\nn02107312\nn03710637\nn02027492\nn03016953
\nn02017213\nn12768682\nn04192698\nn02747177\nn04532106\nn01537544\nn04254777\nn03259280\nn02025239\nn09835506\nn02096437\nn04372370\nn02797295\nn03871628\nn02481823\nn03837869\nn02268443\nn04522168\nn03690938\nn04550184\nn03657121\nn02105251\nn01833805\nn01755581\nn07734744\nn01873310\nn03538406\nn01688243\nn03452741\nn02120505\nn02412080\nn04254120\nn04019541\nn02112706\nn02100735\nn03201208\nn03134739\nn02514041\nn04065272\nn02165105\nn04443257\nn04149813\nn03871628\nn02100236\nn02412080\nn02992211\nn02951358\nn03776460\nn02666196\nn03000134\nn12144580\nn03141823\nn02110341\nn02094114\nn02504458\nn04389033\nn02085936\nn04553703\nn03594734\nn09468604\nn03980874\nn07831146\nn03141823\nn13054560\nn01704323\nn02356798\nn03970156\nn02071294\nn06794110\nn02860847\nn03970156\nn11879895\nn04389033\nn01770393\nn02104365\nn02033041\nn07754684\nn02666196\nn03658185\nn03447447\nn03840681\nn01990800\nn03992509\nn02319095\nn04540053\nn04141975\nn03026506\nn02009229\nn07880968\nn03459775\nn02488291\nn02108551\nn03793489\nn03041632\nn03887697\nn12057211\nn07875152\nn01828970\nn01796340\nn03494278\nn02281787\nn01698640\nn01537544\nn02110185\nn04209133\nn02536864\nn07714990\nn02100236\nn04317175\nn04265275\nn01983481\nn01833805\nn02808440\nn01443537\nn07697313\nn02109525\nn03935335\nn03903868\nn04074963\nn01807496\nn03729826\nn04111531\nn07860988\nn04133789\nn03873416\nn03991062\nn03028079\nn03207743\nn02487347\nn03207941\nn03920288\nn02100735\nn02105855\nn03544143\nn02071294\nn03496892\nn03461385\nn01443537\nn04239074\nn03956157\nn04553703\nn04371430\nn12057211\nn04118776\nn02793495\nn02808304\nn03709823\nn02099267\nn03063599\nn03018349\nn02009912\nn03467068\nn03637318\nn12998815\nn04153751\nn03063599\nn02132136\nn02879718\nn02835271\nn03089624\nn01734418\nn02027492\nn04133789\nn01491361\nn03041632\nn02361337\nn03710637\nn02169497\nn02268443\nn03291819\nn02492660\nn04069434\nn03457902\nn04200800\nn04429376\nn01945685\nn02910353\nn02096177\nn04204347\nn03347037\nn01806567\nn020027
24\nn01675722\nn04404412\nn03476684\nn03868242\nn01773157\nn02102040\nn02088094\nn02797295\nn07831146\nn03764736\nn03000684\nn02536864\nn01983481\nn02106550\nn04065272\nn01685808\nn02090622\nn04579432\nn04204238\nn13054560\nn03016953\nn03937543\nn04229816\nn02492660\nn03445924\nn11939491\nn03544143\nn02894605\nn07697537\nn04153751\nn02483362\nn02134084\nn04208210\nn03197337\nn01753488\nn03680355\nn03938244\nn03857828\nn03761084\nn02105162\nn03742115\nn02536864\nn02930766\nn01514668\nn03876231\nn02493509\nn02095314\nn04517823\nn01729977\nn04442312\nn11939491\nn01614925\nn03496892\nn02281787\nn02095570\nn02105505\nn04127249\nn04579432\nn03804744\nn04613696\nn01440764\nn04133789\nn02115641\nn02099849\nn04493381\nn02102480\nn11939491\nn07565083\nn03425413\nn01756291\nn02132136\nn02109525\nn03995372\nn12057211\nn07697537\nn04023962\nn03690938\nn03676483\nn03868863\nn04147183\nn02895154\nn01773549\nn01667114\nn12267677\nn04507155\nn03658185\nn01644373\nn06785654\nn02114548\nn04065272\nn04118538\nn01491361\nn03792782\nn03773504\nn07831146\nn02092002\nn02808304\nn04330267\nn02437312\nn03481172\nn03706229\nn02100583\nn04347754\nn02666196\nn04074963\nn03976467\nn02090721\nn02002556\nn01728572\nn02129165\nn02483362\nn01910747\nn03887697\nn02422106\nn04039381\nn02356798\nn04350905\nn02871525\nn02086079\nn04485082\nn04116512\nn02346627\nn02840245\nn03345487\nn04336792\nn03777568\nn02797295\nn02093428\nn04037443\nn03188531\nn03538406\nn02108089\nn02268853\nn02219486\nn02415577\nn02113978\nn04367480\nn02111277\nn07754684\nn03207941\nn02708093\nn02791124\nn04239074\nn01872401\nn03124043\nn02788148\nn03933933\nn01798484\nn03065424\nn03658185\nn09421951\nn03000247\nn02669723\nn04592741\nn02097130\nn02105641\nn01629819\nn02793495\nn03954731\nn04141327\nn02966687\nn02769748\nn02281787\nn01687978\nn04229816\nn04009552\nn04418357\nn04461696\nn02006656\nn03770439\nn02017213\nn07716358\nn02445715\nn02389026\nn02948072\nn06785654\nn02268443\nn03457902\nn04118776\nn12768682\nn02095314\nn0151
8878\nn04275548\nn02894605\nn01843383\nn02840245\nn07697313\nn07930864\nn02690373\nn02788148\nn04081281\nn03127925\nn03706229\nn03721384\nn01632458\nn04265275\nn01924916\nn02979186\nn01872401\nn04235860\nn04476259\nn07697537\nn02488702\nn03920288\nn03670208\nn04493381\nn02113712\nn01682714\nn03271574\nn03018349\nn01641577\nn02422699\nn02807133\nn02749479\nn02749479\nn02480495\nn02120505\nn02277742\nn03935335\nn03759954\nn02113186\nn02100236\nn03126707\nn04458633\nn02281406\nn01775062\nn04204347\nn02116738\nn03388043\nn04418357\nn02100583\nn03584829\nn01592084\nn04456115\nn01728920\nn02091635\nn03637318\nn02105056\nn02110627\nn02776631\nn03788365\nn03179701\nn02009912\nn02219486\nn04179913\nn07590611\nn03903868\nn04560804\nn01917289\nn04133789\nn02085620\nn03259280\nn02484975\nn01744401\nn07836838\nn07753592\nn03673027\nn01494475\nn01728572\nn02174001\nn07873807\nn02058221\nn04252225\nn03782006\nn04133789\nn15075141\nn02106662\nn02346627\nn03769881\nn03630383\nn03871628\nn01984695\nn01514668\nn01749939\nn03457902\nn04347754\nn04370456\nn02892201\nn01693334\nn03109150\nn02102973\nn02098413\nn01930112\nn02834397\nn02091032\nn02489166\nn12985857\nn02092339\nn03995372\nn02089078\nn03709823\nn02111500\nn02268443\nn02410509\nn01798484\nn03720891\nn03868863\nn02092002\nn03018349\nn04487394\nn03240683\nn03803284\nn07579787\nn02804414\nn03887697\nn04542943\nn02113023\nn02607072\nn01882714\nn02102040\nn07697537\nn02443114\nn01986214\nn02777292\nn02939185\nn02009229\nn03769881\nn04554684\nn02037110\nn02817516\nn02089078\nn03691459\nn03680355\nn04591713\nn03804744\nn03617480\nn01795545\nn02865351\nn02840245\nn02909870\nn02101006\nn04208210\nn04487081\nn02111889\nn04264628\nn01629819\nn02111129\nn12768682\nn03134739\nn03075370\nn13037406\nn02100735\nn04330267\nn04540053\nn01498041\nn03874599\nn03874599\nn04485082\nn03095699\nn04252225\nn02172182\nn01667114\nn04557648\nn02119022\nn02091467\nn04350905\nn01817953\nn01985128\nn04067472\nn02504013\nn04476259\nn09229709\nn02865351\nn02
105251\nn03255030\nn02325366\nn04200800\nn03065424\nn04330267\nn02403003\nn02123159\nn02326432\nn02097130\nn02966687\nn04591157\nn03538406\nn02107908\nn02009912\nn01644900\nn02356798\nn04201297\nn04235860\nn02110185\nn03544143\nn02787622\nn04296562\nn02804414\nn02114367\nn02894605\nn02119022\nn02965783\nn03837869\nn01955084\nn02701002\nn02137549\nn03794056\nn03759954\nn03956157\nn03461385\nn02939185\nn07892512\nn07715103\nn01742172\nn04350905\nn01817953\nn02865351\nn02002556\nn01644900\nn02795169\nn03617480\nn03207743\nn02403003\nn03109150\nn03590841\nn02480855\nn02091032\nn07584110\nn02102318\nn02111277\nn02692877\nn04604644\nn03793489\nn01877812\nn02412080\nn01698640\nn02110806\nn04019541\nn04476259\nn04584207\nn02012849\nn03720891\nn04311174\nn03459775\nn03781244\nn09428293\nn02106550\nn02132136\nn03630383\nn02128925\nn03903868\nn03814639\nn01630670\nn02106550\nn01855672\nn01807496\nn02088364\nn03290653\nn02109525\nn03902125\nn07583066\nn04542943\nn03937543\nn07583066\nn04008634\nn04532670\nn02095314\nn04118538\nn07584110\nn02747177\nn03929855\nn01950731\nn07742313\nn03649909\nn02319095\nn01697457\nn02092339\nn09332890\nn04347754\nn02480495\nn03478589\nn07880968\nn03935335\nn03976657\nn02835271\nn04367480\nn02177972\nn04070727\nn04277352\nn04125021\nn03134739\nn02128757\nn02504013\nn04111531\nn04152593\nn04591713\nn03400231\nn01704323\nn12768682\nn02110806\nn04418357\nn02536864\nn04409515\nn04542943\nn03763968\nn03662601\nn02490219\nn02086240\nn04404412\nn07718747\nn02096051\nn04599235\nn01944390\nn01990800\nn04152593\nn02807133\nn02086910\nn03347037\nn01847000\nn02107683\nn02279972\nn04019541\nn01695060\nn02087046\nn03891251\nn04154565\nn04398044\nn02504013\nn02138441\nn04285008\nn03942813\nn04239074\nn02704792\nn03794056\nn04476259\nn04483307\nn03982430\nn02109047\nn11939491\nn04335435\nn02727426\nn03781244\nn01978455\nn03887697\nn02268853\nn02607072\nn02009229\nn04371774\nn07892512\nn04523525\nn01748264\nn03924679\nn04200800\nn04026417\nn04208210\nn04548362\nn
04389033\nn04152593\nn02910353\nn07697313\nn03196217\nn04200800\nn02279972\nn01917289\nn02488291\nn02808304\nn03992509\nn02804414\nn01774750\nn04442312\nn03535780\nn02802426\nn04044716\nn02128385\nn07697313\nn04179913\nn03400231\nn03095699\nn03871628\nn02129165\nn01773797\nn03691459\nn02018795\nn04116512\nn03089624\nn02127052\nn02111129\nn02093256\nn03742115\nn04429376\nn02009229\nn02815834\nn07747607\nn03481172\nn03220513\nn03495258\nn02974003\nn01704323\nn04277352\nn07684084\nn02107574\nn02276258\nn12998815\nn03617480\nn03721384\nn02992529\nn02321529\nn03933933\nn03764736\nn03764736\nn02317335\nn04235860\nn02808440\nn02110341\nn04542943\nn02442845\nn02869837\nn01742172\nn02088632\nn02120079\nn04259630\nn03447447\nn03876231\nn02037110\nn01914609\nn02102040\nn13054560\nn03930630\nn03759954\nn07584110\nn04259630\nn03291819\nn07697537\nn01614925\nn03814906\nn04540053\nn02116738\nn01776313\nn03954731\nn04479046\nn03658185\nn04357314\nn03763968\nn01755581\nn01749939\nn02981792\nn03485407\nn02442845\nn04548280\nn07880968\nn02825657\nn09332890\nn04596742\nn04596742\nn02930766\nn01843383\nn03532672\nn13133613\nn02963159\nn03759954\nn02098413\nn04367480\nn02643566\nn04254777\nn02415577\nn04560804\nn04485082\nn03781244\nn04597913\nn04482393\nn01530575\nn03250847\nn02108089\nn04404412\nn02687172\nn03786901\nn02108000\nn02687172\nn02317335\nn02606052\nn02165105\nn03045698\nn03218198\nn02415577\nn04069434\nn04482393\nn01806143\nn01443537\nn02100735\nn04153751\nn04254777\nn02091467\nn03482405\nn02794156\nn07754684\nn03495258\nn04542943\nn01797886\nn03085013\nn03792972\nn01980166\nn02782093\nn03920288\nn03666591\nn01695060\nn02486410\nn02088364\nn02389026\nn07753592\nn07248320\nn03355925\nn01737021\nn04266014\nn02167151\nn03930630\nn02133161\nn02107142\nn03180011\nn04023962\nn01443537\nn02443114\nn02892201\nn03109150\nn01872401\nn07565083\nn02815834\nn02206856\nn03729826\nn10565667\nn02111129\nn02704792\nn02117135\nn03000247\nn02129604\nn04550184\nn03089624\nn03785016\nn01689811\
nn02441942\nn01641577\nn02229544\nn01622779\nn02089973\nn02791270\nn02102177\nn02114855\nn13040303\nn03944341\nn01667114\nn04149813\nn03792972\nn02869837\nn02112706\nn13044778\nn01688243\nn02097658\nn02109961\nn03791053\nn04286575\nn01985128\nn03014705\nn04265275\nn04467665\nn01985128\nn04344873\nn04335435\nn02676566\nn01806143\nn04599235\nn02093859\nn04486054\nn01601694\nn02966193\nn02965783\nn02099712\nn02808440\nn03785016\nn04285008\nn04141076\nn07760859\nn03717622\nn01917289\nn03942813\nn04409515\nn01819313\nn03255030\nn02328150\nn07590611\nn01985128\nn03998194\nn12985857\nn03014705\nn02823428\nn03127747\nn02825657\nn03935335\nn02793495\nn04509417\nn02655020\nn07873807\nn02906734\nn03720891\nn04037443\nn04254120\nn07614500\nn01667114\nn02415577\nn03710637\nn02361337\nn04081281\nn04070727\nn03649909\nn07720875\nn02011460\nn01443537\nn04525305\nn02894605\nn02113712\nn09229709\nn04367480\nn04266014\nn02105056\nn09421951\nn02814860\nn02167151\nn01744401\nn02808304\nn02106030\nn02074367\nn02536864\nn04485082\nn03538406\nn02108915\nn02114548\nn01698640\nn04286575\nn02797295\nn02124075\nn02927161\nn02747177\nn02641379\nn02325366\nn02536864\nn03697007\nn02281406\nn03017168\nn02090721\nn03776460\nn02037110\nn03100240\nn04398044\nn02871525\nn03792782\nn02787622\nn03180011\nn04522168\nn04266014\nn03218198\nn02088094\nn02097298\nn04548362\nn03196217\nn02095889\nn01873310\nn02088466\nn01968897\nn04548280\nn04604644\nn02090379\nn03787032\nn04229816\nn03891251\nn02356798\nn04350905\nn03782006\nn01664065\nn03950228\nn01601694\nn01558993\nn02777292\nn02091134\nn02088632\nn02442845\nn02137549\nn01669191\nn02007558\nn03782006\nn03692522\nn02916936\nn04357314\nn02132136\nn03930630\nn04019541\nn04005630\nn02102480\nn03443371\nn04523525\nn03814906\nn07693725\nn04371774\nn04209239\nn03720891\nn02086079\nn02071294\nn01774384\nn01560419\nn04204238\nn02101556\nn03998194\nn04486054\nn04505470\nn02089867\nn04179913\nn02112018\nn04201297\nn03673027\nn03908714\nn02105056\nn02791270\nn0377507
1\nn03785016\nn02088238\nn04376876\nn03272562\nn02132136\nn01748264\nn02939185\nn03485794\nn02105412\nn02814860\nn03527444\nn03803284\nn02396427\nn03877845\nn07614500\nn01514859\nn02105056\nn03047690\nn04254120\nn03218198\nn02910353\nn04328186\nn03776460\nn02109961\nn03467068\nn02704792\nn04136333\nn02169497\nn02094114\nn03837869\nn03131574\nn02090622\nn04238763\nn01682714\nn03388043\nn04493381\nn04040759\nn02099601\nn03803284\nn02101388\nn13044778\nn04483307\nn03404251\nn02090622\nn12768682\nn04367480\nn03134739\nn02356798\nn02408429\nn02974003\nn02101388\nn03124170\nn04435653\nn02105855\nn07920052\nn03272010\nn03180011\nn07717556\nn04235860\nn07716358\nn02088094\nn07873807\nn03775071\nn02110341\nn02817516\nn03146219\nn02113186\nn09246464\nn02119022\nn03240683\nn03706229\nn02701002\nn04154565\nn03467068\nn03843555\nn02107683\nn02088094\nn02108915\nn02786058\nn02326432\nn01629819\nn01614925\nn12267677\nn02108422\nn02481823\nn02892201\nn02877765\nn01955084\nn12057211\nn03063689\nn02113978\nn02777292\nn03717622\nn02787622\nn02437312\nn03992509\nn01930112\nn02500267\nn03627232\nn04505470\nn03250847\nn03400231\nn02977058\nn04554684\nn04456115\nn04147183\nn03676483\nn04465501\nn02094114\nn04532106\nn07892512\nn04557648\nn03482405\nn02088238\nn03991062\nn01751748\nn02104029\nn03733281\nn02536864\nn01860187\nn03133878\nn02110627\nn03208938\nn04192698\nn02106166\nn03028079\nn04515003\nn03787032\nn04317175\nn03447721\nn02326432\nn03535780\nn03998194\nn04560804\nn04507155\nn03134739\nn01697457\nn04270147\nn02107683\nn04525305\nn02410509\nn02099712\nn02132136\nn02268853\nn01817953\nn03929855\nn07615774\nn02100735\nn01833805\nn03207743\nn04584207\nn04266014\nn07248320\nn03467068\nn03908618\nn02133161\nn02486410\nn01755581\nn02445715\nn01914609\nn02841315\nn02877765\nn01697457\nn01981276\nn06794110\nn04485082\nn02119022\nn02481823\nn02802426\nn01689811\nn01796340\nn02667093\nn01622779\nn01980166\nn02442845\nn04328186\nn01871265\nn03729826\nn02123394\nn01630670\nn02106166\nn10148
035\nn02437616\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/imagenet_lsvrc_2015_synsets.txt",
    "content": "n01440764\nn01443537\nn01484850\nn01491361\nn01494475\nn01496331\nn01498041\nn01514668\nn01514859\nn01518878\nn01530575\nn01531178\nn01532829\nn01534433\nn01537544\nn01558993\nn01560419\nn01580077\nn01582220\nn01592084\nn01601694\nn01608432\nn01614925\nn01616318\nn01622779\nn01629819\nn01630670\nn01631663\nn01632458\nn01632777\nn01641577\nn01644373\nn01644900\nn01664065\nn01665541\nn01667114\nn01667778\nn01669191\nn01675722\nn01677366\nn01682714\nn01685808\nn01687978\nn01688243\nn01689811\nn01692333\nn01693334\nn01694178\nn01695060\nn01697457\nn01698640\nn01704323\nn01728572\nn01728920\nn01729322\nn01729977\nn01734418\nn01735189\nn01737021\nn01739381\nn01740131\nn01742172\nn01744401\nn01748264\nn01749939\nn01751748\nn01753488\nn01755581\nn01756291\nn01768244\nn01770081\nn01770393\nn01773157\nn01773549\nn01773797\nn01774384\nn01774750\nn01775062\nn01776313\nn01784675\nn01795545\nn01796340\nn01797886\nn01798484\nn01806143\nn01806567\nn01807496\nn01817953\nn01818515\nn01819313\nn01820546\nn01824575\nn01828970\nn01829413\nn01833805\nn01843065\nn01843383\nn01847000\nn01855032\nn01855672\nn01860187\nn01871265\nn01872401\nn01873310\nn01877812\nn01882714\nn01883070\nn01910747\nn01914609\nn01917289\nn01924916\nn01930112\nn01943899\nn01944390\nn01945685\nn01950731\nn01955084\nn01968897\nn01978287\nn01978455\nn01980166\nn01981276\nn01983481\nn01984695\nn01985128\nn01986214\nn01990800\nn02002556\nn02002724\nn02006656\nn02007558\nn02009229\nn02009912\nn02011460\nn02012849\nn02013706\nn02017213\nn02018207\nn02018795\nn02025239\nn02027492\nn02028035\nn02033041\nn02037110\nn02051845\nn02056570\nn02058221\nn02066245\nn02071294\nn02074367\nn02077923\nn02085620\nn02085782\nn02085936\nn02086079\nn02086240\nn02086646\nn02086910\nn02087046\nn02087394\nn02088094\nn02088238\nn02088364\nn02088466\nn02088632\nn02089078\nn02089867\nn02089973\nn02090379\nn02090622\nn02090721\nn02091032\nn02091134\nn02091244\nn02091467\nn02091635\nn02091831\nn02092002\nn02092339\nn02093256\nn020
93428\nn02093647\nn02093754\nn02093859\nn02093991\nn02094114\nn02094258\nn02094433\nn02095314\nn02095570\nn02095889\nn02096051\nn02096177\nn02096294\nn02096437\nn02096585\nn02097047\nn02097130\nn02097209\nn02097298\nn02097474\nn02097658\nn02098105\nn02098286\nn02098413\nn02099267\nn02099429\nn02099601\nn02099712\nn02099849\nn02100236\nn02100583\nn02100735\nn02100877\nn02101006\nn02101388\nn02101556\nn02102040\nn02102177\nn02102318\nn02102480\nn02102973\nn02104029\nn02104365\nn02105056\nn02105162\nn02105251\nn02105412\nn02105505\nn02105641\nn02105855\nn02106030\nn02106166\nn02106382\nn02106550\nn02106662\nn02107142\nn02107312\nn02107574\nn02107683\nn02107908\nn02108000\nn02108089\nn02108422\nn02108551\nn02108915\nn02109047\nn02109525\nn02109961\nn02110063\nn02110185\nn02110341\nn02110627\nn02110806\nn02110958\nn02111129\nn02111277\nn02111500\nn02111889\nn02112018\nn02112137\nn02112350\nn02112706\nn02113023\nn02113186\nn02113624\nn02113712\nn02113799\nn02113978\nn02114367\nn02114548\nn02114712\nn02114855\nn02115641\nn02115913\nn02116738\nn02117135\nn02119022\nn02119789\nn02120079\nn02120505\nn02123045\nn02123159\nn02123394\nn02123597\nn02124075\nn02125311\nn02127052\nn02128385\nn02128757\nn02128925\nn02129165\nn02129604\nn02130308\nn02132136\nn02133161\nn02134084\nn02134418\nn02137549\nn02138441\nn02165105\nn02165456\nn02167151\nn02168699\nn02169497\nn02172182\nn02174001\nn02177972\nn02190166\nn02206856\nn02219486\nn02226429\nn02229544\nn02231487\nn02233338\nn02236044\nn02256656\nn02259212\nn02264363\nn02268443\nn02268853\nn02276258\nn02277742\nn02279972\nn02280649\nn02281406\nn02281787\nn02317335\nn02319095\nn02321529\nn02325366\nn02326432\nn02328150\nn02342885\nn02346627\nn02356798\nn02361337\nn02363005\nn02364673\nn02389026\nn02391049\nn02395406\nn02396427\nn02397096\nn02398521\nn02403003\nn02408429\nn02410509\nn02412080\nn02415577\nn02417914\nn02422106\nn02422699\nn02423022\nn02437312\nn02437616\nn02441942\nn02442845\nn02443114\nn02443484\nn02444819\nn02445715\nn0
2447366\nn02454379\nn02457408\nn02480495\nn02480855\nn02481823\nn02483362\nn02483708\nn02484975\nn02486261\nn02486410\nn02487347\nn02488291\nn02488702\nn02489166\nn02490219\nn02492035\nn02492660\nn02493509\nn02493793\nn02494079\nn02497673\nn02500267\nn02504013\nn02504458\nn02509815\nn02510455\nn02514041\nn02526121\nn02536864\nn02606052\nn02607072\nn02640242\nn02641379\nn02643566\nn02655020\nn02666196\nn02667093\nn02669723\nn02672831\nn02676566\nn02687172\nn02690373\nn02692877\nn02699494\nn02701002\nn02704792\nn02708093\nn02727426\nn02730930\nn02747177\nn02749479\nn02769748\nn02776631\nn02777292\nn02782093\nn02783161\nn02786058\nn02787622\nn02788148\nn02790996\nn02791124\nn02791270\nn02793495\nn02794156\nn02795169\nn02797295\nn02799071\nn02802426\nn02804414\nn02804610\nn02807133\nn02808304\nn02808440\nn02814533\nn02814860\nn02815834\nn02817516\nn02823428\nn02823750\nn02825657\nn02834397\nn02835271\nn02837789\nn02840245\nn02841315\nn02843684\nn02859443\nn02860847\nn02865351\nn02869837\nn02870880\nn02871525\nn02877765\nn02879718\nn02883205\nn02892201\nn02892767\nn02894605\nn02895154\nn02906734\nn02909870\nn02910353\nn02916936\nn02917067\nn02927161\nn02930766\nn02939185\nn02948072\nn02950826\nn02951358\nn02951585\nn02963159\nn02965783\nn02966193\nn02966687\nn02971356\nn02974003\nn02977058\nn02978881\nn02979186\nn02980441\nn02981792\nn02988304\nn02992211\nn02992529\nn02999410\nn03000134\nn03000247\nn03000684\nn03014705\nn03016953\nn03017168\nn03018349\nn03026506\nn03028079\nn03032252\nn03041632\nn03042490\nn03045698\nn03047690\nn03062245\nn03063599\nn03063689\nn03065424\nn03075370\nn03085013\nn03089624\nn03095699\nn03100240\nn03109150\nn03110669\nn03124043\nn03124170\nn03125729\nn03126707\nn03127747\nn03127925\nn03131574\nn03133878\nn03134739\nn03141823\nn03146219\nn03160309\nn03179701\nn03180011\nn03187595\nn03188531\nn03196217\nn03197337\nn03201208\nn03207743\nn03207941\nn03208938\nn03216828\nn03218198\nn03220513\nn03223299\nn03240683\nn03249569\nn03250847\nn03255030\n
n03259280\nn03271574\nn03272010\nn03272562\nn03290653\nn03291819\nn03297495\nn03314780\nn03325584\nn03337140\nn03344393\nn03345487\nn03347037\nn03355925\nn03372029\nn03376595\nn03379051\nn03384352\nn03388043\nn03388183\nn03388549\nn03393912\nn03394916\nn03400231\nn03404251\nn03417042\nn03424325\nn03425413\nn03443371\nn03444034\nn03445777\nn03445924\nn03447447\nn03447721\nn03450230\nn03452741\nn03457902\nn03459775\nn03461385\nn03467068\nn03476684\nn03476991\nn03478589\nn03481172\nn03482405\nn03483316\nn03485407\nn03485794\nn03492542\nn03494278\nn03495258\nn03496892\nn03498962\nn03527444\nn03529860\nn03530642\nn03532672\nn03534580\nn03535780\nn03538406\nn03544143\nn03584254\nn03584829\nn03590841\nn03594734\nn03594945\nn03595614\nn03598930\nn03599486\nn03602883\nn03617480\nn03623198\nn03627232\nn03630383\nn03633091\nn03637318\nn03642806\nn03649909\nn03657121\nn03658185\nn03661043\nn03662601\nn03666591\nn03670208\nn03673027\nn03676483\nn03680355\nn03690938\nn03691459\nn03692522\nn03697007\nn03706229\nn03709823\nn03710193\nn03710637\nn03710721\nn03717622\nn03720891\nn03721384\nn03724870\nn03729826\nn03733131\nn03733281\nn03733805\nn03742115\nn03743016\nn03759954\nn03761084\nn03763968\nn03764736\nn03769881\nn03770439\nn03770679\nn03773504\nn03775071\nn03775546\nn03776460\nn03777568\nn03777754\nn03781244\nn03782006\nn03785016\nn03786901\nn03787032\nn03788195\nn03788365\nn03791053\nn03792782\nn03792972\nn03793489\nn03794056\nn03796401\nn03803284\nn03804744\nn03814639\nn03814906\nn03825788\nn03832673\nn03837869\nn03838899\nn03840681\nn03841143\nn03843555\nn03854065\nn03857828\nn03866082\nn03868242\nn03868863\nn03871628\nn03873416\nn03874293\nn03874599\nn03876231\nn03877472\nn03877845\nn03884397\nn03887697\nn03888257\nn03888605\nn03891251\nn03891332\nn03895866\nn03899768\nn03902125\nn03903868\nn03908618\nn03908714\nn03916031\nn03920288\nn03924679\nn03929660\nn03929855\nn03930313\nn03930630\nn03933933\nn03935335\nn03937543\nn03938244\nn03942813\nn03944341\nn03947888\nn03950228
\nn03954731\nn03956157\nn03958227\nn03961711\nn03967562\nn03970156\nn03976467\nn03976657\nn03977966\nn03980874\nn03982430\nn03983396\nn03991062\nn03992509\nn03995372\nn03998194\nn04004767\nn04005630\nn04008634\nn04009552\nn04019541\nn04023962\nn04026417\nn04033901\nn04033995\nn04037443\nn04039381\nn04040759\nn04041544\nn04044716\nn04049303\nn04065272\nn04067472\nn04069434\nn04070727\nn04074963\nn04081281\nn04086273\nn04090263\nn04099969\nn04111531\nn04116512\nn04118538\nn04118776\nn04120489\nn04125021\nn04127249\nn04131690\nn04133789\nn04136333\nn04141076\nn04141327\nn04141975\nn04146614\nn04147183\nn04149813\nn04152593\nn04153751\nn04154565\nn04162706\nn04179913\nn04192698\nn04200800\nn04201297\nn04204238\nn04204347\nn04208210\nn04209133\nn04209239\nn04228054\nn04229816\nn04235860\nn04238763\nn04239074\nn04243546\nn04251144\nn04252077\nn04252225\nn04254120\nn04254680\nn04254777\nn04258138\nn04259630\nn04263257\nn04264628\nn04265275\nn04266014\nn04270147\nn04273569\nn04275548\nn04277352\nn04285008\nn04286575\nn04296562\nn04310018\nn04311004\nn04311174\nn04317175\nn04325704\nn04326547\nn04328186\nn04330267\nn04332243\nn04335435\nn04336792\nn04344873\nn04346328\nn04347754\nn04350905\nn04355338\nn04355933\nn04356056\nn04357314\nn04366367\nn04367480\nn04370456\nn04371430\nn04371774\nn04372370\nn04376876\nn04380533\nn04389033\nn04392985\nn04398044\nn04399382\nn04404412\nn04409515\nn04417672\nn04418357\nn04423845\nn04428191\nn04429376\nn04435653\nn04442312\nn04443257\nn04447861\nn04456115\nn04458633\nn04461696\nn04462240\nn04465501\nn04467665\nn04476259\nn04479046\nn04482393\nn04483307\nn04485082\nn04486054\nn04487081\nn04487394\nn04493381\nn04501370\nn04505470\nn04507155\nn04509417\nn04515003\nn04517823\nn04522168\nn04523525\nn04525038\nn04525305\nn04532106\nn04532670\nn04536866\nn04540053\nn04542943\nn04548280\nn04548362\nn04550184\nn04552348\nn04553703\nn04554684\nn04557648\nn04560804\nn04562935\nn04579145\nn04579432\nn04584207\nn04589890\nn04590129\nn04591157\nn045917
13\nn04592741\nn04596742\nn04597913\nn04599235\nn04604644\nn04606251\nn04612504\nn04613696\nn06359193\nn06596364\nn06785654\nn06794110\nn06874185\nn07248320\nn07565083\nn07579787\nn07583066\nn07584110\nn07590611\nn07613480\nn07614500\nn07615774\nn07684084\nn07693725\nn07695742\nn07697313\nn07697537\nn07711569\nn07714571\nn07714990\nn07715103\nn07716358\nn07716906\nn07717410\nn07717556\nn07718472\nn07718747\nn07720875\nn07730033\nn07734744\nn07742313\nn07745940\nn07747607\nn07749582\nn07753113\nn07753275\nn07753592\nn07754684\nn07760859\nn07768694\nn07802026\nn07831146\nn07836838\nn07860988\nn07871810\nn07873807\nn07875152\nn07880968\nn07892512\nn07920052\nn07930864\nn07932039\nn09193705\nn09229709\nn09246464\nn09256479\nn09288635\nn09332890\nn09399592\nn09421951\nn09428293\nn09468604\nn09472597\nn09835506\nn10148035\nn10565667\nn11879895\nn11939491\nn12057211\nn12144580\nn12267677\nn12620546\nn12768682\nn12985857\nn12998815\nn13037406\nn13040303\nn13044778\nn13052670\nn13054560\nn13133613\nn15075141\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/imagenet_metadata.txt",
    "content": "n00004475\torganism, being\nn00005787\tbenthos\nn00006024\theterotroph\nn00006484\tcell\nn00007846\tperson, individual, someone, somebody, mortal, soul\nn00015388\tanimal, animate being, beast, brute, creature, fauna\nn00017222\tplant, flora, plant life\nn00021265\tfood, nutrient\nn00021939\tartifact, artefact\nn00120010\thop\nn00141669\tcheck-in\nn00288000\tdressage\nn00288190\tcurvet, vaulting\nn00288384\tpiaffe\nn00324978\tfunambulism, tightrope walking\nn00326094\trock climbing\nn00433458\tcontact sport\nn00433661\toutdoor sport, field sport\nn00433802\tgymnastics, gymnastic exercise\nn00434075\tacrobatics, tumbling\nn00439826\ttrack and field\nn00440039\ttrack, running\nn00440218\tjumping\nn00440382\tbroad jump, long jump\nn00440509\thigh jump\nn00440643\tFosbury flop\nn00440747\tskiing\nn00440941\tcross-country skiing\nn00441073\tski jumping\nn00441824\twater sport, aquatics\nn00442115\tswimming, swim\nn00442437\tbathe\nn00442847\tdip, plunge\nn00442981\tdive, diving\nn00443231\tfloating, natation\nn00443375\tdead-man's float, prone float\nn00443517\tbelly flop, belly flopper, belly whop, belly whopper\nn00443692\tcliff diving\nn00443803\tflip\nn00443917\tgainer, full gainer\nn00444142\thalf gainer\nn00444340\tjackknife\nn00444490\tswan dive, swallow dive\nn00444651\tskin diving, skin-dive\nn00444846\tscuba diving\nn00444937\tsnorkeling, snorkel diving\nn00445055\tsurfing, surfboarding, surfriding\nn00445226\twater-skiing\nn00445351\trowing, row\nn00445685\tsculling\nn00445802\tboxing, pugilism, fisticuffs\nn00446311\tprofessional boxing\nn00446411\tin-fighting\nn00446493\tfight\nn00446632\trope-a-dope\nn00446804\tspar, sparring\nn00446980\tarchery\nn00447073\tsledding\nn00447221\ttobogganing\nn00447361\tluging\nn00447463\tbobsledding\nn00447540\twrestling, rassling, grappling\nn00447957\tGreco-Roman wrestling\nn00448126\tprofessional wrestling\nn00448232\tsumo\nn00448466\tskating\nn00448640\tice skating\nn00448748\tfigure 
skating\nn00448872\trollerblading\nn00448958\troller skating\nn00449054\tskateboarding\nn00449168\tspeed skating\nn00449295\tracing\nn00449517\tauto racing, car racing\nn00449695\tboat racing\nn00449796\thydroplane racing\nn00449892\tcamel racing\nn00449977\tgreyhound racing\nn00450070\thorse racing\nn00450335\triding, horseback riding, equitation\nn00450700\tequestrian sport\nn00450866\tpony-trekking\nn00450998\tshowjumping, stadium jumping\nn00451186\tcross-country riding, cross-country jumping\nn00451370\tcycling\nn00451563\tbicycling\nn00451635\tmotorcycling\nn00451768\tdune cycling\nn00451866\tblood sport\nn00452034\tbullfighting, tauromachy\nn00452152\tcockfighting\nn00452293\thunt, hunting\nn00452734\tbattue\nn00452864\tbeagling\nn00453126\tcoursing\nn00453313\tdeer hunting, deer hunt\nn00453396\tducking, duck hunting\nn00453478\tfox hunting, foxhunt\nn00453631\tpigsticking\nn00453935\tfishing, sportfishing\nn00454237\tangling\nn00454395\tfly-fishing\nn00454493\ttroll, trolling\nn00454624\tcasting, cast\nn00454855\tbait casting\nn00454983\tfly casting\nn00455076\tovercast\nn00455173\tsurf casting, surf fishing\nn00456465\tday game\nn00463246\tathletic game\nn00463543\tice hockey, hockey, hockey game\nn00464277\ttetherball\nn00464478\twater polo\nn00464651\toutdoor game\nn00464894\tgolf, golf game\nn00466273\tprofessional golf\nn00466377\tround of golf, round\nn00466524\tmedal play, stroke play\nn00466630\tmatch play\nn00466712\tminiature golf\nn00466880\tcroquet\nn00467320\tquoits, horseshoes\nn00467536\tshuffleboard, shovelboard\nn00467719\tfield game\nn00467995\tfield hockey, hockey\nn00468299\tshinny, shinney\nn00468480\tfootball, football game\nn00469651\tAmerican football, American football game\nn00470554\tprofessional football\nn00470682\ttouch football\nn00470830\thurling\nn00470966\trugby, rugby football, rugger\nn00471437\tball game, ballgame\nn00471613\tbaseball, baseball game\nn00474568\tball\nn00474657\tprofessional 
baseball\nn00474769\thardball\nn00474881\tperfect game\nn00475014\tno-hit game, no-hitter\nn00475142\tone-hitter, 1-hitter\nn00475273\ttwo-hitter, 2-hitter\nn00475403\tthree-hitter, 3-hitter\nn00475535\tfour-hitter, 4-hitter\nn00475661\tfive-hitter, 5-hitter\nn00475787\tsoftball, softball game\nn00476140\trounders\nn00476235\tstickball, stickball game\nn00476389\tcricket\nn00477392\tlacrosse\nn00477639\tpolo\nn00477827\tpushball\nn00478262\tsoccer, association football\nn00479076\tcourt game\nn00479440\thandball\nn00479616\tracquetball\nn00479734\tfives\nn00479887\tsquash, squash racquets, squash rackets\nn00480211\tvolleyball, volleyball game\nn00480366\tjai alai, pelota\nn00480508\tbadminton\nn00480885\tbattledore, battledore and shuttlecock\nn00480993\tbasketball, basketball game, hoops\nn00481803\tprofessional basketball\nn00481938\tdeck tennis\nn00482122\tnetball\nn00482298\ttennis, lawn tennis\nn00483205\tprofessional tennis\nn00483313\tsingles\nn00483409\tsingles\nn00483508\tdoubles\nn00483605\tdoubles\nn00483705\troyal tennis, real tennis, court tennis\nn00483848\tpallone\nn00523513\tsport, athletics\nn00812526\tclasp, clench, clutch, clutches, grasp, grip, hold\nn00825773\tjudo\nn00887544\tteam sport\nn01035504\tLast Supper, Lord's Supper\nn01035667\tSeder, Passover supper\nn01055165\tcamping, encampment, bivouacking, tenting\nn01314388\tpest\nn01314663\tcritter\nn01314781\tcreepy-crawly\nn01314910\tdarter\nn01315213\tpeeper\nn01315330\thomeotherm, homoiotherm, homotherm\nn01315581\tpoikilotherm, ectotherm\nn01315805\trange animal\nn01316422\tscavenger\nn01316579\tbottom-feeder, bottom-dweller\nn01316734\tbottom-feeder\nn01316949\twork animal\nn01317089\tbeast of burden, jument\nn01317294\tdraft animal\nn01317391\tpack animal, sumpter\nn01317541\tdomestic animal, domesticated animal\nn01317813\tfeeder\nn01317916\tfeeder\nn01318053\tstocker\nn01318279\thatchling\nn01318381\thead\nn01318478\tmigrator\nn01318660\tmolter, 
moulter\nn01318894\tpet\nn01319001\tstayer\nn01319187\tstunt\nn01319467\tmarine animal, marine creature, sea animal, sea creature\nn01319685\tby-catch, bycatch\nn01320872\tfemale\nn01321123\then\nn01321230\tmale\nn01321456\tadult\nn01321579\tyoung, offspring\nn01321770\torphan\nn01321854\tyoung mammal\nn01322221\tbaby\nn01322343\tpup, whelp\nn01322508\twolf pup, wolf cub\nn01322604\tpuppy\nn01322685\tcub, young carnivore\nn01322898\tlion cub\nn01322983\tbear cub\nn01323068\ttiger cub\nn01323155\tkit\nn01323261\tsuckling\nn01323355\tsire\nn01323493\tdam\nn01323599\tthoroughbred, purebred, pureblood\nn01323781\tgiant\nn01324305\tmutant\nn01324431\tcarnivore\nn01324610\therbivore\nn01324799\tinsectivore\nn01324916\tacrodont\nn01325060\tpleurodont\nn01326291\tmicroorganism, micro-organism\nn01327909\tmonohybrid\nn01329186\tarbovirus, arborvirus\nn01330126\tadenovirus\nn01330497\tarenavirus\nn01332181\tMarburg virus\nn01333082\tArenaviridae\nn01333483\tvesiculovirus\nn01333610\tReoviridae\nn01334217\tvariola major, variola major virus\nn01334690\tviroid, virusoid\nn01335218\tcoliphage\nn01337191\tparamyxovirus\nn01337734\tpoliovirus\nn01338685\therpes, herpes virus\nn01339083\therpes simplex 1, HS1, HSV-1, HSV-I\nn01339336\therpes zoster, herpes zoster virus\nn01339471\therpes varicella zoster, herpes varicella zoster virus\nn01339801\tcytomegalovirus, CMV\nn01340014\tvaricella zoster virus\nn01340522\tpolyoma, polyoma virus\nn01340785\tlyssavirus\nn01340935\treovirus\nn01341090\trotavirus\nn01342269\tmoneran, moneron\nn01347583\tarchaebacteria, archaebacterium, archaeobacteria, archeobacteria\nn01349735\tbacteroid\nn01350226\tBacillus anthracis, anthrax bacillus\nn01350701\tYersinia pestis\nn01351170\tBrucella\nn01351315\tspirillum, spirilla\nn01357328\tbotulinus, botulinum, Clostridium botulinum\nn01357507\tclostridium perfringens\nn01358572\tcyanobacteria, blue-green algae\nn01359762\ttrichodesmium\nn01362336\tnitric bacteria, 
nitrobacteria\nn01363719\tspirillum\nn01365474\tFrancisella, genus Francisella\nn01365885\tgonococcus, Neisseria gonorrhoeae\nn01366700\tCorynebacterium diphtheriae, C. diphtheriae, Klebs-Loeffler bacillus\nn01367772\tenteric bacteria, enterobacteria, enterics, entric\nn01368672\tklebsiella\nn01369358\tSalmonella typhimurium\nn01369484\ttyphoid bacillus, Salmonella typhosa, Salmonella typhi\nn01374703\tnitrate bacterium, nitric bacterium\nn01374846\tnitrite bacterium, nitrous bacterium\nn01375204\tactinomycete\nn01376237\tstreptomyces\nn01376437\tStreptomyces erythreus\nn01376543\tStreptomyces griseus\nn01377278\ttubercle bacillus, Mycobacterium tuberculosis\nn01377510\tpus-forming bacteria\nn01377694\tstreptobacillus\nn01378545\tmyxobacteria, myxobacterium, myxobacter, gliding bacteria, slime bacteria\nn01379389\tstaphylococcus, staphylococci, staph\nn01380610\tdiplococcus\nn01380754\tpneumococcus, Diplococcus pneumoniae\nn01381044\tstreptococcus, streptococci, strep\nn01382033\tspirochete, spirochaete\nn01384084\tplanktonic algae\nn01384164\tzooplankton\nn01384687\tparasite\nn01385017\tendoparasite, entoparasite, entozoan, entozoon, endozoan\nn01385330\tectoparasite, ectozoan, ectozoon, epizoan, epizoon\nn01386007\tpathogen\nn01386182\tcommensal\nn01386354\tmyrmecophile\nn01387065\tprotoctist\nn01389507\tprotozoan, protozoon\nn01390123\tsarcodinian, sarcodine\nn01390763\theliozoan\nn01392275\tendameba\nn01392380\tameba, amoeba\nn01393486\tglobigerina\nn01394040\ttestacean\nn01394492\tarcella\nn01394771\tdifflugia\nn01395254\tciliate, ciliated protozoan, ciliophoran\nn01396048\tparamecium, paramecia\nn01396617\tstentor\nn01397114\talga, algae\nn01397690\tarame\nn01397871\tseagrass\nn01400247\tgolden algae\nn01400391\tyellow-green algae\nn01402600\tbrown algae\nn01403457\tkelp\nn01404365\tfucoid, fucoid algae\nn01404495\tfucoid\nn01405007\tfucus\nn01405616\tbladderwrack, Ascophyllum nodosum\nn01407798\tgreen algae, chlorophyte\nn01410457\tpond 
scum\nn01411450\tchlorella\nn01412694\tstonewort\nn01413457\tdesmid\nn01414216\tsea moss\nn01415626\teukaryote, eucaryote\nn01415920\tprokaryote, procaryote\nn01416213\tzooid\nn01418498\tLeishmania, genus Leishmania\nn01418620\tzoomastigote, zooflagellate\nn01419332\tpolymastigote\nn01419573\tcostia, Costia necatrix\nn01419888\tgiardia\nn01421333\tcryptomonad, cryptophyte\nn01421807\tsporozoan\nn01422185\tsporozoite\nn01422335\ttrophozoite\nn01422450\tmerozoite\nn01423302\tcoccidium, eimeria\nn01423617\tgregarine\nn01424420\tplasmodium, Plasmodium vivax, malaria parasite\nn01425223\tleucocytozoan, leucocytozoon\nn01427399\tmicrosporidian\nn01429172\tOstariophysi, order Ostariophysi\nn01438208\tcypriniform fish\nn01438581\tloach\nn01439121\tcyprinid, cyprinid fish\nn01439514\tcarp\nn01439808\tdomestic carp, Cyprinus carpio\nn01440160\tleather carp\nn01440242\tmirror carp\nn01440467\tEuropean bream, Abramis brama\nn01440764\ttench, Tinca tinca\nn01441117\tdace, Leuciscus leuciscus\nn01441272\tchub, Leuciscus cephalus\nn01441425\tshiner\nn01441910\tcommon shiner, silversides, Notropis cornutus\nn01442450\troach, Rutilus rutilus\nn01442710\trudd, Scardinius erythrophthalmus\nn01442972\tminnow, Phoxinus phoxinus\nn01443243\tgudgeon, Gobio gobio\nn01443537\tgoldfish, Carassius auratus\nn01443831\tcrucian carp, Carassius carassius, Carassius vulgaris\nn01444339\telectric eel, Electrophorus electric\nn01444783\tcatostomid\nn01445429\tbuffalo fish, buffalofish\nn01445593\tblack buffalo, Ictiobus niger\nn01445857\thog sucker, hog molly, Hypentelium nigricans\nn01446152\tredhorse, redhorse sucker\nn01446589\tcyprinodont\nn01446760\tkillifish\nn01447139\tmummichog, Fundulus heteroclitus\nn01447331\tstriped killifish, mayfish, may fish, Fundulus majalis\nn01447658\trivulus\nn01447946\tflagfish, American flagfish, Jordanella floridae\nn01448291\tswordtail, helleri, topminnow, Xyphophorus helleri\nn01448594\tguppy, rainbow fish, Lebistes reticulatus\nn01448951\ttopminnow, 
poeciliid fish, poeciliid, live-bearer\nn01449374\tmosquitofish, Gambusia affinis\nn01449712\tplaty, Platypoecilus maculatus\nn01449980\tmollie, molly\nn01450661\tsquirrelfish\nn01450950\treef squirrelfish, Holocentrus coruscus\nn01451115\tdeepwater squirrelfish, Holocentrus bullisi\nn01451295\tHolocentrus ascensionis\nn01451426\tsoldierfish, soldier-fish\nn01451863\tanomalops, flashlight fish\nn01452345\tflashlight fish, Photoblepharon palpebratus\nn01453087\tJohn Dory, Zeus faber\nn01453475\tboarfish, Capros aper\nn01453742\tboarfish\nn01454545\tcornetfish\nn01454856\tstickleback, prickleback\nn01455317\tthree-spined stickleback, Gasterosteus aculeatus\nn01455461\tten-spined stickleback, Gasterosteus pungitius\nn01455778\tpipefish, needlefish\nn01456137\tdwarf pipefish, Syngnathus hildebrandi\nn01456454\tdeepwater pipefish, Cosmocampus profundus\nn01456756\tseahorse, sea horse\nn01457082\tsnipefish, bellows fish\nn01457407\tshrimpfish, shrimp-fish\nn01457852\ttrumpetfish, Aulostomus maculatus\nn01458746\tpellicle\nn01458842\tembryo, conceptus, fertilized egg\nn01459791\tfetus, foetus\nn01460303\tabortus\nn01461315\tspawn\nn01461646\tblastula, blastosphere\nn01462042\tblastocyst, blastodermic vessicle\nn01462544\tgastrula\nn01462803\tmorula\nn01464844\tyolk, vitellus\nn01466257\tchordate\nn01467336\tcephalochordate\nn01467804\tlancelet, amphioxus\nn01468238\ttunicate, urochordate, urochord\nn01468712\tascidian\nn01469103\tsea squirt\nn01469723\tsalp, salpa\nn01470145\tdoliolum\nn01470479\tlarvacean\nn01470733\tappendicularia\nn01470895\tascidian tadpole\nn01471682\tvertebrate, craniate\nn01472303\tAmniota\nn01472502\tamniote\nn01473806\taquatic vertebrate\nn01474283\tjawless vertebrate, jawless fish, agnathan\nn01474864\tostracoderm\nn01475232\theterostracan\nn01475940\tanaspid\nn01476418\tconodont\nn01477080\tcyclostome\nn01477525\tlamprey, lamprey eel, lamper eel\nn01477875\tsea lamprey, Petromyzon marinus\nn01478511\thagfish, hag, slime eels\nn01478969\tMyxine 
glutinosa\nn01479213\teptatretus\nn01479820\tgnathostome\nn01480106\tplacoderm\nn01480516\tcartilaginous fish, chondrichthian\nn01480880\tholocephalan, holocephalian\nn01481331\tchimaera\nn01481498\trabbitfish, Chimaera monstrosa\nn01482071\telasmobranch, selachian\nn01482330\tshark\nn01483021\tcow shark, six-gilled shark, Hexanchus griseus\nn01483522\tmackerel shark\nn01483830\tporbeagle, Lamna nasus\nn01484097\tmako, mako shark\nn01484285\tshortfin mako, Isurus oxyrhincus\nn01484447\tlongfin mako, Isurus paucus\nn01484562\tbonito shark, blue pointed, Isurus glaucus\nn01484850\tgreat white shark, white shark, man-eater, man-eating shark, Carcharodon carcharias\nn01485479\tbasking shark, Cetorhinus maximus\nn01486010\tthresher, thrasher, thresher shark, fox shark, Alopius vulpinus\nn01486540\tcarpet shark, Orectolobus barbatus\nn01486838\tnurse shark, Ginglymostoma cirratum\nn01487506\tsand tiger, sand shark, Carcharias taurus, Odontaspis taurus\nn01488038\twhale shark, Rhincodon typus\nn01488918\trequiem shark\nn01489501\tbull shark, cub shark, Carcharhinus leucas\nn01489709\tsandbar shark, Carcharhinus plumbeus\nn01489920\tblacktip shark, sandbar shark, Carcharhinus limbatus\nn01490112\twhitetip shark, oceanic whitetip shark, white-tipped shark, Carcharinus longimanus\nn01490360\tdusky shark, Carcharhinus obscurus\nn01490670\tlemon shark, Negaprion brevirostris\nn01491006\tblue shark, great blue shark, Prionace glauca\nn01491361\ttiger shark, Galeocerdo cuvieri\nn01491661\tsoupfin shark, soupfin, soup-fin, Galeorhinus zyopterus\nn01491874\tdogfish\nn01492357\tsmooth dogfish\nn01492569\tsmoothhound, smoothhound shark, Mustelus mustelus\nn01492708\tAmerican smooth dogfish, Mustelus canis\nn01492860\tFlorida smoothhound, Mustelus norrisi\nn01493146\twhitetip shark, reef whitetip shark, Triaenodon obseus\nn01493541\tspiny dogfish\nn01493829\tAtlantic spiny dogfish, Squalus acanthias\nn01494041\tPacific spiny dogfish, Squalus suckleyi\nn01494475\thammerhead, 
hammerhead shark\nn01494757\tsmooth hammerhead, Sphyrna zygaena\nn01494882\tsmalleye hammerhead, Sphyrna tudes\nn01495006\tshovelhead, bonnethead, bonnet shark, Sphyrna tiburo\nn01495493\tangel shark, angelfish, Squatina squatina, monkfish\nn01495701\tray\nn01496331\telectric ray, crampfish, numbfish, torpedo\nn01497118\tsawfish\nn01497413\tsmalltooth sawfish, Pristis pectinatus\nn01497738\tguitarfish\nn01498041\tstingray\nn01498406\troughtail stingray, Dasyatis centroura\nn01498699\tbutterfly ray\nn01498989\teagle ray\nn01499396\tspotted eagle ray, spotted ray, Aetobatus narinari\nn01499732\tcownose ray, cow-nosed ray, Rhinoptera bonasus\nn01500091\tmanta, manta ray, devilfish\nn01500476\tAtlantic manta, Manta birostris\nn01500854\tdevil ray, Mobula hypostoma\nn01501160\tskate\nn01501641\tgrey skate, gray skate, Raja batis\nn01501777\tlittle skate, Raja erinacea\nn01501948\tthorny skate, Raja radiata\nn01502101\tbarndoor skate, Raja laevis\nn01503061\tbird\nn01503976\tdickeybird, dickey-bird, dickybird, dicky-bird\nn01504179\tfledgling, fledgeling\nn01504344\tnestling, baby bird\nn01514668\tcock\nn01514752\tgamecock, fighting cock\nn01514859\then\nn01514926\tnester\nn01515078\tnight bird\nn01515217\tnight raven\nn01515303\tbird of passage\nn01516212\tarchaeopteryx, archeopteryx, Archaeopteryx lithographica\nn01517389\tarchaeornis\nn01517565\tratite, ratite bird, flightless bird\nn01517966\tcarinate, carinate bird, flying bird\nn01518878\tostrich, Struthio camelus\nn01519563\tcassowary\nn01519873\temu, Dromaius novaehollandiae, Emu novaehollandiae\nn01520576\tkiwi, apteryx\nn01521399\trhea, Rhea americana\nn01521756\trhea, nandu, Pterocnemia pennata\nn01522450\telephant bird, aepyornis\nn01523105\tmoa\nn01524359\tpasserine, passeriform bird\nn01524761\tnonpasserine bird\nn01525720\toscine, oscine bird\nn01526521\tsongbird, songster\nn01526766\thoney eater, honeysucker\nn01527194\taccentor\nn01527347\thedge sparrow, sparrow, dunnock, Prunella 
modularis\nn01527617\tlark\nn01527917\tskylark, Alauda arvensis\nn01528396\twagtail\nn01528654\tpipit, titlark, lark\nn01528845\tmeadow pipit, Anthus pratensis\nn01529672\tfinch\nn01530439\tchaffinch, Fringilla coelebs\nn01530575\tbrambling, Fringilla montifringilla\nn01531178\tgoldfinch, Carduelis carduelis\nn01531344\tlinnet, lintwhite, Carduelis cannabina\nn01531512\tsiskin, Carduelis spinus\nn01531639\tred siskin, Carduelis cucullata\nn01531811\tredpoll, Carduelis flammea\nn01531971\tredpoll, Carduelis hornemanni\nn01532325\tNew World goldfinch, goldfinch, yellowbird, Spinus tristis\nn01532511\tpine siskin, pine finch, Spinus pinus\nn01532829\thouse finch, linnet, Carpodacus mexicanus\nn01533000\tpurple finch, Carpodacus purpureus\nn01533339\tcanary, canary bird\nn01533481\tcommon canary, Serinus canaria\nn01533651\tserin\nn01533893\tcrossbill, Loxia curvirostra\nn01534155\tbullfinch, Pyrrhula pyrrhula\nn01534433\tjunco, snowbird\nn01534582\tdark-eyed junco, slate-colored junco, Junco hyemalis\nn01534762\tNew World sparrow\nn01535140\tvesper sparrow, grass finch, Pooecetes gramineus\nn01535469\twhite-throated sparrow, whitethroat, Zonotrichia albicollis\nn01535690\twhite-crowned sparrow, Zonotrichia leucophrys\nn01536035\tchipping sparrow, Spizella passerina\nn01536186\tfield sparrow, Spizella pusilla\nn01536334\ttree sparrow, Spizella arborea\nn01536644\tsong sparrow, Melospiza melodia\nn01536780\tswamp sparrow, Melospiza georgiana\nn01537134\tbunting\nn01537544\tindigo bunting, indigo finch, indigo bird, Passerina cyanea\nn01537895\tortolan, ortolan bunting, Emberiza hortulana\nn01538059\treed bunting, Emberiza schoeniclus\nn01538200\tyellowhammer, yellow bunting, Emberiza citrinella\nn01538362\tyellow-breasted bunting, Emberiza aureola\nn01538630\tsnow bunting, snowbird, snowflake, Plectrophenax nivalis\nn01538955\thoneycreeper\nn01539272\tbanana quit\nn01539573\tsparrow, true sparrow\nn01539925\tEnglish sparrow, house sparrow, Passer 
domesticus\nn01540090\ttree sparrow, Passer montanus\nn01540233\tgrosbeak, grossbeak\nn01540566\tevening grosbeak, Hesperiphona vespertina\nn01540832\thawfinch, Coccothraustes coccothraustes\nn01541102\tpine grosbeak, Pinicola enucleator\nn01541386\tcardinal, cardinal grosbeak, Richmondena Cardinalis, Cardinalis cardinalis, redbird\nn01541760\tpyrrhuloxia, Pyrrhuloxia sinuata\nn01541922\ttowhee\nn01542168\tchewink, cheewink, Pipilo erythrophthalmus\nn01542433\tgreen-tailed towhee, Chlorura chlorura\nn01542786\tweaver, weaverbird, weaver finch\nn01543175\tbaya, Ploceus philippinus\nn01543383\twhydah, whidah, widow bird\nn01543632\tJava sparrow, Java finch, ricebird, Padda oryzivora\nn01543936\tavadavat, amadavat\nn01544208\tgrassfinch, grass finch\nn01544389\tzebra finch, Poephila castanotis\nn01544704\thoneycreeper, Hawaiian honeycreeper\nn01545574\tlyrebird\nn01546039\tscrubbird, scrub-bird, scrub bird\nn01546506\tbroadbill\nn01546921\ttyrannid\nn01547832\tNew World flycatcher, flycatcher, tyrant flycatcher, tyrant bird\nn01548301\tkingbird, Tyrannus tyrannus\nn01548492\tArkansas kingbird, western kingbird\nn01548694\tCassin's kingbird, Tyrannus vociferans\nn01548865\teastern kingbird\nn01549053\tgrey kingbird, gray kingbird, petchary, Tyrannus domenicensis domenicensis\nn01549430\tpewee, peewee, peewit, pewit, wood pewee, Contopus virens\nn01549641\twestern wood pewee, Contopus sordidulus\nn01549886\tphoebe, phoebe bird, Sayornis phoebe\nn01550172\tvermillion flycatcher, firebird, Pyrocephalus rubinus mexicanus\nn01550761\tcotinga, chatterer\nn01551080\tcock of the rock, Rupicola rupicola\nn01551300\tcock of the rock, Rupicola peruviana\nn01551711\tmanakin\nn01552034\tbellbird\nn01552333\tumbrella bird, Cephalopterus ornatus\nn01552813\tovenbird\nn01553142\tantbird, ant bird\nn01553527\tant thrush\nn01553762\tant shrike\nn01554017\tspotted antbird, Hylophylax naevioides\nn01554448\twoodhewer, woodcreeper, wood-creeper, tree 
creeper\nn01555004\tpitta\nn01555305\tscissortail, scissortailed flycatcher, Muscivora-forficata\nn01555809\tOld World flycatcher, true flycatcher, flycatcher\nn01556182\tspotted flycatcher, Muscicapa striata, Muscicapa grisola\nn01556514\tthickhead, whistler\nn01557185\tthrush\nn01557962\tmissel thrush, mistle thrush, mistletoe thrush, Turdus viscivorus\nn01558149\tsong thrush, mavis, throstle, Turdus philomelos\nn01558307\tfieldfare, snowbird, Turdus pilaris\nn01558461\tredwing, Turdus iliacus\nn01558594\tblackbird, merl, merle, ouzel, ousel, European blackbird, Turdus merula\nn01558765\tring ouzel, ring blackbird, ring thrush, Turdus torquatus\nn01558993\trobin, American robin, Turdus migratorius\nn01559160\tclay-colored robin, Turdus greyi\nn01559477\thermit thrush, Hylocichla guttata\nn01559639\tveery, Wilson's thrush, Hylocichla fuscescens\nn01559804\twood thrush, Hylocichla mustelina\nn01560105\tnightingale, Luscinia megarhynchos\nn01560280\tthrush nightingale, Luscinia luscinia\nn01560419\tbulbul\nn01560636\tOld World chat, chat\nn01560793\tstonechat, Saxicola torquata\nn01560935\twhinchat, Saxicola rubetra\nn01561181\tsolitaire\nn01561452\tredstart, redtail\nn01561732\twheatear\nn01562014\tbluebird\nn01562265\trobin, redbreast, robin redbreast, Old World robin, Erithacus rubecola\nn01562451\tbluethroat, Erithacus svecicus\nn01563128\twarbler\nn01563449\tgnatcatcher\nn01563746\tkinglet\nn01563945\tgoldcrest, golden-crested kinglet, Regulus regulus\nn01564101\tgold-crowned kinglet, Regulus satrata\nn01564217\truby-crowned kinglet, ruby-crowned wren, Regulus calendula\nn01564394\tOld World warbler, true warbler\nn01564773\tblackcap, Silvia atricapilla\nn01564914\tgreater whitethroat, whitethroat, Sylvia communis\nn01565078\tlesser whitethroat, whitethroat, Sylvia curruca\nn01565345\twood warbler, Phylloscopus sibilatrix\nn01565599\tsedge warbler, sedge bird, sedge wren, reedbird, Acrocephalus schoenobaenus\nn01565930\twren warbler\nn01566207\ttailorbird, 
Orthotomus sutorius\nn01566645\tbabbler, cackler\nn01567133\tNew World warbler, wood warbler\nn01567678\tparula warbler, northern parula, Parula americana\nn01567879\tWilson's warbler, Wilson's blackcap, Wilsonia pusilla\nn01568132\tflycatching warbler\nn01568294\tAmerican redstart, redstart, Setophaga ruticilla\nn01568720\tCape May warbler, Dendroica tigrina\nn01568892\tyellow warbler, golden warbler, yellowbird, Dendroica petechia\nn01569060\tBlackburn, Blackburnian warbler, Dendroica fusca\nn01569262\tAudubon's warbler, Audubon warbler, Dendroica auduboni\nn01569423\tmyrtle warbler, myrtle bird, Dendroica coronata\nn01569566\tblackpoll, Dendroica striate\nn01569836\tNew World chat, chat\nn01569971\tyellow-breasted chat, Icteria virens\nn01570267\tovenbird, Seiurus aurocapillus\nn01570421\twater thrush\nn01570676\tyellowthroat\nn01570839\tcommon yellowthroat, Maryland yellowthroat, Geothlypis trichas\nn01571410\triflebird, Ptloris paradisea\nn01571904\tNew World oriole, American oriole, oriole\nn01572328\tnorthern oriole, Icterus galbula\nn01572489\tBaltimore oriole, Baltimore bird, hangbird, firebird, Icterus galbula galbula\nn01572654\tBullock's oriole, Icterus galbula bullockii\nn01572782\torchard oriole, Icterus spurius\nn01573074\tmeadowlark, lark\nn01573240\teastern meadowlark, Sturnella magna\nn01573360\twestern meadowlark, Sturnella neglecta\nn01573627\tcacique, cazique\nn01573898\tbobolink, ricebird, reedbird, Dolichonyx oryzivorus\nn01574045\tNew World blackbird, blackbird\nn01574390\tgrackle, crow blackbird\nn01574560\tpurple grackle, Quiscalus quiscula\nn01574801\trusty blackbird, rusty grackle, Euphagus carilonus\nn01575117\tcowbird\nn01575401\tred-winged blackbird, redwing, Agelaius phoeniceus\nn01575745\tOld World oriole, oriole\nn01576076\tgolden oriole, Oriolus oriolus\nn01576358\tfig-bird\nn01576695\tstarling\nn01577035\tcommon starling, Sturnus vulgaris\nn01577458\trose-colored starling, rose-colored pastor, Pastor sturnus, Pastor 
roseus\nn01577659\tmyna, mynah, mina, minah, myna bird, mynah bird\nn01577941\tcrested myna, Acridotheres tristis\nn01578180\thill myna, Indian grackle, grackle, Gracula religiosa\nn01578575\tcorvine bird\nn01579028\tcrow\nn01579149\tAmerican crow, Corvus brachyrhyncos\nn01579260\traven, Corvus corax\nn01579410\trook, Corvus frugilegus\nn01579578\tjackdaw, daw, Corvus monedula\nn01579729\tchough\nn01580077\tjay\nn01580379\tOld World jay\nn01580490\tcommon European jay, Garullus garullus\nn01580772\tNew World jay\nn01580870\tblue jay, jaybird, Cyanocitta cristata\nn01581166\tCanada jay, grey jay, gray jay, camp robber, whisker jack, Perisoreus canadensis\nn01581434\tRocky Mountain jay, Perisoreus canadensis capitalis\nn01581730\tnutcracker\nn01581874\tcommon nutcracker, Nucifraga caryocatactes\nn01581984\tClark's nutcracker, Nucifraga columbiana\nn01582220\tmagpie\nn01582398\tEuropean magpie, Pica pica\nn01582498\tAmerican magpie, Pica pica hudsonia\nn01582856\tAustralian magpie\nn01583209\tbutcherbird\nn01583495\tcurrawong, bell magpie\nn01583828\tpiping crow, piping crow-shrike, Gymnorhina tibicen\nn01584225\twren, jenny wren\nn01584695\twinter wren, Troglodytes troglodytes\nn01584853\thouse wren, Troglodytes aedon\nn01585121\tmarsh wren\nn01585287\tlong-billed marsh wren, Cistothorus palustris\nn01585422\tsedge wren, short-billed marsh wren, Cistothorus platensis\nn01585715\trock wren, Salpinctes obsoletus\nn01586020\tCarolina wren, Thryothorus ludovicianus\nn01586374\tcactus wren\nn01586941\tmockingbird, mocker, Mimus polyglotktos\nn01587278\tblue mockingbird, Melanotis caerulescens\nn01587526\tcatbird, grey catbird, gray catbird, Dumetella carolinensis\nn01587834\tthrasher, mocking thrush\nn01588002\tbrown thrasher, brown thrush, Toxostoma rufums\nn01588431\tNew Zealand wren\nn01588725\trock wren, Xenicus gilviventris\nn01588996\trifleman bird, Acanthisitta chloris\nn01589286\tcreeper, tree creeper\nn01589718\tbrown creeper, American creeper, Certhia 
americana\nn01589893\tEuropean creeper, Certhia familiaris\nn01590220\twall creeper, tichodrome, Tichodroma muriaria\nn01591005\tEuropean nuthatch, Sitta europaea\nn01591123\tred-breasted nuthatch, Sitta canadensis\nn01591301\twhite-breasted nuthatch, Sitta carolinensis\nn01591697\ttitmouse, tit\nn01592084\tchickadee\nn01592257\tblack-capped chickadee, blackcap, Parus atricapillus\nn01592387\ttufted titmouse, Parus bicolor\nn01592540\tCarolina chickadee, Parus carolinensis\nn01592694\tblue tit, tomtit, Parus caeruleus\nn01593028\tbushtit, bush tit\nn01593282\twren-tit, Chamaea fasciata\nn01593553\tverdin, Auriparus flaviceps\nn01594004\tfairy bluebird, bluebird\nn01594372\tswallow\nn01594787\tbarn swallow, chimney swallow, Hirundo rustica\nn01594968\tcliff swallow, Hirundo pyrrhonota\nn01595168\ttree swallow, tree martin, Hirundo nigricans\nn01595450\twhite-bellied swallow, tree swallow, Iridoprocne bicolor\nn01595624\tmartin\nn01595974\thouse martin, Delichon urbica\nn01596273\tbank martin, bank swallow, sand martin, Riparia riparia\nn01596608\tpurple martin, Progne subis\nn01597022\twood swallow, swallow shrike\nn01597336\ttanager\nn01597737\tscarlet tanager, Piranga olivacea, redbird, firebird\nn01597906\twestern tanager, Piranga ludoviciana\nn01598074\tsummer tanager, summer redbird, Piranga rubra\nn01598271\thepatic tanager, Piranga flava hepatica\nn01598588\tshrike\nn01598988\tbutcherbird\nn01599159\tEuropean shrike, Lanius excubitor\nn01599269\tnorthern shrike, Lanius borealis\nn01599388\twhite-rumped shrike, Lanius ludovicianus excubitorides\nn01599556\tloggerhead shrike, Lanius lucovicianus\nn01599741\tmigrant shrike, Lanius ludovicianus migrans\nn01600085\tbush shrike\nn01600341\tblack-fronted bush shrike, Chlorophoneus nigrifrons\nn01600657\tbowerbird, catbird\nn01601068\tsatin bowerbird, satin bird, Ptilonorhynchus violaceus\nn01601410\tgreat bowerbird, Chlamydera nuchalis\nn01601694\twater ouzel, dipper\nn01602080\tEuropean water ouzel, Cinclus 
aquaticus\nn01602209\tAmerican water ouzel, Cinclus mexicanus\nn01602630\tvireo\nn01602832\tred-eyed vireo, Vireo olivaceous\nn01603000\tsolitary vireo, Vireo solitarius\nn01603152\tblue-headed vireo, Vireo solitarius solitarius\nn01603600\twaxwing\nn01603812\tcedar waxwing, cedarbird, Bombycilla cedrorun\nn01603953\tBohemian waxwing, Bombycilla garrulus\nn01604330\tbird of prey, raptor, raptorial bird\nn01604968\tAccipitriformes, order Accipitriformes\nn01605630\thawk\nn01606097\teyas\nn01606177\ttiercel, tercel, tercelet\nn01606522\tgoshawk, Accipiter gentilis\nn01606672\tsparrow hawk, Accipiter nisus\nn01606809\tCooper's hawk, blue darter, Accipiter cooperii\nn01606978\tchicken hawk, hen hawk\nn01607309\tbuteonine\nn01607429\tredtail, red-tailed hawk, Buteo jamaicensis\nn01607600\trough-legged hawk, roughleg, Buteo lagopus\nn01607812\tred-shouldered hawk, Buteo lineatus\nn01607962\tbuzzard, Buteo buteo\nn01608265\thoney buzzard, Pernis apivorus\nn01608432\tkite\nn01608814\tblack kite, Milvus migrans\nn01609062\tswallow-tailed kite, swallow-tailed hawk, Elanoides forficatus\nn01609391\twhite-tailed kite, Elanus leucurus\nn01609751\tharrier\nn01609956\tmarsh harrier, Circus Aeruginosus\nn01610100\tMontagu's harrier, Circus pygargus\nn01610226\tmarsh hawk, northern harrier, hen harrier, Circus cyaneus\nn01610552\tharrier eagle, short-toed eagle\nn01610955\tfalcon\nn01611472\tperegrine, peregrine falcon, Falco peregrinus\nn01611674\tfalcon-gentle, falcon-gentil\nn01611800\tgyrfalcon, gerfalcon, Falco rusticolus\nn01611969\tkestrel, Falco tinnunculus\nn01612122\tsparrow hawk, American kestrel, kestrel, Falco sparverius\nn01612275\tpigeon hawk, merlin, Falco columbarius\nn01612476\thobby, Falco subbuteo\nn01612628\tcaracara\nn01612955\tAudubon's caracara, Polyborus cheriway audubonii\nn01613177\tcarancha, Polyborus plancus\nn01613294\teagle, bird of Jove\nn01613615\tyoung bird\nn01613807\teaglet\nn01614038\tharpy, harpy eagle, Harpia harpyja\nn01614343\tgolden eagle, 
Aquila chrysaetos\nn01614556\ttawny eagle, Aquila rapax\nn01614925\tbald eagle, American eagle, Haliaeetus leucocephalus\nn01615121\tsea eagle\nn01615303\tKamchatkan sea eagle, Stellar's sea eagle, Haliaeetus pelagicus\nn01615458\tern, erne, grey sea eagle, gray sea eagle, European sea eagle, white-tailed sea eagle, Haliatus albicilla\nn01615703\tfishing eagle, Haliaeetus leucorhyphus\nn01616086\tosprey, fish hawk, fish eagle, sea eagle, Pandion haliaetus\nn01616318\tvulture\nn01616551\tAegypiidae, family Aegypiidae\nn01616764\tOld World vulture\nn01617095\tgriffon vulture, griffon, Gyps fulvus\nn01617443\tbearded vulture, lammergeier, lammergeyer, Gypaetus barbatus\nn01617766\tEgyptian vulture, Pharaoh's chicken, Neophron percnopterus\nn01618082\tblack vulture, Aegypius monachus\nn01618503\tsecretary bird, Sagittarius serpentarius\nn01618922\tNew World vulture, cathartid\nn01619310\tbuzzard, turkey buzzard, turkey vulture, Cathartes aura\nn01619536\tcondor\nn01619835\tAndean condor, Vultur gryphus\nn01620135\tCalifornia condor, Gymnogyps californianus\nn01620414\tblack vulture, carrion crow, Coragyps atratus\nn01620735\tking vulture, Sarcorhamphus papa\nn01621127\towl, bird of Minerva, bird of night, hooter\nn01621635\towlet\nn01622120\tlittle owl, Athene noctua\nn01622352\thorned owl\nn01622483\tgreat horned owl, Bubo virginianus\nn01622779\tgreat grey owl, great gray owl, Strix nebulosa\nn01622959\ttawny owl, Strix aluco\nn01623110\tbarred owl, Strix varia\nn01623425\tscreech owl, Otus asio\nn01623615\tscreech owl\nn01623706\tscops owl\nn01623880\tspotted owl, Strix occidentalis\nn01624115\tOld World scops owl, Otus scops\nn01624212\tOriental scops owl, Otus sunia\nn01624305\thoot owl\nn01624537\thawk owl, Surnia ulula\nn01624833\tlong-eared owl, Asio otus\nn01625121\tlaughing owl, laughing jackass, Sceloglaux albifacies\nn01625562\tbarn owl, Tyto alba\nn01627424\tamphibian\nn01628331\tIchyostega\nn01628770\turodele, 
caudate\nn01629276\tsalamander\nn01629819\tEuropean fire salamander, Salamandra salamandra\nn01629962\tspotted salamander, fire salamander, Salamandra maculosa\nn01630148\talpine salamander, Salamandra atra\nn01630284\tnewt, triton\nn01630670\tcommon newt, Triturus vulgaris\nn01630901\tred eft, Notophthalmus viridescens\nn01631175\tPacific newt\nn01631354\trough-skinned newt, Taricha granulosa\nn01631512\tCalifornia newt, Taricha torosa\nn01631663\teft\nn01632047\tambystomid, ambystomid salamander\nn01632308\tmole salamander, Ambystoma talpoideum\nn01632458\tspotted salamander, Ambystoma maculatum\nn01632601\ttiger salamander, Ambystoma tigrinum\nn01632777\taxolotl, mud puppy, Ambystoma mexicanum\nn01632952\twaterdog\nn01633406\thellbender, mud puppy, Cryptobranchus alleganiensis\nn01633781\tgiant salamander, Megalobatrachus maximus\nn01634227\tolm, Proteus anguinus\nn01634522\tmud puppy, Necturus maculosus\nn01635027\tdicamptodon, dicamptodontid\nn01635176\tPacific giant salamander, Dicamptodon ensatus\nn01635480\tolympic salamander, Rhyacotriton olympicus\nn01636127\tlungless salamander, plethodont\nn01636352\teastern red-backed salamander, Plethodon cinereus\nn01636510\twestern red-backed salamander, Plethodon vehiculum\nn01636829\tdusky salamander\nn01637112\tclimbing salamander\nn01637338\tarboreal salamander, Aneides lugubris\nn01637615\tslender salamander, worm salamander\nn01637932\tweb-toed salamander\nn01638194\tShasta salamander, Hydromantes shastae\nn01638329\tlimestone salamander, Hydromantes brunus\nn01638722\tamphiuma, congo snake, congo eel, blind eel\nn01639187\tsiren\nn01639765\tfrog, toad, toad frog, anuran, batrachian, salientian\nn01640846\ttrue frog, ranid\nn01641206\twood-frog, wood frog, Rana sylvatica\nn01641391\tleopard frog, spring frog, Rana pipiens\nn01641577\tbullfrog, Rana catesbeiana\nn01641739\tgreen frog, spring frog, Rana clamitans\nn01641930\tcascades frog, Rana cascadae\nn01642097\tgoliath frog, Rana goliath\nn01642257\tpickerel 
frog, Rana palustris\nn01642391\ttarahumara frog, Rana tarahumarae\nn01642539\tgrass frog, Rana temporaria\nn01642943\tleptodactylid frog, leptodactylid\nn01643255\trobber frog\nn01643507\tbarking frog, robber frog, Hylactophryne augusti\nn01643896\tcrapaud, South American bullfrog, Leptodactylus pentadactylus\nn01644373\ttree frog, tree-frog\nn01644900\ttailed frog, bell toad, ribbed toad, tailed toad, Ascaphus trui\nn01645466\tLiopelma hamiltoni\nn01645776\ttrue toad\nn01646292\tbufo\nn01646388\tagua, agua toad, Bufo marinus\nn01646555\tEuropean toad, Bufo bufo\nn01646648\tnatterjack, Bufo calamita\nn01646802\tAmerican toad, Bufo americanus\nn01646902\tEurasian green toad, Bufo viridis\nn01647033\tAmerican green toad, Bufo debilis\nn01647180\tYosemite toad, Bufo canorus\nn01647303\tTexas toad, Bufo speciosus\nn01647466\tsouthwestern toad, Bufo microscaphus\nn01647640\twestern toad, Bufo boreas\nn01648139\tobstetrical toad, midwife toad, Alytes obstetricans\nn01648356\tmidwife toad, Alytes cisternasi\nn01648620\tfire-bellied toad, Bombina bombina\nn01649170\tspadefoot, spadefoot toad\nn01649412\twestern spadefoot, Scaphiopus hammondii\nn01649556\tsouthern spadefoot, Scaphiopus multiplicatus\nn01649726\tplains spadefoot, Scaphiopus bombifrons\nn01650167\ttree toad, tree frog, tree-frog\nn01650690\tspring peeper, Hyla crucifer\nn01650901\tPacific tree toad, Hyla regilla\nn01651059\tcanyon treefrog, Hyla arenicolor\nn01651285\tchameleon tree frog\nn01651487\tcricket frog\nn01651641\tnorthern cricket frog, Acris crepitans\nn01651778\teastern cricket frog, Acris gryllus\nn01652026\tchorus frog\nn01652297\tlowland burrowing treefrog, northern casque-headed frog, Pternohyla fodiens\nn01653026\twestern narrow-mouthed toad, Gastrophryne olivacea\nn01653223\teastern narrow-mouthed toad, Gastrophryne carolinensis\nn01653509\tsheep frog\nn01653773\ttongueless frog\nn01654083\tSurinam toad, Pipa pipa, Pipa americana\nn01654637\tAfrican clawed frog, Xenopus 
laevis\nn01654863\tSouth American poison toad\nn01655344\tcaecilian, blindworm\nn01661091\treptile, reptilian\nn01661592\tanapsid, anapsid reptile\nn01661818\tdiapsid, diapsid reptile\nn01662060\tDiapsida, subclass Diapsida\nn01662622\tchelonian, chelonian reptile\nn01662784\tturtle\nn01663401\tsea turtle, marine turtle\nn01663782\tgreen turtle, Chelonia mydas\nn01664065\tloggerhead, loggerhead turtle, Caretta caretta\nn01664369\tridley\nn01664492\tAtlantic ridley, bastard ridley, bastard turtle, Lepidochelys kempii\nn01664674\tPacific ridley, olive ridley, Lepidochelys olivacea\nn01664990\thawksbill turtle, hawksbill, hawkbill, tortoiseshell turtle, Eretmochelys imbricata\nn01665541\tleatherback turtle, leatherback, leathery turtle, Dermochelys coriacea\nn01665932\tsnapping turtle\nn01666228\tcommon snapping turtle, snapper, Chelydra serpentina\nn01666585\talligator snapping turtle, alligator snapper, Macroclemys temmincki\nn01667114\tmud turtle\nn01667432\tmusk turtle, stinkpot\nn01667778\tterrapin\nn01668091\tdiamondback terrapin, Malaclemys centrata\nn01668436\tred-bellied terrapin, red-bellied turtle, redbelly, Pseudemys rubriventris\nn01668665\tslider, yellow-bellied terrapin, Pseudemys scripta\nn01668892\tcooter, river cooter, Pseudemys concinna\nn01669191\tbox turtle, box tortoise\nn01669372\tWestern box turtle, Terrapene ornata\nn01669654\tpainted turtle, painted terrapin, painted tortoise, Chrysemys picta\nn01670092\ttortoise\nn01670535\tEuropean tortoise, Testudo graeca\nn01670802\tgiant tortoise\nn01671125\tgopher tortoise, gopher turtle, gopher, Gopherus polypemus\nn01671479\tdesert tortoise, Gopherus agassizii\nn01671705\tTexas tortoise\nn01672032\tsoft-shelled turtle, pancake turtle\nn01672432\tspiny softshell, Trionyx spiniferus\nn01672611\tsmooth softshell, Trionyx muticus\nn01673282\ttuatara, Sphenodon punctatum\nn01674216\tsaurian\nn01674464\tlizard\nn01674990\tgecko\nn01675352\tflying gecko, fringed gecko, Ptychozoon 
homalocephalum\nn01675722\tbanded gecko\nn01676755\tiguanid, iguanid lizard\nn01677366\tcommon iguana, iguana, Iguana iguana\nn01677747\tmarine iguana, Amblyrhynchus cristatus\nn01678043\tdesert iguana, Dipsosaurus dorsalis\nn01678343\tchuckwalla, Sauromalus obesus\nn01678657\tzebra-tailed lizard, gridiron-tailed lizard, Callisaurus draconoides\nn01679005\tfringe-toed lizard, Uma notata\nn01679307\tearless lizard\nn01679626\tcollared lizard\nn01679962\tleopard lizard\nn01680264\tspiny lizard\nn01680478\tfence lizard\nn01680655\twestern fence lizard, swift, blue-belly, Sceloporus occidentalis\nn01680813\teastern fence lizard, pine lizard, Sceloporus undulatus\nn01680983\tsagebrush lizard, Sceloporus graciosus\nn01681328\tside-blotched lizard, sand lizard, Uta stansburiana\nn01681653\ttree lizard, Urosaurus ornatus\nn01681940\thorned lizard, horned toad, horny frog\nn01682172\tTexas horned lizard, Phrynosoma cornutum\nn01682435\tbasilisk\nn01682714\tAmerican chameleon, anole, Anolis carolinensis\nn01683201\tworm lizard\nn01683558\tnight lizard\nn01684133\tskink, scincid, scincid lizard\nn01684578\twestern skink, Eumeces skiltonianus\nn01684741\tmountain skink, Eumeces callicephalus\nn01685439\tteiid lizard, teiid\nn01685808\twhiptail, whiptail lizard\nn01686044\tracerunner, race runner, six-lined racerunner, Cnemidophorus sexlineatus\nn01686220\tplateau striped whiptail, Cnemidophorus velox\nn01686403\tChihuahuan spotted whiptail, Cnemidophorus exsanguis\nn01686609\twestern whiptail, Cnemidophorus tigris\nn01686808\tcheckered whiptail, Cnemidophorus tesselatus\nn01687128\tteju\nn01687290\tcaiman lizard\nn01687665\tagamid, agamid lizard\nn01687978\tagama\nn01688243\tfrilled lizard, Chlamydosaurus kingi\nn01688961\tmoloch\nn01689081\tmountain devil, spiny lizard, Moloch horridus\nn01689411\tanguid lizard\nn01689811\talligator lizard\nn01690149\tblindworm, slowworm, Anguis fragilis\nn01690466\tglass lizard, glass snake, joint snake\nn01691217\tlegless 
lizard\nn01691652\tLanthanotus borneensis\nn01691951\tvenomous lizard\nn01692333\tGila monster, Heloderma suspectum\nn01692523\tbeaded lizard, Mexican beaded lizard, Heloderma horridum\nn01692864\tlacertid lizard, lacertid\nn01693175\tsand lizard, Lacerta agilis\nn01693334\tgreen lizard, Lacerta viridis\nn01693783\tchameleon, chamaeleon\nn01694178\tAfrican chameleon, Chamaeleo chamaeleon\nn01694311\thorned chameleon, Chamaeleo oweni\nn01694709\tmonitor, monitor lizard, varan\nn01694955\tAfrican monitor, Varanus niloticus\nn01695060\tKomodo dragon, Komodo lizard, dragon lizard, giant lizard, Varanus komodoensis\nn01696633\tcrocodilian reptile, crocodilian\nn01697178\tcrocodile\nn01697457\tAfrican crocodile, Nile crocodile, Crocodylus niloticus\nn01697611\tAsian crocodile, Crocodylus porosus\nn01697749\tMorlett's crocodile\nn01697978\tfalse gavial, Tomistoma schlegeli\nn01698434\talligator, gator\nn01698640\tAmerican alligator, Alligator mississipiensis\nn01698782\tChinese alligator, Alligator sinensis\nn01699040\tcaiman, cayman\nn01699254\tspectacled caiman, Caiman sclerops\nn01699675\tgavial, Gavialis gangeticus\nn01701551\tarmored dinosaur\nn01701859\tstegosaur, stegosaurus, Stegosaur stenops\nn01702256\tankylosaur, ankylosaurus\nn01702479\tEdmontonia\nn01703011\tbone-headed dinosaur\nn01703161\tpachycephalosaur, pachycephalosaurus\nn01703569\tceratopsian, horned dinosaur\nn01704103\tprotoceratops\nn01704323\ttriceratops\nn01704626\tstyracosaur, styracosaurus\nn01705010\tpsittacosaur, psittacosaurus\nn01705591\tornithopod, ornithopod dinosaur\nn01705934\thadrosaur, hadrosaurus, duck-billed dinosaur\nn01707294\ttrachodon, trachodont\nn01708106\tsaurischian, saurischian dinosaur\nn01708998\tsauropod, sauropod dinosaur\nn01709484\tapatosaur, apatosaurus, brontosaur, brontosaurus, thunder lizard, Apatosaurus excelsus\nn01709876\tbarosaur, barosaurus\nn01710177\tdiplodocus\nn01711160\targentinosaur\nn01712008\ttheropod, theropod dinosaur, bird-footed 
dinosaur\nn01712752\tceratosaur, ceratosaurus\nn01713170\tcoelophysis\nn01713764\ttyrannosaur, tyrannosaurus, Tyrannosaurus rex\nn01714231\tallosaur, allosaurus\nn01715888\tornithomimid\nn01717016\tmaniraptor\nn01717229\toviraptorid\nn01717467\tvelociraptor\nn01718096\tdeinonychus\nn01718414\tutahraptor, superslasher\nn01719403\tsynapsid, synapsid reptile\nn01721174\tdicynodont\nn01721898\tpelycosaur\nn01722670\tdimetrodon\nn01722998\tpterosaur, flying reptile\nn01723579\tpterodactyl\nn01724231\tichthyosaur\nn01724840\tichthyosaurus\nn01725086\tstenopterygius, Stenopterygius quadrisicissus\nn01725713\tplesiosaur, plesiosaurus\nn01726203\tnothosaur\nn01726692\tsnake, serpent, ophidian\nn01727646\tcolubrid snake, colubrid\nn01728266\thoop snake\nn01728572\tthunder snake, worm snake, Carphophis amoenus\nn01728920\tringneck snake, ring-necked snake, ring snake\nn01729322\thognose snake, puff adder, sand viper\nn01729672\tleaf-nosed snake\nn01729977\tgreen snake, grass snake\nn01730185\tsmooth green snake, Opheodrys vernalis\nn01730307\trough green snake, Opheodrys aestivus\nn01730563\tgreen snake\nn01730812\tracer\nn01730960\tblacksnake, black racer, Coluber constrictor\nn01731137\tblue racer, Coluber constrictor flaviventris\nn01731277\thorseshoe whipsnake, Coluber hippocrepis\nn01731545\twhip-snake, whip snake, whipsnake\nn01731764\tcoachwhip, coachwhip snake, Masticophis flagellum\nn01731941\tCalifornia whipsnake, striped racer, Masticophis lateralis\nn01732093\tSonoran whipsnake, Masticophis bilineatus\nn01732244\trat snake\nn01732614\tcorn snake, red rat snake, Elaphe guttata\nn01732789\tblack rat snake, blacksnake, pilot blacksnake, mountain blacksnake, Elaphe obsoleta\nn01732989\tchicken snake\nn01733214\tIndian rat snake, Ptyas mucosus\nn01733466\tglossy snake, Arizona elegans\nn01733757\tbull snake, bull-snake\nn01733957\tgopher snake, Pituophis melanoleucus\nn01734104\tpine snake\nn01734418\tking snake, kingsnake\nn01734637\tcommon kingsnake, Lampropeltis 
getulus\nn01734808\tmilk snake, house snake, milk adder, checkered adder, Lampropeltis triangulum\nn01735189\tgarter snake, grass snake\nn01735439\tcommon garter snake, Thamnophis sirtalis\nn01735577\tribbon snake, Thamnophis sauritus\nn01735728\tWestern ribbon snake, Thamnophis proximus\nn01736032\tlined snake, Tropidoclonion lineatum\nn01736375\tground snake, Sonora semiannulata\nn01736796\teastern ground snake, Potamophis striatula, Haldea striatula\nn01737021\twater snake\nn01737472\tcommon water snake, banded water snake, Natrix sipedon, Nerodia sipedon\nn01737728\twater moccasin\nn01737875\tgrass snake, ring snake, ringed snake, Natrix natrix\nn01738065\tviperine grass snake, Natrix maura\nn01738306\tred-bellied snake, Storeria occipitamaculata\nn01738601\tsand snake\nn01738731\tbanded sand snake, Chilomeniscus cinctus\nn01739094\tblack-headed snake\nn01739381\tvine snake\nn01739647\tlyre snake\nn01739871\tSonoran lyre snake, Trimorphodon lambda\nn01740131\tnight snake, Hypsiglena torquata\nn01740551\tblind snake, worm snake\nn01740885\twestern blind snake, Leptotyphlops humilis\nn01741232\tindigo snake, gopher snake, Drymarchon corais\nn01741442\teastern indigo snake, Drymarchon corais couperi\nn01741562\tconstrictor\nn01741943\tboa\nn01742172\tboa constrictor, Constrictor constrictor\nn01742447\trubber boa, tow-headed snake, Charina bottae\nn01742821\trosy boa, Lichanura trivirgata\nn01743086\tanaconda, Eunectes murinus\nn01743605\tpython\nn01743936\tcarpet snake, Python variegatus, Morelia spilotes variegatus\nn01744100\treticulated python, Python reticulatus\nn01744270\tIndian python, Python molurus\nn01744401\trock python, rock snake, Python sebae\nn01744555\tamethystine python\nn01745125\telapid, elapid snake\nn01745484\tcoral snake, harlequin-snake, New World coral snake\nn01745902\teastern coral snake, Micrurus fulvius\nn01746191\twestern coral snake, Micruroides euryxanthus\nn01746359\tcoral snake, Old World coral snake\nn01746952\tAfrican coral 
snake, Aspidelaps lubricus\nn01747285\tAustralian coral snake, Rhynchoelaps australis\nn01747589\tcopperhead, Denisonia superba\nn01747885\tcobra\nn01748264\tIndian cobra, Naja naja\nn01748389\tasp, Egyptian cobra, Naja haje\nn01748686\tblack-necked cobra, spitting cobra, Naja nigricollis\nn01748906\thamadryad, king cobra, Ophiophagus hannah, Naja hannah\nn01749244\tringhals, rinkhals, spitting snake, Hemachatus haemachatus\nn01749582\tmamba\nn01749742\tblack mamba, Dendroaspis augusticeps\nn01749939\tgreen mamba\nn01750167\tdeath adder, Acanthophis antarcticus\nn01750437\ttiger snake, Notechis scutatus\nn01750743\tAustralian blacksnake, Pseudechis porphyriacus\nn01751036\tkrait\nn01751215\tbanded krait, banded adder, Bungarus fasciatus\nn01751472\ttaipan, Oxyuranus scutellatus\nn01751748\tsea snake\nn01752165\tviper\nn01752585\tadder, common viper, Vipera berus\nn01752736\tasp, asp viper, Vipera aspis\nn01753032\tpuff adder, Bitis arietans\nn01753180\tgaboon viper, Bitis gabonica\nn01753488\thorned viper, cerastes, sand viper, horned asp, Cerastes cornutus\nn01753959\tpit viper\nn01754370\tcopperhead, Agkistrodon contortrix\nn01754533\twater moccasin, cottonmouth, cottonmouth moccasin, Agkistrodon piscivorus\nn01754876\trattlesnake, rattler\nn01755581\tdiamondback, diamondback rattlesnake, Crotalus adamanteus\nn01755740\ttimber rattlesnake, banded rattlesnake, Crotalus horridus horridus\nn01755952\tcanebrake rattlesnake, canebrake rattler, Crotalus horridus atricaudatus\nn01756089\tprairie rattlesnake, prairie rattler, Western rattlesnake, Crotalus viridis\nn01756291\tsidewinder, horned rattlesnake, Crotalus cerastes\nn01756508\tWestern diamondback, Western diamondback rattlesnake, Crotalus atrox\nn01756733\trock rattlesnake, Crotalus lepidus\nn01756916\ttiger rattlesnake, Crotalus tigris\nn01757115\tMojave rattlesnake, Crotalus scutulatus\nn01757343\tspeckled rattlesnake, Crotalus mitchellii\nn01757677\tmassasauga, massasauga rattler, Sistrurus 
catenatus\nn01757901\tground rattler, massasauga, Sistrurus miliaris\nn01758141\tfer-de-lance, Bothrops atrops\nn01758757\tcarcase, carcass\nn01758895\tcarrion\nn01767661\tarthropod\nn01768244\ttrilobite\nn01769347\tarachnid, arachnoid\nn01770081\tharvestman, daddy longlegs, Phalangium opilio\nn01770393\tscorpion\nn01770795\tfalse scorpion, pseudoscorpion\nn01771100\tbook scorpion, Chelifer cancroides\nn01771417\twhip-scorpion, whip scorpion\nn01771766\tvinegarroon, Mastigoproctus giganteus\nn01772222\tspider\nn01772664\torb-weaving spider\nn01773157\tblack and gold garden spider, Argiope aurantia\nn01773549\tbarn spider, Araneus cavaticus\nn01773797\tgarden spider, Aranea diademata\nn01774097\tcomb-footed spider, theridiid\nn01774384\tblack widow, Latrodectus mactans\nn01774750\ttarantula\nn01775062\twolf spider, hunting spider\nn01775370\tEuropean wolf spider, tarantula, Lycosa tarentula\nn01775730\ttrap-door spider\nn01776192\tacarine\nn01776313\ttick\nn01776705\thard tick, ixodid\nn01777304\tIxodes dammini, deer tick\nn01777467\tIxodes neotomae\nn01777649\tIxodes pacificus, western black-legged tick\nn01777909\tIxodes scapularis, black-legged tick\nn01778217\tsheep-tick, sheep tick, Ixodes ricinus\nn01778487\tIxodes persulcatus\nn01778621\tIxodes dentatus\nn01778801\tIxodes spinipalpis\nn01779148\twood tick, American dog tick, Dermacentor variabilis\nn01779463\tsoft tick, argasid\nn01779629\tmite\nn01779939\tweb-spinning mite\nn01780142\tacarid\nn01780426\ttrombidiid\nn01780696\ttrombiculid\nn01781071\tharvest mite, chigger, jigger, redbug\nn01781570\tacarus, genus Acarus\nn01781698\titch mite, sarcoptid\nn01781875\trust mite\nn01782209\tspider mite, tetranychid\nn01782516\tred spider, red spider mite, Panonychus ulmi\nn01783017\tmyriapod\nn01783706\tgarden centipede, garden symphilid, symphilid, Scutigerella immaculata\nn01784293\ttardigrade\nn01784675\tcentipede\nn01785667\thouse centipede, Scutigera coleoptrata\nn01786646\tmillipede, millepede, 
milliped\nn01787006\tsea spider, pycnogonid\nn01787191\tMerostomata, class Merostomata\nn01787835\thorseshoe crab, king crab, Limulus polyphemus, Xiphosurus polyphemus\nn01788291\tAsian horseshoe crab\nn01788579\teurypterid\nn01788864\ttongue worm, pentastomid\nn01789386\tgallinaceous bird, gallinacean\nn01789740\tdomestic fowl, fowl, poultry\nn01790171\tDorking\nn01790304\tPlymouth Rock\nn01790398\tCornish, Cornish fowl\nn01790557\tRock Cornish\nn01790711\tgame fowl\nn01790812\tcochin, cochin china\nn01791107\tjungle fowl, gallina\nn01791314\tjungle cock\nn01791388\tjungle hen\nn01791463\tred jungle fowl, Gallus gallus\nn01791625\tchicken, Gallus gallus\nn01791954\tbantam\nn01792042\tchick, biddy\nn01792158\tcock, rooster\nn01792429\tcockerel\nn01792530\tcapon\nn01792640\then, biddy\nn01792808\tcackler\nn01792955\tbrood hen, broody, broody hen, setting hen, sitter\nn01793085\tmother hen\nn01793159\tlayer\nn01793249\tpullet\nn01793340\tspring chicken\nn01793435\tRhode Island red\nn01793565\tDominique, Dominick\nn01793715\tOrpington\nn01794158\tturkey, Meleagris gallopavo\nn01794344\tturkey cock, gobbler, tom, tom turkey\nn01794651\tocellated turkey, Agriocharis ocellata\nn01795088\tgrouse\nn01795545\tblack grouse\nn01795735\tEuropean black grouse, heathfowl, Lyrurus tetrix\nn01795900\tAsian black grouse, Lyrurus mlokosiewiczi\nn01796019\tblackcock, black cock\nn01796105\tgreyhen, grayhen, grey hen, gray hen, heath hen\nn01796340\tptarmigan\nn01796519\tred grouse, moorfowl, moorbird, moor-bird, moorgame, Lagopus scoticus\nn01796729\tmoorhen\nn01797020\tcapercaillie, capercailzie, horse of the wood, Tetrao urogallus\nn01797307\tspruce grouse, Canachites canadensis\nn01797601\tsage grouse, sage hen, Centrocercus urophasianus\nn01797886\truffed grouse, partridge, Bonasa umbellus\nn01798168\tsharp-tailed grouse, sprigtail, sprig tail, Pedioecetes phasianellus\nn01798484\tprairie chicken, prairie grouse, prairie fowl\nn01798706\tgreater prairie chicken, Tympanuchus 
cupido\nn01798839\tlesser prairie chicken, Tympanuchus pallidicinctus\nn01798979\theath hen, Tympanuchus cupido cupido\nn01799302\tguan\nn01799679\tcurassow\nn01800195\tpiping guan\nn01800424\tchachalaca\nn01800633\tTexas chachalaca, Ortilis vetula macalli\nn01801088\tmegapode, mound bird, mound-bird, mound builder, scrub fowl\nn01801479\tmallee fowl, leipoa, lowan, Leipoa ocellata\nn01801672\tmallee hen\nn01801876\tbrush turkey, Alectura lathami\nn01802159\tmaleo, Macrocephalon maleo\nn01802721\tphasianid\nn01803078\tpheasant\nn01803362\tring-necked pheasant, Phasianus colchicus\nn01803641\tafropavo, Congo peafowl, Afropavo congensis\nn01803893\targus, argus pheasant\nn01804163\tgolden pheasant, Chrysolophus pictus\nn01804478\tbobwhite, bobwhite quail, partridge\nn01804653\tnorthern bobwhite, Colinus virginianus\nn01804921\tOld World quail\nn01805070\tmigratory quail, Coturnix coturnix, Coturnix communis\nn01805321\tmonal, monaul\nn01805801\tpeafowl, bird of Juno\nn01806061\tpeachick, pea-chick\nn01806143\tpeacock\nn01806297\tpeahen\nn01806364\tblue peafowl, Pavo cristatus\nn01806467\tgreen peafowl, Pavo muticus\nn01806567\tquail\nn01806847\tCalifornia quail, Lofortyx californicus\nn01807105\ttragopan\nn01807496\tpartridge\nn01807828\tHungarian partridge, grey partridge, gray partridge, Perdix perdix\nn01808140\tred-legged partridge, Alectoris ruffa\nn01808291\tGreek partridge, rock partridge, Alectoris graeca\nn01808596\tmountain quail, mountain partridge, Oreortyx picta palmeri\nn01809106\tguinea fowl, guinea, Numida meleagris\nn01809371\tguinea hen\nn01809752\thoatzin, hoactzin, stinkbird, Opisthocomus hoazin\nn01810268\ttinamou, partridge\nn01810700\tcolumbiform bird\nn01811243\tdodo, Raphus cucullatus\nn01811909\tpigeon\nn01812187\tpouter pigeon, pouter\nn01812337\tdove\nn01812662\trock dove, rock pigeon, Columba livia\nn01812866\tband-tailed pigeon, band-tail pigeon, bandtail, Columba fasciata\nn01813088\twood pigeon, ringdove, cushat, Columba 
palumbus\nn01813385\tturtledove\nn01813532\tStreptopelia turtur\nn01813658\tringdove, Streptopelia risoria\nn01813948\tAustralian turtledove, turtledove, Stictopelia cuneata\nn01814217\tmourning dove, Zenaidura macroura\nn01814370\tdomestic pigeon\nn01814549\tsquab\nn01814620\tfairy swallow\nn01814755\troller, tumbler, tumbler pigeon\nn01814921\thoming pigeon, homer\nn01815036\tcarrier pigeon\nn01815270\tpassenger pigeon, Ectopistes migratorius\nn01815601\tsandgrouse, sand grouse\nn01816017\tpainted sandgrouse, Pterocles indicus\nn01816140\tpin-tailed sandgrouse, pin-tailed grouse, Pterocles alchata\nn01816474\tpallas's sandgrouse, Syrrhaptes paradoxus\nn01816887\tparrot\nn01817263\tpopinjay\nn01817346\tpoll, poll parrot\nn01817953\tAfrican grey, African gray, Psittacus erithacus\nn01818299\tamazon\nn01818515\tmacaw\nn01818832\tkea, Nestor notabilis\nn01819115\tcockatoo\nn01819313\tsulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita\nn01819465\tpink cockatoo, Kakatoe leadbeateri\nn01819734\tcockateel, cockatiel, cockatoo parrot, Nymphicus hollandicus\nn01820052\tlovebird\nn01820348\tlory\nn01820546\tlorikeet\nn01820801\tvaried Lorikeet, Glossopsitta versicolor\nn01821076\trainbow lorikeet, Trichoglossus moluccanus\nn01821203\tparakeet, parrakeet, parroket, paraquet, paroquet, parroquet\nn01821554\tCarolina parakeet, Conuropsis carolinensis\nn01821869\tbudgerigar, budgereegah, budgerygah, budgie, grass parakeet, lovebird, shell parakeet, Melopsittacus undulatus\nn01822300\tring-necked parakeet, Psittacula krameri\nn01822602\tcuculiform bird\nn01823013\tcuckoo\nn01823414\tEuropean cuckoo, Cuculus canorus\nn01823740\tblack-billed cuckoo, Coccyzus erythropthalmus\nn01824035\troadrunner, chaparral cock, Geococcyx californianus\nn01824344\tani\nn01824575\tcoucal\nn01824749\tcrow pheasant, Centropus sinensis\nn01825278\ttouraco, turaco, turacou, turakoo\nn01825930\tcoraciiform bird\nn01826364\troller\nn01826680\tEuropean roller, Coracias 
garrulus\nn01826844\tground roller\nn01827403\tkingfisher\nn01827793\tEurasian kingfisher, Alcedo atthis\nn01828096\tbelted kingfisher, Ceryle alcyon\nn01828556\tkookaburra, laughing jackass, Dacelo gigas\nn01828970\tbee eater\nn01829413\thornbill\nn01829869\thoopoe, hoopoo\nn01830042\tEuopean hoopoe, Upupa epops\nn01830479\twood hoopoe\nn01830915\tmotmot, momot\nn01831360\ttody\nn01831712\tapodiform bird\nn01832167\tswift\nn01832493\tEuropean swift, Apus apus\nn01832813\tchimney swift, chimney swallow, Chateura pelagica\nn01833112\tswiftlet, Collocalia inexpectata\nn01833415\ttree swift, crested swift\nn01833805\thummingbird\nn01834177\tArchilochus colubris\nn01834540\tthornbill\nn01835276\tgoatsucker, nightjar, caprimulgid\nn01835769\tEuropean goatsucker, European nightjar, Caprimulgus europaeus\nn01835918\tchuck-will's-widow, Caprimulgus carolinensis\nn01836087\twhippoorwill, Caprimulgus vociferus\nn01836673\tpoorwill, Phalaenoptilus nuttallii\nn01837072\tfrogmouth\nn01837526\toilbird, guacharo, Steatornis caripensis\nn01838038\tpiciform bird\nn01838598\twoodpecker, peckerwood, pecker\nn01839086\tgreen woodpecker, Picus viridis\nn01839330\tdowny woodpecker\nn01839598\tflicker\nn01839750\tyellow-shafted flicker, Colaptes auratus, yellowhammer\nn01839949\tgilded flicker, Colaptes chrysoides\nn01840120\tred-shafted flicker, Colaptes caper collaris\nn01840412\tivorybill, ivory-billed woodpecker, Campephilus principalis\nn01840775\tredheaded woodpecker, redhead, Melanerpes erythrocephalus\nn01841102\tsapsucker\nn01841288\tyellow-bellied sapsucker, Sphyrapicus varius\nn01841441\tred-breasted sapsucker, Sphyrapicus varius ruber\nn01841679\twryneck\nn01841943\tpiculet\nn01842235\tbarbet\nn01842504\tpuffbird\nn01842788\thoney guide\nn01843065\tjacamar\nn01843383\ttoucan\nn01843719\ttoucanet\nn01844231\ttrogon\nn01844551\tquetzal, quetzal bird\nn01844746\tresplendent quetzel, resplendent trogon, Pharomacrus mocino\nn01844917\taquatic bird\nn01845132\twaterfowl, water 
bird, waterbird\nn01845477\tanseriform bird\nn01846331\tduck\nn01847000\tdrake\nn01847089\tquack-quack\nn01847170\tduckling\nn01847253\tdiving duck\nn01847407\tdabbling duck, dabbler\nn01847806\tmallard, Anas platyrhynchos\nn01847978\tblack duck, Anas rubripes\nn01848123\tteal\nn01848323\tgreenwing, green-winged teal, Anas crecca\nn01848453\tbluewing, blue-winged teal, Anas discors\nn01848555\tgarganey, Anas querquedula\nn01848648\twidgeon, wigeon, Anas penelope\nn01848840\tAmerican widgeon, baldpate, Anas americana\nn01848976\tshoveler, shoveller, broadbill, Anas clypeata\nn01849157\tpintail, pin-tailed duck, Anas acuta\nn01849466\tsheldrake\nn01849676\tshelduck\nn01849863\truddy duck, Oxyura jamaicensis\nn01850192\tbufflehead, butterball, dipper, Bucephela albeola\nn01850373\tgoldeneye, whistler, Bucephela clangula\nn01850553\tBarrow's goldeneye, Bucephala islandica\nn01850873\tcanvasback, canvasback duck, Aythya valisineria\nn01851038\tpochard, Aythya ferina\nn01851207\tredhead, Aythya americana\nn01851375\tscaup, scaup duck, bluebill, broadbill\nn01851573\tgreater scaup, Aythya marila\nn01851731\tlesser scaup, lesser scaup duck, lake duck, Aythya affinis\nn01851895\twild duck\nn01852142\twood duck, summer duck, wood widgeon, Aix sponsa\nn01852329\twood drake\nn01852400\tmandarin duck, Aix galericulata\nn01852671\tmuscovy duck, musk duck, Cairina moschata\nn01852861\tsea duck\nn01853195\teider, eider duck\nn01853498\tscoter, scooter\nn01853666\tcommon scoter, Melanitta nigra\nn01853870\told squaw, oldwife, Clangula hyemalis\nn01854415\tmerganser, fish duck, sawbill, sheldrake\nn01854700\tgoosander, Mergus merganser\nn01854838\tAmerican merganser, Mergus merganser americanus\nn01855032\tred-breasted merganser, Mergus serrator\nn01855188\tsmew, Mergus albellus\nn01855476\thooded merganser, hooded sheldrake, Lophodytes cucullatus\nn01855672\tgoose\nn01856072\tgosling\nn01856155\tgander\nn01856380\tChinese goose, Anser cygnoides\nn01856553\tgreylag, graylag, greylag 
goose, graylag goose, Anser anser\nn01856890\tblue goose, Chen caerulescens\nn01857079\tsnow goose\nn01857325\tbrant, brant goose, brent, brent goose\nn01857512\tcommon brant goose, Branta bernicla\nn01857632\thonker, Canada goose, Canadian goose, Branta canadensis\nn01857851\tbarnacle goose, barnacle, Branta leucopsis\nn01858281\tcoscoroba\nn01858441\tswan\nn01858780\tcob\nn01858845\tpen\nn01858906\tcygnet\nn01859190\tmute swan, Cygnus olor\nn01859325\twhooper, whooper swan, Cygnus cygnus\nn01859496\ttundra swan, Cygnus columbianus\nn01859689\twhistling swan, Cygnus columbianus columbianus\nn01859852\tBewick's swan, Cygnus columbianus bewickii\nn01860002\ttrumpeter, trumpeter swan, Cygnus buccinator\nn01860187\tblack swan, Cygnus atratus\nn01860497\tscreamer\nn01860864\thorned screamer, Anhima cornuta\nn01861148\tcrested screamer\nn01861330\tchaja, Chauna torquata\nn01861778\tmammal, mammalian\nn01862399\tfemale mammal\nn01871265\ttusker\nn01871543\tprototherian\nn01871875\tmonotreme, egg-laying mammal\nn01872401\techidna, spiny anteater, anteater\nn01872772\techidna, spiny anteater, anteater\nn01873310\tplatypus, duckbill, duckbilled platypus, duck-billed platypus, Ornithorhynchus anatinus\nn01874434\tmarsupial, pouched mammal\nn01874928\topossum, possum\nn01875313\tcommon opossum, Didelphis virginiana, Didelphis marsupialis\nn01875610\tcrab-eating opossum\nn01876034\topossum rat\nn01876326\tbandicoot\nn01876667\trabbit-eared bandicoot, rabbit bandicoot, bilby, Macrotis lagotis\nn01877134\tkangaroo\nn01877606\tgiant kangaroo, great grey kangaroo, Macropus giganteus\nn01877812\twallaby, brush kangaroo\nn01878061\tcommon wallaby, Macropus agiles\nn01878335\thare wallaby, kangaroo hare\nn01878639\tnail-tailed wallaby, nail-tailed kangaroo\nn01878929\trock wallaby, rock kangaroo\nn01879217\tpademelon, paddymelon\nn01879509\ttree wallaby, tree kangaroo\nn01879837\tmusk kangaroo, Hypsiprymnodon moschatus\nn01880152\trat kangaroo, kangaroo 
rat\nn01880473\tpotoroo\nn01880716\tbettong\nn01880813\tjerboa kangaroo, kangaroo jerboa\nn01881171\tphalanger, opossum, possum\nn01881564\tcuscus\nn01881857\tbrush-tailed phalanger, Trichosurus vulpecula\nn01882125\tflying phalanger, flying opossum, flying squirrel\nn01882714\tkoala, koala bear, kangaroo bear, native bear, Phascolarctos cinereus\nn01883070\twombat\nn01883513\tdasyurid marsupial, dasyurid\nn01883920\tdasyure\nn01884104\teastern dasyure, Dasyurus quoll\nn01884203\tnative cat, Dasyurus viverrinus\nn01884476\tthylacine, Tasmanian wolf, Tasmanian tiger, Thylacinus cynocephalus\nn01884834\tTasmanian devil, ursine dasyure, Sarcophilus hariisi\nn01885158\tpouched mouse, marsupial mouse, marsupial rat\nn01885498\tnumbat, banded anteater, anteater, Myrmecobius fasciatus\nn01886045\tpouched mole, marsupial mole, Notoryctus typhlops\nn01886756\tplacental, placental mammal, eutherian, eutherian mammal\nn01887474\tlivestock, stock, farm animal\nn01887623\tbull\nn01887787\tcow\nn01887896\tcalf\nn01888045\tcalf\nn01888181\tyearling\nn01888264\tbuck\nn01888411\tdoe\nn01889074\tinsectivore\nn01889520\tmole\nn01889849\tstarnose mole, star-nosed mole, Condylura cristata\nn01890144\tbrewer's mole, hair-tailed mole, Parascalops breweri\nn01890564\tgolden mole\nn01890860\tshrew mole\nn01891013\tAsiatic shrew mole, Uropsilus soricipes\nn01891274\tAmerican shrew mole, Neurotrichus gibbsii\nn01891633\tshrew, shrewmouse\nn01892030\tcommon shrew, Sorex araneus\nn01892145\tmasked shrew, Sorex cinereus\nn01892385\tshort-tailed shrew, Blarina brevicauda\nn01892551\twater shrew\nn01892744\tAmerican water shrew, Sorex palustris\nn01893021\tEuropean water shrew, Neomys fodiens\nn01893164\tMediterranean water shrew, Neomys anomalus\nn01893399\tleast shrew, Cryptotis parva\nn01893825\thedgehog, Erinaceus europaeus, Erinaceus europeaeus\nn01894207\ttenrec, tendrac\nn01894522\ttailless tenrec, Tenrec ecaudatus\nn01894956\totter shrew, potamogale, Potamogale 
velox\nn01896844\teiderdown\nn01897257\taftershaft\nn01897426\tsickle feather\nn01897536\tcontour feather\nn01897667\tbastard wing, alula, spurious wing\nn01898593\tsaddle hackle, saddle feather\nn01899894\tencolure\nn01900150\thair\nn01903234\tsquama\nn01903346\tscute\nn01903498\tsclerite\nn01904029\tplastron\nn01904806\tscallop shell\nn01904886\toyster shell\nn01905321\ttheca\nn01905661\tinvertebrate\nn01906749\tsponge, poriferan, parazoan\nn01907287\tchoanocyte, collar cell\nn01907738\tglass sponge\nn01908042\tVenus's flower basket\nn01908958\tmetazoan\nn01909422\tcoelenterate, cnidarian\nn01909788\tplanula\nn01909906\tpolyp\nn01910252\tmedusa, medusoid, medusan\nn01910747\tjellyfish\nn01911063\tscyphozoan\nn01911403\tChrysaora quinquecirrha\nn01911839\thydrozoan, hydroid\nn01912152\thydra\nn01912454\tsiphonophore\nn01912809\tnanomia\nn01913166\tPortuguese man-of-war, man-of-war, jellyfish\nn01913346\tpraya\nn01913440\tapolemia\nn01914163\tanthozoan, actinozoan\nn01914609\tsea anemone, anemone\nn01914830\tactinia, actinian, actiniarian\nn01915700\tsea pen\nn01915811\tcoral\nn01916187\tgorgonian, gorgonian coral\nn01916388\tsea feather\nn01916481\tsea fan\nn01916588\tred coral\nn01916925\tstony coral, madrepore, madriporian coral\nn01917289\tbrain coral\nn01917611\tstaghorn coral, stag's-horn coral\nn01917882\tmushroom coral\nn01918744\tctenophore, comb jelly\nn01919385\tberoe\nn01920051\tplatyctenean\nn01920438\tsea gooseberry\nn01921059\tVenus's girdle, Cestum veneris\nn01922303\tworm\nn01922717\thelminth, parasitic worm\nn01922948\twoodworm\nn01923025\twoodborer, borer\nn01923404\tacanthocephalan, spiny-headed worm\nn01923890\tarrowworm, chaetognath\nn01924800\tbladder worm\nn01924916\tflatworm, platyhelminth\nn01925270\tplanarian, planaria\nn01925695\tfluke, trematode, trematode worm\nn01925916\tcercaria\nn01926379\tliver fluke, Fasciola hepatica\nn01926689\tFasciolopsis buski\nn01927159\tschistosome, blood fluke\nn01927456\ttapeworm, 
cestode\nn01927928\techinococcus\nn01928215\ttaenia\nn01928517\tribbon worm, nemertean, nemertine, proboscis worm\nn01928865\tbeard worm, pogonophoran\nn01929186\trotifer\nn01930112\tnematode, nematode worm, roundworm\nn01930852\tcommon roundworm, Ascaris lumbricoides\nn01931140\tchicken roundworm, Ascaridia galli\nn01931520\tpinworm, threadworm, Enterobius vermicularis\nn01931714\teelworm\nn01932151\tvinegar eel, vinegar worm, Anguillula aceti, Turbatrix aceti\nn01932936\ttrichina, Trichinella spiralis\nn01933151\thookworm\nn01933478\tfilaria\nn01933988\tGuinea worm, Dracunculus medinensis\nn01934440\tannelid, annelid worm, segmented worm\nn01934844\tarchiannelid\nn01935176\toligochaete, oligochaete worm\nn01935395\tearthworm, angleworm, fishworm, fishing worm, wiggler, nightwalker, nightcrawler, crawler, dew worm, red worm\nn01936391\tpolychaete, polychete, polychaete worm, polychete worm\nn01936671\tlugworm, lug, lobworm\nn01936858\tsea mouse\nn01937579\tbloodworm\nn01937909\tleech, bloodsucker, hirudinean\nn01938454\tmedicinal leech, Hirudo medicinalis\nn01938735\thorseleech\nn01940736\tmollusk, mollusc, shellfish\nn01941223\tscaphopod\nn01941340\ttooth shell, tusk shell\nn01942177\tgastropod, univalve\nn01942869\tabalone, ear-shell\nn01943087\tormer, sea-ear, Haliotis tuberculata\nn01943541\tscorpion shell\nn01943899\tconch\nn01944118\tgiant conch, Strombus gigas\nn01944390\tsnail\nn01944812\tedible snail, Helix pomatia\nn01944955\tgarden snail\nn01945143\tbrown snail, Helix aspersa\nn01945340\tHelix hortensis\nn01945685\tslug\nn01945845\tseasnail\nn01946277\tneritid, neritid gastropod\nn01946630\tnerita\nn01946827\tbleeding tooth, Nerita peloronta\nn01947139\tneritina\nn01947396\twhelk\nn01947997\tmoon shell, moonshell\nn01948446\tperiwinkle, winkle\nn01948573\tlimpet\nn01949085\tcommon limpet, Patella vulgata\nn01949499\tkeyhole limpet, Fissurella apertura, Diodora apertura\nn01949973\triver limpet, freshwater limpet, Ancylus fluviatilis\nn01950731\tsea 
slug, nudibranch\nn01951274\tsea hare, Aplysia punctata\nn01951613\tHermissenda crassicornis\nn01952029\tbubble shell\nn01952712\tphysa\nn01953361\tcowrie, cowry\nn01953594\tmoney cowrie, Cypraea moneta\nn01953762\ttiger cowrie, Cypraea tigris\nn01954516\tsolenogaster, aplacophoran\nn01955084\tchiton, coat-of-mail shell, sea cradle, polyplacophore\nn01955933\tbivalve, pelecypod, lamellibranch\nn01956344\tspat\nn01956481\tclam\nn01956764\tseashell\nn01957335\tsoft-shell clam, steamer, steamer clam, long-neck clam, Mya arenaria\nn01958038\tquahog, quahaug, hard-shell clam, hard clam, round clam, Venus mercenaria, Mercenaria mercenaria\nn01958346\tlittleneck, littleneck clam\nn01958435\tcherrystone, cherrystone clam\nn01958531\tgeoduck\nn01959029\trazor clam, jackknife clam, knife-handle\nn01959492\tgiant clam, Tridacna gigas\nn01959985\tcockle\nn01960177\tedible cockle, Cardium edule\nn01960459\toyster\nn01961234\tJapanese oyster, Ostrea gigas\nn01961600\tVirginia oyster\nn01961985\tpearl oyster, Pinctada margaritifera\nn01962506\tsaddle oyster, Anomia ephippium\nn01962788\twindow oyster, windowpane oyster, capiz, Placuna placenta\nn01963317\tark shell\nn01963479\tblood clam\nn01963571\tmussel\nn01964049\tmarine mussel, mytilid\nn01964271\tedible mussel, Mytilus edulis\nn01964441\tfreshwater mussel, freshwater clam\nn01964957\tpearly-shelled mussel\nn01965252\tthin-shelled mussel\nn01965529\tzebra mussel, Dreissena polymorpha\nn01965889\tscallop, scollop, escallop\nn01966377\tbay scallop, Pecten irradians\nn01966586\tsea scallop, giant scallop, Pecten magellanicus\nn01967094\tshipworm, teredinid\nn01967308\tteredo\nn01967963\tpiddock\nn01968315\tcephalopod, cephalopod mollusk\nn01968897\tchambered nautilus, pearly nautilus, nautilus\nn01969726\toctopod\nn01970164\toctopus, devilfish\nn01970667\tpaper nautilus, nautilus, Argonaut, Argonauta argo\nn01971094\tdecapod\nn01971280\tsquid\nn01971620\tloligo\nn01971850\tommastrephes\nn01972131\tarchiteuthis, giant 
squid\nn01972541\tcuttlefish, cuttle\nn01973148\tspirula, Spirula peronii\nn01974773\tcrustacean\nn01975687\tmalacostracan crustacean\nn01976146\tdecapod crustacean, decapod\nn01976868\tbrachyuran\nn01976957\tcrab\nn01977485\tstone crab, Menippe mercenaria\nn01978010\thard-shell crab\nn01978136\tsoft-shell crab, soft-shelled crab\nn01978287\tDungeness crab, Cancer magister\nn01978455\trock crab, Cancer irroratus\nn01978587\tJonah crab, Cancer borealis\nn01978930\tswimming crab\nn01979269\tEnglish lady crab, Portunus puber\nn01979526\tAmerican lady crab, lady crab, calico crab, Ovalipes ocellatus\nn01979874\tblue crab, Callinectes sapidus\nn01980166\tfiddler crab\nn01980655\tpea crab\nn01981276\tking crab, Alaska crab, Alaskan king crab, Alaska king crab, Paralithodes camtschatica\nn01981702\tspider crab\nn01982068\tEuropean spider crab, king crab, Maja squinado\nn01982347\tgiant crab, Macrocheira kaempferi\nn01982650\tlobster\nn01983048\ttrue lobster\nn01983481\tAmerican lobster, Northern lobster, Maine lobster, Homarus americanus\nn01983674\tEuropean lobster, Homarus vulgaris\nn01983829\tCape lobster, Homarus capensis\nn01984245\tNorway lobster, Nephrops norvegicus\nn01984695\tspiny lobster, langouste, rock lobster, crawfish, crayfish, sea crawfish\nn01985128\tcrayfish, crawfish, crawdad, crawdaddy\nn01985493\tOld World crayfish, ecrevisse\nn01985797\tAmerican crayfish\nn01986214\thermit crab\nn01986806\tshrimp\nn01987076\tsnapping shrimp, pistol shrimp\nn01987545\tprawn\nn01987727\tlong-clawed prawn, river prawn, Palaemon australis\nn01988203\ttropical prawn\nn01988701\tkrill\nn01988869\tEuphausia pacifica\nn01989516\topossum shrimp\nn01989869\tstomatopod, stomatopod crustacean\nn01990007\tmantis shrimp, mantis crab\nn01990516\tsquilla, mantis prawn\nn01990800\tisopod\nn01991028\twoodlouse, slater\nn01991520\tpill bug\nn01992262\tsow bug\nn01992423\tsea louse, sea slater\nn01992773\tamphipod\nn01993525\tskeleton shrimp\nn01993830\twhale louse\nn01994910\tdaphnia, 
water flea\nn01995514\tfairy shrimp\nn01995686\tbrine shrimp, Artemia salina\nn01996280\ttadpole shrimp\nn01996585\tcopepod, copepod crustacean\nn01997119\tcyclops, water flea\nn01997825\tseed shrimp, mussel shrimp, ostracod\nn01998183\tbarnacle, cirriped, cirripede\nn01998741\tacorn barnacle, rock barnacle, Balanus balanoides\nn01999186\tgoose barnacle, gooseneck barnacle, Lepas fascicularis\nn01999767\tonychophoran, velvet worm, peripatus\nn02000954\twading bird, wader\nn02002075\tstork\nn02002556\twhite stork, Ciconia ciconia\nn02002724\tblack stork, Ciconia nigra\nn02003037\tadjutant bird, adjutant, adjutant stork, Leptoptilus dubius\nn02003204\tmarabou, marabout, marabou stork, Leptoptilus crumeniferus\nn02003577\topenbill\nn02003839\tjabiru, Jabiru mycteria\nn02004131\tsaddlebill, jabiru, Ephippiorhynchus senegalensis\nn02004492\tpoliceman bird, black-necked stork, jabiru, Xenorhyncus asiaticus\nn02004855\twood ibis, wood stork, flinthead, Mycteria americana\nn02005399\tshoebill, shoebird, Balaeniceps rex\nn02005790\tibis\nn02006063\twood ibis, wood stork, Ibis ibis\nn02006364\tsacred ibis, Threskiornis aethiopica\nn02006656\tspoonbill\nn02006985\tcommon spoonbill, Platalea leucorodia\nn02007284\troseate spoonbill, Ajaia ajaja\nn02007558\tflamingo\nn02008041\theron\nn02008497\tgreat blue heron, Ardea herodius\nn02008643\tgreat white heron, Ardea occidentalis\nn02008796\tegret\nn02009229\tlittle blue heron, Egretta caerulea\nn02009380\tsnowy egret, snowy heron, Egretta thula\nn02009508\tlittle egret, Egretta garzetta\nn02009750\tgreat white heron, Casmerodius albus\nn02009912\tAmerican egret, great white heron, Egretta albus\nn02010272\tcattle egret, Bubulcus ibis\nn02010453\tnight heron, night raven\nn02010728\tblack-crowned night heron, Nycticorax nycticorax\nn02011016\tyellow-crowned night heron, Nyctanassa violacea\nn02011281\tboatbill, boat-billed heron, broadbill, Cochlearius cochlearius\nn02011460\tbittern\nn02011805\tAmerican bittern, stake driver, 
Botaurus lentiginosus\nn02011943\tEuropean bittern, Botaurus stellaris\nn02012185\tleast bittern, Ixobrychus exilis\nn02012849\tcrane\nn02013177\twhooping crane, whooper, Grus americana\nn02013567\tcourlan, Aramus guarauna\nn02013706\tlimpkin, Aramus pictus\nn02014237\tcrested cariama, seriema, Cariama cristata\nn02014524\tchunga, seriema, Chunga burmeisteri\nn02014941\trail\nn02015357\tweka, maori hen, wood hen\nn02015554\tcrake\nn02015797\tcorncrake, land rail, Crex crex\nn02016066\tspotted crake, Porzana porzana\nn02016358\tgallinule, marsh hen, water hen, swamphen\nn02016659\tFlorida gallinule, Gallinula chloropus cachinnans\nn02016816\tmoorhen, Gallinula chloropus\nn02016956\tpurple gallinule\nn02017213\tEuropean gallinule, Porphyrio porphyrio\nn02017475\tAmerican gallinule, Porphyrula martinica\nn02017725\tnotornis, takahe, Notornis mantelli\nn02018027\tcoot\nn02018207\tAmerican coot, marsh hen, mud hen, water hen, Fulica americana\nn02018368\tOld World coot, Fulica atra\nn02018795\tbustard\nn02019190\tgreat bustard, Otis tarda\nn02019438\tplain turkey, Choriotis australis\nn02019929\tbutton quail, button-quail, bustard quail, hemipode\nn02020219\tstriped button quail, Turnix sylvatica\nn02020578\tplain wanderer, Pedionomus torquatus\nn02021050\ttrumpeter\nn02021281\tBrazilian trumpeter, Psophia crepitans\nn02021795\tseabird, sea bird, seafowl\nn02022684\tshorebird, shore bird, limicoline bird\nn02023341\tplover\nn02023855\tpiping plover, Charadrius melodus\nn02023992\tkilldeer, kildeer, killdeer plover, Charadrius vociferus\nn02024185\tdotterel, dotrel, Charadrius morinellus, Eudromias morinellus\nn02024479\tgolden plover\nn02024763\tlapwing, green plover, peewit, pewit\nn02025043\tturnstone\nn02025239\truddy turnstone, Arenaria interpres\nn02025389\tblack turnstone, Arenaria-Melanocephala\nn02026059\tsandpiper\nn02026629\tsurfbird, Aphriza virgata\nn02026948\tEuropean sandpiper, Actitis hypoleucos\nn02027075\tspotted sandpiper, Actitis 
macularia\nn02027357\tleast sandpiper, stint, Erolia minutilla\nn02027492\tred-backed sandpiper, dunlin, Erolia alpina\nn02027897\tgreenshank, Tringa nebularia\nn02028035\tredshank, Tringa totanus\nn02028175\tyellowlegs\nn02028342\tgreater yellowlegs, Tringa melanoleuca\nn02028451\tlesser yellowlegs, Tringa flavipes\nn02028727\tpectoral sandpiper, jacksnipe, Calidris melanotos\nn02028900\tknot, greyback, grayback, Calidris canutus\nn02029087\tcurlew sandpiper, Calidris Ferruginea\nn02029378\tsanderling, Crocethia alba\nn02029706\tupland sandpiper, upland plover, Bartramian sandpiper, Bartramia longicauda\nn02030035\truff, Philomachus pugnax\nn02030224\treeve\nn02030287\ttattler\nn02030568\tPolynesian tattler, Heteroscelus incanus\nn02030837\twillet, Catoptrophorus semipalmatus\nn02030996\twoodcock\nn02031298\tEurasian woodcock, Scolopax rusticola\nn02031585\tAmerican woodcock, woodcock snipe, Philohela minor\nn02031934\tsnipe\nn02032222\twhole snipe, Gallinago gallinago\nn02032355\tWilson's snipe, Gallinago gallinago delicata\nn02032480\tgreat snipe, woodcock snipe, Gallinago media\nn02032769\tjacksnipe, half snipe, Limnocryptes minima\nn02033041\tdowitcher\nn02033208\tgreyback, grayback, Limnodromus griseus\nn02033324\tred-breasted snipe, Limnodromus scolopaceus\nn02033561\tcurlew\nn02033779\tEuropean curlew, Numenius arquata\nn02033882\tEskimo curlew, Numenius borealis\nn02034129\tgodwit\nn02034295\tHudsonian godwit, Limosa haemastica\nn02034661\tstilt, stiltbird, longlegs, long-legs, stilt plover, Himantopus stilt\nn02034971\tblack-necked stilt, Himantopus mexicanus\nn02035210\tblack-winged stilt, Himantopus himantopus\nn02035402\twhite-headed stilt, Himantopus himantopus leucocephalus\nn02035656\tkaki, Himantopus novae-zelandiae\nn02036053\tstilt, Australian stilt\nn02036228\tbanded stilt, Cladorhyncus leucocephalum\nn02036711\tavocet\nn02037110\toystercatcher, oyster catcher\nn02037464\tphalarope\nn02037869\tred phalarope, Phalaropus 
fulicarius\nn02038141\tnorthern phalarope, Lobipes lobatus\nn02038466\tWilson's phalarope, Steganopus tricolor\nn02038993\tpratincole, glareole\nn02039171\tcourser\nn02039497\tcream-colored courser, Cursorius cursor\nn02039780\tcrocodile bird, Pluvianus aegyptius\nn02040266\tstone curlew, thick-knee, Burhinus oedicnemus\nn02040505\tcoastal diving bird\nn02041085\tlarid\nn02041246\tgull, seagull, sea gull\nn02041678\tmew, mew gull, sea mew, Larus canus\nn02041875\tblack-backed gull, great black-backed gull, cob, Larus marinus\nn02042046\therring gull, Larus argentatus\nn02042180\tlaughing gull, blackcap, pewit, pewit gull, Larus ridibundus\nn02042472\tivory gull, Pagophila eburnea\nn02042759\tkittiwake\nn02043063\ttern\nn02043333\tsea swallow, Sterna hirundo\nn02043808\tskimmer\nn02044178\tjaeger\nn02044517\tparasitic jaeger, arctic skua, Stercorarius parasiticus\nn02044778\tskua, bonxie\nn02044908\tgreat skua, Catharacta skua\nn02045369\tauk\nn02045596\tauklet\nn02045864\trazorbill, razor-billed auk, Alca torda\nn02046171\tlittle auk, dovekie, Plautus alle\nn02046759\tguillemot\nn02046939\tblack guillemot, Cepphus grylle\nn02047045\tpigeon guillemot, Cepphus columba\nn02047260\tmurre\nn02047411\tcommon murre, Uria aalge\nn02047517\tthick-billed murre, Uria lomvia\nn02047614\tpuffin\nn02047975\tAtlantic puffin, Fratercula arctica\nn02048115\thorned puffin, Fratercula corniculata\nn02048353\ttufted puffin, Lunda cirrhata\nn02048698\tgaviiform seabird\nn02049088\tloon, diver\nn02049532\tpodicipitiform seabird\nn02050004\tgrebe\nn02050313\tgreat crested grebe, Podiceps cristatus\nn02050442\tred-necked grebe, Podiceps grisegena\nn02050586\tblack-necked grebe, eared grebe, Podiceps nigricollis\nn02050809\tdabchick, little grebe, Podiceps ruficollis\nn02051059\tpied-billed grebe, Podilymbus podiceps\nn02051474\tpelecaniform seabird\nn02051845\tpelican\nn02052204\twhite pelican, Pelecanus erythrorhynchos\nn02052365\tOld world white pelican, Pelecanus 
onocrotalus\nn02052775\tfrigate bird, man-of-war bird\nn02053083\tgannet\nn02053425\tsolan, solan goose, solant goose, Sula bassana\nn02053584\tbooby\nn02054036\tcormorant, Phalacrocorax carbo\nn02054502\tsnakebird, anhinga, darter\nn02054711\twater turkey, Anhinga anhinga\nn02055107\ttropic bird, tropicbird, boatswain bird\nn02055658\tsphenisciform seabird\nn02055803\tpenguin\nn02056228\tAdelie, Adelie penguin, Pygoscelis adeliae\nn02056570\tking penguin, Aptenodytes patagonica\nn02056728\temperor penguin, Aptenodytes forsteri\nn02057035\tjackass penguin, Spheniscus demersus\nn02057330\trock hopper, crested penguin\nn02057731\tpelagic bird, oceanic bird\nn02057898\tprocellariiform seabird\nn02058221\talbatross, mollymawk\nn02058594\twandering albatross, Diomedea exulans\nn02058747\tblack-footed albatross, gooney, gooney bird, goonie, goony, Diomedea nigripes\nn02059162\tpetrel\nn02059541\twhite-chinned petrel, Procellaria aequinoctialis\nn02059852\tgiant petrel, giant fulmar, Macronectes giganteus\nn02060133\tfulmar, fulmar petrel, Fulmarus glacialis\nn02060411\tshearwater\nn02060569\tManx shearwater, Puffinus puffinus\nn02060889\tstorm petrel\nn02061217\tstormy petrel, northern storm petrel, Hydrobates pelagicus\nn02061560\tMother Carey's chicken, Mother Carey's hen, Oceanites oceanicus\nn02061853\tdiving petrel\nn02062017\taquatic mammal\nn02062430\tcetacean, cetacean mammal, blower\nn02062744\twhale\nn02063224\tbaleen whale, whalebone whale\nn02063662\tright whale\nn02064000\tbowhead, bowhead whale, Greenland whale, Balaena mysticetus\nn02064338\trorqual, razorback\nn02064816\tblue whale, sulfur bottom, Balaenoptera musculus\nn02065026\tfinback, finback whale, fin whale, common rorqual, Balaenoptera physalus\nn02065263\tsei whale, Balaenoptera borealis\nn02065407\tlesser rorqual, piked whale, minke whale, Balaenoptera acutorostrata\nn02065726\thumpback, humpback whale, Megaptera novaeangliae\nn02066245\tgrey whale, gray whale, devilfish, Eschrichtius gibbosus, 
Eschrichtius robustus\nn02066707\ttoothed whale\nn02067240\tsperm whale, cachalot, black whale, Physeter catodon\nn02067603\tpygmy sperm whale, Kogia breviceps\nn02067768\tdwarf sperm whale, Kogia simus\nn02068206\tbeaked whale\nn02068541\tbottle-nosed whale, bottlenose whale, bottlenose, Hyperoodon ampullatus\nn02068974\tdolphin\nn02069412\tcommon dolphin, Delphinus delphis\nn02069701\tbottlenose dolphin, bottle-nosed dolphin, bottlenose\nn02069974\tAtlantic bottlenose dolphin, Tursiops truncatus\nn02070174\tPacific bottlenose dolphin, Tursiops gilli\nn02070430\tporpoise\nn02070624\tharbor porpoise, herring hog, Phocoena phocoena\nn02070776\tvaquita, Phocoena sinus\nn02071028\tgrampus, Grampus griseus\nn02071294\tkiller whale, killer, orca, grampus, sea wolf, Orcinus orca\nn02071636\tpilot whale, black whale, common blackfish, blackfish, Globicephala melaena\nn02072040\triver dolphin\nn02072493\tnarwhal, narwal, narwhale, Monodon monoceros\nn02072798\twhite whale, beluga, Delphinapterus leucas\nn02073250\tsea cow, sirenian mammal, sirenian\nn02073831\tmanatee, Trichechus manatus\nn02074367\tdugong, Dugong dugon\nn02074726\tSteller's sea cow, Hydrodamalis gigas\nn02075296\tcarnivore\nn02075612\tomnivore\nn02075927\tpinniped mammal, pinniped, pinnatiped\nn02076196\tseal\nn02076402\tcrabeater seal, crab-eating seal\nn02076779\teared seal\nn02077152\tfur seal\nn02077384\tguadalupe fur seal, Arctocephalus philippi\nn02077658\tfur seal\nn02077787\tAlaska fur seal, Callorhinus ursinus\nn02077923\tsea lion\nn02078292\tSouth American sea lion, Otaria Byronia\nn02078574\tCalifornia sea lion, Zalophus californianus, Zalophus californicus\nn02078738\tAustralian sea lion, Zalophus lobatus\nn02079005\tSteller sea lion, Steller's sea lion, Eumetopias jubatus\nn02079389\tearless seal, true seal, hair seal\nn02079851\tharbor seal, common seal, Phoca vitulina\nn02080146\tharp seal, Pagophilus groenlandicus\nn02080415\telephant seal, sea elephant\nn02080713\tbearded seal, 
squareflipper square flipper, Erignathus barbatus\nn02081060\thooded seal, bladdernose, Cystophora cristata\nn02081571\twalrus, seahorse, sea horse\nn02081798\tAtlantic walrus, Odobenus rosmarus\nn02081927\tPacific walrus, Odobenus divergens\nn02082056\tFissipedia\nn02082190\tfissiped mammal, fissiped\nn02082791\taardvark, ant bear, anteater, Orycteropus afer\nn02083346\tcanine, canid\nn02083672\tbitch\nn02083780\tbrood bitch\nn02084071\tdog, domestic dog, Canis familiaris\nn02084732\tpooch, doggie, doggy, barker, bow-wow\nn02084861\tcur, mongrel, mutt\nn02085019\tfeist, fice\nn02085118\tpariah dog, pye-dog, pie-dog\nn02085272\tlapdog\nn02085374\ttoy dog, toy\nn02085620\tChihuahua\nn02085782\tJapanese spaniel\nn02085936\tMaltese dog, Maltese terrier, Maltese\nn02086079\tPekinese, Pekingese, Peke\nn02086240\tShih-Tzu\nn02086346\ttoy spaniel\nn02086478\tEnglish toy spaniel\nn02086646\tBlenheim spaniel\nn02086753\tKing Charles spaniel\nn02086910\tpapillon\nn02087046\ttoy terrier\nn02087122\thunting dog\nn02087314\tcourser\nn02087394\tRhodesian ridgeback\nn02087551\thound, hound dog\nn02088094\tAfghan hound, Afghan\nn02088238\tbasset, basset hound\nn02088364\tbeagle\nn02088466\tbloodhound, sleuthhound\nn02088632\tbluetick\nn02088745\tboarhound\nn02088839\tcoonhound\nn02088992\tcoondog\nn02089078\tblack-and-tan coonhound\nn02089232\tdachshund, dachsie, badger dog\nn02089468\tsausage dog, sausage hound\nn02089555\tfoxhound\nn02089725\tAmerican foxhound\nn02089867\tWalker hound, Walker foxhound\nn02089973\tEnglish foxhound\nn02090129\tharrier\nn02090253\tPlott hound\nn02090379\tredbone\nn02090475\twolfhound\nn02090622\tborzoi, Russian wolfhound\nn02090721\tIrish wolfhound\nn02090827\tgreyhound\nn02091032\tItalian greyhound\nn02091134\twhippet\nn02091244\tIbizan hound, Ibizan Podenco\nn02091467\tNorwegian elkhound, elkhound\nn02091635\totterhound, otter hound\nn02091831\tSaluki, gazelle hound\nn02092002\tScottish deerhound, 
deerhound\nn02092173\tstaghound\nn02092339\tWeimaraner\nn02092468\tterrier\nn02093056\tbullterrier, bull terrier\nn02093256\tStaffordshire bullterrier, Staffordshire bull terrier\nn02093428\tAmerican Staffordshire terrier, Staffordshire terrier, American pit bull terrier, pit bull terrier\nn02093647\tBedlington terrier\nn02093754\tBorder terrier\nn02093859\tKerry blue terrier\nn02093991\tIrish terrier\nn02094114\tNorfolk terrier\nn02094258\tNorwich terrier\nn02094433\tYorkshire terrier\nn02094562\trat terrier, ratter\nn02094721\tManchester terrier, black-and-tan terrier\nn02094931\ttoy Manchester, toy Manchester terrier\nn02095050\tfox terrier\nn02095212\tsmooth-haired fox terrier\nn02095314\twire-haired fox terrier\nn02095412\twirehair, wirehaired terrier, wire-haired terrier\nn02095570\tLakeland terrier\nn02095727\tWelsh terrier\nn02095889\tSealyham terrier, Sealyham\nn02096051\tAiredale, Airedale terrier\nn02096177\tcairn, cairn terrier\nn02096294\tAustralian terrier\nn02096437\tDandie Dinmont, Dandie Dinmont terrier\nn02096585\tBoston bull, Boston terrier\nn02096756\tschnauzer\nn02097047\tminiature schnauzer\nn02097130\tgiant schnauzer\nn02097209\tstandard schnauzer\nn02097298\tScotch terrier, Scottish terrier, Scottie\nn02097474\tTibetan terrier, chrysanthemum dog\nn02097658\tsilky terrier, Sydney silky\nn02097786\tSkye terrier\nn02097967\tClydesdale terrier\nn02098105\tsoft-coated wheaten terrier\nn02098286\tWest Highland white terrier\nn02098413\tLhasa, Lhasa apso\nn02098550\tsporting dog, gun dog\nn02098806\tbird dog\nn02098906\twater dog\nn02099029\tretriever\nn02099267\tflat-coated retriever\nn02099429\tcurly-coated retriever\nn02099601\tgolden retriever\nn02099712\tLabrador retriever\nn02099849\tChesapeake Bay retriever\nn02099997\tpointer, Spanish pointer\nn02100236\tGerman short-haired pointer\nn02100399\tsetter\nn02100583\tvizsla, Hungarian pointer\nn02100735\tEnglish setter\nn02100877\tIrish setter, red setter\nn02101006\tGordon 
setter\nn02101108\tspaniel\nn02101388\tBrittany spaniel\nn02101556\tclumber, clumber spaniel\nn02101670\tfield spaniel\nn02101861\tspringer spaniel, springer\nn02102040\tEnglish springer, English springer spaniel\nn02102177\tWelsh springer spaniel\nn02102318\tcocker spaniel, English cocker spaniel, cocker\nn02102480\tSussex spaniel\nn02102605\twater spaniel\nn02102806\tAmerican water spaniel\nn02102973\tIrish water spaniel\nn02103181\tgriffon, wire-haired pointing griffon\nn02103406\tworking dog\nn02103841\twatchdog, guard dog\nn02104029\tkuvasz\nn02104184\tattack dog\nn02104280\thousedog\nn02104365\tschipperke\nn02104523\tshepherd dog, sheepdog, sheep dog\nn02104882\tBelgian sheepdog, Belgian shepherd\nn02105056\tgroenendael\nn02105162\tmalinois\nn02105251\tbriard\nn02105412\tkelpie\nn02105505\tkomondor\nn02105641\tOld English sheepdog, bobtail\nn02105855\tShetland sheepdog, Shetland sheep dog, Shetland\nn02106030\tcollie\nn02106166\tBorder collie\nn02106382\tBouvier des Flandres, Bouviers des Flandres\nn02106550\tRottweiler\nn02106662\tGerman shepherd, German shepherd dog, German police dog, alsatian\nn02106854\tpolice dog\nn02106966\tpinscher\nn02107142\tDoberman, Doberman pinscher\nn02107312\tminiature pinscher\nn02107420\tSennenhunde\nn02107574\tGreater Swiss Mountain dog\nn02107683\tBernese mountain dog\nn02107908\tAppenzeller\nn02108000\tEntleBucher\nn02108089\tboxer\nn02108254\tmastiff\nn02108422\tbull mastiff\nn02108551\tTibetan mastiff\nn02108672\tbulldog, English bulldog\nn02108915\tFrench bulldog\nn02109047\tGreat Dane\nn02109150\tguide dog\nn02109256\tSeeing Eye dog\nn02109391\thearing dog\nn02109525\tSaint Bernard, St Bernard\nn02109687\tseizure-alert dog\nn02109811\tsled dog, sledge dog\nn02109961\tEskimo dog, husky\nn02110063\tmalamute, malemute, Alaskan malamute\nn02110185\tSiberian husky\nn02110341\tdalmatian, coach dog, carriage dog\nn02110532\tliver-spotted dalmatian\nn02110627\taffenpinscher, monkey pinscher, monkey 
dog\nn02110806\tbasenji\nn02110958\tpug, pug-dog\nn02111129\tLeonberg\nn02111277\tNewfoundland, Newfoundland dog\nn02111500\tGreat Pyrenees\nn02111626\tspitz\nn02111889\tSamoyed, Samoyede\nn02112018\tPomeranian\nn02112137\tchow, chow chow\nn02112350\tkeeshond\nn02112497\tgriffon, Brussels griffon, Belgian griffon\nn02112706\tBrabancon griffon\nn02112826\tcorgi, Welsh corgi\nn02113023\tPembroke, Pembroke Welsh corgi\nn02113186\tCardigan, Cardigan Welsh corgi\nn02113335\tpoodle, poodle dog\nn02113624\ttoy poodle\nn02113712\tminiature poodle\nn02113799\tstandard poodle\nn02113892\tlarge poodle\nn02113978\tMexican hairless\nn02114100\twolf\nn02114367\ttimber wolf, grey wolf, gray wolf, Canis lupus\nn02114548\twhite wolf, Arctic wolf, Canis lupus tundrarum\nn02114712\tred wolf, maned wolf, Canis rufus, Canis niger\nn02114855\tcoyote, prairie wolf, brush wolf, Canis latrans\nn02115012\tcoydog\nn02115096\tjackal, Canis aureus\nn02115335\twild dog\nn02115641\tdingo, warrigal, warragal, Canis dingo\nn02115913\tdhole, Cuon alpinus\nn02116185\tcrab-eating dog, crab-eating fox, Dusicyon cancrivorus\nn02116450\traccoon dog, Nyctereutes procyonides\nn02116738\tAfrican hunting dog, hyena dog, Cape hunting dog, Lycaon pictus\nn02117135\thyena, hyaena\nn02117512\tstriped hyena, Hyaena hyaena\nn02117646\tbrown hyena, strand wolf, Hyaena brunnea\nn02117900\tspotted hyena, laughing hyena, Crocuta crocuta\nn02118176\taardwolf, Proteles cristata\nn02118333\tfox\nn02118643\tvixen\nn02118707\tReynard\nn02119022\tred fox, Vulpes vulpes\nn02119247\tblack fox\nn02119359\tsilver fox\nn02119477\tred fox, Vulpes fulva\nn02119634\tkit fox, prairie fox, Vulpes velox\nn02119789\tkit fox, Vulpes macrotis\nn02120079\tArctic fox, white fox, Alopex lagopus\nn02120278\tblue fox\nn02120505\tgrey fox, gray fox, Urocyon cinereoargenteus\nn02120997\tfeline, felid\nn02121620\tcat, true cat\nn02121808\tdomestic cat, house cat, Felis domesticus, Felis catus\nn02122298\tkitty, kitty-cat, puss, pussy, 
pussycat\nn02122430\tmouser\nn02122510\talley cat\nn02122580\tstray\nn02122725\ttom, tomcat\nn02122810\tgib\nn02122878\ttabby, queen\nn02122948\tkitten, kitty\nn02123045\ttabby, tabby cat\nn02123159\ttiger cat\nn02123242\ttortoiseshell, tortoiseshell-cat, calico cat\nn02123394\tPersian cat\nn02123478\tAngora, Angora cat\nn02123597\tSiamese cat, Siamese\nn02123785\tblue point Siamese\nn02123917\tBurmese cat\nn02124075\tEgyptian cat\nn02124157\tMaltese, Maltese cat\nn02124313\tAbyssinian, Abyssinian cat\nn02124484\tManx, Manx cat\nn02124623\twildcat\nn02125010\tsand cat\nn02125081\tEuropean wildcat, catamountain, Felis silvestris\nn02125311\tcougar, puma, catamount, mountain lion, painter, panther, Felis concolor\nn02125494\tocelot, panther cat, Felis pardalis\nn02125689\tjaguarundi, jaguarundi cat, jaguarondi, eyra, Felis yagouaroundi\nn02125872\tkaffir cat, caffer cat, Felis ocreata\nn02126028\tjungle cat, Felis chaus\nn02126139\tserval, Felis serval\nn02126317\tleopard cat, Felis bengalensis\nn02126640\tmargay, margay cat, Felis wiedi\nn02126787\tmanul, Pallas's cat, Felis manul\nn02127052\tlynx, catamount\nn02127292\tcommon lynx, Lynx lynx\nn02127381\tCanada lynx, Lynx canadensis\nn02127482\tbobcat, bay lynx, Lynx rufus\nn02127586\tspotted lynx, Lynx pardina\nn02127678\tcaracal, desert lynx, Lynx caracal\nn02127808\tbig cat, cat\nn02128385\tleopard, Panthera pardus\nn02128598\tleopardess\nn02128669\tpanther\nn02128757\tsnow leopard, ounce, Panthera uncia\nn02128925\tjaguar, panther, Panthera onca, Felis onca\nn02129165\tlion, king of beasts, Panthera leo\nn02129463\tlioness\nn02129530\tlionet\nn02129604\ttiger, Panthera tigris\nn02129837\tBengal tiger\nn02129923\ttigress\nn02129991\tliger\nn02130086\ttiglon, tigon\nn02130308\tcheetah, chetah, Acinonyx jubatus\nn02130545\tsaber-toothed tiger, sabertooth\nn02130925\tSmiledon californicus\nn02131653\tbear\nn02132136\tbrown bear, bruin, Ursus arctos\nn02132320\tbruin\nn02132466\tSyrian bear, Ursus arctos 
syriacus\nn02132580\tgrizzly, grizzly bear, silvertip, silver-tip, Ursus horribilis, Ursus arctos horribilis\nn02132788\tAlaskan brown bear, Kodiak bear, Kodiak, Ursus middendorffi, Ursus arctos middendorffi\nn02133161\tAmerican black bear, black bear, Ursus americanus, Euarctos americanus\nn02133400\tcinnamon bear\nn02133704\tAsiatic black bear, black bear, Ursus thibetanus, Selenarctos thibetanus\nn02134084\tice bear, polar bear, Ursus Maritimus, Thalarctos maritimus\nn02134418\tsloth bear, Melursus ursinus, Ursus ursinus\nn02134971\tviverrine, viverrine mammal\nn02135220\tcivet, civet cat\nn02135610\tlarge civet, Viverra zibetha\nn02135844\tsmall civet, Viverricula indica, Viverricula malaccensis\nn02136103\tbinturong, bearcat, Arctictis bintourong\nn02136285\tCryptoprocta, genus Cryptoprocta\nn02136452\tfossa, fossa cat, Cryptoprocta ferox\nn02136794\tfanaloka, Fossa fossa\nn02137015\tgenet, Genetta genetta\nn02137302\tbanded palm civet, Hemigalus hardwickii\nn02137549\tmongoose\nn02137722\tIndian mongoose, Herpestes nyula\nn02137888\tichneumon, Herpestes ichneumon\nn02138169\tpalm cat, palm civet\nn02138441\tmeerkat, mierkat\nn02138647\tslender-tailed meerkat, Suricata suricatta\nn02138777\tsuricate, Suricata tetradactyla\nn02139199\tbat, chiropteran\nn02139671\tfruit bat, megabat\nn02140049\tflying fox\nn02140179\tPteropus capestratus\nn02140268\tPteropus hypomelanus\nn02140491\tharpy, harpy bat, tube-nosed bat, tube-nosed fruit bat\nn02140858\tCynopterus sphinx\nn02141306\tcarnivorous bat, microbat\nn02141611\tmouse-eared bat\nn02141713\tleafnose bat, leaf-nosed bat\nn02142407\tmacrotus, Macrotus californicus\nn02142734\tspearnose bat\nn02142898\tPhyllostomus hastatus\nn02143142\thognose bat, Choeronycteris mexicana\nn02143439\thorseshoe bat\nn02143891\thorseshoe bat\nn02144251\torange bat, orange horseshoe bat, Rhinonicteris aurantius\nn02144593\tfalse vampire, false vampire bat\nn02144936\tbig-eared bat, Megaderma lyra\nn02145424\tvespertilian bat, 
vespertilionid\nn02145910\tfrosted bat, Vespertilio murinus\nn02146201\tred bat, Lasiurus borealis\nn02146371\tbrown bat\nn02146700\tlittle brown bat, little brown myotis, Myotis leucifugus\nn02146879\tcave myotis, Myotis velifer\nn02147173\tbig brown bat, Eptesicus fuscus\nn02147328\tserotine, European brown bat, Eptesicus serotinus\nn02147591\tpallid bat, cave bat, Antrozous pallidus\nn02147947\tpipistrelle, pipistrel, Pipistrellus pipistrellus\nn02148088\teastern pipistrel, Pipistrellus subflavus\nn02148512\tjackass bat, spotted bat, Euderma maculata\nn02148835\tlong-eared bat\nn02148991\twestern big-eared bat, Plecotus townsendi\nn02149420\tfreetail, free-tailed bat, freetailed bat\nn02149653\tguano bat, Mexican freetail bat, Tadarida brasiliensis\nn02149861\tpocketed bat, pocketed freetail bat, Tadirida femorosacca\nn02150134\tmastiff bat\nn02150482\tvampire bat, true vampire bat\nn02150885\tDesmodus rotundus\nn02151230\thairy-legged vampire bat, Diphylla ecaudata\nn02152740\tpredator, predatory animal\nn02152881\tprey, quarry\nn02152991\tgame\nn02153109\tbig game\nn02153203\tgame bird\nn02153809\tfossorial mammal\nn02156732\ttetrapod\nn02156871\tquadruped\nn02157206\thexapod\nn02157285\tbiped\nn02159955\tinsect\nn02160947\tsocial insect\nn02161225\tholometabola, metabola\nn02161338\tdefoliator\nn02161457\tpollinator\nn02161588\tgallfly\nn02162561\tscorpion fly\nn02163008\thanging fly\nn02163297\tcollembolan, springtail\nn02164464\tbeetle\nn02165105\ttiger beetle\nn02165456\tladybug, ladybeetle, lady beetle, ladybird, ladybird beetle\nn02165877\ttwo-spotted ladybug, Adalia bipunctata\nn02166229\tMexican bean beetle, bean beetle, Epilachna varivestis\nn02166567\tHippodamia convergens\nn02166826\tvedalia, Rodolia cardinalis\nn02167151\tground beetle, carabid beetle\nn02167505\tbombardier beetle\nn02167820\tcalosoma\nn02167944\tsearcher, searcher beetle, Calosoma scrutator\nn02168245\tfirefly, lightning bug\nn02168427\tglowworm\nn02168699\tlong-horned beetle, 
longicorn, longicorn beetle\nn02169023\tsawyer, sawyer beetle\nn02169218\tpine sawyer\nn02169497\tleaf beetle, chrysomelid\nn02169705\tflea beetle\nn02169974\tColorado potato beetle, Colorado beetle, potato bug, potato beetle, Leptinotarsa decemlineata\nn02170400\tcarpet beetle, carpet bug\nn02170599\tbuffalo carpet beetle, Anthrenus scrophulariae\nn02170738\tblack carpet beetle\nn02170993\tclerid beetle, clerid\nn02171164\tbee beetle\nn02171453\tlamellicorn beetle\nn02171869\tscarabaeid beetle, scarabaeid, scarabaean\nn02172182\tdung beetle\nn02172518\tscarab, scarabaeus, Scarabaeus sacer\nn02172678\ttumblebug\nn02172761\tdorbeetle\nn02172870\tJune beetle, June bug, May bug, May beetle\nn02173113\tgreen June beetle, figeater\nn02173373\tJapanese beetle, Popillia japonica\nn02173784\tOriental beetle, Asiatic beetle, Anomala orientalis\nn02174001\trhinoceros beetle\nn02174355\tmelolonthid beetle\nn02174659\tcockchafer, May bug, May beetle, Melolontha melolontha\nn02175014\trose chafer, rose bug, Macrodactylus subspinosus\nn02175569\trose chafer, rose beetle, Cetonia aurata\nn02175916\tstag beetle\nn02176261\telaterid beetle, elater, elaterid\nn02176439\tclick beetle, skipjack, snapping beetle\nn02176747\tfirefly, fire beetle, Pyrophorus noctiluca\nn02176916\twireworm\nn02177196\twater beetle\nn02177506\twhirligig beetle\nn02177775\tdeathwatch beetle, deathwatch, Xestobium rufovillosum\nn02177972\tweevil\nn02178411\tsnout beetle\nn02178717\tboll weevil, Anthonomus grandis\nn02179012\tblister beetle, meloid\nn02179192\toil beetle\nn02179340\tSpanish fly\nn02179891\tDutch-elm beetle, Scolytus multistriatus\nn02180233\tbark beetle\nn02180427\tspruce bark beetle, Dendroctonus rufipennis\nn02180875\trove beetle\nn02181235\tdarkling beetle, darkling groung beetle, tenebrionid\nn02181477\tmealworm\nn02181724\tflour beetle, flour weevil\nn02182045\tseed beetle, seed weevil\nn02182355\tpea weevil, Bruchus pisorum\nn02182642\tbean weevil, Acanthoscelides 
obtectus\nn02182930\trice weevil, black weevil, Sitophylus oryzae\nn02183096\tAsian longhorned beetle, Anoplophora glabripennis\nn02183507\tweb spinner\nn02183857\tlouse, sucking louse\nn02184473\tcommon louse, Pediculus humanus\nn02184589\thead louse, Pediculus capitis\nn02184720\tbody louse, cootie, Pediculus corporis\nn02185167\tcrab louse, pubic louse, crab, Phthirius pubis\nn02185481\tbird louse, biting louse, louse\nn02186153\tflea\nn02186717\tPulex irritans\nn02187150\tdog flea, Ctenocephalides canis\nn02187279\tcat flea, Ctenocephalides felis\nn02187554\tchigoe, chigger, chigoe flea, Tunga penetrans\nn02187900\tsticktight, sticktight flea, Echidnophaga gallinacea\nn02188699\tdipterous insect, two-winged insects, dipteran, dipteron\nn02189363\tgall midge, gallfly, gall gnat\nn02189670\tHessian fly, Mayetiola destructor\nn02190166\tfly\nn02190790\thousefly, house fly, Musca domestica\nn02191273\ttsetse fly, tsetse, tzetze fly, tzetze, glossina\nn02191773\tblowfly, blow fly\nn02191979\tbluebottle, Calliphora vicina\nn02192252\tgreenbottle, greenbottle fly\nn02192513\tflesh fly, Sarcophaga carnaria\nn02192814\ttachina fly\nn02193009\tgadfly\nn02193163\tbotfly\nn02194249\thuman botfly, Dermatobia hominis\nn02194750\tsheep botfly, sheep gadfly, Oestrus ovis\nn02195091\twarble fly\nn02195526\thorsefly, cleg, clegg, horse fly\nn02195819\tbee fly\nn02196119\trobber fly, bee killer\nn02196344\tfruit fly, pomace fly\nn02196896\tapple maggot, railroad worm, Rhagoletis pomonella\nn02197185\tMediterranean fruit fly, medfly, Ceratitis capitata\nn02197689\tdrosophila, Drosophila melanogaster\nn02197877\tvinegar fly\nn02198129\tleaf miner, leaf-miner\nn02198532\tlouse fly, hippoboscid\nn02198859\thorse tick, horsefly, Hippobosca equina\nn02199170\tsheep ked, sheep-tick, sheep tick, Melophagus Ovinus\nn02199502\thorn fly, Haematobia irritans\nn02200198\tmosquito\nn02200509\twiggler, wriggler\nn02200630\tgnat\nn02200850\tyellow-fever mosquito, Aedes aegypti\nn02201000\tAsian 
tiger mosquito, Aedes albopictus\nn02201497\tanopheline\nn02201626\tmalarial mosquito, malaria mosquito\nn02202006\tcommon mosquito, Culex pipiens\nn02202124\tCulex quinquefasciatus, Culex fatigans\nn02202287\tgnat\nn02202678\tpunkie, punky, punkey, no-see-um, biting midge\nn02203152\tmidge\nn02203592\tfungus gnat\nn02203978\tpsychodid\nn02204249\tsand fly, sandfly, Phlebotomus papatasii\nn02204722\tfungus gnat, sciara, sciarid\nn02204907\tarmyworm\nn02205219\tcrane fly, daddy longlegs\nn02205673\tblackfly, black fly, buffalo gnat\nn02206270\thymenopterous insect, hymenopteran, hymenopteron, hymenopter\nn02206856\tbee\nn02207179\tdrone\nn02207345\tqueen bee\nn02207449\tworker\nn02207647\tsoldier\nn02207805\tworker bee\nn02208280\thoneybee, Apis mellifera\nn02208498\tAfricanized bee, Africanized honey bee, killer bee, Apis mellifera scutellata, Apis mellifera adansonii\nn02208848\tblack bee, German bee\nn02208979\tCarniolan bee\nn02209111\tItalian bee\nn02209354\tcarpenter bee\nn02209624\tbumblebee, humblebee\nn02209964\tcuckoo-bumblebee\nn02210427\tandrena, andrenid, mining bee\nn02210921\tNomia melanderi, alkali bee\nn02211444\tleaf-cutting bee, leaf-cutter, leaf-cutter bee\nn02211627\tmason bee\nn02211896\tpotter bee\nn02212062\twasp\nn02212602\tvespid, vespid wasp\nn02212958\tpaper wasp\nn02213107\thornet\nn02213239\tgiant hornet, Vespa crabro\nn02213543\tcommon wasp, Vespula vulgaris\nn02213663\tbald-faced hornet, white-faced hornet, Vespula maculata\nn02213788\tyellow jacket, yellow hornet, Vespula maculifrons\nn02214096\tPolistes annularis\nn02214341\tmason wasp\nn02214499\tpotter wasp\nn02214660\tMutillidae, family Mutillidae\nn02214773\tvelvet ant\nn02215161\tsphecoid wasp, sphecoid\nn02215621\tmason wasp\nn02215770\tdigger wasp\nn02216211\tcicada killer, Sphecius speciosis\nn02216365\tmud dauber\nn02216740\tgall wasp, gallfly, cynipid wasp, cynipid gall wasp\nn02217563\tchalcid fly, chalcidfly, chalcid, chalcid wasp\nn02217839\tstrawworm, 
jointworm\nn02218134\tchalcis fly\nn02218371\tichneumon fly\nn02218713\tsawfly\nn02219015\tbirch leaf miner, Fenusa pusilla\nn02219486\tant, emmet, pismire\nn02220055\tpharaoh ant, pharaoh's ant, Monomorium pharaonis\nn02220225\tlittle black ant, Monomorium minimum\nn02220518\tarmy ant, driver ant, legionary ant\nn02220804\tcarpenter ant\nn02221083\tfire ant\nn02221414\twood ant, Formica rufa\nn02221571\tslave ant\nn02221715\tFormica fusca\nn02221820\tslave-making ant, slave-maker\nn02222035\tsanguinary ant, Formica sanguinea\nn02222321\tbulldog ant\nn02222582\tAmazon ant, Polyergus rufescens\nn02223266\ttermite, white ant\nn02223520\tdry-wood termite\nn02224023\tReticulitermes lucifugus\nn02224713\tMastotermes darwiniensis\nn02225081\tMastotermes electrodominicus\nn02225798\tpowder-post termite, Cryptotermes brevis\nn02226183\torthopterous insect, orthopteron, orthopteran\nn02226429\tgrasshopper, hopper\nn02226821\tshort-horned grasshopper, acridid\nn02226970\tlocust\nn02227247\tmigratory locust, Locusta migratoria\nn02227604\tmigratory grasshopper\nn02227966\tlong-horned grasshopper, tettigoniid\nn02228341\tkatydid\nn02228697\tmormon cricket, Anabrus simplex\nn02229156\tsand cricket, Jerusalem cricket, Stenopelmatus fuscus\nn02229544\tcricket\nn02229765\tmole cricket\nn02230023\tEuropean house cricket, Acheta domestica\nn02230187\tfield cricket, Acheta assimilis\nn02230480\ttree cricket\nn02230634\tsnowy tree cricket, Oecanthus fultoni\nn02231052\tphasmid, phasmid insect\nn02231487\twalking stick, walkingstick, stick insect\nn02231803\tdiapheromera, Diapheromera femorata\nn02232223\twalking leaf, leaf insect\nn02233338\tcockroach, roach\nn02233943\toriental cockroach, oriental roach, Asiatic cockroach, blackbeetle, Blatta orientalis\nn02234355\tAmerican cockroach, Periplaneta americana\nn02234570\tAustralian cockroach, Periplaneta australasiae\nn02234848\tGerman cockroach, Croton bug, crotonbug, water bug, Blattella germanica\nn02235205\tgiant 
cockroach\nn02236044\tmantis, mantid\nn02236241\tpraying mantis, praying mantid, Mantis religioso\nn02236355\tbug\nn02236896\themipterous insect, bug, hemipteran, hemipteron\nn02237424\tleaf bug, plant bug\nn02237581\tmirid bug, mirid, capsid\nn02237868\tfour-lined plant bug, four-lined leaf bug, Poecilocapsus lineatus\nn02238235\tlygus bug\nn02238358\ttarnished plant bug, Lygus lineolaris\nn02238594\tlace bug\nn02238887\tlygaeid, lygaeid bug\nn02239192\tchinch bug, Blissus leucopterus\nn02239528\tcoreid bug, coreid\nn02239774\tsquash bug, Anasa tristis\nn02240068\tleaf-footed bug, leaf-foot bug\nn02240517\tbedbug, bed bug, chinch, Cimex lectularius\nn02241008\tbackswimmer, Notonecta undulata\nn02241426\ttrue bug\nn02241569\theteropterous insect\nn02241799\twater bug\nn02242137\tgiant water bug\nn02242455\twater scorpion\nn02243209\twater boatman, boat bug\nn02243562\twater strider, pond-skater, water skater\nn02243878\tcommon pond-skater, Gerris lacustris\nn02244173\tassassin bug, reduviid\nn02244515\tconenose, cone-nosed bug, conenose bug, big bedbug, kissing bug\nn02244797\twheel bug, Arilus cristatus\nn02245111\tfirebug\nn02245443\tcotton stainer\nn02246011\thomopterous insect, homopteran\nn02246628\twhitefly\nn02246941\tcitrus whitefly, Dialeurodes citri\nn02247216\tgreenhouse whitefly, Trialeurodes vaporariorum\nn02247511\tsweet-potato whitefly\nn02247655\tsuperbug, Bemisia tabaci, poinsettia strain\nn02248062\tcotton strain\nn02248368\tcoccid insect\nn02248510\tscale insect\nn02248887\tsoft scale\nn02249134\tbrown soft scale, Coccus hesperidum\nn02249515\tarmored scale\nn02249809\tSan Jose scale, Aspidiotus perniciosus\nn02250280\tcochineal insect, cochineal, Dactylopius coccus\nn02250822\tmealybug, mealy bug\nn02251067\tcitrophilous mealybug, citrophilus mealybug, Pseudococcus fragilis\nn02251233\tComstock mealybug, Comstock's mealybug, Pseudococcus comstocki\nn02251593\tcitrus mealybug, Planococcus citri\nn02251775\tplant louse, 
louse\nn02252226\taphid\nn02252799\tapple aphid, green apple aphid, Aphis pomi\nn02252972\tblackfly, bean aphid, Aphis fabae\nn02253127\tgreenfly\nn02253264\tgreen peach aphid\nn02253494\tant cow\nn02253715\twoolly aphid, woolly plant louse\nn02253913\twoolly apple aphid, American blight, Eriosoma lanigerum\nn02254246\twoolly alder aphid, Prociphilus tessellatus\nn02254697\tadelgid\nn02254901\tbalsam woolly aphid, Adelges piceae\nn02255023\tspruce gall aphid, Adelges abietis\nn02255391\twoolly adelgid\nn02256172\tjumping plant louse, psylla, psyllid\nn02256656\tcicada, cicala\nn02257003\tdog-day cicada, harvest fly\nn02257284\tseventeen-year locust, periodical cicada, Magicicada septendecim\nn02257715\tspittle insect, spittlebug\nn02257985\tfroghopper\nn02258198\tmeadow spittlebug, Philaenus spumarius\nn02258508\tpine spittlebug\nn02258629\tSaratoga spittlebug, Aphrophora saratogensis\nn02259212\tleafhopper\nn02259377\tplant hopper, planthopper\nn02259708\ttreehopper\nn02259987\tlantern fly, lantern-fly\nn02260421\tpsocopterous insect\nn02260863\tpsocid\nn02261063\tbark-louse, bark louse\nn02261419\tbooklouse, book louse, deathwatch, Liposcelis divinatorius\nn02261757\tcommon booklouse, Trogium pulsatorium\nn02262178\tephemerid, ephemeropteran\nn02262449\tmayfly, dayfly, shadfly\nn02262803\tstonefly, stone fly, plecopteran\nn02263378\tneuropteron, neuropteran, neuropterous insect\nn02264021\tant lion, antlion, antlion fly\nn02264232\tdoodlebug, ant lion, antlion\nn02264363\tlacewing, lacewing fly\nn02264591\taphid lion, aphis lion\nn02264885\tgreen lacewing, chrysopid, stink fly\nn02265330\tbrown lacewing, hemerobiid, hemerobiid fly\nn02266050\tdobson, dobsonfly, dobson fly, Corydalus cornutus\nn02266269\thellgrammiate, dobson\nn02266421\tfish fly, fish-fly\nn02266864\talderfly, alder fly, Sialis lutaria\nn02267208\tsnakefly\nn02267483\tmantispid\nn02268148\todonate\nn02268443\tdragonfly, darning needle, devil's darning needle, sewing needle, snake feeder, snake 
doctor, mosquito hawk, skeeter hawk\nn02268853\tdamselfly\nn02269196\ttrichopterous insect, trichopteran, trichopteron\nn02269340\tcaddis fly, caddis-fly, caddice fly, caddice-fly\nn02269522\tcaseworm\nn02269657\tcaddisworm, strawworm\nn02270011\tthysanuran insect, thysanuron\nn02270200\tbristletail\nn02270623\tsilverfish, Lepisma saccharina\nn02270945\tfirebrat, Thermobia domestica\nn02271222\tjumping bristletail, machilid\nn02271570\tthysanopter, thysanopteron, thysanopterous insect\nn02271897\tthrips, thrip, thripid\nn02272286\ttobacco thrips, Frankliniella fusca\nn02272552\tonion thrips, onion louse, Thrips tobaci\nn02272871\tearwig\nn02273392\tcommon European earwig, Forficula auricularia\nn02274024\tlepidopterous insect, lepidopteron, lepidopteran\nn02274259\tbutterfly\nn02274822\tnymphalid, nymphalid butterfly, brush-footed butterfly, four-footed butterfly\nn02275560\tmourning cloak, mourning cloak butterfly, Camberwell beauty, Nymphalis antiopa\nn02275773\ttortoiseshell, tortoiseshell butterfly\nn02276078\tpainted beauty, Vanessa virginiensis\nn02276258\tadmiral\nn02276355\tred admiral, Vanessa atalanta\nn02276749\twhite admiral, Limenitis camilla\nn02276902\tbanded purple, white admiral, Limenitis arthemis\nn02277094\tred-spotted purple, Limenitis astyanax\nn02277268\tviceroy, Limenitis archippus\nn02277422\tanglewing\nn02277742\tringlet, ringlet butterfly\nn02278024\tcomma, comma butterfly, Polygonia comma\nn02278210\tfritillary\nn02278463\tsilverspot\nn02278839\temperor butterfly, emperor\nn02278980\tpurple emperor, Apatura iris\nn02279257\tpeacock, peacock butterfly, Inachis io\nn02279637\tdanaid, danaid butterfly\nn02279972\tmonarch, monarch butterfly, milkweed butterfly, Danaus plexippus\nn02280458\tpierid, pierid butterfly\nn02280649\tcabbage butterfly\nn02281015\tsmall white, Pieris rapae\nn02281136\tlarge white, Pieris brassicae\nn02281267\tsouthern cabbage butterfly, Pieris protodice\nn02281406\tsulphur butterfly, sulfur 
butterfly\nn02281787\tlycaenid, lycaenid butterfly\nn02282257\tblue\nn02282385\tcopper\nn02282553\tAmerican copper, Lycaena hypophlaeas\nn02282903\thairstreak, hairstreak butterfly\nn02283077\tStrymon melinus\nn02283201\tmoth\nn02283617\tmoth miller, miller\nn02283951\ttortricid, tortricid moth\nn02284224\tleaf roller, leaf-roller\nn02284611\ttea tortrix, tortrix, Homona coffearia\nn02284884\torange tortrix, tortrix, Argyrotaenia citrana\nn02285179\tcodling moth, codlin moth, Carpocapsa pomonella\nn02285548\tlymantriid, tussock moth\nn02285801\ttussock caterpillar\nn02286089\tgypsy moth, gipsy moth, Lymantria dispar\nn02286425\tbrowntail, brown-tail moth, Euproctis phaeorrhoea\nn02286654\tgold-tail moth, Euproctis chrysorrhoea\nn02287004\tgeometrid, geometrid moth\nn02287352\tPaleacrita vernata\nn02287622\tAlsophila pometaria\nn02287799\tcankerworm\nn02287987\tspring cankerworm\nn02288122\tfall cankerworm\nn02288268\tmeasuring worm, inchworm, looper\nn02288789\tpyralid, pyralid moth\nn02289307\tbee moth, wax moth, Galleria mellonella\nn02289610\tcorn borer, European corn borer moth, corn borer moth, Pyrausta nubilalis\nn02289988\tMediterranean flour moth, Anagasta kuehniella\nn02290340\ttobacco moth, cacao moth, Ephestia elutella\nn02290664\talmond moth, fig moth, Cadra cautella\nn02290870\traisin moth, Cadra figulilella\nn02291220\ttineoid, tineoid moth\nn02291572\ttineid, tineid moth\nn02291748\tclothes moth\nn02292085\tcasemaking clothes moth, Tinea pellionella\nn02292401\twebbing clothes moth, webbing moth, Tineola bisselliella\nn02292692\tcarpet moth, tapestry moth, Trichophaga tapetzella\nn02293352\tgelechiid, gelechiid moth\nn02293868\tgrain moth\nn02294097\tangoumois moth, angoumois grain moth, Sitotroga cerealella\nn02294407\tpotato moth, potato tuber moth, splitworm, Phthorimaea operculella\nn02294577\tpotato tuberworm, Phthorimaea operculella\nn02295064\tnoctuid moth, noctuid, owlet moth\nn02295390\tcutworm\nn02295870\tunderwing\nn02296021\tred 
underwing, Catocala nupta\nn02296276\tantler moth, Cerapteryx graminis\nn02296612\theliothis moth, Heliothis zia\nn02296912\tarmy cutworm, Chorizagrotis auxiliaris\nn02297294\tarmyworm, Pseudaletia unipuncta\nn02297442\tarmyworm, army worm, Pseudaletia unipuncta\nn02297819\tSpodoptera exigua\nn02297938\tbeet armyworm, Spodoptera exigua\nn02298095\tSpodoptera frugiperda\nn02298218\tfall armyworm, Spodoptera frugiperda\nn02298541\thawkmoth, hawk moth, sphingid, sphinx moth, hummingbird moth\nn02299039\tManduca sexta\nn02299157\ttobacco hornworm, tomato worm, Manduca sexta\nn02299378\tManduca quinquemaculata\nn02299505\ttomato hornworm, potato worm, Manduca quinquemaculata\nn02299846\tdeath's-head moth, Acherontia atropos\nn02300173\tbombycid, bombycid moth, silkworm moth\nn02300554\tdomestic silkworm moth, domesticated silkworm moth, Bombyx mori\nn02300797\tsilkworm\nn02301452\tsaturniid, saturniid moth\nn02301935\temperor, emperor moth, Saturnia pavonia\nn02302244\timperial moth, Eacles imperialis\nn02302459\tgiant silkworm moth, silkworm moth\nn02302620\tsilkworm, giant silkworm, wild wilkworm\nn02302969\tluna moth, Actias luna\nn02303284\tcecropia, cecropia moth, Hyalophora cecropia\nn02303585\tcynthia moth, Samia cynthia, Samia walkeri\nn02303777\tailanthus silkworm, Samia cynthia\nn02304036\tio moth, Automeris io\nn02304432\tpolyphemus moth, Antheraea polyphemus\nn02304657\tpernyi moth, Antheraea pernyi\nn02304797\ttussah, tusseh, tussur, tussore, tusser, Antheraea mylitta\nn02305085\tatlas moth, Atticus atlas\nn02305407\tarctiid, arctiid moth\nn02305636\ttiger moth\nn02305929\tcinnabar, cinnabar moth, Callimorpha jacobeae\nn02306433\tlasiocampid, lasiocampid moth\nn02306825\teggar, egger\nn02307176\ttent-caterpillar moth, Malacosoma americana\nn02307325\ttent caterpillar\nn02307515\ttent-caterpillar moth, Malacosoma disstria\nn02307681\tforest tent caterpillar, Malacosoma disstria\nn02307910\tlappet, lappet moth\nn02308033\tlappet 
caterpillar\nn02308139\twebworm\nn02308471\twebworm moth\nn02308618\tHyphantria cunea\nn02308735\tfall webworm, Hyphantria cunea\nn02309120\tgarden webworm, Loxostege similalis\nn02309242\tinstar\nn02309337\tcaterpillar\nn02309841\tcorn borer, Pyrausta nubilalis\nn02310000\tbollworm\nn02310149\tpink bollworm, Gelechia gossypiella\nn02310334\tcorn earworm, cotton bollworm, tomato fruitworm, tobacco budworm, vetchworm, Heliothis zia\nn02310585\tcabbageworm, Pieris rapae\nn02310717\twoolly bear, woolly bear caterpillar\nn02310941\twoolly bear moth\nn02311060\tlarva\nn02311617\tnymph\nn02311748\tleptocephalus\nn02312006\tgrub\nn02312175\tmaggot\nn02312325\tleatherjacket\nn02312427\tpupa\nn02312640\tchrysalis\nn02312912\timago\nn02313008\tqueen\nn02313360\tphoronid\nn02313709\tbryozoan, polyzoan, sea mat, sea moss, moss animal\nn02315487\tbrachiopod, lamp shell, lampshell\nn02315821\tpeanut worm, sipunculid\nn02316707\techinoderm\nn02317335\tstarfish, sea star\nn02317781\tbrittle star, brittle-star, serpent star\nn02318167\tbasket star, basket fish\nn02318687\tAstrophyton muricatum\nn02319095\tsea urchin\nn02319308\tedible sea urchin, Echinus esculentus\nn02319555\tsand dollar\nn02319829\theart urchin\nn02320127\tcrinoid\nn02320465\tsea lily\nn02321170\tfeather star, comatulid\nn02321529\tsea cucumber, holothurian\nn02322047\ttrepang, Holothuria edulis\nn02322992\tDuplicidentata\nn02323449\tlagomorph, gnawing mammal\nn02323902\tleporid, leporid mammal\nn02324045\trabbit, coney, cony\nn02324431\trabbit ears\nn02324514\tlapin\nn02324587\tbunny, bunny rabbit\nn02324850\tEuropean rabbit, Old World rabbit, Oryctolagus cuniculus\nn02325366\twood rabbit, cottontail, cottontail rabbit\nn02325722\teastern cottontail, Sylvilagus floridanus\nn02325884\tswamp rabbit, canecutter, swamp hare, Sylvilagus aquaticus\nn02326074\tmarsh hare, swamp rabbit, Sylvilagus palustris\nn02326432\thare\nn02326763\tleveret\nn02326862\tEuropean hare, Lepus 
europaeus\nn02327028\tjackrabbit\nn02327175\twhite-tailed jackrabbit, whitetail jackrabbit, Lepus townsendi\nn02327435\tblacktail jackrabbit, Lepus californicus\nn02327656\tpolar hare, Arctic hare, Lepus arcticus\nn02327842\tsnowshoe hare, snowshoe rabbit, varying hare, Lepus americanus\nn02328009\tBelgian hare, leporide\nn02328150\tAngora, Angora rabbit\nn02328429\tpika, mouse hare, rock rabbit, coney, cony\nn02328820\tlittle chief hare, Ochotona princeps\nn02328942\tcollared pika, Ochotona collaris\nn02329401\trodent, gnawer\nn02330245\tmouse\nn02331046\trat\nn02331309\tpocket rat\nn02331842\tmurine\nn02332156\thouse mouse, Mus musculus\nn02332447\tharvest mouse, Micromyx minutus\nn02332755\tfield mouse, fieldmouse\nn02332954\tnude mouse\nn02333190\tEuropean wood mouse, Apodemus sylvaticus\nn02333546\tbrown rat, Norway rat, Rattus norvegicus\nn02333733\twharf rat\nn02333819\tsewer rat\nn02333909\tblack rat, roof rat, Rattus rattus\nn02334201\tbandicoot rat, mole rat\nn02334460\tjerboa rat\nn02334728\tkangaroo mouse\nn02335127\twater rat\nn02335231\tbeaver rat\nn02336011\tNew World mouse\nn02336275\tAmerican harvest mouse, harvest mouse\nn02336641\twood mouse\nn02336826\twhite-footed mouse, vesper mouse, Peromyscus leucopus\nn02337001\tdeer mouse, Peromyscus maniculatus\nn02337171\tcactus mouse, Peromyscus eremicus\nn02337332\tcotton mouse, Peromyscus gossypinus\nn02337598\tpygmy mouse, Baiomys taylori\nn02337902\tgrasshopper mouse\nn02338145\tmuskrat, musquash, Ondatra zibethica\nn02338449\tround-tailed muskrat, Florida water rat, Neofiber alleni\nn02338722\tcotton rat, Sigmodon hispidus\nn02338901\twood rat, wood-rat\nn02339282\tdusky-footed wood rat\nn02339376\tvole, field mouse\nn02339922\tpackrat, pack rat, trade rat, bushytail woodrat, Neotoma cinerea\nn02340186\tdusky-footed woodrat, Neotoma fuscipes\nn02340358\teastern woodrat, Neotoma floridana\nn02340640\trice rat, Oryzomys palustris\nn02340930\tpine vole, pine mouse, Pitymys pinetorum\nn02341288\tmeadow 
vole, meadow mouse, Microtus pennsylvaticus\nn02341475\twater vole, Richardson vole, Microtus richardsoni\nn02341616\tprairie vole, Microtus ochrogaster\nn02341974\twater vole, water rat, Arvicola amphibius\nn02342250\tred-backed mouse, redback vole\nn02342534\tphenacomys\nn02342885\thamster\nn02343058\tEurasian hamster, Cricetus cricetus\nn02343320\tgolden hamster, Syrian hamster, Mesocricetus auratus\nn02343772\tgerbil, gerbille\nn02344175\tjird\nn02344270\ttamarisk gerbil, Meriones unguiculatus\nn02344408\tsand rat, Meriones longifrons\nn02344528\tlemming\nn02344918\tEuropean lemming, Lemmus lemmus\nn02345078\tbrown lemming, Lemmus trimucronatus\nn02345340\tgrey lemming, gray lemming, red-backed lemming\nn02345600\tpied lemming\nn02345774\tHudson bay collared lemming, Dicrostonyx hudsonius\nn02345997\tsouthern bog lemming, Synaptomys cooperi\nn02346170\tnorthern bog lemming, Synaptomys borealis\nn02346627\tporcupine, hedgehog\nn02346998\tOld World porcupine\nn02347274\tbrush-tailed porcupine, brush-tail porcupine\nn02347573\tlong-tailed porcupine, Trichys lipura\nn02347744\tNew World porcupine\nn02348173\tCanada porcupine, Erethizon dorsatum\nn02348788\tpocket mouse\nn02349205\tsilky pocket mouse, Perognathus flavus\nn02349390\tplains pocket mouse, Perognathus flavescens\nn02349557\thispid pocket mouse, Perognathus hispidus\nn02349847\tMexican pocket mouse, Liomys irroratus\nn02350105\tkangaroo rat, desert rat, Dipodomys phillipsii\nn02350357\tOrd kangaroo rat, Dipodomys ordi\nn02350670\tkangaroo mouse, dwarf pocket rat\nn02350989\tjumping mouse\nn02351343\tmeadow jumping mouse, Zapus hudsonius\nn02351870\tjerboa\nn02352002\ttypical jerboa\nn02352290\tJaculus jaculus\nn02352591\tdormouse\nn02352932\tloir, Glis glis\nn02353172\thazel mouse, Muscardinus avellanarius\nn02353411\tlerot\nn02353861\tgopher, pocket gopher, pouched rat\nn02354162\tplains pocket gopher, Geomys bursarius\nn02354320\tsoutheastern pocket gopher, Geomys pinetis\nn02354621\tvalley pocket 
gopher, Thomomys bottae\nn02354781\tnorthern pocket gopher, Thomomys talpoides\nn02355227\tsquirrel\nn02355477\ttree squirrel\nn02356381\teastern grey squirrel, eastern gray squirrel, cat squirrel, Sciurus carolinensis\nn02356612\twestern grey squirrel, western gray squirrel, Sciurus griseus\nn02356798\tfox squirrel, eastern fox squirrel, Sciurus niger\nn02356977\tblack squirrel\nn02357111\tred squirrel, cat squirrel, Sciurus vulgaris\nn02357401\tAmerican red squirrel, spruce squirrel, red squirrel, Sciurus hudsonicus, Tamiasciurus hudsonicus\nn02357585\tchickeree, Douglas squirrel, Tamiasciurus douglasi\nn02357911\tantelope squirrel, whitetail antelope squirrel, antelope chipmunk, Citellus leucurus\nn02358091\tground squirrel, gopher, spermophile\nn02358390\tmantled ground squirrel, Citellus lateralis\nn02358584\tsuslik, souslik, Citellus citellus\nn02358712\tflickertail, Richardson ground squirrel, Citellus richardsoni\nn02358890\trock squirrel, Citellus variegatus\nn02359047\tArctic ground squirrel, parka squirrel, Citellus parryi\nn02359324\tprairie dog, prairie marmot\nn02359556\tblacktail prairie dog, Cynomys ludovicianus\nn02359667\twhitetail prairie dog, Cynomys gunnisoni\nn02359915\teastern chipmunk, hackee, striped squirrel, ground squirrel, Tamias striatus\nn02360282\tchipmunk\nn02360480\tbaronduki, baranduki, barunduki, burunduki, Eutamius asiaticus, Eutamius sibiricus\nn02360781\tAmerican flying squirrel\nn02360933\tsouthern flying squirrel, Glaucomys volans\nn02361090\tnorthern flying squirrel, Glaucomys sabrinus\nn02361337\tmarmot\nn02361587\tgroundhog, woodchuck, Marmota monax\nn02361706\thoary marmot, whistler, whistling marmot, Marmota caligata\nn02361850\tyellowbelly marmot, rockchuck, Marmota flaviventris\nn02362194\tAsiatic flying squirrel\nn02363005\tbeaver\nn02363245\tOld World beaver, Castor fiber\nn02363351\tNew World beaver, Castor canadensis\nn02363996\tmountain beaver, sewellel, Aplodontia rufa\nn02364520\tcavy\nn02364673\tguinea pig, 
Cavia cobaya\nn02364840\taperea, wild cavy, Cavia porcellus\nn02365108\tmara, Dolichotis patagonum\nn02365480\tcapybara, capibara, Hydrochoerus hydrochaeris\nn02366002\tagouti, Dasyprocta aguti\nn02366301\tpaca, Cuniculus paca\nn02366579\tmountain paca\nn02366959\tcoypu, nutria, Myocastor coypus\nn02367492\tchinchilla, Chinchilla laniger\nn02367812\tmountain chinchilla, mountain viscacha\nn02368116\tviscacha, chinchillon, Lagostomus maximus\nn02368399\tabrocome, chinchilla rat, rat chinchilla\nn02368821\tmole rat\nn02369293\tmole rat\nn02369555\tsand rat\nn02369680\tnaked mole rat\nn02369935\tqueen, queen mole rat\nn02370137\tDamaraland mole rat\nn02370525\tUngulata\nn02370806\tungulate, hoofed mammal\nn02371344\tunguiculate, unguiculate mammal\nn02372140\tdinoceras, uintathere\nn02372584\thyrax, coney, cony, dassie, das\nn02372952\trock hyrax, rock rabbit, Procavia capensis\nn02373336\todd-toed ungulate, perissodactyl, perissodactyl mammal\nn02374149\tequine, equid\nn02374451\thorse, Equus caballus\nn02375302\troan\nn02375438\tstablemate, stable companion\nn02375757\tgee-gee\nn02375862\teohippus, dawn horse\nn02376542\tfoal\nn02376679\tfilly\nn02376791\tcolt\nn02376918\tmale horse\nn02377063\tridgeling, ridgling, ridgel, ridgil\nn02377181\tstallion, entire\nn02377291\tstud, studhorse\nn02377388\tgelding\nn02377480\tmare, female horse\nn02377603\tbroodmare, stud mare\nn02377703\tsaddle horse, riding horse, mount\nn02378149\tremount\nn02378299\tpalfrey\nn02378415\twarhorse\nn02378541\tcavalry horse\nn02378625\tcharger, courser\nn02378755\tsteed\nn02378870\tprancer\nn02378969\thack\nn02379081\tcow pony\nn02379183\tquarter horse\nn02379329\tMorgan\nn02379430\tTennessee walker, Tennessee walking horse, Walking horse, Plantation walking horse\nn02379630\tAmerican saddle horse\nn02379743\tAppaloosa\nn02379908\tArabian, Arab\nn02380052\tLippizan, Lipizzan, Lippizaner\nn02380335\tpony\nn02380464\tpolo pony\nn02380583\tmustang\nn02380745\tbronco, bronc, 
broncho\nn02380875\tbucking bronco\nn02381004\tbuckskin\nn02381119\tcrowbait, crow-bait\nn02381261\tdun\nn02381364\tgrey, gray\nn02381460\twild horse\nn02381609\ttarpan, Equus caballus gomelini\nn02381831\tPrzewalski's horse, Przevalski's horse, Equus caballus przewalskii, Equus caballus przevalskii\nn02382039\tcayuse, Indian pony\nn02382132\thack\nn02382204\thack, jade, nag, plug\nn02382338\tplow horse, plough horse\nn02382437\tpony\nn02382635\tShetland pony\nn02382750\tWelsh pony\nn02382850\tExmoor\nn02382948\tracehorse, race horse, bangtail\nn02383231\tthoroughbred\nn02384741\tsteeplechaser\nn02384858\tracer\nn02385002\tfinisher\nn02385098\tpony\nn02385214\tyearling\nn02385580\tdark horse\nn02385676\tmudder\nn02385776\tnonstarter\nn02385898\tstalking-horse\nn02386014\tharness horse\nn02386141\tcob\nn02386224\thackney\nn02386310\tworkhorse\nn02386496\tdraft horse, draught horse, dray horse\nn02386746\tpackhorse\nn02386853\tcarthorse, cart horse, drayhorse\nn02386968\tClydesdale\nn02387093\tPercheron\nn02387254\tfarm horse, dobbin\nn02387346\tshire, shire horse\nn02387452\tpole horse, poler\nn02387722\tpost horse, post-horse, poster\nn02387887\tcoach horse\nn02387983\tpacer\nn02388143\tpacer, pacemaker, pacesetter\nn02388276\ttrotting horse, trotter\nn02388453\tpole horse\nn02388588\tstepper, high stepper\nn02388735\tchestnut\nn02388832\tliver chestnut\nn02388917\tbay\nn02389026\tsorrel\nn02389128\tpalomino\nn02389261\tpinto\nn02389346\tass\nn02389559\tdomestic ass, donkey, Equus asinus\nn02389779\tburro\nn02389865\tmoke\nn02389943\tjack, jackass\nn02390015\tjennet, jenny, jenny ass\nn02390101\tmule\nn02390258\thinny\nn02390454\twild ass\nn02390640\tAfrican wild ass, Equus asinus\nn02390738\tkiang, Equus kiang\nn02390834\tonager, Equus hemionus\nn02390938\tchigetai, dziggetai, Equus hemionus hemionus\nn02391049\tzebra\nn02391234\tcommon zebra, Burchell's zebra, Equus Burchelli\nn02391373\tmountain zebra, Equus zebra zebra\nn02391508\tgrevy's zebra, Equus 
grevyi\nn02391617\tquagga, Equus quagga\nn02391994\trhinoceros, rhino\nn02392434\tIndian rhinoceros, Rhinoceros unicornis\nn02392555\twoolly rhinoceros, Rhinoceros antiquitatis\nn02392824\twhite rhinoceros, Ceratotherium simum, Diceros simus\nn02393161\tblack rhinoceros, Diceros bicornis\nn02393580\ttapir\nn02393807\tNew World tapir, Tapirus terrestris\nn02393940\tMalayan tapir, Indian tapir, Tapirus indicus\nn02394477\teven-toed ungulate, artiodactyl, artiodactyl mammal\nn02395003\tswine\nn02395406\thog, pig, grunter, squealer, Sus scrofa\nn02395694\tpiglet, piggy, shoat, shote\nn02395855\tsucking pig\nn02395931\tporker\nn02396014\tboar\nn02396088\tsow\nn02396157\trazorback, razorback hog, razorbacked hog\nn02396427\twild boar, boar, Sus scrofa\nn02396796\tbabirusa, babiroussa, babirussa, Babyrousa Babyrussa\nn02397096\twarthog\nn02397529\tpeccary, musk hog\nn02397744\tcollared peccary, javelina, Tayassu angulatus, Tayassu tajacu, Peccari angulatus\nn02397987\twhite-lipped peccary, Tayassu pecari\nn02398521\thippopotamus, hippo, river horse, Hippopotamus amphibius\nn02399000\truminant\nn02401031\tbovid\nn02402010\tbovine\nn02402175\tox, wild ox\nn02402425\tcattle, cows, kine, oxen, Bos taurus\nn02403003\tox\nn02403153\tstirk\nn02403231\tbullock, steer\nn02403325\tbull\nn02403454\tcow, moo-cow\nn02403740\theifer\nn02403820\tbullock\nn02403920\tdogie, dogy, leppy\nn02404028\tmaverick\nn02404186\tbeef, beef cattle\nn02404432\tlonghorn, Texas longhorn\nn02404573\tBrahman, Brahma, Brahmin, Bos indicus\nn02404906\tzebu\nn02405101\taurochs, urus, Bos primigenius\nn02405302\tyak, Bos grunniens\nn02405440\tbanteng, banting, tsine, Bos banteng\nn02405577\tWelsh, Welsh Black\nn02405692\tred poll\nn02405799\tSanta Gertrudis\nn02405929\tAberdeen Angus, Angus, black Angus\nn02406046\tAfricander\nn02406174\tdairy cattle, dairy cow, milch cow, milk cow, milcher, milker\nn02406432\tAyrshire\nn02406533\tBrown 
Swiss\nn02406647\tCharolais\nn02406749\tJersey\nn02406859\tDevon\nn02406952\tgrade\nn02407071\tDurham, shorthorn\nn02407172\tmilking shorthorn\nn02407276\tGalloway\nn02407390\tFriesian, Holstein, Holstein-Friesian\nn02407521\tGuernsey\nn02407625\tHereford, whiteface\nn02407763\tcattalo, beefalo\nn02407959\tOld World buffalo, buffalo\nn02408429\twater buffalo, water ox, Asiatic buffalo, Bubalus bubalis\nn02408660\tIndian buffalo\nn02408817\tcarabao\nn02409038\tanoa, dwarf buffalo, Anoa depressicornis\nn02409202\ttamarau, tamarao, Bubalus mindorensis, Anoa mindorensis\nn02409508\tCape buffalo, Synercus caffer\nn02409870\tAsian wild ox\nn02410011\tgaur, Bibos gaurus\nn02410141\tgayal, mithan, Bibos frontalis\nn02410509\tbison\nn02410702\tAmerican bison, American buffalo, buffalo, Bison bison\nn02410900\twisent, aurochs, Bison bonasus\nn02411206\tmusk ox, musk sheep, Ovibos moschatus\nn02411705\tsheep\nn02411999\tewe\nn02412080\tram, tup\nn02412210\twether\nn02412440\tlamb\nn02412629\tlambkin\nn02412700\tbaa-lamb\nn02412787\thog, hogget, hogg\nn02412909\tteg\nn02412977\tPersian lamb\nn02413050\tblack sheep\nn02413131\tdomestic sheep, Ovis aries\nn02413484\tCotswold\nn02413593\tHampshire, Hampshire down\nn02413717\tLincoln\nn02413824\tExmoor\nn02413917\tCheviot\nn02414043\tbroadtail, caracul, karakul\nn02414209\tlongwool\nn02414290\tmerino, merino sheep\nn02414442\tRambouillet\nn02414578\twild sheep\nn02414763\targali, argal, Ovis ammon\nn02414904\tMarco Polo sheep, Marco Polo's sheep, Ovis poli\nn02415130\turial, Ovis vignei\nn02415253\tDall sheep, Dall's sheep, white sheep, Ovis montana dalli\nn02415435\tmountain sheep\nn02415577\tbighorn, bighorn sheep, cimarron, Rocky Mountain bighorn, Rocky Mountain sheep, Ovis canadensis\nn02415829\tmouflon, moufflon, Ovis musimon\nn02416104\taoudad, arui, audad, Barbary sheep, maned sheep, Ammotragus lervia\nn02416519\tgoat, caprine animal\nn02416820\tkid\nn02416880\tbilly, billy goat, he-goat\nn02416964\tnanny, nanny-goat, 
she-goat\nn02417070\tdomestic goat, Capra hircus\nn02417242\tCashmere goat, Kashmir goat\nn02417387\tAngora, Angora goat\nn02417534\twild goat\nn02417663\tbezoar goat, pasang, Capra aegagrus\nn02417785\tmarkhor, markhoor, Capra falconeri\nn02417914\tibex, Capra ibex\nn02418064\tgoat antelope\nn02418465\tmountain goat, Rocky Mountain goat, Oreamnos americanus\nn02418770\tgoral, Naemorhedus goral\nn02419056\tserow\nn02419336\tchamois, Rupicapra rupicapra\nn02419634\ttakin, gnu goat, Budorcas taxicolor\nn02419796\tantelope\nn02420509\tblackbuck, black buck, Antilope cervicapra\nn02420828\tgerenuk, Litocranius walleri\nn02421136\taddax, Addax nasomaculatus\nn02421449\tgnu, wildebeest\nn02421792\tdik-dik\nn02422106\thartebeest\nn02422391\tsassaby, topi, Damaliscus lunatus\nn02422699\timpala, Aepyceros melampus\nn02423022\tgazelle\nn02423218\tThomson's gazelle, Gazella thomsoni\nn02423362\tGazella subgutturosa\nn02423589\tspringbok, springbuck, Antidorcas marsupialis, Antidorcas euchore\nn02424085\tbongo, Tragelaphus eurycerus, Boocercus eurycerus\nn02424305\tkudu, koodoo, koudou\nn02424486\tgreater kudu, Tragelaphus strepsiceros\nn02424589\tlesser kudu, Tragelaphus imberbis\nn02424695\tharnessed antelope\nn02424909\tnyala, Tragelaphus angasi\nn02425086\tmountain nyala, Tragelaphus buxtoni\nn02425228\tbushbuck, guib, Tragelaphus scriptus\nn02425532\tnilgai, nylghai, nylghau, blue bull, Boselaphus tragocamelus\nn02425887\tsable antelope, Hippotragus niger\nn02426176\tsaiga, Saiga tatarica\nn02426481\tsteenbok, steinbok, Raphicerus campestris\nn02426813\teland\nn02427032\tcommon eland, Taurotragus oryx\nn02427183\tgiant eland, Taurotragus derbianus\nn02427470\tkob, Kobus kob\nn02427576\tlechwe, Kobus leche\nn02427724\twaterbuck\nn02428089\tpuku, Adenota vardoni\nn02428349\toryx, pasang\nn02428508\tgemsbok, gemsbuck, Oryx gazella\nn02428842\tforest goat, spindle horn, Pseudoryx nghetinhensis\nn02429456\tpronghorn, prongbuck, pronghorn antelope, American antelope, 
Antilocapra americana\nn02430045\tdeer, cervid\nn02430559\tstag\nn02430643\troyal, royal stag\nn02430748\tpricket\nn02430830\tfawn\nn02431122\tred deer, elk, American elk, wapiti, Cervus elaphus\nn02431337\thart, stag\nn02431441\thind\nn02431542\tbrocket\nn02431628\tsambar, sambur, Cervus unicolor\nn02431785\twapiti, elk, American elk, Cervus elaphus canadensis\nn02431976\tJapanese deer, sika, Cervus nipon, Cervus sika\nn02432291\tVirginia deer, white tail, whitetail, white-tailed deer, whitetail deer, Odocoileus Virginianus\nn02432511\tmule deer, burro deer, Odocoileus hemionus\nn02432704\tblack-tailed deer, blacktail deer, blacktail, Odocoileus hemionus columbianus\nn02432983\telk, European elk, moose, Alces alces\nn02433318\tfallow deer, Dama dama\nn02433546\troe deer, Capreolus capreolus\nn02433729\troebuck\nn02433925\tcaribou, reindeer, Greenland caribou, Rangifer tarandus\nn02434190\twoodland caribou, Rangifer caribou\nn02434415\tbarren ground caribou, Rangifer arcticus\nn02434712\tbrocket\nn02434954\tmuntjac, barking deer\nn02435216\tmusk deer, Moschus moschiferus\nn02435517\tpere david's deer, elaphure, Elaphurus davidianus\nn02435853\tchevrotain, mouse deer\nn02436224\tkanchil, Tragulus kanchil\nn02436353\tnapu, Tragulus Javanicus\nn02436645\twater chevrotain, water deer, Hyemoschus aquaticus\nn02437136\tcamel\nn02437312\tArabian camel, dromedary, Camelus dromedarius\nn02437482\tBactrian camel, Camelus bactrianus\nn02437616\tllama\nn02437971\tdomestic llama, Lama peruana\nn02438173\tguanaco, Lama guanicoe\nn02438272\talpaca, Lama pacos\nn02438580\tvicuna, Vicugna vicugna\nn02439033\tgiraffe, camelopard, Giraffa camelopardalis\nn02439398\tokapi, Okapia johnstoni\nn02441326\tmusteline mammal, mustelid, musteline\nn02441942\tweasel\nn02442172\termine, shorttail weasel, Mustela erminea\nn02442336\tstoat\nn02442446\tNew World least weasel, Mustela rixosa\nn02442572\tOld World least weasel, Mustela nivalis\nn02442668\tlongtail weasel, long-tailed weasel, Mustela 
frenata\nn02442845\tmink\nn02443015\tAmerican mink, Mustela vison\nn02443114\tpolecat, fitch, foulmart, foumart, Mustela putorius\nn02443346\tferret\nn02443484\tblack-footed ferret, ferret, Mustela nigripes\nn02443808\tmuishond\nn02443959\tsnake muishond, Poecilogale albinucha\nn02444251\tstriped muishond, Ictonyx striata\nn02444819\totter\nn02445004\triver otter, Lutra canadensis\nn02445171\tEurasian otter, Lutra lutra\nn02445394\tsea otter, Enhydra lutris\nn02445715\tskunk, polecat, wood pussy\nn02446206\tstriped skunk, Mephitis mephitis\nn02446352\thooded skunk, Mephitis macroura\nn02446645\thog-nosed skunk, hognosed skunk, badger skunk, rooter skunk, Conepatus leuconotus\nn02447021\tspotted skunk, little spotted skunk, Spilogale putorius\nn02447366\tbadger\nn02447762\tAmerican badger, Taxidea taxus\nn02448060\tEurasian badger, Meles meles\nn02448318\tratel, honey badger, Mellivora capensis\nn02448633\tferret badger\nn02448885\thog badger, hog-nosed badger, sand badger, Arctonyx collaris\nn02449183\twolverine, carcajou, skunk bear, Gulo luscus\nn02449350\tglutton, Gulo gulo, wolverine\nn02449699\tgrison, Grison vittatus, Galictis vittatus\nn02450034\tmarten, marten cat\nn02450295\tpine marten, Martes martes\nn02450426\tsable, Martes zibellina\nn02450561\tAmerican marten, American sable, Martes americana\nn02450677\tstone marten, beech marten, Martes foina\nn02450829\tfisher, pekan, fisher cat, black cat, Martes pennanti\nn02451125\tyellow-throated marten, Charronia flavigula\nn02451415\ttayra, taira, Eira barbara\nn02451575\tfictional animal\nn02453108\tpachyderm\nn02453611\tedentate\nn02454379\tarmadillo\nn02454794\tpeba, nine-banded armadillo, Texas armadillo, Dasypus novemcinctus\nn02455135\tapar, three-banded armadillo, Tolypeutes tricinctus\nn02455428\ttatouay, cabassous, Cabassous unicinctus\nn02455720\tpeludo, poyou, Euphractus sexcinctus\nn02456008\tgiant armadillo, tatou, tatu, Priodontes giganteus\nn02456275\tpichiciago, pichiciego, fairy armadillo, 
chlamyphore, Chlamyphorus truncatus\nn02456962\tsloth, tree sloth\nn02457408\tthree-toed sloth, ai, Bradypus tridactylus\nn02457945\ttwo-toed sloth, unau, unai, Choloepus didactylus\nn02458135\ttwo-toed sloth, unau, unai, Choloepus hoffmanni\nn02458517\tmegatherian, megatheriid, megatherian mammal\nn02459190\tmylodontid\nn02460009\tanteater, New World anteater\nn02460451\tant bear, giant anteater, great anteater, tamanoir, Myrmecophaga jubata\nn02460817\tsilky anteater, two-toed anteater, Cyclopes didactylus\nn02461128\ttamandua, tamandu, lesser anteater, Tamandua tetradactyla\nn02461830\tpangolin, scaly anteater, anteater\nn02462213\tcoronet\nn02469248\tscapular\nn02469472\ttadpole, polliwog, pollywog\nn02469914\tprimate\nn02470238\tsimian\nn02470325\tape\nn02470709\tanthropoid\nn02470899\tanthropoid ape\nn02471300\thominoid\nn02471762\thominid\nn02472293\thomo, man, human being, human\nn02472987\tworld, human race, humanity, humankind, human beings, humans, mankind, man\nn02473307\tHomo erectus\nn02473554\tPithecanthropus, Pithecanthropus erectus, genus Pithecanthropus\nn02473720\tJava man, Trinil man\nn02473857\tPeking man\nn02473983\tSinanthropus, genus Sinanthropus\nn02474110\tHomo soloensis\nn02474282\tJavanthropus, genus Javanthropus\nn02474605\tHomo habilis\nn02474777\tHomo sapiens\nn02475078\tNeandertal man, Neanderthal man, Neandertal, Neanderthal, Homo sapiens neanderthalensis\nn02475358\tCro-magnon\nn02475669\tHomo sapiens sapiens, modern man\nn02476219\taustralopithecine\nn02476567\tAustralopithecus afarensis\nn02476870\tAustralopithecus africanus\nn02477028\tAustralopithecus boisei\nn02477187\tZinjanthropus, genus Zinjanthropus\nn02477329\tAustralopithecus robustus\nn02477516\tParanthropus, genus Paranthropus\nn02477782\tSivapithecus\nn02478239\trudapithecus, Dryopithecus Rudapithecus hungaricus\nn02478875\tproconsul\nn02479332\tAegyptopithecus\nn02480153\tgreat ape, pongid\nn02480495\torangutan, orang, orangutang, Pongo pygmaeus\nn02480855\tgorilla, 
Gorilla gorilla\nn02481103\twestern lowland gorilla, Gorilla gorilla gorilla\nn02481235\teastern lowland gorilla, Gorilla gorilla grauri\nn02481366\tmountain gorilla, Gorilla gorilla beringei\nn02481500\tsilverback\nn02481823\tchimpanzee, chimp, Pan troglodytes\nn02482060\twestern chimpanzee, Pan troglodytes verus\nn02482286\teastern chimpanzee, Pan troglodytes schweinfurthii\nn02482474\tcentral chimpanzee, Pan troglodytes troglodytes\nn02482650\tpygmy chimpanzee, bonobo, Pan paniscus\nn02483092\tlesser ape\nn02483362\tgibbon, Hylobates lar\nn02483708\tsiamang, Hylobates syndactylus, Symphalangus syndactylus\nn02484322\tmonkey\nn02484473\tOld World monkey, catarrhine\nn02484975\tguenon, guenon monkey\nn02485225\ttalapoin, Cercopithecus talapoin\nn02485371\tgrivet, Cercopithecus aethiops\nn02485536\tvervet, vervet monkey, Cercopithecus aethiops pygerythrus\nn02485688\tgreen monkey, African green monkey, Cercopithecus aethiops sabaeus\nn02485988\tmangabey\nn02486261\tpatas, hussar monkey, Erythrocebus patas\nn02486410\tbaboon\nn02486657\tchacma, chacma baboon, Papio ursinus\nn02486908\tmandrill, Mandrillus sphinx\nn02487079\tdrill, Mandrillus leucophaeus\nn02487347\tmacaque\nn02487547\trhesus, rhesus monkey, Macaca mulatta\nn02487675\tbonnet macaque, bonnet monkey, capped macaque, crown monkey, Macaca radiata\nn02487847\tBarbary ape, Macaca sylvana\nn02488003\tcrab-eating macaque, croo monkey, Macaca irus\nn02488291\tlangur\nn02488415\tentellus, hanuman, Presbytes entellus, Semnopithecus entellus\nn02488702\tcolobus, colobus monkey\nn02488894\tguereza, Colobus guereza\nn02489166\tproboscis monkey, Nasalis larvatus\nn02489589\tNew World monkey, platyrrhine, platyrrhinian\nn02490219\tmarmoset\nn02490597\ttrue marmoset\nn02490811\tpygmy marmoset, Cebuella pygmaea\nn02491107\ttamarin, lion monkey, lion marmoset, leoncita\nn02491329\tsilky tamarin, Leontocebus rosalia\nn02491474\tpinche, Leontocebus oedipus\nn02492035\tcapuchin, ringtail, Cebus 
capucinus\nn02492356\tdouroucouli, Aotus trivirgatus\nn02492660\thowler monkey, howler\nn02492948\tsaki\nn02493224\tuakari\nn02493509\ttiti, titi monkey\nn02493793\tspider monkey, Ateles geoffroyi\nn02494079\tsquirrel monkey, Saimiri sciureus\nn02494383\twoolly monkey\nn02495242\ttree shrew\nn02496052\tprosimian\nn02496913\tlemur\nn02497673\tMadagascar cat, ring-tailed lemur, Lemur catta\nn02498153\taye-aye, Daubentonia madagascariensis\nn02498743\tslender loris, Loris gracilis\nn02499022\tslow loris, Nycticebus tardigradua, Nycticebus pygmaeus\nn02499316\tpotto, kinkajou, Perodicticus potto\nn02499568\tangwantibo, golden potto, Arctocebus calabarensis\nn02499808\tgalago, bushbaby, bush baby\nn02500267\tindri, indris, Indri indri, Indri brevicaudatus\nn02500596\twoolly indris, Avahi laniger\nn02501583\ttarsier\nn02501923\tTarsius syrichta\nn02502006\tTarsius glis\nn02502514\tflying lemur, flying cat, colugo\nn02502807\tCynocephalus variegatus\nn02503127\tproboscidean, proboscidian\nn02503517\telephant\nn02503756\trogue elephant\nn02504013\tIndian elephant, Elephas maximus\nn02504458\tAfrican elephant, Loxodonta africana\nn02504770\tmammoth\nn02505063\twoolly mammoth, northern mammoth, Mammuthus primigenius\nn02505238\tcolumbian mammoth, Mammuthus columbi\nn02505485\timperial mammoth, imperial elephant, Archidiskidon imperator\nn02505998\tmastodon, mastodont\nn02506947\tplantigrade mammal, plantigrade\nn02507148\tdigitigrade mammal, digitigrade\nn02507649\tprocyonid\nn02508021\traccoon, racoon\nn02508213\tcommon raccoon, common racoon, coon, ringtail, Procyon lotor\nn02508346\tcrab-eating raccoon, Procyon cancrivorus\nn02508742\tbassarisk, cacomistle, cacomixle, coon cat, raccoon fox, ringtail, ring-tailed cat, civet cat, miner's cat, Bassariscus astutus\nn02509197\tkinkajou, honey bear, potto, Potos flavus, Potos caudivolvulus\nn02509515\tcoati, coati-mondi, coati-mundi, coon cat, Nasua narica\nn02509815\tlesser panda, red panda, panda, bear cat, cat bear, Ailurus 
fulgens\nn02510455\tgiant panda, panda, panda bear, coon bear, Ailuropoda melanoleuca\nn02511730\ttwitterer\nn02512053\tfish\nn02512752\tfingerling\nn02512830\tgame fish, sport fish\nn02512938\tfood fish\nn02513248\trough fish\nn02513355\tgroundfish, bottom fish\nn02513560\tyoung fish\nn02513727\tparr\nn02513805\tmouthbreeder\nn02513939\tspawner\nn02514041\tbarracouta, snoek\nn02515214\tcrossopterygian, lobefin, lobe-finned fish\nn02515713\tcoelacanth, Latimeria chalumnae\nn02516188\tlungfish\nn02516776\tceratodus\nn02517442\tcatfish, siluriform fish\nn02517938\tsilurid, silurid fish\nn02518324\tEuropean catfish, sheatfish, Silurus glanis\nn02518622\telectric catfish, Malopterurus electricus\nn02519148\tbullhead, bullhead catfish\nn02519340\thorned pout, hornpout, pout, Ameiurus Melas\nn02519472\tbrown bullhead\nn02519686\tchannel catfish, channel cat, Ictalurus punctatus\nn02519862\tblue catfish, blue cat, blue channel catfish, blue channel cat\nn02520147\tflathead catfish, mudcat, goujon, shovelnose catfish, spoonbill catfish, Pylodictus olivaris\nn02520525\tarmored catfish\nn02520810\tsea catfish\nn02521646\tgadoid, gadoid fish\nn02522399\tcod, codfish\nn02522637\tcodling\nn02522722\tAtlantic cod, Gadus morhua\nn02522866\tPacific cod, Alaska cod, Gadus macrocephalus\nn02523110\twhiting, Merlangus merlangus, Gadus merlangus\nn02523427\tburbot, eelpout, ling, cusk, Lota lota\nn02523877\thaddock, Melanogrammus aeglefinus\nn02524202\tpollack, pollock, Pollachius pollachius\nn02524524\thake\nn02524659\tsilver hake, Merluccius bilinearis, whiting\nn02524928\tling\nn02525382\tcusk, torsk, Brosme brosme\nn02525703\tgrenadier, rattail, rattail fish\nn02526121\teel\nn02526425\telver\nn02526818\tcommon eel, freshwater eel\nn02527057\ttuna, Anguilla sucklandii\nn02527271\tmoray, moray eel\nn02527622\tconger, conger eel\nn02528163\tteleost fish, teleost, teleostan\nn02529293\tbeaked salmon, sandfish, Gonorhynchus gonorhynchus\nn02529772\tclupeid fish, 
clupeid\nn02530052\twhitebait\nn02530188\tbrit, britt\nn02530421\tshad\nn02530637\tcommon American shad, Alosa sapidissima\nn02530831\triver shad, Alosa chrysocloris\nn02530999\tallice shad, allis shad, allice, allis, Alosa alosa\nn02531114\talewife, Alosa pseudoharengus, Pomolobus pseudoharengus\nn02531625\tmenhaden, Brevoortia tyrannis\nn02532028\therring, Clupea harangus\nn02532272\tAtlantic herring, Clupea harengus harengus\nn02532451\tPacific herring, Clupea harengus pallasii\nn02532602\tsardine\nn02532786\tsild\nn02532918\tbrisling, sprat, Clupea sprattus\nn02533209\tpilchard, sardine, Sardina pilchardus\nn02533545\tPacific sardine, Sardinops caerulea\nn02533834\tanchovy\nn02534165\tmediterranean anchovy, Engraulis encrasicholus\nn02534559\tsalmonid\nn02534734\tsalmon\nn02535080\tparr\nn02535163\tblackfish\nn02535258\tredfish\nn02535537\tAtlantic salmon, Salmo salar\nn02535759\tlandlocked salmon, lake salmon\nn02536165\tsockeye, sockeye salmon, red salmon, blueback salmon, Oncorhynchus nerka\nn02536456\tchinook, chinook salmon, king salmon, quinnat salmon, Oncorhynchus tshawytscha\nn02536864\tcoho, cohoe, coho salmon, blue jack, silver salmon, Oncorhynchus kisutch\nn02537085\ttrout\nn02537319\tbrown trout, salmon trout, Salmo trutta\nn02537525\trainbow trout, Salmo gairdneri\nn02537716\tsea trout\nn02538010\tlake trout, salmon trout, Salvelinus namaycush\nn02538216\tbrook trout, speckled trout, Salvelinus fontinalis\nn02538406\tchar, charr\nn02538562\tArctic char, Salvelinus alpinus\nn02538985\twhitefish\nn02539424\tlake whitefish, Coregonus clupeaformis\nn02539573\tcisco, lake herring, Coregonus artedi\nn02539894\tround whitefish, Menominee whitefish, Prosopium cylindraceum\nn02540412\tsmelt\nn02540983\tsparling, European smelt, Osmerus eperlanus\nn02541257\tcapelin, capelan, caplin\nn02541687\ttarpon, Tarpon atlanticus\nn02542017\tladyfish, tenpounder, Elops saurus\nn02542432\tbonefish, Albula 
vulpes\nn02542958\targentine\nn02543255\tlanternfish\nn02543565\tlizardfish, snakefish, snake-fish\nn02544274\tlancetfish, lancet fish, wolffish\nn02545841\topah, moonfish, Lampris regius\nn02546028\tNew World opah, Lampris guttatus\nn02546331\tribbonfish\nn02546627\tdealfish, Trachipterus arcticus\nn02547014\toarfish, king of the herring, ribbonfish, Regalecus glesne\nn02547733\tbatfish\nn02548247\tgoosefish, angler, anglerfish, angler fish, monkfish, lotte, allmouth, Lophius Americanus\nn02548689\ttoadfish, Opsanus tau\nn02548884\toyster fish, oyster-fish, oysterfish\nn02549248\tfrogfish\nn02549376\tsargassum fish\nn02549989\tneedlefish, gar, billfish\nn02550203\ttimucu\nn02550460\tflying fish\nn02550655\tmonoplane flying fish, two-wing flying fish\nn02551134\thalfbeak\nn02551668\tsaury, billfish, Scomberesox saurus\nn02552171\tspiny-finned fish, acanthopterygian\nn02553028\tlingcod, Ophiodon elongatus\nn02554730\tpercoid fish, percoid, percoidean\nn02555863\tperch\nn02556373\tclimbing perch, Anabas testudineus, A. 
testudineus\nn02556846\tperch\nn02557182\tyellow perch, Perca flavescens\nn02557318\tEuropean perch, Perca fluviatilis\nn02557591\tpike-perch, pike perch\nn02557749\twalleye, walleyed pike, jack salmon, dory, Stizostedion vitreum\nn02557909\tblue pike, blue pickerel, blue pikeperch, blue walleye, Strizostedion vitreum glaucum\nn02558206\tsnail darter, Percina tanasi\nn02558860\tcusk-eel\nn02559144\tbrotula\nn02559383\tpearlfish, pearl-fish\nn02559862\trobalo\nn02560110\tsnook\nn02561108\tpike\nn02561381\tnorthern pike, Esox lucius\nn02561514\tmuskellunge, Esox masquinongy\nn02561661\tpickerel\nn02561803\tchain pickerel, chain pike, Esox niger\nn02561937\tredfin pickerel, barred pickerel, Esox americanus\nn02562315\tsunfish, centrarchid\nn02562796\tcrappie\nn02562971\tblack crappie, Pomoxis nigromaculatus\nn02563079\twhite crappie, Pomoxis annularis\nn02563182\tfreshwater bream, bream\nn02563648\tpumpkinseed, Lepomis gibbosus\nn02563792\tbluegill, Lepomis macrochirus\nn02563949\tspotted sunfish, stumpknocker, Lepomis punctatus\nn02564270\tfreshwater bass\nn02564403\trock bass, rock sunfish, Ambloplites rupestris\nn02564720\tblack bass\nn02564935\tKentucky black bass, spotted black bass, Micropterus pseudoplites\nn02565072\tsmallmouth, smallmouth bass, smallmouthed bass, smallmouth black bass, smallmouthed black bass, Micropterus dolomieu\nn02565324\tlargemouth, largemouth bass, largemouthed bass, largemouth black bass, largemouthed black bass, Micropterus salmoides\nn02565573\tbass\nn02566109\tserranid fish, serranid\nn02566489\twhite perch, silver perch, Morone americana\nn02566665\tyellow bass, Morone interrupta\nn02567334\tblackmouth bass, Synagrops bellus\nn02567633\trock sea bass, rock bass, Centropristis philadelphica\nn02568087\tstriped bass, striper, Roccus saxatilis, rockfish\nn02568447\tstone bass, wreckfish, Polyprion americanus\nn02568959\tgrouper\nn02569484\thind\nn02569631\trock hind, Epinephelus adscensionis\nn02569905\tcreole-fish, Paranthias 
furcifer\nn02570164\tjewfish, Mycteroperca bonaci\nn02570484\tsoapfish\nn02570838\tsurfperch, surffish, surf fish\nn02571167\trainbow seaperch, rainbow perch, Hipsurus caryi\nn02571652\tbigeye\nn02571810\tcatalufa, Priacanthus arenatus\nn02572196\tcardinalfish\nn02572484\tflame fish, flamefish, Apogon maculatus\nn02573249\ttilefish, Lopholatilus chamaeleonticeps\nn02573704\tbluefish, Pomatomus saltatrix\nn02574271\tcobia, Rachycentron canadum, sergeant fish\nn02574910\tremora, suckerfish, sucking fish\nn02575325\tsharksucker, Echeneis naucrates\nn02575590\twhale sucker, whalesucker, Remilegia australis\nn02576223\tcarangid fish, carangid\nn02576575\tjack\nn02576906\tcrevalle jack, jack crevalle, Caranx hippos\nn02577041\tyellow jack, Caranx bartholomaei\nn02577164\trunner, blue runner, Caranx crysos\nn02577403\trainbow runner, Elagatis bipinnulata\nn02577662\tleatherjacket, leatherjack\nn02577952\tthreadfish, thread-fish, Alectis ciliaris\nn02578233\tmoonfish, Atlantic moonfish, horsefish, horsehead, horse-head, dollarfish, Selene setapinnis\nn02578454\tlookdown, lookdown fish, Selene vomer\nn02578771\tamberjack, amberfish\nn02578928\tyellowtail, Seriola dorsalis\nn02579303\tkingfish, Seriola grandis\nn02579557\tpompano\nn02579762\tFlorida pompano, Trachinotus carolinus\nn02579928\tpermit, Trachinotus falcatus\nn02580336\tscad\nn02580679\thorse mackerel, jack mackerel, Spanish mackerel, saurel, Trachurus symmetricus\nn02580830\thorse mackerel, saurel, Trachurus trachurus\nn02581108\tbigeye scad, big-eyed scad, goggle-eye, Selar crumenophthalmus\nn02581482\tmackerel scad, mackerel shad, Decapterus macarellus\nn02581642\tround scad, cigarfish, quiaquia, Decapterus punctatus\nn02581957\tdolphinfish, dolphin, mahimahi\nn02582220\tCoryphaena hippurus\nn02582349\tCoryphaena equisetis\nn02582721\tpomfret, Brama raii\nn02583567\tcharacin, characin fish, characid\nn02583890\ttetra\nn02584145\tcardinal tetra, Paracheirodon axelrodi\nn02584449\tpiranha, pirana, 
caribe\nn02585872\tcichlid, cichlid fish\nn02586238\tbolti, Tilapia nilotica\nn02586543\tsnapper\nn02587051\tred snapper, Lutjanus blackfordi\nn02587300\tgrey snapper, gray snapper, mangrove snapper, Lutjanus griseus\nn02587479\tmutton snapper, muttonfish, Lutjanus analis\nn02587618\tschoolmaster, Lutjanus apodus\nn02587877\tyellowtail, yellowtail snapper, Ocyurus chrysurus\nn02588286\tgrunt\nn02588794\tmargate, Haemulon album\nn02588945\tSpanish grunt, Haemulon macrostomum\nn02589062\ttomtate, Haemulon aurolineatum\nn02589196\tcottonwick, Haemulon malanurum\nn02589316\tsailor's-choice, sailors choice, Haemulon parra\nn02589623\tporkfish, pork-fish, Anisotremus virginicus\nn02589796\tpompon, black margate, Anisotremus surinamensis\nn02590094\tpigfish, hogfish, Orthopristis chrysopterus\nn02590495\tsparid, sparid fish\nn02590702\tsea bream, bream\nn02590987\tporgy\nn02591330\tred porgy, Pagrus pagrus\nn02591613\tEuropean sea bream, Pagellus centrodontus\nn02591911\tAtlantic sea bream, Archosargus rhomboidalis\nn02592055\tsheepshead, Archosargus probatocephalus\nn02592371\tpinfish, sailor's-choice, squirrelfish, Lagodon rhomboides\nn02592734\tsheepshead porgy, Calamus penna\nn02593019\tsnapper, Chrysophrys auratus\nn02593191\tblack bream, Chrysophrys australis\nn02593453\tscup, northern porgy, northern scup, Stenotomus chrysops\nn02593679\tscup, southern porgy, southern scup, Stenotomus aculeatus\nn02594250\tsciaenid fish, sciaenid\nn02594942\tstriped drum, Equetus pulcher\nn02595056\tjackknife-fish, Equetus lanceolatus\nn02595339\tsilver perch, mademoiselle, Bairdiella chrysoura\nn02595702\tred drum, channel bass, redfish, Sciaenops ocellatus\nn02596067\tmulloway, jewfish, Sciaena antarctica\nn02596252\tmaigre, maiger, Sciaena aquila\nn02596381\tcroaker\nn02596720\tAtlantic croaker, Micropogonias undulatus\nn02597004\tyellowfin croaker, surffish, surf fish, Umbrina roncador\nn02597367\twhiting\nn02597608\tkingfish\nn02597818\tking whiting, Menticirrhus 
americanus\nn02597972\tnorthern whiting, Menticirrhus saxatilis\nn02598134\tcorbina, Menticirrhus undulatus\nn02598573\twhite croaker, chenfish, kingfish, Genyonemus lineatus\nn02598878\twhite croaker, queenfish, Seriphus politus\nn02599052\tsea trout\nn02599347\tweakfish, Cynoscion regalis\nn02599557\tspotted weakfish, spotted sea trout, spotted squeateague, Cynoscion nebulosus\nn02599958\tmullet\nn02600298\tgoatfish, red mullet, surmullet, Mullus surmuletus\nn02600503\tred goatfish, Mullus auratus\nn02600798\tyellow goatfish, Mulloidichthys martinicus\nn02601344\tmullet, grey mullet, gray mullet\nn02601767\tstriped mullet, Mugil cephalus\nn02601921\twhite mullet, Mugil curema\nn02602059\tliza, Mugil liza\nn02602405\tsilversides, silverside\nn02602760\tjacksmelt, Atherinopsis californiensis\nn02603317\tbarracuda\nn02603540\tgreat barracuda, Sphyraena barracuda\nn02603862\tsweeper\nn02604157\tsea chub\nn02604480\tBermuda chub, rudderfish, Kyphosus sectatrix\nn02604954\tspadefish, angelfish, Chaetodipterus faber\nn02605316\tbutterfly fish\nn02605703\tchaetodon\nn02605936\tangelfish\nn02606052\trock beauty, Holocanthus tricolor\nn02606384\tdamselfish, demoiselle\nn02606751\tbeaugregory, Pomacentrus leucostictus\nn02607072\tanemone fish\nn02607201\tclown anemone fish, Amphiprion percula\nn02607470\tsergeant major, Abudefduf saxatilis\nn02607862\twrasse\nn02608284\tpigfish, giant pigfish, Achoerodus gouldii\nn02608547\thogfish, hog snapper, Lachnolaimus maximus\nn02608860\tslippery dick, Halicoeres bivittatus\nn02608996\tpuddingwife, pudding-wife, Halicoeres radiatus\nn02609302\tbluehead, Thalassoma bifasciatum\nn02609823\tpearly razorfish, Hemipteronatus novacula\nn02610066\ttautog, blackfish, Tautoga onitis\nn02610373\tcunner, bergall, Tautogolabrus adspersus\nn02610664\tparrotfish, polly fish, pollyfish\nn02610980\tthreadfin\nn02611561\tjawfish\nn02611898\tstargazer\nn02612167\tsand stargazer\nn02613181\tblenny, combtooth blenny\nn02613572\tshanny, Blennius 
pholis\nn02613820\tMolly Miller, Scartella cristata\nn02614140\tclinid, clinid fish\nn02614482\tpikeblenny\nn02614653\tbluethroat pikeblenny, Chaenopsis ocellata\nn02614978\tgunnel, bracketed blenny\nn02615298\trock gunnel, butterfish, Pholis gunnellus\nn02616128\teelblenny\nn02616397\twrymouth, ghostfish, Cryptacanthodes maculatus\nn02616851\twolffish, wolf fish, catfish\nn02617537\tviviparous eelpout, Zoarces viviparus\nn02618094\tocean pout, Macrozoarces americanus\nn02618513\tsand lance, sand launce, sand eel, launce\nn02618827\tdragonet\nn02619165\tgoby, gudgeon\nn02619550\tmudskipper, mudspringer\nn02619861\tsleeper, sleeper goby\nn02620167\tflathead\nn02620578\tarcherfish, Toxotes jaculatrix\nn02621258\tsurgeonfish\nn02621908\tgempylid\nn02622249\tsnake mackerel, Gempylus serpens\nn02622547\tescolar, Lepidocybium flavobrunneum\nn02622712\toilfish, Ruvettus pretiosus\nn02622955\tcutlassfish, frost fish, hairtail\nn02623445\tscombroid, scombroid fish\nn02624167\tmackerel\nn02624551\tcommon mackerel, shiner, Scomber scombrus\nn02624807\tSpanish mackerel, Scomber colias\nn02624987\tchub mackerel, tinker, Scomber japonicus\nn02625258\twahoo, Acanthocybium solandri\nn02625612\tSpanish mackerel\nn02625851\tking mackerel, cavalla, cero, Scomberomorus cavalla\nn02626089\tScomberomorus maculatus\nn02626265\tcero, pintado, kingfish, Scomberomorus regalis\nn02626471\tsierra, Scomberomorus sierra\nn02626762\ttuna, tunny\nn02627037\talbacore, long-fin tunny, Thunnus alalunga\nn02627292\tbluefin, bluefin tuna, horse mackerel, Thunnus thynnus\nn02627532\tyellowfin, yellowfin tuna, Thunnus albacares\nn02627835\tbonito\nn02628062\tskipjack, Atlantic bonito, Sarda sarda\nn02628259\tChile bonito, Chilean bonito, Pacific bonito, Sarda chiliensis\nn02628600\tskipjack, skipjack tuna, Euthynnus pelamis\nn02629230\tbonito, oceanic bonito, Katsuwonus pelamis\nn02629716\tswordfish, Xiphias gladius\nn02630281\tsailfish\nn02630615\tAtlantic sailfish, Istiophorus 
albicans\nn02630739\tbillfish\nn02631041\tmarlin\nn02631330\tblue marlin, Makaira nigricans\nn02631475\tblack marlin, Makaira mazara, Makaira marlina\nn02631628\tstriped marlin, Makaira mitsukurii\nn02631775\twhite marlin, Makaira albida\nn02632039\tspearfish\nn02632494\tlouvar, Luvarus imperialis\nn02633422\tdollarfish, Poronotus triacanthus\nn02633677\tpalometa, California pompano, Palometa simillima\nn02633977\tharvestfish, Paprilus alepidotus\nn02634545\tdriftfish\nn02635154\tbarrelfish, black rudderfish, Hyperglyphe perciformis\nn02635580\tclingfish\nn02636170\ttripletail\nn02636405\tAtlantic tripletail, Lobotes surinamensis\nn02636550\tPacific tripletail, Lobotes pacificus\nn02636854\tmojarra\nn02637179\tyellowfin mojarra, Gerres cinereus\nn02637475\tsilver jenny, Eucinostomus gula\nn02637977\twhiting\nn02638596\tganoid, ganoid fish\nn02639087\tbowfin, grindle, dogfish, Amia calva\nn02639605\tpaddlefish, duckbill, Polyodon spathula\nn02639922\tChinese paddlefish, Psephurus gladis\nn02640242\tsturgeon\nn02640626\tPacific sturgeon, white sturgeon, Sacramento sturgeon, Acipenser transmontanus\nn02640857\tbeluga, hausen, white sturgeon, Acipenser huso\nn02641379\tgar, garfish, garpike, billfish, Lepisosteus osseus\nn02642107\tscorpaenoid, scorpaenoid fish\nn02642644\tscorpaenid, scorpaenid fish\nn02643112\tscorpionfish, scorpion fish, sea scorpion\nn02643316\tplumed scorpionfish, Scorpaena grandicornis\nn02643566\tlionfish\nn02643836\tstonefish, Synanceja verrucosa\nn02644113\trockfish\nn02644360\tcopper rockfish, Sebastodes caurinus\nn02644501\tvermillion rockfish, rasher, Sebastodes miniatus\nn02644665\tred rockfish, Sebastodes ruberrimus\nn02644817\trosefish, ocean perch, Sebastodes marinus\nn02645538\tbullhead\nn02645691\tmiller's-thumb\nn02645953\tsea raven, Hemitripterus americanus\nn02646667\tlumpfish, Cyclopterus lumpus\nn02646892\tlumpsucker\nn02648035\tpogge, armed bullhead, Agonus cataphractus\nn02648625\tgreenling\nn02648916\tkelp greenling, 
Hexagrammos decagrammus\nn02649218\tpainted greenling, convict fish, convictfish, Oxylebius pictus\nn02649546\tflathead\nn02650050\tgurnard\nn02650413\ttub gurnard, yellow gurnard, Trigla lucerna\nn02650541\tsea robin, searobin\nn02651060\tnorthern sea robin, Prionotus carolinus\nn02652132\tflying gurnard, flying robin, butterflyfish\nn02652668\tplectognath, plectognath fish\nn02653145\ttriggerfish\nn02653497\tqueen triggerfish, Bessy cerca, oldwench, oldwife, Balistes vetula\nn02653786\tfilefish\nn02654112\tleatherjacket, leatherfish\nn02654425\tboxfish, trunkfish\nn02654745\tcowfish, Lactophrys quadricornis\nn02655020\tpuffer, pufferfish, blowfish, globefish\nn02655523\tspiny puffer\nn02655848\tporcupinefish, porcupine fish, Diodon hystrix\nn02656032\tballoonfish, Diodon holocanthus\nn02656301\tburrfish\nn02656670\tocean sunfish, sunfish, mola, headfish\nn02656969\tsharptail mola, Mola lanceolata\nn02657368\tflatfish\nn02657694\tflounder\nn02658079\trighteye flounder, righteyed flounder\nn02658531\tplaice, Pleuronectes platessa\nn02658811\tEuropean flatfish, Platichthys flesus\nn02659176\tyellowtail flounder, Limanda ferruginea\nn02659478\twinter flounder, blackback flounder, lemon sole, Pseudopleuronectes americanus\nn02659808\tlemon sole, Microstomus kitt\nn02660091\tAmerican plaice, Hippoglossoides platessoides\nn02660208\thalibut, holibut\nn02660519\tAtlantic halibut, Hippoglossus hippoglossus\nn02660640\tPacific halibut, Hippoglossus stenolepsis\nn02661017\tlefteye flounder, lefteyed flounder\nn02661473\tsouthern flounder, Paralichthys lethostigmus\nn02661618\tsummer flounder, Paralichthys dentatus\nn02662239\twhiff\nn02662397\thorned whiff, Citharichthys cornutus\nn02662559\tsand dab\nn02662825\twindowpane, Scophthalmus aquosus\nn02662993\tbrill, Scophthalmus rhombus\nn02663211\tturbot, Psetta maxima\nn02663485\ttonguefish, tongue-fish\nn02663849\tsole\nn02664285\tEuropean sole, Solea solea\nn02664642\tEnglish sole, lemon sole, Parophrys 
vitulus\nn02665250\thogchoker, Trinectes maculatus\nn02665985\taba\nn02666196\tabacus\nn02666501\tabandoned ship, derelict\nn02666624\tA battery\nn02666943\tabattoir, butchery, shambles, slaughterhouse\nn02667093\tabaya\nn02667244\tAbbe condenser\nn02667379\tabbey\nn02667478\tabbey\nn02667576\tabbey\nn02667693\tAbney level\nn02668393\tabrader, abradant\nn02668613\tabrading stone\nn02669295\tabutment\nn02669442\tabutment arch\nn02669534\tacademic costume\nn02669723\tacademic gown, academic robe, judge's robe\nn02670186\taccelerator, throttle, throttle valve\nn02670382\taccelerator, particle accelerator, atom smasher\nn02670683\taccelerator, accelerator pedal, gas pedal, gas, throttle, gun\nn02670935\taccelerometer\nn02671780\taccessory, accoutrement, accouterment\nn02672152\taccommodating lens implant, accommodating IOL\nn02672371\taccommodation\nn02672831\taccordion, piano accordion, squeeze box\nn02675077\tacetate disk, phonograph recording disk\nn02675219\tacetate rayon, acetate\nn02675522\tachromatic lens\nn02676097\tacoustic delay line, sonic delay line\nn02676261\tacoustic device\nn02676566\tacoustic guitar\nn02676670\tacoustic modem\nn02676938\tacropolis\nn02677028\tacrylic\nn02677136\tacrylic, acrylic paint\nn02677436\tactinometer\nn02677718\taction, action mechanism\nn02678010\tactive matrix screen\nn02678384\tactuator\nn02678897\tadapter, adaptor\nn02679142\tadder\nn02679257\tadding machine, totalizer, totaliser\nn02679961\taddressing machine, Addressograph\nn02680110\tadhesive bandage\nn02680512\tadit\nn02680638\tadjoining room\nn02680754\tadjustable wrench, adjustable spanner\nn02681392\tadobe, adobe brick\nn02682311\tadz, adze\nn02682407\taeolian harp, aeolian lyre, wind harp\nn02682569\taerator\nn02682811\taerial torpedo\nn02682922\taerosol, aerosol container, aerosol can, aerosol bomb, spray can\nn02683183\tAertex\nn02683323\tafghan\nn02683454\tAfro-wig\nn02683558\tafterburner\nn02683791\tafter-shave, after-shave 
lotion\nn02684248\tagateware\nn02684356\tagglomerator\nn02684515\taglet, aiglet, aiguilette\nn02684649\taglet, aiglet\nn02684962\tagora, public square\nn02685082\taigrette, aigret\nn02685253\taileron\nn02685365\tair bag\nn02685701\tairbrake\nn02685995\tairbrush\nn02686121\tairbus\nn02686227\tair compressor\nn02686379\tair conditioner, air conditioning\nn02686568\taircraft\nn02687172\taircraft carrier, carrier, flattop, attack aircraft carrier\nn02687423\taircraft engine\nn02687682\tair cushion, air spring\nn02687821\tairdock, hangar, repair shed\nn02687992\tairfield, landing field, flying field, field\nn02688273\tair filter, air cleaner\nn02688443\tairfoil, aerofoil, control surface, surface\nn02689144\tairframe\nn02689274\tair gun, airgun, air rifle\nn02689434\tair hammer, jackhammer, pneumatic hammer\nn02689748\tair horn\nn02689819\tairing cupboard\nn02690373\tairliner\nn02690715\tairmailer\nn02691156\tairplane, aeroplane, plane\nn02692086\tairplane propeller, airscrew, prop\nn02692232\tairport, airdrome, aerodrome, drome\nn02692513\tair pump, vacuum pump\nn02692680\tair search radar\nn02692877\tairship, dirigible\nn02693246\tair terminal, airport terminal\nn02693413\tair-to-air missile\nn02693540\tair-to-ground missile, air-to-surface missile\nn02694045\taisle\nn02694279\tAladdin's lamp\nn02694426\talarm, warning device, alarm system\nn02694662\talarm clock, alarm\nn02694966\talb\nn02695627\talcazar\nn02695762\talcohol thermometer, alcohol-in-glass thermometer\nn02696165\talehouse\nn02696246\talembic\nn02696569\talgometer\nn02696843\talidade, alidad\nn02697022\talidade, alidad\nn02697221\tA-line\nn02697576\tAllen screw\nn02697675\tAllen wrench\nn02697876\talligator wrench\nn02698244\talms dish, alms tray\nn02698473\talpaca\nn02698634\talpenstock\nn02699494\taltar\nn02699629\taltar, communion table, Lord's table\nn02699770\taltarpiece, 
reredos\nn02699915\taltazimuth\nn02700064\talternator\nn02700258\taltimeter\nn02700895\tAmati\nn02701002\tambulance\nn02701260\tamen corner\nn02701730\tAmerican organ\nn02702989\tammeter\nn02703124\tammonia clock\nn02703275\tammunition, ammo\nn02704645\tamphibian, amphibious aircraft\nn02704792\tamphibian, amphibious vehicle\nn02704949\tamphitheater, amphitheatre, coliseum\nn02705201\tamphitheater, amphitheatre\nn02705429\tamphora\nn02705944\tamplifier\nn02706221\tampulla\nn02706806\tamusement arcade\nn02708093\tanalog clock\nn02708224\tanalog computer, analogue computer\nn02708433\tanalog watch\nn02708555\tanalytical balance, chemical balance\nn02708711\tanalyzer, analyser\nn02708885\tanamorphosis, anamorphism\nn02709101\tanastigmat\nn02709367\tanchor, ground tackle\nn02709637\tanchor chain, anchor rope\nn02709763\tanchor light, riding light, riding lamp\nn02709908\tAND circuit, AND gate\nn02710044\tandiron, firedog, dog, dog-iron\nn02710201\tandroid, humanoid, mechanical man\nn02710324\tanechoic chamber\nn02710429\tanemometer, wind gauge, wind gage\nn02710600\taneroid barometer, aneroid\nn02711237\tangiocardiogram\nn02711780\tangioscope\nn02712545\tangle bracket, angle iron\nn02712643\tangledozer\nn02713003\tankle brace\nn02713218\tanklet, anklets, bobbysock, bobbysocks\nn02713364\tanklet\nn02713496\tankus\nn02714315\tanode\nn02714535\tanode\nn02714751\tanswering machine\nn02715229\tantenna, aerial, transmitting aerial\nn02715513\tanteroom, antechamber, entrance hall, hall, foyer, lobby, vestibule\nn02715712\tantiaircraft, antiaircraft gun, flak, flack, pom-pom, ack-ack, ack-ack gun\nn02716626\tantiballistic missile, ABM\nn02720048\tantifouling paint\nn02720576\tanti-G suit, G suit\nn02721813\tantimacassar\nn02723165\tantiperspirant\nn02724722\tanti-submarine rocket\nn02725872\tanvil\nn02726017\tao dai\nn02726210\tapadana\nn02726305\tapartment, flat\nn02726681\tapartment building, apartment house\nn02727016\taperture\nn02727141\taperture\nn02727426\tapiary, bee 
house\nn02727825\tapparatus, setup\nn02728440\tapparel, wearing apparel, dress, clothes\nn02729222\tapplecart\nn02729837\tappliance\nn02729965\tappliance, contraption, contrivance, convenience, gadget, gizmo, gismo, widget\nn02730265\tapplicator, applier\nn02730568\tappointment, fitting\nn02730930\tapron\nn02731251\tapron string\nn02731398\tapse, apsis\nn02731629\taqualung, Aqua-Lung, scuba\nn02731900\taquaplane\nn02732072\taquarium, fish tank, marine museum\nn02732572\tarabesque\nn02732827\tarbor, arbour, bower, pergola\nn02733213\tarcade, colonnade\nn02733524\tarch\nn02734725\tarchitecture\nn02734835\tarchitrave\nn02735268\tarch support\nn02735361\tarc lamp, arc light\nn02735538\tarctic, galosh, golosh, rubber, gumshoe\nn02735688\tarea\nn02736396\tareaway\nn02736798\targyle, argyll\nn02737351\tark\nn02737660\tarm\nn02738031\tarmament\nn02738271\tarmature\nn02738449\tarmband\nn02738535\tarmchair\nn02738741\tarmet\nn02738859\tarm guard, arm pad\nn02738978\tarmhole\nn02739123\tarmilla\nn02739427\tarmlet, arm band\nn02739550\tarmoire\nn02739668\tarmor, armour\nn02739889\tarmored car, armoured car\nn02740061\tarmored car, armoured car\nn02740300\tarmored personnel carrier, armoured personnel carrier, APC\nn02740533\tarmored vehicle, armoured vehicle\nn02740764\tarmor plate, armour plate, armor plating, plate armor, plate armour\nn02741367\tarmory, armoury, arsenal\nn02741475\tarmrest\nn02742070\tarquebus, harquebus, hackbut, hagbut\nn02742194\tarray\nn02742322\tarray, raiment, regalia\nn02742468\tarrester, arrester hook\nn02742753\tarrow\nn02743426\tarsenal, armory, armoury\nn02744323\tarterial road\nn02744844\tarthrogram\nn02744961\tarthroscope\nn02745492\tartificial heart\nn02745611\tartificial horizon, gyro horizon, flight indicator\nn02745816\tartificial joint\nn02746008\tartificial kidney, hemodialyzer\nn02746225\tartificial skin\nn02746365\tartillery, heavy weapon, gun, ordnance\nn02746595\tartillery shell\nn02746683\tartist's loft\nn02746978\tart 
school\nn02747063\tascot\nn02747177\tashcan, trash can, garbage can, wastebin, ash bin, ash-bin, ashbin, dustbin, trash barrel, trash bin\nn02747672\tash-pan\nn02747802\tashtray\nn02748183\taspergill, aspersorium\nn02748359\taspersorium\nn02748491\taspirator\nn02749169\taspirin powder, headache powder\nn02749292\tassault gun\nn02749479\tassault rifle, assault gun\nn02749670\tassegai, assagai\nn02749790\tassembly\nn02749953\tassembly\nn02750070\tassembly hall\nn02750169\tassembly plant\nn02750320\tastatic coils\nn02750652\tastatic galvanometer\nn02751067\tastrodome\nn02751215\tastrolabe\nn02751295\tastronomical telescope\nn02751490\tastronomy satellite\nn02752199\tathenaeum, atheneum\nn02752496\tathletic sock, sweat sock, varsity sock\nn02752615\tathletic supporter, supporter, suspensor, jockstrap, jock\nn02752810\tatlas, telamon\nn02752917\tatmometer, evaporometer\nn02753044\tatom bomb, atomic bomb, A-bomb, fission bomb, plutonium bomb\nn02753394\tatomic clock\nn02753710\tatomic pile, atomic reactor, pile, chain reactor\nn02754103\tatomizer, atomiser, spray, sprayer, nebulizer, nebuliser\nn02754656\tatrium\nn02755140\tattache case, attache\nn02755352\tattachment, bond\nn02755529\tattack submarine\nn02755675\tattenuator\nn02755823\tattic\nn02755984\tattic fan\nn02756098\tattire, garb, dress\nn02756854\taudio amplifier\nn02756977\taudiocassette\nn02757061\taudio CD, audio compact disc\nn02757337\taudiometer, sonometer\nn02757462\taudio system, sound system\nn02757714\taudiotape\nn02757810\taudiotape\nn02757927\taudiovisual, audiovisual aid\nn02758134\tauditorium\nn02758490\tauger, gimlet, screw auger, wimble\nn02758863\tautobahn\nn02758960\tautoclave, sterilizer, steriliser\nn02759257\tautofocus\nn02759387\tautogiro, autogyro, gyroplane\nn02759700\tautoinjector\nn02759963\tautoloader, self-loader\nn02760099\tautomat\nn02760199\tautomat\nn02760298\tautomatic choke\nn02760429\tautomatic firearm, automatic gun, automatic weapon\nn02760658\tautomatic pistol, 
automatic\nn02760855\tautomatic rifle, automatic, machine rifle\nn02761034\tautomatic transmission, automatic drive\nn02761206\tautomation\nn02761392\tautomaton, robot, golem\nn02761557\tautomobile engine\nn02761696\tautomobile factory, auto factory, car factory\nn02761834\tautomobile horn, car horn, motor horn, horn, hooter\nn02762169\tautopilot, automatic pilot, robot pilot\nn02762371\tautoradiograph\nn02762508\tautostrada\nn02762725\tauxiliary boiler, donkey boiler\nn02762909\tauxiliary engine, donkey engine\nn02763083\tauxiliary pump, donkey pump\nn02763198\tauxiliary research submarine\nn02763306\tauxiliary storage, external storage, secondary storage\nn02763604\taviary, bird sanctuary, volary\nn02763714\tawl\nn02763901\tawning, sunshade, sunblind\nn02764044\tax, axe\nn02764398\tax handle, axe handle\nn02764505\tax head, axe head\nn02764614\taxis, axis of rotation\nn02764779\taxle\nn02764935\taxle bar\nn02765028\taxletree\nn02766168\tbabushka\nn02766320\tbaby bed, baby's bed\nn02766534\tbaby buggy, baby carriage, carriage, perambulator, pram, stroller, go-cart, pushchair, pusher\nn02766792\tbaby grand, baby grand piano, parlor grand, parlor grand piano, parlour grand, parlour grand piano\nn02767038\tbaby powder\nn02767147\tbaby shoe\nn02767433\tback, backrest\nn02767665\tback\nn02767956\tbackbench\nn02768114\tbackboard\nn02768226\tbackboard, basketball backboard\nn02768433\tbackbone\nn02768655\tback brace\nn02768973\tbackgammon board\nn02769075\tbackground, desktop, screen background\nn02769290\tbackhoe\nn02769669\tbacklighting\nn02769748\tbackpack, back pack, knapsack, packsack, rucksack, haversack\nn02769963\tbackpacking tent, pack tent\nn02770078\tbackplate\nn02770211\tback porch\nn02770585\tbacksaw, back saw\nn02770721\tbackscratcher\nn02770830\tbackseat\nn02771004\tbackspace key, backspace, backspacer\nn02771166\tbackstairs\nn02771286\tbackstay\nn02771547\tbackstop\nn02771750\tbacksword\nn02772101\tbackup system\nn02772435\tbadminton 
court\nn02772554\tbadminton equipment\nn02772700\tbadminton racket, badminton racquet, battledore\nn02773037\tbag\nn02773838\tbag, traveling bag, travelling bag, grip, suitcase\nn02774152\tbag, handbag, pocketbook, purse\nn02774630\tbaggage, luggage\nn02774921\tbaggage\nn02775039\tbaggage car, luggage van\nn02775178\tbaggage claim\nn02775483\tbagpipe\nn02775689\tbailey\nn02775813\tbailey\nn02775897\tBailey bridge\nn02776007\tbain-marie\nn02776205\tbait, decoy, lure\nn02776505\tbaize\nn02776631\tbakery, bakeshop, bakehouse\nn02776825\tbalaclava, balaclava helmet\nn02776978\tbalalaika\nn02777100\tbalance\nn02777292\tbalance beam, beam\nn02777402\tbalance wheel, balance\nn02777638\tbalbriggan\nn02777734\tbalcony\nn02777927\tbalcony\nn02778131\tbaldachin\nn02778294\tbaldric, baldrick\nn02778456\tbale\nn02778588\tbaling wire\nn02778669\tball\nn02779435\tball\nn02779609\tball and chain\nn02779719\tball-and-socket joint\nn02779971\tballast, light ballast\nn02780315\tball bearing, needle bearing, roller bearing\nn02780445\tball cartridge\nn02780588\tballcock, ball cock\nn02780704\tballdress\nn02780815\tballet skirt, tutu\nn02781121\tball gown\nn02781213\tballistic galvanometer\nn02781338\tballistic missile\nn02781517\tballistic pendulum\nn02781764\tballistocardiograph, cardiograph\nn02782093\tballoon\nn02782432\tballoon bomb, Fugo\nn02782602\tballoon sail\nn02782681\tballot box\nn02782778\tballpark, park\nn02783035\tball-peen hammer\nn02783161\tballpoint, ballpoint pen, ballpen, Biro\nn02783324\tballroom, dance hall, dance palace\nn02783459\tball valve\nn02783900\tbalsa raft, Kon Tiki\nn02783994\tbaluster\nn02784124\tbanana boat\nn02784998\tband\nn02785648\tbandage, patch\nn02786058\tBand Aid\nn02786198\tbandanna, bandana\nn02786331\tbandbox\nn02786463\tbanderilla\nn02786611\tbandoleer, bandolier\nn02786736\tbandoneon\nn02786837\tbandsaw, band saw\nn02787120\tbandwagon\nn02787269\tbangalore torpedo\nn02787435\tbangle, bauble, gaud, gewgaw, novelty, fallal, 
trinket\nn02787622\tbanjo\nn02788021\tbanner, streamer\nn02788148\tbannister, banister, balustrade, balusters, handrail\nn02788386\tbanquette\nn02788462\tbanyan, banian\nn02788572\tbaptismal font, baptistry, baptistery, font\nn02788689\tbar\nn02789487\tbar\nn02790669\tbarbecue, barbeque\nn02790823\tbarbed wire, barbwire\nn02790996\tbarbell\nn02791124\tbarber chair\nn02791270\tbarbershop\nn02791532\tbarbette carriage\nn02791665\tbarbican, barbacan\nn02791795\tbar bit\nn02792409\tbareboat\nn02792552\tbarge, flatboat, hoy, lighter\nn02792948\tbarge pole\nn02793089\tbaritone, baritone horn\nn02793199\tbark, barque\nn02793296\tbar magnet\nn02793414\tbar mask\nn02793495\tbarn\nn02793684\tbarndoor\nn02793842\tbarn door\nn02793930\tbarnyard\nn02794008\tbarograph\nn02794156\tbarometer\nn02794368\tbarong\nn02794474\tbarouche\nn02794664\tbar printer\nn02794779\tbarrack\nn02794972\tbarrage balloon\nn02795169\tbarrel, cask\nn02795528\tbarrel, gun barrel\nn02795670\tbarrelhouse, honky-tonk\nn02795783\tbarrel knot, blood knot\nn02795978\tbarrel organ, grind organ, hand organ, hurdy gurdy, hurdy-gurdy, street organ\nn02796207\tbarrel vault\nn02796318\tbarrette\nn02796412\tbarricade\nn02796623\tbarrier\nn02796995\tbarroom, bar, saloon, ginmill, taproom\nn02797295\tbarrow, garden cart, lawn cart, wheelbarrow\nn02797535\tbascule\nn02797692\tbase, pedestal, stand\nn02797881\tbase, bag\nn02799071\tbaseball\nn02799175\tbaseball bat, lumber\nn02799323\tbaseball cap, jockey cap, golf cap\nn02799897\tbaseball equipment\nn02800213\tbaseball glove, glove, baseball mitt, mitt\nn02800497\tbasement, cellar\nn02800675\tbasement\nn02800940\tbasic point defense missile system\nn02801047\tbasilica, Roman basilica\nn02801184\tbasilica\nn02801450\tbasilisk\nn02801525\tbasin\nn02801823\tbasinet\nn02801938\tbasket, handbasket\nn02802215\tbasket, basketball hoop, hoop\nn02802426\tbasketball\nn02802544\tbasketball court\nn02802721\tbasketball equipment\nn02802990\tbasket 
weave\nn02803349\tbass\nn02803539\tbass clarinet\nn02803666\tbass drum, gran casa\nn02803809\tbasset horn\nn02803934\tbass fiddle, bass viol, bull fiddle, double bass, contrabass, string bass\nn02804123\tbass guitar\nn02804252\tbass horn, sousaphone, tuba\nn02804414\tbassinet\nn02804515\tbassinet\nn02804610\tbassoon\nn02805283\tbaster\nn02805845\tbastinado\nn02805983\tbastion\nn02806088\tbastion, citadel\nn02806379\tbat\nn02806530\tbath\nn02806762\tbath chair\nn02806875\tbathhouse, bagnio\nn02806992\tbathhouse, bathing machine\nn02807133\tbathing cap, swimming cap\nn02807523\tbath oil\nn02807616\tbathrobe\nn02807731\tbathroom, bath\nn02808185\tbath salts\nn02808304\tbath towel\nn02808440\tbathtub, bathing tub, bath, tub\nn02808829\tbathyscaphe, bathyscaph, bathyscape\nn02808968\tbathysphere\nn02809105\tbatik\nn02809241\tbatiste\nn02809364\tbaton, wand\nn02809491\tbaton\nn02809605\tbaton\nn02809736\tbaton\nn02810139\tbattering ram\nn02810270\tbatter's box\nn02810471\tbattery, electric battery\nn02810782\tbattery, stamp battery\nn02811059\tbatting cage, cage\nn02811204\tbatting glove\nn02811350\tbatting helmet\nn02811468\tbattle-ax, battle-axe\nn02811618\tbattle cruiser\nn02811719\tbattle dress\nn02811936\tbattlement, crenelation, crenellation\nn02812201\tbattleship, battlewagon\nn02812342\tbattle sight, battlesight\nn02812631\tbay\nn02812785\tbay\nn02812949\tbayonet\nn02813252\tbay rum\nn02813399\tbay window, bow window\nn02813544\tbazaar, bazar\nn02813645\tbazaar, bazar\nn02813752\tbazooka\nn02813981\tB battery\nn02814116\tBB gun\nn02814338\tbeach house\nn02814428\tbeach towel\nn02814533\tbeach wagon, station wagon, wagon, estate car, beach waggon, station waggon, waggon\nn02814774\tbeachwear\nn02814860\tbeacon, lighthouse, beacon light, pharos\nn02815478\tbeading plane\nn02815749\tbeaker\nn02815834\tbeaker\nn02815950\tbeam\nn02816494\tbeam balance\nn02816656\tbeanbag\nn02816768\tbeanie, beany\nn02817031\tbearing\nn02817251\tbearing rein, 
checkrein\nn02817386\tbearing wall\nn02817516\tbearskin, busby, shako\nn02817650\tbeater\nn02817799\tbeating-reed instrument, reed instrument, reed\nn02818135\tbeaver, castor\nn02818254\tbeaver\nn02818687\tBeckman thermometer\nn02818832\tbed\nn02819697\tbed\nn02820085\tbed and breakfast, bed-and-breakfast\nn02820210\tbedclothes, bed clothing, bedding\nn02820556\tBedford cord\nn02820675\tbed jacket\nn02821202\tbedpan\nn02821415\tbedpost\nn02821543\tbedroll\nn02821627\tbedroom, sleeping room, sleeping accommodation, chamber, bedchamber\nn02821943\tbedroom furniture\nn02822064\tbedsitting room, bedsitter, bedsit\nn02822220\tbedspread, bedcover, bed cover, bed covering, counterpane, spread\nn02822399\tbedspring\nn02822579\tbedstead, bedframe\nn02822762\tbeefcake\nn02822865\tbeehive, hive\nn02823124\tbeeper, pager\nn02823335\tbeer barrel, beer keg\nn02823428\tbeer bottle\nn02823510\tbeer can\nn02823586\tbeer garden\nn02823750\tbeer glass\nn02823848\tbeer hall\nn02823964\tbeer mat\nn02824058\tbeer mug, stein\nn02824152\tbelaying pin\nn02824319\tbelfry\nn02824448\tbell\nn02825153\tbell arch\nn02825240\tbellarmine, longbeard, long-beard, greybeard\nn02825442\tbellbottom trousers, bell-bottoms, bellbottom pants\nn02825657\tbell cote, bell cot\nn02825872\tbell foundry\nn02825961\tbell gable\nn02826068\tbell jar, bell glass\nn02826259\tbellows\nn02826459\tbellpull\nn02826589\tbell push\nn02826683\tbell seat, balloon seat\nn02826812\tbell tent\nn02826886\tbell tower\nn02827148\tbellyband\nn02827606\tbelt\nn02828115\tbelt, belt ammunition, belted ammunition\nn02828299\tbelt buckle\nn02828427\tbelting\nn02828884\tbench\nn02829246\tbench clamp\nn02829353\tbench hook\nn02829510\tbench lathe\nn02829596\tbench press\nn02830157\tbender\nn02831237\tberet\nn02831335\tberlin\nn02831595\tBermuda shorts, Jamaica shorts\nn02831724\tberth, bunk, built in bed\nn02831894\tbesom\nn02831998\tBessemer converter\nn02833040\tbethel\nn02833140\tbetting shop\nn02833275\tbevatron\nn02833403\tbevel, 
bevel square\nn02833793\tbevel gear, pinion and crown wheel, pinion and ring gear\nn02834027\tB-flat clarinet, licorice stick\nn02834397\tbib\nn02834506\tbib-and-tucker\nn02834642\tbicorn, bicorne\nn02834778\tbicycle, bike, wheel, cycle\nn02835271\tbicycle-built-for-two, tandem bicycle, tandem\nn02835412\tbicycle chain\nn02835551\tbicycle clip, trouser clip\nn02835724\tbicycle pump\nn02835829\tbicycle rack\nn02835915\tbicycle seat, saddle\nn02836035\tbicycle wheel\nn02836174\tbidet\nn02836268\tbier\nn02836392\tbier\nn02836513\tbi-fold door\nn02836607\tbifocals\nn02836900\tBig Blue, BLU-82\nn02837134\tbig board\nn02837567\tbight\nn02837789\tbikini, two-piece\nn02837887\tbikini pants\nn02838014\tbilge\nn02838178\tbilge keel\nn02838345\tbilge pump\nn02838577\tbilge well\nn02838728\tbill, peak, eyeshade, visor, vizor\nn02838958\tbill, billhook\nn02839110\tbillboard, hoarding\nn02839351\tbilliard ball\nn02839592\tbilliard room, billiard saloon, billiard parlor, billiard parlour, billiard hall\nn02839910\tbin\nn02840134\tbinder, ligature\nn02840245\tbinder, ring-binder\nn02840515\tbindery\nn02840619\tbinding, book binding, cover, back\nn02841063\tbin liner\nn02841187\tbinnacle\nn02841315\tbinoculars, field glasses, opera glasses\nn02841506\tbinocular microscope\nn02841641\tbiochip\nn02841847\tbiohazard suit\nn02842133\tbioscope\nn02842573\tbiplane\nn02842809\tbirch, birch rod\nn02843029\tbirchbark canoe, birchbark, birch bark\nn02843158\tbirdbath\nn02843276\tbirdcage\nn02843465\tbirdcall\nn02843553\tbird feeder, birdfeeder, feeder\nn02843684\tbirdhouse\nn02843777\tbird shot, buckshot, duck shot\nn02843909\tbiretta, berretta, birretta\nn02844056\tbishop\nn02844214\tbistro\nn02844307\tbit\nn02844714\tbit\nn02845130\tbite plate, biteplate\nn02845293\tbitewing\nn02845985\tbitumastic\nn02846141\tblack\nn02846260\tblack\nn02846511\tblackboard, chalkboard\nn02846619\tblackboard eraser\nn02846733\tblack box\nn02846874\tblackface\nn02847461\tblackjack, cosh, sap\nn02847631\tblack 
tie\nn02847852\tblackwash\nn02848118\tbladder\nn02848216\tblade\nn02848523\tblade, vane\nn02848806\tblade\nn02848921\tblank, dummy, blank shell\nn02849154\tblanket, cover\nn02849885\tblast furnace\nn02850060\tblasting cap\nn02850358\tblazer, sport jacket, sport coat, sports jacket, sports coat\nn02850732\tblender, liquidizer, liquidiser\nn02850950\tblimp, sausage balloon, sausage\nn02851099\tblind, screen\nn02851795\tblind curve, blind bend\nn02851939\tblindfold\nn02852043\tbling, bling bling\nn02852173\tblinker, flasher\nn02852360\tblister pack, bubble pack\nn02853016\tblock\nn02853218\tblockade\nn02853336\tblockade-runner\nn02853745\tblock and tackle\nn02853870\tblockbuster\nn02854378\tblockhouse\nn02854532\tblock plane\nn02854630\tbloodmobile\nn02854739\tbloomers, pants, drawers, knickers\nn02854926\tblouse\nn02855089\tblower\nn02855390\tblowtorch, torch, blowlamp\nn02855701\tblucher\nn02855793\tbludgeon\nn02855925\tblue\nn02856013\tblue chip\nn02856237\tblunderbuss\nn02856362\tblunt file\nn02857365\tboarding\nn02857477\tboarding house, boardinghouse\nn02857644\tboardroom, council chamber\nn02857907\tboards\nn02858304\tboat\nn02859184\tboater, leghorn, Panama, Panama hat, sailor, skimmer, straw hat\nn02859343\tboat hook\nn02859443\tboathouse\nn02859557\tboatswain's chair, bosun's chair\nn02859729\tboat train\nn02859955\tboatyard\nn02860415\tbobbin, spool, reel\nn02860640\tbobby pin, hairgrip, grip\nn02860847\tbobsled, bobsleigh, bob\nn02861022\tbobsled, bobsleigh\nn02861147\tbocce ball, bocci ball, boccie ball\nn02861286\tbodega\nn02861387\tbodice\nn02861509\tbodkin, threader\nn02861658\tbodkin\nn02861777\tbodkin\nn02861886\tbody\nn02862048\tbody armor, body armour, suit of armor, suit of armour, coat of mail, cataphract\nn02862916\tbody lotion\nn02863014\tbody stocking\nn02863176\tbody plethysmograph\nn02863340\tbody pad\nn02863426\tbodywork\nn02863536\tBofors gun\nn02863638\tbogy, bogie, bogey\nn02863750\tboiler, steam boiler\nn02864122\tboiling water reactor, 
BWR\nn02864504\tbolero\nn02864593\tbollard, bitt\nn02864987\tbolo, bolo knife\nn02865351\tbolo tie, bolo, bola tie, bola\nn02865665\tbolt\nn02865931\tbolt, deadbolt\nn02866106\tbolt\nn02866386\tbolt cutter\nn02866578\tbomb\nn02867401\tbombazine\nn02867592\tbomb calorimeter, bomb\nn02867715\tbomber\nn02867966\tbomber jacket\nn02868240\tbomblet, cluster bomblet\nn02868429\tbomb rack\nn02868546\tbombshell\nn02868638\tbomb shelter, air-raid shelter, bombproof\nn02868975\tbone-ash cup, cupel, refractory pot\nn02869155\tbone china\nn02869249\tbones, castanets, clappers, finger cymbals\nn02869563\tboneshaker\nn02869737\tbongo, bongo drum\nn02869837\tbonnet, poke bonnet\nn02870526\tbook\nn02870676\tbook bag\nn02870772\tbookbindery\nn02870880\tbookcase\nn02871005\tbookend\nn02871147\tbookmark, bookmarker\nn02871314\tbookmobile\nn02871439\tbookshelf\nn02871525\tbookshop, bookstore, bookstall\nn02871631\tboom\nn02871824\tboom, microphone boom\nn02871963\tboomerang, throwing stick, throw stick\nn02872333\tbooster, booster rocket, booster unit, takeoff booster, takeoff rocket\nn02872529\tbooster, booster amplifier, booster station, relay link, relay station, relay transmitter\nn02872752\tboot\nn02873520\tboot\nn02873623\tboot camp\nn02873733\tbootee, bootie\nn02873839\tbooth, cubicle, stall, kiosk\nn02874086\tbooth\nn02874214\tbooth\nn02874336\tboothose\nn02874442\tbootjack\nn02874537\tbootlace\nn02874642\tbootleg\nn02874750\tbootstrap\nn02875436\tbore bit, borer, rock drill, stone drill\nn02875626\tboron chamber\nn02875948\tborstal\nn02876084\tbosom\nn02876326\tBoston rocker\nn02876457\tbota\nn02876657\tbottle\nn02877266\tbottle, feeding bottle, nursing bottle\nn02877513\tbottle bank\nn02877642\tbottlebrush\nn02877765\tbottlecap\nn02877962\tbottle opener\nn02878107\tbottling plant\nn02878222\tbottom, freighter, merchantman, merchant ship\nn02878425\tboucle\nn02878534\tboudoir\nn02878628\tboulle, boule, buhl\nn02878796\tbouncing betty\nn02879087\tbouquet, corsage, posy, 
nosegay\nn02879309\tboutique, dress shop\nn02879422\tboutonniere\nn02879517\tbow\nn02879718\tbow\nn02880189\tbow, bowknot\nn02880393\tbow and arrow\nn02880546\tbowed stringed instrument, string\nn02880842\tBowie knife\nn02880940\tbowl\nn02881193\tbowl\nn02881546\tbowl\nn02881757\tbowler hat, bowler, derby hat, derby, plug hat\nn02881906\tbowline, bowline knot\nn02882190\tbowling alley\nn02882301\tbowling ball, bowl\nn02882483\tbowling equipment\nn02882647\tbowling pin, pin\nn02882894\tbowling shoe\nn02883004\tbowsprit\nn02883101\tbowstring\nn02883205\tbow tie, bow-tie, bowtie\nn02883344\tbox\nn02884225\tbox, loge\nn02884450\tbox, box seat\nn02884859\tbox beam, box girder\nn02884994\tbox camera, box Kodak\nn02885108\tboxcar\nn02885233\tbox coat\nn02885338\tboxing equipment\nn02885462\tboxing glove, glove\nn02885882\tbox office, ticket office, ticket booth\nn02886321\tbox spring\nn02886434\tbox wrench, box end wrench\nn02886599\tbrace, bracing\nn02887079\tbrace, braces, orthodontic braces\nn02887209\tbrace\nn02887489\tbrace, suspender, gallus\nn02887832\tbrace and bit\nn02887970\tbracelet, bangle\nn02888270\tbracer, armguard\nn02888429\tbrace wrench\nn02888569\tbracket, wall bracket\nn02888898\tbradawl, pricker\nn02889425\tbrake\nn02889646\tbrake\nn02889856\tbrake band\nn02889996\tbrake cylinder, hydraulic brake cylinder, master cylinder\nn02890188\tbrake disk\nn02890351\tbrake drum, drum\nn02890513\tbrake lining\nn02890662\tbrake pad\nn02890804\tbrake pedal\nn02890940\tbrake shoe, shoe, skid\nn02891188\tbrake system, brakes\nn02891788\tbrass, brass instrument\nn02892201\tbrass, memorial tablet, plaque\nn02892304\tbrass\nn02892392\tbrassard\nn02892499\tbrasserie\nn02892626\tbrassie\nn02892767\tbrassiere, bra, bandeau\nn02892948\tbrass knucks, knucks, brass knuckles, knuckles, knuckle duster\nn02893269\tbrattice\nn02893418\tbrazier, brasier\nn02893608\tbreadbasket\nn02893692\tbread-bin, breadbox\nn02893941\tbread knife\nn02894024\tbreakable\nn02894158\tbreakfast area, 
breakfast nook\nn02894337\tbreakfast table\nn02894605\tbreakwater, groin, groyne, mole, bulwark, seawall, jetty\nn02894847\tbreast drill\nn02895008\tbreast implant\nn02895154\tbreastplate, aegis, egis\nn02895328\tbreast pocket\nn02895438\tbreathalyzer, breathalyser\nn02896074\tbreechblock, breech closer\nn02896294\tbreechcloth, breechclout, loincloth\nn02896442\tbreeches, knee breeches, knee pants, knickerbockers, knickers\nn02896694\tbreeches buoy\nn02896856\tbreechloader\nn02896949\tbreeder reactor\nn02897097\tBren, Bren gun\nn02897389\tbrewpub\nn02897820\tbrick\nn02898093\tbrickkiln\nn02898173\tbricklayer's hammer\nn02898269\tbrick trowel, mason's trowel\nn02898369\tbrickwork\nn02898585\tbridal gown, wedding gown, wedding dress\nn02898711\tbridge, span\nn02899439\tbridge, nosepiece\nn02900160\tbridle\nn02900459\tbridle path, bridle road\nn02900594\tbridoon\nn02900705\tbriefcase\nn02900857\tbriefcase bomb\nn02900987\tbriefcase computer\nn02901114\tbriefs, Jockey shorts\nn02901259\tbrig\nn02901377\tbrig\nn02901481\tbrigandine\nn02901620\tbrigantine, hermaphrodite brig\nn02901793\tbrilliantine\nn02901901\tbrilliant pebble\nn02902079\tbrim\nn02902687\tbristle brush\nn02902816\tbritches\nn02902916\tbroad arrow\nn02903006\tbroadax, broadaxe\nn02903126\tbrochette\nn02903204\tbroadcaster, spreader\nn02903727\tbroadcloth\nn02903852\tbroadcloth\nn02904109\tbroad hatchet\nn02904233\tbroadloom\nn02904505\tbroadside\nn02904640\tbroadsword\nn02904803\tbrocade\nn02904927\tbrogan, brogue, clodhopper, work shoe\nn02905036\tbroiler\nn02905152\tbroken arch\nn02905886\tbronchoscope\nn02906734\tbroom\nn02906963\tbroom closet\nn02907082\tbroomstick, broom handle\nn02907296\tbrougham\nn02907391\tBrowning automatic rifle, BAR\nn02907656\tBrowning machine gun, Peacemaker\nn02907873\tbrownstone\nn02908123\tbrunch coat\nn02908217\tbrush\nn02908773\tBrussels carpet\nn02908951\tBrussels lace\nn02909053\tbubble\nn02909165\tbubble chamber\nn02909285\tbubble jet printer, bubble-jet printer, 
bubblejet\nn02909706\tbuckboard\nn02909870\tbucket, pail\nn02910145\tbucket seat\nn02910241\tbucket shop\nn02910353\tbuckle\nn02910542\tbuckram\nn02910701\tbucksaw\nn02910864\tbuckskins\nn02910964\tbuff, buffer\nn02911332\tbuffer, polisher\nn02911485\tbuffer, buffer storage, buffer store\nn02912065\tbuffet, counter, sideboard\nn02912319\tbuffing wheel\nn02912557\tbuggy, roadster\nn02912894\tbugle\nn02913152\tbuilding, edifice\nn02914991\tbuilding complex, complex\nn02915904\tbulldog clip, alligator clip\nn02916065\tbulldog wrench\nn02916179\tbulldozer, dozer\nn02916350\tbullet, slug\nn02916936\tbulletproof vest\nn02917067\tbullet train, bullet\nn02917377\tbullhorn, loud hailer, loud-hailer\nn02917521\tbullion\nn02917607\tbullnose, bullnosed plane\nn02917742\tbullpen, detention cell, detention centre\nn02917964\tbullpen\nn02918112\tbullring\nn02918330\tbulwark\nn02918455\tbumboat\nn02918595\tbumper\nn02918831\tbumper\nn02918964\tbumper car, Dodgem\nn02919148\tbumper guard\nn02919308\tbumper jack\nn02919414\tbundle, sheaf\nn02919648\tbung, spile\nn02919792\tbungalow, cottage\nn02919890\tbungee, bungee cord\nn02919976\tbunghole\nn02920083\tbunk\nn02920164\tbunk, feed bunk\nn02920259\tbunk bed, bunk\nn02920369\tbunker, sand trap, trap\nn02920503\tbunker, dugout\nn02920658\tbunker\nn02921029\tbunsen burner, bunsen, etna\nn02921195\tbunting\nn02921292\tbur, burr\nn02921406\tBurberry\nn02921592\tburette, buret\nn02921756\tburglar alarm\nn02921884\tburial chamber, sepulcher, sepulchre, sepulture\nn02922159\tburial garment\nn02922292\tburial mound, grave mound, barrow, tumulus\nn02922461\tburin\nn02922578\tburqa, burka\nn02922798\tburlap, gunny\nn02922877\tburn bag\nn02923129\tburner\nn02923535\tburnous, burnoose, burnouse\nn02923682\tburp gun, machine pistol\nn02923915\tburr\nn02924116\tbus, autobus, coach, charabanc, double-decker, jitney, motorbus, motorcoach, omnibus, passenger vehicle\nn02925009\tbushel basket\nn02925107\tbushing, cylindrical lining\nn02925385\tbush 
jacket\nn02925519\tbusiness suit\nn02925666\tbuskin, combat boot, desert boot, half boot, top boot\nn02926426\tbustier\nn02926591\tbustle\nn02927053\tbutcher knife\nn02927161\tbutcher shop, meat market\nn02927764\tbutter dish\nn02927887\tbutterfly valve\nn02928049\tbutter knife\nn02928299\tbutt hinge\nn02928413\tbutt joint, butt\nn02928608\tbutton\nn02929184\tbuttonhook\nn02929289\tbuttress, buttressing\nn02929462\tbutt shaft\nn02929582\tbutt weld, butt-weld\nn02929923\tbuzz bomb, robot bomb, flying bomb, doodlebug, V-1\nn02930080\tbuzzer\nn02930214\tBVD, BVD's\nn02930339\tbypass condenser, bypass capacitor\nn02930645\tbyway, bypath, byroad\nn02930766\tcab, hack, taxi, taxicab\nn02931013\tcab, cabriolet\nn02931148\tcab\nn02931294\tcabana\nn02931417\tcabaret, nightclub, night club, club, nightspot\nn02931836\tcaber\nn02932019\tcabin\nn02932400\tcabin\nn02932523\tcabin car, caboose\nn02932693\tcabin class, second class, economy class\nn02932891\tcabin cruiser, cruiser, pleasure boat, pleasure craft\nn02933112\tcabinet\nn02933340\tcabinet, console\nn02933462\tcabinet, locker, storage locker\nn02933649\tcabinetwork\nn02933750\tcabin liner\nn02933990\tcable, cable television, cable system, cable television service\nn02934168\tcable, line, transmission line\nn02934451\tcable car, car\nn02935017\tcache, memory cache\nn02935387\tcaddy, tea caddy\nn02935490\tcaesium clock\nn02935658\tcafe, coffeehouse, coffee shop, coffee bar\nn02935891\tcafeteria\nn02936176\tcafeteria tray\nn02936281\tcaff\nn02936402\tcaftan, kaftan\nn02936570\tcaftan, kaftan\nn02936714\tcage, coop\nn02936921\tcage\nn02937010\tcagoule\nn02937336\tcaisson\nn02937958\tcalash, caleche, calash top\nn02938218\tcalceus\nn02938321\tcalcimine\nn02938886\tcalculator, calculating machine\nn02939185\tcaldron, cauldron\nn02939763\tcalico\nn02939866\tcaliper, calliper\nn02940289\tcall-board\nn02940385\tcall center, call centre\nn02940570\tcaller ID\nn02940706\tcalliope, steam 
organ\nn02941095\tcalorimeter\nn02941228\tcalpac, calpack, kalpac\nn02941845\tcamail, aventail, ventail\nn02942015\tcamber arch\nn02942147\tcambric\nn02942349\tcamcorder\nn02942460\tcamel's hair, camelhair\nn02942699\tcamera, photographic camera\nn02943241\tcamera lens, optical lens\nn02943465\tcamera lucida\nn02943686\tcamera obscura\nn02943871\tcamera tripod\nn02943964\tcamise\nn02944075\tcamisole\nn02944146\tcamisole, underbodice\nn02944256\tcamlet\nn02944459\tcamouflage\nn02944579\tcamouflage, camo\nn02944826\tcamp, encampment, cantonment, bivouac\nn02945161\tcamp\nn02945813\tcamp, refugee camp\nn02945964\tcampaign hat\nn02946127\tcampanile, belfry\nn02946270\tcamp chair\nn02946348\tcamper, camping bus, motor home\nn02946509\tcamper trailer\nn02946753\tcampstool\nn02946824\tcamshaft\nn02946921\tcan, tin, tin can\nn02947212\tcanal\nn02947660\tcanal boat, narrow boat, narrowboat\nn02947818\tcandelabrum, candelabra\nn02947977\tcandid camera\nn02948072\tcandle, taper, wax light\nn02948293\tcandlepin\nn02948403\tcandlesnuffer\nn02948557\tcandlestick, candle holder\nn02948834\tcandlewick\nn02948942\tcandy thermometer\nn02949084\tcane\nn02949202\tcane\nn02949356\tcangue\nn02949542\tcanister, cannister, tin\nn02950018\tcannery\nn02950120\tcannikin\nn02950186\tcannikin\nn02950256\tcannon\nn02950482\tcannon\nn02950632\tcannon\nn02950826\tcannon\nn02950943\tcannonball, cannon ball, round shot\nn02951358\tcanoe\nn02951585\tcan opener, tin opener\nn02951703\tcanopic jar, canopic vase\nn02951843\tcanopy\nn02952109\tcanopy\nn02952237\tcanopy\nn02952374\tcanteen\nn02952485\tcanteen\nn02952585\tcanteen\nn02952674\tcanteen, mobile canteen\nn02952798\tcanteen\nn02952935\tcant hook\nn02953056\tcantilever\nn02953197\tcantilever bridge\nn02953455\tcantle\nn02953552\tCanton crepe\nn02953673\tcanvas, canvass\nn02953850\tcanvas, canvass\nn02954163\tcanvas tent, canvas, canvass\nn02954340\tcap\nn02954938\tcap\nn02955065\tcap\nn02955247\tcapacitor, capacitance, condenser, electrical 
condenser\nn02955540\tcaparison, trapping, housing\nn02955767\tcape, mantle\nn02956393\tcapital ship\nn02956699\tcapitol\nn02956795\tcap opener\nn02956883\tcapote, hooded cloak\nn02957008\tcapote, hooded coat\nn02957135\tcap screw\nn02957252\tcapstan\nn02957427\tcapstone, copestone, coping stone, stretcher\nn02957755\tcapsule\nn02957862\tcaptain's chair\nn02958343\tcar, auto, automobile, machine, motorcar\nn02959942\tcar, railcar, railway car, railroad car\nn02960352\tcar, elevator car\nn02960690\tcarabiner, karabiner, snap ring\nn02960903\tcarafe, decanter\nn02961035\tcaravansary, caravanserai, khan, caravan inn\nn02961225\tcar battery, automobile battery\nn02961451\tcarbine\nn02961544\tcar bomb\nn02961947\tcarbon arc lamp, carbon arc\nn02962061\tcarboy\nn02962200\tcarburetor, carburettor\nn02962414\tcar carrier\nn02962843\tcardcase\nn02962938\tcardiac monitor, heart monitor\nn02963159\tcardigan\nn02963302\tcard index, card catalog, card catalogue\nn02963503\tcardiograph, electrocardiograph\nn02963692\tcardioid microphone\nn02963821\tcar door\nn02963987\tcardroom\nn02964075\tcard table\nn02964196\tcard table\nn02964295\tcar-ferry\nn02964634\tcargo area, cargo deck, cargo hold, hold, storage area\nn02964843\tcargo container\nn02964934\tcargo door\nn02965024\tcargo hatch\nn02965122\tcargo helicopter\nn02965216\tcargo liner\nn02965300\tcargo ship, cargo vessel\nn02965529\tcarillon\nn02965783\tcar mirror\nn02966068\tcaroche\nn02966193\tcarousel, carrousel, merry-go-round, roundabout, whirligig\nn02966545\tcarpenter's hammer, claw hammer, clawhammer\nn02966687\tcarpenter's kit, tool kit\nn02966786\tcarpenter's level\nn02966942\tcarpenter's mallet\nn02967081\tcarpenter's rule\nn02967170\tcarpenter's square\nn02967294\tcarpetbag\nn02967407\tcarpet beater, rug beater\nn02967540\tcarpet loom\nn02967626\tcarpet pad, rug pad, underlay, underlayment\nn02967782\tcarpet sweeper, sweeper\nn02967991\tcarpet tack\nn02968074\tcarport, car port\nn02968210\tcarrack, 
carack\nn02968333\tcarrel, carrell, cubicle, stall\nn02968473\tcarriage, equipage, rig\nn02969010\tcarriage\nn02969163\tcarriage bolt\nn02969323\tcarriageway\nn02969527\tcarriage wrench\nn02969634\tcarrick bend\nn02969886\tcarrier\nn02970408\tcarryall, holdall, tote, tote bag\nn02970534\tcarrycot\nn02970685\tcar seat\nn02970849\tcart\nn02971167\tcar tire, automobile tire, auto tire, rubber tire\nn02971356\tcarton\nn02971473\tcartouche, cartouch\nn02971579\tcar train\nn02971691\tcartridge\nn02971940\tcartridge, pickup\nn02972397\tcartridge belt\nn02972714\tcartridge extractor, cartridge remover, extractor\nn02972934\tcartridge fuse\nn02973017\tcartridge holder, cartridge clip, clip, magazine\nn02973236\tcartwheel\nn02973805\tcarving fork\nn02973904\tcarving knife\nn02974003\tcar wheel\nn02974348\tcaryatid\nn02974454\tcascade liquefier\nn02974565\tcascade transformer\nn02974697\tcase\nn02975212\tcase, display case, showcase, vitrine\nn02975589\tcase, compositor's case, typesetter's case\nn02975994\tcasein paint, casein\nn02976123\tcase knife, sheath knife\nn02976249\tcase knife\nn02976350\tcasement\nn02976455\tcasement window\nn02976552\tcasern\nn02976641\tcase shot, canister, canister shot\nn02976815\tcash bar\nn02976939\tcashbox, money box, till\nn02977058\tcash machine, cash dispenser, automated teller machine, automatic teller machine, automated teller, automatic teller, ATM\nn02977330\tcashmere\nn02977438\tcash register, register\nn02977619\tcasing, case\nn02977936\tcasino, gambling casino\nn02978055\tcasket, jewel casket\nn02978205\tcasque\nn02978367\tcasquet, casquetel\nn02978478\tCassegrainian telescope, Gregorian telescope\nn02978753\tcasserole\nn02978881\tcassette\nn02979074\tcassette deck\nn02979186\tcassette player\nn02979290\tcassette recorder\nn02979399\tcassette tape\nn02979516\tcassock\nn02979836\tcast, plaster cast, plaster bandage\nn02980036\tcaster, castor\nn02980203\tcaster, castor\nn02980441\tcastle\nn02980625\tcastle, 
rook\nn02981024\tcatacomb\nn02981198\tcatafalque\nn02981321\tcatalytic converter\nn02981565\tcatalytic cracker, cat cracker\nn02981792\tcatamaran\nn02981911\tcatapult, arbalest, arbalist, ballista, bricole, mangonel, onager, trebuchet, trebucket\nn02982232\tcatapult, launcher\nn02982416\tcatboat\nn02982515\tcat box\nn02982599\tcatch\nn02983072\tcatchall\nn02983189\tcatcher's mask\nn02983357\tcatchment\nn02983507\tCaterpillar, cat\nn02983904\tcathedra, bishop's throne\nn02984061\tcathedral\nn02984203\tcathedral, duomo\nn02984469\tcatheter\nn02984699\tcathode\nn02985137\tcathode-ray tube, CRT\nn02985606\tcat-o'-nine-tails, cat\nn02985828\tcat's-paw\nn02985963\tcatsup bottle, ketchup bottle\nn02986066\tcattle car\nn02986160\tcattle guard, cattle grid\nn02986348\tcattleship, cattle boat\nn02987047\tcautery, cauterant\nn02987379\tcavalier hat, slouch hat\nn02987492\tcavalry sword, saber, sabre\nn02987706\tcavetto\nn02987823\tcavity wall\nn02987950\tC battery\nn02988066\tC-clamp\nn02988156\tCD drive\nn02988304\tCD player\nn02988486\tCD-R, compact disc recordable, CD-WO, compact disc write-once\nn02988679\tCD-ROM, compact disc read-only memory\nn02988963\tCD-ROM drive\nn02989099\tcedar chest\nn02990373\tceiling\nn02990758\tcelesta\nn02991048\tcell, electric cell\nn02991302\tcell, jail cell, prison cell\nn02991847\tcellar, wine cellar\nn02992032\tcellblock, ward\nn02992211\tcello, violoncello\nn02992368\tcellophane\nn02992529\tcellular telephone, cellular phone, cellphone, cell, mobile phone\nn02992795\tcellulose tape, Scotch tape, Sellotape\nn02993194\tcenotaph, empty tomb\nn02993368\tcenser, thurible\nn02993546\tcenter, centre\nn02994573\tcenter punch\nn02994743\tCentigrade thermometer\nn02995345\tcentral processing unit, CPU, C.P.U., central processor, processor, mainframe\nn02995871\tcentrifugal pump\nn02995998\tcentrifuge, extractor, separator\nn02997391\tceramic\nn02997607\tceramic ware\nn02997910\tcereal bowl\nn02998003\tcereal 
box\nn02998107\tcerecloth\nn02998563\tcesspool, cesspit, sink, sump\nn02998696\tchachka, tsatske, tshatshke, tchotchke\nn02998841\tchador, chadar, chaddar, chuddar\nn02999138\tchafing dish\nn02999410\tchain\nn02999936\tchain\nn03000134\tchainlink fence\nn03000247\tchain mail, ring mail, mail, chain armor, chain armour, ring armor, ring armour\nn03000530\tchain printer\nn03000684\tchain saw, chainsaw\nn03001115\tchain store\nn03001282\tchain tongs\nn03001540\tchain wrench\nn03001627\tchair\nn03002096\tchair\nn03002210\tchair of state\nn03002341\tchairlift, chair lift\nn03002555\tchaise, shay\nn03002711\tchaise longue, chaise, daybed\nn03002816\tchalet\nn03002948\tchalice, goblet\nn03003091\tchalk\nn03003633\tchallis\nn03004275\tchamberpot, potty, thunder mug\nn03004409\tchambray\nn03004531\tchamfer bit\nn03004620\tchamfer plane\nn03004713\tchamois cloth\nn03004824\tchancel, sanctuary, bema\nn03005033\tchancellery\nn03005147\tchancery\nn03005285\tchandelier, pendant, pendent\nn03005515\tchandlery\nn03005619\tchanfron, chamfron, testiere, frontstall, front-stall\nn03006626\tchanter, melody pipe\nn03006788\tchantry\nn03006903\tchap\nn03007130\tchapel\nn03007297\tchapterhouse, fraternity house, frat house\nn03007444\tchapterhouse\nn03007591\tcharacter printer, character-at-a-time printer, serial printer\nn03008177\tcharcuterie\nn03008817\tcharge-exchange accelerator\nn03008976\tcharger, battery charger\nn03009111\tchariot\nn03009269\tchariot\nn03009794\tcharnel house, charnel\nn03010473\tchassis\nn03010656\tchassis\nn03010795\tchasuble\nn03010915\tchateau\nn03011018\tchatelaine\nn03011355\tchecker, chequer\nn03011741\tcheckout, checkout counter\nn03012013\tcheekpiece\nn03012159\tcheeseboard, cheese tray\nn03012373\tcheesecloth\nn03012499\tcheese cutter\nn03012644\tcheese press\nn03012734\tchemical bomb, gas bomb\nn03012897\tchemical plant\nn03013006\tchemical reactor\nn03013438\tchemise, sack, shift\nn03013580\tchemise, shimmy, shift, slip, 
teddy\nn03013850\tchenille\nn03014440\tchessman, chess piece\nn03014705\tchest\nn03015149\tchesterfield\nn03015254\tchest of drawers, chest, bureau, dresser\nn03015478\tchest protector\nn03015631\tcheval-de-frise, chevaux-de-frise\nn03015851\tcheval glass\nn03016209\tchicane\nn03016389\tchicken coop, coop, hencoop, henhouse\nn03016609\tchicken wire\nn03016737\tchicken yard, hen yard, chicken run, fowl run\nn03016868\tchiffon\nn03016953\tchiffonier, commode\nn03017070\tchild's room\nn03017168\tchime, bell, gong\nn03017698\tchimney breast\nn03017835\tchimney corner, inglenook\nn03018209\tchina\nn03018349\tchina cabinet, china closet\nn03018614\tchinchilla\nn03018712\tChinese lantern\nn03018848\tChinese puzzle\nn03019198\tchinning bar\nn03019304\tchino\nn03019434\tchino\nn03019685\tchin rest\nn03019806\tchin strap\nn03019938\tchintz\nn03020034\tchip, microchip, micro chip, silicon chip, microprocessor chip\nn03020416\tchip, poker chip\nn03020692\tchisel\nn03021228\tchlamys\nn03024064\tchoir\nn03024233\tchoir loft\nn03024333\tchoke\nn03024518\tchoke, choke coil, choking coil\nn03025070\tchokey, choky\nn03025165\tchoo-choo\nn03025250\tchopine, platform\nn03025886\tchordophone\nn03026506\tChristmas stocking\nn03026907\tchronograph\nn03027001\tchronometer\nn03027108\tchronoscope\nn03027250\tchuck\nn03027505\tchuck wagon\nn03027625\tchukka, chukka boot\nn03028079\tchurch, church building\nn03028596\tchurch bell\nn03028785\tchurch hat\nn03029066\tchurch key\nn03029197\tchurch tower\nn03029296\tchuridars\nn03029445\tchurn, butter churn\nn03029925\tciderpress\nn03030262\tcigar band\nn03030353\tcigar box\nn03030557\tcigar cutter\nn03030880\tcigarette butt\nn03031012\tcigarette case\nn03031152\tcigarette holder\nn03031422\tcigar lighter, cigarette lighter, pocket lighter\nn03031756\tcinch, girth\nn03032252\tcinema, movie theater, movie theatre, movie house, picture palace\nn03032453\tcinquefoil\nn03032811\tcircle, round\nn03033267\tcirclet\nn03033362\tcircuit, electrical 
circuit, electric circuit\nn03033986\tcircuit board, circuit card, board, card, plug-in, add-in\nn03034244\tcircuit breaker, breaker\nn03034405\tcircuitry\nn03034516\tcircular plane, compass plane\nn03034663\tcircular saw, buzz saw\nn03035252\tcircus tent, big top, round top, top\nn03035510\tcistern\nn03035715\tcistern, water tank\nn03035832\tcittern, cithern, cither, citole, gittern\nn03036022\tcity hall\nn03036149\tcityscape\nn03036244\tcity university\nn03036341\tcivies, civvies\nn03036469\tcivilian clothing, civilian dress, civilian garb, plain clothes\nn03036701\tclack valve, clack, clapper valve\nn03036866\tclamp, clinch\nn03037108\tclamshell, grapple\nn03037228\tclapper, tongue\nn03037404\tclapperboard\nn03037590\tclarence\nn03037709\tclarinet\nn03038041\tClark cell, Clark standard cell\nn03038281\tclasp\nn03038480\tclasp knife, jackknife\nn03038685\tclassroom, schoolroom\nn03038870\tclavichord\nn03039015\tclavier, Klavier\nn03039259\tclay pigeon\nn03039353\tclaymore mine, claymore\nn03039493\tclaymore\nn03039827\tcleaners, dry cleaners\nn03039947\tcleaning implement, cleaning device, cleaning equipment\nn03040229\tcleaning pad\nn03040376\tclean room, white room\nn03040836\tclearway\nn03041114\tcleat\nn03041265\tcleat\nn03041449\tcleats\nn03041632\tcleaver, meat cleaver, chopper\nn03041810\tclerestory, clearstory\nn03042139\tclevis\nn03042384\tclews\nn03042490\tcliff dwelling\nn03042697\tclimbing frame\nn03042829\tclinch\nn03042984\tclinch, clench\nn03043173\tclincher\nn03043274\tclinic\nn03043423\tclinical thermometer, mercury-in-glass clinical thermometer\nn03043693\tclinker, clinker brick\nn03043798\tclinometer, inclinometer\nn03043958\tclip\nn03044671\tclip lead\nn03044801\tclip-on\nn03044934\tclipper\nn03045074\tclipper\nn03045228\tclipper, clipper ship\nn03045337\tcloak\nn03045698\tcloak\nn03045800\tcloakroom, coatroom\nn03046029\tcloche\nn03046133\tcloche\nn03046257\tclock\nn03046802\tclock pendulum\nn03046921\tclock radio\nn03047052\tclock 
tower\nn03047171\tclockwork\nn03047690\tclog, geta, patten, sabot\nn03047799\tcloisonne\nn03047941\tcloister\nn03048883\tclosed circuit, loop\nn03049066\tclosed-circuit television\nn03049326\tclosed loop, closed-loop system\nn03049457\tcloset\nn03049782\tcloseup lens\nn03049924\tcloth cap, flat cap\nn03050026\tcloth covering\nn03050453\tclothesbrush\nn03050546\tclothes closet, clothespress\nn03050655\tclothes dryer, clothes drier\nn03050864\tclothes hamper, laundry basket, clothes basket, voider\nn03051041\tclotheshorse\nn03051249\tclothespin, clothes pin, clothes peg\nn03051396\tclothes tree, coat tree, coat stand\nn03051540\tclothing, article of clothing, vesture, wear, wearable, habiliment\nn03052464\tclothing store, haberdashery, haberdashery store, mens store\nn03052917\tclout nail, clout\nn03053047\tclove hitch\nn03053976\tclub car, lounge car\nn03054491\tclubroom\nn03054605\tcluster bomb\nn03054901\tclutch\nn03055159\tclutch, clutch pedal\nn03055418\tclutch bag, clutch\nn03055670\tcoach, four-in-hand, coach-and-four\nn03055857\tcoach house, carriage house, remise\nn03056097\tcoal car\nn03056215\tcoal chute\nn03056288\tcoal house\nn03056493\tcoal shovel\nn03056583\tcoaming\nn03056873\tcoaster brake\nn03057021\tcoat\nn03057541\tcoat button\nn03057636\tcoat closet\nn03057724\tcoatdress\nn03057841\tcoatee\nn03057920\tcoat hanger, clothes hanger, dress hanger\nn03058107\tcoating, coat\nn03058603\tcoating\nn03058949\tcoat of paint\nn03059103\tcoatrack, coat rack, hatrack\nn03059236\tcoattail\nn03059366\tcoaxial cable, coax, coax cable\nn03059685\tcobweb\nn03059934\tcobweb\nn03060728\tCockcroft and Walton accelerator, Cockcroft-Walton accelerator, Cockcroft and Walton voltage multiplier, Cockcroft-Walton voltage multiplier\nn03061050\tcocked hat\nn03061211\tcockhorse\nn03061345\tcockleshell\nn03061505\tcockpit\nn03061674\tcockpit\nn03061819\tcockpit\nn03061893\tcockscomb, coxcomb\nn03062015\tcocktail dress, sheath\nn03062122\tcocktail lounge\nn03062245\tcocktail 
shaker\nn03062336\tcocotte\nn03062651\tcodpiece\nn03062798\tcoelostat\nn03062985\tcoffee can\nn03063073\tcoffee cup\nn03063199\tcoffee filter\nn03063338\tcoffee maker\nn03063485\tcoffee mill, coffee grinder\nn03063599\tcoffee mug\nn03063689\tcoffeepot\nn03063834\tcoffee stall\nn03063968\tcoffee table, cocktail table\nn03064250\tcoffee urn\nn03064350\tcoffer\nn03064562\tCoffey still\nn03064758\tcoffin, casket\nn03064935\tcog, sprocket\nn03065243\tcoif\nn03065424\tcoil, spiral, volute, whorl, helix\nn03065708\tcoil\nn03066232\tcoil\nn03066359\tcoil spring, volute spring\nn03066464\tcoin box\nn03066849\tcolander, cullender\nn03067093\tcold cathode\nn03067212\tcold chisel, set chisel\nn03067339\tcold cream, coldcream, face cream, vanishing cream\nn03067518\tcold frame\nn03068181\tcollar, neckband\nn03068998\tcollar\nn03069752\tcollege\nn03070059\tcollet, collet chuck\nn03070193\tcollider\nn03070396\tcolliery, pit\nn03070587\tcollimator\nn03070854\tcollimator\nn03071021\tcologne, cologne water, eau de cologne\nn03071160\tcolonnade\nn03071288\tcolonoscope\nn03071552\tcolorimeter, tintometer\nn03072056\tcolors, colours\nn03072201\tcolor television, colour television, color television system, colour television system, color TV, colour TV\nn03072440\tcolor tube, colour tube, color television tube, colour television tube, color TV tube, colour TV tube\nn03072682\tcolor wash, colour wash\nn03073296\tColt\nn03073384\tcolter, coulter\nn03073545\tcolumbarium\nn03073694\tcolumbarium, cinerarium\nn03073977\tcolumn, pillar\nn03074380\tcolumn, pillar\nn03074855\tcomb\nn03075097\tcomb\nn03075248\tcomber\nn03075370\tcombination lock\nn03075500\tcombination plane\nn03075634\tcombine\nn03075768\tcomforter, pacifier, baby's dummy, teething ring\nn03075946\tcommand module\nn03076411\tcommissary\nn03076623\tcommissary\nn03076708\tcommodity, trade good, good\nn03077442\tcommon ax, common axe, Dayton ax, Dayton axe\nn03077616\tcommon room\nn03077741\tcommunications 
satellite\nn03078287\tcommunication system\nn03078506\tcommunity center, civic center\nn03078670\tcommutator\nn03078802\tcommuter, commuter train\nn03078995\tcompact, powder compact\nn03079136\tcompact, compact car\nn03079230\tcompact disk, compact disc, CD\nn03079494\tcompact-disk burner, CD burner\nn03079616\tcompanionway\nn03079741\tcompartment\nn03080309\tcompartment\nn03080497\tcompass\nn03080633\tcompass\nn03080731\tcompass card, mariner's compass\nn03080904\tcompass saw\nn03081859\tcompound\nn03081986\tcompound lens\nn03082127\tcompound lever\nn03082280\tcompound microscope\nn03082450\tcompress\nn03082656\tcompression bandage, tourniquet\nn03082807\tcompressor\nn03082979\tcomputer, computing machine, computing device, data processor, electronic computer, information processing system\nn03084420\tcomputer circuit\nn03084834\tcomputerized axial tomography scanner, CAT scanner\nn03085013\tcomputer keyboard, keypad\nn03085219\tcomputer monitor\nn03085333\tcomputer network\nn03085602\tcomputer screen, computer display\nn03085781\tcomputer store\nn03085915\tcomputer system, computing system, automatic data processing system, ADP system, ADPS\nn03086183\tconcentration camp, stockade\nn03086457\tconcert grand, concert piano\nn03086580\tconcert hall\nn03086670\tconcertina\nn03086868\tconcertina\nn03087069\tconcrete mixer, cement mixer\nn03087245\tcondensation pump, diffusion pump\nn03087366\tcondenser, optical condenser\nn03087521\tcondenser\nn03087643\tcondenser\nn03087816\tcondenser microphone, capacitor microphone\nn03088389\tcondominium\nn03088580\tcondominium, condo\nn03088707\tconductor\nn03089477\tcone clutch, cone friction clutch\nn03089624\tconfectionery, confectionary, candy store\nn03089753\tconference center, conference house\nn03089879\tconference room\nn03090000\tconference table, council table, council board\nn03090172\tconfessional\nn03090437\tconformal projection, orthomorphic projection\nn03090710\tcongress boot, congress shoe, congress 
gaiter\nn03090856\tconic projection, conical projection\nn03091044\tconnecting rod\nn03091223\tconnecting room\nn03091374\tconnection, connexion, connector, connecter, connective\nn03091907\tconning tower\nn03092053\tconning tower\nn03092166\tconservatory, hothouse, indoor garden\nn03092314\tconservatory, conservatoire\nn03092476\tconsole\nn03092656\tconsole\nn03092883\tconsole table, console\nn03093427\tconsulate\nn03093792\tcontact, tangency\nn03094159\tcontact, contact lens\nn03094503\tcontainer\nn03095699\tcontainer ship, containership, container vessel\nn03095965\tcontainment\nn03096439\tcontrabassoon, contrafagotto, double bassoon\nn03096960\tcontrol, controller\nn03097362\tcontrol center\nn03097535\tcontrol circuit, negative feedback circuit\nn03097673\tcontrol key, command key\nn03098140\tcontrol panel, instrument panel, control board, board, panel\nn03098515\tcontrol rod\nn03098688\tcontrol room\nn03098806\tcontrol system\nn03098959\tcontrol tower\nn03099147\tconvector\nn03099274\tconvenience store\nn03099454\tconvent\nn03099622\tconventicle, meetinghouse\nn03099771\tconverging lens, convex lens\nn03099945\tconverter, convertor\nn03100240\tconvertible\nn03100346\tconvertible, sofa bed\nn03100490\tconveyance, transport\nn03100897\tconveyer belt, conveyor belt, conveyer, conveyor, transporter\nn03101156\tcooker\nn03101302\tcookfire\nn03101375\tcookhouse\nn03101517\tcookie cutter\nn03101664\tcookie jar, cooky jar\nn03101796\tcookie sheet, baking tray\nn03101986\tcooking utensil, cookware\nn03102371\tcookstove\nn03102516\tcoolant system\nn03102654\tcooler, ice chest\nn03102859\tcooling system, cooling\nn03103128\tcooling system, engine cooling system\nn03103396\tcooling tower\nn03103563\tcoonskin cap, coonskin\nn03103904\tcope\nn03104019\tcoping saw\nn03104512\tcopperware\nn03105088\tcopyholder\nn03105214\tcoquille\nn03105306\tcoracle\nn03105467\tcorbel, truss\nn03105645\tcorbel arch\nn03105810\tcorbel step, corbie-step, corbiestep, crow 
step\nn03105974\tcorbie gable\nn03106722\tcord, corduroy\nn03106898\tcord, electric cord\nn03107046\tcordage\nn03107488\tcords, corduroys\nn03107716\tcore\nn03108455\tcore bit\nn03108624\tcore drill\nn03108759\tcorer\nn03108853\tcork, bottle cork\nn03109033\tcorker\nn03109150\tcorkscrew, bottle screw\nn03109253\tcorncrib\nn03109693\tcorner, quoin\nn03109881\tcorner, nook\nn03110202\tcorner post\nn03110669\tcornet, horn, trumpet, trump\nn03111041\tcornice\nn03111177\tcornice\nn03111296\tcornice, valance, valance board, pelmet\nn03111690\tcorrectional institution\nn03112240\tcorrugated fastener, wiggle nail\nn03112719\tcorselet, corslet\nn03112869\tcorset, girdle, stays\nn03113152\tcosmetic\nn03113505\tcosmotron\nn03113657\tcostume\nn03113835\tcostume\nn03114041\tcostume\nn03114236\tcostume\nn03114379\tcosy, tea cosy, cozy, tea cozy\nn03114504\tcot, camp bed\nn03114743\tcottage tent\nn03114839\tcotter, cottar\nn03115014\tcotter pin\nn03115180\tcotton\nn03115400\tcotton flannel, Canton flannel\nn03115663\tcotton mill\nn03115762\tcouch\nn03115897\tcouch\nn03116008\tcouchette\nn03116163\tcoude telescope, coude system\nn03116530\tcounter\nn03116767\tcounter, tabulator\nn03117199\tcounter\nn03117642\tcounterbore, countersink, countersink bit\nn03118346\tcounter tube\nn03118969\tcountry house\nn03119203\tcountry store, general store, trading post\nn03119396\tcoupe\nn03119510\tcoupling, coupler\nn03120198\tcourt, courtyard\nn03120491\tcourt\nn03120778\tcourt, courtroom\nn03121040\tcourt\nn03121190\tCourtelle\nn03121298\tcourthouse\nn03121431\tcourthouse\nn03121897\tcoverall\nn03122073\tcovered bridge\nn03122202\tcovered couch\nn03122295\tcovered wagon, Conestoga wagon, Conestoga, prairie wagon, prairie schooner\nn03122748\tcovering\nn03123553\tcoverlet\nn03123666\tcover plate\nn03123809\tcowbarn, cowshed, cow barn, cowhouse, byre\nn03123917\tcowbell\nn03124043\tcowboy boot\nn03124170\tcowboy hat, ten-gallon hat\nn03124313\tcowhide\nn03124474\tcowl\nn03124590\tcow pen, 
cattle pen, corral\nn03125057\tCPU board, mother board\nn03125588\tcrackle, crackleware, crackle china\nn03125729\tcradle\nn03125870\tcraft\nn03126090\tcramp, cramp iron\nn03126385\tcrampon, crampoon, climbing iron, climber\nn03126580\tcrampon, crampoon\nn03126707\tcrane\nn03126927\tcraniometer\nn03127024\tcrank, starter\nn03127203\tcrankcase\nn03127408\tcrankshaft\nn03127531\tcrash barrier\nn03127747\tcrash helmet\nn03127925\tcrate\nn03128085\tcravat\nn03128248\tcrayon, wax crayon\nn03128427\tcrazy quilt\nn03128519\tcream, ointment, emollient\nn03129001\tcream pitcher, creamer\nn03129471\tcreche, foundling hospital\nn03129636\tcreche\nn03129753\tcredenza, credence\nn03129848\tcreel\nn03130066\tcrematory, crematorium, cremation chamber\nn03130233\tcrematory, crematorium\nn03130563\tcrepe, crape\nn03130761\tcrepe de Chine\nn03130866\tcrescent wrench\nn03131193\tcretonne\nn03131574\tcrib, cot\nn03131669\tcrib\nn03131967\tcricket ball\nn03132076\tcricket bat, bat\nn03132261\tcricket equipment\nn03132438\tcringle, eyelet, loop, grommet, grummet\nn03132666\tcrinoline\nn03132776\tcrinoline\nn03133050\tcrochet needle, crochet hook\nn03133415\tcrock, earthenware jar\nn03133878\tCrock Pot\nn03134118\tcrook, shepherd's crook\nn03134232\tCrookes radiometer\nn03134394\tCrookes tube\nn03134739\tcroquet ball\nn03134853\tcroquet equipment\nn03135030\tcroquet mallet\nn03135532\tcross\nn03135656\tcrossbar\nn03135788\tcrossbar\nn03135917\tcrossbar\nn03136051\tcrossbench\nn03136254\tcross bit\nn03136369\tcrossbow\nn03136504\tcrosscut saw, crosscut handsaw, cutoff saw\nn03137473\tcrossjack, mizzen course\nn03137579\tcrosspiece\nn03138128\tcrotchet\nn03138217\tcroupier's rake\nn03138344\tcrowbar, wrecking bar, pry, pry bar\nn03138669\tcrown, diadem\nn03139089\tcrown, crownwork, jacket, jacket crown, cap\nn03139464\tcrown jewels\nn03139640\tcrown lens\nn03139998\tcrow's nest\nn03140126\tcrucible, melting pot\nn03140292\tcrucifix, rood, rood-tree\nn03140431\tcruet, 
crewet\nn03140546\tcruet-stand\nn03140652\tcruise control\nn03140771\tcruise missile\nn03140900\tcruiser\nn03141065\tcruiser, police cruiser, patrol car, police car, prowl car, squad car\nn03141327\tcruise ship, cruise liner\nn03141455\tcrupper\nn03141612\tcruse\nn03141702\tcrusher\nn03141823\tcrutch\nn03142099\tcryometer\nn03142205\tcryoscope\nn03142325\tcryostat\nn03142431\tcrypt\nn03142679\tcrystal, watch crystal, watch glass\nn03143400\tcrystal detector\nn03143572\tcrystal microphone\nn03143754\tcrystal oscillator, quartz oscillator\nn03144156\tcrystal set\nn03144873\tcubitiere\nn03144982\tcucking stool, ducking stool\nn03145147\tcuckoo clock\nn03145277\tcuddy\nn03145384\tcudgel\nn03145522\tcue, cue stick, pool cue, pool stick\nn03145719\tcue ball\nn03145843\tcuff, turnup\nn03146219\tcuirass\nn03146342\tcuisse\nn03146449\tcul, cul de sac, dead end\nn03146560\tculdoscope\nn03146687\tcullis\nn03146777\tculotte\nn03146846\tcultivator, tiller\nn03147084\tculverin\nn03147156\tculverin\nn03147280\tculvert\nn03147509\tcup\nn03148324\tcupboard, closet\nn03148518\tcup hook\nn03148727\tcupola\nn03148808\tcupola\nn03149135\tcurb, curb bit\nn03149401\tcurb roof\nn03149686\tcurbstone, kerbstone\nn03149810\tcurette, curet\nn03150232\tcurler, hair curler, roller, crimper\nn03150511\tcurling iron\nn03150661\tcurrycomb\nn03150795\tcursor, pointer\nn03151077\tcurtain, drape, drapery, mantle, pall\nn03152303\tcustomhouse, customshouse\nn03152951\tcutaway, cutaway drawing, cutaway model\nn03153246\tcutlas, cutlass\nn03153585\tcutoff\nn03153948\tcutout\nn03154073\tcutter, cutlery, cutting tool\nn03154316\tcutter\nn03154446\tcutting implement\nn03154616\tcutting room\nn03154745\tcutty stool\nn03154895\tcutwork\nn03155178\tcybercafe\nn03155502\tcyclopean masonry\nn03155915\tcyclostyle\nn03156071\tcyclotron\nn03156279\tcylinder\nn03156405\tcylinder, piston chamber\nn03156767\tcylinder lock\nn03157348\tcymbal\nn03158186\tdacha\nn03158414\tDacron, 
Terylene\nn03158668\tdado\nn03158796\tdado plane\nn03158885\tdagger, sticker\nn03159535\tdairy, dairy farm\nn03159640\tdais, podium, pulpit, rostrum, ambo, stump, soapbox\nn03160001\tdaisy print wheel, daisy wheel\nn03160186\tdaisywheel printer\nn03160309\tdam, dike, dyke\nn03160740\tdamask\nn03161016\tdampener, moistener\nn03161450\tdamper, muffler\nn03161893\tdamper block, piano damper\nn03162297\tdark lantern, bull's-eye\nn03162460\tdarkroom\nn03162556\tdarning needle, embroidery needle\nn03162714\tdart\nn03162818\tdart\nn03163222\tdashboard, fascia\nn03163381\tdashiki, daishiki\nn03163488\tdash-pot\nn03163798\tdata converter\nn03163973\tdata input device, input device\nn03164192\tdata multiplexer\nn03164344\tdata system, information system\nn03164605\tdavenport\nn03164722\tdavenport\nn03164929\tdavit\nn03165096\tdaybed, divan bed\nn03165211\tdaybook, ledger\nn03165466\tday nursery, day care center\nn03165616\tday school\nn03165823\tdead axle\nn03165955\tdeadeye\nn03166120\tdeadhead\nn03166514\tdeanery\nn03166600\tdeathbed\nn03166685\tdeath camp\nn03166809\tdeath house, death row\nn03166951\tdeath knell, death bell\nn03167153\tdeath seat\nn03167978\tdeck\nn03168107\tdeck\nn03168217\tdeck chair, beach chair\nn03168543\tdeck-house\nn03168663\tdeckle\nn03168774\tdeckle edge, deckle\nn03168933\tdeclinometer, transit declinometer\nn03169063\tdecoder\nn03169176\tdecolletage\nn03170292\tdecoupage\nn03170459\tdedicated file server\nn03170635\tdeep-freeze, Deepfreeze, deep freezer, freezer\nn03170872\tdeerstalker\nn03171228\tdefense system, defence system\nn03171356\tdefensive structure, defense, defence\nn03171635\tdefibrillator\nn03171910\tdefilade\nn03172038\tdeflector\nn03172738\tdelayed action\nn03172965\tdelay line\nn03173270\tdelft\nn03173387\tdelicatessen, deli, food shop\nn03173929\tdelivery truck, delivery van, panel truck\nn03174079\tdelta wing\nn03174450\tdemijohn\nn03174731\tdemitasse\nn03175081\tden\nn03175189\tdenim, dungaree, jean\nn03175301\tdensimeter, 
densitometer\nn03175457\tdensitometer\nn03175604\tdental appliance\nn03175843\tdental floss, floss\nn03175983\tdental implant\nn03176238\tdentist's drill, burr drill\nn03176386\tdenture, dental plate, plate\nn03176594\tdeodorant, deodourant\nn03176763\tdepartment store, emporium\nn03177059\tdeparture lounge\nn03177165\tdepilatory, depilator, epilator\nn03177708\tdepressor\nn03178000\tdepth finder\nn03178173\tdepth gauge, depth gage\nn03178430\tderrick\nn03178538\tderrick\nn03178674\tderringer\nn03179701\tdesk\nn03179910\tdesk phone\nn03180011\tdesktop computer\nn03180384\tdessert spoon\nn03180504\tdestroyer, guided missile destroyer\nn03180732\tdestroyer escort\nn03180865\tdetached house, single dwelling\nn03180969\tdetector, sensor, sensing element\nn03181293\tdetector\nn03181667\tdetention home, detention house, house of detention, detention camp\nn03182140\tdetonating fuse\nn03182232\tdetonator, detonating device, cap\nn03182912\tdeveloper\nn03183080\tdevice\nn03185868\tDewar flask, Dewar\nn03186199\tdhoti\nn03186285\tdhow\nn03186818\tdial, telephone dial\nn03187037\tdial\nn03187153\tdial\nn03187268\tdialog box, panel\nn03187595\tdial telephone, dial phone\nn03187751\tdialyzer, dialysis machine\nn03188290\tdiamante\nn03188531\tdiaper, nappy, napkin\nn03188725\tdiaper\nn03188871\tdiaphone\nn03189083\tdiaphragm, stop\nn03189311\tdiaphragm\nn03189818\tdiathermy machine\nn03190458\tdibble, dibber\nn03191286\tdice cup, dice box\nn03191451\tdicer\nn03191561\tdickey, dickie, dicky, shirtfront\nn03191776\tdickey, dickie, dicky, dickey-seat, dickie-seat, dicky-seat\nn03192543\tDictaphone\nn03192907\tdie\nn03193107\tdiesel, diesel engine, diesel motor\nn03193260\tdiesel-electric locomotive, diesel-electric\nn03193423\tdiesel-hydraulic locomotive, diesel-hydraulic\nn03193597\tdiesel locomotive\nn03193754\tdiestock\nn03194170\tdifferential analyzer\nn03194297\tdifferential gear, differential\nn03194812\tdiffuser, diffusor\nn03194992\tdiffuser, 
diffusor\nn03195332\tdigester\nn03195485\tdiggings, digs, domiciliation, lodgings, pad\nn03195799\tdigital-analog converter, digital-to-analog converter\nn03195959\tdigital audiotape, DAT\nn03196062\tdigital camera\nn03196217\tdigital clock\nn03196324\tdigital computer\nn03196598\tdigital display, alphanumeric display\nn03196990\tdigital subscriber line, DSL\nn03197201\tdigital voltmeter\nn03197337\tdigital watch\nn03197446\tdigitizer, digitiser, analog-digital converter, analog-to-digital converter\nn03198223\tdilator, dilater\nn03198500\tdildo\nn03199358\tdimity\nn03199488\tdimmer\nn03199647\tdiner\nn03199775\tdinette\nn03199901\tdinghy, dory, rowboat\nn03200231\tdining area\nn03200357\tdining car, diner, dining compartment, buffet car\nn03200539\tdining-hall\nn03200701\tdining room, dining-room\nn03200906\tdining-room furniture\nn03201035\tdining-room table\nn03201208\tdining table, board\nn03201529\tdinner bell\nn03201638\tdinner dress, dinner gown, formal, evening gown\nn03201776\tdinner jacket, tux, tuxedo, black tie\nn03201895\tdinner napkin\nn03201996\tdinner pail, dinner bucket\nn03202354\tdinner table\nn03202481\tdinner theater, dinner theatre\nn03202760\tdiode, semiconductor diode, junction rectifier, crystal rectifier\nn03202940\tdiode, rectifying tube, rectifying valve\nn03203089\tdip\nn03203806\tdiplomatic building\nn03204134\tdipole, dipole antenna\nn03204306\tdipper\nn03204436\tdipstick\nn03204558\tDIP switch, dual inline package switch\nn03204955\tdirectional antenna\nn03205143\tdirectional microphone\nn03205304\tdirection finder\nn03205458\tdirk\nn03205574\tdirndl\nn03205669\tdirndl\nn03205903\tdirty bomb\nn03206023\tdischarge lamp\nn03206158\tdischarge pipe\nn03206282\tdisco, discotheque\nn03206405\tdiscount house, discount store, discounter, wholesale house\nn03206602\tdiscus, saucer\nn03206718\tdisguise\nn03206908\tdish\nn03207305\tdish, dish aerial, dish antenna, saucer\nn03207548\tdishpan\nn03207630\tdish rack\nn03207743\tdishrag, 
dishcloth\nn03207835\tdishtowel, dish towel, tea towel\nn03207941\tdishwasher, dish washer, dishwashing machine\nn03208556\tdisk, disc\nn03208938\tdisk brake, disc brake\nn03209359\tdisk clutch\nn03209477\tdisk controller\nn03209666\tdisk drive, disc drive, hard drive, Winchester drive\nn03209910\tdiskette, floppy, floppy disk\nn03210245\tdisk harrow, disc harrow\nn03210372\tdispatch case, dispatch box\nn03210552\tdispensary\nn03210683\tdispenser\nn03211117\tdisplay, video display\nn03211413\tdisplay adapter, display adaptor\nn03211616\tdisplay panel, display board, board\nn03211789\tdisplay window, shop window, shopwindow, show window\nn03212114\tdisposal, electric pig, garbage disposal\nn03212247\tdisrupting explosive, bursting explosive\nn03212406\tdistaff\nn03212811\tdistillery, still\nn03213014\tdistributor, distributer, electrical distributor\nn03213361\tdistributor cam\nn03213538\tdistributor cap\nn03213715\tdistributor housing\nn03213826\tdistributor point, breaker point, point\nn03214253\tditch\nn03214450\tditch spade, long-handled spade\nn03214582\tditty bag\nn03214966\tdivan\nn03215076\tdivan, diwan\nn03215191\tdive bomber\nn03215337\tdiverging lens, concave lens\nn03215508\tdivided highway, dual carriageway\nn03215749\tdivider\nn03215930\tdiving bell\nn03216199\tdivining rod, dowser, dowsing rod, waterfinder, water finder\nn03216402\tdiving suit, diving dress\nn03216562\tdixie\nn03216710\tDixie cup, paper cup\nn03216828\tdock, dockage, docking facility\nn03217653\tdoeskin\nn03217739\tdogcart\nn03217889\tdoggie bag, doggy bag\nn03218198\tdogsled, dog sled, dog sleigh\nn03218446\tdog wrench\nn03219010\tdoily, doyley, doyly\nn03219135\tdoll, dolly\nn03219483\tdollhouse, doll's house\nn03219612\tdolly\nn03219859\tdolman\nn03219966\tdolman, dolman jacket\nn03220095\tdolman sleeve\nn03220237\tdolmen, cromlech, portal tomb\nn03220513\tdome\nn03220692\tdome, domed stadium, covered stadium\nn03221059\tdomino, half mask, eye 
mask\nn03221351\tdongle\nn03221540\tdonkey jacket\nn03221720\tdoor\nn03222176\tdoor\nn03222318\tdoor\nn03222516\tdoorbell, bell, buzzer\nn03222722\tdoorframe, doorcase\nn03222857\tdoorjamb, doorpost\nn03223162\tdoorlock\nn03223299\tdoormat, welcome mat\nn03223441\tdoornail\nn03223553\tdoorplate\nn03223686\tdoorsill, doorstep, threshold\nn03223923\tdoorstop, doorstopper\nn03224490\tDoppler radar\nn03224603\tdormer, dormer window\nn03224753\tdormer window\nn03224893\tdormitory, dorm, residence hall, hall, student residence\nn03225108\tdormitory, dormitory room, dorm room\nn03225458\tdosemeter, dosimeter\nn03225616\tdossal, dossel\nn03225777\tdot matrix printer, matrix printer, dot printer\nn03225988\tdouble bed\nn03226090\tdouble-bitted ax, double-bitted axe, Western ax, Western axe\nn03226254\tdouble boiler, double saucepan\nn03226375\tdouble-breasted jacket\nn03226538\tdouble-breasted suit\nn03226880\tdouble door\nn03227010\tdouble glazing\nn03227184\tdouble-hung window\nn03227317\tdouble knit\nn03227721\tdoubler\nn03227856\tdouble reed\nn03228016\tdouble-reed instrument, double reed\nn03228254\tdoublet\nn03228365\tdoubletree\nn03228533\tdouche, douche bag\nn03228692\tdovecote, columbarium, columbary\nn03228796\tDover's powder\nn03228967\tdovetail, dovetail joint\nn03229115\tdovetail plane\nn03229244\tdowel, dowel pin, joggle\nn03229526\tdownstage\nn03231160\tdrafting instrument\nn03231368\tdrafting table, drawing table\nn03231819\tDragunov\nn03232309\tdrainage ditch\nn03232417\tdrainage system\nn03232543\tdrain basket\nn03232815\tdrainplug\nn03232923\tdrape\nn03233123\tdrapery\nn03233624\tdrawbar\nn03233744\tdrawbridge, lift bridge\nn03233905\tdrawer\nn03234164\tdrawers, underdrawers, shorts, boxers, boxershorts\nn03234952\tdrawing chalk\nn03235042\tdrawing room, withdrawing room\nn03235180\tdrawing room\nn03235327\tdrawknife, drawshave\nn03235796\tdrawstring bag\nn03235979\tdray, camion\nn03236093\tdreadnought, 
dreadnaught\nn03236217\tdredge\nn03236423\tdredger\nn03236580\tdredging bucket\nn03236735\tdress, frock\nn03237212\tdress blues, dress whites\nn03237340\tdresser\nn03237416\tdress hat, high hat, opera hat, silk hat, stovepipe, top hat, topper, beaver\nn03237639\tdressing, medical dressing\nn03237839\tdressing case\nn03237992\tdressing gown, robe-de-chambre, lounging robe\nn03238131\tdressing room\nn03238286\tdressing sack, dressing sacque\nn03238586\tdressing table, dresser, vanity, toilet table\nn03238762\tdress rack\nn03238879\tdress shirt, evening shirt\nn03239054\tdress suit, full dress, tailcoat, tail coat, tails, white tie, white tie and tails\nn03239259\tdress uniform\nn03239607\tdrift net\nn03239726\tdrill\nn03240140\telectric drill\nn03240683\tdrilling platform, offshore rig\nn03240892\tdrill press\nn03241093\tdrill rig, drilling rig, oilrig, oil rig\nn03241335\tdrinking fountain, water fountain, bubbler\nn03241496\tdrinking vessel\nn03241903\tdrip loop\nn03242120\tdrip mat\nn03242264\tdrip pan\nn03242390\tdripping pan, drip pan\nn03242506\tdrip pot\nn03242995\tdrive\nn03243218\tdrive\nn03243625\tdrive line, drive line system\nn03244047\tdriver, number one wood\nn03244231\tdriveshaft\nn03244388\tdriveway, drive, private road\nn03244775\tdriving iron, one iron\nn03244919\tdriving wheel\nn03245271\tdrogue, drogue chute, drogue parachute\nn03245421\tdrogue parachute\nn03245724\tdrone, drone pipe, bourdon\nn03245889\tdrone, pilotless aircraft, radio-controlled aircraft\nn03246197\tdrop arch\nn03246312\tdrop cloth\nn03246454\tdrop curtain, drop cloth, drop\nn03246653\tdrop forge, drop hammer, drop press\nn03246933\tdrop-leaf table\nn03247083\tdropper, eye dropper\nn03247351\tdroshky, drosky\nn03247495\tdrove, drove chisel\nn03248835\tdrugget\nn03249342\tdrugstore, apothecary's shop, chemist's, chemist's shop, pharmacy\nn03249569\tdrum, membranophone, tympan\nn03249956\tdrum, metal drum\nn03250089\tdrum brake\nn03250279\tdrumhead, head\nn03250405\tdrum 
printer\nn03250588\tdrum sander, electric sander, sander, smoother\nn03250847\tdrumstick\nn03250952\tdry battery\nn03251100\tdry-bulb thermometer\nn03251280\tdry cell\nn03251533\tdry dock, drydock, graving dock\nn03251766\tdryer, drier\nn03251932\tdry fly\nn03252231\tdry kiln\nn03252324\tdry masonry\nn03252422\tdry point\nn03252637\tdry wall, dry-stone wall\nn03252787\tdual scan display\nn03253071\tduck\nn03253187\tduckboard\nn03253279\tduckpin\nn03253714\tdudeen\nn03253796\tduffel, duffle\nn03253886\tduffel bag, duffle bag, duffel, duffle\nn03254046\tduffel coat, duffle coat\nn03254189\tdugout\nn03254374\tdugout canoe, dugout, pirogue\nn03254625\tdulciana\nn03254737\tdulcimer\nn03254862\tdulcimer\nn03255030\tdumbbell\nn03255167\tdumb bomb, gravity bomb\nn03255322\tdumbwaiter, food elevator\nn03255488\tdumdum, dumdum bullet\nn03255899\tdumpcart\nn03256032\tDumpster\nn03256166\tdump truck, dumper, tipper truck, tipper lorry, tip truck, tipper\nn03256472\tDumpy level\nn03256631\tdunce cap, dunce's cap, fool's cap\nn03256788\tdune buggy, beach buggy\nn03256928\tdungeon\nn03257065\tduplex apartment, duplex\nn03257210\tduplex house, duplex, semidetached house\nn03257586\tduplicator, copier\nn03258192\tdust bag, vacuum bag\nn03258330\tdustcloth, dustrag, duster\nn03258456\tdust cover\nn03258577\tdust cover, dust sheet\nn03258905\tdustmop, dust mop, dry mop\nn03259009\tdustpan\nn03259280\tDutch oven\nn03259401\tDutch oven\nn03259505\tdwelling, home, domicile, abode, habitation, dwelling house\nn03260206\tdye-works\nn03260504\tdynamo\nn03260733\tdynamometer, ergometer\nn03260849\tEames chair\nn03261019\tearflap, earlap\nn03261263\tearly warning radar\nn03261395\tearly warning system\nn03261603\tearmuff\nn03261776\tearphone, earpiece, headphone, phone\nn03262072\tearplug\nn03262248\tearplug\nn03262519\tearthenware\nn03262717\tearthwork\nn03262809\teasel\nn03262932\teasy chair, lounge chair, overstuffed chair\nn03263076\teaves\nn03263338\tecclesiastical attire, 
ecclesiastical robe\nn03263640\techinus\nn03263758\techocardiograph\nn03264906\tedger\nn03265032\tedge tool\nn03265754\tefficiency apartment\nn03266195\tegg-and-dart, egg-and-anchor, egg-and-tongue\nn03266371\teggbeater, eggwhisk\nn03266620\tegg timer\nn03266749\teiderdown, duvet, continental quilt\nn03267113\teight ball\nn03267468\tejection seat, ejector seat, capsule\nn03267696\telastic\nn03267821\telastic bandage\nn03268142\tElastoplast\nn03268311\telbow\nn03268645\telbow pad\nn03268790\telectric, electric automobile, electric car\nn03268918\telectrical cable\nn03269073\telectrical contact\nn03269203\telectrical converter\nn03269401\telectrical device\nn03270165\telectrical system\nn03270695\telectric bell\nn03270854\telectric blanket\nn03271030\telectric chair, chair, death chair, hot seat\nn03271260\telectric clock\nn03271376\telectric-discharge lamp, gas-discharge lamp\nn03271574\telectric fan, blower\nn03271765\telectric frying pan\nn03271865\telectric furnace\nn03272010\telectric guitar\nn03272125\telectric hammer\nn03272239\telectric heater, electric fire\nn03272383\telectric lamp\nn03272562\telectric locomotive\nn03272810\telectric meter, power meter\nn03272940\telectric mixer\nn03273061\telectric motor\nn03273551\telectric organ, electronic organ, Hammond organ, organ\nn03273740\telectric range\nn03273913\telectric refrigerator, fridge\nn03274265\telectric toothbrush\nn03274435\telectric typewriter\nn03274561\telectro-acoustic transducer\nn03274796\telectrode\nn03275125\telectrodynamometer\nn03275311\telectroencephalograph\nn03275566\telectrograph\nn03275681\telectrolytic, electrolytic capacitor, electrolytic condenser\nn03275864\telectrolytic cell\nn03276179\telectromagnet\nn03276696\telectrometer\nn03276839\telectromyograph\nn03277004\telectron accelerator\nn03277149\telectron gun\nn03277459\telectronic balance\nn03277602\telectronic converter\nn03277771\telectronic device\nn03278248\telectronic equipment\nn03278914\telectronic fetal monitor, 
electronic foetal monitor, fetal monitor, foetal monitor\nn03279153\telectronic instrument, electronic musical instrument\nn03279364\telectronic voltmeter\nn03279508\telectron microscope\nn03279804\telectron multiplier\nn03279918\telectrophorus\nn03280216\telectroscope\nn03280394\telectrostatic generator, electrostatic machine, Wimshurst machine, Van de Graaff generator\nn03280644\telectrostatic printer\nn03281145\televator, lift\nn03281524\televator\nn03281673\televator shaft\nn03282060\tembankment\nn03282295\tembassy\nn03282401\tembellishment\nn03283221\temergency room, ER\nn03283413\temesis basin\nn03283827\temitter\nn03284308\tempty\nn03284482\temulsion, photographic emulsion\nn03284743\tenamel\nn03284886\tenamel\nn03284981\tenamelware\nn03285578\tencaustic\nn03285730\tencephalogram, pneumoencephalogram\nn03285912\tenclosure\nn03286572\tendoscope\nn03287351\tenergizer, energiser\nn03287733\tengine\nn03288003\tengine\nn03288500\tengineering, engine room\nn03288643\tenginery\nn03288742\tEnglish horn, cor anglais\nn03288886\tEnglish saddle, English cavalry saddle\nn03289660\tenlarger\nn03289985\tensemble\nn03290096\tensign\nn03290195\tentablature\nn03290653\tentertainment center\nn03291413\tentrenching tool, trenching spade\nn03291551\tentrenchment, intrenchment\nn03291741\tenvelope\nn03291819\tenvelope\nn03291963\tenvelope, gasbag\nn03292085\teolith\nn03292362\tepauliere\nn03292475\tepee\nn03292603\tepergne\nn03292736\tepicyclic train, epicyclic gear train\nn03292960\tepidiascope\nn03293095\tepilating wax\nn03293741\tequalizer, equaliser\nn03293863\tequatorial\nn03294048\tequipment\nn03294604\terasable programmable read-only memory, EPROM\nn03294833\teraser\nn03295012\terecting prism\nn03295140\terection\nn03295246\tErlenmeyer flask\nn03295928\tescape hatch\nn03296081\tescapement\nn03296217\tescape wheel\nn03296328\tescarpment, escarp, scarp, protective embankment\nn03296478\tescutcheon, scutcheon\nn03296963\tesophagoscope, 
oesophagoscope\nn03297103\tespadrille\nn03297226\tespalier\nn03297495\tespresso maker\nn03297644\tespresso shop\nn03297735\testablishment\nn03298089\testaminet\nn03298352\testradiol patch\nn03298716\tetagere\nn03298858\tetamine, etamin\nn03299406\tetching\nn03300216\tethernet\nn03300443\tethernet cable\nn03301175\tEton jacket\nn03301291\tetui\nn03301389\teudiometer\nn03301568\teuphonium\nn03301833\tevaporative cooler\nn03301940\tevening bag\nn03302671\texercise bike, exercycle\nn03302790\texercise device\nn03302938\texhaust, exhaust system\nn03303217\texhaust fan\nn03303669\texhaust valve\nn03303831\texhibition hall, exhibition area\nn03304197\tExocet\nn03304323\texpansion bit, expansive bit\nn03304465\texpansion bolt\nn03305300\texplosive detection system, EDS\nn03305522\texplosive device\nn03305953\texplosive trace detection, ETD\nn03306385\texpress, limited\nn03306869\textension, telephone extension, extension phone\nn03307037\textension cord\nn03307573\texternal-combustion engine\nn03307792\texternal drive\nn03308152\textractor\nn03308481\teyebrow pencil\nn03308614\teyecup, eyebath, eye cup\nn03309110\teyeliner\nn03309356\teyepatch, patch\nn03309465\teyepiece, ocular\nn03309687\teyeshadow\nn03309808\tfabric, cloth, material, textile\nn03313333\tfacade, frontage, frontal\nn03314227\tface guard\nn03314378\tface mask\nn03314608\tfaceplate\nn03314780\tface powder\nn03314884\tface veil\nn03315644\tfacing, cladding\nn03315805\tfacing\nn03315990\tfacing, veneer\nn03316105\tfacsimile, facsimile machine, fax\nn03316406\tfactory, mill, manufacturing plant, manufactory\nn03316873\tfactory ship\nn03317233\tfagot, faggot\nn03317510\tfagot stitch, faggot stitch\nn03317673\tFahrenheit thermometer\nn03317788\tfaience\nn03317889\tfaille\nn03318136\tfairlead\nn03318294\tfairy light\nn03318865\tfalchion\nn03318983\tfallboard, fall-board\nn03319167\tfallout shelter\nn03319457\tfalse face\nn03319576\tfalse teeth\nn03319745\tfamily room\nn03320046\tfan\nn03320262\tfan 
belt\nn03320421\tfan blade\nn03320519\tfancy dress, masquerade, masquerade costume\nn03320845\tfanion\nn03320959\tfanlight\nn03321103\tfanjet, fan-jet, fanjet engine, turbojet, turbojet engine, turbofan, turbofan engine\nn03321419\tfanjet, fan-jet, turbofan, turbojet\nn03321563\tfanny pack, butt pack\nn03321843\tfan tracery\nn03321954\tfan vaulting\nn03322570\tfarm building\nn03322704\tfarmer's market, green market, greenmarket\nn03322836\tfarmhouse\nn03322940\tfarm machine\nn03323096\tfarmplace, farm-place, farmstead\nn03323211\tfarmyard\nn03323319\tfarthingale\nn03323703\tfastener, fastening, holdfast, fixing\nn03324629\tfast reactor\nn03324814\tfat farm\nn03324928\tfatigues\nn03325088\tfaucet, spigot\nn03325288\tfauld\nn03325403\tfauteuil\nn03325584\tfeather boa, boa\nn03325691\tfeatheredge\nn03325941\tfedora, felt hat, homburg, Stetson, trilby\nn03326073\tfeedback circuit, feedback loop\nn03326371\tfeedlot\nn03326475\tfell, felled seam\nn03326660\tfelloe, felly\nn03326795\tfelt\nn03326948\tfelt-tip pen, felt-tipped pen, felt tip, Magic Marker\nn03327133\tfelucca\nn03327234\tfence, fencing\nn03327553\tfencing mask, fencer's mask\nn03327691\tfencing sword\nn03327841\tfender, wing\nn03328201\tfender, buffer, cowcatcher, pilot\nn03329302\tFerris wheel\nn03329536\tferrule, collet\nn03329663\tferry, ferryboat\nn03330002\tferule\nn03330665\tfestoon\nn03330792\tfetoscope, foetoscope\nn03330947\tfetter, hobble\nn03331077\tfez, tarboosh\nn03331244\tfiber, fibre, vulcanized fiber\nn03331599\tfiber optic cable, fibre optic cable\nn03332005\tfiberscope\nn03332173\tfichu\nn03332271\tfiddlestick, violin bow\nn03332393\tfield artillery, field gun\nn03332591\tfield coil, field winding\nn03332784\tfield-effect transistor, FET\nn03332989\tfield-emission microscope\nn03333129\tfield glass, glass, spyglass\nn03333252\tfield hockey ball\nn03333349\tfield hospital\nn03333610\tfield house, sports arena\nn03333711\tfield lens\nn03333851\tfield magnet\nn03334017\tfield-sequential color 
television, field-sequential color TV, field-sequential color television system, field-sequential color TV system\nn03334291\tfield tent\nn03334382\tfieldwork\nn03334492\tfife\nn03334912\tfifth wheel, spare\nn03335030\tfighter, fighter aircraft, attack aircraft\nn03335333\tfighting chair\nn03335461\tfig leaf\nn03335846\tfigure eight, figure of eight\nn03336168\tfigure loom, figured-fabric loom\nn03336282\tfigure skate\nn03336575\tfilament\nn03336742\tfilature\nn03336839\tfile\nn03337140\tfile, file cabinet, filing cabinet\nn03337383\tfile folder\nn03337494\tfile server\nn03337822\tfiligree, filagree, fillagree\nn03338287\tfilling\nn03338821\tfilm, photographic film\nn03339296\tfilm, plastic film\nn03339529\tfilm advance\nn03339643\tfilter\nn03340009\tfilter\nn03340723\tfinder, viewfinder, view finder\nn03340923\tfinery\nn03341035\tfine-tooth comb, fine-toothed comb\nn03341153\tfinger\nn03341297\tfingerboard\nn03341606\tfinger bowl\nn03342015\tfinger paint, fingerpaint\nn03342127\tfinger-painting\nn03342262\tfinger plate, escutcheon, scutcheon\nn03342432\tfingerstall, cot\nn03342657\tfinish coat, finishing coat\nn03342863\tfinish coat, finishing coat\nn03342961\tfinisher\nn03343047\tfin keel\nn03343234\tfipple\nn03343354\tfipple flute, fipple pipe, recorder, vertical flute\nn03343560\tfire\nn03343737\tfire alarm, smoke alarm\nn03343853\tfirearm, piece, small-arm\nn03344305\tfire bell\nn03344393\tfireboat\nn03344509\tfirebox\nn03344642\tfirebrick\nn03344784\tfire control radar\nn03344935\tfire control system\nn03345487\tfire engine, fire truck\nn03345837\tfire extinguisher, extinguisher, asphyxiator\nn03346135\tfire iron\nn03346289\tfireman's ax, fireman's axe\nn03346455\tfireplace, hearth, open fireplace\nn03347037\tfire screen, fireguard\nn03347472\tfire tongs, coal tongs\nn03347617\tfire tower\nn03348142\tfirewall\nn03348868\tfiring chamber, gun chamber\nn03349020\tfiring pin\nn03349296\tfirkin\nn03349367\tfirmer chisel\nn03349469\tfirst-aid 
kit\nn03349599\tfirst-aid station\nn03349771\tfirst base\nn03349892\tfirst class\nn03350204\tfishbowl, fish bowl, goldfish bowl\nn03350352\tfisherman's bend\nn03350456\tfisherman's knot, true lover's knot, truelove knot\nn03350602\tfisherman's lure, fish lure\nn03351151\tfishhook\nn03351262\tfishing boat, fishing smack, fishing vessel\nn03351434\tfishing gear, tackle, fishing tackle, fishing rig, rig\nn03351979\tfishing rod, fishing pole\nn03352232\tfish joint\nn03352366\tfish knife\nn03352628\tfishnet, fishing net\nn03352961\tfish slice\nn03353281\tfitment\nn03353951\tfixative\nn03354207\tfixer-upper\nn03354903\tflag\nn03355468\tflageolet, treble recorder, shepherd's pipe\nn03355768\tflagon\nn03355925\tflagpole, flagstaff\nn03356038\tflagship\nn03356279\tflail\nn03356446\tflambeau\nn03356559\tflamethrower\nn03356858\tflange, rim\nn03356982\tflannel\nn03357081\tflannel, gabardine, tweed, white\nn03357267\tflannelette\nn03357716\tflap, flaps\nn03358172\tflash, photoflash, flash lamp, flashgun, flashbulb, flash bulb\nn03358380\tflash\nn03358726\tflash camera\nn03358841\tflasher\nn03359137\tflashlight, torch\nn03359285\tflashlight battery\nn03359436\tflash memory\nn03359566\tflask\nn03360133\tflat arch, straight arch\nn03360300\tflatbed\nn03360431\tflatbed press, cylinder press\nn03360622\tflat bench\nn03360731\tflatcar, flatbed, flat\nn03361109\tflat file\nn03361297\tflatlet\nn03361380\tflat panel display, FPD\nn03361550\tflats\nn03361683\tflat tip screwdriver\nn03362639\tfleece\nn03362771\tfleet ballistic missile submarine\nn03362890\tfleur-de-lis, fleur-de-lys\nn03363363\tflight simulator, trainer\nn03363549\tflintlock\nn03363749\tflintlock, firelock\nn03364008\tflip-flop, thong\nn03364156\tflipper, fin\nn03364599\tfloat, plasterer's float\nn03364937\tfloating dock, floating dry dock\nn03365231\tfloatplane, pontoon plane\nn03365374\tflood, floodlight, flood lamp, photoflood\nn03365592\tfloor, flooring\nn03365991\tfloor, level, storey, 
story\nn03366464\tfloor\nn03366721\tfloorboard\nn03366823\tfloor cover, floor covering\nn03366974\tfloor joist\nn03367059\tfloor lamp\nn03367321\tflophouse, dosshouse\nn03367410\tflorist, florist shop, flower store\nn03367545\tfloss\nn03367875\tflotsam, jetsam\nn03367969\tflour bin\nn03368048\tflour mill\nn03368352\tflowerbed, flower bed, bed of flowers\nn03369276\tflugelhorn, fluegelhorn\nn03369407\tfluid drive\nn03369512\tfluid flywheel\nn03369866\tflume\nn03370387\tfluorescent lamp\nn03370646\tfluoroscope, roentgenoscope\nn03371875\tflush toilet, lavatory\nn03372029\tflute, transverse flute\nn03372549\tflute, flute glass, champagne flute\nn03372822\tflux applicator\nn03372933\tfluxmeter\nn03373237\tfly\nn03373611\tflying boat\nn03373943\tflying buttress, arc-boutant\nn03374102\tflying carpet\nn03374282\tflying jib\nn03374372\tfly rod\nn03374473\tfly tent\nn03374570\tflytrap\nn03374649\tflywheel\nn03374838\tfob, watch chain, watch guard\nn03375171\tfoghorn\nn03375329\tfoglamp\nn03375575\tfoil\nn03376159\tfold, sheepfold, sheep pen, sheepcote\nn03376279\tfolder\nn03376595\tfolding chair\nn03376771\tfolding door, accordion door\nn03376938\tfolding saw\nn03378005\tfood court\nn03378174\tfood processor\nn03378342\tfood hamper\nn03378442\tfoot\nn03378593\tfootage\nn03378765\tfootball\nn03379051\tfootball helmet\nn03379204\tfootball stadium\nn03379343\tfootbath\nn03379719\tfoot brake\nn03379828\tfootbridge, overcrossing, pedestrian bridge\nn03379989\tfoothold, footing\nn03380301\tfootlocker, locker\nn03380647\tfoot rule\nn03380724\tfootstool, footrest, ottoman, tuffet\nn03380867\tfootwear, footgear\nn03381126\tfootwear\nn03381231\tforceps\nn03381450\tforce pump\nn03381565\tfore-and-after\nn03381776\tfore-and-aft sail\nn03382104\tforecastle, fo'c'sle\nn03382292\tforecourt\nn03382413\tforedeck\nn03382533\tfore edge, foredge\nn03382708\tforeground\nn03382856\tforemast\nn03382969\tfore 
plane\nn03383099\tforesail\nn03383211\tforestay\nn03383378\tforetop\nn03383468\tfore-topmast\nn03383562\tfore-topsail\nn03383821\tforge\nn03384167\tfork\nn03384352\tforklift\nn03384891\tformalwear, eveningwear, evening dress, evening clothes\nn03385295\tFormica\nn03385557\tfortification, munition\nn03386011\tfortress, fort\nn03386343\tforty-five\nn03386544\tFoucault pendulum\nn03386726\tfoulard\nn03386870\tfoul-weather gear\nn03387323\tfoundation garment, foundation\nn03387653\tfoundry, metalworks\nn03388043\tfountain\nn03388183\tfountain pen\nn03388323\tfour-in-hand\nn03388549\tfour-poster\nn03388711\tfour-pounder\nn03388990\tfour-stroke engine, four-stroke internal-combustion engine\nn03389611\tfour-wheel drive, 4WD\nn03389761\tfour-wheel drive, 4WD\nn03389889\tfour-wheeler\nn03389983\tfowling piece\nn03390075\tfoxhole, fox hole\nn03390327\tfragmentation bomb, antipersonnel bomb, anti-personnel bomb, daisy cutter\nn03390673\tfrail\nn03390786\tfraise\nn03390983\tframe, framing\nn03391301\tframe\nn03391613\tframe buffer\nn03391770\tframework\nn03392648\tFrancis turbine\nn03392741\tfranking machine\nn03393017\tfree house\nn03393199\tfree-reed\nn03393324\tfree-reed instrument\nn03393761\tfreewheel\nn03393912\tfreight car\nn03394149\tfreight elevator, service elevator\nn03394272\tfreight liner, liner train\nn03394480\tfreight train, rattler\nn03394649\tFrench door\nn03394916\tFrench horn, horn\nn03395256\tFrench polish, French polish shellac\nn03395401\tFrench roof\nn03395514\tFrench window\nn03395859\tFresnel lens\nn03396074\tfret\nn03396580\tfriary\nn03396654\tfriction clutch\nn03396997\tfrieze\nn03397087\tfrieze\nn03397266\tfrigate\nn03397412\tfrigate\nn03397532\tfrill, flounce, ruffle, furbelow\nn03397947\tFrisbee\nn03398153\tfrock\nn03398228\tfrock coat\nn03399579\tfrontlet, frontal\nn03399677\tfront porch\nn03399761\tfront projector\nn03399971\tfruit machine\nn03400231\tfrying pan, frypan, skillet\nn03400972\tfuel filter\nn03401129\tfuel gauge, fuel 
indicator\nn03401279\tfuel injection, fuel injection system\nn03401721\tfuel system\nn03402188\tfull-dress uniform\nn03402369\tfull metal jacket\nn03402511\tfull skirt\nn03402785\tfumigator\nn03402941\tfuneral home, funeral parlor, funeral parlour, funeral chapel, funeral church, funeral-residence\nn03403643\tfunnel\nn03404012\tfunny wagon\nn03404149\tfur\nn03404251\tfur coat\nn03404360\tfur hat\nn03404449\tfurnace\nn03404900\tfurnace lining, refractory\nn03405111\tfurnace room\nn03405265\tfurnishing\nn03405595\tfurnishing, trappings\nn03405725\tfurniture, piece of furniture, article of furniture\nn03406759\tfur-piece\nn03406966\tfurrow\nn03407369\tfuse, electrical fuse, safety fuse\nn03407865\tfusee drive, fusee\nn03408054\tfuselage\nn03408264\tfusil\nn03408340\tfustian\nn03408444\tfuton\nn03409297\tgabardine\nn03409393\tgable, gable end, gable wall\nn03409591\tgable roof, saddle roof, saddleback, saddleback roof\nn03409920\tgadgetry\nn03410022\tgaff\nn03410147\tgaff\nn03410303\tgaff\nn03410423\tgaffsail, gaff-headed sail\nn03410571\tgaff topsail, fore-and-aft topsail\nn03410740\tgag, muzzle\nn03410938\tgaiter\nn03411079\tgaiter\nn03411208\tGalilean telescope\nn03411339\tgalleon\nn03411927\tgallery\nn03412058\tgallery, art gallery, picture gallery\nn03412220\tgalley, ship's galley, caboose, cookhouse\nn03412387\tgalley\nn03412511\tgalley\nn03412906\tgallows\nn03413124\tgallows tree, gallows-tree, gibbet, gallous\nn03413264\tgalvanometer\nn03413428\tgambling house, gambling den, gambling hell, gaming house\nn03413684\tgambrel, gambrel roof\nn03413828\tgame\nn03414029\tgamebag\nn03414162\tgame equipment\nn03414676\tgaming table\nn03415252\tgamp, brolly\nn03415486\tgangplank, gangboard, gangway\nn03415626\tgangsaw\nn03415749\tgangway\nn03415868\tgantlet\nn03416094\tgantry, gauntry\nn03416489\tgarage\nn03416640\tgarage, service department\nn03416775\tGarand rifle, Garand, M-1, M-1 rifle\nn03416900\tgarbage\nn03417042\tgarbage truck, dustcart\nn03417202\tgarboard, 
garboard plank, garboard strake\nn03417345\tgarden\nn03417749\tgarden\nn03417970\tgarden rake\nn03418158\tgarden spade\nn03418242\tgarden tool, lawn tool\nn03418402\tgarden trowel\nn03418618\tgargoyle\nn03418749\tgaribaldi\nn03418915\tgarlic press\nn03419014\tgarment\nn03420345\tgarment bag\nn03420801\tgarrison cap, overseas cap\nn03420935\tgarrote, garotte, garrotte, iron collar\nn03421117\tgarter, supporter\nn03421324\tgarter belt, suspender belt\nn03421485\tgarter stitch\nn03421669\tgas guzzler\nn03421768\tgas shell\nn03421960\tgas bracket\nn03422072\tgas burner, gas jet\nn03422484\tgas-cooled reactor\nn03422589\tgas-discharge tube\nn03422771\tgas engine\nn03423099\tgas fixture\nn03423224\tgas furnace\nn03423306\tgas gun\nn03423479\tgas heater\nn03423568\tgas holder, gasometer\nn03423719\tgasket\nn03423877\tgas lamp\nn03424204\tgas maser\nn03424325\tgasmask, respirator, gas helmet\nn03424489\tgas meter, gasometer\nn03424630\tgasoline engine, petrol engine\nn03424862\tgasoline gauge, gasoline gage, gas gauge, gas gage, petrol gauge, petrol gage\nn03425241\tgas oven\nn03425325\tgas oven\nn03425413\tgas pump, gasoline pump, petrol pump, island dispenser\nn03425595\tgas range, gas stove, gas cooker\nn03425769\tgas ring\nn03426134\tgas tank, gasoline tank, petrol tank\nn03426285\tgas thermometer, air thermometer\nn03426462\tgastroscope\nn03426574\tgas turbine\nn03426871\tgas-turbine ship\nn03427202\tgat, rod\nn03427296\tgate\nn03428090\tgatehouse\nn03428226\tgateleg table\nn03428349\tgatepost\nn03429003\tgathered skirt\nn03429137\tGatling gun\nn03429288\tgauge, gage\nn03429682\tgauntlet, gantlet\nn03429771\tgauntlet, gantlet, metal glove\nn03429914\tgauze, netting, veiling\nn03430091\tgauze, gauze bandage\nn03430313\tgavel\nn03430418\tgazebo, summerhouse\nn03430551\tgear, gear wheel, geared wheel, cogwheel\nn03430959\tgear, paraphernalia, appurtenance\nn03431243\tgear, gear mechanism\nn03431570\tgearbox, gear box, gear case\nn03431745\tgearing, gear, geartrain, power 
train, train\nn03432061\tgearset\nn03432129\tgearshift, gearstick, shifter, gear lever\nn03432360\tGeiger counter, Geiger-Muller counter\nn03432509\tGeiger tube, Geiger-Muller tube\nn03433247\tgene chip, DNA chip\nn03433637\tgeneral-purpose bomb, GP bomb\nn03433877\tgenerator\nn03434188\tgenerator\nn03434285\tgenerator\nn03434830\tGeneva gown\nn03435593\tgeodesic dome\nn03435743\tgeorgette\nn03435991\tgharry\nn03436075\tghat\nn03436182\tghetto blaster, boom box\nn03436417\tgift shop, novelty shop\nn03436549\tgift wrapping\nn03436656\tgig\nn03436772\tgig\nn03436891\tgig\nn03436990\tgig\nn03437184\tgildhall\nn03437295\tgill net\nn03437430\tgilt, gilding\nn03437581\tgimbal\nn03437741\tgingham\nn03437829\tgirandole, girandola\nn03437941\tgirder\nn03438071\tgirdle, cincture, sash, waistband, waistcloth\nn03438257\tglass, drinking glass\nn03438661\tglass\nn03438780\tglass cutter\nn03438863\tglasses case\nn03439348\tglebe house\nn03439631\tGlengarry\nn03439814\tglider, sailplane\nn03440216\tGlobal Positioning System, GPS\nn03440682\tglockenspiel, orchestral bells\nn03440876\tglory hole, lazaretto\nn03441112\tglove\nn03441345\tglove compartment\nn03441465\tglow lamp\nn03441582\tglow tube\nn03442288\tglyptic art, glyptography\nn03442487\tglyptics, lithoglyptics\nn03442597\tgnomon\nn03442756\tgoal\nn03443005\tgoalmouth\nn03443149\tgoalpost\nn03443371\tgoblet\nn03443543\tgodown\nn03443912\tgoggles\nn03444034\tgo-kart\nn03445326\tgold plate\nn03445617\tgolf bag\nn03445777\tgolf ball\nn03445924\tgolfcart, golf cart\nn03446070\tgolf club, golf-club, club\nn03446268\tgolf-club head, club head, club-head, clubhead\nn03446832\tgolf equipment\nn03447075\tgolf glove\nn03447358\tgolliwog, golliwogg\nn03447447\tgondola\nn03447721\tgong, tam-tam\nn03447894\tgoniometer\nn03448031\tGordian knot\nn03448590\tgorget\nn03448696\tgossamer\nn03448956\tGothic arch\nn03449217\tgouache\nn03449309\tgouge\nn03449451\tgourd, calabash\nn03449564\tgovernment building\nn03449858\tgovernment 
office\nn03450230\tgown\nn03450516\tgown, robe\nn03450734\tgown, surgical gown, scrubs\nn03450881\tgrab\nn03450974\tgrab bag\nn03451120\tgrab bar\nn03451253\tgrace cup\nn03451365\tgrade separation\nn03451711\tgraduated cylinder\nn03451798\tgraffito, graffiti\nn03452267\tgramophone, acoustic gramophone\nn03452449\tgranary, garner\nn03452594\tgrandfather clock, longcase clock\nn03452741\tgrand piano, grand\nn03453231\tgraniteware\nn03453320\tgranny knot, granny\nn03453443\tgrape arbor, grape arbour\nn03454110\tgrapnel, grapnel anchor\nn03454211\tgrapnel, grapple, grappler, grappling hook, grappling iron\nn03454442\tgrass skirt\nn03454536\tgrate, grating\nn03454707\tgrate, grating\nn03454885\tgrater\nn03455355\tgraver, graving tool, pointel, pointrel\nn03455488\tgravestone, headstone, tombstone\nn03455642\tgravimeter, gravity meter\nn03455802\tgravure, photogravure, heliogravure\nn03456024\tgravy boat, gravy holder, sauceboat, boat\nn03456186\tgrey, gray\nn03456299\tgrease-gun, gun\nn03456447\tgreasepaint\nn03456548\tgreasy spoon\nn03456665\tgreatcoat, overcoat, topcoat\nn03457008\tgreat hall\nn03457451\tgreave, jambeau\nn03457686\tgreengrocery\nn03457902\tgreenhouse, nursery, glasshouse\nn03458271\tgrenade\nn03458422\tgrid, gridiron\nn03459328\tgriddle\nn03459591\tgrill, grille, grillwork\nn03459775\tgrille, radiator grille\nn03459914\tgrillroom, grill\nn03460040\tgrinder\nn03460147\tgrinding wheel, emery wheel\nn03460297\tgrindstone\nn03460455\tgripsack\nn03460899\tgristmill\nn03461288\tgrocery bag\nn03461385\tgrocery store, grocery, food market, market\nn03461651\tgrogram\nn03461882\tgroined vault\nn03461988\tgroover\nn03462110\tgrosgrain\nn03462315\tgros point\nn03462747\tground, earth\nn03462972\tground bait\nn03463185\tground control\nn03463381\tground floor, first floor, ground level\nn03463666\tgroundsheet, ground cloth\nn03464053\tG-string, thong\nn03464467\tguard, safety, safety device\nn03464628\tguard 
boat\nn03464952\tguardroom\nn03465040\tguardroom\nn03465151\tguard ship\nn03465320\tguard's van\nn03465426\tgueridon\nn03465500\tGuarnerius\nn03465605\tguesthouse\nn03465718\tguestroom\nn03465818\tguidance system, guidance device\nn03466162\tguided missile\nn03466493\tguided missile cruiser\nn03466600\tguided missile frigate\nn03466839\tguildhall\nn03466947\tguilloche\nn03467068\tguillotine\nn03467254\tguimpe\nn03467380\tguimpe\nn03467517\tguitar\nn03467796\tguitar pick\nn03467887\tgulag\nn03467984\tgun\nn03468570\tgunboat\nn03468696\tgun carriage\nn03468821\tgun case\nn03469031\tgun emplacement, weapons emplacement\nn03469175\tgun enclosure, gun turret, turret\nn03469493\tgunlock, firing mechanism\nn03469832\tgunnery\nn03469903\tgunnysack, gunny sack, burlap bag\nn03470005\tgun pendulum\nn03470222\tgun room\nn03470387\tgunsight, gun-sight\nn03470629\tgun trigger, trigger\nn03470948\tgurney\nn03471030\tgusher\nn03471190\tgusset, inset\nn03471347\tgusset, gusset plate\nn03471779\tguy, guy cable, guy wire, guy rope\nn03472232\tgymnastic apparatus, exerciser\nn03472535\tgym shoe, sneaker, tennis shoe\nn03472672\tgym suit\nn03472796\tgymslip\nn03472937\tgypsy cab\nn03473078\tgyrocompass\nn03473227\tgyroscope, gyro\nn03473465\tgyrostabilizer, gyrostabiliser\nn03473817\thabergeon\nn03473966\thabit\nn03474167\thabit, riding habit\nn03474352\thacienda\nn03474779\thacksaw, hack saw, metal saw\nn03474896\thaft, helve\nn03475581\thairbrush\nn03475674\thaircloth, hair\nn03475823\thairdressing, hair tonic, hair oil, hair grease\nn03475961\thairnet\nn03476083\thairpiece, false hair, postiche\nn03476313\thairpin\nn03476542\thair shirt\nn03476684\thair slide\nn03476991\thair spray\nn03477143\thairspring\nn03477303\thair trigger\nn03477410\thalberd\nn03477512\thalf binding\nn03477773\thalf hatchet\nn03477902\thalf hitch\nn03478589\thalf track\nn03478756\thall\nn03478907\thall\nn03479121\thall\nn03479266\tHall of Fame\nn03479397\thall of 
residence\nn03479502\thallstand\nn03480579\thalter\nn03480719\thalter, hackamore\nn03480973\thame\nn03481172\thammer\nn03481521\thammer, power hammer\nn03482001\thammer\nn03482128\thammerhead\nn03482252\thammock, sack\nn03482405\thamper\nn03482523\thand\nn03482877\thandball\nn03483086\thandbarrow\nn03483230\thandbell\nn03483316\thand blower, blow dryer, blow drier, hair dryer, hair drier\nn03483531\thandbow\nn03483637\thand brake, emergency, emergency brake, parking brake\nn03483823\thand calculator, pocket calculator\nn03483971\thandcar\nn03484083\thandcart, pushcart, cart, go-cart\nn03484487\thand cream\nn03484576\thandcuff, cuff, handlock, manacle\nn03484809\thand drill, handheld drill\nn03484931\thand glass, simple microscope, magnifying glass\nn03485198\thand glass, hand mirror\nn03485309\thand grenade\nn03485407\thand-held computer, hand-held microcomputer\nn03485575\thandhold\nn03485794\thandkerchief, hankie, hanky, hankey\nn03487090\thandlebar\nn03487331\thandloom\nn03487444\thand lotion\nn03487533\thand luggage\nn03487642\thand-me-down\nn03487774\thand mower\nn03487886\thand pump\nn03488111\thandrest\nn03488188\thandsaw, hand saw, carpenter's saw\nn03488438\thandset, French telephone\nn03488603\thand shovel\nn03488784\thandspike\nn03488887\thandstamp, rubber stamp\nn03489048\thand throttle\nn03489162\thand tool\nn03490006\thand towel, face towel\nn03490119\thand truck, truck\nn03490324\thandwear, hand wear\nn03490449\thandwheel\nn03490649\thandwheel\nn03490784\thangar queen\nn03490884\thanger\nn03491032\thang glider\nn03491724\thangman's rope, hangman's halter, halter, hemp, hempen necktie\nn03491988\thank\nn03492087\thansom, hansom cab\nn03492250\tharbor, harbour\nn03492542\thard disc, hard disk, fixed disk\nn03492922\thard hat, tin hat, safety hat\nn03493219\thardtop\nn03493792\thardware, ironware\nn03493911\thardware store, ironmonger, ironmonger's shop\nn03494278\tharmonica, mouth organ, harp, mouth harp\nn03494537\tharmonium, organ, reed 
organ\nn03494706\tharness\nn03495039\tharness\nn03495258\tharp\nn03495570\tharp\nn03495671\tharpoon\nn03495941\tharpoon gun\nn03496183\tharpoon log\nn03496296\tharpsichord, cembalo\nn03496486\tHarris Tweed\nn03496612\tharrow\nn03496892\tharvester, reaper\nn03497100\thash house\nn03497352\thasp\nn03497657\that, chapeau, lid\nn03498441\thatbox\nn03498536\thatch\nn03498662\thatchback, hatchback door\nn03498781\thatchback\nn03498866\thatchel, heckle\nn03498962\thatchet\nn03499354\thatpin\nn03499468\thauberk, byrnie\nn03499907\tHawaiian guitar, steel guitar\nn03500090\thawse, hawsehole, hawsepipe\nn03500209\thawser\nn03500295\thawser bend\nn03500389\thay bale\nn03500457\thayfork\nn03500557\thayloft, haymow, mow\nn03500699\thaymaker, hay conditioner\nn03500838\thayrack, hayrig\nn03500971\thayrack\nn03501152\thazard\nn03501288\thead\nn03501520\thead\nn03501614\thead\nn03502200\theadboard\nn03502331\thead covering, veil\nn03502509\theaddress, headgear\nn03502777\theader\nn03502897\theader\nn03503097\theader, coping, cope\nn03503233\theader, lintel\nn03503358\theadfast\nn03503477\thead gasket\nn03503567\thead gate\nn03503718\theadgear\nn03503997\theadlight, headlamp\nn03504205\theadpiece\nn03504293\theadpin, kingpin\nn03504723\theadquarters, central office, main office, home office, home base\nn03505015\theadrace\nn03505133\theadrest\nn03505383\theadsail\nn03505504\theadscarf\nn03505667\theadset\nn03505764\thead shop\nn03506028\theadstall, headpiece\nn03506184\theadstock\nn03506370\thealth spa, spa, health club\nn03506560\thearing aid, ear trumpet\nn03506727\thearing aid, deaf-aid\nn03506880\thearse\nn03507241\thearth, fireside\nn03507458\thearthrug\nn03507658\theart-lung machine\nn03507963\theat engine\nn03508101\theater, warmer\nn03508485\theat exchanger\nn03508881\theating pad, hot pad\nn03509394\theat lamp, infrared lamp\nn03509608\theat pump\nn03509843\theat-seeking missile\nn03510072\theat shield\nn03510244\theat 
sink\nn03510384\theaume\nn03510487\theaver\nn03510583\theavier-than-air craft\nn03510866\theckelphone, basset oboe\nn03510987\thectograph, heliotype\nn03511175\thedge, hedgerow\nn03511333\thedge trimmer\nn03512030\thelicon, bombardon\nn03512147\thelicopter, chopper, whirlybird, eggbeater\nn03512452\theliograph\nn03512624\theliometer\nn03512911\thelm\nn03513137\thelmet\nn03513376\thelmet\nn03514129\thematocrit, haematocrit\nn03514340\themming-stitch\nn03514451\themostat, haemostat\nn03514693\themstitch, hemstitching\nn03514894\thenroost\nn03515338\theraldry\nn03515934\thermitage\nn03516266\therringbone\nn03516367\therringbone, herringbone pattern\nn03516647\tHerschelian telescope, off-axis reflector\nn03516844\tHessian boot, hessian, jackboot, Wellington, Wellington boot\nn03516996\theterodyne receiver, superheterodyne receiver, superhet\nn03517509\thibachi\nn03517647\thideaway, retreat\nn03517760\thi-fi, high fidelity sound system\nn03517899\thigh altar\nn03517982\thigh-angle gun\nn03518135\thighball glass\nn03518230\thighboard\nn03518305\thighboy, tallboy\nn03518445\thighchair, feeding chair\nn03518631\thigh gear, high\nn03518829\thigh-hat cymbal, high hat\nn03518943\thighlighter\nn03519081\thighlighter\nn03519226\thigh-pass filter\nn03519387\thigh-rise, tower block\nn03519674\thigh table\nn03519848\thigh-warp loom\nn03520493\thijab\nn03521076\thinge, flexible joint\nn03521431\thinging post, swinging post\nn03521544\thip boot, thigh boot\nn03521675\thipflask, pocket flask\nn03521771\thip pad\nn03521899\thip pocket\nn03522003\thippodrome\nn03522100\thip roof, hipped roof\nn03522634\thitch\nn03522863\thitch\nn03522990\thitching post\nn03523134\thitchrack, hitching bar\nn03523398\thob\nn03523506\thobble skirt\nn03523987\thockey skate\nn03524150\thockey stick\nn03524287\thod\nn03524425\thodoscope\nn03524574\thoe\nn03524745\thoe handle\nn03524976\thogshead\nn03525074\thoist\nn03525252\thold, keep\nn03525454\tholder\nn03525693\tholding cell\nn03525827\tholding 
device\nn03526062\tholding pen, holding paddock, holding yard\nn03527149\thollowware, holloware\nn03527444\tholster\nn03527565\tholster\nn03527675\tholy of holies, sanctum sanctorum\nn03528100\thome, nursing home, rest home\nn03528263\thome appliance, household appliance\nn03528523\thome computer\nn03528901\thome plate, home base, home, plate\nn03529175\thome room, homeroom\nn03529444\thomespun\nn03529629\thomestead\nn03529860\thome theater, home theatre\nn03530189\thoming torpedo\nn03530511\thone\nn03530642\thoneycomb\nn03530910\thood, bonnet, cowl, cowling\nn03531281\thood\nn03531447\thood\nn03531546\thood, exhaust hood\nn03531691\thood\nn03531982\thood latch\nn03532342\thook\nn03532672\thook, claw\nn03532919\thook\nn03533014\thookah, narghile, nargileh, sheesha, shisha, chicha, calean, kalian, water pipe, hubble-bubble, hubbly-bubbly\nn03533392\thook and eye\nn03533486\thookup, assemblage\nn03533654\thookup\nn03533845\thook wrench, hook spanner\nn03534580\thoopskirt, crinoline\nn03534695\thoosegow, hoosgow\nn03534776\tHoover\nn03535024\thope chest, wedding chest\nn03535284\thopper\nn03535647\thopsacking, hopsack\nn03535780\thorizontal bar, high bar\nn03536122\thorizontal stabilizer, horizontal stabiliser, tailplane\nn03536568\thorizontal tail\nn03536761\thorn\nn03537085\thorn\nn03537241\thorn\nn03537412\thorn button\nn03537550\thornpipe, pibgorn, stockhorn\nn03538037\thorse, gymnastic horse\nn03538179\thorsebox\nn03538300\thorsecar\nn03538406\thorse cart, horse-cart\nn03538542\thorsecloth\nn03538634\thorse-drawn vehicle\nn03538817\thorsehair\nn03538957\thorsehair wig\nn03539103\thorseless carriage\nn03539293\thorse pistol, horse-pistol\nn03539433\thorseshoe, shoe\nn03539546\thorseshoe\nn03539678\thorse-trail\nn03539754\thorsewhip\nn03540090\those\nn03540267\thosiery, hose\nn03540476\thospice\nn03540595\thospital, infirmary\nn03540914\thospital bed\nn03541091\thospital room\nn03541269\thospital ship\nn03541393\thospital train\nn03541537\thostel, youth hostel, 
student lodging\nn03541696\thostel, hostelry, inn, lodge, auberge\nn03541923\thot-air balloon\nn03542333\thotel\nn03542605\thotel-casino, casino-hotel\nn03542727\thotel-casino, casino-hotel\nn03542860\thotel room\nn03543012\thot line\nn03543112\thot pants\nn03543254\thot plate, hotplate\nn03543394\thot rod, hot-rod\nn03543511\thot spot, hotspot\nn03543603\thot tub\nn03543735\thot-water bottle, hot-water bag\nn03543945\thoundstooth check, hound's-tooth check, dogstooth check, dogs-tooth check, dog's-tooth check\nn03544143\thourglass\nn03544238\thour hand, little hand\nn03544360\thouse\nn03545150\thouse\nn03545470\thouseboat\nn03545585\thouselights\nn03545756\thouse of cards, cardhouse, card-house, cardcastle\nn03545961\thouse of correction\nn03546112\thouse paint, housepaint\nn03546235\thousetop\nn03546340\thousing, lodging, living accommodations\nn03547054\thovel, hut, hutch, shack, shanty\nn03547229\thovercraft, ground-effect machine\nn03547397\thowdah, houdah\nn03547530\thuarache, huaraches\nn03547861\thub-and-spoke, hub-and-spoke system\nn03548086\thubcap\nn03548195\thuck, huckaback\nn03548320\thug-me-tight\nn03548402\thula-hoop\nn03548533\thulk\nn03548626\thull\nn03548930\thumeral veil, veil\nn03549199\tHumvee, Hum-Vee\nn03549350\thunter, hunting watch\nn03549473\thunting knife\nn03549589\thurdle\nn03549732\thurricane deck, hurricane roof, promenade deck, awning deck\nn03549897\thurricane lamp, hurricane lantern, tornado lantern, storm lantern, storm lamp\nn03550153\thut, army hut, field hut\nn03550289\thutch\nn03550420\thutment\nn03551084\thydraulic brake, hydraulic brakes\nn03551395\thydraulic press\nn03551582\thydraulic pump, hydraulic ram\nn03551790\thydraulic system\nn03552001\thydraulic transmission, hydraulic transmission system\nn03552449\thydroelectric turbine\nn03552749\thydrofoil, hydroplane\nn03553019\thydrofoil, foil\nn03553248\thydrogen bomb, H-bomb, fusion bomb, thermonuclear bomb\nn03553486\thydrometer, 
gravimeter\nn03554375\thygrodeik\nn03554460\thygrometer\nn03554645\thygroscope\nn03555006\thyperbaric chamber\nn03555217\thypercoaster\nn03555426\thypermarket\nn03555564\thypodermic needle\nn03555662\thypodermic syringe, hypodermic, hypo\nn03555862\thypsometer\nn03555996\thysterosalpingogram\nn03556173\tI-beam\nn03556679\tice ax, ice axe, piolet\nn03556811\ticeboat, ice yacht, scooter\nn03556992\ticebreaker, iceboat\nn03557270\ticed-tea spoon\nn03557360\tice hockey rink, ice-hockey rink\nn03557590\tice machine\nn03557692\tice maker\nn03557840\tice pack, ice bag\nn03558007\ticepick, ice pick\nn03558176\tice rink, ice-skating rink, ice\nn03558404\tice skate\nn03558633\tice tongs\nn03558739\ticetray\nn03559373\ticonoscope\nn03559531\tIdentikit, Identikit picture\nn03559999\tidle pulley, idler pulley, idle wheel\nn03560430\tigloo, iglu\nn03560860\tignition coil\nn03561047\tignition key\nn03561169\tignition switch\nn03561573\timaret\nn03562565\timmovable bandage\nn03563200\timpact printer\nn03563460\timpeller\nn03563710\timplant\nn03563967\timplement\nn03564849\timpression\nn03565288\timprint\nn03565565\timprovised explosive device, I.E.D., IED\nn03565710\timpulse turbine\nn03565830\tin-basket, in-tray\nn03565991\tincendiary bomb, incendiary, firebomb\nn03566193\tincinerator\nn03566329\tinclined plane\nn03566555\tinclinometer, dip circle\nn03566730\tinclinometer\nn03566860\tincrustation, encrustation\nn03567066\tincubator, brooder\nn03567635\tindex register\nn03567788\tIndiaman\nn03567912\tIndian club\nn03568117\tindicator\nn03568818\tinduction coil\nn03569014\tinductor, inductance\nn03569174\tindustrial watercourse\nn03569293\tinertial guidance system, inertial navigation system\nn03569494\tinflater, inflator\nn03571280\tinhaler, inhalator\nn03571439\tinjector\nn03571625\tink bottle, inkpot\nn03571853\tink eraser\nn03571942\tink-jet printer\nn03572107\tinkle\nn03572205\tinkstand\nn03572321\tinkwell, inkstand\nn03572631\tinlay\nn03573574\tinside 
caliper\nn03573848\tinsole, innersole\nn03574243\tinstep\nn03574416\tinstillator\nn03574555\tinstitution\nn03574816\tinstrument\nn03575958\tinstrument of punishment\nn03576215\tinstrument of torture\nn03576443\tintaglio, diaglyph\nn03576955\tintake valve\nn03577090\tintegrated circuit, microcircuit\nn03577312\tintegrator, planimeter\nn03577474\tIntelnet\nn03577672\tinterceptor\nn03577818\tinterchange\nn03578055\tintercommunication system, intercom\nn03578251\tintercontinental ballistic missile, ICBM\nn03578656\tinterface, port\nn03578981\tinterferometer\nn03579538\tinterior door\nn03579982\tinternal-combustion engine, ICE\nn03580518\tinternal drive\nn03580615\tinternet, net, cyberspace\nn03580845\tinterphone\nn03580990\tinterrupter\nn03581125\tintersection, crossroad, crossway, crossing, carrefour\nn03581531\tinterstice\nn03581897\tintraocular lens\nn03582508\tintravenous pyelogram, IVP\nn03582959\tinverter\nn03583419\tion engine\nn03583621\tionization chamber, ionization tube\nn03584254\tiPod\nn03584400\tvideo iPod\nn03584829\tiron, smoothing iron\nn03585073\tiron\nn03585337\tiron, branding iron\nn03585438\tirons, chains\nn03585551\tironclad\nn03585682\tiron foundry\nn03585778\tiron horse\nn03585875\tironing\nn03586219\tiron lung\nn03586631\tironmongery\nn03586911\tironworks\nn03587205\tirrigation ditch\nn03588216\tizar\nn03588841\tjabot\nn03588951\tjack\nn03589313\tjack, jackstones\nn03589513\tjack\nn03589672\tjack\nn03589791\tjacket\nn03590306\tjacket\nn03590475\tjacket\nn03590588\tjack-in-the-box\nn03590841\tjack-o'-lantern\nn03590932\tjack plane\nn03591116\tJacob's ladder, jack ladder, pilot ladder\nn03591313\tjaconet\nn03591592\tJacquard loom, Jacquard\nn03591798\tjacquard\nn03591901\tjag, dag\nn03592245\tjail, jailhouse, gaol, clink, slammer, poky, pokey\nn03592669\tjalousie\nn03592773\tjamb\nn03592931\tjammer\nn03593122\tjampot, jamjar\nn03593222\tjapan\nn03593526\tjar\nn03593862\tJarvik heart, Jarvik artificial heart\nn03594010\tjaunting car, jaunty 
car\nn03594148\tjavelin\nn03594277\tjaw\nn03594523\tJaws of Life\nn03594734\tjean, blue jean, denim\nn03594945\tjeep, landrover\nn03595055\tjellaba\nn03595264\tjerkin\nn03595409\tjeroboam, double-magnum\nn03595523\tjersey\nn03595614\tjersey, T-shirt, tee shirt\nn03595860\tjet, jet plane, jet-propelled plane\nn03596099\tjet bridge\nn03596285\tjet engine\nn03596543\tjetliner\nn03597147\tjeweler's glass\nn03597317\tjewelled headdress, jeweled headdress\nn03597916\tjew's harp, jews' harp, mouth bow\nn03598151\tjib\nn03598299\tjibboom\nn03598385\tjig\nn03598515\tjig\nn03598646\tjiggermast, jigger\nn03598783\tjigsaw, scroll saw, fretsaw\nn03598930\tjigsaw puzzle\nn03599486\tjinrikisha, ricksha, rickshaw\nn03599964\tjobcentre\nn03600285\tjodhpurs, jodhpur breeches, riding breeches\nn03600475\tjodhpur, jodhpur boot, jodhpur shoe\nn03600722\tjoinery\nn03600977\tjoint\nn03601442\tJoint Direct Attack Munition, JDAM\nn03601638\tjointer, jointer plane, jointing plane, long plane\nn03601840\tjoist\nn03602081\tjolly boat, jolly\nn03602194\tjorum\nn03602365\tjoss house\nn03602686\tjournal bearing\nn03602790\tjournal box\nn03602883\tjoystick\nn03603442\tjungle gym\nn03603594\tjunk\nn03603722\tjug\nn03604156\tjukebox, nickelodeon\nn03604311\tjumbojet, jumbo jet\nn03604400\tjumper, pinafore, pinny\nn03604536\tjumper\nn03604629\tjumper\nn03604763\tjumper\nn03604843\tjumper cable, jumper lead, lead, booster cable\nn03605417\tjump seat\nn03605504\tjump suit\nn03605598\tjump suit, jumpsuit\nn03605722\tjunction\nn03605915\tjunction, conjunction\nn03606106\tjunction barrier, barrier strip\nn03606251\tjunk shop\nn03606347\tjury box\nn03606465\tjury mast\nn03607029\tkachina\nn03607186\tkaffiyeh\nn03607527\tkalansuwa\nn03607659\tKalashnikov\nn03607923\tkameez\nn03608504\tkanzu\nn03609147\tkatharometer\nn03609235\tkayak\nn03609397\tkazoo\nn03609542\tkeel\nn03609786\tkeelboat\nn03609959\tkeelson\nn03610098\tkeep, donjon, dungeon\nn03610418\tkeg\nn03610524\tkennel, doghouse, dog 
house\nn03610682\tkepi, peaked cap, service cap, yachting cap\nn03610836\tkeratoscope\nn03610992\tkerchief\nn03612010\tketch\nn03612814\tkettle, boiler\nn03612965\tkettle, kettledrum, tympanum, tympani, timpani\nn03613294\tkey\nn03613592\tkey\nn03614007\tkeyboard\nn03614383\tkeyboard buffer\nn03614532\tkeyboard instrument\nn03614782\tkeyhole\nn03614887\tkeyhole saw\nn03615300\tkhadi, khaddar\nn03615406\tkhaki\nn03615563\tkhakis\nn03615655\tkhimar\nn03615790\tkhukuri\nn03616091\tkick pleat\nn03616225\tkicksorter, pulse height analyzer\nn03616428\tkickstand\nn03616763\tkick starter, kick start\nn03616979\tkid glove, suede glove\nn03617095\tkiln\nn03617312\tkilt\nn03617480\tkimono\nn03617594\tkinescope, picture tube, television tube\nn03617834\tKinetoscope\nn03618101\tking\nn03618339\tking\nn03618546\tkingbolt, kingpin, swivel pin\nn03618678\tking post\nn03618797\tKipp's apparatus\nn03618982\tkirk\nn03619050\tkirpan\nn03619196\tkirtle\nn03619275\tkirtle\nn03619396\tkit, outfit\nn03619650\tkit\nn03619793\tkitbag, kit bag\nn03619890\tkitchen\nn03620052\tkitchen appliance\nn03620353\tkitchenette\nn03620967\tkitchen table\nn03621049\tkitchen utensil\nn03621377\tkitchenware\nn03621694\tkite balloon\nn03622058\tklaxon, claxon\nn03622401\tklieg light\nn03622526\tklystron\nn03622839\tknee brace\nn03622931\tknee-high, knee-hi\nn03623198\tknee pad\nn03623338\tknee piece\nn03623556\tknife\nn03624134\tknife\nn03624400\tknife blade\nn03624767\tknight, horse\nn03625355\tknit\nn03625539\tknitting machine\nn03625646\tknitting needle\nn03625943\tknitwear\nn03626115\tknob, boss\nn03626272\tknob, pommel\nn03626418\tknobble\nn03626502\tknobkerrie, knobkerry\nn03626760\tknocker, doorknocker, rapper\nn03627232\tknot\nn03627954\tknuckle joint, hinge joint\nn03628071\tkohl\nn03628215\tkoto\nn03628421\tkraal\nn03628511\tkremlin\nn03628728\tkris, creese, crease\nn03628831\tkrummhorn, crumhorn, cromorne\nn03628984\tKundt's tube\nn03629100\tKurdistan\nn03629231\tkurta\nn03629520\tkylix, 
cylix\nn03629643\tkymograph, cymograph\nn03630262\tlab bench, laboratory bench\nn03630383\tlab coat, laboratory coat\nn03631177\tlace\nn03631811\tlacquer\nn03631922\tlacquerware\nn03632100\tlacrosse ball\nn03632577\tladder-back\nn03632729\tladder-back, ladder-back chair\nn03632852\tladder truck, aerial ladder truck\nn03632963\tladies' room, powder room\nn03633091\tladle\nn03633341\tlady chapel\nn03633632\tlagerphone\nn03633886\tlag screw, lag bolt\nn03634034\tlake dwelling, pile dwelling\nn03634899\tlally, lally column\nn03635032\tlamasery\nn03635108\tlambrequin\nn03635330\tlame\nn03635516\tlaminar flow clean room\nn03635668\tlaminate\nn03635932\tlamination\nn03636248\tlamp\nn03636649\tlamp\nn03637027\tlamp house, lamphouse, lamp housing\nn03637181\tlamppost\nn03637318\tlampshade, lamp shade\nn03637480\tlanai\nn03637787\tlancet arch, lancet\nn03637898\tlancet window\nn03638014\tlandau\nn03638180\tlander\nn03638623\tlanding craft\nn03638743\tlanding flap\nn03638883\tlanding gear\nn03639077\tlanding net\nn03639230\tlanding skid\nn03639497\tland line, landline\nn03639675\tland mine, ground-emplaced mine, booby trap\nn03639880\tland office\nn03640850\tlanolin\nn03640988\tlantern\nn03641569\tlanyard, laniard\nn03641947\tlap, lap covering\nn03642144\tlaparoscope\nn03642341\tlapboard\nn03642444\tlapel\nn03642573\tlap joint, splice\nn03642806\tlaptop, laptop computer\nn03643149\tlaryngoscope\nn03643253\tlaser, optical maser\nn03643491\tlaser-guided bomb, LGB\nn03643737\tlaser printer\nn03643907\tlash, thong\nn03644073\tlashing\nn03644378\tlasso, lariat, riata, reata\nn03644858\tlatch\nn03645011\tlatch, door latch\nn03645168\tlatchet\nn03645290\tlatchkey\nn03645577\tlateen, lateen sail\nn03646020\tlatex paint, latex, rubber-base paint\nn03646148\tlath\nn03646296\tlathe\nn03646809\tlatrine\nn03646916\tlattice, latticework, fretwork\nn03647423\tlaunch\nn03647520\tlauncher, rocket launcher\nn03648219\tlaundry, wash, washing, washables\nn03648431\tlaundry 
cart\nn03648667\tlaundry truck\nn03649003\tlavalava\nn03649161\tlavaliere, lavalier, lavalliere\nn03649288\tlaver\nn03649674\tlawn chair, garden chair\nn03649797\tlawn furniture\nn03649909\tlawn mower, mower\nn03650551\tlayette\nn03651388\tlead-acid battery, lead-acid accumulator\nn03651605\tlead-in\nn03651843\tleading rein\nn03652100\tlead pencil\nn03652389\tleaf spring\nn03652729\tlean-to\nn03652826\tlean-to tent\nn03652932\tleash, tether, lead\nn03653110\tleatherette, imitation leather\nn03653220\tleather strip\nn03653454\tLeclanche cell\nn03653583\tlectern, reading desk\nn03653740\tlecture room\nn03653833\tlederhosen\nn03653975\tledger board\nn03654576\tleg\nn03654826\tleg\nn03655072\tlegging, leging, leg covering\nn03655470\tLeiden jar, Leyden jar\nn03655720\tleisure wear\nn03656484\tlens, lense, lens system\nn03656957\tlens, electron lens\nn03657121\tlens cap, lens cover\nn03657239\tlens implant, interocular lens implant, IOL\nn03657511\tleotard, unitard, body suit, cat suit\nn03658102\tletter case\nn03658185\tletter opener, paper knife, paperknife\nn03658635\tlevee\nn03658858\tlevel, spirit level\nn03659292\tlever\nn03659686\tlever, lever tumbler\nn03659809\tlever\nn03659950\tlever lock\nn03660124\tLevi's, levis\nn03660562\tLiberty ship\nn03660909\tlibrary\nn03661043\tlibrary\nn03661340\tlid\nn03662301\tLiebig condenser\nn03662452\tlie detector\nn03662601\tlifeboat\nn03662719\tlife buoy, lifesaver, life belt, life ring\nn03662887\tlife jacket, life vest, cork jacket\nn03663433\tlife office\nn03663531\tlife preserver, preserver, flotation device\nn03663910\tlife-support system, life support\nn03664159\tlife-support system, life support\nn03664675\tlifting device\nn03664840\tlift pump\nn03664943\tligament\nn03665232\tligature\nn03665366\tlight, light source\nn03665851\tlight arm\nn03665924\tlight bulb, lightbulb, bulb, incandescent lamp, electric light, electric-light bulb\nn03666238\tlight circuit, lighting circuit\nn03666362\tlight-emitting diode, 
LED\nn03666591\tlighter, light, igniter, ignitor\nn03666917\tlighter-than-air craft\nn03667060\tlight filter, diffusing screen\nn03667235\tlighting\nn03667552\tlight machine gun\nn03667664\tlight meter, exposure meter, photometer\nn03667829\tlight microscope\nn03668067\tlightning rod, lightning conductor\nn03668279\tlight pen, electronic stylus\nn03668488\tlightship\nn03668803\tLilo\nn03669245\tlimber\nn03669534\tlimekiln\nn03669886\tlimiter, clipper\nn03670208\tlimousine, limo\nn03671914\tlinear accelerator, linac\nn03672521\tlinen\nn03672827\tline printer, line-at-a-time printer\nn03673027\tliner, ocean liner\nn03673270\tliner, lining\nn03673450\tlingerie, intimate apparel\nn03673767\tlining, liner\nn03674270\tlink, data link\nn03674440\tlinkage\nn03674731\tLink trainer\nn03674842\tlinocut\nn03675076\tlinoleum knife, linoleum cutter\nn03675235\tLinotype, Linotype machine\nn03675445\tlinsey-woolsey\nn03675558\tlinstock\nn03675907\tlion-jaw forceps\nn03676087\tlip-gloss\nn03676483\tlipstick, lip rouge\nn03676623\tliqueur glass\nn03676759\tliquid crystal display, LCD\nn03677115\tliquid metal reactor\nn03677682\tlisle\nn03677766\tlister, lister plow, lister plough, middlebreaker, middle buster\nn03678558\tlitterbin, litter basket, litter-basket\nn03678729\tlittle theater, little theatre\nn03678879\tlive axle, driving axle\nn03679384\tliving quarters, quarters\nn03679712\tliving room, living-room, sitting room, front room, parlor, parlour\nn03680248\tload\nn03680355\tLoafer\nn03680512\tloaner\nn03680734\tlobe\nn03680858\tlobster pot\nn03680942\tlocal\nn03681477\tlocal area network, LAN\nn03681813\tlocal oscillator, heterodyne oscillator\nn03682380\tLochaber ax\nn03682487\tlock\nn03682877\tlock, ignition lock\nn03683079\tlock, lock chamber\nn03683341\tlock\nn03683457\tlockage\nn03683606\tlocker\nn03683708\tlocker room\nn03683995\tlocket\nn03684143\tlock-gate\nn03684224\tlocking pliers\nn03684489\tlockring, lock ring, lock 
washer\nn03684611\tlockstitch\nn03684740\tlockup\nn03684823\tlocomotive, engine, locomotive engine, railway locomotive\nn03685307\tlodge, indian lodge\nn03685486\tlodge, hunting lodge\nn03685640\tlodge\nn03685820\tlodging house, rooming house\nn03686130\tloft, attic, garret\nn03686363\tloft, pigeon loft\nn03686470\tloft\nn03686924\tlog cabin\nn03687137\tloggia\nn03687928\tlongbow\nn03688066\tlong iron\nn03688192\tlong johns\nn03688405\tlong sleeve\nn03688504\tlong tom\nn03688605\tlong trousers, long pants\nn03688707\tlong underwear, union suit\nn03688832\tlooking glass, glass\nn03688943\tlookout, observation tower, lookout station, observatory\nn03689157\tloom\nn03689570\tloop knot\nn03690168\tlorgnette\nn03690279\tLorraine cross, cross of Lorraine\nn03690473\tlorry, camion\nn03690851\tlota\nn03690938\tlotion\nn03691459\tloudspeaker, speaker, speaker unit, loudspeaker system, speaker system\nn03691817\tlounge, waiting room, waiting area\nn03692004\tlounger\nn03692136\tlounging jacket, smoking jacket\nn03692272\tlounging pajama, lounging pyjama\nn03692379\tloungewear\nn03692522\tloupe, jeweler's loupe\nn03692842\tlouvered window, jalousie\nn03693293\tlove knot, lovers' knot, lover's knot, true lovers' knot, true lover's knot\nn03693474\tlove seat, loveseat, tete-a-tete, vis-a-vis\nn03693707\tloving cup\nn03693860\tlowboy\nn03694196\tlow-pass filter\nn03694356\tlow-warp-loom\nn03694639\tLP, L-P\nn03694761\tL-plate\nn03694949\tlubber's hole\nn03695122\tlubricating system, force-feed lubricating system, force feed, pressure-feed lubricating system, pressure feed\nn03695452\tluff\nn03695616\tlug\nn03695753\tluge\nn03695857\tLuger\nn03695957\tluggage carrier\nn03696065\tluggage compartment, automobile trunk, trunk\nn03696301\tluggage rack, roof rack\nn03696445\tlugger\nn03696568\tlugsail, lug\nn03696746\tlug wrench\nn03696909\tlumberjack, lumber jacket\nn03697007\tlumbermill, sawmill\nn03697366\tlunar excursion module, lunar module, 
LEM\nn03697552\tlunchroom\nn03697812\tlunette\nn03697913\tlungi, lungyi, longyi\nn03698123\tlunula\nn03698226\tlusterware\nn03698360\tlute\nn03698604\tluxury liner, express luxury liner\nn03698723\tlyceum\nn03698815\tlychgate, lichgate\nn03699280\tlyre\nn03699591\tmachete, matchet, panga\nn03699754\tmachicolation\nn03699975\tmachine\nn03700963\tmachine, simple machine\nn03701191\tmachine bolt\nn03701391\tmachine gun\nn03701640\tmachinery\nn03701790\tmachine screw\nn03702248\tmachine tool\nn03702440\tmachinist's vise, metalworking vise\nn03702582\tmachmeter\nn03703075\tmackinaw\nn03703203\tmackinaw, Mackinaw boat\nn03703463\tmackinaw, Mackinaw coat\nn03703590\tmackintosh, macintosh\nn03703730\tmacrame\nn03703862\tmadras\nn03703945\tMae West, air jacket\nn03704549\tmagazine rack\nn03704834\tmagic lantern\nn03705379\tmagnet\nn03705808\tmagnetic bottle\nn03706229\tmagnetic compass\nn03706415\tmagnetic core memory, core memory\nn03706653\tmagnetic disk, magnetic disc, disk, disc\nn03706939\tmagnetic head\nn03707171\tmagnetic mine\nn03707372\tmagnetic needle\nn03707597\tmagnetic recorder\nn03707766\tmagnetic stripe\nn03708036\tmagnetic tape, mag tape, tape\nn03708425\tmagneto, magnetoelectric machine\nn03708843\tmagnetometer, gaussmeter\nn03708962\tmagnetron\nn03709206\tmagnifier\nn03709363\tmagnum\nn03709545\tmagnus hitch\nn03709644\tmail\nn03709823\tmailbag, postbag\nn03709960\tmailbag, mail pouch\nn03710079\tmailboat, mail boat, packet, packet boat\nn03710193\tmailbox, letter box\nn03710294\tmail car\nn03710421\tmaildrop\nn03710528\tmailer\nn03710637\tmaillot\nn03710721\tmaillot, tank suit\nn03710937\tmailsorter\nn03711044\tmail train\nn03711711\tmainframe, mainframe computer\nn03711999\tmainmast\nn03712111\tmain rotor\nn03712337\tmainsail\nn03712444\tmainspring\nn03712887\tmain-topmast\nn03712981\tmain-topsail\nn03713069\tmain yard\nn03713151\tmaisonette, maisonnette\nn03713436\tmajolica, maiolica\nn03714235\tmakeup, make-up, war paint\nn03715114\tMaksutov 
telescope\nn03715275\tmalacca, malacca cane\nn03715386\tmallet, beetle\nn03715669\tmallet, hammer\nn03715892\tmallet\nn03716228\tmammogram\nn03716887\tmandola\nn03716966\tmandolin\nn03717131\tmanger, trough\nn03717285\tmangle\nn03717447\tmanhole\nn03717622\tmanhole cover\nn03718212\tman-of-war, ship of the line\nn03718335\tmanometer\nn03718458\tmanor, manor house\nn03718581\tmanor hall, hall\nn03718699\tMANPAD\nn03718789\tmansard, mansard roof\nn03718935\tmanse\nn03719053\tmansion, mansion house, manse, hall, residence\nn03719343\tmantel, mantelpiece, mantle, mantlepiece, chimneypiece\nn03719560\tmantelet, mantilla\nn03719743\tmantilla\nn03720005\tMao jacket\nn03720163\tmap\nn03720665\tmaquiladora\nn03720891\tmaraca\nn03721047\tmarble\nn03721252\tmarching order\nn03721384\tmarimba, xylophone\nn03721590\tmarina\nn03722007\tmarker\nn03722288\tmarketplace, market place, mart, market\nn03722646\tmarlinespike, marlinspike, marlingspike\nn03722944\tmarocain, crepe marocain\nn03723153\tmarquee, marquise\nn03723267\tmarquetry, marqueterie\nn03723439\tmarriage bed\nn03723781\tmartello tower\nn03723885\tmartingale\nn03724066\tmascara\nn03724176\tmaser\nn03724417\tmasher\nn03724538\tmashie, five iron\nn03724623\tmashie niblick, seven iron\nn03724756\tmasjid, musjid\nn03724870\tmask\nn03725035\tmask\nn03725506\tMasonite\nn03725600\tMason jar\nn03725717\tmasonry\nn03725869\tmason's level\nn03726116\tmassage parlor\nn03726233\tmassage parlor\nn03726371\tmass spectrograph\nn03726516\tmass spectrometer, spectrometer\nn03726760\tmast\nn03726993\tmast\nn03727067\tmastaba, mastabah\nn03727465\tmaster bedroom\nn03727605\tmasterpiece, chef-d'oeuvre\nn03727837\tmat\nn03727946\tmat, gym mat\nn03728437\tmatch, lucifer, friction match\nn03728982\tmatch\nn03729131\tmatchboard\nn03729308\tmatchbook\nn03729402\tmatchbox\nn03729482\tmatchlock\nn03729647\tmatch plane, tonguing and grooving plane\nn03729826\tmatchstick\nn03729951\tmaterial\nn03730153\tmateriel, equipage\nn03730334\tmaternity 
hospital\nn03730494\tmaternity ward\nn03730655\tmatrix\nn03730788\tMatthew Walker, Matthew Walker knot\nn03730893\tmatting\nn03731019\tmattock\nn03731483\tmattress cover\nn03731695\tmaul, sledge, sledgehammer\nn03731882\tmaulstick, mahlstick\nn03732020\tMauser\nn03732114\tmausoleum\nn03732458\tmaxi\nn03732543\tMaxim gun\nn03732658\tmaximum and minimum thermometer\nn03733131\tmaypole\nn03733281\tmaze, labyrinth\nn03733465\tmazer\nn03733547\tmeans\nn03733644\tmeasure\nn03733805\tmeasuring cup\nn03733925\tmeasuring instrument, measuring system, measuring device\nn03735637\tmeasuring stick, measure, measuring rod\nn03735963\tmeat counter\nn03736064\tmeat grinder\nn03736147\tmeat hook\nn03736269\tmeat house\nn03736372\tmeat safe\nn03736470\tmeat thermometer\nn03736970\tmechanical device\nn03738066\tmechanical piano, Pianola, player piano\nn03738241\tmechanical system\nn03738472\tmechanism\nn03739518\tmedical building, health facility, healthcare facility\nn03739693\tmedical instrument\nn03742019\tmedicine ball\nn03742115\tmedicine chest, medicine cabinet\nn03742238\tMEDLINE\nn03743016\tmegalith, megalithic structure\nn03743279\tmegaphone\nn03743902\tmemorial, monument\nn03744276\tmemory, computer memory, storage, computer storage, store, memory board\nn03744684\tmemory chip\nn03744840\tmemory device, storage device\nn03745146\tmenagerie, zoo, zoological garden\nn03745487\tmending\nn03745571\tmenhir, standing stone\nn03746005\tmenorah\nn03746155\tMenorah\nn03746330\tman's clothing\nn03746486\tmen's room, men's\nn03748162\tmercantile establishment, retail store, sales outlet, outlet\nn03749504\tmercury barometer\nn03749634\tmercury cell\nn03749807\tmercury thermometer, mercury-in-glass thermometer\nn03750206\tmercury-vapor lamp\nn03750437\tmercy seat\nn03750614\tmerlon\nn03751065\tmess, mess hall\nn03751269\tmess jacket, monkey jacket, shell jacket\nn03751458\tmess kit\nn03751590\tmessuage\nn03751757\tmetal detector\nn03752071\tmetallic\nn03752185\tmetal 
screw\nn03752398\tmetal wood\nn03752922\tmeteorological balloon\nn03753077\tmeter\nn03753514\tmeterstick, metrestick\nn03757604\tmetronome\nn03758089\tmezzanine, mezzanine floor, entresol\nn03758220\tmezzanine, first balcony\nn03758894\tmicrobalance\nn03758992\tmicrobrewery\nn03759243\tmicrofiche\nn03759432\tmicrofilm\nn03759661\tmicrometer, micrometer gauge, micrometer caliper\nn03759954\tmicrophone, mike\nn03760310\tmicroprocessor\nn03760671\tmicroscope\nn03760944\tmicrotome\nn03761084\tmicrowave, microwave oven\nn03761588\tmicrowave diathermy machine\nn03761731\tmicrowave linear accelerator\nn03762238\tmiddy, middy blouse\nn03762332\tmidiron, two iron\nn03762434\tmihrab\nn03762602\tmihrab\nn03762982\tmilitary hospital\nn03763727\tmilitary quarters\nn03763968\tmilitary uniform\nn03764276\tmilitary vehicle\nn03764606\tmilk bar\nn03764736\tmilk can\nn03764822\tmilk float\nn03764995\tmilking machine\nn03765128\tmilking stool\nn03765467\tmilk wagon, milkwagon\nn03765561\tmill, grinder, milling machinery\nn03765934\tmilldam\nn03766044\tmiller, milling machine\nn03766218\tmilliammeter\nn03766322\tmillinery, woman's hat\nn03766508\tmillinery, hat shop\nn03766600\tmilling\nn03766697\tmillivoltmeter\nn03766935\tmillstone\nn03767112\tmillstone\nn03767203\tmillwheel, mill wheel\nn03767459\tmimeograph, mimeo, mimeograph machine, Roneo, Roneograph\nn03767745\tminaret\nn03767966\tmincer, mincing machine\nn03768132\tmine\nn03768683\tmine detector\nn03768823\tminelayer\nn03768916\tmineshaft\nn03769610\tminibar, cellaret\nn03769722\tminibike, motorbike\nn03769881\tminibus\nn03770085\tminicar\nn03770224\tminicomputer\nn03770316\tministry\nn03770439\tminiskirt, mini\nn03770520\tminisub, minisubmarine\nn03770679\tminivan\nn03770834\tminiver\nn03770954\tmink, mink coat\nn03772077\tminster\nn03772269\tmint\nn03772584\tminute hand, big hand\nn03772674\tMinuteman\nn03773035\tmirror\nn03773504\tmissile\nn03773835\tmissile defense system, missile defence system\nn03774327\tmiter box, 
mitre box\nn03774461\tmiter joint, mitre joint, miter, mitre\nn03775071\tmitten\nn03775199\tmixer\nn03775388\tmixer\nn03775546\tmixing bowl\nn03775636\tmixing faucet\nn03775747\tmizzen, mizen\nn03775847\tmizzenmast, mizenmast, mizzen, mizen\nn03776167\tmobcap\nn03776460\tmobile home, manufactured home\nn03776877\tmoccasin, mocassin\nn03776997\tmock-up\nn03777126\tmod con\nn03777568\tModel T\nn03777754\tmodem\nn03778459\tmodillion\nn03778817\tmodule\nn03779000\tmodule\nn03779128\tmohair\nn03779246\tmoire, watered-silk\nn03779370\tmold, mould, cast\nn03779884\tmoldboard, mouldboard\nn03780047\tmoldboard plow, mouldboard plough\nn03780799\tmoleskin\nn03781055\tMolotov cocktail, petrol bomb, gasoline bomb\nn03781244\tmonastery\nn03781467\tmonastic habit\nn03781594\tmoneybag\nn03781683\tmoney belt\nn03781787\tmonitor\nn03782006\tmonitor\nn03782190\tmonitor, monitoring device\nn03782794\tmonkey-wrench, monkey wrench\nn03782929\tmonk's cloth\nn03783304\tmonochrome\nn03783430\tmonocle, eyeglass\nn03783575\tmonofocal lens implant, monofocal IOL\nn03783873\tmonoplane\nn03784139\tmonotype\nn03784270\tmonstrance, ostensorium\nn03784793\tmooring tower, mooring mast\nn03784896\tMoorish arch, horseshoe arch\nn03785016\tmoped\nn03785142\tmop handle\nn03785237\tmoquette\nn03785499\tmorgue, mortuary, dead room\nn03785721\tmorion, cabasset\nn03786096\tmorning dress\nn03786194\tmorning dress\nn03786313\tmorning room\nn03786621\tMorris chair\nn03786715\tmortar, howitzer, trench mortar\nn03786901\tmortar\nn03787032\tmortarboard\nn03787523\tmortise joint, mortise-and-tenon joint\nn03788047\tmosaic\nn03788195\tmosque\nn03788365\tmosquito net\nn03788498\tmotel\nn03788601\tmotel room\nn03788914\tMother Hubbard, muumuu\nn03789171\tmotion-picture camera, movie camera, cine-camera\nn03789400\tmotion-picture film, movie film, cine-film\nn03789603\tmotley\nn03789794\tmotley\nn03789946\tmotor\nn03790230\tmotorboat, powerboat\nn03790512\tmotorcycle, bike\nn03790755\tmotor hotel, motor inn, motor 
lodge, tourist court, court\nn03790953\tmotorized wheelchair\nn03791053\tmotor scooter, scooter\nn03791235\tmotor vehicle, automotive vehicle\nn03792048\tmound, hill\nn03792334\tmound, hill, pitcher's mound\nn03792526\tmount, setting\nn03792782\tmountain bike, all-terrain bike, off-roader\nn03792972\tmountain tent\nn03793489\tmouse, computer mouse\nn03793850\tmouse button\nn03794056\tmousetrap\nn03794136\tmousse, hair mousse, hair gel\nn03794798\tmouthpiece, embouchure\nn03795123\tmouthpiece\nn03795269\tmouthpiece, gumshield\nn03795758\tmovement\nn03795976\tmovie projector, cine projector, film projector\nn03796181\tmoving-coil galvanometer\nn03796401\tmoving van\nn03796522\tmud brick\nn03796605\tmudguard, splash guard, splash-guard\nn03796848\tmudhif\nn03796974\tmuff\nn03797062\tmuffle\nn03797182\tmuffler\nn03797264\tmufti\nn03797390\tmug\nn03797896\tmulch\nn03798061\tmule, scuff\nn03798442\tmultichannel recorder\nn03798610\tmultiengine airplane, multiengine plane\nn03798982\tmultiplex\nn03799113\tmultiplexer\nn03799240\tmultiprocessor\nn03799375\tmultistage rocket, step rocket\nn03799610\tmunition, ordnance, ordnance store\nn03799876\tMurphy bed\nn03800371\tmusette, shepherd's pipe\nn03800485\tmusette pipe\nn03800563\tmuseum\nn03800772\tmushroom anchor\nn03800933\tmusical instrument, instrument\nn03801353\tmusic box, musical box\nn03801533\tmusic hall, vaudeville theater, vaudeville theatre\nn03801671\tmusic school\nn03801760\tmusic stand, music rack\nn03801880\tmusic stool, piano stool\nn03802007\tmusket\nn03802228\tmusket ball, ball\nn03802393\tmuslin\nn03802643\tmustache cup, moustache cup\nn03802800\tmustard plaster, sinapism\nn03802973\tmute\nn03803116\tmuzzle loader\nn03803284\tmuzzle\nn03803780\tmyelogram\nn03804211\tnacelle\nn03804744\tnail\nn03805180\tnailbrush\nn03805280\tnailfile\nn03805374\tnailhead\nn03805503\tnailhead\nn03805725\tnail polish, nail enamel, nail varnish\nn03805933\tnainsook\nn03807334\tNapier's bones, Napier's rods\nn03809211\tnard, 
spikenard\nn03809312\tnarrowbody aircraft, narrow-body aircraft, narrow-body\nn03809603\tnarrow wale\nn03809686\tnarthex\nn03809802\tnarthex\nn03810412\tnasotracheal tube\nn03810952\tnational monument\nn03811295\tnautilus, nuclear submarine, nuclear-powered submarine\nn03811444\tnavigational system\nn03811847\tnaval equipment\nn03811965\tnaval gun\nn03812263\tnaval missile\nn03812382\tnaval radar\nn03812789\tnaval tactical data system\nn03812924\tnaval weaponry\nn03813078\tnave\nn03813176\tnavigational instrument\nn03813946\tnebuchadnezzar\nn03814528\tneckband\nn03814639\tneck brace\nn03814727\tneckcloth, stock\nn03814817\tneckerchief\nn03814906\tnecklace\nn03815149\tnecklet\nn03815278\tneckline\nn03815482\tneckpiece\nn03815615\tnecktie, tie\nn03816005\tneckwear\nn03816136\tneedle\nn03816394\tneedle\nn03816530\tneedlenose pliers\nn03816849\tneedlework, needlecraft\nn03817191\tnegative\nn03817331\tnegative magnetic pole, negative pole, south-seeking pole\nn03817522\tnegative pole\nn03817647\tnegligee, neglige, peignoir, wrapper, housecoat\nn03818001\tneolith\nn03818343\tneon lamp, neon induction lamp, neon tube\nn03819047\tnephoscope\nn03819336\tnest\nn03819448\tnest egg\nn03819595\tnet, network, mesh, meshing, meshwork\nn03819994\tnet\nn03820154\tnet\nn03820318\tnet\nn03820728\tnetwork, electronic network\nn03820950\tnetwork\nn03821145\tneutron bomb\nn03821424\tnewel\nn03821518\tnewel post, newel\nn03822171\tnewspaper, paper\nn03822361\tnewsroom\nn03822504\tnewsroom\nn03822656\tnewsstand\nn03822767\tNewtonian telescope, Newtonian reflector\nn03823111\tnib, pen nib\nn03823216\tniblick, nine iron\nn03823312\tnicad, nickel-cadmium accumulator\nn03823673\tnickel-iron battery, nickel-iron accumulator\nn03823906\tNicol prism\nn03824197\tnight bell\nn03824284\tnightcap\nn03824381\tnightgown, gown, nightie, night-robe, nightdress\nn03824589\tnight latch\nn03824713\tnight-light\nn03824999\tnightshirt\nn03825080\tnightwear, sleepwear, nightclothes\nn03825271\tninepin, 
skittle, skittle pin\nn03825442\tninepin ball, skittle ball\nn03825673\tninon\nn03825788\tnipple\nn03825913\tnipple shield\nn03826039\tniqab\nn03826186\tNissen hut, Quonset hut\nn03827420\tnogging\nn03827536\tnoisemaker\nn03828020\tnonsmoker, nonsmoking car\nn03829340\tnon-volatile storage, nonvolatile storage\nn03829857\tNorfolk jacket\nn03829954\tnoria\nn03831203\tnosebag, feedbag\nn03831382\tnoseband, nosepiece\nn03831757\tnose flute\nn03832144\tnosewheel\nn03832673\tnotebook, notebook computer\nn03833907\tnuclear-powered ship\nn03834040\tnuclear reactor, reactor\nn03834472\tnuclear rocket\nn03834604\tnuclear weapon, atomic weapon\nn03835197\tnude, nude painting\nn03835729\tnumdah, numdah rug, nammad\nn03835941\tnun's habit\nn03836062\tnursery, baby's room\nn03836451\tnut and bolt\nn03836602\tnutcracker\nn03836906\tnylon\nn03836976\tnylons, nylon stocking, rayons, rayon stocking, silk stocking\nn03837422\toar\nn03837606\toast\nn03837698\toast house\nn03837869\tobelisk\nn03838024\tobject ball\nn03838298\tobjective, objective lens, object lens, object glass\nn03838748\toblique bandage\nn03838899\toboe, hautboy, hautbois\nn03839172\toboe da caccia\nn03839276\toboe d'amore\nn03839424\tobservation dome\nn03839671\tobservatory\nn03839795\tobstacle\nn03840327\tobturator\nn03840681\tocarina, sweet potato\nn03840823\toctant\nn03841011\todd-leg caliper\nn03841143\todometer, hodometer, mileometer, milometer\nn03841290\toeil de boeuf\nn03841666\toffice, business office\nn03842012\toffice building, office block\nn03842156\toffice furniture\nn03842276\tofficer's mess\nn03842377\toff-line equipment, auxiliary equipment\nn03842585\togee, cyma reversa\nn03842754\togee arch, keel arch\nn03842986\tohmmeter\nn03843092\toil, oil color, oil colour\nn03843316\toilcan\nn03843438\toilcloth\nn03843555\toil filter\nn03843883\toil heater, oilstove, kerosene heater, kerosine heater\nn03844045\toil lamp, kerosene lamp, kerosine lamp\nn03844233\toil paint\nn03844550\toil pump\nn03844673\toil 
refinery, petroleum refinery\nn03844815\toilskin, slicker\nn03844965\toil slick\nn03845107\toilstone\nn03845190\toil tanker, oiler, tanker, tank ship\nn03845990\told school tie\nn03846100\tolive drab\nn03846234\tolive drab, olive-drab uniform\nn03846431\tOlympian Zeus\nn03846677\tomelet pan, omelette pan\nn03846772\tomnidirectional antenna, nondirectional antenna\nn03846970\tomnirange, omnidirectional range, omnidirectional radio range\nn03847471\tonion dome\nn03847823\topen-air market, open-air marketplace, market square\nn03848033\topen circuit\nn03848168\topen-end wrench, tappet wrench\nn03848348\topener\nn03848537\topen-hearth furnace\nn03849275\topenside plane, rabbet plane\nn03849412\topen sight\nn03849679\topenwork\nn03849814\topera, opera house\nn03849943\topera cloak, opera hood\nn03850053\toperating microscope\nn03850245\toperating room, OR, operating theater, operating theatre, surgery\nn03850492\toperating table\nn03850613\tophthalmoscope\nn03851341\toptical device\nn03851787\toptical disk, optical disc\nn03852280\toptical instrument\nn03852544\toptical pyrometer, pyroscope\nn03852688\toptical telescope\nn03853291\torchestra pit, pit\nn03853924\tordinary, ordinary bicycle\nn03854065\torgan, pipe organ\nn03854421\torgandy, organdie\nn03854506\torganic light-emitting diode, OLED\nn03854722\torgan loft\nn03854815\torgan pipe, pipe, pipework\nn03855214\torganza\nn03855333\toriel, oriel window\nn03855464\toriflamme\nn03855604\tO ring\nn03855756\tOrlon\nn03855908\torlop deck, orlop, fourth deck\nn03856012\torphanage, orphans' asylum\nn03856335\torphrey\nn03856465\torrery\nn03856728\torthicon, image orthicon\nn03857026\torthochromatic film\nn03857156\torthopter, ornithopter\nn03857291\torthoscope\nn03857687\toscillograph\nn03857828\toscilloscope, scope, cathode-ray oscilloscope, CRO\nn03858085\tossuary\nn03858183\totoscope, auriscope, auroscope\nn03858418\tottoman, pouf, pouffe, puff, hassock\nn03858533\toubliette\nn03858837\tout-basket, 
out-tray\nn03859000\toutboard motor, outboard\nn03859170\toutboard motorboat, outboard\nn03859280\toutbuilding\nn03859495\touterwear, overclothes\nn03859608\toutfall\nn03859958\toutfit, getup, rig, turnout\nn03860234\toutfitter\nn03860404\touthouse, privy, earth-closet, jakes\nn03861048\toutput device\nn03861271\toutrigger\nn03861430\toutrigger canoe\nn03861596\toutside caliper\nn03861842\toutside mirror\nn03862379\toutwork\nn03862676\toven\nn03862862\toven thermometer\nn03863108\toverall\nn03863262\toverall, boilersuit, boilers suit\nn03863657\tovercoat, overcoating\nn03863783\toverdrive\nn03863923\tovergarment, outer garment\nn03864139\toverhand knot\nn03864356\toverhang\nn03864692\toverhead projector\nn03865288\tovermantel\nn03865371\tovernighter, overnight bag, overnight case\nn03865557\toverpass, flyover\nn03865820\toverride\nn03865949\tovershoe\nn03866082\toverskirt\nn03867854\toxbow\nn03868044\tOxbridge\nn03868242\toxcart\nn03868324\toxeye\nn03868406\toxford\nn03868643\toximeter\nn03868763\toxyacetylene torch\nn03868863\toxygen mask\nn03869838\toyster bar\nn03869976\toyster bed, oyster bank, oyster park\nn03870105\tpace car\nn03870290\tpacemaker, artificial pacemaker\nn03870546\tpack\nn03870672\tpack\nn03870980\tpack, face pack\nn03871083\tpackage, parcel\nn03871371\tpackage store, liquor store, off-licence\nn03871524\tpackaging\nn03871628\tpacket\nn03871724\tpacking box, packing case\nn03871860\tpackinghouse, packing plant\nn03872016\tpackinghouse\nn03872167\tpacking needle\nn03872273\tpacksaddle\nn03873416\tpaddle, boat paddle\nn03873699\tpaddle\nn03873848\tpaddle\nn03873996\tpaddle box, paddle-box\nn03874138\tpaddle steamer, paddle-wheeler\nn03874293\tpaddlewheel, paddle wheel\nn03874487\tpaddock\nn03874599\tpadlock\nn03874823\tpage printer, page-at-a-time printer\nn03875218\tpaint, pigment\nn03875806\tpaintball\nn03875955\tpaintball gun\nn03876111\tpaintbox\nn03876231\tpaintbrush\nn03877351\tpaisley\nn03877472\tpajama, pyjama, pj's, 
jammies\nn03877674\tpajama, pyjama\nn03877845\tpalace\nn03878066\tpalace, castle\nn03878211\tpalace\nn03878294\tpalanquin, palankeen\nn03878418\tpaleolith\nn03878511\tpalestra, palaestra\nn03878674\tpalette, pallet\nn03878828\tpalette knife\nn03878963\tpalisade\nn03879456\tpallet\nn03879705\tpallette, palette\nn03880032\tpallium\nn03880129\tpallium\nn03880323\tpan\nn03880531\tpan, cooking pan\nn03881305\tpancake turner\nn03881404\tpanchromatic film\nn03881534\tpanda car\nn03882611\tpaneling, panelling, pane\nn03882960\tpanhandle\nn03883054\tpanic button\nn03883385\tpannier\nn03883524\tpannier\nn03883664\tpannikin\nn03883773\tpanopticon\nn03883944\tpanopticon\nn03884397\tpanpipe, pandean pipe, syrinx\nn03884554\tpantaloon\nn03884639\tpantechnicon\nn03884778\tpantheon\nn03884926\tpantheon\nn03885028\tpantie, panty, scanty, step-in\nn03885194\tpanting, trousering\nn03885293\tpant leg, trouser leg\nn03885410\tpantograph\nn03885535\tpantry, larder, buttery\nn03885669\tpants suit, pantsuit\nn03885788\tpanty girdle\nn03885904\tpantyhose\nn03886053\tpanzer\nn03886641\tpaper chain\nn03886762\tpaper clip, paperclip, gem clip\nn03886940\tpaper cutter\nn03887185\tpaper fastener\nn03887330\tpaper feed\nn03887512\tpaper mill\nn03887697\tpaper towel\nn03887899\tparabolic mirror\nn03888022\tparabolic reflector, paraboloid reflector\nn03888257\tparachute, chute\nn03888605\tparallel bars, bars\nn03888808\tparallel circuit, shunt circuit\nn03888998\tparallel interface, parallel port\nn03889397\tparang\nn03889503\tparapet, breastwork\nn03889626\tparapet\nn03889726\tparasail\nn03889871\tparasol, sunshade\nn03890093\tparer, paring knife\nn03890233\tparfait glass\nn03890358\tpargeting, pargetting, pargetry\nn03890514\tpari-mutuel machine, totalizer, totaliser, totalizator, totalisator\nn03891051\tparka, windbreaker, windcheater, anorak\nn03891251\tpark bench\nn03891332\tparking meter\nn03891538\tparlor, parlour\nn03892178\tparquet, parquet floor\nn03892425\tparquetry, 
parqueterie\nn03892557\tparsonage, vicarage, rectory\nn03892728\tParsons table\nn03893935\tpartial denture\nn03894051\tparticle detector\nn03894379\tpartition, divider\nn03894677\tparts bin\nn03894933\tparty line\nn03895038\tparty wall\nn03895170\tparvis\nn03895866\tpassenger car, coach, carriage\nn03896103\tpassenger ship\nn03896233\tpassenger train\nn03896419\tpassenger van\nn03896526\tpasse-partout\nn03896628\tpassive matrix display\nn03896984\tpasskey, passe-partout, master key, master\nn03897130\tpass-through\nn03897634\tpastry cart\nn03897943\tpatch\nn03898129\tpatchcord\nn03898271\tpatchouli, patchouly, pachouli\nn03898395\tpatch pocket\nn03898633\tpatchwork, patchwork quilt\nn03898787\tpatent log, screw log, taffrail log\nn03899100\tpaternoster\nn03899612\tpatina\nn03899768\tpatio, terrace\nn03899933\tpatisserie\nn03900028\tpatka\nn03900194\tpatrol boat, patrol ship\nn03900301\tpatty-pan\nn03900393\tpave\nn03900979\tpavilion, marquee\nn03901229\tpavior, paviour, paving machine\nn03901338\tpavis, pavise\nn03901750\tpawn\nn03901974\tpawnbroker's shop, pawnshop, loan office\nn03902125\tpay-phone, pay-station\nn03902220\tPC board\nn03902482\tpeach orchard\nn03902756\tpea jacket, peacoat\nn03903133\tpeavey, peavy, cant dog, dog hook\nn03903290\tpectoral, pectoral medallion\nn03903424\tpedal, treadle, foot pedal, foot lever\nn03903733\tpedal pusher, toreador pants\nn03903868\tpedestal, plinth, footstall\nn03904060\tpedestal table\nn03904183\tpedestrian crossing, zebra crossing\nn03904433\tpedicab, cycle rickshaw\nn03904657\tpediment\nn03904782\tpedometer\nn03904909\tpeeler\nn03905361\tpeep sight\nn03905540\tpeg, nog\nn03905730\tpeg, pin, thole, tholepin, rowlock, oarlock\nn03905947\tpeg\nn03906106\tpeg, wooden leg, leg, pegleg\nn03906224\tpegboard\nn03906463\tPelham\nn03906590\tpelican crossing\nn03906789\tpelisse\nn03906894\tpelvimeter\nn03906997\tpen\nn03907475\tpenal colony\nn03907654\tpenal institution, penal facility\nn03907908\tpenalty 
box\nn03908111\tpen-and-ink\nn03908204\tpencil\nn03908456\tpencil\nn03908618\tpencil box, pencil case\nn03908714\tpencil sharpener\nn03909020\tpendant earring, drop earring, eardrop\nn03909160\tpendulum\nn03909406\tpendulum clock\nn03909516\tpendulum watch\nn03909658\tpenetration bomb\nn03911406\tpenile implant\nn03911513\tpenitentiary, pen\nn03911658\tpenknife\nn03911767\tpenlight\nn03911866\tpennant, pennon, streamer, waft\nn03912218\tpennywhistle, tin whistle, whistle\nn03912821\tpenthouse\nn03913343\tpentode\nn03913930\tpeplos, peplus, peplum\nn03914106\tpeplum\nn03914337\tpepper mill, pepper grinder\nn03914438\tpepper shaker, pepper box, pepper pot\nn03914583\tpepper spray\nn03914831\tpercale\nn03915118\tpercolator\nn03915320\tpercussion cap\nn03915437\tpercussion instrument, percussive instrument\nn03915900\tperforation\nn03916031\tperfume, essence\nn03916289\tperfumery\nn03916385\tperfumery\nn03916470\tperfumery\nn03916720\tperipheral, computer peripheral, peripheral device\nn03917048\tperiscope\nn03917198\tperistyle\nn03917327\tperiwig, peruke\nn03917814\tpermanent press, durable press\nn03918074\tperpetual motion machine\nn03918480\tpersonal computer, PC, microcomputer\nn03918737\tpersonal digital assistant, PDA, personal organizer, personal organiser, organizer, organiser\nn03919096\tpersonnel carrier\nn03919289\tpestle\nn03919430\tpestle, muller, pounder\nn03919808\tpetcock\nn03920288\tPetri dish\nn03920384\tpetrolatum gauze\nn03920641\tpet shop\nn03920737\tpetticoat, half-slip, underskirt\nn03920867\tpew, church bench\nn03923379\tphial, vial, ampule, ampul, ampoule\nn03923564\tPhillips screw\nn03923692\tPhillips screwdriver\nn03923918\tphonograph needle, needle\nn03924069\tphonograph record, phonograph recording, record, disk, disc, platter\nn03924407\tphotocathode\nn03924532\tphotocoagulator\nn03924679\tphotocopier\nn03926148\tphotographic equipment\nn03926412\tphotographic paper, photographic 
material\nn03926876\tphotometer\nn03927091\tphotomicrograph\nn03927299\tPhotostat, Photostat machine\nn03927539\tphotostat\nn03927792\tphysical pendulum, compound pendulum\nn03928116\tpiano, pianoforte, forte-piano\nn03928589\tpiano action\nn03928814\tpiano keyboard, fingerboard, clavier\nn03928994\tpiano wire\nn03929091\tpiccolo\nn03929202\tpick, pickax, pickaxe\nn03929443\tpick\nn03929660\tpick, plectrum, plectron\nn03929855\tpickelhaube\nn03930229\tpicket boat\nn03930313\tpicket fence, paling\nn03930431\tpicket ship\nn03930515\tpickle barrel\nn03930630\tpickup, pickup truck\nn03931765\tpicture frame\nn03931885\tpicture hat\nn03931980\tpicture rail\nn03932080\tpicture window\nn03932670\tpiece of cloth, piece of material\nn03933391\tpied-a-terre\nn03933933\tpier\nn03934042\tpier\nn03934229\tpier arch\nn03934311\tpier glass, pier mirror\nn03934565\tpier table\nn03934656\tpieta\nn03934890\tpiezometer\nn03935116\tpig bed, pig\nn03935234\tpiggery, pig farm\nn03935335\tpiggy bank, penny bank\nn03935883\tpilaster\nn03936269\tpile, spile, piling, stilt\nn03936466\tpile driver\nn03937543\tpill bottle\nn03937835\tpillbox, toque, turban\nn03937931\tpillion\nn03938037\tpillory\nn03938244\tpillow\nn03938401\tpillow block\nn03938522\tpillow lace, bobbin lace\nn03938725\tpillow sham\nn03939062\tpilot bit\nn03939178\tpilot boat\nn03939281\tpilot burner, pilot light, pilot\nn03939440\tpilot cloth\nn03939565\tpilot engine\nn03939677\tpilothouse, wheelhouse\nn03939844\tpilot light, pilot lamp, indicator lamp\nn03940256\tpin\nn03940894\tpin, flag\nn03941013\tpin, pin tumbler\nn03941231\tpinata\nn03941417\tpinball machine, pin table\nn03941586\tpince-nez\nn03941684\tpincer, pair of pincers, tweezer, pair of tweezers\nn03941887\tpinch bar\nn03942028\tpincurl clip\nn03942600\tpinfold\nn03942813\tping-pong 
ball\nn03942920\tpinhead\nn03943115\tpinion\nn03943266\tpinnacle\nn03943623\tpinprick\nn03943714\tpinstripe\nn03943833\tpinstripe\nn03943920\tpinstripe\nn03944024\tpintle\nn03944138\tpinwheel, pinwheel wind collector\nn03944341\tpinwheel\nn03945459\ttabor pipe\nn03945615\tpipe\nn03945817\tpipe bomb\nn03945928\tpipe cleaner\nn03946076\tpipe cutter\nn03946162\tpipefitting, pipe fitting\nn03947111\tpipet, pipette\nn03947343\tpipe vise, pipe clamp\nn03947466\tpipe wrench, tube wrench\nn03947798\tpique\nn03947888\tpirate, pirate ship\nn03948242\tpiste\nn03948459\tpistol, handgun, side arm, shooting iron\nn03948830\tpistol grip\nn03948950\tpiston, plunger\nn03949145\tpiston ring\nn03949317\tpiston rod\nn03949761\tpit\nn03950228\tpitcher, ewer\nn03950359\tpitchfork\nn03950537\tpitching wedge\nn03950647\tpitch pipe\nn03950899\tpith hat, pith helmet, sun helmet, topee, topi\nn03951068\tpiton\nn03951213\tPitot-static tube, Pitot head, Pitot tube\nn03951453\tPitot tube, Pitot\nn03951800\tpitsaw\nn03951971\tpivot, pin\nn03952150\tpivoting window\nn03952576\tpizzeria, pizza shop, pizza parlor\nn03953020\tplace of business, business establishment\nn03953416\tplace of worship, house of prayer, house of God, house of worship\nn03953901\tplacket\nn03954393\tplanchet, coin blank\nn03954731\tplane, carpenter's plane, woodworking plane\nn03955296\tplane, planer, planing machine\nn03955489\tplane seat\nn03955809\tplanetarium\nn03955941\tplanetarium\nn03956157\tplanetarium\nn03956331\tplanetary gear, epicyclic gear, planet wheel, planet gear\nn03956531\tplank-bed\nn03956623\tplanking\nn03956785\tplanner\nn03956922\tplant, works, industrial plant\nn03957315\tplanter\nn03957420\tplaster, adhesive plaster, sticking plaster\nn03957762\tplasterboard, gypsum board\nn03957991\tplastering trowel\nn03958227\tplastic bag\nn03958338\tplastic bomb\nn03958630\tplastic laminate\nn03958752\tplastic wrap\nn03959014\tplastron\nn03959123\tplastron\nn03959227\tplastron\nn03959701\tplate, scale, 
shell\nn03960374\tplate, collection plate\nn03960490\tplate\nn03961394\tplaten\nn03961630\tplaten\nn03961711\tplate rack\nn03961828\tplate rail\nn03961939\tplatform\nn03962525\tplatform, weapons platform\nn03962685\tplatform\nn03962852\tplatform bed\nn03962932\tplatform rocker\nn03963028\tplating, metal plating\nn03963198\tplatter\nn03963294\tplayback\nn03963483\tplaybox, play-box\nn03963645\tplayground\nn03964495\tplaypen, pen\nn03964611\tplaysuit\nn03965456\tplaza, mall, center, shopping mall, shopping center, shopping centre\nn03965907\tpleat, plait\nn03966206\tplenum\nn03966325\tplethysmograph\nn03966582\tpleximeter, plessimeter\nn03966751\tplexor, plessor, percussor\nn03966976\tpliers, pair of pliers, plyers\nn03967270\tplimsoll\nn03967396\tplotter\nn03967562\tplow, plough\nn03967942\tplug, stopper, stopple\nn03968293\tplug, male plug\nn03968479\tplug fuse\nn03968581\tplughole\nn03968728\tplumb bob, plumb, plummet\nn03969510\tplumb level\nn03970156\tplunger, plumber's helper\nn03970363\tplus fours\nn03970546\tplush\nn03971218\tplywood, plyboard\nn03971321\tpneumatic drill\nn03971960\tp-n junction\nn03972146\tp-n-p transistor\nn03972372\tpoacher\nn03972524\tpocket\nn03973003\tpocket battleship\nn03973285\tpocketcomb, pocket comb\nn03973402\tpocket flap\nn03973520\tpocket-handkerchief\nn03973628\tpocketknife, pocket knife\nn03973839\tpocket watch\nn03973945\tpod, fuel pod\nn03974070\tpogo stick\nn03974915\tpoint-and-shoot camera\nn03975035\tpointed arch\nn03975657\tpointing trowel\nn03975788\tpoint lace, needlepoint\nn03975926\tpoker, stove poker, fire hook, salamander\nn03976105\tpolarimeter, polariscope\nn03976268\tPolaroid\nn03976467\tPolaroid camera, Polaroid Land camera\nn03976657\tpole\nn03977158\tpole\nn03977266\tpoleax, poleaxe\nn03977430\tpoleax, poleaxe\nn03977592\tpolice boat\nn03977966\tpolice van, police wagon, paddy wagon, patrol wagon, wagon, black Maria\nn03978421\tpolling booth\nn03978575\tpolo ball\nn03978686\tpolo mallet, polo 
stick\nn03978815\tpolonaise\nn03978966\tpolo shirt, sport shirt\nn03979377\tpolyester\nn03979492\tpolygraph\nn03980026\tpomade, pomatum\nn03980478\tpommel horse, side horse\nn03980874\tponcho\nn03980986\tpongee\nn03981094\tponiard, bodkin\nn03981340\tpontifical\nn03981566\tpontoon\nn03981760\tpontoon bridge, bateau bridge, floating bridge\nn03981924\tpony cart, ponycart, donkey cart, tub-cart\nn03982232\tpool ball\nn03982331\tpoolroom\nn03982430\tpool table, billiard table, snooker table\nn03982642\tpoop deck\nn03982767\tpoor box, alms box, mite box\nn03982895\tpoorhouse\nn03983396\tpop bottle, soda bottle\nn03983499\tpopgun\nn03983612\tpoplin\nn03983712\tpopper\nn03983928\tpoppet, poppet valve\nn03984125\tpop tent\nn03984234\tporcelain\nn03984381\tporch\nn03984643\tporkpie, porkpie hat\nn03984759\tporringer\nn03985069\tportable\nn03985232\tportable computer\nn03985441\tportable circular saw, portable saw\nn03985881\tportcullis\nn03986071\tporte-cochere\nn03986224\tporte-cochere\nn03986355\tportfolio\nn03986562\tporthole\nn03986704\tportico\nn03986857\tportiere\nn03986949\tportmanteau, Gladstone, Gladstone bag\nn03987266\tportrait camera\nn03987376\tportrait lens\nn03987674\tpositive pole, positive magnetic pole, north-seeking pole\nn03987865\tpositive pole\nn03987990\tpositron emission tomography scanner, PET scanner\nn03988170\tpost\nn03988758\tpostage meter\nn03988926\tpost and lintel\nn03989199\tpost chaise\nn03989349\tpostern\nn03989447\tpost exchange, PX\nn03989665\tposthole digger, post-hole digger\nn03989777\tpost horn\nn03989898\tposthouse, post house\nn03990474\tpot\nn03991062\tpot, flowerpot\nn03991202\tpotbelly, potbelly stove\nn03991321\tPotemkin village\nn03991443\tpotential divider, voltage divider\nn03991646\tpotentiometer, pot\nn03991837\tpotentiometer\nn03992325\tpotpourri\nn03992436\tpotsherd\nn03992509\tpotter's wheel\nn03992703\tpottery, clayware\nn03992975\tpottle\nn03993053\tpotty seat, potty chair\nn03993180\tpouch\nn03993403\tpoultice, 
cataplasm, plaster\nn03993703\tpound, dog pound\nn03993878\tpound net\nn03994008\tpowder\nn03994297\tpowder and shot\nn03994417\tpowdered mustard, dry mustard\nn03994614\tpowder horn, powder flask\nn03994757\tpowder keg\nn03995018\tpower brake\nn03995265\tpower cord\nn03995372\tpower drill\nn03995535\tpower line, power cable\nn03995661\tpower loom\nn03995856\tpower mower, motor mower\nn03996004\tpower pack\nn03996145\tpower saw, saw, sawing machine\nn03996416\tpower shovel, excavator, digger, shovel\nn03996849\tpower steering, power-assisted steering\nn03997274\tpower takeoff, PTO\nn03997484\tpower tool\nn03997875\tpraetorium, pretorium\nn03998194\tprayer rug, prayer mat\nn03998333\tprayer shawl, tallith, tallis\nn03998673\tprecipitator, electrostatic precipitator, Cottrell precipitator\nn03999064\tprefab\nn03999160\tpresbytery\nn03999621\tpresence chamber\nn03999992\tpress, mechanical press\nn04000311\tpress, printing press\nn04000480\tpress\nn04000592\tpress box\nn04000716\tpress gallery\nn04000998\tpress of sail, press of canvas\nn04001132\tpressure cabin\nn04001265\tpressure cooker\nn04001397\tpressure dome\nn04001499\tpressure gauge, pressure gage\nn04001661\tpressurized water reactor, PWR\nn04001845\tpressure suit\nn04002262\tpricket\nn04002371\tprie-dieu\nn04002629\tprimary coil, primary winding, primary\nn04003241\tPrimus stove, Primus\nn04003359\tPrince Albert\nn04003856\tprint\nn04004099\tprint buffer\nn04004210\tprinted circuit\nn04004475\tprinter, printing machine\nn04004767\tprinter\nn04004990\tprinter cable\nn04005197\tpriory\nn04005630\tprison, prison house\nn04005912\tprison camp, internment camp, prisoner of war camp, POW camp\nn04006067\tprivateer\nn04006227\tprivate line\nn04006330\tprivet hedge\nn04006411\tprobe\nn04007415\tproctoscope\nn04007664\tprod, goad\nn04008385\tproduction line, assembly line, line\nn04008634\tprojectile, missile\nn04009552\tprojector\nn04009801\tprojector\nn04009923\tprolonge\nn04010057\tprolonge knot, sailor's 
breastplate\nn04010779\tprompter, autocue\nn04010927\tprong\nn04011827\tpropeller, propellor\nn04012084\tpropeller plane\nn04012482\tpropjet, turboprop, turbo-propeller plane\nn04012665\tproportional counter tube, proportional counter\nn04013060\tpropulsion system\nn04013176\tproscenium, proscenium wall\nn04013600\tproscenium arch\nn04013729\tprosthesis, prosthetic device\nn04014297\tprotective covering, protective cover, protection\nn04015204\tprotective garment\nn04015786\tproton accelerator\nn04015908\tprotractor\nn04016240\tpruner, pruning hook, lopper\nn04016479\tpruning knife\nn04016576\tpruning saw\nn04016684\tpruning shears\nn04016846\tpsaltery\nn04017571\tpsychrometer\nn04017807\tPT boat, mosquito boat, mosquito craft, motor torpedo boat\nn04018155\tpublic address system, P.A. system, PA system, P.A., PA\nn04018399\tpublic house, pub, saloon, pothouse, gin mill, taphouse\nn04018667\tpublic toilet, comfort station, public convenience, convenience, public lavatory, restroom, toilet facility, wash room\nn04019101\tpublic transport\nn04019335\tpublic works\nn04019541\tpuck, hockey puck\nn04019696\tpull\nn04019881\tpullback, tieback\nn04020087\tpull chain\nn04020298\tpulley, pulley-block, pulley block, block\nn04020744\tpull-off, rest area, rest stop, layby, lay-by\nn04020912\tPullman, Pullman car\nn04021028\tpullover, slipover\nn04021164\tpull-through\nn04021362\tpulse counter\nn04021503\tpulse generator\nn04021704\tpulse timing circuit\nn04021798\tpump\nn04022332\tpump\nn04022434\tpump action, slide action\nn04022708\tpump house, pumping station\nn04022866\tpump room\nn04023021\tpump-type pliers\nn04023119\tpump well\nn04023249\tpunch, puncher\nn04023422\tpunchboard\nn04023695\tpunch bowl\nn04023962\tpunching bag, punch bag, punching ball, punchball\nn04024137\tpunch pliers\nn04024274\tpunch press\nn04024862\tpunnet\nn04024983\tpunt\nn04025508\tpup tent, shelter tent\nn04025633\tpurdah\nn04026053\tpurifier\nn04026180\tpurl, purl 
stitch\nn04026417\tpurse\nn04026813\tpush-bike\nn04026918\tpush broom\nn04027023\tpush button, push, button\nn04027367\tpush-button radio\nn04027706\tpusher, zori\nn04027820\tput-put\nn04027935\tputtee\nn04028074\tputter, putting iron\nn04028221\tputty knife\nn04028315\tpuzzle\nn04028581\tpylon, power pylon\nn04028764\tpylon\nn04029416\tpyramidal tent\nn04029647\tpyrograph\nn04029734\tpyrometer\nn04029913\tpyrometric cone\nn04030054\tpyrostat\nn04030161\tpyx, pix\nn04030274\tpyx, pix, pyx chest, pix chest\nn04030414\tpyxis\nn04030518\tquad, quadrangle\nn04030846\tquadrant\nn04030965\tquadraphony, quadraphonic system, quadriphonic system\nn04031884\tquartering\nn04032509\tquarterstaff\nn04032603\tquartz battery, quartz mill\nn04032936\tquartz lamp\nn04033287\tqueen\nn04033425\tqueen\nn04033557\tqueen post\nn04033801\tquern\nn04033901\tquill, quill pen\nn04033995\tquilt, comforter, comfort, puff\nn04034262\tquilted bedspread\nn04034367\tquilting\nn04035231\tquipu\nn04035634\tquirk molding, quirk moulding\nn04035748\tquirt\nn04035836\tquiver\nn04035912\tquoin, coign, coigne\nn04036155\tquoit\nn04036303\tQWERTY keyboard\nn04036776\trabbet, rebate\nn04036963\trabbet joint\nn04037076\trabbit ears\nn04037220\trabbit hutch\nn04037298\traceabout\nn04037443\tracer, race car, racing car\nn04037873\traceway, race\nn04037964\tracing boat\nn04038231\tracing gig\nn04038338\tracing skiff, single shell\nn04038440\track, stand\nn04038727\track\nn04039041\track, wheel\nn04039209\track and pinion\nn04039381\tracket, racquet\nn04039742\tracquetball\nn04039848\tradar, microwave radar, radio detection and ranging, radiolocation\nn04040247\tradial, radial tire, radial-ply tire\nn04040373\tradial engine, rotary engine\nn04040540\tradiation pyrometer\nn04040759\tradiator\nn04041069\tradiator\nn04041243\tradiator cap\nn04041408\tradiator hose\nn04041544\tradio, wireless\nn04041747\tradio antenna, radio aerial\nn04042076\tradio chassis\nn04042204\tradio compass\nn04042358\tradiogram, 
radiograph, shadowgraph, skiagraph, skiagram\nn04042632\tradio interferometer\nn04042795\tradio link, link\nn04042985\tradiometer\nn04043168\tradiomicrometer\nn04043411\tradio-phonograph, radio-gramophone\nn04043733\tradio receiver, receiving set, radio set, radio, tuner, wireless\nn04044307\tradiotelegraph, radiotelegraphy, wireless telegraph, wireless telegraphy\nn04044498\tradiotelephone, radiophone, wireless telephone\nn04044716\tradio telescope, radio reflector\nn04044955\tradiotherapy equipment\nn04045085\tradio transmitter\nn04045255\tradome, radar dome\nn04045397\traft\nn04045644\trafter, balk, baulk\nn04045787\traft foundation\nn04045941\trag, shred, tag, tag end, tatter\nn04046091\tragbag\nn04046277\traglan\nn04046400\traglan sleeve\nn04046590\trail\nn04046974\trail fence\nn04047139\trailhead\nn04047401\trailing, rail\nn04047733\trailing\nn04047834\trailroad bed\nn04048441\trailroad tunnel\nn04049303\train barrel\nn04049405\traincoat, waterproof\nn04049585\train gauge, rain gage, pluviometer, udometer\nn04049753\train stick\nn04050066\trake\nn04050313\trake handle\nn04050600\tRAM disk\nn04050933\tramekin, ramequin\nn04051269\tramjet, ramjet engine, atherodyde, athodyd, flying drainpipe\nn04051439\trammer\nn04051549\tramp, incline\nn04051705\trampant arch\nn04051825\trampart, bulwark, wall\nn04052235\tramrod\nn04052346\tramrod\nn04052442\tranch, spread, cattle ranch, cattle farm\nn04052658\tranch house\nn04052757\trandom-access memory, random access memory, random memory, RAM, read/write memory\nn04053508\trangefinder, range finder\nn04053677\trange hood\nn04053767\trange pole, ranging pole, flagpole\nn04054361\trapier, tuck\nn04054566\trariora\nn04054670\trasp, wood file\nn04055180\tratchet, rachet, ratch\nn04055447\tratchet wheel\nn04055700\trathskeller\nn04055861\tratline, ratlin\nn04056073\trat-tail file\nn04056180\trattan, ratan\nn04056413\trattrap\nn04056932\trayon\nn04057047\trazor\nn04057215\trazorblade\nn04057435\treaction-propulsion engine, 
reaction engine\nn04057673\treaction turbine\nn04057846\treactor\nn04057981\treading lamp\nn04058096\treading room\nn04058239\tread-only memory, ROM, read-only storage, fixed storage\nn04058486\tread-only memory chip\nn04058594\treadout, read-out\nn04058721\tread/write head, head\nn04059157\tready-to-wear\nn04059298\treal storage\nn04059399\treamer\nn04059516\treamer, juicer, juice reamer\nn04059947\trearview mirror\nn04060198\tReaumur thermometer\nn04060448\trebozo\nn04060647\treceiver, receiving system\nn04060904\treceptacle\nn04061681\treception desk\nn04061793\treception room\nn04061969\trecess, niche\nn04062179\treciprocating engine\nn04062428\trecliner, reclining chair, lounger\nn04062644\treconnaissance plane\nn04062807\treconnaissance vehicle, scout car\nn04063154\trecord changer, auto-changer, changer\nn04063373\trecorder, recording equipment, recording machine\nn04063868\trecording\nn04064213\trecording system\nn04064401\trecord player, phonograph\nn04064747\trecord sleeve, record cover\nn04064862\trecovery room\nn04065272\trecreational vehicle, RV, R.V.\nn04065464\trecreation room, rec room\nn04065789\trecycling bin\nn04065909\trecycling plant\nn04066023\tredbrick university\nn04066270\tred carpet\nn04066388\tredoubt\nn04066476\tredoubt\nn04066767\treduction gear\nn04067143\treed pipe\nn04067231\treed stop\nn04067353\treef knot, flat knot\nn04067472\treel\nn04067658\treel\nn04067818\trefectory\nn04067921\trefectory table\nn04068441\trefinery\nn04068601\treflecting telescope, reflector\nn04069166\treflectometer\nn04069276\treflector\nn04069434\treflex camera\nn04069582\treflux condenser\nn04069777\treformatory, reform school, training school\nn04070003\treformer\nn04070207\trefracting telescope\nn04070415\trefractometer\nn04070545\trefrigeration system\nn04070727\trefrigerator, icebox\nn04070964\trefrigerator car\nn04071102\trefuge, sanctuary, asylum\nn04071263\tregalia\nn04071393\tregimentals\nn04072193\tregulator\nn04072551\trein\nn04072960\trelay, 
electrical relay\nn04073425\trelease, button\nn04073948\treligious residence, cloister\nn04074185\treliquary\nn04074963\tremote control, remote\nn04075291\tremote terminal, link-attached terminal, remote station, link-attached station\nn04075468\tremovable disk\nn04075715\trendering\nn04075813\trep, repp\nn04075916\trepair shop, fix-it shop\nn04076052\trepeater\nn04076284\trepeating firearm, repeater\nn04076713\trepository, monument\nn04077430\treproducer\nn04077594\trerebrace, upper cannon\nn04077734\trescue equipment\nn04077889\tresearch center, research facility\nn04078002\treseau\nn04078574\treservoir\nn04078955\treset\nn04079106\treset button\nn04079244\tresidence\nn04079603\tresistance pyrometer\nn04079933\tresistor, resistance\nn04080138\tresonator\nn04080454\tresonator, cavity resonator, resonating chamber\nn04080705\tresort hotel, spa\nn04080833\trespirator, inhalator\nn04081281\trestaurant, eating house, eating place, eatery\nn04081699\trest house\nn04081844\trestraint, constraint\nn04082344\tresuscitator\nn04082562\tretainer\nn04082710\tretaining wall\nn04082886\treticle, reticule, graticule\nn04083113\treticulation\nn04083309\treticule\nn04083649\tretort\nn04083800\tretractor\nn04084517\treturn key, return\nn04084682\treverberatory furnace\nn04084889\trevers, revere\nn04085017\treverse, reverse gear\nn04085574\treversible\nn04085873\trevetment, revetement, stone facing\nn04086066\trevetment\nn04086273\trevolver, six-gun, six-shooter\nn04086446\trevolving door, revolver\nn04086663\trheometer\nn04086794\trheostat, variable resistor\nn04086937\trhinoscope\nn04087126\trib\nn04087432\triband, ribband\nn04087709\tribbed vault\nn04087826\tribbing\nn04088229\tribbon development\nn04088343\trib joint pliers\nn04088441\tricer\nn04088696\triddle\nn04088797\tride\nn04089152\tridge, ridgepole, rooftree\nn04089376\tridge rope\nn04089666\triding boot\nn04089836\triding crop, hunting crop\nn04089976\triding mower\nn04090263\trifle\nn04090548\trifle 
ball\nn04090781\trifle grenade\nn04091097\trig\nn04091466\trigger, rigger brush\nn04091584\trigger\nn04091693\trigging, tackle\nn04092168\trigout\nn04093157\tringlet\nn04093223\trings\nn04093625\trink, skating rink\nn04093775\triot gun\nn04093915\tripcord\nn04094060\tripcord\nn04094250\tripping bar\nn04094438\tripping chisel\nn04094608\tripsaw, splitsaw\nn04094720\triser\nn04094859\triser, riser pipe, riser pipeline, riser main\nn04095109\tRitz\nn04095210\triver boat\nn04095342\trivet\nn04095577\triveting machine, riveter, rivetter\nn04095938\troach clip, roach holder\nn04096066\troad, route\nn04096733\troadbed\nn04096848\troadblock, barricade\nn04097085\troadhouse\nn04097373\troadster, runabout, two-seater\nn04097622\troadway\nn04097760\troaster\nn04097866\trobe\nn04098169\trobotics equipment\nn04098260\tRochon prism, Wollaston prism\nn04098399\trock bit, roller bit\nn04098513\trocker\nn04098795\trocker, cradle\nn04099003\trocker arm, valve rocker\nn04099175\trocket, rocket engine\nn04099429\trocket, projectile\nn04099969\trocking chair, rocker\nn04100174\trod\nn04100519\trodeo\nn04101375\troll\nn04101497\troller\nn04101701\troller\nn04101860\troller bandage\nn04102037\tin-line skate\nn04102162\tRollerblade\nn04102285\troller blind\nn04102406\troller coaster, big dipper, chute-the-chute\nn04102618\troller skate\nn04102760\troller towel\nn04102872\troll film\nn04102962\trolling hitch\nn04103094\trolling mill\nn04103206\trolling pin\nn04103364\trolling stock\nn04103665\troll-on\nn04103769\troll-on\nn04103918\troll-on roll-off\nn04104147\tRolodex\nn04104384\tRoman arch, semicircular arch\nn04104500\tRoman building\nn04104770\tromper, romper suit\nn04104925\trood screen\nn04105068\troof\nn04105438\troof\nn04105704\troofing\nn04105893\troom\nn04107598\troomette\nn04107743\troom light\nn04107984\troost\nn04108268\trope\nn04108822\trope bridge\nn04108999\trope tow\nn04110068\trose water\nn04110178\trose window, rosette\nn04110281\trosin bag\nn04110439\trotary actuator, 
positioner\nn04110654\trotary engine\nn04110841\trotary press\nn04110955\trotating mechanism\nn04111190\trotating shaft, shaft\nn04111414\trotisserie\nn04111531\trotisserie\nn04111668\trotor\nn04111962\trotor, rotor coil\nn04112147\trotor\nn04112252\trotor blade, rotary wing\nn04112430\trotor head, rotor shaft\nn04112579\trotunda\nn04112654\trotunda\nn04112752\trouge, paint, blusher\nn04112921\troughcast\nn04113038\trouleau\nn04113194\troulette, toothed wheel\nn04113316\troulette ball\nn04113406\troulette wheel, wheel\nn04113641\tround, unit of ammunition, one shot\nn04113765\tround arch\nn04113968\tround-bottom flask\nn04114069\troundel\nn04114301\tround file\nn04114428\troundhouse\nn04114719\trouter\nn04114844\trouter\nn04114996\trouter plane\nn04115144\trowel\nn04115256\trow house, town house\nn04115456\trowing boat\nn04115542\trowlock arch\nn04115802\troyal\nn04115996\troyal mast\nn04116098\trubber band, elastic band, elastic\nn04116294\trubber boot, gum boot\nn04116389\trubber bullet\nn04116512\trubber eraser, rubber, pencil eraser\nn04117216\trudder\nn04117464\trudder\nn04117639\trudder blade\nn04118021\trug, carpet, carpeting\nn04118538\trugby ball\nn04118635\truin\nn04118776\trule, ruler\nn04119091\trumble\nn04119230\trumble seat\nn04119360\trummer\nn04119478\trumpus room, playroom, game room\nn04119630\truncible spoon\nn04119751\trundle, spoke, rung\nn04120489\trunning shoe\nn04120695\trunning suit\nn04120842\trunway\nn04121228\trushlight, rush candle\nn04121342\trusset\nn04121426\trya, rya rug\nn04121511\tsaber, sabre\nn04121728\tsaber saw, jigsaw, reciprocating saw\nn04122262\tsable\nn04122349\tsable, sable brush, sable's hair pencil\nn04122492\tsable coat\nn04122578\tsabot, wooden shoe\nn04122685\tsachet\nn04122825\tsack, poke, paper bag, carrier bag\nn04123026\tsack, sacque\nn04123123\tsackbut\nn04123228\tsackcloth\nn04123317\tsackcloth\nn04123448\tsack coat\nn04123567\tsacking, bagging\nn04123740\tsaddle\nn04124098\tsaddlebag\nn04124202\tsaddle 
blanket, saddlecloth, horse blanket\nn04124370\tsaddle oxford, saddle shoe\nn04124488\tsaddlery\nn04124573\tsaddle seat\nn04124887\tsaddle stitch\nn04125021\tsafe\nn04125116\tsafe\nn04125257\tsafe-deposit, safe-deposit box, safety-deposit, safety deposit box, deposit box, lockbox\nn04125541\tsafe house\nn04125692\tsafety arch\nn04125853\tsafety belt, life belt, safety harness\nn04126066\tsafety bicycle, safety bike\nn04126244\tsafety bolt, safety lock\nn04126541\tsafety curtain\nn04126659\tsafety fuse\nn04126852\tsafety lamp, Davy lamp\nn04126980\tsafety match, book matches\nn04127117\tsafety net\nn04127249\tsafety pin\nn04127395\tsafety rail, guardrail\nn04127521\tsafety razor\nn04127633\tsafety valve, relief valve, escape valve, escape cock, escape\nn04127904\tsail, canvas, canvass, sheet\nn04128413\tsail\nn04128499\tsailboat, sailing boat\nn04128710\tsailcloth\nn04128837\tsailing vessel, sailing ship\nn04129490\tsailing warship\nn04129688\tsailor cap\nn04129766\tsailor suit\nn04130143\tsalad bar\nn04130257\tsalad bowl\nn04130566\tsalinometer\nn04130907\tsallet, salade\nn04131015\tsalon\nn04131113\tsalon\nn04131208\tsalon, beauty salon, beauty parlor, beauty parlour, beauty shop\nn04131368\tsaltbox\nn04131499\tsaltcellar\nn04131690\tsaltshaker, salt shaker\nn04131811\tsaltworks\nn04131929\tsalver\nn04132158\tsalwar, shalwar\nn04132465\tSam Browne belt\nn04132603\tsamisen, shamisen\nn04132829\tsamite\nn04132985\tsamovar\nn04133114\tsampan\nn04133789\tsandal\nn04134008\tsandbag\nn04134170\tsandblaster\nn04134523\tsandbox\nn04134632\tsandglass\nn04135024\tsand wedge\nn04135118\tsandwich board\nn04135315\tsanitary napkin, sanitary towel, Kotex\nn04135710\tcling film, clingfilm, Saran Wrap\nn04135933\tsarcenet, sarsenet\nn04136045\tsarcophagus\nn04136161\tsari, saree\nn04136333\tsarong\nn04136510\tsash, window sash\nn04136800\tsash fastener, sash lock, window lock\nn04137089\tsash window\nn04137217\tsatchel\nn04137355\tsateen\nn04137444\tsatellite, artificial 
satellite, orbiter\nn04137773\tsatellite receiver\nn04137897\tsatellite television, satellite TV\nn04138131\tsatellite transmitter\nn04138261\tsatin\nn04138869\tSaturday night special\nn04138977\tsaucepan\nn04139140\tsaucepot\nn04139395\tsauna, sweat room\nn04139859\tsavings bank, coin bank, money box, bank\nn04140064\tsaw\nn04140539\tsawed-off shotgun\nn04140631\tsawhorse, horse, sawbuck, buck\nn04140777\tsawmill\nn04140853\tsaw set\nn04141076\tsax, saxophone\nn04141198\tsaxhorn\nn04141327\tscabbard\nn04141712\tscaffolding, staging\nn04141838\tscale\nn04141975\tscale, weighing machine\nn04142175\tscaler\nn04142327\tscaling ladder\nn04142434\tscalpel\nn04142731\tscanner, electronic scanner\nn04142999\tscanner\nn04143140\tscanner, digital scanner, image scanner\nn04143365\tscantling, stud\nn04143897\tscarf\nn04144241\tscarf joint, scarf\nn04144539\tscatter rug, throw rug\nn04144651\tscauper, scorper\nn04145863\tSchmidt telescope, Schmidt camera\nn04146050\tschool, schoolhouse\nn04146343\tschoolbag\nn04146504\tschool bell\nn04146614\tschool bus\nn04146862\tschool ship, training ship\nn04146976\tschool system\nn04147183\tschooner\nn04147291\tschooner\nn04147495\tscientific instrument\nn04147793\tscimitar\nn04147916\tscintillation counter\nn04148054\tscissors, pair of scissors\nn04148285\tsclerometer\nn04148464\tscoinson arch, sconcheon arch\nn04148579\tsconce\nn04148703\tsconce\nn04149083\tscoop\nn04149374\tscooter\nn04149813\tscoreboard\nn04150153\tscouring pad\nn04150273\tscow\nn04150371\tscow\nn04150980\tscraper\nn04151108\tscratcher\nn04151581\tscreen\nn04151940\tscreen, cover, covert, concealment\nn04152387\tscreen\nn04152593\tscreen, CRT screen\nn04153025\tscreen door, screen\nn04153330\tscreening\nn04153751\tscrew\nn04154152\tscrew, screw propeller\nn04154340\tscrew\nn04154565\tscrewdriver\nn04154753\tscrew eye\nn04154854\tscrew key\nn04154938\tscrew thread, thread\nn04155068\tscrewtop\nn04155177\tscrew wrench\nn04155457\tscriber, scribe, scratch 
awl\nn04155625\tscrim\nn04155735\tscrimshaw\nn04155889\tscriptorium\nn04156040\tscrubber\nn04156140\tscrub brush, scrubbing brush, scrubber\nn04156297\tscrub plane\nn04156411\tscuffer\nn04156591\tscuffle, scuffle hoe, Dutch hoe\nn04156814\tscull\nn04156946\tscull\nn04157099\tscullery\nn04157320\tsculpture\nn04158002\tscuttle, coal scuttle\nn04158138\tscyphus\nn04158250\tscythe\nn04158672\tseabag\nn04158807\tsea boat\nn04158956\tsea chest\nn04160036\tsealing wax, seal\nn04160261\tsealskin\nn04160372\tseam\nn04160586\tseaplane, hydroplane\nn04160847\tsearchlight\nn04161010\tsearing iron\nn04161358\tseat\nn04161981\tseat\nn04162433\tseat\nn04162706\tseat belt, seatbelt\nn04163530\tsecateurs\nn04164002\tsecondary coil, secondary winding, secondary\nn04164199\tsecond balcony, family circle, upper balcony, peanut gallery\nn04164406\tsecond base\nn04164757\tsecond hand\nn04164868\tsecretary, writing table, escritoire, secretaire\nn04165409\tsectional\nn04165675\tsecurity blanket\nn04165945\tsecurity system, security measure, security\nn04166111\tsecurity system\nn04166281\tsedan, saloon\nn04166436\tsedan, sedan chair\nn04167346\tseeder\nn04167489\tseeker\nn04167661\tseersucker\nn04168084\tsegmental arch\nn04168199\tSegway, Segway Human Transporter, Segway HT\nn04168472\tseidel\nn04168541\tseine\nn04168840\tseismograph\nn04169437\tselector, selector switch\nn04169597\tselenium cell\nn04170037\tself-propelled vehicle\nn04170384\tself-registering thermometer\nn04170515\tself-starter\nn04170694\tselsyn, synchro\nn04170933\tselvage, selvedge\nn04171208\tsemaphore\nn04171459\tsemiautomatic firearm\nn04171629\tsemiautomatic pistol, semiautomatic\nn04171831\tsemiconductor device, semiconductor unit, semiconductor\nn04172107\tsemi-detached house\nn04172230\tsemigloss\nn04172342\tsemitrailer, semi\nn04172512\tsennit\nn04172607\tsensitometer\nn04172776\tsentry box\nn04172904\tseparate\nn04173046\tseptic tank\nn04173172\tsequence, episode\nn04173511\tsequencer, 
sequenator\nn04173907\tserape, sarape\nn04174026\tserge\nn04174101\tserger\nn04174234\tserial port\nn04174500\tserpent\nn04174705\tserration\nn04175039\tserver\nn04175147\tserver, host\nn04175574\tservice club\nn04176068\tserving cart\nn04176190\tserving dish\nn04176295\tservo, servomechanism, servosystem\nn04176528\tset\nn04177041\tset gun, spring gun\nn04177329\tsetscrew\nn04177545\tsetscrew\nn04177654\tset square\nn04177755\tsettee\nn04177820\tsettle, settee\nn04177931\tsettlement house\nn04178190\tseventy-eight, 78\nn04178329\tSeven Wonders of the Ancient World, Seven Wonders of the World\nn04178668\tsewage disposal plant, disposal plant\nn04179126\tsewer, sewerage, cloaca\nn04179712\tsewing basket\nn04179824\tsewing kit\nn04179913\tsewing machine\nn04180063\tsewing needle\nn04180229\tsewing room\nn04180888\tsextant\nn04181083\tsgraffito\nn04181228\tshackle, bond, hamper, trammel\nn04181561\tshackle\nn04181718\tshade\nn04182152\tshadow box\nn04182322\tshaft\nn04183217\tshag rug\nn04183329\tshaker\nn04183957\tshank\nn04184095\tshank, stem\nn04184316\tshantung\nn04184435\tshaper, shaping machine\nn04184600\tshaping tool\nn04184880\tsharkskin\nn04185071\tsharpener\nn04185529\tSharpie\nn04185804\tshaver, electric shaver, electric razor\nn04185946\tshaving brush\nn04186051\tshaving cream, shaving soap\nn04186268\tshaving foam\nn04186455\tshawl\nn04186624\tshawm\nn04186848\tshears\nn04187061\tsheath\nn04187233\tsheathing, overlay, overlayer\nn04187547\tshed\nn04187751\tsheep bell\nn04187885\tsheepshank\nn04187970\tsheepskin coat, afghan\nn04188064\tsheepwalk, sheeprun\nn04188179\tsheet, bed sheet\nn04189092\tsheet bend, becket bend, weaver's knot, weaver's hitch\nn04189282\tsheeting\nn04189651\tsheet pile, sheath pile, sheet piling\nn04189816\tSheetrock\nn04190052\tshelf\nn04190376\tshelf bracket\nn04190464\tshell\nn04190747\tshell, case, casing\nn04190997\tshell, racing shell\nn04191150\tshellac, shellac 
varnish\nn04191595\tshelter\nn04191943\tshelter\nn04192238\tshelter\nn04192361\tsheltered workshop\nn04192521\tSheraton\nn04192698\tshield, buckler\nn04192858\tshield\nn04193179\tshielding\nn04193377\tshift key, shift\nn04193742\tshillelagh, shillalah\nn04193883\tshim\nn04194009\tshingle\nn04194127\tshin guard, shinpad\nn04194289\tship\nn04196080\tshipboard system\nn04196502\tshipping, cargo ships, merchant marine, merchant vessels\nn04196803\tshipping room\nn04196925\tship-towed long-range acoustic detection system\nn04197110\tshipwreck\nn04197391\tshirt\nn04197781\tshirt button\nn04197878\tshirtdress\nn04198015\tshirtfront\nn04198233\tshirting\nn04198355\tshirtsleeve\nn04198453\tshirttail\nn04198562\tshirtwaist, shirtwaister\nn04198722\tshiv\nn04198797\tshock absorber, shock, cushion\nn04199027\tshoe\nn04200000\tshoe\nn04200258\tshoebox\nn04200537\tshoehorn\nn04200800\tshoe shop, shoe-shop, shoe store\nn04200908\tshoetree\nn04201064\tshofar, shophar\nn04201297\tshoji\nn04201733\tshooting brake\nn04202142\tshooting lodge, shooting box\nn04202282\tshooting stick\nn04202417\tshop, store\nn04203356\tshop bell\nn04204081\tshopping bag\nn04204238\tshopping basket\nn04204347\tshopping cart\nn04204755\tshort circuit, short\nn04205062\tshort iron\nn04205318\tshort pants, shorts, trunks\nn04205505\tshort sleeve\nn04205613\tshortwave diathermy machine\nn04206070\tshot\nn04206225\tshot glass, jigger, pony\nn04206356\tshotgun, scattergun\nn04206570\tshotgun shell\nn04206790\tshot tower\nn04207151\tshoulder\nn04207343\tshoulder bag\nn04207596\tshouldered arch\nn04207763\tshoulder holster\nn04207903\tshoulder pad\nn04208065\tshoulder patch\nn04208210\tshovel\nn04208427\tshovel\nn04208582\tshovel hat\nn04208760\tshowboat\nn04208936\tshower\nn04209133\tshower cap\nn04209239\tshower curtain\nn04209509\tshower room\nn04209613\tshower stall, shower bath\nn04209811\tshowroom, salesroom, 
saleroom\nn04210012\tshrapnel\nn04210120\tshredder\nn04210288\tshrimper\nn04210390\tshrine\nn04210591\tshrink-wrap\nn04210858\tshunt\nn04211001\tshunt, electrical shunt, bypass\nn04211219\tshunter\nn04211356\tshutter\nn04211528\tshutter\nn04211857\tshuttle\nn04211970\tshuttle\nn04212165\tshuttle bus\nn04212282\tshuttlecock, bird, birdie, shuttle\nn04212467\tshuttle helicopter\nn04212810\tSibley tent\nn04213105\tsickbay, sick berth\nn04213264\tsickbed\nn04213353\tsickle, reaping hook, reap hook\nn04213530\tsickroom\nn04214046\tsideboard\nn04214282\tsidecar\nn04214413\tside chapel\nn04214649\tsidelight, running light\nn04215153\tsidesaddle\nn04215402\tsidewalk, pavement\nn04215588\tsidewall\nn04215800\tside-wheeler\nn04215910\tsidewinder\nn04216634\tsieve, screen\nn04216860\tsifter\nn04216963\tsights\nn04217387\tsigmoidoscope, flexible sigmoidoscope\nn04217546\tsignal box, signal tower\nn04217718\tsignaling device\nn04217882\tsignboard, sign\nn04218564\tsilencer, muffler\nn04218921\tsilent butler\nn04219185\tSilex\nn04219424\tsilk\nn04219580\tsilks\nn04220250\tsilo\nn04220805\tsilver plate\nn04221076\tsilverpoint\nn04221673\tsimple pendulum\nn04221823\tsimulator\nn04222210\tsingle bed\nn04222307\tsingle-breasted jacket\nn04222470\tsingle-breasted suit\nn04222723\tsingle prop, single-propeller plane\nn04222847\tsingle-reed instrument, single-reed woodwind\nn04223066\tsingle-rotor helicopter\nn04223170\tsinglestick, fencing stick, backsword\nn04223299\tsinglet, vest, undershirt\nn04224395\tsiren\nn04224543\tsister ship\nn04224842\tsitar\nn04225031\tsitz bath, hip bath\nn04225222\tsix-pack, six pack, sixpack\nn04225729\tskate\nn04225987\tskateboard\nn04226322\tskeg\nn04226464\tskein\nn04226537\tskeleton, skeletal frame, frame, underframe\nn04226826\tskeleton key\nn04226962\tskep\nn04227050\tskep\nn04227144\tsketch, study\nn04227519\tsketcher\nn04227787\tskew arch\nn04227900\tskewer\nn04228054\tski\nn04228215\tski binding, binding\nn04228422\tskibob\nn04228581\tski 
boot\nn04228693\tski cap, stocking cap, toboggan cap\nn04229007\tskidder\nn04229107\tskid lid\nn04229480\tskiff\nn04229620\tski jump\nn04229737\tski lodge\nn04229816\tski mask\nn04229959\tskimmer\nn04230387\tski parka, ski jacket\nn04230487\tski-plane\nn04230603\tski pole\nn04230707\tski rack\nn04230808\tskirt\nn04231272\tskirt\nn04231693\tski tow, ski lift, lift\nn04231905\tSkivvies\nn04232153\tskullcap\nn04232312\tskybox\nn04232437\tskyhook\nn04232800\tskylight, fanlight\nn04233027\tskysail\nn04233124\tskyscraper\nn04233295\tskywalk\nn04233715\tslacks\nn04233832\tslack suit\nn04234160\tslasher\nn04234260\tslash pocket\nn04234455\tslat, spline\nn04234670\tslate\nn04234763\tslate pencil\nn04234887\tslate roof\nn04235291\tsled, sledge, sleigh\nn04235646\tsleeper\nn04235771\tsleeper\nn04235860\tsleeping bag\nn04236001\tsleeping car, sleeper, wagon-lit\nn04236377\tsleeve, arm\nn04236702\tsleeve\nn04236809\tsleigh bed\nn04236935\tsleigh bell, cascabel\nn04237174\tslice bar\nn04237287\tslicer\nn04237423\tslicer\nn04238128\tslide, playground slide, sliding board\nn04238321\tslide fastener, zip, zipper, zip fastener\nn04238617\tslide projector\nn04238763\tslide rule, slipstick\nn04238953\tslide valve\nn04239074\tsliding door\nn04239218\tsliding seat\nn04239333\tsliding window\nn04239436\tsling, scarf bandage, triangular bandage\nn04239639\tsling\nn04239786\tslingback, sling\nn04239900\tslinger ring\nn04240434\tslip clutch, slip friction clutch\nn04240752\tslipcover\nn04240867\tslip-joint pliers\nn04241042\tslipknot\nn04241249\tslip-on\nn04241394\tslipper, carpet slipper\nn04241573\tslip ring\nn04242084\tslit lamp\nn04242315\tslit trench\nn04242408\tsloop\nn04242587\tsloop of war\nn04242704\tslop basin, slop bowl\nn04243003\tslop pail, slop jar\nn04243142\tslops\nn04243251\tslopshop, slopseller's shop\nn04243546\tslot, one-armed bandit\nn04243941\tslot machine, coin machine\nn04244379\tsluice, sluiceway, penstock\nn04244847\tsmack\nn04244997\tsmall boat\nn04245218\tsmall 
computer system interface, SCSI\nn04245412\tsmall ship\nn04245508\tsmall stores\nn04245847\tsmart bomb\nn04246060\tsmelling bottle\nn04246271\tsmocking\nn04246459\tsmoke bomb, smoke grenade\nn04246731\tsmokehouse, meat house\nn04246855\tsmoker, smoking car, smoking carriage, smoking compartment\nn04247011\tsmoke screen, smokescreen\nn04247440\tsmoking room\nn04247544\tsmoothbore\nn04247630\tsmooth plane, smoothing plane\nn04247736\tsnack bar, snack counter, buffet\nn04247876\tsnaffle, snaffle bit\nn04248209\tsnap, snap fastener, press stud\nn04248396\tsnap brim\nn04248507\tsnap-brim hat\nn04248851\tsnare, gin, noose\nn04249415\tsnare drum, snare, side drum\nn04249582\tsnatch block\nn04249882\tsnifter, brandy snifter, brandy glass\nn04250224\tsniper rifle, precision rifle\nn04250473\tsnips, tinsnips\nn04250599\tSno-cat\nn04250692\tsnood\nn04250850\tsnorkel, schnorkel, schnorchel, snorkel breather, breather\nn04251144\tsnorkel\nn04251701\tsnowbank, snow bank\nn04251791\tsnowboard\nn04252077\tsnowmobile\nn04252225\tsnowplow, snowplough\nn04252331\tsnowshoe\nn04252560\tsnowsuit\nn04252653\tsnow thrower, snow blower\nn04253057\tsnuffbox\nn04253168\tsnuffer\nn04253304\tsnuffers\nn04253931\tsoapbox\nn04254009\tsoap dish\nn04254120\tsoap dispenser\nn04254450\tsoap pad\nn04254680\tsoccer ball\nn04254777\tsock\nn04255163\tsocket\nn04255346\tsocket wrench\nn04255499\tsocle\nn04255586\tsoda can\nn04255670\tsoda fountain\nn04255768\tsoda fountain\nn04255899\tsod house, soddy, adobe house\nn04256318\tsodium-vapor lamp, sodium-vapour lamp\nn04256520\tsofa, couch, lounge\nn04256758\tsoffit\nn04256891\tsoftball, playground ball\nn04257223\tsoft pedal\nn04257684\tsoil pipe\nn04257790\tsolar array, solar battery, solar panel\nn04257986\tsolar cell, photovoltaic cell\nn04258138\tsolar dish, solar collector, solar furnace\nn04258333\tsolar heater\nn04258438\tsolar house\nn04258618\tsolar telescope\nn04258732\tsolar thermal system\nn04258859\tsoldering 
iron\nn04259202\tsolenoid\nn04259468\tsolleret, sabaton\nn04259630\tsombrero\nn04260192\tsonic depth finder, fathometer\nn04260364\tsonogram, echogram\nn04260589\tsonograph\nn04261116\tsorter\nn04261281\tsouk\nn04261369\tsound bow\nn04261506\tsoundbox, body\nn04261638\tsound camera\nn04261767\tsounder\nn04261868\tsound film\nn04262161\tsounding board, soundboard\nn04262530\tsounding rocket\nn04262678\tsound recording, audio recording, audio\nn04262869\tsound spectrograph\nn04263257\tsoup bowl\nn04263336\tsoup ladle\nn04263502\tsoupspoon, soup spoon\nn04263760\tsource of illumination\nn04263950\tsourdine\nn04264134\tsoutache\nn04264233\tsoutane\nn04264361\tsou'wester\nn04264485\tsoybean future\nn04264628\tspace bar\nn04264765\tspace capsule, capsule\nn04264914\tspacecraft, ballistic capsule, space vehicle\nn04265275\tspace heater\nn04265428\tspace helmet\nn04265904\tspace rocket\nn04266014\tspace shuttle\nn04266162\tspace station, space platform, space laboratory\nn04266375\tspacesuit\nn04266486\tspade\nn04266849\tspade bit\nn04266968\tspaghetti junction\nn04267091\tSpandau\nn04267165\tspandex\nn04267246\tspandrel, spandril\nn04267435\tspanker\nn04267577\tspar\nn04267985\tsparge pipe\nn04268142\tspark arrester, sparker\nn04268275\tspark arrester\nn04268418\tspark chamber, spark counter\nn04268565\tspark coil\nn04268799\tspark gap\nn04269086\tspark lever\nn04269270\tspark plug, sparking plug, plug\nn04269502\tsparkplug wrench\nn04269668\tspark transmitter\nn04269822\tspat, gaiter\nn04269944\tspatula\nn04270147\tspatula\nn04270371\tspeakerphone\nn04270576\tspeaking trumpet\nn04270891\tspear, lance, shaft\nn04271148\tspear, gig, fizgig, fishgig, lance\nn04271531\tspecialty store\nn04271793\tspecimen bottle\nn04271891\tspectacle\nn04272054\tspectacles, specs, eyeglasses, glasses\nn04272389\tspectator pump, spectator\nn04272782\tspectrograph\nn04272928\tspectrophotometer\nn04273064\tspectroscope, prism 
spectroscope\nn04273285\tspeculum\nn04273569\tspeedboat\nn04273659\tspeed bump\nn04273796\tspeedometer, speed indicator\nn04273972\tspeed skate, racing skate\nn04274686\tspherometer\nn04274985\tsphygmomanometer\nn04275093\tspicemill\nn04275175\tspice rack\nn04275283\tspider\nn04275548\tspider web, spider's web\nn04275661\tspike\nn04275904\tspike\nn04277352\tspindle\nn04277493\tspindle, mandrel, mandril, arbor\nn04277669\tspindle\nn04277826\tspin dryer, spin drier\nn04278247\tspinet\nn04278353\tspinet\nn04278447\tspinnaker\nn04278605\tspinner\nn04278932\tspinning frame\nn04279063\tspinning jenny\nn04279172\tspinning machine\nn04279353\tspinning rod\nn04279462\tspinning wheel\nn04279858\tspiral bandage\nn04279987\tspiral ratchet screwdriver, ratchet screwdriver\nn04280259\tspiral spring\nn04280373\tspirit lamp\nn04280487\tspirit stove\nn04280845\tspirometer\nn04280970\tspit\nn04281260\tspittoon, cuspidor\nn04281375\tsplashboard, splasher, dashboard\nn04281571\tsplasher\nn04281998\tsplice, splicing\nn04282231\tsplicer\nn04282494\tsplint\nn04282872\tsplit rail, fence rail\nn04282992\tSpode\nn04283096\tspoiler\nn04283255\tspoiler\nn04283378\tspoke, wheel spoke, radius\nn04283585\tspokeshave\nn04283784\tsponge cloth\nn04283905\tsponge mop\nn04284002\tspoon\nn04284341\tspoon\nn04284438\tSpork\nn04284572\tsporran\nn04284869\tsport kite, stunt kite\nn04285008\tsports car, sport car\nn04285146\tsports equipment\nn04285622\tsports implement\nn04285803\tsportswear, athletic wear, activewear\nn04285965\tsport utility, sport utility vehicle, S.U.V., SUV\nn04286128\tspot\nn04286575\tspotlight, spot\nn04286960\tspot weld, spot-weld\nn04287351\tspouter\nn04287451\tsprag\nn04287747\tspray gun\nn04287898\tspray paint\nn04287986\tspreader\nn04288165\tsprig\nn04288272\tspring\nn04288533\tspring balance, spring scale\nn04288673\tspringboard\nn04289027\tsprinkler\nn04289195\tsprinkler system\nn04289449\tsprit\nn04289576\tspritsail\nn04289690\tsprocket, sprocket 
wheel\nn04289827\tsprocket\nn04290079\tspun yarn\nn04290259\tspur, gad\nn04290507\tspur gear, spur wheel\nn04290615\tsputnik\nn04290762\tspy satellite\nn04291069\tsquad room\nn04291242\tsquare\nn04291759\tsquare knot\nn04291992\tsquare-rigger\nn04292080\tsquare sail\nn04292221\tsquash ball\nn04292414\tsquash racket, squash racquet, bat\nn04292572\tsquawk box, squawker, intercom speaker\nn04292921\tsqueegee\nn04293119\tsqueezer\nn04293258\tsquelch circuit, squelch, squelcher\nn04293744\tsquinch\nn04294212\tstabilizer, stabiliser\nn04294426\tstabilizer\nn04294614\tstabilizer bar, anti-sway bar\nn04294879\tstable, stalls, horse barn\nn04295081\tstable gear, saddlery, tack\nn04295353\tstabling\nn04295571\tstacks\nn04295777\tstaddle\nn04295881\tstadium, bowl, arena, sports stadium\nn04296562\tstage\nn04297098\tstagecoach, stage\nn04297750\tstained-glass window\nn04297847\tstair-carpet\nn04298053\tstair-rod\nn04298661\tstairwell\nn04298765\tstake\nn04299215\tstall, stand, sales booth\nn04299370\tstall\nn04299963\tstamp\nn04300358\tstamp mill, stamping mill\nn04300509\tstamping machine, stamper\nn04300643\tstanchion\nn04301000\tstand\nn04301242\tstandard\nn04301474\tstandard cell\nn04301760\tstandard transmission, stick shift\nn04302200\tstanding press\nn04302863\tstanhope\nn04302988\tStanley Steamer\nn04303095\tstaple\nn04303258\tstaple\nn04303357\tstaple gun, staplegun, tacker\nn04303497\tstapler, stapling machine\nn04304215\tstarship, spaceship\nn04304375\tstarter, starter motor, starting motor\nn04304680\tstarting gate, starting stall\nn04305016\tStassano furnace, electric-arc furnace\nn04305210\tStatehouse\nn04305323\tstately home\nn04305471\tstate prison\nn04305572\tstateroom\nn04305947\tstatic tube\nn04306080\tstation\nn04306592\tstator, stator coil\nn04306847\tstatue\nn04307419\tstay\nn04307767\tstaysail\nn04307878\tsteakhouse, chophouse\nn04307986\tsteak knife\nn04308084\tstealth aircraft\nn04308273\tstealth bomber\nn04308397\tstealth fighter\nn04308583\tsteam 
bath, steam room, vapor bath, vapour bath\nn04308807\tsteamboat\nn04308915\tsteam chest\nn04309049\tsteam engine\nn04309348\tsteamer, steamship\nn04309548\tsteamer\nn04309833\tsteam iron\nn04310018\tsteam locomotive\nn04310157\tsteamroller, road roller\nn04310507\tsteam shovel\nn04310604\tsteam turbine\nn04310721\tsteam whistle\nn04310904\tsteel\nn04311004\tsteel arch bridge\nn04311174\tsteel drum\nn04311595\tsteel mill, steelworks, steel plant, steel factory\nn04312020\tsteel-wool pad\nn04312154\tsteelyard, lever scale, beam scale\nn04312432\tsteeple, spire\nn04312654\tsteerage\nn04312756\tsteering gear\nn04312916\tsteering linkage\nn04313220\tsteering system, steering mechanism\nn04313503\tsteering wheel, wheel\nn04313628\tstele, stela\nn04314107\tstem-winder\nn04314216\tstencil\nn04314522\tSten gun\nn04314632\tstenograph\nn04314914\tstep, stair\nn04315342\tstep-down transformer\nn04315713\tstep stool\nn04315828\tstep-up transformer\nn04315948\tstereo, stereophony, stereo system, stereophonic system\nn04316498\tstereoscope\nn04316815\tstern chaser\nn04316924\tsternpost\nn04317063\tsternwheeler\nn04317175\tstethoscope\nn04317325\tstewing pan, stewpan\nn04317420\tstick\nn04317833\tstick\nn04317976\tstick, control stick, joystick\nn04318131\tstick\nn04318787\tstile\nn04318892\tstiletto\nn04318982\tstill\nn04319545\tstillroom, still room\nn04319774\tStillson wrench\nn04319937\tstilt\nn04320405\tStinger\nn04320598\tstink bomb, stench bomb\nn04320871\tstirrer\nn04320973\tstirrup, stirrup iron\nn04321121\tstirrup pump\nn04321453\tstob\nn04322026\tstock, gunstock\nn04322531\tstockade\nn04322692\tstockcar\nn04322801\tstock car\nn04323519\tstockinet, stockinette\nn04323819\tstocking\nn04324120\tstock-in-trade\nn04324297\tstockpot\nn04324387\tstockroom, stock room\nn04324515\tstocks\nn04325041\tstock saddle, Western saddle\nn04325208\tstockyard\nn04325704\tstole\nn04325804\tstomacher\nn04325968\tstomach pump\nn04326547\tstone 
wall\nn04326676\tstoneware\nn04326799\tstonework\nn04326896\tstool\nn04327204\tstoop, stoep\nn04327544\tstop bath, short-stop, short-stop bath\nn04327682\tstopcock, cock, turncock\nn04328054\tstopper knot\nn04328186\tstopwatch, stop watch\nn04328329\tstorage battery, accumulator\nn04328580\tstorage cell, secondary cell\nn04328703\tstorage ring\nn04328946\tstorage space\nn04329477\tstoreroom, storage room, stowage\nn04329681\tstorm cellar, cyclone cellar, tornado cellar\nn04329834\tstorm door\nn04329958\tstorm window, storm sash\nn04330109\tstoup, stoop\nn04330189\tstoup\nn04330267\tstove\nn04330340\tstove, kitchen stove, range, kitchen range, cooking stove\nn04330669\tstove bolt\nn04330746\tstovepipe\nn04330896\tstovepipe iron\nn04330998\tStradavarius, Strad\nn04331277\tstraight chair, side chair\nn04331443\tstraightedge\nn04331639\tstraightener\nn04331765\tstraight flute, straight-fluted drill\nn04331892\tstraight pin\nn04332074\tstraight razor\nn04332243\tstrainer\nn04332580\tstraitjacket, straightjacket\nn04332987\tstrap\nn04333129\tstrap\nn04333869\tstrap hinge, joint hinge\nn04334105\tstrapless\nn04334365\tstreamer fly\nn04334504\tstreamliner\nn04334599\tstreet\nn04335209\tstreet\nn04335435\tstreetcar, tram, tramcar, trolley, trolley car\nn04335693\tstreet clothes\nn04335886\tstreetlight, street lamp\nn04336792\tstretcher\nn04337157\tstretcher\nn04337287\tstretch pants\nn04337503\tstrickle\nn04337650\tstrickle\nn04338517\tstringed instrument\nn04338963\tstringer\nn04339062\tstringer\nn04339191\tstring tie\nn04339638\tstrip\nn04339879\tstrip lighting\nn04340019\tstrip mall\nn04340521\tstroboscope, strobe, strobe light\nn04340750\tstrongbox, deedbox\nn04340935\tstronghold, fastness\nn04341133\tstrongroom\nn04341288\tstrop\nn04341414\tstructural member\nn04341686\tstructure, construction\nn04343511\tstudent center\nn04343630\tstudent lamp\nn04343740\tstudent union\nn04344003\tstud finder\nn04344734\tstudio apartment, studio\nn04344873\tstudio couch, day 
bed\nn04345028\tstudy\nn04345201\tstudy hall\nn04345787\tstuffing nut, packing nut\nn04346003\tstump\nn04346157\tstun gun, stun baton\nn04346328\tstupa, tope\nn04346428\tsty, pigsty, pigpen\nn04346511\tstylus, style\nn04346679\tstylus\nn04346855\tsub-assembly\nn04347119\tsubcompact, subcompact car\nn04347519\tsubmachine gun\nn04347754\tsubmarine, pigboat, sub, U-boat\nn04348070\tsubmarine torpedo\nn04348184\tsubmersible, submersible warship\nn04348359\tsubmersible\nn04348988\tsubtracter\nn04349189\tsubway token\nn04349306\tsubway train\nn04349401\tsubwoofer\nn04349913\tsuction cup\nn04350104\tsuction pump\nn04350235\tsudatorium, sudatory\nn04350458\tsuede cloth, suede\nn04350581\tsugar bowl\nn04350688\tsugar refinery\nn04350769\tsugar spoon, sugar shell\nn04350905\tsuit, suit of clothes\nn04351550\tsuite, rooms\nn04351699\tsuiting\nn04353573\tsulky\nn04354026\tsummer house\nn04354182\tsumo ring\nn04354387\tsump\nn04354487\tsump pump\nn04354589\tsunbonnet\nn04355115\tSunday best, Sunday clothes\nn04355267\tsun deck\nn04355338\tsundial\nn04355511\tsundress\nn04355684\tsundries\nn04355821\tsun gear\nn04355933\tsunglass\nn04356056\tsunglasses, dark glasses, shades\nn04356595\tsunhat, sun hat\nn04356772\tsunlamp, sun lamp, sunray lamp, sun-ray lamp\nn04356925\tsun parlor, sun parlour, sun porch, sunporch, sunroom, sun lounge, solarium\nn04357121\tsunroof, sunshine-roof\nn04357314\tsunscreen, sunblock, sun blocker\nn04357531\tsunsuit\nn04357930\tsupercharger\nn04358117\tsupercomputer\nn04358256\tsuperconducting supercollider\nn04358491\tsuperhighway, information superhighway\nn04358707\tsupermarket\nn04358874\tsuperstructure\nn04359034\tsupertanker\nn04359124\tsupper club\nn04359217\tsupplejack\nn04359335\tsupply chamber\nn04359500\tsupply closet\nn04359589\tsupport\nn04360501\tsupport\nn04360798\tsupport column\nn04360914\tsupport hose, support stocking\nn04361095\tsupporting structure\nn04361260\tsupporting tower\nn04361937\tsurcoat\nn04362624\tsurface gauge, surface 
gage, scribing block\nn04362821\tsurface lift\nn04362972\tsurface search radar\nn04363082\tsurface ship\nn04363210\tsurface-to-air missile, SAM\nn04363412\tsurface-to-air missile system\nn04363671\tsurfboat\nn04363777\tsurcoat\nn04363874\tsurgeon's knot\nn04363991\tsurgery\nn04364160\tsurge suppressor, surge protector, spike suppressor, spike arrester, lightning arrester\nn04364397\tsurgical dressing\nn04364545\tsurgical instrument\nn04364827\tsurgical knife\nn04364994\tsurplice\nn04365112\tsurrey\nn04365229\tsurtout\nn04365328\tsurveillance system\nn04365484\tsurveying instrument, surveyor's instrument\nn04365751\tsurveyor's level\nn04366033\tsushi bar\nn04366116\tsuspension, suspension system\nn04366367\tsuspension bridge\nn04366832\tsuspensory, suspensory bandage\nn04367011\tsustaining pedal, loud pedal\nn04367371\tsuture, surgical seam\nn04367480\tswab, swob, mop\nn04367746\tswab\nn04367950\tswaddling clothes, swaddling bands\nn04368109\tswag\nn04368235\tswage block\nn04368365\tswagger stick\nn04368496\tswallow-tailed coat, swallowtail, morning coat\nn04368695\tswamp buggy, marsh buggy\nn04368840\tswan's down\nn04369025\tswathe, wrapping\nn04369282\tswatter, flyswatter, flyswat\nn04369485\tsweat bag\nn04369618\tsweatband\nn04370048\tsweater, jumper\nn04370288\tsweat pants, sweatpants\nn04370456\tsweatshirt\nn04370600\tsweatshop\nn04370774\tsweat suit, sweatsuit, sweats, workout suit\nn04370955\tsweep, sweep oar\nn04371050\tsweep hand, sweep-second\nn04371430\tswimming trunks, bathing trunks\nn04371563\tswimsuit, swimwear, bathing suit, swimming costume, bathing costume\nn04371774\tswing\nn04371979\tswing door, swinging door\nn04372370\tswitch, electric switch, electrical switch\nn04373089\tswitchblade, switchblade knife, flick-knife, flick knife\nn04373428\tswitch engine, donkey engine\nn04373563\tswivel\nn04373704\tswivel chair\nn04373795\tswizzle stick\nn04373894\tsword, blade, brand, steel\nn04374315\tsword cane, sword stick\nn04374521\tS 
wrench\nn04374735\tsynagogue, temple, tabernacle\nn04374907\tsynchrocyclotron\nn04375080\tsynchroflash\nn04375241\tsynchromesh\nn04375405\tsynchronous converter, rotary, rotary converter\nn04375615\tsynchronous motor\nn04375775\tsynchrotron\nn04375926\tsynchroscope, synchronoscope, synchronizer, synchroniser\nn04376400\tsynthesizer, synthesiser\nn04376876\tsyringe\nn04377057\tsystem\nn04378489\ttabard\nn04378651\tTabernacle\nn04378956\ttabi, tabis\nn04379096\ttab key, tab\nn04379243\ttable\nn04379964\ttable\nn04380255\ttablefork\nn04380346\ttable knife\nn04380533\ttable lamp\nn04380916\ttable saw\nn04381073\ttablespoon\nn04381450\ttablet-armed chair\nn04381587\ttable-tennis table, ping-pong table, pingpong table\nn04381724\ttable-tennis racquet, table-tennis bat, pingpong paddle\nn04381860\ttabletop\nn04381994\ttableware\nn04382334\ttabor, tabour\nn04382438\ttaboret, tabouret\nn04382537\ttachistoscope, t-scope\nn04382695\ttachograph\nn04382880\ttachometer, tach\nn04383015\ttachymeter, tacheometer\nn04383130\ttack\nn04383301\ttack hammer\nn04383839\ttaffeta\nn04383923\ttaffrail\nn04384593\ttailgate, tailboard\nn04384910\ttaillight, tail lamp, rear light, rear lamp\nn04385079\ttailor-made\nn04385157\ttailor's chalk\nn04385536\ttailpipe\nn04385799\ttail rotor, anti-torque rotor\nn04386051\ttailstock\nn04386456\ttake-up\nn04386664\ttalaria\nn04386792\ttalcum, talcum powder\nn04387095\ttam, tam-o'-shanter, tammy\nn04387201\ttambour\nn04387261\ttambour, embroidery frame, embroidery hoop\nn04387400\ttambourine\nn04387531\ttammy\nn04387706\ttamp, tamper, tamping bar\nn04387932\tTampax\nn04388040\ttampion, tompion\nn04388162\ttampon\nn04388473\ttandoor\nn04388574\ttangram\nn04388743\ttank, storage tank\nn04389033\ttank, army tank, armored combat vehicle, armoured combat vehicle\nn04389430\ttankard\nn04389521\ttank car, tank\nn04389718\ttank destroyer\nn04389854\ttank engine, tank locomotive\nn04389999\ttanker plane\nn04390483\ttank shell\nn04390577\ttank 
top\nn04390873\ttannoy\nn04390977\ttap, spigot\nn04391445\ttapa, tappa\nn04391838\ttape, tape recording, taping\nn04392113\ttape, tapeline, tape measure\nn04392526\ttape deck\nn04392764\ttape drive, tape transport, transport\nn04392985\ttape player\nn04393095\ttape recorder, tape machine\nn04393301\ttaper file\nn04393549\ttapestry, tapis\nn04393808\ttappet\nn04393913\ttap wrench\nn04394031\ttare\nn04394261\ttarget, butt\nn04394421\ttarget acquisition system\nn04394630\ttarmacadam, tarmac, macadam\nn04395024\ttarpaulin, tarp\nn04395106\ttartan, plaid\nn04395332\ttasset, tasse\nn04395651\ttattoo\nn04395875\ttavern, tap house\nn04396226\ttawse\nn04396335\ttaximeter\nn04396650\tT-bar lift, T-bar, Alpine lift\nn04396808\ttea bag\nn04396902\ttea ball\nn04397027\ttea cart, teacart, tea trolley, tea wagon\nn04397168\ttea chest\nn04397261\tteaching aid\nn04397452\tteacup\nn04397645\ttea gown\nn04397768\tteakettle\nn04397860\ttea maker\nn04398044\tteapot\nn04398497\tteashop, teahouse, tearoom, tea parlor, tea parlour\nn04398688\tteaspoon\nn04398834\ttea-strainer\nn04398951\ttea table\nn04399046\ttea tray\nn04399158\ttea urn\nn04399382\tteddy, teddy bear\nn04399537\ttee, golf tee\nn04399846\ttee hinge, T hinge\nn04400109\ttelecom hotel, telco building\nn04400289\ttelecommunication system, telecom system, telecommunication equipment, telecom equipment\nn04400499\ttelegraph, telegraphy\nn04400737\ttelegraph key\nn04400899\ttelemeter\nn04401088\ttelephone, phone, telephone set\nn04401578\ttelephone bell\nn04401680\ttelephone booth, phone booth, call box, telephone box, telephone kiosk\nn04401828\ttelephone cord, phone cord\nn04401949\ttelephone jack, phone jack\nn04402057\ttelephone line, phone line, telephone circuit, subscriber line, line\nn04402342\ttelephone plug, phone plug\nn04402449\ttelephone pole, telegraph pole, telegraph post\nn04402580\ttelephone receiver, receiver\nn04402746\ttelephone system, phone system\nn04402984\ttelephone wire, telephone line, telegraph wire, 
telegraph line\nn04403413\ttelephoto lens, zoom lens\nn04403524\tTeleprompter\nn04403638\ttelescope, scope\nn04403925\ttelescopic sight, telescope sight\nn04404072\ttelethermometer\nn04404200\tteletypewriter, teleprinter, teletype machine, telex, telex machine\nn04404412\ttelevision, television system\nn04404817\ttelevision antenna, tv-antenna\nn04404997\ttelevision camera, tv camera, camera\nn04405540\ttelevision equipment, video equipment\nn04405762\ttelevision monitor, tv monitor\nn04405907\ttelevision receiver, television, television set, tv, tv set, idiot box, boob tube, telly, goggle box\nn04406239\ttelevision room, tv room\nn04406552\ttelevision transmitter\nn04406687\ttelpher, telfer\nn04406817\ttelpherage, telferage\nn04407257\ttempera, poster paint, poster color, poster colour\nn04407435\ttemple\nn04407686\ttemple\nn04408871\ttemporary hookup, patch\nn04409011\ttender, supply ship\nn04409128\ttender, ship's boat, pinnace, cutter\nn04409279\ttender\nn04409384\ttenement, tenement house\nn04409515\ttennis ball\nn04409625\ttennis camp\nn04409806\ttennis racket, tennis racquet\nn04409911\ttenon\nn04410086\ttenor drum, tom-tom\nn04410365\ttenoroon\nn04410485\ttenpenny nail\nn04410565\ttenpin\nn04410663\ttensimeter\nn04410760\ttensiometer\nn04410886\ttensiometer\nn04411019\ttensiometer\nn04411264\ttent, collapsible shelter\nn04411835\ttenter\nn04411966\ttenterhook\nn04412097\ttent-fly, rainfly, fly sheet, fly, tent flap\nn04412300\ttent peg\nn04412416\ttepee, tipi, teepee\nn04413151\tterminal, pole\nn04413419\tterminal\nn04413969\tterraced house\nn04414101\tterra cotta\nn04414199\tterrarium\nn04414319\tterra sigillata, Samian ware\nn04414476\tterry, terry cloth, terrycloth\nn04414675\tTesla coil\nn04414909\ttessera\nn04415257\ttest equipment\nn04415663\ttest rocket, research rocket, test instrument vehicle\nn04415815\ttest room, testing room\nn04416005\ttestudo\nn04416901\ttetraskelion, tetraskele\nn04417086\ttetrode\nn04417180\ttextile 
machine\nn04417361\ttextile mill\nn04417672\tthatch, thatched roof\nn04417809\ttheater, theatre, house\nn04418357\ttheater curtain, theatre curtain\nn04418644\ttheater light\nn04419073\ttheodolite, transit\nn04419642\ttheremin\nn04419868\tthermal printer\nn04420024\tthermal reactor\nn04420720\tthermocouple, thermocouple junction\nn04421083\tthermoelectric thermometer, thermel, electric thermometer\nn04421258\tthermograph, thermometrograph\nn04421417\tthermograph\nn04421582\tthermohydrometer, thermogravimeter\nn04421740\tthermojunction\nn04421872\tthermometer\nn04422409\tthermonuclear reactor, fusion reactor\nn04422566\tthermopile\nn04422727\tthermos, thermos bottle, thermos flask\nn04422875\tthermostat, thermoregulator\nn04423552\tthigh pad\nn04423687\tthill\nn04423845\tthimble\nn04424692\tthinning shears\nn04425804\tthird base, third\nn04425977\tthird gear, third\nn04426184\tthird rail\nn04426316\tthong\nn04426427\tthong\nn04427216\tthree-centered arch, basket-handle arch\nn04427473\tthree-decker\nn04427559\tthree-dimensional radar, 3d radar\nn04427715\tthree-piece suit\nn04427857\tthree-quarter binding\nn04428008\tthree-way switch, three-point switch\nn04428191\tthresher, thrasher, threshing machine\nn04428382\tthreshing floor\nn04428634\tthriftshop, second-hand store\nn04429038\tthroat protector\nn04429376\tthrone\nn04430475\tthrust bearing\nn04430605\tthruster\nn04430896\tthumb\nn04431025\tthumbhole\nn04431436\tthumbscrew\nn04431648\tthumbstall\nn04431745\tthumbtack, drawing pin, pushpin\nn04431925\tthunderer\nn04432043\tthwart, cross thwart\nn04432203\ttiara\nn04432662\tticking\nn04432785\ttickler coil\nn04433377\ttie, tie beam\nn04433585\ttie, railroad tie, crosstie, sleeper\nn04434207\ttie rack\nn04434531\ttie rod\nn04434932\ttights, leotards\nn04435180\ttile\nn04435552\ttile cutter\nn04435653\ttile roof\nn04435759\ttiller\nn04435870\ttilter\nn04436012\ttilt-top table, tip-top table, tip table\nn04436185\ttimber\nn04436329\ttimber\nn04436401\ttimber 
hitch\nn04436542\ttimbrel\nn04436832\ttime bomb, infernal machine\nn04436992\ttime capsule\nn04437276\ttime clock\nn04437380\ttime-delay measuring instrument, time-delay measuring system\nn04437670\ttime-fuse\nn04437953\ttimepiece, timekeeper, horologe\nn04438304\ttimer\nn04438507\ttimer\nn04438643\ttime-switch\nn04438897\ttin\nn04439505\ttinderbox\nn04439585\ttine\nn04439712\ttinfoil, tin foil\nn04440597\ttippet\nn04440963\ttire chain, snow chain\nn04441093\ttire iron, tire tool\nn04441528\ttitfer\nn04441662\ttithe barn\nn04441790\ttitrator\nn04442312\ttoaster\nn04442441\ttoaster oven\nn04442582\ttoasting fork\nn04442741\ttoastrack\nn04443164\ttobacco pouch\nn04443257\ttobacco shop, tobacconist shop, tobacconist\nn04443433\ttoboggan\nn04443766\ttoby, toby jug, toby fillpot jug\nn04444121\ttocsin, warning bell\nn04444218\ttoe\nn04444749\ttoecap\nn04444953\ttoehold\nn04445040\ttoga\nn04445154\ttoga virilis\nn04445327\ttoggle\nn04445610\ttoggle bolt\nn04445782\ttoggle joint\nn04445952\ttoggle switch, toggle, on-off switch, on/off switch\nn04446162\ttogs, threads, duds\nn04446276\ttoilet, lavatory, lav, can, john, privy, bathroom\nn04446844\ttoilet bag, sponge bag\nn04447028\ttoilet bowl\nn04447156\ttoilet kit, travel kit\nn04447276\ttoilet powder, bath powder, dusting powder\nn04447443\ttoiletry, toilet articles\nn04447861\ttoilet seat\nn04448070\ttoilet water, eau de toilette\nn04448185\ttokamak\nn04448361\ttoken\nn04449290\ttollbooth, tolbooth, tollhouse\nn04449449\ttoll bridge\nn04449550\ttollgate, tollbar\nn04449700\ttoll line\nn04449966\ttomahawk, hatchet\nn04450133\tTommy gun, Thompson submachine gun\nn04450243\ttomograph\nn04450465\ttone arm, pickup, pickup arm\nn04450640\ttoner\nn04450749\ttongs, pair of tongs\nn04450994\ttongue\nn04451139\ttongue and groove joint\nn04451318\ttongue depressor\nn04451636\ttonometer\nn04451818\ttool\nn04452528\ttool bag\nn04452615\ttoolbox, tool chest, tool cabinet, tool case\nn04452757\ttoolshed, 
toolhouse\nn04452848\ttooth\nn04453037\ttooth\nn04453156\ttoothbrush\nn04453390\ttoothpick\nn04453666\ttop\nn04453910\ttop, cover\nn04454654\ttopgallant, topgallant mast\nn04454792\ttopgallant, topgallant sail\nn04454908\ttopiary\nn04455048\ttopknot\nn04455250\ttopmast\nn04455579\ttopper\nn04455652\ttopsail\nn04456011\ttoque\nn04456115\ttorch\nn04456472\ttorpedo\nn04456734\ttorpedo\nn04457157\ttorpedo\nn04457326\ttorpedo boat\nn04457474\ttorpedo-boat destroyer\nn04457638\ttorpedo tube\nn04457767\ttorque converter\nn04457910\ttorque wrench\nn04458201\ttorture chamber\nn04458633\ttotem pole\nn04458843\ttouch screen, touchscreen\nn04459018\ttoupee, toupe\nn04459122\ttouring car, phaeton, tourer\nn04459243\ttourist class, third class\nn04459362\ttowel\nn04459610\ttoweling, towelling\nn04459773\ttowel rack, towel horse\nn04459909\ttowel rail, towel bar\nn04460130\ttower\nn04461437\ttown hall\nn04461570\ttowpath, towing path\nn04461696\ttow truck, tow car, wrecker\nn04461879\ttoy\nn04462011\ttoy box, toy chest\nn04462240\ttoyshop\nn04462576\ttrace detector\nn04463679\ttrack, rail, rails, runway\nn04464125\ttrack\nn04464615\ttrackball\nn04464852\ttracked vehicle\nn04465050\ttract house\nn04465203\ttract housing\nn04465358\ttraction engine\nn04465501\ttractor\nn04465666\ttractor\nn04466871\ttrail bike, dirt bike, scrambler\nn04467099\ttrailer, house trailer\nn04467307\ttrailer\nn04467506\ttrailer camp, trailer park\nn04467665\ttrailer truck, tractor trailer, trucking rig, rig, articulated lorry, semi\nn04467899\ttrailing edge\nn04468005\ttrain, railroad train\nn04469003\ttramline, tramway, streetcar track\nn04469251\ttrammel\nn04469514\ttrampoline\nn04469684\ttramp steamer, tramp\nn04469813\ttramway, tram, aerial tramway, cable tramway, ropeway\nn04470741\ttransdermal patch, skin patch\nn04471148\ttransept\nn04471315\ttransformer\nn04471632\ttransistor, junction transistor, electronic transistor\nn04471912\ttransit instrument\nn04472243\ttransmission, transmission 
system\nn04472563\ttransmission shaft\nn04472726\ttransmitter, sender\nn04472961\ttransom, traverse\nn04473108\ttransom, transom window, fanlight\nn04473275\ttransponder\nn04473884\ttransporter\nn04474035\ttransporter, car transporter\nn04474187\ttransport ship\nn04474466\ttrap\nn04475309\ttrap door\nn04475411\ttrapeze\nn04475496\ttrave, traverse, crossbeam, crosspiece\nn04475631\ttravel iron\nn04475749\ttrawl, dragnet, trawl net\nn04475900\ttrawl, trawl line, spiller, setline, trotline\nn04476116\ttrawler, dragger\nn04476259\ttray\nn04476526\ttray cloth\nn04476831\ttread\nn04476972\ttread\nn04477219\ttreadmill, treadwheel, tread-wheel\nn04477387\ttreadmill\nn04477548\ttreasure chest\nn04477725\ttreasure ship\nn04478066\ttreenail, trenail, trunnel\nn04478383\ttrefoil arch\nn04478512\ttrellis, treillage\nn04478657\ttrench\nn04479046\ttrench coat\nn04479287\ttrench knife\nn04479405\ttrepan\nn04479526\ttrepan, trephine\nn04479694\ttrestle\nn04479823\ttrestle\nn04479939\ttrestle bridge\nn04480033\ttrestle table\nn04480141\ttrestlework\nn04480303\ttrews\nn04480527\ttrial balloon\nn04480853\ttriangle\nn04480995\ttriangle\nn04481524\ttriclinium\nn04481642\ttriclinium\nn04482177\ttricorn, tricorne\nn04482297\ttricot\nn04482393\ttricycle, trike, velocipede\nn04482975\ttrident\nn04483073\ttrigger\nn04483307\ttrimaran\nn04483925\ttrimmer\nn04484024\ttrimmer arch\nn04484432\ttriode\nn04485082\ttripod\nn04485423\ttriptych\nn04485586\ttrip wire\nn04485750\ttrireme\nn04485884\ttriskelion, triskele\nn04486054\ttriumphal arch\nn04486213\ttrivet\nn04486322\ttrivet\nn04486616\ttroika\nn04486934\ttroll\nn04487081\ttrolleybus, trolley coach, trackless trolley\nn04487394\ttrombone\nn04487724\ttroop carrier, troop transport\nn04487894\ttroopship\nn04488202\ttrophy case\nn04488427\ttrough\nn04488530\ttrouser\nn04488742\ttrouser cuff\nn04488857\ttrouser press, pants presser\nn04489008\ttrouser, pant\nn04489695\ttrousseau\nn04489817\ttrowel\nn04490091\ttruck, motortruck\nn04491312\ttrumpet 
arch\nn04491388\ttruncheon, nightstick, baton, billy, billystick, billy club\nn04491638\ttrundle bed, trundle, truckle bed, truckle\nn04491769\ttrunk\nn04491934\ttrunk hose\nn04492060\ttrunk lid\nn04492157\ttrunk line\nn04492375\ttruss\nn04492749\ttruss bridge\nn04493109\ttry square\nn04493259\tT-square\nn04493381\ttub, vat\nn04494204\ttube, vacuum tube, thermionic vacuum tube, thermionic tube, electron tube, thermionic valve\nn04495051\ttuck box\nn04495183\ttucker\nn04495310\ttucker-bag\nn04495450\ttuck shop\nn04495555\tTudor arch, four-centered arch\nn04495698\ttudung\nn04495843\ttugboat, tug, towboat, tower\nn04496614\ttulle\nn04496726\ttumble-dryer, tumble drier\nn04496872\ttumbler\nn04497249\ttumbrel, tumbril\nn04497442\ttun\nn04497570\ttunic\nn04497801\ttuning fork\nn04498275\ttupik, tupek, sealskin tent\nn04498389\tturban\nn04498523\tturbine\nn04498873\tturbogenerator\nn04499062\ttureen\nn04499300\tTurkish bath\nn04499446\tTurkish towel, terry towel\nn04499554\tTurk's head\nn04499810\tturnbuckle\nn04500060\tturner, food turner\nn04500390\tturnery\nn04501127\tturnpike\nn04501281\tturnspit\nn04501370\tturnstile\nn04501550\tturntable\nn04501837\tturntable, lazy Susan\nn04501947\tturret\nn04502059\tturret clock\nn04502197\tturtleneck, turtle, polo-neck\nn04502502\ttweed\nn04502670\ttweeter\nn04502851\ttwenty-two, .22\nn04502989\ttwenty-two pistol\nn04503073\ttwenty-two rifle\nn04503155\ttwill\nn04503269\ttwill, twill weave\nn04503413\ttwin bed\nn04503499\ttwinjet\nn04503593\ttwist bit, twist drill\nn04503705\ttwo-by-four\nn04504038\ttwo-man tent\nn04504141\ttwo-piece, two-piece suit, lounge suit\nn04504770\ttypesetting machine\nn04505036\ttypewriter\nn04505345\ttypewriter carriage\nn04505470\ttypewriter keyboard\nn04505888\ttyrolean, tirolean\nn04506289\tuke, ukulele\nn04506402\tulster\nn04506506\tultracentrifuge\nn04506688\tultramicroscope, dark-field microscope\nn04506895\tUltrasuede\nn04506994\tultraviolet lamp, ultraviolet 
source\nn04507155\tumbrella\nn04507326\tumbrella tent\nn04507453\tundercarriage\nn04507689\tundercoat, underseal\nn04508163\tundergarment, unmentionable\nn04508489\tunderpants\nn04508949\tunderwear, underclothes, underclothing\nn04509171\tundies\nn04509260\tuneven parallel bars, uneven bars\nn04509417\tunicycle, monocycle\nn04509592\tuniform\nn04510706\tuniversal joint, universal\nn04511002\tuniversity\nn04513827\tupholstery\nn04513998\tupholstery material\nn04514095\tupholstery needle\nn04514241\tuplift\nn04514648\tupper berth, upper\nn04515003\tupright, upright piano\nn04515444\tupset, swage\nn04515729\tupstairs\nn04515890\turceole\nn04516116\turn\nn04516214\turn\nn04516354\tused-car, secondhand car\nn04516672\tutensil\nn04517211\tUzi\nn04517408\tvacation home\nn04517823\tvacuum, vacuum cleaner\nn04517999\tvacuum chamber\nn04518132\tvacuum flask, vacuum bottle\nn04518343\tvacuum gauge, vacuum gage\nn04518643\tValenciennes, Valenciennes lace\nn04518764\tvalise\nn04519153\tvalve\nn04519536\tvalve\nn04519728\tvalve-in-head engine\nn04519887\tvambrace, lower cannon\nn04520170\tvan\nn04520382\tvan, caravan\nn04520784\tvane\nn04520962\tvaporizer, vaporiser\nn04521571\tvariable-pitch propeller\nn04521863\tvariometer\nn04521987\tvarnish\nn04522168\tvase\nn04523525\tvault\nn04523831\tvault, bank vault\nn04524142\tvaulting horse, long horse, buck\nn04524313\tvehicle\nn04524594\tVelcro\nn04524716\tvelocipede\nn04524941\tvelour, velours\nn04525038\tvelvet\nn04525191\tvelveteen\nn04525305\tvending machine\nn04525417\tveneer, veneering\nn04525584\tVenetian blind\nn04525821\tVenn diagram, Venn's diagram\nn04526520\tventilation, ventilation system, ventilating system\nn04526800\tventilation shaft\nn04526964\tventilator\nn04527648\tveranda, verandah, gallery\nn04528079\tverdigris\nn04528968\tvernier caliper, vernier micrometer\nn04529108\tvernier scale, vernier\nn04529681\tvertical file\nn04529962\tvertical stabilizer, vertical stabiliser, vertical fin, tail fin, 
tailfin\nn04530283\tvertical tail\nn04530456\tVery pistol, Verey pistol\nn04530566\tvessel, watercraft\nn04531098\tvessel\nn04531873\tvest, waistcoat\nn04532022\tvestiture\nn04532106\tvestment\nn04532398\tvest pocket\nn04532504\tvestry, sacristy\nn04532670\tviaduct\nn04532831\tvibraphone, vibraharp, vibes\nn04533042\tvibrator\nn04533199\tvibrator\nn04533499\tVictrola\nn04533594\tvicuna\nn04533700\tvideocassette\nn04533802\tvideocassette recorder, VCR\nn04533946\tvideodisk, videodisc, DVD\nn04534127\tvideo recording, video\nn04534359\tvideotape\nn04534520\tvideotape\nn04534895\tvigil light, vigil candle\nn04535252\tvilla\nn04535370\tvilla\nn04535524\tvilla\nn04536153\tviol\nn04536335\tviola\nn04536465\tviola da braccio\nn04536595\tviola da gamba, gamba, bass viol\nn04536765\tviola d'amore\nn04536866\tviolin, fiddle\nn04537436\tvirginal, pair of virginals\nn04538249\tviscometer, viscosimeter\nn04538403\tviscose rayon, viscose\nn04538552\tvise, bench vise\nn04538878\tvisor, vizor\nn04539053\tvisual display unit, VDU\nn04539203\tvivarium\nn04539407\tViyella\nn04539794\tvoile\nn04540053\tvolleyball\nn04540255\tvolleyball net\nn04540397\tvoltage regulator\nn04540761\tvoltaic cell, galvanic cell, primary cell\nn04541136\tvoltaic pile, pile, galvanic pile\nn04541320\tvoltmeter\nn04541662\tvomitory\nn04541777\tvon Neumann machine\nn04541987\tvoting booth\nn04542095\tvoting machine\nn04542329\tvoussoir\nn04542474\tvox angelica, voix celeste\nn04542595\tvox humana\nn04542715\twaders\nn04542858\twading pool\nn04542943\twaffle iron\nn04543158\twagon, waggon\nn04543509\twagon, coaster wagon\nn04543636\twagon tire\nn04543772\twagon wheel\nn04543924\twain\nn04543996\twainscot, wainscoting, wainscotting\nn04544325\twainscoting, wainscotting\nn04544450\twaist pack, belt bag\nn04545305\twalker, baby-walker, go-cart\nn04545471\twalker, Zimmer, Zimmer frame\nn04545748\twalker\nn04545858\twalkie-talkie, walky-talky\nn04545984\twalk-in\nn04546081\twalking shoe\nn04546194\twalking 
stick\nn04546340\tWalkman\nn04546595\twalk-up apartment, walk-up\nn04546855\twall\nn04547592\twall\nn04548280\twall clock\nn04548362\twallet, billfold, notecase, pocketbook\nn04549028\twall tent\nn04549122\twall unit\nn04549629\twand\nn04549721\tWankel engine, Wankel rotary engine, epitrochoidal engine\nn04549919\tward, hospital ward\nn04550184\twardrobe, closet, press\nn04550676\twardroom\nn04551055\twarehouse, storage warehouse\nn04551833\twarming pan\nn04552097\twar paint\nn04552348\twarplane, military plane\nn04552551\twar room\nn04552696\twarship, war vessel, combat ship\nn04553389\twash\nn04553561\twash-and-wear\nn04553703\twashbasin, handbasin, washbowl, lavabo, wash-hand basin\nn04554211\twashboard, splashboard\nn04554406\twashboard\nn04554684\twasher, automatic washer, washing machine\nn04554871\twasher\nn04554998\twashhouse\nn04555291\twashroom\nn04555400\twashstand, wash-hand stand\nn04555600\twashtub\nn04555700\twastepaper basket, waste-paper basket, wastebasket, waste basket, circular file\nn04555897\twatch, ticker\nn04556408\twatch cap\nn04556533\twatch case\nn04556664\twatch glass\nn04556948\twatchtower\nn04557308\twater-base paint\nn04557522\twater bed\nn04557648\twater bottle\nn04557751\twater butt\nn04558059\twater cart\nn04558199\twater chute\nn04558478\twater closet, closet, W.C., loo\nn04558804\twatercolor, water-color, watercolour, water-colour\nn04559023\twater-cooled reactor\nn04559166\twater cooler\nn04559451\twater faucet, water tap, tap, hydrant\nn04559620\twater filter\nn04559730\twater gauge, water gage, water glass\nn04559910\twater glass\nn04559994\twater hazard\nn04560113\twater heater, hot-water heater, hot-water tank\nn04560292\twatering can, watering pot\nn04560502\twatering cart\nn04560619\twater jacket\nn04560804\twater jug\nn04560882\twater jump\nn04561010\twater level\nn04561287\twater meter\nn04561422\twater mill\nn04561734\twaterproof\nn04561857\twaterproofing\nn04561965\twater pump\nn04562122\twater scooter, sea scooter, 
scooter\nn04562262\twater ski\nn04562496\twaterspout\nn04562935\twater tower\nn04563020\twater wagon, water waggon\nn04563204\twaterwheel, water wheel\nn04563413\twaterwheel, water wheel\nn04563560\twater wings\nn04563790\twaterworks\nn04564278\twattmeter\nn04564581\twaxwork, wax figure\nn04565039\tways, shipway, slipway\nn04565375\tweapon, arm, weapon system\nn04566257\tweaponry, arms, implements of war, weapons system, munition\nn04566561\tweapons carrier\nn04566756\tweathercock\nn04567098\tweatherglass\nn04567593\tweather satellite, meteorological satellite\nn04567746\tweather ship\nn04568069\tweathervane, weather vane, vane, wind vane\nn04568557\tweb, entanglement\nn04568713\tweb\nn04568841\twebbing\nn04569063\twebcam\nn04569520\twedge\nn04569822\twedge\nn04570118\twedgie\nn04570214\tWedgwood\nn04570416\tweeder, weed-whacker\nn04570532\tweeds, widow's weeds\nn04570815\tweekender\nn04570958\tweighbridge\nn04571292\tweight, free weight, exercising weight\nn04571566\tweir\nn04571686\tweir\nn04571800\twelcome wagon\nn04571958\tweld\nn04572121\twelder's mask\nn04572235\tweldment\nn04572935\twell\nn04573045\twellhead\nn04573281\twelt\nn04573379\tWeston cell, cadmium cell\nn04573513\twet bar\nn04573625\twet-bulb thermometer\nn04573832\twet cell\nn04573937\twet fly\nn04574067\twet suit\nn04574348\twhaleboat\nn04574471\twhaler, whaling ship\nn04574606\twhaling gun\nn04574999\twheel\nn04575723\twheel\nn04575824\twheel and axle\nn04576002\twheelchair\nn04576211\twheeled vehicle\nn04576971\twheelwork\nn04577139\twherry\nn04577293\twherry, Norfolk wherry\nn04577426\twhetstone\nn04577567\twhiffletree, whippletree, swingletree\nn04577769\twhip\nn04578112\twhipcord\nn04578329\twhipping post\nn04578559\twhipstitch, whipping, whipstitching\nn04578708\twhirler\nn04578801\twhisk, whisk broom\nn04578934\twhisk\nn04579056\twhiskey bottle\nn04579145\twhiskey jug\nn04579230\twhispering gallery, whispering dome\nn04579432\twhistle\nn04579667\twhistle\nn04579986\twhite\nn04580493\twhite 
goods\nn04581102\twhitewash\nn04581595\twhorehouse, brothel, bordello, bagnio, house of prostitution, house of ill repute, bawdyhouse, cathouse, sporting house\nn04581829\twick, taper\nn04582205\twicker, wickerwork, caning\nn04582349\twicker basket\nn04582771\twicket, hoop\nn04582869\twicket\nn04583022\twickiup, wikiup\nn04583212\twide-angle lens, fisheye lens\nn04583620\twidebody aircraft, wide-body aircraft, wide-body, twin-aisle airplane\nn04583888\twide wale\nn04583967\twidow's walk\nn04584056\tWiffle, Wiffle Ball\nn04584207\twig\nn04584373\twigwam\nn04585128\tWilton, Wilton carpet\nn04585318\twimple\nn04585456\twincey\nn04585626\twinceyette\nn04585745\twinch, windlass\nn04585980\tWinchester\nn04586072\twindbreak, shelterbelt\nn04586581\twinder, key\nn04586932\twind instrument, wind\nn04587327\twindjammer\nn04587404\twindmill, aerogenerator, wind generator\nn04587559\twindmill\nn04587648\twindow\nn04588739\twindow\nn04589190\twindow blind\nn04589325\twindow box\nn04589434\twindow envelope\nn04589593\twindow frame\nn04589890\twindow screen\nn04590021\twindow seat\nn04590129\twindow shade\nn04590263\twindowsill\nn04590553\twindshield, windscreen\nn04590746\twindshield wiper, windscreen wiper, wiper, wiper blade\nn04590933\tWindsor chair\nn04591056\tWindsor knot\nn04591157\tWindsor tie\nn04591249\twind tee\nn04591359\twind tunnel\nn04591517\twind turbine\nn04591631\twine bar\nn04591713\twine bottle\nn04591887\twine bucket, wine cooler\nn04592005\twine cask, wine barrel\nn04592099\twineglass\nn04592356\twinepress\nn04592465\twinery, wine maker\nn04592596\twineskin\nn04592741\twing\nn04593077\twing chair\nn04593185\twing nut, wing-nut, wing screw, butterfly nut, thumbnut\nn04593376\twing tip\nn04593524\twing tip\nn04593629\twinker, blinker, blinder\nn04593866\twiper, wiper arm, contact arm\nn04594114\twiper motor\nn04594218\twire\nn04594489\twire, conducting wire\nn04594742\twire cloth\nn04594828\twire cutter\nn04594919\twire gauge, wire gage\nn04595028\twireless 
local area network, WLAN, wireless fidelity, WiFi\nn04595285\twire matrix printer, wire printer, stylus printer\nn04595501\twire recorder\nn04595611\twire stripper\nn04595762\twirework, grillwork\nn04595855\twiring\nn04596116\twishing cap\nn04596492\twitness box, witness stand\nn04596742\twok\nn04596852\twoman's clothing\nn04597066\twood\nn04597309\twoodcarving\nn04597400\twood chisel\nn04597804\twoodenware\nn04597913\twooden spoon\nn04598136\twoodscrew\nn04598318\twoodshed\nn04598416\twood vise, woodworking vise, shoulder vise\nn04598582\twoodwind, woodwind instrument, wood\nn04598965\twoof, weft, filling, pick\nn04599124\twoofer\nn04599235\twool, woolen, woollen\nn04600312\tworkbasket, workbox, workbag\nn04600486\tworkbench, work bench, bench\nn04600912\twork-clothing, work-clothes\nn04601041\tworkhouse\nn04601159\tworkhouse\nn04601938\tworkpiece\nn04602762\tworkroom\nn04602840\tworks, workings\nn04602956\twork-shirt\nn04603399\tworkstation\nn04603729\tworktable, work table\nn04603872\tworkwear\nn04604276\tWorld Wide Web, WWW, web\nn04604644\tworm fence, snake fence, snake-rail fence, Virginia fence\nn04604806\tworm gear\nn04605057\tworm wheel\nn04605163\tworsted\nn04605321\tworsted, worsted yarn\nn04605446\twrap, wrapper\nn04605572\twraparound\nn04605726\twrapping, wrap, wrapper\nn04606251\twreck\nn04606574\twrench, spanner\nn04607035\twrestling mat\nn04607242\twringer\nn04607640\twrist pad\nn04607759\twrist pin, gudgeon pin\nn04607869\twristwatch, wrist watch\nn04607982\twriting arm\nn04608329\twriting desk\nn04608435\twriting desk\nn04608567\twriting implement\nn04608809\txerographic printer\nn04608923\tXerox, xerographic copier, Xerox machine\nn04609531\tX-ray film\nn04609651\tX-ray machine\nn04609811\tX-ray tube\nn04610013\tyacht, racing yacht\nn04610176\tyacht chair\nn04610274\tyagi, Yagi aerial\nn04610503\tyard\nn04610676\tyard\nn04611351\tyardarm\nn04611795\tyard marker\nn04611916\tyardstick, yard measure\nn04612026\tyarmulke, yarmulka, 
yarmelke\nn04612159\tyashmak, yashmac\nn04612257\tyataghan\nn04612373\tyawl, dandy\nn04612504\tyawl\nn04612840\tyoke\nn04613015\tyoke\nn04613158\tyoke, coupling\nn04613696\tyurt\nn04613939\tZamboni\nn04614505\tzero\nn04614655\tziggurat, zikkurat, zikurat\nn04614844\tzill\nn04615149\tzip gun\nn04615226\tzither, cither, zithern\nn04615644\tzoot suit\nn04682018\tshading\nn04950713\tgrain\nn04950952\twood grain, woodgrain, woodiness\nn04951071\tgraining, woodgraining\nn04951186\tmarbleization, marbleisation, marbleizing, marbleising\nn04951373\tlight, lightness\nn04951716\taura, aureole, halo, nimbus, glory, gloriole\nn04951875\tsunniness\nn04953296\tglint\nn04953678\topalescence, iridescence\nn04955160\tpolish, gloss, glossiness, burnish\nn04957356\tprimary color for pigments, primary colour for pigments\nn04957589\tprimary color for light, primary colour for light\nn04958634\tcolorlessness, colourlessness, achromatism, achromaticity\nn04958865\tmottle\nn04959061\tachromia\nn04959230\tshade, tint, tincture, tone\nn04959672\tchromatic color, chromatic colour, spectral color, spectral colour\nn04960277\tblack, blackness, inkiness\nn04960582\tcoal black, ebony, jet black, pitch black, sable, soot black\nn04961062\talabaster\nn04961331\tbone, ivory, pearl, off-white\nn04961691\tgray, grayness, grey, greyness\nn04962062\tash grey, ash gray, silver, silver grey, silver gray\nn04962240\tcharcoal, charcoal grey, charcoal gray, oxford grey, oxford gray\nn04963111\tsanguine\nn04963307\tTurkey red, alizarine red\nn04963588\tcrimson, ruby, deep red\nn04963740\tdark red\nn04964001\tclaret\nn04964799\tfuschia\nn04964878\tmaroon\nn04965179\torange, orangeness\nn04965451\treddish orange\nn04965661\tyellow, yellowness\nn04966543\tgamboge, lemon, lemon yellow, maize\nn04966941\tpale yellow, straw, wheat\nn04967191\tgreen, greenness, viridity\nn04967561\tgreenishness\nn04967674\tsea green\nn04967801\tsage green\nn04967882\tbottle green\nn04968056\temerald\nn04968139\tolive green, 
olive-green\nn04968749\tjade green, jade\nn04968895\tblue, blueness\nn04969242\tazure, cerulean, sapphire, lazuline, sky-blue\nn04969540\tsteel blue\nn04969798\tgreenish blue, aqua, aquamarine, turquoise, cobalt blue, peacock blue\nn04969952\tpurplish blue, royal blue\nn04970059\tpurple, purpleness\nn04970312\tTyrian purple\nn04970398\tindigo\nn04970470\tlavender\nn04970631\treddish purple, royal purple\nn04970916\tpink\nn04971211\tcarnation\nn04971313\trose, rosiness\nn04972350\tchestnut\nn04972451\tchocolate, coffee, deep brown, umber, burnt umber\nn04972801\tlight brown\nn04973020\ttan, topaz\nn04973291\tbeige, ecru\nn04973386\treddish brown, sepia, burnt sienna, Venetian red, mahogany\nn04973585\tbrick red\nn04973669\tcopper, copper color\nn04973816\tIndian red\nn04974145\tpuce\nn04974340\tolive\nn04974859\tultramarine\nn04975739\tcomplementary color, complementary\nn04976319\tpigmentation\nn04976952\tcomplexion, skin color, skin colour\nn04977412\truddiness, rosiness\nn04978561\tnonsolid color, nonsolid colour, dithered color, dithered colour\nn04979002\taposematic coloration, warning coloration\nn04979307\tcryptic coloration\nn04981658\tring\nn05102764\tcenter of curvature, centre of curvature\nn05218119\tcadaver, corpse, stiff, clay, remains\nn05233741\tmandibular notch\nn05235879\trib\nn05238282\tskin, tegument, cutis\nn05239437\tskin graft\nn05241218\tepidermal cell\nn05241485\tmelanocyte\nn05241662\tprickle cell\nn05242070\tcolumnar cell, columnar epithelial cell\nn05242239\tspongioblast\nn05242928\tsquamous cell\nn05244421\tamyloid plaque, amyloid protein plaque\nn05244755\tdental plaque, bacterial plaque\nn05244934\tmacule, macula\nn05245192\tfreckle, lentigo\nn05257476\tbouffant\nn05257967\tsausage curl\nn05258051\tforelock\nn05258627\tspit curl, kiss curl\nn05259914\tpigtail\nn05260127\tpageboy\nn05260240\tpompadour\nn05261310\tthatch\nn05262422\tsoup-strainer, toothbrush\nn05262534\tmustachio, moustachio, handle-bars\nn05262698\twalrus mustache, 
walrus moustache\nn05263183\tstubble\nn05263316\tvandyke beard, vandyke\nn05263448\tsoul patch, Attilio\nn05265736\tesophageal smear\nn05266096\tparaduodenal smear, duodenal smear\nn05266879\tspecimen\nn05278922\tpunctum\nn05279953\tglenoid fossa, glenoid cavity\nn05282652\tdiastema\nn05285623\tmarrow, bone marrow\nn05302499\tmouth, oral cavity, oral fissure, rima oris\nn05314075\tcanthus\nn05399034\tmilk\nn05399243\tmother's milk\nn05399356\tcolostrum, foremilk\nn05418717\tvein, vena, venous blood vessel\nn05427346\tganglion cell, gangliocyte\nn05442594\tX chromosome\nn05447757\tembryonic cell, formative cell\nn05448704\tmyeloblast\nn05448827\tsideroblast\nn05449196\tosteocyte\nn05449661\tmegalocyte, macrocyte\nn05449959\tleukocyte, leucocyte, white blood cell, white cell, white blood corpuscle, white corpuscle, WBC\nn05450617\thistiocyte\nn05451099\tfixed phagocyte\nn05451384\tlymphocyte, lymph cell\nn05453412\tmonoblast\nn05453657\tneutrophil, neutrophile\nn05453815\tmicrophage\nn05454833\tsickle cell\nn05454978\tsiderocyte\nn05455113\tspherocyte\nn05458173\tootid\nn05458576\toocyte\nn05459101\tspermatid\nn05459457\tLeydig cell, Leydig's cell\nn05459769\tstriated muscle cell, striated muscle fiber\nn05460759\tsmooth muscle cell\nn05464534\tRanvier's nodes, nodes of Ranvier\nn05467054\tneuroglia, glia\nn05467758\tastrocyte\nn05468098\tprotoplasmic astrocyte\nn05468739\toligodendrocyte\nn05469664\tproprioceptor\nn05469861\tdendrite\nn05475397\tsensory fiber, afferent fiber\nn05482922\tsubarachnoid space\nn05486510\tcerebral cortex, cerebral mantle, pallium, cortex\nn05491154\trenal cortex\nn05526957\tprepuce, foreskin\nn05538625\thead, caput\nn05539947\tscalp\nn05541509\tfrontal eminence\nn05542893\tsuture, sutura, fibrous joint\nn05545879\tforamen magnum\nn05571341\tesophagogastric junction, oesophagogastric junction\nn05578095\theel\nn05581932\tcuticle\nn05584746\thangnail, agnail\nn05586759\texoskeleton\nn05604434\tabdominal 
wall\nn05716342\tlemon\nn06008896\tcoordinate axis\nn06209940\tlandscape\nn06254669\tmedium\nn06255081\tvehicle\nn06255613\tpaper\nn06259898\tchannel, transmission channel\nn06262567\tfilm, cinema, celluloid\nn06262943\tsilver screen\nn06263202\tfree press\nn06263369\tpress, public press\nn06263609\tprint media\nn06263762\tstorage medium, data-storage medium\nn06263895\tmagnetic storage medium, magnetic medium, magnetic storage\nn06266417\tjournalism, news media\nn06266633\tFleet Street\nn06266710\tphotojournalism\nn06266878\tnews photography\nn06266973\trotogravure\nn06267145\tnewspaper, paper\nn06267564\tdaily\nn06267655\tgazette\nn06267758\tschool newspaper, school paper\nn06267893\ttabloid, rag, sheet\nn06267991\tyellow journalism, tabloid, tab\nn06271778\ttelecommunication, telecom\nn06272290\ttelephone, telephony\nn06272612\tvoice mail, voicemail\nn06272803\tcall, phone call, telephone call\nn06273207\tcall-back\nn06273294\tcollect call\nn06273414\tcall forwarding\nn06273555\tcall-in\nn06273743\tcall waiting\nn06273890\tcrank call\nn06273986\tlocal call\nn06274092\tlong distance, long-distance call, trunk call\nn06274292\ttoll call\nn06274546\twake-up call\nn06274760\tthree-way calling\nn06274921\ttelegraphy\nn06275095\tcable, cablegram, overseas telegram\nn06275353\twireless\nn06275471\tradiotelegraph, radiotelegraphy, wireless telegraphy\nn06276501\tradiotelephone, radiotelephony, wireless telephone\nn06276697\tbroadcasting\nn06276902\tRediffusion\nn06277025\tmultiplex\nn06277135\tradio, radiocommunication, wireless\nn06277280\ttelevision, telecasting, TV, video\nn06278338\tcable television, cable\nn06278475\thigh-definition television, HDTV\nn06281040\treception\nn06281175\tsignal detection, detection\nn06340977\tHakham\nn06359193\tweb site, website, internet site, site\nn06359467\tchat room, chatroom\nn06359657\tportal site, portal\nn06415688\tjotter\nn06417096\tbreviary\nn06418693\twordbook\nn06419354\tdesk dictionary, collegiate 
dictionary\nn06423496\treckoner, ready reckoner\nn06470073\tdocument, written document, papers\nn06591815\talbum, record album\nn06592078\tconcept album\nn06592281\trock opera\nn06592421\ttribute album, benefit album\nn06595351\tmagazine, mag\nn06596179\tcolour supplement\nn06596364\tcomic book\nn06596474\tnews magazine\nn06596607\tpulp, pulp magazine\nn06596727\tslick, slick magazine, glossy\nn06596845\ttrade magazine\nn06613686\tmovie, film, picture, moving picture, moving-picture show, motion picture, motion-picture show, picture show, pic, flick\nn06614901\touttake\nn06616216\tshoot-'em-up\nn06618653\tspaghetti Western\nn06625062\tencyclical, encyclical letter\nn06785654\tcrossword puzzle, crossword\nn06793231\tsign\nn06794110\tstreet sign\nn06874185\ttraffic light, traffic signal, stoplight\nn06883725\tswastika, Hakenkreuz\nn06892775\tconcert\nn06998748\tartwork, art, graphics, nontextual matter\nn07005523\tlobe\nn07248320\tbook jacket, dust cover, dust jacket, dust wrapper\nn07273802\tcairn\nn07461050\tthree-day event\nn07556406\tcomfort food\nn07556637\tcomestible, edible, eatable, pabulum, victual, victuals\nn07556872\ttuck\nn07556970\tcourse\nn07557165\tdainty, delicacy, goody, kickshaw, treat\nn07557434\tdish\nn07560193\tfast food\nn07560331\tfinger food\nn07560422\tingesta\nn07560542\tkosher\nn07560652\tfare\nn07560903\tdiet\nn07561112\tdiet\nn07561590\tdietary\nn07561848\tbalanced diet\nn07562017\tbland diet, ulcer diet\nn07562172\tclear liquid diet\nn07562379\tdiabetic diet\nn07562495\tdietary supplement\nn07562651\tcarbohydrate loading, carbo loading\nn07562881\tfad diet\nn07562984\tgluten-free diet\nn07563207\thigh-protein diet\nn07563366\thigh-vitamin diet, vitamin-deficiency diet\nn07563642\tlight diet\nn07563800\tliquid diet\nn07564008\tlow-calorie diet\nn07564101\tlow-fat diet\nn07564292\tlow-sodium diet, low-salt diet, salt-free diet\nn07564515\tmacrobiotic diet\nn07564629\treducing diet, obesity diet\nn07564796\tsoft diet, pap, spoon 
food\nn07564971\tvegetarianism\nn07565083\tmenu\nn07565161\tchow, chuck, eats, grub\nn07565259\tboard, table\nn07565608\tmess\nn07565725\tration\nn07565945\tfield ration\nn07566092\tK ration\nn07566231\tC-ration\nn07566340\tfoodstuff, food product\nn07566863\tstarches\nn07567039\tbreadstuff\nn07567139\tcoloring, colouring, food coloring, food colouring, food color, food colour\nn07567390\tconcentrate\nn07567611\ttomato concentrate\nn07567707\tmeal\nn07567980\tkibble\nn07568095\tcornmeal, Indian meal\nn07568241\tfarina\nn07568389\tmatzo meal, matzoh meal, matzah meal\nn07568502\toatmeal, rolled oats\nn07568625\tpea flour\nn07568818\troughage, fiber\nn07568991\tbran\nn07569106\tflour\nn07569423\tplain flour\nn07569543\twheat flour\nn07569644\twhole wheat flour, graham flour, graham, whole meal flour\nn07569873\tsoybean meal, soybean flour, soy flour\nn07570021\tsemolina\nn07570530\tcorn gluten feed\nn07570720\tnutriment, nourishment, nutrition, sustenance, aliment, alimentation, victuals\nn07572353\tcommissariat, provisions, provender, viands, victuals\nn07572616\tlarder\nn07572858\tfrozen food, frozen foods\nn07572957\tcanned food, canned foods, canned goods, tinned goods\nn07573103\tcanned meat, tinned meat\nn07573347\tSpam\nn07573453\tdehydrated food, dehydrated foods\nn07573563\tsquare meal\nn07573696\tmeal, repast\nn07574176\tpotluck\nn07574426\trefection\nn07574504\trefreshment\nn07574602\tbreakfast\nn07574780\tcontinental breakfast, petit dejeuner\nn07574923\tbrunch\nn07575076\tlunch, luncheon, tiffin, dejeuner\nn07575226\tbusiness lunch\nn07575392\thigh tea\nn07575510\ttea, afternoon tea, teatime\nn07575726\tdinner\nn07575984\tsupper\nn07576182\tbuffet\nn07576438\tpicnic\nn07576577\tcookout\nn07576781\tbarbecue, barbeque\nn07576969\tclambake\nn07577144\tfish fry\nn07577374\tbite, collation, snack\nn07577538\tnosh\nn07577657\tnosh-up\nn07577772\tploughman's lunch\nn07577918\tcoffee break, tea break\nn07578093\tbanquet, feast, spread\nn07579575\tentree, main 
course\nn07579688\tpiece de resistance\nn07579787\tplate\nn07579917\tadobo\nn07580053\tside dish, side order, entremets\nn07580253\tspecial\nn07580359\tcasserole\nn07580470\tchicken casserole\nn07580592\tchicken cacciatore, chicken cacciatora, hunter's chicken\nn07581249\tantipasto\nn07581346\tappetizer, appetiser, starter\nn07581607\tcanape\nn07581775\tcocktail\nn07581931\tfruit cocktail\nn07582027\tcrab cocktail\nn07582152\tshrimp cocktail\nn07582277\thors d'oeuvre\nn07582441\trelish\nn07582609\tdip\nn07582811\tbean dip\nn07582892\tcheese dip\nn07582970\tclam dip\nn07583066\tguacamole\nn07583197\tsoup\nn07583865\tsoup du jour\nn07583978\talphabet soup\nn07584110\tconsomme\nn07584228\tmadrilene\nn07584332\tbisque\nn07584423\tborsch, borsh, borscht, borsht, borshch, bortsch\nn07584593\tbroth\nn07584859\tbarley water\nn07584938\tbouillon\nn07585015\tbeef broth, beef stock\nn07585107\tchicken broth, chicken stock\nn07585208\tbroth, stock\nn07585474\tstock cube\nn07585557\tchicken soup\nn07585644\tcock-a-leekie, cocky-leeky\nn07585758\tgazpacho\nn07585906\tgumbo\nn07585997\tjulienne\nn07586099\tmarmite\nn07586179\tmock turtle soup\nn07586318\tmulligatawny\nn07586485\toxtail soup\nn07586604\tpea soup\nn07586718\tpepper pot, Philadelphia pepper pot\nn07586894\tpetite marmite, minestrone, vegetable soup\nn07587023\tpotage, pottage\nn07587111\tpottage\nn07587206\tturtle soup, green turtle soup\nn07587331\teggdrop soup\nn07587441\tchowder\nn07587618\tcorn chowder\nn07587700\tclam chowder\nn07587819\tManhattan clam chowder\nn07587962\tNew England clam chowder\nn07588111\tfish chowder\nn07588193\twon ton, wonton, wonton soup\nn07588299\tsplit-pea soup\nn07588419\tgreen pea soup, potage St. 
Germain\nn07588574\tlentil soup\nn07588688\tScotch broth\nn07588817\tvichyssoise\nn07588947\tstew\nn07589458\tbigos\nn07589543\tBrunswick stew\nn07589724\tburgoo\nn07589872\tburgoo\nn07589967\tolla podrida, Spanish burgoo\nn07590068\tmulligan stew, mulligan, Irish burgoo\nn07590177\tpurloo, chicken purloo, poilu\nn07590320\tgoulash, Hungarian goulash, gulyas\nn07590502\thotchpotch\nn07590611\thot pot, hotpot\nn07590752\tbeef goulash\nn07590841\tpork-and-veal goulash\nn07590974\tporkholt\nn07591049\tIrish stew\nn07591162\toyster stew\nn07591236\tlobster stew\nn07591330\tlobscouse, lobscuse, scouse\nn07591473\tfish stew\nn07591586\tbouillabaisse\nn07591813\tmatelote\nn07591961\tpaella\nn07592094\tfricassee\nn07592317\tchicken stew\nn07592400\tturkey stew\nn07592481\tbeef stew\nn07592656\tragout\nn07592768\tratatouille\nn07592922\tsalmi\nn07593004\tpot-au-feu\nn07593107\tslumgullion\nn07593199\tsmorgasbord\nn07593471\tviand\nn07593774\tready-mix\nn07593972\tbrownie mix\nn07594066\tcake mix\nn07594155\tlemonade mix\nn07594250\tself-rising flour, self-raising flour\nn07594737\tchoice morsel, tidbit, titbit\nn07594840\tsavory, savoury\nn07595051\tcalf's-foot jelly\nn07595180\tcaramel, caramelized sugar\nn07595368\tlump sugar\nn07595649\tcane sugar\nn07595751\tcastor sugar, caster sugar\nn07595914\tpowdered sugar\nn07596046\tgranulated sugar\nn07596160\ticing sugar\nn07596362\tcorn sugar\nn07596452\tbrown sugar\nn07596566\tdemerara, demerara sugar\nn07596684\tsweet, confection\nn07596967\tconfectionery\nn07597145\tconfiture\nn07597263\tsweetmeat\nn07597365\tcandy, confect\nn07598256\tcandy bar\nn07598529\tcarob bar\nn07598622\thardbake\nn07598734\thard candy\nn07598928\tbarley-sugar, barley candy\nn07599068\tbrandyball\nn07599161\tjawbreaker\nn07599242\tlemon drop\nn07599383\tsourball\nn07599468\tpatty\nn07599554\tpeppermint patty\nn07599649\tbonbon\nn07599783\tbrittle, toffee, toffy\nn07599911\tpeanut brittle\nn07599998\tchewing gum, gum\nn07600177\tgum 
ball\nn07600285\tbubble gum\nn07600394\tbutterscotch\nn07600506\tcandied fruit, succade, crystallized fruit\nn07600696\tcandied apple, candy apple, taffy apple, caramel apple, toffee apple\nn07600895\tcrystallized ginger\nn07601025\tgrapefruit peel\nn07601175\tlemon peel\nn07601290\torange peel\nn07601407\tcandied citrus peel\nn07601572\tcandy cane\nn07601686\tcandy corn\nn07601809\tcaramel\nn07602650\tcenter, centre\nn07604956\tcomfit\nn07605040\tcotton candy, spun sugar, candyfloss\nn07605198\tdragee\nn07605282\tdragee\nn07605380\tfondant\nn07605474\tfudge\nn07605597\tchocolate fudge\nn07605693\tdivinity, divinity fudge\nn07605804\tpenuche, penoche, panoche, panocha\nn07605944\tgumdrop\nn07606058\tjujube\nn07606191\thoney crisp\nn07606278\tmint, mint candy\nn07606419\thorehound\nn07606538\tpeppermint, peppermint candy\nn07606669\tjelly bean, jelly egg\nn07606764\tkiss, candy kiss\nn07606933\tmolasses kiss\nn07607027\tmeringue kiss\nn07607138\tchocolate kiss\nn07607361\tlicorice, liquorice\nn07607492\tLife Saver\nn07607605\tlollipop, sucker, all-day sucker\nn07607707\tlozenge\nn07607832\tcachou\nn07607967\tcough drop, troche, pastille, pastil\nn07608098\tmarshmallow\nn07608245\tmarzipan, marchpane\nn07608339\tnougat\nn07608429\tnougat bar\nn07608533\tnut bar\nn07608641\tpeanut bar\nn07608721\tpopcorn ball\nn07608866\tpraline\nn07608980\trock candy\nn07609083\trock candy, rock\nn07609215\tsugar candy\nn07609316\tsugarplum\nn07609407\ttaffy\nn07609549\tmolasses taffy\nn07609632\ttruffle, chocolate truffle\nn07609728\tTurkish Delight\nn07609840\tdessert, sweet, afters\nn07610295\tambrosia, nectar\nn07610502\tambrosia\nn07610620\tbaked Alaska\nn07610746\tblancmange\nn07610890\tcharlotte\nn07611046\tcompote, fruit compote\nn07611148\tdumpling\nn07611267\tflan\nn07611358\tfrozen dessert\nn07611733\tjunket\nn07611839\tmousse\nn07611991\tmousse\nn07612137\tpavlova\nn07612273\tpeach melba\nn07612367\twhip\nn07612530\tprune whip\nn07612632\tpudding\nn07612996\tpudding, 
pud\nn07613158\tsyllabub, sillabub\nn07613266\ttiramisu\nn07613480\ttrifle\nn07613671\ttipsy cake\nn07613815\tjello, Jell-O\nn07614103\tapple dumpling\nn07614198\tice, frappe\nn07614348\twater ice, sorbet\nn07614500\tice cream, icecream\nn07614730\tice-cream cone\nn07614825\tchocolate ice cream\nn07615052\tNeapolitan ice cream\nn07615190\tpeach ice cream\nn07615289\tsherbert, sherbet\nn07615460\tstrawberry ice cream\nn07615569\ttutti-frutti\nn07615671\tvanilla ice cream\nn07615774\tice lolly, lolly, lollipop, popsicle\nn07615954\tice milk\nn07616046\tfrozen yogurt\nn07616174\tsnowball\nn07616265\tsnowball\nn07616386\tparfait\nn07616487\tice-cream sundae, sundae\nn07616590\tsplit\nn07616748\tbanana split\nn07616906\tfrozen pudding\nn07617051\tfrozen custard, soft ice cream\nn07617188\tpudding\nn07617344\tflummery\nn07617447\tfish mousse\nn07617526\tchicken mousse\nn07617611\tchocolate mousse\nn07617708\tplum pudding, Christmas pudding\nn07617839\tcarrot pudding\nn07617932\tcorn pudding\nn07618029\tsteamed pudding\nn07618119\tduff, plum duff\nn07618281\tvanilla pudding\nn07618432\tchocolate pudding\nn07618587\tbrown Betty\nn07618684\tNesselrode, Nesselrode pudding\nn07618871\tpease pudding\nn07619004\tcustard\nn07619208\tcreme caramel\nn07619301\tcreme anglais\nn07619409\tcreme brulee\nn07619508\tfruit custard\nn07619881\ttapioca\nn07620047\ttapioca pudding\nn07620145\troly-poly, roly-poly pudding\nn07620327\tsuet pudding\nn07620597\tBavarian cream\nn07620689\tmaraschino, maraschino cherry\nn07621264\tnonpareil\nn07621497\tzabaglione, sabayon\nn07621618\tgarnish\nn07623136\tpastry, pastry dough\nn07624466\tturnover\nn07624666\tapple turnover\nn07624757\tknish\nn07624924\tpirogi, piroshki, pirozhki\nn07625061\tsamosa\nn07625324\ttimbale\nn07627931\tpuff paste, pate feuillete\nn07628068\tphyllo\nn07628181\tpuff batter, pouf paste, pate a choux\nn07631926\tice-cream cake, icebox cake\nn07639069\tdoughnut, donut, sinker\nn07641928\tfish cake, fish ball\nn07642361\tfish 
stick, fish finger\nn07642471\tconserve, preserve, conserves, preserves\nn07642742\tapple butter\nn07642833\tchowchow\nn07642933\tjam\nn07643026\tlemon curd, lemon cheese\nn07643200\tstrawberry jam, strawberry preserves\nn07643306\tjelly\nn07643474\tapple jelly\nn07643577\tcrabapple jelly\nn07643679\tgrape jelly\nn07643764\tmarmalade\nn07643891\torange marmalade\nn07643981\tgelatin, jelly\nn07644244\tgelatin dessert\nn07648913\tbuffalo wing\nn07648997\tbarbecued wing\nn07650792\tmess\nn07650903\tmince\nn07651025\tpuree\nn07654148\tbarbecue, barbeque\nn07654298\tbiryani, biriani\nn07655067\tescalope de veau Orloff\nn07655263\tsaute\nn07663899\tpatty, cake\nn07665438\tveal parmesan, veal parmigiana\nn07666176\tveal cordon bleu\nn07672914\tmargarine, margarin, oleo, oleomargarine, marge\nn07678586\tmincemeat\nn07678729\tstuffing, dressing\nn07678953\tturkey stuffing\nn07679034\toyster stuffing, oyster dressing\nn07679140\tforcemeat, farce\nn07679356\tbread, breadstuff, staff of life\nn07680168\tanadama bread\nn07680313\tbap\nn07680416\tbarmbrack\nn07680517\tbreadstick, bread-stick\nn07680655\tgrissino\nn07680761\tbrown bread, Boston brown bread\nn07680932\tbun, roll\nn07681264\ttea bread\nn07681355\tcaraway seed bread\nn07681450\tchallah, hallah\nn07681691\tcinnamon bread\nn07681805\tcracked-wheat bread\nn07681926\tcracker\nn07682197\tcrouton\nn07682316\tdark bread, whole wheat bread, whole meal bread, brown bread\nn07682477\tEnglish muffin\nn07682624\tflatbread\nn07682808\tgarlic bread\nn07682952\tgluten bread\nn07683039\tgraham bread\nn07683138\tHost\nn07683265\tflatbrod\nn07683360\tbannock\nn07683490\tchapatti, chapati\nn07683617\tpita, pocket bread\nn07683786\tloaf of bread, loaf\nn07684084\tFrench loaf\nn07684164\tmatzo, matzoh, matzah, unleavened bread\nn07684289\tnan, naan\nn07684422\tonion bread\nn07684517\traisin bread\nn07684600\tquick bread\nn07684938\tbanana bread\nn07685031\tdate bread\nn07685118\tdate-nut bread\nn07685218\tnut 
bread\nn07685303\toatcake\nn07685399\tIrish soda bread\nn07685546\tskillet bread, fry bread\nn07685730\trye bread\nn07685918\tblack bread, pumpernickel\nn07686021\tJewish rye bread, Jewish rye\nn07686202\tlimpa\nn07686299\tSwedish rye bread, Swedish rye\nn07686461\tsalt-rising bread\nn07686634\tsimnel\nn07686720\tsour bread, sourdough bread\nn07686873\ttoast\nn07687053\twafer\nn07687211\twhite bread, light bread\nn07687381\tbaguet, baguette\nn07687469\tFrench bread\nn07687626\tItalian bread\nn07687789\tcornbread\nn07688021\tcorn cake\nn07688130\tskillet corn bread\nn07688265\tashcake, ash cake, corn tash\nn07688412\thoecake\nn07688624\tcornpone, pone\nn07688757\tcorn dab, corn dodger, dodger\nn07688898\thush puppy, hushpuppy\nn07689003\tjohnnycake, johnny cake, journey cake\nn07689217\tShawnee cake\nn07689313\tspoon bread, batter bread\nn07689490\tcinnamon toast\nn07689624\torange toast\nn07689757\tMelba toast\nn07689842\tzwieback, rusk, Brussels biscuit, twice-baked bread\nn07690019\tfrankfurter bun, hotdog bun\nn07690152\thamburger bun, hamburger roll\nn07690273\tmuffin, gem\nn07690431\tbran muffin\nn07690511\tcorn muffin\nn07690585\tYorkshire pudding\nn07690739\tpopover\nn07690892\tscone\nn07691091\tdrop scone, griddlecake, Scotch pancake\nn07691237\tcross bun, hot cross bun\nn07691539\tbrioche\nn07691650\tcrescent roll, croissant\nn07691758\thard roll, Vienna roll\nn07691863\tsoft roll\nn07691954\tkaiser roll\nn07692114\tParker House roll\nn07692248\tclover-leaf roll\nn07692405\tonion roll\nn07692517\tbialy, bialystoker\nn07692614\tsweet roll, coffee roll\nn07692887\tbear claw, bear paw\nn07693048\tcinnamon roll, cinnamon bun, cinnamon snail\nn07693223\thoney bun, sticky bun, caramel bun, schnecken\nn07693439\tpinwheel roll\nn07693590\tdanish, danish pastry\nn07693725\tbagel, beigel\nn07693889\tonion bagel\nn07693972\tbiscuit\nn07694169\trolled biscuit\nn07694403\tbaking-powder biscuit\nn07694516\tbuttermilk biscuit, soda 
biscuit\nn07694659\tshortcake\nn07694839\thardtack, pilot biscuit, pilot bread, sea biscuit, ship biscuit\nn07695187\tsaltine\nn07695284\tsoda cracker\nn07695410\toyster cracker\nn07695504\twater biscuit\nn07695652\tgraham cracker\nn07695742\tpretzel\nn07695878\tsoft pretzel\nn07695965\tsandwich\nn07696403\tsandwich plate\nn07696527\tbutty\nn07696625\tham sandwich\nn07696728\tchicken sandwich\nn07696839\tclub sandwich, three-decker, triple-decker\nn07696977\topen-face sandwich, open sandwich\nn07697100\thamburger, beefburger, burger\nn07697313\tcheeseburger\nn07697408\ttunaburger\nn07697537\thotdog, hot dog, red hot\nn07697699\tSloppy Joe\nn07697825\tbomber, grinder, hero, hero sandwich, hoagie, hoagy, Cuban sandwich, Italian sandwich, poor boy, sub, submarine, submarine sandwich, torpedo, wedge, zep\nn07698250\tgyro\nn07698401\tbacon-lettuce-tomato sandwich, BLT\nn07698543\tReuben\nn07698672\twestern, western sandwich\nn07698782\twrap\nn07700003\tspaghetti\nn07703889\thasty pudding\nn07704054\tgruel\nn07704205\tcongee, jook\nn07704305\tskilly\nn07705931\tedible fruit\nn07707451\tvegetable, veggie, veg\nn07708124\tjulienne, julienne vegetable\nn07708398\traw vegetable, rabbit food\nn07708512\tcrudites\nn07708685\tcelery stick\nn07708798\tlegume\nn07709046\tpulse\nn07709172\tpotherb\nn07709333\tgreens, green, leafy vegetable\nn07709701\tchop-suey greens\nn07709881\tbean curd, tofu\nn07710007\tsolanaceous vegetable\nn07710283\troot vegetable\nn07710616\tpotato, white potato, Irish potato, murphy, spud, tater\nn07710952\tbaked potato\nn07711080\tfrench fries, french-fried potatoes, fries, chips\nn07711232\thome fries, home-fried potatoes\nn07711371\tjacket potato\nn07711569\tmashed potato\nn07711683\tpotato skin, potato peel, potato peelings\nn07711799\tUruguay potato\nn07711907\tyam\nn07712063\tsweet potato\nn07712267\tyam\nn07712382\tsnack food\nn07712559\tchip, crisp, potato chip, Saratoga chip\nn07712748\tcorn chip\nn07712856\ttortilla 
chip\nn07712959\tnacho\nn07713074\teggplant, aubergine, mad apple\nn07713267\tpieplant, rhubarb\nn07713395\tcruciferous vegetable\nn07713763\tmustard, mustard greens, leaf mustard, Indian mustard\nn07713895\tcabbage, chou\nn07714078\tkale, kail, cole\nn07714188\tcollards, collard greens\nn07714287\tChinese cabbage, celery cabbage, Chinese celery\nn07714448\tbok choy, bok choi\nn07714571\thead cabbage\nn07714802\tred cabbage\nn07714895\tsavoy cabbage, savoy\nn07714990\tbroccoli\nn07715103\tcauliflower\nn07715221\tbrussels sprouts\nn07715407\tbroccoli rabe, broccoli raab\nn07715561\tsquash\nn07715721\tsummer squash\nn07716034\tyellow squash\nn07716203\tcrookneck, crookneck squash, summer crookneck\nn07716358\tzucchini, courgette\nn07716504\tmarrow, vegetable marrow\nn07716649\tcocozelle\nn07716750\tpattypan squash\nn07716906\tspaghetti squash\nn07717070\twinter squash\nn07717410\tacorn squash\nn07717556\tbutternut squash\nn07717714\thubbard squash\nn07717858\tturban squash\nn07718068\tbuttercup squash\nn07718195\tcushaw\nn07718329\twinter crookneck squash\nn07718472\tcucumber, cuke\nn07718671\tgherkin\nn07718747\tartichoke, globe artichoke\nn07718920\tartichoke heart\nn07719058\tJerusalem artichoke, sunchoke\nn07719213\tasparagus\nn07719330\tbamboo shoot\nn07719437\tsprout\nn07719616\tbean sprout\nn07719756\talfalfa sprout\nn07719839\tbeet, beetroot\nn07719980\tbeet green\nn07720084\tsugar beet\nn07720185\tmangel-wurzel\nn07720277\tchard, Swiss chard, spinach beet, leaf beet\nn07720442\tpepper\nn07720615\tsweet pepper\nn07720875\tbell pepper\nn07721018\tgreen pepper\nn07721118\tglobe pepper\nn07721195\tpimento, pimiento\nn07721325\thot pepper\nn07721456\tchili, chili pepper, chilli, chilly, chile\nn07721678\tjalapeno, jalapeno pepper\nn07721833\tchipotle\nn07721942\tcayenne, cayenne pepper\nn07722052\ttabasco, red pepper\nn07722217\tonion\nn07722390\tBermuda onion\nn07722485\tgreen onion, spring onion, scallion\nn07722666\tVidalia onion\nn07722763\tSpanish 
onion\nn07722888\tpurple onion, red onion\nn07723039\tleek\nn07723177\tshallot\nn07723330\tsalad green, salad greens\nn07723559\tlettuce\nn07723753\tbutterhead lettuce\nn07723968\tbuttercrunch\nn07724078\tBibb lettuce\nn07724173\tBoston lettuce\nn07724269\tcrisphead lettuce, iceberg lettuce, iceberg\nn07724492\tcos, cos lettuce, romaine, romaine lettuce\nn07724654\tleaf lettuce, loose-leaf lettuce\nn07724819\tceltuce\nn07724943\tbean, edible bean\nn07725158\tgoa bean\nn07725255\tlentil\nn07725376\tpea\nn07725531\tgreen pea, garden pea\nn07725663\tmarrowfat pea\nn07725789\tsnow pea, sugar pea\nn07725888\tsugar snap pea\nn07726009\tsplit-pea\nn07726095\tchickpea, garbanzo\nn07726230\tcajan pea, pigeon pea, dahl\nn07726386\tfield pea\nn07726525\tmushy peas\nn07726672\tblack-eyed pea, cowpea\nn07726796\tcommon bean\nn07727048\tkidney bean\nn07727140\tnavy bean, pea bean, white bean\nn07727252\tpinto bean\nn07727377\tfrijole\nn07727458\tblack bean, turtle bean\nn07727578\tfresh bean\nn07727741\tflageolet, haricot\nn07727868\tgreen bean\nn07728053\tsnap bean, snap\nn07728181\tstring bean\nn07728284\tKentucky wonder, Kentucky wonder bean\nn07728391\tscarlet runner, scarlet runner bean, runner bean, English runner bean\nn07728585\tharicot vert, haricots verts, French bean\nn07728708\twax bean, yellow bean\nn07728804\tshell bean\nn07729000\tlima bean\nn07729142\tFordhooks\nn07729225\tsieva bean, butter bean, butterbean, civet bean\nn07729384\tfava bean, broad bean\nn07729485\tsoy, soybean, soya, soya bean\nn07729828\tgreen soybean\nn07729926\tfield soybean\nn07730033\tcardoon\nn07730207\tcarrot\nn07730320\tcarrot stick\nn07730406\tcelery\nn07730562\tpascal celery, Paschal celery\nn07730708\tceleriac, celery root\nn07730855\tchicory, curly endive\nn07731006\tradicchio\nn07731122\tcoffee substitute\nn07731284\tchicory, chicory root\nn07731436\tPostum\nn07731587\tchicory escarole, endive, escarole\nn07731767\tBelgian endive, French endive, witloof\nn07731952\tcorn, edible 
corn\nn07732168\tsweet corn, green corn\nn07732302\thominy\nn07732433\tlye hominy\nn07732525\tpearl hominy\nn07732636\tpopcorn\nn07732747\tcress\nn07732904\twatercress\nn07733005\tgarden cress\nn07733124\twinter cress\nn07733217\tdandelion green\nn07733394\tgumbo, okra\nn07733567\tkohlrabi, turnip cabbage\nn07733712\tlamb's-quarter, pigweed, wild spinach\nn07733847\twild spinach\nn07734017\ttomato\nn07734183\tbeefsteak tomato\nn07734292\tcherry tomato\nn07734417\tplum tomato\nn07734555\ttomatillo, husk tomato, Mexican husk tomato\nn07734744\tmushroom\nn07734879\tstuffed mushroom\nn07735052\tsalsify\nn07735179\toyster plant, vegetable oyster\nn07735294\tscorzonera, black salsify\nn07735404\tparsnip\nn07735510\tpumpkin\nn07735687\tradish\nn07735803\tturnip\nn07735981\twhite turnip\nn07736087\trutabaga, swede, swedish turnip, yellow turnip\nn07736256\tturnip greens\nn07736371\tsorrel, common sorrel\nn07736527\tFrench sorrel\nn07736692\tspinach\nn07736813\ttaro, taro root, cocoyam, dasheen, edda\nn07736971\ttruffle, earthnut\nn07737081\tedible nut\nn07737594\tbunya bunya\nn07737745\tpeanut, earthnut, goober, goober pea, groundnut, monkey nut\nn07738105\tfreestone\nn07738224\tcling, clingstone\nn07739035\twindfall\nn07739125\tapple\nn07739344\tcrab apple, crabapple\nn07739506\teating apple, dessert apple\nn07739923\tBaldwin\nn07740033\tCortland\nn07740115\tCox's Orange Pippin\nn07740220\tDelicious\nn07740342\tGolden Delicious, Yellow Delicious\nn07740461\tRed Delicious\nn07740597\tEmpire\nn07740744\tGrimes' golden\nn07740855\tJonathan\nn07740954\tMcIntosh\nn07741138\tMacoun\nn07741235\tNorthern Spy\nn07741357\tPearmain\nn07741461\tPippin\nn07741623\tPrima\nn07741706\tStayman\nn07741804\tWinesap\nn07741888\tStayman Winesap\nn07742012\tcooking apple\nn07742224\tBramley's Seedling\nn07742313\tGranny Smith\nn07742415\tLane's Prince Albert\nn07742513\tNewtown Wonder\nn07742605\tRome Beauty\nn07742704\tberry\nn07743224\tbilberry, whortleberry, European 
blueberry\nn07743384\thuckleberry\nn07743544\tblueberry\nn07743723\twintergreen, boxberry, checkerberry, teaberry, spiceberry\nn07743902\tcranberry\nn07744057\tlingonberry, mountain cranberry, cowberry, lowbush cranberry\nn07744246\tcurrant\nn07744430\tgooseberry\nn07744559\tblack currant\nn07744682\tred currant\nn07744811\tblackberry\nn07745046\tboysenberry\nn07745197\tdewberry\nn07745357\tloganberry\nn07745466\traspberry\nn07745661\tsaskatoon, serviceberry, shadberry, juneberry\nn07745940\tstrawberry\nn07746038\tsugarberry, hackberry\nn07746186\tpersimmon\nn07746334\tacerola, barbados cherry, surinam cherry, West Indian cherry\nn07746551\tcarambola, star fruit\nn07746749\tceriman, monstera\nn07746910\tcarissa plum, natal plum\nn07747055\tcitrus, citrus fruit, citrous fruit\nn07747607\torange\nn07747811\ttemple orange\nn07747951\tmandarin, mandarin orange\nn07748157\tclementine\nn07748276\tsatsuma\nn07748416\ttangerine\nn07748574\ttangelo, ugli, ugli fruit\nn07748753\tbitter orange, Seville orange, sour orange\nn07748912\tsweet orange\nn07749095\tJaffa orange\nn07749192\tnavel orange\nn07749312\tValencia orange\nn07749446\tkumquat\nn07749582\tlemon\nn07749731\tlime\nn07749870\tkey lime\nn07749969\tgrapefruit\nn07750146\tpomelo, shaddock\nn07750299\tcitrange\nn07750449\tcitron\nn07750586\talmond\nn07750736\tJordan almond\nn07750872\tapricot\nn07751004\tpeach\nn07751148\tnectarine\nn07751280\tpitahaya\nn07751451\tplum\nn07751737\tdamson, damson plum\nn07751858\tgreengage, greengage plum\nn07751977\tbeach plum\nn07752109\tsloe\nn07752264\tVictoria plum\nn07752377\tdried fruit\nn07752514\tdried apricot\nn07752602\tprune\nn07752664\traisin\nn07752782\tseedless raisin, sultana\nn07752874\tseeded raisin\nn07752966\tcurrant\nn07753113\tfig\nn07753275\tpineapple, ananas\nn07753448\tanchovy pear, river pear\nn07753592\tbanana\nn07753743\tpassion fruit\nn07753980\tgranadilla\nn07754155\tsweet calabash\nn07754279\tbell apple, sweet cup, water lemon, yellow 
granadilla\nn07754451\tbreadfruit\nn07754684\tjackfruit, jak, jack\nn07754894\tcacao bean, cocoa bean\nn07755089\tcocoa\nn07755262\tcanistel, eggfruit\nn07755411\tmelon\nn07755619\tmelon ball\nn07755707\tmuskmelon, sweet melon\nn07755929\tcantaloup, cantaloupe\nn07756096\twinter melon\nn07756325\thoneydew, honeydew melon\nn07756499\tPersian melon\nn07756641\tnet melon, netted melon, nutmeg melon\nn07756838\tcasaba, casaba melon\nn07756951\twatermelon\nn07757132\tcherry\nn07757312\tsweet cherry, black cherry\nn07757511\tbing cherry\nn07757602\theart cherry, oxheart, oxheart cherry\nn07757753\tblackheart, blackheart cherry\nn07757874\tcapulin, Mexican black cherry\nn07757990\tsour cherry\nn07758125\tamarelle\nn07758260\tmorello\nn07758407\tcocoa plum, coco plum, icaco\nn07758582\tgherkin\nn07758680\tgrape\nn07758950\tfox grape\nn07759194\tConcord grape\nn07759324\tCatawba\nn07759424\tmuscadine, bullace grape\nn07759576\tscuppernong\nn07759691\tslipskin grape\nn07759816\tvinifera grape\nn07760070\temperor\nn07760153\tmuscat, muscatel, muscat grape\nn07760297\tribier\nn07760395\tsultana\nn07760501\tTokay\nn07760673\tflame tokay\nn07760755\tThompson Seedless\nn07760859\tcustard apple\nn07761141\tcherimoya, cherimolla\nn07761309\tsoursop, guanabana\nn07761611\tsweetsop, annon, sugar apple\nn07761777\tilama\nn07761954\tpond apple\nn07762114\tpapaw, pawpaw\nn07762244\tpapaya\nn07762373\tkai apple\nn07762534\tketembilla, kitembilla, kitambilla\nn07762740\tackee, akee\nn07762913\tdurian\nn07763107\tfeijoa, pineapple guava\nn07763290\tgenip, Spanish lime\nn07763483\tgenipap, genipap fruit\nn07763629\tkiwi, kiwi fruit, Chinese gooseberry\nn07763792\tloquat, Japanese plum\nn07763987\tmangosteen\nn07764155\tmango\nn07764315\tsapodilla, sapodilla plum, sapota\nn07764486\tsapote, mammee, marmalade plum\nn07764630\ttamarind, tamarindo\nn07764847\tavocado, alligator pear, avocado pear, 
aguacate\nn07765073\tdate\nn07765208\telderberry\nn07765361\tguava\nn07765517\tmombin\nn07765612\thog plum, yellow mombin\nn07765728\thog plum, wild plum\nn07765862\tjaboticaba\nn07765999\tjujube, Chinese date, Chinese jujube\nn07766173\tlitchi, litchi nut, litchee, lichi, leechee, lichee, lychee\nn07766409\tlonganberry, dragon's eye\nn07766530\tmamey, mammee, mammee apple\nn07766723\tmarang\nn07766891\tmedlar\nn07767002\tmedlar\nn07767171\tmulberry\nn07767344\tolive\nn07767549\tblack olive, ripe olive\nn07767709\tgreen olive\nn07767847\tpear\nn07768068\tbosc\nn07768139\tanjou\nn07768230\tbartlett, bartlett pear\nn07768318\tseckel, seckel pear\nn07768423\tplantain\nn07768590\tplumcot\nn07768694\tpomegranate\nn07768858\tprickly pear\nn07769102\tBarbados gooseberry, blade apple\nn07769306\tquandong, quandang, quantong, native peach\nn07769465\tquandong nut\nn07769584\tquince\nn07769731\trambutan, rambotan\nn07769886\tpulasan, pulassan\nn07770034\trose apple\nn07770180\tsorb, sorb apple\nn07770439\tsour gourd, monkey bread\nn07770571\tedible seed\nn07770763\tpumpkin seed\nn07770869\tbetel nut, areca nut\nn07771082\tbeechnut\nn07771212\twalnut\nn07771405\tblack walnut\nn07771539\tEnglish walnut\nn07771731\tbrazil nut, brazil\nn07771891\tbutternut\nn07772026\tsouari nut\nn07772147\tcashew, cashew nut\nn07772274\tchestnut\nn07772413\tchincapin, chinkapin, chinquapin\nn07772788\thazelnut, filbert, cobnut, cob\nn07772935\tcoconut, cocoanut\nn07773428\tcoconut milk, coconut water\nn07774182\tgrugru nut\nn07774295\thickory nut\nn07774479\tcola extract\nn07774596\tmacadamia nut\nn07774719\tpecan\nn07774842\tpine nut, pignolia, pinon nut\nn07775050\tpistachio, pistachio nut\nn07775197\tsunflower seed\nn07783827\tanchovy paste\nn07785487\trollmops\nn07800091\tfeed, provender\nn07800487\tcattle cake\nn07800636\tcreep feed\nn07800740\tfodder\nn07801007\tfeed grain\nn07801091\teatage, forage, pasture, pasturage, grass\nn07801342\tsilage, ensilage\nn07801508\toil 
cake\nn07801709\toil meal\nn07801779\talfalfa\nn07801892\tbroad bean, horse bean\nn07802026\thay\nn07802152\ttimothy\nn07802246\tstover\nn07802417\tgrain, food grain, cereal\nn07802767\tgrist\nn07802863\tgroats\nn07802963\tmillet\nn07803093\tbarley, barleycorn\nn07803213\tpearl barley\nn07803310\tbuckwheat\nn07803408\tbulgur, bulghur, bulgur wheat\nn07803545\twheat, wheat berry\nn07803779\tcracked wheat\nn07803895\tstodge\nn07803992\twheat germ\nn07804152\toat\nn07804323\trice\nn07804543\tbrown rice\nn07804657\twhite rice, polished rice\nn07804771\twild rice, Indian rice\nn07804900\tpaddy\nn07805006\tslop, slops, swill, pigswill, pigwash\nn07805254\tmash\nn07805389\tchicken feed, scratch\nn07805478\tcud, rechewed food\nn07805594\tbird feed, bird food, birdseed\nn07805731\tpetfood, pet-food, pet food\nn07805966\tdog food\nn07806043\tcat food\nn07806120\tcanary seed\nn07806221\tsalad\nn07806633\ttossed salad\nn07806774\tgreen salad\nn07806879\tCaesar salad\nn07807002\tsalmagundi\nn07807171\tsalad nicoise\nn07807317\tcombination salad\nn07807472\tchef's salad\nn07807594\tpotato salad\nn07807710\tpasta salad\nn07807834\tmacaroni salad\nn07807922\tfruit salad\nn07808022\tWaldorf salad\nn07808166\tcrab Louis\nn07808268\therring salad\nn07808352\ttuna fish salad, tuna salad\nn07808479\tchicken salad\nn07808587\tcoleslaw, slaw\nn07808675\taspic\nn07808806\tmolded salad\nn07808904\ttabbouleh, tabooli\nn07809096\tingredient, fixings\nn07809368\tflavorer, flavourer, flavoring, flavouring, seasoner, seasoning\nn07810531\tbouillon cube\nn07810907\tcondiment\nn07811416\therb\nn07812046\tfines herbes\nn07812184\tspice\nn07812662\tspearmint oil\nn07812790\tlemon oil\nn07812913\twintergreen oil, oil of wintergreen\nn07813107\tsalt, table salt, common salt\nn07813324\tcelery salt\nn07813495\tonion salt\nn07813579\tseasoned salt\nn07813717\tsour salt\nn07813833\tfive spice powder\nn07814007\tallspice\nn07814203\tcinnamon\nn07814390\tstick cinnamon\nn07814487\tclove\nn07814634\tcumin, 
cumin seed\nn07814790\tfennel\nn07814925\tginger, gingerroot\nn07815163\tginger, powdered ginger\nn07815294\tmace\nn07815424\tnutmeg\nn07815588\tpepper, peppercorn\nn07815839\tblack pepper\nn07815956\twhite pepper\nn07816052\tsassafras\nn07816164\tbasil, sweet basil\nn07816296\tbay leaf\nn07816398\tborage\nn07816575\thyssop\nn07816726\tcaraway\nn07816839\tchervil\nn07817024\tchives\nn07817160\tcomfrey, healing herb\nn07817315\tcoriander, Chinese parsley, cilantro\nn07817465\tcoriander, coriander seed\nn07817599\tcostmary\nn07817758\tfennel, common fennel\nn07817871\tfennel, Florence fennel, finocchio\nn07818029\tfennel seed\nn07818133\tfenugreek, fenugreek seed\nn07818277\tgarlic, ail\nn07818422\tclove, garlic clove\nn07818572\tgarlic chive\nn07818689\tlemon balm\nn07818825\tlovage\nn07818995\tmarjoram, oregano\nn07819166\tmint\nn07819303\tmustard seed\nn07819480\tmustard, table mustard\nn07819682\tChinese mustard\nn07819769\tnasturtium\nn07819896\tparsley\nn07820036\tsalad burnet\nn07820145\trosemary\nn07820297\true\nn07820497\tsage\nn07820683\tclary sage\nn07820814\tsavory, savoury\nn07820960\tsummer savory, summer savoury\nn07821107\twinter savory, winter savoury\nn07821260\tsweet woodruff, waldmeister\nn07821404\tsweet cicely\nn07821610\ttarragon, estragon\nn07821758\tthyme\nn07821919\tturmeric\nn07822053\tcaper\nn07822197\tcatsup, ketchup, cetchup, tomato ketchup\nn07822323\tcardamom, cardamon, cardamum\nn07822518\tcayenne, cayenne pepper, red pepper\nn07822687\tchili powder\nn07822845\tchili sauce\nn07823105\tchutney, Indian relish\nn07823280\tsteak sauce\nn07823369\ttaco sauce\nn07823460\tsalsa\nn07823591\tmint sauce\nn07823698\tcranberry sauce\nn07823814\tcurry powder\nn07823951\tcurry\nn07824191\tlamb curry\nn07824268\tduck sauce, hoisin sauce\nn07824383\thorseradish\nn07824502\tmarinade\nn07824702\tpaprika\nn07824863\tSpanish paprika\nn07824988\tpickle\nn07825194\tdill pickle\nn07825399\tbread and butter pickle\nn07825496\tpickle 
relish\nn07825597\tpiccalilli\nn07825717\tsweet pickle\nn07825850\tapplesauce, apple sauce\nn07825972\tsoy sauce, soy\nn07826091\tTabasco, Tabasco sauce\nn07826250\ttomato paste\nn07826340\tangelica\nn07826453\tangelica\nn07826544\talmond extract\nn07826653\tanise, aniseed, anise seed\nn07826930\tChinese anise, star anise, star aniseed\nn07827130\tjuniper berries\nn07827284\tsaffron\nn07827410\tsesame seed, benniseed\nn07827554\tcaraway seed\nn07827750\tpoppy seed\nn07827896\tdill, dill weed\nn07828041\tdill seed\nn07828156\tcelery seed\nn07828275\tlemon extract\nn07828378\tmonosodium glutamate, MSG\nn07828642\tvanilla bean\nn07828987\tvinegar, acetum\nn07829248\tcider vinegar\nn07829331\twine vinegar\nn07829412\tsauce\nn07830493\tanchovy sauce\nn07830593\thot sauce\nn07830690\thard sauce\nn07830841\thorseradish sauce, sauce Albert\nn07830986\tbolognese pasta sauce\nn07831146\tcarbonara\nn07831267\ttomato sauce\nn07831450\ttartare sauce, tartar sauce\nn07831663\twine sauce\nn07831821\tmarchand de vin, mushroom wine sauce\nn07831955\tbread sauce\nn07832099\tplum sauce\nn07832202\tpeach sauce\nn07832307\tapricot sauce\nn07832416\tpesto\nn07832592\travigote, ravigotte\nn07832741\tremoulade sauce\nn07832902\tdressing, salad dressing\nn07833333\tsauce Louis\nn07833535\tbleu cheese dressing, blue cheese dressing\nn07833672\tblue cheese dressing, Roquefort dressing\nn07833816\tFrench dressing, vinaigrette, sauce vinaigrette\nn07833951\tLorenzo dressing\nn07834065\tanchovy dressing\nn07834160\tItalian dressing\nn07834286\thalf-and-half dressing\nn07834507\tmayonnaise, mayo\nn07834618\tgreen mayonnaise, sauce verte\nn07834774\taioli, aioli sauce, garlic sauce\nn07834872\tRussian dressing, Russian mayonnaise\nn07835051\tsalad cream\nn07835173\tThousand Island dressing\nn07835331\tbarbecue sauce\nn07835457\thollandaise\nn07835547\tbearnaise\nn07835701\tBercy, Bercy butter\nn07835823\tbordelaise\nn07835921\tbourguignon, bourguignon sauce, Burgundy sauce\nn07836077\tbrown 
sauce, sauce Espagnole\nn07836269\tEspagnole, sauce Espagnole\nn07836456\tChinese brown sauce, brown sauce\nn07836600\tblanc\nn07836731\tcheese sauce\nn07836838\tchocolate sauce, chocolate syrup\nn07837002\thot-fudge sauce, fudge sauce\nn07837110\tcocktail sauce, seafood sauce\nn07837234\tColbert, Colbert butter\nn07837362\twhite sauce, bechamel sauce, bechamel\nn07837545\tcream sauce\nn07837630\tMornay sauce\nn07837755\tdemiglace, demi-glaze\nn07837912\tgravy, pan gravy\nn07838073\tgravy\nn07838233\tspaghetti sauce, pasta sauce\nn07838441\tmarinara\nn07838551\tmole\nn07838659\thunter's sauce, sauce chausseur\nn07838811\tmushroom sauce\nn07838905\tmustard sauce\nn07839055\tNantua, shrimp sauce\nn07839172\tHungarian sauce, paprika sauce\nn07839312\tpepper sauce, Poivrade\nn07839478\troux\nn07839593\tSmitane\nn07839730\tSoubise, white onion sauce\nn07839864\tLyonnaise sauce, brown onion sauce\nn07840027\tveloute\nn07840124\tallemande, allemande sauce\nn07840219\tcaper sauce\nn07840304\tpoulette\nn07840395\tcurry sauce\nn07840520\tWorcester sauce, Worcestershire, Worcestershire sauce\nn07840672\tcoconut milk, coconut cream\nn07840804\tegg, eggs\nn07841037\tegg white, white, albumen, ovalbumin\nn07841345\tegg yolk, yolk\nn07841495\tboiled egg, coddled egg\nn07841639\thard-boiled egg, hard-cooked egg\nn07841800\tEaster egg\nn07841907\tEaster egg\nn07842044\tchocolate egg\nn07842130\tcandy egg\nn07842202\tpoached egg, dropped egg\nn07842308\tscrambled eggs\nn07842433\tdeviled egg, stuffed egg\nn07842605\tshirred egg, baked egg, egg en cocotte\nn07842753\tomelet, omelette\nn07842972\tfirm omelet\nn07843117\tFrench omelet\nn07843220\tfluffy omelet\nn07843348\twestern omelet\nn07843464\tsouffle\nn07843636\tfried egg\nn07843775\tdairy product\nn07844042\tmilk\nn07844604\tmilk\nn07844786\tsour milk\nn07844867\tsoya milk, soybean milk, soymilk\nn07845087\tformula\nn07845166\tpasteurized milk\nn07845335\tcows' milk\nn07845421\tyak's milk\nn07845495\tgoats' 
milk\nn07845571\tacidophilus milk\nn07845702\traw milk\nn07845775\tscalded milk\nn07845863\thomogenized milk\nn07846014\tcertified milk\nn07846143\tpowdered milk, dry milk, dried milk, milk powder\nn07846274\tnonfat dry milk\nn07846359\tevaporated milk\nn07846471\tcondensed milk\nn07846557\tskim milk, skimmed milk\nn07846688\tsemi-skimmed milk\nn07846802\twhole milk\nn07846938\tlow-fat milk\nn07847047\tbuttermilk\nn07847198\tcream\nn07847453\tclotted cream, Devonshire cream\nn07847585\tdouble creme, heavy whipping cream\nn07847706\thalf-and-half\nn07847827\theavy cream\nn07847917\tlight cream, coffee cream, single cream\nn07848093\tsour cream, soured cream\nn07848196\twhipping cream, light whipping cream\nn07848338\tbutter\nn07848771\tclarified butter, drawn butter\nn07848936\tghee\nn07849026\tbrown butter, beurre noisette\nn07849186\tMeuniere butter, lemon butter\nn07849336\tyogurt, yoghurt, yoghourt\nn07849506\tblueberry yogurt\nn07849619\traita\nn07849733\twhey\nn07849912\tcurd\nn07850083\tcurd\nn07850219\tclabber\nn07850329\tcheese\nn07851054\tparing\nn07851298\tcream cheese\nn07851443\tdouble cream\nn07851554\tmascarpone\nn07851641\ttriple cream, triple creme\nn07851767\tcottage cheese, pot cheese, farm cheese, farmer's cheese\nn07851926\tprocess cheese, processed cheese\nn07852045\tbleu, blue cheese\nn07852229\tStilton\nn07852302\tRoquefort\nn07852376\tgorgonzola\nn07852452\tDanish blue\nn07852532\tBavarian blue\nn07852614\tBrie\nn07852712\tbrick cheese\nn07852833\tCamembert\nn07852919\tcheddar, cheddar cheese, Armerican cheddar, American cheese\nn07853125\trat cheese, store cheese\nn07853232\tCheshire cheese\nn07853345\tdouble Gloucester\nn07853445\tEdam\nn07853560\tgoat cheese, chevre\nn07853648\tGouda, Gouda cheese\nn07853762\tgrated cheese\nn07853852\thand cheese\nn07853946\tLiederkranz\nn07854066\tLimburger\nn07854184\tmozzarella\nn07854266\tMuenster\nn07854348\tParmesan\nn07854455\tquark cheese, quark\nn07854614\tricotta\nn07854707\tstring 
cheese\nn07854813\tSwiss cheese\nn07854982\tEmmenthal, Emmental, Emmenthaler, Emmentaler\nn07855105\tGruyere\nn07855188\tsapsago\nn07855317\tVelveeta\nn07855413\tnut butter\nn07855510\tpeanut butter\nn07855603\tmarshmallow fluff\nn07855721\tonion butter\nn07855812\tpimento butter\nn07855907\tshrimp butter\nn07856045\tlobster butter\nn07856186\tyak butter\nn07856270\tspread, paste\nn07856756\tcheese spread\nn07856895\tanchovy butter\nn07856992\tfishpaste\nn07857076\tgarlic butter\nn07857170\tmiso\nn07857356\twasabi\nn07857598\tsnail butter\nn07857731\thummus, humus, hommos, hoummos, humous\nn07857959\tpate\nn07858114\tduck pate\nn07858197\tfoie gras, pate de foie gras\nn07858336\ttapenade\nn07858484\ttahini\nn07858595\tsweetening, sweetener\nn07858841\taspartame\nn07858978\thoney\nn07859142\tsaccharin\nn07859284\tsugar, refined sugar\nn07859583\tsyrup, sirup\nn07859796\tsugar syrup\nn07859951\tmolasses\nn07860103\tsorghum, sorghum molasses\nn07860208\ttreacle, golden syrup\nn07860331\tgrenadine\nn07860447\tmaple syrup\nn07860548\tcorn syrup\nn07860629\tmiraculous food, manna, manna from heaven\nn07860805\tbatter\nn07860988\tdough\nn07861158\tbread dough\nn07861247\tpancake batter\nn07861334\tfritter batter\nn07861557\tcoq au vin\nn07861681\tchicken provencale\nn07861813\tchicken and rice\nn07861983\tmoo goo gai pan\nn07862095\tarroz con pollo\nn07862244\tbacon and eggs\nn07862348\tbarbecued spareribs, spareribs\nn07862461\tbeef Bourguignonne, boeuf Bourguignonne\nn07862611\tbeef Wellington, filet de boeuf en croute\nn07862770\tbitok\nn07862946\tboiled dinner, New England boiled dinner\nn07863107\tBoston baked beans\nn07863229\tbubble and squeak\nn07863374\tpasta\nn07863547\tcannelloni\nn07863644\tcarbonnade flamande, Belgian beef stew\nn07863802\tcheese souffle\nn07863935\tchicken Marengo\nn07864065\tchicken cordon bleu\nn07864198\tMaryland chicken\nn07864317\tchicken paprika, chicken paprikash\nn07864475\tchicken 
Tetrazzini\nn07864638\tTetrazzini\nn07864756\tchicken Kiev\nn07864934\tchili, chili con carne\nn07865105\tchili dog\nn07865196\tchop suey\nn07865484\tchow mein\nn07865575\tcodfish ball, codfish cake\nn07865700\tcoquille\nn07865788\tcoquilles Saint-Jacques\nn07866015\tcroquette\nn07866151\tcottage pie\nn07866277\trissole\nn07866409\tdolmas, stuffed grape leaves\nn07866571\tegg foo yong, egg fu yung\nn07866723\tegg roll, spring roll\nn07866868\teggs Benedict\nn07867021\tenchilada\nn07867164\tfalafel, felafel\nn07867324\tfish and chips\nn07867421\tfondue, fondu\nn07867616\tcheese fondue\nn07867751\tchocolate fondue\nn07867883\tfondue, fondu\nn07868045\tbeef fondue, boeuf fondu bourguignon\nn07868200\tFrench toast\nn07868340\tfried rice, Chinese fried rice\nn07868508\tfrittata\nn07868684\tfrog legs\nn07868830\tgalantine\nn07868955\tgefilte fish, fish ball\nn07869111\thaggis\nn07869291\tham and eggs\nn07869391\thash\nn07869522\tcorned beef hash\nn07869611\tjambalaya\nn07869775\tkabob, kebab, shish kebab\nn07869937\tkedgeree\nn07870069\tsouvlaki, souvlakia\nn07870167\tlasagna, lasagne\nn07870313\tseafood Newburg\nn07870478\tlobster Newburg, lobster a la Newburg\nn07870620\tshrimp Newburg\nn07870734\tNewburg sauce\nn07870894\tlobster thermidor\nn07871065\tlutefisk, lutfisk\nn07871234\tmacaroni and cheese\nn07871335\tmacedoine\nn07871436\tmeatball\nn07871588\tporcupine ball, porcupines\nn07871720\tSwedish meatball\nn07871810\tmeat loaf, meatloaf\nn07872593\tmoussaka\nn07872748\tosso buco\nn07873057\tmarrow, bone marrow\nn07873198\tpheasant under glass\nn07873348\tpigs in blankets\nn07873464\tpilaf, pilaff, pilau, pilaw\nn07873679\tbulgur pilaf\nn07873807\tpizza, pizza pie\nn07874063\tsausage pizza\nn07874159\tpepperoni pizza\nn07874259\tcheese pizza\nn07874343\tanchovy pizza\nn07874441\tSicilian pizza\nn07874531\tpoi\nn07874674\tpork and beans\nn07874780\tporridge\nn07874995\toatmeal, burgoo\nn07875086\tloblolly\nn07875152\tpotpie\nn07875267\trijsttaffel, rijstaffel, 
rijstafel\nn07875436\trisotto, Italian rice\nn07875560\troulade\nn07875693\tfish loaf\nn07875835\tsalmon loaf\nn07875926\tSalisbury steak\nn07876026\tsauerbraten\nn07876189\tsauerkraut\nn07876281\tscallopine, scallopini\nn07876460\tveal scallopini\nn07876550\tscampi\nn07876651\tScotch egg\nn07876775\tScotch woodcock\nn07876893\tscrapple\nn07877187\tspaghetti and meatballs\nn07877299\tSpanish rice\nn07877675\tsteak tartare, tartar steak, cannibal mound\nn07877849\tpepper steak\nn07877961\tsteak au poivre, peppered steak, pepper steak\nn07878145\tbeef Stroganoff\nn07878283\tstuffed cabbage\nn07878479\tkishke, stuffed derma\nn07878647\tstuffed peppers\nn07878785\tstuffed tomato, hot stuffed tomato\nn07878926\tstuffed tomato, cold stuffed tomato\nn07879072\tsuccotash\nn07879174\tsukiyaki\nn07879350\tsashimi\nn07879450\tsushi\nn07879560\tSwiss steak\nn07879659\ttamale\nn07879821\ttamale pie\nn07879953\ttempura\nn07880080\tteriyaki\nn07880213\tterrine\nn07880325\tWelsh rarebit, Welsh rabbit, rarebit\nn07880458\tschnitzel, Wiener schnitzel\nn07880751\ttaco\nn07880880\tchicken taco\nn07880968\tburrito\nn07881117\tbeef burrito\nn07881205\tquesadilla\nn07881404\ttostada\nn07881525\tbean tostada\nn07881625\trefried beans, frijoles refritos\nn07881800\tbeverage, drink, drinkable, potable\nn07882420\twish-wash\nn07882497\tconcoction, mixture, intermixture\nn07882886\tmix, premix\nn07883031\tfilling\nn07883156\tlekvar\nn07883251\tpotion\nn07883384\telixir\nn07883510\telixir of life\nn07883661\tphilter, philtre, love-potion, love-philter, love-philtre\nn07884567\talcohol, alcoholic drink, alcoholic beverage, intoxicant, inebriant\nn07885705\tproof spirit\nn07886057\thome brew, homebrew\nn07886176\thooch, hootch\nn07886317\tkava, kavakava\nn07886463\taperitif\nn07886572\tbrew, brewage\nn07886849\tbeer\nn07887099\tdraft beer, draught beer\nn07887192\tsuds\nn07887304\tMunich beer, Munchener\nn07887461\tbock, bock beer\nn07887634\tlager, lager beer\nn07887967\tlight 
beer\nn07888058\tOktoberfest, Octoberfest\nn07888229\tPilsner, Pilsener\nn07888378\tshebeen\nn07888465\tWeissbier, white beer, wheat beer\nn07888816\tWeizenbock\nn07888909\tmalt\nn07889193\twort\nn07889274\tmalt, malt liquor\nn07889510\tale\nn07889814\tbitter\nn07889990\tBurton\nn07890068\tpale ale\nn07890226\tporter, porter's beer\nn07890352\tstout\nn07890540\tGuinness\nn07890617\tkvass\nn07890750\tmead\nn07890890\tmetheglin\nn07890970\thydromel\nn07891095\toenomel\nn07891189\tnear beer\nn07891309\tginger beer\nn07891433\tsake, saki, rice beer\nn07891726\twine, vino\nn07892418\tvintage\nn07892512\tred wine\nn07892813\twhite wine\nn07893253\tblush wine, pink wine, rose, rose wine\nn07893425\taltar wine, sacramental wine\nn07893528\tsparkling wine\nn07893642\tchampagne, bubbly\nn07893792\tcold duck\nn07893891\tBurgundy, Burgundy wine\nn07894102\tBeaujolais\nn07894298\tMedoc\nn07894451\tCanary wine\nn07894551\tChablis, white Burgundy\nn07894703\tMontrachet\nn07894799\tChardonnay, Pinot Chardonnay\nn07894965\tPinot noir\nn07895100\tPinot blanc\nn07895237\tBordeaux, Bordeaux wine\nn07895435\tclaret, red Bordeaux\nn07895595\tChianti\nn07895710\tCabernet, Cabernet Sauvignon\nn07895839\tMerlot\nn07895962\tSauvignon blanc\nn07896060\tCalifornia wine\nn07896165\tCotes de Provence\nn07896287\tdessert wine\nn07896422\tDubonnet\nn07896560\tjug wine\nn07896661\tmacon, maconnais\nn07896765\tMoselle\nn07896893\tMuscadet\nn07896994\tplonk\nn07897116\tretsina\nn07897200\tRhine wine, Rhenish, hock\nn07897438\tRiesling\nn07897600\tliebfraumilch\nn07897750\tRhone wine\nn07897865\tRioja\nn07897975\tsack\nn07898117\tSaint Emilion\nn07898247\tSoave\nn07898333\tzinfandel\nn07898443\tSauterne, Sauternes\nn07898617\tstraw wine\nn07898745\ttable wine\nn07898895\tTokay\nn07899003\tvin ordinaire\nn07899108\tvermouth\nn07899292\tsweet vermouth, Italian vermouth\nn07899434\tdry vermouth, French vermouth\nn07899533\tChenin 
blanc\nn07899660\tVerdicchio\nn07899769\tVouvray\nn07899899\tYquem\nn07899976\tgeneric, generic wine\nn07900225\tvarietal, varietal wine\nn07900406\tfortified wine\nn07900616\tMadeira\nn07900734\tmalmsey\nn07900825\tport, port wine\nn07900958\tsherry\nn07901355\tMarsala\nn07901457\tmuscat, muscatel, muscadel, muscadelle\nn07901587\tliquor, spirits, booze, hard drink, hard liquor, John Barleycorn, strong drink\nn07902121\tneutral spirits, ethyl alcohol\nn07902336\taqua vitae, ardent spirits\nn07902443\teau de vie\nn07902520\tmoonshine, bootleg, corn liquor\nn07902698\tbathtub gin\nn07902799\taquavit, akvavit\nn07902937\tarrack, arak\nn07903101\tbitters\nn07903208\tbrandy\nn07903543\tapplejack\nn07903643\tCalvados\nn07903731\tArmagnac\nn07903841\tCognac\nn07903962\tgrappa\nn07904072\tkirsch\nn07904293\tslivovitz\nn07904395\tgin\nn07904637\tsloe gin\nn07904760\tgeneva, Holland gin, Hollands\nn07904865\tgrog\nn07904934\touzo\nn07905038\trum\nn07905296\tdemerara, demerara rum\nn07905386\tJamaica rum\nn07905474\tschnapps, schnaps\nn07905618\tpulque\nn07905770\tmescal\nn07905979\ttequila\nn07906111\tvodka\nn07906284\twhiskey, whisky\nn07906572\tblended whiskey, blended whisky\nn07906718\tbourbon\nn07906877\tcorn whiskey, corn whisky, corn\nn07907037\tfirewater\nn07907161\tIrish, Irish whiskey, Irish whisky\nn07907342\tpoteen\nn07907429\trye, rye whiskey, rye whisky\nn07907548\tScotch, Scotch whiskey, Scotch whisky, malt whiskey, malt whisky, Scotch malt whiskey, Scotch malt whisky\nn07907831\tsour mash, sour mash whiskey\nn07907943\tliqueur, cordial\nn07908411\tabsinth, absinthe\nn07908567\tamaretto\nn07908647\tanisette, anisette de Bordeaux\nn07908812\tbenedictine\nn07908923\tChartreuse\nn07909129\tcoffee liqueur\nn07909231\tcreme de cacao\nn07909362\tcreme de menthe\nn07909504\tcreme de fraise\nn07909593\tDrambuie\nn07909714\tGalliano\nn07909811\torange liqueur\nn07909954\tcuracao, curacoa\nn07910048\ttriple sec\nn07910152\tGrand 
Marnier\nn07910245\tkummel\nn07910379\tmaraschino, maraschino liqueur\nn07910538\tpastis\nn07910656\tPernod\nn07910799\tpousse-cafe\nn07910970\tKahlua\nn07911061\tratafia, ratafee\nn07911249\tsambuca\nn07911371\tmixed drink\nn07911677\tcocktail\nn07912093\tDom Pedro\nn07912211\thighball\nn07913180\tmixer\nn07913300\tbishop\nn07913393\tBloody Mary\nn07913537\tVirgin Mary, bloody shame\nn07913644\tbullshot\nn07913774\tcobbler\nn07913882\tcollins, Tom Collins\nn07914006\tcooler\nn07914128\trefresher\nn07914271\tsmoothie\nn07914413\tdaiquiri, rum cocktail\nn07914586\tstrawberry daiquiri\nn07914686\tNADA daiquiri\nn07914777\tspritzer\nn07914887\tflip\nn07914995\tgimlet\nn07915094\tgin and tonic\nn07915213\tgrasshopper\nn07915366\tHarvey Wallbanger\nn07915491\tjulep, mint julep\nn07915618\tmanhattan\nn07915800\tRob Roy\nn07915918\tmargarita\nn07916041\tmartini\nn07916183\tgin and it\nn07916319\tvodka martini\nn07916437\told fashioned\nn07916582\tpink lady\nn07917133\tSazerac\nn07917272\tscrewdriver\nn07917392\tsidecar\nn07917507\tScotch and soda\nn07917618\tsling\nn07917791\tbrandy sling\nn07917874\tgin sling\nn07917951\trum sling\nn07918028\tsour\nn07918193\twhiskey sour, whisky sour\nn07918309\tstinger\nn07918706\tswizzle\nn07918879\thot toddy, toddy\nn07919165\tzombie, zombi\nn07919310\tfizz\nn07919441\tIrish coffee\nn07919572\tcafe au lait\nn07919665\tcafe noir, demitasse\nn07919787\tdecaffeinated coffee, decaf\nn07919894\tdrip coffee\nn07920052\tespresso\nn07920222\tcaffe latte, latte\nn07920349\tcappuccino, cappuccino coffee, coffee cappuccino\nn07920540\ticed coffee, ice coffee\nn07920663\tinstant coffee\nn07920872\tmocha, mocha coffee\nn07920989\tmocha\nn07921090\tcassareep\nn07921239\tTurkish coffee\nn07921360\tchocolate milk\nn07921455\tcider, cyder\nn07921615\thard cider\nn07921834\tscrumpy\nn07921948\tsweet cider\nn07922041\tmulled cider\nn07922147\tperry\nn07922512\trotgut\nn07922607\tslug\nn07922764\tcocoa, chocolate, hot chocolate, drinking 
chocolate\nn07922955\tcriollo\nn07923748\tjuice\nn07924033\tfruit juice, fruit crush\nn07924276\tnectar\nn07924366\tapple juice\nn07924443\tcranberry juice\nn07924560\tgrape juice\nn07924655\tmust\nn07924747\tgrapefruit juice\nn07924834\torange juice\nn07924955\tfrozen orange juice, orange-juice concentrate\nn07925116\tpineapple juice\nn07925229\tlemon juice\nn07925327\tlime juice\nn07925423\tpapaya juice\nn07925500\ttomato juice\nn07925608\tcarrot juice\nn07925708\tV-8 juice\nn07925808\tkoumiss, kumis\nn07925966\tfruit drink, ade\nn07926250\tlemonade\nn07926346\tlimeade\nn07926442\torangeade\nn07926540\tmalted milk\nn07926785\tmate\nn07926920\tmulled wine\nn07927070\tnegus\nn07927197\tsoft drink\nn07927512\tpop, soda, soda pop, soda water, tonic\nn07927716\tbirch beer\nn07927836\tbitter lemon\nn07927931\tcola, dope\nn07928163\tcream soda\nn07928264\tegg cream\nn07928367\tginger ale, ginger pop\nn07928488\torange soda\nn07928578\tphosphate\nn07928696\tCoca Cola, Coke\nn07928790\tPepsi, Pepsi Cola\nn07928887\troot beer\nn07928998\tsarsaparilla\nn07929172\ttonic, tonic water, quinine water\nn07929351\tcoffee bean, coffee berry, coffee\nn07929519\tcoffee, java\nn07929940\tcafe royale, coffee royal\nn07930062\tfruit punch\nn07930205\tmilk punch\nn07930315\tmimosa, buck's fizz\nn07930433\tpina colada\nn07930554\tpunch\nn07930864\tcup\nn07931001\tchampagne cup\nn07931096\tclaret cup\nn07931280\twassail\nn07931452\tplanter's punch\nn07931612\tWhite Russian\nn07931733\tfish house punch\nn07931870\tMay wine\nn07932039\teggnog\nn07932323\tcassiri\nn07932454\tspruce beer\nn07932614\trickey\nn07932762\tgin rickey\nn07932841\ttea, tea leaf\nn07933154\ttea bag\nn07933274\ttea\nn07933530\ttea-like drink\nn07933652\tcambric tea\nn07933799\tcuppa, cupper\nn07933891\therb tea, herbal tea, herbal\nn07934032\ttisane\nn07934152\tcamomile tea\nn07934282\tice tea, iced tea\nn07934373\tsun tea\nn07934530\tblack tea\nn07934678\tcongou, congo, congou tea, English breakfast 
tea\nn07934800\tDarjeeling\nn07934908\torange pekoe, pekoe\nn07935043\tsouchong, soochong\nn07935152\tgreen tea\nn07935288\thyson\nn07935379\toolong\nn07935504\twater\nn07935737\tbottled water\nn07935878\tbranch water\nn07936015\tspring water\nn07936093\tsugar water\nn07936263\tdrinking water\nn07936459\tice water\nn07936548\tsoda water, carbonated water, club soda, seltzer, sparkling water\nn07936745\tmineral water\nn07936979\tseltzer\nn07937069\tVichy water\nn07937344\tperishable, spoilable\nn07937461\tcouscous\nn07937621\tramekin, ramequin\nn07938007\tmultivitamin, multivitamin pill\nn07938149\tvitamin pill\nn07938313\tsoul food\nn07938594\tmold, mould\nn07942152\tpeople\nn07951464\tcollection, aggregation, accumulation, assemblage\nn07954211\tbook, rule book\nn07977870\tlibrary\nn08079613\tbaseball club, ball club, club, nine\nn08182379\tcrowd\nn08238463\tclass, form, grade, course\nn08242223\tcore, nucleus, core group\nn08249459\tconcert band, military band\nn08253141\tdance\nn08256735\twedding, wedding party\nn08376250\tchain, concatenation\nn08385989\tpower breakfast\nn08492354\taerie, aery, eyrie, eyry\nn08492461\tagora\nn08494231\tamusement park, funfair, pleasure ground\nn08495908\taphelion\nn08496334\tapron\nn08500819\tinterplanetary space\nn08500989\tinterstellar space\nn08501887\tintergalactic space\nn08505018\tbush\nn08506347\tsemidesert\nn08511017\tbeam-ends\nn08517010\tbridgehead\nn08517676\tbus stop\nn08518171\tcampsite, campground, camping site, camping ground, bivouac, encampment, camping area\nn08519299\tdetention basin\nn08521623\tcemetery, graveyard, burial site, burial ground, burying ground, memorial park, necropolis\nn08523340\ttrichion, crinion\nn08524735\tcity, metropolis, urban center\nn08539072\tbusiness district, downtown\nn08539276\toutskirts\nn08540532\tborough\nn08547468\tcow pasture\nn08547544\tcrest\nn08551296\teparchy, exarchate\nn08554440\tsuburb, suburbia, suburban area\nn08555333\tstockbroker belt\nn08555710\tcrawlspace, crawl 
space\nn08558770\tsheikdom, sheikhdom\nn08558963\tresidence, abode\nn08559155\tdomicile, legal residence\nn08560295\tdude ranch\nn08569482\tfarmland, farming area\nn08571275\tmidfield\nn08571642\tfirebreak, fireguard\nn08571898\tflea market\nn08573674\tbattlefront, front, front line\nn08573842\tgarbage heap, junk heap, rubbish heap, scrapheap, trash heap, junk pile, trash pile, refuse heap\nn08578517\tbenthos, benthic division, benthonic zone\nn08579266\tgoldfield\nn08579352\tgrainfield, grain field\nn08580944\thalf-mast, half-staff\nn08583292\themline\nn08583455\theronry\nn08583554\thipline\nn08583682\thipline\nn08584914\thole-in-the-wall\nn08586978\tjunkyard\nn08589670\tisoclinic line, isoclinal\nn08596076\tlittoral, litoral, littoral zone, sands\nn08597579\tmagnetic pole\nn08598301\tgrassland\nn08598568\tmecca\nn08599174\tobserver's meridian\nn08599292\tprime meridian\nn08611339\tnombril\nn08611421\tno-parking zone\nn08613733\toutdoors, out-of-doors, open air, open\nn08614632\tfairground\nn08616050\tpasture, pastureland, grazing land, lea, ley\nn08618831\tperihelion\nn08619112\tperiselene, perilune\nn08623676\tlocus of infection\nn08628141\tkasbah, casbah\nn08633683\twaterfront\nn08640531\tresort, resort hotel, holiday resort\nn08640739\tresort area, playground, vacation spot\nn08640962\trough\nn08643267\tashram\nn08644045\tharborage, harbourage\nn08645104\tscrubland\nn08645212\tweald\nn08645318\twold\nn08647264\tschoolyard\nn08648917\tshowplace\nn08649711\tbedside\nn08651104\tsideline, out of bounds\nn08652376\tski resort\nn08658309\tsoil horizon\nn08658918\tgeological horizon\nn08659242\tcoal seam\nn08659331\tcoalface\nn08659446\tfield\nn08659861\toilfield\nn08661878\tTemperate Zone\nn08662427\tterreplein\nn08663051\tthree-mile limit\nn08663703\tdesktop\nn08663860\ttop\nn08673039\tkampong, campong\nn08674344\tsubtropics, semitropics\nn08676253\tbarrio\nn08677424\tveld, veldt\nn08677801\tvertex, peak, apex, acme\nn08678783\twaterline, water line, water 
level\nn08679167\thigh-water mark\nn08679269\tlow-water mark\nn08679562\tcontinental divide\nn08685188\tzodiac\nn08782627\tAegean island\nn08896327\tsultanate\nn09032191\tSwiss canton\nn09186592\tabyssal zone\nn09189157\taerie, aery, eyrie, eyry\nn09191635\tair bubble\nn09193551\talluvial flat, alluvial plain\nn09193705\talp\nn09194227\tAlpine glacier, Alpine type of glacier\nn09199101\tanthill, formicary\nn09201998\taquifer\nn09203827\tarchipelago\nn09205509\tarete\nn09206896\tarroyo\nn09206985\tascent, acclivity, rise, raise, climb, upgrade\nn09208496\tasterism\nn09209025\tasthenosphere\nn09210862\tatoll\nn09213434\tbank\nn09213565\tbank\nn09214060\tbar\nn09214269\tbarbecue pit\nn09214916\tbarrier reef\nn09215023\tbaryon, heavy particle\nn09215437\tbasin\nn09217230\tbeach\nn09218315\thoneycomb\nn09218494\tbelay\nn09218641\tben\nn09219233\tberm\nn09223487\tbladder stone, cystolith\nn09224725\tbluff\nn09226869\tborrow pit\nn09228055\tbrae\nn09229709\tbubble\nn09230041\tburrow, tunnel\nn09230202\tbutte\nn09231117\tcaldera\nn09233446\tcanyon, canon\nn09233603\tcanyonside\nn09238926\tcave\nn09239302\tcavern\nn09242389\tchasm\nn09245515\tcirque, corrie, cwm\nn09246464\tcliff, drop, drop-off\nn09247410\tcloud\nn09248153\tcoast\nn09248399\tcoastland\nn09249034\tcol, gap\nn09249155\tcollector\nn09251407\tcomet\nn09255070\tcontinental glacier\nn09256479\tcoral reef\nn09257843\tcove\nn09259025\tcrag\nn09259219\tcrater\nn09260907\tcultivated land, farmland, plowland, ploughland, tilled land, tillage, tilth\nn09262690\tdale\nn09263912\tdefile, gorge\nn09264803\tdelta\nn09265620\tdescent, declivity, fall, decline, declination, declension, downslope\nn09266604\tdiapir\nn09267854\tdivot\nn09268007\tdivot\nn09269341\tdown\nn09269472\tdownhill\nn09269882\tdraw\nn09270160\tdrey\nn09270657\tdrumlin\nn09270735\tdune, sand dune\nn09274152\tescarpment, scarp\nn09274305\tesker\nn09279986\tfireball\nn09281252\tflare star\nn09282208\tfloor\nn09283193\tfomite, 
vehicle\nn09283405\tfoothill\nn09283514\tfootwall\nn09283767\tforeland\nn09283866\tforeshore\nn09287415\tgauge boson\nn09287968\tgeological formation, formation\nn09288635\tgeyser\nn09289331\tglacier\nn09289596\tglen\nn09290350\tgopher hole\nn09290444\tgorge\nn09294877\tgrotto, grot\nn09295210\tgrowler\nn09295946\tgulch, flume\nn09300306\tgully\nn09300905\thail\nn09302616\thighland, upland\nn09303008\thill\nn09303528\thillside\nn09304750\thole, hollow\nn09305031\thollow, holler\nn09305898\thot spring, thermal spring\nn09308572\ticeberg, berg\nn09308743\ticecap, ice cap\nn09309046\tice field\nn09309168\tice floe, floe\nn09309292\tice mass\nn09310616\tinclined fault\nn09315159\tion\nn09319604\tisthmus\nn09325824\tkidney stone, urinary calculus, nephrolith, renal calculus\nn09326662\tknoll, mound, hillock, hummock, hammock\nn09327077\tkopje, koppie\nn09327538\tKuiper belt, Edgeworth-Kuiper belt\nn09330378\tlake bed, lake bottom\nn09331251\tlakefront\nn09332890\tlakeside, lakeshore\nn09335693\tlandfall\nn09335809\tlandfill\nn09336555\tlather\nn09337048\tleak\nn09337253\tledge, shelf\nn09338013\tlepton\nn09339810\tlithosphere, geosphere\nn09344198\tlowland\nn09344324\tlunar crater\nn09344724\tmaar\nn09348460\tmassif\nn09349648\tmeander\nn09351905\tmesa, table\nn09352849\tmeteorite\nn09353815\tmicrofossil\nn09354511\tmidstream\nn09357346\tmolehill\nn09357447\tmonocline\nn09359803\tmountain, mount\nn09361517\tmountainside, versant\nn09362316\tmouth\nn09362945\tmull\nn09366017\tnatural depression, depression\nn09366317\tnatural elevation, elevation\nn09375606\tnullah\nn09376198\tocean\nn09376526\tocean floor, sea floor, ocean bottom, seabed, sea bottom, Davy Jones's locker, Davy Jones\nn09376786\toceanfront\nn09381242\toutcrop, outcropping, rock outcrop\nn09382099\toxbow\nn09384106\tpallasite\nn09389867\tperforation\nn09391386\tphotosphere\nn09391644\tpiedmont\nn09391774\tPiedmont glacier, Piedmont type of glacier\nn09392402\tpinetum\nn09393524\tplage\nn09393605\tplain, 
field, champaign\nn09396465\tpoint\nn09396608\tpolar glacier\nn09398076\tpothole, chuckhole\nn09398677\tprecipice\nn09399592\tpromontory, headland, head, foreland\nn09400584\tptyalith\nn09400987\tpulsar\nn09402944\tquicksand\nn09403086\trabbit burrow, rabbit hole\nn09403211\tradiator\nn09403427\trainbow\nn09403734\trange, mountain range, range of mountains, chain, mountain chain, chain of mountains\nn09405078\trangeland\nn09405787\travine\nn09406793\treef\nn09409512\tridge\nn09409752\tridge, ridgeline\nn09410224\trift valley\nn09411189\triparian forest\nn09411295\tripple mark\nn09415584\triverbank, riverside\nn09415671\triverbed, river bottom\nn09416076\trock, stone\nn09416890\troof\nn09421031\tsaltpan\nn09421799\tsandbank\nn09421951\tsandbar, sand bar\nn09422190\tsandpit\nn09422631\tsanitary landfill\nn09425019\tsawpit\nn09425344\tscablands\nn09428293\tseashore, coast, seacoast, sea-coast\nn09428628\tseaside, seaboard\nn09429630\tseif dune\nn09432283\tshell\nn09432990\tshiner\nn09433312\tshoal\nn09433442\tshore\nn09433839\tshoreline\nn09435739\tsinkhole, sink, swallow hole\nn09436444\tski slope\nn09436708\tsky\nn09437454\tslope, incline, side\nn09438844\tsnowcap\nn09438940\tsnowdrift\nn09439032\tsnowfield\nn09439213\tsoapsuds, suds, lather\nn09442595\tspit, tongue\nn09443281\tspoor\nn09443641\tspume\nn09444783\tstar\nn09445008\tsteep\nn09445289\tsteppe\nn09447666\tstrand\nn09448690\tstreambed, creek bed\nn09450163\tsun, Sun\nn09451237\tsupernova\nn09452291\tswale\nn09452395\tswamp, swampland\nn09452760\tswell\nn09453008\ttableland, plateau\nn09454153\ttalus, scree\nn09454412\ttangle\nn09454744\ttar pit\nn09456207\tterrace, bench\nn09457979\ttidal basin\nn09458269\ttideland\nn09459979\ttor\nn09460046\ttor\nn09461069\tTrapezium\nn09462600\ttroposphere\nn09463226\ttundra\nn09464486\ttwinkler\nn09466678\tuphill\nn09467696\turolith\nn09468604\tvalley, vale\nn09470027\tvehicle-borne transmission\nn09470222\tvein, mineral vein\nn09472413\tvolcanic crater, 
crater\nn09472597\tvolcano\nn09474010\twadi\nn09474412\twall\nn09474765\twarren, rabbit warren\nn09475044\twasp's nest, wasps' nest, hornet's nest, hornets' nest\nn09475179\twatercourse\nn09475925\twaterside\nn09476123\twater table, water level, groundwater level\nn09478210\twhinstone, whin\nn09480959\twormcast\nn09481120\txenolith\nn09493983\tCirce\nn09495962\tgryphon, griffin, griffon\nn09505153\tspiritual leader\nn09537660\tmessiah, christ\nn09556121\tRhea Silvia, Rea Silvia\nn09605110\tnumber one\nn09606009\tadventurer, venturer\nn09606527\tanomaly, unusual person\nn09607630\tappointee, appointment\nn09607782\targonaut\nn09607903\tAshkenazi\nn09608709\tbenefactor, helper\nn09610255\tcolor-blind person\nn09610405\tcommoner, common man, common person\nn09611722\tconservator\nn09612700\tcontrarian\nn09613118\tcontadino\nn09613191\tcontestant\nn09613690\tcosigner, cosignatory\nn09615336\tdiscussant\nn09616573\tenologist, oenologist, fermentologist\nn09616922\tentertainer\nn09617161\teulogist, panegyrist\nn09617435\tex-gambler\nn09617577\texperimenter\nn09617696\texperimenter\nn09618760\texponent\nn09618880\tex-president\nn09618957\tface\nn09619168\tfemale, female person\nn09619452\tfinisher\nn09620078\tinhabitant, habitant, dweller, denizen, indweller\nn09620794\tnative, indigen, indigene, aborigine, aboriginal\nn09621232\tnative\nn09622049\tjuvenile, juvenile person\nn09622302\tlover\nn09624168\tmale, male person\nn09624559\tmediator, go-between, intermediator, intermediary, intercessor\nn09624899\tmediatrix\nn09625401\tnational, subject\nn09626238\tpeer, equal, match, compeer\nn09627807\tprize winner, lottery winner\nn09627906\trecipient, receiver\nn09629065\treligionist\nn09629246\tsensualist\nn09629752\ttraveler, traveller\nn09631129\tunwelcome person, persona non grata\nn09632274\tunskilled person\nn09632518\tworker\nn09633969\twrongdoer, offender\nn09635534\tBlack African\nn09635635\tAfrikaner, Afrikander, Boer\nn09635973\tAryan\nn09636339\tBlack, Black 
person, blackamoor, Negro, Negroid\nn09637339\tBlack woman\nn09638454\tmulatto\nn09638875\tWhite, White person, Caucasian\nn09639382\tCircassian\nn09639919\tSemite\nn09640327\tChaldean, Chaldaean, Chaldee\nn09640715\tElamite\nn09641002\twhite man\nn09641578\tWASP, white Anglo-Saxon Protestant\nn09643799\tgook, slant-eye\nn09644152\tMongol, Mongolian\nn09644657\tTatar, Tartar, Mongol Tatar\nn09648743\tNahuatl\nn09648911\tAztec\nn09649067\tOlmec\nn09650729\tBiloxi\nn09650839\tBlackfoot\nn09650989\tBrule\nn09651123\tCaddo\nn09651968\tCheyenne\nn09652149\tChickasaw\nn09653144\tCocopa, Cocopah\nn09653438\tComanche\nn09654079\tCreek\nn09654518\tDelaware\nn09654898\tDiegueno\nn09655213\tEsselen\nn09655466\tEyeish\nn09656077\tHavasupai\nn09657206\tHunkpapa\nn09657748\tIowa, Ioway\nn09658254\tKalapooia, Kalapuya, Calapooya, Calapuya\nn09658398\tKamia\nn09658815\tKekchi\nn09658921\tKichai\nn09659039\tKickapoo\nn09659188\tKiliwa, Kiliwi\nn09660010\tMalecite\nn09660240\tMaricopa\nn09661873\tMohican, Mahican\nn09662038\tMuskhogean, Muskogean\nn09662661\tNavaho, Navajo\nn09662951\tNootka\nn09663248\tOglala, Ogalala\nn09663786\tOsage\nn09663999\tOneida\nn09664556\tPaiute, Piute\nn09664908\tPassamaquody\nn09665367\tPenobscot\nn09665545\tPenutian\nn09666349\tPotawatomi\nn09666476\tPowhatan\nn09666883\tkachina\nn09667358\tSalish\nn09668199\tShahaptian, Sahaptin, Sahaptino\nn09668437\tShasta\nn09668562\tShawnee\nn09668988\tSihasapa\nn09669631\tTeton, Lakota, Teton Sioux, Teton Dakota\nn09670280\tTaracahitian\nn09670521\tTarahumara\nn09670909\tTuscarora\nn09671089\tTutelo\nn09672590\tYana\nn09672725\tYavapai\nn09672840\tYokuts\nn09673091\tYuma\nn09674412\tGadaba\nn09674786\tKolam\nn09675045\tKui\nn09675673\tToda\nn09675799\tTulu\nn09675922\tGujarati, Gujerati\nn09676021\tKashmiri\nn09676247\tPunjabi, Panjabi\nn09676884\tSlav\nn09677427\tAnabaptist\nn09678747\tAdventist, Second Adventist\nn09679028\tgentile, non-Jew, goy\nn09679170\tgentile\nn09679925\tCatholic\nn09680908\tOld 
Catholic\nn09681107\tUniat, Uniate, Uniate Christian\nn09681234\tCopt\nn09681973\tJewess\nn09683180\tJihadist\nn09683757\tBuddhist\nn09683924\tZen Buddhist\nn09684082\tMahayanist\nn09684901\tswami\nn09685233\tHare Krishna\nn09685806\tShintoist\nn09686262\tEurafrican\nn09686401\tEurasian\nn09688233\tGael\nn09688804\tFrank\nn09689435\tAfghan, Afghanistani\nn09689958\tAlbanian\nn09690083\tAlgerian\nn09690208\tAltaic\nn09690496\tAndorran\nn09690621\tAngolan\nn09690864\tAnguillan\nn09691604\tAustrian\nn09691729\tBahamian\nn09691858\tBahraini, Bahreini\nn09692125\tBasotho\nn09692915\tHerero\nn09693244\tLuba, Chiluba\nn09693982\tBarbadian\nn09694664\tBolivian\nn09694771\tBornean\nn09695019\tCarioca\nn09695132\tTupi\nn09695514\tBruneian\nn09695620\tBulgarian\nn09695979\tByelorussian, Belorussian, White Russian\nn09696456\tCameroonian\nn09696585\tCanadian\nn09696763\tFrench Canadian\nn09697401\tCentral American\nn09697986\tChilean\nn09698644\tCongolese\nn09699020\tCypriot, Cypriote, Cyprian\nn09699642\tDane\nn09700125\tDjiboutian\nn09700964\tBritisher, Briton, Brit\nn09701148\tEnglish person\nn09701833\tEnglishwoman\nn09702134\tAnglo-Saxon\nn09702673\tAngle\nn09703101\tWest Saxon\nn09703344\tLombard, Langobard\nn09703485\tlimey, John Bull\nn09703708\tCantabrigian\nn09703809\tCornishman\nn09703932\tCornishwoman\nn09704057\tLancastrian\nn09704157\tLancastrian\nn09704283\tGeordie\nn09705003\tOxonian\nn09705124\tEthiopian\nn09705671\tAmhara\nn09705784\tEritrean\nn09706029\tFinn\nn09706255\tKomi\nn09707061\tLivonian\nn09707289\tLithuanian\nn09707735\tSelkup, Ostyak-Samoyed\nn09708750\tParisian\nn09708889\tParisienne\nn09709531\tCreole\nn09709673\tCreole\nn09710041\tGabonese\nn09710164\tGreek, Hellene\nn09710886\tDorian\nn09711132\tAthenian\nn09711435\tLaconian\nn09712324\tGuyanese\nn09712448\tHaitian\nn09712696\tMalay, Malayan\nn09712967\tMoro\nn09713108\tNetherlander, Dutchman, Hollander\nn09714120\tIcelander\nn09714694\tIraqi, 
Iraki\nn09715165\tIrishman\nn09715303\tIrishwoman\nn09715427\tDubliner\nn09716047\tItalian\nn09716933\tRoman\nn09717233\tSabine\nn09718217\tJapanese, Nipponese\nn09718811\tJordanian\nn09718936\tKorean\nn09719309\tKenyan\nn09719794\tLao, Laotian\nn09720033\tLapp, Lapplander, Sami, Saami, Same, Saame\nn09720256\tLatin American, Latino\nn09720595\tLebanese\nn09720702\tLevantine\nn09720842\tLiberian\nn09721244\tLuxemburger, Luxembourger\nn09721444\tMacedonian\nn09722064\tSabahan\nn09722658\tMexican\nn09722817\tChicano\nn09723067\tMexican-American, Mexicano\nn09723819\tNamibian\nn09723944\tNauruan\nn09724234\tGurkha\nn09724533\tNew Zealander, Kiwi\nn09724656\tNicaraguan\nn09724785\tNigerian\nn09725000\tHausa, Haussa\nn09725229\tNorth American\nn09725546\tNova Scotian, bluenose\nn09725653\tOmani\nn09725772\tPakistani\nn09725935\tBrahui\nn09726621\tSouth American Indian\nn09726811\tCarib, Carib Indian\nn09727440\tFilipino\nn09727826\tPolynesian\nn09728137\tQatari, Katari\nn09728285\tRomanian, Rumanian\nn09729062\tMuscovite\nn09729156\tGeorgian\nn09730077\tSarawakian\nn09730204\tScandinavian, Norse, Northman\nn09730824\tSenegalese\nn09731343\tSlovene\nn09731436\tSouth African\nn09731571\tSouth American\nn09732170\tSudanese\nn09733459\tSyrian\nn09733793\tTahitian\nn09734185\tTanzanian\nn09734450\tTibetan\nn09734535\tTogolese\nn09734639\tTuareg\nn09735258\tTurki\nn09735654\tChuvash\nn09736485\tTurkoman, Turkmen, Turcoman\nn09736798\tUzbek, Uzbeg, Uzbak, Usbek, Usbeg\nn09736945\tUgandan\nn09737050\tUkranian\nn09737161\tYakut\nn09737453\tTungus, Evenk\nn09738121\tIgbo\nn09738400\tAmerican\nn09740724\tAnglo-American\nn09741074\tAlaska Native, Alaskan Native, Native Alaskan\nn09741331\tArkansan, Arkansawyer\nn09741722\tCarolinian\nn09741816\tColoradan\nn09741904\tConnecticuter\nn09741999\tDelawarean, Delawarian\nn09742101\tFloridian\nn09742315\tGerman American\nn09742927\tIllinoisan\nn09743487\tMainer, Down Easter\nn09743601\tMarylander\nn09743792\tMinnesotan, 
Gopher\nn09744161\tNebraskan, Cornhusker\nn09744346\tNew Hampshirite, Granite Stater\nn09744462\tNew Jerseyan, New Jerseyite, Garden Stater\nn09744679\tNew Yorker\nn09744834\tNorth Carolinian, Tarheel\nn09745229\tOregonian, Beaver\nn09745324\tPennsylvanian, Keystone Stater\nn09745834\tTexan\nn09745933\tUtahan\nn09746936\tUruguayan\nn09747191\tVietnamese, Annamese\nn09747495\tGambian\nn09748101\tEast German\nn09748408\tBerliner\nn09748648\tPrussian\nn09748889\tGhanian\nn09749386\tGuinean\nn09750282\tPapuan\nn09750641\tWalloon\nn09750770\tYemeni\nn09750891\tYugoslav, Jugoslav, Yugoslavian, Jugoslavian\nn09751076\tSerbian, Serb\nn09751496\tXhosa\nn09751622\tZairese, Zairean\nn09751895\tZimbabwean\nn09752023\tZulu\nn09752519\tGemini, Twin\nn09753348\tSagittarius, Archer\nn09753792\tPisces, Fish\nn09754152\tabbe\nn09754217\tabbess, mother superior, prioress\nn09754633\tabnegator\nn09754907\tabridger, abbreviator\nn09755086\tabstractor, abstracter\nn09755241\tabsconder\nn09755555\tabsolver\nn09755788\tabecedarian\nn09755893\taberrant\nn09756049\tabettor, abetter\nn09756195\tabhorrer\nn09756961\tabomination\nn09757449\tabseiler, rappeller\nn09758173\tabstainer, ascetic\nn09758885\tacademic administrator\nn09759501\tacademician\nn09760290\taccessory before the fact\nn09760609\tcompanion\nn09760913\taccompanist, accompanyist\nn09761068\taccomplice, confederate\nn09761753\taccount executive, account representative, registered representative, customer's broker, customer's man\nn09762011\taccused\nn09762385\taccuser\nn09763272\tacid head\nn09763784\tacquaintance, friend\nn09764201\tacquirer\nn09764598\taerialist\nn09764732\taction officer\nn09764900\tactive\nn09765118\tactive citizen\nn09765278\tactor, histrion, player, thespian, role player\nn09767197\tactor, doer, worker\nn09769076\taddict, nut, freak, junkie, junky\nn09769525\tadducer\nn09769929\tadjuster, adjustor, claims adjuster, claims adjustor, claim agent\nn09770179\tadjutant, aide, aide-de-camp\nn09770359\tadjutant 
general\nn09771435\tadmirer, adorer\nn09772330\tadoptee\nn09772746\tadulterer, fornicator\nn09772930\tadulteress, fornicatress, hussy, jade, loose woman, slut, strumpet, trollop\nn09773962\tadvertiser, advertizer, adman\nn09774167\tadvisee\nn09774783\tadvocate, advocator, proponent, exponent\nn09775907\taeronautical engineer\nn09776346\taffiliate\nn09776642\taffluent\nn09776807\taficionado\nn09777870\tbuck sergeant\nn09778266\tagent-in-place\nn09778537\taggravator, annoyance\nn09778783\tagitator, fomenter\nn09778927\tagnostic\nn09779124\tagnostic, doubter\nn09779280\tagonist\nn09779461\tagony aunt\nn09779790\tagriculturist, agriculturalist, cultivator, grower, raiser\nn09780395\tair attache\nn09780828\tair force officer, commander\nn09780984\tairhead\nn09781398\tair traveler, air traveller\nn09781504\talarmist\nn09781650\talbino\nn09782167\talcoholic, alky, dipsomaniac, boozer, lush, soaker, souse\nn09782397\talderman\nn09782855\talexic\nn09783537\talienee, grantee\nn09783776\talienor\nn09783884\taliterate, aliterate person\nn09784043\talgebraist\nn09784160\tallegorizer, allegoriser\nn09784564\talliterator\nn09785236\talmoner, medical social worker\nn09785659\talpinist\nn09785891\taltar boy\nn09786115\talto\nn09787534\tambassador, embassador\nn09787765\tambassador\nn09788073\tambusher\nn09788237\tamicus curiae, friend of the court\nn09789150\tamoralist\nn09789566\tamputee\nn09789898\tanalogist\nn09790047\tanalphabet, analphabetic\nn09790482\tanalyst\nn09791014\tindustry analyst\nn09791419\tmarket strategist\nn09791816\tanarchist, nihilist, syndicalist\nn09792125\tanathema, bete noire\nn09792555\tancestor, ascendant, ascendent, antecedent, root\nn09792969\tanchor, anchorman, anchorperson\nn09793141\tancient\nn09793352\tanecdotist, raconteur\nn09793946\tangler, troller\nn09794550\tanimator\nn09794668\tanimist\nn09795010\tannotator\nn09795124\tannouncer\nn09795334\tannouncer\nn09796809\tanti\nn09796974\tanti-American\nn09797742\tanti-Semite, 
Jew-baiter\nn09797873\tAnzac\nn09797998\tape-man\nn09798096\taphakic\nn09800469\tappellant, plaintiff in error\nn09800964\tappointee\nn09801102\tapprehender\nn09801275\tApril fool\nn09801533\taspirant, aspirer, hopeful, wannabe, wannabee\nn09802445\tappreciator\nn09802641\tappropriator\nn09802951\tArabist\nn09804230\tarchaist\nn09805151\tarchbishop\nn09805324\tarcher, bowman\nn09805475\tarchitect, designer\nn09806944\tarchivist\nn09807075\tarchpriest, hierarch, high priest, prelate, primate\nn09808080\tAristotelian, Aristotelean, Peripatetic\nn09808591\tarmiger\nn09809279\tarmy attache\nn09809538\tarmy engineer, military engineer\nn09809749\tarmy officer\nn09809925\tarranger, adapter, transcriber\nn09810166\tarrival, arriver, comer\nn09811568\tarthritic\nn09811712\tarticulator\nn09811852\tartilleryman, cannoneer, gunner, machine gunner\nn09813219\tartist's model, sitter\nn09814252\tassayer\nn09814381\tassemblyman\nn09814488\tassemblywoman\nn09814567\tassenter\nn09814660\tasserter, declarer, affirmer, asseverator, avower\nn09815455\tassignee\nn09815790\tassistant, helper, help, supporter\nn09816654\tassistant professor\nn09816771\tassociate\nn09817174\tassociate\nn09817386\tassociate professor\nn09818022\tastronaut, spaceman, cosmonaut\nn09819477\tcosmographer, cosmographist\nn09820044\tatheist\nn09820263\tathlete, jock\nn09821831\tattendant, attender, tender\nn09822830\tattorney general\nn09823153\tauditor\nn09823287\taugur, auspex\nn09823502\taunt, auntie, aunty\nn09823832\tau pair girl\nn09824135\tauthoritarian, dictator\nn09824609\tauthority\nn09825096\tauthorizer, authoriser\nn09825750\tautomobile mechanic, auto-mechanic, car-mechanic, mechanic, grease monkey\nn09826204\taviator, aeronaut, airman, flier, flyer\nn09826605\taviatrix, airwoman, aviatress\nn09826821\tayah\nn09827246\tbabu, baboo\nn09827363\tbaby, babe, sister\nn09828216\tbaby\nn09828403\tbaby boomer, boomer\nn09828988\tbaby farmer\nn09830194\tback\nn09830400\tbackbencher\nn09830629\tbackpacker, 
packer\nn09830759\tbackroom boy, brain truster\nn09830926\tbackscratcher\nn09831962\tbad person\nn09832456\tbaggage\nn09832633\tbag lady\nn09832978\tbailee\nn09833111\tbailiff\nn09833275\tbailor\nn09833441\tbairn\nn09833536\tbaker, bread maker\nn09833751\tbalancer\nn09833997\tbalker, baulker, noncompliant\nn09834258\tball-buster, ball-breaker\nn09834378\tball carrier, runner\nn09834699\tballet dancer\nn09834885\tballet master\nn09835017\tballet mistress\nn09835153\tballetomane\nn09835230\tball hawk\nn09835348\tballoonist\nn09835506\tballplayer, baseball player\nn09836160\tbullfighter, toreador\nn09836343\tbanderillero\nn09836519\tmatador\nn09836786\tpicador\nn09837459\tbandsman\nn09837720\tbanker\nn09838295\tbank robber\nn09838370\tbankrupt, insolvent\nn09838621\tbantamweight\nn09839702\tbarmaid\nn09840217\tbaron, big businessman, business leader, king, magnate, mogul, power, top executive, tycoon\nn09840435\tbaron\nn09840520\tbaron\nn09841188\tbartender, barman, barkeep, barkeeper, mixologist\nn09841515\tbaseball coach, baseball manager\nn09841696\tbase runner, runner\nn09842047\tbasketball player, basketeer, cager\nn09842288\tbasketweaver, basketmaker\nn09842395\tBasket Maker\nn09842528\tbass, basso\nn09842823\tbastard, by-blow, love child, illegitimate child, illegitimate, whoreson\nn09843443\tbat boy\nn09843602\tbather\nn09843716\tbatman\nn09843824\tbaton twirler, twirler\nn09844457\tBavarian\nn09844898\tbeadsman, bedesman\nn09845401\tbeard\nn09845849\tbeatnik, beat\nn09846142\tbeauty consultant\nn09846469\tBedouin, Beduin\nn09846586\tbedwetter, bed wetter, wetter\nn09846755\tbeekeeper, apiarist, apiculturist\nn09846894\tbeer drinker, ale drinker\nn09847267\tbeggarman\nn09847344\tbeggarwoman\nn09847543\tbeldam, beldame\nn09848110\ttheist\nn09848489\tbeliever, truster\nn09849167\tbell founder\nn09849990\tbenedick, benedict\nn09850760\tberserker, berserk\nn09850974\tbesieger\nn09851165\tbest, topper\nn09851575\tbetrothed\nn09853541\tBig 
Brother\nn09853645\tbigot\nn09853881\tbig shot, big gun, big wheel, big cheese, big deal, big enchilada, big fish, head honcho\nn09854218\tbig sister\nn09854421\tbilliard player\nn09854915\tbiochemist\nn09855433\tbiographer\nn09856401\tbird fancier\nn09856671\tbirth\nn09856827\tbirth-control campaigner, birth-control reformer\nn09857007\tbisexual, bisexual person\nn09858165\tblack belt\nn09858299\tblackmailer, extortioner, extortionist\nn09858733\tBlack Muslim\nn09859152\tblacksmith\nn09859285\tblade\nn09859975\tblind date\nn09861287\tbluecoat\nn09861599\tbluestocking, bas bleu\nn09861863\tboatbuilder\nn09861946\tboatman, boater, waterman\nn09862183\tboatswain, bos'n, bo's'n, bosun, bo'sun\nn09862621\tbobby\nn09863031\tbodyguard, escort\nn09863339\tboffin\nn09863749\tBolshevik, Marxist, red, bolshie, bolshy\nn09863936\tBolshevik, Bolshevist\nn09864632\tbombshell\nn09864968\tbondman, bondsman\nn09865068\tbondwoman, bondswoman, bondmaid\nn09865162\tbondwoman, bondswoman, bondmaid\nn09865398\tbond servant\nn09865672\tbook agent\nn09865744\tbookbinder\nn09866115\tbookkeeper\nn09866354\tbookmaker\nn09866559\tbookworm\nn09866661\tbooster, shoplifter, lifter\nn09866817\tbootblack, shoeblack\nn09866922\tbootlegger, moonshiner\nn09867069\tbootmaker, boot maker\nn09867154\tborderer\nn09867311\tborder patrolman\nn09868270\tbotanist, phytologist, plant scientist\nn09868782\tbottom feeder\nn09868899\tboulevardier\nn09869317\tbounty hunter\nn09869447\tbounty hunter\nn09869578\tBourbon\nn09870096\tbowler\nn09871095\tslugger, slogger\nn09871229\tcub, lad, laddie, sonny, sonny boy\nn09871681\tBoy Scout\nn09871867\tboy scout\nn09871952\tboy wonder\nn09872066\tbragger, braggart, boaster, blowhard, line-shooter, vaunter\nn09872557\tbrahman, brahmin\nn09873348\tbrawler\nn09873473\tbreadwinner\nn09873769\tbreaststroker\nn09873899\tbreeder, stock breeder\nn09874428\tbrick\nn09874725\tbride\nn09874862\tbridesmaid, maid of honor\nn09875025\tbridge agent\nn09875979\tbroadcast 
journalist\nn09876701\tBrother\nn09877288\tbrother-in-law\nn09877587\tbrowser\nn09877750\tBrummie, Brummy\nn09877951\tbuddy, brother, chum, crony, pal, sidekick\nn09878921\tbull\nn09879552\tbully\nn09880189\tbunny, bunny girl\nn09880741\tburglar\nn09881265\tbursar\nn09881358\tbusboy, waiter's assistant\nn09881895\tbusiness editor\nn09883047\tbusiness traveler\nn09883452\tbuster\nn09883807\tbusybody, nosy-parker, nosey-parker, quidnunc\nn09885059\tbuttinsky\nn09885866\tcabinetmaker, furniture maker\nn09886403\tcaddie, golf caddie\nn09886540\tcadet, plebe\nn09888635\tcaller, caller-out\nn09889065\tcall girl\nn09889170\tcalligrapher, calligraphist\nn09889691\tcampaigner, candidate, nominee\nn09889941\tcamper\nn09890192\tcamp follower\nn09890749\tcandidate, prospect\nn09891730\tcanonist\nn09892262\tcapitalist\nn09892513\tcaptain, headwaiter, maitre d'hotel, maitre d'\nn09892693\tcaptain, senior pilot\nn09893191\tcaptain\nn09893344\tcaptain, chieftain\nn09893502\tcaptive\nn09893600\tcaptive\nn09894143\tcardinal\nn09894445\tcardiologist, heart specialist, heart surgeon\nn09894654\tcard player\nn09894909\tcardsharp, card sharp, cardsharper, card sharper, sharper, sharpie, sharpy, card shark\nn09895222\tcareerist\nn09895480\tcareer man\nn09895561\tcaregiver\nn09895701\tcaretaker\nn09895902\tcaretaker\nn09896170\tcaricaturist\nn09896311\tcarillonneur\nn09896401\tcaroler, caroller\nn09896685\tcarpenter\nn09896826\tcarper, niggler\nn09898020\tCartesian\nn09899289\tcashier\nn09899671\tcasualty, injured party\nn09899782\tcasualty\nn09899929\tcasuist, sophist\nn09901337\tcatechist\nn09901502\tcatechumen, neophyte\nn09901642\tcaterer\nn09901786\tCatholicos\nn09901921\tcat fancier\nn09902128\tCavalier, Royalist\nn09902353\tcavalryman, trooper\nn09902731\tcaveman, cave man, cave dweller, troglodyte\nn09902851\tcelebrant\nn09902954\tcelebrant, celebrator, celebrater\nn09903153\tcelebrity, famous person\nn09903501\tcellist, 
violoncellist\nn09903639\tcensor\nn09903936\tcensor\nn09904208\tcentenarian\nn09904837\tcentrist, middle of the roader, moderate, moderationist\nn09905050\tcenturion\nn09905185\tcertified public accountant, CPA\nn09905530\tchachka, tsatske, tshatshke, tchotchke, tchotchkeleh\nn09906293\tchambermaid, fille de chambre\nn09906449\tchameleon\nn09906704\tchampion, champ, title-holder\nn09907804\tchandler\nn09908769\tprison chaplain\nn09909660\tcharcoal burner\nn09909929\tcharge d'affaires\nn09910222\tcharioteer\nn09910374\tcharmer, beguiler\nn09910556\tchartered accountant\nn09910840\tchartist, technical analyst\nn09911226\tcharwoman, char, cleaning woman, cleaning lady, woman\nn09912431\tmale chauvinist, sexist\nn09912681\tcheapskate, tightwad\nn09912907\tChechen\nn09912995\tchecker\nn09913329\tcheerer\nn09913455\tcheerleader\nn09913593\tcheerleader\nn09915434\tCheops, Khufu\nn09915651\tchess master\nn09916348\tchief executive officer, CEO, chief operating officer\nn09917214\tchief of staff\nn09917345\tchief petty officer\nn09917481\tChief Secretary\nn09917593\tchild, kid, youngster, minor, shaver, nipper, small fry, tiddler, tike, tyke, fry, nestling\nn09918248\tchild, kid\nn09918554\tchild, baby\nn09918867\tchild prodigy, infant prodigy, wonder child\nn09919061\tchimneysweeper, chimneysweep, sweep\nn09919200\tchiropractor\nn09919451\tchit\nn09919899\tchoker\nn09920106\tchoragus\nn09920283\tchoreographer\nn09920901\tchorus girl, showgirl, chorine\nn09921034\tchosen\nn09923003\tcicerone\nn09923186\tcigar smoker\nn09923418\tcipher, cypher, nobody, nonentity\nn09923561\tcircus acrobat\nn09923673\tcitizen\nn09923996\tcity editor\nn09924106\tcity father\nn09924195\tcity man\nn09924313\tcity slicker, city boy\nn09924437\tcivic leader, civil leader\nn09924996\tcivil rights leader, civil rights worker, civil rights activist\nn09927089\tcleaner\nn09927451\tclergyman, reverend, man of the cloth\nn09928136\tcleric, churchman, divine, 
ecclesiastic\nn09928451\tclerk\nn09928845\tclever Dick, clever clogs\nn09929202\tclimatologist\nn09929298\tclimber\nn09929577\tclinician\nn09930257\tcloser, finisher\nn09930628\tcloset queen\nn09930876\tclown, buffoon, goof, goofball, merry andrew\nn09931165\tclown, buffoon\nn09931418\tcoach, private instructor, tutor\nn09931640\tcoach, manager, handler\nn09932098\tpitching coach\nn09932336\tcoachman\nn09932508\tcoal miner, collier, pitman\nn09932788\tcoastguardsman\nn09933020\tcobber\nn09933098\tcobbler, shoemaker\nn09933842\tcodger, old codger\nn09933972\tco-beneficiary\nn09934337\tcog\nn09934488\tcognitive neuroscientist\nn09934774\tcoiffeur\nn09935107\tcoiner\nn09935434\tcollaborator, cooperator, partner, pardner\nn09936825\tcolleen\nn09936892\tcollege student, university student\nn09937056\tcollegian, college man, college boy\nn09937688\tcolonial\nn09937802\tcolonialist\nn09937903\tcolonizer, coloniser\nn09938080\tcoloratura, coloratura soprano\nn09938449\tcolor guard\nn09938991\tcolossus, behemoth, giant, heavyweight, titan\nn09940725\tcomedian\nn09940818\tcomedienne\nn09941089\tcomer\nn09941571\tcommander\nn09941787\tcommander in chief, generalissimo\nn09941964\tcommanding officer, commandant, commander\nn09942697\tcommissar, political commissar\nn09942970\tcommissioned officer\nn09943239\tcommissioned military officer\nn09943811\tcommissioner\nn09944022\tcommissioner\nn09944160\tcommittee member\nn09944430\tcommitteewoman\nn09945021\tcommodore\nn09945223\tcommunicant\nn09945319\tcommunist, commie\nn09945603\tCommunist\nn09945745\tcommuter\nn09946814\tcompere\nn09947127\tcomplexifier\nn09950457\tcompulsive\nn09950728\tcomputational linguist\nn09951070\tcomputer scientist\nn09951274\tcomputer user\nn09951524\tComrade\nn09951616\tconcert-goer, music lover\nn09952163\tconciliator, make-peace, pacifier, peacemaker, reconciler\nn09953052\tconductor\nn09953350\tconfectioner, candymaker\nn09953615\tConfederate\nn09954355\tconfessor\nn09954639\tconfidant, 
intimate\nn09955406\tConfucian, Confucianist\nn09955944\trep\nn09956578\tconqueror, vanquisher\nn09957523\tConservative\nn09958133\tNonconformist, chapelgoer\nn09958292\tAnglican\nn09958447\tconsignee\nn09958569\tconsigner, consignor\nn09959142\tconstable\nn09959658\tconstructivist\nn09960688\tcontractor\nn09961198\tcontralto\nn09961331\tcontributor\nn09961469\tcontrol freak\nn09961605\tconvalescent\nn09961739\tconvener\nn09962966\tconvict, con, inmate, yard bird, yardbird\nn09964202\tcopilot, co-pilot\nn09964411\tcopycat, imitator, emulator, ape, aper\nn09965515\tcoreligionist\nn09965787\tcornerback\nn09966470\tcorporatist\nn09966554\tcorrespondent, letter writer\nn09967063\tcosmetician\nn09967406\tcosmopolitan, cosmopolite\nn09967555\tCossack\nn09967816\tcost accountant\nn09967967\tco-star\nn09968259\tcostumier, costumer, costume designer\nn09968652\tcotter, cottier\nn09968741\tcotter, cottar\nn09968845\tcounselor, counsellor\nn09970088\tcounterterrorist\nn09970192\tcounterspy, mole\nn09970402\tcountess\nn09970822\tcompromiser\nn09971273\tcountrywoman\nn09971385\tcounty agent, agricultural agent, extension agent\nn09971839\tcourtier\nn09972010\tcousin, first cousin, cousin-german, full cousin\nn09972458\tcover girl, pin-up, lovely\nn09972587\tcow\nn09974648\tcraftsman, artisan, journeyman, artificer\nn09975425\tcraftsman, crafter\nn09976024\tcrapshooter\nn09976283\tcrazy, loony, looney, nutcase, weirdo\nn09976429\tcreature, wight\nn09976728\tcreditor\nn09976917\tcreep, weirdo, weirdie, weirdy, spook\nn09978442\tcriminologist\nn09979321\tcritic\nn09979913\tCroesus\nn09980458\tcross-examiner, cross-questioner\nn09980805\tcrossover voter, crossover\nn09980985\tcroupier\nn09981092\tcrown prince\nn09981278\tcrown princess\nn09981540\tcryptanalyst, cryptographer, cryptologist\nn09981939\tCub Scout\nn09982152\tcuckold\nn09982525\tcultist\nn09983314\tcurandera\nn09983572\tcurate, minister of religion, minister, parson, pastor, rector\nn09983889\tcurator, 
conservator\nn09984960\tcustomer agent\nn09985470\tcutter, carver\nn09985809\tcyberpunk\nn09985978\tcyborg, bionic man, bionic woman\nn09986450\tcymbalist\nn09986700\tCynic\nn09986904\tcytogeneticist\nn09987045\tcytologist\nn09987161\tczar\nn09987239\tczar, tsar, tzar\nn09988063\tdad, dada, daddy, pa, papa, pappa, pop\nn09988311\tdairyman\nn09988493\tDalai Lama, Grand Lama\nn09988703\tdallier, dillydallier, dilly-dallier, mope, lounger\nn09989502\tdancer, professional dancer, terpsichorean\nn09990415\tdancer, social dancer\nn09990690\tclog dancer\nn09990777\tdancing-master, dance master\nn09991740\tdark horse\nn09991867\tdarling, favorite, favourite, pet, dearie, deary, ducky\nn09992538\tdate, escort\nn09992837\tdaughter, girl\nn09993252\tdawdler, drone, laggard, lagger, trailer, poke\nn09993651\tday boarder\nn09994400\tday laborer, day labourer\nn09994673\tdeacon, Protestant deacon\nn09994808\tdeaconess\nn09994878\tdeadeye\nn09995829\tdeipnosophist\nn09996039\tdropout\nn09996304\tdeadhead\nn09996481\tdeaf person\nn09997622\tdebtor, debitor\nn09998788\tdeckhand, roustabout\nn09999135\tdefamer, maligner, slanderer, vilifier, libeler, backbiter, traducer\nn10000294\tdefense contractor\nn10000459\tdeist, freethinker\nn10000787\tdelegate\nn10001217\tdeliveryman, delivery boy, deliverer\nn10001481\tdemagogue, demagog, rabble-rouser\nn10001764\tdemigod, superman, Ubermensch\nn10002257\tdemographer, demographist, population scientist\nn10002760\tdemonstrator, protester\nn10003476\tden mother\nn10004718\tdepartment head\nn10005006\tdepositor\nn10005934\tdeputy\nn10006177\tdermatologist, skin doctor\nn10006748\tdescender\nn10007684\tdesignated hitter\nn10007809\tdesigner, intriguer\nn10007995\tdesk clerk, hotel desk clerk, hotel clerk\nn10008123\tdesk officer\nn10008254\tdesk sergeant, deskman, station keeper\nn10009162\tdetainee, political detainee\nn10009276\tdetective, investigator, tec, police detective\nn10009484\tdetective\nn10009671\tdetractor, disparager, 
depreciator, knocker\nn10010062\tdeveloper\nn10010243\tdeviationist\nn10010632\tdevisee\nn10010767\tdevisor\nn10010864\tdevourer\nn10011360\tdialectician\nn10011486\tdiarist, diary keeper, journalist\nn10012484\tdietician, dietitian, nutritionist\nn10013811\tdiocesan\nn10015215\tdirector, theater director, theatre director\nn10015485\tdirector\nn10015792\tdirty old man\nn10015897\tdisbeliever, nonbeliever, unbeliever\nn10017272\tdisk jockey, disc jockey, dj\nn10017422\tdispatcher\nn10018747\tdistortionist\nn10018861\tdistributor, distributer\nn10019072\tdistrict attorney, DA\nn10019187\tdistrict manager\nn10019406\tdiver, plunger\nn10020366\tdivorcee, grass widow\nn10020533\tex-wife, ex\nn10020670\tdivorce lawyer\nn10020807\tdocent\nn10020890\tdoctor, doc, physician, MD, Dr., medico\nn10022908\tdodo, fogy, fogey, fossil\nn10023264\tdoge\nn10023506\tdog in the manger\nn10023656\tdogmatist, doctrinaire\nn10024025\tdolichocephalic\nn10024362\tdomestic partner, significant other, spousal equivalent, spouse equivalent\nn10024937\tDominican\nn10025060\tdominus, dominie, domine, dominee\nn10025295\tdon, father\nn10025391\tDonatist\nn10025635\tdonna\nn10026976\tdosser, street person\nn10027246\tdouble, image, look-alike\nn10027590\tdouble-crosser, double-dealer, two-timer, betrayer, traitor\nn10028402\tdown-and-out\nn10028541\tdoyenne\nn10029068\tdraftsman, drawer\nn10030277\tdramatist, playwright\nn10032987\tdreamer\nn10033412\tdressmaker, modiste, needlewoman, seamstress, sempstress\nn10033572\tdressmaker's model\nn10033663\tdribbler, driveller, slobberer, drooler\nn10033888\tdribbler\nn10034201\tdrinker, imbiber, toper, juicer\nn10034614\tdrinker\nn10035952\tdrug addict, junkie, junky\nn10036266\tdrug user, substance abuser, user\nn10036444\tDruid\nn10036692\tdrum majorette, majorette\nn10036929\tdrummer\nn10037080\tdrunk\nn10037385\tdrunkard, drunk, rummy, sot, inebriate, wino\nn10037588\tDruze, Druse\nn10037922\tdry, prohibitionist\nn10038119\tdry 
nurse\nn10038409\tduchess\nn10038620\tduke\nn10039271\tduffer\nn10039946\tdunker\nn10040240\tDutch uncle\nn10040698\tdyspeptic\nn10040945\teager beaver, busy bee, live wire, sharpie, sharpy\nn10041373\tearl\nn10041887\tearner, wage earner\nn10042690\teavesdropper\nn10042845\teccentric, eccentric person, flake, oddball, geek\nn10043024\teclectic, eclecticist\nn10043491\teconometrician, econometrist\nn10043643\teconomist, economic expert\nn10044682\tectomorph\nn10044879\teditor, editor in chief\nn10047199\tegocentric, egoist\nn10047459\tegotist, egoist, swellhead\nn10048117\tejaculator\nn10048367\telder\nn10048612\telder statesman\nn10048836\telected official\nn10049363\telectrician, lineman, linesman\nn10050043\telegist\nn10050880\telocutionist\nn10051026\temancipator, manumitter\nn10051761\tembryologist\nn10051861\temeritus\nn10051975\temigrant, emigre, emigree, outgoer\nn10052694\temissary, envoy\nn10053439\tempress\nn10053808\temployee\nn10054657\temployer\nn10055297\tenchantress, witch\nn10055410\tenchantress, temptress, siren, Delilah, femme fatale\nn10055566\tencyclopedist, encyclopaedist\nn10055730\tendomorph\nn10055847\tenemy, foe, foeman, opposition\nn10056103\tenergizer, energiser, vitalizer, vitaliser, animator\nn10056611\tend man\nn10056719\tend man, corner man\nn10057271\tendorser, indorser\nn10058411\tenjoyer\nn10058962\tenlisted woman\nn10059067\tenophile, oenophile\nn10060075\tentrant\nn10060175\tentrant\nn10060352\tentrepreneur, enterpriser\nn10061043\tenvoy, envoy extraordinary, minister plenipotentiary\nn10061195\tenzymologist\nn10061431\teparch\nn10061882\tepidemiologist\nn10062042\tepigone, epigon\nn10062176\tepileptic\nn10062275\tEpiscopalian\nn10062492\tequerry\nn10062594\tequerry\nn10062716\terotic\nn10062905\tescapee\nn10062996\tescapist, dreamer, wishful thinker\nn10063635\tEskimo, Esquimau, Inuit\nn10063919\tespionage agent\nn10064831\testhetician, 
aesthetician\nn10064977\tetcher\nn10065758\tethnologist\nn10066206\tEtonian\nn10066314\tetymologist\nn10067011\tevangelist, revivalist, gospeler, gospeller\nn10067305\tEvangelist\nn10067600\tevent planner\nn10067968\texaminer, inspector\nn10068234\texaminer, tester, quizzer\nn10068425\texarch\nn10069296\texecutant\nn10069981\texecutive secretary\nn10070108\texecutive vice president\nn10070377\texecutrix\nn10070449\texegete\nn10070563\texhibitor, exhibitioner, shower\nn10070711\texhibitionist, show-off\nn10071332\texile, expatriate, expat\nn10071557\texistentialist, existentialist philosopher, existential philosopher\nn10072054\texorcist, exorciser\nn10074249\tex-spouse\nn10074578\textern, medical extern\nn10074735\textremist\nn10074841\textrovert, extravert\nn10075299\teyewitness\nn10075693\tfacilitator\nn10076224\tfairy godmother\nn10076483\tfalangist, phalangist\nn10076604\tfalconer, hawker\nn10076957\tfalsifier\nn10077106\tfamiliar\nn10077593\tfan, buff, devotee, lover\nn10077879\tfanatic, fiend\nn10078131\tfancier, enthusiast\nn10078719\tfarm boy\nn10078806\tfarmer, husbandman, granger, sodbuster\nn10079399\tfarmhand, fieldhand, field hand, farm worker\nn10079893\tfascist\nn10080117\tfascista\nn10080508\tfatalist, determinist, predestinarian, predestinationist\nn10080869\tfather, male parent, begetter\nn10081204\tFather, Padre\nn10081842\tfather-figure\nn10082043\tfather-in-law\nn10082299\tFauntleroy, Little Lord Fauntleroy\nn10082423\tFauve, fauvist\nn10082562\tfavorite son\nn10082687\tfeatherweight\nn10082997\tfederalist\nn10083677\tfellow traveler, fellow traveller\nn10083823\tfemale aristocrat\nn10084043\tfemale offspring\nn10084295\tfemale child, girl, little girl\nn10085101\tfence\nn10085869\tfiance, groom-to-be\nn10086383\tfielder, fieldsman\nn10086744\tfield judge\nn10087434\tfighter pilot\nn10087736\tfiler\nn10088200\tfilm director, director\nn10090745\tfinder\nn10091349\tfire chief, fire marshal\nn10091450\tfire-eater, 
fire-swallower\nn10091564\tfire-eater, hothead\nn10091651\tfireman, firefighter, fire fighter, fire-eater\nn10091861\tfire marshall\nn10091997\tfire walker\nn10092488\tfirst baseman, first sacker\nn10092643\tfirstborn, eldest\nn10092794\tfirst lady\nn10092978\tfirst lieutenant, 1st lieutenant\nn10093167\tfirst offender\nn10093475\tfirst sergeant, sergeant first class\nn10093818\tfishmonger, fishwife\nn10094320\tflagellant\nn10094584\tflag officer\nn10094782\tflak catcher, flak, flack catcher, flack\nn10095265\tflanker back, flanker\nn10095420\tflapper\nn10095769\tflatmate\nn10095869\tflatterer, adulator\nn10096126\tflibbertigibbet, foolish woman\nn10096508\tflight surgeon\nn10097262\tfloorwalker, shopwalker\nn10097477\tflop, dud, washout\nn10097590\tFlorentine\nn10097842\tflower girl\nn10097995\tflower girl\nn10098245\tflutist, flautist, flute player\nn10098388\tfly-by-night\nn10098517\tflyweight\nn10098624\tflyweight\nn10098710\tfoe, enemy\nn10098862\tfolk dancer\nn10099002\tfolk poet\nn10099375\tfollower\nn10101308\tfootball hero\nn10101634\tfootball player, footballer\nn10101981\tfootman\nn10102800\tforefather, father, sire\nn10103155\tforemother\nn10103228\tforeign agent\nn10103921\tforeigner, outsider\nn10104064\tboss\nn10104487\tforeman\nn10104756\tforester, tree farmer, arboriculturist\nn10104888\tforewoman\nn10105085\tforger, counterfeiter\nn10105733\tforward\nn10105906\tfoster-brother, foster brother\nn10106387\tfoster-father, foster father\nn10106509\tfoster-mother, foster mother\nn10106995\tfoster-sister, foster sister\nn10107173\tfoster-son, foster son\nn10107303\tfounder, beginner, founding father, father\nn10108018\tfoundress\nn10108089\tfour-minute man\nn10108464\tframer\nn10108832\tFrancophobe\nn10109443\tfreak, monster, monstrosity, lusus naturae\nn10109662\tfree agent, free spirit, freewheeler\nn10109826\tfree agent\nn10110093\tfreedom rider\nn10110731\tfree-liver\nn10110893\tfreeloader\nn10111358\tfree 
trader\nn10111779\tFreudian\nn10111903\tfriar, mendicant\nn10112129\tmonk, monastic\nn10113249\tfrontierswoman\nn10113583\tfront man, front, figurehead, nominal head, straw man, strawman\nn10113869\tfrotteur\nn10114476\tfucker\nn10114550\tfucker\nn10114662\tfuddy-duddy\nn10115430\tfullback\nn10115946\tfunambulist, tightrope walker\nn10116370\tfundamentalist\nn10116478\tfundraiser\nn10116702\tfuturist\nn10117017\tgadgeteer\nn10117267\tgagman, gagster, gagwriter\nn10117415\tgagman, standup comedian\nn10117739\tgainer, weight gainer\nn10117851\tgal\nn10118301\tgaloot\nn10118743\tgambist\nn10118844\tgambler\nn10119609\tgamine\nn10120330\tgarbage man, garbageman, garbage collector, garbage carter, garbage hauler, refuse collector, dustman\nn10120671\tgardener\nn10121026\tgarment cutter\nn10121246\tgarroter, garrotter, strangler, throttler, choker\nn10121714\tgasman\nn10121800\tgastroenterologist\nn10122300\tgatherer\nn10122531\tgawker\nn10123122\tgendarme\nn10123844\tgeneral, full general\nn10126177\tgenerator, source, author\nn10126424\tgeneticist\nn10126708\tgenitor\nn10127186\tgent\nn10127689\tgeologist\nn10128519\tgeophysicist\nn10128748\tghostwriter, ghost\nn10129338\tGibson girl\nn10129825\tgirl, miss, missy, young lady, young woman, fille\nn10130686\tgirlfriend, girl, lady friend\nn10130877\tgirlfriend\nn10131151\tgirl wonder\nn10131268\tGirondist, Girondin\nn10131590\tgitano\nn10131815\tgladiator\nn10132035\tglassblower\nn10132502\tgleaner\nn10134178\tgoat herder, goatherd\nn10134396\tgodchild\nn10134760\tgodfather\nn10134982\tgodparent\nn10135129\tgodson\nn10135197\tgofer\nn10135297\tgoffer, gopher\nn10136615\tgoldsmith, goldworker, gold-worker\nn10136959\tgolfer, golf player, linksman\nn10137825\tgondolier, gondoliere\nn10138369\tgood guy\nn10138472\tgood old boy, good ole boy, good ol' boy\nn10139077\tgood Samaritan\nn10139651\tgossip columnist\nn10140051\tgouger\nn10140597\tgovernor general\nn10140683\tgrabber\nn10140783\tgrader\nn10140929\tgraduate nurse, 
trained nurse\nn10141364\tgrammarian, syntactician\nn10141732\tgranddaughter\nn10142166\tgrande dame\nn10142391\tgrandfather, gramps, granddad, grandad, granddaddy, grandpa\nn10142537\tGrand Inquisitor\nn10142747\tgrandma, grandmother, granny, grannie, gran, nan, nanna\nn10142946\tgrandmaster\nn10143172\tgrandparent\nn10143595\tgrantee\nn10143725\tgranter\nn10144338\tgrass widower, divorced man\nn10145239\tgreat-aunt, grandaunt\nn10145340\tgreat grandchild\nn10145480\tgreat granddaughter\nn10145590\tgreat grandmother\nn10145774\tgreat grandparent\nn10145902\tgreat grandson\nn10146002\tgreat-nephew, grandnephew\nn10146104\tgreat-niece, grandniece\nn10146416\tGreen Beret\nn10146816\tgrenadier, grenade thrower\nn10146927\tgreeter, saluter, welcomer\nn10147121\tgringo\nn10147262\tgrinner\nn10147710\tgrocer\nn10147935\tgroom, bridegroom\nn10148035\tgroom, bridegroom\nn10148305\tgrouch, grump, crank, churl, crosspatch\nn10148825\tgroup captain\nn10149436\tgrunter\nn10149867\tprison guard, jailer, jailor, gaoler, screw, turnkey\nn10150071\tguard\nn10150794\tguesser\nn10150940\tguest, invitee\nn10151133\tguest\nn10151261\tguest of honor\nn10151367\tguest worker, guestworker\nn10151570\tguide\nn10151760\tguitarist, guitar player\nn10152306\tgunnery sergeant\nn10152616\tguru\nn10152763\tguru\nn10153155\tguvnor\nn10153414\tguy, cat, hombre, bozo\nn10153594\tgymnast\nn10153865\tgym rat\nn10154013\tgynecologist, gynaecologist, woman's doctor\nn10154186\tGypsy, Gipsy, Romany, Rommany, Romani, Roma, Bohemian\nn10154601\thack, drudge, hacker\nn10155222\thacker, cyber-terrorist, cyberpunk\nn10155600\thaggler\nn10155849\thairdresser, hairstylist, stylist, styler\nn10156629\thakim, hakeem\nn10156831\tHakka\nn10157016\thalberdier\nn10157128\thalfback\nn10157271\thalf blood\nn10158506\thand\nn10159045\tanimal trainer, handler\nn10159289\thandyman, jack of all trades, odd-job man\nn10159533\thang glider\nn10160188\thardliner\nn10160280\tharlequin\nn10160412\tharmonizer, 
harmoniser\nn10161622\thash head\nn10162016\thatchet man, iceman\nn10162194\thater\nn10162354\thatmaker, hatter, milliner, modiste\nn10164025\theadman, tribal chief, chieftain, chief\nn10164233\theadmaster, schoolmaster, master\nn10164492\thead nurse\nn10165448\thearer, listener, auditor, attender\nn10166189\theartbreaker\nn10166394\theathen, pagan, gentile, infidel\nn10167152\theavyweight\nn10167361\theavy\nn10167565\theckler, badgerer\nn10167838\thedger\nn10168012\thedger, equivocator, tergiversator\nn10168183\thedonist, pagan, pleasure seeker\nn10168584\their, inheritor, heritor\nn10168837\their apparent\nn10169147\theiress, inheritress, inheritrix\nn10169241\their presumptive\nn10169419\thellion, heller, devil\nn10169796\thelmsman, steersman, steerer\nn10170060\thire\nn10170681\thematologist, haematologist\nn10170866\themiplegic\nn10171219\therald, trumpeter\nn10171456\therbalist, herb doctor\nn10171567\therder, herdsman, drover\nn10172080\thermaphrodite, intersex, gynandromorph, androgyne, epicene, epicene person\nn10173410\theroine\nn10173579\theroin addict\nn10173665\thero worshiper, hero worshipper\nn10173771\tHerr\nn10174253\thighbinder\nn10174330\thighbrow\nn10174445\thigh commissioner\nn10174589\thighflier, highflyer\nn10174695\tHighlander, Scottish Highlander, Highland Scot\nn10174971\thigh-muck-a-muck, pooh-bah\nn10175248\thigh priest\nn10175725\thighjacker, hijacker\nn10176913\thireling, pensionary\nn10177150\thistorian, historiographer\nn10178077\thitchhiker\nn10178216\thitter, striker\nn10179069\thobbyist\nn10180580\tholdout\nn10180791\tholdover, hangover\nn10180923\tholdup man, stickup man\nn10181445\thomeboy\nn10181547\thomeboy\nn10181799\thome buyer\nn10181878\thomegirl\nn10182190\thomeless, homeless person\nn10182402\thomeopath, homoeopath\nn10183347\thonest woman\nn10183931\thonor guard, guard of honor\nn10184505\thooker\nn10185148\thoper\nn10185483\thornist\nn10185793\thorseman, equestrian, horseback rider\nn10186068\thorse 
trader\nn10186143\thorsewoman\nn10186216\thorse wrangler, wrangler\nn10186350\thorticulturist, plantsman\nn10186686\thospital chaplain\nn10186774\thost, innkeeper, boniface\nn10187130\thost\nn10187491\thostess\nn10187990\thotelier, hotelkeeper, hotel manager, hotelman, hosteller\nn10188715\thousekeeper\nn10188856\thousemaster\nn10188957\thousemate\nn10189278\thouse physician, resident, resident physician\nn10189597\thouse sitter\nn10190122\thousing commissioner\nn10190516\thuckster, cheap-jack\nn10191001\thugger\nn10191388\thumanist, humanitarian\nn10191613\thumanitarian, do-gooder, improver\nn10192839\thunk\nn10193650\thuntress\nn10194231\tex-husband, ex\nn10194775\thydrologist\nn10195056\thyperope\nn10195155\thypertensive\nn10195261\thypnotist, hypnotizer, hypnotiser, mesmerist, mesmerizer\nn10195593\thypocrite, dissembler, dissimulator, phony, phoney, pretender\nn10196404\ticeman\nn10196725\ticonoclast\nn10197392\tideologist, ideologue\nn10198437\tidol, matinee idol\nn10198832\tidolizer, idoliser\nn10199251\timam, imaum\nn10200246\timperialist\nn10200781\timportant person, influential person, personage\nn10202225\tinamorato\nn10202624\tincumbent, officeholder\nn10202763\tincurable\nn10203949\tinductee\nn10204177\tindustrialist\nn10204833\tinfanticide\nn10205231\tinferior\nn10205344\tinfernal\nn10205457\tinfielder\nn10205714\tinfiltrator\nn10206173\tinformer, betrayer, rat, squealer, blabber\nn10206506\tingenue\nn10206629\tingenue\nn10207077\tpolymath\nn10207169\tin-law, relative-in-law\nn10208189\tinquiry agent\nn10208847\tinspector\nn10208950\tinspector general\nn10209082\tinstigator, initiator\nn10209731\tinsurance broker, insurance agent, general agent, underwriter\nn10210137\tinsurgent, insurrectionist, freedom fighter, rebel\nn10210512\tintelligence analyst\nn10210648\tinterior designer, designer, interior decorator, house decorator, room decorator, decorator\nn10210911\tinterlocutor, conversational partner\nn10211036\tinterlocutor, 
middleman\nn10211666\tInternational Grandmaster\nn10211830\tinternationalist\nn10212231\tinternist\nn10212501\tinterpreter, translator\nn10212780\tinterpreter\nn10213034\tintervenor\nn10213429\tintrovert\nn10214062\tinvader, encroacher\nn10214390\tinvalidator, voider, nullifier\nn10215623\tinvestigator\nn10216106\tinvestor\nn10216403\tinvigilator\nn10217208\tirreligionist\nn10218043\tIvy Leaguer\nn10218164\tJack of all trades\nn10218292\tJacksonian\nn10219240\tJane Doe\nn10219453\tjanissary\nn10219879\tJat\nn10220080\tJavanese, Javan\nn10220924\tJekyll and Hyde\nn10221312\tjester, fool, motley fool\nn10221520\tJesuit\nn10222170\tjezebel\nn10222259\tjilt\nn10222497\tjobber, middleman, wholesaler\nn10222716\tjob candidate\nn10223069\tJob's comforter\nn10223177\tjockey\nn10223606\tJohn Doe\nn10224578\tjournalist\nn10225219\tjudge, justice, jurist\nn10225931\tjudge advocate\nn10226413\tjuggler\nn10227166\tJungian\nn10227266\tjunior\nn10227393\tjunior\nn10227490\tJunior, Jr, Jnr\nn10227698\tjunior lightweight\nn10227793\tjunior middleweight\nn10227985\tjurist, legal expert\nn10228278\tjuror, juryman, jurywoman\nn10228468\tjustice of the peace\nn10228592\tjusticiar, justiciary\nn10228712\tkachina\nn10229883\tkeyboardist\nn10230216\tKhedive\nn10233248\tkingmaker\nn10235024\tking, queen, world-beater\nn10235269\tKing's Counsel\nn10235385\tCounsel to the Crown\nn10236304\tkin, kinsperson, family\nn10236521\tenate, matrikin, matrilineal kin, matrisib, matrilineal sib\nn10236842\tkink\nn10237069\tkinswoman\nn10237196\tkisser, osculator\nn10237464\tkitchen help\nn10237556\tkitchen police, KP\nn10237676\tKlansman, Ku Kluxer, Kluxer\nn10237799\tkleptomaniac\nn10238272\tkneeler\nn10238375\tknight\nn10239928\tknocker\nn10240082\tknower, apprehender\nn10240235\tknow-it-all, know-all\nn10240417\tkolkhoznik\nn10240821\tKshatriya\nn10241024\tlabor coach, birthing coach, doula, monitrice\nn10241300\tlaborer, manual laborer, labourer, 
jack\nn10242328\tLabourite\nn10243137\tlady\nn10243273\tlady-in-waiting\nn10243483\tlady's maid\nn10243664\tlama\nn10243872\tlamb, dear\nn10244108\tlame duck\nn10244359\tlamplighter\nn10244913\tland agent\nn10245029\tlandgrave\nn10245341\tlandlubber, lubber, landsman\nn10245507\tlandlubber, landsman, landman\nn10245639\tlandowner, landholder, property owner\nn10245863\tlandscape architect, landscape gardener, landscaper, landscapist\nn10246317\tlanglaufer\nn10246395\tlanguisher\nn10246703\tlapidary, lapidarist\nn10247358\tlass, lassie, young girl, jeune fille\nn10247880\tLatin\nn10248008\tLatin\nn10248198\tlatitudinarian\nn10248377\tJehovah's Witness\nn10249191\tlaw agent\nn10249270\tlawgiver, lawmaker\nn10249459\tlawman, law officer, peace officer\nn10249869\tlaw student\nn10249950\tlawyer, attorney\nn10250712\tlay reader\nn10251329\tlazybones\nn10251612\tleaker\nn10252075\tleaseholder, lessee\nn10252222\tlector, lecturer, reader\nn10252354\tlector, reader\nn10252547\tlecturer\nn10253122\tleft-hander, lefty, southpaw\nn10253296\tlegal representative\nn10253479\tlegate, official emissary\nn10253611\tlegatee\nn10253703\tlegionnaire, legionary\nn10255459\tletterman\nn10257221\tliberator\nn10258602\tlicenser\nn10258786\tlicentiate\nn10259348\tlieutenant\nn10259780\tlieutenant colonel, light colonel\nn10259997\tlieutenant commander\nn10260473\tlieutenant junior grade, lieutenant JG\nn10260706\tlife\nn10260800\tlifeguard, lifesaver\nn10261211\tlife tenant\nn10261511\tlight flyweight\nn10261624\tlight heavyweight, cruiserweight\nn10261862\tlight heavyweight\nn10262343\tlight-o'-love, light-of-love\nn10262445\tlightweight\nn10262561\tlightweight\nn10262655\tlightweight\nn10262880\tlilliputian\nn10263146\tlimnologist\nn10263411\tlineman\nn10263790\tline officer\nn10265281\tlion-hunter\nn10265801\tlisper\nn10265891\tlister\nn10266016\tliterary critic\nn10266328\tliterate, literate person\nn10266848\tlitigant, litigator\nn10267166\tlitterer, litterbug, litter 
lout\nn10267311\tlittle brother\nn10267865\tlittle sister\nn10268629\tlobbyist\nn10269199\tlocksmith\nn10269289\tlocum tenens, locum\nn10271677\tLord, noble, nobleman\nn10272782\tloser\nn10272913\tloser, also-ran\nn10273064\tfailure, loser, nonstarter, unsuccessful person\nn10274173\tLothario\nn10274318\tloudmouth, blusterer\nn10274815\tlowerclassman, underclassman\nn10275249\tLowlander, Scottish Lowlander, Lowland Scot\nn10275395\tloyalist, stalwart\nn10275848\tLuddite\nn10276045\tlumberman, lumberjack, logger, feller, faller\nn10276477\tlumper\nn10276942\tbedlamite\nn10277027\tpyromaniac\nn10277638\tlutist, lutanist, lutenist\nn10277815\tLutheran\nn10277912\tlyricist, lyrist\nn10278456\tmacebearer, mace, macer\nn10279018\tmachinist, mechanic, shop mechanic\nn10279778\tmadame\nn10280034\tmaenad\nn10280130\tmaestro, master\nn10280598\tmagdalen\nn10280674\tmagician, prestidigitator, conjurer, conjuror, illusionist\nn10281546\tmagus\nn10281770\tmaharani, maharanee\nn10281896\tmahatma\nn10282482\tmaid, maiden\nn10282672\tmaid, maidservant, housemaid, amah\nn10283170\tmajor\nn10283366\tmajor\nn10283546\tmajor-domo, seneschal\nn10284064\tmaker, shaper\nn10284871\tmalahini\nn10284965\tmalcontent\nn10286282\tmalik\nn10286539\tmalingerer, skulker, shammer\nn10286749\tMalthusian\nn10288964\tadonis\nn10289039\tman\nn10289176\tman\nn10289462\tmanageress\nn10289766\tmandarin\nn10290422\tmaneuverer, manoeuvrer\nn10290541\tmaniac\nn10290813\tManichaean, Manichean, Manichee\nn10290919\tmanicurist\nn10291110\tmanipulator\nn10291469\tman-at-arms\nn10291822\tman of action, man of deeds\nn10291942\tman of letters\nn10292316\tmanufacturer, producer\nn10293332\tmarcher, parader\nn10293590\tmarchioness, marquise\nn10293861\tmargrave\nn10294020\tmargrave\nn10294139\tMarine, devil dog, leatherneck, shipboard soldier\nn10295371\tmarquess\nn10295479\tmarquis, marquess\nn10296176\tmarshal, marshall\nn10296444\tmartinet, disciplinarian, 
moralist\nn10297234\tmascot\nn10297367\tmasochist\nn10297531\tmason, stonemason\nn10297841\tmasquerader, masker, masquer\nn10298202\tmasseur\nn10298271\tmasseuse\nn10298647\tmaster\nn10298912\tmaster, captain, sea captain, skipper\nn10299125\tmaster-at-arms\nn10299250\tmaster of ceremonies, emcee, host\nn10299700\tmasturbator, onanist\nn10299875\tmatchmaker, matcher, marriage broker\nn10300041\tmate, first mate\nn10300154\tmate\nn10300303\tmate\nn10300500\tmater\nn10300654\tmaterial\nn10300829\tmaterialist\nn10302576\tmatriarch, materfamilias\nn10302700\tmatriarch\nn10302905\tmatriculate\nn10303037\tmatron\nn10303814\tmayor, city manager\nn10304086\tmayoress\nn10304650\tmechanical engineer\nn10304914\tmedalist, medallist, medal winner\nn10305635\tmedical officer, medic\nn10305802\tmedical practitioner, medical man\nn10306004\tmedical scientist\nn10306279\tmedium, spiritualist, sensitive\nn10306496\tmegalomaniac\nn10306595\tmelancholic, melancholiac\nn10306890\tMelkite, Melchite\nn10307114\tmelter\nn10308066\tnonmember\nn10308168\tboard member\nn10308275\tclansman, clanswoman, clan member\nn10308504\tmemorizer, memoriser\nn10308653\tMendelian\nn10308732\tmender, repairer, fixer\nn10310783\tMesoamerican\nn10311506\tmessmate\nn10311661\tmestiza\nn10312287\tmeteorologist\nn10312491\tmeter maid\nn10312600\tMethodist\nn10313000\tMetis\nn10313239\tmetropolitan\nn10313441\tmezzo-soprano, mezzo\nn10313724\tmicroeconomist, microeconomic expert\nn10314054\tmiddle-aged man\nn10314182\tmiddlebrow\nn10314517\tmiddleweight\nn10314836\tmidwife, accoucheuse\nn10315217\tmikado, tenno\nn10315456\tMilanese\nn10315561\tmiler\nn10315730\tmiles gloriosus\nn10316360\tmilitary attache\nn10316527\tmilitary chaplain, padre, Holy Joe, sky pilot\nn10316862\tmilitary leader\nn10317007\tmilitary officer, officer\nn10317500\tmilitary policeman, MP\nn10317963\tmill agent\nn10318293\tmill-hand, factory worker\nn10318607\tmillionairess\nn10318686\tmillwright\nn10319313\tminder\nn10320484\tmining 
engineer\nn10320863\tminister, government minister\nn10321126\tministrant\nn10321340\tminor leaguer, bush leaguer\nn10321632\tMinuteman\nn10321882\tmisanthrope, misanthropist\nn10322238\tmisfit\nn10323634\tmistress\nn10323752\tmistress, kept woman, fancy woman\nn10323999\tmixed-blood\nn10324560\tmodel, poser\nn10325549\tclass act\nn10325774\tmodeler, modeller\nn10326776\tmodifier\nn10327143\tmolecular biologist\nn10327987\tMonegasque, Monacan\nn10328123\tmonetarist\nn10328328\tmoneygrubber\nn10328437\tmoneymaker\nn10328696\tMongoloid\nn10328941\tmonolingual\nn10329035\tmonologist\nn10330593\tmoonlighter\nn10330931\tmoralist\nn10331098\tmorosoph\nn10331167\tmorris dancer\nn10331258\tmortal enemy\nn10331347\tmortgagee, mortgage holder\nn10331841\tmortician, undertaker, funeral undertaker, funeral director\nn10332110\tmoss-trooper\nn10332385\tmother, female parent\nn10332861\tmother\nn10332953\tmother\nn10333044\tmother figure\nn10333165\tmother hen\nn10333317\tmother-in-law\nn10333439\tmother's boy, mamma's boy, mama's boy\nn10333601\tmother's daughter\nn10333838\tmotorcycle cop, motorcycle policeman, speed cop\nn10334009\tmotorcyclist\nn10334461\tMound Builder\nn10334782\tmountebank, charlatan\nn10335246\tmourner, griever, sorrower, lamenter\nn10335801\tmouthpiece, mouth\nn10335931\tmover\nn10336411\tmoviegoer, motion-picture fan\nn10336904\tmuffin man\nn10337488\tmugwump, independent, fencesitter\nn10338231\tMullah, Mollah, Mulla\nn10338391\tmuncher\nn10339179\tmurderess\nn10339251\tmurder suspect\nn10339717\tmusher\nn10340312\tmusician, instrumentalist, player\nn10341243\tmusicologist\nn10341343\tmusic teacher\nn10341446\tmusketeer\nn10341573\tMuslimah\nn10341955\tmutilator, maimer, mangler\nn10342180\tmutineer\nn10342367\tmute, deaf-mute, deaf-and-dumb person\nn10342543\tmutterer, mumbler, murmurer\nn10342893\tmuzzler\nn10342992\tMycenaen\nn10343088\tmycologist\nn10343355\tmyope\nn10343449\tmyrmidon\nn10343554\tmystic, religious 
mystic\nn10343869\tmythologist\nn10344121\tnaif\nn10344203\tnailer\nn10344319\tnamby-pamby\nn10344656\tname dropper\nn10344774\tnamer\nn10345015\tnan\nn10345100\tnanny, nursemaid, nurse\nn10345302\tnarc, nark, narcotics agent\nn10345422\tnarcissist, narcist\nn10345659\tnark, copper's nark\nn10346015\tnationalist\nn10347204\tnautch girl\nn10347446\tnaval commander\nn10348526\tNavy SEAL, SEAL\nn10349243\tobstructionist, obstructor, obstructer, resister, thwarter\nn10349750\tNazarene\nn10349836\tNazarene, Ebionite\nn10350220\tNazi, German Nazi\nn10350774\tnebbish, nebbech\nn10351064\tnecker\nn10353016\tneonate, newborn, newborn infant, newborn baby\nn10353355\tnephew\nn10353928\tneurobiologist\nn10354265\tneurologist, brain doctor\nn10354754\tneurosurgeon, brain surgeon\nn10355142\tneutral\nn10355306\tneutralist\nn10355449\tnewcomer, fledgling, fledgeling, starter, neophyte, freshman, newbie, entrant\nn10355688\tnewcomer\nn10355806\tNew Dealer\nn10356450\tnewspaper editor\nn10356877\tnewsreader, news reader\nn10357012\tNewtonian\nn10357613\tniece\nn10357737\tniggard, skinflint, scrooge, churl\nn10358032\tnight porter\nn10358124\tnight rider, nightrider\nn10358575\tNIMBY\nn10359117\tniqaabi\nn10359422\tnitpicker\nn10359546\tNobelist, Nobel Laureate\nn10359659\tNOC\nn10360366\tnoncandidate\nn10360747\tnoncommissioned officer, noncom, enlisted officer\nn10361060\tnondescript\nn10361194\tnondriver\nn10361296\tnonparticipant\nn10361525\tnonperson, unperson\nn10362003\tnonresident\nn10362319\tnonsmoker\nn10362557\tNorthern Baptist\nn10363445\tnoticer\nn10363573\tnovelist\nn10364198\tnovitiate, novice\nn10364502\tnuclear chemist, radiochemist\nn10365514\tnudger\nn10366145\tnullipara\nn10366276\tnumber theorist\nn10366966\tnurse\nn10368291\tnursling, nurseling, suckling\nn10368528\tnymph, houri\nn10368624\tnymphet\nn10368711\tnympholept\nn10368798\tnymphomaniac, nympho\nn10369095\toarswoman\nn10369317\toboist\nn10369417\tobscurantist\nn10369528\tobserver, 
commentator\nn10369699\tobstetrician, accoucheur\nn10369955\toccupier\nn10370381\toccultist\nn10370955\twine lover\nn10371052\tofferer, offeror\nn10371221\toffice-bearer\nn10371330\toffice boy\nn10371450\tofficeholder, officer\nn10373390\tofficiant\nn10373525\tFederal, Fed, federal official\nn10374541\toilman\nn10374849\toil tycoon\nn10374943\told-age pensioner\nn10375052\told boy\nn10375314\told lady\nn10375402\told man\nn10376523\toldster, old person, senior citizen, golden ager\nn10376890\told-timer, oldtimer, gaffer, old geezer, antique\nn10377021\told woman\nn10377185\toligarch\nn10377291\tOlympian\nn10377542\tomnivore\nn10377633\toncologist\nn10378026\tonlooker, looker-on\nn10378113\tonomancer\nn10378780\toperator\nn10379376\topportunist, self-seeker\nn10380126\toptimist\nn10380499\tOrangeman\nn10380672\torator, speechmaker, rhetorician, public speaker, speechifier\nn10381804\torderly, hospital attendant\nn10381981\torderly\nn10382157\torderly sergeant\nn10382302\tordinand\nn10382480\tordinary\nn10382710\torgan-grinder\nn10382825\torganist\nn10383094\torganization man\nn10383237\torganizer, organiser, arranger\nn10383505\torganizer, organiser, labor organizer\nn10383816\toriginator, conceiver, mastermind\nn10384214\tornithologist, bird watcher\nn10384392\torphan\nn10384496\torphan\nn10385566\tosteopath, osteopathist\nn10386196\tout-and-outer\nn10386754\toutdoorswoman\nn10386874\toutfielder\nn10386984\toutfielder\nn10387196\tright fielder\nn10387324\tright-handed pitcher, right-hander\nn10387836\toutlier\nn10389865\towner-occupier\nn10389976\toyabun\nn10390600\tpackrat\nn10390698\tpadrone\nn10390807\tpadrone\nn10391416\tpage, pageboy\nn10393909\tpainter\nn10394434\tPaleo-American, Paleo-Amerind, Paleo-Indian\nn10394786\tpaleontologist, palaeontologist, fossilist\nn10395073\tpallbearer, bearer\nn10395209\tpalmist, palmister, chiromancer\nn10395390\tpamperer, spoiler, coddler, mollycoddler\nn10395828\tPanchen Lama\nn10396106\tpanelist, 
panellist\nn10396337\tpanhandler\nn10396727\tpaparazzo\nn10396908\tpaperboy\nn10397001\tpaperhanger, paperer\nn10397142\tpaperhanger\nn10397392\tpapoose, pappoose\nn10399130\tpardoner\nn10400003\tparetic\nn10400108\tparishioner\nn10400205\tpark commissioner\nn10400437\tParliamentarian, Member of Parliament\nn10400618\tparliamentary agent\nn10400998\tparodist, lampooner\nn10401204\tparricide\nn10401331\tparrot\nn10401639\tpartaker, sharer\nn10402709\tpart-timer\nn10402824\tparty\nn10403633\tparty man, party liner\nn10403876\tpassenger, rider\nn10404426\tpasser\nn10404998\tpaster\nn10405540\tpater\nn10405694\tpatient\nn10406266\tpatriarch\nn10406391\tpatriarch\nn10406765\tpatriarch, paterfamilias\nn10407310\tpatriot, nationalist\nn10407954\tpatron, sponsor, supporter\nn10408809\tpatternmaker\nn10409459\tpawnbroker\nn10409752\tpayer, remunerator\nn10410246\tpeacekeeper\nn10410996\tpeasant\nn10411356\tpedant, bookworm, scholastic\nn10411551\tpeddler, pedlar, packman, hawker, pitchman\nn10411867\tpederast, paederast, child molester\nn10414239\tpenologist\nn10414768\tpentathlete\nn10414865\tPentecostal, Pentecostalist\nn10415037\tpercussionist\nn10416567\tperiodontist\nn10417288\tpeshmerga\nn10417424\tpersonality\nn10417551\tpersonal representative\nn10417682\tpersonage\nn10417843\tpersona grata\nn10417969\tpersona non grata\nn10418101\tpersonification\nn10418735\tperspirer, sweater\nn10419047\tpervert, deviant, deviate, degenerate\nn10419472\tpessimist\nn10419630\tpest, blighter, cuss, pesterer, gadfly\nn10419785\tPeter Pan\nn10420031\tpetitioner, suppliant, supplicant, requester\nn10420277\tpetit juror, petty juror\nn10420507\tpet sitter, critter sitter\nn10420649\tpetter, fondler\nn10421016\tPharaoh, Pharaoh of Egypt\nn10421470\tpharmacist, druggist, chemist, apothecary, pill pusher, pill roller\nn10421956\tphilanthropist, altruist\nn10422405\tphilatelist, stamp 
collector\nn10425946\tphilosopher\nn10426454\tphonetician\nn10426630\tphonologist\nn10427223\tphotojournalist\nn10427359\tphotometrist, photometrician\nn10427764\tphysical therapist, physiotherapist\nn10428004\tphysicist\nn10431122\tpiano maker\nn10431625\tpicker, chooser, selector\nn10432189\tpicnicker, picknicker\nn10432441\tpilgrim\nn10432875\tpill\nn10432957\tpillar, mainstay\nn10433077\tpill head\nn10433452\tpilot\nn10433610\tPiltdown man, Piltdown hoax\nn10433737\tpimp, procurer, panderer, pander, pandar, fancy man, ponce\nn10435169\tpipe smoker\nn10435251\tpip-squeak, squirt, small fry\nn10435716\tpisser, urinator\nn10435988\tpitcher, hurler, twirler\nn10436334\tpitchman\nn10437014\tplaceman, placeseeker\nn10437137\tplacer miner\nn10437262\tplagiarist, plagiarizer, plagiariser, literary pirate, pirate\nn10437698\tplainsman\nn10438172\tplanner, contriver, deviser\nn10438619\tplanter, plantation owner\nn10438842\tplasterer\nn10439373\tplatinum blond, platinum blonde\nn10439523\tplatitudinarian\nn10439727\tplayboy, man-about-town, Corinthian\nn10439851\tplayer, participant\nn10441037\tplaymate, playfellow\nn10441124\tpleaser\nn10441694\tpledger\nn10441962\tplenipotentiary\nn10442093\tplier, plyer\nn10442232\tplodder, slowpoke, stick-in-the-mud, slowcoach\nn10442417\tplodder, slogger\nn10442573\tplotter, mapper\nn10443032\tplumber, pipe fitter\nn10443659\tpluralist\nn10443830\tpluralist\nn10444194\tpoet\nn10448322\tpointsman\nn10448455\tpoint woman\nn10449664\tpolicyholder\nn10450038\tpolitical prisoner\nn10450161\tpolitical scientist\nn10450303\tpolitician, politico, pol, political leader\nn10451450\tpolitician\nn10451590\tpollster, poll taker, headcounter, canvasser\nn10451858\tpolluter, defiler\nn10453184\tpool player\nn10455619\tportraitist, portrait painter, portrayer, limner\nn10456070\tposeuse\nn10456138\tpositivist, rationalist\nn10456696\tpostdoc, post doc\nn10457214\tposter girl\nn10457444\tpostulator\nn10457903\tprivate citizen\nn10458111\tproblem 
solver, solver, convergent thinker\nn10458356\tpro-lifer\nn10458596\tprosthetist\nn10459882\tpostulant\nn10460033\tpotboy, potman\nn10461060\tpoultryman, poulterer\nn10462588\tpower user\nn10462751\tpower worker, power-station worker\nn10462860\tpractitioner, practician\nn10464052\tprayer, supplicant\nn10464542\tpreceptor, don\nn10464711\tpredecessor\nn10464870\tpreemptor, pre-emptor\nn10465002\tpreemptor, pre-emptor\nn10465451\tpremature baby, preterm baby, premature infant, preterm infant, preemie, premie\nn10465831\tpresbyter\nn10466198\tpresenter, sponsor\nn10466564\tpresentist\nn10466918\tpreserver\nn10467179\tpresident\nn10467395\tPresident of the United States, United States President, President, Chief Executive\nn10468750\tpresident, prexy\nn10469611\tpress agent, publicity man, public relations man, PR man\nn10469874\tpress photographer\nn10470779\tpriest\nn10471640\tprima ballerina\nn10471732\tprima donna, diva\nn10471859\tprima donna\nn10472129\tprimigravida, gravida I\nn10472447\tprimordial dwarf, hypoplastic dwarf, true dwarf, normal dwarf\nn10473453\tprince charming\nn10473562\tprince consort\nn10473789\tprinceling\nn10473917\tPrince of Wales\nn10474064\tprincess\nn10474343\tprincess royal\nn10474446\tprincipal, dealer\nn10474645\tprincipal, school principal, head teacher, head\nn10475835\tprint seller\nn10475940\tprior\nn10476467\tprivate, buck private, common soldier\nn10477713\tprobationer, student nurse\nn10477955\tprocessor\nn10478118\tprocess-server\nn10478293\tproconsul\nn10478462\tproconsul\nn10478827\tproctologist\nn10478960\tproctor, monitor\nn10479135\tprocurator\nn10479328\tprocurer, securer\nn10481167\tprofit taker\nn10481268\tprogrammer, computer programmer, coder, software engineer\nn10482054\tpromiser, promisor\nn10482220\tpromoter, booster, plugger\nn10482587\tpromulgator\nn10482921\tpropagandist\nn10483138\tpropagator, disseminator\nn10483395\tproperty man, propman, property 
master\nn10483799\tprophetess\nn10483890\tprophet\nn10484858\tprosecutor, public prosecutor, prosecuting officer, prosecuting attorney\nn10485298\tprospector\nn10485883\tprotectionist\nn10486166\tprotegee\nn10486236\tprotozoologist\nn10486561\tprovost marshal\nn10487182\tpruner, trimmer\nn10487363\tpsalmist\nn10487592\tpsephologist\nn10488016\tpsychiatrist, head-shrinker, shrink\nn10488309\tpsychic\nn10488656\tpsycholinguist\nn10489426\tpsychophysicist\nn10490421\tpublican, tavern keeper\nn10491998\tpudge\nn10492086\tpuerpera\nn10492727\tpunching bag\nn10493199\tpunter\nn10493419\tpunter\nn10493685\tpuppeteer\nn10493835\tpuppy, pup\nn10493922\tpurchasing agent\nn10494195\tpuritan\nn10494373\tPuritan\nn10495167\tpursuer\nn10495421\tpusher, shover\nn10495555\tpusher, drug peddler, peddler, drug dealer, drug trafficker\nn10495756\tpusher, thruster\nn10496393\tputz\nn10496489\tPygmy, Pigmy\nn10497135\tqadi\nn10497534\tquadriplegic\nn10497645\tquadruplet, quad\nn10498046\tquaker, trembler\nn10498699\tquarter\nn10498816\tquarterback, signal caller, field general\nn10498986\tquartermaster\nn10499110\tquartermaster general\nn10499232\tQuebecois\nn10499355\tqueen, queen regnant, female monarch\nn10499631\tQueen of England\nn10499857\tqueen\nn10500217\tqueen\nn10500419\tqueen consort\nn10500603\tqueen mother\nn10500824\tQueen's Counsel\nn10500942\tquestion master, quizmaster\nn10501453\tquick study, sponge\nn10501635\tquietist\nn10502046\tquitter\nn10502329\trabbi\nn10502950\tracist, racialist\nn10503818\tradiobiologist\nn10504090\tradiologic technologist\nn10504206\tradiologist, radiotherapist\nn10505347\trainmaker\nn10505613\traiser\nn10505732\traja, rajah\nn10505942\trake, rakehell, profligate, rip, blood, roue\nn10506336\tramrod\nn10506544\tranch hand\nn10506915\tranker\nn10507070\tranter, raver\nn10507380\trape suspect\nn10507482\trapper\nn10507565\trapporteur\nn10507692\trare bird, rara avis\nn10508141\tratepayer\nn10508379\traw 
recruit\nn10508710\treader\nn10509063\treading teacher\nn10509161\trealist\nn10509810\treal estate broker, real estate agent, estate agent, land agent, house agent\nn10510245\trear admiral\nn10510974\treceiver\nn10511771\treciter\nn10512201\trecruit, enlistee\nn10512372\trecruit, military recruit\nn10512708\trecruiter\nn10512859\trecruiting-sergeant\nn10513509\tredcap\nn10513823\tredhead, redheader, red-header, carrottop\nn10513938\tredneck, cracker\nn10514051\treeler\nn10514121\treenactor\nn10514255\treferral\nn10514429\treferee, ref\nn10514784\trefiner\nn10515863\tReform Jew\nn10516527\tregistered nurse, RN\nn10517137\tregistrar\nn10517283\tRegius professor\nn10518349\treliever, allayer, comforter\nn10519126\tanchorite, hermit\nn10519494\treligious leader\nn10519984\tremover\nn10520286\tRenaissance man, generalist\nn10520544\trenegade\nn10520964\trentier\nn10521100\trepairman, maintenance man, service man\nn10521662\treporter, newsman, newsperson\nn10521853\tnewswoman\nn10522035\trepresentative\nn10522324\treprobate, miscreant\nn10522759\trescuer, recoverer, saver\nn10523341\treservist\nn10524076\tresident commissioner\nn10524223\trespecter\nn10524869\trestaurateur, restauranter\nn10525134\trestrainer, controller\nn10525436\tretailer, retail merchant\nn10525617\tretiree, retired person\nn10525878\treturning officer\nn10526534\trevenant\nn10527147\trevisionist\nn10527334\trevolutionist, revolutionary, subversive, subverter\nn10528023\trheumatologist\nn10528148\tRhodesian man, Homo rhodesiensis\nn10528493\trhymer, rhymester, versifier, poetizer, poetiser\nn10529231\trich person, wealthy person, have\nn10530150\trider\nn10530383\triding master\nn10530571\trifleman\nn10530959\tright-hander, right hander, righthander\nn10531109\tright-hand man, chief assistant, man Friday\nn10531445\tringer\nn10531838\tringleader\nn10533874\troadman, road mender\nn10533983\troarer, bawler, bellower, screamer, screecher, shouter, yeller\nn10536134\trocket engineer, rocket 
scientist\nn10536274\trocket scientist\nn10536416\trock star\nn10537708\tRomanov, Romanoff\nn10537906\tromanticist, romantic\nn10538629\tropemaker, rope-maker, roper\nn10538733\troper\nn10538853\troper\nn10539015\tropewalker, ropedancer\nn10539160\trosebud\nn10539278\tRosicrucian\nn10540114\tMountie\nn10540252\tRough Rider\nn10540656\troundhead\nn10541833\tcivil authority, civil officer\nn10542608\trunner\nn10542761\trunner\nn10542888\trunner\nn10543161\trunning back\nn10543937\trusher\nn10544232\trustic\nn10544748\tsaboteur, wrecker, diversionist\nn10545792\tsadist\nn10546428\tsailing master, navigator\nn10546633\tsailor, crewman\nn10548419\tsalesgirl, saleswoman, saleslady\nn10548537\tsalesman\nn10548681\tsalesperson, sales representative, sales rep\nn10549510\tsalvager, salvor\nn10550252\tsandwichman\nn10550369\tsangoma\nn10550468\tsannup\nn10551576\tsapper\nn10552393\tSassenach\nn10553140\tsatrap\nn10553235\tsaunterer, stroller, ambler\nn10554024\tSavoyard\nn10554141\tsawyer\nn10554846\tscalper\nn10555059\tscandalmonger\nn10555430\tscapegrace, black sheep\nn10556033\tscene painter\nn10556518\tschemer, plotter\nn10556704\tschizophrenic\nn10556825\tschlemiel, shlemiel\nn10557246\tschlockmeister, shlockmeister\nn10557854\tscholar, scholarly person, bookman, student\nn10559009\tscholiast\nn10559288\tschoolchild, school-age child, pupil\nn10559508\tschoolfriend\nn10559683\tSchoolman, medieval Schoolman\nn10559996\tschoolmaster\nn10560106\tschoolmate, classmate, schoolfellow, class fellow\nn10560637\tscientist\nn10561222\tscion\nn10561320\tscoffer, flouter, mocker, jeerer\nn10561736\tscofflaw\nn10562135\tscorekeeper, scorer\nn10562283\tscorer\nn10562509\tscourer\nn10562968\tscout, talent scout\nn10563314\tscoutmaster\nn10563403\tscrambler\nn10563711\tscratcher\nn10564098\tscreen actor, movie actor\nn10565502\tscrutineer, canvasser\nn10565667\tscuba diver\nn10566072\tsculptor, sculpturer, carver, statue maker\nn10567613\tSea Scout\nn10567722\tseasonal worker, 
seasonal\nn10567848\tseasoner\nn10568200\tsecond baseman, second sacker\nn10568358\tsecond cousin\nn10568443\tseconder\nn10568608\tsecond fiddle, second banana\nn10568915\tsecond-in-command\nn10569011\tsecond lieutenant, 2nd lieutenant\nn10569179\tsecond-rater, mediocrity\nn10570019\tsecretary\nn10570704\tSecretary of Agriculture, Agriculture Secretary\nn10571907\tSecretary of Health and Human Services\nn10572706\tSecretary of State\nn10572889\tSecretary of the Interior, Interior Secretary\nn10573957\tsectarian, sectary, sectarist\nn10574311\tsection hand\nn10574538\tsecularist\nn10574840\tsecurity consultant\nn10575463\tseeded player, seed\nn10575594\tseeder, cloud seeder\nn10575787\tseeker, searcher, quester\nn10576223\tsegregate\nn10576316\tsegregator, segregationist\nn10576676\tselectman\nn10576818\tselectwoman\nn10576962\tselfish person\nn10577182\tself-starter\nn10577284\tseller, marketer, vender, vendor, trafficker\nn10577710\tselling agent\nn10577820\tsemanticist, semiotician\nn10578021\tsemifinalist\nn10578162\tseminarian, seminarist\nn10578471\tsenator\nn10578656\tsendee\nn10579062\tsenior\nn10579549\tsenior vice president\nn10580030\tseparatist, separationist\nn10580437\tseptuagenarian\nn10580535\tserf, helot, villein\nn10581648\tspree killer\nn10581890\tserjeant-at-law, serjeant, sergeant-at-law, sergeant\nn10582604\tserver\nn10582746\tserviceman, military man, man, military personnel\nn10583387\tsettler, colonist\nn10583790\tsettler\nn10585077\tsex symbol\nn10585217\tsexton, sacristan\nn10585628\tshaheed\nn10586166\tShakespearian, Shakespearean\nn10586265\tshanghaier, seizer\nn10586444\tsharecropper, cropper, sharecrop farmer\nn10586903\tshaver\nn10586998\tShavian\nn10588074\tsheep\nn10588357\tsheik, tribal sheik, sheikh, tribal sheikh, Arab chief\nn10588724\tshelver\nn10588965\tshepherd\nn10589666\tship-breaker\nn10590146\tshipmate\nn10590239\tshipowner\nn10590452\tshipping 
agent\nn10590903\tshirtmaker\nn10591072\tshogun\nn10591811\tshopaholic\nn10592049\tshop girl\nn10592811\tshop steward, steward\nn10593521\tshot putter\nn10594147\tshrew, termagant\nn10594523\tshuffler\nn10594857\tshyster, pettifogger\nn10595164\tsibling, sib\nn10595647\tsick person, diseased person, sufferer\nn10596517\tsightreader\nn10596899\tsignaler, signaller\nn10597505\tsigner\nn10597745\tsignor, signior\nn10597889\tsignora\nn10598013\tsignore\nn10598181\tsignorina\nn10598459\tsilent partner, sleeping partner\nn10598904\taddle-head, addlehead, loon, birdbrain\nn10599215\tsimperer\nn10599806\tsinger, vocalist, vocalizer, vocaliser\nn10601234\tSinologist\nn10601362\tsipper\nn10602119\tsirrah\nn10602470\tSister\nn10602985\tsister, sis\nn10603528\twaverer, vacillator, hesitator, hesitater\nn10603851\tsitar player\nn10604275\tsixth-former\nn10604380\tskateboarder\nn10604634\tskeptic, sceptic, doubter\nn10604880\tsketcher\nn10604979\tskidder\nn10605253\tskier\nn10605737\tskinny-dipper\nn10607291\tskin-diver, aquanaut\nn10607478\tskinhead\nn10609092\tslasher\nn10609198\tslattern, slut, slovenly woman, trollop\nn10610465\tsleeper, slumberer\nn10610850\tsleeper\nn10611267\tsleeping beauty\nn10611613\tsleuth, sleuthhound\nn10612210\tslob, sloven, pig, slovenly person\nn10612373\tsloganeer\nn10612518\tslopseller, slop-seller\nn10613996\tsmasher, stunner, knockout, beauty, ravisher, sweetheart, peach, lulu, looker, mantrap, dish\nn10614507\tsmirker\nn10614629\tsmith, metalworker\nn10615179\tsmoothie, smoothy, sweet talker, charmer\nn10615334\tsmuggler, runner, contrabandist, moon curser, moon-curser\nn10616578\tsneezer\nn10617024\tsnob, prig, snot, snoot\nn10617193\tsnoop, snooper\nn10617397\tsnorer\nn10618234\tsob sister\nn10618342\tsoccer player\nn10618465\tsocial anthropologist, cultural anthropologist\nn10618685\tsocial climber, climber\nn10618848\tsocialist\nn10619492\tsocializer, socialiser\nn10619642\tsocial scientist\nn10619888\tsocial 
secretary\nn10620212\tSocinian\nn10620586\tsociolinguist\nn10620758\tsociologist\nn10621294\tsoda jerk, soda jerker\nn10621400\tsodalist\nn10621514\tsodomite, sodomist, sod, bugger\nn10622053\tsoldier\nn10624074\tson, boy\nn10624310\tsongster\nn10624437\tsongstress\nn10624540\tsongwriter, songster, ballad maker\nn10625860\tsorcerer, magician, wizard, necromancer, thaumaturge, thaumaturgist\nn10626630\tsorehead\nn10627252\tsoul mate\nn10628097\tSouthern Baptist\nn10628644\tsovereign, crowned head, monarch\nn10629329\tspacewalker\nn10629647\tSpanish American, Hispanic American, Hispanic\nn10629939\tsparring partner, sparring mate\nn10630093\tspastic\nn10630188\tspeaker, talker, utterer, verbalizer, verbaliser\nn10631131\tnative speaker\nn10631309\tSpeaker\nn10631654\tspeechwriter\nn10632576\tspecialist, medical specialist\nn10633298\tspecifier\nn10633450\tspectator, witness, viewer, watcher, looker\nn10634464\tspeech therapist\nn10634849\tspeedskater, speed skater\nn10634990\tspellbinder\nn10635788\tsphinx\nn10636488\tspinster, old maid\nn10637483\tsplit end\nn10638922\tsport, sportsman, sportswoman\nn10639238\tsport, summercater\nn10639359\tsporting man, outdoor man\nn10639637\tsports announcer, sportscaster, sports commentator\nn10639817\tsports editor\nn10641223\tsprog\nn10642596\tsquare dancer\nn10642705\tsquare shooter, straight shooter, straight arrow\nn10643095\tsquatter\nn10643837\tsquire\nn10643937\tsquire\nn10644598\tstaff member, staffer\nn10645017\tstaff sergeant\nn10645223\tstage director\nn10646032\tstainer\nn10646140\tstakeholder\nn10646433\tstalker\nn10646641\tstalking-horse\nn10646780\tstammerer, stutterer\nn10646942\tstamper, stomper, tramper, trampler\nn10647745\tstandee\nn10648237\tstand-in, substitute, relief, reliever, backup, backup man, fill-in\nn10648696\tstar, principal, lead\nn10649197\tstarlet\nn10649308\tstarter, dispatcher\nn10650162\tstatesman, solon, national leader\nn10652605\tstate treasurer\nn10652703\tstationer, stationery 
seller\nn10654015\tstenographer, amanuensis, shorthand typist\nn10654211\tstentor\nn10654321\tstepbrother, half-brother, half brother\nn10654827\tstepmother\nn10654932\tstepparent\nn10655169\tstevedore, loader, longshoreman, docker, dockhand, dock worker, dockworker, dock-walloper, lumper\nn10655442\tsteward\nn10655594\tsteward, flight attendant\nn10655730\tsteward\nn10655986\tstickler\nn10656120\tstiff\nn10656223\tstifler, smotherer\nn10656969\tstipendiary, stipendiary magistrate\nn10657306\tstitcher\nn10657556\tstockjobber\nn10657835\tstock trader\nn10658304\tstockist\nn10659042\tstoker, fireman\nn10659762\tstooper\nn10660128\tstore detective\nn10660621\tstrafer\nn10660883\tstraight man, second banana\nn10661002\tstranger, alien, unknown\nn10661216\tstranger\nn10661563\tstrategist, strategian\nn10661732\tstraw boss, assistant foreman\nn10663315\tstreetwalker, street girl, hooker, hustler, floozy, floozie, slattern\nn10663549\tstretcher-bearer, litter-bearer\nn10665302\tstruggler\nn10665587\tstud, he-man, macho-man\nn10665698\tstudent, pupil, educatee\nn10666752\tstumblebum, palooka\nn10667477\tstylist\nn10667709\tsubaltern\nn10667863\tsubcontractor\nn10668450\tsubduer, surmounter, overcomer\nn10668666\tsubject, case, guinea pig\nn10669991\tsubordinate, subsidiary, underling, foot soldier\nn10671042\tsubstitute, reserve, second-stringer\nn10671613\tsuccessor, heir\nn10671736\tsuccessor, replacement\nn10671898\tsuccorer, succourer\nn10672371\tSufi\nn10672540\tsuffragan, suffragan bishop\nn10672662\tsuffragette\nn10673296\tsugar daddy\nn10673776\tsuicide bomber\nn10674130\tsuitor, suer, wooer\nn10674713\tsumo wrestler\nn10675010\tsunbather\nn10675142\tsundowner\nn10675609\tsuper heavyweight\nn10676018\tsuperior, higher-up, superordinate\nn10676434\tsupermom\nn10676569\tsupernumerary, spear carrier, extra\nn10678937\tsupremo\nn10679174\tsurgeon, operating surgeon, sawbones\nn10679503\tSurgeon General\nn10679610\tSurgeon 
General\nn10679723\tsurpriser\nn10680609\tsurveyor\nn10680796\tsurveyor\nn10681194\tsurvivor, subsister\nn10681557\tsutler, victualer, victualler, provisioner\nn10682713\tsweeper\nn10682953\tsweetheart, sweetie, steady, truelove\nn10683675\tswinger, tramp\nn10684146\tswitcher, whipper\nn10684630\tswot, grind, nerd, wonk, dweeb\nn10684827\tsycophant, toady, crawler, lackey, ass-kisser\nn10685398\tsylph\nn10686073\tsympathizer, sympathiser, well-wisher\nn10686517\tsymphonist\nn10686694\tsyncopator\nn10686885\tsyndic\nn10688356\ttactician\nn10688811\ttagger\nn10689306\ttailback\nn10690268\ttallyman, tally clerk\nn10690421\ttallyman\nn10690648\ttanker, tank driver\nn10691318\ttapper, wiretapper, phone tapper\nn10691937\tTartuffe, Tartufe\nn10692090\tTarzan\nn10692482\ttaster, taste tester, taste-tester, sampler\nn10692883\ttax assessor, assessor\nn10693235\ttaxer\nn10693334\ttaxi dancer\nn10693824\ttaxonomist, taxonomer, systematist\nn10694258\tteacher, instructor\nn10694939\tteaching fellow\nn10695450\ttearaway\nn10696101\ttechnical sergeant\nn10696508\ttechnician\nn10697135\tTed, Teddy boy\nn10697282\tteetotaler, teetotaller, teetotalist\nn10698368\ttelevision reporter, television newscaster, TV reporter, TV newsman\nn10699558\ttemporizer, temporiser\nn10699752\ttempter\nn10699981\tterm infant\nn10700105\ttoiler\nn10700201\ttenant, renter\nn10700640\ttenant\nn10700963\ttenderfoot\nn10701180\ttennis player\nn10701644\ttennis pro, professional tennis player\nn10701962\ttenor saxophonist, tenorist\nn10702167\ttermer\nn10702615\tterror, scourge, threat\nn10703221\ttertigravida, gravida III\nn10703336\ttestator, testate\nn10703480\ttestatrix\nn10703692\ttestee, examinee\nn10704238\ttest-tube baby\nn10704712\tTexas Ranger, Ranger\nn10704886\tthane\nn10705448\ttheatrical producer\nn10705615\ttheologian, theologist, theologizer, theologiser\nn10706812\ttheorist, theoretician, theorizer, theoriser, idealogue\nn10707134\ttheosophist\nn10707233\ttherapist, 
healer\nn10707707\tThessalonian\nn10708292\tthinker, creative thinker, mind\nn10708454\tthinker\nn10709529\tthrower\nn10710171\tthurifer\nn10710259\tticket collector, ticket taker\nn10710778\ttight end\nn10710913\ttiler\nn10711483\ttimekeeper, timer\nn10711766\tTimorese\nn10712229\ttinkerer, fiddler\nn10712374\ttinsmith, tinner\nn10712474\ttinter\nn10712690\ttippler, social drinker\nn10712835\ttipster, tout\nn10713254\tT-man\nn10713686\ttoastmaster, symposiarch\nn10713843\ttoast mistress\nn10714195\ttobogganist\nn10715030\ttomboy, romp, hoyden\nn10715347\ttoolmaker\nn10715789\ttorchbearer\nn10716576\tTory\nn10716864\tTory\nn10717055\ttosser\nn10717196\ttosser, jerk-off, wanker\nn10717337\ttotalitarian\nn10718131\ttourist, tourer, holidaymaker\nn10718349\ttout, touter\nn10718509\ttout, ticket tout\nn10718665\ttovarich, tovarisch\nn10718952\ttowhead\nn10719036\ttown clerk\nn10719132\ttown crier, crier\nn10719267\ttownsman, towner\nn10719807\ttoxicologist\nn10720197\ttrack star\nn10720453\ttrader, bargainer, dealer, monger\nn10720964\ttrade unionist, unionist, union member\nn10721124\ttraditionalist, diehard\nn10721321\ttraffic cop\nn10721612\ttragedian\nn10721708\ttragedian\nn10721819\ttragedienne\nn10722029\ttrail boss\nn10722575\ttrainer\nn10722965\ttraitor, treasonist\nn10723230\ttraitress\nn10723597\ttransactor\nn10724132\ttranscriber\nn10724372\ttransfer, transferee\nn10724570\ttransferee\nn10725280\ttranslator, transcriber\nn10726031\ttransvestite, cross-dresser\nn10726786\ttraveling salesman, travelling salesman, commercial traveler, commercial traveller, roadman, bagman\nn10727016\ttraverser\nn10727171\ttrawler\nn10727458\tTreasury, First Lord of the Treasury\nn10728117\ttrencher\nn10728233\ttrend-setter, taste-maker, fashion arbiter\nn10728624\ttribesman\nn10728998\ttrier, attempter, essayer\nn10729330\ttrifler\nn10730542\ttrooper\nn10730728\ttrooper, state trooper\nn10731013\tTrotskyite, Trotskyist, Trot\nn10731732\ttruant, hooky 
player\nn10732010\ttrumpeter, cornetist\nn10732521\ttrusty\nn10732854\tTudor\nn10732967\ttumbler\nn10733820\ttutee\nn10734394\ttwin\nn10734741\ttwo-timer\nn10734891\tTyke\nn10734963\ttympanist, timpanist\nn10735173\ttypist\nn10735298\ttyrant, autocrat, despot\nn10735984\tumpire, ump\nn10737103\tunderstudy, standby\nn10737264\tundesirable\nn10738111\tunicyclist\nn10738215\tunilateralist\nn10738670\tUnitarian\nn10738871\tArminian\nn10739135\tuniversal donor\nn10739297\tUNIX guru\nn10739391\tUnknown Soldier\nn10740594\tupsetter\nn10740732\tupstager\nn10740868\tupstart, parvenu, nouveau-riche, arriviste\nn10741152\tupstart\nn10741367\turchin\nn10741493\turologist\nn10742005\tusherette\nn10742111\tusher, doorkeeper\nn10742546\tusurper, supplanter\nn10742997\tutility man\nn10743124\tutilizer, utiliser\nn10743356\tUtopian\nn10744078\tuxoricide\nn10744164\tvacationer, vacationist\nn10745006\tvaledictorian, valedictory speaker\nn10745770\tvalley girl\nn10746931\tvaulter, pole vaulter, pole jumper\nn10747119\tvegetarian\nn10747424\tvegan\nn10747548\tvenerator\nn10747965\tventure capitalist\nn10748142\tventurer, merchant-venturer\nn10748506\tvermin, varmint\nn10748620\tvery important person, VIP, high-up, dignitary, panjandrum, high muckamuck\nn10749928\tvibist, vibraphonist\nn10750031\tvicar\nn10750188\tvicar\nn10750640\tvicar-general\nn10751026\tvice chancellor\nn10751152\tvicegerent\nn10751265\tvice president, V.P.\nn10751710\tvice-regent\nn10752480\tvictim, dupe\nn10753061\tVictorian\nn10753182\tvictualer, victualler\nn10753339\tvigilante, vigilance man\nn10753442\tvillager\nn10753989\tvintager\nn10754189\tvintner, wine merchant\nn10754281\tviolator, debaucher, ravisher\nn10754449\tviolator, lawbreaker, law offender\nn10755080\tviolist\nn10755164\tvirago\nn10755394\tvirologist\nn10755648\tVisayan, Bisayan\nn10756061\tviscountess\nn10756148\tviscount\nn10756261\tVisigoth\nn10756641\tvisionary\nn10756837\tvisiting fireman\nn10757050\tvisiting 
professor\nn10757492\tvisualizer, visualiser\nn10758337\tvixen, harpy, hellcat\nn10758445\tvizier\nn10758949\tvoicer\nn10759151\tvolunteer, unpaid worker\nn10759331\tvolunteer, military volunteer, voluntary\nn10759982\tvotary\nn10760199\tvotary\nn10760622\tvouchee\nn10760951\tvower\nn10761190\tvoyager\nn10761326\tvoyeur, Peeping Tom, peeper\nn10761519\tvulcanizer, vulcaniser\nn10762212\twaffler\nn10762480\tWagnerian\nn10763075\twaif, street child\nn10763245\twailer\nn10763383\twaiter, server\nn10763620\twaitress\nn10764465\twalking delegate\nn10764622\twalk-on\nn10764719\twallah\nn10765305\twally\nn10765587\twaltzer\nn10765679\twanderer, roamer, rover, bird of passage\nn10765885\tWandering Jew\nn10766260\twanton\nn10768148\twarrantee\nn10768272\twarrantee\nn10768903\twasher\nn10769084\twasherman, laundryman\nn10769188\twashwoman, washerwoman, laundrywoman, laundress\nn10769321\twassailer, carouser\nn10769459\twastrel, waster\nn10771066\tWave\nn10772092\tweatherman, weather forecaster\nn10772580\tweekend warrior\nn10772937\tweeder\nn10773665\twelder\nn10773800\twelfare case, charity case\nn10774329\twesterner\nn10774756\tWest-sider\nn10775003\twetter\nn10775128\twhaler\nn10776052\tWhig\nn10776339\twhiner, complainer, moaner, sniveller, crybaby, bellyacher, grumbler, squawker\nn10776887\twhipper-in\nn10777299\twhisperer\nn10778044\twhiteface\nn10778148\tCarmelite, White Friar\nn10778711\tAugustinian\nn10778999\twhite hope, great white hope\nn10779610\twhite supremacist\nn10779897\twhoremaster, whoremonger\nn10779995\twhoremaster, whoremonger, john, trick\nn10780284\twidow, widow woman\nn10780632\twife, married woman\nn10781236\twiggler, wriggler, squirmer\nn10781817\twimp, chicken, crybaby\nn10782362\twing commander\nn10782471\twinger\nn10782791\twinner\nn10782940\twinner, victor\nn10783240\twindow dresser, window trimmer\nn10783539\twinker\nn10783646\twiper\nn10783734\twireman, wirer\nn10784113\twise guy, smart aleck, wiseacre, wisenheimer, 
weisenheimer\nn10784544\twitch doctor\nn10784922\twithdrawer\nn10785480\twithdrawer\nn10787470\twoman, adult female\nn10788852\twoman\nn10789415\twonder boy, golden boy\nn10789709\twonderer\nn10791115\tworking girl\nn10791221\tworkman, workingman, working man, working person\nn10791820\tworkmate\nn10791890\tworldling\nn10792335\tworshiper, worshipper\nn10792506\tworthy\nn10792856\twrecker\nn10793570\twright\nn10793799\twrite-in candidate, write-in\nn10794014\twriter, author\nn10801561\tWykehamist\nn10801802\tyakuza\nn10802507\tyard bird, yardbird\nn10802621\tyardie\nn10802953\tyardman\nn10803031\tyardmaster, trainmaster, train dispatcher\nn10803282\tyenta\nn10803978\tyogi\nn10804287\tyoung buck, young man\nn10804636\tyoung Turk\nn10804732\tYoung Turk\nn10805501\tZionist\nn10806113\tzoo keeper\nn10994097\tGenet, Edmund Charles Edouard Genet, Citizen Genet\nn11100798\tKennan, George F. Kennan, George Frost Kennan\nn11196627\tMunro, H. H. Munro, Hector Hugh Munro, Saki\nn11242849\tPopper, Karl Popper, Sir Karl Raimund Popper\nn11318824\tStoker, Bram Stoker, Abraham Stoker\nn11346873\tTownes, Charles Townes, Charles Hard Townes\nn11448153\tdust storm, duster, sandstorm, sirocco\nn11487732\tparhelion, mock sun, sundog\nn11508382\tsnow, snowfall\nn11511327\tfacula\nn11524451\twave\nn11530008\tmicroflora\nn11531193\twilding\nn11531334\tsemi-climber\nn11532682\tvolva\nn11533212\tbasidiocarp\nn11533999\tdomatium\nn11536567\tapomict\nn11536673\taquatic\nn11537327\tbryophyte, nonvascular plant\nn11539289\tacrocarp, acrocarpous moss\nn11542137\tsphagnum, sphagnum moss, peat moss, bog moss\nn11542640\tliverwort, hepatic\nn11544015\thepatica, Marchantia polymorpha\nn11545350\tpecopteris\nn11545524\tpteridophyte, nonflowering plant\nn11545714\tfern\nn11547562\tfern ally\nn11547855\tspore\nn11548728\tcarpospore\nn11548870\tchlamydospore\nn11549009\tconidium, 
conidiospore\nn11549245\toospore\nn11549779\ttetraspore\nn11549895\tzoospore\nn11552133\tcryptogam\nn11552386\tspermatophyte, phanerogam, seed plant\nn11552594\tseedling\nn11552806\tannual\nn11552976\tbiennial\nn11553240\tperennial\nn11553522\thygrophyte\nn11596108\tgymnosperm\nn11597657\tgnetum, Gnetum gnemon\nn11598287\tCatha edulis\nn11598686\tephedra, joint fir\nn11598886\tmahuang, Ephedra sinica\nn11599324\twelwitschia, Welwitschia mirabilis\nn11600372\tcycad\nn11601177\tsago palm, Cycas revoluta\nn11601333\tfalse sago, fern palm, Cycas circinalis\nn11601918\tzamia\nn11602091\tcoontie, Florida arrowroot, Seminole bread, Zamia pumila\nn11602478\tceratozamia\nn11602873\tdioon\nn11603246\tencephalartos\nn11603462\tkaffir bread, Encephalartos caffer\nn11603835\tmacrozamia\nn11604046\tburrawong, Macrozamia communis, Macrozamia spiralis\nn11608250\tpine, pine tree, true pine\nn11609475\tpinon, pinyon\nn11609684\tnut pine\nn11609862\tpinon pine, Mexican nut pine, Pinus cembroides\nn11610047\tRocky mountain pinon, Pinus edulis\nn11610215\tsingle-leaf, single-leaf pine, single-leaf pinyon, Pinus monophylla\nn11610437\tbishop pine, bishop's pine, Pinus muricata\nn11610602\tCalifornia single-leaf pinyon, Pinus californiarum\nn11610823\tParry's pinyon, Pinus quadrifolia, Pinus parryana\nn11611087\tspruce pine, Pinus glabra\nn11611233\tblack pine, Pinus nigra\nn11611356\tpitch pine, northern pitch pine, Pinus rigida\nn11611561\tpond pine, Pinus serotina\nn11611758\tstone pine, umbrella pine, European nut pine, Pinus pinea\nn11612018\tSwiss pine, Swiss stone pine, arolla pine, cembra nut tree, Pinus cembra\nn11612235\tcembra nut, cedar nut\nn11612349\tSwiss mountain pine, mountain pine, dwarf mountain pine, mugho pine, mugo pine, Pinus mugo\nn11612575\tancient pine, Pinus longaeva\nn11612923\twhite pine\nn11613219\tAmerican white pine, eastern white pine, weymouth pine, Pinus strobus\nn11613459\twestern white pine, silver pine, mountain pine, Pinus 
monticola\nn11613692\tsouthwestern white pine, Pinus strobiformis\nn11613867\tlimber pine, Pinus flexilis\nn11614039\twhitebark pine, whitebarked pine, Pinus albicaulis\nn11614250\tyellow pine\nn11614420\tponderosa, ponderosa pine, western yellow pine, bull pine, Pinus ponderosa\nn11614713\tJeffrey pine, Jeffrey's pine, black pine, Pinus jeffreyi\nn11615026\tshore pine, lodgepole, lodgepole pine, spruce pine, Pinus contorta\nn11615259\tSierra lodgepole pine, Pinus contorta murrayana\nn11615387\tloblolly pine, frankincense pine, Pinus taeda\nn11615607\tjack pine, Pinus banksiana\nn11615812\tswamp pine\nn11615967\tlongleaf pine, pitch pine, southern yellow pine, Georgia pine, Pinus palustris\nn11616260\tshortleaf pine, short-leaf pine, shortleaf yellow pine, Pinus echinata\nn11616486\tred pine, Canadian red pine, Pinus resinosa\nn11616662\tScotch pine, Scots pine, Scotch fir, Pinus sylvestris\nn11616852\tscrub pine, Virginia pine, Jersey pine, Pinus virginiana\nn11617090\tMonterey pine, Pinus radiata\nn11617272\tbristlecone pine, Rocky Mountain bristlecone pine, Pinus aristata\nn11617631\ttable-mountain pine, prickly pine, hickory pine, Pinus pungens\nn11617878\tknobcone pine, Pinus attenuata\nn11618079\tJapanese red pine, Japanese table pine, Pinus densiflora\nn11618290\tJapanese black pine, black pine, Pinus thunbergii\nn11618525\tTorrey pine, Torrey's pine, soledad pine, grey-leaf pine, sabine pine, Pinus torreyana\nn11618861\tlarch, larch tree\nn11619227\tAmerican larch, tamarack, black larch, Larix laricina\nn11619455\twestern larch, western tamarack, Oregon larch, Larix occidentalis\nn11619687\tsubalpine larch, Larix lyallii\nn11619845\tEuropean larch, Larix decidua\nn11620016\tSiberian larch, Larix siberica, Larix russica\nn11620389\tgolden larch, Pseudolarix amabilis\nn11620673\tfir, fir tree, true fir\nn11621029\tsilver fir\nn11621281\tamabilis fir, white fir, Pacific silver fir, red silver fir, Christmas tree, Abies amabilis\nn11621547\tEuropean silver fir, 
Christmas tree, Abies alba\nn11621727\twhite fir, Colorado fir, California white fir, Abies concolor, Abies lowiana\nn11621950\tbalsam fir, balm of Gilead, Canada balsam, Abies balsamea\nn11622184\tFraser fir, Abies fraseri\nn11622368\tlowland fir, lowland white fir, giant fir, grand fir, Abies grandis\nn11622591\tAlpine fir, subalpine fir, Abies lasiocarpa\nn11622771\tSanta Lucia fir, bristlecone fir, Abies bracteata, Abies venusta\nn11623105\tcedar, cedar tree, true cedar\nn11623815\tcedar of Lebanon, Cedrus libani\nn11623967\tdeodar, deodar cedar, Himalayan cedar, Cedrus deodara\nn11624192\tAtlas cedar, Cedrus atlantica\nn11624531\tspruce\nn11625003\tNorway spruce, Picea abies\nn11625223\tweeping spruce, Brewer's spruce, Picea breweriana\nn11625391\tEngelmann spruce, Engelmann's spruce, Picea engelmannii\nn11625632\twhite spruce, Picea glauca\nn11625804\tblack spruce, Picea mariana, spruce pine\nn11626010\tSiberian spruce, Picea obovata\nn11626152\tSitka spruce, Picea sitchensis\nn11626409\toriental spruce, Picea orientalis\nn11626585\tColorado spruce, Colorado blue spruce, silver spruce, Picea pungens\nn11626826\tred spruce, eastern spruce, yellow spruce, Picea rubens\nn11627168\themlock, hemlock tree\nn11627512\teastern hemlock, Canadian hemlock, spruce pine, Tsuga canadensis\nn11627714\tCarolina hemlock, Tsuga caroliniana\nn11627908\tmountain hemlock, black hemlock, Tsuga mertensiana\nn11628087\twestern hemlock, Pacific hemlock, west coast hemlock, Tsuga heterophylla\nn11628456\tdouglas fir\nn11628793\tgreen douglas fir, douglas spruce, douglas pine, douglas hemlock, Oregon fir, Oregon pine, Pseudotsuga menziesii\nn11629047\tbig-cone spruce, big-cone douglas fir, Pseudotsuga macrocarpa\nn11629354\tCathaya\nn11630017\tcedar, cedar tree\nn11630489\tcypress, cypress tree\nn11631159\tgowen cypress, Cupressus goveniana\nn11631405\tpygmy cypress, Cupressus pigmaea, Cupressus goveniana pigmaea\nn11631619\tSanta Cruz cypress, Cupressus abramsiana, Cupressus goveniana 
abramsiana\nn11631854\tArizona cypress, Cupressus arizonica\nn11631985\tGuadalupe cypress, Cupressus guadalupensis\nn11632167\tMonterey cypress, Cupressus macrocarpa\nn11632376\tMexican cypress, cedar of Goa, Portuguese cypress, Cupressus lusitanica\nn11632619\tItalian cypress, Mediterranean cypress, Cupressus sempervirens\nn11632929\tKing William pine, Athrotaxis selaginoides\nn11633284\tChilean cedar, Austrocedrus chilensis\nn11634736\tincense cedar, red cedar, Calocedrus decurrens, Libocedrus decurrens\nn11635152\tsouthern white cedar, coast white cedar, Atlantic white cedar, white cypress, white cedar, Chamaecyparis thyoides\nn11635433\tOregon cedar, Port Orford cedar, Lawson's cypress, Lawson's cedar, Chamaecyparis lawsoniana\nn11635830\tyellow cypress, yellow cedar, Nootka cypress, Alaska cedar, Chamaecyparis nootkatensis\nn11636204\tJapanese cedar, Japan cedar, sugi, Cryptomeria japonica\nn11636835\tjuniper berry\nn11639084\tincense cedar\nn11639306\tkawaka, Libocedrus plumosa\nn11639445\tpahautea, Libocedrus bidwillii, mountain pine\nn11640132\tmetasequoia, dawn redwood, Metasequoia glyptostrodoides\nn11643835\tarborvitae\nn11644046\twestern red cedar, red cedar, canoe cedar, Thuja plicata\nn11644226\tAmerican arborvitae, northern white cedar, white cedar, Thuja occidentalis\nn11644462\tOriental arborvitae, Thuja orientalis, Platycladus orientalis\nn11644872\thiba arborvitae, Thujopsis dolobrata\nn11645163\tketeleeria\nn11645590\tWollemi pine\nn11645914\taraucaria\nn11646167\tmonkey puzzle, chile pine, Araucaria araucana\nn11646344\tnorfolk island pine, Araucaria heterophylla, Araucaria excelsa\nn11646517\tnew caledonian pine, Araucaria columnaris\nn11646694\tbunya bunya, bunya bunya tree, Araucaria bidwillii\nn11646955\thoop pine, Moreton Bay pine, Araucaria cunninghamii\nn11647306\tkauri pine, dammar pine\nn11647703\tkauri, kaury, Agathis australis\nn11647868\tamboina pine, amboyna pine, Agathis dammara, Agathis alba\nn11648039\tdundathu pine, queensland 
kauri, smooth bark kauri, Agathis robusta\nn11648268\tred kauri, Agathis lanceolata\nn11648776\tplum-yew\nn11649150\tCalifornia nutmeg, nutmeg-yew, Torreya californica\nn11649359\tstinking cedar, stinking yew, Torrey tree, Torreya taxifolia\nn11649878\tcelery pine\nn11650160\tcelery top pine, celery-topped pine, Phyllocladus asplenifolius\nn11650307\ttanekaha, Phyllocladus trichomanoides\nn11650430\tAlpine celery pine, Phyllocladus alpinus\nn11650558\tyellowwood, yellowwood tree\nn11650759\tgymnospermous yellowwood\nn11652039\tpodocarp\nn11652217\tyacca, yacca podocarp, Podocarpus coriaceus\nn11652376\tbrown pine, Rockingham podocarp, Podocarpus elatus\nn11652578\tcape yellowwood, African yellowwood, Podocarpus elongatus\nn11652753\tSouth-African yellowwood, Podocarpus latifolius\nn11652966\talpine totara, Podocarpus nivalis\nn11653126\ttotara, Podocarpus totara\nn11653570\tcommon yellowwood, bastard yellowwood, Afrocarpus falcata\nn11653904\tkahikatea, New Zealand Dacryberry, New Zealand white pine, Dacrycarpus dacrydioides, Podocarpus dacrydioides\nn11654293\trimu, imou pine, red pine, Dacrydium cupressinum\nn11654438\ttarwood, tar-wood, Dacrydium colensoi\nn11654984\tcommon sickle pine, Falcatifolium falciforme\nn11655152\tyellow-leaf sickle pine, Falcatifolium taxoides\nn11655592\ttarwood, tar-wood, New Zealand mountain pine, Halocarpus bidwilli, Dacrydium bidwilli\nn11655974\twestland pine, silver pine, Lagarostrobus colensoi\nn11656123\thuon pine, Lagarostrobus franklinii, Dacrydium franklinii\nn11656549\tChilean rimu, Lepidothamnus fonkii\nn11656771\tmountain rimu, Lepidothamnus laxifolius, Dacridium laxifolius\nn11657585\tnagi, Nageia nagi\nn11658331\tmiro, black pine, Prumnopitys ferruginea, Podocarpus ferruginea\nn11658544\tmatai, black pine, Prumnopitys taxifolia, Podocarpus spicata\nn11658709\tplum-fruited yew, Prumnopitys andina, Prumnopitys elegans\nn11659248\tPrince Albert yew, Prince Albert's yew, Saxe-gothea conspicua\nn11659627\tSundacarpus amara, 
Prumnopitys amara, Podocarpus amara\nn11660300\tJapanese umbrella pine, Sciadopitys verticillata\nn11661372\tyew\nn11661909\tOld World yew, English yew, Taxus baccata\nn11662128\tPacific yew, California yew, western yew, Taxus brevifolia\nn11662371\tJapanese yew, Taxus cuspidata\nn11662585\tFlorida yew, Taxus floridana\nn11662937\tNew Caledonian yew, Austrotaxus spicata\nn11663263\twhite-berry yew, Pseudotaxus chienii\nn11664418\tginkgo, gingko, maidenhair tree, Ginkgo biloba\nn11665372\tangiosperm, flowering plant\nn11666854\tdicot, dicotyledon, magnoliopsid, exogen\nn11668117\tmonocot, monocotyledon, liliopsid, endogen\nn11669786\tfloret, floweret\nn11669921\tflower\nn11672269\tbloomer\nn11672400\twildflower, wild flower\nn11674019\tapetalous flower\nn11674332\tinflorescence\nn11675025\trosebud\nn11675404\tgynostegium\nn11675738\tpollinium\nn11676500\tpistil\nn11676743\tgynobase\nn11676850\tgynophore\nn11677485\tstylopodium\nn11677902\tcarpophore\nn11678010\tcornstalk, corn stalk\nn11678299\tpetiolule\nn11678377\tmericarp\nn11679378\tmicropyle\nn11680457\tgerm tube\nn11680596\tpollen tube\nn11682659\tgemma\nn11683216\tgalbulus\nn11683838\tnectary, honey gland\nn11684264\tpericarp, seed vessel\nn11684499\tepicarp, exocarp\nn11684654\tmesocarp\nn11685091\tpip\nn11685621\tsilique, siliqua\nn11686195\tcataphyll\nn11686652\tperisperm\nn11686780\tmonocarp, monocarpic plant, monocarpous plant\nn11686912\tsporophyte\nn11687071\tgametophyte\nn11687432\tmegasporangium, macrosporangium\nn11687789\tmicrospore\nn11687964\tmicrosporangium\nn11688069\tmicrosporophyll\nn11688378\tarchespore, archesporium\nn11689197\tbonduc nut, nicker nut, nicker seed\nn11689367\tJob's tears\nn11689483\toilseed, oil-rich seed\nn11689678\tcastor bean\nn11689815\tcottonseed\nn11689957\tcandlenut\nn11690088\tpeach pit\nn11690254\thypanthium, floral cup, calyx tube\nn11690455\tpetal, flower petal\nn11691046\tcorolla\nn11691857\tlip\nn11692265\tperianth, chlamys, floral envelope, perigone, 
perigonium\nn11692792\tthistledown\nn11693981\tcustard apple, custard apple tree\nn11694300\tcherimoya, cherimoya tree, Annona cherimola\nn11694469\tilama, ilama tree, Annona diversifolia\nn11694664\tsoursop, prickly custard apple, soursop tree, Annona muricata\nn11694866\tbullock's heart, bullock's heart tree, bullock heart, Annona reticulata\nn11695085\tsweetsop, sweetsop tree, Annona squamosa\nn11695285\tpond apple, pond-apple tree, Annona glabra\nn11695599\tpawpaw, papaw, papaw tree, Asimina triloba\nn11695974\tilang-ilang, ylang-ylang, Cananga odorata\nn11696450\tlancewood, lancewood tree, Oxandra lanceolata\nn11696935\tGuinea pepper, negro pepper, Xylopia aethiopica\nn11697560\tbarberry\nn11697802\tAmerican barberry, Berberis canadensis\nn11698042\tcommon barberry, European barberry, Berberis vulgaris\nn11698245\tJapanese barberry, Berberis thunbergii\nn11699442\tOregon grape, Oregon holly grape, hollygrape, mountain grape, holly-leaves barberry, Mahonia aquifolium\nn11699751\tOregon grape, Mahonia nervosa\nn11700058\tmayapple, May apple, wild mandrake, Podophyllum peltatum\nn11700279\tMay apple\nn11700864\tallspice\nn11701066\tCarolina allspice, strawberry shrub, strawberry bush, sweet shrub, Calycanthus floridus\nn11701302\tspicebush, California allspice, Calycanthus occidentalis\nn11702713\tkatsura tree, Cercidiphyllum japonicum\nn11703669\tlaurel\nn11704093\ttrue laurel, bay, bay laurel, bay tree, Laurus nobilis\nn11704620\tcamphor tree, Cinnamomum camphora\nn11704791\tcinnamon, Ceylon cinnamon, Ceylon cinnamon tree, Cinnamomum zeylanicum\nn11705171\tcassia, cassia-bark tree, Cinnamomum cassia\nn11705387\tcassia bark, Chinese cinnamon\nn11705573\tSaigon cinnamon, Cinnamomum loureirii\nn11705776\tcinnamon bark\nn11706325\tspicebush, spice bush, American spicebush, Benjamin bush, Lindera benzoin, Benzoin odoriferum\nn11706761\tavocado, avocado tree, Persea Americana\nn11706942\tlaurel-tree, red bay, Persea borbonia\nn11707229\tsassafras, sassafras tree, 
Sassafras albidum\nn11707827\tCalifornia laurel, California bay tree, Oregon myrtle, pepperwood, spice tree, sassafras laurel, California olive, mountain laurel, Umbellularia californica\nn11708658\tanise tree\nn11708857\tpurple anise, Illicium floridanum\nn11709045\tstar anise, Illicium anisatum\nn11709205\tstar anise, Chinese anise, Illicium verum\nn11709674\tmagnolia\nn11710136\tsouthern magnolia, evergreen magnolia, large-flowering magnolia, bull bay, Magnolia grandiflora\nn11710393\tumbrella tree, umbrella magnolia, elkwood, elk-wood, Magnolia tripetala\nn11710658\tearleaved umbrella tree, Magnolia fraseri\nn11710827\tcucumber tree, Magnolia acuminata\nn11710987\tlarge-leaved magnolia, large-leaved cucumber tree, great-leaved macrophylla, Magnolia macrophylla\nn11711289\tsaucer magnolia, Chinese magnolia, Magnolia soulangiana\nn11711537\tstar magnolia, Magnolia stellata\nn11711764\tsweet bay, swamp bay, swamp laurel, Magnolia virginiana\nn11711971\tmanglietia, genus Manglietia\nn11712282\ttulip tree, tulip poplar, yellow poplar, canary whitewood, Liriodendron tulipifera\nn11713164\tmoonseed\nn11713370\tcommon moonseed, Canada moonseed, yellow parilla, Menispermum canadense\nn11713763\tCarolina moonseed, Cocculus carolinus\nn11714382\tnutmeg, nutmeg tree, Myristica fragrans\nn11715430\twater nymph, fragrant water lily, pond lily, Nymphaea odorata\nn11715678\tEuropean white lily, Nymphaea alba\nn11716698\tsouthern spatterdock, Nuphar sagittifolium\nn11717399\tlotus, Indian lotus, sacred lotus, Nelumbo nucifera\nn11717577\twater chinquapin, American lotus, yanquapin, Nelumbo lutea\nn11718296\twater-shield, fanwort, Cabomba caroliniana\nn11718681\twater-shield, Brasenia schreberi, water-target\nn11719286\tpeony, paeony\nn11720353\tbuttercup, butterflower, butter-flower, crowfoot, goldcup, kingcup\nn11720643\tmeadow buttercup, tall buttercup, tall crowfoot, tall field buttercup, Ranunculus acris\nn11720891\twater crowfoot, water buttercup, Ranunculus 
aquatilis\nn11721337\tlesser celandine, pilewort, Ranunculus ficaria\nn11721642\tlesser spearwort, Ranunculus flammula\nn11722036\tgreater spearwort, Ranunculus lingua\nn11722342\twestern buttercup, Ranunculus occidentalis\nn11722466\tcreeping buttercup, creeping crowfoot, Ranunculus repens\nn11722621\tcursed crowfoot, celery-leaved buttercup, Ranunculus sceleratus\nn11722982\taconite\nn11723227\tmonkshood, helmetflower, helmet flower, Aconitum napellus\nn11723452\twolfsbane, wolfbane, wolf's bane, Aconitum lycoctonum\nn11723770\tbaneberry, cohosh, herb Christopher\nn11723986\tbaneberry\nn11724109\tred baneberry, redberry, red-berry, snakeberry, Actaea rubra\nn11724660\tpheasant's-eye, Adonis annua\nn11725015\tanemone, windflower\nn11725311\tAlpine anemone, mountain anemone, Anemone tetonensis\nn11725480\tCanada anemone, Anemone Canadensis\nn11725623\tthimbleweed, Anemone cylindrica\nn11725821\twood anemone, Anemone nemorosa\nn11725973\twood anemone, snowdrop, Anemone quinquefolia\nn11726145\tlongheaded thimbleweed, Anemone riparia\nn11726269\tsnowdrop anemone, snowdrop windflower, Anemone sylvestris\nn11726433\tVirginia thimbleweed, Anemone virginiana\nn11726707\true anemone, Anemonella thalictroides\nn11727091\tcolumbine, aquilegia, aquilege\nn11727358\tmeeting house, honeysuckle, Aquilegia canadensis\nn11727540\tblue columbine, Aquilegia caerulea, Aquilegia scopulorum calcarea\nn11727738\tgranny's bonnets, Aquilegia vulgaris\nn11728099\tmarsh marigold, kingcup, meadow bright, May blob, cowslip, water dragon, Caltha palustris\nn11728769\tAmerican bugbane, summer cohosh, Cimicifuga americana\nn11728945\tblack cohosh, black snakeroot, rattle-top, Cimicifuga racemosa\nn11729142\tfetid bugbane, foetid bugbane, Cimicifuga foetida\nn11729478\tclematis\nn11729860\tpine hyacinth, Clematis baldwinii, Viorna baldwinii\nn11730015\tblue jasmine, blue jessamine, curly clematis, marsh clematis, Clematis crispa\nn11730458\tgolden clematis, Clematis tangutica\nn11730602\tscarlet 
clematis, Clematis texensis\nn11730750\tleather flower, Clematis versicolor\nn11730933\tleather flower, vase-fine, vase vine, Clematis viorna\nn11731157\tvirgin's bower, old man's beard, devil's darning needle, Clematis virginiana\nn11731659\tpurple clematis, purple virgin's bower, mountain clematis, Clematis verticillaris\nn11732052\tgoldthread, golden thread, Coptis groenlandica, Coptis trifolia groenlandica\nn11732567\trocket larkspur, Consolida ambigua, Delphinium ajacis\nn11733054\tdelphinium\nn11733312\tlarkspur\nn11733548\twinter aconite, Eranthis hyemalis\nn11734493\tlenten rose, black hellebore, Helleborus orientalis\nn11734698\tgreen hellebore, Helleborus viridis\nn11735053\thepatica, liverleaf\nn11735570\tgoldenseal, golden seal, yellow root, turmeric root, Hydrastis Canadensis\nn11735977\tfalse rue anemone, false rue, Isopyrum biternatum\nn11736362\tgiant buttercup, Laccopetalum giganteum\nn11736694\tnigella\nn11736851\tlove-in-a-mist, Nigella damascena\nn11737009\tfennel flower, Nigella hispanica\nn11737125\tblack caraway, nutmeg flower, Roman coriander, Nigella sativa\nn11737534\tpasqueflower, pasque flower\nn11738547\tmeadow rue\nn11738997\tfalse bugbane, Trautvetteria carolinensis\nn11739365\tglobeflower, globe flower\nn11739978\twinter's bark, winter's bark tree, Drimys winteri\nn11740414\tpepper shrub, Pseudowintera colorata, Wintera colorata\nn11741175\tsweet gale, Scotch gale, Myrica gale\nn11741350\twax myrtle\nn11741575\tbay myrtle, puckerbush, Myrica cerifera\nn11741797\tbayberry, candleberry, swamp candleberry, waxberry, Myrica pensylvanica\nn11742310\tsweet fern, Comptonia peregrina, Comptonia asplenifolia\nn11742878\tcorkwood, corkwood tree, Leitneria floridana\nn11744011\tjointed rush, Juncus articulatus\nn11744108\ttoad rush, Juncus bufonius\nn11744471\tslender rush, Juncus tenuis\nn11745817\tzebrawood, zebrawood tree\nn11746600\tConnarus guianensis\nn11747468\tlegume, leguminous 
plant\nn11748002\tlegume\nn11748811\tpeanut\nn11749112\tgranadilla tree, granadillo, Brya ebenus\nn11749603\tarariba, Centrolobium robustum\nn11750173\ttonka bean, coumara nut\nn11750508\tcourbaril, Hymenaea courbaril\nn11750989\tmelilotus, melilot, sweet clover\nn11751765\tdarling pea, poison bush\nn11751974\tsmooth darling pea, Swainsona galegifolia\nn11752578\tclover, trefoil\nn11752798\talpine clover, Trifolium alpinum\nn11752937\thop clover, shamrock, lesser yellow trefoil, Trifolium dubium\nn11753143\tcrimson clover, Italian clover, Trifolium incarnatum\nn11753355\tred clover, purple clover, Trifolium pratense\nn11753562\tbuffalo clover, Trifolium reflexum, Trifolium stoloniferum\nn11753700\twhite clover, dutch clover, shamrock, Trifolium repens\nn11754893\tmimosa\nn11756092\tacacia\nn11756329\tshittah, shittah tree\nn11756669\twattle\nn11756870\tblack wattle, Acacia auriculiformis\nn11757017\tgidgee, stinking wattle, Acacia cambegei\nn11757190\tcatechu, Jerusalem thorn, Acacia catechu\nn11757653\tsilver wattle, mimosa, Acacia dealbata\nn11757851\thuisache, cassie, mimosa bush, sweet wattle, sweet acacia, scented wattle, flame tree, Acacia farnesiana\nn11758122\tlightwood, Acacia melanoxylon\nn11758276\tgolden wattle, Acacia pycnantha\nn11758483\tfever tree, Acacia xanthophloea\nn11758799\tcoralwood, coral-wood, red sandalwood, Barbados pride, peacock flower fence, Adenanthera pavonina\nn11759224\talbizzia, albizia\nn11759404\tsilk tree, Albizia julibrissin, Albizzia julibrissin\nn11759609\tsiris, siris tree, Albizia lebbeck, Albizzia lebbeck\nn11759853\train tree, saman, monkeypod, monkey pod, zaman, zamang, Albizia saman\nn11760785\tcalliandra\nn11761202\tconacaste, elephant's ear, Enterolobium cyclocarpa\nn11761650\tinga\nn11761836\tice-cream bean, Inga edulis\nn11762018\tguama, Inga laurina\nn11762433\tlead tree, white popinac, Leucaena glauca, Leucaena leucocephala\nn11762927\twild tamarind, Lysiloma latisiliqua, Lysiloma bahamensis\nn11763142\tsabicu, 
Lysiloma sabicu\nn11763625\tnitta tree\nn11763874\tParkia javanica\nn11764478\tmanila tamarind, camachile, huamachil, wild tamarind, Pithecellobium dulce\nn11764814\tcat's-claw, catclaw, black bead, Pithecellodium unguis-cati\nn11765568\thoney mesquite, Western honey mesquite, Prosopis glandulosa\nn11766046\talgarroba, algarrobilla, algarobilla\nn11766189\tscrew bean, screwbean, tornillo, screwbean mesquite, Prosopis pubescens\nn11766432\tscrew bean\nn11767354\tdogbane\nn11767877\tIndian hemp, rheumatism weed, Apocynum cannabinum\nn11768816\tbushman's poison, ordeal tree, Acocanthera oppositifolia, Acocanthera venenata\nn11769176\timpala lily, mock azalia, desert rose, kudu lily, Adenium obesum, Adenium multiflorum\nn11769621\tallamanda\nn11769803\tcommon allamanda, golden trumpet, Allamanda cathartica\nn11770256\tdita, dita bark, devil tree, Alstonia scholaris\nn11771147\tNepal trumpet flower, Easter lily vine, Beaumontia grandiflora\nn11771539\tcarissa\nn11771746\thedge thorn, natal plum, Carissa bispinosa\nn11771924\tnatal plum, amatungulu, Carissa macrocarpa, Carissa grandiflora\nn11772408\tperiwinkle, rose periwinkle, Madagascar periwinkle, old maid, Cape periwinkle, red periwinkle, cayenne jasmine, Catharanthus roseus, Vinca rosea\nn11772879\tivory tree, conessi, kurchi, kurchee, Holarrhena pubescens, Holarrhena antidysenterica\nn11773408\twhite dipladenia, Mandevilla boliviensis, Dipladenia boliviensis\nn11773628\tChilean jasmine, Mandevilla laxa\nn11773987\toleander, rose bay, Nerium oleander\nn11774513\tfrangipani, frangipanni\nn11774972\tWest Indian jasmine, pagoda tree, Plumeria alba\nn11775340\trauwolfia, rauvolfia\nn11775626\tsnakewood, Rauwolfia serpentina\nn11776234\tStrophanthus kombe\nn11777080\tyellow oleander, Thevetia peruviana, Thevetia neriifolia\nn11778092\tmyrtle, Vinca minor\nn11778257\tlarge periwinkle, Vinca major\nn11779300\tarum, aroid\nn11780148\tcuckoopint, lords-and-ladies, jack-in-the-pulpit, Arum maculatum\nn11780424\tblack calla, 
Arum palaestinum\nn11781176\tcalamus\nn11782036\talocasia, elephant's ear, elephant ear\nn11782266\tgiant taro, Alocasia macrorrhiza\nn11782761\tamorphophallus\nn11782878\tpungapung, telingo potato, elephant yam, Amorphophallus paeonifolius, Amorphophallus campanulatus\nn11783162\tdevil's tongue, snake palm, umbrella arum, Amorphophallus rivieri\nn11783920\tanthurium, tailflower, tail-flower\nn11784126\tflamingo flower, flamingo plant, Anthurium andraeanum, Anthurium scherzerianum\nn11784497\tjack-in-the-pulpit, Indian turnip, wake-robin, Arisaema triphyllum, Arisaema atrorubens\nn11785276\tfriar's-cowl, Arisarum vulgare\nn11785668\tcaladium\nn11785875\tCaladium bicolor\nn11786131\twild calla, water arum, Calla palustris\nn11786539\ttaro, taro plant, dalo, dasheen, Colocasia esculenta\nn11786843\ttaro, cocoyam, dasheen, eddo\nn11787190\tcryptocoryne, water trumpet\nn11788039\tdracontium\nn11788727\tgolden pothos, pothos, ivy arum, Epipremnum aureum, Scindapsus aureus\nn11789066\tskunk cabbage, Lysichiton americanum\nn11789438\tmonstera\nn11789589\tceriman, Monstera deliciosa\nn11789962\tnephthytis\nn11790089\tNephthytis afzelii\nn11790788\tarrow arum\nn11790936\tgreen arrow arum, tuckahoe, Peltandra virginica\nn11791341\tphilodendron\nn11791569\tpistia, water lettuce, water cabbage, Pistia stratiotes, Pistia stratoites\nn11792029\tpothos\nn11792341\tspathiphyllum, peace lily, spathe flower\nn11792742\tskunk cabbage, polecat weed, foetid pothos, Symplocarpus foetidus\nn11793403\tyautia, tannia, spoonflower, malanga, Xanthosoma sagittifolium, Xanthosoma atrovirens\nn11793779\tcalla lily, calla, arum lily, Zantedeschia aethiopica\nn11794024\tpink calla, Zantedeschia rehmanii\nn11794139\tgolden calla\nn11794519\tduckweed\nn11795049\tcommon duckweed, lesser duckweed, Lemna minor\nn11795216\tstar-duckweed, Lemna trisulca\nn11795580\tgreat duckweed, water flaxseed, Spirodela polyrrhiza\nn11796005\twatermeal\nn11796188\tcommon wolffia, Wolffia 
columbiana\nn11797321\taralia\nn11797508\tAmerican angelica tree, devil's walking stick, Hercules'-club, Aralia spinosa\nn11797981\tAmerican spikenard, petty morel, life-of-man, Aralia racemosa\nn11798270\tbristly sarsaparilla, bristly sarsparilla, dwarf elder, Aralia hispida\nn11798496\tJapanese angelica tree, Aralia elata\nn11798688\tChinese angelica, Chinese angelica tree, Aralia stipulata\nn11798978\tivy, common ivy, English ivy, Hedera helix\nn11799331\tpuka, Meryta sinclairii\nn11799732\tginseng, nin-sin, Panax ginseng, Panax schinseng, Panax pseudoginseng\nn11800236\tginseng\nn11800565\tumbrella tree, Schefflera actinophylla, Brassaia actinophylla\nn11801392\tbirthwort, Aristolochia clematitis\nn11801665\tDutchman's-pipe, pipe vine, Aristolochia macrophylla, Aristolochia durior\nn11801891\tVirginia snakeroot, Virginia serpentaria, Virginia serpentary, Aristolochia serpentaria\nn11802410\tCanada ginger, black snakeroot, Asarum canadense\nn11802586\theartleaf, heart-leaf, Asarum virginicum\nn11802800\theartleaf, heart-leaf, Asarum shuttleworthii\nn11802995\tasarabacca, Asarum europaeum\nn11805255\tcaryophyllaceous plant\nn11805544\tcorn cockle, corn campion, crown-of-the-field, Agrostemma githago\nn11805956\tsandwort\nn11806219\tmountain sandwort, mountain starwort, mountain daisy, Arenaria groenlandica\nn11806369\tpine-barren sandwort, longroot, Arenaria caroliniana\nn11806521\tseabeach sandwort, Arenaria peploides\nn11806679\trock sandwort, Arenaria stricta\nn11806814\tthyme-leaved sandwort, Arenaria serpyllifolia\nn11807108\tmouse-ear chickweed, mouse eared chickweed, mouse ear, clammy chickweed, chickweed\nn11807525\tsnow-in-summer, love-in-a-mist, Cerastium tomentosum\nn11807696\tAlpine mouse-ear, Arctic mouse-ear, Cerastium alpinum\nn11807979\tpink, garden pink\nn11808299\tsweet William, Dianthus barbatus\nn11808468\tcarnation, clove pink, gillyflower, Dianthus caryophyllus\nn11808721\tchina pink, rainbow pink, Dianthus chinensis\nn11808932\tJapanese 
pink, Dianthus chinensis heddewigii\nn11809094\tmaiden pink, Dianthus deltoides\nn11809271\tcheddar pink, Diangus gratianopolitanus\nn11809437\tbutton pink, Dianthus latifolius\nn11809594\tcottage pink, grass pink, Dianthus plumarius\nn11809754\tfringed pink, Dianthus supurbus\nn11810030\tdrypis\nn11810358\tbaby's breath, babies'-breath, Gypsophila paniculata\nn11811059\tcoral necklace, Illecebrum verticullatum\nn11811473\tlychnis, catchfly\nn11811706\tragged robin, cuckoo flower, Lychnis flos-cuculi, Lychins floscuculi\nn11811921\tscarlet lychnis, maltese cross, Lychins chalcedonica\nn11812094\tmullein pink, rose campion, gardener's delight, dusty miller, Lychnis coronaria\nn11812910\tsandwort, Moehringia lateriflora\nn11813077\tsandwort, Moehringia mucosa\nn11814584\tsoapwort, hedge pink, bouncing Bet, bouncing Bess, Saponaria officinalis\nn11814996\tknawel, knawe, Scleranthus annuus\nn11815491\tsilene, campion, catchfly\nn11815721\tmoss campion, Silene acaulis\nn11815918\twild pink, Silene caroliniana\nn11816121\tred campion, red bird's eye, Silene dioica, Lychnis dioica\nn11816336\twhite campion, evening lychnis, white cockle, bladder campion, Silene latifolia, Lychnis alba\nn11816649\tfire pink, Silene virginica\nn11816829\tbladder campion, Silene uniflora, Silene vulgaris\nn11817160\tcorn spurry, corn spurrey, Spergula arvensis\nn11817501\tsand spurry, sea spurry, Spergularia rubra\nn11817914\tchickweed\nn11818069\tcommon chickweed, Stellaria media\nn11818636\tcowherb, cow cockle, Vaccaria hispanica, Vaccaria pyramidata, Saponaria vaccaria\nn11819509\tHottentot fig, Hottentot's fig, sour fig, Carpobrotus edulis, Mesembryanthemum edule\nn11819912\tlivingstone daisy, Dorotheanthus bellidiformis\nn11820965\tfig marigold, pebble plant\nn11821184\tice plant, icicle plant, Mesembryanthemum crystallinum\nn11822300\tNew Zealand spinach, Tetragonia tetragonioides, Tetragonia expansa\nn11823043\tamaranth\nn11823305\tamaranth\nn11823436\ttumbleweed, Amaranthus albus, 
Amaranthus graecizans\nn11823756\tprince's-feather, gentleman's-cane, prince's-plume, red amaranth, purple amaranth, Amaranthus cruentus, Amaranthus hybridus hypochondriacus, Amaranthus hybridus erythrostachys\nn11824146\tpigweed, Amaranthus hypochondriacus\nn11824344\tthorny amaranth, Amaranthus spinosus\nn11824747\talligator weed, alligator grass, Alternanthera philoxeroides\nn11825351\tcockscomb, common cockscomb, Celosia cristata, Celosia argentea cristata\nn11825749\tcottonweed\nn11826198\tglobe amaranth, bachelor's button, Gomphrena globosa\nn11826569\tbloodleaf\nn11827541\tsaltwort, Batis maritima\nn11828577\tlamb's-quarters, pigweed, wild spinach, Chenopodium album\nn11828973\tgood-king-henry, allgood, fat hen, wild spinach, Chenopodium bonus-henricus\nn11829205\tJerusalem oak, feather geranium, Mexican tea, Chenopodium botrys, Atriplex mexicana\nn11829672\toak-leaved goosefoot, oakleaf goosefoot, Chenopodium glaucum\nn11829922\tsowbane, red goosefoot, Chenopodium hybridum\nn11830045\tnettle-leaved goosefoot, nettleleaf goosefoot, Chenopodium murale\nn11830252\tred goosefoot, French spinach, Chenopodium rubrum\nn11830400\tstinking goosefoot, Chenopodium vulvaria\nn11830714\torach, orache\nn11830906\tsaltbush\nn11831100\tgarden orache, mountain spinach, Atriplex hortensis\nn11831297\tdesert holly, Atriplex hymenelytra\nn11831521\tquail bush, quail brush, white thistle, Atriplex lentiformis\nn11832214\tbeet, common beet, Beta vulgaris\nn11832480\tbeetroot, Beta vulgaris rubra\nn11832671\tchard, Swiss chard, spinach beet, leaf beet, chard plant, Beta vulgaris cicla\nn11832899\tmangel-wurzel, mangold-wurzel, mangold, Beta vulgaris vulgaris\nn11833373\twinged pigweed, tumbleweed, Cycloloma atriplicifolium\nn11833749\thalogeton, Halogeton glomeratus\nn11834272\tglasswort, samphire, Salicornia europaea\nn11834654\tsaltwort, barilla, glasswort, kali, kelpwort, Salsola kali, Salsola soda\nn11834890\tRussian thistle, Russian tumbleweed, Russian cactus, tumbleweed, 
Salsola kali tenuifolia\nn11835251\tgreasewood, black greasewood, Sarcobatus vermiculatus\nn11836327\tscarlet musk flower, Nyctaginia capitata\nn11836722\tsand verbena\nn11837204\tsweet sand verbena, Abronia fragrans\nn11837351\tyellow sand verbena, Abronia latifolia\nn11837562\tbeach pancake, Abronia maritima\nn11837743\tbeach sand verbena, pink sand verbena, Abronia umbellata\nn11837970\tdesert sand verbena, Abronia villosa\nn11838413\ttrailing four o'clock, trailing windmills, Allionia incarnata\nn11838916\tbougainvillea\nn11839460\tumbrellawort\nn11839568\tfour o'clock\nn11839823\tcommon four-o'clock, marvel-of-Peru, Mirabilis jalapa, Mirabilis uniflora\nn11840067\tCalifornia four o'clock, Mirabilis laevis, Mirabilis californica\nn11840246\tsweet four o'clock, maravilla, Mirabilis longiflora\nn11840476\tdesert four o'clock, Colorado four o'clock, maravilla, Mirabilis multiflora\nn11840764\tmountain four o'clock, Mirabilis oblongifolia\nn11841247\tcockspur, Pisonia aculeata\nn11843441\trattail cactus, rat's-tail cactus, Aporocactus flagelliformis\nn11844371\tsaguaro, sahuaro, Carnegiea gigantea\nn11844892\tnight-blooming cereus\nn11845557\techinocactus, barrel cactus\nn11845793\thedgehog cactus\nn11845913\tgolden barrel cactus, Echinocactus grusonii\nn11846312\thedgehog cereus\nn11846425\trainbow cactus\nn11846765\tepiphyllum, orchid cactus\nn11847169\tbarrel cactus\nn11848479\tnight-blooming cereus\nn11848867\tchichipe, Lemaireocereus chichipe\nn11849271\tmescal, mezcal, peyote, Lophophora williamsii\nn11849467\tmescal button, sacred mushroom, magic mushroom\nn11849871\tmammillaria\nn11849983\tfeather ball, Mammillaria plumosa\nn11850521\tgarambulla, garambulla cactus, Myrtillocactus geometrizans\nn11850918\tKnowlton's cactus, Pediocactus knowltonii\nn11851258\tnopal\nn11851578\tprickly pear, prickly pear cactus\nn11851839\tcholla, Opuntia cholla\nn11852028\tnopal, Opuntia lindheimeri\nn11852148\ttuna, Opuntia tuna\nn11852531\tBarbados gooseberry, 
Barbados-gooseberry vine, Pereskia aculeata\nn11853079\tmistletoe cactus\nn11853356\tChristmas cactus, Schlumbergera buckleyi, Schlumbergera baridgesii\nn11853813\tnight-blooming cereus\nn11854479\tcrab cactus, Thanksgiving cactus, Zygocactus truncatus, Schlumbergera truncatus\nn11855274\tpokeweed\nn11855435\tIndian poke, Phytolacca acinosa\nn11855553\tpoke, pigeon berry, garget, scoke, Phytolacca americana\nn11855842\tombu, bella sombra, Phytolacca dioica\nn11856573\tbloodberry, blood berry, rougeberry, rouge plant, Rivina humilis\nn11857696\tportulaca\nn11857875\trose moss, sun plant, Portulaca grandiflora\nn11858077\tcommon purslane, pussley, pussly, verdolagas, Portulaca oleracea\nn11858703\trock purslane\nn11858814\tred maids, redmaids, Calandrinia ciliata\nn11859275\tCarolina spring beauty, Claytonia caroliniana\nn11859472\tspring beauty, Clatonia lanceolata\nn11859737\tVirginia spring beauty, Claytonia virginica\nn11860208\tsiskiyou lewisia, Lewisia cotyledon\nn11860555\tbitterroot, Lewisia rediviva\nn11861238\tbroad-leaved montia, Montia cordifolia\nn11861487\tblinks, blinking chickweed, water chickweed, Montia lamprosperma\nn11861641\ttoad lily, Montia chamissoi\nn11861853\twinter purslane, miner's lettuce, Cuban spinach, Montia perfoliata\nn11862835\tflame flower, flame-flower, flameflower, Talinum aurantiacum\nn11863467\tpigmy talinum, Talinum brevifolium\nn11863877\tjewels-of-opar, Talinum paniculatum\nn11865071\tcaper\nn11865276\tnative pomegranate, Capparis arborea\nn11865429\tcaper tree, Jamaica caper tree, Capparis cynophallophora\nn11865574\tcaper tree, bay-leaved caper, Capparis flexuosa\nn11865874\tcommon caper, Capparis spinosa\nn11866248\tspiderflower, cleome\nn11866706\tRocky Mountain bee plant, stinking clover, Cleome serrulata\nn11867311\tclammyweed, Polanisia graveolens, Polanisia dodecandra\nn11868814\tcrucifer, cruciferous plant\nn11869351\tcress, cress plant\nn11869689\twatercress\nn11870044\tstonecress, stone cress\nn11870418\tgarlic 
mustard, hedge garlic, sauce-alone, jack-by-the-hedge, Alliaria officinalis\nn11870747\talyssum, madwort\nn11871059\trose of Jericho, resurrection plant, Anastatica hierochuntica\nn11871496\tArabidopsis thaliana, mouse-ear cress\nn11871748\tArabidopsis lyrata\nn11872146\trock cress, rockcress\nn11872324\tsicklepod, Arabis Canadensis\nn11872658\ttower mustard, tower cress, Turritis glabra, Arabis glabra\nn11873182\thorseradish, horseradish root\nn11873612\twinter cress, St. Barbara's herb, scurvy grass\nn11874081\tyellow rocket, rockcress, rocket cress, Barbarea vulgaris, Sisymbrium barbarea\nn11874423\thoary alison, hoary alyssum, Berteroa incana\nn11874878\tbuckler mustard, Biscutalla laevigata\nn11875523\twild cabbage, Brassica oleracea\nn11875691\tcabbage, cultivated cabbage, Brassica oleracea\nn11875938\thead cabbage, head cabbage plant, Brassica oleracea capitata\nn11876204\tsavoy cabbage\nn11876432\tbrussels sprout, Brassica oleracea gemmifera\nn11876634\tcauliflower, Brassica oleracea botrytis\nn11876803\tbroccoli, Brassica oleracea italica\nn11877193\tcollard\nn11877283\tkohlrabi, Brassica oleracea gongylodes\nn11877473\tturnip plant\nn11877646\tturnip, white turnip, Brassica rapa\nn11877860\trutabaga, turnip cabbage, swede, Swedish turnip, rutabaga plant, Brassica napus napobrassica\nn11878101\tbroccoli raab, broccoli rabe, Brassica rapa ruvo\nn11878283\tmustard\nn11878633\tchinese mustard, indian mustard, leaf mustard, gai choi, Brassica juncea\nn11879054\tbok choy, bok choi, pakchoi, pak choi, Chinese white cabbage, Brassica rapa chinensis\nn11879722\trape, colza, Brassica napus\nn11879895\trapeseed\nn11881189\tshepherd's purse, shepherd's pouch, Capsella bursa-pastoris\nn11882074\tlady's smock, cuckooflower, cuckoo flower, meadow cress, Cardamine pratensis\nn11882237\tcoral-root bittercress, coralroot, coralwort, Cardamine bulbifera, Dentaria bulbifera\nn11882426\tcrinkleroot, crinkle-root, crinkle root, pepper root, toothwort, Cardamine diphylla, 
Dentaria diphylla\nn11882636\tAmerican watercress, mountain watercress, Cardamine rotundifolia\nn11882821\tspring cress, Cardamine bulbosa\nn11882972\tpurple cress, Cardamine douglasii\nn11883328\twallflower, Cheiranthus cheiri, Erysimum cheiri\nn11883628\tprairie rocket\nn11883945\tscurvy grass, common scurvy grass, Cochlearia officinalis\nn11884384\tsea kale, sea cole, Crambe maritima\nn11884967\ttansy mustard, Descurainia pinnata\nn11885856\tdraba\nn11887119\twallflower\nn11887310\tprairie rocket\nn11887476\tSiberian wall flower, Erysimum allionii, Cheiranthus allionii\nn11887750\twestern wall flower, Erysimum asperum, Cheiranthus asperus, Erysimum arkansanum\nn11888061\twormseed mustard, Erysimum cheiranthoides\nn11888424\theliophila\nn11888800\tdamask violet, Dame's violet, sweet rocket, Hesperis matronalis\nn11889205\ttansy-leaved rocket, Hugueninia tanacetifolia, Sisymbrium tanacetifolia\nn11889619\tcandytuft\nn11890022\twoad\nn11890150\tdyer's woad, Isatis tinctoria\nn11890884\tbladderpod\nn11891175\tsweet alyssum, sweet alison, Lobularia maritima\nn11892029\tMalcolm stock, stock\nn11892181\tVirginian stock, Virginia stock, Malcolmia maritima\nn11892637\tstock, gillyflower\nn11892817\tbrompton stock, Matthiola incana\nn11893640\tbladderpod\nn11893916\tchamois cress, Pritzelago alpina, Lepidium alpina\nn11894327\tradish plant, radish\nn11894558\tjointed charlock, wild radish, wild rape, runch, Raphanus raphanistrum\nn11894770\tradish, Raphanus sativus\nn11895092\tradish, daikon, Japanese radish, Raphanus sativus longipinnatus\nn11895472\tmarsh cress, yellow watercress, Rorippa islandica\nn11895714\tgreat yellowcress, Rorippa amphibia, Nasturtium amphibium\nn11896141\tschizopetalon, Schizopetalon walkeri\nn11896722\tfield mustard, wild mustard, charlock, chadlock, Brassica kaber, Sinapis arvensis\nn11897116\thedge mustard, Sisymbrium officinale\nn11897466\tdesert plume, prince's-plume, Stanleya pinnata, Cleome pinnata\nn11898639\tpennycress\nn11898775\tfield 
pennycress, French weed, fanweed, penny grass, stinkweed, mithridate mustard, Thlaspi arvense\nn11899223\tfringepod, lacepod\nn11899762\tbladderpod\nn11899921\twasabi\nn11900569\tpoppy\nn11901294\tIceland poppy, Papaver alpinum\nn11901452\twestern poppy, Papaver californicum\nn11901597\tprickly poppy, Papaver argemone\nn11901759\tIceland poppy, arctic poppy, Papaver nudicaule\nn11901977\toriental poppy, Papaver orientale\nn11902200\tcorn poppy, field poppy, Flanders poppy, Papaver rhoeas\nn11902389\topium poppy, Papaver somniferum\nn11902709\tprickly poppy, argemone, white thistle, devil's fig\nn11902982\tMexican poppy, Argemone mexicana\nn11903333\tbocconia, tree celandine, Bocconia frutescens\nn11903671\tcelandine, greater celandine, swallowwort, swallow wort, Chelidonium majus\nn11904109\tcorydalis\nn11904274\tclimbing corydalis, Corydalis claviculata, Fumaria claviculata\nn11905392\tCalifornia poppy, Eschscholtzia californica\nn11905749\thorn poppy, horned poppy, yellow horned poppy, sea poppy, Glaucium flavum\nn11906127\tgolden cup, Mexican tulip poppy, Hunnemania fumariifolia\nn11906514\tplume poppy, bocconia, Macleaya cordata\nn11906917\tblue poppy, Meconopsis betonicifolia\nn11907100\tWelsh poppy, Meconopsis cambrica\nn11907405\tcreamcups, Platystemon californicus\nn11907689\tmatilija poppy, California tree poppy, Romneya coulteri\nn11908549\twind poppy, flaming poppy, Stylomecon heterophyllum, Papaver heterophyllum\nn11908846\tcelandine poppy, wood poppy, Stylophorum diphyllum\nn11909864\tclimbing fumitory, Allegheny vine, Adlumia fungosa, Fumaria fungosa\nn11910271\tbleeding heart, lyreflower, lyre-flower, Dicentra spectabilis\nn11910460\tDutchman's breeches, Dicentra cucullaria\nn11910666\tsquirrel corn, Dicentra canadensis\nn11915214\tcomposite, composite plant\nn11915658\tcompass plant, compass flower\nn11915899\teverlasting, everlasting flower\nn11916467\tachillea\nn11916696\tyarrow, milfoil, Achillea millefolium\nn11917407\tpink-and-white 
everlasting, pink paper daisy, Acroclinium roseum\nn11917835\twhite snakeroot, white sanicle, Ageratina altissima, Eupatorium rugosum\nn11918286\tageratum\nn11918473\tcommon ageratum, Ageratum houstonianum\nn11918808\tsweet sultan, Amberboa moschata, Centaurea moschata\nn11919447\tragweed, ambrosia, bitterweed\nn11919761\tcommon ragweed, Ambrosia artemisiifolia\nn11919975\tgreat ragweed, Ambrosia trifida\nn11920133\twestern ragweed, perennial ragweed, Ambrosia psilostachya\nn11920498\tammobium\nn11920663\twinged everlasting, Ammobium alatum\nn11920998\tpellitory, pellitory-of-Spain, Anacyclus pyrethrum\nn11921395\tpearly everlasting, cottonweed, Anaphalis margaritacea\nn11921792\tandryala\nn11922661\tplantain-leaved pussytoes\nn11922755\tfield pussytoes\nn11922839\tsolitary pussytoes\nn11922926\tmountain everlasting\nn11923174\tmayweed, dog fennel, stinking mayweed, stinking chamomile, Anthemis cotula\nn11923397\tyellow chamomile, golden marguerite, dyers' chamomile, Anthemis tinctoria\nn11923637\tcorn chamomile, field chamomile, corn mayweed, Anthemis arvensis\nn11924014\twoolly daisy, dwarf daisy, Antheropeas wallacei, Eriophyllum wallacei\nn11924445\tburdock, clotbur\nn11924849\tgreat burdock, greater burdock, cocklebur, Arctium lappa\nn11925303\tAfrican daisy\nn11925450\tblue-eyed African daisy, Arctotis stoechadifolia, Arctotis venusta\nn11925898\tmarguerite, marguerite daisy, Paris daisy, Chrysanthemum frutescens, Argyranthemum frutescens\nn11926365\tsilversword, Argyroxiphium sandwicense\nn11926833\tarnica\nn11926976\theartleaf arnica, Arnica cordifolia\nn11927215\tArnica montana\nn11927740\tlamb succory, dwarf nipplewort, Arnoseris minima\nn11928352\tartemisia\nn11928858\tmugwort\nn11929743\tsweet wormwood, Artemisia annua\nn11930038\tfield wormwood, Artemisia campestris\nn11930203\ttarragon, estragon, Artemisia dracunculus\nn11930353\tsand sage, silvery wormwood, Artemisia filifolia\nn11930571\twormwood sage, prairie sagewort, Artemisia 
frigida\nn11930788\twestern mugwort, white sage, cudweed, prairie sage, Artemisia ludoviciana, Artemisia gnaphalodes\nn11930994\tRoman wormwood, Artemis pontica\nn11931135\tbud brush, bud sagebrush, Artemis spinescens\nn11931540\tcommon mugwort, Artemisia vulgaris\nn11931918\taster\nn11932745\twood aster\nn11932927\twhorled aster, Aster acuminatus\nn11933099\theath aster, Aster arenosus\nn11933257\theart-leaved aster, Aster cordifolius\nn11933387\twhite wood aster, Aster divaricatus\nn11933546\tbushy aster, Aster dumosus\nn11933728\theath aster, Aster ericoides\nn11933903\twhite prairie aster, Aster falcatus\nn11934041\tstiff aster, Aster linarifolius\nn11934239\tgoldilocks, goldilocks aster, Aster linosyris, Linosyris vulgaris\nn11934463\tlarge-leaved aster, Aster macrophyllus\nn11934616\tNew England aster, Aster novae-angliae\nn11934807\tMichaelmas daisy, New York aster, Aster novi-belgii\nn11935027\tupland white aster, Aster ptarmicoides\nn11935187\tShort's aster, Aster shortii\nn11935330\tsea aster, sea starwort, Aster tripolium\nn11935469\tprairie aster, Aster turbinellis\nn11935627\tannual salt-marsh aster\nn11935715\taromatic aster\nn11935794\tarrow leaved aster\nn11935877\tazure aster\nn11935953\tbog aster\nn11936027\tcrooked-stemmed aster\nn11936113\tEastern silvery aster\nn11936199\tflat-topped white aster\nn11936287\tlate purple aster\nn11936369\tpanicled aster\nn11936448\tperennial salt marsh aster\nn11936539\tpurple-stemmed aster\nn11936624\trough-leaved aster\nn11936707\trush aster\nn11936782\tSchreiber's aster\nn11936864\tsmall white aster\nn11936946\tsmooth aster\nn11937023\tsouthern aster\nn11937102\tstarved aster, calico aster\nn11937195\ttradescant's aster\nn11937278\twavy-leaved aster\nn11937360\tWestern silvery aster\nn11937446\twillow aster\nn11937692\tayapana, Ayapana triplinervis, Eupatorium aya-pana\nn11938556\tmule fat, Baccharis viminea\nn11939180\tbalsamroot\nn11939491\tdaisy\nn11939699\tcommon daisy, English daisy, Bellis 
perennis\nn11940006\tbur marigold, burr marigold, beggar-ticks, beggar's-ticks, sticktight\nn11940349\tSpanish needles, Bidens bipinnata\nn11940599\ttickseed sunflower, Bidens coronata, Bidens trichosperma\nn11940750\tEuropean beggar-ticks, trifid beggar-ticks, trifid bur marigold, Bidens tripartita\nn11941094\tslender knapweed\nn11941478\tfalse chamomile\nn11941924\tSwan River daisy, Brachycome Iberidifolia\nn11942659\twoodland oxeye, Buphthalmum salicifolium\nn11943133\tIndian plantain\nn11943407\tcalendula\nn11943660\tcommon marigold, pot marigold, ruddles, Scotch marigold, Calendula officinalis\nn11943992\tChina aster, Callistephus chinensis\nn11944196\tthistle\nn11944751\twelted thistle, Carduus crispus\nn11944954\tmusk thistle, nodding thistle, Carduus nutans\nn11945367\tcarline thistle\nn11945514\tstemless carline thistle, Carlina acaulis\nn11945783\tcommon carline thistle, Carlina vulgaris\nn11946051\tsafflower, false saffron, Carthamus tinctorius\nn11946313\tsafflower seed\nn11946727\tcatananche\nn11946918\tblue succory, cupid's dart, Catananche caerulea\nn11947251\tcentaury\nn11947629\tdusty miller, Centaurea cineraria, Centaurea gymnocarpa\nn11947802\tcornflower, bachelor's button, bluebottle, Centaurea cyanus\nn11948044\tstar-thistle, caltrop, Centauria calcitrapa\nn11948264\tknapweed\nn11948469\tsweet sultan, Centaurea imperialis\nn11948864\tgreat knapweed, greater knapweed, Centaurea scabiosa\nn11949015\tBarnaby's thistle, yellow star-thistle, Centaurea solstitialis\nn11949402\tchamomile, camomile, Chamaemelum nobilis, Anthemis nobilis\nn11949857\tchaenactis\nn11950345\tchrysanthemum\nn11950686\tcorn marigold, field marigold, Chrysanthemum segetum\nn11950877\tcrown daisy, Chrysanthemum coronarium\nn11951052\tchop-suey greens, tong ho, shun giku, Chrysanthemum coronarium spatiosum\nn11951511\tgolden aster\nn11951820\tMaryland golden aster, Chrysopsis mariana\nn11952346\tgoldenbush\nn11952541\trabbit brush, rabbit bush, Chrysothamnus 
nauseosus\nn11953038\tchicory, succory, chicory plant, Cichorium intybus\nn11953339\tendive, witloof, Cichorium endivia\nn11953610\tchicory, chicory root\nn11953884\tplume thistle, plumed thistle\nn11954161\tCanada thistle, creeping thistle, Cirsium arvense\nn11954345\tfield thistle, Cirsium discolor\nn11954484\twoolly thistle, Cirsium flodmanii\nn11954642\tEuropean woolly thistle, Cirsium eriophorum\nn11954798\tmelancholy thistle, Cirsium heterophylum, Cirsium helenioides\nn11955040\tbrook thistle, Cirsium rivulare\nn11955153\tbull thistle, boar thistle, spear thistle, Cirsium vulgare, Cirsium lanceolatum\nn11955532\tblessed thistle, sweet sultan, Cnicus benedictus\nn11955896\tmistflower, mist-flower, ageratum, Conoclinium coelestinum, Eupatorium coelestinum\nn11956348\thorseweed, Canadian fleabane, fleabane, Conyza canadensis, Erigeron canadensis\nn11956850\tcoreopsis, tickseed, tickweed, tick-weed\nn11957317\tgiant coreopsis, Coreopsis gigantea\nn11957514\tsea dahlia, Coreopsis maritima\nn11957678\tcalliopsis, Coreopsis tinctoria\nn11958080\tcosmos, cosmea\nn11958499\tbrass buttons, Cotula coronopifolia\nn11958888\tbilly buttons\nn11959259\thawk's-beard, hawk's-beards\nn11959632\tartichoke, globe artichoke, artichoke plant, Cynara scolymus\nn11959862\tcardoon, Cynara cardunculus\nn11960245\tdahlia, Dahlia pinnata\nn11960673\tGerman ivy, Delairea odorata, Senecio milkanioides\nn11961100\tflorist's chrysanthemum, florists' chrysanthemum, mum, Dendranthema grandifloruom, Chrysanthemum morifolium\nn11961446\tcape marigold, sun marigold, star of the veldt\nn11961871\tleopard's-bane, leopardbane\nn11962272\tconeflower\nn11962667\tglobe thistle\nn11962994\telephant's-foot\nn11963572\ttassel flower, Emilia sagitta\nn11963932\tbrittlebush, brittle bush, incienso, Encelia farinosa\nn11964446\tsunray, Enceliopsis nudicaulis\nn11964848\tengelmannia\nn11965218\tfireweed, Erechtites hieracifolia\nn11965627\tfleabane\nn11965962\tblue fleabane, Erigeron acer\nn11966083\tdaisy 
fleabane, Erigeron annuus\nn11966215\torange daisy, orange fleabane, Erigeron aurantiacus\nn11966385\tspreading fleabane, Erigeron divergens\nn11966617\tseaside daisy, beach aster, Erigeron glaucous\nn11966896\tPhiladelphia fleabane, Erigeron philadelphicus\nn11967142\trobin's plantain, Erigeron pulchellus\nn11967315\tshowy daisy, Erigeron speciosus\nn11967744\twoolly sunflower\nn11967878\tgolden yarrow, Eriophyllum lanatum\nn11968519\tdog fennel, Eupatorium capillifolium\nn11968704\tJoe-Pye weed, spotted Joe-Pye weed, Eupatorium maculatum\nn11968931\tboneset, agueweed, thoroughwort, Eupatorium perfoliatum\nn11969166\tJoe-Pye weed, purple boneset, trumpet weed, marsh milkweed, Eupatorium purpureum\nn11969607\tblue daisy, blue marguerite, Felicia amelloides\nn11969806\tkingfisher daisy, Felicia bergeriana\nn11970101\tcotton rose, cudweed, filago\nn11970298\therba impia, Filago germanica\nn11970586\tgaillardia\nn11971248\tgazania\nn11971406\ttreasure flower, Gazania rigens\nn11971783\tAfrican daisy\nn11971927\tBarberton daisy, Transvaal daisy, Gerbera jamesonii\nn11972291\tdesert sunflower, Gerea canescens\nn11972759\tcudweed\nn11972959\tchafeweed, wood cudweed, Gnaphalium sylvaticum\nn11973341\tgumweed, gum plant, tarweed, rosinweed\nn11973634\tGrindelia robusta\nn11973749\tcurlycup gumweed, Grindelia squarrosa\nn11974373\tlittle-head snakeweed, Gutierrezia microcephala\nn11974557\trabbitweed, rabbit-weed, snakeweed, broom snakeweed, broom snakeroot, turpentine weed, Gutierrezia sarothrae\nn11974888\tbroomweed, broom-weed, Gutierrezia texana\nn11975254\tvelvet plant, purple velvet plant, royal velvet plant, Gynura aurantiaca\nn11976170\tgoldenbush\nn11976314\tcamphor daisy, Haplopappus phyllocephalus\nn11976511\tyellow spiny daisy, Haplopappus spinulosus\nn11976933\thoary golden bush, Hazardia cana\nn11977303\tsneezeweed\nn11977660\torange sneezeweed, owlclaws, Helenium hoopesii\nn11977887\trosilla, Helenium puberulum\nn11978233\tsunflower, 
helianthus\nn11978551\tswamp sunflower, Helianthus angustifolius\nn11978713\tcommon sunflower, mirasol, Helianthus annuus\nn11978961\tgiant sunflower, tall sunflower, Indian potato, Helianthus giganteus\nn11979187\tshowy sunflower, Helianthus laetiflorus\nn11979354\tMaximilian's sunflower, Helianthus maximilianii\nn11979527\tprairie sunflower, Helianthus petiolaris\nn11979715\tJerusalem artichoke, girasol, Jerusalem artichoke sunflower, Helianthus tuberosus\nn11979964\tJerusalem artichoke\nn11980318\tstrawflower, golden everlasting, yellow paper daisy, Helichrysum bracteatum\nn11980682\theliopsis, oxeye\nn11981192\tstrawflower\nn11981475\thairy golden aster, prairie golden aster, Heterotheca villosa, Chrysopsis villosa\nn11982115\thawkweed\nn11982545\trattlesnake weed, Hieracium venosum\nn11982939\talpine coltsfoot, Homogyne alpina, Tussilago alpina\nn11983375\talpine gold, alpine hulsea, Hulsea algida\nn11983606\tdwarf hulsea, Hulsea nana\nn11984144\tcat's-ear, California dandelion, capeweed, gosmore, Hypochaeris radicata\nn11984542\tinula\nn11985053\tmarsh elder, iva\nn11985321\tburweed marsh elder, false ragweed, Iva xanthifolia\nn11985739\tkrigia\nn11985903\tdwarf dandelion, Krigia dandelion, Krigia bulbosa\nn11986511\tgarden lettuce, common lettuce, Lactuca sativa\nn11986729\tcos lettuce, romaine lettuce, Lactuca sativa longifolia\nn11987126\tleaf lettuce, Lactuca sativa crispa\nn11987349\tceltuce, stem lettuce, Lactuca sativa asparagina\nn11987511\tprickly lettuce, horse thistle, Lactuca serriola, Lactuca scariola\nn11988132\tgoldfields, Lasthenia chrysostoma\nn11988596\ttidytips, tidy tips, Layia platyglossa\nn11988893\thawkbit\nn11989087\tfall dandelion, arnica bud, Leontodon autumnalis\nn11989393\tedelweiss, Leontopodium alpinum\nn11989869\toxeye daisy, ox-eyed daisy, marguerite, moon daisy, white daisy, Leucanthemum vulgare, Chrysanthemum leucanthemum\nn11990167\toxeye daisy, Leucanthemum maximum, Chrysanthemum maximum\nn11990313\tshasta daisy, 
Leucanthemum superbum, Chrysanthemum maximum maximum\nn11990627\tPyrenees daisy, Leucanthemum lacustre, Chrysanthemum lacustre\nn11990920\tnorth island edelweiss, Leucogenes leontopodium\nn11991263\tblazing star, button snakeroot, gayfeather, gay-feather, snakeroot\nn11991549\tdotted gayfeather, Liatris punctata\nn11991777\tdense blazing star, Liatris pycnostachya\nn11992479\tTexas star, Lindheimera texana\nn11992806\tAfrican daisy, yellow ageratum, Lonas inodora, Lonas annua\nn11993203\ttahoka daisy, tansy leaf aster, Machaeranthera tanacetifolia\nn11993444\tsticky aster, Machaeranthera bigelovii\nn11993675\tMojave aster, Machaeranthera tortifoloia\nn11994150\ttarweed\nn11995092\tsweet false chamomile, wild chamomile, German chamomile, Matricaria recutita, Matricaria chamomilla\nn11995396\tpineapple weed, rayless chamomile, Matricaria matricarioides\nn11996251\tclimbing hempweed, climbing boneset, wild climbing hempweed, climbing hemp-vine, Mikania scandens\nn11996677\tmutisia\nn11997032\trattlesnake root\nn11997160\twhite lettuce, cankerweed, Nabalus alba, Prenanthes alba\nn11997969\tdaisybush, daisy-bush, daisy bush\nn11998492\tNew Zealand daisybush, Olearia haastii\nn11998888\tcotton thistle, woolly thistle, Scotch thistle, Onopordum acanthium, Onopordon acanthium\nn11999278\tothonna\nn11999656\tcascade everlasting, Ozothamnus secundiflorus, Helichrysum secundiflorum\nn12000191\tbutterweed\nn12001294\tAmerican feverfew, wild quinine, prairie dock, Parthenium integrifolium\nn12001707\tcineraria, Pericallis cruenta, Senecio cruentus\nn12001924\tflorest's cineraria, Pericallis hybrida\nn12002428\tbutterbur, bog rhubarb, Petasites hybridus, Petasites vulgaris\nn12002651\twinter heliotrope, sweet coltsfoot, Petasites fragrans\nn12002826\tsweet coltsfoot, Petasites sagitattus\nn12003167\toxtongue, bristly oxtongue, bitterweed, bugloss, Picris echioides\nn12003696\thawkweed\nn12004120\tmouse-ear hawkweed, Pilosella officinarum, Hieracium 
pilocella\nn12004547\tstevia\nn12004987\trattlesnake root, Prenanthes purpurea\nn12005656\tfleabane, feabane mullet, Pulicaria dysenterica\nn12006306\tsheep plant, vegetable sheep, Raoulia lutescens, Raoulia australis\nn12006766\tconeflower\nn12006930\tMexican hat, Ratibida columnaris\nn12007196\tlong-head coneflower, prairie coneflower, Ratibida columnifera\nn12007406\tprairie coneflower, Ratibida tagetes\nn12007766\tSwan River everlasting, rhodanthe, Rhodanthe manglesii, Helipterum manglesii\nn12008252\tconeflower\nn12008487\tblack-eyed Susan, Rudbeckia hirta, Rudbeckia serotina\nn12008749\tcutleaved coneflower, Rudbeckia laciniata\nn12009047\tgolden glow, double gold, hortensia, Rudbeckia laciniata hortensia\nn12009420\tlavender cotton, Santolina chamaecyparissus\nn12009792\tcreeping zinnia, Sanvitalia procumbens\nn12010628\tgolden thistle\nn12010815\tSpanish oyster plant, Scolymus hispanicus\nn12011370\tnodding groundsel, Senecio bigelovii\nn12011620\tdusty miller, Senecio cineraria, Cineraria maritima\nn12012111\tbutterweed, ragwort, Senecio glabellus\nn12012253\tragwort, tansy ragwort, ragweed, benweed, Senecio jacobaea\nn12012510\tarrowleaf groundsel, Senecio triangularis\nn12013035\tblack salsify, viper's grass, scorzonera, Scorzonera hispanica\nn12013511\twhite-topped aster\nn12013701\tnarrow-leaved white-topped aster\nn12014085\tsilver sage, silver sagebrush, grey sage, gray sage, Seriphidium canum, Artemisia cana\nn12014355\tsea wormwood, Seriphidium maritimum, Artemisia maritima\nn12014923\tsawwort, Serratula tinctoria\nn12015221\trosinweed, Silphium laciniatum\nn12015525\tmilk thistle, lady's thistle, Our Lady's mild thistle, holy thistle, blessed thistle, Silybum marianum\nn12015959\tgoldenrod\nn12016434\tsilverrod, Solidago bicolor\nn12016567\tmeadow goldenrod, Canadian goldenrod, Solidago canadensis\nn12016777\tMissouri goldenrod, Solidago missouriensis\nn12016914\talpine goldenrod, Solidago multiradiata\nn12017127\tgrey goldenrod, gray goldenrod, 
Solidago nemoralis\nn12017326\tBlue Mountain tea, sweet goldenrod, Solidago odora\nn12017511\tdyer's weed, Solidago rugosa\nn12017664\tseaside goldenrod, beach goldenrod, Solidago sempervirens\nn12017853\tnarrow goldenrod, Solidago spathulata\nn12018014\tBoott's goldenrod\nn12018100\tElliott's goldenrod\nn12018188\tOhio goldenrod\nn12018271\trough-stemmed goldenrod\nn12018363\tshowy goldenrod\nn12018447\ttall goldenrod\nn12018530\tzigzag goldenrod, broad leaved goldenrod\nn12018760\tsow thistle, milk thistle\nn12019035\tmilkweed, Sonchus oleraceus\nn12019827\tstevia\nn12020184\tstokes' aster, cornflower aster, Stokesia laevis\nn12020507\tmarigold\nn12020736\tAfrican marigold, big marigold, Aztec marigold, Tagetes erecta\nn12020941\tFrench marigold, Tagetes patula\nn12022054\tpainted daisy, pyrethrum, Tanacetum coccineum, Chrysanthemum coccineum\nn12022382\tpyrethrum, Dalmatian pyrethrum, Dalmatia pyrethrum, Tanacetum cinerariifolium, Chrysanthemum cinerariifolium\nn12022821\tnorthern dune tansy, Tanacetum douglasii\nn12023108\tfeverfew, Tanacetum parthenium, Chrysanthemum parthenium\nn12023407\tdusty miller, silver-lace, silver lace, Tanacetum ptarmiciflorum, Chrysanthemum ptarmiciflorum\nn12023726\ttansy, golden buttons, scented fern, Tanacetum vulgare\nn12024176\tdandelion, blowball\nn12024445\tcommon dandelion, Taraxacum ruderalia, Taraxacum officinale\nn12024690\tdandelion green\nn12024805\tRussian dandelion, kok-saghyz, kok-sagyz, Taraxacum kok-saghyz\nn12025220\tstemless hymenoxys, Tetraneuris acaulis, Hymenoxys acaulis\nn12026018\tMexican sunflower, tithonia\nn12026476\tEaster daisy, stemless daisy, Townsendia Exscapa\nn12026981\tyellow salsify, Tragopogon dubius\nn12027222\tsalsify, oyster plant, vegetable oyster, Tragopogon porrifolius\nn12027658\tmeadow salsify, goatsbeard, shepherd's clock, Tragopogon pratensis\nn12028424\tscentless camomile, scentless false camomile, scentless mayweed, scentless hayweed, corn mayweed, Tripleurospermum inodorum, 
Matricaria inodorum\nn12029039\tturfing daisy, Tripleurospermum tchihatchewii, Matricaria tchihatchewii\nn12029635\tcoltsfoot, Tussilago farfara\nn12030092\tursinia\nn12030654\tcrownbeard, crown-beard, crown beard\nn12030908\twingstem, golden ironweed, yellow ironweed, golden honey plant, Verbesina alternifolia, Actinomeris alternifolia\nn12031139\tcowpen daisy, golden crownbeard, golden crown beard, butter daisy, Verbesina encelioides, Ximenesia encelioides\nn12031388\tgravelweed, Verbesina helianthoides\nn12031547\tVirginia crownbeard, frostweed, frost-weed, Verbesina virginica\nn12031927\tironweed, vernonia\nn12032429\tmule's ears, Wyethia amplexicaulis\nn12032686\twhite-rayed mule's ears, Wyethia helianthoides\nn12033139\tcocklebur, cockle-bur, cockleburr, cockle-burr\nn12033504\txeranthemum\nn12033709\timmortelle, Xeranthemum annuum\nn12034141\tzinnia, old maid, old maid flower\nn12034384\twhite zinnia, Zinnia acerosa\nn12034594\tlittle golden zinnia, Zinnia grandiflora\nn12035631\tblazing star, Mentzelia livicaulis, Mentzelia laevicaulis\nn12035907\tbartonia, Mentzelia lindleyi\nn12036067\tachene\nn12036226\tsamara, key fruit, key\nn12036939\tcampanula, bellflower\nn12037499\tcreeping bellflower, Campanula rapunculoides\nn12037691\tCanterbury bell, cup and saucer, Campanula medium\nn12038038\ttall bellflower, Campanula americana\nn12038208\tmarsh bellflower, Campanula aparinoides\nn12038406\tclustered bellflower, Campanula glomerata\nn12038585\tpeach bells, peach bell, willow bell, Campanula persicifolia\nn12038760\tchimney plant, chimney bellflower, Campanula pyramidalis\nn12038898\trampion, rampion bellflower, Campanula rapunculus\nn12039317\ttussock bellflower, spreading bellflower, Campanula carpatica\nn12041446\torchid, orchidaceous plant\nn12043444\torchis\nn12043673\tmale orchis, early purple orchid, Orchis mascula\nn12043836\tbutterfly orchid, butterfly orchis, Orchis papilionaceae\nn12044041\tshowy orchis, purple orchis, purple-hooded orchis, Orchis 
spectabilis\nn12044467\taerides\nn12044784\tangrecum\nn12045157\tjewel orchid\nn12045514\tputtyroot, adam-and-eve, Aplectrum hyemale\nn12045860\tarethusa\nn12046028\tbog rose, wild pink, dragon's mouth, Arethusa bulbosa\nn12046428\tbletia\nn12046815\tBletilla striata, Bletia striata\nn12047345\tbrassavola\nn12047884\tspider orchid, Brassia lawrenceana\nn12048056\tspider orchid, Brassia verrucosa\nn12048399\tcaladenia\nn12048928\tcalanthe\nn12049282\tgrass pink, Calopogon pulchellum, Calopogon tuberosum\nn12049562\tcalypso, fairy-slipper, Calypso bulbosa\nn12050533\tcattleya\nn12050959\thelleborine\nn12051103\tred helleborine, Cephalanthera rubra\nn12051514\tspreading pogonia, funnel-crest rosebud orchid, Cleistes divaricata, Pogonia divaricata\nn12051792\trosebud orchid, Cleistes rosea, Pogonia rosea\nn12052267\tsatyr orchid, Coeloglossum bracteatum\nn12052447\tfrog orchid, Coeloglossum viride\nn12052787\tcoelogyne\nn12053405\tcoral root\nn12053690\tspotted coral root, Corallorhiza maculata\nn12053962\tstriped coral root, Corallorhiza striata\nn12054195\tearly coral root, pale coral root, Corallorhiza trifida\nn12055073\tswan orchid, swanflower, swan-flower, swanneck, swan-neck\nn12055516\tcymbid, cymbidium\nn12056099\tcypripedia\nn12056217\tlady's slipper, lady-slipper, ladies' slipper, slipper orchid\nn12056601\tmoccasin flower, nerveroot, Cypripedium acaule\nn12056758\tcommon lady's-slipper, showy lady's-slipper, showy lady slipper, Cypripedium reginae, Cypripedium album\nn12056990\tram's-head, ram's-head lady's slipper, Cypripedium arietinum\nn12057211\tyellow lady's slipper, yellow lady-slipper, Cypripedium calceolus, Cypripedium parviflorum\nn12057447\tlarge yellow lady's slipper, Cypripedium calceolus pubescens\nn12057660\tCalifornia lady's slipper, Cypripedium californicum\nn12057895\tclustered lady's slipper, Cypripedium fasciculatum\nn12058192\tmountain lady's slipper, Cypripedium montanum\nn12058630\tmarsh orchid\nn12058822\tcommon spotted orchid, 
Dactylorhiza fuchsii, Dactylorhiza maculata fuchsii\nn12059314\tdendrobium\nn12059625\tdisa\nn12060546\tphantom orchid, snow orchid, Eburophyton austinae\nn12061104\ttulip orchid, Encyclia citrina, Cattleya citrina\nn12061380\tbutterfly orchid, Encyclia tampensis, Epidendrum tampense\nn12061614\tbutterfly orchid, butterfly orchis, Epidendrum venosum, Encyclia venosa\nn12062105\tepidendron\nn12062468\thelleborine\nn12062626\tEpipactis helleborine\nn12062781\tstream orchid, chatterbox, giant helleborine, Epipactis gigantea\nn12063211\ttongueflower, tongue-flower\nn12063639\trattlesnake plantain, helleborine\nn12064389\tfragrant orchid, Gymnadenia conopsea\nn12064591\tshort-spurred fragrant orchid, Gymnadenia odoratissima\nn12065316\tfringed orchis, fringed orchid\nn12065649\tfrog orchid\nn12065777\trein orchid, rein orchis\nn12066018\tbog rein orchid, bog candles, Habenaria dilatata\nn12066261\twhite fringed orchis, white fringed orchid, Habenaria albiflora\nn12066451\telegant Habenaria, Habenaria elegans\nn12066630\tpurple-fringed orchid, purple-fringed orchis, Habenaria fimbriata\nn12066821\tcoastal rein orchid, Habenaria greenei\nn12067029\tHooker's orchid, Habenaria hookeri\nn12067193\tragged orchid, ragged orchis, ragged-fringed orchid, green fringed orchis, Habenaria lacera\nn12067433\tprairie orchid, prairie white-fringed orchis, Habenaria leucophaea\nn12067672\tsnowy orchid, Habenaria nivea\nn12067817\tround-leaved rein orchid, Habenaria orbiculata\nn12068138\tpurple fringeless orchid, purple fringeless orchis, Habenaria peramoena\nn12068432\tpurple-fringed orchid, purple-fringed orchis, Habenaria psycodes\nn12068615\tAlaska rein orchid, Habenaria unalascensis\nn12069009\tcrested coral root, Hexalectris spicata\nn12069217\tTexas purple spike, Hexalectris warnockii\nn12069679\tlizard orchid, Himantoglossum hircinum\nn12070016\tlaelia\nn12070381\tliparis\nn12070583\ttwayblade\nn12070712\tfen orchid, fen orchis, Liparis loeselii\nn12071259\tbroad-leaved 
twayblade, Listera convallarioides\nn12071477\tlesser twayblade, Listera cordata\nn12071744\ttwayblade, Listera ovata\nn12072210\tgreen adder's mouth, Malaxis-unifolia, Malaxis ophioglossoides\nn12072722\tmasdevallia\nn12073217\tmaxillaria\nn12073554\tpansy orchid\nn12073991\todontoglossum\nn12074408\toncidium, dancing lady orchid, butterfly plant, butterfly orchid\nn12074867\tbee orchid, Ophrys apifera\nn12075010\tfly orchid, Ophrys insectifera, Ophrys muscifera\nn12075151\tspider orchid\nn12075299\tearly spider orchid, Ophrys sphegodes\nn12075830\tVenus' slipper, Venus's slipper, Venus's shoe\nn12076223\tphaius\nn12076577\tmoth orchid, moth plant\nn12076852\tbutterfly plant, Phalaenopsis amabilis\nn12077244\trattlesnake orchid\nn12077944\tlesser butterfly orchid, Platanthera bifolia, Habenaria bifolia\nn12078172\tgreater butterfly orchid, Platanthera chlorantha, Habenaria chlorantha\nn12078451\tprairie white-fringed orchid, Platanthera leucophea\nn12078747\ttangle orchid\nn12079120\tIndian crocus\nn12079523\tpleurothallis\nn12079963\tpogonia\nn12080395\tbutterfly orchid\nn12080588\tPsychopsis krameriana, Oncidium papilio kramerianum\nn12080820\tPsychopsis papilio, Oncidium papilio\nn12081215\thelmet orchid, greenhood\nn12081649\tfoxtail orchid\nn12082131\torange-blossom orchid, Sarcochilus falcatus\nn12083113\tsobralia\nn12083591\tladies' tresses, lady's tresses\nn12083847\tscrew augur, Spiranthes cernua\nn12084158\thooded ladies' tresses, Spiranthes romanzoffiana\nn12084400\twestern ladies' tresses, Spiranthes porrifolia\nn12084555\tEuropean ladies' tresses, Spiranthes spiralis\nn12084890\tstanhopea\nn12085267\tstelis\nn12085664\tfly orchid\nn12086012\tvanda\nn12086192\tblue orchid, Vanda caerulea\nn12086539\tvanilla\nn12086778\tvanilla orchid, Vanilla planifolia\nn12087961\tyam, yam plant\nn12088223\tyam\nn12088327\twhite yam, water yam, Dioscorea alata\nn12088495\tcinnamon vine, Chinese yam, Dioscorea batata\nn12088909\telephant's-foot, tortoise plant, 
Hottentot bread vine, Hottentot's bread vine, Dioscorea elephantipes\nn12089320\twild yam, Dioscorea paniculata\nn12089496\tcush-cush, Dioscorea trifida\nn12089846\tblack bryony, black bindweed, Tamus communis\nn12090890\tprimrose, primula\nn12091213\tEnglish primrose, Primula vulgaris\nn12091377\tcowslip, paigle, Primula veris\nn12091550\toxlip, paigle, Primula elatior\nn12091697\tChinese primrose, Primula sinensis\nn12091953\tpolyanthus, Primula polyantha\nn12092262\tpimpernel\nn12092417\tscarlet pimpernel, red pimpernel, poor man's weatherglass, Anagallis arvensis\nn12092629\tbog pimpernel, Anagallis tenella\nn12092930\tchaffweed, bastard pimpernel, false pimpernel\nn12093329\tcyclamen, Cyclamen purpurascens\nn12093600\tsowbread, Cyclamen hederifolium, Cyclamen neopolitanum\nn12093885\tsea milkwort, sea trifoly, black saltwort, Glaux maritima\nn12094244\tfeatherfoil, feather-foil\nn12094401\twater gillyflower, American featherfoil, Hottonia inflata\nn12094612\twater violet, Hottonia palustris\nn12095020\tloosestrife\nn12095281\tgooseneck loosestrife, Lysimachia clethroides Duby\nn12095412\tyellow pimpernel, Lysimachia nemorum\nn12095543\tfringed loosestrife, Lysimachia ciliatum\nn12095647\tmoneywort, creeping Jenny, creeping Charlie, Lysimachia nummularia\nn12095934\tswamp candles, Lysimachia terrestris\nn12096089\twhorled loosestrife, Lysimachia quadrifolia\nn12096395\twater pimpernel\nn12096563\tbrookweed, Samolus valerandii\nn12096674\tbrookweed, Samolus parviflorus, Samolus floribundus\nn12097396\tcoralberry, spiceberry, Ardisia crenata\nn12097556\tmarlberry, Ardisia escallonoides, Ardisia paniculata\nn12098403\tplumbago\nn12098524\tleadwort, Plumbago europaea\nn12098827\tthrift\nn12099342\tsea lavender, marsh rosemary, statice\nn12100187\tbarbasco, joewood, Jacquinia keyensis\nn12101870\tgramineous plant, graminaceous plant\nn12102133\tgrass\nn12103680\tmidgrass\nn12103894\tshortgrass, short-grass\nn12104104\tsword grass\nn12104238\ttallgrass, 
tall-grass\nn12104501\therbage, pasturage\nn12104734\tgoat grass, Aegilops triuncalis\nn12105125\twheatgrass, wheat-grass\nn12105353\tcrested wheatgrass, crested wheat grass, fairway crested wheat grass, Agropyron cristatum\nn12105828\tbearded wheatgrass, Agropyron subsecundum\nn12105981\twestern wheatgrass, bluestem wheatgrass, Agropyron smithii\nn12106134\tintermediate wheatgrass, Agropyron intermedium, Elymus hispidus\nn12106323\tslender wheatgrass, Agropyron trachycaulum, Agropyron pauciflorum, Elymus trachycaulos\nn12107002\tvelvet bent, velvet bent grass, brown bent, Rhode Island bent, dog bent, Agrostis canina\nn12107191\tcloud grass, Agrostis nebulosa\nn12107710\tmeadow foxtail, Alopecurus pratensis\nn12107970\tfoxtail, foxtail grass\nn12108432\tbroom grass\nn12108613\tbroom sedge, Andropogon virginicus\nn12108871\ttall oat grass, tall meadow grass, evergreen grass, false oat, French rye, Arrhenatherum elatius\nn12109365\ttoetoe, toitoi, Arundo conspicua, Chionochloa conspicua\nn12109827\toat\nn12110085\tcereal oat, Avena sativa\nn12110236\twild oat, wild oat grass, Avena fatua\nn12110352\tslender wild oat, Avena barbata\nn12110475\twild red oat, animated oat, Avene sterilis\nn12110778\tbrome, bromegrass\nn12111238\tchess, cheat, Bromus secalinus\nn12111627\tfield brome, Bromus arvensis\nn12112008\tgrama, grama grass, gramma, gramma grass\nn12112337\tblack grama, Bouteloua eriopoda\nn12112609\tbuffalo grass, Buchloe dactyloides\nn12112918\treed grass\nn12113195\tfeather reed grass, feathertop, Calamagrostis acutiflora\nn12113323\tAustralian reed grass, Calamagrostic quadriseta\nn12113657\tburgrass, bur grass\nn12114010\tbuffel grass, Cenchrus ciliaris, Pennisetum cenchroides\nn12114590\tRhodes grass, Chloris gayana\nn12115180\tpampas grass, Cortaderia selloana\nn12116058\tgiant star grass, Cynodon plectostachyum\nn12116429\torchard grass, cocksfoot, cockspur, Dactylis glomerata\nn12116734\tEgyptian grass, crowfoot grass, Dactyloctenium 
aegypticum\nn12117017\tcrabgrass, crab grass, finger grass\nn12117235\tsmooth crabgrass, Digitaria ischaemum\nn12117326\tlarge crabgrass, hairy finger grass, Digitaria sanguinalis\nn12117695\tbarnyard grass, barn grass, barn millet, Echinochloa crusgalli\nn12117912\tJapanese millet, billion-dollar grass, Japanese barnyard millet, sanwa millet, Echinochloa frumentacea\nn12118414\tyardgrass, yard grass, wire grass, goose grass, Eleusine indica\nn12118661\tfinger millet, ragi, ragee, African millet, coracan, corakan, kurakkan, Eleusine coracana\nn12119099\tlyme grass\nn12119238\twild rye\nn12119390\tgiant ryegrass, Elymus condensatus, Leymus condensatus\nn12119539\tsea lyme grass, European dune grass, Elymus arenarius, Leymus arenaria\nn12119717\tCanada wild rye, Elymus canadensis\nn12120347\tteff, teff grass, Eragrostis tef, Eragrostic abyssinica\nn12120578\tweeping love grass, African love grass, Eragrostis curvula\nn12121033\tplume grass\nn12121187\tRavenna grass, wool grass, Erianthus ravennae\nn12121610\tfescue, fescue grass, meadow fescue, Festuca elatior\nn12122442\treed meadow grass, Glyceria grandis\nn12122725\tvelvet grass, Yorkshire fog, Holcus lanatus\nn12122918\tcreeping soft grass, Holcus mollis\nn12123648\tbarleycorn\nn12123741\tbarley grass, wall barley, Hordeum murinum\nn12124172\tlittle barley, Hordeum pusillum\nn12124627\trye grass, ryegrass\nn12124818\tperennial ryegrass, English ryegrass, Lolium perenne\nn12125001\tItalian ryegrass, Italian rye, Lolium multiflorum\nn12125183\tdarnel, tare, bearded darnel, cheat, Lolium temulentum\nn12125584\tnimblewill, nimble Will, Muhlenbergia schreberi\nn12126084\tcultivated rice, Oryza sativa\nn12126360\tricegrass, rice grass\nn12126736\tsmilo, smilo grass, Oryzopsis miliacea\nn12127460\tswitch grass, Panicum virgatum\nn12127575\tbroomcorn millet, hog millet, Panicum miliaceum\nn12127768\tgoose grass, Texas millet, Panicum Texanum\nn12128071\tdallisgrass, dallis grass, paspalum, Paspalum 
dilatatum\nn12128306\tBahia grass, Paspalum notatum\nn12128490\tknotgrass, Paspalum distichum\nn12129134\tfountain grass, Pennisetum ruppelii, Pennisetum setaceum\nn12129738\treed canary grass, gardener's garters, lady's laces, ribbon grass, Phalaris arundinacea\nn12129986\tcanary grass, birdseed grass, Phalaris canariensis\nn12130549\ttimothy, herd's grass, Phleum pratense\nn12131405\tbluegrass, blue grass\nn12131550\tmeadowgrass, meadow grass\nn12132092\twood meadowgrass, Poa nemoralis, Agrostis alba\nn12132956\tnoble cane\nn12133151\tmunj, munja, Saccharum bengalense, Saccharum munja\nn12133462\tbroom beard grass, prairie grass, wire grass, Andropogon scoparius, Schizachyrium scoparium\nn12133682\tbluestem, blue stem, Andropogon furcatus, Andropogon gerardii\nn12134025\trye, Secale cereale\nn12134486\tbristlegrass, bristle grass\nn12134695\tgiant foxtail\nn12134836\tyellow bristlegrass, yellow bristle grass, yellow foxtail, glaucous bristlegrass, Setaria glauca\nn12135049\tgreen bristlegrass, green foxtail, rough bristlegrass, bottle-grass, bottle grass, Setaria viridis\nn12135576\tSiberian millet, Setaria italica rubrofructa\nn12135729\tGerman millet, golden wonder millet, Setaria italica stramineofructa\nn12135898\tmillet\nn12136392\trattan, rattan cane\nn12136581\tmalacca\nn12136720\treed\nn12137120\tsorghum\nn12137569\tgrain sorghum\nn12137791\tdurra, doura, dourah, Egyptian corn, Indian millet, Guinea corn\nn12137954\tfeterita, federita, Sorghum vulgare caudatum\nn12138110\thegari\nn12138248\tkaoliang\nn12138444\tmilo, milo maize\nn12138578\tshallu, Sorghum vulgare rosburghii\nn12139196\tbroomcorn, Sorghum vulgare technicum\nn12139575\tcordgrass, cord grass\nn12139793\tsalt reed grass, Spartina cynosuroides\nn12139921\tprairie cordgrass, freshwater cordgrass, slough grass, Spartina pectinmata\nn12140511\tsmut grass, blackseed, carpet grass, Sporobolus poiretii\nn12140759\tsand dropseed, Sporobolus cryptandrus\nn12140903\trush grass, 
rush-grass\nn12141167\tSt. Augustine grass, Stenotaphrum secundatum, buffalo grass\nn12141385\tgrain\nn12141495\tcereal, cereal grass\nn12142085\twheat\nn12142357\twheat berry\nn12142450\tdurum, durum wheat, hard wheat, Triticum durum, Triticum turgidum, macaroni wheat\nn12143065\tspelt, Triticum spelta, Triticum aestivum spelta\nn12143215\temmer, starch wheat, two-grain spelt, Triticum dicoccum\nn12143405\twild wheat, wild emmer, Triticum dicoccum dicoccoides\nn12143676\tcorn, maize, Indian corn, Zea mays\nn12144313\tmealie\nn12144580\tcorn\nn12144987\tdent corn, Zea mays indentata\nn12145148\tflint corn, flint maize, Yankee corn, Zea mays indurata\nn12145477\tpopcorn, Zea mays everta\nn12146311\tzoysia\nn12146488\tManila grass, Japanese carpet grass, Zoysia matrella\nn12146654\tKorean lawn grass, Japanese lawn grass, Zoysia japonica\nn12147226\tbamboo\nn12147835\tcommon bamboo, Bambusa vulgaris\nn12148757\tgiant bamboo, kyo-chiku, Dendrocalamus giganteus\nn12150722\tumbrella plant, umbrella sedge, Cyperus alternifolius\nn12150969\tchufa, yellow nutgrass, earth almond, ground almond, rush nut, Cyperus esculentus\nn12151170\tgalingale, galangal, Cyperus longus\nn12151615\tnutgrass, nut grass, nutsedge, nut sedge, Cyperus rotundus\nn12152031\tsand sedge, sand reed, Carex arenaria\nn12152251\tcypress sedge, Carex pseudocyperus\nn12152532\tcotton grass, cotton rush\nn12152722\tcommon cotton grass, Eriophorum angustifolium\nn12153033\thardstem bulrush, hardstemmed bulrush, Scirpus acutus\nn12153224\twool grass, Scirpus cyperinus\nn12153580\tspike rush\nn12153741\twater chestnut, Chinese water chestnut, Eleocharis dulcis\nn12153914\tneedle spike rush, needle rush, slender spike rush, hair grass, Eleocharis acicularis\nn12154114\tcreeping spike rush, Eleocharis palustris\nn12154773\tpandanus, screw pine\nn12155009\ttextile screw pine, lauhala, Pandanus tectorius\nn12155583\tcattail\nn12155773\tcat's-tail, bullrush, bulrush, nailrod, reed mace, reedmace, Typha 
latifolia\nn12156679\tbur reed\nn12156819\tgrain, caryopsis\nn12157056\tkernel\nn12157179\trye\nn12157769\tgourd, gourd vine\nn12158031\tgourd\nn12158443\tpumpkin, pumpkin vine, autumn pumpkin, Cucurbita pepo\nn12158798\tsquash, squash vine\nn12159055\tsummer squash, summer squash vine, Cucurbita pepo melopepo\nn12159388\tyellow squash\nn12159555\tmarrow, marrow squash, vegetable marrow\nn12159804\tzucchini, courgette\nn12159942\tcocozelle, Italian vegetable marrow\nn12160125\tcymling, pattypan squash\nn12160303\tspaghetti squash\nn12160490\twinter squash, winter squash plant\nn12160857\tacorn squash\nn12161056\thubbard squash, Cucurbita maxima\nn12161285\tturban squash, Cucurbita maxima turbaniformis\nn12161577\tbuttercup squash\nn12161744\tbutternut squash, Cucurbita maxima\nn12161969\twinter crookneck, winter crookneck squash, Cucurbita moschata\nn12162181\tcushaw, Cucurbita mixta, Cucurbita argyrosperma\nn12162425\tprairie gourd, prairie gourd vine, Missouri gourd, wild pumpkin, buffalo gourd, calabazilla, Cucurbita foetidissima\nn12162758\tprairie gourd\nn12163035\tbryony, briony\nn12163279\twhite bryony, devil's turnip, Bryonia alba\nn12164363\tsweet melon, muskmelon, sweet melon vine, Cucumis melo\nn12164656\tcantaloupe, cantaloup, cantaloupe vine, cantaloup vine, Cucumis melo cantalupensis\nn12164881\twinter melon, Persian melon, honeydew melon, winter melon vine, Cucumis melo inodorus\nn12165170\tnet melon, netted melon, nutmeg melon, Cucumis melo reticulatus\nn12165384\tcucumber, cucumber vine, Cucumis sativus\nn12165758\tsquirting cucumber, exploding cucumber, touch-me-not, Ecballium elaterium\nn12166128\tbottle gourd, calabash, Lagenaria siceraria\nn12166424\tluffa, dishcloth gourd, sponge gourd, rag gourd, strainer vine\nn12166793\tloofah, vegetable sponge, Luffa cylindrica\nn12166929\tangled loofah, sing-kwa, Luffa acutangula\nn12167075\tloofa, loofah, luffa, loufah sponge\nn12167436\tbalsam apple, Momordica balsamina\nn12167602\tbalsam pear, 
Momordica charantia\nn12168565\tlobelia\nn12169099\twater lobelia, Lobelia dortmanna\nn12170585\tmallow\nn12171098\tmusk mallow, mus rose, Malva moschata\nn12171316\tcommon mallow, Malva neglecta\nn12171966\tokra, gumbo, okra plant, lady's-finger, Abelmoschus esculentus, Hibiscus esculentus\nn12172364\tokra\nn12172481\tabelmosk, musk mallow, Abelmoschus moschatus, Hibiscus moschatus\nn12172906\tflowering maple\nn12173069\tvelvetleaf, velvet-leaf, velvetweed, Indian mallow, butter-print, China jute, Abutilon theophrasti\nn12173664\thollyhock\nn12173912\trose mallow, Alcea rosea, Althea rosea\nn12174311\talthea, althaea, hollyhock\nn12174521\tmarsh mallow, white mallow, Althea officinalis\nn12174926\tpoppy mallow\nn12175181\tfringed poppy mallow, Callirhoe digitata\nn12175370\tpurple poppy mallow, Callirhoe involucrata\nn12175598\tclustered poppy mallow, Callirhoe triangulata\nn12176453\tsea island cotton, tree cotton, Gossypium barbadense\nn12176709\tLevant cotton, Gossypium herbaceum\nn12176953\tupland cotton, Gossypium hirsutum\nn12177129\tPeruvian cotton, Gossypium peruvianum\nn12177455\twild cotton, Arizona wild cotton, Gossypium thurberi\nn12178129\tkenaf, kanaf, deccan hemp, bimli, bimli hemp, Indian hemp, Bombay hemp, Hibiscus cannabinus\nn12178780\tsorrel tree, Hibiscus heterophyllus\nn12178896\trose mallow, swamp mallow, common rose mallow, swamp rose mallow, Hibiscus moscheutos\nn12179122\tcotton rose, Confederate rose, Confederate rose mallow, Hibiscus mutabilis\nn12179632\troselle, rozelle, sorrel, red sorrel, Jamaica sorrel, Hibiscus sabdariffa\nn12180168\tmahoe, majagua, mahagua, balibago, purau, Hibiscus tiliaceus\nn12180456\tflower-of-an-hour, flowers-of-an-hour, bladder ketmia, black-eyed Susan, Hibiscus trionum\nn12180885\tlacebark, ribbonwood, houhere, Hoheria populnea\nn12181352\twild hollyhock, Iliamna remota, Sphaeralcea remota\nn12181612\tmountain hollyhock, Iliamna ruvularis, Iliamna acerifolia\nn12182049\tseashore mallow\nn12182276\tsalt 
marsh mallow, Kosteletzya virginica\nn12183026\tchaparral mallow, Malacothamnus fasciculatus, Sphaeralcea fasciculata\nn12183452\tmalope, Malope trifida\nn12183816\tfalse mallow\nn12184095\twaxmallow, wax mallow, sleeping hibiscus\nn12184468\tglade mallow, Napaea dioica\nn12184912\tpavonia\nn12185254\tribbon tree, ribbonwood, Plagianthus regius, Plagianthus betulinus\nn12185859\tbush hibiscus, Radyera farragei, Hibiscus farragei\nn12186352\tVirginia mallow, Sida hermaphrodita\nn12186554\tQueensland hemp, jellyleaf, Sida rhombifolia\nn12186839\tIndian mallow, Sida spinosa\nn12187247\tcheckerbloom, wild hollyhock, Sidalcea malviflora\nn12187663\tglobe mallow, false mallow\nn12187891\tprairie mallow, red false mallow, Sphaeralcea coccinea, Malvastrum coccineum\nn12188289\ttulipwood tree\nn12188635\tportia tree, bendy tree, seaside mahoe, Thespesia populnea\nn12189429\tred silk-cotton tree, simal, Bombax ceiba, Bombax malabarica\nn12189779\tcream-of-tartar tree, sour gourd, Adansonia gregorii\nn12189987\tbaobab, monkey-bread tree, Adansonia digitata\nn12190410\tkapok, ceiba tree, silk-cotton tree, white silk-cotton tree, Bombay ceiba, God tree, Ceiba pentandra\nn12190869\tdurian, durion, durian tree, Durio zibethinus\nn12191240\tMontezuma\nn12192132\tshaving-brush tree, Pseudobombax ellipticum\nn12192877\tquandong, quandong tree, Brisbane quandong, silver quandong tree, blue fig, Elaeocarpus grandis\nn12193334\tquandong, blue fig\nn12193665\tmakomako, New Zealand wine berry, wineberry, Aristotelia serrata, Aristotelia racemosa\nn12194147\tJamaican cherry, calabur tree, calabura, silk wood, silkwood, Muntingia calabura\nn12194613\tbreakax, breakaxe, break-axe, Sloanea jamaicensis\nn12195391\tsterculia\nn12195533\tPanama tree, Sterculia apetala\nn12195734\tkalumpang, Java olives, Sterculia foetida\nn12196129\tbottle-tree, bottle tree\nn12196336\tflame tree, flame durrajong, Brachychiton acerifolius, Sterculia acerifolia\nn12196527\tflame tree, broad-leaved bottletree, 
Brachychiton australis\nn12196694\tkurrajong, currajong, Brachychiton populneus\nn12196954\tQueensland bottletree, narrow-leaved bottletree, Brachychiton rupestris, Sterculia rupestris\nn12197359\tkola, kola nut, kola nut tree, goora nut, Cola acuminata\nn12197601\tkola nut, cola nut\nn12198286\tChinese parasol tree, Chinese parasol, Japanese varnish tree, phoenix tree, Firmiana simplex\nn12198793\tflannelbush, flannel bush, California beauty\nn12199266\tscrew tree\nn12199399\tnut-leaved screw tree, Helicteres isora\nn12199790\tred beech, brown oak, booyong, crow's foot, stave wood, silky elm, Heritiera trifoliolata, Terrietia trifoliolata\nn12199982\tlooking glass tree, Heritiera macrophylla\nn12200143\tlooking-glass plant, Heritiera littoralis\nn12200504\thoney bell, honeybells, Hermannia verticillata, Mahernia verticillata\nn12200905\tmayeng, maple-leaved bayur, Pterospermum acerifolium\nn12201331\tsilver tree, Tarrietia argyrodendron\nn12201580\tcacao, cacao tree, chocolate tree, Theobroma cacao\nn12201938\tobeche, obechi, arere, samba, Triplochiton scleroxcylon\nn12202936\tlinden, linden tree, basswood, lime, lime tree\nn12203529\tAmerican basswood, American lime, Tilia americana\nn12203699\tsmall-leaved linden, small-leaved lime, Tilia cordata\nn12203896\twhite basswood, cottonwood, Tilia heterophylla\nn12204032\tJapanese linden, Japanese lime, Tilia japonica\nn12204175\tsilver lime, silver linden, Tilia tomentosa\nn12204730\tcorchorus\nn12205460\tAfrican hemp, Sparmannia africana\nn12205694\therb, herbaceous plant\nn12214789\tprotea\nn12215022\thoneypot, king protea, Protea cynaroides\nn12215210\thoneyflower, honey-flower, Protea mellifera\nn12215579\tbanksia\nn12215824\thoneysuckle, Australian honeysuckle, coast banksia, Banksia integrifolia\nn12216215\tsmoke bush\nn12216628\tChilean firebush, Chilean flameflower, Embothrium coccineum\nn12216968\tChilean nut, Chile nut, Chile hazel, Chilean hazelnut, Guevina heterophylla, Guevina 
avellana\nn12217453\tgrevillea\nn12217851\tred-flowered silky oak, Grevillea banksii\nn12218274\tsilky oak, Grevillea robusta\nn12218490\tbeefwood, Grevillea striata\nn12218868\tcushion flower, pincushion hakea, Hakea laurina\nn12219668\trewa-rewa, New Zealand honeysuckle\nn12220019\thoneyflower, honey-flower, mountain devil, Lambertia formosa\nn12220496\tsilver tree, Leucadendron argenteum\nn12220829\tlomatia\nn12221191\tmacadamia, macadamia tree\nn12221368\tMacadamia integrifolia\nn12221522\tmacadamia nut, macadamia nut tree, Macadamia ternifolia\nn12221801\tQueensland nut, Macadamia tetraphylla\nn12222090\tprickly ash, Orites excelsa\nn12222493\tgeebung\nn12222900\twheel tree, firewheel tree, Stenocarpus sinuatus\nn12223160\tscrub beefwood, beefwood, Stenocarpus salignus\nn12223569\twaratah, Telopea Oreades\nn12223764\twaratah, Telopea speciosissima\nn12224978\tcasuarina\nn12225222\tshe-oak\nn12225349\tbeefwood\nn12225563\tAustralian pine, Casuarina equisetfolia\nn12226932\theath\nn12227658\ttree heath, briar, brier, Erica arborea\nn12227909\tbriarroot\nn12228229\twinter heath, spring heath, Erica carnea\nn12228387\tbell heather, heather bell, fine-leaved heath, Erica cinerea\nn12228689\tCornish heath, Erica vagans\nn12228886\tSpanish heath, Portuguese heath, Erica lusitanica\nn12229111\tPrince-of-Wales'-heath, Prince of Wales heath, Erica perspicua\nn12229651\tbog rosemary, moorwort, Andromeda glaucophylla\nn12229887\tmarsh andromeda, common bog rosemary, Andromeda polifolia\nn12230540\tmadrona, madrono, manzanita, Arbutus menziesii\nn12230794\tstrawberry tree, Irish strawberry, Arbutus unedo\nn12231192\tbearberry\nn12231709\talpine bearberry, black bearberry, Arctostaphylos alpina\nn12232114\theartleaf manzanita, Arctostaphylos andersonii\nn12232280\tParry manzanita, Arctostaphylos manzanita\nn12232851\tspike heath, Bruckenthalia spiculifolia\nn12233249\tbryanthus\nn12234318\tleatherleaf, Chamaedaphne calyculata\nn12234669\tConnemara heath, St. 
Dabeoc's heath, Daboecia cantabrica\nn12235051\ttrailing arbutus, mayflower, Epigaea repens\nn12235479\tcreeping snowberry, moxie plum, maidenhair berry, Gaultheria hispidula\nn12236160\tsalal, shallon, Gaultheria shallon\nn12236546\thuckleberry\nn12236768\tblack huckleberry, Gaylussacia baccata\nn12236977\tdangleberry, dangle-berry, Gaylussacia frondosa\nn12237152\tbox huckleberry, Gaylussacia brachycera\nn12237486\tkalmia\nn12237641\tmountain laurel, wood laurel, American laurel, calico bush, Kalmia latifolia\nn12237855\tswamp laurel, bog laurel, bog kalmia, Kalmia polifolia\nn12238756\ttrapper's tea, glandular Labrador tea\nn12238913\twild rosemary, marsh tea, Ledum palustre\nn12239240\tsand myrtle, Leiophyllum buxifolium\nn12239647\tleucothoe\nn12239880\tdog laurel, dog hobble, switch-ivy, Leucothoe fontanesiana, Leucothoe editorum\nn12240150\tsweet bells, Leucothoe racemosa\nn12240477\talpine azalea, mountain azalea, Loiseleuria procumbens\nn12240965\tstaggerbush, stagger bush, Lyonia mariana\nn12241192\tmaleberry, male berry, privet andromeda, he-huckleberry, Lyonia ligustrina\nn12241426\tfetterbush, fetter bush, shiny lyonia, Lyonia lucida\nn12241880\tfalse azalea, fool's huckleberry, Menziesia ferruginea\nn12242123\tminniebush, minnie bush, Menziesia pilosa\nn12242409\tsorrel tree, sourwood, titi, Oxydendrum arboreum\nn12242850\tmountain heath, Phyllodoce caerulea, Bryanthus taxifolius\nn12243109\tpurple heather, Brewer's mountain heather, Phyllodoce breweri\nn12243693\tfetterbush, mountain fetterbush, mountain andromeda, Pieris floribunda\nn12244153\trhododendron\nn12244458\tcoast rhododendron, Rhododendron californicum\nn12244650\trosebay, Rhododendron maxima\nn12244819\tswamp azalea, swamp honeysuckle, white honeysuckle, Rhododendron viscosum\nn12245319\tazalea\nn12245695\tcranberry\nn12245885\tAmerican cranberry, large cranberry, Vaccinium macrocarpon\nn12246037\tEuropean cranberry, small cranberry, Vaccinium oxycoccus\nn12246232\tblueberry, blueberry 
bush\nn12246773\tfarkleberry, sparkleberry, Vaccinium arboreum\nn12246941\tlow-bush blueberry, low blueberry, Vaccinium angustifolium, Vaccinium pennsylvanicum\nn12247202\trabbiteye blueberry, rabbit-eye blueberry, rabbiteye, Vaccinium ashei\nn12247407\tdwarf bilberry, dwarf blueberry, Vaccinium caespitosum\nn12247963\tevergreen blueberry, Vaccinium myrsinites\nn12248141\tevergreen huckleberry, Vaccinium ovatum\nn12248359\tbilberry, thin-leaved bilberry, mountain blue berry, Viccinium membranaceum\nn12248574\tbilberry, whortleberry, whinberry, blaeberry, Viccinium myrtillus\nn12248780\tbog bilberry, bog whortleberry, moor berry, Vaccinium uliginosum alpinum\nn12248941\tdryland blueberry, dryland berry, Vaccinium pallidum\nn12249122\tgrouseberry, grouse-berry, grouse whortleberry, Vaccinium scoparium\nn12249294\tdeerberry, squaw huckleberry, Vaccinium stamineum\nn12249542\tcowberry, mountain cranberry, lingonberry, lingenberry, lingberry, foxberry, Vaccinium vitis-idaea\nn12251001\tdiapensia\nn12251278\tgalax, galaxy, wandflower, beetleweed, coltsfoot, Galax urceolata\nn12251740\tpyxie, pixie, pixy, Pyxidanthera barbulata\nn12252168\tshortia\nn12252383\toconee bells, Shortia galacifolia\nn12252866\tAustralian heath\nn12253229\tepacris\nn12253487\tcommon heath, Epacris impressa\nn12253664\tcommon heath, blunt-leaf heath, Epacris obtusifolia\nn12253835\tPort Jackson heath, Epacris purpurascens\nn12254168\tnative cranberry, groundberry, ground-berry, cranberry heath, Astroloma humifusum, Styphelia humifusum\nn12255225\tpink fivecorner, Styphelia triflora\nn12256112\twintergreen, pyrola\nn12256325\tfalse wintergreen, Pyrola americana, Pyrola rotundifolia americana\nn12256522\tlesser wintergreen, Pyrola minor\nn12256708\twild lily of the valley, shinleaf, Pyrola elliptica\nn12256920\twild lily of the valley, Pyrola rotundifolia\nn12257570\tpipsissewa, prince's pine\nn12257725\tlove-in-winter, western prince's pine, Chimaphila umbellata, Chimaphila 
corymbosa\nn12258101\tone-flowered wintergreen, one-flowered pyrola, Moneses uniflora, Pyrola uniflora\nn12258885\tIndian pipe, waxflower, Monotropa uniflora\nn12259316\tpinesap, false beachdrops, Monotropa hypopithys\nn12260799\tbeech, beech tree\nn12261359\tcommon beech, European beech, Fagus sylvatica\nn12261571\tcopper beech, purple beech, Fagus sylvatica atropunicea, Fagus purpurea, Fagus sylvatica purpurea\nn12261808\tAmerican beech, white beech, red beech, Fagus grandifolia, Fagus americana\nn12262018\tweeping beech, Fagus pendula, Fagus sylvatica pendula\nn12262185\tJapanese beech\nn12262553\tchestnut, chestnut tree\nn12263038\tAmerican chestnut, American sweet chestnut, Castanea dentata\nn12263204\tEuropean chestnut, sweet chestnut, Spanish chestnut, Castanea sativa\nn12263410\tChinese chestnut, Castanea mollissima\nn12263588\tJapanese chestnut, Castanea crenata\nn12263738\tAllegheny chinkapin, eastern chinquapin, chinquapin, dwarf chestnut, Castanea pumila\nn12263987\tOzark chinkapin, Ozark chinquapin, chinquapin, Castanea ozarkensis\nn12264512\toak chestnut\nn12264786\tgiant chinkapin, golden chinkapin, Chrysolepis chrysophylla, Castanea chrysophylla, Castanopsis chrysophylla\nn12265083\tdwarf golden chinkapin, Chrysolepis sempervirens\nn12265394\ttanbark oak, Lithocarpus densiflorus\nn12265600\tJapanese oak, Lithocarpus glabra, Lithocarpus glaber\nn12266217\tsouthern beech, evergreen beech\nn12266528\tmyrtle beech, Nothofagus cuninghamii\nn12266644\tCoigue, Nothofagus dombeyi\nn12266796\tNew Zealand beech\nn12266984\tsilver beech, Nothofagus menziesii\nn12267133\troble beech, Nothofagus obliqua\nn12267265\trauli beech, Nothofagus procera\nn12267411\tblack beech, Nothofagus solanderi\nn12267534\thard beech, Nothofagus truncata\nn12267677\tacorn\nn12267931\tcupule, acorn cup\nn12268246\toak, oak tree\nn12269241\tlive oak\nn12269406\tcoast live oak, California live oak, Quercus agrifolia\nn12269652\twhite oak\nn12270027\tAmerican white oak, Quercus 
alba\nn12270278\tArizona white oak, Quercus arizonica\nn12270460\tswamp white oak, swamp oak, Quercus bicolor\nn12270741\tEuropean turkey oak, turkey oak, Quercus cerris\nn12270946\tcanyon oak, canyon live oak, maul oak, iron oak, Quercus chrysolepis\nn12271187\tscarlet oak, Quercus coccinea\nn12271451\tjack oak, northern pin oak, Quercus ellipsoidalis\nn12271643\tred oak\nn12271933\tsouthern red oak, swamp red oak, turkey oak, Quercus falcata\nn12272239\tOregon white oak, Oregon oak, Garry oak, Quercus garryana\nn12272432\tholm oak, holm tree, holly-leaved oak, evergreen oak, Quercus ilex\nn12272735\tbear oak, Quercus ilicifolia\nn12272883\tshingle oak, laurel oak, Quercus imbricaria\nn12273114\tbluejack oak, turkey oak, Quercus incana\nn12273344\tCalifornia black oak, Quercus kelloggii\nn12273515\tAmerican turkey oak, turkey oak, Quercus laevis\nn12273768\tlaurel oak, pin oak, Quercus laurifolia\nn12273939\tCalifornia white oak, valley oak, valley white oak, roble, Quercus lobata\nn12274151\tovercup oak, Quercus lyrata\nn12274358\tbur oak, burr oak, mossy-cup oak, mossycup oak, Quercus macrocarpa\nn12274630\tscrub oak\nn12274863\tblackjack oak, blackjack, jack oak, Quercus marilandica\nn12275131\tswamp chestnut oak, Quercus michauxii\nn12275317\tJapanese oak, Quercus mongolica, Quercus grosseserrata\nn12275489\tchestnut oak\nn12275675\tchinquapin oak, chinkapin oak, yellow chestnut oak, Quercus muehlenbergii\nn12275888\tmyrtle oak, seaside scrub oak, Quercus myrtifolia\nn12276110\twater oak, possum oak, Quercus nigra\nn12276314\tNuttall oak, Nuttall's oak, Quercus nuttalli\nn12276477\tdurmast, Quercus petraea, Quercus sessiliflora\nn12276628\tbasket oak, cow oak, Quercus prinus, Quercus montana\nn12276872\tpin oak, swamp oak, Quercus palustris\nn12277150\twillow oak, Quercus phellos\nn12277334\tdwarf chinkapin oak, dwarf chinquapin oak, dwarf oak, Quercus prinoides\nn12277578\tcommon oak, English oak, pedunculate oak, Quercus robur\nn12277800\tnorthern red oak, 
Quercus rubra, Quercus borealis\nn12278107\tShumard oak, Shumard red oak, Quercus shumardii\nn12278371\tpost oak, box white oak, brash oak, iron oak, Quercus stellata\nn12278650\tcork oak, Quercus suber\nn12278865\tSpanish oak, Quercus texana\nn12279060\thuckleberry oak, Quercus vaccinifolia\nn12279293\tChinese cork oak, Quercus variabilis\nn12279458\tblack oak, yellow oak, quercitron, quercitron oak, Quercus velutina\nn12279772\tsouthern live oak, Quercus virginiana\nn12280060\tinterior live oak, Quercus wislizenii, Quercus wizlizenii\nn12280364\tmast\nn12281241\tbirch, birch tree\nn12281788\tyellow birch, Betula alleghaniensis, Betula leutea\nn12281974\tAmerican white birch, paper birch, paperbark birch, canoe birch, Betula cordifolia, Betula papyrifera\nn12282235\tgrey birch, gray birch, American grey birch, American gray birch, Betula populifolia\nn12282527\tsilver birch, common birch, European white birch, Betula pendula\nn12282737\tdowny birch, white birch, Betula pubescens\nn12282933\tblack birch, river birch, red birch, Betula nigra\nn12283147\tsweet birch, cherry birch, black birch, Betula lenta\nn12283395\tYukon white birch, Betula neoalaskana\nn12283542\tswamp birch, water birch, mountain birch, Western paper birch, Western birch, Betula fontinalis\nn12283790\tNewfoundland dwarf birch, American dwarf birch, Betula glandulosa\nn12284262\talder, alder tree\nn12284821\tcommon alder, European black alder, Alnus glutinosa, Alnus vulgaris\nn12285049\tgrey alder, gray alder, Alnus incana\nn12285195\tseaside alder, Alnus maritima\nn12285369\twhite alder, mountain alder, Alnus rhombifolia\nn12285512\tred alder, Oregon alder, Alnus rubra\nn12285705\tspeckled alder, Alnus rugosa\nn12285900\tsmooth alder, hazel alder, Alnus serrulata\nn12286068\tgreen alder, Alnus veridis\nn12286197\tgreen alder, Alnus veridis crispa, Alnus crispa\nn12286826\thornbeam\nn12286988\tEuropean hornbeam, Carpinus betulus\nn12287195\tAmerican hornbeam, Carpinus caroliniana\nn12287642\thop 
hornbeam\nn12287836\tOld World hop hornbeam, Ostrya carpinifolia\nn12288005\tEastern hop hornbeam, ironwood, ironwood tree, Ostrya virginiana\nn12288823\thazelnut, hazel, hazelnut tree\nn12289310\tAmerican hazel, Corylus americana\nn12289433\tcobnut, filbert, Corylus avellana, Corylus avellana grandis\nn12289585\tbeaked hazelnut, Corylus cornuta\nn12290748\tcentaury\nn12290975\trosita, Centaurium calycosum\nn12291143\tlesser centaury, Centaurium minus\nn12291459\tseaside centaury\nn12291671\tslender centaury\nn12291959\tprairie gentian, tulip gentian, bluebell, Eustoma grandiflorum\nn12292463\tPersian violet, Exacum affine\nn12292877\tcolumbo, American columbo, deer's-ear, deer's-ears, pyramid plant, American gentian\nn12293723\tgentian\nn12294124\tgentianella, Gentiana acaulis\nn12294331\tclosed gentian, blind gentian, bottle gentian, Gentiana andrewsii\nn12294542\texplorer's gentian, Gentiana calycosa\nn12294723\tclosed gentian, blind gentian, Gentiana clausa\nn12294871\tgreat yellow gentian, Gentiana lutea\nn12295033\tmarsh gentian, calathian violet, Gentiana pneumonanthe\nn12295237\tsoapwort gentian, Gentiana saponaria\nn12295429\tstriped gentian, Gentiana villosa\nn12295796\tagueweed, ague weed, five-flowered gentian, stiff gentian, Gentianella quinquefolia, Gentiana quinquefolia\nn12296045\tfelwort, gentianella amarella\nn12296432\tfringed gentian\nn12296735\tGentianopsis crinita, Gentiana crinita\nn12296929\tGentianopsis detonsa, Gentiana detonsa\nn12297110\tGentianopsid procera, Gentiana procera\nn12297280\tGentianopsis thermalis, Gentiana thermalis\nn12297507\ttufted gentian, Gentianopsis holopetala, Gentiana holopetala\nn12297846\tspurred gentian\nn12298165\tsabbatia\nn12299640\ttoothbrush tree, mustard tree, Salvadora persica\nn12300840\tolive tree\nn12301180\tolive, European olive tree, Olea europaea\nn12301445\tolive\nn12301613\tblack maire, Olea cunninghamii\nn12301766\twhite maire, Olea lanceolata\nn12302071\tfringe tree\nn12302248\tfringe bush, 
Chionanthus virginicus\nn12302565\tforestiera\nn12303083\tforsythia\nn12303462\tash, ash tree\nn12304115\twhite ash, Fraxinus Americana\nn12304286\tswamp ash, Fraxinus caroliniana\nn12304420\tflowering ash, Fraxinus cuspidata\nn12304703\tEuropean ash, common European ash, Fraxinus excelsior\nn12304899\tOregon ash, Fraxinus latifolia, Fraxinus oregona\nn12305089\tblack ash, basket ash, brown ash, hoop ash, Fraxinus nigra\nn12305293\tmanna ash, flowering ash, Fraxinus ornus\nn12305475\tred ash, downy ash, Fraxinus pennsylvanica\nn12305654\tgreen ash, Fraxinus pennsylvanica subintegerrima\nn12305819\tblue ash, Fraxinus quadrangulata\nn12305986\tmountain ash, Fraxinus texensis\nn12306089\tpumpkin ash, Fraxinus tomentosa\nn12306270\tArizona ash, Fraxinus velutina\nn12306717\tjasmine\nn12306938\tprimrose jasmine, Jasminum mesnyi\nn12307076\twinter jasmine, Jasminum nudiflorum\nn12307240\tcommon jasmine, true jasmine, jessamine, Jasminum officinale\nn12307756\tprivet\nn12308112\tAmur privet, Ligustrum amurense\nn12308447\tJapanese privet, Ligustrum japonicum\nn12308907\tLigustrum obtusifolium\nn12309277\tcommon privet, Ligustrum vulgare\nn12309630\tdevilwood, American olive, Osmanthus americanus\nn12310021\tmock privet\nn12310349\tlilac\nn12310638\tHimalayan lilac, Syringa emodi\nn12311045\tPersian lilac, Syringa persica\nn12311224\tJapanese tree lilac, Syringa reticulata, Syringa amurensis japonica\nn12311413\tJapanese lilac, Syringa villosa\nn12311579\tcommon lilac, Syringa vulgaris\nn12312110\tbloodwort\nn12312728\tkangaroo paw, kangaroo's paw, kangaroo's-foot, kangaroo-foot plant, Australian sword lily, Anigozanthus manglesii\nn12315060\tVirginian witch hazel, Hamamelis virginiana\nn12315245\tvernal witch hazel, Hamamelis vernalis\nn12315598\twinter hazel, flowering hazel\nn12315999\tfothergilla, witch alder\nn12316444\tliquidambar\nn12316572\tsweet gum, sweet gum tree, bilsted, red gum, American sweet gum, Liquidambar styraciflua\nn12317296\tiron tree, iron-tree, 
ironwood, ironwood tree\nn12318378\twalnut, walnut tree\nn12318782\tCalifornia black walnut, Juglans californica\nn12318965\tbutternut, butternut tree, white walnut, Juglans cinerea\nn12319204\tblack walnut, black walnut tree, black hickory, Juglans nigra\nn12319414\tEnglish walnut, English walnut tree, Circassian walnut, Persian walnut, Juglans regia\nn12320010\thickory, hickory tree\nn12320414\twater hickory, bitter pecan, water bitternut, Carya aquatica\nn12320627\tpignut, pignut hickory, brown hickory, black hickory, Carya glabra\nn12320806\tbitternut, bitternut hickory, bitter hickory, bitter pignut, swamp hickory, Carya cordiformis\nn12321077\tpecan, pecan tree, Carya illinoensis, Carya illinoinsis\nn12321395\tbig shellbark, big shellbark hickory, big shagbark, king nut, king nut hickory, Carya laciniosa\nn12321669\tnutmeg hickory, Carya myristicaeformis, Carya myristiciformis\nn12321873\tshagbark, shagbark hickory, shellbark, shellbark hickory, Carya ovata\nn12322099\tmockernut, mockernut hickory, black hickory, white-heart hickory, big-bud hickory, Carya tomentosa\nn12322501\twing nut, wing-nut\nn12322699\tCaucasian walnut, Pterocarya fraxinifolia\nn12323665\tdhawa, dhava\nn12324056\tcombretum\nn12324222\thiccup nut, hiccough nut, Combretum bracteosum\nn12324388\tbush willow, Combretum appiculatum\nn12324558\tbush willow, Combretum erythrophyllum\nn12324906\tbutton tree, button mangrove, Conocarpus erectus\nn12325234\twhite mangrove, Laguncularia racemosa\nn12325787\toleaster\nn12327022\twater milfoil\nn12327528\tanchovy pear, anchovy pear tree, Grias cauliflora\nn12327846\tbrazil nut, brazil-nut tree, Bertholletia excelsa\nn12328398\tloosestrife\nn12328567\tpurple loosestrife, spiked loosestrife, Lythrum salicaria\nn12328801\tgrass poly, hyssop loosestrife, Lythrum hyssopifolia\nn12329260\tcrape myrtle, crepe myrtle, crepe flower, Lagerstroemia indica\nn12329473\tQueen's crape myrtle, pride-of-India, Lagerstroemia speciosa\nn12330239\tmyrtaceous 
tree\nn12330469\tmyrtle\nn12330587\tcommon myrtle, Myrtus communis\nn12330891\tbayberry, bay-rum tree, Jamaica bayberry, wild cinnamon, Pimenta acris\nn12331066\tallspice, allspice tree, pimento tree, Pimenta dioica\nn12331263\tallspice tree, Pimenta officinalis\nn12331655\tsour cherry, Eugenia corynantha\nn12331788\tnakedwood, Eugenia dicrana\nn12332030\tSurinam cherry, pitanga, Eugenia uniflora\nn12332218\trose apple, rose-apple tree, jambosa, Eugenia jambos\nn12332555\tfeijoa, feijoa bush\nn12333053\tjaboticaba, jaboticaba tree, Myrciaria cauliflora\nn12333530\tguava, true guava, guava bush, Psidium guajava\nn12333771\tguava, strawberry guava, yellow cattley guava, Psidium littorale\nn12333961\tcattley guava, purple strawberry guava, Psidium cattleianum, Psidium littorale longipes\nn12334153\tBrazilian guava, Psidium guineense\nn12334293\tgum tree, gum\nn12334891\teucalyptus, eucalypt, eucalyptus tree\nn12335483\tflooded gum\nn12335664\tmallee\nn12335800\tstringybark\nn12335937\tsmoothbark\nn12336092\tred gum, peppermint, peppermint gum, Eucalyptus amygdalina\nn12336224\tred gum, marri, Eucalyptus calophylla\nn12336333\triver red gum, river gum, Eucalyptus camaldulensis, Eucalyptus rostrata\nn12336586\tmountain swamp gum, Eucalyptus camphora\nn12336727\tsnow gum, ghost gum, white ash, Eucalyptus coriacea, Eucalyptus pauciflora\nn12336973\talpine ash, mountain oak, Eucalyptus delegatensis\nn12337131\twhite mallee, congoo mallee, Eucalyptus dumosa\nn12337246\twhite stringybark, thin-leaved stringybark, Eucalyptusd eugenioides\nn12337391\twhite mountain ash, Eucalyptus fraxinoides\nn12337617\tblue gum, fever tree, Eucalyptus globulus\nn12337800\trose gum, Eucalypt grandis\nn12337922\tcider gum, Eucalypt gunnii\nn12338034\tswamp gum, Eucalypt ovata\nn12338146\tspotted gum, Eucalyptus maculata\nn12338258\tlemon-scented gum, Eucalyptus citriodora, Eucalyptus maculata citriodora\nn12338454\tblack mallee, black sally, black gum, Eucalytus stellulata\nn12338655\tforest 
red gum, Eucalypt tereticornis\nn12338796\tmountain ash, Eucalyptus regnans\nn12338979\tmanna gum, Eucalyptus viminalis\nn12339526\tclove, clove tree, Syzygium aromaticum, Eugenia aromaticum, Eugenia caryophyllatum\nn12339831\tclove\nn12340383\ttupelo, tupelo tree\nn12340581\twater gum, Nyssa aquatica\nn12340755\tsour gum, black gum, pepperidge, Nyssa sylvatica\nn12341542\tenchanter's nightshade\nn12341931\tCircaea lutetiana\nn12342299\twillowherb\nn12342498\tfireweed, giant willowherb, rosebay willowherb, wickup, Epilobium angustifolium\nn12342852\tCalifornia fuchsia, humming bird's trumpet, Epilobium canum canum, Zauschneria californica\nn12343480\tfuchsia\nn12343753\tlady's-eardrop, ladies'-eardrop, lady's-eardrops, ladies'-eardrops, Fuchsia coccinea\nn12344283\tevening primrose\nn12344483\tcommon evening primrose, German rampion, Oenothera biennis\nn12344700\tsundrops, Oenothera fruticosa\nn12344837\tMissouri primrose, Ozark sundrops, Oenothera macrocarpa\nn12345280\tpomegranate, pomegranate tree, Punica granatum\nn12345899\tmangrove, Rhizophora mangle\nn12346578\tdaphne\nn12346813\tgarland flower, Daphne cneorum\nn12346986\tspurge laurel, wood laurel, Daphne laureola\nn12347158\tmezereon, February daphne, Daphne mezereum\nn12349315\tIndian rhododendron, Melastoma malabathricum\nn12349711\tMedinilla magnifica\nn12350032\tdeer grass, meadow beauty\nn12350758\tcanna\nn12351091\tachira, indian shot, arrowroot, Canna indica, Canna edulis\nn12351790\tarrowroot, American arrowroot, obedience plant, Maranta arundinaceae\nn12352287\tbanana, banana tree\nn12352639\tdwarf banana, Musa acuminata\nn12352844\tJapanese banana, Musa basjoo\nn12352990\tplantain, plantain tree, Musa paradisiaca\nn12353203\tedible banana, Musa paradisiaca sapientum\nn12353431\tabaca, Manila hemp, Musa textilis\nn12353754\tAbyssinian banana, Ethiopian banana, Ensete ventricosum, Musa ensete\nn12355760\tginger\nn12356023\tcommon ginger, Canton ginger, stem ginger, Zingiber 
officinale\nn12356395\tturmeric, Curcuma longa, Curcuma domestica\nn12356960\tgalangal, Alpinia galanga\nn12357485\tshellflower, shall-flower, shell ginger, Alpinia Zerumbet, Alpinia speciosa, Languas speciosa\nn12357968\tgrains of paradise, Guinea grains, Guinea pepper, melagueta pepper, Aframomum melegueta\nn12358293\tcardamom, cardamon, Elettaria cardamomum\nn12360108\tbegonia\nn12360534\tfibrous-rooted begonia\nn12360684\ttuberous begonia\nn12360817\trhizomatous begonia\nn12360958\tChristmas begonia, blooming-fool begonia, Begonia cheimantha\nn12361135\tangel-wing begonia, Begonia cocchinea\nn12361560\tbeefsteak begonia, kidney begonia, Begonia erythrophylla, Begonia feastii\nn12361754\tstar begonia, star-leaf begonia, Begonia heracleifolia\nn12361946\trex begonia, king begonia, painted-leaf begonia, beefsteak geranium, Begonia rex\nn12362274\twax begonia, Begonia semperflorens\nn12362514\tSocotra begonia, Begonia socotrana\nn12362668\thybrid tuberous begonia, Begonia tuberhybrida\nn12363301\tdillenia\nn12363768\tguinea gold vine, guinea flower\nn12364604\tpoon\nn12364940\tcalaba, Santa Maria tree, Calophyllum calaba\nn12365158\tMaria, Calophyllum longifolium\nn12365285\tlaurelwood, lancewood tree, Calophyllum candidissimum\nn12365462\tAlexandrian laurel, Calophyllum inophyllum\nn12365900\tclusia\nn12366053\twild fig, Clusia flava\nn12366186\twaxflower, Clusia insignis\nn12366313\tpitch apple, strangler fig, Clusia rosea, Clusia major\nn12366675\tmangosteen, mangosteen tree, Garcinia mangostana\nn12366870\tgamboge tree, Garcinia hanburyi, Garcinia cambogia, Garcinia gummi-gutta\nn12367611\tSt John's wort\nn12368028\tcommon St John's wort, tutsan, Hypericum androsaemum\nn12368257\tgreat St John's wort, Hypericum ascyron, Hypericum pyramidatum\nn12368451\tcreeping St John's wort, Hypericum calycinum\nn12369066\tlow St Andrew's cross, Hypericum hypericoides\nn12369309\tklammath weed, Hypericum perforatum\nn12369476\tshrubby St John's wort, Hypericum prolificum, 
Hypericum spathulatum\nn12369665\tSt Peter's wort, Hypericum tetrapterum, Hypericum maculatum\nn12369845\tmarsh St-John's wort, Hypericum virginianum\nn12370174\tmammee apple, mammee, mamey, mammee tree, Mammea americana\nn12370549\trose chestnut, ironwood, ironwood tree, Mesua ferrea\nn12371202\tbower actinidia, tara vine, Actinidia arguta\nn12371439\tChinese gooseberry, kiwi, kiwi vine, Actinidia chinensis, Actinidia deliciosa\nn12371704\tsilvervine, silver vine, Actinidia polygama\nn12372233\twild cinnamon, white cinnamon tree, Canella winterana, Canella-alba\nn12373100\tpapaya, papaia, pawpaw, papaya tree, melon tree, Carica papaya\nn12373739\tsouari, souari nut, souari tree, Caryocar nuciferum\nn12374418\trockrose, rock rose\nn12374705\twhite-leaved rockrose, Cistus albidus\nn12374862\tcommon gum cistus, Cistus ladanifer, Cistus ladanum\nn12375769\tfrostweed, frost-weed, frostwort, Helianthemum canadense, Crocanthemum canadense\nn12377198\tdipterocarp\nn12377494\tred lauan, red lauan tree, Shorea teysmanniana\nn12378249\tgovernor's plum, governor plum, Madagascar plum, ramontchi, batoko palm, Flacourtia indica\nn12378753\tkei apple, kei apple bush, Dovyalis caffra\nn12378963\tketembilla, kitembilla, kitambilla, ketembilla tree, Ceylon gooseberry, Dovyalis hebecarpa\nn12379531\tchaulmoogra, chaulmoogra tree, chaulmugra, Hydnocarpus kurzii, Taraktagenos kurzii, Taraktogenos kurzii\nn12380761\twild peach, Kiggelaria africana\nn12381511\tcandlewood\nn12382233\tboojum tree, cirio, Fouquieria columnaris, Idria columnaris\nn12382875\tbird's-eye bush, Ochna serrulata\nn12383737\tgranadilla, purple granadillo, Passiflora edulis\nn12383894\tgranadilla, sweet granadilla, Passiflora ligularis\nn12384037\tgranadilla, giant granadilla, Passiflora quadrangularis\nn12384227\tmaypop, Passiflora incarnata\nn12384375\tJamaica honeysuckle, yellow granadilla, Passiflora laurifolia\nn12384569\tbanana passion fruit, Passiflora mollissima\nn12384680\tsweet calabash, Passiflora 
maliformis\nn12384839\tlove-in-a-mist, running pop, wild water lemon, Passiflora foetida\nn12385429\treseda\nn12385566\tmignonette, sweet reseda, Reseda odorata\nn12385830\tdyer's rocket, dyer's mignonette, weld, Reseda luteola\nn12386945\tfalse tamarisk, German tamarisk, Myricaria germanica\nn12387103\thalophyte\nn12387633\tviola\nn12387839\tviolet\nn12388143\tfield pansy, heartsease, Viola arvensis\nn12388293\tAmerican dog violet, Viola conspersa\nn12388858\tdog violet, heath violet, Viola canina\nn12388989\thorned violet, tufted pansy, Viola cornuta\nn12389130\ttwo-eyed violet, heartsease, Viola ocellata\nn12389501\tbird's-foot violet, pansy violet, Johnny-jump-up, wood violet, Viola pedata\nn12389727\tdowny yellow violet, Viola pubescens\nn12389932\tlong-spurred violet, Viola rostrata\nn12390099\tpale violet, striped violet, cream violet, Viola striata\nn12390314\thedge violet, wood violet, Viola sylvatica, Viola reichenbachiana\nn12392070\tnettle\nn12392549\tstinging nettle, Urtica dioica\nn12392765\tRoman nettle, Urtica pipulifera\nn12393269\tramie, ramee, Chinese silk plant, China grass, Boehmeria nivea\nn12394118\twood nettle, Laportea canadensis\nn12394328\tAustralian nettle, Australian nettle tree\nn12394638\tpellitory-of-the-wall, wall pellitory, pellitory, Parietaria difussa\nn12395068\trichweed, clearweed, dead nettle, Pilea pumilla\nn12395289\tartillery plant, Pilea microphylla\nn12395463\tfriendship plant, panamica, panamiga, Pilea involucrata\nn12395906\tQueensland grass-cloth plant, Pipturus argenteus\nn12396091\tPipturus albidus\nn12396924\tcannabis, hemp\nn12397431\tIndian hemp, Cannabis indica\nn12399132\tmulberry, mulberry tree\nn12399384\twhite mulberry, Morus alba\nn12399534\tblack mulberry, Morus nigra\nn12399656\tred mulberry, Morus rubra\nn12399899\tosage orange, bow wood, mock orange, Maclura pomifera\nn12400489\tbreadfruit, breadfruit tree, Artocarpus communis, Artocarpus altilis\nn12400720\tjackfruit, jackfruit tree, Artocarpus 
heterophyllus\nn12400924\tmarang, marang tree, Artocarpus odoratissima\nn12401335\tfig tree\nn12401684\tfig, common fig, common fig tree, Ficus carica\nn12401893\tcaprifig, Ficus carica sylvestris\nn12402051\tgolden fig, Florida strangler fig, strangler fig, wild fig, Ficus aurea\nn12402348\tbanyan, banyan tree, banian, banian tree, Indian banyan, East Indian fig tree, Ficus bengalensis\nn12402596\tpipal, pipal tree, pipul, peepul, sacred fig, bo tree, Ficus religiosa\nn12402840\tIndia-rubber tree, India-rubber plant, India-rubber fig, rubber plant, Assam rubber, Ficus elastica\nn12403075\tmistletoe fig, mistletoe rubber plant, Ficus diversifolia, Ficus deltoidea\nn12403276\tPort Jackson fig, rusty rig, little-leaf fig, Botany Bay fig, Ficus rubiginosa\nn12403513\tsycamore, sycamore fig, mulberry fig, Ficus sycomorus\nn12403994\tpaper mulberry, Broussonetia papyrifera\nn12404729\ttrumpetwood, trumpet-wood, trumpet tree, snake wood, imbauba, Cecropia peltata\nn12405714\telm, elm tree\nn12406304\twinged elm, wing elm, Ulmus alata\nn12406488\tAmerican elm, white elm, water elm, rock elm, Ulmus americana\nn12406715\tsmooth-leaved elm, European field elm, Ulmus carpinifolia\nn12406902\tcedar elm, Ulmus crassifolia\nn12407079\twitch elm, wych elm, Ulmus glabra\nn12407222\tDutch elm, Ulmus hollandica\nn12407396\tHuntingdon elm, Ulmus hollandica vegetata\nn12407545\twater elm, Ulmus laevis\nn12407715\tChinese elm, Ulmus parvifolia\nn12407890\tEnglish elm, European elm, Ulmus procera\nn12408077\tSiberian elm, Chinese elm, dwarf elm, Ulmus pumila\nn12408280\tslippery elm, red elm, Ulmus rubra\nn12408466\tJersey elm, guernsey elm, wheately elm, Ulmus sarniensis, Ulmus campestris sarniensis, Ulmus campestris wheatleyi\nn12408717\tSeptember elm, red elm, Ulmus serotina\nn12408873\trock elm, Ulmus thomasii\nn12409231\thackberry, nettle tree\nn12409470\tEuropean hackberry, Mediterranean hackberry, Celtis australis\nn12409651\tAmerican hackberry, Celtis 
occidentalis\nn12409840\tsugarberry, Celtis laevigata\nn12411461\tiridaceous plant\nn12412355\tbearded iris\nn12412606\tbeardless iris\nn12412987\torrisroot, orris\nn12413165\tdwarf iris, Iris cristata\nn12413301\tDutch iris, Iris filifolia\nn12413419\tFlorentine iris, orris, Iris germanica florentina, Iris florentina\nn12413642\tstinking iris, gladdon, gladdon iris, stinking gladwyn, roast beef plant, Iris foetidissima\nn12413880\tGerman iris, Iris germanica\nn12414035\tJapanese iris, Iris kaempferi\nn12414159\tGerman iris, Iris kochii\nn12414329\tDalmatian iris, Iris pallida\nn12414449\tPersian iris, Iris persica\nn12414818\tDutch iris, Iris tingitana\nn12414932\tdwarf iris, vernal iris, Iris verna\nn12415595\tSpanish iris, xiphium iris, Iris xiphium\nn12416073\tblackberry-lily, leopard lily, Belamcanda chinensis\nn12416423\tcrocus\nn12416703\tsaffron, saffron crocus, Crocus sativus\nn12417836\tcorn lily\nn12418221\tblue-eyed grass\nn12418507\twandflower, Sparaxis tricolor\nn12419037\tamaryllis\nn12419878\tsalsilla, Bomarea edulis\nn12420124\tsalsilla, Bomarea salsilla\nn12420535\tblood lily\nn12420722\tCape tulip, Haemanthus coccineus\nn12421137\thippeastrum, Hippeastrum puniceum\nn12421467\tnarcissus\nn12421683\tdaffodil, Narcissus pseudonarcissus\nn12421917\tjonquil, Narcissus jonquilla\nn12422129\tjonquil\nn12422559\tJacobean lily, Aztec lily, Strekelia formosissima\nn12425281\tliliaceous plant\nn12426623\tmountain lily, Lilium auratum\nn12426749\tCanada lily, wild yellow lily, meadow lily, wild meadow lily, Lilium canadense\nn12427184\ttiger lily, leopard lily, pine lily, Lilium catesbaei\nn12427391\tColumbia tiger lily, Oregon lily, Lilium columbianum\nn12427566\ttiger lily, devil lily, kentan, Lilium lancifolium\nn12427757\tEaster lily, Bermuda lily, white trumpet lily, Lilium longiflorum\nn12427946\tcoast lily, Lilium maritinum\nn12428076\tTurk's-cap, martagon, Lilium martagon\nn12428242\tMichigan lily, Lilium michiganense\nn12428412\tleopard lily, 
panther lily, Lilium pardalinum\nn12428747\tTurk's-cap, Turk's cap-lily, Lilium superbum\nn12429352\tAfrican lily, African tulip, blue African lily, Agapanthus africanus\nn12430198\tcolicroot, colic root, crow corn, star grass, unicorn root\nn12430471\tague root, ague grass, Aletris farinosa\nn12430675\tyellow colicroot, Aletris aurea\nn12431434\talliaceous plant\nn12432069\tHooker's onion, Allium acuminatum\nn12432356\twild leek, Levant garlic, kurrat, Allium ampeloprasum\nn12432574\tCanada garlic, meadow leek, rose leek, Allium canadense\nn12432707\tkeeled garlic, Allium carinatum\nn12433081\tonion\nn12433178\tshallot, eschalot, multiplier onion, Allium cepa aggregatum, Allium ascalonicum\nn12433769\tnodding onion, nodding wild onion, lady's leek, Allium cernuum\nn12433952\tWelsh onion, Japanese leek, Allium fistulosum\nn12434106\tred-skinned onion, Allium haematochiton\nn12434483\tdaffodil garlic, flowering onion, Naples garlic, Allium neopolitanum\nn12434634\tfew-flowered leek, Allium paradoxum\nn12434775\tgarlic, Allium sativum\nn12434985\tsand leek, giant garlic, Spanish garlic, rocambole, Allium scorodoprasum\nn12435152\tchives, chive, cive, schnittlaugh, Allium schoenoprasum\nn12435486\tcrow garlic, false garlic, field garlic, stag's garlic, wild garlic, Allium vineale\nn12435649\twild garlic, wood garlic, Ramsons, Allium ursinum\nn12435777\tgarlic chive, Chinese chive, Oriental garlic, Allium tuberosum\nn12435965\tround-headed leek, Allium sphaerocephalum\nn12436090\tthree-cornered leek, triquetrous leek, Allium triquetrum\nn12436907\tcape aloe, Aloe ferox\nn12437513\tkniphofia, tritoma, flame flower, flame-flower, flameflower\nn12437769\tpoker plant, Kniphofia uvaria\nn12437930\tred-hot poker, Kniphofia praecox\nn12439154\tfly poison, Amianthum muscaetoxicum, Amianthum muscitoxicum\nn12439830\tamber lily, Anthericum torreyi\nn12441183\tasparagus, edible asparagus, Asparagus officinales\nn12441390\tasparagus fern, Asparagus setaceous, Asparagus 
plumosus\nn12441552\tsmilax, Asparagus asparagoides\nn12441958\tasphodel\nn12442548\tJacob's rod\nn12443323\taspidistra, cast-iron plant, bar-room plant, Aspidistra elatio\nn12443736\tcoral drops, Bessera elegans\nn12444095\tChristmas bells\nn12444898\tclimbing onion, Bowiea volubilis\nn12446200\tmariposa, mariposa tulip, mariposa lily\nn12446519\tglobe lily, fairy lantern\nn12446737\tcat's-ear\nn12446908\twhite globe lily, white fairy lantern, Calochortus albus\nn12447121\tyellow globe lily, golden fairy lantern, Calochortus amabilis\nn12447346\trose globe lily, Calochortus amoenus\nn12447581\tstar tulip, elegant cat's ears, Calochortus elegans\nn12447891\tdesert mariposa tulip, Calochortus kennedyi\nn12448136\tyellow mariposa tulip, Calochortus luteus\nn12448361\tsagebrush mariposa tulip, Calochortus macrocarpus\nn12448700\tsego lily, Calochortus nuttallii\nn12449296\tcamas, camass, quamash, camosh, camash\nn12449526\tcommon camas, Camassia quamash\nn12449784\tLeichtlin's camas, Camassia leichtlinii\nn12449934\twild hyacinth, indigo squill, Camassia scilloides\nn12450344\tdogtooth violet, dogtooth, dog's-tooth violet\nn12450607\twhite dogtooth violet, white dog's-tooth violet, blonde lilian, Erythronium albidum\nn12450840\tyellow adder's tongue, trout lily, amberbell, Erythronium americanum\nn12451070\tEuropean dogtooth, Erythronium dens-canis\nn12451240\tfawn lily, Erythronium californicum\nn12451399\tglacier lily, snow lily, Erythronium grandiflorum\nn12451566\tavalanche lily, Erythronium montanum\nn12451915\tfritillary, checkered lily\nn12452256\tmission bells, rice-grain fritillary, Fritillaria affinis, Fritillaria lanceolata, Fritillaria mutica\nn12452480\tmission bells, black fritillary, Fritillaria biflora\nn12452673\tstink bell, Fritillaria agrestis\nn12452836\tcrown imperial, Fritillaria imperialis\nn12453018\twhite fritillary, Fritillaria liliaceae\nn12453186\tsnake's head fritillary, guinea-hen flower, checkered daffodil, leper lily, Fritillaria 
meleagris\nn12453714\tadobe lily, pink fritillary, Fritillaria pluriflora\nn12453857\tscarlet fritillary, Fritillaria recurva\nn12454159\ttulip\nn12454436\tdwarf tulip, Tulipa armena, Tulipa suaveolens\nn12454556\tlady tulip, candlestick tulip, Tulipa clusiana\nn12454705\tTulipa gesneriana\nn12454793\tcottage tulip\nn12454949\tDarwin tulip\nn12455950\tgloriosa, glory lily, climbing lily, creeping lily, Gloriosa superba\nn12457091\tlemon lily, Hemerocallis lilio-asphodelus, Hemerocallis flava\nn12458550\tcommon hyacinth, Hyacinthus orientalis\nn12458713\tRoman hyacinth, Hyacinthus orientalis albulus\nn12458874\tsummer hyacinth, cape hyacinth, Hyacinthus candicans, Galtonia candicans\nn12459629\tstar-of-Bethlehem\nn12460146\tbath asparagus, Prussian asparagus, Ornithogalum pyrenaicum\nn12460697\tgrape hyacinth\nn12460957\tcommon grape hyacinth, Muscari neglectum\nn12461109\ttassel hyacinth, Muscari comosum\nn12461466\tscilla, squill\nn12461673\tspring squill, Scilla verna, sea onion\nn12462032\tfalse asphodel\nn12462221\tScotch asphodel, Tofieldia pusilla\nn12462582\tsea squill, sea onion, squill, Urginea maritima\nn12462805\tsquill\nn12463134\tbutcher's broom, Ruscus aculeatus\nn12463743\tbog asphodel\nn12463975\tEuropean bog asphodel, Narthecium ossifragum\nn12464128\tAmerican bog asphodel, Narthecium americanum\nn12464476\thellebore, false hellebore\nn12464649\twhite hellebore, American hellebore, Indian poke, bugbane, Veratrum viride\nn12465557\tsquaw grass, bear grass, Xerophyllum tenax\nn12466727\tdeath camas, zigadene\nn12467018\talkali grass, Zigadenus elegans\nn12467197\twhite camas, Zigadenus glaucus\nn12467433\tpoison camas, Zigadenus nuttalli\nn12467592\tgrassy death camas, Zigadenus venenosus, Zigadenus venenosus gramineus\nn12468545\tprairie wake-robin, prairie trillium, Trillium recurvatum\nn12468719\tdwarf-white trillium, snow trillium, early wake-robin\nn12469517\therb Paris, Paris quadrifolia\nn12470092\tsarsaparilla\nn12470512\tbullbrier, 
greenbrier, catbrier, horse brier, horse-brier, brier, briar, Smilax rotundifolia\nn12470907\trough bindweed, Smilax aspera\nn12472024\tclintonia, Clinton's lily\nn12473608\tfalse lily of the valley, Maianthemum canadense\nn12473840\tfalse lily of the valley, Maianthemum bifolium\nn12474167\tSolomon's-seal\nn12474418\tgreat Solomon's-seal, Polygonatum biflorum, Polygonatum commutatum\nn12475035\tbellwort, merry bells, wild oats\nn12475242\tstrawflower, cornflower, Uvularia grandiflora\nn12475774\tpia, Indian arrowroot, Tacca leontopetaloides, Tacca pinnatifida\nn12476510\tagave, century plant, American aloe\nn12477163\tAmerican agave, Agave americana\nn12477401\tsisal, Agave sisalana\nn12477583\tmaguey, cantala, Agave cantala\nn12477747\tmaguey, Agave atrovirens\nn12477983\tAgave tequilana\nn12478768\tcabbage tree, grass tree, Cordyline australis\nn12479537\tdracaena\nn12480456\ttuberose, Polianthes tuberosa\nn12480895\tsansevieria, bowstring hemp\nn12481150\tAfrican bowstring hemp, African hemp, Sansevieria guineensis\nn12481289\tCeylon bowstring hemp, Sansevieria zeylanica\nn12481458\tmother-in-law's tongue, snake plant, Sansevieria trifasciata\nn12482437\tSpanish bayonet, Yucca aloifolia\nn12482668\tSpanish bayonet, Yucca baccata\nn12482893\tJoshua tree, Yucca brevifolia\nn12483282\tsoapweed, soap-weed, soap tree, Yucca elata\nn12483427\tAdam's needle, Adam's needle-and-thread, spoonleaf yucca, needle palm, Yucca filamentosa\nn12483625\tbear grass, Yucca glauca\nn12483841\tSpanish dagger, Yucca gloriosa\nn12484244\tOur Lord's candle, Yucca whipplei\nn12484784\twater shamrock, buckbean, bogbean, bog myrtle, marsh trefoil, Menyanthes trifoliata\nn12485653\tbutterfly bush, buddleia\nn12485981\tyellow jasmine, yellow jessamine, Carolina jasmine, evening trumpet flower, Gelsemium sempervirens\nn12486574\tflax\nn12487058\tcalabar bean, ordeal bean\nn12488454\tbonduc, bonduc tree, Caesalpinia bonduc, Caesalpinia bonducella\nn12488709\tdivi-divi, Caesalpinia 
coriaria\nn12489046\tMysore thorn, Caesalpinia decapetala, Caesalpinia sepiaria\nn12489676\tbrazilian ironwood, Caesalpinia ferrea\nn12489815\tbird of paradise, poinciana, Caesalpinia gilliesii, Poinciana gilliesii\nn12490490\tshingle tree, Acrocarpus fraxinifolius\nn12491017\tmountain ebony, orchid tree, Bauhinia variegata\nn12491435\tmsasa, Brachystegia speciformis\nn12491826\tcassia\nn12492106\tgolden shower tree, drumstick tree, purging cassia, pudding pipe tree, canafistola, canafistula, Cassia fistula\nn12492460\tpink shower, pink shower tree, horse cassia, Cassia grandis\nn12492682\trainbow shower, Cassia javonica\nn12492900\thorse cassia, Cassia roxburghii, Cassia marginata\nn12493208\tcarob, carob tree, carob bean tree, algarroba, Ceratonia siliqua\nn12493426\tcarob, carob bean, algarroba bean, algarroba, locust bean, locust pod\nn12493868\tpaloverde\nn12494794\troyal poinciana, flamboyant, flame tree, peacock flower, Delonix regia, Poinciana regia\nn12495146\tlocust tree, locust\nn12495670\twater locust, swamp locust, Gleditsia aquatica\nn12495895\thoney locust, Gleditsia triacanthos\nn12496427\tKentucky coffee tree, bonduc, chicot, Gymnocladus dioica\nn12496949\tlogwood, logwood tree, campeachy, bloodwood tree, Haematoxylum campechianum\nn12497669\tJerusalem thorn, horsebean, Parkinsonia aculeata\nn12498055\tpalo verde, Parkinsonia florida, Cercidium floridum\nn12498457\tDalmatian laburnum, Petteria ramentacea, Cytisus ramentaceus\nn12499163\tsenna\nn12499757\tavaram, tanner's cassia, Senna auriculata, Cassia auriculata\nn12499979\tAlexandria senna, Alexandrian senna, true senna, tinnevelly senna, Indian senna, Senna alexandrina, Cassia acutifolia, Cassia augustifolia\nn12500309\twild senna, Senna marilandica, Cassia marilandica\nn12500518\tsicklepod, Senna obtusifolia, Cassia tora\nn12500751\tcoffee senna, mogdad coffee, styptic weed, stinking weed, Senna occidentalis, Cassia occidentalis\nn12501202\ttamarind, tamarind tree, tamarindo, Tamarindus 
indica\nn12504570\tfalse indigo, bastard indigo, Amorpha californica\nn12504783\tfalse indigo, bastard indigo, Amorpha fruticosa\nn12505253\thog peanut, wild peanut, Amphicarpaea bracteata, Amphicarpa bracteata\nn12506181\tangelim, andelmin\nn12506341\tcabbage bark, cabbage-bark tree, cabbage tree, Andira inermis\nn12506991\tkidney vetch, Anthyllis vulneraria\nn12507379\tgroundnut, groundnut vine, Indian potato, potato bean, wild bean, Apios americana, Apios tuberosa\nn12507823\trooibos, Aspalathus linearis, Aspalathus cedcarbergensis\nn12508309\tmilk vetch, milk-vetch\nn12508618\talpine milk vetch, Astragalus alpinus\nn12508762\tpurple milk vetch, Astragalus danicus\nn12509109\tcamwood, African sandalwood, Baphia nitida\nn12509476\twild indigo, false indigo\nn12509665\tblue false indigo, Baptisia australis\nn12509821\twhite false indigo, Baptisia lactea\nn12509993\tindigo broom, horsefly weed, rattle weed, Baptisia tinctoria\nn12510343\tdhak, dak, palas, Butea frondosa, Butea monosperma\nn12510774\tpigeon pea, pigeon-pea plant, cajan pea, catjang pea, red gram, dhal, dahl, Cajanus cajan\nn12511488\tsword bean, Canavalia gladiata\nn12511856\tpea tree, caragana\nn12512095\tSiberian pea tree, Caragana arborescens\nn12512294\tChinese pea tree, Caragana sinica\nn12512674\tMoreton Bay chestnut, Australian chestnut\nn12513172\tbutterfly pea, Centrosema virginianum\nn12513613\tJudas tree, love tree, Circis siliquastrum\nn12513933\tredbud, Cercis canadensis\nn12514138\twestern redbud, California redbud, Cercis occidentalis\nn12514592\ttagasaste, Chamaecytisus palmensis, Cytesis proliferus\nn12514992\tweeping tree broom\nn12515393\tflame pea\nn12515711\tchickpea, chickpea plant, Egyptian pea, Cicer arietinum\nn12515925\tchickpea, garbanzo\nn12516165\tKentucky yellowwood, gopherwood, Cladrastis lutea, Cladrastis kentukea\nn12516584\tglory pea, clianthus\nn12516828\tdesert pea, Sturt pea, Sturt's desert pea, Clianthus formosus, Clianthus speciosus\nn12517077\tparrot's beak, 
parrot's bill, Clianthus puniceus\nn12517445\tbutterfly pea, Clitoria mariana\nn12517642\tblue pea, butterfly pea, Clitoria turnatea\nn12518013\ttelegraph plant, semaphore plant, Codariocalyx motorius, Desmodium motorium, Desmodium gyrans\nn12518481\tbladder senna, Colutea arborescens\nn12519089\taxseed, crown vetch, Coronilla varia\nn12519563\tcrotalaria, rattlebox\nn12520406\tguar, cluster bean, Cyamopsis tetragonolobus, Cyamopsis psoraloides\nn12521186\twhite broom, white Spanish broom, Cytisus albus, Cytisus multiflorus\nn12521394\tcommon broom, Scotch broom, green broom, Cytisus scoparius\nn12522188\trosewood, rosewood tree\nn12522678\tIndian blackwood, East Indian rosewood, East India rosewood, Indian rosewood, Dalbergia latifolia\nn12522894\tsissoo, sissu, sisham, Dalbergia sissoo\nn12523141\tkingwood, kingwood tree, Dalbergia cearensis\nn12523475\tBrazilian rosewood, caviuna wood, jacaranda, Dalbergia nigra\nn12523850\tcocobolo, Dalbergia retusa\nn12524188\tblackwood, blackwood tree\nn12525168\tbitter pea\nn12525513\tderris\nn12525753\tderris root, tuba root, Derris elliptica\nn12526178\tprairie mimosa, prickle-weed, Desmanthus ilinoensis\nn12526516\ttick trefoil, beggar lice, beggar's lice\nn12526754\tbeggarweed, Desmodium tortuosum, Desmodium purpureum\nn12527081\tAustralian pea, Dipogon lignosus, Dolichos lignosus\nn12527738\tcoral tree, erythrina\nn12528109\tkaffir boom, Cape kafferboom, Erythrina caffra\nn12528382\tcoral bean tree, Erythrina corallodendrum\nn12528549\tceibo, crybaby tree, cry-baby tree, common coral tree, Erythrina crista-galli\nn12528768\tkaffir boom, Transvaal kafferboom, Erythrina lysistemon\nn12528974\tIndian coral tree, Erythrina variegata, Erythrina Indica\nn12529220\tcork tree, Erythrina vespertilio\nn12529500\tgoat's rue, goat rue, Galega officinalis\nn12529905\tpoison bush, poison pea, gastrolobium\nn12530629\tSpanish broom, Spanish gorse, Genista hispanica\nn12530818\twoodwaxen, dyer's greenweed, dyer's-broom, dyeweed, 
greenweed, whin, woadwaxen, Genista tinctoria\nn12531328\tchanar, chanal, Geoffroea decorticans\nn12531727\tgliricidia\nn12532564\tsoy, soybean, soya bean\nn12532886\tlicorice, liquorice, Glycyrrhiza glabra\nn12533190\twild licorice, wild liquorice, American licorice, American liquorice, Glycyrrhiza lepidota\nn12533437\tlicorice root\nn12534208\tWestern Australia coral pea, Hardenbergia comnptoniana\nn12534625\tsweet vetch, Hedysarum boreale\nn12534862\tFrench honeysuckle, sulla, Hedysarum coronarium\nn12536291\tanil, Indigofera suffruticosa, Indigofera anil\nn12537253\tscarlet runner, running postman, Kennedia prostrata\nn12537569\thyacinth bean, bonavist, Indian bean, Egyptian bean, Lablab purpureus, Dolichos lablab\nn12538209\tScotch laburnum, Alpine golden chain, Laburnum alpinum\nn12539074\tvetchling\nn12539306\twild pea\nn12539832\teverlasting pea\nn12540250\tbeach pea, sea pea, Lathyrus maritimus, Lathyrus japonicus\nn12540647\tgrass vetch, grass vetchling, Lathyrus nissolia\nn12540966\tmarsh pea, Lathyrus palustris\nn12541157\tcommon vetchling, meadow pea, yellow vetchling, Lathyrus pratensis\nn12541403\tgrass pea, Indian pea, khesari, Lathyrus sativus\nn12542043\tTangier pea, Tangier peavine, Lalthyrus tingitanus\nn12542240\theath pea, earth-nut pea, earthnut pea, tuberous vetch, Lathyrus tuberosus\nn12543186\tbicolor lespediza, ezo-yama-hagi, Lespedeza bicolor\nn12543455\tjapanese clover, japan clover, jap clover, Lespedeza striata\nn12543639\tKorean lespedeza, Lespedeza stipulacea\nn12543826\tsericea lespedeza, Lespedeza sericea, Lespedeza cuneata\nn12544240\tlentil, lentil plant, Lens culinaris\nn12544539\tlentil\nn12545232\tprairie bird's-foot trefoil, compass plant, prairie lotus, prairie trefoil, Lotus americanus\nn12545635\tbird's foot trefoil, bird's foot clover, babies' slippers, bacon and eggs, Lotus corniculatus\nn12545865\twinged pea, asparagus pea, Lotus tetragonolobus\nn12546183\tlupine, lupin\nn12546420\twhite lupine, field lupine, wolf 
bean, Egyptian lupine, Lupinus albus\nn12546617\ttree lupine, Lupinus arboreus\nn12546962\twild lupine, sundial lupine, Indian beet, old-maid's bonnet, Lupinus perennis\nn12547215\tbluebonnet, buffalo clover, Texas bluebonnet, Lupinus subcarnosus\nn12547503\tTexas bluebonnet, Lupinus texensis\nn12548280\tmedic, medick, trefoil\nn12548564\tmoon trefoil, Medicago arborea\nn12548804\tsickle alfalfa, sickle lucerne, sickle medick, Medicago falcata\nn12549005\tCalvary clover, Medicago intertexta, Medicago echinus\nn12549192\tblack medick, hop clover, yellow trefoil, nonesuch clover, Medicago lupulina\nn12549420\talfalfa, lucerne, Medicago sativa\nn12549799\tmillettia\nn12550210\tmucuna\nn12550408\tcowage, velvet bean, Bengal bean, Benghal bean, Florida bean, Mucuna pruriens utilis, Mucuna deeringiana, Mucuna aterrima, Stizolobium deeringiana\nn12551173\ttolu tree, tolu balsam tree, Myroxylon balsamum, Myroxylon toluiferum\nn12551457\tPeruvian balsam, Myroxylon pereirae, Myroxylon balsamum pereirae\nn12552309\tsainfoin, sanfoin, holy clover, esparcet, Onobrychis viciifolia, Onobrychis viciaefolia\nn12552893\trestharrow, rest-harrow, Ononis repens\nn12553742\tbead tree, jumby bean, jumby tree, Ormosia monosperma\nn12554029\tjumby bead, jumbie bead, Ormosia coarctata\nn12554526\tlocoweed, crazyweed, crazy weed\nn12554729\tpurple locoweed, purple loco, Oxytropis lambertii\nn12554911\ttumbleweed\nn12555255\tyam bean, Pachyrhizus erosus\nn12555859\tshamrock pea, Parochetus communis\nn12556656\tpole bean\nn12557064\tkidney bean, frijol, frijole\nn12557438\tharicot\nn12557556\twax bean\nn12557681\tscarlet runner, scarlet runner bean, Dutch case-knife bean, runner bean, Phaseolus coccineus, Phaseolus multiflorus\nn12558230\tlima bean, lima bean plant, Phaseolus limensis\nn12558425\tsieva bean, butter bean, butter-bean plant, lima bean, Phaseolus lunatus\nn12558680\ttepary bean, Phaseolus acutifolius latifolius\nn12559044\tchaparral pea, stingaree-bush, Pickeringia 
montana\nn12559518\tJamaica dogwood, fish fuddle, Piscidia piscipula, Piscidia erythrina\nn12560282\tpea\nn12560621\tgarden pea\nn12560775\tedible-pod pea, edible-podded pea, Pisum sativum macrocarpon\nn12561169\tsugar snap pea, snap pea\nn12561309\tfield pea, field-pea plant, Austrian winter pea, Pisum sativum arvense, Pisum arvense\nn12561594\tfield pea\nn12562141\tcommon flat pea, native holly, Playlobium obtusangulum\nn12562577\tquira\nn12562785\troble, Platymiscium trinitatis\nn12563045\tPanama redwood tree, Panama redwood, Platymiscium pinnatum\nn12563702\tIndian beech, Pongamia glabra\nn12564083\twinged bean, winged pea, goa bean, goa bean vine, Manila bean, Psophocarpus tetragonolobus\nn12564613\tbreadroot, Indian breadroot, pomme blanche, pomme de prairie, Psoralea esculenta\nn12565102\tbloodwood tree, kiaat, Pterocarpus angolensis\nn12565912\tkino, Pterocarpus marsupium\nn12566331\tred sandalwood, red sanders, red sanderswood, red saunders, Pterocarpus santalinus\nn12566954\tkudzu, kudzu vine, Pueraria lobata\nn12567950\tbristly locust, rose acacia, moss locust, Robinia hispida\nn12568186\tblack locust, yellow locust, Robinia pseudoacacia\nn12568649\tclammy locust, Robinia viscosa\nn12569037\tcarib wood, Sabinea carinalis\nn12569616\tColorado River hemp, Sesbania exaltata\nn12569851\tscarlet wisteria tree, vegetable hummingbird, Sesbania grandiflora\nn12570394\tJapanese pagoda tree, Chinese scholartree, Chinese scholar tree, Sophora japonica, Sophora sinensis\nn12570703\tmescal bean, coral bean, frijolito, frijolillo, Sophora secundiflora\nn12570972\tkowhai, Sophora tetraptera\nn12571781\tjade vine, emerald creeper, Strongylodon macrobotrys\nn12572546\thoary pea\nn12572759\tbastard indigo, Tephrosia purpurea\nn12572858\tcatgut, goat's rue, wild sweet pea, Tephrosia virginiana\nn12573256\tbush pea\nn12573474\tfalse lupine, golden pea, yellow pea, Thermopsis macrophylla\nn12573647\tCarolina lupine, Thermopsis villosa\nn12573911\ttipu, tipu tree, yellow 
jacaranda, pride of Bolivia\nn12574320\tbird's foot trefoil, Trigonella ornithopodioides\nn12574470\tfenugreek, Greek clover, Trigonella foenumgraecum\nn12574866\tgorse, furze, whin, Irish gorse, Ulex europaeus\nn12575322\tvetch\nn12575812\ttufted vetch, bird vetch, Calnada pea, Vicia cracca\nn12576323\tbroad bean, fava bean, horsebean\nn12576451\tbitter betch, Vicia orobus\nn12576695\tbush vetch, Vicia sepium\nn12577362\tmoth bean, Vigna aconitifolia, Phaseolus aconitifolius\nn12577895\tsnailflower, snail-flower, snail flower, snail bean, corkscrew flower, Vigna caracalla, Phaseolus caracalla\nn12578255\tmung, mung bean, green gram, golden gram, Vigna radiata, Phaseolus aureus\nn12578626\tcowpea, cowpea plant, black-eyed pea, Vigna unguiculata, Vigna sinensis\nn12578916\tcowpea, black-eyed pea\nn12579038\tasparagus bean, yard-long bean, Vigna unguiculata sesquipedalis, Vigna sesquipedalis\nn12579404\tswamp oak, Viminaria juncea, Viminaria denudata\nn12579822\tkeurboom, Virgilia capensis, Virgilia oroboides\nn12580012\tkeurboom, Virgilia divaricata\nn12580654\tJapanese wistaria, Wisteria floribunda\nn12580786\tChinese wistaria, Wisteria chinensis\nn12580896\tAmerican wistaria, American wisteria, Wisteria frutescens\nn12581110\tsilky wisteria, Wisteria venusta\nn12582231\tpalm, palm tree\nn12582665\tsago palm\nn12582846\tfeather palm\nn12583126\tfan palm\nn12583401\tpalmetto\nn12583681\tcoyol, coyol palm, Acrocomia vinifera\nn12583855\tgrugru, gri-gri, grugru palm, macamba, Acrocomia aculeata\nn12584191\tareca\nn12584365\tbetel palm, Areca catechu\nn12584715\tsugar palm, gomuti, gomuti palm, Arenga pinnata\nn12585137\tpiassava palm, pissaba palm, Bahia piassava, bahia coquilla, Attalea funifera\nn12585373\tcoquilla nut\nn12585629\tpalmyra, palmyra palm, toddy palm, wine palm, lontar, longar palm, Borassus flabellifer\nn12586298\tcalamus\nn12586499\trattan, rattan palm, Calamus rotang\nn12586725\tlawyer cane, Calamus australis\nn12586989\tfishtail 
palm\nn12587132\twine palm, jaggery palm, kitul, kittul, kitul tree, toddy palm, Caryota urens\nn12587487\twax palm, Ceroxylon andicola, Ceroxylon alpinum\nn12587803\tcoconut, coconut palm, coco palm, coco, cocoa palm, coconut tree, Cocos nucifera\nn12588320\tcarnauba, carnauba palm, wax palm, Copernicia prunifera, Copernicia cerifera\nn12588780\tcaranday, caranda, caranda palm, wax palm, Copernicia australis, Copernicia alba\nn12589142\tcorozo, corozo palm\nn12589458\tgebang palm, Corypha utan, Corypha gebanga\nn12589687\tlatanier, latanier palm\nn12589841\ttalipot, talipot palm, Corypha umbraculifera\nn12590232\toil palm\nn12590499\tAfrican oil palm, Elaeis guineensis\nn12590600\tAmerican oil palm, Elaeis oleifera\nn12590715\tpalm nut, palm kernel\nn12591017\tcabbage palm, Euterpe oleracea\nn12591351\tcabbage palm, cabbage tree, Livistona australis\nn12591702\ttrue sago palm, Metroxylon sagu\nn12592058\tnipa palm, Nipa fruticans\nn12592544\tbabassu, babassu palm, coco de macao, Orbignya phalerata, Orbignya spesiosa, Orbignya martiana\nn12592839\tbabassu nut\nn12593122\tcohune palm, Orbignya cohune, cohune\nn12593341\tcohune nut\nn12593994\tdate palm, Phoenix dactylifera\nn12594324\tivory palm, ivory-nut palm, ivory plant, Phytelephas macrocarpa\nn12594989\traffia palm, Raffia farinifera, Raffia ruffia\nn12595699\tbamboo palm, Raffia vinifera\nn12595964\tlady palm\nn12596148\tminiature fan palm, bamboo palm, fern rhapis, Rhapis excelsa\nn12596345\treed rhapis, slender lady palm, Rhapis humilis\nn12596709\troyal palm, Roystonea regia\nn12596849\tcabbage palm, Roystonea oleracea\nn12597134\tcabbage palmetto, cabbage palm, Sabal palmetto\nn12597466\tsaw palmetto, scrub palmetto, Serenoa repens\nn12597798\tthatch palm, thatch tree, silver thatch, broom palm, Thrinax parviflora\nn12598027\tkey palm, silvertop palmetto, silver thatch, Thrinax microcarpa, Thrinax morrisii, Thrinax keyensis\nn12599185\tEnglish plantain, narrow-leaved plantain, ribgrass, ribwort, 
ripple-grass, buckthorn, Plantago lanceolata\nn12599435\tbroad-leaved plantain, common plantain, white-man's foot, whiteman's foot, cart-track plant, Plantago major\nn12599661\thoary plantain, Plantago media\nn12599874\tfleawort, psyllium, Spanish psyllium, Plantago psyllium\nn12600095\trugel's plantain, broad-leaved plantain, Plantago rugelii\nn12600267\thoary plantain, Plantago virginica\nn12601494\tbuckwheat, Polygonum fagopyrum, Fagopyrum esculentum\nn12601805\tprince's-feather, princess feather, kiss-me-over-the-garden-gate, prince's-plume, Polygonum orientale\nn12602262\teriogonum\nn12602434\tumbrella plant, Eriogonum allenii\nn12602612\twild buckwheat, California buckwheat, Erigonum fasciculatum\nn12602980\trhubarb, rhubarb plant\nn12603273\tHimalayan rhubarb, Indian rhubarb, red-veined pie plant, Rheum australe, Rheum emodi\nn12603449\tpie plant, garden rhubarb, Rheum cultorum, Rheum rhabarbarum, Rheum rhaponticum\nn12603672\tChinese rhubarb, Rheum palmatum\nn12604228\tsour dock, garden sorrel, Rumex acetosa\nn12604460\tsheep sorrel, sheep's sorrel, Rumex acetosella\nn12604639\tbitter dock, broad-leaved dock, yellow dock, Rumex obtusifolius\nn12604845\tFrench sorrel, garden sorrel, Rumex scutatus\nn12605683\tyellow-eyed grass\nn12606438\tcommelina\nn12606545\tspiderwort, dayflower\nn12607456\tpineapple, pineapple plant, Ananas comosus\nn12609379\tpipewort, Eriocaulon aquaticum\nn12610328\twater hyacinth, water orchid, Eichhornia crassipes, Eichhornia spesiosa\nn12610740\twater star grass, mud plantain, Heteranthera dubia\nn12611640\tnaiad, water nymph\nn12612170\twater plantain, Alisma plantago-aquatica\nn12612811\tnarrow-leaved water plantain\nn12613706\thydrilla, Hydrilla verticillata\nn12614096\tAmerican frogbit, Limnodium spongia\nn12614477\twaterweed\nn12614625\tCanadian pondweed, Elodea canadensis\nn12615232\ttape grass, eelgrass, wild celery, Vallisneria spiralis\nn12615710\tpondweed\nn12616248\tcurled leaf pondweed, curly pondweed, Potamogeton 
crispus\nn12616630\tloddon pondweed, Potamogeton nodosus, Potamogeton americanus\nn12616996\tfrog's lettuce\nn12617559\tarrow grass, Triglochin maritima\nn12618146\thorned pondweed, Zannichellia palustris\nn12618727\teelgrass, grass wrack, sea wrack, Zostera marina\nn12620196\trose, rosebush\nn12620546\thip, rose hip, rosehip\nn12620969\tbanksia rose, Rosa banksia\nn12621410\tdamask rose, summer damask rose, Rosa damascena\nn12621619\tsweetbrier, sweetbriar, brier, briar, eglantine, Rosa eglanteria\nn12621945\tCherokee rose, Rosa laevigata\nn12622297\tmusk rose, Rosa moschata\nn12622875\tagrimonia, agrimony\nn12623077\tharvest-lice, Agrimonia eupatoria\nn12623211\tfragrant agrimony, Agrimonia procera\nn12623818\talderleaf Juneberry, alder-leaved serviceberry, Amelanchier alnifolia\nn12624381\tflowering quince\nn12624568\tjaponica, maule's quince, Chaenomeles japonica\nn12625003\tcoco plum, coco plum tree, cocoa plum, icaco, Chrysobalanus icaco\nn12625383\tcotoneaster\nn12625670\tCotoneaster dammeri\nn12625823\tCotoneaster horizontalis\nn12626674\tparsley haw, parsley-leaved thorn, Crataegus apiifolia, Crataegus marshallii\nn12626878\tscarlet haw, Crataegus biltmoreana\nn12627119\tblackthorn, pear haw, pear hawthorn, Crataegus calpodendron, Crataegus tomentosa\nn12627347\tcockspur thorn, cockspur hawthorn, Crataegus crus-galli\nn12627526\tmayhaw, summer haw, Crataegus aestivalis\nn12628356\tred haw, downy haw, Crataegus mollis, Crataegus coccinea mollis\nn12628705\tred haw, Crataegus pedicellata, Crataegus coccinea\nn12628986\tquince, quince bush, Cydonia oblonga\nn12629305\tmountain avens, Dryas octopetala\nn12629666\tloquat, loquat tree, Japanese medlar, Japanese plum, Eriobotrya japonica\nn12630763\tbeach strawberry, Chilean strawberry, Fragaria chiloensis\nn12630999\tVirginia strawberry, scarlet strawberry, Fragaria virginiana\nn12631331\tavens\nn12631637\tyellow avens, Geum alleppicum strictum, Geum strictum\nn12631932\tyellow avens, Geum 
macrophyllum\nn12632335\tprairie smoke, purple avens, Geum triflorum\nn12632733\tbennet, white avens, Geum virginianum\nn12633061\ttoyon, tollon, Christmasberry, Christmas berry, Heteromeles arbutifolia, Photinia arbutifolia\nn12633638\tapple tree\nn12633994\tapple, orchard apple tree, Malus pumila\nn12634211\twild apple, crab apple, crabapple\nn12634429\tcrab apple, crabapple, cultivated crab apple\nn12634734\tSiberian crab, Siberian crab apple, cherry apple, cherry crab, Malus baccata\nn12634986\twild crab, Malus sylvestris\nn12635151\tAmerican crab apple, garland crab, Malus coronaria\nn12635359\tOregon crab apple, Malus fusca\nn12635532\tSouthern crab apple, flowering crab, Malus angustifolia\nn12635744\tIowa crab, Iowa crab apple, prairie crab, western crab apple, Malus ioensis\nn12635955\tBechtel crab, flowering crab\nn12636224\tmedlar, medlar tree, Mespilus germanica\nn12636885\tcinquefoil, five-finger\nn12637123\tsilverweed, goose-tansy, goose grass, Potentilla anserina\nn12637485\tsalad burnet, burnet bloodwort, pimpernel, Poterium sanguisorba\nn12638218\tplum, plum tree\nn12638556\twild plum, wild plum tree\nn12638753\tAllegheny plum, Alleghany plum, sloe, Prunus alleghaniensis\nn12638964\tAmerican red plum, August plum, goose plum, Prunus americana\nn12639168\tchickasaw plum, hog plum, hog plum bush, Prunus angustifolia\nn12639376\tbeach plum, beach plum bush, Prunus maritima\nn12639584\tcommon plum, Prunus domestica\nn12639736\tbullace, Prunus insititia\nn12639910\tdamson plum, damson plum tree, Prunus domestica insititia\nn12640081\tbig-tree plum, Prunus mexicana\nn12640284\tCanada plum, Prunus nigra\nn12640435\tplumcot, plumcot tree\nn12640607\tapricot, apricot tree\nn12640839\tJapanese apricot, mei, Prunus mume\nn12641007\tcommon apricot, Prunus armeniaca\nn12641180\tpurple apricot, black apricot, Prunus dasycarpa\nn12641413\tcherry, cherry tree\nn12641931\twild cherry, wild cherry tree\nn12642090\twild cherry\nn12642200\tsweet cherry, Prunus 
avium\nn12642435\theart cherry, oxheart, oxheart cherry\nn12642600\tgean, mazzard, mazzard cherry\nn12642964\tcapulin, capulin tree, Prunus capuli\nn12643113\tcherry laurel, laurel cherry, mock orange, wild orange, Prunus caroliniana\nn12643313\tcherry plum, myrobalan, myrobalan plum, Prunus cerasifera\nn12643473\tsour cherry, sour cherry tree, Prunus cerasus\nn12643688\tamarelle, Prunus cerasus caproniana\nn12643877\tmorello, Prunus cerasus austera\nn12644283\tmarasca\nn12644902\talmond tree\nn12645174\talmond, sweet almond, Prunus dulcis, Prunus amygdalus, Amygdalus communis\nn12645530\tbitter almond, Prunus dulcis amara, Amygdalus communis amara\nn12646072\tjordan almond\nn12646197\tdwarf flowering almond, Prunus glandulosa\nn12646397\tholly-leaved cherry, holly-leaf cherry, evergreen cherry, islay, Prunus ilicifolia\nn12646605\tfuji, fuji cherry, Prunus incisa\nn12646740\tflowering almond, oriental bush cherry, Prunus japonica\nn12646950\tcherry laurel, laurel cherry, Prunus laurocerasus\nn12647231\tCatalina cherry, Prunus lyonii\nn12647376\tbird cherry, bird cherry tree\nn12647560\thagberry tree, European bird cherry, common bird cherry, Prunus padus\nn12647787\thagberry\nn12647893\tpin cherry, Prunus pensylvanica\nn12648045\tpeach, peach tree, Prunus persica\nn12648196\tnectarine, nectarine tree, Prunus persica nectarina\nn12648424\tsand cherry, Prunus pumila, Prunus pumilla susquehanae, Prunus susquehanae, Prunus cuneata\nn12648693\tJapanese plum, Prunus salicina\nn12648888\tblack cherry, black cherry tree, rum cherry, Prunus serotina\nn12649065\tflowering cherry\nn12649317\toriental cherry, Japanese cherry, Japanese flowering cherry, Prunus serrulata\nn12649539\tJapanese flowering cherry, Prunus sieboldii\nn12649866\tSierra plum, Pacific plum, Prunus subcordata\nn12650038\trosebud cherry, winter flowering cherry, Prunus subhirtella\nn12650229\tRussian almond, dwarf Russian almond, Prunus tenella\nn12650379\tflowering almond, Prunus 
triloba\nn12650556\tchokecherry, chokecherry tree, Prunus virginiana\nn12650805\tchokecherry\nn12650915\twestern chokecherry, Prunus virginiana demissa, Prunus demissa\nn12651229\tPyracantha, pyracanth, fire thorn, firethorn\nn12651611\tpear, pear tree, Pyrus communis\nn12651821\tfruit tree\nn12653218\tbramble bush\nn12653436\tlawyerbush, lawyer bush, bush lawyer, Rubus cissoides, Rubus australis\nn12653633\tstone bramble, Rubus saxatilis\nn12654227\tsand blackberry, Rubus cuneifolius\nn12654857\tboysenberry, boysenberry bush\nn12655062\tloganberry, Rubus loganobaccus, Rubus ursinus loganobaccus\nn12655245\tAmerican dewberry, Rubus canadensis\nn12655351\tNorthern dewberry, American dewberry, Rubus flagellaris\nn12655498\tSouthern dewberry, Rubus trivialis\nn12655605\tswamp dewberry, swamp blackberry, Rubus hispidus\nn12655726\tEuropean dewberry, Rubus caesius\nn12655869\traspberry, raspberry bush\nn12656369\twild raspberry, European raspberry, framboise, Rubus idaeus\nn12656528\tAmerican raspberry, Rubus strigosus, Rubus idaeus strigosus\nn12656685\tblack raspberry, blackcap, blackcap raspberry, thimbleberry, Rubus occidentalis\nn12656909\tsalmonberry, Rubus spectabilis\nn12657082\tsalmonberry, salmon berry, thimbleberry, Rubus parviflorus\nn12657755\twineberry, Rubus phoenicolasius\nn12658118\tmountain ash\nn12658308\trowan, rowan tree, European mountain ash, Sorbus aucuparia\nn12658481\trowanberry\nn12658603\tAmerican mountain ash, Sorbus americana\nn12658715\tWestern mountain ash, Sorbus sitchensis\nn12658846\tservice tree, sorb apple, sorb apple tree, Sorbus domestica\nn12659064\twild service tree, Sorbus torminalis\nn12659356\tspirea, spiraea\nn12659539\tbridal wreath, bridal-wreath, Saint Peter's wreath, St. 
Peter's wreath, Spiraea prunifolia\nn12660601\tmadderwort, rubiaceous plant\nn12661045\tIndian madder, munjeet, Rubia cordifolia\nn12661227\tmadder, Rubia tinctorum\nn12661538\twoodruff\nn12662074\tdagame, lemonwood tree, Calycophyllum candidissimum\nn12662379\tblolly, West Indian snowberry, Chiococca alba\nn12662772\tcoffee, coffee tree\nn12663023\tArabian coffee, Coffea arabica\nn12663254\tLiberian coffee, Coffea liberica\nn12663359\trobusta coffee, Rio Nunez coffee, Coffea robusta, Coffea canephora\nn12663804\tcinchona, chinchona\nn12664005\tCartagena bark, Cinchona cordifolia, Cinchona lancifolia\nn12664187\tcalisaya, Cinchona officinalis, Cinchona ledgeriana, Cinchona calisaya\nn12664469\tcinchona tree, Cinchona pubescens\nn12664710\tcinchona, cinchona bark, Peruvian bark, Jesuit's bark\nn12665048\tbedstraw\nn12665271\tsweet woodruff, waldmeister, woodruff, fragrant bedstraw, Galium odoratum, Asperula odorata\nn12665659\tNorthern bedstraw, Northern snow bedstraw, Galium boreale\nn12665857\tyellow bedstraw, yellow cleavers, Our Lady's bedstraw, Galium verum\nn12666050\twild licorice, Galium lanceolatum\nn12666159\tcleavers, clivers, goose grass, catchweed, spring cleavers, Galium aparine\nn12666369\twild madder, white madder, white bedstraw, infant's-breath, false baby's breath, Galium mollugo\nn12666965\tcape jasmine, cape jessamine, Gardenia jasminoides, Gardenia augusta\nn12667406\tgenipa\nn12667582\tgenipap fruit, jagua, marmalade box, Genipa Americana\nn12667964\thamelia\nn12668131\tscarlet bush, scarlet hamelia, coloradillo, Hamelia patens, Hamelia erecta\nn12669803\tlemonwood, lemon-wood, lemonwood tree, lemon-wood tree, Psychotria capensis\nn12670334\tnegro peach, Sarcocephalus latifolius, Sarcocephalus esculentus\nn12670758\twild medlar, wild medlar tree, medlar, Vangueria infausta\nn12670962\tSpanish tamarind, Vangueria madagascariensis\nn12671651\tabelia\nn12672289\tbush honeysuckle, Diervilla sessilifolia\nn12673588\tAmerican twinflower, Linnaea 
borealis americana\nn12674120\thoneysuckle\nn12674685\tAmerican fly honeysuckle, fly honeysuckle, Lonicera canadensis\nn12674895\tItalian honeysuckle, Italian woodbine, Lonicera caprifolium\nn12675299\tyellow honeysuckle, Lonicera flava\nn12675515\thairy honeysuckle, Lonicera hirsuta\nn12675876\tJapanese honeysuckle, Lonicera japonica\nn12676134\tHall's honeysuckle, Lonicera japonica halliana\nn12676370\tMorrow's honeysuckle, Lonicera morrowii\nn12676534\twoodbine, Lonicera periclymenum\nn12676703\ttrumpet honeysuckle, coral honeysuckle, trumpet flower, trumpet vine, Lonicera sempervirens\nn12677120\tEuropean fly honeysuckle, European honeysuckle, Lonicera xylosteum\nn12677331\tswamp fly honeysuckle\nn12677612\tsnowberry, common snowberry, waxberry, Symphoricarpos alba\nn12677841\tcoralberry, Indian currant, Symphoricarpos orbiculatus\nn12678794\tblue elder, blue elderberry, Sambucus caerulea\nn12679023\tdwarf elder, danewort, Sambucus ebulus\nn12679432\tAmerican red elder, red-berried elder, stinking elder, Sambucus pubens\nn12679593\tEuropean red elder, red-berried elder, Sambucus racemosa\nn12679876\tfeverroot, horse gentian, tinker's root, wild coffee, Triostium perfoliatum\nn12680402\tcranberry bush, cranberry tree, American cranberry bush, highbush cranberry, Viburnum trilobum\nn12680652\twayfaring tree, twist wood, twistwood, Viburnum lantana\nn12680864\tguelder rose, European cranberrybush, European cranberry bush, crampbark, cranberry tree, Viburnum opulus\nn12681376\tarrow wood, Viburnum recognitum\nn12681579\tblack haw, Viburnum prunifolium\nn12681893\tweigela, Weigela florida\nn12682411\tteasel, teazel, teasle\nn12682668\tcommon teasel, Dipsacus fullonum\nn12682882\tfuller's teasel, Dipsacus sativus\nn12683096\twild teasel, Dipsacus sylvestris\nn12683407\tscabious, scabiosa\nn12683571\tsweet scabious, pincushion flower, mournful widow, Scabiosa atropurpurea\nn12683791\tfield scabious, Scabiosa arvensis\nn12684379\tjewelweed, lady's earrings, orange 
balsam, celandine, touch-me-not, Impatiens capensis\nn12685431\tgeranium\nn12685831\tcranesbill, crane's bill\nn12686077\twild geranium, spotted cranesbill, Geranium maculatum\nn12686274\tmeadow cranesbill, Geranium pratense\nn12686496\tRichardson's geranium, Geranium richardsonii\nn12686676\therb robert, herbs robert, herb roberts, Geranium robertianum\nn12686877\tsticky geranium, Geranium viscosissimum\nn12687044\tdove's foot geranium, Geranium molle\nn12687462\trose geranium, sweet-scented geranium, Pelargonium graveolens\nn12687698\tfish geranium, bedding geranium, zonal pelargonium, Pelargonium hortorum\nn12687957\tivy geranium, ivy-leaved geranium, hanging geranium, Pelargonium peltatum\nn12688187\tapple geranium, nutmeg geranium, Pelargonium odoratissimum\nn12688372\tlemon geranium, Pelargonium limoneum\nn12688716\tstorksbill, heron's bill\nn12689305\tmusk clover, muskus grass, white-stemmed filaree, Erodium moschatum\nn12690653\tincense tree\nn12691428\telephant tree, Bursera microphylla\nn12691661\tgumbo-limbo, Bursera simaruba\nn12692024\tBoswellia carteri\nn12692160\tsalai, Boswellia serrata\nn12692521\tbalm of gilead, Commiphora meccanensis\nn12692714\tmyrrh tree, Commiphora myrrha\nn12693244\tProtium heptaphyllum\nn12693352\tProtium guianense\nn12693865\twater starwort\nn12694486\tbarbados cherry, acerola, Surinam cherry, West Indian cherry, Malpighia glabra\nn12695144\tmahogany, mahogany tree\nn12695975\tchinaberry, chinaberry tree, China tree, Persian lilac, pride-of-India, azederach, azedarach, Melia azederach, Melia azedarach\nn12696492\tneem, neem tree, nim tree, margosa, arishth, Azadirachta indica, Melia Azadirachta\nn12696830\tneem seed\nn12697152\tSpanish cedar, Spanish cedar tree, Cedrela odorata\nn12697514\tsatinwood, satinwood tree, Chloroxylon swietenia\nn12698027\tAfrican scented mahogany, cedar mahogany, sapele mahogany, Entandrophragma cylindricum\nn12698435\tsilver ash\nn12698598\tnative beech, flindosa, flindosy, Flindersia 
australis\nn12698774\tbunji-bunji, Flindersia schottiana\nn12699031\tAfrican mahogany\nn12699301\tlanseh tree, langsat, langset, Lansium domesticum\nn12699922\ttrue mahogany, Cuban mahogany, Dominican mahogany, Swietinia mahogani\nn12700088\tHonduras mahogany, Swietinia macrophylla\nn12700357\tPhilippine mahogany, Philippine cedar, kalantas, Toona calantas, Cedrela calantas\nn12702124\tcaracolito, Ruptiliocarpon caracolito\nn12703190\tcommon wood sorrel, cuckoo bread, shamrock, Oxalis acetosella\nn12703383\tBermuda buttercup, English-weed, Oxalis pes-caprae, Oxalis cernua\nn12703557\tcreeping oxalis, creeping wood sorrel, Oxalis corniculata\nn12703716\tgoatsfoot, goat's foot, Oxalis caprina\nn12703856\tviolet wood sorrel, Oxalis violacea\nn12704041\toca, oka, Oxalis tuberosa, Oxalis crenata\nn12704343\tcarambola, carambola tree, Averrhoa carambola\nn12704513\tbilimbi, Averrhoa bilimbi\nn12705013\tmilkwort\nn12705220\tsenega, Polygala alba\nn12705458\torange milkwort, yellow milkwort, candyweed, yellow bachelor's button, Polygala lutea\nn12705698\tflowering wintergreen, gaywings, bird-on-the-wing, fringed polygala, Polygala paucifolia\nn12705978\tSeneca snakeroot, Seneka snakeroot, senga root, senega root, senega snakeroot, Polygala senega\nn12706410\tcommon milkwort, gand flower, Polygala vulgaris\nn12707199\true, herb of grace, Ruta graveolens\nn12707781\tcitrus, citrus tree\nn12708293\torange, orange tree\nn12708654\tsour orange, Seville orange, bitter orange, bitter orange tree, bigarade, marmalade orange, Citrus aurantium\nn12708941\tbergamot, bergamot orange, Citrus bergamia\nn12709103\tpomelo, pomelo tree, pummelo, shaddock, Citrus maxima, Citrus grandis, Citrus decumana\nn12709349\tcitron, citron tree, Citrus medica\nn12709688\tgrapefruit, Citrus paradisi\nn12709901\tmandarin, mandarin orange, mandarin orange tree, Citrus reticulata\nn12710295\ttangerine, tangerine tree\nn12710415\tclementine, clementine tree\nn12710577\tsatsuma, satsuma 
tree\nn12710693\tsweet orange, sweet orange tree, Citrus sinensis\nn12710917\ttemple orange, temple orange tree, tangor, king orange, Citrus nobilis\nn12711182\ttangelo, tangelo tree, ugli fruit, Citrus tangelo\nn12711398\trangpur, rangpur lime, lemanderin, Citrus limonia\nn12711596\tlemon, lemon tree, Citrus limon\nn12711817\tsweet lemon, sweet lime, Citrus limetta\nn12711984\tlime, lime tree, Citrus aurantifolia\nn12712320\tcitrange, citrange tree, Citroncirus webberi\nn12712626\tfraxinella, dittany, burning bush, gas plant, Dictamnus alba\nn12713063\tkumquat, cumquat, kumquat tree\nn12713358\tmarumi, marumi kumquat, round kumquat, Fortunella japonica\nn12713521\tnagami, nagami kumquat, oval kumquat, Fortunella margarita\nn12713866\tcork tree, Phellodendron amurense\nn12714254\ttrifoliate orange, trifoliata, wild orange, Poncirus trifoliata\nn12714755\tprickly ash\nn12714949\ttoothache tree, sea ash, Zanthoxylum americanum, Zanthoxylum fraxineum\nn12715195\tHercules'-club, Hercules'-clubs, Hercules-club, Zanthoxylum clava-herculis\nn12715914\tbitterwood tree\nn12716400\tmarupa, Simarouba amara\nn12716594\tparadise tree, bitterwood, Simarouba glauca\nn12717072\tailanthus\nn12717224\ttree of heaven, tree of the gods, Ailanthus altissima\nn12717644\twild mango, dika, wild mango tree, Irvingia gabonensis\nn12718074\tpepper tree, Kirkia wilmsii\nn12718483\tJamaica quassia, bitterwood, Picrasma excelsa, Picrasma excelsum\nn12718995\tquassia, bitterwood, Quassia amara\nn12719684\tnasturtium\nn12719944\tgarden nasturtium, Indian cress, Tropaeolum majus\nn12720200\tbush nasturtium, Tropaeolum minus\nn12720354\tcanarybird flower, canarybird vine, canary creeper, Tropaeolum peregrinum\nn12721122\tbean caper, Syrian bean caper, Zygophyllum fabago\nn12721477\tpalo santo, Bulnesia sarmienti\nn12722071\tlignum vitae, Guaiacum officinale\nn12723062\tcreosote bush, coville, hediondilla, Larrea tridentata\nn12723610\tcaltrop, devil's weed, Tribulus terestris\nn12724942\twillow, 
willow tree\nn12725521\tosier\nn12725738\twhite willow, Huntingdon willow, Salix alba\nn12725940\tsilver willow, silky willow, Salix alba sericea, Salix sericea\nn12726159\tgolden willow, Salix alba vitellina, Salix vitellina\nn12726357\tcricket-bat willow, Salix alba caerulea\nn12726528\tarctic willow, Salix arctica\nn12726670\tweeping willow, Babylonian weeping willow, Salix babylonica\nn12726902\tWisconsin weeping willow, Salix pendulina, Salix blanda, Salix pendulina blanda\nn12727101\tpussy willow, Salix discolor\nn12727301\tsallow\nn12727518\tgoat willow, florist's willow, pussy willow, Salix caprea\nn12727729\tpeachleaf willow, peach-leaved willow, almond-leaves willow, Salix amygdaloides\nn12727960\talmond willow, black Hollander, Salix triandra, Salix amygdalina\nn12728164\thoary willow, sage willow, Salix candida\nn12728322\tcrack willow, brittle willow, snap willow, Salix fragilis\nn12728508\tprairie willow, Salix humilis\nn12728656\tdwarf willow, Salix herbacea\nn12728864\tgrey willow, gray willow, Salix cinerea\nn12729023\tarroyo willow, Salix lasiolepis\nn12729164\tshining willow, Salix lucida\nn12729315\tswamp willow, black willow, Salix nigra\nn12729521\tbay willow, laurel willow, Salix pentandra\nn12729729\tpurple willow, red willow, red osier, basket willow, purple osier, Salix purpurea\nn12729950\tbalsam willow, Salix pyrifolia\nn12730143\tcreeping willow, Salix repens\nn12730370\tSitka willow, silky willow, Salix sitchensis\nn12730544\tdwarf grey willow, dwarf gray willow, sage willow, Salix tristis\nn12730776\tbearberry willow, Salix uva-ursi\nn12731029\tcommon osier, hemp willow, velvet osier, Salix viminalis\nn12731401\tpoplar, poplar tree\nn12731835\tbalsam poplar, hackmatack, tacamahac, Populus balsamifera\nn12732009\twhite poplar, white aspen, abele, aspen poplar, silver-leaved poplar, Populus alba\nn12732252\tgrey poplar, gray poplar, Populus canescens\nn12732491\tblack poplar, Populus nigra\nn12732605\tLombardy poplar, Populus nigra 
italica\nn12732756\tcottonwood\nn12732966\tEastern cottonwood, necklace poplar, Populus deltoides\nn12733218\tblack cottonwood, Western balsam poplar, Populus trichocarpa\nn12733428\tswamp cottonwood, black cottonwood, downy poplar, swamp poplar, Populus heterophylla\nn12733647\taspen\nn12733870\tquaking aspen, European quaking aspen, Populus tremula\nn12734070\tAmerican quaking aspen, American aspen, Populus tremuloides\nn12734215\tCanadian aspen, bigtooth aspen, bigtoothed aspen, big-toothed aspen, large-toothed aspen, large tooth aspen, Populus grandidentata\nn12735160\tsandalwood tree, true sandalwood, Santalum album\nn12736603\tquandong, quandang, quandong tree, Eucarya acuminata, Fusanus acuminatus\nn12736999\trabbitwood, buffalo nut, Pyrularia pubera\nn12737383\tLoranthaceae, family Loranthaceae, mistletoe family\nn12737898\tmistletoe, Loranthus europaeus\nn12738259\tAmerican mistletoe, Arceuthobium pusillum\nn12739332\tmistletoe, Viscum album, Old World mistletoe\nn12739966\tAmerican mistletoe, Phoradendron serotinum, Phoradendron flavescens\nn12740967\taalii\nn12741222\tsoapberry, soapberry tree\nn12741586\twild China tree, Sapindus drumondii, Sapindus marginatus\nn12741792\tChina tree, false dogwood, jaboncillo, chinaberry, Sapindus saponaria\nn12742290\takee, akee tree, Blighia sapida\nn12742741\tsoapberry vine\nn12742878\theartseed, Cardiospermum grandiflorum\nn12743009\tballoon vine, heart pea, Cardiospermum halicacabum\nn12743352\tlongan, lungen, longanberry, Dimocarpus longan, Euphorbia litchi, Nephelium longana\nn12743823\tharpullia\nn12743976\tharpulla, Harpullia cupanioides\nn12744142\tMoreton Bay tulipwood, Harpullia pendula\nn12744387\tlitchi, lichee, litchi tree, Litchi chinensis, Nephelium litchi\nn12744850\tSpanish lime, Spanish lime tree, honey berry, mamoncillo, genip, ginep, Melicocca bijuga, Melicocca bijugatus\nn12745386\trambutan, rambotan, rambutan tree, Nephelium lappaceum\nn12745564\tpulasan, pulassan, pulasan tree, Nephelium 
mutabile\nn12746884\tpachysandra\nn12747120\tAllegheny spurge, Allegheny mountain spurge, Pachysandra procumbens\nn12748248\tbittersweet, American bittersweet, climbing bittersweet, false bittersweet, staff vine, waxwork, shrubby bittersweet, Celastrus scandens\nn12749049\tspindle tree, spindleberry, spindleberry tree\nn12749456\twinged spindle tree, Euonymous alatus\nn12749679\twahoo, burning bush, Euonymus atropurpureus\nn12749852\tstrawberry bush, wahoo, Euonymus americanus\nn12750076\tevergreen bittersweet, Euonymus fortunei radicans, Euonymus radicans vegetus\nn12750767\tcyrilla, leatherwood, white titi, Cyrilla racemiflora\nn12751172\ttiti, buckwheat tree, Cliftonia monophylla\nn12751675\tcrowberry\nn12752205\tmaple\nn12753007\tsilver maple, Acer saccharinum\nn12753245\tsugar maple, rock maple, Acer saccharum\nn12753573\tred maple, scarlet maple, swamp maple, Acer rubrum\nn12753762\tmoosewood, moose-wood, striped maple, striped dogwood, goosefoot maple, Acer pennsylvanicum\nn12754003\tOregon maple, big-leaf maple, Acer macrophyllum\nn12754174\tdwarf maple, Rocky-mountain maple, Acer glabrum\nn12754311\tmountain maple, mountain alder, Acer spicatum\nn12754468\tvine maple, Acer circinatum\nn12754648\thedge maple, field maple, Acer campestre\nn12754781\tNorway maple, Acer platanoides\nn12754981\tsycamore, great maple, scottish maple, Acer pseudoplatanus\nn12755225\tbox elder, ash-leaved maple, Acer negundo\nn12755387\tCalifornia box elder, Acer negundo Californicum\nn12755559\tpointed-leaf maple, Acer argutum\nn12755727\tJapanese maple, full moon maple, Acer japonicum\nn12755876\tJapanese maple, Acer palmatum\nn12756457\tholly\nn12757115\tChinese holly, Ilex cornuta\nn12757303\tbearberry, possum haw, winterberry, Ilex decidua\nn12757458\tinkberry, gallberry, gall-berry, evergreen winterberry, Ilex glabra\nn12757668\tmate, Paraguay tea, Ilex paraguariensis\nn12757816\tAmerican holly, Christmas holly\nn12757930\tlow gallberry holly\nn12758014\ttall gallberry 
holly\nn12758099\tyaupon holly\nn12758176\tdeciduous holly\nn12758250\tjuneberry holly\nn12758325\tlargeleaf holly\nn12758399\tGeogia holly\nn12758471\tcommon winterberry holly\nn12758555\tsmooth winterberry holly\nn12759273\tcashew, cashew tree, Anacardium occidentale\nn12759668\tgoncalo alves, Astronium fraxinifolium\nn12760539\tVenetian sumac, wig tree, Cotinus coggygria\nn12760875\tlaurel sumac, Malosma laurina, Rhus laurina\nn12761284\tmango, mango tree, Mangifera indica\nn12761702\tpistachio, Pistacia vera, pistachio tree\nn12761905\tterebinth, Pistacia terebinthus\nn12762049\tmastic, mastic tree, lentisk, Pistacia lentiscus\nn12762405\tAustralian sumac, Rhodosphaera rhodanthema, Rhus rhodanthema\nn12762896\tsumac, sumach, shumac\nn12763529\tsmooth sumac, scarlet sumac, vinegar tree, Rhus glabra\nn12764008\tsugar-bush, sugar sumac, Rhus ovata\nn12764202\tstaghorn sumac, velvet sumac, Virginian sumac, vinegar tree, Rhus typhina\nn12764507\tsquawbush, squaw-bush, skunkbush, Rhus trilobata\nn12764978\taroeira blanca, Schinus chichita\nn12765115\tpepper tree, molle, Peruvian mastic tree, Schinus molle\nn12765402\tBrazilian pepper tree, Schinus terebinthifolius\nn12765846\thog plum, yellow mombin, yellow mombin tree, Spondias mombin\nn12766043\tmombin, mombin tree, jocote, Spondias purpurea\nn12766595\tpoison ash, poison dogwood, poison sumac, Toxicodendron vernix, Rhus vernix\nn12766869\tpoison ivy, markweed, poison mercury, poison oak, Toxicodendron radicans, Rhus radicans\nn12767208\twestern poison oak, Toxicodendron diversilobum, Rhus diversiloba\nn12767423\teastern poison oak, Toxicodendron quercifolium, Rhus quercifolia, Rhus toxicodenedron\nn12767648\tvarnish tree, lacquer tree, Chinese lacquer tree, Japanese lacquer tree, Japanese varnish tree, Japanese sumac, Toxicodendron vernicifluum, Rhus verniciflua\nn12768369\thorse chestnut, buckeye, Aesculus hippocastanum\nn12768682\tbuckeye, horse chestnut, conker\nn12768809\tsweet buckeye\nn12768933\tOhio 
buckeye\nn12769065\tdwarf buckeye, bottlebrush buckeye\nn12769219\tred buckeye\nn12769318\tparticolored buckeye\nn12770529\tebony, ebony tree, Diospyros ebenum\nn12770892\tmarblewood, marble-wood, Andaman marble, Diospyros kurzii\nn12771085\tmarblewood, marble-wood\nn12771192\tpersimmon, persimmon tree\nn12771390\tJapanese persimmon, kaki, Diospyros kaki\nn12771597\tAmerican persimmon, possumwood, Diospyros virginiana\nn12771890\tdate plum, Diospyros lotus\nn12772753\tbuckthorn\nn12772908\tsouthern buckthorn, shittimwood, shittim, mock orange, Bumelia lycioides\nn12773142\tfalse buckthorn, chittamwood, chittimwood, shittimwood, black haw, Bumelia lanuginosa\nn12773651\tstar apple, caimito, Chrysophyllum cainito\nn12773917\tsatinleaf, satin leaf, caimitillo, damson plum, Chrysophyllum oliviforme\nn12774299\tbalata, balata tree, beefwood, bully tree, Manilkara bidentata\nn12774641\tsapodilla, sapodilla tree, Manilkara zapota, Achras zapota\nn12775070\tgutta-percha tree, Palaquium gutta\nn12775393\tgutta-percha tree\nn12775717\tcanistel, canistel tree, Pouteria campechiana nervosa\nn12775919\tmarmalade tree, mammee, sapote, Pouteria zapota, Calocarpum zapota\nn12776558\tsweetleaf, Symplocus tinctoria\nn12776774\tAsiatic sweetleaf, sapphire berry, Symplocus paniculata\nn12777436\tstyrax\nn12777680\tsnowbell, Styrax obassia\nn12777778\tJapanese snowbell, Styrax japonicum\nn12777892\tTexas snowbell, Texas snowbells, Styrax texana\nn12778398\tsilver-bell tree, silverbell tree, snowdrop tree, opossum wood, Halesia carolina, Halesia tetraptera\nn12778605\tcarnivorous plant\nn12779603\tpitcher plant\nn12779851\tcommon pitcher plant, huntsman's cup, huntsman's cups, Sarracenia purpurea\nn12780325\thooded pitcher plant, Sarracenia minor\nn12780563\thuntsman's horn, huntsman's horns, yellow trumpet, yellow pitcher plant, trumpets, Sarracenia flava\nn12781940\ttropical pitcher plant\nn12782530\tsundew, sundew plant, daily dew\nn12782915\tVenus's flytrap, Venus's flytraps, 
Dionaea muscipula\nn12783316\twaterwheel plant, Aldrovanda vesiculosa\nn12783730\tDrosophyllum lusitanicum\nn12784371\troridula\nn12784889\tAustralian pitcher plant, Cephalotus follicularis\nn12785724\tsedum\nn12785889\tstonecrop\nn12786273\trose-root, midsummer-men, Sedum rosea\nn12786464\torpine, orpin, livelong, live-forever, Sedum telephium\nn12786836\tpinwheel, Aeonium haworthii\nn12787364\tChristmas bush, Christmas tree, Ceratopetalum gummiferum\nn12788854\thortensia, Hydrangea macrophylla hortensis\nn12789054\tfall-blooming hydrangea, Hydrangea paniculata\nn12789554\tcarpenteria, Carpenteria californica\nn12789977\tdecumary, Decumaria barbata, Decumaria barbara\nn12790430\tdeutzia\nn12791064\tphiladelphus\nn12791329\tmock orange, syringa, Philadelphus coronarius\nn12793015\tsaxifrage, breakstone, rockfoil\nn12793284\tyellow mountain saxifrage, Saxifraga aizoides\nn12793494\tmeadow saxifrage, fair-maids-of-France, Saxifraga granulata\nn12793695\tmossy saxifrage, Saxifraga hypnoides\nn12793886\twestern saxifrage, Saxifraga occidentalis\nn12794135\tpurple saxifrage, Saxifraga oppositifolia\nn12794367\tstar saxifrage, starry saxifrage, Saxifraga stellaris\nn12794568\tstrawberry geranium, strawberry saxifrage, mother-of-thousands, Saxifraga stolonifera, Saxifraga sarmentosam\nn12794985\tastilbe\nn12795209\tfalse goatsbeard, Astilbe biternata\nn12795352\tdwarf astilbe, Astilbe chinensis pumila\nn12795555\tspirea, spiraea, Astilbe japonica\nn12796022\tbergenia\nn12796385\tcoast boykinia, Boykinia elata, Boykinia occidentalis\nn12796849\tgolden saxifrage, golden spleen\nn12797368\tumbrella plant, Indian rhubarb, Darmera peltata, Peltiphyllum peltatum\nn12797860\tbridal wreath, bridal-wreath, Francoa ramosa\nn12798284\talumroot, alumbloom\nn12798910\tcoralbells, Heuchera sanguinea\nn12799269\tleatherleaf saxifrage, Leptarrhena pyrolifolia\nn12799776\twoodland star, Lithophragma affine, Lithophragma affinis, Tellima affinis\nn12800049\tprairie star, Lithophragma 
parviflorum\nn12800586\tmiterwort, mitrewort, bishop's cap\nn12801072\tfive-point bishop's cap, Mitella pentandra\nn12801520\tparnassia, grass-of-Parnassus\nn12801781\tbog star, Parnassia palustris\nn12801966\tfringed grass of Parnassus, Parnassia fimbriata\nn12803226\tfalse alumroot, fringe cups, Tellima grandiflora\nn12803754\tfoamflower, coolwart, false miterwort, false mitrewort, Tiarella cordifolia\nn12803958\tfalse miterwort, false mitrewort, Tiarella unifoliata\nn12804352\tpickaback plant, piggyback plant, youth-on-age, Tolmiea menziesii\nn12805146\tcurrant, currant bush\nn12805561\tblack currant, European black currant, Ribes nigrum\nn12805762\twhite currant, Ribes sativum\nn12806015\tgooseberry, gooseberry bush, Ribes uva-crispa, Ribes grossularia\nn12806732\tplane tree, sycamore, platan\nn12807251\tLondon plane, Platanus acerifolia\nn12807409\tAmerican sycamore, American plane, buttonwood, Platanus occidentalis\nn12807624\toriental plane, Platanus orientalis\nn12807773\tCalifornia sycamore, Platanus racemosa\nn12808007\tArizona sycamore, Platanus wrightii\nn12809868\tGreek valerian, Polemonium reptans\nn12810007\tnorthern Jacob's ladder, Polemonium boreale\nn12810151\tskunkweed, skunk-weed, Polemonium viscosum\nn12810595\tphlox\nn12811027\tmoss pink, mountain phlox, moss phlox, dwarf phlox, Phlox subulata\nn12811713\tevening-snow, Linanthus dichotomus\nn12812235\tacanthus\nn12812478\tbear's breech, bear's breeches, sea holly, Acanthus mollis\nn12812801\tcaricature plant, Graptophyllum pictum\nn12813189\tblack-eyed Susan, black-eyed Susan vine, Thunbergia alata\nn12814643\tcatalpa, Indian bean\nn12814857\tCatalpa bignioides\nn12814960\tCatalpa speciosa\nn12815198\tdesert willow, Chilopsis linearis\nn12815668\tcalabash, calabash tree, Crescentia cujete\nn12815838\tcalabash\nn12816508\tborage, tailwort, Borago officinalis\nn12816942\tcommon amsinckia, Amsinckia intermedia\nn12817464\tanchusa\nn12817694\tbugloss, alkanet, Anchusa officinalis\nn12817855\tcape 
forget-me-not, Anchusa capensis\nn12818004\tcape forget-me-not, Anchusa riparia\nn12818346\tSpanish elm, Equador laurel, salmwood, cypre, princewood, Cordia alliodora\nn12818601\tprincewood, Spanish elm, Cordia gerascanthus\nn12818966\tChinese forget-me-not, Cynoglossum amabile\nn12819141\thound's-tongue, Cynoglossum officinale\nn12819354\thound's-tongue, Cynoglossum virginaticum\nn12819728\tblueweed, blue devil, blue thistle, viper's bugloss, Echium vulgare\nn12820113\tbeggar's lice, beggar lice\nn12820669\tgromwell, Lithospermum officinale\nn12820853\tpuccoon, Lithospermum caroliniense\nn12821505\tVirginia bluebell, Virginia cowslip, Mertensia virginica\nn12821895\tgarden forget-me-not, Myosotis sylvatica\nn12822115\tforget-me-not, mouse ear, Myosotis scorpiodes\nn12822466\tfalse gromwell\nn12822769\tcomfrey, cumfrey\nn12822955\tcommon comfrey, boneset, Symphytum officinale\nn12823717\tconvolvulus\nn12823859\tbindweed\nn12824053\tfield bindweed, wild morning-glory, Convolvulus arvensis\nn12824289\tscammony, Convolvulus scammonia\nn12824735\tsilverweed\nn12825497\tdodder\nn12826143\tdichondra, Dichondra micrantha\nn12827270\tcypress vine, star-glory, Indian pink, Ipomoea quamoclit, Quamoclit pennata\nn12827537\tmoonflower, belle de nuit, Ipomoea alba\nn12827907\twild potato vine, wild sweet potato vine, man-of-the-earth, manroot, scammonyroot, Ipomoea panurata, Ipomoea fastigiata\nn12828220\tred morning-glory, star ipomoea, Ipomoea coccinea\nn12828379\tman-of-the-earth, Ipomoea leptophylla\nn12828520\tscammony, Ipomoea orizabensis\nn12828791\tJapanese morning glory, Ipomoea nil\nn12828977\timperial Japanese morning glory, Ipomoea imperialis\nn12829582\tgesneriad\nn12829975\tgesneria\nn12830222\tachimenes, hot water plant\nn12830568\taeschynanthus\nn12831141\tlace-flower vine, Alsobia dianthiflora, Episcia dianthiflora\nn12831535\tcolumnea\nn12831932\tepiscia\nn12832315\tgloxinia\nn12832538\tCanterbury bell, Gloxinia 
perennis\nn12832822\tkohleria\nn12833149\tAfrican violet, Saintpaulia ionantha\nn12833985\tstreptocarpus\nn12834190\tCape primrose\nn12834798\twaterleaf\nn12834938\tVirginia waterleaf, Shawnee salad, shawny, Indian salad, John's cabbage, Hydrophyllum virginianum\nn12835331\tyellow bells, California yellow bells, whispering bells, Emmanthe penduliflora\nn12835766\tyerba santa, Eriodictyon californicum\nn12836212\tnemophila\nn12836337\tbaby blue-eyes, Nemophila menziesii\nn12836508\tfive-spot, Nemophila maculata\nn12836862\tscorpionweed, scorpion weed, phacelia\nn12837052\tCalifornia bluebell, Phacelia campanularia\nn12837259\tCalifornia bluebell, whitlavia, Phacelia minor, Phacelia whitlavia\nn12837466\tfiddleneck, Phacelia tanacetifolia\nn12837803\tfiesta flower, Pholistoma auritum, Nemophila aurita\nn12839574\tbasil thyme, basil balm, mother of thyme, Acinos arvensis, Satureja acinos\nn12839979\tgiant hyssop\nn12840168\tyellow giant hyssop, Agastache nepetoides\nn12840362\tanise hyssop, Agastache foeniculum\nn12840502\tMexican hyssop, Agastache mexicana\nn12840749\tbugle, bugleweed\nn12841007\tcreeping bugle, Ajuga reptans\nn12841193\terect bugle, blue bugle, Ajuga genevensis\nn12841354\tpyramid bugle, Ajuga pyramidalis\nn12842302\twood mint\nn12842519\thairy wood mint, Blephilia hirsuta\nn12842642\tdowny wood mint, Blephilia celiata\nn12842887\tcalamint\nn12843144\tcommon calamint, Calamintha sylvatica, Satureja calamintha officinalis\nn12843316\tlarge-flowered calamint, Calamintha grandiflora, Clinopodium grandiflorum, Satureja grandiflora\nn12843557\tlesser calamint, field balm, Calamintha nepeta, Calamintha nepeta glantulosa, Satureja nepeta, Satureja calamintha glandulosa\nn12843970\twild basil, cushion calamint, Clinopodium vulgare, Satureja vulgaris\nn12844409\thorse balm, horseweed, stoneroot, stone-root, richweed, stone root, Collinsonia canadensis\nn12844939\tcoleus, flame nettle\nn12845187\tcountry borage, Coleus aromaticus, Coleus amboinicus, 
Plectranthus amboinicus\nn12845413\tpainted nettle, Joseph's coat, Coleus blumei, Solenostemon blumei, Solenostemon scutellarioides\nn12845908\tApalachicola rosemary, Conradina glabra\nn12846335\tdragonhead, dragon's head, Dracocephalum parviflorum\nn12846690\telsholtzia\nn12847008\themp nettle, dead nettle, Galeopsis tetrahit\nn12847374\tground ivy, alehoof, field balm, gill-over-the-ground, runaway robin, Glechoma hederaceae, Nepeta hederaceae\nn12847927\tpennyroyal, American pennyroyal, Hedeoma pulegioides\nn12848499\thyssop, Hyssopus officinalis\nn12849061\tdead nettle\nn12849279\twhite dead nettle, Lamium album\nn12849416\thenbit, Lamium amplexicaule\nn12849952\tEnglish lavender, Lavandula angustifolia, Lavandula officinalis\nn12850168\tFrench lavender, Lavandula stoechas\nn12850336\tspike lavender, French lavender, Lavandula latifolia\nn12850906\tdagga, Cape dagga, red dagga, wilde dagga, Leonotis leonurus\nn12851094\tlion's-ear, Leonotis nepetaefolia, Leonotis nepetifolia\nn12851469\tmotherwort, Leonurus cardiaca\nn12851860\tpitcher sage, Lepechinia calycina, Sphacele calycina\nn12852234\tbugleweed, Lycopus virginicus\nn12852428\twater horehound, Lycopus americanus\nn12852570\tgipsywort, gypsywort, Lycopus europaeus\nn12853080\toriganum\nn12853287\toregano, marjoram, pot marjoram, wild marjoram, winter sweet, Origanum vulgare\nn12853482\tsweet marjoram, knotted marjoram, Origanum majorana, Majorana hortensis\nn12854048\thorehound\nn12854193\tcommon horehound, white horehound, Marrubium vulgare\nn12854600\tlemon balm, garden balm, sweet balm, bee balm, beebalm, Melissa officinalis\nn12855365\tcorn mint, field mint, Mentha arvensis\nn12855494\twater-mint, water mint, Mentha aquatica\nn12855710\tbergamot mint, lemon mint, eau de cologne mint, Mentha citrata\nn12855886\thorsemint, Mentha longifolia\nn12856091\tpeppermint, Mentha piperita\nn12856287\tspearmint, Mentha spicata\nn12856479\tapple mint, applemint, Mentha rotundifolia, Mentha 
suaveolens\nn12856680\tpennyroyal, Mentha pulegium\nn12857204\tyerba buena, Micromeria chamissonis, Micromeria douglasii, Satureja douglasii\nn12857779\tmolucca balm, bells of Ireland, Molucella laevis\nn12858150\tmonarda, wild bergamot\nn12858397\tbee balm, beebalm, bergamot mint, oswego tea, Monarda didyma\nn12858618\thorsemint, Monarda punctata\nn12858871\tbee balm, beebalm, Monarda fistulosa\nn12858987\tlemon mint, horsemint, Monarda citriodora\nn12859153\tplains lemon monarda, Monarda pectinata\nn12859272\tbasil balm, Monarda clinopodia\nn12859679\tmustang mint, Monardella lanceolata\nn12859986\tcatmint, catnip, Nepeta cataria\nn12860365\tbasil\nn12860978\tbeefsteak plant, Perilla frutescens crispa\nn12861345\tphlomis\nn12861541\tJerusalem sage, Phlomis fruticosa\nn12861892\tphysostegia\nn12862512\tplectranthus\nn12862828\tpatchouli, patchouly, pachouli, Pogostemon cablin\nn12863234\tself-heal, heal all, Prunella vulgaris\nn12863624\tmountain mint\nn12864160\trosemary, Rosmarinus officinalis\nn12865037\tclary sage, Salvia clarea\nn12865562\tpurple sage, chaparral sage, Salvia leucophylla\nn12865708\tcancerweed, cancer weed, Salvia lyrata\nn12865824\tcommon sage, ramona, Salvia officinalis\nn12866002\tmeadow clary, Salvia pratensis\nn12866162\tclary, Salvia sclarea\nn12866333\tpitcher sage, Salvia spathacea\nn12866459\tMexican mint, Salvia divinorum\nn12866635\twild sage, wild clary, vervain sage, Salvia verbenaca\nn12866968\tsavory\nn12867184\tsummer savory, Satureja hortensis, Satureia hortensis\nn12867449\twinter savory, Satureja montana, Satureia montana\nn12867826\tskullcap, helmetflower\nn12868019\tblue pimpernel, blue skullcap, mad-dog skullcap, mad-dog weed, Scutellaria lateriflora\nn12868880\thedge nettle, dead nettle, Stachys sylvatica\nn12869061\thedge nettle, Stachys palustris\nn12869478\tgermander\nn12869668\tAmerican germander, wood sage, Teucrium canadense\nn12870048\tcat thyme, marum, Teucrium marum\nn12870225\twood sage, Teucrium 
scorodonia\nn12870535\tthyme\nn12870682\tcommon thyme, Thymus vulgaris\nn12870891\twild thyme, creeping thyme, Thymus serpyllum\nn12871272\tblue curls\nn12871696\tturpentine camphor weed, camphorweed, vinegarweed, Trichostema lanceolatum\nn12871859\tbastard pennyroyal, Trichostema dichotomum\nn12872458\tbladderwort\nn12872914\tbutterwort\nn12873341\tgenlisea\nn12873984\tmartynia, Martynia annua\nn12875269\tcommon unicorn plant, devil's claw, common devil's claw, elephant-tusk, proboscis flower, ram's horn, Proboscidea louisianica\nn12875697\tsand devil's claw, Proboscidea arenaria, Martynia arenaria\nn12875861\tsweet unicorn plant, Proboscidea fragrans, Martynia fragrans\nn12876899\tfigwort\nn12877244\tsnapdragon\nn12877493\twhite snapdragon, Antirrhinum coulterianum\nn12877637\tyellow twining snapdragon, Antirrhinum filipes\nn12877838\tMediterranean snapdragon, Antirrhinum majus\nn12878169\tkitten-tails\nn12878325\tAlpine besseya, Besseya alpina\nn12878784\tfalse foxglove, Aureolaria pedicularia, Gerardia pedicularia\nn12879068\tfalse foxglove, Aureolaria virginica, Gerardia virginica\nn12879527\tcalceolaria, slipperwort\nn12879963\tIndian paintbrush, painted cup\nn12880244\tdesert paintbrush, Castilleja chromosa\nn12880462\tgiant red paintbrush, Castilleja miniata\nn12880638\tgreat plains paintbrush, Castilleja sessiliflora\nn12880799\tsulfur paintbrush, Castilleja sulphurea\nn12881105\tshellflower, shell-flower, turtlehead, snakehead, snake-head, Chelone glabra\nn12881913\tmaiden blue-eyed Mary, Collinsia parviflora\nn12882158\tblue-eyed Mary, Collinsia verna\nn12882779\tfoxglove, digitalis\nn12882945\tcommon foxglove, fairy bell, fingerflower, finger-flower, fingerroot, finger-root, Digitalis purpurea\nn12883265\tyellow foxglove, straw foxglove, Digitalis lutea\nn12883628\tgerardia\nn12884100\tblue toadflax, old-field toadflax, Linaria canadensis\nn12884260\ttoadflax, butter-and-eggs, wild snapdragon, devil's flax, Linaria vulgaris\nn12885045\tgolden-beard 
penstemon, Penstemon barbatus\nn12885265\tscarlet bugler, Penstemon centranthifolius\nn12885510\tred shrubby penstemon, redwood penstemon\nn12885754\tPlatte River penstemon, Penstemon cyananthus\nn12886185\thot-rock penstemon, Penstemon deustus\nn12886402\tJones' penstemon, Penstemon dolius\nn12886600\tshrubby penstemon, lowbush penstemon, Penstemon fruticosus\nn12886831\tnarrow-leaf penstemon, Penstemon linarioides\nn12887293\tballoon flower, scented penstemon, Penstemon palmeri\nn12887532\tParry's penstemon, Penstemon parryi\nn12887713\trock penstemon, cliff penstemon, Penstemon rupicola\nn12888016\tRydberg's penstemon, Penstemon rydbergii\nn12888234\tcascade penstemon, Penstemon serrulatus\nn12888457\tWhipple's penstemon, Penstemon whippleanus\nn12889219\tmoth mullein, Verbascum blattaria\nn12889412\twhite mullein, Verbascum lychnitis\nn12889579\tpurple mullein, Verbascum phoeniceum\nn12889713\tcommon mullein, great mullein, Aaron's rod, flannel mullein, woolly mullein, torch, Verbascum thapsus\nn12890265\tveronica, speedwell\nn12890490\tfield speedwell, Veronica agrestis\nn12890685\tbrooklime, American brooklime, Veronica americana\nn12890928\tcorn speedwell, Veronica arvensis\nn12891093\tbrooklime, European brooklime, Veronica beccabunga\nn12891305\tgermander speedwell, bird's eye, Veronica chamaedrys\nn12891469\twater speedwell, Veronica michauxii, Veronica anagallis-aquatica\nn12891643\tcommon speedwell, gypsyweed, Veronica officinalis\nn12891824\tpurslane speedwell, Veronica peregrina\nn12892013\tthyme-leaved speedwell, Veronica serpyllifolia\nn12893463\tnightshade\nn12893993\thorse nettle, ball nettle, bull nettle, ball nightshade, Solanum carolinense\nn12895298\tAfrican holly, Solanum giganteum\nn12895811\tpotato vine, Solanum jasmoides\nn12896615\tgarden huckleberry, wonderberry, sunberry, Solanum nigrum guineese, Solanum melanocerasum, Solanum burbankii\nn12897118\tnaranjilla, Solanum quitoense\nn12897788\tpotato vine, giant potato creeper, Solanum 
wendlandii\nn12897999\tpotato tree, Brazilian potato tree, Solanum wrightii, Solanum macranthum\nn12898342\tbelladonna, belladonna plant, deadly nightshade, Atropa belladonna\nn12898774\tbush violet, browallia\nn12899166\tlady-of-the-night, Brunfelsia americana\nn12899537\tangel's trumpet, maikoa, Brugmansia arborea, Datura arborea\nn12899752\tangel's trumpet, Brugmansia suaveolens, Datura suaveolens\nn12899971\tred angel's trumpet, Brugmansia sanguinea, Datura sanguinea\nn12900783\tcone pepper, Capsicum annuum conoides\nn12901724\tbird pepper, Capsicum frutescens baccatum, Capsicum baccatum\nn12902466\tday jessamine, Cestrum diurnum\nn12902662\tnight jasmine, night jessamine, Cestrum nocturnum\nn12903014\ttree tomato, tamarillo\nn12903367\tthorn apple\nn12903503\tjimsonweed, jimson weed, Jamestown weed, common thorn apple, apple of Peru, Datura stramonium\nn12903964\tpichi, Fabiana imbricata\nn12904314\thenbane, black henbane, stinking nightshade, Hyoscyamus niger\nn12904562\tEgyptian henbane, Hyoscyamus muticus\nn12904938\tmatrimony vine, boxthorn\nn12905135\tcommon matrimony vine, Duke of Argyll's tea tree, Lycium barbarum, Lycium halimifolium\nn12905412\tChristmasberry, Christmas berry, Lycium carolinianum\nn12906214\tplum tomato\nn12906498\tmandrake, devil's apples, Mandragora officinarum\nn12906771\tmandrake root, mandrake\nn12907057\tapple of Peru, shoo fly, Nicandra physaloides\nn12907671\tflowering tobacco, Jasmine tobacco, Nicotiana alata\nn12907857\tcommon tobacco, Nicotiana tabacum\nn12908093\twild tobacco, Indian tobacco, Nicotiana rustica\nn12908645\tcupflower, nierembergia\nn12908854\twhitecup, Nierembergia repens, Nierembergia rivularis\nn12909421\tpetunia\nn12909614\tlarge white petunia, Petunia axillaris\nn12909759\tviolet-flowered petunia, Petunia integrifolia\nn12909917\thybrid petunia, Petunia hybrida\nn12911079\tcape gooseberry, purple ground cherry, Physalis peruviana\nn12911264\tstrawberry tomato, dwarf cape gooseberry, Physalis 
pruinosa\nn12911440\ttomatillo, jamberry, Mexican husk tomato, Physalis ixocarpa\nn12911673\ttomatillo, miltomate, purple ground cherry, jamberry, Physalis philadelphica\nn12911914\tyellow henbane, Physalis viscosa\nn12912274\tcock's eggs, Salpichroa organifolia, Salpichroa rhomboidea\nn12912670\tsalpiglossis\nn12912801\tpainted tongue, Salpiglossis sinuata\nn12913144\tbutterfly flower, poor man's orchid, schizanthus\nn12913524\tScopolia carniolica\nn12913791\tchalice vine, trumpet flower, cupflower, Solandra guttata\nn12914923\tverbena, vervain\nn12915140\tlantana\nn12915568\tblack mangrove, Avicennia marina\nn12915811\twhite mangrove, Avicennia officinalis\nn12916179\tblack mangrove, Aegiceras majus\nn12916511\tteak, Tectona grandis\nn12917901\tspurge\nn12918609\tsun spurge, wartweed, wartwort, devil's milk, Euphorbia helioscopia\nn12918810\tpetty spurge, devil's milk, Euphorbia peplus\nn12918991\tmedusa's head, Euphorbia medusae, Euphorbia caput-medusae\nn12919195\twild spurge, flowering spurge, tramp's spurge, Euphorbia corollata\nn12919403\tsnow-on-the-mountain, snow-in-summer, ghost weed, Euphorbia marginata\nn12919646\tcypress spurge, Euphorbia cyparissias\nn12919847\tleafy spurge, wolf's milk, Euphorbia esula\nn12920043\thairy spurge, Euphorbia hirsuta\nn12920204\tpoinsettia, Christmas star, Christmas flower, lobster plant, Mexican flameleaf, painted leaf, Euphorbia pulcherrima\nn12920521\tJapanese poinsettia, mole plant, paint leaf, Euphorbia heterophylla\nn12920719\tfire-on-the-mountain, painted leaf, Mexican fire plant, Euphorbia cyathophora\nn12920955\twood spurge, Euphorbia amygdaloides\nn12921315\tdwarf spurge, Euphorbia exigua\nn12921499\tscarlet plume, Euphorbia fulgens\nn12921660\tnaboom, cactus euphorbia, Euphorbia ingens\nn12921868\tcrown of thorns, Christ thorn, Christ plant, Euphorbia milii\nn12922119\ttoothed spurge, Euphorbia dentata\nn12922458\tthree-seeded mercury, Acalypha virginica\nn12922763\tcroton, Croton 
tiglium\nn12923108\tcascarilla, Croton eluteria\nn12923257\tcascarilla bark, eleuthera bark, sweetwood bark\nn12924623\tcastor-oil plant, castor bean plant, palma christi, palma christ, Ricinus communis\nn12925179\tspurge nettle, tread-softly, devil nettle, pica-pica, Cnidoscolus urens, Jatropha urens, Jatropha stimulosus\nn12925583\tphysic nut, Jatropha curcus\nn12926039\tPara rubber tree, caoutchouc tree, Hevea brasiliensis\nn12926480\tcassava, casava\nn12926689\tbitter cassava, manioc, mandioc, mandioca, tapioca plant, gari, Manihot esculenta, Manihot utilissima\nn12927013\tcassava, manioc\nn12927194\tsweet cassava, Manihot dulcis\nn12927494\tcandlenut, varnish tree, Aleurites moluccana\nn12927758\ttung tree, tung, tung-oil tree, Aleurites fordii\nn12928071\tslipper spurge, slipper plant\nn12928307\tcandelilla, Pedilanthus bracteatus, Pedilanthus pavonis\nn12928491\tJewbush, Jew-bush, Jew bush, redbird cactus, redbird flower, Pedilanthus tithymaloides\nn12928819\tjumping bean, jumping seed, Mexican jumping bean\nn12929403\tcamellia, camelia\nn12929600\tjaponica, Camellia japonica\nn12930778\tumbellifer, umbelliferous plant\nn12930951\twild parsley\nn12931231\tfool's parsley, lesser hemlock, Aethusa cynapium\nn12931542\tdill, Anethum graveolens\nn12931906\tangelica, angelique\nn12932173\tgarden angelica, archangel, Angelica Archangelica\nn12932365\twild angelica, Angelica sylvestris\nn12932706\tchervil, beaked parsley, Anthriscus cereifolium\nn12932966\tcow parsley, wild chervil, Anthriscus sylvestris\nn12933274\twild celery, Apium graveolens\nn12934036\tastrantia, masterwort\nn12934174\tgreater masterwort, Astrantia major\nn12934479\tcaraway, Carum carvi\nn12934685\twhorled caraway\nn12934985\twater hemlock, Cicuta verosa\nn12935166\tspotted cowbane, spotted hemlock, spotted water hemlock\nn12935609\themlock, poison hemlock, poison parsley, California fern, Nebraska fern, winter fern, Conium maculatum\nn12936155\tearthnut, Conopodium denudatum\nn12936826\tcumin, 
Cuminum cyminum\nn12937130\twild carrot, Queen Anne's lace, Daucus carota\nn12938081\teryngo, eringo\nn12938193\tsea holly, sea holm, sea eryngium, Eryngium maritimum\nn12938445\tbutton snakeroot, Eryngium aquaticum\nn12938667\trattlesnake master, rattlesnake's master, button snakeroot, Eryngium yuccifolium\nn12939104\tfennel\nn12939282\tcommon fennel, Foeniculum vulgare\nn12939479\tFlorence fennel, Foeniculum dulce, Foeniculum vulgare dulce\nn12939874\tcow parsnip, hogweed, Heracleum sphondylium\nn12940226\tlovage, Levisticum officinale\nn12940609\tsweet cicely, Myrrhis odorata\nn12941220\twater fennel, Oenanthe aquatica\nn12941536\tparsnip, Pastinaca sativa\nn12941717\tcultivated parsnip\nn12942025\twild parsnip, madnep\nn12942395\tparsley, Petroselinum crispum\nn12942572\tItalian parsley, flat-leaf parsley, Petroselinum crispum neapolitanum\nn12942729\tHamburg parsley, turnip-rooted parsley, Petroselinum crispum tuberosum\nn12943049\tanise, anise plant, Pimpinella anisum\nn12943443\tsanicle, snakeroot\nn12943912\tpurple sanicle, Sanicula bipinnatifida\nn12944095\tEuropean sanicle, Sanicula Europaea\nn12945177\twater parsnip, Sium suave\nn12945366\tgreater water parsnip, Sium latifolium\nn12945549\tskirret, Sium sisarum\nn12946849\tdogwood, dogwood tree, cornel\nn12947313\tcommon white dogwood, eastern flowering dogwood, Cornus florida\nn12947544\tred osier, red osier dogwood, red dogwood, American dogwood, redbrush, Cornus stolonifera\nn12947756\tsilky dogwood, Cornus obliqua\nn12947895\tsilky cornel, silky dogwood, Cornus amomum\nn12948053\tcommon European dogwood, red dogwood, blood-twig, pedwood, Cornus sanguinea\nn12948251\tbunchberry, dwarf cornel, crackerberry, pudding berry, Cornus canadensis\nn12948495\tcornelian cherry, Cornus mas\nn12949160\tpuka, Griselinia lucida\nn12949361\tkapuka, Griselinia littoralis\nn12950126\tvalerian\nn12950314\tcommon valerian, garden heliotrope, Valeriana officinalis\nn12950796\tcommon corn salad, lamb's lettuce, 
Valerianella olitoria, Valerianella locusta\nn12951146\tred valerian, French honeysuckle, Centranthus ruber\nn12951835\tfilmy fern, film fern\nn12952165\tbristle fern, filmy fern\nn12952469\thare's-foot bristle fern, Trichomanes boschianum\nn12952590\tKillarney fern, Trichomanes speciosum\nn12952717\tkidney fern, Trichomanes reniforme\nn12953206\tflowering fern, osmund\nn12953484\troyal fern, royal osmund, king fern, ditch fern, French bracken, Osmunda regalis\nn12953712\tinterrupted fern, Osmunda clatonia\nn12954353\tcrape fern, Prince-of-Wales fern, Prince-of-Wales feather, Prince-of-Wales plume, Leptopteris superba, Todea superba\nn12954799\tcrepe fern, king fern, Todea barbara\nn12955414\tcurly grass, curly grass fern, Schizaea pusilla\nn12955840\tpine fern, Anemia adiantifolia\nn12956170\tclimbing fern\nn12956367\tcreeping fern, Hartford fern, Lygodium palmatum\nn12956588\tclimbing maidenhair, climbing maidenhair fern, snake fern, Lygodium microphyllum\nn12956922\tscented fern, Mohria caffrorum\nn12957608\tclover fern, pepperwort\nn12957803\tnardoo, nardo, common nardoo, Marsilea drummondii\nn12957924\twater clover, Marsilea quadrifolia\nn12958261\tpillwort, Pilularia globulifera\nn12958615\tregnellidium, Regnellidium diphyllum\nn12959074\tfloating-moss, Salvinia rotundifolia, Salvinia auriculata\nn12959538\tmosquito fern, floating fern, Carolina pond fern, Azolla caroliniana\nn12960378\tadder's tongue, adder's tongue fern\nn12960552\tribbon fern, Ophioglossum pendulum\nn12960863\tgrape fern\nn12961242\tdaisyleaf grape fern, daisy-leaved grape fern, Botrychium matricariifolium\nn12961393\tleathery grape fern, Botrychium multifidum\nn12961536\trattlesnake fern, Botrychium virginianum\nn12961879\tflowering fern, Helminthostachys zeylanica\nn12963628\tpowdery mildew\nn12964920\tDutch elm fungus, Ceratostomella ulmi\nn12965626\tergot, Claviceps purpurea\nn12965951\trye ergot\nn12966804\tblack root rot fungus, Xylaria mali\nn12966945\tdead-man's-fingers, 
dead-men's-fingers, Xylaria polymorpha\nn12968136\tsclerotinia\nn12968309\tbrown cup\nn12969131\tearthball, false truffle, puffball, hard-skinned puffball\nn12969425\tScleroderma citrinum, Scleroderma aurantium\nn12969670\tScleroderma flavidium, star earthball\nn12969927\tScleroderma bovista, smooth earthball\nn12970193\tPodaxaceae\nn12970293\tstalked puffball\nn12970733\tstalked puffball\nn12971400\tfalse truffle\nn12971804\tRhizopogon idahoensis\nn12972136\tTruncocolumella citrina\nn12973443\tmucor\nn12973791\trhizopus\nn12973937\tbread mold, Rhizopus nigricans\nn12974987\tslime mold, slime mould\nn12975804\ttrue slime mold, acellular slime mold, plasmodial slime mold, myxomycete\nn12976198\tcellular slime mold\nn12976554\tdictostylium\nn12978076\tpond-scum parasite\nn12979316\tpotato wart fungus, Synchytrium endobioticum\nn12979829\twhite fungus, Saprolegnia ferax\nn12980080\twater mold\nn12980840\tdowny mildew, false mildew\nn12981086\tblue mold fungus, Peronospora tabacina\nn12981301\tonion mildew, Peronospora destructor\nn12981443\ttobacco mildew, Peronospora hyoscyami\nn12981954\twhite rust\nn12982468\tpythium\nn12982590\tdamping off fungus, Pythium debaryanum\nn12982915\tPhytophthora citrophthora\nn12983048\tPhytophthora infestans\nn12983654\tclubroot fungus, Plasmodiophora brassicae\nn12983873\tGeglossaceae\nn12983961\tSarcosomataceae\nn12984267\tRufous rubber cup\nn12984489\tdevil's cigar\nn12984595\tdevil's urn\nn12985420\ttruffle, earthnut, earth-ball\nn12985773\tclub fungus\nn12985857\tcoral fungus\nn12986227\ttooth fungus\nn12987056\tlichen\nn12987423\tascolichen\nn12987535\tbasidiolichen\nn12988158\tlecanora\nn12988341\tmanna lichen\nn12988572\tarchil, orchil\nn12989007\troccella, Roccella tinctoria\nn12989938\tbeard lichen, beard moss, Usnea barbata\nn12990597\thorsehair lichen, horsetail lichen\nn12991184\treindeer moss, reindeer lichen, arctic moss, Cladonia rangiferina\nn12991837\tcrottle, crottal, crotal\nn12992177\tIceland moss, Iceland lichen, 
Cetraria islandica\nn12992868\tfungus\nn12994892\tpromycelium\nn12995601\ttrue fungus\nn12997654\tbasidiomycete, basidiomycetous fungi\nn12997919\tmushroom\nn12998815\tagaric\nn13000891\tmushroom\nn13001041\tmushroom\nn13001206\ttoadstool\nn13001366\thorse mushroom, Agaricus arvensis\nn13001529\tmeadow mushroom, field mushroom, Agaricus campestris\nn13001930\tshiitake, shiitake mushroom, Chinese black mushroom, golden oak mushroom, Oriental black mushroom, Lentinus edodes\nn13002209\tscaly lentinus, Lentinus lepideus\nn13002750\troyal agaric, Caesar's agaric, Amanita caesarea\nn13002925\tfalse deathcap, Amanita mappa\nn13003061\tfly agaric, Amanita muscaria\nn13003254\tdeath cap, death cup, death angel, destroying angel, Amanita phalloides\nn13003522\tblushing mushroom, blusher, Amanita rubescens\nn13003712\tdestroying angel, Amanita verna\nn13004423\tchanterelle, chantarelle, Cantharellus cibarius\nn13004640\tfloccose chanterelle, Cantharellus floccosus\nn13004826\tpig's ears, Cantharellus clavatus\nn13004992\tcinnabar chanterelle, Cantharellus cinnabarinus\nn13005329\tjack-o-lantern fungus, jack-o-lantern, jack-a-lantern, Omphalotus illudens\nn13005984\tinky cap, inky-cap mushroom, Coprinus atramentarius\nn13006171\tshaggymane, shaggy cap, shaggymane mushroom, Coprinus comatus\nn13006631\tmilkcap, Lactarius delicioso\nn13006894\tfairy-ring mushroom, Marasmius oreades\nn13007034\tfairy ring, fairy circle\nn13007417\toyster mushroom, oyster fungus, oyster agaric, Pleurotus ostreatus\nn13007629\tolive-tree agaric, Pleurotus phosphoreus\nn13008157\tPholiota astragalina\nn13008315\tPholiota aurea, golden pholiota\nn13008485\tPholiota destruens\nn13008689\tPholiota flammans\nn13008839\tPholiota flavida\nn13009085\tnameko, viscid mushroom, Pholiota nameko\nn13009244\tPholiota squarrosa-adiposa\nn13009429\tPholiota squarrosa, scaly pholiota\nn13009656\tPholiota squarrosoides\nn13010694\tStropharia ambigua\nn13010951\tStropharia hornemannii\nn13011221\tStropharia 
rugoso-annulata\nn13011595\tgill fungus\nn13012253\tEntoloma lividum, Entoloma sinuatum\nn13012469\tEntoloma aprile\nn13012973\tChlorophyllum molybdites\nn13013534\tlepiota\nn13013764\tparasol mushroom, Lepiota procera\nn13013965\tpoisonous parasol, Lepiota morgani\nn13014097\tLepiota naucina\nn13014265\tLepiota rhacodes\nn13014409\tAmerican parasol, Lepiota americana\nn13014581\tLepiota rubrotincta\nn13014741\tLepiota clypeolaria\nn13014879\tonion stem, Lepiota cepaestipes\nn13015509\tpink disease fungus, Corticium salmonicolor\nn13015688\tbottom rot fungus, Corticium solani\nn13016076\tpotato fungus, Pellicularia filamentosa, Rhizoctinia solani\nn13016289\tcoffee fungus, Pellicularia koleroga\nn13017102\tblewits, Clitocybe nuda\nn13017240\tsandy mushroom, Tricholoma populinum\nn13017439\tTricholoma pessundatum\nn13017610\tTricholoma sejunctum\nn13017789\tman-on-a-horse, Tricholoma flavovirens\nn13017979\tTricholoma venenata\nn13018088\tTricholoma pardinum\nn13018232\tTricholoma vaccinum\nn13018407\tTricholoma aurantium\nn13018906\tVolvaria bombycina\nn13019496\tPluteus aurantiorugosus\nn13019643\tPluteus magnus, sawdust mushroom\nn13019835\tdeer mushroom, Pluteus cervinus\nn13020191\tstraw mushroom, Chinese mushroom, Volvariella volvacea\nn13020481\tVolvariella bombycina\nn13020964\tClitocybe clavipes\nn13021166\tClitocybe dealbata\nn13021332\tClitocybe inornata\nn13021543\tClitocybe robusta, Clytocybe alba\nn13021689\tClitocybe irina, Tricholoma irinum, Lepista irina\nn13021867\tClitocybe subconnexa\nn13022210\twinter mushroom, Flammulina velutipes\nn13022709\tmycelium\nn13022903\tsclerotium\nn13023134\tsac fungus\nn13024012\tascomycete, ascomycetous fungus\nn13024500\tClavicipitaceae, grainy club mushrooms\nn13024653\tgrainy club\nn13025647\tyeast\nn13025854\tbaker's yeast, brewer's yeast, Saccharomyces cerevisiae\nn13026015\twine-maker's yeast, Saccharomyces ellipsoides\nn13027557\tAspergillus fumigatus\nn13027879\tbrown root rot fungus, Thielavia 
basicola\nn13028611\tdiscomycete, cup fungus\nn13028937\tLeotia lubrica\nn13029122\tMitrula elegans\nn13029326\tSarcoscypha coccinea, scarlet cup\nn13029610\tCaloscypha fulgens\nn13029760\tAleuria aurantia, orange peel fungus\nn13030337\telf cup\nn13030616\tPeziza domicilina\nn13030852\tblood cup, fairy cup, Peziza coccinea\nn13031193\tUrnula craterium, urn fungus\nn13031323\tGaliella rufa\nn13031474\tJafnea semitosta\nn13032115\tmorel\nn13032381\tcommon morel, Morchella esculenta, sponge mushroom, sponge morel\nn13032618\tDisciotis venosa, cup morel\nn13032923\tVerpa, bell morel\nn13033134\tVerpa bohemica, early morel\nn13033396\tVerpa conica, conic Verpa\nn13033577\tblack morel, Morchella conica, conic morel, Morchella angusticeps, narrowhead morel\nn13033879\tMorchella crassipes, thick-footed morel\nn13034062\tMorchella semilibera, half-free morel, cow's head\nn13034555\tWynnea americana\nn13034788\tWynnea sparassoides\nn13035241\tfalse morel\nn13035389\tlorchel\nn13035707\thelvella\nn13035925\tHelvella crispa, miter mushroom\nn13036116\tHelvella acetabulum\nn13036312\tHelvella sulcata\nn13036804\tdiscina\nn13037406\tgyromitra\nn13037585\tGyromitra californica, California false morel\nn13037805\tGyromitra sphaerospora, round-spored gyromitra\nn13038068\tGyromitra esculenta, brain mushroom, beefsteak morel\nn13038376\tGyromitra infula, saddled-shaped false morel\nn13038577\tGyromitra fastigiata, Gyromitra brunnea\nn13038744\tGyromitra gigas\nn13039349\tgasteromycete, gastromycete\nn13040303\tstinkhorn, carrion fungus\nn13040629\tcommon stinkhorn, Phallus impudicus\nn13040796\tPhallus ravenelii\nn13041312\tdog stinkhorn, Mutinus caninus\nn13041943\tCalostoma lutescens\nn13042134\tCalostoma cinnabarina\nn13042316\tCalostoma ravenelii\nn13042982\tstinky squid, Pseudocolus fusiformis\nn13043926\tpuffball, true puffball\nn13044375\tgiant puffball, Calvatia gigantea\nn13044778\tearthstar\nn13045210\tGeastrum coronatum\nn13045594\tRadiigera 
fuscogleba\nn13045975\tAstreus pteridis\nn13046130\tAstreus hygrometricus\nn13046669\tbird's-nest fungus\nn13047862\tGastrocybe lateritia\nn13048447\tMacowanites americanus\nn13049953\tpolypore, pore fungus, pore mushroom\nn13050397\tbracket fungus, shelf fungus\nn13050705\tAlbatrellus dispansus\nn13050940\tAlbatrellus ovinus, sheep polypore\nn13051346\tNeolentinus ponderosus\nn13052014\tOligoporus leucospongia\nn13052248\tPolyporus tenuiculus\nn13052670\then-of-the-woods, hen of the woods, Polyporus frondosus, Grifola frondosa\nn13052931\tPolyporus squamosus, scaly polypore\nn13053608\tbeefsteak fungus, Fistulina hepatica\nn13054073\tagaric, Fomes igniarius\nn13054560\tbolete\nn13055423\tBoletus chrysenteron\nn13055577\tBoletus edulis\nn13055792\tFrost's bolete, Boletus frostii\nn13055949\tBoletus luridus\nn13056135\tBoletus mirabilis\nn13056349\tBoletus pallidus\nn13056607\tBoletus pulcherrimus\nn13056799\tBoletus pulverulentus\nn13057054\tBoletus roxanae\nn13057242\tBoletus subvelutipes\nn13057422\tBoletus variipes\nn13057639\tBoletus zelleri\nn13058037\tFuscoboletinus paluster\nn13058272\tFuscoboletinus serotinus\nn13058608\tLeccinum fibrillosum\nn13059298\tSuillus albivelatus\nn13059657\told-man-of-the-woods, Strobilomyces floccopus\nn13060017\tBoletellus russellii\nn13060190\tjelly fungus\nn13061172\tsnow mushroom, Tremella fuciformis\nn13061348\twitches' butter, Tremella lutescens\nn13061471\tTremella foliacea\nn13061704\tTremella reticulata\nn13062421\tJew's-ear, Jew's-ears, ear fungus, Auricularia auricula\nn13063269\trust, rust fungus\nn13063514\taecium\nn13064111\tflax rust, flax rust fungus, Melampsora lini\nn13064457\tblister rust, Cronartium ribicola\nn13065089\twheat rust, Puccinia graminis\nn13065514\tapple rust, cedar-apple rust, Gymnosporangium juniperi-virginianae\nn13066129\tsmut, smut fungus\nn13066448\tcovered smut\nn13066979\tloose smut\nn13067191\tcornsmut, corn smut\nn13067330\tboil smut, Ustilago maydis\nn13067532\tSphacelotheca, genus 
Sphacelotheca\nn13067672\thead smut, Sphacelotheca reiliana\nn13068255\tbunt, Tilletia caries\nn13068434\tbunt, stinking smut, Tilletia foetida\nn13068735\tonion smut, Urocystis cepulae\nn13068917\tflag smut fungus\nn13069224\twheat flag smut, Urocystis tritici\nn13069773\tfelt fungus, Septobasidium pseudopedicellatum\nn13070308\twaxycap\nn13070875\tHygrocybe acutoconica, conic waxycap\nn13071371\tHygrophorus borealis\nn13071553\tHygrophorus caeruleus\nn13071815\tHygrophorus inocybiformis\nn13072031\tHygrophorus kauffmanii\nn13072209\tHygrophorus marzuolus\nn13072350\tHygrophorus purpurascens\nn13072528\tHygrophorus russula\nn13072706\tHygrophorus sordidus\nn13072863\tHygrophorus tennesseensis\nn13073055\tHygrophorus turundus\nn13073703\tNeohygrophorus angelesianus\nn13074619\tCortinarius armillatus\nn13074814\tCortinarius atkinsonianus\nn13075020\tCortinarius corrugatus\nn13075272\tCortinarius gentilis\nn13075441\tCortinarius mutabilis, purple-staining Cortinarius\nn13075684\tCortinarius semisanguineus\nn13075847\tCortinarius subfoetidus\nn13076041\tCortinarius violaceus\nn13076405\tGymnopilus spectabilis\nn13076643\tGymnopilus validipes\nn13076831\tGymnopilus ventricosus\nn13077033\tmold, mould\nn13077295\tmildew\nn13078021\tverticillium\nn13079073\tmonilia\nn13079419\tcandida\nn13079567\tCandida albicans, Monilia albicans\nn13080306\tblastomycete\nn13080866\tyellow spot fungus, Cercospora kopkei\nn13081229\tgreen smut fungus, Ustilaginoidea virens\nn13081999\tdry rot\nn13082568\trhizoctinia\nn13083023\thouseplant\nn13083461\tbedder, bedding plant\nn13084184\tsucculent\nn13084834\tcultivar\nn13085113\tweed\nn13085747\twort\nn13090018\tbrier\nn13090871\taril\nn13091620\tsporophyll, sporophyl\nn13091774\tsporangium, spore case, spore sac\nn13091982\tsporangiophore\nn13092078\tascus\nn13092240\tascospore\nn13092385\tarthrospore\nn13092987\teusporangium\nn13093275\ttetrasporangium\nn13093629\tgametangium\nn13094145\tsorus\nn13094273\tsorus\nn13095013\tpartial 
veil\nn13096779\tlignum\nn13098515\tvascular ray, medullary ray\nn13098962\tphloem, bast\nn13099833\tevergreen, evergreen plant\nn13099999\tdeciduous plant\nn13100156\tpoisonous plant\nn13100677\tvine\nn13102648\tcreeper\nn13102775\ttendril\nn13103023\troot climber\nn13103660\tlignosae\nn13103750\tarborescent plant\nn13103877\tsnag\nn13104059\ttree\nn13107694\ttimber tree\nn13107807\ttreelet\nn13107891\tarbor\nn13108131\tbean tree\nn13108323\tpollard\nn13108481\tsapling\nn13108545\tshade tree\nn13108662\tgymnospermous tree\nn13108841\tconifer, coniferous tree\nn13109733\tangiospermous tree, flowering tree\nn13110915\tnut tree\nn13111174\tspice tree\nn13111340\tfever tree\nn13111504\tstump, tree stump\nn13111881\tbonsai\nn13112035\tming tree\nn13112201\tming tree\nn13118330\tundershrub\nn13118707\tsubshrub, suffrutex\nn13119870\tbramble\nn13120211\tliana\nn13120958\tgeophyte\nn13121104\tdesert plant, xerophyte, xerophytic plant, xerophile, xerophilous plant\nn13121349\tmesophyte, mesophytic plant\nn13122364\tmarsh plant, bog plant, swamp plant\nn13123309\themiepiphyte, semiepiphyte\nn13123431\tstrangler, strangler tree\nn13123841\tlithophyte, lithophytic plant\nn13124358\tsaprobe\nn13124654\tautophyte, autophytic plant, autotroph, autotrophic organism\nn13125117\troot\nn13126050\ttaproot\nn13126856\tprop root\nn13127001\tprophyll\nn13127303\trootstock\nn13127666\tquickset\nn13127843\tstolon, runner, offset\nn13128278\ttuberous plant\nn13128582\trhizome, rootstock, rootstalk\nn13128976\trachis\nn13129078\tcaudex\nn13130014\tcladode, cladophyll, phylloclad, phylloclade\nn13130161\treceptacle\nn13130726\tscape, flower stalk\nn13131028\tumbel\nn13131618\tpetiole, leafstalk\nn13132034\tpeduncle\nn13132156\tpedicel, pedicle\nn13132338\tflower cluster\nn13132486\traceme\nn13132656\tpanicle\nn13132756\tthyrse, thyrsus\nn13132940\tcyme\nn13133140\tcymule\nn13133233\tglomerule\nn13133316\tscorpioid cyme\nn13133613\tear, spike, capitulum\nn13133932\tspadix\nn13134302\tbulbous 
plant\nn13134531\tbulbil, bulblet\nn13134844\tcormous plant\nn13134947\tfruit\nn13135692\tfruitlet\nn13135832\tseed\nn13136316\tbean\nn13136556\tnut\nn13136781\tnutlet\nn13137010\tkernel, meat\nn13137225\tsyconium\nn13137409\tberry\nn13137672\taggregate fruit, multiple fruit, syncarp\nn13137951\tsimple fruit, bacca\nn13138155\tacinus\nn13138308\tdrupe, stone fruit\nn13138658\tdrupelet\nn13138842\tpome, false fruit\nn13139055\tpod, seedpod\nn13139321\tloment\nn13139482\tpyxidium, pyxis\nn13139647\thusk\nn13139837\tcornhusk\nn13140049\tpod, cod, seedcase\nn13140367\taccessory fruit, pseudocarp\nn13141141\tbuckthorn\nn13141415\tbuckthorn berry, yellow berry\nn13141564\tcascara buckthorn, bearberry, bearwood, chittamwood, chittimwood, Rhamnus purshianus\nn13141797\tcascara, cascara sagrada, chittam bark, chittem bark\nn13141972\tCarolina buckthorn, indian cherry, Rhamnus carolinianus\nn13142182\tcoffeeberry, California buckthorn, California coffee, Rhamnus californicus\nn13142504\tredberry, red-berry, Rhamnus croceus\nn13142907\tnakedwood\nn13143285\tjujube, jujube bush, Christ's-thorn, Jerusalem thorn, Ziziphus jujuba\nn13143758\tChrist's-thorn, Jerusalem thorn, Paliurus spina-christi\nn13144084\thazel, hazel tree, Pomaderris apetala\nn13145040\tfox grape, Vitis labrusca\nn13145250\tmuscadine, Vitis rotundifolia\nn13145444\tvinifera, vinifera grape, common grape vine, Vitis vinifera\nn13146403\tPinot blanc\nn13146583\tSauvignon grape\nn13146928\tSauvignon blanc\nn13147153\tMuscadet\nn13147270\tRiesling\nn13147386\tZinfandel\nn13147532\tChenin blanc\nn13147689\tmalvasia\nn13147918\tVerdicchio\nn13148208\tBoston ivy, Japanese ivy, Parthenocissus tricuspidata\nn13148384\tVirginia creeper, American ivy, woodbine, Parthenocissus quinquefolia\nn13149296\ttrue pepper, pepper vine\nn13149970\tbetel, betel pepper, Piper betel\nn13150378\tcubeb\nn13150592\tschizocarp\nn13150894\tpeperomia\nn13151082\twatermelon begonia, Peperomia argyreia, Peperomia sandersii\nn13152339\tyerba 
mansa, Anemopsis californica\nn13154388\tpinna, pinnule\nn13154494\tfrond\nn13154841\tbract\nn13155095\tbracteole, bractlet\nn13155305\tinvolucre\nn13155611\tglume\nn13156986\tpalmate leaf\nn13157137\tpinnate leaf\nn13157346\tbijugate leaf, bijugous leaf, twice-pinnate\nn13157481\tdecompound leaf\nn13157684\tacuminate leaf\nn13157971\tdeltoid leaf\nn13158167\tensiform leaf\nn13158512\tlinear leaf, elongate leaf\nn13158605\tlyrate leaf\nn13158714\tobtuse leaf\nn13158815\toblanceolate leaf\nn13159357\tpandurate leaf, panduriform leaf\nn13159691\treniform leaf\nn13159890\tspatulate leaf\nn13160116\teven-pinnate leaf, abruptly-pinnate leaf\nn13160254\todd-pinnate leaf\nn13160365\tpedate leaf\nn13160604\tcrenate leaf\nn13160831\tdentate leaf\nn13160938\tdenticulate leaf\nn13161151\terose leaf\nn13161254\truncinate leaf\nn13161904\tprickly-edged leaf\nn13163553\tdeadwood\nn13163649\thaulm, halm\nn13163991\tbranchlet, twig, sprig\nn13164501\tosier\nn13170840\tgiant scrambling fern, Diplopterygium longissimum\nn13171210\tumbrella fern, fan fern, Sticherus flabellatus, Gleichenia flabellata\nn13171797\tfloating fern, water sprite, Ceratopteris pteridioides\nn13172923\tpolypody\nn13173132\tlicorice fern, Polypodium glycyrrhiza\nn13173259\tgrey polypody, gray polypody, resurrection fern, Polypodium polypodioides\nn13173488\tleatherleaf, leathery polypody, coast polypody, Polypodium scouleri\nn13173697\trock polypody, rock brake, American wall fern, Polypodium virgianum\nn13173882\tcommon polypody, adder's fern, wall fern, golden maidenhair, golden polypody, sweet fern, Polypodium vulgare\nn13174354\tbear's-paw fern, Aglaomorpha meyeniana\nn13174670\tstrap fern\nn13174823\tFlorida strap fern, cow-tongue fern, hart's-tongue fern\nn13175682\tbasket fern, Drynaria rigidula\nn13176363\tsnake polypody, Microgramma-piloselloides\nn13176714\tclimbing bird's nest fern, Microsorium punctatum\nn13177048\tgolden polypody, serpent fern, rabbit's-foot fern, Phlebodium aureum, Polypodium 
aureum\nn13177529\tstaghorn fern\nn13177768\tSouth American staghorn, Platycerium andinum\nn13177884\tcommon staghorn fern, elkhorn fern, Platycerium bifurcatum, Platycerium alcicorne\nn13178284\tfelt fern, tongue fern, Pyrrosia lingua, Cyclophorus lingua\nn13178707\tpotato fern, Solanopteris bifrons\nn13179056\tmyrmecophyte\nn13179804\tgrass fern, ribbon fern, Vittaria lineata\nn13180534\tspleenwort\nn13180875\tblack spleenwort, Asplenium adiantum-nigrum\nn13181055\tbird's nest fern, Asplenium nidus\nn13181244\tebony spleenwort, Scott's Spleenwort, Asplenium platyneuron\nn13181406\tblack-stem spleenwort, black-stemmed spleenwort, little ebony spleenwort\nn13181811\twalking fern, walking leaf, Asplenium rhizophyllum, Camptosorus rhizophyllus\nn13182164\tgreen spleenwort, Asplenium viride\nn13182338\tmountain spleenwort, Asplenium montanum\nn13182799\tlobed spleenwort, Asplenium pinnatifidum\nn13182937\tlanceolate spleenwort, Asplenium billotii\nn13183056\thart's-tongue, hart's-tongue fern, Asplenium scolopendrium, Phyllitis scolopendrium\nn13183489\tscale fern, scaly fern, Asplenium ceterach, Ceterach officinarum\nn13184394\tscolopendrium\nn13185269\tdeer fern, Blechnum spicant\nn13185658\tdoodia, rasp fern\nn13186388\tchain fern\nn13186546\tVirginia chain fern, Woodwardia virginica\nn13187367\tsilver tree fern, sago fern, black tree fern, Cyathea medullaris\nn13188096\tdavallia\nn13188268\thare's-foot fern\nn13188462\tCanary Island hare's foot fern, Davallia canariensis\nn13188767\tsquirrel's-foot fern, ball fern, Davalia bullata, Davalia bullata mariesii, Davallia Mariesii\nn13190060\tbracken, Pteridium esculentum\nn13190747\tsoft tree fern, Dicksonia antarctica\nn13191148\tScythian lamb, Cibotium barometz\nn13191620\tfalse bracken, Culcita dubia\nn13191884\tthyrsopteris, Thyrsopteris elegans\nn13192625\tshield fern, buckler fern\nn13193143\tbroad buckler-fern, Dryopteris dilatata\nn13193269\tfragrant cliff fern, fragrant shield fern, fragrant wood fern, 
Dryopteris fragrans\nn13193466\tGoldie's fern, Goldie's shield fern, goldie's wood fern, Dryopteris goldiana\nn13193642\twood fern, wood-fern, woodfern\nn13193856\tmale fern, Dryopteris filix-mas\nn13194036\tmarginal wood fern, evergreen wood fern, leatherleaf wood fern, Dryopteris marginalis\nn13194212\tmountain male fern, Dryopteris oreades\nn13194572\tlady fern, Athyrium filix-femina\nn13194758\tAlpine lady fern, Athyrium distentifolium\nn13194918\tsilvery spleenwort, glade fern, narrow-leaved spleenwort, Athyrium pycnocarpon, Diplazium pycnocarpon\nn13195341\tholly fern, Cyrtomium aculeatum, Polystichum aculeatum\nn13195761\tbladder fern\nn13196003\tbrittle bladder fern, brittle fern, fragile fern, Cystopteris fragilis\nn13196234\tmountain bladder fern, Cystopteris montana\nn13196369\tbulblet fern, bulblet bladder fern, berry fern, Cystopteris bulbifera\nn13196738\tsilvery spleenwort, Deparia acrostichoides, Athyrium thelypteroides\nn13197274\toak fern, Gymnocarpium dryopteris, Thelypteris dryopteris\nn13197507\tlimestone fern, northern oak fern, Gymnocarpium robertianum\nn13198054\tostrich fern, shuttlecock fern, fiddlehead, Matteuccia struthiopteris, Pteretis struthiopteris, Onoclea struthiopteris\nn13198482\thart's-tongue, hart's-tongue fern, Olfersia cervina, Polybotrya cervina, Polybotria cervina\nn13198914\tsensitive fern, bead fern, Onoclea sensibilis\nn13199717\tChristmas fern, canker brake, dagger fern, evergreen wood fern, Polystichum acrostichoides\nn13199970\tholly fern\nn13200193\tBraun's holly fern, prickly shield fern, Polystichum braunii\nn13200542\twestern holly fern, Polystichum scopulinum\nn13200651\tsoft shield fern, Polystichum setiferum\nn13200986\tleather fern, leatherleaf fern, ten-day fern, Rumohra adiantiformis, Polystichum adiantiformis\nn13201423\tbutton fern, Tectaria cicutaria\nn13201566\tIndian button fern, Tectaria macrodonta\nn13201969\twoodsia\nn13202125\trusty woodsia, fragrant woodsia, oblong woodsia, Woodsia 
ilvensis\nn13202355\tAlpine woodsia, northern woodsia, flower-cup fern, Woodsia alpina\nn13202602\tsmooth woodsia, Woodsia glabella\nn13205058\tBoston fern, Nephrolepis exaltata, Nephrolepis exaltata bostoniensis\nn13205249\tbasket fern, toothed sword fern, Nephrolepis pectinata\nn13206178\tgolden fern, leather fern, Acrostichum aureum\nn13206817\tmaidenhair, maidenhair fern\nn13207094\tcommon maidenhair, Venushair, Venus'-hair fern, southern maidenhair, Venus maidenhair, Adiantum capillus-veneris\nn13207335\tAmerican maidenhair fern, five-fingered maidenhair fern, Adiantum pedatum\nn13207572\tBermuda maidenhair, Bermuda maidenhair fern, Adiantum bellum\nn13207736\tbrittle maidenhair, brittle maidenhair fern, Adiantum tenerum\nn13207923\tFarley maidenhair, Farley maidenhair fern, Barbados maidenhair, glory fern, Adiantum tenerum farleyense\nn13208302\tannual fern, Jersey fern, Anogramma leptophylla\nn13208705\tlip fern, lipfern\nn13208965\tsmooth lip fern, Alabama lip fern, Cheilanthes alabamensis\nn13209129\tlace fern, Cheilanthes gracillima\nn13209270\twooly lip fern, hairy lip fern, Cheilanthes lanosa\nn13209460\tsouthwestern lip fern, Cheilanthes eatonii\nn13209808\tbamboo fern, Coniogramme japonica\nn13210350\tAmerican rock brake, American parsley fern, Cryptogramma acrostichoides\nn13210597\tEuropean parsley fern, mountain parsley fern, Cryptogramma crispa\nn13211020\thand fern, Doryopteris pedata\nn13211790\tcliff brake, cliff-brake, rock brake\nn13212025\tcoffee fern, Pellaea andromedifolia\nn13212175\tpurple rock brake, Pellaea atropurpurea\nn13212379\tbird's-foot fern, Pellaea mucronata, Pellaea ornithopus\nn13212559\tbutton fern, Pellaea rotundifolia\nn13213066\tsilver fern, Pityrogramma argentea\nn13213397\tgolden fern, Pityrogramma calomelanos aureoflava\nn13213577\tgold fern, Pityrogramma chrysophylla\nn13214217\tPteris cretica\nn13214340\tspider brake, spider fern, Pteris multifida\nn13214485\tribbon fern, spider fern, Pteris 
serrulata\nn13215258\tpotato fern, Marattia salicina\nn13215586\tangiopteris, giant fern, Angiopteris evecta\nn13217005\tskeleton fork fern, Psilotum nudum\nn13219422\thorsetail\nn13219833\tcommon horsetail, field horsetail, Equisetum arvense\nn13219976\tswamp horsetail, water horsetail, Equisetum fluviatile\nn13220122\tscouring rush, rough horsetail, Equisetum hyemale, Equisetum hyemale robustum, Equisetum robustum\nn13220355\tmarsh horsetail, Equisetum palustre\nn13220525\twood horsetail, Equisetum Sylvaticum\nn13220663\tvariegated horsetail, variegated scouring rush, Equisetum variegatum\nn13221529\tclub moss, club-moss, lycopod\nn13222877\tshining clubmoss, Lycopodium lucidulum\nn13222985\talpine clubmoss, Lycopodium alpinum\nn13223090\tfir clubmoss, mountain clubmoss, little clubmoss, Lycopodium selago\nn13223588\tground cedar, staghorn moss, Lycopodium complanatum\nn13223710\tground fir, princess pine, tree clubmoss, Lycopodium obscurum\nn13223843\tfoxtail grass, Lycopodium alopecuroides\nn13224673\tspikemoss, spike moss, little club moss\nn13224922\tmeadow spikemoss, basket spikemoss, Selaginella apoda\nn13225244\tdesert selaginella, Selaginella eremophila\nn13225365\tresurrection plant, rose of Jericho, Selaginella lepidophylla\nn13225617\tflorida selaginella, Selaginella eatonii\nn13226320\tquillwort\nn13226871\tearthtongue, earth-tongue\nn13228017\tsnuffbox fern, meadow fern, Thelypteris palustris pubescens, Dryopteris thelypteris pubescens\nn13228536\tchristella\nn13229543\tmountain fern, Oreopteris limbosperma, Dryopteris oreopteris\nn13229951\tNew York fern, Parathelypteris novae-boracensis, Dryopteris noveboracensis\nn13230190\tMassachusetts fern, Parathelypteris simulata, Thelypteris simulata\nn13230662\tbeech fern\nn13230843\tbroad beech fern, southern beech fern, Phegopteris hexagonoptera, Dryopteris hexagonoptera, Thelypteris hexagonoptera\nn13231078\tlong beech fern, narrow beech fern, northern beech fern, Phegopteris connectilis, Dryopteris 
phegopteris, Thelypteris phegopteris\nn13231678\tshoestring fungus\nn13231919\tArmillaria caligata, booted armillaria\nn13232106\tArmillaria ponderosa, white matsutake\nn13232363\tArmillaria zelleri\nn13232779\thoney mushroom, honey fungus, Armillariella mellea\nn13233727\tmilkweed, silkweed\nn13234114\twhite milkweed, Asclepias albicans\nn13234519\tpoke milkweed, Asclepias exaltata\nn13234678\tswamp milkweed, Asclepias incarnata\nn13234857\tMead's milkweed, Asclepias meadii, Asclepia meadii\nn13235011\tpurple silkweed, Asclepias purpurascens\nn13235159\tshowy milkweed, Asclepias speciosa\nn13235319\tpoison milkweed, horsetail milkweed, Asclepias subverticillata\nn13235503\tbutterfly weed, orange milkweed, chigger flower, chiggerflower, pleurisy root, tuber root, Indian paintbrush, Asclepias tuberosa\nn13235766\twhorled milkweed, Asclepias verticillata\nn13236100\tcruel plant, Araujia sericofera\nn13237188\twax plant, Hoya carnosa\nn13237508\tsilk vine, Periploca graeca\nn13238375\tstapelia, carrion flower, starfish flower\nn13238654\tStapelias asterias\nn13238988\tstephanotis\nn13239177\tMadagascar jasmine, waxflower, Stephanotis floribunda\nn13239736\tnegro vine, Vincetoxicum hirsutum, Vincetoxicum negrum\nn13239921\tzygospore\nn13240362\ttree of knowledge\nn13252672\torangery\nn13354021\tpocketbook\nn13555775\tshit, dump\nn13579829\tcordage\nn13650447\tyard, pace\nn13653902\textremum, peak\nn13862407\tleaf shape, leaf form\nn13862552\tequilateral\nn13862780\tfigure\nn13863020\tpencil\nn13863186\tplane figure, two-dimensional figure\nn13863473\tsolid figure, three-dimensional figure\nn13863771\tline\nn13864035\tbulb\nn13864153\tconvex shape, convexity\nn13864965\tconcave shape, concavity, incurvation, incurvature\nn13865298\tcylinder\nn13865483\tround shape\nn13865904\theart\nn13866144\tpolygon, polygonal shape\nn13866626\tconvex polygon\nn13866827\tconcave polygon\nn13867005\treentrant polygon, reentering polygon\nn13867492\tamorphous shape\nn13868248\tclosed 
curve\nn13868371\tsimple closed curve, Jordan curve\nn13868515\tS-shape\nn13868944\twave, undulation\nn13869045\textrados\nn13869547\thook, crotchet\nn13869788\tenvelope\nn13869896\tbight\nn13871717\tdiameter\nn13872592\tcone, conoid, cone shape\nn13872822\tfunnel, funnel shape\nn13873361\toblong\nn13873502\tcircle\nn13873917\tcircle\nn13874073\tequator\nn13874558\tscallop, crenation, crenature, crenel, crenelle\nn13875392\tring, halo, annulus, doughnut, anchor ring\nn13875571\tloop\nn13875884\tbight\nn13876561\thelix, spiral\nn13877547\telement of a cone\nn13877667\telement of a cylinder\nn13878306\tellipse, oval\nn13879049\tquadrate\nn13879320\ttriangle, trigon, trilateral\nn13879816\tacute triangle, acute-angled triangle\nn13880199\tisosceles triangle\nn13880415\tobtuse triangle, obtuse-angled triangle\nn13880551\tright triangle, right-angled triangle\nn13880704\tscalene triangle\nn13880994\tparallel\nn13881512\ttrapezoid\nn13881644\tstar\nn13882201\tpentagon\nn13882276\thexagon\nn13882487\theptagon\nn13882563\toctagon\nn13882639\tnonagon\nn13882713\tdecagon\nn13882961\trhombus, rhomb, diamond\nn13883603\tspherical polygon\nn13883763\tspherical triangle\nn13884261\tconvex polyhedron\nn13884384\tconcave polyhedron\nn13884930\tcuboid\nn13885011\tquadrangular prism\nn13886260\tbell, bell shape, campana\nn13888491\tangular distance\nn13889066\ttrue anomaly\nn13889331\tspherical angle\nn13891547\tangle of refraction\nn13891937\tacute angle\nn13893786\tgroove, channel\nn13894154\trut\nn13894434\tbulge, bump, hump, swelling, gibbosity, gibbousness, jut, prominence, protuberance, protrusion, extrusion, excrescence\nn13895262\tbelly\nn13896100\tbow, arc\nn13896217\tcrescent\nn13897198\tellipsoid\nn13897528\thypotenuse\nn13897996\tbalance, equilibrium, equipoise, counterbalance\nn13898207\tconformation\nn13898315\tsymmetry, proportion\nn13898645\tspheroid, ellipsoid of revolution\nn13899735\tspherule\nn13900287\ttoroid\nn13900422\tcolumn, tower, pillar\nn13901211\tbarrel, 
drum\nn13901321\tpipe, tube\nn13901423\tpellet\nn13901490\tbolus\nn13901858\tdewdrop\nn13902048\tridge\nn13902336\trim\nn13902793\ttaper\nn13903079\tboundary, edge, bound\nn13905121\tincisure, incisura\nn13905275\tnotch\nn13905792\twrinkle, furrow, crease, crinkle, seam, line\nn13906484\tdermatoglyphic\nn13906669\tfrown line\nn13906767\tline of life, life line, lifeline\nn13906936\tline of heart, heart line, love line, mensal line\nn13907272\tcrevice, cranny, crack, fissure, chap\nn13908201\tcleft\nn13908580\troulette, line roulette\nn13911045\tnode\nn13912260\ttree, tree diagram\nn13912540\tstemma\nn13914141\tbrachium\nn13914265\tfork, crotch\nn13914608\tblock, cube\nn13915023\tovoid\nn13915113\ttetrahedron\nn13915209\tpentahedron\nn13915305\thexahedron\nn13915999\tregular polyhedron, regular convex solid, regular convex polyhedron, Platonic body, Platonic solid, ideal solid\nn13916363\tpolyhedral angle\nn13916721\tcube, regular hexahedron\nn13917690\ttruncated pyramid\nn13917785\ttruncated cone\nn13918274\ttail, tail end\nn13918387\ttongue, knife\nn13918717\ttrapezohedron\nn13919547\twedge, wedge shape, cuneus\nn13919919\tkeel\nn13926786\tplace, shoes\nn14131950\therpes\nn14175579\tchlamydia\nn14564779\twall\nn14582716\tmicronutrient\nn14583400\tchyme\nn14585392\tragweed pollen\nn14592309\tpina cloth\nn14603798\tchlorobenzylidenemalononitrile, CS gas\nn14633206\tcarbon, C, atomic number 6\nn14685296\tcharcoal, wood coal\nn14696793\trock, stone\nn14698884\tgravel, crushed rock\nn14714645\taflatoxin\nn14720833\talpha-tocopheral\nn14765422\tleopard\nn14785065\tbricks and mortar\nn14786943\tlagging\nn14804958\thydraulic cement, Portland cement\nn14810561\tcholine\nn14820180\tconcrete\nn14821852\tglass wool\nn14844693\tsoil, dirt\nn14853210\thigh explosive\nn14858292\tlitter\nn14867545\tfish meal\nn14891255\tGreek fire\nn14899328\tculture medium, medium\nn14900184\tagar, nutrient agar\nn14900342\tblood agar\nn14908027\thip tile, hipped tile\nn14909584\thyacinth, 
jacinth\nn14914945\thydroxide ion, hydroxyl ion\nn14915184\tice, water ice\nn14919819\tinositol\nn14938389\tlinoleum, lino\nn14941787\tlithia water\nn14942411\tlodestone, loadstone\nn14973585\tpantothenic acid, pantothen\nn14974264\tpaper\nn14975598\tpapyrus\nn14976759\tpantile\nn14976871\tblacktop, blacktopping\nn14977188\ttarmacadam, tarmac\nn14977504\tpaving, pavement, paving material\nn14992287\tplaster\nn14993378\tpoison gas\nn15005577\tridge tile\nn15006012\troughcast\nn15019030\tsand\nn15048888\tspackle, spackling compound\nn15060326\trender\nn15060688\twattle and daub\nn15062057\tstucco\nn15067877\ttear gas, teargas, lacrimator, lachrymator\nn15075141\ttoilet tissue, toilet paper, bathroom tissue\nn15086247\tlinseed, flaxseed\nn15089258\tvitamin\nn15089472\tfat-soluble vitamin\nn15089645\twater-soluble vitamin\nn15089803\tvitamin A, antiophthalmic factor, axerophthol, A\nn15090065\tvitamin A1, retinol\nn15090238\tvitamin A2, dehydroretinol\nn15090742\tB-complex vitamin, B complex, vitamin B complex, vitamin B, B vitamin, B\nn15091129\tvitamin B1, thiamine, thiamin, aneurin, antiberiberi factor\nn15091304\tvitamin B12, cobalamin, cyanocobalamin, antipernicious anemia factor\nn15091473\tvitamin B2, vitamin G, riboflavin, lactoflavin, ovoflavin, hepatoflavin\nn15091669\tvitamin B6, pyridoxine, pyridoxal, pyridoxamine, adermin\nn15091846\tvitamin Bc, vitamin M, folate, folic acid, folacin, pteroylglutamic acid, pteroylmonoglutamic acid\nn15092059\tniacin, nicotinic acid\nn15092227\tvitamin D, calciferol, viosterol, ergocalciferol, cholecalciferol, D\nn15092409\tvitamin E, tocopherol, E\nn15092650\tbiotin, vitamin H\nn15092751\tvitamin K, naphthoquinone, antihemorrhagic factor\nn15092942\tvitamin K1, phylloquinone, phytonadione\nn15093049\tvitamin K3, menadione\nn15093137\tvitamin P, bioflavinoid, citrin\nn15093298\tvitamin C, C, ascorbic acid\nn15102359\tplanking\nn15102455\tchipboard, hardboard\nn15102894\tknothole\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/preprocess_imagenet_validation_data.py",
    "content": "#!/usr/bin/python\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Process the ImageNet Challenge bounding boxes for TensorFlow model training.\n\nAssociate the ImageNet 2012 Challenge validation data set with labels.\n\nThe raw ImageNet validation data set is expected to reside in JPEG files\nlocated in the following directory structure.\n\n data_dir/ILSVRC2012_val_00000001.JPEG\n data_dir/ILSVRC2012_val_00000002.JPEG\n ...\n data_dir/ILSVRC2012_val_00050000.JPEG\n\nThis script moves the files into a directory structure like such:\n data_dir/n01440764/ILSVRC2012_val_00000293.JPEG\n data_dir/n01440764/ILSVRC2012_val_00000543.JPEG\n ...\nwhere 'n01440764' is the unique synset label associated with\nthese images.\n\nThis directory reorganization requires a mapping from validation image\nnumber (i.e. suffix of the original file) to the associated label. 
This\nis provided in the ImageNet development kit via a Matlab file.\n\nIn order to make life easier and divorce ourselves from Matlab, we instead\nsupply a custom text file that provides this mapping for us.\n\nSample usage:\n  ./preprocess_imagenet_validation_data.py ILSVRC2012_img_val \\\n  imagenet_2012_validation_synset_labels.txt\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport os.path\nimport sys\n\n\nif __name__ == '__main__':\n  if len(sys.argv) < 3:\n    print('Invalid usage\\n'\n          'usage: preprocess_imagenet_validation_data.py '\n          '<validation data dir> <validation labels file>')\n    sys.exit(-1)\n  data_dir = sys.argv[1]\n  validation_labels_file = sys.argv[2]\n\n  # Read in the 50000 synsets associated with the validation data set.\n  labels = [l.strip() for l in open(validation_labels_file).readlines()]\n  unique_labels = set(labels)\n\n  # Make all sub-directories in the validation data dir.\n  for label in unique_labels:\n    labeled_data_dir = os.path.join(data_dir, label)\n    os.makedirs(labeled_data_dir)\n\n  # Move all of the images to the appropriate sub-directory.\n  for i in xrange(len(labels)):\n    basename = 'ILSVRC2012_val_000%.5d.JPEG' % (i + 1)\n    original_filename = os.path.join(data_dir, basename)\n    if not os.path.exists(original_filename):\n      print('Failed to find: %s' % original_filename)\n      sys.exit(-1)\n    new_filename = os.path.join(data_dir, labels[i], basename)\n    os.rename(original_filename, new_filename)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/data/process_bounding_boxes.py",
    "content": "#!/usr/bin/python\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Process the ImageNet Challenge bounding boxes for TensorFlow model training.\n\nThis script is called as\n\nprocess_bounding_boxes.py <dir> [synsets-file]\n\nWhere <dir> is a directory containing the downloaded and unpacked bounding box\ndata. If [synsets-file] is supplied, then only the bounding boxes whose\nsynsets are contained within this file are returned. Note that the\n[synsets-file] file contains synset ids, one per line.\n\nThe script dumps out a CSV text file in which each line contains an entry.\n  n00007846_64193.JPEG,0.0060,0.2620,0.7545,0.9940\n\nThe entry can be read as:\n  <JPEG file name>, <xmin>, <ymin>, <xmax>, <ymax>\n\nThe bounding box for <JPEG file name> contains two points (xmin, ymin) and\n(xmax, ymax) specifying the lower-left corner and upper-right corner of a\nbounding box in *relative* coordinates.\n\nThe user supplies a directory where the XML files reside. The directory\nstructure in the directory <dir> is assumed to look like this:\n\n<dir>/nXXXXXXXX/nXXXXXXXX_YYYY.xml\n\nEach XML file contains a bounding box annotation. 
The script:\n\n (1) Parses the XML file and extracts the filename, label and bounding box info.\n\n (2) The bounding box is specified in the XML files as integer (xmin, ymin) and\n    (xmax, ymax) *relative* to image size displayed to the human annotator. The\n    size of the image displayed to the human annotator is stored in the XML file\n    as integer (height, width).\n\n    Note that the displayed size will differ from the actual size of the image\n    downloaded from image-net.org. To make the bounding box annotation useable,\n    we convert bounding box to floating point numbers relative to displayed\n    height and width of the image.\n\n    Note that each XML file might contain N bounding box annotations.\n\n    Note that the points are all clamped at a range of [0.0, 1.0] because some\n    human annotations extend outside the range of the supplied image.\n\n    See details here: http://image-net.org/download-bboxes\n\n(3) By default, the script outputs all valid bounding boxes. If a\n    [synsets-file] is supplied, only the subset of bounding boxes associated\n    with those synsets are outputted. 
Importantly, one can supply a list of\n    synsets in the ImageNet Challenge and output the list of bounding boxes\n    associated with the training images of the ILSVRC.\n\n    We use these bounding boxes to inform the random distortion of images\n    supplied to the network.\n\nIf you run this script successfully, you will see the following output\nto stderr:\n> Finished processing 544546 XML files.\n> Skipped 0 XML files not in ImageNet Challenge.\n> Skipped 0 bounding boxes not in ImageNet Challenge.\n> Wrote 615299 bounding boxes from 544546 annotated images.\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport glob\nimport os.path\nimport sys\nimport xml.etree.ElementTree as ET\n\n\nclass BoundingBox(object):\n  pass\n\n\ndef GetItem(name, root, index=0):\n  count = 0\n  for item in root.iter(name):\n    if count == index:\n      return item.text\n    count += 1\n  # Failed to find \"index\" occurrence of item.\n  return -1\n\n\ndef GetInt(name, root, index=0):\n  return int(GetItem(name, root, index))\n\n\ndef FindNumberBoundingBoxes(root):\n  index = 0\n  while True:\n    if GetInt('xmin', root, index) == -1:\n      break\n    index += 1\n  return index\n\n\ndef ProcessXMLAnnotation(xml_file):\n  \"\"\"Process a single XML file containing a bounding box.\"\"\"\n  # pylint: disable=broad-except\n  try:\n    tree = ET.parse(xml_file)\n  except Exception:\n    print('Failed to parse: ' + xml_file, file=sys.stderr)\n    return None\n  # pylint: enable=broad-except\n  root = tree.getroot()\n\n  num_boxes = FindNumberBoundingBoxes(root)\n  boxes = []\n\n  for index in range(num_boxes):\n    box = BoundingBox()\n    # Grab the 'index' annotation.\n    box.xmin = GetInt('xmin', root, index)\n    box.ymin = GetInt('ymin', root, index)\n    box.xmax = GetInt('xmax', root, index)\n    box.ymax = GetInt('ymax', root, index)\n\n    box.width = GetInt('width', root)\n    box.height = 
GetInt('height', root)\n    box.filename = GetItem('filename', root) + '.JPEG'\n    box.label = GetItem('name', root)\n\n    xmin = float(box.xmin) / float(box.width)\n    xmax = float(box.xmax) / float(box.width)\n    ymin = float(box.ymin) / float(box.height)\n    ymax = float(box.ymax) / float(box.height)\n\n    # Some images contain bounding box annotations that\n    # extend outside of the supplied image. See, e.g.\n    # n03127925/n03127925_147.xml\n    # Additionally, for some bounding boxes, the min > max\n    # or the box is entirely outside of the image.\n    min_x = min(xmin, xmax)\n    max_x = max(xmin, xmax)\n    box.xmin_scaled = min(max(min_x, 0.0), 1.0)\n    box.xmax_scaled = min(max(max_x, 0.0), 1.0)\n\n    min_y = min(ymin, ymax)\n    max_y = max(ymin, ymax)\n    box.ymin_scaled = min(max(min_y, 0.0), 1.0)\n    box.ymax_scaled = min(max(max_y, 0.0), 1.0)\n\n    boxes.append(box)\n\n  return boxes\n\nif __name__ == '__main__':\n  if len(sys.argv) < 2 or len(sys.argv) > 3:\n    print('Invalid usage\\n'\n          'usage: process_bounding_boxes.py <dir> [synsets-file]',\n          file=sys.stderr)\n    sys.exit(-1)\n\n  xml_files = glob.glob(sys.argv[1] + '/*/*.xml')\n  print('Identified %d XML files in %s' % (len(xml_files), sys.argv[1]),\n        file=sys.stderr)\n\n  if len(sys.argv) == 3:\n    labels = set([l.strip() for l in open(sys.argv[2]).readlines()])\n    print('Identified %d synset IDs in %s' % (len(labels), sys.argv[2]),\n          file=sys.stderr)\n  else:\n    labels = None\n\n  skipped_boxes = 0\n  skipped_files = 0\n  saved_boxes = 0\n  saved_files = 0\n  for file_index, one_file in enumerate(xml_files):\n    # Example: <...>/n06470073/n00141669_6790.xml\n    label = os.path.basename(os.path.dirname(one_file))\n\n    # Determine if the annotation is from an ImageNet Challenge label.\n    if labels is not None and label not in labels:\n      skipped_files += 1\n      continue\n\n    bboxes = ProcessXMLAnnotation(one_file)\n    assert 
bboxes is not None, 'No bounding boxes found in ' + one_file\n\n    found_box = False\n    for bbox in bboxes:\n      if labels is not None:\n        if bbox.label != label:\n          # Note: There is a slight bug in the bounding box annotation data.\n          # Many of the dog labels have the human label 'Scottish_deerhound'\n          # instead of the synset ID 'n02092002' in the bbox.label field. As a\n          # simple hack to overcome this issue, we only exclude bbox labels\n          # *which are synset ID's* that do not match original synset label for\n          # the XML file.\n          if bbox.label in labels:\n            skipped_boxes += 1\n            continue\n\n      # Guard against improperly specified boxes.\n      if (bbox.xmin_scaled >= bbox.xmax_scaled or\n          bbox.ymin_scaled >= bbox.ymax_scaled):\n        skipped_boxes += 1\n        continue\n\n      # Note bbox.filename occasionally contains '%s' in the name. This is\n      # data set noise that is fixed by just using the basename of the XML file.\n      image_filename = os.path.splitext(os.path.basename(one_file))[0]\n      print('%s.JPEG,%.4f,%.4f,%.4f,%.4f' %\n            (image_filename,\n             bbox.xmin_scaled, bbox.ymin_scaled,\n             bbox.xmax_scaled, bbox.ymax_scaled))\n\n      saved_boxes += 1\n      found_box = True\n    if found_box:\n      saved_files += 1\n    else:\n      skipped_files += 1\n\n    if not file_index % 5000:\n      print('--> processed %d of %d XML files.' %\n            (file_index + 1, len(xml_files)),\n            file=sys.stderr)\n      print('--> skipped %d boxes and %d XML files.' %\n            (skipped_boxes, skipped_files), file=sys.stderr)\n\n  print('Finished processing %d XML files.' % len(xml_files), file=sys.stderr)\n  print('Skipped %d XML files not in ImageNet Challenge.' % skipped_files,\n        file=sys.stderr)\n  print('Skipped %d bounding boxes not in ImageNet Challenge.' 
% skipped_boxes,\n        file=sys.stderr)\n  print('Wrote %d bounding boxes from %d annotated images.' %\n        (saved_boxes, saved_files),\n        file=sys.stderr)\n  print('Finished.', file=sys.stderr)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/dataset.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Small library that points to a data set.\n\nMethods of Data class:\n  data_files: Returns a python list of all (sharded) data set files.\n  num_examples_per_epoch: Returns the number of examples in the data set.\n  num_classes: Returns the number of classes in the data set.\n  reader: Return a reader for a single entry from the data set.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom abc import ABCMeta\nfrom abc import abstractmethod\nimport os\n\n\nimport tensorflow as tf\n\nFLAGS = tf.app.flags.FLAGS\n\n# Basic model parameters.\ntf.app.flags.DEFINE_string('data_dir', '/tmp/mydata',\n                           \"\"\"Path to the processed data, i.e. 
\"\"\"\n                           \"\"\"TFRecord of Example protos.\"\"\")\n\n\nclass Dataset(object):\n  \"\"\"A simple class for handling data sets.\"\"\"\n  __metaclass__ = ABCMeta\n\n  def __init__(self, name, subset):\n    \"\"\"Initialize dataset using a subset and the path to the data.\"\"\"\n    assert subset in self.available_subsets(), self.available_subsets()\n    self.name = name\n    self.subset = subset\n\n  @abstractmethod\n  def num_classes(self):\n    \"\"\"Returns the number of classes in the data set.\"\"\"\n    pass\n    # return 10\n\n  @abstractmethod\n  def num_examples_per_epoch(self):\n    \"\"\"Returns the number of examples in the data subset.\"\"\"\n    pass\n    # if self.subset == 'train':\n    #   return 10000\n    # if self.subset == 'validation':\n    #   return 1000\n\n  @abstractmethod\n  def download_message(self):\n    \"\"\"Prints a download message for the Dataset.\"\"\"\n    pass\n\n  def available_subsets(self):\n    \"\"\"Returns the list of available subsets.\"\"\"\n    return ['train', 'validation']\n\n  def data_files(self):\n    \"\"\"Returns a python list of all (sharded) data subset files.\n\n    Returns:\n      python list of all (sharded) data set files.\n    Raises:\n      ValueError: if there are not data_files matching the subset.\n    \"\"\"\n    tf_record_pattern = os.path.join(FLAGS.data_dir, '%s-*' % self.subset)\n    data_files = tf.gfile.Glob(tf_record_pattern)\n    if not data_files:\n      print('No files found for dataset %s/%s at %s' % (self.name,\n                                                        self.subset,\n                                                        FLAGS.data_dir))\n\n      self.download_message()\n      exit(-1)\n    return data_files\n\n  def reader(self):\n    \"\"\"Return a reader for a single entry from the data set.\n\n    See io_ops.py for details of Reader class.\n\n    Returns:\n      Reader object that reads the data set.\n    \"\"\"\n    return tf.TFRecordReader()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/flowers_data.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Small library that points to the flowers data set.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\n\nfrom inception.dataset import Dataset\n\n\nclass FlowersData(Dataset):\n  \"\"\"Flowers data set.\"\"\"\n\n  def __init__(self, subset):\n    super(FlowersData, self).__init__('Flowers', subset)\n\n  def num_classes(self):\n    \"\"\"Returns the number of classes in the data set.\"\"\"\n    return 5\n\n  def num_examples_per_epoch(self):\n    \"\"\"Returns the number of examples in the data subset.\"\"\"\n    if self.subset == 'train':\n      return 3170\n    if self.subset == 'validation':\n      return 500\n\n  def download_message(self):\n    \"\"\"Instruction to download and extract the tarball from Flowers website.\"\"\"\n\n    print('Failed to find any Flowers %s files'% self.subset)\n    print('')\n    print('If you have already downloaded and processed the data, then make '\n          'sure to set --data_dir to point to the directory containing the '\n          'location of the sharded TFRecords.\\n')\n    print('Please see README.md for instructions on how to build '\n          'the flowers dataset using download_and_preprocess_flowers.\\n')\n"
  },
  {
    "path": "model_zoo/models/inception/inception/flowers_eval.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A binary to evaluate Inception on the flowers data set.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom inception import inception_eval\nfrom inception.flowers_data import FlowersData\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef main(unused_argv=None):\n  dataset = FlowersData(subset=FLAGS.subset)\n  assert dataset.data_files()\n  if tf.gfile.Exists(FLAGS.eval_dir):\n    tf.gfile.DeleteRecursively(FLAGS.eval_dir)\n  tf.gfile.MakeDirs(FLAGS.eval_dir)\n  inception_eval.evaluate(dataset)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/flowers_train.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A binary to train Inception on the flowers data set.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\n\nimport tensorflow as tf\n\nfrom inception import inception_train\nfrom inception.flowers_data import FlowersData\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef main(_):\n  dataset = FlowersData(subset=FLAGS.subset)\n  assert dataset.data_files()\n  if tf.gfile.Exists(FLAGS.train_dir):\n    tf.gfile.DeleteRecursively(FLAGS.train_dir)\n  tf.gfile.MakeDirs(FLAGS.train_dir)\n  inception_train.train(dataset)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/image_processing.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Read and preprocess image data.\n\n Image processing occurs on a single image at a time. Images are read and\n preprocessed in parallel across multiple threads. The resulting images\n are concatenated together to form a single batch for training or evaluation.\n\n -- Provide processed image data for a network:\n inputs: Construct batches of evaluation examples of images.\n distorted_inputs: Construct batches of training examples of images.\n batch_inputs: Construct batches of training or evaluation examples of images.\n\n -- Data processing:\n parse_example_proto: Parses an Example proto containing a training example\n   of an image.\n\n -- Image decoding:\n decode_jpeg: Decode a JPEG encoded string into a 3-D float32 Tensor.\n\n -- Image preprocessing:\n image_preprocessing: Decode and preprocess one image for evaluation or training\n distort_image: Distort one image for training a network.\n eval_image: Prepare one image for evaluation.\n distort_color: Distort the color in one image for training.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nFLAGS = tf.app.flags.FLAGS\n\ntf.app.flags.DEFINE_integer('batch_size', 32,\n                            \"\"\"Number 
of images to process in a batch.\"\"\")\ntf.app.flags.DEFINE_integer('image_size', 299,\n                            \"\"\"Provide square images of this size.\"\"\")\ntf.app.flags.DEFINE_integer('num_preprocess_threads', 4,\n                            \"\"\"Number of preprocessing threads per tower. \"\"\"\n                            \"\"\"Please make this a multiple of 4.\"\"\")\ntf.app.flags.DEFINE_integer('num_readers', 4,\n                            \"\"\"Number of parallel readers during train.\"\"\")\n\n# Images are preprocessed asynchronously using multiple threads specified by\n# --num_preprocess_threads and the resulting processed images are stored in a\n# random shuffling queue. The shuffling queue dequeues --batch_size images\n# for processing on a given Inception tower. A larger shuffling queue guarantees\n# better mixing across examples within a batch and results in slightly higher\n# predictive performance in a trained model. Empirically,\n# --input_queue_memory_factor=16 works well. A value of 16 implies a queue size\n# of 1024*16 images. Assuming RGB 299x299 images, this implies a queue size of\n# 16GB. If the machine is memory limited, then decrease this factor to\n# decrease the CPU memory footprint, accordingly.\ntf.app.flags.DEFINE_integer('input_queue_memory_factor', 16,\n                            \"\"\"Size of the queue of preprocessed images. \"\"\"\n                            \"\"\"Default is ideal but try smaller values, e.g. \"\"\"\n                            \"\"\"4, 2 or 1, if host memory is constrained. 
See \"\"\"\n                            \"\"\"comments in code for more details.\"\"\")\n\n\ndef inputs(dataset, batch_size=None, num_preprocess_threads=None):\n  \"\"\"Generate batches of ImageNet images for evaluation.\n\n  Use this function as the inputs for evaluating a network.\n\n  Note that some (minimal) image preprocessing occurs during evaluation\n  including central cropping and resizing of the image to fit the network.\n\n  Args:\n    dataset: instance of Dataset class specifying the dataset.\n    batch_size: integer, number of examples in batch\n    num_preprocess_threads: integer, total number of preprocessing threads;\n      if None, defaults to FLAGS.num_preprocess_threads.\n\n  Returns:\n    images: Images. 4D tensor of size [batch_size, FLAGS.image_size,\n                                       FLAGS.image_size, 3].\n    labels: 1-D integer Tensor of [FLAGS.batch_size].\n  \"\"\"\n  if not batch_size:\n    batch_size = FLAGS.batch_size\n\n  # Force all input processing onto CPU in order to reserve the GPU for\n  # the forward inference and back-propagation.\n  with tf.device('/cpu:0'):\n    images, labels = batch_inputs(\n        dataset, batch_size, train=False,\n        num_preprocess_threads=num_preprocess_threads,\n        num_readers=1)\n\n  return images, labels\n\n\ndef distorted_inputs(dataset, batch_size=None, num_preprocess_threads=None):\n  \"\"\"Generate batches of distorted versions of ImageNet images.\n\n  Use this function as the inputs for training a network.\n\n  Distorting images provides a useful technique for augmenting the data\n  set during training in order to make the network invariant to aspects\n  of the image that do not affect the label.\n\n  Args:\n    dataset: instance of Dataset class specifying the dataset.\n    batch_size: integer, number of examples in batch\n    num_preprocess_threads: integer, total number of preprocessing threads;\n      if None, defaults to FLAGS.num_preprocess_threads.\n\n  Returns:\n    images: 
Images. 4D tensor of size [batch_size, FLAGS.image_size,\n                                       FLAGS.image_size, 3].\n    labels: 1-D integer Tensor of [batch_size].\n  \"\"\"\n  if not batch_size:\n    batch_size = FLAGS.batch_size\n\n  # Force all input processing onto CPU in order to reserve the GPU for\n  # the forward inference and back-propagation.\n  with tf.device('/cpu:0'):\n    images, labels = batch_inputs(\n        dataset, batch_size, train=True,\n        num_preprocess_threads=num_preprocess_threads,\n        num_readers=FLAGS.num_readers)\n  return images, labels\n\n\ndef decode_jpeg(image_buffer, scope=None):\n  \"\"\"Decode a JPEG string into one 3-D float image Tensor.\n\n  Args:\n    image_buffer: scalar string Tensor.\n    scope: Optional scope for op_scope.\n  Returns:\n    3-D float Tensor with values ranging from [0, 1).\n  \"\"\"\n  with tf.op_scope([image_buffer], scope, 'decode_jpeg'):\n    # Decode the string as an RGB JPEG.\n    # Note that the resulting image contains an unknown height and width\n    # that is set dynamically by decode_jpeg. In other words, the height\n    # and width of image is unknown at compile-time.\n    image = tf.image.decode_jpeg(image_buffer, channels=3)\n\n    # After this point, all image pixels reside in [0,1)\n    # until the very end, when they're rescaled to (-1, 1).  The various\n    # adjust_* ops all require this range for dtype float.\n    image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n    return image\n\n\ndef distort_color(image, thread_id=0, scope=None):\n  \"\"\"Distort the color of the image.\n\n  Each color distortion is non-commutative and thus ordering of the color ops\n  matters. 
Ideally we would randomly permute the ordering of the color ops.\n  Rather than adding that level of complication, we select a distinct ordering\n  of color ops for each preprocessing thread.\n\n  Args:\n    image: Tensor containing single image.\n    thread_id: preprocessing thread ID.\n    scope: Optional scope for op_scope.\n  Returns:\n    color-distorted image\n  \"\"\"\n  with tf.op_scope([image], scope, 'distort_color'):\n    color_ordering = thread_id % 2\n\n    if color_ordering == 0:\n      image = tf.image.random_brightness(image, max_delta=32. / 255.)\n      image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n      image = tf.image.random_hue(image, max_delta=0.2)\n      image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n    elif color_ordering == 1:\n      image = tf.image.random_brightness(image, max_delta=32. / 255.)\n      image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n      image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n      image = tf.image.random_hue(image, max_delta=0.2)\n\n    # The random_* ops do not necessarily clamp.\n    image = tf.clip_by_value(image, 0.0, 1.0)\n    return image\n\n\ndef distort_image(image, height, width, bbox, thread_id=0, scope=None):\n  \"\"\"Distort one image for training a network.\n\n  Distorting images provides a useful technique for augmenting the data\n  set during training in order to make the network invariant to aspects\n  of the image that do not affect the label.\n\n  Args:\n    image: 3-D float Tensor of image\n    height: integer\n    width: integer\n    bbox: 3-D float Tensor of bounding boxes arranged [1, num_boxes, coords]\n      where each coordinate is [0, 1) and the coordinates are arranged\n      as [ymin, xmin, ymax, xmax].\n    thread_id: integer indicating the preprocessing thread.\n    scope: Optional scope for op_scope.\n  Returns:\n    3-D float Tensor of distorted image used for training.\n  \"\"\"\n  with tf.op_scope([image, 
height, width, bbox], scope, 'distort_image'):\n    # Each bounding box has shape [1, num_boxes, box coords] and\n    # the coordinates are ordered [ymin, xmin, ymax, xmax].\n\n    # Display the bounding box in the first thread only.\n    if not thread_id:\n      image_with_box = tf.image.draw_bounding_boxes(tf.expand_dims(image, 0),\n                                                    bbox)\n      tf.image_summary('image_with_bounding_boxes', image_with_box)\n\n  # A large fraction of image datasets contain a human-annotated bounding\n  # box delineating the region of the image containing the object of interest.\n  # We choose to create a new bounding box for the object which is a randomly\n  # distorted version of the human-annotated bounding box that obeys an allowed\n  # range of aspect ratios, sizes and overlap with the human-annotated\n  # bounding box. If no box is supplied, then we assume the bounding box is\n  # the entire image.\n    sample_distorted_bounding_box = tf.image.sample_distorted_bounding_box(\n        tf.shape(image),\n        bounding_boxes=bbox,\n        min_object_covered=0.1,\n        aspect_ratio_range=[0.75, 1.33],\n        area_range=[0.05, 1.0],\n        max_attempts=100,\n        use_image_if_no_bounding_boxes=True)\n    bbox_begin, bbox_size, distort_bbox = sample_distorted_bounding_box\n    if not thread_id:\n      image_with_distorted_box = tf.image.draw_bounding_boxes(\n          tf.expand_dims(image, 0), distort_bbox)\n      tf.image_summary('images_with_distorted_bounding_box',\n                       image_with_distorted_box)\n\n    # Crop the image to the specified bounding box.\n    distorted_image = tf.slice(image, bbox_begin, bbox_size)\n\n    # This resizing operation may distort the images because the aspect\n    # ratio is not respected. 
We select a resize method in a round robin\n    # fashion based on the thread number.\n    # Note that ResizeMethod contains 4 enumerated resizing methods.\n    resize_method = thread_id % 4\n    distorted_image = tf.image.resize_images(distorted_image, [height, width],\n                                             method=resize_method)\n    # Restore the shape since the dynamic slice based upon the bbox_size loses\n    # the third dimension.\n    distorted_image.set_shape([height, width, 3])\n    if not thread_id:\n      tf.image_summary('cropped_resized_image',\n                       tf.expand_dims(distorted_image, 0))\n\n    # Randomly flip the image horizontally.\n    distorted_image = tf.image.random_flip_left_right(distorted_image)\n\n    # Randomly distort the colors.\n    distorted_image = distort_color(distorted_image, thread_id)\n\n    if not thread_id:\n      tf.image_summary('final_distorted_image',\n                       tf.expand_dims(distorted_image, 0))\n    return distorted_image\n\n\ndef eval_image(image, height, width, scope=None):\n  \"\"\"Prepare one image for evaluation.\n\n  Args:\n    image: 3-D float Tensor\n    height: integer\n    width: integer\n    scope: Optional scope for op_scope.\n  Returns:\n    3-D float Tensor of prepared image.\n  \"\"\"\n  with tf.op_scope([image, height, width], scope, 'eval_image'):\n    # Crop the central region of the image with an area containing 87.5% of\n    # the original image.\n    image = tf.image.central_crop(image, central_fraction=0.875)\n\n    # Resize the image to the original height and width.\n    image = tf.expand_dims(image, 0)\n    image = tf.image.resize_bilinear(image, [height, width],\n                                     align_corners=False)\n    image = tf.squeeze(image, [0])\n    return image\n\n\ndef image_preprocessing(image_buffer, bbox, train, thread_id=0):\n  \"\"\"Decode and preprocess one image for evaluation or training.\n\n  Args:\n    image_buffer: JPEG encoded string 
Tensor\n    bbox: 3-D float Tensor of bounding boxes arranged [1, num_boxes, coords]\n      where each coordinate is [0, 1) and the coordinates are arranged as\n      [ymin, xmin, ymax, xmax].\n    train: boolean\n    thread_id: integer indicating preprocessing thread\n\n  Returns:\n    3-D float Tensor containing an appropriately scaled image\n\n  Raises:\n    ValueError: if user does not provide bounding box\n  \"\"\"\n  if bbox is None:\n    raise ValueError('Please supply a bounding box.')\n\n  image = decode_jpeg(image_buffer)\n  height = FLAGS.image_size\n  width = FLAGS.image_size\n\n  if train:\n    image = distort_image(image, height, width, bbox, thread_id)\n  else:\n    image = eval_image(image, height, width)\n\n  # Finally, rescale to [-1,1] instead of [0, 1)\n  image = tf.sub(image, 0.5)\n  image = tf.mul(image, 2.0)\n  return image\n\n\ndef parse_example_proto(example_serialized):\n  \"\"\"Parses an Example proto containing a training example of an image.\n\n  The output of the build_image_data.py image preprocessing script is a dataset\n  containing serialized Example protocol buffers. 
Each Example proto contains\n  the following fields:\n\n    image/height: 462\n    image/width: 581\n    image/colorspace: 'RGB'\n    image/channels: 3\n    image/class/label: 615\n    image/class/synset: 'n03623198'\n    image/class/text: 'knee pad'\n    image/object/bbox/xmin: 0.1\n    image/object/bbox/xmax: 0.9\n    image/object/bbox/ymin: 0.2\n    image/object/bbox/ymax: 0.6\n    image/object/bbox/label: 615\n    image/format: 'JPEG'\n    image/filename: 'ILSVRC2012_val_00041207.JPEG'\n    image/encoded: <JPEG encoded string>\n\n  Args:\n    example_serialized: scalar Tensor tf.string containing a serialized\n      Example protocol buffer.\n\n  Returns:\n    image_buffer: Tensor tf.string containing the contents of a JPEG file.\n    label: Tensor tf.int32 containing the label.\n    bbox: 3-D float Tensor of bounding boxes arranged [1, num_boxes, coords]\n      where each coordinate is [0, 1) and the coordinates are arranged as\n      [ymin, xmin, ymax, xmax].\n    text: Tensor tf.string containing the human-readable label.\n  \"\"\"\n  # Dense features in Example proto.\n  feature_map = {\n      'image/encoded': tf.FixedLenFeature([], dtype=tf.string,\n                                          default_value=''),\n      'image/class/label': tf.FixedLenFeature([1], dtype=tf.int64,\n                                              default_value=-1),\n      'image/class/text': tf.FixedLenFeature([], dtype=tf.string,\n                                             default_value=''),\n  }\n  sparse_float32 = tf.VarLenFeature(dtype=tf.float32)\n  # Sparse features in Example proto.\n  feature_map.update(\n      {k: sparse_float32 for k in ['image/object/bbox/xmin',\n                                   'image/object/bbox/ymin',\n                                   'image/object/bbox/xmax',\n                                   'image/object/bbox/ymax']})\n\n  features = tf.parse_single_example(example_serialized, feature_map)\n  label = tf.cast(features['image/class/label'], 
dtype=tf.int32)\n\n  xmin = tf.expand_dims(features['image/object/bbox/xmin'].values, 0)\n  ymin = tf.expand_dims(features['image/object/bbox/ymin'].values, 0)\n  xmax = tf.expand_dims(features['image/object/bbox/xmax'].values, 0)\n  ymax = tf.expand_dims(features['image/object/bbox/ymax'].values, 0)\n\n  # Note that we impose an ordering of (y, x) just to make life difficult.\n  bbox = tf.concat(0, [ymin, xmin, ymax, xmax])\n\n  # Force the variable number of bounding boxes into the shape\n  # [1, num_boxes, coords].\n  bbox = tf.expand_dims(bbox, 0)\n  bbox = tf.transpose(bbox, [0, 2, 1])\n\n  return features['image/encoded'], label, bbox, features['image/class/text']\n\n\ndef batch_inputs(dataset, batch_size, train, num_preprocess_threads=None,\n                 num_readers=1):\n  \"\"\"Construct batches of training or evaluation examples from the image dataset.\n\n  Args:\n    dataset: instance of Dataset class specifying the dataset.\n      See dataset.py for details.\n    batch_size: integer\n    train: boolean\n    num_preprocess_threads: integer, total number of preprocessing threads\n    num_readers: integer, number of parallel readers\n\n  Returns:\n    images: 4-D float Tensor of a batch of images\n    labels: 1-D integer Tensor of [batch_size].\n\n  Raises:\n    ValueError: if data is not found\n  \"\"\"\n  with tf.name_scope('batch_processing'):\n    data_files = dataset.data_files()\n    if data_files is None:\n      raise ValueError('No data files found for this dataset')\n\n    # Create filename_queue\n    if train:\n      filename_queue = tf.train.string_input_producer(data_files,\n                                                      shuffle=True,\n                                                      capacity=16)\n    else:\n      filename_queue = tf.train.string_input_producer(data_files,\n                                                      shuffle=False,\n                                                      capacity=1)\n    if 
num_preprocess_threads is None:\n      num_preprocess_threads = FLAGS.num_preprocess_threads\n\n    if num_preprocess_threads % 4:\n      raise ValueError('Please make num_preprocess_threads a multiple '\n                       'of 4 (%d %% 4 != 0).' % num_preprocess_threads)\n\n    if num_readers is None:\n      num_readers = FLAGS.num_readers\n\n    if num_readers < 1:\n      raise ValueError('Please make num_readers at least 1')\n\n    # Approximate number of examples per shard.\n    examples_per_shard = 1024\n    # Size the random shuffle queue to balance between good global\n    # mixing (more examples) and memory use (fewer examples).\n    # 1 image uses 299*299*3*4 bytes = 1MB\n    # The default input_queue_memory_factor is 16 implying a shuffling queue\n    # size: examples_per_shard * 16 * 1MB = 17.6GB\n    min_queue_examples = examples_per_shard * FLAGS.input_queue_memory_factor\n    if train:\n      examples_queue = tf.RandomShuffleQueue(\n          capacity=min_queue_examples + 3 * batch_size,\n          min_after_dequeue=min_queue_examples,\n          dtypes=[tf.string])\n    else:\n      examples_queue = tf.FIFOQueue(\n          capacity=examples_per_shard + 3 * batch_size,\n          dtypes=[tf.string])\n\n    # Create multiple readers to populate the queue of examples.\n    if num_readers > 1:\n      enqueue_ops = []\n      for _ in range(num_readers):\n        reader = dataset.reader()\n        _, value = reader.read(filename_queue)\n        enqueue_ops.append(examples_queue.enqueue([value]))\n\n      tf.train.queue_runner.add_queue_runner(\n          tf.train.queue_runner.QueueRunner(examples_queue, enqueue_ops))\n      example_serialized = examples_queue.dequeue()\n    else:\n      reader = dataset.reader()\n      _, example_serialized = reader.read(filename_queue)\n\n    images_and_labels = []\n    for thread_id in range(num_preprocess_threads):\n      # Parse a serialized Example proto to extract the image and metadata.\n      image_buffer, 
label_index, bbox, _ = parse_example_proto(\n          example_serialized)\n      image = image_preprocessing(image_buffer, bbox, train, thread_id)\n      images_and_labels.append([image, label_index])\n\n    images, label_index_batch = tf.train.batch_join(\n        images_and_labels,\n        batch_size=batch_size,\n        capacity=2 * num_preprocess_threads * batch_size)\n\n    # Reshape images into these desired dimensions.\n    height = FLAGS.image_size\n    width = FLAGS.image_size\n    depth = 3\n\n    images = tf.cast(images, tf.float32)\n    images = tf.reshape(images, shape=[batch_size, height, width, depth])\n\n    # Display the training images in the visualizer.\n    tf.image_summary('images', images)\n\n    return images, tf.reshape(label_index_batch, [batch_size])\n"
  },
  {
    "path": "model_zoo/models/inception/inception/imagenet_data.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Small library that points to the ImageNet data set.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\n\nfrom inception.dataset import Dataset\n\n\nclass ImagenetData(Dataset):\n  \"\"\"ImageNet data set.\"\"\"\n\n  def __init__(self, subset):\n    super(ImagenetData, self).__init__('ImageNet', subset)\n\n  def num_classes(self):\n    \"\"\"Returns the number of classes in the data set.\"\"\"\n    return 1000\n\n  def num_examples_per_epoch(self):\n    \"\"\"Returns the number of examples in the data set.\"\"\"\n    # Bounding box data consists of 615299 bounding boxes for 544546 images.\n    if self.subset == 'train':\n      return 1281167\n    if self.subset == 'validation':\n      return 50000\n\n  def download_message(self):\n    \"\"\"Instruction to download and extract the tarball from the ImageNet website.\"\"\"\n\n    print('Failed to find any ImageNet %s files' % self.subset)\n    print('')\n    print('If you have already downloaded and processed the data, then make '\n          'sure to set --data_dir to point to the directory containing the '\n          'location of the sharded TFRecords.\\n')\n    print('If you have not downloaded and prepared the ImageNet data in the '\n          
'TFRecord format, you will need to do this at least once. This '\n          'process could take several hours depending on the speed of your '\n          'computer and network connection\\n')\n    print('Please see README.md for instructions on how to build '\n          'the ImageNet dataset using download_and_preprocess_imagenet.\\n')\n    print('Note that the raw data size is 300 GB and the processed data size '\n          'is 150 GB. Please ensure you have at least 500GB disk space.')\n"
  },
  {
    "path": "model_zoo/models/inception/inception/imagenet_distributed_train.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n# pylint: disable=line-too-long\n\"\"\"A binary to train Inception in a distributed manner using multiple systems.\n\nPlease see accompanying README.md for details and instructions.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom inception import inception_distributed_train\nfrom inception.imagenet_data import ImagenetData\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef main(unused_args):\n  assert FLAGS.job_name in ['ps', 'worker'], 'job_name must be ps or worker'\n\n  # Extract all the hostnames for the ps and worker jobs to construct the\n  # cluster spec.\n  ps_hosts = FLAGS.ps_hosts.split(',')\n  worker_hosts = FLAGS.worker_hosts.split(',')\n  tf.logging.info('PS hosts are: %s' % ps_hosts)\n  tf.logging.info('Worker hosts are: %s' % worker_hosts)\n\n  cluster_spec = tf.train.ClusterSpec({'ps': ps_hosts,\n                                       'worker': worker_hosts})\n  server = tf.train.Server(\n      {'ps': ps_hosts,\n       'worker': worker_hosts},\n      job_name=FLAGS.job_name,\n      task_index=FLAGS.task_id)\n\n  if FLAGS.job_name == 'ps':\n    # `ps` jobs wait for incoming connections from the workers.\n    server.join()\n  else:\n    # `worker` jobs will 
actually do the work.\n    dataset = ImagenetData(subset=FLAGS.subset)\n    assert dataset.data_files()\n    # Only the chief checks for or creates train_dir.\n    if FLAGS.task_id == 0:\n      if not tf.gfile.Exists(FLAGS.train_dir):\n        tf.gfile.MakeDirs(FLAGS.train_dir)\n    inception_distributed_train.train(server.target, dataset, cluster_spec)\n\nif __name__ == '__main__':\n  tf.logging.set_verbosity(tf.logging.INFO)\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/imagenet_eval.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A binary to evaluate Inception on the ImageNet data set.\n\nNote that using the supplied pre-trained Inception checkpoint, the eval should\nachieve:\n  precision @ 1 = 0.7874 recall @ 5 = 0.9436 [50000 examples]\n\nSee the README.md for more details.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom inception import inception_eval\nfrom inception.imagenet_data import ImagenetData\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef main(unused_argv=None):\n  dataset = ImagenetData(subset=FLAGS.subset)\n  assert dataset.data_files()\n  if tf.gfile.Exists(FLAGS.eval_dir):\n    tf.gfile.DeleteRecursively(FLAGS.eval_dir)\n  tf.gfile.MakeDirs(FLAGS.eval_dir)\n  inception_eval.evaluate(dataset)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/imagenet_train.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A binary to train Inception on the ImageNet data set.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\n\nimport tensorflow as tf\n\nfrom inception import inception_train\nfrom inception.imagenet_data import ImagenetData\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef main(_):\n  dataset = ImagenetData(subset=FLAGS.subset)\n  assert dataset.data_files()\n  if tf.gfile.Exists(FLAGS.train_dir):\n    tf.gfile.DeleteRecursively(FLAGS.train_dir)\n  tf.gfile.MakeDirs(FLAGS.train_dir)\n  inception_train.train(dataset)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/inception_distributed_train.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A library to train Inception using multiple replicas with synchronous update.\n\nPlease see accompanying README.md for details and instructions.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom datetime import datetime\nimport os.path\nimport time\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom inception import image_processing\nfrom inception import inception_model as inception\nfrom inception.slim import slim\n\nFLAGS = tf.app.flags.FLAGS\n\ntf.app.flags.DEFINE_string('job_name', '', 'One of \"ps\", \"worker\"')\ntf.app.flags.DEFINE_string('ps_hosts', '',\n                           \"\"\"Comma-separated list of hostname:port for the \"\"\"\n                           \"\"\"parameter server jobs. e.g. \"\"\"\n                           \"\"\"'machine1:2222,machine2:1111,machine2:2222'\"\"\")\ntf.app.flags.DEFINE_string('worker_hosts', '',\n                           \"\"\"Comma-separated list of hostname:port for the \"\"\"\n                           \"\"\"worker jobs. e.g. 
\"\"\"\n                           \"\"\"'machine1:2222,machine2:1111,machine2:2222'\"\"\")\n\ntf.app.flags.DEFINE_string('train_dir', '/tmp/imagenet_train',\n                           \"\"\"Directory where to write event logs \"\"\"\n                           \"\"\"and checkpoint.\"\"\")\ntf.app.flags.DEFINE_integer('max_steps', 1000000, 'Number of batches to run.')\ntf.app.flags.DEFINE_string('subset', 'train', 'Either \"train\" or \"validation\".')\ntf.app.flags.DEFINE_boolean('log_device_placement', False,\n                            'Whether to log device placement.')\n\n# Task ID is used to select the chief and also to access the local_step for\n# each replica to check staleness of the gradients in sync_replicas_optimizer.\ntf.app.flags.DEFINE_integer(\n    'task_id', 0, 'Task ID of the worker/replica running the training.')\n\n# More details can be found in the sync_replicas_optimizer class:\n# tensorflow/python/training/sync_replicas_optimizer.py\ntf.app.flags.DEFINE_integer('num_replicas_to_aggregate', -1,\n                            \"\"\"Number of gradients to collect before \"\"\"\n                            \"\"\"updating the parameters.\"\"\")\ntf.app.flags.DEFINE_integer('save_interval_secs', 10 * 60,\n                            'Save interval seconds.')\ntf.app.flags.DEFINE_integer('save_summaries_secs', 180,\n                            'Save summaries interval seconds.')\n\n# **IMPORTANT**\n# Please note that this learning rate schedule is heavily dependent on the\n# hardware architecture, batch size and any changes to the model architecture\n# specification. Selecting a finely tuned learning rate schedule is an\n# empirical process that requires some experimentation. 
Please see README.md for\n# more guidance and discussion.\n#\n# Learning rate decay factor selected from https://arxiv.org/abs/1604.00981\ntf.app.flags.DEFINE_float('initial_learning_rate', 0.045,\n                          'Initial learning rate.')\ntf.app.flags.DEFINE_float('num_epochs_per_decay', 2.0,\n                          'Epochs after which learning rate decays.')\ntf.app.flags.DEFINE_float('learning_rate_decay_factor', 0.94,\n                          'Learning rate decay factor.')\n\n# Constants dictating the learning rate schedule.\nRMSPROP_DECAY = 0.9                # Decay term for RMSProp.\nRMSPROP_MOMENTUM = 0.9             # Momentum in RMSProp.\nRMSPROP_EPSILON = 1.0              # Epsilon term for RMSProp.\n\n\ndef train(target, dataset, cluster_spec):\n  \"\"\"Train Inception on a dataset for a number of steps.\"\"\"\n  # Number of workers and parameter servers are inferred from the workers and ps\n  # hosts string.\n  num_workers = len(cluster_spec.as_dict()['worker'])\n  num_parameter_servers = len(cluster_spec.as_dict()['ps'])\n  # If no value is given, num_replicas_to_aggregate defaults to the number of\n  # workers.\n  if FLAGS.num_replicas_to_aggregate == -1:\n    num_replicas_to_aggregate = num_workers\n  else:\n    num_replicas_to_aggregate = FLAGS.num_replicas_to_aggregate\n\n  # Both should be greater than 0 in distributed training.\n  assert num_workers > 0 and num_parameter_servers > 0, (' num_workers and '\n                                                         'num_parameter_servers'\n                                                         ' must be > 0.')\n\n  # Choose worker 0 as the chief. 
Note that any worker could be the chief\n  # but there should be only one chief.\n  is_chief = (FLAGS.task_id == 0)\n\n  # Ops are assigned to worker by default.\n  with tf.device('/job:worker/task:%d' % FLAGS.task_id):\n    # Variables and its related init/assign ops are assigned to ps.\n    with slim.scopes.arg_scope(\n        [slim.variables.variable, slim.variables.global_step],\n        device=slim.variables.VariableDeviceChooser(num_parameter_servers)):\n      # Create a variable to count the number of train() calls. This equals the\n      # number of updates applied to the variables.\n      global_step = slim.variables.global_step()\n\n      # Calculate the learning rate schedule.\n      num_batches_per_epoch = (dataset.num_examples_per_epoch() /\n                               FLAGS.batch_size)\n      # Decay steps need to be divided by the number of replicas to aggregate.\n      decay_steps = int(num_batches_per_epoch * FLAGS.num_epochs_per_decay /\n                        num_replicas_to_aggregate)\n\n      # Decay the learning rate exponentially based on the number of steps.\n      lr = tf.train.exponential_decay(FLAGS.initial_learning_rate,\n                                      global_step,\n                                      decay_steps,\n                                      FLAGS.learning_rate_decay_factor,\n                                      staircase=True)\n      # Add a summary to track the learning rate.\n      tf.scalar_summary('learning_rate', lr)\n\n      # Create an optimizer that performs gradient descent.\n      opt = tf.train.RMSPropOptimizer(lr,\n                                      RMSPROP_DECAY,\n                                      momentum=RMSPROP_MOMENTUM,\n                                      epsilon=RMSPROP_EPSILON)\n\n      images, labels = image_processing.distorted_inputs(\n          dataset,\n          batch_size=FLAGS.batch_size,\n          num_preprocess_threads=FLAGS.num_preprocess_threads)\n\n      # Number of 
classes in the Dataset label set plus 1.\n      # Label 0 is reserved for an (unused) background class.\n      num_classes = dataset.num_classes() + 1\n      logits = inception.inference(images, num_classes, for_training=True)\n      # Add classification loss.\n      inception.loss(logits, labels)\n\n      # Gather all of the losses including regularization losses.\n      losses = tf.get_collection(slim.losses.LOSSES_COLLECTION)\n      losses += tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)\n\n      total_loss = tf.add_n(losses, name='total_loss')\n\n      if is_chief:\n        # Compute the moving average of all individual losses and the\n        # total loss.\n        loss_averages = tf.train.ExponentialMovingAverage(0.9, name='avg')\n        loss_averages_op = loss_averages.apply(losses + [total_loss])\n\n        # Attach a scalar summary to all individual losses and the total loss;\n        # do the same for the averaged version of the losses.\n        for l in losses + [total_loss]:\n          loss_name = l.op.name\n          # Name each loss as '(raw)' and name the moving average version of the\n          # loss as the original loss name.\n          tf.scalar_summary(loss_name + ' (raw)', l)\n          tf.scalar_summary(loss_name, loss_averages.average(l))\n\n        # Add dependency to compute loss_averages.\n        with tf.control_dependencies([loss_averages_op]):\n          total_loss = tf.identity(total_loss)\n\n      # Track the moving averages of all trainable variables.\n      # Note that we maintain a 'double-average' of the BatchNormalization\n      # global statistics.\n      # This is not needed when the number of replicas is small but important\n      # for synchronous distributed training with tens of workers/replicas.\n      exp_moving_averager = tf.train.ExponentialMovingAverage(\n          inception.MOVING_AVERAGE_DECAY, global_step)\n\n      variables_to_average = (\n          tf.trainable_variables() + 
tf.moving_average_variables())\n\n      # Add histograms for model variables.\n      for var in variables_to_average:\n        tf.histogram_summary(var.op.name, var)\n\n      # Create synchronous replica optimizer.\n      opt = tf.train.SyncReplicasOptimizer(\n          opt,\n          replicas_to_aggregate=num_replicas_to_aggregate,\n          replica_id=FLAGS.task_id,\n          total_num_replicas=num_workers,\n          variable_averages=exp_moving_averager,\n          variables_to_average=variables_to_average)\n\n      batchnorm_updates = tf.get_collection(slim.ops.UPDATE_OPS_COLLECTION)\n      assert batchnorm_updates, 'Batchnorm updates are missing'\n      batchnorm_updates_op = tf.group(*batchnorm_updates)\n      # Add dependency to compute batchnorm_updates.\n      with tf.control_dependencies([batchnorm_updates_op]):\n        total_loss = tf.identity(total_loss)\n\n      # Compute gradients with respect to the loss.\n      grads = opt.compute_gradients(total_loss)\n\n      # Add histograms for gradients.\n      for grad, var in grads:\n        if grad is not None:\n          tf.histogram_summary(var.op.name + '/gradients', grad)\n\n      apply_gradients_op = opt.apply_gradients(grads, global_step=global_step)\n\n      with tf.control_dependencies([apply_gradients_op]):\n        train_op = tf.identity(total_loss, name='train_op')\n\n      # Get chief queue_runners, init_tokens and clean_up_op, which is used to\n      # synchronize replicas.\n      # More details can be found in sync_replicas_optimizer.\n      chief_queue_runners = [opt.get_chief_queue_runner()]\n      init_tokens_op = opt.get_init_tokens_op()\n      clean_up_op = opt.get_clean_up_op()\n\n      # Create a saver.\n      saver = tf.train.Saver()\n\n      # Build the summary operation based on the TF collection of Summaries.\n      summary_op = tf.merge_all_summaries()\n\n      # Build an initialization operation to run below.\n      init_op = tf.initialize_all_variables()\n\n      # We run the 
summaries in the same thread as the training operations by\n      # passing in None for summary_op to avoid a summary_thread being started.\n      # Running summaries and training operations in parallel could run out of\n      # GPU memory.\n      sv = tf.train.Supervisor(is_chief=is_chief,\n                               logdir=FLAGS.train_dir,\n                               init_op=init_op,\n                               summary_op=None,\n                               global_step=global_step,\n                               saver=saver,\n                               save_model_secs=FLAGS.save_interval_secs)\n\n      tf.logging.info('%s Supervisor' % datetime.now())\n\n      sess_config = tf.ConfigProto(\n          allow_soft_placement=True,\n          log_device_placement=FLAGS.log_device_placement)\n\n      # Get a session.\n      sess = sv.prepare_or_wait_for_session(target, config=sess_config)\n\n      # Start the queue runners.\n      queue_runners = tf.get_collection(tf.GraphKeys.QUEUE_RUNNERS)\n      sv.start_queue_runners(sess, queue_runners)\n      tf.logging.info('Started %d queues for processing input data.',\n                      len(queue_runners))\n\n      if is_chief:\n        sv.start_queue_runners(sess, chief_queue_runners)\n        sess.run(init_tokens_op)\n\n      # Train, checking for Nans. Concurrently run the summary operation at a\n      # specified interval. 
Note that the summary_op and train_op never run\n      # simultaneously in order to prevent running out of GPU memory.\n      next_summary_time = time.time() + FLAGS.save_summaries_secs\n      while not sv.should_stop():\n        try:\n          start_time = time.time()\n          loss_value, step = sess.run([train_op, global_step])\n          assert not np.isnan(loss_value), 'Model diverged with loss = NaN'\n          if step > FLAGS.max_steps:\n            break\n          duration = time.time() - start_time\n\n          if step % 30 == 0:\n            examples_per_sec = FLAGS.batch_size / float(duration)\n            format_str = ('Worker %d: %s: step %d, loss = %.2f'\n                          '(%.1f examples/sec; %.3f  sec/batch)')\n            tf.logging.info(format_str %\n                            (FLAGS.task_id, datetime.now(), step, loss_value,\n                             examples_per_sec, duration))\n\n          # Determine if the summary_op should be run on the chief worker.\n          if is_chief and next_summary_time < time.time():\n            tf.logging.info('Running Summary operation on the chief.')\n            summary_str = sess.run(summary_op)\n            sv.summary_computed(sess, summary_str)\n            tf.logging.info('Finished running Summary operation.')\n\n            # Determine the next time for running the summary.\n            next_summary_time += FLAGS.save_summaries_secs\n        except:\n          if is_chief:\n            tf.logging.info('About to execute sync_clean_up_op!')\n            sess.run(clean_up_op)\n          raise\n\n      # Stop the supervisor.  This also waits for service threads to finish.\n      sv.stop()\n\n      # Save after the training ends.\n      if is_chief:\n        saver.save(sess,\n                   os.path.join(FLAGS.train_dir, 'model.ckpt'),\n                   global_step=global_step)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/inception_eval.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A library to evaluate Inception on a single GPU.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom datetime import datetime\nimport math\nimport os.path\nimport time\n\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom inception import image_processing\nfrom inception import inception_model as inception\n\n\nFLAGS = tf.app.flags.FLAGS\n\ntf.app.flags.DEFINE_string('eval_dir', '/tmp/imagenet_eval',\n                           \"\"\"Directory where to write event logs.\"\"\")\ntf.app.flags.DEFINE_string('checkpoint_dir', '/tmp/imagenet_train',\n                           \"\"\"Directory where to read model checkpoints.\"\"\")\n\n# Flags governing the frequency of the eval.\ntf.app.flags.DEFINE_integer('eval_interval_secs', 60 * 5,\n                            \"\"\"How often to run the eval.\"\"\")\ntf.app.flags.DEFINE_boolean('run_once', False,\n                            \"\"\"Whether to run eval only once.\"\"\")\n\n# Flags governing the data used for the eval.\ntf.app.flags.DEFINE_integer('num_examples', 50000,\n                            \"\"\"Number of examples to run. 
Note that the eval \"\"\"\n                            \"\"\"ImageNet dataset contains 50000 examples.\"\"\")\ntf.app.flags.DEFINE_string('subset', 'validation',\n                           \"\"\"Either 'validation' or 'train'.\"\"\")\n\n\ndef _eval_once(saver, summary_writer, top_1_op, top_5_op, summary_op):\n  \"\"\"Runs Eval once.\n\n  Args:\n    saver: Saver.\n    summary_writer: Summary writer.\n    top_1_op: Top 1 op.\n    top_5_op: Top 5 op.\n    summary_op: Summary op.\n  \"\"\"\n  with tf.Session() as sess:\n    ckpt = tf.train.get_checkpoint_state(FLAGS.checkpoint_dir)\n    if ckpt and ckpt.model_checkpoint_path:\n      if os.path.isabs(ckpt.model_checkpoint_path):\n        # Restores from checkpoint with absolute path.\n        saver.restore(sess, ckpt.model_checkpoint_path)\n      else:\n        # Restores from checkpoint with relative path.\n        saver.restore(sess, os.path.join(FLAGS.checkpoint_dir,\n                                         ckpt.model_checkpoint_path))\n\n      # Assuming model_checkpoint_path looks something like:\n      #   /my-favorite-path/imagenet_train/model.ckpt-0,\n      # extract global_step from it.\n      global_step = ckpt.model_checkpoint_path.split('/')[-1].split('-')[-1]\n      print('Successfully loaded model from %s at step=%s.' 
%\n            (ckpt.model_checkpoint_path, global_step))\n    else:\n      print('No checkpoint file found')\n      return\n\n    # Start the queue runners.\n    coord = tf.train.Coordinator()\n    try:\n      threads = []\n      for qr in tf.get_collection(tf.GraphKeys.QUEUE_RUNNERS):\n        threads.extend(qr.create_threads(sess, coord=coord, daemon=True,\n                                         start=True))\n\n      num_iter = int(math.ceil(FLAGS.num_examples / FLAGS.batch_size))\n      # Counts the number of correct predictions.\n      count_top_1 = 0.0\n      count_top_5 = 0.0\n      total_sample_count = num_iter * FLAGS.batch_size\n      step = 0\n\n      print('%s: starting evaluation on (%s).' % (datetime.now(), FLAGS.subset))\n      start_time = time.time()\n      while step < num_iter and not coord.should_stop():\n        top_1, top_5 = sess.run([top_1_op, top_5_op])\n        count_top_1 += np.sum(top_1)\n        count_top_5 += np.sum(top_5)\n        step += 1\n        if step % 20 == 0:\n          duration = time.time() - start_time\n          sec_per_batch = duration / 20.0\n          examples_per_sec = FLAGS.batch_size / sec_per_batch\n          print('%s: [%d batches out of %d] (%.1f examples/sec; %.3f'\n                'sec/batch)' % (datetime.now(), step, num_iter,\n                                examples_per_sec, sec_per_batch))\n          start_time = time.time()\n\n      # Compute precision @ 1.\n      precision_at_1 = count_top_1 / total_sample_count\n      recall_at_5 = count_top_5 / total_sample_count\n      print('%s: precision @ 1 = %.4f recall @ 5 = %.4f [%d examples]' %\n            (datetime.now(), precision_at_1, recall_at_5, total_sample_count))\n\n      summary = tf.Summary()\n      summary.ParseFromString(sess.run(summary_op))\n      summary.value.add(tag='Precision @ 1', simple_value=precision_at_1)\n      summary.value.add(tag='Recall @ 5', simple_value=recall_at_5)\n      summary_writer.add_summary(summary, global_step)\n\n    
except Exception as e:  # pylint: disable=broad-except\n      coord.request_stop(e)\n\n    coord.request_stop()\n    coord.join(threads, stop_grace_period_secs=10)\n\n\ndef evaluate(dataset):\n  \"\"\"Evaluate model on Dataset for a number of steps.\"\"\"\n  with tf.Graph().as_default():\n    # Get images and labels from the dataset.\n    images, labels = image_processing.inputs(dataset)\n\n    # Number of classes in the Dataset label set plus 1.\n    # Label 0 is reserved for an (unused) background class.\n    num_classes = dataset.num_classes() + 1\n\n    # Build a Graph that computes the logits predictions from the\n    # inference model.\n    logits, _ = inception.inference(images, num_classes)\n\n    # Calculate predictions.\n    top_1_op = tf.nn.in_top_k(logits, labels, 1)\n    top_5_op = tf.nn.in_top_k(logits, labels, 5)\n\n    # Restore the moving average version of the learned variables for eval.\n    variable_averages = tf.train.ExponentialMovingAverage(\n        inception.MOVING_AVERAGE_DECAY)\n    variables_to_restore = variable_averages.variables_to_restore()\n    saver = tf.train.Saver(variables_to_restore)\n\n    # Build the summary operation based on the TF collection of Summaries.\n    summary_op = tf.merge_all_summaries()\n\n    graph_def = tf.get_default_graph().as_graph_def()\n    summary_writer = tf.train.SummaryWriter(FLAGS.eval_dir,\n                                            graph_def=graph_def)\n\n    while True:\n      _eval_once(saver, summary_writer, top_1_op, top_5_op, summary_op)\n      if FLAGS.run_once:\n        break\n      time.sleep(FLAGS.eval_interval_secs)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/inception_model.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Build the Inception v3 network on ImageNet data set.\n\nThe Inception v3 architecture is described in http://arxiv.org/abs/1512.00567\n\nSummary of available functions:\n inference: Compute inference on the model inputs to make a prediction\n loss: Compute the loss of the prediction with respect to the labels\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport re\n\nimport tensorflow as tf\n\nfrom inception.slim import slim\n\nFLAGS = tf.app.flags.FLAGS\n\n# If a model is trained using multiple GPUs, prefix all Op names with tower_name\n# to differentiate the operations. Note that this prefix is removed from the\n# names of the summaries when visualizing a model.\nTOWER_NAME = 'tower'\n\n# Batch normalization. 
Constant governing the exponential moving average of\n# the 'global' mean and variance for all activations.\nBATCHNORM_MOVING_AVERAGE_DECAY = 0.9997\n\n# The decay to use for the moving average.\nMOVING_AVERAGE_DECAY = 0.9999\n\n\ndef inference(images, num_classes, for_training=False, restore_logits=True,\n              scope=None):\n  \"\"\"Build Inception v3 model architecture.\n\n  See here for reference: http://arxiv.org/abs/1512.00567\n\n  Args:\n    images: Images returned from inputs() or distorted_inputs().\n    num_classes: number of classes\n    for_training: If set to `True`, build the inference model for training.\n      Kernels that operate differently for inference during training\n      e.g. dropout, are appropriately configured.\n    restore_logits: whether or not the logits layers should be restored.\n      Useful for fine-tuning a model with different num_classes.\n    scope: optional prefix string identifying the ImageNet tower.\n\n  Returns:\n    Logits. 2-D float Tensor.\n    Auxiliary Logits. 2-D float Tensor of side-head. 
Used for training only.\n  \"\"\"\n  # Parameters for BatchNorm.\n  batch_norm_params = {\n      # Decay for the moving averages.\n      'decay': BATCHNORM_MOVING_AVERAGE_DECAY,\n      # epsilon to prevent 0s in variance.\n      'epsilon': 0.001,\n  }\n  # Set weight_decay for weights in Conv and FC layers.\n  with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], weight_decay=0.00004):\n    with slim.arg_scope([slim.ops.conv2d],\n                        stddev=0.1,\n                        activation=tf.nn.relu,\n                        batch_norm_params=batch_norm_params):\n      logits, endpoints = slim.inception.inception_v3(\n          images,\n          dropout_keep_prob=0.8,\n          num_classes=num_classes,\n          is_training=for_training,\n          restore_logits=restore_logits,\n          scope=scope)\n\n  # Add summaries for viewing model statistics on TensorBoard.\n  _activation_summaries(endpoints)\n\n  # Grab the logits associated with the side head. Employed during training.\n  auxiliary_logits = endpoints['aux_logits']\n\n  return logits, auxiliary_logits\n\n\ndef loss(logits, labels, batch_size=None):\n  \"\"\"Adds all losses for the model.\n\n  Note the final loss is not returned. Instead, the list of losses are collected\n  by slim.losses. The losses are accumulated in tower_loss() and summed to\n  calculate the total loss.\n\n  Args:\n    logits: List of logits from inference(). Each entry is a 2-D float Tensor.\n    labels: Labels from distorted_inputs or inputs(). 
1-D tensor\n            of shape [batch_size]\n    batch_size: integer\n  \"\"\"\n  if not batch_size:\n    batch_size = FLAGS.batch_size\n\n  # Reshape the labels into a dense Tensor of\n  # shape [FLAGS.batch_size, num_classes].\n  sparse_labels = tf.reshape(labels, [batch_size, 1])\n  indices = tf.reshape(tf.range(batch_size), [batch_size, 1])\n  concated = tf.concat(1, [indices, sparse_labels])\n  num_classes = logits[0].get_shape()[-1].value\n  dense_labels = tf.sparse_to_dense(concated,\n                                    [batch_size, num_classes],\n                                    1.0, 0.0)\n\n  # Cross entropy loss for the main softmax prediction.\n  slim.losses.cross_entropy_loss(logits[0],\n                                 dense_labels,\n                                 label_smoothing=0.1,\n                                 weight=1.0)\n\n  # Cross entropy loss for the auxiliary softmax head.\n  slim.losses.cross_entropy_loss(logits[1],\n                                 dense_labels,\n                                 label_smoothing=0.1,\n                                 weight=0.4,\n                                 scope='aux_loss')\n\n\ndef _activation_summary(x):\n  \"\"\"Helper to create summaries for activations.\n\n  Creates a summary that provides a histogram of activations.\n  Creates a summary that measure the sparsity of activations.\n\n  Args:\n    x: Tensor\n  \"\"\"\n  # Remove 'tower_[0-9]/' from the name in case this is a multi-GPU training\n  # session. This helps the clarity of presentation on tensorboard.\n  tensor_name = re.sub('%s_[0-9]*/' % TOWER_NAME, '', x.op.name)\n  tf.histogram_summary(tensor_name + '/activations', x)\n  tf.scalar_summary(tensor_name + '/sparsity', tf.nn.zero_fraction(x))\n\n\ndef _activation_summaries(endpoints):\n  with tf.name_scope('summaries'):\n    for act in endpoints.values():\n      _activation_summary(act)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/inception_train.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A library to train Inception using multiple GPU's with synchronous updates.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport copy\nfrom datetime import datetime\nimport os.path\nimport re\nimport time\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom inception import image_processing\nfrom inception import inception_model as inception\nfrom inception.slim import slim\n\nFLAGS = tf.app.flags.FLAGS\n\ntf.app.flags.DEFINE_string('train_dir', '/tmp/imagenet_train',\n                           \"\"\"Directory where to write event logs \"\"\"\n                           \"\"\"and checkpoint.\"\"\")\ntf.app.flags.DEFINE_integer('max_steps', 10000000,\n                            \"\"\"Number of batches to run.\"\"\")\ntf.app.flags.DEFINE_string('subset', 'train',\n                           \"\"\"Either 'train' or 'validation'.\"\"\")\n\n# Flags governing the hardware employed for running TensorFlow.\ntf.app.flags.DEFINE_integer('num_gpus', 1,\n                            \"\"\"How many GPUs to use.\"\"\")\ntf.app.flags.DEFINE_boolean('log_device_placement', False,\n                            \"\"\"Whether to log device placement.\"\"\")\n\n# Flags governing the type of 
training.\ntf.app.flags.DEFINE_boolean('fine_tune', False,\n                            \"\"\"If set, randomly initialize the final layer \"\"\"\n                            \"\"\"of weights in order to train the network on a \"\"\"\n                            \"\"\"new task.\"\"\")\ntf.app.flags.DEFINE_string('pretrained_model_checkpoint_path', '',\n                           \"\"\"If specified, restore this pretrained model \"\"\"\n                           \"\"\"before beginning any training.\"\"\")\n\n# **IMPORTANT**\n# Please note that this learning rate schedule is heavily dependent on the\n# hardware architecture, batch size and any changes to the model architecture\n# specification. Selecting a finely tuned learning rate schedule is an\n# empirical process that requires some experimentation. Please see README.md\n# for more guidance and discussion.\n#\n# With 8 Tesla K40's and a batch size = 256, the following setup achieves\n# precision@1 = 73.5% after 100 hours and 100K steps (20 epochs).\n# Learning rate decay factor selected from http://arxiv.org/abs/1404.5997.\ntf.app.flags.DEFINE_float('initial_learning_rate', 0.1,\n                          \"\"\"Initial learning rate.\"\"\")\ntf.app.flags.DEFINE_float('num_epochs_per_decay', 30.0,\n                          \"\"\"Epochs after which learning rate decays.\"\"\")\ntf.app.flags.DEFINE_float('learning_rate_decay_factor', 0.16,\n                          \"\"\"Learning rate decay factor.\"\"\")\n\n# Constants dictating the learning rate schedule.\nRMSPROP_DECAY = 0.9                # Decay term for RMSProp.\nRMSPROP_MOMENTUM = 0.9             # Momentum in RMSProp.\nRMSPROP_EPSILON = 1.0              # Epsilon term for RMSProp.\n\n\ndef _tower_loss(images, labels, num_classes, scope):\n  \"\"\"Calculate the total loss on a single tower running the ImageNet model.\n\n  We perform 'batch splitting'. This means that we cut up a batch across\n  multiple GPUs. 
For instance, if the batch size = 32 and num_gpus = 2,\n  then each tower will operate on a batch of 16 images.\n\n  Args:\n    images: Images. 4D tensor of size [batch_size, FLAGS.image_size,\n                                       FLAGS.image_size, 3].\n    labels: 1-D integer Tensor of [batch_size].\n    num_classes: number of classes\n    scope: unique prefix string identifying the ImageNet tower, e.g.\n      'tower_0'.\n\n  Returns:\n     Tensor of shape [] containing the total loss for a batch of data\n  \"\"\"\n  # When fine-tuning a model, we do not restore the logits but instead we\n  # randomly initialize the logits. The number of classes in the output of the\n  # logit is the number of classes in the specified Dataset.\n  restore_logits = not FLAGS.fine_tune\n\n  # Build inference Graph.\n  logits = inception.inference(images, num_classes, for_training=True,\n                               restore_logits=restore_logits,\n                               scope=scope)\n\n  # Build the portion of the Graph calculating the losses. 
Note that we will\n  # assemble the total_loss using a custom function below.\n  split_batch_size = images.get_shape().as_list()[0]\n  inception.loss(logits, labels, batch_size=split_batch_size)\n\n  # Assemble all of the losses for the current tower only.\n  losses = tf.get_collection(slim.losses.LOSSES_COLLECTION, scope)\n\n  # Calculate the total loss for the current tower.\n  regularization_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)\n  total_loss = tf.add_n(losses + regularization_losses, name='total_loss')\n\n  # Compute the moving average of all individual losses and the total loss.\n  loss_averages = tf.train.ExponentialMovingAverage(0.9, name='avg')\n  loss_averages_op = loss_averages.apply(losses + [total_loss])\n\n  # Attach a scalar summary to all individual losses and the total loss; do the\n  # same for the averaged version of the losses.\n  for l in losses + [total_loss]:\n    # Remove 'tower_[0-9]/' from the name in case this is a multi-GPU training\n    # session. This helps the clarity of presentation on TensorBoard.\n    loss_name = re.sub('%s_[0-9]*/' % inception.TOWER_NAME, '', l.op.name)\n    # Name each loss as '(raw)' and name the moving average version of the loss\n    # as the original loss name.\n    tf.scalar_summary(loss_name + ' (raw)', l)\n    tf.scalar_summary(loss_name, loss_averages.average(l))\n\n  with tf.control_dependencies([loss_averages_op]):\n    total_loss = tf.identity(total_loss)\n  return total_loss\n\n\ndef _average_gradients(tower_grads):\n  \"\"\"Calculate the average gradient for each shared variable across all towers.\n\n  Note that this function provides a synchronization point across all towers.\n\n  Args:\n    tower_grads: List of lists of (gradient, variable) tuples. The outer list\n      is over individual gradients. 
The inner list is over the gradient\n      calculation for each tower.\n  Returns:\n     List of pairs of (gradient, variable) where the gradient has been averaged\n     across all towers.\n  \"\"\"\n  average_grads = []\n  for grad_and_vars in zip(*tower_grads):\n    # Note that each grad_and_vars looks like the following:\n    #   ((grad0_gpu0, var0_gpu0), ... , (grad0_gpuN, var0_gpuN))\n    grads = []\n    for g, _ in grad_and_vars:\n      # Add 0 dimension to the gradients to represent the tower.\n      expanded_g = tf.expand_dims(g, 0)\n\n      # Append on a 'tower' dimension which we will average over below.\n      grads.append(expanded_g)\n\n    # Average over the 'tower' dimension.\n    grad = tf.concat(0, grads)\n    grad = tf.reduce_mean(grad, 0)\n\n    # Keep in mind that the Variables are redundant because they are shared\n    # across towers. So .. we will just return the first tower's pointer to\n    # the Variable.\n    v = grad_and_vars[0][1]\n    grad_and_var = (grad, v)\n    average_grads.append(grad_and_var)\n  return average_grads\n\n\ndef train(dataset):\n  \"\"\"Train on dataset for a number of steps.\"\"\"\n  with tf.Graph().as_default(), tf.device('/cpu:0'):\n    # Create a variable to count the number of train() calls. 
This equals the\n    # number of batches processed * FLAGS.num_gpus.\n    global_step = tf.get_variable(\n        'global_step', [],\n        initializer=tf.constant_initializer(0), trainable=False)\n\n    # Calculate the learning rate schedule.\n    num_batches_per_epoch = (dataset.num_examples_per_epoch() /\n                             FLAGS.batch_size)\n    decay_steps = int(num_batches_per_epoch * FLAGS.num_epochs_per_decay)\n\n    # Decay the learning rate exponentially based on the number of steps.\n    lr = tf.train.exponential_decay(FLAGS.initial_learning_rate,\n                                    global_step,\n                                    decay_steps,\n                                    FLAGS.learning_rate_decay_factor,\n                                    staircase=True)\n\n    # Create an optimizer that performs gradient descent.\n    opt = tf.train.RMSPropOptimizer(lr, RMSPROP_DECAY,\n                                    momentum=RMSPROP_MOMENTUM,\n                                    epsilon=RMSPROP_EPSILON)\n\n    # Get images and labels for ImageNet and split the batch across GPUs.\n    assert FLAGS.batch_size % FLAGS.num_gpus == 0, (\n        'Batch size must be divisible by number of GPUs')\n    split_batch_size = int(FLAGS.batch_size / FLAGS.num_gpus)\n\n    # Override the number of preprocessing threads to account for the increased\n    # number of GPU towers.\n    num_preprocess_threads = FLAGS.num_preprocess_threads * FLAGS.num_gpus\n    images, labels = image_processing.distorted_inputs(\n        dataset,\n        num_preprocess_threads=num_preprocess_threads)\n\n    input_summaries = copy.copy(tf.get_collection(tf.GraphKeys.SUMMARIES))\n\n    # Number of classes in the Dataset label set plus 1.\n    # Label 0 is reserved for an (unused) background class.\n    num_classes = dataset.num_classes() + 1\n    \n     # Split the batch of images and labels for towers.\n    images_splits = tf.split(0, FLAGS.num_gpus, images)\n    labels_splits 
= tf.split(0, FLAGS.num_gpus, labels)\n\n    # Calculate the gradients for each model tower.\n    tower_grads = []\n    for i in xrange(FLAGS.num_gpus):\n      with tf.device('/gpu:%d' % i):\n        with tf.name_scope('%s_%d' % (inception.TOWER_NAME, i)) as scope:\n          # Force all Variables to reside on the CPU.\n          with slim.arg_scope([slim.variables.variable], device='/cpu:0'):\n            # Calculate the loss for one tower of the ImageNet model. This\n            # function constructs the entire ImageNet model but shares the\n            # variables across all towers.\n            loss = _tower_loss(images_splits[i], labels_splits[i], num_classes,\n                               scope)\n\n          # Reuse variables for the next tower.\n          tf.get_variable_scope().reuse_variables()\n\n          # Retain the summaries from the final tower.\n          summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, scope)\n\n          # Retain the Batch Normalization updates operations only from the\n          # final tower. Ideally, we should grab the updates from all towers\n          # but these stats accumulate extremely fast so we can ignore the\n          # other stats from the other towers without significant detriment.\n          batchnorm_updates = tf.get_collection(slim.ops.UPDATE_OPS_COLLECTION,\n                                                scope)\n\n          # Calculate the gradients for the batch of data on this ImageNet\n          # tower.\n          grads = opt.compute_gradients(loss)\n\n          # Keep track of the gradients across all towers.\n          tower_grads.append(grads)\n\n    # We must calculate the mean of each gradient. 
Note that this is the\n    # synchronization point across all towers.\n    grads = _average_gradients(tower_grads)\n\n    # Add summaries for the input processing and global_step.\n    summaries.extend(input_summaries)\n\n    # Add a summary to track the learning rate.\n    summaries.append(tf.scalar_summary('learning_rate', lr))\n\n    # Add histograms for gradients.\n    for grad, var in grads:\n      if grad is not None:\n        summaries.append(\n            tf.histogram_summary(var.op.name + '/gradients', grad))\n\n    # Apply the gradients to adjust the shared variables.\n    apply_gradient_op = opt.apply_gradients(grads, global_step=global_step)\n\n    # Add histograms for trainable variables.\n    for var in tf.trainable_variables():\n      summaries.append(tf.histogram_summary(var.op.name, var))\n\n    # Track the moving averages of all trainable variables.\n    # Note that we maintain a \"double-average\" of the BatchNormalization\n    # global statistics. This is more complicated than it need be but we employ\n    # this for backward-compatibility with our previous models.\n    variable_averages = tf.train.ExponentialMovingAverage(\n        inception.MOVING_AVERAGE_DECAY, global_step)\n\n    # Another possibility is to use tf.slim.get_variables().\n    variables_to_average = (tf.trainable_variables() +\n                            tf.moving_average_variables())\n    variables_averages_op = variable_averages.apply(variables_to_average)\n\n    # Group all updates into a single train op.\n    batchnorm_updates_op = tf.group(*batchnorm_updates)\n    train_op = tf.group(apply_gradient_op, variables_averages_op,\n                        batchnorm_updates_op)\n\n    # Create a saver.\n    saver = tf.train.Saver(tf.all_variables())\n\n    # Build the summary operation from the last tower summaries.\n    summary_op = tf.merge_summary(summaries)\n\n    # Build an initialization operation to run below.\n    init = tf.initialize_all_variables()\n\n    # Start 
running operations on the Graph. allow_soft_placement must be set to\n    # True to build towers on GPU, as some of the ops do not have GPU\n    # implementations.\n    sess = tf.Session(config=tf.ConfigProto(\n        allow_soft_placement=True,\n        log_device_placement=FLAGS.log_device_placement))\n    sess.run(init)\n\n    if FLAGS.pretrained_model_checkpoint_path:\n      assert tf.gfile.Exists(FLAGS.pretrained_model_checkpoint_path)\n      variables_to_restore = tf.get_collection(\n          slim.variables.VARIABLES_TO_RESTORE)\n      restorer = tf.train.Saver(variables_to_restore)\n      restorer.restore(sess, FLAGS.pretrained_model_checkpoint_path)\n      print('%s: Pre-trained model restored from %s' %\n            (datetime.now(), FLAGS.pretrained_model_checkpoint_path))\n\n    # Start the queue runners.\n    tf.train.start_queue_runners(sess=sess)\n\n    summary_writer = tf.train.SummaryWriter(\n        FLAGS.train_dir,\n        graph_def=sess.graph.as_graph_def(add_shapes=True))\n\n    for step in xrange(FLAGS.max_steps):\n      start_time = time.time()\n      _, loss_value = sess.run([train_op, loss])\n      duration = time.time() - start_time\n\n      assert not np.isnan(loss_value), 'Model diverged with loss = NaN'\n\n      if step % 10 == 0:\n        examples_per_sec = FLAGS.batch_size / float(duration)\n        format_str = ('%s: step %d, loss = %.2f (%.1f examples/sec; %.3f '\n                      'sec/batch)')\n        print(format_str % (datetime.now(), step, loss_value,\n                            examples_per_sec, duration))\n\n      if step % 100 == 0:\n        summary_str = sess.run(summary_op)\n        summary_writer.add_summary(summary_str, step)\n\n      # Save the model checkpoint periodically.\n      if step % 5000 == 0 or (step + 1) == FLAGS.max_steps:\n        checkpoint_path = os.path.join(FLAGS.train_dir, 'model.ckpt')\n        saver.save(sess, checkpoint_path, global_step=step)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/BUILD",
    "content": "# Description:\n#   Contains the operations and nets for building TensorFlow-Slim models.\n\npackage(default_visibility = [\"//inception:internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npy_library(\n    name = \"scopes\",\n    srcs = [\"scopes.py\"],\n)\n\npy_test(\n    name = \"scopes_test\",\n    size = \"small\",\n    srcs = [\"scopes_test.py\"],\n    deps = [\n        \":scopes\",\n    ],\n)\n\npy_library(\n    name = \"variables\",\n    srcs = [\"variables.py\"],\n    deps = [\n        \":scopes\",\n    ],\n)\n\npy_test(\n    name = \"variables_test\",\n    size = \"small\",\n    srcs = [\"variables_test.py\"],\n    deps = [\n        \":variables\",\n    ],\n)\n\npy_library(\n    name = \"losses\",\n    srcs = [\"losses.py\"],\n)\n\npy_test(\n    name = \"losses_test\",\n    size = \"small\",\n    srcs = [\"losses_test.py\"],\n    deps = [\n        \":losses\",\n    ],\n)\n\npy_library(\n    name = \"ops\",\n    srcs = [\"ops.py\"],\n    deps = [\n        \":losses\",\n        \":scopes\",\n        \":variables\",\n    ],\n)\n\npy_test(\n    name = \"ops_test\",\n    size = \"small\",\n    srcs = [\"ops_test.py\"],\n    deps = [\n        \":ops\",\n        \":variables\",\n    ],\n)\n\npy_library(\n    name = \"inception\",\n    srcs = [\"inception_model.py\"],\n    deps = [\n        \":ops\",\n        \":scopes\",\n    ],\n)\n\npy_test(\n    name = \"inception_test\",\n    size = \"medium\",\n    srcs = [\"inception_test.py\"],\n    deps = [\n        \":inception\",\n    ],\n)\n\npy_library(\n    name = \"slim\",\n    srcs = [\"slim.py\"],\n    deps = [\n        \":inception\",\n        \":losses\",\n        \":ops\",\n        \":scopes\",\n        \":variables\",\n    ],\n)\n\npy_test(\n    name = \"collections_test\",\n    size = \"small\",\n    srcs = [\"collections_test.py\"],\n    deps = [\n        \":slim\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/README.md",
    "content": "# TensorFlow-Slim\n\nTF-Slim is a lightweight library for defining, training and evaluating models in\nTensorFlow. It enables defining complex networks quickly and concisely while\nkeeping a model's architecture transparent and its hyperparameters explicit.\n\n[TOC]\n\n## Teaser\n\nAs a demonstration of the simplicity of using TF-Slim, compare the simplicity of\nthe code necessary for defining the entire [VGG]\n(http://www.robots.ox.ac.uk/~vgg/research/very_deep/) network using TF-Slim to\nthe lengthy and verbose nature of defining just the first three layers (out of\n16) using native tensorflow:\n\n```python{.good}\n# VGG16 in TF-Slim.\ndef vgg16(inputs):\n  with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], stddev=0.01, weight_decay=0.0005):\n    net = slim.ops.repeat_op(2, inputs, slim.ops.conv2d, 64, [3, 3], scope='conv1')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool1')\n    net = slim.ops.repeat_op(2, net, slim.ops.conv2d, 128, [3, 3], scope='conv2')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool2')\n    net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 256, [3, 3], scope='conv3')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool3')\n    net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 512, [3, 3], scope='conv4')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool4')\n    net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 512, [3, 3], scope='conv5')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool5')\n    net = slim.ops.flatten(net, scope='flatten5')\n    net = slim.ops.fc(net, 4096, scope='fc6')\n    net = slim.ops.dropout(net, 0.5, scope='dropout6')\n    net = slim.ops.fc(net, 4096, scope='fc7')\n    net = slim.ops.dropout(net, 0.5, scope='dropout7')\n    net = slim.ops.fc(net, 1000, activation=None, scope='fc8')\n  return net\n```\n\n```python{.bad}\n# Layers 1-3 (out of 16) of VGG16 in native tensorflow.\ndef vgg16(inputs):\n  with tf.name_scope('conv1_1') as scope:\n    kernel = 
tf.Variable(tf.truncated_normal([3, 3, 3, 64], dtype=tf.float32, stddev=1e-1), name='weights')\n    conv = tf.nn.conv2d(inputs, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = tf.Variable(tf.constant(0.0, shape=[64], dtype=tf.float32), trainable=True, name='biases')\n    bias = tf.nn.bias_add(conv, biases)\n    conv1 = tf.nn.relu(bias, name=scope)\n  with tf.name_scope('conv1_2') as scope:\n    kernel = tf.Variable(tf.truncated_normal([3, 3, 64, 64], dtype=tf.float32, stddev=1e-1), name='weights')\n    conv = tf.nn.conv2d(conv1, kernel, [1, 1, 1, 1], padding='SAME')\n    biases = tf.Variable(tf.constant(0.0, shape=[64], dtype=tf.float32), trainable=True, name='biases')\n    bias = tf.nn.bias_add(conv, biases)\n    conv1 = tf.nn.relu(bias, name=scope)\n  with tf.name_scope('pool1'):\n    pool1 = tf.nn.max_pool(conv1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='VALID', name='pool1')\n```\n\n## Why TF-Slim?\n\nTF-Slim offers several advantages over just the built-in tensorflow libraries:\n\n*   Allows one to define models much more compactly by eliminating boilerplate\n    code. This is accomplished through the use of [argument scoping](scopes.py)\n    and numerous high level [operations](ops.py). These tools increase\n    readability and maintainability, reduce the likelihood of an error from\n    copy-and-pasting hyperparameter values and simplify hyperparameter tuning.\n*   Makes developing models simple by providing commonly used [loss functions]\n    (losses.py).\n*   Provides a concise [definition](inception_model.py) of [Inception v3]\n    (http://arxiv.org/abs/1512.00567) network architecture ready to be used\n    out-of-the-box or subsumed into new models.\n\nAdditionally, TF-Slim was designed with several principles in mind:\n\n*   The various modules of TF-Slim (scopes, variables, ops, losses) are\n    independent. 
This flexibility allows users to pick and choose components of\n    TF-Slim completely à la carte.\n*   TF-Slim is written using a Functional Programming style. That means it's\n    super-lightweight and can be used right alongside any of TensorFlow's native\n    operations.\n*   Makes re-using network architectures easy. This allows users to build new\n    networks on top of existing ones as well as to fine-tune pre-trained models\n    on new tasks.\n\n## What are the various components of TF-Slim?\n\nTF-Slim is composed of several parts which were designed to exist independently.\nThese include:\n\n*   [scopes.py](./scopes.py): provides a new scope named `arg_scope` that allows\n    a user to define default arguments for specific operations within that\n    scope.\n*   [variables.py](./variables.py): provides convenience wrappers for variable\n    creation and manipulation.\n*   [ops.py](./ops.py): provides high level operations for building models using\n    tensorflow.\n*   [losses.py](./losses.py): contains commonly used loss functions.\n\n## Defining Models\n\nModels can be succinctly defined using TF-Slim by combining its variables,\noperations and scopes. Each of these elements is defined below.\n\n### Variables\n\nCreating [`Variables`](https://www.tensorflow.org/how_tos/variables/index.html)\nin native tensorflow requires either a predefined value or an initialization\nmechanism (random, normally distributed). Furthermore, if a variable needs to be\ncreated on a specific device, such as a GPU, the specification must be [made\nexplicit](https://www.tensorflow.org/how_tos/using_gpu/index.html). 
To alleviate\nthe code required for variable creation, TF-Slim provides a set of thin wrapper\nfunctions in [variables.py](./variables.py) which allow callers to easily define\nvariables.\n\nFor example, to create a `weight` variable, initialize it using a truncated\nnormal distribution, regularize it with an `l2_loss` and place it on the `CPU`,\none need only declare the following:\n\n```python\nweights = variables.variable('weights',\n                             shape=[10, 10, 3 , 3],\n                             initializer=tf.truncated_normal_initializer(stddev=0.1),\n                             regularizer=lambda t: losses.l2_loss(t, weight=0.05),\n                             device='/cpu:0')\n```\n\nIn addition to the functionality provided by `tf.Variable`, `slim.variables`\nkeeps track of the variables created by `slim.ops` to define a model, which\nallows one to distinguish variables that belong to the model versus other\nvariables.\n\n```python\n# Get all the variables defined by the model.\nmodel_variables = slim.variables.get_variables()\n\n# Get all the variables with the same given name, i.e. 'weights', 'biases'.\nweights = slim.variables.get_variables_by_name('weights')\nbiases = slim.variables.get_variables_by_name('biases')\n\n# Get all the variables in VARIABLES_TO_RESTORE collection.\nvariables_to_restore = tf.get_collection(slim.variables.VARIABLES_TO_RESTORE)\n\n\nweights = variables.variable('weights',\n                             shape=[10, 10, 3 , 3],\n                             initializer=tf.truncated_normal_initializer(stddev=0.1),\n                             regularizer=lambda t: losses.l2_loss(t, weight=0.05),\n                             device='/cpu:0')\n```\n\n### Operations (Layers)\n\nWhile the set of TensorFlow operations is quite extensive, builders of neural\nnetworks typically think of models in terms of \"layers\". 
A layer, such as a\nConvolutional Layer, a Fully Connected Layer or a BatchNorm Layer, is more\nabstract than a single TensorFlow operation and typically involves many such\noperations. For example, a Convolutional Layer in a neural network is built\nusing several steps:\n\n1.  Creating the weight variables\n2.  Creating the bias variables\n3.  Convolving the weights with the input from the previous layer\n4.  Adding the biases to the result of the convolution.\n\nIn Python code this can be rather laborious:\n\n```python\ninput = ...\nwith tf.name_scope('conv1_1') as scope:\n  kernel = tf.Variable(tf.truncated_normal([3, 3, 64, 128], dtype=tf.float32,\n                                           stddev=1e-1), name='weights')\n  conv = tf.nn.conv2d(input, kernel, [1, 1, 1, 1], padding='SAME')\n  biases = tf.Variable(tf.constant(0.0, shape=[128], dtype=tf.float32),\n                       trainable=True, name='biases')\n  bias = tf.nn.bias_add(conv, biases)\n  conv1 = tf.nn.relu(bias, name=scope)\n```\n\nTo alleviate the need to duplicate this code repeatedly, TF-Slim provides a\nnumber of convenient operations defined at the (more abstract) level of neural\nnetwork layers. For example, compare the code above to an invocation of the\nTF-Slim code:\n\n```python\ninput = ...\nnet = slim.ops.conv2d(input, 128, [3, 3], scope='conv1_1')\n```\n\nTF-Slim provides numerous operations used in building neural networks which\nroughly correspond to such layers. 
These include:\n\nLayer                 | TF-Slim Op\n--------------------- | ------------------------\nConvolutional Layer   | [ops.conv2d](ops.py)\nFully Connected Layer | [ops.fc](ops.py)\nBatchNorm layer       | [ops.batch_norm](ops.py)\nMax Pooling Layer     | [ops.max_pool](ops.py)\nAvg Pooling Layer     | [ops.avg_pool](ops.py)\nDropout Layer         | [ops.dropout](ops.py)\n\n[ops.py](./ops.py) also includes operations that are not really \"layers\" per se,\nbut are often used to manipulate hidden unit representations during inference:\n\nOperation | TF-Slim Op\n--------- | ---------------------\nFlatten   | [ops.flatten](ops.py)\n\nTF-Slim also provides a meta-operation called `repeat_op` that allows one to\nrepeatedly perform the same operation. Consider the following snippet from the\n[VGG](https://www.robots.ox.ac.uk/~vgg/research/very_deep/) network whose layers\nperform several convolutions in a row between pooling layers:\n\n```python\nnet = ...\nnet = slim.ops.conv2d(net, 256, [3, 3], scope='conv3_1')\nnet = slim.ops.conv2d(net, 256, [3, 3], scope='conv3_2')\nnet = slim.ops.conv2d(net, 256, [3, 3], scope='conv3_3')\nnet = slim.ops.max_pool(net, [2, 2], scope='pool3')\n```\n\nThis clear duplication of code can be removed via a standard loop:\n\n```python\nnet = ...\nfor i in range(3):\n  net = slim.ops.conv2d(net, 256, [3, 3], scope='conv3_%d' % (i+1))\nnet = slim.ops.max_pool(net, [2, 2], scope='pool3')\n```\n\nWhile this does reduce the amount of duplication, it can be made even cleaner by\nusing the `RepeatOp`:\n\n```python\nnet = slim.ops.repeat_op(3, net, slim.ops.conv2d, 256, [3, 3], scope='conv3')\nnet = slim.ops.max_pool(net, [2, 2], scope='pool3')\n```\n\nNotice that the RepeatOp not only applies the same arguments in-line, it is also\nsmart enough to unroll the scopes such that the scopes assigned to each\nsubsequent call of `ops.conv2d` are appended with an underscore and iteration\nnumber. 
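The scope-unrolling behavior can be sketched in a few lines of plain Python (a hypothetical stand-in, not TF-Slim's actual `repeat_op` implementation):

```python
# Minimal sketch of scope unrolling: apply `op` n times, appending an
# underscore and the 1-based iteration number to the base scope name.
def repeat_op(n, inputs, op, *args, scope=None, **kwargs):
    net = inputs
    for i in range(n):
        net = op(net, *args, scope='%s_%d' % (scope, i + 1), **kwargs)
    return net

# A toy "layer" that just records the scope it was called with.
applied_scopes = []
def fake_conv2d(net, num_filters, kernel, scope=None):
    applied_scopes.append(scope)
    return net

net = repeat_op(3, 'input', fake_conv2d, 256, [3, 3], scope='conv3')
print(applied_scopes)  # ['conv3_1', 'conv3_2', 'conv3_3']
```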
More concretely, the scopes in the example above would be 'conv3_1',\n'conv3_2' and 'conv3_3'.\n\n### Scopes\n\nIn addition to the types of scope mechanisms in TensorFlow\n([name_scope](https://www.tensorflow.org/api_docs/python/framework.html#name_scope),\n[op_scope](https://www.tensorflow.org/api_docs/python/framework.html#op_scope),\n[variable_scope](https://www.tensorflow.org/api_docs/python/state_ops.html#variable_scope),\n[variable_op_scope](https://www.tensorflow.org/api_docs/python/state_ops.html#variable_op_scope)),\nTF-Slim adds a new scoping mechanism called \"argument scope\" or\n[arg_scope](scopes.py). This new scope allows a user to specify one or more operations and\na set of arguments which will be passed to each of the operations defined in the\n`arg_scope`. This functionality is best illustrated by example. Consider the\nfollowing code snippet:\n\n```python\nnet = slim.ops.conv2d(inputs, 64, [11, 11], 4, padding='SAME', stddev=0.01, weight_decay=0.0005, scope='conv1')\nnet = slim.ops.conv2d(net, 128, [11, 11], padding='VALID', stddev=0.01, weight_decay=0.0005, scope='conv2')\nnet = slim.ops.conv2d(net, 256, [11, 11], padding='SAME', stddev=0.01, weight_decay=0.0005, scope='conv3')\n```\n\nIt should be clear that these three convolution layers share many of the same\nhyperparameters. Two have the same padding, and all three have the same weight_decay\nand standard deviation of their weights. Not only do the duplicated values make\nthe code more difficult to read, they also add the additional burden to the writer\nof needing to double-check that all of the values are identical in each step. 
One\nsolution would be to specify default values using variables:\n\n```python\npadding='SAME'\nstddev=0.01\nweight_decay=0.0005\nnet = slim.ops.conv2d(inputs, 64, [11, 11], 4, padding=padding, stddev=stddev, weight_decay=weight_decay, scope='conv1')\nnet = slim.ops.conv2d(net, 128, [11, 11], padding='VALID', stddev=stddev, weight_decay=weight_decay, scope='conv2')\nnet = slim.ops.conv2d(net, 256, [11, 11], padding=padding, stddev=stddev, weight_decay=weight_decay, scope='conv3')\n```\n\nThis solution ensures that all three convolutions share the exact same variable\nvalues but doesn't reduce the code clutter. By using an `arg_scope`, we can both\nensure that each layer uses the same values and simplify the code:\n\n```python\nwith slim.arg_scope([slim.ops.conv2d], padding='SAME', stddev=0.01, weight_decay=0.0005):\n  net = slim.ops.conv2d(inputs, 64, [11, 11], scope='conv1')\n  net = slim.ops.conv2d(net, 128, [11, 11], padding='VALID', scope='conv2')\n  net = slim.ops.conv2d(net, 256, [11, 11], scope='conv3')\n```\n\nAs the example illustrates, the use of arg_scope makes the code cleaner, simpler\nand easier to maintain. Notice that while argument values are specified in the\narg_scope, they can be overwritten locally. 
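The defaulting-and-override mechanics can be mimicked with a minimal pure-Python analogue (hypothetical helpers, not TF-Slim's implementation): a stack of default kwargs that decorated ops consult, with call-site arguments taking precedence.

```python
import contextlib

_ARG_STACK = [{}]  # stack of {op name: default kwargs}

@contextlib.contextmanager
def arg_scope(ops, **defaults):
    # Push a frame that adds `defaults` for each listed op.
    new_frame = dict(_ARG_STACK[-1])
    for op in ops:
        merged = dict(new_frame.get(op.__name__, {}))
        merged.update(defaults)
        new_frame[op.__name__] = merged
    _ARG_STACK.append(new_frame)
    try:
        yield
    finally:
        _ARG_STACK.pop()

def add_arg_scope(op):
    # Wrap `op` so it picks up scoped defaults; explicit kwargs win.
    def wrapped(*args, **kwargs):
        merged = dict(_ARG_STACK[-1].get(op.__name__, {}))
        merged.update(kwargs)
        return op(*args, **merged)
    wrapped.__name__ = op.__name__
    return wrapped

@add_arg_scope
def conv2d(net, padding='VALID', stddev=1.0):
    return (net, padding, stddev)

with arg_scope([conv2d], padding='SAME', stddev=0.01):
    a = conv2d('x')                   # inherits both defaults
    b = conv2d('x', padding='VALID')  # local override wins
print(a)  # ('x', 'SAME', 0.01)
print(b)  # ('x', 'VALID', 0.01)
```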
In particular, while the padding\nargument has been set to 'SAME', the second convolution overrides it with the\nvalue of 'VALID'.\n\nOne can also nest `arg_scope`s and use multiple operations in the same scope.\nFor example:\n\n```python\nwith arg_scope([slim.ops.conv2d, slim.ops.fc], stddev=0.01, weight_decay=0.0005):\n  with arg_scope([slim.ops.conv2d], padding='SAME'), slim.arg_scope([slim.ops.fc], bias=1.0):\n    net = slim.ops.conv2d(inputs, 64, [11, 11], 4, padding='VALID', scope='conv1')\n    net = slim.ops.conv2d(net, 256, [5, 5], stddev=0.03, scope='conv2')\n    net = slim.ops.flatten(net)\n    net = slim.ops.fc(net, 1000, activation=None, scope='fc')\n```\n\nIn this example, the first `arg_scope` applies the same `stddev` and\n`weight_decay` arguments to the `conv2d` and `fc` ops in its scope. In the\nsecond `arg_scope`, additional default arguments to `conv2d` only are specified.\n\nIn addition to `arg_scope`, TF-Slim provides several decorators that wrap the\nuse of tensorflow arg scopes. These include `@AddArgScope`, `@AddNameScope`,\n`@AddVariableScope`, `@AddOpScope` and `@AddVariableOpScope`. To illustrate\ntheir use, consider the following example.\n\n```python\ndef MyNewOp(inputs):\n  varA = ...\n  varB = ...\n  outputs = tf.mul(varA, inputs) + varB\n  return outputs\n\n```\n\nIn this example, the user has created a new op which creates two variables. 
To\nensure that these variables exist within a certain variable scope (to avoid\ncollisions with variables with the same name), in standard TF, the op must be\ncalled within a variable scope:\n\n```python\ninputs = ...\nwith tf.variable_scope('layer1'):\n  outputs = MyNewOp(inputs)\n```\n\nAs an alternative, one can use TF-Slim's decorators to decorate the function and\nsimplify the call:\n\n```python\n@AddVariableScope\ndef MyNewOp(inputs):\n  ...\n  return outputs\n\n\ninputs = ...\noutputs = MyNewOp('layer1', inputs)\n```\n\nThe `@AddVariableScope` decorator simply applies the `tf.variable_scope` scoping\nto the called function, taking \"layer1\" as its argument. This allows the code to\nbe written more concisely.\n\n### Losses\n\nThe loss function defines a quantity that we want to minimize. For\nclassification problems, this is typically the cross entropy between the true\n(one-hot) distribution and the predicted probability distribution across\nclasses. For regression problems, this is often the sum-of-squares difference\nbetween the predicted and true values.\n\nCertain models, such as multi-task learning models, require the use of multiple\nloss functions simultaneously. In other words, the loss function ultimately being\nminimized is the sum of various other loss functions. For example, consider a\nmodel that predicts both the type of scene in an image as well as the depth from\nthe camera of each pixel. This model's loss function would be the sum of the\nclassification loss and the depth prediction loss.\n\nTF-Slim provides an easy-to-use mechanism for defining and keeping track of loss\nfunctions via the [losses.py](./losses.py) module. 
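The bookkeeping behind such a loss collection can be mimicked in plain Python (hypothetical scalar stand-ins; the real loss functions operate on tensors): each loss constructor registers its result in a shared collection, so the total can be formed without tracking every term by hand.

```python
# Shared collection that every loss constructor appends to.
LOSSES_COLLECTION = []

def cross_entropy_loss(value):
    LOSSES_COLLECTION.append(value)
    return value

def l2_loss(value):
    LOSSES_COLLECTION.append(value)
    return value

def get_total_loss():
    # Summing the collection is equivalent to adding the terms by hand.
    return sum(LOSSES_COLLECTION)

classification_loss = cross_entropy_loss(0.5)  # stand-in scalar "loss"
depth_loss = l2_loss(0.25)

total_manual = classification_loss + depth_loss
total_collected = get_total_loss()
print(total_manual == total_collected)  # True
```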
Consider the simple case\nwhere we want to train the VGG network:\n\n```python\n# Load the images and labels.\nimages, labels = ...\n\n# Create the model.\npredictions = ...\n\n# Define the loss function and get the total loss.\nloss = losses.cross_entropy_loss(predictions, labels)\n```\n\nIn this example, we start by creating the model (using TF-Slim's VGG\nimplementation), and add the standard classification loss. Now, let's turn to the\ncase where we have a multi-task model that produces multiple outputs:\n\n```python\n# Load the images and labels.\nimages, scene_labels, depth_labels = ...\n\n# Create the model.\nscene_predictions, depth_predictions = CreateMultiTaskModel(images)\n\n# Define the loss functions and get the total loss.\nclassification_loss = slim.losses.cross_entropy_loss(scene_predictions, scene_labels)\nsum_of_squares_loss = slim.losses.l2loss(depth_predictions - depth_labels)\n\n# The following two lines have the same effect:\ntotal_loss1 = classification_loss + sum_of_squares_loss\ntotal_loss2 = tf.add_n(tf.get_collection(slim.losses.LOSSES_COLLECTION))\n```\n\nIn this example, we have two losses which we add by calling\n`losses.cross_entropy_loss` and `losses.l2loss`. We can obtain the\ntotal loss by adding them together (`total_loss1`) or by summing over the loss\ncollection (`total_loss2`). How does this work? When you create a loss function via\nTF-Slim, TF-Slim adds the loss to a special TensorFlow collection of loss\nfunctions. This enables you to either manage the total loss manually, or allow\nTF-Slim to manage it for you.\n\nWhat if you want to let TF-Slim manage the losses for you but have a custom loss\nfunction? [losses.py](./losses.py) also has a function that adds this loss to\nTF-Slim's collection. 
For example:\n\n```python\n# Load the images and labels.\nimages, scene_labels, depth_labels, pose_labels = ...\n\n# Create the model.\nscene_predictions, depth_predictions, pose_predictions = CreateMultiTaskModel(images)\n\n# Define the loss functions and get the total loss.\nclassification_loss = slim.losses.cross_entropy_loss(scene_predictions, scene_labels)\nsum_of_squares_loss = slim.losses.l2loss(depth_predictions - depth_labels)\npose_loss = MyCustomLossFunction(pose_predictions, pose_labels)\ntf.add_to_collection(slim.losses.LOSSES_COLLECTION, pose_loss) # Letting TF-Slim know about the additional loss.\n\n# The following two lines have the same effect:\ntotal_loss1 = classification_loss + sum_of_squares_loss + pose_loss\ntotal_loss2 = losses.GetTotalLoss()\n```\n\nIn this example, we can again either produce the total loss function manually or\nlet TF-Slim know about the additional loss and let TF-Slim handle the losses.\n\n## Putting the Pieces Together\n\nBy combining TF-Slim Variables, Operations and scopes, we can write what would\nnormally be a very complex network with very few lines of code. 
For example, the entire\n[VGG](https://www.robots.ox.ac.uk/~vgg/research/very_deep/) architecture can be\ndefined with just the following snippet:\n\n```python\nwith arg_scope([slim.ops.conv2d, slim.ops.fc], stddev=0.01, weight_decay=0.0005):\n  net = slim.ops.repeat_op(2, inputs, slim.ops.conv2d, 64, [3, 3], scope='conv1')\n  net = slim.ops.max_pool(net, [2, 2], scope='pool1')\n  net = slim.ops.repeat_op(2, net, slim.ops.conv2d, 128, [3, 3], scope='conv2')\n  net = slim.ops.max_pool(net, [2, 2], scope='pool2')\n  net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 256, [3, 3], scope='conv3')\n  net = slim.ops.max_pool(net, [2, 2], scope='pool3')\n  net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 512, [3, 3], scope='conv4')\n  net = slim.ops.max_pool(net, [2, 2], scope='pool4')\n  net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 512, [3, 3], scope='conv5')\n  net = slim.ops.max_pool(net, [2, 2], scope='pool5')\n  net = slim.ops.flatten(net, scope='flatten5')\n  net = slim.ops.fc(net, 4096, scope='fc6')\n  net = slim.ops.dropout(net, 0.5, scope='dropout6')\n  net = slim.ops.fc(net, 4096, scope='fc7')\n  net = slim.ops.dropout(net, 0.5, scope='dropout7')\n  net = slim.ops.fc(net, 1000, activation=None, scope='fc8')\nreturn net\n```\n\n## Re-using previously defined network architectures and pre-trained models\n\n### Brief Recap on Restoring Variables from a Checkpoint\n\nAfter a model has been trained, it can be restored using `tf.train.Saver()`,\nwhich restores `Variables` from a given checkpoint. 
For many cases,\n`tf.train.Saver()` provides a simple mechanism to restore all or just a few\nvariables.\n\n```python\n# Create some variables.\nv1 = tf.Variable(..., name=\"v1\")\nv2 = tf.Variable(..., name=\"v2\")\n...\n# Add ops to restore all the variables.\nrestorer = tf.train.Saver()\n\n# Add ops to restore some variables.\nrestorer = tf.train.Saver([v1, v2])\n\n# Later, launch the model, use the saver to restore variables from disk, and\n# do some work with the model.\nwith tf.Session() as sess:\n  # Restore variables from disk.\n  restorer.restore(sess, \"/tmp/model.ckpt\")\n  print(\"Model restored.\")\n  # Do some work with the model\n  ...\n```\n\nSee the [Restoring Variables](https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html#restoring-variables)\nand [Choosing which Variables to Save and Restore](https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html#choosing-which-variables-to-save-and-restore)\nsections of the [Variables](https://www.tensorflow.org/versions/r0.7/how_tos/variables/index.html) page for\nmore details.\n\n### Using slim.variables to Track which Variables need to be Restored\n\nIt is often desirable to fine-tune a pre-trained model on an entirely new\ndataset or even a new task. In these situations, one must specify which layers\nof the model should be reused (and consequently loaded from a checkpoint) and\nwhich layers are new. Indicating which variables or layers should be restored is\na process that quickly becomes cumbersome when done manually.\n\nTo help keep track of which variables to restore, `slim.variables` provides a\n`restore` argument when creating each Variable. By default, all variables are\nmarked as `restore=True`, which results in all variables defined by the model\nbeing restored.\n\n```python\n# Create some variables.\nv1 = slim.variables.variable(name=\"v1\", ..., restore=False)\nv2 = slim.variables.variable(name=\"v2\", ...) 
# By default restore=True\n...\n# Get list of variables to restore (which contains only 'v2')\nvariables_to_restore = tf.get_collection(slim.variables.VARIABLES_TO_RESTORE)\nrestorer = tf.train.Saver(variables_to_restore)\nwith tf.Session() as sess:\n  # Restore variables from disk.\n  restorer.restore(sess, \"/tmp/model.ckpt\")\n  print(\"Model restored.\")\n  # Do some work with the model\n  ...\n```\n\nAdditionally, every layer in `slim.ops` that creates slim.variables (such as\n`slim.ops.conv2d`, `slim.ops.fc`, `slim.ops.batch_norm`) also has a `restore`\nargument which controls whether the variables created by that layer should be\nrestored or not.\n\n```python\n# Create a small network.\nnet = slim.ops.conv2d(images, 32, [7, 7], stride=2, scope='conv1')\nnet = slim.ops.conv2d(net, 64, [3, 3], scope='conv2')\nnet = slim.ops.conv2d(net, 128, [3, 3], scope='conv3')\nnet = slim.ops.max_pool(net, [3, 3], stride=2, scope='pool3')\nnet = slim.ops.flatten(net)\nnet = slim.ops.fc(net, 10, scope='logits', restore=False)\n...\n\n# VARIABLES_TO_RESTORE would contain the 'weights' and 'bias' defined by 'conv1'\n# 'conv2' and 'conv3' but not the ones defined by 'logits'\nvariables_to_restore = tf.get_collection(slim.variables.VARIABLES_TO_RESTORE)\n\n# Create a restorer that would restore only the needed variables.\nrestorer = tf.train.Saver(variables_to_restore)\n\n# Create a saver that would save all the variables (including 'logits').\nsaver = tf.train.Saver()\nwith tf.Session() as sess:\n  # Restore variables from disk.\n  restorer.restore(sess, \"/tmp/model.ckpt\")\n  print(\"Model restored.\")\n\n  # Do some work with the model\n  ...\n  saver.save(sess, \"/tmp/new_model.ckpt\")\n```\n\nNote: When restoring variables from a checkpoint, the `Saver` locates the\nvariable names in a checkpoint file and maps them to variables in the current\ngraph. Above, we created a saver by passing to it a list of variables. 
In this\ncase, the names of the variables to locate in the checkpoint file were\nimplicitly obtained from each provided variable's `var.op.name`.\n\nThis works well when the variable names in the checkpoint file match those in\nthe graph. However, sometimes we want to restore a model from a checkpoint\nwhose variables have different names than those in the current graph. In this case,\nwe must provide the `Saver` a dictionary that maps from each checkpoint variable\nname to each graph variable. Consider the following example where the checkpoint\nvariable names are obtained via a simple function:\n\n```python\n# Assuming that 'conv1/weights' should be restored from 'vgg16/conv1/weights'\ndef name_in_checkpoint(var):\n  return 'vgg16/' + var.op.name\n\n# Assuming that 'conv1/weights' and 'conv1/bias' should be restored from 'conv1/params1' and 'conv1/params2'\ndef name_in_checkpoint(var):\n  if \"weights\" in var.op.name:\n    return var.op.name.replace(\"weights\", \"params1\")\n  if \"bias\" in var.op.name:\n    return var.op.name.replace(\"bias\", \"params2\")\n\nvariables_to_restore = tf.get_collection(slim.variables.VARIABLES_TO_RESTORE)\nvariables_to_restore = {name_in_checkpoint(var): var for var in variables_to_restore}\nrestorer = tf.train.Saver(variables_to_restore)\nwith tf.Session() as sess:\n  # Restore variables from disk.\n  restorer.restore(sess, \"/tmp/model.ckpt\")\n```\n\n### Reusing the VGG16 network defined in TF-Slim on a different task, i.e. 
PASCAL-VOC.\n\nAssuming one already has a pre-trained VGG16 model, one need only replace\nthe last layer `fc8` with a new layer `fc8_pascal` created with `restore=False`.\n\n```python\ndef vgg16_pascal(inputs):\n  with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], stddev=0.01, weight_decay=0.0005):\n    net = slim.ops.repeat_op(2, inputs, slim.ops.conv2d, 64, [3, 3], scope='conv1')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool1')\n    net = slim.ops.repeat_op(2, net, slim.ops.conv2d, 128, [3, 3], scope='conv2')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool2')\n    net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 256, [3, 3], scope='conv3')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool3')\n    net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 512, [3, 3], scope='conv4')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool4')\n    net = slim.ops.repeat_op(3, net, slim.ops.conv2d, 512, [3, 3], scope='conv5')\n    net = slim.ops.max_pool(net, [2, 2], scope='pool5')\n    net = slim.ops.flatten(net, scope='flatten5')\n    net = slim.ops.fc(net, 4096, scope='fc6')\n    net = slim.ops.dropout(net, 0.5, scope='dropout6')\n    net = slim.ops.fc(net, 4096, scope='fc7')\n    net = slim.ops.dropout(net, 0.5, scope='dropout7')\n    # To reuse vgg16 on PASCAL-VOC, just change the last layer.\n    net = slim.ops.fc(net, 21, activation=None, scope='fc8_pascal', restore=False)\n  return net\n```\n\n## Authors\n\nSergio Guadarrama and Nathan Silberman\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/collections_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for inception.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom inception.slim import slim\n\n\ndef get_variables(scope=None):\n  return slim.variables.get_variables(scope)\n\n\ndef get_variables_by_name(name):\n  return slim.variables.get_variables_by_name(name)\n\n\nclass CollectionsTest(tf.test.TestCase):\n\n  def testVariables(self):\n    batch_size = 5\n    height, width = 299, 299\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      with slim.arg_scope([slim.ops.conv2d],\n                          batch_norm_params={'decay': 0.9997}):\n        slim.inception.inception_v3(inputs)\n      self.assertEqual(len(get_variables()), 388)\n      self.assertEqual(len(get_variables_by_name('weights')), 98)\n      self.assertEqual(len(get_variables_by_name('biases')), 2)\n      self.assertEqual(len(get_variables_by_name('beta')), 96)\n      self.assertEqual(len(get_variables_by_name('gamma')), 0)\n      self.assertEqual(len(get_variables_by_name('moving_mean')), 96)\n      self.assertEqual(len(get_variables_by_name('moving_variance')), 96)\n\n  def testVariablesWithoutBatchNorm(self):\n    batch_size = 5\n    
height, width = 299, 299\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      with slim.arg_scope([slim.ops.conv2d],\n                          batch_norm_params=None):\n        slim.inception.inception_v3(inputs)\n      self.assertEqual(len(get_variables()), 196)\n      self.assertEqual(len(get_variables_by_name('weights')), 98)\n      self.assertEqual(len(get_variables_by_name('biases')), 98)\n      self.assertEqual(len(get_variables_by_name('beta')), 0)\n      self.assertEqual(len(get_variables_by_name('gamma')), 0)\n      self.assertEqual(len(get_variables_by_name('moving_mean')), 0)\n      self.assertEqual(len(get_variables_by_name('moving_variance')), 0)\n\n  def testVariablesByLayer(self):\n    batch_size = 5\n    height, width = 299, 299\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      with slim.arg_scope([slim.ops.conv2d],\n                          batch_norm_params={'decay': 0.9997}):\n        slim.inception.inception_v3(inputs)\n      self.assertEqual(len(get_variables()), 388)\n      self.assertEqual(len(get_variables('conv0')), 4)\n      self.assertEqual(len(get_variables('conv1')), 4)\n      self.assertEqual(len(get_variables('conv2')), 4)\n      self.assertEqual(len(get_variables('conv3')), 4)\n      self.assertEqual(len(get_variables('conv4')), 4)\n      self.assertEqual(len(get_variables('mixed_35x35x256a')), 28)\n      self.assertEqual(len(get_variables('mixed_35x35x288a')), 28)\n      self.assertEqual(len(get_variables('mixed_35x35x288b')), 28)\n      self.assertEqual(len(get_variables('mixed_17x17x768a')), 16)\n      self.assertEqual(len(get_variables('mixed_17x17x768b')), 40)\n      self.assertEqual(len(get_variables('mixed_17x17x768c')), 40)\n      self.assertEqual(len(get_variables('mixed_17x17x768d')), 40)\n      self.assertEqual(len(get_variables('mixed_17x17x768e')), 40)\n      self.assertEqual(len(get_variables('mixed_8x8x2048a')), 
36)\n      self.assertEqual(len(get_variables('mixed_8x8x2048b')), 36)\n      self.assertEqual(len(get_variables('logits')), 2)\n      self.assertEqual(len(get_variables('aux_logits')), 10)\n\n  def testVariablesToRestore(self):\n    batch_size = 5\n    height, width = 299, 299\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      with slim.arg_scope([slim.ops.conv2d],\n                          batch_norm_params={'decay': 0.9997}):\n        slim.inception.inception_v3(inputs)\n      variables_to_restore = tf.get_collection(\n          slim.variables.VARIABLES_TO_RESTORE)\n      self.assertEqual(len(variables_to_restore), 388)\n      self.assertListEqual(variables_to_restore, get_variables())\n\n  def testVariablesToRestoreWithoutLogits(self):\n    batch_size = 5\n    height, width = 299, 299\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      with slim.arg_scope([slim.ops.conv2d],\n                          batch_norm_params={'decay': 0.9997}):\n        slim.inception.inception_v3(inputs, restore_logits=False)\n      variables_to_restore = tf.get_collection(\n          slim.variables.VARIABLES_TO_RESTORE)\n      self.assertEqual(len(variables_to_restore), 384)\n\n  def testRegularizationLosses(self):\n    batch_size = 5\n    height, width = 299, 299\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], weight_decay=0.00004):\n        slim.inception.inception_v3(inputs)\n      losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)\n      self.assertEqual(len(losses), len(get_variables_by_name('weights')))\n\n  def testTotalLossWithoutRegularization(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1001\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      dense_labels = 
tf.random_uniform((batch_size, num_classes))\n      with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], weight_decay=0):\n        logits, end_points = slim.inception.inception_v3(\n            inputs,\n            num_classes=num_classes)\n        # Cross entropy loss for the main softmax prediction.\n        slim.losses.cross_entropy_loss(logits,\n                                       dense_labels,\n                                       label_smoothing=0.1,\n                                       weight=1.0)\n        # Cross entropy loss for the auxiliary softmax head.\n        slim.losses.cross_entropy_loss(end_points['aux_logits'],\n                                       dense_labels,\n                                       label_smoothing=0.1,\n                                       weight=0.4,\n                                       scope='aux_loss')\n      losses = tf.get_collection(slim.losses.LOSSES_COLLECTION)\n      self.assertEqual(len(losses), 2)\n\n  def testTotalLossWithRegularization(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      dense_labels = tf.random_uniform((batch_size, num_classes))\n      with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], weight_decay=0.00004):\n        logits, end_points = slim.inception.inception_v3(inputs, num_classes)\n        # Cross entropy loss for the main softmax prediction.\n        slim.losses.cross_entropy_loss(logits,\n                                       dense_labels,\n                                       label_smoothing=0.1,\n                                       weight=1.0)\n        # Cross entropy loss for the auxiliary softmax head.\n        slim.losses.cross_entropy_loss(end_points['aux_logits'],\n                                       dense_labels,\n                                       label_smoothing=0.1,\n                                       weight=0.4,\n    
                                   scope='aux_loss')\n      losses = tf.get_collection(slim.losses.LOSSES_COLLECTION)\n      self.assertEqual(len(losses), 2)\n      reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)\n      self.assertEqual(len(reg_losses), 98)\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/inception_model.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Inception-v3 expressed in TensorFlow-Slim.\n\n  Usage:\n\n  # Parameters for BatchNorm.\n  batch_norm_params = {\n      # Decay for the batch_norm moving averages.\n      'decay': BATCHNORM_MOVING_AVERAGE_DECAY,\n      # epsilon to prevent 0s in variance.\n      'epsilon': 0.001,\n  }\n  # Set weight_decay for weights in Conv and FC layers.\n  with slim.arg_scope([slim.ops.conv2d, slim.ops.fc], weight_decay=0.00004):\n    with slim.arg_scope([slim.ops.conv2d],\n                        stddev=0.1,\n                        activation=tf.nn.relu,\n                        batch_norm_params=batch_norm_params):\n      # Force all Variables to reside on the CPU.\n      with slim.arg_scope([slim.variables.variable], device='/cpu:0'):\n        logits, endpoints = slim.inception.inception_v3(\n            images,\n            dropout_keep_prob=0.8,\n            num_classes=num_classes,\n            is_training=for_training,\n            restore_logits=restore_logits,\n            scope=scope)\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom inception.slim import ops\nfrom inception.slim import scopes\n\n\ndef inception_v3(inputs,\n                 dropout_keep_prob=0.8,\n 
                num_classes=1000,\n                 is_training=True,\n                 restore_logits=True,\n                 scope=''):\n  \"\"\"Latest Inception from http://arxiv.org/abs/1512.00567.\n\n    \"Rethinking the Inception Architecture for Computer Vision\"\n\n    Christian Szegedy, Vincent Vanhoucke, Sergey Ioffe, Jonathon Shlens,\n    Zbigniew Wojna\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    dropout_keep_prob: dropout keep_prob.\n    num_classes: number of predicted classes.\n    is_training: whether is training or not.\n    restore_logits: whether or not the logits layers should be restored.\n      Useful for fine-tuning a model with different num_classes.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a list containing 'logits', 'aux_logits' Tensors.\n  \"\"\"\n  # end_points will collect relevant activations for external use, for example\n  # summaries or losses.\n  end_points = {}\n  with tf.op_scope([inputs], scope, 'inception_v3'):\n    with scopes.arg_scope([ops.conv2d, ops.fc, ops.batch_norm, ops.dropout],\n                          is_training=is_training):\n      with scopes.arg_scope([ops.conv2d, ops.max_pool, ops.avg_pool],\n                            stride=1, padding='VALID'):\n        # 299 x 299 x 3\n        end_points['conv0'] = ops.conv2d(inputs, 32, [3, 3], stride=2,\n                                         scope='conv0')\n        # 149 x 149 x 32\n        end_points['conv1'] = ops.conv2d(end_points['conv0'], 32, [3, 3],\n                                         scope='conv1')\n        # 147 x 147 x 32\n        end_points['conv2'] = ops.conv2d(end_points['conv1'], 64, [3, 3],\n                                         padding='SAME', scope='conv2')\n        # 147 x 147 x 64\n        end_points['pool1'] = ops.max_pool(end_points['conv2'], [3, 3],\n                                           stride=2, scope='pool1')\n        # 73 x 73 x 64\n        end_points['conv3'] = 
ops.conv2d(end_points['pool1'], 80, [1, 1],\n                                         scope='conv3')\n        # 73 x 73 x 80.\n        end_points['conv4'] = ops.conv2d(end_points['conv3'], 192, [3, 3],\n                                         scope='conv4')\n        # 71 x 71 x 192.\n        end_points['pool2'] = ops.max_pool(end_points['conv4'], [3, 3],\n                                           stride=2, scope='pool2')\n        # 35 x 35 x 192.\n        net = end_points['pool2']\n      # Inception blocks\n      with scopes.arg_scope([ops.conv2d, ops.max_pool, ops.avg_pool],\n                            stride=1, padding='SAME'):\n        # mixed: 35 x 35 x 256.\n        with tf.variable_scope('mixed_35x35x256a'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 64, [1, 1])\n          with tf.variable_scope('branch5x5'):\n            branch5x5 = ops.conv2d(net, 48, [1, 1])\n            branch5x5 = ops.conv2d(branch5x5, 64, [5, 5])\n          with tf.variable_scope('branch3x3dbl'):\n            branch3x3dbl = ops.conv2d(net, 64, [1, 1])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 32, [1, 1])\n          net = tf.concat(3, [branch1x1, branch5x5, branch3x3dbl, branch_pool])\n          end_points['mixed_35x35x256a'] = net\n        # mixed_1: 35 x 35 x 288.\n        with tf.variable_scope('mixed_35x35x288a'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 64, [1, 1])\n          with tf.variable_scope('branch5x5'):\n            branch5x5 = ops.conv2d(net, 48, [1, 1])\n            branch5x5 = ops.conv2d(branch5x5, 64, [5, 5])\n          with tf.variable_scope('branch3x3dbl'):\n            branch3x3dbl = ops.conv2d(net, 64, [1, 1])\n            
branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 64, [1, 1])\n          net = tf.concat(3, [branch1x1, branch5x5, branch3x3dbl, branch_pool])\n          end_points['mixed_35x35x288a'] = net\n        # mixed_2: 35 x 35 x 288.\n        with tf.variable_scope('mixed_35x35x288b'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 64, [1, 1])\n          with tf.variable_scope('branch5x5'):\n            branch5x5 = ops.conv2d(net, 48, [1, 1])\n            branch5x5 = ops.conv2d(branch5x5, 64, [5, 5])\n          with tf.variable_scope('branch3x3dbl'):\n            branch3x3dbl = ops.conv2d(net, 64, [1, 1])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 64, [1, 1])\n          net = tf.concat(3, [branch1x1, branch5x5, branch3x3dbl, branch_pool])\n          end_points['mixed_35x35x288b'] = net\n        # mixed_3: 17 x 17 x 768.\n        with tf.variable_scope('mixed_17x17x768a'):\n          with tf.variable_scope('branch3x3'):\n            branch3x3 = ops.conv2d(net, 384, [3, 3], stride=2, padding='VALID')\n          with tf.variable_scope('branch3x3dbl'):\n            branch3x3dbl = ops.conv2d(net, 64, [1, 1])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 96, [3, 3],\n                                      stride=2, padding='VALID')\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.max_pool(net, [3, 3], stride=2, padding='VALID')\n          net = tf.concat(3, [branch3x3, branch3x3dbl, 
branch_pool])\n          end_points['mixed_17x17x768a'] = net\n        # mixed4: 17 x 17 x 768.\n        with tf.variable_scope('mixed_17x17x768b'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 192, [1, 1])\n          with tf.variable_scope('branch7x7'):\n            branch7x7 = ops.conv2d(net, 128, [1, 1])\n            branch7x7 = ops.conv2d(branch7x7, 128, [1, 7])\n            branch7x7 = ops.conv2d(branch7x7, 192, [7, 1])\n          with tf.variable_scope('branch7x7dbl'):\n            branch7x7dbl = ops.conv2d(net, 128, [1, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 128, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 128, [1, 7])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 128, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 192, [1, 7])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 192, [1, 1])\n          net = tf.concat(3, [branch1x1, branch7x7, branch7x7dbl, branch_pool])\n          end_points['mixed_17x17x768b'] = net\n        # mixed_5: 17 x 17 x 768.\n        with tf.variable_scope('mixed_17x17x768c'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 192, [1, 1])\n          with tf.variable_scope('branch7x7'):\n            branch7x7 = ops.conv2d(net, 160, [1, 1])\n            branch7x7 = ops.conv2d(branch7x7, 160, [1, 7])\n            branch7x7 = ops.conv2d(branch7x7, 192, [7, 1])\n          with tf.variable_scope('branch7x7dbl'):\n            branch7x7dbl = ops.conv2d(net, 160, [1, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 160, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 160, [1, 7])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 160, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 192, [1, 7])\n          with tf.variable_scope('branch_pool'):\n            branch_pool 
= ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 192, [1, 1])\n          net = tf.concat(3, [branch1x1, branch7x7, branch7x7dbl, branch_pool])\n          end_points['mixed_17x17x768c'] = net\n        # mixed_6: 17 x 17 x 768.\n        with tf.variable_scope('mixed_17x17x768d'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 192, [1, 1])\n          with tf.variable_scope('branch7x7'):\n            branch7x7 = ops.conv2d(net, 160, [1, 1])\n            branch7x7 = ops.conv2d(branch7x7, 160, [1, 7])\n            branch7x7 = ops.conv2d(branch7x7, 192, [7, 1])\n          with tf.variable_scope('branch7x7dbl'):\n            branch7x7dbl = ops.conv2d(net, 160, [1, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 160, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 160, [1, 7])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 160, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 192, [1, 7])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 192, [1, 1])\n          net = tf.concat(3, [branch1x1, branch7x7, branch7x7dbl, branch_pool])\n          end_points['mixed_17x17x768d'] = net\n        # mixed_7: 17 x 17 x 768.\n        with tf.variable_scope('mixed_17x17x768e'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 192, [1, 1])\n          with tf.variable_scope('branch7x7'):\n            branch7x7 = ops.conv2d(net, 192, [1, 1])\n            branch7x7 = ops.conv2d(branch7x7, 192, [1, 7])\n            branch7x7 = ops.conv2d(branch7x7, 192, [7, 1])\n          with tf.variable_scope('branch7x7dbl'):\n            branch7x7dbl = ops.conv2d(net, 192, [1, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 192, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 192, [1, 7])\n            branch7x7dbl = 
ops.conv2d(branch7x7dbl, 192, [7, 1])\n            branch7x7dbl = ops.conv2d(branch7x7dbl, 192, [1, 7])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 192, [1, 1])\n          net = tf.concat(3, [branch1x1, branch7x7, branch7x7dbl, branch_pool])\n          end_points['mixed_17x17x768e'] = net\n        # Auxiliary Head logits\n        aux_logits = tf.identity(end_points['mixed_17x17x768e'])\n        with tf.variable_scope('aux_logits'):\n          aux_logits = ops.avg_pool(aux_logits, [5, 5], stride=3,\n                                    padding='VALID')\n          aux_logits = ops.conv2d(aux_logits, 128, [1, 1], scope='proj')\n          # Shape of feature map before the final layer.\n          shape = aux_logits.get_shape()\n          aux_logits = ops.conv2d(aux_logits, 768, shape[1:3], stddev=0.01,\n                                  padding='VALID')\n          aux_logits = ops.flatten(aux_logits)\n          aux_logits = ops.fc(aux_logits, num_classes, activation=None,\n                              stddev=0.001, restore=restore_logits)\n          end_points['aux_logits'] = aux_logits\n        # mixed_8: 8 x 8 x 1280.\n        # Note that the scope below is not changed to not void previous\n        # checkpoints.\n        # (TODO) Fix the scope when appropriate.\n        with tf.variable_scope('mixed_17x17x1280a'):\n          with tf.variable_scope('branch3x3'):\n            branch3x3 = ops.conv2d(net, 192, [1, 1])\n            branch3x3 = ops.conv2d(branch3x3, 320, [3, 3], stride=2,\n                                   padding='VALID')\n          with tf.variable_scope('branch7x7x3'):\n            branch7x7x3 = ops.conv2d(net, 192, [1, 1])\n            branch7x7x3 = ops.conv2d(branch7x7x3, 192, [1, 7])\n            branch7x7x3 = ops.conv2d(branch7x7x3, 192, [7, 1])\n            branch7x7x3 = ops.conv2d(branch7x7x3, 192, [3, 3],\n                                
     stride=2, padding='VALID')\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.max_pool(net, [3, 3], stride=2, padding='VALID')\n          net = tf.concat(3, [branch3x3, branch7x7x3, branch_pool])\n          end_points['mixed_17x17x1280a'] = net\n        # mixed_9: 8 x 8 x 2048.\n        with tf.variable_scope('mixed_8x8x2048a'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 320, [1, 1])\n          with tf.variable_scope('branch3x3'):\n            branch3x3 = ops.conv2d(net, 384, [1, 1])\n            branch3x3 = tf.concat(3, [ops.conv2d(branch3x3, 384, [1, 3]),\n                                      ops.conv2d(branch3x3, 384, [3, 1])])\n          with tf.variable_scope('branch3x3dbl'):\n            branch3x3dbl = ops.conv2d(net, 448, [1, 1])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 384, [3, 3])\n            branch3x3dbl = tf.concat(3, [ops.conv2d(branch3x3dbl, 384, [1, 3]),\n                                         ops.conv2d(branch3x3dbl, 384, [3, 1])])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 192, [1, 1])\n          net = tf.concat(3, [branch1x1, branch3x3, branch3x3dbl, branch_pool])\n          end_points['mixed_8x8x2048a'] = net\n        # mixed_10: 8 x 8 x 2048.\n        with tf.variable_scope('mixed_8x8x2048b'):\n          with tf.variable_scope('branch1x1'):\n            branch1x1 = ops.conv2d(net, 320, [1, 1])\n          with tf.variable_scope('branch3x3'):\n            branch3x3 = ops.conv2d(net, 384, [1, 1])\n            branch3x3 = tf.concat(3, [ops.conv2d(branch3x3, 384, [1, 3]),\n                                      ops.conv2d(branch3x3, 384, [3, 1])])\n          with tf.variable_scope('branch3x3dbl'):\n            branch3x3dbl = ops.conv2d(net, 448, [1, 1])\n            branch3x3dbl = ops.conv2d(branch3x3dbl, 384, [3, 3])\n            branch3x3dbl = 
tf.concat(3, [ops.conv2d(branch3x3dbl, 384, [1, 3]),\n                                         ops.conv2d(branch3x3dbl, 384, [3, 1])])\n          with tf.variable_scope('branch_pool'):\n            branch_pool = ops.avg_pool(net, [3, 3])\n            branch_pool = ops.conv2d(branch_pool, 192, [1, 1])\n          net = tf.concat(3, [branch1x1, branch3x3, branch3x3dbl, branch_pool])\n          end_points['mixed_8x8x2048b'] = net\n        # Final pooling and prediction\n        with tf.variable_scope('logits'):\n          shape = net.get_shape()\n          net = ops.avg_pool(net, shape[1:3], padding='VALID', scope='pool')\n          # 1 x 1 x 2048\n          net = ops.dropout(net, dropout_keep_prob, scope='dropout')\n          net = ops.flatten(net, scope='flatten')\n          # 2048\n          logits = ops.fc(net, num_classes, activation=None, scope='logits',\n                          restore=restore_logits)\n          # 1000\n          end_points['logits'] = logits\n          end_points['predictions'] = tf.nn.softmax(logits, name='predictions')\n      return logits, end_points\n\n\ndef inception_v3_parameters(weight_decay=0.00004, stddev=0.1,\n                            batch_norm_decay=0.9997, batch_norm_epsilon=0.001):\n  \"\"\"Yields the scope with the default parameters for inception_v3.\n\n  Args:\n    weight_decay: the weight decay for weights variables.\n    stddev: standard deviation of the truncated gaussian weight distribution.\n    batch_norm_decay: decay for the moving average of batch_norm momentums.\n    batch_norm_epsilon: small float added to variance to avoid dividing by zero.\n\n  Yields:\n    an arg_scope with the parameters needed for inception_v3.\n  \"\"\"\n  # Set weight_decay for weights in Conv and FC layers.\n  with scopes.arg_scope([ops.conv2d, ops.fc],\n                        weight_decay=weight_decay):\n    # Set stddev, activation and parameters for batch_norm.\n    with scopes.arg_scope([ops.conv2d],\n                          
stddev=stddev,\n                          activation=tf.nn.relu,\n                          batch_norm_params={\n                              'decay': batch_norm_decay,\n                              'epsilon': batch_norm_epsilon}) as arg_scope:\n      yield arg_scope\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/inception_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.inception.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom inception.slim import inception_model as inception\n\n\nclass InceptionTest(tf.test.TestCase):\n\n  def testBuildLogits(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = inception.inception_v3(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testBuildEndPoints(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = inception.inception_v3(inputs, num_classes)\n      self.assertTrue('logits' in end_points)\n      logits = end_points['logits']\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      self.assertTrue('aux_logits' in end_points)\n      aux_logits = end_points['aux_logits']\n      
self.assertListEqual(aux_logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['mixed_8x8x2048b']\n      self.assertListEqual(pre_pool.get_shape().as_list(),\n                           [batch_size, 8, 8, 2048])\n\n  def testVariablesSetDevice(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      # Force all Variables to reside on the device.\n      with tf.variable_scope('on_cpu'), tf.device('/cpu:0'):\n        inception.inception_v3(inputs, num_classes)\n      with tf.variable_scope('on_gpu'), tf.device('/gpu:0'):\n        inception.inception_v3(inputs, num_classes)\n      for v in tf.get_collection(tf.GraphKeys.VARIABLES, scope='on_cpu'):\n        self.assertDeviceEqual(v.device, '/cpu:0')\n      for v in tf.get_collection(tf.GraphKeys.VARIABLES, scope='on_gpu'):\n        self.assertDeviceEqual(v.device, '/gpu:0')\n\n  def testHalfSizeImages(self):\n    batch_size = 5\n    height, width = 150, 150\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, end_points = inception.inception_v3(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['mixed_8x8x2048b']\n      self.assertListEqual(pre_pool.get_shape().as_list(),\n                           [batch_size, 3, 3, 2048])\n\n  def testUnknowBatchSize(self):\n    batch_size = 1\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session() as sess:\n      inputs = tf.placeholder(tf.float32, (None, height, width, 3))\n      logits, _ = inception.inception_v3(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('logits'))\n      
self.assertListEqual(logits.get_shape().as_list(),\n                           [None, num_classes])\n      images = tf.random_uniform((batch_size, height, width, 3))\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEquals(output.shape, (batch_size, num_classes))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session() as sess:\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = inception.inception_v3(eval_inputs, num_classes,\n                                         is_training=False)\n      predictions = tf.argmax(logits, 1)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (batch_size,))\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 5\n    eval_batch_size = 2\n    height, width = 150, 150\n    num_classes = 1000\n    with self.test_session() as sess:\n      train_inputs = tf.random_uniform((train_batch_size, height, width, 3))\n      inception.inception_v3(train_inputs, num_classes)\n      tf.get_variable_scope().reuse_variables()\n      eval_inputs = tf.random_uniform((eval_batch_size, height, width, 3))\n      logits, _ = inception.inception_v3(eval_inputs, num_classes,\n                                         is_training=False)\n      predictions = tf.argmax(logits, 1)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (eval_batch_size,))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/losses.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains convenience wrappers for various Neural Network TensorFlow losses.\n\n  All the losses defined here add themselves to the LOSSES_COLLECTION\n  collection.\n\n  l1_loss: Define a L1 Loss, useful for regularization, i.e. lasso.\n  l2_loss: Define a L2 Loss, useful for regularization, i.e. weight decay.\n  cross_entropy_loss: Define a cross entropy loss using\n    softmax_cross_entropy_with_logits. 
Useful for classification.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\n# In order to gather all losses in a network, the user should use this\n# key for get_collection, i.e:\n#   losses = tf.get_collection(slim.losses.LOSSES_COLLECTION)\nLOSSES_COLLECTION = '_losses'\n\n\ndef l1_regularizer(weight=1.0, scope=None):\n  \"\"\"Define a L1 regularizer.\n\n  Args:\n    weight: scale the loss by this factor.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a regularizer function.\n  \"\"\"\n  def regularizer(tensor):\n    with tf.op_scope([tensor], scope, 'L1Regularizer'):\n      l1_weight = tf.convert_to_tensor(weight,\n                                       dtype=tensor.dtype.base_dtype,\n                                       name='weight')\n      return tf.mul(l1_weight, tf.reduce_sum(tf.abs(tensor)), name='value')\n  return regularizer\n\n\ndef l2_regularizer(weight=1.0, scope=None):\n  \"\"\"Define a L2 regularizer.\n\n  Args:\n    weight: scale the loss by this factor.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a regularizer function.\n  \"\"\"\n  def regularizer(tensor):\n    with tf.op_scope([tensor], scope, 'L2Regularizer'):\n      l2_weight = tf.convert_to_tensor(weight,\n                                       dtype=tensor.dtype.base_dtype,\n                                       name='weight')\n      return tf.mul(l2_weight, tf.nn.l2_loss(tensor), name='value')\n  return regularizer\n\n\ndef l1_l2_regularizer(weight_l1=1.0, weight_l2=1.0, scope=None):\n  \"\"\"Define a L1L2 regularizer.\n\n  Args:\n    weight_l1: scale the L1 loss by this factor.\n    weight_l2: scale the L2 loss by this factor.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a regularizer function.\n  \"\"\"\n  def regularizer(tensor):\n    with tf.op_scope([tensor], scope, 'L1L2Regularizer'):\n      weight_l1_t = tf.convert_to_tensor(weight_l1,\n  
                                       dtype=tensor.dtype.base_dtype,\n                                         name='weight_l1')\n      weight_l2_t = tf.convert_to_tensor(weight_l2,\n                                         dtype=tensor.dtype.base_dtype,\n                                         name='weight_l2')\n      reg_l1 = tf.mul(weight_l1_t, tf.reduce_sum(tf.abs(tensor)),\n                      name='value_l1')\n      reg_l2 = tf.mul(weight_l2_t, tf.nn.l2_loss(tensor),\n                      name='value_l2')\n      return tf.add(reg_l1, reg_l2, name='value')\n  return regularizer\n\n\ndef l1_loss(tensor, weight=1.0, scope=None):\n  \"\"\"Define a L1Loss, useful for regularization, i.e. lasso.\n\n  Args:\n    tensor: tensor to regularize.\n    weight: scale the loss by this factor.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    the L1 loss op.\n  \"\"\"\n  with tf.op_scope([tensor], scope, 'L1Loss'):\n    weight = tf.convert_to_tensor(weight,\n                                  dtype=tensor.dtype.base_dtype,\n                                  name='loss_weight')\n    loss = tf.mul(weight, tf.reduce_sum(tf.abs(tensor)), name='value')\n    tf.add_to_collection(LOSSES_COLLECTION, loss)\n    return loss\n\n\ndef l2_loss(tensor, weight=1.0, scope=None):\n  \"\"\"Define a L2Loss, useful for regularization, i.e. 
weight decay.\n\n  Args:\n    tensor: tensor to regularize.\n    weight: an optional weight to modulate the loss.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    the L2 loss op.\n  \"\"\"\n  with tf.op_scope([tensor], scope, 'L2Loss'):\n    weight = tf.convert_to_tensor(weight,\n                                  dtype=tensor.dtype.base_dtype,\n                                  name='loss_weight')\n    loss = tf.mul(weight, tf.nn.l2_loss(tensor), name='value')\n    tf.add_to_collection(LOSSES_COLLECTION, loss)\n    return loss\n\n\ndef cross_entropy_loss(logits, one_hot_labels, label_smoothing=0,\n                       weight=1.0, scope=None):\n  \"\"\"Define a Cross Entropy loss using softmax_cross_entropy_with_logits.\n\n  It can scale the loss by weight factor, and smooth the labels.\n\n  Args:\n    logits: [batch_size, num_classes] logits outputs of the network .\n    one_hot_labels: [batch_size, num_classes] target one_hot_encoded labels.\n    label_smoothing: if greater than 0 then smooth the labels.\n    weight: scale the loss by this factor.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    A tensor with the softmax_cross_entropy loss.\n  \"\"\"\n  logits.get_shape().assert_is_compatible_with(one_hot_labels.get_shape())\n  with tf.op_scope([logits, one_hot_labels], scope, 'CrossEntropyLoss'):\n    num_classes = one_hot_labels.get_shape()[-1].value\n    one_hot_labels = tf.cast(one_hot_labels, logits.dtype)\n    if label_smoothing > 0:\n      smooth_positives = 1.0 - label_smoothing\n      smooth_negatives = label_smoothing / num_classes\n      one_hot_labels = one_hot_labels * smooth_positives + smooth_negatives\n    cross_entropy = tf.nn.softmax_cross_entropy_with_logits(logits,\n                                                            one_hot_labels,\n                                                            name='xentropy')\n    weight = tf.convert_to_tensor(weight,\n                                  
dtype=logits.dtype.base_dtype,\n                                  name='loss_weight')\n    loss = tf.mul(weight, tf.reduce_mean(cross_entropy), name='value')\n    tf.add_to_collection(LOSSES_COLLECTION, loss)\n    return loss\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/losses_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.losses.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom inception.slim import losses\n\n\nclass LossesTest(tf.test.TestCase):\n\n  def testL1Loss(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      weights = tf.constant(1.0, shape=shape)\n      wd = 0.01\n      loss = losses.l1_loss(weights, wd)\n      self.assertEquals(loss.op.name, 'L1Loss/value')\n      self.assertAlmostEqual(loss.eval(), num_elem * wd, 5)\n\n  def testL2Loss(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      weights = tf.constant(1.0, shape=shape)\n      wd = 0.01\n      loss = losses.l2_loss(weights, wd)\n      self.assertEquals(loss.op.name, 'L2Loss/value')\n      self.assertAlmostEqual(loss.eval(), num_elem * wd / 2, 5)\n\n\nclass RegularizersTest(tf.test.TestCase):\n\n  def testL1Regularizer(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      loss = losses.l1_regularizer()(tensor)\n      self.assertEquals(loss.op.name, 'L1Regularizer/value')\n      self.assertAlmostEqual(loss.eval(), 
num_elem, 5)\n\n  def testL1RegularizerWithScope(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      loss = losses.l1_regularizer(scope='L1')(tensor)\n      self.assertEquals(loss.op.name, 'L1/value')\n      self.assertAlmostEqual(loss.eval(), num_elem, 5)\n\n  def testL1RegularizerWithWeight(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      weight = 0.01\n      loss = losses.l1_regularizer(weight)(tensor)\n      self.assertEquals(loss.op.name, 'L1Regularizer/value')\n      self.assertAlmostEqual(loss.eval(), num_elem * weight, 5)\n\n  def testL2Regularizer(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      loss = losses.l2_regularizer()(tensor)\n      self.assertEquals(loss.op.name, 'L2Regularizer/value')\n      self.assertAlmostEqual(loss.eval(), num_elem / 2, 5)\n\n  def testL2RegularizerWithScope(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      loss = losses.l2_regularizer(scope='L2')(tensor)\n      self.assertEquals(loss.op.name, 'L2/value')\n      self.assertAlmostEqual(loss.eval(), num_elem / 2, 5)\n\n  def testL2RegularizerWithWeight(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      weight = 0.01\n      loss = losses.l2_regularizer(weight)(tensor)\n      self.assertEquals(loss.op.name, 'L2Regularizer/value')\n      self.assertAlmostEqual(loss.eval(), num_elem * weight / 2, 5)\n\n  def testL1L2Regularizer(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      loss = losses.l1_l2_regularizer()(tensor)\n      
self.assertEquals(loss.op.name, 'L1L2Regularizer/value')\n      self.assertAlmostEqual(loss.eval(), num_elem + num_elem / 2, 5)\n\n  def testL1L2RegularizerWithScope(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      loss = losses.l1_l2_regularizer(scope='L1L2')(tensor)\n      self.assertEquals(loss.op.name, 'L1L2/value')\n      self.assertAlmostEqual(loss.eval(), num_elem + num_elem / 2, 5)\n\n  def testL1L2RegularizerWithWeights(self):\n    with self.test_session():\n      shape = [5, 5, 5]\n      num_elem = 5 * 5 * 5\n      tensor = tf.constant(1.0, shape=shape)\n      weight_l1 = 0.01\n      weight_l2 = 0.05\n      loss = losses.l1_l2_regularizer(weight_l1, weight_l2)(tensor)\n      self.assertEquals(loss.op.name, 'L1L2Regularizer/value')\n      self.assertAlmostEqual(loss.eval(),\n                             num_elem * weight_l1 + num_elem * weight_l2 / 2, 5)\n\n\nclass CrossEntropyLossTest(tf.test.TestCase):\n\n  def testCrossEntropyLossAllCorrect(self):\n    with self.test_session():\n      logits = tf.constant([[10.0, 0.0, 0.0],\n                            [0.0, 10.0, 0.0],\n                            [0.0, 0.0, 10.0]])\n      labels = tf.constant([[1, 0, 0],\n                            [0, 1, 0],\n                            [0, 0, 1]])\n      loss = losses.cross_entropy_loss(logits, labels)\n      self.assertEquals(loss.op.name, 'CrossEntropyLoss/value')\n      self.assertAlmostEqual(loss.eval(), 0.0, 3)\n\n  def testCrossEntropyLossAllWrong(self):\n    with self.test_session():\n      logits = tf.constant([[10.0, 0.0, 0.0],\n                            [0.0, 10.0, 0.0],\n                            [0.0, 0.0, 10.0]])\n      labels = tf.constant([[0, 0, 1],\n                            [1, 0, 0],\n                            [0, 1, 0]])\n      loss = losses.cross_entropy_loss(logits, labels)\n      self.assertEquals(loss.op.name, 'CrossEntropyLoss/value')\n    
  self.assertAlmostEqual(loss.eval(), 10.0, 3)\n\n  def testCrossEntropyLossAllWrongWithWeight(self):\n    with self.test_session():\n      logits = tf.constant([[10.0, 0.0, 0.0],\n                            [0.0, 10.0, 0.0],\n                            [0.0, 0.0, 10.0]])\n      labels = tf.constant([[0, 0, 1],\n                            [1, 0, 0],\n                            [0, 1, 0]])\n      loss = losses.cross_entropy_loss(logits, labels, weight=0.5)\n      self.assertEquals(loss.op.name, 'CrossEntropyLoss/value')\n      self.assertAlmostEqual(loss.eval(), 5.0, 3)\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/ops.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains convenience wrappers for typical Neural Network TensorFlow layers.\n\n   Additionally it maintains a collection with update_ops that need to be\n   updated after the ops have been computed, for exmaple to update moving means\n   and moving variances of batch_norm.\n\n   Ops that have different behavior during training or eval have an is_training\n   parameter. 
Additionally Ops that contain variables.variable have a trainable\n   parameter, which control if the ops variables are trainable or not.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom tensorflow.python.training import moving_averages\n\nfrom inception.slim import losses\nfrom inception.slim import scopes\nfrom inception.slim import variables\n\n# Used to keep the update ops done by batch_norm.\nUPDATE_OPS_COLLECTION = '_update_ops_'\n\n\n@scopes.add_arg_scope\ndef batch_norm(inputs,\n               decay=0.999,\n               center=True,\n               scale=False,\n               epsilon=0.001,\n               moving_vars='moving_vars',\n               activation=None,\n               is_training=True,\n               trainable=True,\n               restore=True,\n               scope=None,\n               reuse=None):\n  \"\"\"Adds a Batch Normalization layer.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels]\n            or [batch_size, channels].\n    decay: decay for the moving average.\n    center: If True, subtract beta. If False, beta is not created and ignored.\n    scale: If True, multiply by gamma. If False, gamma is\n      not used. When the next layer is linear (also e.g. ReLU), this can be\n      disabled since the scaling can be done by the next layer.\n    epsilon: small float added to variance to avoid dividing by zero.\n    moving_vars: collection to store the moving_mean and moving_variance.\n    activation: activation function.\n    is_training: whether or not the model is in training mode.\n    trainable: whether or not the variables should be trainable or not.\n    restore: whether or not the variables should be marked for restore.\n    scope: Optional scope for variable_op_scope.\n    reuse: whether or not the layer and its variables should be reused. 
To be\n      able to reuse the layer, scope must be given.\n\n  Returns:\n    a tensor representing the output of the operation.\n\n  \"\"\"\n  inputs_shape = inputs.get_shape()\n  with tf.variable_op_scope([inputs], scope, 'BatchNorm', reuse=reuse):\n    axis = list(range(len(inputs_shape) - 1))\n    params_shape = inputs_shape[-1:]\n    # Allocate parameters for the beta and gamma of the normalization.\n    beta, gamma = None, None\n    if center:\n      beta = variables.variable('beta',\n                                params_shape,\n                                initializer=tf.zeros_initializer,\n                                trainable=trainable,\n                                restore=restore)\n    if scale:\n      gamma = variables.variable('gamma',\n                                 params_shape,\n                                 initializer=tf.ones_initializer,\n                                 trainable=trainable,\n                                 restore=restore)\n    # Create moving_mean and moving_variance and add them to\n    # GraphKeys.MOVING_AVERAGE_VARIABLES collections.\n    moving_collections = [moving_vars, tf.GraphKeys.MOVING_AVERAGE_VARIABLES]\n    moving_mean = variables.variable('moving_mean',\n                                     params_shape,\n                                     initializer=tf.zeros_initializer,\n                                     trainable=False,\n                                     restore=restore,\n                                     collections=moving_collections)\n    moving_variance = variables.variable('moving_variance',\n                                         params_shape,\n                                         initializer=tf.ones_initializer,\n                                         trainable=False,\n                                         restore=restore,\n                                         collections=moving_collections)\n    if is_training:\n      # Calculate the moments based on the individual 
batch.\n      mean, variance = tf.nn.moments(inputs, axis)\n\n      update_moving_mean = moving_averages.assign_moving_average(\n          moving_mean, mean, decay)\n      tf.add_to_collection(UPDATE_OPS_COLLECTION, update_moving_mean)\n      update_moving_variance = moving_averages.assign_moving_average(\n          moving_variance, variance, decay)\n      tf.add_to_collection(UPDATE_OPS_COLLECTION, update_moving_variance)\n    else:\n      # Just use the moving_mean and moving_variance.\n      mean = moving_mean\n      variance = moving_variance\n    # Normalize the activations.\n    outputs = tf.nn.batch_normalization(\n        inputs, mean, variance, beta, gamma, epsilon)\n    outputs.set_shape(inputs.get_shape())\n    if activation:\n      outputs = activation(outputs)\n    return outputs\n\n\ndef _two_element_tuple(int_or_tuple):\n  \"\"\"Converts `int_or_tuple` to height, width.\n\n  Several of the functions that follow accept arguments as either\n  a tuple of 2 integers or a single integer.  
A single integer\n  indicates that the 2 values of the tuple are the same.\n\n  This function normalizes the input value by always returning a tuple.\n\n  Args:\n    int_or_tuple: A list of 2 ints, a single int or a tf.TensorShape.\n\n  Returns:\n    A tuple with 2 values.\n\n  Raises:\n    ValueError: If `int_or_tuple` is not well formed.\n  \"\"\"\n  if isinstance(int_or_tuple, (list, tuple)):\n    if len(int_or_tuple) != 2:\n      raise ValueError('Must be a list with 2 elements: %s' % int_or_tuple)\n    return int(int_or_tuple[0]), int(int_or_tuple[1])\n  if isinstance(int_or_tuple, int):\n    return int(int_or_tuple), int(int_or_tuple)\n  if isinstance(int_or_tuple, tf.TensorShape):\n    if len(int_or_tuple) == 2:\n      return int_or_tuple[0], int_or_tuple[1]\n  raise ValueError('Must be an int, a list with 2 elements or a TensorShape of '\n                   'length 2')\n\n\n@scopes.add_arg_scope\ndef conv2d(inputs,\n           num_filters_out,\n           kernel_size,\n           stride=1,\n           padding='SAME',\n           activation=tf.nn.relu,\n           stddev=0.01,\n           bias=0.0,\n           weight_decay=0,\n           batch_norm_params=None,\n           is_training=True,\n           trainable=True,\n           restore=True,\n           scope=None,\n           reuse=None):\n  \"\"\"Adds a 2D convolution followed by an optional batch_norm layer.\n\n  conv2d creates a variable called 'weights', representing the convolutional\n  kernel, that is convolved with the input. If `batch_norm_params` is None, a\n  second variable called 'biases' is added to the result of the convolution\n  operation.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_filters_out: the number of output filters.\n    kernel_size: a list of length 2: [kernel_height, kernel_width] of\n      the filters. 
Can be an int if both values are the same.\n    stride: a list of length 2: [stride_height, stride_width].\n      Can be an int if both strides are the same.  Note that presently\n      both strides must have the same value.\n    padding: one of 'VALID' or 'SAME'.\n    activation: activation function.\n    stddev: standard deviation of the truncated Gaussian weight distribution.\n    bias: the initial value of the biases.\n    weight_decay: the weight decay.\n    batch_norm_params: parameters for the batch_norm. If it is None, don't use it.\n    is_training: whether or not the model is in training mode.\n    trainable: whether or not the variables should be trainable.\n    restore: whether or not the variables should be marked for restore.\n    scope: Optional scope for variable_op_scope.\n    reuse: whether or not the layer and its variables should be reused. To be\n      able to reuse the layer, scope must be given.\n  Returns:\n    a tensor representing the output of the operation.\n\n  \"\"\"\n  with tf.variable_op_scope([inputs], scope, 'Conv', reuse=reuse):\n    kernel_h, kernel_w = _two_element_tuple(kernel_size)\n    stride_h, stride_w = _two_element_tuple(stride)\n    num_filters_in = inputs.get_shape()[-1]\n    weights_shape = [kernel_h, kernel_w,\n                     num_filters_in, num_filters_out]\n    weights_initializer = tf.truncated_normal_initializer(stddev=stddev)\n    l2_regularizer = None\n    if weight_decay and weight_decay > 0:\n      l2_regularizer = losses.l2_regularizer(weight_decay)\n    weights = variables.variable('weights',\n                                 shape=weights_shape,\n                                 initializer=weights_initializer,\n                                 regularizer=l2_regularizer,\n                                 trainable=trainable,\n                                 restore=restore)\n    conv = tf.nn.conv2d(inputs, weights, [1, stride_h, stride_w, 1],\n                        padding=padding)\n    if 
batch_norm_params is not None:\n      with scopes.arg_scope([batch_norm], is_training=is_training,\n                            trainable=trainable, restore=restore):\n        outputs = batch_norm(conv, **batch_norm_params)\n    else:\n      bias_shape = [num_filters_out,]\n      bias_initializer = tf.constant_initializer(bias)\n      biases = variables.variable('biases',\n                                  shape=bias_shape,\n                                  initializer=bias_initializer,\n                                  trainable=trainable,\n                                  restore=restore)\n      outputs = tf.nn.bias_add(conv, biases)\n    if activation:\n      outputs = activation(outputs)\n    return outputs\n\n\n@scopes.add_arg_scope\ndef fc(inputs,\n       num_units_out,\n       activation=tf.nn.relu,\n       stddev=0.01,\n       bias=0.0,\n       weight_decay=0,\n       batch_norm_params=None,\n       is_training=True,\n       trainable=True,\n       restore=True,\n       scope=None,\n       reuse=None):\n  \"\"\"Adds a fully connected layer followed by an optional batch_norm layer.\n\n  FC creates a variable called 'weights', representing the fully connected\n  weight matrix, that is multiplied by the input. If `batch_norm_params` is\n  None, a second variable called 'biases' is added to the result of the initial\n  vector-matrix multiplication.\n\n  Args:\n    inputs: a [B x N] tensor where B is the batch size and N is the number of\n            input units in the layer.\n    num_units_out: the number of output units in the layer.\n    activation: activation function.\n    stddev: the standard deviation for the weights.\n    bias: the initial value of the biases.\n    weight_decay: the weight decay.\n    batch_norm_params: parameters for the batch_norm. 
If it is None, don't use it.\n    is_training: whether or not the model is in training mode.\n    trainable: whether or not the variables should be trainable.\n    restore: whether or not the variables should be marked for restore.\n    scope: Optional scope for variable_op_scope.\n    reuse: whether or not the layer and its variables should be reused. To be\n      able to reuse the layer, scope must be given.\n\n  Returns:\n     the tensor variable representing the result of the series of operations.\n  \"\"\"\n  with tf.variable_op_scope([inputs], scope, 'FC', reuse=reuse):\n    num_units_in = inputs.get_shape()[1]\n    weights_shape = [num_units_in, num_units_out]\n    weights_initializer = tf.truncated_normal_initializer(stddev=stddev)\n    l2_regularizer = None\n    if weight_decay and weight_decay > 0:\n      l2_regularizer = losses.l2_regularizer(weight_decay)\n    weights = variables.variable('weights',\n                                 shape=weights_shape,\n                                 initializer=weights_initializer,\n                                 regularizer=l2_regularizer,\n                                 trainable=trainable,\n                                 restore=restore)\n    if batch_norm_params is not None:\n      outputs = tf.matmul(inputs, weights)\n      with scopes.arg_scope([batch_norm], is_training=is_training,\n                            trainable=trainable, restore=restore):\n        outputs = batch_norm(outputs, **batch_norm_params)\n    else:\n      bias_shape = [num_units_out,]\n      bias_initializer = tf.constant_initializer(bias)\n      biases = variables.variable('biases',\n                                  shape=bias_shape,\n                                  initializer=bias_initializer,\n                                  trainable=trainable,\n                                  restore=restore)\n      outputs = tf.nn.xw_plus_b(inputs, weights, biases)\n    if activation:\n      outputs = activation(outputs)\n    return 
outputs\n\n\ndef one_hot_encoding(labels, num_classes, scope=None):\n  \"\"\"Transform numeric labels into onehot_labels.\n\n  Args:\n    labels: [batch_size] target labels.\n    num_classes: total number of classes.\n    scope: Optional scope for op_scope.\n  Returns:\n    one hot encoding of the labels.\n  \"\"\"\n  with tf.op_scope([labels], scope, 'OneHotEncoding'):\n    batch_size = labels.get_shape()[0]\n    indices = tf.expand_dims(tf.range(0, batch_size), 1)\n    labels = tf.cast(tf.expand_dims(labels, 1), indices.dtype)\n    concated = tf.concat(1, [indices, labels])\n    onehot_labels = tf.sparse_to_dense(\n        concated, tf.pack([batch_size, num_classes]), 1.0, 0.0)\n    onehot_labels.set_shape([batch_size, num_classes])\n    return onehot_labels\n\n\n@scopes.add_arg_scope\ndef max_pool(inputs, kernel_size, stride=2, padding='VALID', scope=None):\n  \"\"\"Adds a Max Pooling layer.\n\n  It is assumed by the wrapper that the pooling is only done per image and not\n  in depth or batch.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, depth].\n    kernel_size: a list of length 2: [kernel_height, kernel_width] of the\n      pooling kernel over which the op is computed. Can be an int if both\n      values are the same.\n    stride: a list of length 2: [stride_height, stride_width].\n      Can be an int if both strides are the same.  
Note that presently\n      both strides must have the same value.\n    padding: the padding method, either 'VALID' or 'SAME'.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a tensor representing the results of the pooling operation.\n  Raises:\n    ValueError: if 'kernel_size' is not a 2-D list.\n  \"\"\"\n  with tf.op_scope([inputs], scope, 'MaxPool'):\n    kernel_h, kernel_w = _two_element_tuple(kernel_size)\n    stride_h, stride_w = _two_element_tuple(stride)\n    return tf.nn.max_pool(inputs,\n                          ksize=[1, kernel_h, kernel_w, 1],\n                          strides=[1, stride_h, stride_w, 1],\n                          padding=padding)\n\n\n@scopes.add_arg_scope\ndef avg_pool(inputs, kernel_size, stride=2, padding='VALID', scope=None):\n  \"\"\"Adds an Avg Pooling layer.\n\n  It is assumed by the wrapper that the pooling is only done per image and not\n  in depth or batch.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, depth].\n    kernel_size: a list of length 2: [kernel_height, kernel_width] of the\n      pooling kernel over which the op is computed. Can be an int if both\n      values are the same.\n    stride: a list of length 2: [stride_height, stride_width].\n      Can be an int if both strides are the same.  
Note that presently\n      both strides must have the same value.\n    padding: the padding method, either 'VALID' or 'SAME'.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a tensor representing the results of the pooling operation.\n  \"\"\"\n  with tf.op_scope([inputs], scope, 'AvgPool'):\n    kernel_h, kernel_w = _two_element_tuple(kernel_size)\n    stride_h, stride_w = _two_element_tuple(stride)\n    return tf.nn.avg_pool(inputs,\n                          ksize=[1, kernel_h, kernel_w, 1],\n                          strides=[1, stride_h, stride_w, 1],\n                          padding=padding)\n\n\n@scopes.add_arg_scope\ndef dropout(inputs, keep_prob=0.5, is_training=True, scope=None):\n  \"\"\"Returns a dropout layer applied to the input.\n\n  Args:\n    inputs: the tensor to pass to the Dropout layer.\n    keep_prob: the probability of keeping each input unit.\n    is_training: whether or not the model is in training mode. If so, dropout is\n    applied and values scaled. 
Otherwise, inputs is returned.\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a tensor representing the output of the operation.\n  \"\"\"\n  if is_training and keep_prob > 0:\n    with tf.op_scope([inputs], scope, 'Dropout'):\n      return tf.nn.dropout(inputs, keep_prob)\n  else:\n    return inputs\n\n\ndef flatten(inputs, scope=None):\n  \"\"\"Flattens the input while maintaining the batch_size.\n\n    Assumes that the first dimension represents the batch.\n\n  Args:\n    inputs: a tensor of size [batch_size, ...].\n    scope: Optional scope for op_scope.\n\n  Returns:\n    a flattened tensor with shape [batch_size, k].\n  Raises:\n    ValueError: if inputs.shape is wrong.\n  \"\"\"\n  if len(inputs.get_shape()) < 2:\n    raise ValueError('Inputs must have at least 2 dimensions')\n  dims = inputs.get_shape()[1:]\n  k = dims.num_elements()\n  with tf.op_scope([inputs], scope, 'Flatten'):\n    return tf.reshape(inputs, [-1, k])\n\n\ndef repeat_op(repetitions, inputs, op, *args, **kwargs):\n  \"\"\"Build a sequential Tower starting from inputs by using an op repeatedly.\n\n  It creates new scopes for each operation by increasing the counter.\n  Example: given repeat_op(3, _, ops.conv2d, 64, [3, 3], scope='conv1')\n    it will repeat the given op under the following variable_scopes:\n      conv1/Conv\n      conv1/Conv_1\n      conv1/Conv_2\n\n  Args:\n    repetitions: number of repetitions.\n    inputs: a tensor of size [batch_size, height, width, channels].\n    op: an operation.\n    *args: args for the op.\n    **kwargs: kwargs for the op.\n\n  Returns:\n    a tensor result of applying the operation op, repetitions times.\n  Raises:\n    ValueError: if the op is unknown or wrong.\n  \"\"\"\n  scope = kwargs.pop('scope', None)\n  with tf.variable_op_scope([inputs], scope, 'RepeatOp'):\n    tower = inputs\n    for _ in range(repetitions):\n      tower = op(tower, *args, **kwargs)\n    return tower\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/ops_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.ops.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.ops import control_flow_ops\n\nfrom inception.slim import ops\nfrom inception.slim import scopes\nfrom inception.slim import variables\n\n\nclass ConvTest(tf.test.TestCase):\n\n  def testCreateConv(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [3, 3])\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(), [5, height, width, 32])\n\n  def testCreateSquareConv(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, 3)\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(), [5, height, width, 32])\n\n  def testCreateConvWithTensorShape(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, 
images.get_shape()[1:3])\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(), [5, height, width, 32])\n\n  def testCreateFullyConv(self):\n    height, width = 6, 6\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 32), seed=1)\n      output = ops.conv2d(images, 64, images.get_shape()[1:3], padding='VALID')\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 64])\n\n  def testCreateVerticalConv(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [3, 1])\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(),\n                           [5, height, width, 32])\n\n  def testCreateHorizontalConv(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [1, 3])\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(),\n                           [5, height, width, 32])\n\n  def testCreateConvWithStride(self):\n    height, width = 6, 6\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [3, 3], stride=2)\n      self.assertEquals(output.op.name, 'Conv/Relu')\n      self.assertListEqual(output.get_shape().as_list(),\n                           [5, height/2, width/2, 32])\n\n  def testCreateConvCreatesWeightsAndBiasesVars(self):\n    height, width = 3, 3\n    images = tf.random_uniform((5, height, width, 3), seed=1)\n    with self.test_session():\n      self.assertFalse(variables.get_variables('conv1/weights'))\n      self.assertFalse(variables.get_variables('conv1/biases'))\n      ops.conv2d(images, 
32, [3, 3], scope='conv1')\n      self.assertTrue(variables.get_variables('conv1/weights'))\n      self.assertTrue(variables.get_variables('conv1/biases'))\n\n  def testCreateConvWithScope(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [3, 3], scope='conv1')\n      self.assertEquals(output.op.name, 'conv1/Relu')\n\n  def testCreateConvWithoutActivation(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [3, 3], activation=None)\n      self.assertEquals(output.op.name, 'Conv/BiasAdd')\n\n  def testCreateConvValid(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.conv2d(images, 32, [3, 3], padding='VALID')\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 32])\n\n  def testCreateConvWithWD(self):\n    height, width = 3, 3\n    with self.test_session() as sess:\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.conv2d(images, 32, [3, 3], weight_decay=0.01)\n      wd = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)[0]\n      self.assertEquals(wd.op.name,\n                        'Conv/weights/Regularizer/L2Regularizer/value')\n      sess.run(tf.initialize_all_variables())\n      self.assertTrue(sess.run(wd) <= 0.01)\n\n  def testCreateConvWithoutWD(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.conv2d(images, 32, [3, 3], weight_decay=0)\n      self.assertEquals(\n          tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES), [])\n\n  def testReuseVars(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      
ops.conv2d(images, 32, [3, 3], scope='conv1')\n      self.assertEquals(len(variables.get_variables()), 2)\n      ops.conv2d(images, 32, [3, 3], scope='conv1', reuse=True)\n      self.assertEquals(len(variables.get_variables()), 2)\n\n  def testNonReuseVars(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.conv2d(images, 32, [3, 3])\n      self.assertEquals(len(variables.get_variables()), 2)\n      ops.conv2d(images, 32, [3, 3])\n      self.assertEquals(len(variables.get_variables()), 4)\n\n  def testReuseConvWithWD(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.conv2d(images, 32, [3, 3], weight_decay=0.01, scope='conv1')\n      self.assertEquals(len(variables.get_variables()), 2)\n      self.assertEquals(\n          len(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)), 1)\n      ops.conv2d(images, 32, [3, 3], weight_decay=0.01, scope='conv1',\n                 reuse=True)\n      self.assertEquals(len(variables.get_variables()), 2)\n      self.assertEquals(\n          len(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)), 1)\n\n  def testConvWithBatchNorm(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 32), seed=1)\n      with scopes.arg_scope([ops.conv2d], batch_norm_params={'decay': 0.9}):\n        net = ops.conv2d(images, 32, [3, 3])\n        net = ops.conv2d(net, 32, [3, 3])\n      self.assertEquals(len(variables.get_variables()), 8)\n      self.assertEquals(len(variables.get_variables('Conv/BatchNorm')), 3)\n      self.assertEquals(len(variables.get_variables('Conv_1/BatchNorm')), 3)\n\n  def testReuseConvWithBatchNorm(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 32), seed=1)\n      with scopes.arg_scope([ops.conv2d], 
batch_norm_params={'decay': 0.9}):\n        net = ops.conv2d(images, 32, [3, 3], scope='Conv')\n        net = ops.conv2d(net, 32, [3, 3], scope='Conv', reuse=True)\n      self.assertEquals(len(variables.get_variables()), 4)\n      self.assertEquals(len(variables.get_variables('Conv/BatchNorm')), 3)\n      self.assertEquals(len(variables.get_variables('Conv_1/BatchNorm')), 0)\n\n\nclass FCTest(tf.test.TestCase):\n\n  def testCreateFC(self):\n    height, width = 3, 3\n    with self.test_session():\n      inputs = tf.random_uniform((5, height * width * 3), seed=1)\n      output = ops.fc(inputs, 32)\n      self.assertEquals(output.op.name, 'FC/Relu')\n      self.assertListEqual(output.get_shape().as_list(), [5, 32])\n\n  def testCreateFCWithScope(self):\n    height, width = 3, 3\n    with self.test_session():\n      inputs = tf.random_uniform((5, height * width * 3), seed=1)\n      output = ops.fc(inputs, 32, scope='fc1')\n      self.assertEquals(output.op.name, 'fc1/Relu')\n\n  def testCreateFcCreatesWeightsAndBiasesVars(self):\n    height, width = 3, 3\n    inputs = tf.random_uniform((5, height * width * 3), seed=1)\n    with self.test_session():\n      self.assertFalse(variables.get_variables('fc1/weights'))\n      self.assertFalse(variables.get_variables('fc1/biases'))\n      ops.fc(inputs, 32, scope='fc1')\n      self.assertTrue(variables.get_variables('fc1/weights'))\n      self.assertTrue(variables.get_variables('fc1/biases'))\n\n  def testReuseVars(self):\n    height, width = 3, 3\n    inputs = tf.random_uniform((5, height * width * 3), seed=1)\n    with self.test_session():\n      ops.fc(inputs, 32, scope='fc1')\n      self.assertEquals(len(variables.get_variables('fc1')), 2)\n      ops.fc(inputs, 32, scope='fc1', reuse=True)\n      self.assertEquals(len(variables.get_variables('fc1')), 2)\n\n  def testNonReuseVars(self):\n    height, width = 3, 3\n    inputs = tf.random_uniform((5, height * width * 3), seed=1)\n    with self.test_session():\n      
ops.fc(inputs, 32)\n      self.assertEquals(len(variables.get_variables('FC')), 2)\n      ops.fc(inputs, 32)\n      self.assertEquals(len(variables.get_variables('FC')), 4)\n\n  def testCreateFCWithoutActivation(self):\n    height, width = 3, 3\n    with self.test_session():\n      inputs = tf.random_uniform((5, height * width * 3), seed=1)\n      output = ops.fc(inputs, 32, activation=None)\n      self.assertEquals(output.op.name, 'FC/xw_plus_b')\n\n  def testCreateFCWithWD(self):\n    height, width = 3, 3\n    with self.test_session() as sess:\n      inputs = tf.random_uniform((5, height * width * 3), seed=1)\n      ops.fc(inputs, 32, weight_decay=0.01)\n      wd = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)[0]\n      self.assertEquals(wd.op.name,\n                        'FC/weights/Regularizer/L2Regularizer/value')\n      sess.run(tf.initialize_all_variables())\n      self.assertTrue(sess.run(wd) <= 0.01)\n\n  def testCreateFCWithoutWD(self):\n    height, width = 3, 3\n    with self.test_session():\n      inputs = tf.random_uniform((5, height * width * 3), seed=1)\n      ops.fc(inputs, 32, weight_decay=0)\n      self.assertEquals(\n          tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES), [])\n\n  def testReuseFCWithWD(self):\n    height, width = 3, 3\n    with self.test_session():\n      inputs = tf.random_uniform((5, height * width * 3), seed=1)\n      ops.fc(inputs, 32, weight_decay=0.01, scope='fc')\n      self.assertEquals(len(variables.get_variables()), 2)\n      self.assertEquals(\n          len(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)), 1)\n      ops.fc(inputs, 32, weight_decay=0.01, scope='fc', reuse=True)\n      self.assertEquals(len(variables.get_variables()), 2)\n      self.assertEquals(\n          len(tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)), 1)\n\n  def testFCWithBatchNorm(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height * width * 3), seed=1)\n  
    with scopes.arg_scope([ops.fc], batch_norm_params={}):\n        net = ops.fc(images, 27)\n        net = ops.fc(net, 27)\n      self.assertEquals(len(variables.get_variables()), 8)\n      self.assertEquals(len(variables.get_variables('FC/BatchNorm')), 3)\n      self.assertEquals(len(variables.get_variables('FC_1/BatchNorm')), 3)\n\n  def testReuseFCWithBatchNorm(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height * width * 3), seed=1)\n      with scopes.arg_scope([ops.fc], batch_norm_params={'decay': 0.9}):\n        net = ops.fc(images, 27, scope='fc1')\n        net = ops.fc(net, 27, scope='fc1', reuse=True)\n      self.assertEquals(len(variables.get_variables()), 4)\n      self.assertEquals(len(variables.get_variables('fc1/BatchNorm')), 3)\n\n\nclass MaxPoolTest(tf.test.TestCase):\n\n  def testCreateMaxPool(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.max_pool(images, [3, 3])\n      self.assertEquals(output.op.name, 'MaxPool/MaxPool')\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 3])\n\n  def testCreateSquareMaxPool(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.max_pool(images, 3)\n      self.assertEquals(output.op.name, 'MaxPool/MaxPool')\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 3])\n\n  def testCreateMaxPoolWithScope(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.max_pool(images, [3, 3], scope='pool1')\n      self.assertEquals(output.op.name, 'pool1/MaxPool')\n\n  def testCreateMaxPoolSAME(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.max_pool(images, 
[3, 3], padding='SAME')\n      self.assertListEqual(output.get_shape().as_list(), [5, 2, 2, 3])\n\n  def testCreateMaxPoolStrideSAME(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.max_pool(images, [3, 3], stride=1, padding='SAME')\n      self.assertListEqual(output.get_shape().as_list(), [5, height, width, 3])\n\n  def testGlobalMaxPool(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.max_pool(images, images.get_shape()[1:3], stride=1)\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 3])\n\n\nclass AvgPoolTest(tf.test.TestCase):\n\n  def testCreateAvgPool(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.avg_pool(images, [3, 3])\n      self.assertEquals(output.op.name, 'AvgPool/AvgPool')\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 3])\n\n  def testCreateSquareAvgPool(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.avg_pool(images, 3)\n      self.assertEquals(output.op.name, 'AvgPool/AvgPool')\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 3])\n\n  def testCreateAvgPoolWithScope(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.avg_pool(images, [3, 3], scope='pool1')\n      self.assertEquals(output.op.name, 'pool1/AvgPool')\n\n  def testCreateAvgPoolSAME(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.avg_pool(images, [3, 3], padding='SAME')\n      self.assertListEqual(output.get_shape().as_list(), [5, 2, 2, 
3])\n\n  def testCreateAvgPoolStrideSAME(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.avg_pool(images, [3, 3], stride=1, padding='SAME')\n      self.assertListEqual(output.get_shape().as_list(), [5, height, width, 3])\n\n  def testGlobalAvgPool(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.avg_pool(images, images.get_shape()[1:3], stride=1)\n      self.assertListEqual(output.get_shape().as_list(), [5, 1, 1, 3])\n\n\nclass OneHotEncodingTest(tf.test.TestCase):\n\n  def testOneHotEncodingCreate(self):\n    with self.test_session():\n      labels = tf.constant([0, 1, 2])\n      output = ops.one_hot_encoding(labels, num_classes=3)\n      self.assertEquals(output.op.name, 'OneHotEncoding/SparseToDense')\n      self.assertListEqual(output.get_shape().as_list(), [3, 3])\n\n  def testOneHotEncoding(self):\n    with self.test_session():\n      labels = tf.constant([0, 1, 2])\n      one_hot_labels = tf.constant([[1, 0, 0],\n                                    [0, 1, 0],\n                                    [0, 0, 1]])\n      output = ops.one_hot_encoding(labels, num_classes=3)\n      self.assertAllClose(output.eval(), one_hot_labels.eval())\n\n\nclass DropoutTest(tf.test.TestCase):\n\n  def testCreateDropout(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.dropout(images)\n      self.assertEquals(output.op.name, 'Dropout/dropout/mul_1')\n      output.get_shape().assert_is_compatible_with(images.get_shape())\n\n  def testCreateDropoutNoTraining(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1, name='images')\n      output = ops.dropout(images, is_training=False)\n      self.assertEquals(output, 
images)\n\n\nclass FlattenTest(tf.test.TestCase):\n\n  def testFlatten4D(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1, name='images')\n      output = ops.flatten(images)\n      self.assertEquals(output.get_shape().num_elements(),\n                        images.get_shape().num_elements())\n      self.assertEqual(output.get_shape()[0], images.get_shape()[0])\n\n  def testFlatten3D(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width), seed=1, name='images')\n      output = ops.flatten(images)\n      self.assertEquals(output.get_shape().num_elements(),\n                        images.get_shape().num_elements())\n      self.assertEqual(output.get_shape()[0], images.get_shape()[0])\n\n  def testFlattenBatchSize(self):\n    height, width = 3, 3\n    with self.test_session() as sess:\n      images = tf.random_uniform((5, height, width, 3), seed=1, name='images')\n      inputs = tf.placeholder(tf.int32, (None, height, width, 3))\n      output = ops.flatten(inputs)\n      self.assertEquals(output.get_shape().as_list(),\n                        [None, height * width * 3])\n      output = sess.run(output, {inputs: images.eval()})\n      self.assertEquals(output.size,\n                        images.get_shape().num_elements())\n      self.assertEqual(output.shape[0], images.get_shape()[0])\n\n\nclass BatchNormTest(tf.test.TestCase):\n\n  def testCreateOp(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      output = ops.batch_norm(images)\n      self.assertTrue(output.op.name.startswith('BatchNorm/batchnorm'))\n      self.assertListEqual(output.get_shape().as_list(), [5, height, width, 3])\n\n  def testCreateVariables(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      
ops.batch_norm(images)\n      beta = variables.get_variables_by_name('beta')[0]\n      self.assertEquals(beta.op.name, 'BatchNorm/beta')\n      gamma = variables.get_variables_by_name('gamma')\n      self.assertEquals(gamma, [])\n      moving_mean = tf.moving_average_variables()[0]\n      moving_variance = tf.moving_average_variables()[1]\n      self.assertEquals(moving_mean.op.name, 'BatchNorm/moving_mean')\n      self.assertEquals(moving_variance.op.name, 'BatchNorm/moving_variance')\n\n  def testCreateVariablesWithScale(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images, scale=True)\n      beta = variables.get_variables_by_name('beta')[0]\n      gamma = variables.get_variables_by_name('gamma')[0]\n      self.assertEquals(beta.op.name, 'BatchNorm/beta')\n      self.assertEquals(gamma.op.name, 'BatchNorm/gamma')\n      moving_mean = tf.moving_average_variables()[0]\n      moving_variance = tf.moving_average_variables()[1]\n      self.assertEquals(moving_mean.op.name, 'BatchNorm/moving_mean')\n      self.assertEquals(moving_variance.op.name, 'BatchNorm/moving_variance')\n\n  def testCreateVariablesWithoutCenterWithScale(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images, center=False, scale=True)\n      beta = variables.get_variables_by_name('beta')\n      self.assertEquals(beta, [])\n      gamma = variables.get_variables_by_name('gamma')[0]\n      self.assertEquals(gamma.op.name, 'BatchNorm/gamma')\n      moving_mean = tf.moving_average_variables()[0]\n      moving_variance = tf.moving_average_variables()[1]\n      self.assertEquals(moving_mean.op.name, 'BatchNorm/moving_mean')\n      self.assertEquals(moving_variance.op.name, 'BatchNorm/moving_variance')\n\n  def testCreateVariablesWithoutCenterWithoutScale(self):\n    height, width = 3, 3\n    with 
self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images, center=False, scale=False)\n      beta = variables.get_variables_by_name('beta')\n      self.assertEquals(beta, [])\n      gamma = variables.get_variables_by_name('gamma')\n      self.assertEquals(gamma, [])\n      moving_mean = tf.moving_average_variables()[0]\n      moving_variance = tf.moving_average_variables()[1]\n      self.assertEquals(moving_mean.op.name, 'BatchNorm/moving_mean')\n      self.assertEquals(moving_variance.op.name, 'BatchNorm/moving_variance')\n\n  def testMovingAverageVariables(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images, scale=True)\n      moving_mean = tf.moving_average_variables()[0]\n      moving_variance = tf.moving_average_variables()[1]\n      self.assertEquals(moving_mean.op.name, 'BatchNorm/moving_mean')\n      self.assertEquals(moving_variance.op.name, 'BatchNorm/moving_variance')\n\n  def testUpdateOps(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images)\n      update_ops = tf.get_collection(ops.UPDATE_OPS_COLLECTION)\n      update_moving_mean = update_ops[0]\n      update_moving_variance = update_ops[1]\n      self.assertEquals(update_moving_mean.op.name,\n                        'BatchNorm/AssignMovingAvg')\n      self.assertEquals(update_moving_variance.op.name,\n                        'BatchNorm/AssignMovingAvg_1')\n\n  def testReuseVariables(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images, scale=True, scope='bn')\n      ops.batch_norm(images, scale=True, scope='bn', reuse=True)\n      beta = variables.get_variables_by_name('beta')\n      gamma = variables.get_variables_by_name('gamma')\n  
    self.assertEquals(len(beta), 1)\n      self.assertEquals(len(gamma), 1)\n      moving_vars = tf.get_collection('moving_vars')\n      self.assertEquals(len(moving_vars), 2)\n\n  def testReuseUpdateOps(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      ops.batch_norm(images, scope='bn')\n      self.assertEquals(len(tf.get_collection(ops.UPDATE_OPS_COLLECTION)), 2)\n      ops.batch_norm(images, scope='bn', reuse=True)\n      self.assertEquals(len(tf.get_collection(ops.UPDATE_OPS_COLLECTION)), 4)\n\n  def testCreateMovingVars(self):\n    height, width = 3, 3\n    with self.test_session():\n      images = tf.random_uniform((5, height, width, 3), seed=1)\n      _ = ops.batch_norm(images, moving_vars='moving_vars')\n      moving_mean = tf.get_collection('moving_vars',\n                                      'BatchNorm/moving_mean')\n      self.assertEquals(len(moving_mean), 1)\n      self.assertEquals(moving_mean[0].op.name, 'BatchNorm/moving_mean')\n      moving_variance = tf.get_collection('moving_vars',\n                                          'BatchNorm/moving_variance')\n      self.assertEquals(len(moving_variance), 1)\n      self.assertEquals(moving_variance[0].op.name, 'BatchNorm/moving_variance')\n\n  def testComputeMovingVars(self):\n    height, width = 3, 3\n    with self.test_session() as sess:\n      image_shape = (10, height, width, 3)\n      image_values = np.random.rand(*image_shape)\n      expected_mean = np.mean(image_values, axis=(0, 1, 2))\n      expected_var = np.var(image_values, axis=(0, 1, 2))\n      images = tf.constant(image_values, shape=image_shape, dtype=tf.float32)\n      output = ops.batch_norm(images, decay=0.1)\n      update_ops = tf.get_collection(ops.UPDATE_OPS_COLLECTION)\n      with tf.control_dependencies(update_ops):\n        barrier = tf.no_op(name='gradient_barrier')\n        output = control_flow_ops.with_dependencies([barrier], output)\n      
# Initialize all variables\n      sess.run(tf.initialize_all_variables())\n      moving_mean = variables.get_variables('BatchNorm/moving_mean')[0]\n      moving_variance = variables.get_variables('BatchNorm/moving_variance')[0]\n      mean, variance = sess.run([moving_mean, moving_variance])\n      # After initialization moving_mean == 0 and moving_variance == 1.\n      self.assertAllClose(mean, [0] * 3)\n      self.assertAllClose(variance, [1] * 3)\n      for _ in range(10):\n        sess.run([output])\n      mean = moving_mean.eval()\n      variance = moving_variance.eval()\n      # After 10 updates with decay 0.1 moving_mean == expected_mean and\n      # moving_variance == expected_var.\n      self.assertAllClose(mean, expected_mean)\n      self.assertAllClose(variance, expected_var)\n\n  def testEvalMovingVars(self):\n    height, width = 3, 3\n    with self.test_session() as sess:\n      image_shape = (10, height, width, 3)\n      image_values = np.random.rand(*image_shape)\n      expected_mean = np.mean(image_values, axis=(0, 1, 2))\n      expected_var = np.var(image_values, axis=(0, 1, 2))\n      images = tf.constant(image_values, shape=image_shape, dtype=tf.float32)\n      output = ops.batch_norm(images, decay=0.1, is_training=False)\n      update_ops = tf.get_collection(ops.UPDATE_OPS_COLLECTION)\n      with tf.control_dependencies(update_ops):\n        barrier = tf.no_op(name='gradient_barrier')\n        output = control_flow_ops.with_dependencies([barrier], output)\n      # Initialize all variables\n      sess.run(tf.initialize_all_variables())\n      moving_mean = variables.get_variables('BatchNorm/moving_mean')[0]\n      moving_variance = variables.get_variables('BatchNorm/moving_variance')[0]\n      mean, variance = sess.run([moving_mean, moving_variance])\n      # After initialization moving_mean == 0 and moving_variance == 1.\n      self.assertAllClose(mean, [0] * 3)\n      self.assertAllClose(variance, [1] * 3)\n      # Simulate assignment from saver 
restore.\n      init_assigns = [tf.assign(moving_mean, expected_mean),\n                      tf.assign(moving_variance, expected_var)]\n      sess.run(init_assigns)\n      for _ in range(10):\n        sess.run([output], {images: np.random.rand(*image_shape)})\n      mean = moving_mean.eval()\n      variance = moving_variance.eval()\n      # Although we feed different images, the moving_mean and moving_variance\n      # shouldn't change.\n      self.assertAllClose(mean, expected_mean)\n      self.assertAllClose(variance, expected_var)\n\n  def testReuseVars(self):\n    height, width = 3, 3\n    with self.test_session() as sess:\n      image_shape = (10, height, width, 3)\n      image_values = np.random.rand(*image_shape)\n      expected_mean = np.mean(image_values, axis=(0, 1, 2))\n      expected_var = np.var(image_values, axis=(0, 1, 2))\n      images = tf.constant(image_values, shape=image_shape, dtype=tf.float32)\n      output = ops.batch_norm(images, decay=0.1, is_training=False)\n      update_ops = tf.get_collection(ops.UPDATE_OPS_COLLECTION)\n      with tf.control_dependencies(update_ops):\n        barrier = tf.no_op(name='gradient_barrier')\n        output = control_flow_ops.with_dependencies([barrier], output)\n      # Initialize all variables\n      sess.run(tf.initialize_all_variables())\n      moving_mean = variables.get_variables('BatchNorm/moving_mean')[0]\n      moving_variance = variables.get_variables('BatchNorm/moving_variance')[0]\n      mean, variance = sess.run([moving_mean, moving_variance])\n      # After initialization moving_mean == 0 and moving_variance == 1.\n      self.assertAllClose(mean, [0] * 3)\n      self.assertAllClose(variance, [1] * 3)\n      # Simulate assignment from saver restore.\n      init_assigns = [tf.assign(moving_mean, expected_mean),\n                      tf.assign(moving_variance, expected_var)]\n      sess.run(init_assigns)\n      for _ in range(10):\n        sess.run([output], {images: 
np.random.rand(*image_shape)})\n      mean = moving_mean.eval()\n      variance = moving_variance.eval()\n      # Although we feed different images, the moving_mean and moving_variance\n      # shouldn't change.\n      self.assertAllClose(mean, expected_mean)\n      self.assertAllClose(variance, expected_var)\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/scopes.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the new arg_scope used for TF-Slim ops.\n\n  Allows one to define models much more compactly by eliminating boilerplate\n  code. This is accomplished through the use of argument scoping (arg_scope).\n\n  Example of how to use scopes.arg_scope:\n\n  with scopes.arg_scope(ops.conv2d, padding='SAME',\n                      stddev=0.01, weight_decay=0.0005):\n    net = ops.conv2d(inputs, 64, [11, 11], 4, padding='VALID', scope='conv1')\n    net = ops.conv2d(net, 256, [5, 5], scope='conv2')\n\n  The first call to conv2d will overwrite padding:\n    ops.conv2d(inputs, 64, [11, 11], 4, padding='VALID',\n              stddev=0.01, weight_decay=0.0005, scope='conv1')\n\n  The second call to Conv will use predefined args:\n    ops.conv2d(inputs, 256, [5, 5], padding='SAME',\n               stddev=0.01, weight_decay=0.0005, scope='conv2')\n\n  Example of how to reuse an arg_scope:\n  with scopes.arg_scope(ops.conv2d, padding='SAME',\n                      stddev=0.01, weight_decay=0.0005) as conv2d_arg_scope:\n    net = ops.conv2d(net, 256, [5, 5], scope='conv1')\n    ....\n\n  with scopes.arg_scope(conv2d_arg_scope):\n    net = ops.conv2d(net, 256, [5, 5], scope='conv2')\n\n  Example of how to use scopes.add_arg_scope:\n\n  @scopes.add_arg_scope\n  def conv2d(*args, 
**kwargs)\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport contextlib\nimport functools\n\nfrom tensorflow.python.framework import ops\n\n_ARGSTACK_KEY = (\"__arg_stack\",)\n\n_DECORATED_OPS = set()\n\n\ndef _get_arg_stack():\n  stack = ops.get_collection(_ARGSTACK_KEY)\n  if stack:\n    return stack[0]\n  else:\n    stack = [{}]\n    ops.add_to_collection(_ARGSTACK_KEY, stack)\n    return stack\n\n\ndef _current_arg_scope():\n  stack = _get_arg_stack()\n  return stack[-1]\n\n\ndef _add_op(op):\n  key_op = (op.__module__, op.__name__)\n  if key_op not in _DECORATED_OPS:\n    _DECORATED_OPS.add(key_op)\n\n\n@contextlib.contextmanager\ndef arg_scope(list_ops_or_scope, **kwargs):\n  \"\"\"Stores the default arguments for the given set of list_ops.\n\n  For usage, please see examples at top of the file.\n\n  Args:\n    list_ops_or_scope: List or tuple of operations to set argument scope for or\n      a dictionary containing the current scope. When list_ops_or_scope is a\n      dict, kwargs must be empty. When list_ops_or_scope is a list or tuple,\n      then every op in it needs to be decorated with @add_arg_scope to work.\n    **kwargs: keyword=value that will define the defaults for each op in\n              list_ops. 
All the ops need to accept the given set of arguments.\n\n  Yields:\n    the current_scope, which is a dictionary of {op: {arg: value}}\n  Raises:\n    TypeError: if list_ops is not a list or a tuple.\n    ValueError: if any op in list_ops has not been decorated with\n      @add_arg_scope.\n  \"\"\"\n  if isinstance(list_ops_or_scope, dict):\n    # Assumes that list_ops_or_scope is a scope that is being reused.\n    if kwargs:\n      raise ValueError(\"When attempting to re-use a scope by supplying a \"\n                       \"dictionary, kwargs must be empty.\")\n    current_scope = list_ops_or_scope.copy()\n    try:\n      _get_arg_stack().append(current_scope)\n      yield current_scope\n    finally:\n      _get_arg_stack().pop()\n  else:\n    # Assumes that list_ops_or_scope is a list/tuple of ops with kwargs.\n    if not isinstance(list_ops_or_scope, (list, tuple)):\n      raise TypeError(\"list_ops_or_scope must either be a list/tuple or a \"\n                      \"reused scope (i.e. dict)\")\n    try:\n      current_scope = _current_arg_scope().copy()\n      for op in list_ops_or_scope:\n        key_op = (op.__module__, op.__name__)\n        if not has_arg_scope(op):\n          raise ValueError(\"%s is not decorated with @add_arg_scope\"\n                           % (key_op,))\n        if key_op in current_scope:\n          current_kwargs = current_scope[key_op].copy()\n          current_kwargs.update(kwargs)\n          current_scope[key_op] = current_kwargs\n        else:\n          current_scope[key_op] = kwargs.copy()\n      _get_arg_stack().append(current_scope)\n      yield current_scope\n    finally:\n      _get_arg_stack().pop()\n\n\ndef add_arg_scope(func):\n  \"\"\"Decorates a function with args so it can be used within an arg_scope.\n\n  Args:\n    func: function to decorate.\n\n  Returns:\n    The decorated function func_with_args().\n  \"\"\"\n  @functools.wraps(func)\n  def func_with_args(*args, **kwargs):\n    current_scope = _current_arg_scope()\n    current_args = 
kwargs\n    key_func = (func.__module__, func.__name__)\n    if key_func in current_scope:\n      current_args = current_scope[key_func].copy()\n      current_args.update(kwargs)\n    return func(*args, **current_args)\n  _add_op(func)\n  return func_with_args\n\n\ndef has_arg_scope(func):\n  \"\"\"Checks whether a func has been decorated with @add_arg_scope or not.\n\n  Args:\n    func: function to check.\n\n  Returns:\n    a boolean.\n  \"\"\"\n  key_op = (func.__module__, func.__name__)\n  return key_op in _DECORATED_OPS\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/scopes_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests slim.scopes.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\nfrom inception.slim import scopes\n\n\n@scopes.add_arg_scope\ndef func1(*args, **kwargs):\n  return (args, kwargs)\n\n\n@scopes.add_arg_scope\ndef func2(*args, **kwargs):\n  return (args, kwargs)\n\n\nclass ArgScopeTest(tf.test.TestCase):\n\n  def testEmptyArgScope(self):\n    with self.test_session():\n      self.assertEqual(scopes._current_arg_scope(), {})\n\n  def testCurrentArgScope(self):\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    key_op = (func1.__module__, func1.__name__)\n    current_scope = {key_op: func1_kwargs.copy()}\n    with self.test_session():\n      with scopes.arg_scope([func1], a=1, b=None, c=[1]) as scope:\n        self.assertDictEqual(scope, current_scope)\n\n  def testCurrentArgScopeNested(self):\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    func2_kwargs = {'b': 2, 'd': [2]}\n    key = lambda f: (f.__module__, f.__name__)\n    current_scope = {key(func1): func1_kwargs.copy(),\n                     key(func2): func2_kwargs.copy()}\n    with self.test_session():\n      with scopes.arg_scope([func1], a=1, b=None, c=[1]):\n        with scopes.arg_scope([func2], 
b=2, d=[2]) as scope:\n          self.assertDictEqual(scope, current_scope)\n\n  def testReuseArgScope(self):\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    key_op = (func1.__module__, func1.__name__)\n    current_scope = {key_op: func1_kwargs.copy()}\n    with self.test_session():\n      with scopes.arg_scope([func1], a=1, b=None, c=[1]) as scope1:\n        pass\n      with scopes.arg_scope(scope1) as scope:\n        self.assertDictEqual(scope, current_scope)\n\n  def testReuseArgScopeNested(self):\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    func2_kwargs = {'b': 2, 'd': [2]}\n    key = lambda f: (f.__module__, f.__name__)\n    current_scope1 = {key(func1): func1_kwargs.copy()}\n    current_scope2 = {key(func1): func1_kwargs.copy(),\n                      key(func2): func2_kwargs.copy()}\n    with self.test_session():\n      with scopes.arg_scope([func1], a=1, b=None, c=[1]) as scope1:\n        with scopes.arg_scope([func2], b=2, d=[2]) as scope2:\n          pass\n      with scopes.arg_scope(scope1):\n        self.assertDictEqual(scopes._current_arg_scope(), current_scope1)\n      with scopes.arg_scope(scope2):\n        self.assertDictEqual(scopes._current_arg_scope(), current_scope2)\n\n  def testSimpleArgScope(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    with self.test_session():\n      with scopes.arg_scope([func1], a=1, b=None, c=[1]):\n        args, kwargs = func1(0)\n        self.assertTupleEqual(args, func1_args)\n        self.assertDictEqual(kwargs, func1_kwargs)\n\n  def testSimpleArgScopeWithTuple(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    with self.test_session():\n      with scopes.arg_scope((func1,), a=1, b=None, c=[1]):\n        args, kwargs = func1(0)\n        self.assertTupleEqual(args, func1_args)\n        self.assertDictEqual(kwargs, func1_kwargs)\n\n  def testOverwriteArgScope(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': 2, 'c': 
[1]}\n    with scopes.arg_scope([func1], a=1, b=None, c=[1]):\n      args, kwargs = func1(0, b=2)\n      self.assertTupleEqual(args, func1_args)\n      self.assertDictEqual(kwargs, func1_kwargs)\n\n  def testNestedArgScope(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    with scopes.arg_scope([func1], a=1, b=None, c=[1]):\n      args, kwargs = func1(0)\n      self.assertTupleEqual(args, func1_args)\n      self.assertDictEqual(kwargs, func1_kwargs)\n      func1_kwargs['b'] = 2\n      with scopes.arg_scope([func1], b=2):\n        args, kwargs = func1(0)\n        self.assertTupleEqual(args, func1_args)\n        self.assertDictEqual(kwargs, func1_kwargs)\n\n  def testSharedArgScope(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    with scopes.arg_scope([func1, func2], a=1, b=None, c=[1]):\n      args, kwargs = func1(0)\n      self.assertTupleEqual(args, func1_args)\n      self.assertDictEqual(kwargs, func1_kwargs)\n      args, kwargs = func2(0)\n      self.assertTupleEqual(args, func1_args)\n      self.assertDictEqual(kwargs, func1_kwargs)\n\n  def testSharedArgScopeTuple(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    with scopes.arg_scope((func1, func2), a=1, b=None, c=[1]):\n      args, kwargs = func1(0)\n      self.assertTupleEqual(args, func1_args)\n      self.assertDictEqual(kwargs, func1_kwargs)\n      args, kwargs = func2(0)\n      self.assertTupleEqual(args, func1_args)\n      self.assertDictEqual(kwargs, func1_kwargs)\n\n  def testPartiallySharedArgScope(self):\n    func1_args = (0,)\n    func1_kwargs = {'a': 1, 'b': None, 'c': [1]}\n    func2_args = (1,)\n    func2_kwargs = {'a': 1, 'b': None, 'd': [2]}\n    with scopes.arg_scope([func1, func2], a=1, b=None):\n      with scopes.arg_scope([func1], c=[1]), scopes.arg_scope([func2], d=[2]):\n        args, kwargs = func1(0)\n        self.assertTupleEqual(args, func1_args)\n        
self.assertDictEqual(kwargs, func1_kwargs)\n        args, kwargs = func2(1)\n        self.assertTupleEqual(args, func2_args)\n        self.assertDictEqual(kwargs, func2_kwargs)\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/slim.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"TF-Slim grouped API. Please see README.md for details and usage.\"\"\"\n# pylint: disable=unused-import\n\n# Collapse tf-slim into a single namespace.\nfrom inception.slim import inception_model as inception\nfrom inception.slim import losses\nfrom inception.slim import ops\nfrom inception.slim import scopes\nfrom inception.slim import variables\nfrom inception.slim.scopes import arg_scope\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/variables.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains convenience wrappers for creating variables in TF-Slim.\n\nThe variables module is typically used for defining model variables from the\nops routines (see slim.ops). Such variables are used for training, evaluation\nand inference of models.\n\nAll the variables created through this module would be added to the\nMODEL_VARIABLES collection, if you create a model variable outside slim, it can\nbe added with slim.variables.add_variable(external_variable, reuse).\n\nUsage:\n  weights_initializer = tf.truncated_normal_initializer(stddev=0.01)\n  l2_regularizer = lambda t: losses.l2_loss(t, weight=0.0005)\n  weights = variables.variable('weights',\n                               shape=[100, 100],\n                               initializer=weights_initializer,\n                               regularizer=l2_regularizer,\n                               device='/cpu:0')\n\n  biases = variables.variable('biases',\n                              shape=[100],\n                              initializer=tf.zeros_initializer,\n                              device='/cpu:0')\n\n  # More complex example.\n\n  net = slim.ops.conv2d(input, 32, [3, 3], scope='conv1')\n  net = slim.ops.conv2d(net, 64, [3, 3], scope='conv2')\n  with slim.arg_scope([variables.variable], 
restore=False):\n    net = slim.ops.conv2d(net, 64, [3, 3], scope='conv3')\n\n  # Get all model variables from all the layers.\n  model_variables = slim.variables.get_variables()\n\n  # Get all model variables from a specific layer, e.g. 'conv1'.\n  conv1_variables = slim.variables.get_variables('conv1')\n\n  # Get all weights from all the layers.\n  weights = slim.variables.get_variables_by_name('weights')\n\n  # Get all biases from all the layers.\n  biases = slim.variables.get_variables_by_name('biases')\n\n  # Get all variables to restore.\n  # (i.e. only those created by 'conv1' and 'conv2')\n  variables_to_restore = slim.variables.get_variables_to_restore()\n\n************************************************\n* Initializing model variables from a checkpoint\n************************************************\n\n# Create some variables.\nv1 = slim.variables.variable(name=\"v1\", ..., restore=False)\nv2 = slim.variables.variable(name=\"v2\", ...) # By default restore=True\n...\n# The list of variables to restore should only contain 'v2'.\nvariables_to_restore = slim.variables.get_variables_to_restore()\nrestorer = tf.train.Saver(variables_to_restore)\nwith tf.Session() as sess:\n  # Restore variables from disk.\n  restorer.restore(sess, \"/tmp/model.ckpt\")\n  print(\"Model restored.\")\n  # Do some work with the model\n  ...\n\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom inception.slim import scopes\n\n# Collection containing all the variables created using slim.variables\nMODEL_VARIABLES = '_model_variables_'\n\n# Collection containing the slim.variables that are created with restore=True.\nVARIABLES_TO_RESTORE = '_variables_to_restore_'\n\n\ndef add_variable(var, restore=True):\n  \"\"\"Adds a variable to the MODEL_VARIABLES collection.\n\n    Optionally it will add the variable to the VARIABLES_TO_RESTORE collection.\n  Args:\n    var: a variable.\n 
   restore: whether the variable should be added to the\n      VARIABLES_TO_RESTORE collection.\n\n  \"\"\"\n  collections = [MODEL_VARIABLES]\n  if restore:\n    collections.append(VARIABLES_TO_RESTORE)\n  for collection in collections:\n    if var not in tf.get_collection(collection):\n      tf.add_to_collection(collection, var)\n\n\ndef get_variables(scope=None, suffix=None):\n  \"\"\"Gets the list of variables, filtered by scope and/or suffix.\n\n  Args:\n    scope: an optional scope for filtering the variables to return.\n    suffix: an optional suffix for filtering the variables to return.\n\n  Returns:\n    a copied list of variables with scope and suffix.\n  \"\"\"\n  candidates = tf.get_collection(MODEL_VARIABLES, scope)[:]\n  if suffix is not None:\n    candidates = [var for var in candidates if var.op.name.endswith(suffix)]\n  return candidates\n\n\ndef get_variables_to_restore():\n  \"\"\"Gets the list of variables to restore.\n\n  Returns:\n    a copied list of variables.\n  \"\"\"\n  return tf.get_collection(VARIABLES_TO_RESTORE)[:]\n\n\ndef get_variables_by_name(given_name, scope=None):\n  \"\"\"Gets the list of variables that were given that name.\n\n  Args:\n    given_name: name given to the variable without scope.\n    scope: an optional scope for filtering the variables to return.\n\n  Returns:\n    a copied list of variables with the given name and prefix.\n  \"\"\"\n  return get_variables(scope=scope, suffix=given_name)\n\n\ndef get_unique_variable(name):\n  \"\"\"Gets the variable uniquely identified by that name.\n\n  Args:\n    name: a name that uniquely identifies the variable.\n\n  Returns:\n    a tensorflow variable.\n\n  Raises:\n    ValueError: if no variable uniquely identified by the name exists.\n  \"\"\"\n  candidates = tf.get_collection(tf.GraphKeys.VARIABLES, name)\n  if not candidates:\n    raise ValueError('Couldnt find variable %s' % name)\n\n  for candidate in candidates:\n    if candidate.op.name == name:\n      return 
candidate\n  raise ValueError('Variable %s does not uniquely identify a variable', name)\n\n\nclass VariableDeviceChooser(object):\n  \"\"\"Slim device chooser for variables.\n\n  When using a parameter server it will assign them in a round-robin fashion.\n  When not using a parameter server it allows GPU:0 placement otherwise CPU:0.\n  \"\"\"\n\n  def __init__(self,\n               num_parameter_servers=0,\n               ps_device='/job:ps',\n               placement='CPU:0'):\n    \"\"\"Initialize VariableDeviceChooser.\n\n    Args:\n      num_parameter_servers: number of parameter servers.\n      ps_device: string representing the parameter server device.\n      placement: string representing the placement of the variable either CPU:0\n        or GPU:0. When using parameter servers forced to CPU:0.\n    \"\"\"\n    self._num_ps = num_parameter_servers\n    self._ps_device = ps_device\n    self._placement = placement if num_parameter_servers == 0 else 'CPU:0'\n    self._next_task_id = 0\n\n  def __call__(self, op):\n    device_string = ''\n    if self._num_ps > 0:\n      task_id = self._next_task_id\n      self._next_task_id = (self._next_task_id + 1) % self._num_ps\n      device_string = '%s/task:%d' % (self._ps_device, task_id)\n    device_string += '/%s' % self._placement\n    return device_string\n\n\n# TODO(sguada) Remove once get_variable is able to colocate op.devices.\ndef variable_device(device, name):\n  \"\"\"Fix the variable device to colocate its ops.\"\"\"\n  if callable(device):\n    var_name = tf.get_variable_scope().name + '/' + name\n    var_def = tf.NodeDef(name=var_name, op='Variable')\n    device = device(var_def)\n  if device is None:\n    device = ''\n  return device\n\n\n@scopes.add_arg_scope\ndef global_step(device=''):\n  \"\"\"Returns the global step variable.\n\n  Args:\n    device: Optional device to place the variable. 
It can be an string or a\n      function that is called to get the device for the variable.\n\n  Returns:\n    the tensor representing the global step variable.\n  \"\"\"\n  global_step_ref = tf.get_collection(tf.GraphKeys.GLOBAL_STEP)\n  if global_step_ref:\n    return global_step_ref[0]\n  else:\n    collections = [\n        VARIABLES_TO_RESTORE,\n        tf.GraphKeys.VARIABLES,\n        tf.GraphKeys.GLOBAL_STEP,\n    ]\n    # Get the device for the variable.\n    with tf.device(variable_device(device, 'global_step')):\n      return tf.get_variable('global_step', shape=[], dtype=tf.int64,\n                             initializer=tf.zeros_initializer,\n                             trainable=False, collections=collections)\n\n\n@scopes.add_arg_scope\ndef variable(name, shape=None, dtype=tf.float32, initializer=None,\n             regularizer=None, trainable=True, collections=None, device='',\n             restore=True):\n  \"\"\"Gets an existing variable with these parameters or creates a new one.\n\n    It also add itself to a group with its name.\n\n  Args:\n    name: the name of the new or existing variable.\n    shape: shape of the new or existing variable.\n    dtype: type of the new or existing variable (defaults to `DT_FLOAT`).\n    initializer: initializer for the variable if one is created.\n    regularizer: a (Tensor -> Tensor or None) function; the result of\n        applying it on a newly created variable will be added to the collection\n        GraphKeys.REGULARIZATION_LOSSES and can be used for regularization.\n    trainable: If `True` also add the variable to the graph collection\n      `GraphKeys.TRAINABLE_VARIABLES` (see tf.Variable).\n    collections: A list of collection names to which the Variable will be added.\n      Note that the variable is always also added to the tf.GraphKeys.VARIABLES\n      and MODEL_VARIABLES collections.\n    device: Optional device to place the variable. 
It can be an string or a\n      function that is called to get the device for the variable.\n    restore: whether the variable should be added to the\n      VARIABLES_TO_RESTORE collection.\n\n  Returns:\n    The created or existing variable.\n  \"\"\"\n  collections = list(collections or [])\n\n  # Make sure variables are added to tf.GraphKeys.VARIABLES and MODEL_VARIABLES\n  collections += [tf.GraphKeys.VARIABLES, MODEL_VARIABLES]\n  # Add to VARIABLES_TO_RESTORE if necessary\n  if restore:\n    collections.append(VARIABLES_TO_RESTORE)\n  # Remove duplicates\n  collections = set(collections)\n  # Get the device for the variable.\n  with tf.device(variable_device(device, name)):\n    return tf.get_variable(name, shape=shape, dtype=dtype,\n                           initializer=initializer, regularizer=regularizer,\n                           trainable=trainable, collections=collections)\n"
  },
  {
    "path": "model_zoo/models/inception/inception/slim/variables_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.variables.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom inception.slim import scopes\nfrom inception.slim import variables\n\n\nclass VariablesTest(tf.test.TestCase):\n\n  def testCreateVariable(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n        self.assertEquals(a.op.name, 'A/a')\n        self.assertListEqual(a.get_shape().as_list(), [5])\n\n  def testGetVariables(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n      with tf.variable_scope('B'):\n        b = variables.variable('a', [5])\n      self.assertEquals([a, b], variables.get_variables())\n      self.assertEquals([a], variables.get_variables('A'))\n      self.assertEquals([b], variables.get_variables('B'))\n\n  def testGetVariablesSuffix(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n      with tf.variable_scope('A'):\n        b = variables.variable('b', [5])\n      self.assertEquals([a], variables.get_variables(suffix='a'))\n      self.assertEquals([b], 
variables.get_variables(suffix='b'))\n\n  def testGetVariableWithSingleVar(self):\n    with self.test_session():\n      with tf.variable_scope('parent'):\n        a = variables.variable('child', [5])\n      self.assertEquals(a, variables.get_unique_variable('parent/child'))\n\n  def testGetVariableWithDistractors(self):\n    with self.test_session():\n      with tf.variable_scope('parent'):\n        a = variables.variable('child', [5])\n        with tf.variable_scope('child'):\n          variables.variable('grandchild1', [7])\n          variables.variable('grandchild2', [9])\n      self.assertEquals(a, variables.get_unique_variable('parent/child'))\n\n  def testGetVariableThrowsExceptionWithNoMatch(self):\n    var_name = 'cant_find_me'\n    with self.test_session():\n      with self.assertRaises(ValueError):\n        variables.get_unique_variable(var_name)\n\n  def testGetThrowsExceptionWithChildrenButNoMatch(self):\n    var_name = 'parent/child'\n    with self.test_session():\n      with tf.variable_scope(var_name):\n        variables.variable('grandchild1', [7])\n        variables.variable('grandchild2', [9])\n      with self.assertRaises(ValueError):\n        variables.get_unique_variable(var_name)\n\n  def testGetVariablesToRestore(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n      with tf.variable_scope('B'):\n        b = variables.variable('a', [5])\n      self.assertEquals([a, b], variables.get_variables_to_restore())\n\n  def testNoneGetVariablesToRestore(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5], restore=False)\n      with tf.variable_scope('B'):\n        b = variables.variable('a', [5], restore=False)\n      self.assertEquals([], variables.get_variables_to_restore())\n      self.assertEquals([a, b], variables.get_variables())\n\n  def testGetMixedVariablesToRestore(self):\n    with self.test_session():\n      
with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n        b = variables.variable('b', [5], restore=False)\n      with tf.variable_scope('B'):\n        c = variables.variable('c', [5])\n        d = variables.variable('d', [5], restore=False)\n      self.assertEquals([a, b, c, d], variables.get_variables())\n      self.assertEquals([a, c], variables.get_variables_to_restore())\n\n  def testReuseVariable(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [])\n      with tf.variable_scope('A', reuse=True):\n        b = variables.variable('a', [])\n      self.assertEquals(a, b)\n      self.assertListEqual([a], variables.get_variables())\n\n  def testVariableWithDevice(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [], device='cpu:0')\n        b = variables.variable('b', [], device='cpu:1')\n      self.assertDeviceEqual(a.device, 'cpu:0')\n      self.assertDeviceEqual(b.device, 'cpu:1')\n\n  def testVariableWithDeviceFromScope(self):\n    with self.test_session():\n      with tf.device('/cpu:0'):\n        a = variables.variable('a', [])\n        b = variables.variable('b', [], device='cpu:1')\n      self.assertDeviceEqual(a.device, 'cpu:0')\n      self.assertDeviceEqual(b.device, 'cpu:1')\n\n  def testVariableWithDeviceFunction(self):\n    class DevFn(object):\n\n      def __init__(self):\n        self.counter = -1\n\n      def __call__(self, op):\n        self.counter += 1\n        return 'cpu:%d' % self.counter\n\n    with self.test_session():\n      with scopes.arg_scope([variables.variable], device=DevFn()):\n        a = variables.variable('a', [])\n        b = variables.variable('b', [])\n        c = variables.variable('c', [], device='cpu:12')\n        d = variables.variable('d', [])\n        with tf.device('cpu:99'):\n          e_init = tf.constant(12)\n        e = variables.variable('e', initializer=e_init)\n      
self.assertDeviceEqual(a.device, 'cpu:0')\n      self.assertDeviceEqual(a.initial_value.device, 'cpu:0')\n      self.assertDeviceEqual(b.device, 'cpu:1')\n      self.assertDeviceEqual(b.initial_value.device, 'cpu:1')\n      self.assertDeviceEqual(c.device, 'cpu:12')\n      self.assertDeviceEqual(c.initial_value.device, 'cpu:12')\n      self.assertDeviceEqual(d.device, 'cpu:2')\n      self.assertDeviceEqual(d.initial_value.device, 'cpu:2')\n      self.assertDeviceEqual(e.device, 'cpu:3')\n      self.assertDeviceEqual(e.initial_value.device, 'cpu:99')\n\n  def testVariableWithReplicaDeviceSetter(self):\n    with self.test_session():\n      with tf.device(tf.train.replica_device_setter(ps_tasks=2)):\n        a = variables.variable('a', [])\n        b = variables.variable('b', [])\n        c = variables.variable('c', [], device='cpu:12')\n        d = variables.variable('d', [])\n        with tf.device('cpu:99'):\n          e_init = tf.constant(12)\n        e = variables.variable('e', initializer=e_init)\n      # The values below highlight how the replica_device_setter puts initial\n      # values on the worker job, and how it merges explicit devices.\n      self.assertDeviceEqual(a.device, '/job:ps/task:0/cpu:0')\n      self.assertDeviceEqual(a.initial_value.device, '/job:worker/cpu:0')\n      self.assertDeviceEqual(b.device, '/job:ps/task:1/cpu:0')\n      self.assertDeviceEqual(b.initial_value.device, '/job:worker/cpu:0')\n      self.assertDeviceEqual(c.device, '/job:ps/task:0/cpu:12')\n      self.assertDeviceEqual(c.initial_value.device, '/job:worker/cpu:12')\n      self.assertDeviceEqual(d.device, '/job:ps/task:1/cpu:0')\n      self.assertDeviceEqual(d.initial_value.device, '/job:worker/cpu:0')\n      self.assertDeviceEqual(e.device, '/job:ps/task:0/cpu:0')\n      self.assertDeviceEqual(e.initial_value.device, '/job:worker/cpu:99')\n\n  def testVariableWithVariableDeviceChooser(self):\n\n    with tf.Graph().as_default():\n      device_fn = 
variables.VariableDeviceChooser(num_parameter_servers=2)\n      with scopes.arg_scope([variables.variable], device=device_fn):\n        a = variables.variable('a', [])\n        b = variables.variable('b', [])\n        c = variables.variable('c', [], device='cpu:12')\n        d = variables.variable('d', [])\n        with tf.device('cpu:99'):\n          e_init = tf.constant(12)\n        e = variables.variable('e', initializer=e_init)\n      # The values below highlight how the VariableDeviceChooser puts initial\n      # values on the same device as the variable job.\n      self.assertDeviceEqual(a.device, '/job:ps/task:0/cpu:0')\n      self.assertDeviceEqual(a.initial_value.device, a.device)\n      self.assertDeviceEqual(b.device, '/job:ps/task:1/cpu:0')\n      self.assertDeviceEqual(b.initial_value.device, b.device)\n      self.assertDeviceEqual(c.device, '/cpu:12')\n      self.assertDeviceEqual(c.initial_value.device, c.device)\n      self.assertDeviceEqual(d.device, '/job:ps/task:0/cpu:0')\n      self.assertDeviceEqual(d.initial_value.device, d.device)\n      self.assertDeviceEqual(e.device, '/job:ps/task:1/cpu:0')\n      self.assertDeviceEqual(e.initial_value.device, '/cpu:99')\n\n  def testVariableGPUPlacement(self):\n\n    with tf.Graph().as_default():\n      device_fn = variables.VariableDeviceChooser(placement='gpu:0')\n      with scopes.arg_scope([variables.variable], device=device_fn):\n        a = variables.variable('a', [])\n        b = variables.variable('b', [])\n        c = variables.variable('c', [], device='cpu:12')\n        d = variables.variable('d', [])\n        with tf.device('cpu:99'):\n          e_init = tf.constant(12)\n        e = variables.variable('e', initializer=e_init)\n      # The values below highlight how the VariableDeviceChooser puts initial\n      # values on the same device as the variable job.\n      self.assertDeviceEqual(a.device, '/gpu:0')\n      self.assertDeviceEqual(a.initial_value.device, a.device)\n      
self.assertDeviceEqual(b.device, '/gpu:0')\n      self.assertDeviceEqual(b.initial_value.device, b.device)\n      self.assertDeviceEqual(c.device, '/cpu:12')\n      self.assertDeviceEqual(c.initial_value.device, c.device)\n      self.assertDeviceEqual(d.device, '/gpu:0')\n      self.assertDeviceEqual(d.initial_value.device, d.device)\n      self.assertDeviceEqual(e.device, '/gpu:0')\n      self.assertDeviceEqual(e.initial_value.device, '/cpu:99')\n\n  def testVariableCollection(self):\n    with self.test_session():\n      a = variables.variable('a', [], collections='A')\n      b = variables.variable('b', [], collections='B')\n      self.assertEquals(a, tf.get_collection('A')[0])\n      self.assertEquals(b, tf.get_collection('B')[0])\n\n  def testVariableCollections(self):\n    with self.test_session():\n      a = variables.variable('a', [], collections=['A', 'C'])\n      b = variables.variable('b', [], collections=['B', 'C'])\n      self.assertEquals(a, tf.get_collection('A')[0])\n      self.assertEquals(b, tf.get_collection('B')[0])\n\n  def testVariableCollectionsWithArgScope(self):\n    with self.test_session():\n      with scopes.arg_scope([variables.variable], collections='A'):\n        a = variables.variable('a', [])\n        b = variables.variable('b', [])\n      self.assertListEqual([a, b], tf.get_collection('A'))\n\n  def testVariableCollectionsWithArgScopeNested(self):\n    with self.test_session():\n      with scopes.arg_scope([variables.variable], collections='A'):\n        a = variables.variable('a', [])\n        with scopes.arg_scope([variables.variable], collections='B'):\n          b = variables.variable('b', [])\n      self.assertEquals(a, tf.get_collection('A')[0])\n      self.assertEquals(b, tf.get_collection('B')[0])\n\n  def testVariableCollectionsWithArgScopeNonNested(self):\n    with self.test_session():\n      with scopes.arg_scope([variables.variable], collections='A'):\n        a = variables.variable('a', [])\n      with 
scopes.arg_scope([variables.variable], collections='B'):\n        b = variables.variable('b', [])\n      variables.variable('c', [])\n      self.assertListEqual([a], tf.get_collection('A'))\n      self.assertListEqual([b], tf.get_collection('B'))\n\n  def testVariableRestoreWithArgScopeNested(self):\n    with self.test_session():\n      with scopes.arg_scope([variables.variable], restore=True):\n        a = variables.variable('a', [])\n        with scopes.arg_scope([variables.variable],\n                              trainable=False,\n                              collections=['A', 'B']):\n          b = variables.variable('b', [])\n        c = variables.variable('c', [])\n      self.assertListEqual([a, b, c], variables.get_variables_to_restore())\n      self.assertListEqual([a, c], tf.trainable_variables())\n      self.assertListEqual([b], tf.get_collection('A'))\n      self.assertListEqual([b], tf.get_collection('B'))\n\n\nclass GetVariablesByNameTest(tf.test.TestCase):\n\n  def testGetVariableGivenNameScoped(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n        b = variables.variable('b', [5])\n        self.assertEquals([a], variables.get_variables_by_name('a'))\n        self.assertEquals([b], variables.get_variables_by_name('b'))\n\n  def testGetVariablesByNameReturnsByValueWithScope(self):\n    with self.test_session():\n      with tf.variable_scope('A'):\n        a = variables.variable('a', [5])\n        matched_variables = variables.get_variables_by_name('a')\n\n        # If variables.get_variables_by_name returns the list by reference, the\n        # following append should persist, and be returned, in subsequent calls\n        # to variables.get_variables_by_name('a').\n        matched_variables.append(4)\n\n        matched_variables = variables.get_variables_by_name('a')\n        self.assertEquals([a], matched_variables)\n\n  def testGetVariablesByNameReturnsByValueWithoutScope(self):\n 
   with self.test_session():\n      a = variables.variable('a', [5])\n      matched_variables = variables.get_variables_by_name('a')\n\n      # If variables.get_variables_by_name returns the list by reference, the\n      # following append should persist, and be returned, in subsequent calls\n      # to variables.get_variables_by_name('a').\n      matched_variables.append(4)\n\n      matched_variables = variables.get_variables_by_name('a')\n      self.assertEquals([a], matched_variables)\n\n\nclass GlobalStepTest(tf.test.TestCase):\n\n  def testStable(self):\n    with tf.Graph().as_default():\n      gs = variables.global_step()\n      gs2 = variables.global_step()\n      self.assertTrue(gs is gs2)\n\n  def testDevice(self):\n    with tf.Graph().as_default():\n      with scopes.arg_scope([variables.global_step], device='/gpu:0'):\n        gs = variables.global_step()\n      self.assertDeviceEqual(gs.device, '/gpu:0')\n\n  def testDeviceFn(self):\n    class DevFn(object):\n\n      def __init__(self):\n        self.counter = -1\n\n      def __call__(self, op):\n        self.counter += 1\n        return '/cpu:%d' % self.counter\n\n    with tf.Graph().as_default():\n      with scopes.arg_scope([variables.global_step], device=DevFn()):\n        gs = variables.global_step()\n        gs2 = variables.global_step()\n      self.assertDeviceEqual(gs.device, '/cpu:0')\n      self.assertEquals(gs, gs2)\n      self.assertDeviceEqual(gs2.device, '/cpu:0')\n\n  def testReplicaDeviceSetter(self):\n    device_fn = tf.train.replica_device_setter(2)\n    with tf.Graph().as_default():\n      with scopes.arg_scope([variables.global_step], device=device_fn):\n        gs = variables.global_step()\n        gs2 = variables.global_step()\n        self.assertEquals(gs, gs2)\n        self.assertDeviceEqual(gs.device, '/job:ps/task:0')\n        self.assertDeviceEqual(gs.initial_value.device, '/job:ps/task:0')\n        self.assertDeviceEqual(gs2.device, '/job:ps/task:0')\n        
self.assertDeviceEqual(gs2.initial_value.device, '/job:ps/task:0')\n\n  def testVariableWithVariableDeviceChooser(self):\n\n    with tf.Graph().as_default():\n      device_fn = variables.VariableDeviceChooser()\n      with scopes.arg_scope([variables.global_step], device=device_fn):\n        gs = variables.global_step()\n        gs2 = variables.global_step()\n        self.assertEquals(gs, gs2)\n        self.assertDeviceEqual(gs.device, 'cpu:0')\n        self.assertDeviceEqual(gs.initial_value.device, gs.device)\n        self.assertDeviceEqual(gs2.device, 'cpu:0')\n        self.assertDeviceEqual(gs2.initial_value.device, gs2.device)\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/lm_1b/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//lm_1b/...\",\n    ],\n)\n\npy_library(\n    name = \"data_utils\",\n    srcs = [\"data_utils.py\"],\n)\n\npy_binary(\n    name = \"lm_1b_eval\",\n    srcs = [\n        \"lm_1b_eval.py\",\n    ],\n    deps = [\n        \":data_utils\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/lm_1b/README.md",
    "content": "<font size=4><b>Language Model on One Billion Word Benchmark</b></font>\n\n<b>Authors:</b>\n\nOriol Vinyals (vinyals@google.com, github: OriolVinyals),\nXin Pan (xpan@google.com, github: panyx0718)\n\n<b>Paper Authors:</b>\n\nRafal Jozefowicz, Oriol Vinyals, Mike Schuster, Noam Shazeer, Yonghui Wu\n\n<b>TL;DR</b>\n\nThis is a pretrained model on the One Billion Word Benchmark.\nIf you use this model in your publication, please cite the original paper:\n\n@article{jozefowicz2016exploring,\n  title={Exploring the Limits of Language Modeling},\n  author={Jozefowicz, Rafal and Vinyals, Oriol and Schuster, Mike\n          and Shazeer, Noam and Wu, Yonghui},\n  journal={arXiv preprint arXiv:1602.02410},\n  year={2016}\n}\n\n<b>Introduction</b>\n\nIn this release, we open source a model trained on the One Billion Word\nBenchmark (http://arxiv.org/abs/1312.3005), a large language corpus in English\nwhich was released in 2013. This dataset contains about one billion words, and\nhas a vocabulary size of about 800K words. It contains mostly news data.
 Since\nsentences in the training set are shuffled, models can ignore the context and\nfocus on sentence-level language modeling.\n\nIn the original release and subsequent work, people have trained models on this\ndataset and evaluated them on the same test set, making it a standard benchmark\nfor language modeling.\nRecently, we wrote an article (http://arxiv.org/abs/1602.02410) describing a\nmodel hybrid between character CNN, a large and deep LSTM, and a specific\nSoftmax architecture which allowed us to train the best model on this dataset\nthus far, almost halving the best perplexity previously obtained by others.\n\n<b>Code Release</b>\n\nThe open-sourced components include:\n\n* TensorFlow GraphDef proto buffer text file.\n* TensorFlow pre-trained checkpoint shards.\n* Code used to evaluate the pre-trained model.\n* Vocabulary file.\n* Test set from LM-1B evaluation.\n\nThe code supports 4 evaluation modes:\n\n* Given the provided dataset, calculate the model's perplexity.\n* Given a prefix sentence, predict the next words.\n* Dump the softmax embedding and the character-level CNN word embeddings.\n* Given a sentence, dump the embedding from the LSTM state.\n\n<b>Results</b>\n\nModel | Test Perplexity | Number of Params [billions]\n------|-----------------|----------------------------\nSigmoid-RNN-2048 [Blackout] | 68.3 | 4.1\nInterpolated KN 5-gram, 1.1B n-grams [chelba2013one] | 67.6 | 1.76\nSparse Non-Negative Matrix LM [shazeer2015sparse] | 52.9 | 33\nRNN-1024 + MaxEnt 9-gram features [chelba2013one] | 51.3 | 20\nLSTM-512-512 | 54.1 | 0.82\nLSTM-1024-512 | 48.2 | 0.82\nLSTM-2048-512 | 43.7 | 0.83\nLSTM-8192-2048 (No Dropout) | 37.9 | 3.3\nLSTM-8192-2048 (50\\% Dropout) | 32.2 | 3.3\n2-Layer LSTM-8192-1024 (BIG LSTM) | 30.6 | 1.8\n(THIS RELEASE) BIG LSTM+CNN Inputs | <b>30.0</b> | <b>1.04</b>\n\n<b>How To Run</b>\n\nPrerequisites:\n\n* Install TensorFlow.\n* Install Bazel.\n* Download the data files:\n  * Model GraphDef file:\n
  [link](http://download.tensorflow.org/models/LM_LSTM_CNN/graph-2016-09-10.pbtxt)\n  * Model Checkpoint sharded file:\n  [1](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-base)\n  [2](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-char-embedding)\n  [3](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-lstm)\n  [4](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax0)\n  [5](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax1)\n  [6](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax2)\n  [7](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax3)\n  [8](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax4)\n  [9](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax5)\n  [10](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax6)\n  [11](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax7)\n  [12](http://download.tensorflow.org/models/LM_LSTM_CNN/all_shards-2016-09-10/ckpt-softmax8)\n  * Vocabulary file:\n  [link](http://download.tensorflow.org/models/LM_LSTM_CNN/vocab-2016-09-10.txt)\n  * Test dataset:\n  [link](http://download.tensorflow.org/models/LM_LSTM_CNN/test/news.en.heldout-00000-of-00050)\n* It is recommended to run on a modern desktop instead of a laptop.\n\n```shell\n# 1. Clone the code to your workspace.\n# 2. Download the data to your workspace.\n# 3. Create an empty WORKSPACE file in your workspace.\n# 4.
 Create an empty output directory in your workspace.\n# Example directory structure below:\nls -R\n.:\ndata  lm_1b  output  WORKSPACE\n\n./data:\nckpt-base            ckpt-lstm      ckpt-softmax1  ckpt-softmax3  ckpt-softmax5\nckpt-softmax7  graph-2016-09-10.pbtxt          vocab-2016-09-10.txt\nckpt-char-embedding  ckpt-softmax0  ckpt-softmax2  ckpt-softmax4  ckpt-softmax6\nckpt-softmax8  news.en.heldout-00000-of-00050\n\n./lm_1b:\nBUILD  data_utils.py  lm_1b_eval.py  README.md\n\n./output:\n\n# Build the code.\nbazel build -c opt lm_1b/...\n# Run sample mode:\nbazel-bin/lm_1b/lm_1b_eval --mode sample \\\n                           --prefix \"I love that I\" \\\n                           --pbtxt data/graph-2016-09-10.pbtxt \\\n                           --vocab_file data/vocab-2016-09-10.txt  \\\n                           --ckpt 'data/ckpt-*'\n...(omitted some TensorFlow output)\nI love\nI love that\nI love that I\nI love that I find\nI love that I find that\nI love that I find that amazing\n...(omitted)\n\n# Run eval mode:\nbazel-bin/lm_1b/lm_1b_eval --mode eval \\\n                           --pbtxt data/graph-2016-09-10.pbtxt \\\n                           --vocab_file data/vocab-2016-09-10.txt  \\\n                           --input_data data/news.en.heldout-00000-of-00050 \\\n                           --ckpt 'data/ckpt-*'\n...(omitted some TensorFlow output)\nLoaded step 14108582.\n# perplexity is high initially because words without context are harder to\n# predict.\nEval Step: 0, Average Perplexity: 2045.512297.\nEval Step: 1, Average Perplexity: 229.478699.\nEval Step: 2, Average Perplexity: 208.116787.\nEval Step: 3, Average Perplexity: 338.870601.\nEval Step: 4, Average Perplexity: 228.950107.\nEval Step: 5, Average Perplexity: 197.685857.\nEval Step: 6, Average Perplexity: 156.287063.\nEval Step: 7, Average Perplexity: 124.866189.\nEval Step: 8, Average Perplexity: 147.204975.\nEval Step: 9, Average Perplexity: 90.124864.\nEval Step: 10, Average
 Perplexity: 59.897914.\nEval Step: 11, Average Perplexity: 42.591137.\n...(omitted)\nEval Step: 4529, Average Perplexity: 29.243668.\nEval Step: 4530, Average Perplexity: 29.302362.\nEval Step: 4531, Average Perplexity: 29.285674.\n...(omitted. At convergence, it should be around 30.)\n\n# Run dump_emb mode:\nbazel-bin/lm_1b/lm_1b_eval --mode dump_emb \\\n                           --pbtxt data/graph-2016-09-10.pbtxt \\\n                           --vocab_file data/vocab-2016-09-10.txt  \\\n                           --ckpt 'data/ckpt-*' \\\n                           --save_dir output\n...(omitted some TensorFlow output)\nFinished softmax weights\nFinished word embedding 0/793471\nFinished word embedding 1/793471\nFinished word embedding 2/793471\n...(omitted)\nls output/\nembeddings_softmax.npy ...\n\n# Run dump_lstm_emb mode:\nbazel-bin/lm_1b/lm_1b_eval --mode dump_lstm_emb \\\n                           --pbtxt data/graph-2016-09-10.pbtxt \\\n                           --vocab_file data/vocab-2016-09-10.txt \\\n                           --ckpt 'data/ckpt-*' \\\n                           --sentence \"I love who I am .\" \\\n                           --save_dir output\nls output/\nlstm_emb_step_0.npy  lstm_emb_step_2.npy  lstm_emb_step_4.npy\nlstm_emb_step_6.npy  lstm_emb_step_1.npy  lstm_emb_step_3.npy\nlstm_emb_step_5.npy\n```\n"
  },
  {
    "path": "model_zoo/models/lm_1b/data_utils.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"A library for loading 1B word benchmark dataset.\"\"\"\n\nimport random\n\nimport numpy as np\nimport tensorflow as tf\n\n\nclass Vocabulary(object):\n  \"\"\"Class that holds a vocabulary for the dataset.\"\"\"\n\n  def __init__(self, filename):\n    \"\"\"Initialize vocabulary.\n\n    Args:\n      filename: Vocabulary file name.\n    \"\"\"\n\n    self._id_to_word = []\n    self._word_to_id = {}\n    self._unk = -1\n    self._bos = -1\n    self._eos = -1\n\n    with tf.gfile.Open(filename) as f:\n      idx = 0\n      for line in f:\n        word_name = line.strip()\n        if word_name == '<S>':\n          self._bos = idx\n        elif word_name == '</S>':\n          self._eos = idx\n        elif word_name == '<UNK>':\n          self._unk = idx\n        if word_name == '!!!MAXTERMID':\n          continue\n\n        self._id_to_word.append(word_name)\n        self._word_to_id[word_name] = idx\n        idx += 1\n\n  @property\n  def bos(self):\n    return self._bos\n\n  @property\n  def eos(self):\n    return self._eos\n\n  @property\n  def unk(self):\n    return self._unk\n\n  @property\n  def size(self):\n    return len(self._id_to_word)\n\n  def word_to_id(self, word):\n    if word in self._word_to_id:\n      return self._word_to_id[word]\n    
return self.unk\n\n  def id_to_word(self, cur_id):\n    if cur_id < self.size:\n      return self._id_to_word[cur_id]\n    return 'ERROR'\n\n  def decode(self, cur_ids):\n    \"\"\"Convert a list of ids to a sentence, with space inserted.\"\"\"\n    return ' '.join([self.id_to_word(cur_id) for cur_id in cur_ids])\n\n  def encode(self, sentence):\n    \"\"\"Convert a sentence to a list of ids, with special tokens added.\"\"\"\n    word_ids = [self.word_to_id(cur_word) for cur_word in sentence.split()]\n    return np.array([self.bos] + word_ids + [self.eos], dtype=np.int32)\n\n\nclass CharsVocabulary(Vocabulary):\n  \"\"\"Vocabulary containing character-level information.\"\"\"\n\n  def __init__(self, filename, max_word_length):\n    super(CharsVocabulary, self).__init__(filename)\n    self._max_word_length = max_word_length\n    chars_set = set()\n\n    for word in self._id_to_word:\n      chars_set |= set(word)\n\n    free_ids = []\n    for i in range(256):\n      if chr(i) in chars_set:\n        continue\n      free_ids.append(chr(i))\n\n    if len(free_ids) < 5:\n      raise ValueError('Not enough free char ids: %d' % len(free_ids))\n\n    self.bos_char = free_ids[0]  # <begin sentence>\n    self.eos_char = free_ids[1]  # <end sentence>\n    self.bow_char = free_ids[2]  # <begin word>\n    self.eow_char = free_ids[3]  # <end word>\n    self.pad_char = free_ids[4]  # <padding>\n\n    chars_set |= {self.bos_char, self.eos_char, self.bow_char, self.eow_char,\n                  self.pad_char}\n\n    self._char_set = chars_set\n    num_words = len(self._id_to_word)\n\n    self._word_char_ids = np.zeros([num_words, max_word_length], dtype=np.int32)\n\n    self.bos_chars = self._convert_word_to_char_ids(self.bos_char)\n    self.eos_chars = self._convert_word_to_char_ids(self.eos_char)\n\n    for i, word in enumerate(self._id_to_word):\n      self._word_char_ids[i] = self._convert_word_to_char_ids(word)\n\n  @property\n  def word_char_ids(self):\n    return 
self._word_char_ids\n\n  @property\n  def max_word_length(self):\n    return self._max_word_length\n\n  def _convert_word_to_char_ids(self, word):\n    code = np.zeros([self.max_word_length], dtype=np.int32)\n    code[:] = ord(self.pad_char)\n\n    if len(word) > self.max_word_length - 2:\n      word = word[:self.max_word_length-2]\n    cur_word = self.bow_char + word + self.eow_char\n    for j in range(len(cur_word)):\n      code[j] = ord(cur_word[j])\n    return code\n\n  def word_to_char_ids(self, word):\n    if word in self._word_to_id:\n      return self._word_char_ids[self._word_to_id[word]]\n    else:\n      return self._convert_word_to_char_ids(word)\n\n  def encode_chars(self, sentence):\n    chars_ids = [self.word_to_char_ids(cur_word)\n                 for cur_word in sentence.split()]\n    return np.vstack([self.bos_chars] + chars_ids + [self.eos_chars])\n\n\ndef get_batch(generator, batch_size, num_steps, max_word_length, pad=False):\n  \"\"\"Read batches of input.\"\"\"\n  cur_stream = [None] * batch_size\n\n  inputs = np.zeros([batch_size, num_steps], np.int32)\n  char_inputs = np.zeros([batch_size, num_steps, max_word_length], np.int32)\n  global_word_ids = np.zeros([batch_size, num_steps], np.int32)\n  targets = np.zeros([batch_size, num_steps], np.int32)\n  weights = np.ones([batch_size, num_steps], np.float32)\n\n  no_more_data = False\n  while True:\n    inputs[:] = 0\n    char_inputs[:] = 0\n    global_word_ids[:] = 0\n    targets[:] = 0\n    weights[:] = 0.0\n\n    for i in range(batch_size):\n      cur_pos = 0\n\n      while cur_pos < num_steps:\n        if cur_stream[i] is None or len(cur_stream[i][0]) <= 1:\n          try:\n            cur_stream[i] = list(generator.next())\n          except StopIteration:\n            # No more data, exhaust current streams and quit\n            no_more_data = True\n            break\n\n        how_many = min(len(cur_stream[i][0]) - 1, num_steps - cur_pos)\n        next_pos = cur_pos + how_many\n\n        
inputs[i, cur_pos:next_pos] = cur_stream[i][0][:how_many]\n        char_inputs[i, cur_pos:next_pos] = cur_stream[i][1][:how_many]\n        global_word_ids[i, cur_pos:next_pos] = cur_stream[i][2][:how_many]\n        targets[i, cur_pos:next_pos] = cur_stream[i][0][1:how_many+1]\n        weights[i, cur_pos:next_pos] = 1.0\n\n        cur_pos = next_pos\n        cur_stream[i][0] = cur_stream[i][0][how_many:]\n        cur_stream[i][1] = cur_stream[i][1][how_many:]\n        cur_stream[i][2] = cur_stream[i][2][how_many:]\n\n        if pad:\n          break\n\n    if no_more_data and np.sum(weights) == 0:\n      # There is no more data and this is an empty batch. Done!\n      break\n    yield inputs, char_inputs, global_word_ids, targets, weights\n\n\nclass LM1BDataset(object):\n  \"\"\"Utility class for 1B word benchmark dataset.\n\n  The current implementation reads the data from the tokenized text files.\n  \"\"\"\n\n  def __init__(self, filepattern, vocab):\n    \"\"\"Initialize LM1BDataset reader.\n\n    Args:\n      filepattern: Dataset file pattern.\n      vocab: Vocabulary.\n    \"\"\"\n    self._vocab = vocab\n    self._all_shards = tf.gfile.Glob(filepattern)\n    tf.logging.info('Found %d shards at %s', len(self._all_shards), filepattern)\n\n  def _load_random_shard(self):\n    \"\"\"Randomly select a file and read it.\"\"\"\n    return self._load_shard(random.choice(self._all_shards))\n\n  def _load_shard(self, shard_name):\n    \"\"\"Read one file and convert to ids.\n\n    Args:\n      shard_name: file path.\n\n    Returns:\n      list of (id, char_id, global_word_id) tuples.\n    \"\"\"\n    tf.logging.info('Loading data from: %s', shard_name)\n    with tf.gfile.Open(shard_name) as f:\n      sentences = f.readlines()\n    chars_ids = [self.vocab.encode_chars(sentence) for sentence in sentences]\n    ids = [self.vocab.encode(sentence) for sentence in sentences]\n\n    global_word_ids = []\n    current_idx = 0\n    for word_ids in ids:\n      current_size = 
len(word_ids) - 1  # without <BOS> symbol\n      cur_ids = np.arange(current_idx, current_idx + current_size)\n      global_word_ids.append(cur_ids)\n      current_idx += current_size\n\n    tf.logging.info('Loaded %d words.', current_idx)\n    tf.logging.info('Finished loading')\n    return zip(ids, chars_ids, global_word_ids)\n\n  def _get_sentence(self, forever=True):\n    while True:\n      ids = self._load_random_shard()\n      for current_ids in ids:\n        yield current_ids\n      if not forever:\n        break\n\n  def get_batch(self, batch_size, num_steps, pad=False, forever=True):\n    return get_batch(self._get_sentence(forever), batch_size, num_steps,\n                     self.vocab.max_word_length, pad=pad)\n\n  @property\n  def vocab(self):\n    return self._vocab\n"
  },
  {
    "path": "model_zoo/models/lm_1b/lm_1b_eval.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Eval pre-trained 1 billion word language model.\n\"\"\"\nimport os\nimport sys\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom google.protobuf import text_format\nimport data_utils\n\nFLAGS = tf.flags.FLAGS\n# General flags.\ntf.flags.DEFINE_string('mode', 'eval',\n                       'One of [sample, eval, dump_emb, dump_lstm_emb]. '\n                       '\"sample\" mode samples future word predictions, using '\n                       'FLAGS.prefix as prefix (prefix could be left empty). '\n                       '\"eval\" mode calculates perplexity of the '\n                       'FLAGS.input_data. '\n                       '\"dump_emb\" mode dumps word and softmax embeddings to '\n                       'FLAGS.save_dir. embeddings are dumped in the same '\n                       'order as words in vocabulary. 
All words in vocabulary '\n                       'are dumped. '\n                       'dump_lstm_emb dumps lstm embeddings of FLAGS.sentence '\n                       'to FLAGS.save_dir.')\ntf.flags.DEFINE_string('pbtxt', '',\n                       'GraphDef proto text file used to construct model '\n                       'structure.')\ntf.flags.DEFINE_string('ckpt', '',\n                       'Checkpoint directory used to fill model values.')\ntf.flags.DEFINE_string('vocab_file', '', 'Vocabulary file.')\ntf.flags.DEFINE_string('save_dir', '',\n                       'Used for \"dump_emb\" mode to save word embeddings.')\n# sample mode flags.\ntf.flags.DEFINE_string('prefix', '',\n                       'Used for \"sample\" mode to predict next words.')\ntf.flags.DEFINE_integer('max_sample_words', 100,\n                        'Sampling stops either when </S> is met or this number '\n                        'of steps has passed.')\ntf.flags.DEFINE_integer('num_samples', 3,\n                        'Number of samples to generate for the prefix.')\n# dump_lstm_emb mode flags.\ntf.flags.DEFINE_string('sentence', '',\n                       'Used as input for \"dump_lstm_emb\" mode.')\n# eval mode flags.\ntf.flags.DEFINE_string('input_data', '',\n                       'Input data files for eval mode.')\ntf.flags.DEFINE_integer('max_eval_steps', 1000000,\n                        'Maximum number of steps to run \"eval\" mode.')\n\n\n# For saving demo resources, use batch size 1 and step 1.\nBATCH_SIZE = 1\nNUM_TIMESTEPS = 1\nMAX_WORD_LEN = 50\n\n\ndef _LoadModel(gd_file, ckpt_file):\n  \"\"\"Load the model from GraphDef and Checkpoint.\n\n  Args:\n    gd_file: GraphDef proto text file.\n    ckpt_file: TensorFlow Checkpoint file.\n\n  Returns:\n    TensorFlow session and tensors dict.\n  \"\"\"\n  with tf.Graph().as_default():\n    sys.stderr.write('Recovering graph.\\n')\n    with tf.gfile.FastGFile(gd_file, 'r') as f:\n      s = f.read()\n      gd = 
tf.GraphDef()\n      text_format.Merge(s, gd)\n\n    tf.logging.info('Recovering Graph %s', gd_file)\n    t = {}\n    [t['states_init'], t['lstm/lstm_0/control_dependency'],\n     t['lstm/lstm_1/control_dependency'], t['softmax_out'], t['class_ids_out'],\n     t['class_weights_out'], t['log_perplexity_out'], t['inputs_in'],\n     t['targets_in'], t['target_weights_in'], t['char_inputs_in'],\n     t['all_embs'], t['softmax_weights'], t['global_step']\n    ] = tf.import_graph_def(gd, {}, ['states_init',\n                                     'lstm/lstm_0/control_dependency:0',\n                                     'lstm/lstm_1/control_dependency:0',\n                                     'softmax_out:0',\n                                     'class_ids_out:0',\n                                     'class_weights_out:0',\n                                     'log_perplexity_out:0',\n                                     'inputs_in:0',\n                                     'targets_in:0',\n                                     'target_weights_in:0',\n                                     'char_inputs_in:0',\n                                     'all_embs_out:0',\n                                     'Reshape_3:0',\n                                     'global_step:0'], name='')\n\n    sys.stderr.write('Recovering checkpoint %s\\n' % ckpt_file)\n    sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))\n    sess.run('save/restore_all', {'save/Const:0': ckpt_file})\n    sess.run(t['states_init'])\n\n  return sess, t\n\n\ndef _EvalModel(dataset):\n  \"\"\"Evaluate model perplexity using provided dataset.\n\n  Args:\n    dataset: LM1BDataset object.\n  \"\"\"\n  sess, t = _LoadModel(FLAGS.pbtxt, FLAGS.ckpt)\n\n  current_step = t['global_step'].eval(session=sess)\n  sys.stderr.write('Loaded step %d.\\n' % current_step)\n\n  data_gen = dataset.get_batch(BATCH_SIZE, NUM_TIMESTEPS, forever=False)\n  sum_num = 0.0\n  sum_den = 0.0\n  perplexity = 0.0\n  for i, (inputs, 
char_inputs, _, targets, weights) in enumerate(data_gen):\n    input_dict = {t['inputs_in']: inputs,\n                  t['targets_in']: targets,\n                  t['target_weights_in']: weights}\n    if 'char_inputs_in' in t:\n      input_dict[t['char_inputs_in']] = char_inputs\n    log_perp = sess.run(t['log_perplexity_out'], feed_dict=input_dict)\n\n    if np.isnan(log_perp):\n      sys.stderr.write('log_perplexity is NaN.\\n')\n    else:\n      sum_num += log_perp * weights.mean()\n      sum_den += weights.mean()\n    if sum_den > 0:\n      perplexity = np.exp(sum_num / sum_den)\n\n    sys.stderr.write('Eval Step: %d, Average Perplexity: %f.\\n' %\n                     (i, perplexity))\n\n    if i > FLAGS.max_eval_steps:\n      break\n\n\ndef _SampleSoftmax(softmax):\n  return min(np.sum(np.cumsum(softmax) < np.random.rand()), len(softmax) - 1)\n\n\ndef _SampleModel(prefix_words, vocab):\n  \"\"\"Predict next words using the given prefix words.\n\n  Args:\n    prefix_words: Prefix words.\n    vocab: Vocabulary. 
Contains max word char id length and converts between\n        words and ids.\n  \"\"\"\n  targets = np.zeros([BATCH_SIZE, NUM_TIMESTEPS], np.int32)\n  weights = np.ones([BATCH_SIZE, NUM_TIMESTEPS], np.float32)\n\n  sess, t = _LoadModel(FLAGS.pbtxt, FLAGS.ckpt)\n\n  if prefix_words.find('<S>') != 0:\n    prefix_words = '<S> ' + prefix_words\n\n  prefix = [vocab.word_to_id(w) for w in prefix_words.split()]\n  prefix_char_ids = [vocab.word_to_char_ids(w) for w in prefix_words.split()]\n  for _ in xrange(FLAGS.num_samples):\n    inputs = np.zeros([BATCH_SIZE, NUM_TIMESTEPS], np.int32)\n    char_ids_inputs = np.zeros(\n        [BATCH_SIZE, NUM_TIMESTEPS, vocab.max_word_length], np.int32)\n    samples = prefix[:]\n    char_ids_samples = prefix_char_ids[:]\n    sent = ''\n    while True:\n      inputs[0, 0] = samples[0]\n      char_ids_inputs[0, 0, :] = char_ids_samples[0]\n      samples = samples[1:]\n      char_ids_samples = char_ids_samples[1:]\n\n      softmax = sess.run(t['softmax_out'],\n                         feed_dict={t['char_inputs_in']: char_ids_inputs,\n                                    t['inputs_in']: inputs,\n                                    t['targets_in']: targets,\n                                    t['target_weights_in']: weights})\n\n      sample = _SampleSoftmax(softmax[0])\n      sample_char_ids = vocab.word_to_char_ids(vocab.id_to_word(sample))\n\n      if not samples:\n        samples = [sample]\n        char_ids_samples = [sample_char_ids]\n      sent += vocab.id_to_word(samples[0]) + ' '\n      sys.stderr.write('%s\\n' % sent)\n\n      if (vocab.id_to_word(samples[0]) == '</S>' or\n          len(sent) > FLAGS.max_sample_words):\n        break\n\n\ndef _DumpEmb(vocab):\n  \"\"\"Dump the softmax weights and word embeddings to files.\n\n  Args:\n    vocab: Vocabulary. 
Contains vocabulary size and converts word to ids.\n  \"\"\"\n  assert FLAGS.save_dir, 'Must specify FLAGS.save_dir for dump_emb.'\n  inputs = np.zeros([BATCH_SIZE, NUM_TIMESTEPS], np.int32)\n  targets = np.zeros([BATCH_SIZE, NUM_TIMESTEPS], np.int32)\n  weights = np.ones([BATCH_SIZE, NUM_TIMESTEPS], np.float32)\n\n  sess, t = _LoadModel(FLAGS.pbtxt, FLAGS.ckpt)\n\n  softmax_weights = sess.run(t['softmax_weights'])\n  fname = FLAGS.save_dir + '/embeddings_softmax.npy'\n  with tf.gfile.Open(fname, mode='w') as f:\n    np.save(f, softmax_weights)\n  sys.stderr.write('Finished softmax weights\\n')\n\n  all_embs = np.zeros([vocab.size, 1024])\n  for i in range(vocab.size):\n    input_dict = {t['inputs_in']: inputs,\n                  t['targets_in']: targets,\n                  t['target_weights_in']: weights}\n    if 'char_inputs_in' in t:\n      input_dict[t['char_inputs_in']] = (\n          vocab.word_char_ids[i].reshape([-1, 1, MAX_WORD_LEN]))\n    embs = sess.run(t['all_embs'], input_dict)\n    all_embs[i, :] = embs\n    sys.stderr.write('Finished word embedding %d/%d\\n' % (i, vocab.size))\n\n  fname = FLAGS.save_dir + '/embeddings_char_cnn.npy'\n  with tf.gfile.Open(fname, mode='w') as f:\n    np.save(f, all_embs)\n  sys.stderr.write('Embedding file saved\\n')\n\n\ndef _DumpSentenceEmbedding(sentence, vocab):\n  \"\"\"Dump the LSTM embedding for each step of the given sentence.\n\n  Args:\n    sentence: Sentence words.\n    vocab: Vocabulary. 
Contains max word char id length and converts between\n        words and ids.\n  \"\"\"\n  targets = np.zeros([BATCH_SIZE, NUM_TIMESTEPS], np.int32)\n  weights = np.ones([BATCH_SIZE, NUM_TIMESTEPS], np.float32)\n\n  sess, t = _LoadModel(FLAGS.pbtxt, FLAGS.ckpt)\n\n  if sentence.find('<S>') != 0:\n    sentence = '<S> ' + sentence\n\n  word_ids = [vocab.word_to_id(w) for w in sentence.split()]\n  char_ids = [vocab.word_to_char_ids(w) for w in sentence.split()]\n\n  inputs = np.zeros([BATCH_SIZE, NUM_TIMESTEPS], np.int32)\n  char_ids_inputs = np.zeros(\n      [BATCH_SIZE, NUM_TIMESTEPS, vocab.max_word_length], np.int32)\n  for i in xrange(len(word_ids)):\n    inputs[0, 0] = word_ids[i]\n    char_ids_inputs[0, 0, :] = char_ids[i]\n\n    # Add 'lstm/lstm_0/control_dependency' if you want to dump previous layer\n    # LSTM.\n    lstm_emb = sess.run(t['lstm/lstm_1/control_dependency'],\n                        feed_dict={t['char_inputs_in']: char_ids_inputs,\n                                   t['inputs_in']: inputs,\n                                   t['targets_in']: targets,\n                                   t['target_weights_in']: weights})\n\n    fname = os.path.join(FLAGS.save_dir, 'lstm_emb_step_%d.npy' % i)\n    with tf.gfile.Open(fname, mode='w') as f:\n      np.save(f, lstm_emb)\n    sys.stderr.write('LSTM embedding step %d file saved\\n' % i)\n\n\ndef main(unused_argv):\n  vocab = data_utils.CharsVocabulary(FLAGS.vocab_file, MAX_WORD_LEN)\n\n  if FLAGS.mode == 'eval':\n    dataset = data_utils.LM1BDataset(FLAGS.input_data, vocab)\n    _EvalModel(dataset)\n  elif FLAGS.mode == 'sample':\n    _SampleModel(FLAGS.prefix, vocab)\n  elif FLAGS.mode == 'dump_emb':\n    _DumpEmb(vocab)\n  elif FLAGS.mode == 'dump_lstm_emb':\n    _DumpSentenceEmbedding(FLAGS.sentence, vocab)\n  else:\n    raise Exception('Mode not supported.')\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/namignizer/.gitignore",
    "content": "# Remove the pyc files\n*.pyc\n\n# Ignore the model and the data\nmodel/\ndata/\n"
  },
  {
    "path": "model_zoo/models/namignizer/README.md",
    "content": "# Namignizer\n\nUse a variation of the [PTB](https://www.tensorflow.org/versions/r0.8/tutorials/recurrent/index.html#recurrent-neural-networks) model to recognize and generate names using the [Kaggle Baby Name Database](https://www.kaggle.com/kaggle/us-baby-names).\n\n### API\nNamignizer is implemented in Tensorflow 0.8r and uses the python package `pandas` for some data processing.\n\n#### How to use\nDownload the data from Kaggle and place it in your data directory (or use the small training data provided). The example data looks like so:\n\n```\nId,Name,Year,Gender,Count\n1,Mary,1880,F,7065\n2,Anna,1880,F,2604\n3,Emma,1880,F,2003\n4,Elizabeth,1880,F,1939\n5,Minnie,1880,F,1746\n6,Margaret,1880,F,1578\n7,Ida,1880,F,1472\n8,Alice,1880,F,1414\n9,Bertha,1880,F,1320\n```\n\nBut any data with the two columns: `Name` and `Count` will work.\n\nWith the data, we can then train the model:\n\n```python\ntrain(\"data/SmallNames.txt\", \"model/namignizer\", SmallConfig)\n```\n\nAnd you will get the output:\n\n```\nReading Name data in data/SmallNames.txt\nEpoch: 1 Learning rate: 1.000\n0.090 perplexity: 18.539 speed: 282 lps\n...\n0.890 perplexity: 1.478 speed: 285 lps\n0.990 perplexity: 1.477 speed: 284 lps\nEpoch: 13 Train Perplexity: 1.477\n```\n\nThis will as a side effect write model checkpoints to the `model` directory. With this you will be able to determine the perplexity your model will give you for any arbitrary set of names like so:\n\n```python\nnamignize([\"mary\", \"ida\", \"gazorpazorp\", \"houyhnhnms\", \"bob\"],\n  tf.train.latest_checkpoint(\"model\"), SmallConfig)\n```\nYou will provide the same config and the same checkpoint directory. This will allow you to use a the model you just trained. 
You will then get a perplexity output for each name like so:\n\n```\nName mary gives us a perplexity of 1.03105580807\nName ida gives us a perplexity of 1.07770049572\nName gazorpazorp gives us a perplexity of 175.940353394\nName houyhnhnms gives us a perplexity of 9.53870773315\nName bob gives us a perplexity of 6.03938627243\n```\n\nFinally, you will also be able to generate names using the model like so:\n\n```python\nnamignator(tf.train.latest_checkpoint(\"model\"), SmallConfig)\n```\n\nAgain, you will need to provide the same config and the same checkpoint directory. This will allow you to use the model you just trained. You will then get a single generated name. Examples of output that I got when using the provided data are:\n\n```\n['b', 'e', 'r', 't', 'h', 'a', '`']\n['m', 'a', 'r', 'y', '`']\n['a', 'n', 'n', 'a', '`']\n['m', 'a', 'r', 'y', '`']\n['b', 'e', 'r', 't', 'h', 'a', '`']\n['a', 'n', 'n', 'a', '`']\n['e', 'l', 'i', 'z', 'a', 'b', 'e', 't', 'h', '`']\n```\n\nNotice that each name ends with a backtick. This marks the end of the name.\n\n### Contact Info\n\nFeel free to reach out to me at knt(at google) or k.nathaniel.tucker(at gmail)\n"
  },
  {
    "path": "model_zoo/models/namignizer/data_utils.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Utilities for parsing Kaggle baby names files.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport os\n\nimport numpy as np\nimport tensorflow as tf\nimport pandas as pd\n\n# the default end of name rep will be zero\n_EON = 0\n\n\ndef read_names(names_path):\n    \"\"\"read data from downloaded file. See SmallNames.txt for example format\n    or go to https://www.kaggle.com/kaggle/us-baby-names for full lists\n\n    Args:\n        names_path: path to the csv file similar to the example type\n    Returns:\n        Dataset: a namedtuple of two elements: deduped names and their associated\n            counts. 
The names contain only 26 chars and are all lower case\n    \"\"\"\n    names_data = pd.read_csv(names_path)\n    names_data.Name = names_data.Name.str.lower()\n\n    name_data = names_data.groupby(by=[\"Name\"])[\"Count\"].sum()\n    name_counts = np.array(name_data.tolist())\n    names_deduped = np.array(name_data.index.tolist())\n\n    Dataset = collections.namedtuple('Dataset', ['Name', 'Count'])\n    return Dataset(names_deduped, name_counts)\n\n\ndef _letter_to_number(letter):\n    \"\"\"converts letters to numbers between 1 and 27\"\"\"\n    # ord of lower case 'a' is 97\n    return ord(letter) - 96\n\n\ndef namignizer_iterator(names, counts, batch_size, num_steps, epoch_size):\n    \"\"\"Takes a list of names and counts like those output from read_names, and\n    makes an iterator yielding a batch_size by num_steps array of random names\n    separated by an end of name token. The names are chosen randomly according\n    to their counts. The batch may end mid-name.\n\n    Args:\n        names: a set of lowercase names composed of 26 characters\n        counts: a list of the frequency of those names\n        batch_size: int\n        num_steps: int\n        epoch_size: number of batches to yield\n    Yields:\n        (x, y): a batch_size by num_steps array of ints representing letters, where\n            x will be the input and y will be the target\n    \"\"\"\n    name_distribution = counts / counts.sum()\n\n    for i in range(epoch_size):\n        data = np.zeros(batch_size * num_steps + 1)\n        samples = np.random.choice(names, size=batch_size * num_steps // 2,\n                                   replace=True, p=name_distribution)\n\n        data_index = 0\n        for sample in samples:\n            if data_index >= batch_size * num_steps:\n                break\n            for letter in map(_letter_to_number, sample) + [_EON]:\n                if data_index >= batch_size * num_steps:\n                    break\n                data[data_index] = 
letter\n                data_index += 1\n\n        x = data[:batch_size * num_steps].reshape((batch_size, num_steps))\n        y = data[1:batch_size * num_steps + 1].reshape((batch_size, num_steps))\n\n        yield (x, y)\n\n\ndef name_to_batch(name, batch_size, num_steps):\n    \"\"\" Takes a single name and fills a batch with it\n\n    Args:\n        name: lowercase composed of 26 characters\n        batch_size: int\n        num_steps: int\n    Returns:\n        x, y: a batch_size by num_steps array of ints representing letters, where\n            x will be the input and y will be the target. The array is filled up\n            to the length of the string, the rest is filled with zeros\n    \"\"\"\n    data = np.zeros(batch_size * num_steps + 1)\n\n    data_index = 0\n    for letter in map(_letter_to_number, name) + [_EON]:\n        data[data_index] = letter\n        data_index += 1\n\n    x = data[:batch_size * num_steps].reshape((batch_size, num_steps))\n    y = data[1:batch_size * num_steps + 1].reshape((batch_size, num_steps))\n\n    return x, y\n"
  },
  {
    "path": "model_zoo/models/namignizer/model.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"RNN model with embeddings\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\n\nclass NamignizerModel(object):\n    \"\"\"The Namignizer model ~ strongly based on PTB\"\"\"\n\n    def __init__(self, is_training, config):\n        self.batch_size = batch_size = config.batch_size\n        self.num_steps = num_steps = config.num_steps\n        size = config.hidden_size\n        # will always be 27\n        vocab_size = config.vocab_size\n\n        # placeholders for inputs\n        self._input_data = tf.placeholder(tf.int32, [batch_size, num_steps])\n        self._targets = tf.placeholder(tf.int32, [batch_size, num_steps])\n        # weights for the loss function\n        self._weights = tf.placeholder(tf.float32, [batch_size * num_steps])\n\n        # lstm for our RNN cell (GRU supported too)\n        lstm_cell = tf.nn.rnn_cell.BasicLSTMCell(size, forget_bias=0.0)\n        if is_training and config.keep_prob < 1:\n            lstm_cell = tf.nn.rnn_cell.DropoutWrapper(\n                lstm_cell, output_keep_prob=config.keep_prob)\n        cell = tf.nn.rnn_cell.MultiRNNCell([lstm_cell] * config.num_layers)\n\n        self._initial_state = cell.zero_state(batch_size, tf.float32)\n\n        with tf.device(\"/cpu:0\"):\n            embedding = 
tf.get_variable(\"embedding\", [vocab_size, size])\n            inputs = tf.nn.embedding_lookup(embedding, self._input_data)\n\n        if is_training and config.keep_prob < 1:\n            inputs = tf.nn.dropout(inputs, config.keep_prob)\n\n        outputs = []\n        state = self._initial_state\n        with tf.variable_scope(\"RNN\"):\n            for time_step in range(num_steps):\n                if time_step > 0:\n                    tf.get_variable_scope().reuse_variables()\n                (cell_output, state) = cell(inputs[:, time_step, :], state)\n                outputs.append(cell_output)\n\n        output = tf.reshape(tf.concat(1, outputs), [-1, size])\n        softmax_w = tf.get_variable(\"softmax_w\", [size, vocab_size])\n        softmax_b = tf.get_variable(\"softmax_b\", [vocab_size])\n        logits = tf.matmul(output, softmax_w) + softmax_b\n        loss = tf.nn.seq2seq.sequence_loss_by_example(\n            [logits],\n            [tf.reshape(self._targets, [-1])],\n            [self._weights])\n        self._loss = loss\n        self._cost = cost = tf.reduce_sum(loss) / batch_size\n        self._final_state = state\n\n        # probabilities of each letter\n        self._activations = tf.nn.softmax(logits)\n\n        # ability to save the model\n        self.saver = tf.train.Saver(tf.all_variables())\n\n        if not is_training:\n            return\n\n        self._lr = tf.Variable(0.0, trainable=False)\n        tvars = tf.trainable_variables()\n        grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars),\n                                          config.max_grad_norm)\n        optimizer = tf.train.GradientDescentOptimizer(self.lr)\n        self._train_op = optimizer.apply_gradients(zip(grads, tvars))\n\n    def assign_lr(self, session, lr_value):\n        session.run(tf.assign(self.lr, lr_value))\n\n    @property\n    def input_data(self):\n        return self._input_data\n\n    @property\n    def targets(self):\n        return 
self._targets\n\n    @property\n    def activations(self):\n        return self._activations\n\n    @property\n    def weights(self):\n        return self._weights\n\n    @property\n    def initial_state(self):\n        return self._initial_state\n\n    @property\n    def cost(self):\n        return self._cost\n\n    @property\n    def loss(self):\n        return self._loss\n\n    @property\n    def final_state(self):\n        return self._final_state\n\n    @property\n    def lr(self):\n        return self._lr\n\n    @property\n    def train_op(self):\n        return self._train_op\n"
  },
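The `_weights` placeholder in the model above feeds `sequence_loss_by_example`, which scales each time step's cross-entropy by its weight; `namignize` later exploits this by passing ones up to the name's length and zeros for the padded steps. A minimal pure-Python sketch of that weighted per-step loss (the function name and toy distributions are illustrative, not from the repo):

```python
import math

def weighted_sequence_nll(step_probs, targets, weights):
    """Per-step loss in the style of sequence_loss_by_example:
    loss_t = -weight_t * log p_t(target_t). A weight of 0 masks a step."""
    return [-w * math.log(p[t])
            for p, t, w in zip(step_probs, targets, weights)]

# Two steps over a 27-symbol vocabulary; the second step is masked out.
uniform = [1.0 / 27] * 27
losses = weighted_sequence_nll([uniform, uniform], [3, 5], [1.0, 0.0])
# losses[0] is log(27); losses[1] is 0.0
```

Summing these per-step losses and dividing by the batch size gives the `cost` the model minimizes.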
  {
    "path": "model_zoo/models/namignizer/names.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"A library showing off sequence recognition and generation with the simple\nexample of names.\n\nWe use recurrent neural nets to learn complex functions able to recogize and\ngenerate sequences of a given form. This can be used for natural language\nsyntax recognition, dynamically generating maps or puzzles and of course\nbaby name generation.\n\nBefore using this module, it is recommended to read the Tensorflow tutorial on\nrecurrent neural nets, as it explains the basic concepts of this model, and\nwill show off another module, the PTB module on which this model bases itself.\n\nHere is an overview of the functions available in this module:\n\n* RNN Module for sequence functions based on PTB\n\n* Name recognition specifically for recognizing names, but can be adapted to\n    recognizing sequence patterns\n\n* Name generations specifically for generating names, but can be adapted to\n    generating arbitrary sequence patterns\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport time\n\nimport tensorflow as tf\nimport numpy as np\n\nfrom model import NamignizerModel\nimport data_utils\n\n\nclass SmallConfig(object):\n    \"\"\"Small config.\"\"\"\n    init_scale = 0.1\n    learning_rate = 1.0\n    max_grad_norm = 5\n    num_layers = 2\n    num_steps = 20\n    
hidden_size = 200\n    max_epoch = 4\n    max_max_epoch = 13\n    keep_prob = 1.0\n    lr_decay = 0.5\n    batch_size = 20\n    vocab_size = 27\n    epoch_size = 100\n\n\nclass LargeConfig(object):\n    \"\"\"Medium config.\"\"\"\n    init_scale = 0.05\n    learning_rate = 1.0\n    max_grad_norm = 5\n    num_layers = 2\n    num_steps = 35\n    hidden_size = 650\n    max_epoch = 6\n    max_max_epoch = 39\n    keep_prob = 0.5\n    lr_decay = 0.8\n    batch_size = 20\n    vocab_size = 27\n    epoch_size = 100\n\n\nclass TestConfig(object):\n    \"\"\"Tiny config, for testing.\"\"\"\n    init_scale = 0.1\n    learning_rate = 1.0\n    max_grad_norm = 1\n    num_layers = 1\n    num_steps = 2\n    hidden_size = 2\n    max_epoch = 1\n    max_max_epoch = 1\n    keep_prob = 1.0\n    lr_decay = 0.5\n    batch_size = 20\n    vocab_size = 27\n    epoch_size = 100\n\n\ndef run_epoch(session, m, names, counts, epoch_size, eval_op, verbose=False):\n    \"\"\"Runs the model on the given data for one epoch\n\n    Args:\n        session: the tf session holding the model graph\n        m: an instance of the NamignizerModel\n        names: a set of lowercase names of 26 characters\n        counts: a list of the frequency of the above names\n        epoch_size: the number of batches to run\n        eval_op: whether to change the params or not, and how to do it\n    Kwargs:\n        verbose: whether to print out state of training during the epoch\n    Returns:\n        cost: the average cost during the last stage of the epoch\n    \"\"\"\n    start_time = time.time()\n    costs = 0.0\n    iters = 0\n    for step, (x, y) in enumerate(data_utils.namignizer_iterator(names, counts,\n                                                                 m.batch_size, m.num_steps, epoch_size)):\n\n        cost, _ = session.run([m.cost, eval_op],\n                              {m.input_data: x,\n                               m.targets: y,\n                               m.initial_state: 
m.initial_state.eval(),\n                               m.weights: np.ones(m.batch_size * m.num_steps)})\n        costs += cost\n        iters += m.num_steps\n\n        if verbose and step % (epoch_size // 10) == 9:\n            print(\"%.3f perplexity: %.3f speed: %.0f lps\" %\n                  (step * 1.0 / epoch_size, np.exp(costs / iters),\n                   iters * m.batch_size / (time.time() - start_time)))\n\n        if step >= epoch_size:\n            break\n\n    return np.exp(costs / iters)\n\n\ndef train(data_dir, checkpoint_path, config):\n    \"\"\"Trains the model with the given data\n\n    Args:\n        data_dir: path to the data for the model (see data_utils for data\n            format)\n        checkpoint_path: the path to save the trained model checkpoints\n        config: one of the above configs that specify the model and how it\n            should be run and trained\n    Returns:\n        None\n    \"\"\"\n    # Prepare Name data.\n    print(\"Reading Name data in %s\" % data_dir)\n    names, counts = data_utils.read_names(data_dir)\n\n    with tf.Graph().as_default(), tf.Session() as session:\n        initializer = tf.random_uniform_initializer(-config.init_scale,\n                                                    config.init_scale)\n        with tf.variable_scope(\"model\", reuse=None, initializer=initializer):\n            m = NamignizerModel(is_training=True, config=config)\n\n        tf.initialize_all_variables().run()\n\n        for i in range(config.max_max_epoch):\n            lr_decay = config.lr_decay ** max(i - config.max_epoch, 0.0)\n            m.assign_lr(session, config.learning_rate * lr_decay)\n\n            print(\"Epoch: %d Learning rate: %.3f\" % (i + 1, session.run(m.lr)))\n            train_perplexity = run_epoch(session, m, names, counts, config.epoch_size, m.train_op,\n                                         verbose=True)\n            print(\"Epoch: %d Train Perplexity: %.3f\" %\n                  (i + 1, 
train_perplexity))\n\n            m.saver.save(session, checkpoint_path, global_step=i)\n\n\ndef namignize(names, checkpoint_path, config):\n    \"\"\"Recognizes names and prints the Perplexity of the model for each names\n    in the list\n\n    Args:\n        names: a list of names in the model format\n        checkpoint_path: the path to restore the trained model from, should not\n            include the model name, just the path to\n        config: one of the above configs that specify the model and how it\n            should be run and trained\n    Returns:\n        None\n    \"\"\"\n    with tf.Graph().as_default(), tf.Session() as session:\n\n        with tf.variable_scope(\"model\"):\n            m = NamignizerModel(is_training=False, config=config)\n\n        m.saver.restore(session, checkpoint_path)\n\n        for name in names:\n            x, y = data_utils.name_to_batch(name, m.batch_size, m.num_steps)\n\n            cost, loss, _ = session.run([m.cost, m.loss, tf.no_op()],\n                                  {m.input_data: x,\n                                   m.targets: y,\n                                   m.initial_state: m.initial_state.eval(),\n                                   m.weights: np.concatenate((\n                                       np.ones(len(name)), np.zeros(m.batch_size * m.num_steps - len(name))))})\n\n            print(\"Name {} gives us a perplexity of {}\".format(\n                name, np.exp(cost)))\n\n\ndef namignator(checkpoint_path, config):\n    \"\"\"Generates names randomly according to a given model\n\n    Args:\n        checkpoint_path: the path to restore the trained model from, should not\n            include the model name, just the path to\n        config: one of the above configs that specify the model and how it\n            should be run and trained\n    Returns:\n        None\n    \"\"\"\n    # mutate the config to become a name generator config\n    config.num_steps = 1\n    config.batch_size = 1\n\n    
with tf.Graph().as_default(), tf.Session() as session:\n\n        with tf.variable_scope(\"model\"):\n            m = NamignizerModel(is_training=False, config=config)\n\n        m.saver.restore(session, checkpoint_path)\n\n        activations, final_state, _ = session.run([m.activations, m.final_state, tf.no_op()],\n                                                  {m.input_data: np.zeros((1, 1)),\n                                                   m.targets: np.zeros((1, 1)),\n                                                   m.initial_state: m.initial_state.eval(),\n                                                   m.weights: np.ones(1)})\n\n        # sample from our softmax activations\n        next_letter = np.random.choice(27, p=activations[0])\n        name = [next_letter]\n        while next_letter != 0:\n            activations, final_state, _ = session.run([m.activations, m.final_state, tf.no_op()],\n                                                      {m.input_data: [[next_letter]],\n                                                       m.targets: np.zeros((1, 1)),\n                                                       m.initial_state: final_state,\n                                                       m.weights: np.ones(1)})\n\n            next_letter = np.random.choice(27, p=activations[0])\n            name += [next_letter]\n\n        print(map(lambda x: chr(x + 96), name))\n\n\nif __name__ == \"__main__\":\n    # train(\"data/SmallNames.txt\", \"model/namignizer\", SmallConfig)\n\n    # namignize([\"mary\", \"ida\", \"gazorbazorb\", \"mmmhmm\", \"bob\"],\n    #     tf.train.latest_checkpoint(\"model\"), SmallConfig)\n\n    # namignator(tf.train.latest_checkpoint(\"model\"), SmallConfig)\n"
  },
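`namignator` above draws letter ids from the softmax until it samples the terminator id 0, then maps ids back to characters with `chr(x + 96)` (so 1 is 'a' and 26 is 'z'). A small pure-Python sketch of that decoding loop, with a fixed toy distribution standing in for the model's per-state softmax (the helper names and the static distribution are illustrative, not from the repo):

```python
import random

def ids_to_name(ids):
    """Map letter ids (1..26 -> 'a'..'z') to a string, dropping the
    terminator id 0, mirroring names.py's chr(x + 96) mapping."""
    return "".join(chr(i + 96) for i in ids if i != 0)

def sample_name(letter_dist, rng, max_len=50):
    """Sketch of namignator's loop: draw ids from a distribution over the
    27-symbol vocabulary until the terminator (0) comes up."""
    name = []
    for _ in range(max_len):
        letter = rng.choices(range(27), weights=letter_dist)[0]
        if letter == 0:
            break
        name.append(letter)
    return ids_to_name(name)
```

For example, `ids_to_name([13, 1, 18, 25, 0])` recovers `"mary"`; in the real model the distribution is re-computed from `m.activations` at every step rather than held fixed.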
  {
    "path": "model_zoo/models/neural_gpu/README.md",
    "content": "# NeuralGPU\nCode for the Neural GPU model as described\nin [http://arxiv.org/abs/1511.08228].\n\nRequirements:\n* TensorFlow (see tensorflow.org for how to install)\n* Matplotlib for Python (sudo apt-get install python-matplotlib)\n\nThe model can be trained on the following algorithmic tasks:\n\n* `sort` - Sort a symbol list\n* `kvsort` - Sort symbol keys in dictionary\n* `id` - Return the same symbol list\n* `rev` - Reverse a symbol list\n* `rev2` - Reverse a symbol dictionary by key\n* `incr` - Add one to a symbol value\n* `add` - Long decimal addition\n* `left` - First symbol in list\n* `right` - Last symbol in list\n* `left-shift` - Left shift a symbol list\n* `right-shift` - Right shift a symbol list\n* `bmul` - Long binary multiplication\n* `mul` - Long decimal multiplication\n* `dup` - Duplicate a symbol list with padding\n* `badd` - Long binary addition\n* `qadd` - Long quaternary addition\n* `search` - Search for symbol key in dictionary\n\nThe value range for symbols is defined by the `niclass` and `noclass` flags;\nin particular, the values lie in the range `1 .. min(niclass, noclass) - 1`.\nSo if you set `--niclass=33` and `--noclass=33` (the default) then `--task=rev`\nwill reverse lists of 32 symbols, and `--task=id` will be the identity on a\nlist of up to 32 symbols.\n\nTo train the model on the reverse task, run:\n\n```\npython neural_gpu_trainer.py --task=rev\n```\n\nWhile training, interim checkpoint model parameters will be\nwritten to `/tmp/neural_gpu/`.\n\nOnce the error rate is down to a level you're comfortable\nwith, hit `Ctrl-C` to stop the training process. The latest\nmodel parameters will be in `/tmp/neural_gpu/neural_gpu.ckpt-<step>`\nand will be used on any subsequent run.\n\nTo test how well a trained model decodes, run:\n\n```\npython neural_gpu_trainer.py --task=rev --mode=1\n```\n\nTo produce an animation of the result, run:\n\n```\npython neural_gpu_trainer.py --task=rev --mode=1 --animate=True\n```\n\nMaintained by Lukasz Kaiser (lukaszkaiser)\n"
  },
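The README's symbol-range rule is simple to state in code: id 0 is reserved for padding, so usable symbol values run from 1 up to `min(niclass, noclass) - 1`. A tiny illustrative helper (not part of the repo; matches data_utils' `np.random.randint(nclass - 1) + 1` sampling):

```python
def symbol_range(niclass, noclass):
    """Usable symbol ids under the --niclass/--noclass flags: id 0 is
    the padding symbol, so values run 1 .. min(niclass, noclass) - 1."""
    return list(range(1, min(niclass, noclass)))

# With the defaults --niclass=33 --noclass=33, lists hold up to 32 symbols.
```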
  {
    "path": "model_zoo/models/neural_gpu/data_utils.py",
    "content": "# Copyright 2015 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Convolutional Gated Recurrent Networks for Algorithm Learning.\"\"\"\n\nimport math\nimport random\nimport sys\nimport time\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.platform import gfile\n\nFLAGS = tf.app.flags.FLAGS\n\nbins = [8, 12, 16, 20, 24, 28, 32, 36, 40, 48, 64, 128]\nall_tasks = [\"sort\", \"kvsort\", \"id\", \"rev\", \"rev2\", \"incr\", \"add\", \"left\",\n             \"right\", \"left-shift\", \"right-shift\", \"bmul\", \"mul\", \"dup\",\n             \"badd\", \"qadd\", \"search\"]\nforward_max = 128\nlog_filename = \"\"\n\n\ndef pad(l):\n  for b in bins:\n    if b >= l: return b\n  return forward_max\n\n\ntrain_set = {}\ntest_set = {}\nfor some_task in all_tasks:\n  train_set[some_task] = []\n  test_set[some_task] = []\n  for all_max_len in xrange(10000):\n    train_set[some_task].append([])\n    test_set[some_task].append([])\n\n\ndef add(n1, n2, base=10):\n  \"\"\"Add two numbers represented as lower-endian digit lists.\"\"\"\n  k = max(len(n1), len(n2)) + 1\n  d1 = n1 + [0 for _ in xrange(k - len(n1))]\n  d2 = n2 + [0 for _ in xrange(k - len(n2))]\n  res = []\n  carry = 0\n  for i in xrange(k):\n    if d1[i] + d2[i] + carry < base:\n      res.append(d1[i] + d2[i] + carry)\n      carry = 0\n    else:\n      
res.append(d1[i] + d2[i] + carry - base)\n      carry = 1\n  while res and res[-1] == 0:\n    res = res[:-1]\n  if res: return res\n  return [0]\n\n\ndef init_data(task, length, nbr_cases, nclass):\n  \"\"\"Data initialization.\"\"\"\n  def rand_pair(l, task):\n    \"\"\"Random data pair for a task. Total length should be <= l.\"\"\"\n    k = (l-1)/2\n    base = 10\n    if task[0] == \"b\": base = 2\n    if task[0] == \"q\": base = 4\n    d1 = [np.random.randint(base) for _ in xrange(k)]\n    d2 = [np.random.randint(base) for _ in xrange(k)]\n    if task in [\"add\", \"badd\", \"qadd\"]:\n      res = add(d1, d2, base)\n    elif task in [\"mul\", \"bmul\"]:\n      d1n = sum([d * (base ** i) for i, d in enumerate(d1)])\n      d2n = sum([d * (base ** i) for i, d in enumerate(d2)])\n      if task == \"bmul\":\n        res = [int(x) for x in list(reversed(str(bin(d1n * d2n))))[:-2]]\n      else:\n        res = [int(x) for x in list(reversed(str(d1n * d2n)))]\n    else:\n      sys.exit()\n    sep = [12]\n    if task in [\"add\", \"badd\", \"qadd\"]: sep = [11]\n    inp = [d + 1 for d in d1] + sep + [d + 1 for d in d2]\n    return inp, [r + 1 for r in res]\n\n  def rand_dup_pair(l):\n    \"\"\"Random data pair for duplication task. Total length should be <= l.\"\"\"\n    k = l/2\n    x = [np.random.randint(nclass - 1) + 1 for _ in xrange(k)]\n    inp = x + [0 for _ in xrange(l - k)]\n    res = x + x + [0 for _ in xrange(l - 2*k)]\n    return inp, res\n\n  def rand_rev2_pair(l):\n    \"\"\"Random data pair for reverse2 task. Total length should be <= l.\"\"\"\n    inp = [(np.random.randint(nclass - 1) + 1,\n            np.random.randint(nclass - 1) + 1) for _ in xrange(l/2)]\n    res = [i for i in reversed(inp)]\n    return [x for p in inp for x in p], [x for p in res for x in p]\n\n  def rand_search_pair(l):\n    \"\"\"Random data pair for search task. 
Total length should be <= l.\"\"\"\n    inp = [(np.random.randint(nclass - 1) + 1,\n            np.random.randint(nclass - 1) + 1) for _ in xrange(l-1/2)]\n    q = np.random.randint(nclass - 1) + 1\n    res = 0\n    for (k, v) in reversed(inp):\n      if k == q:\n        res = v\n    return [x for p in inp for x in p] + [q], [res]\n\n  def rand_kvsort_pair(l):\n    \"\"\"Random data pair for key-value sort. Total length should be <= l.\"\"\"\n    keys = [(np.random.randint(nclass - 1) + 1, i) for i in xrange(l/2)]\n    vals = [np.random.randint(nclass - 1) + 1 for _ in xrange(l/2)]\n    kv = [(k, vals[i]) for (k, i) in keys]\n    sorted_kv = [(k, vals[i]) for (k, i) in sorted(keys)]\n    return [x for p in kv for x in p], [x for p in sorted_kv for x in p]\n\n  def spec(inp):\n    \"\"\"Return the target given the input for some tasks.\"\"\"\n    if task == \"sort\":\n      return sorted(inp)\n    elif task == \"id\":\n      return inp\n    elif task == \"rev\":\n      return [i for i in reversed(inp)]\n    elif task == \"incr\":\n      carry = 1\n      res = []\n      for i in xrange(len(inp)):\n        if inp[i] + carry < nclass:\n          res.append(inp[i] + carry)\n          carry = 0\n        else:\n          res.append(1)\n          carry = 1\n      return res\n    elif task == \"left\":\n      return [inp[0]]\n    elif task == \"right\":\n      return [inp[-1]]\n    elif task == \"left-shift\":\n      return [inp[l-1] for l in xrange(len(inp))]\n    elif task == \"right-shift\":\n      return [inp[l+1] for l in xrange(len(inp))]\n    else:\n      print_out(\"Unknown spec for task \" + str(task))\n      sys.exit()\n\n  l = length\n  cur_time = time.time()\n  total_time = 0.0\n  for case in xrange(nbr_cases):\n    total_time += time.time() - cur_time\n    cur_time = time.time()\n    if l > 10000 and case % 100 == 1:\n      print_out(\"  avg gen time %.4f s\" % (total_time / float(case)))\n    if task in [\"add\", \"badd\", \"qadd\", \"bmul\", \"mul\"]:\n      
i, t = rand_pair(l, task)\n      train_set[task][len(i)].append([i, t])\n      i, t = rand_pair(l, task)\n      test_set[task][len(i)].append([i, t])\n    elif task == \"dup\":\n      i, t = rand_dup_pair(l)\n      train_set[task][len(i)].append([i, t])\n      i, t = rand_dup_pair(l)\n      test_set[task][len(i)].append([i, t])\n    elif task == \"rev2\":\n      i, t = rand_rev2_pair(l)\n      train_set[task][len(i)].append([i, t])\n      i, t = rand_rev2_pair(l)\n      test_set[task][len(i)].append([i, t])\n    elif task == \"search\":\n      i, t = rand_search_pair(l)\n      train_set[task][len(i)].append([i, t])\n      i, t = rand_search_pair(l)\n      test_set[task][len(i)].append([i, t])\n    elif task == \"kvsort\":\n      i, t = rand_kvsort_pair(l)\n      train_set[task][len(i)].append([i, t])\n      i, t = rand_kvsort_pair(l)\n      test_set[task][len(i)].append([i, t])\n    else:\n      inp = [np.random.randint(nclass - 1) + 1 for i in xrange(l)]\n      target = spec(inp)\n      train_set[task][l].append([inp, target])\n      inp = [np.random.randint(nclass - 1) + 1 for i in xrange(l)]\n      target = spec(inp)\n      test_set[task][l].append([inp, target])\n\n\ndef to_symbol(i):\n  \"\"\"Covert ids to text.\"\"\"\n  if i == 0: return \"\"\n  if i == 11: return \"+\"\n  if i == 12: return \"*\"\n  return str(i-1)\n\n\ndef to_id(s):\n  \"\"\"Covert text to ids.\"\"\"\n  if s == \"+\": return 11\n  if s == \"*\": return 12\n  return int(s) + 1\n\n\ndef get_batch(max_length, batch_size, do_train, task, offset=None, preset=None):\n  \"\"\"Get a batch of data, training or testing.\"\"\"\n  inputs = []\n  targets = []\n  length = max_length\n  if preset is None:\n    cur_set = test_set[task]\n    if do_train: cur_set = train_set[task]\n    while not cur_set[length]:\n      length -= 1\n  pad_length = pad(length)\n  for b in xrange(batch_size):\n    if preset is None:\n      elem = random.choice(cur_set[length])\n      if offset is not None and offset + b < 
len(cur_set[length]):\n        elem = cur_set[length][offset + b]\n    else:\n      elem = preset\n    inp, target = elem[0], elem[1]\n    assert len(inp) == length\n    inputs.append(inp + [0 for l in xrange(pad_length - len(inp))])\n    targets.append(target + [0 for l in xrange(pad_length - len(target))])\n  res_input = []\n  res_target = []\n  for l in xrange(pad_length):\n    new_input = np.array([inputs[b][l] for b in xrange(batch_size)],\n                         dtype=np.int32)\n    new_target = np.array([targets[b][l] for b in xrange(batch_size)],\n                          dtype=np.int32)\n    res_input.append(new_input)\n    res_target.append(new_target)\n  return res_input, res_target\n\n\ndef print_out(s, newline=True):\n  \"\"\"Print a message out and log it to file.\"\"\"\n  if log_filename:\n    try:\n      with gfile.GFile(log_filename, mode=\"a\") as f:\n        f.write(s + (\"\\n\" if newline else \"\"))\n    # pylint: disable=bare-except\n    except:\n      sys.stdout.write(\"Error appending to %s\\n\" % log_filename)\n  sys.stdout.write(s + (\"\\n\" if newline else \"\"))\n  sys.stdout.flush()\n\n\ndef decode(output):\n  return [np.argmax(o, axis=1) for o in output]\n\n\ndef accuracy(inpt, output, target, batch_size, nprint):\n  \"\"\"Calculate output accuracy given target.\"\"\"\n  assert nprint < batch_size + 1\n  def task_print(inp, output, target):\n    stop_bound = 0\n    print_len = 0\n    while print_len < len(target) and target[print_len] > stop_bound:\n      print_len += 1\n    print_out(\"    i: \" + \" \".join([str(i - 1) for i in inp if i > 0]))\n    print_out(\"    o: \" +\n              \" \".join([str(output[l] - 1) for l in xrange(print_len)]))\n    print_out(\"    t: \" +\n              \" \".join([str(target[l] - 1) for l in xrange(print_len)]))\n  decoded_target = target\n  decoded_output = decode(output)\n  total = 0\n  errors = 0\n  seq = [0 for b in xrange(batch_size)]\n  for l in xrange(len(decoded_output)):\n    for b in 
xrange(batch_size):\n      if decoded_target[l][b] > 0:\n        total += 1\n        if decoded_output[l][b] != decoded_target[l][b]:\n          seq[b] = 1\n          errors += 1\n  e = 0  # Previous error index\n  for _ in xrange(min(nprint, sum(seq))):\n    while seq[e] == 0:\n      e += 1\n    task_print([inpt[l][e] for l in xrange(len(inpt))],\n               [decoded_output[l][e] for l in xrange(len(decoded_target))],\n               [decoded_target[l][e] for l in xrange(len(decoded_target))])\n    e += 1\n  for b in xrange(nprint - errors):\n    task_print([inpt[l][b] for l in xrange(len(inpt))],\n               [decoded_output[l][b] for l in xrange(len(decoded_target))],\n               [decoded_target[l][b] for l in xrange(len(decoded_target))])\n  return errors, total, sum(seq)\n\n\ndef safe_exp(x):\n  perp = 10000\n  if x < 100: perp = math.exp(x)\n  if perp > 10000: return 10000\n  return perp\n"
  },
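`data_utils.add` above operates on little-endian digit lists (least significant digit first) in an arbitrary base, which is why `badd` and `qadd` reuse it with `base=2` and `base=4`. The carry handling can be written more compactly with modular arithmetic; a Python 3 sketch of the same operation (the function name is illustrative, and the original targets Python 2 with `xrange`):

```python
def add_digit_lists(n1, n2, base=10):
    """Little-endian digit-list addition, as in data_utils.add:
    [9, 9] + [1] in base 10 is 99 + 1 = 100 -> [0, 0, 1]."""
    k = max(len(n1), len(n2)) + 1
    d1 = n1 + [0] * (k - len(n1))
    d2 = n2 + [0] * (k - len(n2))
    res, carry = [], 0
    for i in range(k):
        s = d1[i] + d2[i] + carry
        res.append(s % base)   # digit in the current position
        carry = s // base      # carry into the next position
    while len(res) > 1 and res[-1] == 0:
        res.pop()              # strip leading (here: trailing) zeros
    return res
```

As in the original, the all-zero result collapses to `[0]` rather than an empty list.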
  {
    "path": "model_zoo/models/neural_gpu/neural_gpu.py",
    "content": "# Copyright 2015 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"The Neural GPU Model.\"\"\"\n\nimport time\n\nimport tensorflow as tf\n\nimport data_utils\n\n\ndef conv_linear(args, kw, kh, nin, nout, do_bias, bias_start, prefix):\n  \"\"\"Convolutional linear map.\"\"\"\n  assert args is not None\n  if not isinstance(args, (list, tuple)):\n    args = [args]\n  with tf.variable_scope(prefix):\n    k = tf.get_variable(\"CvK\", [kw, kh, nin, nout])\n    if len(args) == 1:\n      res = tf.nn.conv2d(args[0], k, [1, 1, 1, 1], \"SAME\")\n    else:\n      res = tf.nn.conv2d(tf.concat(3, args), k, [1, 1, 1, 1], \"SAME\")\n    if not do_bias: return res\n    bias_term = tf.get_variable(\"CvB\", [nout],\n                                initializer=tf.constant_initializer(0.0))\n    return res + bias_term + bias_start\n\n\ndef sigmoid_cutoff(x, cutoff):\n  \"\"\"Sigmoid with cutoff, e.g., 1.2sigmoid(x) - 0.1.\"\"\"\n  y = tf.sigmoid(x)\n  if cutoff < 1.01: return y\n  d = (cutoff - 1.0) / 2.0\n  return tf.minimum(1.0, tf.maximum(0.0, cutoff * y - d))\n\n\ndef tanh_cutoff(x, cutoff):\n  \"\"\"Tanh with cutoff, e.g., 1.1tanh(x) cut to [-1. 
1].\"\"\"\n  y = tf.tanh(x)\n  if cutoff < 1.01: return y\n  d = (cutoff - 1.0) / 2.0\n  return tf.minimum(1.0, tf.maximum(-1.0, (1.0 + d) * y))\n\n\ndef conv_gru(inpts, mem, kw, kh, nmaps, cutoff, prefix):\n  \"\"\"Convolutional GRU.\"\"\"\n  def conv_lin(args, suffix, bias_start):\n    return conv_linear(args, kw, kh, len(args) * nmaps, nmaps, True, bias_start,\n                       prefix + \"/\" + suffix)\n  reset = sigmoid_cutoff(conv_lin(inpts + [mem], \"r\", 1.0), cutoff)\n  # candidate = tanh_cutoff(conv_lin(inpts + [reset * mem], \"c\", 0.0), cutoff)\n  candidate = tf.tanh(conv_lin(inpts + [reset * mem], \"c\", 0.0))\n  gate = sigmoid_cutoff(conv_lin(inpts + [mem], \"g\", 1.0), cutoff)\n  return gate * mem + (1 - gate) * candidate\n\n\n@tf.RegisterGradient(\"CustomIdG\")\ndef _custom_id_grad(_, grads):\n  return grads\n\n\ndef quantize(t, quant_scale, max_value=1.0):\n  \"\"\"Quantize a tensor t with each element in [-max_value, max_value].\"\"\"\n  t = tf.minimum(max_value, tf.maximum(t, -max_value))\n  big = quant_scale * (t + max_value) + 0.5\n  with tf.get_default_graph().gradient_override_map({\"Floor\": \"CustomIdG\"}):\n    res = (tf.floor(big) / quant_scale) - max_value\n  return res\n\n\ndef quantize_weights_op(quant_scale, max_value):\n  ops = [v.assign(quantize(v, quant_scale, float(max_value)))\n         for v in tf.trainable_variables()]\n  return tf.group(*ops)\n\n\ndef relaxed_average(var_name_suffix, rx_step):\n  \"\"\"Calculate the average of relaxed variables having var_name_suffix.\"\"\"\n  relaxed_vars = []\n  for l in xrange(rx_step):\n    with tf.variable_scope(\"RX%d\" % l, reuse=True):\n      try:\n        relaxed_vars.append(tf.get_variable(var_name_suffix))\n      except ValueError:\n        pass\n  dsum = tf.add_n(relaxed_vars)\n  avg = dsum / len(relaxed_vars)\n  diff = [v - avg for v in relaxed_vars]\n  davg = tf.add_n([d*d for d in diff])\n  return avg, tf.reduce_sum(davg)\n\n\ndef relaxed_distance(rx_step):\n  
\"\"\"Distance between relaxed variables and their average.\"\"\"\n  res, ops, rx_done = [], [], {}\n  for v in tf.trainable_variables():\n    if v.name[0:2] == \"RX\":\n      rx_name = v.op.name[v.name.find(\"/\") + 1:]\n      if rx_name not in rx_done:\n        avg, dist_loss = relaxed_average(rx_name, rx_step)\n        res.append(dist_loss)\n        rx_done[rx_name] = avg\n      ops.append(v.assign(rx_done[rx_name]))\n  return tf.add_n(res), tf.group(*ops)\n\n\ndef make_dense(targets, noclass):\n  \"\"\"Move a batch of targets to a dense 1-hot representation.\"\"\"\n  with tf.device(\"/cpu:0\"):\n    shape = tf.shape(targets)\n    batch_size = shape[0]\n    indices = targets + noclass * tf.range(0, batch_size)\n    length = tf.expand_dims(batch_size * noclass, 0)\n    dense = tf.sparse_to_dense(indices, length, 1.0, 0.0)\n  return tf.reshape(dense, [-1, noclass])\n\n\ndef check_for_zero(sparse):\n  \"\"\"In a sparse batch of ints, make 1.0 if it's 0 and 0.0 else.\"\"\"\n  with tf.device(\"/cpu:0\"):\n    shape = tf.shape(sparse)\n    batch_size = shape[0]\n    sparse = tf.minimum(sparse, 1)\n    indices = sparse + 2 * tf.range(0, batch_size)\n    dense = tf.sparse_to_dense(indices, tf.expand_dims(2 * batch_size, 0),\n                               1.0, 0.0)\n    reshaped = tf.reshape(dense, [-1, 2])\n  return tf.reshape(tf.slice(reshaped, [0, 0], [-1, 1]), [-1])\n\n\nclass NeuralGPU(object):\n  \"\"\"Neural GPU Model.\"\"\"\n\n  def __init__(self, nmaps, vec_size, niclass, noclass, dropout, rx_step,\n               max_grad_norm, cutoff, nconvs, kw, kh, height, mode,\n               learning_rate, pull, pull_incr, min_length, act_noise=0.0):\n    # Feeds for parameters and ops to update them.\n    self.global_step = tf.Variable(0, trainable=False)\n    self.cur_length = tf.Variable(min_length, trainable=False)\n    self.cur_length_incr_op = self.cur_length.assign_add(1)\n    self.lr = tf.Variable(float(learning_rate), trainable=False)\n    self.lr_decay_op = 
self.lr.assign(self.lr * 0.98)\n    self.pull = tf.Variable(float(pull), trainable=False)\n    self.pull_incr_op = self.pull.assign(self.pull * pull_incr)\n    self.do_training = tf.placeholder(tf.float32, name=\"do_training\")\n    self.noise_param = tf.placeholder(tf.float32, name=\"noise_param\")\n\n    # Feeds for inputs, targets, outputs, losses, etc.\n    self.input = []\n    self.target = []\n    for l in xrange(data_utils.forward_max + 1):\n      self.input.append(tf.placeholder(tf.int32, name=\"inp{0}\".format(l)))\n      self.target.append(tf.placeholder(tf.int32, name=\"tgt{0}\".format(l)))\n    self.outputs = []\n    self.losses = []\n    self.grad_norms = []\n    self.updates = []\n\n    # Computation.\n    inp0_shape = tf.shape(self.input[0])\n    batch_size = inp0_shape[0]\n    with tf.device(\"/cpu:0\"):\n      emb_weights = tf.get_variable(\n          \"embedding\", [niclass, vec_size],\n          initializer=tf.random_uniform_initializer(-1.7, 1.7))\n      e0 = tf.scatter_update(emb_weights,\n                             tf.constant(0, dtype=tf.int32, shape=[1]),\n                             tf.zeros([1, vec_size]))\n\n    adam = tf.train.AdamOptimizer(self.lr, epsilon=1e-4)\n\n    # Main graph creation loop, for every bin in data_utils.\n    self.steps = []\n    for length in sorted(list(set(data_utils.bins + [data_utils.forward_max]))):\n      data_utils.print_out(\"Creating model for bin of length %d.\" % length)\n      start_time = time.time()\n      if length > data_utils.bins[0]:\n        tf.get_variable_scope().reuse_variables()\n\n      # Embed inputs and calculate mask.\n      with tf.device(\"/cpu:0\"):\n        with tf.control_dependencies([e0]):\n          embedded = [tf.nn.embedding_lookup(emb_weights, self.input[l])\n                      for l in xrange(length)]\n        # Mask to 0-out padding space in each step.\n        imask = [check_for_zero(self.input[l]) for l in xrange(length)]\n        omask = 
[check_for_zero(self.target[l]) for l in xrange(length)]\n        mask = [1.0 - (imask[i] * omask[i]) for i in xrange(length)]\n        mask = [tf.reshape(m, [-1, 1]) for m in mask]\n        # Use a shifted mask for step scaling and concatenated for weights.\n        shifted_mask = mask + [tf.zeros_like(mask[0])]\n        scales = [shifted_mask[i] * (1.0 - shifted_mask[i+1])\n                  for i in xrange(length)]\n        scales = [tf.reshape(s, [-1, 1, 1, 1]) for s in scales]\n        mask = tf.concat(1, mask[0:length])  # batch x length\n        weights = mask\n        # Add a height dimension to mask to use later for masking.\n        mask = tf.reshape(mask, [-1, length, 1, 1])\n        mask = tf.concat(2, [mask for _ in xrange(height)]) + tf.zeros(\n            tf.pack([batch_size, length, height, nmaps]), dtype=tf.float32)\n\n      # Start is a length-list of batch-by-nmaps tensors, reshape and concat.\n      start = [tf.tanh(embedded[l]) for l in xrange(length)]\n      start = [tf.reshape(start[l], [-1, 1, nmaps]) for l in xrange(length)]\n      start = tf.reshape(tf.concat(1, start), [-1, length, 1, nmaps])\n\n      # First image comes from start by applying one convolution and adding 0s.\n      first = conv_linear(start, 1, 1, vec_size, nmaps, True, 0.0, \"input\")\n      first = [first] + [tf.zeros(tf.pack([batch_size, length, 1, nmaps]),\n                                  dtype=tf.float32) for _ in xrange(height - 1)]\n      first = tf.concat(2, first)\n\n      # Computation steps.\n      keep_prob = 1.0 - self.do_training * (dropout * 8.0 / float(length))\n      step = [tf.nn.dropout(first, keep_prob) * mask]\n      act_noise_scale = act_noise * self.do_training * self.pull\n      outputs = []\n      for it in xrange(length):\n        with tf.variable_scope(\"RX%d\" % (it % rx_step)) as vs:\n          if it >= rx_step:\n            vs.reuse_variables()\n          cur = step[it]\n          # Do nconvs-many CGRU steps.\n          for layer in 
xrange(nconvs):\n            cur = conv_gru([], cur, kw, kh, nmaps, cutoff, \"cgru_%d\" % layer)\n            cur *= mask\n          outputs.append(tf.slice(cur, [0, 0, 0, 0], [-1, -1, 1, -1]))\n          cur = tf.nn.dropout(cur, keep_prob)\n          if act_noise > 0.00001:\n            cur += tf.truncated_normal(tf.shape(cur)) * act_noise_scale\n          step.append(cur * mask)\n\n      self.steps.append([tf.reshape(s, [-1, length, height * nmaps])\n                         for s in step])\n      # Output is the n-th step output; n = current length, as in scales.\n      output = tf.add_n([outputs[i] * scales[i] for i in xrange(length)])\n      # Final convolution to get logits, list outputs.\n      output = conv_linear(output, 1, 1, nmaps, noclass, True, 0.0, \"output\")\n      output = tf.reshape(output, [-1, length, noclass])\n      external_output = [tf.reshape(o, [-1, noclass])\n                         for o in list(tf.split(1, length, output))]\n      external_output = [tf.nn.softmax(o) for o in external_output]\n      self.outputs.append(external_output)\n\n      # Calculate cross-entropy loss and normalize it.\n      targets = tf.concat(1, [make_dense(self.target[l], noclass)\n                              for l in xrange(length)])\n      targets = tf.reshape(targets, [-1, noclass])\n      xent = tf.reshape(tf.nn.softmax_cross_entropy_with_logits(\n          tf.reshape(output, [-1, noclass]), targets), [-1, length])\n      perp_loss = tf.reduce_sum(xent * weights)\n      perp_loss /= tf.cast(batch_size, dtype=tf.float32)\n      perp_loss /= length\n\n      # Final loss: cross-entropy + shared parameter relaxation part.\n      relax_dist, self.avg_op = relaxed_distance(rx_step)\n      total_loss = perp_loss + relax_dist * self.pull\n      self.losses.append(perp_loss)\n\n      # Gradients and Adam update operation.\n      if length == data_utils.bins[0] or (mode == 0 and\n                                          length < data_utils.bins[-1] + 1):\n       
 data_utils.print_out(\"Creating backward for bin of length %d.\" % length)\n        params = tf.trainable_variables()\n        grads = tf.gradients(total_loss, params)\n        grads, norm = tf.clip_by_global_norm(grads, max_grad_norm)\n        self.grad_norms.append(norm)\n        for grad in grads:\n          if isinstance(grad, tf.Tensor):\n            grad += tf.truncated_normal(tf.shape(grad)) * self.noise_param\n        update = adam.apply_gradients(zip(grads, params),\n                                      global_step=self.global_step)\n        self.updates.append(update)\n      data_utils.print_out(\"Created model for bin of length %d in\"\n                           \" %.2f s.\" % (length, time.time() - start_time))\n    self.saver = tf.train.Saver(tf.all_variables())\n\n  def step(self, sess, inp, target, do_backward, noise_param=None,\n           get_steps=False):\n    \"\"\"Run a step of the network.\"\"\"\n    assert len(inp) == len(target)\n    length = len(target)\n    feed_in = {}\n    feed_in[self.noise_param.name] = noise_param if noise_param else 0.0\n    feed_in[self.do_training.name] = 1.0 if do_backward else 0.0\n    feed_out = []\n    index = len(data_utils.bins)\n    if length < data_utils.bins[-1] + 1:\n      index = data_utils.bins.index(length)\n    if do_backward:\n      feed_out.append(self.updates[index])\n      feed_out.append(self.grad_norms[index])\n    feed_out.append(self.losses[index])\n    for l in xrange(length):\n      feed_in[self.input[l].name] = inp[l]\n    for l in xrange(length):\n      feed_in[self.target[l].name] = target[l]\n      feed_out.append(self.outputs[index][l])\n    if get_steps:\n      for l in xrange(length+1):\n        feed_out.append(self.steps[index][l])\n    res = sess.run(feed_out, feed_in)\n    offset = 0\n    norm = None\n    if do_backward:\n      offset = 2\n      norm = res[1]\n    outputs = res[offset + 1:offset + 1 + length]\n    steps = res[offset + 1 + length:] if get_steps else None\n    
return res[offset], outputs, norm, steps\n"
  },
  {
    "path": "model_zoo/models/neural_gpu/neural_gpu_trainer.py",
    "content": "# Copyright 2015 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Neural GPU for Learning Algorithms.\"\"\"\n\nimport math\nimport os\nimport random\nimport sys\nimport time\n\nimport matplotlib.animation as anim\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.platform import gfile\n\nimport data_utils as data\nimport neural_gpu\n\ntf.app.flags.DEFINE_float(\"lr\", 0.001, \"Learning rate.\")\ntf.app.flags.DEFINE_float(\"init_weight\", 1.0, \"Initial weights deviation.\")\ntf.app.flags.DEFINE_float(\"max_grad_norm\", 1.0, \"Clip gradients to this norm.\")\ntf.app.flags.DEFINE_float(\"cutoff\", 1.2, \"Cutoff at the gates.\")\ntf.app.flags.DEFINE_float(\"pull\", 0.0005, \"Starting pull of the relaxations.\")\ntf.app.flags.DEFINE_float(\"pull_incr\", 1.2, \"Increase pull by that much.\")\ntf.app.flags.DEFINE_float(\"curriculum_bound\", 0.15, \"Move curriculum < this.\")\ntf.app.flags.DEFINE_float(\"dropout\", 0.15, \"Dropout that much.\")\ntf.app.flags.DEFINE_float(\"grad_noise_scale\", 0.0, \"Gradient noise scale.\")\ntf.app.flags.DEFINE_integer(\"batch_size\", 32, \"Batch size.\")\ntf.app.flags.DEFINE_integer(\"low_batch_size\", 16, \"Low batch size.\")\ntf.app.flags.DEFINE_integer(\"steps_per_checkpoint\", 200, \"Steps per epoch.\")\ntf.app.flags.DEFINE_integer(\"nmaps\", 
128, \"Number of floats in each cell.\")\ntf.app.flags.DEFINE_integer(\"niclass\", 33, \"Number of classes (0 is padding).\")\ntf.app.flags.DEFINE_integer(\"noclass\", 33, \"Number of classes (0 is padding).\")\ntf.app.flags.DEFINE_integer(\"train_data_size\", 5000, \"Training examples/len.\")\ntf.app.flags.DEFINE_integer(\"max_length\", 41, \"Maximum length.\")\ntf.app.flags.DEFINE_integer(\"rx_step\", 6, \"Relax that many recursive steps.\")\ntf.app.flags.DEFINE_integer(\"random_seed\", 125459, \"Random seed.\")\ntf.app.flags.DEFINE_integer(\"nconvs\", 2, \"How many convolutions / 1 step.\")\ntf.app.flags.DEFINE_integer(\"kw\", 3, \"Kernel width.\")\ntf.app.flags.DEFINE_integer(\"kh\", 3, \"Kernel height.\")\ntf.app.flags.DEFINE_integer(\"height\", 4, \"Height.\")\ntf.app.flags.DEFINE_integer(\"forward_max\", 401, \"Maximum forward length.\")\ntf.app.flags.DEFINE_integer(\"jobid\", -1, \"Task id when running on borg.\")\ntf.app.flags.DEFINE_integer(\"nprint\", 0, \"How many test examples to print out.\")\ntf.app.flags.DEFINE_integer(\"mode\", 0, \"Mode: 0-train other-decode.\")\ntf.app.flags.DEFINE_bool(\"animate\", False, \"Whether to produce an animation.\")\ntf.app.flags.DEFINE_bool(\"quantize\", False, \"Whether to quantize variables.\")\ntf.app.flags.DEFINE_string(\"task\", \"rev\", \"Which task are we learning?\")\ntf.app.flags.DEFINE_string(\"train_dir\", \"/tmp/\", \"Directory to store models.\")\ntf.app.flags.DEFINE_string(\"ensemble\", \"\", \"Model paths for ensemble.\")\n\nFLAGS = tf.app.flags.FLAGS\nEXTRA_EVAL = 12\n\n\ndef initialize(sess):\n  \"\"\"Initialize data and model.\"\"\"\n  if FLAGS.jobid >= 0:\n    data.log_filename = os.path.join(FLAGS.train_dir, \"log%d\" % FLAGS.jobid)\n  data.print_out(\"NN \", newline=False)\n\n  # Set random seed.\n  seed = FLAGS.random_seed + max(0, FLAGS.jobid)\n  tf.set_random_seed(seed)\n  random.seed(seed)\n  np.random.seed(seed)\n\n  # Check data sizes.\n  assert data.bins\n  min_length = 3\n  max_length = 
min(FLAGS.max_length, data.bins[-1])\n  assert max_length + 1 > min_length\n  while len(data.bins) > 1 and data.bins[-2] > max_length + EXTRA_EVAL:\n    data.bins = data.bins[:-1]\n  assert data.bins[0] > FLAGS.rx_step\n  data.forward_max = max(FLAGS.forward_max, data.bins[-1])\n  nclass = min(FLAGS.niclass, FLAGS.noclass)\n  data_size = FLAGS.train_data_size if FLAGS.mode == 0 else 1000\n\n  # Initialize data for each task.\n  tasks = FLAGS.task.split(\"-\")\n  for t in tasks:\n    for l in xrange(max_length + EXTRA_EVAL - 1):\n      data.init_data(t, l, data_size, nclass)\n    data.init_data(t, data.bins[-2], data_size, nclass)\n    data.init_data(t, data.bins[-1], data_size, nclass)\n    end_size = 4 * 1024 if FLAGS.mode > 0 else 1024\n    data.init_data(t, data.forward_max, end_size, nclass)\n\n  # Print out parameters.\n  curriculum = FLAGS.curriculum_bound\n  msg1 = (\"layers %d kw %d h %d kh %d relax %d batch %d noise %.2f task %s\"\n          % (FLAGS.nconvs, FLAGS.kw, FLAGS.height, FLAGS.kh, FLAGS.rx_step,\n             FLAGS.batch_size, FLAGS.grad_noise_scale, FLAGS.task))\n  msg2 = \"data %d %s\" % (FLAGS.train_data_size, msg1)\n  msg3 = (\"cut %.2f pull %.3f lr %.2f iw %.2f cr %.2f nm %d d%.4f gn %.2f %s\" %\n          (FLAGS.cutoff, FLAGS.pull_incr, FLAGS.lr, FLAGS.init_weight,\n           curriculum, FLAGS.nmaps, FLAGS.dropout, FLAGS.max_grad_norm, msg2))\n  data.print_out(msg3)\n\n  # Create checkpoint directory if it does not exist.\n  checkpoint_dir = os.path.join(FLAGS.train_dir, \"neural_gpu%s\"\n                                % (\"\" if FLAGS.jobid < 0 else str(FLAGS.jobid)))\n  if not gfile.IsDirectory(checkpoint_dir):\n    data.print_out(\"Creating checkpoint directory %s.\" % checkpoint_dir)\n    gfile.MkDir(checkpoint_dir)\n\n  # Create model and initialize it.\n  tf.get_variable_scope().set_initializer(\n      tf.uniform_unit_scaling_initializer(factor=1.8 * FLAGS.init_weight))\n  model = neural_gpu.NeuralGPU(\n      FLAGS.nmaps, 
FLAGS.nmaps, FLAGS.niclass, FLAGS.noclass, FLAGS.dropout,\n      FLAGS.rx_step, FLAGS.max_grad_norm, FLAGS.cutoff, FLAGS.nconvs,\n      FLAGS.kw, FLAGS.kh, FLAGS.height, FLAGS.mode, FLAGS.lr,\n      FLAGS.pull, FLAGS.pull_incr, min_length + 3)\n  data.print_out(\"Created model.\")\n  sess.run(tf.initialize_all_variables())\n  data.print_out(\"Initialized variables.\")\n\n  # Load model from parameters if a checkpoint exists.\n  ckpt = tf.train.get_checkpoint_state(checkpoint_dir)\n  if ckpt and gfile.Exists(ckpt.model_checkpoint_path):\n    data.print_out(\"Reading model parameters from %s\"\n                   % ckpt.model_checkpoint_path)\n    model.saver.restore(sess, ckpt.model_checkpoint_path)\n\n  # Check if there are ensemble models and get their checkpoints.\n  ensemble = []\n  ensemble_dir_list = [d for d in FLAGS.ensemble.split(\",\") if d]\n  for ensemble_dir in ensemble_dir_list:\n    ckpt = tf.train.get_checkpoint_state(ensemble_dir)\n    if ckpt and gfile.Exists(ckpt.model_checkpoint_path):\n      data.print_out(\"Found ensemble model %s\" % ckpt.model_checkpoint_path)\n      ensemble.append(ckpt.model_checkpoint_path)\n\n  # Return the model and needed variables.\n  return (model, min_length, max_length, checkpoint_dir, curriculum, ensemble)\n\n\ndef single_test(l, model, sess, task, nprint, batch_size, print_out=True,\n                offset=None, ensemble=None, get_steps=False):\n  \"\"\"Test model on test data of length l using the given session.\"\"\"\n  inpt, target = data.get_batch(l, batch_size, False, task, offset)\n  _, res, _, steps = model.step(sess, inpt, target, False, get_steps=get_steps)\n  errors, total, seq_err = data.accuracy(inpt, res, target, batch_size, nprint)\n  seq_err = float(seq_err) / batch_size\n  if total > 0:\n    errors = float(errors) / total\n  if print_out:\n    data.print_out(\"  %s len %d errors %.2f sequence-errors %.2f\"\n                   % (task, l, 100*errors, 100*seq_err))\n  # Ensemble eval.\n  if 
ensemble:\n    results = []\n    for m in ensemble:\n      model.saver.restore(sess, m)\n      _, result, _, _ = model.step(sess, inpt, target, False)\n      m_errors, m_total, m_seq_err = data.accuracy(inpt, result, target,\n                                                   batch_size, nprint)\n      m_seq_err = float(m_seq_err) / batch_size\n      if total > 0:\n        m_errors = float(m_errors) / m_total\n      data.print_out(\"     %s len %d m-errors %.2f m-sequence-errors %.2f\"\n                     % (task, l, 100*m_errors, 100*m_seq_err))\n      results.append(result)\n    ens = [sum(o) for o in zip(*results)]\n    errors, total, seq_err = data.accuracy(inpt, ens, target,\n                                           batch_size, nprint)\n    seq_err = float(seq_err) / batch_size\n    if total > 0:\n      errors = float(errors) / total\n    if print_out:\n      data.print_out(\"  %s len %d ens-errors %.2f ens-sequence-errors %.2f\"\n                     % (task, l, 100*errors, 100*seq_err))\n  return errors, seq_err, (steps, inpt, [np.argmax(o, axis=1) for o in res])\n\n\ndef multi_test(l, model, sess, task, nprint, batch_size, offset=None,\n               ensemble=None):\n  \"\"\"Run multiple tests at lower batch size to save memory.\"\"\"\n  errors, seq_err = 0.0, 0.0\n  to_print = nprint\n  low_batch = FLAGS.low_batch_size\n  low_batch = min(low_batch, batch_size)\n  for mstep in xrange(batch_size / low_batch):\n    cur_offset = None if offset is None else offset + mstep * low_batch\n    err, sq_err, _ = single_test(l, model, sess, task, to_print, low_batch,\n                                 False, cur_offset, ensemble=ensemble)\n    to_print = max(0, to_print - low_batch)\n    errors += err\n    seq_err += sq_err\n    if FLAGS.mode > 0:\n      cur_errors = float(low_batch * errors) / ((mstep+1) * low_batch)\n      cur_seq_err = float(low_batch * seq_err) / ((mstep+1) * low_batch)\n      data.print_out(\"    %s multitest current errors %.2f 
sequence-errors %.2f\"\n                     % (task, 100*cur_errors, 100*cur_seq_err))\n  errors = float(low_batch) * float(errors) / batch_size\n  seq_err = float(low_batch) * float(seq_err) / batch_size\n  data.print_out(\"  %s len %d errors %.2f sequence-errors %.2f\"\n                 % (task, l, 100*errors, 100*seq_err))\n  return errors, seq_err\n\n\ndef train():\n  \"\"\"Train the model.\"\"\"\n  batch_size = FLAGS.batch_size\n  tasks = FLAGS.task.split(\"-\")\n  with tf.Session() as sess:\n    (model, min_length, max_length, checkpoint_dir,\n     curriculum, _) = initialize(sess)\n    quant_op = neural_gpu.quantize_weights_op(512, 8)\n    max_cur_length = min(min_length + 3, max_length)\n    prev_acc_perp = [1000000 for _ in xrange(3)]\n    prev_seq_err = 1.0\n\n    # Main training loop.\n    while True:\n      global_step, pull, max_cur_length, learning_rate = sess.run(\n          [model.global_step, model.pull, model.cur_length, model.lr])\n      acc_loss, acc_total, acc_errors, acc_seq_err = 0.0, 0, 0, 0\n      acc_grad_norm, step_count, step_time = 0.0, 0, 0.0\n      for _ in xrange(FLAGS.steps_per_checkpoint):\n        global_step += 1\n        task = random.choice(tasks)\n\n        # Select the length for curriculum learning.\n        l = np.random.randint(max_cur_length - min_length + 1) + min_length\n        # Prefer longer stuff 60% of time.\n        if np.random.randint(100) < 60:\n          l1 = np.random.randint(max_cur_length - min_length+1) + min_length\n          l = max(l, l1)\n        # Mixed curriculum learning: in 25% of cases go to any larger length.\n        if np.random.randint(100) < 25:\n          l1 = np.random.randint(max_length - min_length + 1) + min_length\n          l = max(l, l1)\n\n        # Run a step and time it.\n        start_time = time.time()\n        inp, target = data.get_batch(l, batch_size, True, task)\n        noise_param = math.sqrt(math.pow(global_step, -0.55) *\n                                prev_seq_err) * 
FLAGS.grad_noise_scale\n        loss, res, gnorm, _ = model.step(sess, inp, target, True, noise_param)\n        step_time += time.time() - start_time\n        acc_grad_norm += float(gnorm)\n\n        # Accumulate statistics only if we did not exceed curriculum length.\n        if l < max_cur_length + 1:\n          step_count += 1\n          acc_loss += loss\n          errors, total, seq_err = data.accuracy(inp, res, target,\n                                                 batch_size, 0)\n          acc_total += total\n          acc_errors += errors\n          acc_seq_err += seq_err\n\n      # Normalize and print out accumulated statistics.\n      acc_loss /= step_count\n      step_time /= FLAGS.steps_per_checkpoint\n      acc_seq_err = float(acc_seq_err) / (step_count * batch_size)\n      prev_seq_err = max(0.0, acc_seq_err - 0.02)  # No noise at error < 2%.\n      acc_errors = float(acc_errors) / acc_total if acc_total > 0 else 1.0\n      msg1 = \"step %d step-time %.2f\" % (global_step, step_time)\n      msg2 = \"lr %.8f pull %.3f\" % (learning_rate, pull)\n      msg3 = (\"%s %s grad-norm %.8f\"\n              % (msg1, msg2, acc_grad_norm / FLAGS.steps_per_checkpoint))\n      data.print_out(\"%s len %d ppx %.8f errors %.2f sequence-errors %.2f\" %\n                     (msg3, max_cur_length, data.safe_exp(acc_loss),\n                      100*acc_errors, 100*acc_seq_err))\n\n      # If errors are below the curriculum threshold, move curriculum forward.\n      if curriculum > acc_seq_err:\n        if FLAGS.quantize:\n          # Quantize weights.\n          data.print_out(\"  Quantizing parameters.\")\n          sess.run([quant_op])\n        # Increase current length (until the next with training data).\n        do_incr = True\n        while do_incr and max_cur_length < max_length:\n          sess.run(model.cur_length_incr_op)\n          for t in tasks:\n            if data.train_set[t]: do_incr = False\n        # Forget last perplexities if we're not yet at the 
end.\n        if max_cur_length < max_length:\n          prev_acc_perp.append(1000000)\n        # Either increase pull or, if it's large, average parameters.\n        if pull < 0.1:\n          sess.run(model.pull_incr_op)\n        else:\n          data.print_out(\"  Averaging parameters.\")\n          sess.run(model.avg_op)\n          if acc_seq_err < (curriculum / 3.0):\n            sess.run(model.lr_decay_op)\n\n      # Lower learning rate if we're worse than the last 3 checkpoints.\n      acc_perp = data.safe_exp(acc_loss)\n      if acc_perp > max(prev_acc_perp[-3:]):\n        sess.run(model.lr_decay_op)\n      prev_acc_perp.append(acc_perp)\n\n      # Save checkpoint.\n      checkpoint_path = os.path.join(checkpoint_dir, \"neural_gpu.ckpt\")\n      model.saver.save(sess, checkpoint_path,\n                       global_step=model.global_step)\n\n      # Run evaluation.\n      bound = data.bins[-1] + 1\n      for t in tasks:\n        l = min_length\n        while l < max_length + EXTRA_EVAL and l < bound:\n          _, seq_err, _ = single_test(l, model, sess, t,\n                                      FLAGS.nprint, batch_size)\n          l += 1\n          while l < bound + 1 and not data.test_set[t][l]:\n            l += 1\n        if seq_err < 0.05:  # Run larger test if we're good enough.\n          _, seq_err = multi_test(data.forward_max, model, sess, t,\n                                  FLAGS.nprint, batch_size * 4)\n      if seq_err < 0.01:  # Super-large test on 1-task large-forward models.\n        if data.forward_max > 4000 and len(tasks) == 1:\n          multi_test(data.forward_max, model, sess, tasks[0], FLAGS.nprint,\n                     batch_size * 16, 0)\n\n\ndef animate(l, test_data, anim_size):\n  \"\"\"Create animation for the given data (hacky matplotlib use).\"\"\"\n  xf = 12  # Extra frames to slow down at start and end.\n  fps = 2  # Frames per step.\n\n  # Make the figure.\n  fig = plt.figure(figsize=(16, 9), facecolor=\"white\")\n  ax = 
fig.add_axes([0, 0, 1, 1], frameon=False, zorder=2)\n  ax.set_xticks([i * 24-0.5 for i in xrange(4)])\n  ax.set_xticklabels([])\n  ax.set_yticks([i - 0.5 for i in xrange(l+1)])\n  ax.grid(which=\"major\", axis=\"both\", linestyle=\"-\", color=\"black\")\n  # We need text fields.\n  text_fields = []\n  text_size = 24*32/l\n  for y in xrange(l):\n    text_fields.append(ax.text(\n        11.25, y + 0.15, \"\", color=\"g\", ha=\"center\", va=\"center\",\n        bbox={\"facecolor\": \"b\", \"alpha\": 0.01, \"pad\": 24 * text_size},\n        size=text_size - (4 * 32 / l), animated=True))\n  im = ax.imshow(np.zeros_like(test_data[0][0][0]), vmin=-1.0,\n                 vmax=1.0, cmap=\"gray\", aspect=\"auto\", origin=\"upper\",\n                 interpolation=\"none\", animated=True)\n  im.set_zorder(1)\n\n  # Main animation step.\n  def animation_update(frame_no, test_data, xf, im, text_fields):\n    \"\"\"Update an animation frame.\"\"\"\n    steps, inpt, out_raw = test_data\n    length = len(steps)\n    batch = frame_no / (fps * (l+4*xf))\n    index = int((frame_no % (fps * (l+4*xf))) / fps)\n    # Cut output after first padding.\n    out = [out_raw[i][batch] for i in xrange(len(text_fields))]\n    if 0 in out:\n      i = out.index(0)\n      out = out[0:i] + [0 for _ in xrange(len(out) - i)]\n    # Show the state after the first frames.\n    if index >= 2*xf:\n      im.set_array(steps[min(length - 1, index - 2*xf)][batch])\n      for i, t in enumerate(text_fields):\n        if index - 2*xf < length:\n          t.set_text(\"\")\n        else:\n          t.set_text(data.to_symbol(out[i]))\n    else:\n      for i, t in enumerate(text_fields):\n        t.set_text(data.to_symbol(inpt[i][batch]) if index < xf else \"\")\n      if index < xf:\n        im.set_array(np.zeros_like(steps[0][0]))\n      else:\n        im.set_array(steps[0][batch])\n    return im,\n\n  # Create the animation and save to mp4.\n  animation = anim.FuncAnimation(\n      fig, animation_update, 
blit=True, frames=(l+4*xf)*anim_size*fps,\n      interval=500/fps, fargs=(test_data, xf, im, text_fields))\n  animation.save(\"/tmp/neural_gpu.mp4\", writer=\"mencoder\", fps=4*fps, dpi=3*80)\n\n\ndef evaluate():\n  \"\"\"Evaluate an existing model.\"\"\"\n  batch_size = FLAGS.batch_size\n  tasks = FLAGS.task.split(\"-\")\n  with tf.Session() as sess:\n    model, min_length, max_length, _, _, ensemble = initialize(sess)\n    bound = data.bins[-1] + 1\n    for t in tasks:\n      l = min_length\n      while l < max_length + EXTRA_EVAL and l < bound:\n        _, seq_err, _ = single_test(l, model, sess, t, FLAGS.nprint,\n                                    batch_size, ensemble=ensemble)\n        l += 1\n        while l < bound + 1 and not data.test_set[t][l]:\n          l += 1\n      # Animate.\n      if FLAGS.animate:\n        anim_size = 2\n        _, _, test_data = single_test(l, model, sess, t, 0, anim_size,\n                                      get_steps=True)\n        animate(l, test_data, anim_size)\n      # More tests.\n      _, seq_err = multi_test(data.forward_max, model, sess, t, FLAGS.nprint,\n                              batch_size * 4, ensemble=ensemble)\n    if seq_err < 0.01:  # Super-test if we're very good and in large-test mode.\n      if data.forward_max > 4000 and len(tasks) == 1:\n        multi_test(data.forward_max, model, sess, tasks[0], FLAGS.nprint,\n                   batch_size * 64, 0, ensemble=ensemble)\n\n\ndef interactive():\n  \"\"\"Interactively probe an existing model.\"\"\"\n  with tf.Session() as sess:\n    model, _, _, _, _, _ = initialize(sess)\n    sys.stdout.write(\"Input to Neural GPU, e.g., 0 1. 
Use -1 for PAD.\\n\")\n    sys.stdout.write(\"> \")\n    sys.stdout.flush()\n    inpt = sys.stdin.readline()\n    while inpt:\n      ids = [data.to_id(s) for s in inpt.strip().split()]\n      inpt, target = data.get_batch(len(ids), 1, False, \"\",\n                                    preset=(ids, [0 for _ in ids]))\n      _, res, _, _ = model.step(sess, inpt, target, False)\n      res = [np.argmax(o, axis=1) for o in res]\n      res = [o for o in res[:len(ids)] if o > 0]\n      print \"  \" + \" \".join([data.to_symbol(output[0]) for output in res])\n      sys.stdout.write(\"> \")\n      sys.stdout.flush()\n      inpt = sys.stdin.readline()\n\n\ndef main(_):\n  if FLAGS.mode == 0:\n    train()\n  elif FLAGS.mode == 1:\n    evaluate()\n  else:\n    interactive()\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/README.md",
    "content": "Implementation of the Neural Programmer model described in https://openreview.net/pdf?id=ry2YOrcge\n\nDownload the data from http://www-nlp.stanford.edu/software/sempre/wikitable/ and change the data_dir flag to the location of the data.\n\nTraining: python neural_programmer.py\n\nThe models are written to FLAGS.output_dir.\n\nTesting: python neural_programmer.py --evaluator_job=True\n\nThe models are loaded from FLAGS.output_dir. The evaluation is done on development data.\n\nMaintained by Arvind Neelakantan (arvind2505)\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/data_utils.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Functions for constructing vocabulary, converting the examples to integer format and building the required masks for batch computation.\n\nAuthor: aneelakantan (Arvind Neelakantan)\n\"\"\"\n\nimport copy\nimport numbers\nimport numpy as np\nimport wiki_data\n\n\ndef return_index(a):\n  for i in range(len(a)):\n    if (a[i] == 1.0):\n      return i\n\n\ndef construct_vocab(data, utility, add_word=False):\n  ans = []\n  for example in data:\n    sent = \"\"\n    for word in example.question:\n      if (not (isinstance(word, numbers.Number))):\n        sent += word + \" \"\n    example.original_nc = copy.deepcopy(example.number_columns)\n    example.original_wc = copy.deepcopy(example.word_columns)\n    example.original_nc_names = copy.deepcopy(example.number_column_names)\n    example.original_wc_names = copy.deepcopy(example.word_column_names)\n    if (add_word):\n      continue\n    number_found = 0\n    if (not (example.is_bad_example)):\n      for word in example.question:\n        if (isinstance(word, numbers.Number)):\n          number_found += 1\n        else:\n          if (not (utility.word_ids.has_key(word))):\n            utility.words.append(word)\n            utility.word_count[word] = 1\n            utility.word_ids[word] = len(utility.word_ids)\n            
utility.reverse_word_ids[utility.word_ids[word]] = word\n          else:\n            utility.word_count[word] += 1\n      for col_name in example.word_column_names:\n        for word in col_name:\n          if (isinstance(word, numbers.Number)):\n            number_found += 1\n          else:\n            if (not (utility.word_ids.has_key(word))):\n              utility.words.append(word)\n              utility.word_count[word] = 1\n              utility.word_ids[word] = len(utility.word_ids)\n              utility.reverse_word_ids[utility.word_ids[word]] = word\n            else:\n              utility.word_count[word] += 1\n      for col_name in example.number_column_names:\n        for word in col_name:\n          if (isinstance(word, numbers.Number)):\n            number_found += 1\n          else:\n            if (not (utility.word_ids.has_key(word))):\n              utility.words.append(word)\n              utility.word_count[word] = 1\n              utility.word_ids[word] = len(utility.word_ids)\n              utility.reverse_word_ids[utility.word_ids[word]] = word\n            else:\n              utility.word_count[word] += 1\n\n\ndef word_lookup(word, utility):\n  if (utility.word_ids.has_key(word)):\n    return word\n  else:\n    return utility.unk_token\n\n\ndef convert_to_int_2d_and_pad(a, utility):\n  ans = []\n  #print a\n  for b in a:\n    temp = []\n    if (len(b) > utility.FLAGS.max_entry_length):\n      b = b[0:utility.FLAGS.max_entry_length]\n    for remaining in range(len(b), utility.FLAGS.max_entry_length):\n      b.append(utility.dummy_token)\n    assert len(b) == utility.FLAGS.max_entry_length\n    for word in b:\n      temp.append(utility.word_ids[word_lookup(word, utility)])\n    ans.append(temp)\n  #print ans\n  return ans\n\n\ndef convert_to_bool_and_pad(a, utility):\n  a = a.tolist()\n  for i in range(len(a)):\n    for j in range(len(a[i])):\n      if (a[i][j] < 1):\n        a[i][j] = False\n      else:\n        a[i][j] = True\n    
a[i] = a[i] + [False] * (utility.FLAGS.max_elements - len(a[i]))\n  return a\n\n\nseen_tables = {}\n\n\ndef partial_match(question, table, number):\n  answer = []\n  match = {}\n  for i in range(len(table)):\n    temp = []\n    for j in range(len(table[i])):\n      temp.append(0)\n    answer.append(temp)\n  for i in range(len(table)):\n    for j in range(len(table[i])):\n      for word in question:\n        if (number):\n          if (word == table[i][j]):\n            answer[i][j] = 1.0\n            match[i] = 1.0\n        else:\n          if (word in table[i][j]):\n            answer[i][j] = 1.0\n            match[i] = 1.0\n  return answer, match\n\n\ndef exact_match(question, table, number):\n  #performs exact match operation\n  answer = []\n  match = {}\n  matched_indices = []\n  for i in range(len(table)):\n    temp = []\n    for j in range(len(table[i])):\n      temp.append(0)\n    answer.append(temp)\n  for i in range(len(table)):\n    for j in range(len(table[i])):\n      if (number):\n        for word in question:\n          if (word == table[i][j]):\n            match[i] = 1.0\n            answer[i][j] = 1.0\n      else:\n        table_entry = table[i][j]\n        for k in range(len(question)):\n          if (k + len(table_entry) <= len(question)):\n            if (table_entry == question[k:(k + len(table_entry))]):\n              #if(len(table_entry) == 1):\n              #print \"match: \", table_entry, question\n              match[i] = 1.0\n              answer[i][j] = 1.0\n              matched_indices.append((k, len(table_entry)))\n  return answer, match, matched_indices\n\n\ndef partial_column_match(question, table, number):\n  answer = []\n  for i in range(len(table)):\n    answer.append(0)\n  for i in range(len(table)):\n    for word in question:\n      if (word in table[i]):\n        answer[i] = 1.0\n  return answer\n\n\ndef exact_column_match(question, table, number):\n  #performs exact match on column names\n  answer = []\n  matched_indices = 
[]\n  for i in range(len(table)):\n    answer.append(0)\n  for i in range(len(table)):\n    table_entry = table[i]\n    for k in range(len(question)):\n      if (k + len(table_entry) <= len(question)):\n        if (table_entry == question[k:(k + len(table_entry))]):\n          answer[i] = 1.0\n          matched_indices.append((k, len(table_entry)))\n  return answer, matched_indices\n\n\ndef get_max_entry(a):\n  e = {}\n  for w in a:\n    if (w != \"UNK, \"):\n      if (e.has_key(w)):\n        e[w] += 1\n      else:\n        e[w] = 1\n  if (len(e) > 0):\n    (key, val) = sorted(e.items(), key=lambda x: -1 * x[1])[0]\n    if (val > 1):\n      return key\n    else:\n      return -1.0\n  else:\n    return -1.0\n\n\ndef list_join(a):\n  ans = \"\"\n  for w in a:\n    ans += str(w) + \", \"\n  return ans\n\n\ndef group_by_max(table, number):\n  #computes the most frequently occurring entry in a column\n  answer = []\n  for i in range(len(table)):\n    temp = []\n    for j in range(len(table[i])):\n      temp.append(0)\n    answer.append(temp)\n  for i in range(len(table)):\n    if (number):\n      curr = table[i]\n    else:\n      curr = [list_join(w) for w in table[i]]\n    max_entry = get_max_entry(curr)\n    #print i, max_entry\n    for j in range(len(curr)):\n      if (max_entry == curr[j]):\n        answer[i][j] = 1.0\n      else:\n        answer[i][j] = 0.0\n  return answer\n\n\ndef pick_one(a):\n  for i in range(len(a)):\n    if (1.0 in a[i]):\n      return True\n  return False\n\n\ndef check_processed_cols(col, utility):\n  return True in [\n      True for y in col\n      if (y != utility.FLAGS.pad_int and y !=\n          utility.FLAGS.bad_number_pre_process)\n  ]\n\n\ndef complete_wiki_processing(data, utility, train=True):\n  #convert to integers and padding\n  processed_data = []\n  num_bad_examples = 0\n  for example in data:\n    number_found = 0\n    if (example.is_bad_example):\n      num_bad_examples += 1\n    if (not (example.is_bad_example)):\n      
example.string_question = example.question[:]\n      #entry match\n      example.processed_number_columns = example.processed_number_columns[:]\n      example.processed_word_columns = example.processed_word_columns[:]\n      example.word_exact_match, word_match, matched_indices = exact_match(\n          example.string_question, example.original_wc, number=False)\n      example.number_exact_match, number_match, _ = exact_match(\n          example.string_question, example.original_nc, number=True)\n      if (not (pick_one(example.word_exact_match)) and not (\n          pick_one(example.number_exact_match))):\n        assert len(word_match) == 0\n        assert len(number_match) == 0\n        example.word_exact_match, word_match = partial_match(\n            example.string_question, example.original_wc, number=False)\n      #group by max\n      example.word_group_by_max = group_by_max(example.original_wc, False)\n      example.number_group_by_max = group_by_max(example.original_nc, True)\n      #column name match\n      example.word_column_exact_match, wcol_matched_indices = exact_column_match(\n          example.string_question, example.original_wc_names, number=False)\n      example.number_column_exact_match, ncol_matched_indices = exact_column_match(\n          example.string_question, example.original_nc_names, number=False)\n      if (not (1.0 in example.word_column_exact_match) and not (\n          1.0 in example.number_column_exact_match)):\n        example.word_column_exact_match = partial_column_match(\n            example.string_question, example.original_wc_names, number=False)\n        example.number_column_exact_match = partial_column_match(\n            example.string_question, example.original_nc_names, number=False)\n      if (len(word_match) > 0 or len(number_match) > 0):\n        example.question.append(utility.entry_match_token)\n      if (1.0 in example.word_column_exact_match or\n          1.0 in example.number_column_exact_match):\n        
example.question.append(utility.column_match_token)\n      example.string_question = example.question[:]\n      example.number_lookup_matrix = np.transpose(\n          example.number_lookup_matrix)[:]\n      example.word_lookup_matrix = np.transpose(example.word_lookup_matrix)[:]\n      example.columns = example.number_columns[:]\n      example.word_columns = example.word_columns[:]\n      example.len_total_cols = len(example.word_column_names) + len(\n          example.number_column_names)\n      example.column_names = example.number_column_names[:]\n      example.word_column_names = example.word_column_names[:]\n      example.string_column_names = example.number_column_names[:]\n      example.string_word_column_names = example.word_column_names[:]\n      example.sorted_number_index = []\n      example.sorted_word_index = []\n      example.column_mask = []\n      example.word_column_mask = []\n      example.processed_column_mask = []\n      example.processed_word_column_mask = []\n      example.word_column_entry_mask = []\n      example.question_attention_mask = []\n      example.question_number = example.question_number_1 = -1\n      example.question_attention_mask = []\n      example.ordinal_question = []\n      example.ordinal_question_one = []\n      new_question = []\n      if (len(example.number_columns) > 0):\n        example.len_col = len(example.number_columns[0])\n      else:\n        example.len_col = len(example.word_columns[0])\n      for (start, length) in matched_indices:\n        for j in range(length):\n          example.question[start + j] = utility.unk_token\n      #print example.question\n      for word in example.question:\n        if (isinstance(word, numbers.Number) or wiki_data.is_date(word)):\n          if (not (isinstance(word, numbers.Number)) and\n              wiki_data.is_date(word)):\n            word = word.replace(\"X\", \"\").replace(\"-\", \"\")\n          number_found += 1\n          if (number_found == 1):\n            
example.question_number = word\n            if (len(example.ordinal_question) > 0):\n              example.ordinal_question[len(example.ordinal_question) - 1] = 1.0\n            else:\n              example.ordinal_question.append(1.0)\n          elif (number_found == 2):\n            example.question_number_1 = word\n            if (len(example.ordinal_question_one) > 0):\n              example.ordinal_question_one[len(example.ordinal_question_one) -\n                                           1] = 1.0\n            else:\n              example.ordinal_question_one.append(1.0)\n        else:\n          new_question.append(word)\n          example.ordinal_question.append(0.0)\n          example.ordinal_question_one.append(0.0)\n      example.question = [\n          utility.word_ids[word_lookup(w, utility)] for w in new_question\n      ]\n      example.question_attention_mask = [0.0] * len(example.question)\n      #when the first question number occurs before a word\n      example.ordinal_question = example.ordinal_question[0:len(\n          example.question)]\n      example.ordinal_question_one = example.ordinal_question_one[0:len(\n          example.question)]\n      #question-padding\n      example.question = [utility.word_ids[utility.dummy_token]] * (\n          utility.FLAGS.question_length - len(example.question)\n      ) + example.question\n      example.question_attention_mask = [-10000.0] * (\n          utility.FLAGS.question_length - len(example.question_attention_mask)\n      ) + example.question_attention_mask\n      example.ordinal_question = [0.0] * (utility.FLAGS.question_length -\n                                          len(example.ordinal_question)\n                                         ) + example.ordinal_question\n      example.ordinal_question_one = [0.0] * (utility.FLAGS.question_length -\n                                              len(example.ordinal_question_one)\n                                             ) + 
example.ordinal_question_one\n      if (True):\n        #number columns and related-padding\n        num_cols = len(example.columns)\n        start = 0\n        for column in example.number_columns:\n          if (check_processed_cols(example.processed_number_columns[start],\n                                   utility)):\n            example.processed_column_mask.append(0.0)\n          sorted_index = sorted(\n              range(len(example.processed_number_columns[start])),\n              key=lambda k: example.processed_number_columns[start][k],\n              reverse=True)\n          sorted_index = sorted_index + [utility.FLAGS.pad_int] * (\n              utility.FLAGS.max_elements - len(sorted_index))\n          example.sorted_number_index.append(sorted_index)\n          example.columns[start] = column + [utility.FLAGS.pad_int] * (\n              utility.FLAGS.max_elements - len(column))\n          example.processed_number_columns[start] += [utility.FLAGS.pad_int] * (\n              utility.FLAGS.max_elements -\n              len(example.processed_number_columns[start]))\n          start += 1\n          example.column_mask.append(0.0)\n        for remaining in range(num_cols, utility.FLAGS.max_number_cols):\n          example.sorted_number_index.append([utility.FLAGS.pad_int] *\n                                             (utility.FLAGS.max_elements))\n          example.columns.append([utility.FLAGS.pad_int] *\n                                 (utility.FLAGS.max_elements))\n          example.processed_number_columns.append([utility.FLAGS.pad_int] *\n                                                  (utility.FLAGS.max_elements))\n          example.number_exact_match.append([0.0] *\n                                            (utility.FLAGS.max_elements))\n          example.number_group_by_max.append([0.0] *\n                                             (utility.FLAGS.max_elements))\n          example.column_mask.append(-100000000.0)\n          
example.processed_column_mask.append(-100000000.0)\n          example.number_column_exact_match.append(0.0)\n          example.column_names.append([utility.dummy_token])\n        #word column  and related-padding\n        start = 0\n        word_num_cols = len(example.word_columns)\n        for column in example.word_columns:\n          if (check_processed_cols(example.processed_word_columns[start],\n                                   utility)):\n            example.processed_word_column_mask.append(0.0)\n          sorted_index = sorted(\n              range(len(example.processed_word_columns[start])),\n              key=lambda k: example.processed_word_columns[start][k],\n              reverse=True)\n          sorted_index = sorted_index + [utility.FLAGS.pad_int] * (\n              utility.FLAGS.max_elements - len(sorted_index))\n          example.sorted_word_index.append(sorted_index)\n          column = convert_to_int_2d_and_pad(column, utility)\n          example.word_columns[start] = column + [[\n              utility.word_ids[utility.dummy_token]\n          ] * utility.FLAGS.max_entry_length] * (utility.FLAGS.max_elements -\n                                                 len(column))\n          example.processed_word_columns[start] += [utility.FLAGS.pad_int] * (\n              utility.FLAGS.max_elements -\n              len(example.processed_word_columns[start]))\n          example.word_column_entry_mask.append([0] * len(column) + [\n              utility.word_ids[utility.dummy_token]\n          ] * (utility.FLAGS.max_elements - len(column)))\n          start += 1\n          example.word_column_mask.append(0.0)\n        for remaining in range(word_num_cols, utility.FLAGS.max_word_cols):\n          example.sorted_word_index.append([utility.FLAGS.pad_int] *\n                                           (utility.FLAGS.max_elements))\n          example.word_columns.append([[utility.word_ids[utility.dummy_token]] *\n                                       
utility.FLAGS.max_entry_length] *\n                                      (utility.FLAGS.max_elements))\n          example.word_column_entry_mask.append(\n              [utility.word_ids[utility.dummy_token]] *\n              (utility.FLAGS.max_elements))\n          example.word_exact_match.append([0.0] * (utility.FLAGS.max_elements))\n          example.word_group_by_max.append([0.0] * (utility.FLAGS.max_elements))\n          example.processed_word_columns.append([utility.FLAGS.pad_int] *\n                                                (utility.FLAGS.max_elements))\n          example.word_column_mask.append(-100000000.0)\n          example.processed_word_column_mask.append(-100000000.0)\n          example.word_column_exact_match.append(0.0)\n          example.word_column_names.append([utility.dummy_token] *\n                                           utility.FLAGS.max_entry_length)\n        seen_tables[example.table_key] = 1\n      #convert column and word column names to integers\n      example.column_ids = convert_to_int_2d_and_pad(example.column_names,\n                                                     utility)\n      example.word_column_ids = convert_to_int_2d_and_pad(\n          example.word_column_names, utility)\n      for i_em in range(len(example.number_exact_match)):\n        example.number_exact_match[i_em] = example.number_exact_match[\n            i_em] + [0.0] * (utility.FLAGS.max_elements -\n                             len(example.number_exact_match[i_em]))\n        example.number_group_by_max[i_em] = example.number_group_by_max[\n            i_em] + [0.0] * (utility.FLAGS.max_elements -\n                             len(example.number_group_by_max[i_em]))\n      for i_em in range(len(example.word_exact_match)):\n        example.word_exact_match[i_em] = example.word_exact_match[\n            i_em] + [0.0] * (utility.FLAGS.max_elements -\n                             len(example.word_exact_match[i_em]))\n        example.word_group_by_max[i_em] = 
example.word_group_by_max[\n            i_em] + [0.0] * (utility.FLAGS.max_elements -\n                             len(example.word_group_by_max[i_em]))\n      example.exact_match = example.number_exact_match + example.word_exact_match\n      example.group_by_max = example.number_group_by_max + example.word_group_by_max\n      example.exact_column_match = example.number_column_exact_match + example.word_column_exact_match\n      #answer and related mask, padding\n      if (example.is_lookup):\n        example.answer = example.calc_answer\n        example.number_print_answer = example.number_lookup_matrix.tolist()\n        example.word_print_answer = example.word_lookup_matrix.tolist()\n        for i_answer in range(len(example.number_print_answer)):\n          example.number_print_answer[i_answer] = example.number_print_answer[\n              i_answer] + [0.0] * (utility.FLAGS.max_elements -\n                                   len(example.number_print_answer[i_answer]))\n        for i_answer in range(len(example.word_print_answer)):\n          example.word_print_answer[i_answer] = example.word_print_answer[\n              i_answer] + [0.0] * (utility.FLAGS.max_elements -\n                                   len(example.word_print_answer[i_answer]))\n        example.number_lookup_matrix = convert_to_bool_and_pad(\n            example.number_lookup_matrix, utility)\n        example.word_lookup_matrix = convert_to_bool_and_pad(\n            example.word_lookup_matrix, utility)\n        for remaining in range(num_cols, utility.FLAGS.max_number_cols):\n          example.number_lookup_matrix.append([False] *\n                                              utility.FLAGS.max_elements)\n          example.number_print_answer.append([0.0] * utility.FLAGS.max_elements)\n        for remaining in range(word_num_cols, utility.FLAGS.max_word_cols):\n          example.word_lookup_matrix.append([False] *\n                                            utility.FLAGS.max_elements)\n       
   example.word_print_answer.append([0.0] * utility.FLAGS.max_elements)\n        example.print_answer = example.number_print_answer + example.word_print_answer\n      else:\n        example.answer = example.calc_answer\n        example.print_answer = [[0.0] * (utility.FLAGS.max_elements)] * (\n            utility.FLAGS.max_number_cols + utility.FLAGS.max_word_cols)\n      #question_number masks\n      if (example.question_number == -1):\n        example.question_number_mask = np.zeros([utility.FLAGS.max_elements])\n      else:\n        example.question_number_mask = np.ones([utility.FLAGS.max_elements])\n      if (example.question_number_1 == -1):\n        example.question_number_one_mask = -10000.0\n      else:\n        example.question_number_one_mask = np.float64(0.0)\n      if (example.len_col > utility.FLAGS.max_elements):\n        continue\n      processed_data.append(example)\n  return processed_data\n\n\ndef add_special_words(utility):\n  utility.words.append(utility.entry_match_token)\n  utility.word_ids[utility.entry_match_token] = len(utility.word_ids)\n  utility.reverse_word_ids[utility.word_ids[\n      utility.entry_match_token]] = utility.entry_match_token\n  utility.entry_match_token_id = utility.word_ids[utility.entry_match_token]\n  print \"entry match token: \", utility.word_ids[\n      utility.entry_match_token], utility.entry_match_token_id\n  utility.words.append(utility.column_match_token)\n  utility.word_ids[utility.column_match_token] = len(utility.word_ids)\n  utility.reverse_word_ids[utility.word_ids[\n      utility.column_match_token]] = utility.column_match_token\n  utility.column_match_token_id = utility.word_ids[utility.column_match_token]\n  print \"column match token: \", utility.word_ids[\n      utility.column_match_token], utility.column_match_token_id\n  utility.words.append(utility.dummy_token)\n  utility.word_ids[utility.dummy_token] = len(utility.word_ids)\n  utility.reverse_word_ids[utility.word_ids[\n      
utility.dummy_token]] = utility.dummy_token\n  utility.dummy_token_id = utility.word_ids[utility.dummy_token]\n  utility.words.append(utility.unk_token)\n  utility.word_ids[utility.unk_token] = len(utility.word_ids)\n  utility.reverse_word_ids[utility.word_ids[\n      utility.unk_token]] = utility.unk_token\n\n\ndef perform_word_cutoff(utility):\n  if (utility.FLAGS.word_cutoff > 0):\n    for word in utility.word_ids.keys():\n      if (utility.word_count.has_key(word) and utility.word_count[word] <\n          utility.FLAGS.word_cutoff and word != utility.unk_token and\n          word != utility.dummy_token and word != utility.entry_match_token and\n          word != utility.column_match_token):\n        utility.word_ids.pop(word)\n        utility.words.remove(word)\n\n\ndef word_dropout(question, utility):\n  if (utility.FLAGS.word_dropout_prob > 0.0):\n    new_question = []\n    for i in range(len(question)):\n      if (question[i] != utility.dummy_token_id and\n          utility.random.random() > utility.FLAGS.word_dropout_prob):\n        new_question.append(utility.word_ids[utility.unk_token])\n      else:\n        new_question.append(question[i])\n    return new_question\n  else:\n    return question\n\n\ndef generate_feed_dict(data, curr, batch_size, gr, train=False, utility=None):\n  #prepare feed dict dictionary\n  feed_dict = {}\n  feed_examples = []\n  for j in range(batch_size):\n    feed_examples.append(data[curr + j])\n  if (train):\n    feed_dict[gr.batch_question] = [\n        word_dropout(feed_examples[j].question, utility)\n        for j in range(batch_size)\n    ]\n  else:\n    feed_dict[gr.batch_question] = [\n        feed_examples[j].question for j in range(batch_size)\n    ]\n  feed_dict[gr.batch_question_attention_mask] = [\n      feed_examples[j].question_attention_mask for j in range(batch_size)\n  ]\n  feed_dict[\n      gr.batch_answer] = [feed_examples[j].answer for j in range(batch_size)]\n  feed_dict[gr.batch_number_column] = [\n      
feed_examples[j].columns for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_processed_number_column] = [\n      feed_examples[j].processed_number_columns for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_processed_sorted_index_number_column] = [\n      feed_examples[j].sorted_number_index for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_processed_sorted_index_word_column] = [\n      feed_examples[j].sorted_word_index for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_question_number] = np.array(\n      [feed_examples[j].question_number for j in range(batch_size)]).reshape(\n          (batch_size, 1))\n  feed_dict[gr.batch_question_number_one] = np.array(\n      [feed_examples[j].question_number_1 for j in range(batch_size)]).reshape(\n          (batch_size, 1))\n  feed_dict[gr.batch_question_number_mask] = [\n      feed_examples[j].question_number_mask for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_question_number_one_mask] = np.array(\n      [feed_examples[j].question_number_one_mask for j in range(batch_size)\n      ]).reshape((batch_size, 1))\n  feed_dict[gr.batch_print_answer] = [\n      feed_examples[j].print_answer for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_exact_match] = [\n      feed_examples[j].exact_match for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_group_by_max] = [\n      feed_examples[j].group_by_max for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_column_exact_match] = [\n      feed_examples[j].exact_column_match for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_ordinal_question] = [\n      feed_examples[j].ordinal_question for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_ordinal_question_one] = [\n      feed_examples[j].ordinal_question_one for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_number_column_mask] = [\n      feed_examples[j].column_mask for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_number_column_names] = [\n      feed_examples[j].column_ids for j in range(batch_size)\n  
]\n  feed_dict[gr.batch_processed_word_column] = [\n      feed_examples[j].processed_word_columns for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_word_column_mask] = [\n      feed_examples[j].word_column_mask for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_word_column_names] = [\n      feed_examples[j].word_column_ids for j in range(batch_size)\n  ]\n  feed_dict[gr.batch_word_column_entry_mask] = [\n      feed_examples[j].word_column_entry_mask for j in range(batch_size)\n  ]\n  return feed_dict\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/model.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Author: aneelakantan (Arvind Neelakantan)\n\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\nimport nn_utils\n\n\nclass Graph():\n\n  def __init__(self, utility, batch_size, max_passes, mode=\"train\"):\n    self.utility = utility\n    self.data_type = self.utility.tf_data_type[self.utility.FLAGS.data_type]\n    self.max_elements = self.utility.FLAGS.max_elements\n    max_elements = self.utility.FLAGS.max_elements\n    self.num_cols = self.utility.FLAGS.max_number_cols\n    self.num_word_cols = self.utility.FLAGS.max_word_cols\n    self.question_length = self.utility.FLAGS.question_length\n    self.batch_size = batch_size\n    self.max_passes = max_passes\n    self.mode = mode\n    self.embedding_dims = self.utility.FLAGS.embedding_dims\n    #input question and a mask\n    self.batch_question = tf.placeholder(tf.int32,\n                                         [batch_size, self.question_length])\n    self.batch_question_attention_mask = tf.placeholder(\n        self.data_type, [batch_size, self.question_length])\n    #ground truth scalar answer and lookup answer\n    self.batch_answer = tf.placeholder(self.data_type, [batch_size])\n    self.batch_print_answer = tf.placeholder(\n        self.data_type,\n        [batch_size, self.num_cols + self.num_word_cols, 
max_elements])\n    #number columns and its processed version\n    self.batch_number_column = tf.placeholder(\n        self.data_type, [batch_size, self.num_cols, max_elements\n                        ])  #columns with numeric entries\n    self.batch_processed_number_column = tf.placeholder(\n        self.data_type, [batch_size, self.num_cols, max_elements])\n    self.batch_processed_sorted_index_number_column = tf.placeholder(\n        tf.int32, [batch_size, self.num_cols, max_elements])\n    #word columns and its processed version\n    self.batch_processed_word_column = tf.placeholder(\n        self.data_type, [batch_size, self.num_word_cols, max_elements])\n    self.batch_processed_sorted_index_word_column = tf.placeholder(\n        tf.int32, [batch_size, self.num_word_cols, max_elements])\n    self.batch_word_column_entry_mask = tf.placeholder(\n        tf.int32, [batch_size, self.num_word_cols, max_elements])\n    #names of word and number columns along with their mask\n    self.batch_word_column_names = tf.placeholder(\n        tf.int32,\n        [batch_size, self.num_word_cols, self.utility.FLAGS.max_entry_length])\n    self.batch_word_column_mask = tf.placeholder(\n        self.data_type, [batch_size, self.num_word_cols])\n    self.batch_number_column_names = tf.placeholder(\n        tf.int32,\n        [batch_size, self.num_cols, self.utility.FLAGS.max_entry_length])\n    self.batch_number_column_mask = tf.placeholder(self.data_type,\n                                                   [batch_size, self.num_cols])\n    #exact match and group by max operation\n    self.batch_exact_match = tf.placeholder(\n        self.data_type,\n        [batch_size, self.num_cols + self.num_word_cols, max_elements])\n    self.batch_column_exact_match = tf.placeholder(\n        self.data_type, [batch_size, self.num_cols + self.num_word_cols])\n    self.batch_group_by_max = tf.placeholder(\n        self.data_type,\n        [batch_size, self.num_cols + self.num_word_cols, 
max_elements])\n    #numbers in the question along with their position. This is used to compute arguments to the comparison operations\n    self.batch_question_number = tf.placeholder(self.data_type, [batch_size, 1])\n    self.batch_question_number_one = tf.placeholder(self.data_type,\n                                                    [batch_size, 1])\n    self.batch_question_number_mask = tf.placeholder(\n        self.data_type, [batch_size, max_elements])\n    self.batch_question_number_one_mask = tf.placeholder(self.data_type,\n                                                         [batch_size, 1])\n    self.batch_ordinal_question = tf.placeholder(\n        self.data_type, [batch_size, self.question_length])\n    self.batch_ordinal_question_one = tf.placeholder(\n        self.data_type, [batch_size, self.question_length])\n\n  def LSTM_question_embedding(self, sentence, sentence_length):\n    #LSTM processes the input question\n    lstm_params = \"question_lstm\"\n    hidden_vectors = []\n    sentence = self.batch_question\n    question_hidden = tf.zeros(\n        [self.batch_size, self.utility.FLAGS.embedding_dims], self.data_type)\n    question_c_hidden = tf.zeros(\n        [self.batch_size, self.utility.FLAGS.embedding_dims], self.data_type)\n    if (self.utility.FLAGS.rnn_dropout > 0.0):\n      if (self.mode == \"train\"):\n        rnn_dropout_mask = tf.cast(\n            tf.random_uniform(\n                tf.shape(question_hidden), minval=0.0, maxval=1.0) <\n            self.utility.FLAGS.rnn_dropout,\n            self.data_type) / self.utility.FLAGS.rnn_dropout\n      else:\n        rnn_dropout_mask = tf.ones_like(question_hidden)\n    for question_iterator in range(self.question_length):\n      curr_word = sentence[:, question_iterator]\n      question_vector = nn_utils.apply_dropout(\n          nn_utils.get_embedding(curr_word, self.utility, self.params),\n          self.utility.FLAGS.dropout, self.mode)\n      question_hidden, question_c_hidden = 
nn_utils.LSTMCell(\n          question_vector, question_hidden, question_c_hidden, lstm_params,\n          self.params)\n      if (self.utility.FLAGS.rnn_dropout > 0.0):\n        question_hidden = question_hidden * rnn_dropout_mask\n      hidden_vectors.append(tf.expand_dims(question_hidden, 0))\n    hidden_vectors = tf.concat(0, hidden_vectors)\n    return question_hidden, hidden_vectors\n\n  def history_recurrent_step(self, curr_hprev, hprev):\n    #A single RNN step for controller or history RNN\n    return tf.tanh(\n        tf.matmul(\n            tf.concat(1, [hprev, curr_hprev]), self.params[\n                \"history_recurrent\"])) + self.params[\"history_recurrent_bias\"]\n\n  def question_number_softmax(self, hidden_vectors):\n    #Attention on question to decide the question number to be passed to the comparison ops\n    def compute_ans(op_embedding, comparison):\n      op_embedding = tf.expand_dims(op_embedding, 0)\n      #dot product of operation embedding with hidden state to the left of the number occurrence\n      first = tf.transpose(\n          tf.matmul(op_embedding,\n                    tf.transpose(\n                        tf.reduce_sum(hidden_vectors * tf.tile(\n                            tf.expand_dims(\n                                tf.transpose(self.batch_ordinal_question), 2),\n                            [1, 1, self.utility.FLAGS.embedding_dims]), 0))))\n      second = self.batch_question_number_one_mask + tf.transpose(\n          tf.matmul(op_embedding,\n                    tf.transpose(\n                        tf.reduce_sum(hidden_vectors * tf.tile(\n                            tf.expand_dims(\n                                tf.transpose(self.batch_ordinal_question_one), 2\n                            ), [1, 1, self.utility.FLAGS.embedding_dims]), 0))))\n      question_number_softmax = tf.nn.softmax(tf.concat(1, [first, second]))\n      if (self.mode == \"test\"):\n        cond = tf.equal(question_number_softmax,\n                        
tf.reshape(\n                            tf.reduce_max(question_number_softmax, 1),\n                            [self.batch_size, 1]))\n        question_number_softmax = tf.select(\n            cond,\n            tf.fill(tf.shape(question_number_softmax), 1.0),\n            tf.fill(tf.shape(question_number_softmax), 0.0))\n        question_number_softmax = tf.cast(question_number_softmax,\n                                          self.data_type)\n      ans = tf.reshape(\n          tf.reduce_sum(question_number_softmax * tf.concat(\n              1, [self.batch_question_number, self.batch_question_number_one]),\n                        1), [self.batch_size, 1])\n      return ans\n\n    def compute_op_position(op_name):\n      for i in range(len(self.utility.operations_set)):\n        if (op_name == self.utility.operations_set[i]):\n          return i\n\n    def compute_question_number(op_name):\n      op_embedding = tf.nn.embedding_lookup(self.params_unit,\n                                            compute_op_position(op_name))\n      return compute_ans(op_embedding, op_name)\n\n    curr_greater_question_number = compute_question_number(\"greater\")\n    curr_lesser_question_number = compute_question_number(\"lesser\")\n    curr_geq_question_number = compute_question_number(\"geq\")\n    curr_leq_question_number = compute_question_number(\"leq\")\n    return curr_greater_question_number, curr_lesser_question_number, curr_geq_question_number, curr_leq_question_number\n\n  def perform_attention(self, context_vector, hidden_vectors, length, mask):\n    #Performs attention on hidden_vectors using the context vector\n    context_vector = tf.tile(\n        tf.expand_dims(context_vector, 0), [length, 1, 1])  #time * bs * d\n    attention_softmax = tf.nn.softmax(\n        tf.transpose(tf.reduce_sum(context_vector * hidden_vectors, 2)) +\n        mask)  #batch_size * time\n    attention_softmax = tf.tile(\n        tf.expand_dims(tf.transpose(attention_softmax), 2),\n        
[1, 1, self.embedding_dims])\n    ans_vector = tf.reduce_sum(attention_softmax * hidden_vectors, 0)\n    return ans_vector\n\n  #computes embeddings for column names using parameters of question module\n  def get_column_hidden_vectors(self):\n    #vector representations for the column names\n    self.column_hidden_vectors = tf.reduce_sum(\n        nn_utils.get_embedding(self.batch_number_column_names, self.utility,\n                               self.params), 2)\n    self.word_column_hidden_vectors = tf.reduce_sum(\n        nn_utils.get_embedding(self.batch_word_column_names, self.utility,\n                               self.params), 2)\n\n  def create_summary_embeddings(self):\n    #embeddings for each text entry in the table using parameters of the question module\n    self.summary_text_entry_embeddings = tf.reduce_sum(\n        tf.expand_dims(self.batch_exact_match, 3) * tf.expand_dims(\n            tf.expand_dims(\n                tf.expand_dims(\n                    nn_utils.get_embedding(self.utility.entry_match_token_id,\n                                           self.utility, self.params), 0), 1),\n            2), 2)\n\n  def compute_column_softmax(self, column_controller_vector, time_step):\n    #compute softmax over all the columns using column controller vector\n    column_controller_vector = tf.tile(\n        tf.expand_dims(column_controller_vector, 1),\n        [1, self.num_cols + self.num_word_cols, 1])  #max_cols * bs * d\n    column_controller_vector = nn_utils.apply_dropout(\n        column_controller_vector, self.utility.FLAGS.dropout, self.mode)\n    self.full_column_hidden_vectors = tf.concat(\n        1, [self.column_hidden_vectors, self.word_column_hidden_vectors])\n    self.full_column_hidden_vectors += self.summary_text_entry_embeddings\n    self.full_column_hidden_vectors = nn_utils.apply_dropout(\n        self.full_column_hidden_vectors, self.utility.FLAGS.dropout, self.mode)\n    column_logits = tf.reduce_sum(\n        
column_controller_vector * self.full_column_hidden_vectors, 2) + (\n            self.params[\"word_match_feature_column_name\"] *\n            self.batch_column_exact_match) + self.full_column_mask\n    column_softmax = tf.nn.softmax(column_logits)  #batch_size * max_cols\n    return column_softmax\n\n  def compute_first_or_last(self, select, first=True):\n    #perform first or last operation on row select with probabilistic row selection\n    answer = tf.zeros_like(select)\n    running_sum = tf.zeros([self.batch_size, 1], self.data_type)\n    for i in range(self.max_elements):\n      if (first):\n        current = tf.slice(select, [0, i], [self.batch_size, 1])\n      else:\n        current = tf.slice(select, [0, self.max_elements - 1 - i],\n                           [self.batch_size, 1])\n      curr_prob = current * (1 - running_sum)\n      curr_prob = curr_prob * tf.cast(curr_prob >= 0.0, self.data_type)\n      running_sum += curr_prob\n      temp_ans = []\n      curr_prob = tf.expand_dims(tf.reshape(curr_prob, [self.batch_size]), 0)\n      for i_ans in range(self.max_elements):\n        if (not (first) and i_ans == self.max_elements - 1 - i):\n          temp_ans.append(curr_prob)\n        elif (first and i_ans == i):\n          temp_ans.append(curr_prob)\n        else:\n          temp_ans.append(tf.zeros_like(curr_prob))\n      temp_ans = tf.transpose(tf.concat(0, temp_ans))\n      answer += temp_ans\n    return answer\n\n  def make_hard_softmax(self, softmax):\n    #converts soft selection to hard selection. 
used at test time\n    cond = tf.equal(\n        softmax, tf.reshape(tf.reduce_max(softmax, 1), [self.batch_size, 1]))\n    softmax = tf.select(\n        cond, tf.fill(tf.shape(softmax), 1.0), tf.fill(tf.shape(softmax), 0.0))\n    softmax = tf.cast(softmax, self.data_type)\n    return softmax\n\n  def compute_max_or_min(self, select, maxi=True):\n    #computes the argmax and argmin of a column with probabilistic row selection\n    answer = tf.zeros([\n        self.batch_size, self.num_cols + self.num_word_cols, self.max_elements\n    ], self.data_type)\n    sum_prob = tf.zeros([self.batch_size, self.num_cols + self.num_word_cols],\n                        self.data_type)\n    for j in range(self.max_elements):\n      if (maxi):\n        curr_pos = j\n      else:\n        curr_pos = self.max_elements - 1 - j\n      select_index = tf.slice(self.full_processed_sorted_index_column,\n                              [0, 0, curr_pos], [self.batch_size, -1, 1])\n      select_mask = tf.equal(\n          tf.tile(\n              tf.expand_dims(\n                  tf.tile(\n                      tf.expand_dims(tf.range(self.max_elements), 0),\n                      [self.batch_size, 1]), 1),\n              [1, self.num_cols + self.num_word_cols, 1]), select_index)\n      curr_prob = tf.expand_dims(select, 1) * tf.cast(\n          select_mask, self.data_type) * self.select_bad_number_mask\n      curr_prob = curr_prob * tf.expand_dims((1 - sum_prob), 2)\n      curr_prob = curr_prob * tf.expand_dims(\n          tf.cast((1 - sum_prob) > 0.0, self.data_type), 2)\n      answer = tf.select(select_mask, curr_prob, answer)\n      sum_prob += tf.reduce_sum(curr_prob, 2)\n    return answer\n\n  def perform_operations(self, softmax, full_column_softmax, select,\n                         prev_select_1, curr_pass):\n    #performs all the 15 operations. 
computes scalar output, lookup answer and row selector\n    column_softmax = tf.slice(full_column_softmax, [0, 0],\n                              [self.batch_size, self.num_cols])\n    word_column_softmax = tf.slice(full_column_softmax, [0, self.num_cols],\n                                   [self.batch_size, self.num_word_cols])\n    init_max = self.compute_max_or_min(select, maxi=True)\n    init_min = self.compute_max_or_min(select, maxi=False)\n    #operations that are column  independent\n    count = tf.reshape(tf.reduce_sum(select, 1), [self.batch_size, 1])\n    select_full_column_softmax = tf.tile(\n        tf.expand_dims(full_column_softmax, 2),\n        [1, 1, self.max_elements\n        ])  #BS * (max_cols + max_word_cols) * max_elements\n    select_word_column_softmax = tf.tile(\n        tf.expand_dims(word_column_softmax, 2),\n        [1, 1, self.max_elements])  #BS * max_word_cols * max_elements\n    select_greater = tf.reduce_sum(\n        self.init_select_greater * select_full_column_softmax,\n        1) * self.batch_question_number_mask  #BS * max_elements\n    select_lesser = tf.reduce_sum(\n        self.init_select_lesser * select_full_column_softmax,\n        1) * self.batch_question_number_mask  #BS * max_elements\n    select_geq = tf.reduce_sum(\n        self.init_select_geq * select_full_column_softmax,\n        1) * self.batch_question_number_mask  #BS * max_elements\n    select_leq = tf.reduce_sum(\n        self.init_select_leq * select_full_column_softmax,\n        1) * self.batch_question_number_mask  #BS * max_elements\n    select_max = tf.reduce_sum(init_max * select_full_column_softmax,\n                               1)  #BS * max_elements\n    select_min = tf.reduce_sum(init_min * select_full_column_softmax,\n                               1)  #BS * max_elements\n    select_prev = tf.concat(1, [\n        tf.slice(select, [0, 1], [self.batch_size, self.max_elements - 1]),\n        tf.cast(tf.zeros([self.batch_size, 1]), self.data_type)\n 
   ])\n    select_next = tf.concat(1, [\n        tf.cast(tf.zeros([self.batch_size, 1]), self.data_type), tf.slice(\n            select, [0, 0], [self.batch_size, self.max_elements - 1])\n    ])\n    select_last_rs = self.compute_first_or_last(select, False)\n    select_first_rs = self.compute_first_or_last(select, True)\n    select_word_match = tf.reduce_sum(self.batch_exact_match *\n                                      select_full_column_softmax, 1)\n    select_group_by_max = tf.reduce_sum(self.batch_group_by_max *\n                                        select_full_column_softmax, 1)\n    length_content = 1\n    length_select = 13\n    length_print = 1\n    values = tf.concat(1, [count])\n    softmax_content = tf.slice(softmax, [0, 0],\n                               [self.batch_size, length_content])\n    #compute scalar output\n    output = tf.reduce_sum(tf.mul(softmax_content, values), 1)\n    #compute lookup answer\n    softmax_print = tf.slice(softmax, [0, length_content + length_select],\n                             [self.batch_size, length_print])\n    curr_print = select_full_column_softmax * tf.tile(\n        tf.expand_dims(select, 1),\n        [1, self.num_cols + self.num_word_cols, 1\n        ])  #BS * max_cols * max_elements (considers only column)\n    self.batch_lookup_answer = curr_print * tf.tile(\n        tf.expand_dims(softmax_print, 2),\n        [1, self.num_cols + self.num_word_cols, self.max_elements\n        ])  #BS * max_cols * max_elements\n    self.batch_lookup_answer = self.batch_lookup_answer * self.select_full_mask\n    #compute row select\n    softmax_select = tf.slice(softmax, [0, length_content],\n                              [self.batch_size, length_select])\n    select_lists = [\n        tf.expand_dims(select_prev, 1), tf.expand_dims(select_next, 1),\n        tf.expand_dims(select_first_rs, 1), tf.expand_dims(select_last_rs, 1),\n        tf.expand_dims(select_group_by_max, 1),\n        tf.expand_dims(select_greater, 1), 
tf.expand_dims(select_lesser, 1),\n        tf.expand_dims(select_geq, 1), tf.expand_dims(select_leq, 1),\n        tf.expand_dims(select_max, 1), tf.expand_dims(select_min, 1),\n        tf.expand_dims(select_word_match, 1),\n        tf.expand_dims(self.reset_select, 1)\n    ]\n    select = tf.reduce_sum(\n        tf.tile(tf.expand_dims(softmax_select, 2), [1, 1, self.max_elements]) *\n        tf.concat(1, select_lists), 1)\n    select = select * self.select_whole_mask\n    return output, select\n\n  def one_pass(self, select, question_embedding, hidden_vectors, hprev,\n               prev_select_1, curr_pass):\n    #Performs one timestep which involves selecting an operation and a column\n    attention_vector = self.perform_attention(\n        hprev, hidden_vectors, self.question_length,\n        self.batch_question_attention_mask)  #batch_size * embedding_dims\n    controller_vector = tf.nn.relu(\n        tf.matmul(hprev, self.params[\"controller_prev\"]) + tf.matmul(\n            tf.concat(1, [question_embedding, attention_vector]), self.params[\n                \"controller\"]))\n    column_controller_vector = tf.nn.relu(\n        tf.matmul(hprev, self.params[\"column_controller_prev\"]) + tf.matmul(\n            tf.concat(1, [question_embedding, attention_vector]), self.params[\n                \"column_controller\"]))\n    controller_vector = nn_utils.apply_dropout(\n        controller_vector, self.utility.FLAGS.dropout, self.mode)\n    self.operation_logits = tf.matmul(controller_vector,\n                                      tf.transpose(self.params_unit))\n    softmax = tf.nn.softmax(self.operation_logits)\n    soft_softmax = softmax\n    #compute column softmax: bs * max_columns\n    weighted_op_representation = tf.transpose(\n        tf.matmul(tf.transpose(self.params_unit), tf.transpose(softmax)))\n    column_controller_vector = tf.nn.relu(\n        tf.matmul(\n            tf.concat(1, [\n                column_controller_vector, 
weighted_op_representation\n            ]), self.params[\"break_conditional\"]))\n    full_column_softmax = self.compute_column_softmax(column_controller_vector,\n                                                      curr_pass)\n    soft_column_softmax = full_column_softmax\n    if (self.mode == \"test\"):\n      full_column_softmax = self.make_hard_softmax(full_column_softmax)\n      softmax = self.make_hard_softmax(softmax)\n    output, select = self.perform_operations(softmax, full_column_softmax,\n                                             select, prev_select_1, curr_pass)\n    return output, select, softmax, soft_softmax, full_column_softmax, soft_column_softmax\n\n  def compute_lookup_error(self, val):\n    #computes lookup error.\n    cond = tf.equal(self.batch_print_answer, val)\n    inter = tf.select(\n        cond, self.init_print_error,\n        tf.tile(\n            tf.reshape(tf.constant(1e10, self.data_type), [1, 1, 1]), [\n                self.batch_size, self.utility.FLAGS.max_word_cols +\n                self.utility.FLAGS.max_number_cols,\n                self.utility.FLAGS.max_elements\n            ]))\n    return tf.reduce_min(tf.reduce_min(inter, 1), 1) * tf.cast(\n        tf.greater(\n            tf.reduce_sum(tf.reduce_sum(tf.cast(cond, self.data_type), 1), 1),\n            0.0), self.data_type)\n\n  def soft_min(self, x, y):\n    return tf.maximum(-1.0 * (1 / (\n        self.utility.FLAGS.soft_min_value + 0.0)) * tf.log(\n            tf.exp(-self.utility.FLAGS.soft_min_value * x) + tf.exp(\n                -self.utility.FLAGS.soft_min_value * y)), tf.zeros_like(x))\n\n  def error_computation(self):\n    #computes the error of each example in a batch\n    math_error = 0.5 * tf.square(tf.sub(self.scalar_output, self.batch_answer))\n    #scale math error\n    math_error = math_error / self.rows\n    math_error = tf.minimum(math_error, self.utility.FLAGS.max_math_error *\n                            tf.ones(tf.shape(math_error), 
self.data_type))\n    self.init_print_error = tf.select(\n        self.batch_gold_select, -1 * tf.log(self.batch_lookup_answer + 1e-300 +\n                                            self.invert_select_full_mask), -1 *\n        tf.log(1 - self.batch_lookup_answer)) * self.select_full_mask\n    print_error_1 = self.init_print_error * tf.cast(\n        tf.equal(self.batch_print_answer, 0.0), self.data_type)\n    print_error = tf.reduce_sum(tf.reduce_sum((print_error_1), 1), 1)\n    for val in range(1, 58):\n      print_error += self.compute_lookup_error(val + 0.0)\n    print_error = print_error * self.utility.FLAGS.print_cost / self.num_entries\n    if (self.mode == \"train\"):\n      error = tf.select(\n          tf.logical_and(\n              tf.not_equal(self.batch_answer, 0.0),\n              tf.not_equal(\n                  tf.reduce_sum(tf.reduce_sum(self.batch_print_answer, 1), 1),\n                  0.0)),\n          self.soft_min(math_error, print_error),\n          tf.select(\n              tf.not_equal(self.batch_answer, 0.0), math_error, print_error))\n    else:\n      error = tf.select(\n          tf.logical_and(\n              tf.equal(self.scalar_output, 0.0),\n              tf.equal(\n                  tf.reduce_sum(tf.reduce_sum(self.batch_lookup_answer, 1), 1),\n                  0.0)),\n          tf.ones_like(math_error),\n          tf.select(\n              tf.equal(self.scalar_output, 0.0), print_error, math_error))\n    return error\n\n  def batch_process(self):\n    #Computes loss and fraction of correct examples in a batch.\n    self.params_unit = nn_utils.apply_dropout(\n        self.params[\"unit\"], self.utility.FLAGS.dropout, self.mode)\n    batch_size = self.batch_size\n    max_passes = self.max_passes\n    num_timesteps = 1\n    max_elements = self.max_elements\n    select = tf.cast(\n        tf.fill([self.batch_size, max_elements], 1.0), self.data_type)\n    hprev = tf.cast(\n        tf.fill([self.batch_size, self.embedding_dims], 
0.0),\n        self.data_type)  #running sum of the hidden states of the model\n    output = tf.cast(tf.fill([self.batch_size, 1], 0.0),\n                     self.data_type)  #output of the model\n    correct = tf.cast(\n        tf.fill([1], 0.0), self.data_type\n    )  #to compute accuracy, returns number of correct examples for this batch\n    total_error = 0.0\n    prev_select_1 = tf.zeros_like(select)\n    self.create_summary_embeddings()\n    self.get_column_hidden_vectors()\n    #get question embedding\n    question_embedding, hidden_vectors = self.LSTM_question_embedding(\n        self.batch_question, self.question_length)\n    #compute arguments for comparison operation\n    greater_question_number, lesser_question_number, geq_question_number, leq_question_number = self.question_number_softmax(\n        hidden_vectors)\n    self.init_select_greater = tf.cast(\n        tf.greater(self.full_processed_column,\n                   tf.expand_dims(greater_question_number, 2)), self.\n        data_type) * self.select_bad_number_mask  #bs * max_cols * max_elements\n    self.init_select_lesser = tf.cast(\n        tf.less(self.full_processed_column,\n                tf.expand_dims(lesser_question_number, 2)), self.\n        data_type) * self.select_bad_number_mask  #bs * max_cols * max_elements\n    self.init_select_geq = tf.cast(\n        tf.greater_equal(self.full_processed_column,\n                         tf.expand_dims(geq_question_number, 2)), self.\n        data_type) * self.select_bad_number_mask  #bs * max_cols * max_elements\n    self.init_select_leq = tf.cast(\n        tf.less_equal(self.full_processed_column,\n                      tf.expand_dims(leq_question_number, 2)), self.\n        data_type) * self.select_bad_number_mask  #bs * max_cols * max_elements\n    self.init_select_word_match = 0\n    if (self.utility.FLAGS.rnn_dropout > 0.0):\n      if (self.mode == \"train\"):\n        history_rnn_dropout_mask = tf.cast(\n            tf.random_uniform(\n   
             tf.shape(hprev), minval=0.0, maxval=1.0) <\n            self.utility.FLAGS.rnn_dropout,\n            self.data_type) / self.utility.FLAGS.rnn_dropout\n      else:\n        history_rnn_dropout_mask = tf.ones_like(hprev)\n    select = select * self.select_whole_mask\n    self.batch_log_prob = tf.zeros([self.batch_size], dtype=self.data_type)\n    #Perform max_passes and at each pass select an operation and a column\n    for curr_pass in range(max_passes):\n      print \"step: \", curr_pass\n      output, select, softmax, soft_softmax, column_softmax, soft_column_softmax = self.one_pass(\n          select, question_embedding, hidden_vectors, hprev, prev_select_1,\n          curr_pass)\n      prev_select_1 = select\n      #compute input to history RNN\n      input_op = tf.transpose(\n          tf.matmul(\n              tf.transpose(self.params_unit), tf.transpose(\n                  soft_softmax)))  #weighted average of embedding of operations\n      input_col = tf.reduce_sum(\n          tf.expand_dims(soft_column_softmax, 2) *\n          self.full_column_hidden_vectors, 1)\n      history_input = tf.concat(1, [input_op, input_col])\n      history_input = nn_utils.apply_dropout(\n          history_input, self.utility.FLAGS.dropout, self.mode)\n      hprev = self.history_recurrent_step(history_input, hprev)\n      if (self.utility.FLAGS.rnn_dropout > 0.0):\n        hprev = hprev * history_rnn_dropout_mask\n    self.scalar_output = output\n    error = self.error_computation()\n    cond = tf.less(error, 0.0001, name=\"cond\")\n    correct_add = tf.select(\n        cond, tf.fill(tf.shape(cond), 1.0), tf.fill(tf.shape(cond), 0.0))\n    correct = tf.reduce_sum(correct_add)\n    error = error / batch_size\n    total_error = tf.reduce_sum(error)\n    total_correct = correct / batch_size\n    return total_error, total_correct\n\n  def compute_error(self):\n    #Sets mask variables and performs batch processing\n    self.batch_gold_select = self.batch_print_answer > 
0.0\n    self.full_column_mask = tf.concat(\n        1, [self.batch_number_column_mask, self.batch_word_column_mask])\n    self.full_processed_column = tf.concat(\n        1,\n        [self.batch_processed_number_column, self.batch_processed_word_column])\n    self.full_processed_sorted_index_column = tf.concat(1, [\n        self.batch_processed_sorted_index_number_column,\n        self.batch_processed_sorted_index_word_column\n    ])\n    self.select_bad_number_mask = tf.cast(\n        tf.logical_and(\n            tf.not_equal(self.full_processed_column,\n                         self.utility.FLAGS.pad_int),\n            tf.not_equal(self.full_processed_column,\n                         self.utility.FLAGS.bad_number_pre_process)),\n        self.data_type)\n    self.select_mask = tf.cast(\n        tf.logical_not(\n            tf.equal(self.batch_number_column, self.utility.FLAGS.pad_int)),\n        self.data_type)\n    self.select_word_mask = tf.cast(\n        tf.logical_not(\n            tf.equal(self.batch_word_column_entry_mask,\n                     self.utility.dummy_token_id)), self.data_type)\n    self.select_full_mask = tf.concat(\n        1, [self.select_mask, self.select_word_mask])\n    self.select_whole_mask = tf.maximum(\n        tf.reshape(\n            tf.slice(self.select_mask, [0, 0, 0],\n                     [self.batch_size, 1, self.max_elements]),\n            [self.batch_size, self.max_elements]),\n        tf.reshape(\n            tf.slice(self.select_word_mask, [0, 0, 0],\n                     [self.batch_size, 1, self.max_elements]),\n            [self.batch_size, self.max_elements]))\n    self.invert_select_full_mask = tf.cast(\n        tf.concat(1, [\n            tf.equal(self.batch_number_column, self.utility.FLAGS.pad_int),\n            tf.equal(self.batch_word_column_entry_mask,\n                     self.utility.dummy_token_id)\n        ]), self.data_type)\n    self.batch_lookup_answer = tf.zeros(tf.shape(self.batch_gold_select))\n    
self.reset_select = self.select_whole_mask\n    self.rows = tf.reduce_sum(self.select_whole_mask, 1)\n    self.num_entries = tf.reshape(\n        tf.reduce_sum(tf.reduce_sum(self.select_full_mask, 1), 1),\n        [self.batch_size])\n    self.final_error, self.final_correct = self.batch_process()\n    return self.final_error\n\n  def create_graph(self, params, global_step):\n    #Creates the graph to compute error, gradient computation and updates parameters\n    self.params = params\n    batch_size = self.batch_size\n    learning_rate = tf.cast(self.utility.FLAGS.learning_rate, self.data_type)\n    self.total_cost = self.compute_error() \n    optimize_params = self.params.values()\n    optimize_names = self.params.keys()\n    print \"optimize params \", optimize_names\n    if (self.utility.FLAGS.l2_regularizer > 0.0):\n      reg_cost = 0.0\n      for ind_param in self.params.keys():\n        reg_cost += tf.nn.l2_loss(self.params[ind_param])\n      self.total_cost += self.utility.FLAGS.l2_regularizer * reg_cost\n    grads = tf.gradients(self.total_cost, optimize_params, name=\"gradients\")\n    grad_norm = 0.0\n    for p, name in zip(grads, optimize_names):\n      print \"grads: \", p, name\n      if isinstance(p, tf.IndexedSlices):\n        grad_norm += tf.reduce_sum(p.values * p.values)\n      elif not (p == None):\n        grad_norm += tf.reduce_sum(p * p)\n    grad_norm = tf.sqrt(grad_norm)\n    max_grad_norm = np.float32(self.utility.FLAGS.clip_gradients).astype(\n        self.utility.np_data_type[self.utility.FLAGS.data_type])\n    grad_scale = tf.minimum(\n        tf.cast(1.0, self.data_type), max_grad_norm / grad_norm)\n    clipped_grads = list()\n    for p in grads:\n      if isinstance(p, tf.IndexedSlices):\n        tmp = p.values * grad_scale\n        clipped_grads.append(tf.IndexedSlices(tmp, p.indices))\n      elif not (p == None):\n        clipped_grads.append(p * grad_scale)\n      else:\n        clipped_grads.append(p)\n    grads = clipped_grads\n   
 self.global_step = global_step\n    params_list = self.params.values()\n    params_list.append(self.global_step)\n    adam = tf.train.AdamOptimizer(\n        learning_rate,\n        epsilon=tf.cast(self.utility.FLAGS.eps, self.data_type),\n        use_locking=True)\n    self.step = adam.apply_gradients(zip(grads, optimize_params), \n\t\t\t\t\tglobal_step=self.global_step)\n    self.init_op = tf.initialize_all_variables()\n\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/neural_programmer.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Implementation of the Neural Programmer model described in https://openreview.net/pdf?id=ry2YOrcge\n\nThis file calls functions to load & pre-process data, construct the TF graph\nand performs training or evaluation as specified by the flag evaluator_job.\nAuthor: aneelakantan (Arvind Neelakantan)\n\"\"\"\nimport time\nfrom random import Random\nimport numpy as np\nimport tensorflow as tf\nimport model\nimport wiki_data\nimport parameters\nimport data_utils\n\ntf.flags.DEFINE_integer(\"train_steps\", 100001, \"Number of steps to train\")\ntf.flags.DEFINE_integer(\"eval_cycle\", 500,\n                        \"Evaluate model at every eval_cycle steps\")\ntf.flags.DEFINE_integer(\"max_elements\", 100,\n                        \"maximum rows that are considered for processing\")\ntf.flags.DEFINE_integer(\n    \"max_number_cols\", 15,\n    \"maximum number columns that are considered for processing\")\ntf.flags.DEFINE_integer(\n    \"max_word_cols\", 25,\n    \"maximum word columns that are considered for processing\")\ntf.flags.DEFINE_integer(\"question_length\", 62, \"maximum question length\")\ntf.flags.DEFINE_integer(\"max_entry_length\", 1, \"\")\ntf.flags.DEFINE_integer(\"max_passes\", 4, \"number of operation 
passes\")\ntf.flags.DEFINE_integer(\"embedding_dims\", 256, \"\")\ntf.flags.DEFINE_integer(\"batch_size\", 20, \"\")\ntf.flags.DEFINE_float(\"clip_gradients\", 1.0, \"\")\ntf.flags.DEFINE_float(\"eps\", 1e-6, \"\")\ntf.flags.DEFINE_float(\"param_init\", 0.1, \"\")\ntf.flags.DEFINE_float(\"learning_rate\", 0.001, \"\")\ntf.flags.DEFINE_float(\"l2_regularizer\", 0.0001, \"\")\ntf.flags.DEFINE_float(\"print_cost\", 50.0,\n                      \"weighting factor in the objective function\")\ntf.flags.DEFINE_string(\"job_id\", \"temp\", \"\"\"job id\"\"\")\ntf.flags.DEFINE_string(\"output_dir\", \"../model/\",\n                       \"\"\"output_dir\"\"\")\ntf.flags.DEFINE_string(\"data_dir\", \"../data/\",\n                       \"\"\"data_dir\"\"\")\ntf.flags.DEFINE_integer(\"write_every\", 500, \"write every N\")\ntf.flags.DEFINE_integer(\"param_seed\", 150, \"\")\ntf.flags.DEFINE_integer(\"python_seed\", 200, \"\")\ntf.flags.DEFINE_float(\"dropout\", 0.8, \"dropout keep probability\")\ntf.flags.DEFINE_float(\"rnn_dropout\", 0.9,\n                      \"dropout keep probability for rnn connections\")\ntf.flags.DEFINE_float(\"pad_int\", -20000.0,\n                      \"number columns are padded with pad_int\")\ntf.flags.DEFINE_string(\"data_type\", \"double\", \"float or double\")\ntf.flags.DEFINE_float(\"word_dropout_prob\", 0.9, \"word dropout keep prob\")\ntf.flags.DEFINE_integer(\"word_cutoff\", 10, \"\")\ntf.flags.DEFINE_integer(\"vocab_size\", 10800, \"\")\ntf.flags.DEFINE_boolean(\"evaluator_job\", False,\n                        \"whether to run as trainer/evaluator\")\ntf.flags.DEFINE_float(\n    \"bad_number_pre_process\", -200000.0,\n    \"number that is added to a corrupted table entry in a number column\")\ntf.flags.DEFINE_float(\"max_math_error\", 3.0,\n                      \"max square loss error that is considered\")\ntf.flags.DEFINE_float(\"soft_min_value\", 5.0, \"\")\nFLAGS = tf.flags.FLAGS\n\n\nclass Utility:\n  #holds FLAGS and other 
variables that are used in different files\n  def __init__(self):\n    global FLAGS\n    self.FLAGS = FLAGS\n    self.unk_token = \"UNK\"\n    self.entry_match_token = \"entry_match\"\n    self.column_match_token = \"column_match\"\n    self.dummy_token = \"dummy_token\"\n    self.tf_data_type = {}\n    self.tf_data_type[\"double\"] = tf.float64\n    self.tf_data_type[\"float\"] = tf.float32\n    self.np_data_type = {}\n    self.np_data_type[\"double\"] = np.float64\n    self.np_data_type[\"float\"] = np.float32\n    self.operations_set = [\"count\"] + [\n        \"prev\", \"next\", \"first_rs\", \"last_rs\", \"group_by_max\", \"greater\",\n        \"lesser\", \"geq\", \"leq\", \"max\", \"min\", \"word-match\"\n    ] + [\"reset_select\"] + [\"print\"]\n    self.word_ids = {}\n    self.reverse_word_ids = {}\n    self.word_count = {}\n    self.random = Random(FLAGS.python_seed)\n\n\ndef evaluate(sess, data, batch_size, graph, i):\n  #computes accuracy\n  num_examples = 0.0\n  gc = 0.0\n  for j in range(0, len(data) - batch_size + 1, batch_size):\n    [ct] = sess.run([graph.final_correct],\n                    feed_dict=data_utils.generate_feed_dict(data, j, batch_size,\n                                                            graph))\n    gc += ct * batch_size\n    num_examples += batch_size\n  print \"dev set accuracy   after \", i, \" : \", gc / num_examples\n  print num_examples, len(data)\n  print \"--------\"\n\n\ndef Train(graph, utility, batch_size, train_data, sess, model_dir,\n          saver):\n  #performs training\n  curr = 0\n  train_set_loss = 0.0\n  utility.random.shuffle(train_data)\n  start = time.time()\n  for i in range(utility.FLAGS.train_steps):\n    curr_step = i\n    if (i > 0 and i % FLAGS.write_every == 0):\n      model_file = model_dir + \"/model_\" + str(i)\n      saver.save(sess, model_file)\n    if curr + batch_size >= len(train_data):\n      curr = 0\n      utility.random.shuffle(train_data)\n    step, cost_value = sess.run(\n        
[graph.step, graph.total_cost],\n        feed_dict=data_utils.generate_feed_dict(\n            train_data, curr, batch_size, graph, train=True, utility=utility))\n    curr = curr + batch_size\n    train_set_loss += cost_value\n    if (i > 0 and i % FLAGS.eval_cycle == 0):\n      end = time.time()\n      time_taken = end - start\n      print \"step \", i, \" \", time_taken, \" seconds \"\n      start = end\n      print \" printing train set loss: \", train_set_loss / utility.FLAGS.eval_cycle\n      train_set_loss = 0.0\n\n\ndef master(train_data, dev_data, utility):\n  #creates TF graph and calls trainer or evaluator\n  batch_size = utility.FLAGS.batch_size \n  model_dir = utility.FLAGS.output_dir + \"/model\" + utility.FLAGS.job_id + \"/\"\n  #create all paramters of the model\n  param_class = parameters.Parameters(utility)\n  params, global_step, init = param_class.parameters(utility)\n  key = \"test\" if (FLAGS.evaluator_job) else \"train\"\n  graph = model.Graph(utility, batch_size, utility.FLAGS.max_passes, mode=key)\n  graph.create_graph(params, global_step)\n  prev_dev_error = 0.0\n  final_loss = 0.0\n  final_accuracy = 0.0\n  #start session\n  with tf.Session() as sess:\n    sess.run(init.name)\n    sess.run(graph.init_op.name)\n    to_save = params.copy()\n    saver = tf.train.Saver(to_save, max_to_keep=500)\n    if (FLAGS.evaluator_job):\n      while True:\n        selected_models = {}\n        file_list = tf.gfile.ListDirectory(model_dir)\n        for model_file in file_list:\n          if (\"checkpoint\" in model_file or \"index\" in model_file or\n              \"meta\" in model_file):\n            continue\n          if (\"data\" in model_file):\n            model_file = model_file.split(\".\")[0]\n          model_step = int(\n              model_file.split(\"_\")[len(model_file.split(\"_\")) - 1])\n          selected_models[model_step] = model_file\n        file_list = sorted(selected_models.items(), key=lambda x: x[0])\n        if (len(file_list) > 
0):\n          file_list = file_list[0:len(file_list) - 1]\n        print \"list of models: \", file_list\n        for model_file in file_list:\n          model_file = model_file[1]\n          print \"restoring: \", model_file\n          saver.restore(sess, model_dir + \"/\" + model_file)\n          model_step = int(\n              model_file.split(\"_\")[len(model_file.split(\"_\")) - 1])\n          print \"evaluating on dev \", model_file, model_step\n          evaluate(sess, dev_data, batch_size, graph, model_step)\n    else:\n      ckpt = tf.train.get_checkpoint_state(model_dir)\n      print \"model dir: \", model_dir\n      if (not (tf.gfile.IsDirectory(model_dir))):\n        print \"create dir: \", model_dir\n        tf.gfile.MkDir(model_dir)\n      Train(graph, utility, batch_size, train_data, sess, model_dir,\n            saver)\n\ndef main(args):\n  utility = Utility()\n  train_name = \"random-split-1-train.examples\"\n  dev_name = \"random-split-1-dev.examples\"\n  test_name = \"pristine-unseen-tables.examples\"\n  #load data\n  dat = wiki_data.WikiQuestionGenerator(train_name, dev_name, test_name, FLAGS.data_dir)\n  train_data, dev_data, test_data = dat.load()\n  utility.words = []\n  utility.word_ids = {}\n  utility.reverse_word_ids = {}\n  #construct vocabulary\n  data_utils.construct_vocab(train_data, utility)\n  data_utils.construct_vocab(dev_data, utility, True)\n  data_utils.construct_vocab(test_data, utility, True)\n  data_utils.add_special_words(utility)\n  data_utils.perform_word_cutoff(utility)\n  #convert data to int format and pad the inputs\n  train_data = data_utils.complete_wiki_processing(train_data, utility, True)\n  dev_data = data_utils.complete_wiki_processing(dev_data, utility, False)\n  test_data = data_utils.complete_wiki_processing(test_data, utility, False)\n  print \"# train examples \", len(train_data)\n  print \"# dev examples \", len(dev_data)\n  print \"# test examples \", len(test_data)\n  print \"running open source\"\n  
#construct TF graph and train or evaluate\n  master(train_data, dev_data, utility)\n\n\nif __name__ == \"__main__\":\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/nn_utils.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Author: aneelakantan (Arvind Neelakantan)\n\"\"\"\n\nimport tensorflow as tf\n\ndef get_embedding(word, utility, params):\n  return tf.nn.embedding_lookup(params[\"word\"], word)\n\n\ndef apply_dropout(x, dropout_rate, mode):\n  if (dropout_rate > 0.0):\n    if (mode == \"train\"):\n      x = tf.nn.dropout(x, dropout_rate)\n    else:\n      x = x\n  return x\n\n\ndef LSTMCell(x, mprev, cprev, key, params):\n  \"\"\"Create an LSTM cell.\n\n  Implements the equations in pg.2 from\n  \"Long Short-Term Memory Based Recurrent Neural Network Architectures\n  For Large Vocabulary Speech Recognition\",\n  Hasim Sak, Andrew Senior, Francoise Beaufays.\n\n  Args:\n    x: Inputs to this cell.\n    mprev: m_{t-1}, the recurrent activations (same as the output)\n      from the previous cell.\n    cprev: c_{t-1}, the cell activations from the previous cell.\n    key: Name prefix used to look up this cell's weights in params.\n    params: A dictionary of the weights and optional biases.\n\n  Returns:\n    m: Outputs of this cell.\n    c: Cell Activations.\n    \"\"\"\n\n  i = tf.matmul(x, params[key + \"_ix\"]) + tf.matmul(mprev, params[key + \"_im\"])\n  i = tf.nn.bias_add(i, params[key + \"_i\"])\n  f = tf.matmul(x, params[key + 
\"_fx\"]) + tf.matmul(mprev, params[key + \"_fm\"])\n  f = tf.nn.bias_add(f, params[key + \"_f\"])\n  c = tf.matmul(x, params[key + \"_cx\"]) + tf.matmul(mprev, params[key + \"_cm\"])\n  c = tf.nn.bias_add(c, params[key + \"_c\"])\n  o = tf.matmul(x, params[key + \"_ox\"]) + tf.matmul(mprev, params[key + \"_om\"])\n  o = tf.nn.bias_add(o, params[key + \"_o\"])\n  i = tf.sigmoid(i, name=\"i_gate\")\n  f = tf.sigmoid(f, name=\"f_gate\")\n  o = tf.sigmoid(o, name=\"o_gate\")\n  c = f * cprev + i * tf.tanh(c)\n  m = o * c\n  return m, c\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/parameters.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Author: aneelakantan (Arvind Neelakantan)\n\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\n\n\nclass Parameters:\n\n  def __init__(self, u):\n    self.utility = u\n    self.init_seed_counter = 0\n    self.word_init = {}\n\n  def parameters(self, utility):\n    params = {}\n    inits = []\n    embedding_dims = self.utility.FLAGS.embedding_dims\n    params[\"unit\"] = tf.Variable(\n        self.RandomUniformInit([len(utility.operations_set), embedding_dims]))\n    params[\"word\"] = tf.Variable(\n        self.RandomUniformInit([utility.FLAGS.vocab_size, embedding_dims]))\n    params[\"word_match_feature_column_name\"] = tf.Variable(\n        self.RandomUniformInit([1]))\n    params[\"controller\"] = tf.Variable(\n        self.RandomUniformInit([2 * embedding_dims, embedding_dims]))\n    params[\"column_controller\"] = tf.Variable(\n        self.RandomUniformInit([2 * embedding_dims, embedding_dims]))\n    params[\"column_controller_prev\"] = tf.Variable(\n        self.RandomUniformInit([embedding_dims, embedding_dims]))\n    params[\"controller_prev\"] = tf.Variable(\n        self.RandomUniformInit([embedding_dims, embedding_dims]))\n    global_step = tf.Variable(1, name=\"global_step\")\n    #weigths of question and history RNN (or LSTM)\n    key_list = 
[\"question_lstm\"]\n    for key in key_list:\n      # Weights going from inputs to nodes.\n      for wgts in [\"ix\", \"fx\", \"cx\", \"ox\"]:\n        params[key + \"_\" + wgts] = tf.Variable(\n            self.RandomUniformInit([embedding_dims, embedding_dims]))\n      # Weights going from nodes to nodes.\n      for wgts in [\"im\", \"fm\", \"cm\", \"om\"]:\n        params[key + \"_\" + wgts] = tf.Variable(\n            self.RandomUniformInit([embedding_dims, embedding_dims]))\n      #Biases for the gates and cell\n      for bias in [\"i\", \"f\", \"c\", \"o\"]:\n        if (bias == \"f\"):\n          print \"forget gate bias\"\n          params[key + \"_\" + bias] = tf.Variable(\n              tf.random_uniform([embedding_dims], 1.0, 1.1, self.utility.\n                                tf_data_type[self.utility.FLAGS.data_type]))\n        else:\n          params[key + \"_\" + bias] = tf.Variable(\n              self.RandomUniformInit([embedding_dims]))\n    params[\"history_recurrent\"] = tf.Variable(\n        self.RandomUniformInit([3 * embedding_dims, embedding_dims]))\n    params[\"history_recurrent_bias\"] = tf.Variable(\n        self.RandomUniformInit([1, embedding_dims]))\n    params[\"break_conditional\"] = tf.Variable(\n        self.RandomUniformInit([2 * embedding_dims, embedding_dims]))\n    init = tf.initialize_all_variables()\n    return params, global_step, init\n\n  def RandomUniformInit(self, shape):\n    \"\"\"Returns a RandomUniform Tensor between -param_init and param_init.\"\"\"\n    param_seed = self.utility.FLAGS.param_seed\n    self.init_seed_counter += 1\n    return tf.random_uniform(\n        shape, -1.0 *\n        (np.float32(self.utility.FLAGS.param_init)\n        ).astype(self.utility.np_data_type[self.utility.FLAGS.data_type]),\n        (np.float32(self.utility.FLAGS.param_init)\n        ).astype(self.utility.np_data_type[self.utility.FLAGS.data_type]),\n        self.utility.tf_data_type[self.utility.FLAGS.data_type],\n        
param_seed + self.init_seed_counter)\n"
  },
  {
    "path": "model_zoo/models/neural_programmer/wiki_data.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Loads the WikiQuestions dataset.\n\nAn example consists of question, table. Additionally, we store the processed\ncolumns which store the entries after performing number, date and other\npreprocessing as done in the baseline.\ncolumns, column names and processed columns are split into word and number\ncolumns.\nlookup answer (or matrix) is also split into number and word lookup matrix\nAuthor: aneelakantan (Arvind Neelakantan)\n\"\"\"\nimport math\nimport os\nimport re\nimport numpy as np\nimport unicodedata as ud\nimport tensorflow as tf\n\nbad_number = -200000.0  #number that is added to a corrupted table entry in a number column\n\ndef is_nan_or_inf(number):\n  return math.isnan(number) or math.isinf(number)\n\ndef strip_accents(s):\n  u = unicode(s, \"utf-8\")\n  u_new = ''.join(c for c in ud.normalize('NFKD', u) if ud.category(c) != 'Mn')\n  return u_new.encode(\"utf-8\")\n\n\ndef correct_unicode(string):\n  string = strip_accents(string)\n  string = re.sub(\"\\xc2\\xa0\", \" \", string).strip()\n  string = re.sub(\"\\xe2\\x80\\x93\", \"-\", string).strip()\n  #string = re.sub(ur'[\\u0300-\\u036F]', \"\", string)\n  string = re.sub(\"â€š\", \",\", string)\n  string = re.sub(\"â€¦\", \"...\", string)\n  #string = re.sub(\"[Â·ãƒ»]\", \".\", string)\n  string = 
re.sub(\"Ë†\", \"^\", string)\n  string = re.sub(\"Ëœ\", \"~\", string)\n  string = re.sub(\"â€¹\", \"<\", string)\n  string = re.sub(\"â€º\", \">\", string)\n  #string = re.sub(\"[â€˜â€™Â´`]\", \"'\", string)\n  #string = re.sub(\"[â€œâ€Â«Â»]\", \"\\\"\", string)\n  #string = re.sub(\"[â€¢â€ â€¡]\", \"\", string)\n  #string = re.sub(\"[â€â€‘â€“â€”]\", \"-\", string)\n  string = re.sub(ur'[\\u2E00-\\uFFFF]', \"\", string)\n  string = re.sub(\"\\\\s+\", \" \", string).strip()\n  return string\n\n\ndef simple_normalize(string):\n  string = correct_unicode(string)\n  # Citations\n  string = re.sub(\"\\[(nb ?)?\\d+\\]\", \"\", string)\n  string = re.sub(\"\\*+$\", \"\", string)\n  # Year in parenthesis\n  string = re.sub(\"\\(\\d* ?-? ?\\d*\\)\", \"\", string)\n  string = re.sub(\"^\\\"(.*)\\\"$\", \"\", string)\n  return string\n\n\ndef full_normalize(string):\n  #print \"an: \", string\n  string = simple_normalize(string)\n  # Remove trailing info in brackets\n  string = re.sub(\"\\[[^\\]]*\\]\", \"\", string)\n  # Remove most unicode characters in other languages\n  string = re.sub(ur'[\\u007F-\\uFFFF]', \"\", string.strip())\n  # Remove trailing info in parenthesis\n  string = re.sub(\"\\([^)]*\\)$\", \"\", string.strip())\n  string = final_normalize(string)\n  # Get rid of question marks\n  string = re.sub(\"\\?\", \"\", string).strip()\n  # Get rid of trailing colons (usually occur in column titles)\n  string = re.sub(\"\\:$\", \" \", string).strip()\n  # Get rid of slashes\n  string = re.sub(r\"/\", \" \", string).strip()\n  string = re.sub(r\"\\\\\", \" \", string).strip()\n  # Replace colon, slash, and dash with space\n  # Note: need better replacement for this when parsing time\n  string = re.sub(r\"\\:\", \" \", string).strip()\n  string = re.sub(\"/\", \" \", string).strip()\n  string = re.sub(\"-\", \" \", string).strip()\n  # Convert empty strings to UNK\n  # Important to do this last or near last\n  if not string:\n    string = \"UNK\"\n  return 
string\n\ndef final_normalize(string):\n  # Remove leading and trailing whitespace\n  string = re.sub(\"\\\\s+\", \" \", string).strip()\n  # Convert entirely to lowercase\n  string = string.lower()\n  # Get rid of strangely escaped newline characters\n  string = re.sub(\"\\\\\\\\n\", \" \", string).strip()\n  # Get rid of quotation marks\n  string = re.sub(r\"\\\"\", \"\", string).strip()\n  string = re.sub(r\"\\'\", \"\", string).strip()\n  string = re.sub(r\"`\", \"\", string).strip()\n  # Get rid of *\n  string = re.sub(\"\\*\", \"\", string).strip()\n  return string\n\ndef is_number(x):\n  try:\n    f = float(x)\n    return not is_nan_or_inf(f)\n  except ValueError:\n    return False\n  except TypeError:\n    return False\n\n\nclass WikiExample(object):\n\n  def __init__(self, id, question, answer, table_key):\n    self.question_id = id\n    self.question = question\n    self.answer = answer\n    self.table_key = table_key\n    self.lookup_matrix = []\n    self.is_bad_example = False\n    self.is_word_lookup = False\n    self.is_ambiguous_word_lookup = False\n    self.is_number_lookup = False\n    self.is_number_calc = False\n    self.is_unknown_answer = False\n\n\nclass TableInfo(object):\n\n  def __init__(self, word_columns, word_column_names, word_column_indices,\n               number_columns, number_column_names, number_column_indices,\n               processed_word_columns, processed_number_columns, orig_columns):\n    self.word_columns = word_columns\n    self.word_column_names = word_column_names\n    self.word_column_indices = word_column_indices\n    self.number_columns = number_columns\n    self.number_column_names = number_column_names\n    self.number_column_indices = number_column_indices\n    self.processed_word_columns = processed_word_columns\n    self.processed_number_columns = processed_number_columns\n    self.orig_columns = orig_columns\n\n\nclass WikiQuestionLoader(object):\n\n  def __init__(self, data_name, root_folder):\n    
self.root_folder = root_folder\n    self.data_folder = os.path.join(self.root_folder, \"data\")\n    self.examples = []\n    self.data_name = data_name\n\n  def num_questions(self):\n    return len(self.examples)\n\n  def load_qa(self):\n    data_source = os.path.join(self.data_folder, self.data_name)\n    f = tf.gfile.GFile(data_source, \"r\")\n    id_regex = re.compile(\"\\(id ([^\\)]*)\\)\")\n    for line in f:\n      id_match = id_regex.search(line)\n      id = id_match.group(1)\n      self.examples.append(id)\n\n  def load(self):\n    self.load_qa()\n\n\ndef is_date(word):\n  if (not (bool(re.search(\"[a-z0-9]\", word, re.IGNORECASE)))):\n    return False\n  if (len(word) != 10):\n    return False\n  if (word[4] != \"-\"):\n    return False\n  if (word[7] != \"-\"):\n    return False\n  for i in range(len(word)):\n    if (not (word[i] == \"X\" or word[i] == \"x\" or word[i] == \"-\" or re.search(\n        \"[0-9]\", word[i]))):\n      return False\n  return True\n\n\nclass WikiQuestionGenerator(object):\n\n  def __init__(self, train_name, dev_name, test_name, root_folder):\n    self.train_name = train_name\n    self.dev_name = dev_name\n    self.test_name = test_name\n    self.train_loader = WikiQuestionLoader(train_name, root_folder)\n    self.dev_loader = WikiQuestionLoader(dev_name, root_folder)\n    self.test_loader = WikiQuestionLoader(test_name, root_folder)\n    self.bad_examples = 0\n    self.root_folder = root_folder   \n    self.data_folder = os.path.join(self.root_folder, \"annotated/data\")\n    self.annotated_examples = {}\n    self.annotated_tables = {}\n    self.annotated_word_reject = {}\n    self.annotated_word_reject[\"-lrb-\"] = 1\n    self.annotated_word_reject[\"-rrb-\"] = 1\n    self.annotated_word_reject[\"UNK\"] = 1\n\n  def is_money(self, word):\n    if (not (bool(re.search(\"[a-z0-9]\", word, re.IGNORECASE)))):\n      return False\n    for i in range(len(word)):\n      if (not (word[i] == \"E\" or word[i] == \".\" or 
re.search(\"[0-9]\",\n                                                             word[i]))):\n        return False\n    return True\n\n  def remove_consecutive(self, ner_tags, ner_values):\n    for i in range(len(ner_tags)):\n      if ((ner_tags[i] == \"NUMBER\" or ner_tags[i] == \"MONEY\" or\n           ner_tags[i] == \"PERCENT\" or ner_tags[i] == \"DATE\") and\n          i + 1 < len(ner_tags) and ner_tags[i] == ner_tags[i + 1] and\n          ner_values[i] == ner_values[i + 1] and ner_values[i] != \"\"):\n        word = ner_values[i]\n        word = word.replace(\">\", \"\").replace(\"<\", \"\").replace(\"=\", \"\").replace(\n            \"%\", \"\").replace(\"~\", \"\").replace(\"$\", \"\").replace(\"£\", \"\").replace(\n                \"€\", \"\")\n        if (re.search(\"[A-Z]\", word) and not (is_date(word)) and not (\n            self.is_money(word))):\n          ner_values[i] = \"A\"\n        else:\n          ner_values[i] = \",\"\n    return ner_tags, ner_values\n\n  def pre_process_sentence(self, tokens, ner_tags, ner_values):\n    sentence = []\n    tokens = tokens.split(\"|\")\n    ner_tags = ner_tags.split(\"|\")\n    ner_values = ner_values.split(\"|\")\n    ner_tags, ner_values = self.remove_consecutive(ner_tags, ner_values)\n    #print \"old: \", tokens\n    for i in range(len(tokens)):\n      word = tokens[i]\n      if (ner_values[i] != \"\" and\n          (ner_tags[i] == \"NUMBER\" or ner_tags[i] == \"MONEY\" or\n           ner_tags[i] == \"PERCENT\" or ner_tags[i] == \"DATE\")):\n        word = ner_values[i]\n        word = word.replace(\">\", \"\").replace(\"<\", \"\").replace(\"=\", \"\").replace(\n            \"%\", \"\").replace(\"~\", \"\").replace(\"$\", \"\").replace(\"£\", \"\").replace(\n                \"€\", \"\")\n        if (re.search(\"[A-Z]\", word) and not (is_date(word)) and not (\n            self.is_money(word))):\n          word = tokens[i]\n        if (is_number(ner_values[i])):\n          word = float(ner_values[i])\n      
  elif (is_number(word)):\n          word = float(word)\n        if (tokens[i] == \"score\"):\n          word = \"score\"\n      if (is_number(word)):\n        word = float(word)\n      if (not (self.annotated_word_reject.has_key(word))):\n        if (is_number(word) or is_date(word) or self.is_money(word)):\n          sentence.append(word)\n        else:\n          word = full_normalize(word)\n          if (not (self.annotated_word_reject.has_key(word)) and\n              bool(re.search(\"[a-z0-9]\", word, re.IGNORECASE))):\n            m = re.search(\",\", word)\n            sentence.append(word.replace(\",\", \"\"))\n    if (len(sentence) == 0):\n      sentence.append(\"UNK\")\n    return sentence\n\n  def load_annotated_data(self, in_file):\n    self.annotated_examples = {}\n    self.annotated_tables = {}\n    f = tf.gfile.GFile(in_file, \"r\")\n    counter = 0\n    for line in f:\n      if (counter > 0):\n        line = line.strip()\n        (question_id, utterance, context, target_value, tokens, lemma_tokens,\n         pos_tags, ner_tags, ner_values, target_canon) = line.split(\"\\t\")\n        question = self.pre_process_sentence(tokens, ner_tags, ner_values)\n        target_canon = target_canon.split(\"|\")\n        self.annotated_examples[question_id] = WikiExample(\n            question_id, question, target_canon, context)\n        self.annotated_tables[context] = []\n      counter += 1\n    print \"Annotated examples loaded \", len(self.annotated_examples)\n    f.close()\n\n  def is_number_column(self, a):\n    for w in a:\n      if (len(w) != 1):\n        return False\n      if (not (is_number(w[0]))):\n        return False\n    return True\n\n  def convert_table(self, table):\n    answer = []\n    for i in range(len(table)):\n      temp = []\n      for j in range(len(table[i])):\n        temp.append(\" \".join([str(w) for w in table[i][j]]))\n      answer.append(temp)\n    return answer\n\n  def load_annotated_tables(self):\n    for table in 
self.annotated_tables.keys():\n      annotated_table = table.replace(\"csv\", \"annotated\")\n      orig_columns = []\n      processed_columns = []\n      f = tf.gfile.GFile(os.path.join(self.root_folder, annotated_table), \"r\")\n      counter = 0\n      for line in f:\n        if (counter > 0):\n          line = line.strip()\n          line = line + \"\\t\" * (13 - len(line.split(\"\\t\")))\n          (row, col, read_id, content, tokens, lemma_tokens, pos_tags, ner_tags,\n           ner_values, number, date, num2, read_list) = line.split(\"\\t\")\n        counter += 1\n      f.close()\n      max_row = int(row)\n      max_col = int(col)\n      for i in range(max_col + 1):\n        orig_columns.append([])\n        processed_columns.append([])\n        for j in range(max_row + 1):\n          orig_columns[i].append(bad_number)\n          processed_columns[i].append(bad_number)\n      #print orig_columns\n      f = tf.gfile.GFile(os.path.join(self.root_folder, annotated_table), \"r\")\n      counter = 0\n      column_names = []\n      for line in f:\n        if (counter > 0):\n          line = line.strip()\n          line = line + \"\\t\" * (13 - len(line.split(\"\\t\")))\n          (row, col, read_id, content, tokens, lemma_tokens, pos_tags, ner_tags,\n           ner_values, number, date, num2, read_list) = line.split(\"\\t\")\n          entry = self.pre_process_sentence(tokens, ner_tags, ner_values)\n          if (row == \"-1\"):\n            column_names.append(entry)\n          else:\n            orig_columns[int(col)][int(row)] = entry\n            if (len(entry) == 1 and is_number(entry[0])):\n              processed_columns[int(col)][int(row)] = float(entry[0])\n            else:\n              for single_entry in entry:\n                if (is_number(single_entry)):\n                  processed_columns[int(col)][int(row)] = float(single_entry)\n                  break\n              nt = ner_tags.split(\"|\")\n              nv = ner_values.split(\"|\")\n       
       for i_entry in range(len(tokens.split(\"|\"))):\n                if (nt[i_entry] == \"DATE\" and\n                    is_number(nv[i_entry].replace(\"-\", \"\").replace(\"X\", \"\"))):\n                  processed_columns[int(col)][int(row)] = float(nv[\n                      i_entry].replace(\"-\", \"\").replace(\"X\", \"\"))\n                  #processed_columns[int(col)][int(row)] =  float(nv[i_entry])\n            if (len(entry) == 1 and (is_number(entry[0]) or is_date(entry[0]) or\n                                     self.is_money(entry[0]))):\n              if (len(entry) == 1 and not (is_number(entry[0])) and\n                  is_date(entry[0])):\n                entry[0] = entry[0].replace(\"X\", \"x\")\n        counter += 1\n      word_columns = []\n      processed_word_columns = []\n      word_column_names = []\n      word_column_indices = []\n      number_columns = []\n      processed_number_columns = []\n      number_column_names = []\n      number_column_indices = []\n      for i in range(max_col + 1):\n        if (self.is_number_column(orig_columns[i])):\n          number_column_indices.append(i)\n          number_column_names.append(column_names[i])\n          temp = []\n          for w in orig_columns[i]:\n            if (is_number(w[0])):\n              temp.append(w[0])\n          number_columns.append(temp)\n          processed_number_columns.append(processed_columns[i])\n        else:\n          word_column_indices.append(i)\n          word_column_names.append(column_names[i])\n          word_columns.append(orig_columns[i])\n          processed_word_columns.append(processed_columns[i])\n      table_info = TableInfo(\n          word_columns, word_column_names, word_column_indices, number_columns,\n          number_column_names, number_column_indices, processed_word_columns,\n          processed_number_columns, orig_columns)\n      self.annotated_tables[table] = table_info\n      f.close()\n\n  def answer_classification(self):\n    
lookup_questions = 0\n    number_lookup_questions = 0\n    word_lookup_questions = 0\n    ambiguous_lookup_questions = 0\n    number_questions = 0\n    bad_questions = 0\n    ice_bad_questions = 0\n    tot = 0\n    got = 0\n    ice = {}\n    with tf.gfile.GFile(\n        self.root_folder + \"/arvind-with-norms-2.tsv\", mode=\"r\") as f:\n      lines = f.readlines()\n      for line in lines:\n        line = line.strip()\n        if (not (self.annotated_examples.has_key(line.split(\"\\t\")[0]))):\n          continue\n        if (len(line.split(\"\\t\")) == 4):\n          line = line + \"\\t\" * (5 - len(line.split(\"\\t\")))\n          if (not (is_number(line.split(\"\\t\")[2]))):\n            ice_bad_questions += 1\n        (example_id, ans_index, ans_raw, process_answer,\n         matched_cells) = line.split(\"\\t\")\n        if (ice.has_key(example_id)):\n          ice[example_id].append(line.split(\"\\t\"))\n        else:\n          ice[example_id] = [line.split(\"\\t\")]\n    for q_id in self.annotated_examples.keys():\n      tot += 1\n      example = self.annotated_examples[q_id]\n      table_info = self.annotated_tables[example.table_key]\n      # Figure out if the answer is numerical or lookup\n      n_cols = len(table_info.orig_columns)\n      n_rows = len(table_info.orig_columns[0])\n      example.lookup_matrix = np.zeros((n_rows, n_cols))\n      exact_matches = {}\n      for (example_id, ans_index, ans_raw, process_answer,\n           matched_cells) in ice[q_id]:\n        for match_cell in matched_cells.split(\"|\"):\n          if (len(match_cell.split(\",\")) == 2):\n            (row, col) = match_cell.split(\",\")\n            row = int(row)\n            col = int(col)\n            if (row >= 0):\n              exact_matches[ans_index] = 1\n      answer_is_in_table = len(exact_matches) == len(example.answer)\n      if (answer_is_in_table):\n        for (example_id, ans_index, ans_raw, process_answer,\n             matched_cells) in ice[q_id]:\n          
for match_cell in matched_cells.split(\"|\"):\n            if (len(match_cell.split(\",\")) == 2):\n              (row, col) = match_cell.split(\",\")\n              row = int(row)\n              col = int(col)\n              example.lookup_matrix[row, col] = float(ans_index) + 1.0\n      example.lookup_number_answer = 0.0\n      if (answer_is_in_table):\n        lookup_questions += 1\n        if len(example.answer) == 1 and is_number(example.answer[0]):\n          example.number_answer = float(example.answer[0])\n          number_lookup_questions += 1\n          example.is_number_lookup = True\n        else:\n          #print \"word lookup\"\n          example.calc_answer = example.number_answer = 0.0\n          word_lookup_questions += 1\n          example.is_word_lookup = True\n      else:\n        if (len(example.answer) == 1 and is_number(example.answer[0])):\n          example.number_answer = example.answer[0]\n          example.is_number_calc = True\n        else:\n          bad_questions += 1\n          example.is_bad_example = True\n          example.is_unknown_answer = True\n      example.is_lookup = example.is_word_lookup or example.is_number_lookup\n      if not example.is_word_lookup and not example.is_bad_example:\n        number_questions += 1\n        example.calc_answer = example.answer[0]\n        example.lookup_number_answer = example.calc_answer\n      # Split up the lookup matrix into word part and number part\n      number_column_indices = table_info.number_column_indices\n      word_column_indices = table_info.word_column_indices\n      example.word_columns = table_info.word_columns\n      example.number_columns = table_info.number_columns\n      example.word_column_names = table_info.word_column_names\n      example.processed_number_columns = table_info.processed_number_columns\n      example.processed_word_columns = table_info.processed_word_columns\n      example.number_column_names = table_info.number_column_names\n      
example.number_lookup_matrix = example.lookup_matrix[:,\n                                                           number_column_indices]\n      example.word_lookup_matrix = example.lookup_matrix[:, word_column_indices]\n\n  def load(self):\n    train_data = []\n    dev_data = []\n    test_data = []\n    self.load_annotated_data(\n        os.path.join(self.data_folder, \"training.annotated\"))\n    self.load_annotated_tables()\n    self.answer_classification()\n    self.train_loader.load()\n    self.dev_loader.load()\n    for i in range(self.train_loader.num_questions()):\n      example = self.train_loader.examples[i]\n      example = self.annotated_examples[example]\n      train_data.append(example)\n    for i in range(self.dev_loader.num_questions()):\n      example = self.dev_loader.examples[i]\n      dev_data.append(self.annotated_examples[example])\n\n    self.load_annotated_data(\n        os.path.join(self.data_folder, \"pristine-unseen-tables.annotated\"))\n    self.load_annotated_tables()\n    self.answer_classification()\n    self.test_loader.load()\n    for i in range(self.test_loader.num_questions()):\n      example = self.test_loader.examples[i]\n      test_data.append(self.annotated_examples[example])\n    return train_data, dev_data, test_data\n"
  },
  {
    "path": "model_zoo/models/resnet/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//resnet/...\",\n    ],\n)\n\nfilegroup(\n    name = \"py_srcs\",\n    data = glob([\n        \"**/*.py\",\n    ]),\n)\n\npy_library(\n    name = \"resnet_model\",\n    srcs = [\"resnet_model.py\"],\n)\n\npy_binary(\n    name = \"resnet_main\",\n    srcs = [\n        \"resnet_main.py\",\n    ],\n    deps = [\n        \":cifar_input\",\n        \":resnet_model\",\n    ],\n)\n\npy_library(\n    name = \"cifar_input\",\n    srcs = [\"cifar_input.py\"],\n)\n"
  },
  {
    "path": "model_zoo/models/resnet/README.md",
    "content": "<font size=4><b>Reproduced ResNet on CIFAR-10 and CIFAR-100 dataset.</b></font>\n\ncontact: panyx0718 (xpan@google.com)\n\n<b>Dataset:</b>\n\nhttps://www.cs.toronto.edu/~kriz/cifar.html\n\n<b>Related papers:</b>\n\nIdentity Mappings in Deep Residual Networks\n\nhttps://arxiv.org/pdf/1603.05027v2.pdf\n\nDeep Residual Learning for Image Recognition\n\nhttps://arxiv.org/pdf/1512.03385v1.pdf\n\nWide Residual Networks\n\nhttps://arxiv.org/pdf/1605.07146v1.pdf\n\n<b>Settings:</b>\n\n* Random split 50k training set into 45k/5k train/eval split.\n* Pad to 36x36 and random crop. Horizontal flip. Per-image whitenting. \n* Momentum optimizer 0.9.\n* Learning rate schedule: 0.1 (40k), 0.01 (60k), 0.001 (>60k).\n* L2 weight decay: 0.002.\n* Batch size: 128. (28-10 wide and 1001 layer bottleneck use 64)\n\n<b>Results:</b>\n\n<left>\n![Precisions](g3doc/cifar_resnet.gif)\n</left>\n<left>\n![Precisions Legends](g3doc/cifar_resnet_legends.gif)\n</left>\n\n\nCIFAR-10 Model|Best Precision|Steps\n--------------|--------------|------\n32 layer|92.5%|~80k\n110 layer|93.6%|~80k\n164 layer bottleneck|94.5%|~80k\n1001 layer bottleneck|94.9%|~80k\n28-10 wide|95%|~90k\n\nCIFAR-100 Model|Best Precision|Steps\n---------------|--------------|-----\n32 layer|68.1%|~45k\n110 layer|71.3%|~60k\n164 layer bottleneck|75.7%|~50k\n1001 layer bottleneck|78.2%|~70k\n28-10 wide|78.3%|~70k\n\n<b>Prerequisite:</b>\n\n1. Install TensorFlow, Bazel.\n\n2. 
Download the CIFAR-10/CIFAR-100 dataset.\n\n```shell\ncurl -o cifar-10-binary.tar.gz https://www.cs.toronto.edu/~kriz/cifar-10-binary.tar.gz\ncurl -o cifar-100-binary.tar.gz https://www.cs.toronto.edu/~kriz/cifar-100-binary.tar.gz\n```\n\n<b>How to run:</b>\n\n```shell\n# cd to your workspace.\n# It contains an empty WORKSPACE file, the resnet code and the cifar10 dataset.\n# Note: User can split 5k from train set for eval set.\nls -R\n  .:\n  cifar10  resnet  WORKSPACE\n\n  ./cifar10:\n  data_batch_1.bin  data_batch_2.bin  data_batch_3.bin  data_batch_4.bin\n  data_batch_5.bin  test_batch.bin\n\n  ./resnet:\n  BUILD  cifar_input.py  g3doc  README.md  resnet_main.py  resnet_model.py\n\n# Build everything for GPU.\nbazel build -c opt --config=cuda resnet/...\n\n# Train the model.\nbazel-bin/resnet/resnet_main --train_data_path=cifar10/data_batch* \\\n                             --log_root=/tmp/resnet_model \\\n                             --train_dir=/tmp/resnet_model/train \\\n                             --dataset='cifar10' \\\n                             --num_gpus=1\n\n# Evaluate the model.\n# Avoid running on the same GPU as the training job at the same time,\n# otherwise, you might run out of memory.\nbazel-bin/resnet/resnet_main --eval_data_path=cifar10/test_batch.bin \\\n                             --log_root=/tmp/resnet_model \\\n                             --eval_dir=/tmp/resnet_model/test \\\n                             --mode=eval \\\n                             --dataset='cifar10' \\\n                             --num_gpus=0\n```\n"
  },
  {
    "path": "model_zoo/models/resnet/cifar_input.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"CIFAR dataset input module.\n\"\"\"\n\nimport tensorflow as tf\n\n\ndef build_input(dataset, data_path, batch_size, mode):\n  \"\"\"Build CIFAR image and labels.\n\n  Args:\n    dataset: Either 'cifar10' or 'cifar100'.\n    data_path: Filename for data.\n    batch_size: Input batch size.\n    mode: Either 'train' or 'eval'.\n  Returns:\n    images: Batches of images. [batch_size, image_size, image_size, 3]\n    labels: Batches of labels. 
[batch_size, num_classes]\n  Raises:\n    ValueError: when the specified dataset is not supported.\n  \"\"\"\n  image_size = 32\n  if dataset == 'cifar10':\n    label_bytes = 1\n    label_offset = 0\n    num_classes = 10\n  elif dataset == 'cifar100':\n    label_bytes = 1\n    label_offset = 1\n    num_classes = 100\n  else:\n    raise ValueError('Not supported dataset %s' % dataset)\n\n  depth = 3\n  image_bytes = image_size * image_size * depth\n  record_bytes = label_bytes + label_offset + image_bytes\n\n  data_files = tf.gfile.Glob(data_path)\n  file_queue = tf.train.string_input_producer(data_files, shuffle=True)\n  # Read examples from files in the filename queue.\n  reader = tf.FixedLengthRecordReader(record_bytes=record_bytes)\n  _, value = reader.read(file_queue)\n\n  # Convert these examples to dense labels and processed images.\n  record = tf.reshape(tf.decode_raw(value, tf.uint8), [record_bytes])\n  label = tf.cast(tf.slice(record, [label_offset], [label_bytes]), tf.int32)\n  # Convert from string to [depth * height * width] to [depth, height, width].\n  depth_major = tf.reshape(tf.slice(record, [label_bytes], [image_bytes]),\n                           [depth, image_size, image_size])\n  # Convert from [depth, height, width] to [height, width, depth].\n  image = tf.cast(tf.transpose(depth_major, [1, 2, 0]), tf.float32)\n\n  if mode == 'train':\n    image = tf.image.resize_image_with_crop_or_pad(\n        image, image_size+4, image_size+4)\n    image = tf.random_crop(image, [image_size, image_size, 3])\n    image = tf.image.random_flip_left_right(image)\n    # Brightness/saturation/contrast provides small gains .2%~.5% on cifar.\n    # image = tf.image.random_brightness(image, max_delta=63. 
/ 255.)\n    # image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n    # image = tf.image.random_contrast(image, lower=0.2, upper=1.8)\n    image = tf.image.per_image_whitening(image)\n\n    example_queue = tf.RandomShuffleQueue(\n        capacity=16 * batch_size,\n        min_after_dequeue=8 * batch_size,\n        dtypes=[tf.float32, tf.int32],\n        shapes=[[image_size, image_size, depth], [1]])\n    num_threads = 16\n  else:\n    image = tf.image.resize_image_with_crop_or_pad(\n        image, image_size, image_size)\n    image = tf.image.per_image_whitening(image)\n\n    example_queue = tf.FIFOQueue(\n        3 * batch_size,\n        dtypes=[tf.float32, tf.int32],\n        shapes=[[image_size, image_size, depth], [1]])\n    num_threads = 1\n\n  example_enqueue_op = example_queue.enqueue([image, label])\n  tf.train.add_queue_runner(tf.train.queue_runner.QueueRunner(\n      example_queue, [example_enqueue_op] * num_threads))\n\n  # Read 'batch' labels + images from the example queue.\n  images, labels = example_queue.dequeue_many(batch_size)\n  labels = tf.reshape(labels, [batch_size, 1])\n  indices = tf.reshape(tf.range(0, batch_size, 1), [batch_size, 1])\n  labels = tf.sparse_to_dense(\n      tf.concat(1, [indices, labels]),\n      [batch_size, num_classes], 1.0, 0.0)\n\n  assert len(images.get_shape()) == 4\n  assert images.get_shape()[0] == batch_size\n  assert images.get_shape()[-1] == 3\n  assert len(labels.get_shape()) == 2\n  assert labels.get_shape()[0] == batch_size\n  assert labels.get_shape()[1] == num_classes\n\n  # Display the training images in the visualizer.\n  tf.image_summary('images', images)\n  return images, labels\n"
  },
  {
    "path": "model_zoo/models/resnet/resnet_main.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"ResNet Train/Eval module.\n\"\"\"\nimport sys\nimport time\n\nimport cifar_input\nimport numpy as np\nimport resnet_model\nimport tensorflow as tf\n\nFLAGS = tf.app.flags.FLAGS\ntf.app.flags.DEFINE_string('dataset', 'cifar10', 'cifar10 or cifar100.')\ntf.app.flags.DEFINE_string('mode', 'train', 'train or eval.')\ntf.app.flags.DEFINE_string('train_data_path', '', 'Filepattern for training data.')\ntf.app.flags.DEFINE_string('eval_data_path', '', 'Filepattern for eval data.')\ntf.app.flags.DEFINE_integer('image_size', 32, 'Image side length.')\ntf.app.flags.DEFINE_string('train_dir', '',\n                           'Directory to keep training outputs.')\ntf.app.flags.DEFINE_string('eval_dir', '',\n                           'Directory to keep eval outputs.')\ntf.app.flags.DEFINE_integer('eval_batch_count', 50,\n                            'Number of batches to eval.')\ntf.app.flags.DEFINE_bool('eval_once', False,\n                         'Whether to evaluate the model only once.')\ntf.app.flags.DEFINE_string('log_root', '',\n                           'Directory to keep the checkpoints. 
Should be a '\n                           'parent directory of FLAGS.train_dir/eval_dir.')\ntf.app.flags.DEFINE_integer('num_gpus', 0,\n                            'Number of gpus used for training. (0 or 1)')\n\n\ndef train(hps):\n  \"\"\"Training loop.\"\"\"\n  images, labels = cifar_input.build_input(\n      FLAGS.dataset, FLAGS.train_data_path, hps.batch_size, FLAGS.mode)\n  model = resnet_model.ResNet(hps, images, labels, FLAGS.mode)\n  model.build_graph()\n  summary_writer = tf.train.SummaryWriter(FLAGS.train_dir)\n\n  sv = tf.train.Supervisor(logdir=FLAGS.log_root,\n                           is_chief=True,\n                           summary_op=None,\n                           save_summaries_secs=60,\n                           save_model_secs=300,\n                           global_step=model.global_step)\n  sess = sv.prepare_or_wait_for_session(\n      config=tf.ConfigProto(allow_soft_placement=True))\n\n  step = 0\n  lrn_rate = 0.1\n\n  while not sv.should_stop():\n    (_, summaries, loss, predictions, truth, train_step) = sess.run(\n        [model.train_op, model.summaries, model.cost, model.predictions,\n         model.labels, model.global_step],\n        feed_dict={model.lrn_rate: lrn_rate})\n\n    if train_step < 40000:\n      lrn_rate = 0.1\n    elif train_step < 60000:\n      lrn_rate = 0.01\n    elif train_step < 80000:\n      lrn_rate = 0.001\n    else:\n      lrn_rate = 0.0001\n\n    truth = np.argmax(truth, axis=1)\n    predictions = np.argmax(predictions, axis=1)\n    precision = np.mean(truth == predictions)\n\n    step += 1\n    if step % 100 == 0:\n      precision_summ = tf.Summary()\n      precision_summ.value.add(\n          tag='Precision', simple_value=precision)\n      summary_writer.add_summary(precision_summ, train_step)\n      summary_writer.add_summary(summaries, train_step)\n      tf.logging.info('loss: %.3f, precision: %.3f\\n' % (loss, precision))\n      summary_writer.flush()\n\n  sv.Stop()\n\n\ndef evaluate(hps):\n  
\"\"\"Eval loop.\"\"\"\n  images, labels = cifar_input.build_input(\n      FLAGS.dataset, FLAGS.eval_data_path, hps.batch_size, FLAGS.mode)\n  model = resnet_model.ResNet(hps, images, labels, FLAGS.mode)\n  model.build_graph()\n  saver = tf.train.Saver()\n  summary_writer = tf.train.SummaryWriter(FLAGS.eval_dir)\n\n  sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))\n  tf.train.start_queue_runners(sess)\n\n  best_precision = 0.0\n  while True:\n    time.sleep(60)\n    try:\n      ckpt_state = tf.train.get_checkpoint_state(FLAGS.log_root)\n    except tf.errors.OutOfRangeError as e:\n      tf.logging.error('Cannot restore checkpoint: %s', e)\n      continue\n    if not (ckpt_state and ckpt_state.model_checkpoint_path):\n      tf.logging.info('No model to eval yet at %s', FLAGS.log_root)\n      continue\n    tf.logging.info('Loading checkpoint %s', ckpt_state.model_checkpoint_path)\n    saver.restore(sess, ckpt_state.model_checkpoint_path)\n\n    total_prediction, correct_prediction = 0, 0\n    for _ in xrange(FLAGS.eval_batch_count):\n      (summaries, loss, predictions, truth, train_step) = sess.run(\n          [model.summaries, model.cost, model.predictions,\n           model.labels, model.global_step])\n\n      truth = np.argmax(truth, axis=1)\n      predictions = np.argmax(predictions, axis=1)\n      correct_prediction += np.sum(truth == predictions)\n      total_prediction += predictions.shape[0]\n\n    precision = 1.0 * correct_prediction / total_prediction\n    best_precision = max(precision, best_precision)\n\n    precision_summ = tf.Summary()\n    precision_summ.value.add(\n        tag='Precision', simple_value=precision)\n    summary_writer.add_summary(precision_summ, train_step)\n    best_precision_summ = tf.Summary()\n    best_precision_summ.value.add(\n        tag='Best Precision', simple_value=best_precision)\n    summary_writer.add_summary(best_precision_summ, train_step)\n    summary_writer.add_summary(summaries, train_step)\n    
tf.logging.info('loss: %.3f, precision: %.3f, best precision: %.3f\\n' %\n                    (loss, precision, best_precision))\n    summary_writer.flush()\n\n    if FLAGS.eval_once:\n      break\n\n\ndef main(_):\n  if FLAGS.num_gpus == 0:\n    dev = '/cpu:0'\n  elif FLAGS.num_gpus == 1:\n    dev = '/gpu:0'\n  else:\n    raise ValueError('Only support 0 or 1 gpu.')\n\n  if FLAGS.mode == 'train':\n    batch_size = 128\n  elif FLAGS.mode == 'eval':\n    batch_size = 100\n\n  if FLAGS.dataset == 'cifar10':\n    num_classes = 10\n  elif FLAGS.dataset == 'cifar100':\n    num_classes = 100\n\n  hps = resnet_model.HParams(batch_size=batch_size,\n                             num_classes=num_classes,\n                             min_lrn_rate=0.0001,\n                             lrn_rate=0.1,\n                             num_residual_units=5,\n                             use_bottleneck=False,\n                             weight_decay_rate=0.0002,\n                             relu_leakiness=0.1,\n                             optimizer='mom')\n\n  with tf.device(dev):\n    if FLAGS.mode == 'train':\n      train(hps)\n    elif FLAGS.mode == 'eval':\n      evaluate(hps)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/resnet/resnet_model.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"ResNet model.\n\nRelated papers:\nhttps://arxiv.org/pdf/1603.05027v2.pdf\nhttps://arxiv.org/pdf/1512.03385v1.pdf\nhttps://arxiv.org/pdf/1605.07146v1.pdf\n\"\"\"\nfrom collections import namedtuple\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.training import moving_averages\n\n\nHParams = namedtuple('HParams',\n                     'batch_size, num_classes, min_lrn_rate, lrn_rate, '\n                     'num_residual_units, use_bottleneck, weight_decay_rate, '\n                     'relu_leakiness, optimizer')\n\n\nclass ResNet(object):\n  \"\"\"ResNet model.\"\"\"\n\n  def __init__(self, hps, images, labels, mode):\n    \"\"\"ResNet constructor.\n\n    Args:\n      hps: Hyperparameters.\n      images: Batches of images. [batch_size, image_size, image_size, 3]\n      labels: Batches of labels. 
[batch_size, num_classes]\n      mode: One of 'train' and 'eval'.\n    \"\"\"\n    self.hps = hps\n    self._images = images\n    self.labels = labels\n    self.mode = mode\n\n    self._extra_train_ops = []\n\n  def build_graph(self):\n    \"\"\"Build a whole graph for the model.\"\"\"\n    self.global_step = tf.Variable(0, name='global_step', trainable=False)\n    self._build_model()\n    if self.mode == 'train':\n      self._build_train_op()\n    self.summaries = tf.merge_all_summaries()\n\n  def _stride_arr(self, stride):\n    \"\"\"Map a stride scalar to the stride array for tf.nn.conv2d.\"\"\"\n    return [1, stride, stride, 1]\n\n  def _build_model(self):\n    \"\"\"Build the core model within the graph.\"\"\"\n    with tf.variable_scope('init'):\n      x = self._images\n      x = self._conv('init_conv', x, 3, 3, 16, self._stride_arr(1))\n\n    strides = [1, 2, 2]\n    activate_before_residual = [True, False, False]\n    if self.hps.use_bottleneck:\n      res_func = self._bottleneck_residual\n      filters = [16, 64, 128, 256]\n    else:\n      res_func = self._residual\n      filters = [16, 16, 32, 64]\n      # Uncomment the following codes to use w28-10 wide residual network.\n      # It is more memory efficient than very deep residual network and has\n      # comparably good performance.\n      # https://arxiv.org/pdf/1605.07146v1.pdf\n      # filters = [16, 160, 320, 640]\n      # Update hps.num_residual_units to 9\n\n    with tf.variable_scope('unit_1_0'):\n      x = res_func(x, filters[0], filters[1], self._stride_arr(strides[0]),\n                   activate_before_residual[0])\n    for i in xrange(1, self.hps.num_residual_units):\n      with tf.variable_scope('unit_1_%d' % i):\n        x = res_func(x, filters[1], filters[1], self._stride_arr(1), False)\n\n    with tf.variable_scope('unit_2_0'):\n      x = res_func(x, filters[1], filters[2], self._stride_arr(strides[1]),\n                   activate_before_residual[1])\n    for i in xrange(1, 
self.hps.num_residual_units):\n      with tf.variable_scope('unit_2_%d' % i):\n        x = res_func(x, filters[2], filters[2], self._stride_arr(1), False)\n\n    with tf.variable_scope('unit_3_0'):\n      x = res_func(x, filters[2], filters[3], self._stride_arr(strides[2]),\n                   activate_before_residual[2])\n    for i in xrange(1, self.hps.num_residual_units):\n      with tf.variable_scope('unit_3_%d' % i):\n        x = res_func(x, filters[3], filters[3], self._stride_arr(1), False)\n\n    with tf.variable_scope('unit_last'):\n      x = self._batch_norm('final_bn', x)\n      x = self._relu(x, self.hps.relu_leakiness)\n      x = self._global_avg_pool(x)\n\n    with tf.variable_scope('logit'):\n      logits = self._fully_connected(x, self.hps.num_classes)\n      self.predictions = tf.nn.softmax(logits)\n\n    with tf.variable_scope('costs'):\n      xent = tf.nn.softmax_cross_entropy_with_logits(\n          logits, self.labels)\n      self.cost = tf.reduce_mean(xent, name='xent')\n      self.cost += self._decay()\n\n      tf.scalar_summary('cost', self.cost)\n\n  def _build_train_op(self):\n    \"\"\"Build training specific ops for the graph.\"\"\"\n    self.lrn_rate = tf.constant(self.hps.lrn_rate, tf.float32)\n    tf.scalar_summary('learning rate', self.lrn_rate)\n\n    trainable_variables = tf.trainable_variables()\n    grads = tf.gradients(self.cost, trainable_variables)\n\n    if self.hps.optimizer == 'sgd':\n      optimizer = tf.train.GradientDescentOptimizer(self.lrn_rate)\n    elif self.hps.optimizer == 'mom':\n      optimizer = tf.train.MomentumOptimizer(self.lrn_rate, 0.9)\n\n    apply_op = optimizer.apply_gradients(\n        zip(grads, trainable_variables),\n        global_step=self.global_step, name='train_step')\n\n    train_ops = [apply_op] + self._extra_train_ops\n    self.train_op = tf.group(*train_ops)\n\n  # TODO(xpan): Consider batch_norm in contrib/layers/python/layers/layers.py\n  def _batch_norm(self, name, x):\n    \"\"\"Batch 
normalization.\"\"\"\n    with tf.variable_scope(name):\n      params_shape = [x.get_shape()[-1]]\n\n      beta = tf.get_variable(\n          'beta', params_shape, tf.float32,\n          initializer=tf.constant_initializer(0.0, tf.float32))\n      gamma = tf.get_variable(\n          'gamma', params_shape, tf.float32,\n          initializer=tf.constant_initializer(1.0, tf.float32))\n\n      if self.mode == 'train':\n        mean, variance = tf.nn.moments(x, [0, 1, 2], name='moments')\n\n        moving_mean = tf.get_variable(\n            'moving_mean', params_shape, tf.float32,\n            initializer=tf.constant_initializer(0.0, tf.float32),\n            trainable=False)\n        moving_variance = tf.get_variable(\n            'moving_variance', params_shape, tf.float32,\n            initializer=tf.constant_initializer(1.0, tf.float32),\n            trainable=False)\n\n        self._extra_train_ops.append(moving_averages.assign_moving_average(\n            moving_mean, mean, 0.9))\n        self._extra_train_ops.append(moving_averages.assign_moving_average(\n            moving_variance, variance, 0.9))\n      else:\n        mean = tf.get_variable(\n            'moving_mean', params_shape, tf.float32,\n            initializer=tf.constant_initializer(0.0, tf.float32),\n            trainable=False)\n        variance = tf.get_variable(\n            'moving_variance', params_shape, tf.float32,\n            initializer=tf.constant_initializer(1.0, tf.float32),\n            trainable=False)\n        tf.histogram_summary(mean.op.name, mean)\n        tf.histogram_summary(variance.op.name, variance)\n      # epsilon used to be 1e-5. 
Maybe 0.001 solves NaN problem in deeper net.\n      y = tf.nn.batch_normalization(\n          x, mean, variance, beta, gamma, 0.001)\n      y.set_shape(x.get_shape())\n      return y\n\n  def _residual(self, x, in_filter, out_filter, stride,\n                activate_before_residual=False):\n    \"\"\"Residual unit with 2 sub layers.\"\"\"\n    if activate_before_residual:\n      with tf.variable_scope('shared_activation'):\n        x = self._batch_norm('init_bn', x)\n        x = self._relu(x, self.hps.relu_leakiness)\n        orig_x = x\n    else:\n      with tf.variable_scope('residual_only_activation'):\n        orig_x = x\n        x = self._batch_norm('init_bn', x)\n        x = self._relu(x, self.hps.relu_leakiness)\n\n    with tf.variable_scope('sub1'):\n      x = self._conv('conv1', x, 3, in_filter, out_filter, stride)\n\n    with tf.variable_scope('sub2'):\n      x = self._batch_norm('bn2', x)\n      x = self._relu(x, self.hps.relu_leakiness)\n      x = self._conv('conv2', x, 3, out_filter, out_filter, [1, 1, 1, 1])\n\n    with tf.variable_scope('sub_add'):\n      if in_filter != out_filter:\n        orig_x = tf.nn.avg_pool(orig_x, stride, stride, 'VALID')\n        orig_x = tf.pad(\n            orig_x, [[0, 0], [0, 0], [0, 0],\n                     [(out_filter-in_filter)//2, (out_filter-in_filter)//2]])\n      x += orig_x\n\n    tf.logging.info('image after unit %s', x.get_shape())\n    return x\n\n  def _bottleneck_residual(self, x, in_filter, out_filter, stride,\n                           activate_before_residual=False):\n    \"\"\"Bottleneck residual unit with 3 sub layers.\"\"\"\n    if activate_before_residual:\n      with tf.variable_scope('common_bn_relu'):\n        x = self._batch_norm('init_bn', x)\n        x = self._relu(x, self.hps.relu_leakiness)\n        orig_x = x\n    else:\n      with tf.variable_scope('residual_bn_relu'):\n        orig_x = x\n        x = self._batch_norm('init_bn', x)\n        x = self._relu(x, 
self.hps.relu_leakiness)\n\n    with tf.variable_scope('sub1'):\n      x = self._conv('conv1', x, 1, in_filter, out_filter/4, stride)\n\n    with tf.variable_scope('sub2'):\n      x = self._batch_norm('bn2', x)\n      x = self._relu(x, self.hps.relu_leakiness)\n      x = self._conv('conv2', x, 3, out_filter/4, out_filter/4, [1, 1, 1, 1])\n\n    with tf.variable_scope('sub3'):\n      x = self._batch_norm('bn3', x)\n      x = self._relu(x, self.hps.relu_leakiness)\n      x = self._conv('conv3', x, 1, out_filter/4, out_filter, [1, 1, 1, 1])\n\n    with tf.variable_scope('sub_add'):\n      if in_filter != out_filter:\n        orig_x = self._conv('project', orig_x, 1, in_filter, out_filter, stride)\n      x += orig_x\n\n    tf.logging.info('image after unit %s', x.get_shape())\n    return x\n\n  def _decay(self):\n    \"\"\"L2 weight decay loss.\"\"\"\n    costs = []\n    for var in tf.trainable_variables():\n      if var.op.name.find(r'DW') > 0:\n        costs.append(tf.nn.l2_loss(var))\n        # tf.histogram_summary(var.op.name, var)\n\n    return tf.mul(self.hps.weight_decay_rate, tf.add_n(costs))\n\n  def _conv(self, name, x, filter_size, in_filters, out_filters, strides):\n    \"\"\"Convolution.\"\"\"\n    with tf.variable_scope(name):\n      n = filter_size * filter_size * out_filters\n      kernel = tf.get_variable(\n          'DW', [filter_size, filter_size, in_filters, out_filters],\n          tf.float32, initializer=tf.random_normal_initializer(\n              stddev=np.sqrt(2.0/n)))\n      return tf.nn.conv2d(x, kernel, strides, padding='SAME')\n\n  def _relu(self, x, leakiness=0.0):\n    \"\"\"Relu, with optional leaky support.\"\"\"\n    return tf.select(tf.less(x, 0.0), leakiness * x, x, name='leaky_relu')\n\n  def _fully_connected(self, x, out_dim):\n    \"\"\"FullyConnected layer for final output.\"\"\"\n    x = tf.reshape(x, [self.hps.batch_size, -1])\n    w = tf.get_variable(\n        'DW', [x.get_shape()[1], out_dim],\n        
initializer=tf.uniform_unit_scaling_initializer(factor=1.0))\n    b = tf.get_variable('biases', [out_dim],\n                        initializer=tf.constant_initializer())\n    return tf.nn.xw_plus_b(x, w, b)\n\n  def _global_avg_pool(self, x):\n    assert x.get_shape().ndims == 4\n    return tf.reduce_mean(x, [1, 2])\n"
  },
  {
    "path": "model_zoo/models/slim/BUILD",
    "content": "# Description:\n#   Contains files for loading, training and evaluating TF-Slim-based models.\n\npackage(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(name = \"internal\")\n\npy_library(\n    name = \"dataset_utils\",\n    srcs = [\"datasets/dataset_utils.py\"],\n)\n\npy_library(\n    name = \"download_and_convert_cifar10\",\n    srcs = [\"datasets/download_and_convert_cifar10.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_library(\n    name = \"download_and_convert_flowers\",\n    srcs = [\"datasets/download_and_convert_flowers.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_library(\n    name = \"download_and_convert_mnist\",\n    srcs = [\"datasets/download_and_convert_mnist.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_binary(\n    name = \"download_and_convert_data\",\n    srcs = [\"download_and_convert_data.py\"],\n    deps = [\n        \":download_and_convert_cifar10\",\n        \":download_and_convert_flowers\",\n        \":download_and_convert_mnist\",\n    ],\n)\n\npy_binary(\n    name = \"cifar10\",\n    srcs = [\"datasets/cifar10.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_binary(\n    name = \"flowers\",\n    srcs = [\"datasets/flowers.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_binary(\n    name = \"imagenet\",\n    srcs = [\"datasets/imagenet.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_binary(\n    name = \"mnist\",\n    srcs = [\"datasets/mnist.py\"],\n    deps = [\":dataset_utils\"],\n)\n\npy_library(\n    name = \"dataset_factory\",\n    srcs = [\"datasets/dataset_factory.py\"],\n    deps = [\n        \":cifar10\",\n        \":flowers\",\n        \":imagenet\",\n        \":mnist\",\n    ],\n)\n\npy_library(\n    name = \"model_deploy\",\n    srcs = [\"deployment/model_deploy.py\"],\n)\n\npy_test(\n    name = \"model_deploy_test\",\n    srcs = [\"deployment/model_deploy_test.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = 
[\":model_deploy\"],\n)\n\npy_library(\n    name = \"cifarnet_preprocessing\",\n    srcs = [\"preprocessing/cifarnet_preprocessing.py\"],\n)\n\npy_library(\n    name = \"inception_preprocessing\",\n    srcs = [\"preprocessing/inception_preprocessing.py\"],\n)\n\npy_library(\n    name = \"lenet_preprocessing\",\n    srcs = [\"preprocessing/lenet_preprocessing.py\"],\n)\n\npy_library(\n    name = \"vgg_preprocessing\",\n    srcs = [\"preprocessing/vgg_preprocessing.py\"],\n)\n\npy_library(\n    name = \"preprocessing_factory\",\n    srcs = [\"preprocessing/preprocessing_factory.py\"],\n    deps = [\n        \":cifarnet_preprocessing\",\n        \":inception_preprocessing\",\n        \":lenet_preprocessing\",\n        \":vgg_preprocessing\",\n    ],\n)\n\n# Typical networks definitions.\n\npy_library(\n    name = \"nets\",\n    deps = [\n        \":alexnet\",\n        \":cifarnet\",\n        \":inception\",\n        \":lenet\",\n        \":overfeat\",\n        \":resnet_v1\",\n        \":resnet_v2\",\n        \":vgg\",\n    ],\n)\n\npy_library(\n    name = \"alexnet\",\n    srcs = [\"nets/alexnet.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_test(\n    name = \"alexnet_test\",\n    size = \"medium\",\n    srcs = [\"nets/alexnet_test.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\":alexnet\"],\n)\n\npy_library(\n    name = \"cifarnet\",\n    srcs = [\"nets/cifarnet.py\"],\n)\n\npy_library(\n    name = \"inception\",\n    srcs = [\"nets/inception.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":inception_resnet_v2\",\n        \":inception_v1\",\n        \":inception_v2\",\n        \":inception_v3\",\n        \":inception_v4\",\n    ],\n)\n\npy_library(\n    name = \"inception_utils\",\n    srcs = [\"nets/inception_utils.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_library(\n    name = \"inception_v1\",\n    srcs = [\"nets/inception_v1.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":inception_utils\",\n    
],\n)\n\npy_library(\n    name = \"inception_v2\",\n    srcs = [\"nets/inception_v2.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":inception_utils\",\n    ],\n)\n\npy_library(\n    name = \"inception_v3\",\n    srcs = [\"nets/inception_v3.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":inception_utils\",\n    ],\n)\n\npy_library(\n    name = \"inception_v4\",\n    srcs = [\"nets/inception_v4.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":inception_utils\",\n    ],\n)\n\npy_library(\n    name = \"inception_resnet_v2\",\n    srcs = [\"nets/inception_resnet_v2.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_test(\n    name = \"inception_v1_test\",\n    size = \"large\",\n    srcs = [\"nets/inception_v1_test.py\"],\n    shard_count = 3,\n    srcs_version = \"PY2AND3\",\n    deps = [\":inception\"],\n)\n\npy_test(\n    name = \"inception_v2_test\",\n    size = \"large\",\n    srcs = [\"nets/inception_v2_test.py\"],\n    shard_count = 3,\n    srcs_version = \"PY2AND3\",\n    deps = [\":inception\"],\n)\n\npy_test(\n    name = \"inception_v3_test\",\n    size = \"large\",\n    srcs = [\"nets/inception_v3_test.py\"],\n    shard_count = 3,\n    srcs_version = \"PY2AND3\",\n    deps = [\":inception\"],\n)\n\npy_test(\n    name = \"inception_v4_test\",\n    size = \"large\",\n    srcs = [\"nets/inception_v4_test.py\"],\n    shard_count = 3,\n    srcs_version = \"PY2AND3\",\n    deps = [\":inception\"],\n)\n\npy_test(\n    name = \"inception_resnet_v2_test\",\n    size = \"large\",\n    srcs = [\"nets/inception_resnet_v2_test.py\"],\n    shard_count = 3,\n    srcs_version = \"PY2AND3\",\n    deps = [\":inception\"],\n)\n\npy_library(\n    name = \"lenet\",\n    srcs = [\"nets/lenet.py\"],\n)\n\npy_library(\n    name = \"overfeat\",\n    srcs = [\"nets/overfeat.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_test(\n    name = \"overfeat_test\",\n    size = \"medium\",\n    srcs = [\"nets/overfeat_test.py\"],\n    
srcs_version = \"PY2AND3\",\n    deps = [\":overfeat\"],\n)\n\npy_library(\n    name = \"resnet_utils\",\n    srcs = [\"nets/resnet_utils.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_library(\n    name = \"resnet_v1\",\n    srcs = [\"nets/resnet_v1.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":resnet_utils\",\n    ],\n)\n\npy_test(\n    name = \"resnet_v1_test\",\n    size = \"medium\",\n    srcs = [\"nets/resnet_v1_test.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\":resnet_v1\"],\n)\n\npy_library(\n    name = \"resnet_v2\",\n    srcs = [\"nets/resnet_v2.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\n        \":resnet_utils\",\n    ],\n)\n\npy_test(\n    name = \"resnet_v2_test\",\n    size = \"medium\",\n    srcs = [\"nets/resnet_v2_test.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\":resnet_v2\"],\n)\n\npy_library(\n    name = \"vgg\",\n    srcs = [\"nets/vgg.py\"],\n    srcs_version = \"PY2AND3\",\n)\n\npy_test(\n    name = \"vgg_test\",\n    size = \"medium\",\n    srcs = [\"nets/vgg_test.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\":vgg\"],\n)\n\npy_library(\n    name = \"nets_factory\",\n    srcs = [\"nets/nets_factory.py\"],\n    deps = [\":nets\"],\n)\n\npy_test(\n    name = \"nets_factory_test\",\n    size = \"medium\",\n    srcs = [\"nets/nets_factory_test.py\"],\n    srcs_version = \"PY2AND3\",\n    deps = [\":nets_factory\"],\n)\n\npy_binary(\n    name = \"train_image_classifier\",\n    srcs = [\"train_image_classifier.py\"],\n    deps = [\n        \":dataset_factory\",\n        \":model_deploy\",\n        \":nets_factory\",\n        \":preprocessing_factory\",\n    ],\n)\n\npy_binary(\n    name = \"eval_image_classifier\",\n    srcs = [\"eval_image_classifier.py\"],\n    deps = [\n        \":dataset_factory\",\n        \":model_deploy\",\n        \":nets_factory\",\n        \":preprocessing_factory\",\n    ],\n)\n"
  },
  {
    "path": "model_zoo/models/slim/README.md",
    "content": "# TensorFlow-Slim image classification library\n\n[TF-slim](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/slim)\nis a new lightweight high-level API of TensorFlow (`tensorflow.contrib.slim`)\nfor defining, training and evaluating complex\nmodels. This directory contains\ncode for training and evaluating several widely used Convolutional Neural\nNetwork (CNN) image classification models using TF-slim.\nIt contains scripts that will allow\nyou to train models from scratch or fine-tune them from pre-trained network\nweights. It also contains code for downloading standard image datasets,\nconverting them\nto TensorFlow's native TFRecord format and reading them in using TF-Slim's\ndata reading and queueing utilities. You can easily train any model on any of\nthese datasets, as we demonstrate below. We've also included a\n[jupyter notebook](https://github.com/tensorflow/models/blob/master/slim/slim_walkthough.ipynb),\nwhich provides working examples of how to use TF-Slim for image classification.\n\n## Contacts\n\nMaintainers of TF-slim:\n\n* Nathan Silberman,\n  github: [nathansilberman](https://github.com/nathansilberman)\n* Sergio Guadarrama, github: [sguada](https://github.com/sguada)\n\n## Table of contents\n\n<a href=\"#Install\">Installation and setup</a><br>\n<a href='#Data'>Preparing the datasets</a><br>\n<a href='#Pretrained'>Using pre-trained models</a><br>\n<a href='#Training'>Training from scratch</a><br>\n<a href='#Tuning'>Fine tuning to a new task</a><br>\n<a href='#Eval'>Evaluating performance</a><br>\n\n# Installation\n<a id='Install'></a>\n\nIn this section, we describe the steps required to install the appropriate\nprerequisite packages.\n\n## Installing latest version of TF-slim\n\nAs of 8/28/16, the latest [stable release of TF](https://www.tensorflow.org/versions/r0.10/get_started/os_setup.html#pip-installation)\nis r0.10, which contains most of TF-Slim but not some later additions. 
To obtain the\nlatest version, you must install the most recent nightly build of\nTensorFlow. You can find the latest nightly binaries at\n[TensorFlow Installation](https://github.com/tensorflow/tensorflow#installation)\nin the section that reads \"People who are a little more adventurous can\nalso try our nightly binaries\". Copy the link address that corresponds to\nthe appropriate machine architecture and python version, and pip install\nit. For example:\n\n```shell\nexport TF_BINARY_URL=https://ci.tensorflow.org/view/Nightly/job/nightly-matrix-cpu/TF_BUILD_CONTAINER_TYPE=CPU,TF_BUILD_IS_OPT=OPT,TF_BUILD_IS_PIP=PIP,TF_BUILD_PYTHON_VERSION=PYTHON2,label=cpu-slave/lastSuccessfulBuild/artifact/pip_test/whl/tensorflow-0.10.0rc0-cp27-none-linux_x86_64.whl\nsudo pip install --upgrade $TF_BINARY_URL\n```\n\nTo test that this has worked, execute the following command; it should run\nwithout raising any errors.\n\n```\npython -c \"import tensorflow.contrib.slim as slim; eval = slim.evaluation.evaluate_once\"\n```\n\n## Installing the TF-slim image models library\n\nTo use TF-Slim for image classification, you also have to install\nthe [TF-Slim image models library](https://github.com/tensorflow/models/tree/master/slim),\nwhich is not part of the core TF library.\nTo do this, check out the\n[tensorflow/models](https://github.com/tensorflow/models/) repository as follows:\n\n```bash\ncd $HOME/workspace\ngit clone https://github.com/tensorflow/models/\n```\n\nThis will put the TF-Slim image models library in `$HOME/workspace/models/slim`.\n(It will also create a directory called\n[models/inception](https://github.com/tensorflow/models/tree/master/inception),\nwhich contains an older version of slim; you can safely ignore this.)\n\nTo verify that this has worked, execute the following commands; they should run\nwithout raising any errors.\n\n```\ncd $HOME/workspace/models/slim\npython -c \"from nets import cifarnet; mynet = cifarnet.cifarnet\"\n```\n\n\n# Preparing the datasets\n<a 
id='Data'></a>\n\nAs part of this library, we've included scripts to download several popular\nimage datasets (listed below) and convert them to slim format.\n\nDataset | Training Set Size | Testing Set Size | Number of Classes | Comments\n:------:|:---------------:|:---------------------:|:-----------:|:-----------:\nFlowers|2500 | 2500 | 5 | Various sizes (source: Flickr)\n[Cifar10](https://www.cs.toronto.edu/~kriz/cifar.html) | 60k| 10k | 10 |32x32 color\n[MNIST](http://yann.lecun.com/exdb/mnist/)| 60k | 10k | 10 | 28x28 gray\n[ImageNet](http://www.image-net.org/challenges/LSVRC/2012/)|1.2M| 50k | 1000 | Various sizes\n\n## Downloading and converting to TFRecord format\n\nFor each dataset, we'll need to download the raw data and convert it to\nTensorFlow's native\n[TFRecord](https://www.tensorflow.org/versions/r0.10/api_docs/python/python_io.html#tfrecords-format-details)\nformat. Each TFRecord contains a\n[TF-Example](https://github.com/tensorflow/tensorflow/blob/r0.10/tensorflow/core/example/example.proto)\nprotocol buffer. 
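Each TF-Example is just a keyed map of named features. The conversion scripts in this library assemble one per image with a helper like `image_to_tfexample` in `datasets/dataset_utils.py`; a self-contained sketch of that helper (the image bytes below are placeholder values):

```python
import tensorflow as tf

def image_to_tfexample(image_data, image_format, height, width, class_id):
    """Wraps one encoded image and its label into a TF-Example proto."""
    def _bytes(value):
        return tf.train.Feature(bytes_list=tf.train.BytesList(value=[value]))
    def _int64(value):
        return tf.train.Feature(int64_list=tf.train.Int64List(value=[value]))
    return tf.train.Example(features=tf.train.Features(feature={
        'image/encoded': _bytes(image_data),
        'image/format': _bytes(image_format),
        'image/class/label': _int64(class_id),
        'image/height': _int64(height),
        'image/width': _int64(width),
    }))

# Serialized examples are what get written into each TFRecord shard.
example = image_to_tfexample(b'...jpeg bytes...', b'jpg', 32, 32, 7)
serialized = example.SerializeToString()
```

The feature keys above (`image/encoded`, `image/class/label`, etc.) are the same ones the dataset descriptors later use to decode each record.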
Below we demonstrate how to do this for the Flowers dataset.\n\n```shell\n$ DATA_DIR=/tmp/data/flowers\n$ python download_and_convert_data.py \\\n    --dataset_name=flowers \\\n    --dataset_dir=\"${DATA_DIR}\"\n```\n\nWhen the script finishes you will find several TFRecord files created:\n\n```shell\n$ ls ${DATA_DIR}\nflowers_train-00000-of-00005.tfrecord\n...\nflowers_train-00004-of-00005.tfrecord\nflowers_validation-00000-of-00005.tfrecord\n...\nflowers_validation-00004-of-00005.tfrecord\nlabels.txt\n```\n\nThese represent the training and validation data, sharded over 5 files each.\nYou will also find the `$DATA_DIR/labels.txt` file which contains the mapping\nfrom integer labels to class names.\n\nYou can use the same script to create the mnist and cifar10 datasets.\nHowever, for ImageNet, you have to follow the instructions\n[here](https://github.com/tensorflow/models/blob/master/inception/README.md#getting-started).\nNote that you first have to sign up for an account at image-net.org.\nAlso, the download can take several hours, and uses about 500MB.\n\n\n## Creating a TF-Slim Dataset Descriptor.\n\nOnce the TFRecord files have been created, you can easily define a Slim\n[Dataset](https://github.com/tensorflow/tensorflow/blob/r0.10/tensorflow/contrib/slim/python/slim/data/dataset.py),\nwhich stores pointers to the data file, as well as various other pieces of\nmetadata, such as the class labels, the train/test split, and how to parse the\nTFExample protos. 
We have included the TF-Slim Dataset descriptors\nfor\n[Cifar10](https://github.com/tensorflow/models/blob/master/slim/datasets/cifar10.py),\n[ImageNet](https://github.com/tensorflow/models/blob/master/slim/datasets/imagenet.py),\n[Flowers](https://github.com/tensorflow/models/blob/master/slim/datasets/flowers.py),\nand\n[MNIST](https://github.com/tensorflow/models/blob/master/slim/datasets/mnist.py).\nAn example of how to load data using a TF-Slim dataset descriptor using a\nTF-Slim\n[DatasetDataProvider](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/slim/python/slim/data/dataset_data_provider.py)\nis found below:\n\n```python\nimport tensorflow as tf\nfrom datasets import flowers\n\nslim = tf.contrib.slim\n\n# Selects the 'validation' dataset.\ndataset = flowers.get_split('validation', DATA_DIR)\n\n# Creates a TF-Slim DataProvider which reads the dataset in the background\n# during both training and testing.\nprovider = slim.dataset_data_provider.DatasetDataProvider(dataset)\n[image, label] = provider.get(['image', 'label'])\n```\n\n\n# Pre-trained Models\n<a id='Pretrained'></a>\n\nNeural nets work best when they have many parameters, making them powerful\nfunction approximators.\nHowever, this  means they must be trained on very large datasets. Because\ntraining models from scratch can be a very computationally intensive process\nrequiring days or even weeks, we provide various pre-trained models,\nas listed below. 
These CNNs have been trained on the\n[ILSVRC-2012-CLS](http://www.image-net.org/challenges/LSVRC/2012/)\nimage classification dataset.\n\nIn the table below, we list each model, the corresponding\nTensorFlow model file, the link to the model checkpoint, and the top 1 and top 5\naccuracy (on the imagenet test set).\nNote that the VGG and ResNet parameters have been converted from their original\ncaffe formats\n([here](https://github.com/BVLC/caffe/wiki/Model-Zoo#models-used-by-the-vgg-team-in-ilsvrc-2014)\nand\n[here](https://github.com/KaimingHe/deep-residual-networks)),\nwhereas the Inception parameters have been trained internally at\nGoogle. Also be aware that these accuracies were computed by evaluating using a\nsingle image crop. Some academic papers report higher accuracy by using multiple\ncrops at multiple scales.\n\nModel | TF-Slim File | Checkpoint | Top-1 Accuracy| Top-5 Accuracy |\n:----:|:------------:|:----------:|:-------:|:--------:|\n[Inception V1](http://arxiv.org/abs/1409.4842v1)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/inception_v1.py)|[inception_v1_2016_08_28.tar.gz](http://download.tensorflow.org/models/inception_v1_2016_08_28.tar.gz)|69.8|89.6|\n[Inception V2](http://arxiv.org/abs/1502.03167)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/inception_v2.py)|[inception_v2_2016_08_28.tar.gz](http://download.tensorflow.org/models/inception_v2_2016_08_28.tar.gz)|73.9|91.8|\n[Inception V3](http://arxiv.org/abs/1512.00567)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/inception_v3.py)|[inception_v3_2016_08_28.tar.gz](http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz)|78.0|93.9|\n[Inception 
V4](http://arxiv.org/abs/1602.07261)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/inception_v4.py)|[inception_v4_2016_09_09.tar.gz](http://download.tensorflow.org/models/inception_v4_2016_09_09.tar.gz)|80.2|95.2|\n[Inception-ResNet-v2](http://arxiv.org/abs/1602.07261)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/inception_resnet_v2.py)|[inception_resnet_v2.tar.gz](http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz)|80.4|95.3|\n[ResNet 50](https://arxiv.org/abs/1512.03385)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/resnet_v1.py)|[resnet_v1_50.tar.gz](http://download.tensorflow.org/models/resnet_v1_50_2016_08_28.tar.gz)|75.2|92.2|\n[ResNet 101](https://arxiv.org/abs/1512.03385)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/resnet_v1.py)|[resnet_v1_101.tar.gz](http://download.tensorflow.org/models/resnet_v1_101_2016_08_28.tar.gz)|76.4|92.9|\n[ResNet 152](https://arxiv.org/abs/1512.03385)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/resnet_v1.py)|[resnet_v1_152.tar.gz](http://download.tensorflow.org/models/resnet_v1_152_2016_08_28.tar.gz)|76.8|93.2|\n[VGG 16](http://arxiv.org/abs/1409.1556.pdf)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/vgg.py)|[vgg_16.tar.gz](http://download.tensorflow.org/models/vgg_16_2016_08_28.tar.gz)|71.5|89.8|\n[VGG 19](http://arxiv.org/abs/1409.1556.pdf)|[Code](https://github.com/tensorflow/models/blob/master/slim/nets/vgg.py)|[vgg_19.tar.gz](http://download.tensorflow.org/models/vgg_19_2016_08_28.tar.gz)|71.1|89.8|\n\n\nHere is an example of how to download the Inception V3 checkpoint:\n\n```shell\n$ CHECKPOINT_DIR=/tmp/checkpoints\n$ mkdir ${CHECKPOINT_DIR}\n$ wget http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz\n$ tar -xvf inception_v3_2016_08_28.tar.gz\n$ mv inception_v3.ckpt ${CHECKPOINT_DIR}\n$ rm inception_v3_2016_08_28.tar.gz\n```\n\n\n\n# Training a model from 
scratch.\n<a id='Training'></a>\n\nWe provide an easy way to train a model from scratch using any TF-Slim dataset.\nThe following example demonstrates how to train Inception V3 using the default\nparameters on the ImageNet dataset.\n\n```shell\nDATASET_DIR=/tmp/imagenet\nTRAIN_DIR=/tmp/train_logs\npython train_image_classifier.py \\\n    --train_dir=${TRAIN_DIR} \\\n    --dataset_name=imagenet \\\n    --dataset_split_name=train \\\n    --dataset_dir=${DATASET_DIR} \\\n    --model_name=inception_v3\n```\n\nThis process may take several days, depending on your hardware setup.\nFor convenience, we provide a way to train a model on multiple GPUs,\nand/or multiple CPUs, either synchronously or asynchronously.\nSee [model_deploy](https://github.com/tensorflow/models/blob/master/slim/deployment/model_deploy.py)\nfor details.\n\n\n# Fine-tuning a model from an existing checkpoint\n<a id='Tuning'></a>\n\nRather than training from scratch, we'll often want to start from a pre-trained\nmodel and fine-tune it.\nTo indicate a checkpoint from which to fine-tune, we'll call training with\nthe `--checkpoint_path` flag and assign it an absolute path to a checkpoint\nfile.\n\nWhen fine-tuning a model, we need to be careful about restoring checkpoint\nweights. In particular, when we fine-tune a model on a new task with a different\nnumber of output labels, we won't be able to restore the final logits\n(classifier) layer. For this, we'll use the `--checkpoint_exclude_scopes` flag.\nThis flag prevents certain variables from being loaded. When fine-tuning on a\nclassification task using a different number of classes than the trained model,\nthe new model will have a final 'logits' layer whose dimensions differ from the\npre-trained model. For example, if fine-tuning an ImageNet-trained model on\nFlowers, the pre-trained logits layer will have dimensions `[2048 x 1001]` but\nour new logits layer will have dimensions `[2048 x 5]`. Consequently, this\nflag instructs TF-Slim not to load these weights from the checkpoint.\n\nKeep in mind that warm-starting from a checkpoint affects the model's weights\nonly during the initialization of the model. Once a model has started training,\na new checkpoint will be created in `${TRAIN_DIR}`. If the fine-tuning\ntraining is stopped and restarted, this new checkpoint will be the one from\nwhich weights are restored, not the `${checkpoint_path}`. Consequently,\nthe flags `--checkpoint_path` and `--checkpoint_exclude_scopes` are only used\nduring the `0`-th global step (model initialization). Typically, for\nfine-tuning we only want to train a subset of layers, so the\n`--trainable_scopes` flag specifies which layers should be trained; the rest\nremain frozen.\n\nBelow we give an example of\n[fine-tuning inception-v3 on flowers](https://github.com/tensorflow/models/blob/master/slim/scripts/finetune_inception_v3_on_flowers.sh).\nInception V3 was trained on ImageNet with 1000 class labels, but the flowers\ndataset has only 5 classes. 
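Under the hood, both scope flags reduce to a prefix match on variable names. A minimal pure-Python sketch of that filtering logic (the variable names below are illustrative, not a complete model):

```python
def filter_by_scopes(variable_names, exclude_scopes):
    """Mimics --checkpoint_exclude_scopes: drop any variable whose name
    begins with one of the comma-separated excluded scopes."""
    exclusions = [scope.strip() for scope in exclude_scopes.split(',')]
    return [name for name in variable_names
            if not any(name.startswith(ex) for ex in exclusions)]

# Example: restore everything except the classifier layers.
names = ['InceptionV3/Conv2d_1a_3x3/weights',
         'InceptionV3/Logits/Conv2d_1c_1x1/weights']
restored = filter_by_scopes(names, 'InceptionV3/Logits,InceptionV3/AuxLogits')
# restored keeps only the Conv2d_1a_3x3 weights.
```

`--trainable_scopes` works the other way around: only variables whose names match one of the listed prefixes are passed to the optimizer.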
Since the dataset is quite small, we will only train\nthe new layers.\n\n\n```shell\n$ DATASET_DIR=/tmp/flowers\n$ TRAIN_DIR=/tmp/flowers-models/inception_v3\n$ CHECKPOINT_PATH=/tmp/my_checkpoints/inception_v3.ckpt\n$ python train_image_classifier.py \\\n    --train_dir=${TRAIN_DIR} \\\n    --dataset_dir=${DATASET_DIR} \\\n    --dataset_name=flowers \\\n    --dataset_split_name=train \\\n    --model_name=inception_v3 \\\n    --checkpoint_path=${CHECKPOINT_PATH} \\\n    --checkpoint_exclude_scopes=InceptionV3/Logits,InceptionV3/AuxLogits/Logits \\\n    --trainable_scopes=InceptionV3/Logits,InceptionV3/AuxLogits/Logits\n```\n\n\n\n# Evaluating performance of a model\n<a id='Eval'></a>\n\nTo evaluate the performance of a model (whether pretrained or your own),\nyou can use the eval_image_classifier.py script, as shown below.\n\nBelow we give an example of downloading the pretrained inception model and\nevaluating it on the imagenet dataset.\n\n```shell\n$ CHECKPOINT_FILE=${CHECKPOINT_DIR}/inception_v3.ckpt  # Example\n$ python eval_image_classifier.py \\\n    --alsologtostderr \\\n    --checkpoint_path=${CHECKPOINT_FILE} \\\n    --dataset_dir=${DATASET_DIR} \\\n    --dataset_name=imagenet \\\n    --dataset_split_name=validation \\\n    --model_name=inception_v3\n```\n\n\n\n# Troubleshooting\n\n#### The model runs out of CPU memory.\n\nSee\n[Model Runs out of CPU memory](https://github.com/tensorflow/models/tree/master/inception#the-model-runs-out-of-cpu-memory).\n\n#### The model runs out of GPU memory.\n\nSee\n[Adjusting Memory Demands](https://github.com/tensorflow/models/tree/master/inception#adjusting-memory-demands).\n\n#### The model training results in NaNs.\n\nSee\n[Model Resulting in NaNs](https://github.com/tensorflow/models/tree/master/inception#the-model-training-results-in-nans).\n\n#### The ResNet and VGG Models have 1000 classes but the ImageNet dataset has 1001\n\nThe ImageNet dataset provided has an empty background class, which can be used\nto fine-tune the model to other tasks. If you try training or fine-tuning the\nVGG or ResNet models using the ImageNet dataset, you might encounter the\nfollowing error:\n\n```bash\nInvalidArgumentError: Assign requires shapes of both tensors to match. lhs shape= [1001] rhs shape= [1000]\n```\nThis is because the VGG and ResNet final layers have only 1000\noutputs rather than 1001.\n\nTo fix this issue, you can set the `--labels_offset=1` flag. This results in\nthe ImageNet labels being shifted down by one.\n\n\n#### I wish to train a model with a different image size.\n\nThe preprocessing functions all take `height` and `width` as parameters. You\ncan change the default values using the following snippet:\n\n```python\nimage_preprocessing_fn = preprocessing_factory.get_preprocessing(\n    preprocessing_name,\n    height=MY_NEW_HEIGHT,\n    width=MY_NEW_WIDTH,\n    is_training=True)\n```\n\n#### What hardware specification are these hyper-parameters targeted for?\n\nSee\n[Hardware Specifications](https://github.com/tensorflow/models/tree/master/inception#what-hardware-specification-are-these-hyper-parameters-targeted-for).\n\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/__init__.py",
    "content": "\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/cifar10.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides data for the Cifar10 dataset.\n\nThe dataset scripts used to create the dataset can be found at:\ntensorflow/models/slim/data/create_cifar10_dataset.py\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\nslim = tf.contrib.slim\n\n_FILE_PATTERN = 'cifar10_%s.tfrecord'\n\nSPLITS_TO_SIZES = {'train': 50000, 'test': 10000}\n\n_NUM_CLASSES = 10\n\n_ITEMS_TO_DESCRIPTIONS = {\n    'image': 'A [32 x 32 x 3] color image.',\n    'label': 'A single integer between 0 and 9',\n}\n\n\ndef get_split(split_name, dataset_dir, file_pattern=None, reader=None):\n  \"\"\"Gets a dataset tuple with instructions for reading cifar10.\n\n  Args:\n    split_name: A train/test split name.\n    dataset_dir: The base directory of the dataset sources.\n    file_pattern: The file pattern to use when matching the dataset sources.\n      It is assumed that the pattern contains a '%s' string so that the split\n      name can be inserted.\n    reader: The TensorFlow reader type.\n\n  Returns:\n    A `Dataset` namedtuple.\n\n  Raises:\n    ValueError: if `split_name` is not a valid train/test split.\n  \"\"\"\n  if split_name not in 
SPLITS_TO_SIZES:\n    raise ValueError('split name %s was not recognized.' % split_name)\n\n  if not file_pattern:\n    file_pattern = _FILE_PATTERN\n  file_pattern = os.path.join(dataset_dir, file_pattern % split_name)\n\n  # Allowing None in the signature so that dataset_factory can use the default.\n  if not reader:\n    reader = tf.TFRecordReader\n\n  keys_to_features = {\n      'image/encoded': tf.FixedLenFeature((), tf.string, default_value=''),\n      'image/format': tf.FixedLenFeature((), tf.string, default_value='png'),\n      'image/class/label': tf.FixedLenFeature(\n          [], tf.int64, default_value=tf.zeros([], dtype=tf.int64)),\n  }\n\n  items_to_handlers = {\n      'image': slim.tfexample_decoder.Image(shape=[32, 32, 3]),\n      'label': slim.tfexample_decoder.Tensor('image/class/label'),\n  }\n\n  decoder = slim.tfexample_decoder.TFExampleDecoder(\n      keys_to_features, items_to_handlers)\n\n  labels_to_names = None\n  if dataset_utils.has_labels(dataset_dir):\n    labels_to_names = dataset_utils.read_label_file(dataset_dir)\n\n  return slim.dataset.Dataset(\n      data_sources=file_pattern,\n      reader=reader,\n      decoder=decoder,\n      num_samples=SPLITS_TO_SIZES[split_name],\n      items_to_descriptions=_ITEMS_TO_DESCRIPTIONS,\n      num_classes=_NUM_CLASSES,\n      labels_to_names=labels_to_names)\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/dataset_factory.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A factory-pattern class which returns classification image/label pairs.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nfrom datasets import cifar10\nfrom datasets import flowers\nfrom datasets import imagenet\nfrom datasets import mnist\n\ndatasets_map = {\n    'cifar10': cifar10,\n    'flowers': flowers,\n    'imagenet': imagenet,\n    'mnist': mnist,\n}\n\n\ndef get_dataset(name, split_name, dataset_dir, file_pattern=None, reader=None):\n  \"\"\"Given a dataset name and a split_name returns a Dataset.\n\n  Args:\n    name: String, the name of the dataset.\n    split_name: A train/test split name.\n    dataset_dir: The directory where the dataset files are stored.\n    file_pattern: The file pattern to use for matching the dataset source files.\n    reader: The subclass of tf.ReaderBase. 
If left as `None`, then the default\n      reader defined by each dataset is used.\n\n  Returns:\n    A `Dataset` class.\n\n  Raises:\n    ValueError: If the dataset `name` is unknown.\n  \"\"\"\n  if name not in datasets_map:\n    raise ValueError('Name of dataset unknown %s' % name)\n  return datasets_map[name].get_split(\n      split_name,\n      dataset_dir,\n      file_pattern,\n      reader)\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/dataset_utils.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains utilities for downloading and converting datasets.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport sys\nimport tarfile\n\nfrom six.moves import urllib\nimport tensorflow as tf\n\nLABELS_FILENAME = 'labels.txt'\n\n\ndef int64_feature(values):\n  \"\"\"Returns a TF-Feature of int64s.\n\n  Args:\n    values: A scalar or list of values.\n\n  Returns:\n    a TF-Feature.\n  \"\"\"\n  if not isinstance(values, (tuple, list)):\n    values = [values]\n  return tf.train.Feature(int64_list=tf.train.Int64List(value=values))\n\n\ndef bytes_feature(values):\n  \"\"\"Returns a TF-Feature of bytes.\n\n  Args:\n    values: A string.\n\n  Returns:\n    a TF-Feature.\n  \"\"\"\n  return tf.train.Feature(bytes_list=tf.train.BytesList(value=[values]))\n\n\ndef image_to_tfexample(image_data, image_format, height, width, class_id):\n  return tf.train.Example(features=tf.train.Features(feature={\n      'image/encoded': bytes_feature(image_data),\n      'image/format': bytes_feature(image_format),\n      'image/class/label': int64_feature(class_id),\n      'image/height': int64_feature(height),\n      'image/width': int64_feature(width),\n  }))\n\n\ndef 
download_and_uncompress_tarball(tarball_url, dataset_dir):\n  \"\"\"Downloads the `tarball_url` and uncompresses it locally.\n\n  Args:\n    tarball_url: The URL of a tarball file.\n    dataset_dir: The directory where the temporary files are stored.\n  \"\"\"\n  filename = tarball_url.split('/')[-1]\n  filepath = os.path.join(dataset_dir, filename)\n\n  def _progress(count, block_size, total_size):\n    sys.stdout.write('\\r>> Downloading %s %.1f%%' % (\n        filename, float(count * block_size) / float(total_size) * 100.0))\n    sys.stdout.flush()\n  filepath, _ = urllib.request.urlretrieve(tarball_url, filepath, _progress)\n  print()\n  statinfo = os.stat(filepath)\n  print('Successfully downloaded', filename, statinfo.st_size, 'bytes.')\n  tarfile.open(filepath, 'r:gz').extractall(dataset_dir)\n\n\ndef write_label_file(labels_to_class_names, dataset_dir,\n                     filename=LABELS_FILENAME):\n  \"\"\"Writes a file with the list of class names.\n\n  Args:\n    labels_to_class_names: A map of (integer) labels to class names.\n    dataset_dir: The directory in which the labels file should be written.\n    filename: The filename where the class names are written.\n  \"\"\"\n  labels_filename = os.path.join(dataset_dir, filename)\n  with tf.gfile.Open(labels_filename, 'w') as f:\n    for label in labels_to_class_names:\n      class_name = labels_to_class_names[label]\n      f.write('%d:%s\\n' % (label, class_name))\n\n\ndef has_labels(dataset_dir, filename=LABELS_FILENAME):\n  \"\"\"Specifies whether or not the dataset directory contains a label map file.\n\n  Args:\n    dataset_dir: The directory in which the labels file is found.\n    filename: The filename where the class names are written.\n\n  Returns:\n    `True` if the labels file exists and `False` otherwise.\n  \"\"\"\n  return tf.gfile.Exists(os.path.join(dataset_dir, filename))\n\n\ndef read_label_file(dataset_dir, filename=LABELS_FILENAME):\n  \"\"\"Reads the labels file and returns a 
mapping from ID to class name.\n\n  Args:\n    dataset_dir: The directory in which the labels file is found.\n    filename: The filename where the class names are written.\n\n  Returns:\n    A map from a label (integer) to class name.\n  \"\"\"\n  labels_filename = os.path.join(dataset_dir, filename)\n  with tf.gfile.Open(labels_filename, 'r') as f:\n    lines = f.read().decode()\n  lines = lines.split('\\n')\n  lines = filter(None, lines)\n\n  labels_to_class_names = {}\n  for line in lines:\n    index = line.index(':')\n    labels_to_class_names[int(line[:index])] = line[index+1:]\n  return labels_to_class_names\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/download_and_convert_cifar10.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Downloads and converts cifar10 data to TFRecords of TF-Example protos.\n\nThis module downloads the cifar10 data, uncompresses it, reads the files\nthat make up the cifar10 data and creates two TFRecord datasets: one for train\nand one for test. Each TFRecord dataset is comprised of a set of TF-Example\nprotocol buffers, each of which contain a single image and label.\n\nThe script should take several minutes to run.\n\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport cPickle\nimport os\nimport sys\nimport tarfile\n\nimport numpy as np\nfrom six.moves import urllib\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\n# The URL where the CIFAR data can be downloaded.\n_DATA_URL = 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz'\n\n# The number of training files.\n_NUM_TRAIN_FILES = 5\n\n# The height and width of each image.\n_IMAGE_SIZE = 32\n\n# The names of the classes.\n_CLASS_NAMES = [\n    'airplane',\n    'automobile',\n    'bird',\n    'cat',\n    'deer',\n    'dog',\n    'frog',\n    'horse',\n    'ship',\n    'truck',\n]\n\n\ndef _add_to_tfrecord(filename, tfrecord_writer, offset=0):\n  \"\"\"Loads data from the cifar10 pickle files and 
writes files to a TFRecord.\n\n  Args:\n    filename: The filename of the cifar10 pickle file.\n    tfrecord_writer: The TFRecord writer to use for writing.\n    offset: An offset into the absolute number of images previously written.\n\n  Returns:\n    The new offset.\n  \"\"\"\n  with tf.gfile.Open(filename, 'r') as f:\n    data = cPickle.load(f)\n\n  images = data['data']\n  num_images = images.shape[0]\n\n  images = images.reshape((num_images, 3, 32, 32))\n  labels = data['labels']\n\n  with tf.Graph().as_default():\n    image_placeholder = tf.placeholder(dtype=tf.uint8)\n    encoded_image = tf.image.encode_png(image_placeholder)\n\n    with tf.Session('') as sess:\n\n      for j in range(num_images):\n        sys.stdout.write('\\r>> Reading file [%s] image %d/%d' % (\n            filename, offset + j + 1, offset + num_images))\n        sys.stdout.flush()\n\n        image = np.squeeze(images[j]).transpose((1, 2, 0))\n        label = labels[j]\n\n        png_string = sess.run(encoded_image,\n                              feed_dict={image_placeholder: image})\n\n        example = dataset_utils.image_to_tfexample(\n            png_string, 'png', _IMAGE_SIZE, _IMAGE_SIZE, label)\n        tfrecord_writer.write(example.SerializeToString())\n\n  return offset + num_images\n\n\ndef _get_output_filename(dataset_dir, split_name):\n  \"\"\"Creates the output filename.\n\n  Args:\n    dataset_dir: The dataset directory where the dataset is stored.\n    split_name: The name of the train/test split.\n\n  Returns:\n    An absolute file path.\n  \"\"\"\n  return '%s/cifar10_%s.tfrecord' % (dataset_dir, split_name)\n\n\ndef _download_and_uncompress_dataset(dataset_dir):\n  \"\"\"Downloads cifar10 and uncompresses it locally.\n\n  Args:\n    dataset_dir: The directory where the temporary files are stored.\n  \"\"\"\n  filename = _DATA_URL.split('/')[-1]\n  filepath = os.path.join(dataset_dir, filename)\n\n  if not os.path.exists(filepath):\n    def _progress(count, block_size, 
total_size):\n      sys.stdout.write('\\r>> Downloading %s %.1f%%' % (\n          filename, float(count * block_size) / float(total_size) * 100.0))\n      sys.stdout.flush()\n    filepath, _ = urllib.request.urlretrieve(_DATA_URL, filepath, _progress)\n    print()\n    statinfo = os.stat(filepath)\n    print('Successfully downloaded', filename, statinfo.st_size, 'bytes.')\n    tarfile.open(filepath, 'r:gz').extractall(dataset_dir)\n\n\ndef _clean_up_temporary_files(dataset_dir):\n  \"\"\"Removes temporary files used to create the dataset.\n\n  Args:\n    dataset_dir: The directory where the temporary files are stored.\n  \"\"\"\n  filename = _DATA_URL.split('/')[-1]\n  filepath = os.path.join(dataset_dir, filename)\n  tf.gfile.Remove(filepath)\n\n  tmp_dir = os.path.join(dataset_dir, 'cifar-10-batches-py')\n  tf.gfile.DeleteRecursively(tmp_dir)\n\n\ndef run(dataset_dir):\n  \"\"\"Runs the download and conversion operation.\n\n  Args:\n    dataset_dir: The dataset directory where the dataset is stored.\n  \"\"\"\n  if not tf.gfile.Exists(dataset_dir):\n    tf.gfile.MakeDirs(dataset_dir)\n\n  training_filename = _get_output_filename(dataset_dir, 'train')\n  testing_filename = _get_output_filename(dataset_dir, 'test')\n\n  if tf.gfile.Exists(training_filename) and tf.gfile.Exists(testing_filename):\n    print('Dataset files already exist. 
Exiting without re-creating them.')\n    return\n\n  dataset_utils.download_and_uncompress_tarball(_DATA_URL, dataset_dir)\n\n  # First, process the training data:\n  with tf.python_io.TFRecordWriter(training_filename) as tfrecord_writer:\n    offset = 0\n    for i in range(_NUM_TRAIN_FILES):\n      filename = os.path.join(dataset_dir,\n                              'cifar-10-batches-py',\n                              'data_batch_%d' % (i + 1))  # 1-indexed.\n      offset = _add_to_tfrecord(filename, tfrecord_writer, offset)\n\n  # Next, process the testing data:\n  with tf.python_io.TFRecordWriter(testing_filename) as tfrecord_writer:\n    filename = os.path.join(dataset_dir,\n                            'cifar-10-batches-py',\n                            'test_batch')\n    _add_to_tfrecord(filename, tfrecord_writer)\n\n  # Finally, write the labels file:\n  labels_to_class_names = dict(zip(range(len(_CLASS_NAMES)), _CLASS_NAMES))\n  dataset_utils.write_label_file(labels_to_class_names, dataset_dir)\n\n  _clean_up_temporary_files(dataset_dir)\n  print('\\nFinished converting the Cifar10 dataset!')\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/download_and_convert_flowers.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Downloads and converts Flowers data to TFRecords of TF-Example protos.\n\nThis module downloads the Flowers data, uncompresses it, reads the files\nthat make up the Flowers data and creates two TFRecord datasets: one for train\nand one for test. Each TFRecord dataset is comprised of a set of TF-Example\nprotocol buffers, each of which contain a single image and label.\n\nThe script should take about a minute to run.\n\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nimport os\nimport random\nimport sys\n\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\n# The URL where the Flowers data can be downloaded.\n_DATA_URL = 'http://download.tensorflow.org/example_images/flower_photos.tgz'\n\n# The number of images in the validation set.\n_NUM_VALIDATION = 350\n\n# Seed for repeatability.\n_RANDOM_SEED = 0\n\n# The number of shards per dataset split.\n_NUM_SHARDS = 5\n\n\nclass ImageReader(object):\n  \"\"\"Helper class that provides TensorFlow image coding utilities.\"\"\"\n\n  def __init__(self):\n    # Initializes function that decodes RGB JPEG data.\n    self._decode_jpeg_data = tf.placeholder(dtype=tf.string)\n    self._decode_jpeg = 
tf.image.decode_jpeg(self._decode_jpeg_data, channels=3)\n\n  def read_image_dims(self, sess, image_data):\n    image = self.decode_jpeg(sess, image_data)\n    return image.shape[0], image.shape[1]\n\n  def decode_jpeg(self, sess, image_data):\n    image = sess.run(self._decode_jpeg,\n                     feed_dict={self._decode_jpeg_data: image_data})\n    assert len(image.shape) == 3\n    assert image.shape[2] == 3\n    return image\n\n\ndef _get_filenames_and_classes(dataset_dir):\n  \"\"\"Returns a list of filenames and inferred class names.\n\n  Args:\n    dataset_dir: A directory containing a set of subdirectories representing\n      class names. Each subdirectory should contain PNG or JPG encoded images.\n\n  Returns:\n    A list of image file paths, relative to `dataset_dir` and the list of\n    subdirectories, representing class names.\n  \"\"\"\n  flower_root = os.path.join(dataset_dir, 'flower_photos')\n  directories = []\n  class_names = []\n  for filename in os.listdir(flower_root):\n    path = os.path.join(flower_root, filename)\n    if os.path.isdir(path):\n      directories.append(path)\n      class_names.append(filename)\n\n  photo_filenames = []\n  for directory in directories:\n    for filename in os.listdir(directory):\n      path = os.path.join(directory, filename)\n      photo_filenames.append(path)\n\n  return photo_filenames, sorted(class_names)\n\n\ndef _get_dataset_filename(dataset_dir, split_name, shard_id):\n  output_filename = 'flowers_%s_%05d-of-%05d.tfrecord' % (\n      split_name, shard_id, _NUM_SHARDS)\n  return os.path.join(dataset_dir, output_filename)\n\n\ndef _convert_dataset(split_name, filenames, class_names_to_ids, dataset_dir):\n  \"\"\"Converts the given filenames to a TFRecord dataset.\n\n  Args:\n    split_name: The name of the dataset, either 'train' or 'validation'.\n    filenames: A list of absolute paths to png or jpg images.\n    class_names_to_ids: A dictionary from class names (strings) to ids\n      (integers).\n  
  dataset_dir: The directory where the converted datasets are stored.\n  \"\"\"\n  assert split_name in ['train', 'validation']\n\n  num_per_shard = int(math.ceil(len(filenames) / float(_NUM_SHARDS)))\n\n  with tf.Graph().as_default():\n    image_reader = ImageReader()\n\n    with tf.Session('') as sess:\n\n      for shard_id in range(_NUM_SHARDS):\n        output_filename = _get_dataset_filename(\n            dataset_dir, split_name, shard_id)\n\n        with tf.python_io.TFRecordWriter(output_filename) as tfrecord_writer:\n          start_ndx = shard_id * num_per_shard\n          end_ndx = min((shard_id+1) * num_per_shard, len(filenames))\n          for i in range(start_ndx, end_ndx):\n            sys.stdout.write('\\r>> Converting image %d/%d shard %d' % (\n                i+1, len(filenames), shard_id))\n            sys.stdout.flush()\n\n            # Read the image file in binary mode; JPEG data is raw bytes:\n            image_data = tf.gfile.FastGFile(filenames[i], 'rb').read()\n            height, width = image_reader.read_image_dims(sess, image_data)\n\n            class_name = os.path.basename(os.path.dirname(filenames[i]))\n            class_id = class_names_to_ids[class_name]\n\n            example = dataset_utils.image_to_tfexample(\n                image_data, 'jpg', height, width, class_id)\n            tfrecord_writer.write(example.SerializeToString())\n\n  sys.stdout.write('\\n')\n  sys.stdout.flush()\n\n\ndef _clean_up_temporary_files(dataset_dir):\n  \"\"\"Removes temporary files used to create the dataset.\n\n  Args:\n    dataset_dir: The directory where the temporary files are stored.\n  \"\"\"\n  filename = _DATA_URL.split('/')[-1]\n  filepath = os.path.join(dataset_dir, filename)\n  tf.gfile.Remove(filepath)\n\n  tmp_dir = os.path.join(dataset_dir, 'flower_photos')\n  tf.gfile.DeleteRecursively(tmp_dir)\n\n\ndef _dataset_exists(dataset_dir):\n  for split_name in ['train', 'validation']:\n    for shard_id in range(_NUM_SHARDS):\n      output_filename = _get_dataset_filename(\n         
 dataset_dir, split_name, shard_id)\n      if not tf.gfile.Exists(output_filename):\n        return False\n  return True\n\n\ndef run(dataset_dir):\n  \"\"\"Runs the download and conversion operation.\n\n  Args:\n    dataset_dir: The dataset directory where the dataset is stored.\n  \"\"\"\n  if not tf.gfile.Exists(dataset_dir):\n    tf.gfile.MakeDirs(dataset_dir)\n\n  if _dataset_exists(dataset_dir):\n    print('Dataset files already exist. Exiting without re-creating them.')\n    return\n\n  dataset_utils.download_and_uncompress_tarball(_DATA_URL, dataset_dir)\n  photo_filenames, class_names = _get_filenames_and_classes(dataset_dir)\n  class_names_to_ids = dict(zip(class_names, range(len(class_names))))\n\n  # Divide into train and test:\n  random.seed(_RANDOM_SEED)\n  random.shuffle(photo_filenames)\n  training_filenames = photo_filenames[_NUM_VALIDATION:]\n  validation_filenames = photo_filenames[:_NUM_VALIDATION]\n\n  # First, convert the training and validation sets.\n  _convert_dataset('train', training_filenames, class_names_to_ids,\n                   dataset_dir)\n  _convert_dataset('validation', validation_filenames, class_names_to_ids,\n                   dataset_dir)\n\n  # Finally, write the labels file:\n  labels_to_class_names = dict(zip(range(len(class_names)), class_names))\n  dataset_utils.write_label_file(labels_to_class_names, dataset_dir)\n\n  _clean_up_temporary_files(dataset_dir)\n  print('\\nFinished converting the Flowers dataset!')\n\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/download_and_convert_mnist.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Downloads and converts MNIST data to TFRecords of TF-Example protos.\n\nThis module downloads the MNIST data, uncompresses it, reads the files\nthat make up the MNIST data and creates two TFRecord datasets: one for train\nand one for test. Each TFRecord dataset is comprised of a set of TF-Example\nprotocol buffers, each of which contain a single image and label.\n\nThe script should take about a minute to run.\n\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport gzip\nimport os\nimport sys\n\nimport numpy as np\nfrom six.moves import urllib\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\n# The URLs where the MNIST data can be downloaded.\n_DATA_URL = 'http://yann.lecun.com/exdb/mnist/'\n_TRAIN_DATA_FILENAME = 'train-images-idx3-ubyte.gz'\n_TRAIN_LABELS_FILENAME = 'train-labels-idx1-ubyte.gz'\n_TEST_DATA_FILENAME = 't10k-images-idx3-ubyte.gz'\n_TEST_LABELS_FILENAME = 't10k-labels-idx1-ubyte.gz'\n\n_IMAGE_SIZE = 28\n_NUM_CHANNELS = 1\n\n# The names of the classes.\n_CLASS_NAMES = [\n    'zero',\n    'one',\n    'two',\n    'three',\n    'four',\n    'five',\n    'size',\n    'seven',\n    'eight',\n    'nine',\n]\n\n\ndef _extract_images(filename, 
num_images):\n  \"\"\"Extract the images into a numpy array.\n\n  Args:\n    filename: The path to an MNIST images file.\n    num_images: The number of images in the file.\n\n  Returns:\n    A numpy array of shape [number_of_images, height, width, channels].\n  \"\"\"\n  print('Extracting images from: ', filename)\n  with gzip.open(filename) as bytestream:\n    bytestream.read(16)\n    buf = bytestream.read(\n        _IMAGE_SIZE * _IMAGE_SIZE * num_images * _NUM_CHANNELS)\n    data = np.frombuffer(buf, dtype=np.uint8)\n    data = data.reshape(num_images, _IMAGE_SIZE, _IMAGE_SIZE, _NUM_CHANNELS)\n  return data\n\n\ndef _extract_labels(filename, num_labels):\n  \"\"\"Extract the labels into a vector of int64 label IDs.\n\n  Args:\n    filename: The path to an MNIST labels file.\n    num_labels: The number of labels in the file.\n\n  Returns:\n    A numpy array of shape [number_of_labels]\n  \"\"\"\n  print('Extracting labels from: ', filename)\n  with gzip.open(filename) as bytestream:\n    bytestream.read(8)\n    buf = bytestream.read(1 * num_labels)\n    labels = np.frombuffer(buf, dtype=np.uint8).astype(np.int64)\n  return labels\n\n\ndef _add_to_tfrecord(data_filename, labels_filename, num_images,\n                     tfrecord_writer):\n  \"\"\"Loads data from the binary MNIST files and writes files to a TFRecord.\n\n  Args:\n    data_filename: The filename of the MNIST images.\n    labels_filename: The filename of the MNIST labels.\n    num_images: The number of images in the dataset.\n    tfrecord_writer: The TFRecord writer to use for writing.\n  \"\"\"\n  images = _extract_images(data_filename, num_images)\n  labels = _extract_labels(labels_filename, num_images)\n\n  shape = (_IMAGE_SIZE, _IMAGE_SIZE, _NUM_CHANNELS)\n  with tf.Graph().as_default():\n    image = tf.placeholder(dtype=tf.uint8, shape=shape)\n    encoded_png = tf.image.encode_png(image)\n\n    with tf.Session('') as sess:\n      for j in range(num_images):\n        sys.stdout.write('\\r>> 
Converting image %d/%d' % (j + 1, num_images))\n        sys.stdout.flush()\n\n        png_string = sess.run(encoded_png, feed_dict={image: images[j]})\n\n        example = dataset_utils.image_to_tfexample(\n            png_string, 'png', _IMAGE_SIZE, _IMAGE_SIZE, labels[j])\n        tfrecord_writer.write(example.SerializeToString())\n\n\ndef _get_output_filename(dataset_dir, split_name):\n  \"\"\"Creates the output filename.\n\n  Args:\n    dataset_dir: The directory where the temporary files are stored.\n    split_name: The name of the train/test split.\n\n  Returns:\n    An absolute file path.\n  \"\"\"\n  return '%s/mnist_%s.tfrecord' % (dataset_dir, split_name)\n\n\ndef _download_dataset(dataset_dir):\n  \"\"\"Downloads MNIST locally.\n\n  Args:\n    dataset_dir: The directory where the temporary files are stored.\n  \"\"\"\n  for filename in [_TRAIN_DATA_FILENAME,\n                   _TRAIN_LABELS_FILENAME,\n                   _TEST_DATA_FILENAME,\n                   _TEST_LABELS_FILENAME]:\n    filepath = os.path.join(dataset_dir, filename)\n\n    if not os.path.exists(filepath):\n      print('Downloading file %s...' 
% filename)\n      def _progress(count, block_size, total_size):\n        sys.stdout.write('\\r>> Downloading %.1f%%' % (\n            float(count * block_size) / float(total_size) * 100.0))\n        sys.stdout.flush()\n      filepath, _ = urllib.request.urlretrieve(_DATA_URL + filename,\n                                               filepath,\n                                               _progress)\n      print()\n      with tf.gfile.GFile(filepath) as f:\n        size = f.Size()\n      print('Successfully downloaded', filename, size, 'bytes.')\n\n\ndef _clean_up_temporary_files(dataset_dir):\n  \"\"\"Removes temporary files used to create the dataset.\n\n  Args:\n    dataset_dir: The directory where the temporary files are stored.\n  \"\"\"\n  for filename in [_TRAIN_DATA_FILENAME,\n                   _TRAIN_LABELS_FILENAME,\n                   _TEST_DATA_FILENAME,\n                   _TEST_LABELS_FILENAME]:\n    filepath = os.path.join(dataset_dir, filename)\n    tf.gfile.Remove(filepath)\n\n\ndef run(dataset_dir):\n  \"\"\"Runs the download and conversion operation.\n\n  Args:\n    dataset_dir: The dataset directory where the dataset is stored.\n  \"\"\"\n  if not tf.gfile.Exists(dataset_dir):\n    tf.gfile.MakeDirs(dataset_dir)\n\n  training_filename = _get_output_filename(dataset_dir, 'train')\n  testing_filename = _get_output_filename(dataset_dir, 'test')\n\n  if tf.gfile.Exists(training_filename) and tf.gfile.Exists(testing_filename):\n    print('Dataset files already exist. 
Exiting without re-creating them.')\n    return\n\n  _download_dataset(dataset_dir)\n\n  # First, process the training data:\n  with tf.python_io.TFRecordWriter(training_filename) as tfrecord_writer:\n    data_filename = os.path.join(dataset_dir, _TRAIN_DATA_FILENAME)\n    labels_filename = os.path.join(dataset_dir, _TRAIN_LABELS_FILENAME)\n    _add_to_tfrecord(data_filename, labels_filename, 60000, tfrecord_writer)\n\n  # Next, process the testing data:\n  with tf.python_io.TFRecordWriter(testing_filename) as tfrecord_writer:\n    data_filename = os.path.join(dataset_dir, _TEST_DATA_FILENAME)\n    labels_filename = os.path.join(dataset_dir, _TEST_LABELS_FILENAME)\n    _add_to_tfrecord(data_filename, labels_filename, 10000, tfrecord_writer)\n\n  # Finally, write the labels file:\n  labels_to_class_names = dict(zip(range(len(_CLASS_NAMES)), _CLASS_NAMES))\n  dataset_utils.write_label_file(labels_to_class_names, dataset_dir)\n\n  _clean_up_temporary_files(dataset_dir)\n  print('\\nFinished converting the MNIST dataset!')\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/flowers.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides data for the Cifar10 dataset.\n\nThe dataset scripts used to create the dataset can be found at:\ntensorflow/models/slim/data/create_cifar10_dataset.py\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\nslim = tf.contrib.slim\n\n_FILE_PATTERN = 'flowers_%s_*.tfrecord'\n\nSPLITS_TO_SIZES = {'train': 3320, 'validation': 350}\n\n_NUM_CLASSES = 5\n\n_ITEMS_TO_DESCRIPTIONS = {\n    'image': 'A color image of varying size.',\n    'label': 'A single integer between 0 and 4',\n}\n\n\ndef get_split(split_name, dataset_dir, file_pattern=None, reader=None):\n  \"\"\"Gets a dataset tuple with instructions for reading cifar10.\n\n  Args:\n    split_name: A train/validation split name.\n    dataset_dir: The base directory of the dataset sources.\n    file_pattern: The file pattern to use when matching the dataset sources.\n      It is assumed that the pattern contains a '%s' string so that the split\n      name can be inserted.\n    reader: The TensorFlow reader type.\n\n  Returns:\n    A `Dataset` namedtuple.\n\n  Raises:\n    ValueError: if `split_name` is not a valid train/validation split.\n  \"\"\"\n  if 
split_name not in SPLITS_TO_SIZES:\n    raise ValueError('split name %s was not recognized.' % split_name)\n\n  if not file_pattern:\n    file_pattern = _FILE_PATTERN\n  file_pattern = os.path.join(dataset_dir, file_pattern % split_name)\n\n  # Allowing None in the signature so that dataset_factory can use the default.\n  if reader is None:\n    reader = tf.TFRecordReader\n\n  keys_to_features = {\n      'image/encoded': tf.FixedLenFeature((), tf.string, default_value=''),\n      'image/format': tf.FixedLenFeature((), tf.string, default_value='png'),\n      'image/class/label': tf.FixedLenFeature(\n          [], tf.int64, default_value=tf.zeros([], dtype=tf.int64)),\n  }\n\n  items_to_handlers = {\n      'image': slim.tfexample_decoder.Image(),\n      'label': slim.tfexample_decoder.Tensor('image/class/label'),\n  }\n\n  decoder = slim.tfexample_decoder.TFExampleDecoder(\n      keys_to_features, items_to_handlers)\n\n  labels_to_names = None\n  if dataset_utils.has_labels(dataset_dir):\n    labels_to_names = dataset_utils.read_label_file(dataset_dir)\n\n  return slim.dataset.Dataset(\n      data_sources=file_pattern,\n      reader=reader,\n      decoder=decoder,\n      num_samples=SPLITS_TO_SIZES[split_name],\n      items_to_descriptions=_ITEMS_TO_DESCRIPTIONS,\n      num_classes=_NUM_CLASSES,\n      labels_to_names=labels_to_names)\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/imagenet.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides data for the ImageNet ILSVRC 2012 Dataset plus some bounding boxes.\n\nSome images have one or more bounding boxes associated with the label of the\nimage. See details here: http://image-net.org/download-bboxes\n\nImageNet is based upon WordNet 3.0. To uniquely identify a synset, we use\n\"WordNet ID\" (wnid), which is a concatenation of POS ( i.e. part of speech )\nand SYNSET OFFSET of WordNet. 
 For more information, please refer to the\nWordNet documentation[http://wordnet.princeton.edu/wordnet/documentation/].\n\n\"There are bounding boxes for over 3000 popular synsets available.\nFor each synset, there are on average 150 images with bounding boxes.\"\n\nWARNING: Don't use for object detection; in this case all the bounding boxes\nof the image belong to just one class.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nfrom six.moves import urllib\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\nslim = tf.contrib.slim\n\n# TODO(nsilberman): Add tfrecord file type once the script is updated.\n_FILE_PATTERN = '%s-*'\n\n_SPLITS_TO_SIZES = {\n    'train': 1281167,\n    'validation': 50000,\n}\n\n_ITEMS_TO_DESCRIPTIONS = {\n    'image': 'A color image of varying height and width.',\n    'label': 'The label id of the image, integer between 0 and 999',\n    'label_text': 'The text of the label.',\n    'object/bbox': 'A list of bounding boxes.',\n    'object/label': 'A list of labels, one per object.',\n}\n\n_NUM_CLASSES = 1001\n\n\ndef create_readable_names_for_imagenet_labels():\n  \"\"\"Create a dict mapping label id to human readable string.\n\n  Returns:\n      labels_to_names: dictionary where keys are integers from 0 to 1000\n      and values are human-readable names.\n\n  We retrieve a synset file, which contains a list of valid synset labels used\n  by the ILSVRC competition. There is one synset per line, e.g.\n          #   n01440764\n          #   n01443537\n  We also retrieve a synset_to_human_file, which contains a mapping from synsets\n  to human-readable names for every synset in Imagenet. 
These are stored in a\n  tsv format, as follows:\n          #   n02119247    black fox\n          #   n02119359    silver fox\n  We assign each synset (in alphabetical order) an integer, starting from 1\n  (since 0 is reserved for the background class).\n\n  Code is based on\n  https://github.com/tensorflow/models/blob/master/inception/inception/data/build_imagenet_data.py#L463\n  \"\"\"\n\n  # pylint: disable=g-line-too-long\n  base_url = 'https://raw.githubusercontent.com/tensorflow/models/master/inception/inception/data/'\n  synset_url = '{}/imagenet_lsvrc_2015_synsets.txt'.format(base_url)\n  synset_to_human_url = '{}/imagenet_metadata.txt'.format(base_url)\n\n  filename, _ = urllib.request.urlretrieve(synset_url)\n  synset_list = [s.strip() for s in open(filename).readlines()]\n  num_synsets_in_ilsvrc = len(synset_list)\n  assert num_synsets_in_ilsvrc == 1000\n\n  filename, _ = urllib.request.urlretrieve(synset_to_human_url)\n  synset_to_human_list = open(filename).readlines()\n  num_synsets_in_all_imagenet = len(synset_to_human_list)\n  assert num_synsets_in_all_imagenet == 21842\n\n  synset_to_human = {}\n  for s in synset_to_human_list:\n    parts = s.strip().split('\\t')\n    assert len(parts) == 2\n    synset = parts[0]\n    human = parts[1]\n    synset_to_human[synset] = human\n\n  label_index = 1\n  labels_to_names = {0: 'background'}\n  for synset in synset_list:\n    name = synset_to_human[synset]\n    labels_to_names[label_index] = name\n    label_index += 1\n\n  return labels_to_names\n\n\ndef get_split(split_name, dataset_dir, file_pattern=None, reader=None):\n  \"\"\"Gets a dataset tuple with instructions for reading ImageNet.\n\n  Args:\n    split_name: A train/test split name.\n    dataset_dir: The base directory of the dataset sources.\n    file_pattern: The file pattern to use when matching the dataset sources.\n      It is assumed that the pattern contains a '%s' string so that the split\n      name can be inserted.\n    reader: The 
TensorFlow reader type.\n\n  Returns:\n    A `Dataset` namedtuple.\n\n  Raises:\n    ValueError: if `split_name` is not a valid train/test split.\n  \"\"\"\n  if split_name not in _SPLITS_TO_SIZES:\n    raise ValueError('split name %s was not recognized.' % split_name)\n\n  if not file_pattern:\n    file_pattern = _FILE_PATTERN\n  file_pattern = os.path.join(dataset_dir, file_pattern % split_name)\n\n  # Allowing None in the signature so that dataset_factory can use the default.\n  if reader is None:\n    reader = tf.TFRecordReader\n\n  keys_to_features = {\n      'image/encoded': tf.FixedLenFeature(\n          (), tf.string, default_value=''),\n      'image/format': tf.FixedLenFeature(\n          (), tf.string, default_value='jpeg'),\n      'image/class/label': tf.FixedLenFeature(\n          [], dtype=tf.int64, default_value=-1),\n      'image/class/text': tf.FixedLenFeature(\n          [], dtype=tf.string, default_value=''),\n      'image/object/bbox/xmin': tf.VarLenFeature(\n          dtype=tf.float32),\n      'image/object/bbox/ymin': tf.VarLenFeature(\n          dtype=tf.float32),\n      'image/object/bbox/xmax': tf.VarLenFeature(\n          dtype=tf.float32),\n      'image/object/bbox/ymax': tf.VarLenFeature(\n          dtype=tf.float32),\n      'image/object/class/label': tf.VarLenFeature(\n          dtype=tf.int64),\n  }\n\n  items_to_handlers = {\n      'image': slim.tfexample_decoder.Image('image/encoded', 'image/format'),\n      'label': slim.tfexample_decoder.Tensor('image/class/label'),\n      'label_text': slim.tfexample_decoder.Tensor('image/class/text'),\n      'object/bbox': slim.tfexample_decoder.BoundingBox(\n          ['ymin', 'xmin', 'ymax', 'xmax'], 'image/object/bbox/'),\n      'object/label': slim.tfexample_decoder.Tensor('image/object/class/label'),\n  }\n\n  decoder = slim.tfexample_decoder.TFExampleDecoder(\n      keys_to_features, items_to_handlers)\n\n  labels_to_names = None\n  if dataset_utils.has_labels(dataset_dir):\n    
labels_to_names = dataset_utils.read_label_file(dataset_dir)\n  else:\n    labels_to_names = create_readable_names_for_imagenet_labels()\n    dataset_utils.write_label_file(labels_to_names, dataset_dir)\n\n  return slim.dataset.Dataset(\n      data_sources=file_pattern,\n      reader=reader,\n      decoder=decoder,\n      num_samples=_SPLITS_TO_SIZES[split_name],\n      items_to_descriptions=_ITEMS_TO_DESCRIPTIONS,\n      num_classes=_NUM_CLASSES,\n      labels_to_names=labels_to_names)\n"
  },
  {
    "path": "model_zoo/models/slim/datasets/mnist.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides data for the MNIST dataset.\n\nThe dataset scripts used to create the dataset can be found at:\ntensorflow/models/slim/data/create_mnist_dataset.py\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\nimport tensorflow as tf\n\nfrom datasets import dataset_utils\n\nslim = tf.contrib.slim\n\n_FILE_PATTERN = 'mnist_%s.tfrecord'\n\n_SPLITS_TO_SIZES = {'train': 60000, 'test': 10000}\n\n_NUM_CLASSES = 10\n\n_ITEMS_TO_DESCRIPTIONS = {\n    'image': 'A [28 x 28 x 1] grayscale image.',\n    'label': 'A single integer between 0 and 9',\n}\n\n\ndef get_split(split_name, dataset_dir, file_pattern=None, reader=None):\n  \"\"\"Gets a dataset tuple with instructions for reading MNIST.\n\n  Args:\n    split_name: A train/test split name.\n    dataset_dir: The base directory of the dataset sources.\n    file_pattern: The file pattern to use when matching the dataset sources.\n      It is assumed that the pattern contains a '%s' string so that the split\n      name can be inserted.\n    reader: The TensorFlow reader type.\n\n  Returns:\n    A `Dataset` namedtuple.\n\n  Raises:\n    ValueError: if `split_name` is not a valid train/test split.\n  \"\"\"\n  if split_name not in 
_SPLITS_TO_SIZES:\n    raise ValueError('split name %s was not recognized.' % split_name)\n\n  if not file_pattern:\n    file_pattern = _FILE_PATTERN\n  file_pattern = os.path.join(dataset_dir, file_pattern % split_name)\n\n  # Allowing None in the signature so that dataset_factory can use the default.\n  if reader is None:\n    reader = tf.TFRecordReader\n\n  keys_to_features = {\n      'image/encoded': tf.FixedLenFeature((), tf.string, default_value=''),\n      'image/format': tf.FixedLenFeature((), tf.string, default_value='raw'),\n      'image/class/label': tf.FixedLenFeature(\n          [1], tf.int64, default_value=tf.zeros([1], dtype=tf.int64)),\n  }\n\n  items_to_handlers = {\n      'image': slim.tfexample_decoder.Image(shape=[28, 28, 1], channels=1),\n      'label': slim.tfexample_decoder.Tensor('image/class/label', shape=[]),\n  }\n\n  decoder = slim.tfexample_decoder.TFExampleDecoder(\n      keys_to_features, items_to_handlers)\n\n  labels_to_names = None\n  if dataset_utils.has_labels(dataset_dir):\n    labels_to_names = dataset_utils.read_label_file(dataset_dir)\n\n  return slim.dataset.Dataset(\n      data_sources=file_pattern,\n      reader=reader,\n      decoder=decoder,\n      num_samples=_SPLITS_TO_SIZES[split_name],\n      num_classes=_NUM_CLASSES,\n      items_to_descriptions=_ITEMS_TO_DESCRIPTIONS,\n      labels_to_names=labels_to_names)\n"
  },
  {
    "path": "model_zoo/models/slim/deployment/__init__.py",
    "content": "\n"
  },
  {
    "path": "model_zoo/models/slim/deployment/model_deploy.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Deploy Slim models across multiple clones and replicas.\n\n# TODO(sguada) docstring paragraph by (a) motivating the need for the file and\n# (b) defining clones.\n\n# TODO(sguada) describe the high-level components of model deployment.\n# E.g. \"each model deployment is composed of several parts: a DeploymentConfig,\n# which captures A, B and C, an input_fn which loads data.. 
 etc\n\nTo easily train a model on multiple GPUs or across multiple machines this\nmodule provides a set of helper functions: `create_clones`,\n`optimize_clones` and `deploy`.\n\nUsage:\n\n  g = tf.Graph()\n\n  # Set up DeploymentConfig\n  config = model_deploy.DeploymentConfig(num_clones=2, clone_on_cpu=True)\n\n  # Create the global step on the device storing the variables.\n  with tf.device(config.variables_device()):\n    global_step = slim.create_global_step()\n\n  # Define the inputs\n  with tf.device(config.inputs_device()):\n    images, labels = LoadData(...)\n    inputs_queue = slim.data.prefetch_queue((images, labels))\n\n  # Define the optimizer.\n  with tf.device(config.optimizer_device()):\n    optimizer = tf.train.MomentumOptimizer(FLAGS.learning_rate, FLAGS.momentum)\n\n  # Define the model including the loss.\n  def model_fn(inputs_queue):\n    images, labels = inputs_queue.dequeue()\n    predictions = CreateNetwork(images)\n    slim.losses.log_loss(predictions, labels)\n\n  model_dp = model_deploy.deploy(config, model_fn, [inputs_queue],\n                                 optimizer=optimizer)\n\n  # Run training.\n  slim.learning.train(model_dp.train_op, my_log_dir,\n                      summary_op=model_dp.summary_op)\n\nThe Clone namedtuple holds together the values associated with each call to\nmodel_fn:\n  * outputs: The return values of the calls to `model_fn()`.\n  * scope: The scope used to create the clone.\n  * device: The device used to create the clone.\n\nThe DeployedModel namedtuple holds together the values needed to train multiple\nclones:\n  * train_op: An operation that runs the optimizer training op and includes\n    all the update ops created by `model_fn`. 
 Present only if an optimizer\n    was specified.\n  * summary_op: An operation that runs the summaries created by `model_fn`\n    and process_gradients.\n  * total_loss: A `Tensor` that contains the sum of all losses created by\n    `model_fn` plus the regularization losses.\n  * clones: List of `Clone` tuples returned by `create_clones()`.\n\nDeploymentConfig parameters:\n  * num_clones: Number of model clones to deploy in each replica.\n  * clone_on_cpu: True if clones should be placed on CPU.\n  * replica_id: Integer.  Index of the replica for which the model is\n      deployed.  Usually 0 for the chief replica.\n  * num_replicas: Number of replicas to use.\n  * num_ps_tasks: Number of tasks for the `ps` job. 0 to not use replicas.\n  * worker_job_name: A name for the worker job.\n  * ps_job_name: A name for the parameter server job.\n\nTODO(sguada):\n  - describe side effect to the graph.\n  - what happens to summaries and update_ops.\n  - which graph collections are altered.\n  - write a tutorial on how to use this.\n  - analyze the possibility of calling deploy more than once.\n\n\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\n\nimport tensorflow as tf\n\nfrom tensorflow.python.ops import control_flow_ops\n\nslim = tf.contrib.slim\n\n\n__all__ = ['create_clones',\n           'deploy',\n           'optimize_clones',\n           'DeployedModel',\n           'DeploymentConfig',\n           'Clone',\n          ]\n\n\n# Namedtuple used to represent a clone during deployment.\nClone = collections.namedtuple('Clone',\n                               ['outputs',  # Whatever model_fn() returned.\n                                'scope',  # The scope used to create it.\n                                'device',  # The device used to create.\n                               ])\n\n# Namedtuple used to represent a DeployedModel, returned by deploy().\nDeployedModel = 
collections.namedtuple('DeployedModel',\n                                       ['train_op',  # The `train_op`\n                                        'summary_op',  # The `summary_op`\n                                        'total_loss',  # The loss `Tensor`\n                                        'clones',  # A list of `Clones` tuples.\n                                       ])\n\n# Default parameters for DeploymentConfig\n_deployment_params = {'num_clones': 1,\n                      'clone_on_cpu': False,\n                      'replica_id': 0,\n                      'num_replicas': 1,\n                      'num_ps_tasks': 0,\n                      'worker_job_name': 'worker',\n                      'ps_job_name': 'ps'}\n\n\ndef create_clones(config, model_fn, args=None, kwargs=None):\n  \"\"\"Creates multiple clones according to config using a `model_fn`.\n\n  The returned values of `model_fn(*args, **kwargs)` are collected along with\n  the scope and device used to create it in a namedtuple\n  `Clone(outputs, scope, device)`\n\n  Note: it is assumed that any loss created by `model_fn` is collected in\n  the tf.GraphKeys.LOSSES collection.\n\n  To recover the losses, summaries or update_ops created by the clone use:\n  ```python\n    losses = tf.get_collection(tf.GraphKeys.LOSSES, clone.scope)\n    summaries = tf.get_collection(tf.GraphKeys.SUMMARIES, clone.scope)\n    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS, clone.scope)\n  ```\n\n  The deployment options are specified by the config object and support\n  deploying one or several clones on different GPUs and one or several replicas\n  of such clones.\n\n  The argument `model_fn` is called `config.num_clones` times to create the\n  model clones as `model_fn(*args, **kwargs)`.\n\n  If `config` specifies deployment on multiple replicas then the default\n  tensorflow device is set appropriately for each call to `model_fn` and for the\n  slim variable creation functions: model and global variables 
will be created\n  on the `ps` device, the clone operations will be on the `worker` device.\n\n  Args:\n    config: A DeploymentConfig object.\n    model_fn: A callable. Called as `model_fn(*args, **kwargs)`\n    args: Optional list of arguments to pass to `model_fn`.\n    kwargs: Optional list of keyword arguments to pass to `model_fn`.\n\n  Returns:\n    A list of namedtuples `Clone`.\n  \"\"\"\n  clones = []\n  args = args or []\n  kwargs = kwargs or {}\n  with slim.arg_scope([slim.model_variable, slim.variable],\n                      device=config.variables_device()):\n    # Create clones.\n    for i in range(0, config.num_clones):\n      with tf.name_scope(config.clone_scope(i)) as clone_scope:\n        clone_device = config.clone_device(i)\n        with tf.device(clone_device):\n          with tf.variable_scope(tf.get_variable_scope(),\n                                 reuse=True if i > 0 else None):\n            outputs = model_fn(*args, **kwargs)\n          clones.append(Clone(outputs, clone_scope, clone_device))\n  return clones\n\n\ndef _gather_clone_loss(clone, num_clones, regularization_losses):\n  \"\"\"Gather the loss for a single clone.\n\n  Args:\n    clone: A Clone namedtuple.\n    num_clones: The number of clones being deployed.\n    regularization_losses: Possibly empty list of regularization_losses\n      to add to the clone losses.\n\n  Returns:\n    A tensor for the total loss for the clone.  
Can be None.\n  \"\"\"\n  # The return value.\n  sum_loss = None\n  # Individual components of the loss that will need summaries.\n  clone_loss = None\n  regularization_loss = None\n  # Compute and aggregate losses on the clone device.\n  with tf.device(clone.device):\n    all_losses = []\n    clone_losses = tf.get_collection(tf.GraphKeys.LOSSES, clone.scope)\n    if clone_losses:\n      clone_loss = tf.add_n(clone_losses, name='clone_loss')\n      if num_clones > 1:\n        clone_loss = tf.div(clone_loss, 1.0 * num_clones,\n                            name='scaled_clone_loss')\n      all_losses.append(clone_loss)\n    if regularization_losses:\n      regularization_loss = tf.add_n(regularization_losses,\n                                     name='regularization_loss')\n      all_losses.append(regularization_loss)\n    if all_losses:\n      sum_loss = tf.add_n(all_losses)\n  # Add the summaries out of the clone device block.\n  if clone_loss is not None:\n    tf.scalar_summary(clone.scope + '/clone_loss', clone_loss,\n                      name='clone_loss')\n  if regularization_loss is not None:\n    tf.scalar_summary('regularization_loss', regularization_loss,\n                      name='regularization_loss')\n  return sum_loss\n\n\ndef _optimize_clone(optimizer, clone, num_clones, regularization_losses,\n                    **kwargs):\n  \"\"\"Compute losses and gradients for a single clone.\n\n  Args:\n    optimizer: A tf.Optimizer  object.\n    clone: A Clone namedtuple.\n    num_clones: The number of clones being deployed.\n    regularization_losses: Possibly empty list of regularization_losses\n      to add to the clone losses.\n    **kwargs: Dict of kwarg to pass to compute_gradients().\n\n  Returns:\n    A tuple (clone_loss, clone_grads_and_vars).\n      - clone_loss: A tensor for the total loss for the clone.  
Can be None.\n      - clone_grads_and_vars: List of (gradient, variable) for the clone.\n        Can be empty.\n  \"\"\"\n  sum_loss = _gather_clone_loss(clone, num_clones, regularization_losses)\n  clone_grad = None\n  if sum_loss is not None:\n    with tf.device(clone.device):\n      clone_grad = optimizer.compute_gradients(sum_loss, **kwargs)\n  return sum_loss, clone_grad\n\n\ndef optimize_clones(clones, optimizer,\n                    regularization_losses=None,\n                    **kwargs):\n  \"\"\"Compute clone losses and gradients for the given list of `Clones`.\n\n  Note: The regularization_losses are added to the first clone losses.\n\n  Args:\n   clones: List of `Clones` created by `create_clones()`.\n   optimizer: An `Optimizer` object.\n   regularization_losses: Optional list of regularization losses. If None it\n     will gather them from tf.GraphKeys.REGULARIZATION_LOSSES. Pass `[]` to\n     exclude them.\n   **kwargs: Optional list of keyword arguments to pass to `compute_gradients`.\n\n  Returns:\n   A tuple (total_loss, grads_and_vars).\n     - total_loss: A Tensor containing the average of the clone losses including\n       the regularization loss.\n     - grads_and_vars: A List of tuples (gradient, variable) containing the sum\n       of the gradients for each variable.\n\n  \"\"\"\n  grads_and_vars = []\n  clones_losses = []\n  num_clones = len(clones)\n  if regularization_losses is None:\n    regularization_losses = tf.get_collection(\n        tf.GraphKeys.REGULARIZATION_LOSSES)\n  for clone in clones:\n    with tf.name_scope(clone.scope):\n      clone_loss, clone_grad = _optimize_clone(\n          optimizer, clone, num_clones, regularization_losses, **kwargs)\n      if clone_loss is not None:\n        clones_losses.append(clone_loss)\n        grads_and_vars.append(clone_grad)\n      # Only use regularization_losses for the first clone\n      regularization_losses = None\n  # Compute the total_loss summing all the clones_losses.\n  
total_loss = tf.add_n(clones_losses, name='total_loss')\n  # Sum the gradients across clones.\n  grads_and_vars = _sum_clones_gradients(grads_and_vars)\n  return total_loss, grads_and_vars\n\n\ndef deploy(config,\n           model_fn,\n           args=None,\n           kwargs=None,\n           optimizer=None,\n           summarize_gradients=False):\n  \"\"\"Deploys a Slim-constructed model across multiple clones.\n\n  The deployment options are specified by the config object and support\n  deploying one or several clones on different GPUs and one or several replicas\n  of such clones.\n\n  The argument `model_fn` is called `config.num_clones` times to create the\n  model clones as `model_fn(*args, **kwargs)`.\n\n  The optional argument `optimizer` is an `Optimizer` object.  If not `None`,\n  the deployed model is configured for training with that optimizer.\n\n  If `config` specifies deployment on multiple replicas then the default\n  tensorflow device is set appropriately for each call to `model_fn` and for the\n  slim variable creation functions: model and global variables will be created\n  on the `ps` device, the clone operations will be on the `worker` device.\n\n  Args:\n    config: A `DeploymentConfig` object.\n    model_fn: A callable. Called as `model_fn(*args, **kwargs)`.\n    args: Optional list of arguments to pass to `model_fn`.\n    kwargs: Optional list of keyword arguments to pass to `model_fn`.\n    optimizer: Optional `Optimizer` object.  If passed, the model is deployed\n      for training with that optimizer.\n    summarize_gradients: Whether or not to add summaries to the gradients.\n\n  Returns:\n    A `DeployedModel` namedtuple.\n\n  \"\"\"\n  # Gather initial summaries.\n  summaries = set(tf.get_collection(tf.GraphKeys.SUMMARIES))\n\n  # Create Clones.\n  clones = create_clones(config, model_fn, args, kwargs)\n  first_clone = clones[0]\n\n  # Gather update_ops from the first clone. 
These contain, for example,\n  # the updates for the batch_norm variables created by model_fn.\n  update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS, first_clone.scope)\n\n  train_op = None\n  total_loss = None\n  with tf.device(config.optimizer_device()):\n    if optimizer:\n      # Place the global step on the device storing the variables.\n      with tf.device(config.variables_device()):\n        global_step = slim.get_or_create_global_step()\n\n      # Compute the gradients for the clones.\n      total_loss, clones_gradients = optimize_clones(clones, optimizer)\n\n      if clones_gradients:\n        if summarize_gradients:\n          # Add summaries to the gradients.\n          summaries |= set(_add_gradients_summaries(clones_gradients))\n\n        # Create gradient updates.\n        grad_updates = optimizer.apply_gradients(clones_gradients,\n                                                 global_step=global_step)\n        update_ops.append(grad_updates)\n\n        update_op = tf.group(*update_ops)\n        train_op = control_flow_ops.with_dependencies([update_op], total_loss,\n                                                      name='train_op')\n    else:\n      clones_losses = []\n      regularization_losses = tf.get_collection(\n          tf.GraphKeys.REGULARIZATION_LOSSES)\n      for clone in clones:\n        with tf.name_scope(clone.scope):\n          clone_loss = _gather_clone_loss(clone, len(clones),\n                                          regularization_losses)\n          if clone_loss is not None:\n            clones_losses.append(clone_loss)\n          # Only use regularization_losses for the first clone\n          regularization_losses = None\n      if clones_losses:\n        total_loss = tf.add_n(clones_losses, name='total_loss')\n\n    # Add the summaries from the first clone. 
These contain the summaries\n    # created by model_fn and either optimize_clones() or _gather_clone_loss().\n    summaries |= set(tf.get_collection(tf.GraphKeys.SUMMARIES,\n                                       first_clone.scope))\n\n    if total_loss is not None:\n      # Add total_loss to summary.\n      summaries.add(tf.scalar_summary('total_loss', total_loss,\n                                      name='total_loss'))\n\n    if summaries:\n      # Merge all summaries together.\n      summary_op = tf.merge_summary(list(summaries), name='summary_op')\n    else:\n      summary_op = None\n\n  return DeployedModel(train_op, summary_op, total_loss, clones)\n\n\ndef _sum_clones_gradients(clone_grads):\n  \"\"\"Calculate the sum gradient for each shared variable across all clones.\n\n  This function assumes that the clone_grads has been scaled appropriately by\n  1 / num_clones.\n\n  Args:\n    clone_grads: A List of List of tuples (gradient, variable), one list per\n    `Clone`.\n\n  Returns:\n     List of tuples of (gradient, variable) where the gradient has been summed\n     across all clones.\n  \"\"\"\n  sum_grads = []\n  for grad_and_vars in zip(*clone_grads):\n    # Note that each grad_and_vars looks like the following:\n    #   ((grad_var0_clone0, var0), ... 
(grad_varN_cloneN, varN))\n    grads = []\n    var = grad_and_vars[0][1]\n    for g, v in grad_and_vars:\n      assert v == var\n      if g is not None:\n        grads.append(g)\n    if grads:\n      if len(grads) > 1:\n        sum_grad = tf.add_n(grads, name=var.op.name + '/sum_grads')\n      else:\n        sum_grad = grads[0]\n      sum_grads.append((sum_grad, var))\n  return sum_grads\n\n\ndef _add_gradients_summaries(grads_and_vars):\n  \"\"\"Add histogram summaries to gradients.\n\n  Note: The summaries are also added to the SUMMARIES collection.\n\n  Args:\n    grads_and_vars: A list of gradient to variable pairs (tuples).\n\n  Returns:\n    The _list_ of the added summaries for grads_and_vars.\n  \"\"\"\n  summaries = []\n  for grad, var in grads_and_vars:\n    if grad is not None:\n      if isinstance(grad, tf.IndexedSlices):\n        grad_values = grad.values\n      else:\n        grad_values = grad\n      summaries.append(tf.histogram_summary(var.op.name + ':gradient',\n                                            grad_values))\n      summaries.append(tf.histogram_summary(var.op.name + ':gradient_norm',\n                                            tf.global_norm([grad_values])))\n    else:\n      tf.logging.info('Var %s has no gradient', var.op.name)\n  return summaries\n\n\nclass DeploymentConfig(object):\n  \"\"\"Configuration for deploying a model with `deploy()`.\n\n  You can pass an instance of this class to `deploy()` to specify exactly\n  how to deploy the model to build.  
If you do not pass one, an instance built\n  from the default deployment_hparams will be used.\n  \"\"\"\n\n  def __init__(self,\n               num_clones=1,\n               clone_on_cpu=False,\n               replica_id=0,\n               num_replicas=1,\n               num_ps_tasks=0,\n               worker_job_name='worker',\n               ps_job_name='ps'):\n    \"\"\"Create a DeploymentConfig.\n\n    The config describes how to deploy a model across multiple clones and\n    replicas.  The model will be replicated `num_clones` times in each replica.\n    If `clone_on_cpu` is True, each clone will be placed on CPU.\n\n    If `num_replicas` is 1, the model is deployed via a single process.  In that\n    case `worker_device`, `num_ps_tasks`, and `ps_device` are ignored.\n\n    If `num_replicas` is greater than 1, then `worker_device` and `ps_device`\n    must specify TensorFlow devices for the `worker` and `ps` jobs and\n    `num_ps_tasks` must be positive.\n\n    Args:\n      num_clones: Number of model clones to deploy in each replica.\n      clone_on_cpu: If True, clones will be placed on CPU.\n      replica_id: Integer.  Index of the replica for which the model is\n        deployed.  Usually 0 for the chief replica.\n      num_replicas: Number of replicas to use.\n      num_ps_tasks: Number of tasks for the `ps` job. 
0 to not use replicas.\n      worker_job_name: A name for the worker job.\n      ps_job_name: A name for the parameter server job.\n\n    Raises:\n      ValueError: If the arguments are invalid.\n    \"\"\"\n    if num_replicas > 1:\n      if num_ps_tasks < 1:\n        raise ValueError('When using replicas num_ps_tasks must be positive')\n    if num_replicas > 1 or num_ps_tasks > 0:\n      if not worker_job_name:\n        raise ValueError('Must specify worker_job_name when using replicas')\n      if not ps_job_name:\n        raise ValueError('Must specify ps_job_name when using parameter server')\n    if replica_id >= num_replicas:\n      raise ValueError('replica_id must be less than num_replicas')\n    self._num_clones = num_clones\n    self._clone_on_cpu = clone_on_cpu\n    self._replica_id = replica_id\n    self._num_replicas = num_replicas\n    self._num_ps_tasks = num_ps_tasks\n    self._ps_device = '/job:' + ps_job_name if num_ps_tasks > 0 else ''\n    self._worker_device = '/job:' + worker_job_name if num_ps_tasks > 0 else ''\n\n  @property\n  def num_clones(self):\n    return self._num_clones\n\n  @property\n  def clone_on_cpu(self):\n    return self._clone_on_cpu\n\n  @property\n  def replica_id(self):\n    return self._replica_id\n\n  @property\n  def num_replicas(self):\n    return self._num_replicas\n\n  @property\n  def num_ps_tasks(self):\n    return self._num_ps_tasks\n\n  @property\n  def ps_device(self):\n    return self._ps_device\n\n  @property\n  def worker_device(self):\n    return self._worker_device\n\n  def caching_device(self):\n    \"\"\"Returns the device to use for caching variables.\n\n    Variables are cached on the worker CPU when using replicas.\n\n    Returns:\n      A device string or None if the variables do not need to be cached.\n    \"\"\"\n    if self._num_ps_tasks > 0:\n      return lambda op: op.device\n    else:\n      return None\n\n  def clone_device(self, clone_index):\n    \"\"\"Device used to create the clone and all 
the ops inside the clone.\n\n    Args:\n      clone_index: Int, representing the clone_index.\n\n    Returns:\n      A value suitable for `tf.device()`.\n\n    Raises:\n      ValueError: if `clone_index` is greater than or equal to the number of\n        clones.\n    \"\"\"\n    if clone_index >= self._num_clones:\n      raise ValueError('clone_index must be less than num_clones')\n    device = ''\n    if self._num_ps_tasks > 0:\n      device += self._worker_device\n    if self._clone_on_cpu:\n      device += '/device:CPU:0'\n    else:\n      if self._num_clones > 1:\n        device += '/device:GPU:%d' % clone_index\n    return device\n\n  def clone_scope(self, clone_index):\n    \"\"\"Name scope to create the clone.\n\n    Args:\n      clone_index: Int, representing the clone_index.\n\n    Returns:\n      A name_scope suitable for `tf.name_scope()`.\n\n    Raises:\n      ValueError: if `clone_index` is greater than or equal to the number of\n        clones.\n    \"\"\"\n    if clone_index >= self._num_clones:\n      raise ValueError('clone_index must be less than num_clones')\n    scope = ''\n    if self._num_clones > 1:\n      scope = 'clone_%d' % clone_index\n    return scope\n\n  def optimizer_device(self):\n    \"\"\"Device to use with the optimizer.\n\n    Returns:\n      A value suitable for `tf.device()`.\n    \"\"\"\n    if self._num_ps_tasks > 0 or self._num_clones > 0:\n      return self._worker_device + '/device:CPU:0'\n    else:\n      return ''\n\n  def inputs_device(self):\n    \"\"\"Device to use to build the inputs.\n\n    Returns:\n      A value suitable for `tf.device()`.\n    \"\"\"\n    device = ''\n    if self._num_ps_tasks > 0:\n      device += self._worker_device\n    device += '/device:CPU:0'\n    return device\n\n  def variables_device(self):\n    \"\"\"Returns the device to use for variables created inside the clone.\n\n    Returns:\n      A value suitable for `tf.device()`.\n    \"\"\"\n    device = ''\n    if self._num_ps_tasks > 0:\n      device += 
self._ps_device\n    device += '/device:CPU:0'\n\n    class _PSDeviceChooser(object):\n      \"\"\"Slim device chooser for variables when using PS.\"\"\"\n\n      def __init__(self, device, tasks):\n        self._device = device\n        self._tasks = tasks\n        self._task = 0\n\n      def choose(self, op):\n        if op.device:\n          return op.device\n        node_def = op if isinstance(op, tf.NodeDef) else op.node_def\n        if node_def.op == 'Variable':\n          t = self._task\n          self._task = (self._task + 1) % self._tasks\n          d = '%s/task:%d' % (self._device, t)\n          return d\n        else:\n          return op.device\n\n    if not self._num_ps_tasks:\n      return device\n    else:\n      chooser = _PSDeviceChooser(device, self._num_ps_tasks)\n      return chooser.choose\n"
  },
  {
    "path": "model_zoo/models/slim/deployment/model_deploy_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for model_deploy.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom deployment import model_deploy\n\nslim = tf.contrib.slim\n\n\nclass DeploymentConfigTest(tf.test.TestCase):\n\n  def testDefaults(self):\n    deploy_config = model_deploy.DeploymentConfig()\n\n    self.assertEqual(slim.get_variables(), [])\n    self.assertEqual(deploy_config.caching_device(), None)\n    self.assertDeviceEqual(deploy_config.clone_device(0), '')\n    self.assertEqual(deploy_config.clone_scope(0), '')\n    self.assertDeviceEqual(deploy_config.optimizer_device(), 'CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(), 'CPU:0')\n    self.assertDeviceEqual(deploy_config.variables_device(), 'CPU:0')\n\n  def testCPUonly(self):\n    deploy_config = model_deploy.DeploymentConfig(clone_on_cpu=True)\n\n    self.assertEqual(deploy_config.caching_device(), None)\n    self.assertDeviceEqual(deploy_config.clone_device(0), 'CPU:0')\n    self.assertEqual(deploy_config.clone_scope(0), '')\n    self.assertDeviceEqual(deploy_config.optimizer_device(), 'CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(), 'CPU:0')\n    
self.assertDeviceEqual(deploy_config.variables_device(), 'CPU:0')\n\n  def testMultiGPU(self):\n    deploy_config = model_deploy.DeploymentConfig(num_clones=2)\n\n    self.assertEqual(deploy_config.caching_device(), None)\n    self.assertDeviceEqual(deploy_config.clone_device(0), 'GPU:0')\n    self.assertDeviceEqual(deploy_config.clone_device(1), 'GPU:1')\n    self.assertEqual(deploy_config.clone_scope(0), 'clone_0')\n    self.assertEqual(deploy_config.clone_scope(1), 'clone_1')\n    self.assertDeviceEqual(deploy_config.optimizer_device(), 'CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(), 'CPU:0')\n    self.assertDeviceEqual(deploy_config.variables_device(), 'CPU:0')\n\n  def testPS(self):\n    deploy_config = model_deploy.DeploymentConfig(num_clones=1, num_ps_tasks=1)\n\n    self.assertDeviceEqual(deploy_config.clone_device(0),\n                           '/job:worker')\n    self.assertEqual(deploy_config.clone_scope(0), '')\n    self.assertDeviceEqual(deploy_config.optimizer_device(),\n                           '/job:worker/device:CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(),\n                           '/job:worker/device:CPU:0')\n    with tf.device(deploy_config.variables_device()):\n      a = tf.Variable(0)\n      b = tf.Variable(0)\n      c = tf.no_op()\n      d = slim.variable('a', [],\n                        caching_device=deploy_config.caching_device())\n    self.assertDeviceEqual(a.device, '/job:ps/task:0/device:CPU:0')\n    self.assertDeviceEqual(a.device, a.value().device)\n    self.assertDeviceEqual(b.device, '/job:ps/task:0/device:CPU:0')\n    self.assertDeviceEqual(b.device, b.value().device)\n    self.assertDeviceEqual(c.device, '')\n    self.assertDeviceEqual(d.device, '/job:ps/task:0/device:CPU:0')\n    self.assertDeviceEqual(d.value().device, '')\n\n  def testMultiGPUPS(self):\n    deploy_config = model_deploy.DeploymentConfig(num_clones=2, num_ps_tasks=1)\n\n    
self.assertEqual(deploy_config.caching_device()(tf.no_op()), '')\n    self.assertDeviceEqual(deploy_config.clone_device(0),\n                           '/job:worker/device:GPU:0')\n    self.assertDeviceEqual(deploy_config.clone_device(1),\n                           '/job:worker/device:GPU:1')\n    self.assertEqual(deploy_config.clone_scope(0), 'clone_0')\n    self.assertEqual(deploy_config.clone_scope(1), 'clone_1')\n    self.assertDeviceEqual(deploy_config.optimizer_device(),\n                           '/job:worker/device:CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(),\n                           '/job:worker/device:CPU:0')\n\n  def testReplicasPS(self):\n    deploy_config = model_deploy.DeploymentConfig(num_replicas=2,\n                                                  num_ps_tasks=2)\n\n    self.assertDeviceEqual(deploy_config.clone_device(0),\n                           '/job:worker')\n    self.assertEqual(deploy_config.clone_scope(0), '')\n    self.assertDeviceEqual(deploy_config.optimizer_device(),\n                           '/job:worker/device:CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(),\n                           '/job:worker/device:CPU:0')\n\n  def testReplicasMultiGPUPS(self):\n    deploy_config = model_deploy.DeploymentConfig(num_replicas=2,\n                                                  num_clones=2,\n                                                  num_ps_tasks=2)\n    self.assertDeviceEqual(deploy_config.clone_device(0),\n                           '/job:worker/device:GPU:0')\n    self.assertDeviceEqual(deploy_config.clone_device(1),\n                           '/job:worker/device:GPU:1')\n    self.assertEqual(deploy_config.clone_scope(0), 'clone_0')\n    self.assertEqual(deploy_config.clone_scope(1), 'clone_1')\n    self.assertDeviceEqual(deploy_config.optimizer_device(),\n                           '/job:worker/device:CPU:0')\n    self.assertDeviceEqual(deploy_config.inputs_device(),\n                
           '/job:worker/device:CPU:0')\n\n  def testVariablesPS(self):\n    deploy_config = model_deploy.DeploymentConfig(num_ps_tasks=2)\n\n    with tf.device(deploy_config.variables_device()):\n      a = tf.Variable(0)\n      b = tf.Variable(0)\n      c = tf.no_op()\n      d = slim.variable('a', [],\n                        caching_device=deploy_config.caching_device())\n\n    self.assertDeviceEqual(a.device, '/job:ps/task:0/device:CPU:0')\n    self.assertDeviceEqual(a.device, a.value().device)\n    self.assertDeviceEqual(b.device, '/job:ps/task:1/device:CPU:0')\n    self.assertDeviceEqual(b.device, b.value().device)\n    self.assertDeviceEqual(c.device, '')\n    self.assertDeviceEqual(d.device, '/job:ps/task:0/device:CPU:0')\n    self.assertDeviceEqual(d.value().device, '')\n\n\ndef LogisticClassifier(inputs, labels, scope=None, reuse=None):\n  with tf.variable_scope(scope, 'LogisticClassifier', [inputs, labels],\n                         reuse=reuse):\n    predictions = slim.fully_connected(inputs, 1, activation_fn=tf.sigmoid,\n                                       scope='fully_connected')\n    slim.losses.log_loss(predictions, labels)\n    return predictions\n\n\ndef BatchNormClassifier(inputs, labels, scope=None, reuse=None):\n  with tf.variable_scope(scope, 'BatchNormClassifier', [inputs, labels],\n                         reuse=reuse):\n    inputs = slim.batch_norm(inputs, decay=0.1)\n    predictions = slim.fully_connected(inputs, 1,\n                                       activation_fn=tf.sigmoid,\n                                       scope='fully_connected')\n    slim.losses.log_loss(predictions, labels)\n    return predictions\n\n\nclass CreatecloneTest(tf.test.TestCase):\n\n  def setUp(self):\n    # Create an easy training set:\n    np.random.seed(0)\n\n    self._inputs = np.zeros((16, 4))\n    self._labels = np.random.randint(0, 2, size=(16, 1)).astype(np.float32)\n    self._logdir = self.get_temp_dir()\n\n    for i in range(16):\n      j = int(2 * 
self._labels[i] + np.random.randint(0, 2))\n      self._inputs[i, j] = 1\n\n  def testCreateLogisticClassifier(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = LogisticClassifier\n      clone_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=1)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      clone = clones[0]\n      self.assertEqual(len(slim.get_variables()), 2)\n      for v in slim.get_variables():\n        self.assertDeviceEqual(v.device, 'CPU:0')\n        self.assertDeviceEqual(v.value().device, 'CPU:0')\n      self.assertEqual(clone.outputs.op.name,\n                       'LogisticClassifier/fully_connected/Sigmoid')\n      self.assertEqual(clone.scope, '')\n      self.assertDeviceEqual(clone.device, '')\n      self.assertEqual(len(slim.losses.get_losses()), 1)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(update_ops, [])\n\n  def testCreateSingleclone(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      clone_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=1)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      clone = clones[0]\n      self.assertEqual(len(slim.get_variables()), 5)\n      for v in slim.get_variables():\n        self.assertDeviceEqual(v.device, 'CPU:0')\n        self.assertDeviceEqual(v.value().device, 'CPU:0')\n      self.assertEqual(clone.outputs.op.name,\n          
             'BatchNormClassifier/fully_connected/Sigmoid')\n      self.assertEqual(clone.scope, '')\n      self.assertDeviceEqual(clone.device, '')\n      self.assertEqual(len(slim.losses.get_losses()), 1)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(len(update_ops), 2)\n\n  def testCreateMulticlone(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      clone_args = (tf_inputs, tf_labels)\n      num_clones = 4\n      deploy_config = model_deploy.DeploymentConfig(num_clones=num_clones)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      self.assertEqual(len(slim.get_variables()), 5)\n      for v in slim.get_variables():\n        self.assertDeviceEqual(v.device, 'CPU:0')\n        self.assertDeviceEqual(v.value().device, 'CPU:0')\n      self.assertEqual(len(clones), num_clones)\n      for i, clone in enumerate(clones):\n        self.assertEqual(\n            clone.outputs.op.name,\n            'clone_%d/BatchNormClassifier/fully_connected/Sigmoid' % i)\n        update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS, clone.scope)\n        self.assertEqual(len(update_ops), 2)\n        self.assertEqual(clone.scope, 'clone_%d/' % i)\n        self.assertDeviceEqual(clone.device, 'GPU:%d' % i)\n\n  def testCreateOnecloneWithPS(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      clone_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=1,\n                                                    num_ps_tasks=1)\n\n  
    self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      self.assertEqual(len(clones), 1)\n      clone = clones[0]\n      self.assertEqual(clone.outputs.op.name,\n                       'BatchNormClassifier/fully_connected/Sigmoid')\n      self.assertDeviceEqual(clone.device, '/job:worker')\n      self.assertEqual(clone.scope, '')\n      self.assertEqual(len(slim.get_variables()), 5)\n      for v in slim.get_variables():\n        self.assertDeviceEqual(v.device, '/job:ps/task:0/CPU:0')\n        self.assertDeviceEqual(v.device, v.value().device)\n\n  def testCreateMulticloneWithPS(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      clone_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=2,\n                                                    num_ps_tasks=2)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      self.assertEqual(len(slim.get_variables()), 5)\n      for i, v in enumerate(slim.get_variables()):\n        t = i % 2\n        self.assertDeviceEqual(v.device, '/job:ps/task:%d/device:CPU:0' % t)\n        self.assertDeviceEqual(v.device, v.value().device)\n      self.assertEqual(len(clones), 2)\n      for i, clone in enumerate(clones):\n        self.assertEqual(\n            clone.outputs.op.name,\n            'clone_%d/BatchNormClassifier/fully_connected/Sigmoid' % i)\n        self.assertEqual(clone.scope, 'clone_%d/' % i)\n        self.assertDeviceEqual(clone.device, '/job:worker/device:GPU:%d' % i)\n\n\nclass OptimizeclonesTest(tf.test.TestCase):\n\n  def setUp(self):\n    # Create an easy training set:\n    np.random.seed(0)\n\n    self._inputs = 
np.zeros((16, 4))\n    self._labels = np.random.randint(0, 2, size=(16, 1)).astype(np.float32)\n    self._logdir = self.get_temp_dir()\n\n    for i in range(16):\n      j = int(2 * self._labels[i] + np.random.randint(0, 2))\n      self._inputs[i, j] = 1\n\n  def testCreateLogisticClassifier(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = LogisticClassifier\n      clone_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=1)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      self.assertEqual(len(slim.get_variables()), 2)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(update_ops, [])\n\n      optimizer = tf.train.GradientDescentOptimizer(learning_rate=1.0)\n      total_loss, grads_and_vars = model_deploy.optimize_clones(clones,\n                                                                optimizer)\n      self.assertEqual(len(grads_and_vars), len(tf.trainable_variables()))\n      self.assertEqual(total_loss.op.name, 'total_loss')\n      for g, v in grads_and_vars:\n        self.assertDeviceEqual(g.device, '')\n        self.assertDeviceEqual(v.device, 'CPU:0')\n\n  def testCreateSingleclone(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      clone_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=1)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      self.assertEqual(len(slim.get_variables()), 
5)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(len(update_ops), 2)\n\n      optimizer = tf.train.GradientDescentOptimizer(learning_rate=1.0)\n      total_loss, grads_and_vars = model_deploy.optimize_clones(clones,\n                                                                optimizer)\n      self.assertEqual(len(grads_and_vars), len(tf.trainable_variables()))\n      self.assertEqual(total_loss.op.name, 'total_loss')\n      for g, v in grads_and_vars:\n        self.assertDeviceEqual(g.device, '')\n        self.assertDeviceEqual(v.device, 'CPU:0')\n\n  def testCreateMulticlone(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      clone_args = (tf_inputs, tf_labels)\n      num_clones = 4\n      deploy_config = model_deploy.DeploymentConfig(num_clones=num_clones)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, clone_args)\n      self.assertEqual(len(slim.get_variables()), 5)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(len(update_ops), num_clones * 2)\n\n      optimizer = tf.train.GradientDescentOptimizer(learning_rate=1.0)\n      total_loss, grads_and_vars = model_deploy.optimize_clones(clones,\n                                                                optimizer)\n      self.assertEqual(len(grads_and_vars), len(tf.trainable_variables()))\n      self.assertEqual(total_loss.op.name, 'total_loss')\n      for g, v in grads_and_vars:\n        self.assertDeviceEqual(g.device, '')\n        self.assertDeviceEqual(v.device, 'CPU:0')\n\n  def testCreateMulticloneCPU(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      
tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      model_args = (tf_inputs, tf_labels)\n      num_clones = 4\n      deploy_config = model_deploy.DeploymentConfig(num_clones=num_clones,\n                                                    clone_on_cpu=True)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, model_args)\n      self.assertEqual(len(slim.get_variables()), 5)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(len(update_ops), num_clones * 2)\n\n      optimizer = tf.train.GradientDescentOptimizer(learning_rate=1.0)\n      total_loss, grads_and_vars = model_deploy.optimize_clones(clones,\n                                                                optimizer)\n      self.assertEqual(len(grads_and_vars), len(tf.trainable_variables()))\n      self.assertEqual(total_loss.op.name, 'total_loss')\n      for g, v in grads_and_vars:\n        self.assertDeviceEqual(g.device, '')\n        self.assertDeviceEqual(v.device, 'CPU:0')\n\n  def testCreateOnecloneWithPS(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      model_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=1,\n                                                    num_ps_tasks=1)\n\n      self.assertEqual(slim.get_variables(), [])\n      clones = model_deploy.create_clones(deploy_config, model_fn, model_args)\n      self.assertEqual(len(slim.get_variables()), 5)\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(len(update_ops), 2)\n\n      optimizer = tf.train.GradientDescentOptimizer(learning_rate=1.0)\n      total_loss, grads_and_vars = 
model_deploy.optimize_clones(clones,\n                                                                optimizer)\n      self.assertEqual(len(grads_and_vars), len(tf.trainable_variables()))\n      self.assertEqual(total_loss.op.name, 'total_loss')\n      for g, v in grads_and_vars:\n        self.assertDeviceEqual(g.device, '/job:worker')\n        self.assertDeviceEqual(v.device, '/job:ps/task:0/CPU:0')\n\n\nclass DeployTest(tf.test.TestCase):\n\n  def setUp(self):\n    # Create an easy training set:\n    np.random.seed(0)\n\n    self._inputs = np.zeros((16, 4))\n    self._labels = np.random.randint(0, 2, size=(16, 1)).astype(np.float32)\n    self._logdir = self.get_temp_dir()\n\n    for i in range(16):\n      j = int(2 * self._labels[i] + np.random.randint(0, 2))\n      self._inputs[i, j] = 1\n\n  def testLocalTrainOp(self):\n    g = tf.Graph()\n    with g.as_default():\n      tf.set_random_seed(0)\n      tf_inputs = tf.constant(self._inputs, dtype=tf.float32)\n      tf_labels = tf.constant(self._labels, dtype=tf.float32)\n\n      model_fn = BatchNormClassifier\n      model_args = (tf_inputs, tf_labels)\n      deploy_config = model_deploy.DeploymentConfig(num_clones=2,\n                                                    clone_on_cpu=True)\n\n      optimizer = tf.train.GradientDescentOptimizer(learning_rate=1.0)\n\n      self.assertEqual(slim.get_variables(), [])\n      model = model_deploy.deploy(deploy_config, model_fn, model_args,\n                                  optimizer=optimizer)\n\n      update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)\n      self.assertEqual(len(update_ops), 4)\n      self.assertEqual(len(model.clones), 2)\n      self.assertEqual(model.total_loss.op.name, 'total_loss')\n      self.assertEqual(model.summary_op.op.name, 'summary_op/summary_op')\n      self.assertEqual(model.train_op.op.name, 'train_op')\n\n      with tf.Session() as sess:\n        sess.run(tf.initialize_all_variables())\n        moving_mean = 
tf.contrib.framework.get_variables_by_name(\n            'moving_mean')[0]\n        moving_variance = tf.contrib.framework.get_variables_by_name(\n            'moving_variance')[0]\n        initial_loss = sess.run(model.total_loss)\n        initial_mean, initial_variance = sess.run([moving_mean,\n                                                   moving_variance])\n        self.assertAllClose(initial_mean, [0.0, 0.0, 0.0, 0.0])\n        self.assertAllClose(initial_variance, [1.0, 1.0, 1.0, 1.0])\n        for _ in range(10):\n          sess.run(model.train_op)\n        final_loss = sess.run(model.total_loss)\n        self.assertLess(final_loss, initial_loss / 10.0)\n\n        final_mean, final_variance = sess.run([moving_mean,\n                                               moving_variance])\n        self.assertAllClose(final_mean, [0.125, 0.25, 0.375, 0.25])\n        self.assertAllClose(final_variance, [0.109375, 0.1875,\n                                             0.234375, 0.1875])\n\n  def testNoSummariesOnGPU(self):\n    with tf.Graph().as_default():\n      deploy_config = model_deploy.DeploymentConfig(num_clones=2)\n\n      # clone function creates a fully_connected layer with a regularizer loss.\n      def ModelFn():\n        inputs = tf.constant(1.0, shape=(10, 20), dtype=tf.float32)\n        reg = tf.contrib.layers.l2_regularizer(0.001)\n        tf.contrib.layers.fully_connected(inputs, 30, weights_regularizer=reg)\n\n      model = model_deploy.deploy(\n          deploy_config, ModelFn,\n          optimizer=tf.train.GradientDescentOptimizer(1.0))\n      # The model summary op should have a few summary inputs and all of them\n      # should be on the CPU.\n      self.assertTrue(model.summary_op.op.inputs)\n      for inp in  model.summary_op.op.inputs:\n        self.assertEqual('/device:CPU:0', inp.device)\n\n  def testNoSummariesOnGPUForEvals(self):\n    with tf.Graph().as_default():\n      deploy_config = model_deploy.DeploymentConfig(num_clones=2)\n\n     
 # clone function creates a fully_connected layer with a regularizer loss.\n      def ModelFn():\n        inputs = tf.constant(1.0, shape=(10, 20), dtype=tf.float32)\n        reg = tf.contrib.layers.l2_regularizer(0.001)\n        tf.contrib.layers.fully_connected(inputs, 30, weights_regularizer=reg)\n\n      # No optimizer here, it's an eval.\n      model = model_deploy.deploy(deploy_config, ModelFn)\n      # The model summary op should have a few summary inputs and all of them\n      # should be on the CPU.\n      self.assertTrue(model.summary_op.op.inputs)\n      for inp in  model.summary_op.op.inputs:\n        self.assertEqual('/device:CPU:0', inp.device)\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/download_and_convert_data.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nr\"\"\"Downloads and converts a particular dataset.\n\nUsage:\n```shell\n\n$ python download_and_convert_data.py \\\n    --dataset_name=mnist \\\n    --dataset_dir=/tmp/mnist\n\n$ python download_and_convert_data.py \\\n    --dataset_name=cifar10 \\\n    --dataset_dir=/tmp/cifar10\n\n$ python download_and_convert_data.py \\\n    --dataset_name=flowers \\\n    --dataset_dir=/tmp/flowers\n```\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom datasets import download_and_convert_cifar10\nfrom datasets import download_and_convert_flowers\nfrom datasets import download_and_convert_mnist\n\nFLAGS = tf.app.flags.FLAGS\n\ntf.app.flags.DEFINE_string(\n    'dataset_name',\n    None,\n    'The name of the dataset to convert, one of \"cifar10\", \"flowers\", \"mnist\".')\n\ntf.app.flags.DEFINE_string(\n    'dataset_dir',\n    None,\n    'The directory where the output TFRecords and temporary files are saved.')\n\n\ndef main(_):\n  if not FLAGS.dataset_name:\n    raise ValueError('You must supply the dataset name with --dataset_name')\n  if not FLAGS.dataset_dir:\n    raise ValueError('You must supply the dataset directory with --dataset_dir')\n\n  if 
FLAGS.dataset_name == 'cifar10':\n    download_and_convert_cifar10.run(FLAGS.dataset_dir)\n  elif FLAGS.dataset_name == 'flowers':\n    download_and_convert_flowers.run(FLAGS.dataset_dir)\n  elif FLAGS.dataset_name == 'mnist':\n    download_and_convert_mnist.run(FLAGS.dataset_dir)\n  else:\n    raise ValueError(\n        'dataset_name [%s] was not recognized.' % FLAGS.dataset_name)\n\nif __name__ == '__main__':\n  tf.app.run()\n\n"
  },
  {
    "path": "model_zoo/models/slim/eval_image_classifier.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Generic evaluation script that evaluates a model using a given dataset.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport math\nimport tensorflow as tf\n\nfrom datasets import dataset_factory\nfrom nets import nets_factory\nfrom preprocessing import preprocessing_factory\n\nslim = tf.contrib.slim\n\ntf.app.flags.DEFINE_integer(\n    'batch_size', 100, 'The number of samples in each batch.')\n\ntf.app.flags.DEFINE_integer(\n    'max_num_batches', None,\n    'Max number of batches to evaluate by default use all.')\n\ntf.app.flags.DEFINE_string(\n    'master', '', 'The address of the TensorFlow master to use.')\n\ntf.app.flags.DEFINE_string(\n    'checkpoint_path', '/tmp/tfmodel/',\n    'The directory where the model was written to or an absolute path to a '\n    'checkpoint file.')\n\ntf.app.flags.DEFINE_string(\n    'eval_dir', '/tmp/tfmodel/', 'Directory where the results are saved to.')\n\ntf.app.flags.DEFINE_integer(\n    'num_preprocessing_threads', 4,\n    'The number of threads used to create the batches.')\n\ntf.app.flags.DEFINE_string(\n    'dataset_name', 'imagenet', 'The name of the dataset to load.')\n\ntf.app.flags.DEFINE_string(\n    'dataset_split_name', 'test', 
'The name of the train/test split.')\n\ntf.app.flags.DEFINE_string(\n    'dataset_dir', None, 'The directory where the dataset files are stored.')\n\ntf.app.flags.DEFINE_integer(\n    'labels_offset', 0,\n    'An offset for the labels in the dataset. This flag is primarily used to '\n    'evaluate the VGG and ResNet architectures which do not use a background '\n    'class for the ImageNet dataset.')\n\ntf.app.flags.DEFINE_string(\n    'model_name', 'inception_v3', 'The name of the architecture to evaluate.')\n\ntf.app.flags.DEFINE_string(\n    'preprocessing_name', None, 'The name of the preprocessing to use. If left '\n    'as `None`, then the model_name flag is used.')\n\ntf.app.flags.DEFINE_float(\n    'moving_average_decay', None,\n    'The decay to use for the moving average.'\n    'If left as None, then moving averages are not used.')\n\ntf.app.flags.DEFINE_integer(\n    'eval_image_size', None, 'Eval image size')\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef main(_):\n  if not FLAGS.dataset_dir:\n    raise ValueError('You must supply the dataset directory with --dataset_dir')\n\n  tf.logging.set_verbosity(tf.logging.INFO)\n  with tf.Graph().as_default():\n    tf_global_step = slim.get_or_create_global_step()\n\n    ######################\n    # Select the dataset #\n    ######################\n    dataset = dataset_factory.get_dataset(\n        FLAGS.dataset_name, FLAGS.dataset_split_name, FLAGS.dataset_dir)\n\n    ####################\n    # Select the model #\n    ####################\n    network_fn = nets_factory.get_network_fn(\n        FLAGS.model_name,\n        num_classes=(dataset.num_classes - FLAGS.labels_offset),\n        is_training=False)\n\n    ##############################################################\n    # Create a dataset provider that loads data from the dataset #\n    ##############################################################\n    provider = slim.dataset_data_provider.DatasetDataProvider(\n        dataset,\n        shuffle=False,\n      
  common_queue_capacity=2 * FLAGS.batch_size,\n        common_queue_min=FLAGS.batch_size)\n    [image, label] = provider.get(['image', 'label'])\n    label -= FLAGS.labels_offset\n\n    #####################################\n    # Select the preprocessing function #\n    #####################################\n    preprocessing_name = FLAGS.preprocessing_name or FLAGS.model_name\n    image_preprocessing_fn = preprocessing_factory.get_preprocessing(\n        preprocessing_name,\n        is_training=False)\n\n    eval_image_size = FLAGS.eval_image_size or network_fn.default_image_size\n\n    image = image_preprocessing_fn(image, eval_image_size, eval_image_size)\n\n    images, labels = tf.train.batch(\n        [image, label],\n        batch_size=FLAGS.batch_size,\n        num_threads=FLAGS.num_preprocessing_threads,\n        capacity=5 * FLAGS.batch_size)\n\n    ####################\n    # Define the model #\n    ####################\n    logits, _ = network_fn(images)\n\n    if FLAGS.moving_average_decay:\n      variable_averages = tf.train.ExponentialMovingAverage(\n          FLAGS.moving_average_decay, tf_global_step)\n      variables_to_restore = variable_averages.variables_to_restore(\n          slim.get_model_variables())\n      variables_to_restore[tf_global_step.op.name] = tf_global_step\n    else:\n      variables_to_restore = slim.get_variables_to_restore()\n\n    predictions = tf.argmax(logits, 1)\n    labels = tf.squeeze(labels)\n\n    # Define the metrics:\n    names_to_values, names_to_updates = slim.metrics.aggregate_metric_map({\n        'Accuracy': slim.metrics.streaming_accuracy(predictions, labels),\n        'Recall@5': slim.metrics.streaming_recall_at_k(\n            logits, labels, 5),\n    })\n\n    # Print the summaries to screen.\n    for name, value in names_to_values.iteritems():\n      summary_name = 'eval/%s' % name\n      op = tf.scalar_summary(summary_name, value, collections=[])\n      op = tf.Print(op, [value], summary_name)\n      
tf.add_to_collection(tf.GraphKeys.SUMMARIES, op)\n\n    # TODO(sguada) use num_epochs=1\n    if FLAGS.max_num_batches:\n      num_batches = FLAGS.max_num_batches\n    else:\n      # This ensures that we make a single pass over all of the data.\n      num_batches = math.ceil(dataset.num_samples / float(FLAGS.batch_size))\n\n    if tf.gfile.IsDirectory(FLAGS.checkpoint_path):\n      checkpoint_path = tf.train.latest_checkpoint(FLAGS.checkpoint_path)\n    else:\n      checkpoint_path = FLAGS.checkpoint_path\n\n    tf.logging.info('Evaluating %s' % checkpoint_path)\n\n    slim.evaluation.evaluate_once(\n        master=FLAGS.master,\n        checkpoint_path=checkpoint_path,\n        logdir=FLAGS.eval_dir,\n        num_evals=num_batches,\n        eval_op=names_to_updates.values(),\n        variables_to_restore=variables_to_restore)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/__init__.py",
    "content": "\n"
  },
  {
    "path": "model_zoo/models/slim/nets/alexnet.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains a model definition for AlexNet.\n\nThis work was first described in:\n  ImageNet Classification with Deep Convolutional Neural Networks\n  Alex Krizhevsky, Ilya Sutskever and Geoffrey E. Hinton\n\nand later refined in:\n  One weird trick for parallelizing convolutional neural networks\n  Alex Krizhevsky, 2014\n\nHere we provide the implementation proposed in \"One weird trick\" and not\n\"ImageNet Classification\", as per the paper, the LRN layers have been removed.\n\nUsage:\n  with slim.arg_scope(alexnet.alexnet_v2_arg_scope()):\n    outputs, end_points = alexnet.alexnet_v2(inputs)\n\n@@alexnet_v2\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\ntrunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)\n\n\ndef alexnet_v2_arg_scope(weight_decay=0.0005):\n  with slim.arg_scope([slim.conv2d, slim.fully_connected],\n                      activation_fn=tf.nn.relu,\n                      biases_initializer=tf.constant_initializer(0.1),\n                      weights_regularizer=slim.l2_regularizer(weight_decay)):\n    with slim.arg_scope([slim.conv2d], padding='SAME'):\n      with 
slim.arg_scope([slim.max_pool2d], padding='VALID') as arg_sc:\n        return arg_sc\n\n\ndef alexnet_v2(inputs,\n               num_classes=1000,\n               is_training=True,\n               dropout_keep_prob=0.5,\n               spatial_squeeze=True,\n               scope='alexnet_v2'):\n  \"\"\"AlexNet version 2.\n\n  Described in: http://arxiv.org/pdf/1404.5997v2.pdf\n  Parameters from:\n  github.com/akrizhevsky/cuda-convnet2/blob/master/layers/\n  layers-imagenet-1gpu.cfg\n\n  Note: All the fully_connected layers have been transformed to conv2d layers.\n        To use in classification mode, resize input to 224x224. To use in fully\n        convolutional mode, set spatial_squeeze to false.\n        The LRN layers have been removed and the initializers changed from\n        random_normal_initializer to xavier_initializer.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether or not the model is being trained.\n    dropout_keep_prob: the probability that activations are kept in the dropout\n      layers during training.\n    spatial_squeeze: whether or not to squeeze the spatial dimensions of the\n      outputs. 
Useful to remove unnecessary dimensions for classification.\n    scope: Optional scope for the variables.\n\n  Returns:\n    the last op containing the log predictions and end_points dict.\n  \"\"\"\n  with tf.variable_scope(scope, 'alexnet_v2', [inputs]) as sc:\n    end_points_collection = sc.name + '_end_points'\n    # Collect outputs for conv2d, fully_connected and max_pool2d.\n    with slim.arg_scope([slim.conv2d, slim.fully_connected, slim.max_pool2d],\n                        outputs_collections=[end_points_collection]):\n      net = slim.conv2d(inputs, 64, [11, 11], 4, padding='VALID',\n                        scope='conv1')\n      net = slim.max_pool2d(net, [3, 3], 2, scope='pool1')\n      net = slim.conv2d(net, 192, [5, 5], scope='conv2')\n      net = slim.max_pool2d(net, [3, 3], 2, scope='pool2')\n      net = slim.conv2d(net, 384, [3, 3], scope='conv3')\n      net = slim.conv2d(net, 384, [3, 3], scope='conv4')\n      net = slim.conv2d(net, 256, [3, 3], scope='conv5')\n      net = slim.max_pool2d(net, [3, 3], 2, scope='pool5')\n\n      # Use conv2d instead of fully_connected layers.\n      with slim.arg_scope([slim.conv2d],\n                          weights_initializer=trunc_normal(0.005),\n                          biases_initializer=tf.constant_initializer(0.1)):\n        net = slim.conv2d(net, 4096, [5, 5], padding='VALID',\n                          scope='fc6')\n        net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                           scope='dropout6')\n        net = slim.conv2d(net, 4096, [1, 1], scope='fc7')\n        net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                           scope='dropout7')\n        net = slim.conv2d(net, num_classes, [1, 1],\n                          activation_fn=None,\n                          normalizer_fn=None,\n                          biases_initializer=tf.zeros_initializer,\n                          scope='fc8')\n\n      # Convert end_points_collection 
into an end_point dict.\n      end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n      if spatial_squeeze:\n        net = tf.squeeze(net, [1, 2], name='fc8/squeezed')\n        end_points[sc.name + '/fc8'] = net\n      return net, end_points\nalexnet_v2.default_image_size = 224\n"
  },
  {
    "path": "model_zoo/models/slim/nets/alexnet_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.nets.alexnet.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import alexnet\n\nslim = tf.contrib.slim\n\n\nclass AlexnetV2Test(tf.test.TestCase):\n\n  def testBuild(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = alexnet.alexnet_v2(inputs, num_classes)\n      self.assertEquals(logits.op.name, 'alexnet_v2/fc8/squeezed')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testFullyConvolutional(self):\n    batch_size = 1\n    height, width = 300, 400\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = alexnet.alexnet_v2(inputs, num_classes, spatial_squeeze=False)\n      self.assertEquals(logits.op.name, 'alexnet_v2/fc8/BiasAdd')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, 4, 7, num_classes])\n\n  def testEndPoints(self):\n    batch_size = 5\n    height, width = 224, 224\n    
num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = alexnet.alexnet_v2(inputs, num_classes)\n      expected_names = ['alexnet_v2/conv1',\n                        'alexnet_v2/pool1',\n                        'alexnet_v2/conv2',\n                        'alexnet_v2/pool2',\n                        'alexnet_v2/conv3',\n                        'alexnet_v2/conv4',\n                        'alexnet_v2/conv5',\n                        'alexnet_v2/pool5',\n                        'alexnet_v2/fc6',\n                        'alexnet_v2/fc7',\n                        'alexnet_v2/fc8'\n                       ]\n      self.assertSetEqual(set(end_points.keys()), set(expected_names))\n\n  def testModelVariables(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      alexnet.alexnet_v2(inputs, num_classes)\n      expected_names = ['alexnet_v2/conv1/weights',\n                        'alexnet_v2/conv1/biases',\n                        'alexnet_v2/conv2/weights',\n                        'alexnet_v2/conv2/biases',\n                        'alexnet_v2/conv3/weights',\n                        'alexnet_v2/conv3/biases',\n                        'alexnet_v2/conv4/weights',\n                        'alexnet_v2/conv4/biases',\n                        'alexnet_v2/conv5/weights',\n                        'alexnet_v2/conv5/biases',\n                        'alexnet_v2/fc6/weights',\n                        'alexnet_v2/fc6/biases',\n                        'alexnet_v2/fc7/weights',\n                        'alexnet_v2/fc7/biases',\n                        'alexnet_v2/fc8/weights',\n                        'alexnet_v2/fc8/biases',\n                       ]\n      model_variables = [v.op.name for v in slim.get_model_variables()]\n      self.assertSetEqual(set(model_variables), 
set(expected_names))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = alexnet.alexnet_v2(eval_inputs, is_training=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      predictions = tf.argmax(logits, 1)\n      self.assertListEqual(predictions.get_shape().as_list(), [batch_size])\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 2\n    eval_batch_size = 1\n    train_height, train_width = 224, 224\n    eval_height, eval_width = 300, 400\n    num_classes = 1000\n    with self.test_session():\n      train_inputs = tf.random_uniform(\n          (train_batch_size, train_height, train_width, 3))\n      logits, _ = alexnet.alexnet_v2(train_inputs)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [train_batch_size, num_classes])\n      tf.get_variable_scope().reuse_variables()\n      eval_inputs = tf.random_uniform(\n          (eval_batch_size, eval_height, eval_width, 3))\n      logits, _ = alexnet.alexnet_v2(eval_inputs, is_training=False,\n                                     spatial_squeeze=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [eval_batch_size, 4, 7, num_classes])\n      logits = tf.reduce_mean(logits, [1, 2])\n      predictions = tf.argmax(logits, 1)\n      self.assertEquals(predictions.get_shape().as_list(), [eval_batch_size])\n\n  def testForward(self):\n    batch_size = 1\n    height, width = 224, 224\n    with self.test_session() as sess:\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = alexnet.alexnet_v2(inputs)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits)\n      self.assertTrue(output.any())\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/cifarnet.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains a variant of the CIFAR-10 model definition.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\ntrunc_normal = lambda stddev: tf.truncated_normal_initializer(stddev=stddev)\n\n\ndef cifarnet(images, num_classes=10, is_training=False,\n             dropout_keep_prob=0.5,\n             prediction_fn=slim.softmax,\n             scope='CifarNet'):\n  \"\"\"Creates a variant of the CifarNet model.\n\n  Note that since the output is a set of 'logits', the values fall in the\n  interval of (-infinity, infinity). 
Consequently, to convert the outputs to a\n  probability distribution over the classes, one will need to convert them\n  using the softmax function:\n\n        logits, end_points = cifarnet.cifarnet(images, is_training=False)\n        probabilities = tf.nn.softmax(logits)\n        predictions = tf.argmax(logits, 1)\n\n  Args:\n    images: A batch of `Tensors` of size [batch_size, height, width, channels].\n    num_classes: the number of classes in the dataset.\n    is_training: specifies whether or not we're currently training the model.\n      This variable will determine the behaviour of the dropout layer.\n    dropout_keep_prob: the percentage of activation values that are retained.\n    prediction_fn: a function to get predictions out of logits.\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the pre-softmax activations, a tensor of size\n      [batch_size, `num_classes`]\n    end_points: a dictionary from components of the network to the corresponding\n      activation.\n  \"\"\"\n  end_points = {}\n\n  with tf.variable_scope(scope, 'CifarNet', [images, num_classes]):\n    net = slim.conv2d(images, 64, [5, 5], scope='conv1')\n    end_points['conv1'] = net\n    net = slim.max_pool2d(net, [2, 2], 2, scope='pool1')\n    end_points['pool1'] = net\n    net = tf.nn.lrn(net, 4, bias=1.0, alpha=0.001/9.0, beta=0.75, name='norm1')\n    net = slim.conv2d(net, 64, [5, 5], scope='conv2')\n    end_points['conv2'] = net\n    net = tf.nn.lrn(net, 4, bias=1.0, alpha=0.001/9.0, beta=0.75, name='norm2')\n    net = slim.max_pool2d(net, [2, 2], 2, scope='pool2')\n    end_points['pool2'] = net\n    net = slim.flatten(net)\n    end_points['Flatten'] = net\n    net = slim.fully_connected(net, 384, scope='fc3')\n    end_points['fc3'] = net\n    net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                       scope='dropout3')\n    net = slim.fully_connected(net, 192, scope='fc4')\n    end_points['fc4'] = net\n    logits = slim.fully_connected(net, 
num_classes,\n                                  biases_initializer=tf.zeros_initializer,\n                                  weights_initializer=trunc_normal(1/192.0),\n                                  weights_regularizer=None,\n                                  activation_fn=None,\n                                  scope='logits')\n\n    end_points['Logits'] = logits\n    end_points['Predictions'] = prediction_fn(logits, scope='Predictions')\n\n  return logits, end_points\ncifarnet.default_image_size = 32\n\n\ndef cifarnet_arg_scope(weight_decay=0.004):\n  \"\"\"Defines the default cifarnet argument scope.\n\n  Args:\n    weight_decay: The weight decay to use for regularizing the model.\n\n  Returns:\n    An `arg_scope` to use for the cifarnet model.\n  \"\"\"\n  with slim.arg_scope(\n      [slim.conv2d],\n      weights_initializer=tf.truncated_normal_initializer(stddev=5e-2),\n      activation_fn=tf.nn.relu):\n    with slim.arg_scope(\n        [slim.fully_connected],\n        biases_initializer=tf.constant_initializer(0.1),\n        weights_initializer=trunc_normal(0.04),\n        weights_regularizer=slim.l2_regularizer(weight_decay),\n        activation_fn=tf.nn.relu) as sc:\n      return sc\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Brings all inception models under one namespace.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n# pylint: disable=unused-import\nfrom nets.inception_resnet_v2 import inception_resnet_v2\nfrom nets.inception_resnet_v2 import inception_resnet_v2_arg_scope\nfrom nets.inception_v1 import inception_v1\nfrom nets.inception_v1 import inception_v1_arg_scope\nfrom nets.inception_v1 import inception_v1_base\nfrom nets.inception_v2 import inception_v2\nfrom nets.inception_v2 import inception_v2_arg_scope\nfrom nets.inception_v2 import inception_v2_base\nfrom nets.inception_v3 import inception_v3\nfrom nets.inception_v3 import inception_v3_arg_scope\nfrom nets.inception_v3 import inception_v3_base\nfrom nets.inception_v4 import inception_v4\nfrom nets.inception_v4 import inception_v4_arg_scope\nfrom nets.inception_v4 import inception_v4_base\n# pylint: enable=unused-import\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_resnet_v2.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the definition of the Inception Resnet V2 architecture.\n\nAs described in http://arxiv.org/abs/1602.07261.\n\n  Inception-v4, Inception-ResNet and the Impact of Residual Connections\n    on Learning\n  Christian Szegedy, Sergey Ioffe, Vincent Vanhoucke, Alex Alemi\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\n\ndef block35(net, scale=1.0, activation_fn=tf.nn.relu, scope=None, reuse=None):\n  \"\"\"Builds the 35x35 resnet block.\"\"\"\n  with tf.variable_scope(scope, 'Block35', [net], reuse=reuse):\n    with tf.variable_scope('Branch_0'):\n      tower_conv = slim.conv2d(net, 32, 1, scope='Conv2d_1x1')\n    with tf.variable_scope('Branch_1'):\n      tower_conv1_0 = slim.conv2d(net, 32, 1, scope='Conv2d_0a_1x1')\n      tower_conv1_1 = slim.conv2d(tower_conv1_0, 32, 3, scope='Conv2d_0b_3x3')\n    with tf.variable_scope('Branch_2'):\n      tower_conv2_0 = slim.conv2d(net, 32, 1, scope='Conv2d_0a_1x1')\n      tower_conv2_1 = slim.conv2d(tower_conv2_0, 48, 3, scope='Conv2d_0b_3x3')\n      tower_conv2_2 = slim.conv2d(tower_conv2_1, 64, 3, scope='Conv2d_0c_3x3')\n    mixed = tf.concat(3, [tower_conv, tower_conv1_1, 
tower_conv2_2])\n    up = slim.conv2d(mixed, net.get_shape()[3], 1, normalizer_fn=None,\n                     activation_fn=None, scope='Conv2d_1x1')\n    net += scale * up\n    if activation_fn:\n      net = activation_fn(net)\n  return net\n\n\ndef block17(net, scale=1.0, activation_fn=tf.nn.relu, scope=None, reuse=None):\n  \"\"\"Builds the 17x17 resnet block.\"\"\"\n  with tf.variable_scope(scope, 'Block17', [net], reuse=reuse):\n    with tf.variable_scope('Branch_0'):\n      tower_conv = slim.conv2d(net, 192, 1, scope='Conv2d_1x1')\n    with tf.variable_scope('Branch_1'):\n      tower_conv1_0 = slim.conv2d(net, 128, 1, scope='Conv2d_0a_1x1')\n      tower_conv1_1 = slim.conv2d(tower_conv1_0, 160, [1, 7],\n                                  scope='Conv2d_0b_1x7')\n      tower_conv1_2 = slim.conv2d(tower_conv1_1, 192, [7, 1],\n                                  scope='Conv2d_0c_7x1')\n    mixed = tf.concat(3, [tower_conv, tower_conv1_2])\n    up = slim.conv2d(mixed, net.get_shape()[3], 1, normalizer_fn=None,\n                     activation_fn=None, scope='Conv2d_1x1')\n    net += scale * up\n    if activation_fn:\n      net = activation_fn(net)\n  return net\n\n\ndef block8(net, scale=1.0, activation_fn=tf.nn.relu, scope=None, reuse=None):\n  \"\"\"Builds the 8x8 resnet block.\"\"\"\n  with tf.variable_scope(scope, 'Block8', [net], reuse=reuse):\n    with tf.variable_scope('Branch_0'):\n      tower_conv = slim.conv2d(net, 192, 1, scope='Conv2d_1x1')\n    with tf.variable_scope('Branch_1'):\n      tower_conv1_0 = slim.conv2d(net, 192, 1, scope='Conv2d_0a_1x1')\n      tower_conv1_1 = slim.conv2d(tower_conv1_0, 224, [1, 3],\n                                  scope='Conv2d_0b_1x3')\n      tower_conv1_2 = slim.conv2d(tower_conv1_1, 256, [3, 1],\n                                  scope='Conv2d_0c_3x1')\n    mixed = tf.concat(3, [tower_conv, tower_conv1_2])\n    up = slim.conv2d(mixed, net.get_shape()[3], 1, normalizer_fn=None,\n                     activation_fn=None, 
scope='Conv2d_1x1')\n    net += scale * up\n    if activation_fn:\n      net = activation_fn(net)\n  return net\n\n\ndef inception_resnet_v2(inputs, num_classes=1001, is_training=True,\n                        dropout_keep_prob=0.8,\n                        reuse=None,\n                        scope='InceptionResnetV2'):\n  \"\"\"Creates the Inception Resnet V2 model.\n\n  Args:\n    inputs: a 4-D tensor of size [batch_size, height, width, 3].\n    num_classes: number of predicted classes.\n    is_training: whether is training or not.\n    dropout_keep_prob: float, the fraction to keep before final layer.\n    reuse: whether or not the network and its variables should be reused. To be\n      able to reuse 'scope' must be given.\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the logits outputs of the model.\n    end_points: the set of end_points from the inception model.\n  \"\"\"\n  end_points = {}\n\n  with tf.variable_scope(scope, 'InceptionResnetV2', [inputs], reuse=reuse):\n    with slim.arg_scope([slim.batch_norm, slim.dropout],\n                        is_training=is_training):\n      with slim.arg_scope([slim.conv2d, slim.max_pool2d, slim.avg_pool2d],\n                          stride=1, padding='SAME'):\n\n        # 149 x 149 x 32\n        net = slim.conv2d(inputs, 32, 3, stride=2, padding='VALID',\n                          scope='Conv2d_1a_3x3')\n        end_points['Conv2d_1a_3x3'] = net\n        # 147 x 147 x 32\n        net = slim.conv2d(net, 32, 3, padding='VALID',\n                          scope='Conv2d_2a_3x3')\n        end_points['Conv2d_2a_3x3'] = net\n        # 147 x 147 x 64\n        net = slim.conv2d(net, 64, 3, scope='Conv2d_2b_3x3')\n        end_points['Conv2d_2b_3x3'] = net\n        # 73 x 73 x 64\n        net = slim.max_pool2d(net, 3, stride=2, padding='VALID',\n                              scope='MaxPool_3a_3x3')\n        end_points['MaxPool_3a_3x3'] = net\n        # 73 x 73 x 80\n        net = slim.conv2d(net, 80, 1, 
padding='VALID',\n                          scope='Conv2d_3b_1x1')\n        end_points['Conv2d_3b_1x1'] = net\n        # 71 x 71 x 192\n        net = slim.conv2d(net, 192, 3, padding='VALID',\n                          scope='Conv2d_4a_3x3')\n        end_points['Conv2d_4a_3x3'] = net\n        # 35 x 35 x 192\n        net = slim.max_pool2d(net, 3, stride=2, padding='VALID',\n                              scope='MaxPool_5a_3x3')\n        end_points['MaxPool_5a_3x3'] = net\n\n        # 35 x 35 x 320\n        with tf.variable_scope('Mixed_5b'):\n          with tf.variable_scope('Branch_0'):\n            tower_conv = slim.conv2d(net, 96, 1, scope='Conv2d_1x1')\n          with tf.variable_scope('Branch_1'):\n            tower_conv1_0 = slim.conv2d(net, 48, 1, scope='Conv2d_0a_1x1')\n            tower_conv1_1 = slim.conv2d(tower_conv1_0, 64, 5,\n                                        scope='Conv2d_0b_5x5')\n          with tf.variable_scope('Branch_2'):\n            tower_conv2_0 = slim.conv2d(net, 64, 1, scope='Conv2d_0a_1x1')\n            tower_conv2_1 = slim.conv2d(tower_conv2_0, 96, 3,\n                                        scope='Conv2d_0b_3x3')\n            tower_conv2_2 = slim.conv2d(tower_conv2_1, 96, 3,\n                                        scope='Conv2d_0c_3x3')\n          with tf.variable_scope('Branch_3'):\n            tower_pool = slim.avg_pool2d(net, 3, stride=1, padding='SAME',\n                                         scope='AvgPool_0a_3x3')\n            tower_pool_1 = slim.conv2d(tower_pool, 64, 1,\n                                       scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [tower_conv, tower_conv1_1,\n                              tower_conv2_2, tower_pool_1])\n\n        end_points['Mixed_5b'] = net\n        net = slim.repeat(net, 10, block35, scale=0.17)\n\n        # 17 x 17 x 1024\n        with tf.variable_scope('Mixed_6a'):\n          with tf.variable_scope('Branch_0'):\n            tower_conv = slim.conv2d(net, 384, 3, stride=2, 
padding='VALID',\n                                     scope='Conv2d_1a_3x3')\n          with tf.variable_scope('Branch_1'):\n            tower_conv1_0 = slim.conv2d(net, 256, 1, scope='Conv2d_0a_1x1')\n            tower_conv1_1 = slim.conv2d(tower_conv1_0, 256, 3,\n                                        scope='Conv2d_0b_3x3')\n            tower_conv1_2 = slim.conv2d(tower_conv1_1, 384, 3,\n                                        stride=2, padding='VALID',\n                                        scope='Conv2d_1a_3x3')\n          with tf.variable_scope('Branch_2'):\n            tower_pool = slim.max_pool2d(net, 3, stride=2, padding='VALID',\n                                         scope='MaxPool_1a_3x3')\n          net = tf.concat(3, [tower_conv, tower_conv1_2, tower_pool])\n\n        end_points['Mixed_6a'] = net\n        net = slim.repeat(net, 20, block17, scale=0.10)\n\n        # Auxiliary tower\n        with tf.variable_scope('AuxLogits'):\n          aux = slim.avg_pool2d(net, 5, stride=3, padding='VALID',\n                                scope='Conv2d_1a_3x3')\n          aux = slim.conv2d(aux, 128, 1, scope='Conv2d_1b_1x1')\n          aux = slim.conv2d(aux, 768, aux.get_shape()[1:3],\n                            padding='VALID', scope='Conv2d_2a_5x5')\n          aux = slim.flatten(aux)\n          aux = slim.fully_connected(aux, num_classes, activation_fn=None,\n                                     scope='Logits')\n          end_points['AuxLogits'] = aux\n\n        with tf.variable_scope('Mixed_7a'):\n          with tf.variable_scope('Branch_0'):\n            tower_conv = slim.conv2d(net, 256, 1, scope='Conv2d_0a_1x1')\n            tower_conv_1 = slim.conv2d(tower_conv, 384, 3, stride=2,\n                                       padding='VALID', scope='Conv2d_1a_3x3')\n          with tf.variable_scope('Branch_1'):\n            tower_conv1 = slim.conv2d(net, 256, 1, scope='Conv2d_0a_1x1')\n            tower_conv1_1 = slim.conv2d(tower_conv1, 288, 3, stride=2,\n   
                                     padding='VALID', scope='Conv2d_1a_3x3')\n          with tf.variable_scope('Branch_2'):\n            tower_conv2 = slim.conv2d(net, 256, 1, scope='Conv2d_0a_1x1')\n            tower_conv2_1 = slim.conv2d(tower_conv2, 288, 3,\n                                        scope='Conv2d_0b_3x3')\n            tower_conv2_2 = slim.conv2d(tower_conv2_1, 320, 3, stride=2,\n                                        padding='VALID', scope='Conv2d_1a_3x3')\n          with tf.variable_scope('Branch_3'):\n            tower_pool = slim.max_pool2d(net, 3, stride=2, padding='VALID',\n                                         scope='MaxPool_1a_3x3')\n          net = tf.concat(3, [tower_conv_1, tower_conv1_1,\n                              tower_conv2_2, tower_pool])\n\n        end_points['Mixed_7a'] = net\n\n        net = slim.repeat(net, 9, block8, scale=0.20)\n        net = block8(net, activation_fn=None)\n\n        net = slim.conv2d(net, 1536, 1, scope='Conv2d_7b_1x1')\n        end_points['Conv2d_7b_1x1'] = net\n\n        with tf.variable_scope('Logits'):\n          end_points['PrePool'] = net\n          net = slim.avg_pool2d(net, net.get_shape()[1:3], padding='VALID',\n                                scope='AvgPool_1a_8x8')\n          net = slim.flatten(net)\n\n          net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                             scope='Dropout')\n\n          end_points['PreLogitsFlatten'] = net\n          logits = slim.fully_connected(net, num_classes, activation_fn=None,\n                                        scope='Logits')\n          end_points['Logits'] = logits\n          end_points['Predictions'] = tf.nn.softmax(logits, name='Predictions')\n\n    return logits, end_points\ninception_resnet_v2.default_image_size = 299\n\n\ndef inception_resnet_v2_arg_scope(weight_decay=0.00004,\n                                  batch_norm_decay=0.9997,\n                                  batch_norm_epsilon=0.001):\n  
\"\"\"Yields the scope with the default parameters for inception_resnet_v2.\n\n  Args:\n    weight_decay: the weight decay for weights variables.\n    batch_norm_decay: decay for the moving average of batch_norm momentums.\n    batch_norm_epsilon: small float added to variance to avoid dividing by zero.\n\n  Returns:\n    an arg_scope with the parameters needed for inception_resnet_v2.\n  \"\"\"\n  # Set weight_decay for weights in conv2d and fully_connected layers.\n  with slim.arg_scope([slim.conv2d, slim.fully_connected],\n                      weights_regularizer=slim.l2_regularizer(weight_decay),\n                      biases_regularizer=slim.l2_regularizer(weight_decay)):\n\n    batch_norm_params = {\n        'decay': batch_norm_decay,\n        'epsilon': batch_norm_epsilon,\n    }\n    # Set activation_fn and parameters for batch_norm.\n    with slim.arg_scope([slim.conv2d], activation_fn=tf.nn.relu,\n                        normalizer_fn=slim.batch_norm,\n                        normalizer_params=batch_norm_params) as scope:\n      return scope\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_resnet_v2_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.inception_resnet_v2.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import inception\n\n\nclass InceptionTest(tf.test.TestCase):\n\n  def testBuildLogits(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = inception.inception_resnet_v2(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('InceptionResnetV2/Logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testBuildEndPoints(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = inception.inception_resnet_v2(inputs, num_classes)\n      self.assertTrue('Logits' in end_points)\n      logits = end_points['Logits']\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      self.assertTrue('AuxLogits' in end_points)\n      aux_logits = 
end_points['AuxLogits']\n      self.assertListEqual(aux_logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['PrePool']\n      self.assertListEqual(pre_pool.get_shape().as_list(),\n                           [batch_size, 8, 8, 1536])\n\n  def testVariablesSetDevice(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      # Force all Variables to reside on the device.\n      with tf.variable_scope('on_cpu'), tf.device('/cpu:0'):\n        inception.inception_resnet_v2(inputs, num_classes)\n      with tf.variable_scope('on_gpu'), tf.device('/gpu:0'):\n        inception.inception_resnet_v2(inputs, num_classes)\n      for v in tf.get_collection(tf.GraphKeys.VARIABLES, scope='on_cpu'):\n        self.assertDeviceEqual(v.device, '/cpu:0')\n      for v in tf.get_collection(tf.GraphKeys.VARIABLES, scope='on_gpu'):\n        self.assertDeviceEqual(v.device, '/gpu:0')\n\n  def testHalfSizeImages(self):\n    batch_size = 5\n    height, width = 150, 150\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, end_points = inception.inception_resnet_v2(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('InceptionResnetV2/Logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['PrePool']\n      self.assertListEqual(pre_pool.get_shape().as_list(),\n                           [batch_size, 3, 3, 1536])\n\n  def testUnknownBatchSize(self):\n    batch_size = 1\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session() as sess:\n      inputs = tf.placeholder(tf.float32, (None, height, width, 3))\n      logits, _ = inception.inception_resnet_v2(inputs, num_classes)\n      
self.assertTrue(logits.op.name.startswith('InceptionResnetV2/Logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [None, num_classes])\n      images = tf.random_uniform((batch_size, height, width, 3))\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEquals(output.shape, (batch_size, num_classes))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session() as sess:\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = inception.inception_resnet_v2(eval_inputs,\n                                                num_classes,\n                                                is_training=False)\n      predictions = tf.argmax(logits, 1)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (batch_size,))\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 5\n    eval_batch_size = 2\n    height, width = 150, 150\n    num_classes = 1000\n    with self.test_session() as sess:\n      train_inputs = tf.random_uniform((train_batch_size, height, width, 3))\n      inception.inception_resnet_v2(train_inputs, num_classes)\n      eval_inputs = tf.random_uniform((eval_batch_size, height, width, 3))\n      logits, _ = inception.inception_resnet_v2(eval_inputs,\n                                                num_classes,\n                                                is_training=False,\n                                                reuse=True)\n      predictions = tf.argmax(logits, 1)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (eval_batch_size,))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_utils.py",
"content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains common code shared by all inception models.\n\nUsage of arg scope:\n  with slim.arg_scope(inception_arg_scope()):\n    logits, end_points = inception.inception_v3(images, num_classes,\n                                                is_training=is_training)\n\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\n\ndef inception_arg_scope(weight_decay=0.00004,\n                        use_batch_norm=True,\n                        batch_norm_decay=0.9997,\n                        batch_norm_epsilon=0.001):\n  \"\"\"Defines the default arg scope for inception models.\n\n  Args:\n    weight_decay: The weight decay to use for regularizing the model.\n    use_batch_norm: If `True`, batch_norm is applied after each convolution.\n    batch_norm_decay: Decay for batch norm moving average.\n    batch_norm_epsilon: Small float added to variance to avoid dividing by zero\n      in batch norm.\n\n  Returns:\n    An `arg_scope` to use for the inception models.\n  \"\"\"\n  batch_norm_params = {\n      # Decay for the moving averages.\n      'decay': batch_norm_decay,\n      # epsilon to prevent 0s in variance.\n      'epsilon': 
batch_norm_epsilon,\n      # collection containing update_ops.\n      'updates_collections': tf.GraphKeys.UPDATE_OPS,\n  }\n  if use_batch_norm:\n    normalizer_fn = slim.batch_norm\n    normalizer_params = batch_norm_params\n  else:\n    normalizer_fn = None\n    normalizer_params = {}\n  # Set weight_decay for weights in Conv and FC layers.\n  with slim.arg_scope([slim.conv2d, slim.fully_connected],\n                      weights_regularizer=slim.l2_regularizer(weight_decay)):\n    with slim.arg_scope(\n        [slim.conv2d],\n        weights_initializer=slim.variance_scaling_initializer(),\n        activation_fn=tf.nn.relu,\n        normalizer_fn=normalizer_fn,\n        normalizer_params=normalizer_params) as sc:\n      return sc\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v1.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the definition for inception v1 classification network.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import inception_utils\n\nslim = tf.contrib.slim\ntrunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)\n\n\ndef inception_v1_base(inputs,\n                      final_endpoint='Mixed_5c',\n                      scope='InceptionV1'):\n  \"\"\"Defines the Inception V1 base architecture.\n\n  This architecture is defined in:\n    Going deeper with convolutions\n    Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed,\n    Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, Andrew Rabinovich.\n    http://arxiv.org/pdf/1409.4842v1.pdf.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    final_endpoint: specifies the endpoint to construct the network up to. 
It\n      can be one of ['Conv2d_1a_7x7', 'MaxPool_2a_3x3', 'Conv2d_2b_1x1',\n      'Conv2d_2c_3x3', 'MaxPool_3a_3x3', 'Mixed_3b', 'Mixed_3c',\n      'MaxPool_4a_3x3', 'Mixed_4b', 'Mixed_4c', 'Mixed_4d', 'Mixed_4e',\n      'Mixed_4f', 'MaxPool_5a_2x2', 'Mixed_5b', 'Mixed_5c']\n    scope: Optional variable_scope.\n\n  Returns:\n    A dictionary from components of the network to the corresponding activation.\n\n  Raises:\n    ValueError: if final_endpoint is not set to one of the predefined values.\n  \"\"\"\n  end_points = {}\n  with tf.variable_scope(scope, 'InceptionV1', [inputs]):\n    with slim.arg_scope(\n        [slim.conv2d, slim.fully_connected],\n        weights_initializer=trunc_normal(0.01)):\n      with slim.arg_scope([slim.conv2d, slim.max_pool2d],\n                          stride=1, padding='SAME'):\n        end_point = 'Conv2d_1a_7x7'\n        net = slim.conv2d(inputs, 64, [7, 7], stride=2, scope=end_point)\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n        end_point = 'MaxPool_2a_3x3'\n        net = slim.max_pool2d(net, [3, 3], stride=2, scope=end_point)\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n        end_point = 'Conv2d_2b_1x1'\n        net = slim.conv2d(net, 64, [1, 1], scope=end_point)\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n        end_point = 'Conv2d_2c_3x3'\n        net = slim.conv2d(net, 192, [3, 3], scope=end_point)\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n        end_point = 'MaxPool_3a_3x3'\n        net = slim.max_pool2d(net, [3, 3], stride=2, scope=end_point)\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_3b'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            
branch_0 = slim.conv2d(net, 64, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 96, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 128, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 16, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 32, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 32, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_3c'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 128, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 128, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 192, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 32, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 96, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 64, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'MaxPool_4a_3x3'\n        net = slim.max_pool2d(net, [3, 3], stride=2, scope=end_point)\n        end_points[end_point] = net\n        if 
final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_4b'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 192, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 96, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 208, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 16, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 48, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 64, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_4c'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 160, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 112, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 224, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 24, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 64, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 64, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == 
end_point: return net, end_points\n\n        end_point = 'Mixed_4d'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 128, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 128, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 256, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 24, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 64, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 64, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_4e'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 112, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 144, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 288, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 32, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 64, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 64, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return 
net, end_points\n\n        end_point = 'Mixed_4f'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 256, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 160, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 320, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 32, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 128, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 128, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'MaxPool_5a_2x2'\n        net = slim.max_pool2d(net, [2, 2], stride=2, scope=end_point)\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_5b'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 256, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 160, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 320, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 32, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 128, [3, 3], scope='Conv2d_0a_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = 
slim.conv2d(branch_3, 128, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n\n        end_point = 'Mixed_5c'\n        with tf.variable_scope(end_point):\n          with tf.variable_scope('Branch_0'):\n            branch_0 = slim.conv2d(net, 384, [1, 1], scope='Conv2d_0a_1x1')\n          with tf.variable_scope('Branch_1'):\n            branch_1 = slim.conv2d(net, 192, [1, 1], scope='Conv2d_0a_1x1')\n            branch_1 = slim.conv2d(branch_1, 384, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_2'):\n            branch_2 = slim.conv2d(net, 48, [1, 1], scope='Conv2d_0a_1x1')\n            branch_2 = slim.conv2d(branch_2, 128, [3, 3], scope='Conv2d_0b_3x3')\n          with tf.variable_scope('Branch_3'):\n            branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n            branch_3 = slim.conv2d(branch_3, 128, [1, 1], scope='Conv2d_0b_1x1')\n          net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if final_endpoint == end_point: return net, end_points\n    raise ValueError('Unknown final endpoint %s' % final_endpoint)\n\n\ndef inception_v1(inputs,\n                 num_classes=1000,\n                 is_training=True,\n                 dropout_keep_prob=0.8,\n                 prediction_fn=slim.softmax,\n                 spatial_squeeze=True,\n                 reuse=None,\n                 scope='InceptionV1'):\n  \"\"\"Defines the Inception V1 architecture.\n\n  This architecture is defined in:\n\n    Going deeper with convolutions\n    Christian Szegedy, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed,\n    Dragomir Anguelov, Dumitru Erhan, Vincent Vanhoucke, Andrew Rabinovich.\n    http://arxiv.org/pdf/1409.4842v1.pdf.\n\n  The default image size used to train this network is 224x224.\n\n  Args:\n    
inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether is training or not.\n    dropout_keep_prob: the percentage of activation values that are retained.\n    prediction_fn: a function to get predictions out of logits.\n    spatial_squeeze: if True, logits is of shape is [B, C], if false logits is\n        of shape [B, 1, 1, C], where B is batch_size and C is number of classes.\n    reuse: whether or not the network and its variables should be reused. To be\n      able to reuse 'scope' must be given.\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the pre-softmax activations, a tensor of size\n      [batch_size, num_classes]\n    end_points: a dictionary from components of the network to the corresponding\n      activation.\n  \"\"\"\n  # Final pooling and prediction\n  with tf.variable_scope(scope, 'InceptionV1', [inputs, num_classes],\n                         reuse=reuse) as scope:\n    with slim.arg_scope([slim.batch_norm, slim.dropout],\n                        is_training=is_training):\n      net, end_points = inception_v1_base(inputs, scope=scope)\n      with tf.variable_scope('Logits'):\n        net = slim.avg_pool2d(net, [7, 7], stride=1, scope='MaxPool_0a_7x7')\n        net = slim.dropout(net,\n                           dropout_keep_prob, scope='Dropout_0b')\n        logits = slim.conv2d(net, num_classes, [1, 1], activation_fn=None,\n                             normalizer_fn=None, scope='Conv2d_0c_1x1')\n        if spatial_squeeze:\n          logits = tf.squeeze(logits, [1, 2], name='SpatialSqueeze')\n\n        end_points['Logits'] = logits\n        end_points['Predictions'] = prediction_fn(logits, scope='Predictions')\n  return logits, end_points\ninception_v1.default_image_size = 224\n\ninception_v1_arg_scope = inception_utils.inception_arg_scope\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v1_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for nets.inception_v1.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom nets import inception\n\nslim = tf.contrib.slim\n\n\nclass InceptionV1Test(tf.test.TestCase):\n\n  def testBuildClassificationNetwork(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v1(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV1/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue('Predictions' in end_points)\n    self.assertListEqual(end_points['Predictions'].get_shape().as_list(),\n                         [batch_size, num_classes])\n\n  def testBuildBaseNetwork(self):\n    batch_size = 5\n    height, width = 224, 224\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    mixed_6c, end_points = inception.inception_v1_base(inputs)\n    self.assertTrue(mixed_6c.op.name.startswith('InceptionV1/Mixed_5c'))\n    self.assertListEqual(mixed_6c.get_shape().as_list(),\n             
            [batch_size, 7, 7, 1024])\n    expected_endpoints = ['Conv2d_1a_7x7', 'MaxPool_2a_3x3', 'Conv2d_2b_1x1',\n                          'Conv2d_2c_3x3', 'MaxPool_3a_3x3', 'Mixed_3b',\n                          'Mixed_3c', 'MaxPool_4a_3x3', 'Mixed_4b', 'Mixed_4c',\n                          'Mixed_4d', 'Mixed_4e', 'Mixed_4f', 'MaxPool_5a_2x2',\n                          'Mixed_5b', 'Mixed_5c']\n    self.assertItemsEqual(end_points.keys(), expected_endpoints)\n\n  def testBuildOnlyUptoFinalEndpoint(self):\n    batch_size = 5\n    height, width = 224, 224\n    endpoints = ['Conv2d_1a_7x7', 'MaxPool_2a_3x3', 'Conv2d_2b_1x1',\n                 'Conv2d_2c_3x3', 'MaxPool_3a_3x3', 'Mixed_3b', 'Mixed_3c',\n                 'MaxPool_4a_3x3', 'Mixed_4b', 'Mixed_4c', 'Mixed_4d',\n                 'Mixed_4e', 'Mixed_4f', 'MaxPool_5a_2x2', 'Mixed_5b',\n                 'Mixed_5c']\n    for index, endpoint in enumerate(endpoints):\n      with tf.Graph().as_default():\n        inputs = tf.random_uniform((batch_size, height, width, 3))\n        out_tensor, end_points = inception.inception_v1_base(\n            inputs, final_endpoint=endpoint)\n        self.assertTrue(out_tensor.op.name.startswith(\n            'InceptionV1/' + endpoint))\n        self.assertItemsEqual(endpoints[:index+1], end_points)\n\n  def testBuildAndCheckAllEndPointsUptoMixed5c(self):\n    batch_size = 5\n    height, width = 224, 224\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v1_base(inputs,\n                                                final_endpoint='Mixed_5c')\n    endpoints_shapes = {'Conv2d_1a_7x7': [5, 112, 112, 64],\n                        'MaxPool_2a_3x3': [5, 56, 56, 64],\n                        'Conv2d_2b_1x1': [5, 56, 56, 64],\n                        'Conv2d_2c_3x3': [5, 56, 56, 192],\n                        'MaxPool_3a_3x3': [5, 28, 28, 192],\n                        'Mixed_3b': [5, 28, 28, 256],\n                        
'Mixed_3c': [5, 28, 28, 480],\n                        'MaxPool_4a_3x3': [5, 14, 14, 480],\n                        'Mixed_4b': [5, 14, 14, 512],\n                        'Mixed_4c': [5, 14, 14, 512],\n                        'Mixed_4d': [5, 14, 14, 512],\n                        'Mixed_4e': [5, 14, 14, 528],\n                        'Mixed_4f': [5, 14, 14, 832],\n                        'MaxPool_5a_2x2': [5, 7, 7, 832],\n                        'Mixed_5b': [5, 7, 7, 832],\n                        'Mixed_5c': [5, 7, 7, 1024]}\n\n    self.assertItemsEqual(endpoints_shapes.keys(), end_points.keys())\n    for endpoint_name in endpoints_shapes:\n      expected_shape = endpoints_shapes[endpoint_name]\n      self.assertTrue(endpoint_name in end_points)\n      self.assertListEqual(end_points[endpoint_name].get_shape().as_list(),\n                           expected_shape)\n\n  def testModelHasExpectedNumberOfParameters(self):\n    batch_size = 5\n    height, width = 224, 224\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    with slim.arg_scope(inception.inception_v1_arg_scope()):\n      inception.inception_v1_base(inputs)\n    total_params, _ = slim.model_analyzer.analyze_vars(\n        slim.get_model_variables())\n    self.assertAlmostEqual(5607184, total_params)\n\n  def testHalfSizeImages(self):\n    batch_size = 5\n    height, width = 112, 112\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    mixed_5c, _ = inception.inception_v1_base(inputs)\n    self.assertTrue(mixed_5c.op.name.startswith('InceptionV1/Mixed_5c'))\n    self.assertListEqual(mixed_5c.get_shape().as_list(),\n                         [batch_size, 4, 4, 1024])\n\n  def testUnknownImageShape(self):\n    tf.reset_default_graph()\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n    input_np = np.random.uniform(0, 1, (batch_size, height, width, 3))\n    with self.test_session() as sess:\n      inputs = tf.placeholder(tf.float32, 
shape=(batch_size, None, None, 3))\n      logits, end_points = inception.inception_v1(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('InceptionV1/Logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['Mixed_5c']\n      feed_dict = {inputs: input_np}\n      tf.initialize_all_variables().run()\n      pre_pool_out = sess.run(pre_pool, feed_dict=feed_dict)\n      self.assertListEqual(list(pre_pool_out.shape), [batch_size, 7, 7, 1024])\n\n  def testUnknowBatchSize(self):\n    batch_size = 1\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.placeholder(tf.float32, (None, height, width, 3))\n    logits, _ = inception.inception_v1(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV1/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [None, num_classes])\n    images = tf.random_uniform((batch_size, height, width, 3))\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEquals(output.shape, (batch_size, num_classes))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n\n    eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, _ = inception.inception_v1(eval_inputs, num_classes,\n                                       is_training=False)\n    predictions = tf.argmax(logits, 1)\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (batch_size,))\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 5\n    eval_batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n\n    train_inputs = tf.random_uniform((train_batch_size, height, width, 3))\n    
inception.inception_v1(train_inputs, num_classes)\n    eval_inputs = tf.random_uniform((eval_batch_size, height, width, 3))\n    logits, _ = inception.inception_v1(eval_inputs, num_classes, reuse=True)\n    predictions = tf.argmax(logits, 1)\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (eval_batch_size,))\n\n  def testLogitsNotSqueezed(self):\n    num_classes = 25\n    images = tf.random_uniform([1, 224, 224, 3])\n    logits, _ = inception.inception_v1(images,\n                                       num_classes=num_classes,\n                                       spatial_squeeze=False)\n\n    with self.test_session() as sess:\n      tf.initialize_all_variables().run()\n      logits_out = sess.run(logits)\n      self.assertListEqual(list(logits_out.shape), [1, 1, 1, num_classes])\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v2.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the definition for inception v2 classification network.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import inception_utils\n\nslim = tf.contrib.slim\ntrunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)\n\n\ndef inception_v2_base(inputs,\n                      final_endpoint='Mixed_5c',\n                      min_depth=16,\n                      depth_multiplier=1.0,\n                      scope=None):\n  \"\"\"Inception v2 (6a2).\n\n  Constructs an Inception v2 network from inputs to the given final endpoint.\n  This method can construct the network up to the layer inception(5b) as\n  described in http://arxiv.org/abs/1502.03167.\n\n  Args:\n    inputs: a tensor of shape [batch_size, height, width, channels].\n    final_endpoint: specifies the endpoint to construct the network up to. 
It\n      can be one of ['Conv2d_1a_7x7', 'MaxPool_2a_3x3', 'Conv2d_2b_1x1',\n      'Conv2d_2c_3x3', 'MaxPool_3a_3x3', 'Mixed_3b', 'Mixed_3c', 'Mixed_4a',\n      'Mixed_4b', 'Mixed_4c', 'Mixed_4d', 'Mixed_4e', 'Mixed_5a', 'Mixed_5b',\n      'Mixed_5c'].\n    min_depth: Minimum depth value (number of channels) for all convolution ops.\n      Enforced when depth_multiplier < 1, and not an active constraint when\n      depth_multiplier >= 1.\n    depth_multiplier: Float multiplier for the depth (number of channels)\n      for all convolution ops. The value must be greater than zero. Typical\n      usage will be to set this value in (0, 1) to reduce the number of\n      parameters or computation cost of the model.\n    scope: Optional variable_scope.\n\n  Returns:\n    tensor_out: output tensor corresponding to the final_endpoint.\n    end_points: a set of activations for external use, for example summaries or\n                losses.\n\n  Raises:\n    ValueError: if final_endpoint is not set to one of the predefined values,\n                or depth_multiplier <= 0\n  \"\"\"\n\n  # end_points will collect relevant activations for external use, for example\n  # summaries or losses.\n  end_points = {}\n\n  # Used to find thinned depths for each layer.\n  if depth_multiplier <= 0:\n    raise ValueError('depth_multiplier is not greater than zero.')\n  depth = lambda d: max(int(d * depth_multiplier), min_depth)\n\n  with tf.variable_scope(scope, 'InceptionV2', [inputs]):\n    with slim.arg_scope(\n        [slim.conv2d, slim.max_pool2d, slim.avg_pool2d, slim.separable_conv2d],\n        stride=1, padding='SAME'):\n\n      # Note that sizes in the comments below assume an input spatial size of\n      # 224x224, however, the inputs can be of any size greater 32x32.\n\n      # 224 x 224 x 3\n      end_point = 'Conv2d_1a_7x7'\n      # depthwise_multiplier here is different from depth_multiplier.\n      # depthwise_multiplier determines the output channels of the initial\n      # 
depthwise conv (see docs for tf.nn.separable_conv2d), while\n      # depth_multiplier controls the # channels of the subsequent 1x1\n      # convolution. Must have\n      #   in_channels * depthwise_multipler <= out_channels\n      # so that the separable convolution is not overparameterized.\n      depthwise_multiplier = min(int(depth(64) / 3), 8)\n      net = slim.separable_conv2d(\n          inputs, depth(64), [7, 7], depth_multiplier=depthwise_multiplier,\n          stride=2, weights_initializer=trunc_normal(1.0),\n          scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 112 x 112 x 64\n      end_point = 'MaxPool_2a_3x3'\n      net = slim.max_pool2d(net, [3, 3], scope=end_point, stride=2)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 56 x 56 x 64\n      end_point = 'Conv2d_2b_1x1'\n      net = slim.conv2d(net, depth(64), [1, 1], scope=end_point,\n                        weights_initializer=trunc_normal(0.1))\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 56 x 56 x 64\n      end_point = 'Conv2d_2c_3x3'\n      net = slim.conv2d(net, depth(192), [3, 3], scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 56 x 56 x 192\n      end_point = 'MaxPool_3a_3x3'\n      net = slim.max_pool2d(net, [3, 3], scope=end_point, stride=2)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 28 x 28 x 192\n      # Inception module.\n      end_point = 'Mixed_3b'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(64), [1, 1],\n              
weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(64), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(64), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(32), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 28 x 28 x 256\n      end_point = 'Mixed_3c'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(64), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(96), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(64), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n  
        branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(64), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 28 x 28 x 320\n      end_point = 'Mixed_4a'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(\n              net, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_0 = slim.conv2d(branch_0, depth(160), [3, 3], stride=2,\n                                 scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(64), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(\n              branch_1, depth(96), [3, 3], scope='Conv2d_0b_3x3')\n          branch_1 = slim.conv2d(\n              branch_1, depth(96), [3, 3], stride=2, scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.max_pool2d(\n              net, [3, 3], stride=2, scope='MaxPool_1a_3x3')\n        net = tf.concat(3, [branch_0, branch_1, branch_2])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 14 x 14 x 576\n      end_point = 'Mixed_4b'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(224), [1, 1], scope='Conv2d_0a_1x1')\n        with 
tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(64), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(\n              branch_1, depth(96), [3, 3], scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(96), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(128), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(128), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 14 x 14 x 576\n      end_point = 'Mixed_4c'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(96), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(128), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(96), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          
branch_2 = slim.conv2d(branch_2, depth(128), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(128), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 14 x 14 x 576\n      end_point = 'Mixed_4d'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(160), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(160), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(160), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(160), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(96), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n   
     net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n\n      # 14 x 14 x 576\n      end_point = 'Mixed_4e'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(96), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(192), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(160), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(192), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(96), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 14 x 14 x 576\n      end_point = 'Mixed_5a'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(\n              net, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_0 = slim.conv2d(branch_0, 
depth(192), [3, 3], stride=2,\n                                 scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(192), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(256), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_1 = slim.conv2d(branch_1, depth(256), [3, 3], stride=2,\n                                 scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.max_pool2d(net, [3, 3], stride=2,\n                                     scope='MaxPool_1a_3x3')\n        net = tf.concat(3, [branch_0, branch_1, branch_2])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n      # 7 x 7 x 1024\n      end_point = 'Mixed_5b'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(352), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(192), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(320), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(160), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(224), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(224), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], 
scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n\n      # 7 x 7 x 1024\n      end_point = 'Mixed_5c'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(352), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(\n              net, depth(192), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(320), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(\n              net, depth(192), [1, 1],\n              weights_initializer=trunc_normal(0.09),\n              scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(224), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(224), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.max_pool2d(net, [3, 3], scope='MaxPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(128), [1, 1],\n              weights_initializer=trunc_normal(0.1),\n              scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n        end_points[end_point] = net\n        if end_point == final_endpoint: return net, end_points\n    raise ValueError('Unknown final endpoint %s' % final_endpoint)\n\n\ndef inception_v2(inputs,\n                 num_classes=1000,\n                 
is_training=True,\n                 dropout_keep_prob=0.8,\n                 min_depth=16,\n                 depth_multiplier=1.0,\n                 prediction_fn=slim.softmax,\n                 spatial_squeeze=True,\n                 reuse=None,\n                 scope='InceptionV2'):\n  \"\"\"Inception v2 model for classification.\n\n  Constructs an Inception v2 network for classification as described in\n  http://arxiv.org/abs/1502.03167.\n\n  The default image size used to train this network is 224x224.\n\n  Args:\n    inputs: a tensor of shape [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether the model is being trained or not.\n    dropout_keep_prob: the percentage of activation values that are retained.\n    min_depth: Minimum depth value (number of channels) for all convolution ops.\n      Enforced when depth_multiplier < 1, and not an active constraint when\n      depth_multiplier >= 1.\n    depth_multiplier: Float multiplier for the depth (number of channels)\n      for all convolution ops. The value must be greater than zero. Typical\n      usage will be to set this value in (0, 1) to reduce the number of\n      parameters or computation cost of the model.\n    prediction_fn: a function to get predictions out of logits.\n    spatial_squeeze: if True, logits is of shape [B, C]; if False, logits is\n        of shape [B, 1, 1, C], where B is batch_size and C is number of classes.\n    reuse: whether or not the network and its variables should be reused. 
To be\n      able to reuse, 'scope' must be given.\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the pre-softmax activations, a tensor of size\n      [batch_size, num_classes]\n    end_points: a dictionary from components of the network to the corresponding\n      activation.\n\n  Raises:\n    ValueError: if depth_multiplier <= 0.\n  \"\"\"\n  if depth_multiplier <= 0:\n    raise ValueError('depth_multiplier is not greater than zero.')\n\n  # Final pooling and prediction\n  with tf.variable_scope(scope, 'InceptionV2', [inputs, num_classes],\n                         reuse=reuse) as scope:\n    with slim.arg_scope([slim.batch_norm, slim.dropout],\n                        is_training=is_training):\n      net, end_points = inception_v2_base(\n          inputs, scope=scope, min_depth=min_depth,\n          depth_multiplier=depth_multiplier)\n      with tf.variable_scope('Logits'):\n        kernel_size = _reduced_kernel_size_for_small_input(net, [7, 7])\n        net = slim.avg_pool2d(net, kernel_size, padding='VALID',\n                              scope='AvgPool_1a_{}x{}'.format(*kernel_size))\n        # 1 x 1 x 1024\n        net = slim.dropout(net, keep_prob=dropout_keep_prob, scope='Dropout_1b')\n        logits = slim.conv2d(net, num_classes, [1, 1], activation_fn=None,\n                             normalizer_fn=None, scope='Conv2d_1c_1x1')\n        if spatial_squeeze:\n          logits = tf.squeeze(logits, [1, 2], name='SpatialSqueeze')\n      end_points['Logits'] = logits\n      end_points['Predictions'] = prediction_fn(logits, scope='Predictions')\n  return logits, end_points\ninception_v2.default_image_size = 224\n\n\ndef _reduced_kernel_size_for_small_input(input_tensor, kernel_size):\n  \"\"\"Define kernel size which is automatically reduced for small input.\n\n  If the shape of the input images is unknown at graph construction time this\n  function assumes that the 
input images are large enough.\n\n  Args:\n    input_tensor: input tensor of size [batch_size, height, width, channels].\n    kernel_size: desired kernel size of length 2: [kernel_height, kernel_width]\n\n  Returns:\n    a tensor with the kernel size.\n\n  TODO(jrru): Make this function work with unknown shapes. Theoretically, this\n  can be done with the code below. Problems are two-fold: (1) If the shape was\n  known, it will be lost. (2) inception.slim.ops._two_element_tuple cannot\n  handle tensors that define the kernel size.\n      shape = tf.shape(input_tensor)\n      return tf.pack([tf.minimum(shape[1], kernel_size[0]),\n                      tf.minimum(shape[2], kernel_size[1])])\n\n  \"\"\"\n  shape = input_tensor.get_shape().as_list()\n  if shape[1] is None or shape[2] is None:\n    kernel_size_out = kernel_size\n  else:\n    kernel_size_out = [min(shape[1], kernel_size[0]),\n                       min(shape[2], kernel_size[1])]\n  return kernel_size_out\n\n\ninception_v2_arg_scope = inception_utils.inception_arg_scope\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v2_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for nets.inception_v2.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom nets import inception\n\nslim = tf.contrib.slim\n\n\nclass InceptionV2Test(tf.test.TestCase):\n\n  def testBuildClassificationNetwork(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v2(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV2/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue('Predictions' in end_points)\n    self.assertListEqual(end_points['Predictions'].get_shape().as_list(),\n                         [batch_size, num_classes])\n\n  def testBuildBaseNetwork(self):\n    batch_size = 5\n    height, width = 224, 224\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    mixed_5c, end_points = inception.inception_v2_base(inputs)\n    self.assertTrue(mixed_5c.op.name.startswith('InceptionV2/Mixed_5c'))\n    self.assertListEqual(mixed_5c.get_shape().as_list(),\n             
            [batch_size, 7, 7, 1024])\n    expected_endpoints = ['Mixed_3b', 'Mixed_3c', 'Mixed_4a', 'Mixed_4b',\n                          'Mixed_4c', 'Mixed_4d', 'Mixed_4e', 'Mixed_5a',\n                          'Mixed_5b', 'Mixed_5c', 'Conv2d_1a_7x7',\n                          'MaxPool_2a_3x3', 'Conv2d_2b_1x1', 'Conv2d_2c_3x3',\n                          'MaxPool_3a_3x3']\n    self.assertItemsEqual(end_points.keys(), expected_endpoints)\n\n  def testBuildOnlyUptoFinalEndpoint(self):\n    batch_size = 5\n    height, width = 224, 224\n    endpoints = ['Conv2d_1a_7x7', 'MaxPool_2a_3x3', 'Conv2d_2b_1x1',\n                 'Conv2d_2c_3x3', 'MaxPool_3a_3x3', 'Mixed_3b', 'Mixed_3c',\n                 'Mixed_4a', 'Mixed_4b', 'Mixed_4c', 'Mixed_4d', 'Mixed_4e',\n                 'Mixed_5a', 'Mixed_5b', 'Mixed_5c']\n    for index, endpoint in enumerate(endpoints):\n      with tf.Graph().as_default():\n        inputs = tf.random_uniform((batch_size, height, width, 3))\n        out_tensor, end_points = inception.inception_v2_base(\n            inputs, final_endpoint=endpoint)\n        self.assertTrue(out_tensor.op.name.startswith(\n            'InceptionV2/' + endpoint))\n        self.assertItemsEqual(endpoints[:index+1], end_points)\n\n  def testBuildAndCheckAllEndPointsUptoMixed5c(self):\n    batch_size = 5\n    height, width = 224, 224\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v2_base(inputs,\n                                                final_endpoint='Mixed_5c')\n    endpoints_shapes = {'Mixed_3b': [batch_size, 28, 28, 256],\n                        'Mixed_3c': [batch_size, 28, 28, 320],\n                        'Mixed_4a': [batch_size, 14, 14, 576],\n                        'Mixed_4b': [batch_size, 14, 14, 576],\n                        'Mixed_4c': [batch_size, 14, 14, 576],\n                        'Mixed_4d': [batch_size, 14, 14, 576],\n                        'Mixed_4e': [batch_size, 14, 14, 
576],\n                        'Mixed_5a': [batch_size, 7, 7, 1024],\n                        'Mixed_5b': [batch_size, 7, 7, 1024],\n                        'Mixed_5c': [batch_size, 7, 7, 1024],\n                        'Conv2d_1a_7x7': [batch_size, 112, 112, 64],\n                        'MaxPool_2a_3x3': [batch_size, 56, 56, 64],\n                        'Conv2d_2b_1x1': [batch_size, 56, 56, 64],\n                        'Conv2d_2c_3x3': [batch_size, 56, 56, 192],\n                        'MaxPool_3a_3x3': [batch_size, 28, 28, 192]}\n    self.assertItemsEqual(endpoints_shapes.keys(), end_points.keys())\n    for endpoint_name in endpoints_shapes:\n      expected_shape = endpoints_shapes[endpoint_name]\n      self.assertTrue(endpoint_name in end_points)\n      self.assertListEqual(end_points[endpoint_name].get_shape().as_list(),\n                           expected_shape)\n\n  def testModelHasExpectedNumberOfParameters(self):\n    batch_size = 5\n    height, width = 224, 224\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    with slim.arg_scope(inception.inception_v2_arg_scope()):\n      inception.inception_v2_base(inputs)\n    total_params, _ = slim.model_analyzer.analyze_vars(\n        slim.get_model_variables())\n    self.assertAlmostEqual(10173112, total_params)\n\n  def testBuildEndPointsWithDepthMultiplierLessThanOne(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v2(inputs, num_classes)\n\n    endpoint_keys = [key for key in end_points.keys()\n                     if key.startswith('Mixed') or key.startswith('Conv')]\n\n    _, end_points_with_multiplier = inception.inception_v2(\n        inputs, num_classes, scope='depth_multiplied_net',\n        depth_multiplier=0.5)\n\n    for key in endpoint_keys:\n      original_depth = end_points[key].get_shape().as_list()[3]\n      new_depth = 
end_points_with_multiplier[key].get_shape().as_list()[3]\n      self.assertEqual(0.5 * original_depth, new_depth)\n\n  def testBuildEndPointsWithDepthMultiplierGreaterThanOne(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v2(inputs, num_classes)\n\n    endpoint_keys = [key for key in end_points.keys()\n                     if key.startswith('Mixed') or key.startswith('Conv')]\n\n    _, end_points_with_multiplier = inception.inception_v2(\n        inputs, num_classes, scope='depth_multiplied_net',\n        depth_multiplier=2.0)\n\n    for key in endpoint_keys:\n      original_depth = end_points[key].get_shape().as_list()[3]\n      new_depth = end_points_with_multiplier[key].get_shape().as_list()[3]\n      self.assertEqual(2.0 * original_depth, new_depth)\n\n  def testRaiseValueErrorWithInvalidDepthMultiplier(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    with self.assertRaises(ValueError):\n      _ = inception.inception_v2(inputs, num_classes, depth_multiplier=-0.1)\n    with self.assertRaises(ValueError):\n      _ = inception.inception_v2(inputs, num_classes, depth_multiplier=0.0)\n\n  def testHalfSizeImages(self):\n    batch_size = 5\n    height, width = 112, 112\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v2(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV2/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    pre_pool = end_points['Mixed_5c']\n    self.assertListEqual(pre_pool.get_shape().as_list(),\n                         [batch_size, 4, 4, 1024])\n\n  def testUnknownImageShape(self):\n    tf.reset_default_graph()\n    
batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n    input_np = np.random.uniform(0, 1, (batch_size, height, width, 3))\n    with self.test_session() as sess:\n      inputs = tf.placeholder(tf.float32, shape=(batch_size, None, None, 3))\n      logits, end_points = inception.inception_v2(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('InceptionV2/Logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['Mixed_5c']\n      feed_dict = {inputs: input_np}\n      tf.initialize_all_variables().run()\n      pre_pool_out = sess.run(pre_pool, feed_dict=feed_dict)\n      self.assertListEqual(list(pre_pool_out.shape), [batch_size, 7, 7, 1024])\n\n  def testUnknowBatchSize(self):\n    batch_size = 1\n    height, width = 224, 224\n    num_classes = 1000\n\n    inputs = tf.placeholder(tf.float32, (None, height, width, 3))\n    logits, _ = inception.inception_v2(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV2/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [None, num_classes])\n    images = tf.random_uniform((batch_size, height, width, 3))\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEquals(output.shape, (batch_size, num_classes))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n\n    eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, _ = inception.inception_v2(eval_inputs, num_classes,\n                                       is_training=False)\n    predictions = tf.argmax(logits, 1)\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (batch_size,))\n\n  def 
testTrainEvalWithReuse(self):\n    train_batch_size = 5\n    eval_batch_size = 2\n    height, width = 150, 150\n    num_classes = 1000\n\n    train_inputs = tf.random_uniform((train_batch_size, height, width, 3))\n    inception.inception_v2(train_inputs, num_classes)\n    eval_inputs = tf.random_uniform((eval_batch_size, height, width, 3))\n    logits, _ = inception.inception_v2(eval_inputs, num_classes, reuse=True)\n    predictions = tf.argmax(logits, 1)\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (eval_batch_size,))\n\n  def testLogitsNotSqueezed(self):\n    num_classes = 25\n    images = tf.random_uniform([1, 224, 224, 3])\n    logits, _ = inception.inception_v2(images,\n                                       num_classes=num_classes,\n                                       spatial_squeeze=False)\n\n    with self.test_session() as sess:\n      tf.initialize_all_variables().run()\n      logits_out = sess.run(logits)\n      self.assertListEqual(list(logits_out.shape), [1, 1, 1, num_classes])\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v3.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the definition for inception v3 classification network.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import inception_utils\n\nslim = tf.contrib.slim\ntrunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)\n\n\ndef inception_v3_base(inputs,\n                      final_endpoint='Mixed_7c',\n                      min_depth=16,\n                      depth_multiplier=1.0,\n                      scope=None):\n  \"\"\"Inception model from http://arxiv.org/abs/1512.00567.\n\n  Constructs an Inception v3 network from inputs to the given final endpoint.\n  This method can construct the network up to the final inception block\n  Mixed_7c.\n\n  Note that the names of the layers in the paper do not correspond to the names\n  of the endpoints registered by this function although they build the same\n  network.\n\n  Here is a mapping from the old_names to the new names:\n  Old name          | New name\n  =======================================\n  conv0             | Conv2d_1a_3x3\n  conv1             | Conv2d_2a_3x3\n  conv2             | Conv2d_2b_3x3\n  pool1             | MaxPool_3a_3x3\n  conv3             | 
Conv2d_3b_1x1\n  conv4             | Conv2d_4a_3x3\n  pool2             | MaxPool_5a_3x3\n  mixed_35x35x256a  | Mixed_5b\n  mixed_35x35x288a  | Mixed_5c\n  mixed_35x35x288b  | Mixed_5d\n  mixed_17x17x768a  | Mixed_6a\n  mixed_17x17x768b  | Mixed_6b\n  mixed_17x17x768c  | Mixed_6c\n  mixed_17x17x768d  | Mixed_6d\n  mixed_17x17x768e  | Mixed_6e\n  mixed_8x8x1280a   | Mixed_7a\n  mixed_8x8x2048a   | Mixed_7b\n  mixed_8x8x2048b   | Mixed_7c\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    final_endpoint: specifies the endpoint to construct the network up to. It\n      can be one of ['Conv2d_1a_3x3', 'Conv2d_2a_3x3', 'Conv2d_2b_3x3',\n      'MaxPool_3a_3x3', 'Conv2d_3b_1x1', 'Conv2d_4a_3x3', 'MaxPool_5a_3x3',\n      'Mixed_5b', 'Mixed_5c', 'Mixed_5d', 'Mixed_6a', 'Mixed_6b', 'Mixed_6c',\n      'Mixed_6d', 'Mixed_6e', 'Mixed_7a', 'Mixed_7b', 'Mixed_7c'].\n    min_depth: Minimum depth value (number of channels) for all convolution ops.\n      Enforced when depth_multiplier < 1, and not an active constraint when\n      depth_multiplier >= 1.\n    depth_multiplier: Float multiplier for the depth (number of channels)\n      for all convolution ops. The value must be greater than zero. 
Typical\n      usage will be to set this value in (0, 1) to reduce the number of\n      parameters or computation cost of the model.\n    scope: Optional variable_scope.\n\n  Returns:\n    tensor_out: output tensor corresponding to the final_endpoint.\n    end_points: a set of activations for external use, for example summaries or\n                losses.\n\n  Raises:\n    ValueError: if final_endpoint is not set to one of the predefined values,\n                or depth_multiplier <= 0\n  \"\"\"\n  # end_points will collect relevant activations for external use, for example\n  # summaries or losses.\n  end_points = {}\n\n  if depth_multiplier <= 0:\n    raise ValueError('depth_multiplier is not greater than zero.')\n  depth = lambda d: max(int(d * depth_multiplier), min_depth)\n\n  with tf.variable_scope(scope, 'InceptionV3', [inputs]):\n    with slim.arg_scope([slim.conv2d, slim.max_pool2d, slim.avg_pool2d],\n                        stride=1, padding='VALID'):\n      # 299 x 299 x 3\n      end_point = 'Conv2d_1a_3x3'\n      net = slim.conv2d(inputs, depth(32), [3, 3], stride=2, scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 149 x 149 x 32\n      end_point = 'Conv2d_2a_3x3'\n      net = slim.conv2d(net, depth(32), [3, 3], scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 147 x 147 x 32\n      end_point = 'Conv2d_2b_3x3'\n      net = slim.conv2d(net, depth(64), [3, 3], padding='SAME', scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 147 x 147 x 64\n      end_point = 'MaxPool_3a_3x3'\n      net = slim.max_pool2d(net, [3, 3], stride=2, scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 73 x 73 x 64\n      end_point = 'Conv2d_3b_1x1'\n      net = slim.conv2d(net, depth(80), [1, 
1], scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 73 x 73 x 80.\n      end_point = 'Conv2d_4a_3x3'\n      net = slim.conv2d(net, depth(192), [3, 3], scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 71 x 71 x 192.\n      end_point = 'MaxPool_5a_3x3'\n      net = slim.max_pool2d(net, [3, 3], stride=2, scope=end_point)\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # 35 x 35 x 192.\n\n    # Inception blocks\n    with slim.arg_scope([slim.conv2d, slim.max_pool2d, slim.avg_pool2d],\n                        stride=1, padding='SAME'):\n      # mixed: 35 x 35 x 256.\n      end_point = 'Mixed_5b'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(48), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(64), [5, 5],\n                                 scope='Conv2d_0b_5x5')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(32), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      
# mixed_1: 35 x 35 x 288.\n      end_point = 'Mixed_5c'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(48), [1, 1], scope='Conv2d_0b_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(64), [5, 5],\n                                 scope='Conv_1_0c_5x5')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(64), [1, 1],\n                                 scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(64), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed_2: 35 x 35 x 288.\n      end_point = 'Mixed_5d'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(48), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(64), [5, 5],\n                                 scope='Conv2d_0b_5x5')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 
scope='Conv2d_0b_3x3')\n          branch_2 = slim.conv2d(branch_2, depth(96), [3, 3],\n                                 scope='Conv2d_0c_3x3')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(64), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed_3: 17 x 17 x 768.\n      end_point = 'Mixed_6a'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(384), [3, 3], stride=2,\n                                 padding='VALID', scope='Conv2d_1a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(64), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(96), [3, 3],\n                                 scope='Conv2d_0b_3x3')\n          branch_1 = slim.conv2d(branch_1, depth(96), [3, 3], stride=2,\n                                 padding='VALID', scope='Conv2d_1a_1x1')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID',\n                                     scope='MaxPool_1a_3x3')\n        net = tf.concat(3, [branch_0, branch_1, branch_2])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed4: 17 x 17 x 768.\n      end_point = 'Mixed_6b'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(128), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(128), [1, 
7],\n                                 scope='Conv2d_0b_1x7')\n          branch_1 = slim.conv2d(branch_1, depth(192), [7, 1],\n                                 scope='Conv2d_0c_7x1')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(128), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(128), [7, 1],\n                                 scope='Conv2d_0b_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(128), [1, 7],\n                                 scope='Conv2d_0c_1x7')\n          branch_2 = slim.conv2d(branch_2, depth(128), [7, 1],\n                                 scope='Conv2d_0d_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [1, 7],\n                                 scope='Conv2d_0e_1x7')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(192), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed_5: 17 x 17 x 768.\n      end_point = 'Mixed_6c'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(160), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(160), [1, 7],\n                                 scope='Conv2d_0b_1x7')\n          branch_1 = slim.conv2d(branch_1, depth(192), [7, 1],\n                                 scope='Conv2d_0c_7x1')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(160), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(160), [7, 1],\n    
                             scope='Conv2d_0b_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(160), [1, 7],\n                                 scope='Conv2d_0c_1x7')\n          branch_2 = slim.conv2d(branch_2, depth(160), [7, 1],\n                                 scope='Conv2d_0d_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [1, 7],\n                                 scope='Conv2d_0e_1x7')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(192), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # mixed_6: 17 x 17 x 768.\n      end_point = 'Mixed_6d'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(160), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(160), [1, 7],\n                                 scope='Conv2d_0b_1x7')\n          branch_1 = slim.conv2d(branch_1, depth(192), [7, 1],\n                                 scope='Conv2d_0c_7x1')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(160), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(160), [7, 1],\n                                 scope='Conv2d_0b_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(160), [1, 7],\n                                 scope='Conv2d_0c_1x7')\n          branch_2 = slim.conv2d(branch_2, depth(160), [7, 1],\n                                 scope='Conv2d_0d_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [1, 7],\n                     
            scope='Conv2d_0e_1x7')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(192), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed_7: 17 x 17 x 768.\n      end_point = 'Mixed_6e'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(192), [1, 7],\n                                 scope='Conv2d_0b_1x7')\n          branch_1 = slim.conv2d(branch_1, depth(192), [7, 1],\n                                 scope='Conv2d_0c_7x1')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [7, 1],\n                                 scope='Conv2d_0b_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [1, 7],\n                                 scope='Conv2d_0c_1x7')\n          branch_2 = slim.conv2d(branch_2, depth(192), [7, 1],\n                                 scope='Conv2d_0d_7x1')\n          branch_2 = slim.conv2d(branch_2, depth(192), [1, 7],\n                                 scope='Conv2d_0e_1x7')\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(branch_3, depth(192), [1, 1],\n                                 scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = 
net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed_8: 8 x 8 x 1280.\n      end_point = 'Mixed_7a'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n          branch_0 = slim.conv2d(branch_0, depth(320), [3, 3], stride=2,\n                                 padding='VALID', scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(192), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, depth(192), [1, 7],\n                                 scope='Conv2d_0b_1x7')\n          branch_1 = slim.conv2d(branch_1, depth(192), [7, 1],\n                                 scope='Conv2d_0c_7x1')\n          branch_1 = slim.conv2d(branch_1, depth(192), [3, 3], stride=2,\n                                 padding='VALID', scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID',\n                                     scope='MaxPool_1a_3x3')\n        net = tf.concat(3, [branch_0, branch_1, branch_2])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n      # mixed_9: 8 x 8 x 2048.\n      end_point = 'Mixed_7b'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(320), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(384), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = tf.concat(3, [\n              slim.conv2d(branch_1, depth(384), [1, 3], scope='Conv2d_0b_1x3'),\n              slim.conv2d(branch_1, depth(384), [3, 1], scope='Conv2d_0b_3x1')])\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(448), [1, 1], scope='Conv2d_0a_1x1')\n        
  branch_2 = slim.conv2d(\n              branch_2, depth(384), [3, 3], scope='Conv2d_0b_3x3')\n          branch_2 = tf.concat(3, [\n              slim.conv2d(branch_2, depth(384), [1, 3], scope='Conv2d_0c_1x3'),\n              slim.conv2d(branch_2, depth(384), [3, 1], scope='Conv2d_0d_3x1')])\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(192), [1, 1], scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = net\n      if end_point == final_endpoint: return net, end_points\n\n      # mixed_10: 8 x 8 x 2048.\n      end_point = 'Mixed_7c'\n      with tf.variable_scope(end_point):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, depth(320), [1, 1], scope='Conv2d_0a_1x1')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, depth(384), [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = tf.concat(3, [\n              slim.conv2d(branch_1, depth(384), [1, 3], scope='Conv2d_0b_1x3'),\n              slim.conv2d(branch_1, depth(384), [3, 1], scope='Conv2d_0c_3x1')])\n        with tf.variable_scope('Branch_2'):\n          branch_2 = slim.conv2d(net, depth(448), [1, 1], scope='Conv2d_0a_1x1')\n          branch_2 = slim.conv2d(\n              branch_2, depth(384), [3, 3], scope='Conv2d_0b_3x3')\n          branch_2 = tf.concat(3, [\n              slim.conv2d(branch_2, depth(384), [1, 3], scope='Conv2d_0c_1x3'),\n              slim.conv2d(branch_2, depth(384), [3, 1], scope='Conv2d_0d_3x1')])\n        with tf.variable_scope('Branch_3'):\n          branch_3 = slim.avg_pool2d(net, [3, 3], scope='AvgPool_0a_3x3')\n          branch_3 = slim.conv2d(\n              branch_3, depth(192), [1, 1], scope='Conv2d_0b_1x1')\n        net = tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n      end_points[end_point] = 
net\n      if end_point == final_endpoint: return net, end_points\n    raise ValueError('Unknown final endpoint %s' % final_endpoint)\n\n\ndef inception_v3(inputs,\n                 num_classes=1000,\n                 is_training=True,\n                 dropout_keep_prob=0.8,\n                 min_depth=16,\n                 depth_multiplier=1.0,\n                 prediction_fn=slim.softmax,\n                 spatial_squeeze=True,\n                 reuse=None,\n                 scope='InceptionV3'):\n  \"\"\"Inception model from http://arxiv.org/abs/1512.00567.\n\n  \"Rethinking the Inception Architecture for Computer Vision\"\n\n  Christian Szegedy, Vincent Vanhoucke, Sergey Ioffe, Jonathon Shlens,\n  Zbigniew Wojna.\n\n  With the default arguments this method constructs the exact model defined in\n  the paper. However, one can experiment with variations of the inception_v3\n  network by changing arguments dropout_keep_prob, min_depth and\n  depth_multiplier.\n\n  The default image size used to train this network is 299x299.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether the model is being trained.\n    dropout_keep_prob: the percentage of activation values that are retained.\n    min_depth: Minimum depth value (number of channels) for all convolution ops.\n      Enforced when depth_multiplier < 1, and not an active constraint when\n      depth_multiplier >= 1.\n    depth_multiplier: Float multiplier for the depth (number of channels)\n      for all convolution ops. The value must be greater than zero. 
Typical\n      usage will be to set this value in (0, 1) to reduce the number of\n      parameters or computation cost of the model.\n    prediction_fn: a function to get predictions out of logits.\n    spatial_squeeze: if True, logits is of shape [B, C]; if False, logits is\n        of shape [B, 1, 1, C], where B is batch_size and C is number of classes.\n    reuse: whether or not the network and its variables should be reused. To be\n      able to reuse, 'scope' must be given.\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the pre-softmax activations, a tensor of size\n      [batch_size, num_classes]\n    end_points: a dictionary from components of the network to the corresponding\n      activation.\n\n  Raises:\n    ValueError: if 'depth_multiplier' is less than or equal to zero.\n  \"\"\"\n  if depth_multiplier <= 0:\n    raise ValueError('depth_multiplier is not greater than zero.')\n  depth = lambda d: max(int(d * depth_multiplier), min_depth)\n\n  with tf.variable_scope(scope, 'InceptionV3', [inputs, num_classes],\n                         reuse=reuse) as scope:\n    with slim.arg_scope([slim.batch_norm, slim.dropout],\n                        is_training=is_training):\n      net, end_points = inception_v3_base(\n          inputs, scope=scope, min_depth=min_depth,\n          depth_multiplier=depth_multiplier)\n\n      # Auxiliary Head logits\n      with slim.arg_scope([slim.conv2d, slim.max_pool2d, slim.avg_pool2d],\n                          stride=1, padding='SAME'):\n        aux_logits = end_points['Mixed_6e']\n        with tf.variable_scope('AuxLogits'):\n          aux_logits = slim.avg_pool2d(\n              aux_logits, [5, 5], stride=3, padding='VALID',\n              scope='AvgPool_1a_5x5')\n          aux_logits = slim.conv2d(aux_logits, depth(128), [1, 1],\n                                   scope='Conv2d_1b_1x1')\n\n          # Shape of feature map before the final layer.\n          kernel_size = 
_reduced_kernel_size_for_small_input(\n              aux_logits, [5, 5])\n          aux_logits = slim.conv2d(\n              aux_logits, depth(768), kernel_size,\n              weights_initializer=trunc_normal(0.01),\n              padding='VALID', scope='Conv2d_2a_{}x{}'.format(*kernel_size))\n          aux_logits = slim.conv2d(\n              aux_logits, num_classes, [1, 1], activation_fn=None,\n              normalizer_fn=None, weights_initializer=trunc_normal(0.001),\n              scope='Conv2d_2b_1x1')\n          if spatial_squeeze:\n            aux_logits = tf.squeeze(aux_logits, [1, 2], name='SpatialSqueeze')\n          end_points['AuxLogits'] = aux_logits\n\n      # Final pooling and prediction\n      with tf.variable_scope('Logits'):\n        kernel_size = _reduced_kernel_size_for_small_input(net, [8, 8])\n        net = slim.avg_pool2d(net, kernel_size, padding='VALID',\n                              scope='AvgPool_1a_{}x{}'.format(*kernel_size))\n        # 1 x 1 x 2048\n        net = slim.dropout(net, keep_prob=dropout_keep_prob, scope='Dropout_1b')\n        end_points['PreLogits'] = net\n        # 2048\n        logits = slim.conv2d(net, num_classes, [1, 1], activation_fn=None,\n                             normalizer_fn=None, scope='Conv2d_1c_1x1')\n        if spatial_squeeze:\n          logits = tf.squeeze(logits, [1, 2], name='SpatialSqueeze')\n        # 1000\n      end_points['Logits'] = logits\n      end_points['Predictions'] = prediction_fn(logits, scope='Predictions')\n  return logits, end_points\ninception_v3.default_image_size = 299\n\n\ndef _reduced_kernel_size_for_small_input(input_tensor, kernel_size):\n  \"\"\"Define kernel size which is automatically reduced for small input.\n\n  If the shape of the input images is unknown at graph construction time this\n  function assumes that the input images are large enough.\n\n  Args:\n    input_tensor: input tensor of size [batch_size, height, width, channels].\n    kernel_size: desired kernel 
size of length 2: [kernel_height, kernel_width]\n\n  Returns:\n    a list with the kernel size.\n\n  TODO(jrru): Make this function work with unknown shapes. Theoretically, this\n  can be done with the code below. Problems are two-fold: (1) If the shape was\n  known, it will be lost. (2) inception.slim.ops._two_element_tuple cannot\n  handle tensors that define the kernel size.\n      shape = tf.shape(input_tensor)\n      return tf.pack([tf.minimum(shape[1], kernel_size[0]),\n                      tf.minimum(shape[2], kernel_size[1])])\n\n  \"\"\"\n  shape = input_tensor.get_shape().as_list()\n  if shape[1] is None or shape[2] is None:\n    kernel_size_out = kernel_size\n  else:\n    kernel_size_out = [min(shape[1], kernel_size[0]),\n                       min(shape[2], kernel_size[1])]\n  return kernel_size_out\n\n\ninception_v3_arg_scope = inception_utils.inception_arg_scope\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v3_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for nets.inception_v1.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom nets import inception\n\nslim = tf.contrib.slim\n\n\nclass InceptionV3Test(tf.test.TestCase):\n\n  def testBuildClassificationNetwork(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v3(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV3/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue('Predictions' in end_points)\n    self.assertListEqual(end_points['Predictions'].get_shape().as_list(),\n                         [batch_size, num_classes])\n\n  def testBuildBaseNetwork(self):\n    batch_size = 5\n    height, width = 299, 299\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    final_endpoint, end_points = inception.inception_v3_base(inputs)\n    self.assertTrue(final_endpoint.op.name.startswith(\n        'InceptionV3/Mixed_7c'))\n    
self.assertListEqual(final_endpoint.get_shape().as_list(),\n                         [batch_size, 8, 8, 2048])\n    expected_endpoints = ['Conv2d_1a_3x3', 'Conv2d_2a_3x3', 'Conv2d_2b_3x3',\n                          'MaxPool_3a_3x3', 'Conv2d_3b_1x1', 'Conv2d_4a_3x3',\n                          'MaxPool_5a_3x3', 'Mixed_5b', 'Mixed_5c', 'Mixed_5d',\n                          'Mixed_6a', 'Mixed_6b', 'Mixed_6c', 'Mixed_6d',\n                          'Mixed_6e', 'Mixed_7a', 'Mixed_7b', 'Mixed_7c']\n    self.assertItemsEqual(end_points.keys(), expected_endpoints)\n\n  def testBuildOnlyUptoFinalEndpoint(self):\n    batch_size = 5\n    height, width = 299, 299\n    endpoints = ['Conv2d_1a_3x3', 'Conv2d_2a_3x3', 'Conv2d_2b_3x3',\n                 'MaxPool_3a_3x3', 'Conv2d_3b_1x1', 'Conv2d_4a_3x3',\n                 'MaxPool_5a_3x3', 'Mixed_5b', 'Mixed_5c', 'Mixed_5d',\n                 'Mixed_6a', 'Mixed_6b', 'Mixed_6c', 'Mixed_6d',\n                 'Mixed_6e', 'Mixed_7a', 'Mixed_7b', 'Mixed_7c']\n\n    for index, endpoint in enumerate(endpoints):\n      with tf.Graph().as_default():\n        inputs = tf.random_uniform((batch_size, height, width, 3))\n        out_tensor, end_points = inception.inception_v3_base(\n            inputs, final_endpoint=endpoint)\n        self.assertTrue(out_tensor.op.name.startswith(\n            'InceptionV3/' + endpoint))\n        self.assertItemsEqual(endpoints[:index+1], end_points)\n\n  def testBuildAndCheckAllEndPointsUptoMixed7c(self):\n    batch_size = 5\n    height, width = 299, 299\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v3_base(\n        inputs, final_endpoint='Mixed_7c')\n    endpoints_shapes = {'Conv2d_1a_3x3': [batch_size, 149, 149, 32],\n                        'Conv2d_2a_3x3': [batch_size, 147, 147, 32],\n                        'Conv2d_2b_3x3': [batch_size, 147, 147, 64],\n                        'MaxPool_3a_3x3': [batch_size, 73, 73, 64],\n                     
   'Conv2d_3b_1x1': [batch_size, 73, 73, 80],\n                        'Conv2d_4a_3x3': [batch_size, 71, 71, 192],\n                        'MaxPool_5a_3x3': [batch_size, 35, 35, 192],\n                        'Mixed_5b': [batch_size, 35, 35, 256],\n                        'Mixed_5c': [batch_size, 35, 35, 288],\n                        'Mixed_5d': [batch_size, 35, 35, 288],\n                        'Mixed_6a': [batch_size, 17, 17, 768],\n                        'Mixed_6b': [batch_size, 17, 17, 768],\n                        'Mixed_6c': [batch_size, 17, 17, 768],\n                        'Mixed_6d': [batch_size, 17, 17, 768],\n                        'Mixed_6e': [batch_size, 17, 17, 768],\n                        'Mixed_7a': [batch_size, 8, 8, 1280],\n                        'Mixed_7b': [batch_size, 8, 8, 2048],\n                        'Mixed_7c': [batch_size, 8, 8, 2048]}\n    self.assertItemsEqual(endpoints_shapes.keys(), end_points.keys())\n    for endpoint_name in endpoints_shapes:\n      expected_shape = endpoints_shapes[endpoint_name]\n      self.assertTrue(endpoint_name in end_points)\n      self.assertListEqual(end_points[endpoint_name].get_shape().as_list(),\n                           expected_shape)\n\n  def testModelHasExpectedNumberOfParameters(self):\n    batch_size = 5\n    height, width = 299, 299\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    with slim.arg_scope(inception.inception_v3_arg_scope()):\n      inception.inception_v3_base(inputs)\n    total_params, _ = slim.model_analyzer.analyze_vars(\n        slim.get_model_variables())\n    self.assertAlmostEqual(21802784, total_params)\n\n  def testBuildEndPoints(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v3(inputs, num_classes)\n    self.assertTrue('Logits' in end_points)\n    logits = end_points['Logits']\n    
self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue('AuxLogits' in end_points)\n    aux_logits = end_points['AuxLogits']\n    self.assertListEqual(aux_logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue('Mixed_7c' in end_points)\n    pre_pool = end_points['Mixed_7c']\n    self.assertListEqual(pre_pool.get_shape().as_list(),\n                         [batch_size, 8, 8, 2048])\n    self.assertTrue('PreLogits' in end_points)\n    pre_logits = end_points['PreLogits']\n    self.assertListEqual(pre_logits.get_shape().as_list(),\n                         [batch_size, 1, 1, 2048])\n\n  def testBuildEndPointsWithDepthMultiplierLessThanOne(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v3(inputs, num_classes)\n\n    endpoint_keys = [key for key in end_points.keys()\n                     if key.startswith('Mixed') or key.startswith('Conv')]\n\n    _, end_points_with_multiplier = inception.inception_v3(\n        inputs, num_classes, scope='depth_multiplied_net',\n        depth_multiplier=0.5)\n\n    for key in endpoint_keys:\n      original_depth = end_points[key].get_shape().as_list()[3]\n      new_depth = end_points_with_multiplier[key].get_shape().as_list()[3]\n      self.assertEqual(0.5 * original_depth, new_depth)\n\n  def testBuildEndPointsWithDepthMultiplierGreaterThanOne(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v3(inputs, num_classes)\n\n    endpoint_keys = [key for key in end_points.keys()\n                     if key.startswith('Mixed') or key.startswith('Conv')]\n\n    _, end_points_with_multiplier = inception.inception_v3(\n        inputs, num_classes, 
scope='depth_multiplied_net',\n        depth_multiplier=2.0)\n\n    for key in endpoint_keys:\n      original_depth = end_points[key].get_shape().as_list()[3]\n      new_depth = end_points_with_multiplier[key].get_shape().as_list()[3]\n      self.assertEqual(2.0 * original_depth, new_depth)\n\n  def testRaiseValueErrorWithInvalidDepthMultiplier(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    with self.assertRaises(ValueError):\n      _ = inception.inception_v3(inputs, num_classes, depth_multiplier=-0.1)\n    with self.assertRaises(ValueError):\n      _ = inception.inception_v3(inputs, num_classes, depth_multiplier=0.0)\n\n  def testHalfSizeImages(self):\n    batch_size = 5\n    height, width = 150, 150\n    num_classes = 1000\n\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v3(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV3/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    pre_pool = end_points['Mixed_7c']\n    self.assertListEqual(pre_pool.get_shape().as_list(),\n                         [batch_size, 3, 3, 2048])\n\n  def testUnknownImageShape(self):\n    tf.reset_default_graph()\n    batch_size = 2\n    height, width = 299, 299\n    num_classes = 1000\n    input_np = np.random.uniform(0, 1, (batch_size, height, width, 3))\n    with self.test_session() as sess:\n      inputs = tf.placeholder(tf.float32, shape=(batch_size, None, None, 3))\n      logits, end_points = inception.inception_v3(inputs, num_classes)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      pre_pool = end_points['Mixed_7c']\n      feed_dict = {inputs: input_np}\n      tf.initialize_all_variables().run()\n      pre_pool_out = sess.run(pre_pool, 
feed_dict=feed_dict)\n      self.assertListEqual(list(pre_pool_out.shape), [batch_size, 8, 8, 2048])\n\n  def testUnknownBatchSize(self):\n    batch_size = 1\n    height, width = 299, 299\n    num_classes = 1000\n\n    inputs = tf.placeholder(tf.float32, (None, height, width, 3))\n    logits, _ = inception.inception_v3(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV3/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [None, num_classes])\n    images = tf.random_uniform((batch_size, height, width, 3))\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEquals(output.shape, (batch_size, num_classes))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 299, 299\n    num_classes = 1000\n\n    eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, _ = inception.inception_v3(eval_inputs, num_classes,\n                                       is_training=False)\n    predictions = tf.argmax(logits, 1)\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (batch_size,))\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 5\n    eval_batch_size = 2\n    height, width = 150, 150\n    num_classes = 1000\n\n    train_inputs = tf.random_uniform((train_batch_size, height, width, 3))\n    inception.inception_v3(train_inputs, num_classes)\n    eval_inputs = tf.random_uniform((eval_batch_size, height, width, 3))\n    logits, _ = inception.inception_v3(eval_inputs, num_classes,\n                                       is_training=False, reuse=True)\n    predictions = tf.argmax(logits, 1)\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      
self.assertEquals(output.shape, (eval_batch_size,))\n\n  def testLogitsNotSqueezed(self):\n    num_classes = 25\n    images = tf.random_uniform([1, 299, 299, 3])\n    logits, _ = inception.inception_v3(images,\n                                       num_classes=num_classes,\n                                       spatial_squeeze=False)\n\n    with self.test_session() as sess:\n      tf.initialize_all_variables().run()\n      logits_out = sess.run(logits)\n      self.assertListEqual(list(logits_out.shape), [1, 1, 1, num_classes])\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v4.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the definition of the Inception V4 architecture.\n\nAs described in http://arxiv.org/abs/1602.07261.\n\n  Inception-v4, Inception-ResNet and the Impact of Residual Connections\n    on Learning\n  Christian Szegedy, Sergey Ioffe, Vincent Vanhoucke, Alex Alemi\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import inception_utils\n\nslim = tf.contrib.slim\n\n\ndef block_inception_a(inputs, scope=None, reuse=None):\n  \"\"\"Builds Inception-A block for Inception v4 network.\"\"\"\n  # By default use stride=1 and SAME padding\n  with slim.arg_scope([slim.conv2d, slim.avg_pool2d, slim.max_pool2d],\n                      stride=1, padding='SAME'):\n    with tf.variable_scope(scope, 'BlockInceptionA', [inputs], reuse=reuse):\n      with tf.variable_scope('Branch_0'):\n        branch_0 = slim.conv2d(inputs, 96, [1, 1], scope='Conv2d_0a_1x1')\n      with tf.variable_scope('Branch_1'):\n        branch_1 = slim.conv2d(inputs, 64, [1, 1], scope='Conv2d_0a_1x1')\n        branch_1 = slim.conv2d(branch_1, 96, [3, 3], scope='Conv2d_0b_3x3')\n      with tf.variable_scope('Branch_2'):\n        branch_2 = slim.conv2d(inputs, 64, [1, 1], 
scope='Conv2d_0a_1x1')\n        branch_2 = slim.conv2d(branch_2, 96, [3, 3], scope='Conv2d_0b_3x3')\n        branch_2 = slim.conv2d(branch_2, 96, [3, 3], scope='Conv2d_0c_3x3')\n      with tf.variable_scope('Branch_3'):\n        branch_3 = slim.avg_pool2d(inputs, [3, 3], scope='AvgPool_0a_3x3')\n        branch_3 = slim.conv2d(branch_3, 96, [1, 1], scope='Conv2d_0b_1x1')\n      return tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n\n\ndef block_reduction_a(inputs, scope=None, reuse=None):\n  \"\"\"Builds Reduction-A block for Inception v4 network.\"\"\"\n  # By default use stride=1 and SAME padding\n  with slim.arg_scope([slim.conv2d, slim.avg_pool2d, slim.max_pool2d],\n                      stride=1, padding='SAME'):\n    with tf.variable_scope(scope, 'BlockReductionA', [inputs], reuse=reuse):\n      with tf.variable_scope('Branch_0'):\n        branch_0 = slim.conv2d(inputs, 384, [3, 3], stride=2, padding='VALID',\n                               scope='Conv2d_1a_3x3')\n      with tf.variable_scope('Branch_1'):\n        branch_1 = slim.conv2d(inputs, 192, [1, 1], scope='Conv2d_0a_1x1')\n        branch_1 = slim.conv2d(branch_1, 224, [3, 3], scope='Conv2d_0b_3x3')\n        branch_1 = slim.conv2d(branch_1, 256, [3, 3], stride=2,\n                               padding='VALID', scope='Conv2d_1a_3x3')\n      with tf.variable_scope('Branch_2'):\n        branch_2 = slim.max_pool2d(inputs, [3, 3], stride=2, padding='VALID',\n                                   scope='MaxPool_1a_3x3')\n      return tf.concat(3, [branch_0, branch_1, branch_2])\n\n\ndef block_inception_b(inputs, scope=None, reuse=None):\n  \"\"\"Builds Inception-B block for Inception v4 network.\"\"\"\n  # By default use stride=1 and SAME padding\n  with slim.arg_scope([slim.conv2d, slim.avg_pool2d, slim.max_pool2d],\n                      stride=1, padding='SAME'):\n    with tf.variable_scope(scope, 'BlockInceptionB', [inputs], reuse=reuse):\n      with tf.variable_scope('Branch_0'):\n        branch_0 
= slim.conv2d(inputs, 384, [1, 1], scope='Conv2d_0a_1x1')\n      with tf.variable_scope('Branch_1'):\n        branch_1 = slim.conv2d(inputs, 192, [1, 1], scope='Conv2d_0a_1x1')\n        branch_1 = slim.conv2d(branch_1, 224, [1, 7], scope='Conv2d_0b_1x7')\n        branch_1 = slim.conv2d(branch_1, 256, [7, 1], scope='Conv2d_0c_7x1')\n      with tf.variable_scope('Branch_2'):\n        branch_2 = slim.conv2d(inputs, 192, [1, 1], scope='Conv2d_0a_1x1')\n        branch_2 = slim.conv2d(branch_2, 192, [7, 1], scope='Conv2d_0b_7x1')\n        branch_2 = slim.conv2d(branch_2, 224, [1, 7], scope='Conv2d_0c_1x7')\n        branch_2 = slim.conv2d(branch_2, 224, [7, 1], scope='Conv2d_0d_7x1')\n        branch_2 = slim.conv2d(branch_2, 256, [1, 7], scope='Conv2d_0e_1x7')\n      with tf.variable_scope('Branch_3'):\n        branch_3 = slim.avg_pool2d(inputs, [3, 3], scope='AvgPool_0a_3x3')\n        branch_3 = slim.conv2d(branch_3, 128, [1, 1], scope='Conv2d_0b_1x1')\n      return tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n\n\ndef block_reduction_b(inputs, scope=None, reuse=None):\n  \"\"\"Builds Reduction-B block for Inception v4 network.\"\"\"\n  # By default use stride=1 and SAME padding\n  with slim.arg_scope([slim.conv2d, slim.avg_pool2d, slim.max_pool2d],\n                      stride=1, padding='SAME'):\n    with tf.variable_scope(scope, 'BlockReductionB', [inputs], reuse=reuse):\n      with tf.variable_scope('Branch_0'):\n        branch_0 = slim.conv2d(inputs, 192, [1, 1], scope='Conv2d_0a_1x1')\n        branch_0 = slim.conv2d(branch_0, 192, [3, 3], stride=2,\n                               padding='VALID', scope='Conv2d_1a_3x3')\n      with tf.variable_scope('Branch_1'):\n        branch_1 = slim.conv2d(inputs, 256, [1, 1], scope='Conv2d_0a_1x1')\n        branch_1 = slim.conv2d(branch_1, 256, [1, 7], scope='Conv2d_0b_1x7')\n        branch_1 = slim.conv2d(branch_1, 320, [7, 1], scope='Conv2d_0c_7x1')\n        branch_1 = slim.conv2d(branch_1, 320, [3, 3], stride=2,\n 
                              padding='VALID', scope='Conv2d_1a_3x3')\n      with tf.variable_scope('Branch_2'):\n        branch_2 = slim.max_pool2d(inputs, [3, 3], stride=2, padding='VALID',\n                                   scope='MaxPool_1a_3x3')\n      return tf.concat(3, [branch_0, branch_1, branch_2])\n\n\ndef block_inception_c(inputs, scope=None, reuse=None):\n  \"\"\"Builds Inception-C block for Inception v4 network.\"\"\"\n  # By default use stride=1 and SAME padding\n  with slim.arg_scope([slim.conv2d, slim.avg_pool2d, slim.max_pool2d],\n                      stride=1, padding='SAME'):\n    with tf.variable_scope(scope, 'BlockInceptionC', [inputs], reuse=reuse):\n      with tf.variable_scope('Branch_0'):\n        branch_0 = slim.conv2d(inputs, 256, [1, 1], scope='Conv2d_0a_1x1')\n      with tf.variable_scope('Branch_1'):\n        branch_1 = slim.conv2d(inputs, 384, [1, 1], scope='Conv2d_0a_1x1')\n        branch_1 = tf.concat(3, [\n            slim.conv2d(branch_1, 256, [1, 3], scope='Conv2d_0b_1x3'),\n            slim.conv2d(branch_1, 256, [3, 1], scope='Conv2d_0c_3x1')])\n      with tf.variable_scope('Branch_2'):\n        branch_2 = slim.conv2d(inputs, 384, [1, 1], scope='Conv2d_0a_1x1')\n        branch_2 = slim.conv2d(branch_2, 448, [3, 1], scope='Conv2d_0b_3x1')\n        branch_2 = slim.conv2d(branch_2, 512, [1, 3], scope='Conv2d_0c_1x3')\n        branch_2 = tf.concat(3, [\n            slim.conv2d(branch_2, 256, [1, 3], scope='Conv2d_0d_1x3'),\n            slim.conv2d(branch_2, 256, [3, 1], scope='Conv2d_0e_3x1')])\n      with tf.variable_scope('Branch_3'):\n        branch_3 = slim.avg_pool2d(inputs, [3, 3], scope='AvgPool_0a_3x3')\n        branch_3 = slim.conv2d(branch_3, 256, [1, 1], scope='Conv2d_0b_1x1')\n      return tf.concat(3, [branch_0, branch_1, branch_2, branch_3])\n\n\ndef inception_v4_base(inputs, final_endpoint='Mixed_7d', scope=None):\n  \"\"\"Creates the Inception V4 network up to the given final endpoint.\n\n  Args:\n    inputs: a 
4-D tensor of size [batch_size, height, width, 3].\n    final_endpoint: specifies the endpoint to construct the network up to.\n      It can be one of [ 'Conv2d_1a_3x3', 'Conv2d_2a_3x3', 'Conv2d_2b_3x3',\n      'Mixed_3a', 'Mixed_4a', 'Mixed_5a', 'Mixed_5b', 'Mixed_5c', 'Mixed_5d',\n      'Mixed_5e', 'Mixed_6a', 'Mixed_6b', 'Mixed_6c', 'Mixed_6d', 'Mixed_6e',\n      'Mixed_6f', 'Mixed_6g', 'Mixed_6h', 'Mixed_7a', 'Mixed_7b', 'Mixed_7c',\n      'Mixed_7d']\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the logits outputs of the model.\n    end_points: the set of end_points from the inception model.\n\n  Raises:\n    ValueError: if final_endpoint is not set to one of the predefined values.\n  \"\"\"\n  end_points = {}\n\n  def add_and_check_final(name, net):\n    end_points[name] = net\n    return name == final_endpoint\n\n  with tf.variable_scope(scope, 'InceptionV4', [inputs]):\n    with slim.arg_scope([slim.conv2d, slim.max_pool2d, slim.avg_pool2d],\n                        stride=1, padding='SAME'):\n      # 299 x 299 x 3\n      net = slim.conv2d(inputs, 32, [3, 3], stride=2,\n                        padding='VALID', scope='Conv2d_1a_3x3')\n      if add_and_check_final('Conv2d_1a_3x3', net): return net, end_points\n      # 149 x 149 x 32\n      net = slim.conv2d(net, 32, [3, 3], padding='VALID',\n                        scope='Conv2d_2a_3x3')\n      if add_and_check_final('Conv2d_2a_3x3', net): return net, end_points\n      # 147 x 147 x 32\n      net = slim.conv2d(net, 64, [3, 3], scope='Conv2d_2b_3x3')\n      if add_and_check_final('Conv2d_2b_3x3', net): return net, end_points\n      # 147 x 147 x 64\n      with tf.variable_scope('Mixed_3a'):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID',\n                                     scope='MaxPool_0a_3x3')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, 96, [3, 3], stride=2, 
padding='VALID',\n                                 scope='Conv2d_0a_3x3')\n        net = tf.concat(3, [branch_0, branch_1])\n        if add_and_check_final('Mixed_3a', net): return net, end_points\n\n      # 73 x 73 x 160\n      with tf.variable_scope('Mixed_4a'):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, 64, [1, 1], scope='Conv2d_0a_1x1')\n          branch_0 = slim.conv2d(branch_0, 96, [3, 3], padding='VALID',\n                                 scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.conv2d(net, 64, [1, 1], scope='Conv2d_0a_1x1')\n          branch_1 = slim.conv2d(branch_1, 64, [1, 7], scope='Conv2d_0b_1x7')\n          branch_1 = slim.conv2d(branch_1, 64, [7, 1], scope='Conv2d_0c_7x1')\n          branch_1 = slim.conv2d(branch_1, 96, [3, 3], padding='VALID',\n                                 scope='Conv2d_1a_3x3')\n        net = tf.concat(3, [branch_0, branch_1])\n        if add_and_check_final('Mixed_4a', net): return net, end_points\n\n      # 71 x 71 x 192\n      with tf.variable_scope('Mixed_5a'):\n        with tf.variable_scope('Branch_0'):\n          branch_0 = slim.conv2d(net, 192, [3, 3], stride=2, padding='VALID',\n                                 scope='Conv2d_1a_3x3')\n        with tf.variable_scope('Branch_1'):\n          branch_1 = slim.max_pool2d(net, [3, 3], stride=2, padding='VALID',\n                                     scope='MaxPool_1a_3x3')\n        net = tf.concat(3, [branch_0, branch_1])\n        if add_and_check_final('Mixed_5a', net): return net, end_points\n\n      # 35 x 35 x 384\n      # 4 x Inception-A blocks\n      for idx in range(4):\n        block_scope = 'Mixed_5' + chr(ord('b') + idx)\n        net = block_inception_a(net, block_scope)\n        if add_and_check_final(block_scope, net): return net, end_points\n\n      # 35 x 35 x 384\n      # Reduction-A block\n      net = block_reduction_a(net, 'Mixed_6a')\n      if 
add_and_check_final('Mixed_6a', net): return net, end_points\n\n      # 17 x 17 x 1024\n      # 7 x Inception-B blocks\n      for idx in range(7):\n        block_scope = 'Mixed_6' + chr(ord('b') + idx)\n        net = block_inception_b(net, block_scope)\n        if add_and_check_final(block_scope, net): return net, end_points\n\n      # 17 x 17 x 1024\n      # Reduction-B block\n      net = block_reduction_b(net, 'Mixed_7a')\n      if add_and_check_final('Mixed_7a', net): return net, end_points\n\n      # 8 x 8 x 1536\n      # 3 x Inception-C blocks\n      for idx in range(3):\n        block_scope = 'Mixed_7' + chr(ord('b') + idx)\n        net = block_inception_c(net, block_scope)\n        if add_and_check_final(block_scope, net): return net, end_points\n  raise ValueError('Unknown final endpoint %s' % final_endpoint)\n\n\ndef inception_v4(inputs, num_classes=1001, is_training=True,\n                 dropout_keep_prob=0.8,\n                 reuse=None,\n                 scope='InceptionV4',\n                 create_aux_logits=True):\n  \"\"\"Creates the Inception V4 model.\n\n  Args:\n    inputs: a 4-D tensor of size [batch_size, height, width, 3].\n    num_classes: number of predicted classes.\n    is_training: whether is training or not.\n    dropout_keep_prob: float, the fraction to keep before final layer.\n    reuse: whether or not the network and its variables should be reused. 
To be\n      able to reuse 'scope' must be given.\n    scope: Optional variable_scope.\n    create_aux_logits: Whether to include the auxiliary logits.\n\n  Returns:\n    logits: the logits outputs of the model.\n    end_points: the set of end_points from the inception model.\n  \"\"\"\n  end_points = {}\n  with tf.variable_scope(scope, 'InceptionV4', [inputs], reuse=reuse) as scope:\n    with slim.arg_scope([slim.batch_norm, slim.dropout],\n                        is_training=is_training):\n      net, end_points = inception_v4_base(inputs, scope=scope)\n\n      with slim.arg_scope([slim.conv2d, slim.max_pool2d, slim.avg_pool2d],\n                          stride=1, padding='SAME'):\n        # Auxiliary Head logits\n        if create_aux_logits:\n          with tf.variable_scope('AuxLogits'):\n            # 17 x 17 x 1024\n            aux_logits = end_points['Mixed_6h']\n            aux_logits = slim.avg_pool2d(aux_logits, [5, 5], stride=3,\n                                         padding='VALID',\n                                         scope='AvgPool_1a_5x5')\n            aux_logits = slim.conv2d(aux_logits, 128, [1, 1],\n                                     scope='Conv2d_1b_1x1')\n            aux_logits = slim.conv2d(aux_logits, 768,\n                                     aux_logits.get_shape()[1:3],\n                                     padding='VALID', scope='Conv2d_2a')\n            aux_logits = slim.flatten(aux_logits)\n            aux_logits = slim.fully_connected(aux_logits, num_classes,\n                                              activation_fn=None,\n                                              scope='Aux_logits')\n            end_points['AuxLogits'] = aux_logits\n\n        # Final pooling and prediction\n        with tf.variable_scope('Logits'):\n          # 8 x 8 x 1536\n          net = slim.avg_pool2d(net, net.get_shape()[1:3], padding='VALID',\n                                scope='AvgPool_1a')\n          # 1 x 1 x 1536\n          net = 
slim.dropout(net, dropout_keep_prob, scope='Dropout_1b')\n          net = slim.flatten(net, scope='PreLogitsFlatten')\n          end_points['PreLogitsFlatten'] = net\n          # 1536\n          logits = slim.fully_connected(net, num_classes, activation_fn=None,\n                                        scope='Logits')\n          end_points['Logits'] = logits\n          end_points['Predictions'] = tf.nn.softmax(logits, name='Predictions')\n    return logits, end_points\ninception_v4.default_image_size = 299\n\n\ninception_v4_arg_scope = inception_utils.inception_arg_scope\n"
  },
  {
    "path": "model_zoo/models/slim/nets/inception_v4_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.inception_v4.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import inception\n\n\nclass InceptionTest(tf.test.TestCase):\n\n  def testBuildLogits(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v4(inputs, num_classes)\n    auxlogits = end_points['AuxLogits']\n    predictions = end_points['Predictions']\n    self.assertTrue(auxlogits.op.name.startswith('InceptionV4/AuxLogits'))\n    self.assertListEqual(auxlogits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue(logits.op.name.startswith('InceptionV4/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    self.assertTrue(predictions.op.name.startswith(\n        'InceptionV4/Logits/Predictions'))\n    self.assertListEqual(predictions.get_shape().as_list(),\n                         [batch_size, num_classes])\n\n  def testBuildWithoutAuxLogits(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    
inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, endpoints = inception.inception_v4(inputs, num_classes,\n                                               create_aux_logits=False)\n    self.assertFalse('AuxLogits' in endpoints)\n    self.assertTrue(logits.op.name.startswith('InceptionV4/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n\n  def testAllEndPointsShapes(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    _, end_points = inception.inception_v4(inputs, num_classes)\n    endpoints_shapes = {'Conv2d_1a_3x3': [batch_size, 149, 149, 32],\n                        'Conv2d_2a_3x3': [batch_size, 147, 147, 32],\n                        'Conv2d_2b_3x3': [batch_size, 147, 147, 64],\n                        'Mixed_3a': [batch_size, 73, 73, 160],\n                        'Mixed_4a': [batch_size, 71, 71, 192],\n                        'Mixed_5a': [batch_size, 35, 35, 384],\n                        # 4 x Inception-A blocks\n                        'Mixed_5b': [batch_size, 35, 35, 384],\n                        'Mixed_5c': [batch_size, 35, 35, 384],\n                        'Mixed_5d': [batch_size, 35, 35, 384],\n                        'Mixed_5e': [batch_size, 35, 35, 384],\n                        # Reduction-A block\n                        'Mixed_6a': [batch_size, 17, 17, 1024],\n                        # 7 x Inception-B blocks\n                        'Mixed_6b': [batch_size, 17, 17, 1024],\n                        'Mixed_6c': [batch_size, 17, 17, 1024],\n                        'Mixed_6d': [batch_size, 17, 17, 1024],\n                        'Mixed_6e': [batch_size, 17, 17, 1024],\n                        'Mixed_6f': [batch_size, 17, 17, 1024],\n                        'Mixed_6g': [batch_size, 17, 17, 1024],\n                        'Mixed_6h': [batch_size, 17, 17, 1024],\n      
                  # Reduction-B block\n                        'Mixed_7a': [batch_size, 8, 8, 1536],\n                        # 3 x Inception-C blocks\n                        'Mixed_7b': [batch_size, 8, 8, 1536],\n                        'Mixed_7c': [batch_size, 8, 8, 1536],\n                        'Mixed_7d': [batch_size, 8, 8, 1536],\n                        # Logits and predictions\n                        'AuxLogits': [batch_size, num_classes],\n                        'PreLogitsFlatten': [batch_size, 1536],\n                        'Logits': [batch_size, num_classes],\n                        'Predictions': [batch_size, num_classes]}\n    self.assertItemsEqual(endpoints_shapes.keys(), end_points.keys())\n    for endpoint_name in endpoints_shapes:\n      expected_shape = endpoints_shapes[endpoint_name]\n      self.assertTrue(endpoint_name in end_points)\n      self.assertListEqual(end_points[endpoint_name].get_shape().as_list(),\n                           expected_shape)\n\n  def testBuildBaseNetwork(self):\n    batch_size = 5\n    height, width = 299, 299\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    net, end_points = inception.inception_v4_base(inputs)\n    self.assertTrue(net.op.name.startswith(\n        'InceptionV4/Mixed_7d'))\n    self.assertListEqual(net.get_shape().as_list(), [batch_size, 8, 8, 1536])\n    expected_endpoints = [\n        'Conv2d_1a_3x3', 'Conv2d_2a_3x3', 'Conv2d_2b_3x3', 'Mixed_3a',\n        'Mixed_4a', 'Mixed_5a', 'Mixed_5b', 'Mixed_5c', 'Mixed_5d',\n        'Mixed_5e', 'Mixed_6a', 'Mixed_6b', 'Mixed_6c', 'Mixed_6d',\n        'Mixed_6e', 'Mixed_6f', 'Mixed_6g', 'Mixed_6h', 'Mixed_7a',\n        'Mixed_7b', 'Mixed_7c', 'Mixed_7d']\n    self.assertItemsEqual(end_points.keys(), expected_endpoints)\n    for name, op in end_points.iteritems():\n      self.assertTrue(op.name.startswith('InceptionV4/' + name))\n\n  def testBuildOnlyUpToFinalEndpoint(self):\n    batch_size = 5\n    height, width = 299, 299\n    
all_endpoints = [\n        'Conv2d_1a_3x3', 'Conv2d_2a_3x3', 'Conv2d_2b_3x3', 'Mixed_3a',\n        'Mixed_4a', 'Mixed_5a', 'Mixed_5b', 'Mixed_5c', 'Mixed_5d',\n        'Mixed_5e', 'Mixed_6a', 'Mixed_6b', 'Mixed_6c', 'Mixed_6d',\n        'Mixed_6e', 'Mixed_6f', 'Mixed_6g', 'Mixed_6h', 'Mixed_7a',\n        'Mixed_7b', 'Mixed_7c', 'Mixed_7d']\n    for index, endpoint in enumerate(all_endpoints):\n      with tf.Graph().as_default():\n        inputs = tf.random_uniform((batch_size, height, width, 3))\n        out_tensor, end_points = inception.inception_v4_base(\n            inputs, final_endpoint=endpoint)\n        self.assertTrue(out_tensor.op.name.startswith(\n            'InceptionV4/' + endpoint))\n        self.assertItemsEqual(all_endpoints[:index+1], end_points)\n\n  def testVariablesSetDevice(self):\n    batch_size = 5\n    height, width = 299, 299\n    num_classes = 1000\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    # Force all Variables to reside on the device.\n    with tf.variable_scope('on_cpu'), tf.device('/cpu:0'):\n      inception.inception_v4(inputs, num_classes)\n    with tf.variable_scope('on_gpu'), tf.device('/gpu:0'):\n      inception.inception_v4(inputs, num_classes)\n    for v in tf.get_collection(tf.GraphKeys.VARIABLES, scope='on_cpu'):\n      self.assertDeviceEqual(v.device, '/cpu:0')\n    for v in tf.get_collection(tf.GraphKeys.VARIABLES, scope='on_gpu'):\n      self.assertDeviceEqual(v.device, '/gpu:0')\n\n  def testHalfSizeImages(self):\n    batch_size = 5\n    height, width = 150, 150\n    num_classes = 1000\n    inputs = tf.random_uniform((batch_size, height, width, 3))\n    logits, end_points = inception.inception_v4(inputs, num_classes)\n    self.assertTrue(logits.op.name.startswith('InceptionV4/Logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [batch_size, num_classes])\n    pre_pool = end_points['Mixed_7d']\n    self.assertListEqual(pre_pool.get_shape().as_list(),\n        
                 [batch_size, 3, 3, 1536])\n\n  def testUnknownBatchSize(self):\n    batch_size = 1\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session() as sess:\n      inputs = tf.placeholder(tf.float32, (None, height, width, 3))\n      logits, _ = inception.inception_v4(inputs, num_classes)\n      self.assertTrue(logits.op.name.startswith('InceptionV4/Logits'))\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [None, num_classes])\n      images = tf.random_uniform((batch_size, height, width, 3))\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEquals(output.shape, (batch_size, num_classes))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 299, 299\n    num_classes = 1000\n    with self.test_session() as sess:\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = inception.inception_v4(eval_inputs,\n                                         num_classes,\n                                         is_training=False)\n      predictions = tf.argmax(logits, 1)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (batch_size,))\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 5\n    eval_batch_size = 2\n    height, width = 150, 150\n    num_classes = 1000\n    with self.test_session() as sess:\n      train_inputs = tf.random_uniform((train_batch_size, height, width, 3))\n      inception.inception_v4(train_inputs, num_classes)\n      eval_inputs = tf.random_uniform((eval_batch_size, height, width, 3))\n      logits, _ = inception.inception_v4(eval_inputs,\n                                         num_classes,\n                                         is_training=False,\n                                         reuse=True)\n      predictions = tf.argmax(logits, 1)\n      
sess.run(tf.initialize_all_variables())\n      output = sess.run(predictions)\n      self.assertEquals(output.shape, (eval_batch_size,))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/lenet.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains a variant of the LeNet model definition.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\n\ndef lenet(images, num_classes=10, is_training=False,\n          dropout_keep_prob=0.5,\n          prediction_fn=slim.softmax,\n          scope='LeNet'):\n  \"\"\"Creates a variant of the LeNet model.\n\n  Note that since the output is a set of 'logits', the values fall in the\n  interval of (-infinity, infinity). 
Consequently, to convert the outputs to a\n  probability distribution over the characters, one will need to convert them\n  using the softmax function:\n\n        logits = lenet.lenet(images, is_training=False)\n        probabilities = tf.nn.softmax(logits)\n        predictions = tf.argmax(logits, 1)\n\n  Args:\n    images: A batch of `Tensors` of size [batch_size, height, width, channels].\n    num_classes: the number of classes in the dataset.\n    is_training: specifies whether or not we're currently training the model.\n      This variable will determine the behaviour of the dropout layer.\n    dropout_keep_prob: the percentage of activation values that are retained.\n    prediction_fn: a function to get predictions out of logits.\n    scope: Optional variable_scope.\n\n  Returns:\n    logits: the pre-softmax activations, a tensor of size\n      [batch_size, `num_classes`]\n    end_points: a dictionary from components of the network to the corresponding\n      activation.\n  \"\"\"\n  end_points = {}\n\n  with tf.variable_scope(scope, 'LeNet', [images, num_classes]):\n    net = slim.conv2d(images, 32, [5, 5], scope='conv1')\n    net = slim.max_pool2d(net, [2, 2], 2, scope='pool1')\n    net = slim.conv2d(net, 64, [5, 5], scope='conv2')\n    net = slim.max_pool2d(net, [2, 2], 2, scope='pool2')\n    net = slim.flatten(net)\n    end_points['Flatten'] = net\n\n    net = slim.fully_connected(net, 1024, scope='fc3')\n    net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                       scope='dropout3')\n    logits = slim.fully_connected(net, num_classes, activation_fn=None,\n                                  scope='fc4')\n\n  end_points['Logits'] = logits\n  end_points['Predictions'] = prediction_fn(logits, scope='Predictions')\n\n  return logits, end_points\nlenet.default_image_size = 28\n\n\ndef lenet_arg_scope(weight_decay=0.0):\n  \"\"\"Defines the default lenet argument scope.\n\n  Args:\n    weight_decay: The weight decay to use for 
regularizing the model.\n\n  Returns:\n    An `arg_scope` to use for the inception v3 model.\n  \"\"\"\n  with slim.arg_scope(\n      [slim.conv2d, slim.fully_connected],\n      weights_regularizer=slim.l2_regularizer(weight_decay),\n      weights_initializer=tf.truncated_normal_initializer(stddev=0.1),\n      activation_fn=tf.nn.relu) as sc:\n    return sc\n"
  },
  {
    "path": "model_zoo/models/slim/nets/nets_factory.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains a factory for building various models.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nimport functools\n\nimport tensorflow as tf\n\nfrom nets import alexnet\nfrom nets import cifarnet\nfrom nets import inception\nfrom nets import lenet\nfrom nets import overfeat\nfrom nets import resnet_v1\nfrom nets import resnet_v2\nfrom nets import vgg\n\nslim = tf.contrib.slim\n\nnetworks_map = {'alexnet_v2': alexnet.alexnet_v2,\n                'cifarnet': cifarnet.cifarnet,\n                'overfeat': overfeat.overfeat,\n                'vgg_a': vgg.vgg_a,\n                'vgg_16': vgg.vgg_16,\n                'vgg_19': vgg.vgg_19,\n                'inception_v1': inception.inception_v1,\n                'inception_v2': inception.inception_v2,\n                'inception_v3': inception.inception_v3,\n                'inception_v4': inception.inception_v4,\n                'inception_resnet_v2': inception.inception_resnet_v2,\n                'lenet': lenet.lenet,\n                'resnet_v1_50': resnet_v1.resnet_v1_50,\n                'resnet_v1_101': resnet_v1.resnet_v1_101,\n                'resnet_v1_152': resnet_v1.resnet_v1_152,\n                'resnet_v1_200': 
resnet_v1.resnet_v1_200,\n                'resnet_v2_50': resnet_v2.resnet_v2_50,\n                'resnet_v2_101': resnet_v2.resnet_v2_101,\n                'resnet_v2_152': resnet_v2.resnet_v2_152,\n                'resnet_v2_200': resnet_v2.resnet_v2_200,\n               }\n\narg_scopes_map = {'alexnet_v2': alexnet.alexnet_v2_arg_scope,\n                  'cifarnet': cifarnet.cifarnet_arg_scope,\n                  'overfeat': overfeat.overfeat_arg_scope,\n                  'vgg_a': vgg.vgg_arg_scope,\n                  'vgg_16': vgg.vgg_arg_scope,\n                  'vgg_19': vgg.vgg_arg_scope,\n                  'inception_v1': inception.inception_v3_arg_scope,\n                  'inception_v2': inception.inception_v3_arg_scope,\n                  'inception_v3': inception.inception_v3_arg_scope,\n                  'inception_v4': inception.inception_v4_arg_scope,\n                  'inception_resnet_v2':\n                  inception.inception_resnet_v2_arg_scope,\n                  'lenet': lenet.lenet_arg_scope,\n                  'resnet_v1_50': resnet_v1.resnet_arg_scope,\n                  'resnet_v1_101': resnet_v1.resnet_arg_scope,\n                  'resnet_v1_152': resnet_v1.resnet_arg_scope,\n                  'resnet_v1_200': resnet_v1.resnet_arg_scope,\n                  'resnet_v2_50': resnet_v2.resnet_arg_scope,\n                  'resnet_v2_101': resnet_v2.resnet_arg_scope,\n                  'resnet_v2_152': resnet_v2.resnet_arg_scope,\n                  'resnet_v2_200': resnet_v2.resnet_arg_scope,\n                 }\n\n\ndef get_network_fn(name, num_classes, weight_decay=0.0, is_training=False):\n  \"\"\"Returns a network_fn such as `logits, end_points = network_fn(images)`.\n\n  Args:\n    name: The name of the network.\n    num_classes: The number of classes to use for classification.\n    weight_decay: The l2 coefficient for the model weights.\n    is_training: `True` if the model is being used for training and `False`\n      otherwise.\n\n 
 Returns:\n    network_fn: A function that applies the model to a batch of images. It has\n      the following signature:\n        logits, end_points = network_fn(images)\n  Raises:\n    ValueError: If network `name` is not recognized.\n  \"\"\"\n  if name not in networks_map:\n    raise ValueError('Name of network unknown %s' % name)\n  arg_scope = arg_scopes_map[name](weight_decay=weight_decay)\n  func = networks_map[name]\n  @functools.wraps(func)\n  def network_fn(images):\n    with slim.arg_scope(arg_scope):\n      return func(images, num_classes, is_training=is_training)\n  if hasattr(func, 'default_image_size'):\n    network_fn.default_image_size = func.default_image_size\n\n  return network_fn\n"
  },
  {
    "path": "model_zoo/models/slim/nets/nets_factory_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for slim.inception.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\n\nimport tensorflow as tf\n\nfrom nets import nets_factory\n\n\nclass NetworksTest(tf.test.TestCase):\n\n  def testGetNetworkFn(self):\n    batch_size = 5\n    num_classes = 1000\n    for net in nets_factory.networks_map:\n      with self.test_session():\n        net_fn = nets_factory.get_network_fn(net, num_classes)\n        # Most networks use 224 as their default_image_size\n        image_size = getattr(net_fn, 'default_image_size', 224)\n        inputs = tf.random_uniform((batch_size, image_size, image_size, 3))\n        logits, end_points = net_fn(inputs)\n        self.assertTrue(isinstance(logits, tf.Tensor))\n        self.assertTrue(isinstance(end_points, dict))\n        self.assertEqual(logits.get_shape().as_list()[0], batch_size)\n        self.assertEqual(logits.get_shape().as_list()[-1], num_classes)\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/overfeat.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains the model definition for the OverFeat network.\n\nThe definition for the network was obtained from:\n  OverFeat: Integrated Recognition, Localization and Detection using\n  Convolutional Networks\n  Pierre Sermanet, David Eigen, Xiang Zhang, Michael Mathieu, Rob Fergus and\n  Yann LeCun, 2014\n  http://arxiv.org/abs/1312.6229\n\nUsage:\n  with slim.arg_scope(overfeat.overfeat_arg_scope()):\n    outputs, end_points = overfeat.overfeat(inputs)\n\n@@overfeat\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\ntrunc_normal = lambda stddev: tf.truncated_normal_initializer(0.0, stddev)\n\n\ndef overfeat_arg_scope(weight_decay=0.0005):\n  with slim.arg_scope([slim.conv2d, slim.fully_connected],\n                      activation_fn=tf.nn.relu,\n                      weights_regularizer=slim.l2_regularizer(weight_decay),\n                      biases_initializer=tf.zeros_initializer):\n    with slim.arg_scope([slim.conv2d], padding='SAME'):\n      with slim.arg_scope([slim.max_pool2d], padding='VALID') as arg_sc:\n        return arg_sc\n\n\ndef overfeat(inputs,\n             num_classes=1000,\n             is_training=True,\n          
   dropout_keep_prob=0.5,\n             spatial_squeeze=True,\n             scope='overfeat'):\n  \"\"\"Contains the model definition for the OverFeat network.\n\n  The definition for the network was obtained from:\n    OverFeat: Integrated Recognition, Localization and Detection using\n    Convolutional Networks\n    Pierre Sermanet, David Eigen, Xiang Zhang, Michael Mathieu, Rob Fergus and\n    Yann LeCun, 2014\n    http://arxiv.org/abs/1312.6229\n\n  Note: All the fully_connected layers have been transformed to conv2d layers.\n        To use in classification mode, resize input to 231x231. To use in fully\n        convolutional mode, set spatial_squeeze to false.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether or not the model is being trained.\n    dropout_keep_prob: the probability that activations are kept in the dropout\n      layers during training.\n    spatial_squeeze: whether or not should squeeze the spatial dimensions of the\n      outputs. 
Useful to remove unnecessary dimensions for classification.\n    scope: Optional scope for the variables.\n\n  Returns:\n    the last op containing the log predictions and end_points dict.\n\n  \"\"\"\n  with tf.variable_scope(scope, 'overfeat', [inputs]) as sc:\n    end_points_collection = sc.name + '_end_points'\n    # Collect outputs for conv2d, fully_connected and max_pool2d\n    with slim.arg_scope([slim.conv2d, slim.fully_connected, slim.max_pool2d],\n                        outputs_collections=end_points_collection):\n      net = slim.conv2d(inputs, 64, [11, 11], 4, padding='VALID',\n                        scope='conv1')\n      net = slim.max_pool2d(net, [2, 2], scope='pool1')\n      net = slim.conv2d(net, 256, [5, 5], padding='VALID', scope='conv2')\n      net = slim.max_pool2d(net, [2, 2], scope='pool2')\n      net = slim.conv2d(net, 512, [3, 3], scope='conv3')\n      net = slim.conv2d(net, 1024, [3, 3], scope='conv4')\n      net = slim.conv2d(net, 1024, [3, 3], scope='conv5')\n      net = slim.max_pool2d(net, [2, 2], scope='pool5')\n      with slim.arg_scope([slim.conv2d],\n                          weights_initializer=trunc_normal(0.005),\n                          biases_initializer=tf.constant_initializer(0.1)):\n        # Use conv2d instead of fully_connected layers.\n        net = slim.conv2d(net, 3072, [6, 6], padding='VALID', scope='fc6')\n        net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                           scope='dropout6')\n        net = slim.conv2d(net, 4096, [1, 1], scope='fc7')\n        net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                           scope='dropout7')\n        net = slim.conv2d(net, num_classes, [1, 1],\n                          activation_fn=None,\n                          normalizer_fn=None,\n                          biases_initializer=tf.zeros_initializer,\n                          scope='fc8')\n      # Convert end_points_collection into a end_point 
dict.\n      end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n      if spatial_squeeze:\n        net = tf.squeeze(net, [1, 2], name='fc8/squeezed')\n        end_points[sc.name + '/fc8'] = net\n      return net, end_points\noverfeat.default_image_size = 231\n"
  },
  {
    "path": "model_zoo/models/slim/nets/overfeat_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.nets.overfeat.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import overfeat\n\nslim = tf.contrib.slim\n\n\nclass OverFeatTest(tf.test.TestCase):\n\n  def testBuild(self):\n    batch_size = 5\n    height, width = 231, 231\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = overfeat.overfeat(inputs, num_classes)\n      self.assertEquals(logits.op.name, 'overfeat/fc8/squeezed')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testFullyConvolutional(self):\n    batch_size = 1\n    height, width = 281, 281\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = overfeat.overfeat(inputs, num_classes, spatial_squeeze=False)\n      self.assertEquals(logits.op.name, 'overfeat/fc8/BiasAdd')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, 2, 2, num_classes])\n\n  def testEndPoints(self):\n    batch_size = 5\n    height, width = 231, 231\n    
num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = overfeat.overfeat(inputs, num_classes)\n      expected_names = ['overfeat/conv1',\n                        'overfeat/pool1',\n                        'overfeat/conv2',\n                        'overfeat/pool2',\n                        'overfeat/conv3',\n                        'overfeat/conv4',\n                        'overfeat/conv5',\n                        'overfeat/pool5',\n                        'overfeat/fc6',\n                        'overfeat/fc7',\n                        'overfeat/fc8'\n                       ]\n      self.assertSetEqual(set(end_points.keys()), set(expected_names))\n\n  def testModelVariables(self):\n    batch_size = 5\n    height, width = 231, 231\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      overfeat.overfeat(inputs, num_classes)\n      expected_names = ['overfeat/conv1/weights',\n                        'overfeat/conv1/biases',\n                        'overfeat/conv2/weights',\n                        'overfeat/conv2/biases',\n                        'overfeat/conv3/weights',\n                        'overfeat/conv3/biases',\n                        'overfeat/conv4/weights',\n                        'overfeat/conv4/biases',\n                        'overfeat/conv5/weights',\n                        'overfeat/conv5/biases',\n                        'overfeat/fc6/weights',\n                        'overfeat/fc6/biases',\n                        'overfeat/fc7/weights',\n                        'overfeat/fc7/biases',\n                        'overfeat/fc8/weights',\n                        'overfeat/fc8/biases',\n                       ]\n      model_variables = [v.op.name for v in slim.get_model_variables()]\n      self.assertSetEqual(set(model_variables), set(expected_names))\n\n  def testEvaluation(self):\n    
batch_size = 2\n    height, width = 231, 231\n    num_classes = 1000\n    with self.test_session():\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = overfeat.overfeat(eval_inputs, is_training=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      predictions = tf.argmax(logits, 1)\n      self.assertListEqual(predictions.get_shape().as_list(), [batch_size])\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 2\n    eval_batch_size = 1\n    train_height, train_width = 231, 231\n    eval_height, eval_width = 281, 281\n    num_classes = 1000\n    with self.test_session():\n      train_inputs = tf.random_uniform(\n          (train_batch_size, train_height, train_width, 3))\n      logits, _ = overfeat.overfeat(train_inputs)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [train_batch_size, num_classes])\n      tf.get_variable_scope().reuse_variables()\n      eval_inputs = tf.random_uniform(\n          (eval_batch_size, eval_height, eval_width, 3))\n      logits, _ = overfeat.overfeat(eval_inputs, is_training=False,\n                                    spatial_squeeze=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [eval_batch_size, 2, 2, num_classes])\n      logits = tf.reduce_mean(logits, [1, 2])\n      predictions = tf.argmax(logits, 1)\n      self.assertEquals(predictions.get_shape().as_list(), [eval_batch_size])\n\n  def testForward(self):\n    batch_size = 1\n    height, width = 231, 231\n    with self.test_session() as sess:\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = overfeat.overfeat(inputs)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits)\n      self.assertTrue(output.any())\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/resnet_utils.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains building blocks for various versions of Residual Networks.\n\nResidual networks (ResNets) were proposed in:\n  Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun\n  Deep Residual Learning for Image Recognition. arXiv:1512.03385, 2015\n\nMore variants were introduced in:\n  Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun\n  Identity Mappings in Deep Residual Networks. arXiv: 1603.05027, 2016\n\nWe can obtain different ResNet variants by changing the network depth, width,\nand form of residual unit. This module implements the infrastructure for\nbuilding them. Concrete ResNet units and full ResNet networks are implemented in\nthe accompanying resnet_v1.py and resnet_v2.py modules.\n\nCompared to https://github.com/KaimingHe/deep-residual-networks, in the current\nimplementation we subsample the output activations in the last residual unit of\neach block, instead of subsampling the input activations in the first residual\nunit of each block. 
The two implementations give identical results but our\nimplementation is more memory efficient.\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport collections\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\n\nclass Block(collections.namedtuple('Block', ['scope', 'unit_fn', 'args'])):\n  \"\"\"A named tuple describing a ResNet block.\n\n  Its parts are:\n    scope: The scope of the `Block`.\n    unit_fn: The ResNet unit function which takes as input a `Tensor` and\n      returns another `Tensor` with the output of the ResNet unit.\n    args: A list of length equal to the number of units in the `Block`. The list\n      contains one (depth, depth_bottleneck, stride) tuple for each unit in the\n      block to serve as argument to unit_fn.\n  \"\"\"\n\n\ndef subsample(inputs, factor, scope=None):\n  \"\"\"Subsamples the input along the spatial dimensions.\n\n  Args:\n    inputs: A `Tensor` of size [batch, height_in, width_in, channels].\n    factor: The subsampling factor.\n    scope: Optional variable_scope.\n\n  Returns:\n    output: A `Tensor` of size [batch, height_out, width_out, channels] with the\n      input, either intact (if factor == 1) or subsampled (if factor > 1).\n  \"\"\"\n  if factor == 1:\n    return inputs\n  else:\n    return slim.max_pool2d(inputs, [1, 1], stride=factor, scope=scope)\n\n\ndef conv2d_same(inputs, num_outputs, kernel_size, stride, rate=1, scope=None):\n  \"\"\"Strided 2-D convolution with 'SAME' padding.\n\n  When stride > 1, then we do explicit zero-padding, followed by conv2d with\n  'VALID' padding.\n\n  Note that\n\n     net = conv2d_same(inputs, num_outputs, 3, stride=stride)\n\n  is equivalent to\n\n     net = slim.conv2d(inputs, num_outputs, 3, stride=1, padding='SAME')\n     net = subsample(net, factor=stride)\n\n  whereas\n\n     net = slim.conv2d(inputs, num_outputs, 3, stride=stride, padding='SAME')\n\n  is different when the input's 
height or width is even, which is why we add the\n  current function. For more details, see ResnetUtilsTest.testConv2DSameEven().\n\n  Args:\n    inputs: A 4-D tensor of size [batch, height_in, width_in, channels].\n    num_outputs: An integer, the number of output filters.\n    kernel_size: An int with the kernel_size of the filters.\n    stride: An integer, the output stride.\n    rate: An integer, rate for atrous convolution.\n    scope: Scope.\n\n  Returns:\n    output: A 4-D tensor of size [batch, height_out, width_out, channels] with\n      the convolution output.\n  \"\"\"\n  if stride == 1:\n    return slim.conv2d(inputs, num_outputs, kernel_size, stride=1, rate=rate,\n                       padding='SAME', scope=scope)\n  else:\n    kernel_size_effective = kernel_size + (kernel_size - 1) * (rate - 1)\n    pad_total = kernel_size_effective - 1\n    pad_beg = pad_total // 2\n    pad_end = pad_total - pad_beg\n    inputs = tf.pad(inputs,\n                    [[0, 0], [pad_beg, pad_end], [pad_beg, pad_end], [0, 0]])\n    return slim.conv2d(inputs, num_outputs, kernel_size, stride=stride,\n                       rate=rate, padding='VALID', scope=scope)\n\n\n@slim.add_arg_scope\ndef stack_blocks_dense(net, blocks, output_stride=None,\n                       outputs_collections=None):\n  \"\"\"Stacks ResNet `Blocks` and controls output feature density.\n\n  First, this function creates scopes for the ResNet in the form of\n  'block_name/unit_1', 'block_name/unit_2', etc.\n\n  Second, this function allows the user to explicitly control the ResNet\n  output_stride, which is the ratio of the input to output spatial resolution.\n  This is useful for dense prediction tasks such as semantic segmentation or\n  object detection.\n\n  Most ResNets consist of 4 ResNet blocks and subsample the activations by a\n  factor of 2 when transitioning between consecutive ResNet blocks. This results\n  to a nominal ResNet output_stride equal to 8. 
If we set the output_stride to\n  half the nominal network stride (e.g., output_stride=4), then we compute\n  responses twice.\n\n  Control of the output feature density is implemented by atrous convolution.\n\n  Args:\n    net: A `Tensor` of size [batch, height, width, channels].\n    blocks: A list of length equal to the number of ResNet `Blocks`. Each\n      element is a ResNet `Block` object describing the units in the `Block`.\n    output_stride: If `None`, then the output will be computed at the nominal\n      network stride. If output_stride is not `None`, it specifies the requested\n      ratio of input to output spatial resolution, which needs to be equal to\n      the product of unit strides from the start up to some level of the ResNet.\n      For example, if the ResNet employs units with strides 1, 2, 1, 3, 4, 1,\n      then valid values for the output_stride are 1, 2, 6, 24 or None (which\n      is equivalent to output_stride=24).\n    outputs_collections: Collection to add the ResNet block outputs.\n\n  Returns:\n    net: Output tensor with stride equal to the specified output_stride.\n\n  Raises:\n    ValueError: If the target output_stride is not valid.\n  \"\"\"\n  # The current_stride variable keeps track of the effective stride of the\n  # activations. 
This allows us to invoke atrous convolution whenever applying\n  # the next residual unit would result in the activations having stride larger\n  # than the target output_stride.\n  current_stride = 1\n\n  # The atrous convolution rate parameter.\n  rate = 1\n\n  for block in blocks:\n    with tf.variable_scope(block.scope, 'block', [net]) as sc:\n      for i, unit in enumerate(block.args):\n        if output_stride is not None and current_stride > output_stride:\n          raise ValueError('The target output_stride cannot be reached.')\n\n        with tf.variable_scope('unit_%d' % (i + 1), values=[net]):\n          unit_depth, unit_depth_bottleneck, unit_stride = unit\n\n          # If we have reached the target output_stride, then we need to employ\n          # atrous convolution with stride=1 and multiply the atrous rate by the\n          # current unit's stride for use in subsequent layers.\n          if output_stride is not None and current_stride == output_stride:\n            net = block.unit_fn(net,\n                                depth=unit_depth,\n                                depth_bottleneck=unit_depth_bottleneck,\n                                stride=1,\n                                rate=rate)\n            rate *= unit_stride\n\n          else:\n            net = block.unit_fn(net,\n                                depth=unit_depth,\n                                depth_bottleneck=unit_depth_bottleneck,\n                                stride=unit_stride,\n                                rate=1)\n            current_stride *= unit_stride\n      net = slim.utils.collect_named_outputs(outputs_collections, sc.name, net)\n\n  if output_stride is not None and current_stride != output_stride:\n    raise ValueError('The target output_stride cannot be reached.')\n\n  return net\n\n\ndef resnet_arg_scope(weight_decay=0.0001,\n                     batch_norm_decay=0.997,\n                     batch_norm_epsilon=1e-5,\n                     
batch_norm_scale=True):\n  \"\"\"Defines the default ResNet arg scope.\n\n  TODO(gpapan): The batch-normalization related default values above are\n    appropriate for use in conjunction with the reference ResNet models\n    released at https://github.com/KaimingHe/deep-residual-networks. When\n    training ResNets from scratch, they might need to be tuned.\n\n  Args:\n    weight_decay: The weight decay to use for regularizing the model.\n    batch_norm_decay: The moving average decay when estimating layer activation\n      statistics in batch normalization.\n    batch_norm_epsilon: Small constant to prevent division by zero when\n      normalizing activations by their variance in batch normalization.\n    batch_norm_scale: If True, uses an explicit `gamma` multiplier to scale the\n      activations in the batch normalization layer.\n\n  Returns:\n    An `arg_scope` to use for the resnet models.\n  \"\"\"\n  batch_norm_params = {\n      'decay': batch_norm_decay,\n      'epsilon': batch_norm_epsilon,\n      'scale': batch_norm_scale,\n      'updates_collections': tf.GraphKeys.UPDATE_OPS,\n  }\n\n  with slim.arg_scope(\n      [slim.conv2d],\n      weights_regularizer=slim.l2_regularizer(weight_decay),\n      weights_initializer=slim.variance_scaling_initializer(),\n      activation_fn=tf.nn.relu,\n      normalizer_fn=slim.batch_norm,\n      normalizer_params=batch_norm_params):\n    with slim.arg_scope([slim.batch_norm], **batch_norm_params):\n      # The following implies padding='SAME' for pool1, which makes feature\n      # alignment easier for dense prediction tasks. This is also used in\n      # https://github.com/facebook/fb.resnet.torch. However the accompanying\n      # code of 'Deep Residual Learning for Image Recognition' uses\n      # padding='VALID' for pool1. 
You can switch to that choice by setting\n      # slim.arg_scope([slim.max_pool2d], padding='VALID').\n      with slim.arg_scope([slim.max_pool2d], padding='SAME') as arg_sc:\n        return arg_sc\n"
  },
  {
    "path": "model_zoo/models/slim/nets/resnet_v1.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains definitions for the original form of Residual Networks.\n\nThe 'v1' residual networks (ResNets) implemented in this module were proposed\nby:\n[1] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun\n    Deep Residual Learning for Image Recognition. arXiv:1512.03385\n\nOther variants were introduced in:\n[2] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun\n    Identity Mappings in Deep Residual Networks. arXiv: 1603.05027\n\nThe networks defined in this module utilize the bottleneck building block of\n[1] with projection shortcuts only for increasing depths. They employ batch\nnormalization *after* every weight layer. This is the architecture used by\nMSRA in the Imagenet and MSCOCO 2016 competition models ResNet-101 and\nResNet-152. See [2; Fig. 
1a] for a comparison between the current 'v1'\narchitecture and the alternative 'v2' architecture of [2] which uses batch\nnormalization *before* every weight layer in the so-called full pre-activation\nunits.\n\nTypical use:\n\n   from tensorflow.contrib.slim.nets import resnet_v1\n\nResNet-101 for image classification into 1000 classes:\n\n   # inputs has shape [batch, 224, 224, 3]\n   with slim.arg_scope(resnet_v1.resnet_arg_scope()):\n      net, end_points = resnet_v1.resnet_v1_101(inputs, 1000, is_training=False)\n\nResNet-101 for semantic segmentation into 21 classes:\n\n   # inputs has shape [batch, 513, 513, 3]\n   with slim.arg_scope(resnet_v1.resnet_arg_scope()):\n      net, end_points = resnet_v1.resnet_v1_101(inputs,\n                                                21,\n                                                is_training=False,\n                                                global_pool=False,\n                                                output_stride=16)\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import resnet_utils\n\n\nresnet_arg_scope = resnet_utils.resnet_arg_scope\nslim = tf.contrib.slim\n\n\n@slim.add_arg_scope\ndef bottleneck(inputs, depth, depth_bottleneck, stride, rate=1,\n               outputs_collections=None, scope=None):\n  \"\"\"Bottleneck residual unit variant with BN after convolutions.\n\n  This is the original residual unit proposed in [1]. See Fig. 1(a) of [2] for\n  its definition. 
Note that we use here the bottleneck variant which has an\n  extra bottleneck layer.\n\n  When putting together two consecutive ResNet blocks that use this unit, one\n  should use stride = 2 in the last unit of the first block.\n\n  Args:\n    inputs: A tensor of size [batch, height, width, channels].\n    depth: The depth of the ResNet unit output.\n    depth_bottleneck: The depth of the bottleneck layers.\n    stride: The ResNet unit's stride. Determines the amount of downsampling of\n      the units output compared to its input.\n    rate: An integer, rate for atrous convolution.\n    outputs_collections: Collection to add the ResNet unit output.\n    scope: Optional variable_scope.\n\n  Returns:\n    The ResNet unit's output.\n  \"\"\"\n  with tf.variable_scope(scope, 'bottleneck_v1', [inputs]) as sc:\n    depth_in = slim.utils.last_dimension(inputs.get_shape(), min_rank=4)\n    if depth == depth_in:\n      shortcut = resnet_utils.subsample(inputs, stride, 'shortcut')\n    else:\n      shortcut = slim.conv2d(inputs, depth, [1, 1], stride=stride,\n                             activation_fn=None, scope='shortcut')\n\n    residual = slim.conv2d(inputs, depth_bottleneck, [1, 1], stride=1,\n                           scope='conv1')\n    residual = resnet_utils.conv2d_same(residual, depth_bottleneck, 3, stride,\n                                        rate=rate, scope='conv2')\n    residual = slim.conv2d(residual, depth, [1, 1], stride=1,\n                           activation_fn=None, scope='conv3')\n\n    output = tf.nn.relu(shortcut + residual)\n\n    return slim.utils.collect_named_outputs(outputs_collections,\n                                            sc.original_name_scope,\n                                            output)\n\n\ndef resnet_v1(inputs,\n              blocks,\n              num_classes=None,\n              is_training=True,\n              global_pool=True,\n              output_stride=None,\n              include_root_block=True,\n             
 reuse=None,\n              scope=None):\n  \"\"\"Generator for v1 ResNet models.\n\n  This function generates a family of ResNet v1 models. See the resnet_v1_*()\n  methods for specific model instantiations, obtained by selecting different\n  block instantiations that produce ResNets of various depths.\n\n  Training for image classification on Imagenet is usually done with [224, 224]\n  inputs, resulting in [7, 7] feature maps at the output of the last ResNet\n  block for the ResNets defined in [1] that have nominal stride equal to 32.\n  However, for dense prediction tasks we advise that one uses inputs with\n  spatial dimensions that are multiples of 32 plus 1, e.g., [321, 321]. In\n  this case the feature maps at the ResNet output will have spatial shape\n  [(height - 1) / output_stride + 1, (width - 1) / output_stride + 1]\n  and corners exactly aligned with the input image corners, which greatly\n  facilitates alignment of the features to the image. Using as input [225, 225]\n  images results in [8, 8] feature maps at the output of the last ResNet block.\n\n  For dense prediction tasks, the ResNet needs to run in fully-convolutional\n  (FCN) mode and global_pool needs to be set to False. The ResNets in [1, 2] all\n  have nominal stride equal to 32 and a good choice in FCN mode is to use\n  output_stride=16 in order to increase the density of the computed features at\n  small computational and memory overhead, cf. http://arxiv.org/abs/1606.00915.\n\n  Args:\n    inputs: A tensor of size [batch, height_in, width_in, channels].\n    blocks: A list of length equal to the number of ResNet blocks. Each element\n      is a resnet_utils.Block object describing the units in the block.\n    num_classes: Number of predicted classes for classification tasks. If None\n      we return the features before the logit layer.\n    is_training: whether is training or not.\n    global_pool: If True, we perform global average pooling before computing the\n      logits. 
Set to True for image classification, False for dense prediction.\n    output_stride: If None, then the output will be computed at the nominal\n      network stride. If output_stride is not None, it specifies the requested\n      ratio of input to output spatial resolution.\n    include_root_block: If True, include the initial convolution followed by\n      max-pooling, if False excludes it.\n    reuse: whether or not the network and its variables should be reused. To be\n      able to reuse 'scope' must be given.\n    scope: Optional variable_scope.\n\n  Returns:\n    net: A rank-4 tensor of size [batch, height_out, width_out, channels_out].\n      If global_pool is False, then height_out and width_out are reduced by a\n      factor of output_stride compared to the respective height_in and width_in,\n      else both height_out and width_out equal one. If num_classes is None, then\n      net is the output of the last ResNet block, potentially after global\n      average pooling. If num_classes is not None, net contains the pre-softmax\n      activations.\n    end_points: A dictionary from components of the network to the corresponding\n      activation.\n\n  Raises:\n    ValueError: If the target output_stride is not valid.\n  \"\"\"\n  with tf.variable_scope(scope, 'resnet_v1', [inputs], reuse=reuse) as sc:\n    end_points_collection = sc.name + '_end_points'\n    with slim.arg_scope([slim.conv2d, bottleneck,\n                         resnet_utils.stack_blocks_dense],\n                        outputs_collections=end_points_collection):\n      with slim.arg_scope([slim.batch_norm], is_training=is_training):\n        net = inputs\n        if include_root_block:\n          if output_stride is not None:\n            if output_stride % 4 != 0:\n              raise ValueError('The output_stride needs to be a multiple of 4.')\n            output_stride /= 4\n          net = resnet_utils.conv2d_same(net, 64, 7, stride=2, scope='conv1')\n          net = 
slim.max_pool2d(net, [3, 3], stride=2, scope='pool1')\n        net = resnet_utils.stack_blocks_dense(net, blocks, output_stride)\n        if global_pool:\n          # Global average pooling.\n          net = tf.reduce_mean(net, [1, 2], name='pool5', keep_dims=True)\n        if num_classes is not None:\n          net = slim.conv2d(net, num_classes, [1, 1], activation_fn=None,\n                            normalizer_fn=None, scope='logits')\n        # Convert end_points_collection into a dictionary of end_points.\n        end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n        if num_classes is not None:\n          end_points['predictions'] = slim.softmax(net, scope='predictions')\n        return net, end_points\nresnet_v1.default_image_size = 224\n\n\ndef resnet_v1_50(inputs,\n                 num_classes=None,\n                 is_training=True,\n                 global_pool=True,\n                 output_stride=None,\n                 reuse=None,\n                 scope='resnet_v1_50'):\n  \"\"\"ResNet-50 model of [1]. 
See resnet_v1() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 3 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 5 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)\n  ]\n  return resnet_v1(inputs, blocks, num_classes, is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n\n\ndef resnet_v1_101(inputs,\n                  num_classes=None,\n                  is_training=True,\n                  global_pool=True,\n                  output_stride=None,\n                  reuse=None,\n                  scope='resnet_v1_101'):\n  \"\"\"ResNet-101 model of [1]. See resnet_v1() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 3 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 22 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)\n  ]\n  return resnet_v1(inputs, blocks, num_classes, is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n\n\ndef resnet_v1_152(inputs,\n                  num_classes=None,\n                  is_training=True,\n                  global_pool=True,\n                  output_stride=None,\n                  reuse=None,\n                  scope='resnet_v1_152'):\n  \"\"\"ResNet-152 model of [1]. 
See resnet_v1() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 7 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 35 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)]\n  return resnet_v1(inputs, blocks, num_classes, is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n\n\ndef resnet_v1_200(inputs,\n                  num_classes=None,\n                  is_training=True,\n                  global_pool=True,\n                  output_stride=None,\n                  reuse=None,\n                  scope='resnet_v1_200'):\n  \"\"\"ResNet-200 model of [2]. See resnet_v1() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 23 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 35 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)]\n  return resnet_v1(inputs, blocks, num_classes, is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n"
  },
  {
    "path": "model_zoo/models/slim/nets/resnet_v1_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.nets.resnet_v1.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom nets import resnet_utils\nfrom nets import resnet_v1\n\nslim = tf.contrib.slim\n\n\ndef create_test_input(batch_size, height, width, channels):\n  \"\"\"Create test input tensor.\n\n  Args:\n    batch_size: The number of images per batch or `None` if unknown.\n    height: The height of each image or `None` if unknown.\n    width: The width of each image or `None` if unknown.\n    channels: The number of channels per image or `None` if unknown.\n\n  Returns:\n    Either a placeholder `Tensor` of dimension\n      [batch_size, height, width, channels] if any of the inputs are `None` or a\n    constant `Tensor` with the mesh grid values along the spatial dimensions.\n  \"\"\"\n  if None in [batch_size, height, width, channels]:\n    return tf.placeholder(tf.float32, (batch_size, height, width, channels))\n  else:\n    return tf.to_float(\n        np.tile(\n            np.reshape(\n                np.reshape(np.arange(height), [height, 1]) +\n                np.reshape(np.arange(width), [1, width]),\n                [1, height, width, 1]),\n            
[batch_size, 1, 1, channels]))\n\n\nclass ResnetUtilsTest(tf.test.TestCase):\n\n  def testSubsampleThreeByThree(self):\n    x = tf.reshape(tf.to_float(tf.range(9)), [1, 3, 3, 1])\n    x = resnet_utils.subsample(x, 2)\n    expected = tf.reshape(tf.constant([0, 2, 6, 8]), [1, 2, 2, 1])\n    with self.test_session():\n      self.assertAllClose(x.eval(), expected.eval())\n\n  def testSubsampleFourByFour(self):\n    x = tf.reshape(tf.to_float(tf.range(16)), [1, 4, 4, 1])\n    x = resnet_utils.subsample(x, 2)\n    expected = tf.reshape(tf.constant([0, 2, 8, 10]), [1, 2, 2, 1])\n    with self.test_session():\n      self.assertAllClose(x.eval(), expected.eval())\n\n  def testConv2DSameEven(self):\n    n, n2 = 4, 2\n\n    # Input image.\n    x = create_test_input(1, n, n, 1)\n\n    # Convolution kernel.\n    w = create_test_input(1, 3, 3, 1)\n    w = tf.reshape(w, [3, 3, 1, 1])\n\n    tf.get_variable('Conv/weights', initializer=w)\n    tf.get_variable('Conv/biases', initializer=tf.zeros([1]))\n    tf.get_variable_scope().reuse_variables()\n\n    y1 = slim.conv2d(x, 1, [3, 3], stride=1, scope='Conv')\n    y1_expected = tf.to_float([[14, 28, 43, 26],\n                               [28, 48, 66, 37],\n                               [43, 66, 84, 46],\n                               [26, 37, 46, 22]])\n    y1_expected = tf.reshape(y1_expected, [1, n, n, 1])\n\n    y2 = resnet_utils.subsample(y1, 2)\n    y2_expected = tf.to_float([[14, 43],\n                               [43, 84]])\n    y2_expected = tf.reshape(y2_expected, [1, n2, n2, 1])\n\n    y3 = resnet_utils.conv2d_same(x, 1, 3, stride=2, scope='Conv')\n    y3_expected = y2_expected\n\n    y4 = slim.conv2d(x, 1, [3, 3], stride=2, scope='Conv')\n    y4_expected = tf.to_float([[48, 37],\n                               [37, 22]])\n    y4_expected = tf.reshape(y4_expected, [1, n2, n2, 1])\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      self.assertAllClose(y1.eval(), 
y1_expected.eval())\n      self.assertAllClose(y2.eval(), y2_expected.eval())\n      self.assertAllClose(y3.eval(), y3_expected.eval())\n      self.assertAllClose(y4.eval(), y4_expected.eval())\n\n  def testConv2DSameOdd(self):\n    n, n2 = 5, 3\n\n    # Input image.\n    x = create_test_input(1, n, n, 1)\n\n    # Convolution kernel.\n    w = create_test_input(1, 3, 3, 1)\n    w = tf.reshape(w, [3, 3, 1, 1])\n\n    tf.get_variable('Conv/weights', initializer=w)\n    tf.get_variable('Conv/biases', initializer=tf.zeros([1]))\n    tf.get_variable_scope().reuse_variables()\n\n    y1 = slim.conv2d(x, 1, [3, 3], stride=1, scope='Conv')\n    y1_expected = tf.to_float([[14, 28, 43, 58, 34],\n                               [28, 48, 66, 84, 46],\n                               [43, 66, 84, 102, 55],\n                               [58, 84, 102, 120, 64],\n                               [34, 46, 55, 64, 30]])\n    y1_expected = tf.reshape(y1_expected, [1, n, n, 1])\n\n    y2 = resnet_utils.subsample(y1, 2)\n    y2_expected = tf.to_float([[14, 43, 34],\n                               [43, 84, 55],\n                               [34, 55, 30]])\n    y2_expected = tf.reshape(y2_expected, [1, n2, n2, 1])\n\n    y3 = resnet_utils.conv2d_same(x, 1, 3, stride=2, scope='Conv')\n    y3_expected = y2_expected\n\n    y4 = slim.conv2d(x, 1, [3, 3], stride=2, scope='Conv')\n    y4_expected = y2_expected\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      self.assertAllClose(y1.eval(), y1_expected.eval())\n      self.assertAllClose(y2.eval(), y2_expected.eval())\n      self.assertAllClose(y3.eval(), y3_expected.eval())\n      self.assertAllClose(y4.eval(), y4_expected.eval())\n\n  def _resnet_plain(self, inputs, blocks, output_stride=None, scope=None):\n    \"\"\"A plain ResNet without extra layers before or after the ResNet blocks.\"\"\"\n    with tf.variable_scope(scope, values=[inputs]):\n      with slim.arg_scope([slim.conv2d], 
outputs_collections='end_points'):\n        net = resnet_utils.stack_blocks_dense(inputs, blocks, output_stride)\n        end_points = dict(tf.get_collection('end_points'))\n        return net, end_points\n\n  def testEndPointsV1(self):\n    \"\"\"Test the end points of a tiny v1 bottleneck network.\"\"\"\n    bottleneck = resnet_v1.bottleneck\n    blocks = [resnet_utils.Block('block1', bottleneck, [(4, 1, 1), (4, 1, 2)]),\n              resnet_utils.Block('block2', bottleneck, [(8, 2, 1), (8, 2, 1)])]\n    inputs = create_test_input(2, 32, 16, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_plain(inputs, blocks, scope='tiny')\n    expected = [\n        'tiny/block1/unit_1/bottleneck_v1/shortcut',\n        'tiny/block1/unit_1/bottleneck_v1/conv1',\n        'tiny/block1/unit_1/bottleneck_v1/conv2',\n        'tiny/block1/unit_1/bottleneck_v1/conv3',\n        'tiny/block1/unit_2/bottleneck_v1/conv1',\n        'tiny/block1/unit_2/bottleneck_v1/conv2',\n        'tiny/block1/unit_2/bottleneck_v1/conv3',\n        'tiny/block2/unit_1/bottleneck_v1/shortcut',\n        'tiny/block2/unit_1/bottleneck_v1/conv1',\n        'tiny/block2/unit_1/bottleneck_v1/conv2',\n        'tiny/block2/unit_1/bottleneck_v1/conv3',\n        'tiny/block2/unit_2/bottleneck_v1/conv1',\n        'tiny/block2/unit_2/bottleneck_v1/conv2',\n        'tiny/block2/unit_2/bottleneck_v1/conv3']\n    self.assertItemsEqual(expected, end_points)\n\n  def _stack_blocks_nondense(self, net, blocks):\n    \"\"\"A simplified ResNet Block stacker without output stride control.\"\"\"\n    for block in blocks:\n      with tf.variable_scope(block.scope, 'block', [net]):\n        for i, unit in enumerate(block.args):\n          depth, depth_bottleneck, stride = unit\n          with tf.variable_scope('unit_%d' % (i + 1), values=[net]):\n            net = block.unit_fn(net,\n                                depth=depth,\n                                
depth_bottleneck=depth_bottleneck,\n                                stride=stride,\n                                rate=1)\n    return net\n\n  def _atrousValues(self, bottleneck):\n    \"\"\"Verify the values of dense feature extraction by atrous convolution.\n\n    Make sure that dense feature extraction by stack_blocks_dense() followed by\n    subsampling gives identical results to feature extraction at the nominal\n    network output stride using the simple self._stack_blocks_nondense() above.\n\n    Args:\n      bottleneck: The bottleneck function.\n    \"\"\"\n    blocks = [\n        resnet_utils.Block('block1', bottleneck, [(4, 1, 1), (4, 1, 2)]),\n        resnet_utils.Block('block2', bottleneck, [(8, 2, 1), (8, 2, 2)]),\n        resnet_utils.Block('block3', bottleneck, [(16, 4, 1), (16, 4, 2)]),\n        resnet_utils.Block('block4', bottleneck, [(32, 8, 1), (32, 8, 1)])\n    ]\n    nominal_stride = 8\n\n    # Test both odd and even input dimensions.\n    height = 30\n    width = 31\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      with slim.arg_scope([slim.batch_norm], is_training=False):\n        for output_stride in [1, 2, 4, 8, None]:\n          with tf.Graph().as_default():\n            with self.test_session() as sess:\n              tf.set_random_seed(0)\n              inputs = create_test_input(1, height, width, 3)\n              # Dense feature extraction followed by subsampling.\n              output = resnet_utils.stack_blocks_dense(inputs,\n                                                       blocks,\n                                                       output_stride)\n              if output_stride is None:\n                factor = 1\n              else:\n                factor = nominal_stride // output_stride\n\n              output = resnet_utils.subsample(output, factor)\n              # Make the two networks use the same weights.\n              tf.get_variable_scope().reuse_variables()\n              # Feature 
extraction at the nominal network rate.\n              expected = self._stack_blocks_nondense(inputs, blocks)\n              sess.run(tf.initialize_all_variables())\n              output, expected = sess.run([output, expected])\n              self.assertAllClose(output, expected, atol=1e-4, rtol=1e-4)\n\n  def testAtrousValuesBottleneck(self):\n    self._atrousValues(resnet_v1.bottleneck)\n\n\nclass ResnetCompleteNetworkTest(tf.test.TestCase):\n  \"\"\"Tests with complete small ResNet v1 networks.\"\"\"\n\n  def _resnet_small(self,\n                    inputs,\n                    num_classes=None,\n                    is_training=True,\n                    global_pool=True,\n                    output_stride=None,\n                    include_root_block=True,\n                    reuse=None,\n                    scope='resnet_v1_small'):\n    \"\"\"A shallow and thin ResNet v1 for faster tests.\"\"\"\n    bottleneck = resnet_v1.bottleneck\n    blocks = [\n        resnet_utils.Block(\n            'block1', bottleneck, [(4, 1, 1)] * 2 + [(4, 1, 2)]),\n        resnet_utils.Block(\n            'block2', bottleneck, [(8, 2, 1)] * 2 + [(8, 2, 2)]),\n        resnet_utils.Block(\n            'block3', bottleneck, [(16, 4, 1)] * 2 + [(16, 4, 2)]),\n        resnet_utils.Block(\n            'block4', bottleneck, [(32, 8, 1)] * 2)]\n    return resnet_v1.resnet_v1(inputs, blocks, num_classes,\n                               is_training=is_training,\n                               global_pool=global_pool,\n                               output_stride=output_stride,\n                               include_root_block=include_root_block,\n                               reuse=reuse,\n                               scope=scope)\n\n  def testClassificationEndPoints(self):\n    global_pool = True\n    num_classes = 10\n    inputs = create_test_input(2, 224, 224, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      logits, end_points = self._resnet_small(inputs, 
num_classes,\n                                              global_pool=global_pool,\n                                              scope='resnet')\n    self.assertTrue(logits.op.name.startswith('resnet/logits'))\n    self.assertListEqual(logits.get_shape().as_list(), [2, 1, 1, num_classes])\n    self.assertTrue('predictions' in end_points)\n    self.assertListEqual(end_points['predictions'].get_shape().as_list(),\n                         [2, 1, 1, num_classes])\n\n  def testClassificationShapes(self):\n    global_pool = True\n    num_classes = 10\n    inputs = create_test_input(2, 224, 224, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs, num_classes,\n                                         global_pool=global_pool,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 28, 28, 4],\n          'resnet/block2': [2, 14, 14, 8],\n          'resnet/block3': [2, 7, 7, 16],\n          'resnet/block4': [2, 7, 7, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def testFullyConvolutionalEndpointShapes(self):\n    global_pool = False\n    num_classes = 10\n    inputs = create_test_input(2, 321, 321, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs, num_classes,\n                                         global_pool=global_pool,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 41, 41, 4],\n          'resnet/block2': [2, 21, 21, 8],\n          'resnet/block3': [2, 11, 11, 16],\n          'resnet/block4': [2, 11, 11, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def 
testRootlessFullyConvolutionalEndpointShapes(self):\n    global_pool = False\n    num_classes = 10\n    inputs = create_test_input(2, 128, 128, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs, num_classes,\n                                         global_pool=global_pool,\n                                         include_root_block=False,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 64, 64, 4],\n          'resnet/block2': [2, 32, 32, 8],\n          'resnet/block3': [2, 16, 16, 16],\n          'resnet/block4': [2, 16, 16, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def testAtrousFullyConvolutionalEndpointShapes(self):\n    global_pool = False\n    num_classes = 10\n    output_stride = 8\n    inputs = create_test_input(2, 321, 321, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs,\n                                         num_classes,\n                                         global_pool=global_pool,\n                                         output_stride=output_stride,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 41, 41, 4],\n          'resnet/block2': [2, 41, 41, 8],\n          'resnet/block3': [2, 41, 41, 16],\n          'resnet/block4': [2, 41, 41, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def testAtrousFullyConvolutionalValues(self):\n    \"\"\"Verify dense feature extraction with atrous convolution.\"\"\"\n    nominal_stride = 32\n    for output_stride in [4, 8, 16, 32, None]:\n      with 
slim.arg_scope(resnet_utils.resnet_arg_scope()):\n        with tf.Graph().as_default():\n          with self.test_session() as sess:\n            tf.set_random_seed(0)\n            inputs = create_test_input(2, 81, 81, 3)\n            # Dense feature extraction followed by subsampling.\n            output, _ = self._resnet_small(inputs, None, is_training=False,\n                                           global_pool=False,\n                                           output_stride=output_stride)\n            if output_stride is None:\n              factor = 1\n            else:\n              factor = nominal_stride // output_stride\n            output = resnet_utils.subsample(output, factor)\n            # Make the two networks use the same weights.\n            tf.get_variable_scope().reuse_variables()\n            # Feature extraction at the nominal network rate.\n            expected, _ = self._resnet_small(inputs, None, is_training=False,\n                                             global_pool=False)\n            sess.run(tf.initialize_all_variables())\n            self.assertAllClose(output.eval(), expected.eval(),\n                                atol=1e-4, rtol=1e-4)\n\n  def testUnknownBatchSize(self):\n    batch = 2\n    height, width = 65, 65\n    global_pool = True\n    num_classes = 10\n    inputs = create_test_input(None, height, width, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      logits, _ = self._resnet_small(inputs, num_classes,\n                                     global_pool=global_pool,\n                                     scope='resnet')\n    self.assertTrue(logits.op.name.startswith('resnet/logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [None, 1, 1, num_classes])\n    images = create_test_input(batch, height, width, 3)\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      
self.assertEqual(output.shape, (batch, 1, 1, num_classes))\n\n  def testFullyConvolutionalUnknownHeightWidth(self):\n    batch = 2\n    height, width = 65, 65\n    global_pool = False\n    inputs = create_test_input(batch, None, None, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      output, _ = self._resnet_small(inputs, None, global_pool=global_pool)\n    self.assertListEqual(output.get_shape().as_list(),\n                         [batch, None, None, 32])\n    images = create_test_input(batch, height, width, 3)\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(output, {inputs: images.eval()})\n      self.assertEqual(output.shape, (batch, 3, 3, 32))\n\n  def testAtrousFullyConvolutionalUnknownHeightWidth(self):\n    batch = 2\n    height, width = 65, 65\n    global_pool = False\n    output_stride = 8\n    inputs = create_test_input(batch, None, None, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      output, _ = self._resnet_small(inputs,\n                                     None,\n                                     global_pool=global_pool,\n                                     output_stride=output_stride)\n    self.assertListEqual(output.get_shape().as_list(),\n                         [batch, None, None, 32])\n    images = create_test_input(batch, height, width, 3)\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(output, {inputs: images.eval()})\n      self.assertEqual(output.shape, (batch, 9, 9, 32))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/resnet_v2.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains definitions for the preactivation form of Residual Networks.\n\nResidual networks (ResNets) were originally proposed in:\n[1] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun\n    Deep Residual Learning for Image Recognition. arXiv:1512.03385\n\nThe full preactivation 'v2' ResNet variant implemented in this module was\nintroduced by:\n[2] Kaiming He, Xiangyu Zhang, Shaoqing Ren, Jian Sun\n    Identity Mappings in Deep Residual Networks. arXiv: 1603.05027\n\nThe key difference of the full preactivation 'v2' variant compared to the\n'v1' variant in [1] is the use of batch normalization before every weight layer.\nAnother difference is that 'v2' ResNets do not include an activation function in\nthe main pathway. Also see [2; Fig. 
4e].\n\nTypical use:\n\n   from tensorflow.contrib.slim.nets import resnet_v2\n\nResNet-101 for image classification into 1000 classes:\n\n   # inputs has shape [batch, 224, 224, 3]\n   with slim.arg_scope(resnet_v2.resnet_arg_scope()):\n      net, end_points = resnet_v2.resnet_v2_101(inputs, 1000, is_training=False)\n\nResNet-101 for semantic segmentation into 21 classes:\n\n   # inputs has shape [batch, 513, 513, 3]\n   with slim.arg_scope(resnet_v2.resnet_arg_scope(is_training)):\n      net, end_points = resnet_v2.resnet_v2_101(inputs,\n                                                21,\n                                                is_training=False,\n                                                global_pool=False,\n                                                output_stride=16)\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import resnet_utils\n\nslim = tf.contrib.slim\nresnet_arg_scope = resnet_utils.resnet_arg_scope\n\n\n@slim.add_arg_scope\ndef bottleneck(inputs, depth, depth_bottleneck, stride, rate=1,\n               outputs_collections=None, scope=None):\n  \"\"\"Bottleneck residual unit variant with BN before convolutions.\n\n  This is the full preactivation residual unit variant proposed in [2]. See\n  Fig. 1(b) of [2] for its definition. Note that we use here the bottleneck\n  variant which has an extra bottleneck layer.\n\n  When putting together two consecutive ResNet blocks that use this unit, one\n  should use stride = 2 in the last unit of the first block.\n\n  Args:\n    inputs: A tensor of size [batch, height, width, channels].\n    depth: The depth of the ResNet unit output.\n    depth_bottleneck: The depth of the bottleneck layers.\n    stride: The ResNet unit's stride. 
Determines the amount of downsampling of\n      the units output compared to its input.\n    rate: An integer, rate for atrous convolution.\n    outputs_collections: Collection to add the ResNet unit output.\n    scope: Optional variable_scope.\n\n  Returns:\n    The ResNet unit's output.\n  \"\"\"\n  with tf.variable_scope(scope, 'bottleneck_v2', [inputs]) as sc:\n    depth_in = slim.utils.last_dimension(inputs.get_shape(), min_rank=4)\n    preact = slim.batch_norm(inputs, activation_fn=tf.nn.relu, scope='preact')\n    if depth == depth_in:\n      shortcut = resnet_utils.subsample(inputs, stride, 'shortcut')\n    else:\n      shortcut = slim.conv2d(preact, depth, [1, 1], stride=stride,\n                             normalizer_fn=None, activation_fn=None,\n                             scope='shortcut')\n\n    residual = slim.conv2d(preact, depth_bottleneck, [1, 1], stride=1,\n                           scope='conv1')\n    residual = resnet_utils.conv2d_same(residual, depth_bottleneck, 3, stride,\n                                        rate=rate, scope='conv2')\n    residual = slim.conv2d(residual, depth, [1, 1], stride=1,\n                           normalizer_fn=None, activation_fn=None,\n                           scope='conv3')\n\n    output = shortcut + residual\n\n    return slim.utils.collect_named_outputs(outputs_collections,\n                                            sc.original_name_scope,\n                                            output)\n\n\ndef resnet_v2(inputs,\n              blocks,\n              num_classes=None,\n              is_training=True,\n              global_pool=True,\n              output_stride=None,\n              include_root_block=True,\n              reuse=None,\n              scope=None):\n  \"\"\"Generator for v2 (preactivation) ResNet models.\n\n  This function generates a family of ResNet v2 models. 
See the resnet_v2_*()\n  methods for specific model instantiations, obtained by selecting different\n  block instantiations that produce ResNets of various depths.\n\n  Training for image classification on Imagenet is usually done with [224, 224]\n  inputs, resulting in [7, 7] feature maps at the output of the last ResNet\n  block for the ResNets defined in [1] that have nominal stride equal to 32.\n  However, for dense prediction tasks we advise that one uses inputs with\n  spatial dimensions that are multiples of 32 plus 1, e.g., [321, 321]. In\n  this case the feature maps at the ResNet output will have spatial shape\n  [(height - 1) / output_stride + 1, (width - 1) / output_stride + 1]\n  and corners exactly aligned with the input image corners, which greatly\n  facilitates alignment of the features to the image. Using as input [225, 225]\n  images results in [8, 8] feature maps at the output of the last ResNet block.\n\n  For dense prediction tasks, the ResNet needs to run in fully-convolutional\n  (FCN) mode and global_pool needs to be set to False. The ResNets in [1, 2] all\n  have nominal stride equal to 32 and a good choice in FCN mode is to use\n  output_stride=16 in order to increase the density of the computed features at\n  small computational and memory overhead, cf. http://arxiv.org/abs/1606.00915.\n\n  Args:\n    inputs: A tensor of size [batch, height_in, width_in, channels].\n    blocks: A list of length equal to the number of ResNet blocks. Each element\n      is a resnet_utils.Block object describing the units in the block.\n    num_classes: Number of predicted classes for classification tasks. If None\n      we return the features before the logit layer.\n    is_training: whether is training or not.\n    global_pool: If True, we perform global average pooling before computing the\n      logits. 
Set to True for image classification, False for dense prediction.\n    output_stride: If None, then the output will be computed at the nominal\n      network stride. If output_stride is not None, it specifies the requested\n      ratio of input to output spatial resolution.\n    include_root_block: If True, include the initial convolution followed by\n      max-pooling, if False excludes it. If excluded, `inputs` should be the\n      results of an activation-less convolution.\n    reuse: whether or not the network and its variables should be reused. To be\n      able to reuse 'scope' must be given.\n    scope: Optional variable_scope.\n\n\n  Returns:\n    net: A rank-4 tensor of size [batch, height_out, width_out, channels_out].\n      If global_pool is False, then height_out and width_out are reduced by a\n      factor of output_stride compared to the respective height_in and width_in,\n      else both height_out and width_out equal one. If num_classes is None, then\n      net is the output of the last ResNet block, potentially after global\n      average pooling. 
If num_classes is not None, net contains the pre-softmax\n      activations.\n    end_points: A dictionary from components of the network to the corresponding\n      activation.\n\n  Raises:\n    ValueError: If the target output_stride is not valid.\n  \"\"\"\n  with tf.variable_scope(scope, 'resnet_v2', [inputs], reuse=reuse) as sc:\n    end_points_collection = sc.name + '_end_points'\n    with slim.arg_scope([slim.conv2d, bottleneck,\n                         resnet_utils.stack_blocks_dense],\n                        outputs_collections=end_points_collection):\n      with slim.arg_scope([slim.batch_norm], is_training=is_training):\n        net = inputs\n        if include_root_block:\n          if output_stride is not None:\n            if output_stride % 4 != 0:\n              raise ValueError('The output_stride needs to be a multiple of 4.')\n            output_stride /= 4\n          # We do not include batch normalization or activation functions in\n          # conv1 because the first ResNet unit will perform these. Cf.\n          # Appendix of [2].\n          with slim.arg_scope([slim.conv2d],\n                              activation_fn=None, normalizer_fn=None):\n            net = resnet_utils.conv2d_same(net, 64, 7, stride=2, scope='conv1')\n          net = slim.max_pool2d(net, [3, 3], stride=2, scope='pool1')\n        net = resnet_utils.stack_blocks_dense(net, blocks, output_stride)\n        # This is needed because the pre-activation variant does not have batch\n        # normalization or activation functions in the residual unit output. 
See\n        # Appendix of [2].\n        net = slim.batch_norm(net, activation_fn=tf.nn.relu, scope='postnorm')\n        if global_pool:\n          # Global average pooling.\n          net = tf.reduce_mean(net, [1, 2], name='pool5', keep_dims=True)\n        if num_classes is not None:\n          net = slim.conv2d(net, num_classes, [1, 1], activation_fn=None,\n                            normalizer_fn=None, scope='logits')\n        # Convert end_points_collection into a dictionary of end_points.\n        end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n        if num_classes is not None:\n          end_points['predictions'] = slim.softmax(net, scope='predictions')\n        return net, end_points\nresnet_v2.default_image_size = 224\n\n\ndef resnet_v2_50(inputs,\n                 num_classes=None,\n                 is_training=True,\n                 global_pool=True,\n                 output_stride=None,\n                 reuse=None,\n                 scope='resnet_v2_50'):\n  \"\"\"ResNet-50 model of [1]. 
See resnet_v2() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 3 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 5 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)]\n  return resnet_v2(inputs, blocks, num_classes, is_training=is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n\n\ndef resnet_v2_101(inputs,\n                  num_classes=None,\n                  is_training=True,\n                  global_pool=True,\n                  output_stride=None,\n                  reuse=None,\n                  scope='resnet_v2_101'):\n  \"\"\"ResNet-101 model of [1]. See resnet_v2() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 3 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 22 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)]\n  return resnet_v2(inputs, blocks, num_classes, is_training=is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n\n\ndef resnet_v2_152(inputs,\n                  num_classes=None,\n                  is_training=True,\n                  global_pool=True,\n                  output_stride=None,\n                  reuse=None,\n                  scope='resnet_v2_152'):\n  \"\"\"ResNet-152 model of [1]. 
See resnet_v2() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 7 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 35 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)]\n  return resnet_v2(inputs, blocks, num_classes, is_training=is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n\n\ndef resnet_v2_200(inputs,\n                  num_classes=None,\n                  is_training=True,\n                  global_pool=True,\n                  output_stride=None,\n                  reuse=None,\n                  scope='resnet_v2_200'):\n  \"\"\"ResNet-200 model of [2]. See resnet_v2() for arg and return description.\"\"\"\n  blocks = [\n      resnet_utils.Block(\n          'block1', bottleneck, [(256, 64, 1)] * 2 + [(256, 64, 2)]),\n      resnet_utils.Block(\n          'block2', bottleneck, [(512, 128, 1)] * 23 + [(512, 128, 2)]),\n      resnet_utils.Block(\n          'block3', bottleneck, [(1024, 256, 1)] * 35 + [(1024, 256, 2)]),\n      resnet_utils.Block(\n          'block4', bottleneck, [(2048, 512, 1)] * 3)]\n  return resnet_v2(inputs, blocks, num_classes, is_training=is_training,\n                   global_pool=global_pool, output_stride=output_stride,\n                   include_root_block=True, reuse=reuse, scope=scope)\n"
  },
  {
    "path": "model_zoo/models/slim/nets/resnet_v2_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.nets.resnet_v2.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom nets import resnet_utils\nfrom nets import resnet_v2\n\nslim = tf.contrib.slim\n\n\ndef create_test_input(batch_size, height, width, channels):\n  \"\"\"Create test input tensor.\n\n  Args:\n    batch_size: The number of images per batch or `None` if unknown.\n    height: The height of each image or `None` if unknown.\n    width: The width of each image or `None` if unknown.\n    channels: The number of channels per image or `None` if unknown.\n\n  Returns:\n    Either a placeholder `Tensor` of dimension\n      [batch_size, height, width, channels] if any of the inputs are `None` or a\n    constant `Tensor` with the mesh grid values along the spatial dimensions.\n  \"\"\"\n  if None in [batch_size, height, width, channels]:\n    return tf.placeholder(tf.float32, (batch_size, height, width, channels))\n  else:\n    return tf.to_float(\n        np.tile(\n            np.reshape(\n                np.reshape(np.arange(height), [height, 1]) +\n                np.reshape(np.arange(width), [1, width]),\n                [1, height, width, 1]),\n            
[batch_size, 1, 1, channels]))\n\n\nclass ResnetUtilsTest(tf.test.TestCase):\n\n  def testSubsampleThreeByThree(self):\n    x = tf.reshape(tf.to_float(tf.range(9)), [1, 3, 3, 1])\n    x = resnet_utils.subsample(x, 2)\n    expected = tf.reshape(tf.constant([0, 2, 6, 8]), [1, 2, 2, 1])\n    with self.test_session():\n      self.assertAllClose(x.eval(), expected.eval())\n\n  def testSubsampleFourByFour(self):\n    x = tf.reshape(tf.to_float(tf.range(16)), [1, 4, 4, 1])\n    x = resnet_utils.subsample(x, 2)\n    expected = tf.reshape(tf.constant([0, 2, 8, 10]), [1, 2, 2, 1])\n    with self.test_session():\n      self.assertAllClose(x.eval(), expected.eval())\n\n  def testConv2DSameEven(self):\n    n, n2 = 4, 2\n\n    # Input image.\n    x = create_test_input(1, n, n, 1)\n\n    # Convolution kernel.\n    w = create_test_input(1, 3, 3, 1)\n    w = tf.reshape(w, [3, 3, 1, 1])\n\n    tf.get_variable('Conv/weights', initializer=w)\n    tf.get_variable('Conv/biases', initializer=tf.zeros([1]))\n    tf.get_variable_scope().reuse_variables()\n\n    y1 = slim.conv2d(x, 1, [3, 3], stride=1, scope='Conv')\n    y1_expected = tf.to_float([[14, 28, 43, 26],\n                               [28, 48, 66, 37],\n                               [43, 66, 84, 46],\n                               [26, 37, 46, 22]])\n    y1_expected = tf.reshape(y1_expected, [1, n, n, 1])\n\n    y2 = resnet_utils.subsample(y1, 2)\n    y2_expected = tf.to_float([[14, 43],\n                               [43, 84]])\n    y2_expected = tf.reshape(y2_expected, [1, n2, n2, 1])\n\n    y3 = resnet_utils.conv2d_same(x, 1, 3, stride=2, scope='Conv')\n    y3_expected = y2_expected\n\n    y4 = slim.conv2d(x, 1, [3, 3], stride=2, scope='Conv')\n    y4_expected = tf.to_float([[48, 37],\n                               [37, 22]])\n    y4_expected = tf.reshape(y4_expected, [1, n2, n2, 1])\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      self.assertAllClose(y1.eval(), 
y1_expected.eval())\n      self.assertAllClose(y2.eval(), y2_expected.eval())\n      self.assertAllClose(y3.eval(), y3_expected.eval())\n      self.assertAllClose(y4.eval(), y4_expected.eval())\n\n  def testConv2DSameOdd(self):\n    n, n2 = 5, 3\n\n    # Input image.\n    x = create_test_input(1, n, n, 1)\n\n    # Convolution kernel.\n    w = create_test_input(1, 3, 3, 1)\n    w = tf.reshape(w, [3, 3, 1, 1])\n\n    tf.get_variable('Conv/weights', initializer=w)\n    tf.get_variable('Conv/biases', initializer=tf.zeros([1]))\n    tf.get_variable_scope().reuse_variables()\n\n    y1 = slim.conv2d(x, 1, [3, 3], stride=1, scope='Conv')\n    y1_expected = tf.to_float([[14, 28, 43, 58, 34],\n                               [28, 48, 66, 84, 46],\n                               [43, 66, 84, 102, 55],\n                               [58, 84, 102, 120, 64],\n                               [34, 46, 55, 64, 30]])\n    y1_expected = tf.reshape(y1_expected, [1, n, n, 1])\n\n    y2 = resnet_utils.subsample(y1, 2)\n    y2_expected = tf.to_float([[14, 43, 34],\n                               [43, 84, 55],\n                               [34, 55, 30]])\n    y2_expected = tf.reshape(y2_expected, [1, n2, n2, 1])\n\n    y3 = resnet_utils.conv2d_same(x, 1, 3, stride=2, scope='Conv')\n    y3_expected = y2_expected\n\n    y4 = slim.conv2d(x, 1, [3, 3], stride=2, scope='Conv')\n    y4_expected = y2_expected\n\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      self.assertAllClose(y1.eval(), y1_expected.eval())\n      self.assertAllClose(y2.eval(), y2_expected.eval())\n      self.assertAllClose(y3.eval(), y3_expected.eval())\n      self.assertAllClose(y4.eval(), y4_expected.eval())\n\n  def _resnet_plain(self, inputs, blocks, output_stride=None, scope=None):\n    \"\"\"A plain ResNet without extra layers before or after the ResNet blocks.\"\"\"\n    with tf.variable_scope(scope, values=[inputs]):\n      with slim.arg_scope([slim.conv2d], 
outputs_collections='end_points'):\n        net = resnet_utils.stack_blocks_dense(inputs, blocks, output_stride)\n        end_points = dict(tf.get_collection('end_points'))\n        return net, end_points\n\n  def testEndPointsV2(self):\n    \"\"\"Test the end points of a tiny v2 bottleneck network.\"\"\"\n    bottleneck = resnet_v2.bottleneck\n    blocks = [resnet_utils.Block('block1', bottleneck, [(4, 1, 1), (4, 1, 2)]),\n              resnet_utils.Block('block2', bottleneck, [(8, 2, 1), (8, 2, 1)])]\n    inputs = create_test_input(2, 32, 16, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_plain(inputs, blocks, scope='tiny')\n    expected = [\n        'tiny/block1/unit_1/bottleneck_v2/shortcut',\n        'tiny/block1/unit_1/bottleneck_v2/conv1',\n        'tiny/block1/unit_1/bottleneck_v2/conv2',\n        'tiny/block1/unit_1/bottleneck_v2/conv3',\n        'tiny/block1/unit_2/bottleneck_v2/conv1',\n        'tiny/block1/unit_2/bottleneck_v2/conv2',\n        'tiny/block1/unit_2/bottleneck_v2/conv3',\n        'tiny/block2/unit_1/bottleneck_v2/shortcut',\n        'tiny/block2/unit_1/bottleneck_v2/conv1',\n        'tiny/block2/unit_1/bottleneck_v2/conv2',\n        'tiny/block2/unit_1/bottleneck_v2/conv3',\n        'tiny/block2/unit_2/bottleneck_v2/conv1',\n        'tiny/block2/unit_2/bottleneck_v2/conv2',\n        'tiny/block2/unit_2/bottleneck_v2/conv3']\n    self.assertItemsEqual(expected, end_points)\n\n  def _stack_blocks_nondense(self, net, blocks):\n    \"\"\"A simplified ResNet Block stacker without output stride control.\"\"\"\n    for block in blocks:\n      with tf.variable_scope(block.scope, 'block', [net]):\n        for i, unit in enumerate(block.args):\n          depth, depth_bottleneck, stride = unit\n          with tf.variable_scope('unit_%d' % (i + 1), values=[net]):\n            net = block.unit_fn(net,\n                                depth=depth,\n                                
depth_bottleneck=depth_bottleneck,\n                                stride=stride,\n                                rate=1)\n    return net\n\n  def _atrousValues(self, bottleneck):\n    \"\"\"Verify the values of dense feature extraction by atrous convolution.\n\n    Make sure that dense feature extraction by stack_blocks_dense() followed by\n    subsampling gives identical results to feature extraction at the nominal\n    network output stride using the simple self._stack_blocks_nondense() above.\n\n    Args:\n      bottleneck: The bottleneck function.\n    \"\"\"\n    blocks = [\n        resnet_utils.Block('block1', bottleneck, [(4, 1, 1), (4, 1, 2)]),\n        resnet_utils.Block('block2', bottleneck, [(8, 2, 1), (8, 2, 2)]),\n        resnet_utils.Block('block3', bottleneck, [(16, 4, 1), (16, 4, 2)]),\n        resnet_utils.Block('block4', bottleneck, [(32, 8, 1), (32, 8, 1)])\n    ]\n    nominal_stride = 8\n\n    # Test both odd and even input dimensions.\n    height = 30\n    width = 31\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      with slim.arg_scope([slim.batch_norm], is_training=False):\n        for output_stride in [1, 2, 4, 8, None]:\n          with tf.Graph().as_default():\n            with self.test_session() as sess:\n              tf.set_random_seed(0)\n              inputs = create_test_input(1, height, width, 3)\n              # Dense feature extraction followed by subsampling.\n              output = resnet_utils.stack_blocks_dense(inputs,\n                                                       blocks,\n                                                       output_stride)\n              if output_stride is None:\n                factor = 1\n              else:\n                factor = nominal_stride // output_stride\n\n              output = resnet_utils.subsample(output, factor)\n              # Make the two networks use the same weights.\n              tf.get_variable_scope().reuse_variables()\n              # Feature 
extraction at the nominal network rate.\n              expected = self._stack_blocks_nondense(inputs, blocks)\n              sess.run(tf.initialize_all_variables())\n              output, expected = sess.run([output, expected])\n              self.assertAllClose(output, expected, atol=1e-4, rtol=1e-4)\n\n  def testAtrousValuesBottleneck(self):\n    self._atrousValues(resnet_v2.bottleneck)\n\n\nclass ResnetCompleteNetworkTest(tf.test.TestCase):\n  \"\"\"Tests with complete small ResNet v2 networks.\"\"\"\n\n  def _resnet_small(self,\n                    inputs,\n                    num_classes=None,\n                    is_training=True,\n                    global_pool=True,\n                    output_stride=None,\n                    include_root_block=True,\n                    reuse=None,\n                    scope='resnet_v2_small'):\n    \"\"\"A shallow and thin ResNet v2 for faster tests.\"\"\"\n    bottleneck = resnet_v2.bottleneck\n    blocks = [\n        resnet_utils.Block(\n            'block1', bottleneck, [(4, 1, 1)] * 2 + [(4, 1, 2)]),\n        resnet_utils.Block(\n            'block2', bottleneck, [(8, 2, 1)] * 2 + [(8, 2, 2)]),\n        resnet_utils.Block(\n            'block3', bottleneck, [(16, 4, 1)] * 2 + [(16, 4, 2)]),\n        resnet_utils.Block(\n            'block4', bottleneck, [(32, 8, 1)] * 2)]\n    return resnet_v2.resnet_v2(inputs, blocks, num_classes,\n                               is_training=is_training,\n                               global_pool=global_pool,\n                               output_stride=output_stride,\n                               include_root_block=include_root_block,\n                               reuse=reuse,\n                               scope=scope)\n\n  def testClassificationEndPoints(self):\n    global_pool = True\n    num_classes = 10\n    inputs = create_test_input(2, 224, 224, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      logits, end_points = self._resnet_small(inputs, 
num_classes,\n                                              global_pool=global_pool,\n                                              scope='resnet')\n    self.assertTrue(logits.op.name.startswith('resnet/logits'))\n    self.assertListEqual(logits.get_shape().as_list(), [2, 1, 1, num_classes])\n    self.assertTrue('predictions' in end_points)\n    self.assertListEqual(end_points['predictions'].get_shape().as_list(),\n                         [2, 1, 1, num_classes])\n\n  def testClassificationShapes(self):\n    global_pool = True\n    num_classes = 10\n    inputs = create_test_input(2, 224, 224, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs, num_classes,\n                                         global_pool=global_pool,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 28, 28, 4],\n          'resnet/block2': [2, 14, 14, 8],\n          'resnet/block3': [2, 7, 7, 16],\n          'resnet/block4': [2, 7, 7, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def testFullyConvolutionalEndpointShapes(self):\n    global_pool = False\n    num_classes = 10\n    inputs = create_test_input(2, 321, 321, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs, num_classes,\n                                         global_pool=global_pool,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 41, 41, 4],\n          'resnet/block2': [2, 21, 21, 8],\n          'resnet/block3': [2, 11, 11, 16],\n          'resnet/block4': [2, 11, 11, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def 
testRootlessFullyConvolutionalEndpointShapes(self):\n    global_pool = False\n    num_classes = 10\n    inputs = create_test_input(2, 128, 128, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs, num_classes,\n                                         global_pool=global_pool,\n                                         include_root_block=False,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 64, 64, 4],\n          'resnet/block2': [2, 32, 32, 8],\n          'resnet/block3': [2, 16, 16, 16],\n          'resnet/block4': [2, 16, 16, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def testAtrousFullyConvolutionalEndpointShapes(self):\n    global_pool = False\n    num_classes = 10\n    output_stride = 8\n    inputs = create_test_input(2, 321, 321, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      _, end_points = self._resnet_small(inputs,\n                                         num_classes,\n                                         global_pool=global_pool,\n                                         output_stride=output_stride,\n                                         scope='resnet')\n      endpoint_to_shape = {\n          'resnet/block1': [2, 41, 41, 4],\n          'resnet/block2': [2, 41, 41, 8],\n          'resnet/block3': [2, 41, 41, 16],\n          'resnet/block4': [2, 41, 41, 32]}\n      for endpoint in endpoint_to_shape:\n        shape = endpoint_to_shape[endpoint]\n        self.assertListEqual(end_points[endpoint].get_shape().as_list(), shape)\n\n  def testAtrousFullyConvolutionalValues(self):\n    \"\"\"Verify dense feature extraction with atrous convolution.\"\"\"\n    nominal_stride = 32\n    for output_stride in [4, 8, 16, 32, None]:\n      with 
slim.arg_scope(resnet_utils.resnet_arg_scope()):\n        with tf.Graph().as_default():\n          with self.test_session() as sess:\n            tf.set_random_seed(0)\n            inputs = create_test_input(2, 81, 81, 3)\n            # Dense feature extraction followed by subsampling.\n            output, _ = self._resnet_small(inputs, None,\n                                           is_training=False,\n                                           global_pool=False,\n                                           output_stride=output_stride)\n            if output_stride is None:\n              factor = 1\n            else:\n              factor = nominal_stride // output_stride\n            output = resnet_utils.subsample(output, factor)\n            # Make the two networks use the same weights.\n            tf.get_variable_scope().reuse_variables()\n            # Feature extraction at the nominal network rate.\n            expected, _ = self._resnet_small(inputs, None,\n                                             is_training=False,\n                                             global_pool=False)\n            sess.run(tf.initialize_all_variables())\n            self.assertAllClose(output.eval(), expected.eval(),\n                                atol=1e-4, rtol=1e-4)\n\n  def testUnknownBatchSize(self):\n    batch = 2\n    height, width = 65, 65\n    global_pool = True\n    num_classes = 10\n    inputs = create_test_input(None, height, width, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      logits, _ = self._resnet_small(inputs, num_classes,\n                                     global_pool=global_pool,\n                                     scope='resnet')\n    self.assertTrue(logits.op.name.startswith('resnet/logits'))\n    self.assertListEqual(logits.get_shape().as_list(),\n                         [None, 1, 1, num_classes])\n    images = create_test_input(batch, height, width, 3)\n    with self.test_session() as sess:\n      
sess.run(tf.initialize_all_variables())\n      output = sess.run(logits, {inputs: images.eval()})\n      self.assertEqual(output.shape, (batch, 1, 1, num_classes))\n\n  def testFullyConvolutionalUnknownHeightWidth(self):\n    batch = 2\n    height, width = 65, 65\n    global_pool = False\n    inputs = create_test_input(batch, None, None, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      output, _ = self._resnet_small(inputs, None,\n                                     global_pool=global_pool)\n    self.assertListEqual(output.get_shape().as_list(),\n                         [batch, None, None, 32])\n    images = create_test_input(batch, height, width, 3)\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(output, {inputs: images.eval()})\n      self.assertEqual(output.shape, (batch, 3, 3, 32))\n\n  def testAtrousFullyConvolutionalUnknownHeightWidth(self):\n    batch = 2\n    height, width = 65, 65\n    global_pool = False\n    output_stride = 8\n    inputs = create_test_input(batch, None, None, 3)\n    with slim.arg_scope(resnet_utils.resnet_arg_scope()):\n      output, _ = self._resnet_small(inputs,\n                                     None,\n                                     global_pool=global_pool,\n                                     output_stride=output_stride)\n    self.assertListEqual(output.get_shape().as_list(),\n                         [batch, None, None, 32])\n    images = create_test_input(batch, height, width, 3)\n    with self.test_session() as sess:\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(output, {inputs: images.eval()})\n      self.assertEqual(output.shape, (batch, 9, 9, 32))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/nets/vgg.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains model definitions for versions of the Oxford VGG network.\n\nThese model definitions were introduced in the following technical report:\n\n  Very Deep Convolutional Networks For Large-Scale Image Recognition\n  Karen Simonyan and Andrew Zisserman\n  arXiv technical report, 2015\n  PDF: http://arxiv.org/pdf/1409.1556.pdf\n  ILSVRC 2014 Slides: http://www.robots.ox.ac.uk/~karen/pdf/ILSVRC_2014.pdf\n  CC-BY-4.0\n\nMore information can be obtained from the VGG website:\nwww.robots.ox.ac.uk/~vgg/research/very_deep/\n\nUsage:\n  with slim.arg_scope(vgg.vgg_arg_scope()):\n    outputs, end_points = vgg.vgg_a(inputs)\n\n  with slim.arg_scope(vgg.vgg_arg_scope()):\n    outputs, end_points = vgg.vgg_16(inputs)\n\n@@vgg_a\n@@vgg_16\n@@vgg_19\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\n\ndef vgg_arg_scope(weight_decay=0.0005):\n  \"\"\"Defines the VGG arg scope.\n\n  Args:\n    weight_decay: The l2 regularization coefficient.\n\n  Returns:\n    An arg_scope.\n  \"\"\"\n  with slim.arg_scope([slim.conv2d, slim.fully_connected],\n                      activation_fn=tf.nn.relu,\n                      
weights_regularizer=slim.l2_regularizer(weight_decay),\n                      biases_initializer=tf.zeros_initializer):\n    with slim.arg_scope([slim.conv2d], padding='SAME') as arg_sc:\n      return arg_sc\n\n\ndef vgg_a(inputs,\n          num_classes=1000,\n          is_training=True,\n          dropout_keep_prob=0.5,\n          spatial_squeeze=True,\n          scope='vgg_a'):\n  \"\"\"Oxford Net VGG 11-Layers version A Example.\n\n  Note: All the fully_connected layers have been transformed to conv2d layers.\n        To use in classification mode, resize input to 224x224.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether or not the model is being trained.\n    dropout_keep_prob: the probability that activations are kept in the dropout\n      layers during training.\n    spatial_squeeze: whether or not should squeeze the spatial dimensions of the\n      outputs. Useful to remove unnecessary dimensions for classification.\n    scope: Optional scope for the variables.\n\n  Returns:\n    the last op containing the log predictions and end_points dict.\n  \"\"\"\n  with tf.variable_scope(scope, 'vgg_a', [inputs]) as sc:\n    end_points_collection = sc.name + '_end_points'\n    # Collect outputs for conv2d, fully_connected and max_pool2d.\n    with slim.arg_scope([slim.conv2d, slim.max_pool2d],\n                        outputs_collections=end_points_collection):\n      net = slim.repeat(inputs, 1, slim.conv2d, 64, [3, 3], scope='conv1')\n      net = slim.max_pool2d(net, [2, 2], scope='pool1')\n      net = slim.repeat(net, 1, slim.conv2d, 128, [3, 3], scope='conv2')\n      net = slim.max_pool2d(net, [2, 2], scope='pool2')\n      net = slim.repeat(net, 2, slim.conv2d, 256, [3, 3], scope='conv3')\n      net = slim.max_pool2d(net, [2, 2], scope='pool3')\n      net = slim.repeat(net, 2, slim.conv2d, 512, [3, 3], scope='conv4')\n      net = slim.max_pool2d(net, [2, 2], 
scope='pool4')\n      net = slim.repeat(net, 2, slim.conv2d, 512, [3, 3], scope='conv5')\n      net = slim.max_pool2d(net, [2, 2], scope='pool5')\n      # Use conv2d instead of fully_connected layers.\n      net = slim.conv2d(net, 4096, [7, 7], padding='VALID', scope='fc6')\n      net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                         scope='dropout6')\n      net = slim.conv2d(net, 4096, [1, 1], scope='fc7')\n      net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                         scope='dropout7')\n      net = slim.conv2d(net, num_classes, [1, 1],\n                        activation_fn=None,\n                        normalizer_fn=None,\n                        scope='fc8')\n      # Convert end_points_collection into a end_point dict.\n      end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n      if spatial_squeeze:\n        net = tf.squeeze(net, [1, 2], name='fc8/squeezed')\n        end_points[sc.name + '/fc8'] = net\n      return net, end_points\nvgg_a.default_image_size = 224\n\n\ndef vgg_16(inputs,\n           num_classes=1000,\n           is_training=True,\n           dropout_keep_prob=0.5,\n           spatial_squeeze=True,\n           scope='vgg_16'):\n  \"\"\"Oxford Net VGG 16-Layers version D Example.\n\n  Note: All the fully_connected layers have been transformed to conv2d layers.\n        To use in classification mode, resize input to 224x224.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether or not the model is being trained.\n    dropout_keep_prob: the probability that activations are kept in the dropout\n      layers during training.\n    spatial_squeeze: whether or not should squeeze the spatial dimensions of the\n      outputs. 
Useful to remove unnecessary dimensions for classification.\n    scope: Optional scope for the variables.\n\n  Returns:\n    the last op containing the log predictions and end_points dict.\n  \"\"\"\n  with tf.variable_scope(scope, 'vgg_16', [inputs]) as sc:\n    end_points_collection = sc.name + '_end_points'\n    # Collect outputs for conv2d, fully_connected and max_pool2d.\n    with slim.arg_scope([slim.conv2d, slim.fully_connected, slim.max_pool2d],\n                        outputs_collections=end_points_collection):\n      net = slim.repeat(inputs, 2, slim.conv2d, 64, [3, 3], scope='conv1')\n      net = slim.max_pool2d(net, [2, 2], scope='pool1')\n      net = slim.repeat(net, 2, slim.conv2d, 128, [3, 3], scope='conv2')\n      net = slim.max_pool2d(net, [2, 2], scope='pool2')\n      net = slim.repeat(net, 3, slim.conv2d, 256, [3, 3], scope='conv3')\n      net = slim.max_pool2d(net, [2, 2], scope='pool3')\n      net = slim.repeat(net, 3, slim.conv2d, 512, [3, 3], scope='conv4')\n      net = slim.max_pool2d(net, [2, 2], scope='pool4')\n      net = slim.repeat(net, 3, slim.conv2d, 512, [3, 3], scope='conv5')\n      net = slim.max_pool2d(net, [2, 2], scope='pool5')\n      # Use conv2d instead of fully_connected layers.\n      net = slim.conv2d(net, 4096, [7, 7], padding='VALID', scope='fc6')\n      net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                         scope='dropout6')\n      net = slim.conv2d(net, 4096, [1, 1], scope='fc7')\n      net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                         scope='dropout7')\n      net = slim.conv2d(net, num_classes, [1, 1],\n                        activation_fn=None,\n                        normalizer_fn=None,\n                        scope='fc8')\n      # Convert end_points_collection into a end_point dict.\n      end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n      if spatial_squeeze:\n        net = tf.squeeze(net, [1, 2], 
name='fc8/squeezed')\n        end_points[sc.name + '/fc8'] = net\n      return net, end_points\nvgg_16.default_image_size = 224\n\n\ndef vgg_19(inputs,\n           num_classes=1000,\n           is_training=True,\n           dropout_keep_prob=0.5,\n           spatial_squeeze=True,\n           scope='vgg_19'):\n  \"\"\"Oxford Net VGG 19-Layers version E Example.\n\n  Note: All the fully_connected layers have been transformed to conv2d layers.\n        To use in classification mode, resize input to 224x224.\n\n  Args:\n    inputs: a tensor of size [batch_size, height, width, channels].\n    num_classes: number of predicted classes.\n    is_training: whether or not the model is being trained.\n    dropout_keep_prob: the probability that activations are kept in the dropout\n      layers during training.\n    spatial_squeeze: whether or not to squeeze the spatial dimensions of the\n      outputs. Useful to remove unnecessary dimensions for classification.\n    scope: Optional scope for the variables.\n\n  Returns:\n    the last op containing the log predictions and end_points dict.\n  \"\"\"\n  with tf.variable_scope(scope, 'vgg_19', [inputs]) as sc:\n    end_points_collection = sc.name + '_end_points'\n    # Collect outputs for conv2d, fully_connected and max_pool2d.\n    with slim.arg_scope([slim.conv2d, slim.fully_connected, slim.max_pool2d],\n                        outputs_collections=end_points_collection):\n      net = slim.repeat(inputs, 2, slim.conv2d, 64, [3, 3], scope='conv1')\n      net = slim.max_pool2d(net, [2, 2], scope='pool1')\n      net = slim.repeat(net, 2, slim.conv2d, 128, [3, 3], scope='conv2')\n      net = slim.max_pool2d(net, [2, 2], scope='pool2')\n      net = slim.repeat(net, 4, slim.conv2d, 256, [3, 3], scope='conv3')\n      net = slim.max_pool2d(net, [2, 2], scope='pool3')\n      net = slim.repeat(net, 4, slim.conv2d, 512, [3, 3], scope='conv4')\n      net = slim.max_pool2d(net, [2, 2], scope='pool4')\n      net = slim.repeat(net, 4, 
slim.conv2d, 512, [3, 3], scope='conv5')\n      net = slim.max_pool2d(net, [2, 2], scope='pool5')\n      # Use conv2d instead of fully_connected layers.\n      net = slim.conv2d(net, 4096, [7, 7], padding='VALID', scope='fc6')\n      net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                         scope='dropout6')\n      net = slim.conv2d(net, 4096, [1, 1], scope='fc7')\n      net = slim.dropout(net, dropout_keep_prob, is_training=is_training,\n                         scope='dropout7')\n      net = slim.conv2d(net, num_classes, [1, 1],\n                        activation_fn=None,\n                        normalizer_fn=None,\n                        scope='fc8')\n      # Convert end_points_collection into an end_point dict.\n      end_points = slim.utils.convert_collection_to_dict(end_points_collection)\n      if spatial_squeeze:\n        net = tf.squeeze(net, [1, 2], name='fc8/squeezed')\n        end_points[sc.name + '/fc8'] = net\n      return net, end_points\nvgg_19.default_image_size = 224\n\n# Alias\nvgg_d = vgg_16\nvgg_e = vgg_19\n"
  },
  {
    "path": "model_zoo/models/slim/nets/vgg_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for slim.nets.vgg.\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom nets import vgg\n\nslim = tf.contrib.slim\n\n\nclass VGGATest(tf.test.TestCase):\n\n  def testBuild(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_a(inputs, num_classes)\n      self.assertEquals(logits.op.name, 'vgg_a/fc8/squeezed')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testFullyConvolutional(self):\n    batch_size = 1\n    height, width = 256, 256\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_a(inputs, num_classes, spatial_squeeze=False)\n      self.assertEquals(logits.op.name, 'vgg_a/fc8/BiasAdd')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, 2, 2, num_classes])\n\n  def testEndPoints(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with 
self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = vgg.vgg_a(inputs, num_classes)\n      expected_names = ['vgg_a/conv1/conv1_1',\n                        'vgg_a/pool1',\n                        'vgg_a/conv2/conv2_1',\n                        'vgg_a/pool2',\n                        'vgg_a/conv3/conv3_1',\n                        'vgg_a/conv3/conv3_2',\n                        'vgg_a/pool3',\n                        'vgg_a/conv4/conv4_1',\n                        'vgg_a/conv4/conv4_2',\n                        'vgg_a/pool4',\n                        'vgg_a/conv5/conv5_1',\n                        'vgg_a/conv5/conv5_2',\n                        'vgg_a/pool5',\n                        'vgg_a/fc6',\n                        'vgg_a/fc7',\n                        'vgg_a/fc8'\n                       ]\n      self.assertSetEqual(set(end_points.keys()), set(expected_names))\n\n  def testModelVariables(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      vgg.vgg_a(inputs, num_classes)\n      expected_names = ['vgg_a/conv1/conv1_1/weights',\n                        'vgg_a/conv1/conv1_1/biases',\n                        'vgg_a/conv2/conv2_1/weights',\n                        'vgg_a/conv2/conv2_1/biases',\n                        'vgg_a/conv3/conv3_1/weights',\n                        'vgg_a/conv3/conv3_1/biases',\n                        'vgg_a/conv3/conv3_2/weights',\n                        'vgg_a/conv3/conv3_2/biases',\n                        'vgg_a/conv4/conv4_1/weights',\n                        'vgg_a/conv4/conv4_1/biases',\n                        'vgg_a/conv4/conv4_2/weights',\n                        'vgg_a/conv4/conv4_2/biases',\n                        'vgg_a/conv5/conv5_1/weights',\n                        'vgg_a/conv5/conv5_1/biases',\n                        
'vgg_a/conv5/conv5_2/weights',\n                        'vgg_a/conv5/conv5_2/biases',\n                        'vgg_a/fc6/weights',\n                        'vgg_a/fc6/biases',\n                        'vgg_a/fc7/weights',\n                        'vgg_a/fc7/biases',\n                        'vgg_a/fc8/weights',\n                        'vgg_a/fc8/biases',\n                       ]\n      model_variables = [v.op.name for v in slim.get_model_variables()]\n      self.assertSetEqual(set(model_variables), set(expected_names))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_a(eval_inputs, is_training=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      predictions = tf.argmax(logits, 1)\n      self.assertListEqual(predictions.get_shape().as_list(), [batch_size])\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 2\n    eval_batch_size = 1\n    train_height, train_width = 224, 224\n    eval_height, eval_width = 256, 256\n    num_classes = 1000\n    with self.test_session():\n      train_inputs = tf.random_uniform(\n          (train_batch_size, train_height, train_width, 3))\n      logits, _ = vgg.vgg_a(train_inputs)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [train_batch_size, num_classes])\n      tf.get_variable_scope().reuse_variables()\n      eval_inputs = tf.random_uniform(\n          (eval_batch_size, eval_height, eval_width, 3))\n      logits, _ = vgg.vgg_a(eval_inputs, is_training=False,\n                            spatial_squeeze=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [eval_batch_size, 2, 2, num_classes])\n      logits = tf.reduce_mean(logits, [1, 2])\n      predictions = tf.argmax(logits, 1)\n      
self.assertEquals(predictions.get_shape().as_list(), [eval_batch_size])\n\n  def testForward(self):\n    batch_size = 1\n    height, width = 224, 224\n    with self.test_session() as sess:\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_a(inputs)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits)\n      self.assertTrue(output.any())\n\n\nclass VGG16Test(tf.test.TestCase):\n\n  def testBuild(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_16(inputs, num_classes)\n      self.assertEquals(logits.op.name, 'vgg_16/fc8/squeezed')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testFullyConvolutional(self):\n    batch_size = 1\n    height, width = 256, 256\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_16(inputs, num_classes, spatial_squeeze=False)\n      self.assertEquals(logits.op.name, 'vgg_16/fc8/BiasAdd')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, 2, 2, num_classes])\n\n  def testEndPoints(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = vgg.vgg_16(inputs, num_classes)\n      expected_names = ['vgg_16/conv1/conv1_1',\n                        'vgg_16/conv1/conv1_2',\n                        'vgg_16/pool1',\n                        'vgg_16/conv2/conv2_1',\n                        'vgg_16/conv2/conv2_2',\n                        'vgg_16/pool2',\n                        'vgg_16/conv3/conv3_1',\n                        'vgg_16/conv3/conv3_2',\n                        
'vgg_16/conv3/conv3_3',\n                        'vgg_16/pool3',\n                        'vgg_16/conv4/conv4_1',\n                        'vgg_16/conv4/conv4_2',\n                        'vgg_16/conv4/conv4_3',\n                        'vgg_16/pool4',\n                        'vgg_16/conv5/conv5_1',\n                        'vgg_16/conv5/conv5_2',\n                        'vgg_16/conv5/conv5_3',\n                        'vgg_16/pool5',\n                        'vgg_16/fc6',\n                        'vgg_16/fc7',\n                        'vgg_16/fc8'\n                       ]\n      self.assertSetEqual(set(end_points.keys()), set(expected_names))\n\n  def testModelVariables(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      vgg.vgg_16(inputs, num_classes)\n      expected_names = ['vgg_16/conv1/conv1_1/weights',\n                        'vgg_16/conv1/conv1_1/biases',\n                        'vgg_16/conv1/conv1_2/weights',\n                        'vgg_16/conv1/conv1_2/biases',\n                        'vgg_16/conv2/conv2_1/weights',\n                        'vgg_16/conv2/conv2_1/biases',\n                        'vgg_16/conv2/conv2_2/weights',\n                        'vgg_16/conv2/conv2_2/biases',\n                        'vgg_16/conv3/conv3_1/weights',\n                        'vgg_16/conv3/conv3_1/biases',\n                        'vgg_16/conv3/conv3_2/weights',\n                        'vgg_16/conv3/conv3_2/biases',\n                        'vgg_16/conv3/conv3_3/weights',\n                        'vgg_16/conv3/conv3_3/biases',\n                        'vgg_16/conv4/conv4_1/weights',\n                        'vgg_16/conv4/conv4_1/biases',\n                        'vgg_16/conv4/conv4_2/weights',\n                        'vgg_16/conv4/conv4_2/biases',\n                        'vgg_16/conv4/conv4_3/weights',\n                        
'vgg_16/conv4/conv4_3/biases',\n                        'vgg_16/conv5/conv5_1/weights',\n                        'vgg_16/conv5/conv5_1/biases',\n                        'vgg_16/conv5/conv5_2/weights',\n                        'vgg_16/conv5/conv5_2/biases',\n                        'vgg_16/conv5/conv5_3/weights',\n                        'vgg_16/conv5/conv5_3/biases',\n                        'vgg_16/fc6/weights',\n                        'vgg_16/fc6/biases',\n                        'vgg_16/fc7/weights',\n                        'vgg_16/fc7/biases',\n                        'vgg_16/fc8/weights',\n                        'vgg_16/fc8/biases',\n                       ]\n      model_variables = [v.op.name for v in slim.get_model_variables()]\n      self.assertSetEqual(set(model_variables), set(expected_names))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_16(eval_inputs, is_training=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      predictions = tf.argmax(logits, 1)\n      self.assertListEqual(predictions.get_shape().as_list(), [batch_size])\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 2\n    eval_batch_size = 1\n    train_height, train_width = 224, 224\n    eval_height, eval_width = 256, 256\n    num_classes = 1000\n    with self.test_session():\n      train_inputs = tf.random_uniform(\n          (train_batch_size, train_height, train_width, 3))\n      logits, _ = vgg.vgg_16(train_inputs)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [train_batch_size, num_classes])\n      tf.get_variable_scope().reuse_variables()\n      eval_inputs = tf.random_uniform(\n          (eval_batch_size, eval_height, eval_width, 3))\n      logits, _ = 
vgg.vgg_16(eval_inputs, is_training=False,\n                             spatial_squeeze=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [eval_batch_size, 2, 2, num_classes])\n      logits = tf.reduce_mean(logits, [1, 2])\n      predictions = tf.argmax(logits, 1)\n      self.assertEquals(predictions.get_shape().as_list(), [eval_batch_size])\n\n  def testForward(self):\n    batch_size = 1\n    height, width = 224, 224\n    with self.test_session() as sess:\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_16(inputs)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits)\n      self.assertTrue(output.any())\n\n\nclass VGG19Test(tf.test.TestCase):\n\n  def testBuild(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_19(inputs, num_classes)\n      self.assertEquals(logits.op.name, 'vgg_19/fc8/squeezed')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n\n  def testFullyConvolutional(self):\n    batch_size = 1\n    height, width = 256, 256\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_19(inputs, num_classes, spatial_squeeze=False)\n      self.assertEquals(logits.op.name, 'vgg_19/fc8/BiasAdd')\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, 2, 2, num_classes])\n\n  def testEndPoints(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      _, end_points = vgg.vgg_19(inputs, num_classes)\n      expected_names = [\n          'vgg_19/conv1/conv1_1',\n          
'vgg_19/conv1/conv1_2',\n          'vgg_19/pool1',\n          'vgg_19/conv2/conv2_1',\n          'vgg_19/conv2/conv2_2',\n          'vgg_19/pool2',\n          'vgg_19/conv3/conv3_1',\n          'vgg_19/conv3/conv3_2',\n          'vgg_19/conv3/conv3_3',\n          'vgg_19/conv3/conv3_4',\n          'vgg_19/pool3',\n          'vgg_19/conv4/conv4_1',\n          'vgg_19/conv4/conv4_2',\n          'vgg_19/conv4/conv4_3',\n          'vgg_19/conv4/conv4_4',\n          'vgg_19/pool4',\n          'vgg_19/conv5/conv5_1',\n          'vgg_19/conv5/conv5_2',\n          'vgg_19/conv5/conv5_3',\n          'vgg_19/conv5/conv5_4',\n          'vgg_19/pool5',\n          'vgg_19/fc6',\n          'vgg_19/fc7',\n          'vgg_19/fc8'\n      ]\n      self.assertSetEqual(set(end_points.keys()), set(expected_names))\n\n  def testModelVariables(self):\n    batch_size = 5\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      vgg.vgg_19(inputs, num_classes)\n      expected_names = [\n          'vgg_19/conv1/conv1_1/weights',\n          'vgg_19/conv1/conv1_1/biases',\n          'vgg_19/conv1/conv1_2/weights',\n          'vgg_19/conv1/conv1_2/biases',\n          'vgg_19/conv2/conv2_1/weights',\n          'vgg_19/conv2/conv2_1/biases',\n          'vgg_19/conv2/conv2_2/weights',\n          'vgg_19/conv2/conv2_2/biases',\n          'vgg_19/conv3/conv3_1/weights',\n          'vgg_19/conv3/conv3_1/biases',\n          'vgg_19/conv3/conv3_2/weights',\n          'vgg_19/conv3/conv3_2/biases',\n          'vgg_19/conv3/conv3_3/weights',\n          'vgg_19/conv3/conv3_3/biases',\n          'vgg_19/conv3/conv3_4/weights',\n          'vgg_19/conv3/conv3_4/biases',\n          'vgg_19/conv4/conv4_1/weights',\n          'vgg_19/conv4/conv4_1/biases',\n          'vgg_19/conv4/conv4_2/weights',\n          'vgg_19/conv4/conv4_2/biases',\n          'vgg_19/conv4/conv4_3/weights',\n          
'vgg_19/conv4/conv4_3/biases',\n          'vgg_19/conv4/conv4_4/weights',\n          'vgg_19/conv4/conv4_4/biases',\n          'vgg_19/conv5/conv5_1/weights',\n          'vgg_19/conv5/conv5_1/biases',\n          'vgg_19/conv5/conv5_2/weights',\n          'vgg_19/conv5/conv5_2/biases',\n          'vgg_19/conv5/conv5_3/weights',\n          'vgg_19/conv5/conv5_3/biases',\n          'vgg_19/conv5/conv5_4/weights',\n          'vgg_19/conv5/conv5_4/biases',\n          'vgg_19/fc6/weights',\n          'vgg_19/fc6/biases',\n          'vgg_19/fc7/weights',\n          'vgg_19/fc7/biases',\n          'vgg_19/fc8/weights',\n          'vgg_19/fc8/biases',\n      ]\n      model_variables = [v.op.name for v in slim.get_model_variables()]\n      self.assertSetEqual(set(model_variables), set(expected_names))\n\n  def testEvaluation(self):\n    batch_size = 2\n    height, width = 224, 224\n    num_classes = 1000\n    with self.test_session():\n      eval_inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_19(eval_inputs, is_training=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [batch_size, num_classes])\n      predictions = tf.argmax(logits, 1)\n      self.assertListEqual(predictions.get_shape().as_list(), [batch_size])\n\n  def testTrainEvalWithReuse(self):\n    train_batch_size = 2\n    eval_batch_size = 1\n    train_height, train_width = 224, 224\n    eval_height, eval_width = 256, 256\n    num_classes = 1000\n    with self.test_session():\n      train_inputs = tf.random_uniform(\n          (train_batch_size, train_height, train_width, 3))\n      logits, _ = vgg.vgg_19(train_inputs)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [train_batch_size, num_classes])\n      tf.get_variable_scope().reuse_variables()\n      eval_inputs = tf.random_uniform(\n          (eval_batch_size, eval_height, eval_width, 3))\n      logits, _ = vgg.vgg_19(eval_inputs, 
is_training=False,\n                             spatial_squeeze=False)\n      self.assertListEqual(logits.get_shape().as_list(),\n                           [eval_batch_size, 2, 2, num_classes])\n      logits = tf.reduce_mean(logits, [1, 2])\n      predictions = tf.argmax(logits, 1)\n      self.assertEquals(predictions.get_shape().as_list(), [eval_batch_size])\n\n  def testForward(self):\n    batch_size = 1\n    height, width = 224, 224\n    with self.test_session() as sess:\n      inputs = tf.random_uniform((batch_size, height, width, 3))\n      logits, _ = vgg.vgg_19(inputs)\n      sess.run(tf.initialize_all_variables())\n      output = sess.run(logits)\n      self.assertTrue(output.any())\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/slim/preprocessing/__init__.py",
    "content": "\n"
  },
  {
    "path": "model_zoo/models/slim/preprocessing/cifarnet_preprocessing.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides utilities to preprocess images in CIFAR-10.\n\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\n_PADDING = 4\n\nslim = tf.contrib.slim\n\n\ndef preprocess_for_train(image,\n                         output_height,\n                         output_width,\n                         padding=_PADDING):\n  \"\"\"Preprocesses the given image for training.\n\n  Note that the actual resizing scale is sampled from\n    [`resize_size_min`, `resize_size_max`].\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n    padding: The amount of padding before and after each dimension of the image.\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  tf.image_summary('image', tf.expand_dims(image, 0))\n\n  # Transform the image to floats.\n  image = tf.to_float(image)\n  if padding > 0:\n    image = tf.pad(image, [[padding, padding], [padding, padding], [0, 0]])\n  # Randomly crop a [height, width] section of the image.\n  distorted_image = tf.random_crop(image,\n                                   [output_height, 
output_width, 3])\n\n  # Randomly flip the image horizontally.\n  distorted_image = tf.image.random_flip_left_right(distorted_image)\n\n  tf.image_summary('distorted_image', tf.expand_dims(distorted_image, 0))\n\n  # Because these operations are not commutative, consider randomizing\n  # the order of their operation.\n  distorted_image = tf.image.random_brightness(distorted_image,\n                                               max_delta=63)\n  distorted_image = tf.image.random_contrast(distorted_image,\n                                             lower=0.2, upper=1.8)\n  # Subtract off the mean and divide by the variance of the pixels.\n  return tf.image.per_image_whitening(distorted_image)\n\n\ndef preprocess_for_eval(image, output_height, output_width):\n  \"\"\"Preprocesses the given image for evaluation.\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  tf.image_summary('image', tf.expand_dims(image, 0))\n  # Transform the image to floats.\n  image = tf.to_float(image)\n\n  # Resize and crop if needed.\n  resized_image = tf.image.resize_image_with_crop_or_pad(image,\n                                                         output_width,\n                                                         output_height)\n  tf.image_summary('resized_image', tf.expand_dims(resized_image, 0))\n\n  # Subtract off the mean and divide by the variance of the pixels.\n  return tf.image.per_image_whitening(resized_image)\n\n\ndef preprocess_image(image, output_height, output_width, is_training=False):\n  \"\"\"Preprocesses the given image.\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n    is_training: `True` if we're 
preprocessing the image for training and\n      `False` otherwise.\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  if is_training:\n    return preprocess_for_train(image, output_height, output_width)\n  else:\n    return preprocess_for_eval(image, output_height, output_width)\n"
  },
  {
    "path": "model_zoo/models/slim/preprocessing/inception_preprocessing.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides utilities to preprocess images for the Inception networks.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom tensorflow.python.ops import control_flow_ops\n\n\ndef apply_with_random_selector(x, func, num_cases):\n  \"\"\"Computes func(x, sel), with sel sampled from [0...num_cases-1].\n\n  Args:\n    x: input Tensor.\n    func: Python function to apply.\n    num_cases: Python int32, number of cases to sample sel from.\n\n  Returns:\n    The result of func(x, sel), where func receives the value of the\n    selector as a python integer, but sel is sampled dynamically.\n  \"\"\"\n  sel = tf.random_uniform([], maxval=num_cases, dtype=tf.int32)\n  # Pass the real x only to one of the func calls.\n  return control_flow_ops.merge([\n      func(control_flow_ops.switch(x, tf.equal(sel, case))[1], case)\n      for case in range(num_cases)])[0]\n\n\ndef distort_color(image, color_ordering=0, fast_mode=True, scope=None):\n  \"\"\"Distort the color of a Tensor image.\n\n  Each color distortion is non-commutative and thus ordering of the color ops\n  matters. 
Ideally we would randomly permute the ordering of the color ops.\n  Rather than adding that level of complication, we select a distinct ordering\n  of color ops for each preprocessing thread.\n\n  Args:\n    image: 3-D Tensor containing single image in [0, 1].\n    color_ordering: Python int, a type of distortion (valid values: 0-3).\n    fast_mode: Avoids slower ops (random_hue and random_contrast)\n    scope: Optional scope for name_scope.\n  Returns:\n    3-D Tensor color-distorted image on range [0, 1]\n  Raises:\n    ValueError: if color_ordering not in [0, 3]\n  \"\"\"\n  with tf.name_scope(scope, 'distort_color', [image]):\n    if fast_mode:\n      if color_ordering == 0:\n        image = tf.image.random_brightness(image, max_delta=32. / 255.)\n        image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n      else:\n        image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n        image = tf.image.random_brightness(image, max_delta=32. / 255.)\n    else:\n      if color_ordering == 0:\n        image = tf.image.random_brightness(image, max_delta=32. / 255.)\n        image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n        image = tf.image.random_hue(image, max_delta=0.2)\n        image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n      elif color_ordering == 1:\n        image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n        image = tf.image.random_brightness(image, max_delta=32. / 255.)\n        image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n        image = tf.image.random_hue(image, max_delta=0.2)\n      elif color_ordering == 2:\n        image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n        image = tf.image.random_hue(image, max_delta=0.2)\n        image = tf.image.random_brightness(image, max_delta=32. 
/ 255.)\n        image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n      elif color_ordering == 3:\n        image = tf.image.random_hue(image, max_delta=0.2)\n        image = tf.image.random_saturation(image, lower=0.5, upper=1.5)\n        image = tf.image.random_contrast(image, lower=0.5, upper=1.5)\n        image = tf.image.random_brightness(image, max_delta=32. / 255.)\n      else:\n        raise ValueError('color_ordering must be in [0, 3]')\n\n    # The random_* ops do not necessarily clamp.\n    return tf.clip_by_value(image, 0.0, 1.0)\n\n\ndef distorted_bounding_box_crop(image,\n                                bbox,\n                                min_object_covered=0.1,\n                                aspect_ratio_range=(0.75, 1.33),\n                                area_range=(0.05, 1.0),\n                                max_attempts=100,\n                                scope=None):\n  \"\"\"Generates cropped_image using one of the bboxes randomly distorted.\n\n  See `tf.image.sample_distorted_bounding_box` for more documentation.\n\n  Args:\n    image: 3-D Tensor of image (it will be converted to floats in [0, 1]).\n    bbox: 3-D float Tensor of bounding boxes arranged [1, num_boxes, coords]\n      where each coordinate is [0, 1) and the coordinates are arranged\n      as [ymin, xmin, ymax, xmax]. If num_boxes is 0 then it would use the whole\n      image.\n    min_object_covered: An optional `float`. Defaults to `0.1`. The cropped\n      area of the image must contain at least this fraction of any bounding box\n      supplied.\n    aspect_ratio_range: An optional list of `floats`. The cropped area of the\n      image must have an aspect ratio = width / height within this range.\n    area_range: An optional list of `floats`. The cropped area of the image\n      must contain a fraction of the supplied image within this range.\n    max_attempts: An optional `int`. 
Number of attempts at generating a cropped\n      region of the image that meets the specified constraints. After `max_attempts`\n      failures, return the entire image.\n    scope: Optional scope for name_scope.\n  Returns:\n    A tuple, a 3-D Tensor cropped_image and the distorted bbox\n  \"\"\"\n  with tf.name_scope(scope, 'distorted_bounding_box_crop', [image, bbox]):\n    # Each bounding box has shape [1, num_boxes, box coords] and\n    # the coordinates are ordered [ymin, xmin, ymax, xmax].\n\n    # A large fraction of image datasets contain a human-annotated bounding\n    # box delineating the region of the image containing the object of interest.\n    # We choose to create a new bounding box for the object which is a randomly\n    # distorted version of the human-annotated bounding box that obeys an\n    # allowed range of aspect ratios, sizes and overlap with the human-annotated\n    # bounding box. If no box is supplied, then we assume the bounding box is\n    # the entire image.\n    sample_distorted_bounding_box = tf.image.sample_distorted_bounding_box(\n        tf.shape(image),\n        bounding_boxes=bbox,\n        min_object_covered=min_object_covered,\n        aspect_ratio_range=aspect_ratio_range,\n        area_range=area_range,\n        max_attempts=max_attempts,\n        use_image_if_no_bounding_boxes=True)\n    bbox_begin, bbox_size, distort_bbox = sample_distorted_bounding_box\n\n    # Crop the image to the specified bounding box.\n    cropped_image = tf.slice(image, bbox_begin, bbox_size)\n    return cropped_image, distort_bbox\n\n\ndef preprocess_for_train(image, height, width, bbox,\n                         fast_mode=True,\n                         scope=None):\n  \"\"\"Distort one image for training a network.\n\n  Distorting images provides a useful technique for augmenting the data\n  set during training in order to make the network invariant to aspects\n  of the image that do not affect the label.\n\n  Additionally, it creates 
image_summaries to display the different\n  transformations applied to the image.\n\n  Args:\n    image: 3-D Tensor of image. If dtype is tf.float32 then the range should be\n      [0, 1], otherwise it would be converted to tf.float32 assuming that the range\n      is [0, MAX], where MAX is the largest positive representable number for\n      int(8/16/32) data type (see `tf.image.convert_image_dtype` for details).\n    height: integer\n    width: integer\n    bbox: 3-D float Tensor of bounding boxes arranged [1, num_boxes, coords]\n      where each coordinate is [0, 1) and the coordinates are arranged\n      as [ymin, xmin, ymax, xmax].\n    fast_mode: Optional boolean, if True avoids slower transformations (i.e.\n      bi-cubic resizing, random_hue or random_contrast).\n    scope: Optional scope for name_scope.\n  Returns:\n    3-D float Tensor of distorted image used for training with range [-1, 1].\n  \"\"\"\n  with tf.name_scope(scope, 'distort_image', [image, height, width, bbox]):\n    if bbox is None:\n      bbox = tf.constant([0.0, 0.0, 1.0, 1.0],\n                         dtype=tf.float32,\n                         shape=[1, 1, 4])\n    if image.dtype != tf.float32:\n      image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n    # Each bounding box has shape [1, num_boxes, box coords] and\n    # the coordinates are ordered [ymin, xmin, ymax, xmax].\n    image_with_box = tf.image.draw_bounding_boxes(tf.expand_dims(image, 0),\n                                                  bbox)\n    tf.image_summary('image_with_bounding_boxes', image_with_box)\n\n    distorted_image, distorted_bbox = distorted_bounding_box_crop(image, bbox)\n    # Restore the shape since the dynamic slice based upon the bbox_size loses\n    # the third dimension.\n    distorted_image.set_shape([None, None, 3])\n    image_with_distorted_box = tf.image.draw_bounding_boxes(\n        tf.expand_dims(image, 0), distorted_bbox)\n    tf.image_summary('images_with_distorted_bounding_box',\n    
                 image_with_distorted_box)\n\n    # This resizing operation may distort the images because the aspect\n    # ratio is not respected. We select a resize method in a round robin\n    # fashion based on the thread number.\n    # Note that ResizeMethod contains 4 enumerated resizing methods.\n\n    # We select only 1 case for fast_mode bilinear.\n    num_resize_cases = 1 if fast_mode else 4\n    distorted_image = apply_with_random_selector(\n        distorted_image,\n        lambda x, method: tf.image.resize_images(x, [height, width], method=method),\n        num_cases=num_resize_cases)\n\n    tf.image_summary('cropped_resized_image',\n                     tf.expand_dims(distorted_image, 0))\n\n    # Randomly flip the image horizontally.\n    distorted_image = tf.image.random_flip_left_right(distorted_image)\n\n    # Randomly distort the colors. There are 4 ways to do it.\n    distorted_image = apply_with_random_selector(\n        distorted_image,\n        lambda x, ordering: distort_color(x, ordering, fast_mode),\n        num_cases=4)\n\n    tf.image_summary('final_distorted_image',\n                     tf.expand_dims(distorted_image, 0))\n    distorted_image = tf.sub(distorted_image, 0.5)\n    distorted_image = tf.mul(distorted_image, 2.0)\n    return distorted_image\n\n\ndef preprocess_for_eval(image, height, width,\n                        central_fraction=0.875, scope=None):\n  \"\"\"Prepare one image for evaluation.\n\n  If height and width are specified it would output an image with that size by\n  applying resize_bilinear.\n\n  If central_fraction is specified it would crop the central fraction of the\n  input image.\n\n  Args:\n    image: 3-D Tensor of image. 
If dtype is tf.float32 then the range should be\n      [0, 1], otherwise it would be converted to tf.float32 assuming that the range\n      is [0, MAX], where MAX is the largest positive representable number for\n      int(8/16/32) data type (see `tf.image.convert_image_dtype` for details)\n    height: integer\n    width: integer\n    central_fraction: Optional Float, fraction of the image to crop.\n    scope: Optional scope for name_scope.\n  Returns:\n    3-D float Tensor of prepared image.\n  \"\"\"\n  with tf.name_scope(scope, 'eval_image', [image, height, width]):\n    if image.dtype != tf.float32:\n      image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n    # Crop the central region of the image with an area containing 87.5% of\n    # the original image.\n    if central_fraction:\n      image = tf.image.central_crop(image, central_fraction=central_fraction)\n\n    if height and width:\n      # Resize the image to the specified height and width.\n      image = tf.expand_dims(image, 0)\n      image = tf.image.resize_bilinear(image, [height, width],\n                                       align_corners=False)\n      image = tf.squeeze(image, [0])\n    image = tf.sub(image, 0.5)\n    image = tf.mul(image, 2.0)\n    return image\n\n\ndef preprocess_image(image, height, width,\n                     is_training=False,\n                     bbox=None,\n                     fast_mode=True):\n  \"\"\"Pre-process one image for training or evaluation.\n\n  Args:\n    image: 3-D Tensor [height, width, channels] with the image.\n    height: integer, image expected height.\n    width: integer, image expected width.\n    is_training: Boolean. 
If true it would transform an image for train,\n      otherwise it would transform it for evaluation.\n    bbox: 3-D float Tensor of bounding boxes arranged [1, num_boxes, coords]\n      where each coordinate is [0, 1) and the coordinates are arranged as\n      [ymin, xmin, ymax, xmax].\n    fast_mode: Optional boolean, if True avoids slower transformations.\n\n  Returns:\n    3-D float Tensor containing an appropriately scaled image\n\n  Raises:\n    ValueError: if user does not provide bounding box\n  \"\"\"\n  if is_training:\n    return preprocess_for_train(image, height, width, bbox, fast_mode)\n  else:\n    return preprocess_for_eval(image, height, width)\n"
  },
  {
    "path": "model_zoo/models/slim/preprocessing/lenet_preprocessing.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides utilities for preprocessing.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nslim = tf.contrib.slim\n\n\ndef preprocess_image(image, output_height, output_width, is_training):\n  \"\"\"Preprocesses the given image.\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n    is_training: `True` if we're preprocessing the image for training and\n      `False` otherwise.\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  image = tf.to_float(image)\n  # resize_image_with_crop_or_pad expects (image, target_height, target_width).\n  image = tf.image.resize_image_with_crop_or_pad(\n      image, output_height, output_width)\n  image = tf.sub(image, 128.0)\n  image = tf.div(image, 128.0)\n  return image\n"
  },
  {
    "path": "model_zoo/models/slim/preprocessing/preprocessing_factory.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Contains a factory for building various models.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom preprocessing import cifarnet_preprocessing\nfrom preprocessing import inception_preprocessing\nfrom preprocessing import lenet_preprocessing\nfrom preprocessing import vgg_preprocessing\n\nslim = tf.contrib.slim\n\n\ndef get_preprocessing(name, is_training=False):\n  \"\"\"Returns preprocessing_fn(image, height, width, **kwargs).\n\n  Args:\n    name: The name of the preprocessing function.\n    is_training: `True` if the model is being used for training and `False`\n      otherwise.\n\n  Returns:\n    preprocessing_fn: A function that preprocesses a single image (pre-batch).\n      It has the following signature:\n        image = preprocessing_fn(image, output_height, output_width, ...).\n\n  Raises:\n    ValueError: If preprocessing `name` is not recognized.\n  \"\"\"\n  preprocessing_fn_map = {\n      'cifarnet': cifarnet_preprocessing,\n      'inception': inception_preprocessing,\n      'inception_v1': inception_preprocessing,\n      'inception_v2': inception_preprocessing,\n      'inception_v3': inception_preprocessing,\n      'inception_v4': 
inception_preprocessing,\n      'inception_resnet_v2': inception_preprocessing,\n      'lenet': lenet_preprocessing,\n      'resnet_v1_50': vgg_preprocessing,\n      'resnet_v1_101': vgg_preprocessing,\n      'resnet_v1_152': vgg_preprocessing,\n      'vgg': vgg_preprocessing,\n      'vgg_a': vgg_preprocessing,\n      'vgg_16': vgg_preprocessing,\n      'vgg_19': vgg_preprocessing,\n  }\n\n  if name not in preprocessing_fn_map:\n    raise ValueError('Preprocessing name [%s] was not recognized' % name)\n\n  def preprocessing_fn(image, output_height, output_width, **kwargs):\n    return preprocessing_fn_map[name].preprocess_image(\n        image, output_height, output_width, is_training=is_training, **kwargs)\n\n  return preprocessing_fn\n"
  },
  {
    "path": "model_zoo/models/slim/preprocessing/vgg_preprocessing.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Provides utilities to preprocess images.\n\nThe preprocessing steps for VGG were introduced in the following technical\nreport:\n\n  Very Deep Convolutional Networks For Large-Scale Image Recognition\n  Karen Simonyan and Andrew Zisserman\n  arXiv technical report, 2015\n  PDF: http://arxiv.org/pdf/1409.1556.pdf\n  ILSVRC 2014 Slides: http://www.robots.ox.ac.uk/~karen/pdf/ILSVRC_2014.pdf\n  CC-BY-4.0\n\nMore information can be obtained from the VGG website:\nwww.robots.ox.ac.uk/~vgg/research/very_deep/\n\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom tensorflow.python.ops import control_flow_ops\n\nslim = tf.contrib.slim\n\n_R_MEAN = 123.68\n_G_MEAN = 116.78\n_B_MEAN = 103.94\n\n_RESIZE_SIDE_MIN = 256\n_RESIZE_SIDE_MAX = 512\n\n\ndef _crop(image, offset_height, offset_width, crop_height, crop_width):\n  \"\"\"Crops the given image using the provided offsets and sizes.\n\n  Note that the method doesn't assume we know the input image size but it does\n  assume we know the input image rank.\n\n  Args:\n    image: an image of shape [height, width, channels].\n    offset_height: a scalar tensor indicating the height offset.\n    offset_width: a scalar 
tensor indicating the width offset.\n    crop_height: the height of the cropped image.\n    crop_width: the width of the cropped image.\n\n  Returns:\n    the cropped (and resized) image.\n\n  Raises:\n    InvalidArgumentError: if the rank is not 3 or if the image dimensions are\n      less than the crop size.\n  \"\"\"\n  original_shape = tf.shape(image)\n\n  rank_assertion = tf.Assert(\n      tf.equal(tf.rank(image), 3),\n      ['Rank of image must be equal to 3.'])\n  cropped_shape = control_flow_ops.with_dependencies(\n      [rank_assertion],\n      tf.pack([crop_height, crop_width, original_shape[2]]))\n\n  size_assertion = tf.Assert(\n      tf.logical_and(\n          tf.greater_equal(original_shape[0], crop_height),\n          tf.greater_equal(original_shape[1], crop_width)),\n      ['Crop size greater than the image size.'])\n\n  offsets = tf.to_int32(tf.pack([offset_height, offset_width, 0]))\n\n  # Use tf.slice instead of crop_to_bounding box as it accepts tensors to\n  # define the crop size.\n  image = control_flow_ops.with_dependencies(\n      [size_assertion],\n      tf.slice(image, offsets, cropped_shape))\n  return tf.reshape(image, cropped_shape)\n\n\ndef _random_crop(image_list, crop_height, crop_width):\n  \"\"\"Crops the given list of images.\n\n  The function applies the same crop to each image in the list. 
This can be\n  effectively applied when there are multiple image inputs of the same\n  dimension such as:\n\n    image, depths, normals = _random_crop([image, depths, normals], 120, 150)\n\n  Args:\n    image_list: a list of image tensors of the same dimension but possibly\n      varying channel.\n    crop_height: the new height.\n    crop_width: the new width.\n\n  Returns:\n    the image_list with cropped images.\n\n  Raises:\n    ValueError: if there are multiple image inputs provided with different size\n      or the images are smaller than the crop dimensions.\n  \"\"\"\n  if not image_list:\n    raise ValueError('Empty image_list.')\n\n  # Compute the rank assertions.\n  rank_assertions = []\n  for i in range(len(image_list)):\n    image_rank = tf.rank(image_list[i])\n    rank_assert = tf.Assert(\n        tf.equal(image_rank, 3),\n        ['Wrong rank for tensor  %s [expected] [actual]',\n         image_list[i].name, 3, image_rank])\n    rank_assertions.append(rank_assert)\n\n  image_shape = control_flow_ops.with_dependencies(\n      [rank_assertions[0]],\n      tf.shape(image_list[0]))\n  image_height = image_shape[0]\n  image_width = image_shape[1]\n  crop_size_assert = tf.Assert(\n      tf.logical_and(\n          tf.greater_equal(image_height, crop_height),\n          tf.greater_equal(image_width, crop_width)),\n      ['Crop size greater than the image size.'])\n\n  asserts = [rank_assertions[0], crop_size_assert]\n\n  for i in range(1, len(image_list)):\n    image = image_list[i]\n    asserts.append(rank_assertions[i])\n    shape = control_flow_ops.with_dependencies([rank_assertions[i]],\n                                               tf.shape(image))\n    height = shape[0]\n    width = shape[1]\n\n    height_assert = tf.Assert(\n        tf.equal(height, image_height),\n        ['Wrong height for tensor %s [expected][actual]',\n         image.name, height, image_height])\n    width_assert = tf.Assert(\n        tf.equal(width, image_width),\n        
['Wrong width for tensor %s [expected][actual]',\n         image.name, width, image_width])\n    asserts.extend([height_assert, width_assert])\n\n  # Create a random bounding box.\n  #\n  # Use tf.random_uniform and not numpy.random.rand as doing the former would\n  # generate random numbers at graph eval time, unlike the latter which\n  # generates random numbers at graph definition time.\n  max_offset_height = control_flow_ops.with_dependencies(\n      asserts, tf.reshape(image_height - crop_height + 1, []))\n  max_offset_width = control_flow_ops.with_dependencies(\n      asserts, tf.reshape(image_width - crop_width + 1, []))\n  offset_height = tf.random_uniform(\n      [], maxval=max_offset_height, dtype=tf.int32)\n  offset_width = tf.random_uniform(\n      [], maxval=max_offset_width, dtype=tf.int32)\n\n  return [_crop(image, offset_height, offset_width,\n                crop_height, crop_width) for image in image_list]\n\n\ndef _central_crop(image_list, crop_height, crop_width):\n  \"\"\"Performs central crops of the given image list.\n\n  Args:\n    image_list: a list of image tensors of the same dimension but possibly\n      varying channel.\n    crop_height: the height of the image following the crop.\n    crop_width: the width of the image following the crop.\n\n  Returns:\n    the list of cropped images.\n  \"\"\"\n  outputs = []\n  for image in image_list:\n    image_height = tf.shape(image)[0]\n    image_width = tf.shape(image)[1]\n\n    offset_height = (image_height - crop_height) / 2\n    offset_width = (image_width - crop_width) / 2\n\n    outputs.append(_crop(image, offset_height, offset_width,\n                         crop_height, crop_width))\n  return outputs\n\n\ndef _mean_image_subtraction(image, means):\n  \"\"\"Subtracts the given means from each image channel.\n\n  For example:\n    means = [123.68, 116.779, 103.939]\n    image = _mean_image_subtraction(image, means)\n\n  Note that the rank of `image` must be known.\n\n  Args:\n    image: a 
tensor of size [height, width, C].\n    means: a C-vector of values to subtract from each channel.\n\n  Returns:\n    the centered image.\n\n  Raises:\n    ValueError: If the rank of `image` is unknown, if `image` has a rank other\n      than three or if the number of channels in `image` doesn't match the\n      number of values in `means`.\n  \"\"\"\n  if image.get_shape().ndims != 3:\n    raise ValueError('Input must be of size [height, width, C>0]')\n  num_channels = image.get_shape().as_list()[-1]\n  if len(means) != num_channels:\n    raise ValueError('len(means) must match the number of channels')\n\n  channels = tf.split(2, num_channels, image)\n  for i in range(num_channels):\n    channels[i] -= means[i]\n  return tf.concat(2, channels)\n\n\ndef _smallest_size_at_least(height, width, smallest_side):\n  \"\"\"Computes new shape with the smallest side equal to `smallest_side`.\n\n  Computes new shape with the smallest side equal to `smallest_side` while\n  preserving the original aspect ratio.\n\n  Args:\n    height: an int32 scalar tensor indicating the current height.\n    width: an int32 scalar tensor indicating the current width.\n    smallest_side: A python integer or scalar `Tensor` indicating the size of\n      the smallest side after resize.\n\n  Returns:\n    new_height: an int32 scalar tensor indicating the new height.\n    new_width: an int32 scalar tensor indicating the new width.\n  \"\"\"\n  smallest_side = tf.convert_to_tensor(smallest_side, dtype=tf.int32)\n\n  height = tf.to_float(height)\n  width = tf.to_float(width)\n  smallest_side = tf.to_float(smallest_side)\n\n  scale = tf.cond(tf.greater(height, width),\n                  lambda: smallest_side / width,\n                  lambda: smallest_side / height)\n  new_height = tf.to_int32(height * scale)\n  new_width = tf.to_int32(width * scale)\n  return new_height, new_width\n\n\ndef _aspect_preserving_resize(image, smallest_side):\n  \"\"\"Resize images preserving the original aspect 
ratio.\n\n  Args:\n    image: A 3-D image `Tensor`.\n    smallest_side: A python integer or scalar `Tensor` indicating the size of\n      the smallest side after resize.\n\n  Returns:\n    resized_image: A 3-D tensor containing the resized image.\n  \"\"\"\n  smallest_side = tf.convert_to_tensor(smallest_side, dtype=tf.int32)\n\n  shape = tf.shape(image)\n  height = shape[0]\n  width = shape[1]\n  new_height, new_width = _smallest_size_at_least(height, width, smallest_side)\n  image = tf.expand_dims(image, 0)\n  resized_image = tf.image.resize_bilinear(image, [new_height, new_width],\n                                           align_corners=False)\n  resized_image = tf.squeeze(resized_image)\n  resized_image.set_shape([None, None, 3])\n  return resized_image\n\n\ndef preprocess_for_train(image,\n                         output_height,\n                         output_width,\n                         resize_side_min=_RESIZE_SIDE_MIN,\n                         resize_side_max=_RESIZE_SIDE_MAX):\n  \"\"\"Preprocesses the given image for training.\n\n  Note that the actual resizing scale is sampled from\n    [`resize_side_min`, `resize_side_max`].\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n    resize_side_min: The lower bound for the smallest side of the image for\n      aspect-preserving resizing.\n    resize_side_max: The upper bound for the smallest side of the image for\n      aspect-preserving resizing.\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  resize_side = tf.random_uniform(\n      [], minval=resize_side_min, maxval=resize_side_max+1, dtype=tf.int32)\n\n  image = _aspect_preserving_resize(image, resize_side)\n  image = _random_crop([image], output_height, output_width)[0]\n  image.set_shape([output_height, output_width, 3])\n  image = tf.to_float(image)\n  image = 
tf.image.random_flip_left_right(image)\n  return _mean_image_subtraction(image, [_R_MEAN, _G_MEAN, _B_MEAN])\n\n\ndef preprocess_for_eval(image, output_height, output_width, resize_side):\n  \"\"\"Preprocesses the given image for evaluation.\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n    resize_side: The smallest side of the image for aspect-preserving resizing.\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  image = _aspect_preserving_resize(image, resize_side)\n  image = _central_crop([image], output_height, output_width)[0]\n  image.set_shape([output_height, output_width, 3])\n  image = tf.to_float(image)\n  return _mean_image_subtraction(image, [_R_MEAN, _G_MEAN, _B_MEAN])\n\n\ndef preprocess_image(image, output_height, output_width, is_training=False,\n                     resize_side_min=_RESIZE_SIDE_MIN,\n                     resize_side_max=_RESIZE_SIDE_MAX):\n  \"\"\"Preprocesses the given image.\n\n  Args:\n    image: A `Tensor` representing an image of arbitrary size.\n    output_height: The height of the image after preprocessing.\n    output_width: The width of the image after preprocessing.\n    is_training: `True` if we're preprocessing the image for training and\n      `False` otherwise.\n    resize_side_min: The lower bound for the smallest side of the image for\n      aspect-preserving resizing. If `is_training` is `False`, then this value\n      is used for rescaling.\n    resize_side_max: The upper bound for the smallest side of the image for\n      aspect-preserving resizing. If `is_training` is `False`, this value is\n      ignored. 
Otherwise, the resize side is sampled from\n        [resize_side_min, resize_side_max].\n\n  Returns:\n    A preprocessed image.\n  \"\"\"\n  if is_training:\n    return preprocess_for_train(image, output_height, output_width,\n                                resize_side_min, resize_side_max)\n  else:\n    return preprocess_for_eval(image, output_height, output_width,\n                               resize_side_min)\n"
  },
  {
    "path": "model_zoo/models/slim/scripts/finetune_inception_v1_on_flowers.sh",
    "content": "#!/bin/bash\n#\n# This script performs the following operations:\n# 1. Downloads the Flowers dataset\n# 2. Fine-tunes an InceptionV1 model on the Flowers training set.\n# 3. Evaluates the model on the Flowers validation set.\n#\n# Usage:\n# cd slim\n# ./scripts/finetune_inception_v1_on_flowers.sh\n\n# Where the pre-trained InceptionV1 checkpoint is saved to.\nPRETRAINED_CHECKPOINT_DIR=/tmp/checkpoints\n\n# Where the training (fine-tuned) checkpoint and logs will be saved to.\nTRAIN_DIR=/tmp/flowers-models/inception_v1\n\n# Where the dataset is saved to.\nDATASET_DIR=/tmp/flowers\n\n# Download the pre-trained checkpoint.\nif [ ! -d \"$PRETRAINED_CHECKPOINT_DIR\" ]; then\n  mkdir ${PRETRAINED_CHECKPOINT_DIR}\nfi\nif [ ! -f ${PRETRAINED_CHECKPOINT_DIR}/inception_v1.ckpt ]; then\n  wget http://download.tensorflow.org/models/inception_v1_2016_08_28.tar.gz\n  tar -xvf inception_v1_2016_08_28.tar.gz\n  mv inception_v1.ckpt ${PRETRAINED_CHECKPOINT_DIR}/inception_v1.ckpt\n  rm inception_v1_2016_08_28.tar.gz\nfi\n\n# Download the dataset\npython download_and_convert_data.py \\\n  --dataset_name=flowers \\\n  --dataset_dir=${DATASET_DIR}\n\n# Fine-tune only the new layers for 3000 steps.\npython train_image_classifier.py \\\n  --train_dir=${TRAIN_DIR} \\\n  --dataset_name=flowers \\\n  --dataset_split_name=train \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v1 \\\n  --checkpoint_path=${PRETRAINED_CHECKPOINT_DIR}/inception_v1.ckpt \\\n  --checkpoint_exclude_scopes=InceptionV1/Logits \\\n  --trainable_scopes=InceptionV1/Logits \\\n  --max_number_of_steps=3000 \\\n  --batch_size=32 \\\n  --learning_rate=0.01 \\\n  --save_interval_secs=60 \\\n  --save_summaries_secs=60 \\\n  --log_every_n_steps=100 \\\n  --optimizer=rmsprop \\\n  --weight_decay=0.00004\n\n# Run evaluation.\npython eval_image_classifier.py \\\n  --checkpoint_path=${TRAIN_DIR} \\\n  --eval_dir=${TRAIN_DIR} \\\n  --dataset_name=flowers \\\n  --dataset_split_name=validation \\\n 
 --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v1\n\n# Fine-tune all layers for 1000 steps.\npython train_image_classifier.py \\\n  --train_dir=${TRAIN_DIR}/all \\\n  --dataset_name=flowers \\\n  --dataset_split_name=train \\\n  --dataset_dir=${DATASET_DIR} \\\n  --checkpoint_path=${TRAIN_DIR} \\\n  --model_name=inception_v1 \\\n  --max_number_of_steps=1000 \\\n  --batch_size=32 \\\n  --learning_rate=0.001 \\\n  --save_interval_secs=60 \\\n  --save_summaries_secs=60 \\\n  --log_every_n_steps=100 \\\n  --optimizer=rmsprop \\\n  --weight_decay=0.00004\n\n# Run evaluation.\npython eval_image_classifier.py \\\n  --checkpoint_path=${TRAIN_DIR}/all \\\n  --eval_dir=${TRAIN_DIR}/all \\\n  --dataset_name=flowers \\\n  --dataset_split_name=validation \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v1\n"
  },
  {
    "path": "model_zoo/models/slim/scripts/finetune_inception_v3_on_flowers.sh",
    "content": "#!/bin/bash\n#\n# This script performs the following operations:\n# 1. Downloads the Flowers dataset\n# 2. Fine-tunes an InceptionV3 model on the Flowers training set.\n# 3. Evaluates the model on the Flowers validation set.\n#\n# Usage:\n# cd slim\n# ./scripts/finetune_inception_v3_on_flowers.sh\n\n# Where the pre-trained InceptionV3 checkpoint is saved to.\nPRETRAINED_CHECKPOINT_DIR=/tmp/checkpoints\n\n# Where the training (fine-tuned) checkpoint and logs will be saved to.\nTRAIN_DIR=/tmp/flowers-models/inception_v3\n\n# Where the dataset is saved to.\nDATASET_DIR=/tmp/flowers\n\n# Download the pre-trained checkpoint.\nif [ ! -d \"$PRETRAINED_CHECKPOINT_DIR\" ]; then\n  mkdir ${PRETRAINED_CHECKPOINT_DIR}\nfi\nif [ ! -f ${PRETRAINED_CHECKPOINT_DIR}/inception_v3.ckpt ]; then\n  wget http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz\n  tar -xvf inception_v3_2016_08_28.tar.gz\n  mv inception_v3.ckpt ${PRETRAINED_CHECKPOINT_DIR}/inception_v3.ckpt\n  rm inception_v3_2016_08_28.tar.gz\nfi\n\n# Download the dataset\npython download_and_convert_data.py \\\n  --dataset_name=flowers \\\n  --dataset_dir=${DATASET_DIR}\n\n# Fine-tune only the new layers for 1000 steps.\npython train_image_classifier.py \\\n  --train_dir=${TRAIN_DIR} \\\n  --dataset_name=flowers \\\n  --dataset_split_name=train \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v3 \\\n  --checkpoint_path=${PRETRAINED_CHECKPOINT_DIR}/inception_v3.ckpt \\\n  --checkpoint_exclude_scopes=InceptionV3/Logits,InceptionV3/AuxLogits \\\n  --trainable_scopes=InceptionV3/Logits,InceptionV3/AuxLogits \\\n  --max_number_of_steps=1000 \\\n  --batch_size=32 \\\n  --learning_rate=0.01 \\\n  --learning_rate_decay_type=fixed \\\n  --save_interval_secs=60 \\\n  --save_summaries_secs=60 \\\n  --log_every_n_steps=100 \\\n  --optimizer=rmsprop \\\n  --weight_decay=0.00004\n\n# Run evaluation.\npython eval_image_classifier.py \\\n  --checkpoint_path=${TRAIN_DIR} \\\n  
--eval_dir=${TRAIN_DIR} \\\n  --dataset_name=flowers \\\n  --dataset_split_name=validation \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v3\n\n# Fine-tune all layers for 500 steps.\npython train_image_classifier.py \\\n  --train_dir=${TRAIN_DIR}/all \\\n  --dataset_name=flowers \\\n  --dataset_split_name=train \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v3 \\\n  --checkpoint_path=${TRAIN_DIR} \\\n  --max_number_of_steps=500 \\\n  --batch_size=32 \\\n  --learning_rate=0.0001 \\\n  --learning_rate_decay_type=fixed \\\n  --save_interval_secs=60 \\\n  --save_summaries_secs=60 \\\n  --log_every_n_steps=10 \\\n  --optimizer=rmsprop \\\n  --weight_decay=0.00004\n\n# Run evaluation.\npython eval_image_classifier.py \\\n  --checkpoint_path=${TRAIN_DIR}/all \\\n  --eval_dir=${TRAIN_DIR}/all \\\n  --dataset_name=flowers \\\n  --dataset_split_name=validation \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=inception_v3\n"
  },
  {
    "path": "model_zoo/models/slim/scripts/train_cifarnet_on_cifar10.sh",
    "content": "#!/bin/bash\n#\n# This script performs the following operations:\n# 1. Downloads the Cifar10 dataset\n# 2. Trains a CifarNet model on the Cifar10 training set.\n# 3. Evaluates the model on the Cifar10 testing set.\n#\n# Usage:\n# cd slim\n# ./scripts/train_cifar_net_on_mnist.sh\n\n# Where the checkpoint and logs will be saved to.\nTRAIN_DIR=/tmp/cifarnet-model\n\n# Where the dataset is saved to.\nDATASET_DIR=/tmp/cifar10\n\n# Download the dataset\npython download_and_convert_data.py \\\n  --dataset_name=cifar10 \\\n  --dataset_dir=${DATASET_DIR}\n\n# Run training.\npython train_image_classifier.py \\\n  --train_dir=${TRAIN_DIR} \\\n  --dataset_name=cifar10 \\\n  --dataset_split_name=train \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=cifarnet \\\n  --preprocessing_name=cifarnet \\\n  --max_number_of_steps=100000 \\\n  --batch_size=128 \\\n  --save_interval_secs=120 \\\n  --save_summaries_secs=120 \\\n  --log_every_n_steps=100 \\\n  --optimizer=sgd \\\n  --learning_rate=0.1 \\\n  --learning_rate_decay_factor=0.1 \\\n  --num_epochs_per_decay=200 \\\n  --weight_decay=0.004\n\n# Run evaluation.\npython eval_image_classifier.py \\\n  --checkpoint_path=${TRAIN_DIR} \\\n  --eval_dir=${TRAIN_DIR} \\\n  --dataset_name=cifar10 \\\n  --dataset_split_name=test \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=cifarnet\n"
  },
  {
    "path": "model_zoo/models/slim/scripts/train_lenet_on_mnist.sh",
    "content": "#!/bin/bash\n#\n# This script performs the following operations:\n# 1. Downloads the MNIST dataset\n# 2. Trains a LeNet model on the MNIST training set.\n# 3. Evaluates the model on the MNIST testing set.\n#\n# Usage:\n# cd slim\n# ./slim/scripts/train_lenet_on_mnist.sh\n\n# Where the checkpoint and logs will be saved to.\nTRAIN_DIR=/tmp/lenet-model\n\n# Where the dataset is saved to.\nDATASET_DIR=/tmp/mnist\n\n# Download the dataset\npython download_and_convert_data.py \\\n  --dataset_name=mnist \\\n  --dataset_dir=${DATASET_DIR}\n\n# Run training.\npython train_image_classifier.py \\\n  --train_dir=${TRAIN_DIR} \\\n  --dataset_name=mnist \\\n  --dataset_split_name=train \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=lenet \\\n  --preprocessing_name=lenet \\\n  --max_number_of_steps=20000 \\\n  --batch_size=50 \\\n  --learning_rate=0.01 \\\n  --save_interval_secs=60 \\\n  --save_summaries_secs=60 \\\n  --log_every_n_steps=100 \\\n  --optimizer=sgd \\\n  --learning_rate_decay_type=fixed \\\n  --weight_decay=0\n\n# Run evaluation.\npython eval_image_classifier.py \\\n  --checkpoint_path=${TRAIN_DIR} \\\n  --eval_dir=${TRAIN_DIR} \\\n  --dataset_name=mnist \\\n  --dataset_split_name=test \\\n  --dataset_dir=${DATASET_DIR} \\\n  --model_name=lenet\n"
  },
  {
    "path": "model_zoo/models/slim/slim_walkthough.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# TF-Slim Walkthrough\\n\",\n    \"\\n\",\n    \"This notebook will walk you through the basics of using TF-Slim to define, train and evaluate neural networks on various tasks. It assumes a basic knowledge of neural networks. \"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Table of contents\\n\",\n    \"\\n\",\n    \"<a href=\\\"#Install\\\">Installation and setup</a><br>\\n\",\n    \"<a href='#MLP'>Creating your first neural network with TF-Slim</a><br>\\n\",\n    \"<a href='#ReadingTFSlimDatasets'>Reading Data with TF-Slim</a><br>\\n\",\n    \"<a href='#CNN'>Training a convolutional neural network (CNN)</a><br>\\n\",\n    \"<a href='#Pretained'>Using pre-trained models</a><br>\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Installation and setup\\n\",\n    \"<a id='Install'></a>\\n\",\n    \"\\n\",\n    \"As of 8/28/16, the latest stable release of TF is r0.10, which does not contain the latest version of slim.\\n\",\n    \"To obtain the latest version of TF-Slim, please install the most recent nightly build of TF\\n\",\n    \"as explained [here](https://github.com/tensorflow/models/tree/master/slim#installing-latest-version-of-tf-slim).\\n\",\n    \"\\n\",\n    \"To use TF-Slim for image classification (as we do in this notebook), you also have to install the TF-Slim image models library from [here](https://github.com/tensorflow/models/tree/master/slim). Let's suppose you install this into a directory called TF_MODELS. Then you should change directory to  TF_MODELS/slim **before** running this notebook, so that these files are in your python path.\\n\",\n    \"\\n\",\n    \"To check you've got these two steps to work, just execute the cell below. 
If it complains about unknown modules, restart the notebook after moving to the TF-Slim models directory.\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"import matplotlib\\n\",\n    \"%matplotlib inline\\n\",\n    \"import matplotlib.pyplot as plt\\n\",\n    \"import math\\n\",\n    \"import numpy as np\\n\",\n    \"import tensorflow as tf\\n\",\n    \"import time\\n\",\n    \"\\n\",\n    \"from datasets import dataset_utils\\n\",\n    \"\\n\",\n    \"# Main slim library\\n\",\n    \"slim = tf.contrib.slim\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Creating your first neural network with TF-Slim\\n\",\n    \"<a id='MLP'></a>\\n\",\n    \"\\n\",\n    \"Below we give some code to create a simple multilayer perceptron (MLP) which can be used\\n\",\n    \"for regression problems. The model has 2 hidden layers.\\n\",\n    \"The output is a single node. \\n\",\n    \"When this function is called, it will create various nodes, and silently add them to whichever global TF graph is currently in scope. When a node which corresponds to a layer with adjustable parameters (e.g., a fully connected layer) is created, additional parameter variable nodes are silently created, and added to the graph. (We will discuss how to train the parameters later.)\\n\",\n    \"\\n\",\n    \"We use variable scope to put all the nodes under a common name,\\n\",\n    \"so that the graph has some hierarchical structure.\\n\",\n    \"This is useful when we want to visualize the TF graph in tensorboard, or if we want to query related\\n\",\n    \"variables. \\n\",\n    \"The fully connected layers all use the same L2 weight decay and ReLU activations, as specified by **arg_scope**. 
(However, the final layer overrides these defaults, and uses an identity activation function.)\\n\",\n    \"\\n\",\n    \"We also illustrate how to add a dropout layer after the first fully connected layer (FC1). Note that at test time, \\n\",\n    \"we do not drop out nodes, but instead use the average activations; hence we need to know whether the model is being\\n\",\n    \"constructed for training or testing, since the computational graph will be different in the two cases\\n\",\n    \"(although the variables, storing the model parameters, will be shared, since they have the same name/scope).\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": true\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"def regression_model(inputs, is_training=True, scope=\\\"deep_regression\\\"):\\n\",\n    \"    \\\"\\\"\\\"Creates the regression model.\\n\",\n    \"\\n\",\n    \"    Args:\\n\",\n    \"        inputs: A node that yields a `Tensor` of size [batch_size, dimensions].\\n\",\n    \"        is_training: Whether or not we're currently training the model.\\n\",\n    \"        scope: An optional variable_op scope for the model.\\n\",\n    \"\\n\",\n    \"    Returns:\\n\",\n    \"        predictions: 1-D `Tensor` of shape [batch_size] of responses.\\n\",\n    \"        end_points: A dict of end points representing the hidden layers.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    with tf.variable_scope(scope, 'deep_regression', [inputs]):\\n\",\n    \"        end_points = {}\\n\",\n    \"        # Set the default weight regularizer and activation for each fully_connected layer.\\n\",\n    \"        with slim.arg_scope([slim.fully_connected],\\n\",\n    \"                            activation_fn=tf.nn.relu,\\n\",\n    \"                            weights_regularizer=slim.l2_regularizer(0.01)):\\n\",\n    \"\\n\",\n    \"            # Creates a fully connected layer from the inputs with 32 hidden 
units.\\n\",\n    \"            net = slim.fully_connected(inputs, 32, scope='fc1')\\n\",\n    \"            end_points['fc1'] = net\\n\",\n    \"\\n\",\n    \"            # Adds a dropout layer to prevent over-fitting.\\n\",\n    \"            net = slim.dropout(net, 0.8, is_training=is_training)\\n\",\n    \"\\n\",\n    \"            # Adds another fully connected layer with 16 hidden units.\\n\",\n    \"            net = slim.fully_connected(net, 16, scope='fc2')\\n\",\n    \"            end_points['fc2'] = net\\n\",\n    \"\\n\",\n    \"            # Creates a fully-connected layer with a single hidden unit. Note that the\\n\",\n    \"            # layer is made linear by setting activation_fn=None.\\n\",\n    \"            predictions = slim.fully_connected(net, 1, activation_fn=None, scope='prediction')\\n\",\n    \"            end_points['out'] = predictions\\n\",\n    \"\\n\",\n    \"            return predictions, end_points\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Let's create the model and examine its structure.\\n\",\n    \"\\n\",\n    \"We create a TF graph and call regression_model(), which adds nodes (tensors) to the graph. We then examine their shape, and print the names of all the model variables which have been implicitly created inside of each layer. 
We see that the names of the variables follow the scopes that we specified.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"with tf.Graph().as_default():\\n\",\n    \"    # Dummy placeholders for arbitrary number of 1d inputs and outputs\\n\",\n    \"    inputs = tf.placeholder(tf.float32, shape=(None, 1))\\n\",\n    \"    outputs = tf.placeholder(tf.float32, shape=(None, 1))\\n\",\n    \"\\n\",\n    \"    # Build model\\n\",\n    \"    predictions, end_points = regression_model(inputs)\\n\",\n    \"\\n\",\n    \"    # Print name and shape of each tensor.\\n\",\n    \"    print \\\"Layers\\\"\\n\",\n    \"    for k, v in end_points.iteritems():\\n\",\n    \"        print 'name = {}, shape = {}'.format(v.name, v.get_shape())\\n\",\n    \"\\n\",\n    \"    # Print name and shape of parameter nodes  (values not yet initialized)\\n\",\n    \"    print \\\"\\\\n\\\"\\n\",\n    \"    print \\\"Parameters\\\"\\n\",\n    \"    for v in slim.get_model_variables():\\n\",\n    \"        print 'name = {}, shape = {}'.format(v.name, v.get_shape())\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Let's create some 1d regression data .\\n\",\n    \"\\n\",\n    \"We will train and test the model on some noisy observations of a nonlinear function.\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"def produce_batch(batch_size, noise=0.3):\\n\",\n    \"    xs = np.random.random(size=[batch_size, 1]) * 10\\n\",\n    \"    ys = np.sin(xs) + 5 + np.random.normal(size=[batch_size, 1], scale=noise)\\n\",\n    \"    return [xs.astype(np.float32), ys.astype(np.float32)]\\n\",\n    \"\\n\",\n    \"x_train, y_train = produce_batch(200)\\n\",\n    \"x_test, y_test = 
produce_batch(200)\\n\",\n    \"plt.scatter(x_train, y_train)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Let's fit the model to the data\\n\",\n    \"\\n\",\n    \"The user has to specify the loss function and the optimizer, and slim does the rest.\\n\",\n    \"In particular,  the slim.learning.train function does the following:\\n\",\n    \"\\n\",\n    \"- For each iteration, evaluate the train_op, which updates the parameters using the optimizer applied to the current minibatch. Also, update the global_step.\\n\",\n    \"- Occasionally store the model checkpoint in the specified directory. This is useful in case your machine crashes  - then you can simply restart from the specified checkpoint.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": true\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"def convert_data_to_tensors(x, y):\\n\",\n    \"    inputs = tf.constant(x)\\n\",\n    \"    inputs.set_shape([None, 1])\\n\",\n    \"    \\n\",\n    \"    outputs = tf.constant(y)\\n\",\n    \"    outputs.set_shape([None, 1])\\n\",\n    \"    return inputs, outputs\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# The following snippet trains the regression model using a sum_of_squares loss.\\n\",\n    \"ckpt_dir = '/tmp/regression_model/'\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    tf.logging.set_verbosity(tf.logging.INFO)\\n\",\n    \"    \\n\",\n    \"    inputs, targets = convert_data_to_tensors(x_train, y_train)\\n\",\n    \"\\n\",\n    \"    # Make the model.\\n\",\n    \"    predictions, nodes = regression_model(inputs, is_training=True)\\n\",\n    \"\\n\",\n    \"    # Add the loss function to the graph.\\n\",\n    \"    loss = slim.losses.sum_of_squares(predictions, 
targets)\\n\",\n    \"    \\n\",\n    \"    # The total loss is the user's loss plus any regularization losses.\\n\",\n    \"    total_loss = slim.losses.get_total_loss()\\n\",\n    \"\\n\",\n    \"    # Specify the optimizer and create the train op:\\n\",\n    \"    optimizer = tf.train.AdamOptimizer(learning_rate=0.005)\\n\",\n    \"    train_op = slim.learning.create_train_op(total_loss, optimizer) \\n\",\n    \"\\n\",\n    \"    # Run the training inside a session.\\n\",\n    \"    final_loss = slim.learning.train(\\n\",\n    \"        train_op,\\n\",\n    \"        logdir=ckpt_dir,\\n\",\n    \"        number_of_steps=5000,\\n\",\n    \"        save_summaries_secs=5,\\n\",\n    \"        log_every_n_steps=500)\\n\",\n    \"  \\n\",\n    \"print(\\\"Finished training. Last batch loss:\\\", final_loss)\\n\",\n    \"print(\\\"Checkpoint saved in %s\\\" % ckpt_dir)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Training with multiple loss functions.\\n\",\n    \"\\n\",\n    \"Sometimes we have multiple objectives we want to simultaneously optimize.\\n\",\n    \"In slim, it is easy to add more losses, as we show below. 
(We do not optimize the total loss in this example,\\n\",\n    \"but we show how to compute it.)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"with tf.Graph().as_default():\\n\",\n    \"    inputs, targets = convert_data_to_tensors(x_train, y_train)\\n\",\n    \"    predictions, end_points = regression_model(inputs, is_training=True)\\n\",\n    \"\\n\",\n    \"    # Add multiple loss nodes.\\n\",\n    \"    sum_of_squares_loss = slim.losses.sum_of_squares(predictions, targets)\\n\",\n    \"    absolute_difference_loss = slim.losses.absolute_difference(predictions, targets)\\n\",\n    \"\\n\",\n    \"    # The following two ways to compute the total loss are equivalent\\n\",\n    \"    regularization_loss = tf.add_n(slim.losses.get_regularization_losses())\\n\",\n    \"    total_loss1 = sum_of_squares_loss + absolute_difference_loss + regularization_loss\\n\",\n    \"\\n\",\n    \"    # Regularization Loss is included in the total loss by default.\\n\",\n    \"    # This is good for training, but not for testing.\\n\",\n    \"    total_loss2 = slim.losses.get_total_loss(add_regularization_losses=True)\\n\",\n    \"    \\n\",\n    \"    init_op = tf.initialize_all_variables()\\n\",\n    \"    \\n\",\n    \"    with tf.Session() as sess:\\n\",\n    \"        sess.run(init_op) # Will initialize the parameters with random weights.\\n\",\n    \"        \\n\",\n    \"        total_loss1, total_loss2 = sess.run([total_loss1, total_loss2])\\n\",\n    \"        \\n\",\n    \"        print('Total Loss1: %f' % total_loss1)\\n\",\n    \"        print('Total Loss2: %f' % total_loss2)\\n\",\n    \"\\n\",\n    \"        print('Regularization Losses:')\\n\",\n    \"        for loss in slim.losses.get_regularization_losses():\\n\",\n    \"            print(loss)\\n\",\n    \"\\n\",\n    \"        print('Loss Functions:')\\n\",\n    \"        for loss in 
slim.losses.get_losses():\\n\",\n    \"            print(loss)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Let's load the saved model and use it for prediction.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"with tf.Graph().as_default():\\n\",\n    \"    inputs, targets = convert_data_to_tensors(x_test, y_test)\\n\",\n    \"  \\n\",\n    \"    # Create the model structure. (Parameters will be loaded below.)\\n\",\n    \"    predictions, end_points = regression_model(inputs, is_training=False)\\n\",\n    \"\\n\",\n    \"    # Make a session which restores the old parameters from a checkpoint.\\n\",\n    \"    sv = tf.train.Supervisor(logdir=ckpt_dir)\\n\",\n    \"    with sv.managed_session() as sess:\\n\",\n    \"        inputs, predictions, targets = sess.run([inputs, predictions, targets])\\n\",\n    \"\\n\",\n    \"plt.scatter(inputs, targets, c='r');\\n\",\n    \"plt.scatter(inputs, predictions, c='b');\\n\",\n    \"plt.title('red=true, blue=predicted')\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Let's compute various evaluation metrics on the test set.\\n\",\n    \"\\n\",\n    \"In TF-Slim terminology, losses are optimized, but metrics (which may not be differentiable, e.g., precision and recall) are just measured. As an illustration, the code below computes mean squared error and mean absolute error metrics on the test set.\\n\",\n    \"\\n\",\n    \"Each metric declaration creates several local variables (which must be initialized via tf.initialize_local_variables()) and returns both a value_op and an update_op. When evaluated, the value_op returns the current value of the metric. 
The update_op loads a new batch of data, runs the model, obtains the predictions and accumulates the metric statistics appropriately before returning the current value of the metric. We store these value nodes and update nodes in 2 dictionaries.\\n\",\n    \"\\n\",\n    \"After creating the metric nodes, we can pass them to slim.evaluation.evaluation, which repeatedly evaluates these nodes the specified number of times. (This allows us to compute the evaluation in a streaming fashion across minibatches, which is useful for large datasets.) Finally, we print the final value of each metric.\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"with tf.Graph().as_default():\\n\",\n    \"    inputs, targets = convert_data_to_tensors(x_test, y_test)\\n\",\n    \"    predictions, end_points = regression_model(inputs, is_training=False)\\n\",\n    \"\\n\",\n    \"    # Specify metrics to evaluate:\\n\",\n    \"    names_to_value_nodes, names_to_update_nodes = slim.metrics.aggregate_metric_map({\\n\",\n    \"      'Mean Squared Error': slim.metrics.streaming_mean_squared_error(predictions, targets),\\n\",\n    \"      'Mean Absolute Error': slim.metrics.streaming_mean_absolute_error(predictions, targets)\\n\",\n    \"    })\\n\",\n    \"\\n\",\n    \"    # Make a session which restores the old graph parameters, and then run eval.\\n\",\n    \"    sv = tf.train.Supervisor(logdir=ckpt_dir)\\n\",\n    \"    with sv.managed_session() as sess:\\n\",\n    \"        metric_values = slim.evaluation.evaluation(\\n\",\n    \"            sess,\\n\",\n    \"            num_evals=1, # Single pass over data\\n\",\n    \"            eval_op=names_to_update_nodes.values(),\\n\",\n    \"            final_op=names_to_value_nodes.values())\\n\",\n    \"\\n\",\n    \"    names_to_values = dict(zip(names_to_value_nodes.keys(), metric_values))\\n\",\n    \"    for key, 
value in names_to_values.iteritems():\\n\",\n    \"      print('%s: %f' % (key, value))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Reading Data with TF-Slim\\n\",\n    \"<a id='ReadingTFSlimDatasets'></a>\\n\",\n    \"\\n\",\n    \"Reading data with TF-Slim has two main components: A\\n\",\n    \"[Dataset](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/slim/python/slim/data/dataset.py) and a \\n\",\n    \"[DatasetDataProvider](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/slim/python/slim/data/dataset_data_provider.py). The former is a descriptor of a dataset, while the latter performs the actions necessary for actually reading the data. Let's look at each one in detail:\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"## Dataset\\n\",\n    \"A TF-Slim\\n\",\n    \"[Dataset](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/slim/python/slim/data/dataset.py)\\n\",\n    \"contains descriptive information about a dataset necessary for reading it, such as the list of data files and how to decode them. It also contains metadata including class labels, the size of the train/test splits and descriptions of the tensors that the dataset provides. For example, some datasets contain images with labels. Others augment this data with bounding box annotations, etc. 
The Dataset object allows us to write generic code using the same API, regardless of the data content and encoding type.\\n\",\n    \"\\n\",\n    \"TF-Slim's Dataset works especially well when the data is stored as a (possibly sharded)\\n\",\n    \"[TFRecords file](https://www.tensorflow.org/versions/r0.10/how_tos/reading_data/index.html#file-formats), where each record contains a [tf.train.Example protocol buffer](https://github.com/tensorflow/tensorflow/blob/r0.10/tensorflow/core/example/example.proto).\\n\",\n    \"TF-Slim uses a consistent convention for naming the keys and values inside each Example record. \\n\",\n    \"\\n\",\n    \"## DatasetDataProvider\\n\",\n    \"\\n\",\n    \"A\\n\",\n    \"[DatasetDataProvider](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/slim/python/slim/data/dataset_data_provider.py) is a class which actually reads the data from a dataset. It is highly configurable to read the data in various ways that may have a big impact on the efficiency of your training process. For example, it can be single or multi-threaded. If your data is sharded across many files, it can read each file serially, or from every file simultaneously.\\n\",\n    \"\\n\",\n    \"## Demo: The Flowers Dataset\\n\",\n    \"\\n\",\n    \"For convenience, we've included scripts to convert several common image datasets into TFRecord format and have provided\\n\",\n    \"the Dataset descriptor files necessary for reading them. 
We demonstrate how easy it is to use these datasets via the Flowers dataset below.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Download the Flowers Dataset\\n\",\n    \"<a id='DownloadFlowers'></a>\\n\",\n    \"\\n\",\n    \"We've made available a tarball of the Flowers dataset which has already been converted to TFRecord format.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"import tensorflow as tf\\n\",\n    \"from datasets import dataset_utils\\n\",\n    \"\\n\",\n    \"url = \\\"http://download.tensorflow.org/data/flowers.tar.gz\\\"\\n\",\n    \"flowers_data_dir = '/tmp/flowers'\\n\",\n    \"\\n\",\n    \"if not tf.gfile.Exists(flowers_data_dir):\\n\",\n    \"    tf.gfile.MakeDirs(flowers_data_dir)\\n\",\n    \"\\n\",\n    \"dataset_utils.download_and_uncompress_tarball(url, flowers_data_dir) \"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Display some of the data.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"from datasets import flowers\\n\",\n    \"import tensorflow as tf\\n\",\n    \"\\n\",\n    \"slim = tf.contrib.slim\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default(): \\n\",\n    \"    dataset = flowers.get_split('train', flowers_data_dir)\\n\",\n    \"    data_provider = slim.dataset_data_provider.DatasetDataProvider(\\n\",\n    \"        dataset, common_queue_capacity=32, common_queue_min=1)\\n\",\n    \"    image, label = data_provider.get(['image', 'label'])\\n\",\n    \"    \\n\",\n    \"    with tf.Session() as sess:    \\n\",\n    \"        with slim.queues.QueueRunners(sess):\\n\",\n    \"            for i in xrange(4):\\n\",\n    \"                np_image, np_label = 
sess.run([image, label])\\n\",\n    \"                height, width, _ = np_image.shape\\n\",\n    \"                class_name = name = dataset.labels_to_names[np_label]\\n\",\n    \"                \\n\",\n    \"                plt.figure()\\n\",\n    \"                plt.imshow(np_image)\\n\",\n    \"                plt.title('%s, %d x %d' % (name, height, width))\\n\",\n    \"                plt.axis('off')\\n\",\n    \"                plt.show()\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Convolutional neural nets (CNNs).\\n\",\n    \"<a id='CNN'></a>\\n\",\n    \"\\n\",\n    \"In this section, we show how to train an image classifier using a simple CNN.\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Define the model.\\n\",\n    \"\\n\",\n    \"Below we define a simple CNN. Note that the output layer is a linear function - we will apply the softmax transformation externally to the model, either in the loss function (for training), or in the prediction function (during testing).\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": true\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"def my_cnn(images, num_classes, is_training):  # is_training is not used...\\n\",\n    \"    with slim.arg_scope([slim.max_pool2d], kernel_size=[3, 3], stride=2):\\n\",\n    \"        net = slim.conv2d(images, 64, [5, 5])\\n\",\n    \"        net = slim.max_pool2d(net)\\n\",\n    \"        net = slim.conv2d(net, 64, [5, 5])\\n\",\n    \"        net = slim.max_pool2d(net)\\n\",\n    \"        net = slim.flatten(net)\\n\",\n    \"        net = slim.fully_connected(net, 192)\\n\",\n    \"        net = slim.fully_connected(net, num_classes, activation_fn=None)       \\n\",\n    \"        return net\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Apply 
the model to some randomly generated images.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"import tensorflow as tf\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    # The model can handle any input size because the first layer is convolutional.\\n\",\n    \"    # The size of the model is determined when image_node is first passed into the my_cnn function.\\n\",\n    \"    # Once the variables are initialized, the size of all the weight matrices is fixed.\\n\",\n    \"    # Because of the fully connected layers, this means that all subsequent images must have the same\\n\",\n    \"    # input size as the first image.\\n\",\n    \"    batch_size, height, width, channels = 3, 28, 28, 3\\n\",\n    \"    images = tf.random_uniform([batch_size, height, width, channels], maxval=1)\\n\",\n    \"    \\n\",\n    \"    # Create the model.\\n\",\n    \"    num_classes = 10\\n\",\n    \"    logits = my_cnn(images, num_classes, is_training=True)\\n\",\n    \"    probabilities = tf.nn.softmax(logits)\\n\",\n    \"  \\n\",\n    \"    # Initialize all the variables (including parameters) randomly.\\n\",\n    \"    init_op = tf.initialize_all_variables()\\n\",\n    \"  \\n\",\n    \"    with tf.Session() as sess:\\n\",\n    \"        # Run the init_op, evaluate the model outputs and print the results:\\n\",\n    \"        sess.run(init_op)\\n\",\n    \"        probabilities = sess.run(probabilities)\\n\",\n    \"        \\n\",\n    \"print('Probabilities Shape:')\\n\",\n    \"print(probabilities.shape)  # batch_size x num_classes \\n\",\n    \"\\n\",\n    \"print('\\\\nProbabilities:')\\n\",\n    \"print(probabilities)\\n\",\n    \"\\n\",\n    \"print('\\\\nSumming across all classes (Should equal 1):')\\n\",\n    \"print(np.sum(probabilities, 1)) # Each row sums to 1\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   
\"metadata\": {},\n   \"source\": [\n    \"### Train the model on the Flowers dataset.\\n\",\n    \"\\n\",\n    \"Before starting, make sure you've run the code to <a href=\\\"#DownloadFlowers\\\">Download the Flowers</a> dataset. Now, we'll get a sense of what it looks like to use TF-Slim's training functions found in\\n\",\n    \"[learning.py](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/slim/python/slim/learning.py). First, we'll create a function, `load_batch`, that loads batches of dataset from a dataset. Next, we'll train a model for a single step (just to demonstrate the API), and evaluate the results.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"from preprocessing import inception_preprocessing\\n\",\n    \"import tensorflow as tf\\n\",\n    \"\\n\",\n    \"slim = tf.contrib.slim\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def load_batch(dataset, batch_size=32, height=299, width=299, is_training=False):\\n\",\n    \"    \\\"\\\"\\\"Loads a single batch of data.\\n\",\n    \"    \\n\",\n    \"    Args:\\n\",\n    \"      dataset: The dataset to load.\\n\",\n    \"      batch_size: The number of images in the batch.\\n\",\n    \"      height: The size of each image after preprocessing.\\n\",\n    \"      width: The size of each image after preprocessing.\\n\",\n    \"      is_training: Whether or not we're currently training or evaluating.\\n\",\n    \"    \\n\",\n    \"    Returns:\\n\",\n    \"      images: A Tensor of size [batch_size, height, width, 3], image samples that have been preprocessed.\\n\",\n    \"      images_raw: A Tensor of size [batch_size, height, width, 3], image samples that can be used for visualization.\\n\",\n    \"      labels: A Tensor of size [batch_size], whose values range between 0 and dataset.num_classes.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    data_provider = 
slim.dataset_data_provider.DatasetDataProvider(\\n\",\n    \"        dataset, common_queue_capacity=32,\\n\",\n    \"        common_queue_min=8)\\n\",\n    \"    image_raw, label = data_provider.get(['image', 'label'])\\n\",\n    \"    \\n\",\n    \"    # Preprocess image for usage by Inception.\\n\",\n    \"    image = inception_preprocessing.preprocess_image(image_raw, height, width, is_training=is_training)\\n\",\n    \"    \\n\",\n    \"    # Preprocess the image for display purposes.\\n\",\n    \"    image_raw = tf.expand_dims(image_raw, 0)\\n\",\n    \"    image_raw = tf.image.resize_images(image_raw, [height, width])\\n\",\n    \"    image_raw = tf.squeeze(image_raw)\\n\",\n    \"\\n\",\n    \"    # Batch it up.\\n\",\n    \"    images, images_raw, labels = tf.train.batch(\\n\",\n    \"          [image, image_raw, label],\\n\",\n    \"          batch_size=batch_size,\\n\",\n    \"          num_threads=1,\\n\",\n    \"          capacity=2 * batch_size)\\n\",\n    \"    \\n\",\n    \"    return images, images_raw, labels\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"from datasets import flowers\\n\",\n    \"\\n\",\n    \"# This might take a few minutes.\\n\",\n    \"train_dir = '/tmp/tfslim_model/'\\n\",\n    \"print('Will save model to %s' % train_dir)\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    tf.logging.set_verbosity(tf.logging.INFO)\\n\",\n    \"\\n\",\n    \"    dataset = flowers.get_split('train', flowers_data_dir)\\n\",\n    \"    images, _, labels = load_batch(dataset)\\n\",\n    \"  \\n\",\n    \"    # Create the model:\\n\",\n    \"    logits = my_cnn(images, num_classes=dataset.num_classes, is_training=True)\\n\",\n    \" \\n\",\n    \"    # Specify the loss function:\\n\",\n    \"    one_hot_labels = slim.one_hot_encoding(labels, dataset.num_classes)\\n\",\n    \"    
slim.losses.softmax_cross_entropy(logits, one_hot_labels)\\n\",\n    \"    total_loss = slim.losses.get_total_loss()\\n\",\n    \"\\n\",\n    \"    # Create some summaries to visualize the training process:\\n\",\n    \"    tf.scalar_summary('losses/Total Loss', total_loss)\\n\",\n    \"  \\n\",\n    \"    # Specify the optimizer and create the train op:\\n\",\n    \"    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)\\n\",\n    \"    train_op = slim.learning.create_train_op(total_loss, optimizer)\\n\",\n    \"\\n\",\n    \"    # Run the training:\\n\",\n    \"    final_loss = slim.learning.train(\\n\",\n    \"      train_op,\\n\",\n    \"      logdir=train_dir,\\n\",\n    \"      number_of_steps=1, # For speed, we just do 1 step\\n\",\n    \"      save_summaries_secs=1)\\n\",\n    \"  \\n\",\n    \"    print('Finished training. Final batch loss %f' % final_loss)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Evaluate some metrics.\\n\",\n    \"\\n\",\n    \"As we discussed above, we can compute various metrics besides the loss.\\n\",\n    \"Below we show how to compute prediction accuracy of the trained model, as well as top-5 classification accuracy. 
(The difference between evaluation and evaluation_loop is that the latter writes the results to a log directory, so they can be viewed in tensorboard.)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"from datasets import flowers\\n\",\n    \"\\n\",\n    \"# This might take a few minutes.\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    tf.logging.set_verbosity(tf.logging.DEBUG)\\n\",\n    \"    \\n\",\n    \"    dataset = flowers.get_split('train', flowers_data_dir)\\n\",\n    \"    images, _, labels = load_batch(dataset)\\n\",\n    \"    \\n\",\n    \"    logits = my_cnn(images, num_classes=dataset.num_classes, is_training=False)\\n\",\n    \"    predictions = tf.argmax(logits, 1)\\n\",\n    \"    \\n\",\n    \"    # Define the metrics:\\n\",\n    \"    names_to_values, names_to_updates = slim.metrics.aggregate_metric_map({\\n\",\n    \"        'eval/Accuracy': slim.metrics.streaming_accuracy(predictions, labels),\\n\",\n    \"        'eval/Recall@5': slim.metrics.streaming_recall_at_k(logits, labels, 5),\\n\",\n    \"    })\\n\",\n    \"\\n\",\n    \"    print('Running evaluation Loop...')\\n\",\n    \"    checkpoint_path = tf.train.latest_checkpoint(train_dir)\\n\",\n    \"    metric_values = slim.evaluation.evaluate_once(\\n\",\n    \"        master='',\\n\",\n    \"        checkpoint_path=checkpoint_path,\\n\",\n    \"        logdir=train_dir,\\n\",\n    \"        eval_op=names_to_updates.values(),\\n\",\n    \"        final_op=names_to_values.values())\\n\",\n    \"\\n\",\n    \"    names_to_values = dict(zip(names_to_values.keys(), metric_values))\\n\",\n    \"    for name in names_to_values:\\n\",\n    \"        print('%s: %f' % (name, names_to_values[name]))\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Using pre-trained models\\n\",\n    \"<a 
id='Pretrained'></a>\\n\",\n    \"\\n\",\n    \"Neural nets work best when they have many parameters, making them very flexible function approximators.\\n\",\n    \"However, this  means they must be trained on big datasets. Since this process is slow, we provide various pre-trained models - see the list [here](https://github.com/tensorflow/models/tree/master/slim#pre-trained-models).\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"You can either use these models as-is, or you can perform \\\"surgery\\\" on them, to modify them for some other task. For example, it is common to \\\"chop off\\\" the final pre-softmax layer, and replace it with a new set of weights corresponding to some new set of labels. You can then quickly fine tune the new model on a small new dataset. We illustrate this below, using inception-v1 as the base model. While models like Inception V3 are more powerful, Inception V1 is used for speed purposes.\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Download the Inception V1 checkpoint\\n\",\n    \"\\n\",\n    \"\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"from datasets import dataset_utils\\n\",\n    \"\\n\",\n    \"url = \\\"http://download.tensorflow.org/models/inception_v1_2016_08_28.tar.gz\\\"\\n\",\n    \"checkpoints_dir = '/tmp/checkpoints'\\n\",\n    \"\\n\",\n    \"if not tf.gfile.Exists(checkpoints_dir):\\n\",\n    \"    tf.gfile.MakeDirs(checkpoints_dir)\\n\",\n    \"\\n\",\n    \"dataset_utils.download_and_uncompress_tarball(url, checkpoints_dir)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"\\n\",\n    \"### Apply Pre-trained model to Images.\\n\",\n    \"\\n\",\n    \"We have to convert each image to the size expected by the model checkpoint.\\n\",\n    \"There is no easy way to determine this 
size from the checkpoint itself.\\n\",\n    \"So we use a preprocessor to enforce this.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"import numpy as np\\n\",\n    \"import os\\n\",\n    \"import tensorflow as tf\\n\",\n    \"import urllib2\\n\",\n    \"\\n\",\n    \"from datasets import imagenet\\n\",\n    \"from nets import inception\\n\",\n    \"from preprocessing import inception_preprocessing\\n\",\n    \"\\n\",\n    \"slim = tf.contrib.slim\\n\",\n    \"\\n\",\n    \"batch_size = 3\\n\",\n    \"image_size = inception.inception_v1.default_image_size\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    url = 'https://upload.wikimedia.org/wikipedia/commons/7/70/EnglishCockerSpaniel_simon.jpg'\\n\",\n    \"    image_string = urllib2.urlopen(url).read()\\n\",\n    \"    image = tf.image.decode_jpeg(image_string, channels=3)\\n\",\n    \"    processed_image = inception_preprocessing.preprocess_image(image, image_size, image_size, is_training=False)\\n\",\n    \"    processed_images  = tf.expand_dims(processed_image, 0)\\n\",\n    \"    \\n\",\n    \"    # Create the model, use the default arg scope to configure the batch norm parameters.\\n\",\n    \"    with slim.arg_scope(inception.inception_v1_arg_scope()):\\n\",\n    \"        logits, _ = inception.inception_v1(processed_images, num_classes=1001, is_training=False)\\n\",\n    \"    probabilities = tf.nn.softmax(logits)\\n\",\n    \"    \\n\",\n    \"    init_fn = slim.assign_from_checkpoint_fn(\\n\",\n    \"        os.path.join(checkpoints_dir, 'inception_v1.ckpt'),\\n\",\n    \"        slim.get_model_variables('InceptionV1'))\\n\",\n    \"    \\n\",\n    \"    with tf.Session() as sess:\\n\",\n    \"        init_fn(sess)\\n\",\n    \"        np_image, probabilities = sess.run([image, probabilities])\\n\",\n    \"        probabilities = probabilities[0, 
0:]\\n\",\n    \"        sorted_inds = [i[0] for i in sorted(enumerate(-probabilities), key=lambda x:x[1])]\\n\",\n    \"        \\n\",\n    \"    plt.figure()\\n\",\n    \"    plt.imshow(np_image.astype(np.uint8))\\n\",\n    \"    plt.axis('off')\\n\",\n    \"    plt.show()\\n\",\n    \"\\n\",\n    \"    names = imagenet.create_readable_names_for_imagenet_labels()\\n\",\n    \"    for i in range(5):\\n\",\n    \"        index = sorted_inds[i]\\n\",\n    \"        print('Probability %0.2f%% => [%s]' % (probabilities[index], names[index]))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Fine-tune the model on a different set of labels.\\n\",\n    \"\\n\",\n    \"We will fine tune the inception model on the Flowers dataset.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# Note that this may take several minutes.\\n\",\n    \"\\n\",\n    \"import os\\n\",\n    \"\\n\",\n    \"from datasets import flowers\\n\",\n    \"from nets import inception\\n\",\n    \"from preprocessing import inception_preprocessing\\n\",\n    \"\\n\",\n    \"slim = tf.contrib.slim\\n\",\n    \"image_size = inception.inception_v1.default_image_size\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def get_init_fn():\\n\",\n    \"    \\\"\\\"\\\"Returns a function run by the chief worker to warm-start the training.\\\"\\\"\\\"\\n\",\n    \"    checkpoint_exclude_scopes=[\\\"InceptionV1/Logits\\\", \\\"InceptionV1/AuxLogits\\\"]\\n\",\n    \"    \\n\",\n    \"    exclusions = [scope.strip() for scope in checkpoint_exclude_scopes]\\n\",\n    \"\\n\",\n    \"    variables_to_restore = []\\n\",\n    \"    for var in slim.get_model_variables():\\n\",\n    \"        excluded = False\\n\",\n    \"        for exclusion in exclusions:\\n\",\n    \"            if var.op.name.startswith(exclusion):\\n\",\n    \"                
excluded = True\\n\",\n    \"                break\\n\",\n    \"        if not excluded:\\n\",\n    \"            variables_to_restore.append(var)\\n\",\n    \"\\n\",\n    \"    return slim.assign_from_checkpoint_fn(\\n\",\n    \"      os.path.join(checkpoints_dir, 'inception_v1.ckpt'),\\n\",\n    \"      variables_to_restore)\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"train_dir = '/tmp/inception_finetuned/'\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    tf.logging.set_verbosity(tf.logging.INFO)\\n\",\n    \"    \\n\",\n    \"    dataset = flowers.get_split('train', flowers_data_dir)\\n\",\n    \"    images, _, labels = load_batch(dataset, height=image_size, width=image_size)\\n\",\n    \"    \\n\",\n    \"    # Create the model, use the default arg scope to configure the batch norm parameters.\\n\",\n    \"    with slim.arg_scope(inception.inception_v1_arg_scope()):\\n\",\n    \"        logits, _ = inception.inception_v1(images, num_classes=dataset.num_classes, is_training=True)\\n\",\n    \"        \\n\",\n    \"    # Specify the loss function:\\n\",\n    \"    one_hot_labels = slim.one_hot_encoding(labels, dataset.num_classes)\\n\",\n    \"    slim.losses.softmax_cross_entropy(logits, one_hot_labels)\\n\",\n    \"    total_loss = slim.losses.get_total_loss()\\n\",\n    \"\\n\",\n    \"    # Create some summaries to visualize the training process:\\n\",\n    \"    tf.scalar_summary('losses/Total Loss', total_loss)\\n\",\n    \"  \\n\",\n    \"    # Specify the optimizer and create the train op:\\n\",\n    \"    optimizer = tf.train.AdamOptimizer(learning_rate=0.01)\\n\",\n    \"    train_op = slim.learning.create_train_op(total_loss, optimizer)\\n\",\n    \"    \\n\",\n    \"    # Run the training:\\n\",\n    \"    final_loss = slim.learning.train(\\n\",\n    \"        train_op,\\n\",\n    \"        logdir=train_dir,\\n\",\n    \"        init_fn=get_init_fn(),\\n\",\n    \"        number_of_steps=2)\\n\",\n    \"        \\n\",\n    \"  
\\n\",\n    \"print('Finished training. Last batch loss %f' % final_loss)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Apply fine tuned model to some images.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {\n    \"collapsed\": false\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"import numpy as np\\n\",\n    \"import tensorflow as tf\\n\",\n    \"from datasets import flowers\\n\",\n    \"from nets import inception\\n\",\n    \"\\n\",\n    \"slim = tf.contrib.slim\\n\",\n    \"\\n\",\n    \"image_size = inception.inception_v1.default_image_size\\n\",\n    \"batch_size = 3\\n\",\n    \"\\n\",\n    \"with tf.Graph().as_default():\\n\",\n    \"    tf.logging.set_verbosity(tf.logging.INFO)\\n\",\n    \"    \\n\",\n    \"    dataset = flowers.get_split('train', flowers_data_dir)\\n\",\n    \"    images, images_raw, labels = load_batch(dataset, height=image_size, width=image_size)\\n\",\n    \"    \\n\",\n    \"    # Create the model, use the default arg scope to configure the batch norm parameters.\\n\",\n    \"    with slim.arg_scope(inception.inception_v1_arg_scope()):\\n\",\n    \"        logits, _ = inception.inception_v1(images, num_classes=dataset.num_classes, is_training=True)\\n\",\n    \"\\n\",\n    \"    probabilities = tf.nn.softmax(logits)\\n\",\n    \"    \\n\",\n    \"    checkpoint_path = tf.train.latest_checkpoint(train_dir)\\n\",\n    \"    init_fn = slim.assign_from_checkpoint_fn(\\n\",\n    \"      checkpoint_path,\\n\",\n    \"      slim.get_variables_to_restore())\\n\",\n    \"    \\n\",\n    \"    with tf.Session() as sess:\\n\",\n    \"        with slim.queues.QueueRunners(sess):\\n\",\n    \"            sess.run(tf.initialize_local_variables())\\n\",\n    \"            init_fn(sess)\\n\",\n    \"            np_probabilities, np_images_raw, np_labels = sess.run([probabilities, images_raw, labels])\\n\",\n    \"    \\n\",\n    \"         
   for i in xrange(batch_size): \\n\",\n    \"                image = np_images_raw[i, :, :, :]\\n\",\n    \"                true_label = np_labels[i]\\n\",\n    \"                predicted_label = np.argmax(np_probabilities[i, :])\\n\",\n    \"                predicted_name = dataset.labels_to_names[predicted_label]\\n\",\n    \"                true_name = dataset.labels_to_names[true_label]\\n\",\n    \"                \\n\",\n    \"                plt.figure()\\n\",\n    \"                plt.imshow(image.astype(np.uint8))\\n\",\n    \"                plt.title('Ground Truth: [%s], Prediction [%s]' % (true_name, predicted_name))\\n\",\n    \"                plt.axis('off')\\n\",\n    \"                plt.show()\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 2\",\n   \"language\": \"python\",\n   \"name\": \"python2\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 2\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython2\",\n   \"version\": \"2.7.6\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 0\n}\n"
  },
  {
    "path": "model_zoo/models/slim/train_image_classifier.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Generic training script that trains a model using a given dataset.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport tensorflow as tf\n\nfrom tensorflow.python.ops import control_flow_ops\nfrom datasets import dataset_factory\nfrom deployment import model_deploy\nfrom nets import nets_factory\nfrom preprocessing import preprocessing_factory\n\nslim = tf.contrib.slim\n\ntf.app.flags.DEFINE_string(\n    'master', '', 'The address of the TensorFlow master to use.')\n\ntf.app.flags.DEFINE_string(\n    'train_dir', '/tmp/tfmodel/',\n    'Directory where checkpoints and event logs are written to.')\n\ntf.app.flags.DEFINE_integer('num_clones', 1,\n                            'Number of model clones to deploy.')\n\ntf.app.flags.DEFINE_boolean('clone_on_cpu', False,\n                            'Use CPUs to deploy clones.')\n\ntf.app.flags.DEFINE_integer('worker_replicas', 1, 'Number of worker replicas.')\n\ntf.app.flags.DEFINE_integer(\n    'num_ps_tasks', 0,\n    'The number of parameter servers. 
If the value is 0, then the parameters '\n    'are handled locally by the worker.')\n\ntf.app.flags.DEFINE_integer(\n    'num_readers', 4,\n    'The number of parallel readers that read data from the dataset.')\n\ntf.app.flags.DEFINE_integer(\n    'num_preprocessing_threads', 4,\n    'The number of threads used to create the batches.')\n\ntf.app.flags.DEFINE_integer(\n    'log_every_n_steps', 10,\n    'The frequency with which logs are printed.')\n\ntf.app.flags.DEFINE_integer(\n    'save_summaries_secs', 600,\n    'The frequency with which summaries are saved, in seconds.')\n\ntf.app.flags.DEFINE_integer(\n    'save_interval_secs', 600,\n    'The frequency with which the model is saved, in seconds.')\n\ntf.app.flags.DEFINE_integer(\n    'task', 0, 'Task id of the replica running the training.')\n\n######################\n# Optimization Flags #\n######################\n\ntf.app.flags.DEFINE_float(\n    'weight_decay', 0.00004, 'The weight decay on the model weights.')\n\ntf.app.flags.DEFINE_string(\n    'optimizer', 'rmsprop',\n    'The name of the optimizer, one of \"adadelta\", \"adagrad\", \"adam\", '\n    '\"ftrl\", \"momentum\", \"sgd\" or \"rmsprop\".')\n\ntf.app.flags.DEFINE_float(\n    'adadelta_rho', 0.95,\n    'The decay rate for adadelta.')\n\ntf.app.flags.DEFINE_float(\n    'adagrad_initial_accumulator_value', 0.1,\n    'Starting value for the AdaGrad accumulators.')\n\ntf.app.flags.DEFINE_float(\n    'adam_beta1', 0.9,\n    'The exponential decay rate for the 1st moment estimates.')\n\ntf.app.flags.DEFINE_float(\n    'adam_beta2', 0.999,\n    'The exponential decay rate for the 2nd moment estimates.')\n\ntf.app.flags.DEFINE_float('opt_epsilon', 1.0, 'Epsilon term for the optimizer.')\n\ntf.app.flags.DEFINE_float('ftrl_learning_rate_power', -0.5,\n                          'The learning rate power.')\n\ntf.app.flags.DEFINE_float(\n    'ftrl_initial_accumulator_value', 0.1,\n    'Starting value for the FTRL accumulators.')\n\ntf.app.flags.DEFINE_float(\n    
'ftrl_l1', 0.0, 'The FTRL l1 regularization strength.')\n\ntf.app.flags.DEFINE_float(\n    'ftrl_l2', 0.0, 'The FTRL l2 regularization strength.')\n\ntf.app.flags.DEFINE_float(\n    'momentum', 0.9,\n    'The momentum for the MomentumOptimizer and RMSPropOptimizer.')\n\ntf.app.flags.DEFINE_float('rmsprop_momentum', 0.9, 'Momentum.')\n\ntf.app.flags.DEFINE_float('rmsprop_decay', 0.9, 'Decay term for RMSProp.')\n\n#######################\n# Learning Rate Flags #\n#######################\n\ntf.app.flags.DEFINE_string(\n    'learning_rate_decay_type',\n    'exponential',\n    'Specifies how the learning rate is decayed. One of \"fixed\", \"exponential\",'\n    ' or \"polynomial\"')\n\ntf.app.flags.DEFINE_float('learning_rate', 0.01, 'Initial learning rate.')\n\ntf.app.flags.DEFINE_float(\n    'end_learning_rate', 0.0001,\n    'The minimal end learning rate used by a polynomial decay learning rate.')\n\ntf.app.flags.DEFINE_float(\n    'label_smoothing', 0.0, 'The amount of label smoothing.')\n\ntf.app.flags.DEFINE_float(\n    'learning_rate_decay_factor', 0.94, 'Learning rate decay factor.')\n\ntf.app.flags.DEFINE_float(\n    'num_epochs_per_decay', 2.0,\n    'Number of epochs after which learning rate decays.')\n\ntf.app.flags.DEFINE_bool(\n    'sync_replicas', False,\n    'Whether or not to synchronize the replicas during training.')\n\ntf.app.flags.DEFINE_integer(\n    'replicas_to_aggregate', 1,\n    'The number of gradients to collect before updating params.')\n\ntf.app.flags.DEFINE_float(\n    'moving_average_decay', None,\n    'The decay to use for the moving average. '\n    'If left as None, then moving averages are not used.')\n\n#######################\n# Dataset Flags #\n#######################\n\ntf.app.flags.DEFINE_string(\n    'dataset_name', 'imagenet', 'The name of the dataset to load.')\n\ntf.app.flags.DEFINE_string(\n    'dataset_split_name', 'train', 'The name of the train/test split.')\n\ntf.app.flags.DEFINE_string(\n    'dataset_dir', None, 'The 
directory where the dataset files are stored.')\n\ntf.app.flags.DEFINE_integer(\n    'labels_offset', 0,\n    'An offset for the labels in the dataset. This flag is primarily used to '\n    'evaluate the VGG and ResNet architectures which do not use a background '\n    'class for the ImageNet dataset.')\n\ntf.app.flags.DEFINE_string(\n    'model_name', 'inception_v3', 'The name of the architecture to train.')\n\ntf.app.flags.DEFINE_string(\n    'preprocessing_name', None, 'The name of the preprocessing to use. If left '\n    'as `None`, then the model_name flag is used.')\n\ntf.app.flags.DEFINE_integer(\n    'batch_size', 32, 'The number of samples in each batch.')\n\ntf.app.flags.DEFINE_integer(\n    'train_image_size', None, 'Train image size')\n\ntf.app.flags.DEFINE_integer('max_number_of_steps', None,\n                            'The maximum number of training steps.')\n\n#####################\n# Fine-Tuning Flags #\n#####################\n\ntf.app.flags.DEFINE_string(\n    'checkpoint_path', None,\n    'The path to a checkpoint from which to fine-tune.')\n\ntf.app.flags.DEFINE_string(\n    'checkpoint_exclude_scopes', None,\n    'Comma-separated list of scopes of variables to exclude when restoring '\n    'from a checkpoint.')\n\ntf.app.flags.DEFINE_string(\n    'trainable_scopes', None,\n    'Comma-separated list of scopes to filter the set of variables to train. '\n    'By default, None would train all the variables.')\n\ntf.app.flags.DEFINE_boolean(\n    'ignore_missing_vars', False,\n    'When restoring a checkpoint, ignore missing variables.')\n\nFLAGS = tf.app.flags.FLAGS\n\n\ndef _configure_learning_rate(num_samples_per_epoch, global_step):\n  \"\"\"Configures the learning rate.\n\n  Args:\n    num_samples_per_epoch: The number of samples in each epoch of training.\n    global_step: The global_step tensor.\n\n  Returns:\n    A `Tensor` representing the learning rate.\n\n  Raises:\n    ValueError: if FLAGS.learning_rate_decay_type is not recognized.\n  \"\"\"\n  decay_steps = 
int(num_samples_per_epoch / FLAGS.batch_size *\n                    FLAGS.num_epochs_per_decay)\n  if FLAGS.sync_replicas:\n    decay_steps /= FLAGS.replicas_to_aggregate\n\n  if FLAGS.learning_rate_decay_type == 'exponential':\n    return tf.train.exponential_decay(FLAGS.learning_rate,\n                                      global_step,\n                                      decay_steps,\n                                      FLAGS.learning_rate_decay_factor,\n                                      staircase=True,\n                                      name='exponential_decay_learning_rate')\n  elif FLAGS.learning_rate_decay_type == 'fixed':\n    return tf.constant(FLAGS.learning_rate, name='fixed_learning_rate')\n  elif FLAGS.learning_rate_decay_type == 'polynomial':\n    return tf.train.polynomial_decay(FLAGS.learning_rate,\n                                     global_step,\n                                     decay_steps,\n                                     FLAGS.end_learning_rate,\n                                     power=1.0,\n                                     cycle=False,\n                                     name='polynomial_decay_learning_rate')\n  else:\n    raise ValueError('learning_rate_decay_type [%s] was not recognized',\n                     FLAGS.learning_rate_decay_type)\n\n\ndef _configure_optimizer(learning_rate):\n  \"\"\"Configures the optimizer used for training.\n\n  Args:\n    learning_rate: A scalar or `Tensor` learning rate.\n\n  Returns:\n    An instance of an optimizer.\n\n  Raises:\n    ValueError: if FLAGS.optimizer is not recognized.\n  \"\"\"\n  if FLAGS.optimizer == 'adadelta':\n    optimizer = tf.train.AdadeltaOptimizer(\n        learning_rate,\n        rho=FLAGS.adadelta_rho,\n        epsilon=FLAGS.opt_epsilon)\n  elif FLAGS.optimizer == 'adagrad':\n    optimizer = tf.train.AdagradOptimizer(\n        learning_rate,\n        initial_accumulator_value=FLAGS.adagrad_initial_accumulator_value)\n  elif FLAGS.optimizer == 
'adam':\n    optimizer = tf.train.AdamOptimizer(\n        learning_rate,\n        beta1=FLAGS.adam_beta1,\n        beta2=FLAGS.adam_beta2,\n        epsilon=FLAGS.opt_epsilon)\n  elif FLAGS.optimizer == 'ftrl':\n    optimizer = tf.train.FtrlOptimizer(\n        learning_rate,\n        learning_rate_power=FLAGS.ftrl_learning_rate_power,\n        initial_accumulator_value=FLAGS.ftrl_initial_accumulator_value,\n        l1_regularization_strength=FLAGS.ftrl_l1,\n        l2_regularization_strength=FLAGS.ftrl_l2)\n  elif FLAGS.optimizer == 'momentum':\n    optimizer = tf.train.MomentumOptimizer(\n        learning_rate,\n        momentum=FLAGS.momentum,\n        name='Momentum')\n  elif FLAGS.optimizer == 'rmsprop':\n    optimizer = tf.train.RMSPropOptimizer(\n        learning_rate,\n        decay=FLAGS.rmsprop_decay,\n        momentum=FLAGS.rmsprop_momentum,\n        epsilon=FLAGS.opt_epsilon)\n  elif FLAGS.optimizer == 'sgd':\n    optimizer = tf.train.GradientDescentOptimizer(learning_rate)\n  else:\n    raise ValueError('Optimizer [%s] was not recognized', FLAGS.optimizer)\n  return optimizer\n\n\ndef _add_variables_summaries(learning_rate):\n  summaries = []\n  for variable in slim.get_model_variables():\n    summaries.append(tf.histogram_summary(variable.op.name, variable))\n  summaries.append(tf.scalar_summary('training/Learning Rate', learning_rate))\n  return summaries\n\n\ndef _get_init_fn():\n  \"\"\"Returns a function run by the chief worker to warm-start the training.\n\n  Note that the init_fn is only run when initializing the model during the very\n  first global step.\n\n  Returns:\n    An init function run by the supervisor.\n  \"\"\"\n  if FLAGS.checkpoint_path is None:\n    return None\n\n  # Warn the user if a checkpoint exists in the train_dir. 
Then we'll be\n  # ignoring the checkpoint anyway.\n  if tf.train.latest_checkpoint(FLAGS.train_dir):\n    tf.logging.info(\n        'Ignoring --checkpoint_path because a checkpoint already exists in %s'\n        % FLAGS.train_dir)\n    return None\n\n  exclusions = []\n  if FLAGS.checkpoint_exclude_scopes:\n    exclusions = [scope.strip()\n                  for scope in FLAGS.checkpoint_exclude_scopes.split(',')]\n\n  # TODO(sguada) variables.filter_variables()\n  variables_to_restore = []\n  for var in slim.get_model_variables():\n    excluded = False\n    for exclusion in exclusions:\n      if var.op.name.startswith(exclusion):\n        excluded = True\n        break\n    if not excluded:\n      variables_to_restore.append(var)\n\n  if tf.gfile.IsDirectory(FLAGS.checkpoint_path):\n    checkpoint_path = tf.train.latest_checkpoint(FLAGS.checkpoint_path)\n  else:\n    checkpoint_path = FLAGS.checkpoint_path\n\n  tf.logging.info('Fine-tuning from %s' % checkpoint_path)\n\n  return slim.assign_from_checkpoint_fn(\n      checkpoint_path,\n      variables_to_restore,\n      ignore_missing_vars=FLAGS.ignore_missing_vars)\n\n\ndef _get_variables_to_train():\n  \"\"\"Returns a list of variables to train.\n\n  Returns:\n    A list of variables to train by the optimizer.\n  \"\"\"\n  if FLAGS.trainable_scopes is None:\n    return tf.trainable_variables()\n  else:\n    scopes = [scope.strip() for scope in FLAGS.trainable_scopes.split(',')]\n\n  variables_to_train = []\n  for scope in scopes:\n    variables = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, scope)\n    variables_to_train.extend(variables)\n  return variables_to_train\n\n\ndef main(_):\n  if not FLAGS.dataset_dir:\n    raise ValueError('You must supply the dataset directory with --dataset_dir')\n\n  tf.logging.set_verbosity(tf.logging.INFO)\n  with tf.Graph().as_default():\n    ######################\n    # Config model_deploy#\n    ######################\n    deploy_config = 
model_deploy.DeploymentConfig(\n        num_clones=FLAGS.num_clones,\n        clone_on_cpu=FLAGS.clone_on_cpu,\n        replica_id=FLAGS.task,\n        num_replicas=FLAGS.worker_replicas,\n        num_ps_tasks=FLAGS.num_ps_tasks)\n\n    # Create global_step\n    with tf.device(deploy_config.variables_device()):\n      global_step = slim.create_global_step()\n\n    ######################\n    # Select the dataset #\n    ######################\n    dataset = dataset_factory.get_dataset(\n        FLAGS.dataset_name, FLAGS.dataset_split_name, FLAGS.dataset_dir)\n\n    ####################\n    # Select the network #\n    ####################\n    network_fn = nets_factory.get_network_fn(\n        FLAGS.model_name,\n        num_classes=(dataset.num_classes - FLAGS.labels_offset),\n        weight_decay=FLAGS.weight_decay,\n        is_training=True)\n\n    #####################################\n    # Select the preprocessing function #\n    #####################################\n    preprocessing_name = FLAGS.preprocessing_name or FLAGS.model_name\n    image_preprocessing_fn = preprocessing_factory.get_preprocessing(\n        preprocessing_name,\n        is_training=True)\n\n    ##############################################################\n    # Create a dataset provider that loads data from the dataset #\n    ##############################################################\n    with tf.device(deploy_config.inputs_device()):\n      provider = slim.dataset_data_provider.DatasetDataProvider(\n          dataset,\n          num_readers=FLAGS.num_readers,\n          common_queue_capacity=20 * FLAGS.batch_size,\n          common_queue_min=10 * FLAGS.batch_size)\n      [image, label] = provider.get(['image', 'label'])\n      label -= FLAGS.labels_offset\n\n      train_image_size = FLAGS.train_image_size or network_fn.default_image_size\n\n      image = image_preprocessing_fn(image, train_image_size, train_image_size)\n\n      images, labels = tf.train.batch(\n          [image, 
label],\n          batch_size=FLAGS.batch_size,\n          num_threads=FLAGS.num_preprocessing_threads,\n          capacity=5 * FLAGS.batch_size)\n      labels = slim.one_hot_encoding(\n          labels, dataset.num_classes - FLAGS.labels_offset)\n      batch_queue = slim.prefetch_queue.prefetch_queue(\n          [images, labels], capacity=2 * deploy_config.num_clones)\n\n    ####################\n    # Define the model #\n    ####################\n    def clone_fn(batch_queue):\n      \"\"\"Allows data parallelism by creating multiple clones of network_fn.\"\"\"\n      images, labels = batch_queue.dequeue()\n      logits, end_points = network_fn(images)\n\n      #############################\n      # Specify the loss function #\n      #############################\n      if 'AuxLogits' in end_points:\n        slim.losses.softmax_cross_entropy(\n            end_points['AuxLogits'], labels,\n            label_smoothing=FLAGS.label_smoothing, weight=0.4, scope='aux_loss')\n      slim.losses.softmax_cross_entropy(\n          logits, labels, label_smoothing=FLAGS.label_smoothing, weight=1.0)\n      return end_points\n\n    # Gather initial summaries.\n    summaries = set(tf.get_collection(tf.GraphKeys.SUMMARIES))\n\n    clones = model_deploy.create_clones(deploy_config, clone_fn, [batch_queue])\n    first_clone_scope = deploy_config.clone_scope(0)\n    # Gather update_ops from the first clone. 
These contain, for example,\n    # the updates for the batch_norm variables created by network_fn.\n    update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS, first_clone_scope)\n\n    # Add summaries for end_points.\n    end_points = clones[0].outputs\n    for end_point in end_points:\n      x = end_points[end_point]\n      summaries.add(tf.histogram_summary('activations/' + end_point, x))\n      summaries.add(tf.scalar_summary('sparsity/' + end_point,\n                                      tf.nn.zero_fraction(x)))\n\n    # Add summaries for losses.\n    for loss in tf.get_collection(tf.GraphKeys.LOSSES, first_clone_scope):\n      summaries.add(tf.scalar_summary('losses/%s' % loss.op.name, loss))\n\n    # Add summaries for variables.\n    for variable in slim.get_model_variables():\n      summaries.add(tf.histogram_summary(variable.op.name, variable))\n\n    #################################\n    # Configure the moving averages #\n    #################################\n    if FLAGS.moving_average_decay:\n      moving_average_variables = slim.get_model_variables()\n      variable_averages = tf.train.ExponentialMovingAverage(\n          FLAGS.moving_average_decay, global_step)\n    else:\n      moving_average_variables, variable_averages = None, None\n\n    #########################################\n    # Configure the optimization procedure. 
#\n    #########################################\n    with tf.device(deploy_config.optimizer_device()):\n      learning_rate = _configure_learning_rate(dataset.num_samples, global_step)\n      optimizer = _configure_optimizer(learning_rate)\n      summaries.add(tf.scalar_summary('learning_rate', learning_rate,\n                                      name='learning_rate'))\n\n    if FLAGS.sync_replicas:\n      # If sync_replicas is enabled, the averaging will be done in the chief\n      # queue runner.\n      optimizer = tf.train.SyncReplicasOptimizer(\n          opt=optimizer,\n          replicas_to_aggregate=FLAGS.replicas_to_aggregate,\n          variable_averages=variable_averages,\n          variables_to_average=moving_average_variables,\n          replica_id=tf.constant(FLAGS.task, tf.int32, shape=()),\n          total_num_replicas=FLAGS.worker_replicas)\n    elif FLAGS.moving_average_decay:\n      # Update ops executed locally by trainer.\n      update_ops.append(variable_averages.apply(moving_average_variables))\n\n    # Variables to train.\n    variables_to_train = _get_variables_to_train()\n\n    #  and returns a train_tensor and summary_op\n    total_loss, clones_gradients = model_deploy.optimize_clones(\n        clones,\n        optimizer,\n        var_list=variables_to_train)\n    # Add total_loss to summary.\n    summaries.add(tf.scalar_summary('total_loss', total_loss,\n                                    name='total_loss'))\n\n    # Create gradient updates.\n    grad_updates = optimizer.apply_gradients(clones_gradients,\n                                             global_step=global_step)\n    update_ops.append(grad_updates)\n\n    update_op = tf.group(*update_ops)\n    train_tensor = control_flow_ops.with_dependencies([update_op], total_loss,\n                                                      name='train_op')\n\n    # Add the summaries from the first clone. 
These contain the summaries\n    # created by model_fn and either optimize_clones() or _gather_clone_loss().\n    summaries |= set(tf.get_collection(tf.GraphKeys.SUMMARIES,\n                                       first_clone_scope))\n\n    # Merge all summaries together.\n    summary_op = tf.merge_summary(list(summaries), name='summary_op')\n\n\n    ###########################\n    # Kicks off the training. #\n    ###########################\n    slim.learning.train(\n        train_tensor,\n        logdir=FLAGS.train_dir,\n        master=FLAGS.master,\n        is_chief=(FLAGS.task == 0),\n        init_fn=_get_init_fn(),\n        summary_op=summary_op,\n        number_of_steps=FLAGS.max_number_of_steps,\n        log_every_n_steps=FLAGS.log_every_n_steps,\n        save_summaries_secs=FLAGS.save_summaries_secs,\n        save_interval_secs=FLAGS.save_interval_secs,\n        sync_optimizer=optimizer if FLAGS.sync_replicas else None)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/street/README.md",
    "content": "# StreetView Tensorflow Recurrent End-to-End Transcription (STREET) Model.\n\nA TensorFlow implementation of the STREET model described in the paper:\n\n\"End-to-End Interpretation of the French Street Name Signs Dataset\"\n\nRaymond Smith, Chunhui Gu, Dar-Shyang Lee, Huiyi Hu, Ranjith\nUnnikrishnan, Julian Ibarz, Sacha Arnoud, Sophia Lin.\n\n*International Workshop on Robust Reading, Amsterdam, 9 October 2016.*\n\nAvailable at: http://link.springer.com/chapter/10.1007%2F978-3-319-46604-0_30\n\n\n## Contact\n***Author:*** Ray Smith (rays@google.com).\n\n***Pull requests and issues:*** @theraysmith.\n\n## Contents\n* [Introduction](#introduction)\n* [Installing and setting up the STREET model](#installing-and-setting-up-the-street-model)\n* [Downloading the datasets](#downloading-the-datasets)\n* [Confidence Tests](#confidence-tests)\n* [Training a model](#training-a-model)\n* [The Variable Graph Specification Language](#the-variable-graph-specification-language)\n\n## Introduction\n\nThe *STREET* model is a deep recurrent neural network that learns how to\nidentify the name of a street (in France) from an image containing upto four\ndifferent views of the street name sign. The model merges information from the\ndifferent views and normalizes the text to the correct format. 
For example:\n\n<center>\n![Example image](g3doc/avdessapins.png)\n\nAvenue des Sapins\n</center>\n\n\n## Installing and setting up the STREET model\n[Install Tensorflow](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/g3doc/get_started/os_setup.md#virtualenv-installation)\n\nInstall numpy:\n\n```\nsudo pip install numpy\n```\n\nBuild the LSTM op:\n\n```\ncd cc\nTF_INC=$(python -c 'import tensorflow as tf; print(tf.sysconfig.get_include())')\ng++ -std=c++11 -shared rnn_ops.cc -o rnn_ops.so -fPIC -I $TF_INC -O3 -mavx\n```\n\nRun the unittests:\n\n```\ncd ../python\npython decoder_test.py\npython errorcounter_test.py\npython shapes_test.py\npython vgslspecs_test.py\npython vgsl_model_test.py\n```\n\n## Downloading the datasets\n\nThe French Street Name Signs (FSNS) dataset is split into subsets, each\nof which is composed of multiple files.\nNote that these datasets are very large. The approximate sizes are:\n\n*   Train: 512 files of 300MB each.\n*   Validation: 64 files of 40MB each.\n*   Test: 64 files of 50MB each.\n*   Testdata: some smaller data files of a few MB for testing.\n\nHere is a list of the download 
paths:\n\n```\nhttps://download.tensorflow.org/data/fsns-20160927/charset_size=134.txt\nhttps://download.tensorflow.org/data/fsns-20160927/test/test-00000-of-00064\n...\nhttps://download.tensorflow.org/data/fsns-20160927/test/test-00063-of-00064\nhttps://download.tensorflow.org/data/fsns-20160927/testdata/arial-32-00000-of-00001\nhttps://download.tensorflow.org/data/fsns-20160927/testdata/fsns-00000-of-00001\nhttps://download.tensorflow.org/data/fsns-20160927/testdata/mnist-sample-00000-of-00001\nhttps://download.tensorflow.org/data/fsns-20160927/testdata/numbers-16-00000-of-00001\nhttps://download.tensorflow.org/data/fsns-20160927/train/train-00000-of-00512\n...\nhttps://download.tensorflow.org/data/fsns-20160927/train/train-00511-of-00512\nhttps://download.tensorflow.org/data/fsns-20160927/validation/validation-00000-of-00064\n...\nhttps://download.tensorflow.org/data/fsns-20160927/validation/validation-00063-of-00064\n```\n\nThe above files need to be downloaded individually, as they are large and\ndownloads are more likely to succeed with the individual files than with a\nsingle archive containing them all.\n\n## Confidence Tests\n\nThe datasets download includes a directory `testdata` that contains some small\ndatasets that are big enough to test that models can actually learn something.\nAssuming that you have put the downloads in directory `data` alongside\n`python` then you can run the following tests:\n\n### Mnist for zero-dimensional data\n\n```\ncd python\ntrain_dir=/tmp/mnist\nrm -rf $train_dir\npython vgsl_train.py --model_str='16,0,0,1[Ct5,5,16 Mp3,3 Lfys32 Lfxs64]O0s12' \\\n  --max_steps=1024 --train_data=../data/testdata/mnist-sample-00000-of-00001 \\\n  --initial_learning_rate=0.001 --final_learning_rate=0.001 \\\n  --num_preprocess_threads=1 --train_dir=$train_dir\npython vgsl_eval.py --model_str='16,0,0,1[Ct5,5,16 Mp3,3 Lfys32 Lfxs64]O0s12' \\\n  --num_steps=256 --eval_data=../data/testdata/mnist-sample-00000-of-00001 \\\n  
--num_preprocess_threads=1 --decoder=../testdata/numbers.charset_size=12.txt \\\n  --eval_interval_secs=0 --train_dir=$train_dir --eval_dir=$train_dir/eval\n```\n\nDepending on your machine, this should run in about 1 minute, and should obtain\nerror rates below 50%. Actual error rates will vary according to random\ninitialization.\n\n### Fixed-length targets for number recognition\n\n```\ncd python\ntrain_dir=/tmp/fixed\nrm -rf $train_dir\npython vgsl_train.py --model_str='8,16,0,1[S1(1x16)1,3 Lfx32 Lrx32 Lfx32]O1s12' \\\n  --max_steps=3072 --train_data=../data/testdata/numbers-16-00000-of-00001 \\\n  --initial_learning_rate=0.001 --final_learning_rate=0.001 \\\n  --num_preprocess_threads=1 --train_dir=$train_dir\npython vgsl_eval.py --model_str='8,16,0,1[S1(1x16)1,3 Lfx32 Lrx32 Lfx32]O1s12' \\\n  --num_steps=256 --eval_data=../data/testdata/numbers-16-00000-of-00001 \\\n  --num_preprocess_threads=1 --decoder=../testdata/numbers.charset_size=12.txt \\\n  --eval_interval_secs=0 --train_dir=$train_dir --eval_dir=$train_dir/eval\n```\n\nDepending on your machine, this should run in about 1-2 minutes, and should\nobtain a label error rate between 50 and 80%, with word error rates probably\nnot coming below 100%. 
Actual error rates will vary\naccording to random initialization.\n\n### OCR-style data with CTC\n\n```\ncd python\ntrain_dir=/tmp/ctc\nrm -rf $train_dir\npython vgsl_train.py --model_str='1,32,0,1[S1(1x32)1,3 Lbx100]O1c105' \\\n  --max_steps=4096 --train_data=../data/testdata/arial-32-00000-of-00001 \\\n  --initial_learning_rate=0.001 --final_learning_rate=0.001 \\\n  --num_preprocess_threads=1 --train_dir=$train_dir &\npython vgsl_eval.py --model_str='1,32,0,1[S1(1x32)1,3 Lbx100]O1c105' \\\n  --num_steps=256 --eval_data=../data/testdata/arial-32-00000-of-00001 \\\n  --num_preprocess_threads=1 --decoder=../testdata/arial.charset_size=105.txt \\\n  --eval_interval_secs=15 --train_dir=$train_dir --eval_dir=$train_dir/eval &\ntensorboard --logdir=$train_dir\n```\n\nDepending on your machine, the background training should run for about 3-4\nminutes, and should obtain a label error rate between 10 and 50%, with\ncorrespondingly higher word error rates and even higher sequence error rate.\nActual error rates will vary according to random initialization.\nThe background eval will run for ever, and will have to be terminated by hand.\nThe tensorboard command will run a visualizer that can be viewed with a\nbrowser. Go to the link that it prints to view tensorboard and see the\ntraining progress. See the [Tensorboard](https://www.tensorflow.org/versions/r0.10/how_tos/summaries_and_tensorboard/index.html)\nintroduction for more information.\n\n\n### Mini FSNS dataset\n\nYou can test the actual STREET model on a small FSNS data set. The model will\noverfit to this small dataset, but will give some confidence that everything\nis working correctly. 
*Note* that this test runs the training and evaluation\nin parallel, which is something that you should do when training any substantial\nsystem, so you can monitor progress.\n\n\n```\ncd python\ntrain_dir=/tmp/fsns\nrm -rf $train_dir\npython vgsl_train.py --max_steps=10000 --num_preprocess_threads=1 \\\n  --train_data=../data/testdata/fsns-00000-of-00001 \\\n  --initial_learning_rate=0.0001 --final_learning_rate=0.0001 \\\n  --train_dir=$train_dir &\npython vgsl_eval.py --num_steps=256 --num_preprocess_threads=1 \\\n   --eval_data=../data/testdata/fsns-00000-of-00001 \\\n   --decoder=../testdata/charset_size=134.txt \\\n   --eval_interval_secs=300 --train_dir=$train_dir --eval_dir=$train_dir/eval &\ntensorboard --logdir=$train_dir\n```\n\nDepending on your machine, the training should finish in about 1-2 *hours*.\nAs with the CTC test set above, the eval and tensorboard will have to be\nterminated manually.\n\n## Training a full FSNS model\n\nAfter running the tests above, you are ready to train the real thing!\n*Note* that you might want to use a train_dir somewhere other than /tmp, as\nyou can stop the training, reboot if needed and continue if you keep the\ndata intact, but /tmp gets deleted on a reboot.\n\n```\ncd python\ntrain_dir=/tmp/fsns\nrm -rf $train_dir\npython vgsl_train.py --max_steps=100000000 --train_data=../data/train/train* \\\n  --train_dir=$train_dir &\npython vgsl_eval.py --num_steps=1000 \\\n  --eval_data=../data/validation/validation* \\\n  --decoder=../testdata/charset_size=134.txt \\\n  --eval_interval_secs=300 --train_dir=$train_dir --eval_dir=$train_dir/eval &\ntensorboard --logdir=$train_dir\n```\n\nTraining will take a very long time (probably many weeks) to reach minimum\nerror rate on a single machine, although it will probably take substantially\nfewer iterations than with parallel training. 
Faster training can be obtained\nwith parallel training on a cluster.\nSince the setup is likely to be very site-specific, please see the TensorFlow\ndocumentation on\n[Distributed TensorFlow](https://www.tensorflow.org/versions/r0.10/how_tos/distributed/index.html)\nfor more information. Some code changes may be needed in the `Train` function\nin `vgsl_model.py`.\n\nWith 40 parallel training workers, nearly optimal error rates (about 25%\nsequence error on the validation set) are obtained in about 30 million steps,\nalthough the error continues to fall slightly over the next 30 million, to\nperhaps as low as 23%.\n\nWith a single machine the number of steps could be substantially lower.\nAlthough untested on this problem, on other problems the ratio is typically\n5 to 1, so low error rates could be obtained as soon as 6 million iterations,\nwhich could be reached in about 4 weeks.\n\n\n## The Variable Graph Specification Language\n\nThe STREET model makes use of a graph specification language (VGSL) that\nenables rapid experimentation with different model architectures. The language\ndefines a TensorFlow graph that can be used to process images of variable sizes\nto output a 1-dimensional sequence, as for a transcription/OCR problem, or a\n0-dimensional label, as for image identification problems. For more information,\nsee [vgslspecs](g3doc/vgslspecs.md).\n\n"
  },
  {
    "path": "model_zoo/models/street/cc/rnn_ops.cc",
    "content": "/* Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// OpKernel of LSTM Neural Networks:\n//\n//   LSTM: VariableLSTMOp (VariableLSTMGradOp)\n//\n// where (.*) are the ops to compute gradients for the corresponding ops.\n\n#define EIGEN_USE_THREADS\n\n#include <vector>\n#ifdef GOOGLE_INCLUDES\n#include \"third_party/eigen3/Eigen/Core\"\n#include \"third_party/tensorflow/core/framework/op.h\"\n#include \"third_party/tensorflow/core/framework/op_kernel.h\"\n#include \"third_party/tensorflow/core/framework/tensor.h\"\n#else\n#include \"Eigen/Core\"\n#include \"tensorflow/core/framework/op.h\"\n#include \"tensorflow/core/framework/op_kernel.h\"\n#include \"tensorflow/core/framework/tensor.h\"\n#endif  // GOOGLE_INCLUDES\n\nnamespace tensorflow {\n\nusing Eigen::array;\nusing Eigen::DenseIndex;\nusing IndexPair = Eigen::IndexPair<int>;\n\nStatus AreDimsEqual(int dim1, int dim2, const string& message) {\n  if (dim1 != dim2) {\n    return errors::InvalidArgument(message, \": \", dim1, \" vs. \", dim2);\n  }\n  return Status::OK();\n}\n\n// ------------------------------- VariableLSTMOp -----------------------------\n\n// Kernel to compute the forward propagation of a Long Short-Term Memory\n// network. 
See the doc of the op below for more detail.\nclass VariableLSTMOp : public OpKernel {\n public:\n  explicit VariableLSTMOp(OpKernelConstruction* ctx) : OpKernel(ctx) {\n    OP_REQUIRES_OK(ctx, ctx->GetAttr(\"clip\", &clip_));\n    OP_REQUIRES(\n        ctx, clip_ >= 0.0,\n        errors::InvalidArgument(\"clip_ needs to be greater than or equal to 0\"));\n  }\n\n  void Compute(OpKernelContext* ctx) override {\n    // Inputs.\n    const auto input = ctx->input(0).tensor<float, 4>();\n    const auto initial_state = ctx->input(1).tensor<float, 2>();\n    const auto initial_memory = ctx->input(2).tensor<float, 2>();\n    const auto w_m_m = ctx->input(3).tensor<float, 3>();\n    const int batch_size = input.dimension(0);\n    const int seq_len = input.dimension(1);\n    const int output_dim = input.dimension(3);\n\n    // Sanity checks.\n    OP_REQUIRES_OK(ctx, AreDimsEqual(4, input.dimension(2), \"Input num\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, initial_state.dimension(0),\n                                     \"State batch\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(output_dim, initial_state.dimension(1), \"State dim\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, initial_memory.dimension(0),\n                                     \"Memory batch\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(output_dim, initial_memory.dimension(1),\n                                     \"Memory dim\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(output_dim, w_m_m.dimension(0), \"Weight dim 0\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(4, w_m_m.dimension(1), \"Weight dim 1\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(output_dim, w_m_m.dimension(2), \"Weight dim 2\"));\n\n    // Outputs.\n    Tensor* act_tensor = nullptr;\n    OP_REQUIRES_OK(ctx, ctx->allocate_output(\n                            0, {batch_size, seq_len, output_dim}, &act_tensor));\n    auto act = act_tensor->tensor<float, 3>();\n    act.setZero();\n\n    Tensor* gate_raw_act_tensor 
= nullptr;\n    OP_REQUIRES_OK(ctx,\n                   ctx->allocate_output(1, {batch_size, seq_len, 4, output_dim},\n                                        &gate_raw_act_tensor));\n    auto gate_raw_act = gate_raw_act_tensor->tensor<float, 4>();\n    gate_raw_act.setZero();\n\n    Tensor* memory_tensor = nullptr;\n    OP_REQUIRES_OK(ctx,\n                   ctx->allocate_output(2, {batch_size, seq_len, output_dim},\n                                        &memory_tensor));\n    auto memory = memory_tensor->tensor<float, 3>();\n    memory.setZero();\n\n    // Const and scratch tensors.\n    Tensor ones_tensor;\n    OP_REQUIRES_OK(ctx, ctx->allocate_temp(DT_FLOAT, {batch_size, output_dim},\n                                           &ones_tensor));\n    auto ones = ones_tensor.tensor<float, 2>();\n    ones.setConstant(1.0);\n\n    Tensor state_tensor;\n    OP_REQUIRES_OK(ctx, ctx->allocate_temp(DT_FLOAT, {batch_size, output_dim},\n                                           &state_tensor));\n    auto state = state_tensor.tensor<float, 2>();\n    state = initial_state;\n\n    Tensor scratch_tensor;\n    OP_REQUIRES_OK(ctx,\n                   ctx->allocate_temp(DT_FLOAT, {batch_size, 4, output_dim},\n                                      &scratch_tensor));\n    auto scratch = scratch_tensor.tensor<float, 3>();\n    scratch.setZero();\n\n    // Uses the most efficient order for the contraction depending on the batch\n    // size.\n\n    // This is the code shared by both cases. Implicit capture with lambda\n    // functions is discouraged, but what is done here should be clear.\n    auto Forward = [&](int i) {\n      // Each pre-activation value is stored in the following order (See the\n      // comment of the op for the meaning):\n      //\n      //   i: 0\n      //   j: 1\n      //   f: 2\n      //   o: 3\n\n      // Adds one to the pre-activation values of the forget gate. 
This is a\n      // heuristic to make the training easier.\n      scratch.chip(2, 1) += ones;\n\n      gate_raw_act.chip(i, 1) = scratch;\n\n      // c_t = f_t * c_{t-1} + i_t * j_t\n      if (i == 0) {\n        state = initial_memory * scratch.chip(2, 1).sigmoid();\n      } else {\n        state = memory.chip(i - 1, 1) * scratch.chip(2, 1).sigmoid();\n      }\n      state += scratch.chip(0, 1).sigmoid() * scratch.chip(1, 1).tanh();\n\n      if (clip_ > 0.0) {\n        // Clips the values if required.\n        state = state.cwiseMax(-clip_).cwiseMin(clip_);\n      }\n\n      memory.chip(i, 1) = state;\n\n      // h_t = o_t * tanh(c_t)\n      state = scratch.chip(3, 1).sigmoid() * state.tanh();\n\n      act.chip(i, 1) = state;\n    };\n    if (batch_size == 1) {\n      // Reshapes the weight tensor to pretend as if it is a matrix\n      // multiplication which is more efficient.\n      auto w_m_m_r =\n          w_m_m.reshape(array<DenseIndex, 2>{output_dim, 4 * output_dim});\n      // Dimensions for the contraction.\n      const array<IndexPair, 1> m_m_dim = {IndexPair(1, 0)};\n      for (int i = 0; i < seq_len; ++i) {\n        // Computes the pre-activation value of the input and each gate.\n        scratch = input.chip(i, 1) +\n                  state.contract(w_m_m_r, m_m_dim)\n                      .reshape(array<DenseIndex, 3>{batch_size, 4, output_dim});\n        Forward(i);\n      }\n    } else {\n      // Shuffles the dimensions of the weight tensor to be efficient when used\n      // in the left-hand side. 
Allocates memory for the shuffled tensor for\n      // efficiency.\n      Tensor w_m_m_s_tensor;\n      OP_REQUIRES_OK(ctx,\n                     ctx->allocate_temp(DT_FLOAT, {output_dim * 4, output_dim},\n                                        &w_m_m_s_tensor));\n      auto w_m_m_s = w_m_m_s_tensor.tensor<float, 2>();\n      w_m_m_s = w_m_m.shuffle(array<int, 3>{2, 1, 0})\n                    .reshape(array<DenseIndex, 2>{output_dim * 4, output_dim});\n      // Dimensions for the contraction.\n      const array<IndexPair, 1> m_m_dim = {IndexPair(1, 1)};\n      for (int i = 0; i < seq_len; ++i) {\n        // Computes the pre-activation value of the input and each gate.\n        scratch = input.chip(i, 1) +\n                  w_m_m_s.contract(state, m_m_dim)\n                      .reshape(array<DenseIndex, 3>{output_dim, 4, batch_size})\n                      .shuffle(array<int, 3>{2, 1, 0});\n        Forward(i);\n      }\n    }\n  }\n\n private:\n  // Threshold to clip the values of memory cells.\n  float clip_ = 0;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"VariableLSTM\").Device(DEVICE_CPU),\n                        VariableLSTMOp);\nREGISTER_OP(\"VariableLSTM\")\n    .Attr(\"clip: float = 0.0\")\n    .Input(\"input: float32\")\n    .Input(\"initial_state: float32\")\n    .Input(\"initial_memory: float32\")\n    .Input(\"w_m_m: float32\")\n    .Output(\"activation: float32\")\n    .Output(\"gate_raw_act: float32\")\n    .Output(\"memory: float32\")\n    .Doc(R\"doc(\nComputes the forward propagation of a Long Short-Term Memory Network.\n\nIt computes the following equation recursively for `0<t<=T`:\n\n  i_t  = sigmoid(a_{i,t})\n  j_t  = tanh(a_{j,t})\n  f_t  = sigmoid(a_{f,t} + 1.0)\n  o_t  = sigmoid(a_{o,t})\n  c_t  = f_t * c_{t-1} + i_t * j_t\n  c'_t = min(max(c_t, -clip), clip) if clip > 0 else c_t\n  h_t  = o_t * tanh(c'_t)\n\nwhere\n\n  a_{l,t} = w_{l,m,m} * h_{t-1} + x'_{l,t}\n\nwhere\n\n  x'_{l,t} = w_{l,m,i} * x_{t}.\n\n`input` corresponds to the 
concatenation of `X'_i`, `X'_j`, `X'_f`, and `X'_o`\nwhere `X'_l = (x'_{l,1}, x'_{l,2}, ..., x'_{l,T})`, `initial_state` corresponds\nto `h_{0}`, `initial_memory` corresponds to `c_{0}` and `w_m_m` corresponds to\n`w_{l,m,m}`. `X'_l` (the transformed input) is computed outside of the op in\nadvance, so w_{l,m,i} is not passed in to the op.\n\n`activation` corresponds to `H = (h_1, h_2, ..., h_T)`, `gate_raw_act`\ncorresponds to the concatenation of `A_i`, `A_j`, `A_f` and `A_o`, and `memory`\ncorresponds to `C = (c_0, c_1, ..., c_T)`.\n\nAll entries in the batch are propagated to the end, and are assumed to be the\nsame length.\n\ninput: 4-D with shape `[batch_size, seq_len, 4, num_nodes]`\ninitial_state: 2-D with shape `[batch_size, num_nodes]`\ninitial_memory: 2-D with shape `[batch_size, num_nodes]`\nw_m_m: 3-D with shape `[num_nodes, 4, num_nodes]`\nactivation: 3-D with shape `[batch_size, seq_len, num_nodes]`\ngate_raw_act: 3-D with shape `[batch_size, seq_len, 4, num_nodes]`\nmemory: 3-D with shape `[batch_size, seq_len, num_nodes]`\n)doc\");\n\n// ----------------------------- VariableLSTMGradOp ----------------------------\n\n// Kernel to compute the gradient of VariableLSTMOp.\nclass VariableLSTMGradOp : public OpKernel {\n public:\n  explicit VariableLSTMGradOp(OpKernelConstruction* ctx) : OpKernel(ctx) {}\n\n  void Compute(OpKernelContext* ctx) override {\n    // Inputs.\n    const auto initial_state = ctx->input(0).tensor<float, 2>();\n    const auto initial_memory = ctx->input(1).tensor<float, 2>();\n    const auto w_m_m = ctx->input(2).tensor<float, 3>();\n    const auto act = ctx->input(3).tensor<float, 3>();\n    const auto gate_raw_act = ctx->input(4).tensor<float, 4>();\n    const auto memory = ctx->input(5).tensor<float, 3>();\n    const auto act_grad = ctx->input(6).tensor<float, 3>();\n    const auto gate_raw_act_grad = ctx->input(7).tensor<float, 4>();\n    const auto memory_grad = ctx->input(8).tensor<float, 3>();\n    const int 
batch_size = act.dimension(0);\n    const int seq_len = act.dimension(1);\n    const int output_dim = act.dimension(2);\n\n    // Sanity checks.\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, initial_state.dimension(0),\n                                     \"State batch\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(output_dim, initial_state.dimension(1), \"State dim\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, initial_memory.dimension(0),\n                                     \"Memory batch\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(output_dim, initial_memory.dimension(1),\n                                     \"Memory dim\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(output_dim, w_m_m.dimension(0), \"Weight dim 0\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(4, w_m_m.dimension(1), \"Weight dim 1\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(output_dim, w_m_m.dimension(2), \"Weight dim 2\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, gate_raw_act.dimension(0),\n                                     \"Gate raw activation batch\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(seq_len, gate_raw_act.dimension(1),\n                                     \"Gate raw activation  len\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(4, gate_raw_act.dimension(2),\n                                     \"Gate raw activation num\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(output_dim, gate_raw_act.dimension(3),\n                                     \"Gate raw activation dim\"));\n    OP_REQUIRES_OK(\n        ctx, AreDimsEqual(batch_size, memory.dimension(0), \"Memory batch\"));\n    OP_REQUIRES_OK(ctx,\n                   AreDimsEqual(seq_len, memory.dimension(1), \"Memory len\"));\n    OP_REQUIRES_OK(ctx,\n                   AreDimsEqual(output_dim, memory.dimension(2), \"Memory dim\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, act_grad.dimension(0),\n                                     \"Activation gradient batch\"));\n    OP_REQUIRES_OK(ctx, 
AreDimsEqual(seq_len, act_grad.dimension(1),\n                                     \"Activation gradient len\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(output_dim, act_grad.dimension(2),\n                                     \"Activation gradient dim\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, gate_raw_act_grad.dimension(0),\n                                     \"Activation gradient batch\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(seq_len, gate_raw_act_grad.dimension(1),\n                                     \"Activation gradient len\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(4, gate_raw_act_grad.dimension(2),\n                                     \"Activation gradient num\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(output_dim, gate_raw_act_grad.dimension(3),\n                                     \"Activation gradient dim\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(batch_size, memory_grad.dimension(0),\n                                     \"Memory gradient batch\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(seq_len, memory_grad.dimension(1),\n                                     \"Memory gradient len\"));\n    OP_REQUIRES_OK(ctx, AreDimsEqual(output_dim, memory_grad.dimension(2),\n                                     \"Memory gradient dim\"));\n\n    // Outputs.\n    std::vector<Tensor*> collections(4, nullptr);\n    OP_REQUIRES_OK(ctx,\n                   ctx->allocate_output(0, {batch_size, seq_len, 4, output_dim},\n                                        &collections[0]));\n    auto input_grad = collections[0]->tensor<float, 4>();\n    input_grad.setZero();\n\n    OP_REQUIRES_OK(ctx, ctx->allocate_output(1, {batch_size, output_dim},\n                                             &collections[1]));\n    auto init_state_grad = collections[1]->tensor<float, 2>();\n    init_state_grad.setZero();\n\n    OP_REQUIRES_OK(ctx, ctx->allocate_output(2, {batch_size, output_dim},\n                                             &collections[2]));\n    auto 
init_memory_grad = collections[2]->tensor<float, 2>();\n    init_memory_grad.setZero();\n\n    OP_REQUIRES_OK(ctx, ctx->allocate_output(3, {output_dim, 4, output_dim},\n                                             &collections[3]));\n    auto w_m_m_grad = collections[3]->tensor<float, 3>();\n    w_m_m_grad.setZero();\n\n    // Const and scratch tensors.\n    Tensor ones_tensor;\n    OP_REQUIRES_OK(ctx, ctx->allocate_temp(DT_FLOAT, {batch_size, output_dim},\n                                           &ones_tensor));\n    auto ones = ones_tensor.tensor<float, 2>();\n    ones.setConstant(1.0);\n\n    Tensor scratch_tensor;\n    OP_REQUIRES_OK(ctx,\n                   ctx->allocate_temp(DT_FLOAT, {batch_size, 4, output_dim},\n                                      &scratch_tensor));\n    auto scratch = scratch_tensor.tensor<float, 3>();\n    scratch.setZero();\n\n    Tensor tmp1_tensor;\n    OP_REQUIRES_OK(ctx, ctx->allocate_temp(DT_FLOAT, {batch_size, output_dim},\n                                           &tmp1_tensor));\n    auto tmp1 = tmp1_tensor.tensor<float, 2>();\n    tmp1.setZero();\n\n    Tensor tmp2_tensor;\n    OP_REQUIRES_OK(ctx, ctx->allocate_temp(DT_FLOAT, {batch_size, output_dim},\n                                           &tmp2_tensor));\n    auto tmp2 = tmp2_tensor.tensor<float, 2>();\n    tmp2.setZero();\n\n    // Uses the most efficient order for the contraction depending on the batch\n    // size.\n\n    // Shuffles the dimensions of the weight tensor to be efficient when used in\n    // the left-hand side. 
Allocates memory for the shuffled tensor for\n    // efficiency.\n    Tensor w_m_m_s_tensor;\n    OP_REQUIRES_OK(ctx,\n                   ctx->allocate_temp(DT_FLOAT, {4, output_dim, output_dim},\n                                      &w_m_m_s_tensor));\n    auto w_m_m_s = w_m_m_s_tensor.tensor<float, 3>();\n    if (batch_size == 1) {\n      // Allocates memory only if it is used.\n      w_m_m_s = w_m_m.shuffle(array<int, 3>{1, 2, 0});\n    }\n\n    // Dimensions for the contraction with the weight tensor.\n    const array<IndexPair, 1> m_m_dim =\n        batch_size == 1 ? array<IndexPair, 1>{IndexPair(1, 0)}\n                        : array<IndexPair, 1>{IndexPair(1, 1)};\n    // Dimensions for the contraction of the batch dimensions.\n    const array<IndexPair, 1> b_b_dim = {IndexPair(0, 0)};\n    for (int i = seq_len - 1; i >= 0; --i) {\n      if (i == seq_len - 1) {\n        init_state_grad = act_grad.chip(i, 1);\n      } else {\n        w_m_m_grad +=\n            act.chip(i, 1)\n                .contract(scratch.reshape(\n                              array<DenseIndex, 2>{batch_size, 4 * output_dim}),\n                          b_b_dim)\n                .reshape(array<DenseIndex, 3>{output_dim, 4, output_dim});\n        if (batch_size == 1) {\n          init_state_grad.device(ctx->eigen_cpu_device()) =\n              scratch.chip(0, 1).contract(w_m_m_s.chip(0, 0), m_m_dim) +\n              scratch.chip(1, 1).contract(w_m_m_s.chip(1, 0), m_m_dim) +\n              scratch.chip(2, 1).contract(w_m_m_s.chip(2, 0), m_m_dim) +\n              scratch.chip(3, 1).contract(w_m_m_s.chip(3, 0), m_m_dim);\n        } else {\n          init_state_grad.device(ctx->eigen_cpu_device()) =\n              (w_m_m.chip(0, 1).contract(scratch.chip(0, 1), m_m_dim) +\n               w_m_m.chip(1, 1).contract(scratch.chip(1, 1), m_m_dim) +\n               w_m_m.chip(2, 1).contract(scratch.chip(2, 1), m_m_dim) +\n               w_m_m.chip(3, 1).contract(scratch.chip(3, 1), m_m_dim))\n        
          .shuffle(array<int, 2>{1, 0});\n        }\n        init_state_grad += act_grad.chip(i, 1);\n      }\n\n      auto gate_raw_act_t = gate_raw_act.chip(i, 1);\n      auto gate_raw_act_grad_t = gate_raw_act_grad.chip(i, 1);\n\n      // Output gate.\n      tmp1 = memory.chip(i, 1);\n      tmp1 = tmp1.tanh();                          // y_t\n      tmp2 = gate_raw_act_t.chip(3, 1).sigmoid();  // o_t\n      scratch.chip(3, 1) = init_state_grad * tmp1 * tmp2 * (ones - tmp2) +\n                           gate_raw_act_grad_t.chip(3, 1);\n\n      init_memory_grad += init_state_grad * tmp2 * (ones - tmp1.square()) +\n                          memory_grad.chip(i, 1);\n\n      // Input gate.\n      tmp1 = gate_raw_act_t.chip(0, 1).sigmoid();  // i_t\n      tmp2 = gate_raw_act_t.chip(1, 1);\n      tmp2 = tmp2.tanh();  // j_t\n      scratch.chip(0, 1) = init_memory_grad * tmp2 * tmp1 * (ones - tmp1) +\n                           gate_raw_act_grad_t.chip(0, 1);\n\n      // Input.\n      scratch.chip(1, 1) = init_memory_grad * tmp1 * (ones - tmp2.square()) +\n                           gate_raw_act_grad_t.chip(1, 1);\n\n      // Forget gate.\n      tmp1 = gate_raw_act_t.chip(2, 1).sigmoid();  // f_t\n      if (i == 0) {\n        scratch.chip(2, 1) =\n            init_memory_grad * initial_memory * tmp1 * (ones - tmp1) +\n            gate_raw_act_grad_t.chip(2, 1);\n      } else {\n        scratch.chip(2, 1) =\n            init_memory_grad * memory.chip(i - 1, 1) * tmp1 * (ones - tmp1) +\n            gate_raw_act_grad_t.chip(2, 1);\n      }\n\n      // Memory.\n      init_memory_grad *= tmp1;\n\n      input_grad.chip(i, 1) = scratch;\n    }\n    w_m_m_grad += initial_state\n                      .contract(scratch.reshape(array<DenseIndex, 2>{\n                                    batch_size, 4 * output_dim}),\n                                b_b_dim)\n                      .reshape(array<DenseIndex, 3>{output_dim, 4, output_dim});\n    if (batch_size == 1) {\n      
init_state_grad.device(ctx->eigen_cpu_device()) =\n          (scratch.chip(0, 1).contract(w_m_m_s.chip(0, 0), m_m_dim) +\n           scratch.chip(1, 1).contract(w_m_m_s.chip(1, 0), m_m_dim) +\n           scratch.chip(2, 1).contract(w_m_m_s.chip(2, 0), m_m_dim) +\n           scratch.chip(3, 1).contract(w_m_m_s.chip(3, 0), m_m_dim));\n    } else {\n      init_state_grad.device(ctx->eigen_cpu_device()) =\n          (w_m_m.chip(0, 1).contract(scratch.chip(0, 1), m_m_dim) +\n           w_m_m.chip(1, 1).contract(scratch.chip(1, 1), m_m_dim) +\n           w_m_m.chip(2, 1).contract(scratch.chip(2, 1), m_m_dim) +\n           w_m_m.chip(3, 1).contract(scratch.chip(3, 1), m_m_dim))\n              .shuffle(array<int, 2>{1, 0});\n    }\n  }\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"VariableLSTMGrad\").Device(DEVICE_CPU),\n                        VariableLSTMGradOp);\n\nREGISTER_OP(\"VariableLSTMGrad\")\n    .Input(\"initial_state: float32\")\n    .Input(\"initial_memory: float32\")\n    .Input(\"w_m_m: float32\")\n    .Input(\"activation: float32\")\n    .Input(\"gate_raw_act: float32\")\n    .Input(\"memory: float32\")\n    .Input(\"act_grad: float32\")\n    .Input(\"gate_raw_act_grad: float32\")\n    .Input(\"memory_grad: float32\")\n    .Output(\"input_grad: float32\")\n    .Output(\"initial_state_grad: float32\")\n    .Output(\"initial_memory_grad: float32\")\n    .Output(\"w_m_m_grad: float32\")\n    .Doc(R\"doc(\nComputes the gradient for VariableLSTM.\n\nThis is to be used in conjunction with VariableLSTM. 
It ignores the clipping used\nin the forward pass.\n\ninitial_state: 2-D with shape `[batch_size, num_nodes]`\ninitial_memory: 2-D with shape `[batch_size, num_nodes]`\nw_m_m: 3-D with shape `[num_nodes, 4, num_nodes]`\nactivation: 3-D with shape `[batch_size, seq_len, num_nodes]`\ngate_raw_act: 3-D with shape `[batch_size, seq_len, 4, num_nodes]`\nmemory: 3-D with shape `[batch_size, seq_len, num_nodes]`\nact_grad: 3-D with shape `[batch_size, seq_len, num_nodes]`\ngate_raw_act_grad: 3-D with shape `[batch_size, seq_len, 4, num_nodes]`\nmemory_grad: 3-D with shape `[batch_size, seq_len, num_nodes]`\ninput_grad: 3-D with shape `[batch_size, seq_len, num_nodes]`\ninitial_state_grad: 2-D with shape `[batch_size, num_nodes]`\ninitial_memory_grad: 2-D with shape `[batch_size, num_nodes]`\nw_m_m_grad: 3-D with shape `[num_nodes, 4, num_nodes]`\n)doc\");\n\n}  // namespace tensorflow\n"
  },
  {
    "path": "model_zoo/models/street/g3doc/vgslspecs.md",
    "content": "# VGSL Specs - rapid prototyping of mixed conv/LSTM networks for images.\n\nVariable-size Graph Specification Language (VGSL) enables the specification of a\nTensorFlow graph, composed of convolutions and LSTMs, that can process\nvariable-sized images, from a very short definition string.\n\n## Applications: What are VGSL Specs good for?\n\nVGSL Specs are designed specifically to create TF graphs for:\n\n*   Variable size images as the input. (In one or BOTH dimensions!)\n*   Output an image (heat map), sequence (like text), or a category.\n*   Convolutions and LSTMs are the main computing components.\n*   Fixed-size images are OK too!\n\nBut wait, aren't there other systems that simplify generating TF graphs? There\nare indeed, but something they all have in common is that they are designed for\nfixed size images only. If you want to solve a real OCR problem, you either have\nto cut the image into arbitrary sized pieces and try to stitch the results back\ntogether, or use VGSL.\n\n## Basic Usage\n\nA full model, including input and the output layers, can be built using\nvgsl_model.py. Alternatively you can supply your own tensors and add your own\nloss function layer if you wish, using vgslspecs.py directly.\n\n### Building a full model\n\nProvided your problem matches the one addressed by vgsl_model, you are good to\ngo.\n\nTargeted problems:\n\n*   Images for input, either 8 bit greyscale or 24 bit color.\n*   Output is 0-d (A category, like cat, dog, train, car.)\n*   Output is 1-d, with either variable length or a fixed length sequence, eg\n    OCR, transcription problems in general.\n\nCurrently only softmax (1 of n) outputs are supported, but it would not be\ndifficult to extend to logistic.\n\nUse vgsl_train.py to train your model, and vgsl_eval.py to evaluate it. 
They\njust call Train and Eval in vgsl_model.py.\n\n### Model string for a full model\n\nThe model string for a full model includes the input spec, the output spec and\nthe layers spec in between. Example:\n\n```\n'1,0,0,3[Ct5,5,16 Mp3,3 Lfys64 Lfx128 Lrx128 Lfx256]O1c105'\n```\n\nThe first 4 numbers specify the standard TF tensor dimensions: [batch, height,\nwidth, depth], except that height and/or width may be zero, allowing them to be\nvariable. Batch is specific only to training, and may be a different value at\nrecognition/inference time. Depth needs to be 1 for greyscale and 3 for color.\n\nThe model string in square brackets [] is the main model definition, which is\ndescribed [below.](#basic-layers-syntax) The output specification takes the\nform:\n\n```\nO(2|1|0)(l|s|c)n output layer with n classes.\n  2 (heatmap) Output is a 2-d vector map of the input (possibly at\n    different scale). (Not yet supported.)\n  1 (sequence) Output is a 1-d sequence of vector values.\n  0 (category) Output is a 0-d single vector value.\n  l uses a logistic non-linearity on the output, allowing multiple\n    hot elements in any output vector value. (Not yet supported.)\n  s uses a softmax non-linearity, with one-hot output in each value.\n  c uses a softmax with CTC. Can only be used with 1 (sequence).\n  NOTE Only O0s, O1s and O1c are currently supported.\n```\n\nThe number of classes must match the encoding of the TF Example data set.\n\n### Layers only - providing your own input and loss layers\n\nYou don't have to use the canned input/output modules, if you provide your own\ncode to read TF Examples and your own loss functions. 
First prepare your inputs:\n\n*   A TF-conventional batch of: `images = tf.float32[batch, height, width,\n    depth]`\n*   A tensor of the width of each image in the batch: `widths = tf.int64[batch]`\n*   A tensor of the height of each image in the batch: `heights =\n    tf.int64[batch]`\n\nNote that these can be created from individual images using\n`tf.train.batch_join` with `dynamic_pad=True.`\n\n```python\nimport vgslspecs\n...\nspec = '[Ct5,5,16 Mp3,3 Lfys64 Lfx128 Lrx128 Lfx256]'\nvgsl = vgslspecs.VGSLSpecs(widths, heights, is_training=True)\nlast_layer = vgsl.Build(images, spec)\n...\nAddSomeLossFunction(last_layer)....\n```\n\nWith some appropriate training data, this would create a world-class OCR engine!\n\n## Basic Layers Syntax\n\nNOTE that *all* ops input and output the standard TF convention of a 4-d tensor:\n`[batch, height, width, depth]` *regardless of any collapsing of dimensions.*\nThis greatly simplifies things, and allows the VGSLSpecs class to track changes\nto the values of widths and heights, so they can be correctly passed in to LSTM\noperations, and used by any downstream CTC operation.\n\nNOTE: in the descriptions below, `<d>` is a numeric value, and literals are\ndescribed using regular expression syntax.\n\nNOTE: Whitespace is allowed between ops.\n\n### Naming\n\nEach op gets a unique name by default, based on its spec string plus its\ncharacter position in the overall specification. All the Ops take an optional\nname argument in braces after the mnemonic code, but before any numeric\narguments.\n\n### Functional ops\n\n```\nC(s|t|r|l|m)[{name}]<y>,<x>,<d> Convolves using a y,x window, with no shrinkage,\n  SAME infill, d outputs, with s|t|r|l|m non-linear layer.\nF(s|t|r|l|m)[{name}]<d> Fully-connected with s|t|r|l|m non-linearity and d\n  outputs. Reduces height, width to 1. 
Input height and width must be constant.\nL(f|r|b)(x|y)[s][{name}]<n> LSTM cell with n outputs.\n  The LSTM must have one of:\n    f runs the LSTM forward only.\n    r runs the LSTM reversed only.\n    b runs the LSTM bidirectionally.\n  It will operate on either the x- or y-dimension, treating the other dimension\n  independently (as if part of the batch).\n  (Full 2-d and grid are not yet supported).\n  s (optional) summarizes the output in the requested dimension,\n     outputting only the final step, collapsing the dimension to a\n     single element.\nDo[{name}] Insert a dropout layer.\n```\n\nIn the above, `(s|t|r|l|m)` specifies the type of the non-linearity:\n\n```python\ns = sigmoid\nt = tanh\nr = relu\nl = linear (i.e., None)\nm = softmax\n```\n\nExamples:\n\n`Cr5,5,32` Runs a 5x5 Relu convolution with 32 depth/number of filters.\n\n`Lfx{MyLSTM}128` runs a forward-only LSTM, named 'MyLSTM' in the x-dimension\nwith 128 outputs, treating the y dimension independently.\n\n`Lfys64` runs a forward-only LSTM in the y-dimension with 64 outputs, treating\nthe x-dimension independently and collapses the y-dimension to 1 element.\n\n### Plumbing ops\n\nThe plumbing ops allow the construction of arbitrarily complex graphs. Something\ncurrently missing is the ability to define macros for generating say an\ninception unit in multiple places.\n\n```\n[...] Execute ... networks in series (layers).\n(...) Execute ... networks in parallel, with their output concatenated in depth.\nS[{name}]<d>(<a>x<b>)<e>,<f> Splits one dimension, moves one part to another\n  dimension.\nMp[{name}]<y>,<x> Maxpool the input, reducing the (y,x) rectangle to a single\n  value.\n```\n\nIn the `S` op, `<a>, <b>, <d>, <e>, <f>` are numbers.\n\n`S` is a generalized reshape. 
It splits input dimension `d` into `a` x `b`,\nsending the high/most significant part `a` to the high/most significant side of\ndimension `e`, and the low part `b` to the high side of dimension `f`.\nException: if `d=e=f`, then dimension `d` is internally transposed to\n`bxa`. *At least one* of `e`, `f` must be equal to `d`, so no dimension can be\ntotally destroyed. Either `a` or `b` can be zero, meaning whatever is left after\ntaking out the other, allowing dimensions to be of variable size.\n\nNOTE: Remember the standard TF convention of a 4-d tensor: `[batch, height,\nwidth, depth]`, so `batch=0, height=1, width=2, depth=3.`\n\nEg. `S3(3x50)2,3` will split the 150-element depth into 3x50, with the 3 going\nto the most significant part of the width, and the 50 part staying in depth.\nThis will rearrange a 3x50 output parallel operation to spread the 3 output sets\nover width.\n\n### Full Examples\n\nExample 1: A graph capable of high quality OCR.\n\n`1,0,0,1[Ct5,5,16 Mp3,3 Lfys64 Lfx128 Lrx128 Lfx256]O1c105`\n\nAs layer descriptions: (Input layer is at the bottom, output at the top.)\n\n```\nO1c105: Output layer produces 1-d (sequence) output, trained with CTC,\n  outputting 105 classes.\nLfx256: Forward-only LSTM in x with 256 outputs\nLrx128: Reverse-only LSTM in x with 128 outputs\nLfx128: Forward-only LSTM in x with 128 outputs\nLfys64: Dimension-summarizing LSTM, summarizing the y-dimension with 64 outputs\nMp3,3: 3 x 3 Maxpool\nCt5,5,16: 5 x 5 Convolution with 16 outputs and tanh non-linearity\n[]: The body of the graph is always expressed as a series of layers.\n1,0,0,1: Input is a batch of 1 image of variable size in greyscale\n```\n\nExample 2: The STREET network for reading French street name signs end-to-end.\nFor a detailed description see the [FSNS dataset\npaper](http://link.springer.com/chapter/10.1007%2F978-3-319-46604-0_30)\n\n```\n1,600,150,3[S2(4x150)0,2 Ct5,5,16 Mp2,2 Ct5,5,64 Mp3,3\n  ([Lrys64 Lbx128][Lbys64 Lbx128][Lfys64 Lbx128]) 
S3(3x0)2,3\n  Lfx128 Lrx128 S0(1x4)0,3 Lfx256]O1c134\n```\n\nSince networks are usually illustrated with the input at the bottom, the input\nlayer is at the bottom, output at the top, with 'headings' *below* the section\nthey introduce.\n\n```\nO1c134: Output is a 1-d sequence, trained with CTC and 134 output softmax.\nLfx256: Forward-only LSTM with 256 outputs\nS0(1x4)0,3: Reshape transferring the batch of 4 tiles to the depth dimension.\nLrx128: Reverse-only LSTM with 128 outputs\nLfx128: Forward-only LSTM with 128 outputs\n(Final section above)\nS3(3x0)2,3: Split the outputs of the 3 parallel summarizers and spread over the\n  x-dimension\n  [Lfys64 Lbx128]: Summarizing LSTM downwards on the y-dimension with 64\n    outputs, followed by a bi-directional LSTM in the x-dimension with 128\n    outputs\n  [Lbys64 Lbx128]: Summarizing bi-directional LSTM on the y-dimension with\n    64 outputs, followed by a bi-directional LSTM in the x-dimension with 128\n    outputs\n  [Lrys64 Lbx128]: Summarizing LSTM upwards on the y-dimension with 64 outputs,\n    followed by a bi-directional LSTM in the x-dimension with 128 outputs\n(): In parallel (re-using the inputs and concatenating the outputs):\n(Summarizing section above)\nMp3,3: 3 x 3 Maxpool\nCt5,5,64: 5 x 5 Convolution with 64 outputs and tanh non-linearity\nMp2,2: 2 x 2 Maxpool\nCt5,5,16: 5 x 5 Convolution with 16 outputs and tanh non-linearity\nS2(4x150)0,2: Split the x-dimension into 4x150, converting each tiled 600x150\nimage into a batch of 4 150x150 images\n(Convolutional input section above)\n[]: The body of the graph is always expressed as a series of layers.\n1,150,600,3: Input is a batch of 1 600x150 image in 24 bit color\n```\n\n## Variable size Tensors Under the Hood\n\nHere are some notes about handling variable-sized images since they require some\nconsideration and a little bit of knowledge about what goes on inside.\n\nA variable-sized image is an input for which the width and/or height are not\nknown 
at graph-building time, so the tensor shape contains unknown/None/-1\nsizes.\n\nMany standard NN layers, such as convolutions, are designed to cope naturally\nwith variable-sized images in TF and produce a variable sized image as the\noutput. For other layers, such as 'Fully connected' variable size is\nfundamentally difficult, if not impossible to deal with, since by definition,\n*all* its inputs are connected via a weight to an output. The number of inputs\ntherefore must be fixed.\n\nIt is possible to handle variable sized images by using sparse tensors. Some\nimplementations make a single variable dimension a list instead of part of the\ntensor. Both these solutions suffer from completely segregating the world of\nvariable size from the world of fixed size, making models and their descriptions\ncompletely non-interchangeable.\n\nIn VGSL, we use a standard 4-d Tensor, `[batch, height, width, depth]` and\neither use a batch size of 1 or put up with padding of the input images to the\nlargest size of any element of the batch. The other price paid for this\nstandardization is that the user must supply a pair of tensors of shape [batch]\nspecifying the width and height of each input in a batch. This allows the LSTMs\nin the graph to know how many iterations to execute and how to correctly\nback-propagate the gradients.\n\nThe standard TF implementation of CTC also requires a tensor giving the sequence\nlengths of its inputs. If the output of VGSL is going into CTC, the lengths can\nbe obtained using:\n\n```python\nimport vgslspecs\n...\nspec = '[Ct5,5,16 Mp3,3 Lfys64 Lfx128 Lrx128 Lfx256]'\nvgsl = vgslspecs.VGSLSpecs(widths, heights, is_training=True)\nlast_layer = vgsl.Build(images, spec)\nseq_lengths = vgsl.GetLengths()\n```\n\nThe above will provide the widths that were given in the constructor, scaled\ndown by the max-pool operator. 
The heights may be obtained using\n`vgsl.GetLengths(1)`, specifying the index of the y-dimension.\n\nNOTE that currently the only way of collapsing a dimension of unknown size to\nknown size (1) is through the use of a summarizing LSTM. A single summarizing\nLSTM will collapse one dimension (x or y), leaving a 1-d sequence. The 1-d\nsequence can then be collapsed in the other dimension to make a 0-d categorical\n(softmax) or embedding (logistic) output.\n\nUsing the (parallel) op it is entirely possible to run multiple [series] of ops\nthat collapse x first in one and y first in the other, reducing both eventually\nto a single categorical value! For example, the following description may do\nsomething useful with ImageNet-like problems:\n\n```python\n[Cr5,5,16 Mp2,2 Cr5,5,64 Mp3,3 ([Lfxs64 Lfys256] [Lfys64 Lfxs256]) Fr512 Fr512]\n```\n"
  },
  {
    "path": "model_zoo/models/street/python/decoder.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Basic CTC+recoder decoder.\n\nDecodes a sequence of class-ids into UTF-8 text.\nFor basic information on CTC see:\nAlex Graves et al. Connectionist Temporal Classification: Labelling Unsegmented\nSequence Data with Recurrent Neural Networks.\nhttp://www.cs.toronto.edu/~graves/icml_2006.pdf\n\"\"\"\nimport collections\nimport re\n\nimport errorcounter as ec\nimport tensorflow as tf\n\n# Named tuple Part describes a part of a multi (1 or more) part code that\n# represents a utf-8 string. For example, Chinese character 'x' might be\n# represented by 3 codes of which (utf8='x', index=1, num_codes=3) would be the\n# middle part. 
(The actual code is not stored in the tuple).\nPart = collections.namedtuple('Part', 'utf8 index, num_codes')\n\n\n# Class that decodes a sequence of class-ids into UTF-8 text.\nclass Decoder(object):\n  \"\"\"Basic CTC+recoder decoder.\"\"\"\n\n  def __init__(self, filename):\n    r\"\"\"Constructs a Decoder.\n\n    Reads the text file describing the encoding and builds the decoder.\n    The text file contains lines of the form:\n    <code>[,<code>]*\\t<string>\n    Each line defines a mapping from a sequence of one or more integer codes to\n    a corresponding utf-8 string.\n    Args:\n      filename:   Name of file defining the decoding sequences.\n    \"\"\"\n    # self.decoder is a list of lists of Part(utf8, index, num_codes).\n    # The index to the top-level list is a code. The list given by the code\n    # index is a list of the parts represented by that code, Eg if the code 42\n    # represents the 2nd (index 1) out of 3 parts of Chinese character 'x', then\n    # self.decoder[42] = [..., (utf8='x', index=1, num_codes=3), ...] where ...\n    # means all other uses of the code 42.\n    self.decoder = []\n    if filename:\n      self._InitializeDecoder(filename)\n\n  def SoftmaxEval(self, sess, model, num_steps):\n    \"\"\"Evaluate a model in softmax mode.\n\n    Adds char, word recall and sequence error rate events to the sw summary\n    writer, and returns them as well.\n    TODO(rays) Add LogisticEval.\n    Args:\n      sess:  A TensorFlow Session.\n      model: The model to run in the session. 
Requires a VGSLImageModel or any\n        other class that has a using_ctc attribute and a RunAStep(sess) method\n        that returns a softmax result with corresponding labels.\n      num_steps: Number of steps to evaluate for.\n    Returns:\n      ErrorRates named tuple.\n    Raises:\n      ValueError: If an unsupported number of dimensions is used.\n    \"\"\"\n    coord = tf.train.Coordinator()\n    threads = tf.train.start_queue_runners(sess=sess, coord=coord)\n    # Run the requested number of evaluation steps, gathering the outputs of the\n    # softmax and the true labels of the evaluation examples.\n    total_label_counts = ec.ErrorCounts(0, 0, 0, 0)\n    total_word_counts = ec.ErrorCounts(0, 0, 0, 0)\n    sequence_errors = 0\n    for _ in xrange(num_steps):\n      softmax_result, labels = model.RunAStep(sess)\n      # Collapse softmax to same shape as labels.\n      predictions = softmax_result.argmax(axis=-1)\n      # Exclude batch from num_dims.\n      num_dims = len(predictions.shape) - 1\n      batch_size = predictions.shape[0]\n      null_label = softmax_result.shape[-1] - 1\n      for b in xrange(batch_size):\n        if num_dims == 2:\n          # TODO(rays) Support 2-d data.\n          raise ValueError('2-d label data not supported yet!')\n        else:\n          if num_dims == 1:\n            pred_batch = predictions[b, :]\n            labels_batch = labels[b, :]\n          else:\n            pred_batch = [predictions[b]]\n            labels_batch = [labels[b]]\n          text = self.StringFromCTC(pred_batch, model.using_ctc, null_label)\n          truth = self.StringFromCTC(labels_batch, False, null_label)\n          # Note that recall_errs is false negatives (fn) aka drops/deletions.\n          # Actual recall would be 1-fn/truth_words.\n          # Likewise precision_errs is false positives (fp) aka adds/insertions.\n          # Actual precision would be 1-fp/ocr_words.\n          total_word_counts = ec.AddErrors(total_word_counts,\n          
                                 ec.CountWordErrors(text, truth))\n          total_label_counts = ec.AddErrors(total_label_counts,\n                                            ec.CountErrors(text, truth))\n          if text != truth:\n            sequence_errors += 1\n\n    coord.request_stop()\n    coord.join(threads)\n    return ec.ComputeErrorRates(total_label_counts, total_word_counts,\n                                sequence_errors, num_steps * batch_size)\n\n  def StringFromCTC(self, ctc_labels, merge_dups, null_label):\n    \"\"\"Decodes CTC output to a string.\n\n    Extracts only sequences of codes that are allowed by self.decoder.\n    Labels that make illegal code sequences are dropped.\n    Note that, by its nature of taking only top choices, this is much weaker\n    than a full-blown beam search that considers all the softmax outputs.\n    For languages without many multi-code sequences, this doesn't make much\n    difference, but for complex scripts the accuracy will be much lower.\n    Args:\n      ctc_labels: List of class labels including null characters to remove.\n      merge_dups: If True, duplicate labels will be merged.\n      null_label: Label value to ignore.\n\n    Returns:\n      Labels decoded to a string.\n    \"\"\"\n    # Run regular ctc on the labels, extracting a list of codes.\n    codes = self._CodesFromCTC(ctc_labels, merge_dups, null_label)\n    length = len(codes)\n    if length == 0:\n      return ''\n    # strings and partials are both indexed by the same index as codes.\n    # strings[i] is the best completed string up to position i, and\n    # partials[i] is a list of partial code sequences at position i.\n    # Warning: memory is squared-order in length.\n    strings = []\n    partials = []\n    for pos in xrange(length):\n      code = codes[pos]\n      parts = self.decoder[code]\n      partials.append([])\n      strings.append('')\n      # Iterate over the parts that this code can represent.\n      for utf8, index, 
num_codes in parts:\n        if index > pos:\n          continue\n        # We can use code if it is an initial code (index==0) or continues a\n        # sequence in the partials list at the previous position.\n        if index == 0 or partials[pos - 1].count(\n            Part(utf8, index - 1, num_codes)) > 0:\n          if index < num_codes - 1:\n            # Save the partial sequence.\n            partials[-1].append(Part(utf8, index, num_codes))\n          elif not strings[-1]:\n            # A code sequence is completed. Append to the best string that we\n            # had where it started.\n            if pos >= num_codes:\n              strings[-1] = strings[pos - num_codes] + utf8\n            else:\n              strings[-1] = utf8\n      if not strings[-1] and pos > 0:\n        # We didn't get anything here so copy the previous best string, skipping\n        # the current code, but it may just be a partial anyway.\n        strings[-1] = strings[-2]\n    return strings[-1]\n\n  def _InitializeDecoder(self, filename):\n    \"\"\"Reads the decoder file and initializes self.decoder from it.\n\n    Args:\n      filename: Name of text file mapping codes to utf8 strings.\n    Raises:\n      ValueError: if the input file is not parsed correctly.\n    \"\"\"\n    line_re = re.compile(r'(?P<codes>\\d+(,\\d+)*)\\t(?P<utf8>.+)')\n    with tf.gfile.GFile(filename) as f:\n      for line in f:\n        m = line_re.match(line)\n        if m is None:\n          raise ValueError('Unmatched line:', line)\n        # codes is the sequence that maps to the string.\n        str_codes = m.groupdict()['codes'].split(',')\n        codes = []\n        for code in str_codes:\n          codes.append(int(code))\n        utf8 = m.groupdict()['utf8']\n        num_codes = len(codes)\n        for index, code in enumerate(codes):\n          while code >= len(self.decoder):\n            self.decoder.append([])\n          self.decoder[code].append(Part(utf8, index, num_codes))\n\n  def 
_CodesFromCTC(self, ctc_labels, merge_dups, null_label):\n    \"\"\"Collapses CTC output to regular output.\n\n    Args:\n      ctc_labels: List of class labels including null characters to remove.\n      merge_dups: If True, Duplicate labels will be merged.\n      null_label: Label value to ignore.\n\n    All trailing zeros are removed!!\n    TODO(rays) This may become a problem with non-CTC models.\n    If using charset, this should not be a problem as zero is always space.\n    tf.pad can only append zero, so we have to be able to drop them, as a\n    non-ctc will have learned to output trailing zeros instead of trailing\n    nulls. This is awkward, as the stock ctc loss function requires that the\n    null character be num_classes-1.\n    Returns:\n      (List of) Labels with null characters removed.\n    \"\"\"\n    out_labels = []\n    prev_label = -1\n    zeros_needed = 0\n    for label in ctc_labels:\n      if label == null_label:\n        prev_label = -1\n      elif label != prev_label or not merge_dups:\n        if label == 0:\n          # Count zeros and only emit them when it is clear there is a non-zero\n          # after, so as to truncate away all trailing zeros.\n          zeros_needed += 1\n        else:\n          if merge_dups and zeros_needed > 0:\n            out_labels.append(0)\n          else:\n            out_labels += [0] * zeros_needed\n          zeros_needed = 0\n          out_labels.append(label)\n        prev_label = label\n    return out_labels\n"
  },
  {
    "path": "model_zoo/models/street/python/decoder_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for decoder.\"\"\"\nimport os\n\nimport tensorflow as tf\nimport decoder\n\n\ndef _testdata(filename):\n  return os.path.join('../testdata/', filename)\n\n\nclass DecoderTest(tf.test.TestCase):\n\n  def testCodesFromCTC(self):\n    \"\"\"Tests that the simple CTC decoder drops nulls and duplicates.\n    \"\"\"\n    ctc_labels = [9, 9, 9, 1, 9, 2, 2, 3, 9, 9, 0, 0, 1, 9, 1, 9, 9, 9]\n    decode = decoder.Decoder(filename=None)\n    non_null_labels = decode._CodesFromCTC(\n        ctc_labels, merge_dups=False, null_label=9)\n    self.assertEqual(non_null_labels, [1, 2, 2, 3, 0, 0, 1, 1])\n    idempotent_labels = decode._CodesFromCTC(\n        non_null_labels, merge_dups=False, null_label=9)\n    self.assertEqual(idempotent_labels, non_null_labels)\n    collapsed_labels = decode._CodesFromCTC(\n        ctc_labels, merge_dups=True, null_label=9)\n    self.assertEqual(collapsed_labels, [1, 2, 3, 0, 1, 1])\n    non_idempotent_labels = decode._CodesFromCTC(\n        collapsed_labels, merge_dups=True, null_label=9)\n    self.assertEqual(non_idempotent_labels, [1, 2, 3, 0, 1])\n\n  def testStringFromCTC(self):\n    \"\"\"Tests that the decoder can decode sequences including multi-codes.\n    \"\"\"\n    #             -  f  -  a  r  -  m(1/2)m     -junk 
sp b  a  r  -  n  -\n    ctc_labels = [9, 6, 9, 1, 3, 9, 4, 9, 5, 5, 9, 5, 0, 2, 1, 3, 9, 4, 9]\n    decode = decoder.Decoder(filename=_testdata('charset_size_10.txt'))\n    text = decode.StringFromCTC(ctc_labels, merge_dups=True, null_label=9)\n    self.assertEqual(text, 'farm barn')\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/street/python/errorcounter.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Some simple tools for error counting.\n\n\"\"\"\nimport collections\n\n# Named tuple Error counts describes the counts needed to accumulate errors\n# over multiple trials:\n#   false negatives (aka drops or deletions),\n#   false positives: (aka adds or insertions),\n#   truth_count: number of elements in ground truth = denominator for fn,\n#   test_count:  number of elements in test string = denominator for fp,\n# Note that recall = 1 - fn/truth_count, precision = 1 - fp/test_count,\n# accuracy = 1 - (fn + fp) / (truth_count + test_count).\nErrorCounts = collections.namedtuple('ErrorCounts', ['fn', 'fp', 'truth_count',\n                                                     'test_count'])\n\n# Named tuple for error rates, as a percentage. 
Accuracies are just 100-error.\nErrorRates = collections.namedtuple('ErrorRates',\n                                    ['label_error', 'word_recall_error',\n                                     'word_precision_error', 'sequence_error'])\n\n\ndef CountWordErrors(ocr_text, truth_text):\n  \"\"\"Counts the word drop and add errors as a bag of words.\n\n  Args:\n    ocr_text:    OCR text string.\n    truth_text:  Truth text string.\n\n  Returns:\n    ErrorCounts named tuple.\n  \"\"\"\n  # Convert to lists of words.\n  return CountErrors(ocr_text.split(), truth_text.split())\n\n\ndef CountErrors(ocr_text, truth_text):\n  \"\"\"Counts the drops and adds between 2 bags of iterables.\n\n  Simple bag of objects count returns the number of dropped and added\n  elements, regardless of order, from anything that is iterable, eg\n  a pair of strings gives character errors, and a pair of word lists give\n  word errors.\n  Args:\n    ocr_text:    OCR text iterable (eg string for chars, word list for words).\n    truth_text:  Truth text iterable.\n\n  Returns:\n    ErrorCounts named tuple.\n  \"\"\"\n  counts = collections.Counter(truth_text)\n  counts.subtract(ocr_text)\n  drops = sum(c for c in counts.values() if c > 0)\n  adds = sum(-c for c in counts.values() if c < 0)\n  return ErrorCounts(drops, adds, len(truth_text), len(ocr_text))\n\n\ndef AddErrors(counts1, counts2):\n  \"\"\"Adds the counts and returns a new sum tuple.\n\n  Args:\n    counts1: ErrorCounts named tuples to sum.\n    counts2: ErrorCounts named tuples to sum.\n  Returns:\n    Sum of counts1, counts2.\n  \"\"\"\n  return ErrorCounts(counts1.fn + counts2.fn, counts1.fp + counts2.fp,\n                     counts1.truth_count + counts2.truth_count,\n                     counts1.test_count + counts2.test_count)\n\n\ndef ComputeErrorRates(label_counts, word_counts, seq_errors, num_seqs):\n  \"\"\"Returns an ErrorRates corresponding to the given counts.\n\n  Args:\n    label_counts: ErrorCounts for the character 
labels\n    word_counts:  ErrorCounts for the words\n    seq_errors:   Number of sequence errors\n    num_seqs:     Total sequences\n  Returns:\n    ErrorRates corresponding to the given counts.\n  \"\"\"\n  label_errors = label_counts.fn + label_counts.fp\n  num_labels = label_counts.truth_count + label_counts.test_count\n  return ErrorRates(\n      ComputeErrorRate(label_errors, num_labels),\n      ComputeErrorRate(word_counts.fn, word_counts.truth_count),\n      ComputeErrorRate(word_counts.fp, word_counts.test_count),\n      ComputeErrorRate(seq_errors, num_seqs))\n\n\ndef ComputeErrorRate(error_count, truth_count):\n  \"\"\"Returns a sanitized percent error rate from the raw counts.\n\n  Prevents div by 0 and clips return to 100%.\n  Args:\n    error_count: Number of errors.\n    truth_count: Number to divide by.\n\n  Returns:\n    100.0 * error_count / truth_count clipped to 100.\n  \"\"\"\n  if truth_count == 0:\n    truth_count = 1\n    error_count = 1\n  elif error_count > truth_count:\n    error_count = truth_count\n  return error_count * 100.0 / truth_count\n"
  },
  {
    "path": "model_zoo/models/street/python/errorcounter_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for errorcounter.\"\"\"\nimport tensorflow as tf\nimport errorcounter as ec\n\n\nclass ErrorcounterTest(tf.test.TestCase):\n\n  def testComputeErrorRate(self):\n    \"\"\"Tests that the percent calculation works as expected.\n    \"\"\"\n    rate = ec.ComputeErrorRate(error_count=0, truth_count=0)\n    self.assertEqual(rate, 100.0)\n    rate = ec.ComputeErrorRate(error_count=1, truth_count=0)\n    self.assertEqual(rate, 100.0)\n    rate = ec.ComputeErrorRate(error_count=10, truth_count=1)\n    self.assertEqual(rate, 100.0)\n    rate = ec.ComputeErrorRate(error_count=0, truth_count=1)\n    self.assertEqual(rate, 0.0)\n    rate = ec.ComputeErrorRate(error_count=3, truth_count=12)\n    self.assertEqual(rate, 25.0)\n\n  def testCountErrors(self):\n    \"\"\"Tests that the error counter works as expected.\n    \"\"\"\n    truth_str = 'farm barn'\n    counts = ec.CountErrors(ocr_text=truth_str, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=0, fp=0, truth_count=9, test_count=9))\n    # With a period on the end, we get a char error.\n    dot_str = 'farm barn.'\n    counts = ec.CountErrors(ocr_text=dot_str, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=0, 
fp=1, truth_count=9, test_count=10))\n    counts = ec.CountErrors(ocr_text=truth_str, truth_text=dot_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=1, fp=0, truth_count=10, test_count=9))\n    # Space is just another char.\n    no_space = 'farmbarn'\n    counts = ec.CountErrors(ocr_text=no_space, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=1, fp=0, truth_count=9, test_count=8))\n    counts = ec.CountErrors(ocr_text=truth_str, truth_text=no_space)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=0, fp=1, truth_count=8, test_count=9))\n    # Lose them all.\n    counts = ec.CountErrors(ocr_text='', truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=9, fp=0, truth_count=9, test_count=0))\n    counts = ec.CountErrors(ocr_text=truth_str, truth_text='')\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=0, fp=9, truth_count=0, test_count=9))\n\n  def testCountWordErrors(self):\n    \"\"\"Tests that the error counter works as expected.\n    \"\"\"\n    truth_str = 'farm barn'\n    counts = ec.CountWordErrors(ocr_text=truth_str, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=0, fp=0, truth_count=2, test_count=2))\n    # With a period on the end, we get a word error.\n    dot_str = 'farm barn.'\n    counts = ec.CountWordErrors(ocr_text=dot_str, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=1, fp=1, truth_count=2, test_count=2))\n    counts = ec.CountWordErrors(ocr_text=truth_str, truth_text=dot_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=1, fp=1, truth_count=2, test_count=2))\n    # Space is special.\n    no_space = 'farmbarn'\n    counts = ec.CountWordErrors(ocr_text=no_space, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=2, fp=1, 
truth_count=2, test_count=1))\n    counts = ec.CountWordErrors(ocr_text=truth_str, truth_text=no_space)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=1, fp=2, truth_count=1, test_count=2))\n    # Lose them all.\n    counts = ec.CountWordErrors(ocr_text='', truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=2, fp=0, truth_count=2, test_count=0))\n    counts = ec.CountWordErrors(ocr_text=truth_str, truth_text='')\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=0, fp=2, truth_count=0, test_count=2))\n    # With a space in ba rn, there is an extra add.\n    sp_str = 'farm ba rn'\n    counts = ec.CountWordErrors(ocr_text=sp_str, truth_text=truth_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=1, fp=2, truth_count=2, test_count=3))\n    counts = ec.CountWordErrors(ocr_text=truth_str, truth_text=sp_str)\n    self.assertEqual(\n        counts, ec.ErrorCounts(\n            fn=2, fp=1, truth_count=3, test_count=2))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/street/python/nn_ops.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Ops and utilities for neural networks.\n\nFor now, just an LSTM layer.\n\"\"\"\nimport shapes\nimport tensorflow as tf\nrnn = tf.load_op_library(\"../cc/rnn_ops.so\")\n\n\ndef rnn_helper(inp,\n               length,\n               cell_type=None,\n               direction=\"forward\",\n               name=None,\n               *args,\n               **kwargs):\n  \"\"\"Adds ops for a recurrent neural network layer.\n\n  This function calls an actual implementation of a recurrent neural network\n  based on `cell_type`.\n\n  There are three modes depending on the value of `direction`:\n\n    forward: Adds a forward RNN.\n    backward: Adds a backward RNN.\n    bidirectional: Adds both forward and backward RNNs and creates a\n                   bidirectional RNN.\n\n  Args:\n    inp: A 3-D tensor of shape [`batch_size`, `max_length`, `feature_dim`].\n    length: A 1-D tensor of shape [`batch_size`] and type int64. Each element\n            represents the length of the corresponding sequence in `inp`.\n    cell_type: Cell type of RNN. 
Currently can only be \"lstm\".\n    direction: One of \"forward\", \"backward\", \"bidirectional\".\n    name: Name of the op.\n    *args: Other arguments to the layer.\n    **kwargs: Keyword arguments to the layer.\n\n  Returns:\n    A 3-D tensor of shape [`batch_size`, `max_length`, `num_nodes`].\n  \"\"\"\n\n  assert cell_type is not None\n  rnn_func = None\n  if cell_type == \"lstm\":\n    rnn_func = lstm_layer\n  assert rnn_func is not None\n  assert direction in [\"forward\", \"backward\", \"bidirectional\"]\n\n  with tf.variable_scope(name):\n    if direction in [\"forward\", \"bidirectional\"]:\n      forward = rnn_func(\n          inp=inp,\n          length=length,\n          backward=False,\n          name=\"forward\",\n          *args,\n          **kwargs)\n      if isinstance(forward, tuple):\n        # lstm_layer returns a tuple (output, memory). We only need the first\n        # element.\n        forward = forward[0]\n    if direction in [\"backward\", \"bidirectional\"]:\n      backward = rnn_func(\n          inp=inp,\n          length=length,\n          backward=True,\n          name=\"backward\",\n          *args,\n          **kwargs)\n      if isinstance(backward, tuple):\n        # lstm_layer returns a tuple (output, memory). 
We only need the first\n        # element.\n        backward = backward[0]\n    if direction == \"forward\":\n      out = forward\n    elif direction == \"backward\":\n      out = backward\n    else:\n      out = tf.concat(2, [forward, backward])\n  return out\n\n\n@tf.RegisterShape(\"VariableLSTM\")\ndef _variable_lstm_shape(op):\n  \"\"\"Shape function for the VariableLSTM op.\"\"\"\n  input_shape = op.inputs[0].get_shape().with_rank(4)\n  state_shape = op.inputs[1].get_shape().with_rank(2)\n  memory_shape = op.inputs[2].get_shape().with_rank(2)\n  w_m_m_shape = op.inputs[3].get_shape().with_rank(3)\n  batch_size = input_shape[0].merge_with(state_shape[0])\n  batch_size = input_shape[0].merge_with(memory_shape[0])\n  seq_len = input_shape[1]\n  gate_num = input_shape[2].merge_with(w_m_m_shape[1])\n  output_dim = input_shape[3].merge_with(state_shape[1])\n  output_dim = output_dim.merge_with(memory_shape[1])\n  output_dim = output_dim.merge_with(w_m_m_shape[0])\n  output_dim = output_dim.merge_with(w_m_m_shape[2])\n  return [[batch_size, seq_len, output_dim],\n          [batch_size, seq_len, gate_num, output_dim],\n          [batch_size, seq_len, output_dim]]\n\n\n@tf.RegisterGradient(\"VariableLSTM\")\ndef _variable_lstm_grad(op, act_grad, gate_grad, mem_grad):\n  \"\"\"Gradient function for the VariableLSTM op.\"\"\"\n  initial_state = op.inputs[1]\n  initial_memory = op.inputs[2]\n  w_m_m = op.inputs[3]\n  act = op.outputs[0]\n  gate_raw_act = op.outputs[1]\n  memory = op.outputs[2]\n  return rnn.variable_lstm_grad(initial_state, initial_memory, w_m_m, act,\n                                gate_raw_act, memory, act_grad, gate_grad,\n                                mem_grad)\n\n\ndef lstm_layer(inp,\n               length=None,\n               state=None,\n               memory=None,\n               num_nodes=None,\n               backward=False,\n               clip=50.0,\n               reg_func=tf.nn.l2_loss,\n               weight_reg=False,\n               
weight_collection=\"LSTMWeights\",\n               bias_reg=False,\n               stddev=None,\n               seed=None,\n               decode=False,\n               use_native_weights=False,\n               name=None):\n  \"\"\"Adds ops for an LSTM layer.\n\n  This adds ops for the following operations:\n\n    input => (forward-LSTM|backward-LSTM) => output\n\n  The direction of the LSTM is determined by `backward`. If it is false, the\n  forward LSTM is used, the backward one otherwise.\n\n  Args:\n    inp: A 3-D tensor of shape [`batch_size`, `max_length`, `feature_dim`].\n    length: A 1-D tensor of shape [`batch_size`] and type int64. Each element\n            represents the length of the corresponding sequence in `inp`.\n    state: If specified, uses it as the initial state.\n    memory: If specified, uses it as the initial memory.\n    num_nodes: The number of LSTM cells.\n    backward: If true, reverses the `inp` before adding the ops. The output is\n              also reversed so that the direction is the same as `inp`.\n    clip: Value used to clip the cell values.\n    reg_func: Function used for the weight regularization such as\n              `tf.nn.l2_loss`.\n    weight_reg: If true, regularize the filter weights with `reg_func`.\n    weight_collection: Collection to add the weights to for regularization.\n    bias_reg: If true, regularize the bias vector with `reg_func`.\n    stddev: Standard deviation used to initialize the variables.\n    seed: Seed used to initialize the variables.\n    decode: If true, does not add ops which are not used for inference.\n    use_native_weights: If true, uses weights in the same format as the native\n                        implementations.\n    name: Name of the op.\n\n  Returns:\n    A 3-D tensor of shape [`batch_size`, `max_length`, `num_nodes`].\n  \"\"\"\n  with tf.variable_scope(name):\n    if backward:\n      if length is None:\n        inp = tf.reverse(inp, [False, True, False])\n      else:\n        inp 
= tf.reverse_sequence(inp, length, 1, 0)\n\n    num_prev = inp.get_shape()[2]\n    if stddev:\n      initializer = tf.truncated_normal_initializer(stddev=stddev, seed=seed)\n    else:\n      initializer = tf.uniform_unit_scaling_initializer(seed=seed)\n\n    if use_native_weights:\n      with tf.variable_scope(\"LSTMCell\"):\n        w = tf.get_variable(\n            \"W_0\",\n            shape=[num_prev + num_nodes, 4 * num_nodes],\n            initializer=initializer,\n            dtype=tf.float32)\n        w_i_m = tf.slice(w, [0, 0], [num_prev, 4 * num_nodes], name=\"w_i_m\")\n        w_m_m = tf.reshape(\n            tf.slice(w, [num_prev, 0], [num_nodes, 4 * num_nodes]),\n            [num_nodes, 4, num_nodes],\n            name=\"w_m_m\")\n    else:\n      w_i_m = tf.get_variable(\"w_i_m\", [num_prev, 4 * num_nodes],\n                              initializer=initializer)\n      w_m_m = tf.get_variable(\"w_m_m\", [num_nodes, 4, num_nodes],\n                              initializer=initializer)\n\n    if not decode and weight_reg:\n      tf.add_to_collection(weight_collection, reg_func(w_i_m, name=\"w_i_m_reg\"))\n      tf.add_to_collection(weight_collection, reg_func(w_m_m, name=\"w_m_m_reg\"))\n\n    batch_size = shapes.tensor_dim(inp, dim=0)\n    num_frames = shapes.tensor_dim(inp, dim=1)\n    prev = tf.reshape(inp, tf.pack([batch_size * num_frames, num_prev]))\n\n    if use_native_weights:\n      with tf.variable_scope(\"LSTMCell\"):\n        b = tf.get_variable(\n            \"B\",\n            shape=[4 * num_nodes],\n            initializer=tf.zeros_initializer,\n            dtype=tf.float32)\n      biases = tf.identity(b, name=\"biases\")\n    else:\n      biases = tf.get_variable(\n          \"biases\", [4 * num_nodes], initializer=tf.constant_initializer(0.0))\n    if not decode and bias_reg:\n      tf.add_to_collection(\n          weight_collection, reg_func(\n              biases, name=\"biases_reg\"))\n    prev = tf.nn.xw_plus_b(prev, w_i_m, 
biases)\n\n    prev = tf.reshape(prev, tf.pack([batch_size, num_frames, 4, num_nodes]))\n    if state is None:\n      state = tf.fill(tf.pack([batch_size, num_nodes]), 0.0)\n    if memory is None:\n      memory = tf.fill(tf.pack([batch_size, num_nodes]), 0.0)\n\n    out, _, mem = rnn.variable_lstm(prev, state, memory, w_m_m, clip=clip)\n\n    if backward:\n      if length is None:\n        out = tf.reverse(out, [False, True, False])\n      else:\n        out = tf.reverse_sequence(out, length, 1, 0)\n\n  return out, mem\n"
  },
  {
    "path": "model_zoo/models/street/python/shapes.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Shape manipulation functions.\n\nrotate_dimensions: prepares for a rotating transpose by returning a rotated\n  list of dimension indices.\ntransposing_reshape: allows a dimension to be factorized, with one of the pieces\n  transferred to another dimension, or to transpose factors within a single\n  dimension.\ntensor_dim: gets a shape dimension as a constant integer if known otherwise a\n  runtime usable tensor value.\ntensor_shape: returns the full shape of a tensor as the tensor_dim.\n\"\"\"\nimport tensorflow as tf\n\n\ndef rotate_dimensions(num_dims, src_dim, dest_dim):\n  \"\"\"Returns a list of dimension indices that will rotate src_dim to dest_dim.\n\n  src_dim is moved to dest_dim, with all intervening dimensions shifted towards\n  the hole left by src_dim. 
Eg:\n  num_dims = 4, src_dim=3, dest_dim=1\n  Returned list=[0, 3, 1, 2]\n  For a tensor with dims=[5, 4, 3, 2] a transpose would yield [5, 2, 4, 3].\n  Args:\n    num_dims: The number of dimensions to handle.\n    src_dim:  The dimension to move.\n    dest_dim: The dimension to move src_dim to.\n\n  Returns:\n    A list of rotated dimension indices.\n  \"\"\"\n  # List of dimensions for transpose.\n  dim_list = range(num_dims)\n  # Shuffle src_dim to dest_dim by swapping to shuffle up the other dims.\n  step = 1 if dest_dim > src_dim else -1\n  for x in xrange(src_dim, dest_dim, step):\n    dim_list[x], dim_list[x + step] = dim_list[x + step], dim_list[x]\n  return dim_list\n\n\ndef transposing_reshape(tensor,\n                        src_dim,\n                        part_a,\n                        part_b,\n                        dest_dim_a,\n                        dest_dim_b,\n                        name=None):\n  \"\"\"Splits src_dim and sends one of the pieces to another dim.\n\n  Terminology:\n  A matrix is often described as 'row-major' or 'column-major', which doesn't\n  help if you can't remember which is the row index and which is the column,\n  even if you know what 'major' means, so here is a simpler explanation of it:\n  When TF stores a tensor of size [d0, d1, d2, d3] indexed by [i0, i1, i2, i3],\n  the memory address of an element is calculated using:\n  ((i0 * d1 + i1) * d2 + i2) * d3 + i3, so, d0 is the MOST SIGNIFICANT dimension\n  and d3 the LEAST SIGNIFICANT, just like in the decimal number 1234, 1 is the\n  most significant digit and 4 the least significant. 
In both cases the most\n  significant is multiplied by the largest number to determine its 'value'.\n  Furthermore, if we reshape the tensor to [d0'=d0, d1'=d1 x d2, d2'=d3], then\n  the MOST SIGNIFICANT part of d1' is d1 and the LEAST SIGNIFICANT part of d1'\n  is d2.\n\n  Action:\n  transposing_reshape splits src_dim into factors [part_a, part_b], and sends\n  the most significant part (of size  part_a) to be the most significant part of\n  dest_dim_a*(Exception: see NOTE 2), and the least significant part (of size\n  part_b) to be the most significant part of dest_dim_b.\n  This is basically a combination of reshape, rotating transpose, reshape.\n  NOTE1: At least one of dest_dim_a and dest_dim_b must equal src_dim, ie one of\n  the parts always stays put, so src_dim is never totally destroyed and the\n  output number of dimensions is always the same as the input.\n  NOTE2: If dest_dim_a == dest_dim_b == src_dim, then parts a and b are simply\n  transposed within src_dim to become part_b x part_a, so the most significant\n  part becomes the least significant part and vice versa. Thus if you really\n  wanted to make one of the parts the least significant side of the destination,\n  the destination dimension can be internally transposed with a second call to\n  transposing_reshape.\n  NOTE3: One of part_a and part_b may be -1 to allow src_dim to be of unknown\n  size with one known-size factor. 
Otherwise part_a * part_b must equal the size\n  of src_dim.\n  NOTE4: The reshape preserves as many known-at-graph-build-time dimension sizes\n  as are available.\n\n  Example:\n  Input dims=[5, 2, 6, 2]\n  tensor=[[[[0, 1][2, 3][4, 5][6, 7][8, 9][10, 11]]\n           [[12, 13][14, 15][16, 17][18, 19][20, 21][22, 23]]\n          [[[24, 25]...\n  src_dim=2, part_a=2, part_b=3, dest_dim_a=3, dest_dim_b=2\n  output dims =[5, 2, 3, 4]\n  output tensor=[[[[0, 1, 6, 7][2, 3, 8, 9][4, 5, 10, 11]]\n                  [[12, 13, 18, 19][14, 15, 20, 21][16, 17, 22, 23]]]\n                 [[[24, 26, 28]...\n  Example2:\n  Input dims=[phrases, words, letters]=[2, 6, x]\n  tensor=[[[the][cat][sat][on][the][mat]]\n         [[a][stitch][in][time][saves][nine]]]\n  We can factorize the 6 words into 3x2 = [[the][cat]][[sat][on]][[the][mat]]\n  or 2x3=[[the][cat][sat]][[on][the][mat]] and\n  src_dim=1, part_a=3, part_b=2, dest_dim_a=1, dest_dim_b=1\n  would yield:\n  [[[the][sat][the][cat][on][mat]]\n   [[a][in][saves][stitch][time][nine]]], but\n  src_dim=1, part_a=2, part_b=3, dest_dim_a=1, dest_dim_b=1\n  would yield:\n  [[[the][on][cat][the][sat][mat]]\n   [[a][time][stitch][saves][in][nine]]], and\n  src_dim=1, part_a=2, part_b=3, dest_dim_a=0, dest_dim_b=1\n  would yield:\n  [[[the][cat][sat]]\n   [[a][stitch][in]]\n   [[on][the][mat]]\n   [[time][saves][nine]]]\n  Now remember that the words above represent any least-significant subset of\n  the input dimensions.\n\n  Args:\n    tensor:     A tensor to reshape.\n    src_dim:    The dimension to split.\n    part_a:     The first factor of the split.\n    part_b:     The second factor of the split.\n    dest_dim_a: The dimension to move part_a of src_dim to.\n    dest_dim_b: The dimension to move part_b of src_dim to.\n    name:       Optional base name for all the ops.\n\n  Returns:\n    Reshaped tensor.\n\n  Raises:\n    ValueError: If the args are invalid.\n  \"\"\"\n  if dest_dim_a != src_dim and dest_dim_b != src_dim:\n    
raise ValueError(\n        'At least one of dest_dim_a, dest_dim_b must equal src_dim!')\n  if part_a == 0 or part_b == 0:\n    raise ValueError('Zero not allowed for part_a or part_b!')\n  if part_a < 0 and part_b < 0:\n    raise ValueError('At least one of part_a and part_b must be positive!')\n  if not name:\n    name = 'transposing_reshape'\n  prev_shape = tensor_shape(tensor)\n  expanded = tf.reshape(\n      tensor,\n      prev_shape[:src_dim] + [part_a, part_b] + prev_shape[src_dim + 1:],\n      name=name + '_reshape_in')\n  dest = dest_dim_b\n  if dest_dim_a != src_dim:\n    # We are just moving part_a to dest_dim_a.\n    dest = dest_dim_a\n  else:\n    # We are moving part_b to dest_dim_b.\n    src_dim += 1\n  dim_list = rotate_dimensions(len(expanded.get_shape()), src_dim, dest)\n  expanded = tf.transpose(expanded, dim_list, name=name + '_rot_transpose')\n  # Reshape identity except dest,dest+1, which get merged.\n  ex_shape = tensor_shape(expanded)\n  combined = ex_shape[dest] * ex_shape[dest + 1]\n  return tf.reshape(\n      expanded,\n      ex_shape[:dest] + [combined] + ex_shape[dest + 2:],\n      name=name + '_reshape_out')\n\n\ndef tensor_dim(tensor, dim):\n  \"\"\"Returns int dimension if known at a graph build time else a tensor.\n\n  If the size of the dim of tensor is known at graph building time, then that\n  known value is returned, otherwise (instead of None), a Tensor that will give\n  the size of the dimension when the graph is run. The return value will be\n  accepted by tf.reshape in multiple (or even all) dimensions, even when the\n  sizes are not known at graph building time, unlike -1, which can only be used\n  in one dimension. It is a bad idea to use tf.shape all the time, as some ops\n  demand a known (at graph build time) size. 
This function therefore returns\n  the best available, most useful dimension size.\n  Args:\n    tensor: Input tensor.\n    dim:    Dimension to find the size of.\n\n  Returns:\n    An integer if shape is known at build time, otherwise a tensor of int32.\n  \"\"\"\n  result = tensor.get_shape().as_list()[dim]\n  if result is None:\n    result = tf.shape(tensor)[dim]\n  return result\n\n\ndef tensor_shape(tensor):\n  \"\"\"Returns a heterogeneous list of tensor_dim for the tensor.\n\n  See tensor_dim for a more detailed explanation.\n  Args:\n    tensor: Input tensor.\n\n  Returns:\n    A heterogeneous list of integers and int32 tensors.\n  \"\"\"\n  result = []\n  for d in xrange(len(tensor.get_shape())):\n    result.append(tensor_dim(tensor, d))\n  return result\n"
  },
  {
    "path": "model_zoo/models/street/python/shapes_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for shapes.\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\nimport shapes\n\n\ndef _rand(*size):\n  return np.random.uniform(size=size).astype('f')\n\n\nclass ShapesTest(tf.test.TestCase):\n  \"\"\"Tests just the shapes from a call to transposing_reshape.\"\"\"\n\n  def __init__(self, other):\n    super(ShapesTest, self).__init__(other)\n    self.batch_size = 4\n    self.im_height = 24\n    self.im_width = 36\n    self.depth = 20\n\n  def testReshapeTile(self):\n    \"\"\"Tests that a tiled input can be reshaped to the batch dimension.\"\"\"\n    fake = tf.placeholder(\n        tf.float32, shape=(None, None, None, self.depth), name='inputs')\n    real = _rand(self.batch_size, self.im_height, self.im_width, self.depth)\n    with self.test_session() as sess:\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=2, part_a=3, part_b=-1, dest_dim_a=0, dest_dim_b=2)\n      res_image = sess.run([outputs], feed_dict={fake: real})\n      self.assertEqual(\n          tuple(res_image[0].shape),\n          (self.batch_size * 3, self.im_height, self.im_width / 3, self.depth))\n\n  def testReshapeDepth(self):\n    \"\"\"Tests that depth can be reshaped to the x dimension.\"\"\"\n    fake = tf.placeholder(\n        tf.float32, 
shape=(None, None, None, self.depth), name='inputs')\n    real = _rand(self.batch_size, self.im_height, self.im_width, self.depth)\n    with self.test_session() as sess:\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=3, part_a=4, part_b=-1, dest_dim_a=2, dest_dim_b=3)\n      res_image = sess.run([outputs], feed_dict={fake: real})\n      self.assertEqual(\n          tuple(res_image[0].shape),\n          (self.batch_size, self.im_height, self.im_width * 4, self.depth / 4))\n\n\nclass DataTest(tf.test.TestCase):\n  \"\"\"Tests that the data is moved correctly in a call to transposing_reshape.\n\n  \"\"\"\n\n  def testTransposingReshape_2_2_3_2_1(self):\n    \"\"\"Case: dest_a == src, dest_b < src: Split with Least sig part going left.\n    \"\"\"\n    with self.test_session() as sess:\n      fake = tf.placeholder(\n          tf.float32, shape=(None, None, None, 2), name='inputs')\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=2, part_a=2, part_b=3, dest_dim_a=2, dest_dim_b=1)\n      # Make real inputs. 
The tensor looks like this:\n      # tensor=[[[[0, 1][2, 3][4, 5][6, 7][8, 9][10, 11]]\n      #          [[12, 13][14, 15][16, 17][18, 19][20, 21][22, 23]]\n      #         [[[24, 25]...\n      real = np.arange(120).reshape((5, 2, 6, 2))\n      np_array = sess.run([outputs], feed_dict={fake: real})[0]\n      self.assertEqual(tuple(np_array.shape), (5, 6, 2, 2))\n      self.assertAllEqual(np_array[0, :, :, :],\n                          [[[0, 1], [6, 7]], [[12, 13], [18, 19]],\n                           [[2, 3], [8, 9]], [[14, 15], [20, 21]],\n                           [[4, 5], [10, 11]], [[16, 17], [22, 23]]])\n\n  def testTransposingReshape_2_2_3_2_3(self):\n    \"\"\"Case: dest_a == src, dest_b > src: Split with Least sig part going right.\n    \"\"\"\n    with self.test_session() as sess:\n      fake = tf.placeholder(\n          tf.float32, shape=(None, None, None, 2), name='inputs')\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=2, part_a=2, part_b=3, dest_dim_a=2, dest_dim_b=3)\n      # Make real inputs. The tensor looks like this:\n      # tensor=[[[[0, 1][2, 3][4, 5][6, 7][8, 9][10, 11]]\n      #          [[12, 13][14, 15][16, 17][18, 19][20, 21][22, 23]]\n      #         [[[24, 25]...\n      real = np.arange(120).reshape((5, 2, 6, 2))\n      np_array = sess.run([outputs], feed_dict={fake: real})[0]\n      self.assertEqual(tuple(np_array.shape), (5, 2, 2, 6))\n      self.assertAllEqual(\n          np_array[0, :, :, :],\n          [[[0, 1, 2, 3, 4, 5], [6, 7, 8, 9, 10, 11]],\n           [[12, 13, 14, 15, 16, 17], [18, 19, 20, 21, 22, 23]]])\n\n  def testTransposingReshape_2_2_3_2_2(self):\n    \"\"\"Case: dest_a == src, dest_b == src. 
Transpose within dimension 2.\n    \"\"\"\n    with self.test_session() as sess:\n      fake = tf.placeholder(\n          tf.float32, shape=(None, None, None, 2), name='inputs')\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=2, part_a=2, part_b=3, dest_dim_a=2, dest_dim_b=2)\n      # Make real inputs. The tensor looks like this:\n      # tensor=[[[[0, 1][2, 3][4, 5][6, 7][8, 9][10, 11]]\n      #          [[12, 13][14, 15][16, 17][18, 19][20, 21][22, 23]]\n      #         [[[24, 25]...\n      real = np.arange(120).reshape((5, 2, 6, 2))\n      np_array = sess.run([outputs], feed_dict={fake: real})[0]\n      self.assertEqual(tuple(np_array.shape), (5, 2, 6, 2))\n      self.assertAllEqual(\n          np_array[0, :, :, :],\n          [[[0, 1], [6, 7], [2, 3], [8, 9], [4, 5], [10, 11]],\n           [[12, 13], [18, 19], [14, 15], [20, 21], [16, 17], [22, 23]]])\n\n  def testTransposingReshape_2_2_3_1_2(self):\n    \"\"\"Case: dest_a < src, dest_b == src. Split with Most sig part going left.\n    \"\"\"\n    with self.test_session() as sess:\n      fake = tf.placeholder(\n          tf.float32, shape=(None, None, None, 2), name='inputs')\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=2, part_a=2, part_b=3, dest_dim_a=1, dest_dim_b=2)\n      # Make real inputs. 
The tensor looks like this:\n      # tensor=[[[[0, 1][2, 3][4, 5][6, 7][8, 9][10, 11]]\n      #          [[12, 13][14, 15][16, 17][18, 19][20, 21][22, 23]]\n      #         [[[24, 25]...\n      real = np.arange(120).reshape((5, 2, 6, 2))\n      np_array = sess.run([outputs], feed_dict={fake: real})[0]\n      self.assertEqual(tuple(np_array.shape), (5, 4, 3, 2))\n      self.assertAllEqual(np_array[0, :, :, :],\n                          [[[0, 1], [2, 3], [4, 5]],\n                           [[12, 13], [14, 15], [16, 17]],\n                           [[6, 7], [8, 9], [10, 11]],\n                           [[18, 19], [20, 21], [22, 23]]])\n\n  def testTransposingReshape_2_2_3_3_2(self):\n    \"\"\"Case: dest_a > src, dest_b == src. Split with Most sig part going right.\n    \"\"\"\n    with self.test_session() as sess:\n      fake = tf.placeholder(\n          tf.float32, shape=(None, None, None, 2), name='inputs')\n      outputs = shapes.transposing_reshape(\n          fake, src_dim=2, part_a=2, part_b=3, dest_dim_a=3, dest_dim_b=2)\n      # Make real inputs. The tensor looks like this:\n      # tensor=[[[[0, 1][2, 3][4, 5][6, 7][8, 9][10, 11]]\n      #          [[12, 13][14, 15][16, 17][18, 19][20, 21][22, 23]]\n      #         [[[24, 25]...\n      real = np.arange(120).reshape((5, 2, 6, 2))\n      np_array = sess.run([outputs], feed_dict={fake: real})[0]\n      self.assertEqual(tuple(np_array.shape), (5, 2, 3, 4))\n      self.assertAllEqual(\n          np_array[0, :, :, :],\n          [[[0, 1, 6, 7], [2, 3, 8, 9], [4, 5, 10, 11]],\n           [[12, 13, 18, 19], [14, 15, 20, 21], [16, 17, 22, 23]]])\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/street/python/vgsl_eval.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Model eval separate from training.\"\"\"\nfrom tensorflow import app\nfrom tensorflow.python.platform import flags\n\nimport vgsl_model\n\nflags.DEFINE_string('eval_dir', '/tmp/mdir/eval',\n                    'Directory where to write event logs.')\nflags.DEFINE_string('graph_def_file', None,\n                    'Output eval graph definition file.')\nflags.DEFINE_string('train_dir', '/tmp/mdir',\n                    'Directory where to find training checkpoints.')\nflags.DEFINE_string('model_str',\n                    '1,150,600,3[S2(4x150)0,2 Ct5,5,16 Mp2,2 Ct5,5,64 Mp3,3'\n                    '([Lrys64 Lbx128][Lbys64 Lbx128][Lfys64 Lbx128])S3(3x0)2,3'\n                    'Lfx128 Lrx128 S0(1x4)0,3 Do Lfx256]O1c134',\n                    'Network description.')\nflags.DEFINE_integer('num_steps', 1000, 'Number of steps to run evaluation.')\nflags.DEFINE_integer('eval_interval_secs', 60,\n                     'Time interval between eval runs.')\nflags.DEFINE_string('eval_data', None, 'Evaluation data filepattern')\nflags.DEFINE_string('decoder', None, 'Charset decoder')\n\nFLAGS = flags.FLAGS\n\n\ndef main(argv):\n  del argv\n  vgsl_model.Eval(FLAGS.train_dir, FLAGS.eval_dir, FLAGS.model_str,\n                  FLAGS.eval_data, FLAGS.decoder, 
FLAGS.num_steps,\n                  FLAGS.graph_def_file, FLAGS.eval_interval_secs)\n\n\nif __name__ == '__main__':\n  app.run()\n"
  },
  {
    "path": "model_zoo/models/street/python/vgsl_input.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"String network description language to define network layouts.\"\"\"\nimport collections\nimport tensorflow as tf\nfrom tensorflow.python.ops import parsing_ops\n\n# Named tuple for the standard tf image tensor Shape.\n# batch_size:     Number of images to batch-up for training.\n# height:         Fixed height of image or None for variable.\n# width:          Fixed width of image or None for variable.\n# depth:          Desired depth in bytes per pixel of input images.\nImageShape = collections.namedtuple('ImageTensorDims',\n                                    ['batch_size', 'height', 'width', 'depth'])\n\n\ndef ImageInput(input_pattern, num_threads, shape, using_ctc, reader=None):\n  \"\"\"Creates an input image tensor from the input_pattern filenames.\n\n  TODO(rays) Expand for 2-d labels, 0-d labels, and logistic targets.\n  Args:\n    input_pattern:  Filenames of the dataset(s) to read.\n    num_threads:    Number of preprocessing threads.\n    shape:          ImageShape with the desired shape of the input.\n    using_ctc:      Take the unpadded_class labels instead of padded.\n    reader:         Function that returns an actual reader to read Examples from\n      input files. 
If None, uses tf.TFRecordReader().\n  Returns:\n    images:   Float Tensor containing the input image scaled to [-1.28, 1.27].\n    heights:  Tensor int64 containing the heights of the images.\n    widths:   Tensor int64 containing the widths of the images.\n    labels:   Serialized SparseTensor containing the int64 labels.\n    sparse_labels:   Serialized SparseTensor containing the int64 labels.\n    truths:   Tensor string of the utf8 truth texts.\n  Raises:\n    ValueError: if the optimizer type is unrecognized.\n  \"\"\"\n  data_files = tf.gfile.Glob(input_pattern)\n  assert data_files, 'no files found for dataset ' + input_pattern\n  queue_capacity = shape.batch_size * num_threads * 2\n  filename_queue = tf.train.string_input_producer(\n      data_files, capacity=queue_capacity)\n\n  # Create a subgraph with its own reader (but sharing the\n  # filename_queue) for each preprocessing thread.\n  images_and_label_lists = []\n  for _ in range(num_threads):\n    image, height, width, labels, text = _ReadExamples(filename_queue, shape,\n                                                       using_ctc, reader)\n    images_and_label_lists.append([image, height, width, labels, text])\n  # Create a queue that produces the examples in batches.\n  images, heights, widths, labels, truths = tf.train.batch_join(\n      images_and_label_lists,\n      batch_size=shape.batch_size,\n      capacity=16 * shape.batch_size,\n      dynamic_pad=True)\n  # Deserialize back to sparse, because the batcher doesn't do sparse.\n  labels = tf.deserialize_many_sparse(labels, tf.int64)\n  sparse_labels = tf.cast(labels, tf.int32)\n  labels = tf.sparse_tensor_to_dense(labels)\n  labels = tf.reshape(labels, [shape.batch_size, -1], name='Labels')\n  # Crush the other shapes to just the batch dimension.\n  heights = tf.reshape(heights, [-1], name='Heights')\n  widths = tf.reshape(widths, [-1], name='Widths')\n  truths = tf.reshape(truths, [-1], name='Truths')\n  # Give the images a nice name as 
well.\n  images = tf.identity(images, name='Images')\n\n  tf.image_summary('Images', images)\n  return images, heights, widths, labels, sparse_labels, truths\n\n\ndef _ReadExamples(filename_queue, shape, using_ctc, reader=None):\n  \"\"\"Builds network input tensor ops for TF Example.\n\n  Args:\n    filename_queue: Queue of filenames, from tf.train.string_input_producer\n    shape:          ImageShape with the desired shape of the input.\n    using_ctc:      Take the unpadded_class labels instead of padded.\n    reader:         Function that returns an actual reader to read Examples from\n      input files. If None, uses tf.TFRecordReader().\n  Returns:\n    image:   Float Tensor containing the input image scaled to [-1.28, 1.27].\n    height:  Tensor int64 containing the height of the image.\n    width:   Tensor int64 containing the width of the image.\n    labels:  Serialized SparseTensor containing the int64 labels.\n    text:    Tensor string of the utf8 truth text.\n  \"\"\"\n  if reader:\n    reader = reader()\n  else:\n    reader = tf.TFRecordReader()\n  _, example_serialized = reader.read(filename_queue)\n  example_serialized = tf.reshape(example_serialized, shape=[])\n  features = tf.parse_single_example(\n      example_serialized,\n      {'image/encoded': parsing_ops.FixedLenFeature(\n          [1], dtype=tf.string, default_value=''),\n       'image/text': parsing_ops.FixedLenFeature(\n           [1], dtype=tf.string, default_value=''),\n       'image/class': parsing_ops.VarLenFeature(dtype=tf.int64),\n       'image/unpadded_class': parsing_ops.VarLenFeature(dtype=tf.int64),\n       'image/height': parsing_ops.FixedLenFeature(\n           [1], dtype=tf.int64, default_value=1),\n       'image/width': parsing_ops.FixedLenFeature(\n           [1], dtype=tf.int64, default_value=1)})\n  if using_ctc:\n    labels = features['image/unpadded_class']\n  else:\n    labels = features['image/class']\n  labels = tf.serialize_sparse(labels)\n  image = 
tf.reshape(features['image/encoded'], shape=[], name='encoded')\n  image = _ImageProcessing(image, shape)\n  height = tf.reshape(features['image/height'], [-1])\n  width = tf.reshape(features['image/width'], [-1])\n  text = tf.reshape(features['image/text'], shape=[])\n\n  return image, height, width, labels, text\n\n\ndef _ImageProcessing(image_buffer, shape):\n  \"\"\"Convert a PNG string into an input tensor.\n\n  We allow for fixed and variable sizes.\n  Does fixed conversion to floats in the range [-1.28, 1.27].\n  Args:\n    image_buffer: Tensor containing a PNG encoded image.\n    shape:          ImageShape with the desired shape of the input.\n  Returns:\n    image:        Decoded, normalized image in the range [-1.28, 1.27].\n  \"\"\"\n  image = tf.image.decode_png(image_buffer, channels=shape.depth)\n  image.set_shape([shape.height, shape.width, shape.depth])\n  image = tf.cast(image, tf.float32)\n  image = tf.sub(image, 128.0)\n  image = tf.mul(image, 1 / 100.0)\n  return image\n"
  },
  {
    "path": "model_zoo/models/street/python/vgsl_model.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"String network description language to define network layouts.\"\"\"\nimport re\nimport time\n\nimport decoder\nimport errorcounter as ec\nimport shapes\nimport tensorflow as tf\nimport vgsl_input\nimport vgslspecs\nimport tensorflow.contrib.slim as slim\nfrom tensorflow.core.framework import summary_pb2\nfrom tensorflow.python.platform import tf_logging as logging\n\n\n# Parameters for rate decay.\n# We divide the learning_rate_halflife by DECAY_STEPS_FACTOR and use DECAY_RATE\n# as the decay factor for the learning rate, ie we use the DECAY_STEPS_FACTORth\n# root of 2 as the decay rate every halflife/DECAY_STEPS_FACTOR to achieve the\n# desired halflife.\nDECAY_STEPS_FACTOR = 16\nDECAY_RATE = pow(0.5, 1.0 / DECAY_STEPS_FACTOR)\n\n\ndef Train(train_dir,\n          model_str,\n          train_data,\n          max_steps,\n          master='',\n          task=0,\n          ps_tasks=0,\n          initial_learning_rate=0.001,\n          final_learning_rate=0.001,\n          learning_rate_halflife=160000,\n          optimizer_type='Adam',\n          num_preprocess_threads=1,\n          reader=None):\n  \"\"\"Testable trainer with no dependence on FLAGS.\n\n  Args:\n    train_dir: Directory to write checkpoints.\n    model_str: Network specification 
string.\n    train_data: Training data file pattern.\n    max_steps: Number of training steps to run.\n    master: Name of the TensorFlow master to use.\n    task: Task id of this replica running the training. (0 will be master).\n    ps_tasks: Number of tasks in ps job, or 0 if no ps job.\n    initial_learning_rate: Learning rate at start of training.\n    final_learning_rate: Asymptotic minimum learning rate.\n    learning_rate_halflife: Number of steps over which to halve the difference\n      between initial and final learning rate.\n    optimizer_type: One of 'GradientDescent', 'AdaGrad', 'Momentum', 'Adam'.\n    num_preprocess_threads: Number of input threads.\n    reader: Function that returns an actual reader to read Examples from input\n      files. If None, uses tf.TFRecordReader().\n  \"\"\"\n  if master.startswith('local'):\n    device = tf.ReplicaDeviceSetter(ps_tasks)\n  else:\n    device = '/cpu:0'\n  with tf.Graph().as_default():\n    with tf.device(device):\n      model = InitNetwork(train_data, model_str, 'train', initial_learning_rate,\n                          final_learning_rate, learning_rate_halflife,\n                          optimizer_type, num_preprocess_threads, reader)\n\n      # Create a Supervisor.  It will take care of initialization, summaries,\n      # checkpoints, and recovery.\n      #\n      # When multiple replicas of this program are running, the first one,\n      # identified by --task=0 is the 'chief' supervisor.  It is the only one\n      # that takes care of initialization, etc.\n      sv = tf.train.Supervisor(\n          logdir=train_dir,\n          is_chief=(task == 0),\n          saver=model.saver,\n          save_summaries_secs=10,\n          save_model_secs=30,\n          recovery_wait_secs=5)\n\n      step = 0\n      while step < max_steps:\n        try:\n          # Get an initialized, and possibly recovered session.  
Launch the\n          # services: Checkpointing, Summaries, step counting.\n          with sv.managed_session(master) as sess:\n            while step < max_steps:\n              _, step = model.TrainAStep(sess)\n              if sv.coord.should_stop():\n                break\n        except tf.errors.AbortedError as e:\n          logging.error('Received error:%s', e)\n          continue\n\n\ndef Eval(train_dir,\n         eval_dir,\n         model_str,\n         eval_data,\n         decoder_file,\n         num_steps,\n         graph_def_file=None,\n         eval_interval_secs=0,\n         reader=None):\n  \"\"\"Restores a model from a checkpoint and evaluates it.\n\n  Args:\n    train_dir: Directory to find checkpoints.\n    eval_dir: Directory to write summary events.\n    model_str: Network specification string.\n    eval_data: Evaluation data file pattern.\n    decoder_file: File to read to decode the labels.\n    num_steps: Number of eval steps to run.\n    graph_def_file: File to write graph definition to for freezing.\n    eval_interval_secs: How often to run evaluations, or once if 0.\n    reader: Function that returns an actual reader to read Examples from input\n      files. 
If None, uses tf.TFRecordReader().\n  Returns:\n    (char error rate, word recall error rate, sequence error rate) as percent.\n  Raises:\n    ValueError: If unimplemented feature is used.\n  \"\"\"\n  decode = None\n  if decoder_file:\n    decode = decoder.Decoder(decoder_file)\n\n  # Run eval.\n  rates = ec.ErrorRates(\n      label_error=None,\n      word_recall_error=None,\n      word_precision_error=None,\n      sequence_error=None)\n  with tf.Graph().as_default():\n    model = InitNetwork(eval_data, model_str, 'eval', reader=reader)\n    sw = tf.train.SummaryWriter(eval_dir)\n\n    while True:\n      sess = tf.Session('')\n      if graph_def_file is not None:\n        # Write the eval version of the graph to a file for freezing.\n        if not tf.gfile.Exists(graph_def_file):\n          with tf.gfile.FastGFile(graph_def_file, 'w') as f:\n            f.write(\n                sess.graph.as_graph_def(add_shapes=True).SerializeToString())\n      ckpt = tf.train.get_checkpoint_state(train_dir)\n      if ckpt and ckpt.model_checkpoint_path:\n        step = model.Restore(ckpt.model_checkpoint_path, sess)\n        if decode:\n          rates = decode.SoftmaxEval(sess, model, num_steps)\n          _AddRateToSummary('Label error rate', rates.label_error, step, sw)\n          _AddRateToSummary('Word recall error rate', rates.word_recall_error,\n                            step, sw)\n          _AddRateToSummary('Word precision error rate',\n                            rates.word_precision_error, step, sw)\n          _AddRateToSummary('Sequence error rate', rates.sequence_error, step,\n                            sw)\n          sw.flush()\n          print 'Error rates=', rates\n        else:\n          raise ValueError('Non-softmax decoder evaluation not implemented!')\n      if eval_interval_secs:\n        time.sleep(eval_interval_secs)\n      else:\n        break\n  return rates\n\n\ndef InitNetwork(input_pattern,\n                model_spec,\n                
mode='eval',\n                initial_learning_rate=0.00005,\n                final_learning_rate=0.00005,\n                halflife=1600000,\n                optimizer_type='Adam',\n                num_preprocess_threads=1,\n                reader=None):\n  \"\"\"Constructs a python tensor flow model defined by model_spec.\n\n  Args:\n    input_pattern: File pattern of the data in tfrecords of Example.\n    model_spec: Concatenation of input spec, model spec and output spec.\n      See Build below for input/output spec. For model spec, see vgslspecs.py\n    mode: One of 'train', 'eval'\n    initial_learning_rate: Initial learning rate for the network.\n    final_learning_rate: Final learning rate for the network.\n    halflife: Number of steps over which to halve the difference between\n              initial and final learning rate for the network.\n    optimizer_type: One of 'GradientDescent', 'AdaGrad', 'Momentum', 'Adam'.\n    num_preprocess_threads: Number of threads to use for image processing.\n    reader: Function that returns an actual reader to read Examples from input\n      files. If None, uses tf.TFRecordReader().\n    Eval tasks need only specify input_pattern and model_spec.\n\n  Returns:\n    A VGSLImageModel class.\n\n  Raises:\n    ValueError: if the model spec syntax is incorrect.\n  \"\"\"\n  model = VGSLImageModel(mode, model_spec, initial_learning_rate,\n                         final_learning_rate, halflife)\n  left_bracket = model_spec.find('[')\n  right_bracket = model_spec.rfind(']')\n  if left_bracket < 0 or right_bracket < 0:\n    raise ValueError('Failed to find [] in model spec! 
', model_spec)\n  input_spec = model_spec[:left_bracket]\n  layer_spec = model_spec[left_bracket:right_bracket + 1]\n  output_spec = model_spec[right_bracket + 1:]\n  model.Build(input_pattern, input_spec, layer_spec, output_spec,\n              optimizer_type, num_preprocess_threads, reader)\n  return model\n\n\nclass VGSLImageModel(object):\n  \"\"\"Class that builds a tensor flow model for training or evaluation.\n  \"\"\"\n\n  def __init__(self, mode, model_spec, initial_learning_rate,\n               final_learning_rate, halflife):\n    \"\"\"Constructs a VGSLImageModel.\n\n    Args:\n      mode:        One of \"train\", \"eval\"\n      model_spec:  Full model specification string, for reference only.\n      initial_learning_rate: Initial learning rate for the network.\n      final_learning_rate: Final learning rate for the network.\n      halflife: Number of steps over which to halve the difference between\n                initial and final learning rate for the network.\n    \"\"\"\n    # The string that was used to build this model.\n    self.model_spec = model_spec\n    # The layers between input and output.\n    self.layers = None\n    # The train/eval mode.\n    self.mode = mode\n    # The initial learning rate.\n    self.initial_learning_rate = initial_learning_rate\n    self.final_learning_rate = final_learning_rate\n    self.decay_steps = halflife / DECAY_STEPS_FACTOR\n    self.decay_rate = DECAY_RATE\n    # Tensor for the labels.\n    self.labels = None\n    self.sparse_labels = None\n    # Debug data containing the truth text.\n    self.truths = None\n    # Tensor for loss\n    self.loss = None\n    # Train operation\n    self.train_op = None\n    # Tensor for the global step counter\n    self.global_step = None\n    # Tensor for the output predictions (usually softmax)\n    self.output = None\n    # True if we are using CTC training mode.\n    self.using_ctc = False\n    # Saver object to load or restore the variables.\n    self.saver = None\n\n  
def Build(self, input_pattern, input_spec, model_spec, output_spec,\n            optimizer_type, num_preprocess_threads, reader):\n    \"\"\"Builds the model from the separate input/layers/output spec strings.\n\n    Args:\n      input_pattern: File pattern of the data in tfrecords of TF Example format.\n      input_spec: Specification of the input layer:\n        batchsize,height,width,depth (4 comma-separated integers)\n          Training will run with batches of batchsize images, but runtime can\n          use any batch size.\n          height and/or width can be 0 or -1, indicating variable size,\n          otherwise all images must be the given size.\n          depth must be 1 or 3 to indicate greyscale or color.\n          NOTE 1-d image input, treating the y image dimension as depth, can\n          be achieved using S1(1x0)1,3 as the first op in the model_spec, but\n          the y-size of the input must then be fixed.\n      model_spec: Model definition. See vgslspecs.py\n      output_spec: Output layer definition:\n        O(2|1|0)(l|s|c)n output layer with n classes.\n          2 (heatmap) Output is a 2-d vector map of the input (possibly at\n            different scale).\n          1 (sequence) Output is a 1-d sequence of vector values.\n          0 (value) Output is a 0-d single vector value.\n          l uses a logistic non-linearity on the output, allowing multiple\n            hot elements in any output vector value.\n          s uses a softmax non-linearity, with one-hot output in each value.\n          c uses a softmax with CTC. Can only be used with s (sequence).\n          NOTE Only O1s and O1c are currently supported.\n      optimizer_type: One of 'GradientDescent', 'AdaGrad', 'Momentum', 'Adam'.\n      num_preprocess_threads: Number of threads to use for image processing.\n      reader: Function that returns an actual reader to read Examples from input\n        files. 
If None, uses tf.TFRecordReader().\n    \"\"\"\n    self.global_step = tf.Variable(0, name='global_step', trainable=False)\n    shape = _ParseInputSpec(input_spec)\n    out_dims, out_func, num_classes = _ParseOutputSpec(output_spec)\n    self.using_ctc = out_func == 'c'\n    images, heights, widths, labels, sparse, _ = vgsl_input.ImageInput(\n        input_pattern, num_preprocess_threads, shape, self.using_ctc, reader)\n    self.labels = labels\n    self.sparse_labels = sparse\n    self.layers = vgslspecs.VGSLSpecs(widths, heights, self.mode == 'train')\n    last_layer = self.layers.Build(images, model_spec)\n    self._AddOutputs(last_layer, out_dims, out_func, num_classes)\n    if self.mode == 'train':\n      self._AddOptimizer(optimizer_type)\n\n    # For saving the model across training and evaluation\n    self.saver = tf.train.Saver()\n\n  def TrainAStep(self, sess):\n    \"\"\"Runs a training step in the session.\n\n    Args:\n      sess: Session in which to train the model.\n    Returns:\n      loss, global_step.\n    \"\"\"\n    _, loss, step = sess.run([self.train_op, self.loss, self.global_step])\n    return loss, step\n\n  def Restore(self, checkpoint_path, sess):\n    \"\"\"Restores the model from the given checkpoint path into the session.\n\n    Args:\n      checkpoint_path: File pathname of the checkpoint.\n      sess:            Session in which to restore the model.\n    Returns:\n      global_step of the model.\n    \"\"\"\n    self.saver.restore(sess, checkpoint_path)\n    return tf.train.global_step(sess, self.global_step)\n\n  def RunAStep(self, sess):\n    \"\"\"Runs a step for eval in the session.\n\n    Args:\n      sess:            Session in which to run the model.\n    Returns:\n      output tensor result, labels tensor result.\n    \"\"\"\n    return sess.run([self.output, self.labels])\n\n  def _AddOutputs(self, prev_layer, out_dims, out_func, num_classes):\n    \"\"\"Adds the output layer and loss function.\n\n    Args:\n      
prev_layer:  Output of last layer of main network.\n      out_dims:    Number of output dimensions, 0, 1 or 2.\n      out_func:    Output non-linearity. 's' or 'c'=softmax, 'l'=logistic.\n      num_classes: Number of outputs/size of last output dimension.\n    \"\"\"\n    height_in = shapes.tensor_dim(prev_layer, dim=1)\n    logits, outputs = self._AddOutputLayer(prev_layer, out_dims, out_func,\n                                           num_classes)\n    if self.mode == 'train':\n      # Setup loss for training.\n      self.loss = self._AddLossFunction(logits, height_in, out_dims, out_func)\n      tf.scalar_summary('loss', self.loss, name='loss')\n    elif out_dims == 0:\n      # Be sure the labels match the output, even in eval mode.\n      self.labels = tf.slice(self.labels, [0, 0], [-1, 1])\n      self.labels = tf.reshape(self.labels, [-1])\n\n    logging.info('Final output=%s', outputs)\n    logging.info('Labels tensor=%s', self.labels)\n    self.output = outputs\n\n  def _AddOutputLayer(self, prev_layer, out_dims, out_func, num_classes):\n    \"\"\"Add the fully-connected logits and SoftMax/Logistic output Layer.\n\n    Args:\n      prev_layer:  Output of last layer of main network.\n      out_dims:    Number of output dimensions, 0, 1 or 2.\n      out_func:    Output non-linearity. 
's' or 'c'=softmax, 'l'=logistic.\n      num_classes: Number of outputs/size of last output dimension.\n\n    Returns:\n      logits:  Pre-softmax/logistic fully-connected output shaped to out_dims.\n      outputs: Post-softmax/logistic shaped to out_dims.\n\n    Raises:\n      ValueError: if syntax is incorrect.\n    \"\"\"\n    # Reduce dimensionality appropriate to the output dimensions.\n    batch_in = shapes.tensor_dim(prev_layer, dim=0)\n    height_in = shapes.tensor_dim(prev_layer, dim=1)\n    width_in = shapes.tensor_dim(prev_layer, dim=2)\n    depth_in = shapes.tensor_dim(prev_layer, dim=3)\n    if out_dims:\n      # Combine any remaining height and width with batch and unpack after.\n      shaped = tf.reshape(prev_layer, [-1, depth_in])\n    else:\n      # Everything except batch goes to depth, and therefore has to be known.\n      shaped = tf.reshape(prev_layer, [-1, height_in * width_in * depth_in])\n    logits = slim.fully_connected(shaped, num_classes, activation_fn=None)\n    if out_func == 'l':\n      raise ValueError('Logistic not yet supported!')\n    else:\n      output = tf.nn.softmax(logits)\n    # Reshape to the desired output.\n    if out_dims == 2:\n      output_shape = [batch_in, height_in, width_in, num_classes]\n    elif out_dims == 1:\n      output_shape = [batch_in, height_in * width_in, num_classes]\n    else:\n      output_shape = [batch_in, num_classes]\n    output = tf.reshape(output, output_shape, name='Output')\n    logits = tf.reshape(logits, output_shape)\n    return logits, output\n\n  def _AddLossFunction(self, logits, height_in, out_dims, out_func):\n    \"\"\"Add the appropriate loss function.\n\n    Args:\n      logits:  Pre-softmax/logistic fully-connected output shaped to out_dims.\n      height_in:  Height of logits before going into the softmax layer.\n      out_dims:   Number of output dimensions, 0, 1 or 2.\n      out_func:   Output non-linearity. 
's' or 'c'=softmax, 'l'=logistic.\n\n    Returns:\n      loss: That which is to be minimized.\n\n    Raises:\n      ValueError: if logistic is used.\n    \"\"\"\n    if out_func == 'c':\n      # Transpose batch to the middle.\n      ctc_input = tf.transpose(logits, [1, 0, 2])\n      # Compute the widths of each batch element from the input widths.\n      widths = self.layers.GetLengths(dim=2, factor=height_in)\n      cross_entropy = tf.nn.ctc_loss(ctc_input, self.sparse_labels, widths)\n    elif out_func == 's':\n      if out_dims == 2:\n        self.labels = _PadLabels3d(logits, self.labels)\n      elif out_dims == 1:\n        self.labels = _PadLabels2d(\n            shapes.tensor_dim(\n                logits, dim=1), self.labels)\n      else:\n        self.labels = tf.slice(self.labels, [0, 0], [-1, 1])\n        self.labels = tf.reshape(self.labels, [-1])\n      cross_entropy = tf.nn.sparse_softmax_cross_entropy_with_logits(\n          logits, self.labels, name='xent')\n    else:\n      # TODO(rays) Labels need an extra dimension for logistic, so different\n      # padding functions are needed, as well as a different loss function.\n      raise ValueError('Logistic not yet supported!')\n    return tf.reduce_sum(cross_entropy)\n\n  def _AddOptimizer(self, optimizer_type):\n    \"\"\"Adds an optimizer with learning rate decay to minimize self.loss.\n\n    Args:\n      optimizer_type: One of 'GradientDescent', 'AdaGrad', 'Momentum', 'Adam'.\n    Raises:\n      ValueError: if the optimizer type is unrecognized.\n    \"\"\"\n    learn_rate_delta = self.initial_learning_rate - self.final_learning_rate\n    learn_rate_dec = tf.add(\n        tf.train.exponential_decay(learn_rate_delta, self.global_step,\n                                   self.decay_steps, self.decay_rate),\n        self.final_learning_rate)\n    if optimizer_type == 'GradientDescent':\n      opt = tf.train.GradientDescentOptimizer(learn_rate_dec)\n    elif optimizer_type == 'AdaGrad':\n      opt = 
tf.train.AdagradOptimizer(learn_rate_dec)\n    elif optimizer_type == 'Momentum':\n      opt = tf.train.MomentumOptimizer(learn_rate_dec, momentum=0.9)\n    elif optimizer_type == 'Adam':\n      opt = tf.train.AdamOptimizer(learning_rate=learn_rate_dec)\n    else:\n      raise ValueError('Invalid optimizer type: ' + optimizer_type)\n    tf.scalar_summary('learn_rate', learn_rate_dec, name='lr_summ')\n\n    self.train_op = opt.minimize(\n        self.loss, global_step=self.global_step, name='train')\n\n\ndef _PadLabels3d(logits, labels):\n  \"\"\"Pads or slices 3-d labels to match logits.\n\n  Covers the case of 2-d softmax output, when labels is [batch, height, width]\n  and logits is [batch, height, width, onehot]\n  Args:\n    logits: 4-d Pre-softmax fully-connected output.\n    labels: 3-d, but not necessarily matching in size.\n\n  Returns:\n    labels: Resized by padding or clipping to match logits.\n  \"\"\"\n  logits_shape = shapes.tensor_shape(logits)\n  labels_shape = shapes.tensor_shape(labels)\n  labels = tf.reshape(labels, [-1, labels_shape[2]])\n  labels = _PadLabels2d(logits_shape[2], labels)\n  labels = tf.reshape(labels, [labels_shape[0], -1])\n  labels = _PadLabels2d(logits_shape[1] * logits_shape[2], labels)\n  return tf.reshape(labels, [labels_shape[0], logits_shape[1], logits_shape[2]])\n\n\ndef _PadLabels2d(logits_size, labels):\n  \"\"\"Pads or slices the 2nd dimension of 2-d labels to match logits_size.\n\n  Covers the case of 1-d softmax output, when labels is [batch, seq] and\n  logits is [batch, seq, onehot]\n  Args:\n    logits_size: Tensor returned from tf.shape giving the target size.\n    labels:      2-d, but not necessarily matching in size.\n\n  Returns:\n    labels: Resized by padding or clipping the last dimension to logits_size.\n  \"\"\"\n  pad = logits_size - tf.shape(labels)[1]\n\n  def _PadFn():\n    return tf.pad(labels, [[0, 0], [0, pad]])\n\n  def _SliceFn():\n    return tf.slice(labels, [0, 0], [-1, logits_size])\n\n  
return tf.cond(tf.greater(pad, 0), _PadFn, _SliceFn)\n\n\ndef _ParseInputSpec(input_spec):\n  \"\"\"Parses input_spec and returns the numbers obtained therefrom.\n\n  Args:\n    input_spec:  Specification of the input layer. See Build.\n\n  Returns:\n    shape:      ImageShape with the desired shape of the input.\n\n  Raises:\n    ValueError: if syntax is incorrect.\n  \"\"\"\n  pattern = re.compile(R'(\\d+),(\\d+),(\\d+),(\\d+)')\n  m = pattern.match(input_spec)\n  if m is None:\n    raise ValueError('Failed to parse input spec:' + input_spec)\n  batch_size = int(m.group(1))\n  y_size = int(m.group(2)) if int(m.group(2)) > 0 else None\n  x_size = int(m.group(3)) if int(m.group(3)) > 0 else None\n  depth = int(m.group(4))\n  if depth not in [1, 3]:\n    raise ValueError('Depth must be 1 or 3, had:', depth)\n  return vgsl_input.ImageShape(batch_size, y_size, x_size, depth)\n\n\ndef _ParseOutputSpec(output_spec):\n  \"\"\"Parses the output spec.\n\n  Args:\n    output_spec: Output layer definition. See Build.\n\n  Returns:\n    out_dims:     2|1|0 for 2-d, 1-d, 0-d.\n    out_func:     l|s|c for logistic, softmax, softmax+CTC\n    num_classes:  Number of classes in output.\n\n  Raises:\n    ValueError: if syntax is incorrect.\n  \"\"\"\n  pattern = re.compile(R'(O)(0|1|2)(l|s|c)(\\d+)')\n  m = pattern.match(output_spec)\n  if m is None:\n    raise ValueError('Failed to parse output spec:' + output_spec)\n  out_dims = int(m.group(2))\n  out_func = m.group(3)\n  if out_func == 'c' and out_dims != 1:\n    raise ValueError('CTC can only be used with a 1-D sequence!')\n  num_classes = int(m.group(4))\n  return out_dims, out_func, num_classes\n\n\ndef _AddRateToSummary(tag, rate, step, sw):\n  \"\"\"Adds the given rate to the summary with the given tag.\n\n  Args:\n    tag:   Name for this value.\n    rate:  Value to add to the summary. 
Perhaps an error rate.\n    step:  Global step of the graph for the x-coordinate of the summary.\n    sw:    Summary writer to which to write the rate value.\n  \"\"\"\n  sw.add_summary(\n      summary_pb2.Summary(value=[summary_pb2.Summary.Value(\n          tag=tag, simple_value=rate)]), step)\n"
  },
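The pad-or-clip logic in `_PadLabels2d` above is easy to check outside a TF graph. A minimal numpy sketch of the same semantics (an illustrative re-implementation; `pad_labels_2d` is a hypothetical name, not part of the codebase):

```python
import numpy as np

def pad_labels_2d(logits_len, labels):
    """Pad or clip the 2nd dimension of [batch, seq] labels to logits_len."""
    batch, seq = labels.shape
    if seq < logits_len:
        # Right-pad with zeros, like tf.pad(labels, [[0, 0], [0, pad]]).
        padded = np.zeros((batch, logits_len), dtype=labels.dtype)
        padded[:, :seq] = labels
        return padded
    # Clip, like tf.slice(labels, [0, 0], [-1, logits_size]).
    return labels[:, :logits_len]

labels = np.arange(6).reshape(2, 3)
print(pad_labels_2d(4, labels))  # rows: [0 1 2 0], [3 4 5 0]
print(pad_labels_2d(2, labels))  # rows: [0 1], [3 4]
```

`_PadLabels3d` is essentially two applications of this, first over the flattened width and then over height*width, which is why it must keep rows intact.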
  {
    "path": "model_zoo/models/street/python/vgsl_model_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for vgsl_model.\"\"\"\nimport os\n\nimport numpy as np\nimport tensorflow as tf\nimport vgsl_input\nimport vgsl_model\n\n\ndef _testdata(filename):\n  return os.path.join('../testdata/', filename)\n\n\ndef _rand(*size):\n  return np.random.uniform(size=size).astype('f')\n\n\nclass VgslModelTest(tf.test.TestCase):\n\n  def testParseInputSpec(self):\n    \"\"\"The parser must return the numbers in the correct order.\n    \"\"\"\n    shape = vgsl_model._ParseInputSpec(input_spec='32,42,256,3')\n    self.assertEqual(\n        shape,\n        vgsl_input.ImageShape(\n            batch_size=32, height=42, width=256, depth=3))\n    # Nones must be inserted for zero sizes.\n    shape = vgsl_model._ParseInputSpec(input_spec='1,0,0,3')\n    self.assertEqual(\n        shape,\n        vgsl_input.ImageShape(\n            batch_size=1, height=None, width=None, depth=3))\n\n  def testParseOutputSpec(self):\n    \"\"\"The parser must return the correct args in the correct order.\n    \"\"\"\n    out_dims, out_func, num_classes = vgsl_model._ParseOutputSpec(\n        output_spec='O1c142')\n    self.assertEqual(out_dims, 1)\n    self.assertEqual(out_func, 'c')\n    self.assertEqual(num_classes, 142)\n    out_dims, out_func, num_classes = 
vgsl_model._ParseOutputSpec(\n        output_spec='O2s99')\n    self.assertEqual(out_dims, 2)\n    self.assertEqual(out_func, 's')\n    self.assertEqual(num_classes, 99)\n    out_dims, out_func, num_classes = vgsl_model._ParseOutputSpec(\n        output_spec='O0l12')\n    self.assertEqual(out_dims, 0)\n    self.assertEqual(out_func, 'l')\n    self.assertEqual(num_classes, 12)\n\n  def testPadLabels2d(self):\n    \"\"\"Must pad timesteps in labels to match logits.\n    \"\"\"\n    with self.test_session() as sess:\n      # Make placeholders for logits and labels.\n      ph_logits = tf.placeholder(tf.float32, shape=(None, None, 42))\n      ph_labels = tf.placeholder(tf.int64, shape=(None, None))\n      padded_labels = vgsl_model._PadLabels2d(tf.shape(ph_logits)[1], ph_labels)\n      # Make actual inputs.\n      real_logits = _rand(4, 97, 42)\n      real_labels = _rand(4, 85)\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (4, 97))\n      real_labels = _rand(4, 97)\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (4, 97))\n      real_labels = _rand(4, 100)\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (4, 97))\n\n  def testPadLabels3d(self):\n    \"\"\"Must pad height and width in labels to match logits.\n\n    The tricky thing with 3-d is that the rows and columns need to remain\n    intact, so we'll test it with small known data.\n    \"\"\"\n    with self.test_session() as sess:\n      # Make placeholders for logits and labels.\n      ph_logits = 
tf.placeholder(tf.float32, shape=(None, None, None, 42))\n      ph_labels = tf.placeholder(tf.int64, shape=(None, None, None))\n      padded_labels = vgsl_model._PadLabels3d(ph_logits, ph_labels)\n      # Make actual inputs.\n      real_logits = _rand(1, 3, 4, 42)\n      # Test all 9 combinations of height x width in [small, ok, big]\n      real_labels = np.arange(6).reshape((1, 2, 3))  # Height small, width small\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 0], [3, 4, 5, 0], [0, 0, 0, 0]])\n      real_labels = np.arange(8).reshape((1, 2, 4))  # Height small, width ok\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 3], [4, 5, 6, 7], [0, 0, 0, 0]])\n      real_labels = np.arange(10).reshape((1, 2, 5))  # Height small, width big\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 3], [5, 6, 7, 8], [0, 0, 0, 0]])\n      real_labels = np.arange(9).reshape((1, 3, 3))  # Height ok, width small\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                         
 [[0, 1, 2, 0], [3, 4, 5, 0], [6, 7, 8, 0]])\n      real_labels = np.arange(12).reshape((1, 3, 4))  # Height ok, width ok\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]])\n      real_labels = np.arange(15).reshape((1, 3, 5))  # Height ok, width big\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 3], [5, 6, 7, 8], [10, 11, 12, 13]])\n      real_labels = np.arange(12).reshape((1, 4, 3))  # Height big, width small\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 0], [3, 4, 5, 0], [6, 7, 8, 0]])\n      real_labels = np.arange(16).reshape((1, 4, 4))  # Height big, width ok\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 10, 11]])\n      real_labels = np.arange(20).reshape((1, 4, 5))  # Height big, width big\n      np_array = sess.run([padded_labels],\n                          feed_dict={ph_logits: real_logits,\n                                     ph_labels: 
real_labels})[0]\n      self.assertEqual(tuple(np_array.shape), (1, 3, 4))\n      self.assertAllEqual(np_array[0, :, :],\n                          [[0, 1, 2, 3], [5, 6, 7, 8], [10, 11, 12, 13]])\n\n  def testEndToEndSizes0d(self):\n    \"\"\"Tests that the output sizes match when training/running real 0d data.\n\n    Uses mnist with dual summarizing LSTMs to reduce to a single value.\n    \"\"\"\n    filename = _testdata('mnist-tiny')\n    with self.test_session() as sess:\n      model = vgsl_model.InitNetwork(\n          filename,\n          model_spec='4,0,0,1[Cr5,5,16 Mp3,3 Lfys16 Lfxs16]O0s12',\n          mode='train')\n      tf.initialize_all_variables().run(session=sess)\n      coord = tf.train.Coordinator()\n      tf.train.start_queue_runners(sess=sess, coord=coord)\n      _, step = model.TrainAStep(sess)\n      self.assertEqual(step, 1)\n      output, labels = model.RunAStep(sess)\n      self.assertEqual(len(output.shape), 2)\n      self.assertEqual(len(labels.shape), 1)\n      self.assertEqual(output.shape[0], labels.shape[0])\n      self.assertEqual(output.shape[1], 12)\n\n  # TODO(rays) Support logistic and test with Imagenet (as 0d, multi-object.)\n\n  def testEndToEndSizes1dCTC(self):\n    \"\"\"Tests that the output sizes match when training with CTC.\n\n    Basic bidi LSTM on top of convolution and summarizing LSTM with CTC.\n    \"\"\"\n    filename = _testdata('arial-32-tiny')\n    with self.test_session() as sess:\n      model = vgsl_model.InitNetwork(\n          filename,\n          model_spec='2,0,0,1[Cr5,5,16 Mp3,3 Lfys16 Lbx100]O1c105',\n          mode='train')\n      tf.initialize_all_variables().run(session=sess)\n      coord = tf.train.Coordinator()\n      tf.train.start_queue_runners(sess=sess, coord=coord)\n      _, step = model.TrainAStep(sess)\n      self.assertEqual(step, 1)\n      output, labels = model.RunAStep(sess)\n      self.assertEqual(len(output.shape), 3)\n      self.assertEqual(len(labels.shape), 2)\n      
self.assertEqual(output.shape[0], labels.shape[0])\n      # This is ctc - the only cast-iron guarantee is labels <= output.\n      self.assertLessEqual(labels.shape[1], output.shape[1])\n      self.assertEqual(output.shape[2], 105)\n\n  def testEndToEndSizes1dFixed(self):\n    \"\"\"Tests that the output sizes match when training/running 1-d data.\n\n    Convolution, summarizing LSTM with fwd rev fwd to allow no CTC.\n    \"\"\"\n    filename = _testdata('numbers-16-tiny')\n    with self.test_session() as sess:\n      model = vgsl_model.InitNetwork(\n          filename,\n          model_spec='8,0,0,1[Cr5,5,16 Mp3,3 Lfys16 Lfx64 Lrx64 Lfx64]O1s12',\n          mode='train')\n      tf.initialize_all_variables().run(session=sess)\n      coord = tf.train.Coordinator()\n      tf.train.start_queue_runners(sess=sess, coord=coord)\n      _, step = model.TrainAStep(sess)\n      self.assertEqual(step, 1)\n      output, labels = model.RunAStep(sess)\n      self.assertEqual(len(output.shape), 3)\n      self.assertEqual(len(labels.shape), 2)\n      self.assertEqual(output.shape[0], labels.shape[0])\n      # Not CTC, output lengths match.\n      self.assertEqual(output.shape[1], labels.shape[1])\n      self.assertEqual(output.shape[2], 12)\n\n  # TODO(rays) Get a 2-d dataset and support 2d (heat map) outputs.\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
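`_ParseOutputSpec`, exercised by `testParseOutputSpec` above, is a one-regex parser. A standalone sketch with the same pattern and validity checks (dependencies dropped; `parse_output_spec` is a hypothetical name):

```python
import re

def parse_output_spec(output_spec):
    """Parse 'O<dims><func><classes>', e.g. 'O1c105': dims in {0,1,2},
    func in {l: logistic, s: softmax, c: softmax+CTC}."""
    m = re.match(r'(O)(0|1|2)(l|s|c)(\d+)', output_spec)
    if m is None:
        raise ValueError('Failed to parse output spec:' + output_spec)
    out_dims = int(m.group(2))
    out_func = m.group(3)
    if out_func == 'c' and out_dims != 1:
        # CTC needs a time axis: only 1-d sequence outputs are allowed.
        raise ValueError('CTC can only be used with a 1-D sequence!')
    return out_dims, out_func, int(m.group(4))

print(parse_output_spec('O1c142'))  # (1, 'c', 142)
```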
  {
    "path": "model_zoo/models/street/python/vgsl_train.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Model trainer for single or multi-replica training.\"\"\"\nfrom tensorflow import app\nfrom tensorflow.python.platform import flags\n\nimport vgsl_model\n\nflags.DEFINE_string('master', '', 'Name of the TensorFlow master to use.')\nflags.DEFINE_string('train_dir', '/tmp/mdir',\n                    'Directory where to write event logs.')\nflags.DEFINE_string('model_str',\n                    '1,150,600,3[S2(4x150)0,2 Ct5,5,16 Mp2,2 Ct5,5,64 Mp3,3'\n                    '([Lrys64 Lbx128][Lbys64 Lbx128][Lfys64 Lbx128])S3(3x0)2,3'\n                    'Lfx128 Lrx128 S0(1x4)0,3 Do Lfx256]O1c134',\n                    'Network description.')\nflags.DEFINE_integer('max_steps', 10000, 'Number of steps to train for.')\nflags.DEFINE_integer('task', 0, 'Task id of the replica running the training.')\nflags.DEFINE_integer('ps_tasks', 0, 'Number of tasks in the ps job.'\n                     'If 0 no ps job is used.')\nflags.DEFINE_string('train_data', None, 'Training data filepattern')\nflags.DEFINE_float('initial_learning_rate', 0.00002, 'Initial learning rate')\nflags.DEFINE_float('final_learning_rate', 0.00002, 'Final learning rate')\nflags.DEFINE_integer('learning_rate_halflife', 1600000,\n                     'Halflife of learning 
rate')\nflags.DEFINE_string('optimizer_type', 'Adam',\n                    'Optimizer from:GradientDescent, AdaGrad, Momentum, Adam')\nflags.DEFINE_integer('num_preprocess_threads', 4, 'Number of input threads')\n\nFLAGS = flags.FLAGS\n\n\ndef main(argv):\n  del argv\n  vgsl_model.Train(FLAGS.train_dir, FLAGS.model_str, FLAGS.train_data,\n                   FLAGS.max_steps, FLAGS.master, FLAGS.task, FLAGS.ps_tasks,\n                   FLAGS.initial_learning_rate, FLAGS.final_learning_rate,\n                   FLAGS.learning_rate_halflife, FLAGS.optimizer_type,\n                   FLAGS.num_preprocess_threads)\n\n\nif __name__ == '__main__':\n  app.run()\n"
  },
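For reference, the schedule these flags feed into (`_AddOptimizer` in vgsl_model.py) decays only the delta above the final rate. The halflife-to-(decay_steps, decay_rate) conversion is not shown in this chunk, so the sketch below assumes decay_rate = 0.5 with decay_steps equal to `learning_rate_halflife`:

```python
def decayed_learning_rate(step, initial=1e-4, final=2e-5, halflife=1600000):
    """lr(step) = final + (initial - final) * decay_rate ** (step / decay_steps).

    Assumption: decay_rate = 0.5 and decay_steps = halflife, so the delta
    above the final rate halves every `halflife` steps.
    """
    delta = initial - final
    return final + delta * 0.5 ** (step / float(halflife))

print(decayed_learning_rate(0))        # initial rate
print(decayed_learning_rate(1600000))  # delta above final has halved
```

Note that with the flag defaults above (initial == final == 2e-5) the delta is zero and the schedule is flat.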
  {
    "path": "model_zoo/models/street/python/vgslspecs.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"String network description language mapping to TF-Slim calls where possible.\n\nSee vglspecs.md for detailed description.\n\"\"\"\n\nimport re\nfrom string import maketrans\n\nimport nn_ops\nimport shapes\nimport tensorflow as tf\nimport tensorflow.contrib.slim as slim\n\n\n# Class that builds a set of ops to manipulate variable-sized images.\nclass VGSLSpecs(object):\n  \"\"\"Layers that can be built from a string definition.\"\"\"\n\n  def __init__(self, widths, heights, is_training):\n    \"\"\"Constructs a VGSLSpecs.\n\n    Args:\n      widths:  Tensor of size batch_size of the widths of the inputs.\n      heights: Tensor of size batch_size of the heights of the inputs.\n      is_training: True if the graph should be build for training.\n    \"\"\"\n    # The string that was used to build this model.\n    self.model_str = None\n    # True if we are training\n    self.is_training = is_training\n    # Tensor for the size of the images, of size batch_size.\n    self.widths = widths\n    self.heights = heights\n    # Overall reduction factors of this model so far for each dimension.\n    # TODO(rays) consider building a graph from widths and heights instead of\n    # computing a scale factor.\n    self.reduction_factors = [1.0, 1.0, 1.0, 1.0]\n    
# List of Op parsers.\n    # TODO(rays) add more Op types as needed.\n    self.valid_ops = [self.AddSeries, self.AddParallel, self.AddConvLayer,\n                      self.AddMaxPool, self.AddDropout, self.AddReShape,\n                      self.AddFCLayer, self.AddLSTMLayer]\n    # Translation table to convert unacceptable characters that may occur\n    # in op strings that cannot be used as names.\n    self.transtab = maketrans('(,)', '___')\n\n  def Build(self, prev_layer, model_str):\n    \"\"\"Builds a network with input prev_layer from a VGSLSpecs description.\n\n    Args:\n      prev_layer: The input tensor.\n      model_str:  Model definition similar to Tesseract as follows:\n        ============ FUNCTIONAL OPS ============\n        C(s|t|r|l|m)[{name}]<y>,<x>,<d> Convolves using a y,x window, with no\n          shrinkage, SAME infill, d outputs, with s|t|r|l|m non-linear layer.\n          (s|t|r|l|m) specifies the type of non-linearity:\n          s = sigmoid\n          t = tanh\n          r = relu\n          l = linear (i.e., None)\n          m = softmax\n        F(s|t|r|l|m)[{name}]<d> Fully-connected with s|t|r|l|m non-linearity and\n          d outputs. Reduces height, width to 1. 
Input height and width must be\n          constant.\n        L(f|r|b)(x|y)[s][{name}]<n> LSTM cell with n outputs.\n          f runs the LSTM forward only.\n          r runs the LSTM reversed only.\n          b runs the LSTM bidirectionally.\n          x runs the LSTM in the x-dimension (on data with or without the\n             y-dimension).\n          y runs the LSTM in the y-dimension (data must have a y dimension).\n          s (optional) summarizes the output in the requested dimension,\n             outputting only the final step, collapsing the dimension to a\n             single element.\n          Examples:\n          Lfx128 runs a forward-only LSTM in the x-dimension with 128\n                 outputs, treating any y dimension independently.\n          Lfys64 runs a forward-only LSTM in the y-dimension with 64 outputs\n                 and collapses the y-dimension to 1 element.\n          NOTE that Lbxsn is implemented as (LfxsnLrxsn) since the summaries\n          need to be taken from opposite ends of the output.\n        Do[{name}] Insert a dropout layer.\n        ============ PLUMBING OPS ============\n        [...] Execute ... networks in series (layers).\n        (...) Execute ... networks in parallel, with their output concatenated\n          in depth.\n        S[{name}]<d>(<a>x<b>)<e>,<f> Splits one dimension, moves one part to\n          another dimension.\n          Splits input dimension d into a x b, sending the high part (a) to the\n          high side of dimension e, and the low part (b) to the high side of\n          dimension f. Exception: if d=e=f, then dimension d is internally\n          transposed to bxa.\n          Either a or b can be zero, meaning whatever is left after taking out\n          the other, allowing dimensions to be of variable size.\n          Eg. 
S3(3x50)2,3 will split the 150-element depth into 3x50, with the 3\n          going to the most significant part of the width, and the 50 part\n          staying in depth.\n          This will rearrange a 3x50 output parallel operation to spread the 3\n          output sets over width.\n        Mp[{name}]<y>,<x> Maxpool the input, reducing the (y,x) rectangle to a\n          single vector value.\n\n    Returns:\n      Output tensor\n    \"\"\"\n    self.model_str = model_str\n    final_layer, _ = self.BuildFromString(prev_layer, 0)\n    return final_layer\n\n  def GetLengths(self, dim=2, factor=1):\n    \"\"\"Returns the lengths of the batch of elements in the given dimension.\n\n    WARNING: The returned sizes may not exactly match TF's calculation.\n    Args:\n      dim: dimension to get the sizes of, in [1,2]. batch, depth not allowed.\n      factor: A scalar value to multiply by.\n\n    Returns:\n      The original heights/widths scaled by the current scaling of the model and\n      the given factor.\n\n    Raises:\n      ValueError: If the args are invalid.\n    \"\"\"\n    if dim == 1:\n      lengths = self.heights\n    elif dim == 2:\n      lengths = self.widths\n    else:\n      raise ValueError('Invalid dimension given to GetLengths')\n    lengths = tf.cast(lengths, tf.float32)\n    if self.reduction_factors[dim] is not None:\n      lengths = tf.div(lengths, self.reduction_factors[dim])\n    else:\n      lengths = tf.ones_like(lengths)\n    if factor != 1:\n      lengths = tf.mul(lengths, tf.cast(factor, tf.float32))\n    return tf.cast(lengths, tf.int32)\n\n  def BuildFromString(self, prev_layer, index):\n    \"\"\"Adds the layers defined by model_str[index:] to the model.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor, next model_str index.\n\n    Raises:\n      ValueError: If the model string is unrecognized.\n    \"\"\"\n    index = 
self._SkipWhitespace(index)\n    for op in self.valid_ops:\n      output_layer, next_index = op(prev_layer, index)\n      if output_layer is not None:\n        return output_layer, next_index\n    raise ValueError('Unrecognized model string:' + self.model_str[index:])\n\n  def AddSeries(self, prev_layer, index):\n    \"\"\"Builds a sequence of layers for a VGSLSpecs model.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor of the series, end index in model_str.\n\n    Raises:\n      ValueError: If [] are unbalanced.\n    \"\"\"\n    if self.model_str[index] != '[':\n      return None, None\n    index += 1\n    while index < len(self.model_str) and self.model_str[index] != ']':\n      prev_layer, index = self.BuildFromString(prev_layer, index)\n    if index == len(self.model_str):\n      raise ValueError('Missing ] at end of series!' 
+ self.model_str)\n    return prev_layer, index + 1\n\n  def AddParallel(self, prev_layer, index):\n    \"\"\"tf.concats outputs of layers that run on the same inputs.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor of the parallel,  end index in model_str.\n\n    Raises:\n      ValueError: If () are unbalanced or the elements don't match.\n    \"\"\"\n    if self.model_str[index] != '(':\n      return None, None\n    index += 1\n    layers = []\n    num_dims = 0\n    # Each parallel must output the same, including any reduction factor, in\n    # all dimensions except depth.\n    # We have to save the starting factors, so they don't get reduced by all\n    # the elements of the parallel, only once.\n    original_factors = self.reduction_factors\n    final_factors = None\n    while index < len(self.model_str) and self.model_str[index] != ')':\n      self.reduction_factors = original_factors\n      layer, index = self.BuildFromString(prev_layer, index)\n      if num_dims == 0:\n        num_dims = len(layer.get_shape())\n      elif num_dims != len(layer.get_shape()):\n        raise ValueError('All elements of parallel must return same num dims')\n      layers.append(layer)\n      if final_factors:\n        if final_factors != self.reduction_factors:\n          raise ValueError('All elements of parallel must scale the same')\n      else:\n        final_factors = self.reduction_factors\n    if index == len(self.model_str):\n      raise ValueError('Missing ) at end of parallel!' 
+ self.model_str)\n    return tf.concat(num_dims - 1, layers), index + 1\n\n  def AddConvLayer(self, prev_layer, index):\n    \"\"\"Add a single standard convolutional layer.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor, end index in model_str.\n    \"\"\"\n    pattern = re.compile(R'(C)(s|t|r|l|m)({\\w+})?(\\d+),(\\d+),(\\d+)')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return None, None\n    name = self._GetLayerName(m.group(0), index, m.group(3))\n    width = int(m.group(4))\n    height = int(m.group(5))\n    depth = int(m.group(6))\n    fn = self._NonLinearity(m.group(2))\n    return slim.conv2d(\n        prev_layer, depth, [height, width], activation_fn=fn,\n        scope=name), m.end()\n\n  def AddMaxPool(self, prev_layer, index):\n    \"\"\"Add a maxpool layer.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor, end index in model_str.\n    \"\"\"\n    pattern = re.compile(R'(Mp)({\\w+})?(\\d+),(\\d+)(?:,(\\d+),(\\d+))?')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return None, None\n    name = self._GetLayerName(m.group(0), index, m.group(2))\n    height = int(m.group(3))\n    width = int(m.group(4))\n    y_stride = height if m.group(5) is None else int(m.group(5))\n    x_stride = width if m.group(6) is None else int(m.group(6))\n    self.reduction_factors[1] *= y_stride\n    self.reduction_factors[2] *= x_stride\n    return slim.max_pool2d(\n        prev_layer, [height, width], [y_stride, x_stride],\n        padding='SAME',\n        scope=name), m.end()\n\n  def AddDropout(self, prev_layer, index):\n    \"\"\"Adds a dropout layer.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor, end index in model_str.\n    \"\"\"\n    pattern = 
re.compile(R'(Do)({\\w+})?')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return None, None\n    name = self._GetLayerName(m.group(0), index, m.group(2))\n    layer = slim.dropout(\n        prev_layer, 0.5, is_training=self.is_training, scope=name)\n    return layer, m.end()\n\n  def AddReShape(self, prev_layer, index):\n    \"\"\"Reshapes the input tensor by moving each (x_scale,y_scale) rectangle to.\n\n       the depth dimension. NOTE that the TF convention is that inputs are\n       [batch, y, x, depth].\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor, end index in model_str.\n    \"\"\"\n    pattern = re.compile(R'(S)(?:{(\\w)})?(\\d+)\\((\\d+)x(\\d+)\\)(\\d+),(\\d+)')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return None, None\n    name = self._GetLayerName(m.group(0), index, m.group(2))\n    src_dim = int(m.group(3))\n    part_a = int(m.group(4))\n    part_b = int(m.group(5))\n    dest_dim_a = int(m.group(6))\n    dest_dim_b = int(m.group(7))\n    if part_a == 0:\n      part_a = -1\n    if part_b == 0:\n      part_b = -1\n    prev_shape = tf.shape(prev_layer)\n    layer = shapes.transposing_reshape(\n        prev_layer, src_dim, part_a, part_b, dest_dim_a, dest_dim_b, name=name)\n    # Compute scale factors.\n    result_shape = tf.shape(layer)\n    for i in xrange(len(self.reduction_factors)):\n      if self.reduction_factors[i] is not None:\n        factor1 = tf.cast(self.reduction_factors[i], tf.float32)\n        factor2 = tf.cast(prev_shape[i], tf.float32)\n        divisor = tf.cast(result_shape[i], tf.float32)\n        self.reduction_factors[i] = tf.div(tf.mul(factor1, factor2), divisor)\n    return layer, m.end()\n\n  def AddFCLayer(self, prev_layer, index):\n    \"\"\"Parse expression and add Fully Connected Layer.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to 
start parsing\n\n    Returns:\n      Output tensor, end index in model_str.\n    \"\"\"\n    pattern = re.compile(R'(F)(s|t|r|l|m)({\\w+})?(\\d+)')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return None, None\n    fn = self._NonLinearity(m.group(2))\n    name = self._GetLayerName(m.group(0), index, m.group(3))\n    depth = int(m.group(4))\n    input_depth = shapes.tensor_dim(prev_layer, 1) * shapes.tensor_dim(\n        prev_layer, 2) * shapes.tensor_dim(prev_layer, 3)\n    # The slim fully connected is actually a 1x1 conv, so we have to crush the\n    # dimensions on input.\n    # Everything except batch goes to depth, and therefore has to be known.\n    shaped = tf.reshape(\n        prev_layer, [-1, input_depth], name=name + '_reshape_in')\n    output = slim.fully_connected(shaped, depth, activation_fn=fn, scope=name)\n    # Width and height are collapsed to 1.\n    self.reduction_factors[1] = None\n    self.reduction_factors[2] = None\n    return tf.reshape(\n        output, [shapes.tensor_dim(prev_layer, 0), 1, 1, depth],\n        name=name + '_reshape_out'), m.end()\n\n  def AddLSTMLayer(self, prev_layer, index):\n    \"\"\"Parse expression and add LSTM Layer.\n\n    Args:\n      prev_layer: Input tensor.\n      index:      Position in model_str to start parsing\n\n    Returns:\n      Output tensor, end index in model_str.\n    \"\"\"\n    pattern = re.compile(R'(L)(f|r|b)(x|y)(s)?({\\w+})?(\\d+)')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return None, None\n    direction = m.group(2)\n    dim = m.group(3)\n    summarize = m.group(4) == 's'\n    name = self._GetLayerName(m.group(0), index, m.group(5))\n    depth = int(m.group(6))\n    if direction == 'b' and summarize:\n      fwd = self._LSTMLayer(prev_layer, 'forward', dim, True, depth,\n                            name + '_forward')\n      back = self._LSTMLayer(prev_layer, 'backward', dim, True, depth,\n                             name + 
'_reverse')\n      return tf.concat(3, [fwd, back], name=name + '_concat'), m.end()\n    if direction == 'f':\n      direction = 'forward'\n    elif direction == 'r':\n      direction = 'backward'\n    else:\n      direction = 'bidirectional'\n    outputs = self._LSTMLayer(prev_layer, direction, dim, summarize, depth,\n                              name)\n    if summarize:\n      # The x or y dimension is getting collapsed.\n      if dim == 'x':\n        self.reduction_factors[2] = None\n      else:\n        self.reduction_factors[1] = None\n    return outputs, m.end()\n\n  def _LSTMLayer(self, prev_layer, direction, dim, summarize, depth, name):\n    \"\"\"Adds an LSTM layer with the given pre-parsed attributes.\n\n    Always maps 4-D to 4-D regardless of summarize.\n    Args:\n      prev_layer: Input tensor.\n      direction:  'forward' 'backward' or 'bidirectional'\n      dim:        'x' or 'y', dimension to consider as time.\n      summarize:  True if we are to return only the last timestep.\n      depth:      Output depth.\n      name:       Some string naming the op.\n\n    Returns:\n      Output tensor.\n    \"\"\"\n    # If the target dimension is y, we need to transpose.\n    if dim == 'x':\n      lengths = self.GetLengths(2, 1)\n      inputs = prev_layer\n    else:\n      lengths = self.GetLengths(1, 1)\n      inputs = tf.transpose(prev_layer, [0, 2, 1, 3], name=name + '_ytrans_in')\n    input_batch = shapes.tensor_dim(inputs, 0)\n    num_slices = shapes.tensor_dim(inputs, 1)\n    num_steps = shapes.tensor_dim(inputs, 2)\n    input_depth = shapes.tensor_dim(inputs, 3)\n    # Reshape away the other dimension.\n    inputs = tf.reshape(\n        inputs, [-1, num_steps, input_depth], name=name + '_reshape_in')\n    # We need to replicate the lengths by the size of the other dimension, and\n    # any changes that have been made to the batch dimension.\n    tile_factor = tf.to_float(input_batch *\n                              num_slices) / 
tf.to_float(tf.shape(lengths)[0])\n    lengths = tf.tile(lengths, [tf.cast(tile_factor, tf.int32)])\n    lengths = tf.cast(lengths, tf.int64)\n    outputs = nn_ops.rnn_helper(\n        inputs,\n        lengths,\n        cell_type='lstm',\n        num_nodes=depth,\n        direction=direction,\n        name=name,\n        stddev=0.1)\n    # Output depth is doubled if bi-directional.\n    if direction == 'bidirectional':\n      output_depth = depth * 2\n    else:\n      output_depth = depth\n    # Restore the other dimension.\n    if summarize:\n      outputs = tf.slice(\n          outputs, [0, num_steps - 1, 0], [-1, 1, -1], name=name + '_sum_slice')\n      outputs = tf.reshape(\n          outputs, [input_batch, num_slices, 1, output_depth],\n          name=name + '_reshape_out')\n    else:\n      outputs = tf.reshape(\n          outputs, [input_batch, num_slices, num_steps, output_depth],\n          name=name + '_reshape_out')\n    if dim == 'y':\n      outputs = tf.transpose(outputs, [0, 2, 1, 3], name=name + '_ytrans_out')\n    return outputs\n\n  def _NonLinearity(self, code):\n    \"\"\"Returns the non-linearity function pointer for the given string code.\n\n    For forwards compatibility, allows the full names for stand-alone\n    non-linearities, as well as the single-letter names used in ops like C,F.\n    Args:\n      code: String code representing a non-linearity function.\n    Returns:\n      non-linearity function represented by the code.\n    \"\"\"\n    if code in ['s', 'Sig']:\n      return tf.sigmoid\n    elif code in ['t', 'Tanh']:\n      return tf.tanh\n    elif code in ['r', 'Relu']:\n      return tf.nn.relu\n    elif code in ['m', 'Smax']:\n      return tf.nn.softmax\n    return None\n\n  def _GetLayerName(self, op_str, index, name_str):\n    \"\"\"Generates a name for the op, using a user-supplied name if possible.\n\n    Args:\n      op_str:     String representing the parsed op.\n      index:      Position in model_str of the start of the 
op.\n      name_str:   User-supplied {name} with {} that need removing or None.\n\n    Returns:\n      Selected name.\n    \"\"\"\n    if name_str:\n      return name_str[1:-1]\n    else:\n      return op_str.translate(self.transtab) + '_' + str(index)\n\n  def _SkipWhitespace(self, index):\n    \"\"\"Skips any leading whitespace in the model description.\n\n    Args:\n      index:      Position in model_str to start parsing\n\n    Returns:\n      end index in model_str of whitespace.\n    \"\"\"\n    pattern = re.compile(R'([ \\t\\n]+)')\n    m = pattern.match(self.model_str, index)\n    if m is None:\n      return index\n    return m.end()\n"
  },
  {
    "path": "model_zoo/models/street/python/vgslspecs_test.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"Tests for vgslspecs.\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\nimport vgslspecs\n\n\ndef _rand(*size):\n  return np.random.uniform(size=size).astype('f')\n\n\nclass VgslspecsTest(tf.test.TestCase):\n\n  def __init__(self, other):\n    super(VgslspecsTest, self).__init__(other)\n    self.max_width = 36\n    self.max_height = 24\n    self.batch_size = 4\n\n  def SetupInputs(self):\n    # Make placeholders for standard inputs.\n    # Everything is variable in the input, except the depth.\n    self.ph_image = tf.placeholder(\n        tf.float32, shape=(None, None, None, 3), name='inputs')\n    self.ph_widths = tf.placeholder(tf.int64, shape=(None,), name='w')\n    self.ph_heights = tf.placeholder(tf.int64, shape=(None,), name='h')\n    # Make actual inputs.\n    self.in_image = _rand(self.batch_size, self.max_height, self.max_width, 3)\n    self.in_widths = [24, 12, self.max_width, 30]\n    self.in_heights = [self.max_height, 18, 12, 6]\n\n  def ExpectScaledSize(self, spec, target_shape, factor=1):\n    \"\"\"Tests that the output of the graph of the given spec has target_shape.\"\"\"\n    with tf.Graph().as_default():\n      with self.test_session() as sess:\n        self.SetupInputs()\n        # Only the placeholders are given at 
construction time.\n        vgsl = vgslspecs.VGSLSpecs(self.ph_widths, self.ph_heights, True)\n        outputs = vgsl.Build(self.ph_image, spec)\n        # Compute the expected output widths from the given scale factor.\n        target_widths = tf.div(self.in_widths, factor).eval()\n        target_heights = tf.div(self.in_heights, factor).eval()\n        # Run with the 'real' data.\n        tf.initialize_all_variables().run()\n        res_image, res_widths, res_heights = sess.run(\n            [outputs, vgsl.GetLengths(2), vgsl.GetLengths(1)],\n            feed_dict={self.ph_image: self.in_image,\n                       self.ph_widths: self.in_widths,\n                       self.ph_heights: self.in_heights})\n        self.assertEqual(tuple(res_image.shape), target_shape)\n        if target_shape[1] > 1:\n          self.assertEqual(tuple(res_heights), tuple(target_heights))\n        if target_shape[2] > 1:\n          self.assertEqual(tuple(res_widths), tuple(target_widths))\n\n  def testSameSizeConv(self):\n    \"\"\"Test all types of Conv. There is no scaling.\"\"\"\n    self.ExpectScaledSize(\n        '[Cs{MyConv}5,5,16 Ct3,3,12 Cr4,4,24 Cl5,5,64]',\n        (self.batch_size, self.max_height, self.max_width, 64))\n\n  def testSameSizeLSTM(self):\n    \"\"\"Test all non-reducing LSTMs. 
Output depth is doubled with BiDi.\"\"\"\n    self.ExpectScaledSize('[Lfx16 Lrx8 Do Lbx24 Lfy12 Do{MyDo} Lry7 Lby32]',\n                          (self.batch_size, self.max_height, self.max_width,\n                           64))\n\n  def testSameSizeParallel(self):\n    \"\"\"Parallel affects depth, but not scale.\"\"\"\n    self.ExpectScaledSize('[Cs5,5,16 (Lfx{MyLSTM}32 Lrx32 Lbx16)]',\n                          (self.batch_size, self.max_height, self.max_width,\n                           96))\n\n  def testScalingOps(self):\n    \"\"\"Test a heterogeneous series with scaling.\"\"\"\n    self.ExpectScaledSize('[Cs5,5,16 Mp{MyPool}2,2 Ct3,3,32 Mp3,3 Lfx32 Lry64]',\n                          (self.batch_size, self.max_height / 6,\n                           self.max_width / 6, 64), 6)\n\n  def testXReduction(self):\n    \"\"\"Test a heterogeneous series with reduction of x-dimension.\"\"\"\n    self.ExpectScaledSize('[Cr5,5,16 Mp2,2 Ct3,3,32 Mp3,3 Lfxs32 Lry64]',\n                          (self.batch_size, self.max_height / 6, 1, 64), 6)\n\n  def testYReduction(self):\n    \"\"\"Test a heterogeneous series with reduction of y-dimension.\"\"\"\n    self.ExpectScaledSize('[Cl5,5,16 Mp2,2 Ct3,3,32 Mp3,3 Lfys32 Lfx64]',\n                          (self.batch_size, 1, self.max_width / 6, 64), 6)\n\n  def testXYReduction(self):\n    \"\"\"Test a heterogeneous series with reduction to 0-d.\"\"\"\n    self.ExpectScaledSize(\n        '[Cr5,5,16 Lfys32 Lfxs64 Fr{MyFC}16 Ft20 Fl12 Fs32 Fm40]',\n        (self.batch_size, 1, 1, 40))\n\n  def testReshapeTile(self):\n    \"\"\"Tests that a tiled input can be reshaped to the batch dimension.\"\"\"\n    self.ExpectScaledSize('[S2(3x0)0,2 Cr5,5,16 Lfys16]',\n                          (self.batch_size * 3, 1, self.max_width / 3, 16), 3)\n\n  def testReshapeDepth(self):\n    \"\"\"Tests that depth can be reshaped to the x dimension.\"\"\"\n    self.ExpectScaledSize('[Cl5,5,16 Mp3,3 (Lrys32 Lbys16 Lfys32) S3(3x0)2,3]',\n              
            (self.batch_size, 1, self.max_width, 32))\n\n\nif __name__ == '__main__':\n  tf.test.main()\n"
  },
  {
    "path": "model_zoo/models/street/testdata/arial.charset_size=105.txt",
    "content": "0\t \n104\t<nul>\n1\tG\n2\tr\n3\ta\n4\ts\n5\tl\n6\tn\n7\td\n8\t.\n9\tB\n10\tC\n11\tO\n12\tW\n13\tY\n14\t,\n15\t(\n16\tu\n17\tz\n18\ti\n19\te\n20\t)\n21\t1\n22\t9\n23\t2\n24\t-\n25\t6\n26\to\n27\tL\n28\tP\n29\t'\n30\tt\n31\tm\n32\tK\n33\tc\n34\tk\n35\tV\n36\tS\n37\tD\n38\tJ\n39\th\n40\tM\n41\tx\n42\tE\n43\tq\n44\t;\n45\tA\n46\ty\n47\tf\n48\t5\n49\t7\n50\tb\n51\t4\n52\t0\n53\t3\n54\tN\n55\tI\n56\tT\n57\t/\n58\tp\n59\tw\n60\tg\n61\tH\n62\t“\n63\tF\n62\t”\n62\t\"\n29\t’\n64\tR\n24\t—\n65\t8\n66\tv\n67\t?\n68\té\n69\t%\n70\t:\n71\tj\n72\t\\\n73\t{\n74\t}\n75\t|\n76\tU\n77\t$\n78\t°\n79\t*\n80\t!\n81\t]\n82\tQ\n29\t‘\n83\tZ\n84\tX\n85\t[\n86\t=\n87\t+\n88\t§\n89\t_\n90\t£\n91\t&\n92\t#\n93\t>\n94\t<\n95\t~\n96\t€\n97\t@\n98\t¢\n99\t»\n100\t«\n47,5\tﬂ\n47,18\tﬁ\n101\t®\n102\t©\n103\t¥\n"
  },
  {
    "path": "model_zoo/models/street/testdata/charset_size=134.txt",
    "content": "0\t \n133\t<nul>\n1\tl\n2\t’\n3\té\n4\tt\n5\te\n6\ti\n7\tn\n8\ts\n9\tx\n10\tg\n11\tu\n12\to\n13\t1\n14\t8\n15\t7\n16\t0\n17\t-\n18\t.\n19\tp\n20\ta\n21\tr\n22\tè\n23\td\n24\tc\n25\tV\n26\tv\n27\tb\n28\tm\n29\t)\n30\tC\n31\tz\n32\tS\n33\ty\n34\t,\n35\tk\n36\tÉ\n37\tA\n38\th\n39\tE\n40\t»\n41\tD\n42\t/\n43\tH\n44\tM\n45\t(\n46\tG\n47\tP\n48\tç\n2\t'\n49\tR\n50\tf\n51\t\"\n52\t2\n53\tj\n54\t|\n55\tN\n56\t6\n57\t°\n58\t5\n59\tT\n60\tO\n61\tU\n62\t3\n63\t%\n64\t9\n65\tq\n66\tZ\n67\tB\n68\tK\n69\tw\n70\tW\n71\t:\n72\t4\n73\tL\n74\tF\n75\t]\n76\tï\n2\t‘\n77\tI\n78\tJ\n79\tä\n80\tî\n81\t;\n82\tà\n83\tê\n84\tX\n85\tü\n86\tY\n87\tô\n88\t=\n89\t+\n90\t\\\n91\t{\n92\t}\n93\t_\n94\tQ\n95\tœ\n96\tñ\n97\t*\n98\t!\n99\tÜ\n51\t“\n100\tâ\n101\tÇ\n102\tŒ\n103\tû\n104\t?\n105\t$\n106\të\n107\t«\n108\t€\n109\t&\n110\t<\n51\t”\n111\tæ\n112\t#\n113\t®\n114\tÂ\n115\tÈ\n116\t>\n117\t[\n17\t—\n118\tÆ\n119\tù\n120\tÎ\n121\tÔ\n122\tÿ\n123\tÀ\n124\tÊ\n125\t@\n126\tÏ\n127\t©\n128\tË\n129\tÙ\n130\t£\n131\tŸ\n132\tÛ\n"
  },
  {
    "path": "model_zoo/models/street/testdata/charset_size_10.txt",
    "content": "0\t \n9\t<nul>\n1\ta\n2\tb\n3\tr\n4\tn\n4,5\tm\n6\tf\n7\t.\n8\t,\n"
  },
  {
    "path": "model_zoo/models/street/testdata/numbers.charset_size=12.txt",
    "content": "0\t \n11\t<nul>\n1\t9\n2\t8\n3\t7\n4\t6\n5\t1\n6\t4\n7\t0\n8\t3\n9\t5\n10\t2\n"
  },
  {
    "path": "model_zoo/models/swivel/.gitignore",
    "content": "*.an.tab\n*.pyc\n*.ws.tab\nMEN.tar.gz\nMtruk.csv\nSimLex-999.zip\nanalogy\nfastprep\nmyz_naacl13_test_set.tgz\nquestions-words.txt\nrw.zip\nws353simrel.tar.gz\n"
  },
  {
    "path": "model_zoo/models/swivel/README.md",
    "content": "# Swivel in Tensorflow\n\nThis is a [TensorFlow](http://www.tensorflow.org/) implementation of the\n[Swivel algorithm](http://arxiv.org/abs/1602.02215) for generating word\nembeddings.\n\nSwivel works as follows:\n\n1. Compute the co-occurrence statistics from a corpus; that is, determine how\n   often a word *c* appears the context (e.g., \"within ten words\") of a focus\n   word *f*.  This results in a sparse *co-occurrence matrix* whose rows\n   represent the focus words, and whose columns represent the context\n   words. Each cell value is the number of times the focus and context words\n   were observed together.\n2. Re-organize the co-occurrence matrix and chop it into smaller pieces.\n3. Assign a random *embedding vector* of fixed dimension (say, 300) to each\n   focus word and to each context word.\n4. Iteratively attempt to approximate the\n   [pointwise mutual information](https://en.wikipedia.org/wiki/Pointwise_mutual_information)\n   (PMI) between words with the dot product of the corresponding embedding\n   vectors.\n\nNote that the resulting co-occurrence matrix is very sparse (i.e., contains many\nzeros) since most words won't have been observed in the context of other words.\nIn the case of very rare words, it seems reasonable to assume that you just\nhaven't sampled enough data to spot their co-occurrence yet.  On the other hand,\nif we've failed to observed to common words co-occuring, it seems likely that\nthey are *anti-correlated*.\n\nSwivel attempts to capture this intuition by using both the observed and the\nun-observed co-occurrences to inform the way it iteratively adjusts vectors.\nEmpirically, this seems to lead to better embeddings, especially for rare words.\n\n# Contents\n\nThis release includes the following programs.\n\n* `prep.py` is a program that takes a text corpus and pre-processes it for\n  training. Specifically, it computes a vocabulary and token co-occurrence\n  statistics for the corpus.  
It then outputs the information into a format that\n  can be digested by the TensorFlow trainer.\n* `swivel.py` is a TensorFlow program that generates embeddings from the\n  co-occurrence statistics.  It uses the files created by `prep.py` as input,\n  and generates two text files as output: the row and column embeddings.\n* `text2bin.py` combines the row and column vectors generated by Swivel into a\n  flat binary file that can be quickly loaded into memory to perform vector\n  arithmetic.  This can also be used to convert embeddings from\n  [Glove](http://nlp.stanford.edu/projects/glove/) and\n  [word2vec](https://code.google.com/archive/p/word2vec/) into a form that can\n  be used by the following tools.\n* `nearest.py` is a program that you can use to manually inspect binary\n  embeddings.\n* `eval.mk` is a GNU makefile that will retrieve and normalize several common\n  word similarity and analogy evaluation data sets.\n* `wordsim.py` performs word similarity evaluation of the resulting vectors.\n* `analogy` performs analogy evaluation of the resulting vectors.\n* `fastprep` is a C++ program that works much more quickly than `prep.py`, but\n  also has some additional dependencies to build.\n\n# Building Embeddings with Swivel\n\nTo build your own word embeddings with Swivel, you'll need the following:\n\n* A large corpus of text; for example, the\n  [dump of English Wikipedia](https://dumps.wikimedia.org/enwiki/).\n* A working [TensorFlow](http://www.tensorflow.org/) implementation.\n* A machine with plenty of disk space and, ideally, a beefy GPU card.  (We've\n  experimented with the\n  [Nvidia Titan X](http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-titan-x),\n  for example.)\n\nYou'll then run `prep.py` (or `fastprep`) to prepare the data for Swivel and run\n`swivel.py` to create the embeddings. The resulting embeddings will be output\ninto two large text files: one for the row vectors and one for the column\nvectors.  
You can use those \"as is\", or convert them into a binary file using\n`text2bin.py` and then use the tools here to experiment with the resulting\nvectors.\n\n## Preparing the data for training\n\nOnce you've downloaded the corpus (e.g., to `/tmp/wiki.txt`), run `prep.py` to\nprepare the data for training:\n\n    ./prep.py --output_dir /tmp/swivel_data --input /tmp/wiki.txt\n\nBy default, `prep.py` will make one pass through the text file to compute a\n\"vocabulary\" of the most frequent words, and then a second pass to compute the\nco-occurrence statistics.  The following options allow you to control this\nbehavior:\n\n| Option | Description |\n|:--- |:--- |\n| `--min_count <n>` | Only include words in the generated vocabulary that appear at least *n* times. |\n| `--max_vocab <n>` | Admit at most *n* words into the vocabulary. |\n| `--vocab <filename>` | Use the specified filename as the vocabulary instead of computing it from the corpus.  The file should contain one word per line. |\n\nThe `prep.py` program is pretty simple.  Notably, it does almost no text\nprocessing: it does no case translation and simply breaks text into tokens by\nsplitting on spaces. Feel free to experiment with the `words` function if you'd\nlike to do something more sophisticated.\n\nUnfortunately, `prep.py` is pretty slow.  Also included is `fastprep`, a C++\nequivalent that works much more quickly.  Building `fastprep.cc` is a bit more\ninvolved: it requires you to pull and build the Tensorflow source code in order\nto provide the libraries and headers that it needs.  See `fastprep.mk` for more\ndetails.\n\n## Training the embeddings\n\nWhen `prep.py` completes, it will have produced a directory containing the data\nthat the Swivel trainer needs to run.  
Train embeddings as follows:\n\n    ./swivel.py --input_base_path /tmp/swivel_data \\\n       --output_base_path /tmp/swivel_data\n\nThere are a variety of parameters that you can fiddle with to customize the\nembeddings; some that you may want to experiment with include:\n\n| Option | Description |\n|:--- |:--- |\n| `--embedding_size <dim>` | The dimensionality of the embeddings that are created.  By default, 300-dimensional embeddings are created. |\n| `--num_epochs <n>` | The number of iterations through the data that are performed.  By default, 40 epochs are trained. |\n\nAs mentioned above, access to a beefy GPU will dramatically reduce the amount of\ntime it takes Swivel to train embeddings.\n\nWhen complete, you should find `row_embedding.tsv` and `col_embedding.tsv` in\nthe directory specified by `--output_base_path`.  These files are tab-delimited\nfiles that contain one embedding per line.  Each line contains the token\nfollowed by *dim* floating point numbers.\n\n## Exploring and evaluating the embeddings\n\nThere are also some simple tools you can use to explore the embeddings.  These\ntools work with a simple binary vector format that can be `mmap`-ed into memory\nalong with a separate vocabulary file.  
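The binary format itself is simple: it is just the raw embedding matrix stored as consecutive float32 values, one vector per vocabulary entry, with the dimension implied by the file size and the vocabulary length (the `analogy` tool infers it the same way). As a minimal sketch, assuming `vocab.txt` holds one token per line and `vecs.bin` holds row-major float32 vectors, the pair can be loaded from Python like this (the `load_embeddings` helper is illustrative, not part of this release):

```python
import numpy as np

def load_embeddings(vocab_path, bin_path):
    # One token per line in the vocabulary file.
    with open(vocab_path) as f:
        vocab = [line.strip() for line in f]
    # The binary file is a flat array of float32s; the embedding
    # dimension is implied by the file size and the vocabulary length.
    vecs = np.memmap(bin_path, dtype=np.float32, mode='r')
    dim = vecs.shape[0] // len(vocab)
    return vocab, vecs.reshape(len(vocab), dim)
```
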
Use `text2bin.py` to generate these files:\n\n    ./text2bin.py -o vecs.bin -v vocab.txt /tmp/swivel_data/*_embedding.tsv\n\nYou can do some simple exploration using `nearest.py`:\n\n    ./nearest.py -v vocab.txt -e vecs.bin\n    query> dog\n    dog\n    dogs\n    cat\n    ...\n    query> man woman king\n    king\n    queen\n    princess\n    ...\n\nTo evaluate the embeddings using common word similarity and analogy datasets,\nuse `eval.mk` to retrieve the data sets and build the tools:\n\n    make -f eval.mk\n    ./wordsim.py -v vocab.txt -e vecs.bin *.ws.tab\n    ./analogy --vocab vocab.txt --embeddings vecs.bin *.an.tab\n\nThe word similarity evaluation compares the embeddings' estimate of \"similarity\"\nwith human judgement using\n[Spearman's rho](https://en.wikipedia.org/wiki/Spearman%27s_rank_correlation_coefficient)\nas the measure of correlation.  (Bigger numbers are better.)\n\nThe analogy evaluation tests how well the embeddings can predict analogies like\n\"man is to woman as king is to queen\".\n\nNote that `eval.mk` forces all evaluation data into lower case.  From there,\nboth the word similarity and analogy evaluations assume that the eval data and\nthe embeddings use consistent capitalization: if you train embeddings using\nmixed case and evaluate them using lower case, things won't work well.\n\n# Contact\n\nIf you have any questions about Swivel, feel free to post to\n[swivel-embeddings@googlegroups.com](https://groups.google.com/forum/#!forum/swivel-embeddings)\nor contact us directly:\n\n* Noam Shazeer (`noam@google.com`)\n* Ryan Doherty (`portalfire@google.com`)\n* Colin Evans (`colinhevans@google.com`)\n* Chris Waterson (`waterson@google.com`)\n\n"
  },
  {
    "path": "model_zoo/models/swivel/analogy.cc",
    "content": "/* -*- Mode: C++ -*- */\n\n/*\n * Copyright 2016 Google Inc. All Rights Reserved.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *    http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n/*\n * Computes embedding performance on analogy tasks.  Accepts as input one or\n * more files containing four words per line (A B C D), and determines if:\n *\n *   vec(C) - vec(A) + vec(B) ~= vec(D)\n *\n * Cosine distance in the embedding space is used to retrieve neighbors. Any\n * missing vocabulary items are scored as losses.\n */\n#include <fcntl.h>\n#include <math.h>\n#include <pthread.h>\n#include <stdio.h>\n#include <stdlib.h>\n#include <string.h>\n#include <sys/stat.h>\n#include <sys/types.h>\n#include <unistd.h>\n\n#include <fstream>\n#include <iostream>\n#include <string>\n#include <unordered_map>\n#include <vector>\n\nstatic const char usage[] = R\"(\nPerforms analogy testing of embedding vectors.\n\nUsage:\n\n  analogy --embeddings <embeddings> --vocab <vocab> eval1.tab ...\n\nOptions:\n\n  --embeddings <filename>\n    The file containing the binary embedding vectors to evaluate.\n\n  --vocab <filename>\n    The vocabulary file corresponding to the embedding vectors.\n\n  --nthreads <integer>\n    The number of evaluation threads to run (default: 8)\n)\";\n\n// Reads the vocabulary file into a map from token to vector index.\nstatic std::unordered_map<std::string, int> ReadVocab(\n    const std::string& vocab_filename) {\n  std::unordered_map<std::string, int> vocab;\n  std::ifstream 
fin(vocab_filename);\n\n  int index = 0;\n  for (std::string token; std::getline(fin, token); ++index) {\n    auto n = token.find('\\t');\n    if (n != std::string::npos) token = token.substr(0, n);\n\n    vocab[token] = index;\n  }\n\n  return vocab;\n}\n\n// An analogy query: \"A is to B as C is to D\".\ntypedef std::tuple<int, int, int, int> AnalogyQuery;\n\nstd::vector<AnalogyQuery> ReadQueries(\n    const std::string &filename,\n    const std::unordered_map<std::string, int> &vocab, int *total) {\n  std::ifstream fin(filename);\n\n  std::vector<AnalogyQuery> queries;\n  int lineno = 0;\n  while (1) {\n    // Read the four words.\n    std::string words[4];\n    int nread = 0;\n    for (int i = 0; i < 4; ++i) {\n      fin >> words[i];\n      if (!words[i].empty()) ++nread;\n    }\n\n    ++lineno;\n    if (nread == 0) break;\n\n    if (nread < 4) {\n      std::cerr << \"expected four words at line \" << lineno << std::endl;\n      break;\n    }\n\n    // Look up each word's index.\n    int ixs[4], nvalid;\n    for (nvalid = 0; nvalid < 4; ++nvalid) {\n      std::unordered_map<std::string, int>::const_iterator it =\n          vocab.find(words[nvalid]);\n\n      if (it == vocab.end()) break;\n\n      ixs[nvalid] = it->second;\n    }\n\n    // If we don't have all the words, count it as a loss.\n    if (nvalid >= 4)\n      queries.push_back(std::make_tuple(ixs[0], ixs[1], ixs[2], ixs[3]));\n  }\n\n  *total = lineno;\n  return queries;\n}\n\n\n// A thread that evaluates some fraction of the analogies.\nclass AnalogyEvaluator {\n public:\n  // Creates a new Analogy evaluator for a range of analogy queries.\n  AnalogyEvaluator(std::vector<AnalogyQuery>::const_iterator begin,\n                   std::vector<AnalogyQuery>::const_iterator end,\n                   const float *embeddings, const int num_embeddings,\n                   const int dim)\n      : begin_(begin),\n        end_(end),\n        embeddings_(embeddings),\n        num_embeddings_(num_embeddings),\n        
dim_(dim) {}\n\n  // A thunk for pthreads.\n  static void* Run(void *param) {\n    AnalogyEvaluator *self = static_cast<AnalogyEvaluator*>(param);\n    self->Evaluate();\n    return nullptr;\n  }\n\n  // Evaluates the analogies.\n  void Evaluate();\n\n  // Returns the number of correct analogies after evaluation is complete.\n  int GetNumCorrect() const { return correct_; }\n\n protected:\n  // The beginning of the range of queries to consider.\n  std::vector<AnalogyQuery>::const_iterator begin_;\n\n  // The end of the range of queries to consider.\n  std::vector<AnalogyQuery>::const_iterator end_;\n\n  // The raw embedding vectors.\n  const float *embeddings_;\n\n  // The number of embedding vectors.\n  const int num_embeddings_;\n\n  // The embedding vector dimensionality.\n  const int dim_;\n\n  // The number of correct analogies.\n  int correct_;\n};\n\n\nvoid AnalogyEvaluator::Evaluate() {\n  float* sum = new float[dim_];\n\n  correct_ = 0;\n  for (auto query = begin_; query < end_; ++query) {\n    const float* vec;\n    int a, b, c, d;\n    std::tie(a, b, c, d) = *query;\n\n    // Compute C - A + B.\n    vec = embeddings_ + dim_ * c;\n    for (int i = 0; i < dim_; ++i) sum[i] = vec[i];\n\n    vec = embeddings_ + dim_ * a;\n    for (int i = 0; i < dim_; ++i) sum[i] -= vec[i];\n\n    vec = embeddings_ + dim_ * b;\n    for (int i = 0; i < dim_; ++i) sum[i] += vec[i];\n\n    // Find the nearest neighbor that isn't one of the query words.\n    int best_ix = -1;\n    float best_dot = -1.0;\n    for (int i = 0; i < num_embeddings_; ++i) {\n      if (i == a || i == b || i == c) continue;\n\n      vec = embeddings_ + dim_ * i;\n\n      float dot = 0;\n      for (int j = 0; j < dim_; ++j) dot += vec[j] * sum[j];\n\n      if (dot > best_dot) {\n        best_ix = i;\n        best_dot = dot;\n      }\n    }\n\n    // The fourth word is the answer; did we get it right?\n    if (best_ix == d) ++correct_;\n  }\n\n  delete[] sum;\n}\n\n\nint main(int argc, char *argv[]) {\n  
if (argc <= 1) {\n    printf(usage);\n    return 2;\n  }\n\n  std::string embeddings_filename, vocab_filename;\n  int nthreads = 8;\n\n  std::vector<std::string> input_filenames;\n  std::vector<std::tuple<int, int, int, int>> queries;\n\n  for (int i = 1; i < argc; ++i) {\n    std::string arg = argv[i];\n    if (arg == \"--embeddings\") {\n      if (++i >= argc) goto argmissing;\n      embeddings_filename = argv[i];\n    } else if (arg == \"--vocab\") {\n      if (++i >= argc) goto argmissing;\n      vocab_filename = argv[i];\n    } else if (arg == \"--nthreads\") {\n      if (++i >= argc) goto argmissing;\n      if ((nthreads = atoi(argv[i])) <= 0) goto badarg;\n    } else if (arg == \"--help\") {\n      std::cout << usage << std::endl;\n      return 0;\n    } else if (arg[0] == '-') {\n      std::cerr << \"unknown option: '\" << arg << \"'\" << std::endl;\n      return 2;\n    } else {\n      input_filenames.push_back(arg);\n    }\n\n    continue;\n\n  argmissing:\n    std::cerr << \"missing value for '\" << argv[i - 1] << \"' (--help for help)\"\n              << std::endl;\n    return 2;\n\n  badarg:\n    std::cerr << \"invalid value '\" << argv[i] << \"' for '\" << argv[i - 1]\n              << \"' (--help for help)\" << std::endl;\n\n    return 2;\n  }\n\n  // Read the vocabulary.\n  std::unordered_map<std::string, int> vocab = ReadVocab(vocab_filename);\n  if (!vocab.size()) {\n    std::cerr << \"unable to read vocabulary file '\" << vocab_filename << \"'\"\n              << std::endl;\n    return 1;\n  }\n\n  const int n = vocab.size();\n\n  // Read the vectors.\n  int fd;\n  if ((fd = open(embeddings_filename.c_str(), O_RDONLY)) < 0) {\n    std::cerr << \"unable to open embeddings file '\" << embeddings_filename\n              << \"'\" << std::endl;\n    return 1;\n  }\n\n  off_t nbytes = lseek(fd, 0, SEEK_END);\n  if (nbytes == -1) {\n    std::cerr << \"unable to determine file size for '\" << embeddings_filename\n              << \"'\" << std::endl;\n    
return 1;\n  }\n\n  if (nbytes % (sizeof(float) * n) != 0) {\n    std::cerr << \"'\" << embeddings_filename\n              << \"' has a strange file size; expected it to be \"\n                 \"a multiple of the vocabulary size\"\n              << std::endl;\n\n    return 1;\n  }\n\n  const int dim = nbytes / (sizeof(float) * n);\n  float *embeddings = static_cast<float *>(malloc(nbytes));\n  lseek(fd, 0, SEEK_SET);\n  if (read(fd, embeddings, nbytes) < nbytes) {\n    std::cerr << \"unable to read embeddings from \" << embeddings_filename\n              << std::endl;\n    return 1;\n  }\n\n  close(fd);\n\n  /* Normalize the vectors. */\n  for (int i = 0; i < n; ++i) {\n    float *vec = embeddings + dim * i;\n    float norm = 0;\n    for (int j = 0; j < dim; ++j) norm += vec[j] * vec[j];\n\n    norm = sqrt(norm);\n    for (int j = 0; j < dim; ++j) vec[j] /= norm;\n  }\n\n  pthread_attr_t attr;\n  if (pthread_attr_init(&attr) != 0) {\n    std::cerr << \"unable to initialize pthreads\" << std::endl;\n    return 1;\n  }\n\n  /* Read each input file. */\n  for (const auto &filename : input_filenames) {\n    int total = 0;\n    std::vector<AnalogyQuery> queries =\n        ReadQueries(filename.c_str(), vocab, &total);\n\n    const int queries_per_thread = queries.size() / nthreads;\n    std::vector<AnalogyEvaluator*> evaluators;\n    std::vector<pthread_t> threads;\n\n    for (int i = 0; i < nthreads; ++i) {\n      auto begin = queries.begin() + i * queries_per_thread;\n      auto end = (i + 1 < nthreads)\n                     ? 
queries.begin() + (i + 1) * queries_per_thread\n                     : queries.end();\n\n      AnalogyEvaluator *evaluator =\n          new AnalogyEvaluator(begin, end, embeddings, n, dim);\n\n      pthread_t thread;\n      pthread_create(&thread, &attr, AnalogyEvaluator::Run, evaluator);\n      evaluators.push_back(evaluator);\n      threads.push_back(thread);\n    }\n\n    for (auto &thread : threads) pthread_join(thread, 0);\n\n    int correct = 0;\n    for (const AnalogyEvaluator* evaluator : evaluators) {\n      correct += evaluator->GetNumCorrect();\n      delete evaluator;\n    }\n\n    printf(\"%0.3f %s\\n\", static_cast<float>(correct) / total, filename.c_str());\n  }\n\n  return 0;\n}\n"
  },
  {
    "path": "model_zoo/models/swivel/eval.mk",
    "content": "# -*- Mode: Makefile -*-\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n# This makefile pulls down the evaluation datasets and formats them uniformly.\n# Word similarity evaluations are formatted to contain exactly three columns:\n# the two words being compared and the human judgement.\n#\n# Use wordsim.py and analogy to run the actual evaluations.\n\nCXXFLAGS=-std=c++11 -m64 -mavx -g -Ofast -Wall\nLDLIBS=-lpthread -lm\n\nWORDSIM_EVALS=\tws353sim.ws.tab \\\n\t\tws353rel.ws.tab \\\n\t\tmen.ws.tab\t\\\n\t\tmturk.ws.tab \\\n\t\trarewords.ws.tab \\\n\t\tsimlex999.ws.tab \\\n\t\t$(NULL)\n\nANALOGY_EVALS=\tmikolov.an.tab \\\n\t\tmsr.an.tab \\\n\t\t$(NULL)\n\nall: $(WORDSIM_EVALS) $(ANALOGY_EVALS) analogy\n\nws353sim.ws.tab: ws353simrel.tar.gz\n\ttar Oxfz $^ wordsim353_sim_rel/wordsim_similarity_goldstandard.txt > $@\n\nws353rel.ws.tab: ws353simrel.tar.gz\n\ttar Oxfz $^ wordsim353_sim_rel/wordsim_relatedness_goldstandard.txt > $@\n\nmen.ws.tab: MEN.tar.gz\n\ttar Oxfz $^ MEN/MEN_dataset_natural_form_full | tr ' ' '\\t' > $@\n\nmturk.ws.tab: Mtruk.csv\n\tcat $^ | tr -d '\\r' | tr ',' '\\t' > $@\n\nrarewords.ws.tab: rw.zip\n\tunzip -p $^ rw/rw.txt | cut -f1-3 -d $$'\\t' > $@\n\nsimlex999.ws.tab: SimLex-999.zip\n\tunzip -p $^ SimLex-999/SimLex-999.txt \\\n\t| tail -n +2 | cut -f1,2,4 -d $$'\\t' > $@\n\nmikolov.an.tab: questions-words.txt\n\tegrep -v -E '^:' $^ | tr '[A-Z] ' '[a-z]\\t' > 
$@\n\nmsr.an.tab: myz_naacl13_test_set.tgz\n\ttar Oxfz $^ test_set/word_relationship.questions | tr ' ' '\\t' > /tmp/q\n\ttar Oxfz $^ test_set/word_relationship.answers | cut -f2 -d ' ' > /tmp/a\n\tpaste /tmp/q /tmp/a > $@\n\trm -f /tmp/q /tmp/a\n\n\n# wget commands to fetch the datasets.  Please see the original datasets for\n# appropriate references if you use these.\nws353simrel.tar.gz:\n\twget http://alfonseca.org/pubs/ws353simrel.tar.gz\n\nMEN.tar.gz:\n\twget http://clic.cimec.unitn.it/~elia.bruni/resources/MEN.tar.gz\n\nMtruk.csv:\n\twget http://tx.technion.ac.il/~kirar/files/Mtruk.csv\n\nrw.zip:\n\twget http://www-nlp.stanford.edu/~lmthang/morphoNLM/rw.zip\n\nSimLex-999.zip:\n\twget http://www.cl.cam.ac.uk/~fh295/SimLex-999.zip\n\nquestions-words.txt:\n\twget http://word2vec.googlecode.com/svn/trunk/questions-words.txt\n\nmyz_naacl13_test_set.tgz:\n\twget http://research.microsoft.com/en-us/um/people/gzweig/Pubs/myz_naacl13_test_set.tgz\n\nanalogy: analogy.cc\n\nclean:\n\trm -f *.ws.tab *.an.tab analogy *.pyc\n\ndistclean: clean\n\trm -f *.tgz *.tar.gz *.zip Mtruk.csv questions-words.txt\n"
  },
  {
    "path": "model_zoo/models/swivel/fastprep.cc",
    "content": "/* -*- Mode: C++ -*- */\n\n/*\n * Copyright 2016 Google Inc. All Rights Reserved.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *     http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n/*\n * This program starts with a text file (and optionally a vocabulary file) and\n * computes co-occurrence statistics. It emits output in a format that can be\n * consumed by the \"swivel\" program.  It's functionally equivalent to \"prep.py\",\n * but works much more quickly.\n */\n\n#include <assert.h>\n#include <fcntl.h>\n#include <pthread.h>\n#include <stdio.h>\n#include <sys/mman.h>\n#include <sys/stat.h>\n#include <unistd.h>\n\n#include <algorithm>\n#include <fstream>\n#include <iomanip>\n#include <iostream>\n#include <map>\n#include <string>\n#include <tuple>\n#include <unordered_map>\n#include <vector>\n\n#include \"google/protobuf/io/zero_copy_stream_impl.h\"\n#include \"tensorflow/core/example/example.pb.h\"\n#include \"tensorflow/core/example/feature.pb.h\"\n\nstatic const char usage[] = R\"(\nPrepares a corpus for processing by Swivel.\n\nUsage:\n\n  prep --output_dir <output-dir> --input <text-file>\n\nOptions:\n\n  --input <filename>\n      The input text.\n\n  --output_dir <directory>\n      Specifies the output directory where the various Swivel data\n      files should be placed.  
This directory must exist.\n\n  --shard_size <int>\n      Specifies the shard size; default 4096.\n\n  --min_count <int>\n      The minimum number of times a word should appear to be included in the\n      generated vocabulary; default 5.  (Ignored if --vocab is used.)\n\n  --max_vocab <int>\n      The maximum vocabulary size to generate from the input corpus; default\n      102,400.  (Ignored if --vocab is used.)\n\n  --vocab <filename>\n      Use the specified unigram vocabulary instead of generating\n      it from the corpus.\n\n  --window_size <int>\n      Specifies the window size for computing co-occurrence stats;\n      default 10.\n)\";\n\nstruct cooc_t {\n  int row;\n  int col;\n  float cnt;\n};\n\ntypedef std::map<long long, float> cooc_counts_t;\n\n// Retrieves the next word from the input stream, treating words as simply being\n// delimited by whitespace.  Returns true if this is the end of a \"sentence\";\n// i.e., a newline.\nbool NextWord(std::ifstream &fin, std::string* word) {\n  std::string buf;\n  char c;\n\n  if (fin.eof()) {\n    word->erase();\n    return true;\n  }\n\n  // Skip leading whitespace.\n  do {\n    c = fin.get();\n  } while (!fin.eof() && std::isspace(c));\n\n  if (fin.eof()) {\n    word->erase();\n    return true;\n  }\n\n  // Read the next word.\n  do {\n    buf += c;\n    c = fin.get();\n  } while (!fin.eof() && !std::isspace(c));\n\n  *word = buf;\n  if (c == '\\n' || fin.eof()) return true;\n\n  // Skip trailing whitespace.\n  do {\n    c = fin.get();\n  } while (!fin.eof() && std::isspace(c));\n\n  if (fin.eof()) return true;\n\n  fin.unget();\n  return false;\n}\n\n// Creates a vocabulary from the most frequent terms in the input file.\nstd::vector<std::string> CreateVocabulary(const std::string input_filename,\n                                          const int shard_size,\n                                          const int min_vocab_count,\n                                          const int max_vocab_size) {\n  
std::vector<std::string> vocab;\n\n  // Count all the distinct tokens in the file.  (XXX this will eventually\n  // consume all memory and should be re-written to periodically trim the data.)\n  std::unordered_map<std::string, long long> counts;\n\n  std::ifstream fin(input_filename, std::ifstream::ate);\n\n  if (!fin) {\n    std::cerr << \"couldn't read input file '\" << input_filename << \"'\"\n              << std::endl;\n\n    return vocab;\n  }\n\n  const auto input_size = fin.tellg();\n  fin.seekg(0);\n\n  long long ntokens = 0;\n  while (!fin.eof()) {\n    std::string word;\n    NextWord(fin, &word);\n    counts[word] += 1;\n\n    if (++ntokens % 1000000 == 0) {\n      const float pct = 100.0 * static_cast<float>(fin.tellg()) / input_size;\n      fprintf(stdout, \"\\rComputing vocabulary: %0.1f%% complete...\", pct);\n      std::flush(std::cout);\n    }\n  }\n\n  std::cout << counts.size() << \" distinct tokens\" << std::endl;\n\n  // Sort the vocabulary from most frequent to least frequent.\n  std::vector<std::pair<std::string, long long>> buf;\n  std::copy(counts.begin(), counts.end(), std::back_inserter(buf));\n  std::sort(buf.begin(), buf.end(),\n            [](const std::pair<std::string, long long> &a,\n               const std::pair<std::string, long long> &b) {\n              return b.second < a.second;\n            });\n\n  // Truncate to the maximum vocabulary size\n  if (static_cast<int>(buf.size()) > max_vocab_size) buf.resize(max_vocab_size);\n  if (buf.empty()) return vocab;\n\n  // Eliminate rare tokens and truncate to a size modulo the shard size.\n  int vocab_size = buf.size();\n  while (vocab_size > 0 && buf[vocab_size - 1].second < min_vocab_count)\n    --vocab_size;\n\n  vocab_size -= vocab_size % shard_size;\n  if (static_cast<int>(buf.size()) > vocab_size) buf.resize(vocab_size);\n\n  // Copy out the tokens.\n  for (const auto& pair : buf) vocab.push_back(pair.first);\n\n  return vocab;\n}\n\nstd::vector<std::string> 
ReadVocabulary(const std::string vocab_filename) {\n  std::vector<std::string> vocab;\n\n  std::ifstream fin(vocab_filename);\n  int index = 0;\n  for (std::string token; std::getline(fin, token); ++index) {\n    auto n = token.find('\\t');\n    if (n != std::string::npos) token = token.substr(0, n);\n\n    vocab.push_back(token);\n  }\n\n  return vocab;\n}\n\nvoid WriteVocabulary(const std::vector<std::string> &vocab,\n                     const std::string &output_dirname) {\n  for (const std::string filename : {\"row_vocab.txt\", \"col_vocab.txt\"}) {\n    std::ofstream fout(output_dirname + \"/\" + filename);\n    for (const auto &token : vocab) fout << token << std::endl;\n  }\n}\n\n// Manages accumulation of co-occurrence data into temporary disk buffer files.\nclass CoocBuffer {\n public:\n  CoocBuffer(const std::string &output_dirname, const int num_shards,\n             const int shard_size);\n\n  // Accumulate the co-occurrence counts to the buffer.\n  void AccumulateCoocs(const cooc_counts_t &coocs);\n\n  // Read the buffer to produce shard files.\n  void WriteShards();\n\n protected:\n  // The output directory. 
Also used for temporary buffer files.\n  const std::string output_dirname_;\n\n  // The number of row/column shards.\n  const int num_shards_;\n\n  // The number of elements per shard.\n  const int shard_size_;\n\n  // Parallel arrays of temporary file paths and file descriptors.\n  std::vector<std::string> paths_;\n  std::vector<int> fds_;\n\n  // Ensures that only one buffer file is getting written at a time.\n  pthread_mutex_t writer_mutex_;\n};\n\nCoocBuffer::CoocBuffer(const std::string &output_dirname, const int num_shards,\n                       const int shard_size)\n    : output_dirname_(output_dirname),\n      num_shards_(num_shards),\n      shard_size_(shard_size),\n      writer_mutex_(PTHREAD_MUTEX_INITIALIZER) {\n  for (int row = 0; row < num_shards_; ++row) {\n    for (int col = 0; col < num_shards_; ++col) {\n      char filename[256];\n      sprintf(filename, \"shard-%03d-%03d.tmp\", row, col);\n\n      std::string path = output_dirname + \"/\" + filename;\n      int fd = open(path.c_str(), O_RDWR | O_CREAT | O_TRUNC, 0666);\n      assert(fd > 0);\n\n      paths_.push_back(path);\n      fds_.push_back(fd);\n    }\n  }\n}\n\nvoid CoocBuffer::AccumulateCoocs(const cooc_counts_t &coocs) {\n  std::vector<std::vector<cooc_t>> bufs(fds_.size());\n\n  for (const auto &cooc : coocs) {\n    const int row_id = cooc.first >> 32;\n    const int col_id = cooc.first & 0xffffffff;\n    const float cnt = cooc.second;\n\n    const int row_shard = row_id % num_shards_;\n    const int row_off = row_id / num_shards_;\n    const int col_shard = col_id % num_shards_;\n    const int col_off = col_id / num_shards_;\n\n    const int top_shard_idx = row_shard * num_shards_ + col_shard;\n    bufs[top_shard_idx].push_back(cooc_t{row_off, col_off, cnt});\n\n    const int bot_shard_idx = col_shard * num_shards_ + row_shard;\n    bufs[bot_shard_idx].push_back(cooc_t{col_off, row_off, cnt});\n  }\n\n  // Append each buffer to its shard file, serialized by the writer mutex.\n  for (int i = 0; i < static_cast<int>(fds_.size()); ++i) {\n 
   int rv = pthread_mutex_lock(&writer_mutex_);\n    assert(rv == 0);\n    const int nbytes = bufs[i].size() * sizeof(cooc_t);\n    int nwritten = write(fds_[i], bufs[i].data(), nbytes);\n    assert(nwritten == nbytes);\n    pthread_mutex_unlock(&writer_mutex_);\n  }\n}\n\nvoid CoocBuffer::WriteShards() {\n  for (int shard = 0; shard < static_cast<int>(fds_.size()); ++shard) {\n    const int row_shard = shard / num_shards_;\n    const int col_shard = shard % num_shards_;\n\n    std::cout << \"\\rwriting shard \" << (shard + 1) << \"/\"\n              << (num_shards_ * num_shards_);\n    std::flush(std::cout);\n\n    // Construct the tf::Example proto.  First, we add the global rows and\n    // column that are present in the shard.\n    tensorflow::Example example;\n\n    auto &feature = *example.mutable_features()->mutable_feature();\n    auto global_row = feature[\"global_row\"].mutable_int64_list();\n    auto global_col = feature[\"global_col\"].mutable_int64_list();\n\n    for (int i = 0; i < shard_size_; ++i) {\n      global_row->add_value(row_shard + i * num_shards_);\n      global_col->add_value(col_shard + i * num_shards_);\n    }\n\n    // Next we add co-occurrences as a sparse representation.  
Map the\n    // co-occurrence counts that we've spooled off to disk: these are in\n    // arbitrary order and may contain duplicates.\n    const off_t nbytes = lseek(fds_[shard], 0, SEEK_END);\n    cooc_t *coocs = static_cast<cooc_t*>(\n        mmap(0, nbytes, PROT_READ | PROT_WRITE, MAP_SHARED, fds_[shard], 0));\n\n    const int ncoocs = nbytes / sizeof(cooc_t);\n    cooc_t* cur = coocs;\n    cooc_t* end = coocs + ncoocs;\n\n    auto sparse_value = feature[\"sparse_value\"].mutable_float_list();\n    auto sparse_local_row = feature[\"sparse_local_row\"].mutable_int64_list();\n    auto sparse_local_col = feature[\"sparse_local_col\"].mutable_int64_list();\n\n    std::sort(cur, end, [](const cooc_t &a, const cooc_t &b) {\n      return a.row < b.row || (a.row == b.row && a.col < b.col);\n    });\n\n    // Accumulate the counts into the protocol buffer.\n    int last_row = -1, last_col = -1;\n    float count = 0;\n    for (; cur != end; ++cur) {\n      if (cur->row != last_row || cur->col != last_col) {\n        if (last_row >= 0 && last_col >= 0) {\n          sparse_local_row->add_value(last_row);\n          sparse_local_col->add_value(last_col);\n          sparse_value->add_value(count);\n        }\n\n        last_row = cur->row;\n        last_col = cur->col;\n        count = 0;\n      }\n\n      count += cur->cnt;\n    }\n\n    if (last_row >= 0 && last_col >= 0) {\n      sparse_local_row->add_value(last_row);\n      sparse_local_col->add_value(last_col);\n      sparse_value->add_value(count);\n    }\n\n    munmap(coocs, nbytes);\n    close(fds_[shard]);\n\n    // Write the protocol buffer as a binary blob to disk.\n    char filename[256];\n    snprintf(filename, sizeof(filename), \"shard-%03d-%03d.pb\", row_shard,\n             col_shard);\n\n    const std::string path = output_dirname_ + \"/\" + filename;\n    int fd = open(path.c_str(), O_WRONLY | O_TRUNC | O_CREAT, 0666);\n    assert(fd != -1);\n\n    google::protobuf::io::FileOutputStream fout(fd);\n    
example.SerializeToZeroCopyStream(&fout);\n    fout.Close();\n\n    // Remove the temporary file.\n    unlink(paths_[shard].c_str());\n  }\n\n  std::cout << std::endl;\n}\n\n// Counts the co-occurrences in part of the file.\nclass CoocCounter {\n public:\n  CoocCounter(const std::string &input_filename, const off_t start,\n              const off_t end, const int window_size,\n              const std::unordered_map<std::string, int> &token_to_id_map,\n              CoocBuffer *coocbuf)\n      : fin_(input_filename, std::ifstream::ate),\n        start_(start),\n        end_(end),\n        window_size_(window_size),\n        token_to_id_map_(token_to_id_map),\n        coocbuf_(coocbuf),\n        marginals_(token_to_id_map.size()) {}\n\n  // Pthreads-friendly thunk to Count.\n  static void* Run(void* param) {\n    CoocCounter* self = static_cast<CoocCounter*>(param);\n    self->Count();\n    return nullptr;\n  }\n\n  // Counts the co-occurrences.\n  void Count();\n\n  const std::vector<double>& Marginals() const { return marginals_; }\n\n protected:\n  // The input stream.\n  std::ifstream fin_;\n\n  // The range of the file to which this counter should attend.\n  const off_t start_;\n  const off_t end_;\n\n  // The window size for computing co-occurrences.\n  const int window_size_;\n\n  // A reference to the mapping from tokens to IDs.\n  const std::unordered_map<std::string, int> &token_to_id_map_;\n\n  // The buffer into which counts are to be accumulated.\n  CoocBuffer* coocbuf_;\n\n  // The marginal counts accumulated by this counter.\n  std::vector<double> marginals_;\n};\n\nvoid CoocCounter::Count() {\n  const int max_coocs_size = 16 * 1024 * 1024;\n\n  // A buffer of co-occurrence counts that we'll periodically sort into\n  // shards.\n  cooc_counts_t coocs;\n\n  fin_.seekg(start_);\n\n  int nlines = 0;\n  for (off_t filepos = start_; filepos < end_; filepos = fin_.tellg()) {\n    // Buffer a single sentence.\n    std::vector<int> sentence;\n    bool eos;\n  
  do {\n      std::string word;\n      eos = NextWord(fin_, &word);\n      auto it = token_to_id_map_.find(word);\n      if (it != token_to_id_map_.end()) sentence.push_back(it->second);\n    } while (!eos);\n\n    // Generate the co-occurrences for the sentence.\n    for (int pos = 0; pos < static_cast<int>(sentence.size()); ++pos) {\n      const int left_id = sentence[pos];\n\n      const int window_extent =\n          std::min(static_cast<int>(sentence.size()) - pos, 1 + window_size_);\n\n      for (int off = 1; off < window_extent; ++off) {\n        const int right_id = sentence[pos + off];\n        const double count = 1.0 / static_cast<double>(off);\n        const long long lo = std::min(left_id, right_id);\n        const long long hi = std::max(left_id, right_id);\n        const long long key = (hi << 32) | lo;\n        coocs[key] += count;\n\n        marginals_[left_id] += count;\n        marginals_[right_id] += count;\n      }\n\n      marginals_[left_id] += 1.0;\n      const long long key = (static_cast<long long>(left_id) << 32) |\n                            static_cast<long long>(left_id);\n\n      coocs[key] += 0.5;\n    }\n\n    // Periodically flush the co-occurrences to disk.\n    if (coocs.size() > max_coocs_size) {\n      coocbuf_->AccumulateCoocs(coocs);\n      coocs.clear();\n    }\n\n    if (start_ == 0 && ++nlines % 1000 == 0) {\n      const double pct = 100.0 * filepos / end_;\n      fprintf(stdout, \"\\rComputing co-occurrences: %0.1f%% complete...\", pct);\n      std::flush(std::cout);\n    }\n  }\n\n  // Accumulate anything we haven't flushed yet.\n  coocbuf_->AccumulateCoocs(coocs);\n\n  if (start_ == 0) std::cout << \"done.\" << std::endl;\n}\n\nvoid WriteMarginals(const std::vector<double> &marginals,\n                    const std::string &output_dirname) {\n  for (const std::string filename : {\"row_sums.txt\", \"col_sums.txt\"}) {\n    std::ofstream fout(output_dirname + \"/\" + filename);\n    fout.setf(std::ios::fixed);\n    for 
(double sum : marginals) fout << sum << std::endl;\n  }\n}\n\nint main(int argc, char *argv[]) {\n  std::string input_filename;\n  std::string vocab_filename;\n  std::string output_dirname;\n  bool generate_vocab = true;\n  int max_vocab_size = 100 * 1024;\n  int min_vocab_count = 5;\n  int window_size = 10;\n  int shard_size = 4096;\n  int num_threads = 4;\n\n  for (int i = 1; i < argc; ++i) {\n    std::string arg(argv[i]);\n    if (arg == \"--vocab\") {\n      if (++i >= argc) goto argmissing;\n      generate_vocab = false;\n      vocab_filename = argv[i];\n    } else if (arg == \"--max_vocab\") {\n      if (++i >= argc) goto argmissing;\n      if ((max_vocab_size = atoi(argv[i])) <= 0) goto badarg;\n    } else if (arg == \"--min_count\") {\n      if (++i >= argc) goto argmissing;\n      if ((min_vocab_count = atoi(argv[i])) <= 0) goto badarg;\n    } else if (arg == \"--window_size\") {\n      if (++i >= argc) goto argmissing;\n      if ((window_size = atoi(argv[i])) <= 0) goto badarg;\n    } else if (arg == \"--input\") {\n      if (++i >= argc) goto argmissing;\n      input_filename = argv[i];\n    } else if (arg == \"--output_dir\") {\n      if (++i >= argc) goto argmissing;\n      output_dirname = argv[i];\n    } else if (arg == \"--shard_size\") {\n      if (++i >= argc) goto argmissing;\n      shard_size = atoi(argv[i]);\n    } else if (arg == \"--num_threads\") {\n      if (++i >= argc) goto argmissing;\n      num_threads = atoi(argv[i]);\n    } else if (arg == \"--help\") {\n      std::cout << usage << std::endl;\n      return 0;\n    } else {\n      std::cerr << \"unknown arg '\" << arg << \"'; try --help?\" << std::endl;\n      return 2;\n    }\n\n    continue;\n\n  badarg:\n    std::cerr << \"'\" << argv[i] << \"' is not a valid value for '\" << arg\n              << \"'; try --help?\" << std::endl;\n\n    return 2;\n\n  argmissing:\n    std::cerr << arg << \" requires an argument; try --help?\" << std::endl;\n\n    return 2;\n  }\n\n  if (input_filename.empty()) {\n   
 std::cerr << \"please specify the input text with '--input'; try --help?\"\n              << std::endl;\n    return 2;\n  }\n\n  if (output_dirname.empty()) {\n    std::cerr << \"please specify the output directory with '--output_dir'\"\n              << std::endl;\n\n    return 2;\n  }\n\n  struct stat sb;\n  if (lstat(output_dirname.c_str(), &sb) != 0 || !S_ISDIR(sb.st_mode)) {\n    std::cerr << \"output directory '\" << output_dirname\n              << \"' does not exist or is not a directory.\" << std::endl;\n\n    return 1;\n  }\n\n  if (lstat(input_filename.c_str(), &sb) != 0 || !S_ISREG(sb.st_mode)) {\n    std::cerr << \"input file '\" << input_filename\n              << \"' does not exist or is not a file.\" << std::endl;\n\n    return 1;\n  }\n\n  // The total size of the input.\n  const off_t input_size = sb.st_size;\n\n  const std::vector<std::string> vocab =\n      generate_vocab ? CreateVocabulary(input_filename, shard_size,\n                                        min_vocab_count, max_vocab_size)\n                     : ReadVocabulary(vocab_filename);\n\n  if (!vocab.size()) {\n    std::cerr << \"Empty vocabulary.\" << std::endl;\n    return 1;\n  }\n\n  std::cout << \"Generating Swivel co-occurrence data into \" << output_dirname\n            << std::endl;\n\n  std::cout << \"Shard size: \" << shard_size << \"x\" << shard_size << std::endl;\n  std::cout << \"Vocab size: \" << vocab.size() << std::endl;\n\n  // Write the vocabulary files into the output directory.\n  WriteVocabulary(vocab, output_dirname);\n\n  const int num_shards = vocab.size() / shard_size;\n  CoocBuffer coocbuf(output_dirname, num_shards, shard_size);\n\n  // Build a mapping from the token to its position in the vocabulary file.\n  std::unordered_map<std::string, int> token_to_id_map;\n  for (int i = 0; i < static_cast<int>(vocab.size()); ++i)\n    token_to_id_map[vocab[i]] = i;\n\n  // Compute the co-occurrences\n  std::vector<pthread_t> threads;\n  std::vector<CoocCounter*> 
counters;\n  const off_t nbytes_per_thread = input_size / num_threads;\n\n  pthread_attr_t attr;\n  if (pthread_attr_init(&attr) != 0) {\n    std::cerr << \"unable to initialize pthreads\" << std::endl;\n    return 1;\n  }\n\n  for (int i = 0; i < num_threads; ++i) {\n    // We could make this smarter and look around for newlines.  But\n    // realistically that's not going to change things much.\n    const off_t start = i * nbytes_per_thread;\n    const off_t end =\n        i < num_threads - 1 ? (i + 1) * nbytes_per_thread : input_size;\n\n    CoocCounter *counter = new CoocCounter(\n        input_filename, start, end, window_size, token_to_id_map, &coocbuf);\n\n    counters.push_back(counter);\n\n    pthread_t thread;\n    pthread_create(&thread, &attr, CoocCounter::Run, counter);\n\n    threads.push_back(thread);\n  }\n\n  // Wait for threads to finish and collect marginals.\n  std::vector<double> marginals(vocab.size());\n  for (int i = 0; i < num_threads; ++i) {\n    pthread_join(threads[i], 0);\n\n    const std::vector<double>& counter_marginals = counters[i]->Marginals();\n    for (int j = 0; j < static_cast<int>(vocab.size()); ++j)\n      marginals[j] += counter_marginals[j];\n\n    delete counters[i];\n  }\n\n  std::cout << \"writing marginals...\" << std::endl;\n  WriteMarginals(marginals, output_dirname);\n\n  std::cout << \"writing shards...\" << std::endl;\n  coocbuf.WriteShards();\n\n  return 0;\n}\n"
  },
  {
    "path": "model_zoo/models/swivel/fastprep.mk",
    "content": "# -*- Mode: Makefile -*-\n\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\n\n# This makefile builds \"fastprep\", a faster version of prep.py that can be used\n# to build training data for Swivel.  Building \"fastprep\" is a bit more\n# involved: you'll need to pull and build the Tensorflow source, and then build\n# and install compatible protobuf software.  We've tested this with Tensorflow\n# version 0.7.\n#\n# = Step 1. Pull and Build Tensorflow. =\n#\n# These instructions are somewhat abridged; for pre-requisites and the most\n# up-to-date instructions, refer to:\n#\n#   <https://www.tensorflow.org/versions/r0.7/get_started/os_setup.html#installing-from-sources>\n#\n# To build the Tensorflow components required for \"fastprep\", you'll need to\n# install Bazel, Numpy, Swig, and Python development headers as described at\n# the above URL.  Run the \"configure\" script as appropriate for your\n# environment and then build the \"build_pip_package\" target:\n#\n#   bazel build -c opt [--config=cuda] //tensorflow/tools/pip_package:build_pip_package\n#\n# This will generate the Tensorflow headers and libraries necessary for\n# \"fastprep\".\n#\n#\n# = Step 2. 
Build and Install Compatible Protobuf Libraries =\n#\n# \"fastprep\" also needs compatible protocol buffer libraries, which you can\n# build from the protobuf implementation included with the Tensorflow\n# distribution:\n#\n#   cd ${TENSORFLOW_SRCDIR}/google/protobuf\n#   ./autogen.sh\n#   ./configure --prefix=${HOME}  # ...or whatever\n#   make\n#   make install  # ...or maybe \"sudo make install\"\n#\n# This will install the headers and libraries appropriately.\n#\n#\n# = Step 3. Build \"fastprep\". =\n#\n# Finally modify this file (if necessary) to update PB_DIR and TF_DIR to refer\n# to appropriate locations, and:\n#\n#   make -f fastprep.mk\n#\n# If all goes well, you should have a program that is \"flag compatible\" with\n# \"prep.py\" and runs significantly faster.  Use it to generate the co-occurrence\n# matrices and other files necessary to train a Swivel matrix.\n\n\n# The root directory where the Google Protobuf software is installed.\n# Alternative locations might be \"/usr\" or \"/usr/local\".\nPB_DIR=$(HOME)\n\n# Assuming you've got the Tensorflow source unpacked and built in ${HOME}/src:\nTF_DIR=$(HOME)/src/tensorflow\n\nPB_INCLUDE=$(PB_DIR)/include\nTF_INCLUDE=$(TF_DIR)/bazel-genfiles\nCXXFLAGS=-std=c++11 -m64 -mavx -g -Ofast -Wall -I$(TF_INCLUDE) -I$(PB_INCLUDE)\n\nPB_LIBDIR=$(PB_DIR)/lib\nTF_LIBDIR=$(TF_DIR)/bazel-bin/tensorflow/core\nLDFLAGS=-L$(TF_LIBDIR) -L$(PB_LIBDIR)\nLDLIBS=-lprotos_all_cc -lprotobuf -lpthread -lm\n\nfastprep: fastprep.cc\n"
  },
  {
    "path": "model_zoo/models/swivel/glove_to_shards.py",
    "content": "#!/usr/bin/env python\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Converts a Glove binary co-occurrence matrix into Swivel shards.\n\nUsage:\n\n  glove_to_shards.py --input <coocs> --vocab <vocab> --output_dir <output_dir>\n\nOptions:\n\n  --input <coocs>\n      The Glove co-occurrence file.\n\n  --vocab <vocab>\n      Path to the vocabulary text file, one token per line.\n\n  --output_dir <directory>\n      Specifies the output directory where the various Swivel data\n      files should be placed.\n\n  --shard_size <int>\n      Specifies the shard size; default 4096.\n\"\"\"\n\nfrom __future__ import print_function\n\nimport itertools\nimport os\nimport struct\nimport sys\n\nimport tensorflow as tf\n\nflags = tf.app.flags\n\nflags.DEFINE_string('input', 'coocurrences.bin', 'Glove co-occurrence file')\nflags.DEFINE_string('vocab', 'vocab.txt', 'Vocabulary file')\nflags.DEFINE_string('output_dir', '/tmp/swivel_data', 'Output directory')\nflags.DEFINE_integer('shard_size', 4096, 'Shard size')\n\nFLAGS = tf.app.flags.FLAGS\n\nglove_cooc_fmt = struct.Struct('iid')\nshard_cooc_fmt = struct.Struct('if')\n\n\ndef make_shard_files(coocs, nshards, vocab_sz):\n  \"\"\"Chops the binary Glove co-occurrence matrix into shards.\n\n  This reads the Glove binary co-occurrence file and assigns individual\n  co-occurrence counts to the appropriate Swivel shard.\n\n  Args:\n    coocs: the co-occurrence file 
to read\n    nshards: the number of shards along one dimension of the square matrix\n    vocab_sz: the vocabulary size\n\n  Returns:\n    A (shard_table, marginals) tuple.  The shard_table maps the row and column\n    shard ID to a file handle containing the co-occurrences for that shard; the\n    marginals contain the marginal sums.\n  \"\"\"\n  row_sums = [0] * vocab_sz\n  col_sums = [0] * vocab_sz\n\n  coocs.seek(0, os.SEEK_END)\n  ncoocs = coocs.tell() / glove_cooc_fmt.size\n  coocs.seek(0, os.SEEK_SET)\n\n  shard_files = {}\n\n  for row in range(nshards):\n    for col in range(nshards):\n      filename = os.path.join(\n          FLAGS.output_dir, 'shard-%03d-%03d.bin' % (row, col))\n\n      shard_files[(row, col)] = open(filename, 'w+')\n\n  for ix in xrange(ncoocs):\n    if ix % 1000000 == 0:\n      sys.stdout.write('\\rsharding co-occurrences: %0.1f%% (%d/%d)' % (\n          100.0 * ix / ncoocs, ix, ncoocs))\n\n      sys.stdout.flush()\n\n    bits = coocs.read(glove_cooc_fmt.size)\n    if not bits:\n      break\n\n    # Glove has 1-indexed IDs.\n    row_id, col_id, cnt = glove_cooc_fmt.unpack(bits)\n    if row_id > vocab_sz or col_id > vocab_sz:\n      continue\n\n    row_id -= 1\n    row_shard = row_id % nshards\n    row_off = row_id / nshards\n\n    col_id -= 1\n    col_shard = col_id % nshards\n    col_off = col_id / nshards\n\n    shard_pos = row_off * FLAGS.shard_size + col_off  # row major\n\n    shard_files[(row_shard, col_shard)].write(\n        shard_cooc_fmt.pack(shard_pos, cnt))\n\n    # Accumulate marginals.\n    row_sums[row_id] += cnt\n    col_sums[col_id] += cnt\n\n  sys.stdout.write('\\n')\n\n  if any(abs(r - c) > 0.1 for r, c in itertools.izip(row_sums, col_sums)):\n    print('WARNING! 
Row and column marginals differ; is your matrix symmetric?',\n          file=sys.stderr)\n\n  return (shard_files, row_sums)\n\ndef main(_):\n  with open(FLAGS.vocab, 'r') as lines:\n    orig_vocab_sz = sum(1 for _ in lines)\n\n  shard_sz = FLAGS.shard_size\n  vocab_sz = orig_vocab_sz - orig_vocab_sz % shard_sz\n  nshards = vocab_sz / shard_sz\n\n  print('vocab size is %d (originally %d), %d %dx%d-element shards' % (\n      vocab_sz, orig_vocab_sz, nshards * nshards, shard_sz, shard_sz))\n\n  # Create the output directory, if necessary\n  if FLAGS.output_dir and not os.path.isdir(FLAGS.output_dir):\n    os.makedirs(FLAGS.output_dir)\n\n  with open(FLAGS.input, 'r') as coocs:\n    shard_files, marginals = make_shard_files(coocs, nshards, vocab_sz)\n\n  # Now sort the shards and write the TFRecords.\n  filename = os.path.join(FLAGS.output_dir, 'shards.recs')\n  with tf.python_io.TFRecordWriter(filename) as writer:\n    ix = 0\n    for (row, col), fh in shard_files.iteritems():\n      ix += 1\n      sys.stdout.write('\\rwriting shard %d/%d' % (ix, len(shard_files)))\n      sys.stdout.flush()\n\n      fh.seek(0)\n      buf = fh.read()\n      os.unlink(fh.name)\n      fh.close()\n\n      coocs = [\n          shard_cooc_fmt.unpack_from(buf, off)\n          for off in range(0, len(buf), shard_cooc_fmt.size)]\n\n      # N.B. 
we assume that there aren't any duplicates here!\n      coocs.sort(key=lambda kv: kv[0])\n\n      def _int64s(xs):\n        return tf.train.Feature(int64_list=tf.train.Int64List(value=list(xs)))\n\n      def _floats(xs):\n        return tf.train.Feature(float_list=tf.train.FloatList(value=list(xs)))\n\n      example = tf.train.Example(features=tf.train.Features(feature={\n          'global_row': _int64s(row + nshards * i for i in range(shard_sz)),\n          'global_col': _int64s(col + nshards * i for i in range(shard_sz)),\n          'sparse_local_row': _int64s(pos / shard_sz for pos, _ in coocs),\n          'sparse_local_col': _int64s(pos % shard_sz for pos, _ in coocs),\n          'sparse_value': _floats(cnt for _, cnt in coocs)}))\n\n      writer.write(example.SerializeToString())\n\n  print('\\nwriting marginals...')\n\n  with open(os.path.join(FLAGS.output_dir, 'marginals.txt'), 'w') as fh:\n    for cnt in marginals:\n      fh.write('%0.1f\\n' % cnt)\n\n  print('done!')\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/swivel/nearest.py",
    "content": "#!/usr/bin/env python\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Simple tool for inspecting nearest neighbors and analogies.\"\"\"\n\nimport re\nimport sys\nfrom getopt import GetoptError, getopt\n\nfrom vecs import Vecs\n\ntry:\n  opts, args = getopt(sys.argv[1:], 'v:e:', ['vocab=', 'embeddings='])\nexcept GetoptError, e:\n  print >> sys.stderr, e\n  sys.exit(2)\n\nopt_vocab = 'vocab.txt'\nopt_embeddings = None\n\nfor o, a in opts:\n  if o in ('-v', '--vocab'):\n    opt_vocab = a\n  if o in ('-e', '--embeddings'):\n    opt_embeddings = a\n\nvecs = Vecs(opt_vocab, opt_embeddings)\n\nwhile True:\n  sys.stdout.write('query> ')\n  sys.stdout.flush()\n\n  query = sys.stdin.readline().strip()\n  if not query:\n    break\n\n  parts = re.split(r'\\s+', query)\n\n  if len(parts) == 1:\n    res = vecs.neighbors(parts[0])\n\n  elif len(parts) == 3:\n    vs = [vecs.lookup(w) for w in parts]\n    if any(v is None for v in vs):\n      print 'not in vocabulary: %s' % (\n          ', '.join(tok for tok, v in zip(parts, vs) if v is None))\n\n      continue\n\n    res = vecs.neighbors(vs[2] - vs[0] + vs[1])\n\n  else:\n    print 'use a single word to query neighbors, or three words for analogy'\n    continue\n\n  if not res:\n    continue\n\n  for word, sim in res[:20]:\n    print '%0.4f: %s' % (sim, word)\n\n  print\n"
  },
  {
    "path": "model_zoo/models/swivel/prep.py",
    "content": "#!/usr/bin/env python\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Prepare a corpus for processing by swivel.\n\nCreates a sharded word co-occurrence matrix from a text file input corpus.\n\nUsage:\n\n  prep.py --output_dir <output-dir> --input <text-file>\n\nOptions:\n\n  --input <filename>\n      The input text.\n\n  --output_dir <directory>\n      Specifies the output directory where the various Swivel data\n      files should be placed.\n\n  --shard_size <int>\n      Specifies the shard size; default 4096.\n\n  --min_count <int>\n      Specifies the minimum number of times a word should appear\n      to be included in the vocabulary; default 5.\n\n  --max_vocab <int>\n      Specifies the maximum vocabulary size; default shard size\n      times 1024.\n\n  --vocab <filename>\n      Use the specified unigram vocabulary instead of generating\n      it from the corpus.\n\n  --window_size <int>\n      Specifies the window size for computing co-occurrence stats;\n      default 10.\n\n  --bufsz <int>\n      The number of co-occurrences that are buffered; default 16M.\n\n\"\"\"\n\nimport itertools\nimport math\nimport os\nimport struct\nimport sys\n\nimport tensorflow as tf\n\nflags = tf.app.flags\n\nflags.DEFINE_string('input', '', 'The input text.')\nflags.DEFINE_string('output_dir', '/tmp/swivel_data',\n                    'Output directory for Swivel 
data')\nflags.DEFINE_integer('shard_size', 4096, 'The size for each shard')\nflags.DEFINE_integer('min_count', 5,\n                     'The minimum number of times a word should occur to be '\n                     'included in the vocabulary')\nflags.DEFINE_integer('max_vocab', 4096 * 64, 'The maximum vocabulary size')\nflags.DEFINE_string('vocab', '', 'Vocabulary to use instead of generating one')\nflags.DEFINE_integer('window_size', 10, 'The window size')\nflags.DEFINE_integer('bufsz', 16 * 1024 * 1024,\n                     'The number of co-occurrences to buffer')\n\nFLAGS = flags.FLAGS\n\nshard_cooc_fmt = struct.Struct('iif')\n\n\ndef words(line):\n  \"\"\"Splits a line of text into tokens.\"\"\"\n  return line.strip().split()\n\n\ndef create_vocabulary(lines):\n  \"\"\"Reads text lines and generates a vocabulary.\"\"\"\n  lines.seek(0, os.SEEK_END)\n  nbytes = lines.tell()\n  lines.seek(0, os.SEEK_SET)\n\n  vocab = {}\n  for lineno, line in enumerate(lines, start=1):\n    for word in words(line):\n      vocab.setdefault(word, 0)\n      vocab[word] += 1\n\n    if lineno % 100000 == 0:\n      pos = lines.tell()\n      sys.stdout.write('\\rComputing vocabulary: %0.1f%% (%d/%d)...' 
% (\n          100.0 * pos / nbytes, pos, nbytes))\n      sys.stdout.flush()\n\n  sys.stdout.write('\\n')\n\n  vocab = [(tok, n) for tok, n in vocab.iteritems() if n >= FLAGS.min_count]\n  vocab.sort(key=lambda kv: (-kv[1], kv[0]))\n\n  num_words = min(len(vocab), FLAGS.max_vocab)\n  if num_words % FLAGS.shard_size != 0:\n    num_words -= num_words % FLAGS.shard_size\n\n  if not num_words:\n    raise Exception('empty vocabulary')\n\n  print 'vocabulary contains %d tokens' % num_words\n\n  vocab = vocab[:num_words]\n  return [tok for tok, n in vocab]\n\n\ndef write_vocab_and_sums(vocab, sums, vocab_filename, sums_filename):\n  \"\"\"Writes vocabulary and marginal sum files.\"\"\"\n  with open(os.path.join(FLAGS.output_dir, vocab_filename), 'w') as vocab_out:\n    with open(os.path.join(FLAGS.output_dir, sums_filename), 'w') as sums_out:\n      for tok, cnt in itertools.izip(vocab, sums):\n        print >> vocab_out, tok\n        print >> sums_out, cnt\n\n\ndef compute_coocs(lines, vocab):\n  \"\"\"Compute the co-occurrence statistics from the text.\n\n  This generates a temporary file for each shard that contains the intermediate\n  counts from the shard: these counts must be subsequently sorted and collated.\n\n  \"\"\"\n  word_to_id = {tok: idx for idx, tok in enumerate(vocab)}\n\n  lines.seek(0, os.SEEK_END)\n  nbytes = lines.tell()\n  lines.seek(0, os.SEEK_SET)\n\n  num_shards = len(vocab) / FLAGS.shard_size\n\n  shardfiles = {}\n  for row in range(num_shards):\n    for col in range(num_shards):\n      filename = os.path.join(\n          FLAGS.output_dir, 'shard-%03d-%03d.tmp' % (row, col))\n\n      shardfiles[(row, col)] = open(filename, 'w+')\n\n  def flush_coocs():\n    for (row_id, col_id), cnt in coocs.iteritems():\n      row_shard = row_id % num_shards\n      row_off = row_id / num_shards\n      col_shard = col_id % num_shards\n      col_off = col_id / num_shards\n\n      # Since we only stored (a, b), we emit both (a, b) and (b, a).\n      
shardfiles[(row_shard, col_shard)].write(\n          shard_cooc_fmt.pack(row_off, col_off, cnt))\n\n      shardfiles[(col_shard, row_shard)].write(\n          shard_cooc_fmt.pack(col_off, row_off, cnt))\n\n  coocs = {}\n  sums = [0.0] * len(vocab)\n\n  for lineno, line in enumerate(lines, start=1):\n    # Computes the word IDs for each word in the sentence.  This has the effect\n    # of \"stretching\" the window past OOV tokens.\n    wids = filter(\n        lambda wid: wid is not None,\n        (word_to_id.get(w) for w in words(line)))\n\n    for pos in xrange(len(wids)):\n      lid = wids[pos]\n      window_extent = min(FLAGS.window_size + 1, len(wids) - pos)\n      for off in xrange(1, window_extent):\n        rid = wids[pos + off]\n        pair = (min(lid, rid), max(lid, rid))\n        count = 1.0 / off\n        sums[lid] += count\n        sums[rid] += count\n        coocs.setdefault(pair, 0.0)\n        coocs[pair] += count\n\n      sums[lid] += 1.0\n      pair = (lid, lid)\n      coocs.setdefault(pair, 0.0)\n      coocs[pair] += 0.5  # Only add 1/2 since we output (a, b) and (b, a)\n\n    if lineno % 10000 == 0:\n      pos = lines.tell()\n      sys.stdout.write('\\rComputing co-occurrences: %0.1f%% (%d/%d)...' % (\n          100.0 * pos / nbytes, pos, nbytes))\n      sys.stdout.flush()\n\n      if len(coocs) > FLAGS.bufsz:\n        flush_coocs()\n        coocs = {}\n\n  flush_coocs()\n  sys.stdout.write('\\n')\n\n  return shardfiles, sums\n\n\ndef write_shards(vocab, shardfiles):\n  \"\"\"Processes the temporary files to generate the final shard data.\n\n  The shard data is stored as serialized tf.Example protos, one file per shard. 
The\n  temporary files are removed from the filesystem once they've been processed.\n\n  \"\"\"\n  num_shards = len(vocab) / FLAGS.shard_size\n\n  ix = 0\n  for (row, col), fh in shardfiles.iteritems():\n    ix += 1\n    sys.stdout.write('\\rwriting shard %d/%d' % (ix, len(shardfiles)))\n    sys.stdout.flush()\n\n    # Read the entire binary co-occurrence and unpack it into an array.\n    fh.seek(0)\n    buf = fh.read()\n    os.unlink(fh.name)\n    fh.close()\n\n    coocs = [\n        shard_cooc_fmt.unpack_from(buf, off)\n        for off in range(0, len(buf), shard_cooc_fmt.size)]\n\n    # Sort and merge co-occurrences for the same pairs.\n    coocs.sort()\n\n    if coocs:\n      current_pos = 0\n      current_row_col = (coocs[current_pos][0], coocs[current_pos][1])\n      for next_pos in range(1, len(coocs)):\n        next_row_col = (coocs[next_pos][0], coocs[next_pos][1])\n        if current_row_col == next_row_col:\n          coocs[current_pos] = (\n              coocs[current_pos][0],\n              coocs[current_pos][1],\n              coocs[current_pos][2] + coocs[next_pos][2])\n        else:\n          current_pos += 1\n          if current_pos < next_pos:\n            coocs[current_pos] = coocs[next_pos]\n\n          current_row_col = (coocs[current_pos][0], coocs[current_pos][1])\n\n      coocs = coocs[:(1 + current_pos)]\n\n    # Convert to a TF Example proto.\n    def _int64s(xs):\n      return tf.train.Feature(int64_list=tf.train.Int64List(value=list(xs)))\n\n    def _floats(xs):\n      return tf.train.Feature(float_list=tf.train.FloatList(value=list(xs)))\n\n    example = tf.train.Example(features=tf.train.Features(feature={\n        'global_row': _int64s(\n            row + num_shards * i for i in range(FLAGS.shard_size)),\n        'global_col': _int64s(\n            col + num_shards * i for i in range(FLAGS.shard_size)),\n\n        'sparse_local_row': _int64s(cooc[0] for cooc in coocs),\n        'sparse_local_col': _int64s(cooc[1] for cooc in 
coocs),\n        'sparse_value': _floats(cooc[2] for cooc in coocs),\n    }))\n\n    filename = os.path.join(FLAGS.output_dir, 'shard-%03d-%03d.pb' % (row, col))\n    with open(filename, 'w') as out:\n      out.write(example.SerializeToString())\n\n  sys.stdout.write('\\n')\n\n\ndef main(_):\n  # Create the output directory, if necessary\n  if FLAGS.output_dir and not os.path.isdir(FLAGS.output_dir):\n    os.makedirs(FLAGS.output_dir)\n\n  # Read the file once to create the vocabulary.\n  if FLAGS.vocab:\n    with open(FLAGS.vocab, 'r') as lines:\n      vocab = [line.strip() for line in lines]\n  else:\n    with open(FLAGS.input, 'r') as lines:\n      vocab = create_vocabulary(lines)\n\n  # Now read the file again to determine the co-occurrence stats.\n  with open(FLAGS.input, 'r') as lines:\n    shardfiles, sums = compute_coocs(lines, vocab)\n\n  # Write each shard's co-occurrence data to its own protobuf file.\n  write_shards(vocab, shardfiles)\n\n  # Now write the marginals.  They're symmetric for this application.\n  write_vocab_and_sums(vocab, sums, 'row_vocab.txt', 'row_sums.txt')\n  write_vocab_and_sums(vocab, sums, 'col_vocab.txt', 'col_sums.txt')\n\n  print 'done!'\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/swivel/swivel.py",
    "content": "#!/usr/bin/env python\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Submatrix-wise Vector Embedding Learner.\n\nImplementation of the Swivel algorithm described at:\nhttp://arxiv.org/abs/1602.02215\n\nThis program expects an input directory that contains the following files.\n\n  row_vocab.txt, col_vocab.txt\n\n    The row and column vocabulary files.  Each file should contain one token per\n    line; these will be used to generate a tab-separated file containing the\n    trained embeddings.\n\n  row_sums.txt, col_sums.txt\n\n    The matrix row and column marginal sums.  Each file should contain one\n    decimal floating point number per line which corresponds to the marginal\n    count of the matrix for that row or column.\n\n  shards.recs\n\n    A file containing the sub-matrix shards, stored as TFRecords.  Each shard is\n    expected to be a serialized tf.Example protocol buffer with the following\n    properties:\n\n      global_row: the global row indices contained in the shard\n      global_col: the global column indices contained in the shard\n      sparse_local_row, sparse_local_col, sparse_value: three parallel arrays\n      that are a sparse representation of the submatrix counts.\n\nIt will generate embeddings, training from the input directory for the specified\nnumber of epochs.  
When complete, it will output the trained vectors to a\ntab-separated file that contains one line per embedding.  Row and column\nembeddings are stored in separate files.\n\n\"\"\"\n\nimport argparse\nimport glob\nimport math\nimport os\nimport sys\nimport time\nimport threading\n\nimport numpy as np\nimport tensorflow as tf\n\nflags = tf.app.flags\n\nflags.DEFINE_string('input_base_path', '/tmp/swivel_data',\n                    'Directory containing input shards, vocabularies, '\n                    'and marginals.')\nflags.DEFINE_string('output_base_path', '/tmp/swivel_data',\n                    'Path where to write the trained embeddings.')\nflags.DEFINE_integer('embedding_size', 300, 'Size of the embeddings')\nflags.DEFINE_boolean('trainable_bias', False, 'Biases are trainable')\nflags.DEFINE_integer('submatrix_rows', 4096, 'Rows in each training submatrix. '\n                     'This must match the training data.')\nflags.DEFINE_integer('submatrix_cols', 4096, 'Cols in each training submatrix. 
'\n                     'This must match the training data.')\nflags.DEFINE_float('loss_multiplier', 1.0 / 4096,\n                   'constant multiplier on loss.')\nflags.DEFINE_float('confidence_exponent', 0.5,\n                   'Exponent for l2 confidence function')\nflags.DEFINE_float('confidence_scale', 0.25, 'Scale for l2 confidence function')\nflags.DEFINE_float('confidence_base', 0.1, 'Base for l2 confidence function')\nflags.DEFINE_float('learning_rate', 1.0, 'Initial learning rate')\nflags.DEFINE_integer('num_concurrent_steps', 2,\n                     'Number of threads to train with')\nflags.DEFINE_float('num_epochs', 40, 'Number epochs to train for')\nflags.DEFINE_float('per_process_gpu_memory_fraction', 0.25,\n                   'Fraction of GPU memory to use')\n\nFLAGS = flags.FLAGS\n\n\ndef embeddings_with_init(vocab_size, embedding_dim, name):\n  \"\"\"Creates and initializes the embedding tensors.\"\"\"\n  return tf.get_variable(name=name,\n                         shape=[vocab_size, embedding_dim],\n                         initializer=tf.random_normal_initializer(\n                             stddev=math.sqrt(1.0 / embedding_dim)))\n\n\ndef count_matrix_input(filenames, submatrix_rows, submatrix_cols):\n  \"\"\"Reads submatrix shards from disk.\"\"\"\n  filename_queue = tf.train.string_input_producer(filenames)\n  reader = tf.WholeFileReader()\n  _, serialized_example = reader.read(filename_queue)\n  features = tf.parse_single_example(\n      serialized_example,\n      features={\n          'global_row': tf.FixedLenFeature([submatrix_rows], dtype=tf.int64),\n          'global_col': tf.FixedLenFeature([submatrix_cols], dtype=tf.int64),\n          'sparse_local_row': tf.VarLenFeature(dtype=tf.int64),\n          'sparse_local_col': tf.VarLenFeature(dtype=tf.int64),\n          'sparse_value': tf.VarLenFeature(dtype=tf.float32)\n      })\n\n  global_row = features['global_row']\n  global_col = features['global_col']\n\n  sparse_local_row = 
features['sparse_local_row'].values\n  sparse_local_col = features['sparse_local_col'].values\n  sparse_count = features['sparse_value'].values\n\n  sparse_indices = tf.concat(1, [tf.expand_dims(sparse_local_row, 1),\n                                 tf.expand_dims(sparse_local_col, 1)])\n  count = tf.sparse_to_dense(sparse_indices, [submatrix_rows, submatrix_cols],\n                             sparse_count)\n\n  queued_global_row, queued_global_col, queued_count = tf.train.batch(\n      [global_row, global_col, count],\n      batch_size=1,\n      num_threads=4,\n      capacity=32)\n\n  queued_global_row = tf.reshape(queued_global_row, [submatrix_rows])\n  queued_global_col = tf.reshape(queued_global_col, [submatrix_cols])\n  queued_count = tf.reshape(queued_count, [submatrix_rows, submatrix_cols])\n\n  return queued_global_row, queued_global_col, queued_count\n\n\ndef read_marginals_file(filename):\n  \"\"\"Reads text file with one number per line to an array.\"\"\"\n  with open(filename) as lines:\n    return [float(line) for line in lines]\n\n\ndef write_embedding_tensor_to_disk(vocab_path, output_path, sess, embedding):\n  \"\"\"Writes tensor to output_path as tsv\"\"\"\n  # Fetch the embedding values from the model\n  embeddings = sess.run(embedding)\n\n  with open(output_path, 'w') as out_f:\n    with open(vocab_path) as vocab_f:\n      for index, word in enumerate(vocab_f):\n        word = word.strip()\n        embedding = embeddings[index]\n        out_f.write(word + '\\t' + '\\t'.join([str(x) for x in embedding]) + '\\n')\n\n\ndef write_embeddings_to_disk(config, model, sess):\n  \"\"\"Writes row and column embeddings to disk\"\"\"\n  # Row Embedding\n  row_vocab_path = config.input_base_path + '/row_vocab.txt'\n  row_embedding_output_path = config.output_base_path + '/row_embedding.tsv'\n  print 'Writing row embeddings to:', row_embedding_output_path\n  sys.stdout.flush()\n  write_embedding_tensor_to_disk(row_vocab_path, row_embedding_output_path,\n         
                        sess, model.row_embedding)\n\n  # Column Embedding\n  col_vocab_path = config.input_base_path + '/col_vocab.txt'\n  col_embedding_output_path = config.output_base_path + '/col_embedding.tsv'\n  print 'Writing column embeddings to:', col_embedding_output_path\n  sys.stdout.flush()\n  write_embedding_tensor_to_disk(col_vocab_path, col_embedding_output_path,\n                                 sess, model.col_embedding)\n\n\nclass SwivelModel(object):\n  \"\"\"Small class to gather needed pieces from a Graph being built.\"\"\"\n\n  def __init__(self, config):\n    \"\"\"Construct graph for dmc.\"\"\"\n    self._config = config\n\n    # Create paths to input data files\n    print 'Reading model from:', config.input_base_path\n    sys.stdout.flush()\n    count_matrix_files = glob.glob(config.input_base_path + '/shard-*.pb')\n    row_sums_path = config.input_base_path + '/row_sums.txt'\n    col_sums_path = config.input_base_path + '/col_sums.txt'\n\n    # Read marginals\n    row_sums = read_marginals_file(row_sums_path)\n    col_sums = read_marginals_file(col_sums_path)\n\n    self.n_rows = len(row_sums)\n    self.n_cols = len(col_sums)\n    print 'Matrix dim: (%d,%d) SubMatrix dim: (%d,%d) ' % (\n        self.n_rows, self.n_cols, config.submatrix_rows, config.submatrix_cols)\n    sys.stdout.flush()\n    self.n_submatrices = (self.n_rows * self.n_cols /\n                          (config.submatrix_rows * config.submatrix_cols))\n    print 'n_submatrices: %d' % (self.n_submatrices)\n    sys.stdout.flush()\n\n    # ===== CREATE VARIABLES ======\n\n    with tf.device('/cpu:0'):\n      # embeddings\n      self.row_embedding = embeddings_with_init(\n          embedding_dim=config.embedding_size,\n          vocab_size=self.n_rows,\n          name='row_embedding')\n      self.col_embedding = embeddings_with_init(\n          embedding_dim=config.embedding_size,\n          vocab_size=self.n_cols,\n          name='col_embedding')\n      
tf.histogram_summary('row_emb', self.row_embedding)\n      tf.histogram_summary('col_emb', self.col_embedding)\n\n      matrix_log_sum = math.log(np.sum(row_sums) + 1)\n      row_bias_init = [math.log(x + 1) for x in row_sums]\n      col_bias_init = [math.log(x + 1) for x in col_sums]\n      self.row_bias = tf.Variable(row_bias_init,\n                                  trainable=config.trainable_bias)\n      self.col_bias = tf.Variable(col_bias_init,\n                                  trainable=config.trainable_bias)\n      tf.histogram_summary('row_bias', self.row_bias)\n      tf.histogram_summary('col_bias', self.col_bias)\n\n    # ===== CREATE GRAPH =====\n\n    # Get input\n    with tf.device('/cpu:0'):\n      global_row, global_col, count = count_matrix_input(\n          count_matrix_files, config.submatrix_rows, config.submatrix_cols)\n\n      # Fetch embeddings.\n      selected_row_embedding = tf.nn.embedding_lookup(self.row_embedding,\n                                                      global_row)\n      selected_col_embedding = tf.nn.embedding_lookup(self.col_embedding,\n                                                      global_col)\n\n      # Fetch biases.\n      selected_row_bias = tf.nn.embedding_lookup([self.row_bias], global_row)\n      selected_col_bias = tf.nn.embedding_lookup([self.col_bias], global_col)\n\n    # Multiply the row and column embeddings to generate predictions.\n    predictions = tf.matmul(\n        selected_row_embedding, selected_col_embedding, transpose_b=True)\n\n    # These binary masks separate zero from non-zero values.\n    count_is_nonzero = tf.to_float(tf.cast(count, tf.bool))\n    count_is_zero = 1 - tf.to_float(tf.cast(count, tf.bool))\n\n    objectives = count_is_nonzero * tf.log(count + 1e-30)\n    objectives -= tf.reshape(selected_row_bias, [config.submatrix_rows, 1])\n    objectives -= selected_col_bias\n    objectives += matrix_log_sum\n\n    err = predictions - objectives\n\n    # The confidence function scales 
the L2 loss based on the raw co-occurrence\n    # count.\n    l2_confidence = (config.confidence_base + config.confidence_scale * tf.pow(\n        count, config.confidence_exponent))\n\n    l2_loss = config.loss_multiplier * tf.reduce_sum(\n        0.5 * l2_confidence * err * err * count_is_nonzero)\n\n    sigmoid_loss = config.loss_multiplier * tf.reduce_sum(\n        tf.nn.softplus(err) * count_is_zero)\n\n    self.loss = l2_loss + sigmoid_loss\n\n    tf.scalar_summary(\"l2_loss\", l2_loss)\n    tf.scalar_summary(\"sigmoid_loss\", sigmoid_loss)\n    tf.scalar_summary(\"loss\", self.loss)\n\n    # Add optimizer.\n    self.global_step = tf.Variable(0, name='global_step')\n    opt = tf.train.AdagradOptimizer(config.learning_rate)\n    self.train_op = opt.minimize(self.loss, global_step=self.global_step)\n    self.saver = tf.train.Saver(sharded=True)\n\n\ndef main(_):\n  # Create the output path.  If this fails, it really ought to fail\n  # now. :)\n  if not os.path.isdir(FLAGS.output_base_path):\n    os.makedirs(FLAGS.output_base_path)\n\n  # Create and run model\n  with tf.Graph().as_default():\n    model = SwivelModel(FLAGS)\n\n    # Create a session for running Ops on the Graph.\n    gpu_options = tf.GPUOptions(\n        per_process_gpu_memory_fraction=FLAGS.per_process_gpu_memory_fraction)\n    sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))\n\n    # Run the Op to initialize the variables.\n    sess.run(tf.initialize_all_variables())\n\n    # Start feeding input\n    coord = tf.train.Coordinator()\n    threads = tf.train.start_queue_runners(sess=sess, coord=coord)\n\n    # Calculate how many steps each thread should run\n    n_total_steps = int(FLAGS.num_epochs * model.n_rows * model.n_cols) / (\n        FLAGS.submatrix_rows * FLAGS.submatrix_cols)\n    n_steps_per_thread = n_total_steps / FLAGS.num_concurrent_steps\n    n_submatrices_to_train = model.n_submatrices * FLAGS.num_epochs\n    t0 = [time.time()]\n\n    def TrainingFn():\n      for _ 
in range(n_steps_per_thread):\n        _, global_step = sess.run([model.train_op, model.global_step])\n        n_steps_between_status_updates = 100\n        if (global_step % n_steps_between_status_updates) == 0:\n          elapsed = float(time.time() - t0[0])\n          print '%d/%d submatrices trained (%.1f%%), %.1f submatrices/sec' % (\n              global_step, n_submatrices_to_train,\n              100.0 * global_step / n_submatrices_to_train,\n              n_steps_between_status_updates / elapsed)\n          sys.stdout.flush()\n          t0[0] = time.time()\n\n    # Start training threads\n    train_threads = []\n    for _ in range(FLAGS.num_concurrent_steps):\n      t = threading.Thread(target=TrainingFn)\n      train_threads.append(t)\n      t.start()\n\n    # Wait for threads to finish.\n    for t in train_threads:\n      t.join()\n\n    coord.request_stop()\n    coord.join(threads)\n\n    # Write out vectors\n    write_embeddings_to_disk(FLAGS, model, sess)\n\n    #Shutdown\n    sess.close()\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/swivel/text2bin.py",
    "content": "#!/usr/bin/env python\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Converts vectors from text to a binary format for quicker manipulation.\n\nUsage:\n\n  text2bin.py -o <out> -v <vocab> vec1.txt [vec2.txt ...]\n\nOptions:\n\n  -o <filename>, --output <filename>\n    The name of the file into which the binary vectors are written.\n\n  -v <filename>, --vocab <filename>\n    The name of the file into which the vocabulary is written.\n\nDescription\n\nThis program merges one or more whitespace separated vector files into a single\nbinary vector file that can be used by downstream evaluation tools in this\ndirectory (\"wordsim.py\" and \"analogy\").\n\nIf more than one vector file is specified, then the files must be aligned\nrow-wise (i.e., each line must correspond to the same embedding), and they must\nhave the same number of columns (i.e., be the same dimension).\n\n\"\"\"\n\nfrom itertools import izip\nfrom getopt import GetoptError, getopt\nimport os\nimport struct\nimport sys\n\ntry:\n  opts, args = getopt(\n      sys.argv[1:], 'o:v:', ['output=', 'vocab='])\nexcept GetoptError, e:\n  print >> sys.stderr, e\n  sys.exit(2)\n\nopt_output = 'vecs.bin'\nopt_vocab = 'vocab.txt'\nfor o, a in opts:\n  if o in ('-o', '--output'):\n    opt_output = a\n  if o in ('-v', '--vocab'):\n    opt_vocab = a\n\ndef go(fhs):\n  fmt = None\n  with open(opt_vocab, 'w') as vocab_out:\n    with open(opt_output, 'w') as vecs_out:\n      for lines in izip(*fhs):\n        parts = [line.split() for line in lines]\n        token = parts[0][0]\n        if any(part[0] != token for part in parts[1:]):\n          raise IOError('vector files must be aligned')\n\n        print >> vocab_out, token\n\n        vec = [sum(float(x) for x in xs) for xs in zip(*parts)[1:]]\n        if not fmt:\n          fmt = struct.Struct('%df' % len(vec))\n\n        vecs_out.write(fmt.pack(*vec))\n\nif args:\n  fhs = [open(filename) for filename in args]\n  go(fhs)\n  for fh in fhs:\n    fh.close()\nelse:\n  go([sys.stdin])\n"
  },
  {
    "path": "model_zoo/models/swivel/vecs.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport mmap\nimport numpy as np\nimport os\nimport struct\n\nclass Vecs(object):\n  def __init__(self, vocab_filename, rows_filename, cols_filename=None):\n    \"\"\"Initializes the vectors from a text vocabulary and binary data.\"\"\"\n    with open(vocab_filename, 'r') as lines:\n      self.vocab = [line.split()[0] for line in lines]\n      self.word_to_idx = {word: idx for idx, word in enumerate(self.vocab)}\n\n    n = len(self.vocab)\n\n    with open(rows_filename, 'r') as rows_fh:\n      rows_fh.seek(0, os.SEEK_END)\n      size = rows_fh.tell()\n\n      # Make sure that the file size seems reasonable.\n      if size % (4 * n) != 0:\n        raise IOError(\n            'unexpected file size for binary vector file %s' % rows_filename)\n\n      # Memory map the rows.\n      dim = size / (4 * n)\n      rows_mm = mmap.mmap(rows_fh.fileno(), 0, prot=mmap.PROT_READ)\n      rows = np.matrix(\n          np.frombuffer(rows_mm, dtype=np.float32).reshape(n, dim))\n\n      # If column vectors were specified, then open them and add them to the row\n      # vectors.\n      if cols_filename:\n        with open(cols_filename, 'r') as cols_fh:\n          cols_mm = mmap.mmap(cols_fh.fileno(), 0, prot=mmap.PROT_READ)\n          cols_fh.seek(0, os.SEEK_END)\n          if cols_fh.tell() != size:\n            raise IOError('row and column vector files have 
different sizes')\n\n          cols = np.matrix(\n              np.frombuffer(cols_mm, dtype=np.float32).reshape(n, dim))\n\n          rows += cols\n          cols_mm.close()\n\n      # Normalize so that dot products are just cosine similarity.\n      self.vecs = rows / np.linalg.norm(rows, axis=1).reshape(n, 1)\n      rows_mm.close()\n\n  def similarity(self, word1, word2):\n    \"\"\"Computes the similarity of two tokens.\"\"\"\n    idx1 = self.word_to_idx.get(word1)\n    idx2 = self.word_to_idx.get(word2)\n    # Compare against None explicitly: index 0 is falsy but valid.\n    if idx1 is None or idx2 is None:\n      return None\n\n    return float(self.vecs[idx1] * self.vecs[idx2].transpose())\n\n  def neighbors(self, query):\n    \"\"\"Returns the nearest neighbors to the query (a word or vector).\"\"\"\n    if isinstance(query, basestring):\n      idx = self.word_to_idx.get(query)\n      if idx is None:\n        return None\n\n      query = self.vecs[idx]\n\n    neighbors = self.vecs * query.transpose()\n\n    return sorted(\n      zip(self.vocab, neighbors.flat),\n      key=lambda kv: kv[1], reverse=True)\n\n  def lookup(self, word):\n    \"\"\"Returns the embedding for a token, or None if no embedding exists.\"\"\"\n    idx = self.word_to_idx.get(word)\n    return None if idx is None else self.vecs[idx]\n"
  },
  {
    "path": "model_zoo/models/swivel/wordsim.py",
    "content": "#!/usr/bin/env python\n#\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Computes Spearman's rho with respect to human judgements.\n\nGiven a set of row (and potentially column) embeddings, this computes Spearman's\nrho between the rank ordering of predicted word similarity and human judgements.\n\nUsage:\n\n  wordsim.py --embeddings=<binvecs> --vocab=<vocab> eval1.tab eval2.tab ...\n\nOptions:\n\n  --embeddings=<filename>: the vectors to test\n  --vocab=<filename>: the vocabulary file\n\nEvaluation files are assumed to be tab-separated files with exactly three\ncolumns.  
The first two columns contain the words, and the third column contains\nthe scored human judgement.\n\n\"\"\"\n\nimport scipy.stats\nimport sys\nfrom getopt import GetoptError, getopt\n\nfrom vecs import Vecs\n\ntry:\n  opts, args = getopt(sys.argv[1:], '', ['embeddings=', 'vocab='])\nexcept GetoptError, e:\n  print >> sys.stderr, e\n  sys.exit(2)\n\nopt_embeddings = None\nopt_vocab = None\n\nfor o, a in opts:\n  if o == '--embeddings':\n    opt_embeddings = a\n  if o == '--vocab':\n    opt_vocab = a\n\nif not opt_vocab:\n  print >> sys.stderr, 'please specify a vocabulary file with \"--vocab\"'\n  sys.exit(2)\n\nif not opt_embeddings:\n  print >> sys.stderr, 'please specify the embeddings with \"--embeddings\"'\n  sys.exit(2)\n\ntry:\n  vecs = Vecs(opt_vocab, opt_embeddings)\nexcept IOError, e:\n  print >> sys.stderr, e\n  sys.exit(1)\n\ndef evaluate(lines):\n  \"\"\"Scores one evaluation file, given its already-open file handle.\"\"\"\n  acts, preds = [], []\n\n  for line in lines:\n    w1, w2, act = line.strip().split('\\t')\n    pred = vecs.similarity(w1, w2)\n    if pred is None:\n      continue\n\n    acts.append(float(act))\n    preds.append(pred)\n\n  rho, _ = scipy.stats.spearmanr(acts, preds)\n  return rho\n\nfor filename in args:\n  with open(filename, 'r') as lines:\n    print '%0.3f %s' % (evaluate(lines), filename)\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/.gitignore",
    "content": "/bazel-bin\n/bazel-genfiles\n/bazel-out\n/bazel-tensorflow\n/bazel-testlogs\n/bazel-tf\n/bazel-syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/Dockerfile",
    "content": "FROM java:8\n\nENV SYNTAXNETDIR=/opt/tensorflow PATH=$PATH:/root/bin\n\nRUN mkdir -p $SYNTAXNETDIR \\\n    && cd $SYNTAXNETDIR \\\n    && apt-get update \\\n    && apt-get install git zlib1g-dev file swig python2.7 python-dev python-pip python-mock -y \\\n    && pip install --upgrade pip \\\n    && pip install -U protobuf==3.0.0 \\\n    && pip install asciitree \\\n    && pip install numpy \\\n    && wget https://github.com/bazelbuild/bazel/releases/download/0.3.1/bazel-0.3.1-installer-linux-x86_64.sh \\\n    && chmod +x bazel-0.3.1-installer-linux-x86_64.sh \\\n    && ./bazel-0.3.1-installer-linux-x86_64.sh --user \\\n    && git clone --recursive https://github.com/tensorflow/models.git \\\n    && cd $SYNTAXNETDIR/models/syntaxnet/tensorflow \\\n    && echo \"\\n\\n\\n\\n\" | ./configure \\\n    && apt-get autoremove -y \\\n    && apt-get clean\n\nRUN cd $SYNTAXNETDIR/models/syntaxnet \\\n    && bazel test --genrule_strategy=standalone syntaxnet/... util/utf8/...\n\nWORKDIR $SYNTAXNETDIR/models/syntaxnet\n\nCMD [ \"sh\", \"-c\", \"echo 'Bob brought the pizza to Alice.' | syntaxnet/demo.sh\" ]\n\n# COMMANDS to build and run\n# ===============================\n# mkdir build && cp Dockerfile build/ && cd build\n# docker build -t syntaxnet .\n# docker run syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/README.md",
    "content": "# SyntaxNet: Neural Models of Syntax.\n\n*A TensorFlow implementation of the models described in [Andor et al. (2016)]\n(http://arxiv.org/abs/1603.06042).*\n\n**Update**: Parsey models are now [available](universal.md) for 40 languages\ntrained on Universal Dependencies datasets, with support for text segmentation\nand morphological analysis.\n\nAt Google, we spend a lot of time thinking about how computer systems can read\nand understand human language in order to process it in intelligent ways. We are\nexcited to share the fruits of our research with the broader community by\nreleasing SyntaxNet, an open-source neural network framework for [TensorFlow]\n(http://www.tensorflow.org) that provides a foundation for Natural Language\nUnderstanding (NLU) systems. Our release includes all the code needed to train\nnew SyntaxNet models on your own data, as well as *Parsey McParseface*, an\nEnglish parser that we have trained for you, and that you can use to analyze\nEnglish text.\n\nSo, how accurate is Parsey McParseface? For this release, we tried to balance a\nmodel that runs fast enough to be useful on a single machine (e.g. ~600\nwords/second on a modern desktop) and that is also the most accurate parser\navailable. Here's how Parsey McParseface compares to the academic literature on\nseveral different English domains: (all numbers are % correct head assignments\nin the tree, or unlabelled attachment score)\n\nModel                                                                                                           | News  | Web   | Questions\n--------------------------------------------------------------------------------------------------------------- | :---: | :---: | :-------:\n[Martins et al. 
(2013)](http://www.cs.cmu.edu/~ark/TurboParser/)                                                | 93.10 | 88.23 | 94.21\n[Zhang and McDonald (2014)](http://research.google.com/pubs/archive/38148.pdf)                                  | 93.32 | 88.65 | 93.37\n[Weiss et al. (2015)](http://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43800.pdf) | 93.91 | 89.29 | 94.17\n[Andor et al. (2016)](http://arxiv.org/abs/1603.06042)*                                                         | 94.44 | 90.17 | 95.40\nParsey McParseface                                                                                              | 94.15 | 89.08 | 94.77\n\nWe see that Parsey McParseface is state-of-the-art; more importantly, with\nSyntaxNet you can train larger networks with more hidden units and bigger beam\nsizes if you want to push the accuracy even further: [Andor et al. (2016)]\n(http://arxiv.org/abs/1603.06042)* is simply a SyntaxNet model with a\nlarger beam and network. For further information on the datasets, see that paper\nunder the section \"Treebank Union\".\n\nParsey McParseface is also state-of-the-art for part-of-speech (POS) tagging\n(numbers below are per-token accuracy):\n\nModel                                                                      | News  | Web   | Questions\n-------------------------------------------------------------------------- | :---: | :---: | :-------:\n[Ling et al. (2015)](http://www.cs.cmu.edu/~lingwang/papers/emnlp2015.pdf) | 97.44 | 94.03 | 96.18\n[Andor et al. (2016)](http://arxiv.org/abs/1603.06042)*                    | 97.77 | 94.80 | 96.86\nParsey McParseface                                                         | 97.52 | 94.24 | 96.45\n\nThe first part of this tutorial describes how to install the necessary tools and\nuse the already trained models provided in this release. 
In the second part of\nthe tutorial we provide more background about the models, as well as\ninstructions for training models on other datasets.\n\n## Contents\n* [Installation](#installation)\n* [Getting Started](#getting-started)\n    * [Parsing from Standard Input](#parsing-from-standard-input)\n    * [Annotating a Corpus](#annotating-a-corpus)\n    * [Configuring the Python Scripts](#configuring-the-python-scripts)\n    * [Next Steps](#next-steps)\n* [Detailed Tutorial: Building an NLP Pipeline with SyntaxNet](#detailed-tutorial-building-an-nlp-pipeline-with-syntaxnet)\n    * [Obtaining Data](#obtaining-data)\n    * [Part-of-Speech Tagging](#part-of-speech-tagging)\n    * [Training the SyntaxNet POS Tagger](#training-the-syntaxnet-pos-tagger)\n    * [Preprocessing with the Tagger](#preprocessing-with-the-tagger)\n    * [Dependency Parsing: Transition-Based Parsing](#dependency-parsing-transition-based-parsing)\n    * [Training a Parser Step 1: Local Pretraining](#training-a-parser-step-1-local-pretraining)\n    * [Training a Parser Step 2: Global Training](#training-a-parser-step-2-global-training)\n* [Contact](#contact)\n* [Credits](#credits)\n\n## Installation\n\nRunning and training SyntaxNet models requires building this package from\nsource. 
You'll need to install:\n\n*   python 2.7:\n    * python 3 support is not available yet\n*   pip (python package manager)\n    * `apt-get install python-pip` on Ubuntu\n    * `brew` installs pip along with python on OSX\n*   bazel:\n    *   **versions 0.3.0 - 0.3.1**\n    *   follow the instructions [here](http://bazel.io/docs/install.html)\n    *   Alternatively, download bazel <.deb> from\n        [https://github.com/bazelbuild/bazel/releases]\n        (https://github.com/bazelbuild/bazel/releases) for your system\n        configuration.\n    *   Install it using the command: `sudo dpkg -i <.deb file>`\n    *   Check the bazel version by typing: `bazel version`\n*   swig:\n    *   `apt-get install swig` on Ubuntu\n    *   `brew install swig` on OSX\n*   protocol buffers, with a version supported by TensorFlow:\n    *   check your protobuf version with `pip freeze | grep protobuf`\n    *   upgrade to a supported version with `pip install -U protobuf==3.0.0b2`\n*   asciitree, to draw parse trees on the console for the demo:\n    *   `pip install asciitree`\n*   numpy, package for scientific computing:\n    *   `pip install numpy`\n*   mock, package for unit testing:\n    *   `pip install mock`\n\nOnce you have completed the above steps, you can build and test SyntaxNet with\nthe following commands:\n\n```shell\n  git clone --recursive https://github.com/tensorflow/models.git\n  cd models/syntaxnet/tensorflow\n  ./configure\n  cd ..\n  bazel test syntaxnet/... util/utf8/...\n  # On Mac, run the following:\n  bazel test --linkopt=-headerpad_max_install_names \\\n    syntaxnet/... 
util/utf8/...\n```\n\nBazel should complete reporting all tests passed.\n\nYou can also compile SyntaxNet in a [Docker](https://www.docker.com/what-docker)\ncontainer using this [Dockerfile](Dockerfile).\n\nTo build SyntaxNet with GPU support please refer to the instructions in\n[issues/248](https://github.com/tensorflow/models/issues/248).\n\n**Note:** If you are running Docker on OSX, make sure that you have enough\nmemory allocated for your Docker VM.\n\n## Getting Started\n\nOnce you have successfully built SyntaxNet, you can start parsing text right\naway with Parsey McParseface, located under `syntaxnet/models`. The easiest\nthing is to use or modify the included script `syntaxnet/demo.sh`, which shows a\nbasic setup to parse English taking plain text as input.\n\n### Parsing from Standard Input\n\nSimply pass one sentence per line of text into the script at\n`syntaxnet/demo.sh`. The script will break the text into words, run the POS\ntagger, run the parser, and then generate an ASCII version of the parse tree:\n\n```shell\necho 'Bob brought the pizza to Alice.' | syntaxnet/demo.sh\n\nInput: Bob brought the pizza to Alice .\nParse:\nbrought VBD ROOT\n +-- Bob NNP nsubj\n +-- pizza NN dobj\n |   +-- the DT det\n +-- to IN prep\n |   +-- Alice NNP pobj\n +-- . . punct\n```\n\nThe ASCII tree shows the text organized as in the parse, not left-to-right as\nvisualized in our tutorial graphs. In this example, we see that the verb\n\"brought\" is the root of the sentence, with the subject \"Bob\", the object\n\"pizza\", and the prepositional phrase \"to Alice\".\n\nIf you want to feed in tokenized, CONLL-formatted text, you can run `demo.sh\n--conll`.\n\n### Annotating a Corpus\n\nTo change the pipeline to read and write to specific files (as opposed to piping\nthrough stdin and stdout), we have to modify the `demo.sh` to point to the files\nwe want. 
The SyntaxNet models are configured via a combination of run-time flags\n(which are easy to change) and a text-format `TaskSpec` protocol buffer. The\nspec file used in the demo is in\n`syntaxnet/models/parsey_mcparseface/context.pbtxt`.\n\nTo use corpora instead of stdin/stdout, we have to:\n\n1.  Create or modify an `input` field inside the `TaskSpec`, with the\n    `file_pattern` specifying the location we want. If the input corpus is in\n    CONLL format, make sure to put `record_format: 'conll-sentence'`.\n1.  Change the `--input` and/or `--output` flag to use the name of the resource\n    as the output, instead of `stdin` and `stdout`.\n\nE.g., if we wanted to POS tag the CONLL corpus `./wsj.conll`, we would create\ntwo entries, one for the input and one for the output:\n\n```proto\ninput {\n  name: 'wsj-data'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: './wsj.conll'\n  }\n}\ninput {\n  name: 'wsj-data-tagged'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: './wsj-tagged.conll'\n  }\n}\n```\n\nThen we can use `--input=wsj-data --output=wsj-data-tagged` on the command line\nto specify reading and writing to these files.\n\n### Configuring the Python Scripts\n\nAs mentioned above, the Python scripts are configured in two ways:\n\n1.  **Run-time flags** are used to point to the `TaskSpec` file, switch between\n    inputs for reading and writing, and set various run-time model parameters.\n    At training time, these flags are used to set the learning rate, hidden\n    layer sizes, and other key parameters.\n1.  The **`TaskSpec` proto** stores configuration about the transition system,\n    the features, and a set of named static resources required by the parser. It\n    is specified via the `--task_context` flag. A few key notes to remember:\n\n    -   The `Parameter` settings in the `TaskSpec` have a prefix: either\n        `brain_pos` (they apply to the tagger) or `brain_parser` (they apply to\n        the parser). 
The `--prefix` run-time flag switches between reading from\n        the two configurations.\n    -   The resources will be created and/or modified during multiple stages of\n        training. As described above, the resources can also be used at\n        evaluation time to read or write to specific files. These resources are\n        also separate from the model parameters, which are saved separately via\n        calls to TensorFlow ops, and loaded via the `--model_path` flag.\n    -   Because the `TaskSpec` contains file paths, remember that copying around\n        this file is not enough to relocate a trained model: you need to move\n        and update all the paths as well.\n\nNote that some run-time flags need to be consistent between training and testing\n(e.g. the number of hidden units).\n\n### Next Steps\n\nThere are many ways to extend this framework, e.g. adding new features, changing\nthe model structure, training on other languages, etc. We suggest reading the\ndetailed tutorial below to get a handle on the rest of the framework.\n\n## Detailed Tutorial: Building an NLP Pipeline with SyntaxNet\n\nIn this tutorial, we'll go over how to train new models, and explain in a bit\nmore technical detail the NLP side of the models. Our goal here is to explain\nthe NLP pipeline produced by this package.\n\n### Obtaining Data\n\nThe included English parser, Parsey McParseface, was trained on the standard\ncorpora of the [Penn Treebank](https://catalog.ldc.upenn.edu/LDC99T42) and\n[OntoNotes](https://catalog.ldc.upenn.edu/LDC2013T19), as well as the [English\nWeb Treebank](https://catalog.ldc.upenn.edu/LDC2012T13), but these are\nunfortunately not freely available.\n\nHowever, the [Universal Dependencies](http://universaldependencies.org/) project\nprovides freely available treebank data in a number of languages. 
SyntaxNet can\nbe trained and evaluated on any of these corpora.\n\n### Part-of-Speech Tagging\n\nConsider the following sentence, which exhibits several ambiguities that affect\nits interpretation:\n\n> I saw the man with glasses.\n\nThis sentence is composed of words: strings of characters that are segmented\ninto groups (e.g. \"I\", \"saw\", etc.) Each word in the sentence has a *grammatical\nfunction* that can be useful for understanding the meaning of language. For\nexample, \"saw\" in this example is a past tense of the verb \"to see\". But any\ngiven word might have different meanings in different contexts: \"saw\" could just\nas well be a noun (e.g., a saw used for cutting) or a present tense verb (using\na saw to cut something).\n\nA logical first step in understanding language is figuring out these roles for\neach word in the sentence. This process is called *Part-of-Speech (POS)\nTagging*. The roles are called POS tags. Although a given word might have\nmultiple possible tags depending on the context, given any one interpretation of\na sentence each word will generally only have one tag.\n\nOne interesting challenge of POS tagging is that the problem of defining a\nvocabulary of POS tags for a given language is quite involved. While the concept\nof nouns and verbs is pretty common, it has been traditionally difficult to\nagree on a standard set of roles across all languages. The [Universal\nDependencies](http://www.universaldependencies.org) project aims to solve this\nproblem.\n\n### Training the SyntaxNet POS Tagger\n\nIn general, determining the correct POS tag requires understanding the entire\nsentence and the context in which it is uttered. In practice, we can do very\nwell just by considering a small window of words around the word of interest.\nFor example, words that follow the word ‘the’ tend to be adjectives or nouns,\nrather than verbs.\n\nTo predict POS tags, we use a simple setup. We process the sentences\nleft-to-right. 
For any given word, we extract features of that word and a window\naround it, and use these as inputs to a feed-forward neural network classifier,\nwhich predicts a probability distribution over POS tags. Because we make\ndecisions in left-to-right order, we also use prior decisions as features in\nsubsequent ones (e.g. \"the previous predicted tag was a noun.\").\n\nAll the models in this package use a flexible markup language to define\nfeatures. For example, the features in the POS tagger are found in the\n`brain_pos_features` parameter in the `TaskSpec`, and look like this (modulo\nspacing):\n\n```\nstack(3).word stack(2).word stack(1).word stack.word input.word input(1).word input(2).word input(3).word;\ninput.digit input.hyphen;\nstack.suffix(length=2) input.suffix(length=2) input(1).suffix(length=2);\nstack.prefix(length=2) input.prefix(length=2) input(1).prefix(length=2)\n```\n\nNote that `stack` here means \"words we have already tagged.\" Thus, this feature\nspec uses three types of features: words, suffixes, and prefixes. The features\nare grouped into blocks that share an embedding matrix, concatenated together,\nand fed into a chain of hidden layers. This structure is based upon the model\nproposed by [Chen and Manning (2014)]\n(http://cs.stanford.edu/people/danqi/papers/emnlp2014.pdf).\n\nWe show this layout in the schematic below: the state of the system (a stack and\na buffer, visualized below for both the POS and the dependency parsing task) is\nused to extract sparse features, which are fed into the network in groups. We\nshow only a small subset of the features to simplify the presentation in the\nschematic:\n\n![Schematic](ff_nn_schematic.png \"Feed-forward Network Structure\")\n\nIn the configuration above, each block gets its own embedding matrix and the\nblocks in the configuration above are delineated with a semi-colon. The\ndimensions of each block are controlled in the `brain_pos_embedding_dims`\nparameter. 
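The grouping and concatenation described above can be sketched in a few lines of plain Python (the sizes, vocabularies, and function names below are illustrative assumptions, not SyntaxNet code):

```python
import random

# Toy sketch of the grouped-feature input layer: each feature block owns
# its own embedding matrix; lookups are concatenated within a block, and
# the blocks are then concatenated into one dense input vector.
random.seed(0)

WORD_DIM, SUFFIX_DIM = 8, 4  # assumed toy embedding sizes
word_embeddings = [[random.random() for _ in range(WORD_DIM)]
                   for _ in range(50)]
suffix_embeddings = [[random.random() for _ in range(SUFFIX_DIM)]
                     for _ in range(50)]

def block_lookup(matrix, feature_ids):
    # Concatenate the embedding of every sparse feature in this block.
    out = []
    for i in feature_ids:
        out.extend(matrix[i])
    return out

def network_input(word_ids, suffix_ids):
    # Concatenating (rather than pooling) keeps positions distinct, so
    # input.word and input(1).word occupy different slots; each added
    # feature also widens the first hidden layer's weight matrix.
    return (block_lookup(word_embeddings, word_ids) +
            block_lookup(suffix_embeddings, suffix_ids))

x = network_input(word_ids=[3, 7, 9], suffix_ids=[3, 7])
print(len(x))  # 3 words * 8 dims + 2 suffixes * 4 dims = 32
```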
**Important note:** unlike many simple NLP models, this is *not* a\nbag of words model. Remember that although certain features share embedding\nmatrices, the above features will be concatenated, so the interpretation of\n`input.word` will be quite different from `input(1).word`. This also means that\nadding features increases the dimension of the `concat` layer of the model as\nwell as the number of parameters for the first hidden layer.\n\nTo train the model, first edit `syntaxnet/context.pbtxt` so that the inputs\n`training-corpus`, `tuning-corpus`, and `dev-corpus` point to the location of\nyour training data. You can then train a part-of-speech tagger with:\n\n```shell\nbazel-bin/syntaxnet/parser_trainer \\\n  --task_context=syntaxnet/context.pbtxt \\\n  --arg_prefix=brain_pos \\  # read from POS configuration\n  --compute_lexicon \\       # required for first stage of pipeline\n  --graph_builder=greedy \\  # no beam search\n  --training_corpus=training-corpus \\  # names of training/tuning set\n  --tuning_corpus=tuning-corpus \\\n  --output_path=models \\  # where to save new resources\n  --batch_size=32 \\       # Hyper-parameters\n  --decay_steps=3600 \\\n  --hidden_layer_sizes=128 \\\n  --learning_rate=0.08 \\\n  --momentum=0.9 \\\n  --seed=0 \\\n  --params=128-0.08-3600-0.9-0  # name for these parameters\n```\n\nThis will read in the data, construct a lexicon, build a tensorflow graph for\nthe model with the specific hyperparameters, and train the model. Every so often\nthe model will be evaluated on the tuning set, and only the checkpoint with the\nhighest accuracy on this set will be saved. **Note that you should never use a\ncorpus you intend to test your model on as your tuning set, as you will inflate\nyour test set results.**\n\nFor best results, you should repeat this command with at least 3 different\nseeds, and possibly with a few different values for `--learning_rate` and\n`--decay_steps`. 
Good values for `--learning_rate` are usually close to 0.1, and\nyou usually want `--decay_steps` to correspond to about one tenth of your\ncorpus. The `--params` flag is only a human-readable identifier for the model\nbeing trained, used to construct the full output path, so that you don't need to\nworry about clobbering old models by accident.\n\nThe `--arg_prefix` flag controls which parameters should be read from the task\ncontext file `context.pbtxt`. In this case `arg_prefix` is set to `brain_pos`,\nso the parameters being used in this training run are\n`brain_pos_transition_system`, `brain_pos_embedding_dims`, `brain_pos_features`,\nand `brain_pos_embedding_names`. To train the dependency parser later,\n`arg_prefix` will be set to `brain_parser`.\n\n### Preprocessing with the Tagger\n\nNow that we have a trained POS tagging model, we want to use the output of this\nmodel as features in the parser. Thus the next step is to run the trained model\nover our training, tuning, and dev (evaluation) sets. We can use the\n`parser_eval.py` script for this.\n\nFor example, the model `128-0.08-3600-0.9-0` trained above can be run over the\ntraining, tuning, and dev sets with the following command:\n\n```shell\nPARAMS=128-0.08-3600-0.9-0\nfor SET in training tuning dev; do\n  bazel-bin/syntaxnet/parser_eval \\\n    --task_context=models/brain_pos/greedy/$PARAMS/context \\\n    --hidden_layer_sizes=128 \\\n    --input=$SET-corpus \\\n    --output=tagged-$SET-corpus \\\n    --arg_prefix=brain_pos \\\n    --graph_builder=greedy \\\n    --model_path=models/brain_pos/greedy/$PARAMS/model\ndone\n```\n\n**Important note:** This command only works because we have created entries for\nyou in `context.pbtxt` that correspond to `tagged-training-corpus`,\n`tagged-dev-corpus`, and `tagged-tuning-corpus`. From these default settings,\nthe above will write tagged versions of the training, tuning, and dev set to the\ndirectory `models/brain_pos/greedy/$PARAMS/`. 
This location is chosen because\nthe `input` entries do not have `file_pattern` set: instead, they have `creator:\nbrain_pos/greedy`, which means that `parser_trainer.py` will construct *new*\nfiles when called with `--arg_prefix=brain_pos --graph_builder=greedy` using the\n`--model_path` flag to determine the location.\n\nFor convenience, `parser_eval.py` also logs POS tagging accuracy after the\noutput tagged datasets have been written.\n\n### Dependency Parsing: Transition-Based Parsing\n\nNow that we have a prediction for the grammatical role of the words, we want to\nunderstand how the words in the sentence relate to each other. This parser is\nbuilt around the *head-modifier* construction: for each word, we choose a\n*syntactic head* that it modifies according to some grammatical role.\n\nAn example for the above sentence is as follows:\n\n![Figure](sawman.png)\n\nBelow each word in the sentence we see both a fine-grained part-of-speech\n(*PRP*, *VBD*, *DT*, *NN* etc.), and a coarse-grained part-of-speech (*PRON*,\n*VERB*, *DET*, *NOUN*, etc.). Coarse-grained POS tags encode basic grammatical\ncategories, while the fine-grained POS tags make further distinctions: for\nexample *NN* is a singular noun (as opposed, for example, to *NNS*, which is a\nplural noun), and *VBD* is a past-tense verb. For more discussion see [Petrov et\nal. (2012)](http://www.lrec-conf.org/proceedings/lrec2012/pdf/274_Paper.pdf).\n\nCrucially, we also see directed arcs signifying grammatical relationships\nbetween different words in the sentence. For example *I* is the subject of\n*saw*, as signified by the directed arc labeled *nsubj* between these words;\n*man* is the direct object (dobj) of *saw*; the preposition *with* modifies\n*man* with a prep relation, signifying modification by a prepositional phrase;\nand so on. 
In addition, the verb *saw* is identified as the *root* of the entire\nsentence.\n\nWhenever we have a directed arc between two words, we refer to the word at the\nstart of the arc as the *head*, and the word at the end of the arc as the\n*modifier*. For example we have one arc where the head is *saw* and the modifier\nis *I*, another where the head is *saw* and the modifier is *man*, and so on.\n\nThe grammatical relationships encoded in dependency structures are directly\nrelated to the underlying meaning of the sentence in question. They allow us to\neasily recover the answers to various questions, for example *whom did I see?*,\n*who saw the man with glasses?*, and so on.\n\nSyntaxNet is a **transition-based** dependency parser [Nivre (2007)]\n(http://www.mitpressjournals.org/doi/pdfplus/10.1162/coli.07-056-R1-07-027) that\nconstructs a parse incrementally. Like the tagger, it processes words\nleft-to-right. The words all start as unprocessed input, called the *buffer*. As\nwords are encountered they are put onto a *stack*. At each step, the parser can\ndo one of three things:\n\n1.  **SHIFT:** Push another word onto the top of the stack, i.e. shifting one\n    token from the buffer to the stack.\n1.  **LEFT_ARC:** Pop the top two words from the stack. Attach the second to the\n    first, creating an arc pointing to the **left**. Push the **first** word\n    back on the stack.\n1.  **RIGHT_ARC:** Pop the top two words from the stack. Attach the first to the\n    second, creating an arc pointing to the **right**. Push the **second** word\n    back on the stack.\n\nAt each step, we call the combination of the stack and the buffer the\n*configuration* of the parser. For the left and right actions, we also assign a\ndependency relation label to that arc. 
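The three actions can be sketched as a tiny arc-standard simulator in Python (an illustrative toy under the conventions above, recording each arc as a `(head, modifier)` pair; this is not SyntaxNet's actual implementation, which is in C++):

```python
# Minimal arc-standard transition simulator: the configuration is a
# stack plus a buffer, and each LEFT_ARC/RIGHT_ARC produces one arc.
def parse(words, actions):
    buffer = list(words)  # unprocessed input, left to right
    stack, arcs = [], []
    for action in actions:
        if action == 'SHIFT':
            stack.append(buffer.pop(0))
        elif action == 'LEFT_ARC':
            top, second = stack.pop(), stack.pop()
            arcs.append((top, second))    # arc points left: top is head
            stack.append(top)             # push the first word back
        elif action == 'RIGHT_ARC':
            top, second = stack.pop(), stack.pop()
            arcs.append((second, top))    # arc points right: second is head
            stack.append(second)          # push the second word back
    return arcs

# "I saw pizza": SHIFT I, SHIFT saw, attach I to saw (left arc),
# SHIFT pizza, attach pizza to saw (right arc).
arcs = parse(['I', 'saw', 'pizza'],
             ['SHIFT', 'SHIFT', 'LEFT_ARC', 'SHIFT', 'RIGHT_ARC'])
print(arcs)  # [('saw', 'I'), ('saw', 'pizza')]
```

A derivation is exactly such an action sequence; training amounts to learning a classifier from configurations to the next action.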
This process is visualized in the\nfollowing animation for a short sentence:\n\n![Animation](looping-parser.gif \"Parsing in Action\")\n\nNote that this parser is following a sequence of actions, called a\n**derivation**, to produce a \"gold\" tree labeled by a linguist. We can use this\nsequence of decisions to learn a classifier that takes a configuration and\npredicts the next action to take.\n\n### Training a Parser Step 1: Local Pretraining\n\nAs described in our [paper](http://arxiv.org/abs/1603.06042), the first\nstep in training the model is to *pre-train* using *local* decisions. In this\nphase, we use the gold dependency to guide the parser, and train a softmax layer\nto predict the correct action given these gold dependencies. This can be\nperformed very efficiently, since the parser's decisions are all independent in\nthis setting.\n\nOnce the tagged datasets are available, a locally normalized dependency parsing\nmodel can be trained with the following command:\n\n```shell\nbazel-bin/syntaxnet/parser_trainer \\\n  --arg_prefix=brain_parser \\\n  --batch_size=32 \\\n  --projectivize_training_set \\\n  --decay_steps=4400 \\\n  --graph_builder=greedy \\\n  --hidden_layer_sizes=200,200 \\\n  --learning_rate=0.08 \\\n  --momentum=0.85 \\\n  --output_path=models \\\n  --task_context=models/brain_pos/greedy/$PARAMS/context \\\n  --seed=4 \\\n  --training_corpus=tagged-training-corpus \\\n  --tuning_corpus=tagged-tuning-corpus \\\n  --params=200x200-0.08-4400-0.85-4\n```\n\nNote that we point the trainer to the context corresponding to the POS tagger\nthat we picked previously. This allows the parser to reuse the lexicons and the\ntagged datasets that were created in the previous steps. Processing data can be\ndone similarly to how tagging was done above. 
For example if in this case we\npicked parameters `200x200-0.08-4400-0.85-4`, the training, tuning and dev sets\ncan be parsed with the following command:\n\n```shell\nPARAMS=200x200-0.08-4400-0.85-4\nfor SET in training tuning dev; do\n  bazel-bin/syntaxnet/parser_eval \\\n    --task_context=models/brain_parser/greedy/$PARAMS/context \\\n    --hidden_layer_sizes=200,200 \\\n    --input=tagged-$SET-corpus \\\n    --output=parsed-$SET-corpus \\\n    --arg_prefix=brain_parser \\\n    --graph_builder=greedy \\\n    --model_path=models/brain_parser/greedy/$PARAMS/model\ndone\n```\n\n### Training a Parser Step 2: Global Training\n\nAs we describe in the paper, there are several problems with the locally\nnormalized models we just trained. The most important is the *label-bias*\nproblem: the model doesn't learn what a good parse looks like, only what action\nto take given a history of gold decisions. This is because the scores are\nnormalized *locally* using a softmax for each decision.\n\nIn the paper, we show how we can achieve much better results using a *globally*\nnormalized model: in this model, the softmax scores are summed in log space, and\nthe scores are not normalized until we reach a final decision. When the parser\nstops, the scores of each hypothesis are normalized against a small set of\npossible parses (in the case of this model, a beam size of 8). When training, we\nforce the parser to stop during parsing when the gold derivation falls off the\nbeam (a strategy known as early-updates).\n\nWe give a simplified view of how this training works for a [garden path\nsentence](https://en.wikipedia.org/wiki/Garden_path_sentence), where it is\nimportant to maintain multiple hypotheses. A single mistake early on in parsing\nleads to a completely incorrect parse; after training, the model learns to\nprefer the second (correct) parse.\n\n![Beam search training](beam_search_training.png)\n\nParsey McParseface correctly parses this sentence. 
Even though the correct parse\nis initially ranked 4th out of multiple hypotheses, when the end of the garden\npath is reached, Parsey McParseface can recover due to the beam; using a larger\nbeam will get a more accurate model, but it will be slower (we used beam 32 for\nthe models in the paper).\n\nOnce you have the pre-trained locally normalized model, a globally normalized\nparsing model can now be trained with the following command:\n\n```shell\nbazel-bin/syntaxnet/parser_trainer \\\n  --arg_prefix=brain_parser \\\n  --batch_size=8 \\\n  --decay_steps=100 \\\n  --graph_builder=structured \\\n  --hidden_layer_sizes=200,200 \\\n  --learning_rate=0.02 \\\n  --momentum=0.9 \\\n  --output_path=models \\\n  --task_context=models/brain_parser/greedy/$PARAMS/context \\\n  --seed=0 \\\n  --training_corpus=projectivized-training-corpus \\\n  --tuning_corpus=tagged-tuning-corpus \\\n  --params=200x200-0.02-100-0.9-0 \\\n  --pretrained_params=models/brain_parser/greedy/$PARAMS/model \\\n  --pretrained_params_names=\\\nembedding_matrix_0,embedding_matrix_1,embedding_matrix_2,\\\nbias_0,weights_0,bias_1,weights_1\n```\n\nTraining a beam model with the structured builder will take a lot longer than\nthe greedy training runs above, perhaps 3 or 4 times longer. Note once again\nthat multiple restarts of training will yield the most reliable results.\nEvaluation can again be done with `parser_eval.py`. 
In this case we use\nparameters `200x200-0.02-100-0.9-0` to evaluate on the training, tuning and dev\nsets with the following command:\n\n```shell\nPARAMS=200x200-0.02-100-0.9-0\nfor SET in training tuning dev; do\n  bazel-bin/syntaxnet/parser_eval \\\n    --task_context=models/brain_parser/structured/$PARAMS/context \\\n    --hidden_layer_sizes=200,200 \\\n    --input=tagged-$SET-corpus \\\n    --output=beam-parsed-$SET-corpus \\\n    --arg_prefix=brain_parser \\\n    --graph_builder=structured \\\n    --model_path=models/brain_parser/structured/$PARAMS/model\ndone\n```\n\nHooray! You now have your very own cousin of Parsey McParseface, ready to go out\nand parse text in the wild.\n\n## Contact\n\nTo ask questions or report issues please post on Stack Overflow with the tag\n[syntaxnet](http://stackoverflow.com/questions/tagged/syntaxnet)\nor open an issue on the tensorflow/models\n[issues tracker](https://github.com/tensorflow/models/issues).\nPlease assign SyntaxNet issues to @calberti or @andorardo.\n\n## Credits\n\nOriginal authors of the code in this package include (in alphabetical order):\n\n*   Alessandro Presta\n*   Aliaksei Severyn\n*   Andy Golding\n*   Bernd Bohnet\n*   Chris Alberti\n*   Daniel Andor\n*   David Weiss\n*   Emily Pitler\n*   Greg Coppola\n*   Ji Ma\n*   Keith Hall\n*   Kuzman Ganchev\n*   Michael Collins\n*   Michael Ringgaard\n*   Ryan McDonald\n*   Slav Petrov\n*   Stefan Istrate\n*   Terry Koo\n*   Tim Credo\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/WORKSPACE",
    "content": "local_repository(\n  name = \"org_tensorflow\",\n  path = \"tensorflow\",\n)\n\nload('@org_tensorflow//tensorflow:workspace.bzl', 'tf_workspace')\ntf_workspace()\n\n# Specify the minimum required Bazel version.\nload(\"@org_tensorflow//tensorflow:tensorflow.bzl\", \"check_version\")\ncheck_version(\"0.3.0\")\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/BUILD",
    "content": "# Description:\n# A syntactic parser and part-of-speech tagger in TensorFlow.\n\npackage(\n    default_visibility = [\n        \"//visibility:private\",\n    ],\n    features = [\"-layering_check\"],\n)\n\nlicenses([\"notice\"])  # Apache 2.0\n\nload(\n    \"syntaxnet\",\n    \"tf_proto_library\",\n    \"tf_proto_library_py\",\n    \"tf_gen_op_libs\",\n    \"tf_gen_op_wrapper_py\",\n)\n\n# proto libraries\n\ntf_proto_library(\n    name = \"feature_extractor_proto\",\n    srcs = [\"feature_extractor.proto\"],\n)\n\ntf_proto_library(\n    name = \"sentence_proto\",\n    srcs = [\"sentence.proto\"],\n)\n\ntf_proto_library_py(\n    name = \"sentence_py_pb2\",\n    srcs = [\"sentence.proto\"],\n)\n\ntf_proto_library(\n    name = \"dictionary_proto\",\n    srcs = [\"dictionary.proto\"],\n)\n\ntf_proto_library_py(\n    name = \"dictionary_py_pb2\",\n    srcs = [\"dictionary.proto\"],\n)\n\ntf_proto_library(\n    name = \"kbest_syntax_proto\",\n    srcs = [\"kbest_syntax.proto\"],\n    deps = [\":sentence_proto\"],\n)\n\ntf_proto_library(\n    name = \"task_spec_proto\",\n    srcs = [\"task_spec.proto\"],\n)\n\ntf_proto_library_py(\n    name = \"task_spec_py_pb2\",\n    srcs = [\"task_spec.proto\"],\n)\n\ntf_proto_library(\n    name = \"sparse_proto\",\n    srcs = [\"sparse.proto\"],\n)\n\ntf_proto_library_py(\n    name = \"sparse_py_pb2\",\n    srcs = [\"sparse.proto\"],\n)\n\n# cc libraries for feature extraction and parsing\n\ncc_library(\n    name = \"base\",\n    hdrs = [\"base.h\"],\n    visibility = [\"//visibility:public\"],\n    deps = [\n        \"@com_googlesource_code_re2//:re2\",\n        \"@protobuf//:protobuf\",\n        \"@org_tensorflow//third_party/eigen3\",\n    ] + select({\n        \"//conditions:default\": [\n            \"@org_tensorflow//tensorflow/core:framework\",\n            \"@org_tensorflow//tensorflow/core:lib\",\n        ],\n        \"@org_tensorflow//tensorflow:darwin\": [\n            
\"@org_tensorflow//tensorflow/core:framework_headers_lib\",\n        ],\n    }),\n)\n\ncc_library(\n    name = \"utils\",\n    srcs = [\"utils.cc\"],\n    hdrs = [\n        \"utils.h\",\n    ],\n    deps = [\n        \":base\",\n        \"//util/utf8:unicodetext\",\n    ],\n)\n\ncc_library(\n    name = \"test_main\",\n    testonly = 1,\n    srcs = [\"test_main.cc\"],\n    linkopts = [\"-lm\"],\n    deps = [\n        \"//external:gtest\",\n        \"@org_tensorflow//tensorflow/core:lib\",\n        \"@org_tensorflow//tensorflow/core:testlib\",\n    ],\n)\n\ncc_library(\n    name = \"document_format\",\n    srcs = [\"document_format.cc\"],\n    hdrs = [\"document_format.h\"],\n    deps = [\n        \":registry\",\n        \":sentence_proto\",\n        \":task_context\",\n    ],\n)\n\ncc_library(\n    name = \"text_formats\",\n    srcs = [\"text_formats.cc\"],\n    deps = [\n        \":document_format\",\n        \":segmenter_utils\",\n        \":sentence_proto\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"fml_parser\",\n    srcs = [\"fml_parser.cc\"],\n    hdrs = [\"fml_parser.h\"],\n    deps = [\n        \":feature_extractor_proto\",\n        \":utils\",\n    ],\n)\n\ncc_library(\n    name = \"proto_io\",\n    hdrs = [\"proto_io.h\"],\n    deps = [\n        \":feature_extractor_proto\",\n        \":fml_parser\",\n        \":sentence_proto\",\n        \":task_context\",\n    ],\n)\n\ncc_library(\n    name = \"char_properties\",\n    srcs = [\"char_properties.cc\"],\n    hdrs = [\"char_properties.h\"],\n    deps = [\n        \":registry\",\n        \":utils\",\n        \"//util/utf8:unicodetext\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"segmenter_utils\",\n    srcs = [\"segmenter_utils.cc\"],\n    hdrs = [\"segmenter_utils.h\"],\n    deps = [\n        \":base\",\n        \":char_properties\",\n        \":sentence_proto\",\n        \"//util/utf8:unicodetext\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = 
\"feature_extractor\",\n    srcs = [\"feature_extractor.cc\"],\n    hdrs = [\n        \"feature_extractor.h\",\n        \"feature_types.h\",\n    ],\n    deps = [\n        \":document_format\",\n        \":feature_extractor_proto\",\n        \":proto_io\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":utils\",\n        \":workspace\",\n    ],\n)\n\ncc_library(\n    name = \"affix\",\n    srcs = [\"affix.cc\"],\n    hdrs = [\"affix.h\"],\n    deps = [\n        \":dictionary_proto\",\n        \":feature_extractor\",\n        \":sentence_proto\",\n        \":shared_store\",\n        \":term_frequency_map\",\n        \":utils\",\n        \":workspace\",\n    ],\n)\n\ncc_library(\n    name = \"sentence_features\",\n    srcs = [\"sentence_features.cc\"],\n    hdrs = [\"sentence_features.h\"],\n    deps = [\n        \":affix\",\n        \":feature_extractor\",\n        \":registry\",\n        \":segmenter_utils\",\n    ],\n)\n\ncc_library(\n    name = \"shared_store\",\n    srcs = [\"shared_store.cc\"],\n    hdrs = [\"shared_store.h\"],\n    deps = [\n        \":utils\",\n    ],\n)\n\ncc_library(\n    name = \"registry\",\n    srcs = [\"registry.cc\"],\n    hdrs = [\"registry.h\"],\n    deps = [\n        \":utils\",\n    ],\n)\n\ncc_library(\n    name = \"workspace\",\n    srcs = [\"workspace.cc\"],\n    hdrs = [\"workspace.h\"],\n    deps = [\n        \":utils\",\n    ],\n)\n\ncc_library(\n    name = \"task_context\",\n    srcs = [\"task_context.cc\"],\n    hdrs = [\"task_context.h\"],\n    deps = [\n        \":task_spec_proto\",\n        \":utils\",\n    ],\n)\n\ncc_library(\n    name = \"term_frequency_map\",\n    srcs = [\"term_frequency_map.cc\"],\n    hdrs = [\"term_frequency_map.h\"],\n    visibility = [\"//visibility:public\"],\n    deps = [\n        \":utils\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"morphology_label_set\",\n    srcs = [\"morphology_label_set.cc\"],\n    hdrs = [\"morphology_label_set.h\"],\n    deps = 
[\n        \":document_format\",\n        \":feature_extractor\",\n        \":proto_io\",\n        \":registry\",\n        \":sentence_proto\",\n        \":utils\",\n    ],\n)\n\ncc_library(\n    name = \"parser_transitions\",\n    srcs = [\n        \"arc_standard_transitions.cc\",\n        \"binary_segment_state.cc\",\n        \"binary_segment_transitions.cc\",\n        \"morpher_transitions.cc\",\n        \"parser_features.cc\",\n        \"parser_state.cc\",\n        \"parser_transitions.cc\",\n        \"tagger_transitions.cc\",\n    ],\n    hdrs = [\n        \"binary_segment_state.h\",\n        \"parser_features.h\",\n        \"parser_state.h\",\n        \"parser_transitions.h\",\n    ],\n    deps = [\n        \":affix\",\n        \":feature_extractor\",\n        \":kbest_syntax_proto\",\n        \":morphology_label_set\",\n        \":registry\",\n        \":segmenter_utils\",\n        \":sentence_features\",\n        \":sentence_proto\",\n        \":shared_store\",\n        \":task_context\",\n        \":term_frequency_map\",\n        \":workspace\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"populate_test_inputs\",\n    testonly = 1,\n    srcs = [\"populate_test_inputs.cc\"],\n    hdrs = [\"populate_test_inputs.h\"],\n    deps = [\n        \":dictionary_proto\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":task_spec_proto\",\n        \":term_frequency_map\",\n        \":test_main\",\n    ],\n)\n\ncc_library(\n    name = \"embedding_feature_extractor\",\n    srcs = [\"embedding_feature_extractor.cc\"],\n    hdrs = [\"embedding_feature_extractor.h\"],\n    deps = [\n        \":feature_extractor\",\n        \":parser_transitions\",\n        \":sparse_proto\",\n        \":task_context\",\n        \":workspace\",\n    ],\n)\n\ncc_library(\n    name = \"sentence_batch\",\n    srcs = [\"sentence_batch.cc\"],\n    hdrs = [\"sentence_batch.h\"],\n    deps = [\n        \":embedding_feature_extractor\",\n        
\":feature_extractor\",\n        \":parser_transitions\",\n        \":sentence_proto\",\n        \":sparse_proto\",\n        \":task_context\",\n        \":task_spec_proto\",\n        \":term_frequency_map\",\n        \":workspace\",\n    ],\n)\n\ncc_library(\n    name = \"reader_ops\",\n    srcs = [\n        \"beam_reader_ops.cc\",\n        \"reader_ops.cc\",\n    ],\n    deps = [\n        \":parser_transitions\",\n        \":sentence_batch\",\n        \":sentence_proto\",\n        \":sparse_proto\",\n        \":task_context\",\n        \":task_spec_proto\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"document_filters\",\n    srcs = [\"document_filters.cc\"],\n    deps = [\n        \":document_format\",\n        \":parser_transitions\",\n        \":sentence_batch\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":text_formats\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"lexicon_builder\",\n    srcs = [\"lexicon_builder.cc\"],\n    deps = [\n        \":dictionary_proto\",\n        \":document_format\",\n        \":parser_transitions\",\n        \":segmenter_utils\",\n        \":sentence_batch\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":text_formats\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"unpack_sparse_features\",\n    srcs = [\"unpack_sparse_features.cc\"],\n    deps = [\n        \":sparse_proto\",\n        \":utils\",\n    ],\n    alwayslink = 1,\n)\n\ncc_library(\n    name = \"parser_ops_cc\",\n    srcs = [\"ops/parser_ops.cc\"],\n    deps = [\n        \":base\",\n        \":document_filters\",\n        \":lexicon_builder\",\n        \":reader_ops\",\n        \":unpack_sparse_features\",\n    ],\n    alwayslink = 1,\n)\n\ncc_binary(\n    name = \"parser_ops.so\",\n    linkopts = select({\n        \"//conditions:default\": [\"-lm\"],\n        \"@org_tensorflow//tensorflow:darwin\": [],\n    }),\n    linkshared = 1,\n    linkstatic = 1,\n    deps = [\n     
   \":parser_ops_cc\",\n    ],\n)\n\n# cc tests\n\nfilegroup(\n    name = \"testdata\",\n    srcs = [\n        \"testdata/context.pbtxt\",\n        \"testdata/document\",\n        \"testdata/mini-training-set\",\n    ],\n)\n\nfilegroup(\n    name = \"parsey_data\",\n    srcs = glob([\"models/parsey_mcparseface/*\"]),\n)\n\ncc_test(\n    name = \"binary_segment_state_test\",\n    size = \"small\",\n    srcs = [\"binary_segment_state_test.cc\"],\n    deps = [\n        \":base\",\n        \":parser_transitions\",\n        \":term_frequency_map\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"shared_store_test\",\n    size = \"small\",\n    srcs = [\"shared_store_test.cc\"],\n    deps = [\n        \":shared_store\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"char_properties_test\",\n    srcs = [\"char_properties_test.cc\"],\n    deps = [\n        \":char_properties\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"segmenter_utils_test\",\n    srcs = [\"segmenter_utils_test.cc\"],\n    deps = [\n        \":base\",\n        \":segmenter_utils\",\n        \":sentence_proto\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"sentence_features_test\",\n    size = \"medium\",\n    srcs = [\"sentence_features_test.cc\"],\n    deps = [\n        \":feature_extractor\",\n        \":populate_test_inputs\",\n        \":sentence_features\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":task_spec_proto\",\n        \":term_frequency_map\",\n        \":test_main\",\n        \":workspace\",\n    ],\n)\n\ncc_test(\n    name = \"morphology_label_set_test\",\n    srcs = [\"morphology_label_set_test.cc\"],\n    deps = [\n        \":morphology_label_set\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"arc_standard_transitions_test\",\n    size = \"small\",\n    srcs = [\"arc_standard_transitions_test.cc\"],\n    data = [\":testdata\"],\n    deps = [\n        \":parser_transitions\",\n      
  \":populate_test_inputs\",\n        \":sentence_proto\",\n        \":task_spec_proto\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"binary_segment_transitions_test\",\n    size = \"small\",\n    srcs = [\"binary_segment_transitions_test.cc\"],\n    deps = [\n        \":parser_transitions\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":test_main\",\n        \":workspace\",\n    ],\n)\n\ncc_test(\n    name = \"tagger_transitions_test\",\n    size = \"small\",\n    srcs = [\"tagger_transitions_test.cc\"],\n    data = [\":testdata\"],\n    deps = [\n        \":parser_transitions\",\n        \":populate_test_inputs\",\n        \":sentence_proto\",\n        \":task_spec_proto\",\n        \":test_main\",\n    ],\n)\n\ncc_test(\n    name = \"parser_features_test\",\n    size = \"small\",\n    srcs = [\"parser_features_test.cc\"],\n    deps = [\n        \":feature_extractor\",\n        \":parser_transitions\",\n        \":populate_test_inputs\",\n        \":sentence_proto\",\n        \":task_context\",\n        \":task_spec_proto\",\n        \":term_frequency_map\",\n        \":test_main\",\n        \":workspace\",\n    ],\n)\n\n# py graph builder and trainer\n\ntf_gen_op_libs(\n    op_lib_names = [\"parser_ops\"],\n)\n\ntf_gen_op_wrapper_py(\n    name = \"parser_ops\",\n    deps = [\":parser_ops_op_lib\"],\n)\n\npy_library(\n    name = \"load_parser_ops_py\",\n    srcs = [\"load_parser_ops.py\"],\n    data = [\":parser_ops.so\"],\n)\n\npy_library(\n    name = \"graph_builder\",\n    srcs = [\"graph_builder.py\"],\n    deps = [\n        \":load_parser_ops_py\",\n        \":parser_ops\",\n        \"@org_tensorflow//tensorflow:tensorflow_py\",\n        \"@org_tensorflow//tensorflow/core:protos_all_py\",\n    ],\n)\n\npy_library(\n    name = \"structured_graph_builder\",\n    srcs = [\"structured_graph_builder.py\"],\n    deps = [\n        \":graph_builder\",\n    ],\n)\n\npy_binary(\n    name = \"parser_trainer\",\n    srcs = 
[\"parser_trainer.py\"],\n    deps = [\n        \":graph_builder\",\n        \":structured_graph_builder\",\n        \":task_spec_py_pb2\",\n    ],\n)\n\npy_binary(\n    name = \"parser_eval\",\n    srcs = [\"parser_eval.py\"],\n    deps = [\n        \":graph_builder\",\n        \":sentence_py_pb2\",\n        \":structured_graph_builder\",\n        \":task_spec_py_pb2\",\n    ],\n)\n\npy_binary(\n    name = \"conll2tree\",\n    srcs = [\"conll2tree.py\"],\n    deps = [\n        \":graph_builder\",\n        \":sentence_py_pb2\",\n    ],\n)\n\n# py tests\n\npy_test(\n    name = \"lexicon_builder_test\",\n    size = \"small\",\n    srcs = [\"lexicon_builder_test.py\"],\n    deps = [\n        \":graph_builder\",\n        \":sentence_py_pb2\",\n        \":task_spec_py_pb2\",\n    ],\n)\n\npy_test(\n    name = \"text_formats_test\",\n    size = \"small\",\n    srcs = [\"text_formats_test.py\"],\n    deps = [\n        \":graph_builder\",\n        \":sentence_py_pb2\",\n        \":task_spec_py_pb2\",\n    ],\n)\n\npy_test(\n    name = \"reader_ops_test\",\n    size = \"medium\",\n    srcs = [\"reader_ops_test.py\"],\n    data = [\":testdata\"],\n    tags = [\"notsan\"],\n    deps = [\n        \":dictionary_py_pb2\",\n        \":graph_builder\",\n        \":sparse_py_pb2\",\n    ],\n)\n\npy_test(\n    name = \"beam_reader_ops_test\",\n    size = \"medium\",\n    srcs = [\"beam_reader_ops_test.py\"],\n    data = [\":testdata\"],\n    tags = [\"notsan\"],\n    deps = [\n        \":structured_graph_builder\",\n    ],\n)\n\npy_test(\n    name = \"graph_builder_test\",\n    size = \"medium\",\n    srcs = [\"graph_builder_test.py\"],\n    data = [\n        \":testdata\",\n    ],\n    tags = [\"notsan\"],\n    deps = [\n        \":graph_builder\",\n        \":sparse_py_pb2\",\n    ],\n)\n\nsh_test(\n    name = \"parser_trainer_test\",\n    size = \"large\",\n    srcs = [\"parser_trainer_test.sh\"],\n    data = [\n        \":parser_eval\",\n        \":parser_trainer\",\n        
\":testdata\",\n    ],\n    tags = [\"notsan\"],\n)\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/affix.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/affix.h\"\n\n#include <ctype.h>\n#include <string.h>\n#include <functional>\n#include <string>\n\n#include \"syntaxnet/shared_store.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n#include \"tensorflow/core/platform/regexp.h\"\n#include \"util/utf8/unicodetext.h\"\n\nnamespace syntaxnet {\n\n// Initial number of buckets in term and affix hash maps. 
This must be a power\n// of two.\nstatic const int kInitialBuckets = 1024;\n\n// Fill factor for term and affix hash maps.\nstatic const int kFillFactor = 2;\n\nint TermHash(const string &term) {\n  return utils::Hash32(term.data(), term.size(), 0xDECAF);\n}\n\n// Copies a substring of a Unicode text to a string.\nstatic void UnicodeSubstring(const UnicodeText::const_iterator &start,\n                             const UnicodeText::const_iterator &end,\n                             string *result) {\n  result->clear();\n  result->append(start.utf8_data(), end.utf8_data() - start.utf8_data());\n}\n\nAffixTable::AffixTable(Type type, int max_length) {\n  type_ = type;\n  max_length_ = max_length;\n  Resize(0);\n}\n\nAffixTable::~AffixTable() { Reset(0); }\n\nvoid AffixTable::Reset(int max_length) {\n  // Save new maximum affix length.\n  max_length_ = max_length;\n\n  // Delete all data.\n  for (size_t i = 0; i < affixes_.size(); ++i) delete affixes_[i];\n  affixes_.clear();\n  buckets_.clear();\n  Resize(0);\n}\n\nvoid AffixTable::Read(const AffixTableEntry &table_entry) {\n  CHECK_EQ(table_entry.type(), type_ == PREFIX ? 
\"PREFIX\" : \"SUFFIX\");\n  CHECK_GE(table_entry.max_length(), 0);\n  Reset(table_entry.max_length());\n\n  // First, create all affixes.\n  for (int affix_id = 0; affix_id < table_entry.affix_size(); ++affix_id) {\n    const auto &affix_entry = table_entry.affix(affix_id);\n    CHECK_GE(affix_entry.length(), 0);\n    CHECK_LE(affix_entry.length(), max_length_);\n    CHECK(FindAffix(affix_entry.form()) == nullptr);  // forbid duplicates\n    Affix *affix = AddNewAffix(affix_entry.form(), affix_entry.length());\n    CHECK_EQ(affix->id(), affix_id);\n  }\n  CHECK_EQ(affixes_.size(), table_entry.affix_size());\n\n  // Next, link the shorter affixes.\n  for (int affix_id = 0; affix_id < table_entry.affix_size(); ++affix_id) {\n    const auto &affix_entry = table_entry.affix(affix_id);\n    if (affix_entry.shorter_id() == -1) {\n      CHECK_EQ(affix_entry.length(), 1);\n      continue;\n    }\n    CHECK_GT(affix_entry.length(), 1);\n    CHECK_GE(affix_entry.shorter_id(), 0);\n    CHECK_LT(affix_entry.shorter_id(), affixes_.size());\n    Affix *affix = affixes_[affix_id];\n    Affix *shorter = affixes_[affix_entry.shorter_id()];\n    CHECK_EQ(affix->length(), shorter->length() + 1);\n    affix->set_shorter(shorter);\n  }\n}\n\nvoid AffixTable::Read(ProtoRecordReader *reader) {\n  AffixTableEntry table_entry;\n  TF_CHECK_OK(reader->Read(&table_entry));\n  Read(table_entry);\n}\n\nvoid AffixTable::Write(AffixTableEntry *table_entry) const {\n  table_entry->Clear();\n  table_entry->set_type(type_ == PREFIX ? \"PREFIX\" : \"SUFFIX\");\n  table_entry->set_max_length(max_length_);\n  for (const Affix *affix : affixes_) {\n    auto *affix_entry = table_entry->add_affix();\n    affix_entry->set_form(affix->form());\n    affix_entry->set_length(affix->length());\n    affix_entry->set_shorter_id(\n        affix->shorter() == nullptr ? 
-1 : affix->shorter()->id());\n  }\n}\n\nvoid AffixTable::Write(ProtoRecordWriter *writer) const {\n  AffixTableEntry table_entry;\n  Write(&table_entry);\n  writer->Write(table_entry);\n}\n\nAffix *AffixTable::AddAffixesForWord(const char *word, size_t size) {\n  // The affix length is measured in characters and not bytes so we need to\n  // determine the length in characters.\n  UnicodeText text;\n  text.PointToUTF8(word, size);\n  int length = text.size();\n\n  // Determine longest affix.\n  int affix_len = length;\n  if (affix_len > max_length_) affix_len = max_length_;\n  if (affix_len == 0) return nullptr;\n\n  // Find start and end of longest affix.\n  UnicodeText::const_iterator start, end;\n  if (type_ == PREFIX) {\n    start = end = text.begin();\n    for (int i = 0; i < affix_len; ++i) ++end;\n  } else {\n    start = end = text.end();\n    for (int i = 0; i < affix_len; ++i) --start;\n  }\n\n  // Try to find successively shorter affixes.\n  Affix *top = nullptr;\n  Affix *ancestor = nullptr;\n  string s;\n  while (affix_len > 0) {\n    // Try to find affix in table.\n    UnicodeSubstring(start, end, &s);\n    Affix *affix = FindAffix(s);\n    if (affix == nullptr) {\n      // Affix not found, add new one to table.\n      affix = AddNewAffix(s, affix_len);\n\n      // Update ancestor chain.\n      if (ancestor != nullptr) ancestor->set_shorter(affix);\n      ancestor = affix;\n      if (top == nullptr) top = affix;\n    } else {\n      // Affix found. 
Update ancestor if needed and return match.\n      if (ancestor != nullptr) ancestor->set_shorter(affix);\n      if (top == nullptr) top = affix;\n      break;\n    }\n\n    // Next affix.\n    if (type_ == PREFIX) {\n      --end;\n    } else {\n      ++start;\n    }\n\n    affix_len--;\n  }\n\n  return top;\n}\n\nAffix *AffixTable::GetAffix(int id) const {\n  if (id < 0 || id >= static_cast<int>(affixes_.size())) {\n    return nullptr;\n  } else {\n    return affixes_[id];\n  }\n}\n\nstring AffixTable::AffixForm(int id) const {\n  Affix *affix = GetAffix(id);\n  if (affix == nullptr) {\n    return \"\";\n  } else {\n    return affix->form();\n  }\n}\n\nint AffixTable::AffixId(const string &form) const {\n  Affix *affix = FindAffix(form);\n  if (affix == nullptr) {\n    return -1;\n  } else {\n    return affix->id();\n  }\n}\n\nAffix *AffixTable::AddNewAffix(const string &form, int length) {\n  int hash = TermHash(form);\n  int id = affixes_.size();\n  if (id > static_cast<int>(buckets_.size()) * kFillFactor) Resize(id);\n  int b = hash & (buckets_.size() - 1);\n\n  // Create new affix object.\n  Affix *affix = new Affix(id, form.c_str(), length);\n  affixes_.push_back(affix);\n\n  // Insert affix in bucket chain.\n  affix->next_ = buckets_[b];\n  buckets_[b] = affix;\n\n  return affix;\n}\n\nAffix *AffixTable::FindAffix(const string &form) const {\n  // Compute hash value for word.\n  int hash = TermHash(form);\n\n  // Try to find affix in hash table.\n  Affix *affix = buckets_[hash & (buckets_.size() - 1)];\n  while (affix != nullptr) {\n    if (strcmp(affix->form_.c_str(), form.c_str()) == 0) return affix;\n    affix = affix->next_;\n  }\n  return nullptr;\n}\n\nvoid AffixTable::Resize(int size_hint) {\n  // Compute new size for bucket array.\n  int new_size = kInitialBuckets;\n  while (new_size < size_hint) new_size *= 2;\n  int mask = new_size - 1;\n\n  // Distribute affixes in new buckets.\n  buckets_.resize(new_size);\n  for (size_t i = 0; i < 
buckets_.size(); ++i) {\n    buckets_[i] = nullptr;\n  }\n  for (size_t i = 0; i < affixes_.size(); ++i) {\n    Affix *affix = affixes_[i];\n    int b = TermHash(affix->form_) & mask;\n    affix->next_ = buckets_[b];\n    buckets_[b] = affix;\n  }\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/affix.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_AFFIX_H_\n#define SYNTAXNET_AFFIX_H_\n\n#include <stddef.h>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/dictionary.pb.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/proto_io.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\n// An affix represents a prefix or suffix of a word of a certain length. Each\n// affix has a unique id and a textual form. An affix also has a pointer to the\n// affix that is one character shorter. 
This creates a chain of affixes that are\n// successively shorter.\nclass Affix {\n private:\n  friend class AffixTable;\n  Affix(int id, const char *form, int length)\n      : id_(id),\n        length_(length),\n        form_(form),\n        shorter_(nullptr),\n        next_(nullptr) {}\n\n public:\n  // Returns unique id of affix.\n  int id() const { return id_; }\n\n  // Returns the textual representation of the affix.\n  string form() const { return form_; }\n\n  // Returns the length of the affix.\n  int length() const { return length_; }\n\n  // Gets/sets the affix that is one character shorter.\n  Affix *shorter() const { return shorter_; }\n  void set_shorter(Affix *next) { shorter_ = next; }\n\n private:\n  // Affix id.\n  int id_;\n\n  // Length (in characters) of affix.\n  int length_;\n\n  // Text form of affix.\n  string form_;\n\n  // Pointer to affix that is one character shorter.\n  Affix *shorter_;\n\n  // Next affix in bucket chain.\n  Affix *next_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(Affix);\n};\n\n// An affix table holds all prefixes/suffixes of all the words added to the\n// table up to a maximum length. The affixes are chained together to enable\n// fast lookup of all affixes for a word.\nclass AffixTable {\n public:\n  // Affix table type.\n  enum Type { PREFIX, SUFFIX };\n\n  AffixTable(Type type, int max_length);\n  ~AffixTable();\n\n  // Resets the affix table and initialize the table for affixes of up to the\n  // maximum length specified.\n  void Reset(int max_length);\n\n  // De-serializes this from the given proto.\n  void Read(const AffixTableEntry &table_entry);\n\n  // De-serializes this from the given records.\n  void Read(ProtoRecordReader *reader);\n\n  // Serializes this to the given proto.\n  void Write(AffixTableEntry *table_entry) const;\n\n  // Serializes this to the given records.\n  void Write(ProtoRecordWriter *writer) const;\n\n  // Adds all prefixes/suffixes of the word up to the maximum length to the\n  // table. 
The longest affix is returned. The pointers in the affix can be\n  // used for getting shorter affixes.\n  Affix *AddAffixesForWord(const char *word, size_t size);\n\n  // Gets the affix information for the affix with a certain id. Returns NULL if\n  // there is no affix in the table with this id.\n  Affix *GetAffix(int id) const;\n\n  // Gets affix form from id. If the affix does not exist in the table, an empty\n  // string is returned.\n  string AffixForm(int id) const;\n\n  // Gets affix id for affix. If the affix does not exist in the table, -1 is\n  // returned.\n  int AffixId(const string &form) const;\n\n  // Returns size of the affix table.\n  int size() const { return affixes_.size(); }\n\n  // Returns the maximum affix length.\n  int max_length() const { return max_length_; }\n\n private:\n  // Adds a new affix to table.\n  Affix *AddNewAffix(const string &form, int length);\n\n  // Finds existing affix in table.\n  Affix *FindAffix(const string &form) const;\n\n  // Resizes bucket array.\n  void Resize(int size_hint);\n\n  // Affix type (prefix or suffix).\n  Type type_;\n\n  // Maximum length of affix.\n  int max_length_;\n\n  // Index from affix ids to affix items.\n  vector<Affix *> affixes_;\n\n  // Buckets for word-to-affix hash map.\n  vector<Affix *> buckets_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(AffixTable);\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_AFFIX_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/arc_standard_transitions.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Arc-standard transition system.\n//\n// This transition system has three types of actions:\n//  - The SHIFT action pushes the next input token to the stack and\n//    advances to the next input token.\n//  - The LEFT_ARC action adds a dependency relation from first to second token\n//    on the stack and removes second one.\n//  - The RIGHT_ARC action adds a dependency relation from second to first token\n//    on the stack and removes the first one.\n//\n// The transition system operates with parser actions encoded as integers:\n//  - A SHIFT action is encoded as 0.\n//  - A LEFT_ARC action is encoded as an odd number starting from 1.\n//  - A RIGHT_ARC action is encoded as an even number starting from 2.\n\n#include <string>\n\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\nclass ArcStandardTransitionState : public ParserTransitionState {\n public:\n  // Clones the transition state by returning a new object.\n  ParserTransitionState *Clone() const override {\n    return new ArcStandardTransitionState();\n  }\n\n  // Pushes the root on the stack before using the parser state in parsing.\n  void Init(ParserState *state) override { 
state->Push(-1); }\n\n  // Adds transition state specific annotations to the document.\n  void AddParseToDocument(const ParserState &state, bool rewrite_root_labels,\n                          Sentence *sentence) const override {\n    for (int i = 0; i < state.NumTokens(); ++i) {\n      Token *token = sentence->mutable_token(i);\n      token->set_label(state.LabelAsString(state.Label(i)));\n      if (state.Head(i) != -1) {\n        token->set_head(state.Head(i));\n      } else {\n        token->clear_head();\n        if (rewrite_root_labels) {\n          token->set_label(state.LabelAsString(state.RootLabel()));\n        }\n      }\n    }\n  }\n\n  // Whether a parsed token should be considered correct for evaluation.\n  bool IsTokenCorrect(const ParserState &state, int index) const override {\n    return state.GoldHead(index) == state.Head(index);\n  }\n\n  // Returns a human readable string representation of this state.\n  string ToString(const ParserState &state) const override {\n    string str;\n    str.append(\"[\");\n    for (int i = state.StackSize() - 1; i >= 0; --i) {\n      const string &word = state.GetToken(state.Stack(i)).word();\n      if (i != state.StackSize() - 1) str.append(\" \");\n      if (word == \"\") {\n        str.append(ParserState::kRootLabel);\n      } else {\n        str.append(word);\n      }\n    }\n    str.append(\"]\");\n    for (int i = state.Next(); i < state.NumTokens(); ++i) {\n      tensorflow::strings::StrAppend(&str, \" \", state.GetToken(i).word());\n    }\n    return str;\n  }\n};\n\nclass ArcStandardTransitionSystem : public ParserTransitionSystem {\n public:\n  // Action types for the arc-standard transition system.\n  enum ParserActionType {\n    SHIFT = 0,\n    LEFT_ARC = 1,\n    RIGHT_ARC = 2,\n  };\n\n  // The SHIFT action uses the same value as the corresponding action type.\n  static ParserAction ShiftAction() { return SHIFT; }\n\n  // The LEFT_ARC action converts the label to an odd number greater or equal\n  // to 
1.\n  static ParserAction LeftArcAction(int label) { return 1 + (label << 1); }\n\n  // The RIGHT_ARC action converts the label to an even number greater or equal\n  // to 2.\n  static ParserAction RightArcAction(int label) {\n    return 1 + ((label << 1) | 1);\n  }\n\n  // Extracts the action type from a given parser action.\n  static ParserActionType ActionType(ParserAction action) {\n    return static_cast<ParserActionType>(action < 1 ? action\n                                                    : 1 + (~action & 1));\n  }\n\n  // Extracts the label from a given parser action. If the action is SHIFT,\n  // returns -1.\n  static int Label(ParserAction action) {\n    return action < 1 ? -1 : (action - 1) >> 1;\n  }\n\n  // Returns the number of action types.\n  int NumActionTypes() const override { return 3; }\n\n  // Returns the number of possible actions.\n  int NumActions(int num_labels) const override { return 1 + 2 * num_labels; }\n\n  // The method returns the default action for a given state.\n  ParserAction GetDefaultAction(const ParserState &state) const override {\n    // If there are further tokens available in the input then Shift.\n    if (!state.EndOfInput()) return ShiftAction();\n\n    // Do a \"reduce\".\n    return RightArcAction(2);\n  }\n\n  // Returns the next gold action for a given state according to the\n  // underlying annotated sentence.\n  ParserAction GetNextGoldAction(const ParserState &state) const override {\n    // If the stack contains less than 2 tokens, the only valid parser action is\n    // shift.\n    if (state.StackSize() < 2) {\n      DCHECK(!state.EndOfInput());\n      return ShiftAction();\n    }\n\n    // If the second token on the stack is the head of the first one,\n    // return a right arc action.\n    if (state.GoldHead(state.Stack(0)) == state.Stack(1) &&\n        DoneChildrenRightOf(state, state.Stack(0))) {\n      const int gold_label = state.GoldLabel(state.Stack(0));\n      return RightArcAction(gold_label);\n    
}\n\n    // If the first token on the stack is the head of the second one,\n    // return a left arc action.\n    if (state.GoldHead(state.Stack(1)) == state.Top()) {\n      const int gold_label = state.GoldLabel(state.Stack(1));\n      return LeftArcAction(gold_label);\n    }\n\n    // Otherwise, shift.\n    return ShiftAction();\n  }\n\n  // Determines if a token has any children to the right in the sentence.\n  // Arc standard is a bottom-up parsing method and has to finish all sub-trees\n  // first.\n  static bool DoneChildrenRightOf(const ParserState &state, int head) {\n    int index = state.Next();\n    int num_tokens = state.sentence().token_size();\n    while (index < num_tokens) {\n      // Check if the token at index is the child of head.\n      int actual_head = state.GoldHead(index);\n      if (actual_head == head) return false;\n\n      // If the head of the token at index is to the right of it there cannot be\n      // any children in-between, so we can skip forward to the head.  
Note this\n      // is only true for projective trees.\n      if (actual_head > index) {\n        index = actual_head;\n      } else {\n        ++index;\n      }\n    }\n    return true;\n  }\n\n  // Checks if the action is allowed in a given parser state.\n  bool IsAllowedAction(ParserAction action,\n                       const ParserState &state) const override {\n    switch (ActionType(action)) {\n      case SHIFT:\n        return IsAllowedShift(state);\n      case LEFT_ARC:\n        return IsAllowedLeftArc(state);\n      case RIGHT_ARC:\n        return IsAllowedRightArc(state);\n    }\n\n    return false;\n  }\n\n  // Returns true if a shift is allowed in the given parser state.\n  bool IsAllowedShift(const ParserState &state) const {\n    // We can shift if there are more input tokens.\n    return !state.EndOfInput();\n  }\n\n  // Returns true if a left-arc is allowed in the given parser state.\n  bool IsAllowedLeftArc(const ParserState &state) const {\n    // Left-arc requires two or more tokens on the stack, but the first token\n    // is the root and we do not want a left arc to the root.\n    return state.StackSize() > 2;\n  }\n\n  // Returns true if a right-arc is allowed in the given parser state.\n  bool IsAllowedRightArc(const ParserState &state) const {\n    // Right arc requires two or more tokens on the stack.\n    return state.StackSize() > 1;\n  }\n\n  // Performs the specified action on a given parser state, without adding the\n  // action to the state's history.\n  void PerformActionWithoutHistory(ParserAction action,\n                                   ParserState *state) const override {\n    switch (ActionType(action)) {\n      case SHIFT:\n        PerformShift(state);\n        break;\n      case LEFT_ARC:\n        PerformLeftArc(state, Label(action));\n        break;\n      case RIGHT_ARC:\n        PerformRightArc(state, Label(action));\n        break;\n    }\n  }\n\n  // Makes a shift by pushing the next input token on the stack and 
moving to\n  // the next position.\n  void PerformShift(ParserState *state) const {\n    DCHECK(IsAllowedShift(*state));\n    state->Push(state->Next());\n    state->Advance();\n  }\n\n  // Makes a left-arc between the two top tokens on stack and pops the second\n  // token on stack.\n  void PerformLeftArc(ParserState *state, int label) const {\n    DCHECK(IsAllowedLeftArc(*state));\n    int s0 = state->Pop();\n    state->AddArc(state->Pop(), s0, label);\n    state->Push(s0);\n  }\n\n  // Makes a right-arc between the two top tokens on stack and pops the stack.\n  void PerformRightArc(ParserState *state, int label) const {\n    DCHECK(IsAllowedRightArc(*state));\n    int s0 = state->Pop();\n    int s1 = state->Pop();\n    state->AddArc(s0, s1, label);\n    state->Push(s1);\n  }\n\n  // We are in a deterministic state when we either reached the end of the input\n  // or reduced everything from the stack.\n  bool IsDeterministicState(const ParserState &state) const override {\n    return state.StackSize() < 2 && !state.EndOfInput();\n  }\n\n  // We are in a final state when we reached the end of the input and the stack\n  // is empty.\n  bool IsFinalState(const ParserState &state) const override {\n    return state.EndOfInput() && state.StackSize() < 2;\n  }\n\n  // Returns a string representation of a parser action.\n  string ActionAsString(ParserAction action,\n                        const ParserState &state) const override {\n    switch (ActionType(action)) {\n      case SHIFT:\n        return \"SHIFT\";\n      case LEFT_ARC:\n        return \"LEFT_ARC(\" + state.LabelAsString(Label(action)) + \")\";\n      case RIGHT_ARC:\n        return \"RIGHT_ARC(\" + state.LabelAsString(Label(action)) + \")\";\n    }\n    return \"UNKNOWN\";\n  }\n\n  // Returns a new transition state to be used to enhance the parser state.\n  ParserTransitionState *NewTransitionState(bool training_mode) const override {\n    return new ArcStandardTransitionState();\n  
}\n};\n\nREGISTER_TRANSITION_SYSTEM(\"arc-standard\", ArcStandardTransitionSystem);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/arc_standard_transitions_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include <memory>\n#include <string>\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/populate_test_inputs.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include <gmock/gmock.h>\n\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\nclass ArcStandardTransitionTest : public ::testing::Test {\n public:\n  ArcStandardTransitionTest()\n      : transition_system_(ParserTransitionSystem::Create(\"arc-standard\")) {}\n\n protected:\n  // Creates a label map and a tag map for testing based on the given\n  // document and initializes the transition system appropriately.\n  void SetUpForDocument(const Sentence &document) {\n    input_label_map_ = context_.GetInput(\"label-map\", \"text\", \"\");\n    transition_system_->Setup(&context_);\n    PopulateTestInputs::Defaults(document).Populate(&context_);\n    label_map_.Load(TaskContext::InputFile(*input_label_map_),\n                    0 /* minimum frequency */,\n                    -1 /* maximum number of terms */);\n    
transition_system_->Init(&context_);\n  }\n\n  // Creates a cloned state from a sentence in order to test that cloning\n  // works correctly for the new parser states.\n  ParserState *NewClonedState(Sentence *sentence) {\n    ParserState state(sentence, transition_system_->NewTransitionState(\n                                    true /* training mode */),\n                      &label_map_);\n    return state.Clone();\n  }\n\n  // Performs gold transitions and check that the labels and heads recorded\n  // in the parser state match gold heads and labels.\n  void GoldParse(Sentence *sentence) {\n    ParserState *state = NewClonedState(sentence);\n    LOG(INFO) << \"Initial parser state: \" << state->ToString();\n    while (!transition_system_->IsFinalState(*state)) {\n      ParserAction action = transition_system_->GetNextGoldAction(*state);\n      EXPECT_TRUE(transition_system_->IsAllowedAction(action, *state));\n      LOG(INFO) << \"Performing action: \"\n                << transition_system_->ActionAsString(action, *state);\n      transition_system_->PerformActionWithoutHistory(action, state);\n      LOG(INFO) << \"Parser state: \" << state->ToString();\n    }\n    for (int i = 0; i < sentence->token_size(); ++i) {\n      EXPECT_EQ(state->GoldLabel(i), state->Label(i));\n      EXPECT_EQ(state->GoldHead(i), state->Head(i));\n    }\n    delete state;\n  }\n\n  // Always takes the default action, and verifies that this leads to\n  // a final state through a sequence of allowed actions.\n  void DefaultParse(Sentence *sentence) {\n    ParserState *state = NewClonedState(sentence);\n    LOG(INFO) << \"Initial parser state: \" << state->ToString();\n    while (!transition_system_->IsFinalState(*state)) {\n      ParserAction action = transition_system_->GetDefaultAction(*state);\n      EXPECT_TRUE(transition_system_->IsAllowedAction(action, *state));\n      LOG(INFO) << \"Performing action: \"\n                << transition_system_->ActionAsString(action, *state);\n      
transition_system_->PerformActionWithoutHistory(action, state);\n      LOG(INFO) << \"Parser state: \" << state->ToString();\n    }\n    delete state;\n  }\n\n  TaskContext context_;\n  TaskInput *input_label_map_ = nullptr;\n  TermFrequencyMap label_map_;\n  std::unique_ptr<ParserTransitionSystem> transition_system_;\n};\n\nTEST_F(ArcStandardTransitionTest, SingleSentenceDocumentTest) {\n  string document_text;\n  Sentence document;\n  TF_CHECK_OK(ReadFileToString(\n      tensorflow::Env::Default(),\n      \"syntaxnet/testdata/document\",\n      &document_text));\n  LOG(INFO) << \"see doc\\n:\" << document_text;\n  CHECK(TextFormat::ParseFromString(document_text, &document));\n  SetUpForDocument(document);\n  GoldParse(&document);\n  DefaultParse(&document);\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/base.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_BASE_H_\n#define SYNTAXNET_BASE_H_\n\n#include <functional>\n#include <string>\n#include <vector>\n#include <unordered_map>\n#include <unordered_set>\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/lib/strings/stringprintf.h\"\n#include \"tensorflow/core/platform/default/integral_types.h\"\n#include \"tensorflow/core/platform/mutex.h\"\n#include \"tensorflow/core/platform/protobuf.h\"\n\n\n\nusing tensorflow::int32;\nusing tensorflow::int64;\nusing tensorflow::uint64;\nusing tensorflow::uint32;\nusing tensorflow::uint32;\nusing tensorflow::protobuf::TextFormat;\nusing tensorflow::mutex_lock;\nusing tensorflow::mutex;\nusing std::map;\nusing std::pair;\nusing std::vector;\nusing std::unordered_map;\nusing std::unordered_set;\ntypedef signed int char32;\n\nusing tensorflow::StringPiece;\nusing std::string;\n\n  // namespace syntaxnet\n\n#endif  // SYNTAXNET_BASE_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/beam_reader_ops.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include <algorithm>\n#include <deque>\n#include <map>\n#include <memory>\n#include <string>\n#include <utility>\n#include <vector>\n\n#include \"syntaxnet/base.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/sentence_batch.h\"\n#include \"syntaxnet/shared_store.h\"\n#include \"syntaxnet/sparse.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"third_party/eigen3/unsupported/Eigen/CXX11/Tensor\"\n#include \"tensorflow/core/framework/op_kernel.h\"\n#include \"tensorflow/core/framework/tensor.h\"\n#include \"tensorflow/core/framework/tensor_shape.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n\nusing tensorflow::DEVICE_CPU;\nusing tensorflow::DT_BOOL;\nusing tensorflow::DT_FLOAT;\nusing tensorflow::DT_INT32;\nusing tensorflow::DT_INT64;\nusing tensorflow::DT_STRING;\nusing tensorflow::DataType;\nusing tensorflow::OpKernel;\nusing tensorflow::OpKernelConstruction;\nusing tensorflow::OpKernelContext;\nusing tensorflow::TTypes;\nusing tensorflow::Tensor;\nusing tensorflow::TensorShape;\nusing tensorflow::errors::FailedPrecondition;\nusing tensorflow::errors::InvalidArgument;\n\nnamespace syntaxnet 
{\n\n// Wraps ParserState so that the history of transitions (actions\n// performed and the beam slot they were performed in) is recorded.\nstruct ParserStateWithHistory {\n public:\n  // New state with an empty history.\n  explicit ParserStateWithHistory(const ParserState &s) : state(s.Clone()) {}\n\n  // New state obtained by cloning the given state and applying the given\n  // action. The given beam slot and action are appended to the history.\n  ParserStateWithHistory(const ParserStateWithHistory &next,\n                         const ParserTransitionSystem &transitions, int32 slot,\n                         int32 action, float score)\n      : state(next.state->Clone()),\n        slot_history(next.slot_history),\n        action_history(next.action_history),\n        score_history(next.score_history) {\n    transitions.PerformAction(action, state.get());\n    slot_history.push_back(slot);\n    action_history.push_back(action);\n    score_history.push_back(score);\n  }\n\n  std::unique_ptr<ParserState> state;\n  std::vector<int32> slot_history;\n  std::vector<int32> action_history;\n  std::vector<float> score_history;\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(ParserStateWithHistory);\n};\n\nstruct BatchStateOptions {\n  // Maximum number of parser states in a beam.\n  int max_beam_size;\n\n  // Number of parallel sentences to decode.\n  int batch_size;\n\n  // Argument prefix for context parameters.\n  string arg_prefix;\n\n  // Corpus name to read from context inputs.\n  string corpus_name;\n\n  // Whether we allow weights in SparseFeatures protos.\n  bool allow_feature_weights;\n\n  // Whether beams should be considered alive until all states are final, or\n  // until the gold path falls off.\n  bool continue_until_all_final;\n\n  // Whether to skip to a new sentence after each training step.\n  bool always_start_new_sentences;\n\n  // Parameter for deciding which tokens to score.\n  string scoring_type;\n};\n\n// Encapsulates the environment needed to 
parse with a beam, keeping a\n// record of path histories.\nclass BeamState {\n public:\n  // The agenda is keyed by a tuple that is the score followed by an\n  // int that is -1 if the path coincides with the gold path and 0\n  // otherwise. The lexicographic ordering of the keys therefore\n  // ensures that for all paths sharing the same score, the gold path\n  // will always be at the bottom. This situation can occur at the\n  // onset of training when all weights are zero and therefore all\n  // paths have an identically zero score.\n  typedef std::pair<double, int> KeyType;\n  typedef std::multimap<KeyType, std::unique_ptr<ParserStateWithHistory>>\n      AgendaType;\n  typedef std::pair<const KeyType, std::unique_ptr<ParserStateWithHistory>>\n      AgendaItem;\n  typedef Eigen::Tensor<float, 2, Eigen::RowMajor, Eigen::DenseIndex>\n      ScoreMatrixType;\n\n  // The beam can be\n  //   - ALIVE: parsing is still active, features are being output for at least\n  //     some slots in the beam.\n  //   - DYING: features should be output for this beam only one more time, then\n  //     the beam will be DEAD. 
This state is reached when the gold path falls\n  //     out of the beam and features have to be output one last time.\n  //   - DEAD: parsing is not active, features are not being output and no\n  //     actions are taken on the states.\n  enum State { ALIVE = 0, DYING = 1, DEAD = 2 };\n\n  explicit BeamState(const BatchStateOptions &options) : options_(options) {}\n\n  void Reset() {\n    if (options_.always_start_new_sentences ||\n        gold_ == nullptr || transition_system_->IsFinalState(*gold_)) {\n      AdvanceSentence();\n    }\n    slots_.clear();\n    if (gold_ == nullptr) {\n      state_ = DEAD;  // EOF has been reached.\n    } else {\n      gold_->set_is_gold(true);\n      slots_.emplace(KeyType(0.0, -1), std::unique_ptr<ParserStateWithHistory>(\n          new ParserStateWithHistory(*gold_)));\n      state_ = ALIVE;\n    }\n  }\n\n  void UpdateAllFinal() {\n    all_final_ = true;\n    for (const AgendaItem &item : slots_) {\n      if (!transition_system_->IsFinalState(*item.second->state)) {\n        all_final_ = false;\n        break;\n      }\n    }\n    if (all_final_) {\n      state_ = DEAD;\n    }\n  }\n\n  // This method updates the beam. For all elements of the beam, all\n  // allowed transitions are scored and inserted into a new beam. The\n  // beam size is capped by discarding the lowest scoring slots at any\n  // given time. There is one exception to this process: the gold path\n  // is forced to remain in the beam at all times, even if it scores\n  // low. 
This is to ensure that the gold path can be used for\n  // training at the moment it would otherwise fall off (and be absent\n  // from) the beam.\n  void Advance(const ScoreMatrixType &scores) {\n    // If the beam was in the state of DYING, it is now DEAD.\n    if (state_ == DYING) state_ = DEAD;\n\n    // When to stop advancing depends on the 'continue_until_all_final' arg.\n    if (!IsAlive() || gold_ == nullptr) return;\n\n    AdvanceGold();\n\n    const int score_rows = scores.dimension(0);\n    const int num_actions = scores.dimension(1);\n\n    // Advance beam.\n    AgendaType previous_slots;\n    previous_slots.swap(slots_);\n\n    CHECK_EQ(state_, ALIVE);\n\n    int slot = 0;\n    for (AgendaItem &item : previous_slots) {\n      {\n        ParserState *current = item.second->state.get();\n        VLOG(2) << \"Slot: \" << slot;\n        VLOG(2) << \"Parser state: \" << current->ToString();\n        VLOG(2) << \"Parser state cumulative score: \" << item.first.first << \" \"\n                << (item.first.second < 0 ? 
\"golden\" : \"\");\n      }\n      if (!transition_system_->IsFinalState(*item.second->state)) {\n        // Not a final state.\n        for (int action = 0; action < num_actions; ++action) {\n          // Is action allowed?\n          if (!transition_system_->IsAllowedAction(action,\n                                                   *item.second->state)) {\n            continue;\n          }\n          CHECK_LT(slot, score_rows);\n          MaybeInsertWithNewAction(item, slot, scores(slot, action), action);\n          PruneBeam();\n        }\n      } else {\n        // Final state: no need to advance.\n        MaybeInsert(&item);\n        PruneBeam();\n      }\n      ++slot;\n    }\n    UpdateAllFinal();\n  }\n\n  void PopulateFeatureOutputs(\n      std::vector<std::vector<std::vector<SparseFeatures>>> *features) {\n    for (const AgendaItem &item : slots_) {\n      VLOG(2) << \"State: \" << item.second->state->ToString();\n      std::vector<std::vector<SparseFeatures>> f =\n          features_->ExtractSparseFeatures(*workspace_, *item.second->state);\n      for (size_t i = 0; i < f.size(); ++i) (*features)[i].push_back(f[i]);\n    }\n  }\n\n  int BeamSize() const { return slots_.size(); }\n\n  bool IsAlive() const { return state_ == ALIVE; }\n\n  bool IsDead() const { return state_ == DEAD; }\n\n  bool AllFinal() const { return all_final_; }\n\n  // The current contents of the beam.\n  AgendaType slots_;\n\n  // Which batch this refers to.\n  int beam_id_ = 0;\n\n  // Sentence batch reader.\n  SentenceBatch *sentence_batch_ = nullptr;\n\n  // Label map.\n  const TermFrequencyMap *label_map_ = nullptr;\n\n  // Transition system.\n  const ParserTransitionSystem *transition_system_ = nullptr;\n\n  // Feature extractor.\n  const ParserEmbeddingFeatureExtractor *features_ = nullptr;\n\n  // Feature workspace set.\n  WorkspaceSet *workspace_ = nullptr;\n\n  // Internal workspace registry for use in feature extraction.\n  WorkspaceRegistry *workspace_registry_ = 
nullptr;\n\n  // ParserState used to get gold actions.\n  std::unique_ptr<ParserState> gold_;\n\n private:\n  // Creates a new ParserState if there's another sentence to be read.\n  void AdvanceSentence() {\n    gold_.reset();\n    if (sentence_batch_->AdvanceSentence(beam_id_)) {\n      gold_.reset(new ParserState(sentence_batch_->sentence(beam_id_),\n                                  transition_system_->NewTransitionState(true),\n                                  label_map_));\n      workspace_->Reset(*workspace_registry_);\n      features_->Preprocess(workspace_, gold_.get());\n    }\n  }\n\n  void AdvanceGold() {\n    gold_action_ = -1;\n    if (!transition_system_->IsFinalState(*gold_)) {\n      gold_action_ = transition_system_->GetNextGoldAction(*gold_);\n      if (transition_system_->IsAllowedAction(gold_action_, *gold_)) {\n        // In cases where the gold annotation is incompatible with the\n        // transition system, the action returned as gold might not be allowed.\n        transition_system_->PerformAction(gold_action_, gold_.get());\n      }\n    }\n  }\n\n  // Removes the first non-gold beam element if the beam is larger than\n  // the maximum beam size. 
If the gold element was at the bottom of the\n  // beam, sets the beam state to DYING, otherwise leaves the state alone.\n  void PruneBeam() {\n    if (static_cast<int>(slots_.size()) > options_.max_beam_size) {\n      auto bottom = slots_.begin();\n      if (!options_.continue_until_all_final &&\n          bottom->second->state->is_gold()) {\n        state_ = DYING;\n        ++bottom;\n      }\n      slots_.erase(bottom);\n    }\n  }\n\n  // Inserts an item in the beam if\n  //   - the item is gold,\n  //   - the beam is not full, or\n  //   - the item's new score is greater than the lowest score in the beam after\n  //     the score has been incremented by given delta_score.\n  // Inserted items have slot, delta_score and action appended to their history.\n  void MaybeInsertWithNewAction(const AgendaItem &item, const int slot,\n                                const double delta_score, const int action) {\n    const double score = item.first.first + delta_score;\n    const bool is_gold =\n        item.second->state->is_gold() && action == gold_action_;\n    if (is_gold || static_cast<int>(slots_.size()) < options_.max_beam_size ||\n        score > slots_.begin()->first.first) {\n      const KeyType key{score, -static_cast<int>(is_gold)};\n      slots_.emplace(key, std::unique_ptr<ParserStateWithHistory>(\n                              new ParserStateWithHistory(\n                                  *item.second, *transition_system_, slot,\n                                  action, delta_score)))\n          ->second->state->set_is_gold(is_gold);\n    }\n  }\n\n  // Inserts an item in the beam if\n  //   - the item is gold,\n  //   - the beam is not full, or\n  //   - the item's new score is greater than the lowest score in the beam.\n  // The history of inserted items is left untouched.\n  void MaybeInsert(AgendaItem *item) {\n    const bool is_gold = item->second->state->is_gold();\n    const double score = item->first.first;\n    if (is_gold || 
static_cast<int>(slots_.size()) < options_.max_beam_size ||\n        score > slots_.begin()->first.first) {\n      slots_.emplace(item->first, std::move(item->second));\n    }\n  }\n\n  // Limits the number of slots on the beam.\n  const BatchStateOptions &options_;\n\n  int gold_action_ = -1;\n  State state_ = ALIVE;\n  bool all_final_ = false;\n  TF_DISALLOW_COPY_AND_ASSIGN(BeamState);\n};\n\n// Encapsulates the state of a batch of beams. It is an object of this\n// type that will persist through repeated Op evaluations as the\n// multiple steps are computed in sequence.\nclass BatchState {\n public:\n  explicit BatchState(const BatchStateOptions &options)\n      : options_(options), features_(options.arg_prefix) {}\n\n  ~BatchState() { SharedStore::Release(label_map_); }\n\n  void Init(TaskContext *task_context) {\n    // Create sentence batch.\n    sentence_batch_.reset(\n        new SentenceBatch(BatchSize(), options_.corpus_name));\n    sentence_batch_->Init(task_context);\n\n    // Create transition system.\n    transition_system_.reset(ParserTransitionSystem::Create(task_context->Get(\n        tensorflow::strings::StrCat(options_.arg_prefix, \"_transition_system\"),\n        \"arc-standard\")));\n    transition_system_->Setup(task_context);\n    transition_system_->Init(task_context);\n\n    // Create label map.\n    string label_map_path =\n        TaskContext::InputFile(*task_context->GetInput(\"label-map\"));\n    label_map_ = SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(\n        label_map_path, 0, 0);\n\n    // Setup features.\n    features_.Setup(task_context);\n    features_.Init(task_context);\n    features_.RequestWorkspaces(&workspace_registry_);\n\n    // Create workspaces.\n    workspaces_.resize(BatchSize());\n\n    // Create beams.\n    beams_.clear();\n    for (int beam_id = 0; beam_id < BatchSize(); ++beam_id) {\n      beams_.emplace_back(options_);\n      beams_[beam_id].beam_id_ = beam_id;\n      beams_[beam_id].sentence_batch_ = 
sentence_batch_.get();\n      beams_[beam_id].transition_system_ = transition_system_.get();\n      beams_[beam_id].label_map_ = label_map_;\n      beams_[beam_id].features_ = &features_;\n      beams_[beam_id].workspace_ = &workspaces_[beam_id];\n      beams_[beam_id].workspace_registry_ = &workspace_registry_;\n    }\n  }\n\n  void ResetBeams() {\n    for (BeamState &beam : beams_) {\n      beam.Reset();\n    }\n\n    // Rewind if no states remain in the batch (we need to rewind the corpus).\n    if (sentence_batch_->size() == 0) {\n      ++epoch_;\n      VLOG(2) << \"Starting epoch \" << epoch_;\n      sentence_batch_->Rewind();\n    }\n  }\n\n  // Resets the offset vectors required for a single run because we're\n  // starting a new matrix of scores.\n  void ResetOffsets() {\n    beam_offsets_.clear();\n    step_offsets_ = {0};\n    UpdateOffsets();\n  }\n\n  void AdvanceBeam(const int beam_id,\n                   const TTypes<float>::ConstMatrix &scores) {\n    const int offset = beam_offsets_.back()[beam_id];\n    Eigen::array<Eigen::DenseIndex, 2> offsets = {offset, 0};\n    Eigen::array<Eigen::DenseIndex, 2> extents = {\n        beam_offsets_.back()[beam_id + 1] - offset, NumActions()};\n    BeamState::ScoreMatrixType beam_scores = scores.slice(offsets, extents);\n    beams_[beam_id].Advance(beam_scores);\n  }\n\n  void UpdateOffsets() {\n    beam_offsets_.emplace_back(BatchSize() + 1, 0);\n    std::vector<int> &offsets = beam_offsets_.back();\n    for (int beam_id = 0; beam_id < BatchSize(); ++beam_id) {\n      // If the beam is ALIVE or DYING (but not DEAD), we want to\n      // output the activations.\n      const BeamState &beam = beams_[beam_id];\n      const int beam_size = beam.IsDead() ? 
0 : beam.BeamSize();\n      offsets[beam_id + 1] = offsets[beam_id] + beam_size;\n    }\n    const int output_size = offsets.back();\n    step_offsets_.push_back(step_offsets_.back() + output_size);\n  }\n\n  tensorflow::Status PopulateFeatureOutputs(OpKernelContext *context) {\n    const int feature_size = FeatureSize();\n    std::vector<std::vector<std::vector<SparseFeatures>>> features(\n        feature_size);\n    for (int beam_id = 0; beam_id < BatchSize(); ++beam_id) {\n      if (!beams_[beam_id].IsDead()) {\n        beams_[beam_id].PopulateFeatureOutputs(&features);\n      }\n    }\n    CHECK_EQ(features.size(), feature_size);\n    Tensor *output;\n    const int total_slots = beam_offsets_.back().back();\n    for (int i = 0; i < feature_size; ++i) {\n      std::vector<std::vector<SparseFeatures>> &f = features[i];\n      CHECK_EQ(total_slots, f.size());\n      if (total_slots == 0) {\n        TF_RETURN_IF_ERROR(\n            context->allocate_output(i, TensorShape({0, 0}), &output));\n      } else {\n        const int size = f[0].size();\n        TF_RETURN_IF_ERROR(context->allocate_output(\n            i, TensorShape({total_slots, size}), &output));\n        for (int j = 0; j < total_slots; ++j) {\n          CHECK_EQ(size, f[j].size());\n          for (int k = 0; k < size; ++k) {\n            if (!options_.allow_feature_weights && f[j][k].weight_size() > 0) {\n              return FailedPrecondition(\n                  \"Feature weights are not allowed when allow_feature_weights \"\n                  \"is set to false.\");\n            }\n            output->matrix<string>()(j, k) = f[j][k].SerializeAsString();\n          }\n        }\n      }\n    }\n    return tensorflow::Status::OK();\n  }\n\n  // Returns the offset (i.e. 
row number) of a particular beam at a\n  // particular step in the final concatenated score matrix.\n  int GetOffset(const int step, const int beam_id) const {\n    return step_offsets_[step] + beam_offsets_[step][beam_id];\n  }\n\n  int FeatureSize() const { return features_.embedding_dims().size(); }\n\n  int NumActions() const {\n    return transition_system_->NumActions(label_map_->Size());\n  }\n\n  int BatchSize() const { return options_.batch_size; }\n\n  const BeamState &Beam(const int i) const { return beams_[i]; }\n\n  int Epoch() const { return epoch_; }\n\n  const string &ScoringType() const { return options_.scoring_type; }\n\n private:\n  const BatchStateOptions options_;\n\n  // How many times the document source has been rewound.\n  int epoch_ = 0;\n\n  // Batch of sentences, and the corresponding parser states.\n  std::unique_ptr<SentenceBatch> sentence_batch_;\n\n  // Transition system.\n  std::unique_ptr<ParserTransitionSystem> transition_system_;\n\n  // Label map for the transition system.\n  const TermFrequencyMap *label_map_;\n\n  // Typed feature extractor for embeddings.\n  ParserEmbeddingFeatureExtractor features_;\n\n  // Batch: WorkspaceSet objects.\n  std::vector<WorkspaceSet> workspaces_;\n\n  // Internal workspace registry for use in feature extraction.\n  WorkspaceRegistry workspace_registry_;\n\n  std::deque<BeamState> beams_;\n  std::vector<std::vector<int>> beam_offsets_;\n\n  // Keeps track of the slot offset of each step.\n  std::vector<int> step_offsets_;\n  TF_DISALLOW_COPY_AND_ASSIGN(BatchState);\n};\n\n// Creates a BeamState and hooks it up with a parser. 
This Op needs to\n// remain alive for the duration of the parse.\nclass BeamParseReader : public OpKernel {\n public:\n  explicit BeamParseReader(OpKernelConstruction *context) : OpKernel(context) {\n    string file_path;\n    int feature_size;\n    BatchStateOptions options;\n    OP_REQUIRES_OK(context, context->GetAttr(\"task_context\", &file_path));\n    OP_REQUIRES_OK(context, context->GetAttr(\"feature_size\", &feature_size));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"beam_size\", &options.max_beam_size));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"batch_size\", &options.batch_size));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"arg_prefix\", &options.arg_prefix));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"corpus_name\", &options.corpus_name));\n    OP_REQUIRES_OK(context, context->GetAttr(\"allow_feature_weights\",\n                                             &options.allow_feature_weights));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"continue_until_all_final\",\n                                    &options.continue_until_all_final));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"always_start_new_sentences\",\n                                    &options.always_start_new_sentences));\n\n    // Reads task context from file.\n    string data;\n    OP_REQUIRES_OK(context, ReadFileToString(tensorflow::Env::Default(),\n                                             file_path, &data));\n    TaskContext task_context;\n    OP_REQUIRES(context,\n                TextFormat::ParseFromString(data, task_context.mutable_spec()),\n                InvalidArgument(\"Could not parse task context at \", file_path));\n    OP_REQUIRES(\n        context, options.batch_size > 0,\n        InvalidArgument(\"Batch size \", options.batch_size, \" too small.\"));\n    options.scoring_type = task_context.Get(\n        
tensorflow::strings::StrCat(options.arg_prefix, \"_scoring\"), \"\");\n\n    // Create batch state.\n    batch_state_.reset(new BatchState(options));\n    batch_state_->Init(&task_context);\n\n    // Check number of feature groups matches the task context.\n    const int required_size = batch_state_->FeatureSize();\n    OP_REQUIRES(\n        context, feature_size == required_size,\n        InvalidArgument(\"Task context requires feature_size=\", required_size));\n\n    // Set expected signature.\n    std::vector<DataType> output_types(feature_size, DT_STRING);\n    output_types.push_back(DT_INT64);\n    output_types.push_back(DT_INT32);\n    OP_REQUIRES_OK(context, context->MatchSignature({}, output_types));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    mutex_lock lock(mu_);\n\n    // Write features.\n    batch_state_->ResetBeams();\n    batch_state_->ResetOffsets();\n    batch_state_->PopulateFeatureOutputs(context);\n\n    // Forward the beam state vector.\n    Tensor *output;\n    const int feature_size = batch_state_->FeatureSize();\n    OP_REQUIRES_OK(context, context->allocate_output(feature_size,\n                                                     TensorShape({}), &output));\n    output->scalar<int64>()() = reinterpret_cast<int64>(batch_state_.get());\n\n    // Output number of epochs.\n    OP_REQUIRES_OK(context, context->allocate_output(feature_size + 1,\n                                                     TensorShape({}), &output));\n    output->scalar<int32>()() = batch_state_->Epoch();\n  }\n\n private:\n  // mutex to synchronize access to Compute.\n  mutex mu_;\n\n  // The object whose handle will be passed among the Ops.\n  std::unique_ptr<BatchState> batch_state_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(BeamParseReader);\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"BeamParseReader\").Device(DEVICE_CPU),\n                        BeamParseReader);\n\n// Updates the beam based on incoming scores and outputs new feature vectors\n// based on the 
updated beam.\nclass BeamParser : public OpKernel {\n public:\n  explicit BeamParser(OpKernelConstruction *context) : OpKernel(context) {\n    int feature_size;\n    OP_REQUIRES_OK(context, context->GetAttr(\"feature_size\", &feature_size));\n\n    // Set expected signature.\n    std::vector<DataType> output_types(feature_size, DT_STRING);\n    output_types.push_back(DT_INT64);\n    output_types.push_back(DT_BOOL);\n    OP_REQUIRES_OK(context,\n                   context->MatchSignature({DT_INT64, DT_FLOAT}, output_types));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    BatchState *batch_state =\n        reinterpret_cast<BatchState *>(context->input(0).scalar<int64>()());\n\n    const TTypes<float>::ConstMatrix scores = context->input(1).matrix<float>();\n    VLOG(2) << \"Scores: \" << scores;\n    CHECK_EQ(scores.dimension(1), batch_state->NumActions());\n\n    // In AdvanceBeam we use beam_offsets_[beam_id] to determine the slice of\n    // scores that should be used for advancing, but beam_offsets_[beam_id] only\n    // exists for beams that have a sentence loaded.\n    const int batch_size = batch_state->BatchSize();\n    for (int beam_id = 0; beam_id < batch_size; ++beam_id) {\n      batch_state->AdvanceBeam(beam_id, scores);\n    }\n    batch_state->UpdateOffsets();\n\n    // Forward the beam state unmodified.\n    Tensor *output;\n    const int feature_size = batch_state->FeatureSize();\n    OP_REQUIRES_OK(context, context->allocate_output(feature_size,\n                                                     TensorShape({}), &output));\n    output->scalar<int64>()() = context->input(0).scalar<int64>()();\n\n    // Output the new features of all the slots in all the beams.\n    OP_REQUIRES_OK(context, batch_state->PopulateFeatureOutputs(context));\n\n    // Output whether the beams are alive.\n    OP_REQUIRES_OK(\n        context, context->allocate_output(feature_size + 1,\n                                          TensorShape({batch_size}), 
&output));\n    for (int beam_id = 0; beam_id < batch_size; ++beam_id) {\n      output->vec<bool>()(beam_id) = batch_state->Beam(beam_id).IsAlive();\n    }\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(BeamParser);\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"BeamParser\").Device(DEVICE_CPU), BeamParser);\n\n// Extracts the paths for the elements of the current beams and returns\n// indices into a scoring matrix that is assumed to have been\n// constructed along with the beam search.\nclass BeamParserOutput : public OpKernel {\n public:\n  explicit BeamParserOutput(OpKernelConstruction *context) : OpKernel(context) {\n    // Set expected signature.\n    OP_REQUIRES_OK(context,\n                   context->MatchSignature(\n                       {DT_INT64}, {DT_INT32, DT_INT32, DT_INT32, DT_FLOAT}));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    BatchState *batch_state =\n        reinterpret_cast<BatchState *>(context->input(0).scalar<int64>()());\n\n    const int num_actions = batch_state->NumActions();\n    const int batch_size = batch_state->BatchSize();\n\n    // Vectors for output.\n    //\n    // Each step of each batch:path gets its index computed and a\n    // unique path id assigned.\n    std::vector<int32> indices;\n    std::vector<int32> path_ids;\n\n    // Each unique path gets a batch id and a slot (in the beam)\n    // id. These are in effect the row and column of the final\n    // 'logits' matrix going to CrossEntropy.\n    std::vector<int32> beam_ids;\n    std::vector<int32> slot_ids;\n\n    // To compute the cross entropy we also need the slot id of the\n    // gold path, one per batch.\n    std::vector<int32> gold_slot(batch_size, -1);\n\n    // For good measure we also output the path scores as computed by\n    // the beam decoder, so it can be compared in tests with the path\n    // scores computed via the indices in TF. 
This has the same length\n    // as beam_ids and slot_ids.\n    std::vector<float> path_scores;\n\n    // The scores tensor has, conceptually, four dimensions: 1. number\n    // of steps, 2. batch size, 3. number of paths on the beam at that\n    // step, and 4. the number of actions scored. However, this is not\n    // a true tensor, since the beam size at each step may differ\n    // across steps and across batches. Only the batch size and the\n    // number of actions are fixed.\n    int path_id = 0;\n    for (int beam_id = 0; beam_id < batch_size; ++beam_id) {\n      // This occurs at the end of the corpus, when there aren't enough\n      // sentences to fill the batch.\n      if (batch_state->Beam(beam_id).gold_ == nullptr) continue;\n\n      // Populate the vectors that will index into the concatenated\n      // scores tensor.\n      int slot = 0;\n      for (const auto &item : batch_state->Beam(beam_id).slots_) {\n        beam_ids.push_back(beam_id);\n        slot_ids.push_back(slot);\n        path_scores.push_back(item.first.first);\n        VLOG(2) << \"PATH SCORE @ beam_id:\" << beam_id << \" slot:\" << slot\n                << \" : \" << item.first.first << \" \" << item.first.second;\n        VLOG(2) << \"SLOT HISTORY: \"\n                << utils::Join(item.second->slot_history, \" \");\n        VLOG(2) << \"SCORE HISTORY: \"\n                << utils::Join(item.second->score_history, \" \");\n        VLOG(2) << \"ACTION HISTORY: \"\n                << utils::Join(item.second->action_history, \" \");\n\n        // Record where the gold path ended up.\n        if (item.second->state->is_gold()) {\n          CHECK_EQ(gold_slot[beam_id], -1);\n          gold_slot[beam_id] = slot;\n        }\n\n        for (size_t step = 0; step < item.second->slot_history.size(); ++step) {\n          const int step_beam_offset = batch_state->GetOffset(step, beam_id);\n          const int slot_index = item.second->slot_history[step];\n          const int 
action_index = item.second->action_history[step];\n          indices.push_back(num_actions * (step_beam_offset + slot_index) +\n                            action_index);\n          path_ids.push_back(path_id);\n        }\n        ++slot;\n        ++path_id;\n      }\n\n      // Exactly one path must be the gold one.\n      CHECK_GE(gold_slot[beam_id], 0);\n    }\n\n    const int num_ix_elements = indices.size();\n    Tensor *output;\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                0, TensorShape({2, num_ix_elements}), &output));\n    auto indices_and_path_ids = output->matrix<int32>();\n    for (size_t i = 0; i < indices.size(); ++i) {\n      indices_and_path_ids(0, i) = indices[i];\n      indices_and_path_ids(1, i) = path_ids[i];\n    }\n\n    const int num_path_elements = beam_ids.size();\n    OP_REQUIRES_OK(context,\n                   context->allocate_output(\n                       1, TensorShape({2, num_path_elements}), &output));\n    auto beam_and_slot_ids = output->matrix<int32>();\n    for (size_t i = 0; i < beam_ids.size(); ++i) {\n      beam_and_slot_ids(0, i) = beam_ids[i];\n      beam_and_slot_ids(1, i) = slot_ids[i];\n    }\n\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                2, TensorShape({batch_size}), &output));\n    std::copy(gold_slot.begin(), gold_slot.end(), output->vec<int32>().data());\n\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                3, TensorShape({num_path_elements}), &output));\n    std::copy(path_scores.begin(), path_scores.end(),\n              output->vec<float>().data());\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(BeamParserOutput);\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"BeamParserOutput\").Device(DEVICE_CPU),\n                        BeamParserOutput);\n\n// Computes eval metrics for the best path in the input beams.\nclass BeamEvalOutput : public OpKernel {\n public:\n  explicit 
BeamEvalOutput(OpKernelConstruction *context) : OpKernel(context) {\n    // Set expected signature.\n    OP_REQUIRES_OK(context,\n                   context->MatchSignature({DT_INT64}, {DT_INT32, DT_STRING}));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    int num_tokens = 0;\n    int num_correct = 0;\n    int all_final = 0;\n    BatchState *batch_state =\n        reinterpret_cast<BatchState *>(context->input(0).scalar<int64>()());\n    const int batch_size = batch_state->BatchSize();\n    vector<Sentence> documents;\n    for (int beam_id = 0; beam_id < batch_size; ++beam_id) {\n      if (batch_state->Beam(beam_id).gold_ != nullptr &&\n          batch_state->Beam(beam_id).AllFinal()) {\n        ++all_final;\n        const auto &item = *batch_state->Beam(beam_id).slots_.rbegin();\n        ComputeTokenAccuracy(*item.second->state, batch_state->ScoringType(),\n                             &num_tokens, &num_correct);\n        documents.push_back(item.second->state->sentence());\n        item.second->state->AddParseToDocument(&documents.back());\n      }\n    }\n    Tensor *output;\n    OP_REQUIRES_OK(context,\n                   context->allocate_output(0, TensorShape({2}), &output));\n    auto eval_metrics = output->vec<int32>();\n    eval_metrics(0) = num_tokens;\n    eval_metrics(1) = num_correct;\n\n    const int output_size = documents.size();\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                1, TensorShape({output_size}), &output));\n    for (int i = 0; i < output_size; ++i) {\n      output->vec<string>()(i) = documents[i].SerializeAsString();\n    }\n  }\n\n private:\n  // Tallies the # of correct and incorrect tokens for a given ParserState.\n  void ComputeTokenAccuracy(const ParserState &state,\n                            const string &scoring_type,\n                            int *num_tokens, int *num_correct) {\n    for (int i = 0; i < state.sentence().token_size(); ++i) {\n      const Token &token = 
state.GetToken(i);\n      if (utils::PunctuationUtil::ScoreToken(token.word(), token.tag(),\n                                             scoring_type)) {\n        ++*num_tokens;\n        if (state.IsTokenCorrect(i)) ++*num_correct;\n      }\n    }\n  }\n\n  TF_DISALLOW_COPY_AND_ASSIGN(BeamEvalOutput);\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"BeamEvalOutput\").Device(DEVICE_CPU),\n                        BeamEvalOutput);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/beam_reader_ops_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for beam_reader_ops.\"\"\"\n\n\nimport os.path\nimport time\nimport tensorflow as tf\n\nfrom tensorflow.python.framework import test_util\nfrom tensorflow.python.platform import googletest\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom syntaxnet import structured_graph_builder\nfrom syntaxnet.ops import gen_parser_ops\n\nFLAGS = tf.app.flags.FLAGS\nif not hasattr(FLAGS, 'test_srcdir'):\n  FLAGS.test_srcdir = ''\nif not hasattr(FLAGS, 'test_tmpdir'):\n  FLAGS.test_tmpdir = tf.test.get_temp_dir()\n\n\nclass ParsingReaderOpsTest(test_util.TensorFlowTestCase):\n\n  def setUp(self):\n    # Creates a task context with the correct testing paths.\n    initial_task_context = os.path.join(\n        FLAGS.test_srcdir,\n        'syntaxnet/'\n        'testdata/context.pbtxt')\n    self._task_context = os.path.join(FLAGS.test_tmpdir, 'context.pbtxt')\n    with open(initial_task_context, 'r') as fin:\n      with open(self._task_context, 'w') as fout:\n        fout.write(fin.read().replace('SRCDIR', FLAGS.test_srcdir)\n                   .replace('OUTPATH', FLAGS.test_tmpdir))\n\n    # Creates necessary term maps.\n    with self.test_session() as sess:\n      gen_parser_ops.lexicon_builder(task_context=self._task_context,\n                         
            corpus_name='training-corpus').run()\n      self._num_features, self._num_feature_ids, _, self._num_actions = (\n          sess.run(gen_parser_ops.feature_size(task_context=self._task_context,\n                                               arg_prefix='brain_parser')))\n\n  def MakeGraph(self,\n                max_steps=10,\n                beam_size=2,\n                batch_size=1,\n                **kwargs):\n    \"\"\"Constructs a structured learning graph.\"\"\"\n    assert max_steps > 0, 'Empty network not supported.'\n\n    logging.info('MakeGraph + %s', kwargs)\n\n    with self.test_session(graph=tf.Graph()) as sess:\n      feature_sizes, domain_sizes, embedding_dims, num_actions = sess.run(\n          gen_parser_ops.feature_size(task_context=self._task_context))\n    embedding_dims = [8, 8, 8]\n    hidden_layer_sizes = []\n    learning_rate = 0.01\n    builder = structured_graph_builder.StructuredGraphBuilder(\n        num_actions,\n        feature_sizes,\n        domain_sizes,\n        embedding_dims,\n        hidden_layer_sizes,\n        seed=1,\n        max_steps=max_steps,\n        beam_size=beam_size,\n        gate_gradients=True,\n        use_locking=True,\n        use_averaging=False,\n        check_parameters=False,\n        **kwargs)\n    builder.AddTraining(self._task_context,\n                        batch_size,\n                        learning_rate=learning_rate,\n                        decay_steps=1000,\n                        momentum=0.9,\n                        corpus_name='training-corpus')\n    builder.AddEvaluation(self._task_context,\n                          batch_size,\n                          evaluation_max_steps=25,\n                          corpus_name=None)\n    builder.training['inits'] = tf.group(*builder.inits.values(), name='inits')\n    return builder\n\n  def Train(self, **kwargs):\n    with self.test_session(graph=tf.Graph()) as sess:\n      max_steps = 3\n      batch_size = 3\n      beam_size = 3\n      
builder = (\n          self.MakeGraph(\n              max_steps=max_steps, beam_size=beam_size,\n              batch_size=batch_size, **kwargs))\n      logging.info('params: %s', builder.params.keys())\n      logging.info('variables: %s', builder.variables.keys())\n\n      t = builder.training\n      sess.run(t['inits'])\n      costs = []\n      gold_slots = []\n      alive_steps_vector = []\n      every_n = 5\n      walltime = time.time()\n      for step in range(10):\n        if step > 0 and step % every_n == 0:\n          new_walltime = time.time()\n          logging.info(\n              'Step: %d <cost>: %f <gold_slot>: %f <alive_steps>: %f <iter '\n              'time>: %f ms',\n              step, sum(costs[-every_n:]) / float(every_n),\n              sum(gold_slots[-every_n:]) / float(every_n),\n              sum(alive_steps_vector[-every_n:]) / float(every_n),\n              1000 * (new_walltime - walltime) / float(every_n))\n          walltime = new_walltime\n\n        cost, gold_slot, alive_steps, _ = sess.run(\n            [t['cost'], t['gold_slot'], t['alive_steps'], t['train_op']])\n        costs.append(cost)\n        gold_slots.append(gold_slot.mean())\n        alive_steps_vector.append(alive_steps.mean())\n\n      if builder._only_train:\n        trainable_param_names = [\n            k for k in builder.params if k in builder._only_train]\n      else:\n        trainable_param_names = builder.params.keys()\n      if builder._use_averaging:\n        for v in trainable_param_names:\n          avg = builder.variables['%s_avg_var' % v].eval()\n          tf.assign(builder.params[v], avg).eval()\n\n      # Reset for pseudo eval.\n      costs = []\n      gold_slots = []\n      alive_stepss = []\n      for step in range(10):\n        cost, gold_slot, alive_steps = sess.run(\n            [t['cost'], t['gold_slot'], t['alive_steps']])\n        costs.append(cost)\n        gold_slots.append(gold_slot.mean())\n        alive_stepss.append(alive_steps.mean())\n\n    
  logging.info(\n          'Pseudo eval: <cost>: %f <gold_slot>: %f <alive_steps>: %f',\n          sum(costs[-every_n:]) / float(every_n),\n          sum(gold_slots[-every_n:]) / float(every_n),\n          sum(alive_stepss[-every_n:]) / float(every_n))\n\n  def PathScores(self, iterations, beam_size, max_steps, batch_size):\n    with self.test_session(graph=tf.Graph()) as sess:\n      t = self.MakeGraph(beam_size=beam_size, max_steps=max_steps,\n                         batch_size=batch_size).training\n      sess.run(t['inits'])\n      all_path_scores = []\n      beam_path_scores = []\n      for i in range(iterations):\n        logging.info('run %d', i)\n        tensors = (\n            sess.run(\n                [t['alive_steps'], t['concat_scores'],\n                 t['all_path_scores'], t['beam_path_scores'],\n                 t['indices'], t['path_ids']]))\n\n        logging.info('alive for %s, all_path_scores and beam_path_scores, '\n                     'indices and path_ids:'\n                     '\\n%s\\n%s\\n%s\\n%s',\n                     tensors[0], tensors[2], tensors[3], tensors[4], tensors[5])\n        logging.info('diff:\\n%s', tensors[2] - tensors[3])\n\n        all_path_scores.append(tensors[2])\n        beam_path_scores.append(tensors[3])\n      return all_path_scores, beam_path_scores\n\n  def testParseUntilNotAlive(self):\n    \"\"\"Ensures that the 'alive' condition works in the Cond ops.\"\"\"\n    with self.test_session(graph=tf.Graph()) as sess:\n      t = self.MakeGraph(batch_size=3, beam_size=2, max_steps=5).training\n      sess.run(t['inits'])\n      for i in range(5):\n        logging.info('run %d', i)\n        tf_alive = t['alive'].eval()\n        self.assertFalse(any(tf_alive))\n\n  def testParseMomentum(self):\n    \"\"\"Ensures that Momentum training can be done using the gradients.\"\"\"\n    self.Train()\n    self.Train(model_cost='perceptron_loss')\n    self.Train(model_cost='perceptron_loss',\n               
only_train='softmax_weight,softmax_bias', softmax_init=0)\n    self.Train(only_train='softmax_weight,softmax_bias', softmax_init=0)\n\n  def testPathScoresAgree(self):\n    \"\"\"Ensures that path scores computed in the beam are same in the net.\"\"\"\n    all_path_scores, beam_path_scores = self.PathScores(\n        iterations=1, beam_size=130, max_steps=5, batch_size=1)\n    self.assertArrayNear(all_path_scores[0], beam_path_scores[0], 1e-6)\n\n  def testBatchPathScoresAgree(self):\n    \"\"\"Ensures that path scores computed in the beam are same in the net.\"\"\"\n    all_path_scores, beam_path_scores = self.PathScores(\n        iterations=1, beam_size=130, max_steps=5, batch_size=22)\n    self.assertArrayNear(all_path_scores[0], beam_path_scores[0], 1e-6)\n\n  def testBatchOneStepPathScoresAgree(self):\n    \"\"\"Ensures that path scores computed in the beam are same in the net.\"\"\"\n    all_path_scores, beam_path_scores = self.PathScores(\n        iterations=1, beam_size=130, max_steps=1, batch_size=22)\n    self.assertArrayNear(all_path_scores[0], beam_path_scores[0], 1e-6)\n\n\nif __name__ == '__main__':\n  googletest.main()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/binary_segment_state.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/binary_segment_state.h\"\n\n#include <string>\n#include \"syntaxnet/segmenter_utils.h\"\n#include \"syntaxnet/sentence.pb.h\"\n\nnamespace syntaxnet {\n\nParserTransitionState *BinarySegmentState::Clone() const {\n  return new BinarySegmentState();\n}\n\nstring BinarySegmentState::ToString(const ParserState &state) const {\n  string str(\"[\");\n  for (int i = NumStarts(state) - 1; i >=0; --i) {\n    int start = LastStart(i, state);\n    int end = 0;\n    if (i - 1 >= 0) {\n      end = LastStart(i - 1, state) - 1;\n    } else if (state.EndOfInput()) {\n      end = state.sentence().token_size() - 1;\n    } else {\n      end = state.Next() - 1;\n    }\n    for (int k = start; k <= end; ++k) {\n      str.append(state.GetToken(k).word());\n    }\n    if (i >= 1) str.append(\" \");\n  }\n\n  str.append(\"] \");\n  for (int i = state.Next(); i < state.NumTokens(); ++i) {\n    str.append(state.GetToken(i).word());\n  }\n  return str;\n}\n\nvoid BinarySegmentState::AddParseToDocument(const ParserState &state,\n                                            bool rewrite_root_labels,\n                                            Sentence *sentence) const {\n  if (sentence->token_size() == 0) return;\n  vector<bool> is_starts(sentence->token_size(), false);\n  for (int i = 0; i < 
NumStarts(state); ++i) {\n    is_starts[LastStart(i, state)] = true;\n  }\n\n  // The break level of the current token is determined from its previous token.\n  Token::BreakLevel break_level = Token::NO_BREAK;\n  bool is_first_token = true;\n  Sentence new_sentence;\n  for (int i = 0; i < sentence->token_size(); ++i) {\n    const Token &token = sentence->token(i);\n    const string &word = token.word();\n    bool is_break = SegmenterUtils::IsBreakChar(word);\n    if (is_starts[i] || is_first_token) {\n      if (!is_break) {\n        // The current character is the first char of a new token/word.\n        Token *new_token = new_sentence.add_token();\n        new_token->set_start(token.start());\n        new_token->set_end(token.end());\n        new_token->set_word(word);\n\n        // For the first token, keep the old break level to make sure that the\n        // number of sentences stays unchanged.\n        new_token->set_break_level(break_level);\n        is_first_token = false;\n      }\n    } else {\n      // Append the character to the previous token.\n      if (!is_break) {\n        int index = new_sentence.token_size() - 1;\n        auto *last_token = new_sentence.mutable_token(index);\n        last_token->mutable_word()->append(word);\n        last_token->set_end(token.end());\n      }\n    }\n\n    // Update the break level. Note that we do not introduce new sentences in\n    // the transition system, so anything beyond a line break is reduced to a\n    // line break.\n    break_level = is_break ? SegmenterUtils::BreakLevel(word) : Token::NO_BREAK;\n    if (break_level >= Token::LINE_BREAK) break_level = Token::LINE_BREAK;\n  }\n  sentence->mutable_token()->Swap(new_sentence.mutable_token());\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/binary_segment_state.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_BINARY_SEGMENT_STATE_H_\n#define SYNTAXNET_BINARY_SEGMENT_STATE_H_\n\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n\nnamespace syntaxnet {\n\nclass Sentence;\n\n// Parser state for the binary segmentation transition system. The input of the\n// system is a sequence of utf8 characters that are to be segmented into tokens.\n// The system contains two types of transitions/actions:\n//  -START: the token at input is the first character of a new word.\n//  -MERGE: the token at input is to be merged with its previous token.\n//\n// A BinarySegmentState is used to store segmentation histories that can be used\n// as features. In addition, it also provides the functionality to add\n// segmentation results to the document. The function assumes that sentences in\n// a document are processed in left-to-right order. See also the comments of\n// the FinishDocument function for an explanation.\n//\n// Note on spaces:\n// Spaces, or more generally break-characters, should never be part of a\n// word, and START/MERGE actions on spaces are ignored. 
In addition, if a space\n// starts a new word, then the actual first char of that word is the first\n// non-space token following the space.\n// Some examples:\n//  -chars:  ' ' A B\n//  -tags:    S  M M\n//  -result: 'AB'\n//\n//  -chars:  A ' ' B\n//  -tags:   S  M  M\n//  -result: 'AB'\n//\n//  -chars:  A ' ' B\n//  -tags:   S  S  M\n//  -result: 'AB'\n//\n//  -chars:  A  B  ' '\n//  -tags:   S  S  M\n//  -result: 'A', 'B'\nclass BinarySegmentState : public ParserTransitionState {\n public:\n  ParserTransitionState *Clone() const override;\n  void Init(ParserState *state) override {}\n\n  // Returns the number of start tokens that have already been identified. In\n  // other words, number of start tokens between the first token of the sentence\n  // and state.Input(), with state.Input() excluded.\n  static int NumStarts(const ParserState &state) {\n    return state.StackSize();\n  }\n\n  // Returns the index of the k-th most recent start token.\n  static int LastStart(int k, const ParserState &state) {\n    DCHECK_GE(k, 0);\n    DCHECK_LT(k, NumStarts(state));\n    return state.Stack(k);\n  }\n\n  // Adds the token at given index as a new start token.\n  static void AddStart(int index, ParserState *state) {\n    state->Push(index);\n  }\n\n  // Adds segmentation results to the given sentence.\n  void AddParseToDocument(const ParserState &state,\n                          bool rewrite_root_labels,\n                          Sentence *sentence) const override;\n\n  // Whether a parsed token should be considered correct for evaluation.\n  bool IsTokenCorrect(const ParserState &state, int index) const override {\n    return true;\n  }\n\n  // Returns a human readable string representation of this state.\n  string ToString(const ParserState &state) const override;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_BINARY_SEGMENT_STATE_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/binary_segment_state_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/binary_segment_state.h\"\n\n#include <memory>\n\n#include \"syntaxnet/base.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\nclass BinarySegmentStateTest : public ::testing::Test {\n protected:\n  void SetUp() override {\n    // Prepare a sentence.\n    const char *str_sentence = \"text: '测试 的 句子' \"\n        \"token { word: '测' start: 0 end: 2 } \"\n        \"token { word: '试' start: 3 end: 5 } \"\n        \"token { word: ' ' start: 6 end: 6 } \"\n        \"token { word: '的' start: 7 end: 9 } \"\n        \"token { word: ' ' start: 10 end: 10 } \"\n        \"token { word: '句' start: 11 end: 13 } \"\n        \"token { word: '子' start: 14 end: 16 } \";\n    sentence_ = std::unique_ptr<Sentence>(new Sentence());\n    TextFormat::ParseFromString(str_sentence, sentence_.get());\n  }\n\n  // The test document, parse tree, and sentence.\n  std::unique_ptr<Sentence> sentence_;\n  TermFrequencyMap label_map_;\n};\n\nTEST_F(BinarySegmentStateTest, AddStartLastStartNumStartsTest) {\n  BinarySegmentState *segment_state = new BinarySegmentState();\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n\n  // Test segment_state initialized with zero 
starts.\n  EXPECT_EQ(0, segment_state->NumStarts(state));\n\n  // Adding the first token as a start token.\n  segment_state->AddStart(0, &state);\n  ASSERT_EQ(1, segment_state->NumStarts(state));\n  EXPECT_EQ(0, segment_state->LastStart(0, state));\n\n  // Adding more starts.\n  segment_state->AddStart(2, &state);\n  segment_state->AddStart(3, &state);\n  segment_state->AddStart(4, &state);\n  segment_state->AddStart(5, &state);\n  ASSERT_EQ(5, segment_state->NumStarts(state));\n  EXPECT_EQ(5, segment_state->LastStart(0, state));\n  EXPECT_EQ(4, segment_state->LastStart(1, state));\n  EXPECT_EQ(3, segment_state->LastStart(2, state));\n  EXPECT_EQ(2, segment_state->LastStart(3, state));\n  EXPECT_EQ(0, segment_state->LastStart(4, state));\n}\n\nTEST_F(BinarySegmentStateTest, AddParseToDocumentTest) {\n  BinarySegmentState *segment_state = new BinarySegmentState();\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n\n  // Test gold segmentation.\n  // 0   1   2    3   4   5   6\n  // 测  试  ' '  的  ' '  句  子\n  // S   M   S    S   S   S   M\n  segment_state->AddStart(0, &state);\n  segment_state->AddStart(2, &state);\n  segment_state->AddStart(3, &state);\n  segment_state->AddStart(4, &state);\n  segment_state->AddStart(5, &state);\n  Sentence sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n\n  // Test the number of tokens as well as the start/end byte-offsets of each\n  // token.\n  ASSERT_EQ(3, sentence_with_annotation.token_size());\n\n  // The first token is 测试.\n  EXPECT_EQ(0, sentence_with_annotation.token(0).start());\n  EXPECT_EQ(5, sentence_with_annotation.token(0).end());\n\n  // The second token is 的.\n  EXPECT_EQ(7, sentence_with_annotation.token(1).start());\n  EXPECT_EQ(9, sentence_with_annotation.token(1).end());\n\n  // The third token is 句子.\n  EXPECT_EQ(11, sentence_with_annotation.token(2).start());\n  EXPECT_EQ(16, sentence_with_annotation.token(2).end());\n\n  
// Test merging spaces into other tokens. Since spaces, or more generally break\n  // characters, should never be a part of any word, they are skipped no matter\n  // how they are tagged.\n  // 0   1   2    3   4   5   6\n  // 测  试  ' '  的  ' '  句  子\n  // S   M   M    S   M   M   M\n  while (!state.StackEmpty()) state.Pop();\n  segment_state->AddStart(0, &state);\n  segment_state->AddStart(3, &state);\n  sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n\n  ASSERT_EQ(2, sentence_with_annotation.token_size());\n\n  // The first token is 测试. Note that even if a space is tagged as \"merge\",\n  // it is not attached to its previous word.\n  EXPECT_EQ(0, sentence_with_annotation.token(0).start());\n  EXPECT_EQ(5, sentence_with_annotation.token(0).end());\n\n  // The second token is 的句子.\n  EXPECT_EQ(7, sentence_with_annotation.token(1).start());\n  EXPECT_EQ(16, sentence_with_annotation.token(1).end());\n\n  // Test merging a token into space tokens. 
In such a case, the current token\n  // is merged with the first non-space token on its left side.\n  // 0   1   2    3   4   5   6\n  // 测  试  ' '  的  ' '  句  子\n  // S   M   S    M   S   M   M\n  while (!state.StackEmpty()) state.Pop();\n  segment_state->AddStart(0, &state);\n  segment_state->AddStart(2, &state);\n  segment_state->AddStart(4, &state);\n  sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n  ASSERT_EQ(1, sentence_with_annotation.token_size());\n  EXPECT_EQ(0, sentence_with_annotation.token(0).start());\n  EXPECT_EQ(16, sentence_with_annotation.token(0).end());\n}\n\nTEST_F(BinarySegmentStateTest, SpaceDocumentTest) {\n  const char *str_sentence = \"text: ' \\t\\t' \"\n      \"token { word: ' ' start: 0 end: 0 } \"\n      \"token { word: '\\t' start: 1 end: 1 } \"\n      \"token { word: '\\t' start: 2 end: 2 } \";\n  TextFormat::ParseFromString(str_sentence, sentence_.get());\n  BinarySegmentState *segment_state = new BinarySegmentState();\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n\n  // Break-chars should always be skipped, no matter how they are tagged.\n  // 0    1     2\n  //' '   '\\t'  '\\t'\n  // M    M     M\n  Sentence sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n  ASSERT_EQ(0, sentence_with_annotation.token_size());\n\n  // 0    1     2\n  //' '   '\\t'  '\\t'\n  // S    S     S\n  segment_state->AddStart(0, &state);\n  segment_state->AddStart(1, &state);\n  segment_state->AddStart(2, &state);\n  sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n  ASSERT_EQ(0, sentence_with_annotation.token_size());\n}\n\nTEST_F(BinarySegmentStateTest, DocumentBeginWithSpaceTest) {\n  const char *str_sentence = \"text: ' 空格' \"\n      \"token { word: ' ' start: 0 end: 0 } \"\n      \"token { word: '空' start: 1 end: 3 } 
\"\n      \"token { word: '格' start: 4 end: 6 } \";\n  TextFormat::ParseFromString(str_sentence, sentence_.get());\n  BinarySegmentState *segment_state = new BinarySegmentState();\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n\n  // 0    1    2\n  //' '   空   格\n  // M    M    M\n  Sentence sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n\n  ASSERT_EQ(1, sentence_with_annotation.token_size());\n\n  // The first token is 空格.\n  EXPECT_EQ(1, sentence_with_annotation.token(0).start());\n  EXPECT_EQ(6, sentence_with_annotation.token(0).end());\n\n  // 0    1    2\n  //' '   空   格\n  // S    M    M\n  while (!state.StackEmpty()) state.Pop();\n  segment_state->AddStart(0, &state);\n  sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n\n  ASSERT_EQ(1, sentence_with_annotation.token_size());\n\n  // The first token is 空格.\n  EXPECT_EQ(1, sentence_with_annotation.token(0).start());\n  EXPECT_EQ(6, sentence_with_annotation.token(0).end());\n}\n\nTEST_F(BinarySegmentStateTest, EmptyDocumentTest) {\n  const char *str_sentence = \"text: '' \";\n  TextFormat::ParseFromString(str_sentence, sentence_.get());\n  BinarySegmentState *segment_state = new BinarySegmentState();\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n  Sentence sentence_with_annotation = *sentence_;\n  segment_state->AddParseToDocument(state, false, &sentence_with_annotation);\n  ASSERT_EQ(0, sentence_with_annotation.token_size());\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/binary_segment_transitions.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/binary_segment_state.h\"\n#include \"syntaxnet/parser_features.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n\nnamespace syntaxnet {\n\n// Given an input of utf8 characters, the BinarySegmentTransitionSystem\n// conducts word segmentation by performing one of the following two actions:\n//  -START: starts a new word with the token at state.input, and also advances\n//          the state.input.\n//  -MERGE: adds the token at state.input to its prevous word, and also advances\n//          state.input.\n//\n// Also see nlp/saft/components/segmentation/transition/binary-segment-state.h\n// for examples on handling spaces.\nclass BinarySegmentTransitionSystem : public ParserTransitionSystem {\n public:\n  BinarySegmentTransitionSystem() {}\n  ParserTransitionState *NewTransitionState(bool train_mode) const override {\n    return new BinarySegmentState();\n  }\n\n  // Action types for the segmentation-transition system.\n  enum ParserActionType {\n    START = 0,\n    MERGE = 1,\n    CARDINAL = 2\n  };\n\n  static int StartAction() { return 0; }\n  static int MergeAction() { return 1; }\n\n  // The system always starts a new word by default.\n  ParserAction GetDefaultAction(const 
ParserState &state) const override {\n    return START;\n  }\n\n  // Returns the number of action types.\n  int NumActionTypes() const override {\n    return CARDINAL;\n  }\n\n  // Returns the number of possible actions.\n  int NumActions(int num_labels) const override {\n    return CARDINAL;\n  }\n\n  // Returns the next gold action for a given state according to the underlying\n  // annotated sentence. The training data for the transition system is created\n  // by the binary-segmenter-data task. If a token's break_level is NO_BREAK,\n  // then it is a MERGE, and a START otherwise. The only exception is that the\n  // first token in a sentence for the transition system is always a START.\n  ParserAction GetNextGoldAction(const ParserState &state) const override {\n    if (state.Next() == 0) return StartAction();\n    const Token &token = state.GetToken(state.Next());\n    return (token.break_level() != Token::NO_BREAK ?\n        StartAction() : MergeAction());\n  }\n\n  // Both START and MERGE can be applied to any tokens in the sentence.\n  bool IsAllowedAction(\n      ParserAction action, const ParserState &state) const override {\n    return true;\n  }\n\n  // Performs the specified action on a given parser state, without adding the\n  // action to the state's history.\n  void PerformActionWithoutHistory(\n      ParserAction action, ParserState *state) const override {\n    // Note that when the action is less than 0, it is treated as a START.\n    if (action < 0 || action == StartAction()) {\n      MutableTransitionState(state)->AddStart(state->Next(), state);\n    }\n    state->Advance();\n  }\n\n  // Allows backoff to best allowable transition.\n  bool BackOffToBestAllowableTransition() const override { return true; }\n\n  // A state is a deterministic state iff no tokens have been consumed.\n  bool IsDeterministicState(const ParserState &state) const override {\n    return state.Next() == 0;\n  }\n\n  // For binary segmentation, a state is a final state iff all 
tokens have been\n  // consumed.\n  bool IsFinalState(const ParserState &state) const override {\n    return state.EndOfInput();\n  }\n\n  // Returns a string representation of a parser action.\n  string ActionAsString(\n      ParserAction action, const ParserState &state) const override {\n    return action == StartAction() ? \"START\" : \"MERGE\";\n  }\n\n  // Downcasts the TransitionState in ParserState to a BinarySegmentState.\n  static BinarySegmentState *MutableTransitionState(ParserState *state) {\n    return static_cast<BinarySegmentState *>(state->mutable_transition_state());\n  }\n};\n\nREGISTER_TRANSITION_SYSTEM(\"binary-segment-transitions\",\n                           BinarySegmentTransitionSystem);\n\n// Parser feature locator that returns the token in the sentence that is\n// argument() positions from the provided focus token.\nclass OffsetFeatureLocator : public ParserIndexLocator<OffsetFeatureLocator> {\n public:\n  // Update the current focus to a new location.  If the initial focus or new\n  // focus is outside the range of the sentence, returns -2.\n  void UpdateArgs(const WorkspaceSet &workspaces, const ParserState &state,\n                  int *focus) const {\n    if (*focus < -1 || *focus >= state.sentence().token_size()) {\n      *focus = -2;\n      return;\n    }\n    int new_focus = *focus + argument();\n    if (new_focus < -1 || new_focus >= state.sentence().token_size()) {\n      *focus = -2;\n      return;\n    }\n    *focus = new_focus;\n  }\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"offset\", OffsetFeatureLocator);\n\n// Feature function that returns the id of the n-th most recently constructed\n// word. Note that the argument, n, should be larger than 0. 
When it equals 0,\n// it points to the word that is not yet completed.\nclass LastWordFeatureFunction : public ParserFeatureFunction {\n public:\n  void Setup(TaskContext *context) override {\n    input_word_map_ = context->GetInput(\"word-map\", \"text\", \"\");\n  }\n\n  void Init(TaskContext *context) override {\n    min_freq_ = GetIntParameter(\"min-freq\", 0);\n    max_num_terms_ = GetIntParameter(\"max-num-terms\", 0);\n    word_map_.Load(\n        TaskContext::InputFile(*input_word_map_), min_freq_, max_num_terms_);\n    unk_id_ = word_map_.Size();\n    outside_id_ = unk_id_ + 1;\n    set_feature_type(\n        new ResourceBasedFeatureType<LastWordFeatureFunction>(\n        name(), this, {}));\n  }\n\n  int64 NumValues() const {\n    return outside_id_ + 1;\n  }\n\n  // Returns the string representation of the given feature value.\n  string GetFeatureValueName(FeatureValue value) const {\n    if (value == outside_id_) return \"<OUTSIDE>\";\n    if (value == unk_id_) return \"<UNKNOWN>\";\n    DCHECK_GE(value, 0);\n    DCHECK_LT(value, word_map_.Size());\n    return word_map_.GetTerm(value);\n  }\n\n  FeatureValue Compute(const WorkspaceSet &workspaces, const ParserState &state,\n                       const FeatureVector *result) const override {\n    // n should be larger than 0, since the current word is still under\n    // construction.\n    const int n = argument();\n    CHECK_GT(n, 0);\n    const auto *segment_state = static_cast<const BinarySegmentState *>(\n        state.transition_state());\n    if (n >= segment_state->NumStarts(state)) {\n      return outside_id_;\n    }\n\n    const auto &sentence = state.sentence();\n    const int start = segment_state->LastStart(n, state);\n    const int end = segment_state->LastStart(n - 1, state) - 1;\n    CHECK_GE(end, start);\n\n    const int start_offset = state.GetToken(start).start();\n    const int length = state.GetToken(end).end() - start_offset + 1;\n    const auto *data = sentence.text().data() + 
start_offset;\n    return word_map_.LookupIndex(string(data, length), unk_id_);\n  }\n\n private:\n  // Task input for the word to id map. Not owned.\n  TaskInput *input_word_map_ = nullptr;\n  TermFrequencyMap word_map_;\n\n  // Special ids of unknown words and out-of-range.\n  int unk_id_ = 0;\n  int outside_id_ = 0;\n\n  // Minimum frequency for term map.\n  int min_freq_;\n\n  // Maximum number of terms for term map.\n  int max_num_terms_;\n};\n\nREGISTER_PARSER_FEATURE_FUNCTION(\"last-word\", LastWordFeatureFunction);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/binary_segment_transitions_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/binary_segment_state.h\"\n#include \"syntaxnet/parser_features.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\nclass SegmentationTransitionTest : public ::testing::Test {\n protected:\n  void SetUp() override {\n    transition_system_ = std::unique_ptr<ParserTransitionSystem>(\n        ParserTransitionSystem::Create(\"binary-segment-transitions\"));\n\n    // Prepare a sentence.\n    const char *str_sentence = \"text: '因为 有 这样' \"\n        \"token { word: '因' start: 0 end: 2 break_level: SPACE_BREAK } \"\n        \"token { word: '为' start: 3 end: 5 break_level: NO_BREAK } \"\n        \"token { word: ' ' start: 6 end: 6 break_level: SPACE_BREAK } \"\n        \"token { word: '有' start: 7 end: 9 break_level: SPACE_BREAK } \"\n        \"token { word: ' ' start: 10 end: 10 break_level: SPACE_BREAK } \"\n        \"token { word: '这' start: 11 end: 13 break_level: SPACE_BREAK } \"\n        \"token { word: '样' start: 14 end: 16 break_level: NO_BREAK } \";\n    sentence_ = std::unique_ptr<Sentence>(new Sentence());\n    
TextFormat::ParseFromString(str_sentence, sentence_.get());\n\n    word_map_.Increment(\"因为\");\n    word_map_.Increment(\"因为\");\n    word_map_.Increment(\"有\");\n    word_map_.Increment(\"这\");\n    word_map_.Increment(\"这\");\n    word_map_.Increment(\"样\");\n    word_map_.Increment(\"样\");\n    word_map_.Increment(\"这样\");\n    word_map_.Increment(\"这样\");\n    string filename = tensorflow::strings::StrCat(\n        tensorflow::testing::TmpDir(), \"word-map\");\n    word_map_.Save(filename);\n\n    // Re-load in sorted order, ignoring words that occur only once.\n    word_map_.Load(filename, 2, -1);\n\n    // Prepare task context.\n    context_ = std::unique_ptr<TaskContext>(new TaskContext());\n    AddInputToContext(\"word-map\", filename, \"text\", \"\");\n    registry_ = std::unique_ptr<WorkspaceRegistry>(new WorkspaceRegistry());\n  }\n\n  // Adds an input to the task context.\n  void AddInputToContext(const string &name,\n                         const string &file_pattern,\n                         const string &file_format,\n                         const string &record_format) {\n    TaskInput *input = context_->GetInput(name);\n    TaskInput::Part *part = input->add_part();\n    part->set_file_pattern(file_pattern);\n    part->set_file_format(file_format);\n    part->set_record_format(record_format);\n  }\n\n  // Prepares a feature for computations.\n  void PrepareFeature(const string &feature_name, ParserState *state) {\n    feature_extractor_ = std::unique_ptr<ParserFeatureExtractor>(\n        new ParserFeatureExtractor());\n    feature_extractor_->Parse(feature_name);\n    feature_extractor_->Setup(context_.get());\n    feature_extractor_->Init(context_.get());\n    feature_extractor_->RequestWorkspaces(registry_.get());\n    workspace_.Reset(*registry_);\n    feature_extractor_->Preprocess(&workspace_, state);\n  }\n\n  // Computes the feature value for the parser state.\n  FeatureValue ComputeFeature(const ParserState &state) const {\n    
FeatureVector result;\n    feature_extractor_->ExtractFeatures(workspace_, state, &result);\n    return result.size() > 0 ? result.value(0) : -1;\n  }\n\n  void CheckStarts(const ParserState &state, const vector<int> &target) {\n    ASSERT_EQ(state.StackSize(), target.size());\n    for (int i = 0; i < state.StackSize(); ++i) {\n      EXPECT_EQ(state.Stack(i), target[i]);\n    }\n  }\n\n  // The test sentence.\n  std::unique_ptr<Sentence> sentence_;\n\n  // Members for testing features.\n  std::unique_ptr<ParserFeatureExtractor> feature_extractor_;\n  std::unique_ptr<TaskContext> context_;\n  std::unique_ptr<WorkspaceRegistry> registry_;\n  WorkspaceSet workspace_;\n\n  std::unique_ptr<ParserTransitionSystem> transition_system_;\n  TermFrequencyMap label_map_;\n  TermFrequencyMap word_map_;\n};\n\nTEST_F(SegmentationTransitionTest, GoldNextActionTest) {\n  BinarySegmentState *segment_state = static_cast<BinarySegmentState *>(\n      transition_system_->NewTransitionState(true));\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n\n  // Do segmentation by following the gold actions.\n  while (transition_system_->IsFinalState(state) == false) {\n    ParserAction action = transition_system_->GetNextGoldAction(state);\n    transition_system_->PerformActionWithoutHistory(action, &state);\n  }\n\n  // Test STARTs.\n  CheckStarts(state, {5, 4, 3, 2, 0});\n\n  // Test the annotated tokens.\n  segment_state->AddParseToDocument(state, false, sentence_.get());\n  ASSERT_EQ(sentence_->token_size(), 3);\n  EXPECT_EQ(sentence_->token(0).word(), \"因为\");\n  EXPECT_EQ(sentence_->token(1).word(), \"有\");\n  EXPECT_EQ(sentence_->token(2).word(), \"这样\");\n\n  // Test start/end annotation of each token.\n  EXPECT_EQ(sentence_->token(0).start(), 0);\n  EXPECT_EQ(sentence_->token(0).end(), 5);\n  EXPECT_EQ(sentence_->token(1).start(), 7);\n  EXPECT_EQ(sentence_->token(1).end(), 9);\n  EXPECT_EQ(sentence_->token(2).start(), 11);\n  
EXPECT_EQ(sentence_->token(2).end(), 16);\n}\n\nTEST_F(SegmentationTransitionTest, DefaultActionTest) {\n  BinarySegmentState *segment_state = static_cast<BinarySegmentState *>(\n      transition_system_->NewTransitionState(true));\n  ParserState state(sentence_.get(), segment_state, &label_map_);\n\n  // Do segmentation by following the default actions.\n  while (transition_system_->IsFinalState(state) == false) {\n    ParserAction action = transition_system_->GetDefaultAction(state);\n    transition_system_->PerformActionWithoutHistory(action, &state);\n  }\n\n  // Every character should be a START.\n  CheckStarts(state, {6, 5, 4, 3, 2, 1, 0});\n\n  // Every non-space character should be a word.\n  segment_state->AddParseToDocument(state, false, sentence_.get());\n  ASSERT_EQ(sentence_->token_size(), 5);\n  EXPECT_EQ(sentence_->token(0).word(), \"因\");\n  EXPECT_EQ(sentence_->token(1).word(), \"为\");\n  EXPECT_EQ(sentence_->token(2).word(), \"有\");\n  EXPECT_EQ(sentence_->token(3).word(), \"这\");\n  EXPECT_EQ(sentence_->token(4).word(), \"样\");\n}\n\nTEST_F(SegmentationTransitionTest, LastWordFeatureTest) {\n  const int unk_id = word_map_.Size();\n  const int outside_id = unk_id + 1;\n\n  // Prepare a parser state.\n  BinarySegmentState *segment_state = new BinarySegmentState();\n  auto state = std::unique_ptr<ParserState>(new ParserState(\n      sentence_.get(), segment_state, &label_map_));\n\n  // Test the initial state, which contains no words.\n  PrepareFeature(\"last-word(1,min-freq=2)\", state.get());\n  EXPECT_EQ(outside_id, ComputeFeature(*state));\n  PrepareFeature(\"last-word(2,min-freq=2)\", state.get());\n  EXPECT_EQ(outside_id, ComputeFeature(*state));\n  PrepareFeature(\"last-word(3,min-freq=2)\", state.get());\n  EXPECT_EQ(outside_id, ComputeFeature(*state));\n\n  // Test when the state contains only one start.\n  segment_state->AddStart(0, state.get());\n  PrepareFeature(\"last-word(1,min-freq=2)\", state.get());\n  
EXPECT_EQ(outside_id, ComputeFeature(*state));\n  PrepareFeature(\"last-word(2,min-freq=2)\", state.get());\n  EXPECT_EQ(outside_id, ComputeFeature(*state));\n\n  // Test when the state contains two starts, which form a complete word and\n  // the start of another new word.\n  segment_state->AddStart(2, state.get());\n  EXPECT_NE(word_map_.LookupIndex(\"因为\", unk_id), unk_id);\n  PrepareFeature(\"last-word(1)\", state.get());\n  EXPECT_EQ(word_map_.LookupIndex(\"因为\", unk_id), ComputeFeature(*state));\n\n  // last-word(2) still points to outside.\n  PrepareFeature(\"last-word(2,min-freq=2)\", state.get());\n  EXPECT_EQ(outside_id, ComputeFeature(*state));\n\n  // Adding more starts, which leads to the following words:\n  // 因为 ' ' 有 ' '\n  segment_state->AddStart(3, state.get());\n  segment_state->AddStart(4, state.get());\n\n  // Note 有 is pruned from the map since its frequency is less than 2.\n  EXPECT_EQ(word_map_.LookupIndex(\"有\", unk_id), unk_id);\n  PrepareFeature(\"last-word(1,min-freq=2)\", state.get());\n  EXPECT_EQ(unk_id, ComputeFeature(*state));\n\n  // Note that last-word(2) points to ' ', which is also an unk.\n  PrepareFeature(\"last-word(2,min-freq=2)\", state.get());\n  EXPECT_EQ(unk_id, ComputeFeature(*state));\n  PrepareFeature(\"last-word(3,min-freq=2)\", state.get());\n  EXPECT_EQ(word_map_.LookupIndex(\"因为\", unk_id), ComputeFeature(*state));\n\n  // Adding two words: \"这\" and \"样\".\n  segment_state->AddStart(5, state.get());\n  segment_state->AddStart(6, state.get());\n  PrepareFeature(\"last-word(1,min-freq=2)\", state.get());\n  EXPECT_EQ(word_map_.LookupIndex(\"这\", unk_id), ComputeFeature(*state));\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/char_properties.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// char_properties.cc - define is_X() tests for various character properties\n//\n// See char_properties.h for how to write a character property.\n//\n// References for the char sets below:\n//\n// . http://www.unicode.org/Public/UNIDATA/PropList.txt\n//\n//   Large (but not exhaustive) list of Unicode chars and their \"properties\"\n//   (e.g., the property \"Pi\" = an initial quote punctuation char).\n//\n// . http://www.unicode.org/Public/UNIDATA/PropertyValueAliases.txt\n//\n//   Defines the list of properties, such as \"Pi\", used in the above list.\n//\n// . http://www.unipad.org/unimap/index.php?param_char=XXXX&page=detail\n//\n//   Gives detail about a particular character code.\n//   XXXX is a 4-hex-digit Unicode character code.\n//\n// . 
http://www.unicode.org/Public/UNIDATA/UCD.html\n//\n//   General reference for Unicode characters.\n//\n\n#include \"syntaxnet/char_properties.h\"\n\n#include <ctype.h>  // for ispunct, isspace\n#include <memory>\n#include <utility>\n#include <vector>  // for vector\n\n#include \"tensorflow/core/lib/strings/str_util.h\"\n#include \"tensorflow/core/lib/strings/stringprintf.h\"\n#include \"third_party/utf/utf.h\"      // for runetochar, ::UTFmax, Rune\n#include \"util/utf8/unilib.h\"  // for IsValidCodepoint, etc\n#include \"util/utf8/unilib_utf8_utils.h\"\n\n//============================================================\n// CharPropertyImplementation\n//\n\n// A CharPropertyImplementation stores a set of Unicode characters,\n// encoded in UTF-8, as a trie.  The trie is represented as a vector\n// of nodes.  Each node is a 256-element array that specifies what to\n// do with one byte of the UTF-8 sequence.  Each element n of a node\n// is one of:\n//  n = 0,  indicating that the Property is not true of any\n//          character whose UTF-8 encoding includes this byte at\n//          this position\n//  n = -1, indicating that the Property is true for the UTF-8 sequence\n//          that ends with this byte.\n//  n > 0,  indicating the index of the row that describes the\n//          remaining bytes in the UTF-8 sequence.\n//\n// The only operation that needs to be fast is HoldsFor, which tests\n// whether a character has a given property. We use each byte of the\n// character's UTF-8 encoding to index into a row. If the value is 0,\n// then the property is not true for the character. (We might discover\n// this even before getting to the end of the sequence.) If the value\n// is -1, then the property is true for this character. Otherwise,\n// the value is the index of another row, which we index using the next\n// byte in the sequence, and so on. 
The design of UTF-8 prevents\n// ambiguities here; no prefix of a UTF-8 sequence is a valid UTF-8\n// sequence.\n//\n// While it is possible to implement an iterator for this representation,\n// it is much easier to use set<char32> for this purpose. In fact, we\n// would use that as the entire representation, were it not for concerns\n// that HoldsFor might be slower.\n\nnamespace syntaxnet {\n\nstruct CharPropertyImplementation {\n  unordered_set<char32> chars;\n  vector<vector<int> > rows;\n  CharPropertyImplementation() {\n    rows.reserve(10);\n    rows.resize(1);\n    rows[0].resize(256, 0);\n  }\n  void AddChar(char *buf, int len) {\n    int n = 0;  // row index\n    for (int i = 0; i < len; ++i) {\n      int ch = reinterpret_cast<unsigned char *>(buf)[i];\n      int m = rows[n][ch];\n      if (m > 0) {\n        CHECK_LT(i, len - 1)\n            << \" : \" << (i + 1) << \"-byte UTF-8 sequence \"\n            << \"(\" << tensorflow::str_util::CEscape(string(buf, i + 1)) << \")\"\n            << \" is prefix of previously-seen UTF-8 sequence(s)\";\n        n = m;\n      } else if (i == len - 1) {\n        rows[n][ch] = -1;\n      } else {\n        CHECK_EQ(m, 0) << \" : UTF-8 sequence is extension of previously-seen \"\n                       << (i + 1) << \"-byte UTF-8 sequence \"\n                       << \"(\"\n                       << tensorflow::str_util::CEscape(string(buf, i + 1))\n                       << \")\";\n        int a = rows.size();\n        rows.resize(a + 1);\n        rows[a].resize(256, 0);\n        rows[n][ch] = a;\n        n = a;\n      }\n    }\n  }\n\n  bool HoldsFor(const char *buf) const {\n    const unsigned char *bytes = reinterpret_cast<const unsigned char *>(buf);\n\n    // Lookup each byte of the UTF-8 sequence, starting in row 0.\n    int n = rows[0][*bytes];\n    if (n == 0) return false;\n    if (n == -1) return true;\n\n    // If the value is not 0 or -1, then it is the index of the row for the\n    // second byte in the 
sequence.\n    n = rows[n][*++bytes];\n    if (n == 0) return false;\n    if (n == -1) return true;\n    n = rows[n][*++bytes];  // Likewise for the third byte.\n    if (n == 0) return false;\n    if (n == -1) return true;\n    n = rows[n][*++bytes];  // Likewise for the fourth byte.\n    if (n == 0) return false;\n\n    // Since there can be at most 4 bytes in the sequence, n must be -1.\n    return true;\n\n    // Implementation note: it is possible (and perhaps clearer) to write this\n    // code as a loop, \"for (int i = 0; i < 4; ++i) ...\", but the TestHoldsFor\n    // benchmark results indicate that doing so produces slower code for\n    // anything other than short 7-bit ASCII strings (< 512 bytes). This is\n    // mysterious, since the compiler unrolls the loop, producing code that\n    // is almost the same as what we have here, except for the shortcut on\n    // the 4th byte.\n  }\n};\n\n//============================================================\n// CharProperty - a property that holds for selected Unicode chars\n//\n\nCharProperty::CharProperty(const char *name,\n                           const int *unicodes,\n                           int num_unicodes)\n    : name_(name),\n      impl_(new CharPropertyImplementation) {\n  // Initialize CharProperty to its char set.\n  AddCharSpec(unicodes, num_unicodes);\n}\n\nCharProperty::CharProperty(const char *name, CharPropertyInitializer *init_fn)\n    : name_(name),\n      impl_(new CharPropertyImplementation) {\n  (*init_fn)(this);\n}\n\nCharProperty::~CharProperty() {\n  delete impl_;\n}\n\nvoid CharProperty::AddChar(int c) {\n  CheckUnicodeVal(c);\n  impl_->chars.insert(c);\n\n  char buf[UTFmax];\n  Rune r = c;\n  int len = runetochar(buf, &r);\n  impl_->AddChar(buf, len);\n}\n\nvoid CharProperty::AddCharRange(int c1, int c2) {\n  for (int c = c1; c <= c2; ++c) {\n    AddChar(c);\n  }\n}\n\nvoid CharProperty::AddAsciiPredicate(AsciiPredicate *pred) {\n  for (int c = 0; c < 256; ++c) {\n    if 
((*pred)(c)) {\n      AddChar(c);\n    }\n  }\n}\n\nvoid CharProperty::AddCharProperty(const char *propname) {\n  const CharProperty *prop = CharProperty::Lookup(propname);\n  CHECK(prop != NULL) << \": unknown char property \\\"\" << propname\n                      << \"\\\" in \" << name_;\n  int c = -1;\n  while ((c = prop->NextElementAfter(c)) >= 0) {\n    AddChar(c);\n  }\n}\n\nvoid CharProperty::AddCharSpec(const int *unicodes, int num_unicodes) {\n  for (int i = 0; i < num_unicodes; ++i) {\n    if (i + 3 < num_unicodes && unicodes[i] == kPreUnicodeRange &&\n        unicodes[i + 3] == kPostUnicodeRange) {\n      // Range of unicode values\n      int lower = unicodes[i + 1];\n      int upper = unicodes[i + 2];\n      i += 3;  // i will be incremented once more at top of loop\n      CHECK(lower <= upper) << \": invalid char range in \" << name_\n                            << \": [\" << UnicodeToString(lower) << \", \"\n                            << UnicodeToString(upper) << \"]\";\n      AddCharRange(lower, upper);\n    } else {\n      AddChar(unicodes[i]);\n    }\n  }\n}\n\nbool CharProperty::HoldsFor(int c) const {\n  if (!UniLib::IsValidCodepoint(c)) return false;\n  char buf[UTFmax];\n  Rune r = c;\n  runetochar(buf, &r);\n  return impl_->HoldsFor(buf);\n}\n\nbool CharProperty::HoldsFor(const char *str, int len) const {\n  // UniLib::IsUTF8ValidCodepoint also checks for structural validity.\n  return len > 0 && UniLib::IsUTF8ValidCodepoint(StringPiece(str, len)) &&\n         impl_->HoldsFor(str);\n}\n\n// Return -1 or the smallest Unicode char greater than c for which\n// the CharProperty holds.  
Expects c == -1 or HoldsFor(c).\nint CharProperty::NextElementAfter(int c) const {\n  DCHECK(c == -1 || HoldsFor(c));\n  unordered_set<char32>::const_iterator end = impl_->chars.end();\n  if (c < 0) {\n    unordered_set<char32>::const_iterator it = impl_->chars.begin();\n    if (it == end) return -1;\n    return *it;\n  }\n  char32 r = c;\n  unordered_set<char32>::const_iterator it = impl_->chars.find(r);\n  if (it == end) return -1;\n  it++;\n  if (it == end) return -1;\n  return *it;\n}\n\nREGISTER_CLASS_REGISTRY(\"char property wrapper\", CharPropertyWrapper);\n\nconst CharProperty *CharProperty::Lookup(const char *subclass) {\n  // Create a CharPropertyWrapper object and delete it.  We only care about\n  // the CharProperty it provides.\n  std::unique_ptr<CharPropertyWrapper> wrapper(\n      CharPropertyWrapper::Create(subclass));\n  if (wrapper.get() == NULL) {\n    LOG(ERROR) << \"CharPropertyWrapper not found for subclass: \"\n               << \"\\\"\" << subclass << \"\\\"\";\n    return NULL;\n  }\n  return wrapper->GetCharProperty();\n}\n\n// Check that a given Unicode value is in range.\nvoid CharProperty::CheckUnicodeVal(int c) const {\n  CHECK(UniLib::IsValidCodepoint(c))\n      << \"Unicode in \" << name_ << \" out of range: \" << UnicodeToString(c);\n}\n\n// Converts a Unicode value to a string (for error messages).\nstring CharProperty::UnicodeToString(int c) {\n  const char *fmt;\n\n  if (c < 0) {\n    fmt = \"%d\";      // out-of-range\n  } else if (c <= 0x7f) {\n    fmt = \"'%c'\";    // ascii\n  } else if (c <= 0xffff) {\n    fmt = \"0x%04X\";  // 4 hex digits\n  } else {\n    fmt = \"0x%X\";    // also out-of-range\n  }\n\n  return tensorflow::strings::Printf(fmt, c);\n}\n\n//======================================================================\n// Expression-level punctuation\n//\n\n// Punctuation that starts a sentence.\nDEFINE_CHAR_PROPERTY_AS_SET(start_sentence_punc,\n  0x00A1,  // Spanish inverted exclamation mark\n  0x00BF,  // Spanish 
inverted question mark\n)\n\n// Punctuation that ends a sentence.\n// Based on: http://www.unicode.org/unicode/reports/tr29/#Sentence_Boundaries\nDEFINE_CHAR_PROPERTY_AS_SET(end_sentence_punc,\n  '.',\n  '!',\n  '?',\n  0x055C,  // Armenian exclamation mark\n  0x055E,  // Armenian question mark\n  0x0589,  // Armenian full stop\n  0x061F,  // Arabic question mark\n  0x06D4,  // Arabic full stop\n  0x0700,  // Syriac end of paragraph\n  0x0701,  // Syriac supralinear full stop\n  0x0702,  // Syriac sublinear full stop\n  RANGE(0x0964, 0x0965),  // Devanagari danda..Devanagari double danda\n  0x1362,  // Ethiopic full stop\n  0x1367,  // Ethiopic question mark\n  0x1368,  // Ethiopic paragraph separator\n  0x104A,  // Myanmar sign little section\n  0x104B,  // Myanmar sign section\n  0x166E,  // Canadian syllabics full stop\n  0x17d4,  // Khmer sign khan\n  0x1803,  // Mongolian full stop\n  0x1809,  // Mongolian Manchu full stop\n  0x1944,  // Limbu exclamation mark\n  0x1945,  // Limbu question mark\n  0x203C,  // double exclamation mark\n  0x203D,  // interrobang\n  0x2047,  // double question mark\n  0x2048,  // question exclamation mark\n  0x2049,  // exclamation question mark\n  0x3002,  // ideographic full stop\n  0x037E,  // Greek question mark\n  0xFE52,  // small full stop\n  0xFE56,  // small question mark\n  0xFE57,  // small exclamation mark\n  0xFF01,  // fullwidth exclamation mark\n  0xFF0E,  // fullwidth full stop\n  0xFF1F,  // fullwidth question mark\n  0xFF61,  // halfwidth ideographic full stop\n  0x2026,  // ellipsis\n)\n\n// Punctuation, such as parens, that opens a \"nested expression\" of text.\nDEFINE_CHAR_PROPERTY_AS_SET(open_expr_punc,\n  '(',\n  '[',\n  '<',\n  '{',\n  0x207D,  // superscript left parenthesis\n  0x208D,  // subscript left parenthesis\n  0x27E6,  // mathematical left white square bracket\n  0x27E8,  // mathematical left angle bracket\n  0x27EA,  // mathematical left double angle bracket\n  0x2983,  // left white curly 
bracket\n  0x2985,  // left white parenthesis\n  0x2987,  // Z notation left image bracket\n  0x2989,  // Z notation left binding bracket\n  0x298B,  // left square bracket with underbar\n  0x298D,  // left square bracket with tick in top corner\n  0x298F,  // left square bracket with tick in bottom corner\n  0x2991,  // left angle bracket with dot\n  0x2993,  // left arc less-than bracket\n  0x2995,  // double left arc greater-than bracket\n  0x2997,  // left black tortoise shell bracket\n  0x29D8,  // left wiggly fence\n  0x29DA,  // left double wiggly fence\n  0x29FC,  // left-pointing curved angle bracket\n  0x3008,  // CJK left angle bracket\n  0x300A,  // CJK left double angle bracket\n  0x3010,  // CJK left black lenticular bracket\n  0x3014,  // CJK left tortoise shell bracket\n  0x3016,  // CJK left white lenticular bracket\n  0x3018,  // CJK left white tortoise shell bracket\n  0x301A,  // CJK left white square bracket\n  0xFD3E,  // Ornate left parenthesis\n  0xFE59,  // small left parenthesis\n  0xFE5B,  // small left curly bracket\n  0xFF08,  // fullwidth left parenthesis\n  0xFF3B,  // fullwidth left square bracket\n  0xFF5B,  // fullwidth left curly bracket\n)\n\n// Punctuation, such as parens, that closes a \"nested expression\" of text.\nDEFINE_CHAR_PROPERTY_AS_SET(close_expr_punc,\n  ')',\n  ']',\n  '>',\n  '}',\n  0x207E,  // superscript right parenthesis\n  0x208E,  // subscript right parenthesis\n  0x27E7,  // mathematical right white square bracket\n  0x27E9,  // mathematical right angle bracket\n  0x27EB,  // mathematical right double angle bracket\n  0x2984,  // right white curly bracket\n  0x2986,  // right white parenthesis\n  0x2988,  // Z notation right image bracket\n  0x298A,  // Z notation right binding bracket\n  0x298C,  // right square bracket with underbar\n  0x298E,  // right square bracket with tick in top corner\n  0x2990,  // right square bracket with tick in bottom corner\n  0x2992,  // right angle bracket with dot\n  0x2994, 
 // right arc greater-than bracket\n  0x2996,  // double right arc less-than bracket\n  0x2998,  // right black tortoise shell bracket\n  0x29D9,  // right wiggly fence\n  0x29DB,  // right double wiggly fence\n  0x29FD,  // right-pointing curved angle bracket\n  0x3009,  // CJK right angle bracket\n  0x300B,  // CJK right double angle bracket\n  0x3011,  // CJK right black lenticular bracket\n  0x3015,  // CJK right tortoise shell bracket\n  0x3017,  // CJK right white lenticular bracket\n  0x3019,  // CJK right white tortoise shell bracket\n  0x301B,  // CJK right white square bracket\n  0xFD3F,  // Ornate right parenthesis\n  0xFE5A,  // small right parenthesis\n  0xFE5C,  // small right curly bracket\n  0xFF09,  // fullwidth right parenthesis\n  0xFF3D,  // fullwidth right square bracket\n  0xFF5D,  // fullwidth right curly bracket\n)\n\n// Chars that open a quotation.\n// Based on: http://www.unicode.org/uni2book/ch06.pdf\nDEFINE_CHAR_PROPERTY_AS_SET(open_quote,\n  '\"',\n  '\\'',\n  '`',\n  0xFF07,  // fullwidth apostrophe\n  0xFF02,  // fullwidth quotation mark\n  0x2018,  // left single quotation mark (English, others)\n  0x201C,  // left double quotation mark (English, others)\n  0x201B,  // single high-reversed-9 quotation mark (PropList.txt)\n  0x201A,  // single low-9 quotation mark (Czech, German, Slovak)\n  0x201E,  // double low-9 quotation mark (Czech, German, Slovak)\n  0x201F,  // double high-reversed-9 quotation mark (PropList.txt)\n  0x2019,  // right single quotation mark (Danish, Finnish, Swedish, Norw.)\n  0x201D,  // right double quotation mark (Danish, Finnish, Swedish, Norw.)\n  0x2039,  // single left-pointing angle quotation mark (French, others)\n  0x00AB,  // left-pointing double angle quotation mark (French, others)\n  0x203A,  // single right-pointing angle quotation mark (Slovenian, others)\n  0x00BB,  // right-pointing double angle quotation mark (Slovenian, others)\n  0x300C,  // left corner bracket (East Asian languages)\n  
0xFE41,  // presentation form for vertical left corner bracket\n  0xFF62,  // halfwidth left corner bracket (East Asian languages)\n  0x300E,  // left white corner bracket (East Asian languages)\n  0xFE43,  // presentation form for vertical left white corner bracket\n  0x301D,  // reversed double prime quotation mark (East Asian langs, horiz.)\n)\n\n// Chars that close a quotation.\n// Based on: http://www.unicode.org/uni2book/ch06.pdf\nDEFINE_CHAR_PROPERTY_AS_SET(close_quote,\n  '\\'',\n  '\"',\n  '`',\n  0xFF07,  // fullwidth apostrophe\n  0xFF02,  // fullwidth quotation mark\n  0x2019,  // right single quotation mark (English, others)\n  0x201D,  // right double quotation mark (English, others)\n  0x2018,  // left single quotation mark (Czech, German, Slovak)\n  0x201C,  // left double quotation mark (Czech, German, Slovak)\n  0x203A,  // single right-pointing angle quotation mark (French, others)\n  0x00BB,  // right-pointing double angle quotation mark (French, others)\n  0x2039,  // single left-pointing angle quotation mark (Slovenian, others)\n  0x00AB,  // left-pointing double angle quotation mark (Slovenian, others)\n  0x300D,  // right corner bracket (East Asian languages)\n  0xfe42,  // presentation form for vertical right corner bracket\n  0xFF63,  // halfwidth right corner bracket (East Asian languages)\n  0x300F,  // right white corner bracket (East Asian languages)\n  0xfe44,  // presentation form for vertical right white corner bracket\n  0x301F,  // low double prime quotation mark (East Asian languages)\n  0x301E,  // close double prime (East Asian languages written horizontally)\n)\n\n// Punctuation chars that open an expression or a quotation.\nDEFINE_CHAR_PROPERTY(open_punc, prop) {\n  prop->AddCharProperty(\"open_expr_punc\");\n  prop->AddCharProperty(\"open_quote\");\n}\n\n// Punctuation chars that close an expression or a quotation.\nDEFINE_CHAR_PROPERTY(close_punc, prop) {\n  prop->AddCharProperty(\"close_expr_punc\");\n  
prop->AddCharProperty(\"close_quote\");\n}\n\n// Punctuation chars that can come at the beginning of a sentence.\nDEFINE_CHAR_PROPERTY(leading_sentence_punc, prop) {\n  prop->AddCharProperty(\"open_punc\");\n  prop->AddCharProperty(\"start_sentence_punc\");\n}\n\n// Punctuation chars that can come at the end of a sentence.\nDEFINE_CHAR_PROPERTY(trailing_sentence_punc, prop) {\n  prop->AddCharProperty(\"close_punc\");\n  prop->AddCharProperty(\"end_sentence_punc\");\n}\n\n//======================================================================\n// Special symbols\n//\n\n// Currency symbols.\n// From: http://www.unicode.org/charts/PDF/U20A0.pdf\nDEFINE_CHAR_PROPERTY_AS_SET(currency_symbol,\n  '$',\n  // 0x00A2,  // cents (NB: typically FOLLOWS the amount)\n  0x00A3,  // pounds and liras\n  0x00A4,  // general currency sign\n  0x00A5,  // yen or yuan\n  0x0192,  // Dutch florin (latin small letter \"f\" with hook)\n  0x09F2,  // Bengali rupee mark\n  0x09F3,  // Bengali rupee sign\n  0x0AF1,  // Gujarati rupee sign\n  0x0BF9,  // Tamil rupee sign\n  0x0E3F,  // Thai baht\n  0x17DB,  // Khmer riel\n  0x20A0,  // alternative euro sign\n  0x20A1,  // Costa Rica, El Salvador (colon sign)\n  0x20A2,  // Brazilian cruzeiro\n  0x20A3,  // French Franc\n  0x20A4,  // alternative lira sign\n  0x20A5,  // mill sign (USA 1/10 cent)\n  0x20A6,  // Nigerian Naira\n  0x20A7,  // Spanish peseta\n  0x20A8,  // Indian rupee\n  0x20A9,  // Korean won\n  0x20AA,  // Israeli new sheqel\n  0x20AB,  // Vietnam dong\n  0x20AC,  // euro sign\n  0x20AD,  // Laotian kip\n  0x20AE,  // Mongolian tugrik\n  0x20AF,  // Greek drachma\n  0x20B0,  // German penny\n  0x20B1,  // Philippine peso (Mexican peso uses \"$\")\n  0x2133,  // Old German mark (script capital M)\n  0xFDFC,  // rial sign\n  0xFFE0,  // fullwidth cents\n  0xFFE1,  // fullwidth pounds\n  0xFFE5,  // fullwidth Japanese yen\n  0xFFE6,  // fullwidth Korean won\n)\n\n// Chinese bookquotes.\n// They look like \"<<\" and \">>\" except 
that they are single UTF8 chars\n// (U+300A, U+300B). These are used in Chinese as special\n// punctuation, referring to the title of a book, an article, a movie,\n// etc.  For example: \"cellphone\" means cellphone, but <<cellphone>>\n// means (exclusively) the movie.\nDEFINE_CHAR_PROPERTY_AS_SET(open_bookquote,\n 0x300A\n)\n\nDEFINE_CHAR_PROPERTY_AS_SET(close_bookquote,\n 0x300B\n)\n\n//======================================================================\n// Token-level punctuation\n//\n\n// Token-prefix symbols, excluding currency symbols -- glom on\n// to following token (esp. if no space after)\nDEFINE_CHAR_PROPERTY_AS_SET(noncurrency_token_prefix_symbol,\n  '#',\n  0x2116,  // numero sign (\"No\")\n)\n\n// Token-prefix symbols -- glom on to following token (esp. if no space after)\nDEFINE_CHAR_PROPERTY(token_prefix_symbol, prop) {\n  prop->AddCharProperty(\"currency_symbol\");\n  prop->AddCharProperty(\"noncurrency_token_prefix_symbol\");\n}\n\n// Token-suffix symbols -- glom on to preceding token (esp. 
if no space before)\nDEFINE_CHAR_PROPERTY_AS_SET(token_suffix_symbol,\n  '%',\n  0x066A,  // Arabic percent sign\n  0x2030,  // per mille\n  0x2031,  // per ten thousand\n  0x00A2,  // cents sign\n  0x2125,  // ounces sign\n  0x00AA,  // feminine ordinal indicator (Spanish)\n  0x00BA,  // masculine ordinal indicator (Spanish)\n  0x00B0,  // degrees\n  0x2109,  // degrees Fahrenheit\n  0x2103,  // degrees Celsius\n  0x2126,  // ohms\n  0x212A,  // Kelvin\n  0x212B,  // Angstroms (\"A\" with circle on top)\n  0x00A9,  // copyright\n  0x2117,  // sound recording copyright (circled \"P\")\n  0x2122,  // trade mark\n  0x00AE,  // registered trade mark\n  0x2120,  // service mark\n  0x2106,  // cada una (\"c/a\" == \"each\" in Spanish)\n  0x2020,  // dagger (can be used for footnotes)\n  0x2021,  // double dagger (can be used for footnotes)\n)\n\n// Subscripts\nDEFINE_CHAR_PROPERTY_AS_SET(subscript_symbol,\n  0x2080,  // subscript 0\n  0x2081,  // subscript 1\n  0x2082,  // subscript 2\n  0x2083,  // subscript 3\n  0x2084,  // subscript 4\n  0x2085,  // subscript 5\n  0x2086,  // subscript 6\n  0x2087,  // subscript 7\n  0x2088,  // subscript 8\n  0x2089,  // subscript 9\n  0x208A,  // subscript \"+\"\n  0x208B,  // subscript \"-\"\n  0x208C,  // subscript \"=\"\n  0x208D,  // subscript \"(\"\n  0x208E,  // subscript \")\"\n)\n\n// Superscripts\nDEFINE_CHAR_PROPERTY_AS_SET(superscript_symbol,\n  0x2070,  // superscript 0\n  0x00B9,  // superscript 1\n  0x00B2,  // superscript 2\n  0x00B3,  // superscript 3\n  0x2074,  // superscript 4\n  0x2075,  // superscript 5\n  0x2076,  // superscript 6\n  0x2077,  // superscript 7\n  0x2078,  // superscript 8\n  0x2079,  // superscript 9\n  0x2071,  // superscript Latin small \"i\"\n  0x207A,  // superscript \"+\"\n  0x207B,  // superscript \"-\"\n  0x207C,  // superscript \"=\"\n  0x207D,  // superscript \"(\"\n  0x207E,  // superscript \")\"\n  0x207F,  // superscript Latin small 
\"n\"\n)\n\n//======================================================================\n// General punctuation\n//\n\n// Connector punctuation\n// Code Pc from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY_AS_SET(connector_punc,\n  0x30fb,  // Katakana middle dot\n  0xff65,  // halfwidth Katakana middle dot\n  0x2040,  // character tie\n)\n\n// Dashes\n// Code Pd from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY_AS_SET(dash_punc,\n  '-',\n  '~',\n  0x058a,  // Armenian hyphen\n  0x1806,  // Mongolian todo soft hyphen\n  RANGE(0x2010, 0x2015),  // hyphen..horizontal bar\n  0x2053,  // swung dash -- from Table 6-3 of Unicode book\n  0x207b,  // superscript minus\n  0x208b,  // subscript minus\n  0x2212,  // minus sign\n  0x301c,  // wave dash\n  0x3030,  // wavy dash\n  RANGE(0xfe31, 0xfe32),  // presentation form for vertical em dash..en dash\n  0xfe58,  // small em dash\n  0xfe63,  // small hyphen-minus\n  0xff0d,  // fullwidth hyphen-minus\n)\n\n// Other punctuation\n// Code Po from http://www.unicode.org/Public/UNIDATA/UnicodeData.txt\n// NB: This list is not exhaustive.\nDEFINE_CHAR_PROPERTY_AS_SET(other_punc,\n  ',',\n  ':',\n  ';',\n  0x00b7,  // middle dot\n  0x0387,  // Greek ano teleia\n  0x05c3,  // Hebrew punctuation sof pasuq\n  0x060c,  // Arabic comma\n  0x061b,  // Arabic semicolon\n  0x066b,  // Arabic decimal separator\n  0x066c,  // Arabic thousands separator\n  RANGE(0x0703, 0x70a),  // Syriac contraction and others\n  0x070c,  // Syriac harklean metobelus\n  0x0e5a,  // Thai character angkhankhu\n  0x0e5b,  // Thai character khomut\n  0x0f08,  // Tibetan mark sbrul shad\n  RANGE(0x0f0d, 0x0f12),  // Tibetan mark shad..Tibetan mark rgya gram shad\n  0x1361,  // Ethiopic wordspace\n  RANGE(0x1363, 0x1366),  // other Ethiopic chars\n  0x166d,  // Canadian syllabics chi sign\n  RANGE(0x16eb, 0x16ed),  
// Runic single punctuation..Runic cross punctuation\n  RANGE(0x17d5, 0x17d6),  // Khmer sign camnuc pii huuh and other\n  0x17da,  // Khmer sign koomut\n  0x1802,  // Mongolian comma\n  RANGE(0x1804, 0x1805),  // Mongolian four dots and other\n  0x1808,  // Mongolian manchu comma\n  0x3001,  // ideographic comma\n  RANGE(0xfe50, 0xfe51),  // small comma and others\n  RANGE(0xfe54, 0xfe55),  // small semicolon and other\n  0xff0c,  // fullwidth comma\n  RANGE(0xff0e, 0xff0f),  // fullwidth stop..fullwidth solidus\n  RANGE(0xff1a, 0xff1b),  // fullwidth colon..fullwidth semicolon\n  0xff64,  // halfwidth ideographic comma\n  0x2016,  // double vertical line\n  RANGE(0x2032, 0x2034),  // prime..triple prime\n  0xfe61,  // small asterisk\n  0xfe68,  // small reverse solidus\n  0xff3c,  // fullwidth reverse solidus\n)\n\n// All punctuation.\n// Code P from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY(punctuation, prop) {\n  prop->AddCharProperty(\"open_punc\");\n  prop->AddCharProperty(\"close_punc\");\n  prop->AddCharProperty(\"leading_sentence_punc\");\n  prop->AddCharProperty(\"trailing_sentence_punc\");\n  prop->AddCharProperty(\"connector_punc\");\n  prop->AddCharProperty(\"dash_punc\");\n  prop->AddCharProperty(\"other_punc\");\n  prop->AddAsciiPredicate(&ispunct);\n}\n\n//======================================================================\n// Separators\n//\n\n// Line separators\n// Code Zl from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY_AS_SET(line_separator,\n  0x2028,                           // line separator\n)\n\n// Paragraph separators\n// Code Zp from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY_AS_SET(paragraph_separator,\n  0x2029,                           // paragraph separator\n)\n\n// Space separators\n// Code Zs 
from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY_AS_SET(space_separator,\n  0x0020,                           // space\n  0x00a0,                           // no-break space\n  0x1680,                           // Ogham space mark\n  0x180e,                           // Mongolian vowel separator\n  RANGE(0x2000, 0x200a),            // en quad..hair space\n  0x202f,                           // narrow no-break space\n  0x205f,                           // medium mathematical space\n  0x3000,                           // ideographic space\n\n  // Google additions\n  0xe5e5,                           // \"private\" char used as space in Chinese\n)\n\n// Separators -- all line, paragraph, and space separators.\n// Code Z from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nDEFINE_CHAR_PROPERTY(separator, prop) {\n  prop->AddCharProperty(\"line_separator\");\n  prop->AddCharProperty(\"paragraph_separator\");\n  prop->AddCharProperty(\"space_separator\");\n  prop->AddAsciiPredicate(&isspace);\n}\n\n//======================================================================\n// Alphanumeric Characters\n//\n\n// Digits\nDEFINE_CHAR_PROPERTY_AS_SET(digit,\n  RANGE('0', '9'),\n  RANGE(0x0660, 0x0669),  // Arabic-Indic digits\n\n  RANGE(0x06F0, 0x06F9),  // Eastern Arabic-Indic digits\n)\n\n//======================================================================\n// Japanese Katakana\n//\n\nDEFINE_CHAR_PROPERTY_AS_SET(katakana,\n  0x3099,  // COMBINING KATAKANA-HIRAGANA VOICED SOUND MARK\n  0x309A,  // COMBINING KATAKANA-HIRAGANA SEMI-VOICED SOUND MARK\n  0x309B,  // KATAKANA-HIRAGANA VOICED SOUND MARK\n  0x309C,  // KATAKANA-HIRAGANA SEMI-VOICED SOUND MARK\n  RANGE(0x30A0, 0x30FF),  // Fullwidth Katakana\n  RANGE(0xFF65, 0xFF9F),  // Halfwidth Katakana\n)\n\n//======================================================================\n// BiDi 
Directional Formatting Codes\n//\n\n// See http://www.unicode.org/reports/tr9/ for a description of Bidi\n// and http://www.unicode.org/charts/PDF/U2000.pdf for the character codes.\nDEFINE_CHAR_PROPERTY_AS_SET(directional_formatting_code,\n  0x200E,  // LRM (Left-to-Right Mark)\n  0x200F,  // RLM (Right-to-Left Mark)\n  0x202A,  // LRE (Left-to-Right Embedding)\n  0x202B,  // RLE (Right-to-Left Embedding)\n  0x202C,  // PDF (Pop Directional Format)\n  0x202D,  // LRO (Left-to-Right Override)\n  0x202E,  // RLO (Right-to-Left Override)\n)\n\n//======================================================================\n// Special collections\n//\n\n// NB: This does not check for all punctuation and symbols in the\n// standard; just those listed in our code. See the definitions in\n// char_properties.cc\nDEFINE_CHAR_PROPERTY(punctuation_or_symbol, prop) {\n  prop->AddCharProperty(\"punctuation\");\n  prop->AddCharProperty(\"subscript_symbol\");\n  prop->AddCharProperty(\"superscript_symbol\");\n  prop->AddCharProperty(\"token_prefix_symbol\");\n  prop->AddCharProperty(\"token_suffix_symbol\");\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/char_properties.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// char_properties.h - define is_X() tests for various character properties\n//\n// Character properties can be defined in two ways:\n//\n// (1) Set-based:\n//\n//     Enumerate the chars that have the property.  Example:\n//\n//       DEFINE_CHAR_PROPERTY_AS_SET(my_fave,\n//         RANGE('0', '9'),\n//         '\\'',\n//         0x00BF,   // Spanish inverted question mark\n//       )\n//\n//     Characters are expressed as Unicode code points; note that ascii codes\n//     are a subset.  RANGE() specifies an inclusive range of code points.\n//\n//     This defines two functions:\n//\n//       bool is_my_fave(const char *str, int len)\n//       bool is_my_fave(int c)\n//\n//     Each returns true for precisely the 12 characters specified above.\n//     Each takes a *single* UTF8 char as its argument -- the first expresses\n//     it as a char * and a length, the second as a Unicode code point.\n//     Please do not pass a string of multiple UTF8 chars to the first one.\n//\n//     To make is_my_fave() externally accessible, put in your .h file:\n//\n//       DECLARE_CHAR_PROPERTY(my_fave)\n//\n// (2) Function-based:\n//\n//     Specify a function that assigns the desired chars to a CharProperty\n//     object.  
Example:\n//\n//       DEFINE_CHAR_PROPERTY(my_other_fave, prop) {\n//         for (int i = '0'; i <= '9'; i += 2) {\n//           prop->AddChar(i);\n//         }\n//         prop->AddAsciiPredicate(&ispunct);\n//         prop->AddCharProperty(\"currency_symbol\");\n//       }\n//\n//     This defines a function of one arg: CharProperty *prop.  The function\n//     calls various CharProperty methods to populate the prop.  The last call\n//     above, AddCharProperty(), adds the chars from another char property\n//     (\"currency_symbol\").\n//\n//     As in the set-based case, put a DECLARE_CHAR_PROPERTY(my_other_fave)\n//     in your .h if you want is_my_other_fave() to be externally accessible.\n//\n\n#ifndef SYNTAXNET_CHAR_PROPERTIES_H_\n#define SYNTAXNET_CHAR_PROPERTIES_H_\n\n#include <string>  // for string\n\n#include \"syntaxnet/registry.h\"\n#include \"syntaxnet/utils.h\"\n\n// =====================================================================\n// Registry for accessing CharProperties by name\n//\n// This is for internal use by the CharProperty class and macros; callers\n// should not use it explicitly.\n//\n\nnamespace syntaxnet {\n\nclass CharProperty;   // forward declaration\n\n// Wrapper around a CharProperty, allowing it to be stored in a registry.\nstruct CharPropertyWrapper : RegisterableClass<CharPropertyWrapper> {\n  virtual ~CharPropertyWrapper() { }\n  virtual CharProperty *GetCharProperty() = 0;\n};\n\n#define REGISTER_CHAR_PROPERTY_WRAPPER(type, component) \\\n  REGISTER_CLASS_COMPONENT(CharPropertyWrapper, type, component)\n\n#define REGISTER_CHAR_PROPERTY(lsp, name)                         \\\n  struct name##CharPropertyWrapper : public CharPropertyWrapper { \\\n    CharProperty *GetCharProperty() { return lsp.get(); }         \\\n  };                                                              \\\n  REGISTER_CHAR_PROPERTY_WRAPPER(#name, name##CharPropertyWrapper)\n\n// 
=====================================================================\n// Macros for defining character properties\n//\n\n// Define is_X() functions to test whether a single UTF8 character has\n// the 'X' char prop.\n#define DEFINE_IS_X_CHAR_PROPERTY_FUNCTIONS(lsp, name) \\\n  bool is_##name(const char *str, int len) {                                 \\\n    return lsp->HoldsFor(str, len);                                          \\\n  }                                                                          \\\n  bool is_##name(int c) {                                                    \\\n    return lsp->HoldsFor(c);                                                 \\\n  }\n\n// Define a char property by enumerating the unicode char points,\n// or RANGE()s thereof, for which it holds.  Example:\n//\n//   DEFINE_CHAR_PROPERTY_AS_SET(my_fave,\n//     'q',\n//     RANGE('0', '9'),\n//     0x20AB,\n//   )\n//\n// \"...\" is a GNU extension.\n#define DEFINE_CHAR_PROPERTY_AS_SET(name, unicodes...)                         
\\\n  static const int k_##name##_unicodes[] = {unicodes};                         \\\n  static utils::LazyStaticPtr<CharProperty, const char *, const int *, size_t> \\\n      name##_char_property = {#name, k_##name##_unicodes,                      \\\n                              arraysize(k_##name##_unicodes)};                 \\\n  REGISTER_CHAR_PROPERTY(name##_char_property, name);                          \\\n  DEFINE_IS_X_CHAR_PROPERTY_FUNCTIONS(name##_char_property, name)\n\n// Specify a range (inclusive) of Unicode character values.\n// Example: RANGE('0', '9') specifies the 10 digits.\n// For use as an element in a DEFINE_CHAR_PROPERTY_AS_SET() list.\nstatic const int kPreUnicodeRange = -1;\nstatic const int kPostUnicodeRange = -2;\n#define RANGE(lower, upper) \\\n  kPreUnicodeRange, lower, upper, kPostUnicodeRange\n\n// A function to initialize a CharProperty.\ntypedef void CharPropertyInitializer(CharProperty *prop);\n\n// Define a char property by specifying a block of code that initializes it.\n// Example:\n//\n//   DEFINE_CHAR_PROPERTY(my_other_fave, prop) {\n//     for (int i = '0'; i <= '9'; i += 2) {\n//       prop->AddChar(i);\n//     }\n//     prop->AddAsciiPredicate(&ispunct);\n//     prop->AddCharProperty(\"currency_symbol\");\n//   }\n//\n#define DEFINE_CHAR_PROPERTY(name, charpropvar)                       \\\n  static void init_##name##_char_property(CharProperty *charpropvar); \\\n  static utils::LazyStaticPtr<CharProperty, const char *,             \\\n                              CharPropertyInitializer *>              \\\n      name##_char_property = {#name, &init_##name##_char_property};   \\\n  REGISTER_CHAR_PROPERTY(name##_char_property, name);                 \\\n  DEFINE_IS_X_CHAR_PROPERTY_FUNCTIONS(name##_char_property, name)     \\\n  static void init_##name##_char_property(CharProperty *charpropvar)\n\n// =====================================================================\n// Macro for declaring character 
properties\n//\n\n#define DECLARE_CHAR_PROPERTY(name) \\\n  extern bool is_##name(const char *str, int len);                           \\\n  extern bool is_##name(int c);                                              \\\n\n// ===========================================================\n// CharProperty - a property that holds for selected Unicode chars\n//\n// A CharProperty is semantically equivalent to set<char32>.\n//\n// The characters for which a CharProperty holds are represented as a trie,\n// i.e., a tree that is indexed by successive bytes of the UTF-8 encoding\n// of the characters.  This permits fast lookup (HoldsFor).\n//\n\n// A function that defines a subset of [0..255], e.g., isspace.\ntypedef int AsciiPredicate(int c);\n\nclass CharProperty {\n public:\n  // Constructor for set-based char properties.\n  CharProperty(const char *name, const int *unicodes, int num_unicodes);\n\n  // Constructor for function-based char properties.\n  CharProperty(const char *name, CharPropertyInitializer *init_fn);\n\n  virtual ~CharProperty();\n\n  // Various ways of adding chars to a CharProperty; for use only in\n  // CharPropertyInitializer functions.\n  void AddChar(int c);\n  void AddCharRange(int c1, int c2);\n  void AddAsciiPredicate(AsciiPredicate *pred);\n  void AddCharProperty(const char *name);\n  void AddCharSpec(const int *unicodes, int num_unicodes);\n\n  // Return true iff the CharProperty holds for a single given UTF8 char.\n  bool HoldsFor(const char *str, int len) const;\n\n  // Return true iff the CharProperty holds for a single given Unicode char.\n  bool HoldsFor(int c) const;\n\n  // You can use this to enumerate the set elements (it was easier\n  // than defining a real iterator).  Returns -1 if there are no more.\n  // Call with -1 to get the first element.  Expects c == -1 or HoldsFor(c).\n  int NextElementAfter(int c) const;\n\n  // Return NULL or the CharProperty with the given name.  
Looks up the name\n  // in a CharProperty registry.\n  static const CharProperty *Lookup(const char *name);\n\n private:\n  void CheckUnicodeVal(int c) const;\n  static string UnicodeToString(int c);\n\n  const char *name_;\n  struct CharPropertyImplementation *impl_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(CharProperty);\n};\n\n//======================================================================\n// Expression-level punctuation\n//\n\n// Punctuation that starts a sentence.\nDECLARE_CHAR_PROPERTY(start_sentence_punc);\n\n// Punctuation that ends a sentence.\nDECLARE_CHAR_PROPERTY(end_sentence_punc);\n\n// Punctuation, such as parens, that opens a \"nested expression\" of text.\nDECLARE_CHAR_PROPERTY(open_expr_punc);\n\n// Punctuation, such as parens, that closes a \"nested expression\" of text.\nDECLARE_CHAR_PROPERTY(close_expr_punc);\n\n// Chars that open a quotation.\nDECLARE_CHAR_PROPERTY(open_quote);\n\n// Chars that close a quotation.\nDECLARE_CHAR_PROPERTY(close_quote);\n\n// Punctuation chars that open an expression or a quotation.\nDECLARE_CHAR_PROPERTY(open_punc);\n\n// Punctuation chars that close an expression or a quotation.\nDECLARE_CHAR_PROPERTY(close_punc);\n\n// Punctuation chars that can come at the beginning of a sentence.\nDECLARE_CHAR_PROPERTY(leading_sentence_punc);\n\n// Punctuation chars that can come at the end of a sentence.\nDECLARE_CHAR_PROPERTY(trailing_sentence_punc);\n\n//======================================================================\n// Token-level punctuation\n//\n\n// Token-prefix symbols -- glom on to following token\n// (esp. if no space after) -- except for currency symbols.\nDECLARE_CHAR_PROPERTY(noncurrency_token_prefix_symbol);\n\n// Token-prefix symbols -- glom on to following token (esp. if no space after).\nDECLARE_CHAR_PROPERTY(token_prefix_symbol);\n\n// Token-suffix symbols -- glom on to preceding token (esp. 
if no space\n// before).\nDECLARE_CHAR_PROPERTY(token_suffix_symbol);\n\n// Subscripts.\nDECLARE_CHAR_PROPERTY(subscript_symbol);\n\n// Superscripts.\nDECLARE_CHAR_PROPERTY(superscript_symbol);\n\n//======================================================================\n// General punctuation\n//\n\n// Connector punctuation.\nDECLARE_CHAR_PROPERTY(connector_punc);\n\n// Dashes.\nDECLARE_CHAR_PROPERTY(dash_punc);\n\n// Other punctuation.\nDECLARE_CHAR_PROPERTY(other_punc);\n\n// All punctuation.\nDECLARE_CHAR_PROPERTY(punctuation);\n\n//======================================================================\n// Special symbols\n//\n\n// Currency symbols.\nDECLARE_CHAR_PROPERTY(currency_symbol);\n\n// Chinese bookquotes.\nDECLARE_CHAR_PROPERTY(open_bookquote);\nDECLARE_CHAR_PROPERTY(close_bookquote);\n\n//======================================================================\n// Separators\n//\n\n// Line separators.\nDECLARE_CHAR_PROPERTY(line_separator);\n\n// Paragraph separators.\nDECLARE_CHAR_PROPERTY(paragraph_separator);\n\n// Space separators.\nDECLARE_CHAR_PROPERTY(space_separator);\n\n// Separators -- all line, paragraph, and space separators.\nDECLARE_CHAR_PROPERTY(separator);\n\n//======================================================================\n// Alphanumeric Characters\n//\n\n// Digits.\nDECLARE_CHAR_PROPERTY(digit);\n\n// Japanese Katakana.\nDECLARE_CHAR_PROPERTY(katakana);\n\n//======================================================================\n// BiDi Directional Formatting Codes\n//\n\n// Explicit directional formatting codes (LRM, RLM, LRE, RLE, PDF, LRO, RLO)\n// used by the bidirectional algorithm.\n//\n// Note: Use this only to classify characters. 
To actually determine\n// directionality of BiDi text, look under i18n/bidi.\n//\n// See http://www.unicode.org/reports/tr9/ for a description of the algorithm\n// and http://www.unicode.org/charts/PDF/U2000.pdf for the character codes.\nDECLARE_CHAR_PROPERTY(directional_formatting_code);\n\n//======================================================================\n// Special collections\n//\n\n// NB: This does not check for all punctuation and symbols in the standard;\n// just those listed in our code. See the definitions in char_properties.cc.\nDECLARE_CHAR_PROPERTY(punctuation_or_symbol);\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_CHAR_PROPERTIES_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/char_properties_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Tests for char_properties.cc:\n//\n// (1) Test the DEFINE_CHAR_PROPERTY_AS_SET and DEFINE_CHAR_PROPERTY macros\n//     by defining a few fake char properties and verifying their contents.\n//\n// (2) Test the char properties defined in char_properties.cc by spot-checking\n//     a few chars.\n//\n\n#include \"syntaxnet/char_properties.h\"\n\n#include <ctype.h>  // for ispunct, isspace\n#include <map>\n#include <set>\n#include <utility>\n#include <vector>\n\n#include <gmock/gmock.h>  // for ContainerEq, EXPECT_THAT\n#include \"tensorflow/core/platform/test.h\"\n#include \"third_party/utf/utf.h\"\n#include \"util/utf8/unilib.h\"  // for IsValidCodepoint, etc\n#include \"util/utf8/unilib_utf8_utils.h\"\n\nusing ::testing::ContainerEq;\n\nnamespace syntaxnet {\n\n// Invalid UTF-8 bytes are decoded as the Replacement Character, U+FFFD\n// (which is also Runeerror). 
Invalid code points are encoded in UTF-8\n// with the UTF-8 representation of the Replacement Character.\nstatic const char ReplacementCharacterUTF8[3] = {'\\xEF', '\\xBF', '\\xBD'};\n\n// ====================================================================\n// CharPropertiesTest\n//\n\nclass CharPropertiesTest : public testing::Test {\n protected:\n  // Collect a set of chars.\n  void CollectChars(const std::set<char32> &chars) {\n    collected_set_.insert(chars.begin(), chars.end());\n  }\n\n  // Collect an array of chars.\n  void CollectArray(const char32 arr[], int len) {\n    collected_set_.insert(arr, arr + len);\n  }\n\n  // Collect the chars for which the named CharProperty holds.\n  void CollectCharProperty(const char *name) {\n    const CharProperty *prop = CharProperty::Lookup(name);\n    ASSERT_TRUE(prop != nullptr) << \"for \" << name;\n\n    for (char32 c = 0; c <= 0x10FFFF; ++c) {\n      if (UniLib::IsValidCodepoint(c) && prop->HoldsFor(c)) {\n        collected_set_.insert(c);\n      }\n    }\n  }\n\n  // Collect the chars for which an ascii predicate holds.\n  void CollectAsciiPredicate(AsciiPredicate *pred) {\n    for (char32 c = 0; c < 256; ++c) {\n      if ((*pred)(c)) {\n        collected_set_.insert(c);\n      }\n    }\n  }\n\n  // Expect the named char property to be true for precisely the chars in\n  // the collected set.\n  void ExpectCharPropertyEqualsCollectedSet(const char *name) {\n    const CharProperty *prop = CharProperty::Lookup(name);\n    ASSERT_TRUE(prop != nullptr) << \"for \" << name;\n\n    // Test that char property holds for all collected chars.  
Exercises both\n    // signatures of CharProperty::HoldsFor().\n    for (std::set<char32>::const_iterator it = collected_set_.begin();\n         it != collected_set_.end(); ++it) {\n      // Test utf8 version of is_X().\n      const char32 c = *it;\n      string utf8_char = EncodeAsUTF8(&c, 1);\n      EXPECT_TRUE(prop->HoldsFor(utf8_char.c_str(), utf8_char.size()));\n\n      // Test ucs-2 version of is_X().\n      EXPECT_TRUE(prop->HoldsFor(static_cast<int>(c)));\n    }\n\n    // Test that the char property holds for precisely the collected chars.\n    // Somewhat redundant with previous test, but exercises\n    // CharProperty::NextElementAfter().\n    std::set<char32> actual_chars;\n    int c = -1;\n    while ((c = prop->NextElementAfter(c)) >= 0) {\n      actual_chars.insert(static_cast<char32>(c));\n    }\n    EXPECT_THAT(actual_chars, ContainerEq(collected_set_))\n        << \" for \" << name;\n  }\n\n  // Expect the named char property to be true for at least the chars in\n  // the collected set.\n  void ExpectCharPropertyContainsCollectedSet(const char *name) {\n    const CharProperty *prop = CharProperty::Lookup(name);\n    ASSERT_TRUE(prop != nullptr) << \"for \" << name;\n\n    for (std::set<char32>::const_iterator it = collected_set_.begin();\n         it != collected_set_.end(); ++it) {\n      EXPECT_TRUE(prop->HoldsFor(static_cast<int>(*it)));\n    }\n  }\n\n  string EncodeAsUTF8(const char32 *in, int size) {\n    string out;\n    out.reserve(size);\n    for (int i = 0; i < size; ++i) {\n      char buf[UTFmax];\n      int len = EncodeAsUTF8Char(*in++, buf);\n      out.append(buf, len);\n    }\n    return out;\n  }\n\n  int EncodeAsUTF8Char(char32 in, char *out) {\n    if (UniLib::IsValidCodepoint(in)) {\n      return runetochar(out, &in);\n    } else {\n      memcpy(out, ReplacementCharacterUTF8, 3);\n      return 3;\n    }\n  }\n\n private:\n  std::set<char32> 
collected_set_;\n};\n\n//======================================================================\n// Declarations of the sample character sets below\n// (to test the DECLARE_CHAR_PROPERTY() macro)\n//\n\nDECLARE_CHAR_PROPERTY(test_digit);\nDECLARE_CHAR_PROPERTY(test_wavy_dash);\nDECLARE_CHAR_PROPERTY(test_digit_or_wavy_dash);\nDECLARE_CHAR_PROPERTY(test_punctuation_plus);\n\n//======================================================================\n// Definitions of sample character sets\n//\n\n// Digits.\nDEFINE_CHAR_PROPERTY_AS_SET(test_digit,\n  RANGE('0', '9'),\n)\n\n// Wavy dashes.\nDEFINE_CHAR_PROPERTY_AS_SET(test_wavy_dash,\n  '~',\n  0x301C,  // wave dash\n  0x3030,  // wavy dash\n)\n\n// Digits or wavy dashes.\nDEFINE_CHAR_PROPERTY(test_digit_or_wavy_dash, prop) {\n  prop->AddCharProperty(\"test_digit\");\n  prop->AddCharProperty(\"test_wavy_dash\");\n}\n\n// Punctuation plus a few extraneous chars.\nDEFINE_CHAR_PROPERTY(test_punctuation_plus, prop) {\n  prop->AddChar('a');\n  prop->AddCharRange('b', 'b');\n  prop->AddCharRange('c', 'e');\n  static const int kUnicodes[] = {'f', RANGE('g', 'i'), 'j'};\n  prop->AddCharSpec(kUnicodes, arraysize(kUnicodes));\n  prop->AddCharProperty(\"punctuation\");\n}\n\n//====================================================================\n// Another form of the character sets above -- for verification\n//\n\nconst char32 kTestDigit[] = {\n  '0', '1', '2', '3', '4', '5', '6', '7', '8', '9'\n};\n\nconst char32 kTestWavyDash[] = {\n  '~',\n  0x301C,  // wave dash,\n  0x3030,  // wavy dash\n};\n\nconst char32 kTestPunctuationPlusExtras[] = {\n  'a',\n  'b',\n  'c',\n  'd',\n  'e',\n  'f',\n  'g',\n  'h',\n  'i',\n  'j',\n};\n\n// ====================================================================\n// Tests\n//\n\nTEST_F(CharPropertiesTest, TestDigit) {\n  CollectArray(kTestDigit, arraysize(kTestDigit));\n  ExpectCharPropertyEqualsCollectedSet(\"test_digit\");\n}\n\nTEST_F(CharPropertiesTest, TestWavyDash) {\n  
CollectArray(kTestWavyDash, arraysize(kTestWavyDash));\n  ExpectCharPropertyEqualsCollectedSet(\"test_wavy_dash\");\n}\n\nTEST_F(CharPropertiesTest, TestDigitOrWavyDash) {\n  CollectArray(kTestDigit, arraysize(kTestDigit));\n  CollectArray(kTestWavyDash, arraysize(kTestWavyDash));\n  ExpectCharPropertyEqualsCollectedSet(\"test_digit_or_wavy_dash\");\n}\n\nTEST_F(CharPropertiesTest, TestPunctuationPlus) {\n  CollectCharProperty(\"punctuation\");\n  CollectArray(kTestPunctuationPlusExtras,\n               arraysize(kTestPunctuationPlusExtras));\n  ExpectCharPropertyEqualsCollectedSet(\"test_punctuation_plus\");\n}\n\n// ====================================================================\n// Spot-check predicates in char_properties.cc\n//\n\nTEST_F(CharPropertiesTest, StartSentencePunc) {\n  CollectChars({0x00A1, 0x00BF});\n  ExpectCharPropertyContainsCollectedSet(\"start_sentence_punc\");\n}\n\nTEST_F(CharPropertiesTest, EndSentencePunc) {\n  CollectChars({'.', '!', '?'});\n  ExpectCharPropertyContainsCollectedSet(\"end_sentence_punc\");\n}\n\nTEST_F(CharPropertiesTest, OpenExprPunc) {\n  CollectChars({'(', '['});\n  ExpectCharPropertyContainsCollectedSet(\"open_expr_punc\");\n}\n\nTEST_F(CharPropertiesTest, CloseExprPunc) {\n  CollectChars({')', ']'});\n  ExpectCharPropertyContainsCollectedSet(\"close_expr_punc\");\n}\n\nTEST_F(CharPropertiesTest, OpenQuote) {\n  CollectChars({'\\'', '\"'});\n  ExpectCharPropertyContainsCollectedSet(\"open_quote\");\n}\n\nTEST_F(CharPropertiesTest, CloseQuote) {\n  CollectChars({'\\'', '\"'});\n  ExpectCharPropertyContainsCollectedSet(\"close_quote\");\n}\n\nTEST_F(CharPropertiesTest, OpenBookquote) {\n  CollectChars({0x300A});\n  ExpectCharPropertyContainsCollectedSet(\"open_bookquote\");\n}\n\nTEST_F(CharPropertiesTest, CloseBookquote) {\n  CollectChars({0x300B});\n  ExpectCharPropertyContainsCollectedSet(\"close_bookquote\");\n}\n\nTEST_F(CharPropertiesTest, OpenPunc) {\n  CollectChars({'(', '['});\n  CollectChars({'\\'', 
'\"'});\n  ExpectCharPropertyContainsCollectedSet(\"open_punc\");\n}\n\nTEST_F(CharPropertiesTest, ClosePunc) {\n  CollectChars({')', ']'});\n  CollectChars({'\\'', '\"'});\n  ExpectCharPropertyContainsCollectedSet(\"close_punc\");\n}\n\nTEST_F(CharPropertiesTest, LeadingSentencePunc) {\n  CollectChars({'(', '['});\n  CollectChars({'\\'', '\"'});\n  CollectChars({0x00A1, 0x00BF});\n  ExpectCharPropertyContainsCollectedSet(\"leading_sentence_punc\");\n}\n\nTEST_F(CharPropertiesTest, TrailingSentencePunc) {\n  CollectChars({')', ']'});\n  CollectChars({'\\'', '\"'});\n  CollectChars({'.', '!', '?'});\n  ExpectCharPropertyContainsCollectedSet(\"trailing_sentence_punc\");\n}\n\nTEST_F(CharPropertiesTest, NoncurrencyTokenPrefixSymbol) {\n  CollectChars({'#'});\n  ExpectCharPropertyContainsCollectedSet(\"noncurrency_token_prefix_symbol\");\n}\n\nTEST_F(CharPropertiesTest, TokenSuffixSymbol) {\n  CollectChars({'%', 0x2122, 0x00A9, 0x00B0});\n  ExpectCharPropertyContainsCollectedSet(\"token_suffix_symbol\");\n}\n\nTEST_F(CharPropertiesTest, TokenPrefixSymbol) {\n  CollectChars({'#'});\n  CollectChars({'$', 0x00A5, 0x20AC});\n  ExpectCharPropertyContainsCollectedSet(\"token_prefix_symbol\");\n}\n\nTEST_F(CharPropertiesTest, SubscriptSymbol) {\n  CollectChars({0x2082, 0x2083});\n  ExpectCharPropertyContainsCollectedSet(\"subscript_symbol\");\n}\n\nTEST_F(CharPropertiesTest, SuperscriptSymbol) {\n  CollectChars({0x00B2, 0x00B3});\n  ExpectCharPropertyContainsCollectedSet(\"superscript_symbol\");\n}\n\nTEST_F(CharPropertiesTest, CurrencySymbol) {\n  CollectChars({'$', 0x00A5, 0x20AC});\n  ExpectCharPropertyContainsCollectedSet(\"currency_symbol\");\n}\n\nTEST_F(CharPropertiesTest, DirectionalFormattingCode) {\n  CollectChars({0x200E, 0x200F, 0x202A, 0x202B, 0x202C, 0x202D, 0x202E});\n  ExpectCharPropertyContainsCollectedSet(\"directional_formatting_code\");\n}\n\nTEST_F(CharPropertiesTest, Punctuation) {\n  CollectAsciiPredicate(ispunct);\n  
ExpectCharPropertyContainsCollectedSet(\"punctuation\");\n}\n\nTEST_F(CharPropertiesTest, Separator) {\n  CollectAsciiPredicate(isspace);\n  ExpectCharPropertyContainsCollectedSet(\"separator\");\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/conll2tree.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\"\"\"A program to generate ASCII trees from conll files.\"\"\"\n\nimport collections\nimport re\n\nimport asciitree\nimport tensorflow as tf\n\nimport syntaxnet.load_parser_ops\n\nfrom tensorflow.python.platform import tf_logging as logging\nfrom syntaxnet import sentence_pb2\nfrom syntaxnet.ops import gen_parser_ops\n\nflags = tf.app.flags\nFLAGS = flags.FLAGS\n\nflags.DEFINE_string('task_context',\n                    'syntaxnet/models/parsey_mcparseface/context.pbtxt',\n                    'Path to a task context with inputs and parameters for '\n                    'feature extractors.')\nflags.DEFINE_string('corpus_name', 'stdin-conll',\n                    'Path to a task context with inputs and parameters for '\n                    'feature extractors.')\n\n\ndef to_dict(sentence):\n  \"\"\"Builds a dictionary representing the parse tree of a sentence.\n\n     Note that the suffix \"@id\" (where 'id' is a number) is appended to each\n     element to handle the sentence that has multiple elements with identical\n     representation. 
Those suffixes need to be removed after the asciitree is\n     rendered.\n\n  Args:\n    sentence: Sentence protocol buffer to represent.\n  Returns:\n    Dictionary mapping tokens to children.\n  \"\"\"\n  token_str = list()\n  children = [[] for token in sentence.token]\n  root = -1\n  for i in range(0, len(sentence.token)):\n    token = sentence.token[i]\n    token_str.append('%s %s %s @%d' %\n                     (token.word, token.tag, token.label, (i+1)))\n    if token.head == -1:\n      root = i\n    else:\n      children[token.head].append(i)\n\n  def _get_dict(i):\n    d = collections.OrderedDict()\n    for c in children[i]:\n      d[token_str[c]] = _get_dict(c)\n    return d\n\n  tree = collections.OrderedDict()\n  tree[token_str[root]] = _get_dict(root)\n  return tree\n\n\ndef main(unused_argv):\n  logging.set_verbosity(logging.INFO)\n  with tf.Session() as sess:\n    src = gen_parser_ops.document_source(batch_size=32,\n                                         corpus_name=FLAGS.corpus_name,\n                                         task_context=FLAGS.task_context)\n    sentence = sentence_pb2.Sentence()\n    while True:\n      documents, finished = sess.run(src)\n      logging.info('Read %d documents', len(documents))\n      for d in documents:\n        sentence.ParseFromString(d)\n        tr = asciitree.LeftAligned()\n        d = to_dict(sentence)\n        print 'Input: %s' % sentence.text\n        print 'Parse:'\n        tr_str = tr(d)\n        pat = re.compile(r'\\s*@\\d+$')\n        for tr_ln in tr_str.splitlines():\n          print pat.sub('', tr_ln)\n\n      if finished:\n        break\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/context.pbtxt",
    "content": "Parameter {\n  name: 'brain_parser_embedding_dims'\n  value: '64;32;32'\n}\nParameter {\n  name: 'brain_parser_features'\n  value: 'input.word input(1).word input(2).word input(3).word stack.word stack(1).word stack(2).word stack(3).word stack.child(1).word stack.child(1).sibling(-1).word stack.child(-1).word stack.child(-1).sibling(1).word stack(1).child(1).word stack(1).child(1).sibling(-1).word stack(1).child(-1).word stack(1).child(-1).sibling(1).word stack.child(2).word stack.child(-2).word stack(1).child(2).word stack(1).child(-2).word;input.tag input(1).tag input(2).tag input(3).tag stack.tag stack(1).tag stack(2).tag stack(3).tag stack.child(1).tag stack.child(1).sibling(-1).tag stack.child(-1).tag stack.child(-1).sibling(1).tag stack(1).child(1).tag stack(1).child(1).sibling(-1).tag stack(1).child(-1).tag stack(1).child(-1).sibling(1).tag stack.child(2).tag stack.child(-2).tag stack(1).child(2).tag stack(1).child(-2).tag;stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label stack(1).child(1).label stack(1).child(1).sibling(-1).label stack(1).child(-1).label stack(1).child(-1).sibling(1).label stack.child(2).label stack.child(-2).label stack(1).child(2).label stack(1).child(-2).label'\n}\nParameter {\n  name: 'brain_parser_embedding_names'\n  value: 'words;tags;labels'\n}\nParameter {\n  name: 'brain_parser_scoring'\n  value: 'default'\n}\nParameter {\n  name: 'brain_pos_transition_system'\n  value: 'tagger'\n}\nParameter {\n  name: 'brain_pos_embedding_dims'\n  value: '64;4;8;8'\n}\nParameter {\n  name: 'brain_pos_features'\n  value: 'stack(3).word stack(2).word stack(1).word stack.word input.word input(1).word input(2).word input(3).word;input.digit input.hyphen;stack.suffix(length=2) input.suffix(length=2) input(1).suffix(length=2);stack.prefix(length=2) input.prefix(length=2) input(1).prefix(length=2)'\n}\nParameter {\n  name: 'brain_pos_embedding_names'\n  value: 
'words;other;suffix;prefix'\n}\ninput {\n  name: 'training-corpus'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '<your-dataset>/treebank-train.trees.conll'\n  }\n}\ninput {\n  name: 'tuning-corpus'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '<your-dataset>/dev.conll'\n  }\n}\ninput {\n  name: 'dev-corpus'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '<your-dataset>/test.conll'\n  }\n}\ninput {\n  name: 'tagged-training-corpus'\n  creator: 'brain_pos/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'tagged-tuning-corpus'\n  creator: 'brain_pos/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'tagged-dev-corpus'\n  creator: 'brain_pos/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'label-map'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'word-map'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'lcword-map'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'tag-map'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'category-map'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'char-map'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'prefix-table'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'suffix-table'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'tag-to-category'\n  creator: 'brain_pos/greedy'\n}\ninput {\n  name: 'projectivized-training-corpus'\n  creator: 'brain_parser/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'parsed-training-corpus'\n  creator: 'brain_parser/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'parsed-tuning-corpus'\n  creator: 'brain_parser/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'parsed-dev-corpus'\n  creator: 'brain_parser/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'beam-parsed-training-corpus'\n  creator: 'brain_parser/structured'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'beam-parsed-tuning-corpus'\n  
creator: 'brain_parser/structured'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'beam-parsed-dev-corpus'\n  creator: 'brain_parser/structured'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'stdin'\n  record_format: 'english-text'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdin-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdout-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/demo.sh",
    "content": "#!/bin/bash\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# A script that runs a tokenizer, a part-of-speech tagger and a dependency\n# parser on an English text file, with one sentence per line.\n#\n# Example usage:\n#  echo \"Parsey McParseface is my favorite parser!\" | syntaxnet/demo.sh\n\n# To run on a conll formatted file, add the --conll command line argument.\n#\n\nPARSER_EVAL=bazel-bin/syntaxnet/parser_eval\nMODEL_DIR=syntaxnet/models/parsey_mcparseface\n[[ \"$1\" == \"--conll\" ]] && INPUT_FORMAT=stdin-conll || INPUT_FORMAT=stdin\n\n$PARSER_EVAL \\\n  --input=$INPUT_FORMAT \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=64 \\\n  --arg_prefix=brain_tagger \\\n  --graph_builder=structured \\\n  --task_context=$MODEL_DIR/context.pbtxt \\\n  --model_path=$MODEL_DIR/tagger-params \\\n  --slim_model \\\n  --batch_size=1024 \\\n  --alsologtostderr \\\n   | \\\n  $PARSER_EVAL \\\n  --input=stdin-conll \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=512,512 \\\n  --arg_prefix=brain_parser \\\n  --graph_builder=structured \\\n  --task_context=$MODEL_DIR/context.pbtxt \\\n  --model_path=$MODEL_DIR/parser-params \\\n  --slim_model \\\n  --batch_size=1024 \\\n  --alsologtostderr \\\n  | \\\n  bazel-bin/syntaxnet/conll2tree \\\n  --task_context=$MODEL_DIR/context.pbtxt \\\n  --alsologtostderr\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/dictionary.proto",
    "content": "// Protocol buffers for serializing string<=>index dictionaries.\n\nsyntax = \"proto2\";\n\npackage syntaxnet;\n\n// Serializable representation of a string=>string pair.\nmessage StringToStringPair {\n  // String representing the key.\n  required string key = 1;\n\n  // String representing the value.\n  required string value = 2;\n}\n\n// Serializable representation of a string=>string mapping.\nmessage StringToStringMap {\n  // Key=>value pairs.\n  repeated StringToStringPair pair = 1;\n}\n\n// Affix table entry, for serialization of the affix tables.\nmessage AffixTableEntry {\n  // Nested message for serializing a single affix.\n  message AffixEntry {\n    // The affix as a string.\n    required string form = 1;\n\n    // The length of the affix (this is non-trivial to compute due to UTF-8).\n    required int32 length = 2;\n\n    // The ID of the affix that is one character shorter, or -1 if none exists.\n    required int32 shorter_id = 3;\n  }\n\n  // The type of affix table, as a string.\n  required string type = 1;\n\n  // The maximum affix length.\n  required int32 max_length = 2;\n\n  // The list of affixes, in order of affix ID.\n  repeated AffixEntry affix = 3;\n}\n\n// A light-weight proto to store vectors in binary format.\nmessage TokenEmbedding {\n  required bytes token = 1;  // can be word or phrase, or URL, etc.\n\n  // If available, raw count of this token in the training corpus.\n  optional int64 count = 3;\n\n  message Vector {\n    repeated float values = 1 [packed = true];\n  }\n  optional Vector vector = 2;\n};\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/document_filters.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Various utilities for handling documents.\n\n#include <stddef.h>\n#include <algorithm>\n#include <memory>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/base.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/utils.h\"\n#include \"third_party/eigen3/unsupported/Eigen/CXX11/Tensor\"\n#include \"tensorflow/core/framework/op_kernel.h\"\n#include \"tensorflow/core/framework/tensor.h\"\n#include \"tensorflow/core/framework/tensor_shape.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n\nusing tensorflow::DEVICE_CPU;\nusing tensorflow::OpKernel;\nusing tensorflow::OpKernelConstruction;\nusing tensorflow::OpKernelContext;\nusing tensorflow::Tensor;\nusing tensorflow::TensorShape;\nusing tensorflow::errors::InvalidArgument;\n\nnamespace syntaxnet {\n\nnamespace {\n\nvoid GetTaskContext(OpKernelConstruction *context, TaskContext *task_context) {\n  string file_path, data;\n  OP_REQUIRES_OK(context, context->GetAttr(\"task_context\", &file_path));\n  OP_REQUIRES_OK(\n      context, ReadFileToString(tensorflow::Env::Default(), file_path, &data));\n  OP_REQUIRES(context,\n              TextFormat::ParseFromString(data, task_context->mutable_spec()),\n              InvalidArgument(\"Could not parse task context at \", 
file_path));\n}\n\n// Outputs the given batch of sentences as a tensor and deletes them.\nvoid OutputDocuments(OpKernelContext *context,\n                     vector<Sentence *> *document_batch) {\n  const int64 size = document_batch->size();\n  Tensor *output;\n  OP_REQUIRES_OK(context,\n                 context->allocate_output(0, TensorShape({size}), &output));\n  for (int64 i = 0; i < size; ++i) {\n    output->vec<string>()(i) = (*document_batch)[i]->SerializeAsString();\n  }\n  utils::STLDeleteElements(document_batch);\n}\n\n}  // namespace\n\nclass DocumentSource : public OpKernel {\n public:\n  explicit DocumentSource(OpKernelConstruction *context) : OpKernel(context) {\n    GetTaskContext(context, &task_context_);\n    string corpus_name;\n    OP_REQUIRES_OK(context, context->GetAttr(\"corpus_name\", &corpus_name));\n    OP_REQUIRES_OK(context, context->GetAttr(\"batch_size\", &batch_size_));\n    OP_REQUIRES(context, batch_size_ > 0,\n                InvalidArgument(\"invalid batch_size provided\"));\n    corpus_.reset(\n        new TextReader(*task_context_.GetInput(corpus_name), &task_context_));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    mutex_lock lock(mu_);\n    Sentence *document;\n    vector<Sentence *> document_batch;\n    while ((document = corpus_->Read()) != nullptr) {\n      document_batch.push_back(document);\n      if (static_cast<int>(document_batch.size()) == batch_size_) {\n        OutputDocuments(context, &document_batch);\n        OutputLast(context, false);\n        return;\n      }\n    }\n    OutputDocuments(context, &document_batch);\n    OutputLast(context, true);\n  }\n\n private:\n  void OutputLast(OpKernelContext *context, bool last) {\n    Tensor *output;\n    OP_REQUIRES_OK(context,\n                   context->allocate_output(1, TensorShape({}), &output));\n    output->scalar<bool>()() = last;\n  }\n\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  // mutex to synchronize 
access to Compute.\n  mutex mu_;\n\n  std::unique_ptr<TextReader> corpus_;\n  string documents_path_;\n  int batch_size_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"DocumentSource\").Device(DEVICE_CPU),\n                        DocumentSource);\n\nclass DocumentSink : public OpKernel {\n public:\n  explicit DocumentSink(OpKernelConstruction *context) : OpKernel(context) {\n    GetTaskContext(context, &task_context_);\n    string corpus_name;\n    OP_REQUIRES_OK(context, context->GetAttr(\"corpus_name\", &corpus_name));\n    writer_.reset(\n        new TextWriter(*task_context_.GetInput(corpus_name), &task_context_));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    mutex_lock lock(mu_);\n    auto documents = context->input(0).vec<string>();\n    for (int i = 0; i < documents.size(); ++i) {\n      Sentence document;\n      OP_REQUIRES(context, document.ParseFromString(documents(i)),\n                  InvalidArgument(\"failed to parse sentence\"));\n      writer_->Write(document);\n    }\n  }\n\n private:\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  // mutex to synchronize access to Compute.\n  mutex mu_;\n\n  string documents_path_;\n  std::unique_ptr<TextWriter> writer_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"DocumentSink\").Device(DEVICE_CPU),\n                        DocumentSink);\n\n// Sentence filter for filtering out documents where the parse trees are not\n// well-formed, i.e. 
they contain cycles.\nclass WellFormedFilter : public OpKernel {\n public:\n  explicit WellFormedFilter(OpKernelConstruction *context) : OpKernel(context) {\n    GetTaskContext(context, &task_context_);\n    OP_REQUIRES_OK(context, context->GetAttr(\"keep_malformed_documents\",\n                                             &keep_malformed_));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    auto documents = context->input(0).vec<string>();\n    vector<Sentence *> output_documents;\n    for (int i = 0; i < documents.size(); ++i) {\n      Sentence *document = new Sentence;\n      OP_REQUIRES(context, document->ParseFromString(documents(i)),\n                  InvalidArgument(\"failed to parse sentence\"));\n      if (ShouldKeep(*document)) {\n        output_documents.push_back(document);\n      } else {\n        delete document;\n      }\n    }\n    OutputDocuments(context, &output_documents);\n  }\n\n private:\n  bool ShouldKeep(const Sentence &doc)  {\n    vector<int> visited(doc.token_size(), -1);\n    for (int i = 0; i < doc.token_size(); ++i) {\n      // Already visited node.\n      if (visited[i] != -1) continue;\n      int t = i;\n      while (t != -1) {\n        if (visited[t] == -1) {\n          // If it is not visited yet, mark it.\n          visited[t] = i;\n        } else if (visited[t] < i) {\n          // If the index number is smaller than index and not -1, the token has\n          // already been visited.\n          break;\n        } else {\n          // Loop detected.\n          LOG(ERROR) << \"Loop detected in document \" << doc.DebugString();\n          return keep_malformed_;\n        }\n        t = doc.token(t).head();\n      }\n    }\n    return true;\n  }\n\n private:\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  bool keep_malformed_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"WellFormedFilter\").Device(DEVICE_CPU),\n                        WellFormedFilter);\n\n// Sentence filter that modifies 
dependency trees to make them projective. This\n// could be made more efficient by looping over sentences instead of the entire\n// document. Assumes that the document is well-formed in the sense of having\n// no looping dependencies.\n//\n// Task arguments:\n//   bool discard_non_projective (false) : If true, discards documents with\n//     non-projective trees instead of projectivizing them.\nclass ProjectivizeFilter : public OpKernel {\n public:\n  explicit ProjectivizeFilter(OpKernelConstruction *context)\n      : OpKernel(context) {\n    GetTaskContext(context, &task_context_);\n    OP_REQUIRES_OK(context, context->GetAttr(\"discard_non_projective\",\n                                             &discard_non_projective_));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    auto documents = context->input(0).vec<string>();\n    vector<Sentence *> output_documents;\n    for (int i = 0; i < documents.size(); ++i) {\n      Sentence *document = new Sentence;\n      OP_REQUIRES(context, document->ParseFromString(documents(i)),\n                  InvalidArgument(\"failed to parse sentence\"));\n      if (Process(document)) {\n        output_documents.push_back(document);\n      } else {\n        delete document;\n      }\n    }\n    OutputDocuments(context, &output_documents);\n  }\n\n  bool Process(Sentence *doc) {\n    const int num_tokens = doc->token_size();\n\n    // Left and right boundaries for arcs. The left and right ends of an arc are\n    // bounded by the arcs that pass over it. 
If an arc exceeds these bounds it\n    // will cross an arc passing over it, making it a non-projective arc.\n    vector<int> left(num_tokens);\n    vector<int> right(num_tokens);\n\n    // Lift the shortest non-projective arc until the document is projective.\n    while (true) {\n      // Initialize boundaries to the whole document for all arcs.\n      for (int i = 0; i < num_tokens; ++i) {\n        left[i] = -1;\n        right[i] = num_tokens - 1;\n      }\n\n      // Find left and right bounds for each token.\n      for (int i = 0; i < num_tokens; ++i) {\n        int head_index = doc->token(i).head();\n\n        // Find left and right end of arc.\n        int l = std::min(i, head_index);\n        int r = std::max(i, head_index);\n\n        // Bound all tokens under the arc.\n        for (int j = l + 1; j < r; ++j) {\n          if (left[j] < l) left[j] = l;\n          if (right[j] > r) right[j] = r;\n        }\n      }\n\n      // Find deepest non-projective arc.\n      int deepest_arc = -1;\n      int max_depth = -1;\n\n      // The non-projective arcs are those that exceed their bounds.\n      for (int i = 0; i < num_tokens; ++i) {\n        int head_index = doc->token(i).head();\n        if (head_index == -1) continue;  // any crossing arc must be deeper\n\n        int l = std::min(i, head_index);\n        int r = std::max(i, head_index);\n\n        int left_bound = std::max(left[l], left[r]);\n        int right_bound = std::min(right[l], right[r]);\n\n        if (l < left_bound || r > right_bound) {\n          // Found non-projective arc.\n          if (discard_non_projective_) return false;\n\n          // Pick the deepest as the best candidate for lifting.\n          int depth = 0;\n          int j = i;\n          while (j != -1) {\n            ++depth;\n            j = doc->token(j).head();\n          }\n          if (depth > max_depth) {\n            deepest_arc = i;\n            max_depth = depth;\n          }\n        }\n      }\n\n      // If there are 
no more non-projective arcs we are done.\n      if (deepest_arc == -1) return true;\n\n      // Lift non-projective arc.\n      int lifted_head = doc->token(doc->token(deepest_arc).head()).head();\n      doc->mutable_token(deepest_arc)->set_head(lifted_head);\n    }\n  }\n\n private:\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  // Whether or not to throw away non-projective documents.\n  bool discard_non_projective_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"ProjectivizeFilter\").Device(DEVICE_CPU),\n                        ProjectivizeFilter);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/document_format.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/document_format.h\"\n\nnamespace syntaxnet {\n\n// Component registry for document formatters.\nREGISTER_CLASS_REGISTRY(\"document format\", DocumentFormat);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/document_format.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// An interface for document formats.\n\n#ifndef SYNTAXNET_DOCUMENT_FORMAT_H__\n#define SYNTAXNET_DOCUMENT_FORMAT_H__\n\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/registry.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"tensorflow/core/lib/io/buffered_inputstream.h\"\n\nnamespace syntaxnet {\n\n// A document format component converts a key/value pair from a record to one or\n// more documents. The record format is used for selecting the document format\n// component. 
A document format component can be registered with the\n// REGISTER_DOCUMENT_FORMAT macro.\nclass DocumentFormat : public RegisterableClass<DocumentFormat> {\n public:\n  DocumentFormat() {}\n  virtual ~DocumentFormat() {}\n\n  virtual void Setup(TaskContext *context) {}\n\n  // Reads a record from the given input buffer with format specific logic.\n  // Returns false if no record could be read because we reached end of file.\n  virtual bool ReadRecord(tensorflow::io::BufferedInputStream *buffer,\n                          string *record) = 0;\n\n  // Converts a key/value pair to one or more documents.\n  virtual void ConvertFromString(const string &key, const string &value,\n                                 vector<Sentence *> *documents) = 0;\n\n  // Converts a document to a key/value pair.\n  virtual void ConvertToString(const Sentence &document,\n                               string *key, string *value) = 0;\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(DocumentFormat);\n};\n\n#define REGISTER_DOCUMENT_FORMAT(type, component) \\\n  REGISTER_CLASS_COMPONENT(DocumentFormat, type, component)\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_DOCUMENT_FORMAT_H__\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/embedding_feature_extractor.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/embedding_feature_extractor.h\"\n\n#include <vector>\n\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/parser_features.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\nvoid GenericEmbeddingFeatureExtractor::Setup(TaskContext *context) {\n  // Don't use version to determine how to get feature FML.\n  const string features = context->Get(\n      tensorflow::strings::StrCat(ArgPrefix(), \"_\", \"features\"), \"\");\n  const string embedding_names =\n      context->Get(GetParamName(\"embedding_names\"), \"\");\n  const string embedding_dims =\n      context->Get(GetParamName(\"embedding_dims\"), \"\");\n  LOG(INFO) << \"Features: \" << features;\n  LOG(INFO) << \"Embedding names: \" << embedding_names;\n  LOG(INFO) << \"Embedding dims: \" << embedding_dims;\n  embedding_fml_ = utils::Split(features, ';');\n  add_strings_ = context->Get(GetParamName(\"add_varlen_strings\"), false);\n  embedding_names_ = utils::Split(embedding_names, ';');\n  for (const string &dim : utils::Split(embedding_dims, ';')) {\n    embedding_dims_.push_back(utils::ParseUsing<int>(dim, utils::ParseInt32));\n  }\n}\n\nvoid GenericEmbeddingFeatureExtractor::Init(TaskContext *context) {\n}\n\nvector<vector<SparseFeatures>> 
GenericEmbeddingFeatureExtractor::ConvertExample(\n    const vector<FeatureVector> &feature_vectors) const {\n  // Extract the features.\n  vector<vector<SparseFeatures>> sparse_features(feature_vectors.size());\n  for (size_t i = 0; i < feature_vectors.size(); ++i) {\n    // Convert the nlp_parser::FeatureVector to dist belief format.\n    sparse_features[i] =\n        vector<SparseFeatures>(generic_feature_extractor(i).feature_types());\n\n    for (int j = 0; j < feature_vectors[i].size(); ++j) {\n      const FeatureType &feature_type = *feature_vectors[i].type(j);\n      const FeatureValue value = feature_vectors[i].value(j);\n      const bool is_continuous = feature_type.name().find(\"continuous\") == 0;\n      const int64 id = is_continuous ? FloatFeatureValue(value).id : value;\n      const int base = feature_type.base();\n      if (id >= 0) {\n        sparse_features[i][base].add_id(id);\n        if (is_continuous) {\n          sparse_features[i][base].add_weight(FloatFeatureValue(value).weight);\n        }\n        if (add_strings_) {\n          sparse_features[i][base].add_description(tensorflow::strings::StrCat(\n              feature_type.name(), \"=\", feature_type.GetFeatureValueName(id)));\n        }\n      }\n    }\n  }\n\n  return sparse_features;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/embedding_feature_extractor.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_EMBEDDING_FEATURE_EXTRACTOR_H_\n#define SYNTAXNET_EMBEDDING_FEATURE_EXTRACTOR_H_\n\n#include <functional>\n#include <memory>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/feature_types.h\"\n#include \"syntaxnet/parser_features.h\"\n#include \"syntaxnet/sentence_features.h\"\n#include \"syntaxnet/sparse.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\n// An EmbeddingFeatureExtractor manages the extraction of features for\n// embedding-based models. It wraps a sequence of underlying classes of feature\n// extractors, along with associated predicate maps. 
Each class of feature\n// extractors is associated with a name, e.g., \"words\", \"labels\", \"tags\".\n//\n// The class is split between a generic abstract version,\n// GenericEmbeddingFeatureExtractor (that can be initialized without knowing the\n// signature of the ExtractFeatures method) and a typed version.\n//\n// The predicate maps must be initialized before use: they can be loaded using\n// Read() or updated via UpdateMapsForExample.\nclass GenericEmbeddingFeatureExtractor {\n public:\n  virtual ~GenericEmbeddingFeatureExtractor() {}\n\n  // Get the prefix string to put in front of all arguments, so they don't\n  // conflict with other embedding models.\n  virtual const string ArgPrefix() const = 0;\n\n  // Sets up predicate maps and embedding space names that are common for all\n  // embedding based feature extractors.\n  virtual void Setup(TaskContext *context);\n  virtual void Init(TaskContext *context);\n\n  // Requests workspace for the underlying feature extractors. This is\n  // implemented in the typed class.\n  virtual void RequestWorkspaces(WorkspaceRegistry *registry) = 0;\n\n  // Number of predicates for the embedding at a given index (vocabulary size.)\n  int EmbeddingSize(int index) const {\n    return generic_feature_extractor(index).GetDomainSize();\n  }\n\n  // Returns number of embedding spaces.\n  int NumEmbeddings() const { return embedding_dims_.size(); }\n\n  // Returns the number of features in the embedding space.\n  const int FeatureSize(int idx) const {\n    return generic_feature_extractor(idx).feature_types();\n  }\n\n  // Returns the dimensionality of the embedding space.\n  int EmbeddingDims(int index) const { return embedding_dims_[index]; }\n\n  // Accessor for embedding dims (dimensions of the embedding spaces).\n  const vector<int> &embedding_dims() const { return embedding_dims_; }\n\n  const vector<string> &embedding_fml() const { return embedding_fml_; }\n\n  // Get parameter name by concatenating the prefix and the 
original name.\n  string GetParamName(const string &param_name) const {\n    return tensorflow::strings::StrCat(ArgPrefix(), \"_\", param_name);\n  }\n\n protected:\n  // Provides the generic class with access to the templated extractors. This is\n  // used to get the type information out of the feature extractor without\n  // knowing the specific calling arguments of the extractor itself.\n  virtual const GenericFeatureExtractor &generic_feature_extractor(\n      int idx) const = 0;\n\n  // Converts a vector of extracted features into\n  // dist_belief::SparseFeatures. Each feature in each feature vector becomes a\n  // single SparseFeatures. The predicates are mapped through map_fn which\n  // should point to either mutable_map_fn or const_map_fn depending on whether\n  // or not the predicate maps should be updated.\n  vector<vector<SparseFeatures>> ConvertExample(\n      const vector<FeatureVector> &feature_vectors) const;\n\n private:\n  // Embedding space names for parameter sharing.\n  vector<string> embedding_names_;\n\n  // FML strings for each feature extractor.\n  vector<string> embedding_fml_;\n\n  // Size of each of the embedding spaces (maximum predicate id).\n  vector<int> embedding_sizes_;\n\n  // Embedding dimensions of the embedding spaces (i.e. 32, 64 etc.)\n  vector<int> embedding_dims_;\n\n  // Whether or not to add string descriptions to converted examples.\n  bool add_strings_;\n};\n\n// Templated, object-specific implementation of the\n// EmbeddingFeatureExtractor. EXTRACTOR should be a FeatureExtractor<OBJ,\n// ARGS...> class that has the appropriate FeatureTraits() to ensure that\n// locator type features work.\n//\n// Note: for backwards compatibility purposes, this always reads the FML spec\n// from \"<prefix>_features\".\ntemplate <class EXTRACTOR, class OBJ, class... 
ARGS>\nclass EmbeddingFeatureExtractor : public GenericEmbeddingFeatureExtractor {\n public:\n  // Sets up all predicate maps, feature extractors, and flags.\n  void Setup(TaskContext *context) override {\n    GenericEmbeddingFeatureExtractor::Setup(context);\n    feature_extractors_.resize(embedding_fml().size());\n    for (int i = 0; i < embedding_fml().size(); ++i) {\n      feature_extractors_[i].Parse(embedding_fml()[i]);\n      feature_extractors_[i].Setup(context);\n    }\n  }\n\n  // Initializes resources needed by the feature extractors.\n  void Init(TaskContext *context) override {\n    GenericEmbeddingFeatureExtractor::Init(context);\n    for (auto &feature_extractor : feature_extractors_) {\n      feature_extractor.Init(context);\n    }\n  }\n\n  // Requests workspaces from the registry. Must be called after Init(), and\n  // before Preprocess().\n  void RequestWorkspaces(WorkspaceRegistry *registry) override {\n    for (auto &feature_extractor : feature_extractors_) {\n      feature_extractor.RequestWorkspaces(registry);\n    }\n  }\n\n  // Must be called on the object one state for each sentence, before any\n  // feature extraction (e.g., UpdateMapsForExample, ExtractSparseFeatures).\n  void Preprocess(WorkspaceSet *workspaces, OBJ *obj) const {\n    for (auto &feature_extractor : feature_extractors_) {\n      feature_extractor.Preprocess(workspaces, obj);\n    }\n  }\n\n  // Returns a ragged array of SparseFeatures, for 1) each feature extractor\n  // class e, and 2) each feature f extracted by e. Underlying predicate maps\n  // will not be updated and so unrecognized predicates may occur. In such a\n  // case the SparseFeatures object associated with a given extractor class and\n  // feature will be empty.\n  vector<vector<SparseFeatures>> ExtractSparseFeatures(\n      const WorkspaceSet &workspaces, const OBJ &obj, ARGS... 
args) const {\n    vector<FeatureVector> features(feature_extractors_.size());\n    ExtractFeatures(workspaces, obj, args..., &features);\n    return ConvertExample(features);\n  }\n\n  // Extracts features using the extractors. Note that features must already\n  // be initialized to the correct number of feature extractors. No predicate\n  // mapping is applied.\n  void ExtractFeatures(const WorkspaceSet &workspaces, const OBJ &obj,\n                       ARGS... args,\n                       vector<FeatureVector> *features) const {\n    DCHECK(features != nullptr);\n    DCHECK_EQ(features->size(), feature_extractors_.size());\n    for (int i = 0; i < feature_extractors_.size(); ++i) {\n      (*features)[i].clear();\n      feature_extractors_[i].ExtractFeatures(workspaces, obj, args...,\n                                             &(*features)[i]);\n    }\n  }\n\n protected:\n  // Provides generic access to the feature extractors.\n  const GenericFeatureExtractor &generic_feature_extractor(\n      int idx) const override {\n    DCHECK_LT(idx, feature_extractors_.size());\n    DCHECK_GE(idx, 0);\n    return feature_extractors_[idx];\n  }\n\n private:\n  // Templated feature extractor class.\n  vector<EXTRACTOR> feature_extractors_;\n};\n\nclass ParserEmbeddingFeatureExtractor\n    : public EmbeddingFeatureExtractor<ParserFeatureExtractor, ParserState> {\n public:\n  explicit ParserEmbeddingFeatureExtractor(const string &arg_prefix)\n      : arg_prefix_(arg_prefix) {}\n\n private:\n  const string ArgPrefix() const override { return arg_prefix_; }\n\n  // Prefix for context parameters.\n  string arg_prefix_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_EMBEDDING_FEATURE_EXTRACTOR_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/feature_extractor.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/feature_extractor.h\"\n\n#include \"syntaxnet/feature_types.h\"\n#include \"syntaxnet/fml_parser.h\"\n\nnamespace syntaxnet {\n\nconstexpr FeatureValue GenericFeatureFunction::kNone;\n\nGenericFeatureExtractor::GenericFeatureExtractor() {}\n\nGenericFeatureExtractor::~GenericFeatureExtractor() {}\n\nvoid GenericFeatureExtractor::Parse(const string &source) {\n  // Parse feature specification into descriptor.\n  FMLParser parser;\n  parser.Parse(source, mutable_descriptor());\n\n  // Initialize feature extractor from descriptor.\n  InitializeFeatureFunctions();\n}\n\nvoid GenericFeatureExtractor::InitializeFeatureTypes() {\n  // Register all feature types.\n  GetFeatureTypes(&feature_types_);\n  for (size_t i = 0; i < feature_types_.size(); ++i) {\n    FeatureType *ft = feature_types_[i];\n    ft->set_base(i);\n\n    // Check for feature space overflow.\n    double domain_size = ft->GetDomainSize();\n    if (domain_size < 0) {\n      LOG(FATAL) << \"Illegal domain size for feature \" << ft->name()\n                 << domain_size;\n    }\n  }\n\n  vector<string> types_names;\n  GetFeatureTypeNames(&types_names);\n  CHECK_EQ(feature_types_.size(), types_names.size());\n}\n\nvoid GenericFeatureExtractor::GetFeatureTypeNames(\n    vector<string> *type_names) const {\n  
for (size_t i = 0; i < feature_types_.size(); ++i) {\n    FeatureType *ft = feature_types_[i];\n    type_names->push_back(ft->name());\n  }\n}\n\nFeatureValue GenericFeatureExtractor::GetDomainSize() const {\n  // The domain size of the feature set is the largest domain size of any of\n  // its feature types.\n  FeatureValue max_feature_type_dsize = 0;\n  for (size_t i = 0; i < feature_types_.size(); ++i) {\n    FeatureType *ft = feature_types_[i];\n    const FeatureValue feature_type_dsize = ft->GetDomainSize();\n    if (feature_type_dsize > max_feature_type_dsize) {\n      max_feature_type_dsize = feature_type_dsize;\n    }\n  }\n\n  return max_feature_type_dsize;\n}\n\nstring GenericFeatureFunction::GetParameter(const string &name) const {\n  // Find named parameter in feature descriptor.\n  for (int i = 0; i < descriptor_->parameter_size(); ++i) {\n    if (name == descriptor_->parameter(i).name()) {\n      return descriptor_->parameter(i).value();\n    }\n  }\n  return \"\";\n}\n\nGenericFeatureFunction::GenericFeatureFunction() {}\n\nGenericFeatureFunction::~GenericFeatureFunction() {\n  delete feature_type_;\n}\n\nint GenericFeatureFunction::GetIntParameter(const string &name,\n                                            int default_value) const {\n  string value = GetParameter(name);\n  return utils::ParseUsing<int>(value, default_value,\n                                tensorflow::strings::safe_strto32);\n}\n\nvoid GenericFeatureFunction::GetFeatureTypes(\n    vector<FeatureType *> *types) const {\n  if (feature_type_ != nullptr) types->push_back(feature_type_);\n}\n\nFeatureType *GenericFeatureFunction::GetFeatureType() const {\n  // If a single feature type has been registered return it.\n  if (feature_type_ != nullptr) return feature_type_;\n\n  // Get feature types for function.\n  vector<FeatureType *> types;\n  GetFeatureTypes(&types);\n\n  // If there is exactly one feature type return this, else return null.\n  if 
(types.size() == 1) return types[0];\n  return nullptr;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/feature_extractor.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Generic feature extractor for extracting features from objects. The feature\n// extractor can be used for extracting features from any object. The feature\n// extractor and feature function classes are template classes that have to\n// be instantiated for extracting features from a specific object type.\n//\n// A feature extractor consists of a hierarchy of feature functions. Each\n// feature function extracts one or more feature type and value pairs from the\n// object.\n//\n// The feature extractor has a modular design where new feature functions can be\n// registered as components. The feature extractor is initialized from a\n// descriptor represented by a protocol buffer. The feature extractor can also\n// be initialized from a text-based source specification of the feature\n// extractor. Feature specification parsers can be added as components. By\n// default the feature extractor can be read from an ASCII protocol buffer or in\n// a simple feature modeling language (fml).\n\n// A feature function is invoked with a focus. 
Nested feature functions can be\n// invoked with another focus determined by the parent feature function.\n\n#ifndef SYNTAXNET_FEATURE_EXTRACTOR_H_\n#define SYNTAXNET_FEATURE_EXTRACTOR_H_\n\n#include <memory>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/feature_extractor.pb.h\"\n#include \"syntaxnet/feature_types.h\"\n#include \"syntaxnet/proto_io.h\"\n#include \"syntaxnet/registry.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/core/stringpiece.h\"\n#include \"tensorflow/core/lib/io/record_reader.h\"\n#include \"tensorflow/core/lib/io/record_writer.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/platform/env.h\"\n\nnamespace syntaxnet {\n\n// Use the same type for feature values as is used for predicates.\ntypedef int64 Predicate;\ntypedef Predicate FeatureValue;\n\n// Output feature model in FML format.\nvoid ToFMLFunction(const FeatureFunctionDescriptor &function, string *output);\nvoid ToFML(const FeatureFunctionDescriptor &function, string *output);\n\n// A feature vector contains feature type and value pairs.\nclass FeatureVector {\n public:\n  FeatureVector() {}\n\n  // Adds feature type and value pair to feature vector.\n  void add(FeatureType *type, FeatureValue value) {\n    features_.emplace_back(type, value);\n  }\n\n  // Removes all elements from the feature vector.\n  void clear() { features_.clear(); }\n\n  // Returns the number of elements in the feature vector.\n  int size() const { return features_.size(); }\n\n  // Reserves space in the underlying feature vector.\n  void reserve(int n) { features_.reserve(n); }\n\n  // Returns feature type for an element in the feature vector.\n  FeatureType *type(int index) const { return features_[index].type; }\n\n  // Returns feature value for an element in the feature vector.\n  
FeatureValue value(int index) const { return features_[index].value; }\n\n private:\n  // Structure for holding feature type and value pairs.\n  struct Element {\n    Element() : type(nullptr), value(-1) {}\n    Element(FeatureType *t, FeatureValue v) : type(t), value(v) {}\n\n    FeatureType *type;\n    FeatureValue value;\n  };\n\n  // Array for storing feature vector elements.\n  vector<Element> features_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(FeatureVector);\n};\n\n// The generic feature extractor is the type-independent part of a feature\n// extractor. This holds the descriptor for the feature extractor and the\n// collection of feature types used in the feature extractor.  The feature\n// types are not available until FeatureExtractor<>::Init() has been called.\nclass GenericFeatureExtractor {\n public:\n  GenericFeatureExtractor();\n  virtual ~GenericFeatureExtractor();\n\n  // Initializes the feature extractor from a source representation of the\n  // feature extractor. The first line is used for determining the feature\n  // specification language. If the first line starts with #! followed by a name\n  // then this name is used for instantiating a feature specification parser\n  // with that name. If the language cannot be detected this way it falls back\n  // to using the default language supplied.\n  void Parse(const string &source);\n\n  // Returns the feature extractor descriptor.\n  const FeatureExtractorDescriptor &descriptor() const { return descriptor_; }\n  FeatureExtractorDescriptor *mutable_descriptor() { return &descriptor_; }\n\n  // Returns the number of feature types in the feature extractor.  Invalid\n  // before Init() has been called.\n  int feature_types() const { return feature_types_.size(); }\n\n  // Returns all feature types names used by the extractor. The names are\n  // added to the types_names array.  
Invalid before Init() has been called.\n  void GetFeatureTypeNames(vector<string> *type_names) const;\n\n  // Returns a feature type used in the extractor.  Invalid before Init() has\n  // been called.\n  const FeatureType *feature_type(int index) const {\n    return feature_types_[index];\n  }\n\n  // Returns the feature domain size of this feature extractor.\n  // NOTE: The way that domain size is calculated is, for some, unintuitive. It\n  // is the largest domain size of any feature type.\n  FeatureValue GetDomainSize() const;\n\n protected:\n  // Initializes the feature types used by the extractor.  Called from\n  // FeatureExtractor<>::Init().\n  void InitializeFeatureTypes();\n\n private:\n  // Initializes the top-level feature functions.\n  virtual void InitializeFeatureFunctions() = 0;\n\n  // Returns all feature types used by the extractor. The feature types are\n  // added to the result array.\n  virtual void GetFeatureTypes(vector<FeatureType *> *types) const = 0;\n\n  // Descriptor for the feature extractor. This is a protocol buffer that\n  // contains all the information about the feature extractor. The feature\n  // functions are initialized from the information in the descriptor.\n  FeatureExtractorDescriptor descriptor_;\n\n  // All feature types used by the feature extractor. The collection of all the\n  // feature types describes the feature space of the feature set produced by\n  // the feature extractor.  Not owned.\n  vector<FeatureType *> feature_types_;\n};\n\n// The generic feature function is the type-independent part of a feature\n// function. Each feature function is associated with the descriptor that it is\n// instantiated from.  
The feature types associated with this feature function\n// will be established by the time FeatureExtractor<>::Init() completes.\nclass GenericFeatureFunction {\n public:\n  // A feature value that represents the absence of a value.\n  static constexpr FeatureValue kNone = -1;\n\n  GenericFeatureFunction();\n  virtual ~GenericFeatureFunction();\n\n  // Sets up the feature function. NB: FeatureTypes of nested functions are not\n  // guaranteed to be available until Init().\n  virtual void Setup(TaskContext *context) {}\n\n  // Initializes the feature function. NB: The FeatureType of this function must\n  // be established when this method completes.\n  virtual void Init(TaskContext *context) {}\n\n  // Requests workspaces from a registry to obtain indices into a WorkspaceSet\n  // for any Workspace objects used by this feature function. NB: This will be\n  // called after Init(), so it can depend on resources and arguments.\n  virtual void RequestWorkspaces(WorkspaceRegistry *registry) {}\n\n  // Appends the feature types produced by the feature function to types.  The\n  // default implementation appends feature_type(), if non-null.  Invalid\n  // before Init() has been called.\n  virtual void GetFeatureTypes(vector<FeatureType *> *types) const;\n\n  // Returns the feature type for features produced by this feature function. If\n  // the feature function produces features of different types this returns\n  // null.  
Invalid before Init() has been called.\n  virtual FeatureType *GetFeatureType() const;\n\n  // Returns the name of the registry used for creating the feature function.\n  // This can be used for checking if two feature functions are of the same\n  // kind.\n  virtual const char *RegistryName() const = 0;\n\n  // Returns the value of a named parameter in the feature functions descriptor.\n  // If the named parameter is not found the global parameters are searched.\n  string GetParameter(const string &name) const;\n  int GetIntParameter(const string &name, int default_value) const;\n\n  // Returns the FML function description for the feature function, i.e. the\n  // name and parameters without the nested features.\n  string FunctionName() const {\n    string output;\n    ToFMLFunction(*descriptor_, &output);\n    return output;\n  }\n\n  // Returns the prefix for nested feature functions. This is the prefix of this\n  // feature function concatenated with the feature function name.\n  string SubPrefix() const {\n    return prefix_.empty() ? FunctionName() : prefix_ + \".\" + FunctionName();\n  }\n\n  // Returns/sets the feature extractor this function belongs to.\n  GenericFeatureExtractor *extractor() const { return extractor_; }\n  void set_extractor(GenericFeatureExtractor *extractor) {\n    extractor_ = extractor;\n  }\n\n  // Returns/sets the feature function descriptor.\n  FeatureFunctionDescriptor *descriptor() const { return descriptor_; }\n  void set_descriptor(FeatureFunctionDescriptor *descriptor) {\n    descriptor_ = descriptor;\n  }\n\n  // Returns a descriptive name for the feature function. The name is taken from\n  // the descriptor for the feature function. 
If the name is empty or the\n  // feature function is a variable the name is the FML representation of the\n  // feature, including the prefix.\n  string name() const {\n    string output;\n    if (descriptor_->name().empty()) {\n      if (!prefix_.empty()) {\n        output.append(prefix_);\n        output.append(\".\");\n      }\n      ToFML(*descriptor_, &output);\n    } else {\n      output = descriptor_->name();\n    }\n    tensorflow::StringPiece stripped(output);\n    utils::RemoveWhitespaceContext(&stripped);\n    return stripped.ToString();\n  }\n\n  // Returns the argument from the feature function descriptor. It defaults to\n  // 0 if the argument has not been specified.\n  int argument() const {\n    return descriptor_->has_argument() ? descriptor_->argument() : 0;\n  }\n\n  // Returns/sets/clears function name prefix.\n  const string &prefix() const { return prefix_; }\n  void set_prefix(const string &prefix) { prefix_ = prefix; }\n\n protected:\n  // Returns the feature type for single-type feature functions.\n  FeatureType *feature_type() const { return feature_type_; }\n\n  // Sets the feature type for single-type feature functions.  This takes\n  // ownership of feature_type.  Can only be called once.\n  void set_feature_type(FeatureType *feature_type) {\n    CHECK(feature_type_ == nullptr);\n    feature_type_ = feature_type;\n  }\n\n private:\n  // Feature extractor this feature function belongs to.  Not owned.\n  GenericFeatureExtractor *extractor_ = nullptr;\n\n  // Descriptor for feature function.  Not owned.\n  FeatureFunctionDescriptor *descriptor_ = nullptr;\n\n  // Feature type for features produced by this feature function. If the\n  // feature function produces features of multiple feature types this is null\n  // and the feature function must return its feature types in\n  // GetFeatureTypes().  
Owned.\n  FeatureType *feature_type_ = nullptr;\n\n  // Prefix used for sub-feature types of this function.\n  string prefix_;\n};\n\n// Feature function that can extract features from an object.  Templated on\n// two type arguments:\n//\n// OBJ:  The \"object\" from which features are extracted; e.g., a sentence.  This\n//       should be a plain type, rather than a reference or pointer.\n//\n// ARGS: A set of 0 or more types that are used to \"index\" into some part of the\n//       object that should be extracted, e.g. an int token index for a sentence\n//       object.  This should not be a reference type.\ntemplate<class OBJ, class ...ARGS>\nclass FeatureFunction\n    : public GenericFeatureFunction,\n      public RegisterableClass< FeatureFunction<OBJ, ARGS...> > {\n public:\n  using Self = FeatureFunction<OBJ, ARGS...>;\n\n  // Preprocesses the object.  This will be called prior to calling Evaluate()\n  // or Compute() on that object.\n  virtual void Preprocess(WorkspaceSet *workspaces, OBJ *object) const {}\n\n  // Appends features computed from the object and focus to the result.  The\n  // default implementation delegates to Compute(), adding a single value if\n  // available.  Multi-valued feature functions must override this method.\n  virtual void Evaluate(const WorkspaceSet &workspaces, const OBJ &object,\n                        ARGS... args, FeatureVector *result) const {\n    FeatureValue value = Compute(workspaces, object, args..., result);\n    if (value != kNone) result->add(feature_type(), value);\n  }\n\n  // Returns a feature value computed from the object and focus, or kNone if no\n  // value is computed.  Single-valued feature functions only need to override\n  // this method.\n  virtual FeatureValue Compute(const WorkspaceSet &workspaces,\n                               const OBJ &object,\n                               ARGS... 
args,\n                               const FeatureVector *fv) const {\n    return kNone;\n  }\n\n  // Instantiates a new feature function in a feature extractor from a feature\n  // descriptor.\n  static Self *Instantiate(GenericFeatureExtractor *extractor,\n                           FeatureFunctionDescriptor *fd,\n                           const string &prefix) {\n    Self *f = Self::Create(fd->type());\n    f->set_extractor(extractor);\n    f->set_descriptor(fd);\n    f->set_prefix(prefix);\n    return f;\n  }\n\n  // Returns the name of the registry for the feature function.\n  const char *RegistryName() const override {\n    return Self::registry()->name;\n  }\n\n private:\n  // Special feature function class for resolving variable references. The type\n  // of the feature function is used for resolving the variable reference. When\n  // evaluated it will either get the feature value(s) from the variable portion\n  // of the feature vector, if present, or otherwise it will call the referenced\n  // feature extractor function directly to extract the feature(s).\n  class Reference;\n};\n\n// Base class for features with nested feature functions. The nested functions\n// are of type NES, which may be different from the type of the parent function.\n// NB: NestedFeatureFunction will ensure that all initialization of nested\n// functions takes place during Setup() and Init() -- after the nested features\n// are initialized, the parent feature is initialized via SetupNested() and\n// InitNested(). Alternatively, a derived class that overrides Setup() and\n// Init() directly should call Parent::Setup(), Parent::Init(), etc. 
first.\n//\n// Note: NestedFeatureFunction cannot know how to call Preprocess, Evaluate, or\n// Compute, since the nested functions may be of a different type.\ntemplate<class NES, class OBJ, class ...ARGS>\nclass NestedFeatureFunction : public FeatureFunction<OBJ, ARGS...> {\n public:\n  using Parent = NestedFeatureFunction<NES, OBJ, ARGS...>;\n\n  // Clean up nested functions.\n  ~NestedFeatureFunction() override { utils::STLDeleteElements(&nested_); }\n\n  // By default, just appends the nested feature types.\n  void GetFeatureTypes(vector<FeatureType *> *types) const override {\n    CHECK(!this->nested().empty())\n        << \"Nested features require nested features to be defined.\";\n    for (auto *function : nested_) function->GetFeatureTypes(types);\n  }\n\n  // Sets up the nested features.\n  void Setup(TaskContext *context) override {\n    CreateNested(this->extractor(), this->descriptor(), &nested_,\n                 this->SubPrefix());\n    for (auto *function : nested_) function->Setup(context);\n    SetupNested(context);\n  }\n\n  // Sets up this NestedFeatureFunction specifically.\n  virtual void SetupNested(TaskContext *context) {}\n\n  // Initializes the nested features.\n  void Init(TaskContext *context) override {\n    for (auto *function : nested_) function->Init(context);\n    InitNested(context);\n  }\n\n  // Initializes this NestedFeatureFunction specifically.\n  virtual void InitNested(TaskContext *context) {}\n\n  // Gets all the workspaces needed for the nested functions.\n  void RequestWorkspaces(WorkspaceRegistry *registry) override {\n    for (auto *function : nested_) function->RequestWorkspaces(registry);\n  }\n\n  // Returns the list of nested feature functions.\n  const vector<NES *> &nested() const { return nested_; }\n\n  // Instantiates nested feature functions for a feature function. 
Creates and\n  // initializes one feature function for each sub-descriptor in the feature\n  // descriptor.\n  static void CreateNested(GenericFeatureExtractor *extractor,\n                           FeatureFunctionDescriptor *fd,\n                           vector<NES *> *functions,\n                           const string &prefix) {\n    for (int i = 0; i < fd->feature_size(); ++i) {\n      FeatureFunctionDescriptor *sub = fd->mutable_feature(i);\n      NES *f = NES::Instantiate(extractor, sub, prefix);\n      functions->push_back(f);\n    }\n  }\n\n protected:\n  // The nested feature functions, if any, in order of declaration in the\n  // feature descriptor.  Owned.\n  vector<NES *> nested_;\n};\n\n// Base class for a nested feature function that takes nested features with the\n// same signature as these features, i.e. a meta feature. For this class, we can\n// provide preprocessing of the nested features.\ntemplate<class OBJ, class ...ARGS>\nclass MetaFeatureFunction : public NestedFeatureFunction<\n  FeatureFunction<OBJ, ARGS...>, OBJ, ARGS...> {\n public:\n  // Preprocesses using the nested features.\n  void Preprocess(WorkspaceSet *workspaces, OBJ *object) const override {\n    for (auto *function : this->nested_) {\n      function->Preprocess(workspaces, object);\n    }\n  }\n};\n\n// Template for a special type of locator: The locator of type\n// FeatureFunction<OBJ, ARGS...> calls nested functions of type\n// FeatureFunction<OBJ, IDX, ARGS...>, where the derived class DER is\n// responsible for translating by providing the following:\n//\n// // Gets the new additional focus.\n// IDX GetFocus(const WorkspaceSet &workspaces, const OBJ &object);\n//\n// This is useful to e.g. 
add a token focus to a parser state based on some\n// desired property of that state.\ntemplate<class DER, class OBJ, class IDX, class ...ARGS>\nclass FeatureAddFocusLocator : public NestedFeatureFunction<\n  FeatureFunction<OBJ, IDX, ARGS...>, OBJ, ARGS...> {\n public:\n  void Preprocess(WorkspaceSet *workspaces, OBJ *object) const override {\n    for (auto *function : this->nested_) {\n      function->Preprocess(workspaces, object);\n    }\n  }\n\n  void Evaluate(const WorkspaceSet &workspaces, const OBJ &object,\n                ARGS... args, FeatureVector *result) const override {\n    IDX focus = static_cast<const DER *>(this)->GetFocus(\n        workspaces, object, args...);\n    for (auto *function : this->nested()) {\n      function->Evaluate(workspaces, object, focus, args..., result);\n    }\n  }\n\n  // Returns the first nested feature's computed value.\n  FeatureValue Compute(const WorkspaceSet &workspaces,\n                       const OBJ &object,\n                       ARGS... args,\n                       const FeatureVector *result) const override {\n    IDX focus = static_cast<const DER *>(this)->GetFocus(\n        workspaces, object, args...);\n    return this->nested()[0]->Compute(\n        workspaces, object, focus, args..., result);\n  }\n};\n\n// CRTP feature locator class. This is a meta feature that modifies ARGS and\n// then calls the nested feature functions with the modified ARGS. Note that in\n// order for this template to work correctly, all of ARGS must be types for\n// which the reference operator & can be interpreted as a pointer to the\n// argument. 
The derived class DER must implement the UpdateArgs method, which\n// takes the workspaces, the object, and pointers to the ARGS arguments:\n//\n// // Updates the current arguments.\n// void UpdateArgs(const WorkspaceSet &workspaces, const OBJ &object,\n//                 ARGS *...args) const;\ntemplate<class DER, class OBJ, class ...ARGS>\nclass FeatureLocator : public MetaFeatureFunction<OBJ, ARGS...> {\n public:\n  // Feature locators have an additional check that there is no intrinsic type.\n  void GetFeatureTypes(vector<FeatureType *> *types) const override {\n    CHECK(this->feature_type() == nullptr)\n        << \"FeatureLocators should not have an intrinsic type.\";\n    MetaFeatureFunction<OBJ, ARGS...>::GetFeatureTypes(types);\n  }\n\n  // Evaluates the locator.\n  void Evaluate(const WorkspaceSet &workspaces, const OBJ &object,\n                ARGS... args, FeatureVector *result) const override {\n    static_cast<const DER *>(this)->UpdateArgs(workspaces, object, &args...);\n    for (auto *function : this->nested()) {\n      function->Evaluate(workspaces, object, args..., result);\n    }\n  }\n\n  // Returns the first nested feature's computed value.\n  FeatureValue Compute(const WorkspaceSet &workspaces, const OBJ &object,\n                       ARGS... 
args,\n                       const FeatureVector *result) const override {\n    static_cast<const DER *>(this)->UpdateArgs(workspaces, object, &args...);\n    return this->nested()[0]->Compute(workspaces, object, args..., result);\n  }\n};\n\n// Feature extractor for extracting features from objects of a certain class.\n// Template type parameters are as defined for FeatureFunction.\ntemplate<class OBJ, class ...ARGS>\nclass FeatureExtractor : public GenericFeatureExtractor {\n public:\n  // Feature function type for top-level functions in the feature extractor.\n  typedef FeatureFunction<OBJ, ARGS...> Function;\n  typedef FeatureExtractor<OBJ, ARGS...> Self;\n\n  // Feature locator type for the feature extractor.\n  template<class DER>\n  using Locator = FeatureLocator<DER, OBJ, ARGS...>;\n\n  // Initializes feature extractor.\n  FeatureExtractor() {}\n\n  ~FeatureExtractor() override { utils::STLDeleteElements(&functions_); }\n\n  // Sets up the feature extractor. Note that only top-level functions exist\n  // until Setup() is called. This does not take ownership over the context,\n  // which must outlive this.\n  void Setup(TaskContext *context) {\n    for (Function *function : functions_) function->Setup(context);\n  }\n\n  // Initializes the feature extractor.  Must be called after Setup().  This\n  // does not take ownership over the context, which must outlive this.\n  void Init(TaskContext *context) {\n    for (Function *function : functions_) function->Init(context);\n    this->InitializeFeatureTypes();\n  }\n\n  // Requests workspaces from the registry. Must be called after Init(), and\n  // before Preprocess(). Does not take ownership over registry. This should be\n  // the same registry used to initialize the WorkspaceSet used in Preprocess()\n  // and ExtractFeatures(). 
NB: This is a different ordering from that used in\n  // SentenceFeatureRepresentation style feature computation.\n  void RequestWorkspaces(WorkspaceRegistry *registry) {\n    for (auto *function : functions_) function->RequestWorkspaces(registry);\n  }\n\n  // Preprocesses the object using feature functions for the phase.  Must be\n  // called before any calls to ExtractFeatures() on that object and phase.\n  void Preprocess(WorkspaceSet *workspaces, OBJ *object) const {\n    for (Function *function : functions_) {\n      function->Preprocess(workspaces, object);\n    }\n  }\n\n  // Extracts features from an object with a focus. This invokes all the\n  // top-level feature functions in the feature extractor. Only feature\n  // functions belonging to the specified phase are invoked.\n  void ExtractFeatures(const WorkspaceSet &workspaces, const OBJ &object,\n                       ARGS... args, FeatureVector *result) const {\n    result->reserve(this->feature_types());\n\n    // Extract features.\n    for (int i = 0; i < functions_.size(); ++i) {\n      functions_[i]->Evaluate(workspaces, object, args..., result);\n    }\n  }\n\n private:\n  // Creates and initializes all feature functions in the feature extractor.\n  void InitializeFeatureFunctions() override {\n    // Create all top-level feature functions.\n    for (int i = 0; i < descriptor().feature_size(); ++i) {\n      FeatureFunctionDescriptor *fd = mutable_descriptor()->mutable_feature(i);\n      Function *function = Function::Instantiate(this, fd, \"\");\n      functions_.push_back(function);\n    }\n  }\n\n  // Collect all feature types used in the feature extractor.\n  void GetFeatureTypes(vector<FeatureType *> *types) const override {\n    for (int i = 0; i < functions_.size(); ++i) {\n      functions_[i]->GetFeatureTypes(types);\n    }\n  }\n\n  // Top-level feature functions (and variables) in the feature extractor.\n  // Owned.\n  vector<Function *> functions_;\n};\n\n#define 
REGISTER_FEATURE_FUNCTION(base, name, component) \\\n  REGISTER_CLASS_COMPONENT(base, name, component)\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_FEATURE_EXTRACTOR_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/feature_extractor.proto",
    "content": "// Protocol buffers for feature extractor.\n\nsyntax = \"proto2\";\n\npackage syntaxnet;\n\nmessage Parameter {\n  optional string name = 1;\n  optional string value = 2;\n}\n\n// Descriptor for feature function.\nmessage FeatureFunctionDescriptor {\n  // Feature function type.\n  required string type = 1;\n\n  // Feature function name.\n  optional string name = 2;\n\n  // Default argument for feature function.\n  optional int32 argument = 3 [default = 0];\n\n  // Named parameters for feature descriptor.\n  repeated Parameter parameter = 4;\n\n  // Nested sub-feature function descriptors.\n  repeated FeatureFunctionDescriptor feature = 7;\n};\n\n// Descriptor for feature extractor.\nmessage FeatureExtractorDescriptor {\n  // Top-level feature function for extractor.\n  repeated FeatureFunctionDescriptor feature = 1;\n};\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/feature_types.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Common feature types for parser components.\n\n#ifndef SYNTAXNET_FEATURE_TYPES_H_\n#define SYNTAXNET_FEATURE_TYPES_H_\n\n#include <algorithm>\n#include <map>\n#include <string>\n#include <utility>\n\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\n// Use the same type for feature values as is used for predicates.\ntypedef int64 Predicate;\ntypedef Predicate FeatureValue;\n\n// Each feature value in a feature vector has a feature type. The feature type\n// is used for converting feature type and value pairs to predicate values. The\n// feature type can also return names for feature values and calculate the size\n// of the feature value domain. 
The FeatureType class is abstract and must be\n// specialized for the concrete feature types.\nclass FeatureType {\n public:\n  // Initializes a feature type.\n  explicit FeatureType(const string &name)\n      : name_(name), base_(0) {}\n\n  virtual ~FeatureType() {}\n\n  // Converts a feature value to a name.\n  virtual string GetFeatureValueName(FeatureValue value) const = 0;\n\n  // Returns the size of the feature values domain.\n  virtual int64 GetDomainSize() const = 0;\n\n  // Returns the feature type name.\n  const string &name() const { return name_; }\n\n  Predicate base() const { return base_; }\n  void set_base(Predicate base) { base_ = base; }\n\n private:\n  // Feature type name.\n  string name_;\n\n  // \"Base\" feature value: i.e. a \"slot\" in a global ordering of features.\n  Predicate base_;\n};\n\n// Templated generic resource based feature type. This feature type delegates\n// look up of feature value names to an unknown resource class, which is not\n// owned. Optionally, this type can also store a mapping of extra values which\n// are not in the resource.\n//\n// Note: this class assumes that Resource->GetFeatureValueName() will return\n// successfully for values ONLY in the range [0, Resource->NumValues()). Any\n// feature value not in the extra value map and not in the above range of\n// Resource will result in an ERROR and return of \"<INVALID>\".\ntemplate<class Resource>\nclass ResourceBasedFeatureType : public FeatureType {\n public:\n  // Creates a new type with given name, resource object, and a mapping of\n  // special values. 
The values must be greater than or equal to\n  // resource->NumValues() so as to avoid collisions; this is verified with\n  // CHECK at creation.\n  ResourceBasedFeatureType(const string &name, const Resource *resource,\n                           const map<FeatureValue, string> &values)\n      : FeatureType(name), resource_(resource), values_(values) {\n    max_value_ = resource->NumValues() - 1;\n    for (const auto &pair : values) {\n      CHECK_GE(pair.first, resource->NumValues()) << \"Invalid extra value: \"\n               << pair.first << \",\" << pair.second;\n      max_value_ = pair.first > max_value_ ? pair.first : max_value_;\n    }\n  }\n\n  // Creates a new type with no special values.\n  ResourceBasedFeatureType(const string &name, const Resource *resource)\n      : ResourceBasedFeatureType(name, resource, {}) {}\n\n  // Returns the feature name for a given feature value. First checks the values\n  // map, then checks the resource to look up the name.\n  string GetFeatureValueName(FeatureValue value) const override {\n    if (values_.find(value) != values_.end()) {\n      return values_.find(value)->second;\n    }\n    if (value >= 0 && value < resource_->NumValues()) {\n      return resource_->GetFeatureValueName(value);\n    } else {\n      LOG(ERROR) << \"Invalid feature value \" << value << \" for \" << name();\n      return \"<INVALID>\";\n    }\n  }\n\n  // Returns the number of possible values for this feature type. This is one\n  // greater than the largest value in the resource or the extra values.\n  FeatureValue GetDomainSize() const override { return max_value_ + 1; }\n\n protected:\n  // Shared resource. Not owned.\n  const Resource *resource_ = nullptr;\n\n  // Maximum possible value this feature could take.\n  FeatureValue max_value_;\n\n  // Mapping for extra feature values not in the resource.\n  map<FeatureValue, string> values_;\n};\n\n// Feature type that is defined using an explicit map from FeatureValue to\n// string values.  
This can reduce some of the boilerplate when defining\n// features that generate enum values.  Example usage:\n//\n//   class BeverageSizeFeature : public FeatureFunction<Beverage> {\n//     enum FeatureValue { SMALL, MEDIUM, LARGE };  // values for this feature\n//     void Init(TaskContext *context) override {\n//       set_feature_type(new EnumFeatureType(\"beverage_size\",\n//           {{SMALL, \"SMALL\"}, {MEDIUM, \"MEDIUM\"}, {LARGE, \"LARGE\"}}));\n//     }\n//     [...]\n//   };\nclass EnumFeatureType : public FeatureType {\n public:\n  EnumFeatureType(const string &name,\n                  const map<FeatureValue, string> &value_names)\n      : FeatureType(name), value_names_(value_names) {\n    for (const auto &pair : value_names) {\n      CHECK_GE(pair.first, 0)\n          << \"Invalid feature value: \" << pair.first << \", \" << pair.second;\n      domain_size_ = std::max(domain_size_, pair.first + 1);\n    }\n  }\n\n  // Returns the feature name for a given feature value.\n  string GetFeatureValueName(FeatureValue value) const override {\n    auto it = value_names_.find(value);\n    if (it == value_names_.end()) {\n      LOG(ERROR)\n          << \"Invalid feature value \" << value << \" for \" << name();\n      return \"<INVALID>\";\n    }\n    return it->second;\n  }\n\n  // Returns the number of possible values for this feature type. This is one\n  // greater than the largest value in the value_names map.\n  FeatureValue GetDomainSize() const override { return domain_size_; }\n\n protected:\n  // Maximum possible value this feature could take.\n  FeatureValue domain_size_ = 0;\n\n  // Names of feature values.\n  map<FeatureValue, string> value_names_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_FEATURE_TYPES_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/fml_parser.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/fml_parser.h\"\n\n#include <ctype.h>\n#include <string>\n\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\nvoid FMLParser::Initialize(const string &source) {\n  // Initialize parser state.\n  source_ = source;\n  current_ = source_.begin();\n  item_start_ = line_start_ = current_;\n  line_number_ = item_line_number_ = 1;\n\n  // Read first input item.\n  NextItem();\n}\n\nvoid FMLParser::Error(const string &error_message) {\n  LOG(FATAL) << \"Error in feature model, line \" << item_line_number_\n             << \", position \" << (item_start_ - line_start_ + 1)\n             << \": \" << error_message\n             << \"\\n    \" << string(line_start_, current_) << \" <--HERE\";\n}\n\nvoid FMLParser::Next() {\n  // Move to the next input character. 
If we are at a line break update line\n  // number and line start position.\n  if (*current_ == '\\n') {\n    ++line_number_;\n    ++current_;\n    line_start_ = current_;\n  } else {\n    ++current_;\n  }\n}\n\nvoid FMLParser::NextItem() {\n  // Skip white space and comments.\n  while (!eos()) {\n    if (*current_ == '#') {\n      // Skip comment.\n      while (!eos() && *current_ != '\\n') Next();\n    } else if (isspace(*current_)) {\n      // Skip whitespace.\n      while (!eos() && isspace(*current_)) Next();\n    } else {\n      break;\n    }\n  }\n\n  // Record start position for next item.\n  item_start_ = current_;\n  item_line_number_ = line_number_;\n\n  // Check for end of input.\n  if (eos()) {\n    item_type_ = END;\n    return;\n  }\n\n  // Parse number.\n  if (isdigit(*current_) || *current_ == '+' || *current_ == '-') {\n    string::iterator start = current_;\n    Next();\n    while (isdigit(*current_) || *current_ == '.') Next();\n    item_text_.assign(start, current_);\n    item_type_ = NUMBER;\n    return;\n  }\n\n  // Parse string.\n  if (*current_ == '\"') {\n    Next();\n    string::iterator start = current_;\n    while (*current_ != '\"') {\n      if (eos()) Error(\"Unterminated string\");\n      Next();\n    }\n    item_text_.assign(start, current_);\n    item_type_ = STRING;\n    Next();\n    return;\n  }\n\n  // Parse identifier name.\n  if (isalpha(*current_) || *current_ == '_' || *current_ == '/') {\n    string::iterator start = current_;\n    while (isalnum(*current_) || *current_ == '_' || *current_ == '-' ||\n           *current_ == '/') Next();\n    item_text_.assign(start, current_);\n    item_type_ = NAME;\n    return;\n  }\n\n  // Single character item.\n  item_type_ = *current_;\n  Next();\n}\n\nvoid FMLParser::Parse(const string &source,\n                      FeatureExtractorDescriptor *result) {\n  // Initialize parser.\n  Initialize(source);\n\n  while (item_type_ != END) {\n    // Parse either a parameter name or a 
feature.\n    if (item_type_ != NAME) Error(\"Feature type name expected\");\n    string name = item_text_;\n    NextItem();\n\n    if (item_type_ == '=') {\n      Error(\"Invalid syntax: feature expected\");\n    } else {\n      // Parse feature.\n      FeatureFunctionDescriptor *descriptor = result->add_feature();\n      descriptor->set_type(name);\n      ParseFeature(descriptor);\n    }\n  }\n}\n\nvoid FMLParser::ParseFeature(FeatureFunctionDescriptor *result) {\n  // Parse argument and parameters.\n  if (item_type_ == '(') {\n    NextItem();\n    ParseParameter(result);\n    while (item_type_ == ',') {\n      NextItem();\n      ParseParameter(result);\n    }\n\n    if (item_type_ != ')') Error(\") expected\");\n    NextItem();\n  }\n\n  // Parse feature name.\n  if (item_type_ == ':') {\n    NextItem();\n    if (item_type_ != NAME && item_type_ != STRING) {\n      Error(\"Feature name expected\");\n    }\n    string name = item_text_;\n    NextItem();\n\n    // Set feature name.\n    result->set_name(name);\n  }\n\n  // Parse sub-features.\n  if (item_type_ == '.') {\n    // Parse dotted sub-feature.\n    NextItem();\n    if (item_type_ != NAME) Error(\"Feature type name expected\");\n    string type = item_text_;\n    NextItem();\n\n    // Parse sub-feature.\n    FeatureFunctionDescriptor *subfeature = result->add_feature();\n    subfeature->set_type(type);\n    ParseFeature(subfeature);\n  } else if (item_type_ == '{') {\n    // Parse sub-feature block.\n    NextItem();\n    while (item_type_ != '}') {\n      if (item_type_ != NAME) Error(\"Feature type name expected\");\n      string type = item_text_;\n      NextItem();\n\n      // Parse sub-feature.\n      FeatureFunctionDescriptor *subfeature = result->add_feature();\n      subfeature->set_type(type);\n      ParseFeature(subfeature);\n    }\n    NextItem();\n  }\n}\n\nvoid FMLParser::ParseParameter(FeatureFunctionDescriptor *result) {\n  if (item_type_ == NUMBER) {\n    int argument =\n        
utils::ParseUsing<int>(item_text_, tensorflow::strings::safe_strto32);\n    NextItem();\n\n    // Set default argument for feature.\n    result->set_argument(argument);\n  } else if (item_type_ == NAME) {\n     string name = item_text_;\n     NextItem();\n     if (item_type_ != '=') Error(\"= expected\");\n     NextItem();\n     if (item_type_ >= END) Error(\"Parameter value expected\");\n     string value = item_text_;\n     NextItem();\n\n     // Add parameter to feature.\n     Parameter *parameter;\n     parameter = result->add_parameter();\n     parameter->set_name(name);\n     parameter->set_value(value);\n  } else {\n    Error(\"Syntax error in parameter list\");\n  }\n}\n\nvoid ToFMLFunction(const FeatureFunctionDescriptor &function, string *output) {\n  output->append(function.type());\n  if (function.argument() != 0 || function.parameter_size() > 0) {\n    output->append(\"(\");\n    bool first = true;\n    if (function.argument() != 0) {\n      tensorflow::strings::StrAppend(output, function.argument());\n      first = false;\n    }\n    for (int i = 0; i < function.parameter_size(); ++i) {\n      if (!first) output->append(\",\");\n      output->append(function.parameter(i).name());\n      output->append(\"=\");\n      output->append(\"\\\"\");\n      output->append(function.parameter(i).value());\n      output->append(\"\\\"\");\n      first = false;\n    }\n    output->append(\")\");\n  }\n}\n\nvoid ToFML(const FeatureFunctionDescriptor &function, string *output) {\n  ToFMLFunction(function, output);\n  if (function.feature_size() == 1) {\n    output->append(\".\");\n    ToFML(function.feature(0), output);\n  } else if (function.feature_size() > 1) {\n    output->append(\" { \");\n    for (int i = 0; i < function.feature_size(); ++i) {\n      if (i > 0) output->append(\" \");\n      ToFML(function.feature(i), output);\n    }\n    output->append(\" } \");\n  }\n}\n\nvoid ToFML(const FeatureExtractorDescriptor &extractor, string *output) {\n  for (int i 
= 0; i < extractor.feature_size(); ++i) {\n    ToFML(extractor.feature(i), output);\n    output->append(\"\\n\");\n  }\n}\n\nstring AsFML(const FeatureFunctionDescriptor &function) {\n  string str;\n  ToFML(function, &str);\n  return str;\n}\n\nstring AsFML(const FeatureExtractorDescriptor &extractor) {\n  string str;\n  ToFML(extractor, &str);\n  return str;\n}\n\nvoid StripFML(string *fml_string) {\n  auto it = fml_string->begin();\n  while (it != fml_string->end()) {\n    if (*it == '\"') {\n      it = fml_string->erase(it);\n    } else {\n      ++it;\n    }\n  }\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/fml_parser.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Feature modeling language (fml) parser.\n//\n// BNF grammar for fml:\n//\n// <feature model> ::= { <feature extractor> }\n//\n// <feature extractor> ::= <extractor spec> |\n//                         <extractor spec> '.' <feature extractor> |\n//                         <extractor spec> '{' { <feature extractor> } '}'\n//\n// <extractor spec> ::= <extractor type>\n//                      [ '(' <parameter list> ')' ]\n//                      [ ':' <extractor name> ]\n//\n// <parameter list> = ( <parameter> | <argument> ) { ',' <parameter> }\n//\n// <parameter> ::= <parameter name> '=' <parameter value>\n//\n// <extractor type> ::= NAME\n// <extractor name> ::= NAME | STRING\n// <argument> ::= NUMBER\n// <parameter name> ::= NAME\n// <parameter value> ::= NUMBER | STRING | NAME\n\n#ifndef SYNTAXNET_FML_PARSER_H_\n#define SYNTAXNET_FML_PARSER_H_\n\n#include <string>\n\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/feature_extractor.pb.h\"\n\nnamespace syntaxnet {\n\nclass FMLParser {\n public:\n  // Parses fml specification into feature extractor descriptor.\n  void Parse(const string &source, FeatureExtractorDescriptor *result);\n\n private:\n  // Initializes the parser with the source text.\n  void Initialize(const string &source);\n\n  // Outputs error message and exits.\n  
void Error(const string &error_message);\n\n  // Moves to the next input character.\n  void Next();\n\n  // Moves to the next input item.\n  void NextItem();\n\n  // Parses a feature descriptor.\n  void ParseFeature(FeatureFunctionDescriptor *result);\n\n  // Parses a parameter specification.\n  void ParseParameter(FeatureFunctionDescriptor *result);\n\n  // Returns true if end of source input has been reached.\n  bool eos() { return current_ == source_.end(); }\n\n  // Item types.\n  enum ItemTypes {\n    END = 0,\n    NAME = -1,\n    NUMBER = -2,\n    STRING = -3,\n  };\n\n  // Source text.\n  string source_;\n\n  // Current input position.\n  string::iterator current_;\n\n  // Line number for current input position.\n  int line_number_;\n\n  // Start position for current item.\n  string::iterator item_start_;\n\n  // Start position for current line.\n  string::iterator line_start_;\n\n  // Line number for current item.\n  int item_line_number_;\n\n  // Item type for current item. If this is positive it is interpreted as a\n  // character. If it is negative it is interpreted as an item type.\n  int item_type_;\n\n  // Text for current item.\n  string item_text_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_FML_PARSER_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/graph_builder.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Builds parser models.\"\"\"\n\nimport tensorflow as tf\n\nimport syntaxnet.load_parser_ops\n\nfrom tensorflow.python.ops import control_flow_ops as cf\nfrom tensorflow.python.ops import state_ops\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom syntaxnet.ops import gen_parser_ops\n\n\ndef BatchedSparseToDense(sparse_indices, output_size):\n  \"\"\"Batch compatible sparse to dense conversion.\n\n  This is useful for one-hot coded target labels.\n\n  Args:\n    sparse_indices: [batch_size] tensor containing one index per batch\n    output_size: needed in order to generate the correct dense output\n\n  Returns:\n    A [batch_size, output_size] dense tensor.\n  \"\"\"\n  eye = tf.diag(tf.fill([output_size], tf.constant(1, tf.float32)))\n  return tf.nn.embedding_lookup(eye, sparse_indices)\n\n\ndef EmbeddingLookupFeatures(params, sparse_features, allow_weights):\n  \"\"\"Computes embeddings for each entry of sparse features sparse_features.\n\n  Args:\n    params: list of 2D tensors containing vector embeddings\n    sparse_features: 1D tensor of strings. 
Each entry is a string encoding of\n      dist_belief.SparseFeatures, and represents a variable length list of\n      feature ids, and optionally, corresponding weights values.\n    allow_weights: boolean to control whether the weights returned from the\n      SparseFeatures are used to multiply the embeddings.\n\n  Returns:\n    A tensor representing the combined embeddings for the sparse features.\n    For each entry s in sparse_features, the function looks up the embeddings\n    for each id and sums them into a single tensor weighing them by the\n    weight of each id. It returns a tensor with each entry of sparse_features\n    replaced by this combined embedding.\n  \"\"\"\n  if not isinstance(params, list):\n    params = [params]\n  # Lookup embeddings.\n  sparse_features = tf.convert_to_tensor(sparse_features)\n  indices, ids, weights = gen_parser_ops.unpack_sparse_features(sparse_features)\n  embeddings = tf.nn.embedding_lookup(params, ids)\n\n  if allow_weights:\n    # Multiply by weights, reshaping to allow broadcast.\n    broadcast_weights_shape = tf.concat(0, [tf.shape(weights), [1]])\n    embeddings *= tf.reshape(weights, broadcast_weights_shape)\n\n  # Sum embeddings by index.\n  return tf.unsorted_segment_sum(embeddings, indices, tf.size(sparse_features))\n\n\nclass GreedyParser(object):\n  \"\"\"Builds a Chen & Manning style greedy neural net parser.\n\n  Builds a graph with an optional reader op connected at one end and\n  operations needed to train the network on the other. 
Supports multiple\n  network instantiations sharing the same parameters and network topology.\n\n  The following named nodes are added to the training and eval networks:\n    epochs: a tensor containing the current epoch number\n    cost: a tensor containing the current training step cost\n    gold_actions: a tensor containing actions from gold decoding\n    feature_endpoints: a list of sparse feature vectors\n    logits: output of the final layer before computing softmax\n  The training network also contains:\n    train_op: an op that executes a single training step\n\n  Typical usage:\n\n  parser = graph_builder.GreedyParser(num_actions, num_features,\n                                      num_feature_ids, embedding_sizes,\n                                      hidden_layer_sizes)\n  parser.AddTraining(task_context, batch_size=5)\n  with tf.Session('local') as sess:\n    # This works because the session uses the same default graph as the\n    # GraphBuilder did.\n    sess.run(parser.inits.values())\n    while True:\n      tf_epoch, _ = sess.run([parser.training['epochs'],\n                              parser.training['train_op']])\n      if tf_epoch[0] > 0:\n        break\n  \"\"\"\n\n  def __init__(self,\n               num_actions,\n               num_features,\n               num_feature_ids,\n               embedding_sizes,\n               hidden_layer_sizes,\n               seed=None,\n               gate_gradients=False,\n               use_locking=False,\n               embedding_init=1.0,\n               relu_init=1e-4,\n               bias_init=0.2,\n               softmax_init=1e-4,\n               averaging_decay=0.9999,\n               use_averaging=True,\n               check_parameters=True,\n               check_every=1,\n               allow_feature_weights=False,\n               only_train='',\n               arg_prefix=None,\n               **unused_kwargs):\n    \"\"\"Initialize the graph builder with parameters defining the network.\n\n    
Args:\n      num_actions: int size of the set of parser actions\n      num_features: int list of dimensions of the feature vectors\n      num_feature_ids: int list of same length as num_features corresponding to\n        the sizes of the input feature spaces\n      embedding_sizes: int list of same length as num_features of the desired\n        embedding layer sizes\n      hidden_layer_sizes: int list of desired relu layer sizes; may be empty\n      seed: optional random initializer seed to enable reproducibility\n      gate_gradients: if True, gradient updates are computed synchronously,\n        ensuring consistency and reproducibility\n      use_locking: if True, use locking to avoid read-write contention when\n        updating Variables\n      embedding_init: sets the std dev of normal initializer of embeddings to\n        embedding_init / embedding_size ** .5\n      relu_init: sets the std dev of normal initializer of relu weights\n        to relu_init\n      bias_init: sets constant initializer of relu bias to bias_init\n      softmax_init: sets the std dev of normal initializer of softmax init\n        to softmax_init\n      averaging_decay: decay for exponential moving average when computing\n        averaged parameters, set to 1 to do vanilla averaging\n      use_averaging: whether to use moving averages of parameters during evals\n      check_parameters: whether to check for NaN/Inf parameters during\n        training\n      check_every: checks numerics every check_every steps.\n      allow_feature_weights: whether feature weights are allowed.\n      only_train: the comma separated set of parameter names to train. 
If empty,\n        all model parameters will be trained.\n      arg_prefix: prefix for context parameters.\n    \"\"\"\n    self._num_actions = num_actions\n    self._num_features = num_features\n    self._num_feature_ids = num_feature_ids\n    self._embedding_sizes = embedding_sizes\n    self._hidden_layer_sizes = hidden_layer_sizes\n    self._seed = seed\n    self._gate_gradients = gate_gradients\n    self._use_locking = use_locking\n    self._use_averaging = use_averaging\n    self._check_parameters = check_parameters\n    self._check_every = check_every\n    self._allow_feature_weights = allow_feature_weights\n    self._only_train = set(only_train.split(',')) if only_train else None\n    self._feature_size = len(embedding_sizes)\n    self._embedding_init = embedding_init\n    self._relu_init = relu_init\n    self._softmax_init = softmax_init\n    self._arg_prefix = arg_prefix\n    # Parameters of the network with respect to which training is done.\n    self.params = {}\n    # Other variables, with respect to which no training is done, but which we\n    # nonetheless need to save in order to capture the state of the graph.\n    self.variables = {}\n    # Operations to initialize any nodes that require initialization.\n    self.inits = {}\n    # Training- and eval-related nodes.\n    self.training = {}\n    self.evaluation = {}\n    self.saver = None\n    # Nodes to compute moving averages of parameters, called every train step.\n    self._averaging = {}\n    self._averaging_decay = averaging_decay\n    # Pretrained embeddings that can be used instead of constant initializers.\n    self._pretrained_embeddings = {}\n    # After the following 'with' statement, we'll be able to re-enter the\n    # 'params' scope by re-using the self._param_scope member variable. 
See for\n    # instance _AddParam.\n    with tf.name_scope('params') as self._param_scope:\n      self._relu_bias_init = tf.constant_initializer(bias_init)\n\n  @property\n  def embedding_size(self):\n    size = 0\n    for i in range(self._feature_size):\n      size += self._num_features[i] * self._embedding_sizes[i]\n    return size\n\n  def _AddParam(self,\n                shape,\n                dtype,\n                name,\n                initializer=None,\n                return_average=False):\n    \"\"\"Add a model parameter with respect to which we expect to compute gradients.\n\n    _AddParam creates both regular parameters (usually for training) and\n    averaged nodes (usually for inference). It returns one or the other based\n    on the 'return_average' arg.\n\n    Args:\n      shape: int list, tensor shape of the parameter to create\n      dtype: tf.DataType, data type of the parameter\n      name: string, name of the parameter in the TF graph\n      initializer: optional initializer for the parameter\n      return_average: if False, return the parameter; otherwise return its\n        moving average\n\n    Returns:\n      parameter or averaged parameter\n    \"\"\"\n    if name not in self.params:\n      step = tf.cast(self.GetStep(), tf.float32)\n      # Put all parameters and their initializing ops in their own scope\n      # irrespective of the current scope (training or eval).\n      with tf.name_scope(self._param_scope):\n        self.params[name] = tf.get_variable(name, shape, dtype, initializer)\n        param = self.params[name]\n        if initializer is not None:\n          self.inits[name] = state_ops.init_variable(param, initializer)\n        if self._averaging_decay == 1:\n          logging.info('Using vanilla averaging of parameters.')\n          ema = tf.train.ExponentialMovingAverage(decay=(step / (step + 1.0)),\n                                                  num_updates=None)\n        else:\n          ema = 
tf.train.ExponentialMovingAverage(decay=self._averaging_decay,\n                                                  num_updates=step)\n        self._averaging[name + '_avg_update'] = ema.apply([param])\n        self.variables[name + '_avg_var'] = ema.average(param)\n        self.inits[name + '_avg_init'] = state_ops.init_variable(\n            ema.average(param), tf.zeros_initializer)\n    return (self.variables[name + '_avg_var'] if return_average else\n            self.params[name])\n\n  def GetStep(self):\n    def OnesInitializer(shape, dtype=tf.float32, partition_info=None):\n      return tf.ones(shape, dtype)\n    return self._AddVariable([], tf.int32, 'step', OnesInitializer)\n\n  def _AddVariable(self, shape, dtype, name, initializer=None):\n    if name in self.variables:\n      return self.variables[name]\n    self.variables[name] = tf.get_variable(name, shape, dtype, initializer)\n    if initializer is not None:\n      self.inits[name] = state_ops.init_variable(self.variables[name],\n                                                 initializer)\n    return self.variables[name]\n\n  def _ReluWeightInitializer(self):\n    with tf.name_scope(self._param_scope):\n      return tf.random_normal_initializer(stddev=self._relu_init,\n                                          seed=self._seed)\n\n  def _EmbeddingMatrixInitializer(self, index, embedding_size):\n    if index in self._pretrained_embeddings:\n      return self._pretrained_embeddings[index]\n    else:\n      return tf.random_normal_initializer(\n          stddev=self._embedding_init / embedding_size**.5,\n          seed=self._seed)\n\n  def _AddEmbedding(self,\n                    features,\n                    num_features,\n                    num_ids,\n                    embedding_size,\n                    index,\n                    return_average=False):\n    \"\"\"Adds an embedding matrix and passes the `features` vector through it.\"\"\"\n    embedding_matrix = self._AddParam(\n        [num_ids, 
embedding_size],\n        tf.float32,\n        'embedding_matrix_%d' % index,\n        self._EmbeddingMatrixInitializer(index, embedding_size),\n        return_average=return_average)\n    embedding = EmbeddingLookupFeatures(embedding_matrix,\n                                        tf.reshape(features,\n                                                   [-1],\n                                                   name='feature_%d' % index),\n                                        self._allow_feature_weights)\n    return tf.reshape(embedding, [-1, num_features * embedding_size])\n\n  def _BuildNetwork(self, feature_endpoints, return_average=False):\n    \"\"\"Builds a feed-forward part of the net given features as input.\n\n    The network topology is already defined in the constructor, so multiple\n    calls to BuildForward build multiple networks whose parameters are all\n    shared. It is the source of the input features and the use of the output\n    that distinguishes each network.\n\n    Args:\n      feature_endpoints: tensors with input features to the network\n      return_average: whether to use moving averages as model parameters\n\n    Returns:\n      logits: output of the final layer before computing softmax\n    \"\"\"\n    assert len(feature_endpoints) == self._feature_size\n\n    # Create embedding layer.\n    embeddings = []\n    for i in range(self._feature_size):\n      embeddings.append(self._AddEmbedding(feature_endpoints[i],\n                                           self._num_features[i],\n                                           self._num_feature_ids[i],\n                                           self._embedding_sizes[i],\n                                           i,\n                                           return_average=return_average))\n\n    last_layer = tf.concat(1, embeddings)\n    last_layer_size = self.embedding_size\n\n    # Create ReLU layers.\n    for i, hidden_layer_size in enumerate(self._hidden_layer_sizes):\n      weights 
= self._AddParam(\n          [last_layer_size, hidden_layer_size],\n          tf.float32,\n          'weights_%d' % i,\n          self._ReluWeightInitializer(),\n          return_average=return_average)\n      bias = self._AddParam([hidden_layer_size],\n                            tf.float32,\n                            'bias_%d' % i,\n                            self._relu_bias_init,\n                            return_average=return_average)\n      last_layer = tf.nn.relu_layer(last_layer,\n                                    weights,\n                                    bias,\n                                    name='layer_%d' % i)\n      last_layer_size = hidden_layer_size\n\n    # Create softmax layer.\n    softmax_weight = self._AddParam(\n        [last_layer_size, self._num_actions],\n        tf.float32,\n        'softmax_weight',\n        tf.random_normal_initializer(stddev=self._softmax_init,\n                                     seed=self._seed),\n        return_average=return_average)\n    softmax_bias = self._AddParam(\n        [self._num_actions],\n        tf.float32,\n        'softmax_bias',\n        tf.zeros_initializer,\n        return_average=return_average)\n    logits = tf.nn.xw_plus_b(last_layer,\n                             softmax_weight,\n                             softmax_bias,\n                             name='logits')\n    return {'logits': logits}\n\n  def _AddGoldReader(self, task_context, batch_size, corpus_name):\n    features, epochs, gold_actions = (\n        gen_parser_ops.gold_parse_reader(task_context,\n                                         self._feature_size,\n                                         batch_size,\n                                         corpus_name=corpus_name,\n                                         arg_prefix=self._arg_prefix))\n    return {'gold_actions': tf.identity(gold_actions,\n                                        name='gold_actions'),\n            'epochs': tf.identity(epochs,\n             
                     name='epochs'),\n            'feature_endpoints': features}\n\n  def _AddDecodedReader(self, task_context, batch_size, transition_scores,\n                        corpus_name):\n    features, epochs, eval_metrics, documents = (\n        gen_parser_ops.decoded_parse_reader(transition_scores,\n                                            task_context,\n                                            self._feature_size,\n                                            batch_size,\n                                            corpus_name=corpus_name,\n                                            arg_prefix=self._arg_prefix))\n    return {'eval_metrics': eval_metrics,\n            'epochs': tf.identity(epochs,\n                                  name='epochs'),\n            'feature_endpoints': features,\n            'documents': documents}\n\n  def _AddCostFunction(self, batch_size, gold_actions, logits):\n    \"\"\"Cross entropy plus L2 loss on weights and biases of the hidden layers.\"\"\"\n    dense_golden = BatchedSparseToDense(gold_actions, self._num_actions)\n    cross_entropy = tf.div(\n        tf.reduce_sum(tf.nn.softmax_cross_entropy_with_logits(\n            logits, dense_golden)), batch_size)\n    regularized_params = [tf.nn.l2_loss(p)\n                          for k, p in self.params.items()\n                          if k.startswith('weights') or k.startswith('bias')]\n    l2_loss = 1e-4 * tf.add_n(regularized_params) if regularized_params else 0\n    return {'cost': tf.add(cross_entropy, l2_loss, name='cost')}\n\n  def AddEvaluation(self,\n                    task_context,\n                    batch_size,\n                    evaluation_max_steps=300,\n                    corpus_name='documents'):\n    \"\"\"Builds the forward network only without the training operation.\n\n    Args:\n      task_context: file path from which to read the task context.\n      batch_size: batch size to request from reader op.\n      evaluation_max_steps: max number 
of parsing actions during evaluation,\n          only used in beam parsing.\n      corpus_name: name of the task input to read parses from.\n\n    Returns:\n      Dictionary of named eval nodes.\n    \"\"\"\n    def _AssignTransitionScores():\n      return tf.assign(nodes['transition_scores'],\n                       nodes['logits'], validate_shape=False)\n    def _Pass():\n      return tf.constant(-1.0)\n    unused_evaluation_max_steps = evaluation_max_steps\n    with tf.name_scope('evaluation'):\n      nodes = self.evaluation\n      nodes['transition_scores'] = self._AddVariable(\n          [batch_size, self._num_actions], tf.float32, 'transition_scores',\n          tf.constant_initializer(-1.0))\n      nodes.update(self._AddDecodedReader(task_context, batch_size, nodes[\n          'transition_scores'], corpus_name))\n      nodes.update(self._BuildNetwork(nodes['feature_endpoints'],\n                                      return_average=self._use_averaging))\n      nodes['eval_metrics'] = cf.with_dependencies(\n          [tf.cond(tf.greater(tf.size(nodes['logits']), 0),\n                   _AssignTransitionScores, _Pass)],\n          nodes['eval_metrics'], name='eval_metrics')\n    return nodes\n\n  def _IncrementCounter(self, counter):\n    return state_ops.assign_add(counter, 1, use_locking=True)\n\n  def _AddLearningRate(self, initial_learning_rate, decay_steps):\n    \"\"\"Returns a learning rate that decays by 0.96 every decay_steps.\n\n    Args:\n      initial_learning_rate: initial value of the learning rate\n      decay_steps: decay by 0.96 every this many steps\n\n    Returns:\n      learning rate variable.\n    \"\"\"\n    step = self.GetStep()\n    return cf.with_dependencies(\n        [self._IncrementCounter(step)],\n        tf.train.exponential_decay(initial_learning_rate,\n                                   step,\n                                   decay_steps,\n                                   0.96,\n                                   
staircase=True))\n\n  def AddPretrainedEmbeddings(self, index, embeddings_path, task_context):\n    \"\"\"Embeddings at the given index will be set to pretrained values.\"\"\"\n\n    def _Initializer(shape, dtype=tf.float32, partition_info=None):\n      unused_dtype = dtype\n      t = gen_parser_ops.word_embedding_initializer(\n          vectors=embeddings_path,\n          task_context=task_context,\n          embedding_init=self._embedding_init)\n\n      t.set_shape(shape)\n      return t\n\n    self._pretrained_embeddings[index] = _Initializer\n\n  def AddTraining(self,\n                  task_context,\n                  batch_size,\n                  learning_rate=0.1,\n                  decay_steps=4000,\n                  momentum=0.9,\n                  corpus_name='documents'):\n    \"\"\"Builds a trainer to minimize the cross entropy cost function.\n\n    Args:\n      task_context: file path from which to read the task context\n      batch_size: batch size to request from reader op\n      learning_rate: initial value of the learning rate\n      decay_steps: decay learning rate by 0.96 every this many steps\n      momentum: momentum parameter used when training with momentum\n      corpus_name: name of the task input to read parses from\n\n    Returns:\n      Dictionary of named training nodes.\n    \"\"\"\n    with tf.name_scope('training'):\n      nodes = self.training\n      nodes.update(self._AddGoldReader(task_context, batch_size, corpus_name))\n      nodes.update(self._BuildNetwork(nodes['feature_endpoints'],\n                                      return_average=False))\n      nodes.update(self._AddCostFunction(batch_size, nodes['gold_actions'],\n                                         nodes['logits']))\n      # Add the optimizer\n      if self._only_train:\n        trainable_params = [v\n                            for k, v in self.params.iteritems()\n                            if k in self._only_train]\n      else:\n        trainable_params = 
self.params.values()\n      lr = self._AddLearningRate(learning_rate, decay_steps)\n      optimizer = tf.train.MomentumOptimizer(lr,\n                                             momentum,\n                                             use_locking=self._use_locking)\n      train_op = optimizer.minimize(nodes['cost'], var_list=trainable_params)\n      for param in trainable_params:\n        slot = optimizer.get_slot(param, 'momentum')\n        self.inits[slot.name] = state_ops.init_variable(slot,\n                                                        tf.zeros_initializer)\n        self.variables[slot.name] = slot\n      numerical_checks = [\n          tf.check_numerics(param,\n                            message='Parameter is not finite.')\n          for param in trainable_params\n          if param.dtype.base_dtype in [tf.float32, tf.float64]\n      ]\n      check_op = tf.group(*numerical_checks)\n      avg_update_op = tf.group(*self._averaging.values())\n      train_ops = [train_op]\n      if self._check_parameters:\n        train_ops.append(check_op)\n      if self._use_averaging:\n        train_ops.append(avg_update_op)\n      nodes['train_op'] = tf.group(*train_ops, name='train_op')\n    return nodes\n\n  def AddSaver(self, slim_model=False):\n    \"\"\"Adds ops to save and restore model parameters.\n\n    Args:\n      slim_model: whether only averaged variables are saved.\n\n    Returns:\n      the saver object.\n    \"\"\"\n    # We have to put the save op in the root scope otherwise running\n    # \"save/restore_all\" won't find the \"save/Const\" node it expects.\n    with tf.name_scope(None):\n      variables_to_save = self.params.copy()\n      variables_to_save.update(self.variables)\n      if slim_model:\n        for key in variables_to_save.keys():\n          if not key.endswith('avg_var'):\n            del variables_to_save[key]\n      self.saver = tf.train.Saver(variables_to_save)\n    return self.saver\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/graph_builder_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for graph_builder.\"\"\"\n\n\n# disable=no-name-in-module,unused-import,g-bad-import-order,maybe-no-member\nimport os.path\nimport tensorflow as tf\n\nfrom tensorflow.python.framework import test_util\nfrom tensorflow.python.ops import variables\nfrom tensorflow.python.platform import googletest\n\nfrom syntaxnet import graph_builder\nfrom syntaxnet import sparse_pb2\nfrom syntaxnet.ops import gen_parser_ops\n\nFLAGS = tf.app.flags.FLAGS\nif not hasattr(FLAGS, 'test_srcdir'):\n  FLAGS.test_srcdir = ''\nif not hasattr(FLAGS, 'test_tmpdir'):\n  FLAGS.test_tmpdir = tf.test.get_temp_dir()\n\n\nclass GraphBuilderTest(test_util.TensorFlowTestCase):\n\n  def setUp(self):\n    # Creates a task context with the correct testing paths.\n    initial_task_context = os.path.join(\n        FLAGS.test_srcdir,\n        'syntaxnet/'\n        'testdata/context.pbtxt')\n    self._task_context = os.path.join(FLAGS.test_tmpdir, 'context.pbtxt')\n    with open(initial_task_context, 'r') as fin:\n      with open(self._task_context, 'w') as fout:\n        fout.write(fin.read().replace('SRCDIR', FLAGS.test_srcdir)\n                   .replace('OUTPATH', FLAGS.test_tmpdir))\n\n    # Creates necessary term maps.\n    with self.test_session() as sess:\n      
gen_parser_ops.lexicon_builder(task_context=self._task_context,\n                                     corpus_name='training-corpus').run()\n      self._num_features, self._num_feature_ids, _, self._num_actions = (\n          sess.run(gen_parser_ops.feature_size(task_context=self._task_context,\n                                               arg_prefix='brain_parser')))\n\n  def MakeBuilder(self, use_averaging=True, **kw_args):\n    # Set the seed and gate_gradients to ensure reproducibility.\n    return graph_builder.GreedyParser(\n        self._num_actions, self._num_features, self._num_feature_ids,\n        embedding_sizes=[8, 8, 8], hidden_layer_sizes=[32, 32], seed=42,\n        gate_gradients=True, use_averaging=use_averaging, **kw_args)\n\n  def FindNode(self, name):\n    for node in tf.get_default_graph().as_graph_def().node:\n      if node.name == name:\n        return node\n    return None\n\n  def NodeFound(self, name):\n    return self.FindNode(name) is not None\n\n  def testScope(self):\n    # Set up the network topology\n    graph = tf.Graph()\n    with graph.as_default():\n      parser = self.MakeBuilder()\n      parser.AddTraining(self._task_context,\n                         batch_size=10,\n                         corpus_name='training-corpus')\n      parser.AddEvaluation(self._task_context,\n                           batch_size=2,\n                           corpus_name='tuning-corpus')\n      parser.AddSaver()\n\n      # Check that the node ids we may rely on are there with the expected\n      # names.\n      self.assertEqual(parser.training['logits'].name, 'training/logits:0')\n      self.assertTrue(self.NodeFound('training/logits'))\n      self.assertTrue(self.NodeFound('training/feature_0'))\n      self.assertTrue(self.NodeFound('training/feature_1'))\n      self.assertTrue(self.NodeFound('training/feature_2'))\n      self.assertFalse(self.NodeFound('training/feature_3'))\n\n      self.assertEqual(parser.evaluation['logits'].name, 
'evaluation/logits:0')\n      self.assertTrue(self.NodeFound('evaluation/logits'))\n\n      # The saver node is expected to be in the root scope.\n      self.assertTrue(self.NodeFound('save/restore_all'))\n\n      # Also check that the parameters have the scope we expect.\n      self.assertTrue(self.NodeFound('embedding_matrix_0'))\n      self.assertTrue(self.NodeFound('embedding_matrix_1'))\n      self.assertTrue(self.NodeFound('embedding_matrix_2'))\n      self.assertFalse(self.NodeFound('embedding_matrix_3'))\n\n  def testNestedScope(self):\n    # It's OK to put the whole graph in a scope of its own.\n    graph = tf.Graph()\n    with graph.as_default():\n      with graph.name_scope('top'):\n        parser = self.MakeBuilder()\n        parser.AddTraining(self._task_context,\n                           batch_size=10,\n                           corpus_name='training-corpus')\n        parser.AddSaver()\n\n      self.assertTrue(self.NodeFound('top/training/logits'))\n      self.assertTrue(self.NodeFound('top/training/feature_0'))\n\n      # The saver node is expected to be in the root scope no matter what.\n      self.assertFalse(self.NodeFound('top/save/restore_all'))\n      self.assertTrue(self.NodeFound('save/restore_all'))\n\n  def testUseCustomGraphs(self):\n    batch_size = 10\n\n    # Use separate custom graphs.\n    custom_train_graph = tf.Graph()\n    with custom_train_graph.as_default():\n      train_parser = self.MakeBuilder()\n      train_parser.AddTraining(self._task_context,\n                               batch_size,\n                               corpus_name='training-corpus')\n\n    custom_eval_graph = tf.Graph()\n    with custom_eval_graph.as_default():\n      eval_parser = self.MakeBuilder()\n      eval_parser.AddEvaluation(self._task_context,\n                                batch_size,\n                                corpus_name='tuning-corpus')\n\n    # The following session runs should not fail.\n    with 
self.test_session(graph=custom_train_graph) as sess:\n      self.assertTrue(self.NodeFound('training/logits'))\n      sess.run(train_parser.inits.values())\n      sess.run(['training/logits:0'])\n\n    with self.test_session(graph=custom_eval_graph) as sess:\n      self.assertFalse(self.NodeFound('training/logits'))\n      self.assertTrue(self.NodeFound('evaluation/logits'))\n      sess.run(eval_parser.inits.values())\n      sess.run(['evaluation/logits:0'])\n\n  def testTrainingAndEvalAreIndependent(self):\n    batch_size = 10\n    graph = tf.Graph()\n    with graph.as_default():\n      parser = self.MakeBuilder(use_averaging=False)\n      parser.AddTraining(self._task_context,\n                         batch_size,\n                         corpus_name='training-corpus')\n      parser.AddEvaluation(self._task_context,\n                           batch_size,\n                           corpus_name='tuning-corpus')\n    with self.test_session(graph=graph) as sess:\n      sess.run(parser.inits.values())\n      # Before any training updates are performed, both training and eval nets\n      # should return the same computations.\n      eval_logits, = sess.run([parser.evaluation['logits']])\n      training_logits, = sess.run([parser.training['logits']])\n      self.assertNear(abs((eval_logits - training_logits).sum()), 0, 1e-6)\n\n      # After training, activations should differ.\n      for _ in range(5):\n        eval_logits = parser.evaluation['logits'].eval()\n      for _ in range(5):\n        training_logits, _ = sess.run([parser.training['logits'],\n                                       parser.training['train_op']])\n      self.assertGreater(abs((eval_logits - training_logits).sum()), 0, 1e-3)\n\n  def testReproducibility(self):\n    batch_size = 10\n\n    def ComputeACost(graph):\n      with graph.as_default():\n        parser = self.MakeBuilder(use_averaging=False)\n        parser.AddTraining(self._task_context,\n                           batch_size,\n         
                  corpus_name='training-corpus')\n        parser.AddEvaluation(self._task_context,\n                             batch_size,\n                             corpus_name='tuning-corpus')\n      with self.test_session(graph=graph) as sess:\n        sess.run(parser.inits.values())\n        for _ in range(5):\n          cost, _ = sess.run([parser.training['cost'],\n                              parser.training['train_op']])\n      return cost\n\n    cost1 = ComputeACost(tf.Graph())\n    cost2 = ComputeACost(tf.Graph())\n    self.assertNear(cost1, cost2, 1e-8)\n\n  def testAddTrainingAndEvalOrderIndependent(self):\n    batch_size = 10\n\n    graph1 = tf.Graph()\n    with graph1.as_default():\n      parser = self.MakeBuilder(use_averaging=False)\n      parser.AddTraining(self._task_context,\n                         batch_size,\n                         corpus_name='training-corpus')\n      parser.AddEvaluation(self._task_context,\n                           batch_size,\n                           corpus_name='tuning-corpus')\n    with self.test_session(graph=graph1) as sess:\n      sess.run(parser.inits.values())\n      metrics1 = None\n      for _ in range(50):\n        cost1, _ = sess.run([parser.training['cost'],\n                             parser.training['train_op']])\n        em1 = parser.evaluation['eval_metrics'].eval()\n        metrics1 = metrics1 + em1 if metrics1 is not None else em1\n\n    # Reverse the order in which Training and Eval stacks are added.\n    graph2 = tf.Graph()\n    with graph2.as_default():\n      parser = self.MakeBuilder(use_averaging=False)\n      parser.AddEvaluation(self._task_context,\n                           batch_size,\n                           corpus_name='tuning-corpus')\n      parser.AddTraining(self._task_context,\n                         batch_size,\n                         corpus_name='training-corpus')\n    with self.test_session(graph=graph2) as sess:\n      sess.run(parser.inits.values())\n      
metrics2 = None\n      for _ in range(50):\n        cost2, _ = sess.run([parser.training['cost'],\n                             parser.training['train_op']])\n        em2 = parser.evaluation['eval_metrics'].eval()\n        metrics2 = metrics2 + em2 if metrics2 is not None else em2\n\n    self.assertNear(cost1, cost2, 1e-8)\n    self.assertEqual(abs(metrics1 - metrics2).sum(), 0)\n\n  def testEvalMetrics(self):\n    batch_size = 10\n    graph = tf.Graph()\n    with graph.as_default():\n      parser = self.MakeBuilder()\n      parser.AddEvaluation(self._task_context,\n                           batch_size,\n                           corpus_name='tuning-corpus')\n    with self.test_session(graph=graph) as sess:\n      sess.run(parser.inits.values())\n      tokens = 0\n      correct_heads = 0\n      for _ in range(100):\n        eval_metrics = sess.run(parser.evaluation['eval_metrics'])\n        tokens += eval_metrics[0]\n        correct_heads += eval_metrics[1]\n      self.assertGreater(tokens, 0)\n      self.assertGreaterEqual(tokens, correct_heads)\n      self.assertGreaterEqual(correct_heads, 0)\n\n  def MakeSparseFeatures(self, ids, weights):\n    f = sparse_pb2.SparseFeatures()\n    for i, w in zip(ids, weights):\n      f.id.append(i)\n      f.weight.append(w)\n    return f.SerializeToString()\n\n  def testEmbeddingOp(self):\n    graph = tf.Graph()\n    with self.test_session(graph=graph):\n      params = tf.constant([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]],\n                           tf.float32)\n\n      var = variables.Variable([self.MakeSparseFeatures([1, 2], [1.0, 1.0]),\n                                self.MakeSparseFeatures([], [])])\n      var.initializer.run()\n      embeddings = graph_builder.EmbeddingLookupFeatures(params, var,\n                                                         True).eval()\n      self.assertAllClose([[8.0, 10.0], [0.0, 0.0]], embeddings)\n\n      var = variables.Variable([self.MakeSparseFeatures([], []),\n                         
       self.MakeSparseFeatures([0, 2],\n                                                        [0.5, 2.0])])\n      var.initializer.run()\n      embeddings = graph_builder.EmbeddingLookupFeatures(params, var,\n                                                         True).eval()\n      self.assertAllClose([[0.0, 0.0], [10.5, 13.0]], embeddings)\n\n  def testOnlyTrainSomeParameters(self):\n    batch_size = 10\n    graph = tf.Graph()\n    with graph.as_default():\n      parser = self.MakeBuilder(use_averaging=False, only_train='softmax_bias')\n      parser.AddTraining(self._task_context,\n                         batch_size,\n                         corpus_name='training-corpus')\n    with self.test_session(graph=graph) as sess:\n      sess.run(parser.inits.values())\n      # Before training, save the state of two of the parameters.\n      bias0, weight0 = sess.run([parser.params['softmax_bias'],\n                                 parser.params['softmax_weight']])\n\n      for _ in range(5):\n        bias, weight, _ = sess.run([parser.params['softmax_bias'],\n                                    parser.params['softmax_weight'],\n                                    parser.training['train_op']])\n\n      # After training, only one of the parameters should have changed.\n      self.assertAllEqual(weight, weight0)\n      self.assertGreater(abs(bias - bias0).sum(), 0, 1e-5)\n\n\nif __name__ == '__main__':\n  googletest.main()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/kbest_syntax.proto",
    "content": "// K-best part-of-speech and dependency annotations for tokens.\n\nsyntax = \"proto2\";\n\nimport \"syntaxnet/sentence.proto\";\n\npackage syntaxnet;\n\n// A list of alternative (k-best) syntax analyses, grouped by sentences.\nmessage KBestSyntaxAnalyses {\n  extend Sentence {\n    optional KBestSyntaxAnalyses extension = 60366242;\n  }\n\n  // Alternative analyses for each sentence. Sentences are listed in the\n  // order visited by a SentenceIterator.\n  repeated KBestSyntaxAnalysesForSentence sentence = 1;\n\n  // Alternative analyses for each token.\n  repeated KBestSyntaxAnalysesForToken token = 2;\n}\n\n// A list of alternative (k-best) analyses for a sentence spanning from a start\n// token index to an end token index. The alternative analyses are ordered by\n// decreasing model score from best to worst. The first analysis is the 1-best\n// analysis, which is typically also stored in the document tokens.\nmessage KBestSyntaxAnalysesForSentence {\n  // First token of sentence.\n  optional int32 start = 1 [default = -1];\n\n  // Last token of sentence.\n  optional int32 end = 2 [default = -1];\n\n  // K-best analyses for the tokens in this sentence. All of the analyses in\n  // the list have the same \"type\"; e.g., k-best taggings,\n  // k-best {tagging+parse}s, etc.\n  // Note also that the type of analysis stored in this list can change\n  // depending on where we are in the document processing pipeline; e.g.,\n  // may initially be taggings, and then switch to parses.  The first\n  // token_analysis would be the 1-best analysis, which is typically also stored\n  // in the document.  Note: some post-processors will update the document's\n  // syntax trees, but will leave these unchanged.\n  repeated AlternativeTokenAnalysis token_analysis = 3;\n}\n\n// A list of scored alternative (k-best) analyses for a particular token. These\n// are all distinct from each other and ordered by decreasing model score. 
The\n// first is the 1-best analysis, which may or may not match the document tokens\n// depending on how the k-best analyses are selected.\nmessage KBestSyntaxAnalysesForToken {\n  // All token analyses in this repeated field refer to the same token.\n  // Each alternative analysis will contain a single entry for repeated fields\n  // such as head, tag, category and label.\n  repeated AlternativeTokenAnalysis token_analysis = 3;\n}\n\n// An alternative analysis of tokens in the document. The repeated fields\n// are indexed relative to the beginning of a sentence. Fields not\n// represented in the alternative analysis are assumed to be unchanged.\n// Currently only alternatives for tags, categories and (labeled) dependency\n// heads are supported.\n// Each repeated field should either have length=0 or length=number of tokens.\nmessage AlternativeTokenAnalysis {\n  // Head of this token in the dependency tree: the id of the token which has\n  // an arc going to this one. If it is the root token of a sentence, then it\n  // is set to -1.\n  repeated int32 head = 1;\n\n  // Part-of-speech tag for token.\n  repeated string tag = 2;\n\n  // Coarse-grained word category for token.\n  repeated string category = 3;\n\n  // Label for dependency relation between this token and its head.\n  repeated string label = 4;\n\n  // The score of this analysis, where bigger values typically indicate better\n  // quality, but there are no guarantees and there is also no pre-defined\n  // range.\n  optional double score = 5;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/lexicon_builder.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include <stddef.h>\n#include <string>\n\n#include \"syntaxnet/affix.h\"\n#include \"syntaxnet/dictionary.pb.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/segmenter_utils.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/sentence_batch.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/framework/op_kernel.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n\n// A task that collects term statistics over a corpus and saves a set of\n// term maps; these saved mappings are used to map strings to ints in both the\n// chunker trainer and the chunker processors.\n\nusing tensorflow::DEVICE_CPU;\nusing tensorflow::DT_INT32;\nusing tensorflow::OpKernel;\nusing tensorflow::OpKernelConstruction;\nusing tensorflow::OpKernelContext;\nusing tensorflow::Tensor;\nusing tensorflow::TensorShape;\nusing tensorflow::errors::InvalidArgument;\n\nnamespace syntaxnet {\n\n// A workflow task that creates term maps (e.g., word, tag, etc.).\n//\n// Non-flag task parameters:\n// int lexicon_max_prefix_length (3):\n//   The maximum prefix length for lexicon words.\n// int lexicon_max_suffix_length (3):\n//   The maximum suffix length for lexicon words.\nclass LexiconBuilder 
: public OpKernel {\n public:\n  explicit LexiconBuilder(OpKernelConstruction *context) : OpKernel(context) {\n    OP_REQUIRES_OK(context, context->GetAttr(\"corpus_name\", &corpus_name_));\n    OP_REQUIRES_OK(context, context->GetAttr(\"lexicon_max_prefix_length\",\n                                             &max_prefix_length_));\n    OP_REQUIRES_OK(context, context->GetAttr(\"lexicon_max_suffix_length\",\n                                             &max_suffix_length_));\n\n    string file_path, data;\n    OP_REQUIRES_OK(context, context->GetAttr(\"task_context\", &file_path));\n    OP_REQUIRES_OK(context, ReadFileToString(tensorflow::Env::Default(),\n                                             file_path, &data));\n    OP_REQUIRES(context,\n                TextFormat::ParseFromString(data, task_context_.mutable_spec()),\n                InvalidArgument(\"Could not parse task context at \", file_path));\n  }\n\n  // Counts term frequencies.\n  void Compute(OpKernelContext *context) override {\n    // Term frequency maps to be populated by the corpus.\n    TermFrequencyMap words;\n    TermFrequencyMap lcwords;\n    TermFrequencyMap tags;\n    TermFrequencyMap categories;\n    TermFrequencyMap labels;\n    TermFrequencyMap chars;\n\n    // Affix tables to be populated by the corpus.\n    AffixTable prefixes(AffixTable::PREFIX, max_prefix_length_);\n    AffixTable suffixes(AffixTable::SUFFIX, max_suffix_length_);\n\n    // Tag-to-category mapping.\n    TagToCategoryMap tag_to_category;\n\n    // Make a pass over the corpus.\n    int64 num_tokens = 0;\n    int64 num_documents = 0;\n    Sentence *document;\n    TextReader corpus(*task_context_.GetInput(corpus_name_), &task_context_);\n    while ((document = corpus.Read()) != nullptr) {\n      // Gather token information.\n      for (int t = 0; t < document->token_size(); ++t) {\n        // Get token and lowercased word.\n        const Token &token = document->token(t);\n        string word = token.word();\n        
utils::NormalizeDigits(&word);\n        string lcword = tensorflow::str_util::Lowercase(word);\n\n        // Make sure the token does not contain a newline.\n        CHECK(lcword.find('\\n') == string::npos);\n\n        // Increment frequencies (only for terms that exist).\n        if (!word.empty() && !HasSpaces(word)) words.Increment(word);\n        if (!lcword.empty() && !HasSpaces(lcword)) lcwords.Increment(lcword);\n        if (!token.tag().empty()) tags.Increment(token.tag());\n        if (!token.category().empty()) categories.Increment(token.category());\n        if (!token.label().empty()) labels.Increment(token.label());\n\n        // Add prefixes/suffixes for the current word.\n        prefixes.AddAffixesForWord(word.c_str(), word.size());\n        suffixes.AddAffixesForWord(word.c_str(), word.size());\n\n        // Add mapping from tag to category.\n        tag_to_category.SetCategory(token.tag(), token.category());\n\n        // Add characters.\n        vector<tensorflow::StringPiece> char_sp;\n        SegmenterUtils::GetUTF8Chars(word, &char_sp);\n        for (const auto &c : char_sp) {\n          const string c_str = c.ToString();\n          if (!c_str.empty() && !HasSpaces(c_str)) chars.Increment(c_str);\n        }\n\n        // Update the number of processed tokens.\n        ++num_tokens;\n      }\n\n      delete document;\n      ++num_documents;\n    }\n    LOG(INFO) << \"Term maps collected over \" << num_tokens << \" tokens from \"\n              << num_documents << \" documents\";\n\n    // Write mappings to disk.\n    words.Save(TaskContext::InputFile(*task_context_.GetInput(\"word-map\")));\n    lcwords.Save(TaskContext::InputFile(*task_context_.GetInput(\"lcword-map\")));\n    tags.Save(TaskContext::InputFile(*task_context_.GetInput(\"tag-map\")));\n    categories.Save(\n        TaskContext::InputFile(*task_context_.GetInput(\"category-map\")));\n    labels.Save(TaskContext::InputFile(*task_context_.GetInput(\"label-map\")));\n    
chars.Save(TaskContext::InputFile(*task_context_.GetInput(\"char-map\")));\n\n    // Write affixes to disk.\n    WriteAffixTable(prefixes, TaskContext::InputFile(\n                                  *task_context_.GetInput(\"prefix-table\")));\n    WriteAffixTable(suffixes, TaskContext::InputFile(\n                                  *task_context_.GetInput(\"suffix-table\")));\n\n    // Write tag-to-category mapping to disk.\n    tag_to_category.Save(\n        TaskContext::InputFile(*task_context_.GetInput(\"tag-to-category\")));\n  }\n\n private:\n  // Returns true if the word contains spaces.\n  static bool HasSpaces(const string &word) {\n    for (char c : word) {\n      if (c == ' ') return true;\n    }\n    return false;\n  }\n\n  // Writes an affix table to a task output.\n  static void WriteAffixTable(const AffixTable &affixes,\n                              const string &output_file) {\n    ProtoRecordWriter writer(output_file);\n    affixes.Write(&writer);\n  }\n\n  // Name of the context input to compute lexicons.\n  string corpus_name_;\n\n  // Max length for prefix table.\n  int max_prefix_length_;\n\n  // Max length for suffix table.\n  int max_suffix_length_;\n\n  // Task context used to configure this op.\n  TaskContext task_context_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"LexiconBuilder\").Device(DEVICE_CPU),\n                        LexiconBuilder);\n\nclass FeatureSize : public OpKernel {\n public:\n  explicit FeatureSize(OpKernelConstruction *context) : OpKernel(context) {\n    string task_context_path;\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"task_context\", &task_context_path));\n    OP_REQUIRES_OK(context, context->GetAttr(\"arg_prefix\", &arg_prefix_));\n    OP_REQUIRES_OK(context, context->MatchSignature(\n                                {}, {DT_INT32, DT_INT32, DT_INT32, DT_INT32}));\n    string data;\n    OP_REQUIRES_OK(context, ReadFileToString(tensorflow::Env::Default(),\n                                           
  task_context_path, &data));\n    OP_REQUIRES(\n        context,\n        TextFormat::ParseFromString(data, task_context_.mutable_spec()),\n        InvalidArgument(\"Could not parse task context at \", task_context_path));\n    string label_map_path =\n        TaskContext::InputFile(*task_context_.GetInput(\"label-map\"));\n    label_map_ = SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(\n        label_map_path, 0, 0);\n  }\n\n  ~FeatureSize() override { SharedStore::Release(label_map_); }\n\n  void Compute(OpKernelContext *context) override {\n    // Computes feature sizes.\n    ParserEmbeddingFeatureExtractor features(arg_prefix_);\n    features.Setup(&task_context_);\n    features.Init(&task_context_);\n    const int num_embeddings = features.NumEmbeddings();\n    Tensor *feature_sizes = nullptr;\n    Tensor *domain_sizes = nullptr;\n    Tensor *embedding_dims = nullptr;\n    Tensor *num_actions = nullptr;\n    TF_CHECK_OK(context->allocate_output(0, TensorShape({num_embeddings}),\n                                         &feature_sizes));\n    TF_CHECK_OK(context->allocate_output(1, TensorShape({num_embeddings}),\n                                         &domain_sizes));\n    TF_CHECK_OK(context->allocate_output(2, TensorShape({num_embeddings}),\n                                         &embedding_dims));\n    TF_CHECK_OK(context->allocate_output(3, TensorShape({}), &num_actions));\n    for (int i = 0; i < num_embeddings; ++i) {\n      feature_sizes->vec<int32>()(i) = features.FeatureSize(i);\n      domain_sizes->vec<int32>()(i) = features.EmbeddingSize(i);\n      embedding_dims->vec<int32>()(i) = features.EmbeddingDims(i);\n    }\n\n    // Computes number of actions in the transition system.\n    std::unique_ptr<ParserTransitionSystem> transition_system(\n        ParserTransitionSystem::Create(task_context_.Get(\n            features.GetParamName(\"transition_system\"), \"arc-standard\")));\n    transition_system->Setup(&task_context_);\n    
transition_system->Init(&task_context_);\n    num_actions->scalar<int32>()() =\n        transition_system->NumActions(label_map_->Size());\n  }\n\n private:\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  // Dependency label map used in transition system.\n  const TermFrequencyMap *label_map_;\n\n  // Prefix for context parameters.\n  string arg_prefix_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"FeatureSize\").Device(DEVICE_CPU), FeatureSize);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/lexicon_builder_test.py",
    "content": "# coding=utf-8\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for lexicon_builder.\"\"\"\n\n\n# disable=no-name-in-module,unused-import,g-bad-import-order,maybe-no-member\nimport os.path\nimport tensorflow as tf\n\nimport syntaxnet.load_parser_ops\n\nfrom tensorflow.python.framework import test_util\nfrom tensorflow.python.platform import googletest\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom syntaxnet import sentence_pb2\nfrom syntaxnet import task_spec_pb2\nfrom syntaxnet.ops import gen_parser_ops\n\nFLAGS = tf.app.flags.FLAGS\n\nCONLL_DOC1 = u'''1 बात _ n NN _ _ _ _ _\n2 गलत _ adj JJ _ _ _ _ _\n3 हो _ v VM _ _ _ _ _\n4 तो _ avy CC _ _ _ _ _\n5 गुस्सा _ n NN _ _ _ _ _\n6 सेलेब्रिटिज _ n NN _ _ _ _ _\n7 को _ psp PSP _ _ _ _ _\n8 भी _ avy RP _ _ _ _ _\n9 आना _ v VM _ _ _ _ _\n10 लाजमी _ adj JJ _ _ _ _ _\n11 है _ v VM _ _ _ _ _\n12 । _ punc SYM _ _ _ _ _'''\n\nCONLL_DOC2 = u'''1 लेकिन _ avy CC _ _ _ _ _\n2 अभिनेत्री _ n NN _ _ _ _ _\n3 के _ psp PSP _ _ _ _ _\n4 इस _ pn DEM _ _ _ _ _\n5 कदम _ n NN _ _ _ _ _\n6 से _ psp PSP _ _ _ _ _\n7 वहां _ pn PRP _ _ _ _ _\n8 रंग _ n NN _ _ _ _ _\n9 में _ psp PSP _ _ _ _ _\n10 भंग _ adj JJ _ _ _ _ _\n11 पड़ _ v VM _ _ _ _ _\n12 गया _ v VAUX _ _ _ _ _\n13 । _ punc SYM _ _ _ _ _'''\n\nTAGS = ['NN', 'JJ', 'VM', 'CC', 'PSP', 'RP', 'JJ', 'SYM', 
'DEM', 'PRP', 'VAUX']\n\nCATEGORIES = ['n', 'adj', 'v', 'avy', 'n', 'psp', 'punc', 'pn']\n\nTOKENIZED_DOCS = u'''बात गलत हो तो गुस्सा सेलेब्रिटिज को भी आना लाजमी है ।\nलेकिन अभिनेत्री के इस कदम से वहां रंग में भंग पड़ गया ।\n'''\n\nCHARS = u'''अ इ आ क ग ज ट त द न प भ ब य म र ल व ह स ि ा ु ी े ै ो ् ड़ । ं'''\n\nCOMMENTS = u'# Line with fake comments.'\n\n\nclass LexiconBuilderTest(test_util.TensorFlowTestCase):\n\n  def setUp(self):\n    if not hasattr(FLAGS, 'test_srcdir'):\n      FLAGS.test_srcdir = ''\n    if not hasattr(FLAGS, 'test_tmpdir'):\n      FLAGS.test_tmpdir = tf.test.get_temp_dir()\n    self.corpus_file = os.path.join(FLAGS.test_tmpdir, 'documents.conll')\n    self.context_file = os.path.join(FLAGS.test_tmpdir, 'context.pbtxt')\n\n  def AddInput(self, name, file_pattern, record_format, context):\n    inp = context.input.add()\n    inp.name = name\n    inp.record_format.append(record_format)\n    inp.part.add().file_pattern = file_pattern\n\n  def WriteContext(self, corpus_format):\n    context = task_spec_pb2.TaskSpec()\n    self.AddInput('documents', self.corpus_file, corpus_format, context)\n    for name in ('word-map', 'lcword-map', 'tag-map',\n                 'category-map', 'label-map', 'prefix-table',\n                 'suffix-table', 'tag-to-category', 'char-map'):\n      self.AddInput(name, os.path.join(FLAGS.test_tmpdir, name), '', context)\n    logging.info('Writing context to: %s', self.context_file)\n    with open(self.context_file, 'w') as f:\n      f.write(str(context))\n\n  def ReadNextDocument(self, sess, doc_source):\n    doc_str, last = sess.run(doc_source)\n    if doc_str:\n      doc = sentence_pb2.Sentence()\n      doc.ParseFromString(doc_str[0])\n    else:\n      doc = None\n    return doc, last\n\n  def ValidateDocuments(self):\n    doc_source = gen_parser_ops.document_source(self.context_file, batch_size=1)\n    with self.test_session() as sess:\n      logging.info('Reading document1')\n      doc, last = 
self.ReadNextDocument(sess, doc_source)\n      self.assertEqual(len(doc.token), 12)\n      self.assertEqual(u'लाजमी', doc.token[9].word)\n      self.assertFalse(last)\n      logging.info('Reading document2')\n      doc, last = self.ReadNextDocument(sess, doc_source)\n      self.assertEqual(len(doc.token), 13)\n      self.assertEqual(u'भंग', doc.token[9].word)\n      self.assertFalse(last)\n      logging.info('Hitting end of the dataset')\n      doc, last = self.ReadNextDocument(sess, doc_source)\n      self.assertTrue(doc is None)\n      self.assertTrue(last)\n\n  def ValidateTagToCategoryMap(self):\n    with file(os.path.join(FLAGS.test_tmpdir, 'tag-to-category'), 'r') as f:\n      entries = [line.strip().split('\\t') for line in f.readlines()]\n    for tag, category in entries:\n      self.assertIn(tag, TAGS)\n      self.assertIn(category, CATEGORIES)\n\n  def LoadMap(self, map_name):\n    loaded_map = {}\n    with file(os.path.join(FLAGS.test_tmpdir, map_name), 'r') as f:\n      for line in f:\n        entries = line.strip().split(' ')\n        if len(entries) == 2:\n          loaded_map[entries[0]] = entries[1]\n    return loaded_map\n\n  def ValidateCharMap(self):\n    char_map = self.LoadMap('char-map')\n    self.assertEqual(len(char_map), len(CHARS.split(' ')))\n    for char in CHARS.split(' '):\n      self.assertIn(char.encode('utf-8'), char_map)\n\n  def ValidateWordMap(self):\n    word_map = self.LoadMap('word-map')\n    for word in filter(None, TOKENIZED_DOCS.replace('\\n', ' ').split(' ')):\n      self.assertIn(word.encode('utf-8'), word_map)\n\n  def BuildLexicon(self):\n    with self.test_session():\n      gen_parser_ops.lexicon_builder(task_context=self.context_file).run()\n\n  def testCoNLLFormat(self):\n    self.WriteContext('conll-sentence')\n    logging.info('Writing conll file to: %s', self.corpus_file)\n    with open(self.corpus_file, 'w') as f:\n      f.write((CONLL_DOC1 + u'\\n\\n' + CONLL_DOC2 + u'\\n')\n              .replace(' ', 
'\\t').encode('utf-8'))\n    self.ValidateDocuments()\n    self.BuildLexicon()\n    self.ValidateTagToCategoryMap()\n    self.ValidateCharMap()\n    self.ValidateWordMap()\n\n  def testCoNLLFormatExtraNewlinesAndComments(self):\n    self.WriteContext('conll-sentence')\n    with open(self.corpus_file, 'w') as f:\n      f.write((u'\\n\\n\\n' + CONLL_DOC1 + u'\\n\\n\\n' + COMMENTS +\n               u'\\n\\n' + CONLL_DOC2).replace(' ', '\\t').encode('utf-8'))\n    self.ValidateDocuments()\n    self.BuildLexicon()\n    self.ValidateTagToCategoryMap()\n\n  def testTokenizedTextFormat(self):\n    self.WriteContext('tokenized-text')\n    with open(self.corpus_file, 'w') as f:\n      f.write(TOKENIZED_DOCS.encode('utf-8'))\n    self.ValidateDocuments()\n    self.BuildLexicon()\n\n  def testTokenizedTextFormatExtraNewlines(self):\n    self.WriteContext('tokenized-text')\n    with open(self.corpus_file, 'w') as f:\n      f.write((u'\\n\\n\\n' + TOKENIZED_DOCS + u'\\n\\n\\n').encode('utf-8'))\n    self.ValidateDocuments()\n    self.BuildLexicon()\n\nif __name__ == '__main__':\n  googletest.main()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/load_parser_ops.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Loads parser_ops shared library.\"\"\"\n\nimport os.path\nimport tensorflow as tf\n\ntf.load_op_library(\n    os.path.join(tf.resource_loader.get_data_files_path(),\n                 'parser_ops.so'))\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_mcparseface/context.pbtxt",
    "content": "Parameter {\n  name: \"brain_parser_embedding_dims\"\n  value: \"32;32;64\"\n}\nParameter {\n  name: \"brain_parser_embedding_names\"\n  value: \"labels;tags;words\"\n}\nParameter {\n  name: 'brain_parser_scoring'\n  value: 'default'\n}\nParameter {\n  name: \"brain_parser_features\"\n  value:\n  'stack.child(1).label '\n  'stack.child(1).sibling(-1).label '\n  'stack.child(-1).label '\n  'stack.child(-1).sibling(1).label '\n  'stack.child(2).label '\n  'stack.child(-2).label '\n  'stack(1).child(1).label '\n  'stack(1).child(1).sibling(-1).label '\n  'stack(1).child(-1).label '\n  'stack(1).child(-1).sibling(1).label '\n  'stack(1).child(2).label '\n  'stack(1).child(-2).label; '\n  'input.token.tag '\n  'input(1).token.tag '\n  'input(2).token.tag '\n  'input(3).token.tag '\n  'stack.token.tag '\n  'stack.child(1).token.tag '\n  'stack.child(1).sibling(-1).token.tag '\n  'stack.child(-1).token.tag '\n  'stack.child(-1).sibling(1).token.tag '\n  'stack.child(2).token.tag '\n  'stack.child(-2).token.tag '\n  'stack(1).token.tag '\n  'stack(1).child(1).token.tag '\n  'stack(1).child(1).sibling(-1).token.tag '\n  'stack(1).child(-1).token.tag '\n  'stack(1).child(-1).sibling(1).token.tag '\n  'stack(1).child(2).token.tag '\n  'stack(1).child(-2).token.tag '\n  'stack(2).token.tag '\n  'stack(3).token.tag; '\n  'input.token.word '\n  'input(1).token.word '\n  'input(2).token.word '\n  'input(3).token.word '\n  'stack.token.word '\n  'stack.child(1).token.word '\n  'stack.child(1).sibling(-1).token.word '\n  'stack.child(-1).token.word '\n  'stack.child(-1).sibling(1).token.word '\n  'stack.child(2).token.word '\n  'stack.child(-2).token.word '\n  'stack(1).token.word '\n  'stack(1).child(1).token.word '\n  'stack(1).child(1).sibling(-1).token.word '\n  'stack(1).child(-1).token.word '\n  'stack(1).child(-1).sibling(1).token.word '\n  'stack(1).child(2).token.word '\n  'stack(1).child(-2).token.word '\n  'stack(2).token.word '\n  'stack(3).token.word 
'\n}\nParameter {\n  name: \"brain_parser_transition_system\"\n  value: \"arc-standard\"\n}\n\nParameter {\n  name: \"brain_tagger_embedding_dims\"\n  value: \"8;16;16;16;16;64\"\n}\nParameter {\n  name: \"brain_tagger_embedding_names\"\n  value: \"other;prefix2;prefix3;suffix2;suffix3;words\"\n}\nParameter {\n  name: \"brain_tagger_features\"\n  value:\n  'input.digit '\n  'input.hyphen; '\n  'input.prefix(length=\"2\") '\n  'input(1).prefix(length=\"2\") '\n  'input(2).prefix(length=\"2\") '\n  'input(3).prefix(length=\"2\") '\n  'input(-1).prefix(length=\"2\") '\n  'input(-2).prefix(length=\"2\") '\n  'input(-3).prefix(length=\"2\") '\n  'input(-4).prefix(length=\"2\"); '\n  'input.prefix(length=\"3\") '\n  'input(1).prefix(length=\"3\") '\n  'input(2).prefix(length=\"3\") '\n  'input(3).prefix(length=\"3\") '\n  'input(-1).prefix(length=\"3\") '\n  'input(-2).prefix(length=\"3\") '\n  'input(-3).prefix(length=\"3\") '\n  'input(-4).prefix(length=\"3\"); '\n  'input.suffix(length=\"2\") '\n  'input(1).suffix(length=\"2\") '\n  'input(2).suffix(length=\"2\") '\n  'input(3).suffix(length=\"2\") '\n  'input(-1).suffix(length=\"2\") '\n  'input(-2).suffix(length=\"2\") '\n  'input(-3).suffix(length=\"2\") '\n  'input(-4).suffix(length=\"2\"); '\n  'input.suffix(length=\"3\") '\n  'input(1).suffix(length=\"3\") '\n  'input(2).suffix(length=\"3\") '\n  'input(3).suffix(length=\"3\") '\n  'input(-1).suffix(length=\"3\") '\n  'input(-2).suffix(length=\"3\") '\n  'input(-3).suffix(length=\"3\") '\n  'input(-4).suffix(length=\"3\"); '\n  'input.token.word '\n  'input(1).token.word '\n  'input(2).token.word '\n  'input(3).token.word '\n  'input(-1).token.word '\n  'input(-2).token.word '\n  'input(-3).token.word '\n  'input(-4).token.word '\n}\nParameter {\n  name: \"brain_tagger_transition_system\"\n  value: \"tagger\"\n}\n\ninput {\n  name: \"tag-map\"\n  Part {\n    file_pattern: \"syntaxnet/models/parsey_mcparseface/tag-map\"\n  }\n}\ninput {\n  name: 
\"tag-to-category\"\n  Part {\n    file_pattern: \"syntaxnet/models/parsey_mcparseface/fine-to-universal.map\"\n  }\n}\ninput {\n  name: \"word-map\"\n  Part {\n    file_pattern: \"syntaxnet/models/parsey_mcparseface/word-map\"\n  }\n}\ninput {\n  name: \"label-map\"\n  Part {\n    file_pattern: \"syntaxnet/models/parsey_mcparseface/label-map\"\n  }\n}\ninput {\n  name: \"prefix-table\"\n  Part {\n    file_pattern: \"syntaxnet/models/parsey_mcparseface/prefix-table\"\n  }\n}\ninput {\n  name: \"suffix-table\"\n  Part {\n    file_pattern: \"syntaxnet/models/parsey_mcparseface/suffix-table\"\n  }\n}\ninput {\n  name: 'stdin'\n  record_format: 'english-text'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdin-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdout-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_mcparseface/label-map",
    "content": "46\npunct 243160\nprep 194627\npobj 186958\ndet 170592\nnsubj 144821\nnn 144800\namod 117242\nROOT 90592\ndobj 88551\naux 76523\nadvmod 72893\nconj 59384\ncc 57532\nnum 36350\nposs 35117\ndep 34986\nccomp 29470\ncop 25991\nmark 25141\nxcomp 25111\nrcmod 16234\nauxpass 15740\nadvcl 14996\npossessive 14866\nnsubjpass 14133\npcomp 12488\nappos 11112\npartmod 11106\nneg 11090\nnumber 10658\nprt 7123\nquantmod 6653\ntmod 5418\ninfmod 5134\nnpadvmod 3213\nparataxis 3012\nmwe 2793\nexpl 2712\niobj 1642\nacomp 1632\ndiscourse 1381\ncsubj 1225\npredet 1160\npreconj 749\ngoeswith 146\ncsubjpass 41\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_mcparseface/tag-map",
    "content": "49\nNN 285194\nIN 228165\nDT 179147\nNNP 175147\nJJ 125667\nNNS 115732\n, 97481\n. 85938\nRB 78513\nVB 63952\nCC 57554\nVBD 56635\nCD 55674\nPRP 55244\nVBZ 48126\nVBN 44458\nVBG 34524\nVBP 33669\nTO 28772\nMD 22364\nPRP$ 20706\nHYPH 18526\nPOS 14905\n`` 12193\n'' 12154\nWDT 10267\n: 8713\n$ 7993\nWP 7336\nRP 7335\nWRB 6634\nJJR 6295\nNNPS 5917\n-RRB- 3904\n-LRB- 3840\nJJS 3596\nRBR 3186\nEX 2733\nUH 1521\nRBS 1467\nPDT 1271\nFW 928\nNFP 844\nSYM 652\nADD 476\nLS 392\nWP$ 332\nGW 184\nAFX 42\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_mcparseface/word-map",
    "content": "64036\n, 95982\nthe 95431\n. 79280\nof 50788\nto 46845\nand 41875\na 36812\nin 32626\n- 20001\nthat 18712\nis 18051\n's 16944\nfor 16102\nThe 12278\non 11455\nit 10518\nwith 10026\nwas 9250\nas 9051\nare 8765\nby 8516\nbe 8294\nfrom 8265\nhave 7954\nat 7882\n`` 7861\n\" 7848\nI 7744\n'' 7735\n$ 7534\nhas 7228\nsaid 7068\nwill 6580\nthis 6068\nan 6040\n% 5847\nhe 5824\nn't 5474\nnot 5350\nthey 5185\nits 5089\nyou 5053\nor 5033\nmillion 5019\n? 4768\nMr. 4564\nyear 4550\nhis 4518\nwhich 4349\nwould 4254\nbut 4218\ntheir 4175\nmore 3993\nabout 3968\nwere 3924\nup 3757\n: 3721\none 3694\nwho 3694\nhad 3605\nIn 3578\n-RRB- 3483\nbeen 3476\n-LRB- 3430\nall 3264\npeople 3193\nout 3151\ncan 3149\nalso 3146\nthan 3129\ndo 3044\nsays 2982\nwe 2951\ncompany 2941\n-- 2857\nIt 2819\nnew 2776\nBut 2632\nthere 2631\nother 2614\nsome 2599\nmarket 2559\ntime 2464\ntwo 2458\nyears 2456\nwhen 2347\nbillion 2318\nso 2260\nif 2254\ninto 2204\nonly 2196\nthem 2184\nover 2180\ncould 2139\nHe 2137\nafter 2117\nfirst 2098\nwhat 2093\n; 2086\nU.S. 2043\nno 1942\nlast 1911\nbecause 1894\nChina 1868\nA 1840\ngovernment 1832\n' 1768\nsuch 1767\nmany 1756\nlike 1740\nnow 1726\nshare 1724\njust 1702\nWhat 1686\nmost 1676\nany 1667\nNew 1647\nmy 1612\ndid 1610\nyour 1589\nThis 1573\nWe 1529\nstock 1520\nsay 1503\nmay 1480\neven 1472\nget 1455\nshould 1415\nvery 1408\nmuch 1407\nour 1405\nmake 1397\nbusiness 1396\nthese 1388\nTaiwan 1342\ndown 1337\nher 1337\nthree 1332\nhim 1330\nThey 1322\nthose 1310\nstill 1305\nback 1276\ndoes 1276\nmade 1273\nbetween 1269\nforeign 1263\nwork 1240\nwell 1227\nIf 1225\nthen 1222\nCorp. 1218\nshares 1216\n& 1214\nhigh 1212\nbefore 1210\nUS 1207\nthrough 1205\nquarter 1198\nway 1190\nme 1183\nhow 1181\nbeing 1177\nworld 1166\nPresident 1160\nday 1157\nlong 1143\npresident 1142\ngood 1141\ncompanies 1133\nAnd 1124\nshe 1121\noff 1114\nmoney 1100\ntrading 1100\nsales 1079\nInc. 
1077\nYork 1061\nAmerican 1050\nstate 1046\nagainst 1038\ntake 1036\nnext 1029\nunder 1015\nweek 1012\nwhile 1010\nknow 1006\nprice 1006\nthink 1006\nChinese 1003\nown 999\ntoday 998\ngroup 991\ninvestment 986\ncountry 977\nwhere 969\ngo 952\nsince 950\nCo. 943\neconomic 941\n1 940\nsame 935\ngoing 931\n! 930\nboth 930\ndays 925\nold 914\nmonths 907\n10 905\nbuy 905\nsee 903\nindustry 892\nFor 891\nmonth 891\nwant 891\nprices 890\nAs 888\nend 884\nhere 882\nuse 881\nago 861\nmajor 849\nThat 848\nus 847\nbased 846\nAt 838\nanother 838\npart 833\nthird 833\nnumber 830\ninterest 827\n... 818\nearlier 816\nused 815\ntoo 814\n're 813\nrate 813\ndevelopment 810\nofficials 807\ncents 804\npast 795\n30 787\ncalled 786\ncapital 784\nThere 783\nlife 776\nbig 772\ntrade 772\nyesterday 772\nalready 758\nput 758\nWhen 757\nfinancial 757\nfew 756\nneed 753\nright 753\npublic 746\nYou 744\nduring 742\nBush 741\neach 740\nsystem 740\nfive 739\nplan 739\nissue 738\naround 737\nNational 736\ncome 735\nUnited 731\nproducts 726\nhome 715\nincluding 715\nlarge 715\ncountries 714\nrose 712\noil 710\ninvestors 709\ntax 704\nbecome 703\nexpected 703\nmight 695\nhelp 692\npoint 688\nNews 682\ndollars 673\nlittle 672\nmust 672\ngrowth 671\nless 669\n2 666\neconomy 666\nsecond 666\ninternational 665\nsupport 664\nfar 658\ncase 656\nplace 656\nJapan 652\npolitical 651\nbonds 649\namong 647\nprogram 647\nearnings 645\nsmall 645\nchief 644\nincome 644\nlaw 644\nreal 644\nJapanese 643\nBank 642\nrecent 640\npower 639\ncity 636\nreport 634\nseveral 626\n20 619\nnational 619\nfour 618\narea 615\ntotal 613\nexecutive 610\npay 610\ngreat 606\nleast 605\nlevel 605\nreally 601\nimportant 600\nwithout 600\nincrease 596\nevery 591\nhalf 590\nstocks 590\nsell 586\nnet 585\nissues 582\nchairman 580\nearly 580\nfind 578\nvalue 578\nheld 576\nset 576\n15 575\nIraq 575\nagain 573\ngive 573\nunit 573\nperiod 572\nfederal 570\nhigher 570\nformer 568\nterm 568\nmaking 565\ncost 564\nHouse 562\nmeeting 
562\nprofit 561\n'm 560\norder 560\n/ 558\nwhether 556\nIsraeli 554\nnever 553\nbank 552\nfamily 552\nproblem 552\nSo 551\ncurrent 551\nmanagement 551\nOn 546\ncame 546\ntook 544\nlot 542\nfunds 541\nfirm 536\nOne 532\nclose 532\nproblems 532\noffice 531\nthings 530\nPalestinian 528\nreported 527\nbest 524\ngot 523\nplans 523\nuntil 523\nproduction 522\nagreement 521\npolicy 521\nFriday 518\nchange 518\nrates 518\nsix 518\nAfter 517\ncontinue 517\nable 515\n50 514\noffer 514\nlocal 513\n3 512\nservice 512\nHow 508\noperations 507\nfound 505\nSome 504\nlower 503\nExchange 501\nfact 501\naccording 500\nbetter 500\ncommon 499\nname 499\nsold 499\n've 497\ndebt 496\nloss 496\naway 495\ncar 495\ntechnology 495\nWest 493\nwar 493\ndue 492\nlook 489\nexchange 485\nam 484\n8 483\nalways 483\nmilitary 483\nfell 480\nfuture 478\nmove 477\nMinister 476\ndifferent 476\nsomething 476\nfurther 474\nleft 474\ntold 474\nhard 473\ninformation 473\nman 473\n'll 472\ndirector 471\nlate 468\nnews 468\nthough 468\ncomputer 467\nonce 467\nteam 467\nchildren 465\ncall 464\nline 464\nadded 463\nSouth 462\ncosts 462\ngeneral 462\nothers 462\nBritish 461\nSeptember 461\nWashington 459\ncontrol 459\nInternational 458\ncourt 458\nreturn 458\ntimes 458\nmembers 457\nAl 454\nshow 454\nsituation 454\nlargest 453\nmarkets 453\n100 452\naverage 452\nvice 452\nclosed 451\nHong 450\nsecurity 450\nca 449\npossible 449\nconstruction 447\ntop 447\nanalysts 446\nface 444\nsale 443\nstrong 442\nEast 440\nresult 440\nresults 440\nrecently 439\nKong 438\npeace 438\nsecurities 438\nlater 437\nwater 437\nwent 435\n5 434\nnation 429\ndeal 428\nresearch 428\nschool 428\ngetting 427\nrights 427\nbelieve 426\nthing 426\nenough 425\nenterprises 425\nindex 425\npoints 425\nfood 424\nknown 424\nyet 421\nAccording 420\nannual 420\nbid 420\nofficial 420\nbanks 419\nexample 419\nworking 419\n1988 417\nCongress 417\nhead 417\nAmerica 416\ninsurance 416\nHowever 415\nboard 415\nassets 414\nState 413\nbegan 412\nmen 
412\nspecial 411\nyield 411\nCalifornia 410\ncash 410\nearthquake 410\nfree 410\ncourse 409\nincreased 409\nDepartment 408\nGeneral 408\nconcern 405\nfull 403\nprocess 402\nkeep 401\noften 401\nkind 400\ntrying 400\nIsrael 399\nWith 399\nareas 399\nshort 399\nwhole 399\nproject 398\nlooking 397\nwithin 397\nusing 396\nJohn 394\nagreed 393\never 393\ngas 392\ntalks 392\nSan 391\ncertain 390\ncontract 390\nended 390\ndecision 388\n12 387\nparty 387\nreached 387\nweeks 387\nWorld 386\nmakes 386\nwo 386\nprojects 385\nspokesman 385\nNow 384\nbuilding 384\nFirst 383\nAgency 382\nStock 380\nlikely 380\nopen 380\nStates 379\nalmost 379\ncut 379\nlow 379\nnearly 379\noutside 378\nhowever 377\nreports 377\ntaken 376\nasked 375\nrun 375\nMonday 374\nannounced 374\nfollowing 374\nGroup 372\nofficer 372\ntry 372\nrevenue 370\n1989 369\namount 369\nperson 369\nposition 369\nbill 368\n4 365\nThese 365\nactually 365\noperating 365\nstart 365\nSoviet 364\nAugust 363\nTaiwanese 363\ndollar 363\nservices 363\nagency 362\nprovide 362\ntaking 362\ndoing 361\nwomen 361\nmain 360\nstarted 359\n25 358\nShe 358\ni 358\ncompared 357\nowned 357\nreceived 357\nsaying 357\nTo 354\ndone 354\nlet 354\nwhy 352\ncredit 351\nhand 350\n11 349\n40 349\nGod 349\nadministration 349\nhaving 349\nloans 349\nforces 348\nhealth 348\nprivate 348\nTV 347\ncurrently 347\njob 347\nselling 346\nWho 345\nclear 345\nfeel 345\nhours 345\ntogether 345\ncare 344\nespecially 343\nfirms 343\ngiven 343\nnight 343\nthought 343\nyen 343\nlive 342\nbond 341\nlatest 341\naddition 340\nfund 339\nBeijing 338\nalong 338\nsoon 338\nstop 338\nbring 337\nforce 335\nnear 335\nhistory 334\nthemselves 334\nStreet 333\nbuying 333\nland 333\ndemand 332\nleaders 332\nreason 332\nemployees 331\nquestion 331\nitself 330\nMy 329\nleader 329\nexpect 328\ngroups 328\nworkers 328\nEuropean 327\nkilled 327\nFederal 326\nelection 326\nrelated 326\nside 325\nchanges 324\ncoming 324\nestate 324\nplant 324\nseen 324\nwhose 324\nLondon 323\nOct. 
323\nhit 323\nlost 323\n.. 322\nCommittee 322\nmeans 322\nstake 322\nUniversity 321\nhuman 320\nproduct 320\n6 319\nrather 319\nCity 318\noffering 318\nsenior 318\n1990 317\nPrime 317\nbusinesses 315\nvarious 315\nair 314\nnotes 314\nrecord 314\nrelations 314\n13 313\nIraqi 313\nXinhua 313\nprobably 313\nWhile 311\nbad 311\npolice 311\nstatement 311\neither 310\ninclude 310\nhouse 309\nClinton 308\nabove 308\nChen 307\nlosses 307\nmaker 307\nrisk 307\nage 306\ndomestic 306\nmorning 306\nWall 305\npaid 305\nTreasury 304\npersonal 304\ntell 304\nplay 303\ntraders 303\nPeople 302\nanything 302\nnine 302\nrole 302\nsure 302\nled 301\nper 301\nprevious 301\nvolume 301\nNorth 300\nstory 300\nLast 298\nmeet 298\nrise 298\nsingle 298\nBy 297\nUnion 297\nfall 297\nAssociation 296\ncorporate 296\nregion 296\nkey 295\nvisit 295\nHis 294\nmedia 294\nnothing 294\nquality 294\nTexas 293\nattack 293\neffort 293\nhope 293\nseems 293\nanalyst 291\nNo 290\nsociety 290\nAll 289\nParty 289\nSenate 289\nform 289\ndamage 288\nmatter 287\nneeds 287\nindustrial 286\n1987 285\nWhite 285\nefforts 285\ngain 285\n1/2 284\nBoard 284\nEven 284\nSecurities 284\nproposed 284\nPalestinians 283\naction 283\nlegal 283\nterms 283\ndeath 282\nyuan 282\nDecember 281\nbehind 281\nequipment 281\nholding 280\nUAL 279\ndevelop 279\nlead 279\nturn 279\nEurope 278\nTaipei 278\njunk 278\nnamed 278\nAlthough 277\ncharge 277\nDavid 276\nPacific 276\ndefense 276\nexport 276\nhold 276\ntalk 276\nmember 275\n9 274\noverseas 274\nstudents 274\nread 273\nview 273\nwanted 273\nFrancisco 272\nRussia 272\nAn 271\ndrop 271\nOctober 270\nseven 270\nWestern 269\nexperience 269\nroad 269\nsense 269\nstates 269\nbuild 268\ngains 268\nhour 268\nissued 268\nreduce 268\nsomeone 268\nhimself 267\nyoung 267\n'd 266\nMany 266\nbrought 266\ncommunity 266\nreporter 266\nIran 265\nlight 265\nlonger 265\ncommercial 264\nfutures 264\nhuge 264\nquickly 264\npriced 263\nestimated 262\nbecame 261\ndisaster 261\nDo 260\nKorea 260\nWang 
260\ncomment 260\ngoods 260\nCouncil 259\ncases 259\nexpects 259\nproposal 259\nDow 258\nEnd 258\nMarch 258\npaper 258\nwants 258\nJames 257\ncomes 256\ndecline 256\njoint 256\nnatural 256\nsimilar 256\nJuly 255\ncampaign 255\npurchase 255\nspace 255\nspending 255\nChicago 254\nRussian 254\ncustomers 254\nestablished 254\ngave 254\nprograms 254\nquite 254\nremain 254\nsides 254\ntelevision 254\n7 253\nMs. 253\nidea 253\nsimply 253\nTuesday 252\norders 252\n60 251\ncontinued 251\ndeclined 251\nincreasing 251\ninterests 251\nRobert 250\nattention 250\ntoward 250\n500 249\nJanuary 249\nNov. 249\nart 249\ncharges 249\ndifficult 249\nwon 249\nAmericans 248\nJones 248\noffered 248\nvote 248\nGermany 247\nal 247\ncenter 247\nloan 247\nreach 247\nMay 246\nThen 246\nbelow 246\nbuilt 246\ncalls 246\ninvolved 246\ntakeover 246\nconference 245\nturned 245\nApril 244\nbook 244\ndrug 244\nopening 244\nstaff 244\nShanghai 243\nclaims 243\npresent 242\npressure 242\nSecretary 241\napproved 241\nexports 241\nsent 241\ncompleted 240\nelse 240\netc. 239\ngrowing 239\nmean 239\noperation 239\nsystems 239\nunderstand 239\nSince 238\nnetwork 238\nproperty 238\nseries 238\npost 237\n17 236\nMichael 236\naccount 236\ndata 236\neffect 236\nCommission 235\ninstead 235\nLtd. 
234\nbiggest 234\ncars 234\ncities 234\nquestions 234\n16 233\nInc 233\nleading 233\nleave 233\npotential 233\nForeign 232\navailable 232\nblack 232\ncompetition 232\ncreate 232\neverything 232\ngame 232\nlove 232\noutstanding 232\nbudget 231\neducation 231\nfront 231\nneeded 231\nFrench 230\nenvironment 230\nmainland 230\nGeorge 229\nMost 229\ndepartment 229\ngold 229\nindividual 229\nsocial 229\nbasis 228\nheavy 228\nnuclear 228\nstep 228\nstudy 228\ntown 228\nunits 228\n14 227\n18 227\nacross 227\ndropped 227\nshows 227\ndeveloped 226\nlevels 226\nraise 226\nworks 226\nFord 225\nfiscal 225\nmind 225\n3/4 224\nFrance 224\npoor 224\nallow 223\nfiled 223\nmajority 223\nmiddle 223\nrunning 223\nentire 222\nexecutives 222\ninflation 222\nremains 222\nspend 222\ntest 222\nBig 221\nSaudi 221\nanyone 221\nfriends 221\nsubject 221\nwide 221\nJune 220\nbeginning 220\ndivision 220\nfinal 220\nfourth 220\nFrom 218\nconditions 218\nincluded 218\nserious 218\nArab 217\nCanada 217\nCapital 217\nLi 217\nSecurity 217\nbought 217\nenergy 217\n} 217\nahead 215\ncarry 215\ncaused 215\ncomposite 215\neveryone 215\nevidence 215\nmet 215\n{ 215\nadditional 214\nbegin 214\nparties 214\nphone 214\ndeveloping 213\nfinancing 213\ntaxes 213\nCourt 212\nfigures 212\nAir 211\nDevelopment 211\nMacau 211\nalthough 211\ncommittee 211\nconsumer 211\nmanager 211\noptions 211\nviolence 211\nSaddam 210\nfire 210\ninstitutions 210\nlabor 210\nraised 210\nrest 210\nscale 210\naid 209\nelections 209\nhelped 209\nliving 209\nwife 209\nwords 209\n80 208\nEnglish 208\njudge 208\nlimited 208\nlives 208\nrange 208\nseem 208\nsigned 208\nAlso 207\nBarak 207\nclass 207\nopposition 207\nvia 207\nwoman 207\nask 206\nparents 206\nresidents 206\n200 205\nGerman 205\ncontracts 205\ndrive 205\nstrategy 205\nprofits 204\nbit 203\ncause 203\nfailed 203\nhands 203\nmarketing 203\nparticularly 203\nroom 203\nword 203\n31 202\nWarner 202\nbase 202\ninvestments 202\nperformance 202\nconsidered 201\ndecided 201\nstore 
201\ntakes 201\nacquisition 200\nchanged 200\nresponse 200\nworth 200\nLee 199\nMilosevic 199\nOf 199\nWhy 199\nability 199\nagree 199\nopened 199\nstand 199\nwrote 199\nChairman 198\nMinistry 198\nauto 198\neight 198\nCounty 197\nHere 197\nWednesday 197\nimpact 197\nInstitute 196\nKorean 196\nlanguage 196\npassed 196\ntraditional 196\ntransaction 196\ntransportation 196\nBaghdad 195\nCole 195\nCompany 195\nLos 195\nbenefits 195\nfocus 195\nways 195\nTokyo 194\nchild 194\ncivil 194\nsaw 194\nNations 193\nThursday 193\nWhere 193\ncreated 193\nparts 193\nreform 193\nthousands 193\nactivities 192\ndespite 192\nfather 192\nforward 192\nopinion 192\nresources 192\nwrite 192\nBritain 191\ncomputers 191\ndirectly 191\nground 191\nsuit 191\nten 191\ntrial 191\nBecause 190\nCentral 190\nJaguar 190\napproval 190\ninvestor 190\nposted 190\ntroops 190\nbills 189\nhappened 189\nlines 189\nseeking 189\nwrong 189\n24 188\nNot 188\nbody 188\nopportunity 188\nschools 188\nstay 188\ntons 188\nventure 188\nweapons 188\nchance 187\nindustries 187\nnoted 187\ntrust 187\nconcerned 186\ncouple 186\ngoes 186\nsend 186\nslightly 186\nspecific 186\nstores 186\nsupply 186\nAngeles 185\nToday 185\ncentury 185\nequity 185\npreviously 185\ntalking 185\nMeanwhile 184\nadvanced 184\nalone 184\ndaily 184\ndied 184\nhousing 184\ninside 184\nsignificant 184\ntrue 184\n# 183\n1/4 183\nIBM 183\nSept. 
183\nTwo 183\ndate 183\nheart 183\npress 183\nscheduled 183\nstation 183\nsteel 183\ntried 183\nPentagon 182\ncapacity 182\nfelt 182\nfilm 182\nfully 182\ninvestigation 182\nmedical 182\npolicies 182\nsector 182\nsite 182\nactivity 181\ncurrency 181\ngreater 181\nhopes 181\nindependent 181\nAmong 180\nNovember 180\nUnder 180\nproduce 180\nwhite 180\nworked 180\narticle 179\nbenefit 179\ncrisis 179\neffective 179\nnecessary 179\nship 179\nsize 179\nsoldiers 179\nsuccess 179\n300 178\nCanadian 178\nOur 178\nUN 178\nWar 178\nconsider 178\ncooperation 178\ndeficit 178\ndirect 178\nfinance 178\nimprove 178\nincludes 178\nminutes 178\nmodern 178\nrequired 178\nrules 178\nspent 178\nwin 178\nGreat 177\nimports 177\nperhaps 177\nprovided 177\nwilling 177\nAsia 176\nDuring 176\nEastern 176\nFed 176\nIs 176\nfield 176\nmoving 176\nMiddle 175\nThanks 175\nWilliam 175\nYear 175\ncover 175\nsubsidiary 175\nusually 175\nFlorida 174\nadvertising 174\nmusic 174\nrestructuring 174\nrule 174\nself 174\nBoth 173\nRichard 173\ncomplete 173\nelected 173\nfear 173\nhighest 173\nimmediately 173\nparticular 173\nreceive 173\nactive 172\nallowed 172\nbanking 172\nhighly 172\nlist 172\nmoved 172\nrelationship 172\nBoston 171\ncancer 171\netc 171\nmaterials 171\nmeasures 171\np. 171\nshareholders 171\nupon 171\nOffice 170\nbuildings 170\ncentral 170\ncertainly 170\ncontinues 170\nletter 170\nplanned 170\nserve 170\n1986 169\n22 169\nCo 169\naccept 169\nattempt 168\nauthorities 168\nexperts 168\noriginal 168\nunion 168\nconcerns 167\nincreases 167\nmoment 167\nD. 166\nDr. 
166\neasy 166\nmakers 166\nmanufacturing 166\nprotection 166\nsettlement 166\nCorp 165\nabroad 165\nevent 165\ngained 165\nimage 165\ninvested 165\nmanagers 165\nones 165\npercentage 165\nplants 165\npopular 165\nson 165\ntreatment 165\naccounts 164\nbacked 164\nclearly 164\nexpressed 164\nextremely 164\nheard 164\nholders 164\nmachine 164\nproduced 164\nremember 164\nFebruary 163\nYemen 163\nfight 163\npayments 163\nplanning 163\nGaza 162\nknowledge 162\nplaces 162\ntomorrow 162\ntraffic 162\nactual 161\njobs 161\npopulation 161\nshowed 161\nhear 160\nnegotiations 160\nparent 160\ntravel 160\nacquired 159\nadd 159\nculture 159\nsign 159\nBay 158\nDemocratic 158\nRed 158\nRevenue 158\ncitizens 158\nfast 158\nfeet 158\nlack 158\nlawyers 158\norganization 158\nred 158\nreporters 158\nseek 158\nstyle 158\nthreat 158\n21 157\n70 157\n90 157\nGulf 157\nfigure 157\nfinally 157\nglobal 157\npolitics 157\nReporter 156\nThose 156\nWu 156\napparently 156\nattorney 156\nbattle 156\nevents 156\nfixed 156\nhundred 156\nreleased 156\nsort 156\ntonight 156\nAsian 155\nfloor 155\nmaintain 155\nmeasure 155\nstrength 155\ngeneration 154\nprogress 154\ntype 154\nArmy 153\nMrs. 
153\nPaul 153\nairline 153\npaying 153\nreduced 153\ns 153\nsometimes 153\nspeed 153\nJust 152\nLet 152\napproach 152\nblood 152\nmodel 152\nnewspaper 152\nsource 152\nsummer 152\ntraining 152\nappear 151\ndetails 151\nfollowed 151\nforced 151\nmagazine 151\nnumbers 151\nprovince 151\nretail 151\nBusiness 150\nProvince 150\nannouncement 150\ncalling 150\nemergency 150\nenvironmental 150\nestimates 150\nfamilies 150\nfriendly 150\ngiving 150\nhurt 150\nlarger 150\nprocessing 150\nrelatively 150\ntrip 150\nvalues 150\nKostunica 149\nRepublican 149\nService 149\nTrade 149\ndirectors 149\nmaterial 149\nnote 149\nportfolio 149\nruling 149\nact 148\nbelieved 148\ngone 148\norganizations 148\npretty 148\nregulations 148\nstarting 148\nad 147\navoid 147\nbasic 147\ncompletely 147\ncrime 147\npilots 147\nready 147\nseparate 147\nsession 147\nIts 146\nclients 146\ncontrolled 146\ndouble 146\nfacilities 146\nhandle 146\ninterview 146\npiece 146\nreasons 146\nreserves 146\ntelephone 146\n19 145\nDefense 145\nNBC 145\nSales 145\nbelieves 145\ngrow 145\nmarks 145\nsearch 145\nwhom 145\nAustralia 144\nMore 144\nMoreover 144\nadds 144\ndemocracy 144\nexisting 144\nfine 144\ninitial 144\nstandard 144\nthinking 144\n150 143\n1992 143\nCenter 143\nFinancial 143\nGore 143\n] 143\nfamous 143\nhappy 143\nlargely 143\nowns 143\nplease 143\nAirlines 142\nHussein 142\n` 142\nfighting 142\nhundreds 142\nmiles 142\nmortgage 142\nnor 142\nregional 142\nresponsible 142\nsmaller 142\nsources 142\nspeech 142\nwritten 142\nChang 141\nWell 141\ndividend 141\nestimate 141\nlegislation 141\nmainly 141\noffices 141\npartner 141\npick 141\npowerful 141\n23 140\nSupreme 140\nattacks 140\nbeyond 140\ndistrict 140\nmother 140\nscientific 140\nGM 139\n[ 139\nbrand 139\ngenerally 139\ninstance 139\nlooks 139\nmentioned 139\nplaying 139\nregions 139\nstandards 139\nstatus 139\nstrike 139\nvictory 139\nNavy 138\naccess 138\nauction 138\nbreak 138\ncarried 138\ncontinuing 138\ndead 138\ndiscuss 138\noverall 
138\npositive 138\nproducers 138\nresponsibility 138\nsoftware 138\nvictims 138\nOther 137\naccused 137\ndifference 137\ngives 137\nintroduced 137\nknows 137\nnormal 137\nprimary 137\nsuccessful 137\nunless 137\nfollow 136\nheadquarters 136\nholds 136\n26 135\nABC 135\ncandidates 135\ndesigned 135\nearned 135\nestablish 135\nhomes 135\ninterested 135\nkill 135\nlaws 135\nleaving 135\nminister 135\nreturned 135\nArafat 134\nBureau 134\nMexico 134\nSeries 134\nStill 134\ngoal 134\nplus 134\nreturns 134\nsteps 134\nstructure 134\nterrorist 134\n35 133\nAnother 133\nJudge 133\nThree 133\nclosely 133\ndoor 133\nlose 133\nstage 133\ntrend 133\nAdministration 132\nBill 132\nPeter 132\nagencies 132\nbuyers 132\ncustomer 132\ndecade 132\ndeclared 132\nfly 132\nhappen 132\nincreasingly 132\nleadership 132\nlearned 132\noutput 132\nrelief 132\nslow 132\n75 131\nInternet 131\nParis 131\nTeng 131\nchoice 131\nexpand 131\ngets 131\nrising 131\nsharp 131\nstreet 131\nwaiting 131\nA. 130\nacquire 130\nbrokerage 130\neyes 130\nintelligence 130\ninvest 130\nplayed 130\npractice 130\nprepared 130\nprevent 130\nsports 130\n5/8 129\nafternoon 129\nappears 129\nbrother 129\nchemical 129\ndiplomatic 129\nfees 129\ngrew 129\npresidential 129\nrich 129\naircraft 128\nchoose 128\ndeep 128\nhusband 128\nkept 128\nlearn 128\nmaybe 128\nprotect 128\nquake 128\nrequire 128\nsecretary 128\n3/8 127\nJerusalem 127\nRiver 127\nbankruptcy 127\nconflict 127\ndirection 127\nfamiliar 127\nmutual 127\n2000 126\n28 126\n45 126\nAfrica 126\nDemocrats 126\nLynch 126\nManagement 126\nSun 126\nYet 126\naffairs 126\neffects 126\nfreedom 126\nlawyer 126\nreview 126\nshot 126\nDec. 
125\nSaturday 125\nTheir 125\naddress 125\nanswer 125\ncandidate 125\nchain 125\ncounter 125\nde 125\neasily 125\nhot 125\npersonnel 125\nsafe 125\nsimple 125\nspread 125\nIndustrial 124\nMorgan 124\nPlease 124\nbooks 124\nconfidence 124\ncrash 124\neat 124\nfemale 124\ngiant 124\nindividuals 124\nmostly 124\nprofessional 124\nrecession 124\nround 124\ntrouble 124\nwriting 124\nAnalysts 123\nColumbia 123\nMerrill 123\nMoscow 123\nThank 123\nVOA 123\napply 123\nboost 123\ncarrying 123\nclaim 123\ndealers 123\ndelivery 123\ndesign 123\nentered 123\nevening 123\nmoves 123\nsafety 123\ntowards 123\n1991 122\nCalif. 122\naccounting 122\narrived 122\nassistance 122\nclosing 122\ndecide 122\ndisclosed 122\ndocuments 122\nflight 122\ninfrastructure 122\nmovement 122\npass 122\npositions 122\npublished 122\nquoted 122\nweak 122\n1,000 121\nTime 121\nabortion 121\nclean 121\ndiscovered 121\nhouses 121\nmachines 121\nproviding 121\ntask 121\nInvestors 120\nachieved 120\nbecoming 120\nowners 120\nraising 120\nrequest 120\nshown 120\nweekend 120\n7/8 119\nJ. 
119\nSunday 119\nbalance 119\ncharged 119\nexpenses 119\ninternal 119\nkeeping 119\nkilling 119\nnegative 119\noffers 119\nolder 119\nowner 119\npercent 119\nprime 119\nprovides 119\nputting 119\nsetting 119\nLin 118\nNT$ 118\nOnly 118\nTrust 118\nchip 118\ngovernments 118\nknew 118\npicture 118\nport 118\nsound 118\nstudent 118\nsurvey 118\nthroughout 118\nversion 118\nvoters 118\n10,000 117\nEconomic 117\nLebanon 117\nResearch 117\nYesterday 117\nadding 117\nasset 117\nconfirmed 117\nconsumers 117\ncriminal 117\nelectronic 117\nhospital 117\nregular 117\nrequirements 117\nsoft 117\ntarget 117\ntough 117\nDrexel 116\nJournal 116\nadvantage 116\ndebate 116\nfinished 116\nimproved 116\nmessage 116\nrealize 116\nrelease 116\nscience 116\nvotes 116\nwatch 116\n1/8 115\nIndia 115\nbombing 115\ncold 115\nconsidering 115\nenter 115\nexactly 115\nfields 115\nflat 115\nhotel 115\nitems 115\nminimum 115\nnature 115\npointed 115\npromote 115\ntechnical 115\ntwice 115\nuniversity 115\n27 114\nRep. 114\nYour 114\nfriend 114\ninfluence 114\nlimit 114\nsave 114\nseason 114\nsquare 114\n400 113\nInsurance 113\nIsraelis 113\nRumsfeld 113\nborder 113\nfavor 113\njumped 113\npreferred 113\nreduction 113\nrelevant 113\nstreets 113\ntea 113\ntraded 113\nCurrently 112\nFund 112\nIndustries 112\nPower 112\nSen. 112\nactions 112\nappeared 112\nblock 112\ncollapse 112\nfactors 112\nfiling 112\nfuel 112\nindeed 112\nlisted 112\nmerger 112\nnames 112\npound 112\nregarding 112\nroads 112\nsavings 112\nstated 112\nwait 112\nLaw 111\nOfficials 111\nReserve 111\ndeals 111\neventually 111\nexpectations 111\nextent 111\nfacing 111\nhealthy 111\nindicated 111\nlocated 111\nownership 111\npayment 111\nrefused 111\nremaining 111\nrescue 111\nsharply 111\nB 110\nCBS 110\nJim 110\nJustice 110\nNoriega 110\ndetermined 110\ndiscount 110\nexcept 110\nexpensive 110\nfile 110\nrace 110\nreality 110\nresolution 110\nstories 110\nstrategic 110\nDespite 109\nElectric 109\nMao 109\nSmith 109\nSt. 
109\ncheck 109\nkinds 109\nzone 109\n1985 108\ncontact 108\ndoubt 108\nextra 108\nfeeling 108\nhearing 108\nnotice 108\noption 108\nplayers 108\nquick 108\nreaching 108\nregime 108\nrespect 108\nshowing 108\nsubstantial 108\n1994 107\n2005 107\nAverage 107\nS&P 107\nSharon 107\nThough 107\nTimes 107\nVirginia 107\nW. 107\nbringing 107\nceremony 107\ncollege 107\ncongressional 107\ndegree 107\nguilty 107\nguy 107\nmanaged 107\nnations 107\npage 107\npartners 107\nplaced 107\nstatistics 107\nvideo 107\nwealth 107\nSystems 106\nUSS 106\naffected 106\nauthority 106\nbegun 106\nbus 106\ncreditors 106\neye 106\nfaces 106\nfair 106\nisland 106\nlaunched 106\nspring 106\nstanding 106\nthousand 106\nties 106\n1997 105\nAffairs 105\nAuthority 105\nPeters 105\nServices 105\nborn 105\nbrokers 105\ncomments 105\ncomplex 105\ncondition 105\nensure 105\nfrancs 105\ngames 105\nimport 105\njoined 105\nmountain 105\nthus 105\ntypes 105\nwidely 105\nHugo 104\nInstead 104\nLife 104\nNo. 104\nPLC 104\nadvance 104\nblue 104\ncutting 104\nfoot 104\nideas 104\nlosing 104\nmovie 104\npackage 104\nproducing 104\nprovisions 104\nsouthern 104\nspokeswoman 104\nwage 104\n29 103\nBrothers 103\nGuber 103\nOnce 103\ncard 103\ncompetitors 103\ncreating 103\ndepartments 103\nexercise 103\nimprovement 103\nnice 103\nproposals 103\npurpose 103\nstopped 103\ntotally 103\nvehicles 103\nDay 102\nJohnson 102\nShearson 102\nSony 102\nYugoslavia 102\ncable 102\ncharacter 102\ncorporations 102\ndamaged 102\ndrugs 102\nfailure 102\nfinding 102\njoin 102\njointly 102\nspeak 102\nteams 102\ntransactions 102\nunderstanding 102\nviews 102\nCan 101\nCommunist 101\nHealth 101\nHuang 101\nROC 101\nbodies 101\nbroad 101\ndaughter 101\nengineering 101\nera 101\ninvolving 101\nlocation 101\nmissile 101\nopportunities 101\nprimarily 101\nstability 101\nvoice 101\nyields 101\nAre 100\nEhud 100\nIslamic 100\nReagan 100\nRepublic 100\nSchool 100\nactively 100\ncareer 100\ndescribed 100\nlooked 100\nmaster 100\nordered 
100\npace 100\nparticipate 100\nscene 100\nseeing 100\nseemed 100\nturning 100\nusual 100\nwatching 100\nworse 100\nBrown 99\nEach 99\nHamas 99\nHouston 99\nLike 99\nMotor 99\nPRC 99\ninvesting 99\nmail 99\nmillions 99\nprovision 99\nrally 99\nshareholder 99\ntheory 99\nunable 99\nCNN 98\nDigital 98\nExpress 98\nMarket 98\nMing 98\nNorthern 98\ncarrier 98\ndriving 98\nfilled 98\nflow 98\ninjured 98\nkids 98\nneither 98\norganized 98\nportion 98\npossibility 98\nprofessor 98\nrejected 98\nshop 98\nsigns 98\nstrengthen 98\nterritory 98\ntransfer 98\n2007 97\nAustralian 97\nCharles 97\nGood 97\nJackson 97\nLiu 97\nSingapore 97\nSoong 97\nYes 97\nagricultural 97\narbitrage 97\ncontributed 97\ncore 97\nmanaging 97\nmark 97\nmatch 97\noccurred 97\npartly 97\npush 97\nrepresentatives 97\nresigned 97\nspeaking 97\ntruth 97\nvirtually 97\nBond 96\nIRS 96\nYuan 96\naimed 96\namounts 96\napplication 96\nasking 96\nconvertible 96\ncuts 96\nguys 96\nleveraged 96\npublicly 96\nregulators 96\nsection 96\nsuffered 96\nvolatility 96\nyourself 96\nChristmas 95\nIndeed 95\nKidder 95\nMoody 95\nSEC 95\nairport 95\ncloser 95\nconservative 95\neggs 95\nending 95\nenterprise 95\nequal 95\nfaced 95\nheavily 95\nimpossible 95\nincident 95\nmale 95\nmyself 95\nofficers 95\nproperties 95\nquarterly 95\nstar 95\nsupporters 95\ntable 95\nunchanged 95\nworried 95\nBefore 94\narmy 94\nbranch 94\ncritical 94\ndefinitely 94\nformed 94\nheaded 94\nmeaning 94\nplunge 94\nrapid 94\nrapidly 94\nsailors 94\nserved 94\nwish 94\nworry 94\nCheng 93\nCommerce 93\nMongolia 93\nSuch 93\nThus 93\nYang 93\naccepted 93\ncommission 93\ndelegation 93\nexpansion 93\nfresh 93\nfunding 93\nguarantee 93\njury 93\nordinary 93\nplane 93\nprinciple 93\nreforms 93\nrevised 93\nstuff 93\n1999 92\n65 92\nBaker 92\nCommunications 92\nGeorgia 92\nInvestment 92\nTechnology 92\nagreements 92\nappropriate 92\nbigger 92\nchanging 92\nco-operation 92\ncommitted 92\ndistribution 92\nelectronics 92\nencourage 92\nexchanges 92\ngirl 
92\ngreatly 92\nheads 92\nhelping 92\npurchased 92\nspot 92\ntruck 92\nuses 92\nvillage 92\nwhatever 92\nFBI 91\nSanta 91\nTianjin 91\nYeh 91\nappeal 91\nattended 91\nattract 91\ndecisions 91\neconomist 91\nfairly 91\nmass 91\nmeetings 91\nmetal 91\nnone 91\nphoto 91\npremium 91\nreading 91\nrestrictions 91\nsets 91\nsupported 91\nurban 91\nwire 91\n1.5 90\n33 90\nHsia 90\nIndex 90\naccident 90\nadopted 90\nbroke 90\ncenters 90\nchallenge 90\nchips 90\ncircumstances 90\ndispute 90\nexcellent 90\nexplains 90\nhappening 90\nmedium 90\nmethods 90\nmine 90\nmixed 90\nrecovery 90\nresearchers 90\nsouth 90\nsuggested 90\nsupplies 90\ntouch 90\nvehicle 90\n2.5 89\nIndian 89\nLeague 89\nPoland 89\nThomas 89\nUS$ 89\ncolor 89\ncrimes 89\ndog 89\nemployee 89\ngradually 89\nonto 89\nopposed 89\npictures 89\nrequires 89\nruns 89\nsettled 89\nstudies 89\nsuicide 89\nsupporting 89\nthinks 89\nveto 89\nwonder 89\nAIDS 88\nAM 88\nComputer 88\nGorbachev 88\nHurricane 88\nOver 88\ncommunications 88\ndemocratic 88\ndie 88\nelectricity 88\nfacility 88\nfactory 88\ngirls 88\nheat 88\nimplement 88\nlink 88\nlinks 88\nminority 88\nrepresent 88\nsanctions 88\nscientists 88\ntrader 88\n1998 87\nPutin 87\nQintex 87\nRights 87\nTom 87\nVice 87\nYao 87\nadjusted 87\nagents 87\nconstitutional 87\ndifficulties 87\nenjoy 87\nformal 87\ngross 87\nlearning 87\nmission 87\nnt 87\noffset 87\nputs 87\nran 87\nsea 87\nspeculation 87\nsplit 87\nworst 87\n1996 86\nJewish 86\nJr. 
86\nMark 86\nOr 86\nSocial 86\nToronto 86\nZhang 86\naffect 86\nattached 86\naudience 86\nbottom 86\ncell 86\nclaimed 86\nconsultant 86\ndiscussed 86\nestablishment 86\nexplain 86\nillegal 86\nmanufacturers 86\nmatters 86\nobtain 86\npoliticians 86\npossibly 86\nrealized 86\nresistance 86\nsolution 86\nsummit 86\ntypically 86\nwestern 86\nAbu 85\nCuba 85\nDistrict 85\nEgypt 85\namid 85\ncompetitive 85\ncoupon 85\ncoverage 85\ncovered 85\ndangerous 85\ndiscussions 85\nenormous 85\ngene 85\ngrown 85\nhorse 85\nplayer 85\npromised 85\nsought 85\nspirit 85\nstations 85\ntrack 85\nunusual 85\n44 84\nGrand 84\nPM 84\nShenzhen 84\nSouthern 84\nWill 84\nadvice 84\nalleged 84\nassociated 84\nbusinessmen 84\ndealing 84\nfalling 84\nimmediate 84\njustice 84\nmodels 84\npricing 84\nproducer 84\npurchases 84\nrespectively 84\nthanks 84\nwaste 84\n55 83\nDirector 83\nEnergy 83\nHer 83\nLehman 83\nadministrative 83\nads 83\nallowing 83\narms 83\nbasically 83\nbomb 83\ncoalition 83\ncontrols 83\ncultural 83\ndrink 83\nfun 83\nmassive 83\nmodest 83\nmonthly 83\npatients 83\npresence 83\nprofitable 83\nprove 83\nreplace 83\nrestaurant 83\nshopping 83\nsignificantly 83\ntheme 83\ntransport 83\nvariety 83\nwave 83\nAfrican 82\nEarlier 82\nNissan 82\nRepublicans 82\nSalomon 82\nU.K. 
82\nbeautiful 82\nblame 82\ncausing 82\nclimbed 82\nconcept 82\ncrude 82\nlived 82\nlots 82\nmargins 82\nminor 82\nminute 82\nobvious 82\npension 82\nphase 82\nprior 82\nrepresents 82\nrisks 82\nstatements 82\nsupposed 82\ntherefore 82\nthrift 82\ntrillion 82\nusers 82\nvalued 82\nwarned 82\nData 81\nHome 81\nNasdaq 81\nNet 81\nPeres 81\nPerhaps 81\nPublic 81\nValley 81\nachieve 81\nalternative 81\ncheap 81\nconducted 81\ndifferences 81\nengine 81\nequivalent 81\neverybody 81\nexpanding 81\nfalse 81\nforecast 81\nfraud 81\nmagazines 81\nmap 81\npermanent 81\nrating 81\nrecommend 81\nresume 81\ntend 81\nvisited 81\n1995 80\nII 80\nKing 80\nMedical 80\nSlobodan 80\nStrip 80\nanalysis 80\natmosphere 80\ncorrect 80\nease 80\neconomists 80\nemployment 80\nhappens 80\nhighway 80\nhoping 80\nintended 80\njump 80\nlaunch 80\nlies 80\nmedicine 80\nmissing 80\nnewly 80\nonline 80\noperate 80\npanel 80\npath 80\npredicted 80\nreflect 80\nretirement 80\nrevealed 80\nseriously 80\nshipping 80\nteachers 80\ntrees 80\nAssembly 79\nChien 79\nCross 79\nDongguan 79\nGoogle 79\nIndustry 79\nSea 79\nattributed 79\naware 79\ncited 79\ncoffee 79\ndecades 79\ndeclines 79\nexecution 79\nfallen 79\nfish 79\ngrade 79\nhistorical 79\nreasonable 79\nreflecting 79\nreflects 79\nregard 79\nscandal 79\nstronger 79\nsuggest 79\ntank 79\nvoted 79\nwake 79\nwalk 79\nBob 78\nDinkins 78\nEnron 78\nItaly 78\nQingqing 78\nR. 
78\nStanley 78\nabsolutely 78\nagent 78\narmed 78\ncaught 78\ncommodity 78\nconnection 78\ncontributions 78\ncopy 78\ndeposits 78\ndesire 78\ndialogue 78\nearth 78\nfellow 78\ngap 78\ngreen 78\nhandling 78\nimplemented 78\ninstitutional 78\npar 78\npartnership 78\nreporting 78\nshall 78\nshift 78\nsurface 78\nsurprised 78\ntests 78\n250 77\nAbout 77\nChemical 77\nFrank 77\nMaybe 77\nOil 77\nOrganization 77\nSwiss 77\nbroken 77\nclient 77\nconfident 77\ndeputy 77\nduty 77\nextended 77\nleaves 77\nmeters 77\nmurder 77\nnorth 77\nnumerous 77\nprison 77\nrevenues 77\nsex 77\nshape 77\nsomewhat 77\nstrongly 77\ntalent 77\ntechnological 77\nterrorism 77\nthank 77\nvast 77\nweather 77\nControl 76\nLabor 76\nMarkets 76\nMotors 76\nVietnam 76\nannually 76\nbeauty 76\nbusy 76\ncauses 76\ncharacters 76\ncollection 76\ncombined 76\ncontent 76\ncontrast 76\ndiscussion 76\nfarmers 76\nfollows 76\nguidelines 76\nhousehold 76\ninteresting 76\njudges 76\nlawmakers 76\nmargin 76\nmeant 76\npeaceful 76\nphysical 76\npulled 76\nstable 76\ntech 76\ntrades 76\nwebsite 76\nwelcome 76\n1984 75\n37 75\nEngland 75\nJack 75\nKMT 75\nLatin 75\nLawson 75\nRMB 75\nYasser 75\naside 75\nattitude 75\nbaseball 75\ncabinet 75\ncomprehensive 75\ndemands 75\ndraft 75\nentering 75\nethnic 75\nfeels 75\nfit 75\nhoped 75\nmethod 75\nnorthern 75\npromise 75\npromoting 75\nseats 75\nskin 75\ntrain 75\nwidespread 75\n100,000 74\n2006 74\nAssociates 74\nNext 74\nSpain 74\nTHE 74\nZedong 74\nattend 74\nbeat 74\nbroker 74\ncomfortable 74\ndebentures 74\ndeliver 74\ndelivered 74\neffectively 74\nexplosion 74\nfired 74\ngreatest 74\nnegotiating 74\nparliament 74\npattern 74\nproperly 74\nprosecutors 74\nremained 74\nsensitive 74\nsitting 74\nstruggle 74\nsuddenly 74\nsuggests 74\ntargets 74\ntemporary 74\nwarning 74\n42 73\nAct 73\nAfghanistan 73\nArea 73\nBell 73\nForce 73\nGuangdong 73\nHutton 73\nPark 73\nSource 73\naccord 73\nanimals 73\napart 73\napplications 73\nbaby 73\nblow 73\ndonations 
73\nentertainment 73\nfears 73\nholdings 73\nletters 73\npractices 73\nregulatory 73\nriver 73\ntitle 73\ntotaled 73\n.... 72\n1980s 72\n1993 72\nArabia 72\nLincoln 72\nLu 72\nPalau 72\nPrice 72\nThatcher 72\nappointed 72\nboy 72\ncards 72\nclashes 72\ncolleagues 72\ncriticism 72\neasier 72\nedge 72\nfolks 72\nforms 72\nfrequently 72\nnarrow 72\npending 72\nplenty 72\npowers 72\nreorganization 72\nreserve 72\nresulting 72\nsettle 72\nstands 72\ntrucks 72\nwall 72\n!! 71\n120 71\nAirport 71\nArizona 71\nFoundation 71\nHigh 71\nHill 71\nManhattan 71\nOhio 71\nS. 71\nadmitted 71\nassistant 71\nassume 71\nbehavior 71\nbidding 71\ndelay 71\ndenied 71\ndestruction 71\ndetermine 71\nexperienced 71\nforest 71\nfunded 71\nimportance 71\nmunicipal 71\nought 71\nplastic 71\npollution 71\npreliminary 71\npresented 71\nreducing 71\nruled 71\nsuspect 71\ntextile 71\ntruly 71\nwarm 71\nweakness 71\n1980 70\n51 70\nCDs 70\nE. 70\nGovernment 70\nOTC 70\nRecently 70\nTransportation 70\nargument 70\narrested 70\nauthor 70\nclub 70\nconduct 70\nconsideration 70\ncount 70\ndistance 70\ndozen 70\ndriven 70\nfactories 70\ngovernor 70\nhelps 70\nkilometers 70\nmovies 70\nnecessarily 70\nnormally 70\npark 70\nparticipation 70\npretax 70\npushed 70\nraw 70\nvoting 70\nwinning 70\nBelgrade 69\nC$ 69\nCoast 69\nIranian 69\nLtd 69\nPhilip 69\nSears 69\nShui 69\nWhich 69\naged 69\nappearance 69\nbankers 69\ncast 69\ncells 69\nchemicals 69\nclasses 69\ncollapsed 69\ncoup 69\ncrowd 69\ndealer 69\nelsewhere 69\nentry 69\nfaster 69\nfewer 69\ngoals 69\nguess 69\nindicate 69\ninvestigators 69\nliability 69\nmaturity 69\np.m. 
69\npages 69\nphotos 69\npriority 69\nradio 69\nrepresentative 69\nrice 69\nseat 69\nsees 69\nsegment 69\nspoke 69\nstood 69\ntelling 69\nturns 69\n32 68\n38 68\nLloyd 68\nSyria 68\nZealand 68\nacquisitions 68\nbids 68\nbirth 68\nconsumption 68\ncounties 68\ndeaths 68\ndeeply 68\ndraw 68\nestablishing 68\nexist 68\nfacts 68\nfashion 68\nhonor 68\nimproving 68\nindependence 68\nmerely 68\nnewspapers 68\nreaders 68\nrecalls 68\nreligious 68\nsolid 68\nsuspended 68\ntells 68\ntrends 68\ntroubled 68\nweight 68\n1982 67\n48 67\nAlan 67\nBin 67\nCIA 67\nChapter 67\nL. 67\nLittle 67\nSteel 67\nThird 67\nacts 67\nattracted 67\nexpanded 67\nfill 67\nflag 67\nhired 67\nluxury 67\nmayor 67\nperfect 67\npetroleum 67\nplays 67\npublishing 67\nreaction 67\nrefugees 67\nregistered 67\nresolve 67\nsick 67\nsubordinated 67\nsuffering 67\nultimately 67\nurged 67\nwriter 67\nyouth 67\nExxon 66\nH. 66\nHollywood 66\nHuman 66\nMorris 66\nanimal 66\nargue 66\narticles 66\nbecomes 66\nbridge 66\nconventional 66\ndepends 66\neducational 66\nexpense 66\nforget 66\njoining 66\nlaid 66\nmanage 66\nmemory 66\nmonetary 66\nmortgages 66\nmultiple 66\noriginally 66\npoll 66\nreflected 66\nrepair 66\nrequested 66\nride 66\nroughly 66\nscreen 66\nsending 66\ntalked 66\ntape 66\ntesting 66\ntied 66\ntreated 66\nuh 66\nunique 66\nwindow 66\n1.2 65\n1.6 65\n1st 65\nBesides 65\nDallas 65\nEarth 65\nLebanese 65\nPanama 65\nParibas 65\nTherefore 65\nabuse 65\nafford 65\nafraid 65\napparent 65\ncatch 65\ncriticized 65\ndisease 65\neliminate 65\nflights 65\ngolden 65\nmembership 65\nmissiles 65\npence 65\npilot 65\nprincipal 65\nquarters 65\nrain 65\nrare 65\nrecords 65\nreplaced 65\nsuccessively 65\nuncertainty 65\nunlikely 65\nweekly 65\n1.8 64\n36 64\nBBC 64\nCD 64\nEvery 64\nFinally 64\nGE 64\nGas 64\nGoldman 64\nPhillips 64\nSoviets 64\nSpanish 64\nTanshui 64\nbox 64\nbroadcast 64\ncarefully 64\nchannels 64\nconcerning 64\nconcluded 64\ncopper 64\ncrew 64\ndefendants 64\ndevices 64\ndisclose 
64\nexpert 64\nexplained 64\nextraordinary 64\nfriendship 64\ngotten 64\nhostile 64\ninvited 64\nmention 64\nmilk 64\notherwise 64\nprefer 64\nproceeds 64\nproper 64\nremove 64\nrevolution 64\nsevere 64\nshops 64\nsit 64\nsteady 64\nstrategies 64\nsubmitted 64\nsurplus 64\nsurprise 64\nvisitors 64\nyounger 64\nAden 63\nAny 63\nCampeau 63\nEntertainment 63\nExecutive 63\nHave 63\nInformation 63\nJersey 63\nKingdom 63\nLIN 63\nM. 63\nPaineWebber 63\nSerbia 63\naggressive 63\nallies 63\nbian 63\nboat 63\nbranches 63\ncoast 63\ncommitment 63\ncompensation 63\ncompromise 63\ndeadline 63\ndrama 63\neditor 63\nglass 63\ngrand 63\nhair 63\nhelpful 63\nitem 63\nlegislative 63\nmanufacturer 63\nmeanwhile 63\nobviously 63\nperspective 63\nplunged 63\npoverty 63\nratings 63\nrural 63\nsized 63\nslowdown 63\nstraight 63\nsubstantially 63\nsufficient 63\ntight 63\nventures 63\nworker 63\n1970s 62\n5,000 62\nB. 62\nC. 62\nEl 62\nFox 62\nGolden 62\nIraqis 62\nJoseph 62\nManville 62\nPrince 62\nScience 62\nTelevision 62\na.m. 
62\nacting 62\nah 62\nargued 62\nattacked 62\nbrands 62\ncircles 62\nconcrete 62\ndanger 62\ndisplay 62\ndividends 62\nengaged 62\nfactor 62\ninstruments 62\nleads 62\nlimits 62\nmaintenance 62\nobtained 62\noccupation 62\noccur 62\nproceedings 62\nprospects 62\nrepresenting 62\nresponded 62\nseal 62\nsecondary 62\nsecret 62\nships 62\nsites 62\nstarts 62\nstruck 62\nstructural 62\nsucceed 62\nsuits 62\nsun 62\ntender 62\nunemployment 62\nwatched 62\nwinter 62\n20,000 61\n43 61\nChi 61\nCredit 61\nDoes 61\nMike 61\nQingdao 61\nScott 61\nSichuan 61\nacademic 61\nanywhere 61\nbrief 61\nchannel 61\ncongestion 61\nconsulting 61\ncorruption 61\ndark 61\neating 61\nfaith 61\nfounder 61\nhardly 61\ninstitution 61\nlegislature 61\nlending 61\nmetals 61\npassing 61\nprinciples 61\nrecovered 61\nrepeatedly 61\nrepresented 61\nreputation 61\nsells 61\nshut 61\nsilver 61\nskills 61\nsleep 61\nstress 61\nsudden 61\ntiny 61\ntransition 61\nwind 61\n41 60\nAtlanta 60\nFinance 60\nHoldings 60\nJordan 60\nLouis 60\nNavigation 60\nPeng 60\nPetroleum 60\nSavings 60\nShandong 60\nStar 60\nannounce 60\nanswers 60\napplied 60\nbarrels 60\nbeer 60\nclothing 60\ncontroversial 60\nconvicted 60\ncourts 60\ncovering 60\ndevelopments 60\ndoctor 60\ndramatic 60\ndrinking 60\nelderly 60\nexcess 60\nflying 60\nfoundation 60\ngrain 60\nguide 60\nimagine 60\ninformed 60\ninitially 60\nlisten 60\nmaintained 60\nnaturally 60\nofficially 60\nparticipants 60\nplanes 60\nprocedures 60\npursue 60\nranks 60\nrent 60\nshed 60\nstorage 60\nteacher 60\ntourist 60\nutility 60\nvision 60\nweb 60\nBroadcasting 59\nMiami 59\nNASA 59\nOlympic 59\nSeparately 59\nSports 59\nStephen 59\nTraders 59\nTurkey 59\nVojislav 59\napartment 59\naspects 59\nawarded 59\nbackground 59\nbegins 59\ncompatriots 59\nconsiderable 59\ncypress 59\ndeclining 59\nefficient 59\nenemy 59\nentirely 59\nforth 59\nguarantees 59\nhost 59\nlinked 59\nlunch 59\noperates 59\noutcome 59\nparticipating 59\nprepare 59\npull 59\nreceiving 
59\nrecorded 59\nresulted 59\nrooms 59\nslipped 59\nsovereignty 59\nspecifically 59\nticket 59\ntoll 59\ntour 59\ntr. 59\n2004 58\n39 58\n49 58\nAT&T 58\nCosta 58\nDaily 58\nJiang 58\nLien 58\nMae 58\nMass. 58\nMixte 58\nMuslim 58\nPali 58\nRoss 58\nSachs 58\nSection 58\nStandard 58\nabandoned 58\naccounted 58\naim 58\nbed 58\ndefend 58\ndetailed 58\nfans 58\nfeatures 58\nfeed 58\nimported 58\nliquidity 58\nministry 58\npain 58\npatent 58\npatient 58\npersonally 58\npicked 58\npolls 58\npredict 58\npremier 58\nprivately 58\nquiet 58\nrecommended 58\nrelatives 58\nschedule 58\nscore 58\nserving 58\nstudying 58\nsuccessfully 58\nsuppose 58\nteaching 58\nterrorists 58\nthreatened 58\ntopic 58\ntourism 58\n1.3 57\n2001 57\nAG 57\nBlack 57\nCommunity 57\nCorporation 57\nKoreans 57\nKrenz 57\nLaden 57\nMuslims 57\nPoor 57\nSir 57\nSotheby 57\nUntil 57\nWithout 57\nacquiring 57\nadviser 57\nalliance 57\nanyway 57\nassembly 57\nassist 57\nbadly 57\nbehalf 57\nbuyer 57\ncomparable 57\ncompetitor 57\ncontribution 57\ncritics 57\ndestroyed 57\ndrives 57\nelectric 57\nembassy 57\nfarm 57\nforeigners 57\nice 57\nintense 57\nload 57\nprovincial 57\nrelationships 57\nreturning 57\nselected 57\nsky 57\nsolve 57\nstrengthening 57\ntechnologies 57\ntemporarily 57\ntens 57\ntool 57\ntreat 57\nwear 57\n13th 56\n1983 56\n3,000 56\n34 56\n46 56\nGao 56\nHall 56\nIreland 56\nItalian 56\nManufacturers 56\nYoung 56\nallows 56\nassociation 56\nban 56\nbarely 56\nbear 56\nbillions 56\nbrings 56\ncommunities 56\ncounty 56\ncross 56\ndamages 56\ndifficulty 56\ndinner 56\ndisk 56\neditorial 56\nemail 56\nessential 56\neverywhere 56\nexclusive 56\nfishing 56\nfocusing 56\nimplementation 56\nmarried 56\npainting 56\npreparing 56\npushing 56\nratio 56\nrecognize 56\nresignation 56\nresolved 56\nrestaurants 56\nrival 56\nseconds 56\nsecure 56\nslowing 56\nsnake 56\ntanks 56\nupper 56\nwearing 56\n200,000 55\n800 55\nC 55\nChristian 55\nEducation 55\nFood 55\nLake 55\nLong 55\nMCA 55\nMiller 
55\nN.J. 55\nOakland 55\nParliament 55\nReport 55\nResources 55\nRight 55\nSecond 55\nUK 55\nUSA 55\napproximately 55\nattorneys 55\nautomotive 55\naward 55\nbrain 55\ncompete 55\nconsensus 55\ncorporation 55\ncorrespondent 55\ncreative 55\ncycle 55\ndivided 55\nefficiency 55\nexposure 55\nextreme 55\nfavorite 55\nholiday 55\nindicates 55\nlitigation 55\nmeat 55\nneighborhood 55\nnovel 55\noptimistic 55\noutlook 55\npieces 55\nposts 55\nprotest 55\nrated 55\nrely 55\nsomebody 55\nsounds 55\nspirits 55\nstressed 55\nsuspected 55\nswept 55\ntaxpayers 55\nterror 55\nutilized 55\n1.1 54\n52 54\n600 54\nAtlantic 54\nBen 54\nBoeing 54\nChan 54\nCiticorp 54\nCollege 54\nContinental 54\nFour 54\nHolding 54\nMajor 54\nOPEC 54\nSCI 54\nSays 54\nSystem 54\nTsai 54\nWalter 54\narm 54\nassessment 54\nattractive 54\nchains 54\nconfirm 54\nconsecutive 54\nempty 54\nextend 54\nfate 54\nfocused 54\ngolf 54\nidentify 54\nimages 54\ninsist 54\nintegrated 54\nintends 54\nmarriage 54\npeak 54\npurchasing 54\nrelative 54\nrespond 54\nsentence 54\nslide 54\nsomewhere 54\nspecialty 54\nstorm 54\nsuffer 54\nsurge 54\nton 54\n3.5 53\nConference 53\nEdward 53\nF. 53\nGo 53\nIntel 53\nLambert 53\nName 53\nRadio 53\nachievements 53\namendment 53\nappeals 53\nassassination 53\nchurch 53\nclimate 53\ncoal 53\ncommitments 53\ncrucial 53\ndry 53\neast 53\nfounded 53\ngathering 53\nhanded 53\nherself 53\nimplementing 53\ninsurers 53\nliterature 53\nluck 53\nmomentum 53\nmonitor 53\nmoral 53\nnegotiators 53\nopponents 53\novernight 53\npharmaceutical 53\npromotion 53\nproved 53\npurposes 53\nresidential 53\nshared 53\nsoared 53\nspill 53\nsteadily 53\nsubmarine 53\nsurged 53\ntelecommunications 53\ntiming 53\ntools 53\nunity 53\nuniversities 53\nvolatile 53\nzero 53\n99 52\nB.A.T 52\nBerlin 52\nChief 52\nGOP 52\nGreen 52\nHospital 52\nIsland 52\nJan. 
52\nKennedy 52\nNobel 52\nOld 52\nPress 52\nSpace 52\nStewart 52\nZhou 52\nadvertisers 52\nbrothers 52\ncode 52\ncomparison 52\nconsequences 52\nconstantly 52\ndeclare 52\ndogs 52\nedged 52\nenforcement 52\nexceeded 52\ngun 52\nharder 52\ninsured 52\njudicial 52\nlay 52\nlocations 52\nmaintaining 52\nmining 52\nmoderate 52\nofferings 52\npact 52\nprofessionals 52\nprovinces 52\npump 52\nrecognized 52\nrecover 52\nroll 52\nrumors 52\nshoes 52\nsister 52\nspite 52\nsucceeds 52\nsurgery 52\ntransferred 52\ntwenty 52\nviewers 52\nwithdrawal 52\n* 51\n2003 51\n21st 51\n31st 51\n4.5 51\n85 51\nBeach 51\nBear 51\nChrysler 51\nMcCaw 51\nMitsubishi 51\nMuseum 51\nPennsylvania 51\nQatar 51\nSwitzerland 51\nUnfortunately 51\nalive 51\nallegations 51\nancient 51\nangry 51\nbases 51\nbet 51\nblacks 51\nblast 51\nboosted 51\nbreaking 51\nburied 51\ncareful 51\ncarries 51\ncat 51\nchallenges 51\ncontroversy 51\ncorn 51\ndefinition 51\ndiscussing 51\ndivisions 51\nelements 51\nemphasis 51\nencouraged 51\nevil 51\nexcessive 51\nexistence 51\nfavorable 51\nfee 51\nformally 51\nformerly 51\nheading 51\nhole 51\nimposed 51\ninitiative 51\ninvestigations 51\njournalists 51\nlawsuit 51\nmaximum 51\nminds 51\nobligation 51\nounce 51\nprotected 51\nsenators 51\nspecialists 51\nstructures 51\nsurrounding 51\ntremendous 51\nunderwriters 51\nwood 51\nwounded 51\n50,000 50\n72 50\nAug. 
50\nBridge 50\nBrooks 50\nBurnham 50\nFrankfurt 50\nGuangzhou 50\nLater 50\nMortgage 50\nSeveral 50\nYemeni 50\nadvisers 50\nappreciation 50\nband 50\ncapable 50\ncent 50\ncitizen 50\nconclusion 50\nconnected 50\ncostly 50\ncouncil 50\ncross-strait 50\ndefeat 50\ndistricts 50\ndoctors 50\nedition 50\nemotional 50\nfail 50\ngasoline 50\nharm 50\nhumans 50\nlie 50\nlucky 50\nmerchandise 50\nmonitoring 50\nobservers 50\nproportion 50\nprosperity 50\nrestore 50\nretailers 50\nretain 50\nrush 50\nshooting 50\nsong 50\nsurprising 50\nthin 50\nthreats 50\ntraveling 50\ntroubles 50\ntypical 50\nwider 50\nwrites 50\nyes 50\n58 49\n66 49\nCustoms 49\nDonald 49\nEveryone 49\nHewlett 49\nHu 49\nMitchell 49\nMountain 49\nNicaragua 49\nPartners 49\nPittsburgh 49\nSoutheast 49\nTelephone 49\nUSAir 49\nUSX 49\nYugoslav 49\nadopt 49\nairlines 49\nambassador 49\nattending 49\nbureau 49\ncarriers 49\ncoastal 49\nconsultants 49\ndisclosure 49\ndistributed 49\nends 49\nexceed 49\nfinish 49\ngarden 49\ngenes 49\nguard 49\nidentified 49\ninner 49\nintroduce 49\nlift 49\nmerchants 49\nnetworks 49\noperator 49\nreligion 49\nrepeated 49\nslowly 49\nsmooth 49\nsongs 49\nspecies 49\nstamps 49\nstrikes 49\nsurvive 49\nthrifts 49\ntourists 49\ntragedy 49\nunprecedented 49\nvaluable 49\nvisiting 49\nweapon 49\n12.5 48\nAli 48\nArabs 48\nBack 48\nDid 48\nEgyptian 48\nHUD 48\nHan 48\nHillary 48\nMary 48\nMicrosoft 48\nMunicipal 48\nPakistan 48\nPan 48\nPhilippines 48\nPolice 48\nQaeda 48\nRather 48\nShiite 48\nTang 48\nWolf 48\naccordance 48\nactivists 48\nagriculture 48\nanticipated 48\nassociate 48\nattracting 48\nbenchmark 48\nboss 48\ncategory 48\nciting 48\nclothes 48\ncombat 48\ncombination 48\nconvinced 48\ndeposit 48\nearn 48\nencouraging 48\nerror 48\nessentially 48\nfifth 48\nfunction 48\nhalt 48\nlicense 48\nmerchant 48\noppose 48\npipeline 48\nplug 48\nproof 48\nrequiring 48\nresponsibilities 48\nrivals 48\nscholars 48\nstick 48\nstudio 48\nswings 48\nthrowing 48\ntracks 48\ntradition 
48\ntriggered 48\ntripped 48\nunlike 48\nuser 48\nvirus 48\nwisdom 48\n180 47\n47 47\n500,000 47\nAbather 47\nAlexander 47\nBBS 47\nBosnia 47\nCarolina 47\nCastro 47\nFar 47\nFestival 47\nJeff 47\nK 47\nKaohsiung 47\nMGM 47\nMartin 47\nPackard 47\nPrudential 47\nRoad 47\nRoyal 47\nSpecial 47\nVolume 47\nWithin 47\nWould 47\nanybody 47\narrest 47\nbelief 47\nchose 47\ncommunication 47\ncomplaint 47\ncustoms 47\ndelays 47\nenjoyed 47\nexploration 47\nexposed 47\nexpress 47\nexpression 47\nguaranteed 47\nhelicopter 47\nintellectual 47\ninvolvement 47\nlegislators 47\nnobody 47\nperformed 47\npractical 47\npromises 47\nranking 47\nrecall 47\nresort 47\nretired 47\nroute 47\nsectors 47\nsignal 47\nsigning 47\nsituations 47\nstayed 47\nstuck 47\nsubsidies 47\nsum 47\ntasks 47\nthirds 47\nthreatening 47\ntrained 47\ntrim 47\nturmoil 47\nwarrants 47\n1.7 46\n1990s 46\n2,000 46\nChongqing 46\nDean 46\nDouglas 46\nF 46\nFannie 46\nFe 46\nGovernor 46\nHousing 46\nIrish 46\nLa 46\nMilan 46\nMilitary 46\nMuch 46\nNOT 46\nPinglin 46\nSaatchi 46\nSino 46\nUp 46\nWhether 46\nappointment 46\napprove 46\narguments 46\nbeneficial 46\nburden 46\ncadres 46\ncease 46\nceiling 46\nconcentrate 46\nconflicts 46\nconspiracy 46\ndelayed 46\ndescribes 46\ndrawn 46\nelectrical 46\nemerged 46\nemphasized 46\nfinds 46\nfootball 46\nforever 46\nfreight 46\nfunctions 46\ngrant 46\nhappiness 46\ninvolve 46\nlowered 46\nloyal 46\nmarked 46\nmouth 46\nnative 46\nopposite 46\nourselves 46\npapers 46\nphenomenon 46\npop 46\npounds 46\npresidency 46\nreaches 46\nremoved 46\nscope 46\nsignificance 46\nsocialist 46\nstolen 46\nsuggestions 46\ntargeted 46\nthrow 46\ntorch 46\nunderlying 46\nunderstood 46\nvisits 46\nwars 46\nwaters 46\nwholesale 46\nwing 46\nzones 46\n57 45\n900 45\nAMR 45\nAND 45\nActually 45\nBrazil 45\nCheney 45\nChung 45\nCultural 45\nDad 45\nDrug 45\nExport 45\nGlobal 45\nHaving 45\nKuwait 45\nLang 45\nMachines 45\nMarshall 45\nMichigan 45\nOK 45\nP&G 45\nPost 45\nRJR 45\nS.A. 
45\nSunni 45\nSyrian 45\nT. 45\nacknowledged 45\naffiliate 45\nalternatives 45\nballots 45\nblamed 45\nboom 45\nbright 45\ncautious 45\ncertificates 45\nchances 45\ncommand 45\ncomplaints 45\ncool 45\ndefensive 45\ndemanding 45\nelect 45\nexperiments 45\nexplanation 45\nfat 45\nforecasts 45\nfrequent 45\ngenerate 45\nguns 45\nhumanitarian 45\nincentives 45\nintent 45\nlatter 45\nmachinery 45\nmanner 45\nmeter 45\nmistake 45\nneck 45\nperforming 45\npoison 45\nprecious 45\nranging 45\nretailing 45\nrisen 45\nsaving 45\nseeks 45\nspecialist 45\nsucceeded 45\ntaste 45\ntechniques 45\ntext 45\nuseful 45\nwest 45\nwithdraw 45\n1.4 44\n101 44\n1981 44\n56 44\nAmbassador 44\nClub 44\nDPP 44\nDutch 44\nEC 44\nFutures 44\nJose 44\nLawrence 44\nLook 44\nLooking 44\nMac 44\nN.Y. 44\nNevertheless 44\nSam 44\nStearns 44\nZone 44\nallegedly 44\narrangement 44\nattempts 44\ncapability 44\ncharity 44\nchoices 44\ncomplained 44\ncontacts 44\ndestroy 44\ndoors 44\ndozens 44\ndream 44\ndropping 44\nduties 44\nemployed 44\nexecuted 44\nfiber 44\nfibers 44\nfilms 44\nfranchise 44\nfundamental 44\ngenetic 44\ngraduate 44\ngranted 44\nhearings 44\nhospitals 44\ninjuries 44\nintroduction 44\ninvitation 44\nlawsuits 44\nlegitimate 44\nlenders 44\nmovements 44\nnearby 44\nneighbors 44\npassengers 44\npleased 44\npride 44\nquantity 44\nsatellite 44\nsellers 44\nseparately 44\nsheet 44\nshock 44\nsponsored 44\nstars 44\nstrange 44\nsurvived 44\nthoughts 44\ntone 44\nvs. 44\nwonderful 44\n!!! 43\n110 43\n350 43\n67 43\n700 43\n95 43\n98 43\nApple 43\nArt 43\nBanco 43\nCohen 43\nConn. 
43\nDealers 43\nEquipment 43\nGuard 43\nJoe 43\nLoan 43\nMoney 43\nNeither 43\nOthers 43\nRegion 43\nRiyadh 43\nRoman 43\nSociety 43\nThompson 43\nWilliams 43\nYu 43\nabsence 43\naffair 43\naspect 43\nattempting 43\nauthorized 43\nbike 43\nbirthday 43\ncapitalize 43\nchicken 43\ncommodities 43\ncontracted 43\ncounsel 43\ncredibility 43\ndeadly 43\ndemanded 43\ndestroyer 43\ndisappointed 43\ndowntown 43\ndramas 43\ndress 43\ndriver 43\negg 43\nescape 43\nexperiment 43\nexternal 43\nfold 43\nguests 43\nideal 43\ninch 43\ninternet 43\ninvestigating 43\njudgment 43\nkid 43\nlikes 43\nlists 43\nnegotiated 43\nopinions 43\novercome 43\npaintings 43\nparliamentary 43\npicking 43\nprogramming 43\nprominent 43\npublisher 43\nreferring 43\nregistration 43\nrelay 43\nreply 43\nsat 43\nshipments 43\nsight 43\nstepped 43\nsustained 43\nswitch 43\ntestimony 43\ntobacco 43\ntransit 43\nwins 43\nwitnesses 43\n11/16 42\n1974 42\n54 42\n62 42\nAllen 42\nAnnan 42\nBrady 42\nCCP 42\nCEO 42\nChing 42\nChristopher 42\nCuban 42\nCup 42\nFive 42\nGary 42\nGreenspan 42\nHezbollah 42\nHsueh 42\nIntegrated 42\nKim 42\nNATO 42\nNekoosa 42\nPhiladelphia 42\nPortuguese 42\nSara 42\nTake 42\nTennessee 42\nTurner 42\nadjustments 42\nadvantages 42\namazing 42\nappreciate 42\nautomobile 42\nbacking 42\nbar 42\nbarrel 42\nchecks 42\nchosen 42\ncircle 42\ncivilians 42\ncleared 42\ncollect 42\ncommissions 42\ncomplain 42\ncomplicated 42\nconstitution 42\ncontends 42\ncontinuously 42\ncopies 42\ncounted 42\ndescribe 42\ndeveloper 42\ndevice 42\ndirected 42\nemployers 42\nfortune 42\nhandled 42\nhate 42\nhidden 42\nidentity 42\nknowing 42\nloyalty 42\nmechanism 42\nmetric 42\nmile 42\npassenger 42\npenalty 42\npopularity 42\npotentially 42\npressures 42\nprompted 42\npsychological 42\nrebels 42\nreference 42\nregards 42\nreject 42\nreluctant 42\nrock 42\nsharing 42\nsmoke 42\nsons 42\nsteep 42\nsurely 42\ntechnique 42\ntension 42\ntire 42\ntumbled 42\nupdate 42\nvulnerable 42\nwild 42\nworldwide 
42\n2002 41\n23rd 41\nCairo 41\nClark 41\nDavis 41\nDelta 41\nEU 41\nFollowing 41\nIS 41\nIntelligence 41\nMassachusetts 41\nMississippi 41\nOgilvy 41\nPope 41\nSteve 41\nanger 41\nasks 41\nbailout 41\nball 41\nbiotechnology 41\ncargo 41\ncheaper 41\ncirculation 41\nclassic 41\ncompeting 41\ncontext 41\ncontrolling 41\ncorner 41\ncreation 41\ncurrencies 41\ndepending 41\ndevelopers 41\ndisappointing 41\ndisasters 41\ndownturn 41\nearthquakes 41\nexhibition 41\nexplore 41\nfought 41\nhistoric 41\nill 41\nindicating 41\nintention 41\nking 41\nmountains 41\noccasion 41\noperated 41\nparticipated 41\nplanet 41\nprospect 41\npunishment 41\nraises 41\nreportedly 41\nshortly 41\nsolutions 41\nsubmit 41\nsupports 41\nteach 41\ntotaling 41\ntriple 41\nverdict 41\nwalking 41\n0.2 40\n20th 40\nAbbas 40\nBrian 40\nCompaq 40\nDaniel 40\nDave 40\nFreddie 40\nHsu 40\nHua 40\nIllinois 40\nLas 40\nMcDonald 40\nNYSE 40\nOh 40\nProgram 40\nRevolution 40\nSciences 40\nThrough 40\nTibet 40\nTony 40\nXiamen 40\narrive 40\nbars 40\nboys 40\nbroadcasting 40\nbulk 40\ncamera 40\nclaiming 40\ncounts 40\ndefault 40\ndiseases 40\neconomics 40\nengineers 40\nengines 40\nexpecting 40\nexperiences 40\nfeelings 40\nfolk 40\ngathered 40\ngenerations 40\nhandful 40\nimprovements 40\ninspection 40\nleg 40\nlength 40\nlistening 40\nministers 40\nmistakes 40\nmotion 40\nnervous 40\nnoticed 40\nobstacles 40\nongoing 40\noperational 40\noperators 40\npair 40\nperform 40\npool 40\npreserve 40\nproductivity 40\nrarely 40\nregularly 40\nremarks 40\nrenewed 40\nrid 40\nsemiconductor 40\nsexual 40\nsoccer 40\nsoul 40\nsport 40\nspots 40\nsugar 40\ntheater 40\ntill 40\nuncertain 40\nurging 40\nutilities 40\nviolent 40\nwalls 40\nwedding 40\nwise 40\n53 39\n69 39\nBest 39\nBreeden 39\nCapitol 39\nCare 39\nCleveland 39\nD.C. 39\nFDA 39\nGames 39\nGen. 
39\nHampshire 39\nIT 39\nMacao 39\nMan 39\nNPR 39\nOtherwise 39\nPeace 39\nPeipu 39\nPhoenix 39\nStraits 39\nTaylor 39\nTelerate 39\nTerms 39\nThailand 39\nUrban 39\nWater 39\nWells 39\nWorkers 39\nadjustment 39\nanniversary 39\nartists 39\naudit 39\nborrowed 39\ncalligraphy 39\ncanceled 39\ncelebration 39\ncharging 39\ncharter 39\nclimb 39\ncollective 39\nconfrontation 39\ncontain 39\ncontains 39\ncountryside 39\ndeck 39\ndegrees 39\ndeny 39\ndesk 39\ndetermination 39\ndiscipline 39\ndonated 39\ndoubled 39\ndreams 39\ndrivers 39\neconomies 39\nemerging 39\nformula 39\ngenerated 39\ngoverning 39\nhonest 39\nimpression 39\nindication 39\ninnovation 39\ninterviews 39\nintroducing 39\ninvented 39\nleague 39\nloved 39\nmature 39\nmeal 39\nnamely 39\nnose 39\nodd 39\nparking 39\npayable 39\nperfectly 39\npersons 39\npottery 39\nproduces 39\nquietly 39\nradical 39\nregulation 39\nrequests 39\nretailer 39\nretire 39\nsheng 39\nsnakes 39\nsophisticated 39\nstationed 39\nstone 39\nstrengthened 39\nstriking 39\nstudied 39\nsubjects 39\nsubsidiaries 39\nsuccessor 39\nterritories 39\nthereby 39\ntouched 39\nultimate 39\nvet 39\nweaker 39\nwheel 39\nwhenever 39\nwinner 39\nwriters 39\n130 38\n15,000 38\n1979 38\n2.2 38\n89 38\nAgriculture 38\nAirways 38\nChin 38\nChiu 38\nContra 38\nExperts 38\nGold 38\nHarris 38\nHoward 38\nHungary 38\nInterior 38\nKay 38\nLand 38\nMaryland 38\nMobil 38\nNetherlands 38\nOlympics 38\nOrange 38\nPaper 38\nPeterson 38\nPlant 38\nProfessor 38\nReal 38\nShi 38\nSteven 38\nToyota 38\naide 38\nalert 38\nargues 38\nassumed 38\nbilateral 38\nboards 38\ncollecting 38\nconstant 38\ncontribute 38\ncrop 38\ndefined 38\ndepend 38\ndiplomacy 38\ndisputes 38\ndrawing 38\neager 38\neastern 38\nelaborate 38\nenthusiasm 38\nexpenditures 38\nfailing 38\nfled 38\nforum 38\ngrounds 38\nhanging 38\nhitting 38\nignored 38\nincidents 38\ninsider 38\niron 38\nlanding 38\nmeantime 38\nmix 38\nnegotiation 38\nneighbor 38\npartial 38\npet 38\nplot 38\nqualified 
38\nradiation 38\nreliable 38\nreplacement 38\nreward 38\nsand 38\nslight 38\nsoil 38\nsupervision 38\nsymbol 38\ntactics 38\ntall 38\nthrew 38\nvictim 38\nviolated 38\nwages 38\nwealthy 38\nyellow 38\n140 37\n2.4 37\n2009 37\n750 37\n:-RRB- 37\nAcademy 37\nBanking 37\nBellSouth 37\nBlair 37\nCincinnati 37\nDiego 37\nEagle 37\nFujian 37\nHenry 37\nHunt 37\nImport 37\nInterstate 37\nKo 37\nKursk 37\nLegislative 37\nMCI 37\nMaxwell 37\nMissouri 37\nNewport 37\nPrize 37\nProducts 37\nPudong 37\nRica 37\nSee 37\nWen 37\nadjust 37\nadmit 37\nadmits 37\napproached 37\nbanker 37\nbanned 37\nbreaks 37\nbuses 37\ncharacteristics 37\ncompletion 37\ncontemporary 37\nconviction 37\ncovers 37\ncredits 37\ndecrease 37\ndoubts 37\ndrum 37\nenable 37\nexamination 37\nfeature 37\nfiles 37\ngraphics 37\nhide 37\nhiring 37\ninsisted 37\nliterary 37\nlobbying 37\nlogic 37\nmere 37\nnegotiate 37\norganizing 37\npainful 37\npays 37\nposting 37\npresentations 37\nprevention 37\nprocedure 37\nprofile 37\nradar 37\nrake 37\nrealistic 37\nrefinery 37\nrejection 37\nremarkable 37\nrestricted 37\nrisky 37\nroot 37\nselect 37\nshortage 37\nshuttle 37\nsixth 37\nstrict 37\ntaught 37\ntickets 37\nturnover 37\nveteran 37\nwalked 37\n103 36\n190 36\n6.5 36\n63 36\n68 36\nA$ 36\nCalif 36\nChancellor 36\nChannel 36\nClass 36\nConstruction 36\nDetroit 36\nEnterprises 36\nFEDERAL 36\nFilm 36\nFree 36\nGDP 36\nGNP 36\nHarry 36\nHonecker 36\nJan 36\nMa 36\nMayor 36\nMessrs. 36\nMt. 
36\nNever 36\nNixon 36\nNuclear 36\nPremier 36\nRay 36\nRobinson 36\nSadr 36\nSerbian 36\nShantou 36\nSmall 36\nSometimes 36\nStores 36\nTaliban 36\nTownship 36\nTrump 36\nUnlike 36\nadults 36\nadvised 36\naerospace 36\naffecting 36\nartist 36\nartistic 36\nbasketball 36\nbattles 36\nblog 36\nbound 36\nburning 36\ncalculated 36\ncap 36\ncaptured 36\ncolors 36\ncompare 36\ncontinually 36\nconvenience 36\ncosmetics 36\ncurb 36\ndance 36\ndemonstration 36\ndominated 36\ndownward 36\ndrove 36\ne-mail 36\neighth 36\nelement 36\neliminated 36\nenemies 36\nenhance 36\nexception 36\nexcluding 36\nexhibit 36\nfindings 36\nfleet 36\nflexibility 36\nflowers 36\nfruit 36\nhire 36\ninquiry 36\nintegrity 36\nintensive 36\nintervention 36\ninvolves 36\nletting 36\nlights 36\nlying 36\nmainframe 36\nmergers 36\nmoments 36\nnewsletter 36\nnoting 36\npermit 36\npersonality 36\npledged 36\npork 36\npreferential 36\npreparation 36\nprofitability 36\npursuing 36\nrape 36\nrevolutionary 36\nroutes 36\nsaved 36\nseed 36\nsentiment 36\nserves 36\nspell 36\nsuitable 36\nsurvival 36\nthirty 36\nunderground 36\nunknown 36\nviewed 36\nwelfare 36\n2008 35\n30,000 35\n71 35\n88 35\nAm 35\nAndrew 35\nDan 35\nDennis 35\nDick 35\nEverything 35\nField 35\nFurthermore 35\nGiven 35\nHanover 35\nHonda 35\nJiangxi 35\nKKR 35\nLTV 35\nMalaysia 35\nMoon 35\nPeabody 35\nPowell 35\nPrices 35\nSaab 35\nSenator 35\nShihkang 35\nStone 35\nStrait 35\nWCRS 35\nWebster 35\nWenchuan 35\nWestinghouse 35\nadequate 35\nanymore 35\nassumption 35\nautomatically 35\nbacteria 35\nboats 35\ncasino 35\ncaution 35\nclay 35\ncolumn 35\ncontest 35\ncontrary 35\nconvert 35\ncooperative 35\ncourtesy 35\ncriminals 35\ndefeated 35\ndemonstrations 35\ndetail 35\ndominant 35\ndrew 35\nemerge 35\nemissions 35\nentrepreneurs 35\neveryday 35\nfighter 35\nfix 35\nflexible 35\nflood 35\ngunmen 35\nincentive 35\nindicted 35\nindictment 35\nislands 35\nkeeps 35\nlandscape 35\nliked 35\nliterally 35\nlongest 35\nmerge 35\nmissed 35\nmom 
35\nmotherland 35\nmotor 35\nplatform 35\npregnant 35\nprosecution 35\nproven 35\nrefer 35\nregarded 35\nrepeat 35\nrip 35\nrolled 35\nsalary 35\nsorry 35\nsorts 35\nstages 35\nstamp 35\nsuspects 35\nsuspension 35\nterrible 35\ntie 35\ntowns 35\ntree 35\nunsecured 35\nupset 35\nvacation 35\nvaccine 35\nvigorously 35\nwhites 35\nwitness 35\n1960s 34\n8.50 34\nBe 34\nCarol 34\nCarter 34\nCola 34\nDelmed 34\nEngelken 34\nEstate 34\nFuji 34\nGonzalez 34\nIncident 34\nKodak 34\nMesa 34\nOcean 34\nOrleans 34\nOrtega 34\nPoster 34\nProject 34\nProtection 34\nRobertson 34\nRoger 34\nShiites 34\nShould 34\nShu 34\nSpeaking 34\nSteinhardt 34\nVegas 34\nVladimir 34\naboard 34\nacceptable 34\nactor 34\naddressed 34\naims 34\nantitrust 34\napparel 34\narrival 34\nawareness 34\nbitter 34\nbonded 34\nborrowing 34\ncalm 34\ncellular 34\ncircuit 34\ncivilization 34\ncollision 34\nconsiders 34\ncooperate 34\ndepressed 34\ndesignated 34\ndevoted 34\neased 34\nentitled 34\nentrepreneur 34\nequally 34\nexclusively 34\nexcuse 34\nexported 34\nextensive 34\nfarmer 34\nfloating 34\nforcing 34\ngrants 34\ngrateful 34\nhits 34\nhometown 34\nhorses 34\nignore 34\ninland 34\ninnocent 34\ninsists 34\ninstructions 34\ninsurer 34\njail 34\nliberal 34\nlowest 34\nmainstream 34\nmobile 34\nmultinational 34\npackages 34\npanic 34\npermission 34\npermitted 34\nplastics 34\nports 34\nprobe 34\npromoted 34\nprosecutor 34\nquestioned 34\nrecording 34\nrepairs 34\nrolling 34\nroutine 34\nsatisfied 34\nscenes 34\nscheme 34\nscores 34\nshake 34\nshine 34\nslid 34\nsmoothly 34\nsoldier 34\nspecially 34\nspeculators 34\nstream 34\nsuggesting 34\nsurprisingly 34\nsyndicate 34\ntrash 34\ntreaty 34\nwelcomed 34\nwine 34\n1.25 33\n64 33\n73 33\n< 33\n> 33\nANC 33\nAetna 33\nAnderson 33\nBaltimore 33\nBankers 33\nBunuel 33\nChiang 33\nColorado 33\nCorry 33\nD 33\nDemocrat 33\nDu 33\nEve 33\nFoods 33\nLisa 33\nMars 33\nNetwork 33\nNikkei 33\nNothing 33\nPersian 33\nPosted 33\nRome 33\nRoy 33\nSAR 33\nSamuel 
33\nSeoul 33\nSpring 33\nStocks 33\nTCAC 33\nTW 33\nU.N. 33\nZemin 33\nabortions 33\naccompanied 33\naccurate 33\nadult 33\nagenda 33\naides 33\narguing 33\narranged 33\nassigned 33\nbarriers 33\nbreakthrough 33\nbridges 33\ncage 33\ncategories 33\ncleaning 33\ncommittees 33\ncomponents 33\nconducting 33\nconsiderably 33\ncontend 33\nconvention 33\ncorrectly 33\ncorrupt 33\ncreates 33\ncultures 33\ndebts 33\ndemonstrators 33\ndepth 33\ndisabled 33\nexperimental 33\nexpertise 33\nexplaining 33\nfalls 33\nfault 33\nformation 33\nforming 33\nhardware 33\nhearts 33\nhell 33\nhurricane 33\nindustrials 33\ninventories 33\nlegs 33\nlesson 33\nm 33\nmatching 33\nnotion 33\noverwhelming 33\npenny 33\npleaded 33\nquit 33\nrack 33\nrebound 33\nrevive 33\nriding 33\nrough 33\nsample 33\nscared 33\nscenario 33\nsealed 33\nsecured 33\nseized 33\nsentenced 33\nsinger 33\nsmoking 33\nstakes 33\nsterling 33\nstrip 33\nsurrounded 33\nsweet 33\nswitched 33\ntai 33\nundertaking 33\nurge 33\nutilizing 33\nvital 33\nwaited 33\nwillingness 33\nwondering 33\nworries 33\n1. 32\n1.9 32\n19th 32\n2.3 32\n250,000 32\n3rd 32\n6th 32\n77 32\n9/16 32\n97 32\nAgreement 32\nAriel 32\nBache 32\nBarry 32\nCarl 32\nCoca 32\nCray 32\nDon 32\nEarly 32\nFormer 32\nGov. 
32\nHi 32\nHope 32\nIII 32\nInvestigation 32\nJay 32\nLilly 32\nLouisiana 32\nMartha 32\nMom 32\nNatural 32\nNight 32\nOregon 32\nPalestine 32\nRalph 32\nRey 32\nRichmond 32\nRoberts 32\nSeattle 32\nTrading 32\nWas 32\naccidents 32\nadopting 32\narts 32\nassociations 32\nattempted 32\nbikes 32\nbirds 32\nblind 32\nblocking 32\ncats 32\nchien 32\nclearing 32\nconcessions 32\nconstructed 32\ncontainer 32\ncourses 32\ndeparture 32\ndignity 32\ndocument 32\ndual 32\nexplosions 32\nextension 32\nfinancially 32\ngallon 32\nhall 32\nhotels 32\nhunting 32\nimpose 32\ninfected 32\nintend 32\nlasted 32\nlung 32\nobjectives 32\noccupied 32\norigin 32\nouter 32\nplanners 32\nportfolios 32\npraise 32\nprotecting 32\nproud 32\npublish 32\npulling 32\npunch 32\nracial 32\nrecapitalization 32\nreferred 32\nrefuse 32\nregardless 32\nreinsurance 32\nresearcher 32\nreveal 32\nring 32\nripped 32\nrises 32\nsake 32\nsatisfaction 32\nselection 32\nshadow 32\nshoulder 32\nspecified 32\nspy 32\nstomach 32\nstrictly 32\nstrongest 32\nsuperior 32\nsurveillance 32\ntested 32\ntownship 32\ntrail 32\nversions 32\nvessel 32\nviolation 32\nwarming 32\nwithdrew 32\n11th 31\n4.6 31\n59 31\n6,000 31\nAllianz 31\nBethlehem 31\nBudget 31\nChun 31\nColgate 31\nConnecticut 31\nDaiwa 31\nDalian 31\nDonaldson 31\nEnvironmental 31\nEric 31\nEverybody 31\nFinland 31\nGermans 31\nHerzegovina 31\nIcahn 31\nImperial 31\nKuo 31\nLine 31\nM 31\nMich. 
31\nMiss 31\nNEC 31\nNanjing 31\nNobody 31\nOF 31\nOverseas 31\nPlanning 31\nRonald 31\nSquare 31\nStanding 31\nTax 31\nThomson 31\nTiger 31\nWilson 31\nWork 31\nachievement 31\naggressively 31\nannouncing 31\nappealed 31\nappropriations 31\nassuming 31\nbags 31\nbeach 31\nbelong 31\nbird 31\nbloody 31\nborders 31\nborrow 31\nbroader 31\nbudgets 31\nbuys 31\ncaptain 31\nchampionship 31\ncompares 31\ncontaining 31\ncontractor 31\ncriteria 31\ncry 31\ndare 31\ndecides 31\ndecisive 31\ndeliberately 31\ndemonstrated 31\ndiscrimination 31\ndrafted 31\nearning 31\neasing 31\nelectoral 31\nerrors 31\nexempt 31\nfinishing 31\nflew 31\nframework 31\nhung 31\nimplications 31\ninjury 31\ninstalled 31\ninstitute 31\nissuing 31\njourney 31\nlaser 31\nliquid 31\nloose 31\nmachinists 31\nmanufacture 31\nmarketplace 31\nmess 31\nmood 31\nmuseum 31\nnoon 31\nobjective 31\nocean 31\nodds 31\noldest 31\nplummeted 31\npostponed 31\npromising 31\npsyllium 31\npure 31\nreceives 31\nreception 31\nrecognition 31\nrequirement 31\nresponding 31\nreviews 31\nsacrifice 31\nslump 31\nsoak 31\nstretch 31\nsuck 31\nsupplier 31\nsuspend 31\ntale 31\nteeth 31\ntemperature 31\nthrown 31\ntuition 31\nvary 31\nviolations 31\nwarrant 31\n102 30\n1967 30\n1972 30\n2. 
30\n2.6 30\n7/16 30\nAkzo 30\nBonds 30\nCall 30\nCambodia 30\nChurch 30\nCommand 30\nDeputy 30\nDivision 30\nEconomy 30\nGandhi 30\nGeneva 30\nHarvard 30\nHills 30\nIndonesia 30\nJupiter 30\nLewis 30\nMahmoud 30\nMarines 30\nNomura 30\nNortheast 30\nParamount 30\nPart 30\nPhil 30\nPilson 30\nPont 30\nProfit 30\nQin 30\nReform 30\nRemember 30\nRitek 30\nShell 30\nShort 30\nStation 30\nStatistics 30\nTemple 30\nTerry 30\nabsolute 30\nabsorb 30\naccelerate 30\naccelerating 30\nacres 30\nanxious 30\narchitecture 30\nassured 30\nathletes 30\nbelongs 30\nbesides 30\nblocked 30\nbowl 30\nbusinessman 30\ncamps 30\ncertificate 30\nclassified 30\nclubs 30\ncommunist 30\nconfusion 30\nconsent 30\ncontained 30\nconverted 30\nconvince 30\ndecent 30\ndeclaration 30\ndeclares 30\ndeeper 30\ndowns 30\nfabric 30\nfastest 30\nfierce 30\nfinanced 30\nfirmly 30\nframe 30\nfunny 30\nfurniture 30\nghost 30\ngifts 30\nharbor 30\nharsh 30\nhelicopters 30\nhorrible 30\nhui 30\nhumanity 30\nintegration 30\njudiciary 30\nlessons 30\nlifted 30\nmassage 30\nmedal 30\nmercy 30\nmines 30\nminorities 30\nmorality 30\npaint 30\nplea 30\npremiums 30\nprojections 30\nprotein 30\nprotesters 30\npulp 30\nquantities 30\nrecommendations 30\nreconciliation 30\nrecycling 30\nrelating 30\nrelying 30\nremoving 30\nreverse 30\nreviewing 30\nrubber 30\nsad 30\nseemingly 30\nsmashed 30\nstruggling 30\ntheories 30\ntherapy 30\ntired 30\ntraditionally 30\ntrips 30\nups 30\nvisible 30\n0 29\n125 29\n15th 29\n170 29\n2.8 29\n2nd 29\n3.9 29\n300,000 29\n5th 29\nAbdullah 29\nArthur 29\nBerkeley 29\nBuilding 29\nCoke 29\nCommercial 29\nComposite 29\nCoors 29\nDe 29\nFax 29\nFederation 29\nFidelity 29\nFourth 29\nFreedom 29\nG. 29\nGalileo 29\nGerald 29\nJeffrey 29\nJiangsu 29\nMaine 29\nMalcolm 29\nMaynard 29\nMerc 29\nObviously 29\nPC 29\nPhelan 29\nPlan 29\nPuli 29\nRemic 29\nRockefeller 29\nS&L 29\nSocialist 29\nStockholm 29\nUsing 29\nVa. 
29\nVery 29\nWomen 29\nWorking 29\nYi 29\naccepting 29\nadmission 29\nanxiety 29\nassault 29\nblew 29\nbombings 29\ncampaigns 29\ncapped 29\ncommerce 29\nconcentrated 29\nconsistently 29\ncouples 29\ncrack 29\ncreativity 29\ncubic 29\ndrilling 29\ndurable 29\nelite 29\nengage 29\nescaped 29\nexpiration 29\nfailures 29\nfan 29\nforgotten 29\ngrades 29\ngrows 29\nhalted 29\nhence 29\nholder 29\nindications 29\ninterim 29\ninternationally 29\ninterviewed 29\ninvasion 29\ninventory 29\nlady 29\nlately 29\nlaugh 29\nleverage 29\nlifetime 29\nlongtime 29\nmanages 29\nmidst 29\nmill 29\nmoon 29\nmouse 29\nmystery 29\nobligations 29\noriented 29\nouts 29\npartnerships 29\npasses 29\npenalties 29\npersuade 29\nphilosophy 29\npill 29\nplain 29\npointing 29\npossibilities 29\nprecise 29\nprojected 29\nprotests 29\npublicity 29\nquote 29\nrallied 29\nreconstruction 29\nresponses 29\nrestored 29\nresumed 29\nrow 29\nrubble 29\nsectarian 29\nseventh 29\nshame 29\nsilly 29\nsimultaneously 29\nslowed 29\nsolved 29\nsooner 29\nspoken 29\nstaying 29\nsued 29\nsweeping 29\ntears 29\ntoy 29\ntranslator 29\ntries 29\nunfair 29\nunited 29\nwindows 29\n16th 28\n1973 28\n400,000 28\n486 28\n61 28\n76 28\n93 28\nAlex 28\nArkansas 28\nAttorney 28\nBarney 28\nBook 28\nBrands 28\nCatholic 28\nChao 28\nChase 28\nChu 28\nCompanies 28\nConstitution 28\nCould 28\nDelaware 28\nDenver 28\nEarthquake 28\nEcuador 28\nEdwards 28\nFROM 28\nFair 28\nFamily 28\nFarmers 28\nFort 28\nFranklin 28\nGiuliani 28\nHolland 28\nLawyers 28\nLimited 28\nLove 28\nMarcos 28\nMarine 28\nMcGrady 28\nMerksamer 28\nMideast 28\nMohammed 28\nMoussa 28\nMubarak 28\nNicholas 28\nOptions 28\nOut 28\nP. 
28\nPinkerton 28\nRussians 28\nSalvador 28\nSeidman 28\nSenior 28\nSwedish 28\nViacom 28\nWarsaw 28\nWelcome 28\nWitter 28\naccusations 28\nallocation 28\nally 28\nanswered 28\napple 28\nbargain 28\nbeating 28\nbolster 28\nbombs 28\nbreeding 28\nburned 28\ncapabilities 28\ncasualties 28\ncattle 28\ncement 28\nchair 28\ncharities 28\nchildhood 28\ncollateral 28\ncombine 28\ncomedy 28\ncommonly 28\ncotton 28\ndated 28\ndefinitive 28\ndesert 28\ndesperate 28\ndiscovery 28\ndispatched 28\ndust 28\nearliest 28\nengineer 28\nenhanced 28\nexact 28\nexciting 28\nexpire 28\nexploded 28\nfactions 28\nfare 28\nfavored 28\nfever 28\nglory 28\ngrounded 28\nhang 28\nholy 28\nhomeland 28\nindicators 28\ninspired 28\ninstrument 28\ninvestigate 28\njoins 28\njustify 28\nlaunching 28\nlease 28\nlegislator 28\nmanufactured 28\nmental 28\nmiss 28\nnationwide 28\nneighboring 28\nnickel 28\noffensive 28\nopenly 28\norganize 28\npackaging 28\npatterns 28\npolitically 28\npolitician 28\nprecisely 28\npregnancy 28\nprehistoric 28\npressing 28\nprinting 28\nproceed 28\npublication 28\nreasonably 28\nreductions 28\nrelaxed 28\nrender 28\nrestrict 28\nrobust 28\nscattered 28\nseeds 28\nseller 28\nsheets 28\nshore 28\nsole 28\nspin 28\nspiritual 28\nspreading 28\nstance 28\nstrive 28\nsubsequent 28\ntransform 28\ntreatments 28\nunclear 28\nunexpected 28\nupward 28\nurgent 28\nvoices 28\nwarfare 28\nwaves 28\nweakening 28\n12th 27\n2.7 27\n240 27\n3.1 27\n4,000 27\n5.5 27\n6.9 27\n79 27\n<< 27\n>> 27\nAdministrative 27\nAlbright 27\nAmex 27\nAziz 27\nBankruptcy 27\nBarbara 27\nBartlett 27\nBoren 27\nCabinet 27\nCaribbean 27\nCarlos 27\nCivil 27\nCorporate 27\nFidel 27\nGarden 27\nGet 27\nGlass 27\nHello 27\nHotel 27\nHsiang 27\nIndians 27\nJews 27\nKellogg 27\nKentucky 27\nLarry 27\nLuneng 27\nMake 27\nMother 27\nNorfolk 27\nOsama 27\nPer 27\nPolish 27\nSheikh 27\nShowtime 27\nSix 27\nStatesWest 27\nTibetan 27\nTitle 27\nUnless 27\nVenezuela 27\nVillage 27\nWaleed 27\nWarren 27\nWhatever 27\nXu 
27\naccumulated 27\nactors 27\nadvances 27\naftermath 27\nagrees 27\naltogether 27\nambitious 27\napproaching 27\nballot 27\nbasket 27\nbearing 27\nbloc 27\nbunch 27\ncancel 27\ncarbon 27\ncigarettes 27\nclimbing 27\ncommander 27\ncompound 27\nconceded 27\nconfused 27\nconnections 27\nconscious 27\nconservation 27\ncontractors 27\ncope 27\ncounting 27\ncrowds 27\ndating 27\ndealings 27\ndemonstrate 27\ndenies 27\ndiluted 27\ndiscover 27\ndiversified 27\ndump 27\neh 27\neliminating 27\nevolution 27\nexcited 27\nexpressing 27\nfeared 27\nfinger 27\nfoods 27\nfreely 27\nfrozen 27\nguards 27\nguest 27\nhabits 27\nhero 27\nhybrid 27\nidle 27\nignorance 27\nimpressive 27\nincline 27\ninevitably 27\ninstant 27\nissuance 27\nla 27\nlaptop 27\nleasing 27\nlend 27\nmart 27\nmeasured 27\nnecessity 27\nok 27\nopposing 27\nowed 27\npack 27\nphones 27\npitch 27\npleasant 27\npostpone 27\npraised 27\npray 27\nprize 27\nprohibited 27\npropose 27\npursuit 27\nreacted 27\nregister 27\nremote 27\nrepurchase 27\nresident 27\nresign 27\nresource 27\nreunification 27\nrushed 27\nscaled 27\nscript 27\nsecrets 27\nsessions 27\nshelf 27\nshelter 27\nsinging 27\nsits 27\nski 27\nslower 27\nsmart 27\nsoaring 27\nsomehow 27\nsqueeze 27\nstealing 27\nsteam 27\nstem 27\nstrategist 27\nstring 27\nsubcommittee 27\nsucceeding 27\ntakeovers 27\ntargeting 27\ntopics 27\ntower 27\ntransformation 27\nunified 27\nutilization 27\nvigorous 27\nwash 27\nwhereas 27\nwishes 27\n3.3 26\n3.6 26\n3.8 26\n4.7 26\n9th 26\nAnhui 26\nAnthony 26\nAvenue 26\nBanks 26\nBased 26\nBloomingdale 26\nBrussels 26\nCathay 26\nConner 26\nCraig 26\nDC 26\nElectronics 26\nElsewhere 26\nEngineering 26\nFallujah 26\nFees 26\nFew 26\nGamble 26\nGarcia 26\nHeaven 26\nIMF 26\nIndependent 26\nIslands 26\nJerry 26\nJonathan 26\nLufkin 26\nMaliki 26\nMercury 26\nMinneapolis 26\nMount 26\nNTU 26\nNearly 26\nOklahoma 26\nProcter 26\nQuotron 26\nRamallah 26\nRaymond 26\nRegarding 26\nRick 26\nScotland 26\nSecondly 26\nSimon 26\nSons 
26\nStaff 26\nSterling 26\nTechnologies 26\nTed 26\nTelegraph 26\nTrelleborg 26\nUnisys 26\nUnnamed 26\nWisconsin 26\nYOU 26\nacted 26\nadvocates 26\nagreeing 26\nanticipation 26\napartments 26\naudiences 26\nbattlefield 26\nbin 26\nboosting 26\nbother 26\nbrewing 26\nbureaucracy 26\ncamp 26\ncapita 26\ncelebrate 26\nchamber 26\ncivilian 26\nclan 26\nconfirmation 26\nconglomerate 26\ncontinuous 26\nconversations 26\ncrazy 26\ncrossing 26\ncrowded 26\ncustom 26\ndad 26\ndates 26\ndelta 26\ndeployment 26\ndesigner 26\ndrought 26\ndubbed 26\neducated 26\nemployer 26\nemploys 26\nenjoys 26\nentities 26\nexplosive 26\nextending 26\nfifty 26\nfilling 26\nfinances 26\nfits 26\nfreeze 26\ngraduated 26\ngray 26\nguided 26\ninfluenced 26\ninfluential 26\ninfo 26\njoke 26\njournalist 26\njunior 26\nkick 26\nkitchen 26\nlaboratory 26\nlasting 26\nlaying 26\nliquor 26\nlock 26\nlocked 26\nmainframes 26\nmasses 26\nmatches 26\nmeets 26\nmemories 26\nmerged 26\nmiracle 26\nmonopoly 26\nnaval 26\nnonetheless 26\noptical 26\noutlets 26\noverhaul 26\nowning 26\nperiods 26\npills 26\npleasure 26\npot 26\npredicting 26\nrailway 26\nrammed 26\nranked 26\nrational 26\nrefugee 26\nrelax 26\nrental 26\nrevisions 26\nseize 26\nsenator 26\nsettlements 26\nsettling 26\nseverely 26\nsilent 26\nsluggish 26\nsmile 26\nsubstance 26\ntear 26\nthrash 26\nunexpectedly 26\nunions 26\nuniversal 26\nunnecessary 26\nuprising 26\nvalid 26\nwounds 26\n・ 26\n150,000 25\n18th 25\n1971 25\n270 25\n3.4 25\n30th 25\n4.2 25\n40,000 25\n5.3 25\n91 25\nAllied 25\nAlready 25\nAnyone 25\nAnyway 25\nBa 25\nBalqes 25\nBlue 25\nChavez 25\nCircuit 25\nCode 25\nColombia 25\nCompared 25\nContras 25\nDue 25\nEvans 25\nEventually 25\nFatah 25\nFiat 25\nGemayel 25\nGramm 25\nHastings 25\nINC. 25\nKosovo 25\nKyrgyzstan 25\nLevine 25\nLikud 25\nLocal 25\nN. 
25\nPalace 25\nPanamanian 25\nPut 25\nQuebecor 25\nRally 25\nResults 25\nRice 25\nRockets 25\nRudman 25\nSong 25\nSoon 25\nSouthwest 25\nStevens 25\nSusan 25\nTBAD 25\nTen 25\nTexaco 25\nVermont 25\nWeb 25\nYale 25\nYe 25\nYeah 25\nYing 25\naborigines 25\nacceptance 25\nacknowledges 25\nalter 25\naluminum 25\napplying 25\narena 25\narrangements 25\nattitudes 25\nbattery 25\nbidders 25\nbiochips 25\nbottles 25\nbread 25\ncalendar 25\nchallenged 25\nchaos 25\ncharacterized 25\ncollected 25\ncompensate 25\ncompetitiveness 25\nconscience 25\nconsciousness 25\nconsolidated 25\nconsortium 25\nconversation 25\ndangers 25\ndeciding 25\ndisappeared 25\ndiscounting 25\ndisplays 25\ndisputed 25\ndock 25\ndumped 25\neaten 25\nelementary 25\nenabled 25\nevident 25\nexecute 25\nfinancier 25\nfines 25\nfitness 25\nflavor 25\nforests 25\nfounding 25\nfundamentals 25\nglad 25\ngrave 25\ngrip 25\nholidays 25\nhungry 25\nignoring 25\nimmigrants 25\ninevitable 25\nlanded 25\nlegally 25\nlire 25\nlo 25\nlobby 25\nmessages 25\nnoise 25\nobserved 25\noccasionally 25\npayroll 25\npeacekeeping 25\npizza 25\nprevented 25\npro 25\nquestioning 25\nrail 25\nrailroad 25\nrank 25\nrat 25\nrefusal 25\nrepublic 25\nreset 25\nrespondents 25\nretained 25\nrocket 25\nsheep 25\nshipped 25\nshoe 25\nshoot 25\nsocialism 25\nsolely 25\nspeaker 25\nstops 25\nsubscribers 25\nsuper 25\nsurveyed 25\nsuspicious 25\ntap 25\ntaxable 25\ntense 25\ntentatively 25\nthick 25\nthreshold 25\ntires 25\ntrials 25\nuniform 25\nvegetables 25\nviolating 25\nvisa 25\nwinners 25\nwrapped 25\n14th 24\n2.1 24\n225 24\n8.5 24\n8th 24\n?? 24\nAgain 24\nAlto 24\nAriz. 24\nArlington 24\nBase 24\nChevron 24\nCos. 24\nDarwin 24\nDebra 24\nDing 24\nDisney 24\nFla. 
24\nForum 24\nG 24\nGates 24\nHao 24\nHarbor 24\nHo 24\nHooker 24\nJihad 24\nJimmy 24\nJordanian 24\nKofi 24\nLe 24\nLibya 24\nLone 24\nMembers 24\nMercantile 24\nMidwest 24\nMikhail 24\nMilton 24\nNO 24\nNader 24\nNone 24\nOperating 24\nPictures 24\nProceeds 24\nPu 24\nPublishing 24\nQian 24\nQuebec 24\nRATE 24\nRICO 24\nReynolds 24\nRock 24\nRose 24\nRothschild 24\nSaint 24\nStein 24\nT 24\nTotal 24\nTurkish 24\nUse 24\nUtah 24\nVan 24\nVictor 24\nVietnamese 24\nWei 24\nZhejiang 24\nZhu 24\na.m 24\nachieving 24\nadvise 24\nages 24\nallocated 24\nanthem 24\narray 24\nasserted 24\nassess 24\nathlete 24\nb 24\nbag 24\nbears 24\nbeings 24\nbomber 24\nbombers 24\nbottle 24\ncalculations 24\ncapitalist 24\nchampion 24\nclash 24\ncleanup 24\ncoach 24\ncoins 24\ncomic 24\ncommit 24\ncompliance 24\nconcedes 24\nconsultation 24\ncounterparts 24\ncourage 24\ncritic 24\ndealt 24\ndedicated 24\ndependent 24\ndisagree 24\ndivers 24\neligible 24\nembarrassment 24\nendless 24\nengineered 24\nessence 24\nexceeding 24\nfigured 24\nfilings 24\nflows 24\nformat 24\nfusion 24\ngambling 24\ngenuine 24\ngiants 24\ngod 24\ngrace 24\ngrandmother 24\nharmony 24\nharvest 24\nhonored 24\nhopefully 24\nillness 24\nimportantly 24\njewelry 24\njournal 24\nkicked 24\nknocked 24\nkuang 24\nlimitations 24\nmaintains 24\nmaps 24\nmasters 24\nmilitias 24\nmuscle 24\nmusical 24\nmutually 24\nnegotiator 24\nobjects 24\noccasions 24\noils 24\nokay 24\npassion 24\nphrase 24\npit 24\npractically 24\npredictions 24\nprivatization 24\npsychology 24\nram 24\nreactions 24\nrealizing 24\nredemption 24\nrepay 24\nreplacing 24\nresistant 24\nrespected 24\nsafer 24\nsalt 24\nseparation 24\nskeptical 24\nskill 24\nsmash 24\nsparked 24\nspeaks 24\nsqueezed 24\nstupid 24\nsuggestion 24\nsurvivors 24\nsymptoms 24\ntrapped 24\ntribe 24\nturnaround 24\nunderwriter 24\nunfortunately 24\nunveiled 24\nvolumes 24\nvolunteers 24\nwarnings 24\nwei 24\nwells 24\nwings 24\n160 23\n17th 23\n3. 
23\n3.7 23\n450 23\n7.5 23\n81 23\n87 23\nAcquisition 23\nAfghan 23\nAlaska 23\nAmoco 23\nApart 23\nBeverly 23\nBlock 23\nCable 23\nCar 23\nChamber 23\nCurrency 23\nDNA 23\nENA 23\nEST 23\nElectronic 23\nElizabeth 23\nEmail 23\nFaced 23\nFeb. 23\nFifth 23\nFirestone 23\nFlight 23\nFujitsu 23\nGansu 23\nGrandma 23\nHakka 23\nHence 23\nHughes 23\nInterest 23\nJazeera 23\nKansas 23\nKelly 23\nKevin 23\nLegal 23\nLibyan 23\nLinda 23\nLive 23\nLotus 23\nMass 23\nMen 23\nMontreal 23\nMotorola 23\nMyers 23\nNor 23\nOrder 23\nOverall 23\nPS 23\nPackwood 23\nPalo 23\nPatrick 23\nPerlingiere 23\nPhone 23\nPlus 23\nPolaroid 23\nPolicy 23\nPuerto 23\nQuestion 23\nResolution 23\nS 23\nScottish 23\nSearle 23\nShaanxi 23\nSweden 23\nTa 23\nTown 23\nTsung 23\nWPP 23\nWhitbread 23\nWorth 23\naboriginal 23\naccountability 23\naccountable 23\nacid 23\nactress 23\nadapt 23\nadverse 23\naffordable 23\napples 23\nassociates 23\navoided 23\nbacks 23\nbands 23\nbarrier 23\nbench 23\nbias 23\nblocks 23\nboils 23\nbreakdown 23\nbreakfast 23\nbuilders 23\ncharacteristic 23\nchess 23\nchoosing 23\nclinical 23\ncompleting 23\ncomposed 23\nconcepts 23\nconclude 23\ncondemned 23\nconsiderations 23\nconstitute 23\ncontents 23\ncounterpart 23\ncumulative 23\ncure 23\ndaughters 23\ndeficits 23\ndesigns 23\ndevastating 23\ndigital 23\ndirections 23\ndiscounts 23\ndismissed 23\ndistribute 23\ndiversity 23\ndressed 23\nedges 23\nembargo 23\nevaluation 23\nexam 23\nexpires 23\nexporting 23\nfake 23\nfashionable 23\nfatal 23\nfeaturing 23\nfed 23\nfruits 23\ngate 23\ngather 23\ngrowers 23\ngyrations 23\nheating 23\nhefty 23\nhottest 23\nhumor 23\ni.e. 
23\nillegally 23\nimpeachment 23\nindependently 23\ninformal 23\ninitiatives 23\nink 23\ninsisting 23\ninterpretation 23\njet 23\njoy 23\njurisdiction 23\nkiller 23\nlicenses 23\nlinking 23\nmagic 23\nmeals 23\nmemo 23\nmilitant 23\nmounting 23\npacer 23\npackaged 23\npacked 23\npainted 23\npeasants 23\nplaintiffs 23\npossess 23\npresidents 23\nprint 23\nquotations 23\nracked 23\nracketeering 23\nrandom 23\nreact 23\nrebuild 23\nreceipts 23\nrecommendation 23\nreduces 23\nrefueling 23\nrelied 23\nreminds 23\nresidence 23\nresorts 23\nrestructure 23\nrue 23\nsearching 23\nseasonal 23\nseasonally 23\nservants 23\nsetback 23\nsignals 23\nslogan 23\nsnow 23\nsteelmakers 23\nsubsidy 23\nsuppliers 23\nsurveys 23\nswimming 23\ntalents 23\nterminal 23\nthereafter 23\nthread 23\ntimetable 23\ntip 23\ntougher 23\ntracking 23\ntroop 23\nunhappy 23\nunusually 23\nviolate 23\nweakened 23\n.? 22\n1976 22\n25,000 22\n713 22\n84 22\n86 22\n92 22\n96 22\nAdvanced 22\nAeroflot 22\nAlliance 22\nAlong 22\nAmid 22\nAppropriations 22\nBally 22\nBankAmerica 22\nBonded 22\nBrother 22\nCORP. 22\nCTI 22\nCenTrust 22\nChildren 22\nClearly 22\nCold 22\nConsumer 22\nDeloitte 22\nDollar 22\nEquity 22\nEstimated 22\nEver 22\nFellowship 22\nForest 22\nFurther 22\nGiants 22\nHIV 22\nHainan 22\nHealthVest 22\nHoly 22\nHoneywell 22\nIll. 
22\nIowa 22\nJacobson 22\nJenrette 22\nLBO 22\nLeader 22\nMadrid 22\nManila 22\nMarina 22\nMeet 22\nMexican 22\nMonetary 22\nNFL 22\nNicaraguan 22\nNigel 22\nNorthwest 22\nPAPER 22\nPolitburo 22\nPolitical 22\nRelations 22\nRepresentatives 22\nShih 22\nShow 22\nSpielvogel 22\nSunnis 22\nTVA 22\nTonight 22\nTower 22\nYangtze 22\nabandon 22\nabsurd 22\naccelerated 22\naccomplished 22\naffluent 22\nambitions 22\namounted 22\nanalyze 22\nanticipate 22\napplicants 22\napproaches 22\nartillery 22\naudio 22\nbeef 22\nboxes 22\nbreach 22\nbreath 22\nbriefly 22\nbureaucrats 22\ncarpet 22\ncatastrophic 22\nchih 22\ncholesterol 22\nclock 22\ncollar 22\ncomponent 22\ncomputing 22\nconclusions 22\nconnecting 22\nconsolidation 22\nconversion 22\nconvoy 22\ncopying 22\ncrises 22\ncrops 22\ncruise 22\ncup 22\ndear 22\ndebut 22\ndefendant 22\ndelegates 22\ndeployed 22\ndescription 22\ndeserves 22\ndirty 22\ndismissal 22\ndistributor 22\neconomically 22\nemperor 22\nentity 22\nentrance 22\nexplosives 22\nfails 22\nfantastic 22\nfares 22\nfiring 22\nfool 22\nfuneral 22\ngarage 22\ngear 22\ngift 22\nglobe 22\ngovernmental 22\ngraduates 22\ngreeted 22\nguidance 22\nhabit 22\nhardest 22\nhatred 22\nheadline 22\nheritage 22\nhomeless 22\nimpressed 22\ninadequate 22\nincest 22\ninjunction 22\ninnovative 22\nintellectuals 22\ninterfere 22\ninvite 22\nlayer 22\nlean 22\nliabilities 22\nlifting 22\nlocals 22\nloud 22\nlovely 22\nmagnitude 22\nmales 22\nmandatory 22\nmarginal 22\nmassacre 22\nmighty 22\nminimal 22\nminus 22\nmodernization 22\nmud 22\nmurdered 22\nmurders 22\nmuscles 22\nneighborhoods 22\nnewcomers 22\nnotified 22\nobserve 22\noccurs 22\noffshore 22\noverhead 22\npassage 22\npatents 22\nperformances 22\nplanner 22\npledge 22\npocket 22\npockets 22\npoorly 22\npreference 22\npresents 22\nprocesses 22\nproposing 22\nprospective 22\nproves 22\npurely 22\nrebounded 22\nrebuilding 22\nrecount 22\nrecovering 22\nrefers 22\nrefusing 22\nreiterated 22\nridiculous 22\nroles 22\nroots 
22\nsalaries 22\nsank 22\nshaking 22\nshocked 22\nsights 22\nsignature 22\nsingers 22\nspends 22\nstabilize 22\nstabilizing 22\nstadium 22\nstaged 22\nstones 22\nsubsidized 22\nsummary 22\nsurrender 22\nswap 22\nsyndrome 22\ntends 22\nthrashed 22\nthreaten 22\ntimely 22\ntoxic 22\ntraveled 22\ntreating 22\ntrimmed 22\nunderstands 22\nunrest 22\nwanting 22\nwarns 22\nwasted 22\nwherever 22\nwidened 22\n106 21\n1950s 21\n1969 21\n2019 21\n26th 21\n5.9 21\n74 21\n7th 21\n8,000 21\nAh 21\nAnheuser 21\nArmed 21\nBass 21\nBenjamin 21\nBeyond 21\nBruce 21\nCFCs 21\nCancer 21\nCargill 21\nCarpenter 21\nChris 21\nChristians 21\nChristie 21\nClara 21\nColin 21\nCommodity 21\nDeng 21\nDynamics 21\nEPA 21\nEarnings 21\nEd 21\nEither 21\nEthiopia 21\nEurocom 21\nFactory 21\nFargo 21\nGoodson 21\nGould 21\nGuinness 21\nHawaii 21\nHidden 21\nHsieh 21\nIngersoll 21\nJean 21\nJiujiang 21\nKatz 21\nKemper 21\nKenneth 21\nLate 21\nLaurie 21\nLord 21\nMORTGAGE 21\nMagazine 21\nMeredith 21\nMetropolitan 21\nMovement 21\nNelson 21\nNotes 21\nPemex 21\nPhilippine 21\nPrivate 21\nProfessional 21\nReports 21\nTencent 21\nTimor 21\nTribunal 21\nU 21\nU. 
21\nUpon 21\nWay 21\nWright 21\nWuhan 21\nYeltsin 21\nabuses 21\naccommodate 21\naccuracy 21\nadjustable 21\nadvisory 21\naffiliates 21\nairports 21\nalcohol 21\namended 21\nappearing 21\nattendants 21\nattraction 21\naveraged 21\nbeneath 21\nbenefited 21\nbilling 21\nbonus 21\nbonuses 21\nbosses 21\nbreast 21\nbroadly 21\nbullet 21\ncapture 21\ncatastrophe 21\nchapter 21\nchecking 21\nchickens 21\ncigarette 21\ncollectors 21\ncolleges 21\ncolonial 21\ncolumns 21\ncomfort 21\ncommanders 21\ncommercials 21\ncommissioner 21\ncompiled 21\ncomplaining 21\nconcludes 21\ncongressman 21\nconsistent 21\nconstruct 21\nconsuming 21\ncontracting 21\nconvened 21\ncooking 21\ncorners 21\ndawn 21\ndeclaring 21\ndetect 21\ndeterioration 21\ndevelops 21\ndictatorship 21\ndisappointment 21\ndiving 21\ndrag 21\ndramatically 21\ndrops 21\nduck 21\nelder 21\nempire 21\nenvoy 21\nerupted 21\nevaluate 21\nexamine 21\nexamples 21\nexcitement 21\nexisted 21\nexists 21\nfarms 21\nfestival 21\nflesh 21\nfluid 21\nfocuses 21\nfueled 21\ngarbage 21\ngrab 21\ngrocery 21\nhesitate 21\nhighways 21\nhistorically 21\nhop 21\nhopeful 21\nimmune 21\ninstall 21\nintegrate 21\nisolated 21\nkidney 21\nkillings 21\nlackluster 21\nlacks 21\nlined 21\nliquidation 21\nlisting 21\nlively 21\nlonely 21\nlucrative 21\nlunar 21\nmarine 21\nmartial 21\nmentality 21\nmidnight 21\nmilitia 21\nmirror 21\nmodestly 21\nmortality 21\nnationally 21\nnightmare 21\nnotification 21\nobject 21\nobservation 21\nopens 21\nordering 21\nousted 21\noverly 21\nowe 21\npassive 21\nperception 21\nplate 21\npossesses 21\npouring 21\npredicts 21\npresenting 21\npreventing 21\npriorities 21\nprocessed 21\npromotional 21\nprompt 21\nprone 21\nreader 21\nreferendum 21\nrefinancing 21\nrefining 21\nregain 21\nrelates 21\nremoval 21\nrevision 21\nrocks 21\nrumored 21\nrushing 21\nsafely 21\nsatisfy 21\nseals 21\nsegments 21\nserial 21\nshy 21\nslip 21\nspare 21\nspecializes 21\nspokesperson 21\nstrain 21\ntechnicians 21\ntemperatures 
21\ntheft 21\nthemes 21\ntitles 21\ntragic 21\ntreasurer 21\ntrigger 21\ntube 21\ntumor 21\nunanimously 21\nvolunteer 21\nvowed 21\nwheat 21\nwhoever 21\nwinds 21\n0.3 20\n1,500 20\n1000 20\n2.9 20\n22nd 20\n37.5 20\nAB 20\nAdditionally 20\nAdvertising 20\nAsked 20\nAsset 20\nAssociated 20\nAttached 20\nAustin 20\nAvery 20\nBates 20\nBenson 20\nBible 20\nBobby 20\nBoyd 20\nBrooklyn 20\nCOMMERCIAL 20\nChernobyl 20\nCities 20\nClean 20\nColombian 20\nConnaught 20\nCounterparty 20\nCowboys 20\nCulture 20\nDarman 20\nDun 20\nEnlai 20\nEuro 20\nExchequer 20\nFHA 20\nFeng 20\nFootball 20\nGorges 20\nHDTV 20\nHague 20\nHangzhou 20\nHart 20\nHui 20\nIslam 20\nJamie 20\nJane 20\nJennifer 20\nKeep 20\nKemp 20\nKen 20\nKuomintang 20\nLONDON 20\nLarge 20\nLines 20\nLitigation 20\nMajority 20\nMartine 20\nMayer 20\nMei 20\nMerkur 20\nMinnesota 20\nMongolian 20\nMoore 20\nMurphy 20\nNSC 20\nNabisco 20\nNanning 20\nNetanyahu 20\nNorwegian 20\nPRIME 20\nPa. 20\nPersonal 20\nPharmaceutical 20\nPierre 20\nPort 20\nRATES 20\nRecords 20\nRenaissance 20\nRich 20\nRico 20\nRouge 20\nSansui 20\nSay 20\nSeagram 20\nSemel 20\nSilicon 20\nSources 20\nSquibb 20\nSullivan 20\nTandy 20\nTing 20\nTravel 20\nTreasurys 20\nUA 20\nUniversal 20\nUpham 20\naccompanying 20\naccomplish 20\nadoption 20\nadvocate 20\nangle 20\nannouncements 20\napplause 20\nastronauts 20\nattacking 20\nauthors 20\nautomatic 20\nawards 20\nbabies 20\nbackdrop 20\nballoon 20\nbanning 20\nbargaining 20\nbay 20\nbearish 20\nbeneficiaries 20\nbleeding 20\nboasts 20\nboiled 20\nbreed 20\nbubble 20\nbullish 20\nburn 20\ncameras 20\ncapitalism 20\ncarved 20\ncelebrated 20\nchallenging 20\nchart 20\nchecked 20\ncloud 20\nclouds 20\ncombining 20\ncompatible 20\ncomplications 20\ncomptroller 20\nconservatives 20\nconsists 20\nconsul 20\ncontributing 20\nconvenient 20\ncorrespondence 20\ncredible 20\ncrossed 20\ncrush 20\ncustody 20\ncyclical 20\ndefects 20\ndefended 20\ndepression 20\ndictator 20\ndomestically 20\ndominate 
20\ndonation 20\ndoubling 20\nemploy 20\nenjoying 20\nepisode 20\nequality 20\nexchanged 20\nexpectation 20\nexpired 20\nexploring 20\nfancy 20\nfashioned 20\nfax 20\nfeeding 20\nfend 20\nfiction 20\nflowing 20\nfossils 20\nfreeway 20\nfrustrated 20\ngaining 20\ngallons 20\ngender 20\ngrandma 20\nhandover 20\nhedge 20\nhip 20\nideological 20\nimplies 20\nimporting 20\ninclined 20\nincredible 20\ninitiated 20\ninput 20\nintensity 20\nintentions 20\ninterior 20\njurors 20\njustified 20\nknife 20\nlab 20\nlaboratories 20\nlacking 20\nlanguages 20\nlighter 20\nlikelihood 20\nliver 20\nlosers 20\nloses 20\nloves 20\nmanual 20\nmaturing 20\nmechanical 20\nmediator 20\nmice 20\nmisleading 20\nmissions 20\nmold 20\nnail 20\nnevertheless 20\nnights 20\nnoble 20\nobserver 20\novertime 20\npartially 20\npatience 20\nperformers 20\npillar 20\nplacing 20\nposed 20\npressed 20\nprevail 20\nprinted 20\nprofession 20\npromotions 20\npromptly 20\nprovider 20\nprovisional 20\npublishers 20\nqualify 20\nquotes 20\nrear 20\nrecruiting 20\nreflection 20\nregret 20\nrendered 20\nreservations 20\nresist 20\nrevealing 20\nrevolving 20\nrhetoric 20\nromantic 20\nsacred 20\nscored 20\nseparated 20\nshaped 20\nshells 20\nshield 20\nsink 20\nsinking 20\nsofter 20\nspecialized 20\nspeeches 20\nspur 20\nstatistical 20\nstriving 20\nstruggled 20\nsuburbs 20\nsworn 20\ntaxation 20\ntissue 20\ntorn 20\ntranslated 20\ntravelers 20\ntribes 20\ntune 20\nuncle 20\nuncovered 20\nundertake 20\nunification 20\nunrelated 20\nunsuccessful 20\nupgrade 20\nwalks 20\nwarn 20\nwives 20\nworn 20\nworrying 20\n+ 19\n107 19\n111 19\n1978 19\n2010 19\n6.3 19\n650 19\n78 19\nAchenbaum 19\nAlberta 19\nAmi 19\nAntonio 19\nArticle 19\nBattle 19\nBenz 19\nBetter 19\nBradstreet 19\nBristol 19\nCalgary 19\nChih 19\nCongressional 19\nConsider 19\nDPC 19\nDeath 19\nDeclaration 19\nDelhi 19\nDisease 19\nDoha 19\nEB 19\nEPM 19\nElection 19\nEli 19\nFilipino 19\nFireman 19\nFu 19\nGrace 19\nGrinch 19\nGuangxi 19\nHBO 
19\nHaiti 19\nHealthcare 19\nHistory 19\nHitachi 19\nHudson 19\nI. 19\nIRA 19\nITT 19\nKabul 19\nKeating 19\nKurds 19\nL.J. 19\nLDP 19\nLane 19\nLaura 19\nLeon 19\nLexus 19\nLiang 19\nLitvinenko 19\nLockerbie 19\nLopez 19\nLouisville 19\nMaidenform 19\nMargaret 19\nMarketing 19\nMaster 19\nMerieux 19\nMiklaszewski 19\nMills 19\nMontgomery 19\nNP 19\nOlivetti 19\nPreviously 19\nQueen 19\nRated 19\nReally 19\nRegards 19\nRevco 19\nRogers 19\nRowe 19\nSARS 19\nShakespeare 19\nSimilarly 19\nSolidarity 19\nStuart 19\nTO 19\nTeam 19\nTry 19\nWalker 19\nYan 19\nZhao 19\nabilities 19\naccusing 19\nacre 19\naffects 19\naging 19\nalbum 19\nalliances 19\napplies 19\nauthorization 19\nawful 19\nbaskets 19\nbelieving 19\nblindly 19\nbondholders 19\nbrown 19\nburst 19\ncampus 19\ncasualty 19\ncatalog 19\nceramic 19\nchairs 19\ncheapest 19\nchest 19\nclearance 19\nclick 19\nclinic 19\ncommentary 19\ncommented 19\ncommunique 19\ncommuters 19\nconfirms 19\nconsisting 19\nconstructing 19\ncoordinated 19\ncountrymen 19\ncovert 19\ndebris 19\ndecreased 19\ndesired 19\ndetained 19\ndetermining 19\ndiesel 19\ndiffer 19\ndifferently 19\ndiplomats 19\ndipped 19\ndisappear 19\ndiscontinued 19\ndiscounted 19\ndissident 19\ndonate 19\ndonating 19\ndragon 19\ndrain 19\ndrill 19\ndrummed 19\ndying 19\nenacted 19\nencourages 19\nenthusiastic 19\nenvironmentalists 19\nequities 19\ner 19\nethics 19\nexclude 19\nexpenditure 19\nexperiencing 19\nexploit 19\nfabrics 19\nfeatured 19\nfingers 19\nfires 19\nforecasting 19\nfragile 19\nfulfill 19\ngallery 19\ngirlfriend 19\ngrasp 19\nhandicapped 19\nhat 19\nheels 19\nhighs 19\nhouseholds 19\nignorant 19\nincomplete 19\ninfluences 19\ninquiries 19\ninsistence 19\ninstallation 19\ninsufficient 19\nintact 19\njar 19\nlacked 19\nmayoral 19\nmilitants 19\nministries 19\nmixture 19\nmount 19\nmulti-national 19\nna 19\nninth 19\nnortheast 19\nopera 19\nopponent 19\nozone 19\nparallel 19\npatriotic 19\npiano 19\npie 19\npm 19\nportable 19\npresentation 
19\npublications 19\nranges 19\nrealization 19\nrecalled 19\nrefined 19\nrelieved 19\nremainder 19\nremark 19\nreopen 19\nrepeating 19\nreplied 19\nrestructured 19\nretiring 19\nreveals 19\nsang 19\nscandals 19\nscholar 19\nseasons 19\nsections 19\nshelters 19\nshifting 19\nshit 19\nsignaled 19\nsmell 19\nsouthwest 19\nspacecraft 19\nspecifications 19\nstaffers 19\nstepping 19\nsticking 19\nstiff 19\nstipulated 19\nstretched 19\nsubsequently 19\nsuburban 19\nsubway 19\nsupplied 19\nsurpassed 19\nsweat 19\ntanker 19\ntemblor 19\ntensions 19\ntraditions 19\ntransmission 19\nunderway 19\nviable 19\nvoluntarily 19\nvoluntary 19\nwary 19\nwitnessed 19\nworship 19\nyang 19\nyard 19\nyielding 19\n..... 18\n0.1 18\n10th 18\n1977 18\n360 18\n4.4 18\n4.8 18\n6.25 18\n6.4 18\n8:00 18\n@ 18\nAhmed 18\nAlabama 18\nAlbert 18\nAmman 18\nArgentina 18\nArts 18\nBaltic 18\nBasin 18\nBeers 18\nBennett 18\nBernard 18\nBork 18\nCA 18\nCPC 18\nCiba 18\nCoalition 18\nCommonwealth 18\nConservative 18\nDaewoo 18\nDaimler 18\nDecker 18\nDozen 18\nDrabinsky 18\nEconomists 18\nFloating 18\nGardens 18\nGen 18\nGregory 18\nHad 18\nHappy 18\nHavana 18\nHenan 18\nHoffman 18\nHsinchu 18\nHyundai 18\nIF 18\nInter 18\nInternal 18\nIranians 18\nJanet 18\nJen 18\nJessica 18\nJoint 18\nKashmir 18\nKnight 18\nLockheed 18\nLorenzo 18\nMachinists 18\nMadeleine 18\nMaria 18\nMarxist 18\nMcDonough 18\nMedia 18\nMidland 18\nMinpeco 18\nMips 18\nMorishita 18\nMullins 18\nNashua 18\nNature 18\nNestle 18\nNine 18\nNook 18\nOppenheimer 18\nPakistani 18\nPartnership 18\nProbe 18\nProperty 18\nQing 18\nR 18\nReebok 18\nReview 18\nRican 18\nRichter 18\nRural 18\nSacramento 18\nSanford 18\nScientists 18\nSerb 18\nSeven 18\nShannon 18\nSimmons 18\nSorry 18\nSudan 18\nSuez 18\nTaichung 18\nTeddy 18\nTele-Communications 18\nThings 18\nThought 18\nTim 18\nToo 18\nToshiba 18\nUSTC 18\nValue 18\nView 18\nVincent 18\nW 18\nWathen 18\nYung 18\naccordingly 18\naccumulation 18\naccurately 18\nacknowledge 18\nadjusting 
18\nadvising 18\naftershocks 18\nalike 18\namongst 18\nappointments 18\narriving 18\nasbestos 18\nassessing 18\nassure 18\nattach 18\navailability 18\navoiding 18\nbalanced 18\nbeliefs 18\nbleak 18\nboil 18\nbone 18\nbooming 18\nbricks 18\nbucks 18\nbugs 18\nc 18\ncelebrating 18\ncharitable 18\nchase 18\nchunk 18\ncitizenship 18\nclassroom 18\ncleaner 18\ncolony 18\ncolumnist 18\ncomeback 18\ncompact 18\nconcede 18\nconfidential 18\nconsolidating 18\nconsume 18\ncontended 18\ncoordinate 18\ncoordination 18\ncopyright 18\ncosmetic 18\ncreditor 18\ncypresses 18\ndamaging 18\ndealership 18\ndeductions 18\ndefaults 18\ndefine 18\ndelicate 18\ndepreciation 18\nderivative 18\nderived 18\ndescribing 18\ndestructive 18\ndiabetics 18\ndiamonds 18\ndies 18\ndisability 18\ndisobedience 18\ndisplayed 18\ndistant 18\ndynasty 18\nears 18\nembarrassed 18\nembarrassing 18\nenables 18\nencountered 18\nendorsed 18\nendorsement 18\nensuring 18\nexceptions 18\nexercises 18\nexit 18\nfame 18\nfeng 18\nfranchisees 18\nfrankly 18\nfrustration 18\nfuzzy 18\ngenerating 18\ngeographical 18\nglorious 18\ngoverned 18\ngratitude 18\ngreenhouse 18\nharmonious 18\nhats 18\nheight 18\nhonestly 18\nhonorable 18\nhorror 18\nhsiung 18\nhttp://mitbbs.com 18\nhunt 18\nimagined 18\ninches 18\nincurred 18\nindicator 18\nindirectly 18\ninstallations 18\ninsulin 18\nintervene 18\njumping 18\nkidnapping 18\nknock 18\nleather 18\nlecture 18\nlibrary 18\nlocally 18\nmad 18\nmatched 18\nmedicines 18\nmid 18\nmild 18\nminded 18\nmorale 18\nmotivated 18\nmunicipality 18\nmysterious 18\nnarrowed 18\nnasty 18\nnotably 18\nobjections 18\noptimism 18\nounces 18\noutbreak 18\noverboard 18\npen 18\nperceived 18\npersuaded 18\npessimistic 18\npetrochemical 18\nphysician 18\npin 18\npipe 18\npoetry 18\npostal 18\npracticing 18\nprayer 18\npredecessor 18\nprisoners 18\npro-choice 18\npumped 18\npumps 18\npunish 18\nqualities 18\nraces 18\nre 18\nrecommends 18\nrecreational 18\nregulate 18\nrepeal 18\nresisted 
18\nrestated 18\nreviewed 18\nriders 18\nroutinely 18\nrunway 18\nsafeguard 18\nsamples 18\nscare 18\nscary 18\nscenic 18\nscientist 18\nseminar 18\nsentences 18\nshipyard 18\nshocking 18\nsidelines 18\nsilence 18\nslashed 18\nsmallest 18\nsolving 18\nsour 18\nspark 18\nspecify 18\nspotlight 18\nstalled 18\nsteal 18\nstemming 18\nstems 18\nstyles 18\nsubstitute 18\nsupermarket 18\nswing 18\ntables 18\ntechnically 18\nteen 18\ntentative 18\nterritorial 18\ntextiles 18\ntongue 18\ntreasures 18\nundertaken 18\nunfairly 18\nupscale 18\nurgency 18\nvessels 18\nweaknesses 18\nwebsites 18\nx 18\n------ 17\n1.50 17\n104 17\n1975 17\n2.25 17\n288 17\n4. 17\n62.5 17\n7.2 17\n7.7 17\n747 17\n8.45 17\n80,000 17\n800,000 17\n9.5 17\nAbd 17\nAcademia 17\nAllah 17\nAntar 17\nArmstrong 17\nAtayal 17\nAutonomous 17\nBangladesh 17\nBeing 17\nBelfast 17\nBelgium 17\nBrazilian 17\nCBOE 17\nCao 17\nCarnival 17\nChengchi 17\nChengdu 17\nChiefs 17\nChou 17\nCineplex 17\nClair 17\nCommissioner 17\nCommodore 17\nCrystal 17\nDaly 17\nDeLay 17\nDelicious 17\nDella 17\nDeutsche 17\nDirect 17\nEDT 17\nEgg 17\nEight 17\nFAW 17\nFUNDS 17\nFarm 17\nFemina 17\nFinnish 17\nFlorio 17\nForces 17\nFrancis 17\nFrederick 17\nFunding 17\nGive 17\nGordon 17\nGraphics 17\nGreater 17\nGreenwich 17\nGreg 17\nGuinea 17\nGuo 17\nGuy 17\nHalf 17\nHamilton 17\nHeritage 17\nHighness 17\nHighway 17\nHispanic 17\nHoliday 17\nHsing 17\nIN 17\nIndividual 17\nInvestigators 17\nInvestments 17\nJohns 17\nJustin 17\nKasparov 17\nKatrina 17\nKhmer 17\nKy. 
17\nLiberation 17\nLuis 17\nMarvin 17\nMather 17\nMax 17\nMedicine 17\nMemorial 17\nMoslem 17\nNewell 17\nNonetheless 17\nNorman 17\nOntario 17\nPLO 17\nPalm 17\nPioneer 17\nPortugal 17\nPresidential 17\nProsecutors 17\nRTC 17\nRailway 17\nRe 17\nRochester 17\nRosen 17\nRoth 17\nRubicam 17\nSDI 17\nSaks 17\nSao 17\nSheng 17\nShevardnadze 17\nShimon 17\nSierra 17\nSignal 17\nSimpson 17\nSinica 17\nSociete 17\nSomalia 17\nSpeaker 17\nSydney 17\nTan 17\nThao 17\nThink 17\nTokyu 17\nTop 17\nTwenty 17\nVoice 17\nVolokh 17\nWTO 17\nWaertsilae 17\nWayne 17\nWushe 17\nYun 17\nZhuhai 17\nZurich 17\nabsorbed 17\naccepts 17\nadjusters 17\naffiliated 17\nafterwards 17\naided 17\nanthrax 17\napartheid 17\nappealing 17\nappropriately 17\nartifacts 17\nassembled 17\nassumes 17\nautonomous 17\naviation 17\nawaited 17\nawaiting 17\nbat 17\nbeloved 17\nbeta 17\nbite 17\nblank 17\nbless 17\nblown 17\nbothered 17\nbrains 17\nbreathing 17\nbrilliant 17\nbrush 17\nbull 17\ncasting 17\ncertification 17\nclosest 17\ncocoa 17\ncoin 17\ncolorful 17\ncomparatively 17\ncomplains 17\ncomply 17\nconcentrating 17\nconcert 17\nconferences 17\nconfronted 17\nconsolidate 17\nconstructive 17\ncontested 17\ncooked 17\ncorps 17\ncraft 17\ncredentials 17\ncrippled 17\ncruel 17\ncurious 17\ncycles 17\ndamp 17\ndefending 17\ndeliberate 17\ndeteriorating 17\ndiet 17\ndiscarded 17\ndish 17\ndiverse 17\ndivorce 17\ndomain 17\ndragged 17\nencouragement 17\nendure 17\nenthusiastically 17\nepicenter 17\nepidemic 17\nexchanging 17\nexclusion 17\nexile 17\nfederally 17\nfieldwork 17\nflagship 17\nflawed 17\nflooding 17\nfloors 17\nfolded 17\nfollowers 17\nforfeiture 17\nfunctional 17\ngauge 17\ngay 17\ngenerous 17\ngently 17\nglobalization 17\ngovern 17\ngrandson 17\ngrim 17\nhappily 17\nhardship 17\nharmful 17\nheadlines 17\nhiding 17\nholes 17\nhomeowners 17\nhook 17\nhorizon 17\nhosting 17\nhosts 17\nhua 17\nhull 17\nhurting 17\nidentical 17\nidentification 17\nimagination 17\ninaugural 17\ninclines 
17\nincorporated 17\nindirect 17\ninterference 17\nintimate 17\nlabels 17\nlaughing 17\nleftist 17\nlegitimacy 17\nlesser 17\nlifestyle 17\nlowering 17\nmagnetic 17\nmakeup 17\nmate 17\nmemorial 17\nmid-1980s 17\nmills 17\nmode 17\nmonitors 17\nnationality 17\nnerve 17\nniche 17\nnickname 17\noffsetting 17\noh 17\noral 17\norgans 17\norientation 17\nowes 17\noxygen 17\npeoples 17\npersistent 17\npetition 17\npile 17\nplanted 17\npowered 17\npreparations 17\nprocedural 17\nproductive 17\nprolonged 17\nprotective 17\nqualifications 17\nr 17\nrabbit 17\nraid 17\nrealm 17\nrecounts 17\nrecreation 17\nredeem 17\nregulator 17\nrelation 17\nrelaxing 17\nremembered 17\nreminded 17\nreminder 17\nresolutions 17\nresolving 17\nrespects 17\nrestraint 17\nretreat 17\nsalesman 17\nscenery 17\nsearched 17\nsearches 17\nsends 17\nseniority 17\nserials 17\nsetbacks 17\nshirts 17\nshook 17\nspelled 17\nsponsor 17\nstabilized 17\nsticks 17\nstripped 17\nstructured 17\nstruggles 17\nstunning 17\nsucks 17\nsuited 17\nsums 17\nsupposedly 17\nthieves 17\nthorough 17\nthreatens 17\ntips 17\ntownships 17\ntrap 17\ntreasury 17\ntrusted 17\nunacceptable 17\nundergo 17\nunstable 17\nutterly 17\nvacuum 17\nwiped 17\nwires 17\nwithdrawn 17\nworm 17\nyearly 17\nying 17\n** 16\n145 16\n1930s 16\n1966 16\n260 16\n35,000 16\n4.9 16\n6.6 16\n60,000 16\n83 16\nAaron 16\nAbdul 16\nAd 16\nAgnelli 16\nAlfred 16\nAlmost 16\nAlzheimer 16\nAmsterdam 16\nAndy 16\nAnonymous 16\nAoun 16\nApplause 16\nArdery 16\nAshland 16\nAzoff 16\nBanc 16\nBeatrice 16\nBeirut 16\nBoesky 16\nBoris 16\nBreck 16\nBuying 16\nChia 16\nCitizens 16\nCompetition 16\nComprehensive 16\nCourter 16\nD.T. 
16\nDVD 16\nDataproducts 16\nDenis 16\nDole 16\nDorrance 16\nDragon 16\nEconomics 16\nEdison 16\nEnterprise 16\nEspecially 16\nEuropeans 16\nExcluding 16\nExploration 16\nFederated 16\nFleet 16\nFred 16\nFresenius 16\nGTE 16\nGalapagos 16\nGillette 16\nGoldberg 16\nGraduate 16\nGrenfell 16\nHarold 16\nHelmsley 16\nHsiao 16\nIncome 16\nInfiniti 16\nInside 16\nInvestor 16\nIron 16\nJason 16\nJia 16\nJon 16\nKai 16\nKaine 16\nL 16\nLazio 16\nLevy 16\nLiberal 16\nLion 16\nLung 16\nManufacturing 16\nMarreiros 16\nMason 16\nMellon 16\nMiniScribe 16\nMusharraf 16\nNEW 16\nON 16\nOzzy 16\nPBS 16\nParker 16\nPat 16\nPeck 16\nPopular 16\nProduction 16\nProvigo 16\nRead 16\nRepresentative 16\nResearchers 16\nRussell 16\nRuth 16\nSchools 16\nShanxi 16\nShare 16\nShen 16\nSiemens 16\nStamford 16\nStop 16\nStraszheim 16\nSu 16\nSuisse 16\nTransport 16\nTucson 16\nTypical 16\nWeek 16\nWellington 16\nWood 16\nYahoo 16\nZ 16\naffection 16\nambition 16\nample 16\napiece 16\nappliances 16\naromatherapy 16\nassassinated 16\nassisting 16\nattain 16\nbackward 16\nballs 16\nbidder 16\nbinding 16\nboiler 16\nbold 16\nbooked 16\nborrowings 16\nbrake 16\nbrave 16\nbriefing 16\nburdens 16\nbutter 16\ncancelled 16\ncandy 16\ncapitalization 16\ncares 16\ncatching 16\ncenturies 16\ncheerleaders 16\ncheng 16\nchores 16\nclause 16\ncolleague 16\ncolored 16\nconfirming 16\nconfiscated 16\nconsumed 16\ncontingent 16\nconvey 16\nconvictions 16\ncorrespond 16\ncounseling 16\ncourtroom 16\ncrews 16\ndeeds 16\ndeepening 16\ndelivering 16\ndenying 16\nderegulation 16\ndestroying 16\ndeter 16\ndiminished 16\ndire 16\ndistinguished 16\ndisturbing 16\ndivide 16\ndominance 16\ndried 16\ndynamic 16\nembassies 16\nembraced 16\nemotions 16\nenabling 16\nencounter 16\nequaling 16\nequipped 16\neve 16\neventual 16\nexcluded 16\nexplanations 16\nfairness 16\nfarming 16\nferry 16\nfilters 16\nforestry 16\nfortunes 16\nforty 16\nfraudulent 16\nfreedoms 16\nfrightening 16\nfrontier 16\ngang 16\ngenetically 
16\nglasnost 16\ngrandchildren 16\ngravel 16\nguerillas 16\nheroes 16\nhighlight 16\nimmigration 16\nimminent 16\nimperial 16\nindexes 16\nindividually 16\nindustrialized 16\ninferior 16\ningredients 16\ninstructed 16\ninviting 16\nisolation 16\njustification 16\nkidnapped 16\nlabel 16\nlabeled 16\nlasts 16\nleaked 16\nlegacy 16\nlighting 16\nlistened 16\nlivelihood 16\nloaded 16\nlogging 16\nlongstanding 16\nloving 16\nlows 16\nmarch 16\nmarketers 16\nmarking 16\nmechanisms 16\nmid-October 16\nmineral 16\nmonkeys 16\nmoreover 16\nmounted 16\nmunicipalities 16\nnarrowly 16\nnorthwest 16\nobscure 16\nobtaining 16\nopposes 16\nours 16\noversees 16\noverthrow 16\nparks 16\npedestrian 16\npeninsula 16\npermits 16\nphysics 16\npigs 16\nplatinum 16\nprinter 16\nproceeding 16\nprofoundly 16\nproposes 16\nproteins 16\npublishes 16\nquotas 16\nracist 16\nraider 16\nrays 16\nreads 16\nrecommending 16\nreconnaissance 16\nrecruited 16\nrefund 16\nrefuses 16\nrelies 16\nrepercussions 16\nreplies 16\nrepresentation 16\nrequesting 16\nrespective 16\nrestoring 16\nretains 16\nrevenge 16\nrevival 16\nrigid 16\nrobbed 16\nscarce 16\nschedules 16\nseas 16\nshell 16\nshots 16\nsincere 16\nslate 16\nsleeping 16\nslightest 16\nsnack 16\nsoy 16\nspeculate 16\nsponsors 16\nstating 16\nstatute 16\nstemmed 16\nstirred 16\nstockholders 16\nstreamlining 16\nstroke 16\nsuccession 16\nsufficiently 16\nswitches 16\nsymbols 16\ntail 16\ntariffs 16\nteaches 16\ntenure 16\ntestify 16\ntheirs 16\nthroat 16\ntopped 16\ntouches 16\ntoxin 16\ntrains 16\ntraitors 16\ntransfers 16\ntumble 16\nugly 16\nundermine 16\nunderwriting 16\nunspecified 16\nunwilling 16\nupgrading 16\nupside 16\nusage 16\nvan 16\nvariations 16\nvarieties 16\nvegetable 16\nvillages 16\nvisual 16\nwarehouse 16\nwarmth 16\nweaken 16\nweekends 16\nwidening 16\nwildlife 16\nworthy 16\nwound 16\nwritings 16\n!!!! 
15\n'S 15\n120,000 15\n135 15\n1500 15\n230 15\n28th 15\n5.2 15\n5.8 15\n550 15\n7,000 15\n7.1 15\n7.50 15\n7.6 15\n7.875 15\n70,000 15\n8.55 15\n82 15\n87.5 15\nAZT 15\nAction 15\nActivity 15\nAdmiral 15\nAi 15\nAnne 15\nAppeals 15\nArnold 15\nAuthorities 15\nAviation 15\nBarre 15\nBogart 15\nBowl 15\nBradley 15\nCasualty 15\nCertainly 15\nCheck 15\nChevrolet 15\nCommodities 15\nCommunication 15\nConiston 15\nConsumers 15\nCooperation 15\nCourts 15\nCranston 15\nCreek 15\nCurrent 15\nDale 15\nDarfur 15\nDear 15\nDentsu 15\nDunn 15\nE 15\nEddie 15\nFCC 15\nFairfax 15\nFitzwater 15\nFortunately 15\nFriendly 15\nFuture 15\nGAF 15\nGate 15\nGreece 15\nGreek 15\nHas 15\nHey 15\nHorse 15\nHuai 15\nHung 15\nIMA 15\nIndonesian 15\nIronically 15\nIrving 15\nIssues 15\nJian 15\nJustices 15\nKaren 15\nKraft 15\nLaboratory 15\nLibyans 15\nLipper 15\nLiving 15\nMancuso 15\nManuel 15\nMemories 15\nMercedes 15\nMichelle 15\nMusic 15\nNancy 15\nNaval 15\nNazi 15\nNeal 15\nNenad 15\nNewark 15\nNing 15\nNorton 15\nOliver 15\nOperation 15\nOperations 15\nOxford 15\nParisian 15\nPaulo 15\nPosada 15\nPrior 15\nQichen 15\nRU 15\nResistance 15\nRon 15\nSanders 15\nSarkozy 15\nSemiconductor 15\nSerbs 15\nSkase 15\nSuddenly 15\nSuper 15\nTainan 15\nTallahassee 15\nTampa 15\nTogether 15\nTransmission 15\nUkraine 15\nUnilever 15\nUnocal 15\nWendy 15\nWhitten 15\nYunnan 15\naccessible 15\naccustomed 15\nadapted 15\naerobics 15\nafterward 15\naggression 15\naiming 15\nairplanes 15\nantibiotics 15\napparatus 15\narchitect 15\naroused 15\nassisted 15\nassumptions 15\nattendance 15\nattractions 15\nautonomy 15\nbackgrounds 15\nbacklog 15\nbackup 15\nbanner 15\nbasement 15\nbedroom 15\nbelt 15\nbetting 15\nbiting 15\nblames 15\nblessings 15\nbombed 15\nboundaries 15\nbreakers 15\nbreakup 15\nbrick 15\nbrowser 15\nbullets 15\nburns 15\ncancers 15\ncapitalists 15\ncareers 15\ncaring 15\ncelebrity 15\nchampionships 15\nchasing 15\nchi 15\nchromosome 15\nchronic 15\ncivilized 15\nclarify 
15\nclassical 15\ncleaned 15\nclues 15\ncollaboration 15\ncolon 15\ncommitting 15\ncomparisons 15\ncompelling 15\nconcentration 15\nconcluding 15\nconnect 15\nconvincing 15\ncooling 15\ncorresponding 15\ncrunch 15\ncult 15\ndamn 15\ndesires 15\ndevote 15\ndiscretion 15\ndishes 15\ndisorder 15\ndistinct 15\ndraws 15\ndrinks 15\ndrunk 15\nearns 15\neffectiveness 15\nefficiently 15\nemission 15\nenergetic 15\nengaging 15\nentrenched 15\nescalation 15\nevaluating 15\nexams 15\nexceeds 15\nexcuses 15\nexhausted 15\nfaults 15\nfearing 15\nfelony 15\nfemales 15\nfence 15\nfishermen 15\nflags 15\nflash 15\nfloat 15\nfluctuations 15\nformulated 15\nfoundations 15\nfraction 15\nfranc 15\nfreezing 15\nfuels 15\ngangs 15\ngrabbed 15\ngrandfather 15\ngunfire 15\nhamster 15\nheaven 15\nhurry 15\nimposing 15\nincomes 15\nincompetence 15\ninform 15\ninjection 15\ninjuring 15\ninjustice 15\ninstantly 15\ninstitutes 15\ninstruction 15\nit's 15\njuice 15\njustices 15\nkeen 15\nknowledgeable 15\nladder 15\nlandslide 15\nleash 15\nlicensed 15\nlifelong 15\nlimiting 15\nlyrics 15\nmall 15\nmarched 15\nmeasuring 15\nmedication 15\nmentally 15\nmenu 15\nmerit 15\nmillennium 15\nmothers 15\nnavy 15\nneutral 15\nnominal 15\nnotable 15\nnotorious 15\nnovels 15\noccupying 15\noffs 15\noffspring 15\noutcry 15\npatrols 15\nperestroika 15\npermanently 15\nphysically 15\npicks 15\npoet 15\npopulated 15\npositively 15\nposters 15\npracticed 15\npreferences 15\nprepares 15\nprisoner 15\nprivileges 15\nprobability 15\nprofessors 15\nprofound 15\nprompting 15\nprosperous 15\nproxy 15\nprudent 15\npunched 15\nracing 15\nreceivables 15\nregrets 15\nrelate 15\nreleases 15\nreluctance 15\nremedy 15\nrenew 15\nrepression 15\nreproductions 15\nreserved 15\nrespectable 15\nrestriction 15\nreversed 15\nroof 15\nrubles 15\nrumor 15\nsalespeople 15\nscrutiny 15\nsecretly 15\nseldom 15\nshattered 15\nsin 15\nsituated 15\nsixty 15\nsizable 15\nskilled 15\nslated 15\nslipping 15\nslopes 15\nsoap 15\nsolo 
15\nsometime 15\nsouls 15\nspectacular 15\nspeculated 15\nspotted 15\nspreads 15\nspun 15\nstandpoint 15\nstopping 15\nstrapped 15\nstresses 15\nstricken 15\nstrips 15\nsubjected 15\nsuitor 15\nsupervisor 15\nsurfaced 15\nsuspicion 15\nsustain 15\nsweeps 15\nsympathy 15\nsynthetic 15\ntactic 15\ntaxi 15\ntelevised 15\ntendered 15\ntestified 15\ntide 15\ntighten 15\ntighter 15\ntightly 15\ntitled 15\ntoys 15\ntransformed 15\ntransparent 15\ntruce 15\nuncomfortable 15\nundoubtedly 15\nunemployed 15\nuniverse 15\nvendors 15\nvicious 15\nviper 15\nvirtual 15\nwarriors 15\nwarship 15\nwheels 15\nwidow 15\nwonders 15\nwore 15\nworkstations 15\nwrap 15\nyoungest 15\nyours 15\n...... 14\n0.5 14\n0.7 14\n1.05 14\n1.125 14\n105 14\n18,000 14\n2.85 14\n220 14\n25th 14\n3889 14\n4.1 14\n4.3 14\n5.1 14\n5.4 14\n5000 14\n50th 14\n6.79 14\n7.9 14\n7.90 14\n713-646-3393 14\n713-853-3989 14\n9.6 14\n??? 14\nALL 14\nAbsolutely 14\nAcceptance 14\nAccounting 14\nAkayev 14\nAmount 14\nAndersson 14\nApparently 14\nArabian 14\nAsians 14\nAside 14\nAssistant 14\nBacker 14\nBall 14\nBanxquote 14\nBirmingham 14\nBonn 14\nBrewing 14\nCabrera 14\nCamp 14\nCampbell 14\nChandler 14\nCharleston 14\nChuang 14\nColo. 14\nColumbus 14\nCommunists 14\nComputers 14\nConcerned 14\nConn 14\nConsolidated 14\nConvention 14\nDO 14\nDate 14\nDays 14\nDemand 14\nDeposit 14\nDiago 14\nEastman 14\nEllis 14\nEmbassy 14\nExcellent 14\nExecutives 14\nFASB 14\nFTC 14\nFoley 14\nFreeman 14\nGang 14\nGeigy 14\nGiant 14\nGillett 14\nGuzman 14\nHariri 14\nHell 14\nHomeFed 14\nHosni 14\nIlan 14\nInco 14\nIndianapolis 14\nInner 14\nJeep 14\nJing 14\nJintao 14\nJudy 14\nKitchen 14\nKravis 14\nLaff 14\nLeft 14\nLeong 14\nLieutenant 14\nMONEY 14\nMTM 14\nMade 14\nMadison 14\nManagers 14\nMarlin 14\nMarsh 14\nMcDuffie 14\nMedicare 14\nMeeting 14\nMid 14\nMuhammad 14\nMutual 14\nMyron 14\nN.V. 
14\nNGOs 14\nNanchang 14\nNathan 14\nNevada 14\nNick 14\nNippon 14\nOct 14\nOkay 14\nOmar 14\nOwen 14\nPWA 14\nPaiwan 14\nParks 14\nPepsi 14\nPinnacle 14\nPlace 14\nPortland 14\nPrimary 14\nProgressive 14\nRealty 14\nRenault 14\nRoderick 14\nSatellite 14\nSimply 14\nSon 14\nSpecialized 14\nSugarman 14\nTHIS 14\nTVS 14\nTVs 14\nTell 14\nThinking 14\nTopic 14\nUPGA 14\nUncle 14\nV 14\nVa 14\nVancouver 14\nVenice 14\nVienna 14\nWhereas 14\nWhittle 14\nWinter 14\nWorks 14\nXiang 14\nXingjian 14\nYamaichi 14\nYouth 14\nabolish 14\naccuse 14\nadministrator 14\nadvancing 14\naffidavit 14\nai 14\nalarm 14\nalbums 14\nalleges 14\nallowance 14\naltered 14\namendments 14\nangered 14\nanonymous 14\napologize 14\nappetite 14\narbitration 14\narrange 14\narrives 14\narrogance 14\narrogant 14\nattachment 14\nautumn 14\nawesome 14\nbattered 14\nbell 14\nbelly 14\nbelonged 14\nberths 14\nbeside 14\nbiochip 14\nbiological 14\nbitterly 14\nblaming 14\nblessing 14\nblockade 14\nboring 14\nbourbon 14\nbowling 14\nbuck 14\nbuilds 14\nbutton 14\ncafeteria 14\ncampaigning 14\ncanal 14\ncarol.st.clair@enron.com 14\ncasinos 14\nceasefire 14\ncelebrations 14\ncentennial 14\ncentered 14\ncertified 14\ncharms 14\ncheese 14\ncirculating 14\ncloses 14\ncodes 14\ncommissioned 14\ncompanion 14\ncompassion 14\ncompetent 14\ncondemn 14\nconfusing 14\nconsultations 14\ncontainers 14\ncontingency 14\ncorrection 14\ncountless 14\ncurrents 14\ncurriculum 14\ndam 14\ndatabase 14\ndeduction 14\ndeputies 14\ndeserve 14\ndesperately 14\ndestination 14\ndestiny 14\ndetectors 14\ndiagnostic 14\ndisagreement 14\ndisks 14\ndisposal 14\ndistinction 14\ndistributors 14\ndoomed 14\ndumb 14\nelegant 14\neleven 14\nelimination 14\nembrace 14\nenters 14\nepic 14\nexamined 14\nexamining 14\nexercisable 14\nexpose 14\nextends 14\nfacilitate 14\nfavors 14\nfeat 14\nfestivals 14\nfined 14\nflooded 14\nflower 14\nflu 14\nforemost 14\nfrequency 14\nfur 14\ngentle 14\ngeographic 14\ngranting 14\ngum 14\nhailed 14\nhandles 
14\nhaul 14\nheel 14\nhostages 14\nhosted 14\nhub 14\nhurdle 14\nhurdles 14\nimpacts 14\nimpending 14\nimpetus 14\ninability 14\ninappropriate 14\ninherited 14\ninsiders 14\ninspect 14\ninstability 14\ninterbank 14\nintern 14\ninterpret 14\ninterpreted 14\ninterstate 14\ninvestigated 14\nissuers 14\nkilometer 14\nkm 14\nlands 14\nlengthy 14\nlent 14\nlightly 14\nlion 14\nliquefied 14\nmandate 14\nmaneuver 14\nmankind 14\nmarketed 14\nmaturities 14\nmeasurements 14\nmedals 14\nmetropolitan 14\nmigrant 14\nming 14\nmob 14\nmonitored 14\nnaked 14\nneeding 14\nnewer 14\nnominated 14\nnon-violent 14\nnoncallable 14\nobstacle 14\noccupies 14\noccurring 14\nomitted 14\norange 14\norbit 14\norganic 14\noust 14\noutlays 14\noutskirts 14\noutspoken 14\noverride 14\noverseeing 14\npanels 14\npermitting 14\npig 14\nplanting 14\nplight 14\nportions 14\nportrayed 14\npositioned 14\npossession 14\nposter 14\nprecautions 14\nprediction 14\npreliminarily 14\nprelude 14\nprestige 14\nprevailed 14\nprince 14\nprincipals 14\nprivacy 14\nprivilege 14\nprobable 14\nproclaimed 14\nproving 14\npublicized 14\npuppies 14\nraked 14\nreckless 14\nrefuge 14\nrelieve 14\nremind 14\nrented 14\nreopened 14\nresignations 14\nresponsive 14\nrestoration 14\nresumption 14\nretreated 14\nridden 14\nrivers 14\nroadblock 14\nroller 14\nrounds 14\nroyal 14\nrude 14\nscrambling 14\nscreaming 14\nscreening 14\nscreens 14\nsculpture 14\nshaken 14\nshaky 14\nshameful 14\nshifted 14\nshirt 14\nsilk 14\nslash 14\nsoda 14\nstabbed 14\nstaggering 14\nstays 14\nsteelmaker 14\nstimulate 14\nstir 14\nstrengths 14\nstunned 14\nsubstances 14\nsuccessive 14\nsuperiority 14\nsupervise 14\nsupplement 14\nsurviving 14\nsweep 14\nsymbolic 14\ntactical 14\ntales 14\ntaped 14\ntapes 14\ntemple 14\ntendency 14\nthoroughly 14\nthrows 14\ntolerance 14\ntopple 14\ntracked 14\ntransmitted 14\ntreasure 14\ntremors 14\ntroubling 14\ntwelve 14\nunfavorable 14\nunfortunate 14\nuniforms 14\nunreasonable 14\nuranium 14\nutilize 
14\nversus 14\nveterans 14\nvis 14\nvitality 14\nwashing 14\nwholly 14\nwildly 14\nwooden 14\nyards 14\nzip 14\n0.9 13\n1.15 13\n11.5 13\n119 13\n13.1 13\n13.4 13\n13.5 13\n175 13\n1960 13\n27th 13\n29th 13\n3.2 13\n3000 13\n301 13\n4.25 13\n4th 13\n6.2 13\n6.7 13\n600,000 13\n7.10 13\n8.3 13\n880 13\n9:30 13\nAOL 13\nAerospace 13\nAge 13\nAgnos 13\nAlways 13\nAmendment 13\nAngels 13\nAnglo 13\nAnkara 13\nAnn 13\nAnnualized 13\nArchibald 13\nAward 13\nBlount 13\nBlumenfeld 13\nBofors 13\nBrotherhood 13\nCPA 13\nCambridge 13\nCapel 13\nCaptain 13\nCarbide 13\nCase 13\nCentury 13\nColeman 13\nContainers 13\nCountry 13\nCourtaulds 13\nCritics 13\nCruz 13\nDai 13\nDeal 13\nDell 13\nDomestic 13\nDrilling 13\nEPO 13\nEdelman 13\nEdwin 13\nEmployees 13\nErekat 13\nExterior 13\nFather 13\nFisher 13\nFortune 13\nFoster 13\nFournier 13\nFreeway 13\nFunds 13\nGenentech 13\nGiovanni 13\nGlenn 13\nGlobe 13\nGray 13\nGregg 13\nGuaranteed 13\nH 13\nHambrecht 13\nHeart 13\nHeavy 13\nHebron 13\nHiroshima 13\nHitler 13\nHolmes 13\nHumana 13\nHundreds 13\nIbn 13\nInstitutes 13\nInterpublic 13\nJackie 13\nJesus 13\nJinchuan 13\nJob 13\nK. 13\nKate 13\nLIBOR 13\nLaboratories 13\nLeonard 13\nLiao 13\nLoral 13\nMacmillan 13\nMainland 13\nManager 13\nMazen 13\nMd. 13\nMe 13\nMember 13\nMetro 13\nMicrosystems 13\nMohamed 13\nMurray 13\nNATIONAL 13\nNIH 13\nNT 13\nNasser 13\nNeil 13\nNice 13\nNote 13\nO'Kicki 13\nO. 13\nOpposition 13\nOrkem 13\nOutside 13\nPact 13\nPar 13\nPenn 13\nPenney 13\nPetrie 13\nPosting 13\nProduct 13\nQuantum 13\nQuestar 13\nRamada 13\nRevolutionary 13\nRobin 13\nRodgers 13\nRubens 13\nRyder 13\nSammy 13\nSchwarz 13\nScientific 13\nSen 13\nShack 13\nShares 13\nSosa 13\nSoul 13\nSpencer 13\nStadium 13\nStore 13\nStrategic 13\nStudents 13\nStudy 13\nSupervision 13\nSutton 13\nTIBE 13\nTRUST 13\nTaking 13\nTao 13\nTexans 13\nThem 13\nTomorrow 13\nTraditional 13\nTrans 13\nTrue 13\nTumen 13\nU.S.S.R. 
13\nUnification 13\nWHO 13\nWalt 13\nWeir 13\nWeyerhaeuser 13\nWindsor 13\nWord 13\nX 13\nXi'an 13\nXinjiang 13\nYellow 13\nYields 13\nZimbabwe 13\nabruptly 13\naccommodation 13\naccrued 13\naddressing 13\nadjacent 13\nadvocacy 13\nalumni 13\nannoying 13\nappellate 13\nappreciated 13\narchitects 13\nassassinations 13\nattained 13\nattracts 13\nbackers 13\nbathroom 13\nbenign 13\nbipartisan 13\nblanket 13\nblueprint 13\nbones 13\nborrowers 13\nbronze 13\nbrutal 13\nbureaucratic 13\ncables 13\ncaffeine 13\ncake 13\ncapitalized 13\ncereal 13\nching 13\ncirculated 13\ncites 13\ncivic 13\nclarified 13\nclerk 13\nclosure 13\ncloth 13\nclout 13\ncommunicate 13\ncommute 13\ncompetitions 13\ncomprehensively 13\nconsequently 13\nconsist 13\nconstituents 13\ncontinent 13\nconverting 13\ncooperatives 13\ncounterparty 13\ncoupled 13\ncoupons 13\ncows 13\ncrackdown 13\ncrate 13\ncredited 13\ndebacle 13\ndebates 13\ndeceased 13\ndefinite 13\ndelete 13\ndemonstrates 13\ndenounced 13\ndepository 13\ndesks 13\ndesktop 13\ndiamond 13\ndiligently 13\ndinosaur 13\ndioxide 13\ndismal 13\ndisposable 13\ndisruption 13\ndissatisfied 13\ndistinctive 13\ndistorted 13\ndiverted 13\ndressing 13\neditions 13\nembedded 13\nemotion 13\nendured 13\nentertaining 13\neternal 13\nexercised 13\nexhaust 13\nexplicitly 13\nexploitation 13\nexploited 13\nexposing 13\nfaction 13\nfaded 13\nfantasy 13\nfathers 13\nfeasible 13\nfeathers 13\nfetch 13\nflaws 13\nflies 13\nflip 13\nformulate 13\nfoster 13\nfoul 13\nfragrance 13\ngamble 13\ngenerals 13\ngeneric 13\ngenius 13\nghosts 13\ngovernors 13\ngrass 13\ngridlock 13\nguerrillas 13\nhampered 13\nhaven 13\nheavier 13\nhedging 13\nheightened 13\nheights 13\nhostage 13\nimplicit 13\nimproper 13\nimproperly 13\nimproves 13\ninauguration 13\ninclusion 13\nincorrect 13\nincredibly 13\nindexing 13\nindispensable 13\ninsolvent 13\ninspectors 13\ninsult 13\nintake 13\nintentionally 13\nintermediate 13\ninvade 13\nirrelevant 13\nkeyboard 13\nknee 13\nknees 
13\nknight 13\nkronor 13\nlandmark 13\nlaundering 13\nleap 13\nleased 13\nlegend 13\nleisure 13\nlets 13\nlicensing 13\nlid 13\nlikewise 13\nlit 13\nlocate 13\nlogical 13\nmanipulation 13\nmarry 13\nmeaningful 13\nmentioning 13\nmiddlemen 13\nmitbbs.com 13\nnegotiable 13\nnomination 13\nnowadays 13\nnylon 13\nobliged 13\nobserves 13\noccurrence 13\nopenness 13\noptic 13\norbital 13\norderly 13\noutright 13\noutsiders 13\noversight 13\npainter 13\npeaked 13\nperjury 13\npesticides 13\npharmaceuticals 13\nphased 13\nphotographs 13\npickup 13\npipes 13\nplagued 13\nplaintiff 13\npoisoning 13\npollen 13\nporn 13\npose 13\npotato 13\npotatoes 13\npour 13\npoured 13\nprecedent 13\nprecision 13\npressured 13\npriest 13\nprivileged 13\nprizes 13\npros 13\nprosecutions 13\npudding 13\npumping 13\npunished 13\npursued 13\nquota 13\nquoting 13\nrallies 13\nrats 13\nre-election 13\nreadily 13\nrealities 13\nrebel 13\nregulated 13\nrein 13\nreinforcement 13\nremembers 13\nrenewal 13\nrepaid 13\nrepaired 13\nrepayment 13\nreputable 13\nrestrain 13\nretaliation 13\nrioting 13\nriyals 13\nsalesmen 13\nsatisfying 13\nscenarios 13\nscrap 13\nselecting 13\nsentiments 13\nsheer 13\nshifts 13\nshines 13\nshortages 13\nshorter 13\nshoulders 13\nshouting 13\nshrinking 13\nsided 13\nsiege 13\nsmuggling 13\nsnapped 13\nsocieties 13\nsociology 13\nsolar 13\nsolicitation 13\nsoutheast 13\nspecializing 13\nspeeding 13\nspinoff 13\nspouses 13\nsubordinate 13\nsuccesses 13\nsulfur 13\nsupercomputer 13\nsupporter 13\nsurpass 13\nsurpluses 13\nswear 13\nsweetened 13\nswiftly 13\nsyndicates 13\nsystematic 13\nta 13\ntelephones 13\ntemptation 13\nterminals 13\nterminated 13\ntier 13\nting 13\ntoad 13\ntorture 13\ntouching 13\ntranslation 13\ntribunal 13\ntricky 13\ntripled 13\ntropical 13\nturf 13\ntv 13\ntwist 13\nunauthorized 13\nunderstandable 13\nuneven 13\nunite 13\nunlimited 13\nunloading 13\nupgraded 13\nutmost 13\nvalley 13\nvans 13\nverge 13\nvideotape 13\nviewing 13\nviolates 13\nvolcano 
13\nvoter 13\nwears 13\nweigh 13\nweird 13\nwipe 13\nwithdrawals 13\nwoo 13\nworthwhile 13\nyeah 13\n--- 12\n0.4 12\n00 12\n1.02 12\n10.5 12\n115 12\n118 12\n15.6 12\n179 12\n1929 12\n1970 12\n7.25 12\n8.2 12\n8.25 12\n8.40 12\n8.8 12\n850 12\nAbraham 12\nAcross 12\nAdobe 12\nAhmad 12\nAhmadinejad 12\nAirline 12\nAo 12\nAround 12\nAutomotive 12\nAxa 12\nBelgian 12\nBerry 12\nBoys 12\nBranch 12\nBurger 12\nCFTC 12\nCS 12\nCandlestick 12\nCap 12\nCarr 12\nCassman 12\nCharter 12\nChevy 12\nChiung 12\nChuck 12\nChugai 12\nCitibank 12\nColor 12\nCome 12\nConservatives 12\nContel 12\nCoopers 12\nCorning 12\nCrane 12\nCrescent 12\nCubans 12\nCui 12\nDakota 12\nDeforest 12\nDetails 12\nDreyfus 12\nDuke 12\nEarl 12\nElementary 12\nEmergency 12\nEmerson 12\nEmperor 12\nEmpire 12\nEnigma 12\nExperience 12\nFEMA 12\nFM 12\nFeel 12\nFile 12\nFlying 12\nFoothills 12\nGame 12\nGenerally 12\nGetting 12\nGinnie 12\nGoing 12\nGoodyear 12\nGrant 12\nGrumman 12\nGu 12\nH&R 12\nHK 12\nHakim 12\nHamad 12\nHartford 12\nHelp 12\nHenderson 12\nHerbert 12\nHollander 12\nHsin 12\nHubei 12\nHunter 12\nImagine 12\nImmigration 12\nInitiative 12\nInstruments 12\nJ 12\nJ.C. 12\nJacobs 12\nJoining 12\nJong 12\nKenya 12\nKerry 12\nKey 12\nKhan 12\nKlein 12\nKleinwort 12\nKremlin 12\nLBOs 12\nLaos 12\nLeading 12\nLexington 12\nLiaoning 12\nLight 12\nLo 12\nLorin 12\nLou 12\nMOFA 12\nMacDonald 12\nMackenzie 12\nMahdi 12\nMann 12\nMansour 12\nMarks 12\nMart 12\nMarx 12\nMasson 12\nMazda 12\nMcCall 12\nMcDonnell 12\nMedicaid 12\nMirage 12\nMonica 12\nN.C. 
12\nN.J 12\nNRM 12\nNation 12\nNile 12\nNorthrop 12\nNowadays 12\nNynex 12\nO'Brien 12\nOdeon 12\nOfficer 12\nOlmert 12\nOpen 12\nPearl 12\nPeladeau 12\nPeninsula 12\nPeoples 12\nPerspective 12\nPeru 12\nPharmaceuticals 12\nPhysics 12\nPlaza 12\nPolly 12\nPop 12\nPortfolio 12\nPrincess 12\nPrinceton 12\nProjects 12\nQatari 12\nQuality 12\nQuist 12\nR&D 12\nRandy 12\nReserves 12\nReuters 12\nRidge 12\nRoute 12\nSS 12\nSafety 12\nSandinista 12\nSarah 12\nSaudis 12\nSchwartz 12\nSells 12\nSenators 12\nSharp 12\nSheraton 12\nSilver 12\nSiniora 12\nSloan 12\nSomeone 12\nSprings 12\nStandards 12\nStanford 12\nStoll 12\nSudanese 12\nSummer 12\nSunnyvale 12\nTHERE 12\nTalabani 12\nTalks 12\nTaoyuan 12\nTechnical 12\nTehran 12\nTelecommunications 12\nTenn. 12\nTextile 12\nThanksgiving 12\nTire 12\nTraffic 12\nTransCanada 12\nTreaty 12\nUNESCO 12\nUSSR 12\nUnknown 12\nUtilities 12\nValentine 12\nVarious 12\nVatican 12\nWars 12\nWatch 12\nWatson 12\nWeihai 12\nWere 12\nWinston 12\nWoman 12\nWong 12\nWorse 12\nWorst 12\nWyoming 12\nXerox 12\nXiaoping 12\nYears 12\nYunlin 12\nabused 12\nacceleration 12\naccessories 12\naccords 12\nacknowledging 12\nactivist 12\nadvancement 12\nadvantageous 12\nadvent 12\nadvises 12\nadvocated 12\nairborne 12\nairing 12\nairplane 12\nalleviate 12\nalongside 12\naltitude 12\nanalyzing 12\nanti-American 12\nanticipates 12\nants 12\nanytime 12\narise 12\narises 12\narmies 12\narrears 12\nartery 12\nartificial 12\naspirations 12\nasserts 12\nassurance 12\nattendant 12\nattribute 12\nattributes 12\nautomobiles 12\naverages 12\nawkward 12\nbail 12\nbankrupt 12\nbans 12\nbatch 12\nbats 12\nbeaches 12\nbeaten 12\nbelts 12\nbenefiting 12\nbicycle 12\nbiology 12\nblend 12\nblessed 12\nblowing 12\nblows 12\nbolstered 12\nboomers 12\nboosts 12\nbooth 12\nbored 12\nbounce 12\nbribe 12\nbribery 12\nbrisk 12\nbudgetary 12\nbusiest 12\ncalcium 12\ncartoon 12\ncasual 12\ncave 12\ncelebrities 12\nchang 12\ncharm 12\nchartered 12\ncheated 12\ncheering 12\nchen 
12\nchorus 12\ncircuits 12\ncite 12\nclutter 12\nco-operative 12\ncoaches 12\ncoincidence 12\ncombines 12\ncommenting 12\ncomputerized 12\nconfession 12\nconform 12\ncongress 12\ncooperating 12\ncostumes 12\ncourthouse 12\ncracks 12\ncrap 12\ncreatures 12\ndeductible 12\ndeepen 12\ndeferred 12\ndelegations 12\ndeliveries 12\ndenial 12\ndenominations 12\ndenouncing 12\ndensity 12\ndentist 12\ndepartures 12\ndependents 12\ndepositary 12\ndeprived 12\ndesirable 12\ndetected 12\ndevaluation 12\ndevastated 12\ndiary 12\ndig 12\ndilemma 12\ndimensional 12\ndirecting 12\ndisappointments 12\ndisciplinary 12\ndisplaced 12\ndistress 12\ndiversification 12\ndoctrine 12\ndonor 12\ndownstream 12\ndragging 12\ndrastically 12\ndubious 12\ndull 12\nenhancing 12\nentrusted 12\nenvy 12\nestablishments 12\nethical 12\nevasion 12\nexotic 12\nexplicit 12\nexplode 12\nfaithful 12\nfarther 12\nfeasibility 12\nfiercely 12\nfifteen 12\nfiguring 12\nfinals 12\nfixing 12\nforefront 12\nforums 12\nforwarded 12\nfossil 12\nfulfilled 12\nfundamentally 12\nfurs 12\ngalvanized 12\ngeared 12\nglance 12\ngods 12\ngon 12\ngraduating 12\nguardian 12\nguerrilla 12\ngym 12\nhanding 12\nharbors 12\nhint 12\nhinted 12\nhunger 12\nideology 12\nillusion 12\nillustrate 12\nillustration 12\nimitation 12\nimmunity 12\nincorrectly 12\nincumbent 12\ninduce 12\nindustrywide 12\ninexpensive 12\ninfection 12\nintegrating 12\nintelligent 12\nintensified 12\nintensify 12\ninteraction 12\ninterconnect 12\ninvaded 12\nirresponsible 12\njokes 12\njournalism 12\njudging 12\nk 12\nkitten 12\nknocking 12\nlag 12\nlagged 12\nlagging 12\nlawn 12\nlayers 12\nlayoffs 12\nleaf 12\nlegendary 12\nlender 12\nlenses 12\nliable 12\nliang 12\nlineup 12\nlining 12\nlips 12\nlivestock 12\nlottery 12\nmailing 12\nmandated 12\nmarketer 12\nmask 12\nmath 12\nmentions 12\nmilestones 12\nmoderately 12\nmodernize 12\nmodernized 12\nmodified 12\nmultinationals 12\nmultiples 12\nmurderers 12\nneatly 12\nneglected 12\nnerves 12\nnonsense 
12\nnotify 12\nnursing 12\noath 12\nordinarily 12\noutflows 12\noutlying 12\noverdue 12\noverwhelmingly 12\npairs 12\nparental 12\nparity 12\nparked 12\npatrol 12\npencil 12\npersonalities 12\npets 12\nping 12\nplacement 12\nplatforms 12\npolite 12\npoorest 12\npossessed 12\npremise 12\npresently 12\npreservation 12\npreserved 12\npreserving 12\npretext 12\nprevailing 12\nprimitive 12\nproceeded 12\nprohibit 12\nprohibits 12\nprospectus 12\nprostitution 12\nprotects 12\nproviders 12\npsychiatrist 12\nranch 12\nrandomly 12\nraped 12\nratios 12\nrealism 12\nreap 12\nrebates 12\nrecipe 12\nrecipient 12\nreconsider 12\nrefrain 12\nrefunding 12\nrefunds 12\nreinforce 12\nrelaxation 12\nreliance 12\nrenewing 12\nrescued 12\nresemble 12\nresigning 12\nresisting 12\nretaining 12\nrevamped 12\nrewards 12\nrichest 12\nroadside 12\nrockets 12\nruble 12\nruins 12\nsafeguarding 12\nsatisfactory 12\nscams 12\nscrambled 12\nscrapped 12\nsedan 12\nseniors 12\nserver 12\nsettler 12\nsettlers 12\nshippers 12\nshout 12\nshutdown 12\nsincerely 12\nslack 12\nslim 12\nsloppy 12\nslot 12\nslumped 12\nsoaked 12\nspells 12\nspinning 12\nspray 12\nspurred 12\nstaffs 12\nstandstill 12\nstole 12\nsue 12\nsunk 12\nsupplying 12\nswitching 12\ntackle 12\ntalented 12\ntangible 12\ntapped 12\ntaxed 12\ntaxpayer 12\ntenth 12\nthesis 12\nthrust 12\ntoe 12\ntoes 12\ntooth 12\ntotals 12\ntowel 12\ntraced 12\ntraces 12\ntrademark 12\ntransported 12\ntribal 12\ntricks 12\ntrivial 12\nunavailable 12\nuncertainties 12\nunconstitutional 12\nundisclosed 12\nunreported 12\nunsettled 12\nunwanted 12\nunwelcome 12\nupcoming 12\nurges 12\nuseless 12\nvague 12\nvictories 12\nvillagers 12\nvirtue 12\nvoiced 12\nvoid 12\nwarranty 12\nwashed 12\nweighing 12\nwished 12\nwondered 12\nworkforce 12\nworkout 12\nwreck 12\nwrongdoing 12\nyun 12\nzoo 12\n· 12\n0.25 11\n1,200 11\n1.04 11\n1.75 11\n10.4 11\n108 11\n10:00 11\n113 11\n13.50 11\n13.6 11\n13.8 11\n141.90 11\n155 11\n185 11\n1906 11\n1961 11\n1963 11\n1964 
11\n1965 11\n2,500 11\n2.50 11\n2.75 11\n24th 11\n4.875 11\n5.6 11\n6.1 11\n6.8 11\n7.52 11\n7.8 11\n7.98 11\n8.09 11\n8.375 11\n8.4 11\n8.9 11\n9.7 11\n911 11\n94 11\nABM 11\nASSOCIATION 11\nAdd 11\nAgainst 11\nAlice 11\nAnalyst 11\nApplied 11\nArabic 11\nArctic 11\nArmco 11\nAshcroft 11\nAsk 11\nBahrain 11\nBalkan 11\nBancorp 11\nBarnard 11\nBaseball 11\nBasra 11\nBeginning 11\nBetsy 11\nBetween 11\nBros. 11\nBryant 11\nBuffett 11\nBulgaria 11\nBurma 11\nCALL 11\nCambodian 11\nCambria 11\nCanal 11\nCarat 11\nCastle 11\nCellular 11\nCenters 11\nCharlotte 11\nChip 11\nChiron 11\nChurchill 11\nClose 11\nColer 11\nComing 11\nComrade 11\nConsequently 11\nCorey 11\nCorr 11\nCorrespondent 11\nCountries 11\nCraft 11\nDade 11\nDaniels 11\nDebates 11\nDictaphone 11\nDillon 11\nDodge 11\nDong 11\nDoug 11\nDown 11\nDuan 11\nDubai 11\nEOL 11\nEgyptians 11\nEmhart 11\nEritrea 11\nFAA 11\nFOR 11\nFOREIGN 11\nFalls 11\nFiji 11\nFinanciere 11\nFire 11\nFitness 11\nFossett 11\nFrankly 11\nFt. 11\nFulton 11\nGallery 11\nGenerale 11\nGeographic 11\nGorky 11\nGreenberg 11\nGrowth 11\nHOME 11\nHai 11\nHaikou 11\nHess 11\nHoechst 11\nHopkins 11\nHyde 11\nHydro 11\nIQ 11\nIll 11\nIlluminating 11\nInd. 11\nIntermediate 11\nIvan 11\nJacques 11\nJake 11\nJoel 11\nKaiser 11\nKao 11\nKarl 11\nKids 11\nKoch 11\nKohl 11\nKuan 11\nLA 11\nLOAN 11\nLaband 11\nLaci 11\nLawmakers 11\nLeigh 11\nLetting 11\nLikewise 11\nLing 11\nLogan 11\nLuzon 11\nMachinery 11\nMacy 11\nMaking 11\nMayaw 11\nMeans 11\nMediterranean 11\nMemphis 11\nMerck 11\nMexicans 11\nMickey 11\nMinsheng 11\nMission 11\nMontana 11\nMr 11\nN 11\nN.Y 11\nNTUST 11\nNationwide 11\nNielsen 11\nNikko 11\nNumber 11\nONE 11\nOsaka 11\nPa 11\nPao 11\nPatterson 11\nPayment 11\nPennzoil 11\nPepsiCo 11\nPfeiffer 11\nPittston 11\nPizza 11\nPlanet 11\nPolitics 11\nPresidents 11\nPrevention 11\nProspect 11\nQuayle 11\nREADY 11\nRISC 11\nReed 11\nRegardless 11\nRegulations 11\nRev. 
11\nRhonda 11\nRichardson 11\nRising 11\nRoh 11\nRoom 11\nRunkel 11\nSaid 11\nSandinistas 11\nSassy 11\nSavaiko 11\nSchering 11\nSecret 11\nSeems 11\nShareholders 11\nShaw 11\nShop 11\nSimilar 11\nSky 11\nSouthwestern 11\nSpiegel 11\nStay 11\nStudies 11\nSure 11\nTaba 11\nTaihsi 11\nTakeover 11\nTalk 11\nTangshan 11\nTanzania 11\nTelecom 11\nTelesis 11\nTempleton 11\nTheater 11\nThroughout 11\nThurmond 11\nTide 11\nToledo 11\nTourism 11\nTravelers 11\nTrinova 11\nTu 11\nTurks 11\nU.S.A 11\nUNRWA 11\nUganda 11\nUruguay 11\nVenezuelan 11\nVenus 11\nVila 11\nVision 11\nWSJ 11\nWalking 11\nWash 11\nWebsite 11\nWeisfield 11\nWethy 11\nWhenever 11\nWorldCom 11\nYen 11\nabnormal 11\nabundant 11\naccompany 11\naccomplishment 11\naccountants 11\naccumulating 11\naccusation 11\nactivated 11\nadequately 11\nanalyzed 11\nanti-abortion 11\nappearances 11\nappoint 11\nappraisal 11\narchaeopteryx 11\narmored 11\nassignment 11\nauspicious 11\nautomated 11\nbackbone 11\nballoonfish 11\nbanquet 11\nbarred 11\nbarring 11\nbearded 11\nbeats 11\nbelonging 11\nbizarre 11\nborough 11\nbounced 11\nbreathe 11\nbusinesspeople 11\nbutt 11\ncages 11\ncans 11\ncatalyst 11\ncautioned 11\ncautions 11\nceramics 11\ncheek 11\nchew 11\nclips 11\ncoaster 11\ncoated 11\ncocktail 11\ncoffin 11\ncollapsing 11\ncollectively 11\ncommentators 11\ncondemnation 11\ncondemning 11\ncondolences 11\nconfessed 11\ncongratulated 11\nconjunction 11\nconsequence 11\ncontemplating 11\ncontention 11\ncontinuation 11\ncontradictory 11\nconventions 11\ncooled 11\ncord 11\ncorrected 11\ncough 11\ncousin 11\ncrashed 11\ncream 11\ncriticisms 11\ncrown 11\ncrushed 11\ncrystal 11\ncups 11\ncurve 11\nd 11\ndancing 11\ndelaying 11\ndelegate 11\ndelight 11\ndelivers 11\ndeparted 11\ndetailing 11\ndetective 11\ndevastation 11\ndeviation 11\ndiabetes 11\ndigging 11\ndirects 11\ndisadvantage 11\ndisappearance 11\ndisciplined 11\ndiscouraging 11\ndismiss 11\ndisparity 11\ndisplaying 11\ndisproportionate 11\ndisrupt 11\ndisrupted 
11\ndissatisfaction 11\ndissolution 11\ndissolve 11\ndocumentary 11\ndrafting 11\neagerness 11\near 11\neditors 11\neducate 11\nemphasize 11\nemphasizes 11\nemphasizing 11\nendangered 11\nenduring 11\nenforce 11\nentourage 11\nenvironmentally 11\nevacuation 11\nevenly 11\nexaggerated 11\nexceptional 11\nexodus 11\nexpelled 11\nexpressway 11\nextract 11\nfad 11\nfairy 11\nfavorably 11\nfertilizer 11\nfinest 11\nfinishes 11\nfitting 11\nflame 11\nflown 11\nforgive 11\nfray 11\nfriction 11\nfulfilling 11\ngarment 11\ngenerates 11\ngin 11\ngloomy 11\ngoodwill 11\ngoogle 11\ngovernance 11\ngreetings 11\ngwo 11\nhandsome 11\nhandy 11\nhen 11\nhints 11\nhourly 11\nhubby 11\nhumiliation 11\nhunters 11\nhydrogen 11\nidentifying 11\nillustrates 11\nimplied 11\nimply 11\ninefficient 11\ninfamous 11\ninflux 11\ninherent 11\ninjected 11\ninspected 11\ninspector 11\ninstallment 11\ninstances 11\ninsulting 11\nintervened 11\nintraday 11\ninvariably 11\njacket 11\njeans 11\njets 11\njolt 11\njudgments 11\njung 11\njurisdictions 11\nlake 11\nlan 11\nlap 11\nleaks 11\nleases 11\nlens 11\nlethal 11\nli 11\nliberation 11\nlineage 11\nloading 11\nlobbyist 11\nlogo 11\nlol 11\nlover 11\nlumber 11\nlure 11\nmanufactures 11\nmassages 11\nmemorandum 11\nmerging 11\nminerals 11\nminers 11\nminimize 11\nmixing 11\nmm 11\nmorals 11\nmotives 11\nmouths 11\nmurderer 11\nmushrooms 11\nmusicians 11\nnails 11\nnaive 11\nnarrowing 11\nnecessities 11\nnewcomer 11\nnewest 11\nnewsprint 11\nnon-U.S. 
11\nnonperforming 11\nnoodles 11\nnotices 11\no'clock 11\nobstruction 11\noppression 11\nopted 11\norganizational 11\norigins 11\noutlined 11\noutsider 11\noverlooked 11\np53 11\npachinko 11\npants 11\nparakeet 11\npartisan 11\npartition 11\npearl 11\npeasant 11\npeddling 11\nperceptions 11\npervasive 11\npinch 11\nplead 11\nplugged 11\nplunging 11\npoliceman 11\npools 11\nportrayal 11\nposes 11\npost-war 11\npostwar 11\npowerhouse 11\npragmatic 11\npraying 11\npreceded 11\npreceding 11\npredictable 11\nprefers 11\nprescribed 11\nprescription 11\npresumably 11\npricings 11\nprisons 11\npro-democracy 11\nprogressing 11\nprojection 11\npromotes 11\nprosecuted 11\nprotesting 11\nprovoked 11\nqueen 11\nquestionable 11\nranged 11\nrecorders 11\nregained 11\nregulating 11\nreinforcing 11\nreleasing 11\nremarkably 11\nrestrictive 11\nretaliate 11\nrevived 11\nrevolve 11\nricher 11\nrighteous 11\nrings 11\nriskier 11\nrod 11\nrolls 11\nroyalties 11\nruin 11\nsagging 11\nsailing 11\nsalmon 11\nsanitation 11\nsatirical 11\nsauce 11\nschemes 11\nscratch 11\nscripts 11\nservant 11\nshedding 11\nshoppers 11\nshortcomings 11\nsing 11\nsingled 11\nsins 11\nskepticism 11\nskiing 11\nslope 11\nsmiling 11\nsoar 11\nsoliciting 11\nsorrow 11\nsoybean 11\nspaces 11\nspawned 11\nspeculative 11\nspeeds 11\nspelling 11\nspilled 11\nspree 11\nstainless 11\nstarring 11\nstartling 11\nstatue 11\nsteering 11\nstereo 11\nstranger 11\nsubscribe 11\nsubstantive 11\nsubtle 11\nsuburb 11\nsucked 11\nsuite 11\nsummoned 11\nsupermarkets 11\nsuperpower 11\nsupervised 11\nsuppression 11\nsuppressor 11\nsupreme 11\nsurpassing 11\nsurprises 11\nsurrendered 11\nsuspicions 11\nswell 11\nsympathetic 11\ntainted 11\ntankers 11\nteenage 11\ntennis 11\ntents 11\ntheoretical 11\ntops 11\ntore 11\ntossed 11\ntours 11\ntrailer 11\ntrails 11\ntransitional 11\ntranslate 11\ntransporting 11\ntrick 11\ntrustee 11\ntwin 11\nundecided 11\nundercut 11\nunderestimated 11\nunderwrite 11\nunearthed 11\nunfriendly 
11\nuniversally 11\nunmanned 11\nuntrue 11\nunveil 11\nupheld 11\nupstairs 11\nurine 11\nutter 11\nvacancy 11\nvacant 11\nvault 11\nvetoed 11\nvideos 11\nviewpoint 11\nvisibility 11\nwaiver 11\nweakest 11\nweaving 11\nwet 11\nwheelchair 11\nwithheld 11\nwithstand 11\nwoke 11\nwool 11\nwording 11\nworlds 11\nyeast 11\nyoga 11\nyourselves 11\nyouthful 11\nyu 11\nzoning 11\n1.10 10\n1.20 10\n1.23 10\n1.24 10\n1.35 10\n10.2 10\n10.6 10\n112 10\n14.6 10\n141.45 10\n1920s 10\n2012 10\n320 10\n330 10\n340 10\n5. 10\n5.7 10\n55,000 10\n6/2 10\n7.20 10\n7.4 10\n7.96 10\n8.05 10\nACCEPTANCES 10\nASAP 10\nASSETS 10\nAbdel 10\nAbove 10\nAdams 10\nAdha 10\nAdm. 10\nAer 10\nAmazing 10\nAnnette 10\nAnnual 10\nAnything 10\nArby 10\nAsaad 10\nAtlantis 10\nAutomobile 10\nBANKERS 10\nBNL 10\nBPCA 10\nBaidu 10\nBan 10\nBarcelona 10\nBatibot 10\nBatman 10\nBehind 10\nBeihai 10\nBloc 10\nBrawer 10\nBriggs 10\nBroadway 10\nBuir 10\nBurlington 10\nC.D.s 10\nC3CRM 10\nCCTV 10\nCERTIFICATES 10\nCO. 10\nCRM 10\nCalifornians 10\nChange 10\nCharlie 10\nChechnya 10\nChinanews.com 10\nChivas 10\nCie 10\nCircus 10\nClassic 10\nClearing 10\nCoal 10\nCoastal 10\nCollins 10\nColonel 10\nCommander 10\nCommon 10\nConnolly 10\nConsulting 10\nContact 10\nCorps 10\nCrown 10\nDEC 10\nDEPOSIT 10\nDES 10\nDISCOUNT 10\nDali 10\nDatapoint 10\nDepending 10\nDesign 10\nDhabi 10\nDifferent 10\nDingxiang 10\nDominican 10\nDonoghue 10\nDresdner 10\nDuff 10\nEMS 10\nEURODOLLARS 10\nEasy 10\nElectoral 10\nEnfield 10\nEugene 10\nExcept 10\nExporting 10\nExports 10\nExternal 10\nFERC 10\nFacilities 10\nFaisal 10\nFalcon 10\nFan 10\nFangchenggang 10\nFinancing 10\nFirefox 10\nFirstly 10\nFlorence 10\nFranco 10\nFried 10\nFriend 10\nGATT 10\nGallup 10\nGelbart 10\nGets 10\nGoldsmith 10\nGraham 10\nGroups 10\nGuardia 10\nHalloween 10\nHanna 10\nHarrison 10\nHassan 10\nHaven 10\nHead 10\nHebei 10\nHeller 10\nHulun 10\nHunan 10\nINTERBANK 10\nIRAs 10\nImports 10\nIndiana 10\nIndividuals 10\nInland 10\nInstitutions 
10\nIverson 10\nIvory 10\nJefferson 10\nJie 10\nJin 10\nJoan 10\nJolla 10\nJuan 10\nJude 10\nJudges 10\nJudicial 10\nKGB 10\nKarbala 10\nKathleen 10\nKeith 10\nKuala 10\nLATE 10\nLYNCH 10\nLady 10\nLanzhou 10\nLaotian 10\nLarsen 10\nLauder 10\nLearning 10\nLess 10\nLeung 10\nLiberty 10\nLinux 10\nLots 10\nLufthansa 10\nLumpur 10\nLuo 10\nLynn 10\nMERRILL 10\nMachine 10\nMahfouz 10\nMalaysian 10\nMandarin 10\nMarathon 10\nMarcus 10\nMarlowe 10\nMet 10\nMetal 10\nMetromedia 10\nMichele 10\nMir 10\nMitterrand 10\nMobile 10\nMorning 10\nMoslems 10\nMoving 10\nNazis 10\nNeb. 10\nNegotiable 10\nNepal 10\nNichols 10\nNorway 10\nNuovo 10\nOFFERED 10\nOccidental 10\nOfficial 10\nOlympia 10\nOscar 10\nOvidie 10\nPCs 10\nPace 10\nParents 10\nPersonnel 10\nPlough 10\nPolls 10\nPrebon 10\nPromotion 10\nProphet 10\nProtocol 10\nPublications 10\nRecent 10\nRecognition 10\nRedford 10\nRidder 10\nRockwell 10\nRoebuck 10\nRothschilds 10\nRubbermaid 10\nRules 10\nSaleh 10\nSamsung 10\nSchedule 10\nShackleton 10\nSherwin 10\nShia 10\nShlomo 10\nShopping 10\nShortly 10\nShuang 10\nSinyard 10\nSkinner 10\nSmash 10\nSoftware 10\nSoo 10\nSoutham 10\nStart 10\nStarting 10\nSteinberg 10\nStoltzman 10\nStory 10\nSung 10\nSuperfund 10\nTCI 10\nTai 10\nTana 10\nTaxation 10\nThirdly 10\nThousands 10\nThrift 10\nTien 10\nTimothy 10\nTonkin 10\nTung 10\nTzu 10\nUVB 10\nUn 10\nUniroyal 10\nUsually 10\nV. 10\nVERY 10\nVIA 10\nVanguard 10\nVanity 10\nVenture 10\nVideo 10\nVisitors 10\nWHEN 10\nWal 10\nWash. 
10\nWellcome 10\nWenzhou 10\nWestmoreland 10\nWinnebago 10\nWitnesses 10\nWorldwide 10\nXin 10\nYemenis 10\nYew 10\nabandoning 10\nabsorbing 10\naccidentally 10\naccomplishments 10\nadministered 10\nadmiration 10\nadvisor 10\naffirmative 10\nalleging 10\nallied 10\nallocate 10\namazed 10\nambiguous 10\namusing 10\nanalogy 10\nannualized 10\napprovals 10\narcheological 10\narranging 10\narteries 10\nartificially 10\nassistants 10\nassurances 10\nathletics 10\nauctioned 10\naunt 10\nbacklash 10\nbakery 10\nbankruptcies 10\nbargains 10\nbatteries 10\nbellwether 10\nbiography 10\nblink 10\nboyfriend 10\nbrass 10\nbreaker 10\nbred 10\nbuddy 10\nbuildup 10\nbuoyed 10\nburial 10\nbury 10\nbust 10\nbuttons 10\ncafe 10\ncalculating 10\ncalculation 10\ncandidacy 10\ncart 10\ncarving 10\ncatering 10\ncautiously 10\nchairmen 10\nchapters 10\ncharming 10\nchat 10\ncheered 10\ncheers 10\nchemistry 10\nchilling 10\nclassmates 10\ncoat 10\ncoatings 10\ncoding 10\ncollaborating 10\ncolliding 10\ncommemorative 10\ncompanions 10\ncomparing 10\ncompelled 10\ncomrades 10\nconditioned 10\nconditioning 10\nconflicting 10\nconstraints 10\ncontributes 10\nconvene 10\ncook 10\ncoolant 10\ncork 10\ncriminality 10\ncriticize 10\ncushion 10\ndared 10\ndarkness 10\ndash 10\ndecliners 10\ndecreasing 10\ndeed 10\ndeemed 10\ndefective 10\ndefenses 10\ndegradation 10\ndel 10\ndemise 10\ndependence 10\ndepicted 10\ndesigning 10\ndeteriorate 10\ndeteriorated 10\ndevotion 10\ndiagnosed 10\ndial 10\ndiameter 10\ndigit 10\ndigs 10\ndisarray 10\ndiscourage 10\ndisintegration 10\ndisorders 10\ndive 10\ndivisive 10\ndocks 10\ndocumentation 10\ndocumented 10\ndowngraded 10\ndrifted 10\ndwarf 10\neldest 10\nelites 10\nemails 10\nembryo 10\nemergence 10\nemerges 10\nenactment 10\nensuing 10\nenterovirus 10\nerode 10\nerosion 10\neruption 10\nessay 10\nethylene 10\nexecuting 10\nexecutions 10\nexercising 10\nextremists 10\nfabrication 10\nfacial 10\nfacto 10\nfaculty 10\nfashions 10\nfeedback 10\nfen 
10\nfights 10\nfirmer 10\nfledgling 10\nflurry 10\nflush 10\nfond 10\nforbidding 10\nforehead 10\nforge 10\nfortunate 10\nfranchisee 10\nfreed 10\nfrenzy 10\nfrightened 10\nfronts 10\nfunctioning 10\nfurriers 10\ngainers 10\ngesture 10\nglimpse 10\nglossy 10\ngloves 10\ngoodness 10\ngradual 10\ngrandstand 10\ngreedy 10\ngreeting 10\ngrid 10\ngrower 10\nguaranteeing 10\nguessed 10\nhacker 10\nharmed 10\nhavoc 10\nhazardous 10\nhealing 10\nheir 10\nhelpless 10\nhepatitis 10\nhesitation 10\nho 10\nhonesty 10\nhumane 10\nhumanitarianism 10\nhurts 10\nhusbands 10\nimporter 10\ninaccurate 10\nincinerator 10\nincompetent 10\nindefinitely 10\ninflated 10\ninnovations 10\ninsight 10\ninsights 10\ninspiration 10\ninstalling 10\ninsure 10\nintegral 10\ninterrupted 10\ninventor 10\ninvests 10\ninvites 10\njammed 10\njen 10\njeou 10\njerry 10\njobless 10\njumps 10\nkillers 10\nkilograms 10\nkindergarten 10\nkindness 10\nkingdom 10\nkings 10\nkittens 10\nlaborers 10\nlandfill 10\nlandowners 10\nlaundry 10\nlavish 10\nlazy 10\nleaking 10\nlegged 10\nliberalization 10\nliberate 10\nlied 10\nlimbs 10\nlisteners 10\nlitter 10\nlobbyists 10\nlofty 10\nloosen 10\nloser 10\nlovers 10\nmagnificent 10\nmalls 10\nmapped 10\nmapping 10\nmarble 10\nmarginally 10\nmarrying 10\nmartyr 10\nmicroprocessor 10\nmid-1970s 10\nmidday 10\nmildly 10\nminiature 10\nmisery 10\nmoderates 10\nmonumental 10\nmooring 10\nmotivation 10\nmultimillion 10\nnaming 10\nnationalist 10\nnearest 10\nnegligence 10\nnominee 10\nnotch 10\nnotebook 10\nnowhere 10\nnumbered 10\nnuts 10\nobjection 10\nobservations 10\nobsolete 10\noccasional 10\noccupy 10\nordeal 10\nouster 10\noutline 10\noutnumbered 10\noutset 10\noutward 10\noutweigh 10\novercapacity 10\noverpriced 10\np.m 10\npacers 10\npacking 10\npad 10\npale 10\npalladium 10\npan 10\nparadox 10\nparagraph 10\nparcel 10\npassport 10\npaths 10\npayout 10\npencils 10\nphases 10\npitches 10\npleading 10\npleasures 10\npoisoned 10\npoisonous 10\npolicemen 10\npolonium 
10\npolyethylene 10\npoorer 10\npornography 10\npotent 10\nprayers 10\nprepaid 10\nprestigious 10\npresumed 10\nprivatized 10\npro-active 10\nprobing 10\nprocurement 10\nproletarian 10\nproliferation 10\npropaganda 10\npulse 10\npurchasers 10\nquest 10\nrabbits 10\nracism 10\nradicals 10\nrage 10\nrains 10\nrampant 10\nreacting 10\nreadiness 10\nrecognizes 10\nrecoup 10\nrefineries 10\nreforming 10\nreinforced 10\nreins 10\nrejecting 10\nrelocation 10\nrenders 10\nresearching 10\nresilience 10\nresilient 10\nresolutely 10\nresumes 10\nresuming 10\nretrieve 10\nrevelations 10\nreversal 10\nrides 10\nrobberies 10\nromance 10\nrooted 10\nrope 10\nruined 10\nrunoff 10\nsalon 10\nsciences 10\nscuttle 10\nseizing 10\nservers 10\nshah 10\nshelves 10\nshipment 10\nshocks 10\nshooters 10\nshouted 10\nshower 10\nsightseeing 10\nsimilarly 10\nsimpler 10\nskies 10\nslaughter 10\nsleepy 10\nslept 10\nslice 10\nslick 10\nsliding 10\nslogans 10\nsolidarity 10\nsomeday 10\nsore 10\nsovereign 10\nspaceship 10\nspan 10\nspeakers 10\nspectrum 10\nsporadic 10\nspouse 10\nspreadsheet 10\nstabilization 10\nstaging 10\nsteer 10\nstreak 10\nstressing 10\nstudios 10\nsub 10\nsuffers 10\nsupportive 10\ntabloid 10\ntallest 10\ntasted 10\ntastes 10\nthanked 10\nthief 10\nthriving 10\nthumb 10\ntightening 10\ntoilet 10\ntolerate 10\ntolling 10\ntouting 10\ntowed 10\ntrafficking 10\ntrucking 10\ntuna 10\nunanimous 10\nunborn 10\nunderwear 10\nunderwritten 10\nundeveloped 10\nunfolding 10\nunidentified 10\nunpaid 10\nunpopular 10\nunto 10\nupdated 10\nurgently 10\nvain 10\nvalidity 10\nvaluation 10\nvariable 10\nvariables 10\nveil 10\nverbal 10\nvested 10\nvice-chairman 10\nvivid 10\nvowing 10\nvulnerability 10\nwallet 10\nwarships 10\nwaterproof 10\nweaponry 10\nweighted 10\nwelcoming 10\nwhores 10\nwiden 10\nwitches 10\nworsening 10\nwreckage 10\nwretched 10\nyielded 10\n!? 
9\n'80s 9\n'em 9\n*** 9\n1,400 9\n1,800 9\n1.03 9\n1.11 9\n1.12 9\n1.19 9\n1.22 9\n1.44 9\n1.8470 9\n10.3 9\n109 9\n12,000 9\n14.5 9\n149 9\n165 9\n17.5 9\n1937 9\n1948 9\n1958 9\n1959 9\n1962 9\n2016 9\n3.35 9\n7.93 9\n75,000 9\n8.70 9\n8.75 9\n9.2 9\n9.9 9\n9000 9\n:--RRB- 9\n= 9\nABB 9\nAP 9\nAS 9\nAT 9\nAbkhazia 9\nAbortion 9\nAbrams 9\nAbramson 9\nAccord 9\nAdvertisers 9\nAdvisers 9\nAdvisory 9\nAlaskan 9\nAllan 9\nAluminum 9\nAna 9\nAndalou 9\nAngel 9\nAngelo 9\nArias 9\nArmonk 9\nAssets 9\nAuto 9\nBILLS 9\nBUT 9\nBacked 9\nBangguo 9\nBanque 9\nBasic 9\nBasque 9\nBeebes 9\nBo 9\nBolivia 9\nBorn 9\nBrokers 9\nBuddhist 9\nBurgess 9\nBurns 9\nBus 9\nByrne 9\nCMS 9\nCORP 9\nCPPCC 9\nCaltrans 9\nCananea 9\nCantonese 9\nCapcom 9\nCardinal 9\nCarson 9\nCaterpillar 9\nCensus 9\nChampionship 9\nChebeck 9\nChemicals 9\nCheung 9\nClubs 9\nCo-operation 9\nCol. 9\nCongo 9\nCornell 9\nCorrea 9\nCrime 9\nCrossland 9\nCypress 9\nDamascus 9\nDark 9\nDeaver 9\nDeb 9\nDel 9\nDel. 9\nDemocracy 9\nDepression 9\nDeukmejian 9\nDevelopments 9\nDigest 9\nDingell 9\nDisabilities 9\nDog 9\nESB 9\nEaton 9\nEdgar 9\nEdisto 9\nElectrical 9\nEmirates 9\nErbamont 9\nFear 9\nFei 9\nFinal 9\nFinding 9\nFirm 9\nFish 9\nFlower 9\nFormosa 9\nGCP 9\nGaubert 9\nGene 9\nGillian 9\nGoddess 9\nGrain 9\nGross 9\nGuaranty 9\nHAS 9\nHR 9\nHanoi 9\nHeadquarters 9\nHelen 9\nHighland 9\nHilal 9\nHonduras 9\nHou 9\nHuwei 9\nIPO 9\nIdaho 9\nIl 9\nImClone 9\nIndependence 9\nInstitutional 9\nInsurers 9\nIntelogic 9\nIntergroup 9\nJAL 9\nJudith 9\nKageyama 9\nKahn 9\nKeenan 9\nKhalil 9\nKia 9\nKid 9\nKnowledge 9\nKoizumi 9\nKorotich 9\nKrasnoyarsk 9\nKunming 9\nKurdish 9\nL.A. 
9\nLLC 9\nLai 9\nLaurel 9\nLeaseway 9\nLeave 9\nLegend 9\nLeo 9\nLetter 9\nLeval 9\nLevi 9\nLieberman 9\nLyonnais 9\nM$ 9\nMBA 9\nMIPS 9\nMSN 9\nMail 9\nMansion 9\nMario 9\nMatsu 9\nMaxicare 9\nMcGraw 9\nMcLennan 9\nMeasure 9\nMenlo 9\nMeraBank 9\nMichel 9\nMitsui 9\nModern 9\nMontedison 9\nMontenegro 9\nMorrison 9\nMorton 9\nMujahedeen 9\nMulford 9\nMuqtada 9\nMurata 9\nMyanmar 9\nNOW 9\nNWA 9\nNantou 9\nNasrallah 9\nNationalist 9\nNebraska 9\nNike 9\nNoble 9\nOften 9\nOlsen 9\nOmaha 9\nOrders 9\nOriginal 9\nOrtiz 9\nOsbourne 9\nOu 9\nPLA 9\nPRI 9\nPalauan 9\nParkinson 9\nPaso 9\nPast 9\nPatel 9\nPearce 9\nPension 9\nPeriod 9\nPfizer 9\nPhoto 9\nPlans 9\nPoint 9\nPoles 9\nPotter 9\nPoulenc 9\nPresse 9\nPrison 9\nProbably 9\nProf. 9\nProfits 9\nQ 9\nQuite 9\nROW 9\nRapid 9\nRates 9\nRealist 9\nRecord 9\nRefcorp 9\nRegulatory 9\nReporting 9\nResource 9\nRestaurant 9\nRhone 9\nRidley 9\nRisk 9\nRob 9\nRobins 9\nRoper 9\nS.p 9\nSUV 9\nSaad 9\nSalinas 9\nSalinger 9\nSalt 9\nSaturn 9\nScania 9\nScorpio 9\nSeats 9\nSeita 9\nSerial 9\nSeventh 9\nShang 9\nSharia 9\nShenye 9\nSherman 9\nShimbun 9\nShuidong 9\nSisulu 9\nSnake 9\nSnow 9\nSocialists 9\nSolar 9\nSri 9\nSt 9\nStatistical 9\nSteppenwolf 9\nStudent 9\nSue 9\nSylvia 9\nTREASURY 9\nTRO 9\nTalking 9\nTel 9\nTerror 9\nTisch 9\nTorrijos 9\nToseland 9\nTransit 9\nTucker 9\nU.S.A. 
9\nUBS 9\nUSD 9\nUnit 9\nUrumchi 9\nVDOT 9\nVeterans 9\nVisa 9\nVista 9\nWade 9\nWaksal 9\nWaxman 9\nWays 9\nWelch 9\nWelfare 9\nWhoever 9\nWolfowitz 9\nXie 9\nYingko 9\nYo 9\nZhangzhou 9\nZionist 9\nabbreviation 9\nabide 9\nabrupt 9\nabsorption 9\nabstract 9\nabundance 9\naccent 9\nadmitting 9\nadventure 9\nadversary 9\nadvertise 9\nadvertisements 9\nadvisors 9\naerobic 9\naffidavits 9\naftershock 9\nagendas 9\naground 9\naired 9\nalbeit 9\nalien 9\nallegiance 9\namend 9\nammunition 9\nanti-government 9\nantibody 9\nanticipating 9\napplicable 9\nappropriated 9\nappropriation 9\napt 9\naquarium 9\narbitrarily 9\narcheology 9\narchitectural 9\narising 9\naroma 9\nasleep 9\nassign 9\nate 9\nathletic 9\nauctions 9\nauthentic 9\nautos 9\nbaggage 9\nbang 9\nbare 9\nbarometer 9\nbearings 9\nbees 9\nbehave 9\nbehaved 9\nbehaviors 9\nbeneficiary 9\nbeverages 9\nbicycles 9\nbillionaire 9\nbits 9\nblooded 9\nboldly 9\nbore 9\nborne 9\nbourgeoisie 9\nbrakes 9\nbreadth 9\nbreakthroughs 9\nbriefed 9\nbug 9\nburdened 9\nbursts 9\nbushel 9\ncaller 9\ncared 9\ncater 9\nchaired 9\nchancellor 9\nchaotic 9\ncharts 9\nchatting 9\ncheat 9\ncheer 9\ncherished 9\nchocolate 9\nchuan 9\nchung 9\nclearer 9\nclears 9\nclever 9\nclip 9\ncocaine 9\ncomet 9\ncomfortably 9\ncommanding 9\ncommunists 9\ncomplementary 9\ncomplexity 9\ncomposition 9\ncon 9\nconceptual 9\nconferees 9\nconfined 9\nconscientiously 9\nconserve 9\nconsortia 9\nconstituency 9\nconstituted 9\nconstitutes 9\nconsult 9\ncontaminated 9\ncontrollers 9\ncopied 9\ncorpse 9\ncostume 9\ncousins 9\ncowboy 9\ncracked 9\ncriticizing 9\ncrumbling 9\ncrystals 9\ncultivate 9\ncultivated 9\ncultivation 9\ncynical 9\ndairy 9\ndeadlines 9\ndealerships 9\ndean 9\ndecree 9\ndefer 9\ndeficiency 9\ndefunct 9\ndeliberations 9\ndelighted 9\ndemocratization 9\ndemolished 9\ndemonstrating 9\ndenominated 9\ndense 9\ndepended 9\ndepletion 9\ndeploy 9\ndepressing 9\ndes 9\ndescendants 9\ndescent 9\ndevised 9\ndialect 9\ndiplomat 9\ndirt 9\ndisagreed 
9\ndisappears 9\ndisastrous 9\ndischarge 9\ndiscouraged 9\ndiscrepancies 9\ndiscretionary 9\ndispatch 9\ndisruptions 9\ndistinguish 9\ndistributes 9\ndivestiture 9\ndizzying 9\ndocking 9\ndoldrums 9\ndonors 9\ndoorstep 9\ndraining 9\ndrastic 9\ndrawings 9\ndreamed 9\ndresses 9\ndrift 9\nducks 9\ne 9\neBay 9\neagerly 9\nearmarked 9\necological 9\nelectromagnetic 9\nemigration 9\nen 9\nenclave 9\nenthusiasts 9\nequals 9\neroding 9\nevacuated 9\nevolved 9\nexemption 9\nexerted 9\nexhausting 9\nexploding 9\nextradition 9\nfallout 9\nfang 9\nfarewell 9\nfascinating 9\nfavorites 9\nfervor 9\nfighters 9\nfilming 9\nfirmed 9\nfloods 9\nfoolish 9\nfools 9\nfootage 9\nforbidden 9\nforcefully 9\nforeseeable 9\nforgot 9\nfourteen 9\nfranchisers 9\nfrantically 9\nfundamentalist 9\nfurious 9\nfuror 9\ngallows 9\ngaming 9\ngaps 9\ngates 9\ngem 9\ngrease 9\ngreenification 9\ngreet 9\ngroundwork 9\nguilders 9\nguilt 9\nguinea 9\ngunships 9\ngut 9\nhackers 9\nhairs 9\nhardships 9\nhastily 9\nheadache 9\nhealthcare 9\nheaviest 9\nhello 9\nhesitant 9\nhill 9\nhomework 9\nhoney 9\nhopeless 9\nhorizontal 9\nhumble 9\nimbalance 9\nimbalances 9\nimplication 9\nimposes 9\ninched 9\nincitement 9\nincoming 9\nindoor 9\nindustrialization 9\ninhibit 9\ninitiate 9\ninsane 9\ninspections 9\ninspire 9\ninvestigator 9\ninvisible 9\ninward 9\nironic 9\nirony 9\njailed 9\njetliner 9\njolted 9\njumbo 9\nkeys 9\nkicking 9\nkills 9\nkinda 9\nlabs 9\nladen 9\nladies 9\nlapses 9\nlaughs 9\nlaughter 9\nlayout 9\nleaped 9\nlibel 9\nliberals 9\nliberty 9\nlieutenant 9\nlifts 9\nlightning 9\nlingering 9\nloads 9\nlog 9\nlogs 9\nloop 9\nloopholes 9\nluxurious 9\nmalignant 9\nmania 9\nmanipulate 9\nmarkedly 9\nmartyrs 9\nmarvelous 9\nmasks 9\nmassacres 9\nmaximize 9\nmegabyte 9\nmei 9\nmelting 9\nmercury 9\nmerits 9\nminicomputers 9\nmiserable 9\nmistaken 9\nmobilize 9\nmoderation 9\nmodifications 9\nmolecular 9\nmonopolies 9\nmonster 9\nmop 9\nmorally 9\nmotorcycle 9\nmph 9\nmunicipals 9\nmural 9\nmurdering 
9\nnationalism 9\nnativist 9\nneat 9\nneglect 9\nnetizens 9\nnicely 9\nnod 9\nnon-financial 9\nnon-food 9\nnon-toxic 9\nnorm 9\nnostalgia 9\nnutrition 9\nobjected 9\nobserving 9\noffense 9\nolds 9\nominous 9\nonerous 9\noneself 9\noriginate 9\noriginated 9\norphans 9\noutdated 9\noutflow 9\noutpost 9\noutrage 9\noversee 9\npAsia 9\npainfully 9\npains 9\nparade 9\nparalyzed 9\nparticipant 9\npatch 9\npatented 9\npatriotism 9\npeacefully 9\npedal 9\npeer 9\npeers 9\npegged 9\npenetrate 9\nperfection 9\nperfume 9\nperiodic 9\npersist 9\npesticide 9\nphony 9\nphotograph 9\nphotographic 9\npiles 9\npillars 9\npipelines 9\npity 9\nplates 9\nplausible 9\nplaywright 9\nplots 9\npoised 9\npolished 9\npollutants 9\npolyester 9\npond 9\nposture 9\npowerless 9\npreamble 9\npredators 9\npreoccupied 9\npreparatory 9\nprevents 9\nprocessors 9\nprofessionalism 9\nprogrammed 9\nprogressively 9\nprojecting 9\npronunciation 9\nproponents 9\nprotested 9\npsychologically 9\npunitive 9\npuzzle 9\nquell 9\nraiders 9\nreactors 9\nreasoning 9\nreceipt 9\nrecipients 9\nrecognizing 9\nrecordings 9\nrecruit 9\nrecycled 9\nredeemed 9\nreferences 9\nrefrigerators 9\nregimes 9\nrelinquish 9\nremarried 9\nrenowned 9\nrepublics 9\nrestricts 9\nretarded 9\nreversing 9\nribbon 9\nrig 9\nripe 9\nroadways 9\nrobbers 9\nrobbery 9\nrode 9\nrows 9\nsabotage 9\nsacrificed 9\nsalvage 9\nsatellites 9\nsaves 9\nscam 9\nscarcely 9\nscramble 9\nscrews 9\nseated 9\nseekers 9\nseparatist 9\nseparatists 9\nseriousness 9\nsexually 9\nshallow 9\nshapes 9\nshaping 9\nsheds 9\nsideline 9\nsigh 9\nslapped 9\nsoftness 9\nsounded 9\nsounding 9\nspecialize 9\nspecter 9\nspies 9\nspills 9\nsplitting 9\nstaple 9\nstarred 9\nstartup 9\nstarving 9\nstatewide 9\nstatutes 9\nstipulates 9\nstopover 9\nstormed 9\nstove 9\nstreamline 9\nstreams 9\nstressful 9\nstretching 9\nstrikers 9\nstrokes 9\nsturdy 9\nsubordinates 9\nsubsidize 9\nsuing 9\nsunset 9\nsupervising 9\nsupervisors 9\nsurgical 9\nsurging 9\nsustainable 9\nsway 
9\nswelling 9\nswift 9\nsyndicated 9\ntaboo 9\ntacit 9\ntangled 9\ntariff 9\ntended 9\ntermed 9\nterracotta 9\nterribly 9\ntexts 9\ntheaters 9\nthermal 9\ntightened 9\ntin 9\ntoppled 9\ntoured 9\ntouring 9\ntrace 9\ntrailing 9\ntransforming 9\ntransplant 9\ntreats 9\ntremendously 9\ntroublesome 9\ntrusts 9\ntwists 9\nunaware 9\nunbearable 9\nunbelievable 9\nunconsolidated 9\nundermined 9\nundertook 9\nundervalued 9\nuneasy 9\nunlawful 9\nunload 9\nunpredictable 9\nunprofitable 9\nunrealistic 9\nunused 9\nupbeat 9\nupheaval 9\nuphill 9\nupright 9\nvenerable 9\nvent 9\nvertical 9\nvirgin 9\nwaive 9\nwasting 9\nweights 9\nwen 9\nwhatsoever 9\nwhereabouts 9\nwhilst 9\nwillful 9\nwishing 9\nwithdrawing 9\nwoes 9\nworkplace 9\nworrisome 9\nwrestling 9\n**** 8\n.doc 8\n01 8\n1.06 8\n1.07 8\n1.18 8\n1.26 8\n1.29 8\n1.36 8\n1.42 8\n1.80 8\n1.85 8\n10.7 8\n11/28/2000 8\n12.4 8\n127 8\n13,000 8\n132 8\n136 8\n17,000 8\n17.6 8\n18.5 8\n190.58 8\n1940s 8\n1949 8\n1:00 8\n2018 8\n208 8\n22.5 8\n280 8\n3,500 8\n3/32 8\n309 8\n45,000 8\n475 8\n5.94 8\n5/16 8\n540 8\n613305 8\n7.88 8\n7.92 8\n7:30 8\n8.1 8\n8.60 8\n8.7 8\n8:30 8\n9,000 8\n9.4 8\nA.P. 8\nAN 8\nASEAN 8\nAccess 8\nAdding 8\nAdds 8\nAeronautics 8\nAgents 8\nAgricole 8\nAichi 8\nAirbus 8\nAircraft 8\nAltman 8\nAmgen 8\nAnalytical 8\nAndreas 8\nAntarctica 8\nApogee 8\nApplication 8\nArabiya 8\nAsarco 8\nAtomic 8\nAviv 8\nBART 8\nBar 8\nBarents 8\nBataille 8\nBawan 8\nBenton 8\nBergsma 8\nBird 8\nBondad 8\nBoulevard 8\nBout 8\nBowes 8\nBroad 8\nBronfman 8\nBronx 8\nBurton 8\nByrd 8\nCNW 8\nCa 8\nCadillac 8\nCage 8\nCanton 8\nCarnegie 8\nCars 8\nCatholics 8\nCeramics 8\nChad 8\nChalabi 8\nChampion 8\nChancery 8\nChicken 8\nChicks 8\nChild 8\nChong 8\nChongming 8\nChrist 8\nChristianity 8\nCie. 
8\nClaire 8\nColinas 8\nColler 8\nCommons 8\nConsidering 8\nContainer 8\nContemporary 8\nConvex 8\nConway 8\nCopper 8\nCopyright 8\nCotton 8\nCrimes 8\nCrowd 8\nCrude 8\nCrusaders 8\nCupertino 8\nCyprus 8\nDAF 8\nDAX 8\nDakar 8\nDartmouth 8\nDenton 8\nDevices 8\nDiana 8\nDirectors 8\nDisabled 8\nDominion 8\nDreamer 8\nDrive 8\nDurian 8\nDynasty 8\nE-mail 8\nEcuadorian 8\nEgon 8\nEnds 8\nEnronOnline 8\nEra 8\nErnest 8\nErnst 8\nEsselte 8\nEssex 8\nEstonia 8\nEthiopian 8\nExactly 8\nExcellency 8\nExhibition 8\nFame 8\nFazio 8\nFemale 8\nFerguson 8\nFernando 8\nFerrous 8\nFigures 8\nFijian 8\nFindlay 8\nFine 8\nFirms 8\nFla 8\nFletcher 8\nFluor 8\nForestry 8\nFrancois 8\nFriends 8\nFront 8\nFuel 8\nFujis 8\nGa. 8\nGenetics 8\nGeorges 8\nGerry 8\nGhraib 8\nGintel 8\nGoodman 8\nGot 8\nGuarantee 8\nHK$ 8\nHaas 8\nHahn 8\nHajj 8\nHamburg 8\nHancock 8\nHanky 8\nHanson 8\nHard 8\nHeat 8\nHees 8\nHighly 8\nHilton 8\nHim 8\nHispanics 8\nHood 8\nHopefully 8\nHoping 8\nHot 8\nHotline 8\nHubbard 8\nHutchinson 8\nID 8\nIan 8\nIbrahim 8\nInitial 8\nIsmail 8\nJacob 8\nJamaica 8\nJamaican 8\nJasir 8\nJeb 8\nJeddah 8\nJefferies 8\nJesse 8\nJo 8\nJr 8\nJudging 8\nJudiciary 8\nJung 8\nJunk 8\nKandahar 8\nKang 8\nKarim 8\nKean 8\nKinmen 8\nKuanyin 8\nKume 8\nKyoto 8\nLazard 8\nLes 8\nLesko 8\nLeslie 8\nLionel 8\nLisbon 8\nLiz 8\nLortie 8\nMJIB 8\nMORE 8\nMP 8\nMadonna 8\nMahathir 8\nMain 8\nMale 8\nMartinez 8\nMateo 8\nMattel 8\nMatthew 8\nMaurice 8\nMd 8\nMecca 8\nMedina 8\nMessiah 8\nMich 8\nMirror 8\nMonsanto 8\nMosul 8\nMu 8\nMunicipality 8\nN.M. 8\nNBI 8\nNSB 8\nNan 8\nNashville 8\nNasr 8\nNato 8\nNearby 8\nNeedham 8\nNev. 8\nNguyen 8\nNimitz 8\nNoVA 8\nNormally 8\nNoting 8\nNoxell 8\nNugget 8\nOAS 8\nOctel 8\nOff 8\nOgden 8\nOk 8\nOre. 
8\nOriental 8\nOriginally 8\nOuter 8\nOutlaws 8\nPSE 8\nPalmolive 8\nPasadena 8\nPei 8\nPerlman 8\nPete 8\nPeugeot 8\nPi 8\nPicop 8\nPing 8\nPitney 8\nPlayStation 8\nPo 8\nPostal 8\nPreparatory 8\nPretax 8\nPricing 8\nProgress 8\nProvidence 8\nRMA 8\nRafida 8\nRand 8\nRapids 8\nRate 8\nRecall 8\nReds 8\nReims 8\nResponse 8\nRetirement 8\nRichfield 8\nRocky 8\nRodney 8\nRoe 8\nRong 8\nRoosevelt 8\nRorer 8\nRowland 8\nRubin 8\nRwanda 8\nS.C 8\nS.C. 8\nSJ 8\nSK 8\nSO 8\nSalman 8\nSame 8\nSandra 8\nSchwab 8\nScowcroft 8\nSeabrook 8\nSean 8\nSession 8\nShamir 8\nSharif 8\nSlauson 8\nSohmer 8\nSolace 8\nSomething 8\nSorrell 8\nSperry 8\nSr. 8\nStan 8\nSteelworkers 8\nStick 8\nStorage 8\nStorer 8\nStrong 8\nSubcommittee 8\nSuccess 8\nSuperior 8\nSupport 8\nSurely 8\nTHEY 8\nTaco 8\nTaler 8\nTandem 8\nTask 8\nTass 8\nTea 8\nTerm 8\nThai 8\nThornburgh 8\nTian 8\nTouche 8\nTrek 8\nTribune 8\nTrinity 8\nTrotter 8\nTruman 8\nTseng 8\nType 8\nTypically 8\nU.S 8\nUP 8\nUV 8\nVerwoerd 8\nVinson 8\nVogelstein 8\nVolkswagen 8\nWWII 8\nWachovia 8\nWait 8\nWales 8\nWan 8\nWarwick 8\nWatergate 8\nWeekes 8\nWeekly 8\nWeil 8\nWeiss 8\nWertheim 8\nWheat 8\nWilbur 8\nWis. 
8\nWohlstetter 8\nWoods 8\nXia 8\nYOUR 8\nYankee 8\nYankees 8\nZT 8\nabolished 8\naccuses 8\nacquires 8\nacquitted 8\nacronym 8\naddresses 8\nadhere 8\nadvertisement 8\naerial 8\naggravated 8\nailing 8\naircrafts 8\nairspace 8\nalarming 8\naliens 8\naligned 8\namateur 8\nanchor 8\nanimation 8\nannouncer 8\nanti-drug 8\nanti-takeover 8\napologized 8\narbitragers 8\narbitrator 8\narchrival 8\nascended 8\nashamed 8\nassemble 8\nassessments 8\natmospheric 8\natop 8\natrocities 8\nattributable 8\naudits 8\naveraging 8\nawaits 8\nballet 8\nballooning 8\nbandwagon 8\nbattled 8\nbeers 8\nbeleaguered 8\nbf 8\nbinge 8\nbitterness 8\nbladder 8\nblasted 8\nbleed 8\nbloated 8\nbluntly 8\nboasted 8\nbono 8\nbottled 8\nbottling 8\nboundary 8\nboycott 8\nbracing 8\nbran 8\nbuilder 8\nbulletin 8\nbullied 8\nbureaucrat 8\nburnt 8\nbuzz 8\ncalming 8\ncalories 8\ncancellation 8\ncandles 8\ncarpets 8\ncarts 8\ncatastrophes 8\nceased 8\nceremonies 8\nchairperson 8\nchanted 8\ncharisma 8\ncheating 8\nchefs 8\nchop 8\nchopsticks 8\nchun 8\ncinema 8\ncirculate 8\nclimax 8\ncloset 8\nclosings 8\ncombating 8\ncomedian 8\ncomets 8\ncommands 8\ncommissioners 8\ncommunism 8\ncommuting 8\ncompensated 8\ncompetence 8\ncompliment 8\ncomposer 8\ncompromises 8\nconception 8\nconcession 8\nconciliatory 8\nconcurrent 8\nconfront 8\nconfrontations 8\ncongratulate 8\ncongratulatory 8\nconscientious 8\nconsole 8\nconspiring 8\ncontentious 8\ncontinual 8\ncontraction 8\ncontradiction 8\ncontradictions 8\nconveyed 8\ncooperated 8\ncoordinator 8\ncop 8\ncoping 8\ncopyrighted 8\ncornerstone 8\ncorrelation 8\ncorridor 8\ncosting 8\ncouch 8\ncounters 8\ncoups 8\ncradle 8\ncrafted 8\ncraze 8\ncreator 8\ncreeping 8\ncrushing 8\ncrust 8\ncrying 8\ncurbs 8\ncurse 8\ncurtail 8\ncustomary 8\ncycling 8\ncylinder 8\nda 8\ndaring 8\ndaunting 8\ndazzling 8\ndeceived 8\ndecisively 8\ndecorated 8\ndedication 8\ndefeats 8\ndefends 8\ndeficiencies 8\ndefinitions 8\ndeleted 8\ndental 8\ndepict 8\ndeposed 8\ndepress 
8\ndestabilizing 8\ndestinations 8\ndetention 8\ndetergent 8\ndetermines 8\ndevalued 8\ndiffering 8\ndirectory 8\ndisarm 8\ndisclosures 8\ndiscrepancy 8\ndisguised 8\ndislike 8\ndisregard 8\ndissent 8\ndistracted 8\ndistraction 8\ndistressed 8\ndisturb 8\ndiversify 8\ndiversifying 8\ndiversion 8\ndivorced 8\ndogged 8\ndomination 8\ndownside 8\ndrainage 8\ndumping 8\nduration 8\ndusk 8\ndynamics 8\nearnest 8\neasiest 8\necology 8\nedible 8\neighteen 8\nelastic 8\nelders 8\nelectromechanical 8\nelectronically 8\nemploying 8\nenact 8\nenergies 8\nenforced 8\nenjoyable 8\nenjoyment 8\nentries 8\nenvironments 8\nepisodes 8\nerupt 8\nespionage 8\nestimating 8\nevaporate 8\nexaminer 8\nexceptionally 8\nexpatriate 8\nexperimentation 8\nexporter 8\nexporters 8\nexpresses 8\nextortion 8\nextracted 8\nextraordinarily 8\nfacsimile 8\nfaltered 8\nfamed 8\nfared 8\nfarrier 8\nfearful 8\nfeast 8\nfederation 8\nfeminine 8\nflashing 8\nflashy 8\nflaw 8\nfleeing 8\nflopped 8\nflowed 8\nfog 8\nfoil 8\nforgo 8\nfrank 8\nfreeways 8\nfruitless 8\nfrustrating 8\ngardens 8\ngases 8\ngatherings 8\ngenerator 8\ngestation 8\nglare 8\nglue 8\ngrammatical 8\ngrassroots 8\ngreed 8\ngrief 8\ngroceries 8\ngroom 8\nguarded 8\nguerilla 8\nguts 8\nhandwriting 8\nhappier 8\nharshly 8\nhaunts 8\nhazard 8\nhazards 8\nheadaches 8\nheal 8\nhectares 8\nheed 8\nheroic 8\nhey 8\nhighlighted 8\nhormone 8\nhorn 8\nhsien 8\nhype 8\nhypoglycemia 8\nidealism 8\nideals 8\nidentities 8\nidiots 8\nignores 8\nimpasse 8\nimpoverished 8\nimprisoned 8\nimprisonment 8\ninequality 8\ninfant 8\ninflammatory 8\ninfrastructures 8\ninfringement 8\ninnocents 8\ninquired 8\ninseparable 8\ninsignificant 8\ninstallments 8\ninstituted 8\ninstructor 8\ninstrumental 8\ninsults 8\ninsurgents 8\nintangible 8\ninterface 8\ninterpretations 8\nintervals 8\ninvestigative 8\nirrational 8\njack 8\njoints 8\nju 8\nkidding 8\nlanterns 8\nlaptops 8\nlaughed 8\nlaunches 8\nlays 8\nlegislatures 8\nlettuce 8\nliaison 8\nlifestyles 8\nliquidated 
8\nloath 8\nlobbied 8\nlogistics 8\nloneliness 8\nlonging 8\nlooms 8\nloudly 8\nlurking 8\nm. 8\nmacro 8\nmajors 8\nmao 8\nmarathon 8\nmarital 8\nmaterialized 8\nmeaningless 8\nmediocre 8\nmegawatt 8\nmegawatts 8\nmemoirs 8\nmessenger 8\nmileage 8\nmisstated 8\nmistakenly 8\nmonarchy 8\nmortal 8\nmotel 8\nmotive 8\nmountainous 8\nmourning 8\nmultibillion 8\nnationalities 8\nnearing 8\nnegligible 8\nnest 8\nnicknamed 8\nnightly 8\nnominees 8\nnon 8\nnoteworthy 8\nnoticeably 8\nnotions 8\nnotwithstanding 8\nnurse 8\nobsessed 8\noffend 8\nomnibus 8\nopenings 8\nordinance 8\noutdoor 8\noutdoors 8\noutlawed 8\npacks 8\npainters 8\npaperwork 8\nparachute 8\npave 8\npawn 8\npeculiar 8\npennies 8\nperformer 8\nphotographer 8\npitched 8\npits 8\npivotal 8\nplainly 8\nplains 8\nplentiful 8\nplumbing 8\nplummet 8\npolicyholders 8\npolled 8\npolling 8\npopulist 8\nportrait 8\npreclude 8\nprefab 8\npremiere 8\npremises 8\npressuring 8\npretend 8\npretrial 8\nprevails 8\npreventive 8\nprey 8\npriceless 8\nprincipally 8\nprinters 8\npro-life 8\nprogressive 8\nprohibition 8\npromulgated 8\npronounced 8\nprop 8\nproprietors 8\nprosper 8\nprostitutes 8\nprotestors 8\nprotocol 8\nprovocation 8\nprovoking 8\npub 8\npulls 8\nquicker 8\nquiz 8\nrailroads 8\nrash 8\nratification 8\nrationale 8\nreasoned 8\nrecessions 8\nrecorder 8\nredemptions 8\nreformer 8\nreign 8\nremedies 8\nrenamed 8\nrents 8\nrepairing 8\nresale 8\nresearched 8\nresentment 8\nrestrained 8\nreviving 8\nrevoke 8\nrevoked 8\nrift 8\nrighteousness 8\nrim 8\nrisked 8\nroadway 8\nrocked 8\nroyalty 8\nrub 8\nruining 8\nruler 8\nrulers 8\nrunaway 8\nsafeguards 8\nsalvation 8\nsandwich 8\nscales 8\nsecrecy 8\nsecretaries 8\nsecuring 8\nsedans 8\nseething 8\nseizure 8\nselections 8\nselective 8\nsemiconductors 8\nseminars 8\nsensational 8\nsensibility 8\nsensible 8\nsensitivity 8\nsequel 8\nsetup 8\nseverance 8\nshadows 8\nshards 8\nsharper 8\nshek 8\nshoddy 8\nshrink 8\nsilicon 8\nsings 8\nsisters 8\nsizes 8\nslowest 
8\nslumping 8\nsmells 8\nsnap 8\nsniper 8\nsoaking 8\nsocially 8\nsoften 8\nsoftening 8\nsouthwestern 8\nspa 8\nspacious 8\nspate 8\nsped 8\nspine 8\nspirited 8\nsplendid 8\nspokesmen 8\nsporting 8\nsprings 8\nsquares 8\nstagnant 8\nstarters 8\nstarvation 8\nstatutory 8\nstaunch 8\nsterile 8\nstored 8\nstorms 8\nstrangers 8\nstrategists 8\nstrife 8\nstronghold 8\nstubborn 8\nsubmitting 8\nsuccessors 8\nsuffices 8\nsuitors 8\nsupervisory 8\nsuppressed 8\nsurroundings 8\nsurrounds 8\nsusceptible 8\nsuspending 8\nswaps 8\nswearing 8\nswelled 8\nswim 8\nsymptom 8\ntab 8\ntailored 8\ntailspin 8\ntaller 8\ntame 8\ntarnished 8\ntemp 8\nterminate 8\ntermination 8\ntextbooks 8\ntheatrical 8\nthoughtful 8\nthrive 8\nthroats 8\nthroughput 8\ntiger 8\ntile 8\ntimber 8\ntoads 8\ntomb 8\ntorpedo 8\ntoss 8\ntoughest 8\ntraffickers 8\ntrainer 8\ntransferring 8\ntransparency 8\ntransplants 8\ntravels 8\ntrendy 8\ntriangle 8\ntriggering 8\ntriumph 8\ntrot 8\ntrough 8\ntumbling 8\nturbine 8\nturbulence 8\ntzu 8\nu 8\nuncover 8\nundergoing 8\nunderneath 8\nunderscored 8\nunfinished 8\nunilateral 8\nunilaterally 8\nuninsured 8\nunnamed 8\nunpleasant 8\nunscathed 8\nunwarranted 8\nupdating 8\nv. 8\nvacated 8\nvaluing 8\nvanished 8\nvaried 8\nvastly 8\nvendor 8\nvenue 8\nvenues 8\nverification 8\nverify 8\nvibrant 8\nvice-director 8\nvice-president 8\nvictorious 8\nvintage 8\nvisions 8\nvisitor 8\nvisually 8\nvomiting 8\nwad 8\nwarehouses 8\nwarmly 8\nwatches 8\nwaving 8\nwedge 8\nweighed 8\nwelcomes 8\nwhiskey 8\nwholeheartedly 8\nwhooping 8\nwinding 8\nwindshield 8\nwit 8\nworkstation 8\nworms 8\nworthless 8\nwrangling 8\nwrecked 8\nyarn 8\n!!!!!! 
7\n'60s 7\n0.05 7\n0.6 7\n1,100 7\n1,300 7\n1,850 7\n1.16 7\n1.27 7\n1.32 7\n1.65 7\n1.71 7\n1.875 7\n10.77 7\n11.7 7\n11.8 7\n11:00 7\n12.3 7\n12.6 7\n12.7 7\n12.9 7\n137 7\n14.2 7\n14.7 7\n148 7\n157 7\n160,000 7\n18.95 7\n1930 7\n1947 7\n1951 7\n1954 7\n1955 7\n1956 7\n1957 7\n2.46 7\n2/32 7\n20.5 7\n20.9 7\n2014 7\n2015 7\n21.5 7\n229 7\n247 7\n265 7\n275 7\n3.16 7\n3.18 7\n3.25 7\n3.52 7\n3.69 7\n325 7\n350,000 7\n38,000 7\n380 7\n39,000 7\n40th 7\n5/32 7\n50.3 7\n5:00 7\n6.90 7\n6000 7\n7. 7\n7/32 7\n700,000 7\n7777 7\n8.02 7\n8.03 7\n8.04 7\n8.06 7\n8.30 7\n8.32 7\n8.33 7\n80's 7\n9.1 7\n9.75 7\n90,000 7\n950 7\n?! 7\nA.C. 7\nARE 7\nAccept 7\nAccepted 7\nAga 7\nAgence 7\nAgricultural 7\nAhmanson 7\nAid 7\nAlley 7\nAlmighty 7\nAlternatively 7\nAmazon 7\nAmdura 7\nAmnesty 7\nAndrea 7\nAnna 7\nAnybody 7\nAnyhow 7\nAquino 7\nAsahi 7\nAssuming 7\nAtlas 7\nAudit 7\nAuthor 7\nBEST 7\nBTW 7\nBaath 7\nBaba 7\nBaby 7\nBahamas 7\nBai 7\nBalkans 7\nBanPonce 7\nBarber 7\nBarclays 7\nBarrett 7\nBelli 7\nBeretta 7\nBerlitz 7\nBernstein 7\nBetty 7\nBi 7\nBilly 7\nBlackstone 7\nBombay 7\nBoucher 7\nBox 7\nBruno 7\nBryan 7\nBuddha 7\nBuddy 7\nBuffalo 7\nBullock 7\nBurt 7\nBusch 7\nCNBC 7\nCalled 7\nCane 7\nCard 7\nCarrot 7\nCash 7\nCatherine 7\nCedars 7\nChambers 7\nChanges 7\nChapman 7\nCharge 7\nCherokee 7\nChilan 7\nChinaNews 7\nClaudio 7\nClient 7\nComair 7\nComex 7\nComments 7\nCommerciale 7\nComptroller 7\nConcord 7\nCongressman 7\nConnie 7\nConsultants 7\nConvenience 7\nConversely 7\nConvertible 7\nCook 7\nCooper 7\nCrandall 7\nCraven 7\nCustomer 7\nCypriot 7\nCzechoslovakia 7\nD.C 7\nDR. 
7\nDaikin 7\nDalai 7\nDanxia 7\nDassault 7\nDayton 7\nDearborn 7\nDeborah 7\nDecatur 7\nDeclining 7\nDeep 7\nDeere 7\nDemler 7\nDenise 7\nDenmark 7\nDeveloping 7\nDi 7\nDiamond 7\nDiscovision 7\nDixie 7\nDoman 7\nDorfman 7\nDover 7\nDoyle 7\nDrew 7\nDrugs 7\nEMBA 7\nEPC 7\nEbola 7\nEddington 7\nEduard 7\nEhrlich 7\nEid 7\nEkco 7\nElders 7\nElectricity 7\nElian 7\nElliott 7\nEmbeke 7\nEmployers 7\nEnforcement 7\nEnvironment 7\nEnvironmentalists 7\nEnviropact 7\nEquitec 7\nErich 7\nEspectador 7\nEstimates 7\nEta 7\nExpect 7\nExpo 7\nExpressway 7\nEye 7\nFacts 7\nFalconbridge 7\nFang 7\nFantastic 7\nFashion 7\nFatima 7\nFayiz 7\nFernandez 7\nFiber 7\nFold 7\nFoot 7\nForget 7\nFraser 7\nFridays 7\nFruit 7\nFun 7\nGOODWYN 7\nGambia 7\nGarratt 7\nGarrison 7\nGeller 7\nGenetic 7\nGibson 7\nGiorgio 7\nGlazer 7\nGong 7\nGoupil 7\nGoya 7\nGreene 7\nGuardian 7\nGuards 7\nGuide 7\nGuild 7\nGutfreund 7\nH.D. 7\nH.F. 7\nHERE 7\nHaditha 7\nHamid 7\nHandicapped 7\nHands 7\nHarlem 7\nHarper 7\nHarthi 7\nHawaiian 7\nHebrew 7\nHelmut 7\nHenceforth 7\nHold 7\nHomeland 7\nHomes 7\nHon 7\nHorn 7\nHsiu 7\nHualien 7\nHugh 7\nHummer 7\nHun 7\nHural 7\nHyatt 7\nHyman 7\nIMO 7\nIV 7\nIncludes 7\nIncluding 7\nInflation 7\nIng 7\nInnopac 7\nInstitution 7\nIrvine 7\nIvy 7\nJ.P. 7\nJFK 7\nJW 7\nJakarta 7\nJeffs 7\nJewelers 7\nJobs 7\nJu 7\nJui 7\nJunior 7\nKane 7\nKangyo 7\nKaradzic 7\nKeeping 7\nKeong 7\nKerr 7\nKharek 7\nKhomeini 7\nKirk 7\nKoo 7\nKuang 7\nKurt 7\nLOVE 7\nLahoud 7\nLakes 7\nLan 7\nLatvia 7\nLaurence 7\nLavuras 7\nLed 7\nLei 7\nLeonardo 7\nLeventhal 7\nLewinsky 7\nLow 7\nLt. 7\nLuckily 7\nLungkeng 7\nLyondell 7\nMAC 7\nMIT 7\nMP3 7\nMX 7\nMY 7\nMack 7\nMaikang 7\nManaging 7\nManchester 7\nManeki 7\nManic 7\nManson 7\nMap 7\nMasako 7\nMatsushita 7\nMcNamee 7\nMehl 7\nMengistu 7\nMerchants 7\nMessenger 7\nMeyer 7\nMidler 7\nMillennium 7\nMine 7\nMinerals 7\nMinn. 
7\nMonte 7\nMosbacher 7\nMozah 7\nMozilla 7\nMugabe 7\nMurdoch 7\nNCAA 7\nNEVER 7\nNPC 7\nNadeau 7\nNatWest 7\nNetworks 7\nNewspaper 7\nNigeria 7\nNingbo 7\nNoSprawlTax.org 7\nNoVa 7\nNoida 7\nNon-Bondad 7\nNora 7\nNoranda 7\nNormal 7\nNorwood 7\nNova 7\nNowak 7\nOakes 7\nOdds 7\nOlder 7\nOlson 7\nOpera 7\nOpponents 7\nOrlando 7\nOrr 7\nOttoman 7\nOwners 7\nP 7\nPFP 7\nPICC 7\nPaqueta 7\nPatent 7\nPaula 7\nPay 7\nPersons 7\nPetersburg 7\nPetrochemical 7\nPhelps 7\nPhilips 7\nPhysical 7\nPopulation 7\nPostville 7\nPowers 7\nPresidency 7\nPrivatization 7\nProducers 7\nPrograms 7\nProvincial 7\nPublishers 7\nPurchase 7\nPutting 7\nQaida 7\nQarase 7\nQinzhou 7\nQu 7\nQuaker 7\nRafael 7\nReady 7\nRedmond 7\nReflecting 7\nRefugees 7\nRegional 7\nReich 7\nReitman 7\nRelevant 7\nRetired 7\nReturn 7\nReunification 7\nReuter 7\nRevised 7\nRhode 7\nRod 7\nRomania 7\nRosenthal 7\nRudolph 7\nRunning 7\nRupert 7\nRuy 7\nS.A 7\nSE 7\nSKF 7\nSahara 7\nSatan 7\nSaud 7\nSaul 7\nSauls 7\nSave 7\nSchaeffer 7\nScholars 7\nSchroder 7\nScot 7\nSeagate 7\nSeeing 7\nSenegal 7\nSenegalese 7\nSeng 7\nSettlement 7\nShapiro 7\nShea 7\nShip 7\nShops 7\nSihanouk 7\nSilent 7\nSinger 7\nSioux 7\nSite 7\nSki 7\nSmithsonian 7\nSoldiers 7\nSomewhere 7\nSounds 7\nSoup 7\nSpecter 7\nStephanie 7\nStrategy 7\nStubHub 7\nStudios 7\nSugar 7\nSupporters 7\nSurvey 7\nSussex 7\nSverdlovsk 7\nSwimming 7\nT.V. 
7\nTHAT 7\nTMT 7\nTamim 7\nTariq 7\nTate 7\nTeacher 7\nTeachers 7\nThane 7\nTharp 7\nTheory 7\nThi 7\nThick 7\nTiananmen 7\nTianchi 7\nTodd 7\nTommy 7\nTransaction 7\nTyler 7\nUDN 7\nUFO 7\nUPS 7\nUSB 7\nUchikoshi 7\nUm 7\nUniversities 7\nUpper 7\nUtility 7\nUtsumi 7\nVAX 7\nValdez 7\nVentures 7\nVictoria 7\nVince 7\nViolence 7\nVioletta 7\nVoting 7\nVoyager 7\nWAS 7\nWachter 7\nWarburg 7\nWard 7\nWedd 7\nWedtech 7\nWerner 7\nWhittington 7\nWild 7\nWillie 7\nWoolworth 7\nWow 7\nWyss 7\nXining 7\nXtra 7\nYa 7\nYates 7\nYield 7\nYin 7\nYunhong 7\nZeta 7\nZionists 7\nZoete 7\nabducted 7\nabiding 7\naborted 7\nabusive 7\nacademy 7\naccessory 7\naccompanies 7\nacute 7\nadapting 7\nadministrators 7\nadmired 7\nadvertised 7\nadvocating 7\naforementioned 7\nafternoons 7\naggregate 7\naids 7\nairbag 7\nalienating 7\nambulances 7\nammonia 7\nancestors 7\nancestral 7\nangels 7\nangles 7\nangrily 7\nanimosity 7\nannounces 7\nannuities 7\nannuity 7\nanswering 7\nantiquities 7\napplauded 7\nappointing 7\napproves 7\napproving 7\napproximate 7\nardent 7\narmaments 7\narmor 7\narose 7\narresting 7\narrests 7\naspiring 7\nassembling 7\natom 7\natoms 7\nattractiveness 7\nauthenticity 7\nauthoritarian 7\nauthoritative 7\navenues 7\nawait 7\nawake 7\nbacterium 7\nbalancing 7\nbalked 7\nballistic 7\nbamboo 7\nbarn 7\nbaseline 7\nbathing 7\nbattling 7\nbeans 7\nbeast 7\nbeds 7\nbeforehand 7\nbehavioral 7\nbent 7\nbets 7\nbikers 7\nbillings 7\nbirthplace 7\nbiz 7\nblasts 7\nblatant 7\nblending 7\nblonde 7\nboarding 7\nbolts 7\nbooths 7\nborrower 7\nbouncing 7\nbounds 7\nbourgeois 7\nbra 7\nbrewer 7\nbribes 7\nbrigade 7\nbroadband 7\nbroadcasted 7\nbroadcasts 7\nbroadened 7\nbucket 7\nbudding 7\nbudgeted 7\nbuffer 7\nbuffet 7\nbulb 7\nbunker 7\nburner 7\nbutcher 7\nbutterflies 7\ncalmly 7\ncampaigned 7\ncanned 7\ncanvas 7\ncartoons 7\ncarvings 7\ncensorship 7\ncensus 7\ncenterpiece 7\ncentimeters 7\ncentralized 7\ncertainty 7\ncharacterize 7\ncherry 7\nchiefs 7\nchilly 7\nchin 7\nchlorine 
7\nchooses 7\nclaimants 7\nclarification 7\nclashed 7\nclassmate 7\ncleaners 7\nclinched 7\nclue 7\ncm 7\nco-chief 7\ncoastline 7\ncoating 7\ncollapses 7\ncommunicated 7\ncomplement 7\ncompounded 7\nconceal 7\nconcealed 7\nconcealing 7\nconceived 7\nconcurrently 7\nconfesses 7\nconfidentiality 7\nconfronting 7\nconfuse 7\ncongested 7\ncongressmen 7\nconquer 7\nconsulate 7\ncontainment 7\ncontamination 7\ncontemplate 7\ncontending 7\ncontrasts 7\nconverter 7\nconvict 7\ncoral 7\ncosmic 7\ncranes 7\ncreature 7\ncrewmembers 7\ncritically 7\ncritique 7\ncross-border 7\ncurbing 7\ncured 7\ncurfew 7\ncurry 7\ncurtailed 7\ncute 7\ndamper 7\ndanced 7\ndarling 7\ndeceptive 7\ndecorative 7\ndeduct 7\ndefaulted 7\ndefect 7\ndefender 7\ndefinitively 7\ndelicious 7\ndemolition 7\ndensely 7\ndentists 7\ndepicts 7\ndeprive 7\ndescend 7\ndescriptions 7\ndesigners 7\ndespair 7\ndetection 7\ndevelopmental 7\ndevise 7\ndevoid 7\ndialysis 7\ndime 7\ndining 7\ndip 7\ndirective 7\ndirectories 7\ndisadvantaged 7\ndisc 7\ndisgruntled 7\ndismantle 7\ndispose 7\ndistinctions 7\ndistort 7\ndistributing 7\ndistributions 7\ndisturbed 7\ndivergent 7\ndolls 7\ndominates 7\ndoubtful 7\ndowngrade 7\ndownhill 7\ndownright 7\ndownstairs 7\ndrawbacks 7\ndug 7\nduo 7\ndwelling 7\ne-generation 7\nearnestly 7\neats 7\nechoed 7\neclectic 7\neducating 7\nego 7\nelephant 7\nelevated 7\nelevator 7\nembraces 7\nembracing 7\neminent 7\nemotionally 7\nemphatically 7\nencompassing 7\nencounters 7\nendeavor 7\nenforcing 7\nentitlement 7\nentrepreneurial 7\neroded 7\nevaluations 7\nevenings 7\nevoke 7\nevolve 7\nexacerbated 7\nexaminations 7\nexcesses 7\nexhibited 7\nexhibitions 7\nexiled 7\nexorbitant 7\nexpands 7\nexperimenting 7\nexploiting 7\nexplored 7\nexpressions 7\nextensions 7\neyebrows 7\nfaint 7\nfamously 7\nfanfare 7\nfertilizers 7\nfiduciary 7\nfilmed 7\nfilter 7\nfinalized 7\nfinanciers 7\nfirsthand 7\nflamboyant 7\nflames 7\nfleets 7\nfoam 7\nfocal 7\nfooting 7\nfootsteps 7\nforensic 7\nforgiveness 
7\nformidable 7\nforthcoming 7\nfortification 7\nfostering 7\nfragrant 7\nfreshly 7\nfry 7\nfundamentalists 7\nfunerals 7\nfungus 7\ngargantuan 7\ngenerators 7\ngenerously 7\ngentleman 7\ngenuinely 7\ngestures 7\nglamorous 7\nglobally 7\ngolfer 7\ngossip 7\ngrabbing 7\ngraphic 7\ngraves 7\ngravity 7\ngreatness 7\ngripped 7\ngrossly 7\nguaranty 7\nguiding 7\nguitar 7\ngunshot 7\ngymnasts 7\nhalls 7\nhammered 7\nhandlers 7\nharness 7\nhated 7\nheap 7\nhears 7\nheats 7\nheck 7\nhectic 7\nhegemony 7\nherein 7\nhijacked 7\nhinder 7\nhindered 7\nhistorian 7\nhollow 7\nhonorary 7\nhonoring 7\nhospitality 7\nhospitalized 7\nhugs 7\nhypothetical 7\nicicle 7\nidols 7\nillustrated 7\nimagery 7\nimaginary 7\nimitate 7\nimmense 7\nimpaired 7\nimpatient 7\nimplicated 7\nimpress 7\nincapable 7\ninciting 7\nincompatible 7\nincorporates 7\nincur 7\nindexation 7\nindictments 7\nineffective 7\ninexperienced 7\ninfections 7\ninfectious 7\ninfertility 7\ninflicted 7\ninflows 7\ninfluencing 7\ninhabitants 7\ninning 7\ninnocence 7\ninnovate 7\ninsects 7\ninserted 7\ninspiring 7\ninstincts 7\ninstrumentation 7\ninsulation 7\nintellect 7\nintensely 7\nintensifying 7\ninteractions 7\ninteractive 7\ninternally 7\ninternationalization 7\ninterpreter 7\nintersection 7\nintriguing 7\ninvention 7\nissuer 7\nivory 7\njazz 7\njealous 7\njeopardize 7\njeopardy 7\njitters 7\njittery 7\njokingly 7\njudged 7\njustifies 7\nkiss 7\nkuo 7\nlamp 7\nlandlord 7\nlane 7\nlawful 7\nleaning 7\nlecturer 7\nlest 7\nlevy 7\nliberalism 7\nlightblue 7\nlighten 7\nliking 7\nliner 7\nlingerie 7\nliquidate 7\nlooming 7\nloosely 7\nlooting 7\nlounge 7\nlubricants 7\nlump 7\nluncheon 7\nlungs 7\nmainstay 7\nmalicious 7\nmanagerial 7\nmaneuvering 7\nmanpower 7\nmarriages 7\nmastered 7\nmaternal 7\nmeager 7\nmechanics 7\nmedian 7\nmercenaries 7\nmid-1990s 7\nmimic 7\nmini 7\nmired 7\nmodify 7\nmotivate 7\nmundane 7\nmuse 7\nmuseums 7\nmushroom 7\nmuster 7\nnanny 7\nnarrower 7\nnationalistic 7\nnationals 7\nnegligent 
7\nnervously 7\nnewborn 7\nnewsletters 7\nnitrogen 7\nnongovernmental 7\nnonrecurring 7\nnorms 7\nnosed 7\nnoticeable 7\nnovelist 7\nnurseries 7\nnurses 7\nnurture 7\noat 7\nobstruct 7\noceans 7\noperatives 7\noutfit 7\noutlet 7\noutperformed 7\noverpasses 7\noverruns 7\noverthrown 7\noverwhelmed 7\noxide 7\npalace 7\npall 7\npalm 7\npanacea 7\nparamilitary 7\npassionate 7\npassions 7\npatron 7\npaved 7\npeaks 7\nperceive 7\npercentages 7\nperch 7\npermissible 7\npertussis 7\nperverse 7\npetitions 7\nphoned 7\nphrases 7\nphysicians 7\nphysique 7\npins 7\npiston 7\npitcher 7\nplayoffs 7\nplotting 7\npodium 7\npoop 7\nportraying 7\npossessing 7\npossessions 7\npractitioners 7\npre-trial 7\nprecarious 7\nprecaution 7\npredicament 7\npredominantly 7\npreferring 7\nprepayment 7\npresided 7\nprevalent 7\nprized 7\nprobation 7\nprocessor 7\nprogrammers 7\nprogresses 7\nprominence 7\nprominently 7\npropelled 7\nprophet 7\nproposition 7\nproprietary 7\nprosecutorial 7\nprovocative 7\nprowess 7\npunishing 7\npuppy 7\nquagmire 7\nqueries 7\nquestionnaire 7\nradioactive 7\nrainfall 7\nraiser 7\nransom 7\nrapeseed 7\nratified 7\nravaged 7\nreaffirmed 7\nreassessment 7\nreassure 7\nrebellion 7\nrebuffed 7\nrecalling 7\nrecapture 7\nreceiver 7\nrecognizable 7\nrecruitment 7\nreformers 7\nrefuel 7\nrefueled 7\nrehabilitation 7\nremodeling 7\nremotely 7\nremoves 7\nrenting 7\nreparations 7\nreplaces 7\nreproduce 7\nreputations 7\nreservation 7\nresidences 7\nresin 7\nrespectful 7\nrespecting 7\nrestitution 7\nretinoblastoma 7\nrhythmic 7\nriches 7\nridicule 7\nrifle 7\nrisking 7\nroaring 7\nrobbing 7\nrobot 7\nrocking 7\nrounded 7\nrubbish 7\nsaga 7\nsalmonella 7\nsanctioned 7\nsandwiches 7\nsatanism 7\nsavage 7\nscheduling 7\nscholarship 7\nschooling 7\nseaport 7\nseasoned 7\nseating 7\nseismic 7\nsemiannual 7\nsen 7\nsensation 7\nsensor 7\nsentencing 7\nsequester 7\nservicing 7\nsettings 7\nsevered 7\nsexes 7\nshaft 7\nshipbuilding 7\nshoreline 7\nshortened 7\nshortfall 
7\nshredded 7\nsiblings 7\nsidewalk 7\nsimmering 7\nsimultaneous 7\nskidded 7\nskiers 7\nskull 7\nslammed 7\nslashing 7\nslaughtered 7\nsliced 7\nslides 7\nsmarter 7\nsmiles 7\nsoup 7\nsoybeans 7\nspanning 7\nspared 7\nspewing 7\nspontaneous 7\nsprang 7\nsprawling 7\nsquad 7\nsquadron 7\nstadiums 7\nstagnation 7\nstall 7\nstandardized 7\nsticker 7\nstink 7\nstirring 7\nstitches 7\nstraightforward 7\nstrait 7\nstraw 7\nstretches 7\nstringent 7\nstripes 7\nstumbling 7\nsubjective 7\nsubsidizing 7\nsui 7\nsummarize 7\nsump 7\nsunday 7\nsung 7\nsunken 7\nsupplementary 7\nsuppress 7\nsurgeon 7\nsurreal 7\nsurrealism 7\nswallow 7\nswaying 7\nsword 7\nsympathize 7\ntacked 7\ntag 7\ntailor 7\ntally 7\ntaping 7\ntearing 7\ntechnologically 7\nteenager 7\ntelecommunication 7\ntemplate 7\ntenants 7\nterrain 7\nterrifying 7\ntertiary 7\ntestifies 7\ntexture 7\nthicker 7\nthinner 7\nthirteen 7\nthreads 7\nthru 7\nthwart 7\nthwarted 7\ntick 7\ntimid 7\ntopiary 7\ntournament 7\ntowers 7\ntracing 7\ntractors 7\ntraits 7\ntraps 7\ntravelling 7\ntreaties 7\ntremor 7\ntribute 7\ntruths 7\ntumultuous 7\ntuning 7\ntunnel 7\ntwisted 7\ntying 7\num 7\nuncommon 7\nundemocratic 7\nundergone 7\nunderscore 7\nundo 7\nuniquely 7\nunraveled 7\nunresolved 7\nunscrupulous 7\nunsolicited 7\nv 7\nvaguely 7\nvariation 7\nve 7\nvelocity 7\nverse 7\nverses 7\nviability 7\nvigor 7\nviolin 7\nvisas 7\nvogue 7\nvolunteered 7\nvon 7\nvulgar 7\nwaist 7\nwander 7\nward 7\nwarmer 7\nwatchers 7\nwatered 7\nwaved 7\nweary 7\nweeping 7\nwhack 7\nwhereby 7\nwhichever 7\nwhopping 7\nwidens 7\nwiping 7\nwitch 7\nwithstanding 7\nwitnessing 7\nwoods 7\nworded 7\nworkings 7\nworsen 7\nya 7\nyelled 7\nyelling 7\n....... 
6\n0.19 6\n0.8 6\n03/15/2001 6\n1.30 6\n1.37 6\n1.375 6\n1.43 6\n1.48 6\n1.82 6\n10.1 6\n10.8 6\n10.9 6\n10:30 6\n11.25 6\n11.3 6\n11.4 6\n11.9 6\n114 6\n117 6\n1200 6\n121 6\n122 6\n124 6\n13.2 6\n13/16 6\n134 6\n14.8 6\n142.75 6\n143 6\n15.5 6\n16,000 6\n16.2 6\n166 6\n168 6\n17.2 6\n17.50 6\n177 6\n178 6\n18.65 6\n18.7 6\n182 6\n19.6 6\n19.7 6\n1926 6\n1939 6\n1945 6\n198 6\n2.625 6\n205 6\n207 6\n21.3 6\n21.7 6\n216 6\n22.8 6\n220,000 6\n224 6\n23.5 6\n235 6\n24.9 6\n27.9 6\n29/32 6\n365 6\n370 6\n375 6\n38.5 6\n39.55 6\n4.75 6\n4.92 6\n40.1 6\n42.5 6\n425,000 6\n470 6\n4:00 6\n506 6\n51.com 6\n550,000 6\n610 6\n62,000 6\n66.7 6\n7.3 6\n7.75 6\n7.95 6\n8.42 6\n80486 6\n9.3 6\n9.8 6\n9/32 6\n<$BlogBacklinkAuthor$> 6\n<$BlogBacklinkDateTime$> 6\nA.G. 6\nACCOUNT 6\nAD 6\nAMERICAN 6\nARCHIBALD 6\nAbbie 6\nAcadia 6\nAcer 6\nAdditional 6\nAdvancing 6\nAdvisor 6\nAfterwards 6\nAgent 6\nAlbania 6\nAlbanian 6\nAlgeria 6\nAllawi 6\nAmbrosiano 6\nAmtrak 6\nAmy 6\nAnalysis 6\nAnalysis_0712 6\nAnbo 6\nAncient 6\nAnniversary 6\nArabized 6\nArkla 6\nArmenian 6\nArtists 6\nAsea 6\nAshurst 6\nAssad 6\nAssurance 6\nAssurances 6\nAthletics 6\nAtkins 6\nAustria 6\nAutomatic 6\nAutumn 6\nAvondale 6\nBC 6\nBE 6\nBMW 6\nBacon 6\nBad 6\nBaglioni 6\nBailey 6\nBakker 6\nBaldwin 6\nBancroft 6\nBarclay 6\nBare 6\nBarida 6\nBarnicle 6\nBarr 6\nBasically 6\nBasir 6\nBat 6\nBausch 6\nBeam 6\nBebear 6\nBeefeater 6\nBeghin 6\nBelida 6\nBelieve 6\nBenefit 6\nBeranka 6\nBerkshire 6\nBert 6\nBiao 6\nBids 6\nBinggang 6\nBiotechnology 6\nBirthday 6\nBishop 6\nBlitzer 6\nBoise 6\nBologna 6\nBooks 6\nBosnian 6\nBoveri 6\nBoyer 6\nBremeha 6\nBrent 6\nBroderick 6\nBronner 6\nBrouwer 6\nBrunei 6\nBuckley 6\nBuddhism 6\nBuildings 6\nBulgarian 6\nBumiputra 6\nBunker 6\nBurke 6\nBurmah 6\nBusinesses 6\nBuy 6\nByang 6\nC.J. 
6\nCACI 6\nCEOs 6\nCPI 6\nCafe 6\nCai 6\nCalloway 6\nCampaign 6\nCanyon 6\nCaracas 6\nCarey 6\nCarolinas 6\nCarolyn 6\nCarta 6\nCasey 6\nCathcart 6\nCela 6\nChadwick 6\nChampions 6\nChampionships 6\nChanghua 6\nChestman 6\nChieh 6\nChile 6\nChristies 6\nCigna 6\nCilcorp 6\nCiting 6\nCitizen 6\nClashes 6\nClick 6\nCoates 6\nCocom 6\nCoda 6\nCoffee 6\nCombined 6\nCombustion 6\nComdek 6\nComponents 6\nComsat 6\nConasupo 6\nConcern 6\nConfederation 6\nCongressmen 6\nConsciousness 6\nConsortium 6\nConstellation 6\nCorsica 6\nCost 6\nCox 6\nCriminal 6\nCruise 6\nCustomers 6\nCutler 6\nCynthia 6\nD'Arcy 6\nDDB 6\nDLJ 6\nDRAMs 6\nDaPuzzo 6\nDaffynition 6\nDalkon 6\nDarin 6\nDeVoe 6\nDead 6\nDempsey 6\nDepartments 6\nDescription 6\nDeseret 6\nDetrex 6\nDevelopers 6\nDiabetes 6\nDisaster 6\nDisneyland 6\nDispatch 6\nDjibouti 6\nDoctor 6\nDodd 6\nDollars 6\nDome 6\nDonovan 6\nDonuts 6\nDouble 6\nDowney 6\nDozens 6\nDr 6\nDublin 6\nDulles 6\nDunkin' 6\nDurkin 6\nEMR 6\nENERGY 6\nENRON.XLS 6\nESPN 6\nEVER 6\nEcological 6\nEidsmo 6\nEighth 6\nEngineers 6\nEnserch 6\nEurodollar 6\nEurostar 6\nEverest 6\nEvidence 6\nExpansion 6\nFDIC 6\nFK 6\nFT 6\nFTV 6\nFYI 6\nFaberge 6\nFailure 6\nFairfield 6\nFalse 6\nFeb 6\nFelix 6\nFerranti 6\nFigure 6\nFind 6\nFinkelstein 6\nFinnair 6\nFirstSouth 6\nFiscal 6\nFit 6\nFleming 6\nFlintoff 6\nFlom 6\nFog 6\nFollow 6\nFong 6\nForman 6\nFourthly 6\nFremont 6\nFujisawa 6\nFurukawa 6\nFuzhou 6\nGASB 6\nGDNF 6\nGISB 6\nGMAC 6\nGOOD 6\nGREAT 6\nGarth 6\nGatward 6\nGeorgetown 6\nGifford 6\nGilbert 6\nGilmore 6\nGlascoff 6\nGoran 6\nGradually 6\nGreatest 6\nGregor 6\nGrey 6\nGriffin 6\nGround 6\nGuidelines 6\nGuterman 6\nH.H. 
6\nHANO 6\nHAVE 6\nHE 6\nHabib 6\nHammack 6\nHamster 6\nHang 6\nHans 6\nHarley 6\nHasbro 6\nHaskins 6\nHathaway 6\nHeadliners 6\nHedges 6\nHeidi 6\nHeights 6\nHigher 6\nHodge 6\nHoelzer 6\nHolt 6\nHonor 6\nHsien 6\nHubby 6\nHuichun 6\nIAFP 6\nIFI 6\nIG 6\nIP 6\nITN 6\nIchi 6\nImo 6\nIndependents 6\nInforian 6\nInn 6\nInouye 6\nInspection 6\nInterestingly 6\nIsler 6\nIstat 6\nItalians 6\nIvanov 6\nJana 6\nJanice 6\nJayark 6\nJeremy 6\nJerome 6\nJi 6\nJiabao 6\nJinan 6\nJohnsee 6\nJosh 6\nKawasaki 6\nKeller 6\nKent 6\nKirghizian 6\nKissinger 6\nKobe 6\nKohlberg 6\nKovtun 6\nKung 6\nKuroda 6\nKyle 6\nL.P. 6\nLME 6\nLauderdale 6\nLavelle 6\nLegent 6\nLegislature 6\nLeno 6\nLesk 6\nLinear 6\nLinkou 6\nLipton 6\nLiterature 6\nLocation 6\nLooks 6\nLowe 6\nLower 6\nLucy 6\nLying 6\nLyons 6\nMEH-risk 6\nMJ 6\nMLX 6\nMS 6\nMacedonia 6\nMacintosh 6\nMagna 6\nMakes 6\nMama 6\nManagua 6\nManitoba 6\nMarc 6\nMargin 6\nMargins 6\nMarianne 6\nMarie 6\nMarin 6\nMarley 6\nMarous 6\nMaterial 6\nMatt 6\nMatthews 6\nMcCarren 6\nMcDonalds 6\nMcGee 6\nMead 6\nMedal 6\nMeiring 6\nMel 6\nMeridian 6\nMessage 6\nMetall 6\nMiles 6\nMilwaukee 6\nMines 6\nMining 6\nMorgenzon 6\nMorocco 6\nMorristown 6\nMoss 6\nMountains 6\nMunicipals 6\nMuppets 6\nN'T 6\nN.C 6\nN.H. 6\nNBA 6\nNEWS 6\nNGO 6\nNablus 6\nNagoya 6\nNakamura 6\nNam 6\nNanjie 6\nNasiriyah 6\nNasri 6\nNear 6\nNed 6\nNegotiation 6\nNeverland 6\nNewman 6\nNewsletter 6\nNicolas 6\nNikon 6\nNouri 6\nNov 6\nNucor 6\nNye 6\nO'Connell 6\nO'Conner 6\nOECD 6\nObservatory 6\nObservers 6\nOccupation 6\nOgarkov 6\nOhbayashi 6\nOkla. 
6\nOmni 6\nOrdinary 6\nOtto 6\nOutput 6\nOval 6\nOz 6\nPPA 6\nPR 6\nPage 6\nParade 6\nParkway 6\nPaterson 6\nPeasants 6\nPeking 6\nPemberton 6\nPenh 6\nPersonally 6\nPeruvian 6\nPetco 6\nPhilosophy 6\nPhnom 6\nPhyllis 6\nPicture 6\nPilgrim 6\nPilot 6\nPiper 6\nPlanners 6\nPlayer 6\nPlenary 6\nPolo 6\nPons 6\nPopo 6\nPosner 6\nPoughkeepsie 6\nPrague 6\nPravda 6\nPrentice 6\nPrimax 6\nPrinciples 6\nProductions 6\nQinghai 6\nQuanzhou 6\nQuickview 6\nQuina 6\nR.H. 6\nR.I. 6\nRAPHAEL 6\nRafale 6\nRahman 6\nRahn 6\nRail 6\nRainbow 6\nRaleigh 6\nRatners 6\nReader 6\nRecession 6\nReid 6\nRelease 6\nReporters 6\nReps. 6\nRescue 6\nRex 6\nRifenburgh 6\nRio 6\nRita 6\nRoberti 6\nRodino 6\nRoll 6\nRude 6\nRudy 6\nRun 6\nRush 6\nS&Ls 6\nSMS 6\nSafavid 6\nSagan 6\nSalim 6\nSanwa 6\nScenario 6\nSchneider 6\nSe-hwa 6\nSeaman 6\nSelling 6\nSeminar 6\nSens. 6\nSets 6\nShaheen2005 6\nShan 6\nSharm 6\nShere 6\nShield 6\nShin 6\nSide 6\nSidhpur 6\nSidley 6\nSilkies 6\nSinablog 6\nSinging 6\nSinorama 6\nSister 6\nSit 6\nSomali 6\nSpecifically 6\nSpielberg 6\nSsangYong 6\nStalinist 6\nStarbucks 6\nStark 6\nStarr 6\nStars 6\nStep 6\nStephens 6\nStern 6\nStrange 6\nSupply 6\nSymbol 6\nTCH 6\nTHEN 6\nTMD 6\nTNT 6\nTPA 6\nTaken 6\nTariff 6\nTeagan 6\nTech 6\nTenders 6\nTenneco 6\nTens 6\nTeresa 6\nTerrell 6\nTesoro 6\nTest 6\nTheatre 6\nThirty 6\nTieying 6\nTitanic 6\nToubro 6\nTourist 6\nTrail 6\nTrain 6\nTraub 6\nTravis 6\nTreatment 6\nTrecker 6\nTroops 6\nTrudeau 6\nTrustcorp 6\nTruth 6\nTrying 6\nTue 6\nTwelve 6\nUkrainian 6\nUltimately 6\nUnable 6\nUnderwriters 6\nUnemployment 6\nUnilab 6\nUnix 6\nUpjohn 6\nVCU 6\nVickers 6\nVictory 6\nViolin 6\nVirgin 6\nVoices 6\nVolvo 6\nVoters 6\nWILL 6\nWITH 6\nWalnut 6\nWangma 6\nWant 6\nWedding 6\nWestridge 6\nWheeler 6\nWhitman 6\nWholesale 6\nWilfred 6\nWindows 6\nWitness 6\nWonderful 6\nWords 6\nWrap 6\nWrigley 6\nWriters 6\nYamatake 6\nYetnikoff 6\nYorker 6\nZarqawi 6\nZheng 6\nZi 6\nZinni 6\nZoo 6\nZsa 6\nabduction 6\nabsentee 6\nacademia 
6\nacademics 6\nacquirer 6\nacrylic 6\nadaptation 6\naddicts 6\nadditionally 6\nadditions 6\nadjuster 6\nadminister 6\nadmissions 6\naesthetic 6\nafar 6\nafflicted 6\nafloat 6\naggregates 6\nairwaves 6\naisle 6\nal_shaheen2005@hotmail.com 6\nalarms 6\namassed 6\nambiguity 6\nambulance 6\namidst 6\nandrogynous 6\nanecdotes 6\nannoyed 6\nanti-war 6\nappalled 6\nappease 6\nappliance 6\narbitrary 6\narchives 6\narsenals 6\nass 6\nassassinate 6\nassaults 6\nasserting 6\nassessed 6\nassorted 6\nassortment 6\nasteroids 6\nastonishing 6\nastronomical 6\nasymmetrical 6\natomic 6\nattackers 6\nattends 6\nauthorize 6\navoidance 6\nbackfire 6\nbackwards 6\nbadminton 6\nbalances 6\nbanana 6\nbark 6\nbarracks 6\nbarren 6\nbasics 6\nbedding 6\nbeverage 6\nbilled 6\nbiographer 6\nbiologists 6\nbishop 6\nbites 6\nblankets 6\nblasting 6\nblogger 6\nblogs 6\nbloodshed 6\nblurred 6\nboast 6\nboiling 6\nbooking 6\nboot 6\nbottleneck 6\nbounty 6\nbow 6\nbowed 6\nboxy 6\nboycotted 6\nbracket 6\nbreached 6\nbreeze 6\nbrethren 6\nbrew 6\nbrighter 6\nbrightest 6\nbrink 6\nbroaden 6\nbrow 6\nbrunt 6\nbuckled 6\nbullion 6\nbullying 6\nbumper 6\nbundle 6\nburgeoning 6\nburger 6\nburgers 6\nburying 6\nbusily 6\nbutler 6\ncachets 6\ncalamity 6\ncalculator 6\ncallers 6\ncalligraphers 6\ncandle 6\ncandor 6\ncannon 6\ncapacities 6\ncarpeting 6\ncartel 6\ncarve 6\ncemetery 6\ncentre 6\nchambers 6\nchampions 6\ncheaply 6\ncheckbook 6\ncheckpoints 6\nchic 6\nchill 6\nchoppy 6\nchuang 6\nchunks 6\nclarity 6\nclassification 6\ncleric 6\nclerics 6\ncleverly 6\nclusters 6\nco-author 6\ncoats 6\ncoffins 6\ncogeneration 6\ncoherent 6\ncola 6\ncollaborated 6\ncombustion 6\ncommendable 6\ncommended 6\ncommercially 6\ncommonplace 6\ncommuter 6\ncommutes 6\ncompartment 6\ncomplexes 6\ncomplicate 6\ncomplied 6\ncomprehension 6\ncomprise 6\ncomprised 6\ncompulsory 6\ncomrade 6\nconditional 6\nconducive 6\ncongregation 6\nconnectors 6\nconnects 6\nconservatorship 6\nconsisted 6\nconspired 6\nconsulted 6\ncontacted 
6\ncontemplated 6\ncontender 6\ncontinental 6\ncontinuity 6\ncontraceptive 6\ncontractual 6\ncontroller 6\ncookies 6\ncounselor 6\ncountered 6\ncouplets 6\ncoupling 6\ncovenants 6\ncoveted 6\ncow 6\ncozy 6\ncracker 6\ncracking 6\ncrashes 6\ncreators 6\ncrest 6\ncue 6\ncuisine 6\nculmination 6\ncurator 6\ncurtain 6\ncutbacks 6\ndancers 6\ndaylight 6\ndaytime 6\ndebating 6\ndecency 6\ndecker 6\ndecoration 6\ndeepest 6\ndeer 6\ndeferring 6\ndefining 6\ndeflator 6\ndefraud 6\ndegenerate 6\ndementia 6\ndenomination 6\ndepart 6\ndeparting 6\ndepicting 6\nderail 6\ndescended 6\ndestined 6\ndetached 6\ndetainees 6\ndetector 6\ndetrimental 6\ndiagnosis 6\ndialing 6\ndiarrhea 6\ndictators 6\ndigits 6\ndilute 6\ndimension 6\ndirectorate 6\ndisabilities 6\ndisband 6\ndisciplines 6\ndiscovering 6\ndiscredited 6\ndiscriminating 6\ndiscriminatory 6\ndisdain 6\ndisgraceful 6\ndisgusting 6\ndismantled 6\ndisposed 6\ndisposing 6\ndissidents 6\ndistances 6\ndistilled 6\ndistracting 6\ndividing 6\ndopamine 6\ndot 6\ndownload 6\ndragons 6\ndrowned 6\ndrums 6\ndude 6\ndunes 6\nduplicate 6\ndye 6\neases 6\necho 6\neerie 6\nelasticity 6\nelbow 6\nelectors 6\nelevate 6\nembroiled 6\nemergencies 6\nendanger 6\nendorsing 6\nenrich 6\nenriching 6\nenroll 6\nenrongss.xls 6\nensured 6\nentice 6\nenvelope 6\nequivalents 6\nerased 6\nerected 6\nerratic 6\nerroneous 6\nescaping 6\nestablishes 6\neuphoria 6\nevade 6\nevaluated 6\nexcel 6\nexcessively 6\nexclusivity 6\nexert 6\nexhaustion 6\nexhaustive 6\nexpatriates 6\nexpectancy 6\nexpedited 6\nexploits 6\nexpulsion 6\nextinction 6\nextraction 6\nextradited 6\neyewitness 6\nf 6\nfacets 6\nfamiliarity 6\nfarmland 6\nfatigue 6\nfaxed 6\nfeedlots 6\nfeeds 6\nfeminist 6\nfences 6\nfengshui 6\nfermentation 6\nfertile 6\nfestivities 6\nfeudal 6\nfiller 6\nfills 6\nfilmmakers 6\nfitted 6\nflair 6\nflashes 6\nfledged 6\nflee 6\nfleeting 6\nflier 6\nfloats 6\nfloppy 6\nflourish 6\nfluent 6\nflung 6\nfoes 6\nfolly 6\nfoothold 6\nforbid 6\nforbids 6\nfore 
6\nforeigner 6\nforgetting 6\nforging 6\nformulas 6\nformulation 6\nfounders 6\nframed 6\nfreeing 6\nfucking 6\nfueling 6\nfunnel 6\nfunneled 6\nfuse 6\nfuss 6\ngarlic 6\ngarments 6\ngaze 6\ngenocide 6\ngeographically 6\ngeography 6\ngerrymandering 6\ngigantic 6\nginger 6\nglamour 6\nglider 6\ngloom 6\nglow 6\ngoddess 6\ngorilla 6\ngorillas 6\ngovernorate 6\ngrandiose 6\ngrandparents 6\ngrenades 6\ngripes 6\ngrounding 6\nguarding 6\nguideline 6\nguides 6\ngulf 6\ngullible 6\ngunned 6\nguru 6\nhaikou 6\nhalal 6\nhamburger 6\nhammer 6\nhamper 6\nhangs 6\nhardness 6\nharms 6\nhassle 6\nhatch 6\nhatched 6\nhates 6\nhating 6\nhaunted 6\nhaunting 6\nheadquartered 6\nhealthier 6\nheated 6\nheater 6\nheavens 6\nheirs 6\nhelm 6\nhemorrhaging 6\nhenceforth 6\nhens 6\nherbal 6\nherd 6\nhid 6\nhijacking 6\nhinge 6\nhinges 6\nhog 6\nhomer 6\nhomosexuals 6\nhorrific 6\nhorrors 6\nhorsepower 6\nhose 6\nhousewives 6\nhugely 6\nhurricanes 6\nhush 6\nhymn 6\nhypocrisy 6\nidentifies 6\nillusions 6\nimmoral 6\nimperative 6\nimperialist 6\nimporters 6\nincense 6\ninception 6\ninconceivable 6\ninconclusive 6\nincorporate 6\nindifference 6\nindifferent 6\nindigenous 6\ninfiltration 6\ninfinite 6\ninflict 6\ninforming 6\ninfringe 6\ninfringed 6\ninjecting 6\ninscribed 6\ninscriptions 6\ninsomnia 6\ninstinct 6\ninsulated 6\ninsulating 6\nintellectually 6\ninterfering 6\nintermediary 6\nintertwined 6\nintervening 6\ninterviewing 6\nintimacy 6\nintimidating 6\nintimidation 6\nintrigue 6\nintruder 6\nintrusion 6\nintuition 6\ninvading 6\ninvent 6\ninvoice 6\ninvoices 6\njackets 6\njams 6\njealously 6\njealousy 6\njoys 6\njudgement 6\njug 6\njuries 6\nkg 6\nkilns 6\nkilts 6\nkindly 6\nknives 6\nknocks 6\nknowingly 6\nku 6\nkudos 6\nlance 6\nlandslides 6\nlaudable 6\nleery 6\nlegions 6\nleveled 6\nliberated 6\nlimbo 6\nlimp 6\nlin 6\nlip 6\nlistings 6\nlocalized 6\nlocking 6\nlogged 6\nlousy 6\nluobo 6\nlured 6\nmacho 6\nmacroeconomic 6\nmadness 6\nmafia 6\nmagistrate 6\nmainlanders 6\nmains 
6\nmanifest 6\nmanifestations 6\nmansion 6\nmantra 6\nmanually 6\nmarching 6\nmaritime 6\nmarketable 6\nmasterpiece 6\nmaterially 6\nmates 6\nmathematical 6\nmayors 6\nmeasurement 6\nmega-issues 6\nmegabit 6\nmemorable 6\nmerchandising 6\nmethodology 6\nmettle 6\nmicroarray 6\nmicroprocessors 6\nmicroscopic 6\nmidlife 6\nmidsized 6\nmilitarily 6\nminimills 6\nministerial 6\nmink 6\nmins 6\nmint 6\nmiracles 6\nmiscellaneous 6\nmisconduct 6\nmisdemeanor 6\nmisfortune 6\nmisguided 6\nmismanagement 6\nmisses 6\nmisunderstanding 6\nmobilized 6\nmodeling 6\nmoisture 6\nmomentarily 6\nmonument 6\nmonuments 6\nmornings 6\nmosque 6\nmotions 6\nmotorists 6\nmu 6\nmushroomed 6\nmyth 6\nmyths 6\nnaczelnik 6\nnagging 6\nnarrator 6\nnarrows 6\nneedle 6\nneocons 6\nnervousness 6\nneurological 6\nneutralize 6\nneutrons 6\nnicer 6\nnineties 6\nninety 6\nnon-profit 6\nnonexistent 6\nnonprofit 6\nnostalgic 6\nnotoriously 6\nnucleus 6\noasis 6\nobjectivity 6\nobligated 6\nobsession 6\noccupations 6\noddly 6\nomit 6\noperative 6\noptimization 6\noptimize 6\nore 6\norgan 6\noriginals 6\northodox 6\noutage 6\noutcomes 6\novercrowded 6\noversubscribed 6\noverturn 6\noverturned 6\npaced 6\npageant 6\npaired 6\npalms 6\npanicked 6\nparadise 6\nparameters 6\nparcels 6\nparlors 6\npasswords 6\npaste 6\npathetic 6\npatriot 6\npause 6\npendant 6\npenetrating 6\npensions 6\nperil 6\npervert 6\nperverted 6\npesetas 6\npetrochemicals 6\nphotographers 6\npigment 6\npink 6\npinpoint 6\npioneers 6\npitching 6\nplaid 6\nplateau 6\npleadings 6\npleasing 6\nploy 6\npocketed 6\npoem 6\npoems 6\npoker 6\npole 6\npolish 6\nponds 6\nportray 6\npositioning 6\npost-crash 6\npoultry 6\npounding 6\npowder 6\npraising 6\npremature 6\nprepayments 6\nprescribe 6\nprobes 6\nproclaim 6\nproficient 6\nprofiles 6\nprofited 6\nproletariat 6\nprolong 6\npromoter 6\npromoters 6\npropane 6\nproportions 6\nprosecuting 6\nprostitute 6\nprototype 6\nprotracted 6\nproverbial 6\nprovident 6\nprovoke 6\nproximity 6\npubs 
6\npullout 6\npuppet 6\npurple 6\npurses 6\npuzzled 6\npyramid 6\nquarterback 6\nquieted 6\nradically 6\nradius 6\nrails 6\nrailways 6\nraisers 6\nrand 6\nrang 6\nrazed 6\nrazor 6\nre-elected 6\nrealizes 6\nrealms 6\nreassured 6\nrebalancing 6\nrebate 6\nrecipes 6\nrecital 6\nreckon 6\nredefine 6\nreeling 6\nreferees 6\nrefiners 6\nregains 6\nreinvest 6\nreinvested 6\nrelics 6\nreligions 6\nreluctantly 6\nremembering 6\nremuneration 6\nrenovation 6\nreoffered 6\nreopening 6\nreorganized 6\nreplay 6\nreproduced 6\nrescuers 6\nresemblance 6\nresembles 6\nrespectability 6\nrestart 6\nrestraints 6\nretreating 6\nrevamping 6\nrevise 6\nrevolutionaries 6\nrewarding 6\nriddled 6\nrigged 6\nrigorous 6\nrivalry 6\nriverbank 6\nrobots 6\nrogue 6\nrollers 6\nrum 6\nrunner 6\nruthless 6\nsack 6\nsaite 6\nsalad 6\nsalvaged 6\nsatire 6\nsaturated 6\nsavvy 6\nscraps 6\nscream 6\nscreams 6\nscrewed 6\nseafood 6\nsecondly 6\nsect 6\nseedy 6\nseizures 6\nselectively 6\nsemester 6\nsenses 6\nsequence 6\nsettles 6\nseventy 6\nseverity 6\nsewage 6\nshade 6\nshaved 6\nshiny 6\nshoots 6\nshouts 6\nshowroom 6\nshrimp 6\nshrinkage 6\nshutting 6\nsighs 6\nsimilarity 6\nsimplest 6\nsincerity 6\nsixteen 6\nskyrocketing 6\nsleek 6\nslew 6\nsludge 6\nsmelling 6\nsmiled 6\nsmokers 6\nsnail 6\nsociologist 6\nsocket 6\nsolicit 6\nsolicited 6\nsophistication 6\nsorting 6\nsoundly 6\nspas 6\nspectators 6\nspeculating 6\nsphere 6\nspheres 6\nspinal 6\nspiral 6\nspit 6\nsplits 6\nsprung 6\nspurring 6\nspurt 6\nsquads 6\nstain 6\nstained 6\nstairs 6\nstamina 6\nstandoff 6\nstaring 6\nstark 6\nstartups 6\nstately 6\nstatues 6\nsteamed 6\nsteeped 6\nstern 6\nstimulators 6\nstint 6\nstrained 6\nstranded 6\nstrengthens 6\nstrings 6\nstroll 6\nstructuring 6\nstuffed 6\nstumble 6\nsubdivision 6\nsubpoena 6\nsubscriber 6\nsubstitutes 6\nsummed 6\nsunlight 6\nsunshine 6\nsuperconductors 6\nsuppressing 6\nsupremacy 6\nsurfing 6\nsurvives 6\nswallowed 6\nsweetheart 6\nsweetness 6\nswore 6\nt 6\nteas 
6\ntechnician 6\nteens 6\ntelescope 6\nteller 6\ntempting 6\ntenfold 6\nterrified 6\ntextbook 6\nthoughtless 6\nthrone 6\nthumbs 6\ntiles 6\ntilt 6\ntoken 6\ntolerant 6\ntolls 6\ntony 6\ntoothpaste 6\ntortured 6\ntouted 6\ntow 6\ntowering 6\ntractor 6\ntrademarks 6\ntragedies 6\ntrailed 6\ntranscript 6\ntranslations 6\ntranslators 6\ntransnational 6\ntreacherous 6\ntreason 6\ntritium 6\ntruthful 6\ntubes 6\ntuned 6\nturbines 6\nturtle 6\ntyphoon 6\ntyranny 6\nulterior 6\nunanticipated 6\nunavoidable 6\nundeniable 6\nundesirable 6\nunduly 6\nunhealthy 6\nunheard 6\nunjust 6\nunmistakable 6\nunravel 6\nunraveling 6\nunsafe 6\nunsuccessfully 6\nunsupported 6\nupdates 6\nuproar 6\nupsurge 6\nurethane 6\nvaries 6\nvarying 6\nveggies 6\nvehemently 6\nveiled 6\nvenom 6\nverdicts 6\nverified 6\nvictimized 6\nviolently 6\nvipers 6\nviral 6\nvirtues 6\nvisibly 6\nvividly 6\nvocal 6\nvodka 6\nvolcanic 6\nvoluminous 6\nvows 6\nwaging 6\nwaived 6\nwalkways 6\nwarmed 6\nwarrior 6\nwasteful 6\nwaterworks 6\nwatts 6\nweeklong 6\nweighs 6\nwherein 6\nwhipped 6\nwicked 6\nwidth 6\nwields 6\nwills 6\nwiring 6\nwiser 6\nwithholding 6\nwonderfully 6\nworkshops 6\nwrapping 6\nwrist 6\nyacht 6\nyam 6\nyanked 6\nyi 6\nyoungsters 6\nyul 6\nyuppie 6\n~~~ 6\n!. 
5\n'70s 5\n'90s 5\n'98 5\n,, 5\n0.03 5\n0.10 5\n0.24 5\n0.60 5\n000 5\n01/26/2001 5\n05/25/2001 5\n07 5\n07/17/2000 5\n1,600 5\n1.08 5\n1.13 5\n1.40 5\n1.52 5\n1.54 5\n1.55 5\n1.5765 5\n1.58 5\n1.78 5\n1.8340 5\n1.8353 5\n1.8355 5\n1.8485 5\n1.86 5\n1.8667 5\n10/32 5\n11,000 5\n11/32 5\n116 5\n11:30 5\n128 5\n130,000 5\n138 5\n14.3 5\n141.52 5\n141.70 5\n142 5\n142.10 5\n15.2 5\n153 5\n154 5\n156 5\n156.7 5\n1561 5\n158 5\n16.1 5\n161 5\n163 5\n17/32 5\n176 5\n180,000 5\n184 5\n186 5\n19.2 5\n19.95 5\n191 5\n1928 5\n1932 5\n1941 5\n1953 5\n1968 5\n2,100 5\n2.58 5\n2.60 5\n2.77 5\n2.79 5\n20.125 5\n2011 5\n2020 5\n21.4 5\n21.8 5\n210 5\n215 5\n219 5\n22.6 5\n221 5\n23,000 5\n23.8 5\n245 5\n25.8 5\n252 5\n255 5\n26.23 5\n2638.73 5\n2653.28 5\n2659.22 5\n2662.91 5\n2683.20 5\n27.1 5\n273 5\n285 5\n290 5\n2:00 5\n2:52 5\n30.2 5\n32.6 5\n345 5\n36.6 5\n386 5\n390 5\n3:00 5\n3:30 5\n4/32 5\n406 5\n410 5\n425 5\n430 5\n450,000 5\n49.9 5\n490 5\n5,500 5\n5.25 5\n5.75 5\n504 5\n582 5\n589 5\n6. 5\n6/32 5\n62.875 5\n6:00 5\n6:30 5\n7.03 5\n7.15 5\n7.37 5\n7.42 5\n7.51 5\n7.60 5\n763736 5\n8.10 5\n8.27 5\n8.35 5\n8.47 5\n8.6 5\n8.85 5\n9.80 5\n99.75 5\n:-LRB- 5\nAFL 5\nAGIP 5\nAPF's 5\nARCO 5\nAWB 5\nAbby 5\nAbe 5\nAbility 5\nAbramoff 5\nAceh 5\nAckerman 5\nAdmissions 5\nAdvisors 5\nAffair 5\nAfricans 5\nAikman 5\nAla. 5\nAlar 5\nAlbany 5\nAlbuquerque 5\nAlcee 5\nAlcohol 5\nAlishan 5\nAllday 5\nAlonso 5\nAlvin 5\nAmericas 5\nAmes 5\nAmidst 5\nAmidu 5\nAmira 5\nAmis 5\nAmr 5\nAnacomp 5\nAnaheim 5\nAndrews 5\nApache 5\nApollo 5\nArabism 5\nArco 5\nAristide 5\nAriz 5\nArk 5\nArmistice 5\nAshton 5\nAska 5\nAsquith 5\nAssessment 5\nAssociate 5\nAttorneys 5\nAuckland 5\nAutomated 5\nAuvil 5\nAwards 5\nAway 5\nB.V. 5\nBMEC 5\nBOT 5\nBRIEFS 5\nBT 5\nBTA 5\nBaden 5\nBaird 5\nBaken 5\nBakes 5\nBancorp. 
5\nBard 5\nBarrie 5\nBarron 5\nBashar 5\nBeauty 5\nBeaver 5\nBeecham 5\nBeer 5\nBelow 5\nBennet 5\nBerbera 5\nBeta 5\nBeth 5\nBias 5\nBickwit 5\nBills 5\nBiung 5\nBlinder 5\nBlockbuster 5\nBlood 5\nBlum 5\nBlunt 5\nBody 5\nBoesel 5\nBoies 5\nBoon 5\nBoone 5\nBorden 5\nBottom 5\nBoulder 5\nBoy 5\nBozell 5\nBradford 5\nBreaking 5\nBrewery 5\nBroadBeach 5\nBrody 5\nBroker 5\nBrokerage 5\nBrookings 5\nBrought 5\nBuchanan 5\nBuckingham 5\nBundesbank 5\nBurkina 5\nBurning 5\nBusinessland 5\nButler 5\nCDL 5\nCFC 5\nCIO 5\nCKS 5\nCPAs 5\nCPR 5\nCTS 5\nCablevision 5\nCaijing 5\nCalls 5\nCamden 5\nCampus 5\nCanaan 5\nCanelo 5\nCanola 5\nCarla 5\nCarried 5\nCarriers 5\nCarroll 5\nCascade 5\nCave 5\nCawthorn 5\nCemetery 5\nCentre 5\nCetus 5\nChallenge 5\nChangchun 5\nChangsha 5\nChengyang 5\nChez 5\nChihshanyen 5\nChirac 5\nChronicle 5\nChunhua 5\nClaiborne 5\nClarke 5\nClaus 5\nClifford 5\nClinic 5\nClintons 5\nClouds 5\nCockburn 5\nColson 5\nComes 5\nCommerzbank 5\nCommune 5\nCompare 5\nComplete 5\nCompletion 5\nConcerto 5\nConflict 5\nConfucian 5\nContinue 5\nContract 5\nCore 5\nCorporations 5\nCostco 5\nCounsel 5\nCovenant 5\nCreative 5\nCrowley 5\nCubs 5\nCustom 5\nCzech 5\nDD 5\nDFC 5\nDISCO 5\nDae 5\nDalton 5\nDame 5\nDana 5\nDang 5\nDanny 5\nDarth 5\nDavidson 5\nDawn 5\nDebt 5\nDelegates 5\nDelegation 5\nDeltec 5\nDerrick 5\nDesert 5\nDesmond 5\nDevil 5\nDied 5\nDiet 5\nDilama 5\nDirectorate 5\nDivine 5\nDodgers 5\nDonna 5\nDonnelley 5\nDorgan 5\nDoubleday 5\nDrake 5\nDriving 5\nDubinsky 5\nDudley 5\nDuncan 5\nDupont 5\nDylan 5\nEDS 5\nEEI 5\nEG&G 5\nETA 5\nEaster 5\nEbensburg 5\nEconomist 5\nEgan 5\nEggs 5\nElaine 5\nElbertson 5\nElections 5\nElectron 5\nElkhorn 5\nEmerging 5\nEmployee 5\nEmployment 5\nEndowment 5\nEnvironmentalism 5\nEssentially 5\nEstee 5\nEstimate 5\nEthnic 5\nEvents 5\nExperimental 5\nFIFA 5\nFOOD 5\nFabio 5\nFacing 5\nFahd 5\nFahlawi 5\nFarewell 5\nFarmington 5\nFaso 5\nFat 5\nFavorite 5\nFeeling 5\nFeldman 5\nFerdinand 5\nFileNet 5\nFilipinos 
5\nFischer 5\nFlakes 5\nFleischmann 5\nFloor 5\nFly 5\nFocus 5\nFormosan 5\nForrester 5\nForty 5\nForward 5\nFreeh 5\nFreind 5\nFreres 5\nFri 5\nFriendship 5\nFuh 5\nFujimori 5\nFujin 5\nGIO 5\nGRAINS 5\nGa 5\nGala 5\nGallagher 5\nGargan 5\nGauloises 5\nGavlak 5\nGeraldo 5\nGilead 5\nGill 5\nGloria 5\nGlory 5\nGogh 5\nGolan 5\nGolf 5\nGomez 5\nGoodall 5\nGoogle+ 5\nGorazde 5\nGoulart 5\nGovernments 5\nGovernorate 5\nGreenKel 5\nGreenville 5\nGrenada 5\nGroom 5\nGrowing 5\nGruberova 5\nGuerrillas 5\nGuizhou 5\nGuofeng 5\nHD 5\nHager 5\nHaitian 5\nHalal 5\nHammond 5\nHand 5\nHani 5\nHappiness 5\nHarbors 5\nHarlan 5\nHarvey 5\nHat 5\nHatch 5\nHeadline 5\nHealthy 5\nHei 5\nHeinz 5\nHells 5\nHepatitis 5\nHerald 5\nHerman 5\nHeron 5\nHesse 5\nHibor 5\nHicks 5\nHindu 5\nHines 5\nHissa 5\nHit 5\nHizbullah 5\nHongbin 5\nHouellebecq 5\nHowell 5\nHowick 5\nHuaqing 5\nHub 5\nHubert 5\nHusseiniya 5\nHut 5\nHyperCard 5\nIFAR 5\nINDUSTRIES 5\nISDA 5\nISO 5\nIdeally 5\nIkegai 5\nImmediately 5\nImmunex 5\nIncreasingly 5\nInfluenced 5\nInstitut 5\nInstrument 5\nIntellectual 5\nInternazionale 5\nInto 5\nIrrational 5\nIsaac 5\nIsabella 5\nIsle 5\nItel 5\nJSP 5\nJacky 5\nJamieson 5\nJanette 5\nJarrett 5\nJerell 5\nJericho 5\nJiahua 5\nJiaju 5\nJinana 5\nJiu 5\nJoanne 5\nJohnny 5\nJoshua 5\nJoyce 5\nKam 5\nKamyao 5\nKan. 5\nKanhal 5\nKeizai 5\nKhaled 5\nKhalifa 5\nKhobar 5\nKidwa 5\nKimberly 5\nKinder 5\nKirghizia 5\nKnopf 5\nKnow 5\nKnudson 5\nKoenig 5\nKowloon 5\nKozinski 5\nKramer 5\nKurzweil 5\nLOI 5\nLOS 5\nLSI 5\nLa. 
5\nLack 5\nLandry 5\nLanguage 5\nLaughter 5\nLaureate 5\nLaws 5\nLay 5\nLeaders 5\nLeaving 5\nLeavitt 5\nLegion 5\nLegislation 5\nLeipzig 5\nLighting 5\nLima 5\nLingus 5\nLippens 5\nLitvack 5\nLiver 5\nLloyds 5\nLocals 5\nLocated 5\nLomb 5\nLonrho 5\nLorie 5\nLost 5\nLouise 5\nLowell 5\nLucky 5\nLujayn 5\nLuster 5\nLuxembourg 5\nLybrand 5\nMMBTU 5\nMRT 5\nMTV 5\nMUST 5\nMacanese 5\nMafia 5\nMagazines 5\nMaggie 5\nMagic 5\nMaier 5\nMall 5\nManufacturer 5\nMarco 5\nMarion 5\nMarketers 5\nMarriott 5\nMarvel 5\nMarwick 5\nMastergate 5\nMatra 5\nMcCarthy 5\nMcGill 5\nMcKinney 5\nMcfedden 5\nMeantime 5\nMelinda 5\nMellen 5\nMelloan 5\nMerchant 5\nMesopotamia 5\nMianyang 5\nMiaoli 5\nMicro 5\nMidas 5\nMilken 5\nMillions 5\nMilpitas 5\nMilunovich 5\nMinella 5\nMingo 5\nMingxia 5\nMinority 5\nMissile 5\nMitsukoshi 5\nMo 5\nMo. 5\nModesto 5\nMon 5\nMonet 5\nMonotheism 5\nMontagu 5\nMoses 5\nMovieline 5\nMunich 5\nMushkat 5\nMyanmaran 5\nNAM 5\nNCNB 5\nNOTE 5\nNY 5\nNadir 5\nNamibia 5\nNatalie 5\nNatick 5\nNationalities 5\nNeeds 5\nNeko 5\nNewly 5\nNewmark 5\nNgoc 5\nNickelodeon 5\nNihon 5\nNoel 5\nNon-no 5\nNope 5\nNotice 5\nNotre 5\nNutritional 5\nO 5\nOUT 5\nOak 5\nOffices 5\nOkinawa 5\nOlin 5\nOnline 5\nOracle 5\nOrient 5\nOslo 5\nOvcharenko 5\nOwn 5\nOy 5\nPEREZ 5\nPNB 5\nPOW 5\nPRICE 5\nPalmer 5\nPanisse 5\nPapua 5\nParagraph 5\nParenthood 5\nParisians 5\nParsow 5\nParticularly 5\nParts 5\nPassword 5\nPattison 5\nPaxus 5\nPearson 5\nPeat 5\nPegasus 5\nPeggy 5\nPerez 5\nPerry 5\nPetroleos 5\nPhD 5\nPhillip 5\nPig 5\nPilots 5\nPine 5\nPlains 5\nPlastics 5\nPoe 5\nPole 5\nPoll 5\nPoodle 5\nPooh 5\nPositive 5\nPreliminary 5\nPressure 5\nPretoria 5\nPride 5\nProsecution 5\nProsecutor 5\nProtestants 5\nProvident 5\nPryor 5\nPump 5\nPurnick 5\nPurple 5\nQQ 5\nQataris 5\nQi 5\nQinglin 5\nQuarterly 5\nQuds 5\nQuick 5\nQuinlan 5\nQuixote 5\nQusaybi 5\nRachel 5\nRahu 5\nRamadan 5\nRamirez 5\nRamstein 5\nRange 5\nRaw 5\nReading 5\nRecovery 5\nRecruit 5\nRecycling 5\nRegal 
5\nRegister 5\nReinvestment 5\nReliance 5\nRelief 5\nRemics 5\nReproductive 5\nResidential 5\nRestaurants 5\nRevenues 5\nReverend 5\nRianta 5\nRider 5\nRiegle 5\nRiyals 5\nRoach 5\nRodham 5\nRolls 5\nRosa 5\nRosenberg 5\nRostenkowski 5\nRoyce 5\nRuder 5\nRumors 5\nRusso 5\nRuvolo 5\nS.p.A. 5\nSAS 5\nSHE 5\nSHV 5\nSOYBEANS 5\nSTORES 5\nSafeco 5\nSahaf 5\nSailing 5\nSally 5\nSantiago 5\nSasser 5\nSawyer 5\nSchroders 5\nSciMed 5\nScientology 5\nSe 5\nSector 5\nSediq 5\nSegundo 5\nSelah 5\nSelect 5\nSelf 5\nSend 5\nSent 5\nSept 5\nSeuss 5\nSeventeen 5\nSex 5\nShady 5\nShaffer 5\nShah 5\nShale 5\nShao 5\nSharpshooter 5\nShatoujiao 5\nShattuck 5\nShimone 5\nShipbuilding 5\nShippers 5\nShore 5\nSidney 5\nSiegel 5\nSigns 5\nSikes 5\nSilkie 5\nSinatra 5\nSitara 5\nSkin 5\nSmart 5\nSmithKline 5\nSmurfit 5\nSolana 5\nSoldier 5\nSolomon 5\nSonata 5\nSoros 5\nSound 5\nSouthland 5\nSouthmark 5\nSpirit 5\nSpringfield 5\nSprint 5\nStanza 5\nStennis 5\nStovall 5\nStrike 5\nSubject 5\nSunburn 5\nSundance 5\nSununu 5\nSupervisors 5\nSuppose 5\nSurvivors 5\nSuzuki 5\nSyracuse 5\nTGS 5\nTRW 5\nTX 5\nTacoma 5\nTaft 5\nTeamsters 5\nTechnically 5\nTemptation 5\nTexan 5\nThan 5\nTheodore 5\nThermo 5\nThrelkeld 5\nTina 5\nTomlin 5\nTong 5\nToni 5\nTorch 5\nTough 5\nToy 5\nTraviata 5\nTriangle 5\nTribe 5\nTroy 5\nTuchman 5\nTurbine 5\nTurnpike 5\nTurns 5\nTuzmen 5\nTwice 5\nUAA 5\nUNMIS 5\nUSI 5\nUT 5\nUVA 5\nUday 5\nUnice 5\nUno 5\nVISA 5\nVentura 5\nVerdi 5\nVerne 5\nVictorian 5\nViera 5\nVisit 5\nVitro 5\nVon 5\nVote 5\nWANT 5\nWASHINGTON 5\nWE 5\nWGBH 5\nWaPo 5\nWakeman 5\nWalk 5\nWallace 5\nWalmart 5\nWalters 5\nWastewater 5\nWed 5\nWeirton 5\nWestamerica 5\nWesterners 5\nWharton 5\nWhose 5\nWichita 5\nWildlife 5\nWilmer 5\nWilmington 5\nWindow 5\nWine 5\nWinners 5\nWinnie 5\nWoodland 5\nWoodrow 5\nWriting 5\nWrong 5\nWyndham 5\nX-ray 5\nX-rays 5\nXianglong 5\nXiao 5\nXiaoguang 5\nY 5\nYahoo! 
5\nYan'an 5\nYangpu 5\nYeng 5\nYik 5\nYoga 5\nYokohama 5\nYong 5\nYorkers 5\nZaman 5\nZawahiri 5\nZhuqin 5\nZijin 5\nZipper 5\nZones 5\nZoran 5\nZou 5\n_ 5\na. 5\nabbreviated 5\nabound 5\nabrasive 5\naccession 5\nacclaimed 5\naccommodated 5\naccountant 5\naccumulatively 5\nachieves 5\nacquainted 5\nadhering 5\nadministrations 5\nadmirable 5\nadmittedly 5\nadolescents 5\nadopts 5\nadvancers 5\nadvertiser 5\naffiliation 5\naffirmation 5\naffirmed 5\naffliction 5\nafforded 5\naiding 5\nailment 5\naka 5\nalarmed 5\nalcoholic 5\nalimony 5\nallay 5\nalleys 5\nallotment 5\nalphabet 5\nalternate 5\namateurs 5\namazement 5\nambivalent 5\nambush 5\namenities 5\namortization 5\namply 5\nanalyses 5\nanarchy 5\nanchors 5\nanguish 5\nanonymity 5\nante 5\nanti-abortionists 5\nantibodies 5\nantiques 5\nanxiously 5\nappointees 5\narch 5\narenas 5\nartwork 5\naspiration 5\nassert 5\nassertions 5\nastronomers 5\nasylum 5\natheist 5\nattaching 5\nattachments 5\nattire 5\nauditor 5\nauditors 5\nausterity 5\nautobiography 5\nautomation 5\navant 5\navian 5\nawakened 5\nawarding 5\nbackfired 5\nbacklogs 5\nbackroom 5\nbacterial 5\nbalk 5\nballoons 5\nballoting 5\nbarrage 5\nbarter 5\nbash 5\nbasing 5\nbasking 5\nbass 5\nbath 5\nbatter 5\nbattleground 5\nbeak 5\nbeam 5\nbean 5\nbeard 5\nbearers 5\nbeautifully 5\nbeeper 5\nbeggars 5\nbegging 5\nbehaviour 5\nbelievers 5\nbend 5\nbending 5\nbesieged 5\nbickering 5\nbind 5\nblanks 5\nblaze 5\nbleach 5\nblended 5\nblindness 5\nblockbuster 5\nblunted 5\nbodyguards 5\nbogged 5\nbogus 5\nbolstering 5\nbolt 5\nbombshell 5\nbondad 5\nbookings 5\nboon 5\nboosters 5\nboulevard 5\nbpd 5\nbrawl 5\nbreathable 5\nbride 5\nbriefcase 5\nbroadcasters 5\nbroadening 5\nbrochure 5\nbrothels 5\nbrushed 5\nbrutality 5\nbrutally 5\nbuckle 5\nbuddies 5\nbuffeted 5\nbuffs 5\nbulbs 5\nbuns 5\nburdensome 5\nburglary 5\nbusted 5\nbutterfly 5\nbye 5\nbypass 5\ncabin 5\ncabinets 5\ncadmium 5\ncadre 5\ncakes 5\ncalculates 5\ncaliente 5\ncalligrapher 5\ncamping 5\ncampuses 
5\ncanada 5\ncancellations 5\ncancerous 5\ncanning 5\ncaps 5\ncardboard 5\ncareless 5\ncarriage 5\ncarrot 5\ncartoonist 5\ncartridge 5\ncaustic 5\ncessation 5\nchairwoman 5\nchassis 5\nche 5\ncheerfully 5\ncheerleader 5\nchef 5\nchemist 5\nchemists 5\nchests 5\nchicks 5\nchina 5\nchiro 5\nchlorofluorocarbons 5\ncider 5\ncircular 5\ncirculatory 5\ncircumspect 5\nclamp 5\nclassrooms 5\ncleans 5\nclerks 5\nclicking 5\nclientele 5\ncliff 5\nclimber 5\ncling 5\ncloak 5\nclobbered 5\nclogged 5\nclowns 5\ncluster 5\nco-founder 5\nco-operations 5\nco-production 5\nco-sponsored 5\ncockpit 5\ncoffers 5\ncoincides 5\ncollateralized 5\ncollections 5\ncollector 5\ncollects 5\ncolonel 5\ncolonization 5\ncolours 5\ncom 5\ncombinations 5\ncommanded 5\ncommemorate 5\ncommemorating 5\ncommercialization 5\ncommercialize 5\ncommits 5\ncompassionate 5\ncompatriot 5\ncompeted 5\ncomplicity 5\ncompounding 5\ncompounds 5\ncompromised 5\ncomputerizing 5\nconcentrations 5\nconcocted 5\nconditioner 5\ncondominium 5\nconfiscation 5\nconfluence 5\nconforms 5\nconquered 5\nconsecutively 5\nconsents 5\nconservancy 5\nconsolation 5\nconspirators 5\nconstituencies 5\nconstrained 5\ncontempt 5\ncontinents 5\ncontradict 5\ncontributors 5\ncontroversies 5\nconveyor 5\nconvoys 5\ncookbook 5\ncooler 5\ncoordinates 5\ncoordinating 5\ncopyrights 5\ncornered 5\ncorpses 5\ncorrespondents 5\ncorrosion 5\ncountermeasures 5\ncouriers 5\ncourting 5\ncourtyard 5\ncovenant 5\ncrashing 5\ncried 5\ncries 5\ncrumble 5\ncrumpled 5\ncrux 5\nculprits 5\ncultivating 5\ncultured 5\ncustomized 5\ndangerously 5\ndeadliest 5\ndeadlocked 5\ndebatable 5\ndebenture 5\ndecidedly 5\ndecimal 5\ndecking 5\ndeclarations 5\ndecor 5\ndecreases 5\ndeductibility 5\ndefeating 5\ndefenders 5\ndefiant 5\ndefines 5\ndelinquent 5\ndemeanor 5\ndemocratically 5\ndemographic 5\ndemon 5\ndepiction 5\nderailed 5\nderision 5\nderogatory 5\ndeserving 5\ndesignate 5\ndesignation 5\ndetractors 5\ndevil 5\ndictate 5\ndifferential 5\ndifferentials 
5\ndifferentiate 5\ndiffers 5\ndiligence 5\ndiligent 5\ndilution 5\ndilutive 5\ndimensions 5\ndinar 5\ndiploma 5\ndirectives 5\ndisapproved 5\ndisbanded 5\ndisbanding 5\ndiscern 5\ndisclosing 5\ndisconnected 5\ndiscredit 5\ndisgrace 5\ndishonesty 5\ndispel 5\ndisposition 5\ndissemination 5\ndissolved 5\ndistancing 5\ndistinctly 5\ndisturbances 5\ndivert 5\ndoctoral 5\ndoctorate 5\ndocumenting 5\ndollarization 5\ndome 5\ndoorway 5\ndope 5\ndormant 5\ndose 5\ndough 5\ndraped 5\ndrawer 5\ndrifting 5\ndrills 5\ndu 5\nduly 5\ndwellings 5\ndyed 5\ndyeing 5\ne-commerce 5\ne-mails 5\nearner 5\nearners 5\nebullient 5\nedging 5\neighties 5\neighty 5\nelectrolytic 5\nelephants 5\nelevations 5\nem 5\nembarrass 5\nembodied 5\nemeritus 5\nempathy 5\nenacting 5\nenclosed 5\nendangering 5\nenforcers 5\nengulfed 5\nenhancement 5\nenhancements 5\nenjoined 5\nenlightened 5\nenlisted 5\nenmity 5\nenovate 5\nenrollment 5\nensures 5\nentail 5\nentertained 5\nenvelopes 5\nenvironmentalist 5\nenvision 5\nequation 5\nequilibrium 5\nequip 5\nerotic 5\nescalate 5\nescort 5\nestranged 5\nethnicity 5\nevacuate 5\nevidenced 5\nevidently 5\nexasperation 5\nexceedingly 5\nexpel 5\nexploratory 5\nextensively 5\nextraneous 5\nextremes 5\nextremism 5\nfabrications 5\nfading 5\nfads 5\nfaithfully 5\nfalsehood 5\nfalsely 5\nfanatic 5\nfascinated 5\nfatten 5\nfavoring 5\nfei 5\nfellows 5\nferocious 5\nferroelectric 5\nfertilized 5\nfervently 5\nfetched 5\nfiberglass 5\nfiery 5\nfifties 5\nfilmmaker 5\nfins 5\nfirearms 5\nfirewood 5\nfireworks 5\nfist 5\nflattened 5\nflavors 5\nflea 5\nflipping 5\nflocked 5\nflocking 5\nfluids 5\nfluorine 5\nfolic 5\nfooling 5\nforbade 5\nforecasters 5\nforeclosed 5\nformalities 5\nformulating 5\nforsake 5\nforwarding 5\nfostered 5\nfountain 5\nfragments 5\nfranchises 5\nfried 5\nfrighten 5\nfrivolous 5\nfrontline 5\nft 5\nfu 5\nfurthermore 5\nfury 5\nfutile 5\nfuturistic 5\nfyi 5\ngambit 5\ngamers 5\ngaping 5\ngarner 5\ngearing 5\ngeese 5\ngenome 5\ngenre 5\ngeology 
5\ngiveaway 5\nglasses 5\nglowing 5\nglued 5\nglut 5\ngoalie 5\ngoodies 5\ngoose 5\ngouging 5\ngrabs 5\ngrams 5\ngranite 5\ngrapple 5\ngreenmail 5\ngrievance 5\ngrievances 5\ngrilled 5\ngrinder 5\ngruesome 5\ngubernatorial 5\ngymnast 5\ngyms 5\nhabitat 5\nhabitats 5\nhaircut 5\nhaired 5\nhalfway 5\nhalves 5\nhandedly 5\nhandicap 5\nhandout 5\nhandsomely 5\nhandwritten 5\nharding 5\nhardy 5\nharming 5\nharsher 5\nhasten 5\nhauled 5\nhaunt 5\nheader 5\nheadway 5\nheady 5\nhearted 5\nheavyweight 5\nheeled 5\nhelmet 5\nheng 5\nheralded 5\nherbicide 5\nhereafter 5\nhi 5\nhierarchy 5\nhighlights 5\nhikers 5\nhills 5\nhillside 5\nhires 5\nhistorians 5\nhitch 5\nhoc 5\nhomemade 5\nhood 5\nhooked 5\nhopping 5\nhostilities 5\nhoused 5\nhousewife 5\nhousework 5\nhovered 5\nhovering 5\nhsin 5\nhumming 5\nhunter 5\nhybrids 5\nhyperinflation 5\nhypertension 5\niPhone 5\niced 5\nicon 5\nideation 5\nillogical 5\nillustrations 5\nimaginative 5\nimagining 5\nimmersed 5\nimmigrant 5\nimpartial 5\nimpeached 5\nimpeccable 5\nimpenetrable 5\nimpossibility 5\nimpressionist 5\nimpressions 5\ninclination 5\nindefinite 5\nindited 5\nindividualism 5\nindividuality 5\ninduced 5\ninduces 5\ninducing 5\ninexplicably 5\ninfantry 5\ninfants 5\ninfested 5\ninfighting 5\ninfuse 5\ningot 5\ningots 5\ningredient 5\ninheritance 5\ninquiring 5\ninscription 5\ninserts 5\nintentional 5\ninteract 5\ninterdiction 5\ninterrogated 5\nintricate 5\nintroduces 5\nintroductions 5\ninundated 5\ninventions 5\ninvoke 5\nirrigation 5\nisolate 5\nitinerary 5\njacked 5\njam 5\njihad 5\njong 5\njournalistic 5\njuices 5\njungle 5\nkaraoke 5\nkickbacks 5\nkidnappings 5\nkit 5\nknit 5\nknot 5\nlabeling 5\nlaboring 5\nlame 5\nlanes 5\nlatitude 5\nlaureate 5\nlawmaker 5\nleak 5\nleaping 5\nlearnt 5\nlegacies 5\nleisurely 5\nlengths 5\nliberalize 5\nlightening 5\nling 5\nlingers 5\nliquidating 5\nlistens 5\nliteracy 5\nliters 5\nlivers 5\nlobster 5\nlocalities 5\nlocks 5\nloft 5\nlongevity 5\nlookout 5\nloophole 5\nloot 
5\nlopsided 5\nlovable 5\nloyalties 5\nludicrous 5\nluggage 5\nluring 5\nmacro-economic 5\nmacroscopic 5\nmagical 5\nmagnet 5\nmagnified 5\nmailed 5\nmalice 5\nmanageable 5\nmandating 5\nmaneuvers 5\nmanipulated 5\nmanned 5\nmanners 5\nmanuals 5\nmarker 5\nmarkers 5\nmasked 5\nmechanic 5\nmechanized 5\nmeddling 5\nmediation 5\nmedicinal 5\nmelancholy 5\nmelody 5\nmelon 5\nmelt 5\nmentor 5\nmercenary 5\nmesh 5\nmessed 5\nmetaphor 5\nmetering 5\nmethanol 5\nmicroscope 5\nmiddleman 5\nmidyear 5\nmightily 5\nmilestone 5\nmillennia 5\nmindless 5\nmindset 5\nminicomputer 5\nmisdeeds 5\nmisinterpreted 5\nmisled 5\nmisrepresentation 5\nmisrepresentations 5\nmixes 5\nmodes 5\nmolds 5\nmonarch 5\nmonk 5\nmonolithic 5\nmonotheism 5\nmortgaged 5\nmosques 5\nmotorcycles 5\nmotorist 5\nmotto 5\nmound 5\nmourners 5\nmouthed 5\nmulling 5\nmuni 5\nmuscular 5\nmusician 5\nmustard 5\nmysteries 5\nmysteriously 5\nmystique 5\nnan 5\nnaphtha 5\nnationalization 5\nnatured 5\nnausea 5\nneedlessly 5\nnegatively 5\nnegatives 5\nnephew 5\nnetworking 5\nneurons 5\nneutrality 5\nneutron 5\nniches 5\nnominate 5\nnon-recurring 5\nnonstop 5\nnormalcy 5\nnouns 5\nnullify 5\nnursery 5\nnutritional 5\no 5\noak 5\nobedience 5\noblivious 5\nobscurity 5\noccupancy 5\nopportunistic 5\noppressive 5\nops 5\nopt 5\noptimizing 5\norchard 5\norchestras 5\nordnance 5\noutages 5\noutgoing 5\noutperform 5\noutraged 5\noutrageous 5\noverbuilt 5\noverflow 5\noverlook 5\noverlooking 5\noverpass 5\noverreacting 5\noverstate 5\noverwhelm 5\npacket 5\npaints 5\nparamount 5\nparochial 5\nparody 5\nparticle 5\npasture 5\npatriots 5\npaving 5\npayers 5\npayoff 5\npeanuts 5\npedestrians 5\npediatric 5\npeng 5\npennant 5\npepper 5\nperfected 5\nperiodically 5\npersecution 5\npersisted 5\npersona 5\npersonalized 5\npertinent 5\npeso 5\npetty 5\nphilosophical 5\npianist 5\npickers 5\npier 5\npiled 5\npilgrimage 5\npinkies 5\npinned 5\npinning 5\npioneer 5\npioneered 5\npirates 5\npitfalls 5\npitiable 5\npitted 5\nplaster 
5\npledges 5\nplummeting 5\npluralistic 5\npoetic 5\npolicing 5\npolyps 5\nponder 5\npondering 5\npopulations 5\npopulous 5\npores 5\nporno 5\nposing 5\npostings 5\nprecondition 5\npredecessors 5\npredetermined 5\nprefecture 5\nprejudice 5\nprematurely 5\npreoccupation 5\npresses 5\npresumption 5\npretenders 5\npretense 5\npriests 5\nprioritize 5\nproblematic 5\nproclaiming 5\nproclaims 5\nprodding 5\nprodigy 5\nproductions 5\nprogressed 5\nprohibiting 5\nproprietor 5\npropulsion 5\nprose 5\nprostate 5\nprotagonist 5\nprotections 5\nprotector 5\nproudly 5\nproverb 5\npsychiatric 5\npsychologist 5\npublicist 5\npublicize 5\npullback 5\npunishable 5\npurity 5\npurportedly 5\nputty 5\npuzzling 5\npython 5\nqualifying 5\nquantitative 5\nquickest 5\nquo 5\nracking 5\nramifications 5\nramp 5\nramps 5\nrap 5\nraping 5\nratify 5\nrattled 5\nray 5\nreactor 5\nrealty 5\nreassuring 5\nrebellious 5\nrebirth 5\nrebounding 5\nreceptor 5\nreckons 5\nreclaim 5\nreconcile 5\nreconstruct 5\nrecounted 5\nrecruits 5\nrecurring 5\nredesign 5\nredevelopment 5\nrediscover 5\nredlined 5\nreef 5\nreformed 5\nreformulated 5\nrefrigeration 5\nrefurbishing 5\nregiment 5\nregretted 5\nregulates 5\nreimburse 5\nreimbursed 5\nreinforcements 5\nreinstate 5\nreinstated 5\nreinvestment 5\nreliant 5\nrelieving 5\nrelish 5\nremarked 5\nreminiscent 5\nremnants 5\nrendition 5\nrenovate 5\nrepatriating 5\nrepatriation 5\nrepayments 5\nrepeats 5\nreplying 5\nreproducing 5\nreproductive 5\nreptiles 5\nrescind 5\nresell 5\nreshuffle 5\nresidue 5\nresponsibly 5\nrestarted 5\nrestraining 5\nrestricting 5\nresurrect 5\nretirees 5\nretrieved 5\nretrofit 5\nrevelation 5\nrevered 5\nreversals 5\nreversion 5\nreverts 5\nrevising 5\nrevolt 5\nrevolves 5\nrewarded 5\nrewritten 5\nrhythm 5\nrhythms 5\nrichly 5\nrider 5\nriots 5\nripple 5\nriverside 5\nrobe 5\nrocketed 5\nrocky 5\nrookie 5\nroommate 5\nrotation 5\nrotten 5\nrout 5\nrugged 5\nruptured 5\nrushes 5\nsaddle 5\nsagged 5\nsail 5\nsailor 5\nsalsa 5\nsamurai 
5\nscanners 5\nscant 5\nscapegoat 5\nscaring 5\nscent 5\nscientifically 5\nscotch 5\nscuttled 5\nseabed 5\nsecular 5\nsecurely 5\nseesaw 5\nselfish 5\nselflessly 5\nsemblance 5\nsemiannually 5\nsender 5\nsensed 5\nseparating 5\nserene 5\nseventeen 5\nseventies 5\nsexuality 5\nshades 5\nshady 5\nshakeout 5\nsheikh 5\nsheltered 5\nshielded 5\nshining 5\nshove 5\nshowcase 5\nshowdown 5\nshrank 5\nshrine 5\nshrubs 5\nshunned 5\nshuttles 5\nsidewalks 5\nsimilarities 5\nsimplicity 5\nsimplified 5\nsimplify 5\nsimulated 5\nsimulation 5\nsimulations 5\nsiphoned 5\nsitcom 5\nskeptics 5\nsketchy 5\nskirt 5\nskyrocketed 5\nslackened 5\nslots 5\nsoaks 5\nsober 5\nsocialize 5\nsocks 5\nsolitary 5\nsoluble 5\nsoot 5\nsoothing 5\nsorted 5\nsoutheastern 5\nsouvenir 5\nsparking 5\nsparks 5\nspawn 5\nspecialties 5\nspecifics 5\nspectacle 5\nspectator 5\nspeculations 5\nspeedy 5\nspikes 5\nspiraling 5\nspontaneously 5\nspooked 5\nspotty 5\nsprawl 5\nspraying 5\nsqueezing 5\nstab 5\nstack 5\nstacked 5\nstaffed 5\nstaffer 5\nstains 5\nstalemate 5\nstalls 5\nstampede 5\nstamping 5\nstandardize 5\nstardom 5\nstarter 5\nstature 5\nstealth 5\nsteered 5\nstifle 5\nstimulated 5\nstockbroker 5\nstocking 5\nstopgap 5\nstoring 5\nstrains 5\nstrategically 5\nstripping 5\nstubbornly 5\nstumbled 5\nstupidity 5\nstyling 5\nstymied 5\nsublime 5\nsubmission 5\nsubmits 5\nsubscription 5\nsubsided 5\nsubstituted 5\nsubtract 5\nsubversion 5\nsucker 5\nsuites 5\nsunny 5\nsuperficial 5\nsuperstores 5\nsupplemental 5\nsupplements 5\nsurgeons 5\nsurrealist 5\nsurvivor 5\nswamped 5\nswimmer 5\nsympathizers 5\nsymposium 5\nsymptomatic 5\nsystematically 5\ntabs 5\ntacitly 5\ntack 5\ntackling 5\ntags 5\ntails 5\ntangle 5\ntantamount 5\ntasteless 5\ntasting 5\nteaming 5\ntelemarketers 5\ntelescopes 5\ntelevisions 5\ntemper 5\ntemperament 5\ntempted 5\ntenant 5\ntendencies 5\ntent 5\nterminating 5\nterrific 5\ntesters 5\ntestimonies 5\nthalassemia 5\nthefts 5\nthinly 5\nthoroughbred 5\nthy 5\ntidy 5\ntimed 
5\ntimidity 5\ntinkering 5\ntissues 5\ntitanium 5\ntmobile 5\ntoast 5\ntoil 5\ntoiling 5\ntongues 5\ntonnage 5\ntopping 5\ntouchy 5\ntoughen 5\ntowels 5\ntract 5\ntrailers 5\ntrait 5\ntranquility 5\ntransferable 5\ntransforms 5\ntranslating 5\ntransmissions 5\ntransponder 5\ntrauma 5\ntreasured 5\ntrespass 5\ntrimming 5\ntrunk 5\ntrustees 5\ntsai 5\ntucked 5\ntug 5\ntung 5\ntween 5\ntwisting 5\nubiquitous 5\nultimatum 5\nultraviolet 5\nunarmed 5\nundamaged 5\nundercover 5\nundermining 5\nunderscores 5\nunderstated 5\nuneducated 5\nunexplained 5\nunfamiliar 5\nunfit 5\nunfold 5\nunfounded 5\nunharmed 5\nunhcr 5\nuniformity 5\nuniting 5\nunleashed 5\nunprepared 5\nunpublished 5\nunregulated 5\nunsure 5\nunthinkable 5\nunwillingness 5\nunwise 5\nuphold 5\nupload 5\nupstream 5\nusefulness 5\nushered 5\nvacations 5\nvalleys 5\nvanilla 5\nvanish 5\nvaunted 5\nvector 5\nventilation 5\nvests 5\nvets 5\nvice-minister 5\nvictor 5\nvideocassette 5\nviewpoints 5\nviruses 5\nvitamin 5\nvoyage 5\nwailing 5\nwallets 5\nwan 5\nwastes 5\nwastewater 5\nwatering 5\nwee 5\nweekday 5\nweep 5\nwhip 5\nwhitening 5\nwhitewash 5\nwidows 5\nwilderness 5\nwillingly 5\nwitchcraft 5\nwithered 5\nwolf 5\nwounding 5\nwoven 5\nwracked 5\nwrath 5\nwrinkle 5\nwrought 5\nyouths 5\nyuppies 5\nzinc 5\n▲ 5\n'Nita 4\n'n' 4\n..! 4\n................... 4\n..? 
4\n0.13 4\n0.45 4\n0.88 4\n02 4\n09/16/99 4\n1.0 4\n1.01 4\n1.21 4\n1.31 4\n1.38 4\n1.46 4\n1.49 4\n1.53 4\n1.6145 4\n1.63 4\n1.69 4\n1.76 4\n1.77 4\n1.79 4\n1.8300 4\n1.850 4\n1.8578 4\n1.88 4\n1.90 4\n1.95 4\n101st 4\n10:40 4\n11.2 4\n114.3 4\n12.2 4\n12.75 4\n123 4\n124,875 4\n12:01 4\n13.7 4\n14,000 4\n14.1 4\n1400 4\n141 4\n141.65 4\n142.43 4\n146 4\n15.1 4\n15.7 4\n1575 4\n16.3 4\n16.4 4\n1600 4\n162 4\n167 4\n17.01 4\n17.8 4\n17.9 4\n171 4\n18.4 4\n188 4\n1899 4\n19/32 4\n1900 4\n1911 4\n1940 4\n1942 4\n1943 4\n1946 4\n195 4\n1950 4\n2,400 4\n2.0 4\n2.02 4\n2.19 4\n2.33 4\n2.375 4\n2.40 4\n2.45 4\n2.66 4\n2.68 4\n2.80 4\n2.87 4\n2.90 4\n202 4\n203 4\n21,000 4\n21.2 4\n212 4\n22,000 4\n22.25 4\n22.4 4\n23.2 4\n231 4\n24,000 4\n24.5 4\n242 4\n2500 4\n251 4\n26.5 4\n26.7 4\n2643.65 4\n2645.08 4\n268 4\n27.6 4\n28,000 4\n282 4\n287 4\n29.7 4\n3,200 4\n3.08 4\n3.13 4\n3.23 4\n3.31 4\n3.43 4\n3.46 4\n3.75 4\n30.6 4\n31.2 4\n318 4\n32.5 4\n321 4\n33,000 4\n338 4\n345-9945 4\n357 4\n36,000 4\n362 4\n390,000 4\n4.05 4\n4.07 4\n4.52 4\n4.55 4\n4000 4\n401 4\n41.60 4\n42.9 4\n420 4\n43.5 4\n43.50 4\n435 4\n44.3 4\n46.2 4\n46.9 4\n49.4 4\n4:30 4\n5.42 4\n501 4\n508 4\n512 4\n530 4\n54,000 4\n560 4\n570 4\n575 4\n576 4\n6.15 4\n60.25 4\n60s 4\n640 4\n646-8420 4\n65.7 4\n66.8 4\n68.5 4\n6:37 4\n7.01 4\n7.19 4\n7.30 4\n7.45 4\n7.55 4\n7.61 4\n7.62 4\n7.78 4\n7.82 4\n7.85 4\n7.89 4\n7.94 4\n70's 4\n720 4\n725 4\n729 4\n750,000 4\n757 4\n767 4\n8.01 4\n8.17 4\n8.20 4\n8.24 4\n8.28 4\n8.53 4\n80s 4\n82.8 4\n849 4\n863 4\n869 4\n88.12 4\n9.50 4\n90's 4\n900,000 4\n925 4\n9:00 4\n;- 4\nA.H. 
4\nAAA 4\nABS 4\nAC 4\nAC&R 4\nAEW 4\nAFP 4\nAH 4\nAK 4\nALSO 4\nANY 4\nAbaba 4\nAbdallah 4\nAbroad 4\nAcademic 4\nAccumulation 4\nAcrylic 4\nActing 4\nAcura 4\nAdam 4\nAdded 4\nAddis 4\nAddison 4\nAddress 4\nAdler 4\nAdolph 4\nAdvice 4\nAfrikaner 4\nAfrikaners 4\nAires 4\nAlbo 4\nAlert 4\nAlexandra 4\nAlexandria 4\nAllentown 4\nAllowance 4\nAllstate 4\nAlongside 4\nAlternative 4\nAlumni 4\nAmcore 4\nAmdahl 4\nAmerada 4\nAmeriGas 4\nAmicable 4\nAmorgos 4\nAnchor 4\nAngava 4\nAnglia 4\nAngry 4\nAnimals 4\nAnthropology 4\nAnti-Japanese 4\nAntolini 4\nAnton 4\nAnwar 4\nArabi 4\nArbel 4\nArbitration 4\nArcher 4\nArgentine 4\nAri 4\nArk. 4\nArms 4\nArnett 4\nAryan 4\nAsh 4\nAsks 4\nAsman 4\nAssistance 4\nAthens 4\nAtwan 4\nAung 4\nAussedat 4\nAvoid 4\nAyatollah 4\nAyer 4\nAztar 4\nAztec 4\nBIG 4\nBP 4\nBSB 4\nBaathists 4\nBabe 4\nBachmann 4\nBadminton 4\nBadr 4\nBagella 4\nBainimarama 4\nBakr 4\nBali 4\nBanca 4\nBangkok 4\nBao 4\nBaqir 4\nBarksdale 4\nBarnett 4\nBaron 4\nBarrier 4\nBart 4\nBastille 4\nBatchelder 4\nBauman 4\nBavaria 4\nBaxter 4\nBayer 4\nBeale 4\nBearings 4\nBeauregard 4\nBeautiful 4\nBechtel 4\nBecky 4\nBeen 4\nBeit 4\nBending 4\nBenedict 4\nBeneficial 4\nBergen 4\nBerman 4\nBertie 4\nBertussi 4\nBethForge 4\nBetwons 4\nBeverage 4\nBeware 4\nBicycle 4\nBiehl 4\nBiloxi 4\nBiomedical 4\nBirnbaum 4\nBishops 4\nBlind 4\nBoat 4\nBock 4\nBoehringer 4\nBonwit 4\nBorder 4\nBosch 4\nBoxing 4\nBrad 4\nBrand 4\nBreene 4\nBridgestone 4\nBrierley 4\nBright 4\nBringing 4\nBros 4\nBroward 4\nBrowns 4\nBrunswick 4\nBu 4\nBuck 4\nBud 4\nBudapest 4\nBudweiser 4\nBuenos 4\nBulbul 4\nBull 4\nBulls 4\nBunun 4\nBurr 4\nBusinessmen 4\nBuyer 4\nBuzz 4\nBynoe 4\nCAE 4\nCALIFORNIA 4\nCDC 4\nCIAC 4\nCML 4\nCOMMUNICATIONS 4\nCOMPANIES 4\nCPU 4\nCRRA 4\nCSF 4\nCadbury 4\nCaddy 4\nCaesar 4\nCaesars 4\nCalculation 4\nCampaneris 4\nCampo 4\nCapable 4\nCape 4\nCapitalize 4\nCarboni 4\nCarew 4\nCaroline 4\nCarrier 4\nCarrion 4\nCarver 4\nCasablanca 4\nCasino 4\nCavaliers 4\nCavalry 
4\nCelimene 4\nCervantes 4\nChaidamu 4\nChamaecyparis 4\nChamps 4\nChandross 4\nChanging 4\nChanglin 4\nChe 4\nCheers 4\nChernomyrdin 4\nCherry 4\nCheryl 4\nChesapeake 4\nChinatown 4\nChristina 4\nChristine 4\nChuan 4\nChubb 4\nChunqiu 4\nCindy 4\nCipher 4\nCitizenship 4\nClaims 4\nClarcor 4\nClaude 4\nClements 4\nCleopatra 4\nClothing 4\nCoats 4\nCobb 4\nCoelho 4\nCoin 4\nColo 4\nColumbine 4\nComets 4\nCommemoration 4\nConcerning 4\nConditions 4\nCondoleezza 4\nCone 4\nCongresses 4\nConlon 4\nConrail 4\nConservation 4\nConspiracy 4\nConsultant 4\nConsultative 4\nContinent 4\nContinued 4\nConventional 4\nCoogan 4\nCoordinator 4\nCordis 4\nCorrespondents 4\nCorrupt 4\nCortese 4\nCorvette 4\nCos 4\nCosmetics 4\nCounseling 4\nCrazy 4\nCrest 4\nCristobal 4\nCritical 4\nCut 4\nCyanamid 4\nD&B 4\nDARPA 4\nDAY 4\nDAYS 4\nDJ 4\nDJIA 4\nDVDs 4\nDWG 4\nDammam 4\nDaqamsa 4\nDar 4\nDaughter 4\nDavies 4\nDealing 4\nDeals 4\nDebenture 4\nDec 4\nDee 4\nDelchamps 4\nDellums 4\nDepositary 4\nDepot 4\nDerek 4\nDes 4\nDevon 4\nDhi 4\nDiary 4\nDickens 4\nDiscount 4\nDiversified 4\nDjalil 4\nDoc 4\nDoctrine 4\nDogs 4\nDoing 4\nDominos 4\nDonohoo 4\nDoof 4\nDownload 4\nDrink 4\nDrinking 4\nEARTHQUAKE 4\nECP 4\nECT 4\nEEOC 4\nEGM 4\nERC 4\nEagan 4\nEat 4\nEddy 4\nEdelson 4\nEdmond 4\nEdsel 4\nEducational 4\nEducators 4\nEfforts 4\nEhack 4\nEinstein 4\nElPaso 4\nElements 4\nEly 4\nEmbarcadero 4\nEmeryville 4\nEmile 4\nEnglewood 4\nEnough 4\nEnsor 4\nEqual 4\nEqually 4\nEquatorial 4\nEquitable 4\nErcot 4\nErik 4\nEscort 4\nEslite 4\nEthyl 4\nEurobonds 4\nEvening 4\nEverywhere 4\nEvidently 4\nEvil 4\nExcel 4\nExit 4\nExpenses 4\nExplains 4\nFADA 4\nFCB 4\nFINANCIAL 4\nFIRST 4\nFREE 4\nFace 4\nFacsimile 4\nFadil 4\nFall 4\nFamous 4\nFantasy 4\nFantome 4\nFaqeer 4\nFarrell 4\nFast 4\nFatman 4\nFault 4\nFelipe 4\nFighting 4\nFiorini 4\nFleischer 4\nFlexibility 4\nFlexible 4\nFlu 4\nFlynn 4\nFootballer 4\nForbes 4\nForm 4\nFranc 4\nFrawley 4\nFreeze 4\nFreight 4\nFremd 4\nFreud 4\nFriedman 
4\nFujianese 4\nFukuyama 4\nFull 4\nFundamental 4\nFunny 4\nFusionRetail 4\nGA 4\nGAO 4\nGET 4\nGPA 4\nGSX 4\nGW 4\nGabor 4\nGambian 4\nGanzhou 4\nGarage 4\nGarfield 4\nGarith 4\nGartner 4\nGenerali 4\nGeneration 4\nGenscher 4\nGeoffrey 4\nGerard 4\nGhazal 4\nGibraltar 4\nGilchrist 4\nGilo 4\nGinsberg 4\nGladkovich 4\nGlaser 4\nGlazier 4\nGlen 4\nGods 4\nGoldwater 4\nGolenbock 4\nGollust 4\nGone 4\nGoodbye 4\nGooding 4\nGoodrich 4\nGoss 4\nGovi 4\nGranges 4\nGreenfield 4\nGreens 4\nGreve 4\nGrimm 4\nGrossman 4\nGroupe 4\nGrover 4\nGuadalajara 4\nGuantanamo 4\nGuaranty.doc 4\nGuatemala 4\nGuest 4\nGuofang 4\nGurria 4\nHEALTH 4\nHELP 4\nHHS 4\nHIS 4\nHOT 4\nHP 4\nHSBC 4\nHTML 4\nHaagen 4\nHachette 4\nHaier 4\nHail 4\nHailar 4\nHammacks 4\nHaotian 4\nHarare 4\nHarbi 4\nHardly 4\nHarken 4\nHarrisburg 4\nHaruhiko 4\nHastert 4\nHatfield 4\nHayat 4\nHayes 4\nHazel 4\nHealing 4\nHearing 4\nHearts 4\nHeathrow 4\nHeating 4\nHeckman 4\nHehe 4\nHelsinki 4\nHendrik 4\nHertz 4\nHiggenbotham 4\nHiggins 4\nHiller 4\nHistorically 4\nHmmm 4\nHochiminh 4\nHogan 4\nHolders 4\nHollings 4\nHolly 4\nHonolulu 4\nHoover 4\nHorrible 4\nHorses 4\nHorton 4\nHousehold 4\nHuge 4\nHuguenot 4\nHumaidi 4\nHungarian 4\nHwang 4\nIAEA 4\nIATA 4\nIBC 4\nIC 4\nIL 4\nINTERNATIONAL 4\nIOUs 4\nITRI 4\nIgor 4\nIijima 4\nIllustrated 4\nImam 4\nImpacts 4\nImprovement 4\nInitially 4\nInjury 4\nInk 4\nInsanally 4\nInsight 4\nInspectorate 4\nInteractive 4\nInterconnect 4\nInterferon 4\nIstanbul 4\nIwai 4\nJUST 4\nJaber 4\nJacksonville 4\nJala 4\nJaws 4\nJays 4\nJeffery 4\nJensen 4\nJeopardy 4\nJermaine 4\nJesuit 4\nJilin 4\nJill 4\nJiuzhai 4\nJohannesburg 4\nJohnston 4\nJorge 4\nJosephine 4\nJournalism 4\nJournalists 4\nJudaism 4\nJuilliard 4\nJulie 4\nJun 4\nJuppe 4\nKaifu 4\nKaoshikii 4\nKatherine 4\nKathy 4\nKaty 4\nKaufman 4\nKeelung 4\nKelley 4\nKelli 4\nKellner 4\nKensington 4\nKern 4\nKhalid 4\nKhatami 4\nKi 4\nKill 4\nKimbrough 4\nKimmel 4\nKind 4\nKindly 4\nKings 4\nKitty 4\nKlerk 4\nKline 4\nKloves 
4\nKmart 4\nKnesset 4\nKnights 4\nKoran 4\nKori 4\nKriz 4\nKroger 4\nKroll 4\nKu 4\nKuantu 4\nKun 4\nKurai 4\nKy 4\nKyi 4\nLAW 4\nLDI 4\nLIKE 4\nLOT 4\nLaBonte 4\nLabour 4\nLadenburg 4\nLadies 4\nLagnado 4\nLahim 4\nLakeland 4\nLander 4\nLangton 4\nLantern 4\nLatvian 4\nLaughlin 4\nLauren 4\nLeach 4\nLeber 4\nLegg 4\nLen 4\nLester 4\nLeucadia 4\nLevel 4\nLiability 4\nLiaison 4\nLiangping 4\nLiberties 4\nLidgerwood 4\nLie 4\nLimit 4\nLipstein 4\nLiquid 4\nList 4\nLithuania 4\nLiverpool 4\nLodge 4\nLoeb 4\nLoess 4\nLogic 4\nLombardi 4\nLongkai 4\nLoose 4\nLubbers 4\nLucas 4\nLudcke 4\nLujiazui 4\nLukar 4\nLunar 4\nLyphomed 4\nME 4\nMETALS 4\nMMA 4\nMMI 4\nMOTAAWEN@HOTMAIL.COM 4\nMaalox 4\nMacbeth 4\nMaceda 4\nMagnin 4\nMaguire 4\nMajid 4\nMakro 4\nMalik 4\nMalvo 4\nMandela 4\nManpower 4\nMaoist 4\nMar 4\nMarino 4\nMarkese 4\nMarriage 4\nMasius 4\nMassage 4\nMaury 4\nMayo 4\nMcCormick 4\nMcLauren 4\nMcMaster 4\nMcNally 4\nMeagher 4\nMechanical 4\nMedellin 4\nMeek 4\nMehta 4\nMetals 4\nMetamucil 4\nMethodist 4\nMets 4\nMichelangelo 4\nMiddletown 4\nMidi 4\nMilk 4\nMillet 4\nMillicom 4\nMinisters 4\nMinor 4\nMiracle 4\nMiranda 4\nMishaan 4\nMithun 4\nMochida 4\nMoliere 4\nMoloch 4\nMona 4\nMonaco 4\nMonk 4\nMonkey 4\nMonroe 4\nMontgoris 4\nMonth 4\nMonthly 4\nMonths 4\nMoo 4\nMoran 4\nMorrissey 4\nMort 4\nMossad 4\nMouth 4\nMove 4\nMovie 4\nMujahideen 4\nMulroney 4\nMustafa 4\nMyong 4\nMyrtle 4\nNA 4\nNASDAQ 4\nNEED 4\nNJ 4\nNKF 4\nNRC 4\nNTT 4\nNV 4\nNW 4\nNaha 4\nNajaf 4\nNaturally 4\nNazer 4\nNazionale 4\nNeihu 4\nNepalese 4\nNev 4\nNeville 4\nNewsEdge 4\nNghe 4\nNguema 4\nNi 4\nNiMo 4\nNic 4\nNiciporuk 4\nNintendo 4\nNinth 4\nNissho 4\nNixdorf 4\nNoboa 4\nNogales 4\nNovell 4\nNuggets 4\nNut 4\nO'Donnell 4\nO'Neill 4\nOEX 4\nONCE 4\nOP 4\nOT 4\nOVER 4\nOadah 4\nOakar 4\nOats 4\nOddly 4\nOgonyok 4\nOji 4\nOman 4\nOmnicom 4\nOmnimedia 4\nOpinion 4\nOpportunity 4\nOprah 4\nOrdinarily 4\nOriani 4\nOriginating 4\nOrthodox 4\nOttawa 4\nP&L 4\nPA 4\nPACIFIC 4\nPNC 4\nPOA 
4\nPOWs 4\nPRECIOUS 4\nPTS 4\nPUCT 4\nPX 4\nPai 4\nPalfrey 4\nPalmero 4\nPangcah 4\nPanhandle 4\nParcel 4\nPartly 4\nPastoral 4\nPatricia 4\nPatriot 4\nPeaceful 4\nPedro 4\nPelswick 4\nPeltz 4\nPencil 4\nPenghu 4\nPercival 4\nPerformance 4\nPeripherals 4\nPerson 4\nPhalange 4\nPhibro 4\nPhilippe 4\nPicasso 4\nPierce 4\nPilevsky 4\nPipe 4\nPipeLines 4\nPipeline 4\nPlain 4\nPlanned 4\nPlantation 4\nPlateau 4\nPlatinum 4\nPlatt 4\nPlaying 4\nPlaytex 4\nPoliticians 4\nPollock 4\nPonte 4\nPontiac 4\nPorter 4\nPostel 4\nPouchong 4\nPound 4\nPoverty 4\nPraise 4\nPrecisely 4\nPredictably 4\nPresentation 4\nPreservation 4\nPresidio 4\nPreston 4\nPreti 4\nPrevious 4\nPriam 4\nPritzker 4\nProfessors 4\nProhibited 4\nProperties 4\nProponents 4\nProposition 4\nProtestant 4\nPsychological 4\nPulp 4\nPushing 4\nPutnam 4\nPyszkiewicz 4\nQVC 4\nQar 4\nQishi 4\nQuito 4\nREAL 4\nRIGHT 4\nRT 4\nRTZ 4\nRV 4\nRabin 4\nRain 4\nRaising 4\nRanger 4\nRank 4\nRaptopoulos 4\nRascals 4\nRawls 4\nRaytheon 4\nReasonable 4\nReconciliation 4\nReferee 4\nRehabilitation 4\nRenfa 4\nReno 4\nReply 4\nResponding 4\nRetail 4\nRetailers 4\nReuben 4\nRevson 4\nReward 4\nRgds 4\nRichards 4\nRies 4\nRiley 4\nRim 4\nRodriguez 4\nRohm 4\nRohs 4\nRok 4\nRomanian 4\nRonnie 4\nRooney 4\nRoqi 4\nRotary 4\nRoughly 4\nRs 4\nRubber 4\nRubel 4\nRuby 4\nRuffo 4\nRule 4\nRuskin 4\nRuss 4\nRyan 4\nS.G. 4\nSA 4\nSF 4\nSHOULD 4\nSIBV 4\nSPAN 4\nSSMs 4\nSTOCK 4\nSadly 4\nSadoon 4\nSafe 4\nSago 4\nSai 4\nSala 4\nSalam 4\nSalary 4\nSale 4\nSalem 4\nSalon 4\nSaltzburg 4\nSand 4\nSandy 4\nSanger 4\nSante 4\nSanya 4\nSaskatchewan 4\nSat 4\nSat. 
4\nSayeb 4\nScandal 4\nScandinavian 4\nSchaefer 4\nSchlesinger 4\nSchlumberger 4\nSchmidt 4\nSchulman 4\nSchulte 4\nSchweppes 4\nSciutto 4\nScore 4\nScotia 4\nScotts 4\nScottsdale 4\nSculley 4\nSearch 4\nSecretariat 4\nSeeking 4\nSeen 4\nSeib 4\nSell 4\nSerious 4\nSeth 4\nSetting 4\nSettle 4\nShaikh 4\nShalit 4\nShalom 4\nShebaa 4\nSheikha 4\nSherry 4\nShi'ite 4\nShields 4\nShipman 4\nShock 4\nShoemaker 4\nShultz 4\nSilk 4\nSimonds 4\nSincerely 4\nSindona 4\nSingaporean 4\nSingle 4\nSitting 4\nSixth 4\nSkeptics 4\nSkilled 4\nSlate 4\nSlater 4\nSmaller 4\nSmalling 4\nSnyder 4\nSoak 4\nSobel 4\nSocrates 4\nSolidaire 4\nSolo 4\nSoren 4\nSoweto 4\nSox 4\nSpaniard 4\nSpeculation 4\nSpokesmen 4\nSpy 4\nSr 4\nStability 4\nStacey 4\nStarpointe 4\nStarzl 4\nStatue 4\nStealth 4\nStefan 4\nStevenson 4\nStinnett 4\nStockholders 4\nStole 4\nStorm 4\nStrauss 4\nStripes 4\nStrom 4\nStudio 4\nSuburban 4\nSultan 4\nSumita 4\nSumitomo 4\nSummit 4\nSuns 4\nSunshine 4\nSusie 4\nSuu 4\nSuweiri 4\nSwan 4\nSweet 4\nTEXAS 4\nTHING 4\nTWO 4\nTa'abbata 4\nTae 4\nTaiping 4\nTalal 4\nTalent 4\nTam 4\nTamkang 4\nTarget 4\nTaub 4\nTaxpayers 4\nTelelawyer 4\nTeller 4\nTempe 4\nTemporary 4\nTerra 4\nTerrible 4\nTerritories 4\nTerrizzi 4\nTerrorism 4\nTet 4\nTextron 4\nThani 4\nTheme 4\nThereafter 4\nThoughts 4\nThurber 4\nTianding 4\nTicket 4\nTiffany 4\nTigrean 4\nTikrit 4\nTimbers 4\nTimken 4\nTip 4\nTitanium 4\nTollis 4\nTon 4\nTorres 4\nTorstar 4\nTouliu 4\nTrack 4\nTracy 4\nTraditionally 4\nTransAtlantic 4\nTranslation 4\nTranslator 4\nTranswestern 4\nTreasure 4\nTrial 4\nTrident 4\nTrim 4\nTrouble 4\nTroubled 4\nTrucking 4\nTsang 4\nTufts 4\nTulia 4\nTunick 4\nTurnover 4\nTutu 4\nTy 4\nTyphoon 4\nTyszkiewicz 4\nUH 4\nUSAA 4\nUSD$ 4\nUTOR 4\nUh 4\nUltrasonic 4\nUnique 4\nUri 4\nUs 4\nUzbekistan 4\nVA 4\nVGA 4\nVH 4\nVHS 4\nVader 4\nVail 4\nValhi 4\nVarian 4\nVehicle 4\nVenezuelans 4\nVerit 4\nVernon 4\nVeronis 4\nVersion 4\nVeslefrikk 4\nVinci 4\nVirginians 4\nViroqua 4\nVisiting 
4\nVolunteer 4\nW.Va 4\nWHAT 4\nWOULD 4\nWTC 4\nWacoal 4\nWadi 4\nWaggoner 4\nWagoneer 4\nWaite 4\nWajba 4\nWalid 4\nWalsh 4\nWaltham 4\nWanpeng 4\nWanted 4\nWardair 4\nWarm 4\nWarshaw 4\nWarthog 4\nWaste 4\nWatts 4\nWealth 4\nWednesdays 4\nWeinstein 4\nWestin 4\nWeston 4\nWestwood 4\nWhip 4\nWhitford 4\nWilder 4\nWillis 4\nWilly 4\nWilshire 4\nWind 4\nWinton 4\nWireless 4\nWis 4\nWo 4\nWolfgang 4\nWomack 4\nWoo 4\nWoodbridge 4\nWoodruff 4\nWuerttemberg 4\nWuxi 4\nXiaoling 4\nXiaoying 4\nXixia 4\nYES 4\nYacht 4\nYahya 4\nYard 4\nYat 4\nYawai 4\nYenliao 4\nYeutter 4\nYiren 4\nYonehara 4\nYoshinoya 4\nYouhao 4\nYounes 4\nYuanzhi 4\nYubitrich 4\nZaid 4\nZeist 4\nZell 4\nZenith 4\nZhanjiang 4\nZhengzhou 4\nZhongshan 4\nZhujiang 4\nabandonment 4\nabated 4\nabatement 4\nabolishing 4\nabolition 4\nabsent 4\nabsorbs 4\nabstained 4\nabundantly 4\naccommodating 4\naccompaniment 4\naching 4\nacknowledgement 4\nacquaintance 4\nactivate 4\nacupuncturist 4\nacutely 4\nadamant 4\nadaptable 4\naddiction 4\nadditives 4\nadept 4\nadhered 4\nadhesive 4\nadjudicator 4\nadmiral 4\nadmire 4\nadventures 4\nadversaries 4\nadversely 4\naffectionate 4\naffirms 4\nagile 4\nagitated 4\nagreeable 4\nairlift 4\nairlifted 4\nairlifting 4\nakin 4\nal_arabmsb@hotmail.com 4\nalgae 4\nalisau@gmail.com 4\nallegation 4\nallege 4\nallergies 4\nallotments 4\nallotted 4\nallowable 4\nallowances 4\nalma 4\naloft 4\naltar 4\nalternation 4\nama455@hotmail.com 4\namass 4\namaze 4\nambience 4\namiable 4\nammo 4\namphibians 4\namplified 4\namused 4\nanalytical 4\nanecdotal 4\nanemic 4\nanesthetic 4\nanswerable 4\nanthers 4\nanti-miscarriage 4\nanti-terrorist 4\nantibiotic 4\nantinuclear 4\nantique 4\nanyhow 4\naplenty 4\napologies 4\napp 4\nappalling 4\napplicant 4\nappraisals 4\napprehensive 4\nappropriators 4\narbs 4\narchive 4\narduous 4\narisen 4\narming 4\narrivals 4\narrows 4\narsenal 4\nascending 4\nascribed 4\nashore 4\naspertame 4\nasphalt 4\nassociating 4\nassuring 4\nasthma 4\nastounding 4\nastride 
4\nastronaut 4\natrium 4\natrocity 4\nattentive 4\nattrition 4\naustere 4\nauthored 4\nauthorizing 4\nautocratic 4\nauxiliary 4\navail 4\navenue 4\navers 4\navert 4\nawe 4\nawhile 4\nbackwardness 4\nbaked 4\nbakeries 4\nbaking 4\nbananas 4\nbandits 4\nbarbaric 4\nbashing 4\nbastion 4\nbaths 4\nbattalion 4\nbatting 4\nbeams 4\nbearer 4\nbeasts 4\nbeefed 4\nbeginnings 4\nbehest 4\nbeliever 4\nbellies 4\nbells 4\nbelongings 4\nbenches 4\nbenevolent 4\nbeset 4\nbestowed 4\nbetrayed 4\nbible 4\nbilingual 4\nbiomedical 4\nblackboard 4\nblacklined 4\nbland 4\nblase 4\nblazing 4\nblends 4\nblocker 4\nbloggers 4\nbloodthirsty 4\nblunder 4\nblunt 4\nboarded 4\nbode 4\nbodied 4\nboob 4\nbookstore 4\nboots 4\nbooze 4\nbounces 4\nbout 4\nboutique 4\nbouts 4\nbowls 4\nbrainwashed 4\nbraking 4\nbravely 4\nbravery 4\nbreasts 4\nbreeder 4\nbriefings 4\nbroadest 4\nbrochures 4\nbrowsed 4\nbrowsing 4\nbrushes 4\nbtw 4\nbudge 4\nbullies 4\nbulls 4\nbully 4\nbump 4\nbumped 4\nbumpy 4\nburbs 4\nbureaus 4\nburials 4\nbursting 4\nbush 4\nbustling 4\nbuyout 4\ncab 4\ncabins 4\ncachet 4\ncafes 4\ncalculate 4\ncalf 4\ncallable 4\ncalligraphic 4\ncallosum 4\ncapitalistic 4\ncapitals 4\ncapitol 4\ncaptive 4\ncardiac 4\ncardiovascular 4\ncarver 4\ncashed 4\ncassette 4\ncastigating 4\ncastle 4\ncasts 4\ncatalogs 4\ncatheter 4\ncathode 4\ncatty 4\ncausal 4\ncautionary 4\nceases 4\ncedar 4\nceded 4\nceilings 4\ncentrist 4\nceramicist 4\ncereals 4\nchained 4\nchairmanship 4\nchampagne 4\nchanting 4\ncharacteristically 4\ncharcoal 4\nchased 4\nchatted 4\nchatter 4\nchauffeur 4\ncheckout 4\ncheckpoint 4\ncheeks 4\nchewing 4\nchiefly 4\nchildish 4\nchimney 4\nchloride 4\nchoke 4\nchoking 4\nchromium 4\nchromosomes 4\nchurn 4\ncinematic 4\ncivilizations 4\nclad 4\nclaimant 4\nclaws 4\ncleansing 4\nclearest 4\nclerical 4\ncliched 4\nclimbs 4\nclinics 4\nclique 4\nclocks 4\nclouded 4\ncloudy 4\nclumsy 4\nclutching 4\nco-chairman 4\nco-ordination 4\nco-sponsor 4\nco-sponsors 4\ncoals 4\ncoexist 
4\ncoincide 4\ncoined 4\ncollaborate 4\ncollectibles 4\ncombatants 4\ncomforting 4\ncomics 4\ncommandos 4\ncommencement 4\ncommentator 4\ncommissioning 4\ncomp.sources.unix 4\ncompanionship 4\ncomparative 4\ncompel 4\ncompetes 4\ncompletes 4\ncomprehend 4\ncompressors 4\ncomprises 4\ncomprising 4\ncondemns 4\nconditionally 4\ncondo 4\ncondolence 4\ncondoms 4\nconducts 4\nconduits 4\nconfessions 4\nconfided 4\nconfiguration 4\nconforming 4\nconfrontational 4\ncongratulating 4\nconnector 4\nconsciously 4\nconservatism 4\nconserving 4\nconsistency 4\nconsortiums 4\nconstructions 4\nconsular 4\ncontemptible 4\ncontests 4\ncontributor 4\nconveniences 4\nconverts 4\ncoolly 4\ncoop 4\ncopier 4\ncoronary 4\ncoroner 4\ncorpus 4\ncorrections 4\ncottage 4\ncounterproductive 4\ncounterrevolutionary 4\ncountersuit 4\ncounterterrorism 4\ncoupe 4\ncouplet 4\ncourted 4\ncowardly 4\ncrane 4\ncrave 4\ncraving 4\ncrawling 4\ncreed 4\ncrept 4\ncrippling 4\ncrisp 4\ncrooked 4\ncrooks 4\ncrusade 4\ncues 4\nculprit 4\ncurative 4\ncursed 4\ncurses 4\ncurtains 4\ncurves 4\ncutback 4\ncy 4\ncyberspace 4\ncyclosporine 4\ncynicism 4\nczar 4\ndamping 4\ndancer 4\ndances 4\ndangling 4\ndares 4\ndashed 4\ndaysafter 4\ndearest 4\ndearth 4\ndebated 4\ndeceit 4\ndecentralized 4\ndeducting 4\ndeepened 4\ndehydrated 4\ndeity 4\ndelightful 4\ndemonize 4\ndemons 4\ndemutualization 4\ndepleted 4\ndeported 4\ndeposited 4\ndepositions 4\ndepositors 4\ndepressive 4\nderivatives 4\nderives 4\ndescendant 4\ndesecration 4\ndespicable 4\ndespised 4\ndetecting 4\ndeteriorates 4\ndeterrence 4\ndeterrent 4\ndetoxification 4\ndevotes 4\ndew 4\ndhin 4\ndiabetic 4\ndiagnose 4\ndiagonal 4\ndiapers 4\ndiaries 4\ndictated 4\ndictionary 4\ndiffuse 4\ndigest 4\ndignitaries 4\ndilemmas 4\ndim 4\ndimly 4\ndined 4\ndinners 4\ndinosaurs 4\ndisagreements 4\ndisapproval 4\ndisarming 4\ndiscoveries 4\ndiscovers 4\ndiscrete 4\ndiscriminate 4\ndiscs 4\ndisgust 4\ndisgusted 4\ndishonest 4\ndismay 4\ndismayed 4\ndismissing 
4\ndisorderly 4\ndispelled 4\ndispersant 4\ndispleased 4\ndisproportionately 4\ndisregarded 4\ndisruptive 4\ndissolving 4\ndistinguishes 4\ndistorting 4\ndistressing 4\nditch 4\ndiva 4\ndived 4\ndivergence 4\ndiverting 4\ndivides 4\ndivine 4\ndocked 4\ndoll 4\ndomains 4\ndons 4\ndormitory 4\ndou 4\ndownfall 4\ndowngrading 4\ndownplayed 4\ndownsizing 4\ndrags 4\ndrains 4\ndreaded 4\ndreaming 4\ndrilled 4\ndroppers 4\ndroves 4\ndrugstore 4\ndrummer 4\ndrunken 4\nduduyuu 4\nduel 4\ndues 4\ndummy 4\ndusty 4\ndwarfs 4\ndwindled 4\neCommerce 4\nearthworms 4\neastward 4\nebb 4\nechoing 4\necosystem 4\nedited 4\nediting 4\nejected 4\nelectorate 4\nelectrolysis 4\nelevators 4\neligibility 4\neloquent 4\nelusive 4\nemailed 4\nemancipation 4\nembark 4\nembodies 4\nempower 4\nendeavors 4\nendorse 4\nendowed 4\nendowment 4\nengages 4\nenhances 4\nenlarged 4\nenlightening 4\nenlightenment 4\nenrichment 4\nenrolled 4\nentertain 4\nentitle 4\nenviable 4\nenvisioned 4\nequitable 4\neroticism 4\nescrow 4\nesoteric 4\nessays 4\nesteem 4\nethos 4\netiquette 4\nevaporated 4\nevolutionary 4\nevolving 4\nex-wife 4\nexaggerate 4\nexaggerating 4\nexaminers 4\nexcerpts 4\nexecutes 4\nexemptions 4\nexerting 4\nexiles 4\nexits 4\nexpats 4\nexpectant 4\nexpedite 4\nexpedition 4\nexpelling 4\nexperimented 4\nexplores 4\nexposition 4\nexposures 4\nexterior 4\nexterminate 4\nextracting 4\nextrusion 4\neyed 4\neyewitnesses 4\nfabulous 4\nfacade 4\nfactual 4\nfades 4\nfainting 4\nfamine 4\nfanaticism 4\nfangs 4\nfarce 4\nfascist 4\nfasteners 4\nfated 4\nfaulty 4\nfaux 4\nfearsome 4\nfestive 4\nfewest 4\nfi 4\nfictional 4\nfiddle 4\nfilial 4\nfiltering 4\nfinalize 4\nfingerprint 4\nfirepower 4\nfirstly 4\nfists 4\nfixtures 4\nflakes 4\nflaps 4\nflared 4\nflashlight 4\nflatten 4\nflaunt 4\nflextime 4\nflirting 4\nflock 4\nflooring 4\nflour 4\nfluctuate 4\nfluently 4\nflute 4\nfolder 4\nfolklore 4\nfoodstuffs 4\nfootwear 4\nforceful 4\nforcibly 4\nforecasted 4\nforesee 4\nforeseen 4\nforgery 
4\nforgiven 4\nforgiving 4\nforma 4\nformats 4\nfortified 4\nfountains 4\nfox 4\nfractional 4\nfractionally 4\nfractured 4\nframes 4\nfranchiser 4\nfranz371...@gmail.com 4\nfraught 4\nfreckles 4\nfreshman 4\nfreshmen 4\nfret 4\nfriendships 4\nfries 4\nfringe 4\nfrog 4\nfruitful 4\nfuck 4\nfulfillment 4\nfullest 4\nfunctionality 4\nfunctionally 4\nfurnace 4\nfurnaces 4\ngall 4\ngalleries 4\ngamut 4\ngangster 4\ngarde 4\ngardening 4\ngasket 4\ngasolines 4\ngated 4\ngda 4\ngears 4\ngems 4\ngentlemen 4\nghostbusting 4\ngig 4\ngimmickry 4\ngimmicks 4\ngirlfriends 4\ngladly 4\nglands 4\nglitch 4\nglitzy 4\ngloss 4\nglove 4\ngoats 4\ngoings 4\ngood-bye 4\ngospel 4\ngown 4\ngraceful 4\ngracefully 4\ngracious 4\ngraders 4\ngrains 4\ngrandly 4\ngrandstands 4\ngrapes 4\ngrapevine 4\ngraph 4\ngraveyard 4\ngrenade 4\ngrey 4\ngrieve 4\ngrisly 4\ngrouped 4\ngrueling 4\nguise 4\nhackles 4\nhah 4\nhai 4\nhail 4\nhails 4\nhallmark 4\nhallowed 4\nhalved 4\nhangar 4\nhappiest 4\nharassment 4\nharbinger 4\nhardworking 4\nharmless 4\nharvested 4\nhashish 4\nhateful 4\nhay 4\nhealthily 4\nheartfelt 4\nheartland 4\nhehe 4\nheighten 4\nhemisphere 4\nhemoglobin 4\nherb 4\nherders 4\nherds 4\nhereby 4\nheyday 4\nhibernation 4\nhiking 4\nhissed 4\nhitter 4\nhoard 4\nhobbled 4\nhobby 4\nhockey 4\nhomecoming 4\nhomosexual 4\nhoned 4\nhonors 4\nhopelessly 4\nhorizons 4\nhorns 4\nhoses 4\nhospitalization 4\nhostel 4\nhostess 4\nhotly 4\nhotspot 4\nhotspots 4\nhowling 4\nhubbub 4\nhuh 4\nhumiliating 4\nhurled 4\nhurried 4\nhurriedly 4\nhwa 4\nhydroelectric 4\nhysteria 4\nidealistic 4\nidol 4\nie 4\nignite 4\nignited 4\nil 4\nilliterate 4\nillnesses 4\nimitated 4\nimitating 4\nimpacted 4\nimperfections 4\nimpervious 4\nimpetuous 4\nimplacable 4\nimplicitly 4\nimplying 4\nimportation 4\nimposition 4\nimprovised 4\ninaccessible 4\ninaugurated 4\nincessantly 4\nincidental 4\ninclinations 4\ninconsistent 4\nincorporating 4\nincrements 4\nindebted 4\nindependents 4\nindexed 4\nindoors 4\ninept 
4\ninexorable 4\ninfancy 4\ninfertile 4\ninfiltrated 4\ninflammation 4\ninflate 4\ninflationary 4\ninflow 4\ninhumane 4\ninitials 4\ninjure 4\ninkling 4\ninmate 4\ninnings 4\nins 4\ninsert 4\ninsistent 4\ninsole 4\ninsoles 4\ninspecting 4\ninstituting 4\ninstructors 4\ninsures 4\nint 4\nintelligently 4\nintentioned 4\ninteracting 4\ninteriors 4\ninterpersonal 4\ninterpreting 4\ninterrogation 4\nintersections 4\ninterstates 4\ninterviewers 4\nintestines 4\nintifada 4\nintolerable 4\nintolerance 4\nintoxicated 4\nintractable 4\nintrinsic 4\ninvalid 4\ninvasive 4\ninvincible 4\ninvoicing 4\ninvoking 4\nirregularities 4\nirreparable 4\nirrespective 4\nirritation 4\njagged 4\nje 4\njeopardizing 4\njewels 4\njin 4\njoking 4\njolts 4\njuicy 4\njuridical 4\njursidictions 4\njustifications 4\njustifying 4\nkanji 4\nketchup 4\nkickback 4\nkicker 4\nkiddies 4\nkindled 4\nkinship 4\nkits 4\nkitty 4\nkneaded 4\nknitting 4\nknob 4\nknots 4\nkylix 4\nl/c 4\nlaced 4\nlackeys 4\nlad 4\nlakes 4\nlanguishing 4\nlasers 4\nlatex 4\nlaurie.ellis@enron.com 4\nleaflets 4\nleans 4\nleapfrog 4\nleeway 4\nlegality 4\nlegalization 4\nlegalizing 4\nlemon 4\nleveling 4\nlevied 4\nliar 4\nliars 4\nliberalized 4\nliberating 4\nliberties 4\nlicensee 4\nlicking 4\nlien 4\nlighted 4\nlinage 4\nlineatus 4\nliners 4\nlions 4\nliq 4\nliquefy 4\nliteral 4\nlocales 4\nlocalization 4\nlocating 4\nlodge 4\nlodged 4\nlogically 4\nlogistical 4\nlone 4\nloosening 4\nlope 4\nlotion 4\nlouder 4\nlukewarm 4\nlull 4\nlun 4\nlush 4\nlust 4\nluster 4\nmachetes 4\nmagnate 4\nmainlander 4\nmaitre 4\nmajestic 4\nmalaise 4\nmalaria 4\nmammal 4\nmanagements 4\nmaniac 4\nmanipulative 4\nmansions 4\nmarches 4\nmare 4\nmarginalized 4\nmarkings 4\nmartyred 4\nmasonry 4\nmasseurs 4\nmasseuse 4\nmater 4\nmaterialize 4\nmathematics 4\nmatrix 4\nmausoleum 4\nmaxim 4\nmaze 4\nmeats 4\nmechanically 4\nmediated 4\nmeditation 4\nmega 4\nmellow 4\nmelodies 4\nmelons 4\nmemberships 4\nmemorabilia 4\nmessing 4\nmessy 4\nmetallurgical 
4\nmetro 4\nmexico 4\nmicrocomputer 4\nmicroelectronics 4\nmid-August 4\nmid-November 4\nmidmorning 4\nmidsole 4\nmidterms 4\nmidtown 4\nmigrants 4\nmillennial 4\nmillionaire 4\nmindedness 4\nmingling 4\nminimalist 4\nminimizing 4\nmirrors 4\nmiscalculated 4\nmiserably 4\nmistress 4\nmistrust 4\nmisuse 4\nmitigate 4\nmmbtu 4\nmoaning 4\nmobilization 4\nmock 4\nmodeled 4\nmodification 4\nmolecule 4\nmolecules 4\nmomentary 4\nmonetarist 4\nmonetization 4\nmoneyed 4\nmoniker 4\nmonkey 4\nmoons 4\nmoribund 4\nmosquitoes 4\nmotif 4\nmotorcade 4\nmovers 4\nmudslides 4\nmultilateral 4\nmultimedia 4\nmultitude 4\nmurky 4\nmute 4\nmyriad 4\nmystical 4\nmythic 4\nnarcotics 4\nnarrative 4\nnatives 4\nnearer 4\nnears 4\nnecessitated 4\nnecklace 4\nnecks 4\nneedy 4\nnemesis 4\nnets 4\nnetting 4\nnettlesome 4\nnewscast 4\nnewsgroup 4\nnicest 4\nnightclub 4\nnominations 4\nnon-deductible 4\nnon-degree 4\nnon-duck 4\nnon-governmental 4\nnondescript 4\nnonfiction 4\nnortheastern 4\nnoses 4\nnotifying 4\nnotoriety 4\nnozzle 4\nnuances 4\nnuclei 4\nnudge 4\nnull 4\nnun 4\nnursed 4\nnut 4\nobesity 4\nobey 4\nobjectively 4\nobservable 4\nobservance 4\nobstructing 4\noccured 4\noctane 4\noffended 4\noffenders 4\noffing 4\noilfield 4\noily 4\nonetime 4\nontheway 4\nopium 4\nopportunists 4\noppressors 4\noptimized 4\norchestra 4\norganizers 4\norgy 4\noriginations 4\nostensibly 4\noutfits 4\noutlooks 4\noutpaced 4\noutstripped 4\noverdependence 4\noverflowing 4\noverhang 4\noverlap 4\noverlapping 4\noverriding 4\noversaw 4\noversized 4\noversold 4\noverthrowing 4\novertures 4\nowing 4\npads 4\npail 4\npainstaking 4\npalaces 4\npancreas 4\npanties 4\nparachutes 4\nparagraphs 4\nparakeets 4\nparallels 4\nparalysis 4\nparaphernalia 4\npardon 4\npardoned 4\npardons 4\nparticles 4\npasted 4\npastor 4\npastoral 4\npastry 4\npatched 4\npatronage 4\npaychecks 4\npayola 4\npayrolls 4\npeacock 4\npearls 4\npears 4\npeeled 4\npenchant 4\npenetrated 4\npens 4\npent 4\nperched 4\nperchlorate 
4\nperennial 4\nperforms 4\nperils 4\nperimeter 4\nperipheral 4\nperish 4\nperks 4\nperpetual 4\npersecuted 4\nperspectives 4\npersuading 4\npersuasive 4\npessimism 4\nphantom 4\npharmacy 4\nphenomena 4\nphenomenal 4\nphilosophers 4\nphilosophies 4\npho 4\nphobia 4\nphonograph 4\nphotography 4\npicket 4\npico 4\npictured 4\npimp 4\npineapple 4\npitchers 4\nplacements 4\nplague 4\nplantations 4\nplayoff 4\npleas 4\npleasantries 4\nplebiscite 4\npledging 4\nplush 4\nply 4\nplying 4\nplywood 4\npointer 4\npointless 4\npoisons 4\npolar 4\npolluted 4\npolo 4\npolygamous 4\npolypropylene 4\npolyurethane 4\nporcelain 4\nportal 4\npostage 4\nposturing 4\npotlatch 4\npounded 4\npours 4\npragmatism 4\npreaching 4\nprecedence 4\nprecipitated 4\nprecipitous 4\npreface 4\npreferably 4\nprescriptions 4\npresiding 4\npreview 4\npricey 4\nprints 4\nprivatize 4\npro-Beijing 4\npro-union 4\nprocrastination 4\nprod 4\nprofessions 4\nproficiency 4\nprofitably 4\nproliferate 4\npropel 4\nproponent 4\nproration 4\nprosecute 4\nprotectionism 4\nprotege 4\npsychic 4\npuck 4\npulpit 4\npunches 4\npurposely 4\npurse 4\npursuits 4\nquack 4\nquashed 4\nquery 4\nquilt 4\nquips 4\nquotation 4\nracehorse 4\nragged 4\nraging 4\nraiding 4\nraids 4\nrainbow 4\nrainbows 4\nrained 4\nrallying 4\nrapist 4\nrationalize 4\nrattle 4\nre-evaluate 4\nre-examine 4\nreactionary 4\nreadings 4\nrealists 4\nreaped 4\nrearing 4\nreassess 4\nreassigned 4\nreassume 4\nrebut 4\nrecast 4\nreceptionist 4\nrecklessly 4\nreckoned 4\nreckoning 4\nreclusion 4\nreconsideration 4\nreconstructing 4\nrecounting 4\nrecoverable 4\nrecurrence 4\nrecycle 4\nredesigned 4\nredirect 4\nredistribution 4\nreeled 4\nrefinance 4\nrefocus 4\nrefurbished 4\nrefurbishment 4\nreigned 4\nreigning 4\nrejects 4\nrejoin 4\nrekindle 4\nrelayed 4\nrelentlessly 4\nreligiously 4\nrelinquished 4\nremitted 4\nrenaissance 4\nrenal 4\nrendering 4\nrendezvous 4\nrenown 4\nreorganize 4\nrep 4\nrepel 4\nrepertoire 4\nrepressive 4\nreptile 4\nrepublican 
4\nrepulsive 4\nrepurchased 4\nreschedule 4\nrescheduled 4\nrescissions 4\nreseller 4\nresembling 4\nresentful 4\nreservoir 4\nreshaping 4\nreside 4\nresiliency 4\nresistances 4\nresists 4\nresold 4\nresolute 4\nresonance 4\nresponds 4\nrested 4\nresting 4\nrestructurings 4\nrests 4\nresurfaced 4\nresurgent 4\nrethink 4\nretribution 4\nretrofitting 4\nreunion 4\nreunited 4\nreverence 4\nrevisit 4\nrevitalize 4\nrevivalist 4\nrewrite 4\nribbons 4\nridiculed 4\nridiculously 4\nrife 4\nrifles 4\nrightly 4\nrigs 4\nrinsed 4\nripping 4\nritual 4\nroadblocks 4\nroar 4\nrodents 4\nrods 4\nroiling 4\nroofs 4\nroost 4\nrooster 4\nropes 4\nrosy 4\nrot 4\nrotary 4\nrudder 4\nrulings 4\nrumbling 4\nrunners 4\nrupture 4\nrye 4\nsacrifices 4\nsacrificing 4\nsaddled 4\nsailed 4\nsalute 4\nsampled 4\nsan 4\nsandy 4\nsatisfies 4\nsavoring 4\nscaling 4\nscarred 4\nschoolteacher 4\nscoop 4\nscoring 4\nscorn 4\nscout 4\nscratching 4\nscreamed 4\nscreeching 4\nscrew 4\nscripting 4\nscrub 4\nsculpted 4\nsealing 4\nseams 4\nseawater 4\nsemantics 4\nsensory 4\nsentimental 4\nseperate 4\nsew 4\nsexy 4\nshakes 4\nshampoo 4\nshark 4\nsharpest 4\nshave 4\nshied 4\nshields 4\nshithole 4\nshone 4\nshootings 4\nshores 4\nshortest 4\nshorts 4\nshovel 4\nshovels 4\nshowers 4\nshrewd 4\nshrouded 4\nshuffle 4\nshun 4\nshuttling 4\nsickness 4\nsighted 4\nsightings 4\nsignatures 4\nsignifying 4\nsimulate 4\nsinful 4\nsinister 4\nsketches 4\nskid 4\nskier 4\nskillfully 4\nskins 4\nskip 4\nskirts 4\nskittish 4\nskyscraper 4\nslabs 4\nslave 4\nslaves 4\nsleazy 4\nsleepers 4\nsleeps 4\nsleeve 4\nslime 4\nsliver 4\nsluggishness 4\nsmartest 4\nsmashes 4\nsmother 4\nsmuggle 4\nsmuggled 4\nsnacks 4\nsnags 4\nsnaps 4\nsnatch 4\nsniffing 4\nsocialization 4\nsociologists 4\nsodium 4\nsofa 4\nsoftened 4\nsolace 4\nsolemnly 4\nsolidify 4\nsolidly 4\nsolvent 4\nsophomore 4\nsorely 4\nsorghum 4\nsorrows 4\nsoundtrack 4\nsowing 4\nsown 4\nspares 4\nsparkling 4\nspecials 4\nspecifying 4\nspices 4\nsplash 4\nspoiled 
4\nsponsoring 4\nsponsorship 4\nspringing 4\nsprinkle 4\nspying 4\nsquadrons 4\nsquarely 4\nstaid 4\nstaked 4\nstared 4\nstarve 4\nstatehood 4\nstatic 4\nstationary 4\nstationery 4\nstaunchly 4\nsteak 4\nsteals 4\nsternly 4\nstickers 4\nsticky 4\nstifling 4\nstigma 4\nstimulating 4\nstingy 4\nstockholder 4\nstockpiles 4\nstomachs 4\nstoppage 4\nstormy 4\nstout 4\nstraighten 4\nstraits 4\nstrangely 4\nstratum 4\nstray 4\nstreamed 4\nstreaming 4\nstrewn 4\nstricter 4\nstride 4\nstrives 4\nstunt 4\nstylish 4\nsubcompact 4\nsubdued 4\nsubgroups 4\nsubmarines 4\nsuckers 4\nsuffice 4\nsuitcase 4\nsummarily 4\nsuperb 4\nsupercomputers 4\nsuperconductivity 4\nsuperiors 4\nsupremely 4\nsurname 4\nsurrealists 4\nsurtax 4\nsurveying 4\nsustaining 4\nswarmed 4\nswipe 4\nswollen 4\nsymbolism 4\nsympathies 4\nsyndicator 4\nsyntax 4\nsynthesis 4\ntakeoff 4\ntat 4\nteamed 4\ntedious 4\ntelecom 4\ntelemarketing 4\ntelework 4\ntemples 4\ntending 4\ntenor 4\ntheorist 4\ntherapeutic 4\ntherein 4\nthereof 4\nthickness 4\nthinker 4\nthinnest 4\nthirst 4\nthirsty 4\nthirties 4\nthoroughfare 4\nthoroughfares 4\nthunder 4\nticking 4\ntides 4\ntilted 4\ntimers 4\ntipped 4\ntitans 4\ntoned 4\ntonic 4\ntopaz 4\ntorchbearer 4\ntotalitarian 4\ntoughness 4\ntout 4\ntracts 4\ntranscripts 4\ntranslates 4\ntransmitting 4\ntransplanted 4\ntraumatic 4\ntraveler 4\ntread 4\ntrenches 4\ntribulations 4\ntrillions 4\ntriumphed 4\ntroupes 4\ntrumpet 4\ntsunami 4\ntuberculosis 4\nturbulent 4\ntwins 4\ntyphoons 4\numbrella 4\nunaffected 4\nuncanny 4\nunchallenged 4\nunconstitutionality 4\nundefined 4\nunderdeveloped 4\nunderestimate 4\nundergraduate 4\nunderpin 4\nunderpinned 4\nunderstandably 4\nunderstatement 4\nundertakings 4\nunderwent 4\nundue 4\nunethical 4\nunfilled 4\nunforgettable 4\nunfulfilled 4\nunimaginable 4\nuninformed 4\nuninspired 4\nuninterrupted 4\nuninvited 4\nunloaded 4\nunlucky 4\nunnecessarily 4\nunofficial 4\nunofficially 4\nunprofessional 4\nunrestrained 4\nunruly 4\nunsettling 
4\nuntapped 4\nuntold 4\nunveiling 4\nunworthy 4\nupstart 4\nupwards 4\nvacationers 4\nvalve 4\nveins 4\nvengeance 4\nveracity 4\nversa 4\nvertically 4\nvice-chief 4\nvice-premier 4\nvice-secretary 4\nviciously 4\nvigilant 4\nvirulence 4\nvoltage 4\nvoodoo 4\nvouchers 4\nvying 4\nw 4\nwakes 4\nwaking 4\nwallpaper 4\nwandered 4\nwandering 4\nwane 4\nwaning 4\nwards 4\nwares 4\nwarlords 4\nwarranties 4\nwarren 4\nwartime 4\nwatchdog 4\nwatt 4\nwavering 4\nwax 4\nweakens 4\nwealthier 4\nweddings 4\nwept 4\nwestward 4\nwhiff 4\nwhirlwind 4\nwhistle 4\nwhittled 4\nwholesaler 4\nwidest 4\nwield 4\nwisely 4\nwithstood 4\nwitty 4\nwmkj2006 4\nwomb 4\nwonderlich 4\nwooing 4\nworkshop 4\nworkweek 4\nwrongful 4\nwww.120zy.com 4\nye 4\nyearning 4\nyoke 4\nyuhua 4\nzeal 4\nzealous 4\nzeros 4\nzombie 4\nzoom 4\n~ 4\n→ 4\n!!? 3\n'68 3\n'86 3\n'94 3\n'97 3\n'N 3\n+Killing 3\n--------------------------------------------------------------------- 3\n...! 3\n0.02 3\n0.17 3\n0.53 3\n0.75 3\n0.95 3\n01/19/2001 3\n01:58 3\n03/02/2001 3\n03:20 3\n04/16/2001 3\n06/01/2001 3\n09/20/2000 3\n1,040 3\n1,350 3\n1,700 3\n1,900 3\n1.09 3\n1.17 3\n1.34 3\n1.39 3\n1.41 3\n1.45 3\n1.47 3\n1.56 3\n1.5795 3\n1.5820 3\n1.60 3\n1.625 3\n1.67 3\n1.70 3\n1.72 3\n1.74 3\n1.81 3\n1.8400 3\n1.8415 3\n1.91 3\n1.93 3\n1/32 3\n10.37 3\n10.59 3\n10/20/2000 3\n10/23/2000 3\n10/26/2000 3\n11.1 3\n11.6 3\n11/29/2000 3\n110,000 3\n110.6 3\n1100 3\n1111 3\n113th 3\n119.88 3\n11:32 3\n11:45 3\n12.45 3\n12.8 3\n129 3\n12:12 3\n12:30 3\n13.625 3\n13.75 3\n131 3\n139 3\n14.06 3\n14.75 3\n140,000 3\n142.85 3\n144 3\n15.125 3\n15.3 3\n15.50 3\n15.8 3\n151,000 3\n1560 3\n1580 3\n1584 3\n1594 3\n16.5 3\n16.6 3\n1644 3\n169 3\n17.95 3\n172 3\n174 3\n178.5 3\n18.1 3\n18.75 3\n18.9 3\n181 3\n189 3\n1891 3\n19.5 3\n19.50 3\n1915 3\n193.3 3\n1931 3\n1933 3\n196 3\n1960's 3\n197 3\n1980's 3\n1989A 3\n199 3\n2,200 3\n2,700 3\n2.06 3\n2.125 3\n2.14 3\n2.15 3\n2.21 3\n2.23 3\n2.29 3\n2.30 3\n2.35 3\n2.38 3\n2.41 3\n2.51 3\n2.53 
3\n2.61 3\n2.62 3\n2.63 3\n2.65 3\n2.73 3\n2.82 3\n2.875 3\n2.88 3\n2/3 3\n20.6 3\n20.doc 3\n2013 3\n2023 3\n204 3\n206 3\n20s 3\n21.1 3\n21.6 3\n21/32 3\n213 3\n217 3\n218 3\n222 3\n223 3\n23.6 3\n234 3\n238 3\n24.2 3\n24.4 3\n24.8 3\n24.875 3\n240,000 3\n246 3\n248 3\n25.2 3\n25.4 3\n25/32 3\n253 3\n254 3\n2596.72 3\n26.50 3\n26.9 3\n264 3\n266 3\n267 3\n2689.14 3\n27.8 3\n276.8 3\n28.4 3\n28.6 3\n28.7 3\n28.75 3\n28/32 3\n289 3\n291 3\n293 3\n295870 3\n2:30 3\n3.03 3\n3.10 3\n3.19 3\n3.33 3\n3.36 3\n3.40 3\n3.45 3\n3.53 3\n3.55 3\n3.625 3\n3.64 3\n3.85 3\n3.90 3\n3/16 3\n30.1 3\n30.7 3\n300ZX 3\n303 3\n305 3\n3090 3\n30s 3\n31.1 3\n31.25 3\n31.5 3\n310 3\n312 3\n32,000 3\n32.8 3\n326 3\n33.3 3\n33.6 3\n336 3\n34,000 3\n34.2 3\n34.3 3\n352 3\n355 3\n36349 3\n368 3\n37.1 3\n37.75 3\n37th 3\n38.2 3\n38.50 3\n39.8 3\n392 3\n393 3\n3M 3\n4,400 3\n4.0 3\n4.12 3\n4.15 3\n4.375 3\n4.50 3\n4.56 3\n4.68 3\n4.90 3\n4.97 3\n40.6 3\n405 3\n40s 3\n41.3 3\n41.8 3\n412 3\n416 3\n42nd 3\n43,000 3\n430,000 3\n44.5 3\n440 3\n444 3\n45.2 3\n452 3\n465 3\n47.1 3\n47.7 3\n470.80 3\n473 3\n475,000 3\n480 3\n488 3\n49.7 3\n496 3\n5,600 3\n5.09 3\n5.16 3\n5.27 3\n5.32 3\n5.70 3\n5.80 3\n50.6 3\n50s 3\n52.2 3\n52.7 3\n526 3\n53.1 3\n53.7 3\n53.9 3\n53rd 3\n551 3\n55th 3\n56.9 3\n572 3\n58,000 3\n58.9 3\n580 3\n5:30 3\n6,500 3\n6.07 3\n6.20 3\n6.30 3\n6.45 3\n6.50 3\n6.76 3\n625 3\n64.9 3\n646-2600 3\n65,000 3\n658 3\n670 3\n684 3\n7.0 3\n7.12 3\n7.227 3\n7.31 3\n7.32 3\n7.54 3\n7.65 3\n7.77 3\n7.80 3\n7.81 3\n7.962 3\n7.97 3\n7.986 3\n70.1 3\n70.62. 3\n70th 3\n72.2 3\n720,000 3\n727 3\n737 3\n74419 3\n765 3\n780 3\n784 3\n7:00 3\n8.00 3\n8.08 3\n8.125 3\n8.21 3\n8.26 3\n8.292 3\n8.325 3\n8.48 3\n8.52 3\n8.59 3\n8.61 3\n8.90 3\n8.95 3\n813 3\n83.7 3\n84.3 3\n853-7906 3\n862 3\n866 3\n88.8 3\n884 3\n89.6 3\n8:45 3\n9.06 3\n9.25 3\n9.33 3\n9.35 3\n9.39 3\n9.45 3\n9.78 3\n9.81 3\n9.875 3\n9/11 3\n942 3\n963 3\n986 3\n99.1875 3\n99.35 3\n999 3\n:? 3\n;-RRB- 3\n;. 
3\nA&M 3\nA1 3\nAA 3\nACLU 3\nADN 3\nADRs 3\nAEG 3\nAIR 3\nAIW 3\nALWAYS 3\nAMAZING 3\nAMS 3\nAMT 3\nAPEC 3\nAPF 3\nAPI 3\nAV 3\nAVX 3\nAWAY 3\nAba 3\nAbbott 3\nAberdeen 3\nAble 3\nAborigines 3\nAccessories 3\nAccident 3\nAccordingly 3\nAccords 3\nAccount 3\nAce 3\nAchille 3\nAchilles 3\nAcrobat 3\nActivists 3\nActivities 3\nActual 3\nAddington 3\nAddressing 3\nAdia 3\nAdministrator 3\nAdministrators 3\nAdults 3\nAdvance 3\nAerocom 3\nAfghans 3\nAfshari 3\nAfterward 3\nAges 3\nAhead 3\nAhnaf 3\nAids 3\nAiguo 3\nAiken 3\nAiko 3\nAiles 3\nAiling 3\nAirborne 3\nAirplanes 3\nAjinomoto 3\nAkron 3\nAlamos 3\nAlbay 3\nAlberto 3\nAlcan 3\nAlec 3\nAlex. 3\nAlexis 3\nAlgerian 3\nAllison 3\nAllowing 3\nAlltel 3\nAlthea 3\nAltimari 3\nAltogether 3\nAlton 3\nAluminium 3\nAlusuisse 3\nAlvaro 3\nAlxa 3\nAmateur 3\nAmber 3\nAmen 3\nAmeritech 3\nAmin 3\nAmmann 3\nAmram 3\nAmre 3\nAmstrad 3\nAmtech 3\nAmway 3\nAnalog 3\nAndes 3\nAndre 3\nAngela 3\nAngell 3\nAngola 3\nAnimal 3\nAnita 3\nAnshe 3\nAntonini 3\nAnyways 3\nApartments 3\nAppel 3\nApplebaum 3\nAppleyard 3\nApplications 3\nApplying 3\nApproved 3\nArch 3\nArchitecture 3\nArchuleta 3\nArdebili 3\nAreas 3\nArena 3\nArmored 3\nArps 3\nArrow 3\nArroyo 3\nArthel 3\nArthritis 3\nAruba 3\nAscorbic 3\nAsha 3\nAsher 3\nAslanian 3\nAso 3\nAspartame 3\nAssume 3\nAstonishing 3\nAstronomical 3\nAstronomy 3\nAtari 3\nAthena 3\nAtsushi 3\nAtta 3\nAudi 3\nAunt 3\nAustralians 3\nAustrian 3\nAutos 3\nAvalon 3\nAvon 3\nAwadi 3\nAwesome 3\nAwolowo 3\nAwwad 3\nB.F. 
3\nB2 3\nB2B 3\nBABA 3\nBAD 3\nBECAUSE 3\nBEIJING 3\nBIP 3\nBNA 3\nBPC 3\nBSN 3\nBUSH 3\nBa3 3\nBaa 3\nBaathist 3\nBacarella 3\nBach 3\nBackroom 3\nBadgers 3\nBaer 3\nBagger 3\nBairam 3\nBaja 3\nBakersfield 3\nBalance 3\nBalcor 3\nBalfour 3\nBallard 3\nBallhaus 3\nBalzac 3\nBamboo 3\nBand 3\nBang 3\nBarakat 3\nBargain 3\nBaring 3\nBarletta 3\nBarnes 3\nBarris 3\nBarrow 3\nBasel 3\nBasf 3\nBashiti 3\nBasil 3\nBasketball 3\nBattalion 3\nBattery 3\nBayerische 3\nBeagle 3\nBears 3\nBeatles 3\nBeau 3\nBeckett 3\nBedford 3\nBeethoven 3\nBei 3\nBeichuan 3\nBelgique 3\nBelize 3\nBello 3\nBelmont 3\nBelo 3\nBelt 3\nBenefits 3\nBengali 3\nBernhard 3\nBerri 3\nBerthold 3\nBiblical 3\nBigger 3\nBiho 3\nBike 3\nBiking 3\nBillie 3\nBilzerian 3\nBiny 3\nBioSciences 3\nBiochips 3\nBiological 3\nBiondi 3\nBiotech 3\nBirtcher 3\nBlandings 3\nBletchley 3\nBloch 3\nBlocked 3\nBlodgett 3\nBloedel 3\nBlogger 3\nBloomington 3\nBlues 3\nBluff 3\nBoehm 3\nBogota 3\nBoje 3\nBolinas 3\nBolling 3\nBon 3\nBonanza 3\nBonnaroo 3\nBonnie 3\nBoss 3\nBougainville 3\nBouillaire 3\nBouygues 3\nBowa 3\nBowflex 3\nBowles 3\nBowling 3\nBrae 3\nBranca 3\nBraniff 3\nBravo 3\nBreakfast 3\nBreakneck 3\nBredesen 3\nBremen 3\nBrick 3\nBridges 3\nBrigade 3\nBrigadier 3\nBring 3\nBroadcast 3\nBroader 3\nBroberg 3\nBronze 3\nBrooke 3\nBrowning 3\nBruyette 3\nBubble 3\nBuchner 3\nBuddhas 3\nBuilders 3\nBullocks 3\nBureaus 3\nBurroughs 3\nBushes 3\nButayhan 3\nButcher 3\nButterfinger 3\nButz 3\nBuyers 3\nByron 3\nCAME 3\nCAR 3\nCFA 3\nCHICAGO 3\nCIC 3\nCOCOA 3\nCOMPANY 3\nCOMPUTER 3\nCOPPER 3\nCORPORATE 3\nCR 3\nCRA 3\nCSC 3\nCVN 3\nCabula 3\nCaishun 3\nCake 3\nCalMat 3\nCalgene 3\nCallister 3\nCanberra 3\nCannavaro 3\nCannes 3\nCapsule 3\nCarder 3\nCaritas 3\nCarlo 3\nCarmon 3\nCarty 3\nCases 3\nCass 3\nCastaneda 3\nCat 3\nCatastrophic 3\nCatch 3\nCategory 3\nCatherall 3\nCaution 3\nCawling 3\nCayne 3\nCell 3\nCertificate 3\nCertificates 3\nCertified 3\nCessna 3\nChai 3\nChallenger 3\nChamberlain 3\nChanel 
3\nChangrui 3\nChavalit 3\nCheap 3\nCheapest 3\nChecking 3\nCheerios 3\nChengyu 3\nChernoff 3\nChess 3\nChester 3\nChesterfield 3\nChiao 3\nChiaotung 3\nChiat 3\nChihuahua 3\nChilds 3\nChiriqui 3\nCholet 3\nChoose 3\nChosen 3\nChow 3\nChristchurch 3\nChristy 3\nChunghsing 3\nChunghwa 3\nChyuan 3\nCinderella 3\nCinema 3\nCingular 3\nCircle 3\nCivilized 3\nClaiming 3\nClay 3\nClayton 3\nClifton 3\nClint 3\nClosed 3\nClosing 3\nClough 3\nCnews 3\nCoach 3\nCochran 3\nCocoa 3\nCodover 3\nCollaborator 3\nCollor 3\nColodny 3\nComfort 3\nCommissioners 3\nCommittees 3\nCompassionate 3\nCompatriots 3\nCompuServe 3\nComrades 3\nCon 3\nConcorde 3\nConde 3\nConfair 3\nConfidence 3\nConfidential 3\nConfronted 3\nCongratulations 3\nConseco 3\nConsidered 3\nConstance 3\nConstitutional 3\nContinuing 3\nContraris 3\nContrary 3\nConventions 3\nCopies 3\nCops 3\nCopy 3\nCorazon 3\nCorn 3\nCorner 3\nCorolla 3\nCorridor 3\nCosby 3\nCostner 3\nCounter 3\nCounties 3\nCourse 3\nCoventry 3\nCover 3\nCovert 3\nCracking 3\nCragey 3\nCrary 3\nCrawford 3\nCrescott 3\nCrestmont 3\nCretaceous 3\nCrete 3\nCrisco 3\nCrisis 3\nCristiani 3\nCroatia 3\nCrowds 3\nCrowntuft 3\nCullinet 3\nCult 3\nCulver 3\nCummins 3\nCunin 3\nCuomo 3\nCures 3\nCurve 3\nCycling 3\nDAT 3\nDEA 3\nDEAL 3\nDIG 3\nDONEWS 3\nDOT 3\nDPI 3\nDaihatsu 3\nDain 3\nDalbar 3\nDallek 3\nDam 3\nDamage 3\nDance 3\nDangerfield 3\nDaoist 3\nDarla 3\nDarwinian 3\nDatatronic 3\nDavy 3\nDawa 3\nDawasir 3\nDaytona 3\nDeConcini 3\nDebate 3\nDecisions 3\nDecree 3\nDefu 3\nDelaney 3\nDelapasvela 3\nDelight 3\nDenny 3\nDepends 3\nDeposits 3\nDepth 3\nDeputies 3\nDer 3\nDerby 3\nDeren 3\nDerr 3\nDesc 3\nDespair 3\nDeveloper 3\nDiamandis 3\nDiane 3\nDiaoyutai 3\nDiaries 3\nDigate 3\nDillow 3\nDiplomacy 3\nDiplomats 3\nDirawi 3\nDiri 3\nDirks 3\nDisasters 3\nDisclaimer 3\nDiscovery 3\nDiscussing 3\nDisinformation 3\nDisk 3\nDistillery 3\nDixon 3\nDjindjic 3\nDocument 3\nDominica 3\nDooling 3\nDoor 3\nDornoch 3\nDougherty 3\nDrago 3\nDream 3\nDreman 
3\nDresden 3\nDreyer 3\nDriscoll 3\nDual 3\nDuarte 3\nDujiangyan 3\nDukakis 3\nDulaim 3\nDumbo 3\nDumez 3\nDundalk 3\nDuriron 3\nDwarf 3\nEARNINGS 3\nEAST 3\nEB3326 3\nEEFTL 3\nERCOT 3\nESPs 3\nEToys 3\nEWS 3\nEagles 3\nEasterners 3\nEating 3\nEbay 3\nEco 3\nEcogas 3\nEden 3\nEdinburgh 3\nEditor 3\nEdmund 3\nEffect 3\nEffective 3\nEhman 3\nEinhorn 3\nElected 3\nElena 3\nEleventh 3\nElie 3\nEliminating 3\nEllen 3\nEmil 3\nEmission 3\nEmpress 3\nEncouraged 3\nEngelan 3\nEngenvick 3\nEngine 3\nEnglishman 3\nEnpower 3\nEnter 3\nEntero 3\nEpilepsy 3\nErie 3\nErnesto 3\nEsso 3\nEstablishing 3\nEsther 3\nEthics 3\nEthnology 3\nEtzioni 3\nEuropa 3\nEvan 3\nEverett 3\nEverland 3\nEveryday 3\nEward 3\nExabyte 3\nExact 3\nExamination 3\nExcalibur 3\nF.W. 3\nFAR 3\nFFr 3\nFLDS 3\nFaith 3\nFalluja 3\nFanuc 3\nFarmer 3\nFascinating 3\nFedders 3\nFeinman 3\nFengzhu 3\nFerris 3\nFerro 3\nFerruzzi 3\nFewer 3\nFields 3\nFifteen 3\nFifthly 3\nFiggie 3\nFighter 3\nFinanziaria 3\nFingers 3\nFirgossy 3\nFitzgerald 3\nFixed 3\nFlash 3\nFlint 3\nFlood 3\nFloyd 3\nFluhr 3\nFolgers 3\nFolk 3\nFolks 3\nForecast 3\nForecasting 3\nForeigners 3\nForet 3\nForrest 3\nForster 3\nFouad 3\nFounded 3\nFounders 3\nFraud 3\nFredric 3\nFreedman 3\nFreeport 3\nFreightways 3\nFrequent 3\nFresca 3\nFresh 3\nFreshman 3\nFuad 3\nFucking 3\nFukuoka 3\nFurs 3\nFutian 3\nFuzzy 3\nG.D. 3\nG.m.b 3\nG.m.b.H. 
3\nGB 3\nGDR 3\nGMAT 3\nGO 3\nGPSA 3\nGRE 3\nGROUP 3\nGabelli 3\nGabriel 3\nGadhafi 3\nGaelic 3\nGainen 3\nGalveston 3\nGaming 3\nGannett 3\nGarber 3\nGarcias 3\nGardner 3\nGarman 3\nGarn 3\nGarrett 3\nGarry 3\nGaskin 3\nGasoline 3\nGatos 3\nGauguin 3\nGay 3\nGaylord 3\nGazelle 3\nGbagbo 3\nGe 3\nGeffen 3\nGehrig 3\nGeiger 3\nGemina 3\nGenesis 3\nGeng 3\nGennie 3\nGenome 3\nGently 3\nGeo 3\nGeorgina 3\nGettysburg 3\nGhana 3\nGhostbusters 3\nGiffen 3\nGing 3\nGinn 3\nGinsburg 3\nGinseng 3\nGirls 3\nGitanes 3\nGivaudan 3\nGivens 3\nGiving 3\nGlasgow 3\nGlinn 3\nGlucksman 3\nGmail 3\nGodfather 3\nGoliath 3\nGortari 3\nGospel 3\nGotlieb 3\nGottlieb 3\nGovernors 3\nGradmann 3\nGraef 3\nGraeme 3\nGrants 3\nGras 3\nGrateful 3\nGreensboro 3\nGreetings 3\nGrid 3\nGrobstein 3\nGrove 3\nGrubman 3\nGruntal 3\nGuan 3\nGuests 3\nGuides 3\nGujarat 3\nGuofu 3\nGustafson 3\nGutfreunds 3\nHAD 3\nHDTVs 3\nHI 3\nHIGH 3\nHIStory 3\nHOUSE 3\nHUGO 3\nHaag 3\nHaedicke 3\nHakone 3\nHal 3\nHale 3\nHalles 3\nHallwood 3\nHalsey 3\nHam 3\nHambros 3\nHanshin 3\nHarlow 3\nHarsco 3\nHartt 3\nHarty 3\nHasidic 3\nHasina 3\nHau 3\nHaussmann 3\nHawkins 3\nHayward 3\nHealthdyne 3\nHeard 3\nHearst 3\nHeck 3\nHelionetics 3\nHemisphere 3\nHens 3\nHerdan 3\nHershey 3\nHighlighting 3\nHighways 3\nHilary 3\nHindus 3\nHiring 3\nHisham 3\nHizbollah 3\nHolbrooke 3\nHole 3\nHoller 3\nHollister 3\nHolliston 3\nHomecoming 3\nHomer 3\nHomestake 3\nHonestly 3\nHongwei 3\nHopwood 3\nHosanna 3\nHospitals 3\nHostile 3\nHotels 3\nHour 3\nHoylake 3\nHsi 3\nHsiuluan 3\nHuan 3\nHuanglong 3\nHuber 3\nHucheng 3\nHuei 3\nHuggins 3\nHughey 3\nHuizhou 3\nHulings 3\nHumans 3\nHundred 3\nHuntington 3\nHuo 3\nHuskers 3\nHussain 3\nHwa 3\nHwo 3\nHydra 3\nIBJ 3\nIDS 3\nIE 3\nIOC 3\nIPOs 3\nIQ97 3\nISS 3\nIacocca 3\nIce 3\nIceland 3\nIfint 3\nIgnacio 3\nIli 3\nIm 3\nImmigrant 3\nImmune 3\nImplementation 3\nImplementing 3\nImportant 3\nIncluded 3\nIncreasing 3\nIncredible 3\nInevitably 3\nInfantry 3\nInfinite 3\nIngram 3\nInoue 
3\nInsider 3\nInteresting 3\nInterviews 3\nIntimate 3\nInuit 3\nInvest 3\nInvitation 3\nIo 3\nIrian 3\nIslamist 3\nIslamists 3\nIto 3\nItochu 3\nIzvestia 3\nJ.D. 3\nJ.M. 3\nJP 3\nJaafari 3\nJade 3\nJae 3\nJaffray 3\nJalal 3\nJasmine 3\nJasper 3\nJath 3\nJavier 3\nJaya 3\nJazirah 3\nJazz 3\nJehovah 3\nJelenic 3\nJemison 3\nJenkins 3\nJennison 3\nJenny 3\nJeou 3\nJewelry 3\nJianchang 3\nJianhua 3\nJianjiang 3\nJiaxing 3\nJierong 3\nJinglian 3\nJingsheng 3\nJinsheng 3\nJiuzhaigou 3\nJiyun 3\nJoey 3\nJohnnie 3\nJohnstown 3\nJolly 3\nJonas 3\nJournalist 3\nJovian 3\nJoy 3\nJubeil 3\nJubouri 3\nJuggernaut 3\nJujo 3\nJuliano 3\nJulius 3\nJuren 3\nJurisprudence 3\nJury 3\nJuventus 3\nKC 3\nKLM 3\nKNOW 3\nKPMG 3\nKabel 3\nKaddoumi 3\nKahan 3\nKakita 3\nKampala 3\nKandel 3\nKaplan 3\nKaralis 3\nKass 3\nKathie 3\nKathryn 3\nKatie 3\nKazuo 3\nKeck 3\nKeefe 3\nKeene 3\nKehenen 3\nKeizaikai 3\nKendall 3\nKennametal 3\nKenyan 3\nKerkorian 3\nKerschner 3\nKeteyian 3\nKeynesian 3\nKhieu 3\nKiev 3\nKin 3\nKirin 3\nKistler 3\nKit 3\nKitamura 3\nKleiber 3\nKnowing 3\nKnudsen 3\nKofcoh 3\nKolber 3\nKoleskinov 3\nKonan 3\nKonheim 3\nKoskotas 3\nKossuth 3\nKowalke 3\nKrajisnik 3\nKrat 3\nKrebs 3\nKrisher 3\nKrishnamurthy 3\nKuehn 3\nKungliao 3\nKut 3\nKuwaiti 3\nKyra 3\nL'Oreal 3\nL.L. 
3\nLET 3\nLIT 3\nLOC 3\nLOPEZ 3\nLTS 3\nLaGuardia 3\nLaRioja 3\nLabatt 3\nLac 3\nLaiaoter 3\nLaidig 3\nLaiwu 3\nLama 3\nLamb 3\nLambda 3\nLampoon 3\nLancaster 3\nLance 3\nLancry 3\nLandfill 3\nLanding 3\nLanka 3\nLantos 3\nLaser 3\nLasting 3\nLaundry 3\nLauro 3\nLavrov 3\nLawrenson 3\nLeBron 3\nLead 3\nLeahy 3\nLeasing 3\nLeast 3\nLech 3\nLeche 3\nLeemans 3\nLeftist 3\nLego 3\nLeifeng 3\nLeisure 3\nLeng 3\nLenin 3\nLeona 3\nLep 3\nLever 3\nLeverage 3\nLevin 3\nLevinson 3\nLew 3\nLiberte 3\nLibor 3\nLichang 3\nLichtenstein 3\nLieber 3\nLifeSavers 3\nLijiu 3\nLikely 3\nLimo 3\nLindbergh 3\nLinden 3\nLindsey 3\nLing'ao 3\nLinguistics 3\nLink 3\nLinking 3\nLippold 3\nListen 3\nLithox 3\nLiuting 3\nLives 3\nLiza 3\nLoggia 3\nLombardo 3\nLoop 3\nLora 3\nLoss 3\nLosses 3\nLothson 3\nLott 3\nLover 3\nLucia 3\nLucio 3\nLuck 3\nLuther 3\nM&A 3\nM'Bow 3\nM.B.A. 3\nM1 3\nMADD 3\nMAKE 3\nMARKET 3\nMAY 3\nMD 3\nMLB 3\nMNC 3\nMPD 3\nMSNBC 3\nMV 3\nMacInnis 3\nMacMillan 3\nMacaense 3\nMad 3\nMagellan 3\nMailiao 3\nMaintaining 3\nMakers 3\nMalay 3\nMalaysians 3\nMallory 3\nMaloney 3\nMame 3\nManar 3\nManchuria 3\nManchurian 3\nMandil 3\nMaoming 3\nMaradona 3\nMarcia 3\nMarietta 3\nMarilyn 3\nMarinaro 3\nMarkey 3\nMarlboro 3\nMaronite 3\nMarrie 3\nMarried 3\nMartial 3\nMartyr 3\nMarunouchi 3\nMarwan 3\nMasked 3\nMassacre 3\nMasterCard 3\nMasters 3\nMastro 3\nMatagorda 3\nMaterials 3\nMatilda 3\nMatrix 3\nMattausch 3\nMaurizio 3\nMaya 3\nMaybelline 3\nMaysing 3\nMcBride 3\nMcCammon 3\nMcClain 3\nMcClauklin 3\nMcClelland 3\nMcCoy 3\nMcCraw 3\nMcFadden 3\nMcGrath 3\nMcGuire 3\nMcIntyre 3\nMcKinsey 3\nMcLaughlin 3\nMcMahon 3\nMeador 3\nMeasures 3\nMeat 3\nMedco 3\nMedieval 3\nMedium 3\nMedtronic 3\nMejia 3\nMelbourne 3\nMelvyn 3\nMemory 3\nMentor 3\nMerhige 3\nMeryl 3\nMeson 3\nMethods 3\nMiG 3\nMia 3\nMicrowave 3\nMid-west 3\nMidlife 3\nMidnight 3\nMidwestern 3\nMight 3\nMiguel 3\nMill 3\nMillion 3\nMillis 3\nMim 3\nMimi 3\nMin 3\nMiner 3\nMinna 3\nMint 3\nMinuteman 3\nMisanthrope 
3\nMisawa 3\nMix 3\nModel 3\nMohammad 3\nMoines 3\nMojo 3\nMoldovan 3\nMondays 3\nMonitor 3\nMonogram 3\nMonsky 3\nMoqtada 3\nMora 3\nMorelli 3\nMoreno 3\nMoroccan 3\nMotel 3\nMothers 3\nMotoren 3\nMouse 3\nMrs 3\nMs 3\nMuniak 3\nMurder 3\nMurillo 3\nMurph 3\nMusab 3\nMusical 3\nMussolini 3\nN.A. 3\nN.V 3\nNASAA 3\nNASD 3\nNC 3\nNCI 3\nNFIB 3\nNMTBA 3\nNORC 3\nNOTICE 3\nNOVA 3\nNTD 3\nNacional 3\nNafi'i 3\nNahas 3\nNairobi 3\nNames 3\nNantong 3\nNapa 3\nNaples 3\nNarcotics 3\nNassau 3\nNast 3\nNatasha 3\nNationale 3\nNationalists 3\nNative 3\nNativist 3\nNatured 3\nNatwest 3\nNawaz 3\nNeave 3\nNeedless 3\nNeff 3\nNegus 3\nNeng 3\nNeptune 3\nNerds 3\nNetscape 3\nNewcastle 3\nNewquist 3\nNewsday 3\nNewspapers 3\nNewsprint 3\nNewt 3\nNewton 3\nNicki 3\nNie 3\nNien 3\nNina 3\nNingpo 3\nNino 3\nNokia 3\nNorberto 3\nNordic 3\nNorris 3\nNorwegians 3\nNorwest 3\nNovel 3\nNowhere 3\nNuan 3\nNumerous 3\nNun 3\nNuveen 3\nO'Connor 3\nO.J. 3\nOFF 3\nONLY 3\nOR 3\nOSS 3\nOaks 3\nObjections 3\nObserver 3\nOccasionally 3\nOccupational 3\nOka 3\nOlay 3\nOlga 3\nOllie 3\nOncor 3\nOndaatje 3\nOneida 3\nOpinions 3\nOptical 3\nOption 3\nOranjemund 3\nOrbitz 3\nOre 3\nOrganizations 3\nOrwell 3\nOskar 3\nOutgoing 3\nOutstanding 3\nOuyang 3\nOwnership 3\nPACs 3\nPAT 3\nPDT 3\nPEN 3\nPGE 3\nPLACE 3\nPPI 3\nPRICES 3\nPUTS 3\nPacker 3\nPackers 3\nPanasonic 3\nPanda 3\nPanelli 3\nPanetta 3\nPara 3\nParaguay 3\nPardons 3\nParent 3\nParish 3\nParisien 3\nParticipants 3\nParties 3\nPascal 3\nPass 3\nPassing 3\nPastor 3\nPatients 3\nPatty 3\nPauline 3\nPayPal 3\nPayne 3\nPeanuts 3\nPeasant 3\nPechiney 3\nPeinan 3\nPeishih 3\nPele 3\nPell 3\nPellegrin 3\nPending 3\nPenman 3\nPeoria 3\nPepper 3\nPepperidge 3\nPerkins 3\nPerrier 3\nPerritt 3\nPervez 3\nPetit 3\nPetrolane 3\nPetronas 3\nPh.D. 
3\nPhet 3\nPhillies 3\nPho 3\nPhotoshop 3\nPhuket 3\nPier 3\nPigs 3\nPilates 3\nPin 3\nPingtung 3\nPinpoint 3\nPinter 3\nPirate 3\nPissocra 3\nPitt 3\nPlaces 3\nPlaintiffs 3\nPlanck 3\nPlanters 3\nPlay 3\nPlayers 3\nPluto 3\nPoindexter 3\nPointe 3\nPointing 3\nPolar 3\nPolynesian 3\nPonce 3\nPoncet 3\nPond 3\nPool 3\nPopup 3\nPosix 3\nPossible 3\nPossibly 3\nPostels 3\nPot 3\nPotential 3\nPowhatan 3\nPractically 3\nPracticing 3\nPratas 3\nPresbyterian 3\nPresently 3\nPresley 3\nPretty 3\nPrideaux 3\nPrimarily 3\nPrinting 3\nPrix 3\nProductivity 3\nProhibition 3\nProposes 3\nProteins 3\nProtesters 3\nProvide 3\nProvided 3\nProvisional 3\nPsychologists 3\nPsyllium 3\nPublished 3\nPublisher 3\nPunch 3\nPushkin 3\nPyongyang 3\nQarni 3\nQasim 3\nQiao 3\nQihua 3\nQizhen 3\nQuan 3\nQuantity 3\nQuarter 3\nQuesTech 3\nQusay 3\nR.C. 3\nR.D. 3\nRAM 3\nREALLY 3\nRFP 3\nRI 3\nRICOed 3\nRM 3\nRMI 3\nROM 3\nRabinowitz 3\nRacketeer 3\nRadiation 3\nRadical 3\nRadison 3\nRadzymin 3\nRaeder 3\nRafik 3\nRage 3\nRailroad 3\nRainman 3\nRajiv 3\nRam 3\nRama 3\nRambo 3\nRamos 3\nRandom 3\nRangel 3\nRashid 3\nRatings 3\nRayburn 3\nRazeq 3\nReaders 3\nReality 3\nRebecca 3\nRecapturing 3\nReconstruction 3\nRecruiting 3\nRedfield 3\nReducing 3\nReese 3\nReferring 3\nRefining 3\nReformist 3\nRegalia 3\nRegan 3\nRegrettably 3\nReichmann 3\nReidy 3\nReiss 3\nRejoicer 3\nRelated 3\nRenee 3\nRenzas 3\nRep 3\nRepairs 3\nRepeat 3\nResidence 3\nResort 3\nResorts 3\nRetrieval 3\nRevelation 3\nRevision 3\nRheingold 3\nRhine 3\nRice@ENRON 3\nRifkind 3\nRiordan 3\nRip 3\nRitchie 3\nRitterman 3\nRivera 3\nRivkin 3\nRoads 3\nRoberto 3\nRoche 3\nRockford 3\nRoland 3\nRolling 3\nRomans 3\nRomeo 3\nRongheng 3\nRonn 3\nRooms 3\nRosales 3\nRosemary 3\nRossini 3\nRoulac 3\nRove 3\nRover 3\nRow 3\nRubendall 3\nRubenstein 3\nRuble 3\nRudd 3\nRudnick 3\nRuge 3\nRui 3\nRuins 3\nRusskies 3\nSAID 3\nSALES 3\nSBA 3\nSEO 3\nSFE 3\nSHORT 3\nSKII 3\nSMU 3\nSNAP 3\nSONG 3\nSPECTRA 3\nSTART 3\nSTEAK 3\nSTREET 3\nSUGAR 
3\nSUSHI 3\nSWG 3\nSafola 3\nSafra 3\nSagos 3\nSahar 3\nSalaam 3\nSalerno 3\nSalvation 3\nSalvatori 3\nSamanoud 3\nSamba 3\nSamir 3\nSamoa 3\nSancho 3\nSander 3\nSandoz 3\nSands 3\nSari 3\nSarney 3\nSasea 3\nSatoshi 3\nSavageau 3\nSaving 3\nSayeeb 3\nSaying 3\nSayyed 3\nScale 3\nScandinavia 3\nSchaffler 3\nScheme 3\nSchimmel 3\nSchott 3\nSchreibman 3\nSchuster 3\nSchwarzenegger 3\nScotto 3\nScrabble 3\nScudder 3\nSeal 3\nSear's 3\nSeas 3\nSecaucus 3\nSecondary 3\nSeek 3\nSelection 3\nSelkin 3\nSeller 3\nSemiconductors 3\nSense 3\nSenshukai 3\nSentra 3\nSerge 3\nSeriously 3\nSettlements 3\nSeymour 3\nShaalan 3\nShackleton@ECT 3\nShamJie 3\nShayan 3\nShean 3\nSheila 3\nSheldon 3\nShelly 3\nShelton 3\nShepherd 3\nSheriff 3\nSherlund 3\nShilling 3\nShinto 3\nShioya 3\nShisanhang 3\nShraideh 3\nShreveport 3\nShriver 3\nShrontz 3\nShuyukh 3\nSiberia 3\nSiebel 3\nSiena 3\nSigoloff 3\nSilas 3\nSilverman 3\nSimaxie~Ball 3\nSin 3\nSinfonia 3\nSitco 3\nSixthly 3\nSkadden 3\nSkies 3\nSkokie 3\nSkull 3\nSlaughter 3\nSlovakia 3\nSluggish 3\nSnae 3\nSnoopy 3\nSnowy 3\nSoap 3\nSoccer 3\nSocialization 3\nSoft 3\nSohn 3\nSomehow 3\nSometime 3\nSonet 3\nSonia 3\nSonny 3\nSooners 3\nSoundView 3\nSouthfield 3\nSouya 3\nSovran 3\nSoybean 3\nSoyuz 3\nSpace.com 3\nSpartan 3\nSpectacular 3\nSpeed 3\nSpending 3\nSpendthrift 3\nSpider 3\nSpill 3\nSpirits 3\nSpivey 3\nSponsors 3\nSport 3\nSpratley 3\nSquad 3\nSquadron 3\nStaar 3\nStage 3\nStaley 3\nStalin 3\nStand 3\nStanton 3\nStatements 3\nSteidtmann 3\nStevric 3\nStorms 3\nStrawberry 3\nStrieber 3\nString 3\nStroking 3\nSubsequently 3\nSucceeding 3\nSuite 3\nSuleiman 3\nSultanate 3\nSummerfolk 3\nSundays 3\nSuntory 3\nSuominen 3\nSupermarket 3\nSuppliers 3\nSupporting 3\nSupposedly 3\nSuq 3\nSurrounded 3\nSurveillance 3\nSurveying 3\nSurveys 3\nSushi 3\nSutcliffe 3\nSutro 3\nSuva 3\nSuzanne 3\nSwanson 3\nSwedes 3\nSword 3\nSylvester 3\nSymposium 3\nSyndrome 3\nSyrians 3\nTALK 3\nTCMP 3\nTECHNOLOGY 3\nTERRIBLE 3\nTHEIR 3\nTIME 3\nTIMES 
3\nTORA 3\nTRY 3\nTVX 3\nTWA 3\nTab 3\nTacker 3\nTait 3\nTaiyuan 3\nTaj 3\nTakushoku 3\nTale 3\nTanya 3\nTarawa 3\nTariffs 3\nTaro 3\nTartan 3\nTaurus 3\nTawu 3\nTaxes 3\nTbilisi 3\nTeams 3\nTeeterman 3\nTeh 3\nTeheran 3\nTelos 3\nTenet 3\nTerminating 3\nTermination 3\nTerrorists 3\nTestimony 3\nTextiles 3\nThal 3\nThalmann 3\nThames 3\nTheft 3\nThemselves 3\nTherapy 3\nThereupon 3\nThief 3\nThieves 3\nThin 3\nThousand 3\nThrone 3\nThrow 3\nThu 3\nTianfa 3\nTierney 3\nTies 3\nTinker 3\nTitman 3\nTmac 3\nTomash 3\nTomsho 3\nToto 3\nToussie 3\nToward 3\nTowers 3\nToys 3\nTrace 3\nTransgenic 3\nTraveller 3\nTraxler 3\nTrees 3\nTrend 3\nTrends 3\nTrenton 3\nTrevino 3\nTriad 3\nTrifari 3\nTrimeresurus 3\nTrinidad 3\nTripoli 3\nTropez 3\nTropicana 3\nTropics 3\nTrout 3\nTruck 3\nTrustee 3\nTsui 3\nTsun 3\nTub 3\nTudari 3\nTulsa 3\nTungshih 3\nTuniu 3\nTurn 3\nTweed 3\nU.K 3\nUAV 3\nUAW 3\nUEFA 3\nUGI 3\nUNC 3\nUNHCR 3\nUNIX 3\nUPI 3\nURL 3\nUSACafes 3\nUSDA 3\nUltimate 3\nUncertain 3\nUncertainty 3\nUndersecretary 3\nUnderwear 3\nUndeterred 3\nUnions 3\nUnitrode 3\nUnits 3\nUnivision 3\nUntrue 3\nUsed 3\nUsenet 3\nUsinor 3\nV02 3\nVACRS 3\nVCR 3\nVCRs 3\nVI 3\nVIP 3\nVaezi 3\nValerie 3\nValues 3\nVance 3\nVandenberg 3\nVaux 3\nVector 3\nVercammen 3\nVic 3\nVicente 3\nVicky 3\nVilla 3\nVillages 3\nVincente 3\nVirgo 3\nVirtually 3\nVladi 3\nVolaticotherium 3\nVolokhs 3\nW.J. 
3\nWELL 3\nWHY 3\nWINGS 3\nWORKERS 3\nWW 3\nWW2 3\nWachtel 3\nWaco 3\nWaeli 3\nWafd 3\nWaiting 3\nWake 3\nWako 3\nWallop 3\nWanhua 3\nWarehouse 3\nWarhol 3\nWarnaco 3\nWasserstein 3\nWatan 3\nWaterbury 3\nWatkins 3\nWeaken 3\nWeather 3\nWeaver 3\nWebb 3\nWeidiaunuo 3\nWeiping 3\nWeizhong 3\nWellman 3\nWen-hsing 3\nWenhai 3\nWenlong 3\nWerke 3\nWessels 3\nWheel 3\nWhitney 3\nWhittaker 3\nWiedemann 3\nWilhelm 3\nWillens 3\nWillmott 3\nWinbond 3\nWindy 3\nWittgreen 3\nWizards 3\nWolfe 3\nWon 3\nWonder 3\nWoodstream 3\nWoodward 3\nWorker 3\nWorkshop 3\nWriter 3\nWritten 3\nWussler 3\nXIII 3\nXL 3\nXML 3\nXR4Ti 3\nXVI 3\nXbox 3\nXiangming 3\nXuejie 3\nXueliang 3\nXufeng 3\nXuming 3\nXunxuan 3\nY&R 3\nYalta 3\nYamashita 3\nYazdi 3\nYbarra 3\nYeping 3\nYimin 3\nYitzhak 3\nYogi 3\nYoneyama 3\nYongjian 3\nYoussef 3\nYue 3\nYugoslavians 3\nYukon 3\nYungho 3\nYushchenko 3\nYuyi 3\nYuzhen 3\nZ. 3\nZBB 3\nZaire 3\nZalubice 3\nZen 3\nZeng 3\nZero 3\nZhai 3\nZhen 3\nZhiqiang 3\nZhongnanhai 3\nZhongyi 3\nZuoren 3\na.k.a 3\nabandons 3\nabby 3\nabcnews.com 3\nabdominal 3\nabducting 3\nabominable 3\nabrasives 3\nabsenteeism 3\nabsurdity 3\nacademician 3\naccents 3\nacceptances 3\naccessibility 3\naccidental 3\nacclaim 3\naccommodations 3\naccomplices 3\naccomplishing 3\naccumulate 3\naccumulative 3\nace 3\nacidity 3\nacquirers 3\nactivism 3\nacupuncture 3\naddicted 3\naddictive 3\nadherence 3\nadhesives 3\nadjoining 3\nadjournment 3\nadmiring 3\nadmonition 3\nadolescence 3\nadorned 3\nadulthood 3\nadventurism 3\nadvisories 3\naffiliations 3\naffluence 3\naflatoxin 3\nafterlife 3\naftertax 3\naggravate 3\naggrieved 3\nagility 3\nagitation 3\nailments 3\nairbags 3\nairs 3\nairway 3\nalerted 3\nalienated 3\nalign 3\nallegory 3\nalleviated 3\nalleviating 3\nalley 3\nallocating 3\nallocations 3\nalluded 3\nallure 3\nallusions 3\nalms 3\naloud 3\nalpha 3\nalternating 3\nalternatively 3\naltruistic 3\namasses 3\nambiguities 3\nambushed 3\namending 3\namicable 3\nammonium 3\namphibious 
3\namputated 3\namusement 3\nanatomical 3\nanatomy 3\nancestry 3\nanchorman 3\nanew 3\nangel 3\nangst 3\nanimated 3\nanimus 3\nannoyance 3\nanomalous 3\nanonymously 3\nantacid 3\nantagonistic 3\nantagonize 3\nanti-Soviet 3\nanti-apartheid 3\nanti-ballistic 3\nanti-bike 3\nanti-independence 3\nanti-monopoly 3\nanti-nuclear 3\nanti-smoking 3\nanti-terrorism 3\nanti-union 3\nantidote 3\nantigen 3\nantiquus 3\nantiretroviral 3\nantithetical 3\nanxieties 3\napocalyptic 3\napologists 3\napology 3\napplauds 3\nappointee 3\nappraised 3\nappreciating 3\napprehended 3\napprehension 3\naptly 3\naquaculture 3\naquatic 3\narable 3\naramid 3\narchaeological 3\narchaeologists 3\narchipelago 3\naristocratic 3\narouse 3\narranges 3\narthritis 3\nartisans 3\nash 3\nasparagus 3\naspartame 3\naspire 3\nassailed 3\nassassinating 3\nassaulted 3\nassaulting 3\nassertion 3\nasses 3\nassimilated 3\nastray 3\nastronomy 3\natrophy 3\nattache 3\nattest 3\nattests 3\nattic 3\naudiophiles 3\naudited 3\naudition 3\naura 3\nauthentication 3\nautographed 3\nautographs 3\navalanche 3\navenge 3\naversion 3\naverted 3\naverting 3\navoids 3\nawakening 3\nawry 3\nax 3\nb/c 3\nbacker 3\nbackside 3\nbackyard 3\nbacon 3\nbailed 3\nbailouts 3\nbait 3\nbaker 3\nbakeware 3\nbald 3\nballoonists 3\nbaloney 3\nbanded 3\nbantiao 3\nbarbarians 3\nbareback 3\nbarges 3\nbartenders 3\nbasin 3\nbastards 3\nbastions 3\nbatches 3\nbathtub 3\nbattlefields 3\nbeautification 3\nbee 3\nbeefing 3\nbeg 3\nbeginner 3\nbehaves 3\nbehaving 3\nbeltway 3\nbemoaning 3\nbenefactor 3\nbenzene 3\nbereaved 3\nberth 3\nbesiege 3\nbested 3\nbetrayal 3\nbetta 3\nbeware 3\nbiased 3\nbiases 3\nbigotry 3\nbii 3\nbiking 3\nbiotech 3\nbirths 3\nbishops 3\nbitch 3\nbitten 3\nblackened 3\nblackmail 3\nblade 3\nblameless 3\nblinding 3\nblister 3\nblithely 3\nblossomed 3\nblossoms 3\nblueprints 3\nblur 3\nblurring 3\nblush 3\nbmw 3\nbob 3\nbog 3\nbombarded 3\nbombast 3\nbondin' 3\nbookstores 3\nbooms 3\nbooster 3\nbordering 3\nboredom 3\nbotched 
3\nbothering 3\nbothers 3\nbottoming 3\nboundless 3\nboxed 3\nbraced 3\nbranded 3\nbrandy 3\nbrash 3\nbraved 3\nbreaches 3\nbreathtaking 3\nbreeds 3\nbrewery 3\nbribed 3\nbrigades 3\nbrightness 3\nbrilliance 3\nbrilliantly 3\nbrittle 3\nbrokering 3\nbrothel 3\nbrowse 3\nbruises 3\nbulging 3\nbulldozer 3\nbulldozers 3\nbulletins 3\nbumps 3\nbunny 3\nbuoyant 3\nbureaucracies 3\nbushels 3\nbushes 3\nbusier 3\nbusinesslike 3\nbusload 3\nbusting 3\nbutterfat 3\nbuttress 3\nbuzzing 3\ncabbage 3\ncache 3\ncal 3\ncalculators 3\ncalculus 3\ncaliber 3\ncalmness 3\ncalves 3\ncamouflaged 3\ncandid 3\ncandied 3\ncandies 3\ncanopy 3\ncanyon 3\ncapitalizing 3\ncaptives 3\ncarats 3\ncarboy 3\ncarefree 3\ncarelessness 3\ncarnival 3\ncarp 3\ncarpenter 3\ncarrots 3\ncascade 3\ncashing 3\ncasket 3\ncaster 3\ncasually 3\ncatapult 3\ncatchment 3\ncategorical 3\ncategorically 3\ncaucus 3\ncautiousness 3\ncavalier 3\ncaveat 3\ncelery 3\ncelestial 3\ncellphone 3\ncemeteries 3\ncensored 3\ncentering 3\ncentralization 3\ncertify 3\ncha 3\nchalked 3\nchallenger 3\nchallengers 3\nchameleon 3\nchampioned 3\nchan 3\nchandeliers 3\nchanneled 3\ncharacterizes 3\ncharger 3\ncharred 3\nchauvinism 3\ncheaters 3\ncheerleading 3\nchemically 3\nchemotherapy 3\nchieftain 3\nchimpanzees 3\nchipping 3\nchlorophyll 3\nchoir 3\nchops 3\nchords 3\nchristian 3\nchristmas 3\nchronically 3\nchu 3\nchuck 3\nchurches 3\nchurning 3\ncircumvent 3\ncircus 3\ncivility 3\nclarifying 3\nclarinetist 3\nclashing 3\nclassics 3\nclassify 3\nclauses 3\ncleanly 3\nclemency 3\nclergy 3\nclergyman 3\ncloaked 3\nclogging 3\nclosures 3\nclothed 3\ncloture 3\nclown 3\nclueless 3\nclustered 3\nco-founded 3\nco-managing 3\nco-operating 3\ncoasts 3\ncobalt 3\ncoca 3\ncockroaches 3\ncoded 3\ncoercion 3\ncoexistence 3\ncoiled 3\ncoke 3\ncolas 3\ncollaborators 3\ncollage 3\ncollages 3\ncolloquial 3\ncolonies 3\ncolour 3\ncoloured 3\ncolumnists 3\ncombing 3\ncomedies 3\ncommemoration 3\ncommenced 3\ncommencing 3\ncommendation 
3\ncommercializing 3\ncommonality 3\ncommunicating 3\ncommuniques 3\ncompartments 3\ncompensating 3\ncompetencies 3\ncompile 3\ncomplacent 3\ncomplements 3\ncomplexities 3\ncomposers 3\ncompositions 3\ncomprehensiveness 3\ncompression 3\nconceit 3\nconceivable 3\nconcentrates 3\nconcerted 3\nconcerts 3\ncondensed 3\nconditioners 3\ncondom 3\ncondone 3\ncone 3\ncones 3\nconfer 3\nconflagration 3\nconformed 3\nconformity 3\nconjecture 3\nconjee 3\ncons 3\nconsensual 3\nconsequent 3\nconspiracies 3\nconspirator 3\nconstituent 3\nconstituting 3\nconstitutionality 3\nconstitutionally 3\nconstrain 3\nconstrued 3\nconsulates 3\nconsultative 3\nconsummated 3\ncontacting 3\ncontagious 3\ncontainerboard 3\ncontestant 3\ncontesting 3\ncontiguous 3\ncontraceptives 3\ncontractions 3\ncontradicted 3\ncontradicts 3\ncontrasting 3\nconvening 3\nconvent 3\nconversing 3\nconverters 3\nconveying 3\nconvoluted 3\ncookie 3\ncooks 3\ncoolants 3\ncordial 3\ncords 3\ncorked 3\ncorporates 3\ncorporatism 3\ncorporatist 3\ncorrecting 3\ncorrective 3\ncorridors 3\ncorroborate 3\ncostumed 3\ncouncilors 3\ncountenance 3\ncounter-revolutionaries 3\ncounter-terrorism 3\ncounterbid 3\ncounterclaim 3\ncourageous 3\ncourier 3\ncoverings 3\ncowards 3\ncrabs 3\ncrackpot 3\ncrammed 3\ncramming 3\ncrappy 3\ncrawl 3\ncrawled 3\ncrawls 3\ncrickets 3\ncrimp 3\ncriticizes 3\ncritters 3\ncroissants 3\ncronies 3\ncropped 3\ncross-functional 3\ncross-party 3\ncrosses 3\ncrossfire 3\ncrossroads 3\ncrotch 3\ncrows 3\ncrudes 3\ncruiser 3\ncrumbled 3\ncrumbs 3\ncrushes 3\ncuckoo 3\nculturally 3\ncumbersome 3\ncumulatively 3\ncurbed 3\ncursing 3\ncushioned 3\ncushioning 3\ncuz 3\ncyber-authors 3\ncyclist 3\ncyclists 3\ndab 3\ndads 3\ndagger 3\ndailies 3\ndal 3\ndamit 3\ndamped 3\ndarker 3\ndashboard 3\ndawning 3\ndaybreak 3\ndaycare 3\ndeadbeats 3\ndeaf 3\ndeafening 3\ndearly 3\ndecay 3\ndecedent 3\ndeceive 3\ndecisiveness 3\ndecks 3\ndecorating 3\ndecreed 3\ndecrepit 3\ndeductibles 3\ndefamation 3\ndefamatory 
3\ndefected 3\ndefections 3\ndefence 3\ndefiantly 3\ndeficient 3\ndefied 3\ndefinately 3\ndeflated 3\ndefrauded 3\ndefrauding 3\ndegenerated 3\ndegraded 3\ndegrading 3\ndeli 3\ndelved 3\ndemobilization 3\ndemocracies 3\ndemographics 3\ndemoted 3\ndenounce 3\ndepartmental 3\ndependency 3\ndeplorable 3\ndeportation 3\ndeposition 3\ndepresses 3\ndeprives 3\ndepriving 3\ndepths 3\nderegulated 3\ndereliction 3\nderive 3\ndescendents 3\ndescending 3\ndescends 3\ndeserved 3\ndesolate 3\ndesperation 3\ndespise 3\ndespondent 3\ndespotism 3\ndessert 3\ndestroys 3\ndestruct 3\ndetaining 3\ndeterring 3\ndetract 3\ndeutsche 3\ndevils 3\ndi 3\ndialogues 3\ndictation 3\ndifferentiated 3\ndigesting 3\ndigestion 3\ndignified 3\ndike 3\ndiminish 3\ndiminishing 3\ndiminutive 3\ndinars 3\ndiners 3\ndipping 3\ndips 3\ndirawy@gmail.com 3\ndirectionless 3\ndirectorial 3\ndisappoint 3\ndisappointingly 3\ndisarmament 3\ndisbelief 3\ndiscard 3\ndischarges 3\ndisclaimer 3\ndisco 3\ndiscontent 3\ndiscontinuation 3\ndiscord 3\ndiscrediting 3\ndiscusses 3\ndisenchanted 3\ndisequilibriums 3\ndisingenuous 3\ndisintegrated 3\ndisintegrating 3\ndislikes 3\ndismantling 3\ndisobey 3\ndisparate 3\ndispersants 3\ndisplace 3\ndisqualified 3\ndissenting 3\ndistinguishing 3\ndistract 3\ndisturbance 3\ndisused 3\ndiversions 3\ndivest 3\ndivested 3\ndivestitures 3\ndivisional 3\ndivisiveness 3\ndocs 3\ndoctrines 3\ndocumentaries 3\ndoers 3\ndogma 3\ndolphins 3\ndominion 3\ndominos 3\ndongle 3\ndonuts 3\ndory 3\ndos 3\ndoses 3\ndoubles 3\ndoubly 3\ndoubted 3\ndowdy 3\ndownbeat 3\ndownloading 3\ndp 3\ndrained 3\ndramatization 3\ndrape 3\ndrawback 3\ndrifts 3\ndripping 3\ndrool 3\ndrown 3\ndryer 3\ndutifully 3\ndwarfed 3\ndwindling 3\ndysfunctional 3\neToys 3\neagle 3\neasygoing 3\neavesdropping 3\nebola 3\neclecticism 3\neclipse 3\necstatic 3\nedgy 3\nedit 3\neffortlessly 3\nel 3\nelated 3\nelbows 3\nelective 3\nelectrified 3\nelectrodes 3\nelectrogalvanized 3\nelegance 3\nelegantly 3\nelevation 3\nelicited 
3\neloquently 3\nembarked 3\nembezzled 3\nemblamatic 3\nemblem 3\nembroidery 3\nemigrants 3\nemigrate 3\nemissaries 3\nemitted 3\nemptied 3\nemulate 3\nenclosure 3\nencoding 3\nencompasses 3\nencroachment 3\nendlessly 3\nendorsements 3\nengagement 3\nengraved 3\nenlarge 3\nenlist 3\nenlisting 3\nenriched 3\nentails 3\nenterprising 3\nenthusiast 3\nentirety 3\nentitles 3\nentrants 3\nentree 3\nentrepreneurship 3\nentrust 3\nenvironmentalism 3\nenvoys 3\nepilepsy 3\nequestrian 3\nequine 3\nequipping 3\neradicate 3\nerect 3\nerupting 3\nescalating 3\nessentials 3\net 3\netched 3\nethanol 3\nethnically 3\nevangelist 3\neverlasting 3\nevils 3\nevoking 3\nex-President 3\nex-dividend 3\nexcavated 3\nexcellence 3\nexcerpt 3\nexclaims 3\nexcrement 3\nexcursions 3\nexecutioner 3\nexempted 3\nexempting 3\nexhibits 3\nexpanses 3\nexpansive 3\nexpediting 3\nexpressly 3\nexpressways 3\nextant 3\nexternally 3\nextinct 3\nextort 3\nextracurricular 3\nextravagance 3\nextravagant 3\nextremist 3\nextricate 3\nexude 3\neyeing 3\nfabled 3\nfabricated 3\nfacetiously 3\nfacilitates 3\nfacilitating 3\nfactored 3\nfactually 3\nfade 3\nfairs 3\nfantasies 3\nfarcical 3\nfarmhouse 3\nfascism 3\nfastball 3\nfastener 3\nfatalities 3\nfatality 3\nfatally 3\nfauna 3\nfavour 3\nfeasts 3\nfeather 3\nfeathered 3\nfeats 3\nfelled 3\nfelon 3\nfeminism 3\nfeminists 3\nferment 3\nfertility 3\nfervent 3\nfetal 3\nfetus 3\nfetuses 3\nfeud 3\nfeuding 3\nfeverish 3\nfiasco 3\nfiberboard 3\nfibromyalgia 3\nfickle 3\nfielded 3\nfilament 3\nfilaments 3\nfilers 3\nfilibuster 3\nfiltered 3\nfilth 3\nfin 3\nfinalizing 3\nfinancings 3\nfinely 3\nfinery 3\nfingering 3\nfingerprints 3\nfireball 3\nfirefighters 3\nfishery 3\nfishingdays 3\nfixation 3\nfixes 3\nflagging 3\nflaming 3\nflap 3\nflashlights 3\nflatly 3\nflavour 3\nflexibly 3\nfliers 3\nflimsy 3\nflips 3\nfloated 3\nfloodwaters 3\nflop 3\nflopping 3\nflops 3\nflora 3\nfloral 3\nflorist 3\nflourishing 3\nflowering 3\nfluctuated 3\nfluctuation 3\nfluffy 
3\nfluidity 3\nfluke 3\nfoe 3\nfolding 3\nfolds 3\nfondness 3\nfoolishly 3\nfooted 3\nforays 3\nforearm 3\nforeclosures 3\nforesees 3\nforfeit 3\nforfeited 3\nforfeitures 3\nforged 3\nforgets 3\nfork 3\nformations 3\nforthrightly 3\nfortitude 3\nforwards 3\nfoundering 3\nfourths 3\nfps 3\nfragility 3\nfragmented 3\nframing 3\nfranchised 3\nfrantic 3\nfraternity 3\nfreer 3\nfrees 3\nfreezer 3\nfrench 3\nfrenetic 3\nfrenzied 3\nfrequented 3\nfreshness 3\nfriendlier 3\nfriendliness 3\nfrigates 3\nfrigid 3\nfringes 3\nfro 3\nfrogs 3\nfrost 3\nfrustrations 3\nfrying 3\nfudge 3\nfugitive 3\nfugitives 3\nfumes 3\nfunctionaries 3\nfunctioned 3\nfurnishings 3\nfurry 3\nfused 3\ngadgets 3\ngainer 3\ngala 3\ngale 3\ngalloping 3\ngalvanize 3\ngangsters 3\ngardener 3\ngarnered 3\ngarrison 3\ngastroenteritis 3\ngathers 3\ngazing 3\ngeneralize 3\ngenerational 3\ngeological 3\ngeometric 3\ngiddy 3\ngifted 3\ngiggled 3\ngilts 3\nginseng 3\ngist 3\ngleaming 3\nglee 3\nglimmer 3\nglitz 3\nglucokinase 3\nglycols 3\ngmail 3\ngobbled 3\ngoddamn 3\ngolfing 3\ngoodbye 3\ngorges 3\ngossipy 3\ngourmet 3\ngraded 3\ngraduation 3\ngraft 3\ngram 3\ngranddaughter 3\ngrandmothers 3\ngranules 3\ngrasping 3\ngratified 3\ngratifying 3\ngravy 3\ngraze 3\ngreats 3\ngrill 3\ngrin 3\ngrips 3\ngrooming 3\ngroove 3\ngroping 3\ngrotesque 3\ngroundwater 3\ngrudge 3\ngrudges 3\ngrudgingly 3\ngrumble 3\ngtee 3\nguaranties 3\nguessing 3\nguitarist 3\ngunners 3\ngurus 3\ngwardyola 3\ngynecology 3\nh 3\nhabitual 3\nhabitually 3\nhacking 3\nhaha 3\nhairy 3\nhallway 3\nhalting 3\nham 3\nhamburgers 3\nhampering 3\nhamsters 3\nhandcuffed 3\nhandheld 3\nhandler 3\nhandshake 3\nhao 3\nhaptoglobin 3\nharass 3\nharassing 3\nharbour 3\nhardcore 3\nhardened 3\nharrowing 3\nharshest 3\nharvesting 3\nharvests 3\nhash 3\nhasty 3\nhatchet 3\nhaulers 3\nhav 3\nhavens 3\nhawk 3\nhazing 3\nheaders 3\nheadlights 3\nheadphone 3\nheadphones 3\nhearty 3\nheaved 3\nhedges 3\nheftier 3\nheightening 3\nhelplessly 3\nhelplessness 
3\nhenchmen 3\nherbs 3\nherdsmen 3\nhers 3\nheterogeneous 3\nhh 3\nhiccup 3\nhideous 3\nhijackers 3\nhiker 3\nhilarious 3\nhind 3\nhindering 3\nhinoki 3\nhinterland 3\nhiss 3\nhistories 3\nhives 3\nhoarse 3\nhobbling 3\nhogs 3\nholed 3\nhomage 3\nhomemaker 3\nhomicide 3\nhone 3\nhoneymoon 3\nhookers 3\nhops 3\nhormones 3\nhorrifying 3\nhostility 3\nhousewares 3\nhsi 3\nhsing 3\nhttp://AbuAbdullaah.arabform.com 3\nhuddled 3\nhug 3\nhum 3\nhumanistic 3\nhumbled 3\nhunk 3\nhunky 3\nhurling 3\nhutch 3\nhwan 3\nhydraulic 3\nhyped 3\nhyun 3\niPod 3\niTunes 3\nibn 3\niceberg 3\nidiotic 3\nidly 3\nilk 3\nillegals 3\nillegitimate 3\nills 3\nimaging 3\nimmediacy 3\nimmensely 3\nimmigrate 3\nimmigrated 3\nimmorality 3\nimmortal 3\nimpartiality 3\nimpede 3\nimpeded 3\nimpediment 3\nimperiled 3\nimplements 3\nimplicate 3\nimpolite 3\nimpractical 3\nimprinted 3\nimpulses 3\nimpunity 3\ninadvertently 3\ninappropriately 3\ninching 3\nincidence 3\ninclusive 3\nincoherent 3\ninconsistencies 3\nincrimination 3\nincubation 3\nincumbents 3\nindelible 3\nindemnification 3\nindemnity 3\nindexers 3\nindicative 3\nindict 3\nindignity 3\ninducement 3\nindulge 3\nindulgence 3\nindulgent 3\nindulging 3\nindustrialist 3\ninefficiency 3\nineptitude 3\ninescapable 3\ninexplicable 3\ninextricably 3\ninfatuation 3\ninfiltrate 3\ninflame 3\ninformally 3\ninformant 3\ninformative 3\ninfused 3\ninfusion 3\ninhabited 3\ninherit 3\ninhibiting 3\ninhuman 3\ninject 3\ninnovated 3\ninnumerable 3\ninsatiable 3\ninsofar 3\ninstructing 3\ninsurances 3\ninsurgency 3\ninsuring 3\nintensively 3\ninter-city 3\nintercept 3\ninterchange 3\nintercollegiate 3\ninterfered 3\ninterleukin 3\nintermittent 3\ninterrupting 3\ninterruption 3\ninterspersed 3\ninterviewer 3\nintimately 3\nintrigued 3\nintrusive 3\nintuitive 3\ninverse 3\ninversion 3\ninvitations 3\nirked 3\nirresistible 3\nirritates 3\nisda 3\nisms 3\nisthmus 3\njanette.elbertson@enron.com 3\njarring 3\njars 3\njeff 3\njellyfish 3\njerk 3\njetliners 
3\njilian 3\njitsu 3\njiu 3\njoe 3\njog 3\njogging 3\njoked 3\njournals 3\njoyous 3\njuggling 3\njumbos 3\njuncture 3\njuren 3\njurists 3\njuror 3\njustifiable 3\nkW 3\nkaffiyeh 3\nkeel 3\nkeenly 3\nkerala 3\nkernel 3\nkettle 3\nkeywords 3\nkicks 3\nkidnap 3\nkiln 3\nkilometres 3\nkilowatt 3\nkilowatts 3\nkindergartens 3\nkingpins 3\nkitchens 3\nknack 3\nknights 3\nknow-how 3\nkun 3\nkung 3\nlactose 3\nlamb 3\nlambs 3\nlamented 3\nlaments 3\nlamps 3\nlandfills 3\nlandings 3\nlandscaping 3\nlantern 3\nlapse 3\nlarou 3\nlashing 3\nlauded 3\nlava 3\nlawmaking 3\nlawns 3\nlbs 3\nleafy 3\nleagues 3\nleakage 3\nleaned 3\nleaner 3\nlearns 3\nleathers 3\nlectures 3\nlegends 3\nlegerdemain 3\nlegitimize 3\nlemelpe@NU.COM 3\nlends 3\nlengthened 3\nlenient 3\nleopard 3\nlesbians 3\nlessen 3\nlethargy 3\nlever 3\nlevying 3\nliberalizing 3\nlieu 3\nlifeline 3\nlightened 3\nlightest 3\nlightweight 3\nlikened 3\nlimelight 3\nlimestone 3\nlimousine 3\nlinear 3\nlinen 3\nlinens 3\nlinkages 3\nliquidations 3\nliquidator 3\nlistener 3\nliter 3\nlitmus 3\nlivable 3\nloaf 3\nloafers 3\nloaned 3\nloathing 3\nlocational 3\nlogos 3\nlongs 3\nloom 3\nlooser 3\nlooted 3\nlowers 3\nlowly 3\nluminaries 3\nlumps 3\nlurched 3\nm.nordstrom@pecorp.com 3\nmachikin 3\nmachinist 3\nmafias 3\nmagician 3\nmagnanimous 3\nmaiden 3\nmailers 3\nmailings 3\nmainline 3\nmajored 3\nmakeshift 3\nmalfunction 3\nmalpractice 3\nmammals 3\nmandates 3\nmaneuvered 3\nmanhunt 3\nmanic 3\nmanifestation 3\nmanifesto 3\nmanliness 3\nmannered 3\nmarina 3\nmarketization 3\nmarkka 3\nmarries 3\nmarrow 3\nmartyrdom 3\nmarvel 3\nmascot 3\nmasri 3\nmassacred 3\nmasse 3\nmastectomy 3\nmastery 3\nmaterialism 3\nmaturation 3\nmatured 3\nmaximizing 3\nmayoralty 3\nmeandering 3\nmediate 3\nmedieval 3\nmelding 3\nmembrane 3\nmementos 3\nmenace 3\nmenacing 3\nmenstrual 3\nmenus 3\nmerciful 3\nmercilessly 3\nmessaging 3\nmetaphors 3\nmethodologies 3\nmeticulous 3\nmicro 3\nmicrobes 3\nmicrocassette 3\nmicrofluidics 3\nmicropore 
3\nmicrowave 3\nmicrowaves 3\nmid-July 3\nmid-September 3\nmid-afternoon 3\nmid-range 3\nmidafternoon 3\nmikecat 3\nmilieu 3\nmilitaries 3\nmilitarism 3\nmilitiamen 3\nmilling 3\nmimics 3\nmiml9@hotmail.com 3\nmin 3\nminefield 3\nmini-component 3\nminicars 3\nminiscule 3\nmints 3\nminuscule 3\nmiraculous 3\nmischief 3\nmisconception 3\nmiserly 3\nmisfortunes 3\nmishandled 3\nmisrepresented 3\nmissionary 3\nmistreated 3\nmists 3\nmisused 3\nmitigating 3\nmitigation 3\nmobilizing 3\nmockery 3\nmockingly 3\nmodem 3\nmodernizing 3\nmoisturizing 3\nmolded 3\nmolding 3\nmonastery 3\nmonopolized 3\nmonstrous 3\nmoors 3\nmorass 3\nmoron 3\nmotifs 3\nmotivations 3\nmotors 3\nmounds 3\nmulti-company 3\nmultiparty 3\nmultiply 3\nmultiyear 3\nmurderous 3\nmuses 3\nmutation 3\nmutiny 3\nn 3\nnailed 3\nnameplate 3\nnannies 3\nnano 3\nnap 3\nnarrowest 3\nnastier 3\nnationalists 3\nnationalized 3\nnavigation 3\nnaysayers 3\nneanderthals 3\nneedles 3\nneglecting 3\nneighborly 3\nneodymium 3\nneon 3\nnests 3\nnetted 3\nneurologist 3\nneutralization 3\nnewsdesk 3\nnewsgroups 3\nnicknames 3\nnihuai 3\nnoiseless 3\nnominally 3\nnon-binding 3\nnon-communist 3\nnon-convertible 3\nnon-invasive 3\nnon-nuclear 3\nnon-prescription 3\nnon-residential 3\nnon-stop 3\nnon-strategic 3\nnon-trade 3\nnon-users 3\nnoncontract 3\nnonessential 3\nnonferrous 3\nnonpartisan 3\nnonsensical 3\nnonunion 3\nnormalization 3\nnormalizing 3\nnosedive 3\nnostrils 3\nnourishing 3\nnuance 3\nnuisance 3\nnumb 3\nnumbering 3\nnumerical 3\nnurtured 3\nnurturing 3\nnutrients 3\noats 3\nobeying 3\nobfuscation 3\nobjectionable 3\nobliging 3\nobnoxious 3\nobscene 3\nobscures 3\nobstinate 3\noccult 3\noccurrences 3\nodious 3\noffline 3\noffshoot 3\nolive 3\nombudsman 3\nomits 3\nonions 3\nonset 3\nonshore 3\nonsite 3\nonslaught 3\nonwards 3\nopener 3\noperas 3\nopines 3\nopportune 3\noppressed 3\noptics 3\noptimistically 3\noptimum 3\nopting 3\noptionality 3\nopulent 3\norbiting 3\nordinances 3\norganism 3\norganisms 
3\norganizer 3\noriginator 3\nornaments 3\norphan 3\norphaned 3\northodoxy 3\nosteoarthritis 3\nosteoporosis 3\noutbid 3\noutbreaks 3\noutgrowth 3\nouting 3\noutings 3\noutlay 3\noutlining 3\noutpacing 3\noutpatient 3\noutrageously 3\noutsourcing 3\novation 3\noven 3\noverbearing 3\novercame 3\novercrowding 3\noverhanging 3\noverhauling 3\noverheated 3\noverjoyed 3\noverload 3\noverpaid 3\noverproduction 3\novertly 3\novervalued 3\noverview 3\nowl 3\npacemakers 3\npacific 3\npacman 3\npadded 3\npadding 3\npainless 3\npal 3\npals 3\npaneling 3\npanicking 3\npanicky 3\nparachuting 3\nparamilitaries 3\nparanoid 3\nparasites 3\npared 3\nparing 3\nparlance 3\nparole 3\nparticipates 3\npassageway 3\npassageways 3\npassports 3\npasta 3\npatchwork 3\npatently 3\npathogenic 3\npatriarchal 3\npatrolled 3\npatterned 3\npaused 3\npausing 3\npaycheck 3\npayload 3\npayouts 3\npea 3\npeaking 3\npeanut 3\npeddle 3\npeddled 3\npediatrics 3\npee 3\npeeing 3\npenalized 3\npenetration 3\npenguins 3\npenines 3\npenned 3\npep 3\nperfecting 3\nperfectionism 3\nperilously 3\nperiodicals 3\nperipherals 3\nperished 3\nperpetuate 3\nperplexed 3\nperseverance 3\npersistence 3\npersisting 3\npersists 3\npetite 3\npharmacies 3\nphilosopher 3\nphoenix 3\npic 3\npicky 3\npicocassette 3\npics 3\npiecemeal 3\npiers 3\npiety 3\npillow 3\npinching 3\npines 3\npinky 3\npinpointed 3\npinyin 3\npioneering 3\npious 3\npitiful 3\npittance 3\npizzas 3\nplacebo 3\nplated 3\nplaza 3\nplotted 3\nploys 3\nplugs 3\npluralism 3\nplurality 3\nplutonium 3\npneumonia 3\npod 3\npoets 3\npoignant 3\npointedly 3\npoking 3\npoles 3\npolicewoman 3\npoliticized 3\npollsters 3\npollute 3\npolluting 3\npoltergeists 3\npommel 3\npopcorn 3\npope 3\npopping 3\npopulace 3\npopularization 3\npopularly 3\npopulating 3\npoqi 3\nporcelains 3\nporch 3\nportals 3\nportraits 3\nportrays 3\npost-Saddam 3\npost-Watergate 3\npost-World 3\npost-production 3\npostmaster 3\npostponement 3\npostponing 3\npotbellied 3\npots 3\npotter 
3\npotters 3\npowdered 3\npowerhouses 3\npplcat 3\npraises 3\nprayed 3\npre 3\npre-school 3\npre-tax 3\npreach 3\npreacher 3\nprecepts 3\npredictably 3\nprejudices 3\npreorder 3\npreparedness 3\npreposterous 3\nprerequisites 3\nprerogatives 3\npresage 3\npresale 3\npreset 3\npresume 3\npretended 3\npretending 3\nprickly 3\nprimed 3\npristine 3\npro-China 3\nproactive 3\nproactively 3\nproclamations 3\nprofiting 3\nprogramme 3\nprolific 3\npronged 3\nproportional 3\npropped 3\npropping 3\nprops 3\nprospecting 3\nprotectors 3\nproviso 3\npsychoanalyst 3\npsychologists 3\npuckish 3\npuff 3\npullbacks 3\npulmonary 3\npummeled 3\npunching 3\npuny 3\npup 3\npups 3\npurged 3\npurging 3\npurification 3\npurported 3\npurposefully 3\npursuant 3\npushes 3\nputtable 3\nqigong 3\nquadrupled 3\nquadrupling 3\nquakes 3\nqualification 3\nquantify 3\nquarrel 3\nquartet 3\nqueue 3\nquibble 3\nquintessential 3\nquipped 3\nracehorses 3\nracially 3\nracks 3\nradios 3\nraft 3\nraged 3\nraided 3\nrambunctious 3\nrangers 3\nrapture 3\nrarest 3\nrashly 3\nrationality 3\nrationally 3\nravages 3\nrave 3\nre-emerge 3\nre-employed 3\nre-employment 3\nre-enactment 3\nre-establish 3\nre-evaluating 3\nre-thinking 3\nreadership 3\nrealistically 3\nrebuilt 3\nrecapitalizations 3\nrecaptured 3\nreceivable 3\nrecessive 3\nrecirculated 3\nrecourse 3\nrecreate 3\nrecruiter 3\nrectify 3\nrecycles 3\nredo 3\nredoing 3\nreestablish 3\nreestablished 3\nreferenced 3\nrefinanced 3\nrefiner 3\nrefocused 3\nrefocusing 3\nreformist 3\nrefreshing 3\nrefreshments 3\nrefrigerator 3\nrefuted 3\nregenerate 3\nregisters 3\nregistrants 3\nregretting 3\nreignited 3\nreinsurers 3\nreiterating 3\nrejoicing 3\nrekindling 3\nrelentless 3\nrelocated 3\nreminding 3\nremission 3\nremittances 3\nremnant 3\nrenegade 3\nrenegotiate 3\nrenegotiated 3\nrenews 3\nrenovating 3\nrentals 3\nrepealed 3\nreplacements 3\nreplenished 3\nreplete 3\nrepossessed 3\nreprisal 3\nreproduction 3\nrequisite 3\nrerouting 3\nrescinded 3\nreselling 
3\nresettable 3\nreshuffling 3\nresidencia 3\nresiding 3\nresidual 3\nresorted 3\nresounding 3\nrestatement 3\nrestorer 3\nrestyled 3\nresupply 3\nretaliatory 3\nretires 3\nretrenchment 3\nreunite 3\nreverses 3\nreversible 3\nreviewer 3\nreviewers 3\nrevolted 3\nrevolutionized 3\nrevolutions 3\nrhetorical 3\nribs 3\nrichness 3\nrigidity 3\nrigors 3\nrigueur 3\nringing 3\nriot 3\nriser 3\nrites 3\nriveting 3\nroam 3\nroaming 3\nrobes 3\nroofing 3\nrooftop 3\nroses 3\nrotating 3\nrote 3\nrotors 3\nrouted 3\nrouting 3\nrubbed 3\nrubs 3\nruby 3\nrudimentary 3\nrugby 3\nrunways 3\nrupiahs 3\nrust 3\nrusty 3\nruthlessly 3\nsacks 3\nsaddened 3\nsadly 3\nsafest 3\nsails 3\nsaint 3\nsalaried 3\nsalesperson 3\nsalons 3\nsanguine 3\nsarcasm 3\nsarialhamad@yahoo.com 3\nsashing 3\nsatisfactorily 3\nscammers 3\nscandalous 3\nscanning 3\nscans 3\nscar 3\nscarcity 3\nscares 3\nscarlet 3\nscars 3\nscheming 3\nschizophrenia 3\nscholarly 3\nschtick 3\nscoff 3\nscolded 3\nscooped 3\nscooter 3\nscorecard 3\nscourge 3\nscrape 3\nscreenplay 3\nscrewdriver 3\nscroll 3\nscrutinize 3\nse 3\nseamen 3\nseashore 3\nsecretive 3\nseduce 3\nseeming 3\nsegregation 3\nselfless 3\nsenate 3\nsenseless 3\nsensibilities 3\nserpent 3\nsesame 3\nsewer 3\nshackles 3\nshadowy 3\nshakeup 3\nshan 3\nshareholding 3\nsharks 3\nsharpen 3\nsheetlet 3\nsheikhs 3\nshelled 3\nshelling 3\nshen 3\nshipbuilder 3\nshirking 3\nshoo 3\nshopkeepers 3\nshou 3\nshouldered 3\nshoved 3\nshowrooms 3\nshrapnel 3\nshrift 3\nshrines 3\nshrunk 3\nshunning 3\nshutdowns 3\nshyness 3\nsideways 3\nsift 3\nsignaling 3\nsignatory 3\nsignifies 3\nsilicone 3\nsimplifying 3\nsimulators 3\nsingles 3\nsip 3\nsixties 3\nskating 3\nskeletons 3\nskewed 3\nskimmer 3\nskipped 3\nskirmishes 3\nskittishness 3\nskulls 3\nskyline 3\nskyscrapers 3\nslab 3\nslain 3\nslandering 3\nslant 3\nslap 3\nslaps 3\nslavery 3\nslaying 3\nsleeves 3\nslices 3\nslimmer 3\nslimy 3\nslipper 3\nslippery 3\nslips 3\nslows 3\nslugging 3\nsmack 3\nsmallpox 3\nsmashing 
3\nsmelled 3\nsmelting 3\nsmoother 3\nsnag 3\nsnagged 3\nsneak 3\nsniffed 3\nsniffs 3\nsnowboard 3\nsocalled 3\nsocializing 3\nsocietal 3\nsocioeconomic 3\nsoggy 3\nsolemn 3\nsolicitations 3\nsolvents 3\nsomber 3\nsomeplace 3\nsongwriter 3\nsoothe 3\nsorcery 3\nsorties 3\nsoundness 3\nsow 3\nspacewalk 3\nspaghetti 3\nspam 3\nspanking 3\nsparkle 3\nsparsely 3\nspatial 3\nspawning 3\nspayed 3\nspearheaded 3\nspecialization 3\nspecs 3\nspectacularly 3\nspeculator 3\nspeeded 3\nspeedometer 3\nspider 3\nspied 3\nspiked 3\nspillover 3\nspinach 3\nspindle 3\nspinner 3\nspiteful 3\nsplicing 3\nspooks 3\nsportswear 3\nspotting 3\nsprayed 3\nspreadsheets 3\nsprouting 3\nspurned 3\nspurts 3\nsquandered 3\nsquared 3\nsquaring 3\nsqueaky 3\nsquirming 3\nstagflation 3\nstaggered 3\nstale 3\nstalked 3\nstalwart 3\nstalwarts 3\nstandby 3\nstare 3\nstartled 3\nstarved 3\nstatesmen 3\nstave 3\nsteadfastly 3\nsteadied 3\nsteaks 3\nstellar 3\nstepmother 3\nstereos 3\nstereotype 3\nstereotypical 3\nsterilized 3\nsterilizing 3\nstiffer 3\nstimulation 3\nstimulus 3\nstinging 3\nstirs 3\nstockbrokers 3\nstockings 3\nstooges 3\nstoop 3\nstorefront 3\nstraining 3\nstrangled 3\nstrap 3\nstratospheric 3\nstrenuously 3\nstrident 3\nstrides 3\nstriding 3\nstrikingly 3\nstrobe 3\nstrongman 3\nstub 3\nstung 3\nstyled 3\nstylistic 3\nsubcontractors 3\nsubmerged 3\nsubminimum 3\nsubstituting 3\nsubstitution 3\nsubterfuge 3\nsubterranean 3\nsubtitle 3\nsubways 3\nsucking 3\nsufferers 3\nsuffrage 3\nsuitability 3\nsuitably 3\nsummers 3\nsunrise 3\nsuper-majority 3\nsuperconductor 3\nsuperintendents 3\nsupernatural 3\nsuperstore 3\nsurf 3\nsurfaces 3\nsurgically 3\nsurnames 3\nsurround 3\nsushi 3\nsussex 3\nsustains 3\nswapping 3\nswayed 3\nsweaters 3\nsweepstakes 3\nsweeten 3\nsweets 3\nswells 3\nswirl 3\nswoop 3\nswung 3\nsymbolizing 3\nsynchronized 3\nsynergy 3\nsynonymous 3\nsystemic 3\ntablets 3\ntabloids 3\ntad 3\ntagged 3\ntakers 3\ntallying 3\ntampered 3\ntampering 3\ntan 3\ntandem 
3\ntangchaoZX 3\ntapping 3\ntarnish 3\ntasty 3\nte 3\nteargas 3\nteaspoon 3\ntectonic 3\nteeming 3\nteenagers 3\nteething 3\ntelephoned 3\ntemblors 3\ntempered 3\ntempo 3\ntempt 3\ntendering 3\ntenders 3\ntenets 3\ntenths 3\ntestifying 3\ntestimonial 3\nthankful 3\nthanking 3\nthawed 3\ntheological 3\ntheoretically 3\ntheorize 3\ntheorized 3\ntherapies 3\nthermometer 3\nthickening 3\nthicket 3\nthickly 3\nthirtysomething 3\nthorn 3\nthreefold 3\nthrill 3\nthrilled 3\nthrottle 3\nthrowers 3\nthug 3\nthugs 3\nthursday 3\nthyroid 3\ntidal 3\ntigers 3\ntil 3\ntilting 3\ntimbers 3\ntimeout 3\ntinge 3\ntirelessly 3\ntit 3\ntoilets 3\ntolerable 3\ntomatoes 3\ntonnes 3\ntooling 3\ntornado 3\ntornadoes 3\ntorrential 3\ntorso 3\ntoting 3\ntoying 3\ntraditionelles 3\ntragically 3\ntrainers 3\ntranscend 3\ntransformations 3\ntransfusion 3\ntransponders 3\ntransports 3\ntrappings 3\ntrashy 3\ntraumas 3\ntraumatized 3\ntray 3\ntrench 3\ntrepidation 3\ntrespassing 3\ntribulation 3\ntributaries 3\ntrickle 3\ntrilateral 3\ntrilogy 3\ntripling 3\ntrolley 3\ntrolleys 3\ntrotted 3\ntroublemakers 3\ntroupe 3\ntrousers 3\ntrumpeting 3\ntub 3\ntummy 3\ntumors 3\ntumult 3\ntunnels 3\nturban 3\nturbo 3\nturnabout 3\nturnout 3\ntutoring 3\ntweens 3\ntwelfth 3\ntwenties 3\ntwilight 3\ntycoon 3\ntypewriter 3\ntypo 3\ntyrant 3\nultrasonic 3\nultrasound 3\nunaccounted 3\nunambiguous 3\nunbalanced 3\nunbiased 3\nuncalled 3\nunchanging 3\nuncharacteristically 3\nunchecked 3\nuncivilized 3\nunconfirmed 3\nunconscious 3\nunconventional 3\nuncovering 3\nuncut 3\nunderage 3\nundercurrent 3\nundercutting 3\nunderline 3\nunderpinning 3\nunderprivileged 3\nunderstate 3\nundervotes 3\nunderwater 3\nundisciplined 3\nundisputed 3\nunease 3\nunencumbered 3\nunending 3\nunfazed 3\nunflattering 3\nunfocused 3\nunfolded 3\nunfolds 3\nuniformly 3\nunites 3\nunitholders 3\nunknowingly 3\nunknowns 3\nunlock 3\nunmarried 3\nunnerved 3\nunnerving 3\nunnoticed 3\nunoccupied 3\nunpatriotic 3\nunqualified 
3\nunrealistically 3\nunrealized 3\nunreasonably 3\nunscientific 3\nunseen 3\nunsold 3\nunstoppable 3\nunsubstantiated 3\nunsuspected 3\nunsustainable 3\nuntested 3\nuntouchable 3\nuntouched 3\nuntreated 3\nunwieldy 3\nupgrades 3\nupheavals 3\nupholding 3\nupshot 3\nupswing 3\nupturn 3\nuttered 3\nvacancies 3\nvacating 3\nvalves 3\nvanguard 3\nvanity 3\nvat 3\nvaults 3\nvein 3\nvelvet 3\nventilated 3\nverbally 3\nveritable 3\nvest 3\nvial 3\nvibrating 3\nvicinity 3\nvideocassettes 3\nviewer 3\nvignettes 3\nvirtuoso 3\nvisualize 3\nvoicing 3\nvolatilities 3\nvoluptuous 3\nvomit 3\nvox 3\nvs 3\nvtiger 3\nvu 3\nwacky 3\nwaffle 3\nwafting 3\nwagons 3\nwaiters 3\nwaits 3\nwaiving 3\nwalled 3\nwalnut 3\nwaned 3\nwang 3\nwangs 3\nwarding 3\nwarplanes 3\nwarring 3\nwashes 3\nwaterfront 3\nwatershed 3\nwaterway 3\nwaterways 3\nwaxed 3\nwed 3\nweekdays 3\nweirder 3\nwhacked 3\nwhale 3\nwheeling 3\nwhim 3\nwhimsical 3\nwhipping 3\nwhipsawed 3\nwhispering 3\nwhitish 3\nwho's 3\nwholesome 3\nwicker 3\nwielding 3\nwig 3\nwilt 3\nwindfalls 3\nwindshields 3\nwinters 3\nwiretap 3\nwisecracks 3\nwisest 3\nwithering 3\nwithhold 3\nwobblers 3\nwobbly 3\nwoe 3\nwomanizing 3\nworkable 3\nworldly 3\nworsened 3\nworshipped 3\nwreaked 3\nwrinkles 3\nwrists 3\nwritedowns 3\nwronged 3\nwrongly 3\nx-rayed 3\nx-rays 3\nx3-9890 3\ny 3\nyachts 3\nyao 3\nyearn 3\nyinduasan 3\nyueh 3\nyung 3\nzeroing 3\nzheng 3\nzoology 3\n● 3\n!!!!! 2\n!!!? 2\n!!?? 
2\n\"!!\" 2\n#'s 2\n#15 2\n'03 2\n'07 2\n'40s 2\n'67 2\n'71 2\n'72 2\n'73 2\n'90's 2\n'99 2\n'Id 2\n'bout 2\n'ma 2\n'n 2\n'nt 2\n'til 2\n***** 2\n********** 2\n+++ 2\n----- 2\n-------------- 2\n----------------------- 2\n------------------------ 2\n------------------------- 2\n------------------------------ 2\n---------------------------------------------------------------------- 2\n----------------------------------------------------------------------- 2\n--------------------------------------------------------------------------- 2\n---->===}*{===<---- 2\n-LRB-02-RRB- 2\n-RRB--RRB- 2\n........... 2\n.................... 2\n0.01 2\n0.0108 2\n0.06 2\n0.12 2\n0.15 2\n0.32 2\n0.375 2\n0.43 2\n0.59 2\n0.71 2\n0.94 2\n00000 2\n006 2\n01/24/2001 2\n01/25/2001 2\n010 2\n011 2\n02-05-02.doc 2\n02/27/2001 2\n02/28/2001 2\n03/21/2001 2\n03/23/2001 2\n03/27/2001 2\n03/28/2001 2\n03:58 2\n04 2\n04:03 2\n04:17 2\n04:41 2\n05 2\n05/17/99 2\n05/30/2001 2\n06 2\n07/14/2000 2\n070 2\n072 2\n07:24 2\n08 2\n08/15/2000 2\n08/16/2000 2\n08/17/2000 2\n08:58 2\n09 2\n09/08/2000 2\n09/15/99 2\n09:01 2\n1,000,000 2\n1,012 2\n1,015 2\n1,050 2\n1,111 2\n1,150,000 2\n1,250 2\n1,250,000 2\n1,365,226 2\n1,500,000 2\n1,750 2\n1,828,000 2\n1,859 2\n1- 2\n1-800-991-9019 2\n1.14 2\n1.28 2\n1.51 2\n1.5753 2\n1.5825 2\n1.59 2\n1.5920 2\n1.6030 2\n1.6055 2\n1.62 2\n1.637 2\n1.64 2\n1.66 2\n1.73 2\n1.8200 2\n1.83 2\n1.84 2\n1.8685 2\n1.87 2\n1.89 2\n1.92 2\n1.94 2\n1/3 2\n10,000,000 2\n10.01 2\n10.03 2\n10.05 2\n10.14 2\n10.29 2\n10.35 2\n10.38 2\n10.48 2\n10.625 2\n10.99 2\n100.2 2\n100.4 2\n100th 2\n101.4 2\n102.1 2\n102.625 2\n103,000 2\n105.4 2\n108.4 2\n109.85 2\n10:25 2\n10:45 2\n10:56 2\n10s 2\n11-20-2000 2\n11.04 2\n11.38 2\n11.53 2\n11.60 2\n11.625 2\n11.95 2\n11/27/2000 2\n111.48 2\n112.5 2\n114.4 2\n114.7 2\n115,000 2\n116,000 2\n11608 2\n117.3 2\n11:07 2\n11:15 2\n11s 2\n12,500 2\n12.1 2\n12.76 2\n12.95 2\n12/32 2\n120.1 2\n120.7 2\n122.7 2\n123.5 2\n1230.80 2\n1247.87 2\n125,000 2\n125.5 
2\n1254.27 2\n127.5 2\n129.49 2\n12:00 2\n12:15 2\n12:48 2\n13,120 2\n13.02 2\n13.05 2\n13.32 2\n13.35 2\n13.71 2\n13.9 2\n13.94 2\n13011 2\n132.8 2\n133 2\n134.8 2\n136.4 2\n137.6 2\n14.25 2\n14.99 2\n141.55 2\n141.80 2\n142.5 2\n142.70 2\n1423 2\n1425 2\n1426 2\n146.8 2\n1466.29 2\n15.06 2\n15.25 2\n15.375 2\n15.625 2\n15.72 2\n15.75 2\n15.80 2\n15.82 2\n15.9 2\n15.97 2\n15/16 2\n15/32 2\n150.3 2\n1504 2\n151 2\n151.20 2\n1520 2\n1530 2\n154.2 2\n155,000 2\n155,650,000 2\n159 2\n1590's 2\n16.375 2\n16.40 2\n16.59 2\n16.75 2\n16.9 2\n16.95 2\n16/32 2\n161.1 2\n161.5 2\n162,000 2\n16241 2\n163.3 2\n163blog 2\n164,830,000 2\n166,900,000 2\n166.9 2\n17.1 2\n17.3 2\n17.4 2\n170,330,000 2\n170.4 2\n1701 2\n172.2 2\n172.5 2\n173.1 2\n175,000 2\n176,100,000 2\n177.5 2\n178.375 2\n178.9 2\n1796 2\n18.375 2\n18.50 2\n18.8 2\n18/32 2\n180.9 2\n1800 2\n183 2\n1862 2\n1868 2\n187 2\n1872 2\n1874 2\n1881 2\n1890s 2\n1895 2\n18:00 2\n18s 2\n19.25 2\n19.8 2\n190,000 2\n1900s 2\n1908 2\n1909 2\n191.75 2\n1912 2\n192.5 2\n1923 2\n1925 2\n1927 2\n193 2\n1935 2\n1936 2\n198,120,000 2\n1989B 2\n1:11 2\n1:30 2\n2,002 2\n2,064 2\n2,120 2\n2,202,000 2\n2,205,000 2\n2,250,000 2\n2,300 2\n2,360 2\n2,520 2\n2,600 2\n2- 2\n2.01 2\n2.03 2\n2.04 2\n2.05 2\n2.07 2\n2.08 2\n2.09 2\n2.10 2\n2.16 2\n2.17 2\n2.22 2\n2.26 2\n2.27 2\n2.28 2\n2.32 2\n2.34 2\n2.36 2\n2.44 2\n2.56 2\n2.57 2\n2.59 2\n2.69 2\n2.70 2\n2.74 2\n2.76 2\n2.78 2\n2.89 2\n2.95 2\n2.975 2\n2/8 2\n20.3 2\n20.4 2\n20.42 2\n20.7 2\n20.75 2\n200...@gmail.com 2\n200mA 2\n200s 2\n201 2\n202.456.1111 2\n202.456.1414 2\n202.456.2461 2\n208.7 2\n209,000 2\n21.12 2\n21.125 2\n21.25 2\n21.44 2\n21.9 2\n2149.3 2\n219.142. 
2\n22.1 2\n22.125 2\n22.50 2\n22.78 2\n22.9 2\n22/32 2\n2200 2\n224,070,000 2\n224.1 2\n225,000 2\n226.3 2\n23.1 2\n23.25 2\n23.625 2\n23.7 2\n23.9 2\n23/32 2\n230,000 2\n232 2\n232.3 2\n2348 2\n235.2 2\n237,960,000 2\n24.25 2\n24.7 2\n2400 2\n244 2\n246.6 2\n25.3 2\n25.5 2\n25.6 2\n25.875 2\n250.77 2\n256,000 2\n256.6 2\n257.8 2\n258 2\n259 2\n26,000 2\n26.1 2\n26.3 2\n26/32 2\n2603.48 2\n266.2 2\n266.66 2\n269 2\n27,000 2\n27.5 2\n27.7 2\n271 2\n272,000 2\n274 2\n275,000 2\n276,334 2\n277 2\n278 2\n279 2\n2791.41 2\n28.3 2\n28.5 2\n280,000 2\n283 2\n283.7 2\n283.8 2\n286 2\n29.4 2\n29.6 2\n29.8 2\n292 2\n294 2\n299 2\n29s 2\n2:25 2\n3,100 2\n3,202.61 2\n3,300 2\n3,600 2\n3,900 2\n3.0 2\n3.01 2\n3.032 2\n3.04 2\n3.05 2\n3.09 2\n3.12 2\n3.125 2\n3.20 2\n3.26 2\n3.27 2\n3.29 2\n3.39 2\n3.41 2\n3.56 2\n3.57 2\n3.60 2\n3.62 2\n3.65 2\n3.68 2\n3.73 2\n3.74 2\n3.84 2\n3.86 2\n3.875 2\n3.95 2\n3.97 2\n30.00 2\n30.3 2\n30.4 2\n30.88 2\n300ER 2\n300s 2\n304 2\n307 2\n308.32 2\n31,329 2\n31.18 2\n31.3 2\n31.9 2\n31/32 2\n311 2\n3143C 2\n315 2\n315,000 2\n316 2\n317 2\n317.7 2\n319 2\n32.125 2\n32.3 2\n32.50 2\n32.71 2\n322 2\n323 2\n323s 2\n329 2\n33.25 2\n330,000 2\n331 2\n332.38 2\n334,774 2\n342 2\n344 2\n345-3436 2\n348.4 2\n35.4 2\n35.50 2\n354 2\n356 2\n359 2\n35th 2\n36.1 2\n36.2 2\n36.3 2\n36.4 2\n36.50 2\n363 2\n367 2\n37.50 2\n37.6 2\n37.7 2\n37.8 2\n370,000 2\n3700 2\n374 2\n377 2\n378 2\n38.7 2\n38.8 2\n382 2\n389 2\n39.5 2\n39.7 2\n39.9 2\n394 2\n395 2\n396 2\n396,000 2\n397 2\n399 2\n3:25 2\n4,440 2\n4,500 2\n4,800 2\n4,830 2\n4,900 2\n4.03 2\n4.04 2\n4.20 2\n4.26 2\n4.32 2\n4.35 2\n4.48 2\n4.625 2\n4.67 2\n4.76 2\n4.80 2\n4.88 2\n4.99 2\n40.4 2\n40.7 2\n40.9 2\n400s 2\n403 2\n405.4 2\n409 2\n41.2 2\n41.76 2\n410,000 2\n413 2\n414 2\n415 2\n42,000 2\n42.25 2\n42.3 2\n42.7 2\n420,588 2\n429 2\n43.1 2\n43.3 2\n43.375 2\n43.75 2\n433 2\n436,000 2\n438 2\n44,000 2\n44,400 2\n44,877 2\n44.1 2\n44.125 2\n441.1 2\n445 2\n449 2\n449.3 2\n45.3 2\n45.4 2\n45.50 2\n453 
2\n459.93 2\n46.125 2\n46.5 2\n46.8 2\n460 2\n461 2\n47,000 2\n47,500 2\n47.6 2\n47.9 2\n476.5 2\n478 2\n479 2\n48,000 2\n48.2 2\n481,000 2\n4853 2\n49,000 2\n49.1 2\n49.2 2\n49.6 2\n49.8 2\n49.96 2\n4_28_00.doc 2\n5,200 2\n5,400 2\n5.00 2\n5.04 2\n5.10 2\n5.28 2\n5.41 2\n5.43 2\n5.50 2\n5.58 2\n5.64 2\n5.65 2\n5.66 2\n5.81 2\n5.83 2\n5.91 2\n5.99 2\n50.1 2\n50.50 2\n50.7 2\n50.875 2\n50.9 2\n505-625-8031 2\n507 2\n509 2\n51.1 2\n51.2 2\n51.3 2\n51.50 2\n51.6 2\n51.75 2\n51.9 2\n511 2\n515 2\n518 2\n52,000 2\n52.9 2\n522 2\n525 2\n525,000 2\n526.3 2\n527,000 2\n527.39 2\n528 2\n529.32 2\n53.2 2\n53.3 2\n532 2\n537 2\n54.4 2\n54.5 2\n54.6 2\n54.8 2\n541 2\n542 2\n55.2 2\n55.3 2\n55.6 2\n55.7 2\n557 2\n56.25 2\n56.875 2\n566 2\n569 2\n57.5 2\n575,000 2\n578 2\n57th 2\n58.50 2\n584 2\n585 2\n587 2\n59.3 2\n59.4 2\n59.5 2\n59.6 2\n592 2\n598 2\n5:09 2\n6.00 2\n6.05 2\n6.10 2\n6.2.1 2\n6.40 2\n6.41 2\n6.46 2\n6.52 2\n6.69 2\n6.70 2\n6.75 2\n6.80 2\n6.84 2\n6.95 2\n60.1 2\n613 2\n617 2\n62.25 2\n62.42 2\n62.7 2\n62.8 2\n623 2\n625,000 2\n625.4 2\n628 2\n63.5 2\n63.52 2\n63.9 2\n630 2\n631 2\n632 2\n6400 2\n645 2\n647.33 2\n648.2 2\n65,200 2\n65.2 2\n650,000 2\n654 2\n660 2\n661 2\n664 2\n668 2\n673 2\n678 2\n68.189. 2\n68.2 2\n680 2\n6871082# 2\n69.5 2\n690 2\n694 2\n699 2\n7,500 2\n7.09 2\n7.14 2\n7.282 2\n7.35 2\n7.41 2\n7.53 2\n7.567 2\n7.57 2\n7.73 2\n7.74 2\n7.91 2\n7.99 2\n70.2 2\n70.3 2\n70.5 2\n70.85 2\n70.9 2\n7000 2\n703 2\n70s 2\n71.9 2\n711 2\n713-853-1696 2\n713-853-3098 2\n715 2\n716 2\n72.3 2\n72.4 2\n72.92. 
2\n723 2\n726 2\n73.5 2\n73.8 2\n73.805 2\n730 2\n730,070 2\n74.4 2\n744 2\n749 2\n75.1 2\n75.2 2\n756 2\n759933 2\n75th 2\n76,000 2\n76.5 2\n76.50 2\n76.7 2\n77.3 2\n77.7 2\n774 2\n78.4 2\n78.8 2\n783 2\n785 2\n788 2\n79.03 2\n79.4 2\n7B 2\n8,500 2\n8,880 2\n8.0 2\n8.13 2\n8.15 2\n8.22 2\n8.23 2\n8.31 2\n8.337 2\n8.38 2\n8.43 2\n8.475 2\n8.56 2\n8.575 2\n8.62 2\n8.625 2\n8.63 2\n8.65 2\n8.68 2\n8.82 2\n8.875 2\n8.98 2\n8/32 2\n80386 2\n807 2\n8085 2\n8088 2\n81,000 2\n81.2 2\n81.6 2\n814014 2\n82.2 2\n82.5 2\n822 2\n825 2\n83.8 2\n830 2\n832 2\n833.6 2\n84.4 2\n840 2\n840.8 2\n846 2\n85,000 2\n86.3 2\n86.4 2\n86.50 2\n87.25 2\n871 2\n875 2\n880,000 2\n886 2\n889 2\n8:15 2\n9,500 2\n9. 2\n9.19 2\n9.29 2\n9.34 2\n9.43 2\n9.51 2\n9.53 2\n9.62 2\n9.625 2\n9.725 2\n9.76 2\n9.787 2\n9.86 2\n9.88 2\n9.90 2\n904 2\n90th 2\n91.2 2\n91.7 2\n915,000 2\n93.2 2\n93.75 2\n94.5 2\n944 2\n95,000 2\n95.2 2\n95.4 2\n951 2\n960,000 2\n961 2\n965 2\n97.9 2\n98.5 2\n98.6 2\n980.2 2\n99.1 2\n99.14 2\n99.5 2\n99.85 2\n99.9 2\n99.90 2\n991 2\n:- 2\n:D 2\n:P 2\n;--RRB- 2\n<http://pangshengdong.com/> 2\n=== 2\n============================================================================== 2\n=particularly 2\n?!? 2\n????? 2\nA&P 2\nA&W 2\nA's 2\nA+ 2\nA.D. 2\nA.F. 
2\nA14062 2\nAAC 2\nABBIE 2\nABCs 2\nACCIDENT 2\nACCOUNTING 2\nADS 2\nAEP 2\nAK47s 2\nAKs 2\nALQ 2\nAMD 2\nAMP 2\nANGELES 2\nANTONIO 2\nANYTHING 2\nAPI.pdf 2\nAPPLE 2\nAREA 2\nARSU 2\nARTICLE 2\nASCAP 2\nASK 2\nAST 2\nATC 2\nAUS 2\nAWESOME 2\nAWFUL 2\nAbalkin 2\nAbdulaziz 2\nAbdulkarim 2\nAbizaid 2\nAboriginal 2\nAbruzzo 2\nAbstract 2\nAbsurd 2\nAbuja 2\nAbuse 2\nAccepting 2\nAccounts 2\nAccused 2\nAcquired 2\nAcquiring 2\nActively 2\nActor 2\nAcupuncture 2\nAda 2\nAdjusters 2\nAdmirers 2\nAdnan 2\nAdnia 2\nAdopting 2\nAdriana 2\nAdult 2\nAdvancers 2\nAdverse 2\nAdviser 2\nAdz 2\nAegis 2\nAeroquip 2\nAffiliated 2\nAffiliation 2\nAffordable 2\nAfghanis 2\nAfraid 2\nAfternoon 2\nAfzal 2\nAgencies 2\nAgenda 2\nAggression 2\nAgnellis 2\nAgnew 2\nAgni 2\nAgora 2\nAgricola 2\nAhern 2\nAhlam 2\nAided 2\nAim 2\nAkashiken 2\nAkerson 2\nAkio 2\nAla 2\nAladdin 2\nAlarcon 2\nAlarmed 2\nAlas 2\nAlawali 2\nAlawi 2\nAlawsat 2\nAlbanians 2\nAlbum 2\nAlceste 2\nAldus 2\nAleman 2\nAlexandrine 2\nAlfredo 2\nAlger 2\nAlgiers 2\nAlito 2\nAlive 2\nAllow 2\nAlmanac 2\nAloe 2\nAlpha 2\nAlpharetta 2\nAlpine 2\nAlps 2\nAlsthom 2\nAlter 2\nAltos 2\nAlvarez 2\nAmanda 2\nAmax 2\nAmcast 2\nAmelia 2\nAmendments 2\nAmer 2\nAmericana 2\nAmeritas 2\nAmfac 2\nAmir 2\nAmon 2\nAnbar 2\nAnchorage 2\nAndersen 2\nAndres 2\nAnfal 2\nAngelica 2\nAngle 2\nAnjana 2\nAnnex 2\nAnniston 2\nAnnounced 2\nAnswers.com 2\nAnti-nuclear 2\nAntichrist 2\nAntique 2\nAntiques 2\nAntitrust 2\nAnts 2\nAntung 2\nAnxious 2\nAnying 2\nAoki 2\nApparel 2\nAppellate 2\nAppliance 2\nAppointed 2\nApproximately 2\nAquarius 2\nAraskog 2\nArbitragers 2\nArchdiocese 2\nArcheology 2\nArchitects 2\nArchive 2\nArchives 2\nArden 2\nArgentinian 2\nArgus 2\nArial 2\nArianna 2\nAries 2\nArivalo 2\nArkoma 2\nArlen 2\nArm 2\nArmatho 2\nArmenians 2\nArmuelles 2\nAromatherapy 2\nAronson 2\nArraf 2\nArrv. 
2\nArtist 2\nAsad 2\nAsan 2\nAsharq 2\nAshraf 2\nAshtabula 2\nAsilone 2\nAsner 2\nAspin 2\nAspire 2\nAssemblyman 2\nAssociations 2\nAtkinson 2\nAtmosphere 2\nAtmospheric 2\nAtop 2\nAttachment 2\nAttack 2\nAttendants 2\nAttention 2\nAttic 2\nAttn. 2\nAuCoin 2\nAurora 2\nAutomation 2\nAutomax 2\nAutry 2\nAvailable 2\nAvard 2\nAvdel 2\nAvedisian 2\nAviacion 2\nAvoiding 2\nAweys 2\nAwsat 2\nAxe 2\nAy 2\nAyman 2\nAzerbaijan 2\nAzhar 2\nAznar 2\nB&Q 2\nB.C 2\nB.J. 2\nBANK 2\nBAY 2\nBBDO 2\nBCC 2\nBCE 2\nBDDP 2\nBEARDS 2\nBEEN 2\nBEFORE 2\nBEING 2\nBEWARE 2\nBJ 2\nBK 2\nBOTH 2\nBRENNER 2\nBRIDGET 2\nBSP 2\nBSPs 2\nBY 2\nBaa2 2\nBaburam 2\nBabylon 2\nBachman 2\nBacillus 2\nBadner 2\nBagdad 2\nBagels 2\nBaghdatis 2\nBahraini 2\nBaidoa 2\nBailit 2\nBaishi 2\nBait 2\nBakiyev 2\nBalloon 2\nBanana 2\nBandler 2\nBangladeshi 2\nBanharn 2\nBankshares 2\nBanner 2\nBao'an 2\nBaocheng 2\nBaptist 2\nBaqubah 2\nBarabba 2\nBarbarian 2\nBarbie 2\nBarbra 2\nBarely 2\nBarrah 2\nBartholomew 2\nBartlesville 2\nBasfar 2\nBasheer 2\nBashers 2\nBashing 2\nBashir 2\nBataan 2\nBatawi 2\nBateman 2\nBath 2\nBatin 2\nBaton 2\nBattelle 2\nBauche 2\nBaudelaire 2\nBauer 2\nBayan 2\nBayern 2\nBeacon 2\nBeal 2\nBean 2\nBeard 2\nBearded 2\nBeardies 2\nBeats 2\nBeatty 2\nBeaverton 2\nBeckman 2\nBecome 2\nBeddall 2\nBee 2\nBeech 2\nBegin 2\nBeijiang 2\nBeiping 2\nBelarus 2\nBelding 2\nBelier 2\nBella 2\nBeltway 2\nBenajones 2\nBenatar 2\nBend 2\nBendrel 2\nBeneath 2\nBenelux 2\nBenny 2\nBensonhurst 2\nBentsen 2\nBerger 2\nBermuda 2\nBern 2\nBernie 2\nBerol 2\nBerra 2\nBertolotti 2\nBertrand 2\nBethesda 2\nBette 2\nBewl 2\nBhd. 
2\nBianchi 2\nBick 2\nBideri 2\nBiennial 2\nBiggest 2\nBih 2\nBilanz 2\nBilling 2\nBillings 2\nBince 2\nBinghamton 2\nBinhai 2\nBio 2\nBio-Technology 2\nBiologists 2\nBiosource 2\nBiotechnical 2\nBirinyi 2\nBirth 2\nBishkek 2\nBixby 2\nBizMart 2\nBjork 2\nBjorn 2\nBlackhawk 2\nBlacks 2\nBlaine 2\nBlake 2\nBlanchard 2\nBlanco 2\nBland 2\nBlandon 2\nBlankenship 2\nBlast 2\nBlazer 2\nBlender 2\nBlessing 2\nBliznakov 2\nBlizzard 2\nBlondes 2\nBloom 2\nBloomfield 2\nBlumstein 2\nBobar 2\nBockius 2\nBockris 2\nBognato 2\nBoheme 2\nBoi 2\nBolar 2\nBolden 2\nBolger 2\nBombs 2\nBones 2\nBonfire 2\nBooker 2\nBookstore 2\nBooming 2\nBoots 2\nBorg 2\nBorner 2\nBoskin 2\nBostian 2\nBostic 2\nBostik 2\nBosworth 2\nBoudreau 2\nBourbon 2\nBourgeoisie 2\nBourse 2\nBowater 2\nBowers 2\nBowing 2\nBowker 2\nBowls 2\nBoxes 2\nBozhong 2\nBragg 2\nBraintree 2\nBrake 2\nBrandenburg 2\nBrandon 2\nBranford 2\nBrant 2\nBrantford 2\nBrassai 2\nBrave 2\nBraves 2\nBreak 2\nBreaux 2\nBrecht 2\nBrechtian 2\nBreeders 2\nBreeding 2\nBreen 2\nBremmer 2\nBrenda 2\nBrevetti 2\nBrewer 2\nBreweries 2\nBrezhnevite 2\nBrickell 2\nBridget 2\nBrigham 2\nBrink 2\nBriscoe 2\nBrissette 2\nBritains 2\nBrittany 2\nBrizola 2\nBroadcasters 2\nBrockville 2\nBroder 2\nBromley 2\nBronco 2\nBroncos 2\nBronson 2\nBrook 2\nBrookline 2\nBrowne 2\nBrozman 2\nBruner 2\nBrunettes 2\nBrush 2\nBuchman 2\nBuchwald 2\nBudgeting 2\nBug 2\nBugs 2\nBuilt 2\nBukhari 2\nBulgarians 2\nBulge 2\nBullet 2\nBum 2\nBumpers 2\nBundy 2\nBunny 2\nBunting 2\nBurford 2\nBurghley 2\nBuried 2\nBurn 2\nBurrow 2\nBuses 2\nBusinessNews 2\nBussieres 2\nButch 2\nBuzzell 2\nByelorussia 2\nByler 2\nByrum 2\nC.R. 
2\nCAEM 2\nCAN 2\nCAPITAL 2\nCARE 2\nCB 2\nCBN 2\nCBRC 2\nCCA 2\nCCNG 2\nCCW 2\nCDBG 2\nCDWR 2\nCF6 2\nCHECKOFF 2\nCHEMICAL 2\nCHERNOBYL 2\nCHESS 2\nCHICK 2\nCHT 2\nCIM 2\nCIT 2\nCJPT 2\nCLASS 2\nCLAUSE 2\nCLH 2\nCLTV 2\nCOKE 2\nCOMES 2\nCONTINENTAL 2\nCOOPER 2\nCORRECT 2\nCOULD 2\nCOURT 2\nCP 2\nCP486 2\nCPS 2\nCPUs 2\nCRAF 2\nCRI 2\nCRRES 2\nCSIS 2\nCSS 2\nCST 2\nCTB 2\nCTV 2\nCWA 2\nCY 2\nCaa 2\nCaere 2\nCaesarean 2\nCafeteria 2\nCafferty 2\nCain 2\nCairns 2\nCal 2\nCalabasas 2\nCalanda 2\nCalaveras 2\nCaldwell 2\nCaledonia 2\nCaliphate 2\nCallable 2\nCalor 2\nCalvi 2\nCambiasso 2\nCamera 2\nCamilo 2\nCanadians 2\nCanaveral 2\nCandice 2\nCandidates 2\nCandy 2\nCannon 2\nCanon 2\nCanonie 2\nCanter 2\nCantor 2\nCanvasing 2\nCapitalists 2\nCapitalizing 2\nCaprice 2\nCapricorn 2\nCapt. 2\nCara 2\nCardiff 2\nCardinals 2\nCardiovascular 2\nCards 2\nCareer 2\nCareful 2\nCargo 2\nCarlson 2\nCarlton 2\nCarlucci 2\nCarmel 2\nCarmichael 2\nCarmine 2\nCarnahan 2\nCarpentry 2\nCarrefour 2\nCarribbean 2\nCarrie 2\nCasinos 2\nCaspar 2\nCaspi 2\nCaspian 2\nCaspita 2\nCassation 2\nCassini 2\nCastaic 2\nCastillo 2\nCathedral 2\nCatholicism 2\nCattle 2\nCatwell 2\nCaucasians 2\nCaucus 2\nCavalier 2\nCaveat 2\nCavenee 2\nCayman 2\nCedar 2\nCelebrating 2\nCelebration 2\nCeltona 2\nCensorship 2\nCentennial 2\nCenterior 2\nCentronics 2\nChance 2\nChances 2\nChangbai 2\nChanghao 2\nChanghe 2\nChangjiang 2\nChangqing 2\nChangyi 2\nChanley 2\nChaoyang 2\nChaozhi 2\nChappaqua 2\nCharacters 2\nCharisma 2\nCharitable 2\nCharity 2\nCharlton 2\nChartered 2\nChatset 2\nChecchi 2\nChecked 2\nCheerleaders 2\nCheese 2\nChef 2\nChekhov 2\nChelsea 2\nChemex 2\nChenhsipao 2\nChesebrough 2\nChesley 2\nChex 2\nChezan 2\nChiapas 2\nChiara 2\nChicagoans 2\nChichester 2\nChickens 2\nChico 2\nChijian 2\nChilean 2\nChilmark 2\nChinaman 2\nChinamen 2\nChinanews 2\nChino 2\nChishui 2\nChiushe 2\nChoate 2\nChoice 2\nChojnowski 2\nChowk 2\nChristensen 2\nChronicles 2\nChua 2\nChui 2\nChumley 2\nChungcheng 
2\nChungcheongnam 2\nChungshan 2\nChunju 2\nChuoshui 2\nChutung 2\nCiavarella 2\nCicero 2\nCitation 2\nCitigroup 2\nCivic 2\nClaimants 2\nClairol 2\nClan 2\nClanahan 2\nClare 2\nClaridge 2\nClaudia 2\nCleaning 2\nClear 2\nClients 2\nCliff 2\nClock 2\nClosely 2\nCloser 2\nClothiers 2\nCloud 2\nCloudy 2\nCo-founder 2\nCoburn 2\nCoeur 2\nCoil 2\nCokely 2\nCollaboration 2\nCollectibles 2\nCollection 2\nCollective 2\nCollectors 2\nColnaghi 2\nColombians 2\nColon 2\nColonial 2\nColony 2\nColumbian 2\nCombating 2\nComcast 2\nComet 2\nComito 2\nCommanding 2\nComment 2\nCommentators 2\nCommenting 2\nCommunism 2\nComparing 2\nCompassion 2\nCompensation 2\nCompeting 2\nCompetitors 2\nCompiled 2\nCompliance 2\nComplying 2\nCompound 2\nCompounding 2\nComputerworld 2\nConceptual 2\nConcerns 2\nConette 2\nConfederate 2\nConfederations 2\nConfirming 2\nConflicts 2\nConfusion 2\nCongestion 2\nConlin 2\nConnan 2\nConnection 2\nConning 2\nConrad 2\nConradies 2\nConsideration 2\nConsolidation 2\nConstant 2\nConstructing 2\nConsulate 2\nConte 2\nContinentals 2\nContracting 2\nContractors 2\nContracts 2\nContractual 2\nConverting 2\nConvicted 2\nCoolant 2\nCooperative 2\nCorcoran 2\nCorley 2\nCormack 2\nCornard 2\nCornette 2\nCorporal 2\nCorrespondence 2\nCorroon 2\nCorruption 2\nCosts 2\nCostume 2\nCougar 2\nCounterpoint 2\nCounting 2\nCountless 2\nCoupled 2\nCousteau 2\nCoverage 2\nCovington 2\nCowan 2\nCragy 2\nCreated 2\nCreation 2\nCreator 2\nCreditWatch 2\nCreekside 2\nCremonie 2\nCrew 2\nCrews 2\nCriticism 2\nCroix 2\nCrosby 2\nCrosse 2\nCrossing 2\nCrowe 2\nCrowleyan 2\nCrozier 2\nCrudeStocks.pdf 2\nCrunch 2\nCrusader 2\nCrutcher 2\nCrutzen 2\nCuckoo 2\nCuniket 2\nCunningham 2\nCurcio 2\nCuriously 2\nCurran 2\nCurse 2\nCurtin 2\nCurtis 2\nCustodian 2\nCutting 2\nCyber 2\nCycle 2\nCymbal 2\nD.c. 
2\nD7000 2\nDC10 2\nDESPITE 2\nDIES 2\nDISPOSAL 2\nDOM 2\nDR 2\nDRAM 2\nDRUG 2\nDUI 2\nDVI 2\nDa 2\nDaggs 2\nDailey 2\nDaim 2\nDalama 2\nDallic 2\nDangerous 2\nDanish 2\nDannemiller 2\nDanube 2\nDanvers 2\nDaouk 2\nDaqing 2\nDarby 2\nDarren 2\nDarwinism 2\nDataTimes 2\nDatacomp 2\nDatamanager 2\nDauchy 2\nDaugherty 2\nDaukoru 2\nDavison 2\nDavos 2\nDawson 2\nDayuan 2\nDeBerry 2\nDeCook 2\nDeGol 2\nDeSoto 2\nDeacon 2\nDeaf 2\nDebaathification 2\nDebi 2\nDecades 2\nDecide 2\nDecisive 2\nDecoud 2\nDeek 2\nDeeply 2\nDeerfield 2\nDefine 2\nDefinite 2\nDegeneration 2\nDekuliar 2\nDelay 2\nDelco 2\nDelivery 2\nDelphi 2\nDemolition 2\nDems 2\nDemster 2\nDen 2\nDenlea 2\nDentistry 2\nDeparting 2\nDerbyshire 2\nDescribing 2\nDesierto 2\nDesigning 2\nDesktop 2\nDetention 2\nDeutsch 2\nDevario 2\nDevice 2\nDeyang 2\nDharma 2\nDhu 2\nDian 2\nDiaz 2\nDictionary 2\nDie 2\nDiebel 2\nDietrich 2\nDillard 2\nDiller 2\nDillmann 2\nDime 2\nDirectory 2\nDirty 2\nDisappointing 2\nDisc 2\nDisclosure 2\nDisco 2\nDiscounted 2\nDiscover 2\nDiscreet 2\nDiscussion 2\nDisgusting 2\nDistance 2\nDistribution 2\nDitto 2\nDividend 2\nDjango 2\nDoctors 2\nDogtown 2\nDoherty 2\nDoll 2\nDomenici 2\nDomingo 2\nDominick 2\nDonations 2\nDone 2\nDonghai 2\nDongsheng 2\nDoolittle 2\nDoors 2\nDoren 2\nDornan 2\nDorsch 2\nDostoevski 2\nDoubles 2\nDove 2\nDowning 2\nDowntown 2\nDp 2\nDracula 2\nDragons 2\nDrawing 2\nDress 2\nDrinker 2\nDrivon 2\nDrummond 2\nDrunk 2\nDry 2\nDryja 2\nDuane 2\nDucks 2\nDuct 2\nDude 2\nDuesseldorf 2\nDujail 2\nDujiang 2\nDule 2\nDumas 2\nDumpty 2\nDunde 2\nDunke 2\nDunkin 2\nDurable 2\nDust 2\nDuty 2\nDuval 2\nDuvall 2\nDwelling 2\nDwight 2\nDynamic 2\nDynasties 2\nE.R. 2\nE.W. 
2\nEBay.com 2\nECI 2\nECLAC 2\nECU 2\nEELV 2\nEES 2\nEGA 2\nEIN 2\nELECTRIC 2\nEND 2\nENI 2\nENRON@enronXgate 2\nENW_GCP 2\nEPMI 2\nET 2\nEV71 2\nEVA 2\nEVEN 2\nEVERY 2\nEVERYONE 2\nEVERYTHING 2\nEVREC 2\nEWDB 2\nEXCELLENT 2\nEXECUTIVES 2\nEXPERIENCE 2\nEagleton 2\nEarning 2\nEaux 2\nEavesdropping 2\nEcho 2\nEchoes 2\nEchoing 2\nEckenfelder 2\nEclecticism 2\nEclipse 2\nEdge 2\nEdition 2\nEdmonton 2\nEdsty 2\nEffects 2\nEfficient 2\nEffort 2\nEidul 2\nEiffel 2\nEighteen 2\nEighty 2\nEisenberg 2\nEisenhower 2\nElaborating 2\nElanco 2\nElastic 2\nEleanor 2\nElephants 2\nEleven 2\nElf 2\nElimination 2\nElite 2\nElliot 2\nElm 2\nElmira 2\nElmo 2\nElton 2\nElvekrog 2\nElvis 2\nEmacs 2\nEmbedded 2\nEmery 2\nEmigrating 2\nEmily 2\nEminase 2\nEmma 2\nEmmerich 2\nEmperors 2\nEmporium 2\nEmpty 2\nEmshwiller 2\nEmyanitoff 2\nEn 2\nEndangered 2\nEndless 2\nEndo 2\nEnemy 2\nEnforcers 2\nEngineer 2\nEnglund 2\nEnjoy 2\nEnrique 2\nEnronOnLine 2\nEntity 2\nEntrepreneurs 2\nEntry 2\nEnvoy 2\nEpic 2\nEquine 2\nEquities 2\nErdogan 2\nEricson 2\nErinys 2\nEritrean 2\nEritreans 2\nErlian 2\nErshilibao 2\nErwin 2\nEskenazi 2\nEskridge 2\nEspionage 2\nEspre 2\nEstonian 2\nEsty 2\nEtc. 2\nEthan 2\nEton 2\nEubank 2\nEuljiro 2\nEurasia 2\nEurobond 2\nEuromarket 2\nEurostat 2\nEustachy 2\nEva 2\nEvent 2\nEvergreen 2\nEverytime 2\nEwing 2\nEx 2\nEx-dividend 2\nExaminer 2\nExamples 2\nExcellence 2\nExchanges 2\nExclusive 2\nExpatriate 2\nExpensive 2\nExperienced 2\nExperiments 2\nExplain 2\nExplaining 2\nExplonaft 2\nExposed 2\nExpressionist 2\nExtension 2\nExtreme 2\nEyes 2\nF.O.B. 
2\nF1 2\nF2 2\nFALL 2\nFANTASTIC 2\nFAX 2\nFDR 2\nFHS 2\nFIDE 2\nFINALLY 2\nFIRM 2\nFIVE 2\nFMC 2\nFORCE 2\nFORGOT 2\nFORMER 2\nFOX 2\nFTA 2\nFUCKING 2\nFab 2\nFabi 2\nFabric 2\nFabrics 2\nFabulous 2\nFacebook 2\nFactor 2\nFagan 2\nFahd2000 2\nFahrenheit 2\nFailed 2\nFairness 2\nFaithful 2\nFalco 2\nFalk 2\nFalkland 2\nFalklands 2\nFamed 2\nFamilia 2\nFamine 2\nFans 2\nFareed 2\nFarms 2\nFarney 2\nFarooquee 2\nFarr 2\nFate 2\nFateh 2\nFavored 2\nFeathers 2\nFedayeen 2\nFeedlots 2\nFein 2\nFeith 2\nFelicia 2\nFellini 2\nFellow 2\nFemales 2\nFenglingdu 2\nFengshui 2\nFerembal 2\nFernandes 2\nFerrari 2\nFerreira 2\nFerrer 2\nFerrett 2\nFertilizer 2\nFeuerbach 2\nFever 2\nFiechter 2\nFiesta 2\nFifty 2\nFiguring 2\nFill 2\nFilms 2\nFinals 2\nFinest 2\nFinks 2\nFinn 2\nFionnuala 2\nFiorello 2\nFirefighters 2\nFires 2\nFirfer 2\nFisherman 2\nFishermen 2\nFitch 2\nFixx 2\nFlamply 2\nFlanked 2\nFlashdance 2\nFlat 2\nFlesh 2\nFlintstones 2\nFlorencen 2\nFlores 2\nFlowers 2\nFmHA 2\nFocusing 2\nFogg 2\nFonda 2\nFonz 2\nFoolish 2\nFoote 2\nForecasters 2\nForeclosures 2\nFormal 2\nFormally 2\nFormerly 2\nForster@ENRON 2\nFortney 2\nFosse 2\nFossey 2\nFourteenth 2\nFoxmoor 2\nFrabotta 2\nFramework 2\nFran 2\nFrancais 2\nFrances 2\nFranciscans 2\nFrancoise 2\nFranks 2\nFranz 2\nFraumeni 2\nFreezing 2\nFrelick 2\nFreon 2\nFrequently 2\nFreudenberger 2\nFri. 
2\nFrieden 2\nFrito 2\nFrontier 2\nFrosted 2\nFrozen 2\nFueling 2\nFuller 2\nFully 2\nFung 2\nFurniture 2\nFuruta 2\nFuyan 2\nFuyang 2\nG6PC2 2\nGCP_London 2\nGEC 2\nGEM 2\nGENERAL 2\nGEORGE 2\nGI 2\nGIRL 2\nGIVE 2\nGNOFHAC 2\nGORBACHEV 2\nGPS 2\nGR8FLRED 2\nGRILL 2\nGUYS 2\nGWB 2\nGabele 2\nGabrahall 2\nGaddafi 2\nGaffney 2\nGain 2\nGalamian 2\nGalanos 2\nGalle 2\nGallust 2\nGamal 2\nGan 2\nGann 2\nGannan 2\nGant 2\nGardiner 2\nGaribaldi 2\nGarland 2\nGarment 2\nGathering 2\nGatorade 2\nGatsby 2\nGaulle 2\nGbps 2\nGe'ermu 2\nGeagea 2\nGeary 2\nGebhard 2\nGeduld 2\nGeeks 2\nGemini 2\nGeno 2\nGenova 2\nGenshen 2\nGeorgeson 2\nGeorgian 2\nGepeng 2\nGerhard 2\nGermeten 2\nGersony 2\nGerstner 2\nGetty 2\nGhost 2\nGibbons 2\nGideon 2\nGifts 2\nGiftzwerg 2\nGil 2\nGilad 2\nGilleland 2\nGilmartin 2\nGilton 2\nGingrich 2\nGiovannini 2\nGiraffe 2\nGirl 2\nGiroldi 2\nGirozentrale 2\nGives 2\nGizbert 2\nGlacier 2\nGlad 2\nGlare 2\nGlasnost 2\nGlaxo 2\nGliedman 2\nGlobo 2\nGlorious 2\nGnu 2\nGoat 2\nGodown 2\nGoebbels 2\nGoes 2\nGogiea 2\nGoldstein 2\nGoldston 2\nGoliaths 2\nGolomb 2\nGonzales 2\nGoode 2\nGoodfellow 2\nGoods 2\nGoodwin 2\nGoodwyn 2\nGoose 2\nGorillas 2\nGourlay 2\nGoverning 2\nGovt 2\nGowen 2\nGracie 2\nGrade 2\nGraedel 2\nGraib 2\nGrammond 2\nGrande 2\nGrannies 2\nGrano 2\nGrantor 2\nGranville 2\nGrapes 2\nGraphic 2\nGrauer 2\nGraves 2\nGravity 2\nGreenery 2\nGressette 2\nGrieco 2\nGriffith 2\nGrigoli 2\nGrill 2\nGrinnan 2\nGrisebach 2\nGrocery 2\nGrohl 2\nGromov 2\nGrounds 2\nGrow 2\nGrown 2\nGrupo 2\nGuam 2\nGuangzhi 2\nGuanquecailang 2\nGuanyin 2\nGubeni 2\nGuenter 2\nGuerra 2\nGui 2\nGuilin 2\nGuillermo 2\nGuinean 2\nGulbuddin 2\nGulls 2\nGulobowich 2\nGumbel 2\nGumucio 2\nGun 2\nGuns 2\nGuocheng 2\nGuojun 2\nGustavo 2\nGuthrie 2\nGutierrez 2\nGuttman 2\nGuys 2\nGwyneth 2\nGym 2\nGymnasium 2\nH.J. 
2\nHASTINGS 2\nHCC 2\nHEAR 2\nHEI 2\nHIAA 2\nHIGHLY 2\nHOLIDAY 2\nHOW 2\nHQ 2\nHRGini 2\nHa 2\nHaberle 2\nHadden 2\nHadith 2\nHafer 2\nHafr 2\nHagar 2\nHagen 2\nHaig 2\nHaile 2\nHain 2\nHaines 2\nHair 2\nHajime 2\nHakkas 2\nHallingby 2\nHallmark 2\nHamm 2\nHammerstein 2\nHandover 2\nHandy 2\nHanifen 2\nHaniya 2\nHank 2\nHanks 2\nHannah 2\nHannifin 2\nHansen 2\nHapipy 2\nHappened 2\nHarbin 2\nHardee 2\nHardiman 2\nHardis 2\nHarith 2\nHarland 2\nHarpener 2\nHarriet 2\nHarriman 2\nHarrington 2\nHartley 2\nHartnett 2\nHartpury 2\nHartwell 2\nHarvest 2\nHashemite 2\nHaskayne 2\nHats 2\nHauptman 2\nHawi 2\nHawijah 2\nHawker 2\nHawley 2\nHawthorne 2\nHays 2\nHeaded 2\nHeading 2\nHearings 2\nHeatingOilStocks.pdf 2\nHeavily 2\nHefei 2\nHefner 2\nHeh 2\nHeidelberg 2\nHeileman 2\nHeilongjiang 2\nHejin 2\nHelens 2\nHelmsleySpear 2\nHelpern 2\nHelps 2\nHemingway 2\nHemming 2\nHeng 2\nHenley 2\nHenning 2\nHenri 2\nHepu 2\nHerb 2\nHercules 2\nHernandez 2\nHero 2\nHersh 2\nHersly 2\nHeston 2\nHeyman 2\nHibbard 2\nHickey 2\nHilan 2\nHildebrandt 2\nHilger 2\nHillard 2\nHilliard 2\nHimebaugh 2\nHimont 2\nHinckley 2\nHingham 2\nHipolito 2\nHiroshi 2\nHiroyuki 2\nHirsch 2\nHisha 2\nHiss 2\nHistorians 2\nHistorical 2\nHitchcock 2\nHixson 2\nHmm 2\nHmmmmmm 2\nHmong 2\nHnilica 2\nHockey 2\nHodges 2\nHodson 2\nHohhot 2\nHolocaust 2\nHomestead 2\nHometown 2\nHongkong 2\nHonorary 2\nHonors 2\nHoot 2\nHop 2\nHoran 2\nHoricon 2\nHorizons 2\nHormuz 2\nHoss 2\nHoughton 2\nHours 2\nHouses 2\nHovnanian 2\nHowley 2\nHoyt 2\nHsingyun 2\nHsun 2\nHubble 2\nHueglin 2\nHulk 2\nHumanistic 2\nHumphrey 2\nHumpty 2\nHumulin 2\nHurley 2\nHurun 2\nHusain 2\nHusband 2\nHusker 2\nHussey 2\nHyland 2\nHymowitz 2\nHypocrite 2\nHz 2\nI.C.H. 
2\nI/C 2\nIBMS 2\nICON 2\nIMS 2\nINTEREST 2\nISPs 2\nIbbotson 2\nIda 2\nIdeal 2\nIdeas 2\nIdentify 2\nIdentity 2\nIdeologues 2\nIdris 2\nIfraem 2\nIgdaloff 2\nIhsas2 2\nIke 2\nIlyushins 2\nImaginary 2\nImaging 2\nImasco 2\nImbalance 2\nImhoff 2\nImmediate 2\nImpact 2\nImpco 2\nImprimis 2\nImproving 2\nInca 2\nIncorporated 2\nIncreased 2\nIncreases 2\nInd 2\nIndexed 2\nIndexing 2\nIndicators 2\nIndies 2\nIndustria 2\nIndustrialization 2\nIndustrie 2\nIndustrielle 2\nInfoCorp 2\nInfrared 2\nIngalls 2\nInmac 2\nInmates 2\nInnovative 2\nInns 2\nInquiry 2\nInsiders 2\nInsisting 2\nInstallation 2\nInterco 2\nIntercontinental 2\nInterface 2\nIntermoda 2\nInternationale 2\nInternationally 2\nInterpol 2\nInterprovincial 2\nIntroduction 2\nInventories 2\nInverness 2\nInvestigations 2\nIra 2\nIrfan 2\nIrises 2\nIrony 2\nIs'haqi 2\nIsetan 2\nIshiguro 2\nIslander 2\nIslanders 2\nIsmaili 2\nIsraelites 2\nIstituto 2\nIsuzu 2\nIttleson 2\nIvern 2\nIyad 2\nJ&L 2\nJ.doc 2\nJAPANESE 2\nJARIR 2\nJCP 2\nJM 2\nJMB 2\nJOB 2\nJPL 2\nJPMorgan 2\nJPY 2\nJUDGE 2\nJUMPING 2\nJURY 2\nJW's 2\nJabrel 2\nJachmann 2\nJacko 2\nJacobsen 2\nJacqueline 2\nJadida 2\nJaffe 2\nJahn 2\nJameh 2\nJammu 2\nJanachowski 2\nJaneiro 2\nJanell 2\nJanesville 2\nJarvis 2\nJasim 2\nJawa 2\nJawf 2\nJekyll 2\nJell 2\nJenco 2\nJenks 2\nJesperson 2\nJewel 2\nJiaka 2\nJianchao 2\nJianmin 2\nJianming 2\nJiansong 2\nJianyang 2\nJiao 2\nJinghua 2\nJingquan 2\nJingwei 2\nJinhu 2\nJinjiang 2\nJinrong 2\nJoachim 2\nJoann 2\nJobson 2\nJointly 2\nJollow 2\nJonetic 2\nJongno 2\nJonson 2\nJos. 
2\nJosie 2\nJourney 2\nJudd 2\nJudgement 2\nJudson 2\nJules 2\nJulia 2\nJuliet 2\nJumblatt 2\nJunius 2\nJunkins 2\nJunmin 2\nJuragua 2\nJurvetson 2\nKB 2\nKCRA 2\nKEVALAM 2\nKFC 2\nKM 2\nKOFY 2\nKSFO 2\nKadane 2\nKaddurah 2\nKadhim 2\nKafka 2\nKah 2\nKakuei 2\nKalamazoo 2\nKale 2\nKalmus 2\nKamal 2\nKamm 2\nKanharith 2\nKarachi 2\nKarrada 2\nKarstadt 2\nKasler 2\nKato 2\nKe 2\nKealty 2\nKee 2\nKegie 2\nKegler 2\nKei 2\nKeliang 2\nKellwood 2\nKemal 2\nKenji 2\nKenyon 2\nKerlone 2\nKessler 2\nKetagalan 2\nKevalam 2\nKeynes 2\nKhaddafi 2\nKharis 2\nKhayr 2\nKheng 2\nKiep 2\nKiller 2\nKilling 2\nKilpatrick 2\nKingdoms 2\nKingel 2\nKingsway 2\nKinkerl 2\nKinney 2\nKippur 2\nKirgizia 2\nKirkuk 2\nKirschner 2\nKitts 2\nKlan 2\nKlaus 2\nKleinaitis 2\nKleinman 2\nKlinghoffer 2\nKluge 2\nKlux 2\nKnowledgeWare 2\nKnowledgeable 2\nKnown 2\nKnoxville 2\nKobayashi 2\nKochan 2\nKodokan 2\nKoerner 2\nKolesnikov 2\nKoppel 2\nKoresh 2\nKoura 2\nKowling 2\nKrampe 2\nKress 2\nKristobal 2\nKristol 2\nKrutchensky 2\nKrysalis 2\nKryuchkov 2\nKuangdi 2\nKudos 2\nKueck 2\nKuei 2\nKummerfeld 2\nKungpao 2\nKunshan 2\nKurdistan 2\nKurnit 2\nKushkin 2\nKushnick 2\nKuwaitis 2\nKwai 2\nKwek 2\nKyodo 2\nKyrgyzstani 2\nL.L.C 2\nLAND 2\nLAURA 2\nLAWYERS 2\nLDC 2\nLEDs 2\nLIMITED 2\nLIVESTOCK 2\nLJ 2\nLJN 2\nLLerena 2\nLOCUST 2\nLOGIN 2\nLONG 2\nLOW 2\nLP 2\nLTCB 2\nL_Age 2\nLaFalce 2\nLaLonde 2\nLaMore 2\nLaMothe 2\nLaSalle 2\nLabrador 2\nLabs 2\nLabuleng 2\nLackey 2\nLaidlaw 2\nLaizi 2\nLala 2\nLamasery 2\nLancet 2\nLandesbank 2\nLandforms 2\nLaney 2\nLanier 2\nLargely 2\nLargo 2\nLarson 2\nLasker 2\nLastly 2\nLaszlo 2\nLaugh 2\nLavery 2\nLavorato 2\nLavoro 2\nLawrenceville 2\nLawsuits 2\nLeBaron 2\nLeGere 2\nLeadership 2\nLeagues 2\nLease 2\nLeblang 2\nLecheria 2\nLees 2\nLefevre 2\nLegacy 2\nLegislator 2\nLegislators 2\nLenders 2\nLeningrad 2\nLenny 2\nLens 2\nLeonid 2\nLeopold 2\nLerner 2\nLessons 2\nLeveraged 2\nLevitt 2\nLevni 2\nLhasa 2\nLianyugang 2\nLianyungang 2\nLiau 2\nLiberals 2\nLibra 
2\nLiddle 2\nLieb 2\nLifelong 2\nLights 2\nLillikas 2\nLinh 2\nLinks 2\nLinne 2\nLinsert 2\nLinus 2\nLinyi 2\nLiquidity 2\nLirong 2\nListening 2\nListing 2\nLitchfield 2\nLiterally 2\nLittleboy 2\nLitton 2\nLitvinchuk 2\nLiuzhou 2\nLive365 2\nLivestock 2\nLixian 2\nLoad 2\nLoans 2\nLockerby 2\nLoco 2\nLoden 2\nLoewi 2\nLoews 2\nLog 2\nLogistics 2\nLombard 2\nLomotil 2\nLonely 2\nLonesome 2\nLongbin 2\nLongchen 2\nLonger 2\nLongman 2\nLongmont 2\nLongo 2\nLooked 2\nLoom 2\nLoran 2\nLords 2\nLorimar 2\nLose 2\nLot 2\nLotte 2\nLouie 2\nLourie 2\nLovers 2\nLowrey 2\nLubar 2\nLuber 2\nLublin 2\nLugovoi 2\nLukang 2\nLuke 2\nLuluah 2\nLungshan 2\nLungtan 2\nLupel 2\nLurie 2\nLuthringshausen 2\nLutz 2\nLynda 2\nLyneses 2\nLynford 2\nLynne 2\nLyon 2\nLyrics 2\nM.A. 2\nM/H 2\nMA 2\nMACY 2\nMANAGER 2\nMB 2\nMCOA 2\nMEAN 2\nMEATS 2\nMIG 2\nMILNET 2\nMIPs 2\nMM 2\nMMS 2\nMND 2\nMO 2\nMOE 2\nMOEA 2\nMOVE 2\nMPI 2\nMRI 2\nMWh 2\nMabon 2\nMacArthur 2\nMace 2\nMach 2\nMackay 2\nMacon 2\nMacroChem 2\nMadagascar 2\nMadden 2\nMadsen 2\nMaffett 2\nMaggot 2\nMaghaweer 2\nMagnascreen 2\nMahal 2\nMahatma 2\nMahe 2\nMahler 2\nMahran 2\nMai 2\nMails 2\nMaines 2\nMainstream 2\nMaintain 2\nMair 2\nMaitreya 2\nMajesty 2\nMakkai 2\nMales 2\nMali 2\nMalizia 2\nMallinckrodt 2\nMalmqvist 2\nMalone 2\nMalta 2\nMandle 2\nManiat 2\nManifestation 2\nManifesto 2\nManion 2\nMankiewicz 2\nMankind 2\nManley 2\nManne 2\nMannheim 2\nMansfield 2\nManzanec 2\nMaoists 2\nMaple 2\nMapping 2\nMaps 2\nMarCor 2\nMarcelo 2\nMarcoses 2\nMare 2\nMarek 2\nMaritime 2\nMarkab 2\nMarkus 2\nMarlene 2\nMarmalstein 2\nMaronites 2\nMarsam 2\nMarston 2\nMarty 2\nMartyrs 2\nMarxism 2\nMaryam 2\nMasa 2\nMasahiro 2\nMasaki 2\nMasket 2\nMassa 2\nMassive 2\nMassoud 2\nMasterson 2\nMatanky 2\nMatar 2\nMatch 2\nMatchett 2\nMatisse 2\nMatsuda 2\nMatterhorn 2\nMattress 2\nMaui 2\nMaurer 2\nMauritania 2\nMauritanian 2\nMawangdui 2\nMaxima 2\nMaximum 2\nMayon 2\nMazowiecki 2\nMazzone 2\nMcAlary 2\nMcAllen 2\nMcCabe 2\nMcCain 2\nMcCann 
2\nMcCartney 2\nMcChesney 2\nMcClary 2\nMcClauclin 2\nMcCullough 2\nMcDermott 2\nMcElroy 2\nMcEnaney 2\nMcGilloway 2\nMcGlade 2\nMcGowan 2\nMcInnes 2\nMcKay 2\nMcKenna 2\nMcKinley 2\nMcKinnon 2\nMcLean 2\nMcMoRan 2\nMcNair 2\nMcNamara 2\nMcNealy 2\nMcNeil 2\nMcVeigh 2\nMcdermott 2\nMeal 2\nMeats 2\nMechanics 2\nMedian 2\nMediobanca 2\nMedstone 2\nMegargel 2\nMeili 2\nMeira 2\nMekong 2\nMelanie 2\nMeltzer 2\nMembership 2\nMenace 2\nMencken 2\nMendes 2\nMenell 2\nMens 2\nMenuhin 2\nMerciful 2\nMerge 2\nMerger 2\nMeritor 2\nMerlin 2\nMerrist 2\nMerritt 2\nMerry 2\nMervin 2\nMervyn 2\nMesozoic 2\nMessina 2\nMessinger 2\nMeteorological 2\nMethodists 2\nMetn 2\nMexicana 2\nMexicanos 2\nMeyers 2\nMiao 2\nMice 2\nMichelin 2\nMicronic 2\nMidway 2\nMignanelli 2\nMigrant 2\nMilano 2\nMile 2\nMilgrim 2\nMilisanidis 2\nMilitias 2\nMilligan 2\nMind 2\nMinden 2\nMindy 2\nMineworkers 2\nMingbin 2\nMingchuan 2\nMinh 2\nMinhang 2\nMini 2\nMinimum 2\nMinistries 2\nMinjiang 2\nMinorities 2\nMintier 2\nMinutes 2\nMinzhang 2\nMirosoviki 2\nMirza 2\nMiss. 2\nMist 2\nMitsuoka 2\nMitsuru 2\nMixed 2\nMobilization 2\nMogul 2\nMohamad 2\nMojave 2\nMolina 2\nMollura 2\nMolokai 2\nMolotov 2\nMoments 2\nMon. 
2\nMondale 2\nMonday's 2\nMonets 2\nMonetta 2\nMongol 2\nMonitoring 2\nMonopoly 2\nMonterrey 2\nMonticenos 2\nMontvale 2\nMoonie 2\nMoonies 2\nMoral 2\nMorales 2\nMoralis 2\nMori 2\nMormons 2\nMorna 2\nMorrow 2\nMortimer 2\nMosaic 2\nMoscom 2\nMoshe 2\nMosque 2\nMosques 2\nMostly 2\nMotion 2\nMotley 2\nMotorcycles 2\nMouton 2\nMovies 2\nMozambique 2\nMu'tasim 2\nMuammar 2\nMullahs 2\nMuller 2\nMuramatsu 2\nMurasawa 2\nMusicians 2\nMust 2\nMustang 2\nMutouasan 2\nMuzak 2\nMyself 2\nMystery 2\nMyth 2\nMyung 2\nN.D 2\nN.M 2\nNAACP 2\nNCKU 2\nNESB 2\nNHK 2\nNHTSA 2\nNL 2\nNMANNE@SusmanGodfrey.com 2\nNONE 2\nNORTHERN 2\nNSA 2\nNTNU 2\nNYC 2\nNabih 2\nNabulas 2\nNagy 2\nNail 2\nNails 2\nNajran 2\nNamib 2\nNamsan 2\nNan'ao 2\nNanyang 2\nNaomi 2\nNassim 2\nNatiq 2\nNausea 2\nNazareth 2\nNeanderthal 2\nNeed 2\nNegas 2\nNegative 2\nNegotiations 2\nNegotiator 2\nNegotiators 2\nNegro 2\nNeiman 2\nNejad 2\nNekos 2\nNellcor 2\nNened 2\nNeoTime 2\nNeocons 2\nNervous 2\nNesan 2\nNesbitt 2\nNetExpert 2\nNevis 2\nNewcomb 2\nNewhouse 2\nNewsnight 2\nNewsom 2\nNewsweek 2\nNgan 2\nNicastro 2\nNickel 2\nNicklaus 2\nNicky 2\nNicole 2\nNightline 2\nNights 2\nNikolai 2\nNiles 2\nNishiki 2\nNoSprawlTax 2\nNob 2\nNobora 2\nNobuyuki 2\nNolan 2\nNong 2\nNonperforming 2\nNonsense 2\nNordau 2\nNorodom 2\nNorske 2\nNorstar 2\nNortek 2\nNorthgate 2\nNorthwestern 2\nNorwitz 2\nNotional 2\nNottingham 2\nNovato 2\nNugent 2\nNumbers 2\nNumerical 2\nNunan 2\nNutt 2\nNutting 2\nNuys 2\nNybo 2\nO&Y 2\nO'Dwyer 2\nO'Neal 2\nO'Shea 2\nO'Sullivan 2\nOB:OtrPplQuoteWad 2\nOCN 2\nOMB 2\nOMG 2\nOS 2\nOTS 2\nOUSTED 2\nOasis 2\nOat 2\nOberstar 2\nOceania 2\nOdyssey 2\nOffer 2\nOffered 2\nOffering 2\nOfficially 2\nOhlman 2\nOldenburg 2\nOldsmobile 2\nOleg 2\nOlshan 2\nOmei 2\nOmron 2\nOng 2\nOpenNet 2\nOpening 2\nOperators 2\nOptimization 2\nOran 2\nOrchard 2\nOrganic 2\nOrganisation 2\nOrganized 2\nOrinoco 2\nOrmstedt 2\nOrrin 2\nOrson 2\nOrwellian 2\nOsamu 2\nOshkosh 2\nOstpolitik 2\nOstrager 2\nOswald 
2\nOurselves 2\nOusley 2\nOutback 2\nOutflows 2\nOutlook 2\nOutokumpu 2\nOutplacement 2\nOvalle 2\nOvernight 2\nOverpriced 2\nOversight 2\nOvonic 2\nOwens 2\nOwings 2\nOwner 2\nOwning 2\nOxley 2\nOz-fest 2\nOzal 2\nOzarks 2\nOzzie 2\nP.J. 2\nP.S. 2\nPADDIIstocksCL.pdf 2\nPADDIstocksHO.pdf 2\nPADDIstocksHU.pdf 2\nPAL 2\nPAPERS 2\nPARTNERS 2\nPASSWORD 2\nPAY 2\nPCBs 2\nPDAs 2\nPECO 2\nPERSON 2\nPG&E 2\nPHILLY 2\nPLAN 2\nPLUS 2\nPO 2\nPOS 2\nPPG 2\nPPL 2\nPRESIDENT 2\nPROCEED 2\nPROPERTIES 2\nPS3 2\nPVC 2\nPachachi 2\nPacheco 2\nPachinko 2\nPaction 2\nPadilla 2\nPadovan 2\nPae 2\nPaes 2\nPaev 2\nPages 2\nPagong 2\nPainter 2\nPajoli 2\nPalicka 2\nPalifen 2\nPalma 2\nPalms 2\nPaltrow 2\nPaluck 2\nPam 2\nPamela 2\nPanchiao 2\nPanel 2\nPanet 2\nPang 2\nPaperboard 2\nPapers 2\nPapetti 2\nParacel 2\nParadise 2\nParadoxically 2\nParagould 2\nParametric 2\nParamus 2\nParanormal 2\nPari 2\nParking 2\nParkways 2\nPartnerships 2\nPascagoula 2\nPassaic 2\nPassengers 2\nPassion 2\nPassive 2\nPassport 2\nPassports 2\nPastrana 2\nPaswan 2\nPathe 2\nPatman 2\nPatriarch 2\nPatricof 2\nPatriots 2\nPatrol 2\nPatti 2\nPauen 2\nPawelski 2\nPawlowski 2\nPayco 2\nPaying 2\nPayments 2\nPayroll 2\nPayson 2\nPazeh 2\nPeak 2\nPedone 2\nPedroli 2\nPegaSys 2\nPeilice 2\nPeitou 2\nPelosi 2\nPenny 2\nPensacola 2\nPercentage 2\nPerches 2\nPercy 2\nPerella 2\nPerfect 2\nPergram 2\nPeriodically 2\nPerle 2\nPermanente 2\nPermit 2\nPerpetual 2\nPerrin 2\nPersians 2\nPertschuk 2\nPet 2\nPetSmart 2\nPetals 2\nPeterpaul 2\nPetrocorp 2\nPets 2\nPettee 2\nPetty 2\nPetzoldt 2\nPh. 
2\nPharmacy 2\nPhase 2\nPhilinte 2\nPhilo 2\nPhilology 2\nPhotography 2\nPhotonics 2\nPhy 2\nPickens 2\nPickering 2\nPictured 2\nPieces 2\nPiedmont 2\nPigalle 2\nPiggy 2\nPignatelli 2\nPildes 2\nPiluso 2\nPimlott 2\nPingpu 2\nPinick 2\nPinola 2\nPinyin 2\nPiramal 2\nPisces 2\nPitman 2\nPittsburg 2\nPity 2\nPlacement 2\nPlacido 2\nPlague 2\nPlanar 2\nPlane 2\nPlaskett 2\nPlastic 2\nPlate 2\nPlayback 2\nPlays 2\nPleasant 2\nPlenty 2\nPlett 2\nPlymouth 2\nPng 2\nPodgorica 2\nPokemon 2\nPolicemen 2\nPolk 2\nPolyconomics 2\nPolytechnic 2\nPompey 2\nPonchatoula 2\nPong 2\nPorsche 2\nPortugese 2\nPost-Newsweek 2\nPosters 2\nPostipankki 2\nPouchen 2\nPour 2\nPowerful 2\nPractices 2\nPrada 2\nPrayer 2\nPrego 2\nPrehistoric 2\nPrehistory 2\nPremiere 2\nPremieres 2\nPremium 2\nPremner 2\nPreparedness 2\nPrescott 2\nPresent 2\nPresovo 2\nPressed 2\nPretl 2\nPrevent 2\nPrincipal 2\nPrinted 2\nPrivately 2\nPrizes 2\nPrizm 2\nProbing 2\nProblem 2\nProblems 2\nProcedure 2\nProcessing 2\nProleukin 2\nPromise 2\nPromoting 2\nPrompt 2\nPromptly 2\nProp. 2\nProposal 2\nPros 2\nProspective 2\nProtective 2\nProtestors 2\nProverbs 2\nProvisions 2\nPty. 2\nPuccini 2\nPudding 2\nPulitzer 2\nPull 2\nPursuing 2\nPush 2\nPymm 2\nQ. 2\nQ45 2\nQQSpace 2\nQUANTUM 2\nQUESTIONS 2\nQada 2\nQahtani 2\nQais 2\nQassem 2\nQays 2\nQianDao 2\nQiaotou 2\nQingnan 2\nQinshan 2\nQisrin 2\nQuack 2\nQuadrant 2\nQuake 2\nQuarantine 2\nQueensland 2\nQuek 2\nQuelle 2\nQuennell 2\nQuest 2\nQuestions 2\nQuezon 2\nQuickly 2\nQuill 2\nQuilted 2\nQuincy 2\nQuran 2\nR.I 2\nR.R. 
2\nRAC 2\nRADIO 2\nRC6280 2\nRCI 2\nRDF 2\nREAD 2\nREDLINE 2\nROADHOUSE 2\nROBERT 2\nROI 2\nROMs 2\nRPGs 2\nRPM 2\nRTJ 2\nRULES 2\nRVs 2\nRabbits 2\nRacial 2\nRacing 2\nRack 2\nRafales 2\nRafidites 2\nRafiq 2\nRahim 2\nRainbows 2\nRainer 2\nRaines 2\nRainy 2\nRais 2\nRake 2\nRalston 2\nRamadi 2\nRamon 2\nRams 2\nRamtron 2\nRanaridh 2\nRancho 2\nRandall 2\nRandolph 2\nRangers 2\nRansom 2\nRantissi 2\nRaoul 2\nRarely 2\nRasini 2\nRating 2\nRatio 2\nRauschenberg 2\nRavine 2\nRavitch 2\nRawhide 2\nRayon 2\nRazi 2\nRazor 2\nReach 2\nReason 2\nReasoner 2\nReasons 2\nRebel 2\nRebellion 2\nRebuilding 2\nReceipts 2\nReceiving 2\nReceptech 2\nRecreation 2\nRectifier 2\nRedevelopment 2\nRedland 2\nRedstone 2\nReduction 2\nReedy 2\nReeker 2\nReference 2\nReformers 2\nRefuge 2\nReggie 2\nReginald 2\nRegistered 2\nRegistration 2\nRegulation 2\nReilly 2\nReinhold 2\nReinsurance 2\nRejoice 2\nRelationship 2\nRelocation 2\nRelying 2\nRemaining 2\nRemains 2\nRemaleg 2\nRemembrance 2\nRemove 2\nRemoves 2\nRendell 2\nRene 2\nRenewable 2\nRengav 2\nRenk 2\nRenoir 2\nRent 2\nRepeal 2\nRepresentation 2\nRepression 2\nReproduced 2\nReproducing 2\nRequests 2\nReserved 2\nReservoir 2\nResidents 2\nResist 2\nResolving 2\nRespect 2\nResponses 2\nResponsibility 2\nResponsible 2\nRest 2\nRestoration 2\nRetrovir 2\nReturning 2\nReturns 2\nReverse 2\nReviewing 2\nRevisited 2\nRevivalist 2\nRevlon 2\nRewards 2\nRexall 2\nRhoads 2\nRiche 2\nRickey 2\nRiese 2\nRifkin 2\nRifle 2\nRiga 2\nRiggs 2\nRikuso 2\nRilling 2\nRisley 2\nRisse 2\nRittenhouse 2\nRitz 2\nRival 2\nRobb 2\nRoda 2\nRodeo 2\nRodolfo 2\nRodrigo 2\nRolled 2\nRomance 2\nRomanick 2\nRongrong 2\nRoseanne 2\nRosemont 2\nRositas 2\nRothko 2\nRothman 2\nRound 2\nRounding 2\nRoussel 2\nRoyals 2\nRubble 2\nRudi 2\nRudwell 2\nRuggiero 2\nRumao 2\nRumsfeldian 2\nRumsfeldism 2\nRundle 2\nRuona 2\nRutgers 2\nRwandan 2\nRymer 2\nSAGE 2\nSAME 2\nSAVINGS 2\nSAY 2\nSCIRI 2\nSCUD 2\nSECTION 2\nSEE 2\nSEF 2\nSERVICE 2\nSERVICES 2\nSG 2\nSIA 2\nSMALL 
2\nSNET 2\nSPCA 2\nSTAR 2\nSTAY 2\nSTOCKS 2\nSUCH 2\nSUCKS 2\nSUNY 2\nSW 2\nSYDNEY 2\nSYSTEMS 2\nSabre 2\nSackler 2\nSaens 2\nSafari 2\nSafer 2\nSafeway 2\nSaif 2\nSaifi 2\nSaigon 2\nSail 2\nSain 2\nSakr 2\nSal 2\nSalafi 2\nSalahudin 2\nSalay 2\nSalespeople 2\nSalih 2\nSalvadoran 2\nSalvagni 2\nSamarra 2\nSami 2\nSampras 2\nSampson 2\nSanches 2\nSanderson 2\nSanskrit 2\nSapporo 2\nSartre 2\nSass 2\nSazuka 2\nScalfaro 2\nScali 2\nScalia 2\nScarlet 2\nScattered 2\nScene 2\nSchafer 2\nSchantz 2\nSchatz 2\nSchaumburg 2\nSchenley 2\nSchline 2\nSchramm 2\nSchroeder 2\nSchulz 2\nSchumacher 2\nSchuman 2\nSchwinn 2\nScientist 2\nScofield 2\nScopes 2\nScores 2\nScorpios 2\nScotch 2\nScout 2\nScripps 2\nScrooge 2\nSeaboard 2\nSeasonal 2\nSebastian 2\nSebek 2\nSecord 2\nSega 2\nSekisui 2\nSemion 2\nSentelle 2\nSeong 2\nSep 2\nSeparate 2\nSepka 2\nSerenade 2\nSerie 2\nServing 2\nSerwer 2\nSesame 2\nSettlers 2\nSeville 2\nSewart 2\nSfeir 2\nSha 2\nShadow 2\nShaevitz 2\nShaken 2\nShaker 2\nShakespearean 2\nShamgerdy 2\nShams 2\nShamy 2\nShaoqi 2\nShapovalov 2\nSharahil 2\nShareholder 2\nSharfman 2\nSharing 2\nSharq 2\nShawel 2\nShawna 2\nShee 2\nSheinberg 2\nShek 2\nShelby 2\nShellbo 2\nShenyang 2\nShepperd 2\nSheridan 2\nSheryl 2\nShicoff 2\nShihsanhang 2\nShijiazhuang 2\nShimizu 2\nShing 2\nShining 2\nShipley 2\nShipments 2\nShipping 2\nShirley 2\nShiseido 2\nShlhoub 2\nShoney 2\nShoo 2\nShooting 2\nShopping.com 2\nShostakovich 2\nShrii 2\nShuguang 2\nShuishalien 2\nShun 2\nShuojing 2\nShuqin 2\nShuttle 2\nShuzu 2\nShyi 2\nSiad 2\nSibley 2\nSibra 2\nSicilian 2\nSid 2\nSiddeley 2\nSider 2\nSiegfried 2\nSiemienas 2\nSifa 2\nSigmund 2\nSigned 2\nSignore 2\nSikhs 2\nSilence 2\nSilva 2\nSilvers 2\nSimple 2\nSimulation 2\nSina 2\nSing 2\nSinn 2\nSinocolor 2\nSisters 2\nSites 2\nSixty 2\nSkiing 2\nSkills 2\nSleep 2\nSlevonovich 2\nSlightly 2\nSlim 2\nSlotnick 2\nSlovenia 2\nSloves 2\nSlow 2\nSlowing 2\nSlowly 2\nSmale 2\nSmile 2\nSmiling 2\nSmokey 2\nSnakes 2\nSnedeker 2\nSniper 
2\nSnowman 2\nSoares 2\nSoaring 2\nSochaux 2\nSocieties 2\nSocks 2\nSoftLetter 2\nSol 2\nSold 2\nSole 2\nSolution 2\nSomalis 2\nSomebody 2\nSomeday 2\nSomerset 2\nSomoza 2\nSonic 2\nSonora 2\nSoochow 2\nSooraji 2\nSophomore 2\nSoreas 2\nSoule 2\nSouls 2\nSoutherners 2\nSovereignty 2\nSoybeans 2\nSpa 2\nSpadafora 2\nSpahr 2\nSparks 2\nSpears 2\nSpecialist 2\nSpecially 2\nSpecies 2\nSpecific 2\nSpecifications 2\nSpectator 2\nSpectrum 2\nSpend 2\nSphinx 2\nSpilianovich 2\nSpokane 2\nSpokesman 2\nSporadic 2\nSporting 2\nSpot 2\nSpotlight 2\nSprecher 2\nSpringtime 2\nSpruell 2\nSpumante 2\nSquads 2\nStahl 2\nStallone 2\nStaloff 2\nStals 2\nStansfield 2\nStapf 2\nStaples 2\nStatute 2\nSteele 2\nSteelmakers 2\nSteinbeck 2\nSteinkuehler 2\nSteinman 2\nStelco 2\nStella 2\nStelzer 2\nStena 2\nStepmother 2\nSteppel 2\nSteps 2\nSteroids 2\nStirling 2\nStokes 2\nStoltz 2\nStolzman 2\nStories 2\nStout 2\nStrasbourg 2\nStrasser 2\nStrategies 2\nStrathmann 2\nStreetspeak 2\nStreisand 2\nStriking 2\nStroh 2\nStronger 2\nStrongly 2\nStructural 2\nStrum 2\nStuttgart 2\nStygian 2\nSubsequent 2\nSues 2\nSuggestion 2\nSuhler 2\nSuhong 2\nSui 2\nSuicide 2\nSuizhong 2\nSukhoi 2\nSukle 2\nSulzberger 2\nSulzer 2\nSumner 2\nSun. 
2\nSunGard 2\nSunbird 2\nSundarji 2\nSunna 2\nSunny 2\nSunrise 2\nSuperbowl 2\nSupervisor 2\nSupplemental 2\nSurgeon 2\nSurprises 2\nSuspension 2\nSutherland 2\nSuzhou 2\nSventek 2\nSwank 2\nSwap 2\nSwasey 2\nSwavely 2\nSweating 2\nSweeney 2\nSwift 2\nSwire 2\nSwissair 2\nSylmar 2\nSystemwide 2\nTA 2\nTAX 2\nTB 2\nTDS 2\nTERM 2\nTFB 2\nTHEM 2\nTHINK 2\nTI 2\nTNN 2\nTOO 2\nTRADING 2\nTRAVEL 2\nTROs 2\nTRUE 2\nTSMC 2\nTTV 2\nTables 2\nTachia 2\nTactic 2\nTadeusz 2\nTagammu 2\nTaierzhuang 2\nTaikang 2\nTaimo 2\nTairan 2\nTaitung 2\nTakashi 2\nTakashimaya 2\nTakes 2\nTakken 2\nTakuro 2\nTalented 2\nTally 2\nTammy 2\nTamsui 2\nTanaka 2\nTanglewood 2\nTanner 2\nTaping 2\nTarim 2\nTarter 2\nTashi 2\nTashkent 2\nTaste 2\nTator 2\nTawhid 2\nTaxi 2\nTayab 2\nTaymani 2\nTeaching 2\nTechnion 2\nTechnological 2\nTeens 2\nTeeple 2\nTeeth 2\nTeflon 2\nTeich 2\nTela 2\nTelemedia 2\nTelepictures 2\nTelesystems 2\nTelzrow 2\nTender 2\nTenpay 2\nTensions 2\nTentative 2\nTeodoro 2\nTerminal 2\nTerracotta 2\nTesco 2\nTesta 2\nTestament 2\nTestifying 2\nThacher 2\nThelema 2\nTheoretically 2\nTheresa 2\nThevenot 2\nThing 2\nThrifts 2\nThriller 2\nThrombinar 2\nThunder 2\nThur. 
2\nThursdays 2\nTiant 2\nTicketron 2\nTicor 2\nTigers 2\nTigreans 2\nTijuana 2\nTikong 2\nTill 2\nTillich 2\nTilly 2\nTimber 2\nTimberlake 2\nTimorese 2\nTin 2\nTinku 2\nTiphook 2\nTipper 2\nTips 2\nTivoli 2\nTmobile 2\nTobacco 2\nTobias 2\nTodt 2\nToegyero 2\nTolls 2\nToman 2\nTomb 2\nToms 2\nTona 2\nTonawanda 2\nToney 2\nTonga 2\nTons 2\nTook 2\nTool 2\nTopper 2\nTort 2\nTory 2\nToshiki 2\nTouch 2\nTouching 2\nTowards 2\nTozzini 2\nTrabold 2\nTracers 2\nTracinda 2\nTracking 2\nTradename 2\nTradition 2\nTrails 2\nTraining 2\nTrans-Alaska 2\nTransTechnology 2\nTransactions 2\nTransamerica 2\nTransition 2\nTranslated 2\nTransporter 2\nTransvaal 2\nTraveling 2\nTreasurer 2\nTree 2\nTrent 2\nTrick 2\nTried 2\nTrip 2\nTrivelpiece 2\nTrivest 2\nTronicus 2\nTropical 2\nTrunkline 2\nTsao 2\nTsu 2\nTuesday's 2\nTunhua 2\nTunisi 2\nTunisian 2\nTuntex 2\nTuomioja 2\nTurbai 2\nTurk 2\nTurki 2\nTurning 2\nTwaron 2\nTwentieth 2\nTwin 2\nTwinkle 2\nTwins 2\nTyphus 2\nU$ 2\nU.S.S.R 2\nU.S.backed 2\nU.T. 
2\nUAE 2\nUAP 2\nUCAN 2\nUCAS 2\nUCLA 2\nUFOs 2\nULI 2\nUMW 2\nUNITED 2\nUPDATE 2\nURSULA 2\nUSE 2\nUSG 2\nUSPI 2\nUTH 2\nUhr 2\nUlistic 2\nUltraSonic 2\nUlysses 2\nUmar 2\nUmmmm 2\nUnderground 2\nUndertaker 2\nUnderwriting 2\nUniFirst 2\nUnicorp 2\nUnidentified 2\nUnificationist 2\nUnitel 2\nUnleadedStocks.pdf 2\nUnruh 2\nUnseen 2\nUphoff 2\nUrals 2\nUreg 2\nUrethane 2\nUrgent 2\nUruguayan 2\nUsamah 2\nUsers 2\nUsha 2\nUtilization 2\nUtopia 2\nUtrecht 2\nUtsunomiya 2\nVCDs 2\nVMS 2\nVP 2\nVS 2\nVacation 2\nVacations 2\nValarie 2\nValencia 2\nValet 2\nValued 2\nVanderbilt 2\nVang 2\nVangie 2\nVanities 2\nVanourek 2\nVarity 2\nVarnell 2\nVedrine 2\nVerbatim 2\nVere 2\nVeritrac 2\nVern 2\nVersailles 2\nVesoft 2\nVeteran 2\nVevey 2\nVia 2\nViatech 2\nVice-chairman 2\nVichy 2\nVictims 2\nVictoire 2\nVidunas 2\nViewers 2\nVirtue 2\nVirtues 2\nVisas 2\nVisits 2\nVisualization 2\nVisualize 2\nVocalist 2\nVolatility 2\nVolcker 2\nVoldemort 2\nVolk 2\nVoronezh 2\nVortex 2\nVosges 2\nVoyles 2\nVradonit 2\nVries 2\nVt. 2\nVyas 2\nW.I. 2\nW.R. 2\nWAIT 2\nWALKER 2\nWARWICK 2\nWAY 2\nWFP 2\nWHERE 2\nWHICH 2\nWITHOUT 2\nWORK 2\nWORLD 2\nWTS 2\nWTXF 2\nWWF 2\nWWI 2\nWachtell 2\nWad 2\nWada 2\nWage 2\nWahhab 2\nWailing 2\nWakefield 2\nWalMart 2\nWald 2\nWaldbaum 2\nWaldheim 2\nWaldorf 2\nWalesa 2\nWanhai 2\nWarfare 2\nWarman 2\nWarming 2\nWarring 2\nWarriors 2\nWashburn 2\nWatching 2\nWaterstones 2\nWau 2\nWaves 2\nWeapons 2\nWeasel 2\nWeaving 2\nWed. 
2\nWeekend 2\nWehmeyer 2\nWeichern 2\nWeifang 2\nWeighing 2\nWeight 2\nWeill 2\nWeinberg 2\nWeinberger 2\nWeird 2\nWeisberg 2\nWeisel 2\nWeiying 2\nWelcoming 2\nWelles 2\nWellesley 2\nWenceslas 2\nWenz 2\nWertheimer 2\nWestaway 2\nWestborough 2\nWestchester 2\nWestcoast 2\nWestendorf 2\nWestminister 2\nWestpac 2\nWestphalia 2\nWet 2\nWetherell 2\nWharf 2\nWhichever 2\nWhitcomb 2\nWhitehall 2\nWhitelock 2\nWhitley 2\nWhiz 2\nWhole 2\nWhore 2\nWidow 2\nWife 2\nWikipedia 2\nWilde 2\nWilke 2\nWilkinson 2\nWillamette 2\nWillard 2\nWilled 2\nWilliamson 2\nWillkie 2\nWillman 2\nWin32 2\nWinchester 2\nWinfred 2\nWinfrey 2\nWing 2\nWirthlin 2\nWise 2\nWitman 2\nWizard 2\nWolens 2\nWolff 2\nWonderland 2\nWonham 2\nWoody 2\nWooten 2\nWorcester 2\nWorms 2\nWorn 2\nWrath 2\nWrites 2\nWrote 2\nWu'er 2\nWubi 2\nWylie 2\nWynn 2\nWyo 2\nXian 2\nXiaobai 2\nXiaohui 2\nXiaosong 2\nXietu 2\nXiling 2\nXinliang 2\nXiong 2\nXisheng 2\nXrays 2\nXuding 2\nXuzhou 2\nYEARS 2\nYLE 2\nYa'an 2\nYanes 2\nYangcheng 2\nYangmingshan 2\nYanhee 2\nYantai 2\nYaohan 2\nYardeni 2\nYarmouk 2\nYassin 2\nYasuo 2\nYatim 2\nYearly 2\nYizhong 2\nYom 2\nYongkang 2\nYongtu 2\nYorkshire 2\nYoshio 2\nYoshiro 2\nYouTube 2\nYounus 2\nYouwei 2\nYuanshan 2\nYudhoyono 2\nYueh 2\nYuehua 2\nYuhong 2\nYuke 2\nYuming 2\nYuri 2\nYusen 2\nYushan 2\nYutaka 2\nYuzek 2\nYves 2\nZacks 2\nZafris 2\nZagreb 2\nZahn 2\nZaki 2\nZambia 2\nZamzam 2\nZane 2\nZapfel 2\nZarett 2\nZaves 2\nZayadi 2\nZealanders 2\nZebari 2\nZeffirelli 2\nZeidner 2\nZhengying 2\nZhifa 2\nZhiping 2\nZhiren 2\nZhishan 2\nZhongrong 2\nZhongshang 2\nZhongtang 2\nZhongxiang 2\nZhuang 2\nZhuangzu 2\nZidane 2\nZimbabwean 2\nZiyang 2\nZlahtina 2\nZoeller 2\nZombie 2\nZubairi 2\nZulu 2\nZurkuhlen 2\nZurn 2\n^,,^ 2\n^_^ 2\n______________ 2\n______________________ 2\n__________________________________________________ 2\n____________________________________________________________ 2\na&m 2\na.k.a. 
2\naback 2\nabate 2\nabdicate 2\nabdomen 2\nabetting 2\nablaze 2\nabnormalities 2\nabolishment 2\nabounds 2\nabsences 2\nabsolutism 2\nabsorbers 2\nabstaining 2\nabstinent 2\nabstraction 2\nabusilm@maktoob.com 2\nabusing 2\nabyss 2\naccede 2\naccelerates 2\naccented 2\naccessed 2\naccessing 2\nacclamation 2\naccolade 2\naccomodate 2\naccompli 2\naccredited 2\naccrual 2\naccrue 2\naccruing 2\nachievable 2\nacquaintances 2\nacquisitive 2\nacquit 2\nacquittal 2\nacrimonious 2\nacrimony 2\nactivation 2\nactresses 2\nadamantly 2\nadapter 2\naddict 2\nadequacy 2\nadherent 2\nadherents 2\nadipocytes 2\nadjudicators 2\nadjusts 2\nadmin 2\nadministrating 2\nadmirer 2\nadmonishing 2\nadn 2\nadolescent 2\nadoptive 2\nadore 2\nadorn 2\nadrenaline 2\nadroitly 2\nadvancements 2\nadvert 2\nadvertises 2\nadvisable 2\naesthetics 2\naffections 2\naffinities 2\naffinity 2\naffirm 2\naffirming 2\nafflicts 2\naficionados 2\nafield 2\nafire 2\naftereffects 2\nagaisnt 2\nagate 2\nager 2\nagers 2\naggravating 2\nagonizing 2\nagony 2\nahold 2\nahs 2\naimlessly 2\nairfare 2\nairfields 2\nairliner 2\nairliners 2\nairstrike 2\nairtime 2\nairworthiness 2\naisles 2\nala 2\nalas 2\nalchemical 2\nalcoholics 2\nalcoholism 2\nale 2\nalienation 2\nalignment 2\nallayed 2\nallergic 2\nallergy 2\nalligator 2\nallocator 2\nalloy 2\nalluring 2\nalmanac 2\nalmighty 2\naloof 2\nalot 2\nalteration 2\nalterations 2\naltering 2\nalternator 2\naltruism 2\namalgamation 2\namateurish 2\namazingly 2\nambivalence 2\namerican 2\namino 2\namnesty 2\namok 2\namounting 2\namp 2\namplifiers 2\namps 2\namputation 2\namuse 2\namusements 2\namyloid 2\nana_alaraby@hotmail.com 2\nana_free11@hotmail.com 2\nanalgesic 2\nanalog 2\nanalytic 2\nanalyzes 2\nanathema 2\nanchored 2\nancients 2\nangeles 2\nangelic 2\nangering 2\nangling 2\nangstrom 2\nanguished 2\nanise 2\nankle 2\nankles 2\nannex 2\nannihilation 2\nannotated 2\nannuitant 2\nannul 2\nannum 2\nant 2\nantenna 2\nanthropology 2\nanti-China 2\nanti-Japanese 
2\nanti-Milosevic 2\nanti-Semitism 2\nanti-anemia 2\nanti-cancer 2\nanti-communist 2\nanti-competitive 2\nanti-development 2\nanti-discrimination 2\nanti-dumping 2\nanti-missile 2\nanti-natal 2\nanti-ship 2\nanti-social 2\nanti-tax 2\nanti-white 2\nantics 2\nantisocial 2\nantiviral 2\nanyplace 2\napathetic 2\nape 2\naphrodisiac 2\naplomb 2\nappended 2\napplaud 2\nappoints 2\nappraise 2\nappreciable 2\nappreciates 2\napprenticeship 2\napprised 2\nappropriateness 2\nappropriating 2\napproximates 2\napps 2\nappy 2\naquittal 2\nar 2\narbitrager 2\narc 2\narcane 2\narchdiocese 2\narched 2\narcheologists 2\nardor 2\nargon 2\narguably 2\narguements 2\nargumentative 2\narithmetic 2\narma 2\narmadillos 2\narmory 2\narouses 2\narousing 2\narrayed 2\narrestees 2\narrow 2\narrv. 2\narsenic 2\narson 2\nartful 2\narticulate 2\nartifact 2\nascertain 2\nasian 2\naspirational 2\naspired 2\naspirin 2\nassailant 2\nassailants 2\nassassin 2\nassemblies 2\nassent 2\nassertive 2\nassiduously 2\nassigning 2\nassimilate 2\nassists 2\nassuage 2\nastonished 2\nastonishment 2\nastounded 2\nastrological 2\nastronomer 2\nastronomic 2\nastrophysicist 2\nastute 2\natheism 2\natrocious 2\natropine 2\nattainable 2\nattaining 2\nattendees 2\nattired 2\nattractively 2\nattuned 2\natty 2\nauctioneer 2\nauctioneers 2\naudacious 2\naudacity 2\nauditing 2\naugment 2\naugmented 2\naunts 2\naureus 2\nauspices 2\nauspiciousness 2\nauthorship 2\nautoimmune 2\nautomakers 2\navaricious 2\naverse 2\navuncular 2\naways 2\nawed 2\nawoke 2\naxiom 2\naxis 2\naxle 2\nazure 2\nbaccalaureate 2\nbackstage 2\nbackwater 2\nbadges 2\nbaffled 2\nbagel 2\nbagger 2\nbailing 2\nbalconies 2\nbales 2\nbalking 2\nballast 2\nballerina 2\nballooned 2\nballpark 2\nballparks 2\nballroom 2\nbanal 2\nbandages 2\nbanging 2\nbanish 2\nbanished 2\nbanknotes 2\nbankroll 2\nbanners 2\nbaptism 2\nbarbarically 2\nbarbecue 2\nbarber 2\nbarge 2\nbarley 2\nbarons 2\nbarreling 2\nbarricade 2\nbarricades 2\nbarroom 2\nbartender 2\nbaseless 
2\nbaseman 2\nbashfulness 2\nbasil 2\nbask 2\nbathe 2\nbathed 2\nbathrooms 2\nbaton 2\nbattalions 2\nbattering 2\nbeachfront 2\nbeacon 2\nbeaming 2\nbeasties 2\nbeater 2\nbeauties 2\nbeautiful/talented 2\nbeautify 2\nbecuse 2\nbedevil 2\nbedrock 2\nbeep 2\nbeeping 2\nbefall 2\nbefalls 2\nbefell 2\nbefriended 2\nbefuddled 2\nbegged 2\nbegs 2\nbeibei 2\nbelie 2\nbelittle 2\nbellow 2\nbellows 2\nbenighted 2\nbequest 2\nberries 2\nberserk 2\nbestseller 2\nbetas 2\nbetray 2\nbetraying 2\nbetrays 2\nbetterment 2\nbewildered 2\nbewildering 2\nbewilderment 2\nbicentennial 2\nbiennial 2\nbikini 2\nbillet 2\nbillionnaire 2\nbillowing 2\nbimonthly 2\nbinary 2\nbingladen 2\nbinoculars 2\nbiologist 2\nbiomass 2\nbioresearch 2\nbiosphere 2\nbitchin 2\nbitterest 2\nbittersweet 2\nblacked 2\nblackmailed 2\nblackouts 2\nblackworms 2\nblades 2\nblanco 2\nblanketed 2\nblasphemous 2\nblatantly 2\nbleachers 2\nblemishes 2\nblindfolded 2\nblini 2\nblinked 2\nblinking 2\nblip 2\nblips 2\nblitz 2\nblockaded 2\nblocs 2\nblogging 2\nblond 2\nblondes 2\nbloodbath 2\nbloodiest 2\nbloom 2\nblossoming 2\nblotting 2\nblues 2\nbluffing 2\nblundered 2\nblunders 2\nblurry 2\nboarders 2\nboardroom 2\nboasting 2\nboatload 2\nbobmart 2\nboilers 2\nbolted 2\nbona 2\nbonanza 2\nbonding 2\nbookers 2\nbookkeeper 2\nbookkeeping 2\nboomed 2\nbordered 2\nborderless 2\nboron 2\nborrows 2\nbottlers 2\nbottomed 2\nbottomless 2\nboulder 2\nboutiques 2\nbovine 2\nbowel 2\nbowels 2\nboxer 2\nboxing 2\nboyfriends 2\nboyish 2\nbragging 2\nbrainchild 2\nbrainer 2\nbrainpower 2\nbrainwash 2\nbraised 2\nbranched 2\nbranching 2\nbraving 2\nbreakage 2\nbreakdowns 2\nbreathed 2\nbreech 2\nbreeders 2\nbreezy 2\nbrewed 2\nbrewers 2\nbribing 2\nbrides 2\nbridging 2\nbriefs 2\nbrigadier 2\nbrightened 2\nbrightly 2\nbrim 2\nbrimming 2\nbriskly 2\nbroadcaster 2\nbroadleaved 2\nbrokerages 2\nbrokered 2\nbruised 2\nbruising 2\nbrunch 2\nbrushing 2\nbrute 2\nbuabua 2\nbucked 2\nbuckets 2\nbucking 2\nbudged 2\nbudgeting 2\nbuds 
2\nbuffalo 2\nbuffing 2\nbulkheads 2\nbulky 2\nbullhorns 2\nbulwark 2\nbumble 2\nbundled 2\nbundles 2\nbunk 2\nbunkers 2\nbuoy 2\nbuoying 2\nburglaries 2\nburrow 2\nbushy 2\nbusts 2\nbutting 2\nbuttoned 2\nbuttressed 2\nbuyback 2\nbuyouts 2\nbuzzwords 2\nbygone 2\nbystanders 2\nbytes 2\ncabal 2\ncaches 2\ncactus 2\ncaffeinated 2\ncalamitous 2\ncalipers 2\ncaliph 2\ncaliphate 2\ncalorie 2\ncameramen 2\ncanceling 2\ncancels 2\ncandlelight 2\ncane 2\ncanister 2\ncanisters 2\ncannons 2\ncanteen 2\ncanvassing 2\ncanyons 2\ncapacitors 2\ncapitalizes 2\ncapping 2\ncaptivated 2\ncaptivity 2\ncaptors 2\ncapturing 2\ncarat 2\ncarcasses 2\ncarcinogenic 2\ncarcinoma 2\ncareening 2\ncaretaker 2\ncaricatures 2\ncarnal 2\ncarnivore 2\ncartels 2\ncarton 2\ncartridges 2\ncashback 2\ncashier 2\ncasings 2\ncastigated 2\ncastling 2\ncastor 2\ncatalase 2\ncatalytic 2\ncatcher 2\ncatches 2\ncatchy 2\ncategorization 2\ncategorize 2\ncategorized 2\ncatered 2\ncaters 2\ncatfish 2\ncatheterization 2\ncauliflower 2\ncaveats 2\ncaved 2\ncavernous 2\ncaves 2\ncaving 2\nceaselessly 2\ncelebrates 2\ncellulose 2\ncensor 2\ncentenarians 2\ncenterfielder 2\ncentrally 2\ncentrifugal 2\nceremonial 2\ncertifications 2\ncertifying 2\ncervical 2\nchad 2\nchafed 2\nchandelier 2\nchant 2\nchapel 2\nchaps 2\ncharismatic 2\ncharmed 2\nchasers 2\nchaste 2\nchastises 2\nchateau 2\nchats 2\nchatty 2\nchem 2\ncherishes 2\ncherishing 2\ncherries 2\nchewed 2\nchia 2\nchiang 2\nchided 2\nchides 2\nchihuahua 2\nchildbirth 2\nchildrens 2\nchinatown 2\nchipset 2\nchipsets 2\nchisel 2\nchitin 2\nchiu 2\nchlorination 2\nchnages 2\nchoked 2\nchoppers 2\nchopstick 2\nchord 2\nchristened 2\nchromatograph 2\nchronicle 2\nchuckles 2\ncichlids 2\ncigar 2\ncinch 2\ncircling 2\ncircuitry 2\ncirculars 2\ncircumstance 2\ncitations 2\ncitrus 2\ncitywide 2\nclanging 2\nclans 2\nclapping 2\nclarifications 2\nclarinet 2\nclassifications 2\nclaw 2\nclawed 2\ncleanse 2\nclearances 2\ncleavages 2\ncleveland 2\nclicks 2\nclimatic 
2\nclinch 2\nclinging 2\nclipboard 2\ncloned 2\nclones 2\nclot 2\nclownloach 2\nclumps 2\nclunky 2\nclutch 2\ncnn.com/wolf 2\ncnnfn 2\nco 2\nco-authored 2\nco-exist 2\nco-existence 2\nco-head 2\nco-operate 2\nco-operated 2\nco-owner 2\nco-pilot 2\nco-worker 2\nco-workers 2\nco. 2\ncoaching 2\ncoarse 2\ncoattails 2\ncoax 2\ncobra 2\ncockatoos 2\ncocked 2\ncocker 2\ncocktails 2\ncoconut 2\ncod 2\ncoddled 2\ncofferdam 2\ncognitively 2\ncohesion 2\ncohesive 2\ncohort 2\ncohorts 2\ncoincided 2\ncoldest 2\ncolds 2\ncollagen 2\ncollars 2\ncollectives 2\ncollectivism 2\ncollegiate 2\ncolloquium 2\ncolluded 2\ncollusion 2\ncolorado 2\ncolossal 2\ncolossus 2\ncolt 2\ncomb 2\ncombed 2\ncombo 2\ncomedic 2\ncomforted 2\ncomfy 2\ncommandant 2\ncommando 2\ncommensurate 2\ncommentaries 2\ncommons 2\ncommonwealth 2\ncommunal 2\ncompacted 2\ncompatibility 2\ncompetency 2\ncompilation 2\ncompiler 2\ncompiling 2\ncomplacency 2\ncompliant 2\ncomplicates 2\ncomplication 2\ncomplicit 2\ncomplimentary 2\ncomplimented 2\ncomplying 2\ncomposites 2\ncompressed 2\ncompromising 2\ncompulsions 2\ncompulsive 2\ncompute 2\nconceivably 2\nconcise 2\nconclusive 2\nconclusively 2\nconcomitant 2\nconcurred 2\nconcurrence 2\ncondemnations 2\ncondescending 2\ncondescension 2\ncondominiums 2\ncondos 2\nconductor 2\nconduit 2\nconelets 2\nconfederation 2\nconferenced 2\nconferencing 2\nconferred 2\nconferring 2\nconfers 2\nconfess 2\nconfessing 2\nconfidant 2\nconfigurations 2\nconfigured 2\nconfinement 2\nconfines 2\nconfiscate 2\nconfiscating 2\nconflicted 2\nconfronts 2\ncongenial 2\ncongratulations 2\ncongresses 2\ncongressionally 2\nconnoisseur 2\nconquest 2\nconscia 2\nconsciences 2\nconsenting 2\nconservationists 2\nconservatively 2\nconsiderate 2\nconsignees 2\nconsignors 2\nconsolations 2\nconsoles 2\nconspire 2\nconst 2\nconstrue 2\nconsumable 2\ncontemporaries 2\ncontemptuous 2\ncontenders 2\ncontented 2\ncontingencies 2\ncontingents 2\ncontradicting 2\ncontrasted 2\ncontravened 2\nconundrum 
2\nconvenes 2\nconveniently 2\nconverge 2\nconverged 2\nconvergence 2\nconversions 2\nconvicts 2\nconvincingly 2\nconvulsions 2\ncopiers 2\ncopious 2\ncops 2\ncordon 2\ncornea 2\ncorneal 2\ncornfield 2\ncornstarch 2\ncorporal 2\ncorrectness 2\ncorresponded 2\ncorrode 2\ncosmonaut 2\ncosmopolitan 2\ncosy 2\ncoterie 2\ncoughed 2\ncouncillors 2\ncouncilman 2\ncouncils 2\ncountdown 2\ncounteract 2\ncounterclaims 2\ncounterfeiting 2\ncountering 2\ncounterintelligence 2\ncounterpoint 2\ncountertop 2\ncountervailing 2\ncounterweight 2\ncourageously 2\ncourtesan 2\ncourtrooms 2\ncourtship 2\ncoven 2\ncoverages 2\ncovetously 2\ncovetousness 2\ncowardice 2\ncowboys 2\ncower 2\ncoworkers 2\ncrab 2\ncrackle 2\ncrafter 2\ncrafting 2\ncraftsmen 2\ncramped 2\ncramps 2\ncraning 2\ncranking 2\ncrass 2\ncrater 2\ncrates 2\ncravings 2\ncrayon 2\ncreamer 2\ncreations 2\ncrediting 2\ncreditworthy 2\ncreeps 2\ncreepy 2\ncriminalize 2\ncrimping 2\ncripple 2\ncriss 2\ncroaker 2\ncrocodile 2\ncrooned 2\ncroons 2\ncross-Strait 2\ncross-blending 2\ncross-connect 2\ncross-country 2\ncross-generational 2\ncross-ownership 2\ncross-section 2\ncrossings 2\ncrotchety 2\ncrouched 2\ncrow 2\ncrowning 2\ncrucially 2\ncrucible 2\ncrudely 2\ncruelty 2\ncruising 2\ncrunchier 2\ncrunching 2\ncryptic 2\ncube 2\ncucumber 2\nculminating 2\ncults 2\ncumin 2\ncunning 2\ncurators 2\ncures 2\ncuriosity 2\ncurl 2\ncurricula 2\ncustard 2\ncustomarily 2\ncyber-literature 2\ncycads 2\ncyclone 2\ncynic 2\ncynically 2\nczars 2\nd' 2\nd'etat 2\ndamned 2\ndampen 2\ndampened 2\ndandy 2\ndangled 2\ndangles 2\ndarkened 2\ndarkly 2\ndart 2\ndasanicool 2\ndawned 2\ndazzled 2\ndebasement 2\ndebian 2\ndebit 2\ndebtholders 2\ndebtor 2\ndebtors 2\ndebug 2\ndebunk 2\ndecadence 2\ndecadent 2\ndeceitful 2\ndecentralization 2\ndeceptively 2\ndecimated 2\ndeclassified 2\ndecommissioned 2\ndecommissioning 2\ndecorate 2\ndecorations 2\ndecorum 2\ndecrees 2\ndecries 2\ndedicating 2\ndeducted 2\ndeems 2\ndefamed 2\ndefaulting 
2\ndefecting 2\ndefection 2\ndefenceless 2\ndefenseless 2\ndefensible 2\ndefensively 2\ndeference 2\ndeflation 2\ndeflect 2\ndefuse 2\ndefy 2\ndefying 2\ndegeneration 2\ndegu 2\ndeja 2\ndelectable 2\ndelegating 2\ndeleting 2\ndeletion 2\ndeliberated 2\ndeliberating 2\ndeliberation 2\ndelicacies 2\ndelicacy 2\ndelicately 2\ndelights 2\ndelisting 2\ndelusion 2\ndelusional 2\ndelusions 2\ndemagogic 2\ndemeaning 2\ndemolish 2\ndemolishing 2\ndemonic 2\ndemotion 2\nden 2\ndenials 2\ndenominational 2\ndent 2\ndents 2\ndenunciation 2\ndependant 2\ndeplores 2\ndeployments 2\ndepot 2\ndepreciate 2\ndepressant 2\ndeprivation 2\ndept 2\nderailing 2\nderided 2\nderiving 2\nderring 2\ndesalination 2\ndeserted 2\ndesertion 2\ndesist 2\ndesolation 2\ndespondency 2\ndespots 2\ndestabilization 2\ndestitute 2\ndetach 2\ndetain 2\ndetectable 2\ndetectives 2\ndetergents 2\ndeterministic 2\ndeterred 2\ndeterrents 2\ndetonated 2\ndetour 2\ndetriment 2\ndeuterium 2\ndevelopmentally 2\ndevious 2\ndevotee 2\ndevoting 2\ndevour 2\ndevoured 2\ndevout 2\ndharmad...@gmail.com 2\ndiagnosing 2\ndiagram 2\ndialog 2\ndiametrically 2\ndiaper 2\ndice 2\ndictates 2\ndictating 2\ndictum 2\ndietary 2\ndiffered 2\ndiffusion 2\ndimes 2\ndiminution 2\ndimpled 2\ndine 2\ndiocese 2\ndioxins 2\ndirtier 2\ndisadvantages 2\ndisagreeing 2\ndisagrees 2\ndisallowed 2\ndisappearing 2\ndisavowed 2\ndisbelievers 2\ndisburse 2\ndisbursed 2\ndisbursement 2\ndisbursements 2\ndiscarding 2\ndisciples 2\ndiscloses 2\ndiscolored 2\ndisconnect 2\ndiscontinue 2\ndiscontinuing 2\ndiscotheque 2\ndiscouragement 2\ndiscourages 2\ndiscriminated 2\ndisdained 2\ndiseased 2\ndisenfranchised 2\ndisguise 2\ndisguises 2\ndishwasher 2\ndishwashers 2\ndisinclined 2\ndisinfectant 2\ndisinherits 2\ndisinterested 2\ndislocation 2\ndislocations 2\ndismisses 2\ndisparaging 2\ndisparities 2\ndispense 2\ndispersed 2\ndispersing 2\ndispleases 2\ndispleasure 2\ndisposals 2\ndisputing 2\ndisqualification 2\ndisrepair 2\ndisreputable 2\ndisrupting 
2\ndissected 2\ndisseminate 2\ndisseminated 2\ndissented 2\ndisservice 2\ndissimilar 2\ndissipated 2\ndissuade 2\ndistention 2\ndistiller 2\ndistillers 2\ndistilling 2\ndistinctively 2\ndistortions 2\ndistorts 2\ndistributer 2\ndistrust 2\ndivvying 2\ndizzy 2\ndoc 2\ndocudrama 2\ndocumentations 2\ndoddering 2\ndogmatic 2\ndolce 2\ndole 2\ndoling 2\ndominating 2\ndomineering 2\ndonates 2\ndonkey 2\ndonkeys 2\ndonnybrook 2\ndoom 2\ndooming 2\ndoomsayers 2\ndoorsteps 2\ndosage 2\ndotcom 2\ndots 2\ndotted 2\ndoubtless 2\ndove 2\ndowned 2\ndowngrades 2\ndownloaded 2\ndownplaying 2\ndownsize 2\ndowntime 2\ndowntrend 2\ndowntrodden 2\ndownturns 2\ndoyen 2\ndrab 2\ndramatizations 2\ndrapes 2\ndread 2\ndreamlike 2\ndredging 2\ndregs 2\ndrenched 2\ndressmaking 2\ndrillers 2\ndrinker 2\ndrinkers 2\ndrivel 2\ndroplets 2\ndropout 2\ndropouts 2\ndropper 2\ndrowning 2\ndrummers 2\ndrumming 2\ndrunkenness 2\ndrving 2\ndrying 2\ndubiously 2\nducking 2\ndudes 2\nduke 2\ndulled 2\ndumbfounded 2\ndumplings 2\ndumps 2\ndung 2\nduplex 2\nduplicated 2\nduplicity 2\ndurability 2\ndurables 2\ndustbin 2\ndusting 2\ndusts 2\ndweller 2\ndwellers 2\ndynamism 2\ndynastic 2\ndynasties 2\ndysfunction 2\ndyspepsia 2\ne-mailed 2\ne.g. 
2\nearmark 2\nearrings 2\nearthen 2\nearthly 2\nearthmover 2\nearthmoving 2\nearthy 2\neasement 2\neaters 2\nebay 2\nebook 2\neccentric 2\nechelon 2\nechelons 2\necologist 2\neconomical 2\necstasy 2\neducators 2\neel 2\neffected 2\nefficiencies 2\neffluent 2\neffusive 2\negalitarianism 2\negos 2\negregious 2\negregiously 2\neighths 2\nelation 2\nelectrically 2\nelectrician 2\nelectrochemicals 2\nelectrocuted 2\nelectroplating 2\nelicit 2\neliminates 2\nelitist 2\nelitists 2\nelixir 2\nelliptical 2\neloquence 2\neluded 2\nembankments 2\nembellished 2\nembellishment 2\nembezzlement 2\nembezzling 2\nembody 2\nembroil 2\nembryonic 2\nemcee 2\nemerald 2\nemigrated 2\nemigrating 2\nemissary 2\nemit 2\nemitting 2\nemotionalism 2\nempathize 2\nemptiness 2\nencased 2\nencircling 2\nenclose 2\nencoded 2\nencompass 2\nencompassed 2\nencountering 2\nencroachments 2\nencyclopedia 2\nendangerment 2\nendurance 2\nenergized 2\nengagements 2\nenjoin 2\nenlarging 2\nennui 2\nenormously 2\nenraged 2\nenrolling 2\nensemble 2\nentangled 2\nentertainers 2\nenthralled 2\nenticing 2\nentitling 2\nentombed 2\nentomology 2\nentrench 2\nentwined 2\nenvirons 2\nenvisage 2\nenvisaged 2\nenvisions 2\nenzymes 2\nepisodic 2\nequaled 2\nequate 2\nequator 2\nequestrians 2\nequips 2\nequitably 2\neradicating 2\neradication 2\nerasing 2\nerratically 2\nerred 2\nersatz 2\nerudite 2\nerupts 2\nescalates 2\nescalators 2\neschewed 2\nescorted 2\nesp 2\nesprit 2\nest 2\nestates 2\nesteemed 2\neternally 2\nethic 2\nethnography 2\nethnology 2\neucalyptus 2\neuphemisms 2\nevaders 2\nevaluates 2\nevened 2\nevenhanded 2\neverytime 2\nevocative 2\nevoked 2\nex 2\nex-husband 2\nex-members 2\nexacerbate 2\nexacerbates 2\nexaggeration 2\nexalted 2\nexamines 2\nexcavators 2\nexcels 2\nexchangeable 2\nexcitedly 2\nexclaiming 2\nexcludes 2\nexclusions 2\nexcruciating 2\nexcused 2\nexecutor 2\nexhilarating 2\nexonerated 2\nexorcism 2\nexpansionary 2\nexpansions 2\nexpeditions 2\nexpended 2\nexpiring 2\nexpiry 
2\nexplanatory 2\nexploiters 2\nexplorer 2\nexplorers 2\nexquisite 2\next. 2\nextorting 2\nextradite 2\nextravaganza 2\nextremities 2\nextroverted 2\nexurbs 2\neyebrow 2\neyeful 2\neyelids 2\nfa 2\nfab 2\nfabricate 2\nfabricating 2\nfactly 2\nfactoring 2\nfailings 2\nfainted 2\nfaintest 2\nfaire 2\nfairer 2\nfait 2\nfaiths 2\nfaked 2\nfakes 2\nfallacy 2\nfallback 2\nfalsehoods 2\nfalsification 2\nfalsify 2\nfaltering 2\nfalters 2\nfamiliarize 2\nfanatically 2\nfanatics 2\nfanciful 2\nfanned 2\nfantasize 2\nfarewells 2\nfarthest 2\nfasten 2\nfatherland 2\nfats 2\nfattened 2\nfaulted 2\nfavoritism 2\nfearless 2\nfecal 2\nfeces 2\nfeckless 2\nfederalized 2\nfeeble 2\nfeeder 2\nfeedlot 2\nfeisty 2\nfelonies 2\nfelons 2\nfemininity 2\nfended 2\nfennel 2\nfenugreek 2\nfern 2\nferociously 2\nfertilization 2\nfestooned 2\nfetchingly 2\nfeuds 2\nfiat 2\nfictitious 2\nfide 2\nfidelity 2\nfielding 2\nfiguratively 2\nfigurehead 2\nfiligree 2\nfilthy 2\nfinancer 2\nfinch 2\nfirearm 2\nfirefight 2\nfirefighting 2\nfireplace 2\nfireproof 2\nfirestorm 2\nfirma 2\nfisheries 2\nfixture 2\nfizzled 2\nflammable 2\nflapping 2\nflaring 2\nflashed 2\nflashpoint 2\nflatbed 2\nflats 2\nflawless 2\nflawlessly 2\nflex 2\nflicker 2\nflippant 2\nflipped 2\nflogging 2\nfloored 2\nflorists 2\nflourished 2\nflowchart 2\nflunk 2\nfluorouracil 2\nflyers 2\nfo 2\nfoaming 2\nfocussed 2\nfodder 2\nfoetus 2\nfoiled 2\nfoiling 2\nfolders 2\nfollower 2\nfoo 2\nfoodstuff 2\nfoolhardy 2\nfoolishness 2\nfootnote 2\nforcible 2\nforeground 2\nforeman 2\nforensics 2\nforeplay 2\nforerunner 2\nforetold 2\nforgave 2\nforked 2\nforks 2\nforlorn 2\nforlornly 2\nformality 2\nformalizes 2\nformative 2\nformatted 2\nfornication 2\nfortifications 2\nfortress 2\nfortunetellers 2\nfouled 2\nfouls 2\nfoundary 2\nfoundational 2\nfour-fold 2\nfoward 2\nfowl 2\nfoxes 2\nfoyer 2\nfracture 2\nfragment 2\nfrail 2\nframers 2\nframeworks 2\nfranchising 2\nfrancisco 2\nfrauds 2\nfraudulently 2\nfreaked 2\nfreaks 2\nfreebies 
2\nfreest 2\nfreighter 2\nfreighters 2\nfreon 2\nfretting 2\nfridge 2\nfrightful 2\nfrills 2\nfrittered 2\nfrolic 2\nfronds 2\nfrosty 2\nfroze 2\nfrugal 2\nfrugality 2\nfruition 2\nft. 2\nfucked 2\nfudged 2\nfulfil 2\nfuming 2\nfundraising 2\nfunneling 2\nfuriously 2\nfurnish 2\nfurnished 2\nfurthering 2\nfusions 2\ngagged 2\ngal 2\ngalas 2\ngalaxies 2\ngalleryfurniture.com 2\ngalvanizing 2\ngambler 2\ngangly 2\ngantry 2\ngarages 2\ngarland 2\ngarlands 2\ngasp 2\ngasped 2\ngasses 2\ngastric 2\ngawky 2\ngays 2\ngearbox 2\ngeeks 2\ngel 2\ngenders 2\ngenerosity 2\ngenesis 2\ngenres 2\ngent 2\ngenteel 2\ngentler 2\ngentry 2\ngeologically 2\ngeometry 2\ngeotextiles 2\ngerbil 2\ngf 2\nghostbusters 2\ngif 2\ngigolo 2\ngingerly 2\ngirding 2\ngiveaways 2\nglacial 2\nglaciers 2\nglanced 2\nglandular 2\nglaring 2\ngleeful 2\ngleefully 2\ngliders 2\nglint 2\nglisten 2\nglitches 2\nglitter 2\ngloating 2\nglobalists 2\ngloomier 2\ngloriously 2\nglowed 2\nglucose 2\nglutted 2\ngoalkeeper 2\ngoddesses 2\ngoggles 2\ngolfers 2\ngoodegg 2\ngoogled 2\ngorgeous 2\ngouge 2\ngoverns 2\ngraciously 2\ngrader 2\ngrafted 2\ngrammar 2\ngrandkids 2\ngrandsons 2\ngrapefruit 2\ngraphite 2\ngratuitous 2\ngratuitously 2\ngravely 2\ngraying 2\ngrazed 2\ngreasy 2\ngreek 2\ngreener 2\ngreenhouses 2\ngreening 2\ngridlocked 2\ngrieving 2\ngrilling 2\ngrimly 2\ngrinding 2\ngrinds 2\ngripping 2\ngrist 2\ngritty 2\ngroans 2\ngrocer 2\ngroomed 2\ngrossing 2\ngroundbreaking 2\ngrudging 2\ngrumbles 2\ngrumpy 2\nguan 2\nguardians 2\nguardianship 2\nguesswork 2\nguidebook 2\nguilds 2\ngunman 2\ngunshots 2\ngusto 2\ngymnastic 2\ngymnastics 2\ngyrating 2\nhabitation 2\nhabu 2\nhack 2\nhailing 2\nhairdresser 2\nhairstyle 2\nhalftime 2\nhallucinations 2\nhalo 2\nhamlets 2\nhammering 2\nhamstrung 2\nhandcuffs 2\nhandily 2\nhandsets 2\nhanged 2\nhanyin 2\nhappenings 2\nharangues 2\nharbored 2\nharden 2\nhardening 2\nhards 2\nharried 2\nharshness 2\nhassled 2\nhassles 2\nhaste 2\nhastened 2\nhastens 2\nhastiness 
2\nhaters 2\nhath 2\nhaughty 2\nhaulage 2\nhauling 2\nhawking 2\nhazmat 2\nhd 2\nheadgear 2\nheadlong 2\nheadquarter 2\nheadset 2\nheaped 2\nheartedness 2\nheartened 2\nheartily 2\nheavenly 2\nhedgers 2\nheedless 2\nhehehe 2\nheist 2\nhelix 2\nhelmets 2\nhelter 2\nhemorrhoids 2\nhenna 2\nherbicides 2\nherding 2\nhereditary 2\nheredity 2\nheretic 2\nherons 2\nhesitantly 2\nhesitated 2\nhiatus 2\nhibernate 2\nhideaway 2\nhidebound 2\nhideouts 2\nhijacker 2\nhike 2\nhikes 2\nhilly 2\nhindrance 2\nhindrances 2\nhippie 2\nhippies 2\nhips 2\nhirier 2\nhisses 2\nhissing 2\nhitched 2\nhitches 2\nhitherto 2\nhitters 2\nhlep 2\nhmm 2\nhmmm 2\nhoards 2\nhoisted 2\nholdouts 2\nholiest 2\nholocaust 2\nhomefront 2\nhomeport 2\nhomo 2\nhomonym 2\nhonoured 2\nhooded 2\nhoodlum 2\nhoods 2\nhookless 2\nhooks 2\nhooligans 2\nhordes 2\nhorizontally 2\nhorrified 2\nhorticultural 2\nhorticulturally 2\nhorticulture 2\nhospitable 2\nhotbed 2\nhotdog 2\nhotline 2\nhotpot 2\nhotter 2\nhouei 2\nhousekeeper 2\nhousekeeping 2\nhrfrank 2\nhttp://newsimg.bbc.co.uk/media/images/41186000/jpg/_41186472_kar-child-ap416.jpg 2\nhttp://www.abortionno.org/Resources/pictures_2.html 2\nhttp://www.donews.net/pangshengdong 2\nhttp://www.euci.com/pdf/trans_expn.pdf 2\nhuajia 2\nhuff 2\nhugging 2\nhulk 2\nhumanities 2\nhumanizing 2\nhumankind 2\nhumid 2\nhumiliate 2\nhumorist 2\nhumorous 2\nhundredth 2\nhundredths 2\nhusbandry 2\nhusk 2\nhuts 2\nhwei 2\nhydrocarbon 2\nhydrophilic 2\nhyenas 2\nhygiene 2\nhyperactive 2\nhyperlink 2\nhyperventilating 2\nhyping 2\nhypnotized 2\nhypotheses 2\nhypothesis 2\nhysterical 2\nhysterically 2\niPods 2\ni_am_s4udi@yahoo.com 2\niconoclastic 2\nicons 2\nid 2\nidealist 2\nidentifiable 2\nideologies 2\nideologues 2\nidiot 2\nidled 2\nidling 2\nidyllic 2\nigniting 2\nignoble 2\nii 2\niii 2\nillicit 2\nillustrious 2\nimmunities 2\nimpairment 2\nimpassioned 2\nimpassively 2\nimpatiently 2\nimpedance 2\nimpedes 2\nimpediments 2\nimpelled 2\nimperialism 2\nimpersonator 2\nimplanted 
2\nimplants 2\nimpound 2\nimpounded 2\nimpoundment 2\nimpressively 2\nimpulse 2\nimpulsively 2\nimpurities 2\nin-depth 2\ninaction 2\ninadequacies 2\ninadequacy 2\ninadequately 2\nincineration 2\nincite 2\nincomparable 2\ninconvenience 2\ninconvenient 2\nincorporation 2\nincrement 2\nincremental 2\nincubating 2\nincubators 2\nincursion 2\nindebtedness 2\nindecision 2\nindecisive 2\nindecisiveness 2\nindexer 2\nindignation 2\nindignities 2\nindisputable 2\nindistinguishable 2\nindomitable 2\nindustrialists 2\ninefficiencies 2\ninequalities 2\ninequities 2\ninertia 2\ninexcusable 2\ninexorably 2\ninfect 2\ninferiority 2\ninferred 2\ninfidelity 2\ninfidels 2\ninfinitely 2\ninflaming 2\ninflating 2\ninfluenza 2\ninfringes 2\ninfringing 2\ningenious 2\ningest 2\ningrained 2\ningratiate 2\ninhabit 2\ninherently 2\ninhibited 2\ninhibition 2\niniquity 2\ninitialed 2\ninjections 2\ninjects 2\ninjunctions 2\ninjustices 2\ninks 2\ninmates 2\ninnately 2\ninnkeeper 2\ninnocently 2\ninnovator 2\ninnovators 2\ninns 2\ninoperable 2\ninputs 2\ninquisitive 2\ninsanity 2\ninsecure 2\ninsensibility 2\ninsensible 2\ninsensitive 2\ninsignia 2\ninsincere 2\ninspirational 2\ninstigate 2\ninstigated 2\ninstigation 2\ninstilling 2\ninstinctive 2\ninsubordination 2\ninsufficiencies 2\ninsulate 2\ninsulins 2\ninsulted 2\ninsurmountable 2\ninter-American 2\ninter-group 2\nintercepted 2\nintercourse 2\ninterdependence 2\ninterdependent 2\ninterestingly 2\ninterestrate 2\ninterferes 2\ninterferon 2\ninterlinked 2\nintermediaries 2\nintermission 2\ninternalized 2\ninternationals 2\ninternment 2\ninternship 2\ninterpellations 2\ninterrogator 2\ninterrupt 2\nintertidal 2\nintestine 2\nintimidated 2\nintolerably 2\nintranet 2\nintrepid 2\nintuitively 2\ninvaders 2\ninvalidated 2\ninventing 2\ninventiveness 2\ninversely 2\ninverted 2\ninvestigates 2\ninvestigational 2\ninvigorating 2\ninvitees 2\ninvoluntarily 2\ninvoluntary 2\nip 2\nirate 2\nire 2\nireland 2\nirks 2\nironically 2\nirons 
2\nirradiated 2\nirredeemably 2\nirregular 2\nirreparably 2\nirreplaceable 2\nirresponsibly 2\nirreverent 2\nirrevocable 2\nirritated 2\niseesea 2\nitch 2\nivy 2\njacking 2\njade 2\njamming 2\njapanese 2\njasmine 2\njaw 2\njaws 2\njeffery7373 2\njeopardizes 2\njersey 2\njester 2\njettisoning 2\njewel 2\njeweler 2\njewelers 2\njews 2\njihadis 2\njoblessness 2\njocks 2\njostling 2\njousting 2\njoyful 2\njudicious 2\njudo 2\njuly 2\njumbled 2\njumpy 2\njungles 2\njunkets 2\njunkie 2\njunks 2\njurisdictional 2\njustly 2\njuvenile 2\nkai 2\nkaoshikii 2\nkarma 2\nkennels 2\nkeyed 2\nkeyless 2\nkeynote 2\nkidnapper 2\nkidnappers 2\nkidneys 2\nkillie 2\nkillies 2\nkilometre 2\nkilos 2\nkilter 2\nkinder 2\nkinescope 2\nkingpin 2\nkingside 2\nkiosk 2\nkissed 2\nkisses 2\nknitted 2\nknobs 2\nknuckles 2\nkollam 2\nkowtow 2\nkuan 2\nkuei 2\nkui 2\nl 2\nlabored 2\nlaborer 2\nlabyrinth 2\nlace 2\nlaches 2\nlading 2\nlads 2\nlaggard 2\nlagoon 2\nlags 2\nlaissez 2\nlama 2\nlament 2\nlampposts 2\nlander 2\nlandform 2\nlandholdings 2\nlandmarks 2\nlandscapes 2\nlanguished 2\nlapsed 2\nlargess 2\nlash 2\nlashed 2\nlashes 2\nlatch 2\nlatent 2\nlateral 2\nlathes 2\nlaughingly 2\nlaunchers 2\nlaundered 2\nlavender 2\nlavishing 2\nlawless 2\nlaxative 2\nle 2\nleaded 2\nleaguer 2\nleaps 2\nlees 2\nleftists 2\nleftover 2\nlegalistic 2\nlegislate 2\nlegit 2\nlegitimately 2\nleisurewear 2\nlemonade 2\nlengthen 2\nlengthens 2\nleniency 2\nleprosy 2\nles 2\nlessening 2\nlet's 2\nlethargic 2\nleukemia 2\nlevamisole 2\nlexicon 2\nlibrarian 2\nlibraries 2\nlichee 2\nliens 2\nlieutenants 2\nlifers 2\nlifespan 2\nlifespans 2\nlighthouse 2\nlilting 2\nlime 2\nlimitation 2\nlimitless 2\nlinearly 2\nlinger 2\nlingo 2\nlionized 2\nlipid 2\nlipstick 2\nlistless 2\nliterati 2\nlithe 2\nlithium 2\nlithograph 2\nlithographs 2\nlithotripter 2\nlitigants 2\nlitigators 2\nlitle 2\nlittered 2\nlitters 2\nloach 2\nloader 2\nloadings 2\nloathed 2\nlobsters 2\nlocale 2\nlocality 2\nlocket 2\nlocomotive 2\nlodging 
2\nlogistic 2\nlonged 2\nlongitude 2\nlongshoreman 2\nlooseleaf 2\nloosened 2\nlooters 2\nlos 2\nloth 2\nlotteries 2\nlowbrow 2\nloyalist 2\nloyalists 2\nlubricant 2\nlucid 2\nlumberyard 2\nlumpectomy 2\nlumping 2\nlunatic 2\nlunchbox 2\nlunchtime 2\nlures 2\nlurks 2\nlustrous 2\nluxuries 2\nlynx 2\nmacabre 2\nmachining 2\nmaddeningly 2\nmagically 2\nmagistrates 2\nmagnets 2\nmagnify 2\nmahogany 2\nmaid 2\nmailroom 2\nmailto:amy.cornell@compaq.com 2\nmailto:rosario.gonzales@compaq.com 2\nmajician 2\nmakeover 2\nmalignancy 2\nmalnourished 2\nmalnutrition 2\nmaneuverable 2\nmanicure 2\nmanifold 2\nmanipulating 2\nmanipulators 2\nmanslaughter 2\nmanuscript 2\nmaquiladoras 2\nmargarine 2\nmarijuana 2\nmarque 2\nmarred 2\nmarring 2\nmarshal 2\nmarshall 2\nmarvels 2\nmary 2\nmasculinity 2\nmasochism 2\nmasquerading 2\nmasseur 2\nmassively 2\nmasterfully 2\nmastermind 2\nmasterminded 2\nmat 2\nmatchmaker 2\nmated 2\nmaterializes 2\nmateriel 2\nmathematically 2\nmathematician 2\nmating 2\nmattered 2\nmattress 2\nmatures 2\nmaverick 2\nmax 2\nmayonnaise 2\nmb 2\nmeadow 2\nmeanest 2\nmeaningfully 2\nmeanings 2\nmeasurable 2\nmeatballs 2\nmediations 2\nmedications 2\nmeditating 2\nmegastore 2\nmegastores 2\nmegrahi 2\nmelanin 2\nmellowed 2\nmeltdown 2\nmelted 2\nmemento 2\nmemoir 2\nmemos 2\nmenaces 2\nmenswear 2\nmentalities 2\nmeowing 2\nmercies 2\nmerihi 2\nmerry 2\nmetabolism 2\nmetallurgy 2\nmetamorphosis 2\nmeted 2\nmeteorological 2\nmethane 2\nmethodical 2\nmeticulously 2\nmetres 2\nmetrics 2\nmi 2\nmicrocomputers 2\nmicrocosm 2\nmicromanage 2\nmicroorganisms 2\nmicrophone 2\nmicrophones 2\nmid-1990 2\nmid-1992 2\nmid-April 2\nmid-December 2\nmid-stream 2\nmiddling 2\nmidmarket 2\nmidrange 2\nmidsection 2\nmidway 2\nmigrate 2\nmigrated 2\nmigration 2\nmildew 2\nmilks 2\nmilky 2\nmillet 2\nmillionaires 2\nminced 2\nmincemeat 2\nmindlessly 2\nmined 2\nminer 2\nmingle 2\nmini-United 2\nmini-bus 2\nminiaturized 2\nminicar 2\nminimalism 2\nminimill 2\nminiseries 2\nminivans 
2\nminors 2\nminted 2\nmire 2\nmirrored 2\nmisadventures 2\nmisc.consumers 2\nmiscalculation 2\nmiscarriages 2\nmisfire 2\nmisguiding 2\nmishandling 2\nmishaps 2\nmisinterpret 2\nmisinterpretation 2\nmislead 2\nmismatch 2\nmisperceptions 2\nmisrepresent 2\nmisrepresenting 2\nmissionaries 2\nmisstatements 2\nmisstates 2\nmister 2\nmistresses 2\nmistrial 2\nmistrials 2\nmisunderstandings 2\nmitigated 2\nmma 2\nmo 2\nmoans 2\nmobility 2\nmobs 2\nmocked 2\nmoderating 2\nmoderators 2\nmodernist 2\nmodicum 2\nmodifies 2\nmodule 2\nmogul 2\nmolested 2\nmomentous 2\nmonetarists 2\nmonocrystalline 2\nmonologue 2\nmonologues 2\nmonomer 2\nmonopolize 2\nmonoxide 2\nmonsoon 2\nmonsters 2\nmontage 2\nmoods 2\nmoonlighting 2\nmoored 2\nmopping 2\nmoralist 2\nmoratorium 2\nmorbidity 2\nmorons 2\nmortar 2\nmortem 2\nmosaic 2\nmosquito 2\nmotivating 2\nmotley 2\nmotoring 2\nmotorized 2\nmottled 2\nmountainside 2\nmounts 2\nmourns 2\nmoustache 2\nmouthpiece 2\nmoxie 2\nmpg 2\nms 2\nmua4000@hotmail.com 2\nmuck 2\nmucked 2\nmuddied 2\nmuddy 2\nmujahideen 2\nmulberry 2\nmulti 2\nmulti-family 2\nmulti-functional 2\nmulti-lateral 2\nmulti-nation 2\nmulti-task 2\nmulticultural 2\nmultifamily 2\nmultimillionaire 2\nmultiplied 2\nmummified 2\nmurals 2\nmussels 2\nmutated 2\nmuted 2\nmutilated 2\nmutilating 2\nmutters 2\nmutton 2\nmysticism 2\nmystified 2\nn3td3v 2\nnachos 2\nnaifgh@msn.com 2\nnameplates 2\nnamesake 2\nnantie 2\nnapalm 2\nnapkin 2\nnarrated 2\nnary 2\nnatal 2\nnationhood 2\nnaturalization 2\nnaturalized 2\nnavigate 2\nnavigated 2\nnavigating 2\nnavigator 2\nnazi 2\nneanderthal 2\nneared 2\nnecklaces 2\nnegativity 2\nnegligently 2\nneighbourhood 2\nneighbours 2\nneocon 2\nneoconservative 2\nnerd 2\nnerds 2\nnerdy 2\nneurologists 2\nneurosurgeon 2\nnewsman 2\nnewsprints 2\nnewsstands 2\nnewsworthy 2\nnexus 2\nnickels 2\nnicks 2\nnieces 2\nnifty 2\nnightlife 2\nnightmares 2\nnighttime 2\nnimble 2\nnineteen 2\nnipple 2\nnobility 2\nnoblewoman 2\nnode 2\nnods 2\nnoida 2\nnoisy 
2\nnominating 2\nnon-Arab 2\nnon-Communist 2\nnon-GM 2\nnon-Japanese 2\nnon-Jewish 2\nnon-accrual 2\nnon-aggression 2\nnon-alcoholic 2\nnon-callable 2\nnon-championship 2\nnon-citizen 2\nnon-combatants 2\nnon-communists 2\nnon-disclosure 2\nnon-dual 2\nnon-essential 2\nnon-executive 2\nnon-existent 2\nnon-ferrous 2\nnon-interest 2\nnon-interference 2\nnon-mainstream 2\nnon-performing 2\nnon-permanent 2\nnon-praiseworthy 2\nnon-public 2\nnon-registered 2\nnon-scientific 2\nnon-standard 2\nnon-subscription 2\nnon-systematic 2\nnon-tariff 2\nnon-telephone 2\nnonbondad 2\nnoncommittal 2\nnoncompetitive 2\nnonconservative 2\nnoncriminal 2\nnondeductible 2\nnondemocratic 2\nnonfinancial 2\nnonfood 2\nnonoperating 2\nnonpublic 2\nnonresident 2\nnonstrategic 2\nnontoxic 2\nnonvoting 2\nnoodle 2\nnorthward 2\nnorthwestern 2\nnotebooks 2\nnoticing 2\nnotifications 2\nnoun 2\nnourishment 2\nnouveau 2\nnovelistic 2\nnude 2\nnuke 2\nnukes 2\nnumbing 2\nnumerically 2\nnutshell 2\noaths 2\nobjecting 2\nobligatory 2\noblige 2\nobliterate 2\noblivion 2\nobservances 2\nobservant 2\nobstructed 2\noccupant 2\noccupier 2\noceanographic 2\noffence 2\noffending 2\noffensives 2\nofficiate 2\nofficiated 2\noffstage 2\noilfields 2\nole 2\nomission 2\nomissions 2\nomnipotence 2\nonboard 2\noncogenes 2\noncoming 2\nonus 2\noohs 2\nooze 2\noozing 2\noperant 2\noperatic 2\nopossums 2\nopportunism 2\nopportunist 2\noptimal 2\noptimists 2\noptional 2\norbiter 2\norbits 2\norchards 2\norchestrated 2\norchid 2\norg 2\norganisation 2\norientations 2\norigami 2\noriginating 2\nornamental 2\northodontist 2\noutback 2\noutburst 2\noutcast 2\noutfield 2\noutfly 2\noutgrown 2\noutlaw 2\noutlawing 2\noutlines 2\noutmoded 2\noutstandingly 2\noutstrips 2\noutwardly 2\noutweighed 2\novalettes 2\novens 2\nover-optimistic 2\noveralls 2\noverbid 2\novercharge 2\novercharged 2\novercharges 2\noveremphasize 2\noverhauled 2\noverlay 2\noverloaded 2\noverrated 2\noverrun 2\noverseen 2\noverseers 2\novershadowed 
2\novershadowing 2\noversimplified 2\noverstated 2\noverstatement 2\noverstay 2\noverstep 2\novertook 2\noveruse 2\noverweight 2\nowhali 2\nowwie 2\noyster 2\np 2\npacemaker 2\npacify 2\npacing 2\npackets 2\npaddies 2\npaddy 2\npainstakingly 2\npajama 2\npalate 2\npaled 2\npales 2\npalette 2\npallid 2\npalpable 2\npaltry 2\npan-Arab 2\npandemonium 2\npane 2\npangs 2\npanned 2\npanoramic 2\npantheon 2\npany 2\npap 2\nparades 2\nparadigm 2\nparagons 2\nparalegal 2\nparalyzing 2\nparanoia 2\nparenthood 2\npariah 2\nparkway 2\nparlor 2\nparting 2\npartisanship 2\npassages 2\npasscode 2\npasserby 2\npassionately 2\npassively 2\npastries 2\npaternal 2\npathologically 2\npatiently 2\npatriarch 2\npatrilineal 2\npatrolling 2\npatronizing 2\npatrons 2\npatter 2\npaulhastings.com 2\npavilion 2\npawing 2\npawns 2\npayback 2\npayer 2\npeacekeepers 2\npeacetime 2\npeacocks 2\npecking 2\npeddles 2\npedestal 2\npedophilia 2\npeep 2\npelvic 2\npenalize 2\npenicillin 2\npenis 2\npenniless 2\npenthouse 2\npeonies 2\npeople.com.cn 2\nperambulators 2\nperfectionist 2\nperfidy 2\nperilous 2\nperiphery 2\nperitoneal 2\nperked 2\npernicious 2\nperpetrated 2\nperpetually 2\nperpetuating 2\nperplexing 2\npersistency 2\npersonable 2\npersuasively 2\npertaining 2\nperuse 2\nperversely 2\nperversion 2\npessimists 2\npest 2\npests 2\npetrie 2\nphasing 2\nphenom 2\nphilanthropist 2\nphilatelic 2\nphilosophic 2\nphoenixes 2\nphonetic 2\nphoney 2\nphotographed 2\nphotons 2\nphysicist 2\nphysicists 2\npi 2\npianos 2\npickaxes 2\npickups 2\npicnic 2\npictorial 2\npierced 2\npiggybacking 2\npilfering 2\npilgrimages 2\npiling 2\npilings 2\npillows 2\npinball 2\npinnacle 2\npint 2\npints 2\npiped 2\npiping 2\npiqued 2\npirate 2\npissed 2\npistol 2\npistols 2\npivots 2\npizzazz 2\npl 2\nplacate 2\nplagues 2\nplaneloads 2\nplanetary 2\nplanets 2\nplank 2\nplasma 2\nplatoon 2\nplatter 2\nplaudits 2\nplayful 2\nplayground 2\nplaymates 2\npleasantly 2\npled 2\npliers 2\nplotters 2\nplowed 2\nplows 2\npls 
2\nplucked 2\nplugging 2\nplugins 2\nplundering 2\npo 2\npoignantly 2\npointers 2\npokes 2\npolicymakers 2\npolio 2\npolishing 2\npolitely 2\npoliticking 2\npolity 2\npollinate 2\npollinated 2\npollster 2\npolyols 2\npolysilicon 2\npolystyrene 2\npolytheists 2\npontificate 2\npooled 2\npopped 2\npoppet 2\npopularize 2\npopularized 2\npopulate 2\npornographic 2\nporous 2\nportends 2\nportico 2\nposh 2\npositives 2\npost-1987 2\npost-1997 2\npost-Cold 2\npost-Soviet 2\npost-earthquake 2\npost-election 2\npost-inaugural 2\npost-quake 2\npost-season 2\npost-split 2\npost-strongman 2\npostmarked 2\npostmarks 2\npostulate 2\npostures 2\npotentialities 2\npotholes 2\npotpourri 2\npotted 2\npotting 2\npowerfully 2\npragmatist 2\nprawns 2\npre-1967 2\npre-election 2\npre-eminent 2\npre-empt 2\npre-emptive 2\npre-existing 2\npre-killed 2\npre-merger 2\npre-refunded 2\npre-register 2\npre-registered 2\npre-war 2\npreachers 2\nprearranged 2\nprecariously 2\nprecautionary 2\nprecede 2\nprecedents 2\nprecluded 2\nprecocious 2\npreconditions 2\npredates 2\npredictability 2\npredictive 2\npredominates 2\npreemptive 2\nprefectures 2\nprejudiced 2\npreliminaries 2\npremarital 2\npremiering 2\npresenter 2\npreserves 2\npreside 2\npresidental 2\npresides 2\nprettier 2\nprevalence 2\npreviews 2\nprick 2\nprimordial 2\nprinces 2\nprism 2\nprocedurally 2\nprocession 2\nproclamation 2\nprocure 2\nprocures 2\nprodigal 2\nprodigious 2\nprofessed 2\nprofessionally 2\nproffered 2\nproffers 2\nprofiling 2\nprofligate 2\nprogesterone 2\nprognosis 2\nprogrammatic 2\nprogrammer 2\nprogrammes 2\nprohibitions 2\nprohibitive 2\nproliferated 2\nproliferating 2\nprongs 2\npropagated 2\npropensity 2\nprophecies 2\nprophetic 2\npropositions 2\nproprietorships 2\npropylene 2\nprospected 2\nprostaglandin 2\nprostrate 2\nprotester 2\nprotruding 2\nprovenance 2\nproverbs 2\nprovincially 2\nprovisioning 2\nprovocations 2\nprovocatively 2\nprovokes 2\nprowl 2\nproxies 2\nprudently 2\nprudery 2\nps 
2\npseudonym 2\npsyche 2\npsycho-spiritual 2\npsychobiology 2\npsychopaths 2\npuberty 2\npublicizing 2\npuddings 2\npuerile 2\npullers 2\npunctuation 2\npungent 2\npunk 2\npuns 2\npunts 2\npupil 2\npupils 2\npurge 2\npurists 2\npuritanical 2\npurports 2\npursuers 2\npurveyor 2\npushers 2\npushy 2\nputtering 2\npyng 2\npyramids 2\npyrotechnic 2\npythons 2\nquacks 2\nquadrupeds 2\nquail 2\nqualitative 2\nquantum 2\nquarreled 2\nquarry 2\nquarterbacks 2\nquartets 2\nqueens 2\nquelled 2\nquestioner 2\nqueues 2\nquirks 2\nquits 2\nquitting 2\nquivers 2\nrabbi 2\nrabid 2\nraced 2\nracket 2\nrad 2\nradiates 2\nradiator 2\nrag 2\nrages 2\nrags 2\nraindrops 2\nrainwear 2\nrainy 2\nraisins 2\nrakes 2\nraking 2\nramble 2\nrampage 2\nramshackle 2\nranchers 2\nranches 2\nrancor 2\nrancorous 2\nranger 2\nrapidity 2\nrapper 2\nrata 2\nratifying 2\nrationalizations 2\nrations 2\nraucous 2\nre-adjust 2\nre-counts 2\nre-enter 2\nre-run 2\nreachable 2\nreactivated 2\nreacts 2\nreadable 2\nreaffirmation 2\nreaffirming 2\nreaffirms 2\nrealign 2\nrealisation 2\nrealist 2\nreappearance 2\nreappeared 2\nreapply 2\nreappointed 2\nreappraised 2\nrearm 2\nrearrange 2\nrearranged 2\nreassert 2\nreasserting 2\nreassign 2\nreassignment 2\nreassurance 2\nreawakening 2\nrebounds 2\nrebuff 2\nrecalcitrant 2\nrecalculating 2\nrecanted 2\nreccomend 2\nreceding 2\nreceivers 2\nreceptive 2\nrecessionary 2\nrecharge 2\nrecharging 2\nrecited 2\nreciting 2\nreclaims 2\nreclamation 2\nreclassified 2\nrecollection 2\nrecombinant 2\nreconfirmation 2\nreconsidered 2\nreconstructed 2\nrecouped 2\nrecoveries 2\nrecreated 2\nrecreating 2\nrectal 2\nrectangle 2\nrectangles 2\nrectangular 2\nrecti 2\nrectifying 2\nrecumbent 2\nrecuperation 2\nrecyclable 2\nredder 2\nredeemable 2\nredeeming 2\nredefinition 2\nrediculous 2\nredness 2\nredraw 2\nreefs 2\nreeked 2\nreelected 2\nreexamine 2\nrefereeing 2\nreferral 2\nreferrals 2\nrefillable 2\nrefine 2\nrefinement 2\nreflections 2\nreflective 2\nreflexively 2\nreflux 
2\nrefractory 2\nregaining 2\nregimen 2\nregiments 2\nregistering 2\nregistrar 2\nregistrations 2\nregressed 2\nregretfully 2\nregrettable 2\nregrettably 2\nregroup 2\nrehabilitate 2\nrehabilitated 2\nrehearing 2\nreigns 2\nreimbursable 2\nreimbursement 2\nreimpose 2\nreinforces 2\nreinstalled 2\nreinstatement 2\nreintroduced 2\nreinvented 2\nreinvesting 2\nreinvigorate 2\nreiterate 2\nrejoined 2\nrejuvenation 2\nrelaxes 2\nrelays 2\nrelegated 2\nrelent 2\nrelevance 2\nreliably 2\nreliever 2\nrelocate 2\nrelocating 2\nreminisce 2\nremittance 2\nremorse 2\nremovable 2\nremunerated 2\nrenounce 2\nrenouncing 2\nrenovated 2\nrenovations 2\nrenters 2\nrenunciation 2\nrepainted 2\nrepassed 2\nrepatriate 2\nrepent 2\nreplenish 2\nreplica 2\nrepositioning 2\nrepositories 2\nrepresentations 2\nrepressed 2\nreprieve 2\nreprisals 2\nrepublicanism 2\nrepugnant 2\nrepurchases 2\nrepurchasing 2\nreputed 2\nrequisition 2\nreruns 2\nrescinding 2\nrescission 2\nresells 2\nresembled 2\nresent 2\nreshuffled 2\nresidency 2\nresides 2\nresidues 2\nresigns 2\nresonances 2\nresonated 2\nresorting 2\nresourceful 2\nrespiratory 2\nrespite 2\nresplendent 2\nrestarting 2\nrestate 2\nrestless 2\nrestroom 2\nrestructures 2\nresurgence 2\nresurrected 2\nretails 2\nretainer 2\nretake 2\nretaking 2\nretaliated 2\nretention 2\nreticent 2\nretiree 2\nretirements 2\nretract 2\nretracted 2\nretreats 2\nretrial 2\nretrieval 2\nretroactive 2\nretrospective 2\nreturnees 2\nreunified 2\nreuse 2\nreused 2\nrevel 2\nreverberating 2\nrevere 2\nreverted 2\nrevitalized 2\nrevitalizing 2\nrevocation 2\nrevoking 2\nrevolutionize 2\nreworked 2\nrewriting 2\nrhyming 2\nrhythmically 2\nrib 2\nridership 2\nridge 2\nridges 2\nridicules 2\nrightfully 2\nrigor 2\nringleader 2\nrinse 2\nrinsing 2\nrioters 2\nriskiness 2\nrituals 2\nritzy 2\nriverbed 2\nriveted 2\nrivets 2\nroadbed 2\nroaders 2\nroamed 2\nroared 2\nroast 2\nroasted 2\nroasts 2\nrobotic 2\nrobotics 2\nrogues 2\nromanization 2\nrooftops 2\nroomful 
2\nroosters 2\nrooting 2\nrosier 2\nroster 2\nrotate 2\nrotted 2\nrotting 2\nroughed 2\nroughshod 2\nroustabout 2\nroutines 2\nrovers 2\nrowing 2\nrubbing 2\nrubric 2\nrudest 2\nrueful 2\nrug 2\nrumbled 2\nrumblings 2\nruminated 2\nrung 2\nrupees 2\nrut 2\nruts 2\nsacked 2\nsacking 2\nsadder 2\nsadistic 2\nsadness 2\nsafeguarded 2\nsage 2\nsages 2\nsaints 2\nsaliva 2\nsaltpeter 2\nsaltwater 2\nsalvo 2\nsampler 2\nsampling 2\nsanction 2\nsanctioning 2\nsanctity 2\nsands 2\nsane 2\nsanitary 2\nsapped 2\nsarcastic 2\nsardonic 2\nsatanic 2\nsaturday 2\nsauces 2\nsauna 2\nsausage 2\nsavagely 2\nsavior 2\nsavviest 2\nscalded 2\nscammer 2\nscan 2\nscanner 2\nscathing 2\nscavengers 2\nscented 2\nscents 2\nscepticism 2\nscheduler 2\nschizophrenic 2\nschoolchildren 2\nschoolmate 2\nschoolmates 2\nsci 2\nscoffed 2\nscoffs 2\nscold 2\nscooping 2\nscoops 2\nscour 2\nscouting 2\nscowl 2\nscraper 2\nscrapping 2\nscratched 2\nscratches 2\nscratchy 2\nscreened 2\nscribbled 2\nscrimping 2\nscriptural 2\nscriptures 2\nscrubbers 2\nscruff 2\nscrupulous 2\nscrutinized 2\nscuffle 2\nsculpting 2\nsculptors 2\nsculptures 2\nscum 2\nscurries 2\nscurrying 2\nseaborne 2\nseacoast 2\nseahome 2\nseakness 2\nseam 2\nseamy 2\nseasonings 2\nseawall 2\nsecession 2\nsectarianism 2\nsectarians 2\nsects 2\nsediment 2\nseduced 2\nseeded 2\nseeped 2\nsegregate 2\nsegregated 2\nseismaesthesia 2\nseismology 2\nsellout 2\nsemi-annually 2\nsemi-automatic 2\nsemi-finished 2\nsenatorial 2\nsenders 2\nsensationalism 2\nsensors 2\nsensual 2\nsentry 2\nsequels 2\nsequences 2\nserenely 2\nserenity 2\nsergeant 2\nservile 2\nsetoff 2\nsetters 2\nsetups 2\nsever 2\nsewers 2\nsexist 2\nsextant 2\nshallot 2\nshamelessness 2\nsharecroppers 2\nshatter 2\nsheaths 2\nsheeps 2\nsheetlets 2\nshellfish 2\nshelved 2\nshepherds 2\nsheriff 2\nshielding 2\nshillings 2\nshined 2\nshins 2\nshipmates 2\nshipowner 2\nshipsets 2\nshipyards 2\nshiver 2\nshivering 2\nshivers 2\nshoals 2\nshoemaking 2\nshoestring 2\nshootout 
2\nshopkeeper 2\nshopped 2\nshopper 2\nshorn 2\nshortbox 2\nshortcut 2\nshorten 2\nshortening 2\nshortsighted 2\nshotguns 2\nshoves 2\nshoving 2\nshowered 2\nshowering 2\nshred 2\nshreds 2\nshrewdly 2\nshriveled 2\nshrub 2\nshrugged 2\nshudders 2\nshui 2\nshunted 2\nshuts 2\nshutter 2\nshuttered 2\nshuttled 2\nsi 2\nsidelined 2\nsideshow 2\nsiding 2\nsignatories 2\nsignificances 2\nsignify 2\nsilently 2\nsilkie 2\nsilkworms 2\nsilky 2\nsilt 2\nsilting 2\nsilvery 2\nsimplification 2\nsimulator 2\nsingling 2\nsingular 2\nsinks 2\nsipped 2\nsixfold 2\nsizing 2\nsizzling 2\nskateboards 2\nskater 2\nskaters 2\nskeletal 2\nskelter 2\nskids 2\nskilful 2\nskillful 2\nskinny 2\nskipper 2\nskipping 2\nskirmishing 2\nskis 2\nskyrocket 2\nslanting 2\nslapping 2\nslashes 2\nslats 2\nsledgehammer 2\nslicing 2\nslimmed 2\nsling 2\nslinky 2\nslippage 2\nslippers 2\nslither 2\nslithering 2\nslum 2\nslums 2\nslung 2\nsmattering 2\nsmearing 2\nsmelly 2\nsmelter 2\nsmilies 2\nsmirk 2\nsmokes 2\nsmokescreen 2\nsmokestack 2\nsmoldering 2\nsmoothed 2\nsmugglers 2\nsnafu 2\nsnails 2\nsnakebite 2\nsnapping 2\nsnappy 2\nsnarls 2\nsnatched 2\nsneaked 2\nsneaking 2\nsneaks 2\nsneaky 2\nsnipers 2\nsnippets 2\nsnooping 2\nsnooty 2\nsnoring 2\nsnorkeling 2\nsnorts 2\nsnowballed 2\nsnowstorms 2\nsnubbing 2\nsob 2\nsobbing 2\nsobered 2\nsobering 2\nsociability 2\nsocialistic 2\nsocialists 2\nsockets 2\nsoftener 2\nsoils 2\nsolder 2\nsoldering 2\nsolicitous 2\nsolidified 2\nsoloist 2\nsolvency 2\nsolves 2\nsonar 2\nsongwriters 2\nsooooo 2\nsoreness 2\nsouled 2\nsoulmate 2\nsouped 2\nsouring 2\nsoviet 2\nsoviets 2\nspans 2\nsparingly 2\nsparkles 2\nsparred 2\nsparse 2\nspasms 2\nspecializations 2\nspecification 2\nspecimens 2\nspectacles 2\nspeedily 2\nspender 2\nspendy 2\nsperm 2\nspew 2\nspice 2\nspiffy 2\nspilling 2\nspindles 2\nspineless 2\nspinnaker 2\nspins 2\nspire 2\nspirituality 2\nspiritually 2\nspitting 2\nsplashed 2\nsplashy 2\nspleen 2\nsplendidly 2\nsplendor 2\nspoiler 2\nsponge 
2\nspooky 2\nsporadically 2\nsporty 2\nsprays 2\nsprinkled 2\nsprinkles 2\nsprinting 2\nsprout 2\nspruce 2\nspuds 2\nspurious 2\nspurs 2\nspurted 2\nsqualid 2\nsquandering 2\nsquatted 2\nsquatting 2\nsqueamish 2\nsquinting 2\nstabilizes 2\nstably 2\nstacking 2\nstacks 2\nstaffing 2\nstagnated 2\nstaircase 2\nstalking 2\nstalling 2\nstampeded 2\nstandardization 2\nstandardizing 2\nstandout 2\nstaples 2\nstapling 2\nstarch 2\nstartle 2\nstash 2\nstatesman 2\nstatist 2\nstatistically 2\nstats 2\nstaunchest 2\nsteadfast 2\nsteakhouse 2\nsteelmaking 2\nsteely 2\nsteeper 2\nsteeply 2\nsteers 2\nstench 2\nsterilization 2\nsteroids 2\nstew 2\nstewed 2\nstiffest 2\nstimuli 2\nstinky 2\nstints 2\nstipulate 2\nstipulations 2\nstirrups 2\nstitch 2\nstitched 2\nstocked 2\nstockholdings 2\nstockpile 2\nstomping 2\nstonemason 2\nstoning 2\nstoppages 2\nstopper 2\nstorefronts 2\nstorytelling 2\nstowed 2\nstrategizing 2\nstratosphere 2\nstreamlined 2\nstreptokinase 2\nstrikeout 2\nstriker 2\nstriped 2\nstrippers 2\nstrode 2\nstructurally 2\nstudded 2\nstudious 2\nstuds 2\nstump 2\nstumping 2\nstun 2\nstunningly 2\nstupidest 2\nstupidly 2\nstylishly 2\nsub-standard 2\nsubcommittees 2\nsubconscious 2\nsubcontract 2\nsubcontracting 2\nsubdivided 2\nsubmissive 2\nsubpoenas 2\nsubprime 2\nsubscribed 2\nsubscribes 2\nsubscriptions 2\nsubsidence 2\nsubsides 2\nsubstandard 2\nsubstantiate 2\nsubstation 2\nsubstations 2\nsubstrate 2\nsubtilis 2\nsubtitled 2\nsubtracted 2\nsubversives 2\nsubvert 2\nsuccumbed 2\nsuccumbing 2\nsucre 2\nsugared 2\nsuitcases 2\nsullied 2\nsulphur 2\nsulphuric 2\nsummaries 2\nsummarized 2\nsummarizing 2\nsunglasses 2\nsuntan 2\nsupercharger 2\nsupercycle 2\nsupercycles 2\nsuperhighway 2\nsuperintendent 2\nsupersalmon 2\nsupersonic 2\nsuperstar 2\nsuperstitious 2\nsupervises 2\nsupper 2\nsupplant 2\nsupple 2\nsupposes 2\nsupposition 2\nsuppressors 2\nsurcharge 2\nsurest 2\nsurges 2\nsurpasses 2\nsurrendering 2\nsurrogate 2\nsurvivability 2\nsuspecting 
2\nsuspensions 2\nsustainability 2\nsutures 2\nsvelte 2\nswallowing 2\nswamp 2\nswapped 2\nswarms 2\nswath 2\nsweater 2\nsweating 2\nsweatshops 2\nsweepers 2\nsweetener 2\nsweeteners 2\nswimmers 2\nswimwear 2\nswindled 2\nswinging 2\nswirls 2\nswoon 2\nswords 2\nsymbiotic 2\nsymbolically 2\nsymbolize 2\nsymbolized 2\nsymbolizes 2\nsympathized 2\nsymphony 2\nsymposiums 2\nsync 2\nsynch 2\nsynchronous 2\nsyndicating 2\nsyndication 2\nsyrup 2\nsystematized 2\nsystemwide 2\nta' 2\ntabulation 2\ntackles 2\ntaco 2\ntacos 2\ntailed 2\ntaint 2\ntakedown 2\ntakeoffs 2\ntallied 2\ntamed 2\ntamer 2\ntamper 2\ntapered 2\ntapers 2\ntar 2\ntardy 2\ntarmac 2\ntarp 2\ntart 2\ntastefully 2\ntastings 2\ntatters 2\ntattoo 2\ntawdry 2\ntaxing 2\nteammate 2\nteamwork 2\nteased 2\nteaser 2\nteasing 2\ntechnicality 2\ntechnobility 2\ntee 2\ntel 2\ntelegram 2\ntelegraphed 2\ntelephony 2\ntelex 2\ntelexes 2\ntellers 2\ntelly 2\ntempers 2\ntemptations 2\ntenacious 2\ntenacity 2\ntenet 2\ntenuous 2\nterra 2\nterrace 2\nterrestrial 2\nterrorize 2\nthankfully 2\nthanksgiving 2\ntheatre 2\nthee 2\nthematic 2\nthemed 2\ntheologians 2\ntheoretician 2\ntheorists 2\nthermometers 2\nthermostat 2\nthier 2\nthigh 2\nthighs 2\nthinkers 2\nthinned 2\nthirdly 2\nthirteenth 2\nthirtieth 2\nthorny 2\nthoroughbreds 2\nthrall 2\nthrashing 2\nthreemonth 2\nthrifty 2\nthriller 2\nthrilling 2\nthrowback 2\nthrusting 2\nthump 2\nthunderous 2\nthunderstorm 2\nticketed 2\nticklish 2\ntics 2\ntidbit 2\ntidied 2\ntien 2\ntights 2\ntilling 2\ntimberlands 2\ntimeline 2\ntimer 2\ntimorous 2\ntinkered 2\ntippee 2\ntipper 2\ntipping 2\ntiremaker 2\ntiresome 2\ntoasted 2\ntoddler 2\ntoenails 2\ntoggles 2\ntoiletries 2\ntoils 2\ntolerated 2\ntomorrows 2\ntones 2\ntoolbar 2\ntoothache 2\ntoothbrush 2\ntorched 2\ntorment 2\ntorments 2\ntorque 2\ntortuous 2\ntossing 2\ntotes 2\ntouchdown 2\ntournaments 2\ntouts 2\ntowing 2\ntownhouse 2\ntracker 2\ntradeoff 2\ntradeoffs 2\ntrainees 2\ntraitor 2\ntrampled 2\ntranquilizing 
2\ntrans-Atlantic 2\ntransact 2\ntranscended 2\ntranscends 2\ntransgression 2\ntransgressors 2\ntransient 2\ntransistor 2\ntransistors 2\ntransitioning 2\ntransliteration 2\ntransmit 2\ntransmitter 2\ntransplantation 2\ntransplanting 2\ntrapping 2\ntrashing 2\ntravails 2\ntravelled 2\ntravellers 2\ntravesty 2\ntreachery 2\ntreasuries 2\ntriangles 2\ntributes 2\ntrickery 2\ntrickling 2\ntriggers 2\ntriglycerides 2\ntrimester 2\ntrinity 2\ntrio 2\ntrios 2\ntripartite 2\ntriviality 2\ntrophy 2\ntrotters 2\ntroughed 2\ntruant 2\ntrucked 2\ntruckload 2\ntruckloads 2\ntrudge 2\ntrudging 2\ntrunks 2\ntseh 2\ntsu 2\ntsung 2\ntubing 2\ntuitions 2\ntumbles 2\ntunes 2\ntupperwear 2\nturbans 2\nturbogenerator 2\nturboprop 2\nturkey 2\nturkeys 2\nturntable 2\nturtles 2\ntuxedo 2\ntweezers 2\ntwentieth 2\ntwitching 2\ntwofold 2\ntypifies 2\ntypos 2\ntyrannical 2\nubiquity 2\nultra-violent 2\numbilical 2\numbrellas 2\nunabated 2\nunaccustomed 2\nunadited 2\nunadjusted 2\nunaffiliated 2\nunanswered 2\nunattainable 2\nunattractive 2\nunawareness 2\nunbeatable 2\nunbelievers 2\nunburned 2\nuncensored 2\nuncharted 2\nunclassified 2\nunclean 2\nuncomplicated 2\nunconcerned 2\nunconditionally 2\nunconnected 2\nunconvincing 2\nuncritical 2\nundefeated 2\nundelivered 2\nundeniably 2\nunderfunded 2\nunderlined 2\nundermines 2\nunderperform 2\nunderrepresented 2\nunderscoring 2\nundersecretary 2\nundertaker 2\nundertones 2\nunderused 2\nunderutilized 2\nunderworld 2\nunderwrites 2\nundetermined 2\nundid 2\nundistinguished 2\nundisturbed 2\nundiversified 2\nundocumented 2\nundoing 2\nundone 2\nundulate 2\nunequal 2\nunequivocally 2\nunfamiliarity 2\nunfavorably 2\nunforeseen 2\nunforgivable 2\nunhappiness 2\nunhindered 2\nunhinged 2\nunhurried 2\nuni-polar 2\nunifier 2\nunify 2\nunimaginably 2\nunimportant 2\nunindicted 2\nunintended 2\nunintentionally 2\nunionized 2\nunlawfully 2\nunleaded 2\nunleash 2\nunleashes 2\nunlicensed 2\nunlocked 2\nunmarked 2\nunmatched 2\nunobserved 2\nunplanned 
2\nunprecedentedly 2\nunpunished 2\nunquestionable 2\nunquestionably 2\nunrecognized 2\nunregistered 2\nunrelenting 2\nunremarkable 2\nunremitting 2\nunresponsive 2\nunseated 2\nunseemly 2\nunserious 2\nunsigned 2\nunsolved 2\nunsound 2\nunspent 2\nunsteady 2\nunstylish 2\nunsuitable 2\nunsurpassed 2\nunsurprising 2\nunthreatening 2\nuntrained 2\nuntruthful 2\nunturned 2\nunusable 2\nunveils 2\nunwavering 2\nunwind 2\nunwitting 2\nunwittingly 2\nunwritten 2\nupbringing 2\nupfront 2\nupholstery 2\nuplifted 2\nuppers 2\nupraised 2\nuprisings 2\nupsetting 2\nupstarts 2\nupstate 2\nuptight 2\nur 2\nurea 2\nurethra 2\nusability 2\nusable 2\nusurp 2\nusurping 2\nutensils 2\nuterine 2\nuterus 2\nutterances 2\nvacationing 2\nvaccination 2\nvaccines 2\nvacillation 2\nvagaries 2\nvaginal 2\nvalidating 2\nvantage 2\nvapor 2\nvapors 2\nvariac 2\nvases 2\nvectors 2\nveered 2\nvegetarian 2\nvegetarians 2\nvehement 2\nvendetta 2\nvenerated 2\nvengeful 2\nvenoms 2\nverbatim 2\nverifiable 2\nverifying 2\nverily 2\nversatile 2\nversed 2\nvexing 2\nviamedia 2\nvibration 2\nvibrations 2\nvice-governor 2\nvices 2\nvicissitudes 2\nvictimization 2\nvideotapes 2\nvideotaping 2\nvigil 2\nvigilance 2\nvile 2\nvillain 2\nvillains 2\nvinegar 2\nvinyl 2\nviolet 2\nviolinist 2\nvirtuosity 2\nvirtuous 2\nvisage 2\nvisionaries 2\nvisualizing 2\nvitally 2\nvitamins 2\nvitreous 2\nvm 2\nvocabularies 2\nvocabulary 2\nvocalist 2\nvociferously 2\nvoided 2\nvoir 2\nvolt 2\nvolunteering 2\nvowel 2\nvulgarity 2\nwa 2\nwading 2\nwafer 2\nwaffled 2\nwag 2\nwagon 2\nwaivered 2\nwaivers 2\nwalkie 2\nwalkout 2\nwalkouts 2\nwalmart 2\nwardens 2\nwardrobe 2\nwarehousing 2\nwarheads 2\nwarmest 2\nwarped 2\nwasher 2\nwasteland 2\nwatchdogs 2\nwaterfall 2\nwaterfalls 2\nwatery 2\nwavelength 2\nwayward 2\nwealthiest 2\nweaponized 2\nwearies 2\nweave 2\nwebpage 2\nwebs 2\nwedded 2\nwedged 2\nweed 2\nweeds 2\nweeklies 2\nweightlessness 2\nweighty 2\nweirdness 2\nwelded 2\nwerewolf 2\nwharves 2\nwhatnot 2\nwheeled 
2\nwheeler 2\nwhimper 2\nwhims 2\nwhips 2\nwhisked 2\nwhisky 2\nwhispered 2\nwholesaling 2\nwhoosh 2\nwhoring 2\nwicking 2\nwielded 2\nwiggle 2\nwigs 2\nwildcat 2\nwilled 2\nwillfully 2\nwily 2\nwindfall 2\nwindy 2\nwineries 2\nwines 2\nwingers 2\nwinneragain 2\nwiper 2\nwired 2\nwireless 2\nwiry 2\nwishful 2\nwitching 2\nwithdraws 2\nwither 2\nwomanly 2\nwondrous 2\nwoodpeckers 2\nwoodwind 2\nwordy 2\nworkaholic 2\nworkday 2\nworriers 2\nworshipers 2\nwort 2\nworthiness 2\nwrack 2\nwrappers 2\nwraps 2\nwreaking 2\nwrecking 2\nwrench 2\nwrestle 2\nwring 2\nwrithing 2\nwu 2\nwunderkind 2\ny'all 2\nyanking 2\nyardstick 2\nyarns 2\nyearlong 2\nyeeesh 2\nyell 2\nyells 2\nyer 2\nyew 2\nyin 2\nymac 2\nyogurt 2\nyonder 2\nyorkshire 2\nyou'll 2\nyou're 2\nyrs 2\nyuk 2\nzenith 2\nzhuxi 2\nzoned 2\nzucchini 2\n~~~~~~~~~~~~ 2\n■ 2\n!!!!!!! 1\n!!!!!!!! 1\n!!!!!!!!!! 1\n!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! 1\n!!!!!!!!!!!!!? 1\n!!!!. 1\n!!!. 1\n!!. 1\n!* 1\n!?! 
1\n\"\" 1\n##'s 1\n#NAME 1\n$50,000 1\n$ervice 1\n$involved 1\n$ome 1\n$ometime$ 1\n&# 1\n''It 1\n', 1\n'20s 1\n'30s 1\n'45 1\n'50s 1\n'70 1\n'74 1\n'89 1\n'96 1\n'Akkab 1\n'Arabi 1\n'Arabiya 1\n'Em 1\n'LL 1\n'T 1\n'amour 1\n'cause 1\n'droid 1\n'ns 1\n'nuff 1\n'recg 1\n*$ 1\n********* 1\n*********************************** 1\n*1 1\n*2 1\n*3 1\n*4 1\n*cheesy* 1\n*extremely* 1\n*integrated 1\n*require* 1\n*ss 1\n*when* 1\n++++ 1\n+++++ 1\n++++++ 1\n+already 1\n+grabbing 1\n+her 1\n+really 1\n+under 1\n+wanted 1\n,,, 1\n,,,, 1\n---- 1\n----------- 1\n------------------ 1\n------------------- 1\n----------------------------- 1\n------------------------------- 1\n----------------------------------------------- 1\n------------------------------------------------ 1\n------------------------------------------------------ 1\n------------------------------------------------------------- 1\n---------------------------------------------------------------- 1\n------------------------------------------------------------------ 1\n------------------------------------------------------------------- 1\n-> 1\n-LRB 1\n-LRB-c-RRB-?Ìö]o? 1\n-LRB-c-RRB-x 1\n-RRB 1\n-considering 1\n.- 1\n..!! 1\n..!!. 1\n..!* 1\n........ 1\n......... 1\n.......... 1\n............... 1\n..................... 1\n...................... 1\n....? 1\n.270 1\n.342 1\n.4 1\n.50 1\n.6 1\n.67 1\n.8 1\n.84 1\n.87 1\n.95 1\n.LBR 1\n.com 1\n.dbf 1\n.what 1\n// 1\n0. 
1\n0.0002 1\n0.0015 1\n0.0018 1\n0.0040 1\n0.0075 1\n0.0100 1\n0.0115 1\n0.0182 1\n0.025 1\n0.07 1\n0.11 1\n0.14 1\n0.16 1\n0.18 1\n0.20 1\n0.23 1\n0.26 1\n0.27 1\n0.272 1\n0.28 1\n0.30 1\n0.31 1\n0.35 1\n0.36 1\n0.37 1\n0.39 1\n0.44 1\n0.47 1\n0.50 1\n0.51 1\n0.55 1\n0.57 1\n0.628394 1\n0.6287 1\n0.63 1\n0.65 1\n0.66 1\n0.73 1\n0.76 1\n0.85 1\n0.86 1\n0.89 1\n0.92 1\n0.97 1\n0.99 1\n0000 1\n003 1\n008 1\n009 1\n00949 1\n00:20 1\n00:39:42 1\n00:42:30 1\n01/13/2007 1\n01/19/01 1\n01040-2841 1\n012 1\n013 1\n014 1\n01:02 1\n01:09 1\n01:09:32 1\n01:35 1\n01:45 1\n01:47 1\n01:57 1\n02/13/2001 1\n02/22/2001 1\n021 1\n02920 1\n02:02 1\n02:10 1\n02:18 1\n02:19 1\n02:21 1\n02:25 1\n02:34 1\n03/01/2001 1\n03/16/2001 1\n03/26/2001 1\n03/29/2001 1\n0390 1\n03:00 1\n03:10:35 1\n03:11 1\n03:16 1\n03:22 1\n03:31 1\n03:43 1\n03:48 1\n04/25/01 1\n04/26/2001 1\n04/28/2000 1\n0400 1\n04:11 1\n04:13 1\n04:18 1\n04:28 1\n04:37 1\n04:44 1\n04:50 1\n04:52 1\n05/31/2001 1\n050901.doc 1\n0566 1\n05:09 1\n05:12:00 1\n05:15:41 1\n05:51 1\n05:54 1\n06/02/2001 1\n06/04/01 1\n06/04/2001 1\n06/12/2000 1\n0622 1\n064 1\n065 1\n066 1\n069 1\n06:03:48 1\n06:07 1\n06:15 1\n06:16 1\n06:23 1\n07/06/2000 1\n07/18/2000 1\n07/19/2001 1\n073 1\n074 1\n075 1\n076 1\n077 1\n078 1\n07:03:00 1\n07:17 1\n07:35 1\n07:48:44 1\n07:55 1\n0832 1\n0845 1\n08457 1\n0861 1\n08:14:00 1\n08:15 1\n08:22 1\n08:23 1\n08:27:46 1\n08:36:00 1\n08:38 1\n08:50 1\n08:52 1\n08:55 1\n09/03/99 1\n09/11/99 1\n09/17/99 1\n094 1\n095 1\n096 1\n097 1\n098 1\n09819602175 1\n099 1\n09:00 1\n09:00:15 1\n09:11 1\n09:14 1\n09:22 1\n09:30 1\n09:32 1\n09:36 1\n09:36:00 1\n09:37 1\n09:40 1\n09:46 1\n09:48 1\n09:51:22 1\n09:52 1\n09:55 1\n09:56 1\n0:00 1\n0nside 1\n1,001 1\n1,003,884 1\n1,013 1\n1,014 1\n1,022,000 1\n1,026.46 1\n1,027 1\n1,030 1\n1,035,000 1\n1,048,500,000 1\n1,059.04 1\n1,062 1\n1,068,000 1\n1,070,000 1\n1,074 1\n1,075,000 1\n1,087 1\n1,103.11 1\n1,108 1\n1,120 1\n1,120,317 1\n1,124 1\n1,141 1\n1,143 1\n1,150 1\n1,155 
1\n1,174 1\n1,177,000 1\n1,178 1\n1,183 1\n1,200,000 1\n1,214 1\n1,222 1\n1,224 1\n1,235 1\n1,240 1\n1,244 1\n1,263,000 1\n1,271 1\n1,275,000 1\n1,290 1\n1,296,000 1\n1,296,800 1\n1,300,000 1\n1,310 1\n1,320 1\n1,325,900 1\n1,327 1\n1,342,264 1\n1,351,662 1\n1,368 1\n1,380,000 1\n1,384,119 1\n1,400,000 1\n1,425,035 1\n1,430 1\n1,435 1\n1,450,635 1\n1,455,000 1\n1,458,000 1\n1,474 1\n1,475,000 1\n1,480 1\n1,482 1\n1,490 1\n1,520 1\n1,531,000 1\n1,534 1\n1,534,600 1\n1,555 1\n1,580 1\n1,613 1\n1,614 1\n1,616,000 1\n1,640 1\n1,640,000 1\n1,642 1\n1,647 1\n1,650,000 1\n1,656,870 1\n1,657,736 1\n1,680 1\n1,685 1\n1,695,000 1\n1,704 1\n1,716 1\n1,730 1\n1,735 1\n1,745,000 1\n1,749,000 1\n1,770 1\n1,774,326 1\n1,784,400 1\n1,802,000 1\n1,809,300 1\n1,810,700 1\n1,816,000 1\n1,826,596 1\n1,838,200 1\n1,843,000 1\n1,848,000 1\n1,853,735 1\n1,878 1\n1,892 1\n1,908 1\n1,920 1\n1,922,000 1\n1,930 1\n1,977 1\n1,979,000 1\n1,980 1\n1-202-225-3121 1\n1-800-453-9000 1\n1-800-660-1350 1\n1-877-851-6437 1\n1-945-220-0044 1\n1.001 1\n1.011 1\n1.024 1\n1.045 1\n1.092 1\n1.1270 1\n1.1280 1\n1.130 1\n1.143 1\n1.1510 1\n1.1580 1\n1.1621 1\n1.168 1\n1.1726 1\n1.175 1\n1.1960 1\n1.234 1\n1.2345 1\n1.255 1\n1.2645 1\n1.2795 1\n1.342 1\n1.357 1\n1.388 1\n1.439 1\n1.441 1\n1.465 1\n1.5500 1\n1.57 1\n1.5775 1\n1.5805 1\n1.5885 1\n1.5890 1\n1.5930 1\n1.5940 1\n1.5990 1\n1.61 1\n1.6143 1\n1.68 1\n1.7600 1\n1.800.233.1234 1\n1.8410 1\n1.8435 1\n1.8690 1\n1.871 1\n1.877.999.3223 1\n1.9000 1\n1.927 1\n1.937 1\n1.9375 1\n1.96 1\n1.97 1\n1.98 1\n1.99 1\n1/02/2007 1\n1/11/1427 1\n1/16 1\n1/20 1\n1/20/2007 1\n1/30/10 1\n1/31 1\n1/5 1\n1/70th 1\n1/80th 1\n10,300 1\n10,450,000 1\n10,674,500 1\n10,750 1\n10,873 1\n10- 1\n10-fold 1\n10. 
1\n10.0 1\n10.000.000 1\n10.02 1\n10.08 1\n10.09 1\n10.11 1\n10.125 1\n10.13 1\n10.16 1\n10.17 1\n10.33 1\n10.375 1\n10.40 1\n10.43 1\n10.44 1\n10.45 1\n10.485 1\n10.50 1\n10.5625 1\n10.62 1\n10.66 1\n10.75 1\n10.78 1\n10.83 1\n10.86 1\n10.875 1\n10.93 1\n10.95 1\n10.958 1\n10.98 1\n10/08/99 1\n10/2/2006 1\n100's 1\n100,000,000,000 1\n100.625 1\n100.8 1\n1000's 1\n10000 1\n1000th 1\n10043 1\n1006 1\n100s 1\n101,000 1\n101,250 1\n101.225 1\n101.45 1\n101.5 1\n101.60 1\n101.7 1\n101.75 1\n101.80 1\n101.90 1\n101.95 1\n101.98 1\n10100491 1\n1017.69 1\n102.01 1\n102.25 1\n102.5 1\n103.98 1\n1035 1\n104.79 1\n104.8 1\n10461 1\n10468 1\n10469 1\n105,000 1\n105.2 1\n105.39 1\n105.5 1\n1051 1\n106,100 1\n106.2 1\n106.6 1\n106.7 1\n107,100 1\n107.50 1\n107.87 1\n107.9 1\n1075 1\n108.2 1\n108.28 1\n108.3 1\n108.625 1\n108.8 1\n1080 1\n109,000 1\n109.25 1\n109.50 1\n109.66 1\n109.798 1\n109.82 1\n1099 1\n10:05 1\n10:07 1\n10:08 1\n10:09:13 1\n10:10 1\n10:13 1\n10:16 1\n10:20 1\n10:33 1\n10:39:03 1\n10:44 1\n10:46 1\n10:46:08 1\n10:51 1\n10:53 1\n10:57:32 1\n10:9 1\n10MM 1\n11,429,243 1\n11,450 1\n11,580 1\n11,586 1\n11,600 1\n11,742,368 1\n11,775,000 1\n11,795 1\n11,820,000 1\n11- 1\n11. 
1\n11.0 1\n11.01 1\n11.07 1\n11.08 1\n11.11 1\n11.125 1\n11.13 1\n11.22 1\n11.41 1\n11.44 1\n11.56 1\n11.57 1\n11.66 1\n11.71 1\n11.711 1\n11.75 1\n11.79 1\n11.80 1\n11.88 1\n11.91 1\n11/08/2000 1\n11/1/01 1\n11/13/2000 1\n11/14/2000 1\n11/15/2000 1\n11/16/2005 1\n11/20/2004 1\n11/8/2000 1\n110.1 1\n110.380 1\n110.4 1\n110.625 1\n110.9 1\n110074404275 1\n11030 1\n111,000 1\n111.2 1\n111.9 1\n112,000 1\n112,383 1\n112.16 1\n112.2 1\n112.625 1\n112.9 1\n114.2 1\n114.6 1\n114.63 1\n116,385,000 1\n116,800 1\n116.56 1\n116.8 1\n117.2 1\n117.375 1\n117.7 1\n117.9 1\n117.94 1\n118.2 1\n119.2 1\n1190.43 1\n1191.86 1\n1194 1\n1199.32 1\n11:05 1\n11:06 1\n11:08 1\n11:13 1\n11:15:11 1\n11:16:00 1\n11:16:04 1\n11:21 1\n11:23 1\n11:24 1\n11:25 1\n11:26 1\n11:33 1\n11:34 1\n11:36 1\n11:37 1\n11:39am 1\n11:42 1\n11:50 1\n11:54 1\n11:57 1\n11:58 1\n11:59 1\n11the 1\n12's 1\n12,012 1\n12,017,724 1\n12,092 1\n12,275 1\n12,281 1\n12,283,217 1\n12,345 1\n12,500,000 1\n12,522 1\n12,573,758 1\n12,591 1\n12,675 1\n12,822,563 1\n12,915,000 1\n12.05 1\n12.09 1\n12.10 1\n12.12 1\n12.125 1\n12.25 1\n12.375 1\n12.38 1\n12.39 1\n12.43 1\n12.44 1\n12.47 1\n12.48 1\n12.49 1\n12.50 1\n12.57 1\n12.60 1\n12.62 1\n12.66 1\n12.875 1\n12.9375 1\n12.94 1\n12.97 1\n12.99 1\n12/06/2006 1\n12/24/1427 1\n12/26/2004 1\n12/27/1427 1\n12/31 1\n12/31/2006 1\n120.6 1\n120.8 1\n1205.01 1\n121.2 1\n1210.70 1\n122,700 1\n122.1 1\n122.36 1\n122.4 1\n123,000 1\n123.1 1\n123.6 1\n123.7 1\n123.8 1\n123.9 1\n1236.66 1\n124,000 1\n124,732 1\n124.2 1\n124.5 1\n1244 1\n125,075 1\n125,849 1\n125.1 1\n125.7 1\n1252 1\n1254 1\n126 1\n126,630,000 1\n126.1 1\n126.6 1\n126.68 1\n1263.51 1\n127,446 1\n127.47 1\n1271 1\n128.1 1\n128.19 1\n128.6 1\n128.9 1\n128K 1\n129.137. 
1\n129.24 1\n129.3 1\n129.38 1\n129.48 1\n129.6 1\n129.62 1\n129.63 1\n129.72 1\n129.84 1\n129.90 1\n129.97 1\n12:05:53 1\n12:06 1\n12:07 1\n12:18:26 1\n12:24 1\n12:38 1\n12:42:55 1\n12:45 1\n12:53 1\n12:54 1\n12:58:27 1\n13,249 1\n13,865,000 1\n13.15 1\n13.18 1\n13.25 1\n13.26 1\n13.3 1\n13.34 1\n13.44 1\n13.63 1\n13.64 1\n13.78 1\n13.79 1\n13.81 1\n13.851 1\n13.855 1\n13.96 1\n13/12/1425 1\n13/32 1\n130.09 1\n130.1 1\n130.13 1\n130.2 1\n130.25 1\n130.36 1\n130.46 1\n130.73 1\n130.76 1\n130.80 1\n130.875 1\n131,146 1\n131.3 1\n131.34 1\n131.64 1\n132,000 1\n132,620,000 1\n132.00 1\n132.1 1\n1322 1\n1327 1\n13279 1\n133,000 1\n133.1 1\n133.8 1\n13339 1\n13381 1\n13389 1\n134,000 1\n134,550 1\n134,750,000 1\n134.2 1\n134.9 1\n1348 1\n135,000 1\n135,860,000 1\n135.09 1\n135.2 1\n135.4 1\n135.6 1\n135.9 1\n135s 1\n136,000 1\n136,800 1\n1360 1\n1368 1\n137,200 1\n137,550,000 1\n137.2 1\n137.20 1\n137.4 1\n137.5 1\n137.8 1\n137K 1\n138.625 1\n139.75 1\n139.857 1\n1393 1\n13:34:56 1\n13:41:01 1\n13D 1\n13s 1\n14,099 1\n14,500 1\n14,505 1\n14,560,000 1\n14,580,000 1\n14,789,000 1\n14.11 1\n14.24 1\n14.27 1\n14.31 1\n14.4 1\n14.43 1\n14.44 1\n14.50 1\n14.53 1\n14.60 1\n14.70 1\n14.76 1\n14.85 1\n14.9 1\n14.933 1\n14.95 1\n14.97 1\n14/32 1\n140.1 1\n140.106 1\n140.74 1\n140.91 1\n140.95 1\n140.97 1\n14000 1\n141,903 1\n141.1 1\n141.162. 1\n141.33 1\n141.35 1\n141.57 1\n141.60 1\n141.85 1\n141.93 1\n141.95 1\n142,117 1\n142.02 1\n142.15 1\n142.17 1\n142.2 1\n142.25 1\n142.3 1\n142.32 1\n142.4 1\n142.40 1\n142.55 1\n142.80 1\n142.95 1\n1424 1\n1427 1\n1428 1\n143,000 1\n143,178 1\n143,534 1\n143,800 1\n143.4 1\n143.6 1\n143.88 1\n144,610 1\n144.1 1\n144.32. 
1\n144.35 1\n144.4 1\n144.5 1\n144.584 1\n144.9 1\n145,000 1\n145,954 1\n145.2 1\n145.4 1\n145.45 1\n145.7 1\n146,460,000 1\n146.3 1\n1462.93 1\n147 1\n147,121 1\n147,300 1\n147.5 1\n147.6 1\n1470 1\n1472.76 1\n148,000 1\n148.85 1\n1480 1\n1483 1\n149.3 1\n149.5 1\n149.69 1\n14:28 1\n14:57 1\n14:57:49 1\n15,015,000 1\n15,261 1\n15,417 1\n15,845,000 1\n15-fold 1\n15.02 1\n15.09 1\n15.31 1\n15.34 1\n15.4 1\n15.418 1\n15.43 1\n15.44 1\n15.64 1\n15.65 1\n15.81 1\n15.85 1\n15/11/1427 1\n150.2 1\n150.7 1\n150.8 1\n150301 1\n1507.37 1\n151.8 1\n1519 1\n152,000 1\n152.08 1\n152.14 1\n152.62 1\n1523.22 1\n1525 1\n1528 1\n153,000 1\n153.3 1\n153.9 1\n153.93 1\n1533 1\n1535 1\n15360 1\n153B09 1\n154.05 1\n1542 1\n155.039 1\n155.1 1\n155.15 1\n155.3 1\n155.4 1\n155.7 1\n155.9 1\n1550 1\n1551 1\n1554 1\n1555 1\n1559 1\n155mm 1\n156,000 1\n156.12 1\n156.3 1\n156.6 1\n156.8 1\n1562 1\n15656. 1\n157.2 1\n157.78 1\n157.8 1\n1570 1\n1572 1\n1574 1\n1576 1\n158,300 1\n158,863 1\n158.2 1\n1588 1\n159.7 1\n159.92 1\n1590 1\n1593 1\n15:00 1\n15:37:50 1\n15:40:06 1\n16,000,000,000 1\n16,250 1\n16,500 1\n16,746 1\n16,800,000 1\n16.0 1\n16.02 1\n16.03 1\n16.05 1\n16.08 1\n16.20 1\n16.22 1\n16.25 1\n16.38 1\n16.436 1\n16.50 1\n16.56 1\n16.625 1\n16.66 1\n16.7 1\n16.8 1\n16.88 1\n16.97 1\n16/2/1426 1\n160.1 1\n160.4 1\n1600's 1\n1605 1\n161.3 1\n1610 1\n1611 1\n1614 1\n1618 1\n162,190 1\n162,767 1\n162.1 1\n162.19 1\n163,000 1\n163.06 1\n163.2 1\n164 1\n1647 1\n165,000 1\n165.1 1\n165.9 1\n166,537 1\n166.4 1\n166.8 1\n1666 1\n1670 1\n1678.5 1\n168.50 1\n168.7 1\n169.28 1\n169.81 1\n16:00 1\n16:24:53 1\n17,500 1\n17,699 1\n17.06 1\n17.12 1\n17.19 1\n17.20 1\n17.22 1\n17.25 1\n17.375 1\n17.39 1\n17.47 1\n17.7 1\n17.73 1\n17.83 1\n17.92 1\n17.97 1\n170,000 1\n170.6 1\n170.65 1\n1700 1\n1701.7 1\n1707 1\n171.04 1\n171.9 1\n1717 1\n1721.4 1\n172nd 1\n173.3 1\n173.5 1\n1730.7 1\n1739.3 1\n174.5 1\n174.8 1\n1744 1\n175.2 1\n175.4 1\n175.5 1\n1751.9 1\n1757 1\n1758.5 1\n176,470 1\n176.1 1\n176.4 
1\n176.7 1\n1761.0 1\n177.3 1\n177.4 1\n1772.1 1\n1772.6 1\n1775 1\n178.0 1\n178.8 1\n17820 1\n1789 1\n179,032 1\n179.916 1\n1797 1\n17:07:38 1\n17:41:07 1\n18,136 1\n18,300 1\n18,644 1\n18.11 1\n18.125 1\n18.2 1\n18.27 1\n18.3 1\n18.32 1\n18.35 1\n18.443 1\n18.46 1\n18.49 1\n18.56 1\n18.6 1\n18.69 1\n18.73 1\n18.85 1\n180.3 1\n180.7 1\n1800s 1\n1803 1\n1807 1\n1809 1\n181.9 1\n1810 1\n1812 1\n1815 1\n1818 1\n1819 1\n182,059 1\n182.1 1\n182.6 1\n182.9 1\n1820 1\n1825 1\n183,467 1\n1832 1\n1836 1\n1837 1\n184.4 1\n184.9 1\n1844 1\n1845 1\n185.5 1\n185.7 1\n1850 1\n186,000 1\n186.1 1\n186.4 1\n1864 1\n1866 1\n187.1 1\n187.4 1\n187.8 1\n1870s 1\n1875 1\n1878 1\n188,726 1\n188.1 1\n188.2 1\n188.5 1\n188.7 1\n188.89 1\n1882 1\n1883 1\n1884 1\n1885 1\n189.32 1\n189.52 1\n189.8 1\n1890 1\n1892 1\n1893 1\n1894 1\n1896 1\n18:1 1\n18:2 1\n18:3 1\n18:4 1\n18:5 1\n18:53:26 1\n19,000 1\n19,395 1\n19.1 1\n19.125 1\n19.3 1\n19.30 1\n19.4 1\n19.51 1\n19.54 1\n19.60 1\n19.62 1\n19.625 1\n19.65 1\n19.69 1\n19.72 1\n19.75 1\n19.76 1\n19.93 1\n19.98 1\n19/11/2004 1\n190.1 1\n190.125 1\n190.3 1\n1901 1\n1903 1\n1904 1\n1905 1\n191.1 1\n191.2 1\n191.3 1\n191.4 1\n191.9 1\n1914 1\n192 1\n192.1 1\n192.12 1\n192.9 1\n19204 1\n1921 1\n1924 1\n1930's 1\n1934 1\n1938 1\n194 1\n194,000 1\n194.24 1\n194.69 1\n1946:258 1\n195.19 1\n195.4 1\n1950's 1\n1952 1\n196,785 1\n196.1 1\n196.2 1\n196.7 1\n196.8 1\n198.1 1\n198.41 1\n199,203 1\n199.6 1\n199.7 1\n199.8 1\n1990's 1\n19912000 1\n19931999 1\n19:23:41 1\n19:34:54 1\n1: 1\n1:00pm 1\n1:20 1\n1:45.994 1\n1Q 1\n1st_top 1\n2,000th 1\n2,008,434 1\n2,010 1\n2,046 1\n2,048 1\n2,050 1\n2,052.10 1\n2,057,750,000 1\n2,060 1\n2,070 1\n2,080 1\n2,157,656 1\n2,204.62 1\n2,210 1\n2,290 1\n2,331,100 1\n2,379 1\n2,387,226 1\n2,412 1\n2,425,000 1\n2,437 1\n2,440 1\n2,472 1\n2,480 1\n2,490 1\n2,499 1\n2,537 1\n2,580 1\n2,600,000 1\n2,605 1\n2,610 1\n2,633,700 1\n2,660 1\n2,664,098 1\n2,680 1\n2,750 1\n2,760 1\n2,800 1\n2,809 1\n2,822,000 1\n2,840 1\n2,850,000 
1\n2,853,000 1\n2,888 1\n2,888,000 1\n2,890 1\n2,909,827 1\n2,910,198 1\n2,936 1\n2,940 1\n2,960 1\n2.007 1\n2.025 1\n2.0476 1\n2.094 1\n2.11 1\n2.12 1\n2.175 1\n2.18 1\n2.180 1\n2.20 1\n2.285 1\n2.3125 1\n2.320 1\n2.39 1\n2.42 1\n2.4225 1\n2.428 1\n2.4375 1\n2.48 1\n2.49 1\n2.52 1\n2.524 1\n2.54 1\n2.55 1\n2.616 1\n2.627 1\n2.72 1\n2.83 1\n2.86 1\n2.8896 1\n2.8956 1\n2.91 1\n2.94 1\n2.9428 1\n2.9429 1\n2.9495 1\n2.9511 1\n2.9622 1\n2.97 1\n2/14/2005 1\n2/15/2005 1\n2/6/1424 1\n2/7/2005 1\n20,000,000 1\n20. 1\n20.2 1\n20.20 1\n20.212 1\n20.24 1\n20.25 1\n20.33 1\n20.375 1\n20.38 1\n20.39 1\n20.48 1\n20.56 1\n20.625 1\n20.85 1\n20.875 1\n20/32 1\n200,000,000,000 1\n200,843 1\n200...@gmail.com<MSN%EF%BC%9A200...@gmail.com> 1\n200.2 1\n200.3 1\n200.5 1\n200.70 1\n20001 1\n20005 1\n20006 1\n2003/2007 1\n2006-P1 1\n20067 1\n20070104 1\n20070107 1\n200th 1\n201,028 1\n201,870 1\n201.2 1\n202-466-9142 1\n202-828-3372 1\n202.429.1700 1\n202.582.1234 1\n202.637.4781 1\n202.739.0134 1\n202.785.0786 1\n20237 1\n2029 1\n203-719-7031 1\n203-719-8385 1\n203.2 1\n203.5 1\n203.674.7723 1\n203.961.7523 1\n2030 1\n20320 1\n204.3 1\n204.5 1\n204.8 1\n204s 1\n205.3 1\n2050 1\n206.3 1\n206.87 1\n207,000 1\n207.4 1\n2070 1\n2076 1\n208.185.9.024 1\n208.8 1\n2082.1 1\n20886 1\n20:45 1\n21,153 1\n21,600 1\n21,687 1\n21,900 1\n21.03 1\n21.18 1\n21.23 1\n21.30 1\n21.33 1\n21.42 1\n21.625 1\n21.71 1\n21.72 1\n21.88 1\n21.91 1\n21.93 1\n21.98 1\n21/12/1427 1\n210,000 1\n210.2 1\n210.3 1\n210.8 1\n2100 1\n21000089 1\n2102.2 1\n211 1\n211,666 1\n211.6 1\n2112.2 1\n2117.1 1\n212.1 1\n212.5 1\n2120.5 1\n21202 1\n2129.4 1\n213,000 1\n213.2 1\n213.97 1\n2135.5 1\n214,000 1\n214.4 1\n214.54 1\n2142.6 1\n215,000 1\n215,845 1\n215.04 1\n215.35 1\n215.42 1\n215.48 1\n215.86 1\n216.49 1\n216.74 1\n2161.9 1\n217,000 1\n217.5 1\n217.9 1\n2170 1\n2170.1 1\n2176.9 1\n2179.1 1\n218,000 1\n2189 1\n2189.3 1\n2189.7 1\n219.19 1\n219.27 1\n2192 1\n2195 1\n21:09:59 1\n21:10:06 1\n21:10:52 1\n21:12:01 1\n21:38:45 
1\n22,101 1\n22,300 1\n22,336 1\n22,600 1\n22,750,000 1\n22,925 1\n22,985,000 1\n22.02 1\n22.2 1\n22.21 1\n22.26 1\n22.3 1\n22.61 1\n22.62 1\n22.7 1\n22.70 1\n22.75 1\n22.76 1\n22.82 1\n22/11/1427 1\n220.178 1\n221.61 1\n2210 1\n222.8 1\n222.875 1\n2223 1\n223.2 1\n223.3 1\n223.7 1\n2233.9 1\n224.5 1\n224.75 1\n225.5 1\n225.7 1\n226 1\n226.5 1\n227.1 1\n227.3 1\n22761792 1\n228 1\n228,000 1\n229,800 1\n229.03 1\n22:1 1\n22:22 1\n22:51:46 1\n22s 1\n22th 1\n23,114 1\n23.0 1\n23.031 1\n23.11 1\n23.125 1\n23.18 1\n23.195 1\n23.3 1\n23.31 1\n23.34 1\n23.4 1\n23.50 1\n23.500 1\n23.53 1\n23.65 1\n23.93 1\n2301 1\n2308 1\n231,000 1\n231,405 1\n232.12 1\n232.4 1\n232.6 1\n233 1\n233,000 1\n2338 1\n234.3 1\n234.5 1\n234027 1\n235,000 1\n235-1972 1\n235.5 1\n236.23 1\n236.8 1\n237 1\n237.1 1\n238.15 1\n238.3 1\n23:30 1\n23BN 1\n24,891 1\n24,985,000 1\n24,999 1\n24.05 1\n24.1 1\n24.3 1\n24.50 1\n24.6 1\n24.68 1\n24.97 1\n240.8 1\n240.86 1\n240SX 1\n241.6 1\n241.7 1\n241.9 1\n2410 1\n2423.9 1\n243 1\n243,000 1\n243,677 1\n243.2 1\n243.4 1\n244,000 1\n244.2 1\n244.6 1\n244.8 1\n245.3 1\n246.60 1\n246.9 1\n247,000 1\n247.3 1\n247.6 1\n248,279 1\n248.2 1\n248.3 1\n248.91 1\n249 1\n249.5 1\n249.68 1\n2493 1\n25.12 1\n25.125 1\n25.25 1\n25.50 1\n25.51 1\n25.7 1\n25.78 1\n25.96 1\n25/01/2001 1\n250.2 1\n250.80 1\n25000 1\n250000 1\n251.2 1\n251.8 1\n252.5 1\n254,200 1\n255,923 1\n255.8 1\n256 1\n256.18 1\n257 1\n257.5 1\n258,000 1\n258.4 1\n258.9 1\n259.3 1\n26,206 1\n26,350 1\n26.02 1\n26.125 1\n26.29 1\n26.48 1\n26.54 1\n26.6 1\n26.8 1\n26.805 1\n26.81 1\n26.875 1\n260,000 1\n260.5 1\n261 1\n2611.68 1\n2613.73 1\n262.4 1\n263,684 1\n263.2 1\n2640 1\n2642.64 1\n2642.88 1\n265,000 1\n265.79 1\n266.5 1\n2665.66 1\n2676.60 1\n2679.72 1\n268.3 1\n268.6 1\n268.98 1\n2680 1\n2681.22 1\n2681.76 1\n2687.53 1\n269.3 1\n2692.65 1\n27,225 1\n27,500 1\n27,700 1\n27,890,000 1\n27.01 1\n27.125 1\n27.14 1\n27.2 1\n27.4 1\n27.49 1\n27.50 1\n27.68 1\n27.75 1\n27.875 1\n27.90 1\n27.95 1\n27/32 
1\n2700 1\n27000 1\n272 1\n272.16 1\n273,000 1\n273.5 1\n273.9 1\n274,475 1\n274.2 1\n276 1\n276-4459 1\n278.4 1\n278.7 1\n279.0 1\n279.39 1\n279.75 1\n279.8 1\n28.1 1\n28.125 1\n28.15 1\n28.2 1\n28.25 1\n28.375 1\n28.43 1\n28.55 1\n28.62 1\n28.625 1\n28.71 1\n28.8 1\n280.5 1\n280.7 1\n2800 1\n281 1\n281-514-3183 1\n281-518-1081 1\n281-518-9526 1\n281-735-5919 1\n281.2 1\n281.848.1619 1\n282.08 1\n283.2 1\n283.3 1\n283.9 1\n284 1\n284,500 1\n286.6 1\n286.8 1\n288,000 1\n2890 1\n2899 1\n29,000 1\n29,400 1\n29,700 1\n29.1 1\n29.18 1\n29.25 1\n29.5 1\n29.583 1\n29.66 1\n29.75 1\n29.9 1\n29.90 1\n29/11/1427 1\n29/12/1427 1\n29/4/1426 1\n290,541 1\n290,782 1\n290.1 1\n290.19 1\n291,890 1\n291.6 1\n2917 1\n293.29 1\n293.7 1\n293.9 1\n294.6 1\n295 1\n295.7 1\n296.95 1\n297 1\n297,446 1\n297.1 1\n298 1\n299,000 1\n2: 1\n2:00:25 1\n2:1 1\n2:21 1\n2:43 1\n2:53 1\n2:58 1\n2;30 1\n2C 1\n2UE 1\n2s 1\n3,027,330 1\n3,040,000 1\n3,102,935 1\n3,111,000 1\n3,175 1\n3,210 1\n3,250,000 1\n3,300,000 1\n3,350 1\n3,363,949 1\n3,372 1\n3,383,477 1\n3,390 1\n3,420,936 1\n3,437 1\n3,481,887 1\n3,513,072 1\n3,524,000 1\n3,609,800 1\n3,632 1\n3,776 1\n3,800 1\n3,820,634 1\n3,855.60 1\n3,950 1\n3,993,310 1\n3- 1\n3-5297 1\n3.00 1\n3.02 1\n3.06 1\n3.07 1\n3.11 1\n3.14 1\n3.17 1\n3.21 1\n3.22 1\n3.28 1\n3.30 1\n3.324 1\n3.34 1\n3.365 1\n3.38 1\n3.3V 1\n3.44 1\n3.48 1\n3.49 1\n3.50 1\n3.51 1\n3.526 1\n3.54 1\n3.59 1\n3.61 1\n3.66 1\n3.71 1\n3.72 1\n3.78 1\n3.80 1\n3.82 1\n3.83 1\n3.839 1\n3.846 1\n3.865 1\n3.87 1\n3.89 1\n3.91 1\n3.92 1\n3.98 1\n3/03 1\n3/9/2005 1\n30+ 1\n30,180 1\n30.02 1\n30.09 1\n30.25 1\n30.41 1\n30.84 1\n30.96 1\n30/32 1\n300m 1\n300th 1\n301.9 1\n302,000 1\n302,464 1\n303-294-4499 1\n303-832-8160 1\n303.00 1\n303.7 1\n303.9 1\n303rd 1\n305.7 1\n306 1\n306.6 1\n307,000 1\n307.2 1\n307.9 1\n309,381 1\n309,500 1\n309.3 1\n3090s 1\n31,000 1\n31,143 1\n31,777 1\n31.125 1\n31.375 1\n31.4 1\n31.48 1\n31.6 1\n31.65 1\n31.7 1\n31.75 1\n31.8 1\n31.875 1\n311.6 1\n313,125 1\n313,800 
1\n313.2 1\n314 1\n314,000 1\n315,546 1\n315.12 1\n315.5 1\n315.8 1\n318.6 1\n318.7 1\n318.79 1\n319,000 1\n32,191 1\n32.2 1\n32.4 1\n32.7 1\n32.9 1\n32.99 1\n320,000 1\n320.4 1\n320.5 1\n320.54 1\n320.94 1\n322.7 1\n323,000 1\n323.2 1\n323.4 1\n323.85 1\n324 1\n324.75 1\n324.9 1\n325.50 1\n326,000 1\n328.2 1\n328.85 1\n329,600 1\n329.2 1\n33,270 1\n33.1 1\n33.2 1\n33.375 1\n33.625 1\n33.75 1\n33.8 1\n33.875 1\n33.9 1\n33.90 1\n330.1 1\n3300 1\n3303 1\n331,400 1\n331.8 1\n332,000 1\n333 1\n333,000 1\n333.3 1\n334 1\n334,000 1\n334.5 1\n334.8 1\n335 1\n335,700 1\n336.4 1\n337 1\n3371.36 1\n3377.43 1\n339.12 1\n3392.49 1\n3398.65 1\n33rd 1\n34,215,000 1\n34,320 1\n34,500 1\n34.1 1\n34.25 1\n34.5 1\n34.6 1\n34.9 1\n340.36 1\n340.7 1\n340.83 1\n3406.31 1\n340B 1\n341,000 1\n341.16 1\n341.76 1\n3411.08 1\n3416.81 1\n342,122 1\n342.50 1\n342.60 1\n3425.22 1\n3426.33 1\n3427.39 1\n343,333 1\n344,000 1\n344,354 1\n345.5 1\n346 1\n347 1\n347.13 1\n347.16 1\n348.2 1\n349 1\n349,900 1\n34996.08 1\n34th 1\n35.125 1\n35.2 1\n35.23 1\n35.28 1\n35.38 1\n35.5 1\n35.508 1\n35.6 1\n35.7 1\n35.875 1\n35.9 1\n35015.38 1\n351 1\n351.2 1\n351.3 1\n351.5 1\n35107.56 1\n35242.65 1\n353 1\n353,500 1\n35374.22 1\n35378.44 1\n354,000 1\n354,600 1\n354.39 1\n354.7 1\n35417.44 1\n35442.40 1\n35452.72 1\n35486.38 1\n355.02 1\n355.35 1\n35526.55 1\n35527.29 1\n35544.47 1\n35544.87 1\n35549.44 1\n35585.52 1\n35586.60 1\n35587.85 1\n35588.36 1\n356.1 1\n35611.38 1\n35620 1\n35670 1\n35689.98 1\n357.2 1\n357.4 1\n357.5 1\n357.7 1\n35mm 1\n36,000,000,000 1\n36,015,194 1\n36.125 1\n36.13 1\n36.25 1\n36.7 1\n36.87 1\n36.9 1\n360,000 1\n360.1 1\n361,000 1\n361.5 1\n3636.06 1\n364.1 1\n3642.90 1\n366 1\n366.50 1\n366.55 1\n366.79 1\n366.85 1\n366.89 1\n367.10 1\n367.30 1\n367.40 1\n368.15 1\n368.24 1\n368.25 1\n368.3 1\n368.4 1\n368.5 1\n368.70 1\n369,000 1\n369.10 1\n37,000 1\n37,300 1\n37,820 1\n37,860 1\n37.2 1\n37.4 1\n37.875 1\n370.20 1\n370.58 1\n370.60 1\n370.8 1\n370.85 1\n371.1 1\n371.20 
1\n3717.46 1\n372.1 1\n372.50 1\n373 1\n373.40 1\n373.80 1\n374.6 1\n374.70 1\n375,000 1\n375.16 1\n375.9 1\n375.92 1\n376 1\n376,000 1\n376-9004 1\n376.36 1\n376.8 1\n376.80 1\n37625 1\n377.80 1\n378.07 1\n378.1 1\n378.30 1\n378.87 1\n379 1\n379.46 1\n38,489 1\n38.1 1\n38.32 1\n38.4 1\n38.6 1\n38.75 1\n38.9 1\n38.913 1\n380,000 1\n380.80 1\n3801a 1\n381,000 1\n382.2 1\n382.81 1\n382.9 1\n38282 1\n383 1\n385 1\n386,000 1\n387.4 1\n388.5 1\n389.6 1\n39,300 1\n39,400 1\n39.08 1\n39.125 1\n39.19 1\n39.2 1\n39.25 1\n39.31 1\n39.6 1\n39.68 1\n39.75 1\n39.787 1\n390.5 1\n391 1\n393.1 1\n393.4 1\n394.4 1\n395,000 1\n395,374 1\n395,700 1\n395,974 1\n395.3 1\n395.4 1\n398,000 1\n398,487 1\n398.3 1\n39938 1\n3994 1\n3: 1\n3:00pm 1\n3:10pm 1\n3:13 1\n3:15 1\n3:20 1\n3:29 1\n3COM 1\n3D 1\n3DSMAX 1\n3G 1\n3K 1\n3am 1\n3s 1\n4,090,000 1\n4,170 1\n4,199 1\n4,223,000 1\n4,280 1\n4,290 1\n4,300 1\n4,320 1\n4,343 1\n4,345 1\n4,346 1\n4,348 1\n4,469,167 1\n4,555 1\n4,600 1\n4,631,400 1\n4,695 1\n4,700 1\n4,750,000 1\n4,930 1\n4,995 1\n4,999 1\n4- 1\n4.00 1\n4.01 1\n4.02 1\n4.06 1\n4.060 1\n4.065 1\n4.0775 1\n4.10 1\n4.11 1\n4.13 1\n4.14 1\n4.22 1\n4.23 1\n4.27 1\n4.29 1\n4.31 1\n4.38 1\n4.39 1\n4.40 1\n4.41 1\n4.45 1\n4.46 1\n4.469 1\n4.47 1\n4.49 1\n4.51 1\n4.54 1\n4.58 1\n4.59 1\n4.65 1\n4.66 1\n4.6875 1\n4.70 1\n4.74 1\n4.82 1\n4.83 1\n4.84 1\n4.89 1\n4.93 1\n4/14/00 1\n4/4 1\n40,424 1\n40,800 1\n40.07 1\n40.125 1\n40.3 1\n40.5 1\n40.50 1\n40.86 1\n40.99 1\n400.0 1\n400.3 1\n400.4 1\n400000 1\n402,000 1\n402.7 1\n404,294 1\n405,000 1\n4053 1\n406,000 1\n407 1\n407.9 1\n408 1\n409,000 1\n4096 1\n41,900 1\n41.1 1\n41.18 1\n41.4 1\n41.5 1\n41.6 1\n41.725 1\n41.75 1\n41.78 1\n41.85 1\n41.9 1\n410-468-3499 1\n410-468-3798 1\n410.3 1\n410.4 1\n410.5 1\n4105SF 1\n411 1\n415.3 1\n415.9 1\n416,000 1\n417 1\n42,008 1\n42,374 1\n42,455 1\n42.0 1\n42.2 1\n42.375 1\n42.60 1\n42.75 1\n42.875 1\n420.68 1\n4200 1\n421 1\n422 1\n422.1 1\n423 1\n423.5 1\n423.9 1\n424 1\n424.3 1\n4241 1\n425.4 
1\n427,300 1\n427.7 1\n428 1\n428,000 1\n429.9 1\n43.2 1\n43.34 1\n43.6 1\n43.7 1\n43.875 1\n43.88 1\n43.95 1\n430,700 1\n430.3 1\n431 1\n432 1\n432.6 1\n432.61 1\n432.78 1\n432700 1\n432785 1\n433.2 1\n433.5 1\n4344 1\n435.5 1\n436 1\n436.3 1\n437.5 1\n437.68 1\n437.7 1\n438,845 1\n438.15 1\n43rd 1\n44,094 1\n44,796 1\n44.04 1\n44.08 1\n44.2 1\n44.375 1\n44.50 1\n44.6 1\n44.625 1\n44.7 1\n44.82 1\n44.875 1\n44.9 1\n44.92 1\n4400 1\n443 1\n443.6 1\n445,645 1\n445.23 1\n445.7 1\n446,000 1\n446.5 1\n447.76 1\n448 1\n448.49 1\n448.80 1\n449.89 1\n45.00 1\n45.085 1\n45.6 1\n45.66 1\n45.8 1\n45.9 1\n4500 1\n450000 1\n451.37 1\n451.6 1\n452.23 1\n452.76 1\n453,000 1\n453.05 1\n453.4 1\n453.57 1\n454 1\n454,100 1\n454.6 1\n454.86 1\n455 1\n455,000 1\n455,410 1\n455.29 1\n455.63 1\n456.2 1\n456.4 1\n457 1\n457.5 1\n457.52 1\n457.9 1\n458.32 1\n458.52 1\n458.8 1\n46,245,000 1\n46,835 1\n46,892 1\n46,995 1\n46.02 1\n46.1 1\n46.4 1\n46.50 1\n46.6 1\n46.77 1\n46.80 1\n46.959 1\n460,000 1\n460.05 1\n460.33 1\n4608 1\n46093 1\n461,200 1\n461,539,056 1\n461.6 1\n461.70 1\n461.9 1\n462,900 1\n462.2 1\n462.89 1\n463,000 1\n463.06 1\n463.28 1\n464.7 1\n465,000 1\n466 1\n466,000 1\n467.22 1\n468 1\n469 1\n469.6 1\n469.8 1\n47.125 1\n47.17 1\n47.24 1\n47.3 1\n47.46 1\n47.5 1\n47.50 1\n470,000 1\n470.67 1\n4700 1\n470th 1\n471.6 1\n472 1\n472.5 1\n473.29 1\n475.35 1\n475.6 1\n4756 1\n476.14 1\n477.00 1\n477.1 1\n4779 1\n4783 1\n479.7 1\n47K202!.DOC 1\n48,100 1\n48.22 1\n48.375 1\n48.462 1\n48.5 1\n48.56 1\n48.6 1\n48.7 1\n48.9 1\n48.93 1\n480.4 1\n4809 1\n480i 1\n481 1\n482.3 1\n483 1\n484 1\n485 1\n486.1 1\n486.30 1\n486.6 1\n486.74 1\n48604 1\n486tm 1\n487 1\n487.8 1\n488.60 1\n489.9 1\n48th 1\n49.125 1\n49.3 1\n49.375 1\n49.5 1\n49.70 1\n491.10 1\n4918 1\n491?2 1\n492 1\n493 1\n494,100 1\n494.4 1\n494.8 1\n495 1\n495,000 1\n496,116 1\n496.7 1\n497,400 1\n497.1 1\n498 1\n499.4 1\n49ers 1\n4: 1\n4:02 1\n4:12 1\n4:25 1\n4TH 1\n4gamer 1\n4p 1\n5,0 1\n5,088 1\n5,088,774 1\n5,100 1\n5,226 
1\n5,248 1\n5,267,238 1\n5,273 1\n5,377,000 1\n5,440 1\n5,441,000 1\n5,502 1\n5,599 1\n5,651 1\n5,700 1\n5,745,188 1\n5,760 1\n5,791 1\n5,810 1\n5,900 1\n5,960 1\n5- 1\n5.0 1\n5.01 1\n5.03 1\n5.05 1\n5.11 1\n5.12 1\n5.125 1\n5.13 1\n5.133 1\n5.17 1\n5.19 1\n5.1950 1\n5.20 1\n5.23 1\n5.277 1\n5.2830 1\n5.29 1\n5.315 1\n5.33 1\n5.36 1\n5.37 1\n5.38 1\n5.39 1\n5.40 1\n5.435 1\n5.44 1\n5.47 1\n5.49 1\n5.52 1\n5.53 1\n5.56 1\n5.63 1\n5.67 1\n5.76 1\n5.77 1\n5.79 1\n5.8125 1\n5.84 1\n5.85 1\n5.86 1\n5.934 1\n5.98 1\n5/1/01 1\n5/1/1428 1\n5/100 1\n5/28/00 1\n5/30/00 1\n50's 1\n50+ 1\n50,005,000 1\n50,085 1\n50,400 1\n50.01 1\n50.4 1\n50.46 1\n50.5 1\n50.59 1\n50.8 1\n50.9375 1\n500.00 1\n500.20 1\n500.26 1\n501,200 1\n501.61 1\n502 1\n502,000 1\n502.1 1\n503.1 1\n504,200 1\n504.2 1\n504.5 1\n505 1\n508012651 1\n51,911,566 1\n51.23 1\n51.4 1\n51.65 1\n51.8 1\n51.81 1\n510 1\n510,000 1\n510.6 1\n515.1 1\n515.4 1\n516.9 1\n517 1\n517,500 1\n517.85 1\n518.7 1\n519 1\n51st 1\n52,012 1\n52.1 1\n52.125 1\n52.25 1\n52.4 1\n52.50 1\n52.75 1\n520 1\n520,000 1\n521 1\n521.2 1\n521.4 1\n522.3 1\n523,920,214 1\n524.5 1\n525,546 1\n525.8 1\n527 1\n528.3 1\n528.4 1\n528.56 1\n529 1\n53,000 1\n53,496,665 1\n53.25 1\n53.6 1\n53.75 1\n532,000 1\n5320 1\n534 1\n534,000 1\n534.3 1\n535 1\n535,322 1\n535-4000 1\n53596 1\n536,000 1\n537.91 1\n538 1\n538,000 1\n538.5 1\n539.4 1\n54.1 1\n54.3 1\n54.50 1\n54.51 1\n54.58 1\n54.625 1\n54.75 1\n54.875 1\n54.9 1\n540,000 1\n540.9 1\n543,000 1\n543.5 1\n544 1\n544,681 1\n545.3 1\n545.96 1\n547 1\n547,000 1\n547,347,585 1\n549 1\n549,365 1\n549.9 1\n5498 1\n54C 1\n55,008 1\n55,500 1\n55.10 1\n55.375 1\n55.59 1\n55.8 1\n55.875 1\n552 1\n552,302 1\n554 1\n555 1\n555.5 1\n555.6 1\n556.5 1\n557,000 1\n558 1\n558.50 1\n56,000 1\n56,565,000 1\n56,900 1\n56.1 1\n56.13 1\n56.4 1\n56.8 1\n562 1\n563.8 1\n564.452 1\n565,000 1\n569,000 1\n56th 1\n57,000 1\n57.1 1\n57.125 1\n57.2 1\n57.348 1\n57.4 1\n57.50 1\n57.625 1\n57.665 1\n57.7 1\n57.8 1\n57.82 1\n57.87 
1\n57.9 1\n570,000 1\n5703 1\n571 1\n575.1 1\n577 1\n577.3 1\n58.2 1\n58.3 1\n58.6 1\n58.75 1\n58.8 1\n58.97 1\n582.6 1\n5836 1\n58369 1\n586 1\n588,300 1\n588,350,000 1\n588,800 1\n59.2 1\n59.50 1\n59.7 1\n59.8 1\n59.9 1\n590.7 1\n591.239 1\n592-2277 1\n593 1\n593.5 1\n595 1\n596.8 1\n597 1\n597.8 1\n599 1\n599.4 1\n599.9 1\n5: 1\n5:04 1\n5:26 1\n5:30pm 1\n5:33 1\n5:40 1\n5B 1\n5k 1\n6,050 1\n6,165 1\n6,256 1\n6,320 1\n6,379,884 1\n6,400 1\n6,420,268 1\n6,475,000 1\n6,480 1\n6,495 1\n6,499 1\n6,500,000 1\n6,542,000 1\n6,622 1\n6,727,042 1\n6,744,600 1\n6,805 1\n6,840 1\n6,881 1\n6- 1\n6.0 1\n6.02 1\n6.03 1\n6.033 1\n6.056 1\n6.08 1\n6.11 1\n6.16 1\n6.172 1\n6.18 1\n6.19 1\n6.23 1\n6.235 1\n6.24 1\n6.26 1\n6.27 1\n6.31 1\n6.34 1\n6.35 1\n6.36 1\n6.424 1\n6.43 1\n6.47 1\n6.51 1\n6.513 1\n6.56 1\n6.58 1\n6.59 1\n6.61 1\n6.625 1\n6.63 1\n6.65 1\n6.66 1\n6.71 1\n6.81 1\n6.8354 1\n6.85 1\n6.87 1\n6.91 1\n6.94 1\n6.96 1\n6.97 1\n6.98 1\n6.99 1\n6/01/2007 1\n6/1/01 1\n6/4/11 1\n60's 1\n60,008 1\n60.06 1\n60.2 1\n60.3 1\n60.5 1\n60.6 1\n60.7 1\n60.9 1\n601 1\n601.3 1\n602 1\n603 1\n604.72 1\n605 1\n60A 1\n60K 1\n61,000,000,000 1\n61,493 1\n61.1 1\n61.125 1\n61.4 1\n61.5 1\n61.7 1\n61.83 1\n61.875 1\n61.98 1\n61007 1\n611 1\n613.2 1\n613.7 1\n614 1\n614.5 1\n614.6 1\n614.65 1\n615 1\n615,000 1\n615,874 1\n616 1\n618,000 1\n618.1 1\n618.6 1\n618.9 1\n619 1\n619-231-9449 1\n619-696-6966 1\n619.8 1\n62,500 1\n62,800 1\n62,872 1\n62.04 1\n62.1 1\n62.2 1\n62.3 1\n62.36 1\n62.4 1\n62.50 1\n62.6 1\n62.70 1\n62.75 1\n620 1\n620.5 1\n621 1\n622 1\n623.5 1\n626 1\n626.3 1\n629 1\n62nd 1\n63,971 1\n63.1 1\n63.25 1\n63.6 1\n63.875 1\n630,000 1\n63007 1\n631,163 1\n633.8 1\n635 1\n637 1\n637.5 1\n638,000 1\n639 1\n639.9 1\n64,000 1\n64,100 1\n64.1 1\n64.125 1\n64.2 1\n64.432 1\n64.73 1\n640,000 1\n641.5 1\n642 1\n643.3 1\n643.4 1\n644,500 1\n646-2495 1\n646-3490 1\n648 1\n65,619 1\n65.4 1\n65.53 1\n65.545 1\n65.6 1\n65.9 1\n650.9 1\n6500 1\n65146050 1\n654.5 1\n655 1\n656.5 1\n6565 
1\n657 1\n66,743 1\n66.06 1\n66.5 1\n66.50 1\n66.6 1\n66.80 1\n66.9 1\n660,000 1\n662 1\n663 1\n663,000 1\n664.3 1\n666 1\n666,666 1\n667 1\n67,000 1\n67,400 1\n67,972 1\n67.1 1\n67.177. 1\n67.40 1\n67.7 1\n67.75 1\n67.8 1\n67.9 1\n670.3 1\n671 1\n672 1\n673.3 1\n675 1\n675,400,000 1\n676.5 1\n677 1\n679.5 1\n67th 1\n68,000 1\n68,548 1\n68.1 1\n68.109. 1\n68.4 1\n68.42 1\n68.9 1\n68.92. 1\n680.6 1\n682.7 1\n683 1\n683,000 1\n685,000 1\n686.7 1\n687 1\n688.77 1\n6889 1\n69,000 1\n69,105 1\n69,553 1\n69,980 1\n69.1 1\n69.15. 1\n69.2 1\n69.6 1\n69.8 1\n691.09 1\n693 1\n693.4 1\n696 1\n696.1 1\n6:03 1\n6:21 1\n6:33 1\n6:45 1\n6:50 1\n6TH 1\n7,0 1\n7,400 1\n7,440 1\n7,580 1\n7,592,988 1\n7,600 1\n7,800 1\n7,839 1\n7,841 1\n7- 1\n7.02 1\n7.04 1\n7.05 1\n7.06 1\n7.08 1\n7.0808 1\n7.081 1\n7.0826 1\n7.125 1\n7.13 1\n7.145 1\n7.160 1\n7.163 1\n7.17 1\n7.22 1\n7.24 1\n7.26 1\n7.2984 1\n7.33 1\n7.34 1\n7.375 1\n7.38 1\n7.43 1\n7.445 1\n7.47 1\n7.56 1\n7.58 1\n7.63 1\n7.649 1\n7.66 1\n7.68 1\n7.694 1\n7.70 1\n7.71 1\n7.79 1\n7.84 1\n7.86 1\n7.889 1\n7.904 1\n7.937 1\n7.955 1\n7/100ths 1\n7/25/2006 1\n70,315,000 1\n70,765 1\n70.6 1\n70.7 1\n702.4 1\n703-868-7896 1\n703.729.2710 1\n704.4 1\n705 1\n707 1\n708 1\n708,000 1\n71,800 1\n71,895 1\n71.25 1\n71.36 1\n71.6 1\n71.7 1\n71.75 1\n710-6084 1\n710.5 1\n711.9 1\n712 1\n713-546-5000 1\n713-790-2605 1\n713-793-1429 1\n713-793-2000 1\n713-819-2784 1\n713-853-3044 1\n713.306.7940 1\n713.5 1\n713.837.1638 1\n713.864.4149 1\n713/853-5025 1\n715.1 1\n717,000 1\n719,000 1\n72,000 1\n72.15 1\n72.224. 1\n72.6 1\n722 1\n724,579 1\n724.4 1\n725.8 1\n728 1\n728.5 1\n728.8 1\n729.04 1\n73,100 1\n73,803 1\n73.1 1\n73.50 1\n73.6 1\n73.9 1\n73.97 1\n730,000 1\n730.1 1\n730.37 1\n734.2 1\n734.8 1\n735 1\n736 1\n74,000 1\n74,351 1\n74.125 1\n74.139. 1\n74.20 1\n74.35 1\n74.71. 
1\nE@tG 1\nEARTHHHHHHH 1\nEAT 1\nEATTING 1\nEBay 1\nEDA 1\nEDI 1\nEE 1\nEENTY 1\nEESI 1\nEGYPT 1\nEI 1\nEI.London 1\nEKG 1\nEL 1\nELECTRONICS 1\nELP 1\nEMBAs 1\nEMC 1\nEMERCOM 1\nEMI 1\nEMPIRE 1\nEMPLOYEE 1\nENFIELD 1\nENG 1\nENGLAND 1\nENGRAPH 1\nENJOYABLE 1\nENJOYS 1\nENOUGH 1\nENRON 1\nENRON-CPS 1\nENRONONLINE 1\nENRONR~1.DOC 1\nENTERTAINMENT 1\nENTIRE 1\nENVIRONMENT 1\nENVIRONMENTAL 1\nEPrints 1\nEQUIPMENT 1\nEQUITY 1\nERA 1\nES 1\nESA 1\nESL 1\nESL988s 1\nESOP 1\nESP 1\nESS 1\nESSG 1\nESTATE 1\nETL 1\nETS 1\nEVBLIN 1\nEVEREX 1\nEWC 1\nEWR 1\nEX 1\nEXAMINE 1\nEXBT 1\nEXCELS 1\nEXPANDS 1\nEXPECT 1\nEXPECTING 1\nEXPENSIVE 1\nEXPERIENCED 1\nEXXON 1\nEZ 1\nEager 1\nEar 1\nEarthForce 1\nEarthLink 1\nEarthquakes 1\nEarthworms 1\nEasiest 1\nEastate 1\nEastday 1\nEbasco 1\nEcco 1\nEcheandia 1\nEchelon 1\nEckhard 1\nEcologists 1\nEcology 1\nEconometric 1\nEdale 1\nEdberg 1\nEdelmann 1\nEdelstein 1\nEdemame 1\nEdgefield 1\nEdison@ENRON 1\nEdit 1\nEdita 1\nEditorials 1\nEditors 1\nEdits 1\nEdmar 1\nEdmark 1\nEdmunds.com 1\nEdnee 1\nEdouard 1\nEducating 1\nEducator 1\nEdwardJones 1\nEdzard 1\nEee 1\nEgad 1\nEggars 1\nEh 1\nEhrlichman 1\nEichner 1\nEichof 1\nEidani 1\nEighteenth 1\nEiji 1\nEileen 1\nEilon 1\nEinar 1\nEisai 1\nEisenstein 1\nEiszner 1\nEizenstat 1\nEjected 1\nEkhbariya 1\nEkhbariyah 1\nEkonomicheskaya 1\nElantra 1\nElasticity 1\nElbows 1\nElder 1\nElderly 1\nEldest 1\nEldred 1\nElecktra 1\nElect 1\nElectrician 1\nElectrochemical 1\nElectrolux 1\nElectromechanical 1\nElectrosurgery 1\nElegant 1\nElegy 1\nElement 1\nElephant 1\nEleuthra 1\nElevation 1\nElgin 1\nElianti 1\nEliminate 1\nEliot 1\nElisa 1\nElisha 1\nElista 1\nElites 1\nEljer 1\nElkin 1\nElkins 1\nElle 1\nEllesmere 1\nEllington 1\nEllman 1\nEllmann 1\nElmer 1\nElodie 1\nElon 1\nElrick 1\nElse 1\nElvador 1\nElvira 1\nElysee 1\nElysium 1\nEm-enro2.doc 1\nEmancipation 1\nEmbassies 1\nEmbattled 1\nEmbedding 1\nEmbittered 1\nEmboldened 1\nEmbrace 1\nEmbryo 1\nEmei 1\nEmergencies 1\nEmigration 1\nEmirati 
1\nEmotion 1\nEmphasis 1\nEmpowerment 1\nEncarta 1\nEncirclement 1\nEncourage 1\nEncouragement 1\nEncouraging 1\nEnderforth 1\nEnding 1\nEndurance 1\nEnduring 1\nEnergetic 1\nEnergie 1\nEnergieproduktiebedrijf 1\nEnergyphiles 1\nEnersen 1\nEnfolio 1\nEngagement 1\nEngelhardt 1\nEngineered 1\nEnglander 1\nEngler 1\nEnglishwoman 1\nEngraph 1\nEnhancements 1\nEniChem 1\nEnid 1\nEnjoyed 1\nEnlightenment 1\nEnneagram 1\nEnola 1\nEnormous 1\nEnquirer 1\nEnquiry 1\nEnright 1\nEnsler 1\nEntartete 1\nEnte 1\nEntebbe 1\nEntequia 1\nEntergy 1\nEntering 1\nEntertainer 1\nEntertaining 1\nEnthusiast 1\nEnthusiastic 1\nEnthusiasts 1\nEntree 1\nEntrekin 1\nEntries 1\nEpicurean 1\nEpinal 1\nEpinalers 1\nEpiphany 1\nEpiscopalians 1\nEpp 1\nEppel 1\nEppelmann 1\nEppler 1\nEpps 1\nEprex 1\nEpson 1\nEpstien 1\nEquifax 1\nEquilon 1\nEquipped 1\nEr 1\nErasing 1\nErath 1\nErburu 1\nErdolversorgungs 1\nErdos 1\nErensel 1\nErguna 1\nEricS 1\nErica 1\nEricks 1\nErickson 1\nErics 1\nEricsson 1\nEriksen 1\nErin 1\nErithmatic 1\nErkki 1\nErle 1\nErmanno 1\nErrol 1\nErroll 1\nError 1\nErskin 1\nErskine 1\nErtan 1\nEs 1\nEsber 1\nEscalante 1\nEscaping 1\nEscobar 1\nEscorts 1\nEscudome 1\nEskimo 1\nEslinger 1\nEsnard 1\nEsopus 1\nEspana 1\nEspanol 1\nEsplanade 1\nEssar 1\nEssayist 1\nEssential 1\nEsseri 1\nEstablish 1\nEstablished 1\nEstablishment 1\nEstadio 1\nEsteli 1\nEstimating 1\nEternal 1\nEternity 1\nEthanol 1\nEthel 1\nEthical 1\nEthicist 1\nEthnographer 1\nEthnographic 1\nEthylene 1\nEtienne 1\nEtta 1\nEtudes 1\nEtz 1\nEuchre 1\nEukanuba 1\nEulogic 1\nEunice 1\nEuphoria 1\nEuphrates 1\nEuroDisney 1\nEuroMed 1\nEuroconvertible 1\nEurodebentures 1\nEurodebt 1\nEuroissues 1\nEuronotes 1\nEuropass 1\nEuros 1\nEvaluating 1\nEvaluation 1\nEvaluations 1\nEvanell 1\nEvanston 1\nEvelyn 1\nEverglades 1\nEvers 1\nEveryman 1\nEveyone 1\nEvian 1\nEvils 1\nEvolution 1\nEvolutionary 1\nEvren 1\nExalted 1\nExam 1\nExaminers 1\nExcelsior 1\nExceptions 1\nExcerpt 1\nExcerpts 1\nExcise 1\nExcluded 
1\nExcludes 1\nExcuses 1\nExecutes 1\nExecution 1\nExecutions 1\nExercise 1\nExercises 1\nExhausted 1\nExhibit 1\nExisting 1\nExocet 1\nExocets 1\nExodus 1\nExotic 1\nExpecting 1\nExpects 1\nExpedia 1\nExpeditionary 1\nExpertises 1\nExploitation 1\nExploiting 1\nExplorers 1\nExplosions 1\nExpressionism 1\nExpressive 1\nExtending 1\nExtensions 1\nExteriors 1\nExtracted 1\nExtraordinary 1\nExtremely 1\nExtremism 1\nEyak 1\nEyewitnesses 1\nEzekiel 1\nEzra 1\nF%#king 1\nF'ers 1\nF.A. 1\nF.A.O. 1\nF.C 1\nF.D.R. 1\nF.E. 1\nF.J. 1\nF.R.S. 1\nF.S.B. 1\nF100 1\nF16s 1\nF18s 1\nFABULOUS 1\nFACING 1\nFAILED 1\nFAIR 1\nFAKE 1\nFALTERS 1\nFAMILY 1\nFAQ 1\nFARGO 1\nFARMERS 1\nFARMING 1\nFBC 1\nFCCC 1\nFCE 1\nFCP 1\nFE 1\nFEAR 1\nFEATHERS 1\nFELLED 1\nFESPIC 1\nFEWER 1\nFFARs 1\nFH 1\nFHLBB 1\nFICCI 1\nFIG 1\nFINANCES 1\nFIRMS 1\nFISA 1\nFIT 1\nFJ 1\nFL 1\nFLIGHT 1\nFLOOR 1\nFLY 1\nFMI 1\nFMs 1\nFOES 1\nFORBIDDEN 1\nFORD 1\nFORM 1\nFORMAL 1\nFORMALDEHYDE 1\nFORMAN 1\nFORMULA 1\nFPL 1\nFPS 1\nFRANCE 1\nFRANKLIN 1\nFREAK 1\nFREDERICK 1\nFREEDOM 1\nFREIGHTWAYS 1\nFRIENDLY 1\nFRIES 1\nFRINGE 1\nFROG 1\nFRTF 1\nFS 1\nFSHOD 1\nFSX 1\nFULL 1\nFULLY 1\nFURY 1\nFXTV 1\nFY09E 1\nFY10E 1\nFa'en 1\nFabbri 1\nFaber 1\nFabian 1\nFactions 1\nFactorex 1\nFactories 1\nFactoring 1\nFaculty 1\nFadel 1\nFaek 1\nFagen 1\nFagershein 1\nFagots 1\nFails 1\nFailures 1\nFairground 1\nFairmont 1\nFaithfully 1\nFaiz 1\nFaiza 1\nFake 1\nFakka 1\nFalaq 1\nFalcons 1\nFallen 1\nFallon 1\nFallout 1\nFalsehoods 1\nFalwell 1\nFamilies 1\nFangbo 1\nFangcheng 1\nFangfei 1\nFannin 1\nFanshi 1\nFaraj 1\nFaraya 1\nFarc 1\nFardh 1\nFares 1\nFarid 1\nFarmhouse 1\nFarming 1\nFarookh 1\nFarooq 1\nFarouk 1\nFarraj 1\nFarri 1\nFaruq 1\nFas 1\nFascism 1\nFashu 1\nFassbinder 1\nFastenal 1\nFatalities 1\nFathers 1\nFathi 1\nFats 1\nFaulkner 1\nFaulty 1\nFauquier 1\nFauvism 1\nFaux 1\nFawn 1\nFay 1\nFayed 1\nFearon 1\nFears 1\nFeather 1\nFeathered 1\nFeatures 1\nFeaturing 1\nFebuary 1\nFeddema 1\nFederalist 1\nFederico 1\nFeders 
1\nFedexed 1\nFee 1\nFeed 1\nFeelings 1\nFeels 1\nFeelup 1\nFeess 1\nFeet 1\nFego 1\nFeilibeiqiu 1\nFeitsui 1\nFeldemuehle 1\nFeldstein 1\nFelicity 1\nFeminist 1\nFending 1\nFengshuang 1\nFengxian 1\nFenn 1\nFenton 1\nFenway 1\nFergie 1\nFernand 1\nFernandina 1\nFerraro 1\nFerrell 1\nFerron 1\nFerry 1\nFery 1\nFester 1\nFestiva 1\nFeud 1\nFeudal 1\nFeynman 1\nFhimah 1\nFiala 1\nFias 1\nFiberCom 1\nFiberall 1\nFibreboard 1\nFibromyalgia 1\nFico 1\nFiedler 1\nFieldwork 1\nFienberg 1\nFierce 1\nFiery 1\nFife 1\nFifteenth 1\nFigadua 1\nFight 1\nFighters 1\nFiguratively 1\nFiled 1\nFilenes 1\nFilial 1\nFilled 1\nFillmore 1\nFilmworks 1\nFiltered 1\nFima 1\nFina 1\nFinancially 1\nFinancials 1\nFinanco 1\nFinanziario 1\nFindley 1\nFinis 1\nFinish 1\nFink 1\nFinley 1\nFinney 1\nFinns 1\nFinucane 1\nFiona 1\nFionandes 1\nFirangs 1\nFireWire 1\nFirearms 1\nFirebrand 1\nFireside 1\nFirey 1\nFisheries 1\nFishery 1\nFishkill 1\nFisk 1\nFiske 1\nFissette 1\nFitrus 1\nFittingly 1\nFitzsimmons 1\nFitzwilliam 1\nFitzwilliams 1\nFix 1\nFixing 1\nFizazi 1\nFizkultura 1\nFlad 1\nFlag 1\nFlags 1\nFlaherty 1\nFlamabable 1\nFlames 1\nFlamingo 1\nFlammable 1\nFlank 1\nFlankis 1\nFlats 1\nFlavio 1\nFlavo 1\nFlawless 1\nFlaws 1\nFleecing 1\nFleetwood 1\nFleihan 1\nFlemish 1\nFlesch 1\nFlex 1\nFlick 1\nFlickr 1\nFlights 1\nFlip 1\nFlippo 1\nFlooding 1\nFlop 1\nFloral 1\nFloridian 1\nFloridians 1\nFloss 1\nFlottl 1\nFluctuation 1\nFluent 1\nFlush 1\nFlushed 1\nFlute 1\nFlykus 1\nFo 1\nFodor 1\nFoerder 1\nFoggs 1\nFoguangshan 1\nFoiled 1\nFokker 1\nFolcroft 1\nFolding 1\nFollowed 1\nFollows 1\nFolly 1\nFolsom 1\nFoncier 1\nFond 1\nFonse 1\nFontainbleu 1\nFoong 1\nForbidden 1\nForced 1\nForcing 1\nFordham 1\nFords 1\nForecasts 1\nForeclosed 1\nForeclosure 1\nForemost 1\nForesight 1\nForfeiture 1\nForge 1\nForgetful 1\nForgetting 1\nForham 1\nFormby 1\nFormed 1\nFornos 1\nForrestal 1\nForry 1\nForstmann 1\nForte 1\nForza 1\nFos 1\nFoshan 1\nFossan 1\nFosset 1\nFostering 1\nFound 1\nFounding 
1\nFourteen 1\nFowler 1\nFoxsies 1\nFoy 1\nFramatome 1\nFramingham 1\nFrancaises 1\nFranchesco 1\nFranchisees 1\nFranciscan 1\nFranciso 1\nFrancona 1\nFranjieh 1\nFrankel 1\nFrankenberry 1\nFrankenstein 1\nFrankie 1\nFrankovka 1\nFrayne 1\nFrazer 1\nFreckles 1\nFreddy 1\nFrederic 1\nFrederico 1\nFredrick 1\nFredricka 1\nFreeberg 1\nFreecycle 1\nFreed 1\nFreedmen 1\nFreedoms 1\nFreemason 1\nFreeview 1\nFreie 1\nFrenchman 1\nFrenchmen 1\nFrenchwoman 1\nFrenzel 1\nFreshfields 1\nFresno 1\nFreudian 1\nFrey 1\nFridman 1\nFriedkin 1\nFriedrich 1\nFriendless 1\nFriendships 1\nFriis 1\nFrisbee 1\nFrist 1\nFritz 1\nFroebel 1\nFroomkin 1\nFrost 1\nFrucher 1\nFruehauf 1\nFrustration 1\nFryar 1\nFrying 1\nFs 1\nFubon 1\nFuchang 1\nFucheng 1\nFuck 1\nFucker 1\nFudan 1\nFudao 1\nFudosan 1\nFuelCell 1\nFugitive 1\nFugitives 1\nFujairah 1\nFukuda 1\nFulbright 1\nFuling 1\nFullerton 1\nFultz 1\nFulung 1\nFuncinpec 1\nFundTrust 1\nFundamentalist 1\nFundamentals 1\nFuneral 1\nFunk 1\nFuqua 1\nFur 1\nFurman 1\nFurnace 1\nFurong 1\nFurthemore 1\nFury 1\nFusen 1\nFushan 1\nFushih 1\nFutuna 1\nFuturism 1\nFuxing 1\nFuzhongsan 1\nG&G 1\nG.C. 1\nG.L. 1\nG.M.T. 1\nG.O. 1\nG.S. 1\nG.m.b. 
1\nG2 1\nGAIT 1\nGAMBLE 1\nGANNETT 1\nGAP 1\nGARBAGE 1\nGAS 1\nGASES 1\nGAVE 1\nGDL 1\nGENENTECH 1\nGENERIC 1\nGERMANS 1\nGERMANY'S 1\nGH 1\nGILBERGD@sullcrom.com 1\nGIs 1\nGLAD 1\nGLITTER 1\nGLR 1\nGMC 1\nGMP 1\nGMT 1\nGNS430 1\nGOD 1\nGOES 1\nGOING 1\nGOLDEN 1\nGOLF 1\nGOOGLEZON 1\nGORE 1\nGOULD 1\nGP 1\nGPUs 1\nGR 1\nGRAB 1\nGRAND 1\nGRECO 1\nGREEN 1\nGROOM 1\nGROWING 1\nGROWTH 1\nGRS 1\nGRiD 1\nGS430 1\nGSPA 1\nGT 1\nGTC 1\nGTCs 1\nGUARANTEE 1\nGUIDE 1\nGUN 1\nGXE 1\nGables 1\nGabon 1\nGabriela 1\nGabriele 1\nGaceta 1\nGadafi 1\nGadsden 1\nGaechinger 1\nGaels 1\nGai 1\nGail 1\nGaining 1\nGains 1\nGaisman 1\nGaithersburg 1\nGaja 1\nGalactic 1\nGalani 1\nGalanter 1\nGalanz 1\nGalax 1\nGalaxy 1\nGalbani 1\nGale 1\nGaletta 1\nGalicia 1\nGalipault 1\nGalleries 1\nGalles 1\nGallic 1\nGallitzin 1\nGallohock 1\nGallop 1\nGalloping 1\nGalloway 1\nGaloob 1\nGalsworthy 1\nGama 1\nGamaa 1\nGambit 1\nGambling 1\nGameCube 1\nGameSpot 1\nGameboy 1\nGangaji 1\nGanglun 1\nGanjiang 1\nGansu's 1\nGaolan 1\nGaoliang 1\nGaoqiao 1\nGarage.com 1\nGarbage 1\nGarcey 1\nGarcia@ENRON 1\nGare 1\nGargantuan 1\nGarish 1\nGarner 1\nGarnett 1\nGarpian 1\nGases 1\nGateses 1\nGateway 1\nGau 1\nGavlack 1\nGayat 1\nGazeta 1\nGear 1\nGebrueder 1\nGedinage 1\nGeek 1\nGeeman 1\nGehl 1\nGeier 1\nGeisel 1\nGellert 1\nGelman 1\nGem 1\nGemology 1\nGenCorp 1\nGenProbe 1\nGenerales 1\nGeneralizations 1\nGenerals 1\nGenerating 1\nGenerator 1\nGeneratorExit 1\nGenerous 1\nGenesys 1\nGengxin 1\nGenius 1\nGenliang 1\nGenocide 1\nGensets 1\nGentile 1\nGentility 1\nGentle 1\nGentzs 1\nGenuine 1\nGeocryology 1\nGeodetic 1\nGeoelectricity 1\nGeoffrie 1\nGeoffrion 1\nGeohydromechanics 1\nGeometry 1\nGeorg 1\nGeorgescu 1\nGeorgette 1\nGeotextiles 1\nGephardt 1\nGeragos 1\nGeraldine 1\nGerardo 1\nGerd 1\nGerlaridy 1\nGerm 1\nGermain 1\nGermanic 1\nGermanys 1\nGermont 1\nGero 1\nGerrard 1\nGerson 1\nGersoncy 1\nGeste 1\nGet-A-Free-House.com 1\nGethsemane 1\nGewu 1\nGezafarcus 1\nGhad 1\nGhaffari 1\nGhafoor 1\nGhali 
1\nGhanim 1\nGhassan 1\nGhazaliyah 1\nGhazel 1\nGhirardelli 1\nGhoneim 1\nGhulam 1\nGhuliyandrin 1\nGi 1\nGia 1\nGiamatti 1\nGiancarlo 1\nGianutto 1\nGibbon 1\nGibbs 1\nGiddings 1\nGifted 1\nGiguiere 1\nGilbraltar 1\nGilder 1\nGiles 1\nGilgore 1\nGillers 1\nGilliatt 1\nGillman 1\nGimp 1\nGina 1\nGingerly 1\nGingirch 1\nGinny 1\nGitmo 1\nGitter 1\nGiulio 1\nGiverny 1\nGiza 1\nGlaciology 1\nGladys 1\nGlamorous 1\nGlandular 1\nGlasswork 1\nGlassworks 1\nGlauber 1\nGleaners 1\nGlenbrook 1\nGlenhouser 1\nGlenne 1\nGlenny 1\nGlitter 1\nGlobalisation 1\nGlobalized 1\nGlobes 1\nGlobex 1\nGlorioso 1\nGlossy 1\nGloucester 1\nGloucestershire 1\nGlove 1\nGluck 1\nGnosticism 1\nGodiva 1\nGodot 1\nGoehring 1\nGoerdler 1\nGoetzke 1\nGoff 1\nGoggles 1\nGogol 1\nGolar 1\nGoldfus 1\nGoldie 1\nGoldin 1\nGoldinger 1\nGoldscheider 1\nGoldstar 1\nGolems 1\nGolfers 1\nGollich 1\nGolo 1\nGoloven 1\nGomel 1\nGon 1\nGoncharov 1\nGoncourt 1\nGonggaluobu 1\nGooRoo 1\nGoochland 1\nGoodday 1\nGoodfriend 1\nGoodger 1\nGoogleOS 1\nGoonewardena 1\nGooroo 1\nGorbachov 1\nGorby 1\nGorce 1\nGord 1\nGorenstein 1\nGorilla 1\nGoriot 1\nGorman 1\nGorshkov 1\nGorton 1\nGosbank 1\nGosh 1\nGoshen 1\nGospels 1\nGosplan 1\nGossnab 1\nGotaas 1\nGotham 1\nGotshal 1\nGottesfeld 1\nGottesman 1\nGoubuli 1\nGouldoid 1\nGouramis 1\nGourmet 1\nGout 1\nGovern 1\nGovernador 1\nGovernmental 1\nGovernorates 1\nGraae 1\nGrabe 1\nGrabowiec 1\nGracious 1\nGradison 1\nGrads 1\nGrady 1\nGrahi 1\nGrains 1\nGrais 1\nGrammy 1\nGrammys 1\nGrams 1\nGranada 1\nGrandpa 1\nGranny 1\nGranted 1\nGranting 1\nGraphical 1\nGrapple 1\nGrasevina 1\nGrass 1\nGrassley 1\nGrasso 1\nGrassroots 1\nGratitude 1\nGravano 1\nGraw 1\nGrayson 1\nGrazia 1\nGreaney 1\nGrease 1\nGreatly 1\nGreatness 1\nGreed 1\nGreedily 1\nGreenbelt 1\nGreengrocers 1\nGreenland 1\nGreenshields 1\nGreenwald 1\nGreenwalt 1\nGreenwood 1\nGreffy 1\nGregoire 1\nGregorian 1\nGreif 1\nGreifswald 1\nGreiner 1\nGremlins 1\nGrenadines 1\nGrenier 1\nGretzky 1\nGrief 1\nGriesa 
1\nGrievances 1\nGriffen 1\nGriffith@ENRON 1\nGriggs 1\nGrigsby 1\nGrille 1\nGrilled 1\nGrimes 1\nGrinevsky 1\nGrip 1\nGrippo 1\nGriswold 1\nGrit 1\nGro 1\nGrodnik 1\nGroove 1\nGros 1\nGrosse 1\nGroucho 1\nGrounded 1\nGroundhog 1\nGroundwater 1\nGroupement 1\nGroupie 1\nGroused 1\nGroveton 1\nGrowers 1\nGrows 1\nGrubb 1\nGrumbles 1\nGrundfest 1\nGruppe 1\nGrusin 1\nGuang 1\nGuangchun 1\nGuanggu 1\nGuanghuai 1\nGuangqian 1\nGuangshui 1\nGuangxu 1\nGuangya 1\nGuangying 1\nGuangzhao 1\nGuangzi 1\nGuanlin 1\nGuanshan 1\nGuanting 1\nGuanying 1\nGuanzhong 1\nGuarana 1\nGuatapae 1\nGucci 1\nGuchang 1\nGudai 1\nGuerin 1\nGuernica 1\nGuess 1\nGuesthouse 1\nGuevara 1\nGuggenheim 1\nGuibing 1\nGuidance 1\nGuided 1\nGuifei 1\nGuijin 1\nGuildford 1\nGuilds 1\nGuilherme 1\nGuiness 1\nGuisheng 1\nGuitar 1\nGuixian 1\nGuiyang 1\nGul 1\nGulag 1\nGulch 1\nGulfport 1\nGulick 1\nGulpan 1\nGumi 1\nGumkowski 1\nGummiBear 1\nGump 1\nGundy 1\nGung 1\nGunma 1\nGunn 1\nGunner 1\nGunther 1\nGuoli 1\nGuoliang 1\nGuoquan 1\nGuoxian 1\nGuoyan 1\nGuoyuan 1\nGuozhong 1\nGuppy 1\nGupta 1\nGur 1\nGurgaon 1\nGurtz 1\nGus 1\nGusev 1\nGushan 1\nGushin 1\nGustav 1\nGustavus 1\nGutenberg 1\nGutenberghus 1\nGuzewich 1\nGwan 1\nGymnastics 1\nGypsum 1\nH.G. 1\nH.L. 1\nH.R. 1\nH.W. 
1\nH1 1\nHAD1999 1\nHALE 1\nHAMSTERS 1\nHAND 1\nHANNIFIN 1\nHATE 1\nHATED 1\nHATES 1\nHATING 1\nHAWLEY 1\nHBJ 1\nHCFCs 1\nHCM 1\nHDM 1\nHEALTHDYNE 1\nHEARS 1\nHEATING 1\nHEAVEN 1\nHEAVY 1\nHENRI 1\nHER 1\nHERBERT 1\nHERO 1\nHEWLETT 1\nHEXCEL 1\nHEYNOW 1\nHG 1\nHH 1\nHIB 1\nHIGHER 1\nHIM 1\nHIR 1\nHIRING 1\nHL2 1\nHLR 1\nHMS 1\nHND 1\nHOA 1\nHOBBY 1\nHOLD 1\nHOLLYWOOD 1\nHOMESTAKE 1\nHOMESTEAD 1\nHONECKER 1\nHONKA 1\nHOPE 1\nHOPES 1\nHORRIBLE 1\nHOSTESS 1\nHOTEL 1\nHOUSTON 1\nHOWEVER 1\nHPL 1\nHRH 1\nHRonline 1\nHSB 1\nHSL 1\nHTC 1\nHUDSON 1\nHUGE 1\nHUGO'S 1\nHUH 1\nHUNGARY 1\nHUNTING 1\nHURRICANE 1\nHUSBANDS 1\nHUTTON 1\nHabbaniyah 1\nHaber 1\nHabil 1\nHabit 1\nHabitat 1\nHabolonei 1\nHachuel 1\nHackensack 1\nHackman 1\nHacksaw 1\nHadassah 1\nHadera 1\nHadhazy 1\nHadifa 1\nHadithas 1\nHadley 1\nHagood 1\nHah 1\nHahahahahahaahahhahahahahahhahahahahahahahahhahahah 1\nHahnemann 1\nHaifa 1\nHaifeng 1\nHaijuan 1\nHaikuan 1\nHaim 1\nHaired 1\nHaisheng 1\nHaitao 1\nHaitians 1\nHaiwang 1\nHaixiong 1\nHaizi 1\nHajak 1\nHajiri 1\nHajo 1\nHakeem 1\nHakko 1\nHakr 1\nHakuhodo 1\nHala 1\nHalas 1\nHalen 1\nHaley 1\nHalfway 1\nHali 1\nHalis 1\nHalle 1\nHallett 1\nHalley 1\nHalliburton 1\nHalls 1\nHallucigenia 1\nHalpern 1\nHalsted 1\nHalva 1\nHamada 1\nHamadi 1\nHamakua 1\nHamayil 1\nHamblen 1\nHamburger 1\nHamer 1\nHamlet 1\nHamma 1\nHammer 1\nHammett 1\nHampton 1\nHanabali 1\nHanauer 1\nHanbal 1\nHandan 1\nHandelsman 1\nHandmaid 1\nHangxiong 1\nHanieh 1\nHaniyeh 1\nHankou 1\nHankui 1\nHann 1\nHannei 1\nHannibal 1\nHannover 1\nHansen@ENRON 1\nHanwa 1\nHanyu 1\nHanyuan 1\nHanzu 1\nHaojing 1\nHaoyuan 1\nHaqbani 1\nHar 1\nHarald 1\nHaram 1\nHarari 1\nHarbanse 1\nHarcourt 1\nHardest 1\nHardworking 1\nHardy 1\nHare 1\nHarf 1\nHargrave 1\nHari 1\nHarithi 1\nHarkin 1\nHarkins 1\nHarmonia 1\nHarmonious 1\nHarmony 1\nHarms 1\nHarord 1\nHarpoon 1\nHarrah 1\nHarri 1\nHarriton 1\nHarrods 1\nHarte 1\nHarts 1\nHartsfield 1\nHartung 1\nHaruo 1\nHarwood 1\nHasan 1\nHasang 1\nHasenauer 
1\nHashidate 1\nHashimi 1\nHashrudi 1\nHasse 1\nHassenfeld 1\nHaste 1\nHastily 1\nHastion 1\nHatakeyama 1\nHatches 1\nHatchett 1\nHathcock 1\nHaughey 1\nHauraki 1\nHavel 1\nHavoc 1\nHawaiians 1\nHawesville 1\nHawk 1\nHawke 1\nHawks 1\nHawn 1\nHawtat 1\nHayasaka 1\nHayden 1\nHayworth 1\nHazem 1\nHazmat 1\nHe/She 1\nHeadley 1\nHeadlines 1\nHeadly 1\nHeady 1\nHealey 1\nHealthsource 1\nHeap 1\nHear 1\nHeari 1\nHearken 1\nHeartland 1\nHeartwise 1\nHeather 1\nHeats 1\nHeber 1\nHeberto 1\nHebrews 1\nHeckmann 1\nHecla 1\nHector 1\nHedge 1\nHee 1\nHeem 1\nHeerden 1\nHeffner 1\nHegelian 1\nHehuan 1\nHeibao 1\nHeidegger 1\nHeihe 1\nHeimers 1\nHein 1\nHeine 1\nHeineken 1\nHeinemann 1\nHeinhold 1\nHeinrich 1\nHeisbourg 1\nHeishantou 1\nHeiwa 1\nHekhmatyar 1\nHekmatyar 1\nHelaba 1\nHelane 1\nHeld 1\nHelena 1\nHelga 1\nHelicopter 1\nHelicopters 1\nHeliopolis 1\nHelium 1\nHellada 1\nHelliesen 1\nHellman 1\nHelm 1\nHelms 1\nHelmuth 1\nHelped 1\nHelper 1\nHelpless 1\nHemdan 1\nHemet 1\nHemispheric 1\nHemmer 1\nHemoglobin 1\nHempel 1\nHemweg 1\nHengchun 1\nHennessy 1\nHenrico 1\nHenrik 1\nHenson 1\nHeping 1\nHepworth 1\nHerbalist 1\nHerbig 1\nHerbs 1\nHereafter 1\nHermann 1\nHermeticism 1\nHermitage 1\nHermits 1\nHernan 1\nHeroes 1\nHerpes 1\nHerrera 1\nHerring 1\nHerrman 1\nHers 1\nHerschel 1\nHersey 1\nHershhenson 1\nHershiser 1\nHershy 1\nHerslow 1\nHerwick 1\nHerzfeld 1\nHerzog 1\nHessians 1\nHessische 1\nHeterosexuals 1\nHeublein 1\nHewart 1\nHewerd 1\nHeyi 1\nHeywood 1\nHezbo 1\nHibben 1\nHibernation 1\nHibernia 1\nHibler 1\nHickman 1\nHidetoshi 1\nHiding 1\nHighlander 1\nHighlights 1\nHighschool 1\nHijaz 1\nHijet 1\nHikers 1\nHikmi 1\nHilali 1\nHilder 1\nHilgert 1\nHilla 1\nHillah 1\nHillegonds 1\nHillman 1\nHillsboro 1\nHillsdown 1\nHiltunen 1\nHiltz 1\nHime 1\nHimself 1\nHind 1\nHindemith 1\nHinduism 1\nHinkle 1\nHinoki 1\nHinsville 1\nHip 1\nHippie 1\nHiram 1\nHired 1\nHiro 1\nHirohito 1\nHirschfeld 1\nHirsohima 1\nHisake 1\nHismanal 1\nHispanoamericana 1\nHistorian 
1\nHitchens 1\nHiutong 1\nHive 1\nHoa 1\nHobbes 1\nHobbs 1\nHobgoblin 1\nHoboken 1\nHoc 1\nHockney 1\nHodgkin 1\nHoe 1\nHoenlein 1\nHoffmann 1\nHogg 1\nHogs 1\nHogwasher 1\nHokkaido 1\nHokuriku 1\nHolcomb 1\nHolden 1\nHolewinski 1\nHolf 1\nHoli 1\nHolidaymakers 1\nHolinshed 1\nHolistic 1\nHolkeri 1\nHolkerry 1\nHollandale 1\nHolliger 1\nHollinger 1\nHolofernes 1\nHoltzman 1\nHolum 1\nHolyoke 1\nHomart 1\nHomback 1\nHombak 1\nHomelite 1\nHomeopath 1\nHomeowner 1\nHomeroom 1\nHomerun 1\nHomma 1\nHomo 1\nHomosexuals 1\nHonKong 1\nHondas 1\nHonduran 1\nHondurans 1\nHoney 1\nHoneybee 1\nHongbo 1\nHongjiu 1\nHongmin 1\nHongshui 1\nHongyang 1\nHongyong 1\nHoniss 1\nHonorable 1\nHooded 1\nHoof 1\nHoog 1\nHooks 1\nHooser 1\nHooters 1\nHoped 1\nHopeless 1\nHopelessness 1\nHopes 1\nHopkinson@ENRON_DEVELOPMENT 1\nHopless 1\nHoratio 1\nHordern 1\nHori 1\nHornaday 1\nHorne 1\nHornets 1\nHorribly 1\nHorsehead 1\nHorsey 1\nHorta 1\nHorticultural 1\nHorwitz 1\nHose 1\nHoses 1\nHoshyar 1\nHost 1\nHostel 1\nHostilities 1\nHoston 1\nHoudini 1\nHoujian 1\nHouli 1\nHoulian 1\nHounded 1\nHouping 1\nHousatonic 1\nHouseboat 1\nHousewares 1\nHousings 1\nHouson 1\nHowdeyen 1\nHowdy 1\nHowe 1\nHowie 1\nHowson 1\nHoy 1\nHoya 1\nHs. 
1\nHseuh 1\nHsiachuotsu 1\nHsiachuotzu 1\nHsimen 1\nHsimenting 1\nHsinyi 1\nHsiuh 1\nHsiukulan 1\nHsiung 1\nHua-chih 1\nHuadong 1\nHuaihai 1\nHuaixi 1\nHualong 1\nHuaneng 1\nHuangbiao 1\nHuangxing 1\nHuanqing 1\nHuanxing 1\nHuaqiu 1\nHuard 1\nHuateng 1\nHuaxia 1\nHuaxing 1\nHuazi 1\nHubel 1\nHuble@ENRON 1\nHubris 1\nHuckabee 1\nHuckleberry 1\nHuddersfield 1\nHuddy 1\nHuerta 1\nHuff 1\nHuffington 1\nHugely 1\nHuggies 1\nHuh 1\nHuhhot 1\nHuilan 1\nHuiliang 1\nHuiluo 1\nHuizhen 1\nHukou 1\nHumane 1\nHumanism 1\nHumanizing 1\nHumanpixel 1\nHumbert 1\nHumen 1\nHumet 1\nHumility 1\nHumming 1\nHumorous 1\nHump 1\nHumphreys 1\nHumphries 1\nHumvee 1\nHunchun 1\nHungarians 1\nHunger 1\nHungerfords 1\nHunin 1\nHunley 1\nHunted 1\nHunterdon 1\nHunting 1\nHuntley 1\nHuntsville 1\nHuolianguang 1\nHurd 1\nHuricane 1\nHurrican 1\nHurricanes 1\nHurriyat 1\nHurrriyat 1\nHurst 1\nHurtado 1\nHurter 1\nHurts 1\nHurwitt 1\nHurwitz 1\nHus 1\nHusbands 1\nHusky 1\nHustead 1\nHuston 1\nHutchison 1\nHutou 1\nHutsells 1\nHutung 1\nHuver 1\nHuwaijah 1\nHuwayah 1\nHuxley 1\nHuy 1\nHwai 1\nHyang 1\nHybrid 1\nHybritech 1\nHydo 1\nHydor 1\nHydraulics 1\nHygiene 1\nHyong 1\nHyperthyroidism 1\nHypotheekkas 1\nI'd 1\nI'll 1\nI'm 1\nI've 1\nI'vey 1\nI.B. 1\nI.B.M. 1\nI.E.P. 1\nI.K. 1\nI.M. 1\nI.W. 
1\nICBM 1\nICBMs 1\nICC 1\nICE 1\nICM 1\nICU 1\nIDC 1\nIDF 1\nIE6 1\nIED 1\nIES 1\nIHOP 1\nIIP 1\nIIRC 1\nIIcx 1\nIMELDA 1\nIMG_0782.JPG 1\nIMMEDIATE 1\nIMPOSSIBLE 1\nINA 1\nINCOME 1\nINDUSTRIAL 1\nINMAC 1\nINRI 1\nINSTITUTE 1\nINSULTED 1\nINSULTING 1\nINSURERS 1\nINTEL 1\nINTENSIVE 1\nINVESTMENT 1\nIO 1\nIOF 1\nION 1\nIONs 1\nIOW 1\nIPA 1\nIRB 1\nIRIs 1\nISC 1\nISDAs 1\nISI 1\nISLAM 1\nISP 1\nISRAEL 1\nISRAELI 1\nITEL 1\nIXL 1\nIaciofano 1\nIams 1\nIbariche 1\nIberia 1\nIberian 1\nIbiza 1\nIced 1\nIcicle 1\nId 1\nIdeals 1\nIdeation 1\nIdeological 1\nIditarod 1\nIdling 1\nIdol 1\nIdrissa 1\nIdrocarburi 1\nIfa 1\nIgaras 1\nIgnatius 1\nIgnazio 1\nIgnorance 1\nIgnore 1\nIgnoring 1\nIguana 1\nIhab 1\nIhla 1\nIjyan 1\nIkea 1\nIlbo 1\nIlena 1\nIlkka 1\nIllegal 1\nImaduldin 1\nImage 1\nImagination99 1\nImai 1\nIman 1\nImelda 1\nImmaculate 1\nImmortal 1\nImn 1\nImola 1\nImpaired 1\nImpasse 1\nImpenetrable 1\nImperium 1\nImpetus 1\nImplicated 1\nImported 1\nImpose 1\nImpossible 1\nImpressed 1\nImpressionist 1\nImpressionists 1\nImprisoned 1\nImprovements 1\nImran 1\nInacio 1\nInada 1\nInara 1\nInauguration 1\nInc.One 1\nInch 1\nIncheon 1\nIncidentally 1\nIncidents 1\nIncirlik 1\nIncitement 1\nIncline 1\nIncompetence 1\nIncompetent 1\nIncrease 1\nIncrementally 1\nIncriminating 1\nIndemnity 1\nIndentical 1\nIndexes 1\nIndications 1\nIndicator 1\nIndira 1\nIndochina 1\nIndocin 1\nIndoor 1\nIndosuez 1\nIndustriale 1\nIndustriali 1\nIndustrials 1\nIndustriels 1\nIndustriously 1\nIndustrywide 1\nInefficient 1\nInexplicably 1\nInfinity 1\nInfo 1\nInfoPro 1\nInfocomm 1\nInford 1\nInform 1\nInformative 1\nInformed 1\nInformix 1\nInfosys 1\nInfotechnology 1\nInfrastructure 1\nInfrastructures 1\nIng. 
1\nIngleheim 1\nIngmar 1\nInherent 1\nInheritance 1\nInitiatives 1\nInjured 1\nInksi 1\nInlet 1\nInnis 1\nInnocent 1\nInnocents 1\nInnovation 1\nInnovations 1\nInoco 1\nInquisition 1\nInrockuptibles 1\nInscriptions 1\nInsights 1\nInsilco 1\nInsitutional 1\nInski 1\nInsofar 1\nInspections 1\nInspector 1\nInspectors 1\nInspiration 1\nInspire 1\nInspired 1\nInstall 1\nInstep 1\nInstituto 1\nInstitutue 1\nInstruction 1\nInstructions 1\nInstructor 1\nInsureres 1\nIntech 1\nIntegrate 1\nIntegration 1\nIntegrity 1\nIntelligencer 1\nIntelligent 1\nIntelsat 1\nIntended 1\nIntense 1\nIntent 1\nInter-American 1\nInter-Branch 1\nInter-department 1\nInterMedia 1\nInterServ 1\nInterbank 1\nIntercepting 1\nInterested 1\nInterfax 1\nIntergraph 1\nInterhome 1\nInterim 1\nInterlake 1\nIntermec 1\nIntermediary 1\nInternally 1\nIntertan 1\nIntervention 1\nInterview 1\nInterviewer 1\nInterviewers 1\nInterviu 1\nIntl 1\nIntra-European 1\nIntrastate 1\nIntrauterine 1\nIntrepid 1\nIntroduces 1\nIntroducing 1\nIntuition 1\nInuits 1\nInvade 1\nInvariably 1\nInvasion 1\nInvention 1\nInvercon 1\nInvested 1\nInvestigating 1\nInvestigator 1\nInvesting 1\nInvincible 1\nInvoking 1\nInvolved 1\nInvulnerable 1\nInwa 1\nInwood 1\nInzer 1\nIommi 1\nIpanema 1\nIpod 1\nIqlim 1\nIranU.S 1\nIraqiya 1\nIraqyia 1\nIrbil 1\nIrishman 1\nIrishmen 1\nIrkutsk 1\nIronic 1\nIrregularities 1\nIrrigation 1\nIsabelle 1\nIsacsson 1\nIsao 1\nIsikoff 1\nIsis 1\nIskakavut 1\nIslah 1\nIslamabad 1\nIslami 1\nIslamism 1\nIslamiyyah 1\nIslamofascism 1\nIslamophobia 1\nIsles 1\nIsmael 1\nIsmailia 1\nIsmailis 1\nIsoda 1\nIsola 1\nIsolated 1\nIssak 1\nIssam 1\nIssuance 1\nIssued 1\nIssuing 1\nIt's 1\nIta 1\nItalia 1\nItaliana 1\nItalianate 1\nItaru 1\nItems 1\nItogi 1\nIttihad 1\nItunes 1\nItzhak 1\nIvey 1\nIvkovic 1\nIvorians 1\nIzquierda 1\nIzu 1\nJ&B 1\nJ'ai 1\nJ.E. 1\nJ.F. 1\nJ.L. 1\nJ.R. 1\nJ.R.R. 1\nJ.T. 1\nJ.V 1\nJ.V. 1\nJ.X. 
1\nJAB 1\nJACKSON 1\nJAGRY 1\nJAILED 1\nJAMES 1\nJAPAN 1\nJARIR9@hotmail.com 1\nJAUNTS 1\nJBOSS 1\nJCDC 1\nJCKC 1\nJDAM 1\nJDOM 1\nJERSEY 1\nJET 1\nJG 1\nJH 1\nJJ 1\nJK 1\nJKD 1\nJOHN 1\nJOHNSON 1\nJOIN 1\nJOKE 1\nJPG 1\nJR 1\nJROE 1\nJSF 1\nJULY 1\nJUNO 1\nJVG 1\nJWVS 1\nJaMarcus 1\nJaap 1\nJacinto 1\nJackals 1\nJackass 1\nJackets 1\nJacki 1\nJaclyn 1\nJaco 1\nJacoboski 1\nJacoby@ECT 1\nJacque 1\nJacuzzi 1\nJag 1\nJagi 1\nJaguars 1\nJaime 1\nJakes 1\nJalalabad 1\nJaleo 1\nJalininggele 1\nJamachakih 1\nJamia 1\nJamil 1\nJamiroquai 1\nJammaz 1\nJanabi 1\nJanelle 1\nJangchung 1\nJanjaweed 1\nJanlori 1\nJann 1\nJanney 1\nJansen 1\nJansz. 1\nJardine 1\nJason.Leopold@dowjones.com 1\nJava 1\nJaw 1\nJaymz 1\nJayne 1\nJeane 1\nJeanene 1\nJeanette 1\nJeanne 1\nJeans 1\nJebtsundamba 1\nJeeps 1\nJeez 1\nJeffersons 1\nJeffry 1\nJeju 1\nJelinski 1\nJellicle 1\nJellied 1\nJellison 1\nJen'ai 1\nJena 1\nJeneme 1\nJenna 1\nJennie 1\nJenning 1\nJennings 1\nJens 1\nJepson 1\nJerald 1\nJeremiah 1\nJerral 1\nJerrico 1\nJerry0803 1\nJespersen 1\nJesuits 1\nJet 1\nJets 1\nJetset 1\nJett 1\nJetta 1\nJew 1\nJewboy 1\nJewelery 1\nJewels 1\nJi'an 1\nJi'nan 1\nJiading 1\nJiakun 1\nJialiao 1\nJialing 1\nJian'gang 1\nJiandao 1\nJiangbei 1\nJiangnan 1\nJiangsen 1\nJianhong 1\nJianjun 1\nJianlian 1\nJianping 1\nJiansou 1\nJianxin 1\nJianxiong 1\nJianzhai 1\nJiaojiazhai 1\nJiaotong 1\nJiaqi 1\nJiaxuan 1\nJibran 1\nJibril 1\nJicheng 1\nJici 1\nJidong 1\nJieping 1\nJihua 1\nJiliang 1\nJiling 1\nJillin 1\nJimbo 1\nJimenez 1\nJindo 1\nJingcai 1\nJingdezhen 1\nJingguo 1\nJingkang 1\nJingqiao 1\nJingyu 1\nJinhui 1\nJinjun 1\nJinrunfa 1\nJinshan 1\nJinxi 1\nJinyi 1\nJiotto 1\nJiptanoy 1\nJiri 1\nJittery 1\nJiujianpeng 1\nJiulong 1\nJiwu 1\nJoakok 1\nJoanna 1\nJoaquin 1\nJobThread.com 1\nJobe 1\nJoby 1\nJocelyn 1\nJock 1\nJockey 1\nJodie 1\nJoe_Lardy@cargill.com 1\nJoerg 1\nJogis 1\nJohan 1\nJohanna 1\nJohanneson 1\nJohanson 1\nJohnson@ENRON 1\nJohnsons 1\nJoin 1\nJoiner 1\nJointed 1\nJoke 1\nJolas 
1\nJolivet 1\nJolt 1\nJona 1\nJoni 1\nJoos 1\nJordeena 1\nJordena 1\nJordon 1\nJosalyn 1\nJosef 1\nJosephson 1\nJoshi 1\nJoss 1\nJotaro 1\nJothun 1\nJounieh 1\nJovanovich 1\nJovi 1\nJoyceon 1\nJubur 1\nJudas 1\nJudeh 1\nJudeo 1\nJudie 1\nJudo 1\nJuego 1\nJueren 1\nJuge 1\nJugend 1\nJuicer 1\nJuke 1\nJukes 1\nJul 1\nJulian 1\nJuliana 1\nJulileah 1\nJumbo 1\nJump 1\nJumper 1\nJumping 1\nJunction 1\nJunge 1\nJunichiro 1\nJunlian 1\nJunmo 1\nJunor 1\nJunxiu 1\nJurassic 1\nJurek 1\nJuri 1\nJuridical 1\nJurists 1\nJurors 1\nJuste 1\nJute 1\nJutting 1\nKAISER 1\nKDE 1\nKDGE 1\nKDY 1\nKEY 1\nKFH 1\nKHAD 1\nKHS 1\nKHz 1\nKIM 1\nKING 1\nKIPP 1\nKIPPUR 1\nKK 1\nKMart 1\nKNEW 1\nKOTRA 1\nKPLU 1\nKRENZ 1\nKS 1\nKSI 1\nKTV 1\nKTXL 1\nKVM 1\nKa 1\nKabun 1\nKacy 1\nKadonada 1\nKael 1\nKafaroff 1\nKafkaesque 1\nKagan 1\nKailun 1\nKaisha 1\nKaitaia 1\nKaixi 1\nKajima 1\nKakumaru 1\nKalahari 1\nKalatuohai 1\nKalca 1\nKaleningrad 1\nKali 1\nKaliningrad 1\nKalison 1\nKamchatka 1\nKamel 1\nKan 1\nKanan 1\nKanazawa 1\nKangxiong 1\nKanjorski 1\nKann 1\nKanon 1\nKans. 
1\nKantakari 1\nKantorei 1\nKaolin 1\nKapinski 1\nKapital 1\nKar 1\nKarada 1\nKarakh 1\nKaram 1\nKarate 1\nKarben 1\nKarches 1\nKarenna 1\nKargalskiy 1\nKark 1\nKarna 1\nKarnak 1\nKarni 1\nKarnsund 1\nKaro 1\nKaros 1\nKarp 1\nKarsh 1\nKartalia 1\nKary 1\nKarzai 1\nKashi 1\nKasir 1\nKasna 1\nKasslik 1\nKasten 1\nKatharina 1\nKathe 1\nKathman 1\nKatonah 1\nKatsive 1\nKatunar 1\nKatzenjammer 1\nKaul 1\nKavanagh 1\nKawaguchi 1\nKaye 1\nKaylee 1\nKaysersberg 1\nKayton 1\nKazakhstan 1\nKazakhstani 1\nKazempour 1\nKazuhiko 1\nKazushige 1\nKb 1\nKbasda 1\nKearn 1\nKearns 1\nKeatingland 1\nKeaton 1\nKebing 1\nKeeney 1\nKeepers 1\nKefa 1\nKeffer 1\nKeg 1\nKeidanren 1\nKeihin 1\nKeio 1\nKeji 1\nKelaudin 1\nKelton 1\nKempner 1\nKenan 1\nKendrick 1\nKenichiro 1\nKenike 1\nKenmare 1\nKenmore 1\nKenneally 1\nKennett 1\nKennewick 1\nKenney 1\nKenny 1\nKenosha 1\nKensetsu 1\nKenting 1\nKenton 1\nKeogh 1\nKerala 1\nKernel 1\nKerny 1\nKerouac 1\nKerrMcGee 1\nKerrey 1\nKerrigan 1\nKershye 1\nKerstian 1\nKeshtmand 1\nKetagelan 1\nKetchum 1\nKetin 1\nKettner 1\nKevin.A.Boone@accenture.com 1\nKevlar 1\nKeyang 1\nKeye 1\nKeyhole 1\nKeynesians 1\nKeys 1\nKezi 1\nKhader 1\nKhadhera 1\nKhaldoun 1\nKhaledi 1\nKhaleefa 1\nKhalfan 1\nKhamis 1\nKhareq 1\nKharoub 1\nKhashvili 1\nKhasib 1\nKhatib 1\nKhattab 1\nKhazars 1\nKhbeir 1\nKhokha 1\nKhorakiwala 1\nKhori 1\nKhost 1\nKhwarij 1\nKhyber 1\nKieran 1\nKiffin 1\nKiki 1\nKikkoman 1\nKiko 1\nKildare 1\nKilgore 1\nKilled 1\nKillers 1\nKilliFish 1\nKillifish 1\nKillion 1\nKilo 1\nKilroy 1\nKilty 1\nKimba 1\nKimihide 1\nKimpo 1\nKinderCare 1\nKindertotenlieder 1\nKindom 1\nKindopp 1\nKindred 1\nKinescope 1\nKinga 1\nKingfish 1\nKingman 1\nKingsbridge 1\nKingsford 1\nKingsleys 1\nKingston 1\nKingsville 1\nKinji 1\nKinko 1\nKinnard 1\nKinnear 1\nKinnevik 1\nKinnock 1\nKinshumir 1\nKioka 1\nKipuket 1\nKirchberger 1\nKiribati 1\nKirkendall 1\nKirkland 1\nKirkpatrick 1\nKirschbaum 1\nKishigawa 1\nKishimoto 1\nKiss 1\nKissing 1\nKitada 1\nKitcat 1\nKitna 
1\nKits 1\nKitties 1\nKiyotaka 1\nKlass 1\nKlatman 1\nKleenex 1\nKlelov 1\nKlimberg 1\nKlineberg 1\nKlinsky 1\nKliotech 1\nKlondike 1\nKloner 1\nKnee 1\nKnicks 1\nKnife 1\nKnocking 1\nKnowPost.com 1\nKnowlton 1\nKnox 1\nKnuckle 1\nKnuettel 1\nKochis 1\nKoffman 1\nKoha 1\nKohut 1\nKoji 1\nKolbe 1\nKolbi 1\nKollmorgen 1\nKompakt 1\nKongers 1\nKongsberg 1\nKonka 1\nKonner 1\nKono 1\nKonopnicki 1\nKonosuke 1\nKonowitch 1\nKopp 1\nKoppers 1\nKorando 1\nKorbin 1\nKoreagate 1\nKoreanized 1\nKoreas 1\nKorff 1\nKorn 1\nKornfield 1\nKornreich 1\nKoryerov 1\nKos 1\nKosher 1\nKostelanetz 1\nKostinica 1\nKotman 1\nKotobuki 1\nKouji 1\nKozlowski 1\nKracauer 1\nKracheh 1\nKraemer 1\nKrakow 1\nKrasnow 1\nKrib 1\nKrick 1\nKriner 1\nKringle 1\nKrishna 1\nKrishnaswami 1\nKrispies 1\nKristiansen 1\nKristin 1\nKrk 1\nKrofts 1\nKron 1\nKrug 1\nKruger 1\nKrupp 1\nKrys 1\nKuai 1\nKuandu 1\nKuanghua 1\nKubuntu 1\nKucharski 1\nKuchma 1\nKuehler 1\nKueneke 1\nKufa 1\nKuhns 1\nKui 1\nKuiper 1\nKuishan 1\nKulov 1\nKulun 1\nKumagai 1\nKumon 1\nKuno 1\nKunqu 1\nKunst 1\nKuohsing 1\nKupalba 1\nKurda 1\nKurlak 1\nKurncz 1\nKuroshio 1\nKuroyedov 1\nKurran 1\nKursad 1\nKurtanjek 1\nKurtz 1\nKusal 1\nKuse 1\nKutney 1\nKwang 1\nKwantung 1\nKweisi 1\nKwiakowski 1\nKwik 1\nKwon 1\nKye 1\nKyocera 1\nKyong 1\nKyowa 1\nKyu 1\nKyung 1\nKyushu 1\nL' 1\nL'Heureux 1\nL.A 1\nL.H. 1\nL.M. 1\nL.P 1\nL.Ron 1\nL/C 1\nL/C's 1\nL1 1\nLA. 1\nLABOR 1\nLABORATORIES 1\nLAMBERT 1\nLASER 1\nLAST 1\nLATLON 1\nLAWMAKERS 1\nLC 1\nLCD 1\nLD 1\nLEADERS 1\nLEAST 1\nLEBANESE 1\nLED 1\nLEDbulbs 1\nLEHMAN 1\nLENSES 1\nLEONARDO 1\nLEWIS 1\nLG 1\nLIBERTY 1\nLIDL 1\nLIEBERMAN 1\nLIES 1\nLIPA 1\nLISTEN 1\nLITTLE 1\nLLP 1\nLMEYER 1\nLNG 1\nLOCATION 1\nLOCKHEED 1\nLOGIC 1\nLONGS 1\nLORD 1\nLOSS 1\nLOTR 1\nLOTS 1\nLOTUS 1\nLOU 1\nLRC 1\nLS400 1\nLSX 1\nLTD. 
1\nLTK 1\nLUCIO 1\nLUTHER 1\nLW 1\nLYSH 1\nLaBella 1\nLaBianca 1\nLaRosa 1\nLaRouchies 1\nLaToya 1\nLaWare 1\nLaWarre 1\nLab 1\nLabat 1\nLabe 1\nLabel 1\nLabeling 1\nLabonte 1\nLaboratorium 1\nLaboring 1\nLabovitz 1\nLacan 1\nLackluster 1\nLactobacillus 1\nLada 1\nLadislav 1\nLafarge 1\nLafayette 1\nLafontant 1\nLag 1\nLaghi 1\nLagoon 1\nLagos 1\nLahagewasu 1\nLahagewasulen 1\nLahaz 1\nLaila 1\nLaird 1\nLaisenia 1\nLakers 1\nLakhdar 1\nLakshmipura 1\nLamar 1\nLamb@ENRON 1\nLambs 1\nLamen 1\nLamle 1\nLamonica 1\nLamore 1\nLamos 1\nLampe 1\nLamphere 1\nLamson 1\nLancelyn 1\nLancia 1\nLandel 1\nLandmark 1\nLandscape 1\nLandscaping 1\nLangendorf 1\nLangford 1\nLangston 1\nLanguages 1\nLanhua 1\nLankans 1\nLanqing 1\nLansheng 1\nLansing 1\nLanterns 1\nLantz 1\nLanyang 1\nLanyi 1\nLaochienkeng 1\nLaojun 1\nLaoussine 1\nLaphroaig 1\nLara 1\nLarchmont 1\nLaren 1\nLarkin 1\nLars 1\nLaserscope 1\nLashio 1\nLashkar 1\nLasorda 1\nLasso 1\nLata 1\nLatecomers 1\nLately 1\nLatest 1\nLatham 1\nLatif 1\nLatifah 1\nLatino 1\nLatowski 1\nLatter 1\nLau 1\nLaughing 1\nLaunch 1\nLaunching 1\nLaundered 1\nLaurance 1\nLaurent 1\nLausanne 1\nLautrec 1\nLaux 1\nLava 1\nLavaro 1\nLavender 1\nLavidge 1\nLavin 1\nLavo 1\nLavroff 1\nLawful 1\nLawn 1\nLawsie 1\nLawton 1\nLawyer 1\nLays 1\nLazarus 1\nLazy 1\nLeBow 1\nLeFrere 1\nLeMans 1\nLePatner 1\nLeT 1\nLeWitt 1\nLea 1\nLeads 1\nLeaf 1\nLeainne 1\nLean 1\nLeaning 1\nLeap 1\nLeaping 1\nLearn 1\nLeaves 1\nLeavy 1\nLeba 1\nLeboho 1\nLecturer 1\nLecturers 1\nLederberg 1\nLederer 1\nLedger 1\nLeekin 1\nLeesburg 1\nLeeza 1\nLefcourt 1\nLefortovo 1\nLeftovers 1\nLefty 1\nLeg 1\nLegally 1\nLegit 1\nLegitimate 1\nLegittino 1\nLehigh 1\nLehmans 1\nLehn 1\nLehne 1\nLehrer 1\nLeibowitz 1\nLeiby 1\nLeicester 1\nLeish 1\nLeithead 1\nLejeune 1\nLekberg 1\nLemans 1\nLemell 1\nLemmon 1\nLemon 1\nLemont 1\nLend 1\nLength 1\nLenhart 1\nLeninism 1\nLeninist 1\nLeninskoye 1\nLennon 1\nLenovo 1\nLeominster 1\nLeone 1\nLeonel 1\nLeopard 1\nLepore 1\nLerman 
1\nLesbianists 1\nLescaze 1\nLeser 1\nLeshan 1\nLesley 1\nLesnar 1\nLessner 1\nLesson 1\nLest 1\nLesutis 1\nLethal 1\nLett 1\nLetterman 1\nLetters 1\nLeu 1\nLeubert 1\nLeumi 1\nLeuzzi 1\nLev 1\nLeverkusen 1\nLevesque 1\nLevitte 1\nLewala 1\nLex 1\nLiaisons 1\nLianhuashan 1\nLianye 1\nLiaohe 1\nLiaoxi 1\nLibby 1\nLibera 1\nLiberia 1\nLiberian 1\nLibrairie 1\nLibrary 1\nLibro 1\nLichtblau 1\nLick 1\nLidder 1\nLido 1\nLiefting 1\nLifeSpan 1\nLifeline 1\nLifesaver 1\nLifestyles 1\nLifland 1\nLifts 1\nLightning 1\nLightwave 1\nLigurian 1\nLihuang 1\nLikins 1\nLikudniks 1\nLilan 1\nLilian 1\nLiliane 1\nLilic 1\nLilith 1\nLillehammer 1\nLillian 1\nLillo 1\nLily 1\nLiman 1\nLimbering 1\nLimbs 1\nLime 1\nLimitations 1\nLimiting 1\nLimos 1\nLimousine 1\nLincolns 1\nLincolnshire 1\nLindens 1\nLindh 1\nLindsay 1\nLinfen 1\nLinger 1\nLinghu 1\nLingnan 1\nLinguistic 1\nLinna 1\nLinnard 1\nLinsey 1\nLinsley 1\nLinsong 1\nLintas 1\nLions 1\nLipids 1\nLipman 1\nLipps 1\nLiqaa 1\nLiqueur 1\nLiquidating 1\nLiquor 1\nLira 1\nLirang 1\nLisan 1\nLisbeth 1\nLish 1\nLisheng 1\nListed 1\nListened 1\nListeners 1\nListon 1\nLit 1\nLitao 1\nLite 1\nLiteracy 1\nLiterary 1\nLites 1\nLithotripsy 1\nLiton 1\nLittleton 1\nLittman 1\nLiuh 1\nLively 1\nLivermore 1\nLivingston 1\nLivingstone 1\nLixuetang 1\nLizhi 1\nLizi 1\nLizuo 1\nLizzie 1\nLoada 1\nLoads 1\nLoafers 1\nLoathing 1\nLoay 1\nLobby 1\nLobbying 1\nLobbyist 1\nLobe 1\nLobster 1\nLocally 1\nLocarno 1\nLoch 1\nLock 1\nLocke 1\nLocker 1\nLockman 1\nLocust 1\nLodestar 1\nLogsdon 1\nLok 1\nLokey 1\nLoman 1\nLomas 1\nLompoc 1\nLonbado 1\nLonde 1\nLondoners 1\nLondons 1\nLoney 1\nLong/Zhou 1\nLongchang 1\nLongest 1\nLonghu 1\nLonging 1\nLongmenshan 1\nLongnan 1\nLongping 1\nLongtan 1\nLongtime 1\nLongwan 1\nLonnie 1\nLookout 1\nLooming 1\nLope 1\nLopid 1\nLorain 1\nLorca 1\nLordstown 1\nLoretta 1\nLorex 1\nLoring 1\nLorne 1\nLorraine 1\nLotos 1\nLotuses 1\nLoud 1\nLoudoun 1\nLoughman 1\nLouisiane 1\nLouisianna 1\nLouvre 1\nLova 1\nLovejoy 
1\nLovely 1\nLoves 1\nLoving 1\nLovley 1\nLowenthal 1\nLowndes 1\nLowry 1\nLoyalty 1\nLubbock 1\nLubkin 1\nLubriderm 1\nLubyanka 1\nLucelly 1\nLuciano 1\nLucinda 1\nLucisano 1\nLucullan 1\nLudan 1\nLuding 1\nLudwigshafen 1\nLuehrs 1\nLuerssen 1\nLugar 1\nLukassen 1\nLukes 1\nLukou 1\nLumber 1\nLumbera 1\nLun 1\nLuna 1\nLunch 1\nLuncheon 1\nLund 1\nLuobo 1\nLuoluo 1\nLuoshi 1\nLuqiao 1\nLustgarten 1\nLutai 1\nLutheran 1\nLuweitan 1\nLux 1\nLuxurious 1\nLuxury 1\nLvovna 1\nLydia 1\nLyman 1\nLyme 1\nLynchburg 1\nLynden 1\nLynes 1\nLynmouth 1\nLyric 1\nM.D 1\nM.D.C. 1\nM.E. 1\nM.I.T. 1\nM.J. 1\nM.R. 1\nM.W. 1\nM/30 1\nM16 1\nM3 1\nM8.7sp 1\nMACHINES 1\nMACMILLAN 1\nMACPOST 1\nMACREDO 1\nMAD 1\nMAI 1\nMAINTENANCE 1\nMAITRE'D 1\nMAJOR 1\nMALAISE 1\nMAN 1\nMANCHILD 1\nMANEUVERS 1\nMANUALS 1\nMANUFACTURING 1\nMARCH 1\nMARCOS 1\nMARK 1\nMASSAGE 1\nMATERIALS 1\nMBB 1\nMBL8 1\nMBTI 1\nMC 1\nMC68030 1\nMC88200 1\nMEA 1\nMEALS 1\nMEASUREX 1\nMEAT 1\nMEDIA 1\nMEDICINE 1\nMEDUSA 1\nMEK 1\nMEMOS 1\nMEP 1\nMERGER 1\nMG 1\nMGr...@usa.pipeline.com 1\nMH 1\nMI 1\nMICHAEL 1\nMICRO 1\nMICROSYSTEMS 1\nMIDDLEMAN 1\nMILEAGE 1\nMINI 1\nMINIMUM 1\nMINING 1\nMINOR 1\nMINORITY 1\nMINT 1\nMISSISSIPPI 1\nMISTAKE 1\nMISUSE 1\nMITI 1\nMK 1\nMKULTRA 1\nMLA 1\nMLO 1\nMMC 1\nMMG 1\nMMOG 1\nMNB 1\nMOB 1\nMOLE 1\nMONITORED 1\nMOPA 1\nMORALITY 1\nMORGAN 1\nMOST 1\nMOT 1\nMOTOR 1\nMOTORS 1\nMOV 1\nMOVED 1\nMOVES 1\nMOVING 1\nMR. 
1\nMSU 1\nMTLE 1\nMU 1\nMUCH 1\nMUD 1\nMUNICIPALS 1\nMURDER 1\nMURPH 1\nMUTUAL 1\nMVB 1\nMVL 1\nMYERS 1\nMYSTERYS 1\nMYTHS 1\nMZ 1\nMa'ariv 1\nMaarouf 1\nMabellini 1\nMacAllister 1\nMacDougall 1\nMacNamara 1\nMacSharry 1\nMacTheRipper 1\nMacaemse 1\nMacapagal 1\nMacari 1\nMacaroni 1\nMacarthur 1\nMacaulay 1\nMaccabee 1\nMacchiarola 1\nMacfarlane 1\nMachelle 1\nMachiavelli 1\nMacho 1\nMachon 1\nMackinac 1\nMackinaw 1\nMaclaine 1\nMaclean 1\nMacrocosm 1\nMacs 1\nMacvicar 1\nMadam 1\nMadani 1\nMaddox 1\nMadilyn 1\nMadson 1\nMaeda 1\nMagalhaes 1\nMaged 1\nMaggiore 1\nMaghfouri 1\nMagians 1\nMagistrate 1\nMagnet 1\nMagnetic 1\nMagnolias 1\nMagnus 1\nMagruder 1\nMagy 1\nMahan 1\nMaher 1\nMahodi 1\nMahrous 1\nMahtar 1\nMaid 1\nMaiden 1\nMailson 1\nMaimaiti 1\nMainichi 1\nMainly 1\nMainz 1\nMaiorana 1\nMaisa 1\nMaisara 1\nMaitland 1\nMaizuru 1\nMaj. 1\nMajd 1\nMajed 1\nMajestic 1\nMajowski 1\nMajusi 1\nMajyul 1\nMakin 1\nMakine 1\nMakla 1\nMakoni 1\nMakoto 1\nMaku 1\nMakwah 1\nMalabo 1\nMalacca 1\nMalach 1\nMalaki 1\nMalapai 1\nMalawi 1\nMalayo 1\nMalays 1\nMalden 1\nMaldives 1\nMalec 1\nMalenchenko 1\nMalibu 1\nMalicious 1\nMalick 1\nMalkovich 1\nMalls 1\nMalpass 1\nMalpede 1\nMals 1\nMalsela 1\nMaltese 1\nMalthus 1\nMaluf 1\nMamansk 1\nMammonorial 1\nManaf 1\nManaifatturiera 1\nManalapan 1\nManchu 1\nMancini 1\nMandans 1\nMander 1\nMandibles 1\nMandina 1\nMandom 1\nMandrake 1\nMandros 1\nMandy 1\nManfred 1\nManger 1\nManges 1\nMangino 1\nMangold 1\nMangrove 1\nManhatten 1\nManicure 1\nManimail 1\nMankiw 1\nMann@ENRON 1\nMannesmann 1\nMannington 1\nManogue 1\nManor 1\nMansoon 1\nMansoor 1\nMansura 1\nMantia 1\nMantu 1\nMantua 1\nManukua 1\nManzoni 1\nMao'ergai 1\nMaoxian 1\nMapQuest 1\nMarbella 1\nMarcel 1\nMarcello 1\nMarche 1\nMarchers 1\nMarching 1\nMareham 1\nMargie 1\nMarginal 1\nMargo 1\nMargolis 1\nMarguerite 1\nMariam 1\nMarian 1\nMariana 1\nMarib 1\nMaricopa 1\nMariel 1\nMarikhi 1\nMarilao 1\nMarinCounty 1\nMariners 1\nMarist 1\nMarjorie 1\nMarketplace 1\nMarlo 
1\nMarni 1\nMarnier 1\nMarquesas 1\nMarquez 1\nMarriages 1\nMarrill 1\nMarrying 1\nMarschalk 1\nMarseillaise 1\nMarsha 1\nMarshal 1\nMartek 1\nMarten 1\nMartex 1\nMarthe 1\nMarti 1\nMartian 1\nMartinez@ENRON 1\nMartyn 1\nMartyrdom 1\nMarval 1\nMarvelon 1\nMarvelous 1\nMarver 1\nMarysville 1\nMasaaki 1\nMasahiko 1\nMashit 1\nMasibih 1\nMasillon 1\nMasonic 1\nMasonics 1\nMasri 1\nMassMutual 1\nMassicotte 1\nMasterpiece 1\nMastery 1\nMasud 1\nMasur 1\nMata 1\nMatab 1\nMatalin 1\nMatamoros 1\nMatcher 1\nMatchstick 1\nMate 1\nMateyo 1\nMath 1\nMathematical 1\nMathematics 1\nMatheson 1\nMathews 1\nMathewson 1\nMatic 1\nMatitle 1\nMatlock 1\nMatsing 1\nMatsuo 1\nMattCollins 1\nMatters 1\nMattes 1\nMattia 1\nMattis 1\nMattola 1\nMattone 1\nMature 1\nMaturities 1\nMatuschka 1\nMatwali 1\nMatz 1\nMaude 1\nMaumee 1\nMaureen 1\nMausoleum 1\nMaven 1\nMavis 1\nMaxim 1\nMaximo 1\nMaxxam 1\nMayan 1\nMayans 1\nMayes 1\nMayhap 1\nMayors 1\nMayumi 1\nMayur 1\nMazdaism 1\nMaze 1\nMazna 1\nMazowsze* 1\nMazzera 1\nMbeki 1\nMcAlu 1\nMcAuley 1\nMcCaffrey 1\nMcCaine 1\nMcCarran 1\nMcCartin 1\nMcCarty 1\nMcCaskey 1\nMcCaughey 1\nMcClements 1\nMcCloud 1\nMcCloy 1\nMcCollum 1\nMcConnell 1\nMcConnell@ECT 1\nMcCormack 1\nMcCorry 1\nMcCracken 1\nMcCutchen 1\nMcDermid 1\nMcFall 1\nMcGillicuddy 1\nMcGinley 1\nMcGregor 1\nMcGwire 1\nMcHenry 1\nMcInerney 1\nMcIntosh 1\nMcKee 1\nMcKim 1\nMcKinzie 1\nMcLaren 1\nMcLelland 1\nMcLeran 1\nMcLuhan 1\nMcManus 1\nMcMillen 1\nMcMullin 1\nMcNarry 1\nMcNaught 1\nMcNeill 1\nMcNugg 1\nMcQueen 1\nMckaleese 1\nMckleese 1\nMcmanus 1\nMeadowrun 1\nMeadows 1\nMean 1\nMeaning 1\nMeasured 1\nMeasurement 1\nMeatballs 1\nMecaniques 1\nMechaincs 1\nMechanisms 1\nMechanized 1\nMed 1\nMeddling 1\nMedfield 1\nMediated 1\nMedici 1\nMedicinal 1\nMedicines 1\nMedicis 1\nMeditating 1\nMeditation 1\nMednis 1\nMedusa 1\nMedvedev 1\nMeetings 1\nMeets 1\nMega 1\nMegane 1\nMeharry 1\nMehola 1\nMeiguan 1\nMeiji 1\nMeiling 1\nMeilo 1\nMeisi 1\nMeizhen 1\nMekki 1\nMelissa 1\nMellencamp 
1\nMello 1\nMellor 1\nMelton 1\nMelvin 1\nMemoirs 1\nMemorandum 1\nMenem 1\nMenendez 1\nMeng 1\nMenger 1\nMengfu 1\nMengjun 1\nMengqin 1\nMenjun 1\nMenomonee 1\nMent 1\nMental 1\nMentally 1\nMerabank 1\nMeraux 1\nMercifully 1\nMercy 1\nMerdelet 1\nMerdeuses 1\nMergers 1\nMerianovic 1\nMerill 1\nMerkel 1\nMerkurs 1\nMermonsk 1\nMermosk 1\nMerola 1\nMerrik 1\nMerrin 1\nMerryman 1\nMersa 1\nMersyside 1\nMeselson 1\nMeserve 1\nMeshulam 1\nMesios 1\nMesirov 1\nMess 1\nMessa 1\nMesserschmitt 1\nMessiaen 1\nMetWest 1\nMeta 1\nMetamorphosis 1\nMetatrace 1\nMethanol 1\nMethod 1\nMethodius 1\nMethylene 1\nMetric 1\nMetroplex 1\nMetropolis 1\nMetrorail 1\nMetruh 1\nMetschan 1\nMetzenbaum 1\nMetzenbaums 1\nMeurer 1\nMevacor 1\nMey 1\nMezzogiorno 1\nMfM 1\nMfg. 1\nMfume 1\nMi 1\nMiana 1\nMianzhu 1\nMiaomiao 1\nMiaoyang 1\nMichaelcheck 1\nMichagin 1\nMichelangelos 1\nMichelia 1\nMichelman 1\nMicroGeneSys 1\nMicrobiology 1\nMicrofluidics 1\nMicrolog 1\nMicronyx 1\nMicropolis 1\nMicroprocessor 1\nMid-Atlantic 1\nMid-Autumn 1\nMid-State 1\nMid-sized 1\nMidco 1\nMiddlebury 1\nMidgetman 1\nMidlevel 1\nMidmorning 1\nMidsized 1\nMidtown 1\nMidwife 1\nMighty 1\nMigration 1\nMihalek 1\nMik 1\nMikati 1\nMikiang 1\nMil 1\nMilacron 1\nMilanzech 1\nMilbank 1\nMilburn 1\nMilitia 1\nMillan 1\nMillbrae 1\nMillenium 1\nMillie 1\nMillioniare 1\nMilo 1\nMilos 1\nMilstar 1\nMilt 1\nMimosas 1\nMinato 1\nMinchuan 1\nMindManager 1\nMinding 1\nMindoro 1\nMineola 1\nMinera 1\nMineral 1\nMinero 1\nMinerva 1\nMingan 1\nMingchun 1\nMinghui 1\nMingli 1\nMingquan 1\nMingtong 1\nMingyang 1\nMingying 1\nMingyuan 1\nMinicar 1\nMinikes 1\nMiniscribe 1\nMinisterial 1\nMinistership 1\nMinitruck 1\nMinn 1\nMinnelli 1\nMinnie 1\nMinster 1\nMintz 1\nMinute 1\nMinwax 1\nMinxiong 1\nMiqdad 1\nMirabello 1\nMiraculously 1\nMirai 1\nMirco 1\nMiringoff 1\nMiron 1\nMirroring 1\nMisa 1\nMiscellaneous 1\nMiser 1\nMisery 1\nMisha 1\nMishal 1\nMissed 1\nMissiles 1\nMissions 1\nMississippian 1\nMissned 1\nMistake 1\nMistakes 
1\nMisto 1\nMisu 1\nMisubishi 1\nMisunderstanding 1\nMisunderstandings 1\nMitch 1\nMitre 1\nMitsutaka 1\nMittag 1\nMitten 1\nMitzel 1\nMixtec 1\nMiyata 1\nMiyoshi 1\nMizuno 1\nMnouchkine 1\nMoI 1\nMoammar 1\nMob 1\nMobilising 1\nMock 1\nModa 1\nMode 1\nModell 1\nModerators 1\nModernization 1\nModernizations 1\nModrian 1\nModrow 1\nMoertel 1\nMoffett 1\nMogadishu 1\nMogan 1\nMohan 1\nMohandas 1\nMohiuddin 1\nMohsen 1\nMoira 1\nMoises 1\nMojia 1\nMoldavia 1\nMolecular 1\nMolly 1\nMoloyev 1\nMolten 1\nMomer 1\nMonadnock 1\nMonaneng 1\nMonarch 1\nMonarchy 1\nMonastery 1\nMones 1\nMoneyline 1\nMongan 1\nMonkeys 1\nMonmouth 1\nMonopolies 1\nMonorail 1\nMonster 1\nMontague 1\nMontbrial 1\nMontedision 1\nMonteith 1\nMontenagrian 1\nMonterey 1\nMontgolfier 1\nMontgomery@ENRON 1\nMonthsaway 1\nMontle 1\nMontorgueil 1\nMontpelier 1\nMood 1\nMoonachie 1\nMooney 1\nMoor 1\nMoorhead 1\nMorality 1\nMorally 1\nMoratti 1\nMorcott 1\nMorency 1\nMoreton 1\nMorever 1\nMorey 1\nMorfey 1\nMorgantown 1\nMorinaga 1\nMorino 1\nMorita 1\nMoritz 1\nMorley 1\nMormon 1\nMormonism 1\nMorphed 1\nMorrell 1\nMorrisville 1\nMose 1\nMoselle 1\nMosher 1\nMossoviet 1\nMot 1\nMotherwell 1\nMotivated 1\nMotorcycle 1\nMotorfair 1\nMotorized 1\nMotorsport 1\nMotorsports 1\nMotown 1\nMotoyuki 1\nMottaki 1\nMotto 1\nMottram 1\nMou 1\nMouhamad 1\nMounigou 1\nMourning 1\nMouser 1\nMousketeer 1\nMoutasaddiq 1\nMove-Team@ENRON 1\nMoveOn.org 1\nMoves 1\nMoxley 1\nMoynihan 1\nMozart 1\nMpilo 1\nMuarraf 1\nMuaz 1\nMuba 1\nMubadala 1\nMubrid 1\nMucha 1\nMudd 1\nMuddy 1\nMuffin 1\nMuffins 1\nMuffs 1\nMuhammed 1\nMuharram 1\nMuhsin 1\nMujahed 1\nMujahid 1\nMukhi 1\nMukhtar 1\nMulhouse 1\nMullah 1\nMulrooney 1\nMulti-Income 1\nMultidirectional 1\nMultiflow 1\nMultilateral 1\nMultimate 1\nMultimedia 1\nMultinational 1\nMultinationals 1\nMultiples 1\nMulva 1\nMulvoy 1\nMumson 1\nMunching 1\nMundo 1\nMuni 1\nMunir 1\nMunitions 1\nMunsell 1\nMurai 1\nMurat 1\nMurauts 1\nMurders 1\nMurmosk 1\nMurrah 1\nMusa 1\nMusannad 
1\nMusca 1\nMuscles 1\nMuscovites 1\nMuse 1\nMushan 1\nMusician 1\nMuskegon 1\nMussa 1\nMusta'in 1\nMustain 1\nMustansiriyah 1\nMustasim 1\nMutaa 1\nMutinies 1\nMuwaffaq 1\nMwakiru 1\nMySQL 1\nMylroie 1\nMysteries 1\nMyths 1\nN.D. 1\nN.H 1\nNAB 1\nNAFTA 1\nNAMED 1\nNAS 1\nNASCAR 1\nNASDA 1\nNASTY 1\nNATION'S 1\nNATIONWIDE 1\nNCR 1\nNDL 1\nNDRC 1\nNE 1\nNEATLY 1\nNEEDED 1\nNEEEEEEEEEVERRRR 1\nNEKOOSA 1\nNEWHALL 1\nNEWSPAPERS 1\nNEXT 1\nNGO's 1\nNH 1\nNHI 1\nNHL 1\nNICE 1\nNICER 1\nNICHOLS 1\nNJTransit 1\nNKK 1\nNON 1\nNORDSTROM 1\nNORTH 1\nNORTHEAST 1\nNOTES 1\nNOTHING 1\nNOV 1\nNOVEMBER 1\nNPD 1\nNSNewsMax 1\nNST 1\nNT&SA 1\nNTI 1\nNTSB 1\nNU 1\nNUCLEAR 1\nNUM 1\nNURSING 1\nNVQ 1\nNYMEX 1\nNYPD 1\nNYT 1\nNYU 1\nNZ$ 1\nNZI 1\nNZLR 1\nNabal 1\nNabarro 1\nNabetaen 1\nNabil 1\nNabokov 1\nNac 1\nNacchio 1\nNachmany 1\nNacion 1\nNada 1\nNadelmann 1\nNadereh 1\nNadi 1\nNadia 1\nNadja 1\nNagano 1\nNagar 1\nNagasaki 1\nNagaski 1\nNagayama 1\nNagin 1\nNaguib 1\nNahhhhh 1\nNahrawan 1\nNaisi 1\nNaja 1\nNaji 1\nNajib 1\nNakata 1\nNakazato 1\nNaked 1\nNakhilan 1\nNamaskar 1\nNamath 1\nNamdaemun 1\nNamibian 1\nNaming 1\nNan'an 1\nNanbing 1\nNanchong 1\nNandu 1\nNanfang 1\nNanhai 1\nNankang 1\nNanny 1\nNano 1\nNanshan 1\nNanta 1\nNantucket 1\nNantzu 1\nNanxiang 1\nNaperville 1\nNapoleonic 1\nNarragansett 1\nNarrow 1\nNarrowing 1\nNarusuke 1\nNash 1\nNasional 1\nNasir 1\nNasiri 1\nNasrawi 1\nNastro 1\nNate 1\nNathaniel 1\nNationalList 1\nNationsl 1\nNatter 1\nNatura 1\nNaturalization 1\nNatures 1\nNatzi 1\nNauman 1\nNaumberg 1\nNauru 1\nNautilus 1\nNave 1\nNavforJapan 1\nNavin 1\nNaw 1\nNawaf 1\nNay 1\nNazism 1\nNeapolitan 1\nNeas 1\nNeck 1\nNedre 1\nNegroponte 1\nNeidl 1\nNeighboring 1\nNeighbors 1\nNeill 1\nNeinas 1\nNejd 1\nNelieer 1\nNell 1\nNella 1\nNemec 1\nNengyuan 1\nNeo-Con 1\nNeoclassical 1\nNeocon 1\nNeoconservatives 1\nNeonjab 1\nNeoplatonic 1\nNepali 1\nNephew 1\nNervousness 1\nNesco 1\nNessingwary 1\nNetMeeting 1\nNetanya 1\nNetease 1\nNetworking 1\nNeubauer 1\nNeue 
1\nNeuendorffer 1\nNeuhaus 1\nNeuilly 1\nNeulasta 1\nNeuromancer 1\nNeuros 1\nNeurosciences 1\nNeuroscientist 1\nNeutrality 1\nNevermind 1\nNewer 1\nNewgate 1\nNewhalem 1\nNewmont 1\nNewsday.com 1\nNewspeak 1\nNewswire 1\nNgawa 1\nNiagra 1\nNiang 1\nNiche-itis 1\nNickle 1\nNicois 1\nNicolo 1\nNidal 1\nNidd 1\nNigerian 1\nNigerians 1\nNightdress 1\nNightingale 1\nNightlife 1\nNightmare 1\nNighttime 1\nNiigata 1\nNikes 1\nNikita 1\nNikka 1\nNikons 1\nNinety 1\nNinevah 1\nNingguo 1\nNipsco 1\nNipsy 1\nNisa'i 1\nNishi 1\nNishimura 1\nNissans 1\nNissen 1\nNita 1\nNite 1\nNith 1\nNitrogen 1\nNiumien 1\nNiva 1\nNivine 1\nNo's 1\nNo.3 1\nNoSrawlTax 1\nNoSrawlTax.org 1\nNoah 1\nNobels 1\nNobre 1\nNobrega 1\nNoir 1\nNoise 1\nNokomis 1\nNomination 1\nNon-Proliferation 1\nNon-bank 1\nNon-executive 1\nNon-interest 1\nNon-lawyers 1\nNon-smoking 1\nNonProfit 1\nNonain 1\nNonferrous 1\nNonfiction 1\nNonfood 1\nNonian 1\nNonunion 1\nNoonan 1\nNorbert 1\nNorcross 1\nNord 1\nNordine 1\nNordisk 1\nNorge 1\nNoriegan 1\nNorimasa 1\nNorle 1\nNorm 1\nNorma 1\nNorment 1\nNorms 1\nNorquist 1\nNorthampton 1\nNortheastern 1\nNorthington 1\nNorthlich 1\nNorthrup 1\nNorthwood 1\nNorwalk 1\nNostra 1\nNostradamus 1\nNostromo 1\nNotable 1\nNotably 1\nNothin' 1\nNotices 1\nNoticias 1\nNotwithstanding 1\nNouveaux 1\nNovak 1\nNovametrix 1\nNovelist 1\nNovick 1\nNovo 1\nNovostate 1\nNovotel 1\nNowheresville 1\nNozzle 1\nNt 1\nNui 1\nNumas 1\nNunn 1\nNur 1\nNurah 1\nNuremberg 1\nNuria_R_Ibarra@calpx.com 1\nNurse 1\nNursing 1\nNurture 1\nNusbaum 1\nNutmeg 1\nNutraSweet 1\nNutrition 1\nNuttall 1\nNux 1\nNvidia 1\nNville 1\nO&M 1\nO' 1\nO'Boyle 1\nO'Brian 1\nO'Dwyer's 1\nO'Gatta 1\nO'Hara 1\nO'Hare 1\nO'Keefe 1\nO'Malley 1\nO'Neil 1\nO'Reilly 1\nO'Rourke 1\nO'brien 1\nO'clock 1\nO.D.S.P 1\nO.P. 
1\nOAV 1\nOBP 1\nOBrion 1\nOCC 1\nODDITIES 1\nODF 1\nODI 1\nOEM 1\nOEO 1\nOFFICIALS 1\nOIL 1\nOLED 1\nOLEDs 1\nOMFG 1\nONEIDA 1\nONEZIE 1\nONLINE 1\nONSI 1\nOOPS 1\nOPEN 1\nOPENING 1\nOPI 1\nOPINION 1\nOPPENHEIMER 1\nORACLE 1\nORDERS 1\nORGANIZED 1\nOSD 1\nOSI 1\nOSTRICH 1\nOSX 1\nOTHER 1\nOTOH 1\nOUTRAGE 1\nOVERALL 1\nOVERHAUL 1\nOVERVIEW 1\nOWN 1\nOWNER 1\nOakley 1\nObama 1\nObedience 1\nOberhausen 1\nObermaier 1\nObey 1\nObiang 1\nObina 1\nObjects 1\nObservation 1\nObsolete 1\nObudu 1\nObvious 1\nOccam 1\nOccident 1\nOccupying 1\nOceanic 1\nOcho 1\nOchoa 1\nOchs 1\nOctopus 1\nOdd 1\nOdell 1\nOdessa 1\nOdisha 1\nOerlikon 1\nOffensive 1\nOfficers 1\nOffsetting 1\nOffshore 1\nOftentimes 1\nOgada 1\nOgade 1\nOgburns 1\nOgilvyspeak 1\nOhara 1\nOhioan 1\nOhioans 1\nOhls 1\nOhmae 1\nOhmro 1\nOilfield 1\nOils 1\nOilwell 1\nOily 1\nOkabayashi 1\nOkasan 1\nOke 1\nOkichobee 1\nOkla 1\nOkobank 1\nOlbina 1\nOlds 1\nOlean 1\nOlestra 1\nOlissa 1\nOlive 1\nOllari 1\nOlof 1\nOlsson 1\nOluanpi 1\nOm 1\nOmanis 1\nOmari 1\nOmbliz 1\nOmbudsmen 1\nOmega 1\nOmnibank 1\nOmnicorp 1\nOncogenes 1\nOneSearch 1\nOnek 1\nOnlookers 1\nOnstage 1\nOnward 1\nOolong 1\nOops 1\nOpel 1\nOpens 1\nOperate 1\nOperational 1\nOpere 1\nOpositora 1\nOpportunities 1\nOps 1\nOptic 1\nOptical4Less 1\nOptimist 1\nOptimistic 1\nOptoelectronics 1\nOral 1\nOranges 1\nOrbach 1\nOrbe 1\nOrbis 1\nOrbital 1\nOrchestra 1\nOrchestration 1\nOrdinances 1\nOrel 1\nOren 1\nOrgani 1\nOrganizers 1\nOrganizing 1\nOrgans 1\nOrgie 1\nOrhi 1\nOriented 1\nOrigin 1\nOrigination 1\nOrigins 1\nOrin 1\nOriole 1\nOrioles 1\nOrissa 1\nOrndorff 1\nOrnette 1\nOrnstein 1\nOrondo 1\nOropos 1\nOrphaned 1\nOrtegas 1\nOrtho 1\nOsborn 1\nOsborne 1\nOsbournes 1\nOsiraq 1\nOsman 1\nOsprey 1\nOster 1\nOstrander 1\nOther's 1\nOtros 1\nOuattara 1\nOuch 1\nOuda 1\nOuedraogo 1\nOuhai 1\nOuija 1\nOusted 1\nOutdated 1\nOutgrowths 1\nOuthouse 1\nOuthwaite 1\nOutlays 1\nOutreach 1\nOutsource 1\nOutsourcing 1\nOverbuilt 1\nOvercaes 1\nOverdevelopment 
1\nOverhead 1\nOvernite 1\nOverreacting 1\nOversupply 1\nOvert 1\nOvertega 1\nOvid 1\nOviously 1\nOvneol 1\nOwing 1\nOwomoyela 1\nOxfordian 1\nOxfordians 1\nOxfords 1\nOxfordshire 1\nOzone 1\nP&I 1\nP.A. 1\nP.K. 1\nP.O. 1\nP.R. 1\nP0340 1\nP133 1\nPAC 1\nPACKARD 1\nPACS 1\nPAID 1\nPAINT 1\nPANDA 1\nPANHANDLER 1\nPARENT 1\nPARKER 1\nPARTNERSHIP 1\nPATOIS 1\nPAYMENTS 1\nPAYS 1\nPB 1\nPC's 1\nPC-ing 1\nPCB 1\nPCI 1\nPCIe 1\nPCR 1\nPD 1\nPDF 1\nPEANUTS 1\nPEC 1\nPENALTY 1\nPENCIL 1\nPENCILS 1\nPENNEY 1\nPENSION 1\nPEP 1\nPEREGRINO 1\nPEREGRINate 1\nPERFORMANCE 1\nPERIOD 1\nPERLINGIERE 1\nPETSMART 1\nPG 1\nPGA 1\nPGM 1\nPHILADELPHIA 1\nPHONE 1\nPHOTOS 1\nPHP5 1\nPILING 1\nPIN 1\nPIPELINE 1\nPIR 1\nPITCH 1\nPJC 1\nPJM 1\nPLANS 1\nPLANTS 1\nPLASTIC 1\nPLAYER 1\nPLEASE 1\nPOJOs 1\nPOLICY 1\nPOLITICAL 1\nPOLYGAMY 1\nPONCHATOULA 1\nPOSSIBLE 1\nPPM 1\nPPPG 1\nPRA 1\nPRICED 1\nPRIMERICA 1\nPRINCE 1\nPRISON 1\nPRIVATE 1\nPRO 1\nPROCEEDINGS 1\nPROCTER 1\nPROD 1\nPRODUCTS 1\nPROFIT 1\nPROFITS 1\nPROGRAM 1\nPROGRAMS 1\nPROMISED 1\nPROMOTION 1\nPROPERTY 1\nPROPOSAL 1\nPROSECUTOR 1\nPROSECUTORS 1\nPROSPECTS 1\nPROVIDED 1\nPRS 1\nPS. 1\nPSI 1\nPSU 1\nPT5 1\nPTL 1\nPVA 1\nPVT 1\nPVT. 
1\nPW 1\nPac 1\nPacemakers 1\nPacer 1\nPacholik 1\nPachyderms 1\nPacitti 1\nPack 1\nPackaged 1\nPackin 1\nPacquiao 1\nPaddlers 1\nPadget 1\nPaganism 1\nPageant 1\nPager 1\nPagones 1\nPagurian 1\nPahl 1\nPahsien 1\nPain 1\nPaintings 1\nPair 1\nPakistanis 1\nPalais 1\nPale 1\nPaleozoic 1\nPalermo 1\nPalisades 1\nPalmatier 1\nPalme 1\nPalmer@ENRON 1\nPalmistry 1\nPalomino 1\nPampers 1\nPamplona 1\nPan-Alberta 1\nPan-American 1\nPanAm 1\nPanamanians 1\nPancras 1\nPandora 1\nPanelists 1\nPanels 1\nPanera 1\nPangea 1\nPanglossian 1\nPanic 1\nPankyo 1\nPanmunjom 1\nPanny 1\nPanorama 1\nPans 1\nPantaps 1\nPantheon 1\nPantyhose 1\nPaochung 1\nPaoen 1\nPaolo 1\nPaos 1\nPaoshan 1\nPapa 1\nPapciak 1\nPapermils 1\nPaperwork 1\nParacchini 1\nParadox 1\nParagon 1\nParagons 1\nParakeet 1\nParallel 1\nParalympics 1\nParama 1\nParamedics 1\nPardon 1\nParental 1\nPareo 1\nParkersburg 1\nParkhaji 1\nParkland 1\nParkshore 1\nParle 1\nParmigiana 1\nParoles 1\nParra 1\nParretti 1\nParrino 1\nParrott 1\nParsippany 1\nParson 1\nParsons 1\nPart-timer 1\nPartially 1\nParticles 1\nParticular 1\nParting 1\nPartisan 1\nPartner 1\nParvovirus 1\nPascale 1\nPaschi 1\nPascricha 1\nPascual 1\nPascutto 1\nPasquale 1\nPasricha 1\nPassenger 1\nPasternak 1\nPastorius 1\nPatch 1\nPateh 1\nPath 1\nPaths 1\nPatience 1\nPatriarca 1\nPatricelli 1\nPatrician 1\nPatriotism 1\nPats 1\nPatsy 1\nPaulson 1\nPause 1\nPautsch 1\nPavel 1\nPaves 1\nPavilion 1\nPawan 1\nPawtucket 1\nPax 1\nPaxman 1\nPayers 1\nPayload 1\nPayola 1\nPayout 1\nPea 1\nPeachy 1\nPeacock 1\nPeafowl 1\nPeake 1\nPeanutjake 1\nPearlstein 1\nPecos 1\nPeder 1\nPedersen 1\nPederson 1\nPediatric 1\nPedicures 1\nPedigrees 1\nPeduzzi 1\nPee 1\nPeebles 1\nPeeking 1\nPeel 1\nPeg 1\nPeifu 1\nPeignot 1\nPeiyao 1\nPeiyeh 1\nPeiyun 1\nPeleliu 1\nPeljesac 1\nPellens 1\nPen 1\nPenang 1\nPence 1\nPencils 1\nPendant 1\nPendulum 1\nPenelope 1\nPenguin 1\nPenhaul 1\nPennysaver 1\nPentagonese 1\nPentium 1\nPeople.com.cn 1\nPeppermint 1\nPepsiCola 1\nPercent 
1\nPerceptions 1\nPerchance 1\nPercussion 1\nPere 1\nPeregrine 1\nPereira 1\nPerelman 1\nPerform 1\nPerformances 1\nPerformers 1\nPeri-natal 1\nPerimeter 1\nPerk 1\nPerl 1\nPermanent 1\nPermian 1\nPermission 1\nPermits 1\nPerozo 1\nPerry@ENRON_DEVELOPMENT 1\nPerseverance 1\nPershare 1\nPersia 1\nPersuading 1\nPerth 1\nPerugawan 1\nPervaiz 1\nPestered 1\nPestrana 1\nPesus 1\nPetaluma 1\nPeterborough 1\nPetersen 1\nPeteski 1\nPetits 1\nPetra 1\nPetrarch 1\nPetras 1\nPetre 1\nPetrified 1\nPetro 1\nPetrofina 1\nPetroliam 1\nPetrovich 1\nPetruzzi 1\nPetsmart 1\nPettit 1\nPettitte 1\nPeyrelongue 1\nPfau 1\nPfiefer 1\nPgh 1\nPh 1\nPhalangist 1\nPhalangists 1\nPhantom 1\nPharmacists 1\nPharmics 1\nPhasing 1\nPhat 1\nPhenix 1\nPheonix 1\nPherwani 1\nPhilanthropic 1\nPhilharmonic 1\nPhilipines 1\nPhilippians 1\nPhilly 1\nPhils 1\nPhineas 1\nPhobias 1\nPhoebe 1\nPhotoShop 1\nPhotographic 1\nPhotoprotective 1\nPhotorealism 1\nPhotos 1\nPhx 1\nPhysicians 1\nPhysiology 1\nPhysios 1\nPhysiotherapists 1\nPiano 1\nPic 1\nPicasa 1\nPicassos 1\nPiccolino 1\nPichia 1\nPick 1\nPickin 1\nPicking 1\nPickins 1\nPickle 1\nPicot 1\nPicoult 1\nPicus 1\nPie 1\nPied 1\nPierluigi 1\nPiers 1\nPieter 1\nPietists 1\nPiety 1\nPigeonnier 1\nPiggybacking 1\nPigou 1\nPikaia 1\nPike 1\nPilanesburg 1\nPilferage 1\nPilipino 1\nPillow 1\nPillsbury 1\nPilsudski 1\nPimenov 1\nPina 1\nPinatubo 1\nPinch 1\nPincus 1\nPineapple 1\nPines 1\nPingchen 1\nPingho 1\nPingxiang 1\nPingxingguan 1\nPingyang 1\nPingyi 1\nPining 1\nPink 1\nPinky 1\nPinsou 1\nPio 1\nPiping 1\nPips 1\nPirelli 1\nPisa 1\nPiscataway 1\nPissarro 1\nPistol 1\nPiszczalski 1\nPit 1\nPitcher 1\nPitching 1\nPitcoff 1\nPitiful 1\nPixley 1\nPizzo 1\nPlaced 1\nPlacements 1\nPlacer 1\nPlacid 1\nPlacing 1\nPlaines 1\nPlantago 1\nPlanting 1\nPlaster 1\nPlastow 1\nPlatform 1\nPlatonic 1\nPlatsberg 1\nPlayStations 1\nPlayboy 1\nPlayboys 1\nPlayhouse 1\nPlayskool 1\nPlaywrights 1\nPlc 1\nPledge 1\nPlenipotentiary 1\nPlenitude 1\nPlews 1\nPlouf 1\nPloys 
1\nPlt 1\nPlucker 1\nPlug 1\nPlugging 1\nPlugs 1\nPlum 1\nPlumbing 1\nPlummer 1\nPlump 1\nPluralism 1\nPoachers 1\nPoag 1\nPockets 1\nPoconos 1\nPoem 1\nPoeme 1\nPoet 1\nPoetry 1\nPoets 1\nPointes 1\nPoison 1\nPoisoning 1\nPoker 1\nPol 1\nPoliceman 1\nPolished 1\nPolitically 1\nPolitkovskaya 1\nPolitrick 1\nPollack 1\nPollin 1\nPollo 1\nPollution 1\nPolsky 1\nPolycast 1\nPolygram 1\nPolymerix 1\nPolysilicon 1\nPolyurethane 1\nPolyvinyl 1\nPom 1\nPomfret 1\nPompano 1\nPomper 1\nPomton 1\nPoncelet 1\nPontiff 1\nPoole 1\nPopeye 1\nPopkin 1\nPopolare 1\nPopovic 1\nPoppenberg 1\nPopper 1\nPops 1\nPopulares 1\nPorche 1\nPork 1\nPorkapolis 1\nPorno 1\nPortagee 1\nPortals 1\nPorte 1\nPortera 1\nPortfolios 1\nPortia 1\nPortrait 1\nPortrayal 1\nPortsmouth 1\nPosh 1\nPossessing 1\nPossibilities 1\nPostive 1\nPostscript 1\nPotala 1\nPotash 1\nPotatoes 1\nPotentially 1\nPothier 1\nPotidea 1\nPots 1\nPotsy 1\nPottery 1\nPotts 1\nPottstown 1\nPotty 1\nPoulin 1\nPounding 1\nPovich 1\nPowder 1\nPowin 1\nPoyner 1\nPozen 1\nPozza 1\nPrab 1\nPrabhakar 1\nPractical 1\nPractice 1\nPraetorian 1\nPrandini 1\nPratap 1\nPratt 1\nPravo 1\nPre-College 1\nPre-Foreclosures 1\nPre-refunded 1\nPre-trial 1\nPreamble 1\nPrecious 1\nPrecision 1\nPredators 1\nPredicting 1\nPredictions 1\nPreferable 1\nPreferred 1\nPrefers 1\nPrelude 1\nPremarin 1\nPremark 1\nPremature 1\nPrenatal 1\nPreparation 1\nPrepare 1\nPrepayments 1\nPrepulsid 1\nPresage 1\nPresavo 1\nPresbyterians 1\nPreschool 1\nPrescription 1\nPresence 1\nPrestige 1\nPresto 1\nPresumably 1\nPretend 1\nPrevented 1\nPreview 1\nPrewitt 1\nPrez 1\nPriceless 1\nPriceline 1\nPriceline.com 1\nPriest 1\nPriests 1\nPrimatologist 1\nPrimeTime 1\nPrimerica 1\nPrincipals 1\nPrinciple 1\nPrinze 1\nPriorities 1\nPrisca 1\nPriscilla 1\nPrisons 1\nPritikin 1\nPrivacy 1\nPrivileged 1\nPrizms 1\nPro 1\nPro-Iranian 1\nPro-active 1\nPro-choice 1\nPro-life 1\nProBody 1\nProbable 1\nProcardia 1\nProceedings 1\nProcess 1\nProclamation 1\nProd 1\nProdigal 
1\nProduce 1\nProducing 1\nProfession 1\nProgrammer 1\nProguard 1\nProjected 1\nProjecting 1\nProminent 1\nPromises 1\nPromotional 1\nPrompted 1\nProper 1\nProposals 1\nPropper 1\nPropylene 1\nProsecutions 1\nProspects 1\nProsser 1\nProstate 1\nProtect 1\nProtecting 1\nProtector 1\nProtein 1\nProtestantism 1\nProtesting 1\nProtracted 1\nProvato 1\nProve 1\nProvence 1\nProvenza 1\nProvera 1\nProvinces 1\nProvocative 1\nProvost 1\nProzac 1\nPrps 1\nPru 1\nPrudence 1\nPrudhoe 1\nPruett 1\nPrussia 1\nPryce 1\nPsa 1\nPsalm 1\nPsychiatric 1\nPsychiatrist 1\nPsychiatry 1\nPsychopaths 1\nPt 1\nPubMed 1\nPublicity 1\nPublicly 1\nPubu 1\nPuchu 1\nPucik 1\nPuente 1\nPuget 1\nPui 1\nPulaski 1\nPulkova 1\nPuller 1\nPulor 1\nPulsating 1\nPulse 1\nPumped 1\nPunching 1\nPunishment 1\nPurcell 1\nPurchasing 1\nPurdue 1\nPure 1\nPurina 1\nPuritan 1\nPurloined 1\nPurpose 1\nPurposes 1\nPurse 1\nPursuit 1\nPurus'a 1\nPusan 1\nPussycats 1\nPute 1\nPutian 1\nPutka 1\nPuto 1\nPuts 1\nPuttino 1\nPuttnam 1\nPym 1\nPyo 1\nPyong 1\nPyrenees 1\nPysllium 1\nPythagoras 1\nPython 1\nQE 1\nQT 1\nQUESTION 1\nQUIETEST 1\nQUOTABLE 1\nQabalan 1\nQadhi 1\nQaim 1\nQandahar 1\nQanoon 1\nQanso 1\nQaqa 1\nQashington 1\nQasimi 1\nQassebi 1\nQiangguo 1\nQianjiang 1\nQichao 1\nQicheng 1\nQiguang 1\nQihuan 1\nQilu 1\nQingchuan 1\nQingcun 1\nQinghua 1\nQinglu 1\nQingpin 1\nQingping 1\nQingpu 1\nQingqi 1\nQingzang 1\nQingzhong 1\nQinhai 1\nQiong 1\nQiongtai 1\nQiping 1\nQiubai 1\nQiusheng 1\nQixin 1\nQizheng 1\nQomolangma 1\nQuackenbush 1\nQuaddaffi 1\nQualitech 1\nQualls 1\nQuanta 1\nQuantico 1\nQuantitatively 1\nQuanyou 1\nQuarters 1\nQuartet 1\nQuatre 1\nQue 1\nQueenan 1\nQueens 1\nQueks 1\nQuentin 1\nQuerecho 1\nQuestioned 1\nQuetzalcoatl 1\nQuickTime 1\nQuicken 1\nQuiet 1\nQuigley 1\nQuimba 1\nQuine 1\nQuinn 1\nQuintus 1\nQuips 1\nQuiting 1\nQuna 1\nQuoting 1\nQuotrons 1\nQuranic 1\nQusaim 1\nQussaim 1\nR&B 1\nR.L. 1\nR.P. 1\nR.W. 
1\nR2 1\nR2D2 1\nRA 1\nRAD 1\nRADISON 1\nRAE 1\nRAGNAROK 1\nRAK 1\nRALLIED 1\nRAND 1\nRANSOM 1\nRAPED 1\nRATIOS 1\nRATTLED 1\nRAVAGES 1\nRAVIZ 1\nRAW 1\nRAYCHEM 1\nRBC 1\nRBIs 1\nRBS 1\nRCC 1\nRCD 1\nRCSB 1\nRD 1\nRDBMS 1\nRDU 1\nREAAAALLY 1\nREACTOR 1\nREAGAN 1\nREALTY 1\nREALY 1\nREAP 1\nREAT 1\nREATs 1\nRECENT 1\nRECORDS 1\nRECRUITING 1\nREDUCED 1\nREFUELING 1\nREFUNDABLE 1\nREGION 1\nREGISTER 1\nREGULATIONS 1\nRELEASE 1\nREM 1\nREMICs 1\nRENT 1\nREPAIR 1\nREPLICATION 1\nREPORTS 1\nREQUIRED 1\nRESEARCHERS 1\nRESIDENTIAL 1\nRESIGNATIONS 1\nRESOURCES 1\nRESUME 1\nRETAIL 1\nRETURNED 1\nREUTERS 1\nREVIEW 1\nREVISED 1\nRHR 1\nRIAA 1\nRICHMOND 1\nRID 1\nRIGHTS 1\nRIP 1\nRISK 1\nRIT 1\nRIVER 1\nRLLY 1\nRMB? 1\nRNA 1\nRNR 1\nRODE 1\nRODERIGO 1\nROLLS 1\nROSS 1\nRR 1\nRSPB 1\nRTD 1\nRTO 1\nRTOS 1\nRTP 1\nRTS 1\nRUDE 1\nRUN 1\nRUSSIAN 1\nRUTH 1\nRUY 1\nRXDC 1\nRa 1\nRabb 1\nRabbi 1\nRabi 1\nRabie 1\nRabista 1\nRabo 1\nRacal 1\nRaces 1\nRachmaninoff 1\nRachwalski 1\nRacine 1\nRacketeering 1\nRadavan 1\nRaddatz 1\nRadiance 1\nRadioactive 1\nRafferty 1\nRafi 1\nRafid 1\nRafidain 1\nRafidite 1\nRafsanjani 1\nRaful 1\nRagan 1\nRaghavan 1\nRagu 1\nRahill 1\nRahman2002 1\nRahway 1\nRaiders 1\nRaikes 1\nRaikkonen 1\nRailcar 1\nRails 1\nRailways 1\nRaimondo 1\nRaina 1\nRainier 1\nRainwater 1\nRaisa 1\nRaised 1\nRaja 1\nRajaaa100@Hotmail.Com 1\nRajavis 1\nRajendra 1\nRajihi 1\nRakes 1\nRales 1\nRamad 1\nRambus 1\nRamo 1\nRamone 1\nRamsey 1\nRamzi 1\nRance@ENRON 1\nRanch 1\nRandi 1\nRandoff 1\nRandol 1\nRandt 1\nRangoon 1\nRans 1\nRapatee 1\nRape 1\nRaph 1\nRaphael 1\nRapper 1\nRapport 1\nRaqab 1\nRascal 1\nRash 1\nRashed 1\nRasheed 1\nRashidiya 1\nRaskolnikov 1\nRasoul 1\nRat 1\nRats 1\nRatzinger 1\nRauch 1\nRaul 1\nRawl 1\nRayah 1\nRaydiola 1\nRaymoan 1\nRays 1\nRazon 1\nRd 1\nRe-creating 1\nRe-enactments 1\nRe-equalizer 1\nRe-interviewed 1\nReabov 1\nReaching 1\nReaction 1\nReaganauts 1\nRealignment 1\nRealistic 1\nRealistically 1\nRear 1\nRearding 1\nReasoning 1\nReassemble 
1\nRecWarehouse.com 1\nRecalling 1\nRecedes 1\nReceivables 1\nRecep 1\nReckoning 1\nRecoba 1\nRecognizing 1\nReconnaissance 1\nReconsideration 1\nReconstructions 1\nRecording 1\nRecovering 1\nRecreational 1\nRecruited 1\nRecruiter 1\nRecruitment 1\nRecuperation 1\nReda 1\nRede 1\nRedecorating 1\nReding 1\nRedskins 1\nReduced 1\nRedwood 1\nReeve 1\nReeves 1\nRefco 1\nRefine 1\nRefinery 1\nReformed 1\nRefugee 1\nRefurbishing 1\nRegaard 1\nRegard 1\nRegatta 1\nRegency 1\nRegie 1\nRegime 1\nRegiment 1\nRegions 1\nRegistrar 1\nRegistry 1\nRegular 1\nRegulator 1\nRegulators 1\nRehfeld 1\nRehman 1\nReichmanns 1\nReider 1\nReign 1\nReimbursement 1\nReina 1\nReincarnation 1\nReinforced 1\nReinforcing 1\nReins 1\nReinvigorating 1\nReivsion 1\nRejection 1\nRejoins 1\nRelationships 1\nRelative 1\nRelatively 1\nRelax 1\nRelay 1\nReleasing 1\nReliant 1\nRelic 1\nRelics 1\nReligion 1\nReligione 1\nReligiosity 1\nReluctant 1\nRemain 1\nRemarkably 1\nRemarketers 1\nRembembrance 1\nRemedies 1\nRemeliik 1\nRemembering 1\nRemind 1\nRemoval 1\nRemoved 1\nRemoving 1\nRen 1\nRenay 1\nRender 1\nRendon 1\nRendy 1\nRenewal 1\nRenewed 1\nRenjie 1\nRennie 1\nRenoirs 1\nRenta 1\nRental 1\nRents 1\nRenzu 1\nRepair 1\nRepay 1\nRepeated 1\nRepeating 1\nRepertory 1\nReplaceable 1\nReplaced 1\nReplacing 1\nReplay 1\nReplies 1\nReplogle 1\nReported 1\nReportedly 1\nRepresenting 1\nRepresents 1\nReprinted 1\nReproduction 1\nReptile 1\nReq 1\nRequest 1\nRequiem 1\nRescuing 1\nResearcher 1\nResellers 1\nReservoirs 1\nResidences 1\nResident 1\nResilient 1\nResmini 1\nResnick 1\nResolutely 1\nResouces 1\nRestaurationen 1\nReston 1\nRestored 1\nRestraint 1\nRestricted 1\nRestructuring 1\nResult 1\nRetention 1\nRettke 1\nReturned 1\nReuven 1\nRevaldo 1\nReveals 1\nRevell 1\nRevenge 1\nReversing 1\nReviglio 1\nRevising 1\nRevisions 1\nRevitalized 1\nRevivals 1\nRevocation 1\nRewarding 1\nReykjavik 1\nRezneck 1\nRhee 1\nRhin 1\nRhodeRunner 1\nRhodes 1\nRhona 1\nRiad 1\nRic 1\nRicans 1\nRicardo 1\nRicca 
1\nRichterian 1\nRickel 1\nRicky 1\nRiely 1\nRiepe 1\nRiesling 1\nRifai 1\nRiflemen 1\nRigdon 1\nRiggins 1\nRighetti 1\nRigid 1\nRigorous 1\nRik 1\nRiklis 1\nRima 1\nRimbaud 1\nRina 1\nRindos 1\nRing 1\nRingboard 1\nRinger 1\nRingo 1\nRings 1\nRinks 1\nRios 1\nRiots 1\nRippe 1\nRipper 1\nRise 1\nRistorante 1\nRites 1\nRitter 1\nRiunite 1\nRivals 1\nRive 1\nRivers 1\nRiverside 1\nRiviera 1\nRiyal 1\nRiyardov 1\nRoadblocks 1\nRoadways 1\nRoaming 1\nRoaring 1\nRobbed 1\nRobbers 1\nRobbins 1\nRoberta 1\nRobertsCorp 1\nRobie 1\nRobious 1\nRobles 1\nRobot 1\nRobotics 1\nRobots 1\nRobsart 1\nRoccaforte 1\nRocco 1\nRockerfeller 1\nRockies 1\nRocks 1\nRococo 1\nRodents 1\nRodman 1\nRoeck 1\nRoeser 1\nRogie 1\nRogin 1\nRohatgi 1\nRohatyn 1\nRohrer 1\nRoi 1\nRojas 1\nRokko 1\nRolaids 1\nRolan 1\nRolando 1\nRolfe 1\nRolfes 1\nRoling 1\nRollie 1\nRollins 1\nRolm 1\nRolodex 1\nRolodexes 1\nRoma 1\nRomanesque 1\nRomantic 1\nRomanticism 1\nRomantics 1\nRomero 1\nRonaldinho 1\nRongdian 1\nRongke 1\nRongkun 1\nRonne 1\nRoode 1\nRoof 1\nRooftops 1\nRooker 1\nRoommates 1\nRoosters 1\nRoot 1\nRoots 1\nRory 1\nRosalco 1\nRosanna 1\nRosanne 1\nRosario 1\nRosechird 1\nRosenbach 1\nRosenbaum 1\nRosenblit 1\nRosencrants 1\nRosenfeld 1\nRoses 1\nRoskind 1\nRossetti 1\nRossi 1\nRossner 1\nRost 1\nRostropovich 1\nRoswell 1\nRotarians 1\nRotarua 1\nRothe 1\nRothfeder 1\nRotorua 1\nRotterdam 1\nRouault 1\nRough 1\nRoundup 1\nRoustabouts 1\nRovers 1\nRowan 1\nRowell 1\nRowling 1\nRoxboro 1\nRozelle 1\nRozycki 1\nRua 1\nRubega 1\nRubeli 1\nRubenesquely 1\nRuberg 1\nRubik 1\nRubins 1\nRudeina 1\nRudolf 1\nRudong 1\nRudraksha 1\nRuentex 1\nRuettgers 1\nRuffel 1\nRugby 1\nRuhollah 1\nRuili 1\nRuined 1\nRukai 1\nRuler 1\nRuling 1\nRulun 1\nRumack 1\nRumela 1\nRummy 1\nRumour 1\nRumours 1\nRunan 1\nRunaway 1\nRune 1\nRunner 1\nRunways 1\nRupee 1\nRusAmArts@aol.com 1\nRusafah 1\nRuscha 1\nRushes 1\nRushforth 1\nRuso 1\nRusted 1\nRustum 1\nRusty 1\nRut 1\nRuyi 1\nRuyue 1\nRye 1\nRyner 1\nRyool 1\nRyosuke 
1\nRyszard 1\nRythm 1\nRyukichi 1\nRyzhkov 1\nS&S 1\nS.O.S. 1\nS.P. 1\nS.S. 1\nS.T. 1\nS.p.A 1\nS3 1\nSAAB's 1\nSABC 1\nSABRE 1\nSAFEWAY 1\nSALAD 1\nSALEM 1\nSAM 1\nSAMPLE.DOC 1\nSAMURAI 1\nSANTA 1\nSAP 1\nSARA 1\nSASAC 1\nSAV 1\nSAVAK 1\nSCADA 1\nSCAM 1\nSCAMMERS 1\nSCHEDULE 1\nSCHOOL 1\nSCHWAB 1\nSCHWARTZ 1\nSCIENTIST 1\nSCRAP 1\nSCRUM 1\nSDomeone 1\nSEAGATE 1\nSEAQ 1\nSEATS 1\nSECOND 1\nSECRET 1\nSEEKING 1\nSEEMS 1\nSEH 1\nSELL 1\nSEM 1\nSEMICONDUCTOR 1\nSENATE 1\nSENIOR 1\nSEPARATED 1\nSERC 1\nSERIOUSLY 1\nSERVERS 1\nSETTER 1\nSFX 1\nSH 1\nSHAKE 1\nSHAPE 1\nSHARING 1\nSHAVING 1\nSHEA 1\nSHEARSON 1\nSHEDDING 1\nSHELTERS 1\nSHEVARDNADZE 1\nSHIBUMI 1\nSHIELD 1\nSHIPPING 1\nSHIT 1\nSHN 1\nSHOP 1\nSHOPPE 1\nSHOPPERS 1\nSHOPS 1\nSHOWS 1\nSHUN 1\nSIDE 1\nSIDES 1\nSIDS 1\nSIERRA 1\nSIGNALED 1\nSIMPLIFYING 1\nSISAL 1\nSIZING 1\nSKIDDED 1\nSKILLED 1\nSKIRTS 1\nSKr1.5 1\nSKr20 1\nSKr205 1\nSKr225 1\nSKr29 1\nSL 1\nSMALLER 1\nSMART 1\nSMEs 1\nSMIP 1\nSMOKE 1\nSMYRNA 1\nSOCIETY 1\nSOFT 1\nSOLVE 1\nSONGsters 1\nSOOOO 1\nSORRY 1\nSOS 1\nSOUND 1\nSOUTHERN 1\nSOblander@carrfut.com 1\nSP 1\nSP1 1\nSPACE.com 1\nSPACEHAB 1\nSPECIAL 1\nSPECIALIZED 1\nSPECTRE 1\nSPICES 1\nSPLOID.com 1\nSPOT 1\nSPRDOPT 1\nSPRING 1\nSPS 1\nSQ006 1\nSQUARE 1\nSQUIBB 1\nSRA 1\nSRD 1\nSSI 1\nSTAGED 1\nSTANDARDS 1\nSTANDING 1\nSTANLEY 1\nSTARSH 1\nSTATES 1\nSTEALING 1\nSTEEL 1\nSTEPS 1\nSTILL 1\nSTODGY 1\nSTONER 1\nSTOP 1\nSTORE 1\nSTORY 1\nSTRUCK 1\nSTRUGGLED 1\nSTSN 1\nSTUBBED 1\nSTUDENTS 1\nSU 1\nSUN 1\nSUPER 1\nSUPPOSEDLY 1\nSUPREME 1\nSURGED 1\nSUSPECT 1\nSWT 1\nSWUNG 1\nSa 1\nSa'ada 1\nSaalfeld 1\nSabah 1\nSabati 1\nSabbath 1\nSabena 1\nSabha 1\nSabie 1\nSabina 1\nSabine 1\nSable 1\nSabres 1\nSacasa 1\nSacilor 1\nSacks 1\nSacre 1\nSacred 1\nSacremento 1\nSacrificing 1\nSada 1\nSadakane 1\nSadam 1\nSadaqal 1\nSaddened 1\nSaddle 1\nSadiq 1\nSadiri 1\nSadri 1\nSadrists 1\nSaeb 1\nSaeed 1\nSafariMac 1\nSafavids 1\nSafawis 1\nSafawites 1\nSafawiya 1\nSafed 1\nSaffir 1\nSafford 1\nSafr 1\nSaga 1\nSagesse 
1\nSaginaw 1\nSagis 1\nSagittarius 1\nSahih 1\nSaikung 1\nSailors 1\nSainte 1\nSainthood 1\nSaints 1\nSaito 1\nSaiyong 1\nSajak 1\nSaklatvala 1\nSakovich 1\nSakowitz 1\nSakura 1\nSalad 1\nSalafis 1\nSalafism 1\nSalah 1\nSalamat 1\nSalang 1\nSalant 1\nSalazar 1\nSalees 1\nSalesman 1\nSalikh 1\nSalina 1\nSalinardo@ENRON 1\nSalisbury 1\nSalk 1\nSallah 1\nSalmagundi 1\nSalsa 1\nSalsbury 1\nSalton 1\nSaluting 1\nSalvatore 1\nSam3102.doc 1\nSamChawk@aol.com 1\nSamak 1\nSamaritans 1\nSamawa 1\nSamel 1\nSamengo 1\nSamford 1\nSamia 1\nSamiel 1\nSammi 1\nSammye 1\nSamnoud 1\nSamovar 1\nSamphon 1\nSamson 1\nSana 1\nSanak 1\nSanchih 1\nSanchung 1\nSanctions 1\nSandalwood 1\nSandblasters 1\nSande 1\nSandhills 1\nSandia 1\nSandip 1\nSandisk 1\nSandler 1\nSandor 1\nSandrine 1\nSandup 1\nSandwiched 1\nSandymount 1\nSane 1\nSangh 1\nSangzao 1\nSanholi 1\nSanhsia 1\nSanhuan 1\nSanjay 1\nSanmei 1\nSanmin 1\nSann 1\nSanraku 1\nSantonio 1\nSantos 1\nSanwiches 1\nSanxingdui 1\nSanyo 1\nSanzio 1\nSaqa 1\nSarai 1\nSarajevo 1\nSaran 1\nSarandon 1\nSarasota 1\nSarbanes 1\nSardi 1\nSardina 1\nSardinia 1\nSarge 1\nSargent 1\nSarhid 1\nSari2001 1\nSark 1\nSars 1\nSasae 1\nSasaki 1\nSasebo 1\nSasha 1\nSatanic 1\nSatanists 1\nSate 1\nSatisfaction 1\nSatisfactory 1\nSatisfy 1\nSatisfying 1\nSato 1\nSatoko 1\nSaturdays 1\nSaudia 1\nSaudization 1\nSaull 1\nSaunders 1\nSausage 1\nSausalito 1\nSavadina 1\nSavalas 1\nSavannah 1\nSaved 1\nSaveth 1\nSavidge 1\nSavior 1\nSavoring 1\nSavoy 1\nSaw 1\nSawchuck 1\nSawchuk 1\nSawn 1\nSawx 1\nSaydiyah 1\nSayonara 1\nSayre 1\nSayyaf 1\nSc. 
1\nScalds 1\nScallops 1\nScam 1\nScambio 1\nScana 1\nScandlin 1\nScanning 1\nScarborough 1\nScare 1\nScaring 1\nScarlett 1\nScarsdale 1\nScary 1\nScenarios 1\nScenic 1\nScentsory 1\nSceto 1\nSchabowski 1\nScheduled 1\nScheetz 1\nSchellke 1\nSchenectady 1\nScherer 1\nSchieffelin 1\nSchiffs 1\nSchimberg 1\nSchloss 1\nSchlumpf 1\nSchmedel 1\nSchmick 1\nSchmidlin 1\nSchoeppner 1\nScholar 1\nSchrager 1\nSchreyer 1\nSchubert 1\nSchuler 1\nSchulof 1\nSchultz 1\nSchumer 1\nSchwartzenburg@ENRON_DEVELOPMENT 1\nSchwarzenberger 1\nSchwarzkopf 1\nSchweitzer 1\nSchwerin 1\nScipio 1\nScooby 1\nScoop 1\nScorsese 1\nScotchbrite 1\nScrap 1\nScraping 1\nScreams 1\nScreen 1\nScribes 1\nScripting 1\nScripture 1\nScroll 1\nScrum 1\nScully 1\nScurlock 1\nSderot 1\nSeacomb 1\nSeals 1\nSeams 1\nSearched 1\nSearcher 1\nSeason 1\nSeasonally 1\nSeasoned 1\nSeasons 1\nSeattlepi.com 1\nSeaworth 1\nSecrecy 1\nSecretaries 1\nSecrets 1\nSecular 1\nSecured 1\nSedans 1\nSedatives 1\nSedra 1\nSegar 1\nSeger 1\nSegolene 1\nSeidler 1\nSeif 1\nSeifert 1\nSeiko 1\nSeiler 1\nSeimei 1\nSeince 1\nSeiren 1\nSeismographic 1\nSeizing 1\nSeizures 1\nSelassie 1\nSelavo 1\nSeldom 1\nSelkirk 1\nSellars 1\nSelmer 1\nSeltzer 1\nSelve 1\nSelwyn 1\nSemantics 1\nSemi-liberated 1\nSemifinished 1\nSeminary 1\nSemites 1\nSemmel 1\nSemmelman 1\nSempra 1\nSenado 1\nSending 1\nSenet 1\nSenilagakali 1\nSeniors 1\nSennheiser 1\nSenorita 1\nSentences 1\nSentencing 1\nSentiment 1\nSeparated 1\nSeparating 1\nSeparation 1\nSeparatist 1\nSequa 1\nSerafin 1\nSerail 1\nSerapis 1\nSerfdom 1\nSergeant 1\nSergey 1\nSergiusz 1\nSerkin 1\nServed 1\nServer 1\nServicemen 1\nServifilm 1\nSessions 1\nSet 1\nSetoff 1\nSetoli 1\nSeton 1\nSeussian 1\nSeventieth 1\nSeventy 1\nSevere 1\nSeverence 1\nSewedi 1\nSewing 1\nSexodrome 1\nSexual 1\nSf 1\nShaab 1\nShabab 1\nShadi 1\nShadid 1\nShadin 1\nShadows 1\nShafer 1\nShafii 1\nShahal 1\nShaheen 1\nShaja 1\nShake 1\nShakespear 1\nShakia 1\nShaking 1\nShakir 1\nShakspere 1\nShalan 1\nShales 
1\nShalhoub 1\nShalik 1\nShall 1\nShalome 1\nShalu 1\nShalun 1\nShamanism 1\nShame 1\nShames 1\nShampoo 1\nShamsed 1\nShamus 1\nShanar 1\nShanbu 1\nShane 1\nShanfang 1\nShanganning 1\nShangqing 1\nShangquan 1\nShangtou 1\nShangzhi 1\nShankman 1\nShanks 1\nShanqin 1\nShanyin＊ 1\nShaohua 1\nShaoping 1\nShaoqing 1\nShaoyang 1\nShape 1\nShaquille 1\nShara 1\nSharayu 1\nShardlow 1\nShared 1\nSharegernkerf 1\nSharesBase 1\nSharikh 1\nSharjah 1\nSharpe 1\nSharps 1\nSharwak 1\nShash 1\nShashe 1\nShastra 1\nShaughnessy 1\nShaving 1\nShawn 1\nShaykh 1\nShealy 1\nShearman 1\nShearon 1\nSheaves 1\nShebek 1\nSheehan 1\nSheehen 1\nSheehy 1\nSheet 1\nSheetlets 1\nSheffield 1\nShehata 1\nSheika 1\nSheikhly 1\nShelbyville 1\nShelf 1\nShelia 1\nShelley 1\nShellpot 1\nShelten 1\nSheltons 1\nShenandoah 1\nShengdong 1\nShenggeban 1\nShengjiang 1\nShengjie 1\nShengli 1\nShengmin 1\nShengnan 1\nShengyou 1\nShennan 1\nShennanzhong 1\nShepherds 1\nSher 1\nSherblom 1\nSheremetyevo 1\nSherlock 1\nSherren 1\nSherron 1\nSheryll 1\nShewar 1\nShidler 1\nShidyaq 1\nShiflett 1\nShigezo 1\nShihfang 1\nShihkung 1\nShihlin 1\nShihmen 1\nShilin 1\nShima 1\nShimayi 1\nShimoyama 1\nShimson 1\nShinbun 1\nShine 1\nShines 1\nShinichi 1\nShinpan 1\nShinshe 1\nShiny 1\nShinyee 1\nShirer 1\nShirman 1\nShirtwaist 1\nShishen 1\nShishi 1\nShitreet 1\nShizhong 1\nShlaes 1\nShlama 1\nShlenker 1\nShocked 1\nShocking 1\nShoes 1\nShook 1\nShoppers 1\nShores 1\nShorn 1\nShortage 1\nShortageflation 1\nShortages 1\nShorted 1\nShorty 1\nShot 1\nShots 1\nShouda 1\nShouhai 1\nShowBiz 1\nShowa 1\nShowdown 1\nShowing 1\nShown 1\nShowrooms 1\nShows 1\nShrabanitca 1\nShrapnel 1\nShrewdly 1\nShri 1\nShrimp 1\nShrine 1\nShrinking 1\nShropshire 1\nShrovita 1\nShrubs 1\nShuangliu 1\nShuchu 1\nShuifu 1\nShuiyu 1\nShulman 1\nShunde 1\nShuo 1\nShuoren 1\nShupe 1\nShuqair 1\nShusheng 1\nShut 1\nShutter 1\nShuttling 1\nShuwa 1\nShuye 1\nShuzhen 1\nShvartzer 1\nShwe 1\nSi 1\nSiamese 1\nSiang 1\nSiano 1\nSicily 1\nSida 1\nSides 1\nSiding 
1\nSidorenko 1\nSidoti 1\nSidq 1\nSiebert 1\nSieckman 1\nSiegal 1\nSiegler 1\nSierras 1\nSiew 1\nSigh 1\nSight 1\nSigma 1\nSignac 1\nSignature 1\nSignificance 1\nSignificant 1\nSignificantly 1\nSiguniang 1\nSigurd 1\nSihai 1\nSilences 1\nSilesia 1\nSills 1\nSilly 1\nSilpa 1\nSilvio 1\nSimat 1\nSimba 1\nSimolingsike 1\nSimone 1\nSimulated 1\nSimulator 1\nSincere 1\nSinemet 1\nSingTel 1\nSingaporeans 1\nSingh 1\nSingin 1\nSingleton 1\nSingmaster 1\nSingtsufang 1\nSiniscal 1\nSink 1\nSinopac 1\nSinopoli 1\nSintel 1\nSirota 1\nSirrine 1\nSis 1\nSisk 1\nSisley 1\nSistani 1\nSiteProNews 1\nSituational 1\nSituations 1\nSixteen 1\nSiye 1\nSiyi 1\nSizable 1\nSize 1\nSkeptical 1\nSkibo 1\nSkip 1\nSkippers 1\nSkipping 1\nSkoal 1\nSkopbank 1\nSkylight 1\nSkype 1\nSlam 1\nSlatkin 1\nSlavin 1\nSlavonia 1\nSlay 1\nSleeping 1\nSleepless 1\nSleet 1\nSlender 1\nSlick 1\nSlicker 1\nSlides 1\nSlimSkim 1\nSlims 1\nSlobo 1\nSlobodin 1\nSloma 1\nSlosberg 1\nSlote 1\nSlovenian 1\nSlower 1\nSlowest 1\nSlut 1\nSmartGrowth 1\nSmartNet 1\nSmartRender 1\nSmartek 1\nSmarter 1\nSmartwolf 1\nSmartwolves 1\nSmeal 1\nSmedes 1\nSmelting 1\nSmetek 1\nSmill 1\nSmirnoff 1\nSmithson 1\nSmokers 1\nSmoking 1\nSmolensk 1\nSmooth 1\nSmoothly 1\nSmorgon 1\nSmuzynski 1\nSnack 1\nSnacks 1\nSnafu 1\nSnap 1\nSnape 1\nSnatchers 1\nSneak 1\nSneaker 1\nSnecma 1\nSneh 1\nSnorkeling 1\nSoCal 1\nSoHo 1\nSoaked 1\nSobey 1\nSochi 1\nSocialism 1\nSocieta 1\nSociology 1\nSock 1\nSoda 1\nSofia 1\nSofter 1\nSoftly 1\nSofyan 1\nSogo 1\nSohail 1\nSohublog 1\nSoichiro 1\nSokol 1\nSolarz 1\nSolchaga 1\nSoldado 1\nSolebury 1\nSolids 1\nSolis 1\nSolow 1\nSolso 1\nSolutions 1\nSolving 1\nSolzhenitsyn 1\nSomewhat 1\nSomme 1\nSommer 1\nSongjiang 1\nSongling 1\nSongpan 1\nSonicare 1\nSonja 1\nSonnenberg 1\nSophisticated 1\nSorbus 1\nSorceress 1\nSore 1\nSorenson 1\nSorrow 1\nSorting 1\nSosuke 1\nSotela 1\nSothomayuel 1\nSou'wester 1\nSounded 1\nSoundview 1\nSouris 1\nSouter 1\nSouthbrook 1\nSouthdown 1\nSouthdowns 1\nSoutheastern 
1\nSouthlake 1\nSouthpaw 1\nSouthport 1\nSouthwide 1\nSouthwood 1\nSouyasen 1\nSouza 1\nSovietized 1\nSows 1\nSoy 1\nSpaced 1\nSpaghetti 1\nSpago 1\nSpahn 1\nSpalding 1\nSpangled 1\nSpaniards 1\nSpanning 1\nSparc 1\nSparcstation 1\nSparkling 1\nSparta 1\nSpartak 1\nSpaulding 1\nSpaull 1\nSpeakers 1\nSpear 1\nSpec 1\nSpecialists 1\nSpecialty 1\nSpectra 1\nSpeculative 1\nSpeculators 1\nSpeech 1\nSpeeding 1\nSpeedway 1\nSpence 1\nSpenser 1\nSpice 1\nSpiegelman 1\nSpike 1\nSpilanovic 1\nSpin 1\nSpinal 1\nSpinner 1\nSpinney 1\nSpinola 1\nSpirited 1\nSpiritual 1\nSpitzenburg 1\nSpliced 1\nSplit 1\nSpokespersons 1\nSpongy 1\nSponsored 1\nSpooked 1\nSpoon 1\nSportdom 1\nSportsmen 1\nSportswear 1\nSpotsy 1\nSpotted 1\nSpouse 1\nSpratly 1\nSprawl 1\nSprayers 1\nSprenger 1\nSpringdale 1\nSpringsteen 1\nSprinkle 1\nSprinkled 1\nSprinkles 1\nSprinting 1\nSprite 1\nSprizzo 1\nSpurred 1\nSpyware 1\nSquamish 1\nSquealiers 1\nSqueege 1\nSquid 1\nSquire 1\nSrinagar 1\nSstaff 1\nStaber 1\nStabilization 1\nStabilizer 1\nStabilizing 1\nStacey.Richardson@enron.com 1\nStackup 1\nStacy 1\nStaffan 1\nStaffers 1\nStaffing 1\nStagecoach 1\nStaikos 1\nStalinism 1\nStalone 1\nStambolic 1\nStamps 1\nStandardizing 1\nStands 1\nStanislav 1\nStanwick 1\nStaphylococcus 1\nStaple 1\nStapleton 1\nStardust 1\nStarke 1\nStarroute 1\nStarted 1\nStarter 1\nStartups 1\nStarve 1\nStated 1\nStatehouse 1\nStaten 1\nStateswest 1\nStations 1\nStauffer 1\nStayed 1\nStaying 1\nStazi 1\nSte 1\nSteadily 1\nSteadman 1\nSteak 1\nSteamed 1\nSteaming 1\nSteamship 1\nStedt 1\nSteelmaking 1\nSteenbergen 1\nSteep 1\nSteeped 1\nSteffen 1\nSteffes 1\nStehlin 1\nSteinbach 1\nSteinkrauss 1\nSteinkuhler 1\nStennett 1\nStephens@ENRON 1\nStephenson 1\nSteptoe 1\nStertz 1\nStibel 1\nStiefvater 1\nStiglitz 1\nStikeman 1\nSting 1\nStingers 1\nStinson 1\nStir 1\nStirred 1\nStirs 1\nStitch 1\nStjernsward 1\nStl 1\nStoch 1\nStockard 1\nStockbrokers 1\nStockholmers 1\nStockholmites 1\nStockton 1\nStoddard 1\nStoecker 1\nStoecklin 
1\nStokely 1\nStolen 1\nStolley 1\nStomach 1\nStoneman 1\nStoner 1\nStoneridge 1\nStones 1\nStony 1\nStops 1\nStorehouse 1\nStoried 1\nStow 1\nStradivarius 1\nStrambler 1\nStranglove 1\nStrat 1\nStratas 1\nStrategists 1\nStratus 1\nStraub 1\nStravinsky 1\nStraying 1\nStream 1\nStreams 1\nStreep 1\nStreets 1\nStrehler 1\nStreitz 1\nStrengthening 1\nStress 1\nStressing 1\nStretch 1\nStretching 1\nStricken 1\nStrickland 1\nStrict 1\nStrictly 1\nStrider 1\nStringer 1\nStrings 1\nStripperella 1\nStrive 1\nStriving 1\nStrobel 1\nStroke 1\nStrokes 1\nStrolling 1\nStromeyer 1\nStrongid 1\nStroup 1\nStruble 1\nStructure 1\nStructuring 1\nStubblefield 1\nStuck 1\nStudds 1\nStudying 1\nStuecker 1\nStuff 1\nStuffing 1\nStumpf 1\nStumpy 1\nStung 1\nStunned 1\nStupid 1\nStupidity 1\nStwilkivich 1\nStyle 1\nSub 1\nSub-cultures 1\nSub-saharan 1\nSubaihi 1\nSubaru 1\nSubcontractors 1\nSubdivision 1\nSubjects 1\nSubmerge 1\nSubroto 1\nSubscribers 1\nSubscribing 1\nSubsidizing 1\nSubsistencias 1\nSubstituting 1\nSubway 1\nSuccasunna 1\nSuccessful 1\nSuccessfully 1\nSuchdev 1\nSuchocki 1\nSuck 1\nSuckow 1\nSucre 1\nSudol 1\nSuedan 1\nSuerat 1\nSuffer 1\nSuffering 1\nSufficient 1\nSuffolk 1\nSufi 1\nSuggest 1\nSuh 1\nSuhaimi 1\nSuheto 1\nSuitable 1\nSuites 1\nSuits 1\nSukoi 1\nSullivan@ENRON 1\nSullivans 1\nSultana 1\nSulya 1\nSum 1\nSummarizing 1\nSummary 1\nSummerland 1\nSummoning 1\nSunCor 1\nSuncor 1\nSunil 1\nSunkist 1\nSunlight 1\nSunnyside 1\nSunset 1\nSuo 1\nSuolangdaji 1\nSuper-shear 1\nSuperbowls 1\nSupercenter 1\nSuperdome 1\nSuperman 1\nSupermarkets 1\nSuperstitions 1\nSuperstore 1\nSupplementary 1\nSupposing 1\nSuppressing 1\nSuppression 1\nSur 1\nSurety 1\nSurgical 1\nSurplus 1\nSurprise 1\nSurrealism 1\nSurrealist 1\nSurrender 1\nSurrey 1\nSurrounding 1\nSurveyer 1\nSurvival 1\nSurvive 1\nSurvived 1\nSurviving 1\nSusanna 1\nSusino 1\nSussexs 1\nSussman 1\nSusumu 1\nSutra 1\nSuwail 1\nSuwayrah 1\nSuwu 1\nSuyan 1\nSuzette 1\nSuzy 1\nSwaine 1\nSwartzkopf 1\nSweat 1\nSwede 
1\nSweeping 1\nSweetest 1\nSweezey 1\nSwept 1\nSwetha 1\nSwiffer 1\nSwitchboard 1\nSwiveling 1\nSyb 1\nSybil 1\nSyeb 1\nSykes 1\nSymbol:HRB 1\nSymbolist 1\nSymphony 1\nSynOptics 1\nSyndicate 1\nSynthelabo 1\nSystem.getPropery 1\nSystematic 1\nSze 1\nSzeto 1\nSzuhu 1\nSzuros 1\nT&D 1\nT.D. 1\nT.M.X. 1\nT.T. 1\nT1 1\nTAGG 1\nTAIEX 1\nTAKE 1\nTALKS 1\nTANDEM 1\nTANG 1\nTASTY 1\nTAU 1\nTAXPAYERS 1\nTBH 1\nTBWA 1\nTCA 1\nTCAS 1\nTCL 1\nTD 1\nTDK 1\nTEACH 1\nTEACHERS 1\nTEACHING 1\nTECHNOLOGIST 1\nTECO 1\nTED 1\nTELELAW 1\nTELESIS 1\nTEN 1\nTERAS 1\nTERRIFIED 1\nTEST 1\nTESTS 1\nTGIF 1\nTHAN 1\nTHANK 1\nTHOSE 1\nTHOUSAND 1\nTHPHD7UY 1\nTHR 1\nTHREAT 1\nTHREE 1\nTHRIVE 1\nTHROUGHOUT 1\nTHX 1\nTHYSELF 1\nTHey 1\nTIGER 1\nTIGRs 1\nTILT 1\nTIP 1\nTITI 1\nTJO 1\nTK 1\nTMMC 1\nTMS 1\nTMobile 1\nTODAY 1\nTOKYO 1\nTOLL 1\nTOOK 1\nTOPAZ 1\nTOPIC 1\nTOPMARGIN 1\nTOVA 1\nTOWNSHIP 1\nTPAS 1\nTRADE 1\nTRAFFIC 1\nTRANSAMERICA 1\nTRANSPARENT 1\nTRANSPORTATION 1\nTRASH 1\nTRAVELS 1\nTRC 1\nTREND 1\nTRIAD 1\nTRIMMING 1\nTROUBLE 1\nTRT 1\nTRUCK 1\nTRUSTEE 1\nTRYING 1\nTT 1\nTUCSON 1\nTV's 1\nTVBS 1\nTVC 1\nTVLINK 1\nTWIG 1\nTXb 1\nTYPE 1\nTabacs 1\nTabah 1\nTabak 1\nTabas 1\nTaber 1\nTable 1\nTabloid 1\nTaboos 1\nTabors 1\nTachuwei 1\nTacit 1\nTack 1\nTactical 1\nTad 1\nTadahiko 1\nTadzhikistan 1\nTaffiq 1\nTagalog 1\nTagesspiegel 1\nTagesthemen 1\nTagg 1\nTaghlabi 1\nTaguba 1\nTaha 1\nTahitian 1\nTahsin 1\nTaiPower 1\nTaif 1\nTaifeng 1\nTaihoku 1\nTaihua 1\nTaiji 1\nTaipa 1\nTaishang 1\nTaiuan 1\nTaiwanese-ness 1\nTaiwania 1\nTaiwanization 1\nTaiwanized 1\nTaiyo 1\nTaizhou 1\nTajik 1\nTajikistan 1\nTajikstan 1\nTajis 1\nTakagi 1\nTakamori 1\nTakanori 1\nTakayama 1\nTakedown 1\nTakimura 1\nTales 1\nTali 1\nTalib 1\nTalibiya 1\nTalingshan 1\nTalked 1\nTalley 1\nTamalin 1\nTambo 1\nTambor 1\nTambora 1\nTaming 1\nTammi 1\nTampere 1\nTangible 1\nTangiers 1\nTangipahoa 1\nTango 1\nTanii 1\nTank 1\nTanker 1\nTankers 1\nTankless 1\nTanks 1\nTannenbaum 1\nTannoy 1\nTanqueray 1\nTantrika 
1\nTanzanian 1\nTanzi 1\nTanzim 1\nTaoist 1\nTaokas 1\nTaos 1\nTaoyan 1\nTap 1\nTaps 1\nTarawi 1\nTarid 1\nTarik 1\nTarnopol 1\nTarot 1\nTarrytown 1\nTarsia 1\nTart 1\nTartikoff 1\nTarzana 1\nTascher 1\nTashjian 1\nTasse 1\nTaster 1\nTastes 1\nTat 1\nTata 1\nTateishi 1\nTateisi 1\nTatsuhara 1\nTattingers 1\nTattoo 1\nTatun 1\nTau 1\nTaufiq 1\nTaugia 1\nTauris 1\nTavern 1\nTawana 1\nTaxing 1\nTaxotere 1\nTaxpayer 1\nTaxus 1\nTayar 1\nTaylors 1\nTayyip 1\nTbond 1\nTcl 1\nTe 1\nTe'an 1\nTeahouse 1\nTears 1\nTeases 1\nTeather 1\nTechDesign 1\nTechnik 1\nTechnique 1\nTeco 1\nTectonics 1\nTed.Bockius@ivita.com 1\nTee 1\nTeenage 1\nTeenagers 1\nTegucigalpa 1\nTeijin 1\nTeikoku 1\nTeito 1\nTekiat 1\nTeknowledge 1\nTelaction 1\nTele 1\nTeleVideo 1\nTelectronics 1\nTeleflora 1\nTelegraaf 1\nTelerama 1\nTelescope 1\nTelevisions 1\nTelly 1\nTelugu 1\nTelxon 1\nTemper 1\nTempered 1\nTempers 1\nTemples 1\nTempura 1\nTenacity 1\nTencel 1\nTendered 1\nTenn 1\nTennesse 1\nTennessean 1\nTennis 1\nTenth 1\nTeodorani 1\nTeodulo 1\nTequila 1\nTerceira 1\nTerminals 1\nTerminator 1\nTerrawi 1\nTerree 1\nTerrence 1\nTerrific 1\nTeruel 1\nTese 1\nTesla 1\nTestator 1\nTesting 1\nTeton 1\nTetris 1\nTettamanti 1\nTetterode 1\nTex 1\nTex. 
1\nTexasness 1\nText 1\nTextbook 1\nThabo 1\nThaddeus 1\nThanh 1\nThankfully 1\nThanking 1\nThatcherism 1\nThatcherite 1\nThayer 1\nThefts 1\nTheirs 1\nThema 1\nTheological 1\nTheorists 1\nTheran 1\nTherapists 1\nTherein 1\nThermal 1\nThermoelectric 1\nThermometer 1\nTheupups 1\nThi$ 1\nThiep 1\nThierry 1\nThirteenth 1\nThirthar 1\nTholt 1\nThomae 1\nThomasini 1\nThomistic 1\nThompsons 1\nThornton 1\nThoroughbred 1\nThoroughly 1\nThou 1\nThrashers 1\nThreat 1\nThreatening 1\nThrice 1\nThroat 1\nThrown 1\nThun 1\nThunderbird 1\nThurow 1\nThx 1\nThxs 1\nThy 1\nTiVo 1\nTian'ge 1\nTianenmen 1\nTianhe 1\nTiantai 1\nTiantao 1\nTiantong 1\nTiaoyutai 1\nTiba 1\nTibbles 1\nTibbs 1\nTibetans 1\nTickell 1\nTickets 1\nTidewater 1\nTidia 1\nTie 1\nTied 1\nTieh 1\nTiempo 1\nTiemuer 1\nTienmu 1\nTienti 1\nTiepolo 1\nTier 1\nTighten 1\nTigris 1\nTigue 1\nTil 1\nTile 1\nTillery 1\nTillinghast 1\nTilted 1\nTimeline 1\nTimer 1\nTiming 1\nTingchuo 1\nTingfang 1\nTinseltown 1\nTiny 1\nTipasa 1\nTirana 1\nTired 1\nTirello 1\nTissues 1\nTithing 1\nTitled 1\nTito 1\nTitus 1\nToad 1\nToast 1\nTobin 1\nTobishima 1\nTobruk 1\nTockman 1\nTod 1\nToensing 1\nToepfer 1\nToes 1\nToiba 1\nToils 1\nTokai 1\nTokinio 1\nTokuo 1\nTokuyama 1\nTold 1\nTolentino 1\nTolkien 1\nToll 1\nTolling 1\nTolstoy 1\nTomahawk 1\nTomaz 1\nTombs 1\nTomcats 1\nTomgram 1\nTomiPilates 1\nTomkin 1\nTomoshige 1\nTong'Il 1\nTongling 1\nTongyong 1\nTonji 1\nTonka 1\nTonto 1\nTools 1\nToomanytaxes 1\nTootsie 1\nTopaz 1\nTopeka 1\nTopix 1\nTopping 1\nToprak 1\nTorah 1\nTorchmark 1\nTories 1\nTornado 1\nToros 1\nTorrealba 1\nTorrence 1\nTorrents 1\nTorrington 1\nTorts 1\nTorvalds 1\nTorx 1\nTosco 1\nToshimitsu 1\nToshiyuki 1\nTotally 1\nTotals 1\nTots 1\nTotten 1\nTouched 1\nToufen 1\nTour 1\nTourette 1\nTouring 1\nTourists 1\nTowering 1\nTowing 1\nTowns 1\nTownsend 1\nTownships 1\nToyama 1\nToyko 1\nToyobo 1\nToyoko 1\nTracer 1\nTracing 1\nTracker 1\nTrader 1\nTraditionalists 1\nTraditions 1\nTrafalgar 1\nTrafficking 
1\nTragedies 1\nTrainchl 1\nTrained 1\nTrains 1\nTranquil 1\nTrans-Jordan 1\nTrans-Mediterranean 1\nTransair 1\nTransatlantic 1\nTransfer 1\nTransformation 1\nTransformational 1\nTransformers 1\nTranslant 1\nTransmillennial 1\nTransmutation 1\nTransnational 1\nTransol 1\nTransparency 1\nTransplantation 1\nTransponders 1\nTransporatation 1\nTransylvania 1\nTrappist 1\nTrash 1\nTravelgate 1\nTravelled 1\nTravelocity 1\nTravels 1\nTraverse 1\nTraverso 1\nTraynor 1\nTreasurers 1\nTreat 1\nTreating 1\nTrebian 1\nTredegar 1\nTrekkers 1\nTrekkies 1\nTremblant 1\nTremendae 1\nTrentret 1\nTrettien 1\nTrevor 1\nTri-Service 1\nTrier 1\nTries 1\nTriggering 1\nTrimble 1\nTrimmer 1\nTrinen 1\nTrinidadian 1\nTrinitron 1\nTripod 1\nTrish 1\nTristars 1\nTriton 1\nTrivia 1\nTrivial 1\nTrolls 1\nTrompe 1\nTrong 1\nTroop 1\nTrotting 1\nTroubleFunk 1\nTroubles 1\nTrousse 1\nTroutman 1\nTruanderie 1\nTruckee 1\nTruckers 1\nTrucks 1\nTrud 1\nTruffaut 1\nTruly 1\nTrumps 1\nTrunkslu 1\nTrusk 1\nTrusthouse 1\nTruthful 1\nTryon 1\nTsarist 1\nTse 1\nTshirt 1\nTsi 1\nTsim 1\nTsinghua 1\nTucheng 1\nTue. 1\nTueni 1\nTues. 1\nTuesdays 1\nTugs 1\nTuitions 1\nTukaram 1\nTuku 1\nTulip 1\nTully 1\nTumble 1\nTunas 1\nTundra 1\nTune 1\nTunghai 1\nTunhwa 1\nTunisia 1\nTuo 1\nTupolev 1\nTupperware 1\nTurandot 1\nTurben 1\nTurgut 1\nTurkcell 1\nTurkmenia 1\nTurkmenistan 1\nTurnaround 1\nTurnbull 1\nTurnoff 1\nTurtle 1\nTussey 1\nTustin 1\nTuz 1\nTweens 1\nTwenties 1\nTwilight 1\nTwinky 1\nTwinsburg 1\nTwist 1\nTwisted 1\nTyburn 1\nTyco 1\nTymnet 1\nTypes 1\nTyping 1\nTyrannosaurus 1\nTyson 1\nTze 1\nTzeng 1\nTzung 1\nU-turn 1\nU.A.E. 1\nU.Cal 1\nU.N 1\nU.S.C. 
1\nU3 1\nUAL'S 1\nUC 1\nUCC 1\nUCSF 1\nUEP 1\nUGLY 1\nULGG 1\nUMNO 1\nUN-ISLAMIC 1\nUNA 1\nUNDER 1\nUNIFIRST 1\nUNION 1\nUNR 1\nUNRESOLVED 1\nUNSC 1\nUNSCEAR 1\nUNTRUE 1\nUPJOHN 1\nURGED 1\nURGENT 1\nURIs 1\nUS116.7 1\nUSAF 1\nUSAirways 1\nUSCanada 1\nUSED 1\nUSN 1\nUSO 1\nUSPS 1\nUSUAL 1\nUTC 1\nUTF 1\nUTH's 1\nUUP 1\nUber 1\nUbuntu 1\nUclaf 1\nUdeid 1\nUgliness 1\nUgly 1\nUh-uh 1\nUhle 1\nUhlmann 1\nUigur 1\nUkrainians 1\nUlan 1\nUlbricht 1\nUlead 1\nUlier 1\nUllman 1\nUlmanis 1\nUlric 1\nUlster 1\nUltima 1\nUltimatum 1\nUmbrella 1\nUmm 1\nUnafraid 1\nUnamused 1\nUnburden 1\nUnconfirmed 1\nUnconstitutional 1\nUndaunted 1\nUndead 1\nUnderclass 1\nUnderneath 1\nUnderscoring 1\nUnderseas 1\nUnderserved 1\nUnderstandably 1\nUnderstanding 1\nUnderused 1\nUnderwood 1\nUndoubtedly 1\nUnease 1\nUneasiness 1\nUnemployed 1\nUnemployent 1\nUnenlightened 1\nUnesco 1\nUnexpected 1\nUnexpectedly 1\nUnfaithful 1\nUnfilled 1\nUnfortunalty 1\nUnfortunate 1\nUnfriendly 1\nUng 1\nUngaretti 1\nUngermann 1\nUnhappily 1\nUni-President 1\nUnicom 1\nUnida 1\nUnificationism 1\nUnificators 1\nUnigesco 1\nUnilite 1\nUnimin 1\nUnincorporated 1\nUninhibited 1\nUninstall 1\nUnionFed 1\nUnionist 1\nUnitas 1\nUnite 1\nUnitich 1\nUnitours 1\nUnity 1\nUniverse 1\nUnjust 1\nUnlikely 1\nUnloaded 1\nUnloved 1\nUnmanned 1\nUnprovable 1\nUnreformed 1\nUnrelieved 1\nUnreported 1\nUnsolved 1\nUnsuspecting 1\nUntiringly 1\nUnto 1\nUntrustworthy 1\nUnused 1\nUnveiled 1\nUnwilling 1\nUofH 1\nUpchurch 1\nUpdate 1\nUpdated 1\nUpgrades 1\nUpgrading 1\nUphold 1\nUpload 1\nUpping 1\nUprise 1\nUpset 1\nUpshifting 1\nUpstairs 1\nUpstream 1\nUranus 1\nUrbanism 1\nUrbuinano 1\nUrchin 1\nUrge 1\nUrging 1\nUribe 1\nUrinary 1\nUris 1\nUrs 1\nUrsula 1\nUsage 1\nUseful 1\nUser 1\nUsery 1\nUshuaia 1\nUsines 1\nUtahans 1\nUthaymin 1\nUthman 1\nUtilizes 1\nUtilizing 1\nUtter 1\nUwe 1\nUyl 1\nUzi 1\nUzika 1\nV-block 1\nV.H. 
1\nV8 1\nVAERS 1\nVALLEY 1\nVAN 1\nVARIAN 1\nVAT 1\nVAX9000 1\nVE 1\nVERYYY 1\nVERYYYY 1\nVF 1\nVFW 1\nVHDL 1\nVIACOM 1\nVICTIMS 1\nVICTORIES 1\nVIDEO 1\nVII 1\nVIN 1\nVINGAS 1\nVIPs 1\nVIRGO 1\nVISAKHA 1\nVISUALIZING 1\nVITAC 1\nVITRO 1\nVLSI 1\nVOLUME 1\nVOLUNTARISM 1\nVOM 1\nVPP 1\nVTC 1\nVTEC 1\nVaastu 1\nVacaville 1\nVaccine 1\nVachon 1\nVaclav 1\nVadar 1\nVadas 1\nVahid 1\nVal 1\nValais 1\nValdes 1\nValdiviesco 1\nValente 1\nValenti 1\nValentin 1\nValero 1\nValery 1\nValiant 1\nValladolid 1\nValparaiso 1\nValu 1\nValuable 1\nVandenBerg 1\nVangie.McGilloway@powersrc.com 1\nVanguardia 1\nVantage 1\nVanuatu 1\nVappenfabrikk 1\nVariety 1\nVariously 1\nVarkala 1\nVarney 1\nVaro 1\nVarvara 1\nVasari 1\nVasco 1\nVass 1\nVassar 1\nVassiliades 1\nVat 1\nVault 1\nVauxhall 1\nVauxhalls 1\nVauxhill 1\nVax 1\nVaxSyn 1\nVeatch 1\nVeba 1\nVedas 1\nVeekz 1\nVega 1\nVegans 1\nVegetables 1\nVehicles 1\nVeiling 1\nVelasco 1\nVelcro 1\nVellante 1\nVen 1\nVenemon 1\nVenerable 1\nVenetoen 1\nVengeance 1\nVentes 1\nVento 1\nVentspils 1\nVercellotti 1\nVerde 1\nVerdun 1\nVerfahrenstechnik 1\nVerification 1\nVerizon 1\nVerlaine 1\nVeronica 1\nVerse 1\nVersicherung 1\nVersicherungs 1\nVertical 1\nVesselin 1\nVet 1\nVeterinarian 1\nVeterinary 1\nVetrivinia 1\nVets 1\nVezina 1\nViagra 1\nViaje 1\nVice-Chairman 1\nVice-President 1\nVice-minister 1\nViceroy 1\nVicks 1\nVictim 1\nVictorious 1\nVideoconference 1\nVie 1\nVieira 1\nViet 1\nViewing 1\nViewpoint 1\nViews 1\nVigdor 1\nVigor 1\nVigorously 1\nVijay 1\nVikings 1\nVikram 1\nViktor 1\nVilas 1\nVillagers 1\nVillanueva 1\nVincennes 1\nVinegar 1\nVineyards 1\nVining 1\nVint 1\nVintage 1\nVinyard 1\nVinylon 1\nViola 1\nViolation 1\nVioleta 1\nVios 1\nViper 1\nViral 1\nVirgil 1\nVirgilio 1\nVirginian 1\nVirology 1\nVirtual 1\nViscous 1\nVisher 1\nVishwanath 1\nVisionQuest 1\nVisker 1\nVisual 1\nVisualisations 1\nVisualizations 1\nVitaly 1\nVitamin 1\nVitter 1\nVittoria 1\nViva 1\nVivaldi 1\nVivien 1\nVizas 1\nVizcaya 1\nVizeversa 
1\nVladaven 1\nVladimiro 1\nVladivostok 1\nVlaja 1\nVlasi 1\nVlico 1\nVnet 1\nVoIP 1\nVocal 1\nVocational 1\nVoiceMax 1\nVoidanich 1\nVojislave 1\nVojislov 1\nVol. 1\nVolio 1\nVolland 1\nVolpe 1\nVoltages 1\nVolumes 1\nVoluntarily 1\nVoluntary 1\nVolunteers 1\nVomiting 1\nVonage 1\nVopunsa 1\nVorontsov 1\nVoss 1\nVoter 1\nVoucher 1\nVragulyuv 1\nVranian 1\nVrbnicka 1\nVroom 1\nVs 1\nVt 1\nVultures 1\nVyquest 1\nW.A. 1\nW.B. 1\nW.C. 1\nW.G. 1\nW.H.S. 1\nW.T. 1\nW.Va. 1\nWA 1\nWAD 1\nWADE 1\nWAGE 1\nWAITING 1\nWAITRESS 1\nWALL 1\nWANES 1\nWAR 1\nWARNED 1\nWARS 1\nWASTED 1\nWATCH 1\nWATKINS 1\nWAV 1\nWAVE 1\nWB 1\nWBBM 1\nWD 1\nWEEK 1\nWEEKS 1\nWEFA 1\nWEIRTON 1\nWELLS 1\nWENT 1\nWEST 1\nWFAA 1\nWHDH 1\nWHEC 1\nWHISPER 1\nWHITE 1\nWHITMAN 1\nWHOLESALE 1\nWIND 1\nWINSTON 1\nWIRELESS 1\nWITHHOLDING 1\nWLF 1\nWMD 1\nWNEM 1\nWON 1\nWONDERFUL 1\nWORST 1\nWORTH 1\nWOS 1\nWOThigh 1\nWOW 1\nWSN 1\nWTI 1\nWUSA 1\nWWOR 1\nWYSE 1\nWa 1\nWabal 1\nWabil 1\nWachtler 1\nWafaa 1\nWaffen 1\nWaged 1\nWages 1\nWagg 1\nWagner 1\nWagon 1\nWah 1\nWahabis 1\nWahbi 1\nWahhabi 1\nWahlberg 1\nWain 1\nWaist 1\nWaitan 1\nWal-Marts 1\nWalcott 1\nWaldenbooks 1\nWaldman 1\nWalensa 1\nWaleson 1\nWalked 1\nWalkin 1\nWalkman 1\nWalkmen 1\nWallach 1\nWallingford 1\nWallis 1\nWalls 1\nWalsingham 1\nWaltana 1\nWaltch 1\nWalther 1\nWames 1\nWanbaozhi 1\nWanda 1\nWanders 1\nWandong 1\nWanfang 1\nWanfo 1\nWangda 1\nWanghsi 1\nWanglang 1\nWaning 1\nWanit 1\nWanjialing 1\nWanke 1\nWanna 1\nWannan 1\nWannian 1\nWanniski 1\nWanshou 1\nWansink 1\nWanxian 1\nWarburgs 1\nWardwell 1\nWaring 1\nWarmen 1\nWarned 1\nWarners 1\nWarning 1\nWarrant 1\nWarrens 1\nWarrick 1\nWarwickshire 1\nWary 1\nWasatch 1\nWashPost 1\nWashes 1\nWashing 1\nWatanabe 1\nWatchers 1\nWatchmen 1\nWaterfall 1\nWaterford 1\nWaterhouse 1\nWaters 1\nWatertown 1\nWatsonville 1\nWaugh 1\nWaukesha 1\nWaverly 1\nWax 1\nWe've 1\nWeakening 1\nWeakens 1\nWeakness 1\nWeapon 1\nWear 1\nWearing 1\nWeathermen 1\nWeber 1\nWebern 1\nWeblog 1\nWedbush 
1\nWedeman 1\nWednesday's 1\nWee 1\nWeedon 1\nWeeds 1\nWeekends 1\nWeeks 1\nWegener 1\nWeibin 1\nWeicheng 1\nWeidong 1\nWeiguang 1\nWeihua 1\nWeiner 1\nWeingarten 1\nWeinroth 1\nWeisman 1\nWeisskopf 1\nWeitz 1\nWeixian 1\nWeizhou 1\nWelcomes 1\nWelle 1\nWellir 1\nWelt 1\nWelty 1\nWendler 1\nWeng 1\nWenhao 1\nWenhua 1\nWenhuangduo 1\nWenhui 1\nWenjian 1\nWenkerlorphsky 1\nWennberg 1\nWenshan 1\nWent 1\nWentworth 1\nWenxin 1\nWerewolf 1\nWertz 1\nWes 1\nWeshikar 1\nWesley 1\nWesleyan 1\nWestboro 1\nWestburne 1\nWestco 1\nWestdeutsche 1\nWesterly 1\nWestminster 1\nWestport 1\nWestview 1\nWetzel 1\nWhaaaaaaaaaaat 1\nWhaler 1\nWham 1\nWheaties 1\nWheeling 1\nWheels 1\nWherein 1\nWherever 1\nWhie 1\nWhinney 1\nWhipsawed 1\nWhirlpool 1\nWhisk 1\nWhiskers 1\nWhiskey 1\nWhisky 1\nWhisper 1\nWhitehead 1\nWhitening 1\nWhitewater 1\nWhitey 1\nWhitfield 1\nWhittier 1\nWhoa 1\nWholes 1\nWhom 1\nWhoopee 1\nWhoville 1\nWi940 1\nWiFi 1\nWicca 1\nWichterle 1\nWickes 1\nWide 1\nWidely 1\nWidowed 1\nWieden 1\nWiegers 1\nWiener 1\nWiesbaden 1\nWiesenthal 1\nWieslawa 1\nWiggins 1\nWigglesworth 1\nWight 1\nWiiams 1\nWiki 1\nWikimedia 1\nWikipedia.org 1\nWilcock 1\nWildbad 1\nWildenstein 1\nWilderness 1\nWile 1\nWilhite 1\nWillam 1\nWillem 1\nWilliams@ENRON_DEVELOPMENT 1\nWilliamsburg 1\nWilliamses 1\nWilling 1\nWillingness 1\nWillow 1\nWilm 1\nWilpers 1\nWilsonian 1\nWin 1\nWindermere 1\nWindflower 1\nWingX 1\nWingate 1\nWinger 1\nWinging 1\nWings 1\nWinkle 1\nWinnetka 1\nWinnipeg 1\nWinslow 1\nWinster 1\nWinterfield 1\nWinterthur 1\nWire 1\nWired 1\nWisely 1\nWiser 1\nWish 1\nWissam 1\nWisteria 1\nWitch 1\nWithdrawing 1\nWithrow 1\nWittmer 1\nWives 1\nWixom 1\nWiz 1\nWoi 1\nWok 1\nWoking 1\nWolfman 1\nWolfsheim 1\nWolfson 1\nWollkook 1\nWollo 1\nWolof 1\nWolves 1\nWonders 1\nWonderworld 1\nWoodMac 1\nWoodrum 1\nWoodside 1\nWoodstock 1\nWoodworth 1\nWoolard 1\nWoong 1\nWordPerfect 1\nWordsworth 1\nWorkplace 1\nWorkshops 1\nWorldcom 1\nWorried 1\nWorries 1\nWorry 1\nWorsely 1\nWorsening 
1\nWorthington 1\nWorthy 1\nWrangler 1\nWrapping 1\nWrench 1\nWrighting 1\nWrights 1\nWrist 1\nWristwatch 1\nWrite 1\nWuchner 1\nWuerttemburg 1\nWuhu 1\nWyly 1\nWyman 1\nWyo. 1\nX-Files 1\nX-box 1\nX. 1\nX33098 1\nXBox 1\nXDJM 1\nXIC 1\nXMen 1\nXP 1\nXR4Ti's 1\nXS 1\nXSL 1\nXVID 1\nXacto 1\nXangsane 1\nXerxes 1\nXiangxiang 1\nXiangyu 1\nXiangyun 1\nXianlong 1\nXiannian 1\nXianwen 1\nXiaocun 1\nXiaodong 1\nXiaofang 1\nXiaojie 1\nXiaolangdi 1\nXiaolin 1\nXiaolue 1\nXiaoming 1\nXiaoqing 1\nXiaoyu 1\nXide 1\nXidex 1\nXidong 1\nXierong 1\nXiesong 1\nXiguang 1\nXiguo 1\nXijiang 1\nXiliang 1\nXimei 1\nXinbaotianyang 1\nXing 1\nXingdong 1\nXinghong 1\nXingtang 1\nXinhuadu 1\nXinjing 1\nXinmei 1\nXinyi 1\nXinzhong 1\nXinzhuang 1\nXiquan 1\nXishan 1\nXiu 1\nXiucai 1\nXiuquan 1\nXiuwen 1\nXiuying 1\nXixihahahehe 1\nXizhi 1\nXtreme 1\nXuancheng 1\nXuangang 1\nXuange 1\nXuanwu 1\nXue 1\nXuejun 1\nXueqin 1\nXueqing 1\nXuezhong 1\nXuhui 1\nXun 1\nXushun 1\nY!A 1\nY. 1\nY.J. 1\nY.S. 1\nYAH 1\nYAHOO 1\nYALE 1\nYAY 1\nYE 1\nYN 1\nYOM 1\nYORK 1\nYUMMY 1\nYYYY-MM-DD 1\nYaaba 1\nYachtsman 1\nYacos 1\nYadavaran 1\nYael 1\nYafei 1\nYahao 1\nYahoos 1\nYahudi 1\nYahweh 1\nYakie 1\nYakutsk 1\nYala 1\nYaling 1\nYam 1\nYamaguchi 1\nYamaha 1\nYamane 1\nYamatane 1\nYaming 1\nYancheng 1\nYanfeng 1\nYangCheng 1\nYanghe 1\nYangu 1\nYangzhou 1\nYaniv 1\nYankelovich 1\nYankus 1\nYannian 1\nYanping 1\nYanqun 1\nYanzhen 1\nYanzhi 1\nYaobang 1\nYaodu 1\nYaoming 1\nYaotang 1\nYaoyao 1\nYaping 1\nYaqub 1\nYaqubi 1\nYards 1\nYarmouth 1\nYaroslavl 1\nYas 1\nYaser 1\nYasir 1\nYasmine 1\nYastrzemski 1\nYasumichi 1\nYatsen 1\nYawheh 1\nYaxin 1\nYay 1\nYaya 1\nYayir 1\nYe$ 1\nYea 1\nYearwood 1\nYeast 1\nYeat 1\nYehud 1\nYehuda 1\nYehudi 1\nYelinia 1\nYemma 1\nYeong 1\nYersinia 1\nYesterdays 1\nYichang 1\nYidagongzi 1\nYiddish 1\nYifei 1\nYigal 1\nYiguo 1\nYikes 1\nYililan 1\nYiman 1\nYimeng 1\nYiming 1\nYingqi 1\nYingqiang 1\nYingrui 1\nYingtan 1\nYingzhen 1\nYining 1\nYinkang 1\nYinmo 1\nYinxuan 1\nYippies 
1\nYiu 1\nYizhuang 1\nYkeba 1\nYoder 1\nYohani 1\nYohei 1\nYon 1\nYoncayu 1\nYongbo 1\nYongding 1\nYongfeng 1\nYonghua 1\nYongjia 1\nYongmin 1\nYongqi 1\nYongqiu 1\nYongwei 1\nYongxiang 1\nYongzhao 1\nYongzhi 1\nYoon 1\nYorktown 1\nYosee 1\nYoshiaki 1\nYoshihisa 1\nYoshitoki 1\nYosi 1\nYouTubeMailer 1\nYouchou 1\nYouhu 1\nYoujiang 1\nYoumei 1\nYoungberg 1\nYounger 1\nYoungstown 1\nYourself 1\nYousuf 1\nYouyang 1\nYuanlin 1\nYuanzhe 1\nYuba 1\nYubitritch 1\nYucheng 1\nYuchih 1\nYuden 1\nYudiad 1\nYuegan 1\nYueguang 1\nYueli 1\nYuen 1\nYueqing 1\nYuesheng 1\nYugoslavian 1\nYugoslavic 1\nYukuang 1\nYuli 1\nYulin 1\nYum 1\nYuncheng 1\nYunfa 1\nYunfei 1\nYunting 1\nYunzhi 1\nYup 1\nYusuf 1\nYusufiyah 1\nYutang 1\nYutsai 1\nYuxi 1\nYuzhao 1\nYvan 1\nYvon 1\nZ06 1\nZCTA 1\nZD 1\nZDNet 1\nZIP 1\nZabin 1\nZach 1\nZacharias 1\nZafra 1\nZagros 1\nZagurka 1\nZahar 1\nZahir 1\nZahoud 1\nZahra 1\nZainuddin 1\nZaishuo 1\nZaita 1\nZakar 1\nZakaria 1\nZalubinau 1\nZama 1\nZaman1 1\nZamislov 1\nZamya 1\nZanim 1\nZanzhong 1\nZaobao.com 1\nZapotec 1\nZaragova 1\nZatanna 1\nZawraa 1\nZbigniew 1\nZe 1\nZeal 1\nZealander 1\nZebing 1\nZebulon 1\nZedillo 1\nZehnder 1\nZeiger 1\nZeisler 1\nZeitung 1\nZel 1\nZelikow 1\nZellers 1\nZenedine 1\nZenger 1\nZengshou 1\nZengtou 1\nZenni 1\nZeppelin 1\nZequan 1\nZeus 1\nZexu 1\nZeyuan 1\nZhan 1\nZhangdian 1\nZhangjiagang 1\nZhangyu 1\nZhaojiacun 1\nZhaoxiang 1\nZhaoxing 1\nZhaozhong 1\nZhe 1\nZhehui 1\nZhengda 1\nZhengdao 1\nZhengdong 1\nZhenghua 1\nZhengming 1\nZhenguo 1\nZhenhua 1\nZhenning 1\nZhenqing 1\nZhenya 1\nZhezhu 1\nZhi 1\nZhibang 1\nZhicheng 1\nZhigang 1\nZhigen 1\nZhiguo 1\nZhijiang 1\nZhili 1\nZhiliang 1\nZhilin 1\nZhiling 1\nZhimin 1\nZhixiang 1\nZhixiong 1\nZhiyi 1\nZhiyuan 1\nZhizhi 1\nZhizhong 1\nZhong 1\nZhongfa 1\nZhongguan 1\nZhonghou 1\nZhonghui 1\nZhonglong 1\nZhongnan 1\nZhongshi 1\nZhongxian 1\nZhongyuan 1\nZhouzhuang 1\nZhuangzi 1\nZhujia 1\nZhuoma 1\nZhuqing 1\nZia 1\nZiad 1\nZibo 1\nZiebarth 1\nZiff 1\nZiliang 1\nZimarai 
1\nZimet 1\nZimmer 1\nZimmerman 1\nZingic 1\nZinny 1\nZipdash 1\nZipser 1\nZirbel 1\nZiyuan 1\nZoe 1\nZongbin 1\nZongmin 1\nZongming 1\nZongren 1\nZongxian 1\nZongxin 1\nZoning 1\nZoology 1\nZoroastrian 1\nZorro 1\nZosen 1\nZubayda 1\nZuercher 1\nZuhua 1\nZukang 1\nZukin 1\nZulus 1\nZumbrunn 1\nZunjing 1\nZunyi 1\nZuo 1\nZuowei 1\nZupan 1\nZuricic 1\nZvi 1\nZweibel 1\nZweig 1\nZwelakhe 1\nZwiren 1\nZygmunt 1\n[. 1\n^^ 1\n_________ 1\n____________ 1\n_______________ 1\n_________________________ 1\n___________________________________________ 1\n_that_ 1\n_way_ 1\n`91 1\n`99 1\n`T 1\n`n 1\na-plenty 1\na.D 1\nabacus 1\nabacuses 1\nabased 1\nabashed 1\nabates 1\nabating 1\nabattoir 1\nabbreviate 1\nabc.com 1\nabdicated 1\nabdication 1\nabdomens 1\naberration 1\nabetted 1\nabhor 1\nabides 1\nabilites 1\nabject 1\nabjectly 1\nabode 1\nabominant 1\nabominations 1\nabomnaf@hotmail.com 1\naboout 1\nabort 1\nabortive 1\nabou 1\nabounded 1\nabounding 1\nabovementioned 1\nabrasion 1\nabrasions 1\nabreast 1\nabridged 1\nabrogation 1\nabsentia 1\nabsenting 1\nabsolution 1\nabsolves 1\nabsolving 1\nabsorbent 1\nabsoulutely 1\nabstain 1\nabstentia 1\nabstention 1\nabstentions 1\nabstinence 1\nabstracts 1\nabsurdities 1\nabuser 1\nabusers 1\nabuts 1\nac 1\nacademe 1\nacademicians 1\nacccounting 1\nacceded 1\naccelerative 1\naccelerator 1\naccentuated 1\naccesory 1\naccessment 1\naccessorize 1\naccolades 1\naccomodated 1\naccomodating 1\naccompaniments 1\naccompanist 1\naccomplishes 1\naccorded 1\naccountabilities 1\naccoutrements 1\naccreted 1\naccrues 1\naccumulates 1\naccusatory 1\naccusers 1\naccussed 1\nacetic 1\nacetone 1\nacetylene 1\nacetylsalicylic 1\nached 1\naches 1\nachiever 1\nacidified 1\nacids 1\nacknowledgment 1\nacolyte 1\nacommodated 1\nacorns 1\nacoustic 1\nacquiesced 1\nacquiescence 1\nacquistion 1\nacreage 1\nacronyms 1\nacrophobic 1\nacrylics 1\nacter 1\nactins 1\nactionable 1\nactivating 1\nactivators 1\nactives 1\nactuality 1\nactualization 1\nactuarial 
1\nactuaries 1\nactuary 1\nacutus 1\nad=nd 1\nadage 1\nadaptive 1\nadd'l 1\nadders 1\naddictions 1\nadditive 1\nadenocard 1\nadepts 1\nadhear 1\nadipocyte 1\nadjective 1\nadjectives 1\nadjoined 1\nadjourned 1\nadjunct 1\nadjuvant 1\nadman 1\nadmen 1\nadmidst 1\nadministering 1\nadministers 1\nadminstrative 1\nadmirably 1\nadmires 1\nadmissible 1\nadmonish 1\nado 1\nadorable 1\nadores 1\nadoring 1\nadornments 1\nadrift 1\nadroit 1\nadultery 1\nadvancer 1\nadvantaged 1\nadversarial 1\nadvertizing 1\nadvertorial 1\naegis 1\naerobatics 1\naerodynamic 1\naeterna 1\naffable 1\naffectation 1\naffiliating 1\naffixed 1\nafflictions 1\naffordability 1\naffords 1\nafoot 1\naforethought 1\nafoul 1\nafresh 1\nafrocentric 1\naft 1\nafterglow 1\nafterthought 1\nafterwords 1\nagape 1\nagarwood 1\nageel-al9labah@hotmail.com..@z@aabal-Saah 1\nageel-al9labah@hotmail.com..@z@aabalsaah 1\naggrandizing 1\naggravates 1\naggravation 1\naggressiveness 1\naggressors 1\naggro 1\naghast 1\nagitatedly 1\nagitates 1\nagitating 1\nagonize 1\nagrarian 1\nagriproducts 1\nagro 1\nagrochemical 1\naileron 1\nailerons 1\nails 1\naimless 1\nairconditioner 1\naircrews 1\nairdrop 1\nairfield 1\nairflows 1\nairholes 1\nairmail 1\nairman 1\nairmen 1\najon 1\nakita 1\naksa 1\nal- 1\nal-Sahaf 1\nal. 1\nalchemically 1\nalchemists 1\nalebda@msn.com 1\nalerting 1\nalertly 1\nalertness 1\nalerts 1\nalexandrine 1\nalfalaq@hotmail.com 1\nalfalfa 1\nalfresco 1\nalienate 1\nalienates 1\naligning 1\naliment 1\nalkali 1\nalla 1\nalledged 1\nallegiances 1\nallen 1\nalleviation 1\nallgedly 1\nalligators 1\nallocates 1\nalloys 1\nallrightniks 1\nallusion 1\nalluvial 1\nallwing 1\nallying 1\nallys 1\naloofness 1\nalsaha100@yahoo.com 1\nalt 1\nalt.bread.recipes 1\nalt.consumers 1\nalt.pets.dogs.pitbull 1\naltercation 1\nalternated 1\nalternately 1\nalternates 1\nalters 1\nalts 1\nalufating 1\nalumnus 1\nalway 1\nam@cnn.com 1\nam@cnn.com. 
1\namahs 1\namalgam 1\namalgamate 1\namalgamated 1\namalgamations 1\namazon.com 1\namazons 1\nambassadorial 1\nambassadors 1\nambient 1\namble 1\nambulatory 1\namenable 1\namendements 1\namerica 1\namericans 1\namethyst 1\namethysts 1\namiss 1\namitriptyline 1\nammazing 1\namortize 1\namoung 1\namour 1\namours 1\namoxicillin 1\namphetamine 1\namplification 1\namplifier 1\namplify 1\namputate 1\namt 1\namy.cornell@compaq.com 1\nanachronism 1\nanachronisms 1\nanalize 1\nanalogous 1\nanalogue 1\nanalytics 1\nanarchist 1\nancestor 1\nancillary 1\nand/or 1\nanecdote 1\nanemia 1\nanemias 1\nanemics 1\nanemones 1\nanesthetics 1\nanesthetized 1\nangelfish 1\nangina 1\nangled 1\nanglophone 1\nangora 1\nangular 1\nanimalcare 1\nanimations 1\nanimator 1\nannals 1\nannexation 1\nannexed 1\nannihilate 1\nannihilated 1\nannnouncement 1\nannointed 1\nannotations 1\nannoucement 1\nannuls 1\nanodize 1\nanon 1\nanonimity 1\nanonymousfuck 1\nanorexia 1\nans 1\nanswetred 1\nantagonism 1\nantagonists 1\nantagonizing 1\nante-chambers 1\nanteaters 1\nantebellum 1\nantelope 1\nantennas 1\nanthems 1\nanthology 1\nanthraquinone 1\nanthropologist 1\nanthropologists 1\nanti-Asian 1\nanti-Bork 1\nanti-Chinese 1\nanti-Christian 1\nanti-Communist 1\nanti-Cui 1\nanti-European 1\nanti-Galileo 1\nanti-Israel 1\nanti-Israeli 1\nanti-Lee 1\nanti-Muslim 1\nanti-Noriega 1\nanti-Sandinista 1\nanti-Semite 1\nanti-Semitic 1\nanti-Serbian 1\nanti-Shiite 1\nanti-Somoza 1\nanti-Stalinist 1\nanti-Stratfordian 1\nanti-Turkish 1\nanti-U.S. 
1\nanti-Western 1\nanti-abortionist 1\nanti-affirmative 1\nanti-aircraft 1\nanti-airline 1\nanti-bribery 1\nanti-clotting 1\nanti-communism 1\nanti-condensation 1\nanti-crime 1\nanti-democratic 1\nanti-depressant 1\nanti-diarrhea 1\nanti-diarrheal 1\nanti-epidemic 1\nanti-fascist 1\nanti-flag 1\nanti-foreigner 1\nanti-fraud 1\nanti-gay 1\nanti-heroes 1\nanti-homosexual 1\nanti-hooligan 1\nanti-imperialist 1\nanti-infectives 1\nanti-inflation 1\nanti-inflationary 1\nanti-intellectual 1\nanti-intellectualism 1\nanti-jamming 1\nanti-lock 1\nanti-nausea 1\nanti-outsider 1\nanti-piracy 1\nanti-pollution 1\nanti-porn 1\nanti-profiteering 1\nanti-prostitution 1\nanti-quake 1\nanti-reconnaissance 1\nanti-rejection 1\nanti-science 1\nanti-shipping 1\nanti-static 1\nanti-stratfordian 1\nanti-suppression 1\nanti-taxers 1\nanti-trust 1\nantiSony 1\nantianemia 1\nantiaristo 1\nanticancer 1\nanticleric 1\nanticoagulants 1\nanticult 1\nantidisestablishmentarianism 1\nantidotes 1\nantifreeze 1\nantihero 1\nantihistamine 1\nantimatter 1\nantimissile 1\nantiquated 1\nantiquity 1\nantirationalism 1\nantirealistic 1\nantiserum 1\nantiserums 1\nantithesis 1\nantiwar 1\nanyways 1\napace 1\naparently 1\napathy 1\napear 1\naperture 1\naphorism 1\napollo 1\napologetic 1\napologetically 1\napologised 1\napologizes 1\napologizing 1\napostrophe 1\napparat 1\napparition 1\napparitions 1\nappartently 1\nappartus 1\nappeased 1\nappeasing 1\nappellation 1\nappend 1\nappendages 1\nappetit 1\nappetites 1\nappetitie 1\nappetizing 1\napplauding 1\napplelike 1\napplesauce 1\napplicability 1\napplicances 1\nappointive 1\nappraiser 1\nappraisers 1\nappraises 1\nappreciably 1\napprehensions 1\napprenticed 1\napprising 1\napproachable 1\napprox. 
1\napso 1\naptitude 1\naquamarine 1\naquarist 1\naquariums 1\naquasafe 1\naqueduct 1\naquisition 1\narab 1\narb 1\narbitraging 1\narbitrariness 1\narbitrates 1\narbitrators 1\narborists 1\narcade 1\narcades 1\narch-enemy 1\narch-murderer 1\narchaic 1\narchbishop 1\narches 1\narchetypical 1\narchie 1\narchitecturally 1\narchivist 1\narchness 1\narchy 1\narcs 1\nardently 1\naria 1\narian 1\narid 1\naristocracy 1\naristocrat 1\narivals 1\nark 1\narmament 1\narmistice 1\narmoured 1\narmpits 1\naromas 1\naromatic 1\narometherapy 1\narond 1\narpeggios 1\narraigned 1\narrondissement 1\narsonist 1\narsonists 1\narte 1\narteriosclerosis 1\narthritic 1\narticulates 1\narticulating 1\nartifical 1\nartillerists 1\nartistry 1\nartsy 1\nartware 1\nartworks 1\nascend 1\nascension 1\nascent 1\nascribes 1\naseptically 1\nasf 1\nashes 1\nashland 1\nashters 1\nashtray 1\nashtrays 1\naskew 1\naspected 1\naspens 1\nasphyxiating 1\naspires 1\nassail 1\nassailing 1\nassassins 1\nasseet 1\nassemblages 1\nassemblers 1\nassemblymen 1\nassertaion 1\nassignments 1\nassigns 1\nassuredly 1\nassures 1\nasteroid 1\nastoundingly 1\nastounds 1\nastral 1\nastringency 1\nastronaut's 1\nastronut 1\nastrophysics 1\nasunder 1\nasymmetric 1\nasymmetry 1\nasynchronous 1\natack 1\natavistic 1\natheists 1\natleast 1\natm 1\natolls 1\natonal 1\natra 1\natrophied 1\natrun 1\nattaches 1\nattacker 1\nattarcks 1\nattendee 1\nattested 1\nattics 1\nattn 1\nattributing 1\nattys 1\nau 1\nauberge 1\nauctioning 1\naudible 1\naudio-visual 1\naudiovisual 1\naugur 1\naugust 1\nauguster 1\naunte 1\naural 1\naustralian 1\nautarchies 1\nautarchy 1\nauthoritys 1\nauthorizations 1\nautism 1\nautobahn 1\nautocrat 1\nautofocus 1\nautograph 1\nautoloaded 1\nautomate 1\nautomates 1\nautomating 1\nautonomously 1\nautopsies 1\nautopsy 1\nautumns 1\navailed 1\navailing 1\navariciousness 1\navec 1\navenger 1\naverred 1\naverts 1\navi 1\navid 1\navidly 1\navowedly 1\naw 1\nawash 1\nawfully 1\nawkwardness 1\nawning 1\naxed 1\naxiomatic 
1\naxioms 1\naxles 1\nayatollah 1\naye 1\naz 1\nazalea 1\nb**** 1\nb*llshit 1\nb. 1\nb/t 1\nba.consumers 1\nbabbling 1\nbabel 1\nbabes 1\nbaby's 1\nbabysit 1\nbachelor 1\nbachelorette 1\nbachelorhood 1\nbacillus 1\nbackbench 1\nbackbencher 1\nbackdraft 1\nbackdrops 1\nbackfield 1\nbackfires 1\nbackfist 1\nbackhoe 1\nbackhoes 1\nbacklit 1\nbackpackers 1\nbackpacks 1\nbackpedaling 1\nbackstop 1\nbacktracking 1\nbacteriological 1\nbaddebt 1\nbadder 1\nbadge 1\nbadmouth 1\nbadmouthed 1\nbadmouths 1\nbaffles 1\nbaggers 1\nbagmaker 1\nbagpipe 1\nbailiff 1\nbaiting 1\nbake 1\nbakeoff 1\nbakers 1\nbaksheesh 1\nbalcony 1\nbalding 1\nbaldness 1\nballasts 1\nballcock 1\nballistics 1\nballplayer 1\nballplayers 1\nballyhooed 1\nbalm 1\nbanality 1\nbandied 1\nbandit 1\nbandwidth 1\nbangs 1\nbanishment 1\nbanjo 1\nbankrupted 1\nbankrupty 1\nbanlieusards 1\nbanshees 1\nbanter 1\nbanteringly 1\nbanyan 1\nbao 1\nbaoguo 1\nbaptist 1\nbarbarous 1\nbarbarously 1\nbarbecuing 1\nbarbed 1\nbarbers 1\nbarbershop 1\nbarbs 1\nbarcodes 1\nbarcoding 1\nbarefoot 1\nbarest 1\nbargained 1\nbargelike 1\nbaring 1\nbaritone 1\nbariums 1\nbarking 1\nbarns 1\nbarnyard 1\nbaroque 1\nbarrages 1\nbarrelers 1\nbarricaded 1\nbarricading 1\nbartered 1\nbartering 1\nbaseballs 1\nbasements 1\nbashful 1\nbasilica 1\nbasophobic 1\nbassist 1\nbassoon 1\nbassoonist 1\nbastard 1\nbated 1\nbathtubs 1\nbatshit 1\nbattalians 1\nbatters 1\nbattlegroups 1\nbattlements 1\nbayou 1\nbays 1\nbazaar 1\nbbs 1\nbc 1\nbe-gloved 1\nbe-socked 1\nbeacuse 1\nbead 1\nbeamed 1\nbeanballs 1\nbearable 1\nbeardies 1\nbeatific 1\nbeatings 1\nbeautifull 1\nbeautifying 1\nbeaver 1\nbecase 1\nbecca 1\nbeck 1\nbeckoning 1\nbecouse 1\nbedbugs 1\nbedeviled 1\nbedeviling 1\nbedfellows 1\nbedlam 1\nbedridden 1\nbedrooms 1\nbeech 1\nbeefless 1\nbeefy 1\nbeekeeper 1\nbeeps 1\nbeery 1\nbees' 1\nbeeswax 1\nbeet 1\nbeethoven 1\nbefallen 1\nbefitting 1\nbefriend 1\nbeggar 1\nbeggin' 1\nbegiinning 1\nbegining 1\nbeginners 1\nbegrudge 1\nbeguile 
1\nbehaviorally 1\nbehaviours 1\nbehead 1\nbeheading 1\nbehemoths 1\nbehold 1\nbeholden 1\nbeholder 1\nbehoove 1\nbeige 1\nbeiguan 1\nbelated 1\nbelaying 1\nbelch 1\nbelied 1\nbelittling 1\nbelive 1\nbelle 1\nbellied 1\nbelligerence 1\nbelligerent 1\nbellwethers 1\nbemoans 1\nbemused 1\nbends 1\nbenedict 1\nbenefactors 1\nbeneficence 1\nbeneficent 1\nbeneficially 1\nbenefitted 1\nbenevolence 1\nbenna 1\nbentonite 1\nbequeathed 1\nbequests 1\nberated 1\nbereft 1\nberet 1\nberry 1\nberthing 1\nbeseech 1\nbeseeching 1\nbesiegers 1\nbespectacled 1\nbestiality 1\nbestsellers 1\nbetel 1\nbetrayers 1\nbetrothed 1\nbettas 1\nbettering 1\nbetters 1\nbevy 1\nbewitched 1\nbewitching 1\nbi-directional 1\nbi-polar 1\nbia 1\nbiannual 1\nbiblical 1\nbibliography 1\nbickered 1\nbicornate 1\nbicycling 1\nbicyclist 1\nbide 1\nbigoted 1\nbigwigs 1\nbiiiiiiiiig 1\nbiker 1\nbilatarily 1\nbilaterally 1\nbile 1\nbilges 1\nbilharzia 1\nbilking 1\nbillable 1\nbillboard 1\nbillboards 1\nbilliards 1\nbillionaires 1\nbinder 1\nbinds 1\nbingo 1\nbinkies 1\nbio-analytical 1\nbio-technology 1\nbioTechnology 1\nbiochemist 1\nbiochemists 1\nbiodegradable 1\nbiodegrade 1\nbiodiversity 1\nbioequivalence 1\nbiogas 1\nbiographers 1\nbiographical 1\nbioinformatics 1\nbiomaterials 1\nbiomedicine 1\nbiophysicist 1\nbiopsies 1\nbipartisanship 1\nbiped 1\nbipolar 1\nbirdcage 1\nbirdie 1\nbirdy 1\nbirthdate 1\nbirthdays 1\nbirthstone 1\nbiscuits 1\nbitches 1\nbitmap 1\nbitmapraster 1\nbitrate 1\nbizzare 1\nblabbing 1\nblackest 1\nblackline 1\nblacklisted 1\nblacklisting 1\nblacklists 1\nblackmailing 1\nbladed 1\nblandness 1\nblared 1\nblaring 1\nblarney 1\nblasphemy 1\nblazed 1\nblazer 1\nblazes 1\nbleached 1\nbleakest 1\nbleary 1\nbleating 1\nbled 1\nbleeds 1\nbleep 1\nblemish 1\nblender 1\nblesses 1\nblighted 1\nblinded 1\nblindingly 1\nblinkers 1\nblitzes 1\nblitzing 1\nblitzkrieg 1\nblizzard 1\nblockading 1\nblockbusters 1\nblogosphere 1\nbloke 1\nblokes 1\nbloodied 1\nbloodless 1\nbloodletting 
1\nbloodlust 1\nbloodstream 1\nbloodsuckers 1\nbloodworms 1\nbloodying 1\nbloomed 1\nblooming 1\nblooms 1\nblooper 1\nblouberg 1\nblowed 1\nblower 1\nblowouts 1\nblowtorch 1\nblowup 1\nblshe.com 1\nblu 1\nblubbering 1\nbludgeoned 1\nblueberries 1\nbluebloods 1\nbluesy 1\nbluff 1\nbluish 1\nblundering 1\nblurt 1\nblustery 1\nblying 1\nbmil 1\nbn 1\nbo 1\nboa 1\nboar 1\nboardrooms 1\nboardwalk 1\nboaters 1\nboating 1\nboatmen 1\nbobber 1\nbocci 1\nbodacious 1\nbodegas 1\nbodes 1\nbodice 1\nbodily 1\nbodyguard 1\nbodytalk 1\nbodyweight 1\nbodyworkers 1\nboffows 1\nbogging 1\nboggle 1\nboggles 1\nboggling 1\nbogglingly 1\nbogs 1\nboilerplate 1\nboisterous 1\nbolder 1\nboldest 1\nboldface 1\nboldness 1\nbollard 1\nbolsters 1\nbolt-ette 1\nbombardment 1\nbon 1\nbondholdings 1\nboned 1\nbonfire 1\nbonhomie 1\nbonks 1\nbonnet 1\nbonnets 1\nboo 1\nboogers 1\nbookish 1\nbooklet 1\nbooklets 1\nbookshop 1\nboolean 1\nboondocking 1\nboons 1\nboorish 1\nboors 1\nboos 1\nbooted 1\nbootlegged 1\nbooty 1\nboozing 1\nbopper 1\nboran 1\nborderline 1\nborer 1\nboringly 1\nbosom 1\nboson 1\nbossy 1\nbot 1\nbotanical 1\nbotany 1\nbotching 1\nbothersome 1\nbottlenecked 1\nbottler 1\nbottoms 1\nbouild 1\nboulders 1\nbouncy 1\nbounded 1\nbounding 1\nbouquet 1\nbourbons 1\nbourses 1\nbowlful 1\nbows 1\nbox. 1\nbozos 1\nbp 1\nbrace 1\nbracelet 1\nbracelets 1\nbradley 1\nbrag 1\nbraggarts 1\nbrags 1\nbraids 1\nbrained 1\nbrainiacs 1\nbraining 1\nbrainstorming 1\nbrainwashing 1\nbrainy 1\nbraising 1\nbrambles 1\nbrase 1\nbrassieres 1\nbrat 1\nbrats 1\nbraves 1\nbravest 1\nbravura 1\nbrawny 1\nbrazen 1\nbreadbasket 1\nbreadbox 1\nbreaded 1\nbreadwinner 1\nbreakable 1\nbreakage. 
1\nbreakaway 1\nbreakfasts 1\nbreasted 1\nbreathability 1\nbreather 1\nbreathes 1\nbreathlessly 1\nbreathtakingly 1\nbreathy 1\nbreeeding 1\nbreezes 1\nbreezier 1\nbreweries 1\nbrianp@aiglincoln.com 1\nbricklaying 1\nbrickyard 1\nbridal 1\nbridgehead 1\nbrie 1\nbrigand 1\nbrightening 1\nbrillant 1\nbrimmed 1\nbrimstone 1\nbrine 1\nbrinkmanship 1\nbriny 1\nbrio 1\nbriquettes 1\nbrisker 1\nbristle 1\nbristled 1\nbristles 1\nbristling 1\nbroached 1\nbroadens 1\nbroadside 1\nbroccoli 1\nbroiled 1\nbroncs 1\nbrontosauruses 1\nbrooch 1\nbrood 1\nbrook 1\nbroom 1\nbroth 1\nbrotherhood 1\nbrotherism 1\nbrotherly 1\nbrouhaha 1\nbrowbeat 1\nbrows 1\nbrowsers 1\nbruiser 1\nbrumation 1\nbrunettes 1\nbrushbacks 1\nbrushoff 1\nbrushwork 1\nbrutalized 1\nbrutish 1\nbs 1\nbta 1\nbu 1\nbubblelike 1\nbuckling 1\nbuckshot 1\nbud 1\nbudgeteers 1\nbudgie 1\nbudgies 1\nbuffed 1\nbuffets 1\nbuffetting 1\nbugaboo 1\nbugbear 1\nbugless 1\nbuglike 1\nbuilf 1\nbulgaricus 1\nbulged 1\nbulked 1\nbullcraping 1\nbulldoze 1\nbulldozed 1\nbulldozier 1\nbulled 1\nbulleted 1\nbulletproof 1\nbullhorn 1\nbullishly 1\nbullishness 1\nbullseye 1\nbullshit 1\nbumbling 1\nbunched 1\nbunches 1\nbunco 1\nbungee 1\nbungled 1\nbungling 1\nbunko 1\nbunks 1\nbunt 1\nbuoyancy 1\nbuoys 1\nburbles 1\nbureacratic 1\nburglar 1\nburglarized 1\nburgs 1\nburlap 1\nburlesque 1\nburly 1\nburnishing 1\nburnout 1\nburnouts 1\nburrowed 1\nburrowing 1\nburrows 1\nbushing 1\nbusies 1\nbusinessperson 1\nbusinesswoman 1\nbusing 1\nbusses 1\nbuster 1\nbustle 1\nbusyboys 1\nbusywork 1\nbutchered 1\nbutlers 1\nbuttered 1\nbuttresses 1\nbuyings 1\nbuzzer 1\nbuzzes 1\nbuzzsaw 1\nbuzzword 1\nbyes 1\nbylaws 1\nbylines 1\nbypassed 1\nbypassing 1\nbyproducts 1\nbystander 1\nbyzantine 1\nc.i.f 1\ncDNA 1\ncabaret 1\ncabs 1\ncacophonous 1\ncacophony 1\ncaddy 1\ncadet 1\ncadets 1\ncadge 1\ncaffeine-o-torium 1\ncai 1\ncai: 1\ncairns 1\ncajoled 1\ncaked 1\ncalanques 1\ncalcite 1\ncalculable 1\ncaldera 1\ncalendars 1\ncalender 1\ncalibrated 
1\ncalibrations 1\ncalico 1\ncalifornia 1\ncaliper 1\ncalisthenics 1\ncalligraphies 1\ncallipygous 1\ncallous 1\ncalloused 1\ncalluses 1\ncalmed 1\ncalmer 1\ncaloy 1\ncalumny 1\ncam 1\ncamaraderie 1\ncamaro 1\ncamber 1\ncamels 1\ncameo 1\ncameraman 1\ncamouflage 1\ncampaigner 1\ncampaigners 1\ncamped 1\ncamper 1\ncampers 1\ncampfire 1\ncampfires 1\ncampsite 1\ncanadian 1\ncanals 1\ncanape's 1\ncancellable 1\ncandidacies 1\ncandidly 1\ncandybar 1\ncanis 1\ncannabis 1\ncannibalistic 1\ncannibals 1\ncannot 1\ncanny 1\ncant 1\ncantering 1\ncantonal 1\ncanvases 1\ncanvass 1\ncanvassed 1\ncape 1\ncapillaries 1\ncapitalgains 1\ncapitulated 1\ncappuccinos 1\ncaprice 1\ncapricious 1\ncapriciously 1\ncapriciousness 1\ncapsized 1\ncapsule 1\ncapsules 1\ncapsulitis 1\ncaptains 1\ncaptivating 1\ncaptures 1\ncarath 1\ncaravan 1\ncarbage 1\ncarbamide 1\ncarbde 1\ncarbide 1\ncarbinol 1\ncarbohydrate 1\ncarbonate 1\ncarboxyhemoglobin 1\ncarcinogen 1\ncarcinogens 1\ncardamon 1\ncardholders 1\ncardigan 1\ncardinal 1\ncardinals 1\ncardio 1\ncardiopulmonary 1\ncareen 1\ncareened 1\ncareer-wise 1\ncareerism 1\ncarefuly 1\ncaregivers 1\ncarelessly 1\ncaribou 1\ncaricature 1\ncarless 1\ncarnage 1\ncarnauba 1\ncarnivalesque 1\ncarnivores 1\ncarolina 1\ncaroling 1\ncarols 1\ncarotid 1\ncarouses 1\ncarpenters 1\ncarpentry 1\ncarpetbaggers 1\ncarping 1\ncarriages 1\ncarribean 1\ncarrion 1\ncarryforwards 1\ncarted 1\ncartilage 1\ncarting 1\ncartons 1\ncartwheels 1\ncarval 1\ncarvers 1\ncarves 1\ncascades 1\ncaseload 1\ncaseloads 1\ncaseworkers 1\ncasing 1\ncaskets 1\ncassettes 1\ncastlelike 1\ncastrated 1\ncasuistry 1\ncataloged 1\ncataloging 1\ncatalogue 1\ncatalysis 1\ncatamaran 1\ncatbird 1\ncatch-phrase 1\ncatchers 1\ncatchphrase 1\ncaterer 1\ncathartic 1\ncathedral 1\ncatheters 1\ncathodes 1\ncatsup 1\ncatties 1\ncatwalk 1\ncaucuses 1\ncauldron 1\ncaulking 1\ncausality 1\ncauser 1\ncautioning 1\ncavalry 1\ncavemen 1\ncavies 1\ncavity 1\ncc 1\ncd 1\ncede 1\nceding 1\ncelebrators 
1\ncelibate 1\ncellar 1\ncellars 1\ncellists 1\ncelluloids 1\ncemented 1\ncements 1\ncensors 1\ncensure 1\ncensured 1\ncentenary 1\ncentigrade 1\ncentimeter 1\ncentralize 1\ncentres 1\ncentrifuge 1\ncenturions 1\ncere 1\ncerebral 1\ncerebrovascular 1\ncervix 1\nceviche 1\ncge 1\ncgy 1\nchafe 1\nchagrin 1\nchagrined 1\nchai 1\nchainsaws 1\nchairbound 1\nchairing 1\nchairpersons 1\nchakra 1\nchalk 1\nchalking 1\nchambermaid 1\nchamomile 1\nchamp 1\nchamps 1\nchana 1\nchangbaishan 1\nchangeover 1\nchanneller 1\nchanteuse 1\nchants 1\nchar 1\ncharacterization 1\ncharacterless 1\ncharacterstics 1\ncharades 1\nchardonnay 1\nchariot 1\nchariots 1\ncharitably 1\ncharlatanry 1\ncharlatans 1\ncharles 1\ncharmingly 1\ncharted 1\ncharting 1\nchaser 1\nchases 1\nchatrooms 1\nchattering 1\nchauffeurs 1\nchauvinistic 1\nchauvinists 1\nchealte 1\ncheapens 1\ncheapo 1\ncheater 1\ncheats 1\ncheckbooks 1\nchecker 1\nchecklists 1\ncheckup 1\nchecp 1\ncheeky 1\ncheep 1\ncheerleadng 1\ncheery 1\ncheesecloth 1\ncheeses 1\ncheetah 1\nchelicerates 1\nchemises 1\nchemistries 1\nchemosynthetic 1\nchenille 1\ncheques 1\ncherish 1\ncherubs 1\nchessboard 1\nchevy 1\nchews 1\nchicago 1\nchicanery 1\nchick 1\nchickadee 1\nchihuahuas 1\nchil 1\nchildbearing 1\nchildcare 1\nchildhoods 1\nchildlike 1\nchile 1\nchilled 1\nchilliness 1\nchillingly 1\nchimes 1\nchimneys 1\nchimpanzee 1\nchina-sss.com 1\nchinanews 1\nchinese 1\nchinless 1\nchipboard 1\nchipmaker 1\nchipped 1\nchiropractric 1\nchiros 1\nchirping 1\nchirpy 1\nchit 1\nchlorazepate 1\nchloroplasts 1\nchock 1\nchocolates 1\nchokehold 1\nchomping 1\nchondroitin 1\nchopped 1\nchopper 1\nchopping 1\nchoramine 1\nchoreographer 1\nchoreography 1\nchorion 1\nchortled 1\nchoruses 1\nchosing 1\nchou 1\nchrisssake 1\nchristchurch 1\nchristening 1\nchristopher.knight@latimes.com 1\nchromatography 1\nchrome 1\nchronicles 1\nchronicling 1\nchronological 1\nchronologically 1\nchucked 1\nchuckling 1\nchug 1\nchugged 1\nchum 1\nchurchy 1\nchute 1\nchutzpah 
1\ncichlid 1\nciel 1\ncigars 1\ncigs 1\ncinemas 1\ncinematographer 1\ncinematography 1\ncinnabar 1\ncinnamon 1\ncirca 1\ncircled 1\ncircuited 1\ncircuitous 1\ncirculates 1\ncirculators 1\ncircumcision 1\ncircumference 1\ncircumferential 1\ncircumlocution 1\ncircumnavigated 1\ncircumscribe 1\ncircumventing 1\ncircumvents 1\ncitation 1\ncitizenry 1\ncitric 1\ncitys 1\ncivics 1\ncivilisation 1\ncivilised 1\ncivillian 1\nclambered 1\nclambering 1\nclammy 1\nclamor 1\nclamors 1\nclampdown 1\nclampdowns 1\nclamping 1\nclandestine 1\nclanking 1\nclaptrap 1\nclarifies 1\nclasped 1\nclassiest 1\nclassifies 1\nclassifying 1\nclassless 1\nclassy 1\nclaudication 1\nclaustrophobia 1\nclaustrophobic 1\nclawing 1\nclays 1\ncleanest 1\ncleanings 1\ncleanliness 1\ncleanser 1\ncleansers 1\nclearinghouse 1\ncleaser 1\nclef 1\nclenched 1\nclergymen 1\ncliche 1\nclicked 1\nclicker 1\nclientelage 1\ncliffs 1\nclimates 1\nclimaxes 1\nclimbers 1\nclingy 1\nclinically 1\nclinicals 1\nclinicians 1\nclinked 1\nclinkers 1\nclipped 1\nclippings 1\nclitoris 1\nclobber 1\nclockdate 1\nclocked 1\nclocking 1\nclockwise 1\nclockwork 1\nclog 1\nclosedown 1\ncloseness 1\ncloseup 1\nclosey 1\nclosse 1\nclothier 1\nclothiers 1\ncloudier 1\ncloudiness 1\ncloves 1\nclowning 1\nclubbed 1\nclubhouse 1\nclump 1\nclumping 1\nclumsily 1\nclung 1\nclunking 1\nclutches 1\ncluttered 1\ncn 1\ncnnfn.com 1\nco$t 1\nco-chairmen 1\nco-defendant 1\nco-development 1\nco-director 1\nco-edited 1\nco-editor 1\nco-edits 1\nco-educational 1\nco-founders 1\nco-host 1\nco-insurance 1\nco-issuers 1\nco-managed 1\nco-manager 1\nco-op 1\nco-operatively 1\nco-operators 1\nco-ordinator 1\nco-payments 1\nco-president 1\nco-produce 1\nco-publisher 1\nco-sponsoring 1\nco-venture 1\nco-written 1\ncoacervation 1\ncoasted 1\ncoasters 1\ncoastlines 1\ncoaxing 1\ncob 1\ncobbled 1\ncobs 1\ncobwebs 1\ncockatiel 1\ncockiness 1\ncockney 1\ncockpits 1\ncocks 1\ncoconuts 1\ncocotte 1\ncoddle 1\ncoddling 1\ncodec 1\ncodecs 1\ncodification 
1\ncodify 1\ncodpiece 1\ncoed 1\ncoerce 1\ncoerces 1\ncoextrude 1\ncoffeehouse 1\ncog 1\ncogestion 1\ncognac 1\ncognitive 1\ncognizance 1\ncognoscenti 1\ncohabit 1\ncohere 1\ncoherence 1\ncoherently 1\ncoiffed 1\ncoil 1\ncoincidences 1\ncoincident 1\ncoincidental 1\ncoincidentally 1\ncoinciding 1\ncolder 1\ncoldly 1\ncole 1\ncoli 1\ncolitis 1\ncollaborates 1\ncollaboratively 1\ncollabrative 1\ncollapsable 1\ncollectivization 1\ncollectivizers 1\ncollegial 1\ncolleting 1\ncollide 1\ncollided 1\ncollisions 1\ncolloquia 1\ncolloquies 1\ncolludes 1\ncolonialism 1\ncolonialist 1\ncolonialists 1\ncolonists 1\ncolonize 1\ncolonizer 1\ncolonnaded 1\ncoloration 1\ncoloratura 1\ncolorimetric 1\ncolorization 1\ncolorlessness 1\ncolourful 1\ncolum 1\ncolumbariums 1\ncolumned 1\ncoma 1\ncomapnies 1\ncomatose 1\ncombattant 1\ncombinable 1\ncombs 1\ncombustible 1\ncomcast 1\ncomediennes 1\ncomely 1\ncomers 1\ncomestibles 1\ncomforts 1\ncomfty 1\ncomical 1\ncomically 1\ncomings 1\ncomity 1\ncomma 1\ncommandeering 1\ncommemorated 1\ncommend 1\ncommends 1\ncommercialised 1\ncommercialized 1\ncommercializes 1\ncommiserate 1\ncommiseration 1\ncommissar 1\ncommissariat 1\ncommited 1\ncommitee 1\ncommiting 1\ncommittes 1\ncommittment 1\ncommment 1\ncommmon 1\ncommodified 1\ncommonstock 1\ncommotion 1\ncommune 1\ncommunicates 1\ncommunicator 1\ncommunicators 1\ncommunion 1\ncommunistic 1\ncomp.sources.d 1\ncompacting 1\ncompactly 1\ncompaign 1\ncompanie 1\ncompanys 1\ncomparability 1\ncomparably 1\ncompartmentalize 1\ncompass 1\ncompatability 1\ncompeltly 1\ncompendium 1\ncompensates 1\ncompensations 1\ncompensatory 1\ncompet 1\ncompetitively 1\ncompetitve 1\ncompiles 1\ncomplacence 1\ncomplainant 1\ncompletions 1\ncomplexify 1\ncomplexions 1\ncomplicating 1\ncomplimenting 1\ncompliments 1\ncomporomise 1\ncomportment 1\ncomports 1\ncomposes 1\ncompositional 1\ncompress 1\ncompressa 1\ncompressing 1\ncompunction 1\ncomputerize 1\ncomputes 1\nconceding 1\nconceited 1\nconceive 1\nconceiver 
1\nconceiving 1\nconcentric 1\nconceptions 1\nconceptualization 1\nconceptualize 1\nconcertos 1\nconcisely 1\nconcision 1\nconcoct 1\nconcoctions 1\nconcomitantly 1\nconcretely 1\nconcubine 1\nconcubines 1\nconcur 1\nconcurs 1\nconcussion 1\ncondem 1\ncondense 1\ncondenser 1\ncondensers 1\ncondenses 1\ncondiment 1\nconditons 1\ncondolent 1\ncondoned 1\ncondor 1\nconductors 1\nconfidante 1\nconfidants 1\nconfidences 1\nconfidently 1\nconfidents 1\nconfides 1\nconfine 1\nconfining 1\nconfirmations 1\nconflagrations 1\nconflation 1\nconformist 1\nconfreres 1\ncongealed 1\ncongee 1\ncongestive 1\ncongetsed 1\nconglomerates 1\ncongratulation 1\ncongregate 1\ncongregated 1\ncongregational 1\nconifer 1\nconiferous 1\nconjectures 1\nconjuction 1\nconjugal 1\nconjunct 1\nconjures 1\nconjuring 1\nconnectivity 1\nconned 1\nconnoisseurs 1\nconnotation 1\nconnotations 1\nconnote 1\nconquering 1\nconquests 1\nconrfirmed 1\nconscript 1\nconscripts 1\nconsented 1\nconsentual 1\nconservators 1\nconservatory 1\nconsigning 1\nconsigns 1\nconsistencies 1\nconsolidates 1\nconsolidations 1\nconsolidator 1\nconsort 1\nconsorting 1\nconstants 1\nconstellation 1\nconstitutions 1\nconstrains 1\nconstraint 1\nconstricting 1\nconstrictors 1\nconstructionist 1\nconstructively 1\nconstructon 1\nconstructs 1\nconsultancy 1\nconsumes 1\nconsummate 1\nconsumptions 1\ncont'd. 
1\ncontactable 1\ncontactee 1\ncontagion 1\ncontainerized 1\ncontaminate 1\ncontemplates 1\ncontemplation 1\ncontemplative 1\ncontemporize 1\ncontemptuously 1\ncontentions 1\ncontestants 1\ncontibuted 1\ncontinously 1\ncontinuance 1\ncontinuum 1\ncontorted 1\ncontraption 1\ncontrarian 1\ncontribued 1\ncontrition 1\ncontrived 1\ncontroled 1\ncontruction 1\nconvenants 1\nconvener 1\nconvension 1\nconventionally 1\nconventioners 1\nconverging 1\nconversationalist 1\nconversationalists 1\nconversed 1\nconversos 1\nconvertable 1\nconvertibility 1\nconvertibles 1\nconvexity 1\nconveys 1\nconvicting 1\nconvinces 1\nconvocation 1\nconvoke 1\nconvulsed 1\ncookbooks 1\ncoolers 1\ncools 1\ncooly 1\ncooperates 1\ncooperatively 1\ncooporation 1\ncoppertop 1\ncopping 1\ncoprophilia 1\ncopycat 1\ncopywright 1\ncorals 1\ncoreligionists 1\ncores 1\ncorgi 1\ncorinthian 1\ncorking 1\ncorks 1\ncornflake 1\ncornices 1\ncorns 1\ncornucopia 1\ncorollary 1\ncorona 1\ncorporatewide 1\ncorporeal 1\ncorral 1\ncorrectional 1\ncorrector 1\ncorrects 1\ncorrelates 1\ncorresponds 1\ncorrosive 1\ncorrugated 1\ncorrupted 1\ncorrupters 1\ncory 1\ncosmetology 1\ncosmos 1\ncosseted 1\ncostlier 1\ncostliest 1\ncot 1\ncots 1\ncottages 1\ncouched 1\ncouching 1\ncougar 1\ncoughing 1\ncoughs 1\ncouncilwoman 1\ncounseled 1\ncounsellors 1\ncounselors 1\ncounsels 1\ncounter-accusation 1\ncounter-argument 1\ncounter-attack 1\ncounter-attacks 1\ncounter-claims 1\ncounter-intelligence 1\ncounter-measures 1\ncounter-productive 1\ncounter-propaganda 1\ncounter-revolutionary 1\ncounter-trade 1\ncounteracted 1\ncounterattacked 1\ncounterbalance 1\ncounterbidders 1\ncounterbids 1\ncounterblast 1\ncounterclockwise 1\ncountercultural 1\ncounterespionage 1\ncounterfeits 1\ncountermove 1\ncounterprogram 1\ncountersued 1\ncountersuing 1\ncouponing 1\ncoursed 1\ncourteous 1\ncourtesies 1\ncourthouses 1\ncourtier 1\ncouter-cultural 1\ncouture 1\ncover-ed 1\ncovertly 1\ncoverts 1\ncoverup 1\ncovetous 1\ncovets 1\ncovrefeu 
1\ncoward 1\ncowered 1\ncowering 1\ncowrote 1\ncoy 1\ncoyote 1\ncpa 1\ncpoly 1\ncpus 1\ncrabby 1\ncrackers 1\ncrafts 1\ncrafty 1\ncraghoppers 1\ncrags 1\ncram 1\ncrams 1\ncranberry 1\ncrank 1\ncrankcase 1\ncranks 1\ncranky 1\ncranny 1\ncrapazoid 1\ncrapshoot 1\ncratering 1\ncraved 1\ncraven 1\ncrazes 1\ncraziest 1\ncreak 1\ncreaking 1\ncreamier 1\ncreams 1\ncreamy 1\ncreasing 1\ncreationist 1\ncreativities 1\ncredence 1\ncredential 1\ncredit's 1\ncredo 1\ncredulity 1\ncreeds 1\ncreepers 1\ncreepiest 1\ncrematoriums 1\ncreme 1\ncrepe 1\ncrescendo 1\ncrescent 1\ncresting 1\ncrevasse 1\ncrevasses 1\ncrevices 1\ncrewman 1\ncrewmen 1\ncribbage 1\ncricket 1\ncriminologist 1\ncriminology 1\ncrimped 1\ncrimper 1\ncrimson 1\ncringed 1\ncripples 1\ncrisper 1\ncrispness 1\ncrisscross 1\ncrisscrossing 1\ncriterion 1\ncritiques 1\ncritter 1\ncroaking 1\ncronyism 1\ncrook 1\ncrookery 1\ncrooner 1\ncropping 1\ncrore 1\ncross-bay 1\ncross-century 1\ncross-discipline 1\ncross-examination 1\ncross-eyed 1\ncross-functionally 1\ncross-grain 1\ncross-industry 1\ncross-licensing 1\ncross-negotiations 1\ncross-over 1\ncross-pollinated 1\ncross-pollination 1\ncross-sea 1\ncross-sectional 1\ncross-shareholdings 1\ncross-state 1\ncrossbred 1\ncrosswalks 1\ncrouch 1\ncrouching 1\ncrowding 1\ncrowed 1\ncrowing 1\ncrowned 1\ncrowns 1\ncrucifixion 1\ncruisers 1\ncrumbles 1\ncrunched 1\ncrunchers 1\ncrunches 1\ncrusader 1\ncrusaders 1\ncrusades 1\ncrustaceans 1\ncrusts 1\ncrusty 1\ncrutch 1\ncruzado 1\ncryogenics 1\ncryptographers 1\ncryptomerioides 1\ncrystalline 1\ncrystallization 1\ncrystallize 1\ncrystallizing 1\ncub 1\ncubes 1\ncubicles 1\ncubist 1\ncubit 1\ncubs 1\ncuckold 1\ncuckoos 1\ncucumbers 1\ncudgel 1\ncuff 1\ncuisines 1\ncul 1\nculinary 1\nculminated 1\nculminates 1\nculpable 1\nculpas 1\ncultish 1\ncultists 1\ncultivates 1\ncumin' 1\ncunninghamia 1\ncuprimine 1\ncurate 1\ncurating 1\ncurbside 1\ncurd 1\ncuries 1\ncuring 1\ncurled 1\ncurls 1\ncurrencny 1\ncurricular 1\ncurriculums 
1\ncursory 1\ncurtailing 1\ncurtly 1\ncurtness 1\ncurved 1\ncurvy 1\ncussing 1\ncustodial 1\ncuter 1\ncutlery 1\ncutoff 1\ncutoffs 1\ncutouts 1\ncutters 1\ncutthroat 1\ncuttings 1\ncvs 1\ncyanide 1\ncyber 1\ncyber-novel 1\ncyberville 1\ncyclicals 1\ncymbal 1\ncynosure 1\nd'Alene 1\nd'Amiante 1\nd'Exploitation 1\nd'Silva 1\nd'affaires 1\nd'uh 1\nd*ck 1\ndabble 1\ndabbled 1\ndabbling 1\ndabke 1\ndabs 1\ndaddy 1\ndafted 1\ndailyem...@ebs.bbc.co.uk 1\ndaisy 1\ndalliances 1\ndamnation 1\ndamning 1\ndampening 1\ndampness 1\ndancewear 1\ndandies 1\ndandle 1\ndangedled 1\ndanjones@paulhastings.com 1\ndank 1\ndarin 1\ndarkest 1\ndarling0823 1\ndarts 1\ndashing 1\ndatabank 1\ndataset 1\ndateline 1\ndaub 1\ndauntless 1\ndauntlessly 1\ndawdling 1\ndawns 1\ndaydreams 1\ndaylighted 1\ndays. 1\ndaze 1\nde-Baathification 1\nde-emphasis 1\nde-emphasized 1\nde-escalate 1\nde-facto 1\nde-stocking 1\nde-stroyed 1\ndea 1\ndeacons 1\ndeactivates 1\ndeactivation 1\ndeadlier 1\ndeadwood 1\ndealship 1\ndears 1\ndeathbed 1\ndeathless 1\ndeathly 1\ndebacles 1\ndebenhams 1\ndebilitating 1\ndebugged 1\ndebunked 1\ndebunking 1\ndecapitate 1\ndecapitation 1\ndeceiving 1\ndecelerated 1\ndecelerating 1\ndecember 1\ndecentralizing 1\ndeception 1\ndecertified 1\ndecibel 1\ndecibels 1\ndecieve 1\ndecimate 1\ndecimeter 1\ndecipherous 1\ndecisons 1\ndecked 1\ndeckhands 1\ndeclaratory 1\ndeclasse 1\ndeclassifying 1\ndeco 1\ndecoded 1\ndecommissoned 1\ndecompose 1\ndecomposing 1\ndeconstructed 1\ndecontaminated 1\ndecontrol 1\ndecorator 1\ndecorators 1\ndecribed 1\ndecriminalization 1\nded69...@hotmail.com 1\ndedications 1\ndeem 1\ndeepwater 1\ndefame 1\ndefames 1\ndefanged 1\ndefaulters 1\ndefecation 1\ndefensiveness 1\ndeferment 1\ndefiance 1\ndeficency 1\ndeficitcutting 1\ndefies 1\ndefination 1\ndeflationary 1\ndeflators 1\ndeflecting 1\ndefoliate 1\ndeformation 1\ndeformed 1\ndefray 1\ndefrosted 1\ndeftly 1\ndegenerative 1\ndegrade 1\ndehumanization 1\ndehumanizing 1\ndehumidified 1\ndehy 
1\ndehydration 1\ndei 1\ndeified 1\ndeities 1\ndejected 1\ndelectably 1\ndeleterious 1\ndeletions 1\ndeliberative 1\ndelineate 1\ndelinquencies 1\ndelinquency 1\ndelinquents 1\ndelirious 1\ndeliriously 1\ndella 1\ndelousing 1\ndeltas 1\ndelude 1\ndeluded 1\ndeluding 1\ndeluge 1\ndelusive 1\ndeluxe 1\ndelver 1\ndelvery 1\ndelves 1\ndelving 1\ndemagoguery 1\ndemagogues 1\ndemagoguing 1\ndeman 1\ndemarcated 1\ndemarcating 1\ndemarcation 1\ndemeaned 1\ndemeanors 1\ndemilitarize 1\ndemo 1\ndemobilize 1\ndemobilizing 1\ndemocratize 1\ndemocratized 1\ndemocrats 1\ndemographically 1\ndemography 1\ndemonized 1\ndemonizing 1\ndemonologies 1\ndemonologist 1\ndemonstates 1\ndemonstrably 1\ndemonstrativeness 1\ndemonstrator 1\ndempsey 1\ndemurs 1\ndemutualize 1\ndenationalized 1\ndenigration 1\ndensities 1\ndenting 1\ndenuclearized 1\ndenude 1\ndeodorant 1\ndeoxyribonucleic 1\ndeparts 1\ndependable 1\ndependants 1\ndepletes 1\ndeplorably 1\ndeploring 1\ndeployable 1\ndeploying 1\ndeportment 1\ndeposing 1\ndepositing 1\ndepraved 1\ndepravity 1\ndeprecation 1\ndepreciable 1\ndepredation 1\ndepredations 1\ndeprogrammings 1\nderby 1\nderegulate 1\nderegulates 1\nderegulaton 1\nderig 1\nderisive 1\nderisively 1\nderivation 1\ndermatological 1\ndermatologists 1\nderogation 1\nderriere 1\ndervish 1\ndesalinating 1\ndescriptive 1\ndesecrated 1\ndesensitize 1\ndeserters 1\ndeserts 1\ndesignating 1\ndesignees 1\ndesirous 1\ndesktops 1\ndespairing 1\ndespairs 1\ndesserts 1\ndestabilize 1\ndestinies 1\ndestitution 1\ndestructed 1\ndestructing 1\ndestructiveness 1\ndetachment 1\ndetailsman 1\ndetainee 1\ndetainment 1\ndetects 1\ndetent 1\ndetente 1\ndeterminedly 1\ndeterrant 1\ndeters 1\ndetestable 1\ndetested 1\ndetests 1\ndethroned 1\ndetracts 1\ndetriments 1\ndeutsch 1\ndeux 1\ndev 1\ndevaluations 1\ndevastatingly 1\ndevelope 1\ndevelopiong 1\ndeveopment 1\ndeviant 1\ndeviants 1\ndeviated 1\ndeviations 1\ndevises 1\ndewatering 1\ndeworming 1\ndhabas 1\ndhirea 1\ndiGenova 1\ndiagnoses 
1\ndiagnostics 1\ndiagonally 1\ndiagramming 1\ndialague 1\ndialectic 1\ndialects 1\ndialed 1\ndials 1\ndiameters 1\ndiaphram 1\ndiarheya 1\ndiaspora 1\ndiathesis 1\ndiazepam 1\ndibenzofurans 1\ndichotomy 1\ndickered 1\ndickering 1\ndictations 1\ndictatorial 1\ndictatorships 1\ndiction 1\ndidnt 1\ndiehard 1\ndiehards 1\ndieing 1\ndiethylstilbestrol 1\ndieting 1\ndifferentiates 1\ndifferentiating 1\ndiffidence 1\ndiffraction 1\ndigested 1\ndignify 1\ndignitary 1\ndilapidated 1\ndiluting 1\ndiminishes 1\ndimmed 1\ndimsum 1\ndimwit 1\ndin 1\ndinasaurs 1\ndiner 1\ndings 1\ndingy 1\ndinkiest 1\ndiode 1\ndiodes 1\ndios 1\ndiphtheria 1\ndiplomas 1\ndiplomatically 1\ndipotassium 1\ndirectional 1\ndirectivity 1\ndirectmail 1\ndirectness 1\ndisable 1\ndisabling 1\ndisadvantageous 1\ndisaffection 1\ndisagreeable 1\ndisallusionment 1\ndisappoints 1\ndisapprove 1\ndisapproves 1\ndisarmed 1\ndisassemble 1\ndisassociate 1\ndisasterous 1\ndisbelieving 1\ndisbenefits 1\ndiscerner 1\ndiscernible 1\ndiscerning 1\ndischarged 1\ndischord 1\ndiscimination 1\ndisciplinarians 1\ndisciplining 1\ndisclaims 1\ndiscomfit 1\ndiscomfited 1\ndiscomfort 1\ndisconcerted 1\ndisconcerting 1\ndisconnecting 1\ndiscontinuance 1\ndiscordant 1\ndiscounter 1\ndiscourse 1\ndiscoursed 1\ndiscoverer 1\ndiscreet 1\ndiscrepencies 1\ndiscus 1\ndisdainful 1\ndisdaining 1\ndisdains 1\ndisembark 1\ndisembarked 1\ndisenfranchisement 1\ndisengage 1\ndisequilibrium 1\ndisfavor 1\ndisfigured 1\ndisfigurement 1\ndisgorgement 1\ndisgraced 1\ndisgracing 1\ndisguising 1\ndisgustingly 1\ndisheartening 1\ndisheveled 1\ndishing 1\ndishonestly 1\ndishonorable 1\ndishonors 1\ndishpans 1\ndisillusioned 1\ndisinfectants 1\ndisinfection 1\ndisinflation 1\ndisinflationary 1\ndisingenuously 1\ndisintegrate 1\ndisjointed 1\ndiskettes 1\ndisloyal 1\ndisloyalty 1\ndismantles 1\ndismaying 1\ndismember 1\ndismembered 1\ndisorganized 1\ndisorientating 1\ndisorienting 1\ndisparage 1\ndisparaged 1\ndispassionately 1\ndispatchers 
1\ndispatching 1\ndispensation 1\ndispensed 1\ndispensers 1\ndispensing 1\ndisperse 1\ndispirited 1\ndisplacement 1\ndisplacing 1\ndisposables 1\ndisposes 1\ndispositions 1\ndispossesed 1\ndisprove 1\ndisproved 1\ndisputable 1\ndisqualify 1\ndisquieting 1\ndisregards 1\ndisrespect 1\ndisrespects 1\ndissect 1\ndissecting 1\ndissection 1\ndissembled 1\ndisseminating 1\ndissension 1\ndissenters 1\ndissertation 1\ndissimulating 1\ndissipate 1\ndissipating 1\ndissipation 1\ndissociate 1\ndissociating 1\ndissolves 1\ndissonance 1\ndissonant 1\ndissonantly 1\ndist 1\ndistain 1\ndistanced 1\ndistasteful 1\ndistate 1\ndistended 1\ndistill 1\ndistillation 1\ndistillery 1\ndistinctiveness 1\ndistortion 1\ndistractions 1\ndistraught 1\ndistressingly 1\ndistributable 1\ndistributive 1\ndistributorship 1\ndistricting 1\ndisturbs 1\ndisunity 1\nditches 1\ndithering 1\nditto 1\ndiurnal 1\ndivas 1\ndiver 1\ndiverge 1\ndivergences 1\ndiverges 1\ndiverging 1\ndiversifed 1\ndiversifications 1\ndivesting 1\ndivests 1\ndividers 1\ndivorcee 1\ndivorcees 1\ndivorces 1\ndivs 1\ndivulges 1\ndizziness 1\ndoberman 1\ndocket 1\ndockings 1\ndoctorates 1\ndoctorine 1\ndoctorines 1\ndoctoring 1\ndoctrinal 1\ndocudramas 1\ndodge 1\ndodged 1\ndodging 1\ndoe 1\ndogboy 1\ndogfight 1\ndogging 1\ndogi 1\ndogmas 1\ndogmatist 1\ndogsledding 1\ndoi 1\ndoings 1\ndoled 1\ndollarize 1\ndolledup 1\ndollop 1\ndolt 1\ndoltish 1\ndom. 
1\ndomed 1\ndomes 1\ndomestica 1\ndomesticated 1\ndomestics 1\ndomino 1\ndominoes 1\ndon 1\ndon't 1\ndon'ts 1\ndonair 1\ndonned 1\ndonning 1\ndoodads 1\ndoomsayer 1\ndoomsday 1\ndoormen 1\ndopes 1\ndopey 1\ndorm 1\ndormitories 1\ndosages 1\ndosimeters 1\ndossier 1\ndossiers 1\ndotCom 1\ndoting 1\ndotting 1\ndouble-fold 1\ndoubleA 1\ndoublespeak 1\ndoublethink 1\ndoubltlessly 1\ndoubters 1\ndouchebag 1\ndoughnut 1\ndoughty 1\ndour 1\ndoused 1\ndousing 1\ndovetails 1\ndovish 1\ndowndraft 1\ndownfalls 1\ndownhearted 1\ndownlinked 1\ndownpayments 1\ndownsides 1\ndownsized 1\ndownwards 1\ndownwind 1\ndoxepin 1\ndoyens 1\ndpi 1\ndrabbest 1\ndraconian 1\ndrafters 1\ndrafts 1\ndraftsmen 1\ndragger 1\ndragnet 1\ndramatizing 1\ndrawers 1\ndrawl 1\ndreadful 1\ndreadfully 1\ndreads 1\ndreamt 1\ndreamy 1\ndreary 1\ndredge 1\ndredged 1\ndren 1\ndrenching 1\ndressage 1\ndresser 1\ndressers 1\ndressup 1\ndribble 1\ndribbled 1\ndrifter 1\ndriftnet 1\ndriftwood 1\ndrip 1\ndrips 1\ndrone 1\ndrones 1\ndrooling 1\ndroopy 1\ndropoff 1\ndroppable 1\ndroppings 1\ndross 1\ndroughts 1\ndrubbing 1\ndrugged 1\ndrugging 1\ndrugstores 1\ndruid 1\ndrumbeating 1\ndrumroll 1\ndrunks 1\ndryness 1\ndrywall 1\ndsp 1\ndthat 1\ndualities 1\nduality 1\ndub 1\ndubia 1\ndubs 1\nduct 1\nducts 1\ndudgeon 1\nduds 1\nduels 1\nduet 1\nduffers 1\nduller 1\ndullest 1\ndullish 1\ndullness 1\ndumbbells 1\ndumber 1\ndumbest 1\ndumbs 1\ndummies 1\ndumpster 1\ndumshit 1\ndune 1\ndungeon 1\ndungeons 1\ndunk 1\nduodenal 1\nduplications 1\nduress 1\ndusted 1\nduster 1\ndustup 1\ndutiful 1\ndvd 1\ndwarves 1\ndwindle 1\ndyes 1\ndynamite 1\ndynamo 1\ndysentery 1\ndysmorphia 1\ndystopia 1\ndzh...@yahoo.com.cn 1\ne'mail 1\ne- 1\ne-Khalq 1\ne-cycling 1\ne60006e@hotmail.com 1\neBuyer 1\neMachines 1\neagles 1\neaiser 1\nealier 1\neared 1\nearl 1\nearmarking 1\nearnigs 1\nearphones 1\nearpiece 1\nearring 1\nearthbound 1\nearthlings 1\nearthshaking 1\neasel 1\neasements 1\neastern-most 1\neastgardens 1\neater 1\neatin 1\neaves 
1\neavesdrop 1\neavesdropped 1\nebbed 1\nebbs 1\neccentricities 1\nechangisme 1\nechangistes 1\nechinoderms 1\nechoes 1\neclairs 1\neclipsed 1\neclipsing 1\neco-systems 1\necologically 1\necologists 1\necono-boxes 1\neconobox 1\neconometric 1\neconomize 1\neconomized 1\necotourism 1\nectoplasmic 1\ned 1\nedifices 1\neditorially 1\neditorials 1\nedits 1\nedmonton 1\neducationalists 1\neducationally 1\neducator 1\neduction 1\nee 1\neek 1\neerily 1\neeriness 1\neffecient 1\neffecting 1\neffete 1\nefficacious 1\nefficacy 1\neffortless 1\neffrontery 1\neg. 1\negalitarian 1\negocentric 1\negotist 1\negress 1\negret 1\neikh 1\neis 1\nejaculation 1\neject 1\nejecting 1\nejection 1\neked 1\nelaborated 1\nelaborates 1\nelaborating 1\nelaboration 1\nelan 1\nelapsed 1\nelastics 1\nelbowing 1\nelecting 1\nelectricals 1\nelectricians 1\nelectrification 1\nelectrocardiogram 1\nelectrocute 1\nelectrogalvanizing 1\nelectroluminescence 1\nelectrolyte 1\nelectromagnets 1\nelectron 1\nelectronegative 1\nelectroplater 1\nelectroreality 1\nelemental 1\nelevates 1\neligiblity 1\nelk 1\nelm 1\nelongate 1\nelswehere 1\neluding 1\nelvis 1\nemaciated 1\nemanating 1\nemasculate 1\nemasculation 1\nembalming 1\nembankment 1\nembargoed 1\nembargoes 1\nembargos 1\nembarking 1\nembarks 1\nembarrassingly 1\nembarrassments 1\nembattled 1\nembed 1\nembellish 1\nembellishments 1\nembittered 1\nemblazon 1\nemblazoned 1\nemblematic 1\nembodying 1\nemboldened 1\nembossed 1\nembroidered 1\nembryos 1\nemigrant 1\nemigres 1\neminently 1\nemirate 1\nemmigrate 1\nemnbarrass 1\nemote 1\nemoted 1\nempathetic 1\nempathized 1\nemperors 1\nemphatic 1\nemphaticize 1\nemphemeral 1\nempires 1\nempirical 1\nemplacement 1\nemployable 1\nemployerpaid 1\nempowerment 1\nempowers 1\nempoy 1\nemptying 1\nemulated 1\nemulating 1\nemulative 1\nenablement 1\nenabler 1\nenacts 1\nenamored 1\nencapsulate 1\nencasing 1\nenchanting 1\nenchantment 1\nencircle 1\nencircled 1\nencirclement 1\nencloses 1\nenclosing 1\nencode 1\nencore 
1\nencouragingly 1\nencrusted 1\nencrypted 1\nencrypting 1\nencumbered 1\nencyclopedic 1\nendearing 1\nendeavoring 1\nendeavour 1\nendemic 1\nendgame 1\nendings 1\nendowing 1\nendrocrine 1\nendued 1\nendureance 1\neneedle 1\nenen 1\nenergetically 1\nenflamed 1\nenforces 1\nenfurates 1\nengender 1\nengendered 1\nenginner 1\nenglish 1\nengravers 1\nengraving 1\nengravings 1\nengrossing 1\nenhancementsmaintenance 1\nenigma 1\nenigmatic 1\nenka 1\nenliven 1\nenlivening 1\nennumerated 1\nenormity 1\nenquiry 1\nenrollees 1\nenrollments 1\nenrolment 1\nenron 1\nenroute 1\nensconced 1\nenslaved 1\nensnared 1\nensue 1\nensued 1\nentailed 1\nentanglement 1\nenteroviruses 1\nentertains 1\nentery 1\nenthusiasms 1\nenticed 1\nentices 1\nenticingly 1\nentie 1\nentitlements 1\nentranced 1\nentrances 1\nentranceway 1\nentre 1\nentreaties 1\nentreaty 1\nentrenchment 1\nentrusting 1\nenumeration 1\nenvelop 1\nenveloped 1\nenveloping 1\nenvelops 1\nenviably 1\nenvied 1\nenvious 1\nenvisioning 1\nenvlope 1\neons 1\nepidemiologist 1\nepileptic 1\nepileptics 1\nepiphany 1\nepiphytic 1\nepithelial 1\nepithet 1\nepitomize 1\nepitomized 1\nepoch 1\nepochal 1\nepoxy 1\nepsiode 1\nequalize 1\nequanimity 1\nequated 1\nequates 1\nequating 1\nequilibriums 1\nequiment 1\nequinox 1\nequipments 1\nequivalence 1\nequivocal 1\nequivocations 1\neradicated 1\nerasable 1\nerase 1\neraser 1\nerembal 1\nerf 1\nergo 1\nergonomically 1\neric 1\nerm 1\nerodes 1\nerosive 1\nerotica 1\nerotomaniac 1\nerotomaniacs 1\nerr 1\nerradicate 1\nerrand 1\nerrands 1\nerrata 1\nerroneously 1\nerrs 1\nerruption 1\nerstwhile 1\nerudition 1\nerythropoietin 1\nescalated 1\nescapade 1\nescapist 1\neschewing 1\nescorts 1\nescrowed 1\nesophageal 1\nesp. 1\nespousal 1\nespouse 1\nespoused 1\nespouses 1\nespresso 1\nessayist 1\nestablshed 1\nestiamtes 1\nestimation 1\nestimators 1\nestrogen 1\nestuarian 1\nestuary 1\net. 
1\nethernet 1\nethnographer 1\nethnographic 1\nethnological 1\nethology 1\netter 1\neulogized 1\neulogizing 1\neulogy 1\neuro 1\neuropean 1\nevacuees 1\nevaded 1\nevades 1\neval 1\nevangelists 1\nevaporates 1\nevaporating 1\nevaporation 1\nevasions 1\nevasive 1\neve. 1\nevend 1\nevens 1\neverbody 1\nevicted 1\nevictees 1\nevidences 1\nevidentary 1\nevidentiary 1\nevildoers 1\nevinced 1\neviscerating 1\nevolves 1\nex-FBI 1\nex-Marine 1\nex-Saudi 1\nex-accountant 1\nex-astronaut 1\nex-chief 1\nex-cons 1\nex-employees 1\nex-employer 1\nex-investment 1\nex-minister 1\nex-player 1\nex-president 1\nexacerbating 1\nexaggerations 1\nexaggerator 1\nexaltedness 1\nexaminees 1\nexasperated 1\nexcavating 1\nexcavation 1\nexcavator 1\nexcelled 1\nexcelnt 1\nexceptionalism 1\nexcessiveness 1\nexchangeability 1\nexchangers 1\nexchequer 1\nexcise 1\nexcite 1\nexcites 1\nexclaim 1\nexclaimed 1\nexclamation 1\nexclusionary 1\nexcorciate 1\nexcoriated 1\nexcrutiatingly 1\nexcursus 1\nexcutives 1\nexecutioners 1\nexecutors 1\nexelent 1\nexemplar 1\nexemplary 1\nexemplified 1\nexemplifies 1\nexempts 1\nexercycles 1\nexerpts 1\nexhaled 1\nexhibiting 1\nexhibitors 1\nexhorbitant 1\nexhortatory 1\nexistent 1\nexistential 1\nexistentialist 1\nexited 1\nexiting 1\nexmearden 1\nexonerating 1\nexorcise 1\nexorcisms 1\nexorcist 1\nexpander 1\nexpanse 1\nexpansionists 1\nexpat 1\nexpedients 1\nexpeditionary 1\nexpeditiously 1\nexpendable 1\nexpensively 1\nexperimentally 1\nexperimenter 1\nexpertly 1\nexpirations 1\nexpletive 1\nexploiter 1\nexplusion 1\nexpo 1\nexponent 1\nexponentially 1\nexponents 1\nexportation 1\nexposes 1\nexpounding 1\nexpressionless 1\nexpressive 1\nexpropriate 1\nexpulse 1\nexpulsions 1\nexpunge 1\nexquisitely 1\next 1\nexteme 1\nextensiveness 1\nextents 1\nexteriors 1\nexterminating 1\nextermination 1\nexterminator 1\nextinguish 1\nextol 1\nextolled 1\nextolling 1\nextorted 1\nextra-nasty 1\nextractions 1\nextracts 1\nextraditing 1\nextraditions 1\nextramarital 
1\nextramural 1\nextrapolated 1\nextrapolates 1\nextrapolating 1\nextras 1\nextraterrestrial 1\nextravagantly 1\nextrcurricular 1\nextrememly 1\nextruded 1\nextrusions 1\nexuberance 1\nexuberant 1\nexuded 1\nexuding 1\neyeball 1\neyeballing 1\neyeballs 1\neyedropper 1\neyeglasses 1\neyelet 1\neyelets 1\neyeshadow 1\neのsnail 1\nf*ck 1\nf*cking 1\nf.o.b 1\nfabic 1\nfable 1\nfabolous 1\nfabricators 1\nfabs 1\nfabulously 1\nfacades 1\nfacelift 1\nfacelifts 1\nfacet 1\nfaceted 1\nfacilitated 1\nfacilitations 1\nfacilites 1\nfacings 1\nfacsimiles 1\nfactionalism 1\nfacula 1\nfaculties 1\nfaired 1\nfairway 1\nfairytales 1\nfajitas 1\nfaking 1\nfallacious 1\nfallibility 1\nfallible 1\nfalloff 1\nfallow 1\nfalseness 1\nfalsified 1\nfalsifying 1\nfalter 1\nfalutin' 1\nfamines 1\nfancier 1\nfancies 1\nfancy'shvartzer 1\nfanning 1\nfanny 1\nfantasist 1\nfantasizing 1\nfaraway 1\nfarfetched 1\nfarmlands 1\nfarmsteads 1\nfarmwives 1\nfarriers 1\nfarsightedness 1\nfarting 1\nfascisti 1\nfascists 1\nfastballs 1\nfastened 1\nfastidious 1\nfataalsahwah@hotmail.com 1\nfateful 1\nfates 1\nfathered 1\nfatherhood 1\nfathom 1\nfatigued 1\nfatit 1\nfattening 1\nfatter 1\nfatuous 1\nfatuously 1\nfatwas 1\nfaucet 1\nfaught 1\nfaultless 1\nfaultlines 1\nfaut 1\nfavorability 1\nfavourable 1\nfavoured 1\nfavourite 1\nfavours 1\nfawning 1\nfaxes 1\nfearlessly 1\nfearsomely 1\nfeasable 1\nfeasted 1\nfeasting 1\nfeatherless 1\nfeathery 1\nfeatureless 1\nfeb 1\nfedayeen 1\nfederalism 1\nfederalists 1\nfederated 1\nfeedstock 1\nfeelers 1\nfeigning 1\nfeline 1\nfelines 1\nfellas 1\nfelling 1\nfellowship 1\nfemurs 1\nfender 1\nfending 1\nferal 1\nfermented 1\nfermenter 1\nfermenters 1\nfermenting 1\nferocity 1\nferret 1\nferreting 1\nferrets 1\nferries 1\nferris 1\nferroconcrete 1\nferrule 1\nferrying 1\nfertilize 1\nfertilizing 1\nfervente 1\nfervid 1\nfess 1\nfest 1\nfester 1\nfestering 1\nfestivity 1\nfestooning 1\nfetches 1\nfetching 1\nfetish 1\nfetishism 1\nfeudalism 1\nfeverishly 1\nfez 1\nffk 
1\nfiance 1\nfiancee 1\nfickleness 1\nfictions 1\nficus 1\nfiddler 1\nfiddling 1\nfidelus 1\nfidgeting 1\nfiefdoms 1\nfiends 1\nfiercest 1\nfifteenfold 1\nfifteenth 1\nfifthly 1\nfifths 1\nfig 1\nfigment 1\nfigur 1\nfigurative 1\nfiji 1\nfilberts 1\nfilched 1\nfilename 1\nfilially 1\nfilipinos 1\nfilly 1\nfilmic 1\nfilthiest 1\nfilthiness 1\nfiltration 1\nfinagled 1\nfinagling 1\nfinale 1\nfinalists 1\nfinanceer 1\nfinanicial 1\nfinches 1\nfinder 1\nfinders 1\nfinessed 1\nfinessing 1\nfingered 1\nfingerlings 1\nfinishe 1\nfinishers 1\nfino 1\nfireballs 1\nfirecrackers 1\nfirefox 1\nfirehoops 1\nfiremen 1\nfireplaces 1\nfireproofing 1\nfirewall 1\nfirewalls 1\nfirewater 1\nfirewire 1\nfirings 1\nfirming 1\nfirmness 1\nfirsts 1\nfishbowl 1\nfisherman 1\nfishes 1\nfishin' 1\nfisrt 1\nfissure 1\nfissures 1\nfisted 1\nfistful 1\nfittest 1\nfivefold 1\nfivepin 1\nfiver 1\nfives 1\nfiveyear 1\nfiving 1\nfixable 1\nfizzes 1\nfk 1\nfking 1\nflabbergasted 1\nflabbiness 1\nflack 1\nflagpole 1\nflagrant 1\nflagrante 1\nflail 1\nflaky 1\nflam 1\nflamabable 1\nflamed 1\nflank 1\nflanked 1\nflanker 1\nflanking 1\nflannel 1\nflare 1\nflashA.com 1\nflashC.com 1\nflashD.com 1\nflashback 1\nflashbacks 1\nflashier 1\nflashiers 1\nflashily 1\nflatfish 1\nflatness 1\nflatout 1\nflatter 1\nflattered 1\nflattering 1\nflattery 1\nflatty 1\nflaunting 1\nflaunts 1\nflautist 1\nflavored 1\nflavorful 1\nflay 1\nfleas 1\nfledglings 1\nfleece 1\nfleeced 1\nflees 1\nfleshpots 1\nfleshy 1\nflexing 1\nflfings 1\nflickered 1\nflickering 1\nflickers 1\nflied 1\nflim 1\nflinch 1\nfling 1\nflinging 1\nflings 1\nflinty 1\nflirt 1\nflirtation 1\nflirtatious 1\nflirtatiously 1\nflirted 1\nflirty 1\nflocks 1\nflog 1\nfloodgates 1\nfloodplain 1\nfloorspace 1\nflotation 1\nflotations 1\nflotilla 1\nflounder 1\nfloundered 1\nfloundering 1\nflouting 1\nfluctuates 1\nfluctuating 1\nfluency 1\nfluff 1\nflunked 1\nflunkies 1\nflunking 1\nflunky 1\nfluorescent 1\nfluoride 1\nflurocarbon 1\nflushed 1\nflushing 
1\nflustered 1\nflutes 1\nfluting 1\nflux 1\nflyer 1\nflys 1\nflytrap 1\nflywheel 1\nfocussing 1\nfogey 1\nfoggy 1\nfogs 1\nfoie 1\nfoisted 1\nfoliage 1\nfolkish 1\nfolksy 1\nfolktale 1\nfollowership 1\nfoment 1\nfomenting 1\nfondest 1\nfondles 1\nfondly 1\nfont 1\nfonts 1\nfoodservice 1\nfook 1\nfooled 1\nfoolhardiness 1\nfoolishnesses 1\nfooter 1\nfoothills 1\nfootnotes 1\nfootprint 1\nfootprints 1\nforage 1\nforaging 1\nforay 1\nforbearance 1\nforcefulness 1\nforcode 1\nford 1\nforebears 1\nforeclosure 1\nforefather 1\nforefathers 1\nforefronts 1\nforegone 1\nforehand 1\nforelock 1\nforelocks 1\nforerunners 1\nforeshadowed 1\nforeshadows 1\nforeshocks 1\nforesight 1\nforested 1\nforesters 1\nforfeitable 1\nforger 1\nforgeries 1\nforgetful 1\nforgettable 1\nforgings 1\nforgives 1\nforida 1\nforklift 1\nforklifts 1\nformaldehyde 1\nformalism 1\nformalize 1\nformosana 1\nformosensis 1\nformulaic 1\nformulates 1\nformulations 1\nforsaken 1\nfort 1\nforthright 1\nforties 1\nfortifying 1\nfortnight 1\nfortnightly 1\nfortresses 1\nforts 1\nfortuitous 1\nfortuitously 1\nfortunately 1\nfot 1\nfouling 1\nfoundered 1\nfoung 1\nfount 1\nfourfold 1\nfourthquarter 1\nfracas 1\nfractions 1\nfractious 1\nfractures 1\nfragrances 1\nfragrantization 1\nfrailties 1\nframitz 1\nfranca 1\nfrance 1\nfranchisor 1\nfrankincense 1\nfranking 1\nfraternal 1\nfraternities 1\nfraudulence 1\nfrayed 1\nfre 1\nfreakin 1\nfreakishly 1\nfreaky 1\nfreefall 1\nfreeholders 1\nfreelancer 1\nfreeman 1\nfreemarket 1\nfreespender 1\nfreewheeling 1\nfreezers 1\nfreezes 1\nfreq 1\nfrequents 1\nfresco 1\nfreshener 1\nfresher 1\nfreshwater 1\nfretboard 1\nfrets 1\nfriar 1\nfrictions 1\nfriday 1\nfriers 1\nfrigate 1\nfriggin' 1\nfright 1\nfrighteningly 1\nfripperies 1\nfrisk 1\nfrisky 1\nfrites 1\nfrittering 1\nfrivolously 1\nfrivolousness 1\nfrocked 1\nfrocking 1\nfrocks 1\nfrogmen 1\nfrolicked 1\nfrontal 1\nfrontend 1\nfrontiers 1\nfrosting 1\nfrothing 1\nfrothy 1\nfrown 1\nfrowned 1\nfruitbowl 
1\nfruitification 1\nfrumpy 1\nfrustrate 1\nfucken 1\nfucker 1\nfuckingly 1\nfucks 1\nfudging 1\nfullblown 1\nfuller 1\nfullling 1\nfullscale 1\nfulltime 1\nfulminations 1\nfulsome 1\nfumarase 1\nfume 1\nfumpered 1\nfunctionalities 1\nfundamantal 1\nfundamantalist 1\nfundamentalism 1\nfundraisings 1\nfungal 1\nfungi 1\nfunk 1\nfunnier 1\nfunnies 1\nfunniest 1\nfurloughed 1\nfurloughs 1\nfurore 1\nfurrier 1\nfurrows 1\nfurthered 1\nfurthest 1\nfurtive 1\nfuselage 1\nfusillade 1\nfusing 1\nfusses 1\nfussy 1\nfutility 1\nfuzzier 1\nfuzzily 1\ng'head 1\ngaddy 1\ngadget 1\ngaffes 1\ngag 1\ngaily 1\ngainst 1\ngalaxy 1\ngallant 1\ngalley 1\ngalling 1\ngallstone 1\ngallstones 1\ngalvanometers 1\ngam 1\ngamblers 1\ngametocide 1\ngamey 1\ngamma 1\ngandalf 1\ngangbusters 1\ngangland 1\nganglion 1\ngant 1\ngaped 1\ngapping 1\ngardeners 1\ngardenettes 1\ngardneri 1\ngargle 1\ngarret 1\ngarter 1\ngash 1\ngasps 1\ngassed 1\ngassing 1\ngastro-intestinal 1\ngastroenterologist 1\ngateau 1\ngateway 1\ngateways 1\ngatherers 1\ngatorade 1\ngaudy 1\ngauntlet 1\ngauntlets 1\ngazed 1\ngazette 1\ngazetteers 1\ngearboxes 1\ngeckos 1\ngee 1\ngeebies 1\ngeek 1\ngeeksquad 1\ngelatin 1\ngelato 1\ngelatos 1\ngemsbok 1\ngemstone 1\ngendarme 1\ngendarmes 1\ngenealogical 1\ngenealogy 1\ngeneralisations 1\ngeneralist 1\ngeneralists 1\ngeneralized 1\ngeneralpurpose 1\ngenerically 1\ngeneticist 1\ngenetics 1\ngenial 1\ngenital 1\ngenius-like 1\ngeniuses 1\ngenocidal 1\ngenocides 1\ngensets 1\ngentility 1\ngentleladies 1\ngentlelady 1\ngentleness 1\ngentrification 1\ngentrified 1\ngentrifying 1\ngents 1\ngenus 1\ngeo 1\ngeochemistry 1\ngeode 1\ngeometrical 1\ngeopolitical 1\ngeopolitics 1\ngeorge 1\ngeosciences 1\ngeothermal 1\ngerminate 1\ngerms 1\ngestational 1\ngestured 1\ngesturing 1\ngetaway 1\ngetinternaltype 1\ngetter 1\ngetters 1\ngettting 1\ngeysers 1\nghais2001@hotmail.com 1\nghastly 1\nghee 1\nghettos 1\nghostly 1\ngibberish 1\ngigatons 1\ngiggle 1\ngigs 1\ngigue 1\ngilded 1\ngilding 1\ngilt 
1\ngimmick 1\nginning 1\nginny 1\ngiraffe 1\ngirded 1\ngirder 1\ngirlie 1\ngirth 1\ngivebacks 1\ngivens 1\ngiveth 1\ngizmo 1\ngizmos 1\nglacier 1\nglade 1\nglam 1\nglamorize 1\nglamourized 1\nglances 1\nglancing 1\ngland 1\nglares 1\nglasswork 1\nglaze 1\nglean 1\ngleaned 1\nglen 1\nglib 1\nglide 1\ngliding 1\nglimpses 1\nglitterati 1\nglittering 1\nglitters 1\nglittery 1\ngloat 1\ngloated 1\ngloats 1\nglobalised 1\nglobalism 1\nglobalist 1\nglobalized 1\nglobes 1\nglobulin 1\ngloved 1\nglowsticks 1\nglucosamine 1\nglues 1\ngluts 1\ngluttonous 1\ngnawing 1\ngnome 1\ngoalball 1\ngoalposts 1\ngoaltender 1\ngoaltenders 1\ngoatee 1\ngobble 1\ngobbledygook 1\ngoddam 1\ngodmother 1\ngoe's 1\ngoers 1\ngoeth 1\ngoldbanded 1\ngoldfish 1\ngolds 1\ngong 1\ngongs 1\ngoo 1\ngooder 1\ngooders 1\ngoodly 1\ngoofs 1\ngooseberry 1\ngopher 1\ngore 1\ngorge 1\ngorse 1\ngossiping 1\ngouaille 1\ngourmands 1\ngourmets 1\ngout 1\ngouty 1\ngovernmemt 1\ngovernmentset 1\ngovernorates 1\ngovernorship 1\ngowns 1\ngp 1\ngr-r-r 1\ngraced 1\ngracilis 1\ngradations 1\ngradient 1\ngraffiti 1\ngrainy 1\ngrammarian 1\ngranary 1\ngrandchild 1\ngrandees 1\ngrander 1\ngrandeur 1\ngrandfathers 1\ngrandmasters 1\ngrandmotherly 1\ngrandpa 1\ngrandure 1\ngranola 1\ngranular 1\ngranulated 1\ngraphically 1\ngrappled 1\ngrapples 1\ngras 1\ngrasped 1\ngrasses 1\ngrasshoppers 1\ngrassy 1\ngrate 1\ngratuities 1\ngratuity 1\ngravesite 1\ngravest 1\ngravestone 1\ngravitational 1\ngrazing 1\ngreedier 1\ngreenback 1\ngreenery 1\ngreenfield 1\ngreengrocer 1\ngreengrocery 1\ngreenhorns 1\ngreenies 1\ngreenish 1\ngreens 1\ngremlins 1\ngridded 1\ngridiron 1\ngrieved 1\ngrieves 1\ngrievous 1\ngrievously 1\ngrillings 1\ngrills 1\ngrimace 1\ngrimaced 1\ngrimaces 1\ngrime 1\ngrimey 1\ngrimmest 1\ngrimness 1\ngrind 1\ngrinders 1\ngrindstone 1\ngringo 1\ngringos 1\ngrinning 1\ngrins 1\ngripe 1\ngripwork 1\ngrittier 1\ngritting 1\ngro 1\ngroan 1\ngrocerys 1\ngroin 1\ngrooves 1\ngroped 1\ngrossed 1\ngrosser 1\ngrotto 
1\ngrottoes 1\ngroundball 1\ngroundbreakers 1\ngroundless 1\ngroundlessly 1\ngroundup 1\ngrouping 1\ngroupings 1\ngrouses 1\ngrout 1\ngrouting 1\ngrove 1\ngrovels 1\ngrowingly 1\ngrowled 1\ngrowling 1\ngrowls 1\ngrowths 1\ngrubby 1\ngruff 1\ngrumbled 1\ngrumbling 1\ngrunts 1\ngstrathmann@mediaone.net 1\ngtownsend@manorisd.net 1\nguality 1\nguan: 1\nguarantor 1\nguardedly 1\ngudgeons 1\nguesses 1\nguesthouse 1\nguesthouses 1\nguideposts 1\nguild 1\nguile 1\nguilelessness 1\nguillotine 1\nguises 1\nguitars 1\ngull 1\ngulls 1\ngummed 1\ngunboats 1\ngunfight 1\ngung 1\ngungho 1\ngunmakers 1\ngunner 1\ngunning 1\ngunny 1\ngunpoint 1\ngunslinging 1\ngush 1\ngushed 1\ngushes 1\nguss 1\ngust 1\ngusty 1\ngutsy 1\ngutted 1\ngutter 1\ngutting 1\nguv'nor 1\nguxi 1\nguzzle 1\nguzzling 1\ngvt 1\ngwan 1\ngymnasiums 1\ngymnophobia 1\ngypsy 1\ngyrated 1\ngyro 1\nh=guys 1\nha 1\nha[[y 1\nhackneyed 1\nhacks 1\nhadith 1\nhaemoglobin 1\nhaemostatic 1\nhag 1\nhaggle 1\nhaggling 1\nhagiographies 1\nhahaha 1\nhahahaahh 1\nhahahahahahahahaha 1\nhailstorm 1\nhaircuts 1\nhairline 1\nhairspray 1\nhairyknuckled 1\nhalf-million 1\nhalfhearted 1\nhalfheartedly 1\nhallucinating 1\nhallucinatory 1\nhalogen 1\nhalogenated 1\nhaltingly 1\nhalts 1\nhalve 1\nhamlet 1\nhammerfist 1\nhammers 1\nhammie 1\nhamming 1\nhammocks 1\nhammy 1\nhampers 1\nhandbag 1\nhandbills 1\nhandbooks 1\nhandbrake 1\nhandcraft 1\nhandedness 1\nhander 1\nhandguns 1\nhandicrafts 1\nhandldk 1\nhandmade 1\nhandpicked 1\nhandstands 1\nhandyman 1\nhangover 1\nhank 1\nhankering 1\nhaphazard 1\nhapless 1\nhappenstance 1\nhaqve 1\nharassed 1\nharbingers 1\nharboring 1\nhardball 1\nhardbound 1\nhardcover 1\nharddisk 1\nhardens 1\nhardliner 1\nhardwood 1\nhardwork 1\nhare 1\nharmonic 1\nharmonics 1\nharmoniously 1\nharp 1\nharry 1\nhashing 1\nhatbox 1\nhater 1\nhaute 1\nhauteur 1\nhaw 1\nhawawy100@hotmail.com 1\nhawker 1\nhawkers 1\nhawkish 1\nhawks 1\nhaymakers 1\nhazel 1\nhazelnut 1\nheadcount 1\nheadhunting 1\nheadman 1\nheadmaster 
1\nheadsets 1\nheadspring 1\nheadstones 1\nheadwater 1\nhealed 1\nhealer 1\nhealthiness 1\nheaping 1\nhearers 1\nhearse 1\nheartbeat 1\nheartbeats 1\nheartbreak 1\nheartbreaking 1\nheartbroken 1\nheartless 1\nheartstopping 1\nheartwarmingly 1\nheartwood 1\nheaters 1\nheather 1\nheaves 1\nheaving 1\nheavyweights 1\nheebee 1\nheebie 1\nheeded 1\nheedful 1\nheeding 1\nheedlessness 1\nheft 1\nhegemonic 1\nheighborhoods 1\nheinous 1\nheirarchy 1\nhelluva 1\nhelmeted 1\nhelmsmen 1\nhelo 1\nhelos 1\nhelper 1\nhelpers 1\nhelpfully 1\nhem 1\nhemispheric 1\nhemlock 1\nhemmed 1\nhemolyzed 1\nhemorrhaged 1\nhendecasyllabic 1\nherald 1\nheralding 1\nhereabouts 1\nheresy 1\nheretical 1\nheretics 1\nheretofore 1\nhermit 1\nhernia 1\nherniated 1\nheroically 1\nheroics 1\nheroine 1\nheroism 1\nherpes 1\nherring 1\nhesitating 1\nhesitations 1\nhewed 1\nhewn 1\nhews 1\nhex 1\nhexagons 1\nhi-fi 1\nhick 1\nhidaways 1\nhiders 1\nhides 1\nhierarchical 1\nhigh-profit 1\nhigh-risk 1\nhigherups 1\nhighjacker 1\nhighlighting 1\nhighpriced 1\nhightailing 1\nhightops 1\nhiked 1\nhillary 1\nhillsides 1\nhinders 1\nhinterlands 1\nhinting 1\nhireing 1\nhireling 1\nhirelings 1\nhis/her 1\nhissy 1\nhistoricized 1\nhj 1\nhmmmm 1\nho11ho@hotmail.com 1\nhoarder 1\nhoarding 1\nhobos 1\nhock 1\nhodgepodge 1\nhoe 1\nhogtied 1\nhoister 1\nhoisting 1\nhoists 1\nholdovers 1\nholdups 1\nholler 1\nhollering 1\nholograms 1\nhomebuilding 1\nhomeequity 1\nhomegrown 1\nhomemakers 1\nhomeopathy 1\nhomeowner 1\nhomepage 1\nhomers 1\nhomerun 1\nhomey 1\nhomicides 1\nhomoerotic 1\nhomogenous 1\nhomologous 1\nhomosexuality 1\nhoning 1\nhonks 1\nhonorably 1\nhonorarium 1\nhonorariums 1\nhonour 1\nhonourable 1\nhonro 1\nhoodie 1\nhoodwinked 1\nhoodwinking 1\nhoof 1\nhooking 1\nhookup 1\nhookups 1\nhooliganism 1\nhoopla 1\nhoops 1\nhooptie 1\nhooves 1\nhopefuls 1\nhopelessness 1\nhoplessly 1\nhopped 1\nhopper 1\nhopscotched 1\nhopscotching 1\nhorde 1\nhornblende 1\nhornet 1\nhorrendous 1\nhorrendousness 1\nhorribles 
1\nhorribly 1\nhorridly 1\nhorseback 1\nhorseman 1\nhorsemen 1\nhorseshoes 1\nhorsetrack 1\nhorsetracks 1\nhorticulturist 1\nhosannas 1\nhospices 1\nhospitalizations 1\nhotcakes 1\nhotdogs 1\nhoteliers 1\nhotheads 1\nhotlines 1\nhottie 1\nhounded 1\nhousecall 1\nhousecleaning 1\nhouseful 1\nhousehld 1\nhouseman 1\nhouseplants 1\nhouston 1\nhover 1\nhowdy 1\nhowitzer 1\nhowitzers 1\nhowl 1\nhr 1\nhsa 1\nhsiang 1\nhsiao 1\nhsiu 1\nhsun 1\nhttp://3.bp.blogspot.com/-X_e2uwT6wPw/Tkj_7UVTw6I/AAAAAAAAAGs/e_hICAdYPYI/s1600/lotte_world_from_high_up.jpg 1\nhttp://aaa102.arabform.com/ 1\nhttp://alsaha.fares.net/sahat?128@91.sd1NcTEBAaa.32@.3ba99db5 1\nhttp://alsaha.fares.net/sahat?128@96.GdrOcqF3AIR.21@.3ba9a044 1\nhttp://alsaha2.fares.net/sahat?128@247.n9fpcUVKH9b.0@.3ba9b6f7 1\nhttp://blog.donews.com/pangshengdong 1\nhttp://blog.sina.com.cn/u/1265687602 1\nhttp://dave.typepad.com/dave/2004/06/janet_jacksons_.html 1\nhttp://digon_va.tripod.com/Chernobyl.htm 1\nhttp://en.wikipedia.org/wiki/Aerocom 1\nhttp://en.wikipedia.org/wiki/Degenerate_art 1\nhttp://en.wikipedia.org/wiki/John_Balance 1\nhttp://farm3.static.flickr.com/2406/2527255596_db23df940f.jpg 1\nhttp://herp-info.webs.com/beardeddragon.htm 1\nhttp://home.enron.com/employeemeeting 1\nhttp://i.imgur.com/S2MD2.jpg 1\nhttp://i.imgur.com/T2zff.jpg 1\nhttp://i.imgur.com/Xytex.jpg 1\nhttp://im.yahoo.com/ 1\nhttp://lingerie.selectedsex.com 1\nhttp://maaajd.arabform.com 1\nhttp://msi-team.com/awing 1\nhttp://news.bbc.co.uk/1/hi/help/4162471.stm 1\nhttp://news.bbc.co.uk/1/hi/programmes/panorama/live_forums/2124808.stm 1\nhttp://news.bbc.co.uk/1/hi/world/middle_east/4281450.stm 1\nhttp://news.bbc.co.uk/2/hi/programmes/this_world/4446342.stm 1\nhttp://news.bbc.co.uk/go/em/fr/-/1/hi/technology/4630694.stm 1\nhttp://news.yahoo.com/nestl-purina-releases-commercial-aimed-dogs-183443091.html 1\nhttp://news.zdnet.co.uk/internet/0,39020369,39187928,00.htm 1\nhttp://nigeria.usembassy.gov/scams.html 
1\nhttp://seattlepi.nwsource.com/national/apmideast_story.asp?category=1107&slug=Palestinians%20Abbas 1\nhttp://stargods.org/MasonicMichaelJackson.htm 1\nhttp://sultan5.arabform.com 1\nhttp://tong.visitkorea.or.kr/cms/resource/81/188181_image2_1.jpg 1\nhttp://travel.state.gov/travel/cis_pa_tw/cis/cis_1052.html 1\nhttp://v2.cache7.c.bigcache.googleapis.com/static.panoramio.com/photos/original/42661265.jpg?redirect_counter=2 1\nhttp://washington.hyatt.com/wasgh/index.html 1\nhttp://web.wenxuecity.com/BBSView.php?SubID=memory&MsgID=106201 1\nhttp://web.wenxuecity.com/BBSViewphp?SubID=currentevent&MsgID=159556 1\nhttp://whymsi.com/awing 1\nhttp://www-formal.stanford.edu/jmc/progress/chernobyl.html 1\nhttp://www.21stcenturysciencetech.com/articles/chernobyl.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050909152208_14big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050917024918_1big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050917034321_15big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050917034321_2big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050919032951_21big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050919032951_32big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050919032951_39big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050919032951_40big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050919032951_41big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050919032951_42big.html 1\nhttp://www.4gamer.net/news/image/2005.09/20050920111900_21big.html 1\nhttp://www.accenture.com 1\nhttp://www.adiccp.org/home/default.asp 1\nhttp://www.al-jazirah.com.sa/cars/06122006/rood2.htm 1\nhttp://www.al-jazirah.com.sa/cars/10012007/rood57.htm 1\nhttp://www.al-jazirah.com.sa/cars/13122006/rood43.htm 1\nhttp://www.al-jazirah.com.sa/cars/17012007/rood40.htm 1\nhttp://www.al-jazirah.com.sa/cars/20122006/rood43.htm 1\nhttp://www.al-jazirah.com.sa/cars/22112006/roods42.htm 
1\nhttp://www.al-jazirah.com.sa/cars/24012007/rood2.htm 1\nhttp://www.al-jazirah.com.sa/cars/29112006/rood55.htm 1\nhttp://www.al-majalla.com/ListNews.a...=1175&MenuID=8 1\nhttp://www.almokhtsar.com/html/news/1413/2/65370.php 1\nhttp://www.alwatan.com.sa/daily/2007-01-31/first_page/first_page01.htm 1\nhttp://www.alyaum.com/issue/page.php?IN=12271&P=4 1\nhttp://www.amazon.ca/exec/obidos/ASIN/0915765381/701-3377456-8181939 1\nhttp://www.analystnewspaper.com/ameu_scam_feb21.html 1\nhttp://www.antifraudcentre-centreantifraude.ca/english/home-eng.html 1\nhttp://www.arps.org.au/Chernobyl.htm 1\nhttp://www.bali.tpc.gov.tw/ 1\nhttp://www.bbc.co.uk/dailyemail/ 1\nhttp://www.bbc.co.uk/email 1\nhttp://www.bbc.co.uk/news 1\nhttp://www.beardeddragon.org/articles/caresheet/?page=1 1\nhttp://www.bigeye.com/111003.htm 1\nhttp://www.bullatomsci.org/issues/1993/s93/s93Marples.html 1\nhttp://www.bumrungrad.com/en/patient-services/clinics-and-centers/plastic-surgery-thailand-bangkok/breast-augmentation-ba 1\nhttp://www.calguard.ca.gov/ia/Chernobyl-15%20years.htm 1\nhttp://www.canadavisa.com/canadian-immigration-faq-skilled-workers.html 1\nhttp://www.chernobyl.info/en 1\nhttp://www.chernobyl.org.uk/page2.htm 1\nhttp://www.cic.gc.ca/english/contacts/index.asp 1\nhttp://www.cic.gc.ca/english/immigrate/index.asp 1\nhttp://www.cic.gc.ca/english/immigrate/skilled/assess/index.asp 1\nhttp://www.cic.gc.ca/english/index.asp 1\nhttp://www.collectinghistory.net/chernobyl/ 1\nhttp://www.consumerreports.org/health/healthy-living/beauty-personal-care/hair-loss-10-08/hair-loss.htm 1\nhttp://www.country-couples.co.uk/datingtips/basic-travel-allowance-bta-dating-scam/ 1\nhttp://www.csmonitor.com/2006/0509/p02s01-ussc.html?s=t5 1\nhttp://www.dailykos.com/story/2006/5/12/232746/857 
1\nhttp://www.debenhams.com/women/craghoppers#catalogId=10001&lid=//productsuniverse/en_GB/product_online0Y/categories%3C0productsuniverse_186610/brand_description0.000000E+000craghoppers0/categories%3C0productsuniverse_18661_186820&ps=default&sfn=Categories&sfv=Coats+%26amp0+jackets&storeId=10001 1\nhttp://www.disinfo.com/archive/pages/dossier/id334/pg1/ 1\nhttp://www.ebay.co.uk/itm/130589513308?var=430034792128&ssPageName=STRK:MEWAX:IT&_trksid=p3984.m1438.l2648#ht_1500wt_660 1\nhttp://www.ebay.co.uk/itm/250927098564?var=550057729382&ssPageName=STRK:MEWAX:IT&_trksid=p3984.m1438.l2649#ht_ 1\nhttp://www.ebay.co.uk/itm/250927098564?var=550057729382&ssPageName=STRK:MEWAX:IT&_trksid=p3984.m1438.l2649#ht_2079wt_893 1\nhttp://www.egotastic.com/entertainment/celebrities/janet-jackson/janet-jackson-is-too-fat-to-sing 1\nhttp://www.environmentalchemistry.com/yogi/hazmat/articles/chernobyl1.html 1\nhttp://www.equinecaninefeline.com/catalog/abode-large-metal-cage-liberta-free-delivery-p-6679.html 1\nhttp://www.equinecaninefeline.com/catalog/mamble-hamster-narrow-100cm-cage-p-12642.html 1\nhttp://www.equinecaninefeline.com/catalog/savic-freddy-cage-free-delivery-p-6750.html 1\nhttp://www.etechrecycling.com 1\nhttp://www.furk.net/newsadam.avi.html 1\nhttp://www.gasbuddy.com/gb_gastemperaturemap.aspx 1\nhttp://www.globalpolicy.org/intljustice/wanted/2005/1212ties.htm 1\nhttp://www.guardian.co.uk/obituaries/story/0,3604,1371372,00.html 1\nhttp://www.gulf-news.com/Articles/news.asp?ArticleID=97508 1\nhttp://www.ibiblio.org/expo/soviet.exhibit/chernobyl.html 1\nhttp://www.ibrae.ac.ru/IBRAE/eng/chernobyl/nat_rep/nat_repe.htm#24 1\nhttp://www.infoukes.com/history/chornobyl/elg/ 1\nhttp://www.infoukes.com/history/chornobyl/gregorovich/index.html 1\nhttp://www.islamtoday.net/pen/show_articles_content.cfm?id=64&catid=195&artid=8476 1\nhttp://www.justcages.co.uk/ferret-cages/ferplast-furet-plus-ferret-cage#v_431 1\nhttp://www.laweekly.com/general/features/satan-loves-you/13454/ 
1\nhttp://www.liveleak.com/view?i=c5daa5b733 1\nhttp://www.mofa.gov.sa/detail.asp?InNewsItemID=59090&InTemplateKey=print 1\nhttp://www.moltqaa.com 1\nhttp://www.msnbc.msn.com/id/11800917/ 1\nhttp://www.nea.fr/html/rp/chernobyl/c01.html 1\nhttp://www.nea.fr/html/rp/chernobyl/c05.html 1\nhttp://www.nea.fr/html/rp/chernobyl/conclusions5.html 1\nhttp://www.netpetshop.co.uk/p-19500-savic-chichi-2-chinchilla-rat-degu-ferret-cage.aspx 1\nhttp://www.newscientistspace.com/article.ns?id=dn8293&feedId=human-spaceflight_atom03 1\nhttp://www.newsday.com/news/opinion/ny-vpnasa054135614feb05,0,5979821.story?coll=ny-editorials-headlines 1\nhttp://www.nosprawltax.org/media/releases/2002Robbery.html 1\nhttp://www.nsrl.ttu.edu/chernobyl/wildlifepreserve.htm 1\nhttp://www.oneworld.org/index_oc/issue196/byckau.html 1\nhttp://www.ontario.ca/en/information_bundle/birthcertificates/119274.html 1\nhttp://www.ontario.ca/en/information_bundle/birthcertificates/119275.html 1\nhttp://www.pangshengdong.com 1\nhttp://www.pbase.com/jolka/image/53574802 1\nhttp://www.petsathome.com/shop/combi-1-dwarf-hamster-cage-by-ferplast-15986 1\nhttp://www.physics.isu.edu/radinf/chern.htm 1\nhttp://www.pukmedia.com/arabicnews/6-1/news33.htm 1\nhttp://www.qin.com.tw 1\nhttp://www.rawstory.com/news/2006/US_outsourcing_special_operations_intelligence_gathering_0413.html 1\nhttp://www.reuters.co.uk/newsPackageArticle.jhtml?type=worldNews&storyID=624569&section=news 1\nhttp://www.romancescam.com/forum/viewtopic.php?t=7231 1\nhttp://www.solutions.com/jump.jsp?itemID=1361&itemType=PRODUCT&path=1%2C3%2C477&iProductID=1361 1\nhttp://www.space.com/astronotes/astronotes.html 1\nhttp://www.space.com/missionlaunches/ft_050829_ksc_spacefuture.html 1\nhttp://www.sploid.com/news/2006/05/evil_priest_gui.php 1\nhttp://www.tamsui.gov.tw/ 1\nhttp://www.tecsoc.org/pubs/history/2002/apr26.htm 1\nhttp://www.thekcrachannel.com/news/4503872/detail.html 1\nhttp://www.thetruthseeker.co.uk/article.asp?id=4503 
1\nhttp://www.time.com/time/daily/chernobyl/860901.accident.html 1\nhttp://www.ukrainianweb.com/chernobyl_ukraine.htm 1\nhttp://www.un.org/ha/chernobyl/ 1\nhttp://www.usatoday.com/tech/science/space/2005-03-09-nasa-search_x.htm?csp=34 1\nhttp://www.vanityfair.com/commentary/content/articles/050411roco03c 1\nhttp://www.waynemadsenreport.com/ 1\nhttp://www.world-nuclear.org/info/chernobyl/inf07.htm 1\nhttp://www.wyndham.com/Washington_DC/default.cfm 1\nhttp://yooha.meepzorp.com/out-spot/index_files/out-spot.jpg 1\nhttp://youtube.com/watch?v=L8mJsgPj1iU 1\nhttp://youtube.com/watch?v=d46_ctqDmI4 1\nhttp://z08.zupload.com/download.php?...filepath=48993 1\nhttp://z12.zupload.com/download.php?file=getfile&filepath=6894 1\nhttp:www.aes.orgpublicationsstandardssearch.cfm 1\nhttp:www.fordvehicles.comdealerships 1\nhttp:www.u5lazarus.com 1\nhttps:weblion.psu.edutracweblionwikiMakeAsubfolderAnavigationRoot 1\nhu 1\nhuang 1\nhubristic 1\nhuckstering 1\nhues 1\nhugged 1\nhulking 1\nhullabaloo 1\nhulls 1\nhumaneness 1\nhumanism 1\nhumanist 1\nhumanized 1\nhumanly 1\nhumidity 1\nhumiliated 1\nhumility 1\nhummingbirds 1\nhumongous 1\nhumors 1\nhunched 1\nhundredweight 1\nhunker 1\nhunkered 1\nhunkering 1\nhunks 1\nhunted 1\nhunts 1\nhurl 1\nhurrah 1\nhurtling 1\nhushed 1\nhusks 1\nhusky 1\nhustings 1\nhustled 1\nhustlers 1\nhustles 1\nhustling 1\nhvae 1\nhybridization 1\nhydrated 1\nhydration 1\nhydrocele 1\nhydrocephalus 1\nhydrochloride 1\nhydrogenated 1\nhydrogenation 1\nhydrotherapy 1\nhygienic 1\nhymns 1\nhyper 1\nhyper-trader 1\nhypermarkets 1\nhypersensitive 1\nhypnotic 1\nhypocrite 1\nhypocrites 1\nhypocritical 1\nhypoglycemic 1\nhypothesized 1\nhypoxia 1\nhysteric 1\ni's 1\ni.e 1\niN2015 1\niResearch 1\niambic 1\nichi 1\nicky 1\nicmc 1\nidealists 1\nidealized 1\nideally 1\nideate 1\nident 1\nideologically 1\nideologist 1\nideologue 1\nidiocy 1\nidiosyncratic 1\nidolized 1\nidolizing 1\nidosyncratic 1\nidscs 1\niffy 1\nifthenelse 1\nig 1\nignition 1\nignitor 
1\nignominiously 1\nignoramus 1\nillegality 1\nilliquid 1\nilliquidity 1\nilliteracy 1\nilliterates 1\nillogic 1\nilluminate 1\nilluminated 1\nilluminates 1\nilluminating 1\nillusionary 1\nillusionist 1\nillusory 1\nimage001.jpg 1\nimage_gif_part 1\nimage_jpg_part 1\nimaginable 1\nimaginations 1\nimagines 1\nimaginings 1\nimam 1\nimbroglio 1\nimbue 1\nimbued 1\nimitations 1\nimmaculate 1\nimmaculately 1\nimmature 1\nimmaturity 1\nimmeasurable 1\nimmerse 1\nimmersive 1\nimmigeration 1\nimmobiliser 1\nimmobilising 1\nimmobilized 1\nimmodest 1\nimmolation 1\nimmortality 1\nimmortals 1\nimmunization 1\nimmunizing 1\nimmunologist 1\nimpacting 1\nimpair 1\nimparted 1\nimpassible 1\nimpassiveness 1\nimpatience 1\nimpeding 1\nimperatives 1\nimperfect 1\nimperialists 1\nimperil 1\nimperious 1\nimpermanence 1\nimpersonating 1\nimpersonation 1\nimpersonations 1\nimpertinently 1\nimpetuses 1\nimplace 1\nimplantation 1\nimplanting 1\nimplausible 1\nimplementating 1\nimplementer 1\nimplicating 1\nimplored 1\nimplores 1\nimploring 1\nimploringly 1\nimpolitely 1\nimporves 1\nimpossibile 1\nimposters 1\nimpotence 1\nimpotent 1\nimpregnable 1\nimpregnated 1\nimpregnating 1\nimpresses 1\nimpressing 1\nimprinting 1\nimprison 1\nimprisoning 1\nimprobability 1\nimprobable 1\nimpromptu 1\nimpropriety 1\nimproverished 1\nimprovisation 1\nimprovisational 1\nimprovisatory 1\nimproviser 1\nimprovization 1\nimprovizational 1\nimpugn 1\nimpulsive 1\nimpure 1\nimputed 1\ninaccuracy 1\ninaccurately 1\ninactivation 1\ninactivity 1\ninadvertence 1\ninadvertent 1\ninalienable 1\ninane 1\ninasmuch 1\ninattention 1\ninaudible 1\ninaugurate 1\ninborn 1\ninbound 1\ninca 1\nincalculable 1\nincandescents 1\nincapacitate 1\nincarcerate 1\nincarcerated 1\nincendiarially 1\nincendiary 1\nincentivize 1\nincessant 1\nincestuous 1\ninchworm 1\nincidentally 1\nincincerating 1\nincinerated 1\nincipient 1\nincised 1\nincision 1\nincisive 1\nincited 1\nincommensurability 1\nincompatibility 1\nincompetently 
1\nincompletely 1\nincomprehensible 1\nincongruities 1\nincongruity 1\ninconsistency 1\ninconsitencies 1\ninconstancy 1\nincontestable 1\nincorporeal 1\nincorrigible 1\nincorruptible 1\nincredibility 1\nincredulously 1\nincriminating 1\nincubator 1\nincumbency 1\nincurable 1\nincurring 1\nincurs 1\nincursions 1\nindecipherable 1\nindefensible 1\nindelibly 1\nindemnify 1\nindenture 1\nindescriminately 1\nindestructibility 1\nindeterminable 1\nindeterminate 1\nindia 1\nindian 1\nindica 1\nindices 1\nindicting 1\nindie 1\nindigestion 1\nindignant 1\nindirectness 1\nindiscriminately 1\nindispensability 1\nindissoluble 1\nindividualistic 1\nindividualized 1\nindoctrinated 1\nindolence 1\nindolent 1\ninducted 1\nindulgences 1\nindulges 1\nindustrialize 1\nindustrious 1\ninedible 1\nineffable 1\nineffably 1\nineffectiveness 1\nineffectual 1\nineligible 1\ninequitable 1\ninequity 1\ninert 1\ninescapably 1\ninestimable 1\ninevitability 1\ninexcusably 1\ninexhaustible 1\ninexpensively 1\ninexperience 1\ninexpressible 1\ninfamy 1\ninfantile 1\ninfecting 1\ninfectiously 1\ninfelicitous 1\ninfer 1\ninferences 1\ninferiors 1\ninferno 1\ninfestation 1\ninfighter 1\ninfiltrating 1\ninfiltrations 1\ninfiltrators 1\ninflatable 1\ninflection 1\ninflicting 1\ninfomation 1\ninformational 1\ninforms 1\ninfotainment 1\ninfractions 1\ninfrared 1\ninfrastructural 1\ninfrequent 1\ninfuriate 1\ninfuses 1\ning 1\ningeniously 1\ningestion 1\ningrates 1\ningress 1\ninhabitation 1\ninhabits 1\ninhaled 1\ninherits 1\ninhibitions 1\ninhibitors 1\ninhibits 1\ninhospitable 1\ninhumanely 1\ninimitable 1\niniquities 1\ninitialing 1\ninitiates 1\ninitiating 1\ninitiation 1\ninjecters 1\ninjectors 1\ninked 1\ninking 1\ninky 1\ninlay 1\ninn 1\ninnards 1\ninnate 1\ninnermost 1\ninnkeepers 1\ninnoculating 1\ninnuendo 1\ninnured 1\ninoffensive 1\ninoperative 1\ninordinate 1\ninpenetrable 1\ninquest 1\ninquire 1\ninquisitions 1\ninroads 1\ninrushing 1\ninsecticides 1\ninsertable 1\ninserting 1\ninsidious 
1\ninsightful 1\ninsignias 1\ninsignificance 1\ninsinuations 1\ninsinuendo 1\ninsistance 1\ninsolence 1\ninspirations 1\ninstaller 1\ninstalls 1\ninstantaneity 1\ninstantaneous 1\ninstigating 1\ninstigator 1\ninstill 1\ninstilled 1\ninstitutionally 1\ninstructional 1\ninstructive 1\ninstructs 1\ninsubstantial 1\ninsulator 1\ninsupportable 1\ninsurability 1\ninsurgencies 1\ninsurrection 1\ninteger 1\nintegers 1\nintel 1\nintensification 1\nintensifier 1\nintently 1\nintents 1\ninter-branch 1\ninter-company 1\ninter-county 1\ninter-enterprise 1\ninter-governorate 1\ninter-library 1\ninter-office 1\ninter-party 1\ninter-provincial 1\ninter-related 1\ninter-state 1\ninter29ing 1\ninteractively 1\ninteragency 1\nintercede 1\ninterceptions 1\nintercepts 1\nintercessors 1\ninterchangeable 1\nintercity 1\nintercompany 1\ninterconnected 1\ninterdisciplinary 1\ninterferences 1\nintergenerational 1\nintergovernmental 1\ninterjunction 1\ninterlaced 1\ninterloper 1\ninterloping 1\ninterlude 1\nintermarrying 1\ninterminable 1\nintermingling 1\nintermixed 1\ninternalize 1\ninternationalist 1\ninternationalists 1\ninternets 1\ninterns 1\ninterpellation 1\ninterprets 1\ninterprovincial 1\ninterracial 1\ninterrelated 1\ninterrogating 1\ninterrogators 1\ninterruptions 1\nintersected 1\nintertwining 1\ninterval 1\ninterveners 1\ninterventional 1\ninterventions 1\ninterwar 1\nintestinal 1\nintial 1\nintifadah 1\nintimation 1\nintimidate 1\nintimidations 1\nintitiative 1\nintonation 1\nintoned 1\nintones 1\nintoxicating 1\nintra-administration 1\nintra-day 1\nintracompany 1\nintrastate 1\nintrauterine 1\nintravenous 1\nintrepidly 1\nintrigueing 1\nintrigues 1\nintriguingly 1\nintroductory 1\nintrospection 1\nintrospective 1\nintroverted 1\nintrude 1\ninvalidating 1\ninvalidity 1\ninvaluable 1\ninvective 1\ninvencion 1\ninventers 1\ninventive 1\ninventors 1\ninvertebrates 1\ninverter 1\ninvestements 1\ninveterate 1\ninvidious 1\ninvigorates 1\ninvisibility 1\ninvitaion 1\ninvitationals 
1\ninvocation 1\ninvoiced 1\ninvoked 1\ninvokes 1\ninvovled 1\ninwards 1\niodized 1\nion 1\niota 1\nipso 1\nirene 1\nirish 1\nirk 1\nirksome 1\nironclad 1\nironed 1\nironfist 1\nironies 1\nirradiation 1\nirrationality 1\nirrationally 1\nirreconcilable 1\nirreconcilables 1\nirregularly 1\nirresistable 1\nirresponsibility 1\nirrevocably 1\niseesoisee 1\nislative 1\nism 1\nisolates 1\nisolating 1\nisosceles 1\nisp's 1\nitalian 1\nitching 1\nitchy 1\nitemize 1\nitemized 1\niterated 1\niteration 1\nitinerant 1\nitiveness 1\niv 1\niw 1\nizakaya 1\njab 1\njabber 1\njabs 1\njackass 1\njackhammers 1\njacko 1\njackpot 1\njacks 1\njacksons 1\njacobsOn 1\njaggies 1\njailer 1\njailhouse 1\njails 1\njalapeno 1\njaldi 1\njamboree 1\njanesmith@paulhastings.com 1\njanitor 1\njaponica 1\njargon 1\njassem1@hotmail.com 1\njaundiced 1\njaunt 1\njauntily 1\njaunts 1\njavelin 1\njaywalk 1\njazzy 1\njcbda 1\njeep 1\njeju 1\njejudo 1\njelled 1\njellies 1\njelly 1\njeng 1\njeopardized 1\njerked 1\njerking 1\njerks 1\njerky 1\njerseys 1\njest 1\njesting 1\njests 1\njetty 1\njewelery 1\njia 1\njiaozi 1\njiggling 1\njigs 1\njigsaw 1\njill 1\njillions 1\njilted 1\njim 1\njing 1\njingling 1\njingoistic 1\njingzhe19 1\njinqian 1\njock 1\njockey 1\njockeyed 1\njockeying 1\njogger 1\njogs 1\njohnstaikos@paulhastings.com 1\njonron 1\njossling 1\njostle 1\njour 1\njourneyed 1\njovial 1\njowl 1\njowls 1\njubilant 1\njudgmental 1\njudicially 1\njudiciously 1\njudoka 1\njuggle 1\njugs 1\njukebox 1\njumpiness 1\njunction 1\njunctions 1\njunctures 1\njuniors 1\njunket 1\njunkholders 1\njunky 1\njunkyard 1\njunt 1\njurist 1\njutting 1\njuveniles 1\njuxtapose 1\njuxtaposed 1\njuxtaposition 1\nk.m.n_84@hotmail.com 1\nkV 1\nkWh 1\nkaffee 1\nkale 1\nkalega 1\nkalija 1\nkan 1\nkang 1\nkangaroo 1\nkantakari 1\nkarat 1\nkarate 1\nkaviiad 1\nkayoed 1\nke 1\nkebabs 1\nkeeper 1\nkeg 1\nkelly 1\nkeng 1\nkent.shoemaker@ae.ge.com 1\nkeratoma 1\nkerchiefed 1\nkeychain 1\nkhaki 1\nkhy 1\nkickers 1\nkilamanjaro 
1\nkillfiles 1\nkilobit 1\nkilogram 1\nkilovolts 1\nkimberwick 1\nkin 1\nkindergartener 1\nkindling 1\nkinetic 1\nkinfolk 1\nkingmaker 1\nkingston 1\nkinked 1\nkinky 1\nkissers 1\nkissing 1\nkithchen 1\nkiting 1\nkitschy 1\nkitties 1\nkiwi 1\nkneading 1\nkneed 1\nknifings 1\nknits 1\nknitwear 1\nknockout 1\nknotting 1\nknotty 1\nknowledgable 1\nknowns 1\nknuckle 1\nkomal.ba 1\nkonishii 1\nkooorah 1\nkosa 1\nkosas 1\nkosher 1\nkrait 1\nkrater 1\nkrausen 1\nkrona 1\nkroner 1\nkryptonite 1\nkuai 1\nkursk 1\nl' 1\nl'Ouest 1\nl'oeil 1\nl988 1\nlabelled 1\nlable 1\nlabo 1\nlabors 1\nlabour 1\nlabourer 1\nlabourers 1\nlacerations 1\nlaces 1\nlackey 1\nlacquer 1\nlacs 1\nladdered 1\nladens 1\nladybug 1\nladybugs 1\nlagoons 1\nlai 1\nlain 1\nlair 1\nlambasted 1\nlambastes 1\nlambda 1\nlambskin 1\nlamenting 1\nlaminated 1\nlaminations 1\nlanders 1\nlandfall 1\nlandfilled 1\nlandlines 1\nlandlocked 1\nlandlords 1\nlandmine 1\nlandmines 1\nlandowner 1\nlandscaape 1\nlandscapers 1\nlanguish 1\nlanguishes 1\nlanguor 1\nlanguorous 1\nlantana 1\nlanzador 1\nlapful 1\nlarceny 1\nlard 1\nlargish 1\nlarries 1\nlasciviously 1\nlassitude 1\nlastest 1\nlastly 1\nlatched 1\nlatches 1\nlatching 1\nlatecomers 1\nlathe 1\nlatitudes 1\nlattice 1\nlaughingstock 1\nlaunder 1\nlaunderers 1\nlaundromats 1\nlaurels 1\nlavished 1\nlavishly 1\nlawfully 1\nlawlessness 1\nlawnmower 1\nlawyering 1\nlax 1\nlaxatives 1\nlayering 1\nlayman 1\nlayoff 1\nlayouts 1\nlazers 1\nlazily 1\nlaziness 1\nlb 1\nleaches 1\nleaching 1\nleadoff 1\nleaguers 1\nleakers 1\nlearner 1\nleasable 1\nleaseholds 1\nleatherbound 1\nlecherous 1\nlectured 1\nledger 1\nledgers 1\nleek 1\nleftfield 1\nleftism 1\nlegalized 1\nleggings 1\nlegible 1\nlegion 1\nlegislating 1\nlegitimized 1\nlegitimizing 1\nlegnth 1\nlegume 1\nlei 1\nleitmotif 1\nleli 1\nlemmings 1\nlemons 1\nlemurs 1\nlendable 1\nlennium 1\nleotards 1\nleporjj@selectenergy.com 1\nlesion 1\nlessers 1\nletdowns 1\nlevelled 1\nlevitation 1\nlewdness 1\nlhasa 1\nliaisons 
1\nlian 1\nlibeled 1\nliberalizations 1\nlibertarian 1\nlibertarians 1\nlibertins 1\nlibidinous 1\nlibrarians 1\nlice 1\nlicence 1\nlicentiousness 1\nlicks 1\nlifeblood 1\nlifeboat 1\nlifeguards 1\nlifeless 1\nlifelike 1\nlifes 1\nlifesize 1\nliftoff 1\nligament 1\nligaments 1\nlightheaded 1\nlighthearted 1\nlightheartedly 1\nlik 1\nlikeness 1\nlikens 1\nlikey 1\nlil 1\nlilly 1\nlilt 1\nlily 1\nlimb 1\nlimber 1\nlimbering 1\nlimberring 1\nlimo 1\nlimosine 1\nlimousines 1\nlimpid 1\nlimply 1\nlinden 1\nlineages 1\nlinebackers 1\nlineman 1\nlineups 1\nlingua 1\nlinguine 1\nlinings 1\nlinkup 1\nlinseed 1\nlint 1\nlinux 1\nlipoproteins 1\nliposuction 1\nlipped 1\nliquefies 1\nliquidized 1\nliquids 1\nliquified 1\nlira 1\nlisa 1\nlisewise 1\nlisp 1\nlistlessly 1\nlistlessness 1\nlitany 1\nliteralist 1\nliteraly 1\nliterarily 1\nliterate 1\nlitgation 1\nlithography 1\nlithos 1\nlithotripsy 1\nlitigant 1\nlitigated 1\nlitigations 1\nlitigator 1\nlitterbox 1\nlittering 1\nlitttle 1\nliturgical 1\nliturgy 1\nlivelier 1\nliveliest 1\nlivelihoods 1\nliveliness 1\nliven 1\nliveried 1\nlivid 1\nlizards 1\nlkeast 1\nll 1\nllamas 1\nlo9nger 1\nloam 1\nloathe 1\nloathes 1\nloathsome 1\nlobbed 1\nlobbies 1\nlobotomized 1\nlocalizations 1\nlocalize 1\nlocalizing 1\nlockstep 1\nlockup 1\nlocomotives 1\nlocust 1\nlocutions 1\nlodgings 1\nlofted 1\nloftiness 1\nlogarithm 1\nlogger 1\nloggerheads 1\nloggers 1\nloiter 1\nloitering 1\nlolo 1\nlondon 1\nloner 1\nloners 1\nlong-term 1\nlonghaul 1\nlongitudes 1\nlongitudinal 1\nlongitudinally 1\nlongterm 1\nlookup 1\nlookups 1\nloomed 1\nloonies 1\nloony 1\nloosing 1\nlopez 1\nloquacious 1\nlora.sullivan@enron.com 1\nlorazapam 1\nlord 1\nlorded 1\nlords 1\nlordship 1\nlore 1\nloudest 1\nloudspeakers 1\nlouis 1\nlouise 1\nlouse 1\nlovebirds 1\nlovin 1\nlowlife 1\nlowlifes 1\nltake 1\nlu 1\nlubricating 1\nlucked 1\nluckier 1\nludicrously 1\nlug 1\nlugging 1\nlugs 1\nlukewarmly 1\nlulled 1\nlumbering 1\nluminal 1\nluminous 1\nlumped 1\nlumpier 
1\nlumpy 1\nlunacy 1\nlunchboxes 1\nlunches 1\nlunchroom 1\nlunge 1\nlunged 1\nlurch 1\nlurching 1\nlurked 1\nlurkers 1\nlustful 1\nlv. 1\nlymph 1\nlynch 1\nlyric 1\nlyric-less 1\nlyricism 1\nm.o. 1\nm/h 1\nmac 1\nmacaws 1\nmacedonization 1\nmachinations 1\nmacro-control 1\nmacros 1\nmacs 1\nmadam 1\nmadly 1\nmadman 1\nmadrasas 1\nmaelstrom 1\nmaestro 1\nmafiosi 1\nmag 1\nmagazined 1\nmaggots 1\nmagick 1\nmagickal 1\nmagisterially 1\nmagnanimity 1\nmagnanimously 1\nmagnesium 1\nmagnetically 1\nmagnetism 1\nmagnetized 1\nmagnification 1\nmagnificence 1\nmagnolia 1\nmagpies 1\nmags 1\nmaharajahs 1\nmailbox 1\nmailmen 1\nmailto:mayur...@yahoo.com 1\nmailto:vijay.suchdev@funb.com 1\nmainstays 1\nmaintainence 1\nmaipulation 1\nmairei 1\nmaison 1\nmaize 1\nmajeure 1\nmajoring 1\nmajoritarian 1\nmakeups 1\nmalcontent 1\nmalefactors 1\nmalevolent 1\nmalfeasance 1\nmalfunctions 1\nmalnourishment 1\nmalt 1\nmama 1\nmambo 1\nmammalian 1\nmammoth 1\nmammoths 1\nmanacles 1\nmanatees 1\nmandarin 1\nmandolin 1\nmanfully 1\nmangement 1\nmangled 1\nmangrove 1\nmangroves 1\nmanhandled 1\nmanhood 1\nmani 1\nmanifested 1\nmanifesting 1\nmanifestly 1\nmanifestos 1\nmanifests 1\nmaninstays 1\nmanioc 1\nmanipulates 1\nmanipulations 1\nmanipulator 1\nmanly 1\nmannerism 1\nmanning 1\nmannofy@hotmail.com 1\nmanor 1\nmantle 1\nmantou 1\nmanuevering 1\nmanuscripts 1\nmapBounds 1\nmaple 1\nmapmakers 1\nmaquette 1\nmarbles 1\nmarcato 1\nmares 1\nmargarita 1\nmarginalia 1\nmarginalization 1\nmarginalizes 1\nmarginalizing 1\nmarinade 1\nmarinara 1\nmarinate 1\nmarkdowns 1\nmarker.iconImage.style.zoom 1\nmarketability 1\nmarketeers 1\nmarketized 1\nmarketmaking 1\nmarketplaces 1\nmarketwide 1\nmarlin 1\nmarquees 1\nmarshals 1\nmarshes 1\nmarshmallow 1\nmartini 1\nmartinis 1\nmarveled 1\nmarvellous 1\nmasala 1\nmascara 1\nmascots 1\nmasculine 1\nmaserdy@hotmail.com 1\nmash 1\nmashed 1\nmashups 1\nmasons 1\nmasquerade 1\nmassacring 1\nmassaging 1\nmasscot 1\nmassed 1\nmasseuses 1\nmast 
1\nmasterminding 1\nmasterpieces 1\nmasturbation 1\nmatchmaking 1\nmaterialist 1\nmaterialistic 1\nmaterialists 1\nmaternity 1\nmaters 1\nmathematicians 1\nmatriarch 1\nmatrilineal 1\nmatrimony 1\nmatron 1\nmatryoshka 1\nmats 1\nmatt 1\nmaudlin 1\nmaul 1\nmaureen 1\nmavens 1\nmaximization 1\nmaxims 1\nmayhem 1\nmayorship 1\nmayur...@yahoo.com 1\nmazes 1\nmba 1\nmcallister 1\nme$$age 1\nmea 1\nmeadows 1\nmealy 1\nmeanders 1\nmeaner 1\nmeaninglessness 1\nmeasley 1\nmeatball 1\nmeatier 1\nmecca 1\nmechanistic 1\nmechanization 1\nmeclofenamate 1\nmedallions 1\nmedically 1\nmeds 1\nmeerkat 1\nmega-commercial 1\nmega-crash 1\nmega-crashes 1\nmega-lawyer 1\nmega-problems 1\nmega-profit 1\nmega-projects 1\nmega-resorts 1\nmegabillion 1\nmegabytes 1\nmegadrop 1\nmegalomaniacal 1\nmegaphone 1\nmegaquestions 1\nmeh 1\nmehitabel 1\nmein 1\nmelanie.gray@weil.com 1\nmeld 1\nmelded 1\nmelds 1\nmellifluous 1\nmelodically 1\nmelodious 1\nmelodramatic 1\nmelts 1\nmemebers 1\nmemoranda 1\nmemorandums 1\nmemorialized 1\nmemorization 1\nmemorize 1\nmemorizing 1\nmend 1\nmendacity 1\nmenopause 1\nmens 1\nmenstruation 1\nmentors 1\nmerchandised 1\nmerchandiser 1\nmerchandisers 1\nmercifully 1\nmerciless 1\nmercurial 1\nmerd 1\nmerengues 1\nmeridians 1\nmeringue 1\nmeringues 1\nmeriting 1\nmerrily 1\nmerrymaking 1\nmers 1\nmesmerizing 1\nmesquite 1\nmessagers 1\nmessengers 1\nmessiah 1\nmessin' 1\nmeta 1\nmetabolize 1\nmetabolized 1\nmetallic 1\nmetalworkers 1\nmetalworking 1\nmetamorphosed 1\nmetaphorical 1\nmetaphorically 1\nmetaphysical 1\nmetaphysics 1\nmetaquestion 1\nmeteoric 1\nmeteorologists 1\nmethodically 1\nmethodicalness 1\nmethyl 1\nmetropolis 1\nmexican 1\nmhain@ISO-NE.com 1\nmicoprocessors 1\nmicro-components 1\nmicro-econometrics 1\nmicro-electronic 1\nmicro-fiber 1\nmicro-liquidity 1\nmicrobe 1\nmicrobiologist 1\nmicrochip 1\nmicrocontrollers 1\nmicroeconomics 1\nmicrofilm 1\nmicromachining 1\nmicron 1\nmicroorganism 1\nmicrovan 1\nmicrowaved 1\nmid-'60s 1\nmid-17th 
1\nmid-1940s 1\nmid-1950s 1\nmid-1979 1\nmid-1990's 1\nmid-1991 1\nmid-1995 1\nmid-1999 1\nmid-20s 1\nmid-30s 1\nmid-80s 1\nmid-90s 1\nmid-January 1\nmid-June 1\nmid-March 1\nmid-May 1\nmid-day 1\nmid-development 1\nmid-evenings 1\nmid-level 1\nmid-morning 1\nmid-priced 1\nmid-season 1\nmid-sized 1\nmid-term 1\nmid-war 1\nmid-week 1\nmidair 1\nmidcapitalization 1\nmidcontinent 1\nmideast 1\nmidfield 1\nmidget 1\nmidlevel 1\nmidou 1\nmidsize 1\nmidstream 1\nmidsummer 1\nmidterm 1\nmidweek 1\nmien 1\nmiffed 1\nmighta 1\nmigrations 1\nmigratory 1\nmike 1\nmiked 1\nmil 1\nmildewy 1\nmilitancies 1\nmilitancy 1\nmilitarize 1\nmilitaro 1\nmilitate 1\nmilked 1\nmilking 1\nmilled 1\nmillenium 1\nmillimeters 1\nmilllion 1\nmime 1\nmimicked 1\nmin. 1\nmince 1\nmincing 1\nmindful 1\nmindfulness 1\nmindscape 1\nminefields 1\nmineralogic 1\nmingfm 1\nmini-coffin 1\nmini-discs 1\nmini-disks 1\nmini-slip 1\nmini-studio 1\nmini-type 1\nminiaturization 1\nminibus 1\nminicrash 1\nminimally 1\nminimise 1\nminimization 1\nminimized 1\nminimums 1\nminimun 1\nminimunm 1\nminincomputer 1\nministering 1\nmint35@hotmail.com 1\nminting 1\nminuses 1\nminutiae 1\nmioxidil 1\nmirage 1\nmircles 1\nmirroring 1\nmirth 1\nmis-matches 1\nmisadventure 1\nmisappropriated 1\nmisbegotten 1\nmisbehaves 1\nmisc.consumers.frugal-living 1\nmischievous 1\nmisclassified 1\nmiscommunication 1\nmiscommunications 1\nmisconstrue 1\nmiscounts 1\nmiscreant 1\nmiscreants 1\nmisdemeanors 1\nmisdirection 1\nmiseries 1\nmisfired 1\nmisfiring 1\nmisfit 1\nmisgivings 1\nmishap 1\nmisinform 1\nmisinformation 1\nmisinterprets 1\nmisjudgment 1\nmisjudgments 1\nmislaid 1\nmismeasurements 1\nmisperception 1\nmisplaced 1\nmisquotation 1\nmisrepresents 1\nmisrouted 1\nmississippi 1\nmisspell 1\nmisspent 1\nmist 1\nmistaking 1\nmistletoe 1\nmistranslated 1\nmistreat 1\nmistrusted 1\nmisunderstand 1\nmisunderstood 1\nmisuses 1\nmitBBS.com 1\nmite 1\nmites 1\nmixable 1\nmixed-race 1\nmixers 1\nmixtures 1\nml 1\nmnemonics 1\nmoan 
1\nmoat 1\nmobbed 1\nmobilised 1\nmobilizes 1\nmobster 1\nmocca 1\nmodelers 1\nmoderator 1\nmodish 1\nmods 1\nmodular 1\nmodulate 1\nmodulating 1\nmodulation 1\nmodus 1\nmoguls 1\nmohair 1\nmoi 1\nmoist 1\nmoisturizer 1\nmoisturizers 1\nmojo 1\nmolar 1\nmoldy 1\nmolecularly 1\nmolehill 1\nmolestation 1\nmolesters 1\nmolitoff 1\nmollified 1\nmollify 1\nmollifying 1\nmollusks 1\nmolten 1\nmolybdenum 1\nmomma 1\nmoms 1\nmonarchies 1\nmonetize 1\nmonetizing 1\nmoney-wise 1\nmoneys 1\nmonied 1\nmonks 1\nmonoculture 1\nmonohull 1\nmonoid 1\nmonoliths 1\nmononucleosis 1\nmonophonic 1\nmonopolizing 1\nmonotonous 1\nmonotony 1\nmonsieur 1\nmontgolfiere 1\nmontgolfing 1\nmoodiness 1\nmoonbeams 1\nmoorland 1\nmooseleems 1\nmor 1\nmoralistic 1\nmores 1\nmoreso 1\nmori 1\nmormon 1\nmorose 1\nmorph 1\nmorphed 1\nmorrow 1\nmorse 1\nmorsel 1\nmorsels 1\nmortals 1\nmortgagebacked 1\nmotels 1\nmoth 1\nmotherboard 1\nmotionless 1\nmotorbike 1\nmotorhomes 1\nmots 1\nmounded 1\nmountaineering 1\nmountainsides 1\nmountaintop 1\nmourn 1\nmouseover 1\nmousetrap 1\nmousetraps 1\nmousse 1\nmouthful 1\nmoveable 1\nmover 1\nmoviegoer 1\nmovieland 1\nmoviestar 1\nmovingly 1\nmow 1\nmowed 1\nmown 1\nmp3 1\nmshames@ucan.org 1\nmsi 1\nmucilage 1\nmucrosquamatus 1\nmuddle 1\nmuddled 1\nmudslinging 1\nmuff 1\nmuffins 1\nmuffler 1\nmuffs 1\nmugan 1\nmujahidin 1\nmulching 1\nmules 1\nmulitiple 1\nmulitiplier 1\nmull 1\nmullah 1\nmullahs 1\nmulls 1\nmulti-GPU 1\nmulti-access 1\nmulti-agency 1\nmulti-channel 1\nmulti-column 1\nmulti-directional 1\nmulti-faceted 1\nmulti-faith 1\nmulti-gear 1\nmulti-layered 1\nmulti-level 1\nmulti-million 1\nmulti-millionaires 1\nmulti-millionnaires 1\nmulti-part 1\nmulti-polarization 1\nmulti-purpose 1\nmulti-screen 1\nmulti-share 1\nmulti-skilled 1\nmulti-spired 1\nmulti-story 1\nmulti-ton 1\nmulti-track 1\nmultibillionaire 1\nmultibyte 1\nmulticinctus 1\nmulticolored 1\nmultifarious 1\nmultifunctional 1\nmultilayer 1\nmultilevel 1\nmultimillions 1\nmultipart 
1\nmultiplayer 1\nmultipled 1\nmultipleuser 1\nmultiplexer 1\nmultiplier 1\nmultipolar 1\nmultisided 1\nmultistate 1\nmumbled 1\nmumbling 1\nmumbo 1\nmummies 1\nmumsy 1\nmunakafa@yahoo.com 1\nmunching 1\nmunis 1\nmunitions 1\nmurkier 1\nmurray 1\nmus 1\nmuscled 1\nmuscling 1\nmusems 1\nmush 1\nmushy 1\nmusicianship 1\nmustache 1\nmustachioed 1\nmusters 1\nmutant 1\nmutating 1\nmutations 1\nmutilation 1\nmutinous 1\nmutt 1\nmuttering 1\nmutts 1\nmuzzleloader 1\nmuzzles 1\nmylar 1\nmystic 1\nmystification 1\nmythical 1\nmythological 1\nmythology 1\nmythos 1\nn33na33@hotmail.com 1\nnab 1\nnaff 1\nnagged 1\nnaggings 1\nnags 1\nnaivete 1\nnaivety 1\nnaja 1\nnamedropper 1\nnamespace 1\nnanguan 1\nnanometer 1\nnanosecond 1\nnaps 1\nnarcokleptocrat 1\nnarcotraficantes 1\nnarrates 1\nnarrating 1\nnarrowcasting 1\nnasopharyngeal 1\nnast 1\nnastiest 1\nnate 1\nnationalizations 1\nnationalize 1\nnatively 1\nnatue 1\nnaturalist 1\nnaturalistic 1\nnatures 1\nnaughtier 1\nnauseous 1\nnautical 1\nnavies 1\nnavigational 1\nnaysay 1\nnazism 1\nncfa 1\nnearsightedness 1\nnecked 1\nnecking 1\nnecktie 1\nneckties 1\nnecrosis 1\nneedier 1\nnegated 1\nnegation 1\nnegativism 1\nneglects 1\nnegligibly 1\nnegotiatory 1\nnegs 1\nneige 1\nneighbhorhoods 1\nneighbour 1\nneighbouring 1\nnematode 1\nneo-Nazi 1\nneo-conservatives 1\nneo-insurgency 1\nneoclassical 1\nneoconservatism 1\nneoconservatives 1\nneonatal 1\nneonatology 1\nneophyte 1\nneophytes 1\nnephews 1\nnephews. 
1\nnephropathy 1\nnepotism 1\nnervosa 1\nnervy 1\nnessasary 1\nnested 1\nnesting 1\nnestled 1\nnestles 1\nnetherworld 1\nnetiquette 1\nnetwerk 1\nneurasthenia 1\nneurology 1\nneuron 1\nneuroscientist 1\nneutered 1\nneutering 1\nneutralizes 1\nnevermore 1\nnewbies 1\nnewborns 1\nnewfound 1\nnewlywed 1\nnewscasts 1\nnewsies 1\nnewspaperman 1\nnewsreel 1\nnewsroom 1\nnewsrooms 1\nnewswire 1\nnewsworthiness 1\nnext@cnn 1\nni 1\nniave 1\nnibble 1\nnibbled 1\nnibbling 1\nniche-itis 1\nnichols 1\nnicked 1\nniece 1\nnigeria 1\nnightclubs 1\nnighter 1\nnightfall 1\nnightgown 1\nnightie 1\nnihilist 1\nnik 1\nnilly 1\nninefold 1\nnissan 1\nnit 1\nnitpicking 1\nnixed 1\nno. 1\nnobel 1\nnoblemen 1\nnobler 1\nnobles 1\nnobodies 1\nnoc@paulhastings.com 1\nnocturnal 1\nnodded 1\nnodding 1\nnoises 1\nnomad 1\nnomenal 1\nnomenclature 1\nnon-AMT 1\nnon-Arabs 1\nnon-Cocom 1\nnon-DPP 1\nnon-English 1\nnon-Hakka 1\nnon-Hanabali 1\nnon-Hispanic 1\nnon-Humana 1\nnon-Indian 1\nnon-KMT 1\nnon-Microsoft 1\nnon-Moslem 1\nnon-Muslim 1\nnon-Muslims 1\nnon-NMS 1\nnon-Russian 1\nnon-Saudi 1\nnon-State-owned 1\nnon-Swedish 1\nnon-Tagalog 1\nnon-accumulation 1\nnon-advertising 1\nnon-auto 1\nnon-automotive 1\nnon-brain 1\nnon-building 1\nnon-caffeine 1\nnon-calculus 1\nnon-call 1\nnon-cash 1\nnon-class 1\nnon-clients 1\nnon-coastal 1\nnon-combat 1\nnon-commercialization 1\nnon-competitive 1\nnon-contradiction 1\nnon-controlling 1\nnon-crumbly 1\nnon-customers 1\nnon-daily 1\nnon-dairy 1\nnon-defense 1\nnon-destructive 1\nnon-dischargable 1\nnon-discrimination 1\nnon-drug 1\nnon-earners 1\nnon-economic 1\nnon-economical 1\nnon-edible 1\nnon-encoded 1\nnon-enforcement 1\nnon-equity 1\nnon-ethnic 1\nnon-event 1\nnon-evidence 1\nnon-exclusive 1\nnon-exclusiveness 1\nnon-existence 1\nnon-expert 1\nnon-eye 1\nnon-family 1\nnon-farm 1\nnon-firm 1\nnon-flight 1\nnon-fortress 1\nnon-grata 1\nnon-heteros 1\nnon-horticultural 1\nnon-interstate 1\nnon-interventionist 1\nnon-issue 1\nnon-lawyers 1\nnon-lethal 
1\nnon-letter 1\nnon-life 1\nnon-market 1\nnon-mega 1\nnon-member 1\nnon-military 1\nnon-narrative 1\nnon-normative 1\nnon-objective 1\nnon-operative 1\nnon-packaging 1\nnon-partisan 1\nnon-party 1\nnon-patent 1\nnon-payment 1\nnon-pilot 1\nnon-poisonous 1\nnon-polio 1\nnon-politicized 1\nnon-pregnant 1\nnon-professionals 1\nnon-provocative 1\nnon-recourse 1\nnon-refugees 1\nnon-regulated 1\nnon-repeatable 1\nnon-retail 1\nnon-road 1\nnon-sales 1\nnon-social 1\nnon-specialists 1\nnon-starter 1\nnon-striking 1\nnon-threatening 1\nnon-union 1\nnon-user 1\nnon-veg 1\nnon-verbal 1\nnon-violence 1\nnon-viral 1\nnon-virulent 1\nnon-volatile 1\nnon-voting 1\nnon-warranty 1\nnon-wealthy 1\nnon-working 1\nnonaggression 1\nnonbusiness 1\nnoncash 1\nnonchelant 1\nnonchlorine 1\nnoncombatant 1\nnoncombatants 1\nnoncommercial 1\nnoncompliant 1\nnonconformists 1\nnonconvertible 1\nnoncumulative 1\nnondairy 1\nnonenforcement 1\nnonentity 1\nnonevent 1\nnonexecutive 1\nnonexistant 1\nnonfeasance 1\nnonflammable 1\nnonintervention 1\nnonlethal 1\nnonparticipant 1\nnonplussed 1\nnonpriority 1\nnonproductive 1\nnonproliferation 1\nnonregulated 1\nnonsocialist 1\nnonspecific 1\nnontraditional 1\nnonvirulent 1\nnonweaponized 1\nnonworking 1\nnonwoven 1\nnook 1\nnope 1\nnormality 1\nnormalize 1\nnormall 1\nnormals 1\nnormative 1\nnortheasterly 1\nnortherly 1\nnortherner 1\nnorthernmost 1\nnosediving 1\nnostrum 1\nnotables 1\nnotarization 1\nnotarized 1\nnotary 1\nnotation 1\nnotches 1\nnoteholder 1\nnotepad 1\nnoteslinks 1\nnothin 1\nnothingness 1\nnothings 1\nnotional 1\nnotionally 1\nnourish 1\nnovelties 1\nnovelty 1\nnovice 1\nnovitiate 1\nnovitiates 1\nnozzles 1\nnucleotide 1\nnudity 1\nnugget 1\nnumbness 1\nnumeral 1\nnumerals 1\nnumerator 1\nnumeric 1\nnunchuk 1\nnung 1\nnunjun 1\nnutriments 1\nnutritionists 1\nnutritious 1\nnutso 1\nnutter 1\nnutz 1\nnuzzles 1\nnym 1\nnymex 1\noaks 1\nobdurate 1\nobediently 1\nobeisance 1\nobelisk 1\nobese 1\nobfuscate 1\nobituary 1\nobjectors 
1\nobligating 1\nobligatto 1\nobliges 1\nobligingly 1\noblique 1\nobliquely 1\nobliterated 1\nobliterating 1\noblong 1\noboist 1\nobscenity 1\nobscured 1\nobservational 1\nobservatory 1\nobservetory 1\nobsess 1\nobsessing 1\nobsolescence 1\nobsoleting 1\nobstetrics 1\nobstinacy 1\nobstructionist 1\nobstructions 1\nobstructive 1\nobstruse 1\nobtainable 1\nobtusa 1\nobtuse 1\nobverse 1\nobviate 1\noc 1\nocculted 1\noccupants 1\noccupationally 1\noctagonal 1\noctave 1\noctaves 1\noctogenarian 1\noctopus 1\noddities 1\noddity 1\noddsmaker 1\noddysey 1\nodometer 1\nodor 1\nodors 1\nodyssey 1\noeufs 1\noff-cuts 1\noffbeat 1\noffences 1\noffender 1\noffends 1\noffenses 1\noffhandedly 1\noffhandedness 1\nofficals 1\nofficialdom 1\nofficio 1\nofficious 1\nofficiously 1\noffload 1\noffputting 1\noffsets 1\noffshoots 1\noffside 1\noffsite 1\noficials 1\noft 1\noftentimes 1\nogle 1\nogles 1\nogling 1\nogre 1\noiled 1\noiler 1\noink 1\nointment 1\nokra 1\nold-style 1\noldsters 1\nolefins 1\noligarchic 1\noligonucleotides 1\nolives 1\nomens 1\nominously 1\nomnipotent 1\non-line 1\noncogene 1\none's 1\none-half 1\nonepage 1\nonesie 1\noneyear 1\nonion 1\nonline?u=mayursha&m=g&t=1 1\nonlookers 1\nonrushing 1\nons 1\nonstage 1\nontology 1\noodles 1\noomph 1\noooch 1\nop 1\nopacity 1\nopenended 1\nopeners 1\noperable 1\noperand 1\nophthalmologist 1\nopining 1\nopinon 1\nopinons 1\nopoonants 1\nopponants 1\noppositions 1\noppositon 1\noppressions 1\noppressor 1\noptically 1\noptimist 1\noptinal 1\nopto 1\nopulence 1\noracle 1\noranges 1\norca 1\norchardists 1\norchestral 1\norchestrating 1\nordained 1\nordeals 1\noregon 1\norganically 1\norganisations 1\norganise 1\norganised 1\norganiser 1\norganizes 1\norgies 1\norginally 1\noriental 1\norifices 1\noriginates 1\noriginators 1\noring 1\nornament 1\nornamentation 1\nornamented 1\nornery 1\nornithological 1\northo 1\northopedics 1\noscars 1\nosmotic 1\nossification 1\nostensible 1\nostentation 1\nostentatiously 1\nosteos 1\nostracized 
1\nostrich 1\nostriches 1\not 1\notherworldly 1\notters 1\nould 1\noutbidding 1\noutdelta1 1\noutdelta2 1\noutdelta3 1\noutdelta4 1\noutdid 1\noutdoes 1\noutdone 1\noutdoorsman 1\noutermost 1\noutfielders 1\noutflank 1\noutgained 1\noutgrew 1\noutlanders 1\noutlandish 1\noutlast 1\noutlasted 1\noutlaws 1\noutleaped 1\noutlier 1\noutlived 1\noutmoding 1\noutpace 1\noutpaces 1\noutpatients 1\noutperforming 1\noutperforms 1\noutplacement 1\noutposts 1\noutputs 1\noutrages 1\noutranked 1\noutreach 1\noutsell 1\noutselling 1\noutsells 1\noutshine 1\noutshines 1\noutsides 1\noutsized 1\noutsold 1\noutsole 1\noutsource 1\noutsourced 1\noutstretched 1\noutstrip 1\noutstripping 1\nouzo 1\noval 1\novarian 1\novata 1\nove 1\nover-capacity 1\nover-eat 1\nover-emphasize 1\nover-generalizations 1\nover-indebted 1\nover-leveraged 1\nover-refined 1\noverachievers 1\noveranxious 1\noverarching 1\noverblown 1\noverbought 1\noverbreadth 1\noverbuilding 1\noverburdened 1\novercast 1\novercollateralized 1\novercomes 1\novercoming 1\novercommitted 1\novercooked 1\noverdeveloped 1\noverdosed 1\noverdosing 1\noverdraft 1\noverdrafts 1\noverdrawn 1\noverdressed 1\novereager 1\noveremphasis 1\noveremphasized 1\noverestimate 1\noverestimating 1\noverflights 1\noverflown 1\noverflows 1\novergeneralization 1\novergrown 1\noverheard 1\noverheating 1\noverinclusion 1\noverland 1\noverlaps 1\noverlays 1\noverloading 1\noverloads 1\noverlooks 1\novernice 1\novernourished 1\noverplanted 1\noverplayed 1\noverpopulation 1\noverpower 1\noverpowered 1\noverprints 1\noverpurchase 1\noverran 1\noverreach 1\noverreact 1\noverreacted 1\noverreaction 1\noverreactions 1\noverregulated 1\noverrode 1\noverrule 1\noverruled 1\noverruling 1\novers 1\noversaturation 1\novershadows 1\noversize 1\noverstaffed 1\noverstating 1\noverstocked 1\noverstretched 1\noversupply 1\novert 1\novertake 1\novertaken 1\novertaking 1\novertaxed 1\novertaxing 1\noverthrew 1\novertopped 1\noverturning 1\noverused 1\noverweighted 
1\noverworked 1\noverworking 1\noverwrite 1\noverwritten 1\noverwrought 1\noverzealous 1\noverzealousness 1\novr 1\novre 1\novulation 1\novulatory 1\nowgwo@yahoo.com 1\nowls 1\nox 1\noxidation 1\noxidize 1\noxidized 1\noxidizer 1\noxidizes 1\noysters 1\noz 1\nozonedepletion 1\np...@hotmail.com 1\npH 1\npaces 1\npacification 1\npacified 1\npacts 1\npaddle 1\npaddleball 1\npaddock 1\npaeans 1\npagans 1\npageantry 1\npagen 1\npaging 1\npaige 1\npained 1\npaintball 1\npaintbrush 1\npairing 1\npajamas 1\npalamedes 1\npalatable 1\npalatial 1\npalaver 1\npalazzi 1\npaleo 1\npaleomagnetic 1\npaleontologic 1\npaleontologically 1\npaleontologists 1\npallets 1\npallor 1\npalsy 1\npampered 1\npampers 1\npamphlet 1\npamphleteer 1\npamphlets 1\npan-Chinese 1\npanache 1\npancake 1\npancreatitis 1\npandemic 1\npandering 1\npandoro 1\npaneled 1\npanelists 1\npanhandler 1\npanhandling 1\npanics 1\npanjandrums 1\npannel 1\npanning 1\npant 1\npanting 1\npantry 1\npao 1\npaople 1\npapal 1\npaperboard 1\npaperboy 1\npaperclip 1\npara-system 1\nparabens 1\nparabola 1\nparacetamol 1\nparaded 1\nparagon 1\nparagonimiasis 1\nparagraphing 1\nparalleled 1\nparallelism 1\nparalyze 1\nparanormal 1\nparaphrase 1\nparaplegic 1\nparasailer 1\nparasite 1\nparastatals 1\nparched 1\nparchment 1\npardonable 1\nparentheses 1\nparentid 1\nparenting 1\nparigot 1\nparigotes 1\nparimutuels 1\nparish 1\nparishioners 1\nparities 1\nparley 1\nparliamentarian 1\nparliamentarians 1\nparole. 
1\nparoxysmal 1\nparried 1\nparrot 1\nparrotfish 1\nparrots 1\nparry 1\nparse 1\nparsing 1\npartake 1\npartakers 1\nparted 1\nparticlular 1\nparticulars 1\nparticulate 1\npartied 1\npartisans 1\npartying 1\npas 1\npashas 1\npassable 1\npassably 1\npassbook 1\npassel 1\npassers 1\npassersby 1\npassivity 1\npassword 1\npastdue 1\npastel 1\npasteurized 1\npastime 1\npastimes 1\npastoris 1\npastors 1\npasttime 1\npastures 1\npatacas 1\npatches 1\npatching 1\npathetically 1\npathfinder 1\npathogen 1\npathogens 1\npathology 1\npathos 1\npathway 1\npatriarchs 1\npatriarchy 1\npatronize 1\npatronized 1\npatted 1\npattering 1\npatties 1\npaud 1\npauper 1\npauses 1\npavement 1\npaw 1\npawning 1\npaws 1\npayables 1\npaymasters 1\npc 1\npcs 1\npdf 1\npeacemaker 1\npeacemakers 1\npeacemaking 1\npeaches 1\npeachwood 1\npeanutjak...@usa.com 1\npeas 1\npecan 1\npeccadilloes 1\npeck 1\npecks 1\npectorale 1\npeculiarities 1\npedagogies 1\npedagogue 1\npedaled 1\npedaling 1\npeddler 1\npedel 1\npederastized 1\npedi 1\npediatrician 1\npedicure 1\npedophile 1\npeek 1\npeel 1\npeelback 1\npeeling 1\npeering 1\npeerless 1\npeg 1\npegging 1\npegs 1\npei 1\npeiod 1\npejorative 1\npekin 1\npelham 1\npelicans 1\npellets 1\npenal 1\npenalizes 1\npenalizing 1\npenance 1\npenetrates 1\npenguin 1\npenises 1\npennants 1\npensive 1\npentameter 1\npenury 1\npeople/matters 1\npepco 1\npepole 1\npeppered 1\npepperoni 1\npeppy 1\npeptides 1\nperceives 1\npercenter 1\nperceptiveness 1\nperceptual 1\npercet 1\nperemptory 1\nperennially 1\nperestrokia 1\nperforated 1\nperfumed 1\nperfumes 1\nperfunctorily 1\nperihelion 1\nperiodical 1\nperipatetic 1\nperishables 1\nperk 1\npermanence 1\npermanency 1\npermeable 1\npermeate 1\npermeated 1\npermeating 1\npermissions 1\npermissive 1\npermutation 1\nperoxide 1\nperpetraited 1\nperpetrators 1\nperpetuates 1\nperrenial 1\npersecuting 1\npersecutions 1\npersevere 1\nperseveres 1\npersevering 1\npershare 1\npersimmons 1\npersistently 1\npersonages 1\npersonaly 
1\npersonas 1\npersonification 1\npersonify 1\nperspicacious 1\npersuades 1\npersuasiveness 1\npertains 1\nperturbed 1\nperu 1\nperversities 1\nperversity 1\npesatas 1\npesky 1\npesos 1\npester 1\npestis 1\npetard 1\npeter 1\npetit 1\npetrified 1\npetro-market 1\npetrol 1\npetrologic 1\npetsmart 1\npettiness 1\npetting 1\nphalanges 1\nphalanx 1\nphallic 1\npharmacists 1\npharmas 1\nphilanthropists 1\nphilanthropy 1\nphilatelists 1\nphilly 1\nphilosophizing 1\nphlegm 1\nphobias 1\nphonebook 1\nphonies 1\nphoning 1\nphosphate 1\nphotocopiers 1\nphotocopies 1\nphotoelectric 1\nphotofinishing 1\nphotogenic 1\nphotographing 1\nphotojournalist 1\nphotorealism 1\nphotosynthesis 1\nphototransistor 1\nphrasing 1\nphysios 1\npianistic 1\npiao 1\npiasters 1\npick-up 1\npicketers 1\npicketing 1\npickets 1\npickings 1\npickle 1\npickled 1\npickles 1\npickling 1\npicturesquely 1\npicturing 1\npieced 1\npiecing 1\npierce 1\npiercing 1\npies 1\npigeonholed 1\npiggybacked 1\npiglet 1\npiglets 1\npigsty 1\npiker 1\npilates 1\npileup 1\npilgrims 1\npillaged 1\npilloried 1\npillorying 1\npillowcases 1\npilote 1\npiloting 1\npilotted 1\npimples 1\npimps 1\npinchers 1\npine 1\npineapples 1\npinging 1\npinheaded 1\npinkie 1\npinstripe 1\npipelined 1\npiper 1\npipette 1\npipsqueak 1\npiquant 1\npiracy 1\npiranha 1\npirated 1\npiratical 1\npiroghi 1\npiss 1\npistils 1\npistons 1\npitchman 1\npitchmen 1\npithiest 1\npithy 1\npitstop 1\npivot 1\npixel 1\npixie 1\npizazz 1\npizzerias 1\nplacards 1\nplacated 1\nplacating 1\nplacid 1\nplacidly 1\nplaguing 1\nplaint 1\nplaintive 1\nplaintively 1\nplaning 1\nplanks 1\nplantar 1\nplantation 1\nplanter 1\nplaque 1\nplasmodium 1\nplastered 1\nplateaus 1\nplating 1\nplausibly 1\nplayboy 1\nplayboys 1\nplayfully 1\nplayfulness 1\nplaygrounds 1\nplayin 1\nplayland 1\nplaything 1\nplc 1\npleadingly 1\npleases 1\npleasingly 1\npleasurable 1\npleasuring 1\npleated 1\nplenary 1\nplenum 1\nplethora 1\npliability 1\npliant 1\nplies 1\nplights 1\nplllz 
1\nploddingly 1\nplow 1\npluck 1\nplum 1\nplume 1\nplump 1\nplunder 1\nplundered 1\nplunges 1\nplural 1\npluralist 1\npluralization 1\npluralized 1\npluri-party 1\npluses 1\nplutocracy 1\nplz 1\npoaching 1\npocketbook 1\npocketbooks 1\npocketing 1\npockmarked 1\npodcasters 1\npodiatrist 1\npogroms 1\npointy 1\npoises 1\npoke 1\npoked 1\npol 1\npolarization 1\npolarized 1\npolemic 1\npoliced 1\npolices 1\npolicyholder 1\npolicymaker 1\npolicymaking 1\npolitico 1\npoliticos 1\npollinating 1\npollination 1\npolllution 1\npolluter 1\npols 1\npolution 1\npoly 1\npolychaetes 1\npolyglot 1\npolygon 1\npolyline 1\npolymer 1\npolymerase 1\npolymeric 1\npolymers 1\npolymorphisms 1\npolyphony 1\npolypin 1\npolyrhythms 1\npolytechnic 1\npolytheism 1\npolyvinyl 1\npomegranates 1\npomelo 1\npomfret 1\npomological 1\npomologist 1\npomp 1\npompous 1\nponderousness 1\nponeh 1\npong 1\nponied 1\npony 1\npony's 1\nponying 1\npoo 1\npooch 1\npoodles 1\npoof 1\npoohbah 1\npooling 1\npop. 1\npop...@spinach.eat 1\npopes 1\npopper 1\npoppy 1\npops 1\npopsicle 1\npopularizing 1\npopulism 1\nporches 1\nporchlights 1\npored 1\nporing 1\nporkless 1\nporphyria 1\nported 1\nportend 1\nporthole 1\nportland 1\nportlet 1\nportly 1\nposies 1\npositional 1\npositivist 1\nposse 1\nposses 1\npossibilitly 1\npossibilitys 1\npost-921 1\npost-Barre 1\npost-Bush 1\npost-Chavez 1\npost-Freudian 1\npost-Hinkley 1\npost-Hugo 1\npost-June 1\npost-Oct 1\npost-Pop 1\npost-September 1\npost-Vietnam 1\npost-accident 1\npost-bankruptcy 1\npost-catastrophe 1\npost-colonial 1\npost-disaster 1\npost-game 1\npost-harvest 1\npost-industrialization 1\npost-martial 1\npost-modern 1\npost-op 1\npost-presidential-election 1\npostcard 1\npostcards 1\nposterity 1\npostmodernism 1\npostulates 1\npotassium 1\npotatoe 1\npotatos 1\npotentates 1\npotentiality 1\npotentials 1\npotful 1\npothole 1\npotion 1\npotomac 1\npottage 1\npotty 1\npouch 1\npounce 1\npoundcake 1\npowderkeg 1\npowerhead 1\npox 1\npp 1\npppd 1\nppww 
1\npractise 1\npractitioner 1\npragmatists 1\nprairies 1\nprancing 1\npranks 1\nprayfully 1\npre-18th 1\npre-1950s 1\npre-Christmas 1\npre-Freudian 1\npre-May 1\npre-Reagan 1\npre-Revolutionary 1\npre-accident 1\npre-admission 1\npre-approved 1\npre-bankruptcy 1\npre-big 1\npre-competition 1\npre-conditions 1\npre-crash 1\npre-death 1\npre-eminence 1\npre-empted 1\npre-game 1\npre-introduction 1\npre-kindergarten 1\npre-machined 1\npre-made 1\npre-negotiated 1\npre-noon 1\npre-owned 1\npre-publication 1\npre-quake 1\npre-recorded 1\npre-reform 1\npre-sale 1\npre-separating 1\npre-set 1\npre-shifted 1\npre-signed 1\npre-split 1\npre-strike 1\npre-summit 1\npre-tested 1\npre-try 1\npre-university 1\npre-warning 1\npre-work 1\npreached 1\npreaches 1\npreadmission 1\npreamplifiers 1\nprecariousness 1\nprecinct 1\nprecincts 1\nprecipices 1\nprecipitation 1\nprecipitously 1\npreclearance 1\nprecursor 1\npredawn 1\npredictor 1\npredilection 1\npredilections 1\npredispose 1\npredominance 1\npredominant 1\npredominately 1\npreeminent 1\npreening 1\npreety 1\nprefect 1\nprefectural 1\npreferable 1\nprefigurement 1\nprefix 1\nprefixes 1\npreflight 1\npregnancies 1\nprejudicial 1\npreloaded 1\npremediated 1\npremeditated 1\npremeditation 1\npremiered 1\npremiers 1\npremiership 1\nprenatal 1\npreneurialism 1\nprenuptial 1\npreoccupations 1\npreoccupy 1\npreparer 1\npreparers 1\nprepay 1\nprepaying 1\nprepositioning 1\nprepping 1\npreppy 1\nprerequisite 1\nprerogative 1\npresages 1\npreschool 1\npreschooler 1\nprescribes 1\nprescriptive 1\npresences 1\npresentable 1\npresenters 1\npresid...@whitehouse.gov 1\npresidentially 1\npressers 1\npresure 1\npretends 1\npretensions 1\npretexts 1\npretzel 1\nprevaricating 1\npreventively 1\npreviewing 1\nprevously 1\nprewar 1\npricecutting 1\npricier 1\npricks 1\nprideful 1\nprides 1\npriesthood 1\nprim 1\nprimaarily 1\nprimal 1\nprimaries 1\nprimarly 1\nprimate 1\nprimates 1\nprimer 1\nprimitives 1\nprincess 1\nprintouts 1\npriori 
1\nprioritized 1\nprioritizing 1\nprivateers 1\nprivations 1\nprivatizing 1\nprivilage 1\nprivledges 1\npro-American 1\npro-Communist 1\npro-Gorbachev 1\npro-Gore 1\npro-Iraqi 1\npro-Milosevic 1\npro-NATO 1\npro-Nazi 1\npro-Noriega 1\npro-Palestinian 1\npro-Reagan 1\npro-Republican 1\npro-Russia 1\npro-Saddam 1\npro-Soong 1\npro-Western 1\npro-Zionist 1\npro-abortion 1\npro-consumer 1\npro-consumption 1\npro-enterprise 1\npro-environment 1\npro-family 1\npro-forma 1\npro-growth 1\npro-independence 1\npro-investment 1\npro-mark 1\npro-rata 1\npro-road 1\npro-selected 1\npro-tax 1\npro-tested 1\nprobabilities 1\nprobablyl 1\nprobate 1\nprobity 1\nproblematics 1\nprocess's 1\nprocrastinate 1\nprocreation 1\nprocuratorate 1\nprocured 1\nprodded 1\nprodiction 1\nprodigies 1\nprods 1\nproessional 1\nprof 1\nprofanity 1\nprofesses 1\nprofessing 1\nproffer 1\nprofferred 1\nprofiled 1\nprofiteering 1\nprofiteers 1\nprofitmargin 1\nprofittaking 1\nprofundo 1\nprofuse 1\nprofusely 1\nprogeny 1\nprognosticators 1\nprogrammable 1\nprogression 1\nprogressions 1\nprohibitively 1\nprojectiles 1\nprojectors 1\nprolongation 1\nprolonging 1\nprom 1\npromenade 1\nprominant 1\npromissory 1\npromptitude 1\nprompts 1\npromulgate 1\npronation 1\nprong 1\npronoun 1\npronouncements 1\npronounces 1\nproofreading 1\nproofs 1\npropagandists 1\npropagate 1\npropagation 1\npropellant 1\npropellers 1\nprophecy 1\nprophesies 1\nprophets 1\nproportionally 1\nproportionate 1\nproportioned 1\nproposterous 1\nproprieter 1\npropriety 1\npropter 1\npropulsive 1\nprosaic 1\nprosection 1\nprosecutive 1\nprosoma 1\nprospectively 1\nprospectuses 1\nprospered 1\nprosthetic 1\nprostituting 1\nprostration 1\nprotagonists 1\nprotanopia 1\nprotectant 1\nprotocols 1\nproudest 1\nprounounced 1\nprovences 1\nprovincialism 1\nprovost 1\nprude 1\nprudence 1\nprune 1\npruned 1\npruning 1\nps. 
1\npsalms 1\npseudo-lobbyists 1\npseudomembranous 1\npseudonymous 1\npsi 1\npsyches 1\npsychiatry 1\npsychics 1\npsychoanalytic 1\npsycholical 1\npsychopathic 1\npsychos 1\npsychotic 1\npsyop 1\npu 1\npub. 1\npuffed 1\npuffers 1\npuffing 1\npuffs 1\npuffy 1\npug 1\npuke 1\npulchritude 1\npullet 1\npullouts 1\npulpits 1\npulsate 1\npulsing 1\npulverizing 1\npummel 1\npummeling 1\npumpkin 1\npunchers 1\npunchline 1\npunchy 1\npunctuated 1\npunditing 1\npundits 1\npunies 1\npunishes 1\npunishments 1\npunt 1\npunters 1\npuppets 1\npur 1\npurchaser 1\npurges 1\npuries 1\npuritan 1\npuritans 1\npurport 1\npurposeful 1\npurr 1\npurrfect 1\npurrs 1\npursuance 1\npursues 1\npusher 1\npushover 1\npusillanimity 1\npusillanimous 1\npussy 1\nputtagenius 1\npuzzlement 1\npuzzler 1\npuzzles 1\npvp 1\npwople 1\npygmy 1\npyjamas 1\npyramiding 1\npyrimidal 1\nqi 1\nqinyin 1\nqqtlbos 1\nquackery 1\nquadrant 1\nquadrennial 1\nquadriplegic 1\nquadruples 1\nquaint 1\nqualifies 1\nqualms 1\nquandary 1\nquango 1\nquanitizer 1\nquantification 1\nquantified 1\nquantitatively 1\nquarantine 1\nquarreling 1\nquarrels 1\nquart 1\nquarterbacking 1\nquartered 1\nquarterfinals 1\nquartz 1\nquasi-compilation 1\nquasi-country 1\nquasi-federal 1\nquasi-governmental 1\nquasi-international 1\nquasi-legal 1\nquasi-magic 1\nquasi-man 1\nquasi-men 1\nquasi-public 1\nquasi-xenophobic 1\nqueasily 1\nqueenside 1\nqueerly 1\nquelling 1\nquench 1\nqueried 1\nquesters 1\nquestionsand 1\nquests 1\nquibbling 1\nquickened 1\nquickening 1\nquickens 1\nquicksand 1\nquid 1\nquiescent 1\nquieter 1\nquieting 1\nquietness 1\nquilling 1\nquilting 1\nquintillion 1\nquintuple 1\nquintuplets 1\nquipster 1\nquirky 1\nquisling 1\nquiver 1\nquivering 1\nquizzing 1\nquorum 1\nquotient 1\nr.c.s.m 1\nr2 1\nr2d2 1\nrabbinical 1\nrabble 1\nrabies. 
1\nraccoon 1\nracecar 1\nracetrack 1\nracetracks 1\nrachet 1\nrackets 1\nracoon 1\nracy 1\nradars 1\nradial 1\nradiance 1\nradiant 1\nradiantly 1\nradiographs 1\nradioing 1\nradiological 1\nradiophonic 1\nraffle 1\nrafters 1\nragging 1\nragnarok 1\nragtime 1\nrailbikes 1\nrailcars 1\nraincoat 1\nraincoats 1\nrainier 1\nraining 1\nrainout 1\nrainstorm 1\nrajendra.o...@gmail.com 1\nrambled 1\nrambling 1\nrampaged 1\nramrod 1\nrams 1\nranching 1\nrandomness 1\nrandroid 1\nrankled 1\nrankles 1\nrantissi 1\nrapacious 1\nrapes 1\nrapeseeds 1\nrapidement 1\nrapped 1\nrappers 1\nrapport 1\nraps 1\nraptors 1\nrarer 1\nrarity 1\nrascals 1\nraspberries 1\nratchet 1\nratifies 1\nration 1\nrationalist 1\nrationalistic 1\nrationalization 1\nrationalizing 1\nrattling 1\nratty 1\nravage 1\nravenous 1\nraves 1\nravished 1\nrawest 1\nrayon 1\nrazing 1\nrazzed 1\nrd 1\nre-adjustment 1\nre-adjustments 1\nre-alignment 1\nre-applying 1\nre-assure 1\nre-broadcasts 1\nre-case 1\nre-count 1\nre-creactions 1\nre-creating 1\nre-creation 1\nre-creations 1\nre-deployment 1\nre-designed 1\nre-discovered 1\nre-educated 1\nre-electing 1\nre-emphasize 1\nre-enacted 1\nre-enacting 1\nre-enactments 1\nre-enlist 1\nre-entered 1\nre-entering 1\nre-entry 1\nre-establishing 1\nre-evaluated 1\nre-examination 1\nre-examined 1\nre-examining 1\nre-exported 1\nre-fight 1\nre-igniting 1\nre-imposed 1\nre-innovation 1\nre-inscribe 1\nre-inscribed 1\nre-investment 1\nre-landscaped 1\nre-leased 1\nre-looking 1\nre-opened 1\nre-organization 1\nre-organize 1\nre-organized 1\nre-read 1\nre-registering 1\nre-rendering 1\nre-routed 1\nre-schedule 1\nre-secured 1\nre-shipped 1\nre-sign 1\nre-start 1\nre-summoned 1\nre-supplied 1\nre-tap 1\nre-think 1\nre-trained 1\nre-unification 1\nre-unified 1\nre-wiring 1\nre-wording 1\nreactionaries 1\nreactivating 1\nreactivity 1\nreadandpost 1\nreadied 1\nreadjust 1\nreadjustment 1\nreadmit 1\nreadmitted 1\nreadymade 1\nreaffirm 1\nrealigning 1\nrealignments 1\nrealise 
1\nrealising 1\nrealizable 1\nrealizations 1\nreallocation 1\nrealtime 1\nrealtion 1\nream 1\nreamins 1\nreams 1\nreaper 1\nreaping 1\nreapplication 1\nreapportion 1\nreappraisal 1\nreappraise 1\nreappropriation 1\nreared 1\nrearrangement 1\nrearranges 1\nrearview 1\nreasearch 1\nreassemble 1\nreassessed 1\nreassessing 1\nreassigns 1\nreassurances 1\nreassuringly 1\nreauthorization 1\nreauthorize 1\nrebar 1\nrebelling 1\nrebelliousness 1\nreboot 1\nreborn 1\nrebuking 1\nrebuttal 1\nrebutted 1\nrec 1\nrec. 1\nrecalculated 1\nrecalculations 1\nrecalibrate 1\nrecalibrating 1\nrecantation 1\nrecap 1\nrecaptilization 1\nrecapturing 1\nrecasting 1\nreccommend 1\nreceivership 1\nrecentralized 1\nreceptionists 1\nreceptions 1\nreceptivity 1\nreceptors 1\nrecess 1\nrecessed 1\nrechargeable 1\nrecieve 1\nrecieved 1\nreciprocal 1\nreciprocity 1\nrecitals 1\nreciter 1\nrecites 1\nrecklessness 1\nreclaimed 1\nreclaiming 1\nrecliner 1\nrecluse 1\nreclusive 1\nrecognise 1\nrecognizably 1\nrecoil 1\nrecoils 1\nrecombination 1\nrecombine 1\nrecomend 1\nrecommendatons 1\nrecommit 1\nrecompense 1\nreconciled 1\nreconciliations 1\nreconditioning 1\nreconfigure 1\nreconfirm 1\nreconnect 1\nreconvenes 1\nrecordable 1\nrecordkeeping 1\nrecovers 1\nrecraft 1\nrecreations 1\nrecriminations 1\nrecruiters 1\nrectification 1\nrectified 1\nrectilinear 1\nrecuperate 1\nrecuperating 1\nrecurrent 1\nrecursive 1\nrecuse 1\nrecycler 1\nreddened 1\nreddish 1\nredefined 1\nredefining 1\nredeploy 1\nredeployment 1\nredesigning 1\nredevelop 1\nredfish 1\nredial 1\nredirected 1\nredirecting 1\nredirection 1\nrediscovering 1\nredistribute 1\nredistributes 1\nredistributionism 1\nredistributors 1\nredistricting 1\nredlining 1\nredocking 1\nredone 1\nredouble 1\nredoubled 1\nredoubling 1\nredoubt 1\nredound 1\nredrawn 1\nredressing 1\nreds 1\nredvepco.doc 1\nreeeeeeeeeeeeeeaaaal 1\nreeged 1\nreel 1\nreelection 1\nreels 1\nreenter 1\nreexamined 1\nreexamining 1\nrefashioning 1\nreferee 1\nreferencing 
1\nreferenda 1\nrefills 1\nreflector 1\nreflex 1\nreflexes 1\nreflexive 1\nrefocuses 1\nreforestation 1\nreformists 1\nreformulation 1\nrefracted 1\nrefraction 1\nrefrained 1\nrefreshed 1\nrefreshingly 1\nrefunded 1\nrefurb 1\nrefurbish 1\nrefutation 1\nrefute 1\nrefutes 1\nregading 1\nregal 1\nregimented 1\nregressing 1\nregression 1\nregressive 1\nregretful 1\nregularities 1\nregularity 1\nregulars 1\nregummed 1\nrehabilitating 1\nrehash 1\nrehashing 1\nrehearsed 1\nrehearses 1\nreignite 1\nreimagine 1\nreimbursements 1\nreimburses 1\nreincarnation 1\nreincorporated 1\nreincorporating 1\nreindicting 1\nreining 1\nreinstating 1\nreinstituting 1\nreinsure 1\nreintegrated 1\nreintegrating 1\nreinterpretation 1\nreintroduce 1\nreintroduction 1\nreinvent 1\nreinventer 1\nreinvigorated 1\nreinvigorates 1\nreinvigorating 1\nreinvigoration 1\nreiterates 1\nrejections 1\nrejoiced 1\nrejoices 1\nrejoinders 1\nrejoining 1\nrejuvenate 1\nrelabeling 1\nrelapse 1\nrelapsed 1\nrelatable 1\nrelational 1\nrelaunch 1\nrelaunched 1\nrelaying 1\nrelegate 1\nrelegating 1\nrelented 1\nrelenting 1\nrelevancy 1\nreliability 1\nrelic 1\nrelieves 1\nrelinquishing 1\nrelished 1\nrelishes 1\nrelive 1\nrelived 1\nremade 1\nremanded 1\nremarketer 1\nremarketings 1\nremarking 1\nremarriage 1\nremarry 1\nrematch 1\nremediation 1\nremedied 1\nremedying 1\nremembrance 1\nreminiscences 1\nreminiscing 1\nremiss 1\nremit 1\nremitting 1\nremodel 1\nremodeled 1\nremonstrance 1\nremora 1\nremorseful 1\nrenderings 1\nrendezvoused 1\nrending 1\nrenditions 1\nrenegades 1\nrenege 1\nreneged 1\nreneging 1\nrenegotiating 1\nrenegotiation 1\nrenewals 1\nrenounced 1\nrenounces 1\nrenter 1\nreoffering 1\nreopens 1\nreordering 1\nreorganizes 1\nreorganizing 1\nrep. 
1\nrepackage 1\nrepackaging 1\nrepairable 1\nreparation 1\nrepasts 1\nrepayable 1\nrepaying 1\nrepeaters 1\nrepelled 1\nrepellent 1\nrepelling 1\nrepenting 1\nrepetitive 1\nrepition 1\nreplant 1\nreplaster 1\nreplays 1\nreplicas 1\nreplicate 1\nreplicated 1\nreplicating 1\nrepoire 1\nrepond 1\nreportorial 1\nreposition 1\nrepository 1\nrepossesed 1\nrepossess 1\nrepost 1\nreprehensible 1\nrepressing 1\nreprimanded 1\nreprint 1\nreprinted 1\nreprints 1\nreprise 1\nreproval 1\nreps 1\nrepudiation 1\nreq 1\nrequirments 1\nrequisitioned 1\nrerouted 1\nresales 1\nreschedulable 1\nrescheduling 1\nrescuer 1\nrescues 1\nresearches 1\nresellers 1\nresented 1\nreservers 1\nreserving 1\nreservists 1\nresettle 1\nresettled 1\nresettlement 1\nreshape 1\nreshaped 1\nreshufflings 1\nresided 1\nresiliently 1\nresins 1\nresocialization 1\nresoluteness 1\nreson 1\nresonant 1\nresonate 1\nresonates 1\nresound 1\nresounded 1\nresourse 1\nrespirator 1\nresponble 1\nrespondent 1\nresponse.headers 1\nresponsiblilty 1\nresponsilibity 1\nresponsiveness 1\nresprout 1\nrestarters 1\nrestating 1\nrestful 1\nrestfully 1\nrestive 1\nrestores 1\nrestraunt 1\nresurface 1\nresurrection 1\nresurrects 1\nresuscitating 1\nretaliates 1\nretalitory 1\nretardation 1\nrethinking 1\nreticence 1\nretinal 1\nretinue 1\nretinues 1\nretool 1\nretooling 1\nretools 1\nretorical 1\nretort 1\nretorted 1\nretraining 1\nretransmission 1\nretrieverr 1\nretrievers 1\nretrieving 1\nretroactively 1\nretrogress 1\nretrogressed 1\nretrospect 1\nretry 1\nreturnee 1\nreturnists 1\nreunify 1\nreunions 1\nrev 1\nrev1 1\nrevalue 1\nrevalued 1\nrevamp 1\nrevelers 1\nreveling 1\nrevelry 1\nrevels 1\nreverberate 1\nreverberated 1\nreverberations 1\nreverential 1\nreversibility 1\nreverting 1\nrevile 1\nreviled 1\nrevisionist 1\nrevisiting 1\nrevisted 1\nrevitalization 1\nrevivals 1\nrevives 1\nrevoltingly 1\nrevolver 1\nrevote 1\nrevved 1\nreworded 1\nreworking 1\nrewrapped 1\nrez 1\nrezoning 1\nrhapsody 1\nrhetorically 1\nrhino 
1\nrhodium 1\nrhododendron 1\nrhomboideus 1\nribavirin 1\nribosomal 1\nrichard 1\nriche 1\nrickets 1\nridding 1\nriddle 1\nrideable 1\nridiculing 1\nridiculousness 1\nriff 1\nriffs 1\nrigging 1\nrighted 1\nrightful 1\nrighthander 1\nrightist 1\nrightward 1\nrightwards 1\nrigidly 1\nrigorously 1\nriiight 1\nrile 1\nriles 1\nrill 1\nrimmed 1\nrind 1\nringer 1\nringlets 1\nringneck 1\nringnecks 1\nrink 1\nrinses 1\nriparian 1\nripened 1\nripens 1\nripoffs 1\nriposte 1\nripper 1\nrippling 1\nrips 1\nrisible 1\nriskiest 1\nrite 1\nritualistic 1\nritually 1\nrivaled 1\nrivaling 1\nriverbanks 1\nriverfront 1\nriyal 1\nroaches 1\nroars 1\nroasting 1\nrobber 1\nrobin 1\nrobs 1\nrobustly 1\nrobustness 1\nrocker 1\nrockers 1\nrocketing 1\nrockslides 1\nrodent 1\nrodeo 1\nroguelikes 1\nroil 1\nroiled 1\nrollback 1\nrollbacks 1\nrollercoaster 1\nrollover 1\nrollovers 1\nrollup 1\nroly 1\nromaine 1\nroman 1\nromances 1\nromancing 1\nromatic 1\nromney 1\nromp 1\nromps 1\nroofed 1\nroofers 1\nrook 1\nroomette 1\nrootless 1\nrosarians 1\nrosario.gonzales@compaq.com 1\nrosebush 1\nrosette 1\nrosie 1\nrotated 1\nroties 1\nrotor 1\nrotorua 1\nrototiller 1\nrotproof 1\nrouge 1\nrougher 1\nroughhouse 1\nroughneck 1\nroughnecks 1\nroughness 1\nroulette 1\nrounding 1\nroundly 1\nrousing 1\nroust 1\nroustabouts 1\nrouter 1\nrover 1\nrowed 1\nrowhouse 1\nroyalist 1\nroyally 1\nrpm 1\nrsjacobs@Encoreacq.com 1\nrtb 1\nrubbery 1\nrubdowns 1\nrubfests 1\nrudeness 1\nruefully 1\nrues 1\nruffled 1\nrugs 1\nruinous 1\nrumble 1\nrumbles 1\nrumpled 1\nrundown 1\nrunny 1\nrunup 1\nrupturing 1\nrustic 1\nrusticated 1\nrusting 1\nrustlers 1\nrustling 1\nrustlings 1\nrvalues 1\ns? 
1\nsabers 1\nsabot 1\nsabotaging 1\nsaboteurs 1\nsac 1\nsacer 1\nsackings 1\nsacristy 1\nsaddens 1\nsaddlebags 1\nsadism 1\nsadist 1\nsafekeeping 1\nsafes 1\nsainthood 1\nsaintly 1\nsalaam 1\nsalads 1\nsalamanders 1\nsalarymen 1\nsaleable 1\nsalesclerk 1\nsalesclerks 1\nsalesparson 1\nsaleswise 1\nsalh1 1\nsalicylate 1\nsalicylates 1\nsalicylic 1\nsalient 1\nsalle 1\nsallow 1\nsalted 1\nsalubrious 1\nsalutary 1\nsalutes 1\nsaluting 1\nsalvages 1\nsamantha 1\nsamovars 1\nsanctified 1\nsanctimonious 1\nsanctuary 1\nsandals 1\nsandalwood 1\nsandbag 1\nsandbags 1\nsandbox 1\nsanded 1\nsandpaper 1\nsandrock 1\nsandstone 1\nsandwiched 1\nsanitationists 1\nsanitize 1\nsanitized 1\nsanitizing 1\nsanity 1\nsap 1\nsapiens 1\nsaplings 1\nsapping 1\nsara 1\nsarakin 1\nsardine 1\nsardonically 1\nsarsaparilla 1\nsartorial 1\nsashes 1\nsashimi 1\nsate 1\nsatiric 1\nsatirized 1\nsaturate 1\nsaucepan 1\nsaucers 1\nsauerkraut 1\nsaunas 1\nsavagery 1\nsavehote 1\nsaver 1\nsavers 1\nsavor 1\nsavored 1\nsavors 1\nsaws 1\nsaxophonist 1\nsayings 1\nscabs 1\nscalable 1\nscalawags 1\nscalding 1\nscaleFactor 1\nscallions 1\nscallops 1\nscalp 1\nscalps 1\nscammed 1\nscamper 1\nscandal-ize 1\nscandalized 1\nscape 1\nscapegoating 1\nscapel 1\nscarecrow 1\nscarfing 1\nscariest 1\nscarring 1\nscatological 1\nscattering 1\nscavenged 1\nscavenger 1\nsceme 1\nsceptical 1\nschadenfreude 1\nschakalhund 1\nschemer 1\nschemers 1\nschizoid 1\nschmoozing 1\nschmumpered 1\nschnitzel 1\nschoolboy 1\nschoolchild 1\nschoolgirls 1\nschoolteachers 1\nschoolwork 1\nschumacher 1\nscientology 1\nscious 1\nscissors 1\nsclerosis 1\nscooted 1\nscooters 1\nscopes 1\nscorched 1\nscorching 1\nscore-wise 1\nscorekeeping 1\nscorers 1\nscornful 1\nscorpions 1\nscorpios 1\nscotched 1\nscotches 1\nscoundrel 1\nscoundrels 1\nscoured 1\nscourges 1\nscouring 1\nscowls 1\nscrabbling 1\nscrambles 1\nscraping 1\nscrappy 1\nscreeched 1\nscreed 1\nscreem 1\nscreenings 1\nscreensaver 1\nscreenwriter 1\nscrewball 1\nscribblers 
1\nscribbling 1\nscribblings 1\nscribe 1\nscribes 1\nscrimmage 1\nscrimped 1\nscriptwriter 1\nscriptwriters 1\nscriptwriting 1\nscrounged 1\nscrubbed 1\nscrubs 1\nscrutinizing 1\nscuba 1\nscuffled 1\nscuffles 1\nscuffling 1\nsculptor 1\nsculptural 1\nscurry 1\nscythes 1\nseaboard 1\nseacoasts 1\nseak 1\nsealant 1\nsealants 1\nsealings 1\nseamier 1\nseamless 1\nseamlessly 1\nseamstress 1\nseamstresses 1\nseaplane 1\nsear 1\nsearch* 1\nsearchers 1\nseared 1\nsearing 1\nseashells 1\nseasonality 1\nseatbelt 1\nseatrout 1\nseaweeds 1\nseceding 1\nsecluded 1\nseclusion 1\nsecondarily 1\nsecrecty 1\nsecretarial 1\nsecrete 1\nsecretes 1\nsecretions 1\nsecularism 1\nsecularized 1\nsecures 1\nsecurites 1\nsecuritiess 1\nsedate 1\nsedated 1\nsedatives 1\nsediments 1\nseducing 1\nseductive 1\nseedlings 1\nseeker 1\nseer 1\nseesawing 1\nsegmentation 1\nsegmented 1\nsegrationist 1\nsegregationist 1\nseige 1\nseisho 1\nseismographic 1\nselectiveness 1\nselects 1\nselflessness 1\nselloff 1\nselloffs 1\nseltzer 1\nselves 1\nsemesters 1\nsemi-custom 1\nsemi-finals 1\nsemi-gently 1\nsemi-intellectual 1\nsemi-liberated 1\nsemi-liquefied 1\nsemi-objective 1\nsemi-obscure 1\nsemi-official 1\nsemi-private 1\nsemi-professional 1\nsemi-retired 1\nsemi-skilled 1\nsemicolon 1\nsemifinished 1\nseminal 1\nsenile 1\nsensationalist 1\nsensations 1\nsensibly 1\nsensing 1\nsensitives 1\nsensitivities 1\nsensitize 1\nsensuality 1\nsentencings 1\nsentimentality 1\nsentinel 1\nseparates 1\nseparations 1\nseperates 1\nsepsis 1\nseptember 1\nseptembre 1\nsepticemia 1\nsequencing 1\nsequestering 1\nsequestration 1\nsequined 1\nsequins 1\nserf 1\nserfdom 1\nsergeants 1\nserivce 1\nsermon 1\nsermons 1\nserpentine 1\nserviced 1\nservings 1\nservos 1\nsevaraty 1\nseven-fold 1\nsevenfold 1\nsevering 1\nseverly 1\nsevices 1\nsewing 1\nsewn 1\nsexagenarians 1\nsexier 1\nsexing 1\nsexless 1\nsez 1\nsha 1\nshaaaaaaame 1\nshabby 1\nshackled 1\nshacks 1\nshaded 1\nshadier 1\nshadowed 1\nshadowing 1\nshafts 1\nshag 
1\nshaggy 1\nshags 1\nshaker 1\nshakers 1\nshal 1\nshallac 1\nshallower 1\nshallowness 1\nshalom 1\nsham 1\nshaman 1\nshameless 1\nshang 1\nshanks 1\nshantytown 1\nshapely 1\nshareholdings 1\nsharpening 1\nsharpens 1\nsharpness 1\nsharpshooters 1\nshashlik 1\nshattering 1\nshatters 1\nshaven 1\nshaves 1\nshaving 1\nshavings 1\nshawl 1\nsheaf 1\nsheared 1\nshearer 1\nshearing 1\nsheath 1\nsheen 1\nsheepish 1\nsheepishly 1\nsheepskin 1\nsheesh 1\nsheetrock 1\nsheiks 1\nsheltering 1\nshepherd 1\nshepherded 1\nsherbet 1\nsheriffs 1\nshi 1\nshibboleths 1\nshiest 1\nshills 1\nshimmered 1\nshimmering 1\nshimmers 1\nshimmied 1\nshin 1\nshing 1\nshingle 1\nshingles 1\nshipbuilders 1\nshipper 1\nshipwreck 1\nshirk 1\nshitholes 1\nshits 1\nshitty 1\nshocker 1\nshockproof 1\nshoeed 1\nshoehorned 1\nshoelaces 1\nshoemaker 1\nsholder 1\nshooter 1\nshopex 1\nshopfull 1\nshoplifter 1\nshoplifting 1\nshopping.com 1\nshorelines 1\nshortchanged 1\nshorted 1\nshortfalls 1\nshorthand 1\nshorting 1\nshortlists 1\nshortstop 1\nshortterm 1\nshouldering 1\nshoveling 1\nshowbusiness 1\nshowcased 1\nshowcasing 1\nshowman 1\nshravan 1\nshredders 1\nshrewder 1\nshrewdness 1\nshrieked 1\nshrinks 1\nshroud 1\nshrouds 1\nshrubmegiddo 1\nshrug 1\nshrugs 1\nshtick 1\nshucks 1\nshuffled 1\nshuo 1\nshutoff 1\nshuttering 1\nshutters 1\nshying 1\nsiamese 1\nsibling 1\nsickened 1\nsickest 1\nsickle 1\nsidaM 1\nsidelining 1\nsider 1\nsiders 1\nsidestepped 1\nsidesteps 1\nsidetrack 1\nsiecle 1\nsieve 1\nsieze 1\nsifted 1\nsighed 1\nsightseers 1\nsigil 1\nsignally 1\nsignboard 1\nsignficantly 1\nsignified 1\nsilenced 1\nsilences 1\nsilencing 1\nsilhouette 1\nsilhouetted 1\nsilkies 1\nsilted 1\nsimiliar 1\nsimmer 1\nsimpleton 1\nsimplicities 1\nsimplifications 1\nsimplz 1\nsimpsons 1\nsims 1\nsimulates 1\nsimulating 1\nsinatra 1\nsinecure 1\nsingle-fold 1\nsinglemindedly 1\nsingly 1\nsingularly 1\nsinker 1\nsinned 1\nsinners 1\nsinus 1\nsiphoning 1\nsipping 1\nsir 1\nsirens 1\nsitcoms 1\nsited 1\nsitter 
1\nsitu 1\nsix-fold 1\nsizeable 1\nskate 1\nskateboarding 1\nskein 1\nskeleton 1\nsketch 1\nsketchiest 1\nsketching 1\nskied 1\nskim 1\nskimmers 1\nskimming 1\nskimping 1\nskimpy 1\nskinned 1\nskips 1\nskirmish 1\nskirted 1\nskirting 1\nskit 1\nsku 1\nskulked 1\nskush@swbell.net 1\nskylight 1\nskyward 1\nslacked 1\nslacken 1\nslackening 1\nslacks 1\nslalom 1\nslam 1\nslammer 1\nslamming 1\nslane 1\nslang 1\nslanted 1\nslapdash 1\nslathering 1\nslaughterhous 1\nslaughtering 1\nslavish 1\nslavishly 1\nslaw 1\nslay 1\nslayings 1\nsleaze 1\nsled 1\nsledding 1\nsleds 1\nsleeper 1\nsleet 1\nsleight 1\nslender 1\nslighted 1\nslighty 1\nslimming 1\nslings 1\nslipshod 1\nslit 1\nslithered 1\nslithers 1\nslits 1\nslivered 1\nslli 1\nslobs 1\nslog 1\nslogs 1\nslop 1\nsloping 1\nsloshes 1\nslouch 1\nslowball 1\nslowdowns 1\nslugger 1\nsluicing 1\nslumps 1\nslurry 1\nslurs 1\nslush 1\nsly 1\nslyly 1\nsm 1\nsmacks 1\nsmall- 1\nsmarting 1\nsmarts 1\nsmarttags 1\nsmeared 1\nsmelt 1\nsmidgins 1\nsmiley 1\nsmite 1\nsmithereens 1\nsmoked 1\nsmokehouse 1\nsmoky 1\nsmolder 1\nsmoothest 1\nsmorgasbord 1\nsmothered 1\nsmug 1\nsmuggler 1\nsmuggles 1\nsmuks 1\nsnaffle 1\nsnafus 1\nsnaked 1\nsnaky 1\nsnappily 1\nsnapshot 1\nsnapshots 1\nsnatching 1\nsnatchings 1\nsnazzy 1\nsneaker 1\nsneakers 1\nsneer 1\nsneezing 1\nsnidely 1\nsniff 1\nsniggeringly 1\nsnip 1\nsniped 1\nsnipping 1\nsniveling 1\nsnobbery 1\nsnobbish 1\nsnobs 1\nsnoogans 1\nsnore 1\nsnorkel 1\nsnorting 1\nsnotty 1\nsnout 1\nsnowbirds 1\nsnowstorm 1\nsnowsuit 1\nsnowy 1\nsnubbed 1\nsnuck 1\nsnugly 1\nsoapbox 1\nsoaps 1\nsobriety 1\nsobriquet 1\nsociable 1\nsocio 1\nsocio-political 1\nsocio-spiritual 1\nsociobiology 1\nsocioeconomically 1\nsociological 1\nsock 1\nsodas 1\nsodden 1\nsoething 1\nsofas 1\nsoftball 1\nsoftens 1\nsofthearted 1\nsofties 1\nsoftly 1\nsoftwood 1\nsofty 1\nsoiled 1\nsoing 1\nsoirees 1\nsolarheated 1\nsolemnity 1\nsolenoid 1\nsoles 1\nsolicitor 1\nsolicitors 1\nsolicits 1\nsolidifying 1\nsolidity 
1\nsoliloquies 1\nsolvers 1\nsomatostatin 1\nsomersaulting 1\nsomethin' 1\nsommelier 1\nsonambulant 1\nsonnet 1\nsonnets 1\nsoo 1\nsoonest 1\nsooo 1\nsooth 1\nsoothsayer 1\nsophisms 1\nsophisticates 1\nsoporific 1\nsoprano 1\nsorbents 1\nsoreheads 1\nsosies 1\nsoulful 1\nsoulless 1\nsoundings 1\nsoups 1\nsourced 1\nsourcing 1\nsourdough 1\nsoured 1\nsouthwesterly 1\nsouvenirs 1\nsowed 1\nsows 1\nsoyega 1\nsp. 1\nspaceborn 1\nspaced 1\nspaceflight 1\nspaceships 1\nspacetime 1\nspades 1\nspake 1\nspamcomment 1\nspammed 1\nspammers 1\nspamming 1\nspammings 1\nspandex 1\nspaniel 1\nspaniels 1\nspanish 1\nspar 1\nsparrows 1\nspartan 1\nspat 1\nspatula 1\nspawns 1\nspear 1\nspearhead 1\nspearheading 1\nspears 1\nspecialised 1\nspeciality 1\nspecificity 1\nspecifies 1\nspecimen 1\nspeckle 1\nspeckled 1\nspeechless 1\nspeechlessly 1\nspeeders 1\nspeedier 1\nspeedup 1\nspenders 1\nspendthrifts 1\nspermologer 1\nspewed 1\nspiciness 1\nspicy 1\nspiders 1\nspiel 1\nspigots 1\nspil 1\nspines 1\nspinoffs 1\nspinsterhood 1\nspirochetes 1\nspittle 1\nspkg 1\nsplashing 1\nsplattered 1\nsplendorous 1\nsplinter 1\nsplintered 1\nsplints 1\nspoil 1\nspoilage 1\nspoilers 1\nspoiling 1\nspoked 1\nspokeman 1\nspokes 1\nsponges 1\nsponsered 1\nspontaneity 1\nspookiest 1\nspoon 1\nspoonbills 1\nspores 1\nsported 1\nsportif 1\nsportsmen 1\nspotlights 1\nspout 1\nsprained 1\nsprayer 1\nsprightly 1\nspringloaded 1\nspringtime 1\nsprinkler 1\nsprinklers 1\nsprint 1\nsprouted 1\nspt 1\nspunk 1\nspunky 1\nspurn 1\nspurning 1\nsputter 1\nsputtering 1\nspyglass 1\nsq 1\nsq. 1\nsquabble 1\nsquabbles 1\nsquabbling 1\nsquall 1\nsqualls 1\nsqualor 1\nsquash 1\nsquashed 1\nsquat 1\nsquawk 1\nsqueaking 1\nsqueaks 1\nsqueegee 1\nsqueezable 1\nsqueezes 1\nsquelch 1\nsquelched 1\nsquid 1\nsquiggly 1\nsquinted 1\nsquirelled 1\nsquirm 1\nsquirrel 1\nsquirts 1\nsry 1\nsssssr 1\nst 1\nst. 
1\nstabbing 1\nstabilizer 1\nstaccato 1\nstacker 1\nstagewhispers 1\nstagnate 1\nstair 1\nstaircases 1\nstairways 1\nstakeholders 1\nstalks 1\nstallion 1\nstamped 1\nstampings 1\nstances 1\nstanch 1\nstandalone 1\nstandbys 1\nstandings 1\nstandup 1\nstanford 1\nstaphylococcus 1\nstarchy 1\nstares 1\nstargods 1\nstarkly 1\nstarlets 1\nstashed 1\nstaters 1\nstatesmanship 1\nstatism 1\nstatistic 1\nstatistician 1\nstatisticians 1\nstatuettes 1\nstatutorily 1\nstead 1\nsteadfastness 1\nsteadier 1\nsteadiness 1\nsteadying 1\nsteages 1\nsteakburgers 1\nsteamers 1\nsteaming 1\nsteamroller 1\nsteamrollered 1\nsteams 1\nsteamships 1\nsteamy 1\nsteeled 1\nsteels 1\nsteeple 1\nsteeps 1\nstejnegeri 1\nstepchildren 1\nstepside 1\nstereographic 1\nstereotypes 1\nstereotypically 1\nsteriles 1\nsterility 1\nsterilize 1\nsterilizers 1\nsterllite 1\nsteskically 1\nstewards 1\nstewardship 1\nstickier 1\nstickiness 1\nstickler 1\nsticthcing 1\nstiffen 1\nstiffness 1\nstifled 1\nstifles 1\nstiletto 1\nstilll 1\nstilts 1\nstimulant 1\nstimulative 1\nstimulator 1\nstinger 1\nstingier 1\nstingrays 1\nstipends 1\nstipes 1\nstippled 1\nstipulation 1\nstirrings 1\nstitching 1\nstitute 1\nstockbrokerage 1\nstockbuilding 1\nstockholm 1\nstockpiling 1\nstockroom 1\nstockyards 1\nstodgy 1\nstoic 1\nstoking 1\nstolid 1\nstomachache 1\nstomped 1\nstoned 1\nstonewalled 1\nstonework 1\nstony 1\nstooge 1\nstools 1\nstoopery 1\nstooping 1\nstopcock 1\nstopovers 1\nstoppers 1\nstorability 1\nstoreowners 1\nstoreroom 1\nstorey 1\nstoried 1\nstork 1\nstormier 1\nstorming 1\nstorybooks 1\nstoryline 1\nstowaway 1\nstowaways 1\nstr1 1\nstraddling 1\nstrafe 1\nstraightaway 1\nstraightened 1\nstraighter 1\nstraightforwardness 1\nstrainers 1\nstraitjacket 1\nstranding 1\nstrangest 1\nstranglehold 1\nstrangler 1\nstrangles 1\nstrangling 1\nstrangulation 1\nstrata 1\nstratagems 1\nstratified 1\nstratigraphic 1\nstratigraphy 1\nstratocaster 1\nstratus 1\nstrawberries 1\nstrawberry 1\nstrawman 1\nstraying 
1\nstreaked 1\nstreaky 1\nstreamers 1\nstreetcorner 1\nstreetscape 1\nstreetwalkers 1\nstrenuous 1\nstressors 1\nstretcher 1\nstripe 1\nstriven 1\nstrivers 1\nstrolled 1\nstrolling 1\nstrongholds 1\nstrop 1\nstrove 1\nstructively 1\nstruggle. 1\nstrung 1\nstrut 1\nstubbornness 1\nstubby 1\nstucco 1\nstudiousness 1\nstuffs 1\nstuffy 1\nstultified 1\nstumped 1\nstunk 1\nstunted 1\nstupidities 1\nsturdier 1\nstutter 1\nstuttering 1\nstylist 1\nstylists 1\nstylized 1\nstyloid 1\nstyrene 1\nstyrofoam 1\nsub-Saharan 1\nsub-categories 1\nsub-committee 1\nsub-culture 1\nsub-division 1\nsub-minimum 1\nsub-ministerial 1\nsub-par 1\nsub-station 1\nsub-underwriters 1\nsub-underwriting 1\nsub-woofer 1\nsub-zero 1\nsubchapter 1\nsubcommitee 1\nsubcompacts 1\nsubconferences 1\nsubconsciously 1\nsubcontinent 1\nsubcontractor 1\nsubdirector 1\nsubdirectories 1\nsubdivisions 1\nsubgroup 1\nsubjecting 1\nsublet 1\nsublicense 1\nsublimated 1\nsubliminal 1\nsubmariners 1\nsubmerge 1\nsubmerges 1\nsubmersion 1\nsubmissiveness 1\nsubnational 1\nsubproject 1\nsubscribing 1\nsubservience 1\nsubset 1\nsubside 1\nsubsididzed 1\nsubsidization 1\nsubsidizes 1\nsubsistence 1\nsubspecies 1\nsubsume 1\nsubsumed 1\nsubterraneous 1\nsubtitles 1\nsubtlety 1\nsubtly 1\nsubtropical 1\nsuburbanites 1\nsuburbia 1\nsubversive 1\nsubverted 1\nsubverts 1\nsubwoofer 1\nsuccesful 1\nsuccinct 1\nsuccint 1\nsuccintly 1\nsuckups 1\nsuction 1\nsuddently 1\nsufferer 1\nsufferings 1\nsufficed 1\nsufficiency 1\nsuffixes 1\nsuffocate 1\nsuffocating 1\nsuffragette 1\nsugarcane 1\nsugarcoated 1\nsugars 1\nsugary 1\nsuicidality 1\nsuied 1\nsuject 1\nsul 1\nsulfate 1\nsulfites 1\nsulfurous 1\nsullen 1\nsultan 1\nsultanate 1\nsummarised 1\nsummertime 1\nsummery 1\nsummon 1\nsummoning 1\nsumo 1\nsumptuous 1\nsunburn 1\nsunburned 1\nsundry 1\nsunflowers 1\nsunsets 1\nsuper-charger 1\nsuper-exciting 1\nsuper-expensive 1\nsuper-fast 1\nsuper-national 1\nsuper-old 1\nsuper-pole 1\nsuper-specialized 1\nsuper-spy 1\nsuper-teams 
1\nsuper-user 1\nsuper-youth 1\nsuperagent 1\nsuperalloy 1\nsuperannuated 1\nsuperbly 1\nsuperbowl 1\nsupercede 1\nsuperceded 1\nsupercuts 1\nsupered 1\nsuperefficient 1\nsuperfast 1\nsuperfluities 1\nsuperfluous 1\nsuperimposed 1\nsuperintendant 1\nsupermainframe 1\nsupernovae 1\nsuperpowers 1\nsupersafe 1\nsupersede 1\nsuperseded 1\nsuperstardom 1\nsuperstars 1\nsuperstition 1\nsuperstitions 1\nsuperstructure 1\nsuperwoman 1\nsupplanting 1\nsupplementing 1\nsupportable 1\nsuppository 1\nsupraventricular 1\nsupressor 1\nsur 1\nsurfacing 1\nsurfers 1\nsurmise 1\nsurmounting 1\nsurrealistic 1\nsurreally 1\nsurrenders 1\nsurreptitiously 1\nsurtaxes 1\nsurveil 1\nsurvivable 1\nsus 1\nsusceptibility 1\nsuspense 1\nsuspiciously 1\nsustenance 1\nsuvivors 1\nswabs 1\nswagger 1\nswankier 1\nswarm 1\nswarming 1\nswastika 1\nswat 1\nswathed 1\nswears 1\nsweated 1\nsweatshirt 1\nsweepingly 1\nsweetie 1\nsweetly 1\nsweltering 1\nswerve 1\nswiffer 1\nswig 1\nswindling 1\nswine 1\nswingers 1\nswitchers 1\nswiveling 1\nswivels 1\nswooning 1\nswooping 1\nsycamore 1\nsycophancy 1\nsycophants 1\nsyd 1\nsyllable 1\nsyllables 1\nsyllogistic 1\nsymbology 1\nsymmetrical 1\nsymmetry 1\nsynchopated 1\nsynchronization 1\nsynchronously 1\nsyncing 1\nsyndciated 1\nsyndications 1\nsynergies 1\nsyngeries 1\nsynonym 1\nsynthesize 1\nsynthesized 1\nsynthesizers 1\nsynthetics 1\nsynthroid 1\nsyringe 1\nsystematize 1\nsystematizing 1\nt' 1\nt's 1\nt-shirts 1\ntabacs 1\ntabby 1\ntabled 1\ntablemodel 1\ntablespoon 1\ntablet 1\ntaboos 1\ntabt_shaa@hotmail.com 1\ntabulates 1\ntabulating 1\ntachycardia 1\ntacking 1\ntackled 1\ntagline 1\ntailback 1\ntailgated 1\ntailgating 1\ntailing 1\ntailstock 1\ntaiwania 1\ntaketh 1\ntakingly 1\ntaleofterra.com 1\ntalismat 1\ntalkback 1\ntalkerics 1\ntalkers 1\ntalkie 1\ntalkies 1\ntaming 1\ntampers 1\ntampons 1\ntang 1\ntangential 1\ntangibles 1\ntango 1\ntangoed 1\ntangy 1\ntanked 1\ntankful 1\ntankmates 1\ntanned 1\ntannin 1\ntanning 1\ntans 1\ntantrums 1\ntao 
1\ntapering 1\ntapestries 1\ntapestry 1\ntapeworms 1\ntapings 1\ntaps 1\ntarpaulins 1\ntarred 1\ntartan 1\ntartans 1\ntartness 1\ntaserings 1\ntasked 1\ntasking 1\ntassel 1\ntasseled 1\ntassels 1\ntasteful 1\ntastier 1\ntattered 1\ntaunted 1\ntaunting 1\ntaut 1\ntavern 1\ntaxicab 1\ntaxiing 1\ntaxlow 1\ntbsp 1\nteachings 1\nteacup 1\nteammates 1\ntearful 1\ntearfully 1\nteary 1\ntease 1\nteaspoons 1\ntechnocrat 1\ntechnocratic 1\ntechnocrats 1\ntechnologist 1\ntechs 1\nteemed 1\nteeny 1\ntees 1\nteetering 1\ntele 1\ntelecast 1\ntelecines 1\ntelecommuting 1\ntelecoms 1\nteleconference 1\ntelegrams 1\ntelegraphic 1\ntelegraphs 1\ntelephoning 1\ntelephotograph 1\ntelesystems 1\ntelevangelism 1\ntelevising 1\ntelevison 1\ntelevized 1\nteleworking 1\ntellingly 1\ntelltale 1\ntemerity 1\ntemperamental 1\ntempest 1\ntemporal 1\ntemps 1\ntempts 1\ntempura 1\ntenaciously 1\ntendancy 1\ntendentious 1\ntenderizer 1\ntenderness 1\ntenpin 1\ntentacles 1\ntenterhooks 1\ntenuously 1\ntepid 1\nterain 1\nterminally 1\nterminations 1\ntermite 1\nterraced 1\nterrazzo 1\nterrify 1\nterrorisms 1\nterroristic 1\nterse 1\ntestament 1\ntestaments 1\ntestosterone 1\ntetanus 1\ntethered 1\ntetrodotoxin 1\nteutonic 1\ntexas 1\ntexting 1\ntextures 1\ntgs 1\nth 1\nthailand 1\nthankless 1\nthatched 1\nthaw 1\nthawing 1\ntheeth 1\nthei 1\ntheistic 1\ntheocracy 1\ntheocratic 1\ntheologian 1\ntheology 1\nther 1\ntheraphy 1\ntherapists 1\nthereupon 1\nthermodynamics 1\nthermosetting 1\nthermostats 1\nthesaurus 1\ntheses 1\nthespian 1\ntheta 1\nthey're 1\nthickens 1\nthiking 1\nthine 1\nthinness 1\nthinning 1\nthinset 1\nthirdquarter 1\nthislast 1\nthistles 1\ntho 1\nthorougly 1\nthouroughly 1\nthr 1\nthreadlock 1\nthreatcon 1\nthree-star 1\nthrees 1\nthriftily 1\nthrills 1\nthrips 1\nthrives 1\nthrobbing 1\nthrobs 1\nthrones 1\nthronged 1\nthrow-away 1\nthrowaway 1\nthrusters 1\nthrusts 1\nthugging 1\nthumbnail 1\nthundered 1\nthwapping 1\nthx 1\nthyself 1\nti 1\ntibetan 1\ntickbox 1\nticked 
1\nticketing 1\ntickle 1\ntidbits 1\ntiddler 1\ntidings 1\ntidying 1\ntiered 1\ntiff 1\ntightener 1\ntikki 1\ntiled 1\ntiller 1\ntilth 1\ntilts 1\ntimberland 1\ntimeframe 1\ntimeliness 1\ntimescape 1\ntimesheet 1\ntimpani 1\ntincture 1\ntinctures 1\ntinged 1\ntinges 1\ntingles 1\ntiniest 1\ntins 1\ntint 1\ntipple 1\ntippling 1\ntipsters 1\ntiptoe 1\ntiptoed 1\ntirade 1\ntiramisu 1\ntiredness 1\ntiring 1\ntitanate 1\ntiter 1\ntitillating 1\ntittering 1\ntitular 1\ntks 1\ntm 1\ntoadies 1\ntoaster 1\ntoasting 1\ntoccata 1\ntoda 1\ntoddlers 1\ntoehold 1\ntoeholds 1\ntoenail 1\ntoga 1\ntogetherness 1\ntogethers 1\ntoiled 1\ntokens 1\ntolerating 1\ntolled 1\ntollroad 1\ntollways 1\ntomato 1\ntomatos 1\ntombed 1\ntomboy 1\ntoms419 1\ntonal 1\ntoner 1\ntoney 1\ntongued 1\ntonics 1\ntonightttt 1\ntonnages 1\ntonne 1\ntoolbox 1\ntooled 1\ntoolset 1\ntoothed 1\ntoothpicks 1\ntopgrade 1\ntopical 1\ntopicality 1\ntopicsissues 1\ntopless 1\ntopline 1\ntopmost 1\ntopophobic 1\ntoppers 1\ntoppings 1\ntopples 1\ntoppling 1\ntopsy 1\ntorchbearers 1\ntormented 1\ntorpedoes 1\ntorrent 1\ntorrents 1\ntorrid 1\ntort 1\ntortoises 1\ntorts 1\ntorturing 1\ntorturous 1\ntossers 1\ntosses 1\ntot 1\ntotality 1\ntotalling 1\ntoted 1\ntoter 1\ntotter 1\ntottering 1\ntoughening 1\ntownhouses 1\ntows 1\ntoxicity 1\ntoxicologist 1\ntoxicology 1\ntoxins 1\ntoyed 1\ntoyota 1\ntrac 1\ntracer 1\ntradedistorting 1\ntraditionalist 1\ntraditionalists 1\ntraditionnelles 1\ntraduce 1\ntraduced 1\ntrafficker 1\ntragicomic 1\ntrainable 1\ntrainings 1\ntraitress 1\ntramping 1\ntranquil 1\ntrans-Pacific 1\ntransacted 1\ntransatlantic 1\ntransbay 1\ntransceiver 1\ntranscendence 1\ntranscendent 1\ntranscending 1\ntranscribe 1\ntranscribed 1\ntranscribers 1\ntranscription 1\ntransducers 1\ntransferability 1\ntransfered 1\ntransferrable 1\ntransformer 1\ntransfusions 1\ntransgendered 1\ntransgresses 1\ntransgressions 1\ntransited 1\ntransitions 1\ntransitory 1\ntransitting 1\ntranslucent 1\ntransluscent 
1\ntransmitters 1\ntransmogrified 1\ntransparently 1\ntransportable 1\ntransporter 1\ntransporters 1\ntransvestites 1\ntrashed 1\ntravail 1\ntraveller 1\ntravelogues 1\ntraversed 1\ntraversing 1\ntrazadone 1\ntreadmill 1\ntreadmills 1\ntreasonable 1\ntreasurers 1\ntreasuring 1\ntreatable 1\ntreatise 1\ntreatises 1\ntreble 1\ntreehouse 1\ntrek 1\ntrekked 1\ntrembling 1\ntremblor 1\ntremulous 1\ntrenchant 1\ntrendies 1\ntrending 1\ntrendsetter 1\ntrespasses 1\ntress 1\ntri-colored 1\ntri-top 1\ntriage 1\ntribesmen 1\ntribune 1\ntricked 1\ntrickier 1\ntrickster 1\ntricycle 1\ntrifle 1\ntrifles 1\ntrills 1\ntrimed 1\ntrimesters 1\ntrimmers 1\ntrims 1\ntrinity-world.com 1\ntripe 1\ntriphosphorous 1\ntriples 1\ntripping 1\ntrisodium 1\ntristate 1\ntriumphantly 1\ntriumphs 1\ntrivandrum 1\ntrivia 1\ntrivialize 1\ntrodden 1\ntroll 1\ntrolling 1\ntrooper 1\ntroopers 1\ntrop 1\ntropics 1\ntrotting 1\ntroubleshooting 1\ntrounce 1\ntrounced 1\ntrouncing 1\ntrout 1\ntrove 1\ntrowel 1\ntrucker 1\ntruckin' 1\ntruculence 1\ntruely 1\ntruest 1\ntruism 1\ntrump 1\ntrumpets 1\ntrundles 1\ntrussed 1\ntrustful 1\ntrustworthiness 1\ntrustworthy 1\ntrusty 1\ntruthfully 1\ntryed 1\ntse 1\ntseng 1\ntsun 1\ntu 1\ntubs 1\ntubular 1\ntugboat 1\ntugged 1\ntugging 1\ntuina 1\ntulip 1\ntulips 1\ntulle 1\ntulsi 1\ntuly 1\ntumbledown 1\ntuner 1\ntungsten 1\ntunneling 1\ntuo 1\nturbid 1\nturboprops 1\nturd 1\ntureen 1\nturfs 1\nturgid 1\nturkistan 1\nturmoils 1\nturnarounds 1\nturnip 1\nturnkey 1\nturnovers 1\nturquoise 1\nturvy 1\ntusks 1\ntussle 1\ntutelage 1\ntutor 1\ntutored 1\ntutorial 1\ntuxedos 1\ntwang 1\ntwangy 1\ntweak 1\ntweaking 1\ntweaks 1\ntweed 1\ntweeting 1\ntweety 1\ntwelvefold 1\ntwiddling 1\ntwinky 1\ntwirling 1\ntwitch 1\ntwits 1\ntwopoint 1\ntwos 1\ntwotiered 1\ntwpeters 1\ntycoons 1\ntyke 1\ntyped 1\ntyphoid 1\ntypified 1\ntypographers 1\ntypographical 1\ntyrannize 1\ntyrants 1\ntyres 1\ntyvek 1\nu.k 1\nuVme 1\nuchideshi 1\nudnjob 1\nufc 1\nugliest 1\nuhh 1\nuk 1\nukulele 
1\nulcers 1\nultimatums 1\nultra-Zionist 1\nultra-hip 1\nultra-right 1\nultra-safe 1\nultra-sensitive 1\nultra-thin 1\nultramodern 1\numlaut 1\nump 1\numpires 1\numpteen 1\nun-Asian 1\nun-Christian 1\nun-Islamic 1\nun-Swiss 1\nun-noticed 1\nun-ruly 1\nunabatingly 1\nunaccomplished 1\nunaccountably 1\nunachievable 1\nunacknowledged 1\nunadulterated 1\nunaffordable 1\nunaltered 1\nunamended 1\nunamortized 1\nunanimity 1\nunannounced 1\nunanswerable 1\nunappealing 1\nunapproachable 1\nunapproved 1\nunasked 1\nunassailable 1\nunassuming 1\nunaudited 1\nunavailability 1\nunawares 1\nunbanning 1\nunbecoming 1\nunbeknownst 1\nunbelief 1\nunbelieving 1\nunbleached 1\nunblemished 1\nunblinking 1\nunblock 1\nunblocked 1\nunbounded 1\nunbridled 1\nunbroken 1\nunburdened 1\nuncalculated 1\nuncannily 1\nuncapping 1\nunceasing 1\nunceasingly 1\nunceremoniously 1\nunchlorinated 1\nunclaimed 1\nuncleaned 1\nuncles 1\nuncollaborated 1\nuncombed 1\nuncomfortably 1\nuncompetitive 1\nuncon 1\nunconcious 1\nunconditional 1\nuncongested 1\nunconscionable 1\nunconsciously 1\nunconstrained 1\nuncontaminated 1\nuncontested 1\nuncontrollable 1\nuncontrolled 1\nunconventionality 1\nunconvinced 1\nuncooperative 1\nuncorrupted 1\nuncountable 1\nunction 1\nuncultured 1\nuncured 1\nundated 1\nundaunted 1\nunde 1\nunde$tood 1\nundead 1\nundeclared 1\nundecorated 1\nundercapitalized 1\nundercooked 1\nundercount 1\nundercurrents 1\nunderdog 1\nunderdressed 1\nunderemployed 1\nunderestimates 1\nunderestimating 1\nunderfoot 1\nundergarment 1\nundergarments 1\nundergirding 1\nundergoes 1\nundergrads 1\nundergraduates 1\nundergrowth 1\nunderinflate 1\nunderlie 1\nundernourished 1\nunderpaid 1\nunderpass 1\nunderperformed 1\nunderperformers 1\nunderperforms 1\nunderpinnings 1\nunderreacting 1\nundersea 1\nunderside 1\nundersized 1\nundersold 1\nunderstaffing 1\nunderstaffs 1\nunderstandings 1\nunderstating 1\nunderstorey 1\nundersubscription 1\nundertakes 1\nundertone 1\nundervaluing 1\nundervote 
1\nunderweighted 1\nunderwhelmed 1\nunderworked 1\nunderwrote 1\nundescribable 1\nundeserved 1\nundeserving 1\nundetected 1\nundiluted 1\nundiversifiable 1\nundocking 1\nundress 1\nundrstood 1\nundying 1\nunearthing 1\nuneasiness 1\nuneconomic 1\nunedited 1\nunemployable 1\nunencoded 1\nunenthusiastic 1\nunenviable 1\nunequivocal 1\nunerringly 1\nuneveness 1\nuneventful 1\nunexpended 1\nunexploited 1\nunexplored 1\nunfailingly 1\nunfairness 1\nunfashionable 1\nunfathomable 1\nunfertile 1\nunfertilized 1\nunfettered 1\nunfililal 1\nunfixed 1\nunflaggingly 1\nunflaky 1\nunflappable 1\nunforgiving 1\nunfunded 1\nungainly 1\nungentlemanly 1\nunglamorous 1\nungodly 1\nungrateful 1\nunguarded 1\nunguided 1\nunheeded 1\nunhelpful 1\nunheroic 1\nunhocked 1\nunhurt 1\nunhusked 1\nunicycle 1\nunidentifiable 1\nunificators 1\nunifying 1\nunimaginative 1\nunimproved 1\nunincorporated 1\nuninfected 1\nuninformative 1\nuninhabitable 1\nuninitiated 1\nuninjured 1\nuninstructed 1\nuninsurable 1\nuninterested 1\nuninteresting 1\nuninterruptable 1\nuninterruptedly 1\nunionists 1\nuniramous 1\nunitary 1\nunjustified 1\nunjustly 1\nunlamented 1\nunleashing 1\nunlinked 1\nunlisted 1\nunlocks 1\nunlovable 1\nunlovely 1\nunmaintained 1\nunmaking 1\nunmapped 1\nunmask 1\nunmasks 1\nunmelodic 1\nunmentioned 1\nunmet 1\nunmitigated 1\nunmoderated 1\nunmourned 1\nunmoved 1\nunnamable 1\nunnatural 1\nunnoticeable 1\nunnumbered 1\nunobservable 1\nunobtrusive 1\nunopened 1\nunopposable 1\nunopposed 1\nunorganized 1\nunorthodox 1\nunoxidized 1\nunpacking 1\nunparalleled 1\nunpardonably 1\nunpeace 1\nunperformed 1\nunpleasantness 1\nunplug 1\nunpolarizing 1\nunpolitical 1\nunpopularity 1\nunpredicted 1\nunprivate 1\nunproductive 1\nunprotected 1\nunquenchable 1\nunquestioned 1\nunravelling 1\nunread 1\nunreadable 1\nunrecoverable 1\nunreinforced 1\nunreliable 1\nunremittingly 1\nunrepentant 1\nunrestricted 1\nunrighteous 1\nunrivaled 1\nunroll 1\nunrolls 1\nunsavory 1\nunscrupulously 1\nunseal 
1\nunsealed 1\nunseat 1\nunselfish 1\nunselfishly 1\nunsentimental 1\nunsettlement 1\nunshackled 1\nunshakable 1\nunshakeable 1\nunshirkable 1\nunsophisticated 1\nunspeakable 1\nunspeakably 1\nunspoken 1\nunsubordinated 1\nunsubscribe 1\nunsuitability 1\nunsuited 1\nunsuspecting 1\nunswagged 1\nunswaggering 1\nunswerving 1\nunswervingly 1\nuntainted 1\nuntamable 1\nuntamed 1\nuntenable 1\nuntidiness 1\nuntradeable 1\nuntraditionally 1\nuntrendy 1\nuntried 1\nuntrustworthy 1\nuntruthfulness 1\nunvaccinated 1\nunvaryingly 1\nunverifiable 1\nunwarrantedly 1\nunwary 1\nunwed 1\nunwell 1\nunwinding 1\nunworkable 1\nupholds 1\nuploading 1\nupmarket 1\nupped 1\nuppermost 1\nuprooted 1\nuprooting 1\nupsets 1\nupsmanship 1\nuptake 1\nuptempo 1\nuptick 1\nuptrend 1\nure 1\nurinary 1\nurination 1\nury 1\nusana 1\nusana...@gmail.com 1\nusefuleness 1\nuseing 1\nuselessly 1\nushering 1\nushers 1\nusre 1\nusurpation 1\nususal 1\nutf8 1\nuthja 1\nuthke 1\nutilitarian 1\nutilizes 1\nutmosts 1\nutopia 1\nutopians 1\nuttering 1\nuv 1\nv70 1\nvacate 1\nvacationed 1\nvaccinated 1\nvacillate 1\nvacillating 1\nvacuous 1\nvagabonds 1\nvagrant 1\nvaguest 1\nval 1\nvaledictory 1\nvalentine 1\nvaliant 1\nvaliantly 1\nvalidate 1\nvalidated 1\nvalidly 1\nvalor 1\nvaluables 1\nvaluations 1\nvampires 1\nvancomycin 1\nvane 1\nvangie.mcgilloway@powersrc.com 1\nvanishes 1\nvanishing 1\nvanmark 1\nvanquish 1\nvanquished 1\nvaporize 1\nvar 1\nvar. 
1\nvariability 1\nvariant 1\nvarie$ 1\nvarietal 1\nvariously 1\nvarnell 1\nvase 1\nvassals 1\nvastness 1\nvato 1\nvauluations 1\nveal 1\nvectored 1\nvedette 1\nveer 1\nveering 1\nvegan 1\nvege 1\nvegetarianism 1\nvegetative 1\nveggie 1\nvehemence 1\nvehicular 1\nveils 1\nveined 1\nvelocities 1\nvenal 1\nvending 1\nvenerate 1\nvenerating 1\nveneration 1\nvenereal 1\nvenison 1\nvenomous 1\nvenomously 1\nvented 1\nvents 1\nventuresome 1\nventuring 1\nverbiage 1\nverdant 1\nverged 1\nverifies 1\nverizon 1\nvermouth 1\nvernacular 1\nvernier 1\nverso 1\nverve 1\nvestigial 1\nvesting 1\nvesuvius 1\nveterin 1\nveterinarian 1\nveterinarians 1\nveterinary 1\nvetoes 1\nvetoing 1\nvexatious 1\nvexed 1\nviaduct 1\nviaducts 1\nvibe 1\nvibes 1\nvice-Director 1\nvice-Foreign 1\nvice-Premier 1\nvice-bureau 1\nvice-chairmen 1\nvice-mayor 1\nvice-mayors 1\nvice-presidential 1\nvice-presidents 1\nvice-prime 1\nviceroy 1\nvictors 1\nvideoconference 1\nvideoconferencing 1\nvideodisk 1\nvideodisks 1\nvideophiles 1\nvideotaped 1\nvied 1\nvietnam 1\nviewership 1\nviewings 1\nvignette 1\nvijay.suchdev@funb.com 1\nvileness 1\nvilification 1\nvilla 1\nvillager 1\nvillas 1\nvilli 1\nvillians 1\nvindicate 1\nvindicated 1\nvindication 1\nvine 1\nvines 1\nvineyard 1\nvineyards 1\nvinylon 1\nviolators 1\nviolee 1\nviolins 1\nvirginia 1\nvirgins 1\nvirility 1\nvirologist 1\nvirtuosos 1\nvirtuousness 1\nvis-à-vis 1\nvisages 1\nviscous 1\nvisitation 1\nvista 1\nvisualisations 1\nvisualization 1\nvisualizations 1\nvisuals 1\nvitae 1\nvitiate 1\nvitriolic 1\nvitro 1\nvituperation 1\nviva 1\nvivendi 1\nvivisection 1\nvocalizations 1\nvocation 1\nvocational 1\nvole 1\nvols 1\nvoltmeter 1\nvolts 1\nvolunteerily 1\nvolunteerism 1\nvomica 1\nvomited 1\nvonfox 1\nvoracious 1\nvounty 1\nvowels 1\nvoyeurism 1\nvpts 1\nvulnerablility 1\nwaaaaaaaaaaaaay 1\nwaaay 1\nwaddles 1\nwade 1\nwaded 1\nwafers 1\nwafted 1\nwaged 1\nwagering 1\nwagging 1\nwaggishly 1\nwagin 1\nwags 1\nwail 1\nwailed 1\nwaistline 1\nwaiter 
1\nwaitress 1\nwaitresses 1\nwaives 1\nwalkin 1\nwalkover 1\nwalkway 1\nwallabies 1\nwallcoverings 1\nwalloping 1\nwallowing 1\nwallows 1\nwalneck 1\nwalrus 1\nwaltz 1\nwand 1\nwanderings 1\nwanders 1\nwanes 1\nwantonly 1\nwantons 1\nwanzhe 1\nwarded 1\nware 1\nwarfighter 1\nwarily 1\nwarlike 1\nwarmheartedness 1\nwarmonger 1\nwarp 1\nwarps 1\nwarranted 1\nwarranteed 1\nwarts 1\nwashable 1\nwashbasins 1\nwastefully 1\nwastepaper 1\nwatan 1\nwatchful 1\nwatchmaker 1\nwatchman 1\nwatchword 1\nwatercolor 1\nwatercourse 1\nwatermelon 1\nwattage 1\nwavelengths 1\nwavered 1\nwaxing 1\nwaybill 1\nwayside 1\nwaystation 1\nweakling 1\nweakly 1\nweal 1\nweaned 1\nweaponsmaking 1\nwearer 1\nwearily 1\nweariness 1\nweasel 1\nweasling 1\nweatherbeaten 1\nweatherman 1\nweaver 1\nweb-centric 1\nwebfriends 1\nweblogs.us 1\nwebmas...@globelingerie.com 1\nwedges 1\nwednesday 1\nweds 1\nweeding 1\nweeknights 1\nweepers 1\nweighting 1\nweightings 1\nweightlifting 1\nweir 1\nweirdest 1\nwelders 1\nwelding 1\nwellbeing 1\nwelled 1\nwellplaced 1\nwellrun 1\nwelter 1\nwendan 1\nwendy 1\nwesterners 1\nwesternization 1\nwestin.com. 
1\nwether 1\nwetlands 1\nwetter 1\nwhacker 1\nwhacky 1\nwhales 1\nwhammy 1\nwheelbases 1\nwheellike 1\nwheezing 1\nwhence 1\nwhereupon 1\nwherewithal 1\nwhhich 1\nwhiffs 1\nwhigism 1\nwhigs 1\nwhimpering 1\nwhimpers 1\nwhimsically 1\nwhimsy 1\nwhiner 1\nwhining 1\nwhiplash 1\nwhippings 1\nwhipsaw 1\nwhipsawing 1\nwhirl 1\nwhirlpools 1\nwhirlwinds 1\nwhirring 1\nwhisk 1\nwhiskery 1\nwhisper 1\nwhistleblowing 1\nwhistled 1\nwhistles 1\nwhistling 1\nwhit 1\nwhitehouse.gov 1\nwhiteness 1\nwhiter 1\nwhitewalled 1\nwhiz 1\nwhizzes 1\nwholesales 1\nwhomever 1\nwhooosh 1\nwhooper 1\nwhoopsie 1\nwhore 1\nwick 1\nwickedly 1\nwickedness 1\nwidget 1\nwidgets 1\nwidowed 1\nwiener 1\nwiggled 1\nwiggling 1\nwight 1\nwiki 1\nwikipedia 1\nwil 1\nwildest 1\nwildfire 1\nwildfires 1\nwildflowers 1\nwildness 1\nwilds 1\nwile 1\nwilful 1\nwillfulness 1\nwillies 1\nwillin 1\nwillingess 1\nwillinging 1\nwilll 1\nwillow 1\nwillpower 1\nwilly 1\nwimp 1\nwimpering 1\nwimping 1\nwinches 1\nwinded 1\nwindless 1\nwindowless 1\nwindsheild 1\nwindshiel 1\nwinemakers 1\nwinery 1\nwingbeat 1\nwinger 1\nwingspan 1\nwining 1\nwinking 1\nwinnnig 1\nwinnowing 1\nwinter. 
1\nwintering 1\nwipeout 1\nwipers 1\nwipes 1\nwipper 1\nwireline 1\nwiretaps 1\nwishbone 1\nwistful 1\nwistfully 1\nwitha 1\nwits 1\nwitted 1\nwittingly 1\nwll 1\nwobble 1\nwodges 1\nwoebegone 1\nwoefully 1\nwoken 1\nwolfed 1\nwolves 1\nwonderbars 1\nwonderment 1\nwoodchucks 1\nwoodrow 1\nwoodwork 1\nwooed 1\nwoolen 1\nwoolly 1\nwop 1\nwordplay 1\nworht 1\nworkdays 1\nworkhorse 1\nworkload 1\nworkman 1\nworkmanship 1\nworkmen 1\nworkouts 1\nworkplaces 1\nworkroom 1\nworksheet 1\nworksheets 1\nworldview 1\nwormed 1\nwornderful 1\nworthier 1\nwove 1\nwow 1\nwows 1\nwrapper 1\nwreak 1\nwreaks 1\nwreathed 1\nwrecker 1\nwree 1\nwren 1\nwrenched 1\nwrenches 1\nwrenching 1\nwrest 1\nwrested 1\nwrestlers 1\nwrestles 1\nwretch 1\nwriggling 1\nwringing 1\nwrit 1\nwriteoffs 1\nwrithe 1\nwrok 1\nwrondgoing 1\nwrongfully 1\nwrung 1\nwryly 1\nwww.Career.com 1\nwww.adobe.com 1\nwww.alfalaq.com 1\nwww.answers.com 1\nwww.caem.org 1\nwww.designofashion.com 1\nwww.europeaninternet.com 1\nwww.ibb.gov/editorials 1\nwww.juancole.com 1\nwww.kaffeeeis.co.nz 1\nwww.ninecommentaries.com 1\nwww.norcalfightingalliance.com 1\nwww.questions.com 1\nwww.visitsouthshropshire.co.uk 1\nwww.vitac.com 1\nwynners 1\nx-ray 1\nx.x 1\nx36709 1\nxenophobia 1\nxenophobic 1\nxerophytes 1\nxiaojueniu 1\nxiu 1\nxiucai 1\nxml 1\nxploita...@gmail.com 1\nxray 1\nxzx...@sina.com 1\nxzx620521.blogcn.com 1\ny' 1\nyaar 1\nyaboo 1\nyachting 1\nyachtsman 1\nyams 1\nyamwhatiyam 1\nyan 1\nyank 1\nyanks 1\nyardage 1\nyardwork 1\nyea 1\nyearbook 1\nyearbooks 1\nyearend 1\nyearling 1\nyearlings 1\nyearned 1\nyearnings 1\nyeasts 1\nyeding 1\nyellows 1\nyep 1\nyjtf 1\nymsgr:sendIM?mayursha&__Hi+Mayur... 1\nyoghurt 1\nyogi 1\nyooooooooooou 1\nyore 1\nyound 1\nyoungish 1\nyoungsteers 1\nyoungster 1\nyounth 1\nyoure 1\nyouself 1\nyouthfulness 1\nyoutube 1\nyouyutianshi...@sina.com 1\nyow 1\nyr 1\nyrs. 
1\nyuans 1\nyuanzi 1\nyucai 1\nyucky 1\nyummy 1\nyuor 1\nz 1\nzag 1\nzagging 1\nzakat 1\nzalman 1\nzapped 1\nzappers 1\nzapping 1\nzation 1\nzblubs 1\nzealot 1\nzealots 1\nzebra 1\nzebras 1\nzeitgeist 1\nzeroed 1\nzeroes 1\nzestfully 1\nzhu 1\nzig 1\nzigging 1\nzigzags 1\nzilch 1\nzillion 1\nzinger 1\nzipper 1\nzirconate 1\nzlotys 1\nzodiac 1\nzodiacal 1\nzombies 1\nzoomed 1\nzounds 1\nzuo 1\nzzqq 1\n| 1\n~~~~~~~~~~ 1\n~~~~~~~~~~~~~ 1\nÌò...-LRB-c-RRB-x˙goÌö] 1\nÌö] 1\nÛDÌ’ 1\nِAnyway 1\n■Buy 1\n■Controlling 1\n□ 1\n●Stamp 1\n●The 1\n【 1\n在 1\n（www.mcoa.cn） 1\n（一） 1\n＊Lingtai 1\n： 1\n￥28 1\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_universal/context-tokenize-zh.pbtxt",
    "content": "Parameter {\n  name: \"brain_tokenizer_zh_embedding_dims\"\n  value: \"32;32\"\n}\nParameter {\n  name: \"brain_tokenizer_zh_embedding_names\"\n  value: \"chars;words\"\n}\nParameter {\n  name: \"brain_tokenizer_zh_features\"\n  value: \"input.char \"\n         \"input(1).char \"\n         \"input(2).char \"\n         \"input(3).char \"\n         \"input(-1).char \"\n         \"input(-2).char \"\n         \"input(-3).char \"\n         \"stack.char \"\n         \"stack.offset(1).char \"\n         \"stack.offset(-1).char \"\n         \"stack(1).char \"\n         \"stack(1).offset(1).char \"\n         \"stack(1).offset(-1).char \"\n         \"stack(2).char; \"\n         \"last-word(1,min-freq=2) \"\n         \"last-word(2,min-freq=2) \"\n         \"last-word(3,min-freq=2)\"\n}\nParameter {\n  name: \"brain_tokenizer_zh_transition_system\"\n  value: \"binary-segment-transitions\"\n}\ninput {\n  name: \"word-map\"\n  Part {\n    file_pattern: \"last-word-map\"\n  }\n}\ninput {\n  name: \"char-map\"\n  Part {\n    file_pattern: \"char-map\"\n  }\n}\ninput {\n  name: \"label-map\"\n  Part {\n    file_pattern: \"label-map\"\n  }\n}\ninput {\n  name: 'stdin-untoken'\n  record_format: 'untokenized-text'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdout-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_universal/context.pbtxt",
    "content": "Parameter {\n  name: \"brain_tokenizer_embedding_dims\"\n  value: \"16;16;16\"\n}\nParameter {\n  name: \"brain_tokenizer_embedding_names\"\n  value: \"chars;digits;puncts\"\n}\nParameter {\n  name: \"brain_tokenizer_features\"\n  value:  \"input.char \"\n          \"input(-1).char \"\n          \"input(1).char; \"\n          \"input.digit \"\n          \"input(-1).digit \"\n          \"input(1).digit; \"\n          \"input.punctuation-amount \"\n          \"input(-1).punctuation-amount \"\n          \"input(1).punctuation-amount \"\n}\nParameter {\n  name: \"brain_tokenizer_transition_system\"\n  value: \"binary-segment-transitions\"\n}\nParameter {\n  name: \"brain_morpher_embedding_dims\"\n  value: \"2;16;8;16;16;16;16;16;64\"\n}\nParameter {\n  name: \"brain_morpher_embedding_names\"\n  value: \"capitalization;char_ngram;other;prefix2;prefix3;suffix2;suffix3;tags;words\"\n}\nParameter {\n  name: \"brain_morpher_features\"\n  value: \"input.capitalization \"\n         \"input(1).capitalization \"\n         \"input(2).capitalization \"\n         \"input(3).capitalization \"\n         \"input(-1).capitalization \"\n         \"input(-2).capitalization \"\n         \"input(-3).capitalization \"\n         \"input(-4).capitalization; \"\n         \"input.token.char-ngram \"\n         \"input(1).token.char-ngram \"\n         \"input(2).token.char-ngram \"\n         \"input(3).token.char-ngram \"\n         \"input(-1).token.char-ngram \"\n         \"input(-2).token.char-ngram \"\n         \"input(-3).token.char-ngram \"\n         \"input(-4).token.char-ngram; \"\n         \"input.digit \"\n         \"input.hyphen \"\n         \"input.token.punctuation-amount \"\n         \"input.token.quote; \"\n         \"input.token.prefix(length=2) \"\n         \"input(1).token.prefix(length=2) \"\n         \"input(2).token.prefix(length=2) \"\n         \"input(3).token.prefix(length=2) \"\n         \"input(-1).token.prefix(length=2) \"\n         
\"input(-2).token.prefix(length=2) \"\n         \"input(-3).token.prefix(length=2) \"\n         \"input(-4).token.prefix(length=2); \"\n         \"input.token.prefix(length=3) \"\n         \"input(1).token.prefix(length=3) \"\n         \"input(2).token.prefix(length=3) \"\n         \"input(3).token.prefix(length=3) \"\n         \"input(-1).token.prefix(length=3) \"\n         \"input(-2).token.prefix(length=3) \"\n         \"input(-3).token.prefix(length=3) \"\n         \"input(-4).token.prefix(length=3); \"\n         \"input.token.suffix(length=2) \"\n         \"input(1).token.suffix(length=2) \"\n         \"input(2).token.suffix(length=2) \"\n         \"input(3).token.suffix(length=2) \"\n         \"input(-1).token.suffix(length=2) \"\n         \"input(-2).token.suffix(length=2) \"\n         \"input(-3).token.suffix(length=2) \"\n         \"input(-4).token.suffix(length=2); \"\n         \"input.token.suffix(length=3) \"\n         \"input(1).token.suffix(length=3) \"\n         \"input(2).token.suffix(length=3) \"\n         \"input(3).token.suffix(length=3) \"\n         \"input(-1).token.suffix(length=3) \"\n         \"input(-2).token.suffix(length=3) \"\n         \"input(-3).token.suffix(length=3) \"\n         \"input(-4).token.suffix(length=3); \"\n         \"input(-1).pred-morph-tag \"\n         \"input(-2).pred-morph-tag \"\n         \"input(-3).pred-morph-tag \"\n         \"input(-4).pred-morph-tag; \"\n         \"input.token.word \"\n         \"input(1).token.word \"\n         \"input(2).token.word \"\n         \"input(3).token.word \"\n         \"input(-1).token.word \"\n         \"input(-2).token.word \"\n         \"input(-3).token.word \"\n         \"input(-4).token.word\"\n}\nParameter {\n  name: \"brain_morpher_transition_system\"\n  value: \"morpher\"\n}\nParameter {\n  name: \"brain_tagger_embedding_dims\"\n  value: \"2;16;8;16;16;16;16;16;64\"\n}\nParameter {\n  name: \"brain_tagger_embedding_names\"\n  value: 
\"capitalization;char_ngram;other;prefix2;prefix3;suffix2;suffix3;tags;words\"\n}\nParameter {\n  name: \"brain_tagger_features\"\n  value: \"input.capitalization \"\n         \"input(1).capitalization \"\n         \"input(2).capitalization \"\n         \"input(3).capitalization \"\n         \"input(-1).capitalization \"\n         \"input(-2).capitalization \"\n         \"input(-3).capitalization \"\n         \"input(-4).capitalization; \"\n         \"input.token.char-ngram \"\n         \"input(1).token.char-ngram \"\n         \"input(2).token.char-ngram \"\n         \"input(3).token.char-ngram \"\n         \"input(-1).token.char-ngram \"\n         \"input(-2).token.char-ngram \"\n         \"input(-3).token.char-ngram \"\n         \"input(-4).token.char-ngram; \"\n         \"input.digit \"\n         \"input.hyphen \"\n         \"input.token.punctuation-amount \"\n         \"input.token.quote; \"\n         \"input.token.prefix(length=2) \"\n         \"input(1).token.prefix(length=2) \"\n         \"input(2).token.prefix(length=2) \"\n         \"input(3).token.prefix(length=2) \"\n         \"input(-1).token.prefix(length=2) \"\n         \"input(-2).token.prefix(length=2) \"\n         \"input(-3).token.prefix(length=2) \"\n         \"input(-4).token.prefix(length=2); \"\n         \"input.token.prefix(length=3) \"\n         \"input(1).token.prefix(length=3) \"\n         \"input(2).token.prefix(length=3) \"\n         \"input(3).token.prefix(length=3) \"\n         \"input(-1).token.prefix(length=3) \"\n         \"input(-2).token.prefix(length=3) \"\n         \"input(-3).token.prefix(length=3) \"\n         \"input(-4).token.prefix(length=3); \"\n         \"input.token.suffix(length=2) \"\n         \"input(1).token.suffix(length=2) \"\n         \"input(2).token.suffix(length=2) \"\n         \"input(3).token.suffix(length=2) \"\n         \"input(-1).token.suffix(length=2) \"\n         \"input(-2).token.suffix(length=2) \"\n         \"input(-3).token.suffix(length=2) \"\n     
    \"input(-4).token.suffix(length=2); \"\n         \"input.token.suffix(length=3) \"\n         \"input(1).token.suffix(length=3) \"\n         \"input(2).token.suffix(length=3) \"\n         \"input(3).token.suffix(length=3) \"\n         \"input(-1).token.suffix(length=3) \"\n         \"input(-2).token.suffix(length=3) \"\n         \"input(-3).token.suffix(length=3) \"\n         \"input(-4).token.suffix(length=3); \"\n         \"input(-1).pred-tag \"\n         \"input(-2).pred-tag \"\n         \"input(-3).pred-tag \"\n         \"input(-4).pred-tag; \"\n         \"input.token.word \"\n         \"input(1).token.word \"\n         \"input(2).token.word \"\n         \"input(3).token.word \"\n         \"input(-1).token.word \"\n         \"input(-2).token.word \"\n         \"input(-3).token.word \"\n         \"input(-4).token.word\"\n}\nParameter {\n  name: \"brain_tagger_transition_system\"\n  value: \"tagger\"\n}\nParameter {\n  name: \"brain_parser_embedding_dims\"\n  value: \"32;32;32;64\"\n}\nParameter {\n  name: \"brain_parser_embedding_names\"\n  value: \"labels;morphology;tags;words\"\n}\nParameter {\n  name: \"brain_parser_features\"\n  value: \"stack.child(1).label \"\n         \"stack.child(1).sibling(-1).label \"\n         \"stack.child(-1).label \"\n         \"stack.child(-1).sibling(1).label \"\n         \"stack.child(2).label \"\n         \"stack.child(-2).label \"\n         \"stack(1).child(1).label \"\n         \"stack(1).child(1).sibling(-1).label \"\n         \"stack(1).child(-1).label \"\n         \"stack(1).child(-1).sibling(1).label \"\n         \"stack(1).child(2).label \"\n         \"stack(1).child(-2).label; \"\n         \"input.token.morphology-set \"\n         \"input(1).token.morphology-set \"\n         \"input(2).token.morphology-set \"\n         \"input(3).token.morphology-set \"\n         \"stack.token.morphology-set \"\n         \"stack.child(1).token.morphology-set \"\n         \"stack.child(1).sibling(-1).token.morphology-set \"\n         
\"stack.child(-1).token.morphology-set \"\n         \"stack.child(-1).sibling(1).token.morphology-set \"\n         \"stack.child(2).token.morphology-set \"\n         \"stack.child(-2).token.morphology-set \"\n         \"stack(1).token.morphology-set \"\n         \"stack(1).child(1).token.morphology-set \"\n         \"stack(1).child(1).sibling(-1).token.morphology-set \"\n         \"stack(1).child(-1).token.morphology-set \"\n         \"stack(1).child(-1).sibling(1).token.morphology-set \"\n         \"stack(1).child(2).token.morphology-set \"\n         \"stack(1).child(-2).token.morphology-set \"\n         \"stack(2).token.morphology-set \"\n         \"stack(3).token.morphology-set; \"\n         \"input.token.tag \"\n         \"input(1).token.tag \"\n         \"input(2).token.tag \"\n         \"input(3).token.tag \"\n         \"stack.token.tag \"\n         \"stack.child(1).token.tag \"\n         \"stack.child(1).sibling(-1).token.tag \"\n         \"stack.child(-1).token.tag \"\n         \"stack.child(-1).sibling(1).token.tag \"\n         \"stack.child(2).token.tag \"\n         \"stack.child(-2).token.tag \"\n         \"stack(1).token.tag \"\n         \"stack(1).child(1).token.tag \"\n         \"stack(1).child(1).sibling(-1).token.tag \"\n         \"stack(1).child(-1).token.tag \"\n         \"stack(1).child(-1).sibling(1).token.tag \"\n         \"stack(1).child(2).token.tag \"\n         \"stack(1).child(-2).token.tag \"\n         \"stack(2).token.tag \"\n         \"stack(3).token.tag; \"\n         \"input.token.word \"\n         \"input(1).token.word \"\n         \"input(2).token.word \"\n         \"input(3).token.word \"\n         \"stack.token.word \"\n         \"stack.child(1).token.word \"\n         \"stack.child(1).sibling(-1).token.word \"\n         \"stack.child(-1).token.word \"\n         \"stack.child(-1).sibling(1).token.word \"\n         \"stack.child(2).token.word \"\n         \"stack.child(-2).token.word \"\n         \"stack(1).token.word \"\n         
\"stack(1).child(1).token.word \"\n         \"stack(1).child(1).sibling(-1).token.word \"\n         \"stack(1).child(-1).token.word \"\n         \"stack(1).child(-1).sibling(1).token.word \"\n         \"stack(1).child(2).token.word \"\n         \"stack(1).child(-2).token.word \"\n         \"stack(2).token.word \"\n         \"stack(3).token.word \"\n}\nParameter {\n  name: \"brain_parser_transition_system\"\n  value: \"arc-standard\"\n}\nParameter {\n  name: \"join_category_to_pos\"\n  value: \"true\"\n}\ninput {\n  name: \"word-map\"\n  Part {\n    file_pattern: \"word-map\"\n  }\n}\ninput {\n  name: \"char-map\"\n  Part {\n    file_pattern: \"char-map\"\n  }\n}\ninput {\n  name: \"tag-map\"\n  Part {\n    file_pattern: \"tag-map\"\n  }\n}\n\ninput {\n  name: \"tag-to-category\"\n  Part {\n    file_pattern: \"tag-to-category\"\n  }\n}\ninput {\n  name: \"label-map\"\n  Part {\n    file_pattern: \"label-map\"\n  }\n}\ninput {\n  name: \"char-ngram-map\"\n  Part {\n    file_pattern: \"char-ngram-map\"\n  }\n}\ninput {\n  name: \"prefix-table\"\n  Part {\n    file_pattern: \"prefix-table\"\n  }\n}\ninput {\n  name: \"suffix-table\"\n  Part {\n    file_pattern: \"suffix-table\"\n  }\n}\ninput {\n  name: \"morph-label-set\"\n  Part {\n    file_pattern: \"morph-label-set\"\n  }\n}\ninput {\n  name: \"morphology-map\"\n  Part {\n    file_pattern: \"morphology-map\"\n  }\n}\ninput {\n  name: 'stdin'\n  record_format: 'tokenized-text'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdin-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdin-untoken'\n  record_format: 'untokenized-text'\n  Part {\n    file_pattern: '-'\n  }\n}\ninput {\n  name: 'stdout-conll'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_universal/parse.sh",
    "content": "# A script that runs a morphological analyzer, a part-of-speech tagger and a\n# dependency parser on a text file, with one sentence per line.\n#\n# Example usage:\n#  bazel build syntaxnet:parser_eval\n#  cat sentences.txt |\n#    syntaxnet/models/parsey_universal/parse.sh \\\n#    $MODEL_DIRECTORY > output.conll\n#\n# To run on a conll formatted file, add the --conll command line argument:\n#  cat sentences.conll |\n#    syntaxnet/models/parsey_universal/parse.sh \\\n#    --conll $MODEL_DIRECTORY > output.conll\n#\n# Models can be downloaded from\n#  http://download.tensorflow.org/models/parsey_universal/<language>.zip\n# for the languages listed at\n#  https://github.com/tensorflow/models/blob/master/syntaxnet/universal.md\n#\n\nPARSER_EVAL=bazel-bin/syntaxnet/parser_eval\nCONTEXT=syntaxnet/models/parsey_universal/context.pbtxt\nif [[ \"$1\" == \"--conll\" ]]; then\n  INPUT_FORMAT=stdin-conll\n  shift\nelse\n  INPUT_FORMAT=stdin\nfi\nMODEL_DIR=$1\n\n$PARSER_EVAL \\\n  --input=$INPUT_FORMAT \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=64 \\\n  --arg_prefix=brain_morpher \\\n  --graph_builder=structured \\\n  --task_context=$CONTEXT \\\n  --resource_dir=$MODEL_DIR \\\n  --model_path=$MODEL_DIR/morpher-params \\\n  --slim_model \\\n  --batch_size=1024 \\\n  --alsologtostderr \\\n  | \\\n  $PARSER_EVAL \\\n  --input=stdin-conll \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=64 \\\n  --arg_prefix=brain_tagger \\\n  --graph_builder=structured \\\n  --task_context=$CONTEXT \\\n  --resource_dir=$MODEL_DIR \\\n  --model_path=$MODEL_DIR/tagger-params \\\n  --slim_model \\\n  --batch_size=1024 \\\n  --alsologtostderr \\\n  | \\\n  $PARSER_EVAL \\\n  --input=stdin-conll \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=512,512 \\\n  --arg_prefix=brain_parser \\\n  --graph_builder=structured \\\n  --task_context=$CONTEXT \\\n  --resource_dir=$MODEL_DIR \\\n  --model_path=$MODEL_DIR/parser-params \\\n  --slim_model \\\n  --batch_size=1024 
\\\n  --alsologtostderr\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_universal/tokenize.sh",
    "content": "# A script that runs a tokenizer on a text file with one sentence per line.\n#\n# Example usage:\n#  bazel build syntaxnet:parser_eval\n#  cat untokenized-sentences.txt |\n#    syntaxnet/models/parsey_universal/tokenize.sh \\\n#    $MODEL_DIRECTORY > output.conll\n#\n# Models can be downloaded from\n#  http://download.tensorflow.org/models/parsey_universal/<language>.zip\n# for the languages listed at\n#  https://github.com/tensorflow/models/blob/master/syntaxnet/universal.md\n#\n\nPARSER_EVAL=bazel-bin/syntaxnet/parser_eval\nCONTEXT=syntaxnet/models/parsey_universal/context.pbtxt\nINPUT_FORMAT=stdin-untoken\nMODEL_DIR=$1\n\n$PARSER_EVAL \\\n  --input=$INPUT_FORMAT \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=128,128 \\\n  --arg_prefix=brain_tokenizer \\\n  --graph_builder=greedy \\\n  --task_context=$CONTEXT \\\n  --resource_dir=$MODEL_DIR \\\n  --model_path=$MODEL_DIR/tokenizer-params \\\n  --batch_size=32 \\\n  --alsologtostderr \\\n  --slim_model\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/models/parsey_universal/tokenize_zh.sh",
    "content": "# A script that runs a traditional Chinese tokenizer on a text file with one\n# sentence per line.\n#\n# Example usage:\n#  bazel build syntaxnet:parser_eval\n#  cat untokenized-sentences.txt |\n#    syntaxnet/models/parsey_universal/tokenize_zh.sh \\\n#    $MODEL_DIRECTORY > output.conll\n#\n# The traditional Chinese model can be downloaded from\n#  http://download.tensorflow.org/models/parsey_universal/Chinese.zip\n#\n\nPARSER_EVAL=bazel-bin/syntaxnet/parser_eval\nCONTEXT=syntaxnet/models/parsey_universal/context-tokenize-zh.pbtxt\nINPUT_FORMAT=stdin-untoken\nMODEL_DIR=$1\n\n$PARSER_EVAL \\\n  --input=$INPUT_FORMAT \\\n  --output=stdout-conll \\\n  --hidden_layer_sizes=256,256 \\\n  --arg_prefix=brain_tokenizer_zh \\\n  --graph_builder=structured \\\n  --task_context=$CONTEXT \\\n  --resource_dir=$MODEL_DIR \\\n  --model_path=$MODEL_DIR/tokenizer-params \\\n  --batch_size=1024 \\\n  --alsologtostderr \\\n  --slim_model\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/morpher_transitions.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Morpher transition system.\n//\n// This transition system has one type of action:\n//  - The SHIFT action pushes the next input token to the stack and\n//    advances to the next input token, assigning a morphological tag to the\n//    token that was shifted.\n//\n// The transition system operates with parser actions encoded as integers:\n//  - A SHIFT action is encoded as a number starting from 0.\n\n#include <string>\n\n#include \"syntaxnet/morphology_label_set.h\"\n#include \"syntaxnet/parser_features.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/sentence_features.h\"\n#include \"syntaxnet/shared_store.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\nclass MorphologyTransitionState : public ParserTransitionState {\n public:\n  explicit MorphologyTransitionState(const MorphologyLabelSet *label_set)\n      : label_set_(label_set) {}\n\n  explicit MorphologyTransitionState(const MorphologyTransitionState *state)\n      : MorphologyTransitionState(state->label_set_) {\n    tag_ = state->tag_;\n    gold_tag_ = state->gold_tag_;\n  }\n\n  // Clones the transition state by 
returning a new object.\n  ParserTransitionState *Clone() const override {\n    return new MorphologyTransitionState(this);\n  }\n\n  // Reads gold tags for each token.\n  void Init(ParserState *state) override {\n    tag_.resize(state->sentence().token_size(), -1);\n    gold_tag_.resize(state->sentence().token_size(), -1);\n    for (int pos = 0; pos < state->sentence().token_size(); ++pos) {\n      const Token &token = state->GetToken(pos);\n\n      // NOTE: we allow token to not have a TokenMorphology extension or for the\n      // TokenMorphology to be absent from the label_set_ because this can\n      // happen at test time.\n      gold_tag_[pos] = label_set_->LookupExisting(\n          token.GetExtension(TokenMorphology::morphology));\n    }\n  }\n\n  // Returns the tag assigned to a given token.\n  int Tag(int index) const {\n    DCHECK_GE(index, 0);\n    DCHECK_LT(index, tag_.size());\n    return index == -1 ? -1 : tag_[index];\n  }\n\n  // Sets this tag on the token at index.\n  void SetTag(int index, int tag) {\n    DCHECK_GE(index, 0);\n    DCHECK_LT(index, tag_.size());\n    tag_[index] = tag;\n  }\n\n  // Returns the gold tag for a given token.\n  int GoldTag(int index) const {\n    DCHECK_GE(index, -1);\n    DCHECK_LT(index, gold_tag_.size());\n    return index == -1 ? 
-1 : gold_tag_[index];\n  }\n\n  // Returns the proto corresponding to the tag, or an empty proto if the tag is\n  // not found.\n  const TokenMorphology &TagAsProto(int tag) const {\n    if (tag >= 0 && tag < label_set_->Size()) {\n      return label_set_->Lookup(tag);\n    }\n    return TokenMorphology::default_instance();\n  }\n\n  // Adds transition state specific annotations to the document.\n  void AddParseToDocument(const ParserState &state, bool rewrite_root_labels,\n                          Sentence *sentence) const override {\n    for (int i = 0; i < tag_.size(); ++i) {\n      Token *token = sentence->mutable_token(i);\n      *token->MutableExtension(TokenMorphology::morphology) =\n          TagAsProto(Tag(i));\n    }\n  }\n\n  // Whether a parsed token should be considered correct for evaluation.\n  bool IsTokenCorrect(const ParserState &state, int index) const override {\n    return GoldTag(index) == Tag(index);\n  }\n\n  // Returns a human readable string representation of this state.\n  string ToString(const ParserState &state) const override {\n    string str;\n    for (int i = state.StackSize(); i > 0; --i) {\n      const string &word = state.GetToken(state.Stack(i - 1)).word();\n      if (i != state.StackSize() - 1) str.append(\" \");\n      tensorflow::strings::StrAppend(\n          &str, word, \"[\",\n          TagAsProto(Tag(state.StackSize() - i)).ShortDebugString(), \"]\");\n    }\n    for (int i = state.Next(); i < state.NumTokens(); ++i) {\n      tensorflow::strings::StrAppend(&str, \" \", state.GetToken(i).word());\n    }\n    return str;\n  }\n\n private:\n  // Currently assigned morphological analysis for each token in this sentence.\n  vector<int> tag_;\n\n  // Gold morphological analysis from the input document.\n  vector<int> gold_tag_;\n\n  // Tag map used for conversions between integer and string representations\n  // part of speech tags. 
Not owned.\n  const MorphologyLabelSet *label_set_ = nullptr;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(MorphologyTransitionState);\n};\n\nclass MorphologyTransitionSystem : public ParserTransitionSystem {\n public:\n  ~MorphologyTransitionSystem() override { SharedStore::Release(label_set_); }\n\n  // Determines tag map location.\n  void Setup(TaskContext *context) override {\n    context->GetInput(\"morph-label-set\");\n  }\n\n  // Reads tag map and tag to category map.\n  void Init(TaskContext *context) override {\n    const string fname =\n        TaskContext::InputFile(*context->GetInput(\"morph-label-set\"));\n    label_set_ =\n        SharedStoreUtils::GetWithDefaultName<MorphologyLabelSet>(fname);\n  }\n\n  // The SHIFT action uses the same value as the corresponding action type.\n  static ParserAction ShiftAction(int tag) { return tag; }\n\n  // The morpher transition system doesn't look at the dependency tree, so it\n  // allows non-projective trees.\n  bool AllowsNonProjective() const override { return true; }\n\n  // Returns the number of action types.\n  int NumActionTypes() const override { return 1; }\n\n  // Returns the number of possible actions.\n  int NumActions(int num_labels) const override { return label_set_->Size(); }\n\n  // The default action for a given state is assigning the most frequent tag.\n  ParserAction GetDefaultAction(const ParserState &state) const override {\n    return ShiftAction(0);\n  }\n\n  // Returns the next gold action for a given state according to the\n  // underlying annotated sentence.\n  ParserAction GetNextGoldAction(const ParserState &state) const override {\n    if (!state.EndOfInput()) {\n      return ShiftAction(TransitionState(state).GoldTag(state.Next()));\n    }\n    return ShiftAction(0);\n  }\n\n  // Checks if the action is allowed in a given parser state.\n  bool IsAllowedAction(ParserAction action,\n                       const ParserState &state) const override {\n    return !state.EndOfInput();\n  }\n\n  // 
Makes a shift by pushing the next input token on the stack and moving to\n  // the next position.\n  void PerformActionWithoutHistory(ParserAction action,\n                                   ParserState *state) const override {\n    DCHECK(!state->EndOfInput());\n    if (!state->EndOfInput()) {\n      MutableTransitionState(state)->SetTag(state->Next(), action);\n      state->Push(state->Next());\n      state->Advance();\n    }\n  }\n\n  // We are in a final state when we reached the end of the input and the stack\n  // is empty.\n  bool IsFinalState(const ParserState &state) const override {\n    return state.EndOfInput();\n  }\n\n  // Returns a string representation of a parser action.\n  string ActionAsString(ParserAction action,\n                        const ParserState &state) const override {\n    return tensorflow::strings::StrCat(\n        \"SHIFT(\", label_set_->Lookup(action).ShortDebugString(), \")\");\n  }\n\n  // No state is deterministic in this transition system.\n  bool IsDeterministicState(const ParserState &state) const override {\n    return false;\n  }\n\n  // Returns a new transition state to be used to enhance the parser state.\n  ParserTransitionState *NewTransitionState(bool training_mode) const override {\n    return new MorphologyTransitionState(label_set_);\n  }\n\n  // Downcasts the const ParserTransitionState in ParserState to a const\n  // MorphologyTransitionState.\n  static const MorphologyTransitionState &TransitionState(\n      const ParserState &state) {\n    return *static_cast<const MorphologyTransitionState *>(\n        state.transition_state());\n  }\n\n  // Downcasts the ParserTransitionState in ParserState to an\n  // MorphologyTransitionState.\n  static MorphologyTransitionState *MutableTransitionState(ParserState *state) {\n    return static_cast<MorphologyTransitionState *>(\n        state->mutable_transition_state());\n  }\n\n  // Input for the tag map. 
Not owned.\n  TaskInput *input_label_set_ = nullptr;\n\n  // Tag map used for conversions between integer and string representations\n  // morphology labels. Owned through SharedStore.\n  const MorphologyLabelSet *label_set_;\n};\n\nREGISTER_TRANSITION_SYSTEM(\"morpher\", MorphologyTransitionSystem);\n\n// Feature function for retrieving the tag assigned to a token by the tagger\n// transition system.\nclass PredictedMorphTagFeatureFunction : public ParserIndexFeatureFunction {\n public:\n  PredictedMorphTagFeatureFunction() {}\n\n  // Determines tag map location.\n  void Setup(TaskContext *context) override {\n    context->GetInput(\"morph-label-set\", \"recordio\", \"token-morphology\");\n  }\n\n  // Reads tag map.\n  void Init(TaskContext *context) override {\n    const string fname =\n        TaskContext::InputFile(*context->GetInput(\"morph-label-set\"));\n    label_set_ = SharedStore::Get<MorphologyLabelSet>(fname, fname);\n    set_feature_type(new FullLabelFeatureType(name(), label_set_));\n  }\n\n  // Gets the MorphologyTransitionState from the parser state and reads the\n  // assigned\n  // tag at the focus index. Returns -1 if the focus is not within the sentence.\n  FeatureValue Compute(const WorkspaceSet &workspaces, const ParserState &state,\n                       int focus, const FeatureVector *result) const override {\n    if (focus < 0 || focus >= state.sentence().token_size()) return -1;\n    return static_cast<const MorphologyTransitionState *>(\n               state.transition_state())\n        ->Tag(focus);\n  }\n\n private:\n  // Tag map used for conversions between integer and string representations\n  // part of speech tags. Owned through SharedStore.\n  const MorphologyLabelSet *label_set_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(PredictedMorphTagFeatureFunction);\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"pred-morph-tag\",\n                                     PredictedMorphTagFeatureFunction);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/morphology_label_set.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/morphology_label_set.h\"\n\nnamespace syntaxnet {\n\nconst char MorphologyLabelSet::kSeparator[] = \"\\t\";\n\nint MorphologyLabelSet::Add(const TokenMorphology &morph) {\n  string repr = StringForMatch(morph);\n  auto it = fast_lookup_.find(repr);\n  if (it != fast_lookup_.end()) return it->second;\n  fast_lookup_[repr] = label_set_.size();\n  label_set_.push_back(morph);\n  return label_set_.size() - 1;\n}\n\n// Look up an existing TokenMorphology.  If it is not present, return -1.\nint MorphologyLabelSet::LookupExisting(const TokenMorphology &morph) const {\n  string repr = StringForMatch(morph);\n  auto it = fast_lookup_.find(repr);\n  if (it != fast_lookup_.end()) return it->second;\n  return -1;\n}\n\n// Return the TokenMorphology at position i.  
The input i should be in the range\n// 0..size().\nconst TokenMorphology &MorphologyLabelSet::Lookup(int i) const {\n  CHECK_GE(i, 0);\n  CHECK_LT(i, label_set_.size());\n  return label_set_[i];\n}\n\nvoid MorphologyLabelSet::Read(const string &filename) {\n  ProtoRecordReader reader(filename);\n  Read(&reader);\n}\n\nvoid MorphologyLabelSet::Read(ProtoRecordReader *reader) {\n  TokenMorphology morph;\n  while (reader->Read(&morph).ok()) {\n    CHECK_EQ(-1, LookupExisting(morph));\n    Add(morph);\n  }\n}\n\nvoid MorphologyLabelSet::Write(const string &filename) const {\n  ProtoRecordWriter writer(filename);\n  Write(&writer);\n}\n\nvoid MorphologyLabelSet::Write(ProtoRecordWriter *writer) const {\n  for (const TokenMorphology &morph : label_set_) {\n    writer->Write(morph);\n  }\n}\n\nstring MorphologyLabelSet::StringForMatch(const TokenMorphology &morph) const {\n  vector<string> attributes;\n  for (const auto &a : morph.attribute()) {\n    attributes.push_back(\n        tensorflow::strings::StrCat(a.name(), kSeparator, a.value()));\n  }\n  std::sort(attributes.begin(), attributes.end());\n  return utils::Join(attributes, kSeparator);\n}\n\nstring FullLabelFeatureType::GetFeatureValueName(FeatureValue value) const {\n  const TokenMorphology &morph = label_set_->Lookup(value);\n  vector<string> attributes;\n  for (const auto &a : morph.attribute()) {\n    attributes.push_back(tensorflow::strings::StrCat(a.name(), \":\", a.value()));\n  }\n  std::sort(attributes.begin(), attributes.end());\n  return utils::Join(attributes, \",\");\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/morphology_label_set.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// A class to store the set of possible TokenMorphology objects.  This includes\n// lookup, iteration and serialization.\n\n#ifndef SYNTAXNET_MORPHOLOGY_LABEL_SET_H_\n#define SYNTAXNET_MORPHOLOGY_LABEL_SET_H_\n\n#include <unordered_map>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/proto_io.h\"\n#include \"syntaxnet/sentence.pb.h\"\n\nnamespace syntaxnet {\n\nclass MorphologyLabelSet {\n public:\n  // Initializes as an empty morphology.\n  MorphologyLabelSet() {}\n\n  // Initializes by reading the given file, which has been saved by Write().\n  // This makes using the shared store easier.\n  explicit MorphologyLabelSet(const string &fname) { Read(fname); }\n\n  // Adds a TokenMorphology to the set if it is not present. In any case, return\n  // its position in the list. Note: This is slow, and should not be called\n  // outside of training or init.\n  int Add(const TokenMorphology &morph);\n\n  // Look up an existing TokenMorphology. If it is not present, return -1.\n  // Note: This is slow, and should not be called outside of training workflow\n  // or init.\n  int LookupExisting(const TokenMorphology &morph) const;\n\n  // Return the TokenMorphology at position i. The input i should be in the\n  // range 0..size(). 
Note: this will be called at inference time and needs to\n  // be kept fast.\n  const TokenMorphology &Lookup(int i) const;\n\n  // Return the number of elements.\n  int Size() const { return label_set_.size(); }\n\n  // Deserialization and serialization.\n  void Read(const string &filename);\n  void Write(const string &filename) const;\n\n private:\n  string StringForMatch(const TokenMorphology &morph) const;\n\n  // Deserialization and serialization implementation.\n  void Read(ProtoRecordReader *reader);\n  void Write(ProtoRecordWriter *writer) const;\n\n  // List of all possible annotations.  This is a unique list, where equality is\n  // defined as follows:\n  //\n  //   a == b iff the set of attribute pairs (attribute, value) is identical.\n  vector<TokenMorphology> label_set_;\n\n  // Because protocol buffer equality is complicated, we implement our own\n  // equality operator based on strings. This unordered_map allows us to do the\n  // lookup more quickly.\n  unordered_map<string, int> fast_lookup_;\n\n  // A separator string that should not occur in any of the attribute names.\n  // This should never be serialized, so that it can be changed in the code if\n  // we change attribute names and it occurs in the new names.\n  static const char kSeparator[];\n};\n\n// A feature type with one value for each complete morphological analysis\n// (analogous to the fulltag analyzer).\nclass FullLabelFeatureType : public FeatureType {\n public:\n  FullLabelFeatureType(const string &name, const MorphologyLabelSet *label_set)\n      : FeatureType(name), label_set_(label_set) {}\n\n  ~FullLabelFeatureType() override {}\n\n  // Converts a feature value to a name.  
We don't use StringForMatch, since the\n  // goal of these is to be readable, even if they might occasionally be\n  // non-unique.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the size of the feature values domain.\n  FeatureValue GetDomainSize() const override { return label_set_->Size(); }\n\n private:\n  // Not owned.\n  const MorphologyLabelSet *label_set_ = nullptr;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_MORPHOLOGY_LABEL_SET_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/morphology_label_set_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/morphology_label_set.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include <gmock/gmock.h>\n\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\nclass MorphologyLabelSetTest : public ::testing::Test {\n protected:\n  MorphologyLabelSet label_set_;\n};\n\n// Test that Add and LookupExisting work as expected.\nTEST_F(MorphologyLabelSetTest, AddLookupExisting) {\n  TokenMorphology si1, si2;  // singular, imperative\n  TokenMorphology pi;        // plural, imperative\n  TokenMorphology six;       // singular, imperative with extra value\n  TextFormat::ParseFromString(R\"(\n      attribute {name: \"Number\" value: \"Singular\"}\n      attribute {name: \"POS\" value: \"IMP\"})\",\n                                      &si1);\n  TextFormat::ParseFromString(R\"(\n      attribute {name: \"POS\" value: \"IMP\"}\n      attribute {name: \"Number\" value: \"Singular\"})\",\n                                      &si2);\n  TextFormat::ParseFromString(R\"(\n      attribute {name: \"Number\" value: \"Plural\"}\n      attribute {name: \"POS\" value: \"IMP\"})\",\n                                      &pi);\n  TextFormat::ParseFromString(R\"(\n      attribute {name: 
\"Number\" value: \"Plural\"}\n      attribute {name: \"POS\" value: \"IMP\"}\n      attribute {name: \"x\" value: \"x\"})\",\n                                      &six);\n\n  // Check that LookupExisting returns -1 for non-existing entries.\n  EXPECT_EQ(-1, label_set_.LookupExisting(si1));\n  EXPECT_EQ(-1, label_set_.LookupExisting(si2));\n  EXPECT_EQ(0, label_set_.Size());\n\n  // Check that adding returns 0 (this is the only possibility given Size()).\n  EXPECT_EQ(0, label_set_.Add(si1));\n  EXPECT_EQ(0, label_set_.Add(si1));  // calling Add twice adds only once\n  EXPECT_EQ(1, label_set_.Size());\n\n  // Check that order of attributes does not matter.\n  EXPECT_EQ(0, label_set_.LookupExisting(si2));\n\n  // Check that un-added entries still are not present.\n  EXPECT_EQ(-1, label_set_.LookupExisting(pi));\n  EXPECT_EQ(-1, label_set_.LookupExisting(six));\n\n  // Check that we can add them.\n  EXPECT_EQ(1, label_set_.Add(pi));\n  EXPECT_EQ(2, label_set_.Add(six));\n  EXPECT_EQ(3, label_set_.Size());\n}\n\n// Test write and deserializing constructor.\nTEST_F(MorphologyLabelSetTest, Serialization) {\n  TokenMorphology si;  // singular, imperative\n  TokenMorphology pi;  // plural, imperative\n  TextFormat::ParseFromString(R\"(\n      attribute {name: \"Number\" value: \"Singular\"}\n      attribute {name: \"POS\" value: \"IMP\"})\",\n                                      &si);\n  TextFormat::ParseFromString(R\"(\n      attribute {name: \"Number\" value: \"Plural\"}\n      attribute {name: \"POS\" value: \"IMP\"})\",\n                                      &pi);\n  EXPECT_EQ(0, label_set_.Add(si));\n  EXPECT_EQ(1, label_set_.Add(pi));\n\n  // Serialize and deserialize.\n  string fname = utils::JoinPath({tensorflow::testing::TmpDir(), \"label-set\"});\n  label_set_.Write(fname);\n  MorphologyLabelSet label_set2(fname);\n  EXPECT_EQ(0, label_set2.LookupExisting(si));\n  EXPECT_EQ(1, label_set2.LookupExisting(pi));\n  EXPECT_EQ(2, label_set2.Size());\n}\n\n}  // namespace 
syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/ops/parser_ops.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"tensorflow/core/framework/op.h\"\n\nnamespace syntaxnet {\n\n// -----------------------------------------------------------------------------\n\nREGISTER_OP(\"GoldParseReader\")\n    .Output(\"features: feature_size * string\")\n    .Output(\"num_epochs: int32\")\n    .Output(\"gold_actions: int32\")\n    .Attr(\"task_context: string\")\n    .Attr(\"feature_size: int\")\n    .Attr(\"batch_size: int\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"arg_prefix: string='brain_parser'\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nReads sentences, parses them, and returns (gold action, feature) pairs.\n\nfeatures: features firing at the current parser state, encoded as\n          dist_belief.SparseFeatures protocol buffers.\nnum_epochs: number of times this reader went over the training corpus.\ngold_actions: action to perform at the current parser state.\ntask_context: file path at which to read the task context.\nfeature_size: number of feature outputs emitted by this reader.\nbatch_size: number of sentences to parse at a time.\ncorpus_name: name of task input in the task context to read parses from.\narg_prefix: prefix for context parameters.\n)doc\");\n\nREGISTER_OP(\"DecodedParseReader\")\n    .Input(\"transition_scores: float\")\n    .Output(\"features: 
feature_size * string\")\n    .Output(\"num_epochs: int32\")\n    .Output(\"eval_metrics: int32\")\n    .Output(\"documents: string\")\n    .Attr(\"task_context: string\")\n    .Attr(\"feature_size: int\")\n    .Attr(\"batch_size: int\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"arg_prefix: string='brain_parser'\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nReads sentences and parses them taking parsing transitions based on the\ninput transition scores.\n\ntransition_scores: scores for every transition from the current parser state.\nfeatures: features firing at the current parser state encoded as\n          dist_belief.SparseFeatures protocol buffers.\nnum_epochs: number of times this reader went over the training corpus.\neval_metrics: token counts used to compute evaluation metrics.\ntask_context: file path at which to read the task context.\nfeature_size: number of feature outputs emitted by this reader.\nbatch_size: number of sentences to parse at a time.\ncorpus_name: name of task input in the task context to read parses from.\narg_prefix: prefix for context parameters.\n)doc\");\n\nREGISTER_OP(\"BeamParseReader\")\n    .Output(\"features: feature_size * string\")\n    .Output(\"beam_state: int64\")\n    .Output(\"num_epochs: int32\")\n    .Attr(\"task_context: string\")\n    .Attr(\"feature_size: int\")\n    .Attr(\"beam_size: int\")\n    .Attr(\"batch_size: int=1\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"allow_feature_weights: bool=true\")\n    .Attr(\"arg_prefix: string='brain_parser'\")\n    .Attr(\"continue_until_all_final: bool=false\")\n    .Attr(\"always_start_new_sentences: bool=false\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nReads sentences and creates a beam parser.\n\nfeatures: features firing at the initial parser state encoded as\n          dist_belief.SparseFeatures protocol buffers.\nbeam_state: beam state handle.\ntask_context: file path at which to read the task context.\nfeature_size: number of 
feature outputs emitted by this reader.\nbeam_size: limit on the beam size.\ncorpus_name: name of task input in the task context to read parses from.\nallow_feature_weights: whether the op is expected to output weighted features.\n                       If false, it will check that no weights are specified.\narg_prefix: prefix for context parameters.\ncontinue_until_all_final: whether to continue parsing after the gold path falls\n                          off the beam.\nalways_start_new_sentences: whether to skip to the beginning of a new sentence\n                            after each training step.\n)doc\");\n\nREGISTER_OP(\"BeamParser\")\n    .Input(\"beam_state: int64\")\n    .Input(\"transition_scores: float\")\n    .Output(\"features: feature_size * string\")\n    .Output(\"next_beam_state: int64\")\n    .Output(\"alive: bool\")\n    .Attr(\"feature_size: int\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nUpdates the beam parser based on scores in the input transition scores.\n\nbeam_state: beam state.\ntransition_scores: scores for every transition from the current parser state.\nfeatures: features firing at the current parser state encoded as\n          dist_belief.SparseFeatures protocol buffers.\nnext_beam_state: beam state handle.\nalive: whether the gold state is still in the beam.\nfeature_size: number of feature outputs emitted by this reader.\n)doc\");\n\nREGISTER_OP(\"BeamParserOutput\")\n    .Input(\"beam_state: int64\")\n    .Output(\"indices_and_paths: int32\")\n    .Output(\"batches_and_slots: int32\")\n    .Output(\"gold_slot: int32\")\n    .Output(\"path_scores: float\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nConverts the current state of the beam parser into a set of indices into\nthe scoring matrices that lead there.\n\nbeam_state: beam state handle.\nindices_and_paths: matrix whose first row is a vector to look up beam paths and\n                   decisions with, and whose second row are the corresponding\n                   path 
ids.\nbatches_and_slots: matrix whose first row is a vector identifying the batch to\n                   which the paths correspond, and whose second row are the\n                   slots.\ngold_slot: location in final beam of the gold path [batch_size].\npath_scores: cumulative sum of scores along each path in each beam. Within each\n             beam, scores are sorted from low to high.\n)doc\");\n\nREGISTER_OP(\"BeamEvalOutput\")\n    .Input(\"beam_state: int64\")\n    .Output(\"eval_metrics: int32\")\n    .Output(\"documents: string\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nComputes eval metrics for the best paths in the input beams.\n\nbeam_state: beam state handle.\neval_metrics: token counts used to compute evaluation metrics.\ndocuments: parsed documents.\n)doc\");\n\nREGISTER_OP(\"LexiconBuilder\")\n    .Attr(\"task_context: string\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"lexicon_max_prefix_length: int = 3\")\n    .Attr(\"lexicon_max_suffix_length: int = 3\")\n    .Doc(R\"doc(\nAn op that collects term statistics over a corpus and saves a set of term maps.\n\ntask_context: file path at which to read the task context.\ncorpus_name: name of the context input to compute lexicons.\nlexicon_max_prefix_length: maximum prefix length for lexicon words.\nlexicon_max_suffix_length: maximum suffix length for lexicon words.\n)doc\");\n\nREGISTER_OP(\"FeatureSize\")\n    .Attr(\"task_context: string\")\n    .Output(\"feature_sizes: int32\")\n    .Output(\"domain_sizes: int32\")\n    .Output(\"embedding_dims: int32\")\n    .Output(\"num_actions: int32\")\n    .Attr(\"arg_prefix: string='brain_parser'\")\n    .Doc(R\"doc(\nAn op that returns the number and domain sizes of parser features.\n\ntask_context: file path at which to read the task context.\nfeature_sizes: number of feature locators in each group of parser features.\ndomain_sizes: domain size for each feature group of parser features.\nembedding_dims: embedding dimension for each feature 
group of parser features.\nnum_actions: number of actions a parser can perform.\narg_prefix: prefix for context parameters.\n)doc\");\n\nREGISTER_OP(\"UnpackSparseFeatures\")\n    .Input(\"sf: string\")\n    .Output(\"indices: int32\")\n    .Output(\"ids: int64\")\n    .Output(\"weights: float\")\n    .Doc(R\"doc(\nConverts a vector of strings with SparseFeatures to tensors.\n\nNote that indices, ids and weights are vectors of the same size and have\none-to-one correspondence between their elements. ids and weights are each\nobtained by sequentially concatenating sf[i].id and sf[i].weight, for i in\n1...size(sf). Note that if sf[i].weight is not set, the default value for the\nweight is assumed to be 1.0. Also for any j, if ids[j] and weights[j] were\nextracted from sf[i], then indices[j] is set to i.\n\nsf: vector of string, where each element is the string encoding of a\n    SparseFeatures proto.\nindices: vector of indices inside sf.\nids: vector of ids extracted from the SparseFeatures protos.\nweights: vector of weights extracted from the SparseFeatures protos.\n)doc\");\n\nREGISTER_OP(\"WordEmbeddingInitializer\")\n    .Output(\"word_embeddings: float\")\n    .Attr(\"vectors: string\")\n    .Attr(\"task_context: string\")\n    .Attr(\"embedding_init: float = 1.0\")\n    .Doc(R\"doc(\nReads word embeddings from an sstable of dist_belief.TokenEmbedding protos for\nevery word specified in a text vocabulary file.\n\nword_embeddings: a tensor containing word embeddings from the specified sstable.\nvectors: path to recordio of word embedding vectors.\ntask_context: file path at which to read the task context.\n)doc\");\n\nREGISTER_OP(\"DocumentSource\")\n    .Output(\"documents: string\")\n    .Output(\"last: bool\")\n    .Attr(\"task_context: string\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"batch_size: int\")\n    .SetIsStateful()\n    .Doc(R\"doc(\nReads documents from documents_path and outputs them.\n\ndocuments: a vector of documents as serialized 
protos.\nlast: whether this is the last batch of documents from this document path.\nbatch_size: how many documents to read at once.\n)doc\");\n\nREGISTER_OP(\"DocumentSink\")\n    .Input(\"documents: string\")\n    .Attr(\"task_context: string\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Doc(R\"doc(\nWrites documents to documents_path.\n\ndocuments: documents to write.\n)doc\");\n\nREGISTER_OP(\"WellFormedFilter\")\n    .Input(\"documents: string\")\n    .Output(\"filtered: string\")\n    .Attr(\"task_context: string\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"keep_malformed_documents: bool = false\")\n    .Doc(R\"doc(\n)doc\");\n\nREGISTER_OP(\"ProjectivizeFilter\")\n    .Input(\"documents: string\")\n    .Output(\"filtered: string\")\n    .Attr(\"task_context: string\")\n    .Attr(\"corpus_name: string='documents'\")\n    .Attr(\"discard_non_projective: bool = false\")\n    .Doc(R\"doc(\n)doc\");\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_eval.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"A program to annotate a conll file with a tensorflow neural net parser.\"\"\"\n\n\nimport os\nimport os.path\nimport time\nimport tempfile\nimport tensorflow as tf\n\nfrom tensorflow.python.platform import gfile\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom google.protobuf import text_format\n\nfrom syntaxnet import sentence_pb2\nfrom syntaxnet import graph_builder\nfrom syntaxnet import structured_graph_builder\nfrom syntaxnet.ops import gen_parser_ops\nfrom syntaxnet import task_spec_pb2\n\nflags = tf.app.flags\nFLAGS = flags.FLAGS\n\n\nflags.DEFINE_string('task_context', '',\n                    'Path to a task context with inputs and parameters for '\n                    'feature extractors.')\nflags.DEFINE_string('resource_dir', '',\n                    'Optional base directory for task context resources.')\nflags.DEFINE_string('model_path', '', 'Path to model parameters.')\nflags.DEFINE_string('arg_prefix', None, 'Prefix for context parameters.')\nflags.DEFINE_string('graph_builder', 'greedy',\n                    'Which graph builder to use, either greedy or structured.')\nflags.DEFINE_string('input', 'stdin',\n                    'Name of the context input to read data from.')\nflags.DEFINE_string('output', 'stdout',\n           
         'Name of the context input to write data to.')\nflags.DEFINE_string('hidden_layer_sizes', '200,200',\n                    'Comma separated list of hidden layer sizes.')\nflags.DEFINE_integer('batch_size', 32,\n                     'Number of sentences to process in parallel.')\nflags.DEFINE_integer('beam_size', 8, 'Number of slots for beam parsing.')\nflags.DEFINE_integer('max_steps', 1000, 'Max number of steps to take.')\nflags.DEFINE_bool('slim_model', False,\n                  'Whether to expect only averaged variables.')\n\n\ndef RewriteContext(task_context):\n  context = task_spec_pb2.TaskSpec()\n  with gfile.FastGFile(task_context) as fin:\n    text_format.Merge(fin.read(), context)\n  for resource in context.input:\n    for part in resource.part:\n      if part.file_pattern != '-':\n        part.file_pattern = os.path.join(FLAGS.resource_dir, part.file_pattern)\n  with tempfile.NamedTemporaryFile(delete=False) as fout:\n    fout.write(str(context))\n    return fout.name\n\n\ndef Eval(sess):\n  \"\"\"Builds and evaluates a network.\"\"\"\n  task_context = FLAGS.task_context\n  if FLAGS.resource_dir:\n    task_context = RewriteContext(task_context)\n  feature_sizes, domain_sizes, embedding_dims, num_actions = sess.run(\n      gen_parser_ops.feature_size(task_context=task_context,\n                                  arg_prefix=FLAGS.arg_prefix))\n\n  t = time.time()\n  hidden_layer_sizes = map(int, FLAGS.hidden_layer_sizes.split(','))\n  logging.info('Building training network with parameters: feature_sizes: %s '\n               'domain_sizes: %s', feature_sizes, domain_sizes)\n  if FLAGS.graph_builder == 'greedy':\n    parser = graph_builder.GreedyParser(num_actions,\n                                        feature_sizes,\n                                        domain_sizes,\n                                        embedding_dims,\n                                        hidden_layer_sizes,\n                                        
gate_gradients=True,\n                                        arg_prefix=FLAGS.arg_prefix)\n  else:\n    parser = structured_graph_builder.StructuredGraphBuilder(\n        num_actions,\n        feature_sizes,\n        domain_sizes,\n        embedding_dims,\n        hidden_layer_sizes,\n        gate_gradients=True,\n        arg_prefix=FLAGS.arg_prefix,\n        beam_size=FLAGS.beam_size,\n        max_steps=FLAGS.max_steps)\n  parser.AddEvaluation(task_context,\n                       FLAGS.batch_size,\n                       corpus_name=FLAGS.input,\n                       evaluation_max_steps=FLAGS.max_steps)\n\n  parser.AddSaver(FLAGS.slim_model)\n  sess.run(parser.inits.values())\n  parser.saver.restore(sess, FLAGS.model_path)\n\n  sink_documents = tf.placeholder(tf.string)\n  sink = gen_parser_ops.document_sink(sink_documents,\n                                      task_context=task_context,\n                                      corpus_name=FLAGS.output)\n  t = time.time()\n  num_epochs = None\n  num_tokens = 0\n  num_correct = 0\n  num_documents = 0\n  while True:\n    tf_eval_epochs, tf_eval_metrics, tf_documents = sess.run([\n        parser.evaluation['epochs'],\n        parser.evaluation['eval_metrics'],\n        parser.evaluation['documents'],\n    ])\n\n    if len(tf_documents):\n      logging.info('Processed %d documents', len(tf_documents))\n      num_documents += len(tf_documents)\n      sess.run(sink, feed_dict={sink_documents: tf_documents})\n\n    num_tokens += tf_eval_metrics[0]\n    num_correct += tf_eval_metrics[1]\n    if num_epochs is None:\n      num_epochs = tf_eval_epochs\n    elif num_epochs < tf_eval_epochs:\n      break\n\n  logging.info('Total processed documents: %d', num_documents)\n  if num_tokens > 0:\n    eval_metric = 100.0 * num_correct / num_tokens\n    logging.info('num correct tokens: %d', num_correct)\n    logging.info('total tokens: %d', num_tokens)\n    logging.info('Seconds elapsed in evaluation: %.2f, '\n                 
'eval metric: %.2f%%', time.time() - t, eval_metric)\n\n\ndef main(unused_argv):\n  logging.set_verbosity(logging.INFO)\n  with tf.Session() as sess:\n    Eval(sess)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_features.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/parser_features.h\"\n\n#include <string>\n\n#include \"syntaxnet/registry.h\"\n#include \"syntaxnet/sentence_features.h\"\n#include \"syntaxnet/workspace.h\"\n\nnamespace syntaxnet {\n\n// Registry for the parser feature functions.\nREGISTER_CLASS_REGISTRY(\"parser feature function\", ParserFeatureFunction);\n\n// Registry for the parser state + token index feature functions.\nREGISTER_CLASS_REGISTRY(\"parser+index feature function\",\n                        ParserIndexFeatureFunction);\n\nRootFeatureType::RootFeatureType(const string &name,\n                                 const FeatureType &wrapped_type,\n                                 int root_value)\n    : FeatureType(name), wrapped_type_(wrapped_type), root_value_(root_value) {}\n\nstring RootFeatureType::GetFeatureValueName(FeatureValue value) const {\n  if (value == root_value_) return \"<ROOT>\";\n  return wrapped_type_.GetFeatureValueName(value);\n}\n\nFeatureValue RootFeatureType::GetDomainSize() const {\n  return wrapped_type_.GetDomainSize() + 1;\n}\n\n// Parser feature locator for accessing the remaining input tokens in the parser\n// state. 
It takes the offset relative to the current input token as argument.\n// Negative values represent tokens to the left, positive values to the right\n// and 0 (the default argument value) represents the current input token.\nclass InputParserLocator : public ParserLocator<InputParserLocator> {\n public:\n  // Gets the new focus.\n  int GetFocus(const WorkspaceSet &workspaces, const ParserState &state) const {\n    const int offset = argument();\n    return state.Input(offset);\n  }\n};\n\nREGISTER_PARSER_FEATURE_FUNCTION(\"input\", InputParserLocator);\n\n// Parser feature locator for accessing the stack in the parser state. The\n// argument represents the position on the stack, 0 being the top of the stack.\nclass StackParserLocator : public ParserLocator<StackParserLocator> {\n public:\n  // Gets the new focus.\n  int GetFocus(const WorkspaceSet &workspaces, const ParserState &state) const {\n    const int position = argument();\n    return state.Stack(position);\n  }\n};\n\nREGISTER_PARSER_FEATURE_FUNCTION(\"stack\", StackParserLocator);\n\n// Parser feature locator for locating the head of the focus token. The argument\n// specifies the number of times the head function is applied. Please note that\n// this operates on partially built dependency trees.\nclass HeadFeatureLocator : public ParserIndexLocator<HeadFeatureLocator> {\n public:\n  // Updates the current focus to a new location. If the initial focus is\n  // outside the range of the sentence, returns -2.\n  void UpdateArgs(const WorkspaceSet &workspaces, const ParserState &state,\n                  int *focus) const {\n    if (*focus < -1 || *focus >= state.sentence().token_size()) {\n      *focus = -2;\n      return;\n    }\n    const int levels = argument();\n    *focus = state.Parent(*focus, levels);\n  }\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"head\", HeadFeatureLocator);\n\n// Parser feature locator for locating children of the focus token. 
The argument\n// specifies the number of times the leftmost (when the argument is < 0) or\n// rightmost (when the argument > 0) child function is applied. Please note that\n// this operates on partially built dependency trees.\nclass ChildFeatureLocator : public ParserIndexLocator<ChildFeatureLocator> {\n public:\n  // Updates the current focus to a new location. If the initial focus is\n  // outside the range of the sentence, returns -2.\n  void UpdateArgs(const WorkspaceSet &workspaces, const ParserState &state,\n                  int *focus) const {\n    if (*focus < -1 || *focus >= state.sentence().token_size()) {\n      *focus = -2;\n      return;\n    }\n    const int levels = argument();\n    if (levels < 0) {\n      *focus = state.LeftmostChild(*focus, -levels);\n    } else {\n      *focus = state.RightmostChild(*focus, levels);\n    }\n  }\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"child\", ChildFeatureLocator);\n\n// Parser feature locator for locating siblings of the focus token. The argument\n// specifies the sibling position relative to the focus token: a negative value\n// triggers a search to the left, while a positive value one to the right.\n// Please note that this operates on partially built dependency trees.\nclass SiblingFeatureLocator\n    : public ParserIndexLocator<SiblingFeatureLocator> {\n public:\n  // Updates the current focus to a new location. 
If the initial focus is\n  // outside the range of the sentence, returns -2.\n  void UpdateArgs(const WorkspaceSet &workspaces, const ParserState &state,\n                  int *focus) const {\n    if (*focus < -1 || *focus >= state.sentence().token_size()) {\n      *focus = -2;\n      return;\n    }\n    const int position = argument();\n    if (position < 0) {\n      *focus = state.LeftSibling(*focus, -position);\n    } else {\n      *focus = state.RightSibling(*focus, position);\n    }\n  }\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"sibling\", SiblingFeatureLocator);\n\n// Feature function for computing the label from focus token. Note that this\n// does not use the precomputed values, since we get the labels from the stack;\n// the reason it utilizes sentence_features::Label is to obtain the label map.\nclass LabelFeatureFunction : public BasicParserSentenceFeatureFunction<Label> {\n public:\n  // Computes the label of the relation between the focus token and its parent.\n  // Valid focus values range from -1 to sentence->size() - 1, inclusively.\n  FeatureValue Compute(const WorkspaceSet &workspaces, const ParserState &state,\n                       int focus, const FeatureVector *result) const override {\n    if (focus == -1) return RootValue();\n    if (focus < -1 || focus >= state.sentence().token_size()) {\n      return feature_.NumValues();\n    }\n    const int label = state.Label(focus);\n    return label == -1 ? 
RootValue() : label;\n  }\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"label\", LabelFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Word> WordFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"word\", WordFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Char> CharFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"char\", CharFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Tag> TagFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"tag\", TagFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Digit> DigitFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"digit\", DigitFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Hyphen> HyphenFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"hyphen\", HyphenFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Capitalization>\n    CapitalizationFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"capitalization\",\n                                     CapitalizationFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<PunctuationAmount>\n    PunctuationAmountFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"punctuation-amount\",\n                                     PunctuationAmountFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<Quote>\n    QuoteFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"quote\",\n                                     QuoteFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<PrefixFeature> PrefixFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"prefix\", PrefixFeatureFunction);\n\ntypedef BasicParserSentenceFeatureFunction<SuffixFeature> SuffixFeatureFunction;\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"suffix\", SuffixFeatureFunction);\n\n// Parser feature function that can use nested sentence feature functions for\n// feature extraction.\nclass ParserTokenFeatureFunction : public NestedFeatureFunction<\n  FeatureFunction<Sentence, 
int>, ParserState, int> {\n public:\n  void Preprocess(WorkspaceSet *workspaces, ParserState *state) const override {\n    for (auto *function : nested_) {\n      function->Preprocess(workspaces, state->mutable_sentence());\n    }\n  }\n\n  void Evaluate(const WorkspaceSet &workspaces, const ParserState &state,\n                int focus, FeatureVector *result) const override {\n    for (auto *function : nested_) {\n      function->Evaluate(workspaces, state.sentence(), focus, result);\n    }\n  }\n\n  // Returns the first nested feature's computed value.\n  FeatureValue Compute(const WorkspaceSet &workspaces, const ParserState &state,\n                       int focus, const FeatureVector *result) const override {\n    if (nested_.empty()) return -1;\n    return nested_[0]->Compute(workspaces, state.sentence(), focus, result);\n  }\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"token\",\n                                     ParserTokenFeatureFunction);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_features.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Sentence-based features for the transition parser.\n\n#ifndef SYNTAXNET_PARSER_FEATURES_H_\n#define SYNTAXNET_PARSER_FEATURES_H_\n\n#include <string>\n\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/feature_types.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/workspace.h\"\n\nnamespace syntaxnet {\n\n// A union used to represent discrete and continuous feature values.\nunion FloatFeatureValue {\n public:\n  explicit FloatFeatureValue(FeatureValue v) : discrete_value(v) {}\n  FloatFeatureValue(uint32 i, float w) : id(i), weight(w) {}\n  FeatureValue discrete_value;\n  struct {\n    uint32 id;\n    float weight;\n  };\n};\n\ntypedef FeatureFunction<ParserState> ParserFeatureFunction;\n\n// Feature function for the transition parser based on a parser state object and\n// a token index. 
This typically extracts information from a given token.\ntypedef FeatureFunction<ParserState, int> ParserIndexFeatureFunction;\n\n// Utilities to register the two types of parser features.\n#define REGISTER_PARSER_FEATURE_FUNCTION(name, component) \\\n  REGISTER_FEATURE_FUNCTION(ParserFeatureFunction, name, component)\n\n#define REGISTER_PARSER_IDX_FEATURE_FUNCTION(name, component) \\\n  REGISTER_FEATURE_FUNCTION(ParserIndexFeatureFunction, name, component)\n\n// Alias for locator type that takes a parser state, and produces a focus\n// integer that can be used on nested ParserIndexFeature objects.\ntemplate<class DER>\nusing ParserLocator = FeatureAddFocusLocator<DER, ParserState, int>;\n\n// Alias for Locator type features that take (ParserState, int) signatures and\n// call other ParserIndexFeatures.\ntemplate<class DER>\nusing ParserIndexLocator = FeatureLocator<DER, ParserState, int>;\n\n// Feature extractor for the transition parser based on a parser state object.\ntypedef FeatureExtractor<ParserState> ParserFeatureExtractor;\n\n// A simple wrapper FeatureType that adds a special \"<ROOT>\" type.\nclass RootFeatureType : public FeatureType {\n public:\n  // Creates a RootFeatureType that wraps a given type and adds the special\n  // \"<ROOT>\" value in root_value.\n  RootFeatureType(const string &name, const FeatureType &wrapped_type,\n                  int root_value);\n\n  // Returns the feature value name, but with the special \"<ROOT>\" value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the original number of features plus one for the \"<ROOT>\" value.\n  FeatureValue GetDomainSize() const override;\n\n private:\n  // A wrapped type that handles everything else besides \"<ROOT>\".\n  const FeatureType &wrapped_type_;\n\n  // The reserved root value.\n  int root_value_;\n};\n\n// Simple feature function that wraps a Sentence based feature\n// function. 
It adds a \"<ROOT>\" feature value that is triggered whenever the\n// focus is the special root token. This class is sub-classed based on the\n// extracted arguments of the nested function.\ntemplate<class F>\nclass ParserSentenceFeatureFunction : public ParserIndexFeatureFunction {\n public:\n  // Instantiates and sets up the nested feature.\n  void Setup(TaskContext *context) override {\n    this->feature_.set_descriptor(this->descriptor());\n    this->feature_.set_prefix(this->prefix());\n    this->feature_.set_extractor(this->extractor());\n    feature_.Setup(context);\n  }\n\n  // Initializes the nested feature and sets feature type.\n  void Init(TaskContext *context) override {\n    feature_.Init(context);\n    num_base_values_ = feature_.GetFeatureType()->GetDomainSize();\n    set_feature_type(new RootFeatureType(\n        name(), *feature_.GetFeatureType(), RootValue()));\n  }\n\n  // Passes workspace requests and preprocessing to the nested feature.\n  void RequestWorkspaces(WorkspaceRegistry *registry) override {\n    feature_.RequestWorkspaces(registry);\n  }\n\n  void Preprocess(WorkspaceSet *workspaces, ParserState *state) const override {\n    feature_.Preprocess(workspaces, state->mutable_sentence());\n  }\n\n protected:\n  // Returns the special value to represent a root token.\n  FeatureValue RootValue() const { return num_base_values_; }\n\n  // Store the number of base values from the wrapped function so we can\n  // compute the root value.\n  int num_base_values_;\n\n  // The wrapped feature.\n  F feature_;\n};\n\n// Specialization of ParserSentenceFeatureFunction that calls the nested feature\n// with (Sentence, int) arguments based on the current integer focus.\ntemplate<class F>\nclass BasicParserSentenceFeatureFunction :\n      public ParserSentenceFeatureFunction<F> {\n public:\n  FeatureValue Compute(const WorkspaceSet &workspaces, const ParserState &state,\n                       int focus, const FeatureVector *result) const override {\n    if 
(focus == -1) return this->RootValue();\n    return this->feature_.Compute(workspaces, state.sentence(), focus, result);\n  }\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_PARSER_FEATURES_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_features_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/parser_features.h\"\n\n#include <string>\n\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/populate_test_inputs.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\n// Feature extractor for the transition parser based on a parser state object.\ntypedef FeatureExtractor<ParserState, int> ParserIndexFeatureExtractor;\n\n// Test fixture for parser features.\nclass ParserFeatureFunctionTest : public ::testing::Test {\n protected:\n  // Sets up a parser state.\n  void SetUp() override {\n    // Prepare a document.\n    const char *kTaggedDocument =\n        \"text: 'I saw a man with a telescope.' 
\"\n        \"token { word: 'I' start: 0 end: 0 tag: 'PRP' category: 'PRON'\"\n        \" break_level: NO_BREAK } \"\n        \"token { word: 'saw' start: 2 end: 4 tag: 'VBD' category: 'VERB'\"\n        \" break_level: SPACE_BREAK } \"\n        \"token { word: 'a' start: 6 end: 6 tag: 'DT' category: 'DET'\"\n        \" break_level: SPACE_BREAK } \"\n        \"token { word: 'man' start: 8 end: 10 tag: 'NN' category: 'NOUN'\"\n        \" break_level: SPACE_BREAK } \"\n        \"token { word: 'with' start: 12 end: 15 tag: 'IN' category: 'ADP'\"\n        \" break_level: SPACE_BREAK } \"\n        \"token { word: 'a' start: 17 end: 17 tag: 'DT' category: 'DET'\"\n        \" break_level: SPACE_BREAK } \"\n        \"token { word: 'telescope' start: 19 end: 27 tag: 'NN' category: 'NOUN'\"\n        \" break_level: SPACE_BREAK } \"\n        \"token { word: '.' start: 28 end: 28 tag: '.' category: '.'\"\n        \" break_level: NO_BREAK }\";\n    CHECK(TextFormat::ParseFromString(kTaggedDocument, &sentence_));\n    creators_ = PopulateTestInputs::Defaults(sentence_);\n\n    // Prepare a label map. 
By adding labels in lexicographic order we make sure\n    // the term indices stay the same after sorting (which happens when the\n    // label map is saved to disk).\n    label_map_.Increment(\"NULL\");\n    label_map_.Increment(\"ROOT\");\n    label_map_.Increment(\"det\");\n    label_map_.Increment(\"dobj\");\n    label_map_.Increment(\"nsubj\");\n    label_map_.Increment(\"p\");\n    label_map_.Increment(\"pobj\");\n    label_map_.Increment(\"prep\");\n    creators_.Add(\"label-map\", \"text\", \"\", [this](const string &filename) {\n      label_map_.Save(filename);\n    });\n\n    // Prepare a parser state.\n    state_.reset(new ParserState(&sentence_, nullptr /* no transition state */,\n                                 &label_map_));\n  }\n\n  // Prepares a feature for computations.\n  string ExtractFeature(const string &feature_name) {\n    context_.mutable_spec()->mutable_input()->Clear();\n    context_.mutable_spec()->mutable_output()->Clear();\n    feature_extractor_.reset(new ParserFeatureExtractor());\n    feature_extractor_->Parse(feature_name);\n    feature_extractor_->Setup(&context_);\n    creators_.Populate(&context_);\n    feature_extractor_->Init(&context_);\n    feature_extractor_->RequestWorkspaces(&registry_);\n    workspaces_.Reset(registry_);\n    feature_extractor_->Preprocess(&workspaces_, state_.get());\n    FeatureVector result;\n    feature_extractor_->ExtractFeatures(workspaces_, *state_, &result);\n    return result.type(0)->GetFeatureValueName(result.value(0));\n  }\n\n  std::unique_ptr<ParserState> state_;\n  Sentence sentence_;\n  WorkspaceSet workspaces_;\n  TermFrequencyMap label_map_;\n\n  PopulateTestInputs::CreatorMap creators_;\n  TaskContext context_;\n  WorkspaceRegistry registry_;\n  std::unique_ptr<ParserFeatureExtractor> feature_extractor_;\n};\n\nTEST_F(ParserFeatureFunctionTest, TagFeatureFunction) {\n  state_->Push(-1);\n  state_->Push(0);\n  EXPECT_EQ(\"PRP\", ExtractFeature(\"input.tag\"));\n  EXPECT_EQ(\"VBD\", 
ExtractFeature(\"input(1).tag\"));\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(\"input(10).tag\"));\n  EXPECT_EQ(\"PRP\", ExtractFeature(\"stack(0).tag\"));\n  EXPECT_EQ(\"<ROOT>\", ExtractFeature(\"stack(1).tag\"));\n}\n\nTEST_F(ParserFeatureFunctionTest, LabelFeatureFunction) {\n  // Construct a partial dependency tree.\n  state_->AddArc(0, 1, 4);\n  state_->AddArc(1, -1, 1);\n  state_->AddArc(2, 3, 2);\n  state_->AddArc(3, 1, 3);\n  state_->AddArc(5, 6, 2);\n  state_->AddArc(6, 4, 6);\n  state_->AddArc(7, 1, 5);\n\n  // Test the feature function.\n  EXPECT_EQ(label_map_.GetTerm(4), ExtractFeature(\"input.label\"));\n  EXPECT_EQ(\"ROOT\", ExtractFeature(\"input(1).label\"));\n  EXPECT_EQ(label_map_.GetTerm(2), ExtractFeature(\"input(2).label\"));\n\n  // Push the artificial root token onto the stack. This triggers the wrapped\n  // <ROOT> value, rather than indicating a token with the label \"ROOT\" (which\n  // may or may not be the artificial root token).\n  state_->Push(-1);\n  EXPECT_EQ(\"<ROOT>\", ExtractFeature(\"stack.label\"));\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_state.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/parser_state.h\"\n\n#include \"syntaxnet/kbest_syntax.pb.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\nconst char ParserState::kRootLabel[] = \"ROOT\";\n\nParserState::ParserState(Sentence *sentence,\n                         ParserTransitionState *transition_state,\n                         const TermFrequencyMap *label_map)\n    : sentence_(sentence),\n      num_tokens_(sentence->token_size()),\n      transition_state_(transition_state),\n      label_map_(label_map),\n      root_label_(kDefaultRootLabel),\n      next_(0) {\n  // Initialize the stack. Some transition systems could also push the\n  // artificial root on the stack, so we make room for that as well.\n  stack_.reserve(num_tokens_ + 1);\n\n  // Allocate space for head indices and labels. Initialize the head for all\n  // tokens to be the artificial root node, i.e. 
token -1.\n  head_.resize(num_tokens_, -1);\n  label_.resize(num_tokens_, RootLabel());\n\n  // Transition system-specific preprocessing.\n  if (transition_state_ != nullptr) transition_state_->Init(this);\n}\n\nParserState::~ParserState() { delete transition_state_; }\n\nParserState *ParserState::Clone() const {\n  ParserState *new_state = new ParserState();\n  new_state->sentence_ = sentence_;\n  new_state->num_tokens_ = num_tokens_;\n  new_state->alternative_ = alternative_;\n  new_state->transition_state_ =\n      (transition_state_ == nullptr ? nullptr : transition_state_->Clone());\n  new_state->label_map_ = label_map_;\n  new_state->root_label_ = root_label_;\n  new_state->next_ = next_;\n  new_state->stack_.assign(stack_.begin(), stack_.end());\n  new_state->head_.assign(head_.begin(), head_.end());\n  new_state->label_.assign(label_.begin(), label_.end());\n  new_state->score_ = score_;\n  new_state->is_gold_ = is_gold_;\n  return new_state;\n}\n\nint ParserState::RootLabel() const { return root_label_; }\n\nint ParserState::Next() const {\n  DCHECK_GE(next_, -1);\n  DCHECK_LE(next_, num_tokens_);\n  return next_;\n}\n\nint ParserState::Input(int offset) const {\n  int index = next_ + offset;\n  return index >= -1 && index < num_tokens_ ? index : -2;\n}\n\nvoid ParserState::Advance() {\n  DCHECK_LT(next_, num_tokens_);\n  ++next_;\n}\n\nbool ParserState::EndOfInput() const { return next_ == num_tokens_; }\n\nvoid ParserState::Push(int index) {\n  DCHECK_LE(stack_.size(), num_tokens_);\n  stack_.push_back(index);\n}\n\nint ParserState::Pop() {\n  DCHECK(!StackEmpty());\n  const int result = stack_.back();\n  stack_.pop_back();\n  return result;\n}\n\nint ParserState::Top() const {\n  DCHECK(!StackEmpty());\n  return stack_.back();\n}\n\nint ParserState::Stack(int position) const {\n  if (position < 0) return -2;\n  const int index = stack_.size() - 1 - position;\n  return (index < 0) ? 
-2 : stack_[index];\n}\n\nint ParserState::StackSize() const { return stack_.size(); }\n\nbool ParserState::StackEmpty() const { return stack_.empty(); }\n\nint ParserState::Head(int index) const {\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  return index == -1 ? -1 : head_[index];\n}\n\nint ParserState::Label(int index) const {\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  return index == -1 ? RootLabel() : label_[index];\n}\n\nint ParserState::Parent(int index, int n) const {\n  // Find the n-th parent by applying the head function n times.\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  while (n-- > 0) index = Head(index);\n  return index;\n}\n\nint ParserState::LeftmostChild(int index, int n) const {\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  while (n-- > 0) {\n    // Find the leftmost child by scanning from start until a child is\n    // encountered.\n    int i;\n    for (i = -1; i < index; ++i) {\n      if (Head(i) == index) break;\n    }\n    if (i == index) return -2;\n    index = i;\n  }\n  return index;\n}\n\nint ParserState::RightmostChild(int index, int n) const {\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  while (n-- > 0) {\n    // Find the rightmost child by scanning backward from end until a child\n    // is encountered.\n    int i;\n    for (i = num_tokens_ - 1; i > index; --i) {\n      if (Head(i) == index) break;\n    }\n    if (i == index) return -2;\n    index = i;\n  }\n  return index;\n}\n\nint ParserState::LeftSibling(int index, int n) const {\n  // Find the n-th left sibling by scanning left until the n-th child of the\n  // parent is encountered.\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  if (index == -1 && n > 0) return -2;\n  int i = index;\n  while (n > 0) {\n    --i;\n    if (i == -1) return -2;\n    if (Head(i) == Head(index)) --n;\n  }\n  return i;\n}\n\nint ParserState::RightSibling(int index, int n) const {\n  // Find the n-th 
right sibling by scanning right until the n-th child of the\n  // parent is encountered.\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  if (index == -1 && n > 0) return -2;\n  int i = index;\n  while (n > 0) {\n    ++i;\n    if (i == num_tokens_) return -2;\n    if (Head(i) == Head(index)) --n;\n  }\n  return i;\n}\n\nvoid ParserState::AddArc(int index, int head, int label) {\n  DCHECK_GE(index, 0);\n  DCHECK_LT(index, num_tokens_);\n  head_[index] = head;\n  label_[index] = label;\n}\n\nint ParserState::GoldHead(int index) const {\n  // A valid ParserState index is transformed to a valid Sentence index,\n  // then the gold head is extracted.\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  if (index == -1) return -1;\n  const int offset = 0;\n  const int gold_head = GetToken(index).head();\n  return gold_head == -1 ? -1 : gold_head - offset;\n}\n\nint ParserState::GoldLabel(int index) const {\n  // A valid ParserState index is transformed to a valid Sentence index,\n  // then the gold label is extracted.\n  DCHECK_GE(index, -1);\n  DCHECK_LT(index, num_tokens_);\n  if (index == -1) return RootLabel();\n  string gold_label;\n  gold_label = GetToken(index).label();\n  return label_map_->LookupIndex(gold_label, RootLabel() /* unknown */);\n}\n\nvoid ParserState::AddParseToDocument(Sentence *sentence,\n                                     bool rewrite_root_labels) const {\n  transition_state_->AddParseToDocument(*this, rewrite_root_labels, sentence);\n}\n\nbool ParserState::IsTokenCorrect(int index) const {\n  return transition_state_->IsTokenCorrect(*this, index);\n}\n\nstring ParserState::LabelAsString(int label) const {\n  if (label == root_label_) return \"ROOT\";\n  if (label >= 0 && label < label_map_->Size()) {\n    return label_map_->GetTerm(label);\n  }\n  return \"\";\n}\n\nstring ParserState::ToString() const {\n  return transition_state_->ToString(*this);\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_state.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Parser state for the transition-based dependency parser.\n\n#ifndef SYNTAXNET_PARSER_STATE_H_\n#define SYNTAXNET_PARSER_STATE_H_\n\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/kbest_syntax.pb.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/sentence.pb.h\"\n\nnamespace syntaxnet {\n\nclass TermFrequencyMap;\n\n// A ParserState object represents the state of the parser during the parsing of\n// a sentence. The state consists of a pointer to the next input token and a\n// stack of partially processed tokens. The parser state can be changed by\n// applying a sequence of transitions. Some transitions also add relations\n// to the dependency tree of the sentence. The parser state also records the\n// (partial) parse tree for the sentence by recording the head of each token and\n// the label of this relation. 
The state is used for both training and parsing.\nclass ParserState {\n public:\n  // String representation of the root label.\n  static const char kRootLabel[];\n\n  // Default value for the root label in case it's not in the label map.\n  static const int kDefaultRootLabel = -1;\n\n  // Initializes the parser state for a sentence, using an additional transition\n  // state for preprocessing and/or additional information specific to the\n  // transition system. The transition state is allowed to be null, in which\n  // case no additional work is performed. The parser state takes ownership of\n  // the transition state. A label map is used for transforming between integer\n  // and string representations of the labels.\n  ParserState(Sentence *sentence,\n              ParserTransitionState *transition_state,\n              const TermFrequencyMap *label_map);\n\n  // Deletes the parser state.\n  ~ParserState();\n\n  // Clones the parser state.\n  ParserState *Clone() const;\n\n  // Returns the root label.\n  int RootLabel() const;\n\n  // Returns the index of the next input token.\n  int Next() const;\n\n  // Returns the number of tokens in the sentence.\n  int NumTokens() const { return num_tokens_; }\n\n  // Returns the token index relative to the next input token. If no such token\n  // exists, returns -2.\n  int Input(int offset) const;\n\n  // Advances to the next input token.\n  void Advance();\n\n  // Returns true if all tokens have been processed.\n  bool EndOfInput() const;\n\n  // Pushes an element to the stack.\n  void Push(int index);\n\n  // Pops the top element from stack and returns it.\n  int Pop();\n\n  // Returns the element from the top of the stack.\n  int Top() const;\n\n  // Returns the element at a certain position in the stack. Stack(0) is the top\n  // stack element. 
If no such position exists, returns -2.\n  int Stack(int position) const;\n\n  // Returns the number of elements on the stack.\n  int StackSize() const;\n\n  // Returns true if the stack is empty.\n  bool StackEmpty() const;\n\n  // Returns the head index for a given token.\n  int Head(int index) const;\n\n  // Returns the label of the relation to head for a given token.\n  int Label(int index) const;\n\n  // Returns the parent of a given token 'n' levels up in the tree.\n  int Parent(int index, int n) const;\n\n  // Returns the leftmost child of a given token 'n' levels down in the tree. If\n  // no such child exists, returns -2.\n  int LeftmostChild(int index, int n) const;\n\n  // Returns the rightmost child of a given token 'n' levels down in the tree.\n  // If no such child exists, returns -2.\n  int RightmostChild(int index, int n) const;\n\n  // Returns the n-th left sibling of a given token. If no such sibling exists,\n  // returns -2.\n  int LeftSibling(int index, int n) const;\n\n  // Returns the n-th right sibling of a given token. If no such sibling exists,\n  // returns -2.\n  int RightSibling(int index, int n) const;\n\n  // Adds an arc to the partial dependency tree of the state.\n  void AddArc(int index, int head, int label);\n\n  // Returns the gold head index for a given token, based on the underlying\n  // annotated sentence.\n  int GoldHead(int index) const;\n\n  // Returns the gold label for a given token, based on the underlying annotated\n  // sentence.\n  int GoldLabel(int index) const;\n\n  // Get a reference to the underlying token at index. Returns an empty default\n  // Token if accessing the root.\n  const Token &GetToken(int index) const {\n    if (index == -1) return kRootToken;\n    return sentence().token(index);\n  }\n\n  // Annotates a document with the dependency relations built during parsing for\n  // one of its sentences. 
If rewrite_root_labels is true, then all tokens with\n  // no heads will be assigned the default root label \"ROOT\".\n  void AddParseToDocument(Sentence *document, bool rewrite_root_labels) const;\n\n  // As above, but uses the default of rewrite_root_labels = true.\n  void AddParseToDocument(Sentence *document) const {\n    AddParseToDocument(document, true);\n  }\n\n  // Whether a parsed token should be considered correct for evaluation.\n  bool IsTokenCorrect(int index) const;\n\n  // Returns the string representation of a dependency label, or an empty string\n  // if the label is invalid.\n  string LabelAsString(int label) const;\n\n  // Returns a string representation of the parser state.\n  string ToString() const;\n\n  // Returns the underlying sentence instance.\n  const Sentence &sentence() const { return *sentence_; }\n  Sentence *mutable_sentence() const { return sentence_; }\n\n  // Returns the transition system-specific state.\n  const ParserTransitionState *transition_state() const {\n    return transition_state_;\n  }\n  ParserTransitionState *mutable_transition_state() {\n    return transition_state_;\n  }\n\n  // Gets/sets the flag which says that the state was obtained through gold\n  // transitions only.\n  bool is_gold() const { return is_gold_; }\n  void set_is_gold(bool is_gold) { is_gold_ = is_gold; }\n\n private:\n  // Empty constructor used for the cloning operation.\n  ParserState() {}\n\n  // Default value for the root token.\n  const Token kRootToken;\n\n  // Sentence to parse. Not owned.\n  Sentence *sentence_ = nullptr;\n\n  // Number of tokens in the sentence to parse.\n  int num_tokens_;\n\n  // Which alternative token analysis is used for tag/category/head/label\n  // information. -1 means use default.\n  int alternative_ = -1;\n\n  // Transition system-specific state. 
Owned.\n  ParserTransitionState *transition_state_ = nullptr;\n\n  // Label map used for conversions between integer and string representations\n  // of the dependency labels. Not owned.\n  const TermFrequencyMap *label_map_ = nullptr;\n\n  // Root label.\n  int root_label_;\n\n  // Index of the next input token.\n  int next_;\n\n  // Parse stack of partially processed tokens.\n  vector<int> stack_;\n\n  // List of head positions for the (partial) dependency tree.\n  vector<int> head_;\n\n  // List of dependency relation labels describing the (partial) dependency\n  // tree.\n  vector<int> label_;\n\n  // Score of the parser state.\n  double score_ = 0.0;\n\n  // True if this is the gold standard sequence (used for structured learning).\n  bool is_gold_ = false;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(ParserState);\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_PARSER_STATE_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_trainer.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"A program to train a tensorflow neural net parser from a CoNLL file.\"\"\"\n\n\n\nimport os\nimport os.path\nimport time\nimport tensorflow as tf\n\nfrom tensorflow.python.platform import gfile\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom google.protobuf import text_format\n\nfrom syntaxnet import graph_builder\nfrom syntaxnet import structured_graph_builder\nfrom syntaxnet.ops import gen_parser_ops\nfrom syntaxnet import task_spec_pb2\n\nflags = tf.app.flags\nFLAGS = flags.FLAGS\n\nflags.DEFINE_string('tf_master', '',\n                    'TensorFlow execution engine to connect to.')\nflags.DEFINE_string('output_path', '', 'Top level for output.')\nflags.DEFINE_string('task_context', '',\n                    'Path to a task context with resource locations and '\n                    'parameters.')\nflags.DEFINE_string('arg_prefix', None, 'Prefix for context parameters.')\nflags.DEFINE_string('params', '0', 'Unique identifier of parameter grid point.')\nflags.DEFINE_string('training_corpus', 'training-corpus',\n                    'Name of the context input to read training data from.')\nflags.DEFINE_string('tuning_corpus', 'tuning-corpus',\n                    'Name of the context input to read tuning data 
from.')\nflags.DEFINE_string('word_embeddings', None,\n                    'Recordio containing pretrained word embeddings, will be '\n                    'loaded as the first embedding matrix.')\nflags.DEFINE_bool('compute_lexicon', False, '')\nflags.DEFINE_bool('projectivize_training_set', False, '')\nflags.DEFINE_string('hidden_layer_sizes', '200,200',\n                    'Comma separated list of hidden layer sizes.')\nflags.DEFINE_string('graph_builder', 'greedy',\n                    'Graph builder to use, either \"greedy\" or \"structured\".')\nflags.DEFINE_integer('batch_size', 32,\n                     'Number of sentences to process in parallel.')\nflags.DEFINE_integer('beam_size', 10, 'Number of slots for beam parsing.')\nflags.DEFINE_integer('num_epochs', 10, 'Number of epochs to train for.')\nflags.DEFINE_integer('max_steps', 50,\n                     'Max number of parser steps during a training step.')\nflags.DEFINE_integer('report_every', 100,\n                     'Report cost and training accuracy every this many steps.')\nflags.DEFINE_integer('checkpoint_every', 5000,\n                     'Measure tuning UAS and checkpoint every this many steps.')\nflags.DEFINE_bool('slim_model', False,\n                  'Whether to remove non-averaged variables, for compactness.')\nflags.DEFINE_float('learning_rate', 0.1, 'Initial learning rate parameter.')\nflags.DEFINE_integer('decay_steps', 4000,\n                     'Decay learning rate by 0.96 every this many steps.')\nflags.DEFINE_float('momentum', 0.9,\n                   'Momentum parameter for momentum optimizer.')\nflags.DEFINE_string('seed', '0', 'Initialization seed for TF variables.')\nflags.DEFINE_string('pretrained_params', None,\n                    'Path to model from which to load params.')\nflags.DEFINE_string('pretrained_params_names', None,\n                    'List of names of tensors to load from pretrained model.')\nflags.DEFINE_float('averaging_decay', 0.9999,\n                   
'Decay for exponential moving average when computing '\n                   'averaged parameters, set to 1 to do vanilla averaging.')\n\n\ndef StageName():\n  return os.path.join(FLAGS.arg_prefix, FLAGS.graph_builder)\n\n\ndef OutputPath(path):\n  return os.path.join(FLAGS.output_path, StageName(), FLAGS.params, path)\n\n\ndef RewriteContext():\n  context = task_spec_pb2.TaskSpec()\n  with gfile.FastGFile(FLAGS.task_context) as fin:\n    text_format.Merge(fin.read(), context)\n  for resource in context.input:\n    if resource.creator == StageName():\n      del resource.part[:]\n      part = resource.part.add()\n      part.file_pattern = os.path.join(OutputPath(resource.name))\n  with gfile.FastGFile(OutputPath('context'), 'w') as fout:\n    fout.write(str(context))\n\n\ndef WriteStatus(num_steps, eval_metric, best_eval_metric):\n  status = os.path.join(os.getenv('GOOGLE_STATUS_DIR') or '/tmp', 'STATUS')\n  message = ('Parameters: %s | Steps: %d | Tuning score: %.2f%% | '\n             'Best tuning score: %.2f%%' % (FLAGS.params, num_steps,\n                                            eval_metric, best_eval_metric))\n  with gfile.FastGFile(status, 'w') as fout:\n    fout.write(message)\n  with gfile.FastGFile(OutputPath('status'), 'a') as fout:\n    fout.write(message + '\\n')\n\n\ndef Eval(sess, parser, num_steps, best_eval_metric):\n  \"\"\"Evaluates a network and checkpoints it to disk.\n\n  Args:\n    sess: tensorflow session to use\n    parser: graph builder containing all ops references\n    num_steps: number of training steps taken, for logging\n    best_eval_metric: current best eval metric, to decide whether this model is\n        the best so far\n\n  Returns:\n    new best eval metric\n  \"\"\"\n  logging.info('Evaluating training network.')\n  t = time.time()\n  num_epochs = None\n  num_tokens = 0\n  num_correct = 0\n  while True:\n    tf_eval_epochs, tf_eval_metrics = sess.run([\n        parser.evaluation['epochs'], parser.evaluation['eval_metrics']\n    
])\n    num_tokens += tf_eval_metrics[0]\n    num_correct += tf_eval_metrics[1]\n    if num_epochs is None:\n      num_epochs = tf_eval_epochs\n    elif num_epochs < tf_eval_epochs:\n      break\n  eval_metric = 0 if num_tokens == 0 else (100.0 * num_correct / num_tokens)\n  logging.info('Seconds elapsed in evaluation: %.2f, '\n               'eval metric: %.2f%%', time.time() - t, eval_metric)\n  WriteStatus(num_steps, eval_metric, max(eval_metric, best_eval_metric))\n\n  # Save parameters.\n  if FLAGS.output_path:\n    logging.info('Writing out trained parameters.')\n    parser.saver.save(sess, OutputPath('latest-model'))\n    if eval_metric > best_eval_metric:\n      parser.saver.save(sess, OutputPath('model'))\n\n  return max(eval_metric, best_eval_metric)\n\n\ndef Train(sess, num_actions, feature_sizes, domain_sizes, embedding_dims):\n  \"\"\"Builds and trains the network.\n\n  Args:\n    sess: tensorflow session to use.\n    num_actions: number of possible golden actions.\n    feature_sizes: size of each feature vector.\n    domain_sizes: number of possible feature ids in each feature vector.\n    embedding_dims: embedding dimension to use for each feature group.\n  \"\"\"\n  t = time.time()\n  hidden_layer_sizes = map(int, FLAGS.hidden_layer_sizes.split(','))\n  logging.info('Building training network with parameters: feature_sizes: %s '\n               'domain_sizes: %s', feature_sizes, domain_sizes)\n\n  if FLAGS.graph_builder == 'greedy':\n    parser = graph_builder.GreedyParser(num_actions,\n                                        feature_sizes,\n                                        domain_sizes,\n                                        embedding_dims,\n                                        hidden_layer_sizes,\n                                        seed=int(FLAGS.seed),\n                                        gate_gradients=True,\n                                        averaging_decay=FLAGS.averaging_decay,\n                                      
  arg_prefix=FLAGS.arg_prefix)\n  else:\n    parser = structured_graph_builder.StructuredGraphBuilder(\n        num_actions,\n        feature_sizes,\n        domain_sizes,\n        embedding_dims,\n        hidden_layer_sizes,\n        seed=int(FLAGS.seed),\n        gate_gradients=True,\n        averaging_decay=FLAGS.averaging_decay,\n        arg_prefix=FLAGS.arg_prefix,\n        beam_size=FLAGS.beam_size,\n        max_steps=FLAGS.max_steps)\n\n  task_context = OutputPath('context')\n  if FLAGS.word_embeddings is not None:\n    parser.AddPretrainedEmbeddings(0, FLAGS.word_embeddings, task_context)\n\n  corpus_name = ('projectivized-training-corpus' if\n                 FLAGS.projectivize_training_set else FLAGS.training_corpus)\n  parser.AddTraining(task_context,\n                     FLAGS.batch_size,\n                     learning_rate=FLAGS.learning_rate,\n                     momentum=FLAGS.momentum,\n                     decay_steps=FLAGS.decay_steps,\n                     corpus_name=corpus_name)\n  parser.AddEvaluation(task_context,\n                       FLAGS.batch_size,\n                       corpus_name=FLAGS.tuning_corpus)\n  parser.AddSaver(FLAGS.slim_model)\n\n  # Save graph.\n  if FLAGS.output_path:\n    with gfile.FastGFile(OutputPath('graph'), 'w') as f:\n      f.write(sess.graph_def.SerializeToString())\n\n  logging.info('Initializing...')\n  num_epochs = 0\n  cost_sum = 0.0\n  num_steps = 0\n  best_eval_metric = 0.0\n  sess.run(parser.inits.values())\n\n  if FLAGS.pretrained_params is not None:\n    logging.info('Loading pretrained params from %s', FLAGS.pretrained_params)\n    feed_dict = {'save/Const:0': FLAGS.pretrained_params}\n    targets = []\n    for node in sess.graph_def.node:\n      if (node.name.startswith('save/Assign') and\n          node.input[0] in FLAGS.pretrained_params_names.split(',')):\n        logging.info('Loading %s with op %s', node.input[0], node.name)\n        targets.append(node.name)\n    sess.run(targets, 
feed_dict=feed_dict)\n\n  logging.info('Training...')\n  while num_epochs < FLAGS.num_epochs:\n    tf_epochs, tf_cost, _ = sess.run([parser.training[\n        'epochs'], parser.training['cost'], parser.training['train_op']])\n    num_epochs = tf_epochs\n    num_steps += 1\n    cost_sum += tf_cost\n    if num_steps % FLAGS.report_every == 0:\n      logging.info('Epochs: %d, num steps: %d, '\n                   'seconds elapsed: %.2f, avg cost: %.2f, ', num_epochs,\n                   num_steps, time.time() - t, cost_sum / FLAGS.report_every)\n      cost_sum = 0.0\n    if num_steps % FLAGS.checkpoint_every == 0:\n      best_eval_metric = Eval(sess, parser, num_steps, best_eval_metric)\n\n\ndef main(unused_argv):\n  logging.set_verbosity(logging.INFO)\n  if not gfile.IsDirectory(OutputPath('')):\n    gfile.MakeDirs(OutputPath(''))\n\n  # Rewrite context.\n  RewriteContext()\n\n  # Creates necessary term maps.\n  if FLAGS.compute_lexicon:\n    logging.info('Computing lexicon...')\n    with tf.Session(FLAGS.tf_master) as sess:\n      gen_parser_ops.lexicon_builder(task_context=OutputPath('context'),\n                                     corpus_name=FLAGS.training_corpus).run()\n  with tf.Session(FLAGS.tf_master) as sess:\n    feature_sizes, domain_sizes, embedding_dims, num_actions = sess.run(\n        gen_parser_ops.feature_size(task_context=OutputPath('context'),\n                                    arg_prefix=FLAGS.arg_prefix))\n\n  # Filter out malformed sentences and projectivize the training set.\n  if FLAGS.projectivize_training_set:\n    logging.info('Preprocessing...')\n    with tf.Session(FLAGS.tf_master) as sess:\n      source, last = gen_parser_ops.document_source(\n          task_context=OutputPath('context'),\n          batch_size=FLAGS.batch_size,\n          corpus_name=FLAGS.training_corpus)\n      sink = gen_parser_ops.document_sink(\n          task_context=OutputPath('context'),\n          corpus_name='projectivized-training-corpus',\n          
documents=gen_parser_ops.projectivize_filter(\n              gen_parser_ops.well_formed_filter(source,\n                                                task_context=OutputPath(\n                                                    'context')),\n              task_context=OutputPath('context')))\n      while True:\n        tf_last, _ = sess.run([last, sink])\n        if tf_last:\n          break\n\n  logging.info('Training...')\n  with tf.Session(FLAGS.tf_master) as sess:\n    Train(sess, num_actions, feature_sizes, domain_sizes, embedding_dims)\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_trainer_test.sh",
    "content": "#!/bin/bash\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# This test trains a parser on a small dataset, then runs it in greedy mode and\n# in structured mode with beam 1, and checks that the result is identical.\n\nset -eux\n\nBINDIR=$TEST_SRCDIR/$TEST_WORKSPACE/syntaxnet\nCONTEXT=$BINDIR/testdata/context.pbtxt\nTMP_DIR=/tmp/syntaxnet-output\n\nmkdir -p $TMP_DIR\nsed \"s=SRCDIR=$TEST_SRCDIR=\" \"$CONTEXT\" | \\\n  sed \"s=OUTPATH=$TMP_DIR=\" > $TMP_DIR/context\n\nPARAMS=128-0.08-3600-0.9-0\n\n\"$BINDIR/parser_trainer\" \\\n  --arg_prefix=brain_parser \\\n  --batch_size=32 \\\n  --compute_lexicon \\\n  --decay_steps=3600 \\\n  --graph_builder=greedy \\\n  --hidden_layer_sizes=128 \\\n  --learning_rate=0.08 \\\n  --momentum=0.9 \\\n  --output_path=$TMP_DIR \\\n  --task_context=$TMP_DIR/context \\\n  --training_corpus=training-corpus \\\n  --tuning_corpus=tuning-corpus \\\n  --params=$PARAMS \\\n  --num_epochs=12 \\\n  --report_every=100 \\\n  --checkpoint_every=1000 \\\n  --logtostderr\n\n\"$BINDIR/parser_eval\" \\\n  --task_context=$TMP_DIR/brain_parser/greedy/$PARAMS/context \\\n  --hidden_layer_sizes=128 \\\n  --input=tuning-corpus \\\n  --output=stdout \\\n  --arg_prefix=brain_parser \\\n  --graph_builder=greedy \\\n  --model_path=$TMP_DIR/brain_parser/greedy/$PARAMS/model \\\n  --logtostderr \\\n  > 
$TMP_DIR/greedy-out\n\n\"$BINDIR/parser_eval\" \\\n  --task_context=$TMP_DIR/context \\\n  --hidden_layer_sizes=128 \\\n  --beam_size=1 \\\n  --input=tuning-corpus \\\n  --output=stdout \\\n  --arg_prefix=brain_parser \\\n  --graph_builder=structured \\\n  --model_path=$TMP_DIR/brain_parser/greedy/$PARAMS/model \\\n  --logtostderr \\\n  > $TMP_DIR/struct-beam1-out\n\ndiff $TMP_DIR/greedy-out $TMP_DIR/struct-beam1-out\n\nSTRUCT_PARAMS=128-0.001-3600-0.9-0\n\n\"$BINDIR/parser_trainer\" \\\n  --arg_prefix=brain_parser \\\n  --batch_size=8 \\\n  --compute_lexicon \\\n  --decay_steps=3600 \\\n  --graph_builder=structured \\\n  --hidden_layer_sizes=128 \\\n  --learning_rate=0.001 \\\n  --momentum=0.9 \\\n  --pretrained_params=$TMP_DIR/brain_parser/greedy/$PARAMS/model \\\n  --pretrained_params_names=\\\nembedding_matrix_0,embedding_matrix_1,embedding_matrix_2,bias_0,weights_0 \\\n  --output_path=$TMP_DIR \\\n  --task_context=$TMP_DIR/context \\\n  --training_corpus=training-corpus \\\n  --tuning_corpus=tuning-corpus \\\n  --params=$STRUCT_PARAMS \\\n  --num_epochs=20 \\\n  --report_every=25 \\\n  --checkpoint_every=200 \\\n  --logtostderr\n\n\"$BINDIR/parser_eval\" \\\n  --task_context=$TMP_DIR/context \\\n  --hidden_layer_sizes=128 \\\n  --beam_size=8 \\\n  --input=tuning-corpus \\\n  --output=stdout \\\n  --arg_prefix=brain_parser \\\n  --graph_builder=structured \\\n  --model_path=$TMP_DIR/brain_parser/structured/$STRUCT_PARAMS/model \\\n  --logtostderr \\\n  > $TMP_DIR/struct-beam8-out\n\necho \"PASS\"\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_transitions.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/parser_transitions.h\"\n\n#include \"syntaxnet/parser_state.h\"\n\nnamespace syntaxnet {\n\n// Transition system registry.\nREGISTER_CLASS_REGISTRY(\"transition system\", ParserTransitionSystem);\n\nvoid ParserTransitionSystem::PerformAction(ParserAction action,\n                                           ParserState *state) const {\n  PerformActionWithoutHistory(action, state);\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/parser_transitions.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Transition system for the transition-based dependency parser.\n\n#ifndef SYNTAXNET_PARSER_TRANSITIONS_H_\n#define SYNTAXNET_PARSER_TRANSITIONS_H_\n\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/registry.h\"\n\nnamespace tensorflow {\nnamespace io {\nclass RecordReader;\nclass RecordWriter;\n}\n}\n\nnamespace syntaxnet {\n\nclass Sentence;\nclass ParserState;\nclass TaskContext;\n\n// Parser actions for the transition system are encoded as integers.\ntypedef int ParserAction;\n\n// Label type for the parser action.\nenum class LabelType {\n  NO_LABEL = 0,\n  LEFT_LABEL = 1,\n  RIGHT_LABEL = 2,\n};\n\n// Transition system-specific state. 
Transition systems can subclass this to\n// preprocess the parser state and/or to keep additional information during\n// parsing.\nclass ParserTransitionState {\n public:\n  virtual ~ParserTransitionState() {}\n\n  // Clones the transition state.\n  virtual ParserTransitionState *Clone() const = 0;\n\n  // Initializes a parser state for the transition system.\n  virtual void Init(ParserState *state) = 0;\n\n  virtual void AddParseToDocument(const ParserState &state,\n                                  bool rewrite_root_labels,\n                                  Sentence *sentence) const {}\n\n  // Whether a parsed token should be considered correct for evaluation.\n  virtual bool IsTokenCorrect(const ParserState &state, int index) const = 0;\n\n  // Returns a human readable string representation of this state.\n  virtual string ToString(const ParserState &state) const = 0;\n};\n\n// A transition system is used for handling the parser state transitions. During\n// training the transition system is used for extracting a canonical sequence of\n// transitions for an annotated sentence. During parsing the transition system\n// is used for applying the predicted transitions to the parse state and\n// therefore build the parse tree for the sentence. Transition systems can be\n// implemented by subclassing this abstract class and registered using the\n// REGISTER_TRANSITION_SYSTEM macro.\nclass ParserTransitionSystem\n    : public RegisterableClass<ParserTransitionSystem> {\n public:\n  // Construction and cleanup.\n  ParserTransitionSystem() {}\n  virtual ~ParserTransitionSystem() {}\n\n  // Sets up the transition system. 
If inputs are needed, this is the place to\n  // specify them.\n  virtual void Setup(TaskContext *context) {}\n\n  // Initializes the transition system.\n  virtual void Init(TaskContext *context) {}\n\n  // Reads the transition system from disk.\n  virtual void Read(tensorflow::io::RecordReader *reader) {}\n\n  // Writes the transition system to disk.\n  virtual void Write(tensorflow::io::RecordWriter *writer) const {}\n\n  // Returns the number of action types.\n  virtual int NumActionTypes() const = 0;\n\n  // Returns the number of actions.\n  virtual int NumActions(int num_labels) const = 0;\n\n  // Internally creates the set of outcomes (when transition systems support a\n  // variable number of actions).\n  virtual void CreateOutcomeSet(int num_labels) {}\n\n  // Returns the default action for a given state.\n  virtual ParserAction GetDefaultAction(const ParserState &state) const = 0;\n\n  // Returns the next gold action for the parser during training using the\n  // dependency relations found in the underlying annotated sentence.\n  virtual ParserAction GetNextGoldAction(const ParserState &state) const = 0;\n\n  // Returns all next gold actions for the parser during training using the\n  // dependency relations found in the underlying annotated sentence.\n  virtual void GetAllNextGoldActions(const ParserState &state,\n                                     vector<ParserAction> *actions) const {\n    ParserAction action = GetNextGoldAction(state);\n    *actions = {action};\n  }\n\n  // Internally counts all next gold actions from the current parser state.\n  virtual void CountAllNextGoldActions(const ParserState &state) {}\n\n  // Returns the number of atomic actions within the specified ParserAction.\n  virtual int ActionLength(ParserAction action) const { return 1; }\n\n  // Returns true if the action is allowed in the given parser state.\n  virtual bool IsAllowedAction(ParserAction action,\n                               const ParserState &state) const = 
0;\n\n  // Performs the specified action on a given parser state. The action is not\n  // saved in the state's history.\n  virtual void PerformActionWithoutHistory(ParserAction action,\n                                           ParserState *state) const = 0;\n\n  // Performs the specified action on a given parser state. The action is saved\n  // in the state's history.\n  void PerformAction(ParserAction action, ParserState *state) const;\n\n  // Returns true if a given state is deterministic.\n  virtual bool IsDeterministicState(const ParserState &state) const = 0;\n\n  // Returns true if no more actions can be applied to a given parser state.\n  virtual bool IsFinalState(const ParserState &state) const = 0;\n\n  // Returns a string representation of a parser action.\n  virtual string ActionAsString(ParserAction action,\n                                const ParserState &state) const = 0;\n\n  // Returns a new transition state that can be used to put additional\n  // information in a parser state. By specifying if we are in training_mode\n  // (true) or not (false), we can construct a different transition state\n  // depending on whether we are training a model or parsing new documents. A\n  // null return value means we don't need to add anything to the parser state.\n  virtual ParserTransitionState *NewTransitionState(bool training_mode) const {\n    return nullptr;\n  }\n\n  // Whether to back off to the best allowable transition rather than the\n  // default action when the highest scoring action is not allowed.  
Some\n  // transition systems do not degrade gracefully to the default action and so\n  // should return true for this function.\n  virtual bool BackOffToBestAllowableTransition() const { return false; }\n\n  // Whether the system returns multiple gold transitions from a single\n  // configuration.\n  virtual bool ReturnsMultipleGoldTransitions() const { return false; }\n\n  // Whether the system allows non-projective trees.\n  virtual bool AllowsNonProjective() const { return false; }\n\n  // Action meta data: get pointers to token indices based on meta-info about\n  // (state, action) pairs. NOTE: the following interface is somewhat\n  // experimental and may be subject to change. Use with caution and ask\n  // djweiss@ for details.\n\n  // Whether or not the system supports computing meta-data about actions.\n  virtual bool SupportsActionMetaData() const { return false; }\n\n  // Get the index of the child that would be created by this action. -1 for\n  // no child created.\n  virtual int ChildIndex(const ParserState &state,\n                         const ParserAction &action) const {\n    return -1;\n  }\n\n  // Get the index of the parent that would gain a new child by this action. -1\n  // for no parent modified.\n  virtual int ParentIndex(const ParserState &state,\n                          const ParserAction &action) const {\n    return -1;\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(ParserTransitionSystem);\n};\n\n#define REGISTER_TRANSITION_SYSTEM(type, component) \\\n  REGISTER_CLASS_COMPONENT(ParserTransitionSystem, type, component)\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_PARSER_TRANSITIONS_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/populate_test_inputs.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/populate_test_inputs.h\"\n\n#include <map>\n#include <utility>\n\n#include \"syntaxnet/dictionary.pb.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"gtest/gtest.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/io/record_writer.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/platform/env.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\nvoid PopulateTestInputs::CreatorMap::Add(\n    const string &name, const string &file_format, const string &record_format,\n    PopulateTestInputs::CreateFile makefile) {\n  (*this)[name] = [name, file_format, record_format,\n                   makefile](TaskInput *input) {\n    makefile(AddPart(input, file_format, record_format));\n  };\n}\n\nbool PopulateTestInputs::CreatorMap::Populate(TaskContext *context) const {\n  return PopulateTestInputs::Populate(*this, context);\n}\n\nPopulateTestInputs::CreatorMap PopulateTestInputs::Defaults(\n    const Sentence &document) {\n  CreatorMap creators;\n  creators[\"category-map\"] =\n      CreateTFMapFromDocumentTokens(document, 
TokenCategory);\n  creators[\"label-map\"] = CreateTFMapFromDocumentTokens(document, TokenLabel);\n  creators[\"tag-map\"] = CreateTFMapFromDocumentTokens(document, TokenTag);\n  creators[\"tag-to-category\"] = CreateTagToCategoryFromTokens(document);\n  creators[\"word-map\"] = CreateTFMapFromDocumentTokens(document, TokenWord);\n  return creators;\n}\n\nbool PopulateTestInputs::Populate(\n    const std::unordered_map<string, Create> &creator_map,\n    TaskContext *context) {\n  TaskSpec *spec = context->mutable_spec();\n  bool found_all_inputs = true;\n\n  // Fail if a mandatory input is not found.\n  auto name_not_found = [&found_all_inputs](TaskInput *input) {\n    found_all_inputs = false;\n  };\n\n  for (TaskInput &input : *spec->mutable_input()) {\n    auto it = creator_map.find(input.name());\n    (it == creator_map.end() ? name_not_found : it->second)(&input);\n\n    // Check for compatibility with declared supported formats.\n    for (const auto &part : input.part()) {\n      if (!TaskContext::Supports(input, part.file_format(),\n                                 part.record_format())) {\n        LOG(FATAL) << \"Input \" << input.name()\n                   << \" does not support file of type \" << part.file_format()\n                   << \"/\" << part.record_format();\n      }\n    }\n  }\n  return found_all_inputs;\n}\n\nPopulateTestInputs::Create PopulateTestInputs::CreateTFMapFromDocumentTokens(\n    const Sentence &document,\n    std::function<vector<string>(const Token &)> token2str) {\n  return [document, token2str](TaskInput *input) {\n    TermFrequencyMap map;\n\n    // Build and write the dummy term frequency map.\n    for (const Token &token : document.token()) {\n      vector<string> strings_for_token = token2str(token);\n      for (const string &s : strings_for_token) map.Increment(s);\n    }\n    string file_name = AddPart(input, \"text\", \"\");\n    map.Save(file_name);\n  };\n}\n\nPopulateTestInputs::Create 
PopulateTestInputs::CreateTagToCategoryFromTokens(\n    const Sentence &document) {\n  return [document](TaskInput *input) {\n    TagToCategoryMap tag_to_category;\n    for (auto &token : document.token()) {\n      if (token.has_tag()) {\n        tag_to_category.SetCategory(token.tag(), token.category());\n      }\n    }\n    const string file_name = AddPart(input, \"text\", \"\");\n    tag_to_category.Save(file_name);\n  };\n}\n\nvector<string> PopulateTestInputs::TokenCategory(const Token &token) {\n  if (token.has_category()) return {token.category()};\n  return {};\n}\n\nvector<string> PopulateTestInputs::TokenLabel(const Token &token) {\n  if (token.has_label()) return {token.label()};\n  return {};\n}\n\nvector<string> PopulateTestInputs::TokenTag(const Token &token) {\n  if (token.has_tag()) return {token.tag()};\n  return {};\n}\n\nvector<string> PopulateTestInputs::TokenWord(const Token &token) {\n  if (token.has_word()) return {token.word()};\n  return {};\n}\n\nstring PopulateTestInputs::AddPart(TaskInput *input, const string &file_format,\n                                   const string &record_format) {\n  string file_name =\n      tensorflow::strings::StrCat(\n          tensorflow::testing::TmpDir(), input->name());\n  auto *part = CHECK_NOTNULL(input)->add_part();\n  part->set_file_pattern(file_name);\n  part->set_file_format(file_format);\n  part->set_record_format(record_format);\n  return file_name;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/populate_test_inputs.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// A utility for populating a set of inputs of a task.  This knows how to create\n// tag-map, category-map, label-map and has hooks to\n// populate other kinds of inputs.  The expected set of operations is:\n//\n// Sentence document_for_init = ...;\n// TaskContext context;\n// context->SetParameter(\"my_parameter\", \"true\");\n// MyDocumentProcessor processor;\n// processor.Setup(&context);\n// PopulateTestInputs::Defaults(document_for_init).Populate(&context);\n// processor.Init(&context);\n//\n// This will check the inputs requested by the processor's Setup(TaskContext *)\n// function, and create files corresponding to them.  For example, if the\n// processor asked for a \"tag-map\" input, it will create a TermFrequencyMap,\n// populate it with the POS tags found in the Sentence document_for_init, save\n// it to disk and update the TaskContext with the location of the file.  By\n// convention, the location is the name of the input. Conceptually, the logic is\n// very simple:\n//\n// for (TaskInput &input : context->mutable_spec()->mutable_input()) {\n//   creators[input.name()](&input);\n//   // check for missing inputs, incompatible formats, etc...\n// }\n//\n// The Populate() routine will also check compatibility between requested and\n// supplied formats. 
The Default mapping knows how to populate the following\n// inputs:\n//\n//  - category-map: TermFrequencyMap containing POS categories.\n//\n//  - label-map: TermFrequencyMap containing parser labels.\n//\n//  - tag-map: TermFrequencyMap containing POS tags.\n//\n//  - tag-to-category: StringToStringMap mapping POS tags to categories.\n//\n//  - word-map: TermFrequencyMap containing words.\n//\n// Clients can add creation routines by defining a std::function:\n//\n// auto creators = PopulateTestInputs::Defaults(document_for_init);\n// creators[\"my-input\"] = [](TaskInput *input) { ...; }\n//\n// See also creators.Add() for more convenience functions.\n\n#ifndef SYNTAXNET_POPULATE_TEST_INPUTS_H_\n#define SYNTAXNET_POPULATE_TEST_INPUTS_H_\n\n#include <functional>\n#include <string>\n#include <unordered_map>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\nclass Sentence;\nclass TaskContext;\nclass TaskInput;\nclass TaskOutput;\nclass Token;\n\nclass PopulateTestInputs {\n public:\n  // When called, Create() should populate an input by creating a file and\n  // adding one or more parts to the TaskInput.\n  typedef std::function<void(TaskInput *)> Create;\n\n  // When called, CreateFile() should create a file resource at the given\n  // path. These are typically more convenient to write.\n  typedef std::function<void(const string &)> CreateFile;\n\n  // A set of creators, one for each input in a TaskContext.\n  class CreatorMap : public std::unordered_map<string, Create> {\n   public:\n    // A simplified way to add a single-file creator.  The name of the file\n    // location will be file::JoinPath(FLAGS_test_tmpdir, name).\n    void Add(const string &name, const string &file_format,\n             const string &record_format, CreateFile makefile);\n\n    // Convenience method to populate the inputs in context.  Returns true if it\n    // was possible to populate each input, and false otherwise.  
If a mandatory\n    // input does not have a creator, then we LOG(FATAL).\n    bool Populate(TaskContext *context) const;\n  };\n\n  // Default creator set.  This knows how to generate from a given Document\n  //  - category-map\n  //  - label-map\n  //  - tag-map\n  //  - tag-to-category\n  //  - word-map\n  //\n  //  Note: the default creators capture the document input by value: this means\n  //  that subsequent modifications to the document will NOT be\n  //  reflected in the inputs. However, the following is perfectly valid:\n  //\n  //  CreatorMap creators;\n  //  {\n  //    Sentence document;\n  //    creators = PopulateTestInputs::Defaults(document);\n  //  }\n  //  creators.Populate(context);\n  static CreatorMap Defaults(const Sentence &document);\n\n  // Populates the TaskContext object from a map of creator functions. Note that\n  // this static version is compatible with any hash map of the correct type.\n  static bool Populate(const std::unordered_map<string, Create> &creator_map,\n                       TaskContext *context);\n\n  // Helper function for creating a term frequency map from a document.  This\n  // iterates over all the tokens in the document, calls token2str on each\n  // token, and adds each returned string to the term frequency map.  The map is\n  // then saved to FLAGS_test_tmpdir/name.\n  static Create CreateTFMapFromDocumentTokens(\n      const Sentence &document,\n      std::function<vector<string>(const Token &)> token2str);\n\n  // Creates a StringToStringMap protocol buffer input that maps tags to\n  // categories. 
Uses whatever mapping is present in the document.\n  static Create CreateTagToCategoryFromTokens(const Sentence &document);\n\n  // Default implementations for \"token2str\" above.\n  static vector<string> TokenCategory(const Token &token);\n  static vector<string> TokenLabel(const Token &token);\n  static vector<string> TokenTag(const Token &token);\n  static vector<string> TokenWord(const Token &token);\n\n  // Utility function. Sets the TaskInput->part() fields for a new input part.\n  // Returns the file name.\n  static string AddPart(TaskInput *input, const string &file_format,\n                        const string &record_format);\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_POPULATE_TEST_INPUTS_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/proto_io.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_PROTO_IO_H_\n#define SYNTAXNET_PROTO_IO_H_\n\n#include <iostream>\n#include <memory>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/document_format.h\"\n#include \"syntaxnet/feature_extractor.pb.h\"\n#include \"syntaxnet/feature_types.h\"\n#include \"syntaxnet/registry.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/workspace.h\"\n#include \"tensorflow/core/lib/core/errors.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/core/stringpiece.h\"\n#include \"tensorflow/core/lib/io/buffered_inputstream.h\"\n#include \"tensorflow/core/lib/io/random_inputstream.h\"\n#include \"tensorflow/core/lib/io/record_reader.h\"\n#include \"tensorflow/core/lib/io/record_writer.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/platform/env.h\"\n\nnamespace syntaxnet {\n\n// A convenience wrapper to read protos with a RecordReader.\nclass ProtoRecordReader {\n public:\n  explicit ProtoRecordReader(tensorflow::RandomAccessFile *file) {\n    file_.reset(file);\n    reader_.reset(new tensorflow::io::RecordReader(file_.get()));\n  }\n\n  explicit ProtoRecordReader(const string &filename) {\n    TF_CHECK_OK(\n        
tensorflow::Env::Default()->NewRandomAccessFile(filename, &file_));\n    reader_.reset(new tensorflow::io::RecordReader(file_.get()));\n  }\n\n  ~ProtoRecordReader() {\n    reader_.reset();\n  }\n\n  template <typename T>\n  tensorflow::Status Read(T *proto) {\n    string buffer;\n    tensorflow::Status status = reader_->ReadRecord(&offset_, &buffer);\n    if (status.ok()) {\n      CHECK(proto->ParseFromString(buffer));\n      return tensorflow::Status::OK();\n    } else {\n      return status;\n    }\n  }\n\n private:\n  uint64 offset_ = 0;\n  std::unique_ptr<tensorflow::io::RecordReader> reader_;\n  std::unique_ptr<tensorflow::RandomAccessFile> file_;\n};\n\n// A convenience wrapper to write protos with a RecordWriter.\nclass ProtoRecordWriter {\n public:\n  explicit ProtoRecordWriter(const string &filename) {\n    TF_CHECK_OK(tensorflow::Env::Default()->NewWritableFile(filename, &file_));\n    writer_.reset(new tensorflow::io::RecordWriter(file_.get()));\n  }\n\n  ~ProtoRecordWriter() {\n    writer_.reset();\n    file_.reset();\n  }\n\n  template <typename T>\n  void Write(const T &proto) {\n    TF_CHECK_OK(writer_->WriteRecord(proto.SerializeAsString()));\n  }\n\n private:\n  std::unique_ptr<tensorflow::io::RecordWriter> writer_;\n  std::unique_ptr<tensorflow::WritableFile> file_;\n};\n\n// A file implementation to read from stdin.\nclass StdIn : public tensorflow::RandomAccessFile {\n public:\n  StdIn() {}\n  ~StdIn() override {}\n\n  // Reads up to n bytes from standard input.  
Returns `OUT_OF_RANGE` if fewer\n  // than n bytes were stored in `*result` because of EOF.\n  tensorflow::Status Read(uint64 offset, size_t n,\n                          tensorflow::StringPiece *result,\n                          char *scratch) const override {\n    CHECK_EQ(expected_offset_, offset);\n    if (!eof_) {\n      string line;\n      eof_ = !std::getline(std::cin, line);\n      buffer_.append(line);\n      buffer_.append(\"\\n\");\n    }\n    CopyFromBuffer(std::min(buffer_.size(), n), result, scratch);\n    if (eof_) {\n      return tensorflow::errors::OutOfRange(\"End of file reached\");\n    } else {\n      return tensorflow::Status::OK();\n    }\n  }\n\n private:\n  void CopyFromBuffer(size_t n, tensorflow::StringPiece *result,\n                      char *scratch) const {\n    // Copy only the n bytes requested; scratch is only guaranteed to hold n.\n    memcpy(scratch, buffer_.data(), n);\n    buffer_ = buffer_.substr(n);\n    result->set(scratch, n);\n    expected_offset_ += n;\n  }\n\n  mutable bool eof_ = false;\n  mutable int64 expected_offset_ = 0;\n  mutable string buffer_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(StdIn);\n};\n\n// Reads sentence protos from a text file.\nclass TextReader {\n public:\n  explicit TextReader(const TaskInput &input, TaskContext *context) {\n    CHECK_EQ(input.record_format_size(), 1)\n        << \"TextReader only supports inputs with one record format: \"\n        << input.DebugString();\n    CHECK_EQ(input.part_size(), 1)\n        << \"TextReader only supports inputs with one part: \"\n        << input.DebugString();\n    filename_ = TaskContext::InputFile(input);\n    format_.reset(DocumentFormat::Create(input.record_format(0)));\n    format_->Setup(context);\n    Reset();\n  }\n\n  Sentence *Read() {\n    // Skips empty sentences, e.g., blank lines at the beginning of a file or\n    // commented out blocks.\n    vector<Sentence *> sentences;\n    string key, value;\n    while (sentences.empty() && format_->ReadRecord(buffer_.get(), &value)) {\n      key = 
tensorflow::strings::StrCat(filename_, \":\", sentence_count_);\n      format_->ConvertFromString(key, value, &sentences);\n      CHECK_LE(sentences.size(), 1);\n    }\n    if (sentences.empty()) {\n      // End of file reached.\n      return nullptr;\n    } else {\n      ++sentence_count_;\n      return sentences[0];\n    }\n  }\n\n  void Reset() {\n    sentence_count_ = 0;\n    if (filename_ == \"-\") {\n      static const int kInputBufferSize = 8 * 1024; /* bytes */\n      file_.reset(new StdIn());\n      stream_.reset(new tensorflow::io::RandomAccessInputStream(file_.get()));\n      buffer_.reset(new tensorflow::io::BufferedInputStream(file_.get(),\n                                                            kInputBufferSize));\n    } else {\n      static const int kInputBufferSize = 1 * 1024 * 1024; /* bytes */\n      TF_CHECK_OK(\n          tensorflow::Env::Default()->NewRandomAccessFile(filename_, &file_));\n      stream_.reset(new tensorflow::io::RandomAccessInputStream(file_.get()));\n      buffer_.reset(new tensorflow::io::BufferedInputStream(file_.get(),\n                                                            kInputBufferSize));\n    }\n  }\n\n private:\n  string filename_;\n  int sentence_count_ = 0;\n  std::unique_ptr<tensorflow::RandomAccessFile>\n      file_;  // must outlive buffer_, stream_\n  std::unique_ptr<tensorflow::io::RandomAccessInputStream>\n      stream_;  // Must outlive buffer_\n  std::unique_ptr<tensorflow::io::BufferedInputStream> buffer_;\n  std::unique_ptr<DocumentFormat> format_;\n};\n\n// Writes sentence protos to a text conll file.\nclass TextWriter {\n public:\n  explicit TextWriter(const TaskInput &input, TaskContext *context) {\n    CHECK_EQ(input.record_format_size(), 1)\n        << \"TextWriter only supports files with one record format: \"\n        << input.DebugString();\n    CHECK_EQ(input.part_size(), 1)\n        << \"TextWriter only supports files with one part: \"\n        << input.DebugString();\n    filename_ = 
TaskContext::InputFile(input);\n    format_.reset(DocumentFormat::Create(input.record_format(0)));\n    format_->Setup(context);\n    if (filename_ != \"-\") {\n      TF_CHECK_OK(\n          tensorflow::Env::Default()->NewWritableFile(filename_, &file_));\n    }\n  }\n\n  ~TextWriter() {\n    if (file_) {\n      file_->Close();\n    }\n  }\n\n  void Write(const Sentence &sentence) {\n    string key, value;\n    format_->ConvertToString(sentence, &key, &value);\n    if (file_) {\n      TF_CHECK_OK(file_->Append(value));\n    } else {\n      std::cout << value;\n    }\n  }\n\n private:\n  string filename_;\n  std::unique_ptr<DocumentFormat> format_;\n  std::unique_ptr<tensorflow::WritableFile> file_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_PROTO_IO_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/reader_ops.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include <math.h>\n#include <deque>\n#include <memory>\n#include <string>\n#include <unordered_map>\n#include <vector>\n\n#include \"syntaxnet/base.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/sentence_batch.h\"\n#include \"syntaxnet/shared_store.h\"\n#include \"syntaxnet/sparse.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"third_party/eigen3/unsupported/Eigen/CXX11/Tensor\"\n#include \"tensorflow/core/framework/op_kernel.h\"\n#include \"tensorflow/core/framework/tensor.h\"\n#include \"tensorflow/core/framework/tensor_shape.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/io/table.h\"\n#include \"tensorflow/core/lib/io/table_options.h\"\n#include \"tensorflow/core/lib/strings/stringprintf.h\"\n#include \"tensorflow/core/platform/env.h\"\n\nusing tensorflow::DEVICE_CPU;\nusing tensorflow::DT_FLOAT;\nusing tensorflow::DT_INT32;\nusing tensorflow::DT_INT64;\nusing tensorflow::DT_STRING;\nusing tensorflow::DataType;\nusing tensorflow::OpKernel;\nusing tensorflow::OpKernelConstruction;\nusing tensorflow::OpKernelContext;\nusing tensorflow::Tensor;\nusing 
tensorflow::TensorShape;\nusing tensorflow::error::OUT_OF_RANGE;\nusing tensorflow::errors::InvalidArgument;\n\nnamespace syntaxnet {\n\nclass ParsingReader : public OpKernel {\n public:\n  explicit ParsingReader(OpKernelConstruction *context) : OpKernel(context) {\n    string file_path, corpus_name;\n    OP_REQUIRES_OK(context, context->GetAttr(\"task_context\", &file_path));\n    OP_REQUIRES_OK(context, context->GetAttr(\"feature_size\", &feature_size_));\n    OP_REQUIRES_OK(context, context->GetAttr(\"batch_size\", &max_batch_size_));\n    OP_REQUIRES_OK(context, context->GetAttr(\"corpus_name\", &corpus_name));\n    OP_REQUIRES_OK(context, context->GetAttr(\"arg_prefix\", &arg_prefix_));\n\n    // Reads task context from file.\n    string data;\n    OP_REQUIRES_OK(context, ReadFileToString(tensorflow::Env::Default(),\n                                             file_path, &data));\n    OP_REQUIRES(context,\n                TextFormat::ParseFromString(data, task_context_.mutable_spec()),\n                InvalidArgument(\"Could not parse task context at \", file_path));\n\n    // Set up the batch reader.\n    sentence_batch_.reset(\n        new SentenceBatch(max_batch_size_, corpus_name));\n    sentence_batch_->Init(&task_context_);\n\n    // Set up the parsing features and transition system.\n    states_.resize(max_batch_size_);\n    workspaces_.resize(max_batch_size_);\n    features_.reset(new ParserEmbeddingFeatureExtractor(arg_prefix_));\n    features_->Setup(&task_context_);\n    transition_system_.reset(ParserTransitionSystem::Create(task_context_.Get(\n        features_->GetParamName(\"transition_system\"), \"arc-standard\")));\n    transition_system_->Setup(&task_context_);\n    features_->Init(&task_context_);\n    features_->RequestWorkspaces(&workspace_registry_);\n    transition_system_->Init(&task_context_);\n    string label_map_path =\n        TaskContext::InputFile(*task_context_.GetInput(\"label-map\"));\n    label_map_ = 
SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(\n        label_map_path, 0, 0);\n\n    // Checks number of feature groups matches the task context.\n    const int required_size = features_->embedding_dims().size();\n    OP_REQUIRES(\n        context, feature_size_ == required_size,\n        InvalidArgument(\"Task context requires feature_size=\", required_size));\n  }\n\n  ~ParsingReader() override { SharedStore::Release(label_map_); }\n\n  // Creates a new ParserState if there's another sentence to be read.\n  virtual void AdvanceSentence(int index) {\n    states_[index].reset();\n    if (sentence_batch_->AdvanceSentence(index)) {\n      states_[index].reset(new ParserState(\n          sentence_batch_->sentence(index),\n          transition_system_->NewTransitionState(true), label_map_));\n      workspaces_[index].Reset(workspace_registry_);\n      features_->Preprocess(&workspaces_[index], states_[index].get());\n    }\n  }\n\n  void Compute(OpKernelContext *context) override {\n    mutex_lock lock(mu_);\n\n    // Advances states to the next positions.\n    PerformActions(context);\n\n    // Advances any final states to the next sentences.\n    for (int i = 0; i < max_batch_size_; ++i) {\n      if (state(i) == nullptr) continue;\n\n      // Switches to the next sentence if we're at a final state.\n      while (transition_system_->IsFinalState(*state(i))) {\n        VLOG(2) << \"Advancing sentence \" << i;\n        AdvanceSentence(i);\n        if (state(i) == nullptr) break;  // EOF has been reached\n      }\n    }\n\n    // Rewinds if no states remain in the batch (we need to re-wind the corpus).\n    if (sentence_batch_->size() == 0) {\n      ++num_epochs_;\n      LOG(INFO) << \"Starting epoch \" << num_epochs_;\n      sentence_batch_->Rewind();\n      for (int i = 0; i < max_batch_size_; ++i) AdvanceSentence(i);\n    }\n\n    // Create the outputs for each feature space.\n    vector<Tensor *> feature_outputs(features_->NumEmbeddings());\n    for (size_t 
i = 0; i < feature_outputs.size(); ++i) {\n      OP_REQUIRES_OK(context, context->allocate_output(\n                                  i, TensorShape({sentence_batch_->size(),\n                                                  features_->FeatureSize(i)}),\n                                  &feature_outputs[i]));\n    }\n\n    // Populate feature outputs.\n    for (int i = 0, index = 0; i < max_batch_size_; ++i) {\n      if (states_[i] == nullptr) continue;\n\n      // Extract features from the current parser state, and fill up the\n      // available batch slots.\n      std::vector<std::vector<SparseFeatures>> features =\n          features_->ExtractSparseFeatures(workspaces_[i], *states_[i]);\n\n      for (size_t feature_space = 0; feature_space < features.size();\n           ++feature_space) {\n        int feature_size = features[feature_space].size();\n        CHECK(feature_size == features_->FeatureSize(feature_space));\n        auto features_output = feature_outputs[feature_space]->matrix<string>();\n        for (int k = 0; k < feature_size; ++k) {\n          features_output(index, k) =\n              features[feature_space][k].SerializeAsString();\n        }\n      }\n      ++index;\n    }\n\n    // Return the number of epochs.\n    Tensor *epoch_output;\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                feature_size_, TensorShape({}), &epoch_output));\n    auto num_epochs = epoch_output->scalar<int32>();\n    num_epochs() = num_epochs_;\n\n    // Create outputs specific to this reader.\n    AddAdditionalOutputs(context);\n  }\n\n protected:\n  // Performs any relevant actions on the parser states, typically either\n  // the gold action or a predicted action from decoding.\n  virtual void PerformActions(OpKernelContext *context) = 0;\n\n  // Adds outputs specific to this reader starting at additional_output_index().\n  virtual void AddAdditionalOutputs(OpKernelContext *context) const = 0;\n\n  // Returns the output type 
specification of this base class.\n  std::vector<DataType> default_outputs() const {\n    std::vector<DataType> output_types(feature_size_, DT_STRING);\n    output_types.push_back(DT_INT32);\n    return output_types;\n  }\n\n  // Accessors.\n  int max_batch_size() const { return max_batch_size_; }\n  int batch_size() const { return sentence_batch_->size(); }\n  int additional_output_index() const { return feature_size_ + 1; }\n  ParserState *state(int i) const { return states_[i].get(); }\n  const ParserTransitionSystem &transition_system() const {\n    return *transition_system_;\n  }\n\n  // Parser task context.\n  const TaskContext &task_context() const { return task_context_; }\n\n  const string &arg_prefix() const { return arg_prefix_; }\n\n private:\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  // Prefix for context parameters.\n  string arg_prefix_;\n\n  // mutex to synchronize access to Compute.\n  mutex mu_;\n\n  // How many times the document source has been rewound.\n  int num_epochs_ = 0;\n\n  // How many sentences this op can be processing at any given time.\n  int max_batch_size_ = 1;\n\n  // Number of feature groups in the brain parser features.\n  int feature_size_ = -1;\n\n  // Batch of sentences, and the corresponding parser states.\n  std::unique_ptr<SentenceBatch> sentence_batch_;\n\n  // Batch: ParserState objects.\n  std::vector<std::unique_ptr<ParserState>> states_;\n\n  // Batch: WorkspaceSet objects.\n  std::vector<WorkspaceSet> workspaces_;\n\n  // Dependency label map used in transition system.\n  const TermFrequencyMap *label_map_;\n\n  // Transition system.\n  std::unique_ptr<ParserTransitionSystem> transition_system_;\n\n  // Typed feature extractor for embeddings.\n  std::unique_ptr<ParserEmbeddingFeatureExtractor> features_;\n\n  // Internal workspace registry for use in feature extraction.\n  WorkspaceRegistry workspace_registry_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(ParsingReader);\n};\n\nclass 
GoldParseReader : public ParsingReader {\n public:\n  explicit GoldParseReader(OpKernelConstruction *context)\n      : ParsingReader(context) {\n    // Sets up number and type of inputs and outputs.\n    std::vector<DataType> output_types = default_outputs();\n    output_types.push_back(DT_INT32);\n    OP_REQUIRES_OK(context, context->MatchSignature({}, output_types));\n  }\n\n private:\n  // Always performs the next gold action for each state.\n  void PerformActions(OpKernelContext *context) override {\n    for (int i = 0; i < max_batch_size(); ++i) {\n      if (state(i) != nullptr) {\n        transition_system().PerformAction(\n            transition_system().GetNextGoldAction(*state(i)), state(i));\n      }\n    }\n  }\n\n  // Adds the list of gold actions for each state as an additional output.\n  void AddAdditionalOutputs(OpKernelContext *context) const override {\n    Tensor *actions_output;\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                additional_output_index(),\n                                TensorShape({batch_size()}), &actions_output));\n\n    // Add all gold actions for non-null states as an additional output.\n    auto gold_actions = actions_output->vec<int32>();\n    for (int i = 0, batch_index = 0; i < max_batch_size(); ++i) {\n      if (state(i) != nullptr) {\n        const int gold_action =\n            transition_system().GetNextGoldAction(*state(i));\n        gold_actions(batch_index++) = gold_action;\n      }\n    }\n  }\n\n  TF_DISALLOW_COPY_AND_ASSIGN(GoldParseReader);\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"GoldParseReader\").Device(DEVICE_CPU),\n                        GoldParseReader);\n\n// DecodedParseReader parses sentences using transition scores computed\n// by a TensorFlow network. 
This op additionally computes a token correctness\n// evaluation metric which can be used to select hyperparameter settings and\n// training stopping point.\n//\n// The notion of correct token is determined by the transition system, e.g.\n// a tagger will return POS tag accuracy, while an arc-standard parser will\n// return UAS.\n//\n// Which tokens should be scored is controlled by the '<arg_prefix>_scoring'\n// task parameter.  Possible values are\n//   - 'default': skips tokens with only punctuation in the tag name.\n//   - 'conllx': skips tokens with only punctuation in the surface form.\n//   - 'ignore_parens': same as conllx, but skipping parentheses as well.\n//   - '': scores all tokens.\nclass DecodedParseReader : public ParsingReader {\n public:\n  explicit DecodedParseReader(OpKernelConstruction *context)\n      : ParsingReader(context) {\n    // Sets up number and type of inputs and outputs.\n    std::vector<DataType> output_types = default_outputs();\n    output_types.push_back(DT_INT32);\n    output_types.push_back(DT_STRING);\n    OP_REQUIRES_OK(context, context->MatchSignature({DT_FLOAT}, output_types));\n\n    // Gets scoring parameters.\n    scoring_type_ = task_context().Get(\n        tensorflow::strings::StrCat(arg_prefix(), \"_scoring\"), \"\");\n  }\n\n private:\n  void AdvanceSentence(int index) override {\n    ParsingReader::AdvanceSentence(index);\n    if (state(index)) {\n      docids_.push_front(state(index)->sentence().docid());\n    }\n  }\n\n  // Tallies the # of correct and incorrect tokens for a given ParserState.\n  void ComputeTokenAccuracy(const ParserState &state) {\n    for (int i = 0; i < state.sentence().token_size(); ++i) {\n      const Token &token = state.GetToken(i);\n      if (utils::PunctuationUtil::ScoreToken(token.word(), token.tag(),\n                                             scoring_type_)) {\n        ++num_tokens_;\n        if (state.IsTokenCorrect(i)) ++num_correct_;\n      }\n    }\n  }\n\n  // Performs the 
allowed action with the highest score on the given state.\n  // Also records the accuracy whenever a terminal action is taken.\n  void PerformActions(OpKernelContext *context) override {\n    auto scores_matrix = context->input(0).matrix<float>();\n    num_tokens_ = 0;\n    num_correct_ = 0;\n    for (int i = 0, batch_index = 0; i < max_batch_size(); ++i) {\n      ParserState *state = this->state(i);\n      if (state != nullptr) {\n        int best_action = 0;\n        float best_score = -INFINITY;\n        for (int action = 0; action < scores_matrix.dimension(1); ++action) {\n          float score = scores_matrix(batch_index, action);\n          if (score > best_score &&\n              transition_system().IsAllowedAction(action, *state)) {\n            best_action = action;\n            best_score = score;\n          }\n        }\n        transition_system().PerformAction(best_action, state);\n\n        // Update the # of scored correct tokens if this is the last state\n        // in the sentence and save the annotated document.\n        if (transition_system().IsFinalState(*state)) {\n          ComputeTokenAccuracy(*state);\n          sentence_map_[state->sentence().docid()] = state->sentence();\n          state->AddParseToDocument(&sentence_map_[state->sentence().docid()]);\n        }\n        ++batch_index;\n      }\n    }\n  }\n\n  // Adds the evaluation metrics and annotated documents as additional outputs,\n  // if there were any terminal states.\n  void AddAdditionalOutputs(OpKernelContext *context) const override {\n    Tensor *counts_output;\n    OP_REQUIRES_OK(context,\n                   context->allocate_output(additional_output_index(),\n                                            TensorShape({2}), &counts_output));\n    auto eval_metrics = counts_output->vec<int32>();\n    eval_metrics(0) = num_tokens_;\n    eval_metrics(1) = num_correct_;\n\n    // Output annotated documents for each state. 
To preserve order, repeatedly\n    // pull from the back of the docids queue as long as the sentences have been\n    // completely processed. If the next document has not been completely\n    // processed yet, then the docid will not be found in 'sentence_map_'.\n    vector<Sentence> sentences;\n    while (!docids_.empty() &&\n           sentence_map_.find(docids_.back()) != sentence_map_.end()) {\n      sentences.emplace_back(sentence_map_[docids_.back()]);\n      sentence_map_.erase(docids_.back());\n      docids_.pop_back();\n    }\n    Tensor *annotated_output;\n    OP_REQUIRES_OK(context,\n                   context->allocate_output(\n                       additional_output_index() + 1,\n                       TensorShape({static_cast<int64>(sentences.size())}),\n                       &annotated_output));\n\n    auto document_output = annotated_output->vec<string>();\n    for (size_t i = 0; i < sentences.size(); ++i) {\n      document_output(i) = sentences[i].SerializeAsString();\n    }\n  }\n\n  // State for eval metric computation.\n  int num_tokens_ = 0;\n  int num_correct_ = 0;\n\n  // Parameter for deciding which tokens to score.\n  string scoring_type_;\n\n  mutable std::deque<string> docids_;\n  mutable map<string, Sentence> sentence_map_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(DecodedParseReader);\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"DecodedParseReader\").Device(DEVICE_CPU),\n                        DecodedParseReader);\n\nclass WordEmbeddingInitializer : public OpKernel {\n public:\n  explicit WordEmbeddingInitializer(OpKernelConstruction *context)\n      : OpKernel(context) {\n    string file_path, data;\n    OP_REQUIRES_OK(context, context->GetAttr(\"task_context\", &file_path));\n    OP_REQUIRES_OK(context, ReadFileToString(tensorflow::Env::Default(),\n                                             file_path, &data));\n    OP_REQUIRES(context,\n                TextFormat::ParseFromString(data, task_context_.mutable_spec()),\n                
InvalidArgument(\"Could not parse task context at \", file_path));\n    OP_REQUIRES_OK(context, context->GetAttr(\"vectors\", &vectors_path_));\n    OP_REQUIRES_OK(context,\n                   context->GetAttr(\"embedding_init\", &embedding_init_));\n\n    // Sets up number and type of inputs and outputs.\n    OP_REQUIRES_OK(context, context->MatchSignature({}, {DT_FLOAT}));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    // Loads words from vocabulary with mapping to ids.\n    string path = TaskContext::InputFile(*task_context_.GetInput(\"word-map\"));\n    const TermFrequencyMap *word_map =\n        SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(path, 0, 0);\n    unordered_map<string, int64> vocab;\n    for (int i = 0; i < word_map->Size(); ++i) {\n      vocab[word_map->GetTerm(i)] = i;\n    }\n\n    // Creates a reader pointing to a local copy of the vectors recordio.\n    string tmp_vectors_path;\n    OP_REQUIRES_OK(context, CopyToTmpPath(vectors_path_, &tmp_vectors_path));\n    ProtoRecordReader reader(tmp_vectors_path);\n\n    // Loads the embedding vectors into a matrix.\n    Tensor *embedding_matrix = nullptr;\n    TokenEmbedding embedding;\n    while (reader.Read(&embedding) == tensorflow::Status::OK()) {\n      if (embedding_matrix == nullptr) {\n        const int embedding_size = embedding.vector().values_size();\n        OP_REQUIRES_OK(\n            context, context->allocate_output(\n                         0, TensorShape({word_map->Size() + 3, embedding_size}),\n                         &embedding_matrix));\n        embedding_matrix->matrix<float>()\n            .setRandom<Eigen::internal::NormalRandomGenerator<float>>();\n        embedding_matrix->matrix<float>() =\n            embedding_matrix->matrix<float>() * static_cast<float>(\n                embedding_init_ / sqrt(embedding_size));\n      }\n      if (vocab.find(embedding.token()) != vocab.end()) {\n        SetNormalizedRow(embedding.vector(), 
vocab[embedding.token()],\n                         embedding_matrix);\n      }\n    }\n  }\n\n private:\n  // Sets embedding_matrix[row] to a normalized version of the given vector.\n  void SetNormalizedRow(const TokenEmbedding::Vector &vector, const int row,\n                        Tensor *embedding_matrix) {\n    float norm = 0.0f;\n    for (int col = 0; col < vector.values_size(); ++col) {\n      float val = vector.values(col);\n      norm += val * val;\n    }\n    norm = sqrt(norm);\n    for (int col = 0; col < vector.values_size(); ++col) {\n      embedding_matrix->matrix<float>()(row, col) = vector.values(col) / norm;\n    }\n  }\n\n  // Copies the file at source_path to a temporary file and sets tmp_path to the\n  // temporary file's location. This is helpful since reading from non local\n  // files with a record reader can be very slow.\n  static tensorflow::Status CopyToTmpPath(const string &source_path,\n                                          string *tmp_path) {\n    // Opens source file.\n    std::unique_ptr<tensorflow::RandomAccessFile> source_file;\n    TF_RETURN_IF_ERROR(tensorflow::Env::Default()->NewRandomAccessFile(\n        source_path, &source_file));\n\n    // Creates destination file.\n    std::unique_ptr<tensorflow::WritableFile> target_file;\n    *tmp_path = tensorflow::strings::Printf(\n        \"/tmp/%d.%lld\", getpid(), tensorflow::Env::Default()->NowMicros());\n    TF_RETURN_IF_ERROR(\n        tensorflow::Env::Default()->NewWritableFile(*tmp_path, &target_file));\n\n    // Performs copy.\n    tensorflow::Status s;\n    const size_t kBytesToRead = 10 << 20;  // 10MB at a time.\n    string scratch;\n    scratch.resize(kBytesToRead);\n    for (uint64 offset = 0; s.ok(); offset += kBytesToRead) {\n      tensorflow::StringPiece data;\n      s.Update(source_file->Read(offset, kBytesToRead, &data, &scratch[0]));\n      target_file->Append(data);\n    }\n    if (s.code() == OUT_OF_RANGE) {\n      return tensorflow::Status::OK();\n    } else 
{\n      return s;\n    }\n  }\n\n  // Task context used to configure this op.\n  TaskContext task_context_;\n\n  // Embedding vectors that are not found in the input sstable are initialized\n  // randomly from a normal distribution with zero mean and\n  //   std dev = embedding_init_ / sqrt(embedding_size).\n  float embedding_init_ = 1.f;\n\n  // Path to recordio with word embedding vectors.\n  string vectors_path_;\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"WordEmbeddingInitializer\").Device(DEVICE_CPU),\n                        WordEmbeddingInitializer);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/reader_ops_test.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for reader_ops.\"\"\"\n\n\nimport os.path\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.framework import test_util\nfrom tensorflow.python.platform import googletest\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom syntaxnet import dictionary_pb2\nfrom syntaxnet import graph_builder\nfrom syntaxnet import sparse_pb2\nfrom syntaxnet.ops import gen_parser_ops\n\n\nFLAGS = tf.app.flags.FLAGS\nif not hasattr(FLAGS, 'test_srcdir'):\n  FLAGS.test_srcdir = ''\nif not hasattr(FLAGS, 'test_tmpdir'):\n  FLAGS.test_tmpdir = tf.test.get_temp_dir()\n\n\nclass ParsingReaderOpsTest(test_util.TensorFlowTestCase):\n\n  def setUp(self):\n    # Creates a task context with the correct testing paths.\n    initial_task_context = os.path.join(\n        FLAGS.test_srcdir,\n        'syntaxnet/'\n        'testdata/context.pbtxt')\n    self._task_context = os.path.join(FLAGS.test_tmpdir, 'context.pbtxt')\n    with open(initial_task_context, 'r') as fin:\n      with open(self._task_context, 'w') as fout:\n        fout.write(fin.read().replace('SRCDIR', FLAGS.test_srcdir)\n                   .replace('OUTPATH', FLAGS.test_tmpdir))\n\n    # Creates necessary term maps.\n    with self.test_session() as sess:\n      
gen_parser_ops.lexicon_builder(task_context=self._task_context,\n                                     corpus_name='training-corpus').run()\n      self._num_features, self._num_feature_ids, _, self._num_actions = (\n          sess.run(gen_parser_ops.feature_size(task_context=self._task_context,\n                                               arg_prefix='brain_parser')))\n\n  def GetMaxId(self, sparse_features):\n    max_id = 0\n    for x in sparse_features:\n      for y in x:\n        f = sparse_pb2.SparseFeatures()\n        f.ParseFromString(y)\n        for i in f.id:\n          max_id = max(i, max_id)\n    return max_id\n\n  def testParsingReaderOp(self):\n    # Runs the reader over the test input for two epochs.\n    num_steps_a = 0\n    num_actions = 0\n    num_word_ids = 0\n    num_tag_ids = 0\n    num_label_ids = 0\n    batch_size = 10\n    with self.test_session() as sess:\n      (words, tags, labels), epochs, gold_actions = (\n          gen_parser_ops.gold_parse_reader(self._task_context,\n                                           3,\n                                           batch_size,\n                                           corpus_name='training-corpus'))\n      while True:\n        tf_gold_actions, tf_epochs, tf_words, tf_tags, tf_labels = (\n            sess.run([gold_actions, epochs, words, tags, labels]))\n        num_steps_a += 1\n        num_actions = max(num_actions, max(tf_gold_actions) + 1)\n        num_word_ids = max(num_word_ids, self.GetMaxId(tf_words) + 1)\n        num_tag_ids = max(num_tag_ids, self.GetMaxId(tf_tags) + 1)\n        num_label_ids = max(num_label_ids, self.GetMaxId(tf_labels) + 1)\n        self.assertIn(tf_epochs, [0, 1, 2])\n        if tf_epochs > 1:\n          break\n\n    # Runs the reader again, this time with a lot of added graph nodes.\n    num_steps_b = 0\n    with self.test_session() as sess:\n      num_features = [6, 6, 4]\n      num_feature_ids = [num_word_ids, num_tag_ids, num_label_ids]\n      embedding_sizes 
= [8, 8, 8]\n      hidden_layer_sizes = [32, 32]\n      # Here we aim to test the iteration of the reader op in a complex network,\n      # not the GraphBuilder.\n      parser = graph_builder.GreedyParser(\n          num_actions, num_features, num_feature_ids, embedding_sizes,\n          hidden_layer_sizes)\n      parser.AddTraining(self._task_context,\n                         batch_size,\n                         corpus_name='training-corpus')\n      sess.run(parser.inits.values())\n      while True:\n        tf_epochs, tf_cost, _ = sess.run(\n            [parser.training['epochs'], parser.training['cost'],\n             parser.training['train_op']])\n        num_steps_b += 1\n        self.assertGreaterEqual(tf_cost, 0)\n        self.assertIn(tf_epochs, [0, 1, 2])\n        if tf_epochs > 1:\n          break\n\n    # Assert that the two runs made the exact same number of steps.\n    logging.info('Number of steps in the two runs: %d, %d',\n                 num_steps_a, num_steps_b)\n    self.assertEqual(num_steps_a, num_steps_b)\n\n  def testParsingReaderOpWhileLoop(self):\n    feature_size = 3\n    batch_size = 5\n\n    def ParserEndpoints():\n      return gen_parser_ops.gold_parse_reader(self._task_context,\n                                              feature_size,\n                                              batch_size,\n                                              corpus_name='training-corpus')\n\n    with self.test_session() as sess:\n      # The 'condition' and 'body' functions expect as many arguments as there\n      # are loop variables. 'condition' depends on the 'epoch' loop variable\n      # only, so we disregard the remaining unused function arguments. 
'body'\n      # returns a list of updated loop variables.\n      def Condition(epoch, *unused_args):\n        return tf.less(epoch, 2)\n\n      def Body(epoch, num_actions, *feature_args):\n        # By adding one of the outputs of the reader op ('epoch') as a control\n        # dependency to the reader op we force the repeated evaluation of the\n        # reader op.\n        with epoch.graph.control_dependencies([epoch]):\n          features, epoch, gold_actions = ParserEndpoints()\n        num_actions = tf.maximum(num_actions,\n                                 tf.reduce_max(gold_actions, [0], False) + 1)\n        feature_ids = []\n        for i in range(len(feature_args)):\n          feature_ids.append(features[i])\n        return [epoch, num_actions] + feature_ids\n\n      epoch = ParserEndpoints()[-2]\n      num_actions = tf.constant(0)\n      loop_vars = [epoch, num_actions]\n\n      res = sess.run(\n          tf.while_loop(Condition, Body, loop_vars,\n                        shape_invariants=[tf.TensorShape(None)] * 2,\n                        parallel_iterations=1))\n      logging.info('Result: %s', res)\n      self.assertEqual(res[0], 2)\n\n  def testWordEmbeddingInitializer(self):\n    def _TokenEmbedding(token, embedding):\n      e = dictionary_pb2.TokenEmbedding()\n      e.token = token\n      e.vector.values.extend(embedding)\n      return e.SerializeToString()\n\n    # Provide embeddings for the first three words in the word map.\n    records_path = os.path.join(FLAGS.test_tmpdir, 'sstable-00000-of-00001')\n    writer = tf.python_io.TFRecordWriter(records_path)\n    writer.write(_TokenEmbedding('.', [1, 2]))\n    writer.write(_TokenEmbedding(',', [3, 4]))\n    writer.write(_TokenEmbedding('the', [5, 6]))\n    del writer\n\n    with self.test_session():\n      embeddings = gen_parser_ops.word_embedding_initializer(\n          vectors=records_path,\n          task_context=self._task_context).eval()\n    self.assertAllClose(\n        np.array([[1. 
/ (1 + 4) ** .5, 2. / (1 + 4) ** .5],\n                  [3. / (9 + 16) ** .5, 4. / (9 + 16) ** .5],\n                  [5. / (25 + 36) ** .5, 6. / (25 + 36) ** .5]]),\n        embeddings[:3,])\n\n\nif __name__ == '__main__':\n  googletest.main()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/registry.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/registry.h\"\n\nnamespace syntaxnet {\n\n// Global list of all component registries.\nRegistryMetadata *global_registry_list = nullptr;\n\nvoid RegistryMetadata::Register(RegistryMetadata *registry) {\n  registry->set_link(global_registry_list);\n  global_registry_list = registry;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/registry.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Registry for component registration. These classes can be used for creating\n// registries of components conforming to the same interface. This is useful for\n// making a component-based architecture where the specific implementation\n// classes can be selected at runtime. There is support for both class-based and\n// instance based registries.\n//\n// Example:\n//  function.h:\n//\n//   class Function : public RegisterableInstance<Function> {\n//    public:\n//     virtual double Evaluate(double x) = 0;\n//   };\n//\n//   #define REGISTER_FUNCTION(type, component)\n//     REGISTER_INSTANCE_COMPONENT(Function, type, component);\n//\n//  function.cc:\n//\n//   REGISTER_INSTANCE_REGISTRY(\"function\", Function);\n//\n//   class Cos : public Function {\n//    public:\n//     double Evaluate(double x) { return cos(x); }\n//   };\n//\n//   class Exp : public Function {\n//    public:\n//     double Evaluate(double x) { return exp(x); }\n//   };\n//\n//   REGISTER_FUNCTION(\"cos\", Cos);\n//   REGISTER_FUNCTION(\"exp\", Exp);\n//\n//   Function *f = Function::Lookup(\"cos\");\n//   double result = f->Evaluate(arg);\n\n#ifndef SYNTAXNET_REGISTRY_H_\n#define SYNTAXNET_REGISTRY_H_\n\n#include <string.h>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n\nnamespace 
syntaxnet {\n\n// Component metadata with information about name, class, and code location.\nclass ComponentMetadata {\n public:\n  ComponentMetadata(const char *name, const char *class_name, const char *file,\n                    int line)\n      : name_(name),\n        class_name_(class_name),\n        file_(file),\n        line_(line),\n        link_(nullptr) {}\n\n  // Returns component name.\n  const char *name() const { return name_; }\n\n  // Metadata objects can be linked in a list.\n  ComponentMetadata *link() const { return link_; }\n  void set_link(ComponentMetadata *link) { link_ = link; }\n\n private:\n  // Component name.\n  const char *name_;\n\n  // Name of class for component.\n  const char *class_name_;\n\n  // Code file and location where the component was registered.\n  const char *file_;\n  int line_;\n\n  // Link to next metadata object in list.\n  ComponentMetadata *link_;\n};\n\n// The master registry contains all registered component registries. A registry\n// is not registered in the master registry until the first component of that\n// type is registered.\nclass RegistryMetadata : public ComponentMetadata {\n public:\n  RegistryMetadata(const char *name, const char *class_name, const char *file,\n                   int line, ComponentMetadata **components)\n      : ComponentMetadata(name, class_name, file, line),\n        components_(components) {}\n\n  // Registers a component registry in the master registry.\n  static void Register(RegistryMetadata *registry);\n\n private:\n  // Location of list of components in registry.\n  ComponentMetadata **components_;\n};\n\n// Registry for components. An object can be registered with a type name in the\n// registry. The named instances in the registry can be returned using the\n// Lookup() method. The components in the registry are put into a linked list\n// of components. 
It is important that the component registry can be statically\n// initialized in order not to depend on initialization order.\ntemplate <class T>\nstruct ComponentRegistry {\n  typedef ComponentRegistry<T> Self;\n\n  // Component registration class.\n  class Registrar : public ComponentMetadata {\n   public:\n    // Registers new component by linking itself into the component list of\n    // the registry.\n    Registrar(Self *registry, const char *type, const char *class_name,\n              const char *file, int line, T *object)\n        : ComponentMetadata(type, class_name, file, line), object_(object) {\n      // Register registry in master registry if this is the first registered\n      // component of this type.\n      if (registry->components == nullptr) {\n        RegistryMetadata::Register(new RegistryMetadata(\n            registry->name, registry->class_name, registry->file,\n            registry->line,\n            reinterpret_cast<ComponentMetadata **>(&registry->components)));\n      }\n\n      // Register component in registry.\n      set_link(registry->components);\n      registry->components = this;\n    }\n\n    // Returns component type.\n    const char *type() const { return name(); }\n\n    // Returns component object.\n    T *object() const { return object_; }\n\n    // Returns the next component in the component list.\n    Registrar *next() const { return static_cast<Registrar *>(link()); }\n\n   private:\n    // Component object.\n    T *object_;\n  };\n\n  // Finds registrar for named component in registry.\n  const Registrar *GetComponent(const char *type) const {\n    Registrar *r = components;\n    while (r != nullptr && strcmp(type, r->type()) != 0) r = r->next();\n    if (r == nullptr) {\n      LOG(FATAL) << \"Unknown \" << name << \" component: '\" << type << \"'.\";\n    }\n    return r;\n  }\n\n  // Finds a named component in the registry.\n  T *Lookup(const char *type) const { return GetComponent(type)->object(); }\n  T 
*Lookup(const string &type) const { return Lookup(type.c_str()); }\n\n  // Textual description of the kind of components in the registry.\n  const char *name;\n\n  // Base class name of component type.\n  const char *class_name;\n\n  // File and line where the registry is defined.\n  const char *file;\n  int line;\n\n  // Linked list of registered components.\n  Registrar *components;\n};\n\n// Base class for registerable class-based components.\ntemplate <class T>\nclass RegisterableClass {\n public:\n  // Factory function type.\n  typedef T *(Factory)();\n\n  // Registry type.\n  typedef ComponentRegistry<Factory> Registry;\n\n  // Creates a new component instance.\n  static T *Create(const string &type) { return registry()->Lookup(type)(); }\n\n  // Returns registry for class.\n  static Registry *registry() { return &registry_; }\n\n private:\n  // Registry for class.\n  static Registry registry_;\n};\n\n// Base class for registerable instance-based components.\ntemplate <class T>\nclass RegisterableInstance {\n public:\n  // Registry type.\n  typedef ComponentRegistry<T> Registry;\n\n private:\n  // Registry for class.\n  static Registry registry_;\n};\n\n#define REGISTER_CLASS_COMPONENT(base, type, component)             \\\n  static base *__##component##__factory() { return new component; } \\\n  static base::Registry::Registrar __##component##__##registrar(    \\\n      base::registry(), type, #component, __FILE__, __LINE__,       \\\n      __##component##__factory)\n\n#define REGISTER_CLASS_REGISTRY(type, classname)                  \\\n  template <>                                                     \\\n  classname::Registry RegisterableClass<classname>::registry_ = { \\\n      type, #classname, __FILE__, __LINE__, NULL}\n\n#define REGISTER_INSTANCE_COMPONENT(base, type, component)       \\\n  static base::Registry::Registrar __##component##__##registrar( \\\n      base::registry(), type, #component, __FILE__, __LINE__, new component)\n\n#define 
REGISTER_INSTANCE_REGISTRY(type, classname)                  \\\n  template <>                                                        \\\n  classname::Registry RegisterableInstance<classname>::registry_ = { \\\n      type, #classname, __FILE__, __LINE__, NULL}\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_REGISTRY_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/segmenter_utils.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/segmenter_utils.h\"\n#include \"util/utf8/unicodetext.h\"\n#include \"util/utf8/unilib.h\"\n#include \"util/utf8/unilib_utf8_utils.h\"\n\nnamespace syntaxnet {\n\n// Separators, code Zs from http://www.unicode.org/Public/UNIDATA/PropList.txt\n// NB: This list is not necessarily exhaustive.\nconst std::unordered_set<int> SegmenterUtils::kBreakChars({\n  0x2028,  // line separator\n  0x2029,  // paragraph separator\n  0x0020,  // space\n  0x00a0,  // no-break space\n  0x1680,  // Ogham space mark\n  0x180e,  // Mongolian vowel separator\n  0x202f,  // narrow no-break space\n  0x205f,  // medium mathematical space\n  0x3000,  // ideographic space\n  0xe5e5,  // Google addition\n  0x2000, 0x2001, 0x2002, 0x2003, 0x2004, 0x2005, 0x2006, 0x2007, 0x2008,\n  0x2009, 0x200a\n});\n\nvoid SegmenterUtils::GetUTF8Chars(const string &text,\n                                  vector<tensorflow::StringPiece> *chars) {\n  const char *start = text.c_str();\n  const char *end = text.c_str() + text.size();\n  while (start < end) {\n    int char_length = UniLib::OneCharLen(start);\n    chars->emplace_back(start, char_length);\n    start += char_length;\n  }\n}\n\nvoid SegmenterUtils::SetCharsAsTokens(\n    const string &text,\n    const vector<tensorflow::StringPiece> &chars,\n    
Sentence *sentence) {\n  sentence->clear_token();\n  sentence->set_text(text);\n  for (int i = 0; i < chars.size(); ++i) {\n    Token *tok = sentence->add_token();\n    tok->set_word(chars[i].ToString());  // NOLINT\n    int start_byte, end_byte;\n    GetCharStartEndBytes(text, chars[i], &start_byte, &end_byte);\n    tok->set_start(start_byte);\n    tok->set_end(end_byte);\n  }\n}\n\nbool SegmenterUtils::IsValidSegment(const Sentence &sentence,\n                                    const Token &token) {\n  // Check that the token is not empty, both by string and by bytes.\n  if (token.word().empty()) return false;\n  if (token.start() > token.end()) return false;\n\n  // Check that the token boundaries are inside the text.\n  if (token.start() < 0) return false;\n  if (token.end() >= sentence.text().size()) return false;\n\n  // Check that token string is valid UTF8, by bytes.\n  const char s = sentence.text()[token.start()];\n  const char e = sentence.text()[token.end() + 1];\n  if (UniLib::IsTrailByte(s)) return false;\n  if (UniLib::IsTrailByte(e)) return false;\n  return true;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/segmenter_utils.h",
"content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_SEGMENTER_UTILS_H_\n#define SYNTAXNET_SEGMENTER_UTILS_H_\n\n#include <string>\n#include <vector>\n#include <unordered_set>\n\n#include \"syntaxnet/sentence.pb.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"util/utf8/unicodetext.h\"\n\nnamespace syntaxnet {\n\n// A set of common convenience functions.\nclass SegmenterUtils {\n public:\n  // Takes a text and converts it into a vector, where each element is a UTF-8\n  // character.\n  static void GetUTF8Chars(const string &text,\n                           vector<tensorflow::StringPiece> *chars);\n\n  // Sets tokens in the sentence so that each token is a single character.\n  // Assigns the start/end byte offsets.\n  //\n  // If the sentence is not empty, the current tokens will be cleared.\n  static void SetCharsAsTokens(const string &text,\n                               const vector<tensorflow::StringPiece> &chars,\n                               Sentence *sentence);\n\n  // Returns true for UTF-8 characters that cannot be 'real' tokens. 
This is\n  // defined as any whitespace, line break or paragraph break.\n  static bool IsBreakChar(const string &word) {\n    if (word == \"\\n\" || word == \"\\t\") return true;\n    UnicodeText text;\n    text.PointToUTF8(word.c_str(), word.length());\n    CHECK_EQ(text.size(), 1);\n    return kBreakChars.find(*text.begin()) != kBreakChars.end();\n  }\n\n  // Returns the break level for the next token based on the current character.\n  static Token::BreakLevel BreakLevel(const string &word) {\n    UnicodeText text;\n    text.PointToUTF8(word.c_str(), word.length());\n    auto point = *text.begin();\n    if (word == \"\\n\" || point == kLineSeparator) {\n      return Token::LINE_BREAK;\n    } else if (point == kParagraphSeparator) {\n      return Token::SENTENCE_BREAK;  // No PARAGRAPH_BREAK in sentence proto.\n    } else if (word == \"\\t\" || kBreakChars.find(point) != kBreakChars.end()) {\n      return Token::SPACE_BREAK;\n    }\n    return Token::NO_BREAK;\n  }\n\n  // Convenience function for computing start/end byte offsets of a character\n  // StringPiece relative to original text.\n  static void GetCharStartEndBytes(const string &text,\n                                   tensorflow::StringPiece c,\n                                   int *start,\n                                   int *end) {\n    *start = c.data() - text.data();\n    *end = *start + c.size() - 1;\n  }\n\n  // Returns true if this segment is a valid segment. Currently checks:\n  // 1) It is non-empty\n  // 2) It is valid UTF8\n  static bool IsValidSegment(const Sentence &sentence, const Token &token);\n\n  // Set for utf8 break characters.\n  static const std::unordered_set<int> kBreakChars;\n  static const int kLineSeparator = 0x2028;\n  static const int kParagraphSeparator = 0x2029;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_SEGMENTER_UTILS_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/segmenter_utils_test.cc",
"content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/segmenter_utils.h\"\n\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/char_properties.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include <gmock/gmock.h>\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\n// Creates a Korean sentence and also initializes the token fields.\nstatic Sentence GetKoSentence() {\n  Sentence sentence;\n\n  string text = \"서울시는 2012년부터\";\n\n  // Add tokens.\n  sentence.set_text(text);\n  Token *tok = sentence.add_token();\n  tok->set_word(\"서울시\");\n  tok->set_start(0);\n  tok->set_end(8);\n  tok = sentence.add_token();\n  tok->set_word(\"는\");\n  tok->set_start(9);\n  tok->set_end(11);\n  tok = sentence.add_token();\n  tok->set_word(\"2012\");\n  tok->set_start(13);\n  tok->set_end(16);\n  tok = sentence.add_token();\n  tok->set_word(\"년\");\n  tok->set_start(17);\n  tok->set_end(19);\n  tok = sentence.add_token();\n  tok->set_word(\"부터\");\n  tok->set_start(20);\n  tok->set_end(25);\n\n  return sentence;\n}\n\n// Gets the start/end bytes of the given chars in the given text.\nstatic void GetStartEndBytes(const string &text,\n                             const vector<tensorflow::StringPiece> &chars,\n                             vector<int> *starts,\n                             vector<int> *ends) 
{\n  SegmenterUtils segment_utils;\n  for (const tensorflow::StringPiece &c : chars) {\n    int start; int end;\n    segment_utils.GetCharStartEndBytes(text, c, &start, &end);\n    starts->push_back(start);\n    ends->push_back(end);\n  }\n}\n\n// Test the GetChars function.\nTEST(SegmenterUtilsTest, GetCharsTest) {\n  // Create test sentence.\n  const Sentence sentence = GetKoSentence();\n  vector<tensorflow::StringPiece> chars;\n  SegmenterUtils::GetUTF8Chars(sentence.text(), &chars);\n\n  // Check that the number of characters is correct.\n  CHECK_EQ(chars.size(), 12);\n\n  vector<int> starts;\n  vector<int> ends;\n  GetStartEndBytes(sentence.text(), chars, &starts, &ends);\n\n  // Check start positions.\n  CHECK_EQ(starts[0], 0);\n  CHECK_EQ(starts[1], 3);\n  CHECK_EQ(starts[2], 6);\n  CHECK_EQ(starts[3], 9);\n  CHECK_EQ(starts[4], 12);\n  CHECK_EQ(starts[5], 13);\n  CHECK_EQ(starts[6], 14);\n  CHECK_EQ(starts[7], 15);\n  CHECK_EQ(starts[8], 16);\n  CHECK_EQ(starts[9], 17);\n  CHECK_EQ(starts[10], 20);\n  CHECK_EQ(starts[11], 23);\n\n  // Check end positions.\n  CHECK_EQ(ends[0], 2);\n  CHECK_EQ(ends[1], 5);\n  CHECK_EQ(ends[2], 8);\n  CHECK_EQ(ends[3], 11);\n  CHECK_EQ(ends[4], 12);\n  CHECK_EQ(ends[5], 13);\n  CHECK_EQ(ends[6], 14);\n  CHECK_EQ(ends[7], 15);\n  CHECK_EQ(ends[8], 16);\n  CHECK_EQ(ends[9], 19);\n  CHECK_EQ(ends[10], 22);\n  CHECK_EQ(ends[11], 25);\n}\n\n// Test the SetCharsAsTokens function.\nTEST(SegmenterUtilsTest, SetCharsAsTokensTest) {\n  // Create test sentence.\n  const Sentence sentence = GetKoSentence();\n  vector<tensorflow::StringPiece> chars;\n  SegmenterUtils segment_utils;\n  segment_utils.GetUTF8Chars(sentence.text(), &chars);\n\n  vector<int> starts;\n  vector<int> ends;\n  GetStartEndBytes(sentence.text(), chars, &starts, &ends);\n\n  // Check that the new sentence's word, start and end positions are properly set.\n  Sentence new_sentence;\n  segment_utils.SetCharsAsTokens(sentence.text(), chars, &new_sentence);\n  
CHECK_EQ(new_sentence.token_size(), chars.size());\n  for (int t = 0; t < sentence.token_size(); ++t) {\n    CHECK_EQ(new_sentence.token(t).word(), chars[t]);\n    CHECK_EQ(new_sentence.token(t).start(), starts[t]);\n    CHECK_EQ(new_sentence.token(t).end(), ends[t]);\n  }\n\n  // Re-running should remove the old tokens.\n  segment_utils.SetCharsAsTokens(sentence.text(), chars, &new_sentence);\n  CHECK_EQ(new_sentence.token_size(), chars.size());\n  for (int t = 0; t < sentence.token_size(); ++t) {\n    CHECK_EQ(new_sentence.token(t).word(), chars[t]);\n    CHECK_EQ(new_sentence.token(t).start(), starts[t]);\n    CHECK_EQ(new_sentence.token(t).end(), ends[t]);\n  }\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sentence.proto",
    "content": "// Protocol buffer specification for document analysis.\n\nsyntax = \"proto2\";\n\npackage syntaxnet;\n\n// A Sentence contains the raw text contents of a sentence, as well as an\n// analysis.\nmessage Sentence {\n  // Identifier for document.\n  optional string docid = 1;\n\n  // Raw text contents of the sentence.\n  optional string text = 2;\n\n  // Tokenization of the sentence.\n  repeated Token token = 3;\n\n  extensions 1000 to max;\n}\n\n// A document token marks a span of bytes in the document text as a token\n// or word.\nmessage Token {\n  // Token word form.\n  required string word = 1;\n\n  // Start position of token in text.\n  required int32 start = 2;\n\n  // End position of token in text. Gives index of last byte, not one past\n  // the last byte. If token came from lexer, excludes any trailing HTML tags.\n  required int32 end = 3;\n\n  // Head of this token in the dependency tree: the id of the token which has an\n  // arc going to this one. If it is the root token of a sentence, then it is\n  // set to -1.\n  optional int32 head = 4 [default = -1];\n\n  // Part-of-speech tag for token.\n  optional string tag = 5;\n\n  // Coarse-grained word category for token.\n  optional string category = 6;\n\n  // Label for dependency relation between this token and its head.\n  optional string label = 7;\n\n  // Break level for tokens that indicates how it was separated from the\n  // previous token in the text.\n  enum BreakLevel {\n    NO_BREAK = 0;         // No separation between tokens.\n    SPACE_BREAK = 1;      // Tokens separated by space.\n    LINE_BREAK = 2;       // Tokens separated by line break.\n    SENTENCE_BREAK = 3;   // Tokens separated by sentence break.\n  }\n\n  optional BreakLevel break_level = 8 [default = SPACE_BREAK];\n\n  extensions 1000 to max;\n}\n\n// Stores information about the morphology of a token.\nmessage TokenMorphology {\n  extend Token {\n    optional TokenMorphology morphology = 63949837;\n  }\n\n  // 
Morphology is represented by a set of attribute values.\n  message Attribute {\n    required string name = 1;\n    required string value = 2;\n  }\n  // This attribute field is designated to hold a single disambiguated analysis.\n  repeated Attribute attribute = 3;\n};\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sentence_batch.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/sentence_batch.h\"\n\n#include <memory>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/task_context.h\"\n\nnamespace syntaxnet {\n\nvoid SentenceBatch::Init(TaskContext *context) {\n  reader_.reset(new TextReader(*context->GetInput(input_name_), context));\n  size_ = 0;\n}\n\nbool SentenceBatch::AdvanceSentence(int index) {\n  if (sentences_[index] == nullptr) ++size_;\n  sentences_[index].reset();\n  std::unique_ptr<Sentence> sentence(reader_->Read());\n  if (sentence == nullptr) {\n    --size_;\n    return false;\n  }\n\n  // Preprocess the new sentence for the parser state.\n  sentences_[index] = std::move(sentence);\n  return true;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sentence_batch.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_SENTENCE_BATCH_H_\n#define SYNTAXNET_SENTENCE_BATCH_H_\n\n#include <memory>\n#include <string>\n#include <utility>\n#include <vector>\n\n#include \"syntaxnet/embedding_feature_extractor.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/sparse.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n\nnamespace syntaxnet {\n\n// Helper class to manage generating batches of preprocessed ParserState objects\n// by reading in multiple sentences in parallel.\nclass SentenceBatch {\n public:\n  SentenceBatch(int batch_size, string input_name)\n      : batch_size_(batch_size),\n        input_name_(std::move(input_name)),\n        sentences_(batch_size) {}\n\n  // Initializes all resources and opens the corpus file.\n  void Init(TaskContext *context);\n\n  // Advances the index'th sentence in the batch to the next sentence. This will\n  // create and preprocess a new ParserState for that element. 
Returns false if\n  // EOF is reached (if EOF, also sets the state to be nullptr.)\n  bool AdvanceSentence(int index);\n\n  // Rewinds the corpus reader.\n  void Rewind() { reader_->Reset(); }\n\n  int size() const { return size_; }\n\n  Sentence *sentence(int index) { return sentences_[index].get(); }\n\n private:\n  // Running tally of non-nullptr states in the batch.\n  int size_;\n\n  // Maximum number of states in the batch.\n  int batch_size_;\n\n  // Input to read from the TaskContext.\n  string input_name_;\n\n  // Reader for the corpus.\n  std::unique_ptr<TextReader> reader_;\n\n  // Batch: Sentence objects.\n  std::vector<std::unique_ptr<Sentence>> sentences_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_SENTENCE_BATCH_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sentence_features.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/sentence_features.h\"\n#include \"syntaxnet/char_properties.h\"\n#include \"syntaxnet/registry.h\"\n#include \"util/utf8/unicodetext.h\"\n#include \"util/utf8/unilib.h\"\n#include \"util/utf8/unilib_utf8_utils.h\"\n\nnamespace syntaxnet {\n\nTermFrequencyMapFeature::~TermFrequencyMapFeature() {\n  if (term_map_ != nullptr) {\n    SharedStore::Release(term_map_);\n    term_map_ = nullptr;\n  }\n}\n\nvoid TermFrequencyMapFeature::Setup(TaskContext *context) {\n  TokenLookupFeature::Setup(context);\n  context->GetInput(input_name_, \"text\", \"\");\n}\n\nvoid TermFrequencyMapFeature::Init(TaskContext *context) {\n  min_freq_ = GetIntParameter(\"min-freq\", 0);\n  max_num_terms_ = GetIntParameter(\"max-num-terms\", 0);\n  file_name_ = context->InputFile(*context->GetInput(input_name_));\n  term_map_ = SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(\n      file_name_, min_freq_, max_num_terms_);\n  TokenLookupFeature::Init(context);\n}\n\nstring TermFrequencyMapFeature::GetFeatureValueName(FeatureValue value) const {\n  if (value == UnknownValue()) return \"<UNKNOWN>\";\n  if (value >= 0 && value < (NumValues() - 1)) {\n    return term_map_->GetTerm(value);\n  }\n  LOG(ERROR) << \"Invalid feature value: \" << value;\n  return \"<INVALID>\";\n}\n\nstring 
TermFrequencyMapFeature::WorkspaceName() const {\n  return SharedStoreUtils::CreateDefaultName(\"term-frequency-map\", input_name_,\n                                             min_freq_, max_num_terms_);\n}\n\nTermFrequencyMapSetFeature::~TermFrequencyMapSetFeature() {\n  if (term_map_ != nullptr) {\n    SharedStore::Release(term_map_);\n    term_map_ = nullptr;\n  }\n}\n\nvoid TermFrequencyMapSetFeature::Setup(TaskContext *context) {\n  context->GetInput(input_name_, \"text\", \"\");\n}\n\nvoid TermFrequencyMapSetFeature::Init(TaskContext *context) {\n  min_freq_ = GetIntParameter(\"min-freq\", 0);\n  max_num_terms_ = GetIntParameter(\"max-num-terms\", 0);\n  file_name_ = context->InputFile(*context->GetInput(input_name_));\n  term_map_ = SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(\n      file_name_, min_freq_, max_num_terms_);\n  TokenLookupSetFeature::Init(context);\n}\n\nstring TermFrequencyMapSetFeature::WorkspaceName() const {\n  return SharedStoreUtils::CreateDefaultName(\n      \"term-frequency-map-set\", input_name_, min_freq_, max_num_terms_);\n}\n\nnamespace {\nvoid GetUTF8Chars(const string &word, vector<tensorflow::StringPiece> *chars) {\n  UnicodeText text;\n  text.PointToUTF8(word.c_str(), word.size());\n  for (UnicodeText::const_iterator it = text.begin(); it != text.end(); ++it) {\n    chars->push_back(tensorflow::StringPiece(it.utf8_data(), it.utf8_length()));\n  }\n}\n\nint UTF8FirstLetterNumBytes(const char *utf8_str) {\n  if (*utf8_str == '\\0') return 0;\n  return UniLib::OneCharLen(utf8_str);\n}\n\n}  // namespace\n\nvoid CharNgram::GetTokenIndices(const Token &token, vector<int> *values) const {\n  values->clear();\n  vector<tensorflow::StringPiece> char_sp;\n  if (use_terminators_) char_sp.push_back(\"^\");\n  GetUTF8Chars(token.word(), &char_sp);\n  if (use_terminators_) char_sp.push_back(\"$\");\n  for (int start = 0; start < char_sp.size(); ++start) {\n    string char_ngram;\n    for (int index = 0;\n         index < 
max_char_ngram_length_ && start + index < char_sp.size();\n         ++index) {\n      tensorflow::StringPiece c = char_sp[start + index];\n      if (c == \" \") break;  // Never add char ngrams containing spaces.\n      tensorflow::strings::StrAppend(&char_ngram, c);\n      int value = LookupIndex(char_ngram);\n      if (value != -1) {  // Skip unknown values.\n        values->push_back(value);\n      }\n    }\n  }\n}\n\nvoid MorphologySet::GetTokenIndices(const Token &token,\n                                    vector<int> *values) const {\n  values->clear();\n  const TokenMorphology &token_morphology =\n      token.GetExtension(TokenMorphology::morphology);\n  for (const TokenMorphology::Attribute &att : token_morphology.attribute()) {\n    int value =\n        LookupIndex(tensorflow::strings::StrCat(att.name(), \"=\", att.value()));\n    if (value != -1) {  // Skip unknown values.\n      values->push_back(value);\n    }\n  }\n}\n\nstring Hyphen::GetFeatureValueName(FeatureValue value) const {\n  switch (value) {\n    case NO_HYPHEN:\n      return \"NO_HYPHEN\";\n    case HAS_HYPHEN:\n      return \"HAS_HYPHEN\";\n  }\n  return \"<INVALID>\";\n}\n\nFeatureValue Hyphen::ComputeValue(const Token &token) const {\n  const string &word = token.word();\n  return (word.find('-') < word.length() ? 
HAS_HYPHEN : NO_HYPHEN);\n}\n\nvoid Capitalization::Setup(TaskContext *context) {\n  utf8_ = (GetParameter(\"utf8\") == \"true\");\n}\n\n// Runs ComputeValue for each token in the sentence.\nvoid Capitalization::Preprocess(WorkspaceSet *workspaces,\n                                Sentence *sentence) const {\n  if (workspaces->Has<VectorIntWorkspace>(Workspace())) return;\n  VectorIntWorkspace *workspace =\n      new VectorIntWorkspace(sentence->token_size());\n  for (int i = 0; i < sentence->token_size(); ++i) {\n    const int value = ComputeValueWithFocus(sentence->token(i), i);\n    workspace->set_element(i, value);\n  }\n  workspaces->Set<VectorIntWorkspace>(Workspace(), workspace);\n}\n\nstring Capitalization::GetFeatureValueName(FeatureValue value) const {\n  switch (value) {\n    case LOWERCASE:\n      return \"LOWERCASE\";\n    case UPPERCASE:\n      return \"UPPERCASE\";\n    case CAPITALIZED:\n      return \"CAPITALIZED\";\n    case CAPITALIZED_SENTENCE_INITIAL:\n      return \"CAPITALIZED_SENTENCE_INITIAL\";\n    case NON_ALPHABETIC:\n      return \"NON_ALPHABETIC\";\n  }\n  return \"<INVALID>\";\n}\n\nFeatureValue Capitalization::ComputeValueWithFocus(const Token &token,\n                                                   int focus) const {\n  const string &word = token.word();\n\n  // Check whether there is an uppercase or lowercase character.\n  bool has_upper = false;\n  bool has_lower = false;\n  if (utf8_) {\n    LOG(FATAL) << \"Not implemented.\";\n  } else {\n    const char *str = word.c_str();\n    for (int i = 0; i < word.length(); ++i) {\n      const char c = str[i];\n      has_upper = (has_upper || (c >= 'A' && c <= 'Z'));\n      has_lower = (has_lower || (c >= 'a' && c <= 'z'));\n    }\n  }\n\n  // Compute simple values.\n  if (!has_upper && has_lower) return LOWERCASE;\n  if (has_upper && !has_lower) return UPPERCASE;\n  if (!has_upper && !has_lower) return NON_ALPHABETIC;\n\n  // Else has_upper && has_lower; a normal capitalized word.  
Check the break\n  // level to determine whether the capitalized word is sentence-initial.\n  const bool sentence_initial = (focus == 0);\n  return sentence_initial ? CAPITALIZED_SENTENCE_INITIAL : CAPITALIZED;\n}\n\nstring PunctuationAmount::GetFeatureValueName(FeatureValue value) const {\n  switch (value) {\n    case NO_PUNCTUATION:\n      return \"NO_PUNCTUATION\";\n    case SOME_PUNCTUATION:\n      return \"SOME_PUNCTUATION\";\n    case ALL_PUNCTUATION:\n      return \"ALL_PUNCTUATION\";\n  }\n  return \"<INVALID>\";\n}\n\nFeatureValue PunctuationAmount::ComputeValue(const Token &token) const {\n  const string &word = token.word();\n  bool has_punctuation = false;\n  bool all_punctuation = true;\n\n  const char *start = word.c_str();\n  const char *end = word.c_str() + word.size();\n  while (start < end) {\n    int char_length = UTF8FirstLetterNumBytes(start);\n    bool char_is_punct = is_punctuation_or_symbol(start, char_length);\n    all_punctuation &= char_is_punct;\n    has_punctuation |= char_is_punct;\n    if (!all_punctuation && has_punctuation) return SOME_PUNCTUATION;\n    start += char_length;\n  }\n  if (!all_punctuation) return NO_PUNCTUATION;\n  return ALL_PUNCTUATION;\n}\n\nstring Quote::GetFeatureValueName(FeatureValue value) const {\n  switch (value) {\n    case NO_QUOTE:\n      return \"NO_QUOTE\";\n    case OPEN_QUOTE:\n      return \"OPEN_QUOTE\";\n    case CLOSE_QUOTE:\n      return \"CLOSE_QUOTE\";\n    case UNKNOWN_QUOTE:\n      return \"UNKNOWN_QUOTE\";\n  }\n  return \"<INVALID>\";\n}\n\nFeatureValue Quote::ComputeValue(const Token &token) const {\n  const string &word = token.word();\n\n  // Penn Treebank open and close quotes are multi-character.\n  if (word == \"``\") return OPEN_QUOTE;\n  if (word == \"''\") return CLOSE_QUOTE;\n  if (word.length() == 1) {\n    int char_len = UTF8FirstLetterNumBytes(word.c_str());\n    bool is_open = is_open_quote(word.c_str(), char_len);\n    bool is_close = is_close_quote(word.c_str(), char_len);\n 
   if (is_open && !is_close) return OPEN_QUOTE;\n    if (is_close && !is_open) return CLOSE_QUOTE;\n    if (is_open && is_close) return UNKNOWN_QUOTE;\n  }\n  return NO_QUOTE;\n}\n\nvoid Quote::Preprocess(WorkspaceSet *workspaces, Sentence *sentence) const {\n  if (workspaces->Has<VectorIntWorkspace>(Workspace())) return;\n  VectorIntWorkspace *workspace =\n      new VectorIntWorkspace(sentence->token_size());\n\n  // For double quote \", it is unknown whether they are open or closed without\n  // looking at the prior tokens in the sentence.  in_quote is true iff an odd\n  // number of \" marks have been seen so far in the sentence (similar to the\n  // behavior of some tokenizers).\n  bool in_quote = false;\n  for (int i = 0; i < sentence->token_size(); ++i) {\n    int quote_type = ComputeValue(sentence->token(i));\n    if (quote_type == UNKNOWN_QUOTE) {\n      // Update based on in_quote and flip in_quote.\n      quote_type = in_quote ? CLOSE_QUOTE : OPEN_QUOTE;\n      in_quote = !in_quote;\n    }\n    workspace->set_element(i, quote_type);\n  }\n  workspaces->Set<VectorIntWorkspace>(Workspace(), workspace);\n}\n\nstring Digit::GetFeatureValueName(FeatureValue value) const {\n  switch (value) {\n    case NO_DIGIT:\n      return \"NO_DIGIT\";\n    case SOME_DIGIT:\n      return \"SOME_DIGIT\";\n    case ALL_DIGIT:\n      return \"ALL_DIGIT\";\n  }\n  return \"<INVALID>\";\n}\n\nFeatureValue Digit::ComputeValue(const Token &token) const {\n  const string &word = token.word();\n  bool has_digit = isdigit(word[0]);\n  bool all_digit = has_digit;\n  for (size_t i = 1; i < word.length(); ++i) {\n    bool char_is_digit = isdigit(word[i]);\n    all_digit = all_digit && char_is_digit;\n    has_digit = has_digit || char_is_digit;\n    if (!all_digit && has_digit) return SOME_DIGIT;\n  }\n  if (!all_digit) return NO_DIGIT;\n  return ALL_DIGIT;\n}\n\nAffixTableFeature::AffixTableFeature(AffixTable::Type type)\n    : type_(type) {\n  if (type == AffixTable::PREFIX) {\n    
input_name_ = \"prefix-table\";\n  } else {\n    input_name_ = \"suffix-table\";\n  }\n}\n\nAffixTableFeature::~AffixTableFeature() {\n  SharedStore::Release(affix_table_);\n  affix_table_ = nullptr;\n}\n\nstring AffixTableFeature::WorkspaceName() const {\n  return SharedStoreUtils::CreateDefaultName(\n      \"affix-table\", input_name_, type_, affix_length_);\n}\n\n// Utility function to create a new affix table without changing constructors,\n// to be called by the SharedStore.\nstatic AffixTable *CreateAffixTable(const string &filename,\n                                    AffixTable::Type type) {\n  AffixTable *affix_table = new AffixTable(type, 1);\n  std::unique_ptr<tensorflow::RandomAccessFile> file;\n  TF_CHECK_OK(tensorflow::Env::Default()->NewRandomAccessFile(filename, &file));\n  ProtoRecordReader reader(file.release());\n  affix_table->Read(&reader);\n  return affix_table;\n}\n\nvoid AffixTableFeature::Setup(TaskContext *context) {\n  context->GetInput(input_name_, \"recordio\", \"affix-table\");\n  affix_length_ = GetIntParameter(\"length\", 0);\n  CHECK_GE(affix_length_, 0) << \"Length must be specified for affix feature.\";\n  TokenLookupFeature::Setup(context);\n}\n\nvoid AffixTableFeature::Init(TaskContext *context) {\n  string filename = context->InputFile(*context->GetInput(input_name_));\n\n  // Get the shared AffixTable object.\n  std::function<AffixTable *()> closure =\n      std::bind(CreateAffixTable, filename, type_);\n  affix_table_ = SharedStore::ClosureGetOrDie(filename, &closure);\n  CHECK_GE(affix_table_->max_length(), affix_length_)\n      << \"Affixes of length \" << affix_length_ << \" needed, but the affix \"\n      <<\"table only provides affixes of length <= \"\n      << affix_table_->max_length() << \".\";\n  TokenLookupFeature::Init(context);\n}\n\nFeatureValue AffixTableFeature::ComputeValue(const Token &token) const {\n  const string &word = token.word();\n  UnicodeText text;\n  text.PointToUTF8(word.c_str(), word.size());\n  
if (affix_length_ > text.size()) return UnknownValue();\n  UnicodeText::const_iterator start, end;\n  if (type_ == AffixTable::PREFIX) {\n    start = end = text.begin();\n    for (int i = 0; i < affix_length_; ++i) ++end;\n  } else {\n    start = end = text.end();\n    for (int i = 0; i < affix_length_; ++i) --start;\n  }\n  string affix(start.utf8_data(), end.utf8_data() - start.utf8_data());\n  int affix_id = affix_table_->AffixId(affix);\n  return affix_id == -1 ? UnknownValue() : affix_id;\n}\n\nstring AffixTableFeature::GetFeatureValueName(FeatureValue value) const {\n  if (value == UnknownValue()) return \"<UNKNOWN>\";\n  if (value >= 0 && value < UnknownValue()) {\n    return affix_table_->AffixForm(value);\n  }\n  LOG(ERROR) << \"Invalid feature value: \" << value;\n  return \"<INVALID>\";\n}\n\n// Registry for the Sentence + token index feature functions.\nREGISTER_CLASS_REGISTRY(\"sentence+index feature function\", SentenceFeature);\n\n// Register the features defined in the header.\nREGISTER_SENTENCE_IDX_FEATURE(\"word\", Word);\nREGISTER_SENTENCE_IDX_FEATURE(\"char\", Char);\nREGISTER_SENTENCE_IDX_FEATURE(\"lcword\", LowercaseWord);\nREGISTER_SENTENCE_IDX_FEATURE(\"tag\", Tag);\nREGISTER_SENTENCE_IDX_FEATURE(\"offset\", Offset);\nREGISTER_SENTENCE_IDX_FEATURE(\"hyphen\", Hyphen);\nREGISTER_SENTENCE_IDX_FEATURE(\"digit\", Digit);\nREGISTER_SENTENCE_IDX_FEATURE(\"prefix\", PrefixFeature);\nREGISTER_SENTENCE_IDX_FEATURE(\"suffix\", SuffixFeature);\nREGISTER_SENTENCE_IDX_FEATURE(\"char-ngram\", CharNgram);\nREGISTER_SENTENCE_IDX_FEATURE(\"morphology-set\", MorphologySet);\nREGISTER_SENTENCE_IDX_FEATURE(\"capitalization\", Capitalization);\nREGISTER_SENTENCE_IDX_FEATURE(\"punctuation-amount\", PunctuationAmount);\nREGISTER_SENTENCE_IDX_FEATURE(\"quote\", Quote);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sentence_features.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Features that operate on Sentence objects. Most features are defined\n// in this header so they may be re-used via composition into other more\n// advanced feature classes.\n\n#ifndef SYNTAXNET_SENTENCE_FEATURES_H_\n#define SYNTAXNET_SENTENCE_FEATURES_H_\n\n#include \"syntaxnet/affix.h\"\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/feature_types.h\"\n#include \"syntaxnet/segmenter_utils.h\"\n#include \"syntaxnet/shared_store.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/workspace.h\"\n\nnamespace syntaxnet {\n\n// Feature function for any component that processes Sentences, whose\n// focus is a token index into the sentence.\ntypedef FeatureFunction<Sentence, int> SentenceFeature;\n\n// Alias for Locator type features that take (Sentence, int) signatures\n// and call other (Sentence, int) features.\ntemplate <class DER>\nusing Locator = FeatureLocator<DER, Sentence, int>;\n\nclass TokenLookupFeature : public SentenceFeature {\n public:\n  void Init(TaskContext *context) override {\n    set_feature_type(new ResourceBasedFeatureType<TokenLookupFeature>(\n        name(), this, {{NumValues(), \"<OUTSIDE>\"}}));\n  }\n\n  // Given a position in a sentence and workspaces, looks up the corresponding\n  // feature value. 
The index is relative to the start of the sentence.\n  virtual FeatureValue ComputeValue(const Token &token) const = 0;\n\n  // Number of unique values.\n  virtual int64 NumValues() const = 0;\n\n  // Convert the numeric value of the feature to a human readable string.\n  virtual string GetFeatureValueName(FeatureValue value) const = 0;\n\n  // Name of the shared workspace.\n  virtual string WorkspaceName() const = 0;\n\n  // Runs ComputeValue for each token in the sentence.\n  void Preprocess(WorkspaceSet *workspaces,\n                  Sentence *sentence) const override {\n    if (workspaces->Has<VectorIntWorkspace>(workspace_)) return;\n    VectorIntWorkspace *workspace = new VectorIntWorkspace(\n        sentence->token_size());\n    for (int i = 0; i < sentence->token_size(); ++i) {\n      const int value = ComputeValue(sentence->token(i));\n      workspace->set_element(i, value);\n    }\n    workspaces->Set<VectorIntWorkspace>(workspace_, workspace);\n  }\n\n  // Requests a vector of int's to store in the workspace registry.\n  void RequestWorkspaces(WorkspaceRegistry *registry) override {\n    workspace_ = registry->Request<VectorIntWorkspace>(WorkspaceName());\n  }\n\n  // Returns the precomputed value, or NumValues() for features outside\n  // the sentence.\n  FeatureValue Compute(const WorkspaceSet &workspaces,\n                       const Sentence &sentence, int focus,\n                       const FeatureVector *result) const override {\n    if (focus < 0 || focus >= sentence.token_size()) return NumValues();\n    return workspaces.Get<VectorIntWorkspace>(workspace_).element(focus);\n  }\n\n  int Workspace() const { return workspace_; }\n\n private:\n  int workspace_;\n};\n\n// A multi purpose specialization of the feature. Processes the tokens in a\n// Sentence by looking up a value set for each token and storing that in\n// a VectorVectorInt workspace. 
Given a set of base values of size Size(),\n// reserves an extra value for unknown tokens.\nclass TokenLookupSetFeature : public SentenceFeature {\n public:\n  void Init(TaskContext *context) override {\n    set_feature_type(new ResourceBasedFeatureType<TokenLookupSetFeature>(\n        name(), this, {{NumValues(), \"<OUTSIDE>\"}}));\n  }\n\n  // Number of unique values.\n  virtual int64 NumValues() const = 0;\n\n  // Given a position in a sentence and workspaces, looks up the corresponding\n  // feature value set. The index is relative to the start of the sentence.\n  virtual void LookupToken(const WorkspaceSet &workspaces,\n                           const Sentence &sentence, int index,\n                           vector<int> *values) const = 0;\n\n  // Given a feature value, returns a string representation.\n  virtual string GetFeatureValueName(int value) const = 0;\n\n  // Name of the shared workspace.\n  virtual string WorkspaceName() const = 0;\n\n  // TokenLookupSetFeatures use VectorVectorIntWorkspaces by default.\n  void RequestWorkspaces(WorkspaceRegistry *registry) override {\n    workspace_ = registry->Request<VectorVectorIntWorkspace>(WorkspaceName());\n  }\n\n  // Default preprocessing: looks up a value set for each token in the Sentence.\n  void Preprocess(WorkspaceSet *workspaces, Sentence *sentence) const override {\n    if (workspaces->Has<VectorVectorIntWorkspace>(workspace_)) return;\n    VectorVectorIntWorkspace *workspace =\n        new VectorVectorIntWorkspace(sentence->token_size());\n    for (int i = 0; i < sentence->token_size(); ++i) {\n      LookupToken(*workspaces, *sentence, i, workspace->mutable_elements(i));\n    }\n    workspaces->Set<VectorVectorIntWorkspace>(workspace_, workspace);\n  }\n\n  // Returns a pre-computed token value from the cache. 
This assumes the cache\n  // is populated.\n  const vector<int> &GetCachedValueSet(const WorkspaceSet &workspaces,\n                                       const Sentence &sentence,\n                                       int focus) const {\n    // Do bounds checking on focus.\n    CHECK_GE(focus, 0);\n    CHECK_LT(focus, sentence.token_size());\n\n    // Return value from cache.\n    return workspaces.Get<VectorVectorIntWorkspace>(workspace_).elements(focus);\n  }\n\n  // Adds any precomputed features at the given focus, if present.\n  void Evaluate(const WorkspaceSet &workspaces, const Sentence &sentence,\n                int focus, FeatureVector *result) const override {\n    if (focus >= 0 && focus < sentence.token_size()) {\n      const vector<int> &elements =\n          GetCachedValueSet(workspaces, sentence, focus);\n      for (auto &value : elements) {\n        result->add(this->feature_type(), value);\n      }\n    }\n  }\n\n  // Returns the precomputed value, or NumValues() for features outside\n  // the sentence.\n  FeatureValue Compute(const WorkspaceSet &workspaces, const Sentence &sentence,\n                       int focus, const FeatureVector *result) const override {\n    if (focus < 0 || focus >= sentence.token_size()) return NumValues();\n    return workspaces.Get<VectorIntWorkspace>(workspace_).element(focus);\n  }\n\n private:\n  int workspace_;\n};\n\n// Lookup feature that uses a TermFrequencyMap to store a string->int mapping.\nclass TermFrequencyMapFeature : public TokenLookupFeature {\n public:\n  explicit TermFrequencyMapFeature(const string &input_name)\n      : input_name_(input_name), min_freq_(0), max_num_terms_(0) {}\n  ~TermFrequencyMapFeature() override;\n\n  // Requests the input map as a resource.\n  void Setup(TaskContext *context) override;\n\n  // Loads the input map into memory (using SharedStore to avoid redundancy.)\n  void Init(TaskContext *context) override;\n\n  // Number of unique values.\n  int64 NumValues() const 
override { return term_map_->Size() + 1; }\n\n  // Special value for strings not in the map.\n  FeatureValue UnknownValue() const { return term_map_->Size(); }\n\n  // Uses the TermFrequencyMap to lookup the string associated with a value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Name of the shared workspace.\n  string WorkspaceName() const override;\n\n protected:\n  const TermFrequencyMap &term_map() const { return *term_map_; }\n\n private:\n  // Shortcut pointer to shared map. Not owned.\n  const TermFrequencyMap *term_map_ = nullptr;\n\n  // Name of the input for the term map.\n  string input_name_;\n\n  // Filename of the underlying resource.\n  string file_name_;\n\n  // Minimum frequency for term map.\n  int min_freq_;\n\n  // Maximum number of terms for term map.\n  int max_num_terms_;\n};\n\n// Specialization of the TokenLookupSetFeature class to use a TermFrequencyMap\n// to perform the mapping. This takes two options: \"min_freq\" (discard tokens\n// with less than this min frequency), and \"max_num_terms\" (only read in at most\n// these terms.)\nclass TermFrequencyMapSetFeature : public TokenLookupSetFeature {\n public:\n  // Initializes with an empty name, since we need the options to compute the\n  // actual workspace name.\n  explicit TermFrequencyMapSetFeature(const string &input_name)\n      : input_name_(input_name), min_freq_(0), max_num_terms_(0) {}\n\n  // Releases shared resources.\n  ~TermFrequencyMapSetFeature() override;\n\n  // Returns index of raw word text.\n  virtual void GetTokenIndices(const Token &token,\n                               vector<int> *values) const = 0;\n\n  // Requests the resource inputs.\n  void Setup(TaskContext *context) override;\n\n  // Obtains resources using the shared store. 
At this point options are known\n  // so the full name can be computed.\n  void Init(TaskContext *context) override;\n\n  // Number of unique values.\n  int64 NumValues() const override { return term_map_->Size(); }\n\n  // Special value for strings not in the map.\n  FeatureValue UnknownValue() const { return term_map_->Size(); }\n\n  // Gets pointer to the underlying map.\n  const TermFrequencyMap *term_map() const { return term_map_; }\n\n  // Returns the term index or the unknown value. Used inside GetTokenIndices()\n  // specializations for convenience.\n  int LookupIndex(const string &term) const {\n    return term_map_->LookupIndex(term, -1);\n  }\n\n  // Given a position in a sentence and workspaces, looks up the corresponding\n  // feature value set. The index is relative to the start of the sentence.\n  void LookupToken(const WorkspaceSet &workspaces, const Sentence &sentence,\n                   int index, vector<int> *values) const override {\n    GetTokenIndices(sentence.token(index), values);\n  }\n\n  // Uses the TermFrequencyMap to lookup the string associated with a value.\n  string GetFeatureValueName(int value) const override {\n    if (value == UnknownValue()) return \"<UNKNOWN>\";\n    if (value >= 0 && value < NumValues()) {\n      return term_map_->GetTerm(value);\n    }\n    LOG(ERROR) << \"Invalid feature value: \" << value;\n    return \"<INVALID>\";\n  }\n\n  // Name of the shared workspace.\n  string WorkspaceName() const override;\n\n private:\n  // Shortcut pointer to shared map. 
Not owned.\n  const TermFrequencyMap *term_map_ = nullptr;\n\n  // Name of the input for the term map.\n  string input_name_;\n\n  // Filename of the underlying resource.\n  string file_name_;\n\n  // Minimum frequency for term map.\n  int min_freq_;\n\n  // Maximum number of terms for term map.\n  int max_num_terms_;\n};\n\nclass Word : public TermFrequencyMapFeature {\n public:\n  Word() : TermFrequencyMapFeature(\"word-map\") {}\n\n  FeatureValue ComputeValue(const Token &token) const override {\n    const string &form = token.word();\n    return term_map().LookupIndex(form, UnknownValue());\n  }\n};\n\nclass Char : public TermFrequencyMapFeature {\n public:\n  Char() : TermFrequencyMapFeature(\"char-map\") {}\n\n  FeatureValue ComputeValue(const Token &token) const override {\n    const string &form = token.word();\n    if (SegmenterUtils::IsBreakChar(form)) return BreakCharValue();\n    return term_map().LookupIndex(form, UnknownValue());\n  }\n\n  // Special value for breaks.\n  FeatureValue BreakCharValue() const { return term_map().Size(); }\n\n  // Special value for non-break strings not in the map.\n  FeatureValue UnknownValue() const { return term_map().Size() + 1; }\n\n  // Number of unique values.\n  int64 NumValues() const override { return term_map().Size() + 2; }\n\n  string GetFeatureValueName(FeatureValue value) const override {\n    if (value == BreakCharValue()) return \"<BREAK_CHAR>\";\n    if (value == UnknownValue()) return \"<UNKNOWN>\";\n    if (value >= 0 && value < term_map().Size()) {\n      return term_map().GetTerm(value);\n    }\n    LOG(ERROR) << \"Invalid feature value: \" << value;\n    return \"<INVALID>\";\n  }\n};\n\nclass LowercaseWord : public TermFrequencyMapFeature {\n public:\n  LowercaseWord() : TermFrequencyMapFeature(\"lc-word-map\") {}\n\n  FeatureValue ComputeValue(const Token &token) const override {\n    const string lcword = utils::Lowercase(token.word());\n    return term_map().LookupIndex(lcword, 
UnknownValue());\n  }\n};\n\nclass Tag : public TermFrequencyMapFeature {\n public:\n  Tag() : TermFrequencyMapFeature(\"tag-map\") {}\n\n  FeatureValue ComputeValue(const Token &token) const override {\n    return term_map().LookupIndex(token.tag(), UnknownValue());\n  }\n};\n\nclass Label : public TermFrequencyMapFeature {\n public:\n  Label() : TermFrequencyMapFeature(\"label-map\") {}\n\n  FeatureValue ComputeValue(const Token &token) const override {\n    return term_map().LookupIndex(token.label(), UnknownValue());\n  }\n};\n\nclass CharNgram : public TermFrequencyMapSetFeature {\n public:\n  CharNgram() : TermFrequencyMapSetFeature(\"char-ngram-map\") {}\n  ~CharNgram() override {}\n\n  void Setup(TaskContext *context) override {\n    TermFrequencyMapSetFeature::Setup(context);\n    max_char_ngram_length_ = context->Get(\"lexicon_max_char_ngram_length\", 3);\n    use_terminators_ =\n        context->Get(\"lexicon_char_ngram_include_terminators\", false);\n  }\n\n  // Returns index of raw word text.\n  void GetTokenIndices(const Token &token, vector<int> *values) const override;\n\n private:\n  // Size parameter (n) for the ngrams.\n  int max_char_ngram_length_ = 3;\n\n  // Whether to pad the word with ^ and $ before extracting ngrams.\n  bool use_terminators_ = false;\n};\n\nclass MorphologySet : public TermFrequencyMapSetFeature {\n public:\n  MorphologySet() : TermFrequencyMapSetFeature(\"morphology-map\") {}\n  ~MorphologySet() override {}\n\n  void Setup(TaskContext *context) override {\n    TermFrequencyMapSetFeature::Setup(context);\n  }\n\n\n  int64 NumValues() const override {\n    return term_map()->Size() - 1;\n  }\n\n  // Returns index of raw word text.\n  void GetTokenIndices(const Token &token, vector<int> *values) const override;\n};\n\nclass LexicalCategoryFeature : public TokenLookupFeature {\n public:\n  LexicalCategoryFeature(const string &name, int cardinality)\n      : name_(name), cardinality_(cardinality) {}\n  ~LexicalCategoryFeature() 
override {}\n\n  FeatureValue NumValues() const override { return cardinality_; }\n\n  // Returns the identifier for the workspace for this feature.\n  string WorkspaceName() const override {\n    return tensorflow::strings::StrCat(name_, \":\", cardinality_);\n  }\n\n private:\n  // Name of the category type.\n  const string name_;\n\n  // Number of values.\n  const int cardinality_;\n};\n\n// Feature that computes whether a word has a hyphen or not.\nclass Hyphen : public LexicalCategoryFeature {\n public:\n  // Enumeration of values.\n  enum Category {\n    NO_HYPHEN = 0,\n    HAS_HYPHEN = 1,\n    CARDINALITY = 2,\n  };\n\n  // Default constructor.\n  Hyphen() : LexicalCategoryFeature(\"hyphen\", CARDINALITY) {}\n\n  // Returns a string representation of the enum value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the category value for the token.\n  FeatureValue ComputeValue(const Token &token) const override;\n};\n\n// Feature that categorizes the capitalization of the word. 
If the option\n// utf8=true is specified, lowercase and uppercase checks are done with UTF8\n// compliant functions.\nclass Capitalization : public LexicalCategoryFeature {\n public:\n  // Enumeration of values.\n  enum Category {\n    LOWERCASE = 0,                     // normal word\n    UPPERCASE = 1,                     // all-caps\n    CAPITALIZED = 2,                   // has one cap and one non-cap\n    CAPITALIZED_SENTENCE_INITIAL = 3,  // same as above but sentence-initial\n    NON_ALPHABETIC = 4,                // contains no alphabetic characters\n    CARDINALITY = 5,\n  };\n\n  // Default constructor.\n  Capitalization() : LexicalCategoryFeature(\"capitalization\", CARDINALITY) {}\n\n  // Sets one of the options for the capitalization.\n  void Setup(TaskContext *context) override;\n\n  // Capitalization needs special preprocessing because token category can\n  // depend on whether the token is at the start of the sentence.\n  void Preprocess(WorkspaceSet *workspaces, Sentence *sentence) const override;\n\n  // Returns a string representation of the enum value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the category value for the token.\n  FeatureValue ComputeValue(const Token &token) const override {\n    LOG(FATAL) << \"Capitalization should use ComputeValueWithFocus.\";\n    return 0;\n  }\n\n  // Returns the category value for the token.\n  FeatureValue ComputeValueWithFocus(const Token &token, int focus) const;\n\n private:\n  // Whether to use UTF8 compliant functions to check capitalization.\n  bool utf8_ = false;\n};\n\n// A feature for computing whether the focus token contains any punctuation\n// for ternary features.\nclass PunctuationAmount : public LexicalCategoryFeature {\n public:\n  // Enumeration of values.\n  enum Category {\n    NO_PUNCTUATION = 0,\n    SOME_PUNCTUATION = 1,\n    ALL_PUNCTUATION = 2,\n    CARDINALITY = 3,\n  };\n\n  // Default constructor.\n  PunctuationAmount()\n      : 
LexicalCategoryFeature(\"punctuation-amount\", CARDINALITY) {}\n\n  // Returns a string representation of the enum value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the category value for the token.\n  FeatureValue ComputeValue(const Token &token) const override;\n};\n\n// A feature that returns whether the word is an open or\n// close quotation mark, based on its relative position to other quotation marks\n// in the sentence.\nclass Quote : public LexicalCategoryFeature {\n public:\n  // Enumeration of values.\n  enum Category {\n    NO_QUOTE = 0,\n    OPEN_QUOTE = 1,\n    CLOSE_QUOTE = 2,\n    UNKNOWN_QUOTE = 3,\n    CARDINALITY = 4,\n  };\n\n  // Default constructor.\n  Quote() : LexicalCategoryFeature(\"quote\", CARDINALITY) {}\n\n  // Returns a string representation of the enum value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the category value for the token.\n  FeatureValue ComputeValue(const Token &token) const override;\n\n  // Override preprocess to compute open and close quotes from prior context of\n  // the sentence.\n  void Preprocess(WorkspaceSet *workspaces, Sentence *instance) const override;\n};\n\n// Feature that computes whether a word has digits or not.\nclass Digit : public LexicalCategoryFeature {\n public:\n  // Enumeration of values.\n  enum Category {\n    NO_DIGIT = 0,\n    SOME_DIGIT = 1,\n    ALL_DIGIT = 2,\n    CARDINALITY = 3,\n  };\n\n  // Default constructor.\n  Digit() : LexicalCategoryFeature(\"digit\", CARDINALITY) {}\n\n  // Returns a string representation of the enum value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n  // Returns the category value for the token.\n  FeatureValue ComputeValue(const Token &token) const override;\n};\n\n// TokenLookupFeature object to compute prefixes and suffixes of words. The\n// AffixTable is stored in the SharedStore. 
This is very similar to the\n// implementation of TermFrequencyMapFeature, but using an AffixTable to\n// perform the lookups. There are only two specializations, for prefixes and\n// suffixes.\nclass AffixTableFeature : public TokenLookupFeature {\n public:\n  // Explicit constructor to set the type of the table. This determines the\n  // requested input.\n  explicit AffixTableFeature(AffixTable::Type type);\n  ~AffixTableFeature() override;\n\n  // Requests inputs for the affix table.\n  void Setup(TaskContext *context) override;\n\n  // Loads the affix table from the SharedStore.\n  void Init(TaskContext *context) override;\n\n  // The workspace name is specific to which affix length we are computing.\n  string WorkspaceName() const override;\n\n  // Returns the total number of affixes in the table, regardless of specified\n  // length.\n  FeatureValue NumValues() const override { return affix_table_->size() + 1; }\n\n  // Special value for strings not in the map.\n  FeatureValue UnknownValue() const { return affix_table_->size(); }\n\n  // Looks up the affix for a given word.\n  FeatureValue ComputeValue(const Token &token) const override;\n\n  // Returns the string associated with a value.\n  string GetFeatureValueName(FeatureValue value) const override;\n\n private:\n  // Size parameter for the affix table.\n  int affix_length_;\n\n  // Name of the input for the table.\n  string input_name_;\n\n  // The type of the affix table.\n  const AffixTable::Type type_;\n\n  // Affix table used for indexing. This comes from the shared store, and is not\n  // owned directly.\n  const AffixTable *affix_table_ = nullptr;\n};\n\n// Specific instantiation for computing prefixes. This requires the input\n// \"prefix-table\".\nclass PrefixFeature : public AffixTableFeature {\n public:\n  PrefixFeature() : AffixTableFeature(AffixTable::PREFIX) {}\n};\n\n// Specific instantiation for computing suffixes. 
Requires the input\n// \"suffix-table.\"\nclass SuffixFeature : public AffixTableFeature {\n public:\n  SuffixFeature() : AffixTableFeature(AffixTable::SUFFIX) {}\n};\n\n// Offset locator. Simple locator: just changes the focus by some offset.\nclass Offset : public Locator<Offset> {\n public:\n  void UpdateArgs(const WorkspaceSet &workspaces,\n                  const Sentence &sentence, int *focus) const {\n    *focus += argument();\n  }\n};\n\ntypedef FeatureExtractor<Sentence, int> SentenceExtractor;\n\n// Utility to register the sentence_instance::Feature functions.\n#define REGISTER_SENTENCE_IDX_FEATURE(name, type) \\\n  REGISTER_FEATURE_FUNCTION(SentenceFeature, name, type)\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_SENTENCE_FEATURES_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sentence_features_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/sentence_features.h\"\n\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/feature_extractor.h\"\n#include \"syntaxnet/populate_test_inputs.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/utils.h\"\n#include \"syntaxnet/workspace.h\"\n#include <gmock/gmock.h>\n#include \"tensorflow/core/platform/test.h\"\n\nusing testing::UnorderedElementsAreArray;\n\nnamespace syntaxnet {\n\n// A basic fixture for testing Features. Takes a string of a\n// Sentence protobuf that is used as the test data in the constructor.\nclass SentenceFeaturesTest : public ::testing::Test {\n protected:\n  explicit SentenceFeaturesTest(const string &prototxt)\n      : sentence_(ParseASCII(prototxt)),\n        creators_(PopulateTestInputs::Defaults(sentence_)) {}\n\n  static Sentence ParseASCII(const string &prototxt) {\n    Sentence document;\n    CHECK(TextFormat::ParseFromString(prototxt, &document));\n    return document;\n  }\n\n  // Prepares a new feature for extracting from the attached sentence,\n  // regenerating the TaskContext and all resources. 
Will automatically\n  // populate any requested inputs using the creators_ map.\n  virtual void PrepareFeature(const string &fml) {\n    context_.mutable_spec()->mutable_input()->Clear();\n    context_.mutable_spec()->mutable_output()->Clear();\n    extractor_.reset(new SentenceExtractor());\n    extractor_->Parse(fml);\n    extractor_->Setup(&context_);\n    creators_.Populate(&context_);\n    extractor_->Init(&context_);\n    extractor_->RequestWorkspaces(&registry_);\n    workspaces_.Reset(registry_);\n    extractor_->Preprocess(&workspaces_, &sentence_);\n  }\n\n  // Returns the string representation of the prepared feature extracted at the\n  // given index.\n  virtual string ExtractFeature(int index) {\n    FeatureVector result;\n    extractor_->ExtractFeatures(workspaces_, sentence_, index,\n                                &result);\n    return result.type(0)->GetFeatureValueName(result.value(0));\n  }\n\n  // Extracts a vector of string representations from evaluating the prepared\n  // set feature (returning multiple values) at the given index.\n  virtual vector<string> ExtractMultiFeature(int index) {\n    vector<string> values;\n    FeatureVector result;\n    extractor_->ExtractFeatures(workspaces_, sentence_, index,\n                                &result);\n    for (int i = 0; i < result.size(); ++i) {\n      values.push_back(result.type(i)->GetFeatureValueName(result.value(i)));\n    }\n    return values;\n  }\n\n  // Adds an input to the task context.\n  void AddInputToContext(const string &name, const string &file_pattern,\n                         const string &file_format,\n                         const string &record_format) {\n    TaskInput *input = context_.GetInput(name);\n    TaskInput::Part *part = input->add_part();\n    part->set_file_pattern(file_pattern);\n    part->set_file_format(file_format);\n    part->set_record_format(record_format);\n  }\n\n  // Checks that a vector workspace is equal to a target vector.\n  void 
CheckVectorWorkspace(const VectorIntWorkspace &workspace,\n                            vector<int> target) {\n    vector<int> src;\n    for (int i = 0; i < workspace.size(); ++i) {\n      src.push_back(workspace.element(i));\n    }\n    EXPECT_THAT(src, testing::ContainerEq(target));\n  }\n\n  Sentence sentence_;\n  WorkspaceSet workspaces_;\n\n  PopulateTestInputs::CreatorMap creators_;\n  TaskContext context_;\n  WorkspaceRegistry registry_;\n  std::unique_ptr<SentenceExtractor> extractor_;\n};\n\n// Test fixture for simple common features that operate on just a sentence.\nclass CommonSentenceFeaturesTest : public SentenceFeaturesTest {\n protected:\n  CommonSentenceFeaturesTest()\n      : SentenceFeaturesTest(\n            \"text: 'I saw a man with a telescope.' \"\n            \"token { word: 'I' start: 0 end: 0 tag: 'PRP' category: 'PRON'\"\n            \"  head: 1 label: 'nsubj' break_level: NO_BREAK } \"\n            \"token { word: 'saw' start: 2 end: 4 tag: 'VBD' category: 'VERB'\"\n            \"  label: 'ROOT' break_level: SPACE_BREAK } \"\n            \"token { word: 'a' start: 6 end: 6 tag: 'DT' category: 'DET'\"\n            \"  head: 3 label: 'det' break_level: SPACE_BREAK } \"\n            \"token { word: 'man' start: 8 end: 10 tag: 'NN' category: 'NOUN'\"\n            \"  head: 1 label: 'dobj' break_level: SPACE_BREAK\"\n            \"  [syntaxnet.TokenMorphology.morphology] { \"\n            \"    attribute { name:'morph' value:'Sg' } \"\n            \"    attribute { name:'morph' value:'Masc' } \"\n            \"  } \"\n            \"} \"\n            \"token { word: 'with' start: 12 end: 15 tag: 'IN' category: 'ADP'\"\n            \" head: 1 label: 'prep' break_level: SPACE_BREAK } \"\n            \"token { word: 'a' start: 17 end: 17 tag: 'DT' category: 'DET'\"\n            \" head: 6 label: 'det' break_level: SPACE_BREAK } \"\n            \"token { word: 'telescope' start: 19 end: 27 tag: 'NN' category: \"\n            \"'NOUN'\"\n            
\" head: 4 label: 'pobj'  break_level: SPACE_BREAK } \"\n            \"token { word: '.' start: 28 end: 28 tag: '.' category: '.'\"\n            \" head: 1 label: 'p' break_level: NO_BREAK }\") {}\n};\n\nTEST_F(CommonSentenceFeaturesTest, TagFeature) {\n  PrepareFeature(\"tag\");\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(-1));\n  EXPECT_EQ(\"PRP\", ExtractFeature(0));\n  EXPECT_EQ(\"VBD\", ExtractFeature(1));\n  EXPECT_EQ(\"DT\", ExtractFeature(2));\n  EXPECT_EQ(\"NN\", ExtractFeature(3));\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(8));\n}\n\nTEST_F(CommonSentenceFeaturesTest, TagFeaturePassesArgs) {\n  PrepareFeature(\"tag(min-freq=5)\");  // don't load any tags\n  EXPECT_EQ(ExtractFeature(-1), \"<OUTSIDE>\");\n  EXPECT_EQ(ExtractFeature(0), \"<UNKNOWN>\");\n  EXPECT_EQ(ExtractFeature(8), \"<OUTSIDE>\");\n\n  // Only 2 features: <UNKNOWN> and <OUTSIDE>.\n  EXPECT_EQ(2, extractor_->feature_type(0)->GetDomainSize());\n}\n\nTEST_F(CommonSentenceFeaturesTest, OffsetPlusTag) {\n  PrepareFeature(\"offset(-1).tag(min-freq=2)\");\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(-1));\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(0));\n  EXPECT_EQ(\"<UNKNOWN>\", ExtractFeature(1));\n  EXPECT_EQ(\"<UNKNOWN>\", ExtractFeature(2));\n  EXPECT_EQ(\"DT\", ExtractFeature(3));  // DT, NN are the only freq tags\n  EXPECT_EQ(\"NN\", ExtractFeature(4));\n  EXPECT_EQ(\"<UNKNOWN>\", ExtractFeature(5));\n  EXPECT_EQ(\"DT\", ExtractFeature(6));\n  EXPECT_EQ(\"NN\", ExtractFeature(7));\n  EXPECT_EQ(\"<UNKNOWN>\", ExtractFeature(8));\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(9));\n}\n\nTEST_F(CommonSentenceFeaturesTest, CharNgramFeature) {\n  TermFrequencyMap char_ngram_map;\n  char_ngram_map.Increment(\"a\");\n  char_ngram_map.Increment(\"aw\");\n  char_ngram_map.Increment(\"sa\");\n  creators_.Add(\n      \"char-ngram-map\", \"text\", \"\",\n      [&char_ngram_map](const string &path) { char_ngram_map.Save(path); });\n\n  // Test that CharNgram works as expected.\n  
PrepareFeature(\"char-ngram\");\n  EXPECT_EQ(\"\", utils::Join(ExtractMultiFeature(-1), \",\"));\n  EXPECT_EQ(\"\", utils::Join(ExtractMultiFeature(0), \",\"));\n  EXPECT_EQ(\"sa,a,aw\", utils::Join(ExtractMultiFeature(1), \",\"));\n  EXPECT_EQ(\"a\", utils::Join(ExtractMultiFeature(2), \",\"));\n  EXPECT_EQ(\"a\", utils::Join(ExtractMultiFeature(3), \",\"));\n  EXPECT_EQ(\"\", utils::Join(ExtractMultiFeature(8), \",\"));\n}\n\nTEST_F(CommonSentenceFeaturesTest, MorphologySetFeature) {\n  TermFrequencyMap morphology_map;\n  morphology_map.Increment(\"morph=Sg\");\n  morphology_map.Increment(\"morph=Sg\");\n  morphology_map.Increment(\"morph=Masc\");\n  morphology_map.Increment(\"morph=Masc\");\n  morphology_map.Increment(\"morph=Pl\");\n  creators_.Add(\n      \"morphology-map\", \"text\", \"\",\n      [&morphology_map](const string &path) { morphology_map.Save(path); });\n\n  // Test that MorphologySet works as expected.\n  PrepareFeature(\"morphology-set\");\n  EXPECT_EQ(\"\", utils::Join(ExtractMultiFeature(-1), \",\"));\n  EXPECT_EQ(\"\", utils::Join(ExtractMultiFeature(0), \",\"));\n  EXPECT_EQ(\"morph=Sg,morph=Masc\", utils::Join(ExtractMultiFeature(3), \",\"));\n}\n\nTEST_F(CommonSentenceFeaturesTest, CapitalizationProcessesCorrectly) {\n  Capitalization feature;\n  feature.RequestWorkspaces(&registry_);\n  workspaces_.Reset(registry_);\n  feature.Preprocess(&workspaces_, &sentence_);\n\n  // Check the workspace contains what we expect.\n  EXPECT_TRUE(workspaces_.Has<VectorIntWorkspace>(feature.Workspace()));\n  const VectorIntWorkspace &workspace =\n      workspaces_.Get<VectorIntWorkspace>(feature.Workspace());\n  constexpr int UPPERCASE = Capitalization::UPPERCASE;\n  constexpr int LOWERCASE = Capitalization::LOWERCASE;\n  constexpr int NON_ALPHABETIC = Capitalization::NON_ALPHABETIC;\n  CheckVectorWorkspace(workspace,\n                       {UPPERCASE, LOWERCASE, LOWERCASE, LOWERCASE, LOWERCASE,\n                        LOWERCASE, LOWERCASE, 
NON_ALPHABETIC});\n}\n\nclass CharFeatureTest : public SentenceFeaturesTest {\n protected:\n  CharFeatureTest()\n      : SentenceFeaturesTest(\n          \"text: '一 个 测 试 员  ' \"\n          \"token { word: '一' start: 0 end: 2 } \"\n          \"token { word: '个' start: 3 end: 5 } \"\n          \"token { word: '测' start: 6 end: 8 } \"\n          \"token { word: '试' start: 9 end: 11 } \"\n          \"token { word: '员' start: 12 end: 14 } \"\n          \"token { word: ' ' start: 15 end: 15 } \"\n          \"token { word: '\\t' start: 16 end: 16 } \") {}\n};\n\nTEST_F(CharFeatureTest, CharFeature) {\n  TermFrequencyMap char_map;\n  char_map.Increment(\"一\");\n  char_map.Increment(\"个\");\n  char_map.Increment(\"试\");\n  char_map.Increment(\"员\");\n  creators_.Add(\n      \"char-map\", \"text\", \"\",\n      [&char_map](const string &path) { char_map.Save(path); });\n\n  // Test that Char works as expected.\n  PrepareFeature(\"char\");\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(-1));\n  EXPECT_EQ(\"一\", ExtractFeature(0));\n  EXPECT_EQ(\"个\", ExtractFeature(1));\n  EXPECT_EQ(\"<UNKNOWN>\", ExtractFeature(2));  // \"测\" is not in the char map.\n  EXPECT_EQ(\"试\", ExtractFeature(3));\n  EXPECT_EQ(\"员\", ExtractFeature(4));\n  EXPECT_EQ(\"<BREAK_CHAR>\", ExtractFeature(5));\n  EXPECT_EQ(\"<BREAK_CHAR>\", ExtractFeature(6));\n  EXPECT_EQ(\"<OUTSIDE>\", ExtractFeature(7));\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/shared_store.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/shared_store.h\"\n\n#include <unordered_map>\n\n#include \"tensorflow/core/lib/strings/stringprintf.h\"\n\nnamespace syntaxnet {\n\nSharedStore::SharedObjectMap *SharedStore::shared_object_map_ =\n    new SharedObjectMap;\n\nmutex SharedStore::shared_object_map_mutex_(tensorflow::LINKER_INITIALIZED);\n\nSharedStore::SharedObjectMap *SharedStore::shared_object_map() {\n  return shared_object_map_;\n}\n\nbool SharedStore::Release(const void *object) {\n  if (object == nullptr) {\n    return true;\n  }\n  mutex_lock l(shared_object_map_mutex_);\n  for (SharedObjectMap::iterator it = shared_object_map()->begin();\n       it != shared_object_map()->end(); ++it) {\n    if (it->second.object == object) {\n      // Check the invariant that reference counts are positive. 
A violation\n      // likely implies memory corruption.\n      CHECK_GE(it->second.refcount, 1);\n      it->second.refcount--;\n      if (it->second.refcount == 0) {\n        it->second.delete_callback();\n        shared_object_map()->erase(it);\n      }\n      return true;\n    }\n  }\n  return false;\n}\n\nvoid SharedStore::Clear() {\n  mutex_lock l(shared_object_map_mutex_);\n  for (SharedObjectMap::iterator it = shared_object_map()->begin();\n       it != shared_object_map()->end(); ++it) {\n    it->second.delete_callback();\n  }\n  shared_object_map()->clear();\n}\n\nstring SharedStoreUtils::CreateDefaultName() { return string(); }\n\nstring SharedStoreUtils::ToString(const string &input) {\n  return ToString(tensorflow::StringPiece(input));\n}\n\nstring SharedStoreUtils::ToString(const char *input) {\n  return ToString(tensorflow::StringPiece(input));\n}\n\nstring SharedStoreUtils::ToString(tensorflow::StringPiece input) {\n  return tensorflow::strings::StrCat(\"\\\"\", utils::CEscape(input.ToString()),\n                                     \"\\\"\");\n}\n\nstring SharedStoreUtils::ToString(bool input) {\n  return input ? \"true\" : \"false\";\n}\n\nstring SharedStoreUtils::ToString(float input) {\n  return tensorflow::strings::Printf(\"%af\", input);\n}\n\nstring SharedStoreUtils::ToString(double input) {\n  return tensorflow::strings::Printf(\"%a\", input);\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/shared_store.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Utility for creating read-only objects once and sharing them across threads.\n\n#ifndef SYNTAXNET_SHARED_STORE_H_\n#define SYNTAXNET_SHARED_STORE_H_\n\n#include <functional>\n#include <string>\n#include <typeindex>\n#include <unordered_map>\n#include <utility>\n\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/core/stringpiece.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\nclass SharedStore {\n public:\n  // Returns an existing object with type T and name 'name' if it exists, else\n  // creates one with \"new T(args...)\".  Note: Objects will be indexed under\n  // their typeid + name, so names only have to be unique within a given type.\n  template <typename T, typename ...Args>\n  static const T *Get(const string &name,\n                      Args &&...args);  // NOLINT(build/c++11)\n\n  // Like Get(), but creates the object with \"closure->Run()\". If the closure\n  // returns null, we store a null in the SharedStore, but note that Release()\n  // cannot be used to remove it. This is because Release() finds the object\n  // by associative lookup, and there may be more than one null value, so we\n  // don't know which one to release. 
If the closure returns a duplicate value\n  // (one that is pointer-equal to an object already in the SharedStore),\n  // we disregard it and store null instead -- otherwise associative lookup\n  // would again fail (and the reference counts would be wrong).\n  template <typename T>\n  static const T *ClosureGet(const string &name, std::function<T *()> *closure);\n\n  // Like ClosureGet(), but check-fails if ClosureGet() would return null.\n  template <typename T>\n  static const T *ClosureGetOrDie(const string &name,\n                                  std::function<T *()> *closure);\n\n  // Release an object that was acquired by Get(). When its reference count\n  // hits 0, the object will be deleted. Returns true if the object was found.\n  // Does nothing and returns true if the object is null.\n  static bool Release(const void *object);\n\n  // Delete all objects in the shared store.\n  static void Clear();\n\n private:\n  // A shared object.\n  struct SharedObject {\n    void *object;\n    std::function<void()> delete_callback;\n    int refcount;\n\n    SharedObject(void *o, std::function<void()> d)\n        : object(o), delete_callback(std::move(d)), refcount(1) {}\n  };\n\n  // A map from keys to shared objects.\n  typedef std::unordered_map<string, SharedObject> SharedObjectMap;\n\n  // Return the shared object map.\n  static SharedObjectMap *shared_object_map();\n\n  // Return the string to use for indexing an object in the shared store.\n  template <typename T>\n  static string GetSharedKey(const string &name);\n\n  // Delete an object of type T.\n  template <typename T>\n  static void DeleteObject(T *object);\n\n  // Add an object to the shared object map. Return the object.\n  template <typename T>\n  static T *StoreObject(const string &key, T *object);\n\n  // Increment the reference count of an object in the map. 
Return the object.\n  template <typename T>\n  static T *IncrementRefCountOfObject(SharedObjectMap::iterator it);\n\n  // Map from keys to shared objects.\n  static SharedObjectMap *shared_object_map_;\n  static mutex shared_object_map_mutex_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(SharedStore);\n};\n\ntemplate <typename T>\nstring SharedStore::GetSharedKey(const string &name) {\n  const std::type_index id = std::type_index(typeid(T));\n  return tensorflow::strings::StrCat(id.name(), \"_\", name);\n}\n\ntemplate <typename T>\nvoid SharedStore::DeleteObject(T *object) {\n  delete object;\n}\n\ntemplate <typename T>\nT *SharedStore::StoreObject(const string &key, T *object) {\n  std::function<void()> delete_cb =\n      std::bind(SharedStore::DeleteObject<T>, object);\n  SharedObject so(object, delete_cb);\n  shared_object_map()->insert(std::make_pair(key, so));\n  return object;\n}\n\ntemplate <typename T>\nT *SharedStore::IncrementRefCountOfObject(SharedObjectMap::iterator it) {\n  it->second.refcount++;\n  return static_cast<T *>(it->second.object);\n}\n\ntemplate <typename T, typename ...Args>\nconst T *SharedStore::Get(const string &name,\n                          Args &&...args) {  // NOLINT(build/c++11)\n  mutex_lock l(shared_object_map_mutex_);\n  const string key = GetSharedKey<T>(name);\n  SharedObjectMap::iterator it = shared_object_map()->find(key);\n  return (it == shared_object_map()->end()) ?\n      StoreObject<T>(key, new T(std::forward<Args>(args)...)) :\n      IncrementRefCountOfObject<T>(it);\n}\n\ntemplate <typename T>\nconst T *SharedStore::ClosureGet(const string &name,\n                                 std::function<T *()> *closure) {\n  mutex_lock l(shared_object_map_mutex_);\n  const string key = GetSharedKey<T>(name);\n  SharedObjectMap::iterator it = shared_object_map()->find(key);\n  if (it == shared_object_map()->end()) {\n    // Creates a new object by calling the closure.\n    T *object = (*closure)();\n    if (object == nullptr) {\n      
LOG(ERROR) << \"Closure returned a null pointer\";\n    } else {\n      for (SharedObjectMap::iterator it = shared_object_map()->begin();\n           it != shared_object_map()->end(); ++it) {\n        if (it->second.object == object) {\n          LOG(ERROR)\n              << \"Closure returned duplicate pointer: \"\n              << \"keys \" << it->first << \" and \" << key;\n\n          // Not a memory leak to discard pointer, since we have another copy.\n          object = nullptr;\n          break;\n        }\n      }\n    }\n    return StoreObject<T>(key, object);\n  } else {\n    return IncrementRefCountOfObject<T>(it);\n  }\n}\n\ntemplate <typename T>\nconst T *SharedStore::ClosureGetOrDie(const string &name,\n                                      std::function<T *()> *closure) {\n  const T *object = ClosureGet<T>(name, closure);\n  CHECK(object != nullptr);\n  return object;\n}\n\n// A collection of utility functions for working with the shared store.\nclass SharedStoreUtils {\n public:\n  // Returns a shared object registered using a default name that is created\n  // from the constructor args.\n  //\n  // NB: This function does not guarantee a one-to-one relationship between\n  // sets of constructor args and names.  See warnings on CreateDefaultName().\n  // It is the caller's responsibility to ensure that the args provided will\n  // result in unique names.\n  template <class T, class... Args>\n  static const T *GetWithDefaultName(Args &&... args) {  // NOLINT(build/c++11)\n    return SharedStore::Get<T>(CreateDefaultName(std::forward<Args>(args)...),\n                               std::forward<Args>(args)...);\n  }\n\n  // Returns a string name representing the args.  Implemented via a pair of\n  // overloaded functions to achieve compile-time recursion.\n  //\n  // WARNING: It is possible for instances of different types to have the same\n  // string representation.  
For example,\n  //\n  // CreateDefaultName(1) == CreateDefaultName(1ULL)\n  //\n  template <class First, class... Rest>\n  static string CreateDefaultName(First &&first,\n                                  Rest &&... rest) {  // NOLINT(build/c++11)\n    return tensorflow::strings::StrCat(\n        ToString<First>(std::forward<First>(first)), \",\",\n        CreateDefaultName(std::forward<Rest>(rest)...));\n  }\n  static string CreateDefaultName();\n\n private:\n  // Returns a string representing the input.  The generic implementation uses\n  // StrCat(), and overloads are provided for selected types.\n  template <class T>\n  static string ToString(T input) {\n    return tensorflow::strings::StrCat(input);\n  }\n  static string ToString(const string &input);\n  static string ToString(const char *input);\n  static string ToString(tensorflow::StringPiece input);\n  static string ToString(bool input);\n  static string ToString(float input);\n  static string ToString(double input);\n\n  TF_DISALLOW_COPY_AND_ASSIGN(SharedStoreUtils);\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_SHARED_STORE_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/shared_store_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/shared_store.h\"\n\n#include <string>\n\n#include \"syntaxnet/utils.h\"\n#include <gmock/gmock.h>\n#include \"tensorflow/core/lib/core/threadpool.h\"\n\nusing ::testing::_;\n\nnamespace syntaxnet {\n\nstruct NoArgs {\n  NoArgs() {\n    LOG(INFO) << \"Calling NoArgs()\";\n  }\n};\n\nstruct OneArg {\n  string name;\n  explicit OneArg(const string &n) : name(n) {\n    LOG(INFO) << \"Calling OneArg(\" << name << \")\";\n  }\n};\n\nstruct TwoArgs {\n  string name;\n  int age;\n  TwoArgs(const string &n, int a) : name(n), age(a) {\n    LOG(INFO) << \"Calling TwoArgs(\" << name << \", \" << age << \")\";\n  }\n};\n\nstruct Slow {\n  string lengthy;\n  Slow() {\n    LOG(INFO) << \"Calling Slow()\";\n    lengthy.assign(50 << 20, 'L');  // 50MB of the letter 'L'\n  }\n};\n\nstruct CountCalls {\n  CountCalls() {\n    LOG(INFO) << \"Calling CountCalls()\";\n    ++constructor_calls;\n  }\n\n  ~CountCalls() {\n    LOG(INFO) << \"Calling ~CountCalls()\";\n    ++destructor_calls;\n  }\n\n  static void Reset() {\n    constructor_calls = 0;\n    destructor_calls = 0;\n  }\n\n  static int constructor_calls;\n  static int destructor_calls;\n};\n\nint CountCalls::constructor_calls = 0;\nint CountCalls::destructor_calls = 0;\n\nclass PointerSet {\n public:\n  PointerSet() { }\n\n  void 
Add(const void *p) {\n    mutex_lock l(mu_);\n    pointers_.insert(p);\n  }\n\n  int size() {\n    mutex_lock l(mu_);\n    return pointers_.size();\n  }\n\n private:\n  mutex mu_;\n  unordered_set<const void *> pointers_;\n};\n\nclass SharedStoreTest : public testing::Test {\n protected:\n  ~SharedStoreTest() override {\n    // Clear the shared store after each test, otherwise objects created\n    // in one test may interfere with other tests.\n    SharedStore::Clear();\n  }\n};\n\n// Verify that we can call constructors with varying numbers and types of args.\nTEST_F(SharedStoreTest, ConstructorArgs) {\n  SharedStore::Get<NoArgs>(\"no args\");\n  SharedStore::Get<OneArg>(\"one arg\", \"Fred\");\n  SharedStore::Get<TwoArgs>(\"two args\", \"Pebbles\", 2);\n}\n\n// Verify that an object with a given key is created only once.\nTEST_F(SharedStoreTest, Shared) {\n  const NoArgs *ob1 = SharedStore::Get<NoArgs>(\"first\");\n  const NoArgs *ob2 = SharedStore::Get<NoArgs>(\"second\");\n  const NoArgs *ob3 = SharedStore::Get<NoArgs>(\"first\");\n  EXPECT_EQ(ob1, ob3);\n  EXPECT_NE(ob1, ob2);\n  EXPECT_NE(ob2, ob3);\n}\n\n// Verify that objects with the same name but different types do not collide.\nTEST_F(SharedStoreTest, DifferentTypes) {\n  const NoArgs *ob1 = SharedStore::Get<NoArgs>(\"same\");\n  const OneArg *ob2 = SharedStore::Get<OneArg>(\"same\", \"foo\");\n  const TwoArgs *ob3 = SharedStore::Get<TwoArgs>(\"same\", \"bar\", 5);\n  EXPECT_NE(static_cast<const void *>(ob1), static_cast<const void *>(ob2));\n  EXPECT_NE(static_cast<const void *>(ob1), static_cast<const void *>(ob3));\n  EXPECT_NE(static_cast<const void *>(ob2), static_cast<const void *>(ob3));\n}\n\n// Factory method to make a OneArg.\nOneArg *MakeOneArg(const string &n) {\n  return new OneArg(n);\n}\n\nTEST_F(SharedStoreTest, ClosureGet) {\n  std::function<OneArg *()> closure1 = std::bind(MakeOneArg, \"Al\");\n  std::function<OneArg *()> closure2 = std::bind(MakeOneArg, \"Al\");\n  const OneArg *ob1 = 
SharedStore::ClosureGet(\"first\", &closure1);\n  const OneArg *ob2 = SharedStore::ClosureGet(\"first\", &closure2);\n  EXPECT_EQ(\"Al\", ob1->name);\n  EXPECT_EQ(ob1, ob2);\n}\n\nTEST_F(SharedStoreTest, PermanentCallback) {\n  std::function<OneArg *()> closure = std::bind(MakeOneArg, \"Al\");\n  const OneArg *ob1 = SharedStore::ClosureGet(\"first\", &closure);\n  const OneArg *ob2 = SharedStore::ClosureGet(\"first\", &closure);\n  EXPECT_EQ(\"Al\", ob1->name);\n  EXPECT_EQ(ob1, ob2);\n}\n\n// Factory method to \"make\" a NoArgs by simply returning an input pointer.\nNoArgs *BogusMakeNoArgs(NoArgs *ob) {\n  return ob;\n}\n\n// Create a CountCalls object, pretend it failed, and return null.\nCountCalls *MakeFailedCountCalls() {\n  CountCalls *ob = new CountCalls;\n  delete ob;\n  return nullptr;\n}\n\n// Verify that ClosureGet() only calls the closure for a given key once,\n// even if the closure fails.\nTEST_F(SharedStoreTest, FailedClosureGet) {\n  CountCalls::Reset();\n  std::function<CountCalls *()> closure1(MakeFailedCountCalls);\n  std::function<CountCalls *()> closure2(MakeFailedCountCalls);\n  const CountCalls *ob1 = SharedStore::ClosureGet(\"first\", &closure1);\n  const CountCalls *ob2 = SharedStore::ClosureGet(\"first\", &closure2);\n  EXPECT_EQ(nullptr, ob1);\n  EXPECT_EQ(nullptr, ob2);\n  EXPECT_EQ(1, CountCalls::constructor_calls);\n}\n\ntypedef SharedStoreTest SharedStoreDeathTest;\n\nTEST_F(SharedStoreDeathTest, ClosureGetOrDie) {\n  NoArgs *empty = nullptr;\n  std::function<NoArgs *()> closure = std::bind(BogusMakeNoArgs, empty);\n  EXPECT_DEATH(SharedStore::ClosureGetOrDie(\"first\", &closure), \"nullptr\");\n}\n\nTEST_F(SharedStoreTest, Release) {\n  const OneArg *ob1 = SharedStore::Get<OneArg>(\"first\", \"Fred\");\n  const OneArg *ob2 = SharedStore::Get<OneArg>(\"first\", \"Fred\");\n  EXPECT_EQ(ob1, ob2);\n  EXPECT_TRUE(SharedStore::Release(ob1));      // now refcount = 1\n  EXPECT_TRUE(SharedStore::Release(ob1));      // now object is 
deleted\n  EXPECT_FALSE(SharedStore::Release(ob1));     // now object is not found\n  EXPECT_TRUE(SharedStore::Release(nullptr));  // release(nullptr) returns true\n}\n\nTEST_F(SharedStoreTest, Clear) {\n  CountCalls::Reset();\n\n  SharedStore::Get<CountCalls>(\"first\");\n  SharedStore::Get<CountCalls>(\"second\");\n  SharedStore::Get<CountCalls>(\"first\");\n\n  // Test that the constructor and destructor are each called exactly once\n  // for each key in the shared store.\n  SharedStore::Clear();\n  EXPECT_EQ(2, CountCalls::constructor_calls);\n  EXPECT_EQ(2, CountCalls::destructor_calls);\n}\n\nvoid GetSharedObject(PointerSet *ps) {\n  // Gets a shared object whose constructor takes a long time.\n  const Slow *ob = SharedStore::Get<Slow>(\"first\");\n\n  // Collects the pointer we got. Later, we'll check whether SharedStore\n  // mistakenly called the constructor more than once.\n  ps->Add(static_cast<const void *>(ob));\n}\n\n// If multiple parallel threads all access an object with the same key,\n// only one object is created.\nTEST_F(SharedStoreTest, ThreadSafety) {\n  const int kNumThreads = 20;\n  tensorflow::thread::ThreadPool *pool = new tensorflow::thread::ThreadPool(\n      tensorflow::Env::Default(), \"ThreadSafetyPool\", kNumThreads);\n  PointerSet ps;\n  for (int i = 0; i < kNumThreads; ++i) {\n    std::function<void()> closure = std::bind(GetSharedObject, &ps);\n    pool->Schedule(closure);\n  }\n\n  // Waits for closures to finish, then delete the pool.\n  delete pool;\n\n  // Expects only one object to have been created across all threads.\n  EXPECT_EQ(1, ps.size());\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/sparse.proto",
    "content": "// Protocol for passing around sparse sets of features.\n\nsyntax = \"proto2\";\n\npackage syntaxnet;\n\n// A sparse set of features.\n//\n// If using SparseStringToIdTransformer, description is required and id should\n// be omitted; otherwise, id is required and description optional.\n//\n// id, weight, and description fields are all aligned if present (ie, any of\n// these that are non-empty should have the same # items). If weight is omitted,\n// 1.0 is used.\nmessage SparseFeatures {\n  repeated uint64 id = 1;\n  repeated float weight = 2;\n  repeated string description = 3;\n};\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/structured_graph_builder.py",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Build structured parser models.\"\"\"\n\nimport tensorflow as tf\n\nfrom tensorflow.python.ops import control_flow_ops as cf\nfrom tensorflow.python.ops import state_ops\nfrom tensorflow.python.ops import tensor_array_ops\n\nfrom syntaxnet import graph_builder\nfrom syntaxnet.ops import gen_parser_ops\n\ntf.NotDifferentiable('BeamParseReader')\ntf.NotDifferentiable('BeamParser')\ntf.NotDifferentiable('BeamParserOutput')\n\n\ndef AddCrossEntropy(batch_size, n):\n  \"\"\"Adds a cross entropy cost function.\"\"\"\n  cross_entropies = []\n  def _Pass():\n    return tf.constant(0, dtype=tf.float32, shape=[1])\n\n  for beam_id in range(batch_size):\n    beam_gold_slot = tf.reshape(tf.slice(n['gold_slot'], [beam_id], [1]), [1])\n    def _ComputeCrossEntropy():\n      \"\"\"Adds ops to compute cross entropy of the gold path in a beam.\"\"\"\n      # Requires a cast so that UnsortedSegmentSum, in the gradient,\n      # is happy with the type of its input 'segment_ids', which\n      # must be int32.\n      idx = tf.cast(\n          tf.reshape(\n              tf.where(tf.equal(n['beam_ids'], beam_id)), [-1]), tf.int32)\n      beam_scores = tf.reshape(tf.gather(n['all_path_scores'], idx), [1, -1])\n      num = tf.shape(idx)\n      return 
tf.nn.softmax_cross_entropy_with_logits(\n          beam_scores, tf.expand_dims(\n              tf.sparse_to_dense(beam_gold_slot, num, [1.], 0.), 0))\n    # The conditional here is needed to deal with the last few batches of the\n    # corpus which can contain -1 in beam_gold_slot for empty batch slots.\n    cross_entropies.append(cf.cond(\n        beam_gold_slot[0] >= 0, _ComputeCrossEntropy, _Pass))\n  return {'cross_entropy': tf.div(tf.add_n(cross_entropies), batch_size)}\n\n\nclass StructuredGraphBuilder(graph_builder.GreedyParser):\n  \"\"\"Extends the standard GreedyParser with a CRF objective using a beam.\n\n  The constructor takes two additional keyword arguments.\n  beam_size: the maximum size the beam can grow to.\n  max_steps: the maximum number of steps in any particular beam.\n\n  The model supports batch training with the batch_size argument to the\n  AddTraining method.\n  \"\"\"\n\n  def __init__(self, *args, **kwargs):\n    self._beam_size = kwargs.pop('beam_size', 10)\n    self._max_steps = kwargs.pop('max_steps', 25)\n    super(StructuredGraphBuilder, self).__init__(*args, **kwargs)\n\n  def _AddBeamReader(self,\n                     task_context,\n                     batch_size,\n                     corpus_name,\n                     until_all_final=False,\n                     always_start_new_sentences=False):\n    \"\"\"Adds an op capable of reading sentences and parsing them with a beam.\"\"\"\n    features, state, epochs = gen_parser_ops.beam_parse_reader(\n        task_context=task_context,\n        feature_size=self._feature_size,\n        beam_size=self._beam_size,\n        batch_size=batch_size,\n        corpus_name=corpus_name,\n        allow_feature_weights=self._allow_feature_weights,\n        arg_prefix=self._arg_prefix,\n        continue_until_all_final=until_all_final,\n        always_start_new_sentences=always_start_new_sentences)\n    return {'state': state, 'features': features, 'epochs': epochs}\n\n  def 
_BuildSequence(self,\n                     batch_size,\n                     max_steps,\n                     features,\n                     state,\n                     use_average=False):\n    \"\"\"Adds a sequence of beam parsing steps.\"\"\"\n    def Advance(state, step, scores_array, alive, alive_steps, *features):\n      scores = self._BuildNetwork(features,\n                                  return_average=use_average)['logits']\n      scores_array = scores_array.write(step, scores)\n      features, state, alive = (\n          gen_parser_ops.beam_parser(state, scores, self._feature_size))\n      return [state, step + 1, scores_array, alive, alive_steps + tf.cast(\n          alive, tf.int32)] + list(features)\n\n    # args: (state, step, scores_array, alive, alive_steps, *features)\n    def KeepGoing(*args):\n      return tf.logical_and(args[1] < max_steps, tf.reduce_any(args[3]))\n\n    step = tf.constant(0, tf.int32, [])\n    scores_array = tensor_array_ops.TensorArray(dtype=tf.float32,\n                                                size=0,\n                                                dynamic_size=True)\n    alive = tf.constant(True, tf.bool, [batch_size])\n    alive_steps = tf.constant(0, tf.int32, [batch_size])\n    t = tf.while_loop(\n        KeepGoing,\n        Advance,\n        [state, step, scores_array, alive, alive_steps] + list(features),\n        shape_invariants=[tf.TensorShape(None)] * (len(features) + 5),\n        parallel_iterations=100)\n\n    # Link to the final nodes/values of ops that have passed through While:\n    return {'state': t[0],\n            'concat_scores': t[2].concat(),\n            'alive': t[3],\n            'alive_steps': t[4]}\n\n  def AddTraining(self,\n                  task_context,\n                  batch_size,\n                  learning_rate=0.1,\n                  decay_steps=4000,\n                  momentum=None,\n                  corpus_name='documents'):\n    with tf.name_scope('training'):\n      n = 
self.training\n      n['accumulated_alive_steps'] = self._AddVariable(\n          [batch_size], tf.int32, 'accumulated_alive_steps',\n          tf.zeros_initializer)\n      n.update(self._AddBeamReader(task_context, batch_size, corpus_name))\n      # This adds a required 'step' node too:\n      learning_rate = tf.constant(learning_rate, dtype=tf.float32)\n      n['learning_rate'] = self._AddLearningRate(learning_rate, decay_steps)\n      # Call BuildNetwork *only* to set up the params outside of the main loop.\n      self._BuildNetwork(list(n['features']))\n\n      n.update(self._BuildSequence(batch_size, self._max_steps, n['features'],\n                                   n['state']))\n\n      flat_concat_scores = tf.reshape(n['concat_scores'], [-1])\n      (indices_and_paths, beams_and_slots, n['gold_slot'], n[\n          'beam_path_scores']) = gen_parser_ops.beam_parser_output(n[\n              'state'])\n      n['indices'] = tf.reshape(tf.gather(indices_and_paths, [0]), [-1])\n      n['path_ids'] = tf.reshape(tf.gather(indices_and_paths, [1]), [-1])\n      n['all_path_scores'] = tf.sparse_segment_sum(\n          flat_concat_scores, n['indices'], n['path_ids'])\n      n['beam_ids'] = tf.reshape(tf.gather(beams_and_slots, [0]), [-1])\n      n.update(AddCrossEntropy(batch_size, n))\n\n      if self._only_train:\n        trainable_params = {k: v for k, v in self.params.iteritems()\n                            if k in self._only_train}\n      else:\n        trainable_params = self.params\n      for p in trainable_params:\n        tf.logging.info('trainable_param: %s', p)\n\n      regularized_params = [\n          tf.nn.l2_loss(p) for k, p in trainable_params.iteritems()\n          if k.startswith('weights') or k.startswith('bias')]\n      l2_loss = 1e-4 * tf.add_n(regularized_params) if regularized_params else 0\n\n      n['cost'] = tf.add(n['cross_entropy'], l2_loss, name='cost')\n\n      n['gradients'] = tf.gradients(n['cost'], trainable_params.values())\n\n      
with tf.control_dependencies([n['alive_steps']]):\n        update_accumulators = tf.group(\n            tf.assign_add(n['accumulated_alive_steps'], n['alive_steps']))\n\n      def ResetAccumulators():\n        return tf.assign(\n            n['accumulated_alive_steps'], tf.zeros([batch_size], tf.int32))\n      n['reset_accumulators_func'] = ResetAccumulators\n\n      optimizer = tf.train.MomentumOptimizer(n['learning_rate'],\n                                             momentum,\n                                             use_locking=self._use_locking)\n      train_op = optimizer.minimize(n['cost'],\n                                    var_list=trainable_params.values())\n      for param in trainable_params.values():\n        slot = optimizer.get_slot(param, 'momentum')\n        self.inits[slot.name] = state_ops.init_variable(slot,\n                                                        tf.zeros_initializer)\n        self.variables[slot.name] = slot\n\n      def NumericalChecks():\n        return tf.group(*[\n            tf.check_numerics(param, message='Parameter is not finite.')\n            for param in trainable_params.values()\n            if param.dtype.base_dtype in [tf.float32, tf.float64]])\n      check_op = cf.cond(tf.equal(tf.mod(self.GetStep(), self._check_every), 0),\n                         NumericalChecks, tf.no_op)\n      avg_update_op = tf.group(*self._averaging.values())\n      train_ops = [train_op]\n      if self._check_parameters:\n        train_ops.append(check_op)\n      if self._use_averaging:\n        train_ops.append(avg_update_op)\n      with tf.control_dependencies([update_accumulators]):\n        n['train_op'] = tf.group(*train_ops, name='train_op')\n      n['alive_steps'] = tf.identity(n['alive_steps'], name='alive_steps')\n    return n\n\n  def AddEvaluation(self,\n                    task_context,\n                    batch_size,\n                    evaluation_max_steps=300,\n                    corpus_name=None):\n    with 
tf.name_scope('evaluation'):\n      n = self.evaluation\n      n.update(self._AddBeamReader(task_context,\n                                   batch_size,\n                                   corpus_name,\n                                   until_all_final=True,\n                                   always_start_new_sentences=True))\n      self._BuildNetwork(\n          list(n['features']),\n          return_average=self._use_averaging)\n      n.update(self._BuildSequence(batch_size, evaluation_max_steps, n[\n          'features'], n['state'], use_average=self._use_averaging))\n      n['eval_metrics'], n['documents'] = (\n          gen_parser_ops.beam_eval_output(n['state']))\n    return n\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/syntaxnet.bzl",
    "content": "# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\nload(\"@protobuf//:protobuf.bzl\", \"cc_proto_library\")\nload(\"@protobuf//:protobuf.bzl\", \"py_proto_library\")\n\ndef if_cuda(if_true, if_false = []):\n    \"\"\"Shorthand for select()'ing on whether we're building with CUDA.\"\"\"\n    return select({\n        \"@local_config_cuda//cuda:using_nvcc\": if_true,\n        \"@local_config_cuda//cuda:using_clang\": if_true,\n        \"//conditions:default\": if_false\n    })\n\ndef tf_copts():\n  return ([\"-fno-exceptions\", \"-DEIGEN_AVOID_STL_ARRAY\",] +\n          if_cuda([\"-DGOOGLE_CUDA=1\"]) +\n          select({\"@org_tensorflow//tensorflow:darwin\": [],\n                  \"//conditions:default\": [\"-pthread\"]}))\n\ndef tf_proto_library(name, srcs=[], has_services=False,\n                     deps=[], visibility=None, testonly=0,\n                     cc_api_version=2, go_api_version=2,\n                     java_api_version=2,\n                     py_api_version=2):\n  native.filegroup(name=name + \"_proto_srcs\",\n                   srcs=srcs,\n                   testonly=testonly,)\n\n  cc_proto_library(name=name,\n                   srcs=srcs,\n                   deps=deps,\n                   cc_libs = [\"@protobuf//:protobuf\"],\n                   protoc=\"@protobuf//:protoc\",\n                   
default_runtime=\"@protobuf//:protobuf\",\n                   testonly=testonly,\n                   visibility=visibility,)\n\ndef tf_proto_library_py(name, srcs=[], deps=[], visibility=None, testonly=0):\n  py_proto_library(name=name,\n                   srcs=srcs,\n                   srcs_version = \"PY2AND3\",\n                   deps=deps,\n                   default_runtime=\"@protobuf//:protobuf_python\",\n                   protoc=\"@protobuf//:protoc\",\n                   visibility=visibility,\n                   testonly=testonly,)\n\n# Given a list of \"op_lib_names\" (a list of files in the ops directory\n# without their .cc extensions), generate a library for that file.\ndef tf_gen_op_libs(op_lib_names):\n  # Make library out of each op so it can also be used to generate wrappers\n  # for various languages.\n  for n in op_lib_names:\n    native.cc_library(name=n + \"_op_lib\",\n                      copts=tf_copts(),\n                      srcs=[\"ops/\" + n + \".cc\"],\n                      deps=([\"@org_tensorflow//tensorflow/core:framework\"]),\n                      visibility=[\"//visibility:public\"],\n                      alwayslink=1,\n                      linkstatic=1,)\n\n# Invoke this rule in .../tensorflow/python to build the wrapper library.\ndef tf_gen_op_wrapper_py(name, out=None, hidden=[], visibility=None, deps=[],\n                         require_shape_functions=False):\n  # Construct a cc_binary containing the specified ops.\n  tool_name = \"gen_\" + name + \"_py_wrappers_cc\"\n  if not deps:\n    deps = [\"//tensorflow/core:\" + name + \"_op_lib\"]\n  native.cc_binary(\n      name = tool_name,\n      linkopts = [\"-lm\"],\n      copts = tf_copts(),\n      linkstatic = 1,   # Faster to link this one-time-use binary dynamically\n      deps = ([\"@org_tensorflow//tensorflow/core:framework\",\n               \"@org_tensorflow//tensorflow/python:python_op_gen_main\"] + deps),\n  )\n\n  # Invoke the previous cc_binary to generate a 
python file.\n  if not out:\n    out = \"ops/gen_\" + name + \".py\"\n\n  native.genrule(\n      name=name + \"_pygenrule\",\n      outs=[out],\n      tools=[tool_name],\n      cmd=(\"$(location \" + tool_name + \") \" + \",\".join(hidden)\n           + \" \" + (\"1\" if require_shape_functions else \"0\") + \" > $@\"))\n\n  # Make a py_library out of the generated python file.\n  native.py_library(name=name,\n                    srcs=[out],\n                    srcs_version=\"PY2AND3\",\n                    visibility=visibility,\n                    deps=[\n                        \"@org_tensorflow//tensorflow/python:framework_for_generated_wrappers\",\n                    ],)\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/tagger_transitions.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Tagger transition system.\n//\n// This transition system has one type of actions:\n//  - The SHIFT action pushes the next input token to the stack and\n//    advances to the next input token, assigning a part-of-speech tag to the\n//    token that was shifted.\n//\n// The transition system operates with parser actions encoded as integers:\n//  - A SHIFT action is encoded as number starting from 0.\n\n#include <string>\n\n#include \"syntaxnet/parser_features.h\"\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/sentence_features.h\"\n#include \"syntaxnet/shared_store.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\nclass TaggerTransitionState : public ParserTransitionState {\n public:\n  explicit TaggerTransitionState(const TermFrequencyMap *tag_map,\n                                 const TagToCategoryMap *tag_to_category)\n      : tag_map_(tag_map), tag_to_category_(tag_to_category) {}\n\n  explicit TaggerTransitionState(const TaggerTransitionState *state)\n      : TaggerTransitionState(state->tag_map_, state->tag_to_category_) {\n    tag_ = state->tag_;\n    gold_tag_ = 
state->gold_tag_;\n  }\n\n  // Clones the transition state by returning a new object.\n  ParserTransitionState *Clone() const override {\n    return new TaggerTransitionState(this);\n  }\n\n  // Reads gold tags for each token.\n  void Init(ParserState *state) override {\n    tag_.resize(state->sentence().token_size(), -1);\n    gold_tag_.resize(state->sentence().token_size(), -1);\n    for (int pos = 0; pos < state->sentence().token_size(); ++pos) {\n      int tag = tag_map_->LookupIndex(state->GetToken(pos).tag(), -1);\n      gold_tag_[pos] = tag;\n    }\n  }\n\n  // Returns the tag assigned to a given token.\n  int Tag(int index) const {\n    DCHECK_GE(index, 0);\n    DCHECK_LT(index, tag_.size());\n    return index == -1 ? -1 : tag_[index];\n  }\n\n  // Sets this tag on the token at index.\n  void SetTag(int index, int tag) {\n    DCHECK_GE(index, 0);\n    DCHECK_LT(index, tag_.size());\n    tag_[index] = tag;\n  }\n\n  // Returns the gold tag for a given token.\n  int GoldTag(int index) const {\n    DCHECK_GE(index, -1);\n    DCHECK_LT(index, gold_tag_.size());\n    return index == -1 ? 
-1 : gold_tag_[index];\n  }\n\n  // Returns the string representation of a POS tag, or an empty string\n  // if the tag is invalid.\n  string TagAsString(int tag) const {\n    if (tag >= 0 && tag < tag_map_->Size()) {\n      return tag_map_->GetTerm(tag);\n    }\n    return \"\";\n  }\n\n  // Adds transition state specific annotations to the document.\n  void AddParseToDocument(const ParserState &state, bool rewrite_root_labels,\n                          Sentence *sentence) const override {\n    for (size_t i = 0; i < tag_.size(); ++i) {\n      Token *token = sentence->mutable_token(i);\n      token->set_tag(TagAsString(Tag(i)));\n      if (tag_to_category_) {\n        token->set_category(tag_to_category_->GetCategory(token->tag()));\n      }\n    }\n  }\n\n  // Whether a parsed token should be considered correct for evaluation.\n  bool IsTokenCorrect(const ParserState &state, int index) const override {\n    return GoldTag(index) == Tag(index);\n  }\n\n  // Returns a human readable string representation of this state.\n  string ToString(const ParserState &state) const override {\n    string str;\n    for (int i = state.StackSize(); i > 0; --i) {\n      const string &word = state.GetToken(state.Stack(i - 1)).word();\n      if (i != state.StackSize() - 1) str.append(\" \");\n      tensorflow::strings::StrAppend(\n          &str, word, \"[\", TagAsString(Tag(state.StackSize() - i)), \"]\");\n    }\n    for (int i = state.Next(); i < state.NumTokens(); ++i) {\n      tensorflow::strings::StrAppend(&str, \" \", state.GetToken(i).word());\n    }\n    return str;\n  }\n\n private:\n  // Currently assigned POS tags for each token in this sentence.\n  vector<int> tag_;\n\n  // Gold POS tags from the input document.\n  vector<int> gold_tag_;\n\n  // Tag map used for conversions between integer and string representations\n  // part of speech tags. Not owned.\n  const TermFrequencyMap *tag_map_ = nullptr;\n\n  // Tag to category map. 
Not owned.\n  const TagToCategoryMap *tag_to_category_ = nullptr;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(TaggerTransitionState);\n};\n\nclass TaggerTransitionSystem : public ParserTransitionSystem {\n public:\n  ~TaggerTransitionSystem() override { SharedStore::Release(tag_map_); }\n\n  // Determines tag map location.\n  void Setup(TaskContext *context) override {\n    input_tag_map_ = context->GetInput(\"tag-map\", \"text\", \"\");\n    join_category_to_pos_ = context->GetBoolParameter(\"join_category_to_pos\");\n    input_tag_to_category_ = context->GetInput(\"tag-to-category\", \"text\", \"\");\n  }\n\n  // Reads tag map and tag to category map.\n  void Init(TaskContext *context) override {\n    const string tag_map_path = TaskContext::InputFile(*input_tag_map_);\n    tag_map_ = SharedStoreUtils::GetWithDefaultName<TermFrequencyMap>(\n        tag_map_path, 0, 0);\n    if (!join_category_to_pos_) {\n      const string tag_to_category_path =\n          TaskContext::InputFile(*input_tag_to_category_);\n      tag_to_category_ = SharedStoreUtils::GetWithDefaultName<TagToCategoryMap>(\n          tag_to_category_path);\n    }\n  }\n\n  // The SHIFT action uses the same value as the corresponding action type.\n  static ParserAction ShiftAction(int tag) { return tag; }\n\n  // The tagger transition system doesn't look at the dependency tree, so it\n  // allows non-projective trees.\n  bool AllowsNonProjective() const override { return true; }\n\n  // Returns the number of action types.\n  int NumActionTypes() const override { return 1; }\n\n  // Returns the number of possible actions.\n  int NumActions(int num_labels) const override { return tag_map_->Size(); }\n\n  // The default action for a given state is assigning the most frequent tag.\n  ParserAction GetDefaultAction(const ParserState &state) const override {\n    return ShiftAction(0);\n  }\n\n  // Returns the next gold action for a given state according to the\n  // underlying annotated sentence.\n  ParserAction 
GetNextGoldAction(const ParserState &state) const override {\n    if (!state.EndOfInput()) {\n      return ShiftAction(TransitionState(state).GoldTag(state.Next()));\n    }\n    return ShiftAction(0);\n  }\n\n  // Checks if the action is allowed in a given parser state.\n  bool IsAllowedAction(ParserAction action,\n                       const ParserState &state) const override {\n    return !state.EndOfInput();\n  }\n\n  // Makes a shift by pushing the next input token on the stack and moving to\n  // the next position.\n  void PerformActionWithoutHistory(ParserAction action,\n                                   ParserState *state) const override {\n    DCHECK(!state->EndOfInput());\n    if (!state->EndOfInput()) {\n      MutableTransitionState(state)->SetTag(state->Next(), action);\n      state->Push(state->Next());\n      state->Advance();\n    }\n  }\n\n  // We are in a final state when we reached the end of the input and the stack\n  // is empty.\n  bool IsFinalState(const ParserState &state) const override {\n    return state.EndOfInput();\n  }\n\n  // Returns a string representation of a parser action.\n  string ActionAsString(ParserAction action,\n                        const ParserState &state) const override {\n    return tensorflow::strings::StrCat(\"SHIFT(\", tag_map_->GetTerm(action),\n                                       \")\");\n  }\n\n  // No state is deterministic in this transition system.\n  bool IsDeterministicState(const ParserState &state) const override {\n    return false;\n  }\n\n  // Returns a new transition state to be used to enhance the parser state.\n  ParserTransitionState *NewTransitionState(bool training_mode) const override {\n    return new TaggerTransitionState(tag_map_, tag_to_category_);\n  }\n\n  // Downcasts the const ParserTransitionState in ParserState to a const\n  // TaggerTransitionState.\n  static const TaggerTransitionState &TransitionState(\n      const ParserState &state) {\n    return *static_cast<const 
TaggerTransitionState *>(\n        state.transition_state());\n  }\n\n  // Downcasts the ParserTransitionState in ParserState to an\n  // TaggerTransitionState.\n  static TaggerTransitionState *MutableTransitionState(ParserState *state) {\n    return static_cast<TaggerTransitionState *>(\n        state->mutable_transition_state());\n  }\n\n  // Input for the tag map. Not owned.\n  TaskInput *input_tag_map_ = nullptr;\n\n  // Tag map used for conversions between integer and string representations\n  // part of speech tags. Owned through SharedStore.\n  const TermFrequencyMap *tag_map_ = nullptr;\n\n  // Input for the tag to category map. Not owned.\n  TaskInput *input_tag_to_category_ = nullptr;\n\n  // Tag to category map. Owned through SharedStore.\n  const TagToCategoryMap *tag_to_category_ = nullptr;\n\n  bool join_category_to_pos_ = false;\n};\n\nREGISTER_TRANSITION_SYSTEM(\"tagger\", TaggerTransitionSystem);\n\n// Feature function for retrieving the tag assigned to a token by the tagger\n// transition system.\nclass PredictedTagFeatureFunction\n    : public BasicParserSentenceFeatureFunction<Tag> {\n public:\n  PredictedTagFeatureFunction() {}\n\n  // Gets the TaggerTransitionState from the parser state and reads the assigned\n  // tag at the focus index. Returns -1 if the focus is not within the sentence.\n  FeatureValue Compute(const WorkspaceSet &workspaces, const ParserState &state,\n                       int focus, const FeatureVector *result) const override {\n    if (focus < 0 || focus >= state.sentence().token_size()) return -1;\n    return static_cast<const TaggerTransitionState *>(state.transition_state())\n        ->Tag(focus);\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(PredictedTagFeatureFunction);\n};\n\nREGISTER_PARSER_IDX_FEATURE_FUNCTION(\"pred-tag\", PredictedTagFeatureFunction);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/tagger_transitions_test.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include <memory>\n#include <string>\n\n#include \"syntaxnet/parser_state.h\"\n#include \"syntaxnet/parser_transitions.h\"\n#include \"syntaxnet/populate_test_inputs.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/task_context.h\"\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/term_frequency_map.h\"\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n#include \"tensorflow/core/platform/test.h\"\n\nnamespace syntaxnet {\n\nclass TaggerTransitionTest : public ::testing::Test {\n public:\n  TaggerTransitionTest()\n      : transition_system_(ParserTransitionSystem::Create(\"tagger\")) {}\n\n protected:\n  // Creates a label map and a tag map for testing based on the given\n  // document and initializes the transition system appropriately.\n  void SetUpForDocument(const Sentence &document) {\n    input_label_map_ = context_.GetInput(\"label-map\", \"text\", \"\");\n    input_label_map_ = context_.GetInput(\"tag-map\", \"text\", \"\");\n    transition_system_->Setup(&context_);\n    PopulateTestInputs::Defaults(document).Populate(&context_);\n    label_map_.Load(TaskContext::InputFile(*input_label_map_),\n                    0 /* minimum frequency */,\n                    -1 /* maximum 
number of terms */);\n    transition_system_->Init(&context_);\n  }\n\n  // Creates a cloned state from a sentence in order to test that cloning\n  // works correctly for the new parser states.\n  ParserState *NewClonedState(Sentence *sentence) {\n    ParserState state(sentence, transition_system_->NewTransitionState(\n                                    true /* training mode */),\n                      &label_map_);\n    return state.Clone();\n  }\n\n  // Performs gold transitions and check that the labels and heads recorded\n  // in the parser state match gold heads and labels.\n  void GoldParse(Sentence *sentence) {\n    ParserState *state = NewClonedState(sentence);\n    LOG(INFO) << \"Initial parser state: \" << state->ToString();\n    while (!transition_system_->IsFinalState(*state)) {\n      ParserAction action = transition_system_->GetNextGoldAction(*state);\n      EXPECT_TRUE(transition_system_->IsAllowedAction(action, *state));\n      LOG(INFO) << \"Performing action: \"\n                << transition_system_->ActionAsString(action, *state);\n      transition_system_->PerformActionWithoutHistory(action, state);\n      LOG(INFO) << \"Parser state: \" << state->ToString();\n    }\n    delete state;\n  }\n\n  // Always takes the default action, and verifies that this leads to\n  // a final state through a sequence of allowed actions.\n  void DefaultParse(Sentence *sentence) {\n    ParserState *state = NewClonedState(sentence);\n    LOG(INFO) << \"Initial parser state: \" << state->ToString();\n    while (!transition_system_->IsFinalState(*state)) {\n      ParserAction action = transition_system_->GetDefaultAction(*state);\n      EXPECT_TRUE(transition_system_->IsAllowedAction(action, *state));\n      LOG(INFO) << \"Performing action: \"\n                << transition_system_->ActionAsString(action, *state);\n      transition_system_->PerformActionWithoutHistory(action, state);\n      LOG(INFO) << \"Parser state: \" << state->ToString();\n    }\n    delete 
state;\n  }\n\n  TaskContext context_;\n  TaskInput *input_label_map_ = nullptr;\n  TermFrequencyMap label_map_;\n  std::unique_ptr<ParserTransitionSystem> transition_system_;\n};\n\nTEST_F(TaggerTransitionTest, SingleSentenceDocumentTest) {\n  string document_text;\n  Sentence document;\n  TF_CHECK_OK(ReadFileToString(\n      tensorflow::Env::Default(),\n      \"syntaxnet/testdata/document\",\n      &document_text));\n  LOG(INFO) << \"see doc\\n:\" << document_text;\n  CHECK(TextFormat::ParseFromString(document_text, &document));\n  SetUpForDocument(document);\n  GoldParse(&document);\n  DefaultParse(&document);\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/task_context.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/task_context.h\"\n\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/platform/env.h\"\n\nnamespace syntaxnet {\nnamespace {\n\nconst char *const kShardPrintFormat = \"%05d\";\n\n}  // namespace\n\nTaskInput *TaskContext::GetInput(const string &name) {\n  // Return existing input if it exists.\n  for (int i = 0; i < spec_.input_size(); ++i) {\n    if (spec_.input(i).name() == name) return spec_.mutable_input(i);\n  }\n\n  // Create new input.\n  TaskInput *input = spec_.add_input();\n  input->set_name(name);\n  return input;\n}\n\nTaskInput *TaskContext::GetInput(const string &name, const string &file_format,\n                                 const string &record_format) {\n  TaskInput *input = GetInput(name);\n  if (!file_format.empty()) {\n    bool found = false;\n    for (int i = 0; i < input->file_format_size(); ++i) {\n      if (input->file_format(i) == file_format) found = true;\n    }\n    if (!found) input->add_file_format(file_format);\n  }\n  if (!record_format.empty()) {\n    bool found = false;\n    for (int i = 0; i < input->record_format_size(); ++i) {\n      if (input->record_format(i) == record_format) found = true;\n    }\n    if (!found) input->add_record_format(record_format);\n  }\n  return input;\n}\n\nvoid 
TaskContext::SetParameter(const string &name, const string &value) {\n  // If the parameter already exists update the value.\n  for (int i = 0; i < spec_.parameter_size(); ++i) {\n    if (spec_.parameter(i).name() == name) {\n      spec_.mutable_parameter(i)->set_value(value);\n      return;\n    }\n  }\n\n  // Add new parameter.\n  TaskSpec::Parameter *param = spec_.add_parameter();\n  param->set_name(name);\n  param->set_value(value);\n}\n\nstring TaskContext::GetParameter(const string &name) const {\n  // First try to find parameter in task specification.\n  for (int i = 0; i < spec_.parameter_size(); ++i) {\n    if (spec_.parameter(i).name() == name) return spec_.parameter(i).value();\n  }\n\n  // Parameter not found, return empty string.\n  return \"\";\n}\n\nint TaskContext::GetIntParameter(const string &name) const {\n  string value = GetParameter(name);\n  return utils::ParseUsing<int>(value, 0, utils::ParseInt32);\n}\n\nint64 TaskContext::GetInt64Parameter(const string &name) const {\n  string value = GetParameter(name);\n  return utils::ParseUsing<int64>(value, 0ll, utils::ParseInt64);\n}\n\nbool TaskContext::GetBoolParameter(const string &name) const {\n  string value = GetParameter(name);\n  return value == \"true\";\n}\n\ndouble TaskContext::GetFloatParameter(const string &name) const {\n  string value = GetParameter(name);\n  return utils::ParseUsing<double>(value, .0, utils::ParseDouble);\n}\n\nstring TaskContext::Get(const string &name, const char *defval) const {\n  // First try to find parameter in task specification.\n  for (int i = 0; i < spec_.parameter_size(); ++i) {\n    if (spec_.parameter(i).name() == name) return spec_.parameter(i).value();\n  }\n\n  // Parameter not found, return default value.\n  return defval;\n}\n\nstring TaskContext::Get(const string &name, const string &defval) const {\n  return Get(name, defval.c_str());\n}\n\nint TaskContext::Get(const string &name, int defval) const {\n  string value = Get(name, \"\");\n  return 
utils::ParseUsing<int>(value, defval, utils::ParseInt32);\n}\n\nint64 TaskContext::Get(const string &name, int64 defval) const {\n  string value = Get(name, \"\");\n  return utils::ParseUsing<int64>(value, defval, utils::ParseInt64);\n}\n\ndouble TaskContext::Get(const string &name, double defval) const {\n  string value = Get(name, \"\");\n  return utils::ParseUsing<double>(value, defval, utils::ParseDouble);\n}\n\nbool TaskContext::Get(const string &name, bool defval) const {\n  string value = Get(name, \"\");\n  return value.empty() ? defval : value == \"true\";\n}\n\nstring TaskContext::InputFile(const TaskInput &input) {\n  CHECK_EQ(input.part_size(), 1) << input.name();\n  return input.part(0).file_pattern();\n}\n\nbool TaskContext::Supports(const TaskInput &input, const string &file_format,\n                           const string &record_format) {\n  // Check file format.\n  if (input.file_format_size() > 0) {\n    bool found = false;\n    for (int i = 0; i < input.file_format_size(); ++i) {\n      if (input.file_format(i) == file_format) {\n        found = true;\n        break;\n      }\n    }\n    if (!found) return false;\n  }\n\n  // Check record format.\n  if (input.record_format_size() > 0) {\n    bool found = false;\n    for (int i = 0; i < input.record_format_size(); ++i) {\n      if (input.record_format(i) == record_format) {\n        found = true;\n        break;\n      }\n    }\n    if (!found) return false;\n  }\n\n  return true;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/task_context.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_TASK_CONTEXT_H_\n#define SYNTAXNET_TASK_CONTEXT_H_\n\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/task_spec.pb.h\"\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\n// A task context holds configuration information for a task. It is basically a\n// wrapper around a TaskSpec protocol buffer.\nclass TaskContext {\n public:\n  // Returns the underlying task specification protocol buffer for the context.\n  const TaskSpec &spec() const { return spec_; }\n  TaskSpec *mutable_spec() { return &spec_; }\n\n  // Returns a named input descriptor for the task. A new input  is created if\n  // the task context does not already have an input with that name.\n  TaskInput *GetInput(const string &name);\n  TaskInput *GetInput(const string &name, const string &file_format,\n                      const string &record_format);\n\n  // Sets task parameter.\n  void SetParameter(const string &name, const string &value);\n\n  // Returns task parameter. 
If the parameter is not in the task configuration\n  // the (default) value of the corresponding command line flag is returned.\n  string GetParameter(const string &name) const;\n  int GetIntParameter(const string &name) const;\n  int64 GetInt64Parameter(const string &name) const;\n  bool GetBoolParameter(const string &name) const;\n  double GetFloatParameter(const string &name) const;\n\n  // Returns task parameter. If the parameter is not in the task configuration\n  // the default value is returned. Parameters retrieved using these methods\n  // don't need to be defined with a DEFINE_*() macro.\n  string Get(const string &name, const string &defval) const;\n  string Get(const string &name, const char *defval) const;\n  int Get(const string &name, int defval) const;\n  int64 Get(const string &name, int64 defval) const;\n  double Get(const string &name, double defval) const;\n  bool Get(const string &name, bool defval) const;\n\n  // Returns input file name for a single-file task input.\n  static string InputFile(const TaskInput &input);\n\n  // Returns true if task input supports the file and record format.\n  static bool Supports(const TaskInput &input, const string &file_format,\n                       const string &record_format);\n\n private:\n  // Underlying task specification protocol buffer.\n  TaskSpec spec_;\n\n  // Vector of parameters required by this task.  These must be specified in the\n  // task rather than relying on default values.\n  vector<string> required_parameters_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_TASK_CONTEXT_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/task_spec.proto",
    "content": "// LINT: ALLOW_GROUPS\n// Protocol buffer specifications for task configuration.\n\nsyntax = \"proto2\";\n\npackage syntaxnet;\n\n// Task input descriptor.\nmessage TaskInput {\n  // Name of input resource.\n  required string name = 1;\n\n  // Name of stage responsible of creating this resource.\n  optional string creator = 2;\n\n  // File format for resource.\n  repeated string file_format = 3;\n\n  // Record format for resource.\n  repeated string record_format = 4;\n\n  // Is this resource multi-file?\n  optional bool multi_file = 5 [default = false];\n\n  // An input can consist of multiple file sets.\n  repeated group Part = 6 {\n    // File pattern for file set.\n    optional string file_pattern = 7;\n\n    // File format for file set.\n    optional string file_format = 8;\n\n    // Record format for file set.\n    optional string record_format = 9;\n  }\n}\n\n// Task output descriptor.\nmessage TaskOutput {\n  // Name of output resource.\n  required string name = 1;\n\n  // File format for output resource.\n  optional string file_format = 2;\n\n  // Record format for output resource.\n  optional string record_format = 3;\n\n  // Number of shards in output. If it is different from zero this output is\n  // sharded. If the number of shards is set to -1 this means that the output is\n  // sharded, but the number of shard is unknown. The files are then named\n  // 'base-*-of-*'.\n  optional int32 shards = 4 [default = 0];\n\n  // Base file name for output resource. 
If this is not set by the task\n  // component it is set to a default value by the workflow engine.\n  optional string file_base = 5;\n\n  // Optional extension added to the file name.\n  optional string file_extension = 6;\n}\n\n// A task specification is used for describing executing parameters.\nmessage TaskSpec {\n  // Name of task.\n  optional string task_name = 1;\n\n  // Workflow task type.\n  optional string task_type = 2;\n\n  // Task parameters.\n  repeated group Parameter = 3 {\n    required string name = 4;\n    optional string value = 5;\n  }\n\n  // Task inputs.\n  repeated TaskInput input = 6;\n\n  // Task outputs.\n  repeated TaskOutput output = 7;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/term_frequency_map.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/term_frequency_map.h\"\n\n#include <stddef.h>\n#include <algorithm>\n#include <limits>\n\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/io/buffered_inputstream.h\"\n#include \"tensorflow/core/lib/io/random_inputstream.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/platform/env.h\"\n\nnamespace syntaxnet {\n\nint TermFrequencyMap::Increment(const string &term) {\n  CHECK_EQ(term_index_.size(), term_data_.size());\n  const TermIndex::const_iterator it = term_index_.find(term);\n  if (term_index_.find(term) != term_index_.end()) {\n    // Increment the existing term.\n    pair<string, int64> &data = term_data_[it->second];\n    CHECK_EQ(term, data.first);\n    ++(data.second);\n    return it->second;\n  } else {\n    // Add a new term.\n    const int index = term_index_.size();\n    CHECK_LT(index, std::numeric_limits<int32>::max());  // overflow\n    term_index_[term] = index;\n    term_data_.push_back(pair<string, int64>(term, 1));\n    return index;\n  }\n}\n\nvoid TermFrequencyMap::Clear() {\n  term_index_.clear();\n  term_data_.clear();\n}\n\nvoid TermFrequencyMap::Load(const string &filename, int min_frequency,\n                            int max_num_terms) {\n  Clear();\n\n  // If 
max_num_terms is non-positive, replace it with INT_MAX.\n  if (max_num_terms <= 0) max_num_terms = std::numeric_limits<int>::max();\n\n  // Read the first line (total # of terms in the mapping).\n  std::unique_ptr<tensorflow::RandomAccessFile> file;\n  TF_CHECK_OK(tensorflow::Env::Default()->NewRandomAccessFile(filename, &file));\n  static const int kInputBufferSize = 1 * 1024 * 1024; /* bytes */\n  tensorflow::io::RandomAccessInputStream stream(file.get());\n  tensorflow::io::BufferedInputStream buffer(&stream, kInputBufferSize);\n  string line;\n  TF_CHECK_OK(buffer.ReadLine(&line));\n  int32 total = -1;\n  CHECK(utils::ParseInt32(line.c_str(), &total));\n  CHECK_GE(total, 0);\n\n  // Read the mapping.\n  int64 last_frequency = -1;\n  for (int i = 0; i < total && i < max_num_terms; ++i) {\n    TF_CHECK_OK(buffer.ReadLine(&line));\n    vector<string> elements = utils::Split(line, ' ');\n    CHECK_EQ(2, elements.size());\n    CHECK(!elements[0].empty());\n    CHECK(!elements[1].empty());\n    int64 frequency = 0;\n    CHECK(utils::ParseInt64(elements[1].c_str(), &frequency));\n    CHECK_GT(frequency, 0);\n    const string &term = elements[0];\n\n    // Check frequency sorting (descending order).\n    if (i > 0) CHECK_GE(last_frequency, frequency);\n    last_frequency = frequency;\n\n    // Ignore low-frequency items.\n    if (frequency < min_frequency) continue;\n\n    // Check uniqueness of the mapped terms.\n    CHECK(term_index_.find(term) == term_index_.end())\n        << \"File \" << filename << \" has duplicate term: \" << term;\n\n    // Assign the next available index.\n    const int index = term_index_.size();\n    term_index_[term] = index;\n    term_data_.push_back(pair<string, int64>(term, frequency));\n  }\n  CHECK_EQ(term_index_.size(), term_data_.size());\n  LOG(INFO) << \"Loaded \" << term_index_.size() << \" terms from \" << filename\n            << \".\";\n}\n\nstruct TermFrequencyMap::SortByFrequencyThenTerm {\n  // Return a > b to sort in 
descending order of frequency; otherwise,\n  // lexicographic sort on term.\n  bool operator()(const pair<string, int64> &a,\n                  const pair<string, int64> &b) const {\n    return (a.second > b.second || (a.second == b.second && a.first < b.first));\n  }\n};\n\nvoid TermFrequencyMap::Save(const string &filename) const {\n  CHECK_EQ(term_index_.size(), term_data_.size());\n\n  // Copy and sort the term data.\n  vector<pair<string, int64>> sorted_data(term_data_);\n  std::sort(sorted_data.begin(), sorted_data.end(), SortByFrequencyThenTerm());\n\n  // Write the number of terms.\n  std::unique_ptr<tensorflow::WritableFile> file;\n  TF_CHECK_OK(tensorflow::Env::Default()->NewWritableFile(filename, &file));\n  CHECK_LE(term_index_.size(), std::numeric_limits<int32>::max());  // overflow\n  const int32 num_terms = term_index_.size();\n  const string header = tensorflow::strings::StrCat(num_terms, \"\\n\");\n  TF_CHECK_OK(file->Append(header));\n\n  // Write each term and frequency.\n  for (size_t i = 0; i < sorted_data.size(); ++i) {\n    if (i > 0) CHECK_GE(sorted_data[i - 1].second, sorted_data[i].second);\n    const string line = tensorflow::strings::StrCat(\n        sorted_data[i].first, \" \", sorted_data[i].second, \"\\n\");\n    TF_CHECK_OK(file->Append(line));\n  }\n  TF_CHECK_OK(file->Close()) << \"for file \" << filename;\n  LOG(INFO) << \"Saved \" << term_index_.size() << \" terms to \" << filename\n            << \".\";\n}\n\nTagToCategoryMap::TagToCategoryMap(const string &filename) {\n  // Load the mapping.\n  std::unique_ptr<tensorflow::RandomAccessFile> file;\n  TF_CHECK_OK(tensorflow::Env::Default()->NewRandomAccessFile(filename, &file));\n  static const int kInputBufferSize = 1 * 1024 * 1024; /* bytes */\n  tensorflow::io::RandomAccessInputStream stream(file.get());\n  tensorflow::io::BufferedInputStream buffer(&stream, kInputBufferSize);\n  string line;\n  while (buffer.ReadLine(&line) == tensorflow::Status::OK()) {\n    vector<string> 
pair = utils::Split(line, '\\t');\n    CHECK(line.empty() || pair.size() == 2) << line;\n    // Skip empty lines; indexing pair[0]/pair[1] on them would be\n    // out-of-bounds.\n    if (pair.size() == 2) tag_to_category_[pair[0]] = pair[1];\n  }\n}\n\n// Returns the category associated with the given tag.\nconst string &TagToCategoryMap::GetCategory(const string &tag) const {\n  const auto it = tag_to_category_.find(tag);\n  CHECK(it != tag_to_category_.end()) << \"No category found for tag \" << tag;\n  return it->second;\n}\n\nvoid TagToCategoryMap::SetCategory(const string &tag, const string &category) {\n  const auto it = tag_to_category_.find(tag);\n  if (it != tag_to_category_.end()) {\n    CHECK_EQ(category, it->second)\n        << \"POS tag cannot be mapped to multiple coarse POS tags. \"\n        << \"'\" << tag << \"' is mapped to: '\" << category << \"' and '\"\n        << it->second << \"'\";\n  } else {\n    tag_to_category_[tag] = category;\n  }\n}\n\nvoid TagToCategoryMap::Save(const string &filename) const {\n  // Write tag and category on each line.\n  std::unique_ptr<tensorflow::WritableFile> file;\n  TF_CHECK_OK(tensorflow::Env::Default()->NewWritableFile(filename, &file));\n  for (const auto &pair : tag_to_category_) {\n    const string line =\n        tensorflow::strings::StrCat(pair.first, \"\\t\", pair.second, \"\\n\");\n    TF_CHECK_OK(file->Append(line));\n  }\n  TF_CHECK_OK(file->Close()) << \"for file \" << filename;\n}\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/term_frequency_map.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_TERM_FREQUENCY_MAP_H_\n#define SYNTAXNET_TERM_FREQUENCY_MAP_H_\n\n#include <stddef.h>\n#include <memory>\n#include <string>\n#include <unordered_map>\n#include <utility>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\n// A mapping from strings to frequencies with save and load functionality.\nclass TermFrequencyMap {\n public:\n  // Creates an empty frequency map.\n  TermFrequencyMap() {}\n\n  // Creates a term frequency map by calling Load.\n  TermFrequencyMap(const string &file, int min_frequency, int max_num_terms) {\n    Load(file, min_frequency, max_num_terms);\n  }\n\n  // Returns the number of terms with positive frequency.\n  int Size() const { return term_index_.size(); }\n\n  // Returns the index associated with the given term.  If the term does not\n  // exist, the unknown index is returned instead.\n  int LookupIndex(const string &term, int unknown) const {\n    const TermIndex::const_iterator it = term_index_.find(term);\n    return (it != term_index_.end() ? 
it->second : unknown);\n  }\n\n  // Returns the term associated with the given index.\n  const string &GetTerm(int index) const { return term_data_[index].first; }\n\n  // Increases the frequency of the given term by 1, creating a new entry if\n  // necessary, and returns the index of the term.\n  int Increment(const string &term);\n\n  // Clears all frequencies.\n  void Clear();\n\n  // Loads a frequency mapping from the given file, which must have been created\n  // by an earlier call to Save().  After loading, the term indices are\n  // guaranteed to be ordered in descending order of frequency (breaking ties\n  // arbitrarily).  However, any new terms inserted after loading do not\n  // maintain this sorting invariant.\n  //\n  // Only loads terms with frequency >= min_frequency.  If max_num_terms <= 0,\n  // then all qualifying terms are loaded; otherwise, max_num_terms terms with\n  // maximal frequency are loaded (breaking ties arbitrarily).\n  void Load(const string &filename, int min_frequency, int max_num_terms);\n\n  // Saves a frequency mapping to the given file.\n  void Save(const string &filename) const;\n\n private:\n  // Hashtable for term-to-index mapping.\n  typedef std::unordered_map<string, int> TermIndex;\n\n  // Sorting functor for term data.\n  struct SortByFrequencyThenTerm;\n\n  // Mapping from terms to indices.\n  TermIndex term_index_;\n\n  // Mapping from indices to term and frequency.\n  vector<pair<string, int64>> term_data_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(TermFrequencyMap);\n};\n\n// A mapping from tags to categories.\nclass TagToCategoryMap {\n public:\n  TagToCategoryMap() {}\n  ~TagToCategoryMap() {}\n\n  // Loads a tag to category map from a text file.\n  explicit TagToCategoryMap(const string &filename);\n\n  // Sets the category for the given tag.\n  void SetCategory(const string &tag, const string &category);\n\n  // Returns the category associated with the given tag.\n  const string &GetCategory(const string &tag) const;\n\n  
// Saves a tag to category map to the given file.\n  void Save(const string &filename) const;\n\n private:\n  map<string, string> tag_to_category_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(TagToCategoryMap);\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_TERM_FREQUENCY_MAP_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/test_main.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// A program with a main that is suitable for unittests, including those\n// that also define microbenchmarks.  Based on whether the user specified\n// the --benchmark_filter flag which specifies which benchmarks to run,\n// we will either run benchmarks or run the gtest tests in the program.\n\n#include \"tensorflow/core/platform/platform.h\"\n#include \"tensorflow/core/platform/types.h\"\n\n#if defined(PLATFORM_GOOGLE) || defined(__ANDROID__)\n\n// main() is supplied by gunit_main\n#else\n#include \"gtest/gtest.h\"\n#include \"tensorflow/core/lib/core/stringpiece.h\"\n#include \"tensorflow/core/platform/test_benchmark.h\"\n\nGTEST_API_ int main(int argc, char **argv) {\n  std::cout << \"Running main() from test_main.cc\\n\";\n\n  testing::InitGoogleTest(&argc, argv);\n  for (int i = 1; i < argc; i++) {\n    if (tensorflow::StringPiece(argv[i]).starts_with(\"--benchmarks=\")) {\n      const char *pattern = argv[i] + strlen(\"--benchmarks=\");\n      tensorflow::testing::Benchmark::Run(pattern);\n      return 0;\n    }\n  }\n  return RUN_ALL_TESTS();\n}\n#endif\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/testdata/context.pbtxt",
    "content": "Parameter {\n  name: 'brain_parser_embedding_dims'\n  value: '8;8;8'\n}\nParameter {\n  name: 'brain_parser_features'\n  value: 'input.token.word input(1).token.word input(2).token.word stack.token.word stack(1).token.word stack(2).token.word;input.tag input(1).tag input(2).tag stack.tag stack(1).tag stack(2).tag;stack.child(1).label stack.child(1).sibling(-1).label stack.child(-1).label stack.child(-1).sibling(1).label'\n}\nParameter {\n  name: 'brain_parser_embedding_names'\n  value: 'words;tags;labels'\n}\ninput {\n  name: 'training-corpus'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: 'syntaxnet/testdata/mini-training-set'\n  }\n}\ninput {\n  name: 'tuning-corpus'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: 'syntaxnet/testdata/mini-training-set'\n  }\n}\ninput {\n  name: 'parsed-tuning-corpus'\n  creator: 'brain_parser/greedy'\n  record_format: 'conll-sentence'\n}\ninput {\n  name: 'label-map'\n  file_format: 'text'\n  Part {\n    file_pattern: 'OUTPATH/label-map'\n  }\n}\ninput {\n  name: 'word-map'\n  Part {\n    file_pattern: 'OUTPATH/word-map'\n  }\n}\ninput {\n  name: 'lcword-map'\n  Part {\n    file_pattern: 'OUTPATH/lcword-map'\n  }\n}\ninput {\n  name: 'tag-map'\n  Part {\n    file_pattern: 'OUTPATH/tag-map'\n  }\n}\ninput {\n  name: 'category-map'\n  Part {\n    file_pattern: 'OUTPATH/category-map'\n  }\n}\ninput {\n  name: 'char-map'\n  Part {\n    file_pattern: 'OUTPATH/char-map'\n  }\n}\ninput {\n  name: 'prefix-table'\n  Part {\n    file_pattern: 'OUTPATH/prefix-table'\n  }\n}\ninput {\n  name: 'suffix-table'\n  Part {\n    file_pattern: 'OUTPATH/suffix-table'\n  }\n}\ninput {\n  name: 'tag-to-category'\n  Part {\n    file_pattern: 'OUTPATH/tag-to-category'\n  }\n}\ninput {\n  name: 'stdout'\n  record_format: 'conll-sentence'\n  Part {\n    file_pattern: '-'\n  }\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/testdata/document",
    "content": "text       : \"I can not recall any disorder in currency markets since the 1974 guidelines were adopted .\"\ntoken: {\n  word    : \"I\"\n  start   : 0\n  end     : 0\n  head    : 3\n  tag     : \"PRP\"\n  category: \"PRON\"\n  label   : \"nsubj\"\n  break_level       : SENTENCE_BREAK\n}\ntoken: {\n  word    : \"can\"\n  start   : 2\n  end     : 4\n  head    : 3\n  tag     : \"MD\"\n  category: \"VERB\"\n  label   : \"aux\"\n}\ntoken: {\n  word    : \"not\"\n  start   : 6\n  end     : 8\n  head    : 3\n  tag     : \"RB\"\n  category: \"ADV\"\n  label   : \"neg\"\n}\ntoken: {\n  word    : \"recall\"\n  start   : 10\n  end     : 15\n  tag     : \"VB\"\n  category: \"VERB\"\n  label   : \"ROOT\"\n}\ntoken: {\n  word    : \"any\"\n  start   : 17\n  end     : 19\n  head    : 5\n  tag     : \"DT\"\n  category: \"DET\"\n  label   : \"det\"\n}\ntoken: {\n  word    : \"disorder\"\n  start   : 21\n  end     : 28\n  head    : 3\n  tag     : \"NN\"\n  category: \"NOUN\"\n  label   : \"dobj\"\n}\ntoken: {\n  word    : \"in\"\n  start   : 30\n  end     : 31\n  head    : 5\n  tag     : \"IN\"\n  category: \"ADP\"\n  label   : \"prep\"\n}\ntoken: {\n  word    : \"currency\"\n  start   : 33\n  end     : 40\n  head    : 8\n  tag     : \"NN\"\n  category: \"NOUN\"\n  label   : \"nn\"\n}\ntoken: {\n  word    : \"markets\"\n  start   : 42\n  end     : 48\n  head    : 6\n  tag     : \"NNS\"\n  category: \"NOUN\"\n  label   : \"pobj\"\n}\ntoken: {\n  word    : \"since\"\n  start   : 50\n  end     : 54\n  head    : 14\n  tag     : \"IN\"\n  category: \"ADP\"\n  label   : \"mark\"\n}\ntoken: {\n  word    : \"the\"\n  start   : 56\n  end     : 58\n  head    : 12\n  tag     : \"DT\"\n  category: \"DET\"\n  label   : \"det\"\n}\ntoken: {\n  word    : \"1974\"\n  start   : 60\n  end     : 63\n  head    : 12\n  tag     : \"CD\"\n  category: \"NUM\"\n  label   : \"num\"\n}\ntoken: {\n  word    : \"guidelines\"\n  start   : 65\n  end     : 74\n  head    : 14\n  tag     : \"NNS\"\n 
 category: \"NOUN\"\n  label   : \"nsubjpass\"\n}\ntoken: {\n  word    : \"were\"\n  start   : 76\n  end     : 79\n  head    : 14\n  tag     : \"VBD\"\n  category: \"VERB\"\n  label   : \"auxpass\"\n}\ntoken: {\n  word    : \"adopted\"\n  start   : 81\n  end     : 87\n  head    : 3\n  tag     : \"VBN\"\n  category: \"VERB\"\n  label   : \"advcl\"\n}\ntoken: {\n  word    : \".\"\n  start   : 89\n  end     : 89\n  head    : 3\n  tag     : \".\"\n  category: \".\"\n  label   : \"p\"\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/testdata/mini-training-set",
    "content": "1\tI\t_\tPRP\tPRP\t_\t2\tnsubj\t_\t_\n2\tknew\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n3\tI\t_\tPRP\tPRP\t_\t5\tnsubj\t_\t_\n4\tcould\t_\tMD\tMD\t_\t5\taux\t_\t_\n5\tdo\t_\tVB\tVB\t_\t2\tccomp\t_\t_\n6\tit\t_\tPRP\tPRP\t_\t5\tdobj\t_\t_\n7\tproperly\t_\tRB\tRB\t_\t5\tadvmod\t_\t_\n8\tif\t_\tIN\tIN\t_\t9\tmark\t_\t_\n9\tgiven\t_\tVBN\tVBN\t_\t5\tadvcl\t_\t_\n10\tthe\t_\tDT\tDT\t_\t12\tdet\t_\t_\n11\tright\t_\tJJ\tJJ\t_\t12\tamod\t_\t_\n12\tkind\t_\tNN\tNN\t_\t9\tdobj\t_\t_\n13\tof\t_\tIN\tIN\t_\t12\tprep\t_\t_\n14\tsupport\t_\tNN\tNN\t_\t13\tpobj\t_\t_\n15\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t2\tdet\t_\t_\n2\tjourney\t_\tNN\tNN\t_\t8\tnsubj\t_\t_\n3\tthrough\t_\tIN\tIN\t_\t2\tprep\t_\t_\n4\tdeserts\t_\tNNS\tNNS\t_\t3\tpobj\t_\t_\n5\tand\t_\tCC\tCC\t_\t4\tcc\t_\t_\n6\tmountains\t_\tNNS\tNNS\t_\t4\tconj\t_\t_\n7\tcan\t_\tMD\tMD\t_\t8\taux\t_\t_\n8\ttake\t_\tVB\tVB\t_\t0\tROOT\t_\t_\n9\ta\t_\tDT\tDT\t_\t10\tdet\t_\t_\n10\tmonth\t_\tNN\tNN\t_\t8\ttmod\t_\t_\n11\t.\t_\t.\t.\t_\t8\tpunct\t_\t_\n\n1\tYou\t_\tPRP\tPRP\t_\t2\tnsubj\t_\t_\n2\tsay\t_\tVBP\tVBP\t_\t0\tROOT\t_\t_\n3\tthey\t_\tPRP\tPRP\t_\t4\tnsubj\t_\t_\n4\t're\t_\tVBP\tVBP\t_\t2\tccomp\t_\t_\n5\tin\t_\tIN\tIN\t_\t4\tprep\t_\t_\n6\tthe\t_\tDT\tDT\t_\t7\tdet\t_\t_\n7\tpipeline\t_\tNN\tNN\t_\t5\tpobj\t_\t_\n8\t?\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tBorder\t_\tNNP\tNNP\t_\t5\tnn\t_\t_\n2\tpolice\t_\tNN\tNN\t_\t5\tnn\t_\t_\n3\tcommander\t_\tNN\tNN\t_\t5\tnn\t_\t_\n4\tAbdul\t_\tNNP\tNNP\t_\t5\tnn\t_\t_\n5\tRaziq\t_\tNNP\tNNP\t_\t6\tnsubj\t_\t_\n6\tsays\t_\tVBZ\tVBZ\t_\t0\tROOT\t_\t_\n7\tthe\t_\tDT\tDT\t_\t8\tdet\t_\t_\n8\tdrugs\t_\tNNS\tNNS\t_\t10\tnsubjpass\t_\t_\n9\twere\t_\tVBD\tVBD\t_\t10\tauxpass\t_\t_\n10\tfound\t_\tVBN\tVBN\t_\t6\tccomp\t_\t_\n11\tin\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tthe\t_\tDT\tDT\t_\t13\tdet\t_\t_\n13\tbasement\t_\tNN\tNN\t_\t11\tpobj\t_\t_\n14\tof\t_\tIN\tIN\t_\t13\tprep\t_\t_\n15\ta\t_\tDT\tDT\t_\t16\tdet\t_\t_\n16\tcompound\t_\tNN\tNN\t_\t14\tpobj\t_\t_\n17\tin\t_\t
IN\tIN\t_\t16\tprep\t_\t_\n18\tNawa\t_\tNNP\tNNP\t_\t20\tnn\t_\t_\n19\tKili\t_\tNNP\tNNP\t_\t20\tnn\t_\t_\n20\tvillage\t_\tNN\tNN\t_\t17\tpobj\t_\t_\n21\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n\n1\tFourth\t_\tJJ\tJJ\t_\t3\tamod\t_\t_\n2\tquarter\t_\tNN\tNN\t_\t3\tnn\t_\t_\n3\tproduction\t_\tNN\tNN\t_\t5\tnsubjpass\t_\t_\n4\tis\t_\tVBZ\tVBZ\t_\t5\tauxpass\t_\t_\n5\texpected\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n6\tto\t_\tTO\tTO\t_\t7\taux\t_\t_\n7\tincrease\t_\tVB\tVB\t_\t5\txcomp\t_\t_\n8\tto\t_\tTO\tTO\t_\t7\tprep\t_\t_\n9\t130,000\t_\tCD\tCD\t_\t10\tnum\t_\t_\n10\tounces\t_\tNNS\tNNS\t_\t8\tpobj\t_\t_\n11\t.\t_\t.\t.\t_\t5\tpunct\t_\t_\n\n1\tMinor\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tscuffling\t_\tNN\tNN\t_\t3\tnsubj\t_\t_\n3\tbroke\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n4\tout\t_\tRP\tRP\t_\t3\tprt\t_\t_\n5\tas\t_\tIN\tIN\t_\t7\tmark\t_\t_\n6\tofficials\t_\tNNS\tNNS\t_\t7\tnsubj\t_\t_\n7\tsought\t_\tVBD\tVBD\t_\t3\tadvcl\t_\t_\n8\tto\t_\tTO\tTO\t_\t9\taux\t_\t_\n9\tseparate\t_\tVB\tVB\t_\t7\txcomp\t_\t_\n10\tthe\t_\tDT\tDT\t_\t11\tdet\t_\t_\n11\tgroups\t_\tNNS\tNNS\t_\t9\tdobj\t_\t_\n12\t.\t_\t.\t.\t_\t3\tpunct\t_\t_\n\n1\tAccording\t_\tVBG\tVBG\t_\t18\tprep\t_\t_\n2\tto\t_\tTO\tTO\t_\t1\tpcomp\t_\t_\n3\tFacebook\t_\tNNP\tNNP\t_\t2\tpobj\t_\t_\n4\t,\t_\t,\t,\t_\t3\tpunct\t_\t_\n5\twhich\t_\tWDT\tWDT\t_\t7\tnsubjpass\t_\t_\n6\tis\t_\tVBZ\tVBZ\t_\t7\tauxpass\t_\t_\n7\tbased\t_\tVBN\tVBN\t_\t3\trcmod\t_\t_\n8\tin\t_\tIN\tIN\t_\t7\tprep\t_\t_\n9\tPalo\t_\tNNP\tNNP\t_\t10\tnn\t_\t_\n10\tAlto\t_\tNNP\tNNP\t_\t8\tpobj\t_\t_\n11\t,\t_\t,\t,\t_\t10\tpunct\t_\t_\n12\tCalif\t_\tNNP\tNNP\t_\t10\tappos\t_\t_\n13\t.\t_\t.\t.\t_\t12\tpunct\t_\t_\n14\t,\t_\t,\t,\t_\t18\tpunct\t_\t_\n15\tthe\t_\tDT\tDT\t_\t17\tdet\t_\t_\n16\tWeb\t_\tNNP\tNNP\t_\t17\tnn\t_\t_\n17\tsite\t_\tNN\tNN\t_\t18\tnsubj\t_\t_\n18\thas\t_\tVBZ\tVBZ\t_\t0\tROOT\t_\t_\n19\tabout\t_\tIN\tIN\t_\t21\tquantmod\t_\t_\n20\t47\t_\tCD\tCD\t_\t21\tnumber\t_\t_\n21\tmillion\t_\tCD\tCD\t_\t23\tnum\t_\t_\n22\tactive\t_\tJJ\tJJ\t_\t23\tamod\t_\t_\
n23\tusers\t_\tNNS\tNNS\t_\t18\tdobj\t_\t_\n24\t.\t_\t.\t.\t_\t18\tpunct\t_\t_\n\n1\tAmong\t_\tIN\tIN\t_\t10\tprep\t_\t_\n2\tthose\t_\tDT\tDT\t_\t1\tpobj\t_\t_\n3\tleaning\t_\tVBG\tVBG\t_\t2\tpartmod\t_\t_\n4\ttoward\t_\tIN\tIN\t_\t3\tprep\t_\t_\n5\tMcDonnell\t_\tNNP\tNNP\t_\t4\tpobj\t_\t_\n6\t,\t_\t,\t,\t_\t10\tpunct\t_\t_\n7\thowever\t_\tRB\tRB\t_\t10\tadvmod\t_\t_\n8\t,\t_\t,\t,\t_\t10\tpunct\t_\t_\n9\tsome\t_\tDT\tDT\t_\t10\tnsubj\t_\t_\n10\ttook\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n11\ta\t_\tDT\tDT\t_\t14\tdet\t_\t_\n12\tmore\t_\tRBR\tRBR\t_\t13\tadvmod\t_\t_\n13\tnuanced\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tview\t_\tNN\tNN\t_\t10\tdobj\t_\t_\n15\t,\t_\t,\t,\t_\t10\tpunct\t_\t_\n16\tallowing\t_\tVBG\tVBG\t_\t10\tpartmod\t_\t_\n17\tfor\t_\tIN\tIN\t_\t16\tprep\t_\t_\n18\tthe\t_\tDT\tDT\t_\t19\tdet\t_\t_\n19\tpossibility\t_\tNN\tNN\t_\t17\tpobj\t_\t_\n20\tthat\t_\tIN\tIN\t_\t24\tmark\t_\t_\n21\tMcDonnell\t_\tNNP\tNNP\t_\t24\tnsubj\t_\t_\n22\tcould\t_\tMD\tMD\t_\t24\taux\t_\t_\n23\thave\t_\tVB\tVB\t_\t24\taux\t_\t_\n24\tchanged\t_\tVBN\tVBN\t_\t19\tccomp\t_\t_\n25\this\t_\tPRP$\tPRP$\t_\t26\tposs\t_\t_\n26\tmind\t_\tNN\tNN\t_\t24\tdobj\t_\t_\n27\tin\t_\tIN\tIN\t_\t24\tprep\t_\t_\n28\tthe\t_\tDT\tDT\t_\t31\tdet\t_\t_\n29\tintervening\t_\tVBG\tVBG\t_\t31\tamod\t_\t_\n30\t20\t_\tCD\tCD\t_\t31\tnum\t_\t_\n31\tyears\t_\tNNS\tNNS\t_\t27\tpobj\t_\t_\n32\tor\t_\tCC\tCC\t_\t24\tcc\t_\t_\n33\tthat\t_\tIN\tIN\t_\t39\tmark\t_\t_\n34\this\t_\tPRP$\tPRP$\t_\t36\tposs\t_\t_\n35\tpersonal\t_\tJJ\tJJ\t_\t36\tamod\t_\t_\n36\tconvictions\t_\tNNS\tNNS\t_\t39\tnsubj\t_\t_\n37\twould\t_\tMD\tMD\t_\t39\taux\t_\t_\n38\tnot\t_\tRB\tRB\t_\t39\tneg\t_\t_\n39\tinterfere\t_\tVB\tVB\t_\t24\tconj\t_\t_\n40\twith\t_\tIN\tIN\t_\t39\tprep\t_\t_\n41\this\t_\tPRP$\tPRP$\t_\t42\tposs\t_\t_\n42\tgoverning\t_\tNN\tNN\t_\t40\tpobj\t_\t_\n43\t.\t_\t.\t.\t_\t10\tpunct\t_\t_\n\n1\tBoth\t_\tDT\tDT\t_\t2\tdet\t_\t_\n2\tteams\t_\tNNS\tNNS\t_\t3\tnsubj\t_\t_\n3\thave\t_\tVBP\tVBP\t_\t0\tROOT\t_\t_\n4\t97\t_\tCD\tCD\t_
\t5\tnum\t_\t_\n5\tpoints\t_\tNNS\tNNS\t_\t3\tdobj\t_\t_\n6\t.\t_\t.\t.\t_\t3\tpunct\t_\t_\n\n1\tStar-Banner\t_\tNNP\tNNP\t_\t2\tnsubj\t_\t_\n2\treported\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n3\tTuesday\t_\tNNP\tNNP\t_\t2\ttmod\t_\t_\n4\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tHarry\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tRedknapp\t_\tNNP\tNNP\t_\t9\tnsubj\t_\t_\n3\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n4\tthe\t_\tDT\tDT\t_\t6\tdet\t_\t_\n5\tTottenham\t_\tNNP\tNNP\t_\t6\tnn\t_\t_\n6\tmanager\t_\tNN\tNN\t_\t2\tappos\t_\t_\n7\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n8\twas\t_\tVBD\tVBD\t_\t9\taux\t_\t_\n9\tdisbelieving\t_\tVBG\tVBG\t_\t0\tROOT\t_\t_\n10\tthat\t_\tIN\tIN\t_\t18\tmark\t_\t_\n11\tLennon\t_\tNNP\tNNP\t_\t13\tposs\t_\t_\n12\t's\t_\tPOS\tPOS\t_\t11\tpossessive\t_\t_\n13\tdelivery\t_\tNN\tNN\t_\t18\tnsubj\t_\t_\n14\tcould\t_\tMD\tMD\t_\t18\taux\t_\t_\n15\tbe\t_\tVB\tVB\t_\t18\tcop\t_\t_\n16\tso\t_\tRB\tRB\t_\t18\tadvmod\t_\t_\n17\tradically\t_\tRB\tRB\t_\t18\tadvmod\t_\t_\n18\tdifferent\t_\tJJ\tJJ\t_\t9\tccomp\t_\t_\n19\t.\t_\t.\t.\t_\t9\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t3\tdet\t_\t_\n2\tUS\t_\tNNP\tNNP\t_\t3\tnn\t_\t_\n3\tuptick\t_\tNN\tNN\t_\t4\tnsubj\t_\t_\n4\tmirrors\t_\tVBZ\tVBZ\t_\t0\tROOT\t_\t_\n5\tan\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\timprovement\t_\tNN\tNN\t_\t4\tdobj\t_\t_\n7\tin\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\tmany\t_\tJJ\tJJ\t_\t10\tamod\t_\t_\n9\tother\t_\tJJ\tJJ\t_\t10\tamod\t_\t_\n10\tparts\t_\tNNS\tNNS\t_\t7\tpobj\t_\t_\n11\tof\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tthe\t_\tDT\tDT\t_\t13\tdet\t_\t_\n13\tworld\t_\tNN\tNN\t_\t11\tpobj\t_\t_\n14\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tAlthough\t_\tIN\tIN\t_\t4\tmark\t_\t_\n2\tsatellite\t_\tNN\tNN\t_\t3\tnn\t_\t_\n3\ttelevision\t_\tNN\tNN\t_\t4\tnsubj\t_\t_\n4\thas\t_\tVBZ\tVBZ\t_\t17\tadvcl\t_\t_\n5\tthe\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\tcapacity\t_\tNN\tNN\t_\t4\tdobj\t_\t_\n7\tfor\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\thundreds\t_\tNNS\tNNS\t_\t7\tpobj\t_\t_\n9\tof\t_\tIN\tIN\t_\t8\tprep\t_\t_\n10\tconventional\t_\tJJ\tJJ\t_\t12\tamod\t_\
t_\n11\ttelevision\t_\tNN\tNN\t_\t12\tnn\t_\t_\n12\tchannels\t_\tNNS\tNNS\t_\t9\tpobj\t_\t_\n13\t,\t_\t,\t,\t_\t17\tpunct\t_\t_\n14\tit\t_\tPRP\tPRP\t_\t17\tnsubj\t_\t_\n15\tis\t_\tVBZ\tVBZ\t_\t17\tcop\t_\t_\n16\tless\t_\tRBR\tRBR\t_\t17\tadvmod\t_\t_\n17\table\t_\tJJ\tJJ\t_\t0\tROOT\t_\t_\n18\tto\t_\tTO\tTO\t_\t19\taux\t_\t_\n19\tprovide\t_\tVB\tVB\t_\t17\txcomp\t_\t_\n20\tvideo-on-demand\t_\tNN\tNN\t_\t19\tdobj\t_\t_\n21\t.\t_\t.\t.\t_\t17\tpunct\t_\t_\n\n1\tOur\t_\tPRP$\tPRP$\t_\t3\tposs\t_\t_\n2\tcomfortable\t_\tJJ\tJJ\t_\t3\tamod\t_\t_\n3\troom\t_\tNN\tNN\t_\t4\tnsubj\t_\t_\n4\tfeels\t_\tVBZ\tVBZ\t_\t0\tROOT\t_\t_\n5\ton\t_\tIN\tIN\t_\t4\tprep\t_\t_\n6\tthe\t_\tDT\tDT\t_\t8\tdet\t_\t_\n7\tsmall\t_\tJJ\tJJ\t_\t8\tamod\t_\t_\n8\tside\t_\tNN\tNN\t_\t5\tpobj\t_\t_\n9\t,\t_\t,\t,\t_\t4\tpunct\t_\t_\n10\tmainly\t_\tRB\tRB\t_\t17\tadvmod\t_\t_\n11\tbecause\t_\tIN\tIN\t_\t17\tmark\t_\t_\n12\ttoo\t_\tRB\tRB\t_\t13\tadvmod\t_\t_\n13\tmuch\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tfurniture\t_\tNN\tNN\t_\t17\tnsubjpass\t_\t_\n15\thas\t_\tVBZ\tVBZ\t_\t17\taux\t_\t_\n16\tbeen\t_\tVBN\tVBN\t_\t17\tauxpass\t_\t_\n17\tshoehorned\t_\tVBN\tVBN\t_\t4\tadvcl\t_\t_\n18\tinto\t_\tIN\tIN\t_\t17\tprep\t_\t_\n19\tit\t_\tPRP\tPRP\t_\t18\tpobj\t_\t_\n20\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tThey\t_\tPRP\tPRP\t_\t3\tnsubj\t_\t_\n2\talso\t_\tRB\tRB\t_\t3\tadvmod\t_\t_\n3\trequire\t_\tVBP\tVBP\t_\t0\tROOT\t_\t_\n4\ta\t_\tDT\tDT\t_\t6\tdet\t_\t_\n5\tslower\t_\tJJR\tJJR\t_\t6\tamod\t_\t_\n6\tinhale\t_\tNN\tNN\t_\t3\tdobj\t_\t_\n7\t.\t_\t.\t.\t_\t3\tpunct\t_\t_\n\n1\tHer\t_\tPRP$\tPRP$\t_\t2\tposs\t_\t_\n2\tring\t_\tNN\tNN\t_\t4\tnsubjpass\t_\t_\n3\twas\t_\tVBD\tVBD\t_\t4\tauxpass\t_\t_\n4\tfound\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n5\tin\t_\tIN\tIN\t_\t4\tprep\t_\t_\n6\tthe\t_\tDT\tDT\t_\t7\tdet\t_\t_\n7\tcar\t_\tNN\tNN\t_\t5\tpobj\t_\t_\n8\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tIn\t_\tIN\tIN\t_\t12\tprep\t_\t_\n2\tthe\t_\tDT\tDT\t_\t4\tdet\t_\t_\n3\tpast\t_\tJJ\tJJ\t_\t4\tamod\t_\t_\n4\tyear\t_\tNN\tNN\t_\
t1\tpobj\t_\t_\n5\t,\t_\t,\t,\t_\t7\tpunct\t_\t_\n6\tForsythe\t_\tNNP\tNNP\t_\t7\tnsubj\t_\t_\n7\tsaid\t_\tVBD\tVBD\t_\t12\tparataxis\t_\t_\n8\t,\t_\t,\t,\t_\t7\tpunct\t_\t_\n9\tthe\t_\tDT\tDT\t_\t11\tdet\t_\t_\n10\tSalvation\t_\tNNP\tNNP\t_\t11\tnn\t_\t_\n11\tArmy\t_\tNNP\tNNP\t_\t12\tnsubj\t_\t_\n12\tprovided\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n13\trental\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tsubsidies\t_\tNNS\tNNS\t_\t12\tdobj\t_\t_\n15\tthat\t_\tWDT\tWDT\t_\t16\tnsubj\t_\t_\n16\tprevented\t_\tVBD\tVBD\t_\t14\trcmod\t_\t_\n17\t1,172\t_\tCD\tCD\t_\t18\tnum\t_\t_\n18\tevictions\t_\tNNS\tNNS\t_\t16\tdobj\t_\t_\n19\t.\t_\t.\t.\t_\t12\tpunct\t_\t_\n\n1\tA\t_\tDT\tDT\t_\t3\tdet\t_\t_\n2\t23-year-old\t_\tJJ\tJJ\t_\t3\tamod\t_\t_\n3\tman\t_\tNN\tNN\t_\t6\tnsubjpass\t_\t_\n4\thas\t_\tVBZ\tVBZ\t_\t6\taux\t_\t_\n5\tbeen\t_\tVBN\tVBN\t_\t6\tauxpass\t_\t_\n6\tjailed\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n7\tfor\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\ttwo\t_\tCD\tCD\t_\t9\tnum\t_\t_\n9\tyears\t_\tNNS\tNNS\t_\t7\tpobj\t_\t_\n10\tafter\t_\tIN\tIN\t_\t6\tprep\t_\t_\n11\tpleading\t_\tVBG\tVBG\t_\t10\tpcomp\t_\t_\n12\tguilty\t_\tJJ\tJJ\t_\t11\tacomp\t_\t_\n13\tto\t_\tTO\tTO\t_\t12\tprep\t_\t_\n14\tthe\t_\tDT\tDT\t_\t15\tdet\t_\t_\n15\tmanslaughter\t_\tNN\tNN\t_\t13\tpobj\t_\t_\n16\tof\t_\tIN\tIN\t_\t15\tprep\t_\t_\n17\ta\t_\tDT\tDT\t_\t18\tdet\t_\t_\n18\tman\t_\tNN\tNN\t_\t16\tpobj\t_\t_\n19\tin\t_\tIN\tIN\t_\t18\tprep\t_\t_\n20\tHertfordshire\t_\tNNP\tNNP\t_\t19\tpobj\t_\t_\n21\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n\n1\tBut\t_\tCC\tCC\t_\t10\tcc\t_\t_\n2\tthe\t_\tDT\tDT\t_\t3\tdet\t_\t_\n3\tsustainability\t_\tNN\tNN\t_\t10\tnsubj\t_\t_\n4\tof\t_\tIN\tIN\t_\t3\tprep\t_\t_\n5\tany\t_\tDT\tDT\t_\t7\tdet\t_\t_\n6\tpost-bubble\t_\tJJ\tJJ\t_\t7\tamod\t_\t_\n7\trecovery\t_\tNN\tNN\t_\t4\tpobj\t_\t_\n8\tis\t_\tVBZ\tVBZ\t_\t10\tcop\t_\t_\n9\talways\t_\tRB\tRB\t_\t10\tadvmod\t_\t_\n10\tdubious\t_\tJJ\tJJ\t_\t0\tROOT\t_\t_\n11\t.\t_\t.\t.\t_\t10\tpunct\t_\t_\n\n1\tThey\t_\tPRP\tPRP\t_\t2\tnsubj\t_\t_\n2\tspoke\t_\tVBD
\tVBD\t_\t0\tROOT\t_\t_\n3\tto\t_\tTO\tTO\t_\t2\tprep\t_\t_\n4\tthe\t_\tDT\tDT\t_\t5\tdet\t_\t_\n5\tBBC\t_\tNNP\tNNP\t_\t8\tposs\t_\t_\n6\t's\t_\tPOS\tPOS\t_\t5\tpossessive\t_\t_\n7\tArtyom\t_\tNNP\tNNP\t_\t8\tnn\t_\t_\n8\tLiss\t_\tNNP\tNNP\t_\t3\tpobj\t_\t_\n9\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tThat\t_\tDT\tDT\t_\t2\tnsubj\t_\t_\n2\tincludes\t_\tVBZ\tVBZ\t_\t0\tROOT\t_\t_\n3\tme\t_\tPRP\tPRP\t_\t2\tdobj\t_\t_\n4\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n5\ttoo\t_\tRB\tRB\t_\t2\tadvmod\t_\t_\n6\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t2\tdet\t_\t_\n2\tname\t_\tNN\tNN\t_\t9\tnsubj\t_\t_\n3\tof\t_\tIN\tIN\t_\t2\tprep\t_\t_\n4\tRachel\t_\tNNP\tNNP\t_\t5\tnn\t_\t_\n5\tHarris\t_\tNNP\tNNP\t_\t8\tposs\t_\t_\n6\t'\t_\tPOS\tPOS\t_\t5\tpossessive\t_\t_\n7\tWeb\t_\tNNP\tNNP\t_\t8\tnn\t_\t_\n8\tsite\t_\tNN\tNN\t_\t3\tpobj\t_\t_\n9\tsays\t_\tVBZ\tVBZ\t_\t0\tROOT\t_\t_\n10\tit\t_\tPRP\tPRP\t_\t9\tdobj\t_\t_\n11\tall\t_\tDT\tDT\t_\t10\tdet\t_\t_\n12\t.\t_\t.\t.\t_\t9\tpunct\t_\t_\n\n1\tIf\t_\tIN\tIN\t_\t3\tmark\t_\t_\n2\tyou\t_\tPRP\tPRP\t_\t3\tnsubj\t_\t_\n3\tprefer\t_\tVBP\tVBP\t_\t19\tadvcl\t_\t_\n4\tto\t_\tTO\tTO\t_\t5\taux\t_\t_\n5\tmaximize\t_\tVB\tVB\t_\t3\txcomp\t_\t_\n6\tyour\t_\tPRP$\tPRP$\t_\t7\tposs\t_\t_\n7\ttravel\t_\tNN\tNN\t_\t5\tdobj\t_\t_\n8\twith\t_\tIN\tIN\t_\t5\tprep\t_\t_\n9\tshorter\t_\tJJR\tJJR\t_\t10\tamod\t_\t_\n10\tstays\t_\tNNS\tNNS\t_\t8\tpobj\t_\t_\n11\tin\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tmore\t_\tJJR\tJJR\t_\t13\tmwe\t_\t_\n13\tthan\t_\tIN\tIN\t_\t14\tquantmod\t_\t_\n14\tone\t_\tCD\tCD\t_\t15\tnum\t_\t_\n15\tdestination\t_\tNN\tNN\t_\t11\tpobj\t_\t_\n16\t,\t_\t,\t,\t_\t19\tpunct\t_\t_\n17\tyou\t_\tPRP\tPRP\t_\t19\tnsubj\t_\t_\n18\tmay\t_\tMD\tMD\t_\t19\taux\t_\t_\n19\tlike\t_\tVB\tVB\t_\t0\tROOT\t_\t_\n20\tthis\t_\tDT\tDT\t_\t22\tdet\t_\t_\n21\tmulti-country\t_\tJJ\tJJ\t_\t22\tamod\t_\t_\n22\tjaunt\t_\tNN\tNN\t_\t19\tdobj\t_\t_\n23\tfrom\t_\tIN\tIN\t_\t22\tprep\t_\t_\n24\tVirgin\t_\tNNP\tNNP\t_\t25\tnn\t_\t_\n25\tVacations\t_\tNNPS\tNNPS\t_
\t23\tpobj\t_\t_\n26\t.\t_\t.\t.\t_\t19\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t3\tdet\t_\t_\n2\tAfghan\t_\tJJ\tJJ\t_\t3\tamod\t_\t_\n3\tgovernment\t_\tNN\tNN\t_\t6\tnsubj\t_\t_\n4\talso\t_\tRB\tRB\t_\t6\tadvmod\t_\t_\n5\tis\t_\tVBZ\tVBZ\t_\t6\taux\t_\t_\n6\ttrying\t_\tVBG\tVBG\t_\t0\tROOT\t_\t_\n7\tto\t_\tTO\tTO\t_\t8\taux\t_\t_\n8\tpersuade\t_\tVB\tVB\t_\t6\txcomp\t_\t_\n9\tfarmers\t_\tNNS\tNNS\t_\t8\tdobj\t_\t_\n10\tto\t_\tTO\tTO\t_\t11\taux\t_\t_\n11\tstop\t_\tVB\tVB\t_\t8\txcomp\t_\t_\n12\tgrowing\t_\tVBG\tVBG\t_\t13\tamod\t_\t_\n13\tpoppy\t_\tNN\tNN\t_\t11\tdobj\t_\t_\n14\tand\t_\tCC\tCC\t_\t11\tcc\t_\t_\n15\tshift\t_\tVB\tVB\t_\t11\tconj\t_\t_\n16\tto\t_\tTO\tTO\t_\t15\tprep\t_\t_\n17\tother\t_\tJJ\tJJ\t_\t18\tamod\t_\t_\n18\tcrops\t_\tNNS\tNNS\t_\t16\tpobj\t_\t_\n19\t,\t_\t,\t,\t_\t18\tpunct\t_\t_\n20\tparticularly\t_\tRB\tRB\t_\t18\tadvmod\t_\t_\n21\twheat\t_\tNN\tNN\t_\t18\tdep\t_\t_\n22\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t3\tdet\t_\t_\n2\tmost\t_\tRBS\tRBS\t_\t3\tadvmod\t_\t_\n3\tstriking\t_\tJJ\tJJ\t_\t6\tnsubj\t_\t_\n4\tis\t_\tVBZ\tVBZ\t_\t6\tcop\t_\t_\n5\tthe\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\tdifferences\t_\tNNS\tNNS\t_\t0\tROOT\t_\t_\n7\tover\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\twhat\t_\tWP\tWP\t_\t10\tnsubj\t_\t_\n9\tto\t_\tTO\tTO\t_\t10\taux\t_\t_\n10\tdo\t_\tVB\tVB\t_\t7\tpcomp\t_\t_\n11\twith\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tthe\t_\tDT\tDT\t_\t13\tdet\t_\t_\n13\tbanks\t_\tNNS\tNNS\t_\t11\tpobj\t_\t_\n14\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n\n1\tPhilo\t_\tNNP\tNNP\t_\t4\tnsubj\t_\t_\n2\tdid\t_\tVBD\tVBD\t_\t4\taux\t_\t_\n3\tnot\t_\tRB\tRB\t_\t4\tneg\t_\t_\n4\tmention\t_\tVB\tVB\t_\t0\tROOT\t_\t_\n5\tany\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\tname\t_\tNN\tNN\t_\t4\tdobj\t_\t_\n7\t,\t_\t,\t,\t_\t6\tpunct\t_\t_\n8\tplace\t_\tNN\tNN\t_\t6\tconj\t_\t_\n9\t,\t_\t,\t,\t_\t6\tpunct\t_\t_\n10\tdate\t_\tNN\tNN\t_\t6\tconj\t_\t_\n11\t,\t_\t,\t,\t_\t6\tpunct\t_\t_\n12\tor\t_\tCC\tCC\t_\t6\tcc\t_\t_\n13\thistorical\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tcircumsta
nces\t_\tNNS\tNNS\t_\t6\tconj\t_\t_\n15\t,\t_\t,\t,\t_\t6\tpunct\t_\t_\n16\tor\t_\tCC\tCC\t_\t6\tcc\t_\t_\n17\tany\t_\tDT\tDT\t_\t18\tdet\t_\t_\n18\tbackground\t_\tNN\tNN\t_\t6\tconj\t_\t_\n19\tto\t_\tTO\tTO\t_\t18\tprep\t_\t_\n20\tthe\t_\tDT\tDT\t_\t21\tdet\t_\t_\n21\tconsolidation\t_\tNN\tNN\t_\t19\tpobj\t_\t_\n22\tof\t_\tIN\tIN\t_\t21\tprep\t_\t_\n23\tthis\t_\tDT\tDT\t_\t24\tdet\t_\t_\n24\tgroup\t_\tNN\tNN\t_\t22\tpobj\t_\t_\n25\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tCreated\t_\tVBN\tVBN\t_\t8\tpartmod\t_\t_\n2\tin\t_\tIN\tIN\t_\t1\tprep\t_\t_\n3\t1996\t_\tCD\tCD\t_\t2\tpobj\t_\t_\n4\t,\t_\t,\t,\t_\t8\tpunct\t_\t_\n5\tthe\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\tpayments\t_\tNNS\tNNS\t_\t8\tnsubjpass\t_\t_\n7\tare\t_\tVBP\tVBP\t_\t8\tauxpass\t_\t_\n8\tbased\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n9\ton\t_\tIN\tIN\t_\t8\tprep\t_\t_\n10\ta\t_\tDT\tDT\t_\t11\tdet\t_\t_\n11\tfarm\t_\tNN\tNN\t_\t14\tposs\t_\t_\n12\t's\t_\tPOS\tPOS\t_\t11\tpossessive\t_\t_\n13\tpast\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tproduction\t_\tNN\tNN\t_\t9\tpobj\t_\t_\n15\tand\t_\tCC\tCC\t_\t8\tcc\t_\t_\n16\tare\t_\tVBP\tVBP\t_\t17\tauxpass\t_\t_\n17\tissued\t_\tVBN\tVBN\t_\t8\tconj\t_\t_\n18\tregardless\t_\tRB\tRB\t_\t17\tadvmod\t_\t_\n19\tof\t_\tIN\tIN\t_\t18\tprep\t_\t_\n20\tcurrent\t_\tJJ\tJJ\t_\t21\tamod\t_\t_\n21\tproduction\t_\tNN\tNN\t_\t19\tpobj\t_\t_\n22\tor\t_\tCC\tCC\t_\t21\tcc\t_\t_\n23\tmarket\t_\tNN\tNN\t_\t24\tnn\t_\t_\n24\tprices\t_\tNNS\tNNS\t_\t21\tconj\t_\t_\n25\t.\t_\t.\t.\t_\t8\tpunct\t_\t_\n\n1\tProsecutors\t_\tNNS\tNNS\t_\t2\tnsubj\t_\t_\n2\tsaid\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n3\tsome\t_\tDT\tDT\t_\t16\tnsubjpass\t_\t_\n4\tof\t_\tIN\tIN\t_\t3\tprep\t_\t_\n5\tthe\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\tbillions\t_\tNNS\tNNS\t_\t4\tpobj\t_\t_\n7\tof\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\tdollars\t_\tNNS\tNNS\t_\t7\tpobj\t_\t_\n9\ttransferred\t_\tVBN\tVBN\t_\t8\tpartmod\t_\t_\n10\tfrom\t_\tIN\tIN\t_\t9\tprep\t_\t_\n11\tMexican\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n12\tmoney\t_\tNN\tNN\t_\t14\tnn\t_\t_\n13\texchange
\t_\tNN\tNN\t_\t14\tnn\t_\t_\n14\thouses\t_\tNNS\tNNS\t_\t10\tpobj\t_\t_\n15\twas\t_\tVBD\tVBD\t_\t16\tauxpass\t_\t_\n16\tused\t_\tVBN\tVBN\t_\t2\tccomp\t_\t_\n17\tto\t_\tTO\tTO\t_\t18\taux\t_\t_\n18\tbuy\t_\tVB\tVB\t_\t16\txcomp\t_\t_\n19\tplanes\t_\tNNS\tNNS\t_\t18\tdobj\t_\t_\n20\tfor\t_\tIN\tIN\t_\t18\tprep\t_\t_\n21\tdrug\t_\tNN\tNN\t_\t22\tnn\t_\t_\n22\ttraffickers\t_\tNNS\tNNS\t_\t20\tpobj\t_\t_\n23\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tMargaret\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tRutherford\t_\tNNP\tNNP\t_\t11\tnsubj\t_\t_\n3\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n4\tchairwoman\t_\tNN\tNN\t_\t2\tappos\t_\t_\n5\tof\t_\tIN\tIN\t_\t4\tprep\t_\t_\n6\tLoxton\t_\tNNP\tNNP\t_\t9\tposs\t_\t_\n7\t's\t_\tPOS\tPOS\t_\t6\tpossessive\t_\t_\n8\tparish\t_\tJJ\tJJ\t_\t9\tamod\t_\t_\n9\tcouncil\t_\tNN\tNN\t_\t5\tpobj\t_\t_\n10\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n11\ttold\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n12\tBBC\t_\tNNP\tNNP\t_\t13\tnn\t_\t_\n13\tSomerset\t_\tNNP\tNNP\t_\t11\tdobj\t_\t_\n14\tthat\t_\tIN\tIN\t_\t16\tmark\t_\t_\n15\tshe\t_\tPRP\tPRP\t_\t16\tnsubj\t_\t_\n16\thoped\t_\tVBD\tVBD\t_\t11\tccomp\t_\t_\n17\tthe\t_\tDT\tDT\t_\t18\tdet\t_\t_\n18\tlines\t_\tNNS\tNNS\t_\t21\tnsubjpass\t_\t_\n19\tcould\t_\tMD\tMD\t_\t21\taux\t_\t_\n20\tbe\t_\tVB\tVB\t_\t21\tauxpass\t_\t_\n21\tsited\t_\tVBN\tVBN\t_\t16\tccomp\t_\t_\n22\tunderground\t_\tRB\tRB\t_\t21\tadvmod\t_\t_\n23\t.\t_\t.\t.\t_\t11\tpunct\t_\t_\n\n1\tAmid\t_\tIN\tIN\t_\t3\tmark\t_\t_\n2\tUS\t_\tPRP\tPRP\t_\t3\tnsubj\t_\t_\n3\tfears\t_\tVBZ\tVBZ\t_\t16\tadvcl\t_\t_\n4\tthat\t_\tIN\tIN\t_\t7\tmark\t_\t_\n5\tthey\t_\tPRP\tPRP\t_\t7\tnsubj\t_\t_\n6\tcould\t_\tMD\tMD\t_\t7\taux\t_\t_\n7\tface\t_\tVB\tVB\t_\t3\tccomp\t_\t_\n8\ttorture\t_\tVB\tVB\t_\t7\tdobj\t_\t_\n9\tif\t_\tIN\tIN\t_\t10\tmark\t_\t_\n10\treturned\t_\tVBN\tVBN\t_\t7\tadvcl\t_\t_\n11\tto\t_\tTO\tTO\t_\t10\tprep\t_\t_\n12\tChina\t_\tNNP\tNNP\t_\t11\tpobj\t_\t_\n13\t,\t_\t,\t,\t_\t16\tpunct\t_\t_\n14\tfive\t_\tCD\tCD\t_\t16\tnsubjpass\t_\t_\n15\twere\t_\tVBD\tVBD\t_\t16\tauxpass
\t_\t_\n16\treleased\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n17\tto\t_\tTO\tTO\t_\t16\tprep\t_\t_\n18\tAlbania\t_\tNNP\tNNP\t_\t17\tpobj\t_\t_\n19\tin\t_\tIN\tIN\t_\t16\tprep\t_\t_\n20\t2006\t_\tCD\tCD\t_\t19\tpobj\t_\t_\n21\t,\t_\t,\t,\t_\t16\tpunct\t_\t_\n22\tand\t_\tCC\tCC\t_\t16\tcc\t_\t_\n23\tfour\t_\tCD\tCD\t_\t25\tnsubjpass\t_\t_\n24\twere\t_\tVBD\tVBD\t_\t25\tauxpass\t_\t_\n25\tresettled\t_\tVBN\tVBN\t_\t16\tconj\t_\t_\n26\tin\t_\tIN\tIN\t_\t25\tprep\t_\t_\n27\tBermuda\t_\tNNP\tNNP\t_\t26\tpobj\t_\t_\n28\tthis\t_\tDT\tDT\t_\t29\tdet\t_\t_\n29\tyear\t_\tNN\tNN\t_\t25\ttmod\t_\t_\n30\t.\t_\t.\t.\t_\t16\tpunct\t_\t_\n\n1\tHe\t_\tPRP\tPRP\t_\t3\tnsubj\t_\t_\n2\tthen\t_\tRB\tRB\t_\t3\tadvmod\t_\t_\n3\tprovided\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n4\tMarshal\t_\tNNP\tNNP\t_\t5\tnn\t_\t_\n5\tMcAvoy\t_\tNNP\tNNP\t_\t8\tposs\t_\t_\n6\t's\t_\tPOS\tPOS\t_\t5\tpossessive\t_\t_\n7\tphone\t_\tNN\tNN\t_\t8\tnn\t_\t_\n8\tnumber\t_\tNN\tNN\t_\t3\tdobj\t_\t_\n9\t.\t_\t.\t.\t_\t3\tpunct\t_\t_\n\n1\tTech\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tcredits\t_\tNNS\tNNS\t_\t5\tnsubj\t_\t_\n3\tare\t_\tVBP\tVBP\t_\t5\tcop\t_\t_\n4\tjust\t_\tRB\tRB\t_\t5\tadvmod\t_\t_\n5\tfine\t_\tJJ\tJJ\t_\t0\tROOT\t_\t_\n6\tfor\t_\tIN\tIN\t_\t5\tprep\t_\t_\n7\twhat\t_\tWP\tWP\t_\t12\tnsubj\t_\t_\n8\tessentially\t_\tRB\tRB\t_\t12\tadvmod\t_\t_\n9\tis\t_\tVBZ\tVBZ\t_\t12\tcop\t_\t_\n10\tan\t_\tDT\tDT\t_\t12\tdet\t_\t_\n11\tun-reality\t_\tJJ\tJJ\t_\t12\tamod\t_\t_\n12\tshow\t_\tNN\tNN\t_\t6\tpcomp\t_\t_\n13\t.\t_\t.\t.\t_\t5\tpunct\t_\t_\n\n1\tBut\t_\tCC\tCC\t_\t8\tcc\t_\t_\n2\tmy\t_\tPRP$\tPRP$\t_\t4\tposs\t_\t_\n3\teldest\t_\tJJS\tJJS\t_\t4\tamod\t_\t_\n4\tdaughter\t_\tNN\tNN\t_\t8\tnsubj\t_\t_\n5\t,\t_\t,\t,\t_\t4\tpunct\t_\t_\n6\tDonna\t_\tNNP\tNNP\t_\t4\tappos\t_\t_\n7\t,\t_\t,\t,\t_\t4\tpunct\t_\t_\n8\tdid\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n9\t.\t_\t.\t.\t_\t8\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t2\tdet\t_\t_\n2\tdepartment\t_\tNN\tNN\t_\t4\tnsubj\t_\t_\n3\thas\t_\tVBZ\tVBZ\t_\t4\taux\t_\t_\n4\tspent\t_\tVBN\tVBN\t_\t0\
tROOT\t_\t_\n5\t$\t_\t$\t$\t_\t4\tdobj\t_\t_\n6\t2.9\t_\tCD\tCD\t_\t7\tnumber\t_\t_\n7\tmillion\t_\tCD\tCD\t_\t5\tnum\t_\t_\n8\ton\t_\tIN\tIN\t_\t4\tprep\t_\t_\n9\tthe\t_\tDT\tDT\t_\t11\tdet\t_\t_\n10\thot\t_\tJJ\tJJ\t_\t11\tamod\t_\t_\n11\tline\t_\tNN\tNN\t_\t8\tpobj\t_\t_\n12\tthus\t_\tRB\tRB\t_\t13\tadvmod\t_\t_\n13\tfar\t_\tRB\tRB\t_\t4\tadvmod\t_\t_\n14\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tPicoplatin\t_\tNNP\tNNP\t_\t3\tnsubjpass\t_\t_\n2\tis\t_\tVBZ\tVBZ\t_\t3\tauxpass\t_\t_\n3\tdesigned\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n4\tto\t_\tTO\tTO\t_\t5\taux\t_\t_\n5\tovercome\t_\tVB\tVB\t_\t3\txcomp\t_\t_\n6\tplatinum\t_\tNN\tNN\t_\t7\tnn\t_\t_\n7\tresistance\t_\tNN\tNN\t_\t5\tdobj\t_\t_\n8\tassociated\t_\tVBN\tVBN\t_\t7\tpartmod\t_\t_\n9\twith\t_\tIN\tIN\t_\t8\tprep\t_\t_\n10\tchemotherapy\t_\tNN\tNN\t_\t9\tpobj\t_\t_\n11\tin\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tsolid\t_\tJJ\tJJ\t_\t13\tamod\t_\t_\n13\ttumors\t_\tNNS\tNNS\t_\t11\tpobj\t_\t_\n14\t,\t_\t,\t,\t_\t3\tpunct\t_\t_\n15\tand\t_\tCC\tCC\t_\t3\tcc\t_\t_\n16\tis\t_\tVBZ\tVBZ\t_\t18\taux\t_\t_\n17\tbeing\t_\tVBG\tVBG\t_\t18\tauxpass\t_\t_\n18\tstudied\t_\tVBN\tVBN\t_\t3\tconj\t_\t_\n19\tin\t_\tIN\tIN\t_\t18\tprep\t_\t_\n20\tmultiple\t_\tJJ\tJJ\t_\t22\tamod\t_\t_\n21\tcancer\t_\tNN\tNN\t_\t22\tnn\t_\t_\n22\tindications\t_\tNNS\tNNS\t_\t19\tpobj\t_\t_\n23\t,\t_\t,\t,\t_\t22\tpunct\t_\t_\n24\tcombinations\t_\tNNS\tNNS\t_\t22\tconj\t_\t_\n25\tand\t_\tCC\tCC\t_\t22\tcc\t_\t_\n26\tformulations\t_\tNNS\tNNS\t_\t22\tconj\t_\t_\n27\t.\t_\t.\t.\t_\t3\tpunct\t_\t_\n\n1\tOnly\t_\tRB\tRB\t_\t4\tadvmod\t_\t_\n2\tyou\t_\tPRP\tPRP\t_\t4\tnsubj\t_\t_\n3\tcan\t_\tMD\tMD\t_\t4\taux\t_\t_\n4\tdecide\t_\tVB\tVB\t_\t0\tROOT\t_\t_\n5\twhat\t_\tWP\tWP\t_\t7\tnsubj\t_\t_\n6\t's\t_\tVBZ\tVBZ\t_\t7\tcop\t_\t_\n7\timportant\t_\tJJ\tJJ\t_\t4\tccomp\t_\t_\n8\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tLt.\t_\tNNP\tNNP\t_\t4\tnn\t_\t_\n2\tCol.\t_\tNNP\tNNP\t_\t4\tnn\t_\t_\n3\tDavid\t_\tNNP\tNNP\t_\t4\tnn\t_\t_\n4\tAccetta\t_\tNNP\tNNP\t_\t14\tnsubj\t_\t_
\n5\t,\t_\t,\t,\t_\t4\tpunct\t_\t_\n6\tthe\t_\tDT\tDT\t_\t10\tdet\t_\t_\n7\ttop\t_\tJJ\tJJ\t_\t10\tamod\t_\t_\n8\tU.S.\t_\tNNP\tNNP\t_\t10\tnn\t_\t_\n9\tmilitary\t_\tJJ\tJJ\t_\t10\tamod\t_\t_\n10\tspokesman\t_\tNN\tNN\t_\t4\tappos\t_\t_\n11\tin\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tAfghanistan\t_\tNNP\tNNP\t_\t11\tpobj\t_\t_\n13\t,\t_\t,\t,\t_\t4\tpunct\t_\t_\n14\tsaid\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n15\the\t_\tPRP\tPRP\t_\t18\tnsubj\t_\t_\n16\tcould\t_\tMD\tMD\t_\t18\taux\t_\t_\n17\tnot\t_\tRB\tRB\t_\t18\tneg\t_\t_\n18\tconfirm\t_\tVB\tVB\t_\t14\tccomp\t_\t_\n19\tthe\t_\tDT\tDT\t_\t20\tdet\t_\t_\n20\treport\t_\tNN\tNN\t_\t18\tdobj\t_\t_\n21\t.\t_\t.\t.\t_\t14\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t3\tdet\t_\t_\n2\tfour\t_\tCD\tCD\t_\t3\tnum\t_\t_\n3\tteams\t_\tNNS\tNNS\t_\t14\tnsubj\t_\t_\n4\tthat\t_\tWDT\tWDT\t_\t6\tnsubj\t_\t_\n5\twill\t_\tMD\tMD\t_\t6\taux\t_\t_\n6\tplay\t_\tVB\tVB\t_\t3\trcmod\t_\t_\n7\tin\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\tthe\t_\tDT\tDT\t_\t9\tdet\t_\t_\n9\twomen\t_\tNNS\tNNS\t_\t11\tposs\t_\t_\n10\t's\t_\tPOS\tPOS\t_\t9\tpossessive\t_\t_\n11\ttournament\t_\tNN\tNN\t_\t7\tpobj\t_\t_\n12\tare\t_\tVBP\tVBP\t_\t14\tcop\t_\t_\n13\tAlaska\t_\tNNP\tNNP\t_\t14\tnn\t_\t_\n14\tAnchorage\t_\tNNP\tNNP\t_\t0\tROOT\t_\t_\n15\t,\t_\t,\t,\t_\t14\tpunct\t_\t_\n16\tCincinnati\t_\tNNP\tNNP\t_\t14\tconj\t_\t_\n17\t,\t_\t,\t,\t_\t14\tpunct\t_\t_\n18\tCoastal\t_\tNNP\tNNP\t_\t19\tnn\t_\t_\n19\tCarolina\t_\tNNP\tNNP\t_\t14\tconj\t_\t_\n20\tand\t_\tCC\tCC\t_\t14\tcc\t_\t_\n21\tWestern\t_\tNNP\tNNP\t_\t22\tnn\t_\t_\n22\tCarolina\t_\tNNP\tNNP\t_\t14\tconj\t_\t_\n23\t.\t_\t.\t.\t_\t14\tpunct\t_\t_\n\n1\tSpeaking\t_\tVBG\tVBG\t_\t8\tpartmod\t_\t_\n2\tto\t_\tTO\tTO\t_\t1\tprep\t_\t_\n3\treporters\t_\tNNS\tNNS\t_\t2\tpobj\t_\t_\n4\t,\t_\t,\t,\t_\t8\tpunct\t_\t_\n5\tshe\t_\tPRP\tPRP\t_\t8\tnsubj\t_\t_\n6\tdid\t_\tVBD\tVBD\t_\t8\taux\t_\t_\n7\tnot\t_\tRB\tRB\t_\t8\tneg\t_\t_\n8\trepeat\t_\tVB\tVB\t_\t0\tROOT\t_\t_\n9\ther\t_\tPRP$\tPRP$\t_\t10\tposs\t_\t_\n10\tdemand\t_\tNN\tNN\
t_\t8\tdobj\t_\t_\n11\tthat\t_\tIN\tIN\t_\t17\tmark\t_\t_\n12\ta\t_\tDT\tDT\t_\t15\tdet\t_\t_\n13\tnew\t_\tJJ\tJJ\t_\t15\tamod\t_\t_\n14\tgovernment-run\t_\tJJ\tJJ\t_\t15\tamod\t_\t_\n15\tplan\t_\tNN\tNN\t_\t17\tnsubj\t_\t_\n16\tbe\t_\tVB\tVB\t_\t17\tcop\t_\t_\n17\tpart\t_\tNN\tNN\t_\t10\tccomp\t_\t_\n18\tof\t_\tIN\tIN\t_\t17\tprep\t_\t_\n19\tthe\t_\tDT\tDT\t_\t21\tdet\t_\t_\n20\tfinal\t_\tJJ\tJJ\t_\t21\tamod\t_\t_\n21\tlegislation\t_\tNN\tNN\t_\t18\tpobj\t_\t_\n22\t.\t_\t.\t.\t_\t8\tpunct\t_\t_\n\n1\t'\t_\t''\t''\t_\t10\tpunct\t_\t_\n2\tBut\t_\tCC\tCC\t_\t10\tcc\t_\t_\n3\twith\t_\tIN\tIN\t_\t10\tprep\t_\t_\n4\tthe\t_\tDT\tDT\t_\t5\tdet\t_\t_\n5\thelp\t_\tNN\tNN\t_\t3\tpobj\t_\t_\n6\tof\t_\tIN\tIN\t_\t5\tprep\t_\t_\n7\tEnglish\t_\tNNP\tNNP\t_\t8\tnn\t_\t_\n8\tHeritage\t_\tNNP\tNNP\t_\t6\tpobj\t_\t_\n9\twe\t_\tPRP\tPRP\t_\t10\tnsubj\t_\t_\n10\trestored\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n11\tthem\t_\tPRP\tPRP\t_\t10\tdobj\t_\t_\n12\t.\t_\t.\t.\t_\t10\tpunct\t_\t_\n13\t'\t_\t''\t''\t_\t10\tpunct\t_\t_\n\n1\tMr\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tOubridge\t_\tNNP\tNNP\t_\t3\tnsubj\t_\t_\n3\tsaid\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n4\twhen\t_\tWRB\tWRB\t_\t8\tadvmod\t_\t_\n5\tthe\t_\tDT\tDT\t_\t7\tdet\t_\t_\n6\tfestival\t_\tNN\tNN\t_\t7\tnn\t_\t_\n7\tteam\t_\tNN\tNN\t_\t8\tnsubj\t_\t_\n8\tmet\t_\tVBD\tVBD\t_\t20\tadvcl\t_\t_\n9\tcouncil\t_\tNN\tNN\t_\t10\tnn\t_\t_\n10\tofficials\t_\tNNS\tNNS\t_\t8\tdobj\t_\t_\n11\tand\t_\tCC\tCC\t_\t10\tcc\t_\t_\n12\tthe\t_\tDT\tDT\t_\t13\tdet\t_\t_\n13\tpolice\t_\tNN\tNN\t_\t10\tconj\t_\t_\n14\ton\t_\tIN\tIN\t_\t8\tprep\t_\t_\n15\tThursday\t_\tNNP\tNNP\t_\t14\tpobj\t_\t_\n16\tthere\t_\tEX\tEX\t_\t20\texpl\t_\t_\n17\thad\t_\tVBD\tVBD\t_\t20\taux\t_\t_\n18\tbeen\t_\tVBN\tVBN\t_\t20\tcop\t_\t_\n19\tno\t_\tDT\tDT\t_\t20\tdet\t_\t_\n20\tmention\t_\tNN\tNN\t_\t3\tccomp\t_\t_\n21\tof\t_\tIN\tIN\t_\t20\tprep\t_\t_\n22\ta\t_\tDT\tDT\t_\t24\tdet\t_\t_\n23\tpotential\t_\tJJ\tJJ\t_\t24\tamod\t_\t_\n24\tinjunction\t_\tNN\tNN\t_\t21\tpobj\t_\t_\n25\t.\t_\t.\t.
\t_\t3\tpunct\t_\t_\n\n1\tA\t_\tDT\tDT\t_\t2\tdet\t_\t_\n2\tnumber\t_\tNN\tNN\t_\t6\tnsubj\t_\t_\n3\tof\t_\tIN\tIN\t_\t2\tprep\t_\t_\n4\tministers\t_\tNNS\tNNS\t_\t3\tpobj\t_\t_\n5\thave\t_\tVBP\tVBP\t_\t6\taux\t_\t_\n6\tleft\t_\tVBN\tVBN\t_\t0\tROOT\t_\t_\n7\tthe\t_\tDT\tDT\t_\t8\tdet\t_\t_\n8\tgovernment\t_\tNN\tNN\t_\t9\tnsubj\t_\t_\n9\tfacing\t_\tVBG\tVBG\t_\t6\tdep\t_\t_\n10\tquestions\t_\tNNS\tNNS\t_\t9\tdobj\t_\t_\n11\tover\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\ttheir\t_\tPRP$\tPRP$\t_\t13\tposs\t_\t_\n13\texpenses\t_\tNNS\tNNS\t_\t11\tpobj\t_\t_\n14\t,\t_\t,\t,\t_\t13\tpunct\t_\t_\n15\tincluding\t_\tVBG\tVBG\t_\t13\tprep\t_\t_\n16\tHazel\t_\tNNP\tNNP\t_\t17\tnn\t_\t_\n17\tBlears\t_\tNNP\tNNP\t_\t15\tpobj\t_\t_\n18\t,\t_\t,\t,\t_\t17\tpunct\t_\t_\n19\tthe\t_\tDT\tDT\t_\t22\tdet\t_\t_\n20\tformer\t_\tJJ\tJJ\t_\t22\tamod\t_\t_\n21\tcommunities\t_\tNNS\tNNS\t_\t22\tnn\t_\t_\n22\tsecretary\t_\tNN\tNN\t_\t17\tappos\t_\t_\n23\t;\t_\t:\t:\t_\t17\tpunct\t_\t_\n24\tJacqui\t_\tNNP\tNNP\t_\t25\tnn\t_\t_\n25\tSmith\t_\tNNP\tNNP\t_\t17\tconj\t_\t_\n26\t,\t_\t,\t,\t_\t25\tpunct\t_\t_\n27\tthe\t_\tDT\tDT\t_\t30\tdet\t_\t_\n28\tformer\t_\tJJ\tJJ\t_\t30\tamod\t_\t_\n29\thome\t_\tNN\tNN\t_\t30\tnn\t_\t_\n30\tsecretary\t_\tNN\tNN\t_\t25\tappos\t_\t_\n31\t;\t_\t:\t:\t_\t17\tpunct\t_\t_\n32\tand\t_\tCC\tCC\t_\t17\tcc\t_\t_\n33\tTony\t_\tNNP\tNNP\t_\t34\tnn\t_\t_\n34\tMcNulty\t_\tNNP\tNNP\t_\t17\tconj\t_\t_\n35\t,\t_\t,\t,\t_\t34\tpunct\t_\t_\n36\tthe\t_\tDT\tDT\t_\t39\tdet\t_\t_\n37\tformer\t_\tJJ\tJJ\t_\t39\tamod\t_\t_\n38\temployment\t_\tNN\tNN\t_\t39\tnn\t_\t_\n39\tminister\t_\tNN\tNN\t_\t34\tappos\t_\t_\n40\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n\n1\tAn\t_\tDT\tDT\t_\t4\tdet\t_\t_\n2\tenticingly\t_\tRB\tRB\t_\t3\tadvmod\t_\t_\n3\tbig\t_\tJJ\tJJ\t_\t4\tamod\t_\t_\n4\tbutton\t_\tNN\tNN\t_\t10\tnsubj\t_\t_\n5\tthat\t_\tWDT\tWDT\t_\t6\tnsubj\t_\t_\n6\tlooked\t_\tVBD\tVBD\t_\t4\trcmod\t_\t_\n7\tlike\t_\tIN\tIN\t_\t6\tprep\t_\t_\n8\ta\t_\tDT\tDT\t_\t9\tdet\t_\t_\n9\tlatch\t_\tNN\tNN\t_\t7\tpo
bj\t_\t_\n10\tturned\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n11\tout\t_\tRP\tRP\t_\t10\tprt\t_\t_\n12\tto\t_\tTO\tTO\t_\t15\taux\t_\t_\n13\tbe\t_\tVB\tVB\t_\t15\tcop\t_\t_\n14\ta\t_\tDT\tDT\t_\t15\tdet\t_\t_\n15\thinge\t_\tNN\tNN\t_\t10\txcomp\t_\t_\n16\t.\t_\t.\t.\t_\t10\tpunct\t_\t_\n\n1\tAfter\t_\tIN\tIN\t_\t8\tprep\t_\t_\n2\tan\t_\tDT\tDT\t_\t5\tdet\t_\t_\n3\toustanding\t_\tJJ\tJJ\t_\t5\tamod\t_\t_\n4\topening\t_\tNN\tNN\t_\t5\tnn\t_\t_\n5\tround\t_\tNN\tNN\t_\t1\tpobj\t_\t_\n6\t,\t_\t,\t,\t_\t8\tpunct\t_\t_\n7\tGarcia\t_\tNNP\tNNP\t_\t8\tnsubj\t_\t_\n8\tfound\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n9\thimself\t_\tPRP\tPRP\t_\t10\tnsubj\t_\t_\n10\ttied\t_\tVBD\tVBD\t_\t8\tccomp\t_\t_\n11\twith\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tthe\t_\tDT\tDT\t_\t14\tdet\t_\t_\n13\t50-year-old\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tLanger\t_\tNNP\tNNP\t_\t11\tpobj\t_\t_\n15\t,\t_\t,\t,\t_\t14\tpunct\t_\t_\n16\twho\t_\tWP\tWP\t_\t17\tnsubj\t_\t_\n17\tfired\t_\tVBD\tVBD\t_\t14\trcmod\t_\t_\n18\ta\t_\tDT\tDT\t_\t20\tdet\t_\t_\n19\tfive-under\t_\tJJ\tJJ\t_\t20\tamod\t_\t_\n20\t67\t_\tNN\tNN\t_\t17\tdobj\t_\t_\n21\tfollowing\t_\tVBG\tVBG\t_\t17\tprep\t_\t_\n22\this\t_\tPRP$\tPRP$\t_\t24\tposs\t_\t_\n23\tfirst-round\t_\tJJ\tJJ\t_\t24\tamod\t_\t_\n24\t72\t_\tCD\tCD\t_\t21\tpobj\t_\t_\n25\t.\t_\t.\t.\t_\t8\tpunct\t_\t_\n\n1\tWe\t_\tPRP\tPRP\t_\t2\tnsubj\t_\t_\n2\tmade\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n3\tmistakes\t_\tNNS\tNNS\t_\t2\tdobj\t_\t_\n4\tin\t_\tIN\tIN\t_\t2\tprep\t_\t_\n5\tthose\t_\tDT\tDT\t_\t6\tdet\t_\t_\n6\tgames\t_\tNNS\tNNS\t_\t4\tpobj\t_\t_\n7\tin\t_\tIN\tIN\t_\t2\tprep\t_\t_\n8\tthe\t_\tDT\tDT\t_\t10\tdet\t_\t_\n9\tlast\t_\tJJ\tJJ\t_\t10\tamod\t_\t_\n10\tminute\t_\tNN\tNN\t_\t7\tpobj\t_\t_\n11\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n12\tso\t_\tIN\tIN\t_\t16\tmark\t_\t_\n13\tit\t_\tPRP\tPRP\t_\t16\tnsubj\t_\t_\n14\t's\t_\tVBZ\tVBZ\t_\t16\tcop\t_\t_\n15\tour\t_\tPRP$\tPRP$\t_\t16\tposs\t_\t_\n16\tfault\t_\tNN\tNN\t_\t2\tadvcl\t_\t_\n17\tin\t_\tIN\tIN\t_\t16\tprep\t_\t_\n18\tthe\t_\tDT\tDT\t_\t19\tdet\
t_\t_\n19\tend\t_\tNN\tNN\t_\t17\tpobj\t_\t_\n20\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tThis\t_\tDT\tDT\t_\t3\tdet\t_\t_\n2\tlatest\t_\tJJS\tJJS\t_\t3\tamod\t_\t_\n3\tincident\t_\tNN\tNN\t_\t7\tnsubj\t_\t_\n4\tis\t_\tVBZ\tVBZ\t_\t7\tcop\t_\t_\n5\tthe\t_\tDT\tDT\t_\t7\tdet\t_\t_\n6\tsecond\t_\tJJ\tJJ\t_\t7\tamod\t_\t_\n7\ttime\t_\tNN\tNN\t_\t0\tROOT\t_\t_\n8\tin\t_\tIN\tIN\t_\t7\tprep\t_\t_\n9\tfour\t_\tCD\tCD\t_\t10\tnum\t_\t_\n10\tweeks\t_\tNNS\tNNS\t_\t8\tpobj\t_\t_\n11\tthe\t_\tDT\tDT\t_\t12\tdet\t_\t_\n12\tRevenue\t_\tNN\tNN\t_\t14\tnsubj\t_\t_\n13\thas\t_\tVBZ\tVBZ\t_\t14\taux\t_\t_\n14\tadmitted\t_\tVBN\tVBN\t_\t7\trcmod\t_\t_\n15\tlosing\t_\tVBG\tVBG\t_\t14\txcomp\t_\t_\n16\ttaxpayers\t_\tNNS\tNNS\t_\t18\tposs\t_\t_\n17\t'\t_\tPOS\tPOS\t_\t16\tpossessive\t_\t_\n18\tdetails\t_\tNNS\tNNS\t_\t15\tdobj\t_\t_\n19\t.\t_\t.\t.\t_\t7\tpunct\t_\t_\n\n1\tNebuAd\t_\tNNP\tNNP\t_\t2\tnsubj\t_\t_\n2\tconfirmed\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n3\tFriday\t_\tNNP\tNNP\t_\t2\ttmod\t_\t_\n4\tthat\t_\tIN\tIN\t_\t7\tmark\t_\t_\n5\tit\t_\tPRP\tPRP\t_\t7\tnsubj\t_\t_\n6\tis\t_\tVBZ\tVBZ\t_\t7\taux\t_\t_\n7\tpartnering\t_\tVBG\tVBG\t_\t2\tccomp\t_\t_\n8\twith\t_\tIN\tIN\t_\t7\tprep\t_\t_\n9\tCharter\t_\tNNP\tNNP\t_\t8\tpobj\t_\t_\n10\tbut\t_\tCC\tCC\t_\t2\tcc\t_\t_\n11\tdeclined\t_\tVBD\tVBD\t_\t2\tconj\t_\t_\n12\tfurther\t_\tJJ\tJJ\t_\t13\tamod\t_\t_\n13\tcomment\t_\tNN\tNN\t_\t11\tdobj\t_\t_\n14\t.\t_\t.\t.\t_\t2\tpunct\t_\t_\n\n1\tNeedless\t_\tJJ\tJJ\t_\t6\tccomp\t_\t_\n2\tto\t_\tTO\tTO\t_\t3\taux\t_\t_\n3\tsay\t_\tVB\tVB\t_\t1\txcomp\t_\t_\n4\t,\t_\t,\t,\t_\t6\tpunct\t_\t_\n5\tit\t_\tPRP\tPRP\t_\t6\tnsubj\t_\t_\n6\twasn\t_\tVBP\tVBP\t_\t0\tROOT\t_\t_\n7\t't\t_\tNN\tNN\t_\t6\tdobj\t_\t_\n8\tlong\t_\tRB\tRB\t_\t11\tadvmod\t_\t_\n9\tbefore\t_\tIN\tIN\t_\t11\tmark\t_\t_\n10\the\t_\tPRP\tPRP\t_\t11\tnsubj\t_\t_\n11\tsat\t_\tVBD\tVBD\t_\t6\tadvcl\t_\t_\n12\tdown\t_\tRP\tRP\t_\t11\tprt\t_\t_\n13\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n\n1\tFor\t_\tIN\tIN\t_\t18\tprep\t_\t_\n2\tJudy\t_\tNNP\tNNP\t
_\t3\tnn\t_\t_\n3\tJohn-Baptiste\t_\tNNP\tNNP\t_\t1\tpobj\t_\t_\n4\t,\t_\t,\t,\t_\t3\tpunct\t_\t_\n5\twho\t_\tWP\tWP\t_\t6\tnsubj\t_\t_\n6\truns\t_\tVBZ\tVBZ\t_\t3\trcmod\t_\t_\n7\tthe\t_\tDT\tDT\t_\t10\tdet\t_\t_\n8\tBasement\t_\tNNP\tNNP\t_\t10\tnn\t_\t_\n9\tDance\t_\tNNP\tNNP\t_\t10\tnn\t_\t_\n10\tStudio\t_\tNNP\tNNP\t_\t6\tdobj\t_\t_\n11\tin\t_\tIN\tIN\t_\t10\tprep\t_\t_\n12\tLondon\t_\tNNP\tNNP\t_\t11\tpobj\t_\t_\n13\t,\t_\t,\t,\t_\t18\tpunct\t_\t_\n14\tballet\t_\tNN\tNN\t_\t18\tnsubj\t_\t_\n15\tis\t_\tVBZ\tVBZ\t_\t18\tcop\t_\t_\n16\tthe\t_\tDT\tDT\t_\t18\tdet\t_\t_\n17\tmost\t_\tRBS\tRBS\t_\t18\tadvmod\t_\t_\n18\tpopular\t_\tJJ\tJJ\t_\t0\tROOT\t_\t_\n19\tof\t_\tIN\tIN\t_\t18\tprep\t_\t_\n20\tall\t_\tPDT\tPDT\t_\t22\tpredet\t_\t_\n21\tthe\t_\tDT\tDT\t_\t22\tdet\t_\t_\n22\tclasses\t_\tNNS\tNNS\t_\t19\tpobj\t_\t_\n23\tshe\t_\tPRP\tPRP\t_\t24\tnsubj\t_\t_\n24\toffers\t_\tVBZ\tVBZ\t_\t22\trcmod\t_\t_\n25\t.\t_\t.\t.\t_\t18\tpunct\t_\t_\n\n1\tRuss\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tDixon\t_\tNNP\tNNP\t_\t7\tnsubj\t_\t_\n3\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n4\tan\t_\tDT\tDT\t_\t5\tdet\t_\t_\n5\tinfielder\t_\tNN\tNN\t_\t2\tappos\t_\t_\n6\t,\t_\t,\t,\t_\t2\tpunct\t_\t_\n7\thomered\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n8\tto\t_\tTO\tTO\t_\t7\tprep\t_\t_\n9\tright\t_\tNN\tNN\t_\t8\tpobj\t_\t_\n10\t,\t_\t,\t,\t_\t7\tpunct\t_\t_\n11\tthen\t_\tRB\tRB\t_\t7\tadvmod\t_\t_\n12\tsheepishly\t_\tRB\tRB\t_\t13\tadvmod\t_\t_\n13\tput\t_\tVBD\tVBD\t_\t7\tdep\t_\t_\n14\this\t_\tPRP$\tPRP$\t_\t15\tposs\t_\t_\n15\thead\t_\tNN\tNN\t_\t13\tdobj\t_\t_\n16\tdown\t_\tRP\tRP\t_\t13\tprt\t_\t_\n17\tto\t_\tTO\tTO\t_\t18\taux\t_\t_\n18\tavoid\t_\tVB\tVB\t_\t13\txcomp\t_\t_\n19\teye\t_\tNN\tNN\t_\t20\tnn\t_\t_\n20\tcontact\t_\tNN\tNN\t_\t18\tdobj\t_\t_\n21\twith\t_\tIN\tIN\t_\t20\tprep\t_\t_\n22\tthe\t_\tDT\tDT\t_\t23\tdet\t_\t_\n23\tpitcher\t_\tNN\tNN\t_\t21\tpobj\t_\t_\n24\t.\t_\t.\t.\t_\t7\tpunct\t_\t_\n\n1\tMr.\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tGore\t_\tNNP\tNNP\t_\t3\tnsubj\t_\t_\n3\twas\t_\tVBD\tVBD\t
_\t0\tROOT\t_\t_\n4\tnot\t_\tRB\tRB\t_\t3\tneg\t_\t_\n5\there\t_\tRB\tRB\t_\t3\tadvmod\t_\t_\n6\t,\t_\t,\t,\t_\t3\tpunct\t_\t_\n7\tbut\t_\tCC\tCC\t_\t3\tcc\t_\t_\n8\this\t_\tPRP$\tPRP$\t_\t9\tposs\t_\t_\n9\tname\t_\tNN\tNN\t_\t10\tnsubj\t_\t_\n10\tcame\t_\tVBD\tVBD\t_\t3\tconj\t_\t_\n11\tup\t_\tRP\tRP\t_\t10\tprt\t_\t_\n12\tfrequently\t_\tRB\tRB\t_\t10\tadvmod\t_\t_\n13\t.\t_\t.\t.\t_\t3\tpunct\t_\t_\n\n1\tThe\t_\tDT\tDT\t_\t2\tdet\t_\t_\n2\tlawsuit\t_\tNN\tNN\t_\t4\tnsubj\t_\t_\n3\talso\t_\tRB\tRB\t_\t4\tadvmod\t_\t_\n4\tnames\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n5\tthe\t_\tDT\tDT\t_\t7\tdet\t_\t_\n6\tshopping\t_\tNN\tNN\t_\t7\tnn\t_\t_\n7\tmall\t_\tNN\tNN\t_\t4\tdobj\t_\t_\n8\twhere\t_\tWRB\tWRB\t_\t11\tadvmod\t_\t_\n9\tthe\t_\tDT\tDT\t_\t10\tdet\t_\t_\n10\tincident\t_\tNN\tNN\t_\t11\tnsubj\t_\t_\n11\toccurred\t_\tVBD\tVBD\t_\t7\trcmod\t_\t_\n12\tand\t_\tCC\tCC\t_\t7\tcc\t_\t_\n13\tthe\t_\tDT\tDT\t_\t15\tdet\t_\t_\n14\tsecurity\t_\tNN\tNN\t_\t15\tnn\t_\t_\n15\tcompany\t_\tNN\tNN\t_\t7\tconj\t_\t_\n16\temployed\t_\tVBN\tVBN\t_\t15\tpartmod\t_\t_\n17\tby\t_\tIN\tIN\t_\t16\tprep\t_\t_\n18\tWal-Mart\t_\tNNP\tNNP\t_\t17\tpobj\t_\t_\n19\t.\t_\t.\t.\t_\t4\tpunct\t_\t_\n\n1\tRudy\t_\tNNP\tNNP\t_\t2\tnn\t_\t_\n2\tCrutchfield\t_\tNNP\tNNP\t_\t9\tnsubj\t_\t_\n3\tand\t_\tCC\tCC\t_\t2\tcc\t_\t_\n4\tSteve\t_\tNNP\tNNP\t_\t5\tnn\t_\t_\n5\tHadeed\t_\tNNP\tNNP\t_\t2\tconj\t_\t_\n6\thave\t_\tVBP\tVBP\t_\t9\taux\t_\t_\n7\tbeen\t_\tVBN\tVBN\t_\t9\tcop\t_\t_\n8\tclose\t_\tJJ\tJJ\t_\t9\tamod\t_\t_\n9\tfriends\t_\tNNS\tNNS\t_\t0\tROOT\t_\t_\n10\tsince\t_\tIN\tIN\t_\t9\tprep\t_\t_\n11\ttheir\t_\tPRP$\tPRP$\t_\t12\tposs\t_\t_\n12\tdays\t_\tNNS\tNNS\t_\t10\tpobj\t_\t_\n13\tat\t_\tIN\tIN\t_\t12\tprep\t_\t_\n14\tWheaton\t_\tNNP\tNNP\t_\t16\tnn\t_\t_\n15\tHigh\t_\tNNP\tNNP\t_\t16\tnn\t_\t_\n16\tSchool\t_\tNNP\tNNP\t_\t13\tpobj\t_\t_\n17\t.\t_\t.\t.\t_\t9\tpunct\t_\t_\n\n1\tEarlier\t_\tRBR\tRBR\t_\t3\tadvmod\t_\t_\n2\tthis\t_\tDT\tDT\t_\t3\tdet\t_\t_\n3\tmonth\t_\tNN\tNN\t_\t6\ttmod\t_\t_\n4\t,\t_\
t,\t,\t_\t6\tpunct\t_\t_\n5\tGM\t_\tNNP\tNNP\t_\t6\tnsubj\t_\t_\n6\tannounced\t_\tVBD\tVBD\t_\t0\tROOT\t_\t_\n7\tplans\t_\tNNS\tNNS\t_\t6\tdobj\t_\t_\n8\tto\t_\tTO\tTO\t_\t9\taux\t_\t_\n9\tsell\t_\tVB\tVB\t_\t7\tinfmod\t_\t_\n10\tHummer\t_\tNNP\tNNP\t_\t9\tdobj\t_\t_\n11\tto\t_\tTO\tTO\t_\t9\tprep\t_\t_\n12\ta\t_\tDT\tDT\t_\t14\tdet\t_\t_\n13\tChinese\t_\tJJ\tJJ\t_\t14\tamod\t_\t_\n14\tmanufacturer\t_\tNN\tNN\t_\t11\tpobj\t_\t_\n15\tand\t_\tCC\tCC\t_\t14\tcc\t_\t_\n16\tSaturn\t_\tNNP\tNNP\t_\t14\tconj\t_\t_\n17\tto\t_\tTO\tTO\t_\t9\tprep\t_\t_\n18\tMichigan-based\t_\tJJ\tJJ\t_\t24\tamod\t_\t_\n19\tdealership\t_\tNN\tNN\t_\t24\tnn\t_\t_\n20\tchain\t_\tNN\tNN\t_\t24\tnn\t_\t_\n21\tPenske\t_\tNNP\tNNP\t_\t24\tnn\t_\t_\n22\tAutomotive\t_\tNNP\tNNP\t_\t24\tnn\t_\t_\n23\tGroup\t_\tNNP\tNNP\t_\t24\tnn\t_\t_\n24\tInc\t_\tNNP\tNNP\t_\t17\tpobj\t_\t_\n25\t.\t_\t.\t.\t_\t6\tpunct\t_\t_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/text_formats.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include <memory>\n#include <string>\n#include <vector>\n\n#include \"syntaxnet/document_format.h\"\n#include \"syntaxnet/segmenter_utils.h\"\n#include \"syntaxnet/sentence.pb.h\"\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/lib/io/buffered_inputstream.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/lib/strings/stringprintf.h\"\n#include \"tensorflow/core/platform/regexp.h\"\n\nnamespace syntaxnet {\n\n// CoNLL document format reader for dependency annotated corpora.\n// The expected format is described e.g. at http://ilk.uvt.nl/conll/#dataformat\n//\n// Data should adhere to the following rules:\n//   - Data files contain sentences separated by a blank line.\n//   - A sentence consists of one or tokens, each one starting on a new line.\n//   - A token consists of ten fields described in the table below.\n//   - Fields are separated by a single tab character.\n//   - All data files will contains these ten fields, although only the ID\n//     column is required to contain non-dummy (i.e. 
non-underscore) values.\n// Data files should be UTF-8 encoded (Unicode).\n//\n// Fields:\n// 1  ID:      Token counter, starting at 1 for each new sentence and increasing\n//             by 1 for every new token.\n// 2  FORM:    Word form or punctuation symbol.\n// 3  LEMMA:   Lemma or stem.\n// 4  CPOSTAG: Coarse-grained part-of-speech tag or category.\n// 5  POSTAG:  Fine-grained part-of-speech tag. Note that the same POS tag\n//             cannot appear with multiple coarse-grained POS tags.\n// 6  FEATS:   Unordered set of syntactic and/or morphological features.\n// 7  HEAD:    Head of the current token, which is either a value of ID or '0'.\n// 8  DEPREL:  Dependency relation to the HEAD.\n// 9  PHEAD:   Projective head of current token.\n// 10 PDEPREL: Dependency relation to the PHEAD.\n//\n// This CoNLL reader is compatible with the CoNLL-U format described at\n//   http://universaldependencies.org/format.html\n// Note that this reader skips CoNLL-U multiword tokens and ignores the last two\n// fields of every line, which are PHEAD and PDEPREL in CoNLL format, but are\n// replaced by DEPS and MISC in CoNLL-U.\n//\nclass CoNLLSyntaxFormat : public DocumentFormat {\n public:\n  CoNLLSyntaxFormat() {}\n\n  void Setup(TaskContext *context) override {\n    join_category_to_pos_ = context->GetBoolParameter(\"join_category_to_pos\");\n    add_pos_as_attribute_ = context->GetBoolParameter(\"add_pos_as_attribute\");\n  }\n\n  // Reads up to the first empty line and returns false if end of file is\n  // reached.\n  bool ReadRecord(tensorflow::io::BufferedInputStream *buffer,\n                  string *record) override {\n    string line;\n    record->clear();\n    tensorflow::Status status = buffer->ReadLine(&line);\n    while (!line.empty() && status.ok()) {\n      tensorflow::strings::StrAppend(record, line, \"\\n\");\n      status = buffer->ReadLine(&line);\n    }\n    return status.ok() || !record->empty();\n  }\n\n  void ConvertFromString(const string &key, const 
string &value,\n                         vector<Sentence *> *sentences) override {\n    // Create new sentence.\n    Sentence *sentence = new Sentence();\n\n    // Each line corresponds to one token.\n    string text;\n    vector<string> lines = utils::Split(value, '\\n');\n\n    // Add each token to the sentence.\n    vector<string> fields;\n    int expected_id = 1;\n    for (size_t i = 0; i < lines.size(); ++i) {\n      // Split line into tab-separated fields.\n      fields.clear();\n      fields = utils::Split(lines[i], '\\t');\n      if (fields.empty()) continue;\n\n      // Skip comment lines.\n      if (fields[0][0] == '#') continue;\n\n      // Skip CoNLLU lines for multiword tokens which are indicated by\n      // hyphenated line numbers, e.g., \"2-4\".\n      // http://universaldependencies.github.io/docs/format.html\n      if (RE2::FullMatch(fields[0], \"[0-9]+-[0-9]+\")) continue;\n\n      // Clear all optional fields equal to '_'.\n      for (size_t j = 2; j < fields.size(); ++j) {\n        if (fields[j].length() == 1 && fields[j][0] == '_') fields[j].clear();\n      }\n\n      // Check that the line is valid.\n      CHECK_GE(fields.size(), 8)\n          << \"Every line has to have at least 8 tab separated fields.\";\n\n      // Check that the ids follow the expected format.\n      const int id = utils::ParseUsing<int>(fields[0], 0, utils::ParseInt32);\n      CHECK_EQ(expected_id++, id)\n          << \"Token ids start at 1 for each new sentence and increase by 1 \"\n          << \"on each new token. 
Sentences are separated by an empty line.\";\n\n      // Get relevant fields.\n      const string &word = fields[1];\n      const string &cpostag = fields[3];\n      const string &tag = fields[4];\n      const string &attributes = fields[5];\n      const int head = utils::ParseUsing<int>(fields[6], 0, utils::ParseInt32);\n      const string &label = fields[7];\n\n      // Add token to sentence text.\n      if (!text.empty()) text.append(\" \");\n      const int start = text.size();\n      const int end = start + word.size() - 1;\n      text.append(word);\n\n      // Add token to sentence.\n      Token *token = sentence->add_token();\n      token->set_word(word);\n      token->set_start(start);\n      token->set_end(end);\n      if (head > 0) token->set_head(head - 1);\n      if (!tag.empty()) token->set_tag(tag);\n      if (!cpostag.empty()) token->set_category(cpostag);\n      if (!label.empty()) token->set_label(label);\n      if (!attributes.empty()) AddMorphAttributes(attributes, token);\n      if (join_category_to_pos_) JoinCategoryToPos(token);\n      if (add_pos_as_attribute_) AddPosAsAttribute(token);\n    }\n\n    if (sentence->token_size() > 0) {\n      sentence->set_docid(key);\n      sentence->set_text(text);\n      sentences->push_back(sentence);\n    } else {\n      // If the sentence was empty (e.g., blank lines at the beginning of a\n      // file), then don't save it.\n      delete sentence;\n    }\n  }\n\n  // Converts a sentence to a key/value pair.\n  void ConvertToString(const Sentence &sentence, string *key,\n                       string *value) override {\n    *key = sentence.docid();\n    vector<string> lines;\n    for (int i = 0; i < sentence.token_size(); ++i) {\n      Token token = sentence.token(i);\n      if (join_category_to_pos_) SplitCategoryFromPos(&token);\n      if (add_pos_as_attribute_) RemovePosFromAttributes(&token);\n      vector<string> fields(10);\n      fields[0] = tensorflow::strings::Printf(\"%d\", i + 1);\n      
fields[1] = UnderscoreIfEmpty(token.word());\n      fields[2] = \"_\";\n      fields[3] = UnderscoreIfEmpty(token.category());\n      fields[4] = UnderscoreIfEmpty(token.tag());\n      fields[5] = GetMorphAttributes(token);\n      fields[6] = tensorflow::strings::Printf(\"%d\", token.head() + 1);\n      fields[7] = UnderscoreIfEmpty(token.label());\n      fields[8] = \"_\";\n      fields[9] = \"_\";\n      lines.push_back(utils::Join(fields, \"\\t\"));\n    }\n    *value = tensorflow::strings::StrCat(utils::Join(lines, \"\\n\"), \"\\n\\n\");\n  }\n\n private:\n  // Replaces empty fields with an underscore.\n  string UnderscoreIfEmpty(const string &field) {\n    return field.empty() ? \"_\" : field;\n  }\n\n  // Creates a TokenMorphology object out of a list of attribute values of the\n  // form: a1=v1|a2=v2|... or v1|v2|...\n  void AddMorphAttributes(const string &attributes, Token *token) {\n    TokenMorphology *morph =\n        token->MutableExtension(TokenMorphology::morphology);\n    vector<string> att_vals = utils::Split(attributes, '|');\n    for (int i = 0; i < att_vals.size(); ++i) {\n      vector<string> att_val = utils::SplitOne(att_vals[i], '=');\n\n      // Format is either:\n      //   1) a1=v1|a2=v2..., e.g., Czech CoNLL data, or,\n      //   2) v1|v2|..., e.g., German CoNLL data.\n      const pair<string, string> name_value =\n          att_val.size() == 2 ? 
std::make_pair(att_val[0], att_val[1])\n                              : std::make_pair(att_val[0], \"on\");\n\n      // We currently don't expect an empty attribute value, but might have an\n      // empty attribute name due to data input errors.\n      if (name_value.second.empty()) {\n        LOG(WARNING) << \"Invalid attributes string: \" << attributes\n                     << \" for token: \" << token->ShortDebugString();\n        continue;\n      }\n      if (!name_value.first.empty()) {\n        TokenMorphology::Attribute *attribute = morph->add_attribute();\n        attribute->set_name(name_value.first);\n        attribute->set_value(name_value.second);\n      }\n    }\n  }\n\n  // Creates a list of attribute values of the form a1=v1|a2=v2|... or v1|v2|...\n  // from a TokenMorphology object.\n  string GetMorphAttributes(const Token &token) {\n    const TokenMorphology &morph =\n        token.GetExtension(TokenMorphology::morphology);\n    if (morph.attribute_size() == 0) return \"_\";\n    string attributes;\n    for (const TokenMorphology::Attribute &attribute : morph.attribute()) {\n      if (!attributes.empty()) tensorflow::strings::StrAppend(&attributes, \"|\");\n      tensorflow::strings::StrAppend(&attributes, attribute.name());\n      if (attribute.value() != \"on\") {\n        tensorflow::strings::StrAppend(&attributes, \"=\", attribute.value());\n      }\n    }\n    return attributes;\n  }\n\n  void JoinCategoryToPos(Token *token) {\n    token->set_tag(\n        tensorflow::strings::StrCat(token->category(), \"++\", token->tag()));\n    token->clear_category();\n  }\n\n  void SplitCategoryFromPos(Token *token) {\n    const string &tag = token->tag();\n    const size_t pos = tag.find(\"++\");\n    if (pos != string::npos) {\n      token->set_category(tag.substr(0, pos));\n      token->set_tag(tag.substr(pos + 2));\n    }\n  }\n\n  void AddPosAsAttribute(Token *token) {\n    if (!token->tag().empty()) {\n      TokenMorphology *morph =\n          
token->MutableExtension(TokenMorphology::morphology);\n      TokenMorphology::Attribute *attribute = morph->add_attribute();\n      attribute->set_name(\"fPOS\");\n      attribute->set_value(token->tag());\n    }\n  }\n\n  void RemovePosFromAttributes(Token *token) {\n    // Assumes the \"fPOS\" attribute, if present, is the last one.\n    TokenMorphology *morph =\n        token->MutableExtension(TokenMorphology::morphology);\n    if (morph->attribute_size() > 0 &&\n        morph->attribute().rbegin()->name() == \"fPOS\") {\n      morph->mutable_attribute()->RemoveLast();\n    }\n  }\n\n  bool join_category_to_pos_ = false;\n  bool add_pos_as_attribute_ = false;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(CoNLLSyntaxFormat);\n};\n\nREGISTER_DOCUMENT_FORMAT(\"conll-sentence\", CoNLLSyntaxFormat);\n\n// Reader for segmentation training data format. This reader assumes the input\n// format is similar to CoNLL format but with only two fields:\n//\n// Fields:\n// 1  FORM:        Word form or punctuation symbol.\n// 2  SPACE FLAG:  Either 'SPACE' or 'NO_SPACE', indicating whether\n//                 there should be a space between this word and the next one in\n//                 the raw text.\n//\n// Examples:\n// To create a training example for a sentence with raw text:\n//   That's a good point.\n// and the corresponding gold segmentation:\n//   That 's a good point .\n// Then the correct input is:\n// That\tNO_SPACE\n// 's\tSPACE\n// a\tSPACE\n// good\tSPACE\n// point\tNO_SPACE\n// .\tNO_SPACE\n//\n// Yet another example:\n// To create a training example for a sentence with raw text:\n//   这是一个测试\n// and the corresponding gold segmentation:\n//   这 是 一 个 测试\n// Then the correct input is:\n// 这\tNO_SPACE\n// 是\tNO_SPACE\n// 一\tNO_SPACE\n// 个\tNO_SPACE\n// 测试\tNO_SPACE\nclass SegmentationTrainingDataFormat : public CoNLLSyntaxFormat {\n public:\n  // Converts to segmentation training data by breaking the words in the input\n  // tokens into utf8 character-based tokens. 
Moreover, if a character is the\n  // first char of the word in the original token, then its break level is set\n  // to SPACE_BREAK to indicate that the corresponding gold transition for that\n  // character token is START. Otherwise NO_BREAK to indicate MERGE.\n  void ConvertFromString(const string &key, const string &value,\n                         vector<Sentence *> *sentences) override {\n    // Create new sentence.\n    Sentence *sentence = new Sentence();\n\n    // Each line corresponds to one token.\n    string text;\n    vector<string> lines = utils::Split(value, '\\n');\n\n    // Add each token to the sentence.\n    vector<string> fields;\n    for (size_t i = 0; i < lines.size(); ++i) {\n      // Split line into tab-separated fields.\n      fields.clear();\n      fields = utils::Split(lines[i], '\\t');\n      if (fields.empty()) continue;\n\n      // Skip comment lines.\n      if (fields[0][0] == '#') continue;\n\n      // Check that the line is valid.\n      CHECK_GE(fields.size(), 2)\n          << \"Every line has to have at least 2 tab separated fields.\";\n\n      // Get relevant fields.\n      const string &word = fields[0];\n      CHECK(fields[1] == \"SPACE\" || fields[1] == \"NO_SPACE\")\n          << \"The space field can only be either 'SPACE' or 'NO_SPACE'\";\n      const bool space_after = fields[1] == \"SPACE\";\n\n      // Add token to sentence text.\n      int start = text.size();\n      text.append(word);\n      if (space_after && i != lines.size() - 1) {\n        text.append(\" \");\n      }\n\n      // Add character-based token to sentence.\n      vector<tensorflow::StringPiece> chars;\n      SegmenterUtils::GetUTF8Chars(word, &chars);\n      bool is_first_char = true;\n      for (auto utf8char : chars) {\n        Token *char_token = sentence->add_token();\n        char_token->set_word(utf8char.ToString());\n        char_token->set_start(start);\n        start += char_token->word().size();\n        char_token->set_end(start - 1);\n       
 char_token->set_break_level(\n            is_first_char ? Token::SPACE_BREAK : Token::NO_BREAK);\n        is_first_char = false;\n      }\n\n      // Add another space token.\n      if (space_after) {\n        Token *char_token = sentence->add_token();\n        char_token->set_word(\" \");\n        char_token->set_start(start);\n        char_token->set_end(start);\n        char_token->set_break_level(Token::SPACE_BREAK);\n      }\n    }\n\n    if (sentence->token_size() > 0) {\n      sentence->set_docid(key);\n      sentence->set_text(text);\n      sentences->push_back(sentence);\n    } else {\n      // If the sentence was empty (e.g., blank lines at the beginning of a\n      // file), then don't save it.\n      delete sentence;\n    }\n  }\n};\n\nREGISTER_DOCUMENT_FORMAT(\"segment-train-data\", SegmentationTrainingDataFormat);\n\n// Reader for tokenized text. This reader expects every sentence to be on a\n// single line and tokens on that line to be separated by single spaces.\n//\nclass TokenizedTextFormat : public DocumentFormat {\n public:\n  TokenizedTextFormat() {}\n\n  // Reads a line and returns false if end of file is reached.\n  bool ReadRecord(tensorflow::io::BufferedInputStream *buffer,\n                  string *record) override {\n    return buffer->ReadLine(record).ok();\n  }\n\n  void ConvertFromString(const string &key, const string &value,\n                         vector<Sentence *> *sentences) override {\n    Sentence *sentence = new Sentence();\n    string text;\n    for (const string &word : utils::Split(value, ' ')) {\n      if (word.empty()) continue;\n      const int start = text.size();\n      const int end = start + word.size() - 1;\n      if (!text.empty()) text.append(\" \");\n      text.append(word);\n      Token *token = sentence->add_token();\n      token->set_word(word);\n      token->set_start(start);\n      token->set_end(end);\n    }\n\n    if (sentence->token_size() > 0) {\n      sentence->set_docid(key);\n      
sentence->set_text(text);\n      sentences->push_back(sentence);\n    } else {\n      // If the sentence was empty (e.g., blank lines at the beginning of a\n      // file), then don't save it.\n      delete sentence;\n    }\n  }\n\n  void ConvertToString(const Sentence &sentence, string *key,\n                       string *value) override {\n    *key = sentence.docid();\n    value->clear();\n    for (const Token &token : sentence.token()) {\n      if (!value->empty()) value->append(\" \");\n      value->append(token.word());\n      if (token.has_tag()) {\n        value->append(\"_\");\n        value->append(token.tag());\n      }\n      if (token.has_head()) {\n        value->append(\"_\");\n        value->append(tensorflow::strings::StrCat(token.head()));\n      }\n    }\n    value->append(\"\\n\");\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(TokenizedTextFormat);\n};\n\nREGISTER_DOCUMENT_FORMAT(\"tokenized-text\", TokenizedTextFormat);\n\n// Reader for un-tokenized text. This reader expects every sentence to be on a\n// single line. 
For each line in the input, a sentence proto will be created,\n// where tokens are utf8 characters of that line.\n//\nclass UntokenizedTextFormat : public TokenizedTextFormat {\n public:\n  UntokenizedTextFormat() {}\n\n  void ConvertFromString(const string &key, const string &value,\n                         vector<Sentence *> *sentences) override {\n    Sentence *sentence = new Sentence();\n    vector<tensorflow::StringPiece> chars;\n    SegmenterUtils::GetUTF8Chars(value, &chars);\n    int start = 0;\n    for (auto utf8char : chars) {\n      Token *token = sentence->add_token();\n      token->set_word(utf8char.ToString());\n      token->set_start(start);\n      start += utf8char.size();\n      token->set_end(start - 1);\n    }\n\n    if (sentence->token_size() > 0) {\n      sentence->set_docid(key);\n      sentence->set_text(value);\n      sentences->push_back(sentence);\n    } else {\n      // If the sentence was empty (e.g., blank lines at the beginning of a\n      // file), then don't save it.\n      delete sentence;\n    }\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(UntokenizedTextFormat);\n};\n\nREGISTER_DOCUMENT_FORMAT(\"untokenized-text\", UntokenizedTextFormat);\n\n// Text reader that attempts to perform Penn Treebank tokenization on arbitrary\n// raw text. 
Adapted from https://www.cis.upenn.edu/~treebank/tokenizer.sed\n// by Robert MacIntyre, University of Pennsylvania, late 1995.\n// Expected input: raw text with one sentence per line.\n//\nclass EnglishTextFormat : public TokenizedTextFormat {\n public:\n  EnglishTextFormat() {}\n\n  void ConvertFromString(const string &key, const string &value,\n                         vector<Sentence *> *sentences) override {\n    vector<pair<string, string>> preproc_rules = {\n        // Punctuation.\n        {\"’\", \"'\"},\n        {\"…\", \"...\"},\n        {\"---\", \"--\"},\n        {\"—\", \"--\"},\n        {\"–\", \"--\"},\n        {\"，\", \",\"},\n        {\"。\", \".\"},\n        {\"！\", \"!\"},\n        {\"？\", \"?\"},\n        {\"：\", \":\"},\n        {\"；\", \";\"},\n        {\"＆\", \"&\"},\n\n        // Brackets.\n        {\"\\\\[\", \"(\"},\n        {\"]\", \")\"},\n        {\"{\", \"(\"},\n        {\"}\", \")\"},\n        {\"【\", \"(\"},\n        {\"】\", \")\"},\n        {\"（\", \"(\"},\n        {\"）\", \")\"},\n\n        // Quotation marks.\n        {\"\\\"\", \"\\\"\"},\n        {\"″\", \"\\\"\"},\n        {\"“\", \"\\\"\"},\n        {\"„\", \"\\\"\"},\n        {\"‵‵\", \"\\\"\"},\n        {\"”\", \"\\\"\"},\n        {\"’\", \"\\\"\"},\n        {\"‘\", \"\\\"\"},\n        {\"′′\", \"\\\"\"},\n        {\"‹\", \"\\\"\"},\n        {\"›\", \"\\\"\"},\n        {\"«\", \"\\\"\"},\n        {\"»\", \"\\\"\"},\n\n        // Discarded punctuation that breaks sentences.\n        {\"|\", \"\"},\n        {\"·\", \"\"},\n        {\"•\", \"\"},\n        {\"●\", \"\"},\n        {\"▪\", \"\"},\n        {\"■\", \"\"},\n        {\"□\", \"\"},\n        {\"❑\", \"\"},\n        {\"◆\", \"\"},\n        {\"★\", \"\"},\n        {\"＊\", \"\"},\n        {\"♦\", \"\"},\n    };\n\n    vector<pair<string, string>> rules = {\n        // attempt to get correct directional quotes\n        {R\"re(^\")re\", \"`` \"},\n        {R\"re(([ \\([{<])\")re\", \"\\\\1 `` \"},\n        // close quotes 
handled at end\n\n        {R\"re(\\.\\.\\.)re\", \" ... \"},\n        {\"[,;:@#$%&]\", \" \\\\0 \"},\n\n        // Assume sentence tokenization has been done first, so split FINAL\n        // periods only.\n        {R\"re(([^.])(\\.)([\\]\\)}>\"']*)[ ]*$)re\", \"\\\\1 \\\\2\\\\3 \"},\n        // however, we may as well split ALL question marks and exclamation\n        // points, since they shouldn't have the abbrev.-marker ambiguity\n        // problem\n        {\"[?!]\", \" \\\\0 \"},\n\n        // parentheses, brackets, etc.\n        {R\"re([\\]\\[\\(\\){}<>])re\", \" \\\\0 \"},\n\n        // Like Adwait Ratnaparkhi's MXPOST, we use the parsed-file version of\n        // these symbols.\n        {\"\\\\(\", \"-LRB-\"},\n        {\"\\\\)\", \"-RRB-\"},\n        {\"\\\\[\", \"-LSB-\"},\n        {\"\\\\]\", \"-RSB-\"},\n        {\"{\", \"-LCB-\"},\n        {\"}\", \"-RCB-\"},\n\n        {\"--\", \" -- \"},\n\n        // First off, add a space to the beginning and end of each line, to\n        // reduce necessary number of regexps.\n        {\"$\", \" \"},\n        {\"^\", \" \"},\n\n        {\"\\\"\", \" '' \"},\n        // possessive or close-single-quote\n        {\"([^'])' \", \"\\\\1 ' \"},\n        // as in it's, I'm, we'd\n        {\"'([sSmMdD]) \", \" '\\\\1 \"},\n        {\"'ll \", \" 'll \"},\n        {\"'re \", \" 're \"},\n        {\"'ve \", \" 've \"},\n        {\"n't \", \" n't \"},\n        {\"'LL \", \" 'LL \"},\n        {\"'RE \", \" 'RE \"},\n        {\"'VE \", \" 'VE \"},\n        {\"N'T \", \" N'T \"},\n\n        {\" ([Cc])annot \", \" \\\\1an not \"},\n        {\" ([Dd])'ye \", \" \\\\1' ye \"},\n        {\" ([Gg])imme \", \" \\\\1im me \"},\n        {\" ([Gg])onna \", \" \\\\1on na \"},\n        {\" ([Gg])otta \", \" \\\\1ot ta \"},\n        {\" ([Ll])emme \", \" \\\\1em me \"},\n        {\" ([Mm])ore'n \", \" \\\\1ore 'n \"},\n        {\" '([Tt])is \", \" '\\\\1 is \"},\n        {\" '([Tt])was \", \" '\\\\1 was \"},\n        {\" ([Ww])anna \", 
\" \\\\1an na \"},\n        {\" ([Ww])haddya \", \" \\\\1ha dd ya \"},\n        {\" ([Ww])hatcha \", \" \\\\1ha t cha \"},\n\n        // clean out extra spaces\n        {\"  *\", \" \"},\n        {\"^ *\", \"\"},\n    };\n\n    string rewritten = value;\n    for (const pair<string, string> &rule : preproc_rules) {\n      RE2::GlobalReplace(&rewritten, rule.first, rule.second);\n    }\n    for (const pair<string, string> &rule : rules) {\n      RE2::GlobalReplace(&rewritten, rule.first, rule.second);\n    }\n    TokenizedTextFormat::ConvertFromString(key, rewritten, sentences);\n  }\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(EnglishTextFormat);\n};\n\nREGISTER_DOCUMENT_FORMAT(\"english-text\", EnglishTextFormat);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/text_formats_test.py",
    "content": "# coding=utf-8\n# Copyright 2016 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Tests for english_tokenizer.\"\"\"\n\n\n# disable=no-name-in-module,unused-import,g-bad-import-order,maybe-no-member\nimport os.path\nimport tensorflow as tf\n\nimport syntaxnet.load_parser_ops\n\nfrom tensorflow.python.framework import test_util\nfrom tensorflow.python.platform import googletest\nfrom tensorflow.python.platform import tf_logging as logging\n\nfrom syntaxnet import sentence_pb2\nfrom syntaxnet import task_spec_pb2\nfrom syntaxnet.ops import gen_parser_ops\n\nFLAGS = tf.app.flags.FLAGS\n\n\nclass TextFormatsTest(test_util.TensorFlowTestCase):\n\n  def setUp(self):\n    if not hasattr(FLAGS, 'test_srcdir'):\n      FLAGS.test_srcdir = ''\n    if not hasattr(FLAGS, 'test_tmpdir'):\n      FLAGS.test_tmpdir = tf.test.get_temp_dir()\n    self.corpus_file = os.path.join(FLAGS.test_tmpdir, 'documents.conll')\n    self.context_file = os.path.join(FLAGS.test_tmpdir, 'context.pbtxt')\n\n  def AddInput(self, name, file_pattern, record_format, context):\n    inp = context.input.add()\n    inp.name = name\n    inp.record_format.append(record_format)\n    inp.part.add().file_pattern = file_pattern\n\n  def AddParameter(self, name, value, context):\n    param = context.parameter.add()\n    param.name = name\n    param.value = value\n\n  def 
WriteContext(self, corpus_format):\n    context = task_spec_pb2.TaskSpec()\n    self.AddInput('documents', self.corpus_file, corpus_format, context)\n    for name in ('word-map', 'lcword-map', 'tag-map',\n                 'category-map', 'label-map', 'prefix-table',\n                 'suffix-table', 'tag-to-category'):\n      self.AddInput(name, os.path.join(FLAGS.test_tmpdir, name), '', context)\n    logging.info('Writing context to: %s', self.context_file)\n    with open(self.context_file, 'w') as f:\n      f.write(str(context))\n\n  def ReadNextDocument(self, sess, sentence):\n    sentence_str, = sess.run([sentence])\n    if sentence_str:\n      sentence_doc = sentence_pb2.Sentence()\n      sentence_doc.ParseFromString(sentence_str[0])\n    else:\n      sentence_doc = None\n    return sentence_doc\n\n  def CheckTokenization(self, sentence, tokenization):\n    self.WriteContext('english-text')\n    logging.info('Writing text file to: %s', self.corpus_file)\n    with open(self.corpus_file, 'w') as f:\n      f.write(sentence)\n    sentence, _ = gen_parser_ops.document_source(\n        self.context_file, batch_size=1)\n    with self.test_session() as sess:\n      sentence_doc = self.ReadNextDocument(sess, sentence)\n      self.assertEqual(' '.join([t.word for t in sentence_doc.token]),\n                       tokenization)\n\n  def CheckUntokenizedDoc(self, sentence, words, starts, ends):\n    self.WriteContext('untokenized-text')\n    logging.info('Writing text file to: %s', self.corpus_file)\n    with open(self.corpus_file, 'w') as f:\n      f.write(sentence)\n    sentence, _ = gen_parser_ops.document_source(\n        self.context_file, batch_size=1)\n    with self.test_session() as sess:\n      sentence_doc = self.ReadNextDocument(sess, sentence)\n      self.assertEqual(len(sentence_doc.token), len(words))\n      self.assertEqual(len(sentence_doc.token), len(starts))\n      self.assertEqual(len(sentence_doc.token), len(ends))\n      for i, token in 
enumerate(sentence_doc.token):\n        self.assertEqual(token.word.encode('utf-8'), words[i])\n        self.assertEqual(token.start, starts[i])\n        self.assertEqual(token.end, ends[i])\n\n  def testUntokenized(self):\n    self.CheckUntokenizedDoc('一个测试', ['一', '个', '测', '试'],\n                             [0, 3, 6, 9], [2, 5, 8, 11])\n    self.CheckUntokenizedDoc('Hello ', ['H', 'e', 'l', 'l', 'o', ' '],\n                             [0, 1, 2, 3, 4, 5], [0, 1, 2, 3, 4, 5])\n\n  def testSegmentationTrainingData(self):\n    doc1_lines = ['测试\tNO_SPACE\\n',\n                  '的\tNO_SPACE\\n',\n                  '句子\tNO_SPACE']\n    doc1_text = '测试的句子'\n    doc1_tokens = ['测', '试', '的', '句', '子']\n    doc1_break_levels = [1, 0, 1, 1, 0]\n    doc2_lines = ['That\tNO_SPACE\\n',\n                  '\\'s\tSPACE\\n',\n                  'a\tSPACE\\n',\n                  'good\tSPACE\\n',\n                  'point\tNO_SPACE\\n',\n                  '.\tNO_SPACE']\n    doc2_text = 'That\\'s a good point.'\n    doc2_tokens = ['T', 'h', 'a', 't', '\\'', 's', ' ', 'a', ' ', 'g', 'o', 'o',\n                   'd', ' ', 'p', 'o', 'i', 'n', 't', '.']\n    doc2_break_levels = [1, 0, 0, 0, 1, 0, 1, 1, 1, 1, 0, 0, 0, 1, 1, 0, 0, 0,\n                         0, 1]\n    self.CheckSegmentationTrainingData(doc1_lines, doc1_text, doc1_tokens,\n                                       doc1_break_levels)\n    self.CheckSegmentationTrainingData(doc2_lines, doc2_text, doc2_tokens,\n                                       doc2_break_levels)\n\n  def CheckSegmentationTrainingData(self, doc_lines, doc_text, doc_words,\n                                    break_levels):\n    # Prepare context.\n    self.WriteContext('segment-train-data')\n\n    # Prepare test sentence.\n    with open(self.corpus_file, 'w') as f:\n      f.write(''.join(doc_lines))\n\n    # Test converted sentence.\n    sentence, _ = gen_parser_ops.document_source(\n        self.context_file, batch_size=1)\n    with 
self.test_session() as sess:\n      sentence_doc = self.ReadNextDocument(sess, sentence)\n      self.assertEqual(doc_text.decode('utf-8'), sentence_doc.text)\n      self.assertEqual([t.decode('utf-8') for t in doc_words],\n                       [t.word for t in sentence_doc.token])\n      self.assertEqual(break_levels,\n                       [t.break_level for t in sentence_doc.token])\n\n  def testSimple(self):\n    self.CheckTokenization('Hello, world!', 'Hello , world !')\n    self.CheckTokenization('\"Hello\"', \"`` Hello ''\")\n    self.CheckTokenization('{\"Hello@#$', '-LRB- `` Hello @ # $')\n    self.CheckTokenization('\"Hello...\"', \"`` Hello ... ''\")\n    self.CheckTokenization('()[]{}<>',\n                           '-LRB- -RRB- -LRB- -RRB- -LRB- -RRB- < >')\n    self.CheckTokenization('Hello--world', 'Hello -- world')\n    self.CheckTokenization(\"Isn't\", \"Is n't\")\n    self.CheckTokenization(\"n't\", \"n't\")\n    self.CheckTokenization('Hello Mr. Smith.', 'Hello Mr. Smith .')\n    self.CheckTokenization(\"It's Mr. Smith's.\", \"It 's Mr. Smith 's .\")\n    self.CheckTokenization(\"It's the Smiths'.\", \"It 's the Smiths ' .\")\n    self.CheckTokenization('Gotta go', 'Got ta go')\n    self.CheckTokenization('50-year-old', '50-year-old')\n\n  def testUrl(self):\n    self.CheckTokenization('http://www.google.com/news is down',\n                           'http : //www.google.com/news is down')\n\n\nif __name__ == '__main__':\n  googletest.main()\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/unpack_sparse_features.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#define EIGEN_USE_THREADS\n\n#include <memory>\n#include <string>\n#include <utility>\n#include <vector>\n\n#include \"syntaxnet/sparse.pb.h\"\n#include \"syntaxnet/utils.h\"\n#include \"third_party/eigen3/unsupported/Eigen/CXX11/Tensor\"\n#include \"tensorflow/core/framework/op_kernel.h\"\n#include \"tensorflow/core/framework/tensor.h\"\n#include \"tensorflow/core/framework/types.h\"\n\nusing tensorflow::DEVICE_CPU;\nusing tensorflow::DT_FLOAT;\nusing tensorflow::DT_INT32;\nusing tensorflow::DT_INT64;\nusing tensorflow::DT_STRING;\nusing tensorflow::OpKernel;\nusing tensorflow::OpKernelConstruction;\nusing tensorflow::OpKernelContext;\nusing tensorflow::Tensor;\nusing tensorflow::TensorShape;\nusing tensorflow::errors::InvalidArgument;\n\nnamespace syntaxnet {\n\n// Operator to unpack ids and weights stored in SparseFeatures proto.\nclass UnpackSparseFeatures : public OpKernel {\n public:\n  explicit UnpackSparseFeatures(OpKernelConstruction *context)\n      : OpKernel(context) {\n    OP_REQUIRES_OK(context, context->MatchSignature(\n                                {DT_STRING}, {DT_INT32, DT_INT64, DT_FLOAT}));\n  }\n\n  void Compute(OpKernelContext *context) override {\n    const Tensor &input = context->input(0);\n    OP_REQUIRES(context, IsLegacyVector(input.shape()),\n             
   InvalidArgument(\"input should be a vector.\"));\n\n    const int64 n = input.NumElements();\n    const auto input_vec = input.flat<string>();\n    SparseFeatures sf;\n    int output_size = 0;\n    std::vector<std::pair<int64, float> > id_and_weight;\n\n    // Guess that we'll be averaging a handful of ids per SparseFeatures record.\n    id_and_weight.reserve(n * 4);\n    std::vector<int> num_ids(n);\n    for (int64 i = 0; i < n; ++i) {\n      OP_REQUIRES(context, sf.ParseFromString(input_vec(i)),\n                  InvalidArgument(\"Couldn't parse as SparseFeature\"));\n      OP_REQUIRES(context,\n                  sf.weight_size() == 0 || sf.weight_size() == sf.id_size(),\n                  InvalidArgument(tensorflow::strings::StrCat(\n                      \"Incorrect number of weights\", sf.DebugString())));\n      int n_ids = sf.id_size();\n      num_ids[i] = n_ids;\n      output_size += n_ids;\n      for (int j = 0; j < n_ids; j++) {\n        float w = (sf.weight_size() > 0) ? sf.weight(j) : 1.0f;\n        id_and_weight.push_back(std::make_pair(sf.id(j), w));\n      }\n    }\n\n    Tensor *indices_t;\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                0, TensorShape({output_size}), &indices_t));\n    Tensor *ids_t;\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                1, TensorShape({output_size}), &ids_t));\n    Tensor *weights_t;\n    OP_REQUIRES_OK(context, context->allocate_output(\n                                2, TensorShape({output_size}), &weights_t));\n\n    auto indices = indices_t->vec<int32>();\n    auto ids = ids_t->vec<int64>();\n    auto weights = weights_t->vec<float>();\n    int c = 0;\n    for (int64 i = 0; i < n; ++i) {\n      for (int j = 0; j < num_ids[i]; ++j) {\n        indices(c) = i;\n        ids(c) = id_and_weight[c].first;\n        weights(c) = id_and_weight[c].second;\n        ++c;\n      }\n    }\n  
}\n};\n\nREGISTER_KERNEL_BUILDER(Name(\"UnpackSparseFeatures\").Device(DEVICE_CPU),\n                        UnpackSparseFeatures);\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/utils.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/utils.h\"\n#include \"tensorflow/core/platform/macros.h\"\n\nnamespace syntaxnet {\nnamespace utils {\n\nbool ParseInt32(const char *c_str, int *value) {\n  char *temp;\n  *value = strtol(c_str, &temp, 0);  // NOLINT\n  return (*temp == '\\0');\n}\n\nbool ParseInt64(const char *c_str, int64 *value) {\n  char *temp;\n  *value = strtol(c_str, &temp, 0);  // NOLINT\n  return (*temp == '\\0');\n}\n\nbool ParseDouble(const char *c_str, double *value) {\n  char *temp;\n  *value = strtod(c_str, &temp);\n  return (*temp == '\\0');\n}\n\nstatic char hex_char[] = \"0123456789abcdef\";\n\nstring CEscape(const string &src) {\n  string dest;\n\n  for (unsigned char c : src) {\n    switch (c) {\n      case '\\n':\n        dest.append(\"\\\\n\");\n        break;\n      case '\\r':\n        dest.append(\"\\\\r\");\n        break;\n      case '\\t':\n        dest.append(\"\\\\t\");\n        break;\n      case '\\\"':\n        dest.append(\"\\\\\\\"\");\n        break;\n      case '\\'':\n        dest.append(\"\\\\'\");\n        break;\n      case '\\\\':\n        dest.append(\"\\\\\\\\\");\n        break;\n      default:\n        // Note that if we emit \\xNN and the src character after that is a hex\n        // digit then that digit must be escaped too to prevent it being\n        
// interpreted as part of the character code by C.\n        if ((c >= 0x80) || !isprint(c)) {\n          dest.append(\"\\\\\");\n          dest.push_back(hex_char[c / 64]);\n          dest.push_back(hex_char[(c % 64) / 8]);\n          dest.push_back(hex_char[c % 8]);\n        } else {\n          dest.push_back(c);\n          break;\n        }\n    }\n  }\n\n  return dest;\n}\n\nstd::vector<string> Split(const string &text, char delim) {\n  std::vector<string> result;\n  int token_start = 0;\n  if (!text.empty()) {\n    for (size_t i = 0; i < text.size() + 1; i++) {\n      if ((i == text.size()) || (text[i] == delim)) {\n        result.push_back(string(text.data() + token_start, i - token_start));\n        token_start = i + 1;\n      }\n    }\n  }\n  return result;\n}\n\nstd::vector<string> SplitOne(const string &text, char delim) {\n  std::vector<string> result;\n  size_t split = text.find_first_of(delim);\n  result.push_back(text.substr(0, split));\n  if (split != string::npos) {\n    result.push_back(text.substr(split + 1));\n  }\n  return result;\n}\n\nbool IsAbsolutePath(tensorflow::StringPiece path) {\n  return !path.empty() && path[0] == '/';\n}\n\n// For an array of paths of length count, append them all together,\n// ensuring that the proper path separators are inserted between them.\nstring JoinPath(std::initializer_list<tensorflow::StringPiece> paths) {\n  string result;\n\n  for (tensorflow::StringPiece path : paths) {\n    if (path.empty()) {\n      continue;\n    }\n\n    if (result.empty()) {\n      result = path.ToString();\n      continue;\n    }\n\n    if (result[result.size() - 1] == '/') {\n      if (IsAbsolutePath(path)) {\n        tensorflow::strings::StrAppend(&result, path.substr(1));\n      } else {\n        tensorflow::strings::StrAppend(&result, path);\n      }\n    } else {\n      if (IsAbsolutePath(path)) {\n        tensorflow::strings::StrAppend(&result, path);\n      } else {\n        tensorflow::strings::StrAppend(&result, \"/\", 
path);\n      }\n    }\n  }\n\n  return result;\n}\n\nsize_t RemoveLeadingWhitespace(tensorflow::StringPiece *text) {\n  size_t count = 0;\n  const char *ptr = text->data();\n  while (count < text->size() && isspace(*ptr)) {\n    count++;\n    ptr++;\n  }\n  text->remove_prefix(count);\n  return count;\n}\n\nsize_t RemoveTrailingWhitespace(tensorflow::StringPiece *text) {\n  size_t count = 0;\n  const char *ptr = text->data() + text->size() - 1;\n  while (count < text->size() && isspace(*ptr)) {\n    ++count;\n    --ptr;\n  }\n  text->remove_suffix(count);\n  return count;\n}\n\nsize_t RemoveWhitespaceContext(tensorflow::StringPiece *text) {\n  // use RemoveLeadingWhitespace() and RemoveTrailingWhitespace() to do the job\n  return RemoveLeadingWhitespace(text) + RemoveTrailingWhitespace(text);\n}\n\nnamespace {\n// Lower-level versions of Get... that read directly from a character buffer\n// without any bounds checking.\ninline uint32 DecodeFixed32(const char *ptr) {\n  return ((static_cast<uint32>(static_cast<unsigned char>(ptr[0]))) |\n          (static_cast<uint32>(static_cast<unsigned char>(ptr[1])) << 8) |\n          (static_cast<uint32>(static_cast<unsigned char>(ptr[2])) << 16) |\n          (static_cast<uint32>(static_cast<unsigned char>(ptr[3])) << 24));\n}\n\n// 0xff is in case char is signed.\nstatic inline uint32 ByteAs32(char c) { return static_cast<uint32>(c) & 0xff; }\n}  // namespace\n\nuint32 Hash32(const char *data, size_t n, uint32 seed) {\n  // 'm' and 'r' are mixing constants generated offline.\n  // They're not really 'magic', they just happen to work well.\n  const uint32 m = 0x5bd1e995;\n  const int r = 24;\n\n  // Initialize the hash to a 'random' value\n  uint32 h = seed ^ n;\n\n  // Mix 4 bytes at a time into the hash\n  while (n >= 4) {\n    uint32 k = DecodeFixed32(data);\n    k *= m;\n    k ^= k >> r;\n    k *= m;\n    h *= m;\n    h ^= k;\n    data += 4;\n    n -= 4;\n  }\n\n  // Handle the last few bytes of the input array\n  switch 
(n) {\n    case 3:\n      h ^= ByteAs32(data[2]) << 16;\n      TF_FALLTHROUGH_INTENDED;\n    case 2:\n      h ^= ByteAs32(data[1]) << 8;\n      TF_FALLTHROUGH_INTENDED;\n    case 1:\n      h ^= ByteAs32(data[0]);\n      h *= m;\n  }\n\n  // Do a few final mixes of the hash to ensure the last few\n  // bytes are well-incorporated.\n  h ^= h >> 13;\n  h *= m;\n  h ^= h >> 15;\n  return h;\n}\n\nstring Lowercase(tensorflow::StringPiece s) {\n  string result(s.data(), s.size());\n  for (char &c : result) {\n    c = tolower(c);\n  }\n  return result;\n}\n\nPunctuationUtil::CharacterRange PunctuationUtil::kPunctuation[] = {\n    {33, 35},       {37, 42},       {44, 47},       {58, 59},\n    {63, 64},       {91, 93},       {95, 95},       {123, 123},\n    {125, 125},     {161, 161},     {171, 171},     {183, 183},\n    {187, 187},     {191, 191},     {894, 894},     {903, 903},\n    {1370, 1375},   {1417, 1418},   {1470, 1470},   {1472, 1472},\n    {1475, 1475},   {1478, 1478},   {1523, 1524},   {1548, 1549},\n    {1563, 1563},   {1566, 1567},   {1642, 1645},   {1748, 1748},\n    {1792, 1805},   {2404, 2405},   {2416, 2416},   {3572, 3572},\n    {3663, 3663},   {3674, 3675},   {3844, 3858},   {3898, 3901},\n    {3973, 3973},   {4048, 4049},   {4170, 4175},   {4347, 4347},\n    {4961, 4968},   {5741, 5742},   {5787, 5788},   {5867, 5869},\n    {5941, 5942},   {6100, 6102},   {6104, 6106},   {6144, 6154},\n    {6468, 6469},   {6622, 6623},   {6686, 6687},   {8208, 8231},\n    {8240, 8259},   {8261, 8273},   {8275, 8286},   {8317, 8318},\n    {8333, 8334},   {9001, 9002},   {9140, 9142},   {10088, 10101},\n    {10181, 10182}, {10214, 10219}, {10627, 10648}, {10712, 10715},\n    {10748, 10749}, {11513, 11516}, {11518, 11519}, {11776, 11799},\n    {11804, 11805}, {12289, 12291}, {12296, 12305}, {12308, 12319},\n    {12336, 12336}, {12349, 12349}, {12448, 12448}, {12539, 12539},\n    {64830, 64831}, {65040, 65049}, {65072, 65106}, {65108, 65121},\n    {65123, 65123}, {65128, 
65128}, {65130, 65131}, {65281, 65283},\n    {65285, 65290}, {65292, 65295}, {65306, 65307}, {65311, 65312},\n    {65339, 65341}, {65343, 65343}, {65371, 65371}, {65373, 65373},\n    {65375, 65381}, {65792, 65793}, {66463, 66463}, {68176, 68184},\n    {-1, -1}};\n\nvoid NormalizeDigits(string *form) {\n  for (size_t i = 0; i < form->size(); ++i) {\n    if ((*form)[i] >= '0' && (*form)[i] <= '9') (*form)[i] = '9';\n  }\n}\n\n}  // namespace utils\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/utils.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#ifndef SYNTAXNET_UTILS_H_\n#define SYNTAXNET_UTILS_H_\n\n#include <functional>\n#include <string>\n#include <vector>\n#include <unordered_set>\n#include \"syntaxnet/base.h\"\n#include \"tensorflow/core/lib/core/status.h\"\n#include \"tensorflow/core/lib/strings/strcat.h\"\n#include \"tensorflow/core/platform/default/integral_types.h\"\n#include \"tensorflow/core/platform/mutex.h\"\n#include \"util/utf8/unicodetext.h\"\n\nnamespace syntaxnet {\nnamespace utils {\n\nbool ParseInt32(const char *c_str, int *value);\nbool ParseInt64(const char *c_str, int64 *value);\nbool ParseDouble(const char *c_str, double *value);\n\ntemplate <typename T>\nT ParseUsing(const string &str, std::function<bool(const char *, T *)> func) {\n  T value;\n  CHECK(func(str.c_str(), &value)) << \"Failed to convert: \" << str;\n  return value;\n}\n\ntemplate <typename T>\nT ParseUsing(const string &str, T defval,\n             std::function<bool(const char *, T *)> func) {\n  return str.empty() ? 
defval : ParseUsing<T>(str, func);\n}\n\nstring CEscape(const string &src);\n\n// Splits the given string on every occurrence of the given delimiter char.\nstd::vector<string> Split(const string &text, char delim);\n\n// Splits the given string on the first occurrence of the given delimiter char,\n// or returns the given string if the given delimiter is not found.\nstd::vector<string> SplitOne(const string &text, char delim);\n\ntemplate <typename T>\nstring Join(const std::vector<T> &s, const char *sep) {\n  string result;\n  bool first = true;\n  for (const auto &x : s) {\n    tensorflow::strings::StrAppend(&result, (first ? \"\" : sep), x);\n    first = false;\n  }\n  return result;\n}\n\nstring JoinPath(std::initializer_list<tensorflow::StringPiece> paths);\n\nsize_t RemoveLeadingWhitespace(tensorflow::StringPiece *text);\n\nsize_t RemoveTrailingWhitespace(tensorflow::StringPiece *text);\n\nsize_t RemoveWhitespaceContext(tensorflow::StringPiece *text);\n\nuint32 Hash32(const char *data, size_t n, uint32 seed);\n\n// Deletes all the elements in an STL container and clears the container. 
This\n// function is suitable for use with a vector, set, hash_set, or any other STL\n// container which defines sensible begin(), end(), and clear() methods.\n// If container is NULL, this function is a no-op.\ntemplate <typename T>\nvoid STLDeleteElements(T *container) {\n  if (!container) return;\n  auto it = container->begin();\n  while (it != container->end()) {\n    auto temp = it;\n    ++it;\n    delete *temp;\n  }\n  container->clear();\n}\n\n// Returns lower-cased version of s.\nstring Lowercase(tensorflow::StringPiece s);\n\nclass PunctuationUtil {\n public:\n  // Unicode character ranges for punctuation characters according to CoNLL.\n  struct CharacterRange {\n    int first;\n    int last;\n  };\n  static CharacterRange kPunctuation[];\n\n  // Returns true if Unicode character is a punctuation character.\n  static bool IsPunctuation(int u) {\n    int i = 0;\n    while (kPunctuation[i].first > 0) {\n      if (u < kPunctuation[i].first) return false;\n      if (u <= kPunctuation[i].last) return true;\n      ++i;\n    }\n    return false;\n  }\n\n  // Determine if tag is a punctuation tag.\n  static bool IsPunctuationTag(const string &tag) {\n    for (size_t i = 0; i < tag.length(); ++i) {\n      int c = tag[i];\n      if (c != ',' && c != ':' && c != '.' 
&& c != '\\'' && c != '`') {\n        return false;\n      }\n    }\n    return true;\n  }\n\n  // Returns true if word consists of punctuation characters.\n  static bool IsPunctuationToken(const string &word) {\n    UnicodeText text;\n    text.PointToUTF8(word.c_str(), word.length());\n    UnicodeText::const_iterator it;\n    for (it = text.begin(); it != text.end(); ++it) {\n      if (!IsPunctuation(*it)) return false;\n    }\n    return true;\n  }\n\n  // Returns true if tag is non-empty and has only punctuation or parens\n  // symbols.\n  static bool IsPunctuationTagOrParens(const string &tag) {\n    if (tag.empty()) return false;\n    for (size_t i = 0; i < tag.length(); ++i) {\n      int c = tag[i];\n      if (c != '(' && c != ')' && c != ',' && c != ':' && c != '.' &&\n          c != '\\'' && c != '`') {\n        return false;\n      }\n    }\n    return true;\n  }\n\n  // Decides whether to score a token, given the word, the POS tag and\n  // the scoring type.\n  static bool ScoreToken(const string &word, const string &tag,\n                         const string &scoring_type) {\n    if (scoring_type == \"default\") {\n      return tag.empty() || !IsPunctuationTag(tag);\n    } else if (scoring_type == \"conllx\") {\n      return !IsPunctuationToken(word);\n    } else if (scoring_type == \"ignore_parens\") {\n      return !IsPunctuationTagOrParens(tag);\n    }\n    CHECK(scoring_type.empty()) << \"Unknown scoring strategy \" << scoring_type;\n    return true;\n  }\n};\n\nvoid NormalizeDigits(string *form);\n\n// Helper type to mark missing c-tor argument types\n// for Type's c-tor in LazyStaticPtr<Type, ...>.\nstruct NoArg {};\n\ntemplate <typename Type, typename Arg1 = NoArg, typename Arg2 = NoArg,\n          typename Arg3 = NoArg>\nclass LazyStaticPtr {\n public:\n  typedef Type element_type;  // per smart pointer convention\n\n  // Pretend to be a pointer to Type (never NULL due to on-demand creation):\n  Type &operator*() const { return *get(); }\n  
Type *operator->() const { return get(); }\n\n  // Named accessor/initializer:\n  Type *get() const {\n    if (!ptr_) Initialize(this);\n    return ptr_;\n  }\n\n public:\n  // All the data is public and LazyStaticPtr has no constructors so that we can\n  // initialize LazyStaticPtr objects with the \"= { arg_value, ... }\" syntax.\n  // Clients of LazyStaticPtr must not access the data members directly.\n\n  // Arguments for Type's c-tor\n  // (unused NoArg-typed arguments consume either no space, or 1 byte to\n  //  ensure address uniqueness):\n  Arg1 arg1_;\n  Arg2 arg2_;\n  Arg3 arg3_;\n\n  // The object we create and show.\n  mutable Type *ptr_;\n\n private:\n  template <typename A1, typename A2, typename A3>\n  static Type *Factory(const A1 &a1, const A2 &a2, const A3 &a3) {\n    return new Type(a1, a2, a3);\n  }\n\n  template <typename A1, typename A2>\n  static Type *Factory(const A1 &a1, const A2 &a2, NoArg a3) {\n    return new Type(a1, a2);\n  }\n\n  template <typename A1>\n  static Type *Factory(const A1 &a1, NoArg a2, NoArg a3) {\n    return new Type(a1);\n  }\n\n  static Type *Factory(NoArg a1, NoArg a2, NoArg a3) { return new Type(); }\n\n  static void Initialize(const LazyStaticPtr *lsp) {\n    lsp->ptr_ = Factory(lsp->arg1_, lsp->arg2_, lsp->arg3_);\n  }\n};\n\n}  // namespace utils\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_UTILS_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/workspace.cc",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n#include \"syntaxnet/workspace.h\"\n\n#include \"tensorflow/core/lib/strings/strcat.h\"\n\nnamespace syntaxnet {\n\nstring WorkspaceRegistry::DebugString() const {\n  string str;\n  for (auto &it : workspace_names_) {\n    const string &type_name = workspace_types_.at(it.first);\n    for (size_t index = 0; index < it.second.size(); ++index) {\n      const string &workspace_name = it.second[index];\n      tensorflow::strings::StrAppend(&str, \"\\n  \", type_name, \" :: \",\n                                     workspace_name);\n    }\n  }\n  return str;\n}\n\nVectorIntWorkspace::VectorIntWorkspace(int size) : elements_(size) {}\n\nVectorIntWorkspace::VectorIntWorkspace(int size, int value)\n    : elements_(size, value) {}\n\nVectorIntWorkspace::VectorIntWorkspace(const vector<int> &elements)\n    : elements_(elements) {}\n\nstring VectorIntWorkspace::TypeName() { return \"Vector\"; }\n\nVectorVectorIntWorkspace::VectorVectorIntWorkspace(int size)\n    : elements_(size) {}\n\nstring VectorVectorIntWorkspace::TypeName() { return \"VectorVector\"; }\n\n}  // namespace syntaxnet\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/syntaxnet/workspace.h",
    "content": "/* Copyright 2016 Google Inc. All Rights Reserved.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n==============================================================================*/\n\n// Notes on thread-safety: All of the classes here are thread-compatible.  More\n// specifically, the registry machinery is thread-safe, as long as each thread\n// performs feature extraction on a different Sentence object.\n\n#ifndef SYNTAXNET_WORKSPACE_H_\n#define SYNTAXNET_WORKSPACE_H_\n\n#include <string>\n#include <typeindex>\n#include <unordered_map>\n#include <utility>\n#include <vector>\n\n#include \"syntaxnet/utils.h\"\n\nnamespace syntaxnet {\n\n// A base class for shared workspaces. 
Derived classes implement a static member\n// function TypeName() which returns a human readable string name for the class.\nclass Workspace {\n public:\n  // Polymorphic destructor.\n  virtual ~Workspace() {}\n\n protected:\n  // Create an empty workspace.\n  Workspace() {}\n\n private:\n  TF_DISALLOW_COPY_AND_ASSIGN(Workspace);\n};\n\n// A registry that keeps track of workspaces.\nclass WorkspaceRegistry {\n public:\n  // Create an empty registry.\n  WorkspaceRegistry() {}\n\n  // Returns the index of a named workspace, adding it to the registry first\n  // if necessary.\n  template <class W>\n  int Request(const string &name) {\n    const std::type_index id = std::type_index(typeid(W));\n    workspace_types_[id] = W::TypeName();\n    vector<string> &names = workspace_names_[id];\n    for (int i = 0; i < names.size(); ++i) {\n      if (names[i] == name) return i;\n    }\n    names.push_back(name);\n    return names.size() - 1;\n  }\n\n  const std::unordered_map<std::type_index, vector<string> > &WorkspaceNames()\n      const {\n    return workspace_names_;\n  }\n\n  // Returns a string describing the registered workspaces.\n  string DebugString() const;\n\n private:\n  // Workspace type names, indexed as workspace_types_[typeid].\n  std::unordered_map<std::type_index, string> workspace_types_;\n\n  // Workspace names, indexed as workspace_names_[typeid][workspace].\n  std::unordered_map<std::type_index, vector<string> > workspace_names_;\n\n  TF_DISALLOW_COPY_AND_ASSIGN(WorkspaceRegistry);\n};\n\n// A typed collection of workspaces. The workspaces are indexed according to an\n// external WorkspaceRegistry. 
If the WorkspaceSet is const, the contents are\n// also immutable.\nclass WorkspaceSet {\n public:\n  ~WorkspaceSet() { Reset(WorkspaceRegistry()); }\n\n  // Returns true if a workspace has been set.\n  template <class W>\n  bool Has(int index) const {\n    const std::type_index id = std::type_index(typeid(W));\n    DCHECK(workspaces_.find(id) != workspaces_.end());\n    DCHECK_LT(index, workspaces_.find(id)->second.size());\n    return workspaces_.find(id)->second[index] != nullptr;\n  }\n\n  // Returns an indexed workspace; the workspace must have been set.\n  template <class W>\n  const W &Get(int index) const {\n    DCHECK(Has<W>(index));\n    const Workspace *w =\n        workspaces_.find(std::type_index(typeid(W)))->second[index];\n    return reinterpret_cast<const W &>(*w);\n  }\n\n  // Sets an indexed workspace; this takes ownership of the workspace, which\n  // must have been new-allocated.  It is an error to set a workspace twice.\n  template <class W>\n  void Set(int index, W *workspace) {\n    const std::type_index id = std::type_index(typeid(W));\n    DCHECK(workspaces_.find(id) != workspaces_.end());\n    DCHECK_LT(index, workspaces_[id].size());\n    DCHECK(workspaces_[id][index] == nullptr);\n    DCHECK(workspace != nullptr);\n    workspaces_[id][index] = workspace;\n  }\n\n  void Reset(const WorkspaceRegistry &registry) {\n    // Deallocate current workspaces.\n    for (auto &it : workspaces_) {\n      for (size_t index = 0; index < it.second.size(); ++index) {\n        delete it.second[index];\n      }\n    }\n    workspaces_.clear();\n\n    // Allocate space for new workspaces.\n    for (auto &it : registry.WorkspaceNames()) {\n      workspaces_[it.first].resize(it.second.size());\n    }\n  }\n\n private:\n  // The set of workspaces, indexed as workspaces_[typeid][index].\n  std::unordered_map<std::type_index, vector<Workspace *> > workspaces_;\n};\n\n// A workspace that wraps around a single int.\nclass SingletonIntWorkspace : public Workspace 
{\n public:\n  // Default-initializes the int value.\n  SingletonIntWorkspace() {}\n\n  // Initializes the int with the given value.\n  explicit SingletonIntWorkspace(int value) : value_(value) {}\n\n  // Returns the name of this type of workspace.\n  static string TypeName() { return \"SingletonInt\"; }\n\n  // Returns the int value.\n  int get() const { return value_; }\n\n  // Sets the int value.\n  void set(int value) { value_ = value; }\n\n private:\n  // The enclosed int.\n  int value_ = 0;\n};\n\n// A workspace that wraps around a vector of int.\nclass VectorIntWorkspace : public Workspace {\n public:\n  // Creates a vector of the given size.\n  explicit VectorIntWorkspace(int size);\n\n  // Creates a vector initialized with the given array.\n  explicit VectorIntWorkspace(const vector<int> &elements);\n\n  // Creates a vector of the given size, with each element initialized to the\n  // given value.\n  VectorIntWorkspace(int size, int value);\n\n  // Returns the name of this type of workspace.\n  static string TypeName();\n\n  // Returns the i'th element.\n  int element(int i) const { return elements_[i]; }\n\n  // Sets the i'th element.\n  void set_element(int i, int value) { elements_[i] = value; }\n\n  int size() const { return elements_.size(); }\n\n private:\n  // The enclosed vector.\n  vector<int> elements_;\n};\n\n// A workspace that wraps around a vector of vector of int.\nclass VectorVectorIntWorkspace : public Workspace {\n public:\n  // Creates a vector of empty vectors of the given size.\n  explicit VectorVectorIntWorkspace(int size);\n\n  // Returns the name of this type of workspace.\n  static string TypeName();\n\n  // Returns the i'th vector of elements.\n  const vector<int> &elements(int i) const { return elements_[i]; }\n\n  // Mutable access to the i'th vector of elements.\n  vector<int> *mutable_elements(int i) { return &(elements_[i]); }\n\n private:\n  // The enclosed vector of vector of elements.\n  vector<vector<int> > 
elements_;\n};\n\n}  // namespace syntaxnet\n\n#endif  // SYNTAXNET_WORKSPACE_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/BUILD",
    "content": "licenses([\"notice\"])\n\ncc_library(\n    name = \"utf\",\n    srcs = [\n        \"rune.c\",\n        \"runestrcat.c\",\n        \"runestrchr.c\",\n        \"runestrcmp.c\",\n        \"runestrcpy.c\",\n        \"runestrdup.c\",\n        \"runestrecpy.c\",\n        \"runestrlen.c\",\n        \"runestrncat.c\",\n        \"runestrncmp.c\",\n        \"runestrncpy.c\",\n        \"runestrrchr.c\",\n        \"runestrstr.c\",\n        \"runetype.c\",\n        \"utfecpy.c\",\n        \"utflen.c\",\n        \"utfnlen.c\",\n        \"utfrrune.c\",\n        \"utfrune.c\",\n        \"utfutf.c\",\n    ],\n    hdrs = [\n        \"runetypebody.c\",\n        \"utf.h\",\n        \"utfdef.h\",\n    ],\n    includes = [\".\"],\n    visibility = [\"//visibility:public\"],\n)\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/README",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 1998-2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/rune.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nenum\n{\n\tBit1\t= 7,\n\tBitx\t= 6,\n\tBit2\t= 5,\n\tBit3\t= 4,\n\tBit4\t= 3,\n\tBit5\t= 2, \n\n\tT1\t= ((1<<(Bit1+1))-1) ^ 0xFF,\t/* 0000 0000 */\n\tTx\t= ((1<<(Bitx+1))-1) ^ 0xFF,\t/* 1000 0000 */\n\tT2\t= ((1<<(Bit2+1))-1) ^ 0xFF,\t/* 1100 0000 */\n\tT3\t= ((1<<(Bit3+1))-1) ^ 0xFF,\t/* 1110 0000 */\n\tT4\t= ((1<<(Bit4+1))-1) ^ 0xFF,\t/* 1111 0000 */\n\tT5\t= ((1<<(Bit5+1))-1) ^ 0xFF,\t/* 1111 1000 */\n\n\tRune1\t= (1<<(Bit1+0*Bitx))-1,\t\t/* 0000 0000 0111 1111 */\n\tRune2\t= (1<<(Bit2+1*Bitx))-1,\t\t/* 0000 0111 1111 1111 */\n\tRune3\t= (1<<(Bit3+2*Bitx))-1,\t\t/* 1111 1111 1111 1111 */\n\tRune4\t= (1<<(Bit4+3*Bitx))-1,\n                                        /* 0001 1111 1111 1111 1111 1111 */\n\n\tMaskx\t= (1<<Bitx)-1,\t\t\t/* 0011 1111 */\n\tTestx\t= Maskx ^ 0xFF,\t\t\t/* 1100 0000 */\n\n\tBad\t= Runeerror,\n};\n\n/*\n * Modified by Wei-Hwa Huang, Google Inc., on 2004-09-24\n * This is a slower but \"safe\" version of the old chartorune \n * that works on strings that are not necessarily null-terminated.\n * \n * If you know for sure that your string is null-terminated,\n * 
chartorune will be a bit faster.\n *\n * It is guaranteed not to attempt to access \"length\"\n * past the incoming pointer.  This is to avoid\n * possible access violations.  If the string appears to be\n * well-formed but incomplete (i.e., to get the whole Rune\n * we'd need to read past str+length) then we'll set the Rune\n * to Bad and return 0.\n *\n * Note that if we have decoding problems for other\n * reasons, we return 1 instead of 0.\n */\nint\ncharntorune(Rune *rune, const char *str, int length)\n{\n\tint c, c1, c2, c3;\n\tlong l;\n\n\t/* When we're not allowed to read anything */\n\tif(length <= 0) {\n\t\tgoto badlen;\n\t}\n\n\t/*\n\t * one character sequence (7-bit value)\n\t *\t00000-0007F => T1\n\t */\n\tc = *(uchar*)str;\n\tif(c < Tx) {\n\t\t*rune = c;\n\t\treturn 1;\n\t}\n\n\t// If we can't read more than one character we must stop\n\tif(length <= 1) {\n\t\tgoto badlen;\n\t}\n\n\t/*\n\t * two character sequence (11-bit value)\n\t *\t0080-07FF => T2 Tx\n\t */\n\tc1 = *(uchar*)(str+1) ^ Tx;\n\tif(c1 & Testx)\n\t\tgoto bad;\n\tif(c < T3) {\n\t\tif(c < T2)\n\t\t\tgoto bad;\n\t\tl = ((c << Bitx) | c1) & Rune2;\n\t\tif(l <= Rune1)\n\t\t\tgoto bad;\n\t\t*rune = l;\n\t\treturn 2;\n\t}\n\n\t// If we can't read more than two characters we must stop\n\tif(length <= 2) {\n\t\tgoto badlen;\n\t}\n\n\t/*\n\t * three character sequence (16-bit value)\n\t *\t0800-FFFF => T3 Tx Tx\n\t */\n\tc2 = *(uchar*)(str+2) ^ Tx;\n\tif(c2 & Testx)\n\t\tgoto bad;\n\tif(c < T4) {\n\t\tl = ((((c << Bitx) | c1) << Bitx) | c2) & Rune3;\n\t\tif(l <= Rune2)\n\t\t\tgoto bad;\n\t\t*rune = l;\n\t\treturn 3;\n\t}\n\n\tif (length <= 3)\n\t\tgoto badlen;\n\n\t/*\n\t * four character sequence (21-bit value)\n\t *\t10000-1FFFFF => T4 Tx Tx Tx\n\t */\n\tc3 = *(uchar*)(str+3) ^ Tx;\n\tif (c3 & Testx)\n\t\tgoto bad;\n\tif (c < T5) {\n\t\tl = ((((((c << Bitx) | c1) << Bitx) | c2) << Bitx) | c3) & Rune4;\n\t\tif (l <= Rune3)\n\t\t\tgoto bad;\n\t\tif (l > Runemax)\n\t\t\tgoto bad;\n\t\t*rune = 
l;\n\t\treturn 4;\n\t}\n\n\t// Support for 5-byte or longer UTF-8 would go here, but\n\t// since we don't have that, we'll just fall through to bad.\n\n\t/*\n\t * bad decoding\n\t */\nbad:\n\t*rune = Bad;\n\treturn 1;\nbadlen:\n\t*rune = Bad;\n\treturn 0;\n\n}\n\n\n/*\n * This is the older \"unsafe\" version, which works fine on \n * null-terminated strings.\n */\nint\nchartorune(Rune *rune, const char *str)\n{\n\tint c, c1, c2, c3;\n\tlong l;\n\n\t/*\n\t * one character sequence\n\t *\t00000-0007F => T1\n\t */\n\tc = *(uchar*)str;\n\tif(c < Tx) {\n\t\t*rune = c;\n\t\treturn 1;\n\t}\n\n\t/*\n\t * two character sequence\n\t *\t0080-07FF => T2 Tx\n\t */\n\tc1 = *(uchar*)(str+1) ^ Tx;\n\tif(c1 & Testx)\n\t\tgoto bad;\n\tif(c < T3) {\n\t\tif(c < T2)\n\t\t\tgoto bad;\n\t\tl = ((c << Bitx) | c1) & Rune2;\n\t\tif(l <= Rune1)\n\t\t\tgoto bad;\n\t\t*rune = l;\n\t\treturn 2;\n\t}\n\n\t/*\n\t * three character sequence\n\t *\t0800-FFFF => T3 Tx Tx\n\t */\n\tc2 = *(uchar*)(str+2) ^ Tx;\n\tif(c2 & Testx)\n\t\tgoto bad;\n\tif(c < T4) {\n\t\tl = ((((c << Bitx) | c1) << Bitx) | c2) & Rune3;\n\t\tif(l <= Rune2)\n\t\t\tgoto bad;\n\t\t*rune = l;\n\t\treturn 3;\n\t}\n\n\t/*\n\t * four character sequence (21-bit value)\n\t *\t10000-1FFFFF => T4 Tx Tx Tx\n\t */\n\tc3 = *(uchar*)(str+3) ^ Tx;\n\tif (c3 & Testx)\n\t\tgoto bad;\n\tif (c < T5) {\n\t\tl = ((((((c << Bitx) | c1) << Bitx) | c2) << Bitx) | c3) & Rune4;\n\t\tif (l <= Rune3)\n\t\t\tgoto bad;\n\t\tif (l > Runemax)\n\t\t\tgoto bad;\n\t\t*rune = l;\n\t\treturn 4;\n\t}\n\n\t/*\n\t * Support for 5-byte or longer UTF-8 would go here, but\n\t * since we don't have that, we'll just fall through to bad.\n\t */\n\n\t/*\n\t * bad decoding\n\t */\nbad:\n\t*rune = Bad;\n\treturn 1;\n}\n\nint\nisvalidcharntorune(const char* str, int length, Rune* rune, int* consumed) {\n\t*consumed = charntorune(rune, str, length);\n\treturn *rune != Runeerror || *consumed == 3;\n}\n    \nint\nrunetochar(char *str, const Rune *rune)\n{\n\t/* Runes are signed, 
so convert to unsigned for range check. */\n\tunsigned long c;\n\n\t/*\n\t * one character sequence\n\t *\t00000-0007F => 00-7F\n\t */\n\tc = *rune;\n\tif(c <= Rune1) {\n\t\tstr[0] = c;\n\t\treturn 1;\n\t}\n\n\t/*\n\t * two character sequence\n\t *\t0080-07FF => T2 Tx\n\t */\n\tif(c <= Rune2) {\n\t\tstr[0] = T2 | (c >> 1*Bitx);\n\t\tstr[1] = Tx | (c & Maskx);\n\t\treturn 2;\n\t}\n\n\t/*\n\t * If the Rune is out of range, convert it to the error rune.\n\t * Do this test here because the error rune encodes to three bytes.\n\t * Doing it earlier would duplicate work, since an out of range\n\t * Rune wouldn't have fit in one or two bytes.\n\t */\n\tif (c > Runemax)\n\t\tc = Runeerror;\n\n\t/*\n\t * three character sequence\n\t *\t0800-FFFF => T3 Tx Tx\n\t */\n\tif (c <= Rune3) {\n\t\tstr[0] = T3 |  (c >> 2*Bitx);\n\t\tstr[1] = Tx | ((c >> 1*Bitx) & Maskx);\n\t\tstr[2] = Tx |  (c & Maskx);\n\t\treturn 3;\n\t}\n\n\t/*\n\t * four character sequence (21-bit value)\n\t *     10000-1FFFFF => T4 Tx Tx Tx\n\t */\n\tstr[0] = T4 | (c >> 3*Bitx);\n\tstr[1] = Tx | ((c >> 2*Bitx) & Maskx);\n\tstr[2] = Tx | ((c >> 1*Bitx) & Maskx);\n\tstr[3] = Tx | (c & Maskx);\n\treturn 4;\n}\n\nint\nrunelen(Rune rune)\n{\n\tchar str[10];\n\n\treturn runetochar(str, &rune);\n}\n\nint\nrunenlen(const Rune *r, int nrune)\n{\n\tint nb;\n\tulong c;\t/* Rune is signed, so use unsigned for range check. */\n\n\tnb = 0;\n\twhile(nrune--) {\n\t\tc = *r++;\n\t\tif (c <= Rune1)\n\t\t\tnb++;\n\t\telse if (c <= Rune2)\n\t\t\tnb += 2;\n\t\telse if (c <= Rune3)\n\t\t\tnb += 3;\n\t\telse if (c <= Runemax)\n\t\t\tnb += 4;\n\t\telse\n\t\t\tnb += 3;\t/* Runeerror = 0xFFFD, see runetochar */\n\t}\n\treturn nb;\n}\n\nint\nfullrune(const char *str, int n)\n{\n\tif (n > 0) {\n\t\tint c = *(uchar*)str;\n\t\tif (c < Tx)\n\t\t\treturn 1;\n\t\tif (n > 1) {\n\t\t\tif (c < T3)\n\t\t\t\treturn 1;\n\t\t\tif (n > 2) {\n\t\t\t\tif (c < T4 || n > 3)\n\t\t\t\t\treturn 1;\n\t\t\t}\n\t\t}\n\t}\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrcat.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nRune*\nrunestrcat(Rune *s1, const Rune *s2)\n{\n\n\trunestrcpy((Rune*)runestrchr(s1, 0), s2);\n\treturn s1;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrchr.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nconst\nRune*\nrunestrchr(const Rune *s, Rune c)\n{\n\tRune c0 = c;\n\tRune c1;\n\n\tif(c == 0) {\n\t\twhile(*s++)\n\t\t\t;\n\t\treturn s-1;\n\t}\n\n\twhile((c1 = *s++) != 0)\n\t\tif(c1 == c0)\n\t\t\treturn s-1;\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrcmp.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nint\nrunestrcmp(const Rune *s1, const Rune *s2)\n{\n\tRune c1, c2;\n\n\tfor(;;) {\n\t\tc1 = *s1++;\n\t\tc2 = *s2++;\n\t\tif(c1 != c2) {\n\t\t\tif(c1 > c2)\n\t\t\t\treturn 1;\n\t\t\treturn -1;\n\t\t}\n\t\tif(c1 == 0)\n\t\t\treturn 0;\n\t}\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrcpy.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nRune*\nrunestrcpy(Rune *s1, const Rune *s2)\n{\n\tRune *os1;\n\n\tos1 = s1;\n\twhile((*s1++ = *s2++) != 0)\n\t\t;\n\treturn os1;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrdup.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include <stdlib.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nRune*\nrunestrdup(const Rune *s) \n{  \n\tRune *ns;\n\n\tns = (Rune*)malloc(sizeof(Rune)*(runestrlen(s) + 1));\n\tif(ns == 0)\n\t\treturn 0;\n\n\treturn runestrcpy(ns, s);\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrecpy.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nRune*\nrunestrecpy(Rune *s1, Rune *es1, const Rune *s2)\n{\n\tif(s1 >= es1)\n\t\treturn s1;\n\n\twhile((*s1++ = *s2++) != 0){\n\t\tif(s1 == es1){\n\t\t\t*--s1 = '\\0';\n\t\t\tbreak;\n\t\t}\n\t}\n\treturn s1;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrlen.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nlong\nrunestrlen(const Rune *s)\n{\n\n\treturn runestrchr(s, 0) - s;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrncat.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nRune*\nrunestrncat(Rune *s1, const Rune *s2, long n)\n{\n\tRune *os1;\n\n\tos1 = s1;\n\ts1 = (Rune*)runestrchr(s1, 0);\n\twhile((*s1++ = *s2++) != 0)\n\t\tif(--n < 0) {\n\t\t\ts1[-1] = 0;\n\t\t\tbreak;\n\t\t}\n\treturn os1;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrncmp.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nint\nrunestrncmp(const Rune *s1, const Rune *s2, long n)\n{\n\tRune c1, c2;\n\n\twhile(n > 0) {\n\t\tc1 = *s1++;\n\t\tc2 = *s2++;\n\t\tn--;\n\t\tif(c1 != c2) {\n\t\t\tif(c1 > c2)\n\t\t\t\treturn 1;\n\t\t\treturn -1;\n\t\t}\n\t\tif(c1 == 0)\n\t\t\tbreak;\n\t}\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrncpy.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nRune*\nrunestrncpy(Rune *s1, const Rune *s2, long n)\n{\n\tint i;\n\tRune *os1;\n\n\tos1 = s1;\n\tfor(i = 0; i < n; i++)\n\t\tif((*s1++ = *s2++) == 0) {\n\t\t\twhile(++i < n)\n\t\t\t\t*s1++ = 0;\n\t\t\treturn os1;\n\t\t}\n\treturn os1;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrrchr.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nconst\nRune*\nrunestrrchr(const Rune *s, Rune c)\n{\n\tconst Rune *r;\n\n\tif(c == 0)\n\t\treturn runestrchr(s, 0);\n\tr = 0;\n\twhile((s = runestrchr(s, c)) != 0)\n\t\tr = s++;\n\treturn r;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runestrstr.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\n/*\n * Return pointer to first occurrence of s2 in s1,\n * 0 if none\n */\nconst\nRune*\nrunestrstr(const Rune *s1, const Rune *s2)\n{\n\tconst Rune *p, *pa, *pb;\n\tint c0, c;\n\n\tc0 = *s2;\n\tif(c0 == 0)\n\t\treturn s1;\n\ts2++;\n\tfor(p=runestrchr(s1, c0); p; p=runestrchr(p+1, c0)) {\n\t\tpa = p;\n\t\tfor(pb=s2;; pb++) {\n\t\t\tc = *pb;\n\t\t\tif(c == 0)\n\t\t\t\treturn p;\n\t\t\tif(c != *++pa)\n\t\t\t\tbreak;\n\t\t}\n\t}\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runetype.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nstatic\nRune*\nrbsearch(Rune c, Rune *t, int n, int ne)\n{\n\tRune *p;\n\tint m;\n\n\twhile(n > 1) {\n\t\tm = n >> 1;\n\t\tp = t + m*ne;\n\t\tif(c >= p[0]) {\n\t\t\tt = p;\n\t\t\tn = n-m;\n\t\t} else\n\t\t\tn = m;\n\t}\n\tif(n && c >= t[0])\n\t\treturn t;\n\treturn 0;\n}\n\n/*\n * The \"ideographic\" property is hard to extract from UnicodeData.txt,\n * so it is hard coded here.\n *\n * It is defined in the Unicode PropList.txt file, for example\n * PropList-3.0.0.txt.  Unlike the UnicodeData.txt file, the format of\n * PropList changes between versions.  This property appears relatively static;\n * it is the same in version 4.0.1, except that version defines some >16 bit\n * chars as ideographic as well: 20000..2a6d6, and 2f800..2Fa1d.\n */\nstatic Rune __isideographicr[] = {\n\t0x3006, 0x3007,\t\t\t/* 3006 not in Unicode 2, in 2.1 */\n\t0x3021, 0x3029,\n\t0x3038, 0x303a,\t\t\t/* not in Unicode 2 or 2.1 */\n\t0x3400, 0x4db5,\t\t\t/* not in Unicode 2 or 2.1 */\n\t0x4e00, 0x9fbb,\t\t\t/* 0x9FA6..0x9FBB added for 4.1.0? 
*/\n\t0xf900, 0xfa2d,\n        0x20000, 0x2A6D6,\n        0x2F800, 0x2FA1D,\n};\n\nint\nisideographicrune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __isideographicr, nelem(__isideographicr)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\treturn 0;\n}\n\n#include \"third_party/utf/runetypebody.c\"\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/runetypebody.c",
    "content": "/* generated automatically by mkrunetype.c from UnicodeData-6.3.0.txt */\n\nstatic Rune __isspacer[] = {\n\t0x0009, 0x000d,\n\t0x0020, 0x0020,\n\t0x0085, 0x0085,\n\t0x00a0, 0x00a0,\n\t0x1680, 0x1680,\n\t0x2000, 0x200a,\n\t0x2028, 0x2029,\n\t0x202f, 0x202f,\n\t0x205f, 0x205f,\n\t0x3000, 0x3000,\n\t0xfeff, 0xfeff,\n};\n\nint\nisspacerune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __isspacer, nelem(__isspacer)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\treturn 0;\n}\n\nstatic Rune __isdigitr[] = {\n\t0x0030, 0x0039,\n\t0x0660, 0x0669,\n\t0x06f0, 0x06f9,\n\t0x07c0, 0x07c9,\n\t0x0966, 0x096f,\n\t0x09e6, 0x09ef,\n\t0x0a66, 0x0a6f,\n\t0x0ae6, 0x0aef,\n\t0x0b66, 0x0b6f,\n\t0x0be6, 0x0bef,\n\t0x0c66, 0x0c6f,\n\t0x0ce6, 0x0cef,\n\t0x0d66, 0x0d6f,\n\t0x0e50, 0x0e59,\n\t0x0ed0, 0x0ed9,\n\t0x0f20, 0x0f29,\n\t0x1040, 0x1049,\n\t0x1090, 0x1099,\n\t0x17e0, 0x17e9,\n\t0x1810, 0x1819,\n\t0x1946, 0x194f,\n\t0x19d0, 0x19d9,\n\t0x1a80, 0x1a89,\n\t0x1a90, 0x1a99,\n\t0x1b50, 0x1b59,\n\t0x1bb0, 0x1bb9,\n\t0x1c40, 0x1c49,\n\t0x1c50, 0x1c59,\n\t0xa620, 0xa629,\n\t0xa8d0, 0xa8d9,\n\t0xa900, 0xa909,\n\t0xa9d0, 0xa9d9,\n\t0xaa50, 0xaa59,\n\t0xabf0, 0xabf9,\n\t0xff10, 0xff19,\n\t0x104a0, 0x104a9,\n\t0x11066, 0x1106f,\n\t0x110f0, 0x110f9,\n\t0x11136, 0x1113f,\n\t0x111d0, 0x111d9,\n\t0x116c0, 0x116c9,\n\t0x1d7ce, 0x1d7ff,\n};\n\nint\nisdigitrune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __isdigitr, nelem(__isdigitr)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\treturn 0;\n}\n\nstatic Rune __isalphar[] = {\n\t0x0041, 0x005a,\n\t0x0061, 0x007a,\n\t0x00c0, 0x00d6,\n\t0x00d8, 0x00f6,\n\t0x00f8, 0x02c1,\n\t0x02c6, 0x02d1,\n\t0x02e0, 0x02e4,\n\t0x0370, 0x0374,\n\t0x0376, 0x0377,\n\t0x037a, 0x037d,\n\t0x0388, 0x038a,\n\t0x038e, 0x03a1,\n\t0x03a3, 0x03f5,\n\t0x03f7, 0x0481,\n\t0x048a, 0x0527,\n\t0x0531, 0x0556,\n\t0x0561, 0x0587,\n\t0x05d0, 0x05ea,\n\t0x05f0, 0x05f2,\n\t0x0620, 0x064a,\n\t0x066e, 0x066f,\n\t0x0671, 0x06d3,\n\t0x06e5, 0x06e6,\n\t0x06ee, 
0x06ef,\n\t0x06fa, 0x06fc,\n\t0x0712, 0x072f,\n\t0x074d, 0x07a5,\n\t0x07ca, 0x07ea,\n\t0x07f4, 0x07f5,\n\t0x0800, 0x0815,\n\t0x0840, 0x0858,\n\t0x08a2, 0x08ac,\n\t0x0904, 0x0939,\n\t0x0958, 0x0961,\n\t0x0971, 0x0977,\n\t0x0979, 0x097f,\n\t0x0985, 0x098c,\n\t0x098f, 0x0990,\n\t0x0993, 0x09a8,\n\t0x09aa, 0x09b0,\n\t0x09b6, 0x09b9,\n\t0x09dc, 0x09dd,\n\t0x09df, 0x09e1,\n\t0x09f0, 0x09f1,\n\t0x0a05, 0x0a0a,\n\t0x0a0f, 0x0a10,\n\t0x0a13, 0x0a28,\n\t0x0a2a, 0x0a30,\n\t0x0a32, 0x0a33,\n\t0x0a35, 0x0a36,\n\t0x0a38, 0x0a39,\n\t0x0a59, 0x0a5c,\n\t0x0a72, 0x0a74,\n\t0x0a85, 0x0a8d,\n\t0x0a8f, 0x0a91,\n\t0x0a93, 0x0aa8,\n\t0x0aaa, 0x0ab0,\n\t0x0ab2, 0x0ab3,\n\t0x0ab5, 0x0ab9,\n\t0x0ae0, 0x0ae1,\n\t0x0b05, 0x0b0c,\n\t0x0b0f, 0x0b10,\n\t0x0b13, 0x0b28,\n\t0x0b2a, 0x0b30,\n\t0x0b32, 0x0b33,\n\t0x0b35, 0x0b39,\n\t0x0b5c, 0x0b5d,\n\t0x0b5f, 0x0b61,\n\t0x0b85, 0x0b8a,\n\t0x0b8e, 0x0b90,\n\t0x0b92, 0x0b95,\n\t0x0b99, 0x0b9a,\n\t0x0b9e, 0x0b9f,\n\t0x0ba3, 0x0ba4,\n\t0x0ba8, 0x0baa,\n\t0x0bae, 0x0bb9,\n\t0x0c05, 0x0c0c,\n\t0x0c0e, 0x0c10,\n\t0x0c12, 0x0c28,\n\t0x0c2a, 0x0c33,\n\t0x0c35, 0x0c39,\n\t0x0c58, 0x0c59,\n\t0x0c60, 0x0c61,\n\t0x0c85, 0x0c8c,\n\t0x0c8e, 0x0c90,\n\t0x0c92, 0x0ca8,\n\t0x0caa, 0x0cb3,\n\t0x0cb5, 0x0cb9,\n\t0x0ce0, 0x0ce1,\n\t0x0cf1, 0x0cf2,\n\t0x0d05, 0x0d0c,\n\t0x0d0e, 0x0d10,\n\t0x0d12, 0x0d3a,\n\t0x0d60, 0x0d61,\n\t0x0d7a, 0x0d7f,\n\t0x0d85, 0x0d96,\n\t0x0d9a, 0x0db1,\n\t0x0db3, 0x0dbb,\n\t0x0dc0, 0x0dc6,\n\t0x0e01, 0x0e30,\n\t0x0e32, 0x0e33,\n\t0x0e40, 0x0e46,\n\t0x0e81, 0x0e82,\n\t0x0e87, 0x0e88,\n\t0x0e94, 0x0e97,\n\t0x0e99, 0x0e9f,\n\t0x0ea1, 0x0ea3,\n\t0x0eaa, 0x0eab,\n\t0x0ead, 0x0eb0,\n\t0x0eb2, 0x0eb3,\n\t0x0ec0, 0x0ec4,\n\t0x0edc, 0x0edf,\n\t0x0f40, 0x0f47,\n\t0x0f49, 0x0f6c,\n\t0x0f88, 0x0f8c,\n\t0x1000, 0x102a,\n\t0x1050, 0x1055,\n\t0x105a, 0x105d,\n\t0x1065, 0x1066,\n\t0x106e, 0x1070,\n\t0x1075, 0x1081,\n\t0x10a0, 0x10c5,\n\t0x10d0, 0x10fa,\n\t0x10fc, 0x1248,\n\t0x124a, 0x124d,\n\t0x1250, 0x1256,\n\t0x125a, 0x125d,\n\t0x1260, 0x1288,\n\t0x128a, 
0x128d,\n\t0x1290, 0x12b0,\n\t0x12b2, 0x12b5,\n\t0x12b8, 0x12be,\n\t0x12c2, 0x12c5,\n\t0x12c8, 0x12d6,\n\t0x12d8, 0x1310,\n\t0x1312, 0x1315,\n\t0x1318, 0x135a,\n\t0x1380, 0x138f,\n\t0x13a0, 0x13f4,\n\t0x1401, 0x166c,\n\t0x166f, 0x167f,\n\t0x1681, 0x169a,\n\t0x16a0, 0x16ea,\n\t0x1700, 0x170c,\n\t0x170e, 0x1711,\n\t0x1720, 0x1731,\n\t0x1740, 0x1751,\n\t0x1760, 0x176c,\n\t0x176e, 0x1770,\n\t0x1780, 0x17b3,\n\t0x1820, 0x1877,\n\t0x1880, 0x18a8,\n\t0x18b0, 0x18f5,\n\t0x1900, 0x191c,\n\t0x1950, 0x196d,\n\t0x1970, 0x1974,\n\t0x1980, 0x19ab,\n\t0x19c1, 0x19c7,\n\t0x1a00, 0x1a16,\n\t0x1a20, 0x1a54,\n\t0x1b05, 0x1b33,\n\t0x1b45, 0x1b4b,\n\t0x1b83, 0x1ba0,\n\t0x1bae, 0x1baf,\n\t0x1bba, 0x1be5,\n\t0x1c00, 0x1c23,\n\t0x1c4d, 0x1c4f,\n\t0x1c5a, 0x1c7d,\n\t0x1ce9, 0x1cec,\n\t0x1cee, 0x1cf1,\n\t0x1cf5, 0x1cf6,\n\t0x1d00, 0x1dbf,\n\t0x1e00, 0x1f15,\n\t0x1f18, 0x1f1d,\n\t0x1f20, 0x1f45,\n\t0x1f48, 0x1f4d,\n\t0x1f50, 0x1f57,\n\t0x1f5f, 0x1f7d,\n\t0x1f80, 0x1fb4,\n\t0x1fb6, 0x1fbc,\n\t0x1fc2, 0x1fc4,\n\t0x1fc6, 0x1fcc,\n\t0x1fd0, 0x1fd3,\n\t0x1fd6, 0x1fdb,\n\t0x1fe0, 0x1fec,\n\t0x1ff2, 0x1ff4,\n\t0x1ff6, 0x1ffc,\n\t0x2090, 0x209c,\n\t0x210a, 0x2113,\n\t0x2119, 0x211d,\n\t0x212a, 0x212d,\n\t0x212f, 0x2139,\n\t0x213c, 0x213f,\n\t0x2145, 0x2149,\n\t0x2183, 0x2184,\n\t0x2c00, 0x2c2e,\n\t0x2c30, 0x2c5e,\n\t0x2c60, 0x2ce4,\n\t0x2ceb, 0x2cee,\n\t0x2cf2, 0x2cf3,\n\t0x2d00, 0x2d25,\n\t0x2d30, 0x2d67,\n\t0x2d80, 0x2d96,\n\t0x2da0, 0x2da6,\n\t0x2da8, 0x2dae,\n\t0x2db0, 0x2db6,\n\t0x2db8, 0x2dbe,\n\t0x2dc0, 0x2dc6,\n\t0x2dc8, 0x2dce,\n\t0x2dd0, 0x2dd6,\n\t0x2dd8, 0x2dde,\n\t0x3005, 0x3006,\n\t0x3031, 0x3035,\n\t0x303b, 0x303c,\n\t0x3041, 0x3096,\n\t0x309d, 0x309f,\n\t0x30a1, 0x30fa,\n\t0x30fc, 0x30ff,\n\t0x3105, 0x312d,\n\t0x3131, 0x318e,\n\t0x31a0, 0x31ba,\n\t0x31f0, 0x31ff,\n\t0x3400, 0x4db5,\n\t0x4e00, 0x9fcc,\n\t0xa000, 0xa48c,\n\t0xa4d0, 0xa4fd,\n\t0xa500, 0xa60c,\n\t0xa610, 0xa61f,\n\t0xa62a, 0xa62b,\n\t0xa640, 0xa66e,\n\t0xa67f, 0xa697,\n\t0xa6a0, 0xa6e5,\n\t0xa717, 0xa71f,\n\t0xa722, 
0xa788,\n\t0xa78b, 0xa78e,\n\t0xa790, 0xa793,\n\t0xa7a0, 0xa7aa,\n\t0xa7f8, 0xa801,\n\t0xa803, 0xa805,\n\t0xa807, 0xa80a,\n\t0xa80c, 0xa822,\n\t0xa840, 0xa873,\n\t0xa882, 0xa8b3,\n\t0xa8f2, 0xa8f7,\n\t0xa90a, 0xa925,\n\t0xa930, 0xa946,\n\t0xa960, 0xa97c,\n\t0xa984, 0xa9b2,\n\t0xaa00, 0xaa28,\n\t0xaa40, 0xaa42,\n\t0xaa44, 0xaa4b,\n\t0xaa60, 0xaa76,\n\t0xaa80, 0xaaaf,\n\t0xaab5, 0xaab6,\n\t0xaab9, 0xaabd,\n\t0xaadb, 0xaadd,\n\t0xaae0, 0xaaea,\n\t0xaaf2, 0xaaf4,\n\t0xab01, 0xab06,\n\t0xab09, 0xab0e,\n\t0xab11, 0xab16,\n\t0xab20, 0xab26,\n\t0xab28, 0xab2e,\n\t0xabc0, 0xabe2,\n\t0xac00, 0xd7a3,\n\t0xd7b0, 0xd7c6,\n\t0xd7cb, 0xd7fb,\n\t0xf900, 0xfa6d,\n\t0xfa70, 0xfad9,\n\t0xfb00, 0xfb06,\n\t0xfb13, 0xfb17,\n\t0xfb1f, 0xfb28,\n\t0xfb2a, 0xfb36,\n\t0xfb38, 0xfb3c,\n\t0xfb40, 0xfb41,\n\t0xfb43, 0xfb44,\n\t0xfb46, 0xfbb1,\n\t0xfbd3, 0xfd3d,\n\t0xfd50, 0xfd8f,\n\t0xfd92, 0xfdc7,\n\t0xfdf0, 0xfdfb,\n\t0xfe70, 0xfe74,\n\t0xfe76, 0xfefc,\n\t0xff21, 0xff3a,\n\t0xff41, 0xff5a,\n\t0xff66, 0xffbe,\n\t0xffc2, 0xffc7,\n\t0xffca, 0xffcf,\n\t0xffd2, 0xffd7,\n\t0xffda, 0xffdc,\n\t0x10000, 0x1000b,\n\t0x1000d, 0x10026,\n\t0x10028, 0x1003a,\n\t0x1003c, 0x1003d,\n\t0x1003f, 0x1004d,\n\t0x10050, 0x1005d,\n\t0x10080, 0x100fa,\n\t0x10280, 0x1029c,\n\t0x102a0, 0x102d0,\n\t0x10300, 0x1031e,\n\t0x10330, 0x10340,\n\t0x10342, 0x10349,\n\t0x10380, 0x1039d,\n\t0x103a0, 0x103c3,\n\t0x103c8, 0x103cf,\n\t0x10400, 0x1049d,\n\t0x10800, 0x10805,\n\t0x1080a, 0x10835,\n\t0x10837, 0x10838,\n\t0x1083f, 0x10855,\n\t0x10900, 0x10915,\n\t0x10920, 0x10939,\n\t0x10980, 0x109b7,\n\t0x109be, 0x109bf,\n\t0x10a10, 0x10a13,\n\t0x10a15, 0x10a17,\n\t0x10a19, 0x10a33,\n\t0x10a60, 0x10a7c,\n\t0x10b00, 0x10b35,\n\t0x10b40, 0x10b55,\n\t0x10b60, 0x10b72,\n\t0x10c00, 0x10c48,\n\t0x11003, 0x11037,\n\t0x11083, 0x110af,\n\t0x110d0, 0x110e8,\n\t0x11103, 0x11126,\n\t0x11183, 0x111b2,\n\t0x111c1, 0x111c4,\n\t0x11680, 0x116aa,\n\t0x12000, 0x1236e,\n\t0x13000, 0x1342e,\n\t0x16800, 0x16a38,\n\t0x16f00, 0x16f44,\n\t0x16f93, 
0x16f9f,\n\t0x1b000, 0x1b001,\n\t0x1d400, 0x1d454,\n\t0x1d456, 0x1d49c,\n\t0x1d49e, 0x1d49f,\n\t0x1d4a5, 0x1d4a6,\n\t0x1d4a9, 0x1d4ac,\n\t0x1d4ae, 0x1d4b9,\n\t0x1d4bd, 0x1d4c3,\n\t0x1d4c5, 0x1d505,\n\t0x1d507, 0x1d50a,\n\t0x1d50d, 0x1d514,\n\t0x1d516, 0x1d51c,\n\t0x1d51e, 0x1d539,\n\t0x1d53b, 0x1d53e,\n\t0x1d540, 0x1d544,\n\t0x1d54a, 0x1d550,\n\t0x1d552, 0x1d6a5,\n\t0x1d6a8, 0x1d6c0,\n\t0x1d6c2, 0x1d6da,\n\t0x1d6dc, 0x1d6fa,\n\t0x1d6fc, 0x1d714,\n\t0x1d716, 0x1d734,\n\t0x1d736, 0x1d74e,\n\t0x1d750, 0x1d76e,\n\t0x1d770, 0x1d788,\n\t0x1d78a, 0x1d7a8,\n\t0x1d7aa, 0x1d7c2,\n\t0x1d7c4, 0x1d7cb,\n\t0x1ee00, 0x1ee03,\n\t0x1ee05, 0x1ee1f,\n\t0x1ee21, 0x1ee22,\n\t0x1ee29, 0x1ee32,\n\t0x1ee34, 0x1ee37,\n\t0x1ee4d, 0x1ee4f,\n\t0x1ee51, 0x1ee52,\n\t0x1ee61, 0x1ee62,\n\t0x1ee67, 0x1ee6a,\n\t0x1ee6c, 0x1ee72,\n\t0x1ee74, 0x1ee77,\n\t0x1ee79, 0x1ee7c,\n\t0x1ee80, 0x1ee89,\n\t0x1ee8b, 0x1ee9b,\n\t0x1eea1, 0x1eea3,\n\t0x1eea5, 0x1eea9,\n\t0x1eeab, 0x1eebb,\n\t0x20000, 0x2a6d6,\n\t0x2a700, 0x2b734,\n\t0x2b740, 0x2b81d,\n\t0x2f800, 0x2fa1d,\n};\n\nstatic Rune __isalphas[] = 
{\n\t0x00aa,\n\t0x00b5,\n\t0x00ba,\n\t0x02ec,\n\t0x02ee,\n\t0x0386,\n\t0x038c,\n\t0x0559,\n\t0x06d5,\n\t0x06ff,\n\t0x0710,\n\t0x07b1,\n\t0x07fa,\n\t0x081a,\n\t0x0824,\n\t0x0828,\n\t0x08a0,\n\t0x093d,\n\t0x0950,\n\t0x09b2,\n\t0x09bd,\n\t0x09ce,\n\t0x0a5e,\n\t0x0abd,\n\t0x0ad0,\n\t0x0b3d,\n\t0x0b71,\n\t0x0b83,\n\t0x0b9c,\n\t0x0bd0,\n\t0x0c3d,\n\t0x0cbd,\n\t0x0cde,\n\t0x0d3d,\n\t0x0d4e,\n\t0x0dbd,\n\t0x0e84,\n\t0x0e8a,\n\t0x0e8d,\n\t0x0ea5,\n\t0x0ea7,\n\t0x0ebd,\n\t0x0ec6,\n\t0x0f00,\n\t0x103f,\n\t0x1061,\n\t0x108e,\n\t0x10c7,\n\t0x10cd,\n\t0x1258,\n\t0x12c0,\n\t0x17d7,\n\t0x17dc,\n\t0x18aa,\n\t0x1aa7,\n\t0x1f59,\n\t0x1f5b,\n\t0x1f5d,\n\t0x1fbe,\n\t0x2071,\n\t0x207f,\n\t0x2102,\n\t0x2107,\n\t0x2115,\n\t0x2124,\n\t0x2126,\n\t0x2128,\n\t0x214e,\n\t0x2d27,\n\t0x2d2d,\n\t0x2d6f,\n\t0x2e2f,\n\t0xa8fb,\n\t0xa9cf,\n\t0xaa7a,\n\t0xaab1,\n\t0xaac0,\n\t0xaac2,\n\t0xfb1d,\n\t0xfb3e,\n\t0x10808,\n\t0x1083c,\n\t0x10a00,\n\t0x16f50,\n\t0x1d4a2,\n\t0x1d4bb,\n\t0x1d546,\n\t0x1ee24,\n\t0x1ee27,\n\t0x1ee39,\n\t0x1ee3b,\n\t0x1ee42,\n\t0x1ee47,\n\t0x1ee49,\n\t0x1ee4b,\n\t0x1ee54,\n\t0x1ee57,\n\t0x1ee59,\n\t0x1ee5b,\n\t0x1ee5d,\n\t0x1ee5f,\n\t0x1ee64,\n\t0x1ee7e,\n};\n\nint\nisalpharune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __isalphar, nelem(__isalphar)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\tp = rbsearch(c, __isalphas, nelem(__isalphas), 1);\n\tif(p && c == p[0])\n\t\treturn 1;\n\treturn 0;\n}\n\nstatic Rune __isupperr[] = {\n\t0x0041, 0x005a,\n\t0x00c0, 0x00d6,\n\t0x00d8, 0x00de,\n\t0x0178, 0x0179,\n\t0x0181, 0x0182,\n\t0x0186, 0x0187,\n\t0x0189, 0x018b,\n\t0x018e, 0x0191,\n\t0x0193, 0x0194,\n\t0x0196, 0x0198,\n\t0x019c, 0x019d,\n\t0x019f, 0x01a0,\n\t0x01a6, 0x01a7,\n\t0x01ae, 0x01af,\n\t0x01b1, 0x01b3,\n\t0x01b7, 0x01b8,\n\t0x01f6, 0x01f8,\n\t0x023a, 0x023b,\n\t0x023d, 0x023e,\n\t0x0243, 0x0246,\n\t0x0388, 0x038a,\n\t0x038e, 0x038f,\n\t0x0391, 0x03a1,\n\t0x03a3, 0x03ab,\n\t0x03d2, 0x03d4,\n\t0x03f9, 0x03fa,\n\t0x03fd, 0x042f,\n\t0x04c0, 0x04c1,\n\t0x0531, 
0x0556,\n\t0x10a0, 0x10c5,\n\t0x1f08, 0x1f0f,\n\t0x1f18, 0x1f1d,\n\t0x1f28, 0x1f2f,\n\t0x1f38, 0x1f3f,\n\t0x1f48, 0x1f4d,\n\t0x1f68, 0x1f6f,\n\t0x1f88, 0x1f8f,\n\t0x1f98, 0x1f9f,\n\t0x1fa8, 0x1faf,\n\t0x1fb8, 0x1fbc,\n\t0x1fc8, 0x1fcc,\n\t0x1fd8, 0x1fdb,\n\t0x1fe8, 0x1fec,\n\t0x1ff8, 0x1ffc,\n\t0x210b, 0x210d,\n\t0x2110, 0x2112,\n\t0x2119, 0x211d,\n\t0x212a, 0x212d,\n\t0x2130, 0x2133,\n\t0x213e, 0x213f,\n\t0x2160, 0x216f,\n\t0x24b6, 0x24cf,\n\t0x2c00, 0x2c2e,\n\t0x2c62, 0x2c64,\n\t0x2c6d, 0x2c70,\n\t0x2c7e, 0x2c80,\n\t0xa77d, 0xa77e,\n\t0xff21, 0xff3a,\n\t0x10400, 0x10427,\n\t0x1d400, 0x1d419,\n\t0x1d434, 0x1d44d,\n\t0x1d468, 0x1d481,\n\t0x1d49e, 0x1d49f,\n\t0x1d4a5, 0x1d4a6,\n\t0x1d4a9, 0x1d4ac,\n\t0x1d4ae, 0x1d4b5,\n\t0x1d4d0, 0x1d4e9,\n\t0x1d504, 0x1d505,\n\t0x1d507, 0x1d50a,\n\t0x1d50d, 0x1d514,\n\t0x1d516, 0x1d51c,\n\t0x1d538, 0x1d539,\n\t0x1d53b, 0x1d53e,\n\t0x1d540, 0x1d544,\n\t0x1d54a, 0x1d550,\n\t0x1d56c, 0x1d585,\n\t0x1d5a0, 0x1d5b9,\n\t0x1d5d4, 0x1d5ed,\n\t0x1d608, 0x1d621,\n\t0x1d63c, 0x1d655,\n\t0x1d670, 0x1d689,\n\t0x1d6a8, 0x1d6c0,\n\t0x1d6e2, 0x1d6fa,\n\t0x1d71c, 0x1d734,\n\t0x1d756, 0x1d76e,\n\t0x1d790, 0x1d7a8,\n};\n\nstatic Rune __isupperp[] = {\n\t0x0100, 0x0136,\n\t0x0139, 0x0147,\n\t0x014a, 0x0176,\n\t0x017b, 0x017d,\n\t0x01a2, 0x01a4,\n\t0x01cd, 0x01db,\n\t0x01de, 0x01ee,\n\t0x01fa, 0x0232,\n\t0x0248, 0x024e,\n\t0x0370, 0x0372,\n\t0x03d8, 0x03ee,\n\t0x0460, 0x0480,\n\t0x048a, 0x04be,\n\t0x04c3, 0x04cd,\n\t0x04d0, 0x0526,\n\t0x1e00, 0x1e94,\n\t0x1e9e, 0x1efe,\n\t0x1f59, 0x1f5f,\n\t0x2124, 0x2128,\n\t0x2c67, 0x2c6b,\n\t0x2c82, 0x2ce2,\n\t0x2ceb, 0x2ced,\n\t0xa640, 0xa66c,\n\t0xa680, 0xa696,\n\t0xa722, 0xa72e,\n\t0xa732, 0xa76e,\n\t0xa779, 0xa77b,\n\t0xa780, 0xa786,\n\t0xa78b, 0xa78d,\n\t0xa790, 0xa792,\n\t0xa7a0, 0xa7aa,\n};\n\nstatic Rune __isuppers[] = 
{\n\t0x0184,\n\t0x01a9,\n\t0x01ac,\n\t0x01b5,\n\t0x01bc,\n\t0x01c4,\n\t0x01c7,\n\t0x01ca,\n\t0x01f1,\n\t0x01f4,\n\t0x0241,\n\t0x0376,\n\t0x0386,\n\t0x038c,\n\t0x03cf,\n\t0x03f4,\n\t0x03f7,\n\t0x10c7,\n\t0x10cd,\n\t0x2102,\n\t0x2107,\n\t0x2115,\n\t0x2145,\n\t0x2183,\n\t0x2c60,\n\t0x2c72,\n\t0x2c75,\n\t0x2cf2,\n\t0x1d49c,\n\t0x1d4a2,\n\t0x1d546,\n\t0x1d7ca,\n};\n\nint\nisupperrune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __isupperr, nelem(__isupperr)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\tp = rbsearch(c, __isupperp, nelem(__isupperp)/2, 2);\n\tif(p && c >= p[0] && c <= p[1] && !((c - p[0]) & 1))\n\t\treturn 1;\n\tp = rbsearch(c, __isuppers, nelem(__isuppers), 1);\n\tif(p && c == p[0])\n\t\treturn 1;\n\treturn 0;\n}\n\nstatic Rune __islowerr[] = {\n\t0x0061, 0x007a,\n\t0x00df, 0x00f6,\n\t0x00f8, 0x00ff,\n\t0x0137, 0x0138,\n\t0x0148, 0x0149,\n\t0x017e, 0x0180,\n\t0x018c, 0x018d,\n\t0x0199, 0x019b,\n\t0x01aa, 0x01ab,\n\t0x01b9, 0x01ba,\n\t0x01bd, 0x01bf,\n\t0x01dc, 0x01dd,\n\t0x01ef, 0x01f0,\n\t0x0233, 0x0239,\n\t0x023f, 0x0240,\n\t0x024f, 0x0293,\n\t0x0295, 0x02af,\n\t0x037b, 0x037d,\n\t0x03ac, 0x03ce,\n\t0x03d0, 0x03d1,\n\t0x03d5, 0x03d7,\n\t0x03ef, 0x03f3,\n\t0x03fb, 0x03fc,\n\t0x0430, 0x045f,\n\t0x04ce, 0x04cf,\n\t0x0561, 0x0587,\n\t0x1d00, 0x1d2b,\n\t0x1d6b, 0x1d77,\n\t0x1d79, 0x1d9a,\n\t0x1e95, 0x1e9d,\n\t0x1eff, 0x1f07,\n\t0x1f10, 0x1f15,\n\t0x1f20, 0x1f27,\n\t0x1f30, 0x1f37,\n\t0x1f40, 0x1f45,\n\t0x1f50, 0x1f57,\n\t0x1f60, 0x1f67,\n\t0x1f70, 0x1f7d,\n\t0x1f80, 0x1f87,\n\t0x1f90, 0x1f97,\n\t0x1fa0, 0x1fa7,\n\t0x1fb0, 0x1fb4,\n\t0x1fb6, 0x1fb7,\n\t0x1fc2, 0x1fc4,\n\t0x1fc6, 0x1fc7,\n\t0x1fd0, 0x1fd3,\n\t0x1fd6, 0x1fd7,\n\t0x1fe0, 0x1fe7,\n\t0x1ff2, 0x1ff4,\n\t0x1ff6, 0x1ff7,\n\t0x210e, 0x210f,\n\t0x213c, 0x213d,\n\t0x2146, 0x2149,\n\t0x2170, 0x217f,\n\t0x24d0, 0x24e9,\n\t0x2c30, 0x2c5e,\n\t0x2c65, 0x2c66,\n\t0x2c73, 0x2c74,\n\t0x2c76, 0x2c7b,\n\t0x2ce3, 0x2ce4,\n\t0x2d00, 0x2d25,\n\t0xa72f, 0xa731,\n\t0xa771, 0xa778,\n\t0xfb00, 
0xfb06,\n\t0xfb13, 0xfb17,\n\t0xff41, 0xff5a,\n\t0x10428, 0x1044f,\n\t0x1d41a, 0x1d433,\n\t0x1d44e, 0x1d454,\n\t0x1d456, 0x1d467,\n\t0x1d482, 0x1d49b,\n\t0x1d4b6, 0x1d4b9,\n\t0x1d4bd, 0x1d4c3,\n\t0x1d4c5, 0x1d4cf,\n\t0x1d4ea, 0x1d503,\n\t0x1d51e, 0x1d537,\n\t0x1d552, 0x1d56b,\n\t0x1d586, 0x1d59f,\n\t0x1d5ba, 0x1d5d3,\n\t0x1d5ee, 0x1d607,\n\t0x1d622, 0x1d63b,\n\t0x1d656, 0x1d66f,\n\t0x1d68a, 0x1d6a5,\n\t0x1d6c2, 0x1d6da,\n\t0x1d6dc, 0x1d6e1,\n\t0x1d6fc, 0x1d714,\n\t0x1d716, 0x1d71b,\n\t0x1d736, 0x1d74e,\n\t0x1d750, 0x1d755,\n\t0x1d770, 0x1d788,\n\t0x1d78a, 0x1d78f,\n\t0x1d7aa, 0x1d7c2,\n\t0x1d7c4, 0x1d7c9,\n};\n\nstatic Rune __islowerp[] = {\n\t0x0101, 0x0135,\n\t0x013a, 0x0146,\n\t0x014b, 0x0177,\n\t0x017a, 0x017c,\n\t0x0183, 0x0185,\n\t0x01a1, 0x01a5,\n\t0x01b4, 0x01b6,\n\t0x01cc, 0x01da,\n\t0x01df, 0x01ed,\n\t0x01f3, 0x01f5,\n\t0x01f9, 0x0231,\n\t0x0247, 0x024d,\n\t0x0371, 0x0373,\n\t0x03d9, 0x03ed,\n\t0x0461, 0x0481,\n\t0x048b, 0x04bf,\n\t0x04c2, 0x04cc,\n\t0x04d1, 0x0527,\n\t0x1e01, 0x1e93,\n\t0x1e9f, 0x1efd,\n\t0x2c68, 0x2c6c,\n\t0x2c81, 0x2ce1,\n\t0x2cec, 0x2cee,\n\t0xa641, 0xa66d,\n\t0xa681, 0xa697,\n\t0xa723, 0xa72d,\n\t0xa733, 0xa76f,\n\t0xa77a, 0xa77c,\n\t0xa77f, 0xa787,\n\t0xa78c, 0xa78e,\n\t0xa791, 0xa793,\n\t0xa7a1, 0xa7a9,\n};\n\nstatic Rune __islowers[] = {\n\t0x00b5,\n\t0x0188,\n\t0x0192,\n\t0x0195,\n\t0x019e,\n\t0x01a8,\n\t0x01ad,\n\t0x01b0,\n\t0x01c6,\n\t0x01c9,\n\t0x023c,\n\t0x0242,\n\t0x0377,\n\t0x0390,\n\t0x03f5,\n\t0x03f8,\n\t0x1fbe,\n\t0x210a,\n\t0x2113,\n\t0x212f,\n\t0x2134,\n\t0x2139,\n\t0x214e,\n\t0x2184,\n\t0x2c61,\n\t0x2c71,\n\t0x2cf3,\n\t0x2d27,\n\t0x2d2d,\n\t0xa7fa,\n\t0x1d4bb,\n\t0x1d7cb,\n};\n\nint\nislowerrune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __islowerr, nelem(__islowerr)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\tp = rbsearch(c, __islowerp, nelem(__islowerp)/2, 2);\n\tif(p && c >= p[0] && c <= p[1] && !((c - p[0]) & 1))\n\t\treturn 1;\n\tp = rbsearch(c, __islowers, nelem(__islowers), 1);\n\tif(p && c == 
p[0])\n\t\treturn 1;\n\treturn 0;\n}\n\nstatic Rune __istitler[] = {\n\t0x0041, 0x005a,\n\t0x00c0, 0x00d6,\n\t0x00d8, 0x00de,\n\t0x0178, 0x0179,\n\t0x0181, 0x0182,\n\t0x0186, 0x0187,\n\t0x0189, 0x018b,\n\t0x018e, 0x0191,\n\t0x0193, 0x0194,\n\t0x0196, 0x0198,\n\t0x019c, 0x019d,\n\t0x019f, 0x01a0,\n\t0x01a6, 0x01a7,\n\t0x01ae, 0x01af,\n\t0x01b1, 0x01b3,\n\t0x01b7, 0x01b8,\n\t0x01f6, 0x01f8,\n\t0x023a, 0x023b,\n\t0x023d, 0x023e,\n\t0x0243, 0x0246,\n\t0x0388, 0x038a,\n\t0x038e, 0x038f,\n\t0x0391, 0x03a1,\n\t0x03a3, 0x03ab,\n\t0x03f9, 0x03fa,\n\t0x03fd, 0x042f,\n\t0x04c0, 0x04c1,\n\t0x0531, 0x0556,\n\t0x10a0, 0x10c5,\n\t0x1f08, 0x1f0f,\n\t0x1f18, 0x1f1d,\n\t0x1f28, 0x1f2f,\n\t0x1f38, 0x1f3f,\n\t0x1f48, 0x1f4d,\n\t0x1f68, 0x1f6f,\n\t0x1f88, 0x1f8f,\n\t0x1f98, 0x1f9f,\n\t0x1fa8, 0x1faf,\n\t0x1fb8, 0x1fbc,\n\t0x1fc8, 0x1fcc,\n\t0x1fd8, 0x1fdb,\n\t0x1fe8, 0x1fec,\n\t0x1ff8, 0x1ffc,\n\t0x2160, 0x216f,\n\t0x24b6, 0x24cf,\n\t0x2c00, 0x2c2e,\n\t0x2c62, 0x2c64,\n\t0x2c6d, 0x2c70,\n\t0x2c7e, 0x2c80,\n\t0xa77d, 0xa77e,\n\t0xff21, 0xff3a,\n\t0x10400, 0x10427,\n};\n\nstatic Rune __istitlep[] = {\n\t0x0100, 0x012e,\n\t0x0132, 0x0136,\n\t0x0139, 0x0147,\n\t0x014a, 0x0176,\n\t0x017b, 0x017d,\n\t0x01a2, 0x01a4,\n\t0x01cb, 0x01db,\n\t0x01de, 0x01ee,\n\t0x01f2, 0x01f4,\n\t0x01fa, 0x0232,\n\t0x0248, 0x024e,\n\t0x0370, 0x0372,\n\t0x03d8, 0x03ee,\n\t0x0460, 0x0480,\n\t0x048a, 0x04be,\n\t0x04c3, 0x04cd,\n\t0x04d0, 0x0526,\n\t0x1e00, 0x1e94,\n\t0x1ea0, 0x1efe,\n\t0x1f59, 0x1f5f,\n\t0x2c67, 0x2c6b,\n\t0x2c82, 0x2ce2,\n\t0x2ceb, 0x2ced,\n\t0xa640, 0xa66c,\n\t0xa680, 0xa696,\n\t0xa722, 0xa72e,\n\t0xa732, 0xa76e,\n\t0xa779, 0xa77b,\n\t0xa780, 0xa786,\n\t0xa78b, 0xa78d,\n\t0xa790, 0xa792,\n\t0xa7a0, 0xa7aa,\n};\n\nstatic Rune __istitles[] = {\n\t0x0184,\n\t0x01a9,\n\t0x01ac,\n\t0x01b5,\n\t0x01bc,\n\t0x01c5,\n\t0x01c8,\n\t0x0241,\n\t0x0376,\n\t0x0386,\n\t0x038c,\n\t0x03cf,\n\t0x03f7,\n\t0x10c7,\n\t0x10cd,\n\t0x2132,\n\t0x2183,\n\t0x2c60,\n\t0x2c72,\n\t0x2c75,\n\t0x2cf2,\n};\n\nint\nistitlerune(Rune 
c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __istitler, nelem(__istitler)/2, 2);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn 1;\n\tp = rbsearch(c, __istitlep, nelem(__istitlep)/2, 2);\n\tif(p && c >= p[0] && c <= p[1] && !((c - p[0]) & 1))\n\t\treturn 1;\n\tp = rbsearch(c, __istitles, nelem(__istitles), 1);\n\tif(p && c == p[0])\n\t\treturn 1;\n\treturn 0;\n}\n\nstatic Rune __toupperr[] = {\n\t0x0061, 0x007a, 1048544,\n\t0x00e0, 0x00f6, 1048544,\n\t0x00f8, 0x00fe, 1048544,\n\t0x023f, 0x0240, 1059391,\n\t0x0256, 0x0257, 1048371,\n\t0x028a, 0x028b, 1048359,\n\t0x037b, 0x037d, 1048706,\n\t0x03ad, 0x03af, 1048539,\n\t0x03b1, 0x03c1, 1048544,\n\t0x03c3, 0x03cb, 1048544,\n\t0x03cd, 0x03ce, 1048513,\n\t0x0430, 0x044f, 1048544,\n\t0x0450, 0x045f, 1048496,\n\t0x0561, 0x0586, 1048528,\n\t0x1f00, 0x1f07, 1048584,\n\t0x1f10, 0x1f15, 1048584,\n\t0x1f20, 0x1f27, 1048584,\n\t0x1f30, 0x1f37, 1048584,\n\t0x1f40, 0x1f45, 1048584,\n\t0x1f60, 0x1f67, 1048584,\n\t0x1f70, 0x1f71, 1048650,\n\t0x1f72, 0x1f75, 1048662,\n\t0x1f76, 0x1f77, 1048676,\n\t0x1f78, 0x1f79, 1048704,\n\t0x1f7a, 0x1f7b, 1048688,\n\t0x1f7c, 0x1f7d, 1048702,\n\t0x1f80, 0x1f87, 1048584,\n\t0x1f90, 0x1f97, 1048584,\n\t0x1fa0, 0x1fa7, 1048584,\n\t0x1fb0, 0x1fb1, 1048584,\n\t0x1fd0, 0x1fd1, 1048584,\n\t0x1fe0, 0x1fe1, 1048584,\n\t0x2170, 0x217f, 1048560,\n\t0x24d0, 0x24e9, 1048550,\n\t0x2c30, 0x2c5e, 1048528,\n\t0x2d00, 0x2d25, 1041312,\n\t0xff41, 0xff5a, 1048544,\n\t0x10428, 0x1044f, 1048536,\n};\n\nstatic Rune __toupperp[] = {\n\t0x0101, 0x012f, 1048575,\n\t0x0133, 0x0137, 1048575,\n\t0x013a, 0x0148, 1048575,\n\t0x014b, 0x0177, 1048575,\n\t0x017a, 0x017e, 1048575,\n\t0x0183, 0x0185, 1048575,\n\t0x01a1, 0x01a5, 1048575,\n\t0x01b4, 0x01b6, 1048575,\n\t0x01ce, 0x01dc, 1048575,\n\t0x01df, 0x01ef, 1048575,\n\t0x01f9, 0x021f, 1048575,\n\t0x0223, 0x0233, 1048575,\n\t0x0247, 0x024f, 1048575,\n\t0x0371, 0x0373, 1048575,\n\t0x03d9, 0x03ef, 1048575,\n\t0x0461, 0x0481, 1048575,\n\t0x048b, 0x04bf, 1048575,\n\t0x04c2, 0x04ce, 
1048575,\n\t0x04d1, 0x0527, 1048575,\n\t0x1e01, 0x1e95, 1048575,\n\t0x1ea1, 0x1eff, 1048575,\n\t0x1f51, 0x1f57, 1048584,\n\t0x2c68, 0x2c6c, 1048575,\n\t0x2c81, 0x2ce3, 1048575,\n\t0x2cec, 0x2cee, 1048575,\n\t0xa641, 0xa66d, 1048575,\n\t0xa681, 0xa697, 1048575,\n\t0xa723, 0xa72f, 1048575,\n\t0xa733, 0xa76f, 1048575,\n\t0xa77a, 0xa77c, 1048575,\n\t0xa77f, 0xa787, 1048575,\n\t0xa791, 0xa793, 1048575,\n\t0xa7a1, 0xa7a9, 1048575,\n};\n\nstatic Rune __touppers[] = {\n\t0x00b5, 1049319,\n\t0x00ff, 1048697,\n\t0x0131, 1048344,\n\t0x017f, 1048276,\n\t0x0180, 1048771,\n\t0x0188, 1048575,\n\t0x018c, 1048575,\n\t0x0192, 1048575,\n\t0x0195, 1048673,\n\t0x0199, 1048575,\n\t0x019a, 1048739,\n\t0x019e, 1048706,\n\t0x01a8, 1048575,\n\t0x01ad, 1048575,\n\t0x01b0, 1048575,\n\t0x01b9, 1048575,\n\t0x01bd, 1048575,\n\t0x01bf, 1048632,\n\t0x01c5, 1048575,\n\t0x01c6, 1048574,\n\t0x01c8, 1048575,\n\t0x01c9, 1048574,\n\t0x01cb, 1048575,\n\t0x01cc, 1048574,\n\t0x01dd, 1048497,\n\t0x01f2, 1048575,\n\t0x01f3, 1048574,\n\t0x01f5, 1048575,\n\t0x023c, 1048575,\n\t0x0242, 1048575,\n\t0x0250, 1059359,\n\t0x0251, 1059356,\n\t0x0252, 1059358,\n\t0x0253, 1048366,\n\t0x0254, 1048370,\n\t0x0259, 1048374,\n\t0x025b, 1048373,\n\t0x0260, 1048371,\n\t0x0263, 1048369,\n\t0x0265, 1090856,\n\t0x0266, 1090884,\n\t0x0268, 1048367,\n\t0x0269, 1048365,\n\t0x026b, 1059319,\n\t0x026f, 1048365,\n\t0x0271, 1059325,\n\t0x0272, 1048363,\n\t0x0275, 1048362,\n\t0x027d, 1059303,\n\t0x0280, 1048358,\n\t0x0283, 1048358,\n\t0x0288, 1048358,\n\t0x0289, 1048507,\n\t0x028c, 1048505,\n\t0x0292, 1048357,\n\t0x0345, 1048660,\n\t0x0377, 1048575,\n\t0x03ac, 1048538,\n\t0x03c2, 1048545,\n\t0x03cc, 1048512,\n\t0x03d0, 1048514,\n\t0x03d1, 1048519,\n\t0x03d5, 1048529,\n\t0x03d6, 1048522,\n\t0x03d7, 1048568,\n\t0x03f0, 1048490,\n\t0x03f1, 1048496,\n\t0x03f2, 1048583,\n\t0x03f5, 1048480,\n\t0x03f8, 1048575,\n\t0x03fb, 1048575,\n\t0x04cf, 1048561,\n\t0x1d79, 1083908,\n\t0x1d7d, 1052390,\n\t0x1e9b, 1048517,\n\t0x1fb3, 1048585,\n\t0x1fbe, 
1041371,\n\t0x1fc3, 1048585,\n\t0x1fe5, 1048583,\n\t0x1ff3, 1048585,\n\t0x214e, 1048548,\n\t0x2184, 1048575,\n\t0x2c61, 1048575,\n\t0x2c65, 1037781,\n\t0x2c66, 1037784,\n\t0x2c73, 1048575,\n\t0x2c76, 1048575,\n\t0x2cf3, 1048575,\n\t0x2d27, 1041312,\n\t0x2d2d, 1041312,\n\t0xa78c, 1048575,\n};\n\nRune\ntoupperrune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __toupperr, nelem(__toupperr)/3, 3);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn c + p[2] - 1048576;\n\tp = rbsearch(c, __toupperp, nelem(__toupperp)/3, 3);\n\tif(p && c >= p[0] && c <= p[1] && !((c - p[0]) & 1))\n\t\treturn c + p[2] - 1048576;\n\tp = rbsearch(c, __touppers, nelem(__touppers)/2, 2);\n\tif(p && c == p[0])\n\t\treturn c + p[1] - 1048576;\n\treturn c;\n}\n\nstatic Rune __tolowerr[] = {\n\t0x0041, 0x005a, 1048608,\n\t0x00c0, 0x00d6, 1048608,\n\t0x00d8, 0x00de, 1048608,\n\t0x0189, 0x018a, 1048781,\n\t0x01b1, 0x01b2, 1048793,\n\t0x0388, 0x038a, 1048613,\n\t0x038e, 0x038f, 1048639,\n\t0x0391, 0x03a1, 1048608,\n\t0x03a3, 0x03ab, 1048608,\n\t0x03fd, 0x03ff, 1048446,\n\t0x0400, 0x040f, 1048656,\n\t0x0410, 0x042f, 1048608,\n\t0x0531, 0x0556, 1048624,\n\t0x10a0, 0x10c5, 1055840,\n\t0x1f08, 0x1f0f, 1048568,\n\t0x1f18, 0x1f1d, 1048568,\n\t0x1f28, 0x1f2f, 1048568,\n\t0x1f38, 0x1f3f, 1048568,\n\t0x1f48, 0x1f4d, 1048568,\n\t0x1f68, 0x1f6f, 1048568,\n\t0x1f88, 0x1f8f, 1048568,\n\t0x1f98, 0x1f9f, 1048568,\n\t0x1fa8, 0x1faf, 1048568,\n\t0x1fb8, 0x1fb9, 1048568,\n\t0x1fba, 0x1fbb, 1048502,\n\t0x1fc8, 0x1fcb, 1048490,\n\t0x1fd8, 0x1fd9, 1048568,\n\t0x1fda, 0x1fdb, 1048476,\n\t0x1fe8, 0x1fe9, 1048568,\n\t0x1fea, 0x1feb, 1048464,\n\t0x1ff8, 0x1ff9, 1048448,\n\t0x1ffa, 0x1ffb, 1048450,\n\t0x2160, 0x216f, 1048592,\n\t0x24b6, 0x24cf, 1048602,\n\t0x2c00, 0x2c2e, 1048624,\n\t0x2c7e, 0x2c7f, 1037761,\n\t0xff21, 0xff3a, 1048608,\n\t0x10400, 0x10427, 1048616,\n};\n\nstatic Rune __tolowerp[] = {\n\t0x0100, 0x012e, 1048577,\n\t0x0132, 0x0136, 1048577,\n\t0x0139, 0x0147, 1048577,\n\t0x014a, 0x0176, 1048577,\n\t0x017b, 0x017d, 
1048577,\n\t0x01a2, 0x01a4, 1048577,\n\t0x01b3, 0x01b5, 1048577,\n\t0x01cd, 0x01db, 1048577,\n\t0x01de, 0x01ee, 1048577,\n\t0x01f8, 0x021e, 1048577,\n\t0x0222, 0x0232, 1048577,\n\t0x0248, 0x024e, 1048577,\n\t0x0370, 0x0372, 1048577,\n\t0x03d8, 0x03ee, 1048577,\n\t0x0460, 0x0480, 1048577,\n\t0x048a, 0x04be, 1048577,\n\t0x04c3, 0x04cd, 1048577,\n\t0x04d0, 0x0526, 1048577,\n\t0x1e00, 0x1e94, 1048577,\n\t0x1ea0, 0x1efe, 1048577,\n\t0x1f59, 0x1f5f, 1048568,\n\t0x2c67, 0x2c6b, 1048577,\n\t0x2c80, 0x2ce2, 1048577,\n\t0x2ceb, 0x2ced, 1048577,\n\t0xa640, 0xa66c, 1048577,\n\t0xa680, 0xa696, 1048577,\n\t0xa722, 0xa72e, 1048577,\n\t0xa732, 0xa76e, 1048577,\n\t0xa779, 0xa77b, 1048577,\n\t0xa780, 0xa786, 1048577,\n\t0xa790, 0xa792, 1048577,\n\t0xa7a0, 0xa7a8, 1048577,\n};\n\nstatic Rune __tolowers[] = {\n\t0x0130, 1048377,\n\t0x0178, 1048455,\n\t0x0179, 1048577,\n\t0x0181, 1048786,\n\t0x0182, 1048577,\n\t0x0184, 1048577,\n\t0x0186, 1048782,\n\t0x0187, 1048577,\n\t0x018b, 1048577,\n\t0x018e, 1048655,\n\t0x018f, 1048778,\n\t0x0190, 1048779,\n\t0x0191, 1048577,\n\t0x0193, 1048781,\n\t0x0194, 1048783,\n\t0x0196, 1048787,\n\t0x0197, 1048785,\n\t0x0198, 1048577,\n\t0x019c, 1048787,\n\t0x019d, 1048789,\n\t0x019f, 1048790,\n\t0x01a0, 1048577,\n\t0x01a6, 1048794,\n\t0x01a7, 1048577,\n\t0x01a9, 1048794,\n\t0x01ac, 1048577,\n\t0x01ae, 1048794,\n\t0x01af, 1048577,\n\t0x01b7, 1048795,\n\t0x01b8, 1048577,\n\t0x01bc, 1048577,\n\t0x01c4, 1048578,\n\t0x01c5, 1048577,\n\t0x01c7, 1048578,\n\t0x01c8, 1048577,\n\t0x01ca, 1048578,\n\t0x01cb, 1048577,\n\t0x01f1, 1048578,\n\t0x01f2, 1048577,\n\t0x01f4, 1048577,\n\t0x01f6, 1048479,\n\t0x01f7, 1048520,\n\t0x0220, 1048446,\n\t0x023a, 1059371,\n\t0x023b, 1048577,\n\t0x023d, 1048413,\n\t0x023e, 1059368,\n\t0x0241, 1048577,\n\t0x0243, 1048381,\n\t0x0244, 1048645,\n\t0x0245, 1048647,\n\t0x0246, 1048577,\n\t0x0376, 1048577,\n\t0x0386, 1048614,\n\t0x038c, 1048640,\n\t0x03cf, 1048584,\n\t0x03f4, 1048516,\n\t0x03f7, 1048577,\n\t0x03f9, 1048569,\n\t0x03fa, 
1048577,\n\t0x04c0, 1048591,\n\t0x04c1, 1048577,\n\t0x10c7, 1055840,\n\t0x10cd, 1055840,\n\t0x1e9e, 1040961,\n\t0x1fbc, 1048567,\n\t0x1fcc, 1048567,\n\t0x1fec, 1048569,\n\t0x1ffc, 1048567,\n\t0x2126, 1041059,\n\t0x212a, 1040193,\n\t0x212b, 1040314,\n\t0x2132, 1048604,\n\t0x2183, 1048577,\n\t0x2c60, 1048577,\n\t0x2c62, 1037833,\n\t0x2c63, 1044762,\n\t0x2c64, 1037849,\n\t0x2c6d, 1037796,\n\t0x2c6e, 1037827,\n\t0x2c6f, 1037793,\n\t0x2c70, 1037794,\n\t0x2c72, 1048577,\n\t0x2c75, 1048577,\n\t0x2cf2, 1048577,\n\t0xa77d, 1013244,\n\t0xa77e, 1048577,\n\t0xa78b, 1048577,\n\t0xa78d, 1006296,\n\t0xa7aa, 1006268,\n};\n\nRune\ntolowerrune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __tolowerr, nelem(__tolowerr)/3, 3);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn c + p[2] - 1048576;\n\tp = rbsearch(c, __tolowerp, nelem(__tolowerp)/3, 3);\n\tif(p && c >= p[0] && c <= p[1] && !((c - p[0]) & 1))\n\t\treturn c + p[2] - 1048576;\n\tp = rbsearch(c, __tolowers, nelem(__tolowers)/2, 2);\n\tif(p && c == p[0])\n\t\treturn c + p[1] - 1048576;\n\treturn c;\n}\n\nstatic Rune __totitler[] = {\n\t0x0061, 0x007a, 1048544,\n\t0x00e0, 0x00f6, 1048544,\n\t0x00f8, 0x00fe, 1048544,\n\t0x023f, 0x0240, 1059391,\n\t0x0256, 0x0257, 1048371,\n\t0x028a, 0x028b, 1048359,\n\t0x037b, 0x037d, 1048706,\n\t0x03ad, 0x03af, 1048539,\n\t0x03b1, 0x03c1, 1048544,\n\t0x03c3, 0x03cb, 1048544,\n\t0x03cd, 0x03ce, 1048513,\n\t0x0430, 0x044f, 1048544,\n\t0x0450, 0x045f, 1048496,\n\t0x0561, 0x0586, 1048528,\n\t0x1f00, 0x1f07, 1048584,\n\t0x1f10, 0x1f15, 1048584,\n\t0x1f20, 0x1f27, 1048584,\n\t0x1f30, 0x1f37, 1048584,\n\t0x1f40, 0x1f45, 1048584,\n\t0x1f60, 0x1f67, 1048584,\n\t0x1f70, 0x1f71, 1048650,\n\t0x1f72, 0x1f75, 1048662,\n\t0x1f76, 0x1f77, 1048676,\n\t0x1f78, 0x1f79, 1048704,\n\t0x1f7a, 0x1f7b, 1048688,\n\t0x1f7c, 0x1f7d, 1048702,\n\t0x1f80, 0x1f87, 1048584,\n\t0x1f90, 0x1f97, 1048584,\n\t0x1fa0, 0x1fa7, 1048584,\n\t0x1fb0, 0x1fb1, 1048584,\n\t0x1fd0, 0x1fd1, 1048584,\n\t0x1fe0, 0x1fe1, 1048584,\n\t0x2170, 0x217f, 
1048560,\n\t0x24d0, 0x24e9, 1048550,\n\t0x2c30, 0x2c5e, 1048528,\n\t0x2d00, 0x2d25, 1041312,\n\t0xff41, 0xff5a, 1048544,\n\t0x10428, 0x1044f, 1048536,\n};\n\nstatic Rune __totitlep[] = {\n\t0x0101, 0x012f, 1048575,\n\t0x0133, 0x0137, 1048575,\n\t0x013a, 0x0148, 1048575,\n\t0x014b, 0x0177, 1048575,\n\t0x017a, 0x017e, 1048575,\n\t0x0183, 0x0185, 1048575,\n\t0x01a1, 0x01a5, 1048575,\n\t0x01b4, 0x01b6, 1048575,\n\t0x01cc, 0x01dc, 1048575,\n\t0x01df, 0x01ef, 1048575,\n\t0x01f3, 0x01f5, 1048575,\n\t0x01f9, 0x021f, 1048575,\n\t0x0223, 0x0233, 1048575,\n\t0x0247, 0x024f, 1048575,\n\t0x0371, 0x0373, 1048575,\n\t0x03d9, 0x03ef, 1048575,\n\t0x0461, 0x0481, 1048575,\n\t0x048b, 0x04bf, 1048575,\n\t0x04c2, 0x04ce, 1048575,\n\t0x04d1, 0x0527, 1048575,\n\t0x1e01, 0x1e95, 1048575,\n\t0x1ea1, 0x1eff, 1048575,\n\t0x1f51, 0x1f57, 1048584,\n\t0x2c68, 0x2c6c, 1048575,\n\t0x2c81, 0x2ce3, 1048575,\n\t0x2cec, 0x2cee, 1048575,\n\t0xa641, 0xa66d, 1048575,\n\t0xa681, 0xa697, 1048575,\n\t0xa723, 0xa72f, 1048575,\n\t0xa733, 0xa76f, 1048575,\n\t0xa77a, 0xa77c, 1048575,\n\t0xa77f, 0xa787, 1048575,\n\t0xa791, 0xa793, 1048575,\n\t0xa7a1, 0xa7a9, 1048575,\n};\n\nstatic Rune __totitles[] = {\n\t0x00b5, 1049319,\n\t0x00ff, 1048697,\n\t0x0131, 1048344,\n\t0x017f, 1048276,\n\t0x0180, 1048771,\n\t0x0188, 1048575,\n\t0x018c, 1048575,\n\t0x0192, 1048575,\n\t0x0195, 1048673,\n\t0x0199, 1048575,\n\t0x019a, 1048739,\n\t0x019e, 1048706,\n\t0x01a8, 1048575,\n\t0x01ad, 1048575,\n\t0x01b0, 1048575,\n\t0x01b9, 1048575,\n\t0x01bd, 1048575,\n\t0x01bf, 1048632,\n\t0x01c4, 1048577,\n\t0x01c6, 1048575,\n\t0x01c7, 1048577,\n\t0x01c9, 1048575,\n\t0x01ca, 1048577,\n\t0x01dd, 1048497,\n\t0x01f1, 1048577,\n\t0x023c, 1048575,\n\t0x0242, 1048575,\n\t0x0250, 1059359,\n\t0x0251, 1059356,\n\t0x0252, 1059358,\n\t0x0253, 1048366,\n\t0x0254, 1048370,\n\t0x0259, 1048374,\n\t0x025b, 1048373,\n\t0x0260, 1048371,\n\t0x0263, 1048369,\n\t0x0265, 1090856,\n\t0x0266, 1090884,\n\t0x0268, 1048367,\n\t0x0269, 1048365,\n\t0x026b, 
1059319,\n\t0x026f, 1048365,\n\t0x0271, 1059325,\n\t0x0272, 1048363,\n\t0x0275, 1048362,\n\t0x027d, 1059303,\n\t0x0280, 1048358,\n\t0x0283, 1048358,\n\t0x0288, 1048358,\n\t0x0289, 1048507,\n\t0x028c, 1048505,\n\t0x0292, 1048357,\n\t0x0345, 1048660,\n\t0x0377, 1048575,\n\t0x03ac, 1048538,\n\t0x03c2, 1048545,\n\t0x03cc, 1048512,\n\t0x03d0, 1048514,\n\t0x03d1, 1048519,\n\t0x03d5, 1048529,\n\t0x03d6, 1048522,\n\t0x03d7, 1048568,\n\t0x03f0, 1048490,\n\t0x03f1, 1048496,\n\t0x03f2, 1048583,\n\t0x03f5, 1048480,\n\t0x03f8, 1048575,\n\t0x03fb, 1048575,\n\t0x04cf, 1048561,\n\t0x1d79, 1083908,\n\t0x1d7d, 1052390,\n\t0x1e9b, 1048517,\n\t0x1fb3, 1048585,\n\t0x1fbe, 1041371,\n\t0x1fc3, 1048585,\n\t0x1fe5, 1048583,\n\t0x1ff3, 1048585,\n\t0x214e, 1048548,\n\t0x2184, 1048575,\n\t0x2c61, 1048575,\n\t0x2c65, 1037781,\n\t0x2c66, 1037784,\n\t0x2c73, 1048575,\n\t0x2c76, 1048575,\n\t0x2cf3, 1048575,\n\t0x2d27, 1041312,\n\t0x2d2d, 1041312,\n\t0xa78c, 1048575,\n};\n\nRune\ntotitlerune(Rune c)\n{\n\tRune *p;\n\n\tp = rbsearch(c, __totitler, nelem(__totitler)/3, 3);\n\tif(p && c >= p[0] && c <= p[1])\n\t\treturn c + p[2] - 1048576;\n\tp = rbsearch(c, __totitlep, nelem(__totitlep)/3, 3);\n\tif(p && c >= p[0] && c <= p[1] && !((c - p[0]) & 1))\n\t\treturn c + p[2] - 1048576;\n\tp = rbsearch(c, __totitles, nelem(__totitles)/2, 2);\n\tif(p && c == p[0])\n\t\treturn c + p[1] - 1048576;\n\treturn c;\n}\n\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utf.h",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#ifndef _UTFH_\n#define _UTFH_ 1\n\n#include <stdint.h>\n\ntypedef signed int Rune;\t/* Code-point values in Unicode 4.0 are 21 bits wide.*/\n\nenum\n{\n  UTFmax\t= 4,\t\t/* maximum bytes per rune */\n  Runesync\t= 0x80,\t\t/* cannot represent part of a UTF sequence (<) */\n  Runeself\t= 0x80,\t\t/* rune and UTF sequences are the same (<) */\n  Runeerror\t= 0xFFFD,\t/* decoding error in UTF */\n  Runemax\t= 0x10FFFF,\t/* maximum rune value */\n};\n\n#ifdef\t__cplusplus\nextern \"C\" {\n#endif\n\n/*\n * rune routines\n */\n\n/*\n * These routines were written by Rob Pike and Ken Thompson\n * and first appeared in Plan 9.\n * SEE ALSO\n * utf (7)\n * tcs (1)\n*/\n\n// runetochar copies (encodes) one rune, pointed to by r, to at most\n// UTFmax bytes starting at s and returns the number of bytes generated.\n\nint runetochar(char* s, const Rune* r);\n\n\n// chartorune copies (decodes) at most UTFmax bytes starting at s to\n// one rune, pointed to by r, and returns the number of bytes consumed.\n// If the input is not exactly in UTF format, chartorune will set *r\n// to Runeerror and return 1.\n//\n// Note: There is no special case for a \"null-terminated\" string. 
A\n// string whose first byte has the value 0 is the UTF8 encoding of the\n// Unicode value 0 (i.e., ASCII NULL). A byte value of 0 is illegal\n// anywhere else in a UTF sequence.\n\nint chartorune(Rune* r, const char* s);\n\n\n// charntorune is like chartorune, except that it will access at most\n// n bytes of s.  If the UTF sequence is incomplete within n bytes,\n// charntorune will set *r to Runeerror and return 0. If it is complete\n// but not in UTF format, it will set *r to Runeerror and return 1.\n// \n// Added 2004-09-24 by Wei-Hwa Huang\n\nint charntorune(Rune* r, const char* s, int n);\n\n// isvalidcharntorune(str, n, r, consumed)\n// is a convenience function that calls \"*consumed = charntorune(r, str, n)\"\n// and returns an int (logically boolean) indicating whether the first\n// n bytes of str was a valid and complete UTF sequence.\n\nint isvalidcharntorune(const char* str, int n, Rune* r, int* consumed);\n\n// runelen returns the number of bytes required to convert r into UTF.\n\nint runelen(Rune r);\n\n\n// runenlen returns the number of bytes required to convert the n\n// runes pointed to by r into UTF.\n\nint runenlen(const Rune* r, int n);\n\n\n// fullrune returns 1 if the string s of length n is long enough to be\n// decoded by chartorune, and 0 otherwise. This does not guarantee\n// that the string contains a legal UTF encoding. This routine is used\n// by programs that obtain input one byte at a time and need to know\n// when a full rune has arrived.\n\nint fullrune(const char* s, int n);\n\n// The following routines are analogous to the corresponding string\n// routines with \"utf\" substituted for \"str\", and \"rune\" substituted\n// for \"chr\".\n\n// utflen returns the number of runes that are represented by the UTF\n// string s. (cf. strlen)\n\nint utflen(const char* s);\n\n\n// utfnlen returns the number of complete runes that are represented\n// by the first n bytes of the UTF string s. 
If the last few bytes of\n// the string contain an incompletely coded rune, utfnlen will not\n// count them; in this way, it differs from utflen, which includes\n// every byte of the string. (cf. strnlen)\n\nint utfnlen(const char* s, long n);\n\n\n// utfrune returns a pointer to the first occurrence of rune r in the\n// UTF string s, or 0 if r does not occur in the string.  The NULL\n// byte terminating a string is considered to be part of the string s.\n// (cf. strchr)\n\nconst char* utfrune(const char* s, Rune r);\n\n\n// utfrrune returns a pointer to the last occurrence of rune r in the\n// UTF string s, or 0 if r does not occur in the string.  The NULL\n// byte terminating a string is considered to be part of the string s.\n// (cf. strrchr)\n\nconst char* utfrrune(const char* s, Rune r);\n\n\n// utfutf returns a pointer to the first occurrence of the UTF string\n// s2 as a UTF substring of s1, or 0 if there is none. If s2 is the\n// null string, utfutf returns s1. (cf. strstr)\n\nconst char* utfutf(const char* s1, const char* s2);\n\n\n// utfecpy copies UTF sequences until a null sequence has been copied,\n// but writes no sequences beyond es1.  If any sequences are copied,\n// s1 is terminated by a null sequence, and a pointer to that sequence\n// is returned.  Otherwise, the original s1 is returned. (cf. 
strecpy)\n\nchar* utfecpy(char *s1, char *es1, const char *s2);\n\n\n\n// These functions are rune-string analogues of the corresponding\n// functions in strcat (3).\n// \n// These routines first appeared in Plan 9.\n// SEE ALSO\n// memmove (3)\n// rune (3)\n// strcat (2)\n//\n// BUGS: The outcome of overlapping moves varies among implementations.\n\nRune* runestrcat(Rune* s1, const Rune* s2);\nRune* runestrncat(Rune* s1, const Rune* s2, long n);\n\nconst Rune* runestrchr(const Rune* s, Rune c);\n\nint runestrcmp(const Rune* s1, const Rune* s2);\nint runestrncmp(const Rune* s1, const Rune* s2, long n);\n\nRune* runestrcpy(Rune* s1, const Rune* s2);\nRune* runestrncpy(Rune* s1, const Rune* s2, long n);\nRune* runestrecpy(Rune* s1, Rune* es1, const Rune* s2);\n\nRune* runestrdup(const Rune* s);\n\nconst Rune* runestrrchr(const Rune* s, Rune c);\nlong runestrlen(const Rune* s);\nconst Rune* runestrstr(const Rune* s1, const Rune* s2);\n\n\n\n// The following routines test types and modify cases for Unicode\n// characters.  Unicode defines some characters as letters and\n// specifies three cases: upper, lower, and title.  Mappings among the\n// cases are also defined, although they are not exhaustive: some\n// upper case letters have no lower case mapping, and so on.  Unicode\n// also defines several character properties, a subset of which are\n// checked by these routines.  These routines are based on Unicode\n// version 3.0.0.\n//\n// NOTE: The routines are implemented in C, so the boolean functions\n// (e.g., isupperrune) return 0 for false and 1 for true.\n//\n//\n// toupperrune, tolowerrune, and totitlerune are the Unicode case\n// mappings. These routines return the character unchanged if it has\n// no defined mapping.\n\nRune toupperrune(Rune r);\nRune tolowerrune(Rune r);\nRune totitlerune(Rune r);\n\n\n// isupperrune tests for upper case characters, including Unicode\n// upper case letters and targets of the toupper mapping. 
islowerrune\n// and istitlerune are defined analogously. \n \nint isupperrune(Rune r);\nint islowerrune(Rune r);\nint istitlerune(Rune r);\n\n\n// isalpharune tests for Unicode letters; this includes ideographs in\n// addition to alphabetic characters.\n\nint isalpharune(Rune r);\n\n\n// isdigitrune tests for digits. Non-digit numbers, such as Roman\n// numerals, are not included.\n\nint isdigitrune(Rune r);\n\n\n// isideographicrune tests for ideographic characters and numbers, as\n// defined by the Unicode standard.\n\nint isideographicrune(Rune r);\n\n\n// isspacerune tests for whitespace characters, including \"C\" locale\n// whitespace, Unicode defined whitespace, and the \"zero-width\n// non-break space\" character.\n\nint isspacerune(Rune r);\n\n\n// (The comments in this file were copied from the manpage files rune.3,\n// isalpharune.3, and runestrcat.3. Some formatting changes were also made\n// to conform to Google style. /JRM 11/11/05)\n\n#ifdef\t__cplusplus\n}\n#endif\n\n#endif\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utfdef.h",
    "content": "#define uchar _utfuchar\n#define ushort _utfushort\n#define uint _utfuint\n#define ulong _utfulong\n#define vlong _utfvlong\n#define uvlong _utfuvlong\n\ntypedef unsigned char\t\tuchar;\ntypedef unsigned short\t\tushort;\ntypedef unsigned int\t\tuint;\ntypedef unsigned long\t\tulong;\n\n#define nelem(x) (sizeof(x)/sizeof((x)[0]))\n#define nil ((void*)0)\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utfecpy.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nchar*\nutfecpy(char *to, char *e, const char *from)\n{\n\tchar *end;\n\n\tif(to >= e)\n\t\treturn to;\n\tend = (char*)memccpy(to, from, '\\0', e - to);\n\tif(end == nil){\n\t\tend = e-1;\n\t\twhile(end>to && (*--end&0xC0)==0x80)\n\t\t\t;\n\t\t*end = '\\0';\n\t}else{\n\t\tend--;\n\t}\n\treturn end;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utflen.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nint\nutflen(const char *s)\n{\n\tint c;\n\tlong n;\n\tRune rune;\n\n\tn = 0;\n\tfor(;;) {\n\t\tc = *(uchar*)s;\n\t\tif(c < Runeself) {\n\t\t\tif(c == 0)\n\t\t\t\treturn n;\n\t\t\ts++;\n\t\t} else\n\t\t\ts += chartorune(&rune, s);\n\t\tn++;\n\t}\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utfnlen.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nint\nutfnlen(const char *s, long m)\n{\n\tint c;\n\tlong n;\n\tRune rune;\n\tconst char *es;\n\n\tes = s + m;\n\tfor(n = 0; s < es; n++) {\n\t\tc = *(uchar*)s;\n\t\tif(c < Runeself){\n\t\t\tif(c == '\\0')\n\t\t\t\tbreak;\n\t\t\ts++;\n\t\t\tcontinue;\n\t\t}\n\t\tif(!fullrune(s, es-s))\n\t\t\tbreak;\n\t\ts += chartorune(&rune, s);\n\t}\n\treturn n;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utfrrune.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nconst\nchar*\nutfrrune(const char *s, Rune c)\n{\n\tlong c1;\n\tRune r;\n\tconst char *s1;\n\n\tif(c < Runesync)\t\t/* not part of utf sequence */\n\t\treturn strrchr(s, c);\n\n\ts1 = 0;\n\tfor(;;) {\n\t\tc1 = *(uchar*)s;\n\t\tif(c1 < Runeself) {\t/* one byte rune */\n\t\t\tif(c1 == 0)\n\t\t\t\treturn s1;\n\t\t\tif(c1 == c)\n\t\t\t\ts1 = s;\n\t\t\ts++;\n\t\t\tcontinue;\n\t\t}\n\t\tc1 = chartorune(&r, s);\n\t\tif(r == c)\n\t\t\ts1 = s;\n\t\ts += c1;\n\t}\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utfrune.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\nconst\nchar*\nutfrune(const char *s, Rune c)\n{\n\tlong c1;\n\tRune r;\n\tint n;\n\n\tif(c < Runesync)\t\t/* not part of utf sequence */\n\t\treturn strchr(s, c);\n\n\tfor(;;) {\n\t\tc1 = *(uchar*)s;\n\t\tif(c1 < Runeself) {\t/* one byte rune */\n\t\t\tif(c1 == 0)\n\t\t\t\treturn 0;\n\t\t\tif(c1 == c)\n\t\t\t\treturn s;\n\t\t\ts++;\n\t\t\tcontinue;\n\t\t}\n\t\tn = chartorune(&r, s);\n\t\tif(r == c)\n\t\t\treturn s;\n\t\ts += n;\n\t}\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/third_party/utf/utfutf.c",
    "content": "/*\n * The authors of this software are Rob Pike and Ken Thompson.\n *              Copyright (c) 2002 by Lucent Technologies.\n * Permission to use, copy, modify, and distribute this software for any\n * purpose without fee is hereby granted, provided that this entire notice\n * is included in all copies of any software which is or includes a copy\n * or modification of this software and in all copies of the supporting\n * documentation for such software.\n * THIS SOFTWARE IS BEING PROVIDED \"AS IS\", WITHOUT ANY EXPRESS OR IMPLIED\n * WARRANTY.  IN PARTICULAR, NEITHER THE AUTHORS NOR LUCENT TECHNOLOGIES MAKE ANY\n * REPRESENTATION OR WARRANTY OF ANY KIND CONCERNING THE MERCHANTABILITY\n * OF THIS SOFTWARE OR ITS FITNESS FOR ANY PARTICULAR PURPOSE.\n */\n#include <stdarg.h>\n#include <string.h>\n#include \"third_party/utf/utf.h\"\n#include \"third_party/utf/utfdef.h\"\n\n\n/*\n * Return pointer to first occurrence of s2 in s1,\n * 0 if none\n */\nconst\nchar*\nutfutf(const char *s1, const char *s2)\n{\n\tconst char *p;\n\tlong f, n1, n2;\n\tRune r;\n\n\tn1 = chartorune(&r, s2);\n\tf = r;\n\tif(f <= Runesync)\t\t/* represents self */\n\t\treturn strstr(s1, s2);\n\n\tn2 = strlen(s2);\n\tfor(p=s1; (p=utfrune(p, f)) != 0; p+=n1)\n\t\tif(strncmp(p, s2, n2) == 0)\n\t\t\treturn p;\n\treturn 0;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/tools/bazel.rc",
    "content": "build:cuda --crosstool_top=//third_party/gpus/crosstool\n\nbuild --define=use_fast_cpp_protos=true\nbuild --define=allow_oversize_protos=true\nbuild --copt -funsigned-char\nbuild -c opt\n\nbuild --spawn_strategy=standalone\ntest --spawn_strategy=standalone\nrun --spawn_strategy=standalone\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/universal.md",
    "content": "# Parsey's Cousins.\n\nA collection of pretrained syntactic models is now available for download at\n`http://download.tensorflow.org/models/parsey_universal/<language>.zip`\n\nAfter downloading and unzipping a model, you can run it similarly to\nParsey McParseface with:\n\n```shell\n  MODEL_DIRECTORY=/where/you/unzipped/the/model/files\n  cat sentences.txt | syntaxnet/models/parsey_universal/parse.sh \\\n    $MODEL_DIRECTORY > output.conll\n```\n\nThese models are trained on\n[Universal Dependencies](http://universaldependencies.org/) datasets v1.3.\nThe following table shows their accuracy on Universal\nDependencies test sets for different types of annotations.\n\nLanguage | No. tokens | POS | fPOS | Morph | UAS | LAS\n-------- | :--: | :--: | :--: | :--: | :--: | :--:\nAncient_Greek-PROIEL | 18502 | 97.14% | 96.97% | 89.77% | 78.74% | 73.15%\nAncient_Greek | 25251 | 93.22% | 84.22% | 90.01% | 68.98% | 62.07%\nArabic | 28268 | 95.65% | 91.03% | 91.23% | 81.49% | 75.82%\nBasque | 24374 | 94.88% | - | 87.82% | 78.00% | 73.36%\nBulgarian | 15734 | 97.71% | 95.14% | 94.61% | 89.35% | 85.01%\nCatalan | 59503 | 98.06% | 98.06% | 97.56% | 90.47% | 87.64%\nChinese | 12012 | 91.32% | 90.89% | 98.76% | 76.71% | 71.24%\nCroatian | 4125 | 94.67% | - | 86.69% | 80.65% | 74.06%\nCzech-CAC | 10862 | 98.11% | 92.43% | 91.43% | 87.28% | 83.44%\nCzech-CLTT | 4105 | 95.79% | 87.36% | 86.33% | 77.34% | 73.40%\nCzech | 173920 | 98.12% | 93.76% | 93.13% | 89.47% | 85.93%\nDanish | 5884 | 95.28% | - | 95.24% | 79.84% | 76.34%\nDutch-LassySmall | 4562 | 95.62% | - | 95.44% | 81.63% | 78.08%\nDutch | 5843 | 89.89% | 86.03% | 89.12% | 77.70% | 71.21%\nEnglish-LinES | 8481 | 95.34% | 93.11% | - | 81.50% | 77.37%\nEnglish | 25096 | 90.48% | 89.71% | 91.30% | 84.79% | 80.38%\nEstonian | 23670 | 95.92% | 96.76% | 92.73% | 83.10% | 78.83%\nFinnish-FTB | 16286 | 93.50% | 91.15% | 92.44% | 84.97% | 80.48%\nFinnish | 9140 | 94.78% | 95.84% | 92.42% | 83.65% | 79.60%\nFrench 
| 7018 | 96.27% | - | 96.05% | 84.68% | 81.05%\nGalician | 29746 | 96.81% | 96.14% | - | 84.48% | 81.35%\nGerman | 16268 | 91.79% | - | - | 79.73% | 74.07%\nGothic | 5158 | 95.58% | 96.03% | 87.32% | 79.33% | 71.69%\nGreek | 5668 | 97.48% | 97.48% | 92.70% | 83.68% | 79.99%\nHebrew | 12125 | 95.04% | 95.04% | 92.05% | 84.61% | 78.71%\nHindi | 35430 | 96.45% | 95.77% | 90.98% | 93.04% | 89.32%\nHungarian | 4235 | 94.00% | - | 75.68% | 78.75% | 71.83%\nIndonesian | 11780 | 92.62% | - | - | 80.03% | 72.99%\nIrish | 3821 | 91.34% | 89.95% | 77.07% | 74.51% | 66.29%\nItalian | 10952 | 97.31% | 97.18% | 97.27% | 89.81% | 87.13%\nKazakh | 587 | 75.47% | 75.13% | - | 58.09% | 43.95%\nLatin-ITTB | 6548 | 97.98% | 92.68% | 93.52% | 84.22% | 81.17%\nLatin-PROIEL | 14906 | 96.50% | 96.08% | 88.39% | 77.60% | 70.98%\nLatin | 4832 | 88.04% | 74.07% | 76.03% | 56.00% | 45.80%\nLatvian | 3985 | 80.95% | 66.60% | 73.60% | 58.92% | 51.47%\nNorwegian | 30034 | 97.44% | - | 95.58% | 88.61% | 86.22%\nOld_Church_Slavonic | 5079 | 96.50% | 96.28% | 89.43% | 84.86% | 78.85%\nPersian | 16022 | 96.20% | 95.72% | 95.90% | 84.42% | 80.28%\nPolish | 7185 | 95.05% | 85.83% | 86.12% | 88.30% | 82.71%\nPortuguese-BR | 29438 | 97.07% | 97.07% | 99.91% | 87.91% | 85.44%\nPortuguese | 6262 | 96.81% | 90.67% | 94.22% | 85.12% | 81.28%\nRomanian | 18375 | 95.26% | 91.66% | 91.98% | 83.64% | 75.36%\nRussian-SynTagRus | 107737 | 98.27% | - | 94.91% | 91.68% | 87.44%\nRussian | 9573 | 95.27% | 95.02% | 87.75% | 81.75% | 77.71%\nSlovenian-SST | 2951 | 90.00% | 84.48% | 84.38% | 65.06% | 56.96%\nSlovenian | 14063 | 96.22% | 90.46% | 90.35% | 87.71% | 84.60%\nSpanish-AnCora | 53594 | 98.28% | 98.28% | 97.82% | 89.26% | 86.50%\nSpanish | 7953 | 95.27% | - | 95.74% | 85.06% | 81.53%\nSwedish-LinES | 8228 | 96.00% | 93.77% | - | 81.38% | 77.21%\nSwedish | 20377 | 96.27% | 94.13% | 94.14% | 83.84% | 80.28%\nTamil | 1989 | 79.29% | 71.79% | 75.97% | 64.45% | 55.35%\nTurkish | 8616 | 93.63% | 92.62% | 86.79% | 
82.00% | 71.37%\n**Average** | - | 94.27% | 92.93% | 90.38% | 81.12% | 75.85%\n\nThese results are obtained using gold text segmentation. Accuracies are measured\nover all tokens, including punctuation. `POS`, `fPOS` are coarse and fine\npart-of-speech tagging accuracies. `Morph` is full-token accuracy of predicted\nmorphological attributes. `UAS` and `LAS` are unlabeled and labeled attachment\nscores.\n\nMany of these models also support text segmentation, with:\n\n```shell\n  MODEL_DIRECTORY=/where/you/unzipped/the/model/files\n  cat sentences.txt | syntaxnet/models/parsey_universal/tokenize.sh \\\n    $MODEL_DIRECTORY > output.conll\n```\n\nText segmentation is currently available for:\n`Bulgarian`, `Czech`, `German`, `Greek`, `English`, `Spanish`, `Estonian`,\n`Basque`, `Persian`, `Finnish`, `Finnish-FTB`, `French`, `Galician`,\n`Ancient_Greek`, `Ancient_Greek-PROIEL`, `Hebrew`, `Hindi`, `Croatian`,\n`Hungarian`, `Indonesian`, `Italian`, `Latin`, `Latin-PROIEL`, `Dutch`,\n`Norwegian`, `Polish`, `Portuguese`, `Slovenian`, `Swedish`, `Tamil`.\n\nFor `Chinese` (traditional) we use a larger text segmentation\nmodel, which can be run with:\n\n```shell\n  MODEL_DIRECTORY=/where/you/unzipped/the/model/files\n  cat sentences.txt | syntaxnet/models/parsey_universal/tokenize_zh.sh \\\n    $MODEL_DIRECTORY > output.conll\n```\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/BUILD",
    "content": "licenses([\"notice\"])\n\n# Requires --copt -funsigned-char when compiling (unsigned chars).\n\ncc_library(\n    name = \"unicodetext\",\n    srcs = [\n        \"unicodetext.cc\",\n        \"unilib.cc\",\n    ],\n    hdrs = [\n        \"unicodetext.h\",\n        \"unilib.h\",\n        \"unilib_utf8_utils.h\",\n    ],\n    visibility = [\"//visibility:public\"],\n    deps = [\n        \"//syntaxnet:base\",\n        \"//third_party/utf\",\n    ],\n)\n\ncc_test(\n    name = \"unicodetext_unittest\",\n    srcs = [\n        \"gtest_main.cc\",\n        \"unicodetext_unittest.cc\",\n    ],\n    deps = [\n        \"@org_tensorflow//tensorflow/core:testlib\",\n        \":unicodetext\",\n    ],\n)\n\ncc_binary(\n    name = \"unicodetext_main\",\n    srcs = [\"unicodetext_main.cc\"],\n    deps = [\":unicodetext\"],\n)\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/gtest_main.cc",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n// Author: sligocki@google.com (Shawn Ligocki)\n//\n// Build all tests with this main to run all tests.\n\n#include \"gtest/gtest.h\"\n\nint main(int argc, char **argv) {\n  ::testing::InitGoogleTest(&argc, argv);\n  return RUN_ALL_TESTS();\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unicodetext.cc",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n#include \"util/utf8/unicodetext.h\"\n\n#include <string.h>                     // for memcpy, NULL, memcmp, etc\n#include <algorithm>                    // for max\n\n//#include \"base/logging.h\"               // for operator<<, CHECK, etc\n//#include \"base/stringprintf.h\"          // for StringPrintf, StringAppendF\n//#include \"strings/stringpiece.h\"        // for StringPiece, etc\n\n#include \"third_party/utf/utf.h\"        // for isvalidcharntorune, etc\n#include \"util/utf8/unilib.h\"    // for IsInterchangeValid, etc\n#include \"util/utf8/unilib_utf8_utils.h\"    // for OneCharLen\n\nstatic int CodepointDistance(const char* start, const char* end) {\n  int n = 0;\n  // Increment n on every non-trail-byte.\n  for (const char* p = start; p < end; ++p) {\n    n += (*reinterpret_cast<const signed char*>(p) >= -0x40);\n  }\n  return n;\n}\n\nstatic int CodepointCount(const char* utf8, int len) {\n  return CodepointDistance(utf8, utf8 + len);\n}\n\nUnicodeText::const_iterator::difference_type\ndistance(const UnicodeText::const_iterator& first,\n         const UnicodeText::const_iterator& last) {\n  return CodepointDistance(first.it_, last.it_);\n}\n\n// ---------- Utility ----------\n\nstatic int ConvertToInterchangeValid(char* start, int len) {\n  // This routine is called only when we've discovered that a UTF-8 buffer\n  // that was 
 passed to CopyUTF8, TakeOwnershipOfUTF8, or PointToUTF8\n  // was not interchange valid. This indicates a bug in the caller, and\n  // a LOG(WARNING) is done in that case.\n  // This is similar to CoerceToInterchangeValid, but it replaces each\n  // structurally invalid byte with a space, and each non-interchange\n  // character with a space, even when that character requires more\n  // than one byte in UTF8. E.g., \"\\xEF\\xB7\\x90\" (U+FDD0) is\n  // structurally valid UTF8, but U+FDD0 is not an interchange-valid\n  // code point. The result should contain one space, not three.\n  //\n  // Since the conversion never needs to write more data than it\n  // reads, it is safe to change the buffer in place. It returns the\n  // number of bytes written.\n  char* const in = start;\n  char* out = start;\n  char* const end = start + len;\n  while (start < end) {\n    int good = UniLib::SpanInterchangeValid(start, end - start);\n    if (good > 0) {\n      if (out != start) {\n        memmove(out, start, good);\n      }\n      out += good;\n      start += good;\n      if (start == end) {\n        break;\n      }\n    }\n    // Is the current string invalid UTF8 or just non-interchange UTF8?\n    char32 rune;\n    int n;\n    if (isvalidcharntorune(start, end - start, &rune, &n)) {\n      // structurally valid UTF8, but not interchange valid\n      start += n;  // Skip over the whole character.\n    } else {  // bad UTF8\n      start += 1;  // Skip over just one byte\n    }\n    *out++ = ' ';\n  }\n  return out - in;\n}\n\n\n// *************** Data representation **********\n\n// Note: the copy constructor is undefined.\n\n// After reserve(), resize(), or clear(), we're an owner, not an alias.\n\nvoid UnicodeText::Repr::reserve(int new_capacity) {\n  // If there's already enough capacity, and we're an owner, do nothing.\n  if (capacity_ >= new_capacity && ours_) return;\n\n  // Otherwise, allocate a new buffer.\n  capacity_ = std::max(new_capacity, (3 * capacity_) / 2 + 20);\n 
 char* new_data = new char[capacity_];\n\n  // If there is an old buffer, copy it into the new buffer.\n  if (data_) {\n    memcpy(new_data, data_, size_);\n    if (ours_) delete[] data_;  // If we owned the old buffer, free it.\n  }\n  data_ = new_data;\n  ours_ = true;  // We own the new buffer.\n  // size_ is unchanged.\n}\n\nvoid UnicodeText::Repr::resize(int new_size) {\n  if (new_size == 0) {\n    clear();\n  } else {\n    if (!ours_ || new_size > capacity_) reserve(new_size);\n    // Clear the memory in the expanded part.\n    if (size_ < new_size) memset(data_ + size_, 0, new_size - size_);\n    size_ = new_size;\n    ours_ = true;\n  }\n}\n\n// This implementation of clear() deallocates the buffer if we're an owner.\n// That's not strictly necessary; we could just set size_ to 0.\nvoid UnicodeText::Repr::clear() {\n  if (ours_) delete[] data_;\n  data_ = nullptr;\n  size_ = capacity_ = 0;\n  ours_ = true;\n}\n\nvoid UnicodeText::Repr::Copy(const char* data, int size) {\n  resize(size);\n  memcpy(data_, data, size);\n}\n\nvoid UnicodeText::Repr::TakeOwnershipOf(char* data, int size, int capacity) {\n  if (data == data_) return;  // We already own this memory. 
(Weird case.)\n  if (ours_ && data_) delete[] data_;  // If we owned the old buffer, free it.\n  data_ = data;\n  size_ = size;\n  capacity_ = capacity;\n  ours_ = true;\n}\n\nvoid UnicodeText::Repr::PointTo(const char* data, int size) {\n  if (ours_ && data_) delete[] data_;  // If we owned the old buffer, free it.\n  data_ = const_cast<char*>(data);\n  size_ = size;\n  capacity_ = size;\n  ours_ = false;\n}\n\nvoid UnicodeText::Repr::append(const char* bytes, int byte_length) {\n  reserve(size_ + byte_length);\n  memcpy(data_ + size_, bytes, byte_length);\n  size_ += byte_length;\n}\n\nstring UnicodeText::Repr::DebugString() const {\n  return tensorflow::strings::Printf(\"{Repr %p data=%p size=%d capacity=%d %s}\",\n                      this,\n                      data_, size_, capacity_,\n                      ours_ ? \"Owned\" : \"Alias\");\n}\n\n\n\n// *************** UnicodeText ******************\n\n// ----- Constructors -----\n\n// Default constructor\nUnicodeText::UnicodeText() {\n}\n\n// Copy constructor\nUnicodeText::UnicodeText(const UnicodeText& src) {\n  Copy(src);\n}\n\n// Substring constructor\nUnicodeText::UnicodeText(const UnicodeText::const_iterator& first,\n                         const UnicodeText::const_iterator& last) {\n  CHECK(first <= last) << \" Incompatible iterators\";\n  repr_.append(first.it_, last.it_ - first.it_);\n}\n\nstring UnicodeText::UTF8Substring(const const_iterator& first,\n                                  const const_iterator& last) {\n  CHECK(first <= last) << \" Incompatible iterators\";\n  return string(first.it_, last.it_ - first.it_);\n}\n\n\n// ----- Copy -----\n\nUnicodeText& UnicodeText::operator=(const UnicodeText& src) {\n  if (this != &src) {\n    Copy(src);\n  }\n  return *this;\n}\n\nUnicodeText& UnicodeText::Copy(const UnicodeText& src) {\n  repr_.Copy(src.repr_.data_, src.repr_.size_);\n  return *this;\n}\n\nUnicodeText& UnicodeText::CopyUTF8(const char* buffer, int byte_length) {\n  repr_.Copy(buffer, 
byte_length);\n  if (!UniLib:: IsInterchangeValid(buffer, byte_length)) {\n    LOG(WARNING) << \"UTF-8 buffer is not interchange-valid.\";\n    repr_.size_ = ConvertToInterchangeValid(repr_.data_, byte_length);\n  }\n  return *this;\n}\n\nUnicodeText& UnicodeText::UnsafeCopyUTF8(const char* buffer,\n                                           int byte_length) {\n  repr_.Copy(buffer, byte_length);\n  return *this;\n}\n\n// ----- TakeOwnershipOf  -----\n\nUnicodeText& UnicodeText::TakeOwnershipOfUTF8(char* buffer,\n                                              int byte_length,\n                                              int byte_capacity) {\n  repr_.TakeOwnershipOf(buffer, byte_length, byte_capacity);\n  if (!UniLib:: IsInterchangeValid(buffer, byte_length)) {\n    LOG(WARNING) << \"UTF-8 buffer is not interchange-valid.\";\n    repr_.size_ = ConvertToInterchangeValid(repr_.data_, byte_length);\n  }\n  return *this;\n}\n\nUnicodeText& UnicodeText::UnsafeTakeOwnershipOfUTF8(char* buffer,\n                                                    int byte_length,\n                                                    int byte_capacity) {\n  repr_.TakeOwnershipOf(buffer, byte_length, byte_capacity);\n  return *this;\n}\n\n// ----- PointTo -----\n\nUnicodeText& UnicodeText::PointToUTF8(const char* buffer, int byte_length) {\n  if (UniLib:: IsInterchangeValid(buffer, byte_length)) {\n    repr_.PointTo(buffer, byte_length);\n  } else {\n    LOG(WARNING) << \"UTF-8 buffer is not interchange-valid.\";\n    repr_.Copy(buffer, byte_length);\n    repr_.size_ = ConvertToInterchangeValid(repr_.data_, byte_length);\n  }\n  return *this;\n}\n\nUnicodeText& UnicodeText::UnsafePointToUTF8(const char* buffer,\n                                          int byte_length) {\n  repr_.PointTo(buffer, byte_length);\n  return *this;\n}\n\nUnicodeText& UnicodeText::PointTo(const UnicodeText& src) {\n  repr_.PointTo(src.repr_.data_, src.repr_.size_);\n  return *this;\n}\n\nUnicodeText& 
UnicodeText::PointTo(const const_iterator &first,\n                                  const const_iterator &last) {\n  CHECK(first <= last) << \" Incompatible iterators\";\n  repr_.PointTo(first.utf8_data(), last.utf8_data() - first.utf8_data());\n  return *this;\n}\n\n// ----- Append -----\n\nUnicodeText& UnicodeText::append(const UnicodeText& u) {\n  repr_.append(u.repr_.data_, u.repr_.size_);\n  return *this;\n}\n\nUnicodeText& UnicodeText::append(const const_iterator& first,\n                                 const const_iterator& last) {\n  CHECK(first <= last) << \" Incompatible iterators\";\n  repr_.append(first.it_, last.it_ - first.it_);\n  return *this;\n}\n\nUnicodeText& UnicodeText::UnsafeAppendUTF8(const char* utf8, int len) {\n  repr_.append(utf8, len);\n  return *this;\n}\n\n// ----- substring searching -----\n\nUnicodeText::const_iterator UnicodeText::find(const UnicodeText& look,\n                                              const_iterator start_pos) const {\n  CHECK_GE(start_pos.utf8_data(), utf8_data());\n  CHECK_LE(start_pos.utf8_data(), utf8_data() + utf8_length());\n  return UnsafeFind(look, start_pos);\n}\n\nUnicodeText::const_iterator UnicodeText::find(const UnicodeText& look) const {\n  return UnsafeFind(look, begin());\n}\n\nUnicodeText::const_iterator UnicodeText::UnsafeFind(\n    const UnicodeText& look, const_iterator start_pos) const {\n  // Due to the magic of the UTF8 encoding, searching for a sequence of\n  // letters is equivalent to substring search.\n  StringPiece searching(utf8_data(), utf8_length());\n  StringPiece look_piece(look.utf8_data(), look.utf8_length());\n  LOG(FATAL) << \"Not implemented\";\n  //StringPiece::size_type found =\n  //    searching.find(look_piece, start_pos.utf8_data() - utf8_data());\n  StringPiece::size_type found = StringPiece::npos;\n  if (found == StringPiece::npos) return end();\n  return const_iterator(utf8_data() + found);\n}\n\nbool UnicodeText::HasReplacementChar() const {\n  // Equivalent 
to:\n  //   UnicodeText replacement_char;\n  //   replacement_char.push_back(0xFFFD);\n  //   return find(replacement_char) != end();\n  StringPiece searching(utf8_data(), utf8_length());\n  StringPiece looking_for(\"\\xEF\\xBF\\xBD\", 3);\n  LOG(FATAL) << \"Not implemented\";\n  //return searching.find(looking_for) != StringPiece::npos;\n  return false;\n}\n\n// ----- other methods -----\n\n// Clear operator\nvoid UnicodeText::clear() {\n  repr_.clear();\n}\n\n// Destructor\nUnicodeText::~UnicodeText() {}\n\n\nvoid UnicodeText::push_back(char32 c) {\n  if (UniLib::IsValidCodepoint(c)) {\n    char buf[UTFmax];\n    int len = runetochar(buf, &c);\n    if (UniLib::IsInterchangeValid(buf, len)) {\n      repr_.append(buf, len);\n    } else {\n      LOG(WARNING) << \"Unicode value 0x\" << std::hex << c\n                  << \" is not valid for interchange\";\n      repr_.append(\" \", 1);\n    }\n  } else {\n    LOG(WARNING) << \"Illegal Unicode value: 0x\" << std::hex << c;\n    repr_.append(\" \", 1);\n  }\n}\n\nint UnicodeText::size() const {\n  return CodepointCount(repr_.data_, repr_.size_);\n}\n\nbool operator==(const UnicodeText& lhs, const UnicodeText& rhs) {\n  if (&lhs == &rhs) return true;\n  if (lhs.repr_.size_ != rhs.repr_.size_) return false;\n  return memcmp(lhs.repr_.data_, rhs.repr_.data_, lhs.repr_.size_) == 0;\n}\n\nstring UnicodeText::DebugString() const {\n  return tensorflow::strings::Printf(\"{UnicodeText %p chars=%d repr=%s}\",\n                      this,\n                      size(),\n                      repr_.DebugString().c_str());\n}\n\n\n// ******************* UnicodeText::const_iterator *********************\n\n// The implementation of const_iterator would be nicer if it\n// inherited from boost::iterator_facade\n// (http://boost.org/libs/iterator/doc/iterator_facade.html).\n\nUnicodeText::const_iterator::const_iterator() : it_(nullptr) {}\n\nUnicodeText::const_iterator::const_iterator(const const_iterator& other)\n    : it_(other.it_) 
{\n}\n\nUnicodeText::const_iterator&\nUnicodeText::const_iterator::operator=(const const_iterator& other) {\n  if (&other != this)\n    it_ = other.it_;\n  return *this;\n}\n\nUnicodeText::const_iterator UnicodeText::begin() const {\n  return const_iterator(repr_.data_);\n}\n\nUnicodeText::const_iterator UnicodeText::end() const {\n  return const_iterator(repr_.data_ + repr_.size_);\n}\n\nbool operator<(const UnicodeText::const_iterator& lhs,\n               const UnicodeText::const_iterator& rhs) {\n  return lhs.it_ < rhs.it_;\n}\n\nchar32 UnicodeText::const_iterator::operator*() const {\n  // (We could call chartorune here, but that does some\n  // error-checking, and we're guaranteed that our data is valid\n  // UTF-8. Also, we expect this routine to be called very often. So\n  // for speed, we do the calculation ourselves.)\n\n  // Convert from UTF-8\n  int byte1 = it_[0];\n  if (byte1 < 0x80)\n    return byte1;\n\n  int byte2 = it_[1];\n  if (byte1 < 0xE0)\n    return ((byte1 & 0x1F) << 6)\n          | (byte2 & 0x3F);\n\n  int byte3 = it_[2];\n  if (byte1 < 0xF0)\n    return ((byte1 & 0x0F) << 12)\n         | ((byte2 & 0x3F) << 6)\n         |  (byte3 & 0x3F);\n\n  int byte4 = it_[3];\n  return ((byte1 & 0x07) << 18)\n       | ((byte2 & 0x3F) << 12)\n       | ((byte3 & 0x3F) << 6)\n       |  (byte4 & 0x3F);\n}\n\nUnicodeText::const_iterator& UnicodeText::const_iterator::operator++() {\n  it_ += UniLib::OneCharLen(it_);\n  return *this;\n}\n\nUnicodeText::const_iterator& UnicodeText::const_iterator::operator--() {\n  while (UniLib::IsTrailByte(*--it_));\n  return *this;\n}\n\nint UnicodeText::const_iterator::get_utf8(char* utf8_output) const {\n  utf8_output[0] = it_[0]; if (it_[0] < 0x80) return 1;\n  utf8_output[1] = it_[1]; if (it_[0] < 0xE0) return 2;\n  utf8_output[2] = it_[2]; if (it_[0] < 0xF0) return 3;\n  utf8_output[3] = it_[3];\n  return 4;\n}\n\nstring UnicodeText::const_iterator::get_utf8_string() const {\n  return string(utf8_data(), 
utf8_length());\n}\n\nint UnicodeText::const_iterator::utf8_length() const {\n  if (it_[0] < 0x80) {\n    return 1;\n  } else if (it_[0] < 0xE0) {\n    return 2;\n  } else if (it_[0] < 0xF0) {\n    return 3;\n  } else {\n    return 4;\n  }\n}\n\nUnicodeText::const_iterator UnicodeText::MakeIterator(const char* p) const {\n  CHECK(p != nullptr);\n  const char* start = utf8_data();\n  int len = utf8_length();\n  const char* end = start + len;\n  CHECK(p >= start);\n  CHECK(p <= end);\n  CHECK(p == end || !UniLib::IsTrailByte(*p));\n  return const_iterator(p);\n}\n\nstring UnicodeText::const_iterator::DebugString() const {\n  return tensorflow::strings::Printf(\"{iter %p}\", it_);\n}\n\n\n// *************************** Utilities *************************\n\nstring CodepointString(const UnicodeText& t) {\n  string s;\n  UnicodeText::const_iterator it = t.begin(), end = t.end();\n  while (it != end) tensorflow::strings::Appendf(&s, \"%X \", *it++);\n  return s;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unicodetext.h",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n#ifndef UTIL_UTF8_PUBLIC_UNICODETEXT_H_\n#define UTIL_UTF8_PUBLIC_UNICODETEXT_H_\n\n#include <stddef.h>                     // for NULL, ptrdiff_t\n#include <iterator>                     // for bidirectional_iterator_tag, etc\n#include <string>                       // for string\n#include <utility>                      // for pair\n\n#include \"syntaxnet/base.h\"\n\n// ***************************** UnicodeText **************************\n//\n// A UnicodeText object is a container for a sequence of Unicode\n// codepoint values. It has default, copy, and assignment constructors.\n// Data can be appended to it from another UnicodeText, from\n// iterators, or from a single codepoint.\n//\n// The internal representation of the text is UTF-8. Since UTF-8 is a\n// variable-width format, UnicodeText does not provide random access\n// to the text, and changes to the text are permitted only at the end.\n//\n// The UnicodeText class defines a const_iterator. The dereferencing\n// operator (*) returns a codepoint (char32). The iterator is a\n// bidirectional, read-only iterator. 
It becomes invalid if the text\n// is changed.\n//\n// There are methods for appending and retrieving UTF-8 data directly.\n// The 'utf8_data' method returns a const char* that contains the\n// UTF-8-encoded version of the text; 'utf8_length' returns the number\n// of bytes in the UTF-8 data. An iterator's 'get' method stores up to\n// 4 bytes of UTF-8 data in a char array and returns the number of\n// bytes that it stored.\n//\n// Codepoints are integers in the range [0, 0xD7FF] or [0xE000,\n// 0x10FFFF], but UnicodeText has the additional restriction that it\n// can contain only those characters that are valid for interchange on\n// the Web. This excludes all of the control codes except for carriage\n// return, line feed, and horizontal tab.  It also excludes\n// non-characters, but codepoints that are in the Private Use regions\n// are allowed, as are codepoints that are unassigned. (See the\n// Unicode reference for details.) The function UniLib::IsInterchangeValid\n// can be used as a test for this property.\n//\n// UnicodeTexts are safe. Every method that constructs or modifies a\n// UnicodeText tests for interchange-validity, and will substitute a\n// space for the invalid data. Such cases are reported via\n// LOG(WARNING).\n//\n// MEMORY MANAGEMENT: copy, take ownership, or point to\n//\n// A UnicodeText is either an \"owner\", meaning that it owns the memory\n// for the data buffer and will free it when the UnicodeText is\n// destroyed, or it is an \"alias\", meaning that it does not.\n//\n// There are three methods for storing UTF-8 data in a UnicodeText:\n//\n// CopyUTF8(buffer, len) copies buffer.\n//\n// TakeOwnershipOfUTF8(buffer, size, capacity) takes ownership of buffer.\n//\n// PointToUTF8(buffer, size) creates an alias pointing to buffer.\n//\n// All three methods perform a validity check on the buffer. There are\n// private, \"unsafe\" versions of these functions that bypass the\n// validity check. 
They are used internally and by friend-functions\n// that are handling UTF-8 data that has already been validated.\n//\n// The purpose of an alias is to avoid making an unnecessary copy of a\n// UTF-8 buffer while still providing access to the Unicode values\n// within that text through iterators or the fast scanners that are\n// based on UTF-8 state tables. The lifetime of an alias must not\n// exceed the lifetime of the buffer from which it was constructed.\n//\n// The semantics of an alias might be described as \"copy on write or\n// repair.\" The source data is never modified. If push_back() or\n// append() is called on an alias, a copy of the data will be created,\n// and the UnicodeText will become an owner. If clear() is called on\n// an alias, it becomes an (empty) owner.\n//\n// The copy constructor and the assignment operator produce an owner.\n// That is, after direct initialization (\"UnicodeText x(y);\") or copy\n// initialization (\"UnicodeText x = y;\") x will be an owner, even if y\n// was an alias. The assignment operator (\"x = y;\") also produces an\n// owner unless x and y are the same object and y is an alias.\n//\n// Aliases should be used with care. If the source from which an alias\n// was created is freed, or if the contents are changed, while the\n// alias is still in use, fatal errors could result. But it can be\n// quite useful to have a UnicodeText \"window\" through which to see a\n// UTF-8 buffer without having to pay the price of making a copy.\n//\n// UTILITIES\n//\n// The interfaces in util/utf8/public/textutils.h provide higher-level\n// utilities for dealing with UnicodeTexts, including routines for\n// creating UnicodeTexts (both owners and aliases) from UTF-8 buffers or\n// strings, creating strings from UnicodeTexts, normalizing text for\n// efficient matching or display, and others.\n\nclass UnicodeText {\n public:\n  class const_iterator;\n\n  typedef char32 value_type;\n\n  // Constructors. 
These always produce owners.\n  UnicodeText();  // Create an empty text.\n  UnicodeText(const UnicodeText& src);  // copy constructor\n  // Construct a substring (copies the data).\n  UnicodeText(const const_iterator& first, const const_iterator& last);\n\n  // Assignment operator. This copies the data and produces an owner\n  // unless this == &src, e.g., \"x = x;\", which is a no-op.\n  UnicodeText& operator=(const UnicodeText& src);\n\n  // x.Copy(y) copies the data from y into x.\n  UnicodeText& Copy(const UnicodeText& src);\n  inline UnicodeText& assign(const UnicodeText& src) { return Copy(src); }\n\n  // x.PointTo(y) changes x so that it points to y's data.\n  // It does not copy y or take ownership of y's data.\n  UnicodeText& PointTo(const UnicodeText& src);\n  UnicodeText& PointTo(const const_iterator& first,\n                       const const_iterator& last);\n\n  ~UnicodeText();\n\n  void clear();  // Clear text.\n  bool empty() const { return repr_.size_ == 0; }  // Test if text is empty.\n\n  // Add a codepoint to the end of the text.\n  // If the codepoint is not interchange-valid, add a space instead\n  // and log a warning.\n  void push_back(char32 codepoint);\n\n  // Generic appending operation.\n  // iterator_traits<ForwardIterator>::value_type must be implicitly\n  // convertible to char32. 
Typical uses of this method might include:\n  //     char32 chars[] = {0x1, 0x2, ...};\n  //     vector<char32> more_chars = ...;\n  //     utext.append(chars, chars+arraysize(chars));\n  //     utext.append(more_chars.begin(), more_chars.end());\n  template<typename ForwardIterator>\n  UnicodeText& append(ForwardIterator first, const ForwardIterator last) {\n    while (first != last) { push_back(*first++); }\n    return *this;\n  }\n\n  // A specialization of the generic append() method.\n  UnicodeText& append(const const_iterator& first, const const_iterator& last);\n\n  // An optimization of append(source.begin(), source.end()).\n  UnicodeText& append(const UnicodeText& source);\n\n  int size() const;  // the number of Unicode characters (codepoints)\n\n  friend bool operator==(const UnicodeText& lhs, const UnicodeText& rhs);\n  friend bool operator!=(const UnicodeText& lhs, const UnicodeText& rhs);\n\n  class const_iterator {\n    typedef const_iterator CI;\n   public:\n    typedef std::bidirectional_iterator_tag iterator_category;\n    typedef char32 value_type;\n    typedef ptrdiff_t difference_type;\n    typedef void pointer;  // (Not needed.)\n    typedef const char32 reference;  // (Needed for const_reverse_iterator)\n\n    // Iterators are default-constructible.\n    const_iterator();\n\n    // It's safe to make multiple passes over a UnicodeText.\n    const_iterator(const const_iterator& other);\n    const_iterator& operator=(const const_iterator& other);\n\n    char32 operator*() const;  // Dereference\n\n    const_iterator& operator++();  // Advance (++iter)\n    const_iterator operator++(int) {  // (iter++)\n      const_iterator result(*this);\n      ++*this;\n      return result;\n    }\n\n    const_iterator& operator--();  // Retreat (--iter)\n    const_iterator operator--(int) {  // (iter--)\n      const_iterator result(*this);\n      --*this;\n      return result;\n    }\n\n    // We love relational operators.\n    friend bool operator==(const CI& 
lhs, const CI& rhs) {\n      return lhs.it_ == rhs.it_; }\n    friend bool operator!=(const CI& lhs, const CI& rhs) {\n      return !(lhs == rhs); }\n    friend bool operator<(const CI& lhs, const CI& rhs);\n    friend bool operator>(const CI& lhs, const CI& rhs) {\n      return rhs < lhs; }\n    friend bool operator<=(const CI& lhs, const CI& rhs) {\n      return !(rhs < lhs); }\n    friend bool operator>=(const CI& lhs, const CI& rhs) {\n      return !(lhs < rhs); }\n\n    friend difference_type distance(const CI& first, const CI& last);\n\n    // UTF-8-specific methods\n    // Store the UTF-8 encoding of the current codepoint into buf,\n    // which must be at least 4 bytes long. Return the number of\n    // bytes written.\n    int get_utf8(char* buf) const;\n    // Return the UTF-8 character that the iterator points to.\n    string get_utf8_string() const;\n    // Return the byte length of the UTF-8 character the iterator points to.\n    int utf8_length() const;\n    // Return the iterator's pointer into the UTF-8 data.\n    const char* utf8_data() const { return it_; }\n\n    string DebugString() const;\n\n   private:\n    friend class UnicodeText;\n    friend class UnicodeTextUtils;\n    friend class UTF8StateTableProperty;\n    explicit const_iterator(const char* it) : it_(it) {}\n\n    const char* it_;\n  };\n\n  const_iterator begin() const;\n  const_iterator end() const;\n\n  class const_reverse_iterator : public std::reverse_iterator<const_iterator> {\n   public:\n    explicit const_reverse_iterator(const_iterator it) :\n        std::reverse_iterator<const_iterator>(it) {}\n    const char* utf8_data() const {\n      const_iterator tmp_it = base();\n      return (--tmp_it).utf8_data();\n    }\n    int get_utf8(char* buf) const {\n      const_iterator tmp_it = base();\n      return (--tmp_it).get_utf8(buf);\n    }\n    string get_utf8_string() const {\n      const_iterator tmp_it = base();\n      return (--tmp_it).get_utf8_string();\n    }\n    int 
utf8_length() const {\n      const_iterator tmp_it = base();\n      return (--tmp_it).utf8_length();\n    }\n  };\n  const_reverse_iterator rbegin() const {\n    return const_reverse_iterator(end());\n  }\n  const_reverse_iterator rend() const {\n    return const_reverse_iterator(begin());\n  }\n\n  // Substring searching.  Returns the beginning of the first\n  // occurrence of \"look\", or end() if not found.\n  const_iterator find(const UnicodeText& look, const_iterator start_pos) const;\n  // Equivalent to find(look, begin())\n  const_iterator find(const UnicodeText& look) const;\n\n  // Returns whether this contains the character U+FFFD.  This can\n  // occur, for example, if the input to Encodings::Decode() had byte\n  // sequences that were invalid in the source encoding.\n  bool HasReplacementChar() const;\n\n  // UTF-8-specific methods\n  //\n  // Return the data, length, and capacity of UTF-8-encoded version of\n  // the text. Length and capacity are measured in bytes.\n  const char* utf8_data() const { return repr_.data_; }\n  int utf8_length() const { return repr_.size_; }\n  int utf8_capacity() const { return repr_.capacity_; }\n\n  // Return the UTF-8 data as a string.\n  static string UTF8Substring(const const_iterator& first,\n                              const const_iterator& last);\n\n  // There are three methods for initializing a UnicodeText from UTF-8\n  // data. They vary in details of memory management. In all cases,\n  // the data is tested for interchange-validity. If it is not\n  // interchange-valid, a LOG(WARNING) is issued, and each\n  // structurally invalid byte and each interchange-invalid codepoint\n  // is replaced with a space.\n\n  // x.CopyUTF8(buf, len) copies buf into x.\n  UnicodeText& CopyUTF8(const char* utf8_buffer, int byte_length);\n\n  // x.TakeOwnershipOfUTF8(buf, len, capacity). x takes ownership of\n  // buf. 
buf is not copied.\n  UnicodeText& TakeOwnershipOfUTF8(char* utf8_buffer,\n                                   int byte_length,\n                                   int byte_capacity);\n\n  // x.PointToUTF8(buf,len) changes x so that it points to buf\n  // (\"becomes an alias\"). It does not take ownership or copy buf.\n  // If the buffer is not valid, this has the same effect as\n  // CopyUTF8(utf8_buffer, byte_length).\n  UnicodeText& PointToUTF8(const char* utf8_buffer, int byte_length);\n\n  // Occasionally it is necessary to use functions that operate on the\n  // pointer returned by utf8_data(). MakeIterator(p) provides a way\n  // to get back to the UnicodeText level. It uses CHECK to ensure\n  // that p is a pointer within this object's UTF-8 data, and that it\n  // points to the beginning of a character.\n  const_iterator MakeIterator(const char* p) const;\n\n  string DebugString() const;\n\n private:\n  friend class const_iterator;\n  friend class UnicodeTextUtils;\n\n  class Repr {  // A byte-string.\n   public:\n    char* data_;\n    int size_;\n    int capacity_;\n    bool ours_;  // Do we own data_?\n\n    Repr() : data_(nullptr), size_(0), capacity_(0), ours_(true) {}\n    ~Repr() { if (ours_) delete[] data_; }\n\n    void clear();\n    void reserve(int capacity);\n    void resize(int size);\n\n    void append(const char* bytes, int byte_length);\n    void Copy(const char* data, int size);\n    void TakeOwnershipOf(char* data, int size, int capacity);\n    void PointTo(const char* data, int size);\n\n    string DebugString() const;\n\n   private:\n    Repr& operator=(const Repr&);\n    Repr(const Repr& other);\n  };\n\n  Repr repr_;\n\n  // UTF-8-specific private methods.\n  // These routines do not perform a validity check when compiled\n  // in opt mode.\n  // It is an error to call these methods with UTF-8 data that\n  // is not interchange-valid.\n  //\n  UnicodeText& UnsafeCopyUTF8(const char* utf8_buffer, int byte_length);\n  UnicodeText& 
UnsafeTakeOwnershipOfUTF8(\n      char* utf8_buffer, int byte_length, int byte_capacity);\n  UnicodeText& UnsafePointToUTF8(const char* utf8_buffer, int byte_length);\n  UnicodeText& UnsafeAppendUTF8(const char* utf8_buffer, int byte_length);\n  const_iterator UnsafeFind(const UnicodeText& look,\n                            const_iterator start_pos) const;\n};\n\nbool operator==(const UnicodeText& lhs, const UnicodeText& rhs);\n\ninline bool operator!=(const UnicodeText& lhs, const UnicodeText& rhs) {\n  return !(lhs == rhs);\n}\n\n// UnicodeTextRange is a pair of iterators, useful for specifying text\n// segments. If the iterators are ==, the segment is empty.\ntypedef pair<UnicodeText::const_iterator,\n             UnicodeText::const_iterator> UnicodeTextRange;\n\ninline bool UnicodeTextRangeIsEmpty(const UnicodeTextRange& r) {\n  return r.first == r.second;\n}\n\n\n// *************************** Utilities *************************\n\n// A factory function for creating a UnicodeText from a buffer of\n// UTF-8 data. The new UnicodeText takes ownership of the buffer. (It\n// is an \"owner.\")\n//\n// Each byte that is structurally invalid will be replaced with a\n// space. Each codepoint that is interchange-invalid will also be\n// replaced with a space, even if the codepoint was represented with a\n// multibyte sequence in the UTF-8 data.\n//\ninline UnicodeText MakeUnicodeTextAcceptingOwnership(\n    char* utf8_buffer, int byte_length, int byte_capacity) {\n  return UnicodeText().TakeOwnershipOfUTF8(\n      utf8_buffer, byte_length, byte_capacity);\n}\n\n// A factory function for creating a UnicodeText from a buffer of\n// UTF-8 data. The new UnicodeText does not take ownership of the\n// buffer. 
(It is an \"alias.\")\n//\ninline UnicodeText MakeUnicodeTextWithoutAcceptingOwnership(\n    const char* utf8_buffer, int byte_length) {\n  return UnicodeText().PointToUTF8(utf8_buffer, byte_length);\n}\n\n// Create a UnicodeText from a UTF-8 string or buffer.\n//\n// If do_copy is true, then a copy of the string is made. The copy is\n// owned by the resulting UnicodeText object and will be freed when\n// the object is destroyed. This UnicodeText object is referred to\n// as an \"owner.\"\n//\n// If do_copy is false, then no copy is made. The resulting\n// UnicodeText object does NOT take ownership of the string; in this\n// case, the lifetime of the UnicodeText object must not exceed the\n// lifetime of the string. This Unicodetext object is referred to as\n// an \"alias.\" This is the same as MakeUnicodeTextWithoutAcceptingOwnership.\n//\n// If the input string does not contain valid UTF-8, then a copy is\n// made (as if do_copy were true) and coerced to valid UTF-8 by\n// replacing each invalid byte with a space.\n//\ninline UnicodeText UTF8ToUnicodeText(const char* utf8_buf, int len,\n                                     bool do_copy) {\n  UnicodeText t;\n  if (do_copy) {\n    t.CopyUTF8(utf8_buf, len);\n  } else {\n    t.PointToUTF8(utf8_buf, len);\n  }\n  return t;\n}\n\ninline UnicodeText UTF8ToUnicodeText(const string& utf_string, bool do_copy) {\n  return UTF8ToUnicodeText(utf_string.data(), utf_string.size(), do_copy);\n}\n\ninline UnicodeText UTF8ToUnicodeText(const char* utf8_buf, int len) {\n  return UTF8ToUnicodeText(utf8_buf, len, true);\n}\ninline UnicodeText UTF8ToUnicodeText(const string& utf8_string) {\n  return UTF8ToUnicodeText(utf8_string, true);\n}\n\n// Return a string containing the UTF-8 encoded version of all the\n// Unicode characters in t.\ninline string UnicodeTextToUTF8(const UnicodeText& t) {\n  return string(t.utf8_data(), t.utf8_length());\n}\n\n// This template function declaration is used in defining arraysize.\n// Note that the 
function doesn't need an implementation, as we only\n// use its type.\ntemplate <typename T, size_t N>\nchar (&ArraySizeHelper(T (&array)[N]))[N];\n#define arraysize(array) (sizeof(ArraySizeHelper(array)))\n\n// For debugging.  Return a string of integers, written in uppercase\n// hex (%X), corresponding to the codepoints within the text. Each\n// integer is followed by a space. E.g., \"61 62 6A 3005 \".\nstring CodepointString(const UnicodeText& t);\n\n#endif  // UTIL_UTF8_PUBLIC_UNICODETEXT_H_\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unicodetext_main.cc",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n// Author: sligocki@google.com (Shawn Ligocki)\n//\n// A basic main function to test that UnicodeText builds.\n\n#include <stdio.h>\n#include <stdlib.h>\n\n#include <string>\n\n#include \"util/utf8/unicodetext.h\"\n\nint main(int argc, char** argv) {\n  if (argc > 1) {\n    printf(\"Bytes:\\n\");\n    std::string bytes(argv[1]);\n    for (std::string::const_iterator iter = bytes.begin();\n         iter < bytes.end(); ++iter) {\n      printf(\"  0x%02X\\n\", *iter);\n    }\n\n    printf(\"Unicode codepoints:\\n\");\n    UnicodeText text(UTF8ToUnicodeText(bytes));\n    for (UnicodeText::const_iterator iter = text.begin();\n         iter < text.end(); ++iter) {\n      printf(\"  U+%X\\n\", *iter);\n    }\n  }\n  return EXIT_SUCCESS;\n}\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unicodetext_unittest.cc",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n#include \"util/utf8/unicodetext.h\"\n\n#include <iterator>\n#include <set>\n\n#include \"gtest/gtest.h\"\n#include \"third_party/utf/utf.h\"\n#include \"util/utf8/unilib.h\"\n\nnamespace {\n\nclass UnicodeTextTest : public testing::Test {\n protected:\n  UnicodeTextTest() : empty_text_() {\n    const char32 text[] = {0x1C0, 0x4E8C, 0xD7DB, 0x34, 0x1D11E};\n    // Construct a UnicodeText from those codepoints.\n    text_.append(&text[0], text + arraysize(text));\n  }\n\n  UnicodeText empty_text_;\n  UnicodeText text_;\n};\n\nTEST(UnicodeTextTest, Ownership) {\n  const string src =  \"\\u304A\\u00B0\\u106B\";\n  {\n    string s = src;\n    char* sbuf = new char[s.size()];\n    memcpy(sbuf, s.data(), s.size());\n    UnicodeText owned;\n    owned.TakeOwnershipOfUTF8(sbuf, s.size(), s.size());\n    EXPECT_EQ(owned.utf8_data(), sbuf);\n    s.clear();\n    // owned should be OK even after s has been cleared.\n    UnicodeText::const_iterator it = owned.begin();\n    EXPECT_EQ(*it++, 0x304A);\n    EXPECT_EQ(*it++, 0x00B0);\n    EXPECT_EQ(*it++, 0x106B);\n    CHECK(it == owned.end());\n  }\n\n  {\n    UnicodeText owner;\n    {  // Create a new scope for s.\n      string s = src;\n      char* sbuf = new char[s.size()];\n      memcpy(sbuf, s.data(), s.size());\n      UnicodeText t;\n      t.TakeOwnershipOfUTF8(sbuf, s.size(), s.size());\n      
EXPECT_EQ(t.utf8_data(), sbuf);\n      owner = t;  // Copies the data\n      EXPECT_NE(owner.utf8_data(), sbuf);\n    }\n    // owner should be OK even after s has gone out of scope\n    UnicodeText::const_iterator it = owner.begin();\n    EXPECT_EQ(*it++, 0x304A);\n    EXPECT_EQ(*it++, 0x00B0);\n    EXPECT_EQ(*it++, 0x106B);\n    CHECK(it == owner.end());\n  }\n\n  {\n    UnicodeText alias;\n    alias.PointToUTF8(src.data(), src.size());\n    EXPECT_EQ(alias.utf8_data(), src.data());\n    UnicodeText::const_iterator it = alias.begin();\n    EXPECT_EQ(*it++, 0x304A);\n    EXPECT_EQ(*it++, 0x00B0);\n    EXPECT_EQ(*it++, 0x106B);\n    CHECK(it == alias.end());\n\n    UnicodeText t = alias;  // Copy initialization copies the data.\n    EXPECT_NE(t.utf8_data(), alias.utf8_data());\n\n    UnicodeText t2;\n    t2 = alias;  // Assignment copies the data.\n    EXPECT_NE(t2.utf8_data(), alias.utf8_data());\n\n    // Preserve an alias.\n    t.PointTo(alias); // This does not copy the data.\n    EXPECT_EQ(t.utf8_data(), alias.utf8_data());\n\n    t.push_back(0x0020); // Modify the alias\n    EXPECT_NE(t.utf8_data(), alias.utf8_data()); // It's no longer an alias.\n  }\n}\n\nclass IteratorTest : public UnicodeTextTest {};\n\nTEST_F(IteratorTest, Iterates) {\n  UnicodeText::const_iterator iter = text_.begin();\n  EXPECT_EQ(0x1C0, *iter);\n  EXPECT_EQ(&iter, &++iter);  // operator++ returns *this.\n  EXPECT_EQ(0x4E8C, *iter++);\n  EXPECT_EQ(0xD7DB, *iter);\n  // Make sure you can dereference more than once.\n  EXPECT_EQ(0xD7DB, *iter);\n  EXPECT_EQ(0x34, *++iter);\n  EXPECT_EQ(0x1D11E, *++iter);\n  ASSERT_TRUE(iter != text_.end());\n  iter++;\n  EXPECT_TRUE(iter == text_.end());\n}\n\nTEST_F(IteratorTest, Reverse) {\n  UnicodeText::const_reverse_iterator iter = text_.rbegin();\n  EXPECT_EQ(0x1D11E, *iter);\n  EXPECT_EQ(&iter, &++iter);  // operator++ returns *this.\n  EXPECT_EQ(0x34, *iter++);\n  EXPECT_EQ(0xD7DB, *iter);\n  // Make sure you can dereference more than once.\n  
EXPECT_EQ(0xD7DB, *iter);\n  EXPECT_EQ(0x4E8C, *++iter);\n  EXPECT_EQ(0x1C0, *++iter);\n  ASSERT_TRUE(iter != text_.rend());\n  iter++;\n  EXPECT_TRUE(iter == text_.rend());\n}\n\nTEST_F(IteratorTest, MultiPass) {\n  // Also tests Default Constructible and Assignable.\n  UnicodeText::const_iterator i1, i2;\n  i1 = text_.begin();\n  i2 = i1;\n  EXPECT_EQ(0x4E8C, *++i1);\n  EXPECT_TRUE(i1 != i2);\n  EXPECT_EQ(0x1C0, *i2);\n  ++i2;\n  EXPECT_TRUE(i1 == i2);\n  EXPECT_EQ(0x4E8C, *i2);\n}\n\nTEST_F(IteratorTest, ReverseIterates) {\n  UnicodeText::const_iterator iter = text_.end();\n  EXPECT_TRUE(iter == text_.end());\n  iter--;\n  ASSERT_TRUE(iter != text_.end());\n  EXPECT_EQ(0x1D11E, *iter--);\n  EXPECT_EQ(0x34, *iter);\n  EXPECT_EQ(0xD7DB, *--iter);\n  // Make sure you can dereference more than once.\n  EXPECT_EQ(0xD7DB, *iter);\n  --iter;\n  EXPECT_EQ(0x4E8C, *iter--);\n  EXPECT_EQ(0x1C0, *iter);\n  EXPECT_TRUE(iter == text_.begin());\n}\n\nTEST_F(IteratorTest, Comparable) {\n  UnicodeText::const_iterator i1, i2;\n  i1 = text_.begin();\n  i2 = i1;\n  ++i2;\n\n  EXPECT_TRUE(i1 < i2);\n  EXPECT_TRUE(text_.begin() <= i1);\n  EXPECT_FALSE(i1 >= i2);\n  EXPECT_FALSE(i1 > text_.end());\n}\n\nTEST_F(IteratorTest, Advance) {\n  UnicodeText::const_iterator iter = text_.begin();\n  EXPECT_EQ(0x1C0, *iter);\n  std::advance(iter, 4);\n  EXPECT_EQ(0x1D11E, *iter);\n  ++iter;\n  EXPECT_TRUE(iter == text_.end());\n}\n\nTEST_F(IteratorTest, Distance) {\n  UnicodeText::const_iterator iter = text_.begin();\n  EXPECT_EQ(0, distance(text_.begin(), iter));\n  EXPECT_EQ(5, distance(iter, text_.end()));\n  ++iter;\n  ++iter;\n  EXPECT_EQ(2, distance(text_.begin(), iter));\n  EXPECT_EQ(3, distance(iter, text_.end()));\n  ++iter;\n  ++iter;\n  EXPECT_EQ(4, distance(text_.begin(), iter));\n  ++iter;\n  EXPECT_EQ(0, distance(iter, text_.end()));\n}\n\nTEST_F(IteratorTest, Encode) {\n  const string utf8 = \"\\xC7\\x80\"\n                      \"\\xE4\\xBA\\x8C\"\n                      
\"\\xED\\x9F\\x9B\"\n                      \"\\x34\"\n                      \"\\xF0\\x9D\\x84\\x9E\";\n  const int lengths[] = {2, 3, 3, 1, 4};\n  EXPECT_EQ(text_.size(), 5);\n  EXPECT_EQ(text_.utf8_length(), 13);\n  EXPECT_TRUE(memcmp(text_.utf8_data(), utf8.data(), text_.utf8_length())\n              == 0);\n\n  {\n    // Test the iterator\n    UnicodeText::const_iterator iter = text_.begin(), end = text_.end();\n    const char* u = utf8.data();\n    int i = 0;\n    while (iter != end) {\n      char buf[5];\n      int n = iter.get_utf8(buf);\n      buf[n] = '\\0';\n      EXPECT_TRUE(strncmp(buf, u, n) == 0);\n      EXPECT_EQ(buf, iter.get_utf8_string());\n      EXPECT_EQ(lengths[i], iter.utf8_length());\n      u += n;\n      iter++;\n      i++;\n    }\n  }\n\n  {\n    // Test the reverse_iterator\n    UnicodeText::const_reverse_iterator iter = text_.rbegin();\n    UnicodeText::const_reverse_iterator end = text_.rend();\n    const char* u = utf8.data() + utf8.size();\n    int i = 0;\n    while (iter != end) {\n      char buf[5];\n      int n = iter.get_utf8(buf);\n      buf[n] = '\\0';\n      u -= n;\n      EXPECT_TRUE(strncmp(buf, u, n) == 0);\n      EXPECT_EQ(buf, iter.get_utf8_string());\n      EXPECT_EQ(lengths[text_.size() - i - 1], iter.utf8_length());\n      iter++;\n      i++;\n    }\n  }\n\n  text_.push_back('$');\n  EXPECT_EQ(text_.size(), 6);\n  EXPECT_EQ(text_.utf8_length(), 14);\n\n  text_.push_back('\\xAE');  // registered sign\n  EXPECT_EQ(text_.size(), 7);\n  EXPECT_EQ(text_.utf8_length(), 16);  // 2 bytes long\n}\n\nTEST_F(IteratorTest, Decode) {\n  const char32 text[] = {0x1C0, 0x4E8C, 0xD7DB, 0x34, 0x1D11E};\n  UnicodeText::const_iterator iter = text_.begin();\n  for (int i = 0; i < 5; ++i)\n    EXPECT_EQ(text[i], *iter++);\n  string s = CodepointString(text_);\n  EXPECT_EQ(s, \"1C0 4E8C D7DB 34 1D11E \");\n}\n\n\n\nclass OperatorTest : public UnicodeTextTest {};\n\nTEST_F(OperatorTest, Clear) {\n  UnicodeText 
empty_text(UTF8ToUnicodeText(\"\"));\n  EXPECT_FALSE(text_ == empty_text);\n  text_.clear();\n  EXPECT_TRUE(text_ == empty_text);\n}\n\nTEST_F(OperatorTest, Empty) {\n  EXPECT_TRUE(empty_text_.empty());\n  EXPECT_FALSE(text_.empty());\n  text_.clear();\n  EXPECT_TRUE(text_.empty());\n}\n\nTEST(UnicodeTextTest, InterchangeValidity) {\n  char* FDD0 = new char[3];\n  memcpy(FDD0, \"\\xEF\\xB7\\x90\", 3);\n  EXPECT_FALSE(UniLib::IsInterchangeValid(FDD0, 3));\n\n  UnicodeText a = MakeUnicodeTextWithoutAcceptingOwnership(FDD0, 3);\n  EXPECT_EQ(a.size(), 1);\n  EXPECT_EQ(*a.begin(), 0x20);\n  a.clear();\n  a.push_back(0xFDD0);\n  EXPECT_EQ(a.size(), 1);\n  EXPECT_EQ(*a.begin(), 0x20);\n\n  a = MakeUnicodeTextAcceptingOwnership(FDD0, 3, 3);\n  EXPECT_EQ(a.size(), 1);\n  EXPECT_EQ(*a.begin(), 0x20);\n  a.clear();\n  a.push_back(0xFDD0);\n  EXPECT_EQ(a.size(), 1);\n  EXPECT_EQ(*a.begin(), 0x20);\n}\n\nclass SubstringSearchTest : public UnicodeTextTest {};\n\n// TEST_F(SubstringSearchTest, FindEmpty) {\n//   EXPECT_TRUE(text_.find(empty_text_) == text_.begin());\n//   EXPECT_TRUE(empty_text_.find(text_) == empty_text_.end());\n// }\n\n// TEST_F(SubstringSearchTest, Find) {\n//   UnicodeText::const_iterator second_pos = text_.begin();\n//   ++second_pos;\n//   UnicodeText::const_iterator third_pos = second_pos;\n//   ++third_pos;\n//   UnicodeText::const_iterator fourth_pos = third_pos;\n//   ++fourth_pos;\n\n//   // same as text_\n//   const char32 text[] = {0x1C0, 0x4E8C, 0xD7DB, 0x34, 0x1D11E};\n\n//   UnicodeText prefix;\n//   prefix.append(&text[0], &text[2]);\n//   EXPECT_TRUE(text_.find(prefix) == text_.begin());\n//   EXPECT_TRUE(text_.find(prefix, second_pos) == text_.end());\n\n//   UnicodeText suffix;\n//   suffix.append(&text[2], text + arraysize(text));\n//   EXPECT_TRUE(text_.find(suffix) == third_pos);\n//   EXPECT_TRUE(text_.find(suffix, second_pos) == third_pos);\n//   EXPECT_TRUE(text_.find(suffix, third_pos) == third_pos);\n//   
EXPECT_TRUE(text_.find(suffix, fourth_pos) == text_.end());\n// }\n\n// TEST_F(SubstringSearchTest, HasConversionError) {\n//   EXPECT_FALSE(text_.HasReplacementChar());\n//   const char32 beg[] = {0xFFFD, 0x1C0, 0x4E8C, 0xD7DB, 0x34, 0x1D11E};\n//   UnicodeText beg_uni;\n//   beg_uni.append(&beg[0], beg + arraysize(beg));\n//   EXPECT_TRUE(beg_uni.HasReplacementChar());\n\n//   const char32 mid[] = {0x1C0, 0x4E8C, 0xFFFD, 0xD7DB, 0x34, 0x1D11E};\n//   UnicodeText mid_uni;\n//   mid_uni.append(&mid[0], mid + arraysize(mid));\n//   EXPECT_TRUE(mid_uni.HasReplacementChar());\n\n//   const char32 end[] = {0x1C0, 0x4E8C, 0xD7DB, 0x34, 0x1D11E, 0xFFFD};\n//   UnicodeText end_uni;\n//   end_uni.append(&end[0], end + arraysize(end));\n//   EXPECT_TRUE(end_uni.HasReplacementChar());\n\n//   const char32 two[] = {0xFFFD, 0x1C0, 0x4E8C, 0xD7DB, 0x34, 0x1D11E, 0xFFFD};\n//   UnicodeText two_uni;\n//   two_uni.append(&two[0], two + arraysize(two));\n//   EXPECT_TRUE(two_uni.HasReplacementChar());\n\n//   const char32 adj[] = {0x1C0, 0xFFFD, 0xFFFD, 0x4E8C, 0xD7DB, 0x34, 0x1D11E};\n//   UnicodeText adj_uni;\n//   adj_uni.append(&adj[0], adj + arraysize(adj));\n//   EXPECT_TRUE(adj_uni.HasReplacementChar());\n// }\n\n}  // namespace\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unilib.cc",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n// Author: sligocki@google.com (Shawn Ligocki)\n\n#include \"util/utf8/unilib.h\"\n\n#include \"syntaxnet/base.h\"\n#include \"third_party/utf/utf.h\"\n\nnamespace UniLib {\n\n// Codepoints not allowed for interchange are:\n//   C0 (ASCII) controls: U+0000 to U+001F excluding Space (SP, U+0020),\n//       Horizontal Tab (HT, U+0009), Line-Feed (LF, U+000A),\n//       Form Feed (FF, U+000C) and Carriage-Return (CR, U+000D)\n//   C1 controls: U+007F to U+009F\n//   Surrogates: U+D800 to U+DFFF\n//   Non-characters: U+FDD0 to U+FDEF and U+xxFFFE to U+xxFFFF for all xx\nbool IsInterchangeValid(char32 c) {\n  return !((c >= 0x00 && c <= 0x08) || c == 0x0B || (c >= 0x0E && c <= 0x1F) ||\n           (c >= 0x7F && c <= 0x9F) ||\n           (c >= 0xD800 && c <= 0xDFFF) ||\n           (c >= 0xFDD0 && c <= 0xFDEF) || (c&0xFFFE) == 0xFFFE);\n}\n\nint SpanInterchangeValid(const char* begin, int byte_length) {\n  char32 rune;\n  const char* p = begin;\n  const char* end = begin + byte_length;\n  while (p < end) {\n    int bytes_consumed = charntorune(&rune, p, end - p);\n    // We want to accept Runeerror == U+FFFD as a valid char, but it is used\n    // by chartorune to indicate error. 
Luckily, the real codepoint is size 3\n    // while errors return bytes_consumed <= 1.\n    if ((rune == Runeerror && bytes_consumed <= 1) ||\n        !IsInterchangeValid(rune)) {\n      break;  // Found\n    }\n    p += bytes_consumed;\n  }\n  return p - begin;\n}\n\n}  // namespace UniLib\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unilib.h",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n// Routines to do manipulation of Unicode characters or text\n//\n// The StructurallyValid routines accept buffers of arbitrary bytes.\n// For CoerceToStructurallyValid(), the input buffer and output buffers may\n// point to exactly the same memory.\n//\n// In all other cases, the UTF-8 string must be structurally valid and\n// have all codepoints in the range  U+0000 to U+D7FF or U+E000 to U+10FFFF.\n// Debug builds take a fatal error for invalid UTF-8 input.\n// The input and output buffers may not overlap at all.\n//\n// The char32 routines are here only for convenience; they convert to UTF-8\n// internally and use the UTF-8 routines.\n\n#ifndef UTIL_UTF8_UNILIB_H__\n#define UTIL_UTF8_UNILIB_H__\n\n#include <string>\n#include \"syntaxnet/base.h\"\n\n// We export OneCharLen, IsValidCodepoint, and IsTrailByte from here,\n// but they are defined in unilib_utf8_utils.h.\n//#include \"util/utf8/public/unilib_utf8_utils.h\"  // IWYU pragma: export\n\nnamespace UniLib {\n\n// Returns the length in bytes of the prefix of src that is all\n//  interchange valid UTF-8\nint SpanInterchangeValid(const char* src, int byte_length);\ninline int SpanInterchangeValid(const std::string& src) {\n  return SpanInterchangeValid(src.data(), src.size());\n}\n\n// Returns true if the source is all interchange valid UTF-8\n// \"Interchange valid\" is a stronger than 
structurally valid --\n// no C0 or C1 control codes (other than CR LF HT FF) and no non-characters.\nbool IsInterchangeValid(char32 codepoint);\ninline bool IsInterchangeValid(const char* src, int byte_length) {\n  return (byte_length == SpanInterchangeValid(src, byte_length));\n}\ninline bool IsInterchangeValid(const std::string& src) {\n  return IsInterchangeValid(src.data(), src.size());\n}\n\n}  // namespace UniLib\n\n#endif  // UTIL_UTF8_UNILIB_H__\n"
  },
  {
    "path": "model_zoo/models/syntaxnet/util/utf8/unilib_utf8_utils.h",
    "content": "/**\n * Copyright 2010 Google Inc.\n *\n * Licensed under the Apache License, Version 2.0 (the \"License\");\n * you may not use this file except in compliance with the License.\n * You may obtain a copy of the License at\n *\n *      http://www.apache.org/licenses/LICENSE-2.0\n *\n * Unless required by applicable law or agreed to in writing, software\n * distributed under the License is distributed on an \"AS IS\" BASIS,\n * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n * See the License for the specific language governing permissions and\n * limitations under the License.\n */\n\n#ifndef UTIL_UTF8_PUBLIC_UNILIB_UTF8_UTILS_H_\n#define UTIL_UTF8_PUBLIC_UNILIB_UTF8_UTILS_H_\n\n// These definitions are self-contained and have no dependencies.\n// They are also exported from unilib.h for legacy reasons.\n\n#include \"syntaxnet/base.h\"\n#include \"third_party/utf/utf.h\"\n\nnamespace UniLib {\n\n// Returns true if 'c' is in the range [0, 0xD800) or [0xE000, 0x10FFFF]\n// (i.e., is not a surrogate codepoint). See also\n// IsValidCodepoint(const char* src) in util/utf8/public/unilib.h.\ninline bool IsValidCodepoint(char32 c) {\n  return (static_cast<uint32>(c) < 0xD800)\n    || (c >= 0xE000 && c <= 0x10FFFF);\n}\n\n// Returns true if 'str' is the start of a structurally valid UTF-8\n// sequence and is not a surrogate codepoint. Returns false if str.empty()\n// or if str.length() < UniLib::OneCharLen(str[0]). Otherwise, this function\n// will access 1-4 bytes of src, where n is UniLib::OneCharLen(src[0]).\ninline bool IsUTF8ValidCodepoint(StringPiece str) {\n  char32 c;\n  int consumed;\n  // It's OK if str.length() > consumed.\n  return !str.empty()\n      && isvalidcharntorune(str.data(), str.size(), &c, &consumed)\n      && IsValidCodepoint(c);\n}\n\n// Returns the length (number of bytes) of the Unicode code point\n// starting at src, based on inspecting just that one byte. 
This\n// requires that src point to a well-formed UTF-8 string; the result\n// is undefined otherwise.\ninline int OneCharLen(const char* src) {\n  return \"\\1\\1\\1\\1\\1\\1\\1\\1\\1\\1\\1\\1\\2\\2\\3\\4\"[(*src & 0xFF) >> 4];\n}\n\n// Returns true if this byte is a trailing UTF-8 byte (10xx xxxx)\ninline bool IsTrailByte(char x) {\n  // return (x & 0xC0) == 0x80;\n  // Since trail bytes are always in [0x80, 0xBF], we can optimize:\n  return static_cast<signed char>(x) < -0x40;\n}\n\n}  // namespace UniLib\n\n#endif  // UTIL_UTF8_PUBLIC_UNILIB_UTF8_UTILS_H_\n"
  },
  {
    "path": "model_zoo/models/textsum/BUILD",
    "content": "package(default_visibility = [\":internal\"])\n\nlicenses([\"notice\"])  # Apache 2.0\n\nexports_files([\"LICENSE\"])\n\npackage_group(\n    name = \"internal\",\n    packages = [\n        \"//textsum/...\",\n    ],\n)\n\npy_library(\n    name = \"seq2seq_attention_model\",\n    srcs = [\"seq2seq_attention_model.py\"],\n    deps = [\n        \":seq2seq_lib\",\n    ],\n)\n\npy_library(\n    name = \"seq2seq_lib\",\n    srcs = [\"seq2seq_lib.py\"],\n)\n\npy_binary(\n    name = \"seq2seq_attention\",\n    srcs = [\"seq2seq_attention.py\"],\n    deps = [\n        \":batch_reader\",\n        \":data\",\n        \":seq2seq_attention_decode\",\n        \":seq2seq_attention_model\",\n    ],\n)\n\npy_library(\n    name = \"batch_reader\",\n    srcs = [\"batch_reader.py\"],\n    deps = [\n        \":data\",\n        \":seq2seq_attention_model\",\n    ],\n)\n\npy_library(\n    name = \"beam_search\",\n    srcs = [\"beam_search.py\"],\n)\n\npy_library(\n    name = \"seq2seq_attention_decode\",\n    srcs = [\"seq2seq_attention_decode.py\"],\n    deps = [\n        \":beam_search\",\n        \":data\",\n    ],\n)\n\npy_library(\n    name = \"data\",\n    srcs = [\"data.py\"],\n)\n"
  },
  {
    "path": "model_zoo/models/textsum/README.md",
    "content": "Sequence-to-Sequence with Attention Model for Text Summarization.\n\nAuthors:\n\nXin Pan (xpan@google.com, github:panyx0718),\nPeter Liu (peterjliu@google.com, github:peterjliu)\n\n<b>Introduction</b>\n\nThe core model is the traditional sequence-to-sequence model with attention.\nIt is customized (mostly inputs/outputs) for the text summarization task. The\nmodel has been trained on Gigaword dataset and achieved state-of-the-art\nresults (as of June 2016).\n\nThe results described below are based on model trained on multi-gpu and\nmulti-machine settings. It has been simplified to run on only one machine\nfor open source purpose.\n\n<b>DataSet</b>\n\nWe used the Gigaword dataset described in [Rush et al. A Neural Attention Model\nfor Sentence Summarization](https://arxiv.org/abs/1509.00685).\n\nWe cannot provide the dataset due to the license. See ExampleGen in data.py\nabout the data format. data/data contains a toy example. Also see data/vocab\nfor example vocabulary format. In <b>How To Run</b> below, users can use toy\ndata and vocab provided in the data/ directory to run the training by replacing\nthe data directory flag.\n\ndata_convert_example.py contains example of convert between binary and text.\n\n\n<b>Experiment Result</b>\n\n8000 examples from testset are sampled to generate summaries and rouge score is\ncalculated for the generated summaries. Here is the best rouge score on\nGigaword dataset:\n\nROUGE-1 Average_R: 0.38272 (95%-conf.int. 0.37774 - 0.38755)\n\nROUGE-1 Average_P: 0.50154 (95%-conf.int. 0.49509 - 0.50780)\n\nROUGE-1 Average_F: 0.42568 (95%-conf.int. 0.42016 - 0.43099)\n\nROUGE-2 Average_R: 0.20576 (95%-conf.int. 0.20060 - 0.21112)\n\nROUGE-2 Average_P: 0.27565 (95%-conf.int. 0.26851 - 0.28257)\n\nROUGE-2 Average_F: 0.23126 (95%-conf.int. 
0.22539 - 0.23708)\n\n<b>Configuration:</b>\n\nThe following is the configuration for the best model trained on Gigaword:\n\nbatch_size: 64\n\nbidirectional encoding layers: 4\n\narticle length: first 2 sentences, up to 120 words in total.\n\nsummary length: up to 30 words in total.\n\nword embedding size: 128\n\nLSTM hidden units: 256\n\nSampled softmax: 4096\n\nvocabulary size: most frequent 200k words from the dataset's articles and summaries.\n\n<b>How To Run</b>\n\nPrerequisites:\n\nInstall TensorFlow and Bazel.\n\n```shell\n# cd to your workspace\n# 1. Clone the textsum code to your workspace 'textsum' directory.\n# 2. Create an empty 'WORKSPACE' file in your workspace.\n# 3. Move the train/eval/test data to your workspace 'data' directory.\n#    In the following example, I named the data training-*, test-*, etc.\n#    If your data files have different names, update the --data_path.\n#    If you don't have data but want to try out the model, copy the toy\n#    data from textsum/data/data to the data/ directory in the workspace.\nls -R\n.:\ndata  textsum  WORKSPACE\n\n./data:\nvocab  test-0  training-0  training-1  validation-0 ...(omitted)\n\n./textsum:\nbatch_reader.py       beam_search.py       BUILD    README.md                    seq2seq_attention_model.py  data\ndata.py  seq2seq_attention_decode.py  seq2seq_attention.py        seq2seq_lib.py\n\n./textsum/data:\ndata  vocab\n\nbazel build -c opt --config=cuda textsum/...\n\n# Run the training.\nbazel-bin/textsum/seq2seq_attention \\\n  --mode=train \\\n  --article_key=article \\\n  --abstract_key=abstract \\\n  --data_path=data/training-* \\\n  --vocab_path=data/vocab \\\n  --log_root=textsum/log_root \\\n  --train_dir=textsum/log_root/train\n\n# Run the eval. 
Try to avoid running it on the same machine as training.\nbazel-bin/textsum/seq2seq_attention \\\n  --mode=eval \\\n  --article_key=article \\\n  --abstract_key=abstract \\\n  --data_path=data/validation-* \\\n  --vocab_path=data/vocab \\\n  --log_root=textsum/log_root \\\n  --eval_dir=textsum/log_root/eval\n\n# Run the decode. Run it when the model is mostly converged.\nbazel-bin/textsum/seq2seq_attention \\\n  --mode=decode \\\n  --article_key=article \\\n  --abstract_key=abstract \\\n  --data_path=data/test-* \\\n  --vocab_path=data/vocab \\\n  --log_root=textsum/log_root \\\n  --decode_dir=textsum/log_root/decode \\\n  --beam_size=8\n```\n\n\n<b>Examples:</b>\n\nThe following are some text summarization examples, including experiments\nusing datasets other than Gigaword.\n\narticle: novell inc. chief executive officer eric schmidt has been named chairman of the internet search-engine company google .\n\nhuman: novell ceo named google chairman\n\nmachine:  novell chief executive named to head internet company\n\n======================================\n\narticle: gulf newspapers voiced skepticism thursday over whether newly re - elected us president bill clinton could help revive the troubled middle east peace process but saw a glimmer of hope .\n\nhuman: gulf skeptical about whether clinton will revive peace process\n\nmachine:  gulf press skeptical over clinton 's prospects for peace process\n\n======================================\n\narticle:  the european court of justice ( ecj ) recently ruled in lock v british gas trading ltd that eu law requires a worker 's statutory holiday pay to take commission payments into account - it should not be based solely on basic salary . the case is not over yet , but its outcome could potentially be costly for employers with workers who are entitled to commission . mr lock , an energy salesman for british gas , was paid a basic salary and sales commission on a monthly basis . 
his sales commission made up around 60 % of his remuneration package . when he took two weeks ' annual leave in december 2012 , he was paid his basic salary and also received commission from previous sales that fell due during that period . lock obviously did not generate new sales while he was on holiday , which meant that in the following period he suffered a reduced income through lack of commission . he brought an employment tribunal claim asserting that this amounted to a breach of the working time regulations 1998 .....deleted rest for readability...\n\nabstract: will british gas ecj ruling fuel holiday pay hike ?\n\ndecode: eu law requires worker 's statutory holiday pay \n\n======================================\n\narticle:  the junior all whites have been eliminated from the fifa u - 20 world cup in colombia with results on the final day of pool play confirming their exit . sitting on two points , new zealand needed results in one of the final two groups to go their way to join the last 16 as one of the four best third place teams . but while spain helped the kiwis ' cause with a 5 - 1 thrashing of australia , a 3 - 0 win for ecuador over costa rica saw the south americans climb to second in group c with costa rica 's three points also good enough to progress in third place . that left the junior all whites hopes hanging on the group d encounter between croatia and honduras finishing in a draw . a stalemate - and a place in the knockout stages for new zealand - appeared on the cards until midfielder marvin ceballos netted an 81st minute winner that sent guatemala through to the second round and left the junior all whites packing their bags . new zealand finishes the 24 - nation tournament in 17th place , having claimed their first ever points at this level in just their second appearance at the finals .\n\nabstract: junior all whites exit world cup\n\ndecoded:  junior all whites eliminated from u- 20 world cup\n\n"
  },
  {
    "path": "model_zoo/models/textsum/batch_reader.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Batch reader to seq2seq attention model, with bucketing support.\"\"\"\n\nfrom collections import namedtuple\nimport Queue\nfrom random import shuffle\nfrom threading import Thread\nimport time\n\nimport numpy as np\nimport tensorflow as tf\n\nimport data\n\nModelInput = namedtuple('ModelInput',\n                        'enc_input dec_input target enc_len dec_len '\n                        'origin_article origin_abstract')\n\nBUCKET_CACHE_BATCH = 100\nQUEUE_NUM_BATCH = 100\n\n\nclass Batcher(object):\n  \"\"\"Batch reader with shuffling and bucketing support.\"\"\"\n\n  def __init__(self, data_path, vocab, hps,\n               article_key, abstract_key, max_article_sentences,\n               max_abstract_sentences, bucketing=True, truncate_input=False):\n    \"\"\"Batcher constructor.\n\n    Args:\n      data_path: tf.Example filepattern.\n      vocab: Vocabulary.\n      hps: Seq2SeqAttention model hyperparameters.\n      article_key: article feature key in tf.Example.\n      abstract_key: abstract feature key in tf.Example.\n      max_article_sentences: Max number of sentences used from article.\n      max_abstract_sentences: Max number of sentences used from abstract.\n      bucketing: Whether bucket articles of similar length into the same 
batch.\n      truncate_input: Whether to truncate input that is too long. The\n        alternative is to discard such examples.\n    \"\"\"\n    self._data_path = data_path\n    self._vocab = vocab\n    self._hps = hps\n    self._article_key = article_key\n    self._abstract_key = abstract_key\n    self._max_article_sentences = max_article_sentences\n    self._max_abstract_sentences = max_abstract_sentences\n    self._bucketing = bucketing\n    self._truncate_input = truncate_input\n    self._input_queue = Queue.Queue(QUEUE_NUM_BATCH * self._hps.batch_size)\n    self._bucket_input_queue = Queue.Queue(QUEUE_NUM_BATCH)\n    self._input_threads = []\n    for _ in xrange(16):\n      self._input_threads.append(Thread(target=self._FillInputQueue))\n      self._input_threads[-1].daemon = True\n      self._input_threads[-1].start()\n    self._bucketing_threads = []\n    for _ in xrange(4):\n      self._bucketing_threads.append(Thread(target=self._FillBucketInputQueue))\n      self._bucketing_threads[-1].daemon = True\n      self._bucketing_threads[-1].start()\n\n    self._watch_thread = Thread(target=self._WatchThreads)\n    self._watch_thread.daemon = True\n    self._watch_thread.start()\n\n  def NextBatch(self):\n    \"\"\"Returns a batch of inputs for seq2seq attention model.\n\n    Returns:\n      enc_batch: A batch of encoder inputs [batch_size, hps.enc_timesteps].\n      dec_batch: A batch of decoder inputs [batch_size, hps.dec_timesteps].\n      target_batch: A batch of targets [batch_size, hps.dec_timesteps].\n      enc_input_len: encoder input lengths of the batch.\n      dec_input_len: decoder input lengths of the batch.\n      loss_weights: weights for loss function, 1 if not padded, 0 if padded.\n      origin_articles: original article words.\n      origin_abstracts: original abstract words.\n    \"\"\"\n    enc_batch = np.zeros(\n        (self._hps.batch_size, self._hps.enc_timesteps), dtype=np.int32)\n    enc_input_lens = np.zeros(\n        
(self._hps.batch_size), dtype=np.int32)\n    dec_batch = np.zeros(\n        (self._hps.batch_size, self._hps.dec_timesteps), dtype=np.int32)\n    dec_output_lens = np.zeros(\n        (self._hps.batch_size), dtype=np.int32)\n    target_batch = np.zeros(\n        (self._hps.batch_size, self._hps.dec_timesteps), dtype=np.int32)\n    loss_weights = np.zeros(\n        (self._hps.batch_size, self._hps.dec_timesteps), dtype=np.float32)\n    origin_articles = ['None'] * self._hps.batch_size\n    origin_abstracts = ['None'] * self._hps.batch_size\n\n    buckets = self._bucket_input_queue.get()\n    for i in xrange(self._hps.batch_size):\n      (enc_inputs, dec_inputs, targets, enc_input_len, dec_output_len,\n       article, abstract) = buckets[i]\n\n      origin_articles[i] = article\n      origin_abstracts[i] = abstract\n      enc_input_lens[i] = enc_input_len\n      dec_output_lens[i] = dec_output_len\n      enc_batch[i, :] = enc_inputs[:]\n      dec_batch[i, :] = dec_inputs[:]\n      target_batch[i, :] = targets[:]\n      for j in xrange(dec_output_len):\n        loss_weights[i][j] = 1\n    return (enc_batch, dec_batch, target_batch, enc_input_lens, dec_output_lens,\n            loss_weights, origin_articles, origin_abstracts)\n\n  def _FillInputQueue(self):\n    \"\"\"Fill input queue with ModelInput.\"\"\"\n    start_id = self._vocab.WordToId(data.SENTENCE_START)\n    end_id = self._vocab.WordToId(data.SENTENCE_END)\n    pad_id = self._vocab.WordToId(data.PAD_TOKEN)\n    input_gen = self._TextGenerator(data.ExampleGen(self._data_path))\n    while True:\n      (article, abstract) = input_gen.next()\n      article_sentences = [sent.strip() for sent in\n                           data.ToSentences(article, include_token=False)]\n      abstract_sentences = [sent.strip() for sent in\n                            data.ToSentences(abstract, include_token=False)]\n\n      enc_inputs = []\n      # Use the <s> as the <GO> symbol for decoder inputs.\n      dec_inputs = 
[start_id]\n\n      # Convert first N sentences to word IDs, stripping existing <s> and </s>.\n      for i in xrange(min(self._max_article_sentences,\n                          len(article_sentences))):\n        enc_inputs += data.GetWordIds(article_sentences[i], self._vocab)\n      for i in xrange(min(self._max_abstract_sentences,\n                          len(abstract_sentences))):\n        dec_inputs += data.GetWordIds(abstract_sentences[i], self._vocab)\n\n      # Filter out too-short input\n      if (len(enc_inputs) < self._hps.min_input_len or\n          len(dec_inputs) < self._hps.min_input_len):\n        tf.logging.warning('Drop an example - too short.\\nenc:%d\\ndec:%d',\n                           len(enc_inputs), len(dec_inputs))\n        continue\n\n      # If we're not truncating input, throw out too-long input\n      if not self._truncate_input:\n        if (len(enc_inputs) > self._hps.enc_timesteps or\n            len(dec_inputs) > self._hps.dec_timesteps):\n          tf.logging.warning('Drop an example - too long.\\nenc:%d\\ndec:%d',\n                             len(enc_inputs), len(dec_inputs))\n          continue\n      # If we are truncating input, do so if necessary\n      else:\n        if len(enc_inputs) > self._hps.enc_timesteps:\n          enc_inputs = enc_inputs[:self._hps.enc_timesteps]\n        if len(dec_inputs) > self._hps.dec_timesteps:\n          dec_inputs = dec_inputs[:self._hps.dec_timesteps]\n\n      # targets is dec_inputs without <s> at beginning, plus </s> at end\n      targets = dec_inputs[1:]\n      targets.append(end_id)\n\n      # Now len(enc_inputs) should be <= enc_timesteps, and\n      # len(targets) = len(dec_inputs) should be <= dec_timesteps\n\n      enc_input_len = len(enc_inputs)\n      dec_output_len = len(targets)\n\n      # Pad if necessary\n      while len(enc_inputs) < self._hps.enc_timesteps:\n        enc_inputs.append(pad_id)\n      while len(dec_inputs) < self._hps.dec_timesteps:\n        
dec_inputs.append(end_id)\n      while len(targets) < self._hps.dec_timesteps:\n        targets.append(end_id)\n\n      element = ModelInput(enc_inputs, dec_inputs, targets, enc_input_len,\n                           dec_output_len, ' '.join(article_sentences),\n                           ' '.join(abstract_sentences))\n      self._input_queue.put(element)\n\n  def _FillBucketInputQueue(self):\n    \"\"\"Fill bucketed batches into the bucket_input_queue.\"\"\"\n    while True:\n      inputs = []\n      for _ in xrange(self._hps.batch_size * BUCKET_CACHE_BATCH):\n        inputs.append(self._input_queue.get())\n      if self._bucketing:\n        inputs = sorted(inputs, key=lambda inp: inp.enc_len)\n\n      batches = []\n      for i in xrange(0, len(inputs), self._hps.batch_size):\n        batches.append(inputs[i:i+self._hps.batch_size])\n      shuffle(batches)\n      for b in batches:\n        self._bucket_input_queue.put(b)\n\n  def _WatchThreads(self):\n    \"\"\"Watch the daemon input threads and restart if dead.\"\"\"\n    while True:\n      time.sleep(60)\n      input_threads = []\n      for t in self._input_threads:\n        if t.is_alive():\n          input_threads.append(t)\n        else:\n          tf.logging.error('Found input thread dead.')\n          new_t = Thread(target=self._FillInputQueue)\n          input_threads.append(new_t)\n          input_threads[-1].daemon = True\n          input_threads[-1].start()\n      self._input_threads = input_threads\n\n      bucketing_threads = []\n      for t in self._bucketing_threads:\n        if t.is_alive():\n          bucketing_threads.append(t)\n        else:\n          tf.logging.error('Found bucketing thread dead.')\n          new_t = Thread(target=self._FillBucketInputQueue)\n          bucketing_threads.append(new_t)\n          bucketing_threads[-1].daemon = True\n          bucketing_threads[-1].start()\n      self._bucketing_threads = bucketing_threads\n\n  def _TextGenerator(self, example_gen):\n    
\"\"\"Generates article and abstract text from tf.Example.\"\"\"\n    while True:\n      e = example_gen.next()\n      try:\n        article_text = self._GetExFeatureText(e, self._article_key)\n        abstract_text = self._GetExFeatureText(e, self._abstract_key)\n      except ValueError:\n        tf.logging.error('Failed to get article or abstract from example')\n        continue\n\n      yield (article_text, abstract_text)\n\n  def _GetExFeatureText(self, ex, key):\n    \"\"\"Extract text for a feature from td.Example.\n\n    Args:\n      ex: tf.Example.\n      key: key of the feature to be extracted.\n    Returns:\n      feature: a feature text extracted.\n    \"\"\"\n    return ex.features.feature[key].bytes_list.value[0]\n"
  },
  {
    "path": "model_zoo/models/textsum/beam_search.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Beam search module.\n\nBeam search takes the top K results from the model, predicts the K results for\neach of the previous K result, getting K*K results. Pick the top K results from\nK*K results, and start over again until certain number of results are fully\ndecoded.\n\"\"\"\n\nimport tensorflow as tf\n\nFLAGS = tf.flags.FLAGS\ntf.flags.DEFINE_bool('normalize_by_length', True, 'Whether normalize')\n\n\nclass Hypothesis(object):\n  \"\"\"Defines a hypothesis during beam search.\"\"\"\n\n  def __init__(self, tokens, log_prob, state):\n    \"\"\"Hypothesis constructor.\n\n    Args:\n      tokens: start tokens for decoding.\n      log_prob: log prob of the start tokens, usually 1.\n      state: decoder initial states.\n    \"\"\"\n    self.tokens = tokens\n    self.log_prob = log_prob\n    self.state = state\n\n  def Extend(self, token, log_prob, new_state):\n    \"\"\"Extend the hypothesis with result from latest step.\n\n    Args:\n      token: latest token from decoding.\n      log_prob: log prob of the latest decoded tokens.\n      new_state: decoder output state. 
Fed to the decoder for the next step.\n    Returns:\n      New Hypothesis with the results from the latest step.\n    \"\"\"\n    return Hypothesis(self.tokens + [token], self.log_prob + log_prob,\n                      new_state)\n\n  @property\n  def latest_token(self):\n    return self.tokens[-1]\n\n  def __str__(self):\n    return ('Hypothesis(log prob = %.4f, tokens = %s)' % (self.log_prob,\n                                                          self.tokens))\n\n\nclass BeamSearch(object):\n  \"\"\"Beam search.\"\"\"\n\n  def __init__(self, model, beam_size, start_token, end_token, max_steps):\n    \"\"\"Creates a BeamSearch object.\n\n    Args:\n      model: Seq2SeqAttentionModel.\n      beam_size: int.\n      start_token: int, id of the token to start decoding with\n      end_token: int, id of the token that completes a hypothesis\n      max_steps: int, upper limit on the size of a hypothesis\n    \"\"\"\n    self._model = model\n    self._beam_size = beam_size\n    self._start_token = start_token\n    self._end_token = end_token\n    self._max_steps = max_steps\n\n  def BeamSearch(self, sess, enc_inputs, enc_seqlen):\n    \"\"\"Performs beam search for decoding.\n\n    Args:\n      sess: tf.Session, session\n      enc_inputs: ndarray of shape (enc_length, 1), the document ids to encode\n      enc_seqlen: ndarray of shape (1), the length of the sequence\n\n    Returns:\n      hyps: list of Hypothesis, the best hypotheses found by beam search,\n          ordered by score\n    \"\"\"\n\n    # Run the encoder and extract the outputs and final state.\n    enc_top_states, dec_in_state = self._model.encode_top_state(\n        sess, enc_inputs, enc_seqlen)\n    # Replicate the initial states K times for the first step.\n    hyps = [Hypothesis([self._start_token], 0.0, dec_in_state)\n           ] * self._beam_size\n    results = []\n\n    steps = 0\n    while steps < self._max_steps and len(results) < self._beam_size:\n      latest_tokens = [h.latest_token for h in 
hyps]\n      states = [h.state for h in hyps]\n\n      topk_ids, topk_log_probs, new_states = self._model.decode_topk(\n          sess, latest_tokens, enc_top_states, states)\n      # Extend each hypothesis.\n      all_hyps = []\n      # The first step takes the best K results from the first hyps. Following\n      # steps take the best K results from K*K hyps.\n      num_beam_source = 1 if steps == 0 else len(hyps)\n      for i in xrange(num_beam_source):\n        h, ns = hyps[i], new_states[i]\n        for j in xrange(self._beam_size*2):\n          all_hyps.append(h.Extend(topk_ids[i, j], topk_log_probs[i, j], ns))\n\n      # Filter and collect any hypotheses that have the end token.\n      hyps = []\n      for h in self._BestHyps(all_hyps):\n        if h.latest_token == self._end_token:\n          # Pull the hypothesis off the beam if the end token is reached.\n          results.append(h)\n        else:\n          # Otherwise continue to extend the hypothesis.\n          hyps.append(h)\n        if len(hyps) == self._beam_size or len(results) == self._beam_size:\n          break\n\n      steps += 1\n\n    if steps == self._max_steps:\n      results.extend(hyps)\n\n    return self._BestHyps(results)\n\n  def _BestHyps(self, hyps):\n    \"\"\"Sorts the hyps based on log probs and length.\n\n    Args:\n      hyps: A list of hypotheses.\n    Returns:\n      hyps: A list of hypotheses sorted in descending log_prob order.\n    \"\"\"\n    # This length normalization is only effective for the final results.\n    if FLAGS.normalize_by_length:\n      return sorted(hyps, key=lambda h: h.log_prob/len(h.tokens), reverse=True)\n    else:\n      return sorted(hyps, key=lambda h: h.log_prob, reverse=True)\n"
  },
  {
    "path": "model_zoo/models/textsum/data/vocab",
    "content": "the 135597564\n, 121400181\n. 98868076\nto 58429764\nof 56269484\nin 49820911\na 49701084\nand 49378364\n's 23787251\n'' 23227828\n`` 23116499\nthat 21577263\nfor 20998230\nsaid 20858620\non 19106851\n## 16627320\nis 15661835\nwas 14607055\nwith 14265376\nhe 13755120\nit 13588190\n<UNK> 12263923\nat 12221539\nas 11657129\nby 11105584\n</d> 11090708\n<d> 11090708\nfrom 10275933\nhis 9090323\nbe 8939486\nhave 8930288\nhas 8880930\nbut 8213981\nan 8035012\nfourmile 60\nzwart 60\npost-baby 60\ndiasporas 60\nherzeg-bosna 60\nyounkers 60\nrolfing 60\ncyclades 60\nlovas 60\nsuper-cheap 60\njohnsonglobe.com 60\nincarnates 60\ncandis 60\nluzira 60\ntoyota\\/lola\\/bridgestone 60\ncaohc 60\nflatbeds 60\npairat 60\nstubborness 60\nmogaka 60\nmarch-past 60\nalaba 60\nextravehicular 60\nconolly 60\nshelford 60\nsnowblowers 60\nexcoriation 60\nlangoliers 60\nayios 60\nrsm\\/rw## 60\nultralow 60\nkassire 60\nkirikkale 60\nrutaca 60\nhardys 60\nlatigo 60\naggressive-growth 60\nshankland 60\ntetrault 60\nntf 60\nbritish-u.s. 
60\nrbk 60\nhannagan 60\npro-french 60\nmacero 60\nbahrenburg 60\nrecanvass 60\nhayrunisa 60\neducap 60\ntuoh 60\nx-#-#-#-# 60\ncravenly 60\njent 60\nbritain-farm-animals-disease 60\nwhite-power 60\npongsidhirak 60\nfbc-ohiostate 60\neuro-arab 60\nprds 60\nsinkinson 60\nbaugher 60\nfpd 60\nsakombi 60\nholyhead 60\nvirusscan 60\nniboro 60\ncliffsnotes 60\ninhibitory 60\nnontariff 60\nhuaraz 60\nqera 60\nicomos 60\noff-handed 60\nlumbee 60\nkututwa 60\nbesra 60\nownit 60\nout-of-area 60\npetrozuata 60\nwielinga 60\nroecker 60\njeanneret 60\nryukyus 60\nchocked 60\nsyda 60\nrearers 60\nespecialistas 60\nstoeltje 60\ntag-along 60\npendulums 60\nland-hungry 60\nmale-pattern 60\nogbulafor 60\njemil 60\nsinglehanded 60\nbogaert 60\nbrawnier 60\npicardie 60\npatsalides 60\nzvimba 60\ntalamoni 60\naristos 60\nreductionist 60\nsung-han 60\nebby 60\ntcambanisglobe.com 60\nwell-populated 60\nboguinskaia 60\ngolfen 60\nfossmo 60\nleches 60\nmadtv 60\nmirzapur 60\ndromey 60\nmakowski 60\nbearzot 60\nfifth-day 60\ntogoimi 60\nethopia 60\nespoo-based 60\ncuppa 60\ncristin 60\nlambrechts 60\neurosystem 60\nnovember-january 60\nhome-security 60\nvengerov 60\nmajor-market 60\nobviates 60\nhorster 60\nthree-and-half 60\nsinyong 60\ncizek 60\nissyk-kul 60\ngranatino 60\nketones 60\njenista 60\nmvovo 60\nshuttlecocks 60\nbehounek 60\nnonscholarship 60\nmutaz 60\nhermandad 60\nengelmayer 60\nmussallam 60\nlutein 60\ndrag-queen 60\nindependent-film 60\nu.s.-asia 60\njiantang 60\npinkins 60\nuplifts 60\nlifsher 60\ngree 60\nsuit-clad 60\ndonavan 60\nstracke 60\nhard-to-read 60\nmerco 60\nclanks 60\nasvat 60\nkloof 60\nmarsland 60\ncaipirinhas 60\narmy-style 60\nbenhuri 60\nmokotedi 60\ntoader 60\nhanarotelecom 60\nguodong 60\ncottone 60\ndominicis 60\nosuntokun 60\nplanetall 60\nik-rjm 60\nserhan 60\ndelimiting 60\nchiaro 60\noptimizes 60\nkopay 60\nadakhan 60\ncretz 60\nliberato 60\nkandie 60\nabdur-raheem 60\nvodichkova 60\nschawlow 60\nkulivan 60\ntagtop 60\nstefanini 60\nwcec 
60\nkabocha 60\nokresek 60\nmartin-in-the-fields 60\nlawn-mowing 60\ncpds 60\nraso 60\nciubuc 60\nmatch-winners 60\neye-pleasing 60\nelorriaga 60\nmarusa 60\nford@globe.com 60\ndreger 60\n#x#k 60\nwsm 60\nhocutt 60\nmacconnell 60\nout-of-service 60\ndeep-fat 60\nbody-builder 60\nstreamline.com 60\npiatas 60\nscolavino 60\ntechnobabble 60\nwar-ridden 60\nmuch-used 60\ncolisee 60\nitty 60\nbhw 60\nundersubscribed 60\ncargraphics 60\ntourist-related 60\nfada 60\nsibomana 60\nshuger 60\nmegawatt-hours 60\nwebpad 60\nvisine 60\nrewriteable 60\nmadore 60\nfasanos 60\nweinhauer 60\nanti-mormon 60\nloudmouths 60\nbroadness 60\nquixtar 60\nfancy-schmancy 60\ngangways 60\naversive 60\nclingendael 60\njonathans 60\nn#k 60\nschwald 60\npuxi 60\nabrahamic 60\ncasebook 60\nclear-the-air 60\nislamised 60\nmesseria 60\nvacation-home 60\npro-tutsi 60\nkishkovsky 60\nportee 60\nawardee 60\nbatsh 60\nlatika 60\nbristol-meyers 60\ncollum 60\naprils 60\ntelephia 60\ncloud-shrouded 60\nchild-abusing 60\nhott 60\nbacteriophages 60\nghosananda 60\nx-original-to 60\naubuchon 60\nmaldon 60\nowei 60\nstumper 60\nghanaian-born 60\ncincinnatti 60\nspelich 60\nguoxing 60\nregulation-time 60\nscotiamcleod 60\ntesana 60\nseung-youn 60\nwen-ko 60\nstadt 60\nschroeders 60\nnorin 60\nnung 60\nbank\\/schroder 60\nrelased 60\nea-lm 60\nrubey 60\ncfi 60\nkavaja 60\nbourgault 60\nbehrakis 60\nsuraphong 60\nhomesteader 60\nwbur-fm 60\nwhee 60\nafghan-kidnappings 60\nrybin 60\nnear-starvation 60\ncrippa 60\nsanlitun 60\npro-monarchist 60\nmortalities 60\nanticoagulants 60\nunsual 60\nkaha 60\nkamte 60\nharakah 60\nscarfe 60\nkenoy 60\nsendov 60\ndepasquale 60\nmaddaloni 60\njointly-funded 60\nnokwe 60\nliff 60\nkickstarter 60\nkhakpour 60\ncribiore 60\nhusbanded 60\npedauye 60\nperrella 60\nfar-post 60\nboruchowitz 60\ninterational 60\nclose-to-home 60\nsophoan 60\nluqa 60\nlegislator-elect 60\nforgy 60\nleutar 60\nuosukainen 60\nmirisch 60\nhavin 60\nscorekeeping 60\nenglishness 60\nphilippines-landslide 
60\neconomides 60\netonian 60\nrandburg 60\naew 60\nmaillet 60\nabogado 60\nwell-aware 60\nstunners 60\nlaras 60\n#-tatiana 60\ncicciaro 60\nski-in 60\n##-billion-u.s. 60\nketeyian 60\nfirst-in-the-south 60\nfull-rate 60\npalepoi 60\nself-deluded 60\nhiv-infection 60\nrios-martinez 60\nturbojet 60\nyike 60\nradmacher 60\nstir-crazy 60\nbucuane 60\nunidentifed 60\nrichardo 60\nzygi 60\nspritely 60\nvaynerchuk 60\nmeniere 60\nspecial-ed 60\nteenybopper 60\njmckim@globe.com 60\nxade 60\ndanish-owned 60\nikuko 60\n#-chris 60\nkayapo 60\nzeckendorf 60\nsteamrollers 60\ngiroir 60\njaidev 60\neraserhead 60\njapanese-chinese 60\nreprocesses 60\nmuros 60\nmust-buy 60\ncorsiglia 60\nyanqui 60\nbasanez 60\nunplayed 60\nmimika 60\nelectric-only 60\nscerbatihs 60\nhitzig 60\nzero-interest-rate 60\npopulares 60\niphigenia 60\nnutricia 60\nsape 60\nkopec 60\niencsi 60\nalvan 60\nho-nyoun 60\nmob-style 60\nfrivolities 60\npost-ups 60\npercentiles 60\ncoote 60\nsodowsky 60\ncaglar 60\nsahaviriya 60\nleggat 60\nxueju 60\ncompagnia 60\nzomax 60\ninter-school 60\nhugeness 60\nsleaziness 60\nlangenhan 60\nbusiness-news 60\narmenteros 60\ncup-clinching 60\nyelavich 60\nchao-ching 60\nsalvadori 60\nameronline 60\nsex-starved 60\nalifereti 60\ntrosch 60\nrecombined 60\ndwek 60\nemission-control 60\npozdniakov 60\nshalaan 60\nuzelac 60\nimzouren 60\nobolensk 60\nmoshen 60\ncommodified 60\naleady 60\nbarakaldo 60\ntumwesigye 60\ngreiff 60\nwalp 60\nr-palm 60\njulkipli 60\nmanagerless 60\nyabucoa 60\nzandl 60\nself-doubting 60\nkherman@statesman.com 60\namol 60\nerck 60\nnitc 60\nhutcheon 60\nhornbills 60\nall-win 60\nheavy-weighted 60\npublick 60\nb&h 60\nkirchler 60\ncornella-el 60\nseefeldt 60\nsasse 60\nhttp://www.nra.org 60\nball-striker 60\nfip 60\nconcentra 60\ngraywolf 60\ndebbouze 60\nsnowfields 60\nrabushka 60\nwell-trimmed 60\nsofri 60\njungle-shrouded 60\nbeynon 60\nkotil 60\nsnarly 60\n##-cent-per-gallon 60\nthen-boss 60\nwakeling 60\nhaft-e-tir 60\nmajoros 60\nguglielminpietro 
60\nhasnawi 60\nplachkov 60\ncabarrus 60\nconsumer-advocacy 60\nxinhuanet 60\nkarakul 60\neight-iron 60\nrallis 60\nlighthizer 60\nvugar 60\neighth-generation 60\nunitd 60\nx-anthony 60\nmwonzora 60\nprestart 60\nnadkarni 60\nphyapon 60\ntshuva 60\nkakuryu 60\nist###-### 60\ngolf-ryder 60\ndelivered-to 60\nexcelle 60\n###.#-yard 60\nless-visible 60\nelopement 60\nkhenthong 60\ncontrollability 60\nvenzuela 60\nsoutine 60\npongpen 60\nxhelili 60\nfour-homer 60\nhis-name 60\ndunnigan 60\naustalia 60\ndjalma 60\nuher 60\nyaman 60\nkipng 60\ndebt-related 60\nvolgyes 60\nliposome 60\ndiamoutene 60\nspragg 60\nsumahadi 60\nlebov 60\nblackly 60\nmutineering 60\noverstocking 60\nryynaenen 60\nberaja 60\nhoklo 60\nuniphoenix 60\ncumbal 60\nredmayne 60\nsesana 60\ninu 60\ndaresay 60\nsalvinia 60\negoistic 60\nnellies 60\ndeep-frozen 60\nlabor-relations 60\nweegen 60\nsino-kenyan 60\ncup-winner 60\ngimlet-eyed 60\nlendus 60\nrwisereka 60\nhugh-jones 60\nshenzhen-listed 60\nsunmonu 60\nzmago 60\nlatonya 60\ntv\\/ji 60\ntroshin 60\nanalista 60\nhysteric 60\nlindsay-hogg 60\nmarmaro 60\npurvey 60\nkilovolt 60\nval-kill 60\npartership 60\nprostrating 60\nbhairahawa 60\nfoc 60\nlyonpo 60\nyeardley 60\ndetalles 60\ncamas 60\nweb-search 60\nkosminsky 60\ndefund 60\ncowman 60\nal-kebir 60\nkasher 60\nkapolei 60\nfiords 60\nbrentjens 60\nteen-pregnancy 60\nqawi 60\norquesta 60\ngenotyping 60\nnonato 60\nnssa 60\ntwyford 60\nyuling 60\npearcy 60\nsisic 60\ndefraying 60\neisteddfod 60\nbenson-pope 60\nbarflies 60\none-arm 60\nunitech 60\nwheats 60\nmid-town 60\nnon-tour 60\nfiord 60\njnr. 
60\nbge 60\nmixmaster 60\nglanton 60\noner 60\ntax-return 60\nembolisms 60\nwillse 60\nribon 60\nturbodiesel 60\nhigher-performing 60\ncotswold 60\nearly-summer 60\nromasko 60\nrulfo 60\nmoad 60\nparadyne 60\n##-sided 60\ngalili 60\nnabatean 60\nhaarlemmermeer 60\nreanimated 60\nsupo 60\nvarujan 60\ndemystifies 60\ngilgamesh 60\nbangabhaban 60\nekin 60\nctl 60\nbmv 60\nipolito 60\novsyannikov 60\narchaeologically 60\nx-michael 60\nhko 60\nstollen 60\nveloute 60\nsung-young 60\nhydroplaned 60\ncoogee 60\ntammert 60\nstambouli 60\nmakama 60\nkilted 60\ntuqan 60\ndutch-german 60\nmids 60\ntwo-to-four 60\ngavan 60\ndoorstops 60\npazira 60\nhalifa 60\nmelwood 60\ntechonology 60\nmontecatini 60\nthroat-clearing 60\navakian 60\nhirschbiegel 60\nanticonvulsant 60\ndisadvantaging 60\nsieger 60\nsealink 60\n-##.### 60\nrabelo 60\nweb-mail 60\nmaulud 60\ninchworm 60\ntabtabai 60\nrelata 60\nktt 60\ndae-hyun 60\nwitchdoctor 60\nastres 60\nyannett 60\nfixed-dose 60\ncadle 60\ntianfu 60\nnecesitamos 60\nslemrod 60\ntamil-speaking 60\nrsm\\/br## 60\npreisdent 60\neitzmann 60\nakuffo 60\nblefary 60\ncyclophosphamide 60\naprilla 60\nshare-the-wealth 60\nyagmurdereli 60\nmaumere 60\nembarek 60\nmonday-thursday 60\nostiglia 60\nseree 60\nsmokefree 60\nnerio 60\nlactase 60\nharbour-felax 60\nbelue 60\nal-jaz 60\ntamisuke 60\ntwahir 60\nsetian 60\ndmp 60\nkowske 60\nsupermercados 60\ndecalogue 60\noutpolls 60\npinballed 60\nhezbullah 60\ndaluwatte 60\nnintendogs 60\nmediocrities 60\ncercelletta 60\nlulzim 60\nyoungstars 60\nmavrou 60\ngorgonio 60\nmanie 60\noccassionally 60\nchikelu 60\nmcgettigan 60\nsinduhije 60\nprodujo 60\ncharvat 60\nonce-communist 60\ngood-tasting 60\npalexpo 60\nizak 60\ncomedy\\/drama 60\nbrusqueness 60\nanky 60\n#rd-#th 60\nergasias 60\njaxon 60\nmiletti 60\nwangui 60\nheb 60\nexurb 60\nchalices 60\nsentimentalists 60\n###-turbine 60\nvermeers 60\nwipeouts 60\nrolff 60\ndeguardia 60\ngrosbard 60\nzegra 60\nkunder 60\nstreet-racing 60\nbullington 60\nwunderteam 
60\nnon-disabled 60\nleese 60\nweishan 60\nten-time 60\nlactobacillus 60\nfunkadelic 60\nzhengyu 60\nstate-subsidised 60\nexisten 60\nnakivubo 60\ndorinda 60\npopularly-elected 60\nwmf 60\nzrenjanin 60\nal-momen 60\ncesspools 60\nyacub 60\nzhuoru 60\nzouk 60\ntarnower 60\nsweet-potato 60\nlashonda 60\nfcdu 60\ntantalus 60\nnordling 60\nheidrun 60\nswigged 60\ntramontano 60\nspa\\/fas 60\ninvicta 60\nfenzl 60\nal-najar 60\nvinyl-coated 60\nstategy 60\nenewetak 60\nfrost-free 60\ndstanford 60\nelwin 60\nkainer 60\nyugi 60\ncl### 60\npetitclerc 60\nmargriet 60\nafterworld 60\nsirbu 60\nnon-tradeable 60\nco-signing 60\nisiro 60\njelacic 60\ntonghai 60\nnarisetti 60\nwhisperings 60\npembangunan 60\nclatters 60\nkalashnikova 60\nhealth-flu-europe 60\ncross-license 60\nsarees 60\nminal 60\nretirement-age 60\nnitrogen-based 60\nsun-yup 60\nnaief 60\n#,###-day 60\nrovs 60\nkonia 60\nparasitology 60\nriz 60\nhankyu 60\nebulliently 60\nfranzblau 60\napuuli 60\nmuan 60\nghoulishly 60\nsix-gun 60\nrecondite 60\ndukker 60\nzemmouri 60\nyumei 60\nhttp://www.centcom.mil 60\none-night-only 60\nkovanda 60\nakhundzadeh 60\nworcester\\/eng 60\nmissiroli 60\nzhenyuan 60\nquarterpipe 60\nsalix 60\npetrouchka 60\nmini-treaty 60\nepicure 60\nmaintainance 60\nday-nighter 60\nghengis 60\npro-hitler 60\nhurleys 60\neditorialize 60\nhenle 60\nsonson 60\nmeals-ready-to-eat 60\npratts 60\npisetsky 60\nthonglao 60\ntestings 60\nnarcisa 60\neight-night 60\nnoster 60\ntechnology-stock 60\ndsquared 60\nnain 60\nbbn# 60\nexplosively-formed 60\nb.l. 
60\nzucchero 60\ndulay 60\nmid-spring 60\nionides 60\nprochazkova 60\ntwinsburg 60\njhollis 60\npre-####s 60\nbouyabes 60\nsafe-conduct 60\ndrug-resistance 60\npayphones 60\nyongwei 60\nlabor-law 60\ntualatin 60\nehf 60\nhirooka 60\nmortgage-bond 60\ngaztelu 60\nmitidja 60\nkondratiev 60\nchristain 60\nhttp://www.alcoa.com 60\nhand-raised 60\nshaner 60\nmedeski 60\nscansoft 60\nmulti-millionaires 60\nreder 60\nhtin 60\npaillettes 60\nintershop 60\ncarfagna 60\nbeyazit 60\nannakin 60\nscaglione 60\nesdi 60\nmikie 60\ninferiors 60\nstudent-on-student 60\nregionalize 60\nluff 60\ndemitasse 60\ncal-bred 60\ncalzado 60\nagita 60\nbalin 60\nhighest-end 60\nsinyani 60\nall-industries 60\nreuses 60\nlagrimas 60\nherberts 60\nkoves 60\nsub-surface 60\nal-anzi 60\nlonger-running 60\ngallimore 60\npluralities 60\nbenedetta 60\njiddah-based 60\ntomaselli 60\nkallestad 60\nsouheil 60\nkanso 60\nprivett 60\nkoronka 60\nnonmonetary 60\nrangsan 60\nesztergom 60\nzettler 60\nland-management 60\nindie-film 60\nlearysptimes.com 60\ndemokratikong 60\nvishakhapatnam 60\nrutba 60\npluss 60\nafricare 60\n#.##-acre 60\nshabib 60\nflorins 60\njujiya 60\ncompaign 60\ndemant 60\njosiane 60\nd'andre 60\nskagerrak 60\nazapo 60\nhttp://www.ladieseuropeantour.com 60\nwhisperers 60\nstuddert 60\noverindulgent 60\nqabazard 60\nnorcross-based 60\nex-gurkha 60\nx-seattle 60\nhejiang 60\nmcglaughlin 60\npreciously 60\ngeetha 60\ngardezi 60\nikuta 60\nattaullah 60\nhensler 60\ntree-shaped 60\nsubarctic 60\nstarman 60\nhussar 60\nverburg 60\ntupurkovski 60\njeelani 60\navidan 60\nsancha 60\npoliakoff 60\nroom-sized 60\ninstable 60\npejsek 60\nel-falali 60\nchachoengsao 60\nchinese-produced 60\nal-kinani 60\nknuth 60\nbunner 60\nshinholster 60\nsabiston 60\ndvdirect 60\ndouroux 60\nzebley 60\nsalicylic 60\netian 60\nco-organised 60\nosmanagic 60\nkhotang 60\ndoswell 60\nzfa 60\nproscribing 60\nbbn-dodgernotes 60\ndomzale 60\nchinguetti 60\njoseph-beth 60\nstirrer 60\ndeviatovski 60\nwright-designed 
60\nspin-doctoring 60\nun-sudan 60\ncontratos 60\ndecimalization 60\nmurugesu 60\ncleeman 60\nunbolted 60\ndelphin 60\ntwitter.com/gregauman 60\ndepravation 60\ngoen 60\nmeckel 60\nkerr\\/john 60\nnariaki 60\nco-ranked 60\ndouble-gold 60\n_______ 60\nbinet 60\nperpere 60\nmahbuhbullah 60\ngendun 60\nzilla 60\ngodowsky 60\nlycett 60\nceramist 60\ndamia 60\noradell 60\nkarimi-rad 60\nkreisler 60\nhard-hatted 60\nwahpeton 60\nchristianne 60\nvesselin 60\nkivumbi 60\nlife-supporting 60\nwriting-directing 60\nvalances 60\nchlorinate 60\nna-young 60\nnuclear-bomb 60\ndecomissioning 60\npayam 60\ntuleh 60\ninnova 60\nchih-hao 60\ndejohn 60\ndemocratic-farmer-labor 60\nlbs. 60\nsmall-school 60\nnur-pashi 60\nbollaert 60\ncrookedly 60\ntalloires 60\nkringen 60\ndopfer 60\n#-#-year 60\ndefeatists 60\nmadiot 60\nfull-member 60\nvyachorka 60\nhaixi 60\nsydenham 60\n#-emilie 60\nacuvue 60\ntelser 60\ncumpston 60\nsokhina 60\nbancolombia 60\ncrisis-prone 60\ngmhc 60\npbf 60\nnisi 60\nrevolutionizes 60\nazzahar 60\nmoney-draining 60\ndouble-bogeying 60\ngauvreau 60\nchabris 60\nleg-break 60\ntecbud 60\nginnifer 60\nosoria 60\nnigeria-unrest 60\nturjanzadeh 60\nself-affirmation 60\nminuto 60\nprivate-property 60\nwambui 60\nbodao 60\ngwalia 60\nvladimirs 60\nnatural-foods 60\nprolapsed 60\nvdis 60\ncullerton 60\negawa 60\naspl 60\nwindu 60\nrecession-fighting 60\nriss 60\non-land 60\nngap 60\nperjure 60\ncochaired 60\nmpumelelo 60\nfixed-wireless 60\npoots 60\nluneville 60\nhands-only 60\nbacashihua 60\nchancing 60\nmazrouei 60\nnabiyev 60\nmideast-summit 60\nbaler 60\ngenii 60\nbleustein 60\nunmixed 60\nphrenology 60\niica 60\nshevaun 60\norch 60\nzaramba 60\nchina-latin 60\nujc 60\nheeler 60\nsparseness 60\npercudani 60\nislam#uk 60\nrbos 60\ngb-lak 60\ncobbled-together 60\nhalf-owner 60\nd'entremont 60\nation 60\ntranslucence 60\nluxgen 60\nahhs 60\nposibles 60\nkulyab 60\nstalement 60\nanti-car 60\no'crowley 60\naustria-crime-incest 60\nzuazo 60\nsugarbaker 60\nrougemont 
60\nbrugmann 60\nseidensticker 60\nrambla 60\npensonic 60\ngimbal 60\nnaugahyde 60\nlongneck 60\nbumpings 60\npuffball 60\nselolwane 60\nanti-climatic 60\ncrucibles 60\npenetta 60\nfatehpur 60\nbaixada 60\nukrop 60\nudoto 60\nyulieski 60\nchidren 60\nkarwan 60\nzonca 60\nrapid-ascent 60\nsayle 60\nactuator 60\nmultireligious 60\ngotchas 60\ntorossian 60\nbonici 60\nfellmeth 60\nsmatterings 60\nglycerol 60\nbatigol 60\nadamany 60\nlongball 60\nduck-and-cover 60\nboxford 60\nflounce 60\nseabeds 60\nshui-tsai 60\nsreerema 60\nwentzville 60\nmckeel 60\necla 60\ncheol 60\ntosta 60\nstayaways 60\nqazigund 60\nzolkin 60\nlyuboslav 60\nka-## 60\nbellaart 60\nmarcovicci 60\ndream-come-true 60\nwinnemucca 60\ndelbarton 60\nfilmtec 60\nno-drive 60\nelection-reform 60\nescoto 60\n###,###-capacity 60\nsupertyphoon 60\nwakaazuma 60\nunsurmountable 60\nempcar 60\nmembers-in-waiting 60\nvirola 60\ng\\/h 60\nmanch 60\nducker 60\nkornilov 60\nmetsa-botnia 60\nmoistening 60\ncoequal 60\nacga 60\nrupesh 60\nwilkinsburg 60\nbrashest 60\nkhoramshahi 60\ninverter 60\nco-offensive 60\noscar-night 60\netymological 60\ntranter 60\neckenrode 60\nsireta 60\nsmtp 60\ndadey 60\nrehak 60\ngooneratne 60\njurie 60\namte 60\ntroyens 60\nnametag 60\nlops 60\npetach 60\nsoliev 60\n#,###-peso 60\nre-sentenced 60\nberdiyev 60\nphang-nga 60\nptuj 60\nnavickas 60\nglassmakers 60\nsingle-ticket 60\nholderman 60\nstratcom 60\nbhartiya 60\nportentously 60\nhorlock 60\n##\\/#-year-old 60\nhertog 60\nkrissoff 60\npinschers 60\ncoverlets 60\ngalaxie 60\nbarmaids 60\nharmetz 60\ncommunist-bloc 60\ncavallier 60\nrelizane 60\nstauring 60\nkousa 60\n#,###-car 60\nthobela 60\ndisavowals 60\nbistrong 60\nrelevent 60\nvibert 60\nreturn-path 60\nlamagna 60\natthe 60\nsatrio 60\nleebove 60\nsallai 60\nhealth-oriented 60\nisra 60\nbidvest 60\ngasperoni 60\ncrispo 60\ngortat 60\ngaravani 60\ntorgau 60\nikeyama 60\nkocian 60\nzhihe 60\ncatapano 60\ntaxmen 60\nquittner 60\nisraelson 60\ntroi 60\ntoevs 60\ngermany\\/milram 
60\nodintsov 60\nchango 60\nlatag 60\nbalcytis 60\nantolini 60\nxingyi 60\nnienaber 60\nshiso 60\nboskov 60\nresource-hungry 60\nrouland 60\nmpe 60\nhindu-christian 60\ndodt 60\nkoenigstein 60\nonce-rich 60\ngenerator-powered 60\nbidemi 60\nkhi 60\nkniazkov 60\ntaborsky 60\nlehya 60\nzylstra 60\nschosberg 60\nemergency-management 60\npanne 60\nclosedown 60\nfreemason 60\nhigginbottom 60\nerogenous 60\nplaatjes 60\nsolyndra 60\ncommerciality 60\ngorky-# 60\ni-opener 60\n#.##-carat 60\nalispahic 60\nporzio 60\ncummerbunds 60\ngaku 60\nborght 60\nvalicevic 60\nicier 60\notwell 60\nalighieri 60\nmilsteins 60\nmachi 60\none-stop-shop 60\nherperger 60\nhypergrowth 60\nchaffey 60\nhenare 60\npeterbilt 60\nsieben 60\nescarre 60\ncross-disciplinary 60\ntahj 60\nhomestands 60\nellenson 60\nperons 60\nvanoy 60\nhafed 60\nmingshan 60\nin-migration 60\nivancic 60\ndramani 60\nasia-middle 60\nmagoffin 60\nhertzfeld 60\n####\\/### 60\nlarena 60\nnjue 60\nhela 60\nnigerian-registered 60\nchraidi 60\nsaint-jean-de-maurienne 60\nonsale 60\ntaneski 60\nblagoj 60\npcij 60\nbomb-thrower 60\ntai-shan 60\nfertel 60\ntrasvina 60\nshtml 60\nsakaiminato 60\nkilar 60\nsifang 60\nsnorkeled 60\ngentian 60\ndito 60\ntransbourse 60\nmachinga 60\ngrowth-enhancing 60\nreclogging 60\n#-roque 60\ncmf 60\nputaway 60\nklinge 60\npitching-rich 59\nalights 59\n#-mariano 59\nshatz 59\nkoskoff 59\nsesssion 59\ndowthitt 59\nkriens 59\nrevering 59\nnanhu 59\nmenduh 59\nissers 59\ngabali 59\naristolochia 59\nmagrino 59\nsb# 59\npiledriver 59\nskywalks 59\nczapiewski 59\ncenterstage 59\nblandest 59\nprescriptives 59\nheartlessly 59\nvitalia 59\nanchorages 59\nhttp://www.nrlc.org 59\npittston 59\nhigh-payroll 59\nmakahs 59\nshobokshi 59\nbandwagons 59\nkrivenik 59\ngamey 59\nyear-plus 59\n##-billion-yen 59\nantwun 59\nhaniel 59\nlonger-lived 59\nsaheli 59\nminutewomen 59\ngray-and-white 59\nafran 59\ndysphonia 59\nanau 59\nflemmons 59\nkabushenga 59\nanti-france 59\nsimecek 59\ngreenwalt 59\nwavelet 59\nzarai 
59\nschweickart 59\niicd 59\ntrupin 59\nnuez 59\ntobiass 59\noutofthebox 59\no'scannlain 59\nhaqq 59\nwrcf##a 59\nholberton 59\ngermplasm 59\nhalide 59\nwine-colored 59\nauthories 59\nharpal 59\nhigh-emission 59\nkombu 59\ncde 59\nhorakova 59\nkinderhook 59\nocksman 59\nayd 59\nubdina 59\nsubhan 59\nmoistness 59\nkampmeier 59\nturyk-wawrynowicz 59\nkawachi 59\nnortherns 59\nmangga 59\nadv##-cox 59\nbukvich 59\nbudhi 59\nd-word 59\nphap 59\nglimmerings 59\nrecondition 59\nstuhlbarg 59\nobfuscated 59\ncollingswood 59\nthiew 59\nvarberg 59\ncalifornia-grown 59\nakbank 59\n#.#-feet 59\nesmie 59\namanya 59\nre-filed 59\nbonani 59\nbednarz 59\neconomatica 59\ntuf 59\nwater-recycling 59\nathanasia 59\njennekvist 59\nhuidong 59\nissues-oriented 59\nwoong-bae 59\nkochis 59\nair-dried 59\nbrunken 59\nhttp://www.enron.com 59\nhandgrenades 59\ndeaf-mutes 59\nesops 59\ndarulaman 59\nchanachai 59\naponavicius 59\nupper-hand 59\nbaszczynski 59\nculley 59\nncaer 59\ncalcagni 59\nbolarinwa 59\nbrumos 59\nvelha 59\nlathem 59\nsadoff 59\nvolumen 59\npm-elect 59\ntown-house 59\nbackpacked 59\n##-yarders 59\nborrelia 59\nagostinelli 59\nrundles 59\nsentir 59\nasjylyn 59\nchimalapas 59\ntetherow 59\nkinan 59\ntegan 59\ngjenero 59\ndl-hla 59\nestigarribia 59\nonce-moribund 59\nbiersack 59\nkhitan 59\nkoury 59\ncricket-ind-aus 59\nhttp://blogs.timesunion.com/mcguire 59\nmuzahim 59\nwoul 59\nsck 59\ndongzhimen 59\nfanciulla 59\narianne 59\nkoszics 59\nmeadwestvaco 59\nstaveley 59\ntimb 59\nlevs 59\nsuper-spy 59\nyaish 59\naraz 59\ntabman 59\nbasanti 59\npeace-enforcement 59\ncesaria 59\nneopolitan 59\nmispronunciations 59\ncounter-cyclical 59\ndisarmement 59\nchavhanga 59\ngayler 59\nmercedez-benz 59\nrecollects 59\nrundall 59\nchen-chung 59\nmalusa 59\ngereida 59\ndecha 59\nlemberger 59\nbakoyiannis 59\ngogi 59\ngrinda 59\njimson 59\nbradberry 59\nmervat 59\nbvg 59\njingzhong 59\nendel 59\nallers 59\nzuoyun 59\nloevinger 59\nzntb 59\nmirnawan 59\nigniter 59\nx-shaped 59\ncarlsson-paige 
59\ncobia 59\nveillette 59\nweed-killer 59\nlarbaa 59\nprn 59\nvictorinox 59\nhousehold-name 59\n##-dnp 59\nxiangning 59\nminasian 59\nfrappe 59\nozone-friendly 59\nsmita 59\nnon-biological 59\nsyllabuses 59\ntwe 59\nnorthwick 59\nivashkevich 59\nun-proposed 59\nmuharrem 59\ncalorie-free 59\nkolwezi 59\nbarbados-born 59\neu-serbia 59\nlish 59\npolyak 59\nhttp://www.nifc.gov/ 59\nkenro 59\niran-unrest 59\ncothran 59\nbasak 59\nbeezer 59\nthree-strike 59\nwilkomirski 59\ncoovadia 59\nserreqi 59\nbythe 59\nnoncompetition 59\npost-columbine 59\nminnix 59\nmickell 59\nspyro 59\nboire 59\ntopcu 59\nboureij 59\nfengyang 59\nsuntrajarn 59\netre 59\ntarsy 59\ndecors 59\nyerger 59\nnon-internet 59\nstuntz 59\nlazer 59\ncahaba 59\nsabtu 59\nmut 59\nfadden 59\neckstine 59\nwawrzyniak 59\nhip-swiveling 59\nplateaux 59\nshih-fang 59\nvarah 59\nschwertner 59\nmalfi 59\nscheeren 59\nrousselot 59\nbircher 59\ngoldsberry 59\ncharteau 59\nsullenness 59\nomoro 59\nalipov 59\ntax-cutter 59\ngayer 59\ncortinovis 59\nkettleman 59\nremodeler 59\nhard-punching 59\nmagomedali 59\ntelesleuth 59\nsmajlovic 59\nnkem 59\nsea-green 59\nmorobe 59\nnodia 59\nbiss 59\ndawan 59\nabolfazl 59\nalberton 59\nfaial 59\npicardi 59\nmuchall 59\nno-change 59\nenfranchisement 59\nkijevo 59\nsalpigidis 59\nimjingak 59\nanfrel 59\nmubarek 59\nchudy 59\nbroadway-notes 59\nnear-miraculous 59\nlefse 59\ncacace 59\ncoffield 59\nout-of-hand 59\nnuhanovic 59\nchelule 59\njumagulov 59\nelasticized 59\nnon-existing 59\nsperl 59\njunri 59\nmanders 59\nsubstructure 59\ncahaya 59\nsegamat 59\nkharas 59\ndds 59\nsnitched 59\nmonforts 59\nmarantha 59\ngreen-blue 59\nsukawaty 59\ntroutt 59\nasics-cga 59\npart-ownership 59\noline 59\nleanna 59\nmerl 59\nrockhopper 59\nshakiba 59\nparlez-vous 59\ncostilla 59\nindustry.net 59\nlogvinenko 59\nkaforey 59\nfinnish-born 59\nerror-plagued 59\nrodnina 59\nsteel-mesh 59\nthrushes 59\nnesvig 59\narcapita 59\nwellstream 59\nnaushad 59\nenad 59\nmarxist-leninists 59\noptimising 
59\njl-bg 59\npecina 59\ncalamine 59\nbrinner 59\nkeion 59\nhatra 59\nsobota 59\nfifita 59\nakayeva 59\n#-million-acre 59\n##-marcelo 59\ncurto 59\nbattallion 59\nsocko 59\nparviainen 59\nall-southeastern 59\nekland 59\nchinotimba 59\nescamillo 59\nwartelle 59\nbarely-there 59\nbookbuilding 59\nbrightly-lit 59\nkopeck 59\npno 59\npelvises 59\nscuffs 59\nterminix 59\nsnow\\/cloudy 59\nperraud 59\nintroversion 59\nthielemans 59\nayalew 59\nsardo 59\ndebunkers 59\nsnow-topped 59\ninuktitut 59\nstraberg 59\nkeech 59\nvoter-id 59\nleoncavallo 59\nharmes 59\nclay.robison@chron.com 59\nsujeeva 59\nimage-maker 59\nleet 59\nwister 59\nfour-tournament 59\ntwo-games-to-none 59\nconsuela 59\nprokhorova 59\nokin 59\ncholakis 59\npolicharki 59\ngolf-ball 59\nbischof 59\nlanguorously 59\nbracanov 59\njoellen 59\ncoryo 59\nparty-hopping 59\nhangchow 59\nth# 59\nloetschberg 59\nyuxia 59\nanthrax-related 59\nradio-ready 59\nfethiye 59\nshak 59\ncurrent-generation 59\ngrotowski 59\npakorn 59\ngaronne 59\nmknobler@ajc.com 59\nburyak 59\ngreen-minded 59\nbarril 59\nyamaki 59\nmacheyo 59\nwork\\/life 59\nfessler 59\nrheingau 59\nwahdan 59\nhugel 59\nconoley 59\nhellcat 59\npoydras 59\nkazoos 59\ngollogly 59\ndiscoursing 59\npungue 59\ntwilson 59\naivars 59\ntorrelavega 59\nassadollah 59\nornskoldsvik 59\nsyaifudin 59\ndoshisha 59\ndifazio 59\nportioned 59\nesap 59\npop\\/contemporary 59\n#the 59\nbelluscio 59\nafficionados 59\nshargin 59\ncaulks 59\nbarnesville 59\nal-mahmud 59\ntopalli 59\navowal 59\nskil 59\njeschke 59\npieke 59\np&j 59\nmatillano 59\nsoft-loan 59\nsoon-to-be-former 59\npittle 59\ntransposing 59\nparmelee 59\nfessel 59\nobudu 59\nisland-republic 59\ncoutry 59\ngolpe 59\nvakhayev 59\nvllaznia 59\nkarzai-appointed 59\ninvs 59\npost-abortion 59\nchianwala 59\nennoble 59\nneradko 59\npujobroto 59\nchikowi 59\nansah 59\nlitang 59\npalaghiaccio 59\neuroepan 59\ny-net 59\ncnpp 59\nsteroid-distribution 59\nmaulawi 59\ngrandison 59\nkamungozi 59\ntonneau 59\ntorchings 
59\nfour-candidate 59\npallium 59\nkljajevic 59\n#.##-liter 59\nbuy-to-let 59\nfanzhi 59\nsunanda 59\ntaymour 59\nremanufactured 59\nschopenhauer 59\nanti-syrians 59\nanti-affirmative-action 59\npetrochemia 59\nkptc 59\nlobohombo 59\nmongomo 59\njanss 59\nsabharwal 59\nlandholding 59\n##-per-month 59\naftertax 59\nsardjoe 59\nplacings\\/standings 59\ngarrec 59\njohnson-freese 59\nyablonsky 59\nstojkov 59\nwenzao 59\nwater-conservation 59\nlawner 59\nfront-loader 59\nanaheim-based 59\ndoulos 59\ndimapur 59\nvashchuk 59\nyorked 59\nmewar 59\nfive-pointer 59\nwarrell 59\nndiwa 59\nshrage 59\npoblanos 59\nblack-run 59\nauslese 59\nbanducci 59\ncarless 59\nsuhardjono 59\nextroverts 59\nbavarian-style 59\ncalorie-laden 59\nsportbild 59\nrobo-signers 59\nbucchi 59\nbourne\\/victor 59\noesterle 59\nrenda 59\nmortgage-market 59\nhasibul 59\nbohren 59\nhuman-computer 59\nhalf-moons 59\nal-kadhimi 59\nreguera 59\nachike 59\nschnee 59\natje 59\nesp\\/eus 59\nlamplight 59\nsolimoes 59\nloua 59\nelkan 59\nvitamin-enriched 59\ndeskovic 59\nburgmans 59\nmiaa 59\nascorbic 59\nspain\\/festina 59\n#-##-### 59\nagro-industries 59\nnegociaciones 59\nskirt-chasing 59\ngph##bb 59\nminister-counselor 59\npenhall 59\nhat-in-hand 59\nautograph-seeking 59\ntemodar 59\nothmani 59\ncalida 59\npanichgul 59\nblameworthy 59\nburkitt 59\nfoppert 59\noptionsxpress 59\nr-salem 59\nonassio 59\nteruyuki 59\ncalstart 59\nfitzcarraldo 59\ntetrazzini 59\nbohme 59\ncarinii 59\ndevilliers 59\nartemyev 59\nbozkurt 59\npacificor 59\nkomisar 59\nweijun 59\nzehri 59\nchalle 59\nkingi 59\nzurcher 59\nsvansicklersptimes.com 59\nziso 59\npersonal-foul 59\nmyanmar-language 59\nitalian-french 59\nyung-ping 59\nsadbhawana 59\nwaywardness 59\nlockard 59\nmonib 59\ncontrolee 59\nmaroga 59\npublic-listed 59\nembrey 59\nched 59\nfive-power 59\nsaynudin 59\nqasemi 59\nnuzum 59\nmicrodrive 59\nre-investigation 59\nconflicto 59\ndasanayake 59\nxynthia 59\nwormholes 59\nbaoming 59\nrappleyea 59\necotech 59\npre-oscars 
59\nminx 59\nyator 59\njobbed 59\npeccadillos 59\nsocieta 59\nsadriya 59\nwackness 59\na.b 59\ndychtwald 59\nlibancell 59\ntwo-hundred 59\negs 59\ncontrada 59\ninterferring 59\nshajiao 59\nzeidane 59\ntengger 59\nsrinagar-muzaffarabad 59\nkurosaki 59\nyidong 59\nnchito 59\nabdille 59\ntid 59\nraw-vid###### 59\nski-world 59\nblockings 59\ngedeck 59\nlocalhost 59\ndooky 59\nkhafagy 59\ncoppock 59\nnon-gay 59\nsimec 59\n#,###-yuan 59\ndementias 59\ngalatioto 59\ntitter 59\nnagyvary 59\nayahuasca 59\ngordon-conwell 59\nalberto-culver 59\nhoutan 59\nwaspish 59\nunimaginatively 59\ndead-serious 59\ncontrovery 59\ningels 59\nmaxton 59\nforefingers 59\nibata 59\nticknor 59\nlazne 59\ngidel 59\nblache 59\nschormann 59\ndrop-by 59\nnationalbank 59\nshuanghe 59\nfreetel 59\n###-###-#-###-#### 59\naveritt 59\nanti-earthquake 59\nmacefield 59\ntrali 59\nwitbooi 59\npatru 59\nunhappiest 59\npervomaysk 59\nrealy 59\napplera 59\nest\\/a#r 59\nutada 59\noil-trading 59\nal-sammarai 59\nmuria 59\ntrapasso 59\ndiefenderfer 59\nmovie-quality 59\nmaslova 59\ngedz 59\nbright-yellow 59\nal-fahal 59\nrazziq 59\ncorbelli 59\ndecanted 59\nblack-and-silver 59\ncubanos 59\ntowelettes 59\nhttp://www.weforum.org 59\niia 59\nsirba 59\ncads 59\nkapetanos 59\ndisses 59\nltc 59\ndebbasch 59\nft.com 59\ngraito 59\nstaletovich 59\ndebreu 59\nmerisant 59\nnear-collisions 59\nlokuge 59\nsoftware-maker 59\nstephanus 59\nduba 59\nnovik 59\nmayombo 59\nsalvato 59\nmets-yankees 59\noskars 59\ntitmice 59\nnikolovski 59\ndeanda 59\nrestrictionist 59\nkhadduri 59\nonterrio 59\nthodey 59\nperroud 59\nfoping 59\nagboville 59\nisin 59\nwednesday-thursday 59\nohmori 59\njong-chan 59\nacocella 59\nbuggery 59\nhouphouet 59\nsanofi-pasteur 59\nxxxxxxxxend 59\nohel 59\nsuburbanized 59\ntekeda 59\nwell-proportioned 59\ncounter-punch 59\nselyem 59\ndawlat 59\negas 59\nthree-punch 59\nfouke 59\ngas-to-liquids 59\nedan 59\nhollopeter 59\nhintsa 59\naddenbrooke 59\nmountaga 59\nzogu 59\nbxe# 59\nbushinsky 59\nmartinage 
59\nkaufer 59\npingeon 59\nmazahir 59\nshao-haei 59\nbriner 59\nnemea 59\nyork-headquartered 59\nanytown 59\nkleptocratic 59\nnon-supervisory 59\naix-les-bains 59\nconocer 59\nheit 59\nreinhilt 59\nclear-cuts 59\natong 59\nhaefner 59\nal-taweel 59\nsholokhov 59\nnezavisne 59\nkamoga 59\nmonsalve 59\nzanussi 59\nkaller 59\nfidal 59\nmoloi 59\nneck-to-ankle 59\nshirvani 59\nnon-saudis 59\nkharg 59\nc-murder 59\nfingernail-sized 59\ngjomle 59\npiccioli 59\ntheater-in-the-round 59\nhonnold 59\nshingirai 59\nvending-machine 59\nmini-applications 59\nkrusty 59\ne-services 59\nadressing 59\ntaht 59\nzacharatos 59\ncoupler 59\naparri 59\nedenton 59\ndicke 59\none-###th 59\nflamme 59\ngladiolas 59\nmcnee 59\ncarnero 59\nwww.caiso.com 59\ngrabar-kitarovic 59\nfelsen 59\nmemes 59\nreinjected 59\nedwar 59\nfavier 59\nyokomine 59\nslamed 59\npoll-watchers 59\nazizulhasni 59\ngracin 59\ntank-automotive 59\nhalvari 59\ngloversville 59\naleksejs 59\nlouisianan 59\nclerkships 59\nsharapov 59\nill-chosen 59\nclerestory 59\nnasa-funded 59\nminella 59\nkarnaugh 59\nphoned-in 59\ntaibe 59\nschondelmeyer 59\nd'arby 59\nu.s-china 59\none-hitters 59\nchephren 59\ncaru 59\nbachleda 59\nnategh-nuri 59\ngaitley 59\nkiecolt-glaser 59\nwatmore 59\norangeman 59\narch-villain 59\nhttp://www.amd.com 59\npowderject 59\ntoker 59\nstubbled 59\nohnishi 59\nrequesters 59\nhelicoptering 59\ntahoes 59\nmorong 59\nrahmonov 59\nledanois 59\nborodulina 59\nspringmann 59\nkeynotes 59\nsaith 59\npallidotomy 59\nlusso 59\nshabaa 59\nbyrdak 59\nvance-owen 59\nibertsberger 59\nmassih 59\nartyukhov 59\nlip-synch 59\nbenziman 59\nschaer 59\nyormark 59\nzuanazzi 59\ndejohnette 59\nmaternally 59\ngyurkovics 59\nhaleiwa 59\nunprecented 59\ntotus 59\ndshir 59\nyeakel 59\nlavecchia 59\nturbo-generators 59\nsanoma 59\nsimiyu 59\nhmnzs 59\nsabagh 59\ndecoratifs 59\nlehder 59\nsuryo 59\nsamaraneftegaz 59\nkingfishers 59\nquasi-socialist 59\njurnee 59\nfischbein 59\nmaotai 59\nturgot 59\nkarimirad 59\nosnes 59\nmargarian 
59\nkrul 59\nraho 59\nbibring 59\nshaktoolik 59\nwendler 59\nbaim 59\nhellwig 59\nsugarcane-based 59\nal-shaar 59\nsuomi 59\nguittard 59\namirov 59\nchafer 59\nreconnoitering 59\nre-forming 59\nthweatt 59\ntax-writers 59\noladapo 59\nsidelocks 59\nbedaux 59\naravind 59\nfruit-growing 59\nkoror 59\nlight-year 59\nadat 59\nutm 59\nkeenans 59\ngrecia 59\npalestinian-jordanian 59\npro-choicers 59\nliuhua 59\nsujit 59\nmilchan 59\nweather-philippines-typhoon 59\nadditonal 59\nleukemias 59\nhttp://www.continental.com 59\nparrothead 59\nafghanistan-unrest-taliban 59\nre-submitted 59\ncup-champion 59\njitloff 59\ngungoren 59\nal-shall 59\nphotochemical 59\ngrandee 59\nchi-chao 59\ncash-raising 59\nnanri 59\npro-prosecution 59\nmaicosuel 59\nplourde 59\ntarell 59\nlaquidara 59\nharkonen 59\nmicrodermabrasion 59\nshootarounds 59\ncryopreservation 59\ncockfield 59\nyb\\/jh 59\nphromphan 59\nmaxa 59\nflatliners 59\nextranjera 59\ncalaio 59\ntjh-rjm 59\npost-performance 59\napriantono 59\nrawn 59\nhumadi 59\nlupsha 59\nbikita 59\nroad-rail 59\nmantes-la-jolie 59\njianyang 59\nshaweesh 59\nmigliaro 59\nlarrikin 59\nmuju 59\ndelahunty 59\nno\\/okla 59\ngracen 59\npenalty-filled 59\nkitui 59\nernsting-krienke 59\nretela 59\nntust 59\ncordel 59\nverkhovsky 59\nhatha 59\nmeat-eater 59\nmusker 59\nless-publicized 59\ncityu 59\ntla 59\nfancying 59\npankin 59\nsimasiku 59\nspinaway 59\nhedija 59\nmersenne 59\ndunshee 59\ndalipi 59\nducale 59\nbruene 59\narmy-owned 59\ngreatorex 59\nkarsiyaka 59\nmartinu 59\nshamala 59\nbrimfield 59\natlantics 59\nutuado 59\nconneally 59\nhye-youn 59\nsharpstown 59\nsignore 59\natlanta-journal 59\nnegoesco 59\nplotnikov 59\nlinscott 59\ndespereaux 59\nligang 59\nsurvivals 59\nokolski 59\nfang-yue 59\ndistrigaz 59\nborana 59\nconable 59\nsorest 59\nrompers 59\nbatubara 59\nkencell 59\nibi 59\ntin-plated 59\nshlain 59\nsoybean-based 59\ntallberg 59\nfibras 59\nselda 59\ninter-dependence 59\nchunkier 59\ncollege-jewish 59\nchiune 59\nbredahl 
59\nhalf-starved 59\ndymchenko 59\nsuspectedly 59\ndimitrakos 59\njf-## 59\ndury 59\nmcgivney 59\ncakl 59\n###-storey 59\nkishor 59\nvionnet 59\ntorlen 59\ncciced 59\ncar-bombs 59\nmomsen 59\ncoach-class 59\nbadmouthed 59\nliebermans 59\nsmain 59\nsun-loving 59\nsqueaky-voiced 59\nthree-pack 59\nprimex 59\nremixer 59\nwell-articulated 59\nrun-first 59\nsecret-agent 59\nwismar 59\npendulous 59\nagro-technicians 59\nsupercedes 59\nconcepto 59\nback-to-the-land 59\nshamoon 59\nisnt 59\nvergina 59\nkolesar 59\ndelpierre 59\nvolchkov 59\nbaixa 59\npentacostal 59\nmpca 59\nrwindi 59\ntechnimetrics 59\nbeijing-hangzhou 59\ndapa 59\ndivisionism 59\nburcham 59\nbean-counters 59\nserravalle 59\nagni-ii 59\ntexana 59\ntwo-masted 59\nyumen 59\nman-of-war 59\nwell-nourished 59\nktvu 59\nbrialy 59\nvlashi 59\npalanggoon 59\nessi 59\ndreamlife 59\nculbreath 59\ngosingan 59\ntejinder 59\nmillion-to-one 59\nferozabad 59\nmovsisyan 59\nbirdie-eagle 59\ntaeb 59\nbiomolecules 59\ncynara 59\nmogas 59\nhalbrook 59\nfinstrom 59\nupmc 59\nactivist-journalist 59\nstate-guided 59\nscitrek 59\nnewyork 59\nrosenbauer 59\nbraker 59\nbds 59\nchain-smokes 59\ndivyang 59\ncinemania 59\nnorwegian-based 59\nbowlegged 59\ntosoh 59\nisbin 59\nstamatis 59\n###zx 59\nstroem-erichsen 59\nre-organized 59\nmoughniyah 59\nmulpuru 59\nnaeringsliv 59\nzitko 59\ncaudillos 59\nt-systems 59\nal-shayea 59\nawesomeness 59\ncityvote 59\ndesanctis 59\nluckenbill 59\nwww.tsa.gov 59\nquatrano 59\ntetuan 59\nseven-million-dollar 59\nbienville 59\nabou-treika 59\nakutagawa 59\nlasn 59\nouseph 59\nbrand-conscious 59\nultra-cool 59\nsupporting-acting 59\nvivar 59\nafanasyeva 59\nraes 59\nfaliron 59\nsoru 59\njobski 59\ndodonnell@nycap.rr.com 59\nscadden 59\narditi 59\ndoornekamp 59\nnexabit 59\nlivetv 59\nduplin 59\nendurance-booster 59\nradionuclides 59\nvolmar 59\ngittings 59\ncoile 59\novertown 59\nlesi 59\nagarwalla 59\nbrooksley 59\ntrifonov 59\nbizera 59\nsino-chilean 59\nip-based 59\ncondren 59\nhalf-size 59\nkwu 
59\nkangol 59\narlys 59\nbinstock 59\nlibyan-backed 59\nrayed 59\nalimentary 59\ngrained 59\natoki 59\nnegations 59\noasen 59\narab-mediated 59\nmozena 59\nserioux 59\nbjervig 59\nhot-spring 59\ngruet 59\n#-inch-deep 59\nlinker 59\nglycine 59\nchimere 59\nhamoked 59\nsmyrek 59\nmonkish 59\nperaliya 59\ngorospe 59\neight-month-long 59\ngalwey 59\nmastitis 59\ngoupil 59\nmegaresort 59\nnutraceuticals 59\nransford 59\nsubterfuges 59\npronostica 59\nreinspection 59\nho-yeol 59\napperson 59\nthanomsak 59\njadideh 59\nwevill 59\ncepelova 59\naera 59\nkuehbacher 59\nntumba 59\niphitos 59\nyanis 59\nisolda 59\no'shanter 59\nscheuring 59\nblack-listed 59\ntan-colored 59\nkayhian 59\npro-german 59\nstiletto-heeled 59\nmastan 59\ndhamar 59\ncreisson 59\ntarasenko 59\ngetahun 59\npordenone 59\nhentunen 59\nunderlayment 59\nal-muslimeen 59\ngardy 59\nleukocyte 59\nthermography 59\nmatchpoint 59\nhongyi 59\nobertan 59\nprimeeast 59\ncastellvi 59\nisitolo 59\npro-competition 59\nlaryngeal 59\nxianghe 59\neurozone-imf 59\nnigeria-oil-unrest-kidnap 59\nnader-camejo 59\nkyrastas 59\nseltsovsky 59\nspain\\/astana 59\nberan 59\nfoudre 59\ngaudier 59\nfrance\\/cofidis 59\ninterenergoservice 59\ngaar 59\namberleigh 59\nshoora 59\nstehr 59\ndehoyos 59\nwait-listed 59\nmeuleman 59\nkennerly 59\nyue\\/niu 59\nananiev 59\nkomrskova 59\njmh 59\nxianfeng 59\nxg### 59\nrane 59\niturriaga 59\nbely 59\nsekondi 59\ngentex 59\naleisha 59\nclenched-fist 59\nbaraybar 59\nangolite 59\ncamp-out 59\nquandry 59\nphymatrix 59\nsidecars 59\nhenoch 59\nsubandi 59\nbaldin 59\nmacia 59\noynes 59\npre-registered 59\nlanne 59\ngaritano 59\ndebarked 59\nduval-scott 59\nmalambo 59\nhernreich 59\nsankofa 59\npracharaj 59\nbusies 59\ncly 59\nrassul 59\nkeobounphanh 59\nper-year 59\ncovenas 59\non-wine 59\nnederlandse 59\nnontitle 59\ngoave 59\nfuchsova 59\nmonroes 59\nnadzmi 59\njoerres 59\nbogere 59\nshanhua 59\nszalay 59\nduf 59\ngate-crashers 59\nchidamabaram 59\nkirchpaytv 59\nruweished 59\nandjelko 
59\nostrager 59\natp-monte 59\nbridgers 59\np#c 59\ndamba 59\ndaisey 59\nnon-insurance 59\nfrappuccinos 59\nmissile-equipped 59\ncunneyworth 59\nkostiantyn 59\nzubr 59\nkimmi 59\ngatorland 59\nwaxen 59\nsonke 59\ngramm-rudman 59\nrabon 59\ncumani 59\nhirshson 59\nharouna 59\nmulti-user 59\nmccleave 59\nnemerov 59\nejegayehu 59\nrivenbark 59\nless-privileged 59\nrotundo 59\nduangchalerm 59\nspeechmaker 59\nching-piao 59\nunderachieve 59\nbakala 59\nsuweidi 59\nadriaenssens 59\nautobytel 59\nwillimon 59\nclean-out 59\nmazdas 59\nmochida 59\nvolkogonov 59\njasen 59\nwaple 59\npodlodowski 59\ncardia 59\ntraffick 59\ncarpentaria 59\nharrad 59\nforadil 59\nzaveryukha 59\nrueda-denvers 59\nesperan 59\ndavis-monthan 59\nover-aggressive 59\nhuracan-tres 59\nredaction 59\nbegles 59\nkupusovic 59\ngoskomstat 59\ncents-per-share 59\nslopped 59\n#-paolo 59\n#-paola 59\nshafiullah 59\ngold-digger 59\nrawle 59\ngarnishment 59\naguta 59\nchartchai 59\ndebevec 59\nfirst-wave 59\nhaidt 59\nneider 59\nbsheets 59\nsheja'eya 59\nomofuma 59\njk-hla 59\njila 59\nfomca 59\nkozel 59\nphuntsog 59\nsoc-intlnotes 59\nneils 59\njebril 59\nxiaojie 59\nenrica 59\nzhare 59\nsainvil 59\nfsh 59\nfsg 59\ndbu 59\nbocalandro 59\nterror-stricken 59\nnine-plus 59\nhealth-guestworkout 59\nadie 59\nsissinghurst 59\nsubglacial 59\npraefcke 59\nover-the-head 59\nzerlentes 59\ngeosystems 59\nmescheriakova 59\napolloni 59\nbatat 59\nshortest-serving 59\nred-dirt 59\nnaziunalista 59\npellucida 59\nnear-deserted 59\nolanzapine 59\nonce-beautiful 59\nwoon-kwong 59\nfixit 59\nreekers 59\nsurgically-repaired 59\nhatipoglu 59\nmannichl 59\nlatessa 59\nnaturists 59\nberquist 59\nveldakova 59\ndetabali 59\nsleep-related 59\nasean-us 59\nhiscock 59\nmuch-reviled 59\nair-ground 59\nonce-taboo 59\nromilly 59\nbossem-levy 59\nhealth-sars-taiwan 59\ndymock 59\nlynge 59\npiranesi 59\nconsistencies 59\nnelson-bond 59\nshusaku 59\nvax 59\nrhythmical 59\nall-premier 59\njorquera 59\nshikhar 59\ncamel-colored 59\nphilomene 
59\nvancomycin-resistant 59\neinaugler 59\nmutrif 59\nmartitegi 59\nneustar 59\nganzuri 59\neufaula 59\nquichua 59\nthundercloud 59\nex-christian 59\nhavisham 59\ncurcic 59\ntalledega 59\nsung-man 59\ntr\\/vls 59\nmalgieri 59\ncollege-entrance 59\nbourkoff 59\ngambol 59\nswick 59\ncalvinism 59\nghettoization 59\nanti-rollover 59\nkamark 59\nforded 59\nrepurpose 59\ngupte 59\nathans 59\npolhemus 59\nkeshia 59\nramachandra 59\nkl-gm 59\nguolla 59\nbrazil-plane 59\nporato 59\npoms 59\npost-trauma 59\nubud 59\nsemi-circular 59\nblowhole 59\ninflation-related 59\nflori 59\nbarhoumi 59\nbarolos 59\nmadritsch 59\nkulis 59\nweerts 59\nsardis 59\nsaravanan 59\nkfda 59\ngrapefruit-sized 59\nmucke 59\nsaifudin 59\ntime-slot 59\nxinli 59\ndownwinders 59\nrotcheva 59\nhemdan 59\nbrasses 59\ncowpox 59\nmushota 59\nbounder 59\nex-lax 59\ntibs 59\nnarcocorrido 59\nsiok 59\nfinal-lap 59\nholczer 59\nkresse 59\nibtisam 59\nemy 59\nal-kurd 59\noutside-the-beltway 59\nlegitimises 59\nllamo 59\nincendiaries 59\ntopoff 59\nasloum 59\nsabelli 59\nturkey-based 59\nkrupnikovic 59\nlezion 59\noutswinging 59\npesach 59\nmercury-free 59\nbaerbel 59\npranav 59\nkhokhlova\\/sergei 59\nprimecap 59\nlukko 59\nbttb 59\nsemprun 59\nzohur 59\nfbc-usc 59\nulrica 59\nround-ups 59\nmaastrict 59\ncharbroiled 59\nson-in 59\nsegat 59\ntransfield 59\ngovernment-rescued 59\ntryggve 59\nodling-smee 59\ncanelas 59\nswd 59\nbapu 59\nxcp 59\nkeulder 59\nhaipe 59\nal-hedoud 59\nefthymiou 59\nyalman 59\nratsirahonana 59\ntoo-strong 59\npolonaise 59\naramburuzabala 59\nnajafabad 59\nplaited 59\njailbait 59\nitagui 59\nco-rookie 59\nthonier 59\niason 59\nlow-probability 59\nligocka 59\ntetzlaff 59\nglassblowers 59\neverard 59\nettienne 59\ngranlund 59\nbever 59\ncarematrix 59\nvotevets.org 59\nghinwa 59\namoussouga 59\nbrigade-size 59\nmanats 59\nmrazova 59\nkupelo 59\ndekay 59\nvanzekin 59\ncalixte 59\npaktin 59\ngivry 59\nchocolatey 59\npoverty-eradication 59\nscandal-prone 59\nel#l 59\ngood-for-nothing 
59\nsailosi 59\ntuguegarao 59\nstruk 59\nvancura 59\njean-mary 59\nreemployed 59\ndurling 59\nxinhua-run 59\nchadd 59\nover-expansion 59\nmiyamura 59\ndiani 59\nblinov 59\nstick-swinging 59\nsparxxx 59\nblagoi 59\nflohr 59\ncasselman 59\nmagnifico 59\ntemerlin 59\nkirm 59\nbambu 59\ngohlke 59\nmaniglia 59\nkamphuis 59\nmoodily 59\nkilak 59\nanti-church 59\nleat 59\nserap 59\ndanish-based 59\nbagless 59\nbluford 59\ntsygurov 59\ntahmasebi 59\negomania 59\nu.n.-demarcated 59\ndm-pyg 59\ncrianza 59\nghafari 59\nmottus 58\ndjamil 58\nrazor-close 58\nsetola 58\ncortulua 58\nturvy 58\ngechev 58\ngessel 58\nayamas 58\neberl 58\nleys 58\nbeating-heart 58\nkhadjiev 58\nvbac 58\nsteber 58\no'liney 58\nchizuko 58\ntransgressor 58\nbalcavage 58\nnon-voters 58\ngoligoski 58\nuddi 58\nphiladelphia-born 58\nschlomo 58\ngamila 58\nherranz 58\nd-ram 58\naerating 58\nheathwood 58\njazz-pop 58\nswindlehurst 58\ndabagh 58\nsportfive 58\njitter 58\npeelle 58\ngeste 58\ntuitele 58\nlimited-liability 58\nh### 58\nheilmann 58\nonyancha 58\nholobyte 58\nppt 58\nicelike 58\nboru 58\nayala-cornejo 58\nsinger-bassist 58\nseifullah 58\nsonrisa 58\nbogacheva 58\ndassey 58\nx-muttiah 58\nimmigrant-friendly 58\npost-all-star 58\nphysically-unable-to-perform 58\nalbergo 58\ncausado 58\ntriple-triples 58\nkalhammer 58\nbranquinho 58\nwhang 58\ncogbill 58\nrossier 58\npitztal 58\nbreadline 58\ndollar-pegged 58\nruffing 58\nmchinji 58\nupperhands 58\nstanely 58\nbanpro 58\nstereophonic 58\nspaz 58\ndumeisi 58\nsoissons 58\nevaluative 58\nvashti 58\ntransouth 58\nalinea 58\njust-retired 58\n##-norandrosterone 58\nethanol-based 58\nblauensteiner 58\nmwaba 58\nchep 58\nintercutting 58\nequal-pay 58\nvalencia-based 58\nchambeshi 58\nabsorptive 58\nbramson 58\nhallo 58\ndiatoms 58\nal-nassiri 58\nleviton 58\ntachi 58\nmiscount 58\nanthuenis 58\nbrockington 58\ngiuffrida 58\nscreenvision 58\nlimandri 58\nkaliopate 58\nweather-stripping 58\npauwels 58\ntangaroa 58\nmelanogaster 58\nbewilderingly 58\nboasso 
58\ndeputise 58\nchinese-north 58\nuriri 58\nshibutani 58\nranya 58\naramin 58\nantonacci 58\narab-kurdish 58\nsazanovich 58\nmassachusetts-amherst 58\netti 58\ntandil 58\nelectrotechnical 58\nriqueza 58\nchudasama 58\nleksand 58\ntwo-pointer 58\nmariangela 58\nkeye 58\nnikolaev 58\nsl# 58\njamat-ud-dawa 58\nliener 58\nashford.com 58\nparknshop 58\nquapaw 58\ntwo-pack-a-day 58\naderhold 58\nburing 58\nunrolls 58\ntwo-iron 58\nr-miss. 58\nfazilah 58\nokam 58\nkabuli 58\nregia 58\nsekula 58\nre-deployed 58\nv.v. 58\nsporleder 58\nvigiano 58\ncahal 58\nsinsuat 58\nshortenings 58\ndetargeting 58\nsportspeople 58\nta-## 58\nimmunosuppression 58\nkongying 58\nrepack 58\nhuster 58\nus-university 58\neco-tourists 58\nslip-sliding 58\nwucker 58\ngreece-finance-economy 58\nshvetsov 58\nsn-mpm 58\njohn-patrick 58\nenought 58\nverbard 58\nsconce 58\nelectro-mechanics 58\nbudhia 58\nzuckerbrod 58\nkadriu 58\nbiavaschi 58\npentathletes 58\nnimani 58\ndezso 58\ncottesloe 58\ngrosh 58\ncroupiers 58\nvuillermin 58\nlipcsei 58\nreveiz 58\nkolesnik 58\nneeb 58\nmccullagh 58\nstand-still 58\nanimistic 58\nllave 58\noh-so 58\nsavannahs 58\nobledo 58\nston 58\nziyuan 58\nsafety-first 58\nmultiple-listing 58\nnadzeya 58\nearth-observing 58\nlacquers 58\nfazl-ur 58\nbribers 58\npass-catcher 58\nlasek 58\ncyprus-un-talks 58\nbyzantines 58\ncomposer-in-residence 58\nprinceling 58\nbataga 58\nas-yet-undetermined 58\nsumarno 58\nwuchuan 58\ngatty 58\ngobdon 58\nfarsi-speaking 58\nonewest 58\nkleeman 58\nmbt 58\nandre-joseph 58\nhaggett 58\numd 58\ngoggle-eyed 58\nthen-popular 58\ngarity 58\nufundi 58\nchangsheng 58\nhercegovacka 58\ngasparini 58\nruyan 58\ntanerau 58\nassault-rifle 58\nvohs 58\ndimmesdale 58\ngun-for-hire 58\nlokomotive 58\ngreiss 58\nafsar 58\nhersley 58\nkruma 58\nmultiforme 58\nolkhovsky 58\nmeridiana 58\npannella 58\nreduced-rate 58\nleebaw 58\nwebman 58\nkacha 58\noil-like 58\nniello 58\nbrierton 58\nciliberto 58\naercap 58\nlarin 58\ncaze 58\ncongo-fighting 58\nak-chin 
58\nsaengchai 58\nreichsmarks 58\nmutability 58\nredistributive 58\nmuons 58\nproyas 58\nagajanian 58\ncity-by-city 58\nnageikina 58\ndallam 58\npb&j 58\njuacevedo 58\npharmacologists 58\nnon-moslems 58\nmukri 58\njavasoft 58\npre-op 58\nkrezelok 58\nthird-and-short 58\nwahhabist 58\nfarve 58\nbluejays 58\nsergius 58\ndorouma 58\nneukirchen 58\nundulated 58\nserlenga 58\nunsan 58\nal-nueimi 58\nblood-testing 58\nplait 58\nwilhide 58\nfour-floor 58\nu.s.-flag 58\nmushier 58\ncatto 58\nnapster-like 58\nnightstands 58\nsalvar 58\nvorontsova 58\ncriticos 58\ngrantmakers 58\nslader 58\ngholam-reza 58\nmagnifique 58\nvolen 58\nchishui 58\nsunalliance 58\nindoor\\/outdoor 58\nheidgen 58\nlule 58\nbelot 58\ncheng-yuan 58\ncyrene 58\nmanganelli 58\nundershooting 58\nlorenzo-vera 58\nturadzhonzoda 58\nsantilli 58\nkucher 58\njo-krg 58\nmr\\/dw 58\ndosen 58\nmargi 58\nhalf-cooked 58\nheslin 58\nvidoje 58\nlepley 58\ncommandante 58\npattersons 58\ncorsaire 58\nfamosa 58\nyojimbo 58\nbromma 58\ntransmutation 58\ncost-to-income 58\nfeinted 58\nmop-topped 58\nmoxibustion 58\nyongyi 58\nheterogeneity 58\ncurti 58\nbolte 58\ntripplett 58\npaun 58\nbio-pharmaceutical 58\ncherubini 58\nhyun-woo 58\nstroem 58\nwanvig 58\nfriuli-venezia 58\nschornack 58\ncaucasia 58\nprosecuters 58\nsagmeister 58\ndefore 58\nfakhoury 58\nozen 58\nlightning-bolt 58\nfunks 58\npusillanimous 58\nticky-tack 58\nlenfest 58\novermedicated 58\nsung-tae 58\nfotherby 58\nal-gabali 58\nmuhajiroun 58\n##.#-billion-pound 58\ncridlin 58\nlow-dollar 58\nslappy 58\nunder-recognized 58\nconsumer-finance 58\njoma 58\nfabyan 58\nvredenburg 58\nrushville 58\nkingsmen 58\nfeng-ying 58\njung-hoon 58\naldape 58\nwednesday-night 58\njanja 58\nmonsod 58\nmaimings 58\nkwaito 58\nal-baghli 58\nst.-denis 58\nagro-food 58\nafsa 58\nsyncytial 58\ncrf 58\nbrogliatti 58\nmelcior 58\ncongestions 58\nthemself 58\nmeiners 58\ngranick 58\nmamonyane 58\nkrayer 58\nbrm 58\nsex-selective 58\ndaina 58\njjh\\/db 58\ncarella 58\nre-took 
58\nilmor 58\nchedjou 58\nwaites 58\n#-max 58\nmerriex 58\npresident-general 58\nwater-main 58\npsa\\/bloomberg 58\ncamerounians 58\nsavatheda 58\ncounter-measure 58\ni.t. 58\nkaid 58\nporkpie 58\ncinchona 58\nmakhosini 58\naudemars 58\nmessa 58\nunlamented 58\nniedak-ashkenazi 58\nbutyrka 58\nmusicales 58\nuvm 58\nbayh-dole 58\nschleyer-halle 58\nunzipping 58\n#,###-game 58\nfreemarkets 58\nrequa 58\nopondo 58\npost-and-beam 58\nkmarts 58\nhejda 58\npeilin 58\nneisser 58\nre-using 58\ntakeisha 58\nnewstalkzb 58\npengilly 58\nchinese-african 58\nflippancy 58\ncoorsh 58\napta 58\nwillcocks 58\nemlen 58\ncitiseconline 58\nrulemakers 58\ndistintas 58\nqadderdan 58\ndpk 58\nvortexes 58\nmae-ggl 58\nhodur 58\nkaravellas 58\ntimidria 58\nhaina 58\nnare 58\novercast\\/sleet 58\neurest 58\n####-###### 58\nkakhi 58\nsenafe 58\nzetti 58\nciencia 58\nboya 58\nweadock 58\npestriaev 58\ndhahi 58\nhui-mei 58\nel-arabi 58\nkotto 58\nsundstroem 58\nhamui 58\nprovos 58\nsenio 58\nbochner 58\nedyta 58\nminamoto 58\nscorings 58\ngesner 58\nadiyaman 58\nwassail 58\nfishhook 58\nkrivda 58\ntestifed 58\nmomanyi 58\ncafarelli 58\nre-christened 58\nnaaman 58\nchien-chih 58\ntrailside 58\nwenhao 58\ndeplaning 58\nlegree 58\nanti-personal 58\nyemelyanov 58\ntest-market 58\nikramullah 58\nplateauing 58\nbatad 58\npeu 58\nsamling 58\nglaudini 58\nfilippidis 58\nzhifu 58\nanti-divorce 58\nscrooges 58\nlashari 58\nideologists 58\nsagapolutele 58\nniobium 58\nchows 58\nbenfer 58\nhhh 58\ntwo-footer 58\nnon-acceptance 58\nmd\\/ji 58\nauchi 58\nveanne 58\nbrick-red 58\nvarga-balazs 58\npeskiric 58\nrestitutions 58\ninterjection 58\nin-place 58\ncavagnaro 58\nlewinksy 58\nsaudi-brokered 58\nrotax 58\nkawagoe 58\nwlosowicz 58\npuleo 58\nhoth 58\ndisch-falk 58\nseiple 58\nnitrogenous 58\nurubamba 58\nparticularily 58\nhagop 58\nbachirou 58\ncpc-led 58\narbor-based 58\nvaleant 58\ngunhild 58\nmarkswoman 58\nqueyranne 58\nphoumsavanh 58\nkirschstein 58\nsharjah-based 58\nharriton 58\nbaktiari 
58\nconfernce 58\npremade 58\nsleet\\/overcast 58\ndenbeaux 58\ntyibilika 58\nteya 58\nsqualene 58\nguzy 58\nampad 58\nnua 58\noundjian 58\nyankees-mets 58\nclyfford 58\nbootes 58\ndegeratu 58\nheshmatollah 58\nself-diagnosis 58\ngulyanamitta 58\nbelgium-politics 58\nzukor 58\njyujiya 58\ninadmissibility 58\nsecurity-wise 58\nspiliotes 58\nzambelli 58\naxford 58\nclosed-in 58\nlaender 58\nanti-flooding 58\nicca 58\nso-what 58\npharmacogenomics 58\nprefecture-level 58\ndeclines# 58\nbriffa 58\nplexicushion 58\nroyal-blue 58\nadd-ins 58\nsfp 58\nwide-awake 58\nmahoud 58\n#motion 58\nriwhite 58\nfrappes 58\nrecoletos 58\naccountholders 58\nmost-affected 58\nyalie 58\nsugarmann 58\nahmadov 58\nictu 58\nconservative-minded 58\nfinigan 58\nblackard 58\nlabant 58\nvierra 58\nzizka 58\nfiguras 58\nesquimalt 58\npottuvil 58\nkhassawneh 58\nalben 58\nperu-hostages 58\nrebozos 58\nkonchalovsky 58\nkesayeva 58\nkoyo 58\nflukey 58\nover-bought 58\ndollar-supportive 58\npushtuns 58\ndistribuidora 58\nlalji 58\nthiermann 58\nadvance-fee 58\nwoodberry 58\nwaleska 58\noysterman 58\nssali 58\nfievet 58\npro-and 58\nmetaxas 58\ncairngorms 58\ncohabited 58\ndoyev 58\njohnsrud 58\nrayani 58\nsalvatori 58\nbowdlerized 58\nepidaurus 58\ntilahun 58\ncounter-guerrilla 58\nmaulani 58\nchalai 58\ngood-for-you 58\nsoon-to-retire 58\nforkballs 58\nresettlers 58\nultimatetv 58\noblinger 58\nritto 58\nfiberweb 58\ndungey 58\npinholes 58\n#-diego 58\nkikhia 58\ntugluk 58\nprogressivity 58\nasakawa 58\nkolawole 58\nregpay 58\nsyktyvkar 58\nanti-copying 58\nluda 58\ntv-like 58\nweisbach 58\nmulticanal 58\nexport-fueled 58\nlucherini 58\nbossio 58\nnumeiri 58\nswagel 58\nfamily-like 58\nlodar 58\nfinalises 58\nlarwood 58\nu.n.-proposed 58\nself-designated 58\nrecord-hard 58\nsyncing 58\nwinebrenner 58\nprovosts 58\nnaisbitt 58\nchetcuti 58\nwentland 58\ncapetillo 58\nandreyeva 58\nskvortsova 58\nrole-players 58\nartek 58\ncholing 58\n##,###-sq-m 58\nresized 58\nhartack 58\nwilens 58\nsentimentalism 
58\nserifovic 58\ncancio 58\nhausch 58\nearlene 58\nmengwa 58\nyokado 58\npisarcik 58\nloredana 58\nnaqi 58\nmost-asked 58\nchicha 58\nskammelsrud 58\nsupinit 58\nwanseele 58\nfragrance-free 58\nleffe 58\nhttp://www.firstunion.com 58\nnuder 58\nrouf 58\nzigomanis 58\nmp-# 58\nguma 58\nmarket-access 58\nsquare-kilometre 58\nurbanists 58\ntortes 58\nfranking 58\nvorenberg 58\nschwieterman 58\nbawazir 58\ncrankcase 58\nderogatis 58\nkarakasevic 58\nentropia 58\nbarlonyo 58\nhannibal-lagrange 58\nartnews 58\nadventurists 58\nmoney-changer 58\nhousecoat 58\nnordictrack 58\nwilfert 58\nmccain-obama 58\nmusselwhite 58\nbangzhu 58\nmeyerbeer 58\ntegucigalpa-san 58\nal-saqqa 58\nquercus 58\nlouay 58\nyenga 58\nrajauri 58\ncigarroa 58\nkimiyasu 58\nnicotine-free 58\nzelinsky 58\ndiagon 58\npermach 58\nsymmetrix 58\nfirnas 58\nflower-decked 58\nistar 58\nquintupling 58\nbreyers 58\narmyworm 58\nabdoulie 58\ncare-related 58\naldar 58\nbeauteous 58\nyardages 58\nnon-greek 58\ncutthroats 58\nflessel-colovic 58\nmago 58\nmargets 58\nbarela 58\nflight-to-quality 58\nmost-coveted 58\nzubar 58\nruzowitzky 58\npetru-alexandru 58\nal-hajji 58\nscots-irish 58\nchabang 58\nbouafle 58\nxxxxxxxend 58\nmintoo 58\nlolab 58\nlargely-christian 58\nflim-flam 58\nwen-ying 58\ngreensomes 58\npractioners 58\nya\\/ml 58\nchanters 58\nkarapetian 58\nin-seat 58\nshirko 58\nbistrot 58\nesmaeel 58\ninsanitary 58\ncheka 58\nissue-driven 58\ngok 58\nauclair 58\nveira 58\ncraftsman-style 58\nhonorifics 58\nmartek 58\nebri 58\nferdie 58\nseshaiah 58\nsunkin 58\ncaldrons 58\ncabals 58\nforedeck 58\ntight-head 58\nsung-nam 58\ndemerara 58\njucker 58\nthumb-size 58\nsciame 58\neriksen\\/mette 58\nricketson 58\ndomesticating 58\nkhagendra 58\nabla 58\nrescorla 58\nkanta 58\nstylez 58\nruwenzori 58\nkilfoyle 58\nsteinmann 58\ntabakh 58\ninquisitorial 58\nheldman 58\nvus#### 58\nwhole-language 58\nkimberlee 58\nharnecker 58\ndary 58\nazimkar 58\nreneau 58\nin-the-round 58\nvucetic 58\nmetrowerks 58\ntogiola 
58\nmotor-racing 58\nthemistocleous 58\nrugut 58\nbectu 58\nwww.orbitz.com 58\npagos 58\nakeem 58\nanouncement 58\nflat-line 58\nsindical 58\nbc-mexico 58\nthree-building 58\nchambal 58\nbhoj 58\nasra 58\np.p. 58\noverexploitation 58\ntime-keeping 58\nbearse 58\nrobp 58\nmarakesh 58\navena 58\nhighwayman 58\nbarnaba 58\nco-coaches 58\nrfranklin 58\nself-abuse 58\nfarra 58\nbelow-strength 58\nbolek 58\ndevkota 58\nfarney 58\njunck 58\nlefranc 58\nlindale 58\nlikhachev 58\nflywheels 58\ncnca 58\nsamin 58\nsamie 58\nglasberg 58\nblue-suited 58\nkazarlyga 58\nvacanti 58\npurwoprandjono 58\n##-billion-baht 58\nbahaeddin 58\npw#### 58\nworld-shaking 58\nbalbina 58\ndarwyn 58\naleskerov 58\nprimp 58\nchue 58\no'cealleagh 58\nscotrail 58\nfigeroux 58\nthanawat 58\niihs 58\nhendardji 58\nboldyrev 58\nmihoko 58\nsibani 58\ndiapering 58\nanimal-feed 58\nlintel 58\ninfobahn 58\nkuribayashi 58\nbretons 58\nbevmark 58\nrisk-sensitive 58\nflanner 58\ngroenvold 58\nkaspersky 58\nsalt-free 58\ncankaya 58\nzna 58\nrabson 58\noutvote 58\nhanada 58\ntourist-filled 58\nadobes 58\nathamna 58\nnato-brokered 58\npriding 58\nngetich 58\nbenac 58\neelco 58\nteza 58\nkeahon 58\nemeryville-based 58\nisentress 58\nmunford 58\ncritchfield 58\nchinandega 58\ncarax 58\nposers 58\ntehreek-i-jafria 58\ntbarnhart@ajc.com 58\npicolinate 58\ncobwebbed 58\nconventionality 58\nbyelections 58\nmacallister 58\npapoose 58\n##-microgram 58\nsled-dog 58\nmyostatin 58\nalbaladejo 58\nndirangu 58\nhonderich 58\nsceptically 58\ngarro 58\nvereeniging 58\nh-shaped 58\nlillington 58\ndiddams 58\ncoffy 58\ndoorjamb 58\nmilitia-style 58\nmagluta 58\nmazzariol 58\nstanford-trained 58\nagriculturalists 58\nspit-and-polish 58\nservility 58\npadmore 58\nsatmars 58\nseren 58\nsubsitute 58\nserwotka 58\nkuokuang 58\ntavlaridis 58\ncongressman-elect 58\ntwo-euro 58\nseries-levelling 58\ncricket-ashes-aus-eng 58\ndigital-only 58\nprovencher 58\nmuslim-jewish 58\nresizing 58\nhula-hoop 58\nfuegos 58\nfair-housing 
58\nwaitakere 58\nbouldin 58\nchilanga 58\nsuceeded 58\nsundeen 58\nsix-phase 58\ndisassociation 58\nshteyngart 58\ncrystalize 58\nshihan 58\nprotocal 58\nlong-march 58\nfazal-ur 58\njackknife 58\nmcnairy 58\nealey 58\ndrivon 58\npleasure-seeking 58\nsciolino 58\nshore-up 58\nthen-banned 58\ngerman-swiss 58\npolymorphism 58\nthree-foot-high 58\nanti-speculation 58\nstoudmann 58\nkoreas-talks 58\nivanek 58\nkanyenda 58\nholyfield-lennox 58\nnaceri 58\nanti-globalist 58\njeg 58\nfolden 58\ntalpur 58\nr.f. 58\n##,###-barrel-a-day 58\nbinshu 58\ndoerfler 58\njuanmi 58\norgandy 58\nkostal 58\nchola 58\npresgrave 58\ncosgrave 58\nzhevnov 58\nrecapitalizations 58\nus-philippines 58\nhalf-a-percentage 58\nmayenne 58\npozole 58\njva 58\nhigher-up 58\nerra 58\nbtrc 58\ntaxachusetts 58\nlyor 58\nadvocate-general 58\ntoko 58\nblowzy 58\nkantaras 58\nconsiste 58\nindama 58\nxiaoxiang 58\njamelli 58\nesquipulas 58\ncd-based 58\nmaxillofacial 58\nbernabei 58\nmeshell 58\nparlin 58\nnow-imprisoned 58\nummc 58\nposterboard 58\ndegradable 58\nannasue 58\nanjouanese 58\ngallivan 58\nusc-notre 58\nsappho 58\ncreameries 58\nturbocharge 58\nsoslan 58\nthurairaja 58\nsingapore-china 58\nantiperspirant 58\nabominably 58\npuiu 58\nnewfoundlanders 58\nkaylene 58\nplaine 58\nhoworth 58\nnon-save 58\ndijana 58\nronaldson 58\naivar 58\nhudong 58\n##-student 58\npravastatin 58\ndelima 58\nsix-continent 58\nunion-imposed 58\nkaz\\/cof 58\nsandy-colored 58\nlongliners 58\njanic 58\namerithrax 58\nblurriness 58\naleki 58\nrokkasho-mura 58\nsogang 58\nmethow 58\nrosalee 58\nrapiscan 58\nnepali-language 58\neastgate 58\ntimis 58\nrecord-indoor 58\nadelaida 58\nethno-sectarian 58\npay-for-view 58\ndahdouh 58\ndesisto 58\nhuldai 58\nbrunswijk 58\nhundertwasser 58\nchiew 58\npyrotechnical 58\nsnatchings 58\njabri 58\nkleeblatt 58\nturkish-registered 58\nghodhbane 58\nmasunungure 58\nmousepad 58\nascott 58\nlapoint 58\nquarter-page 58\nbelfiore 58\nse-r 58\nmartinville 58\nsalchow-triple 58\nminored 
58\nahe 58\ncounter-complaint 58\nmbula 58\nkhadi 58\nwolfli 58\nfaggots 58\nvigouroux 58\nchaifetz 58\ngiugliano 58\ndefoliated 58\nvashee 58\nmaumee 58\njutge 58\nloudi 58\npowderly 58\nrazini 58\nself-seeking 58\nbest-written 58\narkaah 58\npatroon 58\nmotormen 58\npedagogic 58\nsoendral 58\npharynx 58\nqudratullah 58\nproces 58\nalcohol-monitoring 58\ncorange 58\na.t.m. 58\nrasshan 58\nsabbar 58\nmalmberg 58\nsverrisson 58\ntiler 58\nsabag 58\nhttp://www.state.gov/ 58\nfeichter 58\n#-george 58\nwheelis 58\ncleric-run 58\nroskot 58\nadjetey-nelson 58\npropios 58\nquickies 58\nnzimbi 58\ngazprombank 58\nwoofer 58\ntrita 58\nweigman 58\ncongealing 58\nsolidarite 58\nnarcoleptic 58\npulai 58\nbewag 58\ncomputadora 58\nsppf 58\nsoundcheck 58\nzaituc 58\nadurogboye 58\norogen 58\ndemeanors 58\nneumayr 58\nmso 58\nntou 58\nwesttown 58\nre-directed 58\navalanche-journal 58\nishmail 58\npearsmhnytimes.com 58\nposthaste 58\nneedle-nose 58\nself-assessments 58\nlewisite 58\nkuwaiti-based 58\nkostevych 58\ndogonadze 58\nkorasuv 58\nhampl 58\nconcealed-weapon 58\nshragai 58\nkappos 58\nanti-casino 58\nsleith@ajc.com 58\nkaryo 58\nkopi 58\nsub-themes 58\nknock-kneed 58\nstendardo 58\nanthuan 58\nrekapac 58\nnasaa 58\nchinese-manufactured 58\nisioma 58\ngreenwillow 58\nstrad 58\npiontkowski 58\nnine-stroke 58\navm 58\nparishoners 58\ndelfim 58\npasquini 58\nwigmore 58\ndieuze 58\nscag 58\nberzengi 58\ncazzulani 58\nbalmont 58\nmarriotts 58\ncete 58\nphertzberg 58\nwapenaar 58\nhodara 58\ncurcumin 58\ncanoni 58\nabdulmajid 58\nmaroon# 58\nanti-vaccine 58\nmini-city 58\nbioland 58\nlanced 58\nhepatology 58\nbalakong 58\neneco 58\nsubuh 58\nwolfenstein 58\nhenchy 58\ntumukunde 58\nargentina-vote 58\nsoft-dollar 58\nmerit-making 58\nqanoni 58\nczech-built 58\nal-khayat 58\nheidsieck 58\ndeinstitutionalization 58\nmail-fraud 58\ncalakmul 58\nchiasso 58\nfaveur 58\nranarith 58\nsatara 58\ncash-management 58\nayittey 58\nbuschbaum 58\ncompatibles 58\nolatunji 58\nhourglass-shaped 
58\nabousamra 58\nserafino 58\nrassemblement 58\ncohen-tannoudji 58\nartspace 58\nschwartlander 58\nyeltsova 58\nkiu 58\nzoubeir 58\nasias 58\norv 58\nyaring 58\nsheppards 58\ntwo-course 58\ncadrez 58\ncharima 58\ncanana 58\ngeodon 58\ndeplasco 58\nfully-armed 58\nbrontes 58\nkbp 58\nstehn 58\n#-iker 58\nbaff 58\nperlas 58\nmadhes 58\npreis 58\nharward 58\nkoloane 58\ndeshayes 58\nfalomo 58\nsaturno 58\ncorrupters 58\nplanetariums 58\nbifengxia 58\nflum 58\npropound 58\nhemanshu 58\nimacec 58\nfeyernoord 58\nulta 58\natli 58\nre-hire 58\ninterclan 58\ngitonga 58\naand 58\nicmr 58\njarar 58\nskank 58\nthrane 58\ntrousseau 58\nspent-fuel 58\nchilaquiles 58\nbump-drafting 58\niao 58\nvanhala 58\ndj\\/ak## 58\nproselytization 58\ndongdu 58\nrave-up 58\nuzebekistan 58\nntagerura 58\npengkalan 58\nanti-fascism 58\ndomingue 58\nlamport 58\nkaillie 58\nfernet 58\nbekaert 58\nproject-based 58\nmauriac 58\nabrogates 58\nbasketball-wise 58\ngansey 58\nemailing 58\njiulong 58\nmechale 58\nvignerons 58\nfederative 58\nemmo 58\npanino 58\neasterain 58\naceto 58\nmanresa 58\nown-goals 58\nlamongan 58\nmorinaga 58\nfourth-season 58\nduvergel 58\nruzindana 58\nyu-ih 58\nvarnishing 58\nnon-televised 58\nederer 58\ngrosboell 58\nsakanyi 58\nwrn 58\ncostless 58\nlaxalt 58\nlb-# 58\nhalse 58\nupper-tier 58\nsheinkin 58\nshadwell 58\nkatselas 58\nwww.aol.com 58\nyankey 58\nmytouch 58\nenyimba\\/ngr 58\nbrokedown 58\njen-hung 58\nkisutu 58\nkuniyoshi 58\ndahiyah 58\namrhein 58\nup-country 58\nrolo 58\nsmartmedia 58\nwatch-helsinki 58\nkhawazakhela 58\nocws 58\nyacoubian 58\nwandlike 58\ntratan 58\ntemane 58\ntainan-based 58\nmicropal 58\nkorrodi 58\nms-as 58\nahdyar 58\nsoutherton 58\nkakiuchi 58\nstate-of-origin 58\nbreault 58\nfilled-in 58\nkawase 58\nbysshe 58\npericard 58\nsakwiba 58\nqingyang 58\npedantry 58\nkamangar 58\nunlawfulness 58\ngdc 58\nastuteness 58\nvca 58\nhighly-qualified 58\npressurised 58\nfliegende 58\nambush-style 58\nsweatman 58\nbembry 58\nrabelais 58\nhellion 
58\njaune 58\npade 58\ncylon 58\nnnt 58\ncod-style 58\nmanoeuvering 58\nlubero 58\nknocker 58\nnpower 58\noeystein 58\nstellone 58\nbadou 58\nberlinecke 58\nshengrong 58\nnon-sporting 58\nanti-moslem 58\nhttp://www.defenselink.mil/ 58\nmacgillivary 58\npigheaded 58\ngrogin 58\npluta 58\nmadasamy 58\nlozzano 58\nbong-kil 58\nchristmas-tree 58\nmetroid 58\nkudisch 58\nrunups 58\nkinsolving 58\nzety 58\nguitarist-singer 58\nvinayagamoorthi 58\nmovilnet 58\nppa-containing 58\nseyval 58\nfeliks 58\nshiite-populated 58\nharymurti 58\ntingo 58\ngrassless 58\ncheap-looking 58\nzhengming 58\njastrow 58\nnrcs 58\nshirkers 58\nbc-af-fin 58\nbatac 58\nfinocchiaro 58\npolyrhythms 58\nweale 58\nfang-yu 58\nispa 58\ncondensers 58\nsynergie 58\nrottman 58\ntest-bed 58\nafroyim 58\nanti-arms 58\namathila 58\nbillion\\/## 58\ntodman 58\nrabies-free 58\ngilbreath 58\nultra-conservatives 58\nout-of-context 58\nflook 58\npongpanich 58\nskulked 58\na###xwb 58\nsemiskilled 58\nsynoptics 58\nhome-ported 58\nih 58\nbornand 58\nkeshubhai 58\nadianto 58\nasen 58\nd-hillsborough 58\ncommercial-grade 58\ndiery 58\ntzoganis 58\nal-nounou 58\ntedd 58\nmap-making 58\nblockbusting 58\nbe-in 58\ntrouble-torn 58\nskipp 58\nwhitsunday 58\nhutterites 58\nttc 58\noccam 58\nwitters 58\nwinterreise 58\nal-hasani 58\ncities-abc 58\nkurtag 58\nkahoolawe 58\nradiofrequency 58\nelfers 58\nrazor-edged 58\nprofanity-filled 58\nnedzad 58\nelsworth 58\nsashays 58\nredbone 58\narkhipova 58\ngeck 58\nfouras 58\nerdf 58\nsivori 58\npoint-based 58\napennines 58\nraffanello 58\nfoot-fault 58\nemirsyah 58\nsafian 58\npoppy-producing 58\nbagong 58\nroehrig 58\nstutes 58\nmichelet 58\nanter 58\ncampionati 58\nsemmelweis 58\ndos-based 58\nsadosky 58\nst-# 58\nfranker 58\nbrende 58\nmihajlov 58\neshetu 58\nbarile 58\nroey 58\nbijaya 58\nmatcha 58\nhounslow 58\nsischy 58\nanxi 58\ntopdog\\/underdog 58\npredisposing 58\ntightenings 58\ncolantuono 58\nduflo 58\ntervuren 58\nslebos 58\nkrein 58\nmalu-malu 
58\nnkorea-nuclear-weapons-us 58\nd'hondt 58\nmarce 58\nmovieline 58\nbles 58\nhyeon 58\ncasner 58\ndry-aged 58\nclomping 58\njd\\/pi## 58\ngiganotosaurus 58\nfive-six 58\niita 58\nthen-teammate 58\nbadola 58\nahrendts 58\nchristoforakos 58\nal-daradji 58\nhathorn 58\ncomputer-operated 58\nsoviet-american 58\nmousetraps 58\nturetsky 58\nfarc-held 58\nmedzamor 58\nherpoel 58\nscissor-kick 58\nwodie 58\nquirine 58\nshrivelled 58\nflameproof 58\nless-talented 58\nsacharow 58\nthin-bladed 58\ncavour 58\nallaga 58\ndornbush 58\n#-pounder 58\nschuermann 58\nmafia-related 58\nbellingen 58\ncrichlow 58\ndividend-rich 58\nlorsch 58\nanglada 58\nnon-actors 58\nanti-surface 58\ndeepcut 58\nlazarev 58\nthumb-sucking 58\nbi-polar 58\nmadrid-barajas 58\nthile 58\nbarn-burner 58\ncalibrations 58\nstimulus-fueled 58\nunplanted 58\nkeyboardists 58\necuador-vote 58\narmuelles 58\nchittick 58\ntaavi 58\nmondesire 58\nsmederevska 58\nding-dong 58\npromphan 58\nlovsan 58\nloveseat 58\nfullscale 58\nnazzaro 58\nmulvenon 58\nhillegass 58\nvanderford 58\ngoodlad 58\ncarphedon 58\ncourt-at-law 58\npseudo.com 58\nsollers 58\nputterman 58\nfinnegans 58\ndurakovic 58\nhamayon 58\nduct-tape 58\nbardic 58\nscaled-up 58\nstill-robust 58\nwilchcombe 58\nwathiq 58\nhiltachk 58\nkrylatskoye 58\nboose 58\ndata-intensive 58\ndanielides 58\ntranssexuality 58\nclaw-foot 58\nnone-out 58\nelucidating 58\ntomasch 58\nbrignol 58\njeyarajah 58\ndangor 58\ncaic 58\ngoaland 58\nmellis 58\nsomali-based 58\nsidetracking 58\nsushmita 58\nmid-stride 58\ndomestically-traded 58\nredox 58\nshoba 58\nhouse-arrest 58\ncoppens 58\nat-tuffah 58\nnijssen 58\nhely 58\nrouged 58\nlounger 58\na.r.c. 
58\ndubovsky 58\nnon-payments 58\nenviga 58\nfrenz 58\n#-boris 58\nmisprision 58\ngo-along 58\nriccadonna 58\nmachine-gunner 58\nvellore 58\ntackie 58\nconstitutionalists 58\ntongliao 58\npaderina 58\nnan-cheng 58\ncattan 58\nus-immigration 58\nsparq 58\nmashingaidze 58\ntogawa 58\nhouseflies 58\nsemenzato 58\nmoton 58\natlanta-bound 58\nsummerer 58\nmarmottan 58\nquartier 58\nel-motassadeq 58\nmore-profitable 58\nboada 57\nmalim 57\nvoit 57\ntichtchenko 57\nqgpc 57\nsakassou 57\nhamadou 57\ngeeked 57\nanagrams 57\npoliticizes 57\nchainarong 57\nfourth-youngest 57\nschulweis 57\nunsafeguarded 57\nurmila 57\nmobile-telephone 57\ngrigson 57\nabdul-samad 57\nel-youssef 57\nsuffuse 57\ncastellane 57\nline-outs 57\nmanohara 57\nchristiansborg 57\nstandley 57\ngasoline\\/electric 57\nmunton 57\naaib 57\npostcommunist 57\nbuckcherry 57\nteletext 57\njuancho 57\nsteep-sided 57\nmicro-enterprise 57\nmedicals 57\ncloudlike 57\ngriet 57\njintropin 57\nfunsho 57\nschaus 57\nchenoweth-hage 57\nclub-mate 57\nisrael-vote 57\ngokavi 57\ntoufic 57\nscalloping 57\nlfc 57\nchatwal 57\nmovie-mad 57\namerican-european 57\nbutchie 57\ntelevisual 57\nchainrai 57\nmambasa 57\nquaye 57\ndilg 57\nhttp://www.nobel.no 57\njabotinsky 57\nnilly 57\nlakela 57\nfrostily 57\naustralia-bushfires 57\nmaumalanga 57\ncoleccion 57\n##-carry 57\nweisser 57\nmorago 57\nkpatinde 57\nkorun 57\nkc-pq 57\nnextlink 57\ntbilissi 57\nazua 57\n#m## 57\nobersalzberg 57\nsea-skimming 57\nnizuc 57\nsacombank 57\nnitrate-based 57\nf.h. 
57\nfarj 57\nlagunov 57\nmillipede 57\n#-cd 57\nmusclebound 57\nbloche 57\ndissuades 57\ntae-dong 57\narms-related 57\nsuk-tae 57\nconceptualizing 57\npokaski 57\noften-contentious 57\nseleccion 57\ndialogo 57\ngrevenmacher 57\nalledged 57\nmaziarz 57\necompanies 57\nslow-building 57\ngreczyn 57\npackham 57\nlamego 57\nsaddlebag 57\nstrokosch 57\neutaw 57\nshophouses 57\nxxiv 57\nnickel-cadmium 57\nshuning 57\nhttp://www.homedepot.com 57\nsunfin 57\natg 57\nwindmilling 57\nzostavax 57\nthen-majority 57\nrevenant 57\ndaish 57\ncvijanovic 57\nbutoh 57\nmarianist 57\nvisoth 57\neidul 57\nisaura 57\naleynikov 57\nshalita 57\nbenegas 57\nmine-strewn 57\nlitif 57\nqallab 57\npercolation 57\nmagnesite 57\nsung-jin 57\ncook-offs 57\nronayne 57\ncarias 57\ncalendula 57\nelmaghraby 57\nfafner 57\nscibelli 57\nhuhhot 57\nsidex 57\nbank-issued 57\nperfecta 57\nvitrines 57\nho-chunk 57\nalstyne 57\nsydkraft 57\nfootball-wise 57\nsilverite 57\npagliaro 57\noverdubbed 57\nmighani 57\nfinger-snapping 57\nliron 57\nmulti-trillion 57\nortuno 57\ndrop-kick 57\nmussavi 57\nbrasilia-based 57\nmeteoroid 57\noxygen-deprived 57\nrentech 57\npuzzle-solving 57\npro-ravalomanana 57\nthmey 57\nzalben 57\nliko 57\nqardaha 57\nmuayad 57\nlottner 57\nconceptualist 57\nlaychak 57\nbadini 57\njazzmen 57\nmacaco 57\nnexstar 57\nespecies 57\nwarbirds 57\navelar 57\nulcerated 57\nmurshidabad 57\nmarife 57\ngrillers 57\nmarakwet 57\nsanroma 57\ncollege-preparatory 57\nbartholomay 57\ndpp-initiated 57\nduskin 57\nanesthetizing 57\nbelgrade-controlled 57\nestragon 57\nnovolipetsk 57\ncaton-jones 57\noffman 57\njetways 57\nam\\/ji 57\nhatemonger 57\nzeo 57\ndial-around 57\nhunzike 57\nwicha 57\ncupiagua 57\nmewelde 57\nwebmethods 57\ndacca 57\nagrast 57\nindonesia-weather-floods 57\nsumaya 57\nmagliore 57\nseljan 57\ndead-eyed 57\nrezidor 57\nrubberneck 57\nsubstantia 57\nazzurro 57\nfirst-responder 57\nperuses 57\njanusaitis 57\nsummiting 57\nmantas 57\nposthumus 57\nsufjan 57\noften-delayed 57\nresiana 
57\nrrustem 57\npulled-together 57\nmilmo 57\ncompa 57\nprezant 57\nlucke 57\nprosthetist 57\nse# 57\nluisita 57\nmolinelli 57\nearthrights 57\nosayemi 57\nmarket-beating 57\nsincor 57\nrexburg 57\nmiku 57\nanti-disease 57\nalready-crowded 57\nsantry 57\nfriesian 57\ntg-pyg 57\nnce 57\ncottoned 57\nvaugrenard 57\ncandleholders 57\nmudi 57\nhawksworth 57\njinglian 57\nschuon 57\ndehesa 57\npuyo 57\nbiosensor 57\nperiodnone 57\nnon-deductible 57\neid-ul 57\npalestrina 57\nmitic 57\nsegars 57\nentrapping 57\n###w 57\nvls\\/nvw 57\npeachpit 57\nsmall-plane 57\nnorsworthy 57\nlello 57\nwair 57\nlatinpass 57\nsouquet 57\nreveles 57\nsoapboxes 57\ntholut 57\npompton 57\nfmd-free 57\nsomething-or-other 57\nnayoko 57\nforum-asia 57\ninterferometer 57\nhortensia 57\ntifosi 57\nu.s.-provided 57\nbrasfield 57\nhttp://www.people-press.org 57\nmojokerto 57\nstankalla 57\ngallien 57\nvalda 57\nspeciosa 57\nlightly-regarded 57\nbublitz 57\nchongryong 57\nbit-part 57\nsharabati 57\ndrottningholm 57\neum 57\npongrat 57\ntesa 57\nmulticast 57\n+##,### 57\nmushed 57\nravello 57\neye-fi 57\ncalandra 57\nshujah 57\nrural\\/metro 57\ndjeric 57\n##,###-points 57\nhoutart 57\nidigov 57\nrussian-supported 57\ncaa# 57\ntionne 57\ngabai 57\ndosek 57\nsadyk 57\ncalzati 57\nhip-hugger 57\niue-cwa 57\nksf 57\nhpa-an 57\nfinamex 57\nberden 57\nameco 57\nnon-jordanians 57\ntoxford 57\nmulembwe 57\nrovos 57\nextrication 57\ncutchogue 57\nbang-andersen 57\nlucchetti 57\nuninstalling 57\nexcises 57\nazaouagh 57\ndecompressing 57\nnorthwesterners 57\nwielkopolski 57\ntahlequah 57\noptimark 57\ngheen 57\neurlings 57\nunclimbed 57\nesti 57\nniyonzima 57\npro-rata 57\ngiannoulas 57\nkodjoe 57\nulanqab 57\ntiliwaldi 57\nbaldock 57\n#-meters 57\nghorak 57\nbovey 57\nsameur 57\nchien-kuo 57\ncolten 57\nvincenti 57\nsquare-kilometers 57\ngaramvoelgyi 57\nzippel 57\nthen-commander 57\n##.#-nautical 57\nunpowered 57\nhanly 57\nparten 57\nliederman 57\nvelayat 57\nwillo 57\ncoonelly 57\nkitov 57\nswiss-educated 
57\noutraise 57\nsinta 57\n#-felix 57\nvirts 57\nyome 57\nsaury 57\ngretz 57\nisoun 57\nmpigi 57\nnstp 57\nmodzeleski 57\nsamran 57\nwazed 57\nreappraising 57\nregola 57\nexor 57\nfrance-telecom 57\niresearch 57\nsoz 57\nn.b.a. 57\njenai 57\nliukko 57\nwenpu 57\ncervenko 57\nhalfa 57\nyefremova 57\nbiaggio 57\nmimmo 57\niraq-unrest-us-toll 57\nquart-size 57\nfavalora 57\nhockey-mad 57\ndoppelgangers 57\nghalibaf 57\nmarijnissen 57\nopen-face 57\nsytem 57\nfredricksen 57\nshafayat 57\nsafeen 57\nprograme 57\nkuroichi 57\nburqa-style 57\nxuesen 57\ndissembled 57\nuptagrafft 57\ncanadiense 57\nsung-wook 57\nparavant 57\nkada 57\nshort-stay 57\npinedo 57\nbalwinder 57\ntibon 57\nprig 57\nducent 57\nkc-###s 57\nersberg 57\nfamily-type 57\nlardin 57\ndicussed 57\nkaim 57\nwebnews 57\nkavak 57\ndebut-making 57\nhewitson 57\nmoeletsi 57\ntapit 57\nupdegrove 57\nsoft-sided 57\nhudepohl 57\nreflectivity 57\ntigerland 57\n#.##-per-share 57\nhalandri 57\nplote 57\ntemporaries 57\ngren 57\nmerlet 57\nodera 57\nlingao 57\nsleazier 57\ndialectics 57\ndoubletalk 57\nal-siddiq 57\ntifatul 57\nsanabis 57\nsc### 57\nvanore 57\nalair 57\nmae-eap 57\nspookiness 57\nhyson 57\nnounou 57\nnasreddine 57\nrose-marie 57\ncompletamente 57\nnon-starters 57\nbeguin 57\nbell-bottomed 57\nhandloom 57\nabu-zeid 57\ntolstaya 57\ntranswestern 57\nmaraven 57\nneoforma 57\ngalster 57\nviraat 57\nengqvist 57\nsadiya 57\nidiot-proof 57\nfrance-politics-jobs-youth 57\nlarini 57\nyuzawa 57\nutilitarianism 57\nantosh 57\nbelize-flagged 57\nkasetsiri 57\nkekauoha 57\nkaleida 57\ndeviously 57\nboucheron 57\nhanen 57\ncrose 57\npawson 57\nsudikoff 57\nallayar 57\nindustry-# 57\nperrott 57\nunframed 57\nkirpan 57\ndimasa 57\npopma 57\nmutianyu 57\naygun 57\niannelli 57\nd-dayton 57\nbrand-named 57\npersian-speaking 57\ngarforth 57\nsucess 57\ndisposer 57\nrosangela 57\npscs 57\nhanway 57\ncambon 57\npresas 57\nj&b 57\ncheckmated 57\nbonnardeaux 57\nguzzetti 57\ndigel 57\nmweemba 57\nparty-sponsored 
57\nuncommanded 57\nhigh-touch 57\ndraughon 57\nshanzai 57\nkhakimov 57\njaponicus 57\nunigate 57\n##-anastasia 57\nnabatiye 57\npenalties_none 57\nfive-song 57\nlightning-strike 57\nthumbnail-sized 57\ngutteres 57\nswardson 57\nkls 57\nstill-simmering 57\nburges 57\nam\\/sbg 57\ngolf-epga-esp 57\nshort-termism 57\nlifefx 57\ngtm 57\nhuntley-brinkley 57\nsaidat 57\nsequencer 57\nseperatist 57\nki-chi 57\nspoksman 57\nhuman-driven 57\nlaue 57\nvancleave 57\nsjoblom 57\nbarbacoa 57\nstatman 57\nshaneen 57\nantique-filled 57\nmhh-krg 57\nremond 57\nbijli 57\ncarby 57\ngreece-style 57\nmisapplying 57\ndorsen 57\nbuk 57\nschwarzenbauer 57\nraiz 57\nbanin 57\ndaryn 57\nartemisinin-based 57\nkbohls@statesman.com 57\npersahabatan 57\nwormy 57\nsamsung\\/radioshack 57\nbaxter-johnson 57\nre-tried 57\naetats 57\nhead-turner 57\ndegganssptimes.com 57\nthirty-thousand 57\nimperiousness 57\nkesha 57\ntenace 57\nferguson-mckenzie 57\njaovisidha 57\nagassa 57\nbarriga 57\nair-strike 57\nhome-opener 57\nturbi 57\numali 57\nkralik 57\npump-and-dump 57\nbassat 57\nkeasler 57\ntank-killing 57\nmisdirecting 57\nex-fighter 57\nmngomeni 57\nrejigged 57\nnovska 57\nbackstabbers 57\nspa\\/qst 57\nre-tally 57\nintermune 57\nlifa 57\nurbanczyk 57\nsarabeth 57\ncoke-bottle 57\nbattle-readiness 57\nkawkab 57\nkerdyk 57\nwenceslaus 57\nmind-expanding 57\nboutroue 57\ntanona 57\nsalivation 57\ntricolore 57\nout-gunned 57\njacobowitz 57\nbacot 57\nparticipar 57\nalprazolam 57\npolho 57\nfunnyordie.com 57\npanarin 57\nblanchfield 57\nyolane 57\nqualia 57\nrobertses 57\novp 57\npeissel 57\ndratshev 57\nieremia-stansbury 57\nkorchnoi 57\npvs-lk 57\njan.-sep 57\nschlickeisen 57\ncourier-post 57\nhigh-cholesterol 57\netich 57\nguolin 57\ngiesen 57\nindusty 57\ncucherat 57\nvillate 57\nevelin 57\nkelly-goss 57\nu.s.-korea 57\nkivutha 57\noverdramatized 57\nnemzet 57\ncassata 57\ndarle 57\ncur 57\nactor-politician 57\ndingbat 57\ngerspach 57\nbetter-established 57\nrestos 57\ntristesse 57\nausmin 
57\ncomputer-like 57\nthongsing 57\nramberg 57\nhjort 57\nover-estimated 57\nbioremediation 57\nstress-reduction 57\ntelevisi 57\nchelny 57\nsundazed 57\nradio-cassette 57\nopening-week 57\nreghecampf 57\nhighest-performing 57\nzamanbek 57\nprefigures 57\nred-and-white-striped 57\nmetters 57\ntravessa 57\npengiran 57\ncopernic 57\ngovernment-granted 57\nqoryoley 57\nkur 57\nmartincova 57\naldis 57\nremissainthe 57\nfavor-hamilton 57\nby-# 57\noooooooooooooooooooo 57\nfieldsman 57\naa-plus 57\nfarmsteads 57\nmarrack 57\nfambrough 57\nongarato 57\nspayd 57\ncorsetry 57\ninuvik 57\nsalwen 57\npower-grabbing 57\nwornick 57\nkalkstein 57\npapermakers 57\nthwaite 57\nmineira 57\nnewbigging 57\nuberstine 57\nroxx 57\nehrlichiosis 57\nal-noor 57\nchernogorneft 57\nmercally 57\nvillamayor 57\ntexpool 57\neye-watering 57\nwhite-brick 57\nkoco 57\ntransportations 57\ni-zone 57\nkouadio 57\nwebsense 57\nclinton-like 57\nkievsky 57\npietton 57\nathirson 57\noxygen-generating 57\nyoure 57\nsupertramp 57\nergic 57\nkasambala 57\nattention-grabber 57\nj-shaped 57\nqedwa 57\nhttp://www.ipcc.ch 57\nbossman 57\nlast-rock 57\npentair 57\nbingle 57\nap# 57\nsignally 57\ncundieff 57\nmaflahi 57\nciger 57\ngerlinde 57\nndambuki 57\nottakar 57\nbraunskill 57\nrecopa 57\njournaling 57\nrmit 57\nernk 57\nnon-congress 57\ndayle 57\nq-cells 57\nhetian 57\nboons 57\nnon-deliverable 57\nwoolston 57\ncristoph 57\nr\\/# 57\nmanglano 57\nerdinc 57\ngajon 57\nknickknack 57\nrohmat 57\nwen-yuan 57\nharleman 57\nguesso 57\nstamenson 57\naustralia-united 57\nnitromethane 57\neastn 57\nanonyme 57\nstadelmann 57\nsiefer 57\npripps 57\ncross-checks 57\nsalzburger 57\noceaneering 57\nball-point 57\nmandeep 57\nfratesi 57\nbeyrer 57\nsad-faced 57\nkrasnomovets 57\nrouhani 57\neruh 57\nlong-battered 57\ntoles 57\nparrotfish 57\ntuffree 57\nbrasseur 57\nshenkarow 57\nhalf-a-mile 57\ngyroball 57\ndombi 57\nanzar 57\ntajan 57\neasiness 57\nburgett 57\nhoskyns 57\nkembla 57\norena 57\nhatam 57\npinko 
57\ndetroit-hamtramck 57\nsuborned 57\nkavetas 57\nnielsen\\/net 57\nsrg 57\nu.s.-patrolled 57\nschuelke 57\nhayim 57\ndrobiazko\\/povilas 57\nbhumidhar 57\nakan 57\nbungler 57\nheterosexually 57\nshija 57\ndazhen 57\nteikyo 57\ntechno-thriller 57\nbezrukov 57\nhobgoblin 57\ngattis 57\nfamous-brand 57\nseamico 57\nboobytrapped 57\ndownrange 57\nfuel-starved 57\nmcaulay 57\nrobustelli 57\nbrinegar 57\nrohrbaugh 57\nismar 57\nbonfils 57\nponomareva 57\ngoorjian 57\nkashmola 57\non-rushing 57\ntuesday-sunday 57\nstill-sluggish 57\nregular-cab 57\nobermayer 57\nkpakol 57\nbasketball\\/pros 57\n#-virginia 57\nhunding 57\ndangol 57\npendleton-based 57\nneftegaz 57\nnakaniwa 57\nbetham 57\nlagniappe 57\nmbambo 57\nindle 57\nsakorn 57\npro-khartoum 57\nconjectural 57\nlunda-sul 57\nchiu-chin 57\nbancassurance 57\nlimbu 57\ninital 57\nanisotropy 57\npilbeam 57\nyazawa 57\narzak 57\nopacic 57\nkarasu 57\nhttp://www.cbs.com 57\nmingaladon 57\njoensuu 57\nweidenbaum 57\nugwu 57\nanugerah 57\nimmunologically 57\nespuelas 57\nizgi 57\nfredie 57\ncariplo 57\ncoyotepec 57\ncuyos 57\nnolle 57\nyb\\/sbg 57\neigil 57\nmaximilien 57\nbarberie 57\nyeongam 57\nharperbusiness 57\nnonlawyers 57\nmekachera 57\nmahlon 57\nveruca 57\nfirmo 57\nlamell 57\nsileo 57\njabarani 57\nimkb 57\nmaiffret 57\nodalovic 57\ngingivitis 57\nnagasawa 57\nfrancistown 57\nkocherlakota 57\nlife-saver 57\nseegers 57\nsbcs 57\nmarkazi 57\nkharbash 57\nfokker-### 57\nwide-release 57\nbajilan 57\nyaral 57\nmaione 57\ntokiwa 57\nqld 57\ngovernable 57\nparry-jones 57\ntwo-unit 57\nmcclay 57\nmcclam 57\ninner-tube 57\nobanda 57\nconfederado 57\neravur 57\nribbs 57\nmcclennen 57\nmulund 57\nbordallo 57\ncutely 57\ndiferencias 57\nblack-tailed 57\nride-alongs 57\nmosson 57\nnotarial 57\nrovereto 57\nskosana 57\nholohan 57\nthamilchelvan 57\nbusiness-management 57\nrerecorded 57\nseomin 57\nroewe 57\ngaffs 57\nwhirr 57\ngeppetto 57\nchukwueke 57\nbc-na-fea-gen 57\nrock-oriented 57\nmirandes 57\ntelkiyski 57\nkalindi 
57\nnon-minority 57\nwell-greased 57\nnemsadze 57\nsouthshore 57\nddungu 57\nlanin 57\nlubuk 57\nzhongqiang 57\nvolesky 57\ngontard 57\nkopylov 57\nsiestas 57\nhonduras-politics-coup 57\ndavitian 57\nfinancial-industry 57\nkantono 57\npumpkin-colored 57\nreconverted 57\nfixer-uppers 57\neuro##-euro## 57\nepinal 57\nhighchairs 57\nbrowses 57\njs\\/jd## 57\nmg\\/l 57\nsix-decade-old 57\nchimeric 57\nbowdler 57\nshannyn 57\nsnick 57\nfemale-oriented 57\ncambridgeside 57\ngiacomini 57\npolityka 57\nmarzouki 57\nvullo 57\nfist-pump 57\nvm###-### 57\nal-aoofi 57\nklecker 57\ninhumanly 57\nwhite-on-black 57\nmacneil-lehrer 57\nconkling 57\nal-kharbit 57\npcrm 57\nbest-song 57\nlevines 57\nadventuress 57\ncastagnetti 57\nmgn 57\nchumbawamba 57\ndiscontinuous 57\nojima 57\nkarlos 57\nihar 57\nmid-# 57\ntoe-loop 57\nasantehene 57\nruso 57\ncamago-malampaya 57\nsweet-talked 57\nbaitadi 57\nkaleidoscopes 57\nntabakuze 57\nsenado 57\nill-received 57\nvaccum 57\nguigang 57\nvoula 57\nkatou 57\nmontcoal 57\nharilal 57\nwetted 57\nruman 57\nagni-i 57\nrahbani 57\nschaffhouse 57\ngeyserville 57\nelda 57\n##-book 57\nyinhui 57\nleaseplan 57\nmidstage 57\nelnora 57\ndepende 57\nsecond-in 57\nmrp 57\niturralde 57\nengelberger 57\nyoude 57\nregivaldo 57\nschaper 57\ntotted 57\nelhassan 57\n#,###-patient 57\nslivinski 57\nmanically 57\nsampang 57\nbratu 57\nmulti-goal 57\narab-summit 57\nvisted 57\nazour 57\n##-sebastian 57\nmabasa 57\nlambayeque 57\nanuradha 57\nkw\\/hours 57\nfire-suppression 57\nbiliary 57\nyasujiro 57\ncrusat 57\nncds 57\nzager 57\nschmitt-roschmann 57\nfamiliar-looking 57\nairiness 57\nmoamba 57\nvocci 57\ngodchildren 57\ncaleigh 57\nbrokenborough 57\ncervo 57\nrecieving 57\nnon-recourse 57\nclassic-car 57\ndoy 57\nyong-seok 57\nscramjets 57\nterrorismo 57\nwesternised 57\ntechnical-support 57\nbuckaroos 57\ncaydee 57\npmdc 57\nv#s 57\ndalhart 57\nvibrance 57\nwayang 57\nroundish 57\nignarro 57\nsansoni 57\nschobert 57\n#.####-mark 57\nncsl 57\nbhight 57\nshapey 
57\nroyan 57\nlemper 57\nshannon.buggs@chron.com 57\nanadyr 57\nch-##d 57\njhala 57\nbaumgart 57\ncaponi 57\nsouverain 57\npush-off 57\nvidosevic 57\nverwiel 57\nstrougal 57\nnon-network 57\neco-terrorists 57\nwashkansky 57\nmatherne 57\nbt-## 57\ndragica 57\nnamuyamba 57\npetroleum-related 57\nas### 57\nmcgahern 57\nmorganti 57\namerican-record 57\nrandfontein 57\niraq-al-qaida 57\ncubillan 57\nbayberry 57\nmythbusters 57\ndollar-cost-averaging 57\nmultitasker 57\naicraft 57\nopole 57\nru\\/sw 57\nspirea 57\nencoder 57\nchanko 57\npyshkin 57\nstructuralism 57\nvijayakumar 57\neasygroup 57\ngoncalino 57\ndeadeye 57\nmilieus 57\nleomitis 57\nstart-stop 57\nstrohm 57\nstevensville 57\nanklam 57\nhuanghe 57\nbushisms 57\nsugimori 57\ntightly-knit 57\napono 57\none-yuan 57\npheonix 57\nfreeze-for-freeze 57\ncagily 57\nvivants 57\ncotman 57\nsirf 57\nstemme 57\nbaldassi 57\nkongsi 57\ntele-medicine 57\nmackanin 57\nbuild-a-bear 57\ngreeson 57\nindomobil 57\nbritain-politics-labour 57\nredenomination 57\nsell-outs 57\ngrullon 57\nnoorda 57\nsawasdi 57\nccamlr 57\ndbrs 57\nmallouh 57\nage-verification 57\nstrength-sapping 57\nb-#bs 57\nprestigous 57\n##-miroslav 57\nuzcategui 57\nlaquila 57\nrs\\/#### 57\ngasoline-guzzling 57\ngalesi 57\nhemley 57\nsimha 57\nkokura 57\nnyang 57\ncatenaccio 57\nsigo 57\nhiwada 57\nhuscroft 57\nbench-pressed 57\nrock-hurling 57\nadministrates 57\nal-hawali 57\ntime-scale 57\ngokcek 57\nbudgett 57\ndyatchin 57\nrege 57\nouargla 57\nginns 57\npapathanassiou 57\nlucrezia 57\nisroilova 57\njuillet 57\nr-fairfax 57\ncivilian-populated 57\nxiaoyun 57\nmcramerglobe.com 57\npsuv 57\ncreasey 57\ntelleldin 57\noltman 57\npost-football 57\nperiodontist 57\nus-school 57\nimpsa 57\nkahana 57\nlunch-bucket 57\nzahalka 57\nparty-going 57\nlc-gm 57\norumieh 57\nndongou 57\nludek 57\nrenowed 57\njunyao 57\nmost-admired 57\ncricket-aus-ind 57\noutpitch 57\natherosclerotic 57\nsalamao 57\ncapilla 57\nall-expense 57\nalganov 57\nvaslav 57\nal-qaissi 
57\nstoklos 57\nhortman 57\nultra-small 57\nncea 57\nfermenters 57\nbelik 57\nbarsuk 57\ncycad 57\nfgarcia 57\nhigh-carbon 57\nriska 57\npathogenicity 57\ncrenson 57\nvinyls 57\ntercentenary 57\nsledded 57\ntleiss 57\netat 57\ngoogleplex 57\nmessin 57\nsr# 57\npatkar 57\nsinaloan 57\ndjoumessi 57\nfritzky 57\nkanaana 57\npcv 57\navt 57\npuzzlers 57\nswellings 57\nhillshire 57\nsentani 57\nlivery-cab 57\nmartensson 57\nunroadworthy 57\njinwei 57\nlongyang 57\ngroenfeld 57\nderbent 57\nnihilists 57\nmaurizi 57\nturnhalle 57\nmixologists 57\nfrostiness 57\nselph 57\nprodigene 57\ntewodros 57\nmosquito-transmitted 57\nportch 57\narbib 57\nattarian 57\neffendy 57\nkason 57\nadvances# 57\njerges 57\nrockhouse 57\nandani 57\neight-city 57\nfurio 57\nyaswant 57\nmonona 57\nepiphanny 57\nsix-kilometre 57\nshaleil 57\nprotropin 57\nzients 57\nwindhorst 57\nabana 57\nalleghenies 57\nintra-state 57\nlalwani 57\nunstinted 57\nwarford 57\nflordia 57\ngalax 57\nmcgiffert 57\ncadereyta 57\nzeck 57\nrailamerica 57\nhttp://www.ford.com/ 57\nronetta 57\nfoodmakers 57\nyongyudh 57\nhalf-serious 57\ncoal-to-liquid 57\nday-use 57\nacclimatizing 57\ncharkhi 57\nphalaborwa 57\nmuthaiga 57\nbldp 57\nobnoxiousness 57\nesperar 57\nresistances 57\ntop-producing 57\natochem 57\nintiman 57\ndog-show 57\nhard-edge 57\nwolfhound 57\nvincent-st 57\ncampiness 57\nleibniz 57\nreligare 57\ntankful 57\nkertih 57\nlodal 57\nmbita 57\nforestalls 57\nimbecilic 57\nwatch\\/americas 57\nfondiaria 57\nsanyi 57\nmikhailo 57\npalumbi 57\ngermain\\/fra 57\nmg-### 57\nhunthausen 57\nsnappily 57\nshakhnazarov 57\njakabos 57\nchampassak 57\nchia-chun 57\nbassoonist 57\nkasambara 57\nparadies 57\narcore 57\nnewquist 57\nmocny 57\nkostelka 57\ndolina 57\nmcelrathbey 57\nprovince-based 57\naddum 57\nspritzer 57\nschoenholtz 57\nabbotts 57\ngay-related 57\nsuseno 57\nwreh 57\nal-forat 57\noutdraw 57\nkeleher 57\niraqen#### 57\nafrim 57\nsouaidia 57\ncardi 57\ninkjets 57\ngas-sipping 57\nscalf 57\npuddled 57\nkadidal 
57\nraymonde 57\near-shattering 57\nbishi 57\nzamano 57\nsabeh 57\nprinsen 57\nus-violence 57\nmillhauser 57\nmillion\\/euros 57\ntexmaco 57\nalltime 57\ncheik 57\ncomando 57\namaitis 57\nerskin 57\nmastersingers 57\nbursaries 57\nwimbley 57\nveni 57\ndolgorsvren 57\nsirait 57\nrinero 57\nmander 57\nnatiq 57\nparilla 57\nmewling 57\nanangwe 57\nomnifone 57\nshandler 57\nnow-ubiquitous 57\nkadokawa 57\nseven-kilometer 57\na.k.a 57\nsatterthwaite 57\nluusua 57\npila 57\nfarabee 57\noften-heard 57\ntembec 57\nfleabag 57\ndozen-plus 57\ntrichopoulos 57\nantrobus 57\nantlfinger 57\nzuendel 57\nzeevi-farkash 57\nthumbelina 57\nrestrictionists 57\nvraalsen 57\nradwaniya 57\nbergan 57\nspectating 57\nstrike-slip 57\nsetchell 57\nnew-era 57\ncoutries 57\nshaoqiang 57\nfact-checked 57\nfariborz 57\n###-billion-pound 57\nbiblioteca 57\nisoa 57\nslimeball 57\ngenaux 57\nletha 57\nfma 57\ntrouble-maker 57\nsonali 57\nanumnu 57\naavishkar 57\nr-pasadena 57\nwel 57\ntrailways 57\nkalpoes 57\nmarcelhino 57\nerythematosus 57\nyayha 57\nshenhar 57\nraheel 57\nyasnaya 57\nsung-kuk 57\ncorvalan 57\nhouston-galveston 57\ntouchier 57\nu.p. 
57\nre-live 57\nbarrington-coupe 57\njarosz 57\nhighest-flying 57\nunderuse 57\nolmecs 57\ncliment 57\nslabbert 57\ncommericial 57\nshalgi 57\npeattie 57\nwell-staffed 57\ncypriot-flagged 57\nfraccari 57\nrafel 57\nribo 57\nvamped 57\ntaffel 57\nlucheng 57\npugnaciously 57\nstrambach 57\ncgtp 57\ntapulous 57\nrockiest 57\nbienstock 57\nhigher-interest 57\nrusted-out 57\nsopped 57\nthird-from-bottom 57\npacholczyk 57\nkrasovska 57\nromulan 57\nsang-moon 57\nundammed 57\nncacc 57\nbrookhart 57\nvalbon 57\ngreenes 57\nkitchell 57\nzurick 57\ninsufficiencies 57\nmohamedou 57\ndaric 57\nmulticamera 57\nsunao 57\ny## 57\nplumelec 57\nntshangase 57\natwi 57\nyiotis 57\nbaoquan 57\nmuehlebach 57\nvradenburg 57\ndeliberateness 57\nridgelea 57\nsuperlotto 57\ngame-long 57\nhard-to-sell 57\npjm\\/gj## 57\nfrot-coutaz 57\nmiram 57\noil-on-canvas 57\nlaming 57\nstodginess 57\nrikrok 57\npresident-in-uniform 57\nobbo 57\namerican-grown 57\n#-matt 57\ncharnvirakul 57\nbaldomero 57\nlangfeld 57\nthroat-slashing 57\nhegemonist 57\nmolchanov 57\ntarkan 57\ngrimaud 57\ntreelike 57\nyi-chiao 57\ncaston 57\nshallman 57\nkandeh 57\n##-jarkko 57\nboy-king 57\ngermani 57\nsindian 57\noff-the-book 57\nv&s 57\nkishaba 57\nfertonani 57\nschear 57\ndrainpipes 57\nholness 57\npwilson 57\ndfler 57\nako 57\nbonannos 57\nmoonwalked 57\nbuzim 57\ntrapdoors 57\nskeeters 57\ncamshafts 57\nonair 57\npsncr 57\ncalton 57\nnyambuya 57\nspeedman 57\naltarpieces 57\nfat-laden 57\noffice-based 57\nend-time 57\nnavigenics 57\nanser 57\nknup 57\nkyung-ja 57\npatrimonio 57\nerda 57\ntimecards 57\nquencher 57\nnon-deployed 57\nanggoro 57\nbajas 57\niju 57\nkaggwa 57\nmiller-jenkins 57\nvisicalc 57\nmagdalo 57\njohannsen 57\ntassinari 57\ncat-like 57\nspectacled 57\nblauner 57\nstickwork 57\nexplosives-sniffing 57\nstavenhagen 57\nbocskai 57\n#-anna-lena 57\nbrebner 57\nmoehringer 57\ntamweel 57\ngold-leafed 57\ndrevna 57\nreidyglobe.com 57\nmennes 57\nripia 57\nnoongar 57\nre-appear 57\nmangosteen 57\nqueremos 
57\naarchs 57\nfoon 57\ncooz 57\njahurul 57\nsn####a 57\ndemiralp 57\nsmooched 57\ncols 57\nchapati 57\nresailed 57\nevensong 57\nluzhin 57\n##-plus-year 57\npresento 57\nnon-exempt 57\nwahyono 57\nservier 57\nmalayev 57\nneuro 57\ntemerko 57\nbeddoes 57\nmarquard 57\ncremonini 57\ncarnivalesque 57\npetrac 57\ntri-colored 57\ndebbie-ann 57\nbeaute 57\npreviously-announced 57\ndugovich 57\nthibaudet 57\noutboards 57\nbenhur 57\nkubelik 57\nschellenberger 57\nin-kook 57\npapan 57\nbdnf 57\nhaleva 57\nprecor 57\ntightly-contested 57\nhge 57\nasrar 57\npiriz 57\ntaitz 57\ntaita 57\nmedog 57\nbonners 57\ncaucausus 57\nuncustomary 57\nbalsamo 57\n##-card 57\ngood-vs 57\ninternational-class 57\nstrominger 57\ninterleukin 57\nfuneral-home 57\nsavall 57\neasynet 57\nkonculj 57\nfengdu 57\nenio 57\ncyber-criminals 57\nsoms 57\nwasikowska 57\nsemporna 57\n#to 57\ndamschroder 57\nmedstar 57\nbastin 57\ntno 57\nmunier 57\nrav-# 57\nribalta 57\nelmes 57\nflicked-on 57\ncricket-wc####-pak 57\nrehabilitator 57\nmedora 57\nkaddura 57\nyannopoulos 57\nsericulture 57\nclosed-captioning 57\nrigler 57\npre-injury 57\njandek 57\nchiri-yurt 57\n#-midnight 57\nkhorshid 57\nwitkoff 57\nfesten 57\naboutreika 57\njourdon 57\nhospitalisations 57\nsalonius 57\nsubnational 57\nbonadio 57\ntv-viewing 57\nlessee 57\nfikir 57\nwatani 57\nnewtok 57\nkintz 57\nin-network 57\nrepacholi 57\nsubert 57\nmoralize 57\ntruck-loads 57\nrocktober 57\nrecently-held 57\nsteubing 57\nmyfi 57\nharned 57\nalleles 57\nnesi 57\nreferential 57\nmanufacturing-based 57\nrzb 57\nsun-worshipping 57\ncaiu 57\nchi-keung 57\nmajdalawi 57\nlackner 57\narrangment 57\no'mahoney 57\nintermarriages 57\nzantzinger 57\nmargairaz 57\npatzelt 57\nchania 57\ngribakin 57\n##-star 57\nharuko 57\nlagunas 57\ntch 57\nex-social 57\nqorabi 57\ndebt-for-nature 57\nplam 57\nbeady-eyed 57\noscillators 57\nblood-related 57\nqualm 57\nconsumer-based 57\nleweck 57\nhigh-revving 57\nl'heureux 57\nmeramec 57\nfsa-eap 57\nlhernandez 57\ncarports 
57\nair-worthiness 57\nbolswessanen 57\nuberti 57\naver 57\n#-million-member 57\nsingleminded 57\nhurreh 57\nmoslem-oriented 56\nbenediktsson 56\naeolian 56\ngogitidze 56\nintrepids 56\nzebedayo 56\nshakhtyor 56\ngioiella 56\nllegue 56\ndraggy 56\ntkaczuk 56\nbalongan 56\nabullah 56\nmn-imj 56\nabdula 56\nvillepinte 56\nconcrete-filled 56\novercash 56\nastara 56\nmoskvich 56\nsemi-autonomy 56\ninvestment-linked 56\nself-tanner 56\nnurmukhammed 56\nnilesh 56\npitch-and-putt 56\nhaetzni 56\ncorn-soya 56\nhttp://www.redcross.org 56\npredominantly-muslim 56\nzigging 56\nryegrass 56\nsoft-cover 56\nfaibish 56\nreinsured 56\nmideast-israel 56\nkoshlyakov 56\nbig-hitters 56\nstalcup 56\namland 56\nmega-hits 56\nmid-stretch 56\nsauna-like 56\nantihypertensive 56\nmuskat 56\necd 56\nchornovyl 56\nshih-ming 56\ninerrant 56\nxiuqi 56\npseudo-scientific 56\nlangberg 56\nbungoma 56\nzainol 56\ntelecomunications 56\nescogido 56\nokkalapa 56\nchun-sheng 56\nsaefuddin 56\nrinus 56\ndcp 56\nmilitary-based 56\npetkovski 56\nserbian-led 56\nbernardsville 56\nfarkhar 56\nphosphors 56\ndutreil 56\nwragg 56\nhunza 56\nkhulani 56\nbrahima 56\nzakiur 56\nnooni 56\nalef 56\ngovernmnent 56\nhaiti-vote 56\nroro 56\nwaterfowls 56\nschoeller 56\nunsalable 56\nmcdonnel 56\nconflict-prevention 56\nmantz 56\nmajelis 56\nexultantly 56\npigeon-toed 56\nneiers 56\ntzemel 56\nnghimtina 56\nsindhis 56\nlippa 56\nsg-# 56\nal-yemen 56\noughton 56\nwash-out 56\npd-imj 56\ndestinee 56\ncanfor 56\nseyni 56\nbp-gm 56\njakkrit 56\n###:##:## 56\nrapace 56\nhabie 56\nex-slave 56\ntyisha 56\nfrou-frou 56\ngroeschel 56\npmb 56\nwyrsch 56\nidahoans 56\ntiantan 56\n##-million-us 56\nkootenay 56\nassasinated 56\nteruhisa 56\nkirundo 56\njoensen 56\nwedding-cake 56\nunderfinancing 56\nshapewear 56\nclinton-haters 56\ntetzchner 56\narja 56\nduchesnay 56\nfuxi 56\nradow 56\ntheatre-goers 56\nsign-carrying 56\nhorstman 56\ncastrati 56\nheat-generating 56\ngolfsmith 56\npendareva 56\nloubscher 56\nfumento 56\nwiniarski 
56\nxianrong 56\nzanchi 56\nalibi-ya 56\nf-bomb 56\nstonehouse 56\nzumbo 56\nsange 56\ngamidov 56\nexanta 56\ntake-or-pay 56\nmapunda 56\npropuesta 56\nthree-meter-high 56\nzafaryab 56\nhaloed 56\nsliven 56\nlepton 56\nalcohol-fuelled 56\noperationalized 56\nfairless 56\nfoot-tapping 56\npntl 56\nchoos 56\ndanisco 56\nkhazaal 56\nmacknin 56\nkera 56\nnon-believer 56\nexport-dominated 56\nchowdry 56\ntecnost 56\nmonceau 56\njmarmstrongdenverpost.com 56\nchunsheng 56\nccac 56\noakenfold 56\nktda 56\nrecord-length 56\ntoco 56\ngolog 56\npastorale 56\nzuberi 56\nmillion-euro## 56\ntrustor 56\ndelsener 56\nzx 56\ncollege-student 56\nultra-right-wing 56\nrunout 56\nabloom 56\nholovak 56\nscafidi 56\nassociaton 56\ngerman-brokered 56\ncents-a-share 56\ndhanawibawa 56\nperformance-driven 56\nsurburb 56\norso 56\nduodenal 56\nyellow-and-red 56\nkoehl 56\nboonton 56\nexective 56\n###-footer 56\nmusasa 56\nhttp://www.nbc.com 56\npig-raising 56\ntelo 56\ncassells 56\nsacko 56\nmonshipour 56\ndesk-bound 56\nsoft-top 56\ntunings 56\ntomasa 56\nlekhanya 56\nbreathalyser 56\nhmeid 56\nventolin 56\ngoldsby 56\ncoba 56\nsingkil 56\nadvertisment 56\nbarfoed 56\nd'ermilio 56\nesala 56\n#,#-dioxane 56\nzohreh 56\nkasatka 56\nhorita 56\nspado 56\nitvs 56\nchia-yuh 56\nnandy 56\nvercauteren 56\nguedj 56\nchimene 56\nmasaba 56\ncentury-oriented 56\nnovavax 56\nburn-in 56\nsnu 56\nmoscou 56\ngurrola 56\nlong-feuding 56\ngrails 56\njsk 56\nbachmans 56\ndeguchi 56\nwanchalerm 56\nunder-performed 56\npaisa 56\nsmap 56\nderamus 56\ndysphoria 56\npardede 56\nlunstead 56\nhouweling 56\nsolar-heated 56\nlalic 56\nbanchetta 56\ntursday 56\ntree-dwelling 56\nrightmire 56\ntradicion 56\ntae-yong 56\nsayrescoxnews.com 56\npaku 56\nmubang 56\nmoseyed 56\ncarbon-reduction 56\nphilippine-born 56\nevent-planning 56\nlong-fought 56\nshamsudin 56\ngeninho 56\ncoatless 56\nhardy-garcia 56\nand-white 56\nkuchov 56\nshoebox-sized 56\npenalty-killers 56\nvalue-added-tax 56\ndispenza 56\nben-yehuda 
56\nyellow-and-green 56\nbabaloo 56\ngoyas 56\nthen-partner 56\nmoulty 56\nsachio 56\nover-the-horizon 56\nbtp 56\n###t 56\ncalagna 56\navions 56\ncopepods 56\nfrench-ruled 56\nphilomel 56\nosmo 56\niconix 56\nthomastown 56\nschoerghofer 56\nwitchdoctors 56\nryoung 56\nwire-fraud 56\ntynesha 56\nfreas 56\ncpw 56\nunc-wilmington 56\nwater-born 56\neket 56\nloudhailer 56\nfilipp 56\nkillick 56\nklimenko 56\nkarppinen 56\nkuwait-politics 56\nstollsteimer 56\ntrussell 56\narbesfeld 56\nalfama 56\ncaithness 56\nmonetti 56\narkle 56\nzhoima 56\nchoue 56\nvulgamore 56\nrakowitz 56\nprovencio 56\nbeswick 56\nre-financing 56\ninnocentive 56\nlahoti 56\nbertholle 56\neui 56\njostens 56\nindecisively 56\ngamov 56\nkalis 56\ntoughest-ever 56\nmi-jin 56\ninterwar 56\nzari 56\nwoodroffe 56\nbaixing 56\nisabekov 56\nimportations 56\nhedvig 56\nname-your-price 56\nvuai 56\ncarkner 56\njmf\\/ml 56\nrotarian 56\ncerra 56\nfour-over-par 56\nvisitantes 56\nstautner 56\ntiu 56\ninside-baseball 56\nchakravarthi 56\nklokot 56\nmisstates 56\nenap 56\nweld-cellucci 56\ntahara 56\nbolender 56\nsuettinger 56\nkaradassiou 56\nisak-muivah 56\ngovernates 56\nbuonomo 56\nasia-focused 56\nstatesville 56\ncalgon 56\nyonemura 56\nfazlic 56\nsohlberg 56\nbogale 56\nall-arounder 56\nrou 56\nswint 56\nkendall-smith 56\njamaar 56\ngurfinkel 56\nkargar 56\nre-invented 56\nsarda 56\nwitharanage 56\nnon-college 56\nlohas 56\nforemothers 56\nvent-free 56\nnarcotic-drug-related 56\nyashar 56\ncompounce 56\nbotto 56\ngeddie 56\nmarchman 56\nouest-france 56\ndrees 56\nbusiness-services 56\npre-selection 56\nvaher 56\njocko 56\nsaint-quentin 56\namerenue 56\nlindenmuth 56\nulus 56\nmbala 56\nhusic 56\nyl###-### 56\nfollicular 56\nmost-loved 56\nncafp 56\ncgx 56\npeaden 56\npingeot 56\nfusen 56\nphaiboon 56\npre-marked 56\nheilbroner 56\nlongyi 56\nnr\\/dj## 56\nnashwan 56\nkathuria 56\n##-wayne 56\nway-station 56\nvisitor-friendly 56\ngenerative 56\nunprejudiced 56\nwolfberry 56\ncashwell 56\ntwo-building 
56\nmadryn 56\nakwesasne 56\ncommunity-college 56\nlieff 56\ncoalhouse 56\nsuncare 56\npassamaquoddy 56\nkanstantsin 56\namep 56\nnews-making 56\nrahe 56\ncampione 56\nneiwand 56\nmunabao 56\npotti 56\nmachicura 56\nwwor 56\nvermeersch 56\nschmoekel 56\nchiffonade 56\nfashola 56\nseec 56\ncentre-backs 56\nkremlinologists 56\nakuressa 56\nneuropsychiatry 56\ns-##c 56\nmutaa 56\nsociobiology 56\nultra-premium 56\ntwirler 56\nxiaojin 56\nfree-jazz 56\naudrina 56\nnelon 56\nseconds\\/### 56\ninquisitions 56\nlevel-four 56\nvertigo-inducing 56\nmugambage 56\nbondue 56\nyuk\\/leung 56\nedelnor 56\nwascher 56\nside-footing 56\ngiresun 56\nenfoque 56\ntorro 56\nrrps 56\nkirsh 56\ncrimefighter 56\nimporta 56\nlonger-acting 56\nthakurgaon 56\nsufferance 56\ncricket-wc####-aus 56\nfluconazole 56\nwaltzer 56\nstz\\/ea## 56\ngbm 56\n##-stephen 56\nurey 56\nmedine 56\npelloux 56\narkans 56\nbizzare 56\nsiegbahn 56\ncarnelian 56\nrueing 56\npantomiming 56\nduckweed 56\nhofmeyr 56\nnastas 56\npennymac 56\npeairs 56\nstr-ti-jbm 56\nsuharjono 56\nbarcenas 56\nspit-shined 56\ncoch 56\naldebert 56\nchups 56\neckl 56\nczaja 56\nredken 56\nduffie 56\njordanaires 56\nimpudently 56\nzingaro 56\nundistorted 56\ntoolik 56\nkammerhoff 56\ndecriminalisation 56\nclayoquot 56\nleblon 56\npakistan-missile 56\ngertrudis 56\nsnooker-gbr 56\nal-jarba 56\ntumwine 56\nmorinigo 56\nbacre 56\n##,###,###.# 56\nstaleness 56\n##hours 56\namni 56\nlinenthal 56\nkleinsasser 56\nbepza 56\nsaransk 56\nkhermanstatesman.com 56\nrockwellian 56\nnieuw 56\nsolomonyan 56\nmangahas 56\nsewering 56\nmozartean 56\n##-square-kilometre 56\nbhu 56\nlampasas 56\ncoagulate 56\ndinaburg 56\ndairy-free 56\none-bath 56\ndjibrill 56\nfischer-boel 56\nfour-button 56\nlemon-flavored 56\nbipeds 56\nhorneber 56\nmini-movies 56\ntoplin 56\nsilang 56\nbarberton 56\nboukensa 56\nfernado 56\nlamberty 56\ngratings 56\nmarban 56\nilala 56\nl'avenir 56\nnizhni 56\nonepass 56\nd&b 56\nilogho 56\nbowral 56\nlhr### 56\ncabaniss 
56\ncamera-friendly 56\nlangerman 56\nplattekill 56\ntodesca 56\ncantorial 56\nimport-dependent 56\noverspenders 56\nelsen 56\nnovorossisk 56\nhansabank 56\ntaip 56\neverding 56\nkilinc 56\nallieu 56\nkippy 56\nkasputis 56\nrrp 56\ndarling-hammond 56\nbitkom 56\nholtzhausen 56\nkin-chung 56\nsinosat-# 56\nb'gosh 56\nafroz 56\nsolomou 56\nproegler 56\non-sang 56\nmotorcyles 56\neasyknit 56\ncross-pollinating 56\nambeyi 56\nlavaggi 56\nconfessors 56\naviion 56\n###-######## 56\niraq-unrest-qaeda 56\njanowitz 56\nofari 56\ntato 56\nreputacion 56\ngading 56\nunivac 56\niniative 56\nal-amal 56\nsahdan 56\nculloty 56\nmariella 56\nboigny 56\npartyka 56\njuda 56\nvierma 56\nmajorette 56\nnews-stands 56\nashti 56\nsarria 56\ndonadze 56\nhuaqiang 56\ndej 56\ndeddy 56\nsobaru 56\nself-invention 56\nmurley 56\nneurotically 56\nsmolar 56\npiccarreta 56\n#og 56\nflowerbeds 56\nbc-ap 56\ninseparably 56\nprezioso 56\nropeik 56\npumpido 56\nhypothesizes 56\nwuz 56\nroulade 56\nhttp://www.genoa-g#.it/eng/index.html 56\neu-mideast 56\npaddleboard 56\nbone-white 56\nderschau 56\nburika 56\nboarding-school 56\nsenitt 56\nre-equipping 56\nmulvoy 56\nretinues 56\nbresse 56\ngreentown 56\nwww.insidesocal.com/tv/ 56\nsinaia 56\nraupp 56\nnspo 56\novertopping 56\nshindell 56\nbahk 56\njeremain 56\nkurta 56\nmargarite 56\nebbesen 56\nkise 56\ncoooperation 56\nbrochtrup 56\nkenya-climate 56\nmusic-themed 56\nrahimpour 56\nsamory 56\nbusiness-wise 56\nabadie 56\nshur 56\neuro-skepticism 56\n#,###-kilometers 56\nmystery-shrouded 56\nbulker 56\novh 56\nnear-vertical 56\nkhash 56\nespriella 56\nunef 56\ntorbor 56\ndigene 56\ndishonoured 56\ningvard 56\nandis 56\no'donley 56\ndata-serving 56\nlakhubhai 56\nleccion 56\ncommon-man 56\npre-test 56\ndiht 56\naegina 56\nyoungtown 56\ntalkathon 56\najirawit 56\nstonerside 56\noly-####-advisory 56\nnon-baseball 56\nlusail 56\ngiacoletti 56\nwomans 56\nchelsom 56\nlancing 56\nnagyz 56\nstriatum 56\nnahel 56\ncoinages 56\nkleins 56\nmulherin 56\nestebanez 
56\nandolina 56\non-coming 56\nhazels 56\nautomotives 56\nborai 56\nbeiteddine 56\nsu-lin 56\nal-siyassa 56\nlequi 56\nmcmorran 56\npitcher-friendly 56\nnitch 56\nrauer 56\nd'arrigo 56\nhaziness 56\ndraft-night 56\nmaokola-majogo 56\ncuppers 56\nsteinhoff 56\ngushiken 56\nmoppet 56\npotebenko 56\nfishtailed 56\namedo 56\narbois 56\npasturing 56\nharba 56\nauthorties 56\nigber 56\nafghan-international 56\nmyanmar-protest-monks 56\nfresh-air 56\niigep 56\nwolke 56\nshock-jock 56\nsmilin 56\nguguletu 56\nderegister 56\ncoorperation 56\neckstrom 56\nguelperin 56\nlobed 56\nvartanian 56\nsponged 56\nupledger 56\nkalat 56\nbrocato 56\nwuertz 56\nlongchamps 56\nhambastegi 56\nwunderkinder 56\nlongans 56\nshe-devil 56\nfrueh 56\nambridge 56\nlessner 56\nkiddos 56\nkuypers 56\ndaair 56\nbg-acw 56\nclub-swinging 56\nszulik 56\nquizas 56\nmaxxi 56\nprojectionists 56\nvujin 56\nflyertalk.com 56\ngurin 56\nprolinea 56\nvivra 56\njurijs 56\nus-hollywood 56\nmirretti 56\nwildt 56\n###percent 56\nsylvers 56\nhoerster 56\np-### 56\nmdluli 56\nslipup 56\nzaozhuang 56\nkazakhstani 56\nfurtwaengler 56\nmuffat 56\nbusic 56\nviamonte 56\nmontsame 56\ntuszynski 56\nmickiewicz 56\nvarathep 56\nfreak-show 56\ntampakan 56\nafterburner 56\nearly-#### 56\nkerchove 56\nrasanen 56\nsimonas 56\nhigh-sounding 56\nkibbutzniks 56\nus-colombian 56\nantiqued 56\nheatherington 56\namdo 56\nside-scan 56\nbirlik 56\nspigarelli 56\nnon-nordic 56\nimpact-resistant 56\numran 56\nchalke 56\nnassari 56\nfreiman 56\njaar 56\nsoylent 56\nsinjhuang 56\nfour-hectare 56\ncontinuing-education 56\nmimo 56\ndown-ticket 56\nfiercely-contested 56\nlevantine 56\nshamie 56\ng\\/t\\/f 56\nautoridad 56\nprizing 56\nbaywalk 56\nmocatta 56\ntea-to-steel 56\npregunte 56\nirascibility 56\ntowles 56\nchina-romania 56\nhamengkubuwono 56\npadire 56\naldam 56\ncoinsurance 56\nauret 56\nignas 56\ndisco-era 56\nminette 56\nnurse-midwives 56\nsmolian 56\njaffery 56\nhochschorner 56\nrcastro 56\nfnla 56\niyanla 56\nplant-derived 
56\nkrupp-hoesch 56\nnight-blooming 56\nfood-grade 56\nfuenmayor 56\nglaciation 56\nhandmaidens 56\nkosnatcheva 56\ndll###-### 56\nmulyo 56\nct## 56\nqfc 56\nhigher-yield 56\ndeyana 56\nunderdiagnosed 56\nwojtala 56\nchristmas-season 56\ncoitus 56\nestey 56\nashenfelter 56\nhapal 56\ntawatchai 56\ncambron 56\nkasereka 56\nperiwinkles 56\ncraciun 56\nclabo 56\nglashow 56\ndanwei 56\nluwero 56\nblythedale 56\nkadyr 56\nyoung-me 56\nflash-floods 56\nspla\\/m 56\ncross@globe.com 56\npues 56\nhounddog 56\nbulimics 56\nmingming 56\ninsa 56\nvideocam 56\npaoua 56\nhttp://www.xerox.com 56\ncar-jackings 56\nmarzban 56\ncattolica 56\nkatri 56\nkanada 56\ndacula 56\nfaurel 56\nzaytun 56\nautonomy-minded 56\nshauri 56\ndefrayed 56\nsanhe 56\nhowison 56\nedurne 56\nstates. 56\nissue-by-issue 56\nmicroturbines 56\nmacijauskas 56\niberville 56\nyizhar 56\ncocodrie 56\nwats 56\nustc 56\nwalta 56\nethereally 56\ncookstown 56\ndissociating 56\nonce-safe 56\nvishwanathan 56\nwince-inducing 56\nzire 56\narpa 56\nal-shurta 56\nluneta 56\naldermaston 56\nhttp://www.rnc.org 56\nhosono 56\nqiqiu\\/zhao 56\njacinthe 56\nhttp://www.amrcorp.com 56\nderbyshires 56\nweiyang 56\nhouze 56\npirata 56\nbugiri 56\nlafakis 56\nvika 56\ndismas 56\ncardio-thoracic 56\nthrashings 56\nreagle 56\nus-attacks-guantanamo 56\nhwwa 56\nundergirded 56\nportale 56\nscandalizing 56\nbrezovica 56\nzebadua 56\ndrizin 56\nreya 56\ntubewells 56\nskink 56\nbabushka 56\npre-judging 56\nschutzman 56\nwroclawski 56\nzaca 56\ndaviz 56\nmoree 56\ndiamantis 56\nformer-soviet 56\nislam-based 56\nennes 56\nalysia 56\nsoloveitchik 56\nnegar 56\ndamselflies 56\ntried-and-tested 56\nhebard 56\nslipstream-chipotle 56\nmeraklis 56\nso-call 56\nsamarn 56\npink-hued 56\nhieronim 56\nkonadu 56\nfradulent 56\nmanchild 56\nskube-column 56\nbouder 56\npermitir 56\nseawolves 56\npouladi 56\ncoronell 56\nfolino 56\njuravich 56\naquacultural 56\ndono 56\nstrongest-ever 56\nsub-four-minute 56\nshu-hung 56\nhakkar 56\nkotex 56\nrelink 
56\nmultiethnicity 56\nregio 56\npomar 56\nazzopardi 56\nkelmon 56\ntihany 56\nshisun 56\nwater-repellent 56\nbumgardner 56\nclark\\/donna 56\nrevenge-minded 56\ngrcs 56\nmcphie 56\nsmall-molecule 56\nzorica 56\nchavit 56\nbaselga 56\nsirima 56\nzegas 56\ntalis 56\niner 56\nracette 56\nazema 56\nfuhe 56\ncaramagna 56\nstrenghtened 56\nlavage 56\ngiannichedda 56\ngurbuz 56\ndegiorgio 56\nkilolitres 56\nescolero 56\nantespend 56\nswash 56\nunoaked 56\ngimondi 56\ndirt-road 56\ndaguin 56\nimputation 56\norientalist 56\nneigh 56\ninconstant 56\npenaherrera 56\nshambhala 56\nravishingly 56\nikrema 56\nmythologizing 56\ngwee 56\nlousiest 56\nzirkin 56\ncross-regional 56\ncampagne 56\ndumarsais 56\nmansilla 56\nquintus 56\njurikova 56\nstickered 56\nysern 56\ndjebbour 56\nlaw-firm 56\nquark-gluon 56\nforbush 56\nnon-responsive 56\nplay-it-safe 56\nmoisture-laden 56\ncomradely 56\norito 56\nfbl-fra-lcup 56\npaucke 56\nmuji 56\nfertilizer-based 56\nearly-to-bed 56\nwoodcrest 56\ndanto 56\nonne 56\nperi-urban 56\nkayiranga 56\njianbo 56\ncorrespondant 56\nvankor 56\nlambing 56\nmephedrone 56\nicglr 56\nnon-lawyers 56\nopen-handed 56\nmusetta 56\nkhanty-mansiisk 56\ncarouser 56\nsrebnick 56\nunkown 56\nfrankland 56\nby-catch 56\naeromedical 56\nmiddle-tier 56\nknust 56\nnatsagiin 56\ngressly 56\nlong-coveted 56\ngolf-epga-por 56\nthanat 56\nivermectin 56\ngarbelotto 56\nmedin 56\nlamoreaux 56\nciwujia 56\niacoboni 56\nlipin 56\ngundegmaa 56\nnkhoma 56\nitek 56\ncebekhulu 56\nserigne 56\ntsumura 56\nradio-show 56\nmirman 56\nkelder 56\ndesvonde 56\nelongation 56\nshot-for-shot 56\nloofah 56\nsterno 56\nhrabal 56\nkumon 56\nerinle 56\nworkmate 56\nzuendt 56\ncomputer-software 56\ndervishi 56\nmuhieddin 56\nsasono 56\nsecond-week 56\nsliti 56\nwinterrowd 56\nex-test 56\nentomological 56\nkooiman 56\nakhmedova 56\nstepanovic 56\nkriste 56\nalfond 56\nqed 56\nfullone 56\ngold-encrusted 56\nfireballer 56\nbackbenches 56\nanti-minority 56\nparakh 56\non-street 56\nmajolica 
56\nresponde 56\ngeraldton 56\npolyot 56\ntaleqani 56\nhand-knitted 56\npaleozoic 56\na-ram 56\noil-tainted 56\nmatuidi 56\nsanest 56\nmiddle-schooler 56\nwrightsman 56\ngerdano 56\nnozadze 56\nimm 56\nsholar 56\nplea-bargained 56\nhttp://www.usccb.org 56\nshikata 56\n##-kevin 56\ntejgaon 56\ntwo-and-a-half-month 56\nmuch-younger 56\nvaill 56\nkrach 56\naudah 56\nbullis 56\nmaharajahs 56\nunadilla 56\nuzbekistani 56\nairconditioners 56\negolf 56\nlongheld 56\noriflame 56\nsportcoat 56\ncuttaree 56\ncressend 56\nexculpate 56\nre-shuffle 56\nsinosure 56\nkartono 56\nsupergrass 56\nwalley 56\nnouzaret 56\nwell-adapted 56\nchupacabra 56\nschoenbaum 56\nuser-friendliness 56\nshorabak 56\nzaiqing 56\nsluga 56\noegroseno 56\ncloer 56\nwaterlily 56\nnisreen 56\njunor 56\ncha-ching 56\nrocknes 56\nkejuan 56\nhofmannsthal 56\ngoldhirsh 56\nh.i.v. 56\nherschbach 56\nrajakarunanayake 56\npottenger 56\nhumanplasma 56\nbentler 56\nopi 56\ntroglitazone 56\nborgas 56\nayatskov 56\nprincetonian 56\nwingard 56\nkashmir-based 56\nchaiet 56\nexcretions 56\neydie 56\nmarthastewart.com 56\nglassine 56\nhenkes 56\nshikuku 56\nngarlejy 56\nmeversley 56\nkunsthistorisches 56\npro-stadium 56\ncarrel 56\nweilerstein 56\njamiruddin 56\nflashiness 56\n#.##-trillion 56\ndubee 56\nknickerbockers 56\nhedong 56\nyung-lai 56\ndone-that 56\nweisler 56\nprotectmarriage.com 56\nassitance 56\nkadewe 56\nnear-war 56\nwigstock 56\nresonator 56\nneu-ulm 56\ntaimali 56\norgan-transplant 56\ncompositing 56\nfantagraphics 56\nscharioth 56\nredleaf 56\nhtil 56\nmideast-gaza 56\nmanasieva 56\ngwang-soo 56\nyannotti 56\nunclassifiable 56\nnagarajan 56\nbanisar 56\ndevroy 56\nnangka 56\nhouse-sized 56\nmefraj 56\nlake-side 56\npeachcare 56\nkagah 56\napostrophes 56\nframe-ups 56\nhainaut 56\nnoiseless 56\nchiado 56\nsalesgirl 56\nborovic 56\nkaouch 56\nkirkaldy 56\nnon-automotive 56\nprokopek 56\nthree-event 56\naseanapol 56\nelfatih 56\nal-osboa 56\nsanty 56\ngerrick 56\nuntallied 56\nbushings 56\nturn-over 
56\nfive-length 56\nafanasyevsky 56\nmeiselas 56\nyugoslav-made 56\n##-quart 56\npegu 56\nriverwoods 56\ntake-back 56\nremote-detonated 56\noil-reliant 56\njd\\/nr## 56\nerinvale 56\nbrookhiser 56\nghirardi-rubbi 56\nmedsger 56\nmesilla 56\nmaysa 56\ncarter-finley 56\nbiver 56\nsurkh 56\ntoy-related 56\npring 56\njjh-eap 56\nn-#### 56\nsciarrino 56\nmoba 56\nfalcioni 56\nfuria 56\nfranz-christoph 56\nliguera 56\nherreshoff 56\nbjelkevik 56\nidoko 56\nhelander 56\nbby 56\nmatrika 56\nmilas 56\nkosak 56\nforteo 56\nnon-kenyan 56\nfretes 56\ncienaga 56\nq###s 56\nchia-yen 56\nworrywart 56\nprofessional-grade 56\nestudillo 56\ndeficit-fighting 56\nmazzulla 56\nbie-shyun 56\nbrick-sized 56\nrobeco 56\ngoskowicz 56\nd'hiver 56\ndewes 56\ntoppo 56\nsacb 56\ngolden-capped 56\nboudreau-gagnon 56\ndeflectors 56\nsauzee 56\nketz 56\nfootbal 56\nmenaka 56\njerred 56\ntugendhat 56\nsutee 56\nsemi-secret 56\ncftr 56\nschabarum 56\nshirani 56\nodon 56\ncosey 56\nhttp://www.aflcio.org 56\ndoorbuster 56\ndimmock 56\ncref 56\nkrusee 56\ngruwell 56\npropellors 56\nbrodovitch 56\npoutine 56\ndunhua 56\nlese-majeste 56\nepicurious.com 56\nlate-april 56\nniha 56\nlifescan 56\nmellette 56\nguangxiang 56\nshawali 56\nsagia 56\neu-eurozone-economy-growth 56\nsuzerainty 56\nwheatgrass 56\nadir 56\nman-bok 56\ncustomizer 56\naccordionists 56\nperspicacity 56\nwalusimbi 56\nvalenciano 56\ndevelopping 56\nmatellan 56\npropiedades 56\ntopa 56\nmirken 56\ncalvins 56\ntownfolk 56\nself-motivation 56\n##-nicole 56\ntear-streaked 56\nlin\\/wang 56\nshibboleths 56\nueslei 56\nkleier 56\nunionville 56\nmohtarma 56\nesmeijer 56\nsoloed 56\nsandschneider 56\nsuper-long 56\npayal 56\nvictim-impact 56\ncarrs 56\nbufalino 56\npavlovski 56\ndemayo 56\nhttp://www.nmb.gov 56\nchen-yuan 56\nwine-and-cheese 56\nperepelova 56\nsaddiqi 56\nunsafely 56\nd'amiano 56\nhenochowicz 56\nlesperoglou 56\nvandewater 56\nsportsweek 56\nfankhouser 56\njayer 56\nkarres 56\neconomise 56\nmasopust 56\ndessie 56\npoona 
56\nassouw 56\nrebated 56\nsaimone 56\nethno-religious 56\nalbany-area 56\npeiyao 56\nhighly-efficient 56\nnjie 56\nvernand 56\nbore-hole 56\nlundine 56\nfoxcroft 56\nsvoik 56\ndelyagin 56\nbodas 56\nnon-qualified 56\nditrapano 56\nal-gizouli 56\ngiorgianni 56\ntalab 56\ncaribbean-weather 56\ncross-bred 56\nconair 56\nhttp://www.njusao.org/break.html 56\ngarlock 56\nkarmakar 56\nthe#### 56\ncheek-to-cheek 56\njawiya 56\ngoldsbury 56\npelajaran 56\nshults 56\nkalala 56\nzaetta 56\ndaubert 56\nrisd 56\nbrunhoff 56\nciment 56\nblunden 56\ncall-to-arms 56\nkhirbet 56\nseliverstova 56\nvadhana 56\npeochar 56\nre-development 56\neight-sided 56\nmagam 56\nveran 56\ntax-dodgers 56\nluana 56\ncattle-rustling 56\nraia 56\npb-jp 56\nnon-labor 56\nmid-##st 56\nrussian-occupied 56\nhttp://www.freddiemac.com 56\nnowgam 56\nhiroyoshi 56\nshenlong 56\nrafidiyeh 56\nnjoki 56\nodoriferous 56\ngoldoni 56\niws 56\nmabou 56\nozio 56\nchiuri 56\nfirst-of-its 56\nciders 56\nby-now 56\nsecond-winningest 56\narix 56\ngsd 56\npro-yugoslav 56\nertugruloglu 56\nnissay 56\nguaita 56\njefferys 56\n##-straight 56\nspielbergian 56\ncomplexly 56\nphumaphi 56\nbaronet 56\nbrulliard 56\nherrara 56\nconwood 56\ncontin 56\neastent 56\niwashita 56\nproforma 56\nguiyu 56\nwenn 56\ntime-cnn 56\nre-regulate 56\nmyaungmya 56\ngrapplers 56\nchimura 56\nganchev 56\ntwerp 56\nyukking 56\npflaumer 56\ndeloatch 56\nal-jabali 56\npaech 56\ndelimited 56\npanola 56\nclottemans 56\nneigbors 56\ngruzen 56\nchile-quake 56\nmilkmaid 56\nooooo 56\nravenscroft 56\ndarbar 56\nwhitely 56\ntumpat 56\nsandile 56\nwxia 56\niwork 56\nmcmahons 56\ngawd 56\nengles 56\nwork-based 56\nchina-unrest-tibet-rights-oly-#### 56\ntrembler 56\nnybz### 56\nfengshui 56\nmoalin 56\nperumal 56\nsopon 56\ntae-bo 56\nghosting 56\nchanggwang 56\nkecil 56\nkjolstad 56\nkopel 56\nbone-building 56\nju-on 56\ndogaru 56\npalframan 56\nbauduin 56\nueb 56\ntime-space 56\nfellow-citizens 56\npapahanaumokuakea 56\nsegment-leading 56\nfuel-hedging 
56\nshokir 56\ncoffeeshops 56\nadvantica 56\nmarcelina 56\nrecirculate 56\nlead-zinc 56\nlankan-born 56\ncnbc.com 56\neffros 56\nkeast 56\nindecorous 56\nostrov 56\nlow-demand 56\ngrafer 56\nkaigler 56\nyniguez 56\ntayyaba 56\nwoodcraft 56\nk.p.s. 56\njarwan 56\nyamasoto 56\nl'carriere 56\nruttenberg 56\nrevver 56\n#-gisela 56\nlong-shuttered 56\nwestshore 56\nrendina 56\nmoscardi 56\nyoani 56\ndungu 56\nhttp://www.mtv.com 56\nlupa 56\nmoney-lending 56\nfutagawa 56\nal-majd 56\nsarokin 56\nmasefield 56\nhatemongers 56\n#####-#####-# 56\nci-### 56\nguderzo 56\nconnoting 56\ngalavision 56\nloggans 56\nfiorucci 56\nre-inspected 56\nlemenager 56\ncovadonga 56\nchronometer 56\nprodon 56\nchandraswamy 56\ngatlif 56\nmoevenpick 56\ngiaccone 56\ncharanga 56\narti 56\nclinton-appointed 56\nextrasensory 56\nmartaban 56\nessaye 56\nflorentines 56\nmctague 56\nderozan 56\nbmet 56\nakher 56\ncounter-corruption 56\nnotepaper 56\nluzinski 56\nskanky 56\nunharvested 56\nkadosh 56\ntaimyr 56\nmartial-law 56\nshoreditch 56\nsomersworth 56\nmass-scale 56\nlobotomies 56\nfrates 56\nal-gailani 56\ncuito 56\nbiters 56\nfiszmann 56\nmurati 56\nvouilloz 56\narscott 56\ngoshutes 56\nfouhami 56\npalestinian-palestinian 56\nhellmer 56\nmaryinsky 56\ng-class 56\ncvid 56\nhilman 56\nus-japan-space-shuttle 56\nvallop 56\npripyat 56\ngenencor 56\nnightshade 56\nkielar 56\nginanjar 56\nbotan 56\nsukamdani 56\ntemizkanoglu 56\nberezovksy 56\nlithographic 56\ncolbie 56\ncourchesne 56\nnewsbreak 56\nkwamie 56\ncharacterising 56\njoyriding 56\nghilarducci 56\nmortham 56\nshaken-baby 56\nglorieta 56\nfengyun-# 56\nleibacher 56\ngardening@nytimes.com 56\nchristenings 56\ndiscombobulating 56\nambion 56\nstolyarov 56\nlastuvka 56\nultra-tight 56\nxingye 56\nterrariums 56\ntreaty-based 56\nedmon 56\nmangena 56\nscopolamine 56\nnobes 56\nlong-separated 56\nacls 56\nskamania 56\negx## 56\ndahduli 56\nmotech 56\nintitial 56\ncaffita 56\ngirl-group 56\nhalf-expected 56\nlinette 56\nsoldati 56\npend 
56\nhopoate 56\npre-castro 56\nlulin 56\nhttp://www.usccb.org/ 56\nnon-convertible 56\nabu-zayyad 56\nstep-brother 56\nartux 56\nyusanto 56\nbestiary 56\ndebrum 56\ndingers 56\nkhamzat 56\nridpath 56\nadex 56\nanthracnose 56\nten-week 56\nhelbrans 56\nshoehorning 56\n:####### 56\nparide 56\nboosterish 56\ngoldwin 56\npigeonholes 56\nraimunda 56\nhowden 56\nevangelized 56\nmixups 56\npanter 56\no.h. 56\nrockrose 56\nengland-born 56\naugustow 56\nlessel 56\nkasane 56\nsudeikis 56\ncontentville 56\ncleaning-up 56\nnatan-zada 56\nmanawatu 56\nunreviewable 56\nwitticism 56\nternus 56\nuscirf 56\nrijsbergen 56\nrakad 56\ncruciferous 56\ndundes 56\npirozzi 56\nkreegel 56\nbudgie 56\nclijster 56\nbrou 56\nshaywitz 56\nlopano 56\nhelissio 56\nbolshunov 56\ntennstedt 56\nanhua 56\ndimiter 56\nmosquito-control 56\nhearts-and-minds 56\npretre 56\ntvind 56\nabednego 56\nborn-and-bred 56\nzingre-graf 56\nlonghai 56\ninniger 56\ndiaper-changing 56\ncorleones 56\nrunge-metzger 56\ntangdhar 56\nthibadeau 56\nfinancieele 56\nchassin 56\npolarities 56\nchi-x 56\nhelvey 56\nvetters 56\npazzi 56\nuniverity 56\njail-house 56\npegasystems 56\nchengshan 56\nofakim 56\na-lister 56\njackfruit 56\nkissy 56\nruocco 56\npaleobiologist 56\nbobosikova 56\nal-rayes 56\ndornod 56\nas-expected 56\nzadari 56\njavadi 56\ndynatech 56\nsilantiev 56\nsonaecom 56\nkerosine 56\nakito 55\nbenedictions 55\nmamounia 55\ncharco 55\nblue-colored 55\nval-de-marne 55\nantipov 55\nnagu 55\nmueller-wohlfahrt 55\npratapkumar 55\nmultirole 55\nsquibbed 55\nseroczynski 55\nsatcom 55\nyaacobi 55\nweihong 55\nwhitesnake 55\nuninflected 55\npabbo 55\nrunyonesque 55\nsloviter 55\nsperrazza 55\nrimland 55\nstanden 55\nechouafni 55\nthanakorn 55\nokutan 55\nolim 55\nnadeam 55\ndolla 55\nchaudhri 55\nhediger 55\nibwc 55\ncraigslist.com 55\nsub-office 55\nal-luhaibi 55\nboutris 55\nharshman 55\nbuhain 55\nnurhadi 55\nfood-loving 55\npluots 55\ngeometrics 55\npedercini 55\ncouvrette 55\nrlopez 55\ngreystoke 55\nstandbridge 
55\nafilias 55\nex-strongman 55\nfator 55\nsomerwill 55\nworthley 55\nkarbanenko 55\namchitka 55\nthadeus 55\netemad-e-melli 55\nsafecracker 55\nonce-elegant 55\ngonyea 55\nanti-islamist 55\ndabanovic 55\naljabr 55\nozcelik 55\nsmutny-jones 55\nfa'asavalu 55\nsearby 55\npengrowth 55\ncuecat 55\nmininum 55\nshkirko 55\ngolf-club 55\nmuzzafarabad 55\nbozsik 55\ndiscerns 55\nrothbaum 55\nayila 55\nscandanavia 55\nlewmar 55\njelloun 55\ndragnea 55\npole-dancing 55\ntakanyi 55\ncustomshouse 55\nkritzer 55\ngomelauri 55\ny.v. 55\noff-airport 55\ndelfs 55\npoints-scoring 55\njupin 55\neliska 55\nrainman 55\nrenovator 55\nlaroussi 55\nosmus 55\ngegamian 55\nbantadtan 55\nvoletta 55\nyussupova 55\ntottenville 55\ndascalu 55\n#-los 55\nmetabolizing 55\nkremlin-orchestrated 55\ngastro 55\nkhayyat 55\ntalara 55\nzizzo 55\nsumaysim 55\nb-level 55\nsecularisation 55\nforeign-ownership 55\npuning 55\nstoos 55\nmoun 55\nsidener 55\nfootmen 55\nkamioka 55\nnazy 55\nvalentierra 55\nlebovitz 55\npalanga 55\nenglish-khmer 55\nsecularize 55\ntousignant 55\ndevelopment-related 55\njoram 55\npositano 55\nhand-crank 55\nlaser-beam 55\noutfalls 55\ngps-enabled 55\nmaranda 55\newatch 55\nexecutive-secretary 55\nsaira 55\nradom 55\nwagoneer 55\nromack 55\nbiomolecular 55\nmarsudi 55\nvernier 55\ncasualization 55\nnon-intrusive 55\nidefense 55\no.b. 
55\ncredit-monitoring 55\nmunante 55\nmolehills 55\nseelbach 55\nswiger 55\nbado 55\ndppc 55\ngremolata 55\nnon-parents 55\nikenson 55\nwanniarachchi 55\nwednesdy 55\nlilas 55\nchengliang 55\nkrauter 55\nblast-furnace 55\npolinard 55\noscar-winners 55\noddar 55\nqingcheng 55\ngellin 55\ncastmate 55\nempedocle 55\nwse 55\ncomerci 55\noctuplet 55\ngristmill 55\ncalcium-fortified 55\naeron 55\nxiuli 55\npowerine 55\nnear-default 55\nspangles 55\nzarafshan 55\nturkish-based 55\nposptoned 55\nguyana-based 55\nagroforestry 55\nheavily-protected 55\negis 55\nbaoqing 55\nniezabitowska 55\nassignee 55\ntaddeo 55\nxiaolan 55\nwechsel-bank 55\nstill-living 55\nbolingbroke 55\nfusari 55\npadnos 55\nfeistiest 55\nheumann 55\nnalle 55\numbridge 55\nanthocyanins 55\nvampirism 55\nfuel-related 55\nkatari 55\nnewly-married 55\nfunderburg 55\ncompunctions 55\nmi-jung 55\nparrotheads 55\nkail 55\njinhao 55\nganciclovir 55\ncentral-east 55\nm\\/a-com 55\nggagbo 55\nhaendel 55\ncholamandalam 55\nemina 55\nthiamin 55\nautorities 55\nthrill-seeker 55\nonu 55\nserb-majority 55\nmandylor 55\nkenesei 55\nsoffa 55\nghozlan 55\npuffinburger 55\nautotrader.com 55\njanowska 55\nluxenberg 55\nliddick 55\nstelmakh 55\ninternationally-known 55\noil-pipeline 55\nlolli 55\nshiang-nung 55\nmatovu 55\nsablefish 55\nuntradable 55\nninth-placed 55\nminidv 55\nbyronic 55\ntesema 55\nnightime 55\nisea 55\nflorida-alabama 55\nbruininks 55\ngronke 55\nchearavanont 55\n##-by-##-centimeter 55\nlatwp 55\ntrouville 55\nallover 55\nswitz 55\nnotman 55\nmickler 55\nbouchenaki 55\nbeirendonck 55\nsharkboy 55\nanahuac 55\nfrale 55\nvoter-rich 55\nlivingood 55\nbailis 55\ncatsup 55\nfinnish-german 55\nchalupas 55\ngenri 55\nabdulatif 55\nepisiotomy 55\nrumbaut 55\nmullein 55\nbursted 55\ncrye 55\nphiloctetes 55\nbeji 55\nkreidenko 55\nbrianderson 55\ngitex 55\nsupoj 55\nredmen 55\nroad-testing 55\nrac-ns 55\nel-arab 55\ntahsi 55\nlactose-intolerant 55\nout-aced 55\nsherifi 55\nangat 55\npatarroyo 55\nczuma 55\narsi 
55\ntongue-and-groove 55\nsudol 55\npunkers 55\nfrayre 55\nperformance-wise 55\n#-xavier 55\ncantel 55\nndiaye-diatta 55\njunes 55\ncomegys 55\nnyuk 55\nbachelorettes 55\ncreepy-crawly 55\nrisner 55\nnassi 55\nteall 55\noutraising 55\nmini-concert 55\never-lasting 55\nhabboush 55\ngarrigue 55\ntrelleborgs 55\nelisangela 55\ntask-oriented 55\nrueppel 55\nbone-thin 55\ncremi 55\nhamat 55\nlong-faced 55\ncjh\\/rr 55\nfinisar 55\nmalaysia-vote 55\npene 55\ndovid 55\ndeal-killer 55\nenestam 55\ndedes 55\ndingiri 55\nrussia-chechnya-vote 55\ngenis 55\nshaabiya 55\nchelyabinsk-## 55\nselecta 55\nshort-dated 55\nmahadhesi 55\ndeath-knell 55\ngotzsche 55\nilaga 55\nlanegan 55\nkouk 55\nmahle 55\nper-mile 55\ncorruption-busting 55\nalipui 55\ngorali 55\njiazhen 55\nyu-ting 55\nisely 55\nmedaire 55\nhaggui 55\np#-plus-one 55\nntn 55\nmkalavishvili 55\noff-farm 55\nchromatis 55\nsharia-compliant 55\nhighest-volume 55\nboner 55\nkannell 55\ntottenham\\/eng 55\nhelsingoer 55\ncoluccio 55\naliyeva 55\nsteier 55\nkoers 55\nexceso 55\ndesaulniers 55\ncomcast-spectacor 55\njersey-born 55\nrock-music 55\nal-mazidi 55\nkilim 55\npovera 55\niksanov 55\ntele-ventures 55\nprimor 55\ncivlians 55\nr.i.-based 55\norania 55\nyear-around 55\nsmall-to-medium 55\ndravecky 55\nbontecou 55\nthe# 55\namsterdam-schiphol 55\nworkrate 55\nadvocator 55\ndemouge 55\nei-ichi 55\naeberhard 55\nbridi 55\npicacho 55\ndinant 55\nslashers 55\njudaea 55\nkayama 55\ndelavekouras 55\nfuli 55\ndreno 55\n##-year-plus 55\nanang 55\nlerche 55\nabessole 55\n##-robert 55\nkhidr 55\nmeheganglobe.com 55\ndubuc 55\nbudiardjo 55\n#-state 55\nspit-roasted 55\nasbill 55\nbalvino 55\nmunyenyembe 55\nbalzaretti 55\nivelin 55\naproximadamente 55\npockmark 55\nulleval 55\nmoneycentral 55\nhoussine 55\ngagoc 55\njong-ho 55\nnonlawyer 55\nstr\\/lp## 55\nship-to-air 55\nyanakiev 55\ngimbels 55\nconcow 55\nmoschetti 55\nzonghuai 55\nottenhoff 55\nno-bake 55\nrutschow-stomporowski 55\ntampabay.com 55\ntzeltal 55\nc-note 
55\nvebjoern 55\nfuel-cycle 55\ncolombian-owned 55\nrescigno 55\ncropsey 55\ntriple-x 55\npalmbeachpost.com/depression 55\nkhanjani 55\ntousle-haired 55\ngulled 55\ntullia 55\nbideau 55\n#-and-a-half 55\nwaytha 55\naido 55\n###ci 55\nhuels 55\nmychael 55\nmadrigali 55\ntrygg 55\nciccolo 55\nkotscho 55\nlevinstein 55\ntaie 55\nadcb 55\nkry 55\nethers 55\ncheaptickets.com 55\nlehrmann 55\ndendur 55\nantipodean 55\n#-gao 55\nkeret 55\nsrilanka-unrest-blast 55\nswamis 55\nhorse-breeding 55\ndinsmoor 55\nbarysch 55\njunior-level 55\ngeode 55\nhoffmann-laroche 55\nnones 55\nwijdan 55\nsquare-shouldered 55\nwhite-shirted 55\ncoldiron 55\nchartis 55\nkuskokwim 55\nstepfret 55\ngopendra 55\nnefertari 55\nklsx-fm 55\nwesthusing 55\nrevault 55\nluxar 55\ngiancola 55\ngartnerg# 55\nlakemba 55\norganizaciones 55\ncamerman 55\namoussou 55\nlast-hour 55\npastrik 55\ndatsakorn 55\nduxford 55\nbrown-brick 55\nfortul 55\nzainy 55\nwamidh 55\nhanel 55\nfeda 55\nreminyl 55\nthoeni 55\nsportcenter 55\nwhite-fleshed 55\nparchments 55\nseafrance 55\nnordmark 55\naristobulo 55\nligonier 55\nrmf 55\ntorchy 55\nbutcheries 55\nkurfuerstendamm 55\nhttp://www.cdc.gov/h#n#flu 55\nisleta 55\nletter-bombs 55\nhadrosaur 55\nnourizadeh 55\nkujawa 55\nmemling 55\nmtd 55\nparents-to-be 55\nkinyu 55\ncompleto 55\nalsatians 55\nitta 55\nbaoliu 55\npravit 55\nsiping 55\nkorea-eu 55\nfulminate 55\nguandu 55\nyili\\/zhao 55\ndemott 55\npakistan-militant 55\nlpu 55\npetrowski 55\nsondashi 55\nlindbom 55\ngatton 55\ndubelier 55\nhsin-hsing 55\nghazl 55\nchad-unrest 55\namcc 55\nlemon-yellow 55\naniek 55\ndanshuei 55\ntidmore 55\nchin-up 55\nscarpaci 55\nfresnel 55\nmecum 55\nrosebush 55\nconsumer-confidence 55\nvainshtok 55\nkateri 55\nhungiapuko 55\nqibao 55\ncamon 55\nruhlmann 55\nclaunch 55\naudible.com 55\nfalic 55\nbardales 55\nlithographer 55\nebrima 55\nszot 55\nambulance-chasing 55\nwatoto 55\nvigan 55\nkomineft 55\nmiklikova 55\ntbl 55\negidius 55\nsrikkanth 55\nsenlin 55\nsuizhong 55\nsaltpeter 
55\nshitreet 55\nlaxton 55\ncspc 55\nelisdottir 55\nperrins 55\nuniglory 55\nemmick 55\nskulk 55\nearwax 55\nbenaroya 55\nrushwaya 55\ndiscomforted 55\nohalete 55\npanigoro 55\nmosquito-born 55\nsafe-house 55\nexcelerate 55\nmattino 55\nmasoudi 55\nsonon 55\nbn.com 55\nostergaard 55\nkamya 55\nkyeung-ran 55\nkitayama 55\nfu-hsing 55\nclip-art 55\ntrillion-won 55\nascender 55\nspritzing 55\njonesing 55\npolicy-based 55\nflomo 55\nglobalist 55\nacamprosate 55\nredwall 55\nljudmila 55\nkorade 55\nlubo 55\nhighhandedness 55\nanad 55\n#.#-meter-deep 55\nbrownshirts 55\nattaboy 55\ntraynham 55\neuropeanized 55\nadjustables 55\nlumberyards 55\nhotel-like 55\nelektroprivreda 55\nkirilova 55\ndual-layer 55\nerrett 55\nracinos 55\nsigou 55\nspindletop 55\nsteidl 55\nhilderbrand 55\nman-of-the 55\nunconditioned 55\nbudget-strapped 55\nbodart 55\ndanic 55\nplushest 55\njalaleddin 55\nfna 55\nto-# 55\nburhannudin 55\ngodward 55\ngreece-fires 55\ntransfat 55\nlovelife 55\nmyomectomy 55\ndelgada 55\nictv 55\nu.s-mexico 55\nlloreda 55\nmoebius 55\nbarkow 55\nmadox 55\nliang-jen 55\ng+j 55\ncgnpc 55\npunchier 55\nwrsa 55\nrospars 55\noverbid 55\nlokichoggio 55\nfukuura 55\nyuegu 55\ntarkett 55\nsnocountry 55\ninui 55\nmoved.wits-end-column 55\npartita 55\noverfilling 55\nhectarage 55\nokwiri 55\n#-zheng 55\neidinger 55\ntengan 55\nkouris 55\ncbuchholz 55\nhafnarfjordur 55\nowasi 55\niza 55\nspungen 55\ntepix 55\nsial 55\nkeepin 55\npaint-by-number 55\njeanene 55\nnatural-food 55\nkorniyenko 55\nhartadi 55\nananova 55\ntrend-setters 55\nnohilly 55\npasteurize 55\nwww.blogs.tampabay.com/food 55\nadvertising-driven 55\nnalumango 55\nruxton 55\njalrez 55\nmiyar 55\ndmelvin@coxnews.com 55\novervalue 55\nslane 55\nhankerson 55\nharpswell 55\nwassell 55\nbishan 55\nfried-chicken 55\ngalvanic 55\nylli 55\ncusip 55\ndogeared 55\ngoofy-looking 55\nrhys-meyers 55\nlined-up 55\nheeren 55\nhyslop 55\nyanovsky 55\nwww.aa.com 55\nseagren 55\nwiklund 55\nart-making 55\nnon-lawyer 55\nsaurashtra 
55\nmichoacana 55\n##-###-## 55\nshabnam 55\nroadworker 55\nshetler 55\nquistelli 55\njacksonville-based 55\ngalekovic 55\nbadruddin 55\nvoorhis 55\nthree-song 55\ngeordies 55\ndecane 55\nkilrea 55\nepfl 55\ngentamicin 55\nchewed-up 55\nsevergazprom 55\nbolona 55\nchandos 55\nrafflesia 55\nmercutio 55\npipebomb 55\nruthann 55\nmakhenkesi 55\nsweatband 55\nbuilt-ins 55\ndonata 55\nramonet 55\nmeehl 55\ncurico 55\nalibux 55\ncommentary\\/oped 55\nslifkin 55\nbason 55\nlawsky 55\nsabal 55\nisaksen 55\ncheng-kung 55\nlabruno 55\nmusics 55\nvirgets 55\nqidwa 55\nexport-heavy 55\nout-compete 55\nnettuno 55\nruhanie 55\nshailendra 55\nogallaga 55\ntime-being 55\nmashego 55\nbangladesh-based 55\n####b 55\n####g 55\nschool-prayer 55\nnecked 55\nluzern 55\nwasbir 55\npro-royalist 55\nmontelongo 55\nsudan-darfur-un 55\ninternationally-sponsored 55\nbuyenzi 55\nb&l 55\nparagliders 55\nproducer\\/director 55\nsaalbach 55\ndillons 55\nsubleasing 55\nunifrance 55\nfleeces 55\nchu-huan 55\nkanchi 55\nbihan 55\nagrama 55\nwhite-sided 55\nhansack 55\njin-woo 55\nhileman 55\nhandwoven 55\nstouter 55\nleaf-shaped 55\nosby 55\nstamen 55\npleasurably 55\nchimeras 55\nfilm-related 55\njotspot 55\nbaitzel 55\npalmar 55\nilegal 55\nhiaa 55\neeeee 55\nalcobendas 55\nprewett 55\nknkt 55\nberkous 55\nwoe-is-me 55\nkrinkie 55\nmagdelena 55\npost-hussein 55\nhardline-controlled 55\nnarrow-bodied 55\naqueous 55\nburukina 55\nnozoe 55\nawartani 55\nkalapani 55\nguenot 55\ntuipulotu 55\niovino 55\nvosper 55\n##-flavia 55\nrrodriguez 55\ntranh 55\nd'amuro 55\nweert 55\nbrosh 55\neuropewide 55\nkovals 55\npie-shaped 55\nnahda 55\nsiboni 55\nvedenkin 55\nsquillacote 55\ngittes 55\njaakkola 55\nkhattabi 55\nvatican-affiliated 55\nfewer-than-expected 55\nsubschinski 55\nsherawat 55\ngaladari 55\nsalvors 55\nvirdi 55\njayasundara 55\ngiraudet 55\nlongjing 55\nthum 55\nahorros 55\npro-franco 55\nplutonium-making 55\nenvironmental-protection 55\nescuredo 55\nzhiping 55\nworktable 55\nsediq 55\ntree-hugger 
55\nsahlman 55\nfadeout 55\nzizhou 55\nmartes 55\nu.s.-arranged 55\nkymco 55\nnxc# 55\nkopin 55\nastbury 55\nmarzan 55\ndongfanghong 55\nmontecore 55\ncpap 55\nyamase 55\nassiri 55\nmbandjock 55\nplutocrat 55\n###-billion-baht 55\nunmerciful 55\nbritish-chinese 55\nre-gifting 55\nsexually-explicit 55\nswitch-off 55\nsvan 55\nabeywardene 55\nansin 55\ncold-shoulder 55\nterron 55\nlizard-like 55\naquarian 55\nsolove 55\nwater-carrying 55\ncelebrityhood 55\nforeign-produced 55\nganascia 55\nnyepi 55\npopmart 55\nustream 55\nrosalio 55\noutdrawing 55\nvladimirovna 55\ngbissau 55\nersoy 55\ngolovlyov 55\nbiserko 55\nglicken 55\nfbl-esp-cup 55\nbriskman 55\nsolvberg 55\nwehr-hasler 55\nyokel 55\nkoplovitz 55\nlikeminded 55\nberkey 55\ninfrasound 55\nal-adil 55\nquinley 55\nboym 55\nboxier 55\ntutuila 55\nstair-climbing 55\nvike 55\na.f.m. 55\nbelgrad 55\nnucleaire 55\nmakubuya 55\nsilverchair 55\nprawiro 55\nkluzak 55\ndecepticons 55\nchappy 55\neeriest 55\npaycut 55\npinit 55\ntech-dominant 55\ncharbonnet 55\nmid-innings 55\nwatermarking 55\nunshelled 55\nbpa-free 55\nfaughnan 55\nvanins 55\nsirena 55\nmilitello 55\nofficier 55\nfras 55\nmarudai 55\nhypnotherapy 55\npasada 55\nlaser-cut 55\nfuji-servetto 55\nbutar 55\nvajna 55\nmiguez 55\ncseries 55\nschlow 55\nhamkyong 55\ntop-hatted 55\nmidf 55\nunion-mandated 55\nhanssens 55\nexhorbitant 55\nmost-famous 55\nruesch 55\ndexedrine 55\nbaracoa 55\nmaricela 55\nappropriator 55\nbronis 55\nvachagayev 55\nmaseratis 55\ngrowth-stock 55\nwesterngeco 55\nfuel-injection 55\njpletz 55\nedun 55\npinmanee 55\nswarts 55\ngauntlets 55\nbovelander 55\nazpurua 55\nhonka 55\nedge-of-your-seat 55\nmcspaden 55\naacc 55\nunderripe 55\nhelipads 55\nintra-shiite 55\ncorazzin 55\ncopperheads 55\nmumuni 55\nhruby 55\nd'alba 55\ngoodland 55\nverrill 55\nhitoki 55\nforswore 55\ngriddles 55\nspintronics 55\nsunncomm 55\ntongsalee 55\nanti-roll 55\ndrogoul 55\necologic 55\ntwo-states 55\nacto 55\ncoral-colored 55\njuliusz 55\ntraesch 
55\nulaanbaatar 55\n##-grade 55\nscorelines 55\nrainswept 55\nmandisi 55\ndually 55\nwiant 55\nessig 55\nzambeze 55\nortigas 55\ncrewson 55\nassurer 55\nyellow-legged 55\npirouetting 55\ngotschall 55\ninjury-forced 55\nchemins 55\naerator 55\nmusalo 55\nkaijuka 55\nradivoje 55\nrennet 55\nhalf-seriously 55\nbulwer-lytton 55\nashrafiyeh 55\nsmaller-market 55\nmukarram 55\none-and-only 55\nbojkov 55\nmosab 55\nremmel 55\ncapellini 55\nlemel 55\nmarostica 55\ngurfein 55\nkilicdaroglu 55\nconfig 55\nxxxend 55\nwell-chilled 55\nmeatiest 55\n###,###-rupee 55\ninfinitis 55\nfiddlehead 55\nsimpanan 55\nbat-winged 55\nbanyoles 55\nneistat 55\nragas 55\nphilologist 55\nautostick 55\nnessler 55\n##-people 55\ntr#s 55\ntent-pole 55\nalgarabawi 55\nprakarn 55\nts### 55\npierre-henry 55\ndumbfounding 55\nhttp://www.chinapntr.gov 55\nbnu 55\nstep-grandfather 55\nmcnamaraglobe.com 55\nmlm 55\npepitone 55\namericium 55\nphased-out 55\nmitchellville 55\nindia-weather 55\nilion 55\nsapan 55\nland-to-air 55\naerators 55\nbacho 55\ngerwig 55\nrichart 55\ndethrones 55\nruncorn 55\nhinzpeter 55\nspoon-feeding 55\nkrey 55\nartform 55\nkva 55\nmacmullanglobe.com 55\nnonde 55\nhispanico 55\ndanke 55\nsevene 55\narqam 55\nmbatista 55\nrphilpotstar-telegram 55\nlauridsen 55\nestrogen-only 55\nclow 55\nki-### 55\nfoucher 55\nzetec 55\nautograph-signing 55\nafricanus 55\nex-israeli 55\npsoriatic 55\nhualing 55\nunibet.com 55\nkhoshchehreh 55\nmatanog 55\npeace\\/def 55\nrashid-merem 55\nkostyuk 55\nniantic 55\ndressen 55\nneinas 55\nin-process 55\nkazeem 55\npude 55\narria 55\nfuning 55\nself-destructiveness 55\nshergold 55\nbatiuk 55\npetrosa 55\nstaser 55\nseebaran 55\nnesmachny 55\n#rd\\/tv 55\naltona 55\nmacarounas 55\nurad 55\narenes 55\nzinovy 55\njavelins 55\ncarpet-cleaning 55\nal-gumhuriya 55\nmogelonsky 55\ntattle 55\nafaq 55\nbalie 55\njayesh 55\nmccawley 55\nmanganaro 55\nbreitsprecher 55\ncharice 55\nabdiqassim 55\ntantaquidgeon 55\nyung-san 55\nmeirav 55\n###km\\/h 55\nanouma 
55\nel-ayoun 55\nwherehouse 55\nalloudi 55\nmoshtarak 55\nktvt 55\ngholikhan 55\ngogele 55\nmithoff 55\nintourist 55\nx-milwaukee 55\nnever-used 55\n#,###-square-kilometre 55\ndarkman 55\nmoulvibazar 55\nreher 55\nsewa 55\nkrulwich 55\nkajiwara 55\ntunmore 55\n#-gong 55\nguille 55\nwrong-doers 55\ncirone 55\nadinolfi 55\nkil-seung 55\ncharge-card 55\ntumpel-gugerell 55\nguliev 55\nglenmore 55\nflagcarrier 55\nzakum 55\ncentromin 55\nearnings-driven 55\npavoni 55\nmachination 55\nflyertalk 55\nall-too-real 55\nmaidenhead 55\nceltic\\/sco 55\nrawod 55\nco-world 55\nsekhar 55\ngendel 55\nszymanowski 55\ndomer 55\ntailandia 55\nthree-panel 55\nvereker 55\nkuchinsky 55\nyongqiang 55\nfremont-based 55\nimouraren 55\nkotschau 55\ngebre-egziabher 55\nconsolatory 55\nalready-fragile 55\nmabini 55\nforehander 55\nwhomping 55\ntest-drove 55\nopensocial 55\nbacteria-killing 55\nkanko 55\nyakovleva 55\nbergert 55\nmunari 55\nsoccer-only 55\nmaniatis 55\nhelgesson 55\nnippert 55\nmagnee 55\nhamisah 55\nnon-war 55\nklaasen 55\nrubai 55\nsaint-lazare 55\nshafir 55\njianlin 55\nkoppers 55\nlamberton 55\nmetta 55\nalgoma 55\nkessar 55\nkozue 55\nswiss-registered 55\nmoderate-led 55\ndesarrollar 55\nadenoids 55\noberhofen 55\ntregubova 55\n##-cents 55\nadesina 55\nnedo 55\ntrunzo 55\nnasaw 55\nvinters 55\navx 55\nzanna 55\nsuspender 55\nglutting 55\nobjet 55\nmedicina 55\nlatka 55\nelegante 55\nkilner 55\ngovernment-certified 55\nrc-# 55\nal-thawadi 55\nflip-out 55\npolish-language 55\nstationmaster 55\nfire-bombs 55\nseemo 55\npractical-minded 55\ntank-top 55\nhodari 55\nsaj 55\nger\\/sae 55\nspinbaldak 55\nbikker 55\nvolpenhein 55\nrom-com 55\nfunches 55\nhaacke 55\nal-hajiri 55\ntc-gb 55\nlatium 55\nbhansali 55\nsterols 55\nmulliken 55\nplutarco 55\nvoorsanger 55\nthomas-keprta 55\nbieksa 55\nloikaw 55\nmatsepe-casaburri 55\nwarty 55\nchen-wei 55\nwww.mcdonalds.com 55\nhondora 55\nm.t.b. 
55\noff-grid 55\nhome-design 55\nlipovsky 55\nbergsson 55\nkii 55\nstrakhov 55\nsongphon 55\nzahab 55\nallauddin 55\nwell-tolerated 55\nscolese 55\ncucbm 55\ndrayson 55\neiken 55\nhfi 55\nzoia 55\nedgecomb 55\nsaaremaa 55\nannisu-r 55\nallaf 55\neye-view 55\nchambermaids 55\nhirosawa 55\nvladivostock 55\ntrutv 55\norrorin 55\nwistron 55\nkik 55\npretentions 55\nsciullo 55\nhandwriting-recognition 55\nmajles 55\nmotoshima 55\ntanginye 55\nhellawell 55\nextractable 55\nend-year 55\nsoll 55\ncaronna 55\nvampiric 55\nxigui 55\nhit-and-runs 55\nduba-yurt 55\nsaur 55\nkoltsov 55\nblue-helmet 55\nhyuga 55\nkifri 55\nnajja 55\nwible 55\nmatloha 55\nidrissi 55\ntamarindo 55\njust-the-facts 55\nbulaong 55\nmeddein 55\nklinefelter 55\nsatz 55\nushpizin 55\novernighting 55\nhandbrake 55\ndalley 55\nprampero 55\nhss 55\nnonhybrid 55\nmutebusi 55\njingqian 55\ncryptology 55\nanupama 55\nyakshina 55\neeas 55\npottengal 55\nright-to-carry 55\nshampooed 55\nsonejee 55\nozyurek 55\nkinrara 55\nmunyua 55\ntraboulsi 55\ncontar 55\nlerone 55\nroundtrips 55\nchelanga 55\nlimited-field 55\nwenming 55\nkreisberg 55\nbertino 55\nzhongxiao 55\nacsi 55\nlevance 55\neocene 55\n#-feliciano 55\ntarantella 55\njaehnig 55\nal-awadhi 55\nanti-socialist 55\nhornak 55\ndarkhovin 55\nperimetre 55\nabramian 55\ncaricola 55\nanti-hungarian 55\nmaneiro 55\nsize-# 55\nfarglory 55\npetermann 55\nmeiwa 55\nphelpses 55\nporgras 55\nsamiya 55\ntexoma 55\nhuffines 55\nthird-placer 55\nal-badran 55\nbabyhood 55\nseamounts 55\naasen 55\ncasulties 55\nbodong 55\nshamar 55\ndestablising 55\ncurado 55\nshangai 55\nsvedka 55\n<s> 83845866\n</s> 83845866\n<PAD> 5\n"
  },
  {
    "path": "model_zoo/models/textsum/data.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Data batchers for data described in ..//data_prep/README.md.\"\"\"\n\nimport glob\nimport random\nimport struct\nimport sys\n\nfrom tensorflow.core.example import example_pb2\n\n\n# Special tokens\nPARAGRAPH_START = '<p>'\nPARAGRAPH_END = '</p>'\nSENTENCE_START = '<s>'\nSENTENCE_END = '</s>'\nUNKNOWN_TOKEN = '<UNK>'\nPAD_TOKEN = '<PAD>'\nDOCUMENT_START = '<d>'\nDOCUMENT_END = '</d>'\n\n\nclass Vocab(object):\n  \"\"\"Vocabulary class for mapping words and ids.\"\"\"\n\n  def __init__(self, vocab_file, max_size):\n    self._word_to_id = {}\n    self._id_to_word = {}\n    self._count = 0\n\n    with open(vocab_file, 'r') as vocab_f:\n      for line in vocab_f:\n        pieces = line.split()\n        if len(pieces) != 2:\n          sys.stderr.write('Bad line: %s\\n' % line)\n          continue\n        if pieces[0] in self._word_to_id:\n          raise ValueError('Duplicated word: %s.' % pieces[0])\n        self._word_to_id[pieces[0]] = self._count\n        self._id_to_word[self._count] = pieces[0]\n        self._count += 1\n        if self._count > max_size:\n          raise ValueError('Too many words: >%d.' 
% max_size)\n\n  def WordToId(self, word):\n    if word not in self._word_to_id:\n      return self._word_to_id[UNKNOWN_TOKEN]\n    return self._word_to_id[word]\n\n  def IdToWord(self, word_id):\n    if word_id not in self._id_to_word:\n      raise ValueError('id not found in vocab: %d.' % word_id)\n    return self._id_to_word[word_id]\n\n  def NumIds(self):\n    return self._count\n\n\ndef ExampleGen(data_path, num_epochs=None):\n  \"\"\"Generates tf.Examples from path of data files.\n\n    Binary data format: <length><blob>. <length> represents the byte size\n    of <blob>. <blob> is serialized tf.Example proto. The tf.Example contains\n    the tokenized article text and summary.\n\n  Args:\n    data_path: path to tf.Example data files.\n    num_epochs: Number of times to go through the data. None means infinite.\n\n  Yields:\n    Deserialized tf.Example.\n\n  If there are multiple files specified, they are accessed in a random order.\n  \"\"\"\n  epoch = 0\n  while True:\n    if num_epochs is not None and epoch >= num_epochs:\n      break\n    filelist = glob.glob(data_path)\n    assert filelist, 'Empty filelist.'\n    random.shuffle(filelist)\n    for f in filelist:\n      reader = open(f, 'rb')\n      while True:\n        len_bytes = reader.read(8)\n        if not len_bytes: break\n        str_len = struct.unpack('q', len_bytes)[0]\n        example_str = struct.unpack('%ds' % str_len, reader.read(str_len))[0]\n        yield example_pb2.Example.FromString(example_str)\n\n    epoch += 1\n\n\ndef Pad(ids, pad_id, length):\n  \"\"\"Pad or trim list to length.\n\n  Args:\n    ids: list of ints to pad\n    pad_id: what to pad with\n    length: length to pad or trim to\n\n  Returns:\n    ids trimmed or padded with pad_id\n  \"\"\"\n  assert pad_id is not None\n  assert length is not None\n\n  if len(ids) < length:\n    a = [pad_id] * (length - len(ids))\n    return ids + a\n  else:\n    return ids[:length]\n\n\ndef GetWordIds(text, vocab, pad_len=None, pad_id=None):\n  \"\"\"Get ids corresponding to words in text.\n\n  Assumes tokens separated by space.\n\n  Args:\n    text: a string\n    vocab: Vocab object\n    pad_len: int, length to pad to\n    pad_id: int, word id for pad symbol\n\n  Returns:\n    A list of ints representing word ids.\n  \"\"\"\n  ids = []\n  for w in text.split():\n    i = vocab.WordToId(w)\n    if i >= 0:\n      ids.append(i)\n    else:\n      ids.append(vocab.WordToId(UNKNOWN_TOKEN))\n  if pad_len is not None:\n    return Pad(ids, pad_id, pad_len)\n  return ids\n\n\ndef Ids2Words(ids_list, vocab):\n  \"\"\"Get words from ids.\n\n  Args:\n    ids_list: list of int32\n    vocab: Vocab object\n\n  Returns:\n    List of words corresponding to ids.\n  \"\"\"\n  assert isinstance(ids_list, list), '%s is not a list' % ids_list\n  return [vocab.IdToWord(i) for i in ids_list]\n\n\ndef SnippetGen(text, start_tok, end_tok, inclusive=True):\n  \"\"\"Generates consecutive snippets between start and end tokens.\n\n  Args:\n    text: a string\n    start_tok: a string denoting the start of snippets\n    end_tok: a string denoting the end of snippets\n    inclusive: Whether to include the tokens in the returned snippets.\n\n  Yields:\n    String snippets\n  \"\"\"\n  cur = 0\n  while True:\n    try:\n      start_p = text.index(start_tok, cur)\n      end_p = text.index(end_tok, start_p + 1)\n      cur = end_p + len(end_tok)\n      if inclusive:\n        yield text[start_p:cur]\n      else:\n        yield text[start_p+len(start_tok):end_p]\n    except ValueError as e:\n      raise StopIteration('no more snippets in text: %s' % e)\n\n\ndef GetExFeatureText(ex, key):\n  return ex.features.feature[key].bytes_list.value[0]\n\n\ndef ToSentences(paragraph, include_token=True):\n  \"\"\"Takes tokens of a paragraph and returns list of sentences.\n\n  Args:\n    paragraph: string, text of paragraph\n    include_token: Whether to include the sentence separation tokens in the result.\n\n  Returns:\n    List of sentence strings.\n  \"\"\"\n  s_gen = SnippetGen(paragraph, SENTENCE_START, SENTENCE_END, include_token)\n  return [s for s in s_gen]\n"
  },
  {
    "path": "model_zoo/models/textsum/data_convert_example.py",
    "content": "\"\"\"Example of converting TextSum model data.\nUsage:\npython data_convert_example.py --command binary_to_text --in_file data/data --out_file data/text_data\npython data_convert_example.py --command text_to_binary --in_file data/text_data --out_file data/binary_data\npython data_convert_example.py --command binary_to_text --in_file data/binary_data --out_file data/text_data2\ndiff data/text_data2 data/text_data\n\"\"\"\n\nimport struct\nimport sys\n\nimport tensorflow as tf\nfrom tensorflow.core.example import example_pb2\n\nFLAGS = tf.app.flags.FLAGS\ntf.app.flags.DEFINE_string('command', 'binary_to_text',\n                           'Either binary_to_text or text_to_binary. '\n                           'Specify FLAGS.in_file accordingly.')\ntf.app.flags.DEFINE_string('in_file', '', 'path to file')\ntf.app.flags.DEFINE_string('out_file', '', 'path to file')\n\ndef _binary_to_text():\n  reader = open(FLAGS.in_file, 'rb')\n  writer = open(FLAGS.out_file, 'w')\n  while True:\n    len_bytes = reader.read(8)\n    if not len_bytes:\n      sys.stderr.write('Done reading\\n')\n      break\n    str_len = struct.unpack('q', len_bytes)[0]\n    tf_example_str = struct.unpack('%ds' % str_len, reader.read(str_len))[0]\n    tf_example = example_pb2.Example.FromString(tf_example_str)\n    examples = []\n    for key in tf_example.features.feature:\n      examples.append('%s=%s' % (key, tf_example.features.feature[key].bytes_list.value[0]))\n    writer.write('%s\\n' % '\\t'.join(examples))\n  reader.close()\n  writer.close()\n\n\ndef _text_to_binary():\n  inputs = open(FLAGS.in_file, 'r').readlines()\n  writer = open(FLAGS.out_file, 'wb')\n  for inp in inputs:\n    tf_example = example_pb2.Example()\n    for feature in inp.strip().split('\\t'):\n      (k, v) = feature.split('=')\n      tf_example.features.feature[k].bytes_list.value.extend([v])\n    tf_example_str = tf_example.SerializeToString()\n    str_len = len(tf_example_str)\n    writer.write(struct.pack('q', str_len))\n    writer.write(struct.pack('%ds' % str_len, tf_example_str))\n  writer.close()\n\n\ndef main(unused_argv):\n  assert FLAGS.command and FLAGS.in_file and FLAGS.out_file\n  if FLAGS.command == 'binary_to_text':\n    _binary_to_text()\n  elif FLAGS.command == 'text_to_binary':\n    _text_to_binary()\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/textsum/seq2seq_attention.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Trains a seq2seq model.\n\nWORK IN PROGRESS.\n\nImplement \"Abstractive Text Summarization using Sequence-to-sequence RNNS and\nBeyond.\"\n\n\"\"\"\nimport sys\nimport time\n\nimport tensorflow as tf\nimport batch_reader\nimport data\nimport seq2seq_attention_decode\nimport seq2seq_attention_model\n\nFLAGS = tf.app.flags.FLAGS\ntf.app.flags.DEFINE_string('data_path',\n                           '', 'Path expression to tf.Example.')\ntf.app.flags.DEFINE_string('vocab_path',\n                           '', 'Path expression to text vocabulary file.')\ntf.app.flags.DEFINE_string('article_key', 'article',\n                           'tf.Example feature key for article.')\ntf.app.flags.DEFINE_string('abstract_key', 'headline',\n                           'tf.Example feature key for abstract.')\ntf.app.flags.DEFINE_string('log_root', '', 'Directory for model root.')\ntf.app.flags.DEFINE_string('train_dir', '', 'Directory for train.')\ntf.app.flags.DEFINE_string('eval_dir', '', 'Directory for eval.')\ntf.app.flags.DEFINE_string('decode_dir', '', 'Directory for decode summaries.')\ntf.app.flags.DEFINE_string('mode', 'train', 'train/eval/decode mode')\ntf.app.flags.DEFINE_integer('max_run_steps', 10000000,\n                            'Maximum number of run 
steps.')\ntf.app.flags.DEFINE_integer('max_article_sentences', 2,\n                            'Max number of first sentences to use from the '\n                            'article')\ntf.app.flags.DEFINE_integer('max_abstract_sentences', 100,\n                            'Max number of first sentences to use from the '\n                            'abstract')\ntf.app.flags.DEFINE_integer('beam_size', 4,\n                            'beam size for beam search decoding.')\ntf.app.flags.DEFINE_integer('eval_interval_secs', 60, 'How often to run eval.')\ntf.app.flags.DEFINE_integer('checkpoint_secs', 60, 'How often to checkpoint.')\ntf.app.flags.DEFINE_bool('use_bucketing', False,\n                         'Whether bucket articles of similar length.')\ntf.app.flags.DEFINE_bool('truncate_input', False,\n                         'Truncate inputs that are too long. If False, '\n                         'examples that are too long are discarded.')\ntf.app.flags.DEFINE_integer('num_gpus', 0, 'Number of gpus used.')\ntf.app.flags.DEFINE_integer('random_seed', 111, 'A seed value for randomness.')\n\n\ndef _RunningAvgLoss(loss, running_avg_loss, summary_writer, step, decay=0.999):\n  \"\"\"Calculate the running average of losses.\"\"\"\n  if running_avg_loss == 0:\n    running_avg_loss = loss\n  else:\n    running_avg_loss = running_avg_loss * decay + (1 - decay) * loss\n  running_avg_loss = min(running_avg_loss, 12)\n  loss_sum = tf.Summary()\n  loss_sum.value.add(tag='running_avg_loss', simple_value=running_avg_loss)\n  summary_writer.add_summary(loss_sum, step)\n  sys.stdout.write('running_avg_loss: %f\\n' % running_avg_loss)\n  return running_avg_loss\n\n\ndef _Train(model, data_batcher):\n  \"\"\"Runs model training.\"\"\"\n  with tf.device('/cpu:0'):\n    model.build_graph()\n    saver = tf.train.Saver()\n    # Train dir is different from log_root to avoid summary directory\n    # conflict with Supervisor.\n    summary_writer = tf.train.SummaryWriter(FLAGS.train_dir)\n  
  sv = tf.train.Supervisor(logdir=FLAGS.log_root,\n                             is_chief=True,\n                             saver=saver,\n                             summary_op=None,\n                             save_summaries_secs=60,\n                             save_model_secs=FLAGS.checkpoint_secs,\n                             global_step=model.global_step)\n    sess = sv.prepare_or_wait_for_session(config=tf.ConfigProto(\n        allow_soft_placement=True))\n    running_avg_loss = 0\n    step = 0\n    while not sv.should_stop() and step < FLAGS.max_run_steps:\n      (article_batch, abstract_batch, targets, article_lens, abstract_lens,\n       loss_weights, _, _) = data_batcher.NextBatch()\n      (_, summaries, loss, train_step) = model.run_train_step(\n          sess, article_batch, abstract_batch, targets, article_lens,\n          abstract_lens, loss_weights)\n\n      summary_writer.add_summary(summaries, train_step)\n      running_avg_loss = _RunningAvgLoss(\n          running_avg_loss, loss, summary_writer, train_step)\n      step += 1\n      if step % 100 == 0:\n        summary_writer.flush()\n    sv.Stop()\n    return running_avg_loss\n\n\ndef _Eval(model, data_batcher, vocab=None):\n  \"\"\"Runs model eval.\"\"\"\n  model.build_graph()\n  saver = tf.train.Saver()\n  summary_writer = tf.train.SummaryWriter(FLAGS.eval_dir)\n  sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))\n  running_avg_loss = 0\n  step = 0\n  while True:\n    time.sleep(FLAGS.eval_interval_secs)\n    try:\n      ckpt_state = tf.train.get_checkpoint_state(FLAGS.log_root)\n    except tf.errors.OutOfRangeError as e:\n      tf.logging.error('Cannot restore checkpoint: %s', e)\n      continue\n\n    if not (ckpt_state and ckpt_state.model_checkpoint_path):\n      tf.logging.info('No model to eval yet at %s', FLAGS.train_dir)\n      continue\n\n    tf.logging.info('Loading checkpoint %s', ckpt_state.model_checkpoint_path)\n    saver.restore(sess, 
ckpt_state.model_checkpoint_path)\n\n    (article_batch, abstract_batch, targets, article_lens, abstract_lens,\n     loss_weights, _, _) = data_batcher.NextBatch()\n    (summaries, loss, train_step) = model.run_eval_step(\n        sess, article_batch, abstract_batch, targets, article_lens,\n        abstract_lens, loss_weights)\n    tf.logging.info(\n        'article:  %s',\n        ' '.join(data.Ids2Words(article_batch[0][:].tolist(), vocab)))\n    tf.logging.info(\n        'abstract: %s',\n        ' '.join(data.Ids2Words(abstract_batch[0][:].tolist(), vocab)))\n\n    summary_writer.add_summary(summaries, train_step)\n    running_avg_loss = _RunningAvgLoss(\n        running_avg_loss, loss, summary_writer, train_step)\n    if step % 100 == 0:\n      summary_writer.flush()\n\n\ndef main(unused_argv):\n  vocab = data.Vocab(FLAGS.vocab_path, 1000000)\n  # Check for presence of required special tokens.\n  assert vocab.WordToId(data.PAD_TOKEN) > 0\n  assert vocab.WordToId(data.UNKNOWN_TOKEN) >= 0\n  assert vocab.WordToId(data.SENTENCE_START) > 0\n  assert vocab.WordToId(data.SENTENCE_END) > 0\n\n  batch_size = 4\n  if FLAGS.mode == 'decode':\n    batch_size = FLAGS.beam_size\n\n  hps = seq2seq_attention_model.HParams(\n      mode=FLAGS.mode,  # train, eval, decode\n      min_lr=0.01,  # min learning rate.\n      lr=0.15,  # learning rate\n      batch_size=batch_size,\n      enc_layers=4,\n      enc_timesteps=120,\n      dec_timesteps=30,\n      min_input_len=2,  # discard articles/summaries < than this\n      num_hidden=256,  # for rnn cell\n      emb_dim=128,  # If 0, don't use embedding\n      max_grad_norm=2,\n      num_softmax_samples=4096)  # If 0, no sampled softmax.\n\n  batcher = batch_reader.Batcher(\n      FLAGS.data_path, vocab, hps, FLAGS.article_key,\n      FLAGS.abstract_key, FLAGS.max_article_sentences,\n      FLAGS.max_abstract_sentences, bucketing=FLAGS.use_bucketing,\n      truncate_input=FLAGS.truncate_input)\n  
tf.set_random_seed(FLAGS.random_seed)\n\n  if hps.mode == 'train':\n    model = seq2seq_attention_model.Seq2SeqAttentionModel(\n        hps, vocab, num_gpus=FLAGS.num_gpus)\n    _Train(model, batcher)\n  elif hps.mode == 'eval':\n    model = seq2seq_attention_model.Seq2SeqAttentionModel(\n        hps, vocab, num_gpus=FLAGS.num_gpus)\n    _Eval(model, batcher, vocab=vocab)\n  elif hps.mode == 'decode':\n    decode_mdl_hps = hps\n    # Only need to restore the 1st step and reuse it since\n    # we keep and feed in state for each step's output.\n    decode_mdl_hps = hps._replace(dec_timesteps=1)\n    model = seq2seq_attention_model.Seq2SeqAttentionModel(\n        decode_mdl_hps, vocab, num_gpus=FLAGS.num_gpus)\n    decoder = seq2seq_attention_decode.BSDecoder(model, batcher, hps, vocab)\n    decoder.DecodeLoop()\n\n\nif __name__ == '__main__':\n  tf.app.run()\n"
  },
  {
    "path": "model_zoo/models/textsum/seq2seq_attention_decode.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Module for decoding.\"\"\"\n\nimport os\nimport time\n\nimport tensorflow as tf\nimport beam_search\nimport data\n\nFLAGS = tf.app.flags.FLAGS\ntf.app.flags.DEFINE_integer('max_decode_steps', 1000000,\n                            'Number of decoding steps.')\ntf.app.flags.DEFINE_integer('decode_batches_per_ckpt', 8000,\n                            'Number of batches to decode before restoring next '\n                            'checkpoint')\n\nDECODE_LOOP_DELAY_SECS = 60\nDECODE_IO_FLUSH_INTERVAL = 100\n\n\nclass DecodeIO(object):\n  \"\"\"Writes the decoded and references to RKV files for Rouge score.\n\n    See nlp/common/utils/internal/rkv_parser.py for detail about rkv file.\n  \"\"\"\n\n  def __init__(self, outdir):\n    self._cnt = 0\n    self._outdir = outdir\n    if not os.path.exists(self._outdir):\n      os.mkdir(self._outdir)\n    self._ref_file = None\n    self._decode_file = None\n\n  def Write(self, reference, decode):\n    \"\"\"Writes the reference and decoded outputs to RKV files.\n\n    Args:\n      reference: The human (correct) result.\n      decode: The machine-generated result\n    \"\"\"\n    self._ref_file.write('output=%s\\n' % reference)\n    self._decode_file.write('output=%s\\n' % decode)\n    self._cnt += 1\n    if 
self._cnt % DECODE_IO_FLUSH_INTERVAL == 0:\n      self._ref_file.flush()\n      self._decode_file.flush()\n\n  def ResetFiles(self):\n    \"\"\"Resets the output files. Must be called once before Write().\"\"\"\n    if self._ref_file: self._ref_file.close()\n    if self._decode_file: self._decode_file.close()\n    timestamp = int(time.time())\n    self._ref_file = open(\n        os.path.join(self._outdir, 'ref%d'%timestamp), 'w')\n    self._decode_file = open(\n        os.path.join(self._outdir, 'decode%d'%timestamp), 'w')\n\n\nclass BSDecoder(object):\n  \"\"\"Beam search decoder.\"\"\"\n\n  def __init__(self, model, batch_reader, hps, vocab):\n    \"\"\"Beam search decoding.\n\n    Args:\n      model: The seq2seq attentional model.\n      batch_reader: The batch data reader.\n      hps: Hyperparameters.\n      vocab: Vocabulary.\n    \"\"\"\n    self._model = model\n    self._model.build_graph()\n    self._batch_reader = batch_reader\n    self._hps = hps\n    self._vocab = vocab\n    self._saver = tf.train.Saver()\n    self._decode_io = DecodeIO(FLAGS.decode_dir)\n\n  def DecodeLoop(self):\n    \"\"\"Decoding loop for a long-running process.\"\"\"\n    sess = tf.Session(config=tf.ConfigProto(allow_soft_placement=True))\n    step = 0\n    while step < FLAGS.max_decode_steps:\n      time.sleep(DECODE_LOOP_DELAY_SECS)\n      if not self._Decode(self._saver, sess):\n        continue\n      step += 1\n\n  def _Decode(self, saver, sess):\n    \"\"\"Restore a checkpoint and decode it.\n\n    Args:\n      saver: Tensorflow checkpoint saver.\n      sess: Tensorflow session.\n    Returns:\n      True on success, otherwise False.\n    \"\"\"\n    ckpt_state = tf.train.get_checkpoint_state(FLAGS.log_root)\n    if not (ckpt_state and ckpt_state.model_checkpoint_path):\n      tf.logging.info('No model to decode yet at %s', FLAGS.log_root)\n      return False\n\n    tf.logging.info('checkpoint path %s', ckpt_state.model_checkpoint_path)\n    ckpt_path = os.path.join(\n        FLAGS.log_root, os.path.basename(ckpt_state.model_checkpoint_path))\n    tf.logging.info('renamed checkpoint path %s', ckpt_path)\n    saver.restore(sess, ckpt_path)\n\n    self._decode_io.ResetFiles()\n    for _ in xrange(FLAGS.decode_batches_per_ckpt):\n      (article_batch, _, _, article_lens, _, _, origin_articles,\n       origin_abstracts) = self._batch_reader.NextBatch()\n      for i in xrange(self._hps.batch_size):\n        bs = beam_search.BeamSearch(\n            self._model, self._hps.batch_size,\n            self._vocab.WordToId(data.SENTENCE_START),\n            self._vocab.WordToId(data.SENTENCE_END),\n            self._hps.dec_timesteps)\n\n        article_batch_cp = article_batch.copy()\n        article_batch_cp[:] = article_batch[i:i+1]\n        article_lens_cp = article_lens.copy()\n        article_lens_cp[:] = article_lens[i:i+1]\n        best_beam = bs.BeamSearch(sess, article_batch_cp, article_lens_cp)[0]\n        decode_output = [int(t) for t in best_beam.tokens[1:]]\n        self._DecodeBatch(\n            origin_articles[i], origin_abstracts[i], decode_output)\n    return True\n\n  def _DecodeBatch(self, article, abstract, output_ids):\n    \"\"\"Convert ids to words and write results.\n\n    Args:\n      article: The original article string.\n      abstract: The human (correct) abstract string.\n      output_ids: The abstract word ids output by machine.\n    \"\"\"\n    decoded_output = ' '.join(data.Ids2Words(output_ids, self._vocab))\n    end_p = decoded_output.find(data.SENTENCE_END, 0)\n    if end_p != -1:\n      decoded_output = decoded_output[:end_p]\n    tf.logging.info('article:  %s', article)\n    tf.logging.info('abstract: %s', abstract)\n    tf.logging.info('decoded:  %s', decoded_output)\n    self._decode_io.Write(abstract, decoded_output.strip())\n"
  },
  {
    "path": "model_zoo/models/textsum/seq2seq_attention_model.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Sequence-to-Sequence with attention model for text summarization.\n\"\"\"\nfrom collections import namedtuple\n\nimport numpy as np\nimport tensorflow as tf\nimport seq2seq_lib\n\n\nHParams = namedtuple('HParams',\n                     'mode, min_lr, lr, batch_size, '\n                     'enc_layers, enc_timesteps, dec_timesteps, '\n                     'min_input_len, num_hidden, emb_dim, max_grad_norm, '\n                     'num_softmax_samples')\n\n\ndef _extract_argmax_and_embed(embedding, output_projection=None,\n                              update_embedding=True):\n  \"\"\"Get a loop_function that extracts the previous symbol and embeds it.\n\n  Args:\n    embedding: embedding tensor for symbols.\n    output_projection: None or a pair (W, B). 
If provided, each fed previous\n      output will first be multiplied by W and have B added.\n    update_embedding: Boolean; if False, the gradients will not propagate\n      through the embeddings.\n\n  Returns:\n    A loop function.\n  \"\"\"\n  def loop_function(prev, _):\n    \"\"\"Function that feeds the previous model output rather than ground truth.\"\"\"\n    if output_projection is not None:\n      prev = tf.nn.xw_plus_b(\n          prev, output_projection[0], output_projection[1])\n    prev_symbol = tf.argmax(prev, 1)\n    # Note that gradients will not propagate through the second parameter of\n    # embedding_lookup.\n    emb_prev = tf.nn.embedding_lookup(embedding, prev_symbol)\n    if not update_embedding:\n      emb_prev = tf.stop_gradient(emb_prev)\n    return emb_prev\n  return loop_function\n\n\nclass Seq2SeqAttentionModel(object):\n  \"\"\"Wrapper for Tensorflow model graph for text sum vectors.\"\"\"\n\n  def __init__(self, hps, vocab, num_gpus=0):\n    self._hps = hps\n    self._vocab = vocab\n    self._num_gpus = num_gpus\n    self._cur_gpu = 0\n\n  def run_train_step(self, sess, article_batch, abstract_batch, targets,\n                     article_lens, abstract_lens, loss_weights):\n    to_return = [self._train_op, self._summaries, self._loss, self.global_step]\n    return sess.run(to_return,\n                    feed_dict={self._articles: article_batch,\n                               self._abstracts: abstract_batch,\n                               self._targets: targets,\n                               self._article_lens: article_lens,\n                               self._abstract_lens: abstract_lens,\n                               self._loss_weights: loss_weights})\n\n  def run_eval_step(self, sess, article_batch, abstract_batch, targets,\n                    article_lens, abstract_lens, loss_weights):\n    to_return = [self._summaries, self._loss, self.global_step]\n    return sess.run(to_return,\n                    feed_dict={self._articles: 
article_batch,\n                               self._abstracts: abstract_batch,\n                               self._targets: targets,\n                               self._article_lens: article_lens,\n                               self._abstract_lens: abstract_lens,\n                               self._loss_weights: loss_weights})\n\n  def run_decode_step(self, sess, article_batch, abstract_batch, targets,\n                      article_lens, abstract_lens, loss_weights):\n    to_return = [self._outputs, self.global_step]\n    return sess.run(to_return,\n                    feed_dict={self._articles: article_batch,\n                               self._abstracts: abstract_batch,\n                               self._targets: targets,\n                               self._article_lens: article_lens,\n                               self._abstract_lens: abstract_lens,\n                               self._loss_weights: loss_weights})\n\n  def _next_device(self):\n    \"\"\"Round robin the gpu device. 
(Reserve last gpu for expensive op).\"\"\"\n    if self._num_gpus == 0:\n      return ''\n    dev = '/gpu:%d' % self._cur_gpu\n    if self._num_gpus > 1:\n      self._cur_gpu = (self._cur_gpu + 1) % (self._num_gpus-1)\n    return dev\n\n  def _get_gpu(self, gpu_id):\n    if self._num_gpus <= 0 or gpu_id >= self._num_gpus:\n      return ''\n    return '/gpu:%d' % gpu_id\n\n  def _add_placeholders(self):\n    \"\"\"Inputs to be fed to the graph.\"\"\"\n    hps = self._hps\n    self._articles = tf.placeholder(tf.int32,\n                                    [hps.batch_size, hps.enc_timesteps],\n                                    name='articles')\n    self._abstracts = tf.placeholder(tf.int32,\n                                     [hps.batch_size, hps.dec_timesteps],\n                                     name='abstracts')\n    self._targets = tf.placeholder(tf.int32,\n                                   [hps.batch_size, hps.dec_timesteps],\n                                   name='targets')\n    self._article_lens = tf.placeholder(tf.int32, [hps.batch_size],\n                                        name='article_lens')\n    self._abstract_lens = tf.placeholder(tf.int32, [hps.batch_size],\n                                         name='abstract_lens')\n    self._loss_weights = tf.placeholder(tf.float32,\n                                        [hps.batch_size, hps.dec_timesteps],\n                                        name='loss_weights')\n\n  def _add_seq2seq(self):\n    hps = self._hps\n    vsize = self._vocab.NumIds()\n\n    with tf.variable_scope('seq2seq'):\n      encoder_inputs = tf.unpack(tf.transpose(self._articles))\n      decoder_inputs = tf.unpack(tf.transpose(self._abstracts))\n      targets = tf.unpack(tf.transpose(self._targets))\n      loss_weights = tf.unpack(tf.transpose(self._loss_weights))\n      article_lens = self._article_lens\n\n      # Embedding shared by the input and outputs.\n      with tf.variable_scope('embedding'), tf.device('/cpu:0'):\n    
    embedding = tf.get_variable(\n            'embedding', [vsize, hps.emb_dim], dtype=tf.float32,\n            initializer=tf.truncated_normal_initializer(stddev=1e-4))\n        emb_encoder_inputs = [tf.nn.embedding_lookup(embedding, x)\n                              for x in encoder_inputs]\n        emb_decoder_inputs = [tf.nn.embedding_lookup(embedding, x)\n                              for x in decoder_inputs]\n\n      for layer_i in xrange(hps.enc_layers):\n        with tf.variable_scope('encoder%d'%layer_i), tf.device(\n            self._next_device()):\n          cell_fw = tf.nn.rnn_cell.LSTMCell(\n              hps.num_hidden,\n              initializer=tf.random_uniform_initializer(-0.1, 0.1, seed=123),\n              state_is_tuple=False)\n          cell_bw = tf.nn.rnn_cell.LSTMCell(\n              hps.num_hidden,\n              initializer=tf.random_uniform_initializer(-0.1, 0.1, seed=113),\n              state_is_tuple=False)\n          (emb_encoder_inputs, fw_state, _) = tf.nn.bidirectional_rnn(\n              cell_fw, cell_bw, emb_encoder_inputs, dtype=tf.float32,\n              sequence_length=article_lens)\n      encoder_outputs = emb_encoder_inputs\n\n      with tf.variable_scope('output_projection'):\n        w = tf.get_variable(\n            'w', [hps.num_hidden, vsize], dtype=tf.float32,\n            initializer=tf.truncated_normal_initializer(stddev=1e-4))\n        w_t = tf.transpose(w)\n        v = tf.get_variable(\n            'v', [vsize], dtype=tf.float32,\n            initializer=tf.truncated_normal_initializer(stddev=1e-4))\n\n      with tf.variable_scope('decoder'), tf.device(self._next_device()):\n        # When decoding, use model output from the previous step\n        # for the next step.\n        loop_function = None\n        if hps.mode == 'decode':\n          loop_function = _extract_argmax_and_embed(\n              embedding, (w, v), update_embedding=False)\n\n        cell = tf.nn.rnn_cell.LSTMCell(\n            hps.num_hidden,\n  
          initializer=tf.random_uniform_initializer(-0.1, 0.1, seed=113),\n            state_is_tuple=False)\n\n        encoder_outputs = [tf.reshape(x, [hps.batch_size, 1, 2*hps.num_hidden])\n                           for x in encoder_outputs]\n        self._enc_top_states = tf.concat(1, encoder_outputs)\n        self._dec_in_state = fw_state\n        # During decoding, follow up _dec_in_state are fed from beam_search.\n        # dec_out_state are stored by beam_search for next step feeding.\n        initial_state_attention = (hps.mode == 'decode')\n        decoder_outputs, self._dec_out_state = tf.nn.seq2seq.attention_decoder(\n            emb_decoder_inputs, self._dec_in_state, self._enc_top_states,\n            cell, num_heads=1, loop_function=loop_function,\n            initial_state_attention=initial_state_attention)\n\n      with tf.variable_scope('output'), tf.device(self._next_device()):\n        model_outputs = []\n        for i in xrange(len(decoder_outputs)):\n          if i > 0:\n            tf.get_variable_scope().reuse_variables()\n          model_outputs.append(\n              tf.nn.xw_plus_b(decoder_outputs[i], w, v))\n\n      if hps.mode == 'decode':\n        with tf.variable_scope('decode_output'), tf.device('/cpu:0'):\n          best_outputs = [tf.argmax(x, 1) for x in model_outputs]\n          tf.logging.info('best_outputs%s', best_outputs[0].get_shape())\n          self._outputs = tf.concat(\n              1, [tf.reshape(x, [hps.batch_size, 1]) for x in best_outputs])\n\n          self._topk_log_probs, self._topk_ids = tf.nn.top_k(\n              tf.log(tf.nn.softmax(model_outputs[-1])), hps.batch_size*2)\n\n      with tf.variable_scope('loss'), tf.device(self._next_device()):\n        def sampled_loss_func(inputs, labels):\n          with tf.device('/cpu:0'):  # Try gpu.\n            labels = tf.reshape(labels, [-1, 1])\n            return tf.nn.sampled_softmax_loss(w_t, v, inputs, labels,\n                                              
hps.num_softmax_samples, vsize)\n\n        if hps.num_softmax_samples != 0 and hps.mode == 'train':\n          self._loss = seq2seq_lib.sampled_sequence_loss(\n              decoder_outputs, targets, loss_weights, sampled_loss_func)\n        else:\n          self._loss = tf.nn.seq2seq.sequence_loss(\n              model_outputs, targets, loss_weights)\n        tf.scalar_summary('loss', tf.minimum(12.0, self._loss))\n\n  def _add_train_op(self):\n    \"\"\"Sets self._train_op, op to run for training.\"\"\"\n    hps = self._hps\n\n    self._lr_rate = tf.maximum(\n        hps.min_lr,  # min_lr_rate.\n        tf.train.exponential_decay(hps.lr, self.global_step, 30000, 0.98))\n\n    tvars = tf.trainable_variables()\n    with tf.device(self._get_gpu(self._num_gpus-1)):\n      grads, global_norm = tf.clip_by_global_norm(\n          tf.gradients(self._loss, tvars), hps.max_grad_norm)\n    tf.scalar_summary('global_norm', global_norm)\n    optimizer = tf.train.GradientDescentOptimizer(self._lr_rate)\n    tf.scalar_summary('learning rate', self._lr_rate)\n    self._train_op = optimizer.apply_gradients(\n        zip(grads, tvars), global_step=self.global_step, name='train_step')\n\n  def encode_top_state(self, sess, enc_inputs, enc_len):\n    \"\"\"Return the top states from encoder for decoder.\n\n    Args:\n      sess: tensorflow session.\n      enc_inputs: encoder inputs of shape [batch_size, enc_timesteps].\n      enc_len: encoder input length of shape [batch_size]\n    Returns:\n      enc_top_states: The top level encoder states.\n      dec_in_state: The decoder layer initial state.\n    \"\"\"\n    results = sess.run([self._enc_top_states, self._dec_in_state],\n                       feed_dict={self._articles: enc_inputs,\n                                  self._article_lens: enc_len})\n    return results[0], results[1][0]\n\n  def decode_topk(self, sess, latest_tokens, enc_top_states, dec_init_states):\n    \"\"\"Return the topK results and new decoder states.\"\"\"\n  
  feed = {\n        self._enc_top_states: enc_top_states,\n        self._dec_in_state:\n            np.squeeze(np.array(dec_init_states)),\n        self._abstracts:\n            np.transpose(np.array([latest_tokens])),\n        self._abstract_lens: np.ones([len(dec_init_states)], np.int32)}\n\n    results = sess.run(\n        [self._topk_ids, self._topk_log_probs, self._dec_out_state],\n        feed_dict=feed)\n\n    ids, probs, states = results[0], results[1], results[2]\n    new_states = [s for s in states]\n    return ids, probs, new_states\n\n  def build_graph(self):\n    self._add_placeholders()\n    self._add_seq2seq()\n    self.global_step = tf.Variable(0, name='global_step', trainable=False)\n    if self._hps.mode == 'train':\n      self._add_train_op()\n    self._summaries = tf.merge_all_summaries()\n"
  },
  {
    "path": "model_zoo/models/textsum/seq2seq_lib.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"seq2seq library codes copied from elsewhere for customization.\"\"\"\n\nimport tensorflow as tf\n\n\n# Adapted to support sampled_softmax loss function, which accepts activations\n# instead of logits.\ndef sequence_loss_by_example(inputs, targets, weights, loss_function,\n                             average_across_timesteps=True, name=None):\n  \"\"\"Sampled softmax loss for a sequence of inputs (per example).\n\n  Args:\n    inputs: List of 2D Tensors of shape [batch_size x hid_dim].\n    targets: List of 1D batch-sized int32 Tensors of the same length as logits.\n    weights: List of 1D batch-sized float-Tensors of the same length as logits.\n    loss_function: Sampled softmax function (inputs, labels) -> loss\n    average_across_timesteps: If set, divide the returned cost by the total\n      label weight.\n    name: Optional name for this operation, default: 'sequence_loss_by_example'.\n\n  Returns:\n    1D batch-sized float Tensor: The log-perplexity for each sequence.\n\n  Raises:\n    ValueError: If len(inputs) is different from len(targets) or len(weights).\n  \"\"\"\n  if len(targets) != len(inputs) or len(weights) != len(inputs):\n    raise ValueError('Lengths of logits, weights, and targets must be the same '\n                     '%d, 
%d, %d.' % (len(inputs), len(weights), len(targets)))\n  with tf.op_scope(inputs + targets + weights, name,\n                   'sequence_loss_by_example'):\n    log_perp_list = []\n    for inp, target, weight in zip(inputs, targets, weights):\n      crossent = loss_function(inp, target)\n      log_perp_list.append(crossent * weight)\n    log_perps = tf.add_n(log_perp_list)\n    if average_across_timesteps:\n      total_size = tf.add_n(weights)\n      total_size += 1e-12  # Just to avoid division by 0 for all-0 weights.\n      log_perps /= total_size\n  return log_perps\n\n\ndef sampled_sequence_loss(inputs, targets, weights, loss_function,\n                          average_across_timesteps=True,\n                          average_across_batch=True, name=None):\n  \"\"\"Weighted cross-entropy loss for a sequence of logits, batch-collapsed.\n\n  Args:\n    inputs: List of 2D Tensors of shape [batch_size x hid_dim].\n    targets: List of 1D batch-sized int32 Tensors of the same length as inputs.\n    weights: List of 1D batch-sized float-Tensors of the same length as inputs.\n    loss_function: Sampled softmax function (inputs, labels) -> loss\n    average_across_timesteps: If set, divide the returned cost by the total\n      label weight.\n    average_across_batch: If set, divide the returned cost by the batch size.\n    name: Optional name for this operation, defaults to 'sequence_loss'.\n\n  Returns:\n    A scalar float Tensor: The average log-perplexity per symbol (weighted).\n\n  Raises:\n    ValueError: If len(inputs) is different from len(targets) or len(weights).\n  \"\"\"\n  with tf.op_scope(inputs + targets + weights, name, 'sampled_sequence_loss'):\n    cost = tf.reduce_sum(sequence_loss_by_example(\n        inputs, targets, weights, loss_function,\n        average_across_timesteps=average_across_timesteps))\n    if average_across_batch:\n      batch_size = tf.shape(targets[0])[0]\n      return cost / tf.cast(batch_size, tf.float32)\n    else:\n      
return cost\n\n\ndef linear(args, output_size, bias, bias_start=0.0, scope=None):\n  \"\"\"Linear map: sum_i(args[i] * W[i]), where W[i] is a variable.\n\n  Args:\n    args: a 2D Tensor or a list of 2D, batch x n, Tensors.\n    output_size: int, second dimension of W[i].\n    bias: boolean, whether to add a bias term or not.\n    bias_start: starting value to initialize the bias; 0 by default.\n    scope: VariableScope for the created subgraph; defaults to \"Linear\".\n\n  Returns:\n    A 2D Tensor with shape [batch x output_size] equal to\n    sum_i(args[i] * W[i]), where W[i]s are newly created matrices.\n\n  Raises:\n    ValueError: if some of the arguments has unspecified or wrong shape.\n  \"\"\"\n  if args is None or (isinstance(args, (list, tuple)) and not args):\n    raise ValueError('`args` must be specified')\n  if not isinstance(args, (list, tuple)):\n    args = [args]\n\n  # Calculate the total size of arguments on dimension 1.\n  total_arg_size = 0\n  shapes = [a.get_shape().as_list() for a in args]\n  for shape in shapes:\n    if len(shape) != 2:\n      raise ValueError('Linear is expecting 2D arguments: %s' % str(shapes))\n    if not shape[1]:\n      raise ValueError('Linear expects shape[1] of arguments: %s' % str(shapes))\n    else:\n      total_arg_size += shape[1]\n\n  # Now the computation.\n  with tf.variable_scope(scope or 'Linear'):\n    matrix = tf.get_variable('Matrix', [total_arg_size, output_size])\n    if len(args) == 1:\n      res = tf.matmul(args[0], matrix)\n    else:\n      res = tf.matmul(tf.concat(1, args), matrix)\n    if not bias:\n      return res\n    bias_term = tf.get_variable(\n        'Bias', [output_size],\n        initializer=tf.constant_initializer(bias_start))\n  return res + bias_term\n"
  },
  {
    "path": "model_zoo/models/transformer/README.md",
    "content": "# Spatial Transformer Network\n\nThe Spatial Transformer Network [1] allows the spatial manipulation of data within the network.\n\n<div align=\"center\">\n  <img width=\"600px\" src=\"http://i.imgur.com/ExGDVul.png\"><br><br>\n</div>\n\n### API\n\nA Spatial Transformer Network implemented in Tensorflow 0.7 and based on [2].\n\n#### How to use\n\n<div align=\"center\">\n  <img src=\"http://i.imgur.com/gfqLV3f.png\"><br><br>\n</div>\n\n```python\ntransformer(U, theta, out_size)\n```\n\n#### Parameters\n\n    U : float\n        The output of a convolutional net, which should have the\n        shape [num_batch, height, width, num_channels].\n    theta: float\n        The output of the\n        localisation network, which should have the shape [num_batch, 6].\n    out_size: tuple of two ints\n        The size of the output of the network\n\n#### Notes\nTo initialize the network to the identity transform, initialize ``theta`` to:\n\n```python\nidentity = np.array([[1., 0., 0.],\n                     [0., 1., 0.]])\nidentity = identity.flatten()\ntheta = tf.Variable(initial_value=identity)\n```\n\n#### Experiments\n\n<div align=\"center\">\n  <img width=\"600px\" src=\"http://i.imgur.com/HtCBYk2.png\"><br><br>\n</div>\n\nWe used cluttered MNIST. The left column shows the input images; the right column shows the parts of each image attended to by the STN.\n\nAll experiments were run in Tensorflow 0.7.\n\n### References\n\n[1] Jaderberg, Max, et al. \"Spatial Transformer Networks.\" arXiv preprint arXiv:1506.02025 (2015)\n\n[2] https://github.com/skaae/transformer_network/blob/master/transformerlayer.py\n"
  },
  {
    "path": "model_zoo/models/transformer/cluttered_mnist.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# =============================================================================\nimport tensorflow as tf\nfrom spatial_transformer import transformer\nimport numpy as np\nfrom tf_utils import weight_variable, bias_variable, dense_to_one_hot\n\n# %% Load data\nmnist_cluttered = np.load('./data/mnist_sequence1_sample_5distortions5x5.npz')\n\nX_train = mnist_cluttered['X_train']\ny_train = mnist_cluttered['y_train']\nX_valid = mnist_cluttered['X_valid']\ny_valid = mnist_cluttered['y_valid']\nX_test = mnist_cluttered['X_test']\ny_test = mnist_cluttered['y_test']\n\n# % turn from dense to one hot representation\nY_train = dense_to_one_hot(y_train, n_classes=10)\nY_valid = dense_to_one_hot(y_valid, n_classes=10)\nY_test = dense_to_one_hot(y_test, n_classes=10)\n\n# %% Graph representation of our network\n\n# %% Placeholders for 40x40 resolution\nx = tf.placeholder(tf.float32, [None, 1600])\ny = tf.placeholder(tf.float32, [None, 10])\n\n# %% Since x is currently [batch, height*width], we need to reshape to a\n# 4-D tensor to use it in a convolutional graph.  If one component of\n# `shape` is the special value -1, the size of that dimension is\n# computed so that the total size remains constant.  
Since we haven't\n# defined the batch dimension's shape yet, we use -1 to denote this\n# dimension should not change size.\nx_tensor = tf.reshape(x, [-1, 40, 40, 1])\n\n# %% We'll setup the two-layer localisation network to figure out the\n# %% parameters for an affine transformation of the input\n# %% Create variables for fully connected layer\nW_fc_loc1 = weight_variable([1600, 20])\nb_fc_loc1 = bias_variable([20])\n\nW_fc_loc2 = weight_variable([20, 6])\n# Use identity transformation as starting point\ninitial = np.array([[1., 0, 0], [0, 1., 0]])\ninitial = initial.astype('float32')\ninitial = initial.flatten()\nb_fc_loc2 = tf.Variable(initial_value=initial, name='b_fc_loc2')\n\n# %% Define the two layer localisation network\nh_fc_loc1 = tf.nn.tanh(tf.matmul(x, W_fc_loc1) + b_fc_loc1)\n# %% We can add dropout for regularizing and to reduce overfitting like so:\nkeep_prob = tf.placeholder(tf.float32)\nh_fc_loc1_drop = tf.nn.dropout(h_fc_loc1, keep_prob)\n# %% Second layer\nh_fc_loc2 = tf.nn.tanh(tf.matmul(h_fc_loc1_drop, W_fc_loc2) + b_fc_loc2)\n\n# %% We'll create a spatial transformer module to identify discriminative\n# %% patches\nout_size = (40, 40)\nh_trans = transformer(x_tensor, h_fc_loc2, out_size)\n\n# %% We'll setup the first convolutional layer\n# Weight matrix is [height x width x input_channels x output_channels]\nfilter_size = 3\nn_filters_1 = 16\nW_conv1 = weight_variable([filter_size, filter_size, 1, n_filters_1])\n\n# %% Bias is [output_channels]\nb_conv1 = bias_variable([n_filters_1])\n\n# %% Now we can build a graph which does the first layer of convolution:\n# we define our stride as batch x height x width x channels\n# instead of pooling, we use strides of 2 and more layers\n# with smaller filters.\n\nh_conv1 = tf.nn.relu(\n    tf.nn.conv2d(input=h_trans,\n                 filter=W_conv1,\n                 strides=[1, 2, 2, 1],\n                 padding='SAME') +\n    b_conv1)\n\n# %% And just like the first layer, add additional layers to 
create\n# a deep net\nn_filters_2 = 16\nW_conv2 = weight_variable([filter_size, filter_size, n_filters_1, n_filters_2])\nb_conv2 = bias_variable([n_filters_2])\nh_conv2 = tf.nn.relu(\n    tf.nn.conv2d(input=h_conv1,\n                 filter=W_conv2,\n                 strides=[1, 2, 2, 1],\n                 padding='SAME') +\n    b_conv2)\n\n# %% We'll now reshape so we can connect to a fully-connected layer:\nh_conv2_flat = tf.reshape(h_conv2, [-1, 10 * 10 * n_filters_2])\n\n# %% Create a fully-connected layer:\nn_fc = 1024\nW_fc1 = weight_variable([10 * 10 * n_filters_2, n_fc])\nb_fc1 = bias_variable([n_fc])\nh_fc1 = tf.nn.relu(tf.matmul(h_conv2_flat, W_fc1) + b_fc1)\n\nh_fc1_drop = tf.nn.dropout(h_fc1, keep_prob)\n\n# %% And finally our softmax layer:\nW_fc2 = weight_variable([n_fc, 10])\nb_fc2 = bias_variable([10])\ny_logits = tf.matmul(h_fc1_drop, W_fc2) + b_fc2\n\n# %% Define loss/eval/training functions\ncross_entropy = tf.reduce_mean(\n    tf.nn.softmax_cross_entropy_with_logits(y_logits, y))\nopt = tf.train.AdamOptimizer()\noptimizer = opt.minimize(cross_entropy)\ngrads = opt.compute_gradients(cross_entropy, [b_fc_loc2])\n\n# %% Monitor accuracy\ncorrect_prediction = tf.equal(tf.argmax(y_logits, 1), tf.argmax(y, 1))\naccuracy = tf.reduce_mean(tf.cast(correct_prediction, 'float'))\n\n# %% We now create a new session to actually perform the initialization the\n# variables:\nsess = tf.Session()\nsess.run(tf.initialize_all_variables())\n\n\n# %% We'll now train in minibatches and report accuracy, loss:\niter_per_epoch = 100\nn_epochs = 500\ntrain_size = 10000\n\nindices = np.linspace(0, 10000 - 1, iter_per_epoch)\nindices = indices.astype('int')\n\nfor epoch_i in range(n_epochs):\n    for iter_i in range(iter_per_epoch - 1):\n        batch_xs = X_train[indices[iter_i]:indices[iter_i+1]]\n        batch_ys = Y_train[indices[iter_i]:indices[iter_i+1]]\n\n        if iter_i % 10 == 0:\n            loss = sess.run(cross_entropy,\n                            
feed_dict={\n                                x: batch_xs,\n                                y: batch_ys,\n                                keep_prob: 1.0\n                            })\n            print('Iteration: ' + str(iter_i) + ' Loss: ' + str(loss))\n\n        sess.run(optimizer, feed_dict={\n            x: batch_xs, y: batch_ys, keep_prob: 0.8})\n\n    print('Accuracy (%d): ' % epoch_i + str(sess.run(accuracy,\n                                                     feed_dict={\n                                                         x: X_valid,\n                                                         y: Y_valid,\n                                                         keep_prob: 1.0\n                                                     })))\n    # theta = sess.run(h_fc_loc2, feed_dict={\n    #        x: batch_xs, keep_prob: 1.0})\n    # print(theta[0])\n"
  },
  {
    "path": "model_zoo/models/transformer/data/README.md",
    "content": "### How to get the data\n\n#### Cluttered MNIST\n\nThe cluttered MNIST dataset can be found here [1] or can be generated via [2].\n\nSettings used for `cluttered_mnist.py`:\n\n```python\nORG_SHP = [28, 28]\nOUT_SHP = [40, 40]\nNUM_DISTORTIONS = 8\ndist_size = (5, 5)\n```\n\n[1] https://github.com/daviddao/spatial-transformer-tensorflow\n\n[2] https://github.com/skaae/recurrent-spatial-transformer-code/blob/master/MNIST_SEQUENCE/create_mnist_sequence.py"
  },
  {
    "path": "model_zoo/models/transformer/example.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nfrom scipy import ndimage\nimport tensorflow as tf\nfrom spatial_transformer import transformer\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# %% Create a batch of three images (1600 x 1200)\n# %% Image retrieved from:\n# %% https://raw.githubusercontent.com/skaae/transformer_network/master/cat.jpg\nim = ndimage.imread('cat.jpg')\nim = im / 255.\nim = im.reshape(1, 1200, 1600, 3)\nim = im.astype('float32')\n\n# %% Let the output size of the transformer be half the image size.\nout_size = (600, 800)\n\n# %% Simulate batch\nbatch = np.append(im, im, axis=0)\nbatch = np.append(batch, im, axis=0)\nnum_batch = 3\n\nx = tf.placeholder(tf.float32, [None, 1200, 1600, 3])\nx = tf.cast(batch, 'float32')\n\n# %% Create localisation network and convolutional layer\nwith tf.variable_scope('spatial_transformer_0'):\n\n    # %% Create a fully-connected layer with 6 output nodes\n    n_fc = 6\n    W_fc1 = tf.Variable(tf.zeros([1200 * 1600 * 3, n_fc]), name='W_fc1')\n\n    # %% Zoom into the image\n    initial = np.array([[0.5, 0, 0], [0, 0.5, 0]])\n    initial = initial.astype('float32')\n    initial = initial.flatten()\n\n    b_fc1 = tf.Variable(initial_value=initial, name='b_fc1')\n    h_fc1 = tf.matmul(tf.zeros([num_batch, 1200 * 1600 * 3]), W_fc1) + b_fc1\n 
   h_trans = transformer(x, h_fc1, out_size)\n\n# %% Run session\nsess = tf.Session()\nsess.run(tf.initialize_all_variables())\ny = sess.run(h_trans, feed_dict={x: batch})\n\n# plt.imshow(y[0])\n"
  },
  {
    "path": "model_zoo/models/transformer/spatial_transformer.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\nimport tensorflow as tf\n\n\ndef transformer(U, theta, out_size, name='SpatialTransformer', **kwargs):\n    \"\"\"Spatial Transformer Layer\n\n    Implements a spatial transformer layer as described in [1]_.\n    Based on [2]_ and edited by David Dao for Tensorflow.\n\n    Parameters\n    ----------\n    U : float\n        The output of a convolutional net should have the\n        shape [num_batch, height, width, num_channels].\n    theta: float\n        The output of the\n        localisation network should be [num_batch, 6].\n    out_size: tuple of two ints\n        The size of the output of the network (height, width)\n\n    References\n    ----------\n    .. [1]  Spatial Transformer Networks\n            Max Jaderberg, Karen Simonyan, Andrew Zisserman, Koray Kavukcuoglu\n            Submitted on 5 Jun 2015\n    .. 
[2]  https://github.com/skaae/transformer_network/blob/master/transformerlayer.py\n\n    Notes\n    -----\n    To initialize the network to the identity transform init\n    ``theta`` to :\n        identity = np.array([[1., 0., 0.],\n                             [0., 1., 0.]])\n        identity = identity.flatten()\n        theta = tf.Variable(initial_value=identity)\n\n    \"\"\"\n\n    def _repeat(x, n_repeats):\n        with tf.variable_scope('_repeat'):\n            rep = tf.transpose(\n                tf.expand_dims(tf.ones(shape=tf.pack([n_repeats, ])), 1), [1, 0])\n            rep = tf.cast(rep, 'int32')\n            x = tf.matmul(tf.reshape(x, (-1, 1)), rep)\n            return tf.reshape(x, [-1])\n\n    def _interpolate(im, x, y, out_size):\n        with tf.variable_scope('_interpolate'):\n            # constants\n            num_batch = tf.shape(im)[0]\n            height = tf.shape(im)[1]\n            width = tf.shape(im)[2]\n            channels = tf.shape(im)[3]\n\n            x = tf.cast(x, 'float32')\n            y = tf.cast(y, 'float32')\n            height_f = tf.cast(height, 'float32')\n            width_f = tf.cast(width, 'float32')\n            out_height = out_size[0]\n            out_width = out_size[1]\n            zero = tf.zeros([], dtype='int32')\n            max_y = tf.cast(tf.shape(im)[1] - 1, 'int32')\n            max_x = tf.cast(tf.shape(im)[2] - 1, 'int32')\n\n            # scale indices from [-1, 1] to [0, width/height]\n            x = (x + 1.0)*(width_f) / 2.0\n            y = (y + 1.0)*(height_f) / 2.0\n\n            # do sampling\n            x0 = tf.cast(tf.floor(x), 'int32')\n            x1 = x0 + 1\n            y0 = tf.cast(tf.floor(y), 'int32')\n            y1 = y0 + 1\n\n            x0 = tf.clip_by_value(x0, zero, max_x)\n            x1 = tf.clip_by_value(x1, zero, max_x)\n            y0 = tf.clip_by_value(y0, zero, max_y)\n            y1 = tf.clip_by_value(y1, zero, max_y)\n            dim2 = width\n            dim1 = 
width*height\n            base = _repeat(tf.range(num_batch)*dim1, out_height*out_width)\n            base_y0 = base + y0*dim2\n            base_y1 = base + y1*dim2\n            idx_a = base_y0 + x0\n            idx_b = base_y1 + x0\n            idx_c = base_y0 + x1\n            idx_d = base_y1 + x1\n\n            # use indices to lookup pixels in the flat image and restore\n            # channels dim\n            im_flat = tf.reshape(im, tf.pack([-1, channels]))\n            im_flat = tf.cast(im_flat, 'float32')\n            Ia = tf.gather(im_flat, idx_a)\n            Ib = tf.gather(im_flat, idx_b)\n            Ic = tf.gather(im_flat, idx_c)\n            Id = tf.gather(im_flat, idx_d)\n\n            # and finally calculate interpolated values\n            x0_f = tf.cast(x0, 'float32')\n            x1_f = tf.cast(x1, 'float32')\n            y0_f = tf.cast(y0, 'float32')\n            y1_f = tf.cast(y1, 'float32')\n            wa = tf.expand_dims(((x1_f-x) * (y1_f-y)), 1)\n            wb = tf.expand_dims(((x1_f-x) * (y-y0_f)), 1)\n            wc = tf.expand_dims(((x-x0_f) * (y1_f-y)), 1)\n            wd = tf.expand_dims(((x-x0_f) * (y-y0_f)), 1)\n            output = tf.add_n([wa*Ia, wb*Ib, wc*Ic, wd*Id])\n            return output\n\n    def _meshgrid(height, width):\n        with tf.variable_scope('_meshgrid'):\n            # This should be equivalent to:\n            #  x_t, y_t = np.meshgrid(np.linspace(-1, 1, width),\n            #                         np.linspace(-1, 1, height))\n            #  ones = np.ones(np.prod(x_t.shape))\n            #  grid = np.vstack([x_t.flatten(), y_t.flatten(), ones])\n            x_t = tf.matmul(tf.ones(shape=tf.pack([height, 1])),\n                            tf.transpose(tf.expand_dims(tf.linspace(-1.0, 1.0, width), 1), [1, 0]))\n            y_t = tf.matmul(tf.expand_dims(tf.linspace(-1.0, 1.0, height), 1),\n                            tf.ones(shape=tf.pack([1, width])))\n\n            x_t_flat = tf.reshape(x_t, (1, -1))\n   
         y_t_flat = tf.reshape(y_t, (1, -1))\n\n            ones = tf.ones_like(x_t_flat)\n            grid = tf.concat(0, [x_t_flat, y_t_flat, ones])\n            return grid\n\n    def _transform(theta, input_dim, out_size):\n        with tf.variable_scope('_transform'):\n            num_batch = tf.shape(input_dim)[0]\n            height = tf.shape(input_dim)[1]\n            width = tf.shape(input_dim)[2]\n            num_channels = tf.shape(input_dim)[3]\n            theta = tf.reshape(theta, (-1, 2, 3))\n            theta = tf.cast(theta, 'float32')\n\n            # grid of (x_t, y_t, 1), eq (1) in ref [1]\n            height_f = tf.cast(height, 'float32')\n            width_f = tf.cast(width, 'float32')\n            out_height = out_size[0]\n            out_width = out_size[1]\n            grid = _meshgrid(out_height, out_width)\n            grid = tf.expand_dims(grid, 0)\n            grid = tf.reshape(grid, [-1])\n            grid = tf.tile(grid, tf.pack([num_batch]))\n            grid = tf.reshape(grid, tf.pack([num_batch, 3, -1]))\n\n            # Transform A x (x_t, y_t, 1)^T -> (x_s, y_s)\n            T_g = tf.batch_matmul(theta, grid)\n            x_s = tf.slice(T_g, [0, 0, 0], [-1, 1, -1])\n            y_s = tf.slice(T_g, [0, 1, 0], [-1, 1, -1])\n            x_s_flat = tf.reshape(x_s, [-1])\n            y_s_flat = tf.reshape(y_s, [-1])\n\n            input_transformed = _interpolate(\n                input_dim, x_s_flat, y_s_flat,\n                out_size)\n\n            output = tf.reshape(\n                input_transformed, tf.pack([num_batch, out_height, out_width, num_channels]))\n            return output\n\n    with tf.variable_scope(name):\n        output = _transform(theta, U, out_size)\n        return output\n\n\ndef batch_transformer(U, thetas, out_size, name='BatchSpatialTransformer'):\n    \"\"\"Batch Spatial Transformer Layer\n\n    Parameters\n    ----------\n\n    U : float\n        tensor of inputs 
[num_batch,height,width,num_channels]\n    thetas : float\n        a set of transformations for each input [num_batch,num_transforms,6]\n    out_size : int\n        the size of the output [out_height,out_width]\n\n    Returns: float\n        Tensor of size [num_batch*num_transforms,out_height,out_width,num_channels]\n    \"\"\"\n    with tf.variable_scope(name):\n        num_batch, num_transforms = map(int, thetas.get_shape().as_list()[:2])\n        indices = [[i]*num_transforms for i in xrange(num_batch)]\n        input_repeated = tf.gather(U, tf.reshape(indices, [-1]))\n        return transformer(input_repeated, thetas, out_size)\n"
  },
  {
    "path": "model_zoo/models/transformer/tf_utils.py",
    "content": "# Copyright 2016 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n# %% Borrowed utils from here: https://github.com/pkmital/tensorflow_tutorials/\nimport tensorflow as tf\nimport numpy as np\n\ndef conv2d(x, n_filters,\n           k_h=5, k_w=5,\n           stride_h=2, stride_w=2,\n           stddev=0.02,\n           activation=lambda x: x,\n           bias=True,\n           padding='SAME',\n           name=\"Conv2D\"):\n    \"\"\"2D Convolution with options for kernel size, stride, and init deviation.\n    Parameters\n    ----------\n    x : Tensor\n        Input tensor to convolve.\n    n_filters : int\n        Number of filters to apply.\n    k_h : int, optional\n        Kernel height.\n    k_w : int, optional\n        Kernel width.\n    stride_h : int, optional\n        Stride in rows.\n    stride_w : int, optional\n        Stride in cols.\n    stddev : float, optional\n        Initialization's standard deviation.\n    activation : arguments, optional\n        Function which applies a nonlinearity\n    padding : str, optional\n        'SAME' or 'VALID'\n    name : str, optional\n        Variable scope to use.\n    Returns\n    -------\n    x : Tensor\n        Convolved input.\n    \"\"\"\n    with tf.variable_scope(name):\n        w = tf.get_variable(\n            'w', [k_h, k_w, x.get_shape()[-1], n_filters],\n     
       initializer=tf.truncated_normal_initializer(stddev=stddev))\n        conv = tf.nn.conv2d(\n            x, w, strides=[1, stride_h, stride_w, 1], padding=padding)\n        if bias:\n            b = tf.get_variable(\n                'b', [n_filters],\n                initializer=tf.truncated_normal_initializer(stddev=stddev))\n            conv = conv + b\n        return conv\n    \ndef linear(x, n_units, scope=None, stddev=0.02,\n           activation=lambda x: x):\n    \"\"\"Fully-connected network.\n    Parameters\n    ----------\n    x : Tensor\n        Input tensor to the network.\n    n_units : int\n        Number of units to connect to.\n    scope : str, optional\n        Variable scope to use.\n    stddev : float, optional\n        Initialization's standard deviation.\n    activation : arguments, optional\n        Function which applies a nonlinearity\n    Returns\n    -------\n    x : Tensor\n        Fully-connected output.\n    \"\"\"\n    shape = x.get_shape().as_list()\n\n    with tf.variable_scope(scope or \"Linear\"):\n        matrix = tf.get_variable(\"Matrix\", [shape[1], n_units], tf.float32,\n                                 tf.random_normal_initializer(stddev=stddev))\n        return activation(tf.matmul(x, matrix))\n    \n# %%\ndef weight_variable(shape):\n    '''Helper function to create a weight variable initialized with\n    a normal distribution\n    Parameters\n    ----------\n    shape : list\n        Size of weight variable\n    '''\n    #initial = tf.random_normal(shape, mean=0.0, stddev=0.01)\n    initial = tf.zeros(shape)\n    return tf.Variable(initial)\n\n# %%\ndef bias_variable(shape):\n    '''Helper function to create a bias variable initialized with\n    a constant value.\n    Parameters\n    ----------\n    shape : list\n        Size of weight variable\n    '''\n    initial = tf.random_normal(shape, mean=0.0, stddev=0.01)\n    return tf.Variable(initial)\n\n# %% \ndef dense_to_one_hot(labels, n_classes=2):\n    \"\"\"Convert 
class labels from scalars to one-hot vectors.\"\"\"\n    labels = np.array(labels)\n    n_labels = labels.shape[0]\n    index_offset = np.arange(n_labels) * n_classes\n    labels_one_hot = np.zeros((n_labels, n_classes), dtype=np.float32)\n    labels_one_hot.flat[index_offset + labels.ravel()] = 1\n    return labels_one_hot\n"
  },
  {
    "path": "model_zoo/models/video_prediction/README.md",
    "content": "# Video Prediction with Neural Advection\n\n*A TensorFlow implementation of the models described in [Finn et al. (2016)](http://arxiv.org/abs/1605.07157).*\n\nThis video prediction model, which is optionally conditioned on actions,\npredicts future video by internally predicting how to transform the last\nimage (which may have been predicted) into the next image. As a result, it can\nreuse appearance information from previous frames and can better generalize to\nobjects not seen in the training set. Some example predictions on novel objects\nare shown below:\n\n![Animation](https://storage.googleapis.com/push_gens/novelgengifs9/16_70.gif)\n![Animation](https://storage.googleapis.com/push_gens/novelgengifs9/2_96.gif)\n![Animation](https://storage.googleapis.com/push_gens/novelgengifs9/1_38.gif)\n![Animation](https://storage.googleapis.com/push_gens/novelgengifs9/11_10.gif)\n![Animation](https://storage.googleapis.com/push_gens/novelgengifs9/3_34.gif)\n\nWhen the model is conditioned on actions, it changes its predictions based on\nthe passed-in action. Here we show the model's predictions in response to varying\nthe magnitude of the passed-in actions, from small to large:\n\n![Animation](https://storage.googleapis.com/push_gens/webgifs/0xact_0.gif)\n![Animation](https://storage.googleapis.com/push_gens/05xact_0.gif)\n![Animation](https://storage.googleapis.com/push_gens/webgifs/1xact_0.gif)\n![Animation](https://storage.googleapis.com/push_gens/webgifs/15xact_0.gif)\n\n![Animation](https://storage.googleapis.com/push_gens/webgifs/0xact_17.gif)\n![Animation](https://storage.googleapis.com/push_gens/webgifs/05xact_17.gif)\n![Animation](https://storage.googleapis.com/push_gens/webgifs/1xact_17.gif)\n![Animation](https://storage.googleapis.com/push_gens/webgifs/15xact_17.gif)\n\n\nBecause the model is trained with an l2 objective, it represents uncertainty as\nblur.\n\n## Requirements\n* Tensorflow (see tensorflow.org for installation instructions)\n* spatial_transformer model in tensorflow/models, for the spatial transformer\n  predictor (STP).\n\n## Data\nThe data used to train this model is located\n[here](https://sites.google.com/site/brainrobotdata/home/push-dataset).\n\nTo download the robot data, run the following.\n```shell\n./download_data.sh\n```\n\n## Training the model\n\nTo train the model, run the prediction_train.py file.\n```shell\npython prediction_train.py\n```\n\nThere are several flags which control the model that is trained, as\nexemplified below:\n```shell\npython prediction_train.py \\\n  --data_dir=push/push_train \\ # path to the training set.\n  --model=CDNA \\ # the model type to use - DNA, CDNA, or STP\n  --output_dir=./checkpoints \\ # where to save model checkpoints\n  --event_log_dir=./summaries \\ # where to save training statistics\n  --num_iterations=100000 \\ # number of training iterations\n  --pretrained_model=model \\ # path to model to initialize from, random if empty\n  --sequence_length=10 \\ # the number of total frames in a sequence\n  --context_frames=2 \\ # the 
number of ground truth frames to pass in at start\n  --use_state=1 \\ # whether or not to condition on actions and the initial state\n  --num_masks=10 \\ # the number of transformations and corresponding masks\n  --schedsamp_k=900.0 \\ # the constant used for scheduled sampling or -1\n  --train_val_split=0.95 \\ # the percentage of training data for validation\n  --batch_size=32 \\ # the training batch size\n  --learning_rate=0.001 \\ # the initial learning rate for the Adam optimizer\n```\n\nIf the dynamic neural advection (DNA) model is being used, the `--num_masks`\noption should be set to one.\n\nThe `--context_frames` option defines both the number of initial ground truth\nframes to pass in, as well as when to start penalizing the model's predictions.\n\nThe data directory `--data_dir` should contain tfrecord files with the format\nused in the released push dataset. See\n[here](https://sites.google.com/site/brainrobotdata/home/push-dataset) for\ndetails. If the `--use_state` option is not set, then the data only needs to\ncontain image sequences, not states and actions.\n\n\n## Contact\n\nTo ask questions or report issues please open an issue on the tensorflow/models\n[issues tracker](https://github.com/tensorflow/models/issues).\nPlease assign issues to @cbfinn.\n\n## Credits\n\nThis code was written by Chelsea Finn.\n"
  },
  {
    "path": "model_zoo/models/video_prediction/download_data.sh",
    "content": "#!/bin/bash\n# Copyright 2016 The TensorFlow Authors All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\n# Example:\n#\n#   download_dataset.sh datafiles.txt ./tmp\n#\n# will download all of the files listed in the file, datafiles.txt, into\n# a directory, \"./tmp\".\n#\n# Each line of the datafiles.txt file should contain the path from the\n# bucket root to a file.\n\nARGC=\"$#\"\nLISTING_FILE=push_datafiles.txt\nif [ \"${ARGC}\" -ge 1 ]; then\n  LISTING_FILE=$1\nfi\nOUTPUT_DIR=\"./\"\nif [ \"${ARGC}\" -ge 2 ]; then\n  OUTPUT_DIR=$2\nfi\n\necho \"OUTPUT_DIR=$OUTPUT_DIR\"\n\nmkdir \"${OUTPUT_DIR}\"\n\nfunction download_file {\n  FILE=$1\n  BUCKET=\"https://storage.googleapis.com/brain-robotics-data\"\n  URL=\"${BUCKET}/${FILE}\"\n  OUTPUT_FILE=\"${OUTPUT_DIR}/${FILE}\"\n  DIRECTORY=`dirname ${OUTPUT_FILE}`\n  echo DIRECTORY=$DIRECTORY\n  mkdir -p \"${DIRECTORY}\"\n  curl --output ${OUTPUT_FILE} ${URL}\n}\n\nwhile read filename; do\n  download_file $filename\ndone <${LISTING_FILE}\n"
  },
  {
    "path": "model_zoo/models/video_prediction/lstm_ops.py",
    "content": "# Copyright 2016 The TensorFlow Authors All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Convolutional LSTM implementation.\"\"\"\n\nimport tensorflow as tf\n\nfrom tensorflow.contrib.slim import add_arg_scope\nfrom tensorflow.contrib.slim import layers\n\n\ndef init_state(inputs,\n               state_shape,\n               state_initializer=tf.zeros_initializer,\n               dtype=tf.float32):\n  \"\"\"Helper function to create an initial state given inputs.\n\n  Args:\n    inputs: input Tensor, at least 2D, the first dimension being batch_size\n    state_shape: the shape of the state.\n    state_initializer: Initializer(shape, dtype) for state Tensor.\n    dtype: Optional dtype, needed when inputs is None.\n  Returns:\n     A tensors representing the initial state.\n  \"\"\"\n  if inputs is not None:\n    # Handle both the dynamic shape as well as the inferred shape.\n    inferred_batch_size = inputs.get_shape().with_rank_at_least(1)[0]\n    batch_size = tf.shape(inputs)[0]\n    dtype = inputs.dtype\n  else:\n    inferred_batch_size = 0\n    batch_size = 0\n\n  initial_state = state_initializer(\n      tf.pack([batch_size] + state_shape),\n      dtype=dtype)\n  initial_state.set_shape([inferred_batch_size] + state_shape)\n\n  return initial_state\n\n\n@add_arg_scope\ndef basic_conv_lstm_cell(inputs,\n                         
state,\n                         num_channels,\n                         filter_size=5,\n                         forget_bias=1.0,\n                         scope=None,\n                         reuse=None):\n  \"\"\"Basic LSTM recurrent network cell, with 2D convolution connctions.\n\n  We add forget_bias (default: 1) to the biases of the forget gate in order to\n  reduce the scale of forgetting in the beginning of the training.\n\n  It does not allow cell clipping, a projection layer, and does not\n  use peep-hole connections: it is the basic baseline.\n\n  Args:\n    inputs: input Tensor, 4D, batch x height x width x channels.\n    state: state Tensor, 4D, batch x height x width x channels.\n    num_channels: the number of output channels in the layer.\n    filter_size: the shape of the each convolution filter.\n    forget_bias: the initial value of the forget biases.\n    scope: Optional scope for variable_scope.\n    reuse: whether or not the layer and the variables should be reused.\n\n  Returns:\n     a tuple of tensors representing output and the new state.\n  \"\"\"\n  spatial_size = inputs.get_shape()[1:3]\n  if state is None:\n    state = init_state(inputs, list(spatial_size) + [2 * num_channels])\n  with tf.variable_scope(scope,\n                         'BasicConvLstmCell',\n                         [inputs, state],\n                         reuse=reuse):\n    inputs.get_shape().assert_has_rank(4)\n    state.get_shape().assert_has_rank(4)\n    c, h = tf.split(3, 2, state)\n    inputs_h = tf.concat(3, [inputs, h])\n    # Parameters of gates are concatenated into one conv for efficiency.\n    i_j_f_o = layers.conv2d(inputs_h,\n                            4 * num_channels, [filter_size, filter_size],\n                            stride=1,\n                            activation_fn=None,\n                            scope='Gates')\n\n    # i = input_gate, j = new_input, f = forget_gate, o = output_gate\n    i, j, f, o = tf.split(3, 4, i_j_f_o)\n\n    new_c 
= c * tf.sigmoid(f + forget_bias) + tf.sigmoid(i) * tf.tanh(j)\n    new_h = tf.tanh(new_c) * tf.sigmoid(o)\n\n    return new_h, tf.concat(3, [new_c, new_h])\n\n\n\n"
  },
  {
    "path": "model_zoo/models/video_prediction/prediction_input.py",
    "content": "# Copyright 2016 The TensorFlow Authors All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Code for building the input for the prediction model.\"\"\"\n\nimport os\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.platform import flags\nfrom tensorflow.python.platform import gfile\n\n\nFLAGS = flags.FLAGS\n\n# Original image dimensions\nORIGINAL_WIDTH = 640\nORIGINAL_HEIGHT = 512\nCOLOR_CHAN = 3\n\n# Default image dimensions.\nIMG_WIDTH = 64\nIMG_HEIGHT = 64\n\n# Dimension of the state and action.\nSTATE_DIM = 5\n\n\ndef build_tfrecord_input(training=True):\n  \"\"\"Create input tfrecord tensors.\n\n  Args:\n    training: training or validation data.\n  Returns:\n    list of tensors corresponding to images, actions, and states. The images\n    tensor is 5D, batch x time x height x width x channels. 
The state and\n    action tensors are 3D, batch x time x dimension.\n  Raises:\n    RuntimeError: if no files found.\n  \"\"\"\n  filenames = gfile.Glob(os.path.join(FLAGS.data_dir, '*'))\n  if not filenames:\n    raise RuntimeError('No data files found.')\n  index = int(np.floor(FLAGS.train_val_split * len(filenames)))\n  if training:\n    filenames = filenames[:index]\n  else:\n    filenames = filenames[index:]\n  filename_queue = tf.train.string_input_producer(filenames, shuffle=True)\n  reader = tf.TFRecordReader()\n  _, serialized_example = reader.read(filename_queue)\n\n  image_seq, state_seq, action_seq = [], [], []\n\n  for i in range(FLAGS.sequence_length):\n    image_name = 'move/' + str(i) + '/image/encoded'\n    action_name = 'move/' + str(i) + '/commanded_pose/vec_pitch_yaw'\n    state_name = 'move/' + str(i) + '/endeffector/vec_pitch_yaw'\n    if FLAGS.use_state:\n      features = {image_name: tf.FixedLenFeature([1], tf.string),\n                  action_name: tf.FixedLenFeature([STATE_DIM], tf.float32),\n                  state_name: tf.FixedLenFeature([STATE_DIM], tf.float32)}\n    else:\n      features = {image_name: tf.FixedLenFeature([1], tf.string)}\n    features = tf.parse_single_example(serialized_example, features=features)\n\n    image_buffer = tf.reshape(features[image_name], shape=[])\n    image = tf.image.decode_jpeg(image_buffer, channels=COLOR_CHAN)\n    image.set_shape([ORIGINAL_HEIGHT, ORIGINAL_WIDTH, COLOR_CHAN])\n\n    if IMG_HEIGHT != IMG_WIDTH:\n      raise ValueError('Unequal height and width unsupported')\n\n    crop_size = min(ORIGINAL_HEIGHT, ORIGINAL_WIDTH)\n    image = tf.image.resize_image_with_crop_or_pad(image, crop_size, crop_size)\n    image = tf.reshape(image, [1, crop_size, crop_size, COLOR_CHAN])\n    image = tf.image.resize_bicubic(image, [IMG_HEIGHT, IMG_WIDTH])\n    image = tf.cast(image, tf.float32) / 255.0\n    image_seq.append(image)\n\n    if FLAGS.use_state:\n      state = tf.reshape(features[state_name], 
shape=[1, STATE_DIM])\n      state_seq.append(state)\n      action = tf.reshape(features[action_name], shape=[1, STATE_DIM])\n      action_seq.append(action)\n\n  image_seq = tf.concat(0, image_seq)\n\n  if FLAGS.use_state:\n    state_seq = tf.concat(0, state_seq)\n    action_seq = tf.concat(0, action_seq)\n    [image_batch, action_batch, state_batch] = tf.train.batch(\n        [image_seq, action_seq, state_seq],\n        FLAGS.batch_size,\n        num_threads=FLAGS.batch_size,\n        capacity=100 * FLAGS.batch_size)\n    return image_batch, action_batch, state_batch\n  else:\n    image_batch = tf.train.batch(\n        [image_seq],\n        FLAGS.batch_size,\n        num_threads=FLAGS.batch_size,\n        capacity=100 * FLAGS.batch_size)\n    zeros_batch = tf.zeros([FLAGS.batch_size, FLAGS.sequence_length, STATE_DIM])\n    return image_batch, zeros_batch, zeros_batch\n\n"
  },
  {
    "path": "model_zoo/models/video_prediction/prediction_model.py",
    "content": "# Copyright 2016 The TensorFlow Authors All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Model architecture for predictive model, including CDNA, DNA, and STP.\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\n\nimport tensorflow.contrib.slim as slim\nfrom tensorflow.contrib.layers.python import layers as tf_layers\nfrom lstm_ops import basic_conv_lstm_cell\n\n# Amount to use when lower bounding tensors\nRELU_SHIFT = 1e-12\n\n# kernel size for DNA and CDNA.\nDNA_KERN_SIZE = 5\n\n\ndef construct_model(images,\n                    actions=None,\n                    states=None,\n                    iter_num=-1.0,\n                    k=-1,\n                    use_state=True,\n                    num_masks=10,\n                    stp=False,\n                    cdna=True,\n                    dna=False,\n                    context_frames=2):\n  \"\"\"Build convolutional lstm video predictor using STP, CDNA, or DNA.\n\n  Args:\n    images: tensor of ground truth image sequences\n    actions: tensor of action sequences\n    states: tensor of ground truth state sequences\n    iter_num: tensor of the current training iteration (for sched. sampling)\n    k: constant used for scheduled sampling. 
-1 to feed in own prediction.\n    use_state: True to include state and action in prediction\n    num_masks: the number of different pixel motion predictions (and\n               the number of masks for each of those predictions)\n    stp: True to use Spatial Transformer Predictor (STP)\n    cdna: True to use Convolutional Dynamic Neural Advection (CDNA)\n    dna: True to use Dynamic Neural Advection (DNA)\n    context_frames: number of ground truth frames to pass in before\n                    feeding in own predictions\n  Returns:\n    gen_images: predicted future image frames\n    gen_states: predicted future states\n\n  Raises:\n    ValueError: if more than one network option specified or more than one mask\n    specified for the DNA model.\n  \"\"\"\n  if stp + cdna + dna != 1:\n    raise ValueError('More than one, or no network option specified.')\n  batch_size, img_height, img_width, color_channels = images[0].get_shape()[0:4]\n  lstm_func = basic_conv_lstm_cell\n\n  # Generated robot states and images.\n  gen_states, gen_images = [], []\n  current_state = states[0]\n\n  if k == -1:\n    feedself = True\n  else:\n    # Scheduled sampling:\n    # Calculate number of ground-truth frames to pass in.\n    num_ground_truth = tf.to_int32(\n        tf.round(tf.to_float(batch_size) * (k / (k + tf.exp(iter_num / k)))))\n    feedself = False\n\n  # LSTM state sizes and states.\n  lstm_size = np.int32(np.array([32, 32, 64, 64, 128, 64, 32]))\n  lstm_state1, lstm_state2, lstm_state3, lstm_state4 = None, None, None, None\n  lstm_state5, lstm_state6, lstm_state7 = None, None, None\n\n  for image, action in zip(images[:-1], actions[:-1]):\n    # Reuse variables after the first timestep.\n    reuse = bool(gen_images)\n\n    done_warm_start = len(gen_images) > context_frames - 1\n    with slim.arg_scope(\n        [lstm_func, slim.layers.conv2d, slim.layers.fully_connected,\n         tf_layers.layer_norm, slim.layers.conv2d_transpose],\n        reuse=reuse):\n\n      if feedself 
and done_warm_start:\n        # Feed in generated image.\n        prev_image = gen_images[-1]\n      elif done_warm_start:\n        # Scheduled sampling\n        prev_image = scheduled_sample(image, gen_images[-1], batch_size,\n                                      num_ground_truth)\n      else:\n        # Always feed in ground_truth\n        prev_image = image\n\n      # Predicted state is always fed back in\n      state_action = tf.concat(1, [action, current_state])\n\n      enc0 = slim.layers.conv2d(\n          prev_image,\n          32, [5, 5],\n          stride=2,\n          scope='scale1_conv1',\n          normalizer_fn=tf_layers.layer_norm,\n          normalizer_params={'scope': 'layer_norm1'})\n\n      hidden1, lstm_state1 = lstm_func(\n          enc0, lstm_state1, lstm_size[0], scope='state1')\n      hidden1 = tf_layers.layer_norm(hidden1, scope='layer_norm2')\n      hidden2, lstm_state2 = lstm_func(\n          hidden1, lstm_state2, lstm_size[1], scope='state2')\n      hidden2 = tf_layers.layer_norm(hidden2, scope='layer_norm3')\n      enc1 = slim.layers.conv2d(\n          hidden2, hidden2.get_shape()[3], [3, 3], stride=2, scope='conv2')\n\n      hidden3, lstm_state3 = lstm_func(\n          enc1, lstm_state3, lstm_size[2], scope='state3')\n      hidden3 = tf_layers.layer_norm(hidden3, scope='layer_norm4')\n      hidden4, lstm_state4 = lstm_func(\n          hidden3, lstm_state4, lstm_size[3], scope='state4')\n      hidden4 = tf_layers.layer_norm(hidden4, scope='layer_norm5')\n      enc2 = slim.layers.conv2d(\n          hidden4, hidden4.get_shape()[3], [3, 3], stride=2, scope='conv3')\n\n      # Pass in state and action.\n      smear = tf.reshape(\n          state_action,\n          [int(batch_size), 1, 1, int(state_action.get_shape()[1])])\n      smear = tf.tile(\n          smear, [1, int(enc2.get_shape()[1]), int(enc2.get_shape()[2]), 1])\n      if use_state:\n        enc2 = tf.concat(3, [enc2, smear])\n      enc3 = slim.layers.conv2d(\n          enc2, 
hidden4.get_shape()[3], [1, 1], stride=1, scope='conv4')\n\n      hidden5, lstm_state5 = lstm_func(\n          enc3, lstm_state5, lstm_size[4], scope='state5')  # last 8x8\n      hidden5 = tf_layers.layer_norm(hidden5, scope='layer_norm6')\n      enc4 = slim.layers.conv2d_transpose(\n          hidden5, hidden5.get_shape()[3], 3, stride=2, scope='convt1')\n\n      hidden6, lstm_state6 = lstm_func(\n          enc4, lstm_state6, lstm_size[5], scope='state6')  # 16x16\n      hidden6 = tf_layers.layer_norm(hidden6, scope='layer_norm7')\n      # Skip connection.\n      hidden6 = tf.concat(3, [hidden6, enc1])  # both 16x16\n\n      enc5 = slim.layers.conv2d_transpose(\n          hidden6, hidden6.get_shape()[3], 3, stride=2, scope='convt2')\n      hidden7, lstm_state7 = lstm_func(\n          enc5, lstm_state7, lstm_size[6], scope='state7')  # 32x32\n      hidden7 = tf_layers.layer_norm(hidden7, scope='layer_norm8')\n\n      # Skip connection.\n      hidden7 = tf.concat(3, [hidden7, enc0])  # both 32x32\n\n      enc6 = slim.layers.conv2d_transpose(\n          hidden7,\n          hidden7.get_shape()[3], 3, stride=2, scope='convt3',\n          normalizer_fn=tf_layers.layer_norm,\n          normalizer_params={'scope': 'layer_norm9'})\n\n      if dna:\n        # Using largest hidden state for predicting untied conv kernels.\n        enc7 = slim.layers.conv2d_transpose(\n            enc6, DNA_KERN_SIZE**2, 1, stride=1, scope='convt4')\n      else:\n        # Using largest hidden state for predicting a new image layer.\n        enc7 = slim.layers.conv2d_transpose(\n            enc6, color_channels, 1, stride=1, scope='convt4')\n        # This allows the network to also generate one image from scratch,\n        # which is useful when regions of the image become unoccluded.\n        transformed = [tf.nn.sigmoid(enc7)]\n\n      if stp:\n        stp_input0 = tf.reshape(hidden5, [int(batch_size), -1])\n        stp_input1 = slim.layers.fully_connected(\n            stp_input0, 100, 
scope='fc_stp')\n        transformed += stp_transformation(prev_image, stp_input1, num_masks)\n      elif cdna:\n        cdna_input = tf.reshape(hidden5, [int(batch_size), -1])\n        transformed += cdna_transformation(prev_image, cdna_input, num_masks,\n                                           int(color_channels))\n      elif dna:\n        # Only one mask is supported (more should be unnecessary).\n        if num_masks != 1:\n          raise ValueError('Only one mask is supported for DNA model.')\n        transformed = [dna_transformation(prev_image, enc7)]\n\n      masks = slim.layers.conv2d_transpose(\n          enc6, num_masks + 1, 1, stride=1, scope='convt7')\n      masks = tf.reshape(\n          tf.nn.softmax(tf.reshape(masks, [-1, num_masks + 1])),\n          [int(batch_size), int(img_height), int(img_width), num_masks + 1])\n      mask_list = tf.split(3, num_masks + 1, masks)\n      output = mask_list[0] * prev_image\n      for layer, mask in zip(transformed, mask_list[1:]):\n        output += layer * mask\n      gen_images.append(output)\n\n      current_state = slim.layers.fully_connected(\n          state_action,\n          int(current_state.get_shape()[1]),\n          scope='state_pred',\n          activation_fn=None)\n      gen_states.append(current_state)\n\n  return gen_images, gen_states\n\n\n## Utility functions\ndef stp_transformation(prev_image, stp_input, num_masks):\n  \"\"\"Apply spatial transformer predictor (STP) to previous image.\n\n  Args:\n    prev_image: previous image to be transformed.\n    stp_input: hidden layer to be used for computing STN parameters.\n    num_masks: number of masks and hence the number of STP transformations.\n  Returns:\n    List of images transformed by the predicted STP parameters.\n  \"\"\"\n  # Only import spatial transformer if needed.\n  from spatial_transformer import transformer\n\n  identity_params = tf.convert_to_tensor(\n      np.array([1.0, 0.0, 0.0, 0.0, 1.0, 0.0], np.float32))\n  transformed = 
[]\n  for i in range(num_masks - 1):\n    params = slim.layers.fully_connected(\n        stp_input, 6, scope='stp_params' + str(i),\n        activation_fn=None) + identity_params\n    transformed.append(transformer(prev_image, params))\n\n  return transformed\n\n\ndef cdna_transformation(prev_image, cdna_input, num_masks, color_channels):\n  \"\"\"Apply convolutional dynamic neural advection to previous image.\n\n  Args:\n    prev_image: previous image to be transformed.\n    cdna_input: hidden layer to be used for computing CDNA kernels.\n    num_masks: the number of masks and hence the number of CDNA transformations.\n    color_channels: the number of color channels in the images.\n  Returns:\n    List of images transformed by the predicted CDNA kernels.\n  \"\"\"\n  batch_size = int(cdna_input.get_shape()[0])\n\n  # Predict kernels using linear function of last hidden layer.\n  cdna_kerns = slim.layers.fully_connected(\n      cdna_input,\n      DNA_KERN_SIZE * DNA_KERN_SIZE * num_masks,\n      scope='cdna_params',\n      activation_fn=None)\n\n  # Reshape and normalize.\n  cdna_kerns = tf.reshape(\n      cdna_kerns, [batch_size, DNA_KERN_SIZE, DNA_KERN_SIZE, 1, num_masks])\n  cdna_kerns = tf.nn.relu(cdna_kerns - RELU_SHIFT) + RELU_SHIFT\n  norm_factor = tf.reduce_sum(cdna_kerns, [1, 2, 3], keep_dims=True)\n  cdna_kerns /= norm_factor\n\n  cdna_kerns = tf.tile(cdna_kerns, [1, 1, 1, color_channels, 1])\n  cdna_kerns = tf.split(0, batch_size, cdna_kerns)\n  prev_images = tf.split(0, batch_size, prev_image)\n\n  # Transform image.\n  transformed = []\n  for kernel, preimg in zip(cdna_kerns, prev_images):\n    kernel = tf.squeeze(kernel)\n    if len(kernel.get_shape()) == 3:\n      kernel = tf.expand_dims(kernel, -1)\n    transformed.append(\n        tf.nn.depthwise_conv2d(preimg, kernel, [1, 1, 1, 1], 'SAME'))\n  transformed = tf.concat(0, transformed)\n  transformed = tf.split(3, num_masks, transformed)\n  return transformed\n\n\ndef dna_transformation(prev_image, 
dna_input):\n  \"\"\"Apply dynamic neural advection to previous image.\n\n  Args:\n    prev_image: previous image to be transformed.\n    dna_input: hidden layer to be used for computing DNA transformation.\n  Returns:\n    Image transformed by the predicted DNA kernels.\n  \"\"\"\n  # Construct translated images.\n  prev_image_pad = tf.pad(prev_image, [[0, 0], [2, 2], [2, 2], [0, 0]])\n  image_height = int(prev_image.get_shape()[1])\n  image_width = int(prev_image.get_shape()[2])\n\n  inputs = []\n  for xkern in range(DNA_KERN_SIZE):\n    for ykern in range(DNA_KERN_SIZE):\n      inputs.append(\n          tf.expand_dims(\n              tf.slice(prev_image_pad, [0, xkern, ykern, 0],\n                       [-1, image_height, image_width, -1]), [3]))\n  inputs = tf.concat(3, inputs)\n\n  # Normalize channels to 1.\n  kernel = tf.nn.relu(dna_input - RELU_SHIFT) + RELU_SHIFT\n  kernel = tf.expand_dims(\n      kernel / tf.reduce_sum(\n          kernel, [3], keep_dims=True), [4])\n  return tf.reduce_sum(kernel * inputs, [3], keep_dims=False)\n\n\ndef scheduled_sample(ground_truth_x, generated_x, batch_size, num_ground_truth):\n  \"\"\"Sample batch with specified mix of ground truth and generated data points.\n\n  Args:\n    ground_truth_x: tensor of ground-truth data points.\n    generated_x: tensor of generated data points.\n    batch_size: batch size\n    num_ground_truth: number of ground-truth examples to include in batch.\n  Returns:\n    New batch with num_ground_truth sampled from ground_truth_x and the rest\n    from generated_x.\n  \"\"\"\n  idx = tf.random_shuffle(tf.range(int(batch_size)))\n  ground_truth_idx = tf.gather(idx, tf.range(num_ground_truth))\n  generated_idx = tf.gather(idx, tf.range(num_ground_truth, int(batch_size)))\n\n  ground_truth_examps = tf.gather(ground_truth_x, ground_truth_idx)\n  generated_examps = tf.gather(generated_x, generated_idx)\n  return tf.dynamic_stitch([ground_truth_idx, generated_idx],\n                           
[ground_truth_examps, generated_examps])\n"
  },
  {
    "path": "model_zoo/models/video_prediction/prediction_train.py",
    "content": "# Copyright 2016 The TensorFlow Authors All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n#     http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n\n\"\"\"Code for training the prediction model.\"\"\"\n\nimport numpy as np\nimport tensorflow as tf\n\nfrom tensorflow.python.platform import app\nfrom tensorflow.python.platform import flags\n\nfrom prediction_input import build_tfrecord_input\nfrom prediction_model import construct_model\n\n# How often to record tensorboard summaries.\nSUMMARY_INTERVAL = 40\n\n# How often to run a batch through the validation model.\nVAL_INTERVAL = 200\n\n# How often to save a model checkpoint\nSAVE_INTERVAL = 2000\n\n# tf record data location:\nDATA_DIR = 'push/push_train'\n\n# local output directory\nOUT_DIR = '/tmp/data'\n\nFLAGS = flags.FLAGS\n\nflags.DEFINE_string('data_dir', DATA_DIR, 'directory containing data.')\nflags.DEFINE_string('output_dir', OUT_DIR, 'directory for model checkpoints.')\nflags.DEFINE_string('event_log_dir', OUT_DIR, 'directory for writing summary.')\nflags.DEFINE_integer('num_iterations', 100000, 'number of training iterations.')\nflags.DEFINE_string('pretrained_model', '',\n                    'filepath of a pretrained model to initialize from.')\n\nflags.DEFINE_integer('sequence_length', 10,\n                     'sequence length, including context frames.')\nflags.DEFINE_integer('context_frames', 2, '# of frames before 
predictions.')\nflags.DEFINE_integer('use_state', 1,\n                     'Whether or not to give the state+action to the model')\n\nflags.DEFINE_string('model', 'CDNA',\n                    'model architecture to use - CDNA, DNA, or STP')\n\nflags.DEFINE_integer('num_masks', 10,\n                     'number of masks, usually 1 for DNA, 10 for CDNA, STP.')\nflags.DEFINE_float('schedsamp_k', 900.0,\n                   'The k hyperparameter for scheduled sampling,'\n                   '-1 for no scheduled sampling.')\nflags.DEFINE_float('train_val_split', 0.95,\n                   'The percentage of files to use for the training set,'\n                   ' vs. the validation set.')\n\nflags.DEFINE_integer('batch_size', 32, 'batch size for training')\nflags.DEFINE_float('learning_rate', 0.001,\n                   'the base learning rate of the generator')\n\n\n## Helper functions\ndef peak_signal_to_noise_ratio(true, pred):\n  \"\"\"Image quality metric based on maximal signal power vs. power of the noise.\n\n  Args:\n    true: the ground truth image.\n    pred: the predicted image.\n  Returns:\n    peak signal to noise ratio (PSNR)\n  \"\"\"\n  return 10.0 * tf.log(1.0 / mean_squared_error(true, pred)) / tf.log(10.0)\n\n\ndef mean_squared_error(true, pred):\n  \"\"\"L2 distance between tensors true and pred.\n\n  Args:\n    true: the ground truth image.\n    pred: the predicted image.\n  Returns:\n    mean squared error between ground truth and predicted image.\n  \"\"\"\n  return tf.reduce_sum(tf.square(true - pred)) / tf.to_float(tf.size(pred))\n\n\nclass Model(object):\n\n  def __init__(self,\n               images=None,\n               actions=None,\n               states=None,\n               sequence_length=None,\n               reuse_scope=None):\n\n    if sequence_length is None:\n      sequence_length = FLAGS.sequence_length\n\n    self.prefix = prefix = tf.placeholder(tf.string, [])\n    self.iter_num = tf.placeholder(tf.float32, [])\n    summaries = 
[]\n\n    # Split into timesteps.\n    actions = tf.split(1, actions.get_shape()[1], actions)\n    actions = [tf.squeeze(act) for act in actions]\n    states = tf.split(1, states.get_shape()[1], states)\n    states = [tf.squeeze(st) for st in states]\n    images = tf.split(1, images.get_shape()[1], images)\n    images = [tf.squeeze(img) for img in images]\n\n    if reuse_scope is None:\n      gen_images, gen_states = construct_model(\n          images,\n          actions,\n          states,\n          iter_num=self.iter_num,\n          k=FLAGS.schedsamp_k,\n          use_state=FLAGS.use_state,\n          num_masks=FLAGS.num_masks,\n          cdna=FLAGS.model == 'CDNA',\n          dna=FLAGS.model == 'DNA',\n          stp=FLAGS.model == 'STP',\n          context_frames=FLAGS.context_frames)\n    else:  # If it's a validation or test model.\n      with tf.variable_scope(reuse_scope, reuse=True):\n        gen_images, gen_states = construct_model(\n            images,\n            actions,\n            states,\n            iter_num=self.iter_num,\n            k=FLAGS.schedsamp_k,\n            use_state=FLAGS.use_state,\n            num_masks=FLAGS.num_masks,\n            cdna=FLAGS.model == 'CDNA',\n            dna=FLAGS.model == 'DNA',\n            stp=FLAGS.model == 'STP',\n            context_frames=FLAGS.context_frames)\n\n    # L2 loss, PSNR for eval.\n    loss, psnr_all = 0.0, 0.0\n    for i, x, gx in zip(\n        range(len(gen_images)), images[FLAGS.context_frames:],\n        gen_images[FLAGS.context_frames - 1:]):\n      recon_cost = mean_squared_error(x, gx)\n      psnr_i = peak_signal_to_noise_ratio(x, gx)\n      psnr_all += psnr_i\n      summaries.append(\n          tf.scalar_summary(prefix + '_recon_cost' + str(i), recon_cost))\n      summaries.append(tf.scalar_summary(prefix + '_psnr' + str(i), psnr_i))\n      loss += recon_cost\n\n    for i, state, gen_state in zip(\n        range(len(gen_states)), states[FLAGS.context_frames:],\n        
gen_states[FLAGS.context_frames - 1:]):\n      state_cost = mean_squared_error(state, gen_state) * 1e-4\n      summaries.append(\n          tf.scalar_summary(prefix + '_state_cost' + str(i), state_cost))\n      loss += state_cost\n    summaries.append(tf.scalar_summary(prefix + '_psnr_all', psnr_all))\n    self.psnr_all = psnr_all\n\n    self.loss = loss = loss / np.float32(len(images) - FLAGS.context_frames)\n\n    summaries.append(tf.scalar_summary(prefix + '_loss', loss))\n\n    self.lr = tf.placeholder_with_default(FLAGS.learning_rate, ())\n\n    self.train_op = tf.train.AdamOptimizer(self.lr).minimize(loss)\n    self.summ_op = tf.merge_summary(summaries)\n\n\ndef main(unused_argv):\n\n  print 'Constructing models and inputs.'\n  with tf.variable_scope('model', reuse=None) as training_scope:\n    images, actions, states = build_tfrecord_input(training=True)\n    model = Model(images, actions, states, FLAGS.sequence_length)\n\n  with tf.variable_scope('val_model', reuse=None):\n    val_images, val_actions, val_states = build_tfrecord_input(training=False)\n    val_model = Model(val_images, val_actions, val_states,\n                      FLAGS.sequence_length, training_scope)\n\n  print 'Constructing saver.'\n  # Make saver.\n  saver = tf.train.Saver(\n      tf.get_collection(tf.GraphKeys.VARIABLES), max_to_keep=0)\n\n  # Make training session.\n  sess = tf.InteractiveSession()\n  summary_writer = tf.train.SummaryWriter(\n      FLAGS.event_log_dir, graph=sess.graph, flush_secs=10)\n\n  if FLAGS.pretrained_model:\n    saver.restore(sess, FLAGS.pretrained_model)\n\n  tf.train.start_queue_runners(sess)\n  sess.run(tf.initialize_all_variables())\n\n  tf.logging.info('iteration number, cost')\n\n  # Run training.\n  for itr in range(FLAGS.num_iterations):\n    # Generate new batch of data.\n    feed_dict = {model.prefix: 'train',\n                 model.iter_num: np.float32(itr),\n                 model.lr: FLAGS.learning_rate}\n    cost, _, summary_str = 
sess.run([model.loss, model.train_op, model.summ_op],\n                                    feed_dict)\n\n    # Print info: iteration #, cost.\n    tf.logging.info(str(itr) + ' ' + str(cost))\n\n    if (itr) % VAL_INTERVAL == 2:\n      # Run through validation set.\n      feed_dict = {val_model.lr: 0.0,\n                   val_model.prefix: 'val',\n                   val_model.iter_num: np.float32(itr)}\n      _, val_summary_str = sess.run([val_model.train_op, val_model.summ_op],\n                                     feed_dict)\n      summary_writer.add_summary(val_summary_str, itr)\n\n    if (itr) % SAVE_INTERVAL == 2:\n      tf.logging.info('Saving model.')\n      saver.save(sess, FLAGS.output_dir + '/model' + str(itr))\n\n    if (itr) % SUMMARY_INTERVAL:\n      summary_writer.add_summary(summary_str, itr)\n\n  tf.logging.info('Saving model.')\n  saver.save(sess, FLAGS.output_dir + '/model')\n  tf.logging.info('Training complete')\n  tf.logging.flush()\n\n\nif __name__ == '__main__':\n  app.run()\n"
  },
  {
    "path": "model_zoo/models/video_prediction/push_datafiles.txt",
    "content": "push/push_testnovel/push_testnovel.tfrecord-00000-of-00005\npush/push_testnovel/push_testnovel.tfrecord-00001-of-00005\npush/push_testnovel/push_testnovel.tfrecord-00002-of-00005\npush/push_testnovel/push_testnovel.tfrecord-00003-of-00005\npush/push_testnovel/push_testnovel.tfrecord-00004-of-00005\npush/push_testseen/push_testseen.tfrecord-00000-of-00005\npush/push_testseen/push_testseen.tfrecord-00001-of-00005\npush/push_testseen/push_testseen.tfrecord-00002-of-00005\npush/push_testseen/push_testseen.tfrecord-00003-of-00005\npush/push_testseen/push_testseen.tfrecord-00004-of-00005\npush/push_train/push_train.tfrecord-00000-of-00264\npush/push_train/push_train.tfrecord-00001-of-00264\npush/push_train/push_train.tfrecord-00002-of-00264\npush/push_train/push_train.tfrecord-00003-of-00264\npush/push_train/push_train.tfrecord-00004-of-00264\npush/push_train/push_train.tfrecord-00005-of-00264\npush/push_train/push_train.tfrecord-00006-of-00264\npush/push_train/push_train.tfrecord-00007-of-00264\npush/push_train/push_train.tfrecord-00008-of-00264\npush/push_train/push_train.tfrecord-00009-of-00264\npush/push_train/push_train.tfrecord-00010-of-00264\npush/push_train/push_train.tfrecord-00011-of-00264\npush/push_train/push_train.tfrecord-00012-of-00264\npush/push_train/push_train.tfrecord-00013-of-00264\npush/push_train/push_train.tfrecord-00014-of-00264\npush/push_train/push_train.tfrecord-00015-of-00264\npush/push_train/push_train.tfrecord-00016-of-00264\npush/push_train/push_train.tfrecord-00017-of-00264\npush/push_train/push_train.tfrecord-00018-of-00264\npush/push_train/push_train.tfrecord-00019-of-00264\npush/push_train/push_train.tfrecord-00020-of-00264\npush/push_train/push_train.tfrecord-00021-of-00264\npush/push_train/push_train.tfrecord-00022-of-00264\npush/push_train/push_train.tfrecord-00023-of-00264\npush/push_train/push_train.tfrecord-00024-of-00264\npush/push_train/push_train.tfrecord-00025-of-00264\npush/push_train/push_train.tfrecord-00026-
of-00264\npush/push_train/push_train.tfrecord-00027-of-00264\npush/push_train/push_train.tfrecord-00028-of-00264\npush/push_train/push_train.tfrecord-00029-of-00264\npush/push_train/push_train.tfrecord-00030-of-00264\npush/push_train/push_train.tfrecord-00031-of-00264\npush/push_train/push_train.tfrecord-00032-of-00264\npush/push_train/push_train.tfrecord-00033-of-00264\npush/push_train/push_train.tfrecord-00034-of-00264\npush/push_train/push_train.tfrecord-00035-of-00264\npush/push_train/push_train.tfrecord-00036-of-00264\npush/push_train/push_train.tfrecord-00037-of-00264\npush/push_train/push_train.tfrecord-00038-of-00264\npush/push_train/push_train.tfrecord-00039-of-00264\npush/push_train/push_train.tfrecord-00040-of-00264\npush/push_train/push_train.tfrecord-00041-of-00264\npush/push_train/push_train.tfrecord-00042-of-00264\npush/push_train/push_train.tfrecord-00043-of-00264\npush/push_train/push_train.tfrecord-00044-of-00264\npush/push_train/push_train.tfrecord-00045-of-00264\npush/push_train/push_train.tfrecord-00046-of-00264\npush/push_train/push_train.tfrecord-00047-of-00264\npush/push_train/push_train.tfrecord-00048-of-00264\npush/push_train/push_train.tfrecord-00049-of-00264\npush/push_train/push_train.tfrecord-00050-of-00264\npush/push_train/push_train.tfrecord-00051-of-00264\npush/push_train/push_train.tfrecord-00052-of-00264\npush/push_train/push_train.tfrecord-00053-of-00264\npush/push_train/push_train.tfrecord-00054-of-00264\npush/push_train/push_train.tfrecord-00055-of-00264\npush/push_train/push_train.tfrecord-00056-of-00264\npush/push_train/push_train.tfrecord-00057-of-00264\npush/push_train/push_train.tfrecord-00058-of-00264\npush/push_train/push_train.tfrecord-00059-of-00264\npush/push_train/push_train.tfrecord-00060-of-00264\npush/push_train/push_train.tfrecord-00061-of-00264\npush/push_train/push_train.tfrecord-00062-of-00264\npush/push_train/push_train.tfrecord-00063-of-00264\npush/push_train/push_train.tfrecord-00064-of-00264\npush/push_trai
n/push_train.tfrecord-00065-of-00264\npush/push_train/push_train.tfrecord-00066-of-00264\npush/push_train/push_train.tfrecord-00067-of-00264\npush/push_train/push_train.tfrecord-00068-of-00264\npush/push_train/push_train.tfrecord-00069-of-00264\npush/push_train/push_train.tfrecord-00070-of-00264\npush/push_train/push_train.tfrecord-00071-of-00264\npush/push_train/push_train.tfrecord-00072-of-00264\npush/push_train/push_train.tfrecord-00073-of-00264\npush/push_train/push_train.tfrecord-00074-of-00264\npush/push_train/push_train.tfrecord-00075-of-00264\npush/push_train/push_train.tfrecord-00076-of-00264\npush/push_train/push_train.tfrecord-00077-of-00264\npush/push_train/push_train.tfrecord-00078-of-00264\npush/push_train/push_train.tfrecord-00079-of-00264\npush/push_train/push_train.tfrecord-00080-of-00264\npush/push_train/push_train.tfrecord-00081-of-00264\npush/push_train/push_train.tfrecord-00082-of-00264\npush/push_train/push_train.tfrecord-00083-of-00264\npush/push_train/push_train.tfrecord-00084-of-00264\npush/push_train/push_train.tfrecord-00085-of-00264\npush/push_train/push_train.tfrecord-00086-of-00264\npush/push_train/push_train.tfrecord-00087-of-00264\npush/push_train/push_train.tfrecord-00088-of-00264\npush/push_train/push_train.tfrecord-00089-of-00264\npush/push_train/push_train.tfrecord-00090-of-00264\npush/push_train/push_train.tfrecord-00091-of-00264\npush/push_train/push_train.tfrecord-00092-of-00264\npush/push_train/push_train.tfrecord-00093-of-00264\npush/push_train/push_train.tfrecord-00094-of-00264\npush/push_train/push_train.tfrecord-00095-of-00264\npush/push_train/push_train.tfrecord-00096-of-00264\npush/push_train/push_train.tfrecord-00097-of-00264\npush/push_train/push_train.tfrecord-00098-of-00264\npush/push_train/push_train.tfrecord-00099-of-00264\npush/push_train/push_train.tfrecord-00100-of-00264\npush/push_train/push_train.tfrecord-00101-of-00264\npush/push_train/push_train.tfrecord-00102-of-00264\npush/push_train/push_train.tfrecord-00
103-of-00264\npush/push_train/push_train.tfrecord-00104-of-00264\npush/push_train/push_train.tfrecord-00105-of-00264\npush/push_train/push_train.tfrecord-00106-of-00264\npush/push_train/push_train.tfrecord-00107-of-00264\npush/push_train/push_train.tfrecord-00108-of-00264\npush/push_train/push_train.tfrecord-00109-of-00264\npush/push_train/push_train.tfrecord-00110-of-00264\npush/push_train/push_train.tfrecord-00111-of-00264\npush/push_train/push_train.tfrecord-00112-of-00264\npush/push_train/push_train.tfrecord-00113-of-00264\npush/push_train/push_train.tfrecord-00114-of-00264\npush/push_train/push_train.tfrecord-00115-of-00264\npush/push_train/push_train.tfrecord-00116-of-00264\npush/push_train/push_train.tfrecord-00117-of-00264\npush/push_train/push_train.tfrecord-00118-of-00264\npush/push_train/push_train.tfrecord-00119-of-00264\npush/push_train/push_train.tfrecord-00120-of-00264\npush/push_train/push_train.tfrecord-00121-of-00264\npush/push_train/push_train.tfrecord-00122-of-00264\npush/push_train/push_train.tfrecord-00123-of-00264\npush/push_train/push_train.tfrecord-00124-of-00264\npush/push_train/push_train.tfrecord-00125-of-00264\npush/push_train/push_train.tfrecord-00126-of-00264\npush/push_train/push_train.tfrecord-00127-of-00264\npush/push_train/push_train.tfrecord-00128-of-00264\npush/push_train/push_train.tfrecord-00129-of-00264\npush/push_train/push_train.tfrecord-00130-of-00264\npush/push_train/push_train.tfrecord-00131-of-00264\npush/push_train/push_train.tfrecord-00132-of-00264\npush/push_train/push_train.tfrecord-00133-of-00264\npush/push_train/push_train.tfrecord-00134-of-00264\npush/push_train/push_train.tfrecord-00135-of-00264\npush/push_train/push_train.tfrecord-00136-of-00264\npush/push_train/push_train.tfrecord-00137-of-00264\npush/push_train/push_train.tfrecord-00138-of-00264\npush/push_train/push_train.tfrecord-00139-of-00264\npush/push_train/push_train.tfrecord-00140-of-00264\npush/push_train/push_train.tfrecord-00141-of-00264\npush/push_
train/push_train.tfrecord-00142-of-00264\npush/push_train/push_train.tfrecord-00143-of-00264\npush/push_train/push_train.tfrecord-00144-of-00264\npush/push_train/push_train.tfrecord-00145-of-00264\npush/push_train/push_train.tfrecord-00146-of-00264\npush/push_train/push_train.tfrecord-00147-of-00264\npush/push_train/push_train.tfrecord-00148-of-00264\npush/push_train/push_train.tfrecord-00149-of-00264\npush/push_train/push_train.tfrecord-00150-of-00264\npush/push_train/push_train.tfrecord-00151-of-00264\npush/push_train/push_train.tfrecord-00152-of-00264\npush/push_train/push_train.tfrecord-00153-of-00264\npush/push_train/push_train.tfrecord-00154-of-00264\npush/push_train/push_train.tfrecord-00155-of-00264\npush/push_train/push_train.tfrecord-00156-of-00264\npush/push_train/push_train.tfrecord-00157-of-00264\npush/push_train/push_train.tfrecord-00158-of-00264\npush/push_train/push_train.tfrecord-00159-of-00264\npush/push_train/push_train.tfrecord-00160-of-00264\npush/push_train/push_train.tfrecord-00161-of-00264\npush/push_train/push_train.tfrecord-00162-of-00264\npush/push_train/push_train.tfrecord-00163-of-00264\npush/push_train/push_train.tfrecord-00164-of-00264\npush/push_train/push_train.tfrecord-00165-of-00264\npush/push_train/push_train.tfrecord-00166-of-00264\npush/push_train/push_train.tfrecord-00167-of-00264\npush/push_train/push_train.tfrecord-00168-of-00264\npush/push_train/push_train.tfrecord-00169-of-00264\npush/push_train/push_train.tfrecord-00170-of-00264\npush/push_train/push_train.tfrecord-00171-of-00264\npush/push_train/push_train.tfrecord-00172-of-00264\npush/push_train/push_train.tfrecord-00173-of-00264\npush/push_train/push_train.tfrecord-00174-of-00264\npush/push_train/push_train.tfrecord-00175-of-00264\npush/push_train/push_train.tfrecord-00176-of-00264\npush/push_train/push_train.tfrecord-00177-of-00264\npush/push_train/push_train.tfrecord-00178-of-00264\npush/push_train/push_train.tfrecord-00179-of-00264\npush/push_train/push_train.tfrecor
d-00180-of-00264\npush/push_train/push_train.tfrecord-00181-of-00264\npush/push_train/push_train.tfrecord-00182-of-00264\npush/push_train/push_train.tfrecord-00183-of-00264\npush/push_train/push_train.tfrecord-00184-of-00264\npush/push_train/push_train.tfrecord-00185-of-00264\npush/push_train/push_train.tfrecord-00186-of-00264\npush/push_train/push_train.tfrecord-00187-of-00264\npush/push_train/push_train.tfrecord-00188-of-00264\npush/push_train/push_train.tfrecord-00189-of-00264\npush/push_train/push_train.tfrecord-00190-of-00264\npush/push_train/push_train.tfrecord-00191-of-00264\npush/push_train/push_train.tfrecord-00192-of-00264\npush/push_train/push_train.tfrecord-00193-of-00264\npush/push_train/push_train.tfrecord-00194-of-00264\npush/push_train/push_train.tfrecord-00195-of-00264\npush/push_train/push_train.tfrecord-00196-of-00264\npush/push_train/push_train.tfrecord-00197-of-00264\npush/push_train/push_train.tfrecord-00198-of-00264\npush/push_train/push_train.tfrecord-00199-of-00264\npush/push_train/push_train.tfrecord-00200-of-00264\npush/push_train/push_train.tfrecord-00201-of-00264\npush/push_train/push_train.tfrecord-00202-of-00264\npush/push_train/push_train.tfrecord-00203-of-00264\npush/push_train/push_train.tfrecord-00204-of-00264\npush/push_train/push_train.tfrecord-00205-of-00264\npush/push_train/push_train.tfrecord-00206-of-00264\npush/push_train/push_train.tfrecord-00207-of-00264\npush/push_train/push_train.tfrecord-00208-of-00264\npush/push_train/push_train.tfrecord-00209-of-00264\npush/push_train/push_train.tfrecord-00210-of-00264\npush/push_train/push_train.tfrecord-00211-of-00264\npush/push_train/push_train.tfrecord-00212-of-00264\npush/push_train/push_train.tfrecord-00213-of-00264\npush/push_train/push_train.tfrecord-00214-of-00264\npush/push_train/push_train.tfrecord-00215-of-00264\npush/push_train/push_train.tfrecord-00216-of-00264\npush/push_train/push_train.tfrecord-00217-of-00264\npush/push_train/push_train.tfrecord-00218-of-00264\npush/p
ush_train/push_train.tfrecord-00219-of-00264\npush/push_train/push_train.tfrecord-00220-of-00264\npush/push_train/push_train.tfrecord-00221-of-00264\npush/push_train/push_train.tfrecord-00222-of-00264\npush/push_train/push_train.tfrecord-00223-of-00264\npush/push_train/push_train.tfrecord-00224-of-00264\npush/push_train/push_train.tfrecord-00225-of-00264\npush/push_train/push_train.tfrecord-00226-of-00264\npush/push_train/push_train.tfrecord-00227-of-00264\npush/push_train/push_train.tfrecord-00228-of-00264\npush/push_train/push_train.tfrecord-00229-of-00264\npush/push_train/push_train.tfrecord-00230-of-00264\npush/push_train/push_train.tfrecord-00231-of-00264\npush/push_train/push_train.tfrecord-00232-of-00264\npush/push_train/push_train.tfrecord-00233-of-00264\npush/push_train/push_train.tfrecord-00234-of-00264\npush/push_train/push_train.tfrecord-00235-of-00264\npush/push_train/push_train.tfrecord-00236-of-00264\npush/push_train/push_train.tfrecord-00237-of-00264\npush/push_train/push_train.tfrecord-00238-of-00264\npush/push_train/push_train.tfrecord-00239-of-00264\npush/push_train/push_train.tfrecord-00240-of-00264\npush/push_train/push_train.tfrecord-00241-of-00264\npush/push_train/push_train.tfrecord-00242-of-00264\npush/push_train/push_train.tfrecord-00243-of-00264\npush/push_train/push_train.tfrecord-00244-of-00264\npush/push_train/push_train.tfrecord-00245-of-00264\npush/push_train/push_train.tfrecord-00246-of-00264\npush/push_train/push_train.tfrecord-00247-of-00264\npush/push_train/push_train.tfrecord-00248-of-00264\npush/push_train/push_train.tfrecord-00249-of-00264\npush/push_train/push_train.tfrecord-00250-of-00264\npush/push_train/push_train.tfrecord-00251-of-00264\npush/push_train/push_train.tfrecord-00252-of-00264\npush/push_train/push_train.tfrecord-00253-of-00264\npush/push_train/push_train.tfrecord-00254-of-00264\npush/push_train/push_train.tfrecord-00255-of-00264\npush/push_train/push_train.tfrecord-00256-of-00264\npush/push_train/push_train.tfr
ecord-00257-of-00264\npush/push_train/push_train.tfrecord-00258-of-00264\npush/push_train/push_train.tfrecord-00259-of-00264\npush/push_train/push_train.tfrecord-00260-of-00264\npush/push_train/push_train.tfrecord-00261-of-00264\npush/push_train/push_train.tfrecord-00262-of-00264\npush/push_train/push_train.tfrecord-00263-of-00264\n"
  },
  {
    "path": "models.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\nfrom torch import nn\n\nfrom transforms import *\nfrom torch.nn.init import normal, constant\n\n\nclass TwoStream(nn.Module):\n    def __init__(self, num_class, modality,\n                 base_model='BNInception', new_length=None,\n                 dropout=0.8,\n                 crop_num=1, partial_bn=True):\n        super(TwoStream, self).__init__()\n        self.modality = modality\n        self.reshape = True\n        self.dropout = dropout\n        self.crop_num = crop_num\n\n        if new_length is None:\n            self.new_length = 1 if modality == \"RGB\" else 5\n        else:\n            self.new_length = new_length\n\n        print((\"\"\"\nInitializing with base model: {}.\nTSN Configurations:\n    input_modality:     {}    \n    new_length:         {}    \n    dropout_ratio:      {}    \n        \"\"\".format(base_model, self.modality, self.new_length, self.dropout)))\n\n        self._prepare_base_model(base_model)\n\n        feature_dim = self._prepare_classifier(num_class)\n\n        if self.modality == 'Flow':\n            print(\"Converting the ImageNet model to a flow init model\")\n            self.base_model = self._construct_flow_model(self.base_model)\n            print(\"Done. 
Flow model ready...\")\n\n\n        self.softmax = nn.Softmax()\n\n        if partial_bn:\n            self.partialBN(True)\n\n    def _prepare_classifier(self, num_class):\n        feature_dim = getattr(self.base_model, self.base_model.last_layer_name).in_features\n\n        setattr(self.base_model, self.base_model.last_layer_name, nn.Dropout(p=self.dropout))\n        self.new_fc = nn.Linear(feature_dim, num_class)\n\n        std = 0.001\n        normal(self.new_fc.weight, 0, std)\n        constant(self.new_fc.bias, 0)\n\n        return feature_dim\n\n    def _prepare_base_model(self, base_model):\n\n        if 'resnet' in base_model or 'vgg' in base_model:\n            self.base_model = getattr(torchvision.models, base_model)(True)\n            self.base_model.last_layer_name = 'fc'\n            self.input_size = 224\n            self.input_mean = [0.485, 0.456, 0.406]\n            self.input_std = [0.229, 0.224, 0.225]\n\n            if self.modality == 'Flow':\n                self.input_mean = [0.5]\n                self.input_std = [np.mean(self.input_std)]\n            elif self.modality == 'RGBDiff':\n                self.input_mean = [0.485, 0.456, 0.406] + [0] * 3 * self.new_length\n                self.input_std = self.input_std + [np.mean(self.input_std) * 2] * 3 * self.new_length\n        elif base_model == 'BNInception':\n            import model_zoo\n            self.base_model = getattr(model_zoo, base_model)()\n            self.base_model.last_layer_name = 'fc'\n            self.input_size = 224\n            self.input_mean = [104, 117, 128]\n            self.input_std = [1]\n\n            if self.modality == 'Flow':\n                self.input_mean = [128]\n\n        elif base_model == 'InceptionV3':\n            import model_zoo\n            self.base_model = getattr(model_zoo, base_model)()\n            self.base_model.last_layer_name = 'top_cls_fc'\n            self.input_size = 299\n            self.input_mean = [104, 117, 128]\n            
self.input_std = [1]\n            if self.modality == 'Flow':\n                self.input_mean = [128]\n            elif self.modality == 'RGBDiff':\n                self.input_mean = self.input_mean * (1 + self.new_length)\n\n        elif 'inception' in base_model:\n            import model_zoo\n            self.base_model = getattr(model_zoo, base_model)()\n            self.base_model.last_layer_name = 'classif'\n            self.input_size = 299\n            self.input_mean = [0.5]\n            self.input_std = [0.5]\n        else:\n            raise ValueError('Unknown base model: {}'.format(base_model))\n\n    def train(self, mode=True):\n        \"\"\"\n        Override the default train() to freeze the BN parameters\n        :return:\n        \"\"\"\n        super(TwoStream, self).train(mode)\n        count = 0\n        if self._enable_pbn:\n            print(\"Freezing BatchNorm2D except the first one.\")\n            for m in self.base_model.modules():\n                if isinstance(m, nn.BatchNorm2d):\n                    count += 1\n                    if count >= (2 if self._enable_pbn else 1):\n                        m.eval()\n\n                        # shutdown update in frozen mode\n                        m.weight.requires_grad = False\n                        m.bias.requires_grad = False\n\n    def partialBN(self, enable):\n        self._enable_pbn = enable\n\n    def get_optim_policies(self):\n        first_conv_weight = []\n        first_conv_bias = []\n        normal_weight = []\n        normal_bias = []\n        bn = []\n\n        conv_cnt = 0\n        bn_cnt = 0\n        for m in self.modules():\n            if isinstance(m, torch.nn.Conv2d) or isinstance(m, torch.nn.Conv1d):\n                ps = list(m.parameters())\n                conv_cnt += 1\n                if conv_cnt == 1:\n                    first_conv_weight.append(ps[0])\n                    if len(ps) == 2:\n                        first_conv_bias.append(ps[1])\n               
 else:\n                    normal_weight.append(ps[0])\n                    if len(ps) == 2:\n                        normal_bias.append(ps[1])\n            elif isinstance(m, torch.nn.Linear):\n                ps = list(m.parameters())\n                normal_weight.append(ps[0])\n                if len(ps) == 2:\n                    normal_bias.append(ps[1])\n\n            elif isinstance(m, torch.nn.BatchNorm1d):\n                bn.extend(list(m.parameters()))\n            elif isinstance(m, torch.nn.BatchNorm2d):\n                bn_cnt += 1\n                # later BN's are frozen\n                if not self._enable_pbn or bn_cnt == 1:\n                    bn.extend(list(m.parameters()))\n            elif len(m._modules) == 0:\n                if len(list(m.parameters())) > 0:\n                    raise ValueError(\"New atomic module type: {}. Need to give it a learning policy\".format(type(m)))\n\n        return [\n            {'params': first_conv_weight, 'lr_mult': 5 if self.modality == 'Flow' else 1, 'decay_mult': 1,\n             'name': \"first_conv_weight\"},\n            {'params': first_conv_bias, 'lr_mult': 10 if self.modality == 'Flow' else 2, 'decay_mult': 0,\n             'name': \"first_conv_bias\"},\n            {'params': normal_weight, 'lr_mult': 1, 'decay_mult': 1,\n             'name': \"normal_weight\"},\n            {'params': normal_bias, 'lr_mult': 2, 'decay_mult': 0,\n             'name': \"normal_bias\"},\n            {'params': bn, 'lr_mult': 1, 'decay_mult': 0,\n             'name': \"BN scale/shift\"},\n        ]\n\n    def forward(self, input):\n        # input: bs* channelss * w *h   channelss=channels*frames\n        sample_len = (3 if self.modality == \"RGB\" else 2) * self.new_length   #channels\n\n        base_out = self.base_model(input.view((-1, sample_len) + input.size()[-2:]))\n        # input:(bs*frames)*channels*w*h\n        # output:(bs*frames)*dim\n\n        base_out = self.new_fc(base_out)\n\n        # if 
self.reshape:\n        #     base_out = base_out.view((-1, self.num_segments) + base_out.size()[1:])\n            #bs*frames*dim\n\n\n        return base_out    #bs*dim\n        # return output.squeeze(1)    #bs*dim\n\n    def _construct_flow_model(self, base_model):\n        # modify the convolution layers\n        # Torch models are usually defined in a hierarchical way.\n        # nn.modules.children() returns all submodules in a DFS manner\n        modules = list(self.base_model.modules())\n        first_conv_idx = list(filter(lambda x: isinstance(modules[x], nn.Conv2d), list(range(len(modules)))))[0]\n        conv_layer = modules[first_conv_idx]\n        container = modules[first_conv_idx - 1]\n\n        # modify parameters, assume the first blob contains the convolution kernels\n        params = [x.clone() for x in conv_layer.parameters()]\n        kernel_size = params[0].size()\n        new_kernel_size = kernel_size[:1] + (2 * self.new_length,) + kernel_size[2:]\n        new_kernels = params[0].data.mean(dim=1, keepdim=True).expand(new_kernel_size).contiguous()\n\n        new_conv = nn.Conv2d(2 * self.new_length, conv_layer.out_channels,\n                             conv_layer.kernel_size, conv_layer.stride, conv_layer.padding,\n                             bias=True if len(params) == 2 else False)\n        new_conv.weight.data = new_kernels\n        if len(params) == 2:\n            new_conv.bias.data = params[1].data  # add bias if necessary\n        layer_name = list(container.state_dict().keys())[0][:-7]  # remove .weight suffix to get the layer name\n\n        # replace the first convolution layer\n        setattr(container, layer_name, new_conv)\n        return base_model\n\n    @property\n    def crop_size(self):\n        return self.input_size\n\n    @property\n    def scale_size(self):\n        return self.input_size * 256 // 224\n\n    def get_augmentation(self):\n        if self.modality == 'RGB':\n            return 
torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75, .66]),\n                                                   GroupRandomHorizontalFlip(is_flow=False)])\n        elif self.modality == 'Flow':\n            return torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75]),\n                                                   GroupRandomHorizontalFlip(is_flow=True)])\n        elif self.modality == 'RGBDiff':\n            return torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75]),\n                                                   GroupRandomHorizontalFlip(is_flow=False)])\n\n\nclass TSN(nn.Module):\n    def __init__(self, num_class, num_segments, modality,\n                 base_model='BNInception', new_length=None,\n                 dropout=0.8,\n                 crop_num=1, partial_bn=True):\n        super(TSN, self).__init__()\n        self.modality = modality\n        self.num_class = num_class\n        self.num_segments = num_segments\n        self.reshape = True\n        self.dropout = dropout\n        self.crop_num = crop_num\n\n        if new_length is None:\n            self.new_length = 1 if modality == \"RGB\" else 5\n        else:\n            self.new_length = new_length\n\n        print((\"\"\"\nInitializing with base model: {}.\nTSN Configurations:\n    input_modality:     {}    \n    new_length:         {}    \n    dropout_ratio:      {}    \n        \"\"\".format(base_model, self.modality, self.new_length, self.dropout)))\n\n        self._prepare_base_model(base_model)\n\n        feature_dim = self._prepare_tsn(num_class)\n\n        if self.modality == 'Flow':\n            print(\"Converting the ImageNet model to a flow init model\")\n            self.base_model = self._construct_flow_model(self.base_model)\n            print(\"Done. 
Flow model ready...\")\n\n\n        self.softmax = nn.Softmax()\n\n        if partial_bn:\n            self.partialBN(True)\n\n    def _prepare_tsn(self, num_class):\n        feature_dim = getattr(self.base_model, self.base_model.last_layer_name).in_features\n\n        setattr(self.base_model, self.base_model.last_layer_name, nn.Dropout(p=self.dropout))\n        self.new_fc = nn.Linear(feature_dim, num_class)\n\n        std = 0.001\n        normal(self.new_fc.weight, 0, std)\n        constant(self.new_fc.bias, 0)\n\n        return feature_dim\n\n    def _prepare_base_model(self, base_model):\n\n        if 'resnet' in base_model or 'vgg' in base_model:\n            self.base_model = getattr(torchvision.models, base_model)(True)\n            self.base_model.last_layer_name = 'fc'\n            self.input_size = 224\n            self.input_mean = [0.485, 0.456, 0.406]\n            self.input_std = [0.229, 0.224, 0.225]\n\n            if self.modality == 'Flow':\n                self.input_mean = [0.5]\n                self.input_std = [np.mean(self.input_std)]\n            elif self.modality == 'RGBDiff':\n                self.input_mean = [0.485, 0.456, 0.406] + [0] * 3 * self.new_length\n                self.input_std = self.input_std + [np.mean(self.input_std) * 2] * 3 * self.new_length\n        elif base_model == 'BNInception':\n            import model_zoo\n            self.base_model = getattr(model_zoo, base_model)()\n            self.base_model.last_layer_name = 'fc'\n            self.input_size = 224\n            self.input_mean = [104, 117, 128]\n            self.input_std = [1]\n\n            if self.modality == 'Flow':\n                self.input_mean = [128]\n\n        elif base_model == 'InceptionV3':\n            import model_zoo\n            self.base_model = getattr(model_zoo, base_model)()\n            self.base_model.last_layer_name = 'top_cls_fc'\n            self.input_size = 299\n            self.input_mean = [104, 117, 128]\n            
self.input_std = [1]\n            if self.modality == 'Flow':\n                self.input_mean = [128]\n            elif self.modality == 'RGBDiff':\n                self.input_mean = self.input_mean * (1 + self.new_length)\n\n        elif 'inception' in base_model:\n            import model_zoo\n            self.base_model = getattr(model_zoo, base_model)()\n            self.base_model.last_layer_name = 'classif'\n            self.input_size = 299\n            self.input_mean = [0.5]\n            self.input_std = [0.5]\n        else:\n            raise ValueError('Unknown base model: {}'.format(base_model))\n\n    def train(self, mode=True):\n        \"\"\"\n        Override the default train() to freeze the BN parameters\n        :return:\n        \"\"\"\n        super(TSN, self).train(mode)\n        count = 0\n        if self._enable_pbn:\n            print(\"Freezing BatchNorm2D except the first one.\")\n            for m in self.base_model.modules():\n                if isinstance(m, nn.BatchNorm2d):\n                    count += 1\n                    if count >= (2 if self._enable_pbn else 1):\n                        m.eval()\n\n                        # shutdown update in frozen mode\n                        m.weight.requires_grad = False\n                        m.bias.requires_grad = False\n\n    def partialBN(self, enable):\n        self._enable_pbn = enable\n\n    def get_optim_policies(self):\n        first_conv_weight = []\n        first_conv_bias = []\n        normal_weight = []\n        normal_bias = []\n        bn = []\n\n        conv_cnt = 0\n        bn_cnt = 0\n        for m in self.modules():\n            if isinstance(m, torch.nn.Conv2d) or isinstance(m, torch.nn.Conv1d):\n                ps = list(m.parameters())\n                conv_cnt += 1\n                if conv_cnt == 1:\n                    first_conv_weight.append(ps[0])\n                    if len(ps) == 2:\n                        first_conv_bias.append(ps[1])\n                
else:\n                    normal_weight.append(ps[0])\n                    if len(ps) == 2:\n                        normal_bias.append(ps[1])\n            elif isinstance(m, torch.nn.Linear):\n                ps = list(m.parameters())\n                normal_weight.append(ps[0])\n                if len(ps) == 2:\n                    normal_bias.append(ps[1])\n\n            elif isinstance(m, torch.nn.BatchNorm1d):\n                bn.extend(list(m.parameters()))\n            elif isinstance(m, torch.nn.BatchNorm2d):\n                bn_cnt += 1\n                # later BN's are frozen\n                if not self._enable_pbn or bn_cnt == 1:\n                    bn.extend(list(m.parameters()))\n            elif len(m._modules) == 0:\n                if len(list(m.parameters())) > 0:\n                    raise ValueError(\"New atomic module type: {}. Need to give it a learning policy\".format(type(m)))\n\n        return [\n            {'params': first_conv_weight, 'lr_mult': 5 if self.modality == 'Flow' else 1, 'decay_mult': 1,\n             'name': \"first_conv_weight\"},\n            {'params': first_conv_bias, 'lr_mult': 10 if self.modality == 'Flow' else 2, 'decay_mult': 0,\n             'name': \"first_conv_bias\"},\n            {'params': normal_weight, 'lr_mult': 1, 'decay_mult': 1,\n             'name': \"normal_weight\"},\n            {'params': normal_bias, 'lr_mult': 2, 'decay_mult': 0,\n             'name': \"normal_bias\"},\n            {'params': bn, 'lr_mult': 1, 'decay_mult': 0,\n             'name': \"BN scale/shift\"},\n        ]\n\n    def forward(self, input):\n        # input: bs* channelss * w *h   channelss=channels*frames\n        bs = input.size()[0]\n        sample_len = (3 if self.modality == \"RGB\" else 2) * self.new_length   #channels\n\n        base_out = self.base_model(input.view((-1, sample_len) + input.size()[-2:]))\n        # input:(bs*frames)*channels*w*h\n        # output:(bs*frames)*dim\n\n        base_out = 
self.new_fc(base_out)\n\n        # avg consensus function\n        base_out = base_out.view((bs, self.num_segments, self.num_class))\n        base_out = torch.mean(base_out, 1)\n\n\n        return base_out    #bs*dim\n\n\n    def _construct_flow_model(self, base_model):\n        # modify the convolution layers\n        # Torch models are usually defined in a hierarchical way.\n        # nn.Module.modules() returns all submodules in DFS order\n        modules = list(self.base_model.modules())\n        first_conv_idx = list(filter(lambda x: isinstance(modules[x], nn.Conv2d), list(range(len(modules)))))[0]\n        conv_layer = modules[first_conv_idx]\n        container = modules[first_conv_idx - 1]\n\n        # modify parameters, assume the first blob contains the convolution kernels\n        params = [x.clone() for x in conv_layer.parameters()]\n        kernel_size = params[0].size()\n        new_kernel_size = kernel_size[:1] + (2 * self.new_length,) + kernel_size[2:]\n        new_kernels = params[0].data.mean(dim=1, keepdim=True).expand(new_kernel_size).contiguous()\n\n        new_conv = nn.Conv2d(2 * self.new_length, conv_layer.out_channels,\n                             conv_layer.kernel_size, conv_layer.stride, conv_layer.padding,\n                             bias=True if len(params) == 2 else False)\n        new_conv.weight.data = new_kernels\n        if len(params) == 2:\n            new_conv.bias.data = params[1].data  # add bias if necessary\n        layer_name = list(container.state_dict().keys())[0][:-7]  # remove .weight suffix to get the layer name\n\n        # replace the first convolution layer\n        setattr(container, layer_name, new_conv)\n        return base_model\n\n    @property\n    def crop_size(self):\n        return self.input_size\n\n    @property\n    def scale_size(self):\n        return self.input_size * 256 // 224\n\n    def get_augmentation(self):\n        if self.modality == 'RGB':\n            return 
torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75, .66]),\n                                                   GroupRandomHorizontalFlip(is_flow=False)])\n        elif self.modality == 'Flow':\n            return torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75]),\n                                                   GroupRandomHorizontalFlip(is_flow=True)])\n        elif self.modality == 'RGBDiff':\n            return torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75]),\n                                                   GroupRandomHorizontalFlip(is_flow=False)])\n\n\n\nclass C3D(nn.Module):\n\n    def __init__(self):\n        super(C3D, self).__init__()\n\n        self.modality = 'RGB'\n        self.input_size = 112\n        self.input_mean = [104, 117, 128]\n        self.input_std = [1]\n\n        self.conv1 = nn.Conv3d(3, 64, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.pool1 = nn.MaxPool3d(kernel_size=(1, 2, 2), stride=(1, 2, 2))\n\n        self.conv2 = nn.Conv3d(64, 128, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.pool2 = nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2))\n\n        self.conv3a = nn.Conv3d(128, 256, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.conv3b = nn.Conv3d(256, 256, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.pool3 = nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2))\n\n        self.conv4a = nn.Conv3d(256, 512, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.conv4b = nn.Conv3d(512, 512, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.pool4 = nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2))\n\n        self.conv5a = nn.Conv3d(512, 512, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.conv5b = nn.Conv3d(512, 512, kernel_size=(3, 3, 3), padding=(1, 1, 1))\n        self.pool5 = nn.MaxPool3d(kernel_size=(2, 2, 2), stride=(2, 2, 2), padding=(0, 1, 1))\n\n        
self.fc6 = nn.Linear(8192, 4096)\n        self.fc7 = nn.Linear(4096, 4096)\n        self.fc8_new = nn.Linear(4096, 174)\n\n        self.dropout = nn.Dropout(p=0.5)\n\n        self.relu = nn.ReLU()\n\n        self.softmax = nn.Softmax()\n    def forward(self, x):\n\n        bs = x.size(0)\n        # print(x.size())\n        x = x.view(-1, 3, 16, 112, 112)\n\n        h = self.relu(self.conv1(x))\n        h = self.pool1(h)\n\n        h = self.relu(self.conv2(h))\n        h = self.pool2(h)\n\n        h = self.relu(self.conv3a(h))\n        h = self.relu(self.conv3b(h))\n        h = self.pool3(h)\n\n        h = self.relu(self.conv4a(h))\n        h = self.relu(self.conv4b(h))\n        h = self.pool4(h)\n\n        h = self.relu(self.conv5a(h))\n        h = self.relu(self.conv5b(h))\n        h = self.pool5(h)\n\n        h = h.view(-1, 8192)\n        h = self.relu(self.fc6(h))\n        h = self.dropout(h)\n        h = self.relu(self.fc7(h))\n        h = self.dropout(h)\n\n        logits = self.fc8_new(h)\n\n        return logits\n\n    def partialBN(self, enable):\n        pass\n\n    @property\n    def crop_size(self):\n        return self.input_size\n\n    @property\n    def scale_size(self):\n        return self.input_size * 128 // 112\n\n    def get_augmentation(self):\n        if self.modality == 'RGB':\n            return torchvision.transforms.Compose([GroupMultiScaleCrop(self.input_size, [1, .875, .75, .66]),\n                                                   GroupRandomHorizontalFlip(is_flow=False)])\n"
  },
  {
    "path": "optical_flow/gpu_main.cpp",
    "content": "#include<sys/types.h>\n#include<sys/stat.h>\n#include <iostream>\n#include <fstream>\n#include<dirent.h>\n\n#include \"opencv2/video/tracking.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n#include\"opencv2/gpu/gpu.hpp\"\n\nusing namespace cv;\nusing namespace cv::gpu;\nusing namespace std;\n\n\nbool clipFlow = true;\n\ninline bool isFlowCorrect(Point2f u)\n{\n    return !cvIsNaN(u.x) && !cvIsNaN(u.y) && fabs(u.x) < 1e9 && fabs(u.y) < 1e9;\n}\n\nstatic Vec3b computeColor(float fx, float fy)\n{\n    static bool first = true;\n\n    // relative lengths of color transitions:\n    // these are chosen based on perceptual similarity\n    // (e.g. one can distinguish more shades between red and yellow\n    //  than between yellow and green)\n    const int RY = 15;\n    const int YG = 6;\n    const int GC = 4;\n    const int CB = 11;\n    const int BM = 13;\n    const int MR = 6;\n    const int NCOLS = RY + YG + GC + CB + BM + MR;\n    static Vec3i colorWheel[NCOLS];\n\n    if (first)\n    {\n        int k = 0;\n\n        for (int i = 0; i < RY; ++i, ++k)\n            colorWheel[k] = Vec3i(255, 255 * i / RY, 0);\n\n        for (int i = 0; i < YG; ++i, ++k)\n            colorWheel[k] = Vec3i(255 - 255 * i / YG, 255, 0);\n\n        for (int i = 0; i < GC; ++i, ++k)\n            colorWheel[k] = Vec3i(0, 255, 255 * i / GC);\n\n        for (int i = 0; i < CB; ++i, ++k)\n            colorWheel[k] = Vec3i(0, 255 - 255 * i / CB, 255);\n\n        for (int i = 0; i < BM; ++i, ++k)\n            colorWheel[k] = Vec3i(255 * i / BM, 0, 255);\n\n        for (int i = 0; i < MR; ++i, ++k)\n            colorWheel[k] = Vec3i(255, 0, 255 - 255 * i / MR);\n\n        first = false;\n    }\n\n    const float rad = sqrt(fx * fx + fy * fy);\n    const float a = atan2(-fy, -fx) / (float)CV_PI;\n\n    const float fk = (a + 1.0f) / 2.0f * (NCOLS - 1);\n    const int k0 = static_cast<int>(fk);\n    const int k1 = (k0 + 1) % NCOLS;\n    const float f = fk - k0;\n\n    Vec3b pix;\n\n   
 for (int b = 0; b < 3; b++)\n    {\n        const float col0 = colorWheel[k0][b] / 255.f;\n        const float col1 = colorWheel[k1][b] / 255.f;\n\n        float col = (1 - f) * col0 + f * col1;\n\n        if (rad <= 1)\n            col = 1 - rad * (1 - col); // increase saturation with radius\n        else\n            col *= .75; // out of range\n\n        pix[2 - b] = static_cast<uchar>(255.f * col);\n    }\n\n    return pix;\n}\n\nstatic void drawOpticalFlow(const Mat_<Point2f>& flow, Mat& dst, float maxmotion = -1)\n{\n    dst.create(flow.size(), CV_8UC3);\n    dst.setTo(Scalar::all(0));\n\n    // determine motion range:\n    float maxrad = maxmotion;\n\n    if (maxmotion <= 0)\n    {\n        maxrad = 1;\n        for (int y = 0; y < flow.rows; ++y)\n        {\n            for (int x = 0; x < flow.cols; ++x)\n            {\n                Point2f u = flow(y, x);\n\n                if (!isFlowCorrect(u))\n                    continue;\n\n                maxrad = max(maxrad, sqrt(u.x * u.x + u.y * u.y));\n            }\n        }\n    }\n\n    for (int y = 0; y < flow.rows; ++y)\n    {\n        for (int x = 0; x < flow.cols; ++x)\n        {\n            Point2f u = flow(y, x);\n\n            if (isFlowCorrect(u))\n                dst.at<Vec3b>(y, x) = computeColor(u.x / maxrad, u.y / maxrad);\n        }\n    }\n}\n\n// binary file format for flow data specified here:\n// http://vision.middlebury.edu/flow/data/\nstatic void writeOpticalFlowToFile(const Mat_<Point2f>& flow, const string& fileName)\n{\n    static const char FLO_TAG_STRING[] = \"PIEH\";\n\n    ofstream file(fileName.c_str(), ios_base::binary);\n\n    file << FLO_TAG_STRING;\n\n    file.write((const char*) &flow.cols, sizeof(int));\n    file.write((const char*) &flow.rows, sizeof(int));\n\n    for (int i = 0; i < flow.rows; ++i)\n    {\n        for (int j = 0; j < flow.cols; ++j)\n        {\n            const Point2f u = flow(i, j);\n\n            file.write((const char*) &u.x, sizeof(float));\n    
        file.write((const char*) &u.y, sizeof(float));\n        }\n    }\n}\n\nstatic void convertFlowToImage(const Mat &flow_x, Mat &img_x, double lowerBound, double higherBound)\n{\n    #define CAST(v, L, H) ((v) > (H) ? 255 : (v) < (L) ? 0 : cvRound(255*((v) - (L))/((H)-(L))))\n    for (int i = 0; i < flow_x.rows; ++i) {\n        for (int j = 0; j < flow_x.cols; ++j) {\n            float x = flow_x.at<float>(i,j);\n            img_x.at<uchar>(i,j) = CAST(x, lowerBound, higherBound);\n        }\n    }\n    #undef CAST\n}\n\n\n\nint main(int argc, const char* argv[])\n{\n  //  if (argc < 3)\n  //  {\n  //      cerr << \"Usage : \" << argv[0] << \"<frame0> <frame1> [<output_flow>]\" << endl;\n  //      return -1;\n  //  }\n\n    Mat image,prev_image,flow_x,flow_y;\n\n    GpuMat frame_0, frame_1, flow_u, flow_v;\n\n    setDevice(0);\n    OpticalFlowDual_TVL1_GPU alg_tvl1;\n\n    string outputPath=\"/home/mcg/cxk/dataset/somthing-something/something-optical-flow/\";\n    DIR* dir=opendir(\"/home/mcg/cxk/dataset/somthing-something/something-rgb\");\n    dirent* p=NULL;\n    while((p=readdir(dir))!=NULL)\n    {\n        if(p->d_name[0]!='.')\n        {\n            string name=\"/home/mcg/cxk/dataset/somthing-something/something-rgb/\"+string(p->d_name);\n            cout<<name<<endl;\n\n            int status;\n            string newPath=outputPath+string(p->d_name);\n            status=mkdir(newPath.c_str(), S_IRWXU | S_IRWXG | S_IROTH | S_IXOTH);\n            if(status!=0)\n            {\n                cout<<\"error in mkdir!\"<<endl;\n                cout<<status<<endl;\n                cout<<newPath<<endl;\n                return -1;\n            }\n\n\n            int count=0;\n            DIR* dirChild=opendir(name.c_str());\n            dirent* pChild=NULL;\n            while((pChild=readdir(dirChild))!=NULL)\n            {\n                if(pChild->d_name[0]!='.')\n                {\n                    string nameChild=name+'/'+string(pChild->d_name);\n   
                 //cout<<nameChild<<endl;\n                    ++count;\n                }\n            }\n            closedir(dirChild);\n            \n            for(int i=1;i<=count;i+=2)\n            {\n                char tmp[20];\n                sprintf(tmp,\"%05d.jpg\",int(i));\n                string imgName=name+'/'+tmp;\n                if(i==1)\n                {\n                    prev_image = imread(imgName, IMREAD_GRAYSCALE);\n                    if(prev_image.empty())\n                    {\n                        cerr << \"Can't open image [\"  << imgName << \"]\" << endl;\n                        return -1;\n                    }\n\n                }\n                image=imread(imgName, IMREAD_GRAYSCALE);\n                if(image.empty())\n                {\n                    cerr << \"Can't open image [\"  << imgName << \"]\" << endl;\n                    return -1;\n                }\n                if (image.size() != prev_image.size())\n                {\n                    cerr << \"Images should be of equal sizes\" << endl;\n                    cerr<<image.size()<<prev_image.size()<<endl;\n                    return -1;\n                }\n\n                frame_0.upload(prev_image);\n                frame_1.upload(image);\n\n                alg_tvl1(frame_0,frame_1,flow_u,flow_v);\n\n                flow_u.download(flow_x);\n                flow_v.download(flow_y);\n\n                Mat imgX(flow_x.size(),CV_8UC1);\n                Mat imgY(flow_y.size(),CV_8UC1);\n\n\n                double min_x,max_x;\n                double min_y,max_y;\n                minMaxLoc(flow_x,&min_x,&max_x);\n                minMaxLoc(flow_y,&min_y,&max_y);\n\n                if(clipFlow)\n                {\n                    min_x=-20; max_x=20;\n                    min_y=-20; max_y=20;\n                }\n                convertFlowToImage(flow_x,imgX,min_x,max_x);\n                convertFlowToImage(flow_y,imgY,min_y,max_y);\n\n            
    imwrite(outputPath+string(p->d_name)+'/'+\"x_\"+tmp,imgX);\n                imwrite(outputPath+string(p->d_name)+'/'+\"y_\"+tmp,imgY);\n\n                std::swap(prev_image,image);\n                           \n            }\n\n        }\n    }\n    closedir(dir);\n\n//    Mat out;\n//    drawOpticalFlow(flow, out);\n//\n//    if (argc == 4)\n//        writeOpticalFlowToFile(flow, argv[3]);\n//\n//    imwrite(\"Flow.jpg\", out);\n//    cout<<\"ok\"<<endl;\n//\n//    waitKey();\n\n    return 0;\n}\n"
  },
  {
    "path": "optical_flow/gpu_makefile",
    "content": "INCLUDE = $(shell pkg-config --cflags opencv)\nLIBS = $(shell pkg-config --libs opencv)\nOBJECTS = gpu_main.o\nSOURCE = gpu_main.cpp\nBIN = gpu_bin\n$(OBJECTS) : $(SOURCE)\n\tg++ -c $(SOURCE) $(INCLUDE)\n$(BIN):$(OBJECTS)\n\tg++ -o $(BIN) $(OBJECTS) $(LIBS)\n\n\nclean:\n\trm $(OBJECTS) $(BIN)\n"
  },
  {
    "path": "optical_flow/main.cpp",
    "content": "#include<sys/types.h>\n#include<sys/stat.h>\n#include <iostream>\n#include <fstream>\n#include<dirent.h>\n\n#include \"opencv2/video/tracking.hpp\"\n#include \"opencv2/highgui/highgui.hpp\"\n\nusing namespace cv;\nusing namespace std;\n\n\nbool clipFlow = true;\n\ninline bool isFlowCorrect(Point2f u)\n{\n    return !cvIsNaN(u.x) && !cvIsNaN(u.y) && fabs(u.x) < 1e9 && fabs(u.y) < 1e9;\n}\n\nstatic Vec3b computeColor(float fx, float fy)\n{\n    static bool first = true;\n\n    // relative lengths of color transitions:\n    // these are chosen based on perceptual similarity\n    // (e.g. one can distinguish more shades between red and yellow\n    //  than between yellow and green)\n    const int RY = 15;\n    const int YG = 6;\n    const int GC = 4;\n    const int CB = 11;\n    const int BM = 13;\n    const int MR = 6;\n    const int NCOLS = RY + YG + GC + CB + BM + MR;\n    static Vec3i colorWheel[NCOLS];\n\n    if (first)\n    {\n        int k = 0;\n\n        for (int i = 0; i < RY; ++i, ++k)\n            colorWheel[k] = Vec3i(255, 255 * i / RY, 0);\n\n        for (int i = 0; i < YG; ++i, ++k)\n            colorWheel[k] = Vec3i(255 - 255 * i / YG, 255, 0);\n\n        for (int i = 0; i < GC; ++i, ++k)\n            colorWheel[k] = Vec3i(0, 255, 255 * i / GC);\n\n        for (int i = 0; i < CB; ++i, ++k)\n            colorWheel[k] = Vec3i(0, 255 - 255 * i / CB, 255);\n\n        for (int i = 0; i < BM; ++i, ++k)\n            colorWheel[k] = Vec3i(255 * i / BM, 0, 255);\n\n        for (int i = 0; i < MR; ++i, ++k)\n            colorWheel[k] = Vec3i(255, 0, 255 - 255 * i / MR);\n\n        first = false;\n    }\n\n    const float rad = sqrt(fx * fx + fy * fy);\n    const float a = atan2(-fy, -fx) / (float)CV_PI;\n\n    const float fk = (a + 1.0f) / 2.0f * (NCOLS - 1);\n    const int k0 = static_cast<int>(fk);\n    const int k1 = (k0 + 1) % NCOLS;\n    const float f = fk - k0;\n\n    Vec3b pix;\n\n    for (int b = 0; b < 3; b++)\n    {\n        const float 
col0 = colorWheel[k0][b] / 255.f;\n        const float col1 = colorWheel[k1][b] / 255.f;\n\n        float col = (1 - f) * col0 + f * col1;\n\n        if (rad <= 1)\n            col = 1 - rad * (1 - col); // increase saturation with radius\n        else\n            col *= .75; // out of range\n\n        pix[2 - b] = static_cast<uchar>(255.f * col);\n    }\n\n    return pix;\n}\n\nstatic void drawOpticalFlow(const Mat_<Point2f>& flow, Mat& dst, float maxmotion = -1)\n{\n    dst.create(flow.size(), CV_8UC3);\n    dst.setTo(Scalar::all(0));\n\n    // determine motion range:\n    float maxrad = maxmotion;\n\n    if (maxmotion <= 0)\n    {\n        maxrad = 1;\n        for (int y = 0; y < flow.rows; ++y)\n        {\n            for (int x = 0; x < flow.cols; ++x)\n            {\n                Point2f u = flow(y, x);\n\n                if (!isFlowCorrect(u))\n                    continue;\n\n                maxrad = max(maxrad, sqrt(u.x * u.x + u.y * u.y));\n            }\n        }\n    }\n\n    for (int y = 0; y < flow.rows; ++y)\n    {\n        for (int x = 0; x < flow.cols; ++x)\n        {\n            Point2f u = flow(y, x);\n\n            if (isFlowCorrect(u))\n                dst.at<Vec3b>(y, x) = computeColor(u.x / maxrad, u.y / maxrad);\n        }\n    }\n}\n\n// binary file format for flow data specified here:\n// http://vision.middlebury.edu/flow/data/\nstatic void writeOpticalFlowToFile(const Mat_<Point2f>& flow, const string& fileName)\n{\n    static const char FLO_TAG_STRING[] = \"PIEH\";\n\n    ofstream file(fileName.c_str(), ios_base::binary);\n\n    file << FLO_TAG_STRING;\n\n    file.write((const char*) &flow.cols, sizeof(int));\n    file.write((const char*) &flow.rows, sizeof(int));\n\n    for (int i = 0; i < flow.rows; ++i)\n    {\n        for (int j = 0; j < flow.cols; ++j)\n        {\n            const Point2f u = flow(i, j);\n\n            file.write((const char*) &u.x, sizeof(float));\n            file.write((const char*) &u.y, sizeof(float));\n 
       }\n    }\n}\n\nstatic void convertFlowToImage(const Mat &flow_x, Mat &img_x, double lowerBound, double higherBound)\n{\n    #define CAST(v, L, H) ((v) > (H) ? 255 : (v) < (L) ? 0 : cvRound(255*((v) - (L))/((H)-(L))))\n    for (int i = 0; i < flow_x.rows; ++i) {\n        for (int j = 0; j < flow_x.cols; ++j) {\n            float x = flow_x.at<float>(i,j);\n            img_x.at<uchar>(i,j) = CAST(x, lowerBound, higherBound);\n        }\n    }\n    #undef CAST\n}\n\n\n\nint main(int argc, const char* argv[])\n{\n  //  if (argc < 3)\n  //  {\n  //      cerr << \"Usage : \" << argv[0] << \"<frame0> <frame1> [<output_flow>]\" << endl;\n  //      return -1;\n  //  }\n\n    Mat image,prev_image;\n\n    string outputPath=\"/home/mcg/cxk/dataset/somthing-something/something-optical-flow/\";\n    DIR* dir=opendir(\"/home/mcg/cxk/dataset/somthing-something/20bn-something-something-v1\");\n    dirent* p=NULL;\n    while((p=readdir(dir))!=NULL)\n    {\n        if(p->d_name[0]!='.')\n        {\n            string name=\"/home/mcg/cxk/dataset/somthing-something/20bn-something-something-v1/\"+string(p->d_name);\n            cout<<name<<endl;\n\n            int status;\n            string newPath=outputPath+string(p->d_name);\n            status=mkdir(newPath.c_str(), S_IRWXU | S_IRWXG | S_IROTH | S_IXOTH);\n            if(status!=0)\n            {\n                cout<<\"error in mkdir!\"<<endl;\n                cout<<status<<endl;\n                cout<<newPath<<endl;\n                return -1;\n            }\n\n\n            int count=0;\n            DIR* dirChild=opendir(name.c_str());\n            dirent* pChild=NULL;\n            while((pChild=readdir(dirChild))!=NULL)\n            {\n                if(pChild->d_name[0]!='.')\n                {\n                    string nameChild=name+'/'+string(pChild->d_name);\n                    //cout<<nameChild<<endl;\n                    ++count;\n                }\n            }\n            closedir(dirChild);\n            
\n            for(int i=1;i<=count;i+=2)\n            {\n                char tmp[20];\n                sprintf(tmp,\"%05d.jpg\",int(i));\n                string imgName=name+'/'+tmp;\n                if(i==1)\n                {\n                    prev_image = imread(imgName, IMREAD_GRAYSCALE);\n                    if(prev_image.empty())\n                    {\n                        cerr << \"Can't open image [\"  << imgName << \"]\" << endl;\n                        return -1;\n                    }\n\n                }\n                image=imread(imgName, IMREAD_GRAYSCALE);\n                if(image.empty())\n                {\n                    cerr << \"Can't open image [\"  << imgName << \"]\" << endl;\n                    return -1;\n                }\n                if (image.size() != prev_image.size())\n                {\n                    cerr << \"Images should be of equal sizes\" << endl;\n                    cerr<<image.size()<<prev_image.size()<<endl;\n                    return -1;\n                }\n\n    \n                Mat_<Point2f> flow;\n                Ptr<DenseOpticalFlow> tvl1 = createOptFlow_DualTVL1();\n\n                const double start = (double)getTickCount();\n                tvl1->calc(prev_image, image, flow);\n                const double timeSec = (getTickCount() - start) / getTickFrequency();\n                cout << \"calcOpticalFlowDual_TVL1 : \" << timeSec << \" sec\" << endl;\n\n\n                // must be 32F: accessed below via ptr<float>/at<float>\n                Mat x(flow.rows,flow.cols,CV_32FC1);\n                Mat y(flow.rows,flow.cols,CV_32FC1);\n\n                Mat imgX(flow.rows,flow.cols,CV_8UC1);\n                Mat imgY(flow.rows,flow.cols,CV_8UC1);\n\n\n                float* px=x.ptr<float>(0);\n                float* py=y.ptr<float>(0);\n\n                for (int i = 0; i < flow.rows; ++i)\n                {\n                    px=x.ptr<float>(i);\n                    py=y.ptr<float>(i);\n                    for (int j = 0; j < flow.cols; ++j)\n               
     {\n                        const Point2f u = flow(i, j);\n                        px[j]=u.x;\n                        py[j]=u.y;\n                    }\n                }\n\n                double min_x,max_x;\n                double min_y,max_y;\n                minMaxLoc(x,&min_x,&max_x);\n                minMaxLoc(y,&min_y,&max_y);\n\n                if(clipFlow)\n                {\n                    min_x=-20; max_x=20;\n                    min_y=-20; max_y=20;\n                }\n                convertFlowToImage(x,imgX,min_x,max_x);\n                convertFlowToImage(y,imgY,min_y,max_y);\n\n                imwrite(outputPath+string(p->d_name)+'/'+\"x_\"+tmp,imgX);\n                imwrite(outputPath+string(p->d_name)+'/'+\"y_\"+tmp,imgY);\n\n                std::swap(prev_image,image);\n                           \n            }\n\n        }\n    }\n    closedir(dir);\n\n//    Mat out;\n//    drawOpticalFlow(flow, out);\n//\n//    if (argc == 4)\n//        writeOpticalFlowToFile(flow, argv[3]);\n//\n//    imwrite(\"Flow.jpg\", out);\n//    cout<<\"ok\"<<endl;\n//\n//    waitKey();\n\n    return 0;\n}\n"
  },
  {
    "path": "optical_flow/makefile",
    "content": "INCLUDE = $(shell pkg-config --cflags opencv)\nLIBS = $(shell pkg-config --libs opencv)\nOBJECTS = main.o\nSOURCE = main.cpp\nBIN = bin\n$(OBJECTS) : $(SOURCE)\n\tg++ -c $(SOURCE) $(INCLUDE)\n$(BIN):$(OBJECTS)\n\tg++ -o $(BIN) $(OBJECTS) $(LIBS)\n\n\nclean:\n\trm $(OBJECTS) $(BIN)\n"
  },
  {
    "path": "opts.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\nimport argparse\nparser = argparse.ArgumentParser(description=\"PyTorch implementation of TwoStream\")\nparser.add_argument('model', type=str, choices=['TwoStream', 'TSN', 'C3D'])\nparser.add_argument('modality', type=str, choices=['RGB', 'Flow'])\nparser.add_argument('train_id', type=str)\nparser.add_argument('--num_segments', type=int, default=3)\nparser.add_argument('--train_list', type=str, default=\"\")\nparser.add_argument('--val_list', type=str, default=\"\")\nparser.add_argument('--root_path', type=str, default=\"\")\nparser.add_argument('--store_name', type=str, default=\"\")\n# ========================= Model Configs ==========================\nparser.add_argument('--arch', type=str, default=\"BNInception\")\nparser.add_argument('--dropout', '--do', default=0.8, type=float, metavar='DO', help='dropout ratio (default: 0.8)')\n# ========================= Learning Configs ==========================\nparser.add_argument('--epochs', default=120, type=int, metavar='N',\n                    help='number of total epochs to run')\nparser.add_argument('-b', '--batch_size', default=16, type=int,\n                    metavar='N', help='mini-batch size (default: 16)')\nparser.add_argument('--lr', '--learning-rate', default=0.005, type=float,\n                    metavar='LR', help='initial learning rate')\nparser.add_argument('--factor', default=0.1, type=float,\n                    help='learning rate decay factor (default: 0.1)')\nparser.add_argument('--lr_steps', default=[30, 55], type=float, nargs=\"+\",\n                    metavar='LRSteps', help='epochs to decay learning rate by factor')\nparser.add_argument('--momentum', default=0.9, type=float, metavar='M',\n                    help='momentum')\nparser.add_argument('--weight-decay', '--wd', default=5e-4, type=float,\n                    metavar='W', help='weight decay (default: 5e-4)')\nparser.add_argument('--clip-gradient', '--gd', default=20, type=float,\n                    metavar='W', help='gradient norm clipping (default: 20)')\nparser.add_argument('--no_partialbn', '--npb', default=False, action=\"store_true\")\n# ========================= Monitor Configs ==========================\nparser.add_argument('--print-freq', '-p', default=5, type=int,\n                    metavar='N', help='print frequency (default: 5)')\nparser.add_argument('--eval-freq', '-ef', default=1, type=int,\n                    metavar='N', help='evaluation frequency (default: 1)')\n# ========================= Runtime Configs ==========================\nparser.add_argument('-j', '--workers', default=30, type=int, metavar='N',\n                    help='number of data loading workers (default: 30)')\nparser.add_argument('--resume', default='', type=str, metavar='PATH',\n                    help='path to latest checkpoint (default: none)')\nparser.add_argument('-e', '--evaluate', dest='evaluate', action='store_true',\n                    help='evaluate model on validation set')\nparser.add_argument('--snapshot_pref', type=str, default=\"\")\nparser.add_argument('--start_epoch', default=0, type=int, metavar='N',\n                    help='manual epoch number (useful on restarts)')\nparser.add_argument('--gpus', nargs='+', type=int, default=None)\nparser.add_argument('--root_log',type=str, default='log')\nparser.add_argument('--root_model', type=str, default='model')\nparser.add_argument('--root_output',type=str, default='output')\n"
  },
  {
    "path": "process_dataset.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\n# processing the raw data of the video datasets (Something-something)\n# generate the meta files:\n#   category.txt:               the list of categories.\n#   train_videofolder.txt:      each row contains [videoname num_frames classIDX]\n#   val_videofolder.txt:        same as above\n#\n\nimport os\n\n\ndef train_and_val_test(modality='rgb'):\n    dataset_name = ''\n    if modality == 'rgb':\n        dataset_name = 'something-rgb'\n    elif modality == 'flow':\n        dataset_name = 'something-optical-flow'\n    else:\n        print('error in modality type!')\n        exit()\n    rootdir = '/home/mcg/cxk/dataset/somthing-something/'\n\n    prefix_name = 'something-something-v1'\n    with open(rootdir + '%s-labels.csv' % prefix_name) as f:\n        lines = f.readlines()\n    categories = []\n    for line in lines:\n        line = line.rstrip()\n        categories.append(line)\n    categories = sorted(categories)\n    with open(rootdir + 'category.txt', 'w') as f:\n        f.write('\\n'.join(categories))\n\n    dict_categories = {}\n    for i, category in enumerate(categories):\n        dict_categories[category] = i\n\n    files_input = [rootdir + '%s-validation.csv' % prefix_name, rootdir + '%s-train.csv' % prefix_name]\n    files_output = [rootdir + 'val_videofolder_%s.txt' % modality, rootdir + 'train_videofolder_%s.txt' % modality]\n    for (filename_input, filename_output) in zip(files_input, files_output):\n        with open(filename_input) as f:\n            lines = f.readlines()\n        folders = []\n        idx_categories = []\n        for line in lines:\n            line = line.rstrip()\n            items = line.split(';')\n            folders.append(items[0])\n            idx_categories.append(dict_categories[items[1]])\n        output = []\n        for i in range(len(folders)):\n            curFolder = folders[i]\n          
  curIDX = idx_categories[i]\n            # counting the number of frames in each video folder\n            dir_files = os.listdir(os.path.join(rootdir, dataset_name, curFolder))\n            output.append('%s %d %d' % (curFolder, len(dir_files), curIDX))\n            print('%d/%d' % (i, len(folders)))\n        with open(filename_output, 'w') as f:\n            f.write('\\n'.join(output))\n\n    # -----------test set-----------\n    with open(rootdir + 'something-something-v1-test.csv') as f:\n        lines = f.readlines()\n    output = []\n    for idx, i in enumerate(lines):\n        folder = i.strip()\n        files = os.listdir(rootdir + dataset_name + '/%s' % folder)\n        output.append('%s %d' % (folder, len(files)))\n        print('%d/%d' % (idx, len(lines)))\n    with open(rootdir + 'test_videofolder_%s.txt' % modality, 'w') as f:\n        f.write('\\n'.join(output))\n\n\nif __name__ == '__main__':\n    train_and_val_test(modality='flow')\n    train_and_val_test(modality='rgb')\n"
  },
  {
    "path": "test_models.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n# Note: when testing TSN, num_segments=1; num_segments>1 is used only during the training phase.\n\nimport os\nimport argparse\nimport time\n\nimport numpy as np\nimport torch.nn.parallel\nimport torch.optim\nfrom sklearn.metrics import confusion_matrix\nfrom models import *\nfrom transforms import *\nfrom dataset import *\nimport pdb\nfrom torch.nn import functional as F\n\n\n# options\nparser = argparse.ArgumentParser(description=\"testing on the full validation set\")\nparser.add_argument('--model', type=str, choices=['TwoStream', 'TSN', 'C3D'])\nparser.add_argument('--modality', type=str, choices=['RGB', 'Flow'])\nparser.add_argument('--weights', type=str)\nparser.add_argument('--train_id', type=str)\nparser.add_argument('--arch', type=str, default=\"BNInception\")\nparser.add_argument('--save_scores', type=str, default=None)\nparser.add_argument('--test_segments', type=int, default=25)\nparser.add_argument('--max_num', type=int, default=-1)\nparser.add_argument('--test_crops', type=int, default=10)\nparser.add_argument('--input_size', type=int, default=224)\nparser.add_argument('--crop_fusion_type', type=str, default='TSN-DI',\n                    choices=['avg', 'TRN','TRNmultiscale', 'TSN-DI'])\nparser.add_argument('-j', '--workers', default=4, type=int, metavar='N',\n                    help='number of data loading workers (default: 4)')\nparser.add_argument('--gpus', nargs='+', type=int, default=None)\nparser.add_argument('--img_feature_dim',type=int, default=256)\nparser.add_argument('--k', type=int, default=3)\nparser.add_argument('--softmax', type=int, default=0)\n\nargs = parser.parse_args()\n\ndef return_something_path(modality):\n    filename_categories = '/home/mcg/cxk/dataset/somthing-something/category.txt'\n    if modality == 'RGB':\n        root_data = '/home/mcg/cxk/dataset/somthing-something/something-rgb'\n        
filename_imglist_train = '/home/mcg/cxk/dataset/somthing-something/train_videofolder_rgb.txt'\n        filename_imglist_val = '/home/mcg/cxk/dataset/somthing-something/val_videofolder_rgb.txt'\n\n        prefix = '{:05d}.jpg'\n    else:\n        root_data = '/home/mcg/cxk/dataset/somthing-something/something-optical-flow'\n        filename_imglist_train = '/home/mcg/cxk/dataset/somthing-something/train_videofolder_flow.txt'\n        filename_imglist_val = '/home/mcg/cxk/dataset/somthing-something/val_videofolder_flow.txt'\n\n        prefix = '{:s}_{:05d}.jpg'\n\n    with open(filename_categories) as f:\n        lines = f.readlines()\n    categories = [item.rstrip() for item in lines]\n    return categories, filename_imglist_train, filename_imglist_val, root_data, prefix\n\nclass AverageMeter(object):\n    \"\"\"Computes and stores the average and current value\"\"\"\n    def __init__(self):\n        self.reset()\n\n    def reset(self):\n        self.val = 0\n        self.avg = 0\n        self.sum = 0\n        self.count = 0\n\n    def update(self, val, n=1):\n        self.val = val\n        self.sum += val * n\n        self.count += n\n        self.avg = self.sum / self.count\n\ndef accuracy(output, target, topk=(1,)):\n    \"\"\"Computes the precision@k for the specified values of k\"\"\"\n    maxk = max(topk)\n    batch_size = target.size(0)\n    _, pred = output.topk(maxk, 0, True, True)\n    # pred = pred.t()\n    # print(target)\n    # print(pred)\n    correct = pred.eq(target.view(-1).expand(pred.size()))\n    # print(correct)\n    # correct = pred.eq(target.view(1, -1).expand_as(pred))\n    res = []\n    for k in topk:\n         correct_k = correct[:k].view(-1).float().sum(0)\n         res.append(correct_k.mul_(100.0 / (batch_size)))\n    return res\n\n\n# time.sleep(3400)\n\ncategories, args.train_list, args.val_list, args.root_path, prefix = return_something_path(args.modality)\nnum_class = len(categories)\n\nif args.model == 'TwoStream':\n    net = 
TwoStream(num_class, args.modality, base_model=args.arch)\nelif args.model == 'TSN':\n    net = TSN(num_class, 1, args.modality, base_model=args.arch)\nelif args.model == 'C3D':\n    net = C3D()\n\ncheckpoint = torch.load(os.path.join('/home/mcg/cxk/action-recognition-zoo/results', args.train_id, 'model', args.weights))\nprint(\"model epoch {} best prec@1: {}\".format(checkpoint['epoch'], checkpoint['best_prec1']))\n\nbase_dict = {'.'.join(k.split('.')[1:]): v for k,v in list(checkpoint['state_dict'].items())}\nnet.load_state_dict(base_dict)\n\nif args.test_crops == 1:\n    cropping = torchvision.transforms.Compose([\n        GroupScale(net.scale_size),\n        GroupCenterCrop(net.input_size),\n    ])\nelif args.test_crops == 10:\n    cropping = torchvision.transforms.Compose([\n        GroupOverSample(net.input_size, net.scale_size)\n    ])\nelse:\n    raise ValueError(\"Only 1 and 10 crops are supported, but got {}\".format(args.test_crops))\nif args.model == 'TwoStream':\n    data_loader = torch.utils.data.DataLoader(\n            TwoStreamDataSet(args.root_path, args.val_list, num_segments=args.test_segments,\n                       new_length=1 if args.modality == \"RGB\" else 5,\n                       modality=args.modality,\n                       image_tmpl=prefix,\n                       test_mode=True,\n                       transform=torchvision.transforms.Compose([\n                           cropping,\n                           Stack(roll=(args.arch in ['BNInception','InceptionV3'])),\n                           ToTorchFormatTensor(div=(args.arch not in ['BNInception','InceptionV3'])),\n                           GroupNormalize(net.input_mean, net.input_std),\n                       ])),\n            batch_size=1, shuffle=False,\n            num_workers=args.workers * 2, pin_memory=True)\nelif args.model == 'TSN':\n    data_loader = torch.utils.data.DataLoader(\n            TSNDataSet(args.root_path, args.val_list, 
num_segments=args.test_segments,\n                       new_length=1 if args.modality == \"RGB\" else 5,\n                       modality=args.modality,\n                       image_tmpl=prefix,\n                       test_mode=True,\n                       transform=torchvision.transforms.Compose([\n                           cropping,\n                           Stack(roll=(args.arch in ['BNInception','InceptionV3'])),\n                           ToTorchFormatTensor(div=(args.arch not in ['BNInception','InceptionV3'])),\n                           GroupNormalize(net.input_mean, net.input_std),\n                       ])),\n            batch_size=1, shuffle=False,\n            num_workers=args.workers * 2, pin_memory=True)\nelif args.model == 'C3D':\n    data_loader = torch.utils.data.DataLoader(\n        C3DDataSet(args.root_path, args.val_list, num_segments=args.test_segments,\n                              new_length=16,\n                              modality=args.modality,\n                              image_tmpl=prefix,\n                              test_mode=True,\n                              random_shift=False,\n                              transform=torchvision.transforms.Compose([\n                                  cropping,\n                                  Stack(roll=(args.arch in ['BNInception', 'InceptionV3'])),\n                                  ToTorchFormatTensor(\n                                      div=(args.arch not in ['BNInception', 'InceptionV3', 'C3D'])),\n                                  GroupNormalize(net.input_mean, net.input_std),\n                              ])),\n        batch_size=1, shuffle=False,\n        num_workers=args.workers * 2, pin_memory=True)\n\nif args.gpus is not None:\n    devices = [args.gpus[i] for i in range(args.workers)]\nelse:\n    devices = list(range(args.workers))\n\n\n#net = torch.nn.DataParallel(net.cuda(devices[0]), device_ids=devices)\nnet = torch.nn.DataParallel(net.cuda())\n# 
net=net.cuda()\nnet.eval()\n\ndata_gen = enumerate(data_loader)\n\ntotal_num = len(data_loader.dataset)\noutput = []\n\n\ndef eval_video(video_data):\n    i, data, label = video_data\n\n    num_crop = args.test_crops\n\n    if args.modality == 'RGB':\n        length = 3\n        if args.model == 'C3D':\n            length = 16\n    elif args.modality == 'Flow':\n        length = 10\n    elif args.modality == 'RGBDiff':\n        length = 18\n    else:\n        raise ValueError(\"Unknown modality \"+args.modality)\n\n    # data: bs * (channels * frames) * w * h\n\n    input_var = torch.autograd.Variable(data.view(-1, length, data.size(2), data.size(3)),\n                                        volatile=True)\n    rst = net(input_var.cuda())\n    rst = rst.data.cpu().numpy().copy()\n\n    # average the scores over crops, then over segments -> shape (num_class,)\n    rst = rst.reshape((num_crop, args.test_segments, num_class)).mean(axis=0).mean(axis=0)\n\n    return i, rst, label[0]\n\n\nproc_start_time = time.time()\nmax_num = args.max_num if args.max_num > 0 else len(data_loader.dataset)\n\ntop1 = AverageMeter()\ntop5 = AverageMeter()\n\nfor i, (data, label) in data_gen:\n    if i >= max_num:\n        break\n    rst = eval_video((i, data, label))\n    output.append(rst[1:])\n    cnt_time = time.time() - proc_start_time\n    prec1, prec5 = accuracy(torch.from_numpy(rst[1]), label, topk=(1, 5))\n    top1.update(prec1[0], 1)\n    top5.update(prec5[0], 1)\n    print('video {} done, total {}/{}, average {:.3f} sec/video, moving Prec@1 {:.3f} Prec@5 {:.3f}'.format(i, i+1,\n                                                                    len(data_loader),\n                                                                    float(cnt_time) / (i+1), top1.avg, top5.avg))\n\nvideo_pred = [np.argmax(np.mean(x[0], axis=0)) for x in output]\n\nvideo_labels = [x[1] for x in output]\n\n\ncf = confusion_matrix(video_labels, video_pred).astype(float)\n\ncls_cnt = 
cf.sum(axis=1)\ncls_hit = np.diag(cf)\n\ncls_acc = cls_hit / cls_cnt\n\nprint('-----Evaluation is finished------')\nprint('Class Accuracy {:.02f}%'.format(np.mean(cls_acc) * 100))\nprint('Overall Prec@1 {:.02f}% Prec@5 {:.02f}%'.format(top1.avg, top5.avg))\n\nif args.save_scores is not None:\n\n    if args.modality=='RGB':\n        test_list = '/home/mcg/cxk/dataset/somthing-something/val_videofolder_rgb.txt'\n    elif args.modality=='Flow':\n        test_list = '/home/mcg/cxk/dataset/somthing-something/val_videofolder_flow.txt'\n    # reorder before saving\n    name_list = [x.strip().split()[0] for x in open(test_list)]\n\n    assert len(output) == len(name_list)\n\n    order_dict = {e:i for i, e in enumerate(sorted(name_list))}\n    reorder_output = [None] * len(name_list)\n    # reorder_label = [None] * len(output)\n    reorder_pred = [None] * len(name_list)\n    output_csv = []\n    for i in range(len(output)):\n        idx = order_dict[name_list[i]]\n        reorder_output[idx] = output[i]\n        # reorder_label[idx] = video_labels[i]\n        reorder_pred[idx] = video_pred[i]\n        output_csv.append('%s;%s'%(name_list[i], categories[video_pred[i]]))\n\n    np.savez(os.path.join('/home/mcg/cxk/action-recognition-zoo/results', args.train_id, 'output', args.save_scores), scores=reorder_output, predictions=reorder_pred)\n\n    # with open(os.path.join('/home/mcg/cxk/action-recognition-zoo/results', args.train_id, 'output', args.save_scores+'.csv'), 'w') as f:\n    #     f.write('\\n'.join(output_csv))\n\n\n"
  },
  {
    "path": "transforms.py",
    "content": "# @Author  : Sky chen\n# @Email   : dzhchxk@126.com\n# @Personal homepage  : https://coderskychen.cn\n\nimport torchvision\nimport random\nfrom PIL import Image, ImageOps\nimport numpy as np\nimport numbers\nimport math\nimport torch\n\n\nclass GroupRandomCrop(object):\n    def __init__(self, size):\n        if isinstance(size, numbers.Number):\n            self.size = (int(size), int(size))\n        else:\n            self.size = size\n\n    def __call__(self, img_group):\n\n        w, h = img_group[0].size\n        th, tw = self.size\n\n        out_images = list()\n\n        x1 = random.randint(0, w - tw)\n        y1 = random.randint(0, h - th)\n\n        for img in img_group:\n            assert(img.size[0] == w and img.size[1] == h)\n            if w == tw and h == th:\n                out_images.append(img)\n            else:\n                out_images.append(img.crop((x1, y1, x1 + tw, y1 + th)))\n\n        return out_images\n\n\nclass GroupCenterCrop(object):\n    def __init__(self, size):\n        self.worker = torchvision.transforms.CenterCrop(size)\n\n    def __call__(self, img_group):\n        return [self.worker(img) for img in img_group]\n\n\nclass GroupRandomHorizontalFlip(object):\n    \"\"\"Randomly horizontally flips the given PIL.Image with a probability of 0.5\n    \"\"\"\n    def __init__(self, is_flow=False):\n        self.is_flow = is_flow\n\n    def __call__(self, img_group, is_flow=False):\n        v = random.random()\n        if v < 0.5:\n            ret = [img.transpose(Image.FLIP_LEFT_RIGHT) for img in img_group]\n            if self.is_flow:\n                for i in range(0, len(ret), 2):\n                    ret[i] = ImageOps.invert(ret[i])  # invert flow pixel values when flipping\n            return ret\n        else:\n            return img_group\n\n\nclass GroupNormalize(object):\n    def __init__(self, mean, std):\n        self.mean = mean\n        self.std = std\n\n    def __call__(self, tensor):\n        rep_mean 
= self.mean * (tensor.size()[0]//len(self.mean))\n        rep_std = self.std * (tensor.size()[0]//len(self.std))\n\n        # TODO: make efficient\n        for t, m, s in zip(tensor, rep_mean, rep_std):\n            t.sub_(m).div_(s)\n\n        return tensor\n\n\nclass GroupScale(object):\n    \"\"\" Rescales the input PIL.Image to the given 'size'.\n    'size' will be the size of the smaller edge.\n    For example, if height > width, then image will be\n    rescaled to (size * height / width, size)\n    size: size of the smaller edge\n    interpolation: Default: PIL.Image.BILINEAR\n    \"\"\"\n\n    def __init__(self, size, interpolation=Image.BILINEAR):\n        self.worker = torchvision.transforms.Scale(size, interpolation)\n\n    def __call__(self, img_group):\n        return [self.worker(img) for img in img_group]\n\n\nclass GroupOverSample(object):\n    def __init__(self, crop_size, scale_size=None):\n        self.crop_size = crop_size if not isinstance(crop_size, int) else (crop_size, crop_size)\n\n        if scale_size is not None:\n            self.scale_worker = GroupScale(scale_size)\n        else:\n            self.scale_worker = None\n\n    def __call__(self, img_group):\n\n        if self.scale_worker is not None:\n            img_group = self.scale_worker(img_group)\n\n        image_w, image_h = img_group[0].size\n        crop_w, crop_h = self.crop_size\n\n        offsets = GroupMultiScaleCrop.fill_fix_offset(False, image_w, image_h, crop_w, crop_h)\n        oversample_group = list()\n        for o_w, o_h in offsets:\n            normal_group = list()\n            flip_group = list()\n            for i, img in enumerate(img_group):\n                crop = img.crop((o_w, o_h, o_w + crop_w, o_h + crop_h))\n                normal_group.append(crop)\n                flip_crop = crop.copy().transpose(Image.FLIP_LEFT_RIGHT)\n\n                if img.mode == 'L' and i % 2 == 0:\n                    flip_group.append(ImageOps.invert(flip_crop))\n             
   else:\n                    flip_group.append(flip_crop)\n\n            oversample_group.extend(normal_group)\n            oversample_group.extend(flip_group)\n        return oversample_group\n\n\nclass GroupMultiScaleCrop(object):\n\n    def __init__(self, input_size, scales=None, max_distort=1, fix_crop=True, more_fix_crop=True):\n        # crop scales relative to the shorter edge (.875 was mistyped as 875)\n        self.scales = scales if scales is not None else [1, .875, .75, .66]\n        self.max_distort = max_distort\n        self.fix_crop = fix_crop\n        self.more_fix_crop = more_fix_crop\n        self.input_size = input_size if not isinstance(input_size, int) else [input_size, input_size]\n        self.interpolation = Image.BILINEAR\n\n    def __call__(self, img_group):\n\n        im_size = img_group[0].size\n\n        crop_w, crop_h, offset_w, offset_h = self._sample_crop_size(im_size)\n        crop_img_group = [img.crop((offset_w, offset_h, offset_w + crop_w, offset_h + crop_h)) for img in img_group]\n        ret_img_group = [img.resize((self.input_size[0], self.input_size[1]), self.interpolation)\n                         for img in crop_img_group]\n        return ret_img_group\n\n    def _sample_crop_size(self, im_size):\n        image_w, image_h = im_size[0], im_size[1]\n\n        # find a crop size\n        base_size = min(image_w, image_h)\n        crop_sizes = [int(base_size * x) for x in self.scales]\n        crop_h = [self.input_size[1] if abs(x - self.input_size[1]) < 3 else x for x in crop_sizes]\n        crop_w = [self.input_size[0] if abs(x - self.input_size[0]) < 3 else x for x in crop_sizes]\n\n        pairs = []\n        for i, h in enumerate(crop_h):\n            for j, w in enumerate(crop_w):\n                if abs(i - j) <= self.max_distort:\n                    pairs.append((w, h))\n\n        crop_pair = random.choice(pairs)\n        if not self.fix_crop:\n            w_offset = random.randint(0, image_w - crop_pair[0])\n            h_offset = random.randint(0, image_h - crop_pair[1])\n        else:\n       
     w_offset, h_offset = self._sample_fix_offset(image_w, image_h, crop_pair[0], crop_pair[1])\n\n        return crop_pair[0], crop_pair[1], w_offset, h_offset\n\n    def _sample_fix_offset(self, image_w, image_h, crop_w, crop_h):\n        offsets = self.fill_fix_offset(self.more_fix_crop, image_w, image_h, crop_w, crop_h)\n        return random.choice(offsets)\n\n    @staticmethod\n    def fill_fix_offset(more_fix_crop, image_w, image_h, crop_w, crop_h):\n        w_step = (image_w - crop_w) // 4\n        h_step = (image_h - crop_h) // 4\n\n        ret = list()\n        ret.append((0, 0))  # upper left\n        ret.append((4 * w_step, 0))  # upper right\n        ret.append((0, 4 * h_step))  # lower left\n        ret.append((4 * w_step, 4 * h_step))  # lower right\n        ret.append((2 * w_step, 2 * h_step))  # center\n\n        if more_fix_crop:\n            ret.append((0, 2 * h_step))  # center left\n            ret.append((4 * w_step, 2 * h_step))  # center right\n            ret.append((2 * w_step, 4 * h_step))  # lower center\n            ret.append((2 * w_step, 0 * h_step))  # upper center\n\n            ret.append((1 * w_step, 1 * h_step))  # upper left quarter\n            ret.append((3 * w_step, 1 * h_step))  # upper right quarter\n            ret.append((1 * w_step, 3 * h_step))  # lower left quarter\n            ret.append((3 * w_step, 3 * h_step))  # lower right quarter\n\n        return ret\n\n\nclass GroupRandomSizedCrop(object):\n    \"\"\"Randomly crops the given PIL.Image to a random size of (0.08 to 1.0) of the original size\n    and a random aspect ratio of 3/4 to 4/3 of the original aspect ratio.\n    This is popularly used to train the Inception networks.\n    size: size of the smaller edge\n    interpolation: Default: PIL.Image.BILINEAR\n    \"\"\"\n    def __init__(self, size, interpolation=Image.BILINEAR):\n        self.size = size\n        self.interpolation = interpolation\n\n    def __call__(self, img_group):\n        for attempt in 
range(10):\n            area = img_group[0].size[0] * img_group[0].size[1]\n            target_area = random.uniform(0.08, 1.0) * area\n            aspect_ratio = random.uniform(3. / 4, 4. / 3)\n\n            w = int(round(math.sqrt(target_area * aspect_ratio)))\n            h = int(round(math.sqrt(target_area / aspect_ratio)))\n\n            if random.random() < 0.5:\n                w, h = h, w\n\n            if w <= img_group[0].size[0] and h <= img_group[0].size[1]:\n                x1 = random.randint(0, img_group[0].size[0] - w)\n                y1 = random.randint(0, img_group[0].size[1] - h)\n                found = True\n                break\n        else:\n            found = False\n            x1 = 0\n            y1 = 0\n\n        if found:\n            out_group = list()\n            for img in img_group:\n                img = img.crop((x1, y1, x1 + w, y1 + h))\n                assert(img.size == (w, h))\n                out_group.append(img.resize((self.size, self.size), self.interpolation))\n            return out_group\n        else:\n            # Fallback\n            scale = GroupScale(self.size, interpolation=self.interpolation)\n            crop = GroupRandomCrop(self.size)\n            return crop(scale(img_group))\n\n\nclass Stack(object):\n\n    def __init__(self, roll=False):\n        self.roll = roll\n\n    def __call__(self, img_group):\n        if img_group[0].mode == 'L':\n            return np.concatenate([np.expand_dims(x, 2) for x in img_group], axis=2)\n        elif img_group[0].mode == 'RGB':\n            if self.roll:\n                return np.concatenate([np.array(x)[:, :, ::-1] for x in img_group], axis=2)\n            else:\n                return np.concatenate(img_group, axis=2)\n\n\nclass ToTorchFormatTensor(object):\n    \"\"\" Converts a PIL.Image (RGB) or numpy.ndarray (H x W x C) in the range [0, 255]\n    to a torch.FloatTensor of shape (C x H x W) in the range [0.0, 1.0] \"\"\"\n    def __init__(self, div=True):\n    
    self.div = div\n\n    def __call__(self, pic):\n        if isinstance(pic, np.ndarray):\n            # handle numpy array\n            img = torch.from_numpy(pic).permute(2, 0, 1).contiguous()\n        else:\n            # handle PIL Image\n            img = torch.ByteTensor(torch.ByteStorage.from_buffer(pic.tobytes()))\n            img = img.view(pic.size[1], pic.size[0], len(pic.mode))\n            # put it from HWC to CHW format\n            # yikes, this transpose takes 80% of the loading time/CPU\n            img = img.transpose(0, 1).transpose(0, 2).contiguous()\n        return img.float().div(255) if self.div else img.float()\n\n\nclass IdentityTransform(object):\n\n    def __call__(self, data):\n        return data\n\n\nif __name__ == \"__main__\":\n    trans = torchvision.transforms.Compose([\n        GroupScale(256),\n        GroupRandomCrop(224),\n        Stack(),\n        ToTorchFormatTensor(),\n        GroupNormalize(\n            mean=[.485, .456, .406],\n            std=[.229, .224, .225]\n        )]\n    )\n\n    im = Image.open('../tensorflow-model-zoo.torch/lena_299.png')\n\n    color_group = [im] * 3\n    rst = trans(color_group)\n\n    gray_group = [im.convert('L')] * 9\n    gray_rst = trans(gray_group)\n\n    trans2 = torchvision.transforms.Compose([\n        GroupRandomSizedCrop(256),\n        Stack(),\n        ToTorchFormatTensor(),\n        GroupNormalize(\n            mean=[.485, .456, .406],\n            std=[.229, .224, .225])\n    ])\n    print(trans2(color_group))"
  }
]