[
  {
    "path": "LICENSE",
    "content": "MIT License\n\nCopyright (c) 2021 Alibaba\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n"
  },
  {
    "path": "README.md",
    "content": "![Python >=3.5](https://img.shields.io/badge/Python->=3.6-blue.svg)\n![PyTorch >=1.6](https://img.shields.io/badge/PyTorch->=1.6-yellow.svg)\n\n# Cluster Contrast for Unsupervised Person Re-Identification\n\nThe *official* repository for [Cluster Contrast for Unsupervised Person Re-Identification](https://arxiv.org/pdf/2103.11568v3.pdf). We achieve state-of-the-art performances on **unsupervised learning** tasks for object re-ID, including person re-ID and vehicle re-ID.\n\n**Our unified framework**\n![framework](figs/frameworkv2.png)\n## Updates\n\n***11/19/2021***\n\n1. Memory dictionary update changed from bath hard update to momentum update. Because the bath hard update is sensitive to parameters, good results need to adjust many parameters, which is not robust enough.\n\n\n2. Add the results of the InforMap clustering algorithm. Compared with the DBSCAN clustering algorithm, it can achieve better results. At the same time, we found through experiments that it is more robust on each data set.\n\n## Notes\n\nIn the process of doing experiments, we found that some settings have a greater impact on the results. Share them here to prevent everyone from stepping on the pit when applying our method.\n\n1. The dataloader sampler uses RandomMultipleGallerySampler, see the code implementation for details. At the same time, we also provide RandomMultipleGallerySamplerNoCam sampler, which can be used in non-ReID fields.\n\n2. Add batch normalization to the final output layer of the network, see the code for details.\n\n3.  we obtain a total number of P × Z images in the mini\nbatch. P represents the number of categories, Z represents the number of instances of each category. mini batch = P x Z, P is set to 16, Z changes with the mini batch. 
\n\n## Requirements\n\n### Installation\n\n```shell\ngit clone https://github.com/alibaba/cluster-contrast-reid.git\ncd ClusterContrast\npython setup.py develop\n```\n\n### Prepare Datasets\n\n```shell\ncd examples && mkdir data\n```\nDownload the person datasets Market-1501,MSMT17,PersonX,DukeMTMC-reID and the vehicle datasets VeRi-776 from [aliyun](https://virutalbuy-public.oss-cn-hangzhou.aliyuncs.com/share/data.zip).\nThen unzip them under the directory like\n\n```\nClusterContrast/examples/data\n├── market1501\n│   └── Market-1501-v15.09.15\n├── msmt17\n│   └── MSMT17_V1\n├── personx\n│   └── PersonX\n├── dukemtmcreid\n│   └── DukeMTMC-reID\n└── veri\n    └── VeRi\n```\n\n### Prepare ImageNet Pre-trained Models for IBN-Net\n\nWhen training with the backbone of [IBN-ResNet](https://arxiv.org/abs/1807.09441), you need to download the ImageNet-pretrained model from this [link](https://drive.google.com/drive/folders/1thS2B8UOSBi_cJX6zRy6YYRwz_nVFI_S) and save it under the path of `examples/pretrained/`.\n\nImageNet-pretrained models for **ResNet-50** will be automatically downloaded in the python script.\n\n## Training\n\nWe utilize 4 GTX-2080TI GPUs for training. For more parameter configuration, please check **`run_code.sh`**.\n\n**examples:**\n\nMarket-1501:\n\n1. Using DBSCAN:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d market1501 --iters 200 --momentum 0.1 --eps 0.6 --num-instances 16\n```\n\n\n2. Using InfoMap:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d market1501 --iters 200 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16\n```\n\nMSMT17:\n\n1. Using DBSCAN:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d msmt17 --iters 400 --momentum 0.1 --eps 0.6 --num-instances 16\n```\n\n2. 
Using InfoMap:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d msmt17 --iters 400 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16\n```\n\nDukeMTMC-reID:\n\n1. Using DBSCAN:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d dukemtmcreid --iters 200 --momentum 0.1 --eps 0.6 --num-instances 16\n```\n\n2. Using InfoMap:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d dukemtmcreid --iters 200 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16\n```\n\nVeRi-776:\n\n1. Using DBSCAN:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d veri --iters 400 --momentum 0.1 --eps 0.6 --num-instances 16 --height 224 --width 224\n```\n\n2. Using InfoMap:\n```shell\nCUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d veri --iters 400 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16 --height 224 --width 224\n```\n\n## Evaluation\n\nWe utilize 1 GTX-2080TI GPU for testing. 
**Note that**\n\n+ use `--width 128 --height 256` (default) for person datasets, and `--height 224 --width 224` for vehicle datasets;\n\n+ use `-a resnet50` (default) for the backbone of ResNet-50, and `-a resnet_ibn50a` for the backbone of IBN-ResNet.\n\nTo evaluate the model, run:\n```shell\nCUDA_VISIBLE_DEVICES=0 \\\npython examples/test.py \\\n  -d $DATASET --resume $PATH\n```\n\n**Some examples:**\n```shell\n### Market-1501 ###\nCUDA_VISIBLE_DEVICES=0 \\\npython examples/test.py \\\n  -d market1501 --resume logs/spcl_usl/market_resnet50/model_best.pth.tar\n```\n\n## Results\n\n![framework](figs/resultsv2.png)\n\nThe models reported in the paper can be downloaded from [aliyun](https://virutalbuy-public.oss-cn-hangzhou.aliyuncs.com/share/cluster-contrast.zip).\n\n## Citation\n\nIf you find this code useful for your research, please cite our paper:\n```\n@article{dai2021cluster,\n  title={Cluster Contrast for Unsupervised Person Re-Identification},\n  author={Dai, Zuozhuo and Wang, Guangyuan and Zhu, Siyu and Yuan, Weihao and Tan, Ping},\n  journal={arXiv preprint arXiv:2103.11568},\n  year={2021}\n}\n```\n\n## Acknowledgements\n\nThanks to Yixiao Ge for open-sourcing his excellent work [SpCL](https://github.com/yxgeee/SpCL).\n"
  },
  {
    "path": "clustercontrast/__init__.py",
    "content": "from __future__ import absolute_import\n\nfrom . import datasets\nfrom . import evaluation_metrics\nfrom . import models\nfrom . import utils\nfrom . import evaluators\nfrom . import trainers\n\n__version__ = '0.1.0'\n"
  },
  {
    "path": "clustercontrast/datasets/__init__.py",
    "content": "from __future__ import absolute_import\nimport warnings\n\nfrom .market1501 import Market1501\nfrom .msmt17 import MSMT17\nfrom .personx import PersonX\nfrom .veri import VeRi\nfrom .dukemtmcreid import DukeMTMCreID\n\n\n__factory = {\n    'market1501': Market1501,\n    'msmt17': MSMT17,\n    'personx': PersonX,\n    'veri': VeRi,\n    'dukemtmcreid': DukeMTMCreID\n}\n\n\ndef names():\n    return sorted(__factory.keys())\n\n\ndef create(name, root, *args, **kwargs):\n    \"\"\"\n    Create a dataset instance.\n\n    Parameters\n    ----------\n    name : str\n        The dataset name. \n    root : str\n        The path to the dataset directory.\n    split_id : int, optional\n        The index of data split. Default: 0\n    num_val : int or float, optional\n        When int, it means the number of validation identities. When float,\n        it means the proportion of validation to all the trainval. Default: 100\n    download : bool, optional\n        If True, will download the dataset. Default: False\n    \"\"\"\n    if name not in __factory:\n        raise KeyError(\"Unknown dataset:\", name)\n    return __factory[name](root, *args, **kwargs)\n\n\ndef get_dataset(name, root, *args, **kwargs):\n    warnings.warn(\"get_dataset is deprecated. Use create instead.\")\n    return create(name, root, *args, **kwargs)\n"
  },
  {
    "path": "clustercontrast/datasets/dukemtmcreid.py",
    "content": "import glob\nimport os.path as osp\nimport re\nfrom ..utils.data import BaseImageDataset\n\n\ndef process_dir(dir_path, relabel=False):\n    img_paths = glob.glob(osp.join(dir_path, \"*.jpg\"))\n    pattern = re.compile(r\"([-\\d]+)_c(\\d)\")\n\n    # get all identities\n    pid_container = set()\n    for img_path in img_paths:\n        pid, _ = map(int, pattern.search(img_path).groups())\n        if pid == -1:\n            continue\n        pid_container.add(pid)\n\n    pid2label = {pid: label for label, pid in enumerate(pid_container)}\n\n    data = []\n    for img_path in img_paths:\n        pid, camid = map(int, pattern.search(img_path).groups())\n        if (pid not in pid_container) or (pid == -1):\n            continue\n\n        assert 1 <= camid <= 8\n        camid -= 1\n\n        if relabel:\n            pid = pid2label[pid]\n        data.append((img_path, pid, camid))\n\n    return data\n\n\nclass DukeMTMCreID(BaseImageDataset):\n\n    \"\"\"DukeMTMC-reID.\n    Reference:\n        - Ristani et al. Performance Measures and a Data Set for Multi-Target,\n            Multi-Camera Tracking. ECCVW 2016.\n        - Zheng et al. Unlabeled Samples Generated by GAN Improve the Person\n            Re-identification Baseline in vitro. 
ICCV 2017.\n    URL: `<https://github.com/layumi/DukeMTMC-reID_evaluation>`_\n\n    Dataset statistics:\n        - identities: 1404 (train + query).\n        - images: 16522 (train) + 2228 (query) + 17661 (gallery).\n        - cameras: 8.\n    \"\"\"\n\n    dataset_dir = \"DukeMTMC-reID\"\n\n    def __init__(self, root, verbose=True):\n        super(DukeMTMCreID, self).__init__()\n        self.root = osp.abspath(osp.expanduser(root))\n        self.dataset_dir = osp.join(self.root, self.dataset_dir)\n\n        self.train_dir = osp.join(self.dataset_dir, 'bounding_box_train')\n        self.query_dir = osp.join(self.dataset_dir, 'query')\n        self.gallery_dir = osp.join(self.dataset_dir, 'bounding_box_test')\n\n        self._check_before_run()\n\n        train = process_dir(dir_path=self.train_dir, relabel=True)\n        query = process_dir(dir_path=self.query_dir, relabel=False)\n        gallery = process_dir(dir_path=self.gallery_dir, relabel=False)\n\n        if verbose:\n            print(\"=> DukeMTMC-reID loaded\")\n            self.print_dataset_statistics(train, query, gallery)\n\n        self.train = train\n        self.query = query\n        self.gallery = gallery\n\n        self.num_train_pids, self.num_train_imgs, self.num_train_cams = self.get_imagedata_info(self.train)\n        self.num_query_pids, self.num_query_imgs, self.num_query_cams = self.get_imagedata_info(self.query)\n        self.num_gallery_pids, self.num_gallery_imgs, self.num_gallery_cams = self.get_imagedata_info(self.gallery)\n\n    def _check_before_run(self):\n        \"\"\"Check if all files are available before going deeper\"\"\"\n        if not osp.exists(self.dataset_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.dataset_dir))\n        if not osp.exists(self.train_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.train_dir))\n        if not osp.exists(self.query_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.query_dir))\n        if not osp.exists(self.gallery_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.gallery_dir))\n"
  },
  {
    "path": "clustercontrast/datasets/market1501.py",
    "content": "from __future__ import print_function, absolute_import\nimport os.path as osp\nimport glob\nimport re\nfrom ..utils.data import BaseImageDataset\n\n\nclass Market1501(BaseImageDataset):\n    \"\"\"\n    Market1501\n    Reference:\n    Zheng et al. Scalable Person Re-identification: A Benchmark. ICCV 2015.\n    URL: http://www.liangzheng.org/Project/project_reid.html\n\n    Dataset statistics:\n    # identities: 1501 (+1 for background)\n    # images: 12936 (train) + 3368 (query) + 15913 (gallery)\n    \"\"\"\n    dataset_dir = 'Market-1501-v15.09.15'\n\n    def __init__(self, root, verbose=True, **kwargs):\n        super(Market1501, self).__init__()\n        self.dataset_dir = osp.join(root, self.dataset_dir)\n        self.train_dir = osp.join(self.dataset_dir, 'bounding_box_train')\n        self.query_dir = osp.join(self.dataset_dir, 'query')\n        self.gallery_dir = osp.join(self.dataset_dir, 'bounding_box_test')\n\n        self._check_before_run()\n\n        train = self._process_dir(self.train_dir, relabel=True)\n        query = self._process_dir(self.query_dir, relabel=False)\n        gallery = self._process_dir(self.gallery_dir, relabel=False)\n\n        if verbose:\n            print(\"=> Market1501 loaded\")\n            self.print_dataset_statistics(train, query, gallery)\n\n        self.train = train\n        self.query = query\n        self.gallery = gallery\n\n        self.num_train_pids, self.num_train_imgs, self.num_train_cams = self.get_imagedata_info(self.train)\n        self.num_query_pids, self.num_query_imgs, self.num_query_cams = self.get_imagedata_info(self.query)\n        self.num_gallery_pids, self.num_gallery_imgs, self.num_gallery_cams = self.get_imagedata_info(self.gallery)\n\n    def _check_before_run(self):\n        \"\"\"Check if all files are available before going deeper\"\"\"\n        if not osp.exists(self.dataset_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.dataset_dir))\n        if 
not osp.exists(self.train_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.train_dir))\n        if not osp.exists(self.query_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.query_dir))\n        if not osp.exists(self.gallery_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.gallery_dir))\n\n    def _process_dir(self, dir_path, relabel=False):\n        img_paths = glob.glob(osp.join(dir_path, '*.jpg'))\n        pattern = re.compile(r'([-\\d]+)_c(\\d)')\n\n        pid_container = set()\n        for img_path in img_paths:\n            pid, _ = map(int, pattern.search(img_path).groups())\n            if pid == -1:\n                continue  # junk images are just ignored\n            pid_container.add(pid)\n        pid2label = {pid: label for label, pid in enumerate(pid_container)}\n\n        dataset = []\n        for img_path in img_paths:\n            pid, camid = map(int, pattern.search(img_path).groups())\n            if pid == -1:\n                continue  # junk images are just ignored\n            assert 0 <= pid <= 1501  # pid == 0 means background\n            assert 1 <= camid <= 6\n            camid -= 1  # index starts from 0\n            if relabel:\n                pid = pid2label[pid]\n            dataset.append((img_path, pid, camid))\n\n        return dataset\n"
  },
  {
    "path": "clustercontrast/datasets/msmt17.py",
    "content": "from __future__ import print_function, absolute_import\nimport os.path as osp\n\nimport glob\nimport re\nfrom ..utils.data import BaseImageDataset\n\n\ndef _process_dir(dir_path, relabel=False):\n    img_paths = glob.glob(osp.join(dir_path, '*.jpg'))\n    pattern = re.compile(r'([-\\d]+)_c(\\d+)')\n\n    pid_container = set()\n    for img_path in img_paths:\n        pid, _ = map(int, pattern.search(img_path).groups())\n        if pid == -1:\n            continue  # junk images are just ignored\n        pid_container.add(pid)\n    pid2label = {pid: label for label, pid in enumerate(pid_container)}\n    dataset = []\n    for img_path in img_paths:\n        pid, camid = map(int, pattern.search(img_path).groups())\n        if pid == -1:\n            continue  # junk images are just ignored\n        assert 1 <= camid <= 15\n        camid -= 1  # index starts from 0\n        if relabel:\n            pid = pid2label[pid]\n        dataset.append((img_path, pid, camid))\n\n    return dataset\n\n\nclass MSMT17(BaseImageDataset):\n    dataset_dir = 'MSMT17_V1'\n\n    def __init__(self, root, verbose=True, **kwargs):\n        super(MSMT17, self).__init__()\n        self.dataset_dir = osp.join(root, self.dataset_dir)\n        self.train_dir = osp.join(self.dataset_dir, 'bounding_box_train')\n        self.query_dir = osp.join(self.dataset_dir, 'query')\n        self.gallery_dir = osp.join(self.dataset_dir, 'bounding_box_test')\n\n        self._check_before_run()\n\n        train = _process_dir(self.train_dir, relabel=True)\n        query = _process_dir(self.query_dir, relabel=False)\n        gallery = _process_dir(self.gallery_dir, relabel=False)\n\n        if verbose:\n            print(\"=> MSMT17_V1 loaded\")\n            self.print_dataset_statistics(train, query, gallery)\n\n            self.train = train\n            self.query = query\n            self.gallery = gallery\n\n            self.num_train_pids, self.num_train_imgs, self.num_train_cams = 
self.get_imagedata_info(self.train)\n            self.num_query_pids, self.num_query_imgs, self.num_query_cams = self.get_imagedata_info(self.query)\n            self.num_gallery_pids, self.num_gallery_imgs, self.num_gallery_cams = self.get_imagedata_info(self.gallery)\n\n    def _check_before_run(self):\n        \"\"\"Check if all files are available before going deeper\"\"\"\n        if not osp.exists(self.dataset_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.dataset_dir))\n        if not osp.exists(self.train_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.train_dir))\n        if not osp.exists(self.query_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.query_dir))\n        if not osp.exists(self.gallery_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.gallery_dir))\n"
  },
  {
    "path": "clustercontrast/datasets/personx.py",
    "content": "from __future__ import print_function, absolute_import\nimport os.path as osp\nimport glob\nimport re\n\nfrom ..utils.data import BaseImageDataset\n\n\nclass PersonX(BaseImageDataset):\n    \"\"\"\n    PersonX\n    Reference:\n    Sun et al. Dissecting Person Re-identification from the Viewpoint of Viewpoint. CVPR 2019.\n\n    Dataset statistics:\n    # identities: 1266\n    # images: 9840 (train) + 5136 (query) + 30816 (gallery)\n    \"\"\"\n    dataset_dir = 'PersonX'\n\n    def __init__(self, root, verbose=True, **kwargs):\n        super(PersonX, self).__init__()\n        self.dataset_dir = osp.join(root, self.dataset_dir)\n        self.train_dir = osp.join(self.dataset_dir, 'bounding_box_train')\n        self.query_dir = osp.join(self.dataset_dir, 'query')\n        self.gallery_dir = osp.join(self.dataset_dir, 'bounding_box_test')\n\n        self._check_before_run()\n\n        train = self._process_dir(self.train_dir, relabel=True)\n        query = self._process_dir(self.query_dir, relabel=False)\n        gallery = self._process_dir(self.gallery_dir, relabel=False)\n\n        if verbose:\n            print(\"=> PersonX loaded\")\n            self.print_dataset_statistics(train, query, gallery)\n\n        self.train = train\n        self.query = query\n        self.gallery = gallery\n\n        self.num_train_pids, self.num_train_imgs, self.num_train_cams = self.get_imagedata_info(self.train)\n        self.num_query_pids, self.num_query_imgs, self.num_query_cams = self.get_imagedata_info(self.query)\n        self.num_gallery_pids, self.num_gallery_imgs, self.num_gallery_cams = self.get_imagedata_info(self.gallery)\n\n    def _check_before_run(self):\n        \"\"\"Check if all files are available before going deeper\"\"\"\n        if not osp.exists(self.dataset_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.dataset_dir))\n        if not osp.exists(self.train_dir):\n            raise RuntimeError(\"'{}' is not 
available\".format(self.train_dir))\n        if not osp.exists(self.query_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.query_dir))\n        if not osp.exists(self.gallery_dir):\n            raise RuntimeError(\"'{}' is not available\".format(self.gallery_dir))\n\n    def _process_dir(self, dir_path, relabel=False):\n        img_paths = glob.glob(osp.join(dir_path, '*.jpg'))\n        pattern = re.compile(r'([-\\d]+)_c([-\\d]+)')\n        cam2label = {3: 1, 4: 2, 8: 3, 10: 4, 11: 5, 12: 6}\n\n        pid_container = set()\n        for img_path in img_paths:\n            pid, _ = map(int, pattern.search(img_path).groups())\n            pid_container.add(pid)\n        pid2label = {pid: label for label, pid in enumerate(pid_container)}\n\n        dataset = []\n        for img_path in img_paths:\n            pid, camid = map(int, pattern.search(img_path).groups())\n            assert (camid in cam2label.keys())\n            camid = cam2label[camid]\n            camid -= 1  # index starts from 0\n            if relabel: pid = pid2label[pid]\n            dataset.append((img_path, pid, camid))\n\n        return dataset\n"
  },
  {
    "path": "clustercontrast/datasets/veri.py",
    "content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport glob\nimport re\nimport os.path as osp\n\nfrom ..utils.data import BaseImageDataset\n\n\nclass VeRi(BaseImageDataset):\n    \"\"\"\n    VeRi\n    Reference:\n    Liu, X., Liu, W., Ma, H., Fu, H.: Large-scale vehicle re-identification in urban surveillance videos. In: IEEE   %\n    International Conference on Multimedia and Expo. (2016) accepted.\n    Dataset statistics:\n    # identities: 776 vehicles(576 for training and 200 for testing)\n    # images: 37778 (train) + 11579 (query)\n    \"\"\"\n    dataset_dir = 'VeRi'\n\n    def __init__(self, root, verbose=True, **kwargs):\n        super(VeRi, self).__init__()\n        self.dataset_dir = osp.join(root, self.dataset_dir)\n        self.train_dir = osp.join(self.dataset_dir, 'image_train')\n        self.query_dir = osp.join(self.dataset_dir, 'image_query')\n        self.gallery_dir = osp.join(self.dataset_dir, 'image_test')\n\n        self.check_before_run()\n\n        train = self.process_dir(self.train_dir, relabel=True)\n        query = self.process_dir(self.query_dir, relabel=False)\n        gallery = self.process_dir(self.gallery_dir, relabel=False)\n\n        if verbose:\n            print('=> VeRi loaded')\n            self.print_dataset_statistics(train, query, gallery)\n\n        self.train = train\n        self.query = query\n        self.gallery = gallery\n\n        self.num_train_pids, self.num_train_imgs, self.num_train_cams = self.get_imagedata_info(self.train)\n        self.num_query_pids, self.num_query_imgs, self.num_query_cams = self.get_imagedata_info(self.query)\n        self.num_gallery_pids, self.num_gallery_imgs, self.num_gallery_cams = self.get_imagedata_info(self.gallery)\n\n    def check_before_run(self):\n        \"\"\"Check if all files are available before going deeper\"\"\"\n        if not osp.exists(self.dataset_dir):\n            raise 
RuntimeError('\"{}\" is not available'.format(self.dataset_dir))\n        if not osp.exists(self.train_dir):\n            raise RuntimeError('\"{}\" is not available'.format(self.train_dir))\n        if not osp.exists(self.query_dir):\n            raise RuntimeError('\"{}\" is not available'.format(self.query_dir))\n        if not osp.exists(self.gallery_dir):\n            raise RuntimeError('\"{}\" is not available'.format(self.gallery_dir))\n\n    def process_dir(self, dir_path, relabel=False):\n        img_paths = glob.glob(osp.join(dir_path, '*.jpg'))\n        pattern = re.compile(r'([-\\d]+)_c([-\\d]+)')\n\n        pid_container = set()\n        for img_path in img_paths:\n            pid, _ = map(int, pattern.search(img_path).groups())\n            if pid == -1:\n                continue  # junk images are just ignored\n            pid_container.add(pid)\n        pid2label = {pid: label for label, pid in enumerate(pid_container)}\n\n        dataset = []\n        for img_path in img_paths:\n            pid, camid = map(int, pattern.search(img_path).groups())\n            if pid == -1:\n                continue  # junk images are just ignored\n            assert 0 <= pid <= 776  # pid == 0 means background\n            assert 1 <= camid <= 20\n            camid -= 1  # index starts from 0\n            if relabel:\n                pid = pid2label[pid]\n            dataset.append((img_path, pid, camid))\n\n        return dataset\n"
  },
  {
    "path": "clustercontrast/evaluation_metrics/__init__.py",
    "content": "from __future__ import absolute_import\n\nfrom .classification import accuracy\nfrom .ranking import cmc, mean_ap\n\n__all__ = [\n    'accuracy',\n    'cmc',\n    'mean_ap'\n]\n"
  },
  {
    "path": "clustercontrast/evaluation_metrics/classification.py",
    "content": "from __future__ import absolute_import\n\nimport torch\nfrom ..utils import to_torch\n\n\ndef accuracy(output, target, topk=(1,)):\n    with torch.no_grad():\n        output, target = to_torch(output), to_torch(target)\n        maxk = max(topk)\n        batch_size = target.size(0)\n\n        _, pred = output.topk(maxk, 1, True, True)\n        pred = pred.t()\n        correct = pred.eq(target.view(1, -1).expand_as(pred))\n\n        ret = []\n        for k in topk:\n            correct_k = correct[:k].view(-1).float().sum(dim=0, keepdim=True)\n            ret.append(correct_k.mul_(1. / batch_size))\n        return ret\n"
  },
  {
    "path": "clustercontrast/evaluation_metrics/ranking.py",
    "content": "from __future__ import absolute_import\nfrom collections import defaultdict\n\nimport numpy as np\nfrom sklearn.metrics import average_precision_score\n\nfrom ..utils import to_numpy\n\n\ndef _unique_sample(ids_dict, num):\n    mask = np.zeros(num, dtype=np.bool)\n    for _, indices in ids_dict.items():\n        i = np.random.choice(indices)\n        mask[i] = True\n    return mask\n\n\ndef cmc(distmat, query_ids=None, gallery_ids=None,\n        query_cams=None, gallery_cams=None, topk=100,\n        separate_camera_set=False,\n        single_gallery_shot=False,\n        first_match_break=False):\n    distmat = to_numpy(distmat)\n    m, n = distmat.shape\n    # Fill up default values\n    if query_ids is None:\n        query_ids = np.arange(m)\n    if gallery_ids is None:\n        gallery_ids = np.arange(n)\n    if query_cams is None:\n        query_cams = np.zeros(m).astype(np.int32)\n    if gallery_cams is None:\n        gallery_cams = np.ones(n).astype(np.int32)\n    # Ensure numpy array\n    query_ids = np.asarray(query_ids)\n    gallery_ids = np.asarray(gallery_ids)\n    query_cams = np.asarray(query_cams)\n    gallery_cams = np.asarray(gallery_cams)\n    # Sort and find correct matches\n    indices = np.argsort(distmat, axis=1)\n    matches = (gallery_ids[indices] == query_ids[:, np.newaxis])\n    # Compute CMC for each query\n    ret = np.zeros(topk)\n    num_valid_queries = 0\n    for i in range(m):\n        # Filter out the same id and same camera\n        valid = ((gallery_ids[indices[i]] != query_ids[i]) |\n                 (gallery_cams[indices[i]] != query_cams[i]))\n        if separate_camera_set:\n            # Filter out samples from same camera\n            valid &= (gallery_cams[indices[i]] != query_cams[i])\n        if not np.any(matches[i, valid]): continue\n        if single_gallery_shot:\n            repeat = 10\n            gids = gallery_ids[indices[i][valid]]\n            inds = np.where(valid)[0]\n            ids_dict = 
defaultdict(list)\n            for j, x in zip(inds, gids):\n                ids_dict[x].append(j)\n        else:\n            repeat = 1\n        for _ in range(repeat):\n            if single_gallery_shot:\n                # Randomly choose one instance for each id\n                sampled = (valid & _unique_sample(ids_dict, len(valid)))\n                index = np.nonzero(matches[i, sampled])[0]\n            else:\n                index = np.nonzero(matches[i, valid])[0]\n            delta = 1. / (len(index) * repeat)\n            for j, k in enumerate(index):\n                if k - j >= topk: break\n                if first_match_break:\n                    ret[k - j] += 1\n                    break\n                ret[k - j] += delta\n        num_valid_queries += 1\n    if num_valid_queries == 0:\n        raise RuntimeError(\"No valid query\")\n    return ret.cumsum() / num_valid_queries\n\n\ndef mean_ap(distmat, query_ids=None, gallery_ids=None,\n            query_cams=None, gallery_cams=None):\n    distmat = to_numpy(distmat)\n    m, n = distmat.shape\n    # Fill up default values\n    if query_ids is None:\n        query_ids = np.arange(m)\n    if gallery_ids is None:\n        gallery_ids = np.arange(n)\n    if query_cams is None:\n        query_cams = np.zeros(m).astype(np.int32)\n    if gallery_cams is None:\n        gallery_cams = np.ones(n).astype(np.int32)\n    # Ensure numpy array\n    query_ids = np.asarray(query_ids)\n    gallery_ids = np.asarray(gallery_ids)\n    query_cams = np.asarray(query_cams)\n    gallery_cams = np.asarray(gallery_cams)\n    # Sort and find correct matches\n    indices = np.argsort(distmat, axis=1)\n    matches = (gallery_ids[indices] == query_ids[:, np.newaxis])\n    # Compute AP for each query\n    aps = []\n    for i in range(m):\n        # Filter out the same id and same camera\n        valid = ((gallery_ids[indices[i]] != query_ids[i]) |\n                 (gallery_cams[indices[i]] != query_cams[i]))\n        y_true = 
matches[i, valid]\n        y_score = -distmat[i][indices[i]][valid]\n        if not np.any(y_true): continue\n        aps.append(average_precision_score(y_true, y_score))\n    if len(aps) == 0:\n        raise RuntimeError(\"No valid query\")\n    return np.mean(aps)\n"
  },
  {
    "path": "clustercontrast/evaluators.py",
    "content": "from __future__ import print_function, absolute_import\nimport time\nimport collections\nfrom collections import OrderedDict\nimport numpy as np\nimport torch\nimport random\nimport copy\n\nfrom .evaluation_metrics import cmc, mean_ap\nfrom .utils.meters import AverageMeter\nfrom .utils.rerank import re_ranking\nfrom .utils import to_torch\n\n\ndef extract_cnn_feature(model, inputs):\n    inputs = to_torch(inputs).cuda()\n    outputs = model(inputs)\n    outputs = outputs.data.cpu()\n    return outputs\n\n\ndef extract_features(model, data_loader, print_freq=50):\n    model.eval()\n    batch_time = AverageMeter()\n    data_time = AverageMeter()\n\n    features = OrderedDict()\n    labels = OrderedDict()\n\n    end = time.time()\n    with torch.no_grad():\n        for i, (imgs, fnames, pids, _, _) in enumerate(data_loader):\n            data_time.update(time.time() - end)\n\n            outputs = extract_cnn_feature(model, imgs)\n            for fname, output, pid in zip(fnames, outputs, pids):\n                features[fname] = output\n                labels[fname] = pid\n\n            batch_time.update(time.time() - end)\n            end = time.time()\n\n            if (i + 1) % print_freq == 0:\n                print('Extract Features: [{}/{}]\\t'\n                      'Time {:.3f} ({:.3f})\\t'\n                      'Data {:.3f} ({:.3f})\\t'\n                      .format(i + 1, len(data_loader),\n                              batch_time.val, batch_time.avg,\n                              data_time.val, data_time.avg))\n\n    return features, labels\n\n\ndef pairwise_distance(features, query=None, gallery=None):\n    if query is None and gallery is None:\n        n = len(features)\n        x = torch.cat(list(features.values()))\n        x = x.view(n, -1)\n        dist_m = torch.pow(x, 2).sum(dim=1, keepdim=True) * 2\n        dist_m = dist_m.expand(n, n) - 2 * torch.mm(x, x.t())\n        return dist_m\n\n    x = 
torch.cat([features[f].unsqueeze(0) for f, _, _ in query], 0)\n    y = torch.cat([features[f].unsqueeze(0) for f, _, _ in gallery], 0)\n    m, n = x.size(0), y.size(0)\n    x = x.view(m, -1)\n    y = y.view(n, -1)\n    dist_m = torch.pow(x, 2).sum(dim=1, keepdim=True).expand(m, n) + \\\n           torch.pow(y, 2).sum(dim=1, keepdim=True).expand(n, m).t()\n    # in-place dist_m += -2 * x @ y.t(); the positional addmm_(beta, alpha, ...) form is deprecated\n    dist_m.addmm_(x, y.t(), beta=1, alpha=-2)\n    return dist_m, x.numpy(), y.numpy()\n\n\ndef evaluate_all(query_features, gallery_features, distmat, query=None, gallery=None,\n                 query_ids=None, gallery_ids=None,\n                 query_cams=None, gallery_cams=None,\n                 cmc_topk=(1, 5, 10), cmc_flag=False):\n    if query is not None and gallery is not None:\n        query_ids = [pid for _, pid, _ in query]\n        gallery_ids = [pid for _, pid, _ in gallery]\n        query_cams = [cam for _, _, cam in query]\n        gallery_cams = [cam for _, _, cam in gallery]\n    else:\n        assert (query_ids is not None and gallery_ids is not None\n                and query_cams is not None and gallery_cams is not None)\n\n    # Compute mean AP\n    mAP = mean_ap(distmat, query_ids, gallery_ids, query_cams, gallery_cams)\n    print('Mean AP: {:4.1%}'.format(mAP))\n\n    if (not cmc_flag):\n        return mAP\n\n    cmc_configs = {\n        'market1501': dict(separate_camera_set=False,\n                           single_gallery_shot=False,\n                           first_match_break=True),}\n    cmc_scores = {name: cmc(distmat, query_ids, gallery_ids,\n                            query_cams, gallery_cams, **params)\n                  for name, params in cmc_configs.items()}\n\n    print('CMC Scores:')\n    for k in cmc_topk:\n        print('  top-{:<4}{:12.1%}'.format(k, cmc_scores['market1501'][k-1]))\n    return cmc_scores['market1501'], mAP\n\n\nclass Evaluator(object):\n    def __init__(self, model):\n        super(Evaluator, self).__init__()\n        self.model = model\n\n    def evaluate(self, data_loader, query, gallery, cmc_flag=False, rerank=False):\n        features, _ = extract_features(self.model, data_loader)\n        distmat, query_features, gallery_features = pairwise_distance(features, query, gallery)\n        results = evaluate_all(query_features, gallery_features, distmat, query=query, gallery=gallery, cmc_flag=cmc_flag)\n\n        if (not rerank):\n            return results\n\n        print('Applying person re-ranking ...')\n        distmat_qq, _, _ = pairwise_distance(features, query, query)\n        distmat_gg, _, _ = pairwise_distance(features, gallery, gallery)\n        distmat = re_ranking(distmat.numpy(), distmat_qq.numpy(), distmat_gg.numpy())\n        return evaluate_all(query_features, gallery_features, distmat, query=query, gallery=gallery, cmc_flag=cmc_flag)\n"
  },
  {
    "path": "clustercontrast/models/__init__.py",
    "content": "from __future__ import absolute_import\n\nfrom .resnet import *\nfrom .resnet_ibn import *\n\n\n__factory = {\n    'resnet18': resnet18,\n    'resnet34': resnet34,\n    'resnet50': resnet50,\n    'resnet101': resnet101,\n    'resnet152': resnet152,\n    'resnet_ibn50a': resnet_ibn50a,\n    'resnet_ibn101a': resnet_ibn101a,\n}\n\n\ndef names():\n    return sorted(__factory.keys())\n\n\ndef create(name, *args, **kwargs):\n    \"\"\"\n    Create a model instance.\n\n    Parameters\n    ----------\n    name : str\n        Model name. Can be one of 'resnet18', 'resnet34', 'resnet50',\n        'resnet101', 'resnet152', 'resnet_ibn50a', and 'resnet_ibn101a'.\n    pretrained : bool, optional\n        Only applied for 'resnet*' models. If True, will use ImageNet pretrained\n        model. Default: True\n    cut_at_pooling : bool, optional\n        If True, will cut the model before the last global pooling layer and\n        ignore the remaining kwargs. Default: False\n    num_features : int, optional\n        If positive, will append a Linear layer after the global pooling layer,\n        with this number of output units, followed by a BatchNorm layer.\n        Otherwise these layers will not be appended. Default: 0\n    norm : bool, optional\n        If True, will normalize the feature to be unit L2-norm for each sample.\n        Otherwise will append a ReLU layer after the above Linear layer if\n        num_features > 0. Default: False\n    dropout : float, optional\n        If positive, will append a Dropout layer with this dropout rate.\n        Default: 0\n    num_classes : int, optional\n        If positive, will append a Linear layer at the end as the classifier\n        with this number of output units. Default: 0\n    \"\"\"\n    if name not in __factory:\n        raise KeyError(\"Unknown model:\", name)\n    return __factory[name](*args, **kwargs)\n"
  },
  {
    "path": "clustercontrast/models/cm.py",
    "content": "import collections\nimport numpy as np\nfrom abc import ABC\nimport torch\nimport torch.nn.functional as F\nfrom torch import nn, autograd\n\n\nclass CM(autograd.Function):\n\n    @staticmethod\n    def forward(ctx, inputs, targets, features, momentum):\n        ctx.features = features\n        ctx.momentum = momentum\n        ctx.save_for_backward(inputs, targets)\n        outputs = inputs.mm(ctx.features.t())\n\n        return outputs\n\n    @staticmethod\n    def backward(ctx, grad_outputs):\n        inputs, targets = ctx.saved_tensors\n        grad_inputs = None\n        if ctx.needs_input_grad[0]:\n            grad_inputs = grad_outputs.mm(ctx.features)\n\n        # momentum update\n        for x, y in zip(inputs, targets):\n            ctx.features[y] = ctx.momentum * ctx.features[y] + (1. - ctx.momentum) * x\n            ctx.features[y] /= ctx.features[y].norm()\n\n        return grad_inputs, None, None, None\n\n\ndef cm(inputs, indexes, features, momentum=0.5):\n    return CM.apply(inputs, indexes, features, torch.Tensor([momentum]).to(inputs.device))\n\n\nclass CM_Hard(autograd.Function):\n\n    @staticmethod\n    def forward(ctx, inputs, targets, features, momentum):\n        ctx.features = features\n        ctx.momentum = momentum\n        ctx.save_for_backward(inputs, targets)\n        outputs = inputs.mm(ctx.features.t())\n\n        return outputs\n\n    @staticmethod\n    def backward(ctx, grad_outputs):\n        inputs, targets = ctx.saved_tensors\n        grad_inputs = None\n        if ctx.needs_input_grad[0]:\n            grad_inputs = grad_outputs.mm(ctx.features)\n\n        batch_centers = collections.defaultdict(list)\n        for instance_feature, index in zip(inputs, targets.tolist()):\n            batch_centers[index].append(instance_feature)\n\n        for index, features in batch_centers.items():\n            distances = []\n            for feature in features:\n                distance = 
feature.unsqueeze(0).mm(ctx.features[index].unsqueeze(0).t())[0][0]\n                distances.append(distance.cpu().numpy())\n\n            # update with the hardest instance: the one least similar to the current center\n            hardest = np.argmin(np.array(distances))\n            ctx.features[index] = ctx.features[index] * ctx.momentum + (1 - ctx.momentum) * features[hardest]\n            ctx.features[index] /= ctx.features[index].norm()\n\n        return grad_inputs, None, None, None\n\n\ndef cm_hard(inputs, indexes, features, momentum=0.5):\n    return CM_Hard.apply(inputs, indexes, features, torch.Tensor([momentum]).to(inputs.device))\n\n\nclass ClusterMemory(nn.Module, ABC):\n    def __init__(self, num_features, num_samples, temp=0.05, momentum=0.2, use_hard=False):\n        super(ClusterMemory, self).__init__()\n        self.num_features = num_features\n        self.num_samples = num_samples\n\n        self.momentum = momentum\n        self.temp = temp\n        self.use_hard = use_hard\n\n        self.register_buffer('features', torch.zeros(num_samples, num_features))\n\n    def forward(self, inputs, targets):\n\n        inputs = F.normalize(inputs, dim=1).cuda()\n        if self.use_hard:\n            outputs = cm_hard(inputs, targets, self.features, self.momentum)\n        else:\n            outputs = cm(inputs, targets, self.features, self.momentum)\n\n        outputs /= self.temp\n        loss = F.cross_entropy(outputs, targets)\n        return loss\n"
  },
  {
    "path": "clustercontrast/models/dsbn.py",
    "content": "import torch\nimport torch.nn as nn\n\n# Domain-specific BatchNorm\n\nclass DSBN2d(nn.Module):\n    def __init__(self, planes):\n        super(DSBN2d, self).__init__()\n        self.num_features = planes\n        self.BN_S = nn.BatchNorm2d(planes)\n        self.BN_T = nn.BatchNorm2d(planes)\n\n    def forward(self, x):\n        if (not self.training):\n            return self.BN_T(x)\n\n        bs = x.size(0)\n        assert (bs%2==0)\n        split = torch.split(x, int(bs/2), 0)\n        out1 = self.BN_S(split[0].contiguous())\n        out2 = self.BN_T(split[1].contiguous())\n        out = torch.cat((out1, out2), 0)\n        return out\n\nclass DSBN1d(nn.Module):\n    def __init__(self, planes):\n        super(DSBN1d, self).__init__()\n        self.num_features = planes\n        self.BN_S = nn.BatchNorm1d(planes)\n        self.BN_T = nn.BatchNorm1d(planes)\n\n    def forward(self, x):\n        if (not self.training):\n            return self.BN_T(x)\n\n        bs = x.size(0)\n        assert (bs%2==0)\n        split = torch.split(x, int(bs/2), 0)\n        out1 = self.BN_S(split[0].contiguous())\n        out2 = self.BN_T(split[1].contiguous())\n        out = torch.cat((out1, out2), 0)\n        return out\n\ndef convert_dsbn(model):\n    for _, (child_name, child) in enumerate(model.named_children()):\n        assert(not next(model.parameters()).is_cuda)\n        if isinstance(child, nn.BatchNorm2d):\n            m = DSBN2d(child.num_features)\n            m.BN_S.load_state_dict(child.state_dict())\n            m.BN_T.load_state_dict(child.state_dict())\n            setattr(model, child_name, m)\n        elif isinstance(child, nn.BatchNorm1d):\n            m = DSBN1d(child.num_features)\n            m.BN_S.load_state_dict(child.state_dict())\n            m.BN_T.load_state_dict(child.state_dict())\n            setattr(model, child_name, m)\n        else:\n            convert_dsbn(child)\n\ndef convert_bn(model, use_target=True):\n    for _, 
(child_name, child) in enumerate(model.named_children()):\n        assert(not next(model.parameters()).is_cuda)\n        if isinstance(child, DSBN2d):\n            m = nn.BatchNorm2d(child.num_features)\n            if use_target:\n                m.load_state_dict(child.BN_T.state_dict())\n            else:\n                m.load_state_dict(child.BN_S.state_dict())\n            setattr(model, child_name, m)\n        elif isinstance(child, DSBN1d):\n            m = nn.BatchNorm1d(child.num_features)\n            if use_target:\n                m.load_state_dict(child.BN_T.state_dict())\n            else:\n                m.load_state_dict(child.BN_S.state_dict())\n            setattr(model, child_name, m)\n        else:\n            convert_bn(child, use_target=use_target)\n"
  },
  {
    "path": "clustercontrast/models/kmeans.py",
    "content": "# Written by Yixiao Ge\n\nimport faiss\nimport torch\n\nfrom ..utils import to_numpy, to_torch\n\n__all__ = [\"label_generator_kmeans\"]\n\n\n@torch.no_grad()\ndef label_generator_kmeans(features, num_classes=500, cuda=True):\n\n    assert num_classes, \"num_classes for kmeans is null\"\n\n    # k-means cluster by faiss\n    cluster = faiss.Kmeans(\n        features.size(-1), num_classes, niter=300, verbose=True, gpu=cuda\n    )\n\n    cluster.train(to_numpy(features))\n\n    _, labels = cluster.index.search(to_numpy(features), 1)\n    labels = labels.reshape(-1)\n\n    centers = to_torch(cluster.centroids).float()\n    # labels = to_torch(labels).long()\n\n    # k-means does not have outlier points\n    assert -1 not in labels\n\n    return labels, centers, num_classes, None\n"
  },
  {
    "path": "clustercontrast/models/pooling.py",
    "content": "# Credit to https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/layers/pooling.py\nfrom abc import ABC\n\nimport torch\nimport torch.nn.functional as F\nfrom torch import nn\n\n__all__ = [\n    \"GeneralizedMeanPoolingPFpn\",\n    \"GeneralizedMeanPoolingList\",\n    \"GeneralizedMeanPoolingP\",\n    \"AdaptiveAvgMaxPool2d\",\n    \"FastGlobalAvgPool2d\",\n    \"avg_pooling\",\n    \"max_pooling\",\n]\n\n\nclass GeneralizedMeanPoolingList(nn.Module, ABC):\n    r\"\"\"Applies a 2D power-average adaptive pooling over an input signal composed of\n    several input planes.\n    The function computed is: :math:`f(X) = pow(sum(pow(X, p)), 1/p)`\n        - At p = infinity, one gets Max Pooling\n        - At p = 1, one gets Average Pooling\n    The output is of size H x W, for any input size.\n    The number of output features is equal to the number of input planes.\n    Args:\n        output_size: the target output size of the image of the form H x W.\n                     Can be a tuple (H, W) or a single H for a square image H x H\n                     H and W can be either a ``int``, or ``None`` which means the size\n                     will be the same as that of the input.\n    \"\"\"\n\n    def __init__(self, output_size=1, eps=1e-6):\n        super(GeneralizedMeanPoolingList, self).__init__()\n        self.output_size = output_size\n        self.eps = eps\n\n    def forward(self, x_list):\n        outs = []\n        for x in x_list:\n            x = x.clamp(min=self.eps)\n            out = torch.nn.functional.adaptive_avg_pool2d(x, self.output_size)\n            outs.append(out)\n        return torch.stack(outs, -1).mean(-1)\n\n    def __repr__(self):\n        return (\n            self.__class__.__name__\n            + \"(\"\n            + \"output_size=\"\n            + str(self.output_size)\n            + \")\"\n        )\n\n\nclass GeneralizedMeanPooling(nn.Module, ABC):\n    r\"\"\"Applies a 2D power-average adaptive pooling over an 
input signal composed of\n    several input planes.\n    The function computed is: :math:`f(X) = pow(sum(pow(X, p)), 1/p)`\n        - At p = infinity, one gets Max Pooling\n        - At p = 1, one gets Average Pooling\n    The output is of size H x W, for any input size.\n    The number of output features is equal to the number of input planes.\n    Args:\n        output_size: the target output size of the image of the form H x W.\n                     Can be a tuple (H, W) or a single H for a square image H x H\n                     H and W can be either a ``int``, or ``None`` which means the size\n                     will be the same as that of the input.\n    \"\"\"\n\n    def __init__(self, norm, output_size=1, eps=1e-6):\n        super(GeneralizedMeanPooling, self).__init__()\n        assert norm > 0\n        self.p = float(norm)\n        self.output_size = output_size\n        self.eps = eps\n\n    def forward(self, x):\n        x = x.clamp(min=self.eps).pow(self.p)\n        return torch.nn.functional.adaptive_avg_pool2d(x, self.output_size).pow(\n            1.0 / self.p\n        )\n\n    def __repr__(self):\n        return (\n            self.__class__.__name__\n            + \"(\"\n            + str(self.p)\n            + \", \"\n            + \"output_size=\"\n            + str(self.output_size)\n            + \")\"\n        )\n\n\nclass GeneralizedMeanPoolingP(GeneralizedMeanPooling, ABC):\n    \"\"\" Same, but norm is trainable\n    \"\"\"\n\n    def __init__(self, norm=3, output_size=1, eps=1e-6):\n        super(GeneralizedMeanPoolingP, self).__init__(norm, output_size, eps)\n        self.p = nn.Parameter(torch.ones(1) * norm)\n\n\nclass GeneralizedMeanPoolingFpn(nn.Module, ABC):\n    r\"\"\"Applies a 2D power-average adaptive pooling over an input signal composed of\n    several input planes.\n    The function computed is: :math:`f(X) = pow(sum(pow(X, p)), 1/p)`\n        - At p = infinity, one gets Max Pooling\n        - At p = 1, one gets Average 
Pooling\n    The output is of size H x W, for any input size.\n    The number of output features is equal to the number of input planes.\n    Args:\n        output_size: the target output size of the image of the form H x W.\n                     Can be a tuple (H, W) or a single H for a square image H x H\n                     H and W can be either a ``int``, or ``None`` which means the size\n                     will be the same as that of the input.\n    \"\"\"\n\n    def __init__(self, norm, output_size=1, eps=1e-6):\n        super(GeneralizedMeanPoolingFpn, self).__init__()\n        assert norm > 0\n        self.p = float(norm)\n        self.output_size = output_size\n        self.eps = eps\n\n    def forward(self, x_lists):\n        outs = []\n        for x in x_lists:\n            x = x.clamp(min=self.eps).pow(self.p)\n            out = torch.nn.functional.adaptive_avg_pool2d(x, self.output_size).pow(\n                1.0 / self.p\n            )\n            outs.append(out)\n        return torch.cat(outs, 1)\n\n    def __repr__(self):\n        return (\n            self.__class__.__name__\n            + \"(\"\n            + str(self.p)\n            + \", \"\n            + \"output_size=\"\n            + str(self.output_size)\n            + \")\"\n        )\n\n\nclass GeneralizedMeanPoolingPFpn(GeneralizedMeanPoolingFpn, ABC):\n    \"\"\" Same, but norm is trainable\n    \"\"\"\n\n    def __init__(self, norm=3, output_size=1, eps=1e-6):\n        super(GeneralizedMeanPoolingPFpn, self).__init__(norm, output_size, eps)\n        self.p = nn.Parameter(torch.ones(1) * norm)\n\n\nclass AdaptiveAvgMaxPool2d(nn.Module, ABC):\n    def __init__(self):\n        super(AdaptiveAvgMaxPool2d, self).__init__()\n        self.avgpool = FastGlobalAvgPool2d()\n\n    def forward(self, x):\n        # FastGlobalAvgPool2d takes a single argument; the previous call passed a\n        # non-existent self.output_size and raised at runtime\n        x_avg = self.avgpool(x)\n        x_max = F.adaptive_max_pool2d(x, 1)\n        x = x_max + x_avg\n        return x\n\n\nclass FastGlobalAvgPool2d(nn.Module, ABC):\n    def __init__(self, flatten=False):\n        super(FastGlobalAvgPool2d, self).__init__()\n        self.flatten = flatten\n\n    def forward(self, x):\n        if self.flatten:\n            in_size = x.size()\n            return x.view((in_size[0], in_size[1], -1)).mean(dim=2)\n        else:\n            return (\n                x.view(x.size(0), x.size(1), -1)\n                .mean(-1)\n                .view(x.size(0), x.size(1), 1, 1)\n            )\n\n\ndef avg_pooling():\n    return nn.AdaptiveAvgPool2d(1)\n    # return FastGlobalAvgPool2d()\n\n\ndef max_pooling():\n    return nn.AdaptiveMaxPool2d(1)\n\n\nclass Flatten(nn.Module):\n    def forward(self, input):\n        return input.view(input.size(0), -1)\n\n\n__pooling_factory = {\n    \"avg\": avg_pooling,\n    \"max\": max_pooling,\n    \"gem\": GeneralizedMeanPoolingP,\n    \"gemFpn\": GeneralizedMeanPoolingPFpn,\n    \"gemList\": GeneralizedMeanPoolingList,\n    \"avg+max\": AdaptiveAvgMaxPool2d,\n}\n\n\ndef pooling_names():\n    return sorted(__pooling_factory.keys())\n\n\ndef build_pooling_layer(name):\n    \"\"\"\n    Create a pooling layer.\n    Parameters\n    ----------\n    name : str\n        The pooling layer name.\n    \"\"\"\n    if name not in __pooling_factory:\n        raise KeyError(\"Unknown pooling layer:\", name)\n    return __pooling_factory[name]()\n"
  },
  {
    "path": "clustercontrast/models/resnet.py",
    "content": "from __future__ import absolute_import\nfrom torch import nn\nfrom torch.nn import functional as F\nfrom torch.nn import init\nimport torchvision\nimport torch\nfrom .pooling import build_pooling_layer\n\n\n__all__ = ['ResNet', 'resnet18', 'resnet34', 'resnet50', 'resnet101',\n           'resnet152']\n\n\nclass ResNet(nn.Module):\n    __factory = {\n        18: torchvision.models.resnet18,\n        34: torchvision.models.resnet34,\n        50: torchvision.models.resnet50,\n        101: torchvision.models.resnet101,\n        152: torchvision.models.resnet152,\n    }\n\n    def __init__(self, depth, pretrained=True, cut_at_pooling=False,\n                 num_features=0, norm=False, dropout=0, num_classes=0, pooling_type='avg'):\n        super(ResNet, self).__init__()\n        self.pretrained = pretrained\n        self.depth = depth\n        self.cut_at_pooling = cut_at_pooling\n        # Construct base (pretrained) resnet\n        if depth not in ResNet.__factory:\n            raise KeyError(\"Unsupported depth:\", depth)\n        resnet = ResNet.__factory[depth](pretrained=pretrained)\n        resnet.layer4[0].conv2.stride = (1, 1)\n        resnet.layer4[0].downsample[0].stride = (1, 1)\n        self.base = nn.Sequential(\n            resnet.conv1, resnet.bn1, resnet.relu, resnet.maxpool,\n            resnet.layer1, resnet.layer2, resnet.layer3, resnet.layer4)\n\n        self.gap = build_pooling_layer(pooling_type)\n\n        if not self.cut_at_pooling:\n            self.num_features = num_features\n            self.norm = norm\n            self.dropout = dropout\n            self.has_embedding = num_features > 0\n            self.num_classes = num_classes\n\n            out_planes = resnet.fc.in_features\n\n            # Append new layers\n            if self.has_embedding:\n                self.feat = nn.Linear(out_planes, self.num_features)\n                self.feat_bn = nn.BatchNorm1d(self.num_features)\n                
init.kaiming_normal_(self.feat.weight, mode='fan_out')\n                init.constant_(self.feat.bias, 0)\n            else:\n                # Change the num_features to CNN output channels\n                self.num_features = out_planes\n                self.feat_bn = nn.BatchNorm1d(self.num_features)\n            self.feat_bn.bias.requires_grad_(False)\n            if self.dropout > 0:\n                self.drop = nn.Dropout(self.dropout)\n            if self.num_classes > 0:\n                self.classifier = nn.Linear(self.num_features, self.num_classes, bias=False)\n                init.normal_(self.classifier.weight, std=0.001)\n            # keep inside the cut_at_pooling branch: self.feat_bn does not exist\n            # when cut_at_pooling is True\n            init.constant_(self.feat_bn.weight, 1)\n            init.constant_(self.feat_bn.bias, 0)\n\n        if not pretrained:\n            self.reset_params()\n\n    def forward(self, x):\n        x = self.base(x)\n\n        x = self.gap(x)\n        x = x.view(x.size(0), -1)\n\n        if self.cut_at_pooling:\n            return x\n\n        if self.has_embedding:\n            bn_x = self.feat_bn(self.feat(x))\n        else:\n            bn_x = self.feat_bn(x)\n\n        if (self.training is False):\n            bn_x = F.normalize(bn_x)\n            return bn_x\n\n        if self.norm:\n            bn_x = F.normalize(bn_x)\n        elif self.has_embedding:\n            bn_x = F.relu(bn_x)\n\n        if self.dropout > 0:\n            bn_x = self.drop(bn_x)\n\n        if self.num_classes > 0:\n            prob = self.classifier(bn_x)\n        else:\n            return bn_x\n\n        return prob\n\n    def reset_params(self):\n        for m in self.modules():\n            if isinstance(m, nn.Conv2d):\n                init.kaiming_normal_(m.weight, mode='fan_out')\n                if m.bias is not None:\n                    init.constant_(m.bias, 0)\n            elif isinstance(m, nn.BatchNorm2d):\n                init.constant_(m.weight, 1)\n                init.constant_(m.bias, 0)\n            elif isinstance(m, nn.BatchNorm1d):\n                init.constant_(m.weight, 1)\n                init.constant_(m.bias, 0)\n            elif isinstance(m, nn.Linear):\n                init.normal_(m.weight, std=0.001)\n                if m.bias is not None:\n                    init.constant_(m.bias, 0)\n\n\ndef resnet18(**kwargs):\n    return ResNet(18, **kwargs)\n\n\ndef resnet34(**kwargs):\n    return ResNet(34, **kwargs)\n\n\ndef resnet50(**kwargs):\n    return ResNet(50, **kwargs)\n\n\ndef resnet101(**kwargs):\n    return ResNet(101, **kwargs)\n\n\ndef resnet152(**kwargs):\n    return ResNet(152, **kwargs)\n"
  },
  {
    "path": "clustercontrast/models/resnet_ibn.py",
    "content": "from __future__ import absolute_import\n\nfrom torch import nn\nfrom torch.nn import functional as F\nfrom torch.nn import init\nimport torchvision\nimport torch\nfrom .pooling import build_pooling_layer\n\nfrom .resnet_ibn_a import resnet50_ibn_a, resnet101_ibn_a\n\n\n__all__ = ['ResNetIBN', 'resnet_ibn50a', 'resnet_ibn101a']\n\n\nclass ResNetIBN(nn.Module):\n    __factory = {\n        '50a': resnet50_ibn_a,\n        '101a': resnet101_ibn_a\n    }\n\n    def __init__(self, depth, pretrained=True, cut_at_pooling=False,\n                 num_features=0, norm=False, dropout=0, num_classes=0, pooling_type='avg'):\n\n        super(ResNetIBN, self).__init__()\n\n        self.depth = depth\n        self.pretrained = pretrained\n        self.cut_at_pooling = cut_at_pooling\n\n        resnet = ResNetIBN.__factory[depth](pretrained=pretrained)\n        resnet.layer4[0].conv2.stride = (1, 1)\n        resnet.layer4[0].downsample[0].stride = (1, 1)\n\n        self.base = nn.Sequential(\n            resnet.conv1, resnet.bn1, resnet.relu, resnet.maxpool,\n            resnet.layer1, resnet.layer2, resnet.layer3, resnet.layer4)\n\n        self.gap = build_pooling_layer(pooling_type)\n\n        if not self.cut_at_pooling:\n            self.num_features = num_features\n            self.norm = norm\n            self.dropout = dropout\n            self.has_embedding = num_features > 0\n            self.num_classes = num_classes\n\n            out_planes = resnet.fc.in_features\n\n            # Append new layers\n            if self.has_embedding:\n                self.feat = nn.Linear(out_planes, self.num_features)\n                self.feat_bn = nn.BatchNorm1d(self.num_features)\n                init.kaiming_normal_(self.feat.weight, mode='fan_out')\n                init.constant_(self.feat.bias, 0)\n            else:\n                # Change the num_features to CNN output channels\n                self.num_features = out_planes\n                self.feat_bn = 
nn.BatchNorm1d(self.num_features)\n            self.feat_bn.bias.requires_grad_(False)\n            if self.dropout > 0:\n                self.drop = nn.Dropout(self.dropout)\n            if self.num_classes > 0:\n                self.classifier = nn.Linear(self.num_features, self.num_classes, bias=False)\n                init.normal_(self.classifier.weight, std=0.001)\n\n            # keep inside the cut_at_pooling branch: self.feat_bn does not exist\n            # when cut_at_pooling is True\n            init.constant_(self.feat_bn.weight, 1)\n            init.constant_(self.feat_bn.bias, 0)\n\n        if not pretrained:\n            self.reset_params()\n\n    def forward(self, x):\n        x = self.base(x)\n\n        x = self.gap(x)\n        x = x.view(x.size(0), -1)\n\n        if self.cut_at_pooling:\n            return x\n\n        if self.has_embedding:\n            bn_x = self.feat_bn(self.feat(x))\n        else:\n            bn_x = self.feat_bn(x)\n\n        if self.training is False:\n            bn_x = F.normalize(bn_x)\n            return bn_x\n\n        if self.norm:\n            bn_x = F.normalize(bn_x)\n        elif self.has_embedding:\n            bn_x = F.relu(bn_x)\n\n        if self.dropout > 0:\n            bn_x = self.drop(bn_x)\n\n        if self.num_classes > 0:\n            prob = self.classifier(bn_x)\n        else:\n            return bn_x\n\n        return prob\n\n    def reset_params(self):\n        for m in self.modules():\n            if isinstance(m, nn.Conv2d):\n                init.kaiming_normal_(m.weight, mode='fan_out')\n                if m.bias is not None:\n                    init.constant_(m.bias, 0)\n            elif isinstance(m, nn.BatchNorm2d):\n                init.constant_(m.weight, 1)\n                init.constant_(m.bias, 0)\n            elif isinstance(m, nn.BatchNorm1d):\n                init.constant_(m.weight, 1)\n                init.constant_(m.bias, 0)\n            elif isinstance(m, nn.Linear):\n                init.normal_(m.weight, std=0.001)\n                if m.bias is not None:\n                    init.constant_(m.bias, 0)\n\n\ndef resnet_ibn50a(**kwargs):\n    return ResNetIBN('50a', **kwargs)\n\n\ndef resnet_ibn101a(**kwargs):\n    return ResNetIBN('101a', **kwargs)\n"
  },
  {
    "path": "clustercontrast/models/resnet_ibn_a.py",
    "content": "import torch\nimport torch.nn as nn\nimport math\nimport torch.utils.model_zoo as model_zoo\n\n\n__all__ = ['ResNet', 'resnet50_ibn_a', 'resnet101_ibn_a']\n\n\nmodel_urls = {\n    'ibn_resnet50a': './examples/pretrained/resnet50_ibn_a.pth.tar',\n    'ibn_resnet101a': './examples/pretrained/resnet101_ibn_a.pth.tar',\n}\n\n\ndef conv3x3(in_planes, out_planes, stride=1):\n    \"3x3 convolution with padding\"\n    return nn.Conv2d(in_planes, out_planes, kernel_size=3, stride=stride,\n                     padding=1, bias=False)\n\n\nclass BasicBlock(nn.Module):\n    expansion = 1\n\n    def __init__(self, inplanes, planes, stride=1, downsample=None):\n        super(BasicBlock, self).__init__()\n        self.conv1 = conv3x3(inplanes, planes, stride)\n        self.bn1 = nn.BatchNorm2d(planes)\n        self.relu = nn.ReLU(inplace=True)\n        self.conv2 = conv3x3(planes, planes)\n        self.bn2 = nn.BatchNorm2d(planes)\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        residual = x\n\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        out = self.conv2(out)\n        out = self.bn2(out)\n\n        if self.downsample is not None:\n            residual = self.downsample(x)\n\n        out += residual\n        out = self.relu(out)\n\n        return out\n\n\nclass IBN(nn.Module):\n    def __init__(self, planes):\n        super(IBN, self).__init__()\n        half1 = int(planes/2)\n        self.half = half1\n        half2 = planes - half1\n        self.IN = nn.InstanceNorm2d(half1, affine=True)\n        self.BN = nn.BatchNorm2d(half2)\n\n    def forward(self, x):\n        split = torch.split(x, self.half, 1)\n        out1 = self.IN(split[0].contiguous())\n        out2 = self.BN(split[1].contiguous())\n        out = torch.cat((out1, out2), 1)\n        return out\n\nclass Bottleneck(nn.Module):\n    expansion = 4\n\n    def __init__(self, inplanes, planes, 
ibn=False, stride=1, downsample=None):\n        super(Bottleneck, self).__init__()\n        self.conv1 = nn.Conv2d(inplanes, planes, kernel_size=1, bias=False)\n        if ibn:\n            self.bn1 = IBN(planes)\n        else:\n            self.bn1 = nn.BatchNorm2d(planes)\n        self.conv2 = nn.Conv2d(planes, planes, kernel_size=3, stride=stride,\n                               padding=1, bias=False)\n        self.bn2 = nn.BatchNorm2d(planes)\n        self.conv3 = nn.Conv2d(planes, planes * self.expansion, kernel_size=1, bias=False)\n        self.bn3 = nn.BatchNorm2d(planes * self.expansion)\n        self.relu = nn.ReLU(inplace=True)\n        self.downsample = downsample\n        self.stride = stride\n\n    def forward(self, x):\n        residual = x\n\n        out = self.conv1(x)\n        out = self.bn1(out)\n        out = self.relu(out)\n\n        out = self.conv2(out)\n        out = self.bn2(out)\n        out = self.relu(out)\n\n        out = self.conv3(out)\n        out = self.bn3(out)\n\n        if self.downsample is not None:\n            residual = self.downsample(x)\n\n        out += residual\n        out = self.relu(out)\n\n        return out\n\n\nclass ResNet(nn.Module):\n\n    def __init__(self, block, layers, num_classes=1000):\n        scale = 64\n        self.inplanes = scale\n        super(ResNet, self).__init__()\n        self.conv1 = nn.Conv2d(3, scale, kernel_size=7, stride=2, padding=3,\n                               bias=False)\n        self.bn1 = nn.BatchNorm2d(scale)\n        self.relu = nn.ReLU(inplace=True)\n        self.maxpool = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)\n        self.layer1 = self._make_layer(block, scale, layers[0])\n        self.layer2 = self._make_layer(block, scale*2, layers[1], stride=2)\n        self.layer3 = self._make_layer(block, scale*4, layers[2], stride=2)\n        self.layer4 = self._make_layer(block, scale*8, layers[3], stride=2)\n        self.avgpool = nn.AvgPool2d(7)\n        self.fc = 
nn.Linear(scale * 8 * block.expansion, num_classes)\n\n        for m in self.modules():\n            if isinstance(m, nn.Conv2d):\n                n = m.kernel_size[0] * m.kernel_size[1] * m.out_channels\n                m.weight.data.normal_(0, math.sqrt(2. / n))\n            elif isinstance(m, nn.BatchNorm2d):\n                m.weight.data.fill_(1)\n                m.bias.data.zero_()\n            elif isinstance(m, nn.InstanceNorm2d):\n                m.weight.data.fill_(1)\n                m.bias.data.zero_()\n\n    def _make_layer(self, block, planes, blocks, stride=1):\n        downsample = None\n        if stride != 1 or self.inplanes != planes * block.expansion:\n            downsample = nn.Sequential(\n                nn.Conv2d(self.inplanes, planes * block.expansion,\n                          kernel_size=1, stride=stride, bias=False),\n                nn.BatchNorm2d(planes * block.expansion),\n            )\n\n        layers = []\n        ibn = True\n        if planes == 512:\n            ibn = False\n        layers.append(block(self.inplanes, planes, ibn, stride, downsample))\n        self.inplanes = planes * block.expansion\n        for i in range(1, blocks):\n            layers.append(block(self.inplanes, planes, ibn))\n\n        return nn.Sequential(*layers)\n\n    def forward(self, x):\n        x = self.conv1(x)\n        x = self.bn1(x)\n        x = self.relu(x)\n        x = self.maxpool(x)\n\n        x = self.layer1(x)\n        x = self.layer2(x)\n        x = self.layer3(x)\n        x = self.layer4(x)\n\n        x = self.avgpool(x)\n        x = x.view(x.size(0), -1)\n        x = self.fc(x)\n\n        return x\n\n\ndef resnet50_ibn_a(pretrained=False, **kwargs):\n    \"\"\"Constructs a ResNet-50 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(Bottleneck, [3, 4, 6, 3], **kwargs)\n    if pretrained:\n        state_dict = torch.load(model_urls['ibn_resnet50a'], 
map_location=torch.device('cpu'))['state_dict']\n        state_dict = remove_module_key(state_dict)\n        model.load_state_dict(state_dict)\n    return model\n\n\ndef resnet101_ibn_a(pretrained=False, **kwargs):\n    \"\"\"Constructs a ResNet-101 model.\n    Args:\n        pretrained (bool): If True, returns a model pre-trained on ImageNet\n    \"\"\"\n    model = ResNet(Bottleneck, [3, 4, 23, 3], **kwargs)\n    if pretrained:\n        state_dict = torch.load(model_urls['ibn_resnet101a'], map_location=torch.device('cpu'))['state_dict']\n        state_dict = remove_module_key(state_dict)\n        model.load_state_dict(state_dict)\n    return model\n\n\ndef remove_module_key(state_dict):\n    for key in list(state_dict.keys()):\n        if 'module' in key:\n            state_dict[key.replace('module.','')] = state_dict.pop(key)\n    return state_dict\n"
  },
  {
    "path": "clustercontrast/trainers.py",
    "content": "from __future__ import print_function, absolute_import\nimport time\nfrom .utils.meters import AverageMeter\n\n\nclass ClusterContrastTrainer(object):\n    def __init__(self, encoder, memory=None):\n        super(ClusterContrastTrainer, self).__init__()\n        self.encoder = encoder\n        self.memory = memory\n\n    def train(self, epoch, data_loader, optimizer, print_freq=10, train_iters=400):\n        self.encoder.train()\n\n        batch_time = AverageMeter()\n        data_time = AverageMeter()\n\n        losses = AverageMeter()\n\n        end = time.time()\n        for i in range(train_iters):\n            # load data\n            inputs = data_loader.next()\n            data_time.update(time.time() - end)\n\n            # process inputs\n            inputs, labels, indexes = self._parse_data(inputs)\n\n            # forward\n            f_out = self._forward(inputs)\n            # print(\"f_out shape: {}\".format(f_out.shape))\n            # compute loss with the hybrid memory\n            # loss = self.memory(f_out, indexes)\n            loss = self.memory(f_out, labels)\n\n            optimizer.zero_grad()\n            loss.backward()\n            optimizer.step()\n\n            losses.update(loss.item())\n\n            # print log\n            batch_time.update(time.time() - end)\n            end = time.time()\n\n            if (i + 1) % print_freq == 0:\n                print('Epoch: [{}][{}/{}]\\t'\n                      'Time {:.3f} ({:.3f})\\t'\n                      'Data {:.3f} ({:.3f})\\t'\n                      'Loss {:.3f} ({:.3f})'\n                      .format(epoch, i + 1, len(data_loader),\n                              batch_time.val, batch_time.avg,\n                              data_time.val, data_time.avg,\n                              losses.val, losses.avg))\n\n    def _parse_data(self, inputs):\n        imgs, _, pids, _, indexes = inputs\n        return imgs.cuda(), pids.cuda(), indexes.cuda()\n\n    def 
_forward(self, inputs):\n        return self.encoder(inputs)\n\n"
  },
  {
    "path": "clustercontrast/utils/__init__.py",
    "content": "from __future__ import absolute_import\n\nimport torch\n\n\ndef to_numpy(tensor):\n    if torch.is_tensor(tensor):\n        return tensor.cpu().numpy()\n    elif type(tensor).__module__ != 'numpy':\n        raise ValueError(\"Cannot convert {} to numpy array\"\n                         .format(type(tensor)))\n    return tensor\n\n\ndef to_torch(ndarray):\n    if type(ndarray).__module__ == 'numpy':\n        return torch.from_numpy(ndarray)\n    elif not torch.is_tensor(ndarray):\n        raise ValueError(\"Cannot convert {} to torch tensor\"\n                         .format(type(ndarray)))\n    return ndarray\n"
  },
  {
    "path": "clustercontrast/utils/data/__init__.py",
    "content": "from __future__ import absolute_import\n\nfrom .base_dataset import BaseDataset, BaseImageDataset\nfrom .preprocessor import Preprocessor\n\n\nclass IterLoader:\n    def __init__(self, loader, length=None):\n        self.loader = loader\n        self.length = length\n        self.iter = None\n\n    def __len__(self):\n        if self.length is not None:\n            return self.length\n\n        return len(self.loader)\n\n    def new_epoch(self):\n        self.iter = iter(self.loader)\n\n    def next(self):\n        try:\n            return next(self.iter)\n        except:\n            self.iter = iter(self.loader)\n            return next(self.iter)\n"
  },
  {
    "path": "clustercontrast/utils/data/base_dataset.py",
    "content": "# encoding: utf-8\nimport numpy as np\n\n\nclass BaseDataset(object):\n    \"\"\"\n    Base class of reid dataset\n    \"\"\"\n\n    def get_imagedata_info(self, data):\n        pids, cams = [], []\n        for _, pid, camid in data:\n            pids += [pid]\n            cams += [camid]\n        pids = set(pids)\n        cams = set(cams)\n        num_pids = len(pids)\n        num_cams = len(cams)\n        num_imgs = len(data)\n        return num_pids, num_imgs, num_cams\n\n    def print_dataset_statistics(self):\n        raise NotImplementedError\n\n    @property\n    def images_dir(self):\n        return None\n\n\nclass BaseImageDataset(BaseDataset):\n    \"\"\"\n    Base class of image reid dataset\n    \"\"\"\n\n    def print_dataset_statistics(self, train, query, gallery):\n        num_train_pids, num_train_imgs, num_train_cams = self.get_imagedata_info(train)\n        num_query_pids, num_query_imgs, num_query_cams = self.get_imagedata_info(query)\n        num_gallery_pids, num_gallery_imgs, num_gallery_cams = self.get_imagedata_info(gallery)\n\n        print(\"Dataset statistics:\")\n        print(\"  ----------------------------------------\")\n        print(\"  subset   | # ids | # images | # cameras\")\n        print(\"  ----------------------------------------\")\n        print(\"  train    | {:5d} | {:8d} | {:9d}\".format(num_train_pids, num_train_imgs, num_train_cams))\n        print(\"  query    | {:5d} | {:8d} | {:9d}\".format(num_query_pids, num_query_imgs, num_query_cams))\n        print(\"  gallery  | {:5d} | {:8d} | {:9d}\".format(num_gallery_pids, num_gallery_imgs, num_gallery_cams))\n        print(\"  ----------------------------------------\")\n"
  },
  {
    "path": "clustercontrast/utils/data/preprocessor.py",
    "content": "from __future__ import absolute_import\nimport os\nimport os.path as osp\nfrom torch.utils.data import DataLoader, Dataset\nimport numpy as np\nimport random\nimport math\nfrom PIL import Image\n\n\nclass Preprocessor(Dataset):\n    def __init__(self, dataset, root=None, transform=None):\n        super(Preprocessor, self).__init__()\n        self.dataset = dataset\n        self.root = root\n        self.transform = transform\n\n    def __len__(self):\n        return len(self.dataset)\n\n    def __getitem__(self, indices):\n        return self._get_single_item(indices)\n\n    def _get_single_item(self, index):\n        fname, pid, camid = self.dataset[index]\n        fpath = fname\n        if self.root is not None:\n            fpath = osp.join(self.root, fname)\n\n        img = Image.open(fpath).convert('RGB')\n\n        if self.transform is not None:\n            img = self.transform(img)\n\n        return img, fname, pid, camid, index\n"
  },
  {
    "path": "clustercontrast/utils/data/sampler.py",
    "content": "from __future__ import absolute_import\nfrom collections import defaultdict\nimport math\n\nimport numpy as np\nimport copy\nimport random\nimport torch\nfrom torch.utils.data.sampler import (\n    Sampler, SequentialSampler, RandomSampler, SubsetRandomSampler,\n    WeightedRandomSampler)\n\n\ndef No_index(a, b):\n    assert isinstance(a, list)\n    return [i for i, j in enumerate(a) if j != b]\n\n\nclass RandomIdentitySampler(Sampler):\n    def __init__(self, data_source, num_instances):\n        self.data_source = data_source\n        self.num_instances = num_instances\n        self.index_dic = defaultdict(list)\n        for index, (_, pid, _) in enumerate(data_source):\n            self.index_dic[pid].append(index)\n        self.pids = list(self.index_dic.keys())\n        self.num_samples = len(self.pids)\n\n    def __len__(self):\n        return self.num_samples * self.num_instances\n\n    def __iter__(self):\n        indices = torch.randperm(self.num_samples).tolist()\n        ret = []\n        for i in indices:\n            pid = self.pids[i]\n            t = self.index_dic[pid]\n            if len(t) >= self.num_instances:\n                t = np.random.choice(t, size=self.num_instances, replace=False)\n            else:\n                t = np.random.choice(t, size=self.num_instances, replace=True)\n            ret.extend(t)\n        return iter(ret)\n\n\nclass RandomMultipleGallerySampler(Sampler):\n    def __init__(self, data_source, num_instances=4):\n        super().__init__(data_source)\n        self.data_source = data_source\n        self.index_pid = defaultdict(int)\n        self.pid_cam = defaultdict(list)\n        self.pid_index = defaultdict(list)\n        self.num_instances = num_instances\n\n        for index, (_, pid, cam) in enumerate(data_source):\n            if pid < 0:\n                continue\n            self.index_pid[index] = pid\n            self.pid_cam[pid].append(cam)\n            
self.pid_index[pid].append(index)\n\n        self.pids = list(self.pid_index.keys())\n        self.num_samples = len(self.pids)\n\n    def __len__(self):\n        return self.num_samples * self.num_instances\n\n    def __iter__(self):\n        indices = torch.randperm(len(self.pids)).tolist()\n        ret = []\n\n        for kid in indices:\n            i = random.choice(self.pid_index[self.pids[kid]])\n\n            _, i_pid, i_cam = self.data_source[i]\n\n            ret.append(i)\n\n            pid_i = self.index_pid[i]\n            cams = self.pid_cam[pid_i]\n            index = self.pid_index[pid_i]\n            select_cams = No_index(cams, i_cam)\n\n            if select_cams:\n\n                if len(select_cams) >= self.num_instances:\n                    cam_indexes = np.random.choice(select_cams, size=self.num_instances-1, replace=False)\n                else:\n                    cam_indexes = np.random.choice(select_cams, size=self.num_instances-1, replace=True)\n\n                for kk in cam_indexes:\n                    ret.append(index[kk])\n\n            else:\n                select_indexes = No_index(index, i)\n                if not select_indexes:\n                    continue\n                if len(select_indexes) >= self.num_instances:\n                    ind_indexes = np.random.choice(select_indexes, size=self.num_instances-1, replace=False)\n                else:\n                    ind_indexes = np.random.choice(select_indexes, size=self.num_instances-1, replace=True)\n\n                for kk in ind_indexes:\n                    ret.append(index[kk])\n\n        return iter(ret)\n\n\nclass RandomMultipleGallerySamplerNoCam(Sampler):\n    def __init__(self, data_source, num_instances=4):\n        super().__init__(data_source)\n\n        self.data_source = data_source\n        self.index_pid = defaultdict(int)\n        self.pid_index = defaultdict(list)\n        self.num_instances = num_instances\n\n        for index, (_, pid, cam) in 
enumerate(data_source):\n            if pid < 0:\n                continue\n            self.index_pid[index] = pid\n            self.pid_index[pid].append(index)\n\n        self.pids = list(self.pid_index.keys())\n        self.num_samples = len(self.pids)\n\n    def __len__(self):\n        return self.num_samples * self.num_instances\n\n    def __iter__(self):\n        indices = torch.randperm(len(self.pids)).tolist()\n        ret = []\n\n        for kid in indices:\n            i = random.choice(self.pid_index[self.pids[kid]])\n            _, i_pid, i_cam = self.data_source[i]\n\n            ret.append(i)\n\n            pid_i = self.index_pid[i]\n            index = self.pid_index[pid_i]\n\n            select_indexes = No_index(index, i)\n            if not select_indexes:\n                continue\n            if len(select_indexes) >= self.num_instances:\n                ind_indexes = np.random.choice(select_indexes, size=self.num_instances-1, replace=False)\n            else:\n                ind_indexes = np.random.choice(select_indexes, size=self.num_instances-1, replace=True)\n\n            for kk in ind_indexes:\n                ret.append(index[kk])\n\n        return iter(ret)\n"
  },
  {
    "path": "clustercontrast/utils/data/transforms.py",
    "content": "from __future__ import absolute_import\n\nfrom torchvision.transforms import *\nfrom PIL import Image\nimport random\nimport math\nimport numpy as np\n\nclass RectScale(object):\n    def __init__(self, height, width, interpolation=Image.BILINEAR):\n        self.height = height\n        self.width = width\n        self.interpolation = interpolation\n\n    def __call__(self, img):\n        w, h = img.size\n        if h == self.height and w == self.width:\n            return img\n        return img.resize((self.width, self.height), self.interpolation)\n\n\nclass RandomSizedRectCrop(object):\n    def __init__(self, height, width, interpolation=Image.BILINEAR):\n        self.height = height\n        self.width = width\n        self.interpolation = interpolation\n\n    def __call__(self, img):\n        for attempt in range(10):\n            area = img.size[0] * img.size[1]\n            target_area = random.uniform(0.64, 1.0) * area\n            aspect_ratio = random.uniform(2, 3)\n\n            h = int(round(math.sqrt(target_area * aspect_ratio)))\n            w = int(round(math.sqrt(target_area / aspect_ratio)))\n\n            if w <= img.size[0] and h <= img.size[1]:\n                x1 = random.randint(0, img.size[0] - w)\n                y1 = random.randint(0, img.size[1] - h)\n\n                img = img.crop((x1, y1, x1 + w, y1 + h))\n                assert(img.size == (w, h))\n\n                return img.resize((self.width, self.height), self.interpolation)\n\n        # Fallback\n        scale = RectScale(self.height, self.width,\n                          interpolation=self.interpolation)\n        return scale(img)\n\n\nclass RandomErasing(object):\n    \"\"\" Randomly selects a rectangle region in an image and erases its pixels.\n        'Random Erasing Data Augmentation' by Zhong et al.\n        See https://arxiv.org/pdf/1708.04896.pdf\n    Args:\n         probability: The probability that the Random Erasing operation will be performed.\n       
  sl: Minimum proportion of erased area against input image.\n         sh: Maximum proportion of erased area against input image.\n         r1: Minimum aspect ratio of erased area.\n         mean: Erasing value.\n    \"\"\"\n\n    def __init__(self, probability=0.5, sl=0.02, sh=0.4, r1=0.3, mean=(0.4914, 0.4822, 0.4465)):\n        self.probability = probability\n        self.mean = mean\n        self.sl = sl\n        self.sh = sh\n        self.r1 = r1\n\n    def __call__(self, img):\n\n        if random.uniform(0, 1) >= self.probability:\n            return img\n\n        for attempt in range(100):\n            area = img.size()[1] * img.size()[2]\n\n            target_area = random.uniform(self.sl, self.sh) * area\n            aspect_ratio = random.uniform(self.r1, 1 / self.r1)\n\n            h = int(round(math.sqrt(target_area * aspect_ratio)))\n            w = int(round(math.sqrt(target_area / aspect_ratio)))\n\n            if w < img.size()[2] and h < img.size()[1]:\n                x1 = random.randint(0, img.size()[1] - h)\n                y1 = random.randint(0, img.size()[2] - w)\n                if img.size()[0] == 3:\n                    img[0, x1:x1 + h, y1:y1 + w] = self.mean[0]\n                    img[1, x1:x1 + h, y1:y1 + w] = self.mean[1]\n                    img[2, x1:x1 + h, y1:y1 + w] = self.mean[2]\n                else:\n                    img[0, x1:x1 + h, y1:y1 + w] = self.mean[0]\n                return img\n\n        return img\n"
  },
  {
    "path": "clustercontrast/utils/faiss_rerank.py",
    "content": "#!/usr/bin/env python3\n# -*- coding: utf-8 -*-\n\"\"\"\nCVPR2017 paper:Zhong Z, Zheng L, Cao D, et al. Re-ranking Person Re-identification with k-reciprocal Encoding[J]. 2017.\nurl:http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf\nMatlab version: https://github.com/zhunzhong07/person-re-ranking\n\"\"\"\n\nimport os, sys\nimport time\nimport numpy as np\nfrom scipy.spatial.distance import cdist\nimport gc\nimport faiss\n\nimport torch\nimport torch.nn.functional as F\n\nfrom .faiss_utils import search_index_pytorch, search_raw_array_pytorch, \\\n                            index_init_gpu, index_init_cpu\n\n\ndef k_reciprocal_neigh(initial_rank, i, k1):\n    forward_k_neigh_index = initial_rank[i,:k1+1]\n    backward_k_neigh_index = initial_rank[forward_k_neigh_index,:k1+1]\n    fi = np.where(backward_k_neigh_index==i)[0]\n    return forward_k_neigh_index[fi]\n\n\ndef compute_jaccard_distance(target_features, k1=20, k2=6, print_flag=True, search_option=0, use_float16=False):\n    end = time.time()\n    if print_flag:\n        print('Computing jaccard distance...')\n\n    ngpus = faiss.get_num_gpus()\n    N = target_features.size(0)\n    mat_type = np.float16 if use_float16 else np.float32\n\n    if (search_option==0):\n        # GPU + PyTorch CUDA Tensors (1)\n        res = faiss.StandardGpuResources()\n        res.setDefaultNullStreamAllDevices()\n        _, initial_rank = search_raw_array_pytorch(res, target_features, target_features, k1)\n        initial_rank = initial_rank.cpu().numpy()\n    elif (search_option==1):\n        # GPU + PyTorch CUDA Tensors (2)\n        res = faiss.StandardGpuResources()\n        index = faiss.GpuIndexFlatL2(res, target_features.size(-1))\n        index.add(target_features.cpu().numpy())\n        _, initial_rank = search_index_pytorch(index, target_features, k1)\n        res.syncDefaultStreamCurrentDevice()\n        initial_rank = 
initial_rank.cpu().numpy()\n    elif (search_option==2):\n        # GPU\n        index = index_init_gpu(ngpus, target_features.size(-1))\n        index.add(target_features.cpu().numpy())\n        _, initial_rank = index.search(target_features.cpu().numpy(), k1)\n    else:\n        # CPU\n        index = index_init_cpu(target_features.size(-1))\n        index.add(target_features.cpu().numpy())\n        _, initial_rank = index.search(target_features.cpu().numpy(), k1)\n\n\n    nn_k1 = []\n    nn_k1_half = []\n    for i in range(N):\n        nn_k1.append(k_reciprocal_neigh(initial_rank, i, k1))\n        nn_k1_half.append(k_reciprocal_neigh(initial_rank, i, int(np.around(k1/2))))\n\n    V = np.zeros((N, N), dtype=mat_type)\n    for i in range(N):\n        k_reciprocal_index = nn_k1[i]\n        k_reciprocal_expansion_index = k_reciprocal_index\n        for candidate in k_reciprocal_index:\n            candidate_k_reciprocal_index = nn_k1_half[candidate]\n            if (len(np.intersect1d(candidate_k_reciprocal_index,k_reciprocal_index)) > 2/3*len(candidate_k_reciprocal_index)):\n                k_reciprocal_expansion_index = np.append(k_reciprocal_expansion_index,candidate_k_reciprocal_index)\n\n        k_reciprocal_expansion_index = np.unique(k_reciprocal_expansion_index)  ## element-wise unique\n        dist = 2-2*torch.mm(target_features[i].unsqueeze(0).contiguous(), target_features[k_reciprocal_expansion_index].t())\n        if use_float16:\n            V[i,k_reciprocal_expansion_index] = F.softmax(-dist, dim=1).view(-1).cpu().numpy().astype(mat_type)\n        else:\n            V[i,k_reciprocal_expansion_index] = F.softmax(-dist, dim=1).view(-1).cpu().numpy()\n\n    del nn_k1, nn_k1_half\n\n    if k2 != 1:\n        V_qe = np.zeros_like(V, dtype=mat_type)\n        for i in range(N):\n            V_qe[i,:] = np.mean(V[initial_rank[i,:k2],:], axis=0)\n        V = V_qe\n        del V_qe\n\n    del initial_rank\n\n    invIndex = []\n    for i in range(N):\n        
invIndex.append(np.where(V[:,i] != 0)[0])  #len(invIndex)=all_num\n\n    jaccard_dist = np.zeros((N, N), dtype=mat_type)\n    for i in range(N):\n        temp_min = np.zeros((1, N), dtype=mat_type)\n        # temp_max = np.zeros((1,N), dtype=mat_type)\n        indNonZero = np.where(V[i, :] != 0)[0]\n        indImages = []\n        indImages = [invIndex[ind] for ind in indNonZero]\n        for j in range(len(indNonZero)):\n            temp_min[0, indImages[j]] = temp_min[0, indImages[j]]+np.minimum(V[i, indNonZero[j]], V[indImages[j], indNonZero[j]])\n            # temp_max[0,indImages[j]] = temp_max[0,indImages[j]]+np.maximum(V[i,indNonZero[j]],V[indImages[j],indNonZero[j]])\n\n        jaccard_dist[i] = 1-temp_min/(2-temp_min)\n        # jaccard_dist[i] = 1-temp_min/(temp_max+1e-6)\n\n    del invIndex, V\n\n    pos_bool = (jaccard_dist < 0)\n    jaccard_dist[pos_bool] = 0.0\n    if print_flag:\n        print(\"Jaccard distance computing time cost: {}\".format(time.time()-end))\n\n    return jaccard_dist\n"
  },
  {
    "path": "clustercontrast/utils/faiss_utils.py",
    "content": "import os\nimport numpy as np\nimport faiss\nimport torch\n\ndef swig_ptr_from_FloatTensor(x):\n    assert x.is_contiguous()\n    assert x.dtype == torch.float32\n    return faiss.cast_integer_to_float_ptr(\n        x.storage().data_ptr() + x.storage_offset() * 4)\n\ndef swig_ptr_from_LongTensor(x):\n    assert x.is_contiguous()\n    assert x.dtype == torch.int64, 'dtype=%s' % x.dtype\n\n    return faiss.cast_integer_to_idx_t_ptr(\n        x.storage().data_ptr() + x.storage_offset() * 8)\n\ndef search_index_pytorch(index, x, k, D=None, I=None):\n    \"\"\"call the search function of an index with pytorch tensor I/O (CPU\n    and GPU supported)\"\"\"\n    assert x.is_contiguous()\n    n, d = x.size()\n    assert d == index.d\n\n    if D is None:\n        D = torch.empty((n, k), dtype=torch.float32, device=x.device)\n    else:\n        assert D.size() == (n, k)\n\n    if I is None:\n        I = torch.empty((n, k), dtype=torch.int64, device=x.device)\n    else:\n        assert I.size() == (n, k)\n    torch.cuda.synchronize()\n    xptr = swig_ptr_from_FloatTensor(x)\n    Iptr = swig_ptr_from_LongTensor(I)\n    Dptr = swig_ptr_from_FloatTensor(D)\n    index.search_c(n, xptr,\n                   k, Dptr, Iptr)\n    torch.cuda.synchronize()\n    return D, I\n\ndef search_raw_array_pytorch(res, xb, xq, k, D=None, I=None,\n                             metric=faiss.METRIC_L2):\n    assert xb.device == xq.device\n\n    nq, d = xq.size()\n    if xq.is_contiguous():\n        xq_row_major = True\n    elif xq.t().is_contiguous():\n        xq = xq.t()    # I initially wrote xq:t(), Lua is still haunting me :-)\n        xq_row_major = False\n    else:\n        raise TypeError('matrix should be row or column-major')\n\n    xq_ptr = swig_ptr_from_FloatTensor(xq)\n\n    nb, d2 = xb.size()\n    assert d2 == d\n    if xb.is_contiguous():\n        xb_row_major = True\n    elif xb.t().is_contiguous():\n        xb = xb.t()\n        xb_row_major = False\n    else:\n        
raise TypeError('matrix should be row or column-major')\n    xb_ptr = swig_ptr_from_FloatTensor(xb)\n\n    if D is None:\n        D = torch.empty(nq, k, device=xb.device, dtype=torch.float32)\n    else:\n        assert D.shape == (nq, k)\n        assert D.device == xb.device\n\n    if I is None:\n        I = torch.empty(nq, k, device=xb.device, dtype=torch.int64)\n    else:\n        assert I.shape == (nq, k)\n        assert I.device == xb.device\n\n    D_ptr = swig_ptr_from_FloatTensor(D)\n    I_ptr = swig_ptr_from_LongTensor(I)\n\n    faiss.bruteForceKnn(res, metric,\n                xb_ptr, xb_row_major, nb,\n                xq_ptr, xq_row_major, nq,\n                d, k, D_ptr, I_ptr)\n\n    return D, I\n\ndef index_init_gpu(ngpus, feat_dim):\n    flat_config = []\n    for i in range(ngpus):\n        cfg = faiss.GpuIndexFlatConfig()\n        cfg.useFloat16 = False\n        cfg.device = i\n        flat_config.append(cfg)\n\n    res = [faiss.StandardGpuResources() for i in range(ngpus)]\n    indexes = [faiss.GpuIndexFlatL2(res[i], feat_dim, flat_config[i]) for i in range(ngpus)]\n    index = faiss.IndexShards(feat_dim)\n    for sub_index in indexes:\n        index.add_shard(sub_index)\n    index.reset()\n    return index\n\ndef index_init_cpu(feat_dim):\n    return faiss.IndexFlatL2(feat_dim)\n"
  },
  {
    "path": "clustercontrast/utils/infomap_cluster.py",
    "content": "import numpy as np\nfrom tqdm import tqdm\nimport infomap\nimport faiss\nimport math\nimport multiprocessing as mp\nfrom clustercontrast.utils.infomap_utils import Timer\n\n\n\n\ndef l2norm(vec):\n    \"\"\"\n    归一化\n    :param vec:\n    :return:\n    \"\"\"\n    vec /= np.linalg.norm(vec, axis=1).reshape(-1, 1)\n    return vec\n\n\ndef intdict2ndarray(d, default_val=-1):\n    arr = np.zeros(len(d)) + default_val\n    for k, v in d.items():\n        arr[k] = v\n    return arr\n\n\ndef read_meta(fn_meta, start_pos=0, verbose=True):\n    \"\"\"\n    idx2lb：每一个顶点对应一个类\n    lb2idxs：每个类对应一个id\n    \"\"\"\n    lb2idxs = {}\n    idx2lb = {}\n    with open(fn_meta) as f:\n        for idx, x in enumerate(f.readlines()[start_pos:]):\n            lb = int(x.strip())\n            if lb not in lb2idxs:\n                lb2idxs[lb] = []\n            lb2idxs[lb] += [idx]\n            idx2lb[idx] = lb\n\n    inst_num = len(idx2lb)\n    cls_num = len(lb2idxs)\n    if verbose:\n        print('[{}] #cls: {}, #inst: {}'.format(fn_meta, cls_num, inst_num))\n    return lb2idxs, idx2lb\n\n\nclass knn_faiss():\n    \"\"\"\n    内积暴力循环\n    归一化特征的内积等价于余弦相似度\n    \"\"\"\n\n    def __init__(self, feats, k, knn_method='faiss-cpu', verbose=True):\n        self.verbose = verbose\n\n        with Timer('[{}] build index {}'.format(knn_method, k), verbose):\n            feats = feats.astype('float32')\n            size, dim = feats.shape\n            if knn_method == 'faiss-gpu':\n                i = math.ceil(size / 1000000)\n                if i > 1:\n                    i = (i - 1) * 4\n                res = faiss.StandardGpuResources()\n                res.setTempMemory(i * 1024 * 1024 * 1024)\n                index = faiss.GpuIndexFlatIP(res, dim)\n            else:\n                index = faiss.IndexFlatIP(dim)\n            index.add(feats)\n\n        with Timer('[{}] query topk {}'.format(knn_method, k), verbose):\n            sims, nbrs = index.search(feats, k=k)\n         
   self.knns = [(np.array(nbr, dtype=np.int32),\n                          1 - np.array(sim, dtype=np.float32))\n                         for nbr, sim in zip(nbrs, sims)]\n\n    def filter_by_th(self, i):\n        th_nbrs = []\n        th_dists = []\n        nbrs, dists = self.knns[i]\n        for n, dist in zip(nbrs, dists):\n            if 1 - dist < self.th:\n                continue\n            th_nbrs.append(n)\n            th_dists.append(dist)\n        th_nbrs = np.array(th_nbrs)\n        th_dists = np.array(th_dists)\n        return th_nbrs, th_dists\n\n    def get_knns(self, th=None):\n        if th is None or th <= 0.:\n            return self.knns\n        # TODO: optimize the filtering process by numpy\n        # nproc = mp.cpu_count()\n        nproc = 1\n        with Timer('filter edges by th {} (CPU={})'.format(th, nproc),\n                   self.verbose):\n            self.th = th\n            self.th_knns = []\n            tot = len(self.knns)\n            if nproc > 1:\n                pool = mp.Pool(nproc)\n                th_knns = list(\n                    tqdm(pool.imap(self.filter_by_th, range(tot)), total=tot))\n                pool.close()\n            else:\n                th_knns = [self.filter_by_th(i) for i in range(tot)]\n            return th_knns\n\n\ndef knns2ordered_nbrs(knns, sort=True):\n    if isinstance(knns, list):\n        knns = np.array(knns)\n    nbrs = knns[:, 0, :].astype(np.int32)\n    dists = knns[:, 1, :]\n    if sort:\n        # sort dists from low to high\n        nb_idx = np.argsort(dists, axis=1)\n        idxs = np.arange(nb_idx.shape[0]).reshape(-1, 1)\n        dists = dists[idxs, nb_idx]\n        nbrs = nbrs[idxs, nb_idx]\n    return dists, nbrs\n\n\n# 构造边\ndef get_links(single, links, nbrs, dists, min_sim):\n    for i in tqdm(range(nbrs.shape[0])):\n        count = 0\n        for j in range(0, len(nbrs[i])):\n            # 排除本身节点\n            if i == nbrs[i][j]:\n                pass\n            elif 
dists[i][j] <= 1 - min_sim:\n                count += 1\n                links[(i, nbrs[i][j])] = float(1 - dists[i][j])\n            else:\n                break\n        # 统计孤立点\n        if count == 0:\n            single.append(i)\n    return single, links\n\n\ndef cluster_by_infomap(nbrs, dists, min_sim, cluster_num=2):\n    \"\"\"\n    基于infomap的聚类\n    :param nbrs:\n    :param dists:\n    :param pred_label_path:\n    :return:\n    \"\"\"\n    single = []\n    links = {}\n    with Timer('get links', verbose=True):\n        single, links = get_links(single=single, links=links, nbrs=nbrs, dists=dists, min_sim=min_sim)\n\n    infomapWrapper = infomap.Infomap(\"--two-level --directed\")\n    for (i, j), sim in tqdm(links.items()):\n        _ = infomapWrapper.addLink(int(i), int(j), sim)\n\n    # 聚类运算\n    infomapWrapper.run()\n\n    label2idx = {}\n    idx2label = {}\n\n    # 聚类结果统计\n    for node in infomapWrapper.iterTree():\n        # node.physicalId 特征向量的编号\n        # node.moduleIndex() 聚类的编号\n        if node.moduleIndex() not in label2idx:\n            label2idx[node.moduleIndex()] = []\n        label2idx[node.moduleIndex()].append(node.physicalId)\n\n    node_count = 0\n    for k, v in label2idx.items():\n        if k == 0:\n            each_index_list = v[2:]\n            node_count += len(each_index_list)\n            label2idx[k] = each_index_list\n        else:\n            each_index_list = v[1:]\n            node_count += len(each_index_list)\n            label2idx[k] = each_index_list\n\n        for each_index in each_index_list:\n            idx2label[each_index] = k\n\n    keys_len = len(list(label2idx.keys()))\n    # 孤立点放入到结果中\n    for single_node in single:\n        idx2label[single_node] = keys_len\n        label2idx[keys_len] = [single_node]\n        keys_len += 1\n        node_count += 1\n\n    # 孤立点个数\n    print(\"孤立点数：{}\".format(len(single)))\n\n    idx_len = len(list(idx2label.keys()))\n    assert idx_len == node_count, 'idx_len not equal 
node_count!'\n\n    print(\"Total nodes: {}\".format(idx_len))\n\n    old_label_container = set()\n    for each_label, each_index_list in label2idx.items():\n        if len(each_index_list) <= cluster_num:\n            for each_index in each_index_list:\n                idx2label[each_index] = -1\n        else:\n            old_label_container.add(each_label)\n\n    old2new = {old_label: new_label for new_label, old_label in enumerate(old_label_container)}\n\n    for each_index, each_label in idx2label.items():\n        if each_label == -1:\n            continue\n        idx2label[each_index] = old2new[each_label]\n\n    pre_labels = intdict2ndarray(idx2label)\n\n    print(\"Total clusters: {}/{}\".format(keys_len, len(set(pre_labels)) - (1 if -1 in pre_labels else 0)))\n\n    return pre_labels\n\n\ndef get_dist_nbr(features, k=80, knn_method='faiss-cpu'):\n    index = knn_faiss(feats=features, k=k, knn_method=knn_method)\n    knns = index.get_knns()\n    dists, nbrs = knns2ordered_nbrs(knns)\n    return dists, nbrs\n\n\n\n"
  },
  {
    "path": "clustercontrast/utils/infomap_utils.py",
    "content": "import time\n\n\nclass TextColors:\n    HEADER = '\\033[35m'\n    OKBLUE = '\\033[34m'\n    OKGREEN = '\\033[32m'\n    WARNING = '\\033[33m'\n    FATAL = '\\033[31m'\n    ENDC = '\\033[0m'\n    BOLD = '\\033[1m'\n    UNDERLINE = '\\033[4m'\n\n\nclass Timer():\n    def __init__(self, name='task', verbose=True):\n        self.name = name\n        self.verbose = verbose\n\n    def __enter__(self):\n        self.start = time.time()\n        return self\n\n    def __exit__(self, exc_type, exc_val, exc_tb):\n        if self.verbose:\n            print('[Time] {} consumes {:.4f} s'.format(\n                self.name,\n                time.time() - self.start))\n        return exc_type is None"
  },
  {
    "path": "clustercontrast/utils/logging.py",
    "content": "from __future__ import absolute_import\nimport os\nimport sys\n\nfrom .osutils import mkdir_if_missing\n\n\nclass Logger(object):\n    def __init__(self, fpath=None):\n        self.console = sys.stdout\n        self.file = None\n        if fpath is not None:\n            mkdir_if_missing(os.path.dirname(fpath))\n            self.file = open(fpath, 'w')\n\n    def __del__(self):\n        self.close()\n\n    def __enter__(self):\n        return self\n\n    def __exit__(self, *args):\n        self.close()\n\n    def write(self, msg):\n        self.console.write(msg)\n        if self.file is not None:\n            self.file.write(msg)\n\n    def flush(self):\n        self.console.flush()\n        if self.file is not None:\n            self.file.flush()\n            os.fsync(self.file.fileno())\n\n    def close(self):\n        self.console.close()\n        if self.file is not None:\n            self.file.close()\n"
  },
  {
    "path": "clustercontrast/utils/meters.py",
    "content": "from __future__ import absolute_import\n\n\nclass AverageMeter(object):\n    \"\"\"Computes and stores the average and current value\"\"\"\n\n    def __init__(self):\n        self.val = 0\n        self.avg = 0\n        self.sum = 0\n        self.count = 0\n\n    def reset(self):\n        self.val = 0\n        self.avg = 0\n        self.sum = 0\n        self.count = 0\n\n    def update(self, val, n=1):\n        self.val = val\n        self.sum += val * n\n        self.count += n\n        self.avg = self.sum / self.count\n"
  },
  {
    "path": "clustercontrast/utils/osutils.py",
    "content": "from __future__ import absolute_import\nimport os\nimport errno\n\n\ndef mkdir_if_missing(dir_path):\n    try:\n        os.makedirs(dir_path)\n    except OSError as e:\n        if e.errno != errno.EEXIST:\n            raise\n"
  },
  {
    "path": "clustercontrast/utils/rerank.py",
    "content": "#!/usr/bin/env python2/python3\n# -*- coding: utf-8 -*-\n\"\"\"\nSource: https://github.com/zhunzhong07/person-re-ranking\nCreated on Mon Jun 26 14:46:56 2017\n@author: luohao\nModified by Houjing Huang, 2017-12-22.\n- This version accepts distance matrix instead of raw features.\n- The difference of `/` division between python 2 and 3 is handled.\n- numpy.float16 is replaced by numpy.float32 for numerical precision.\nCVPR2017 paper:Zhong Z, Zheng L, Cao D, et al. Re-ranking Person Re-identification with k-reciprocal Encoding[J]. 2017.\nurl:http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf\nMatlab version: https://github.com/zhunzhong07/person-re-ranking\nAPI\nq_g_dist: query-gallery distance matrix, numpy array, shape [num_query, num_gallery]\nq_q_dist: query-query distance matrix, numpy array, shape [num_query, num_query]\ng_g_dist: gallery-gallery distance matrix, numpy array, shape [num_gallery, num_gallery]\nk1, k2, lambda_value: parameters, the original paper is (k1=20, k2=6, lambda_value=0.3)\nReturns:\n  final_dist: re-ranked distance, numpy array, shape [num_query, num_gallery]\n\"\"\"\nfrom __future__ import absolute_import\nfrom __future__ import print_function\nfrom __future__ import division\n\n__all__ = ['re_ranking']\n\nimport numpy as np\n\n\ndef re_ranking(q_g_dist, q_q_dist, g_g_dist, k1=20, k2=6, lambda_value=0.3):\n\n    # The following naming, e.g. gallery_num, is different from outer scope.\n    # Don't care about it.\n\n    original_dist = np.concatenate(\n      [np.concatenate([q_q_dist, q_g_dist], axis=1),\n       np.concatenate([q_g_dist.T, g_g_dist], axis=1)],\n      axis=0)\n    original_dist = np.power(original_dist, 2).astype(np.float32)\n    original_dist = np.transpose(1. 
* original_dist/np.max(original_dist,axis = 0))\n    V = np.zeros_like(original_dist).astype(np.float32)\n    initial_rank = np.argsort(original_dist).astype(np.int32)\n\n    query_num = q_g_dist.shape[0]\n    gallery_num = q_g_dist.shape[0] + q_g_dist.shape[1]\n    all_num = gallery_num\n\n    for i in range(all_num):\n        # k-reciprocal neighbors\n        forward_k_neigh_index = initial_rank[i,:k1+1]\n        backward_k_neigh_index = initial_rank[forward_k_neigh_index,:k1+1]\n        fi = np.where(backward_k_neigh_index==i)[0]\n        k_reciprocal_index = forward_k_neigh_index[fi]\n        k_reciprocal_expansion_index = k_reciprocal_index\n        for j in range(len(k_reciprocal_index)):\n            candidate = k_reciprocal_index[j]\n            candidate_forward_k_neigh_index = initial_rank[candidate,:int(np.around(k1/2.))+1]\n            candidate_backward_k_neigh_index = initial_rank[candidate_forward_k_neigh_index,:int(np.around(k1/2.))+1]\n            fi_candidate = np.where(candidate_backward_k_neigh_index == candidate)[0]\n            candidate_k_reciprocal_index = candidate_forward_k_neigh_index[fi_candidate]\n            if len(np.intersect1d(candidate_k_reciprocal_index,k_reciprocal_index))> 2./3*len(candidate_k_reciprocal_index):\n                k_reciprocal_expansion_index = np.append(k_reciprocal_expansion_index,candidate_k_reciprocal_index)\n\n        k_reciprocal_expansion_index = np.unique(k_reciprocal_expansion_index)\n        weight = np.exp(-original_dist[i,k_reciprocal_expansion_index])\n        V[i,k_reciprocal_expansion_index] = 1.*weight/np.sum(weight)\n    original_dist = original_dist[:query_num,]\n    if k2 != 1:\n        V_qe = np.zeros_like(V,dtype=np.float32)\n        for i in range(all_num):\n            V_qe[i,:] = np.mean(V[initial_rank[i,:k2],:],axis=0)\n        V = V_qe\n        del V_qe\n    del initial_rank\n    invIndex = []\n    for i in range(gallery_num):\n        invIndex.append(np.where(V[:,i] != 0)[0])\n\n    
jaccard_dist = np.zeros_like(original_dist,dtype = np.float32)\n\n\n    for i in range(query_num):\n        temp_min = np.zeros(shape=[1,gallery_num],dtype=np.float32)\n        indNonZero = np.where(V[i,:] != 0)[0]\n        indImages = []\n        indImages = [invIndex[ind] for ind in indNonZero]\n        for j in range(len(indNonZero)):\n            temp_min[0,indImages[j]] = temp_min[0,indImages[j]]+ np.minimum(V[i,indNonZero[j]],V[indImages[j],indNonZero[j]])\n        jaccard_dist[i] = 1-temp_min/(2.-temp_min)\n\n    final_dist = jaccard_dist*(1-lambda_value) + original_dist*lambda_value\n    del original_dist\n    del V\n    del jaccard_dist\n    final_dist = final_dist[:query_num,query_num:]\n    return final_dist\n"
  },
  {
    "path": "clustercontrast/utils/serialization.py",
    "content": "from __future__ import print_function, absolute_import\nimport json\nimport os.path as osp\nimport shutil\n\nimport torch\nfrom torch.nn import Parameter\n\nfrom .osutils import mkdir_if_missing\n\n\ndef read_json(fpath):\n    with open(fpath, 'r') as f:\n        obj = json.load(f)\n    return obj\n\n\ndef write_json(obj, fpath):\n    mkdir_if_missing(osp.dirname(fpath))\n    with open(fpath, 'w') as f:\n        json.dump(obj, f, indent=4, separators=(',', ': '))\n\n\ndef save_checkpoint(state, is_best, fpath='checkpoint.pth.tar'):\n    mkdir_if_missing(osp.dirname(fpath))\n    torch.save(state, fpath)\n    if is_best:\n        shutil.copy(fpath, osp.join(osp.dirname(fpath), 'model_best.pth.tar'))\n\n\ndef load_checkpoint(fpath):\n    if osp.isfile(fpath):\n        # checkpoint = torch.load(fpath)\n        checkpoint = torch.load(fpath, map_location=torch.device('cpu'))\n        print(\"=> Loaded checkpoint '{}'\".format(fpath))\n        return checkpoint\n    else:\n        raise ValueError(\"=> No checkpoint found at '{}'\".format(fpath))\n\n\ndef copy_state_dict(state_dict, model, strip=None):\n    tgt_state = model.state_dict()\n    copied_names = set()\n    for name, param in state_dict.items():\n        if strip is not None and name.startswith(strip):\n            name = name[len(strip):]\n        if name not in tgt_state:\n            continue\n        if isinstance(param, Parameter):\n            param = param.data\n        if param.size() != tgt_state[name].size():\n            print('mismatch:', name, param.size(), tgt_state[name].size())\n            continue\n        tgt_state[name].copy_(param)\n        copied_names.add(name)\n\n    missing = set(tgt_state.keys()) - copied_names\n    if len(missing) > 0:\n        print(\"missing keys in state_dict:\", missing)\n\n    return model\n"
  },
  {
    "path": "examples/cluster_contrast_train_usl.py",
    "content": "# -*- coding: utf-8 -*-\nfrom __future__ import print_function, absolute_import\nimport argparse\nimport os.path as osp\nimport random\nimport numpy as np\nimport sys\nimport collections\nimport time\nfrom datetime import timedelta\n\nfrom sklearn.cluster import DBSCAN\n\nimport torch\nfrom torch import nn\nfrom torch.backends import cudnn\nfrom torch.utils.data import DataLoader\nimport torch.nn.functional as F\n\nfrom clustercontrast import datasets\nfrom clustercontrast import models\nfrom clustercontrast.models.cm import ClusterMemory\nfrom clustercontrast.trainers import ClusterContrastTrainer\nfrom clustercontrast.evaluators import Evaluator, extract_features\nfrom clustercontrast.utils.data import IterLoader\nfrom clustercontrast.utils.data import transforms as T\nfrom clustercontrast.utils.data.preprocessor import Preprocessor\nfrom clustercontrast.utils.logging import Logger\nfrom clustercontrast.utils.serialization import load_checkpoint, save_checkpoint\nfrom clustercontrast.utils.faiss_rerank import compute_jaccard_distance\nfrom clustercontrast.utils.data.sampler import RandomMultipleGallerySampler, RandomMultipleGallerySamplerNoCam\n\nstart_epoch = best_mAP = 0\n\n\ndef get_data(name, data_dir):\n    root = osp.join(data_dir, name)\n    dataset = datasets.create(name, root)\n    return dataset\n\n\ndef get_train_loader(args, dataset, height, width, batch_size, workers,\n                     num_instances, iters, trainset=None, no_cam=False):\n\n    normalizer = T.Normalize(mean=[0.485, 0.456, 0.406],\n                             std=[0.229, 0.224, 0.225])\n\n    train_transformer = T.Compose([\n        T.Resize((height, width), interpolation=3),\n        T.RandomHorizontalFlip(p=0.5),\n        T.Pad(10),\n        T.RandomCrop((height, width)),\n        T.ToTensor(),\n        normalizer,\n        T.RandomErasing(probability=0.5, mean=[0.485, 0.456, 0.406])\n    ])\n\n    train_set = sorted(dataset.train) if trainset is None else 
sorted(trainset)\n    rmgs_flag = num_instances > 0\n    if rmgs_flag:\n        if no_cam:\n            sampler = RandomMultipleGallerySamplerNoCam(train_set, num_instances)\n        else:\n            sampler = RandomMultipleGallerySampler(train_set, num_instances)\n    else:\n        sampler = None\n    train_loader = IterLoader(\n        DataLoader(Preprocessor(train_set, root=dataset.images_dir, transform=train_transformer),\n                   batch_size=batch_size, num_workers=workers, sampler=sampler,\n                   shuffle=not rmgs_flag, pin_memory=True, drop_last=True), length=iters)\n\n    return train_loader\n\n\ndef get_test_loader(dataset, height, width, batch_size, workers, testset=None):\n    normalizer = T.Normalize(mean=[0.485, 0.456, 0.406],\n                             std=[0.229, 0.224, 0.225])\n\n    test_transformer = T.Compose([\n        T.Resize((height, width), interpolation=3),\n        T.ToTensor(),\n        normalizer\n    ])\n\n    if testset is None:\n        testset = list(set(dataset.query) | set(dataset.gallery))\n\n    test_loader = DataLoader(\n        Preprocessor(testset, root=dataset.images_dir, transform=test_transformer),\n        batch_size=batch_size, num_workers=workers,\n        shuffle=False, pin_memory=True)\n\n    return test_loader\n\n\ndef create_model(args):\n    model = models.create(args.arch, num_features=args.features, norm=True, dropout=args.dropout,\n                          num_classes=0, pooling_type=args.pooling_type)\n    # use CUDA\n    model.cuda()\n    model = nn.DataParallel(model)\n    return model\n\n\ndef main():\n    args = parser.parse_args()\n\n    if args.seed is not None:\n        random.seed(args.seed)\n        np.random.seed(args.seed)\n        torch.manual_seed(args.seed)\n        cudnn.deterministic = True\n\n    main_worker(args)\n\n\ndef main_worker(args):\n    global start_epoch, best_mAP\n    start_time = time.monotonic()\n\n    cudnn.benchmark = True\n\n    sys.stdout = 
Logger(osp.join(args.logs_dir, 'log.txt'))\n    print(\"==========\\nArgs:{}\\n==========\".format(args))\n\n    # Create datasets\n    iters = args.iters if (args.iters > 0) else None\n    print(\"==> Load unlabeled dataset\")\n    dataset = get_data(args.dataset, args.data_dir)\n    test_loader = get_test_loader(dataset, args.height, args.width, args.batch_size, args.workers)\n\n    # Create model\n    model = create_model(args)\n\n    # Evaluator\n    evaluator = Evaluator(model)\n\n    # Optimizer\n    params = [{\"params\": [value]} for _, value in model.named_parameters() if value.requires_grad]\n    optimizer = torch.optim.Adam(params, lr=args.lr, weight_decay=args.weight_decay)\n    lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=args.step_size, gamma=0.1)\n\n    # Trainer\n    trainer = ClusterContrastTrainer(model)\n\n    for epoch in range(args.epochs):\n        with torch.no_grad():\n            print('==> Create pseudo labels for unlabeled data')\n            cluster_loader = get_test_loader(dataset, args.height, args.width,\n                                             args.batch_size, args.workers, testset=sorted(dataset.train))\n\n            features, _ = extract_features(model, cluster_loader, print_freq=50)\n            features = torch.cat([features[f].unsqueeze(0) for f, _, _ in sorted(dataset.train)], 0)\n            rerank_dist = compute_jaccard_distance(features, k1=args.k1, k2=args.k2)\n\n            if epoch == 0:\n                # DBSCAN cluster\n                eps = args.eps\n                print('Clustering criterion: eps: {:.3f}'.format(eps))\n                cluster = DBSCAN(eps=eps, min_samples=4, metric='precomputed', n_jobs=-1)\n\n            # select & cluster images as training set of this epochs\n            pseudo_labels = cluster.fit_predict(rerank_dist)\n            num_cluster = len(set(pseudo_labels)) - (1 if -1 in pseudo_labels else 0)\n\n            # print(\"epoch: {} \\n pseudo_labels: 
{}\".format(epoch, pseudo_labels.tolist()[:100]))\n\n        # generate new dataset and calculate cluster centers\n        @torch.no_grad()\n        def generate_cluster_features(labels, features):\n            centers = collections.defaultdict(list)\n            for i, label in enumerate(labels):\n                if label == -1:\n                    continue\n                centers[labels[i]].append(features[i])\n\n            centers = [\n                torch.stack(centers[idx], dim=0).mean(0) for idx in sorted(centers.keys())\n            ]\n\n            centers = torch.stack(centers, dim=0)\n            return centers\n\n        cluster_features = generate_cluster_features(pseudo_labels, features)\n\n        del cluster_loader, features\n\n        # Create hybrid memory\n        memory = ClusterMemory(model.module.num_features, num_cluster, temp=args.temp,\n                               momentum=args.momentum, use_hard=args.use_hard).cuda()\n        memory.features = F.normalize(cluster_features, dim=1).cuda()\n\n        trainer.memory = memory\n\n        pseudo_labeled_dataset = []\n        for i, ((fname, _, cid), label) in enumerate(zip(sorted(dataset.train), pseudo_labels)):\n            if label != -1:\n                pseudo_labeled_dataset.append((fname, label.item(), cid))\n\n        print('==> Statistics for epoch {}: {} clusters'.format(epoch, num_cluster))\n\n        train_loader = get_train_loader(args, dataset, args.height, args.width,\n                                        args.batch_size, args.workers, args.num_instances, iters,\n                                        trainset=pseudo_labeled_dataset, no_cam=args.no_cam)\n\n        train_loader.new_epoch()\n\n        trainer.train(epoch, train_loader, optimizer,\n                      print_freq=args.print_freq, train_iters=len(train_loader))\n\n        if (epoch + 1) % args.eval_step == 0 or (epoch == args.epochs - 1):\n            mAP = evaluator.evaluate(test_loader, dataset.query, 
dataset.gallery, cmc_flag=False)\n            is_best = (mAP > best_mAP)\n            best_mAP = max(mAP, best_mAP)\n            save_checkpoint({\n                'state_dict': model.state_dict(),\n                'epoch': epoch + 1,\n                'best_mAP': best_mAP,\n            }, is_best, fpath=osp.join(args.logs_dir, 'checkpoint.pth.tar'))\n\n            print('\\n * Finished epoch {:3d}  model mAP: {:5.1%}  best: {:5.1%}{}\\n'.\n                  format(epoch, mAP, best_mAP, ' *' if is_best else ''))\n\n        lr_scheduler.step()\n\n    print('==> Test with the best model:')\n    checkpoint = load_checkpoint(osp.join(args.logs_dir, 'model_best.pth.tar'))\n    model.load_state_dict(checkpoint['state_dict'])\n    evaluator.evaluate(test_loader, dataset.query, dataset.gallery, cmc_flag=True)\n\n    end_time = time.monotonic()\n    print('Total running time: ', timedelta(seconds=end_time - start_time))\n\n\nif __name__ == '__main__':\n    parser = argparse.ArgumentParser(description=\"Self-paced contrastive learning on unsupervised re-ID\")\n    # data\n    parser.add_argument('-d', '--dataset', type=str, default='dukemtmcreid',\n                        choices=datasets.names())\n    parser.add_argument('-b', '--batch-size', type=int, default=2)\n    parser.add_argument('-j', '--workers', type=int, default=4)\n    parser.add_argument('--height', type=int, default=256, help=\"input height\")\n    parser.add_argument('--width', type=int, default=128, help=\"input width\")\n    parser.add_argument('--num-instances', type=int, default=4,\n                        help=\"each minibatch consists of \"\n                             \"(batch_size // num_instances) identities, and \"\n                             \"each identity has num_instances instances, \"\n                             \"set 0 to disable\")\n    # cluster\n    parser.add_argument('--eps', type=float, default=0.6,\n                        help=\"max neighbor distance for DBSCAN\")\n    
parser.add_argument('--eps-gap', type=float, default=0.02,\n                        help=\"multi-scale criterion for measuring cluster reliability\")\n    parser.add_argument('--k1', type=int, default=30,\n                        help=\"hyperparameter for jaccard distance\")\n    parser.add_argument('--k2', type=int, default=6,\n                        help=\"hyperparameter for jaccard distance\")\n\n    # model\n    parser.add_argument('-a', '--arch', type=str, default='resnet50',\n                        choices=models.names())\n    parser.add_argument('--features', type=int, default=0)\n    parser.add_argument('--dropout', type=float, default=0)\n    parser.add_argument('--momentum', type=float, default=0.2,\n                        help=\"update momentum for the hybrid memory\")\n    # optimizer\n    parser.add_argument('--lr', type=float, default=0.00035,\n                        help=\"learning rate\")\n    parser.add_argument('--weight-decay', type=float, default=5e-4)\n    parser.add_argument('--epochs', type=int, default=50)\n    parser.add_argument('--iters', type=int, default=400)\n    parser.add_argument('--step-size', type=int, default=20)\n    # training configs\n    parser.add_argument('--seed', type=int, default=1)\n    parser.add_argument('--print-freq', type=int, default=10)\n    parser.add_argument('--eval-step', type=int, default=10)\n    parser.add_argument('--temp', type=float, default=0.05,\n                        help=\"temperature for scaling contrastive loss\")\n    # path\n    working_dir = osp.dirname(osp.abspath(__file__))\n    parser.add_argument('--data-dir', type=str, metavar='PATH',\n                        default=osp.join(working_dir, 'data'))\n    parser.add_argument('--logs-dir', type=str, metavar='PATH',\n                        default=osp.join(working_dir, 'logs'))\n    parser.add_argument('--pooling-type', type=str, default='gem')\n    parser.add_argument('--use-hard', action=\"store_true\")\n    
parser.add_argument('--no-cam',  action=\"store_true\")\n\n    main()\n"
  },
  {
    "path": "examples/cluster_contrast_train_usl_infomap.py",
    "content": "# -*- coding: utf-8 -*-\nfrom __future__ import print_function, absolute_import\nimport argparse\nimport os.path as osp\nimport random\nimport numpy as np\nimport sys\nimport collections\nimport time\nfrom datetime import timedelta\n\nimport torch\nfrom torch import nn\nfrom torch.backends import cudnn\nfrom torch.utils.data import DataLoader\nimport torch.nn.functional as F\n\nfrom clustercontrast import datasets\nfrom clustercontrast import models\nfrom clustercontrast.models.cm import ClusterMemory\nfrom clustercontrast.trainers import ClusterContrastTrainer\nfrom clustercontrast.evaluators import Evaluator, extract_features\nfrom clustercontrast.utils.data import IterLoader\nfrom clustercontrast.utils.data import transforms as T\nfrom clustercontrast.utils.data.sampler import RandomMultipleGallerySampler, RandomMultipleGallerySamplerNoCam\nfrom clustercontrast.utils.data.preprocessor import Preprocessor\nfrom clustercontrast.utils.logging import Logger\nfrom clustercontrast.utils.serialization import load_checkpoint, save_checkpoint\nfrom clustercontrast.utils.infomap_cluster import get_dist_nbr, cluster_by_infomap\n\nstart_epoch = best_mAP = 0\n\n\ndef str2bool(v):\n    if isinstance(v, bool):\n        return v\n    if v.lower() in ('yes', 'true', 't', 'y', '1'):\n        return True\n    elif v.lower() in ('no', 'false', 'f', 'n', '0'):\n        return False\n    else:\n        raise argparse.ArgumentTypeError('Boolean value expected.')\n\ndef get_data(name, data_dir):\n    root = osp.join(data_dir, name)\n    dataset = datasets.create(name, root)\n    return dataset\n\n\ndef get_train_loader(args, dataset, height, width, batch_size, workers,\n                     num_instances, iters, trainset=None, no_cam=False):\n\n    normalizer = T.Normalize(mean=[0.485, 0.456, 0.406],\n                             std=[0.229, 0.224, 0.225])\n\n    train_transformer = T.Compose([\n        T.Resize((height, width), interpolation=3),\n        
T.RandomHorizontalFlip(p=0.5),\n        T.Pad(10),\n        T.RandomCrop((height, width)),\n        T.ToTensor(),\n        normalizer,\n        T.RandomErasing(probability=0.5, mean=[0.485, 0.456, 0.406])\n    ])\n\n    train_set = sorted(dataset.train) if trainset is None else sorted(trainset)\n    rmgs_flag = num_instances > 0\n    if rmgs_flag:\n        if no_cam:\n            sampler = RandomMultipleGallerySamplerNoCam(train_set, num_instances)\n        else:\n            sampler = RandomMultipleGallerySampler(train_set, num_instances)\n    else:\n        sampler = None\n    train_loader = IterLoader(\n        DataLoader(Preprocessor(train_set, root=dataset.images_dir, transform=train_transformer),\n                   batch_size=batch_size, num_workers=workers, sampler=sampler,\n                   shuffle=not rmgs_flag, pin_memory=True, drop_last=True), length=iters)\n    return train_loader\n\n\ndef get_test_loader(dataset, height, width, batch_size, workers, testset=None):\n\n    normalizer = T.Normalize(mean=[0.485, 0.456, 0.406],\n                             std=[0.229, 0.224, 0.225])\n\n    test_transformer = T.Compose([\n        T.Resize((height, width), interpolation=3),\n        T.ToTensor(),\n        normalizer\n    ])\n\n    if testset is None:\n        testset = list(set(dataset.query) | set(dataset.gallery))\n\n    test_loader = DataLoader(\n        Preprocessor(testset, root=dataset.images_dir, transform=test_transformer),\n        batch_size=batch_size, num_workers=workers,\n        shuffle=False, pin_memory=True)\n\n    return test_loader\n\n\ndef create_model(args):\n    model = models.create(args.arch, num_features=args.features, norm=True, dropout=args.dropout,\n                          num_classes=0, pooling_type=args.pooling_type)\n    # use CUDA\n    model.cuda()\n    model = nn.DataParallel(model)\n    return model\n\n\ndef main():\n    args = parser.parse_args()\n\n    if args.seed is not None:\n        random.seed(args.seed)\n        
np.random.seed(args.seed)\n        torch.manual_seed(args.seed)\n        cudnn.deterministic = True\n\n    main_worker(args)\n\n\ndef main_worker(args):\n    global start_epoch, best_mAP\n    start_time = time.monotonic()\n\n    cudnn.benchmark = True\n\n    sys.stdout = Logger(osp.join(args.logs_dir, 'log.txt'))\n    print(\"==========\\nArgs:{}\\n==========\".format(args))\n\n    # Create datasets\n    iters = args.iters if (args.iters > 0) else None\n    print(\"==> Load unlabeled dataset\")\n    dataset = get_data(args.dataset, args.data_dir)\n    test_loader = get_test_loader(dataset, args.height, args.width, args.batch_size, args.workers)\n\n    # Create model\n    model = create_model(args)\n    # Evaluator\n    evaluator = Evaluator(model)\n\n    # Optimizer\n    params = [{\"params\": [value]} for _, value in model.named_parameters() if value.requires_grad]\n    optimizer = torch.optim.Adam(params, lr=args.lr, weight_decay=args.weight_decay)\n\n    lr_scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=args.step_size, gamma=0.1)\n    # Trainer\n    trainer = ClusterContrastTrainer(model)\n\n    for epoch in range(args.epochs):\n        with torch.no_grad():\n            print('==> Create pseudo labels for unlabeled data')\n            cluster_loader = get_test_loader(dataset, args.height, args.width,\n                                             args.batch_size, args.workers, testset=sorted(dataset.train))\n\n            features, _ = extract_features(model, cluster_loader, print_freq=50)\n            features = torch.cat([features[f].unsqueeze(0) for f, _, _ in sorted(dataset.train)], 0)\n\n            features_array = F.normalize(features, dim=1).cpu().numpy()\n            feat_dists, feat_nbrs = get_dist_nbr(features=features_array, k=args.k1, knn_method='faiss-gpu')\n            del features_array\n\n            s = time.time()\n            pseudo_labels = cluster_by_infomap(feat_nbrs, feat_dists, min_sim=args.eps, cluster_num=args.k2)\n   
         pseudo_labels = pseudo_labels.astype(np.intp)\n\n            print('cluster cost time: {}'.format(time.time() - s))\n            num_cluster = len(set(pseudo_labels)) - (1 if -1 in pseudo_labels else 0)\n\n        # generate new dataset and calculate cluster centers\n        @torch.no_grad()\n        def generate_cluster_features(labels, features):\n            centers = collections.defaultdict(list)\n\n            for i, label in enumerate(labels):\n                if label == -1:\n                    continue\n                centers[labels[i]].append(features[i])\n\n            centers = [\n                torch.stack(centers[idx], dim=0).mean(0) for idx in sorted(centers.keys())\n            ]\n\n            centers = torch.stack(centers, dim=0)\n            return centers\n\n        cluster_features = generate_cluster_features(pseudo_labels, features)\n\n        del cluster_loader, features\n\n        # Create hybrid memory\n        memory = ClusterMemory(model.module.num_features, num_cluster, temp=args.temp,\n                               momentum=args.momentum, use_hard=args.use_hard).cuda()\n\n        memory.features = F.normalize(cluster_features, dim=1).cuda()\n        trainer.memory = memory\n        pseudo_labeled_dataset = []\n\n        for i, ((fname, _, cid), label) in enumerate(zip(sorted(dataset.train), pseudo_labels)):\n            if label != -1:\n                pseudo_labeled_dataset.append((fname, label.item(), cid))\n\n        print('==> Statistics for epoch {}: {} clusters'.format(epoch, num_cluster))\n\n        train_loader = get_train_loader(args, dataset, args.height, args.width,\n                                        args.batch_size, args.workers, args.num_instances, iters,\n                                        trainset=pseudo_labeled_dataset, no_cam=args.no_cam)\n\n        train_loader.new_epoch()\n        trainer.train(epoch, train_loader, optimizer,\n                      print_freq=args.print_freq, 
train_iters=len(train_loader))\n\n        if (epoch + 1) % args.eval_step == 0 or (epoch == args.epochs - 1):\n            mAP = evaluator.evaluate(test_loader, dataset.query, dataset.gallery, cmc_flag=False)\n            is_best = (mAP > best_mAP)\n            best_mAP = max(mAP, best_mAP)\n\n            save_checkpoint({\n                'state_dict': model.state_dict(),\n                'epoch': epoch + 1,\n                'best_mAP': best_mAP,\n            }, is_best, fpath=osp.join(args.logs_dir, 'checkpoint.pth.tar'))\n\n            print('\\n * Finished epoch {:3d}  model mAP: {:5.1%}  best: {:5.1%}{}\\n'.\n                  format(epoch, mAP, best_mAP, ' *' if is_best else ''))\n\n        lr_scheduler.step()\n\n    print('==> Test with the best model:')\n    checkpoint = load_checkpoint(osp.join(args.logs_dir, 'model_best.pth.tar'))\n    model.load_state_dict(checkpoint['state_dict'])\n    evaluator.evaluate(test_loader, dataset.query, dataset.gallery, cmc_flag=True)\n\n    end_time = time.monotonic()\n    print('Total running time: ', timedelta(seconds=end_time - start_time))\n\n\nif __name__ == '__main__':\n    parser = argparse.ArgumentParser(description=\"Self-paced contrastive learning on unsupervised re-ID\")\n    # data\n    parser.add_argument('-d', '--dataset', type=str, default='dukemtmcreid',\n                        choices=datasets.names())\n    parser.add_argument('-b', '--batch-size', type=int, default=2)\n    parser.add_argument('-j', '--workers', type=int, default=4)\n    parser.add_argument('--height', type=int, default=256, help=\"input height\")\n    parser.add_argument('--width', type=int, default=128, help=\"input width\")\n    parser.add_argument('--num-instances', type=int, default=4,\n                        help=\"each minibatch consists of \"\n                             \"(batch_size // num_instances) identities, and \"\n                             \"each identity has num_instances instances, \"\n                             
\"default: 4\")\n    # cluster\n    parser.add_argument('--eps', type=float, default=0.5,\n                        help=\"max neighbor distance for DBSCAN\")\n    parser.add_argument('--eps-gap', type=float, default=0.02,\n                        help=\"multi-scale criterion for measuring cluster reliability\")\n    parser.add_argument('--k1', type=int, default=15,\n                        help=\"hyperparameter for KNN\")\n    parser.add_argument('--k2', type=int, default=4,\n                        help=\"hyperparameter for outlier detection\")\n    # model\n    parser.add_argument('-a', '--arch', type=str, default='resnet50',\n                        choices=models.names())\n    parser.add_argument('--features', type=int, default=0)\n    parser.add_argument('--dropout', type=float, default=0)\n    parser.add_argument('--momentum', type=float, default=0.2,\n                        help=\"update momentum for the hybrid memory\")\n    # optimizer\n    parser.add_argument('--lr', type=float, default=0.00035,\n                        help=\"learning rate\")\n    parser.add_argument('--weight-decay', type=float, default=5e-4)\n    parser.add_argument('--epochs', type=int, default=50)\n    parser.add_argument('--iters', type=int, default=400)\n    parser.add_argument('--step-size', type=int, default=20)\n\n    # training configs\n    parser.add_argument('--seed', type=int, default=1)\n    parser.add_argument('--print-freq', type=int, default=10)\n    parser.add_argument('--eval-step', type=int, default=10)\n    parser.add_argument('--temp', type=float, default=0.05,\n                        help=\"temperature for scaling contrastive loss\")\n    # path\n    working_dir = osp.dirname(osp.abspath(__file__))\n    parser.add_argument('--data-dir', type=str, metavar='PATH',\n                        default=osp.join(working_dir, 'data'))\n    parser.add_argument('--logs-dir', type=str, metavar='PATH',\n                        default=osp.join(working_dir, 'logs'))\n    
parser.add_argument('--pooling-type', type=str, default='gem')\n    parser.add_argument('--use-hard', action=\"store_true\")\n    parser.add_argument('--no-cam', action=\"store_true\")\n    main()\n"
  },
  {
    "path": "examples/logs/log.txt",
    "content": "==========\nArgs:Namespace(arch='resnet_ibn50a', batch_size=256, data_dir='/data0/developer/cluster-contrast/examples/data', dataset='market1501', dropout=0, epochs=50, eps=0.4, eps_gap=0.02, eval_step=10, features=0, height=256, iters=400, k1=30, k2=6, logs_dir='/data0/developer/cluster-contrast/examples/logs/gem-hard', lr=0.00035, momentum=0.1, num_instances=16, pooling_type='gem', print_freq=10, seed=1, step_size=20, temp=0.05, use_hard=True, weight_decay=0.0005, width=128, workers=4)\n==========\n==> Load unlabeled dataset\n=> Market1501 loaded\nDataset statistics:\n  ----------------------------------------\n  subset   | # ids | # images | # cameras\n  ----------------------------------------\n  train    |   751 |    12936 |         6\n  query    |   750 |     3368 |         6\n  gallery  |   751 |    15913 |         6\n  ----------------------------------------\npooling_type: gem\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.379)\tData 0.000 (0.019)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 23.047871351242065\nClustering criterion: eps: 0.400\n==> Statistics for epoch 0: 80 clusters\nEpoch: [0][10/400]\tTime 0.370 (0.775)\tData 0.000 (0.211)\tLoss 4.447 (3.906)\nEpoch: [0][20/400]\tTime 0.362 (0.733)\tData 0.000 (0.271)\tLoss 3.082 (3.512)\nEpoch: [0][30/400]\tTime 0.361 (0.722)\tData 0.000 (0.293)\tLoss 3.117 (3.283)\nEpoch: [0][40/400]\tTime 0.361 (0.714)\tData 0.000 (0.301)\tLoss 3.093 (3.113)\nEpoch: [0][50/400]\tTime 0.362 (0.712)\tData 0.000 (0.309)\tLoss 2.184 (2.973)\nEpoch: [0][60/400]\tTime 0.361 (0.711)\tData 0.000 (0.315)\tLoss 2.049 (2.829)\nEpoch: [0][70/400]\tTime 0.354 (0.711)\tData 0.000 (0.319)\tLoss 2.118 (2.714)\nEpoch: [0][80/400]\tTime 0.362 (0.709)\tData 0.000 (0.321)\tLoss 2.413 (2.657)\nEpoch: [0][90/400]\tTime 0.360 (0.707)\tData 0.000 (0.321)\tLoss 2.092 (2.576)\nEpoch: [0][100/400]\tTime 0.362 (0.706)\tData 0.000 (0.322)\tLoss 1.619 
(2.488)\nEpoch: [0][110/400]\tTime 0.374 (0.704)\tData 0.000 (0.322)\tLoss 2.173 (2.423)\nEpoch: [0][120/400]\tTime 0.351 (0.703)\tData 0.000 (0.323)\tLoss 1.402 (2.359)\nEpoch: [0][130/400]\tTime 0.369 (0.703)\tData 0.000 (0.324)\tLoss 2.615 (2.307)\nEpoch: [0][140/400]\tTime 0.368 (0.703)\tData 0.000 (0.325)\tLoss 1.323 (2.262)\nEpoch: [0][150/400]\tTime 0.369 (0.703)\tData 0.000 (0.325)\tLoss 1.811 (2.209)\nEpoch: [0][160/400]\tTime 0.369 (0.704)\tData 0.000 (0.326)\tLoss 1.369 (2.153)\nEpoch: [0][170/400]\tTime 0.368 (0.704)\tData 0.000 (0.327)\tLoss 1.734 (2.109)\nEpoch: [0][180/400]\tTime 0.369 (0.704)\tData 0.000 (0.327)\tLoss 0.889 (2.057)\nEpoch: [0][190/400]\tTime 0.366 (0.704)\tData 0.000 (0.328)\tLoss 1.458 (2.011)\nEpoch: [0][200/400]\tTime 0.361 (0.705)\tData 0.000 (0.329)\tLoss 1.066 (1.967)\nEpoch: [0][210/400]\tTime 0.363 (0.705)\tData 0.000 (0.329)\tLoss 1.535 (1.928)\nEpoch: [0][220/400]\tTime 0.362 (0.705)\tData 0.000 (0.330)\tLoss 1.193 (1.898)\nEpoch: [0][230/400]\tTime 0.362 (0.705)\tData 0.000 (0.330)\tLoss 1.210 (1.862)\nEpoch: [0][240/400]\tTime 0.360 (0.704)\tData 0.000 (0.330)\tLoss 1.218 (1.829)\nEpoch: [0][250/400]\tTime 0.363 (0.704)\tData 0.000 (0.330)\tLoss 0.901 (1.800)\nEpoch: [0][260/400]\tTime 0.363 (0.704)\tData 0.000 (0.330)\tLoss 1.281 (1.770)\nEpoch: [0][270/400]\tTime 0.360 (0.704)\tData 0.000 (0.331)\tLoss 1.332 (1.740)\nEpoch: [0][280/400]\tTime 0.362 (0.704)\tData 0.000 (0.331)\tLoss 1.559 (1.710)\nEpoch: [0][290/400]\tTime 0.362 (0.704)\tData 0.000 (0.331)\tLoss 0.889 (1.682)\nEpoch: [0][300/400]\tTime 0.362 (0.704)\tData 0.000 (0.331)\tLoss 1.134 (1.657)\nEpoch: [0][310/400]\tTime 0.363 (0.703)\tData 0.000 (0.331)\tLoss 1.001 (1.630)\nEpoch: [0][320/400]\tTime 0.372 (0.703)\tData 0.000 (0.331)\tLoss 0.619 (1.602)\nEpoch: [0][330/400]\tTime 0.361 (0.703)\tData 0.000 (0.332)\tLoss 0.942 (1.576)\nEpoch: [0][340/400]\tTime 0.363 (0.703)\tData 0.000 (0.332)\tLoss 0.700 (1.552)\nEpoch: [0][350/400]\tTime 0.363 (0.703)\tData 
0.000 (0.332)\tLoss 0.434 (1.530)\nEpoch: [0][360/400]\tTime 0.362 (0.703)\tData 0.000 (0.332)\tLoss 0.739 (1.507)\nEpoch: [0][370/400]\tTime 0.364 (0.703)\tData 0.000 (0.332)\tLoss 0.631 (1.485)\nEpoch: [0][380/400]\tTime 0.365 (0.702)\tData 0.000 (0.332)\tLoss 0.529 (1.465)\nEpoch: [0][390/400]\tTime 0.364 (0.702)\tData 0.000 (0.332)\tLoss 0.699 (1.445)\nEpoch: [0][400/400]\tTime 0.363 (0.702)\tData 0.000 (0.332)\tLoss 0.945 (1.426)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.130 (0.158)\tData 0.000 (0.027)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 20.340964555740356\n==> Statistics for epoch 1: 286 clusters\nEpoch: [1][10/400]\tTime 0.352 (0.418)\tData 0.001 (0.053)\tLoss 0.530 (0.732)\nEpoch: [1][20/400]\tTime 0.360 (0.480)\tData 0.000 (0.117)\tLoss 3.473 (1.095)\nEpoch: [1][30/400]\tTime 0.361 (0.441)\tData 0.000 (0.078)\tLoss 3.468 (1.908)\nEpoch: [1][40/400]\tTime 0.363 (0.461)\tData 0.000 (0.099)\tLoss 3.717 (2.348)\nEpoch: [1][50/400]\tTime 0.366 (0.442)\tData 0.000 (0.079)\tLoss 3.763 (2.694)\nEpoch: [1][60/400]\tTime 0.363 (0.456)\tData 0.000 (0.094)\tLoss 3.590 (2.891)\nEpoch: [1][70/400]\tTime 0.345 (0.468)\tData 0.000 (0.106)\tLoss 3.950 (3.029)\nEpoch: [1][80/400]\tTime 0.362 (0.455)\tData 0.000 (0.093)\tLoss 3.343 (3.112)\nEpoch: [1][90/400]\tTime 0.362 (0.465)\tData 0.000 (0.102)\tLoss 3.418 (3.184)\nEpoch: [1][100/400]\tTime 0.363 (0.454)\tData 0.000 (0.092)\tLoss 3.445 (3.229)\nEpoch: [1][110/400]\tTime 0.362 (0.462)\tData 0.001 (0.099)\tLoss 3.722 (3.268)\nEpoch: [1][120/400]\tTime 2.066 (0.468)\tData 1.640 (0.105)\tLoss 2.855 (3.295)\nEpoch: [1][130/400]\tTime 0.362 (0.460)\tData 0.000 (0.097)\tLoss 3.671 (3.306)\nEpoch: [1][140/400]\tTime 0.363 (0.465)\tData 0.000 (0.102)\tLoss 2.616 (3.296)\nEpoch: [1][150/400]\tTime 0.363 (0.458)\tData 0.000 (0.095)\tLoss 3.351 (3.290)\nEpoch: [1][160/400]\tTime 0.363 (0.463)\tData 0.000 (0.100)\tLoss 3.071 (3.285)\nEpoch: [1][170/400]\tTime 
0.363 (0.458)\tData 0.000 (0.094)\tLoss 3.275 (3.282)\nEpoch: [1][180/400]\tTime 0.364 (0.461)\tData 0.000 (0.098)\tLoss 3.301 (3.270)\nEpoch: [1][190/400]\tTime 0.381 (0.466)\tData 0.001 (0.102)\tLoss 2.709 (3.262)\nEpoch: [1][200/400]\tTime 0.365 (0.460)\tData 0.000 (0.097)\tLoss 3.041 (3.249)\nEpoch: [1][210/400]\tTime 0.370 (0.464)\tData 0.001 (0.100)\tLoss 3.036 (3.237)\nEpoch: [1][220/400]\tTime 0.369 (0.460)\tData 0.000 (0.096)\tLoss 3.528 (3.245)\nEpoch: [1][230/400]\tTime 0.362 (0.463)\tData 0.001 (0.099)\tLoss 2.933 (3.224)\nEpoch: [1][240/400]\tTime 0.356 (0.466)\tData 0.001 (0.101)\tLoss 2.484 (3.218)\nEpoch: [1][250/400]\tTime 0.362 (0.462)\tData 0.000 (0.097)\tLoss 2.455 (3.207)\nEpoch: [1][260/400]\tTime 0.414 (0.464)\tData 0.001 (0.100)\tLoss 2.943 (3.191)\nEpoch: [1][270/400]\tTime 0.369 (0.461)\tData 0.000 (0.096)\tLoss 2.580 (3.175)\nEpoch: [1][280/400]\tTime 0.352 (0.464)\tData 0.000 (0.099)\tLoss 2.666 (3.156)\nEpoch: [1][290/400]\tTime 1.998 (0.466)\tData 1.634 (0.101)\tLoss 2.325 (3.150)\nEpoch: [1][300/400]\tTime 0.363 (0.463)\tData 0.000 (0.098)\tLoss 3.091 (3.134)\nEpoch: [1][310/400]\tTime 0.361 (0.465)\tData 0.000 (0.100)\tLoss 1.971 (3.123)\nEpoch: [1][320/400]\tTime 0.364 (0.462)\tData 0.000 (0.097)\tLoss 3.067 (3.111)\nEpoch: [1][330/400]\tTime 0.362 (0.464)\tData 0.001 (0.099)\tLoss 2.665 (3.097)\nEpoch: [1][340/400]\tTime 0.363 (0.461)\tData 0.000 (0.096)\tLoss 2.564 (3.083)\nEpoch: [1][350/400]\tTime 0.363 (0.463)\tData 0.000 (0.098)\tLoss 2.617 (3.072)\nEpoch: [1][360/400]\tTime 0.361 (0.465)\tData 0.001 (0.100)\tLoss 2.341 (3.057)\nEpoch: [1][370/400]\tTime 0.365 (0.463)\tData 0.000 (0.098)\tLoss 2.255 (3.041)\nEpoch: [1][380/400]\tTime 0.364 (0.464)\tData 0.001 (0.099)\tLoss 2.267 (3.029)\nEpoch: [1][390/400]\tTime 0.366 (0.462)\tData 0.000 (0.097)\tLoss 2.701 (3.015)\nEpoch: [1][400/400]\tTime 0.376 (0.464)\tData 0.001 (0.099)\tLoss 1.968 (3.001)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 
0.130 (0.160)\tData 0.000 (0.031)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 21.161827564239502\n==> Statistics for epoch 2: 561 clusters\nEpoch: [2][10/400]\tTime 0.362 (0.422)\tData 0.001 (0.054)\tLoss 0.609 (0.845)\nEpoch: [2][20/400]\tTime 0.361 (0.392)\tData 0.001 (0.027)\tLoss 0.627 (0.767)\nEpoch: [2][30/400]\tTime 0.361 (0.382)\tData 0.000 (0.018)\tLoss 0.408 (0.696)\nEpoch: [2][40/400]\tTime 0.363 (0.420)\tData 0.001 (0.056)\tLoss 4.156 (1.033)\nEpoch: [2][50/400]\tTime 0.362 (0.408)\tData 0.001 (0.045)\tLoss 3.165 (1.537)\nEpoch: [2][60/400]\tTime 0.386 (0.401)\tData 0.000 (0.038)\tLoss 4.043 (1.907)\nEpoch: [2][70/400]\tTime 0.363 (0.396)\tData 0.000 (0.032)\tLoss 4.042 (2.221)\nEpoch: [2][80/400]\tTime 0.364 (0.415)\tData 0.001 (0.051)\tLoss 3.583 (2.379)\nEpoch: [2][90/400]\tTime 0.364 (0.409)\tData 0.000 (0.046)\tLoss 3.396 (2.509)\nEpoch: [2][100/400]\tTime 0.365 (0.405)\tData 0.000 (0.041)\tLoss 3.422 (2.619)\nEpoch: [2][110/400]\tTime 0.363 (0.418)\tData 0.001 (0.054)\tLoss 4.696 (2.711)\nEpoch: [2][120/400]\tTime 0.363 (0.414)\tData 0.001 (0.050)\tLoss 3.733 (2.794)\nEpoch: [2][130/400]\tTime 0.391 (0.410)\tData 0.001 (0.046)\tLoss 3.354 (2.860)\nEpoch: [2][140/400]\tTime 0.365 (0.407)\tData 0.000 (0.043)\tLoss 3.528 (2.919)\nEpoch: [2][150/400]\tTime 0.362 (0.416)\tData 0.001 (0.051)\tLoss 4.046 (2.952)\nEpoch: [2][160/400]\tTime 0.364 (0.413)\tData 0.000 (0.048)\tLoss 3.927 (2.989)\nEpoch: [2][170/400]\tTime 0.364 (0.410)\tData 0.000 (0.045)\tLoss 3.270 (3.019)\nEpoch: [2][180/400]\tTime 0.362 (0.416)\tData 0.001 (0.052)\tLoss 3.555 (3.050)\nEpoch: [2][190/400]\tTime 0.364 (0.414)\tData 0.001 (0.049)\tLoss 3.980 (3.071)\nEpoch: [2][200/400]\tTime 0.363 (0.411)\tData 0.001 (0.047)\tLoss 3.768 (3.093)\nEpoch: [2][210/400]\tTime 0.363 (0.409)\tData 0.000 (0.045)\tLoss 3.659 (3.104)\nEpoch: [2][220/400]\tTime 0.364 (0.415)\tData 0.001 (0.050)\tLoss 2.920 (3.109)\nEpoch: [2][230/400]\tTime 0.365 (0.413)\tData 0.001 
(0.048)\tLoss 3.674 (3.115)\nEpoch: [2][240/400]\tTime 0.365 (0.411)\tData 0.000 (0.046)\tLoss 2.956 (3.118)\nEpoch: [2][250/400]\tTime 0.363 (0.416)\tData 0.001 (0.051)\tLoss 2.949 (3.129)\nEpoch: [2][260/400]\tTime 0.363 (0.414)\tData 0.000 (0.049)\tLoss 3.126 (3.132)\nEpoch: [2][270/400]\tTime 0.365 (0.412)\tData 0.001 (0.047)\tLoss 2.763 (3.149)\nEpoch: [2][280/400]\tTime 0.363 (0.410)\tData 0.000 (0.046)\tLoss 2.874 (3.153)\nEpoch: [2][290/400]\tTime 0.350 (0.414)\tData 0.001 (0.050)\tLoss 3.045 (3.151)\nEpoch: [2][300/400]\tTime 0.366 (0.412)\tData 0.001 (0.048)\tLoss 2.985 (3.155)\nEpoch: [2][310/400]\tTime 0.365 (0.411)\tData 0.000 (0.046)\tLoss 3.784 (3.154)\nEpoch: [2][320/400]\tTime 0.364 (0.415)\tData 0.000 (0.050)\tLoss 2.958 (3.151)\nEpoch: [2][330/400]\tTime 0.366 (0.413)\tData 0.001 (0.049)\tLoss 2.676 (3.160)\nEpoch: [2][340/400]\tTime 0.366 (0.412)\tData 0.001 (0.047)\tLoss 3.050 (3.167)\nEpoch: [2][350/400]\tTime 0.363 (0.410)\tData 0.000 (0.046)\tLoss 3.733 (3.167)\nEpoch: [2][360/400]\tTime 0.365 (0.414)\tData 0.001 (0.049)\tLoss 3.433 (3.169)\nEpoch: [2][370/400]\tTime 0.366 (0.413)\tData 0.000 (0.048)\tLoss 3.044 (3.165)\nEpoch: [2][380/400]\tTime 0.364 (0.412)\tData 0.000 (0.047)\tLoss 3.781 (3.168)\nEpoch: [2][390/400]\tTime 0.364 (0.415)\tData 0.000 (0.050)\tLoss 3.573 (3.170)\nEpoch: [2][400/400]\tTime 0.365 (0.414)\tData 0.001 (0.049)\tLoss 3.600 (3.168)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.158)\tData 0.000 (0.030)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 22.187981843948364\n==> Statistics for epoch 3: 669 clusters\nEpoch: [3][10/400]\tTime 0.363 (0.432)\tData 0.001 (0.064)\tLoss 0.745 (0.655)\nEpoch: [3][20/400]\tTime 0.362 (0.396)\tData 0.000 (0.032)\tLoss 0.646 (0.626)\nEpoch: [3][30/400]\tTime 0.363 (0.384)\tData 0.001 (0.022)\tLoss 0.527 (0.619)\nEpoch: [3][40/400]\tTime 0.363 (0.379)\tData 0.000 (0.016)\tLoss 0.470 (0.578)\nEpoch: [3][50/400]\tTime 
0.361 (0.413)\tData 0.000 (0.050)\tLoss 3.554 (1.092)\nEpoch: [3][60/400]\tTime 0.364 (0.405)\tData 0.000 (0.042)\tLoss 3.596 (1.479)\nEpoch: [3][70/400]\tTime 0.360 (0.399)\tData 0.000 (0.036)\tLoss 3.852 (1.760)\nEpoch: [3][80/400]\tTime 0.364 (0.394)\tData 0.000 (0.031)\tLoss 3.392 (1.988)\nEpoch: [3][90/400]\tTime 0.359 (0.410)\tData 0.000 (0.047)\tLoss 3.664 (2.139)\nEpoch: [3][100/400]\tTime 0.364 (0.406)\tData 0.001 (0.043)\tLoss 3.258 (2.264)\nEpoch: [3][110/400]\tTime 0.351 (0.402)\tData 0.000 (0.039)\tLoss 4.049 (2.381)\nEpoch: [3][120/400]\tTime 0.362 (0.398)\tData 0.000 (0.036)\tLoss 3.549 (2.481)\nEpoch: [3][130/400]\tTime 0.356 (0.410)\tData 0.001 (0.047)\tLoss 3.922 (2.562)\nEpoch: [3][140/400]\tTime 0.356 (0.406)\tData 0.001 (0.044)\tLoss 3.350 (2.643)\nEpoch: [3][150/400]\tTime 0.358 (0.403)\tData 0.001 (0.041)\tLoss 3.777 (2.697)\nEpoch: [3][160/400]\tTime 0.363 (0.400)\tData 0.000 (0.038)\tLoss 3.202 (2.751)\nEpoch: [3][170/400]\tTime 0.363 (0.409)\tData 0.000 (0.046)\tLoss 3.472 (2.796)\nEpoch: [3][180/400]\tTime 0.353 (0.406)\tData 0.001 (0.044)\tLoss 4.451 (2.837)\nEpoch: [3][190/400]\tTime 0.355 (0.404)\tData 0.001 (0.042)\tLoss 3.997 (2.881)\nEpoch: [3][200/400]\tTime 0.364 (0.402)\tData 0.000 (0.040)\tLoss 3.018 (2.906)\nEpoch: [3][210/400]\tTime 0.358 (0.408)\tData 0.001 (0.046)\tLoss 4.063 (2.924)\nEpoch: [3][220/400]\tTime 0.394 (0.406)\tData 0.001 (0.044)\tLoss 3.438 (2.944)\nEpoch: [3][230/400]\tTime 0.365 (0.404)\tData 0.000 (0.042)\tLoss 4.328 (2.966)\nEpoch: [3][240/400]\tTime 0.363 (0.403)\tData 0.000 (0.040)\tLoss 3.005 (2.975)\nEpoch: [3][250/400]\tTime 0.357 (0.408)\tData 0.001 (0.046)\tLoss 3.439 (2.991)\nEpoch: [3][260/400]\tTime 0.366 (0.406)\tData 0.001 (0.044)\tLoss 3.608 (3.002)\nEpoch: [3][270/400]\tTime 0.366 (0.405)\tData 0.000 (0.042)\tLoss 3.404 (3.014)\nEpoch: [3][280/400]\tTime 0.363 (0.403)\tData 0.001 (0.041)\tLoss 3.531 (3.033)\nEpoch: [3][290/400]\tTime 0.355 (0.408)\tData 0.000 (0.046)\tLoss 3.654 
(3.046)\nEpoch: [3][300/400]\tTime 0.354 (0.407)\tData 0.000 (0.044)\tLoss 3.869 (3.054)\nEpoch: [3][310/400]\tTime 0.353 (0.405)\tData 0.001 (0.043)\tLoss 3.166 (3.064)\nEpoch: [3][320/400]\tTime 0.365 (0.404)\tData 0.000 (0.042)\tLoss 3.749 (3.077)\nEpoch: [3][330/400]\tTime 0.349 (0.408)\tData 0.001 (0.046)\tLoss 3.487 (3.084)\nEpoch: [3][340/400]\tTime 0.365 (0.407)\tData 0.001 (0.045)\tLoss 3.068 (3.092)\nEpoch: [3][350/400]\tTime 0.366 (0.406)\tData 0.001 (0.043)\tLoss 2.509 (3.098)\nEpoch: [3][360/400]\tTime 0.366 (0.404)\tData 0.000 (0.042)\tLoss 3.394 (3.107)\nEpoch: [3][370/400]\tTime 2.231 (0.408)\tData 1.838 (0.046)\tLoss 2.484 (3.108)\nEpoch: [3][380/400]\tTime 0.404 (0.407)\tData 0.001 (0.045)\tLoss 3.090 (3.111)\nEpoch: [3][390/400]\tTime 0.363 (0.406)\tData 0.000 (0.044)\tLoss 2.966 (3.116)\nEpoch: [3][400/400]\tTime 0.355 (0.405)\tData 0.001 (0.043)\tLoss 2.738 (3.121)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.159)\tData 0.000 (0.029)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 20.534826517105103\n==> Statistics for epoch 4: 718 clusters\nEpoch: [4][10/400]\tTime 0.363 (0.422)\tData 0.001 (0.059)\tLoss 0.710 (0.738)\nEpoch: [4][20/400]\tTime 0.363 (0.390)\tData 0.001 (0.030)\tLoss 0.527 (0.702)\nEpoch: [4][30/400]\tTime 0.364 (0.380)\tData 0.000 (0.020)\tLoss 0.632 (0.662)\nEpoch: [4][40/400]\tTime 0.354 (0.375)\tData 0.000 (0.015)\tLoss 0.507 (0.615)\nEpoch: [4][50/400]\tTime 0.361 (0.407)\tData 0.001 (0.046)\tLoss 3.590 (0.947)\nEpoch: [4][60/400]\tTime 0.365 (0.400)\tData 0.000 (0.038)\tLoss 3.781 (1.381)\nEpoch: [4][70/400]\tTime 0.380 (0.395)\tData 0.001 (0.033)\tLoss 3.560 (1.677)\nEpoch: [4][80/400]\tTime 0.353 (0.391)\tData 0.000 (0.029)\tLoss 3.385 (1.940)\nEpoch: [4][90/400]\tTime 0.336 (0.408)\tData 0.001 (0.046)\tLoss 2.380 (2.110)\nEpoch: [4][100/400]\tTime 0.365 (0.403)\tData 0.000 (0.041)\tLoss 3.410 (2.233)\nEpoch: [4][110/400]\tTime 0.356 (0.399)\tData 0.001 
(0.038)\tLoss 3.001 (2.355)\nEpoch: [4][120/400]\tTime 0.365 (0.396)\tData 0.000 (0.034)\tLoss 3.813 (2.443)\nEpoch: [4][130/400]\tTime 0.364 (0.393)\tData 0.000 (0.032)\tLoss 3.336 (2.522)\nEpoch: [4][140/400]\tTime 0.353 (0.404)\tData 0.001 (0.043)\tLoss 3.169 (2.593)\nEpoch: [4][150/400]\tTime 0.364 (0.402)\tData 0.001 (0.040)\tLoss 3.389 (2.639)\nEpoch: [4][160/400]\tTime 0.365 (0.399)\tData 0.001 (0.037)\tLoss 3.830 (2.694)\nEpoch: [4][170/400]\tTime 0.364 (0.397)\tData 0.000 (0.035)\tLoss 3.357 (2.738)\nEpoch: [4][180/400]\tTime 0.365 (0.405)\tData 0.001 (0.043)\tLoss 2.959 (2.768)\nEpoch: [4][190/400]\tTime 0.367 (0.403)\tData 0.001 (0.041)\tLoss 3.563 (2.808)\nEpoch: [4][200/400]\tTime 0.365 (0.401)\tData 0.000 (0.039)\tLoss 3.389 (2.831)\nEpoch: [4][210/400]\tTime 0.365 (0.399)\tData 0.001 (0.037)\tLoss 3.796 (2.872)\nEpoch: [4][220/400]\tTime 0.364 (0.397)\tData 0.000 (0.036)\tLoss 3.907 (2.902)\nEpoch: [4][230/400]\tTime 0.360 (0.404)\tData 0.001 (0.042)\tLoss 3.402 (2.915)\nEpoch: [4][240/400]\tTime 0.356 (0.402)\tData 0.001 (0.040)\tLoss 3.342 (2.935)\nEpoch: [4][250/400]\tTime 0.366 (0.400)\tData 0.001 (0.038)\tLoss 2.512 (2.949)\nEpoch: [4][260/400]\tTime 0.366 (0.399)\tData 0.000 (0.037)\tLoss 3.143 (2.970)\nEpoch: [4][270/400]\tTime 0.364 (0.404)\tData 0.000 (0.042)\tLoss 2.611 (2.978)\nEpoch: [4][280/400]\tTime 0.351 (0.402)\tData 0.001 (0.040)\tLoss 3.454 (2.994)\nEpoch: [4][290/400]\tTime 0.364 (0.401)\tData 0.001 (0.039)\tLoss 3.459 (3.013)\nEpoch: [4][300/400]\tTime 0.365 (0.400)\tData 0.001 (0.038)\tLoss 4.128 (3.031)\nEpoch: [4][310/400]\tTime 0.348 (0.404)\tData 0.001 (0.042)\tLoss 3.815 (3.045)\nEpoch: [4][320/400]\tTime 0.353 (0.403)\tData 0.001 (0.041)\tLoss 2.844 (3.051)\nEpoch: [4][330/400]\tTime 0.364 (0.402)\tData 0.000 (0.040)\tLoss 2.982 (3.059)\nEpoch: [4][340/400]\tTime 0.363 (0.400)\tData 0.001 (0.039)\tLoss 2.965 (3.071)\nEpoch: [4][350/400]\tTime 0.364 (0.399)\tData 0.000 (0.037)\tLoss 3.538 (3.081)\nEpoch: [4][360/400]\tTime 
0.353 (0.403)\tData 0.000 (0.041)\tLoss 2.585 (3.086)\nEpoch: [4][370/400]\tTime 0.352 (0.402)\tData 0.000 (0.040)\tLoss 3.671 (3.091)\nEpoch: [4][380/400]\tTime 0.353 (0.401)\tData 0.000 (0.039)\tLoss 3.471 (3.093)\nEpoch: [4][390/400]\tTime 0.354 (0.400)\tData 0.000 (0.038)\tLoss 3.564 (3.103)\nEpoch: [4][400/400]\tTime 0.363 (0.403)\tData 0.000 (0.042)\tLoss 3.386 (3.105)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.159)\tData 0.000 (0.029)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 19.915706157684326\n==> Statistics for epoch 5: 738 clusters\nEpoch: [5][10/400]\tTime 0.363 (0.427)\tData 0.001 (0.058)\tLoss 0.871 (0.671)\nEpoch: [5][20/400]\tTime 0.363 (0.395)\tData 0.000 (0.029)\tLoss 0.594 (0.658)\nEpoch: [5][30/400]\tTime 0.361 (0.384)\tData 0.001 (0.020)\tLoss 0.571 (0.646)\nEpoch: [5][40/400]\tTime 0.362 (0.379)\tData 0.000 (0.015)\tLoss 0.434 (0.608)\nEpoch: [5][50/400]\tTime 0.363 (0.411)\tData 0.001 (0.047)\tLoss 3.662 (0.826)\nEpoch: [5][60/400]\tTime 0.363 (0.404)\tData 0.001 (0.039)\tLoss 3.928 (1.278)\nEpoch: [5][70/400]\tTime 0.363 (0.398)\tData 0.001 (0.034)\tLoss 3.518 (1.610)\nEpoch: [5][80/400]\tTime 0.363 (0.394)\tData 0.001 (0.030)\tLoss 4.034 (1.891)\nEpoch: [5][90/400]\tTime 0.360 (0.391)\tData 0.000 (0.026)\tLoss 3.181 (2.090)\nEpoch: [5][100/400]\tTime 0.363 (0.406)\tData 0.001 (0.042)\tLoss 3.906 (2.223)\nEpoch: [5][110/400]\tTime 0.363 (0.402)\tData 0.001 (0.038)\tLoss 2.889 (2.327)\nEpoch: [5][120/400]\tTime 0.364 (0.399)\tData 0.001 (0.035)\tLoss 3.901 (2.436)\nEpoch: [5][130/400]\tTime 0.372 (0.396)\tData 0.001 (0.032)\tLoss 3.618 (2.512)\nEpoch: [5][140/400]\tTime 0.364 (0.406)\tData 0.001 (0.042)\tLoss 3.443 (2.572)\nEpoch: [5][150/400]\tTime 0.362 (0.403)\tData 0.001 (0.039)\tLoss 3.673 (2.634)\nEpoch: [5][160/400]\tTime 0.362 (0.401)\tData 0.001 (0.037)\tLoss 3.939 (2.700)\nEpoch: [5][170/400]\tTime 0.363 (0.398)\tData 0.001 (0.035)\tLoss 3.513 (2.750)\nEpoch: 
[5][180/400]\tTime 0.362 (0.396)\tData 0.000 (0.033)\tLoss 3.440 (2.786)\nEpoch: [5][190/400]\tTime 0.364 (0.404)\tData 0.001 (0.040)\tLoss 3.323 (2.821)\nEpoch: [5][200/400]\tTime 0.379 (0.402)\tData 0.001 (0.038)\tLoss 3.773 (2.852)\nEpoch: [5][210/400]\tTime 0.364 (0.400)\tData 0.001 (0.037)\tLoss 3.739 (2.879)\nEpoch: [5][220/400]\tTime 0.365 (0.399)\tData 0.001 (0.035)\tLoss 2.810 (2.905)\nEpoch: [5][230/400]\tTime 0.363 (0.397)\tData 0.000 (0.033)\tLoss 2.995 (2.926)\nEpoch: [5][240/400]\tTime 0.363 (0.403)\tData 0.001 (0.039)\tLoss 3.483 (2.945)\nEpoch: [5][250/400]\tTime 0.364 (0.402)\tData 0.001 (0.038)\tLoss 2.913 (2.960)\nEpoch: [5][260/400]\tTime 0.364 (0.400)\tData 0.001 (0.036)\tLoss 3.559 (2.984)\nEpoch: [5][270/400]\tTime 0.363 (0.399)\tData 0.000 (0.035)\tLoss 2.889 (2.996)\nEpoch: [5][280/400]\tTime 0.363 (0.404)\tData 0.000 (0.040)\tLoss 3.172 (3.010)\nEpoch: [5][290/400]\tTime 0.397 (0.402)\tData 0.001 (0.038)\tLoss 2.851 (3.031)\nEpoch: [5][300/400]\tTime 0.363 (0.401)\tData 0.001 (0.037)\tLoss 3.221 (3.040)\nEpoch: [5][310/400]\tTime 0.404 (0.400)\tData 0.001 (0.036)\tLoss 4.441 (3.056)\nEpoch: [5][320/400]\tTime 0.362 (0.399)\tData 0.000 (0.035)\tLoss 3.435 (3.065)\nEpoch: [5][330/400]\tTime 0.363 (0.403)\tData 0.000 (0.039)\tLoss 2.926 (3.070)\nEpoch: [5][340/400]\tTime 0.363 (0.402)\tData 0.001 (0.038)\tLoss 3.199 (3.078)\nEpoch: [5][350/400]\tTime 0.363 (0.401)\tData 0.000 (0.037)\tLoss 3.741 (3.090)\nEpoch: [5][360/400]\tTime 0.364 (0.400)\tData 0.000 (0.036)\tLoss 3.576 (3.102)\nEpoch: [5][370/400]\tTime 0.354 (0.404)\tData 0.000 (0.040)\tLoss 3.012 (3.114)\nEpoch: [5][380/400]\tTime 0.362 (0.403)\tData 0.001 (0.039)\tLoss 3.448 (3.122)\nEpoch: [5][390/400]\tTime 0.362 (0.402)\tData 0.001 (0.038)\tLoss 3.214 (3.128)\nEpoch: [5][400/400]\tTime 0.363 (0.401)\tData 0.000 (0.037)\tLoss 2.759 (3.135)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.156)\tData 0.000 (0.028)\t\nComputing jaccard 
distance...\nJaccard distance computing time cost: 21.006665468215942\n==> Statistics for epoch 6: 774 clusters\nEpoch: [6][10/400]\tTime 0.361 (0.421)\tData 0.000 (0.055)\tLoss 0.654 (0.630)\nEpoch: [6][20/400]\tTime 0.361 (0.391)\tData 0.000 (0.028)\tLoss 0.532 (0.637)\nEpoch: [6][30/400]\tTime 0.349 (0.381)\tData 0.001 (0.019)\tLoss 0.706 (0.609)\nEpoch: [6][40/400]\tTime 0.351 (0.376)\tData 0.001 (0.014)\tLoss 0.388 (0.561)\nEpoch: [6][50/400]\tTime 0.351 (0.409)\tData 0.001 (0.046)\tLoss 3.556 (0.656)\nEpoch: [6][60/400]\tTime 0.363 (0.401)\tData 0.001 (0.039)\tLoss 3.634 (1.125)\nEpoch: [6][70/400]\tTime 0.363 (0.394)\tData 0.001 (0.033)\tLoss 3.379 (1.449)\nEpoch: [6][80/400]\tTime 0.363 (0.390)\tData 0.001 (0.029)\tLoss 4.524 (1.719)\nEpoch: [6][90/400]\tTime 0.361 (0.386)\tData 0.000 (0.026)\tLoss 4.387 (1.916)\nEpoch: [6][100/400]\tTime 0.363 (0.401)\tData 0.001 (0.040)\tLoss 3.475 (2.080)\nEpoch: [6][110/400]\tTime 0.362 (0.398)\tData 0.001 (0.037)\tLoss 3.509 (2.204)\nEpoch: [6][120/400]\tTime 0.363 (0.395)\tData 0.000 (0.034)\tLoss 4.077 (2.310)\nEpoch: [6][130/400]\tTime 0.363 (0.392)\tData 0.001 (0.031)\tLoss 3.712 (2.401)\nEpoch: [6][140/400]\tTime 0.363 (0.390)\tData 0.000 (0.029)\tLoss 3.296 (2.459)\nEpoch: [6][150/400]\tTime 0.363 (0.400)\tData 0.001 (0.038)\tLoss 2.994 (2.518)\nEpoch: [6][160/400]\tTime 0.367 (0.397)\tData 0.000 (0.036)\tLoss 3.683 (2.580)\nEpoch: [6][170/400]\tTime 0.363 (0.395)\tData 0.001 (0.034)\tLoss 3.782 (2.641)\nEpoch: [6][180/400]\tTime 0.356 (0.393)\tData 0.000 (0.032)\tLoss 3.207 (2.689)\nEpoch: [6][190/400]\tTime 0.362 (0.392)\tData 0.000 (0.030)\tLoss 3.582 (2.743)\nEpoch: [6][200/400]\tTime 0.354 (0.399)\tData 0.001 (0.038)\tLoss 3.799 (2.784)\nEpoch: [6][210/400]\tTime 0.356 (0.397)\tData 0.001 (0.036)\tLoss 4.227 (2.824)\nEpoch: [6][220/400]\tTime 0.365 (0.396)\tData 0.000 (0.034)\tLoss 3.427 (2.857)\nEpoch: [6][230/400]\tTime 0.365 (0.394)\tData 0.001 (0.033)\tLoss 3.153 (2.878)\nEpoch: [6][240/400]\tTime 0.361 
(0.393)\tData 0.000 (0.032)\tLoss 2.961 (2.900)\nEpoch: [6][250/400]\tTime 0.365 (0.399)\tData 0.001 (0.037)\tLoss 3.756 (2.919)\nEpoch: [6][260/400]\tTime 0.365 (0.398)\tData 0.000 (0.036)\tLoss 3.699 (2.935)\nEpoch: [6][270/400]\tTime 0.363 (0.397)\tData 0.001 (0.035)\tLoss 3.538 (2.951)\nEpoch: [6][280/400]\tTime 0.356 (0.395)\tData 0.000 (0.033)\tLoss 3.931 (2.973)\nEpoch: [6][290/400]\tTime 0.397 (0.401)\tData 0.000 (0.039)\tLoss 3.717 (2.991)\nEpoch: [6][300/400]\tTime 0.364 (0.399)\tData 0.000 (0.037)\tLoss 3.070 (3.006)\nEpoch: [6][310/400]\tTime 0.364 (0.398)\tData 0.001 (0.036)\tLoss 3.666 (3.022)\nEpoch: [6][320/400]\tTime 0.364 (0.397)\tData 0.000 (0.035)\tLoss 3.174 (3.034)\nEpoch: [6][330/400]\tTime 0.365 (0.396)\tData 0.000 (0.034)\tLoss 3.553 (3.049)\nEpoch: [6][340/400]\tTime 0.364 (0.401)\tData 0.001 (0.038)\tLoss 3.590 (3.060)\nEpoch: [6][350/400]\tTime 0.360 (0.400)\tData 0.001 (0.037)\tLoss 3.599 (3.066)\nEpoch: [6][360/400]\tTime 0.353 (0.399)\tData 0.001 (0.036)\tLoss 4.292 (3.077)\nEpoch: [6][370/400]\tTime 0.364 (0.398)\tData 0.001 (0.035)\tLoss 3.490 (3.088)\nEpoch: [6][380/400]\tTime 0.365 (0.397)\tData 0.000 (0.034)\tLoss 2.599 (3.097)\nEpoch: [6][390/400]\tTime 0.367 (0.400)\tData 0.001 (0.038)\tLoss 2.956 (3.100)\nEpoch: [6][400/400]\tTime 0.358 (0.399)\tData 0.000 (0.037)\tLoss 3.397 (3.106)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.158)\tData 0.000 (0.028)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 20.961553812026978\n==> Statistics for epoch 7: 803 clusters\nEpoch: [7][10/400]\tTime 0.362 (0.417)\tData 0.001 (0.049)\tLoss 0.741 (0.740)\nEpoch: [7][20/400]\tTime 0.364 (0.392)\tData 0.000 (0.025)\tLoss 0.576 (0.666)\nEpoch: [7][30/400]\tTime 0.363 (0.381)\tData 0.001 (0.017)\tLoss 0.634 (0.641)\nEpoch: [7][40/400]\tTime 0.363 (0.376)\tData 0.000 (0.013)\tLoss 0.411 (0.603)\nEpoch: [7][50/400]\tTime 0.363 (0.374)\tData 0.000 (0.010)\tLoss 0.468 (0.554)\nEpoch: 
[7][60/400]	Time 0.364 (0.401)	Data 0.001 (0.038)	Loss 3.117 (1.051)
...
Epoch: [7][400/400]	Time 0.364 (0.394)	Data 0.000 (0.032)	Loss 3.137 (3.124)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.127 (0.161)	Data 0.000 (0.030)
Computing jaccard distance...
Jaccard distance computing time cost: 20.03676414489746
==> Statistics for epoch 8: 799 clusters
Epoch: [8][10/400]	Time 0.365 (0.420)	Data 0.001 (0.049)	Loss 0.598 (0.707)
...
Epoch: [8][400/400]	Time 0.354 (0.399)	Data 0.001 (0.037)	Loss 3.270 (3.097)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.160)	Data 0.000 (0.030)
Computing jaccard distance...
Jaccard distance computing time cost: 19.664186716079712
==> Statistics for epoch 9: 824 clusters
Epoch: [9][10/400]	Time 0.363 (0.416)	Data 0.001 (0.058)	Loss 0.592 (0.635)
...
Epoch: [9][400/400]	Time 0.354 (0.392)	Data 0.001 (0.033)	Loss 3.732 (3.048)
Extract Features: [50/76]	Time 0.133 (0.159)	Data 0.000 (0.029)
Mean AP: 53.8%

 * Finished epoch   9  model mAP: 53.8%  best: 53.8% *

==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.128 (0.161)	Data 0.000 (0.029)
Computing jaccard distance...
Jaccard distance computing time cost: 19.404287099838257
==> Statistics for epoch 10: 832 clusters
Epoch: [10][10/400]	Time 0.352 (0.426)	Data 0.001 (0.059)	Loss 0.724 (0.652)
...
Epoch: [10][400/400]	Time 0.352 (0.391)	Data 0.001 (0.033)	Loss 3.951 (3.036)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.161)	Data 0.000 (0.028)
Computing jaccard distance...
Jaccard distance computing time cost: 19.264953136444092
==> Statistics for epoch 11: 820 clusters
Epoch: [11][10/400]	Time 0.361 (0.425)	Data 0.001 (0.066)	Loss 0.612 (0.591)
...
Epoch: [11][400/400]	Time 0.354 (0.391)	Data 0.001 (0.033)	Loss 2.886 (3.006)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.161)	Data 0.000 (0.028)
Computing jaccard distance...
Jaccard distance computing time cost: 19.479636669158936
==> Statistics for epoch 12: 823 clusters
Epoch: [12][10/400]	Time 0.350 (0.406)	Data 0.000 (0.049)	Loss 0.580 (0.660)
...
Epoch: [12][400/400]	Time 0.363 (0.391)	Data 0.001 (0.033)	Loss 2.972 (2.922)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.160)	Data 0.000 (0.029)
Computing jaccard distance...
Jaccard distance computing time cost: 19.265992641448975
==> Statistics for epoch 13: 795 clusters
Epoch: [13][10/400]	Time 0.350 (0.426)	Data 0.000 (0.053)	Loss 0.824 (0.678)
...
Epoch: [13][400/400]	Time 0.354 (0.396)	Data 0.001 (0.038)	Loss 3.420 (2.955)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.159)	Data 0.000 (0.027)
Computing jaccard distance...
Jaccard distance computing time cost: 19.923077821731567
==> Statistics for epoch 14: 790 clusters
Epoch: [14][10/400]	Time 0.362 (0.416)	Data 0.000 (0.048)	Loss 0.610 (0.566)
...
Epoch: [14][400/400]	Time 0.352 (0.397)	Data 0.001 (0.038)	Loss 2.707 (2.913)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.159)	Data 0.000 (0.029)
Computing jaccard distance...
Jaccard distance computing time cost: 18.7038893699646
==> Statistics for epoch 15: 762 clusters
Epoch: [15][10/400]	Time 0.361 (0.408)	Data 0.001 (0.052)	Loss 0.427 (0.511)
...
Epoch: [15][400/400]	Time 0.353 (0.394)	Data 0.000 (0.037)	Loss 3.548 (2.901)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.162)	Data 0.000 (0.028)
Computing jaccard distance...
Jaccard distance computing time cost: 19.068045377731323
==> Statistics for epoch 16: 752 clusters
Epoch: [16][10/400]	Time 0.349 (0.414)	Data 0.000 (0.053)	Loss 0.500 (0.518)
...
Epoch: [16][400/400]	Time 0.354 (0.395)	Data 0.001 (0.037)	Loss 2.607 (2.863)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.158)	Data 0.000 (0.027)
Computing jaccard distance...
Jaccard distance computing time cost: 18.778992891311646
==> Statistics for epoch 17: 744 clusters
Epoch: [17][10/400]	Time 0.350 (0.404)	Data 0.001 (0.046)	Loss 0.413 (0.525)
...
Epoch: [17][400/400]	Time 0.352 (0.394)	Data 0.001 (0.037)	Loss 2.122 (2.806)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.124 (0.159)	Data 0.000 (0.029)
Computing jaccard distance...
Jaccard distance computing time cost: 18.61119508743286
==> Statistics for epoch 18: 725 clusters
Epoch: [18][10/400]	Time 0.360 (0.419)	Data 0.001 (0.058)	Loss 0.515 (0.526)
...
Epoch: [18][200/400]	Time 0.351
(0.397)\tData 0.000 (0.039)\tLoss 3.273 (2.520)\nEpoch: [18][210/400]\tTime 0.352 (0.395)\tData 0.001 (0.037)\tLoss 3.079 (2.542)\nEpoch: [18][220/400]\tTime 0.362 (0.394)\tData 0.000 (0.036)\tLoss 3.234 (2.572)\nEpoch: [18][230/400]\tTime 0.354 (0.400)\tData 0.000 (0.042)\tLoss 2.924 (2.595)\nEpoch: [18][240/400]\tTime 0.349 (0.398)\tData 0.000 (0.040)\tLoss 3.282 (2.626)\nEpoch: [18][250/400]\tTime 0.353 (0.397)\tData 0.001 (0.038)\tLoss 3.278 (2.646)\nEpoch: [18][260/400]\tTime 0.352 (0.395)\tData 0.000 (0.037)\tLoss 3.142 (2.666)\nEpoch: [18][270/400]\tTime 0.362 (0.394)\tData 0.000 (0.035)\tLoss 3.349 (2.683)\nEpoch: [18][280/400]\tTime 0.363 (0.399)\tData 0.001 (0.041)\tLoss 2.549 (2.701)\nEpoch: [18][290/400]\tTime 0.353 (0.398)\tData 0.000 (0.039)\tLoss 2.947 (2.714)\nEpoch: [18][300/400]\tTime 0.356 (0.397)\tData 0.001 (0.038)\tLoss 3.413 (2.728)\nEpoch: [18][310/400]\tTime 0.361 (0.396)\tData 0.000 (0.037)\tLoss 3.380 (2.737)\nEpoch: [18][320/400]\tTime 0.352 (0.400)\tData 0.001 (0.041)\tLoss 3.836 (2.751)\nEpoch: [18][330/400]\tTime 0.353 (0.399)\tData 0.001 (0.040)\tLoss 2.782 (2.762)\nEpoch: [18][340/400]\tTime 0.352 (0.397)\tData 0.001 (0.039)\tLoss 3.140 (2.770)\nEpoch: [18][350/400]\tTime 0.353 (0.396)\tData 0.001 (0.038)\tLoss 3.102 (2.777)\nEpoch: [18][360/400]\tTime 0.364 (0.395)\tData 0.000 (0.037)\tLoss 3.474 (2.785)\nEpoch: [18][370/400]\tTime 0.364 (0.400)\tData 0.001 (0.041)\tLoss 3.068 (2.792)\nEpoch: [18][380/400]\tTime 0.398 (0.399)\tData 0.000 (0.040)\tLoss 2.818 (2.802)\nEpoch: [18][390/400]\tTime 0.364 (0.397)\tData 0.001 (0.039)\tLoss 2.980 (2.805)\nEpoch: [18][400/400]\tTime 0.363 (0.396)\tData 0.000 (0.038)\tLoss 2.303 (2.812)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.164)\tData 0.000 (0.030)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.65268301963806\n==> Statistics for epoch 19: 718 clusters\nEpoch: [19][10/400]\tTime 0.354 (0.409)\tData 0.001 (0.048)\tLoss 
0.564 (0.489)\nEpoch: [19][20/400]\tTime 0.351 (0.383)\tData 0.001 (0.024)\tLoss 0.433 (0.468)\nEpoch: [19][30/400]\tTime 0.350 (0.372)\tData 0.001 (0.016)\tLoss 0.333 (0.432)\nEpoch: [19][40/400]\tTime 0.362 (0.368)\tData 0.000 (0.012)\tLoss 0.277 (0.401)\nEpoch: [19][50/400]\tTime 0.362 (0.401)\tData 0.001 (0.044)\tLoss 3.021 (0.724)\nEpoch: [19][60/400]\tTime 0.353 (0.393)\tData 0.000 (0.036)\tLoss 3.214 (1.109)\nEpoch: [19][70/400]\tTime 0.352 (0.388)\tData 0.001 (0.031)\tLoss 3.490 (1.362)\nEpoch: [19][80/400]\tTime 0.352 (0.383)\tData 0.000 (0.027)\tLoss 2.828 (1.590)\nEpoch: [19][90/400]\tTime 0.349 (0.400)\tData 0.001 (0.044)\tLoss 3.026 (1.755)\nEpoch: [19][100/400]\tTime 0.363 (0.396)\tData 0.001 (0.040)\tLoss 2.432 (1.860)\nEpoch: [19][110/400]\tTime 0.353 (0.393)\tData 0.000 (0.036)\tLoss 2.974 (1.940)\nEpoch: [19][120/400]\tTime 0.361 (0.390)\tData 0.001 (0.033)\tLoss 2.866 (2.025)\nEpoch: [19][130/400]\tTime 0.363 (0.387)\tData 0.000 (0.031)\tLoss 2.962 (2.082)\nEpoch: [19][140/400]\tTime 0.351 (0.399)\tData 0.001 (0.041)\tLoss 2.721 (2.155)\nEpoch: [19][150/400]\tTime 0.350 (0.396)\tData 0.001 (0.039)\tLoss 3.033 (2.206)\nEpoch: [19][160/400]\tTime 0.351 (0.393)\tData 0.001 (0.036)\tLoss 3.136 (2.266)\nEpoch: [19][170/400]\tTime 0.364 (0.391)\tData 0.000 (0.034)\tLoss 2.575 (2.304)\nEpoch: [19][180/400]\tTime 0.351 (0.400)\tData 0.001 (0.042)\tLoss 3.043 (2.341)\nEpoch: [19][190/400]\tTime 0.353 (0.398)\tData 0.001 (0.040)\tLoss 2.589 (2.358)\nEpoch: [19][200/400]\tTime 0.354 (0.395)\tData 0.001 (0.038)\tLoss 2.823 (2.389)\nEpoch: [19][210/400]\tTime 0.354 (0.393)\tData 0.001 (0.036)\tLoss 2.407 (2.422)\nEpoch: [19][220/400]\tTime 0.355 (0.392)\tData 0.000 (0.035)\tLoss 3.007 (2.451)\nEpoch: [19][230/400]\tTime 0.355 (0.398)\tData 0.001 (0.041)\tLoss 3.365 (2.478)\nEpoch: [19][240/400]\tTime 0.354 (0.397)\tData 0.000 (0.039)\tLoss 2.966 (2.501)\nEpoch: [19][250/400]\tTime 0.354 (0.395)\tData 0.001 (0.038)\tLoss 3.292 (2.523)\nEpoch: 
[19][260/400]\tTime 0.374 (0.394)\tData 0.000 (0.036)\tLoss 2.445 (2.539)\nEpoch: [19][270/400]\tTime 0.363 (0.399)\tData 0.001 (0.042)\tLoss 3.306 (2.552)\nEpoch: [19][280/400]\tTime 0.354 (0.397)\tData 0.001 (0.040)\tLoss 3.214 (2.571)\nEpoch: [19][290/400]\tTime 0.375 (0.396)\tData 0.001 (0.039)\tLoss 3.514 (2.592)\nEpoch: [19][300/400]\tTime 0.351 (0.395)\tData 0.001 (0.037)\tLoss 2.527 (2.602)\nEpoch: [19][310/400]\tTime 0.351 (0.400)\tData 0.001 (0.042)\tLoss 3.175 (2.617)\nEpoch: [19][320/400]\tTime 0.353 (0.398)\tData 0.001 (0.041)\tLoss 3.120 (2.630)\nEpoch: [19][330/400]\tTime 0.362 (0.397)\tData 0.001 (0.039)\tLoss 2.933 (2.646)\nEpoch: [19][340/400]\tTime 0.354 (0.396)\tData 0.000 (0.038)\tLoss 2.819 (2.662)\nEpoch: [19][350/400]\tTime 0.365 (0.395)\tData 0.000 (0.037)\tLoss 3.588 (2.676)\nEpoch: [19][360/400]\tTime 0.353 (0.399)\tData 0.000 (0.041)\tLoss 2.967 (2.684)\nEpoch: [19][370/400]\tTime 0.352 (0.398)\tData 0.001 (0.040)\tLoss 2.823 (2.688)\nEpoch: [19][380/400]\tTime 0.375 (0.397)\tData 0.000 (0.039)\tLoss 3.236 (2.693)\nEpoch: [19][390/400]\tTime 0.364 (0.396)\tData 0.000 (0.038)\tLoss 2.652 (2.697)\nEpoch: [19][400/400]\tTime 0.354 (0.400)\tData 0.001 (0.042)\tLoss 2.646 (2.707)\nExtract Features: [50/76]\tTime 0.133 (0.163)\tData 0.000 (0.029)\t\nMean AP: 76.9%\n\n * Finished epoch  19  model mAP: 76.9%  best: 76.9% *\n\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.136 (0.161)\tData 0.000 (0.029)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 19.028404235839844\n==> Statistics for epoch 20: 710 clusters\nEpoch: [20][10/400]\tTime 0.361 (0.421)\tData 0.001 (0.052)\tLoss 0.460 (0.458)\nEpoch: [20][20/400]\tTime 0.349 (0.387)\tData 0.001 (0.026)\tLoss 0.355 (0.444)\nEpoch: [20][30/400]\tTime 0.362 (0.378)\tData 0.001 (0.018)\tLoss 0.334 (0.431)\nEpoch: [20][40/400]\tTime 0.363 (0.373)\tData 0.000 (0.014)\tLoss 0.351 (0.403)\nEpoch: [20][50/400]\tTime 0.349 (0.407)\tData 0.001 
(0.047)\tLoss 2.913 (0.716)\nEpoch: [20][60/400]\tTime 0.351 (0.399)\tData 0.001 (0.040)\tLoss 3.591 (1.113)\nEpoch: [20][70/400]\tTime 0.351 (0.392)\tData 0.001 (0.034)\tLoss 2.805 (1.384)\nEpoch: [20][80/400]\tTime 0.350 (0.387)\tData 0.001 (0.030)\tLoss 2.635 (1.594)\nEpoch: [20][90/400]\tTime 0.351 (0.405)\tData 0.001 (0.047)\tLoss 1.877 (1.735)\nEpoch: [20][100/400]\tTime 0.354 (0.401)\tData 0.001 (0.042)\tLoss 2.308 (1.833)\nEpoch: [20][110/400]\tTime 0.381 (0.397)\tData 0.001 (0.038)\tLoss 2.102 (1.879)\nEpoch: [20][120/400]\tTime 0.352 (0.393)\tData 0.001 (0.035)\tLoss 2.467 (1.931)\nEpoch: [20][130/400]\tTime 0.363 (0.390)\tData 0.000 (0.033)\tLoss 2.248 (1.984)\nEpoch: [20][140/400]\tTime 0.353 (0.400)\tData 0.000 (0.043)\tLoss 2.529 (2.044)\nEpoch: [20][150/400]\tTime 0.364 (0.398)\tData 0.001 (0.040)\tLoss 2.474 (2.089)\nEpoch: [20][160/400]\tTime 0.351 (0.395)\tData 0.001 (0.037)\tLoss 2.442 (2.117)\nEpoch: [20][170/400]\tTime 0.363 (0.393)\tData 0.000 (0.035)\tLoss 2.734 (2.156)\nEpoch: [20][180/400]\tTime 0.408 (0.402)\tData 0.000 (0.044)\tLoss 2.851 (2.183)\nEpoch: [20][190/400]\tTime 0.351 (0.400)\tData 0.001 (0.041)\tLoss 2.188 (2.217)\nEpoch: [20][200/400]\tTime 0.353 (0.397)\tData 0.001 (0.039)\tLoss 1.985 (2.231)\nEpoch: [20][210/400]\tTime 0.352 (0.395)\tData 0.000 (0.037)\tLoss 2.666 (2.240)\nEpoch: [20][220/400]\tTime 0.364 (0.394)\tData 0.000 (0.036)\tLoss 2.769 (2.245)\nEpoch: [20][230/400]\tTime 0.363 (0.400)\tData 0.001 (0.042)\tLoss 2.979 (2.259)\nEpoch: [20][240/400]\tTime 0.353 (0.399)\tData 0.001 (0.041)\tLoss 3.191 (2.271)\nEpoch: [20][250/400]\tTime 0.350 (0.397)\tData 0.001 (0.039)\tLoss 2.494 (2.283)\nEpoch: [20][260/400]\tTime 0.364 (0.395)\tData 0.000 (0.037)\tLoss 2.453 (2.293)\nEpoch: [20][270/400]\tTime 0.354 (0.401)\tData 0.001 (0.043)\tLoss 2.789 (2.306)\nEpoch: [20][280/400]\tTime 0.363 (0.400)\tData 0.000 (0.041)\tLoss 1.968 (2.313)\nEpoch: [20][290/400]\tTime 0.360 (0.398)\tData 0.001 (0.040)\tLoss 1.779 (2.317)\nEpoch: 
[20][300/400]\tTime 0.364 (0.397)\tData 0.001 (0.039)\tLoss 2.203 (2.328)\nEpoch: [20][310/400]\tTime 0.352 (0.402)\tData 0.001 (0.043)\tLoss 3.181 (2.337)\nEpoch: [20][320/400]\tTime 0.352 (0.400)\tData 0.000 (0.042)\tLoss 2.559 (2.335)\nEpoch: [20][330/400]\tTime 0.363 (0.399)\tData 0.000 (0.040)\tLoss 2.705 (2.342)\nEpoch: [20][340/400]\tTime 0.375 (0.398)\tData 0.000 (0.039)\tLoss 2.751 (2.342)\nEpoch: [20][350/400]\tTime 0.364 (0.397)\tData 0.000 (0.038)\tLoss 2.336 (2.344)\nEpoch: [20][360/400]\tTime 0.354 (0.401)\tData 0.000 (0.042)\tLoss 2.553 (2.348)\nEpoch: [20][370/400]\tTime 0.362 (0.400)\tData 0.000 (0.041)\tLoss 2.264 (2.346)\nEpoch: [20][380/400]\tTime 0.352 (0.399)\tData 0.000 (0.040)\tLoss 2.168 (2.349)\nEpoch: [20][390/400]\tTime 0.363 (0.397)\tData 0.000 (0.039)\tLoss 1.949 (2.354)\nEpoch: [20][400/400]\tTime 0.351 (0.401)\tData 0.000 (0.042)\tLoss 2.349 (2.359)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.159)\tData 0.000 (0.028)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.549808740615845\n==> Statistics for epoch 21: 712 clusters\nEpoch: [21][10/400]\tTime 0.361 (0.412)\tData 0.001 (0.046)\tLoss 0.307 (0.372)\nEpoch: [21][20/400]\tTime 0.351 (0.382)\tData 0.001 (0.023)\tLoss 0.255 (0.347)\nEpoch: [21][30/400]\tTime 0.361 (0.374)\tData 0.001 (0.016)\tLoss 0.242 (0.333)\nEpoch: [21][40/400]\tTime 0.361 (0.369)\tData 0.000 (0.012)\tLoss 0.209 (0.311)\nEpoch: [21][50/400]\tTime 0.361 (0.405)\tData 0.001 (0.047)\tLoss 2.810 (0.615)\nEpoch: [21][60/400]\tTime 0.369 (0.398)\tData 0.000 (0.039)\tLoss 2.503 (0.985)\nEpoch: [21][70/400]\tTime 0.363 (0.392)\tData 0.001 (0.033)\tLoss 2.820 (1.208)\nEpoch: [21][80/400]\tTime 0.375 (0.388)\tData 0.000 (0.029)\tLoss 3.185 (1.413)\nEpoch: [21][90/400]\tTime 0.350 (0.406)\tData 0.001 (0.046)\tLoss 2.023 (1.545)\nEpoch: [21][100/400]\tTime 0.352 (0.400)\tData 0.000 (0.042)\tLoss 2.388 (1.639)\nEpoch: [21][110/400]\tTime 0.353 (0.396)\tData 
0.000 (0.038)\tLoss 2.053 (1.702)\nEpoch: [21][120/400]\tTime 0.353 (0.393)\tData 0.000 (0.035)\tLoss 1.929 (1.761)\nEpoch: [21][130/400]\tTime 0.364 (0.390)\tData 0.000 (0.032)\tLoss 2.019 (1.798)\nEpoch: [21][140/400]\tTime 0.351 (0.402)\tData 0.001 (0.043)\tLoss 2.398 (1.841)\nEpoch: [21][150/400]\tTime 0.362 (0.399)\tData 0.001 (0.041)\tLoss 2.625 (1.885)\nEpoch: [21][160/400]\tTime 0.353 (0.396)\tData 0.000 (0.038)\tLoss 2.128 (1.921)\nEpoch: [21][170/400]\tTime 0.370 (0.394)\tData 0.000 (0.036)\tLoss 2.737 (1.962)\nEpoch: [21][180/400]\tTime 0.364 (0.402)\tData 0.000 (0.044)\tLoss 2.979 (1.998)\nEpoch: [21][190/400]\tTime 0.354 (0.400)\tData 0.000 (0.042)\tLoss 2.289 (2.016)\nEpoch: [21][200/400]\tTime 0.353 (0.398)\tData 0.000 (0.039)\tLoss 2.543 (2.038)\nEpoch: [21][210/400]\tTime 0.352 (0.396)\tData 0.001 (0.038)\tLoss 1.918 (2.045)\nEpoch: [21][220/400]\tTime 0.363 (0.394)\tData 0.000 (0.036)\tLoss 2.924 (2.060)\nEpoch: [21][230/400]\tTime 0.353 (0.400)\tData 0.001 (0.042)\tLoss 2.544 (2.073)\nEpoch: [21][240/400]\tTime 0.362 (0.398)\tData 0.000 (0.040)\tLoss 2.803 (2.096)\nEpoch: [21][250/400]\tTime 0.355 (0.397)\tData 0.001 (0.039)\tLoss 2.281 (2.103)\nEpoch: [21][260/400]\tTime 0.361 (0.395)\tData 0.000 (0.037)\tLoss 2.552 (2.116)\nEpoch: [21][270/400]\tTime 0.353 (0.401)\tData 0.000 (0.043)\tLoss 1.931 (2.124)\nEpoch: [21][280/400]\tTime 0.353 (0.399)\tData 0.001 (0.041)\tLoss 2.650 (2.132)\nEpoch: [21][290/400]\tTime 0.370 (0.398)\tData 0.000 (0.040)\tLoss 1.741 (2.139)\nEpoch: [21][300/400]\tTime 0.353 (0.396)\tData 0.001 (0.038)\tLoss 2.356 (2.148)\nEpoch: [21][310/400]\tTime 0.352 (0.401)\tData 0.001 (0.043)\tLoss 2.661 (2.163)\nEpoch: [21][320/400]\tTime 0.363 (0.400)\tData 0.001 (0.042)\tLoss 2.844 (2.170)\nEpoch: [21][330/400]\tTime 0.350 (0.398)\tData 0.001 (0.040)\tLoss 2.405 (2.173)\nEpoch: [21][340/400]\tTime 0.373 (0.397)\tData 0.001 (0.039)\tLoss 2.405 (2.181)\nEpoch: [21][350/400]\tTime 0.363 (0.396)\tData 0.000 (0.038)\tLoss 2.124 
(2.188)\nEpoch: [21][360/400]\tTime 0.367 (0.400)\tData 0.001 (0.042)\tLoss 2.459 (2.204)\nEpoch: [21][370/400]\tTime 0.362 (0.400)\tData 0.001 (0.041)\tLoss 2.598 (2.205)\nEpoch: [21][380/400]\tTime 0.351 (0.399)\tData 0.000 (0.040)\tLoss 2.395 (2.213)\nEpoch: [21][390/400]\tTime 0.358 (0.398)\tData 0.000 (0.039)\tLoss 2.266 (2.217)\nEpoch: [21][400/400]\tTime 0.362 (0.401)\tData 0.001 (0.043)\tLoss 2.810 (2.222)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.161)\tData 0.000 (0.029)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.53739333152771\n==> Statistics for epoch 22: 710 clusters\nEpoch: [22][10/400]\tTime 0.361 (0.414)\tData 0.001 (0.046)\tLoss 0.377 (0.346)\nEpoch: [22][20/400]\tTime 0.407 (0.386)\tData 0.001 (0.023)\tLoss 0.420 (0.336)\nEpoch: [22][30/400]\tTime 0.359 (0.375)\tData 0.001 (0.016)\tLoss 0.235 (0.312)\nEpoch: [22][40/400]\tTime 0.361 (0.369)\tData 0.000 (0.012)\tLoss 0.290 (0.296)\nEpoch: [22][50/400]\tTime 0.350 (0.402)\tData 0.001 (0.044)\tLoss 2.791 (0.583)\nEpoch: [22][60/400]\tTime 0.363 (0.395)\tData 0.001 (0.037)\tLoss 3.026 (0.926)\nEpoch: [22][70/400]\tTime 0.352 (0.390)\tData 0.001 (0.031)\tLoss 2.184 (1.168)\nEpoch: [22][80/400]\tTime 0.354 (0.386)\tData 0.001 (0.028)\tLoss 3.398 (1.365)\nEpoch: [22][90/400]\tTime 0.372 (0.404)\tData 0.001 (0.045)\tLoss 2.459 (1.491)\nEpoch: [22][100/400]\tTime 0.370 (0.400)\tData 0.001 (0.041)\tLoss 1.967 (1.560)\nEpoch: [22][110/400]\tTime 0.351 (0.395)\tData 0.001 (0.037)\tLoss 1.832 (1.622)\nEpoch: [22][120/400]\tTime 0.350 (0.392)\tData 0.001 (0.034)\tLoss 2.662 (1.681)\nEpoch: [22][130/400]\tTime 0.362 (0.389)\tData 0.000 (0.032)\tLoss 2.741 (1.731)\nEpoch: [22][140/400]\tTime 0.352 (0.400)\tData 0.001 (0.042)\tLoss 1.949 (1.783)\nEpoch: [22][150/400]\tTime 0.361 (0.397)\tData 0.001 (0.039)\tLoss 2.409 (1.825)\nEpoch: [22][160/400]\tTime 0.350 (0.395)\tData 0.001 (0.037)\tLoss 2.350 (1.855)\nEpoch: [22][170/400]\tTime 0.364 
(0.393)\tData 0.000 (0.034)\tLoss 2.622 (1.886)\nEpoch: [22][180/400]\tTime 0.351 (0.401)\tData 0.000 (0.043)\tLoss 2.235 (1.911)\nEpoch: [22][190/400]\tTime 0.369 (0.399)\tData 0.001 (0.041)\tLoss 2.155 (1.941)\nEpoch: [22][200/400]\tTime 0.365 (0.397)\tData 0.001 (0.039)\tLoss 2.705 (1.960)\nEpoch: [22][210/400]\tTime 0.370 (0.395)\tData 0.001 (0.037)\tLoss 1.725 (1.977)\nEpoch: [22][220/400]\tTime 0.362 (0.394)\tData 0.000 (0.035)\tLoss 2.362 (1.992)\nEpoch: [22][230/400]\tTime 0.354 (0.400)\tData 0.001 (0.041)\tLoss 1.859 (2.004)\nEpoch: [22][240/400]\tTime 0.351 (0.398)\tData 0.001 (0.040)\tLoss 2.818 (2.028)\nEpoch: [22][250/400]\tTime 0.352 (0.397)\tData 0.001 (0.038)\tLoss 2.263 (2.040)\nEpoch: [22][260/400]\tTime 0.362 (0.395)\tData 0.000 (0.037)\tLoss 1.736 (2.044)\nEpoch: [22][270/400]\tTime 0.357 (0.401)\tData 0.000 (0.042)\tLoss 1.936 (2.057)\nEpoch: [22][280/400]\tTime 0.352 (0.399)\tData 0.001 (0.041)\tLoss 1.676 (2.064)\nEpoch: [22][290/400]\tTime 0.352 (0.398)\tData 0.000 (0.039)\tLoss 2.071 (2.073)\nEpoch: [22][300/400]\tTime 0.355 (0.396)\tData 0.001 (0.038)\tLoss 2.014 (2.073)\nEpoch: [22][310/400]\tTime 0.376 (0.401)\tData 0.000 (0.043)\tLoss 2.728 (2.084)\nEpoch: [22][320/400]\tTime 0.354 (0.400)\tData 0.001 (0.041)\tLoss 2.114 (2.087)\nEpoch: [22][330/400]\tTime 0.354 (0.399)\tData 0.001 (0.040)\tLoss 2.277 (2.094)\nEpoch: [22][340/400]\tTime 0.352 (0.397)\tData 0.001 (0.039)\tLoss 2.347 (2.098)\nEpoch: [22][350/400]\tTime 0.364 (0.396)\tData 0.000 (0.038)\tLoss 2.331 (2.106)\nEpoch: [22][360/400]\tTime 0.352 (0.400)\tData 0.000 (0.042)\tLoss 2.734 (2.110)\nEpoch: [22][370/400]\tTime 0.354 (0.399)\tData 0.001 (0.041)\tLoss 2.916 (2.124)\nEpoch: [22][380/400]\tTime 0.376 (0.398)\tData 0.000 (0.039)\tLoss 2.088 (2.131)\nEpoch: [22][390/400]\tTime 0.364 (0.397)\tData 0.000 (0.038)\tLoss 2.388 (2.131)\nEpoch: [22][400/400]\tTime 0.354 (0.401)\tData 0.000 (0.042)\tLoss 2.091 (2.132)\n==> Create pseudo labels for unlabeled data\nExtract Features: 
[50/51]\tTime 0.129 (0.163)\tData 0.000 (0.030)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.42928433418274\n==> Statistics for epoch 23: 713 clusters\nEpoch: [23][10/400]\tTime 0.361 (0.406)\tData 0.001 (0.046)\tLoss 0.402 (0.346)\nEpoch: [23][20/400]\tTime 0.352 (0.383)\tData 0.000 (0.023)\tLoss 0.363 (0.328)\nEpoch: [23][30/400]\tTime 0.362 (0.373)\tData 0.000 (0.016)\tLoss 0.231 (0.313)\nEpoch: [23][40/400]\tTime 0.363 (0.370)\tData 0.000 (0.012)\tLoss 0.100 (0.279)\nEpoch: [23][50/400]\tTime 0.352 (0.403)\tData 0.001 (0.045)\tLoss 2.306 (0.563)\nEpoch: [23][60/400]\tTime 0.353 (0.396)\tData 0.000 (0.038)\tLoss 2.223 (0.922)\nEpoch: [23][70/400]\tTime 0.375 (0.390)\tData 0.001 (0.032)\tLoss 2.137 (1.124)\nEpoch: [23][80/400]\tTime 0.363 (0.385)\tData 0.001 (0.028)\tLoss 2.805 (1.312)\nEpoch: [23][90/400]\tTime 0.358 (0.402)\tData 0.001 (0.044)\tLoss 1.662 (1.431)\nEpoch: [23][100/400]\tTime 0.356 (0.397)\tData 0.001 (0.040)\tLoss 1.970 (1.493)\nEpoch: [23][110/400]\tTime 0.353 (0.393)\tData 0.000 (0.036)\tLoss 2.019 (1.547)\nEpoch: [23][120/400]\tTime 0.351 (0.390)\tData 0.001 (0.033)\tLoss 2.887 (1.613)\nEpoch: [23][130/400]\tTime 0.364 (0.388)\tData 0.000 (0.031)\tLoss 1.536 (1.656)\nEpoch: [23][140/400]\tTime 0.361 (0.398)\tData 0.001 (0.041)\tLoss 2.075 (1.701)\nEpoch: [23][150/400]\tTime 0.356 (0.396)\tData 0.001 (0.038)\tLoss 2.588 (1.753)\nEpoch: [23][160/400]\tTime 0.364 (0.393)\tData 0.001 (0.036)\tLoss 1.957 (1.778)\nEpoch: [23][170/400]\tTime 0.362 (0.391)\tData 0.000 (0.034)\tLoss 2.318 (1.802)\nEpoch: [23][180/400]\tTime 0.353 (0.400)\tData 0.001 (0.042)\tLoss 2.423 (1.826)\nEpoch: [23][190/400]\tTime 0.353 (0.397)\tData 0.001 (0.040)\tLoss 1.775 (1.850)\nEpoch: [23][200/400]\tTime 0.372 (0.395)\tData 0.001 (0.038)\tLoss 2.113 (1.869)\nEpoch: [23][210/400]\tTime 0.353 (0.393)\tData 0.001 (0.036)\tLoss 1.826 (1.892)\nEpoch: [23][220/400]\tTime 0.363 (0.392)\tData 0.000 (0.035)\tLoss 2.857 (1.915)\nEpoch: 
[23][230/400]\tTime 0.353 (0.398)\tData 0.001 (0.041)\tLoss 1.979 (1.938)\nEpoch: [23][240/400]\tTime 0.373 (0.396)\tData 0.001 (0.039)\tLoss 2.044 (1.949)\nEpoch: [23][250/400]\tTime 0.354 (0.395)\tData 0.001 (0.038)\tLoss 1.894 (1.966)\nEpoch: [23][260/400]\tTime 0.364 (0.394)\tData 0.000 (0.036)\tLoss 2.139 (1.984)\nEpoch: [23][270/400]\tTime 0.363 (0.399)\tData 0.001 (0.042)\tLoss 3.073 (1.994)\nEpoch: [23][280/400]\tTime 0.353 (0.397)\tData 0.000 (0.040)\tLoss 2.604 (2.007)\nEpoch: [23][290/400]\tTime 0.364 (0.396)\tData 0.001 (0.039)\tLoss 2.114 (2.018)\nEpoch: [23][300/400]\tTime 0.351 (0.395)\tData 0.001 (0.038)\tLoss 2.095 (2.021)\nEpoch: [23][310/400]\tTime 0.342 (0.399)\tData 0.001 (0.042)\tLoss 2.046 (2.030)\nEpoch: [23][320/400]\tTime 0.354 (0.398)\tData 0.001 (0.041)\tLoss 2.289 (2.042)\nEpoch: [23][330/400]\tTime 0.396 (0.397)\tData 0.001 (0.039)\tLoss 2.013 (2.046)\nEpoch: [23][340/400]\tTime 0.354 (0.395)\tData 0.001 (0.038)\tLoss 1.960 (2.052)\nEpoch: [23][350/400]\tTime 0.364 (0.394)\tData 0.000 (0.037)\tLoss 2.948 (2.056)\nEpoch: [23][360/400]\tTime 0.381 (0.398)\tData 0.001 (0.041)\tLoss 1.699 (2.059)\nEpoch: [23][370/400]\tTime 0.352 (0.397)\tData 0.001 (0.040)\tLoss 1.855 (2.069)\nEpoch: [23][380/400]\tTime 0.351 (0.396)\tData 0.001 (0.039)\tLoss 1.582 (2.073)\nEpoch: [23][390/400]\tTime 0.364 (0.395)\tData 0.000 (0.038)\tLoss 1.826 (2.077)\nEpoch: [23][400/400]\tTime 0.350 (0.399)\tData 0.001 (0.041)\tLoss 2.326 (2.080)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.163)\tData 0.000 (0.030)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.305078983306885\n==> Statistics for epoch 24: 716 clusters\nEpoch: [24][10/400]\tTime 0.352 (0.406)\tData 0.001 (0.046)\tLoss 0.243 (0.322)\nEpoch: [24][20/400]\tTime 0.351 (0.380)\tData 0.001 (0.023)\tLoss 0.231 (0.310)\nEpoch: [24][30/400]\tTime 0.361 (0.372)\tData 0.001 (0.016)\tLoss 0.266 (0.292)\nEpoch: [24][40/400]\tTime 0.363 
(0.368)\tData 0.000 (0.012)\tLoss 0.116 (0.275)\nEpoch: [24][50/400]\tTime 0.352 (0.403)\tData 0.001 (0.046)\tLoss 1.981 (0.547)\nEpoch: [24][60/400]\tTime 0.349 (0.395)\tData 0.000 (0.039)\tLoss 2.227 (0.871)\nEpoch: [24][70/400]\tTime 0.361 (0.389)\tData 0.001 (0.033)\tLoss 2.061 (1.091)\nEpoch: [24][80/400]\tTime 0.352 (0.385)\tData 0.001 (0.029)\tLoss 2.861 (1.256)\nEpoch: [24][90/400]\tTime 0.362 (0.403)\tData 0.000 (0.047)\tLoss 2.304 (1.391)\nEpoch: [24][100/400]\tTime 0.352 (0.398)\tData 0.000 (0.042)\tLoss 2.466 (1.471)\nEpoch: [24][110/400]\tTime 0.355 (0.394)\tData 0.001 (0.038)\tLoss 2.271 (1.526)\nEpoch: [24][120/400]\tTime 0.351 (0.391)\tData 0.001 (0.035)\tLoss 1.862 (1.586)\nEpoch: [24][130/400]\tTime 0.361 (0.388)\tData 0.000 (0.032)\tLoss 2.511 (1.648)\nEpoch: [24][140/400]\tTime 0.352 (0.399)\tData 0.000 (0.042)\tLoss 2.394 (1.700)\nEpoch: [24][150/400]\tTime 0.350 (0.396)\tData 0.001 (0.039)\tLoss 2.440 (1.737)\nEpoch: [24][160/400]\tTime 0.352 (0.393)\tData 0.001 (0.037)\tLoss 2.214 (1.770)\nEpoch: [24][170/400]\tTime 0.363 (0.391)\tData 0.000 (0.035)\tLoss 2.031 (1.803)\nEpoch: [24][180/400]\tTime 0.351 (0.399)\tData 0.001 (0.042)\tLoss 1.688 (1.825)\nEpoch: [24][190/400]\tTime 0.351 (0.396)\tData 0.001 (0.040)\tLoss 1.648 (1.840)\nEpoch: [24][200/400]\tTime 0.351 (0.394)\tData 0.001 (0.038)\tLoss 1.322 (1.849)\nEpoch: [24][210/400]\tTime 0.351 (0.392)\tData 0.001 (0.036)\tLoss 1.882 (1.859)\nEpoch: [24][220/400]\tTime 0.362 (0.390)\tData 0.000 (0.035)\tLoss 1.609 (1.877)\nEpoch: [24][230/400]\tTime 0.362 (0.396)\tData 0.001 (0.041)\tLoss 2.267 (1.887)\nEpoch: [24][240/400]\tTime 0.352 (0.395)\tData 0.000 (0.039)\tLoss 2.272 (1.894)\nEpoch: [24][250/400]\tTime 0.363 (0.393)\tData 0.001 (0.038)\tLoss 2.677 (1.914)\nEpoch: [24][260/400]\tTime 0.362 (0.392)\tData 0.000 (0.036)\tLoss 2.411 (1.924)\nEpoch: [24][270/400]\tTime 0.353 (0.398)\tData 0.000 (0.042)\tLoss 1.875 (1.945)\nEpoch: [24][280/400]\tTime 0.359 (0.396)\tData 0.000 (0.040)\tLoss 
2.719 (1.952)\nEpoch: [24][290/400]\tTime 0.354 (0.395)\tData 0.001 (0.039)\tLoss 2.635 (1.964)\nEpoch: [24][300/400]\tTime 0.352 (0.394)\tData 0.000 (0.038)\tLoss 2.588 (1.974)\nEpoch: [24][310/400]\tTime 0.371 (0.398)\tData 0.000 (0.042)\tLoss 1.678 (1.985)\nEpoch: [24][320/400]\tTime 0.354 (0.397)\tData 0.000 (0.041)\tLoss 1.551 (1.994)\nEpoch: [24][330/400]\tTime 0.355 (0.396)\tData 0.001 (0.040)\tLoss 2.005 (2.000)\nEpoch: [24][340/400]\tTime 0.355 (0.395)\tData 0.000 (0.038)\tLoss 2.369 (2.002)\nEpoch: [24][350/400]\tTime 0.364 (0.394)\tData 0.000 (0.037)\tLoss 2.188 (2.013)\nEpoch: [24][360/400]\tTime 0.356 (0.398)\tData 0.001 (0.041)\tLoss 2.251 (2.017)\nEpoch: [24][370/400]\tTime 0.354 (0.397)\tData 0.001 (0.040)\tLoss 2.289 (2.024)\nEpoch: [24][380/400]\tTime 0.352 (0.396)\tData 0.001 (0.039)\tLoss 1.841 (2.033)\nEpoch: [24][390/400]\tTime 0.363 (0.395)\tData 0.000 (0.038)\tLoss 1.745 (2.043)\nEpoch: [24][400/400]\tTime 0.355 (0.399)\tData 0.001 (0.042)\tLoss 2.466 (2.047)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.159)\tData 0.000 (0.028)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.23905038833618\n==> Statistics for epoch 25: 711 clusters\nEpoch: [25][10/400]\tTime 0.361 (0.423)\tData 0.001 (0.062)\tLoss 0.292 (0.276)\nEpoch: [25][20/400]\tTime 0.351 (0.391)\tData 0.001 (0.031)\tLoss 0.212 (0.295)\nEpoch: [25][30/400]\tTime 0.361 (0.379)\tData 0.001 (0.021)\tLoss 0.247 (0.272)\nEpoch: [25][40/400]\tTime 0.360 (0.373)\tData 0.000 (0.016)\tLoss 0.179 (0.253)\nEpoch: [25][50/400]\tTime 0.376 (0.405)\tData 0.001 (0.047)\tLoss 2.275 (0.484)\nEpoch: [25][60/400]\tTime 0.362 (0.398)\tData 0.000 (0.039)\tLoss 1.952 (0.807)\nEpoch: [25][70/400]\tTime 0.363 (0.393)\tData 0.001 (0.033)\tLoss 2.215 (1.028)\nEpoch: [25][80/400]\tTime 0.353 (0.388)\tData 0.001 (0.029)\tLoss 2.909 (1.224)\nEpoch: [25][90/400]\tTime 0.350 (0.405)\tData 0.001 (0.045)\tLoss 1.771 (1.355)\nEpoch: [25][100/400]\tTime 
0.352 (0.400)\tData 0.000 (0.041)\tLoss 2.343 (1.442)\nEpoch: [25][110/400]\tTime 0.351 (0.396)\tData 0.001 (0.037)\tLoss 1.742 (1.506)\nEpoch: [25][120/400]\tTime 0.354 (0.393)\tData 0.000 (0.034)\tLoss 2.541 (1.553)\nEpoch: [25][130/400]\tTime 0.364 (0.390)\tData 0.000 (0.032)\tLoss 1.868 (1.594)\nEpoch: [25][140/400]\tTime 0.350 (0.401)\tData 0.001 (0.042)\tLoss 1.714 (1.630)\nEpoch: [25][150/400]\tTime 0.352 (0.398)\tData 0.000 (0.040)\tLoss 2.128 (1.679)\nEpoch: [25][160/400]\tTime 0.343 (0.395)\tData 0.001 (0.037)\tLoss 2.646 (1.721)\nEpoch: [25][170/400]\tTime 0.363 (0.393)\tData 0.000 (0.035)\tLoss 2.862 (1.749)\nEpoch: [25][180/400]\tTime 0.352 (0.401)\tData 0.000 (0.043)\tLoss 2.219 (1.776)\nEpoch: [25][190/400]\tTime 0.365 (0.399)\tData 0.001 (0.041)\tLoss 1.603 (1.796)\nEpoch: [25][200/400]\tTime 0.353 (0.397)\tData 0.001 (0.039)\tLoss 2.012 (1.815)\nEpoch: [25][210/400]\tTime 0.354 (0.395)\tData 0.001 (0.037)\tLoss 1.765 (1.829)\nEpoch: [25][220/400]\tTime 0.356 (0.393)\tData 0.000 (0.035)\tLoss 2.505 (1.852)\nEpoch: [25][230/400]\tTime 0.351 (0.400)\tData 0.001 (0.042)\tLoss 1.637 (1.862)\nEpoch: [25][240/400]\tTime 0.354 (0.398)\tData 0.001 (0.040)\tLoss 2.129 (1.877)\nEpoch: [25][250/400]\tTime 0.352 (0.396)\tData 0.001 (0.038)\tLoss 2.527 (1.880)\nEpoch: [25][260/400]\tTime 0.360 (0.395)\tData 0.000 (0.037)\tLoss 2.561 (1.896)\nEpoch: [25][270/400]\tTime 0.352 (0.400)\tData 0.000 (0.042)\tLoss 2.506 (1.904)\nEpoch: [25][280/400]\tTime 0.355 (0.399)\tData 0.000 (0.041)\tLoss 2.811 (1.924)\nEpoch: [25][290/400]\tTime 0.353 (0.397)\tData 0.000 (0.039)\tLoss 2.267 (1.933)\nEpoch: [25][300/400]\tTime 0.353 (0.396)\tData 0.000 (0.038)\tLoss 2.235 (1.938)\nEpoch: [25][310/400]\tTime 0.354 (0.400)\tData 0.001 (0.042)\tLoss 2.424 (1.953)\nEpoch: [25][320/400]\tTime 0.353 (0.399)\tData 0.000 (0.041)\tLoss 1.801 (1.958)\nEpoch: [25][330/400]\tTime 0.350 (0.397)\tData 0.001 (0.040)\tLoss 2.150 (1.963)\nEpoch: [25][340/400]\tTime 0.350 (0.396)\tData 0.001 
(0.039)\tLoss 2.056 (1.966)\nEpoch: [25][350/400]\tTime 0.363 (0.395)\tData 0.000 (0.038)\tLoss 2.046 (1.977)\nEpoch: [25][360/400]\tTime 0.350 (0.399)\tData 0.001 (0.042)\tLoss 2.717 (1.983)\nEpoch: [25][370/400]\tTime 0.350 (0.398)\tData 0.001 (0.040)\tLoss 2.238 (1.985)\nEpoch: [25][380/400]\tTime 0.357 (0.397)\tData 0.001 (0.039)\tLoss 2.029 (1.988)\nEpoch: [25][390/400]\tTime 0.363 (0.396)\tData 0.000 (0.038)\tLoss 2.094 (1.990)\nEpoch: [25][400/400]\tTime 0.352 (0.400)\tData 0.000 (0.042)\tLoss 1.520 (1.993)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.160)\tData 0.000 (0.027)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.398639917373657\n==> Statistics for epoch 26: 717 clusters\nEpoch: [26][10/400]\tTime 0.361 (0.422)\tData 0.001 (0.057)\tLoss 0.278 (0.339)\nEpoch: [26][20/400]\tTime 0.367 (0.389)\tData 0.001 (0.029)\tLoss 0.315 (0.314)\nEpoch: [26][30/400]\tTime 0.371 (0.380)\tData 0.000 (0.019)\tLoss 0.206 (0.292)\nEpoch: [26][40/400]\tTime 0.362 (0.376)\tData 0.000 (0.015)\tLoss 0.139 (0.264)\nEpoch: [26][50/400]\tTime 0.351 (0.410)\tData 0.000 (0.049)\tLoss 2.654 (0.509)\nEpoch: [26][60/400]\tTime 0.354 (0.402)\tData 0.001 (0.041)\tLoss 2.286 (0.826)\nEpoch: [26][70/400]\tTime 0.397 (0.396)\tData 0.001 (0.035)\tLoss 2.926 (1.071)\nEpoch: [26][80/400]\tTime 0.350 (0.390)\tData 0.001 (0.031)\tLoss 2.400 (1.231)\nEpoch: [26][90/400]\tTime 0.349 (0.405)\tData 0.001 (0.046)\tLoss 1.698 (1.342)\nEpoch: [26][100/400]\tTime 0.352 (0.400)\tData 0.001 (0.041)\tLoss 1.819 (1.411)\nEpoch: [26][110/400]\tTime 0.362 (0.397)\tData 0.000 (0.038)\tLoss 1.967 (1.474)\nEpoch: [26][120/400]\tTime 0.368 (0.394)\tData 0.000 (0.034)\tLoss 2.246 (1.522)\nEpoch: [26][130/400]\tTime 0.363 (0.391)\tData 0.000 (0.032)\tLoss 2.159 (1.582)\nEpoch: [26][140/400]\tTime 0.372 (0.401)\tData 0.001 (0.042)\tLoss 1.633 (1.622)\nEpoch: [26][150/400]\tTime 0.354 (0.398)\tData 0.001 (0.039)\tLoss 1.507 (1.668)\nEpoch: 
[26][160/400]\tTime 0.351 (0.395)\tData 0.001 (0.037)\tLoss 1.784 (1.707)\nEpoch: [26][170/400]\tTime 0.363 (0.393)\tData 0.000 (0.035)\tLoss 2.864 (1.736)\nEpoch: [26][180/400]\tTime 0.350 (0.401)\tData 0.000 (0.042)\tLoss 2.127 (1.770)\nEpoch: [26][190/400]\tTime 0.355 (0.398)\tData 0.000 (0.040)\tLoss 2.116 (1.794)\nEpoch: [26][200/400]\tTime 0.353 (0.396)\tData 0.001 (0.038)\tLoss 1.829 (1.818)\nEpoch: [26][210/400]\tTime 0.386 (0.395)\tData 0.001 (0.036)\tLoss 1.824 (1.837)\nEpoch: [26][220/400]\tTime 0.371 (0.393)\tData 0.000 (0.035)\tLoss 1.653 (1.845)\nEpoch: [26][230/400]\tTime 0.353 (0.399)\tData 0.001 (0.040)\tLoss 2.259 (1.859)\nEpoch: [26][240/400]\tTime 0.354 (0.397)\tData 0.001 (0.039)\tLoss 2.017 (1.872)\nEpoch: [26][250/400]\tTime 0.363 (0.396)\tData 0.001 (0.037)\tLoss 2.357 (1.876)\nEpoch: [26][260/400]\tTime 0.364 (0.394)\tData 0.000 (0.036)\tLoss 2.528 (1.889)\nEpoch: [26][270/400]\tTime 0.354 (0.400)\tData 0.001 (0.041)\tLoss 2.455 (1.903)\nEpoch: [26][280/400]\tTime 0.382 (0.398)\tData 0.001 (0.040)\tLoss 1.738 (1.918)\nEpoch: [26][290/400]\tTime 0.354 (0.397)\tData 0.001 (0.038)\tLoss 2.629 (1.925)\nEpoch: [26][300/400]\tTime 0.355 (0.395)\tData 0.001 (0.037)\tLoss 2.672 (1.935)\nEpoch: [26][310/400]\tTime 0.354 (0.400)\tData 0.001 (0.041)\tLoss 2.295 (1.945)\nEpoch: [26][320/400]\tTime 0.362 (0.398)\tData 0.001 (0.040)\tLoss 2.442 (1.954)\nEpoch: [26][330/400]\tTime 0.353 (0.397)\tData 0.001 (0.039)\tLoss 1.867 (1.958)\nEpoch: [26][340/400]\tTime 0.353 (0.396)\tData 0.001 (0.038)\tLoss 2.522 (1.963)\nEpoch: [26][350/400]\tTime 0.362 (0.395)\tData 0.000 (0.037)\tLoss 2.413 (1.967)\nEpoch: [26][360/400]\tTime 0.362 (0.399)\tData 0.000 (0.041)\tLoss 2.700 (1.974)\nEpoch: [26][370/400]\tTime 0.351 (0.398)\tData 0.001 (0.040)\tLoss 1.509 (1.979)\nEpoch: [26][380/400]\tTime 0.362 (0.397)\tData 0.001 (0.039)\tLoss 1.781 (1.984)\nEpoch: [26][390/400]\tTime 0.364 (0.396)\tData 0.000 (0.038)\tLoss 2.561 (1.988)\nEpoch: [26][400/400]\tTime 0.352 
(0.399)	Data 0.000 (0.041)	Loss 2.522 (1.992)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.158)	Data 0.000 (0.026)
Computing jaccard distance...
Jaccard distance computing time cost: 18.381394863128662
==> Statistics for epoch 27: 715 clusters
Epoch: [27][10/400]	Time 0.349 (0.426)	Data 0.000 (0.064)	Loss 0.274 (0.282)
...
Epoch: [27][400/400]	Time 0.363 (0.401)	Data 0.000 (0.042)	Loss 1.736 (1.924)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.163)	Data 0.000 (0.030)
Computing jaccard distance...
Jaccard distance computing time cost: 18.326934814453125
==> Statistics for epoch 28: 716 clusters
Epoch: [28][10/400]	Time 0.351 (0.419)	Data 0.001 (0.062)	Loss 0.284 (0.315)
...
Epoch: [28][400/400]	Time 0.352 (0.400)	Data 0.000 (0.042)	Loss 1.566 (1.893)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.161)	Data 0.000 (0.030)
Computing jaccard distance...
Jaccard distance computing time cost: 18.5582857131958
==> Statistics for epoch 29: 716 clusters
Epoch: [29][10/400]	Time 0.351 (0.406)	Data 0.001 (0.046)	Loss 0.328 (0.281)
...
Epoch: [29][400/400]	Time 0.351 (0.399)	Data 0.000 (0.042)	Loss 2.177 (1.871)
Extract Features: [50/76]	Time 0.133 (0.161)	Data 0.000 (0.028)
Mean AP: 85.5%

 * Finished epoch  29  model mAP: 85.5%  best: 85.5% *

==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.128 (0.163)	Data 0.000 (0.031)
Computing jaccard distance...
Jaccard distance computing time cost: 18.52395009994507
==> Statistics for epoch 30: 710 clusters
Epoch: [30][10/400]	Time 0.361 (0.418)	Data 0.001 (0.059)	Loss 0.307 (0.259)
...
Epoch: [30][400/400]	Time 0.362 (0.400)	Data 0.001 (0.042)	Loss 1.868 (1.843)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.128 (0.160)	Data 0.000 (0.027)
Computing jaccard distance...
Jaccard distance computing time cost: 18.330976486206055
==> Statistics for epoch 31: 711 clusters
Epoch: [31][10/400]	Time 0.362 (0.414)	Data 0.001 (0.048)	Loss 0.196 (0.252)
...
Epoch: [31][400/400]	Time 0.361 (0.401)	Data 0.000 (0.042)	Loss 1.773 (1.827)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.127 (0.161)	Data 0.000 (0.028)
Computing jaccard distance...
Jaccard distance computing time cost: 18.22784423828125
==> Statistics for epoch 32: 710 clusters
Epoch: [32][10/400]	Time 0.359 (0.426)	Data 0.001 (0.063)	Loss 0.265 (0.267)
...
Epoch: [32][400/400]	Time 0.353 (0.398)	Data 0.000 (0.041)	Loss 2.488 (1.799)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.130 (0.156)	Data 0.000 (0.028)
Computing jaccard distance...
Jaccard distance computing time cost: 18.366796493530273
==> Statistics for epoch 33: 709 clusters
Epoch: [33][10/400]	Time 0.361 (0.426)	Data 0.001 (0.063)	Loss 0.194 (0.253)
...
Epoch: [33][400/400]	Time 0.371 (0.400)	Data 0.000 (0.042)	Loss 2.447 (1.777)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.163)	Data 0.000 (0.029)
Computing jaccard distance...
Jaccard distance computing time cost: 18.287232637405396
==> Statistics for epoch 34: 711 clusters
Epoch: [34][10/400]	Time 0.362 (0.422)	Data 0.001 (0.052)	Loss 0.150 (0.241)
...
Epoch: [34][400/400]	Time 0.350 (0.399)	Data 0.001 (0.041)	Loss 2.018 (1.718)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.128 (0.158)	Data 0.000 (0.027)
Computing jaccard distance...
Jaccard distance computing time cost: 18.268587589263916
==> Statistics for epoch 35: 709 clusters
Epoch: [35][10/400]	Time 0.350 (0.401)	Data 0.001 (0.046)	Loss 0.238 (0.250)
...
Epoch: [35][400/400]	Time 0.350 (0.398)	Data 0.000 (0.042)	Loss 1.929 (1.668)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.162)	Data 0.000 (0.031)
Computing jaccard distance...
Jaccard distance computing time cost: 18.193820238113403
==> Statistics for epoch 36: 711 clusters
Epoch: [36][10/400]	Time 0.360 (0.404)	Data 0.001 (0.046)	Loss 0.284 (0.219)
...
Epoch: [36][400/400]	Time 0.348 (0.399)	Data 0.000 (0.043)	Loss 2.014 (1.657)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.129 (0.161)	Data 0.000 (0.029)
Computing jaccard distance...
Jaccard distance computing time cost: 18.401859998703003
==> Statistics for epoch 37: 709 clusters
Epoch: [37][10/400]	Time 0.347 (0.415)	Data 0.001 (0.052)	Loss 0.302 (0.264)
...
Epoch: [37][400/400]	Time 0.341 (0.397)	Data 0.001 (0.042)	Loss 2.433 (1.647)
==> Create pseudo labels for unlabeled data
Extract Features: [50/51]	Time 0.128 (0.159)	Data 0.000 (0.028)
Computing jaccard distance...
Jaccard distance computing time cost: 18.123547554016113
==> Statistics for epoch 38: 708 clusters
Epoch: [38][10/400]	Time 0.350 (0.422)	Data 0.001 (0.063)	Loss 0.191 (0.205)
...
Epoch: [38][90/400]	Time 0.349 (0.404)	Data 0.001 (0.047)	Loss 1.974 (1.062)
Epoch: 
[38][100/400]\tTime 0.407 (0.399)\tData 0.001 (0.042)\tLoss 0.921 (1.100)\nEpoch: [38][110/400]\tTime 0.353 (0.395)\tData 0.001 (0.039)\tLoss 1.614 (1.159)\nEpoch: [38][120/400]\tTime 0.376 (0.392)\tData 0.001 (0.035)\tLoss 1.641 (1.208)\nEpoch: [38][130/400]\tTime 0.360 (0.390)\tData 0.000 (0.033)\tLoss 1.477 (1.228)\nEpoch: [38][140/400]\tTime 0.352 (0.399)\tData 0.000 (0.043)\tLoss 1.564 (1.272)\nEpoch: [38][150/400]\tTime 0.351 (0.396)\tData 0.000 (0.040)\tLoss 2.342 (1.311)\nEpoch: [38][160/400]\tTime 0.361 (0.394)\tData 0.000 (0.037)\tLoss 1.523 (1.338)\nEpoch: [38][170/400]\tTime 0.362 (0.391)\tData 0.000 (0.035)\tLoss 1.103 (1.360)\nEpoch: [38][180/400]\tTime 0.366 (0.400)\tData 0.000 (0.043)\tLoss 2.203 (1.374)\nEpoch: [38][190/400]\tTime 0.352 (0.397)\tData 0.000 (0.041)\tLoss 1.910 (1.395)\nEpoch: [38][200/400]\tTime 0.352 (0.395)\tData 0.001 (0.039)\tLoss 1.841 (1.408)\nEpoch: [38][210/400]\tTime 0.351 (0.393)\tData 0.000 (0.037)\tLoss 1.734 (1.427)\nEpoch: [38][220/400]\tTime 0.362 (0.391)\tData 0.000 (0.035)\tLoss 1.831 (1.447)\nEpoch: [38][230/400]\tTime 0.352 (0.398)\tData 0.001 (0.041)\tLoss 1.554 (1.458)\nEpoch: [38][240/400]\tTime 0.351 (0.396)\tData 0.000 (0.040)\tLoss 2.432 (1.468)\nEpoch: [38][250/400]\tTime 0.378 (0.394)\tData 0.001 (0.038)\tLoss 1.196 (1.474)\nEpoch: [38][260/400]\tTime 0.362 (0.393)\tData 0.000 (0.037)\tLoss 1.838 (1.484)\nEpoch: [38][270/400]\tTime 0.350 (0.398)\tData 0.000 (0.042)\tLoss 1.457 (1.493)\nEpoch: [38][280/400]\tTime 0.361 (0.397)\tData 0.001 (0.041)\tLoss 1.560 (1.504)\nEpoch: [38][290/400]\tTime 0.352 (0.395)\tData 0.001 (0.039)\tLoss 1.937 (1.517)\nEpoch: [38][300/400]\tTime 0.361 (0.394)\tData 0.000 (0.038)\tLoss 2.140 (1.529)\nEpoch: [38][310/400]\tTime 0.349 (0.398)\tData 0.001 (0.042)\tLoss 1.848 (1.535)\nEpoch: [38][320/400]\tTime 0.361 (0.397)\tData 0.001 (0.041)\tLoss 1.352 (1.543)\nEpoch: [38][330/400]\tTime 0.361 (0.396)\tData 0.001 (0.040)\tLoss 1.742 (1.550)\nEpoch: [38][340/400]\tTime 0.360 
(0.395)\tData 0.000 (0.038)\tLoss 1.578 (1.554)\nEpoch: [38][350/400]\tTime 0.362 (0.394)\tData 0.000 (0.037)\tLoss 2.461 (1.568)\nEpoch: [38][360/400]\tTime 0.350 (0.398)\tData 0.000 (0.042)\tLoss 1.616 (1.570)\nEpoch: [38][370/400]\tTime 0.352 (0.397)\tData 0.001 (0.040)\tLoss 2.210 (1.578)\nEpoch: [38][380/400]\tTime 0.359 (0.396)\tData 0.001 (0.039)\tLoss 2.130 (1.583)\nEpoch: [38][390/400]\tTime 0.361 (0.395)\tData 0.000 (0.038)\tLoss 1.376 (1.589)\nEpoch: [38][400/400]\tTime 0.360 (0.398)\tData 0.000 (0.042)\tLoss 1.360 (1.592)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.161)\tData 0.000 (0.028)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 21.01050329208374\n==> Statistics for epoch 39: 711 clusters\nEpoch: [39][10/400]\tTime 0.359 (0.423)\tData 0.001 (0.057)\tLoss 0.258 (0.204)\nEpoch: [39][20/400]\tTime 0.350 (0.387)\tData 0.000 (0.029)\tLoss 0.162 (0.201)\nEpoch: [39][30/400]\tTime 0.352 (0.375)\tData 0.000 (0.019)\tLoss 0.173 (0.196)\nEpoch: [39][40/400]\tTime 0.361 (0.370)\tData 0.000 (0.015)\tLoss 0.104 (0.179)\nEpoch: [39][50/400]\tTime 0.362 (0.406)\tData 0.001 (0.048)\tLoss 1.335 (0.366)\nEpoch: [39][60/400]\tTime 0.351 (0.398)\tData 0.001 (0.040)\tLoss 1.526 (0.599)\nEpoch: [39][70/400]\tTime 0.362 (0.392)\tData 0.001 (0.035)\tLoss 1.863 (0.807)\nEpoch: [39][80/400]\tTime 0.349 (0.387)\tData 0.001 (0.030)\tLoss 1.671 (0.937)\nEpoch: [39][90/400]\tTime 0.361 (0.405)\tData 0.000 (0.048)\tLoss 1.887 (1.042)\nEpoch: [39][100/400]\tTime 0.351 (0.401)\tData 0.001 (0.043)\tLoss 2.026 (1.105)\nEpoch: [39][110/400]\tTime 0.352 (0.396)\tData 0.001 (0.039)\tLoss 2.422 (1.172)\nEpoch: [39][120/400]\tTime 0.353 (0.393)\tData 0.000 (0.036)\tLoss 1.835 (1.202)\nEpoch: [39][130/400]\tTime 0.362 (0.390)\tData 0.000 (0.033)\tLoss 2.023 (1.246)\nEpoch: [39][140/400]\tTime 0.352 (0.402)\tData 0.000 (0.044)\tLoss 1.504 (1.281)\nEpoch: [39][150/400]\tTime 0.353 (0.399)\tData 0.000 (0.041)\tLoss 1.647 
(1.314)\nEpoch: [39][160/400]\tTime 0.350 (0.396)\tData 0.001 (0.039)\tLoss 2.659 (1.358)\nEpoch: [39][170/400]\tTime 0.361 (0.394)\tData 0.000 (0.036)\tLoss 1.229 (1.385)\nEpoch: [39][180/400]\tTime 0.360 (0.401)\tData 0.000 (0.044)\tLoss 2.290 (1.402)\nEpoch: [39][190/400]\tTime 0.360 (0.399)\tData 0.001 (0.041)\tLoss 1.390 (1.428)\nEpoch: [39][200/400]\tTime 0.411 (0.397)\tData 0.001 (0.039)\tLoss 1.557 (1.442)\nEpoch: [39][210/400]\tTime 0.362 (0.395)\tData 0.001 (0.038)\tLoss 1.926 (1.458)\nEpoch: [39][220/400]\tTime 0.361 (0.394)\tData 0.000 (0.036)\tLoss 2.026 (1.467)\nEpoch: [39][230/400]\tTime 0.352 (0.400)\tData 0.001 (0.043)\tLoss 1.330 (1.483)\nEpoch: [39][240/400]\tTime 0.338 (0.398)\tData 0.001 (0.041)\tLoss 1.656 (1.496)\nEpoch: [39][250/400]\tTime 0.355 (0.397)\tData 0.001 (0.039)\tLoss 1.567 (1.501)\nEpoch: [39][260/400]\tTime 0.348 (0.395)\tData 0.000 (0.038)\tLoss 1.718 (1.520)\nEpoch: [39][270/400]\tTime 0.350 (0.400)\tData 0.000 (0.043)\tLoss 1.690 (1.536)\nEpoch: [39][280/400]\tTime 0.351 (0.398)\tData 0.001 (0.041)\tLoss 1.352 (1.539)\nEpoch: [39][290/400]\tTime 0.360 (0.397)\tData 0.001 (0.040)\tLoss 1.725 (1.548)\nEpoch: [39][300/400]\tTime 0.352 (0.396)\tData 0.001 (0.039)\tLoss 1.780 (1.555)\nEpoch: [39][310/400]\tTime 0.350 (0.400)\tData 0.001 (0.043)\tLoss 2.326 (1.564)\nEpoch: [39][320/400]\tTime 0.351 (0.399)\tData 0.000 (0.042)\tLoss 1.868 (1.567)\nEpoch: [39][330/400]\tTime 0.351 (0.398)\tData 0.001 (0.041)\tLoss 1.967 (1.577)\nEpoch: [39][340/400]\tTime 0.351 (0.397)\tData 0.001 (0.039)\tLoss 2.266 (1.582)\nEpoch: [39][350/400]\tTime 0.362 (0.396)\tData 0.000 (0.038)\tLoss 1.948 (1.585)\nEpoch: [39][360/400]\tTime 0.353 (0.399)\tData 0.000 (0.042)\tLoss 1.781 (1.584)\nEpoch: [39][370/400]\tTime 0.351 (0.398)\tData 0.001 (0.041)\tLoss 2.013 (1.590)\nEpoch: [39][380/400]\tTime 0.352 (0.397)\tData 0.001 (0.040)\tLoss 1.505 (1.597)\nEpoch: [39][390/400]\tTime 0.364 (0.396)\tData 0.000 (0.039)\tLoss 1.749 (1.606)\nEpoch: 
[39][400/400]\tTime 0.361 (0.400)\tData 0.000 (0.043)\tLoss 1.780 (1.611)\nExtract Features: [50/76]\tTime 0.134 (0.159)\tData 0.000 (0.026)\t\nMean AP: 86.8%\n\n * Finished epoch  39  model mAP: 86.8%  best: 86.8% *\n\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.165)\tData 0.000 (0.032)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.143017053604126\n==> Statistics for epoch 40: 707 clusters\nEpoch: [40][10/400]\tTime 0.348 (0.417)\tData 0.001 (0.057)\tLoss 0.124 (0.206)\nEpoch: [40][20/400]\tTime 0.350 (0.387)\tData 0.000 (0.029)\tLoss 0.184 (0.197)\nEpoch: [40][30/400]\tTime 0.350 (0.375)\tData 0.000 (0.019)\tLoss 0.083 (0.185)\nEpoch: [40][40/400]\tTime 0.360 (0.370)\tData 0.000 (0.015)\tLoss 0.100 (0.172)\nEpoch: [40][50/400]\tTime 0.380 (0.405)\tData 0.001 (0.047)\tLoss 1.776 (0.368)\nEpoch: [40][60/400]\tTime 0.351 (0.396)\tData 0.001 (0.039)\tLoss 1.513 (0.628)\nEpoch: [40][70/400]\tTime 0.351 (0.390)\tData 0.001 (0.033)\tLoss 2.163 (0.824)\nEpoch: [40][80/400]\tTime 0.349 (0.386)\tData 0.001 (0.029)\tLoss 1.906 (0.953)\nEpoch: [40][90/400]\tTime 0.351 (0.403)\tData 0.001 (0.046)\tLoss 1.463 (1.035)\nEpoch: [40][100/400]\tTime 0.362 (0.398)\tData 0.001 (0.041)\tLoss 1.973 (1.095)\nEpoch: [40][110/400]\tTime 0.352 (0.395)\tData 0.001 (0.038)\tLoss 1.685 (1.145)\nEpoch: [40][120/400]\tTime 0.352 (0.392)\tData 0.001 (0.034)\tLoss 1.609 (1.174)\nEpoch: [40][130/400]\tTime 0.362 (0.390)\tData 0.000 (0.032)\tLoss 0.893 (1.207)\nEpoch: [40][140/400]\tTime 0.352 (0.400)\tData 0.001 (0.042)\tLoss 1.766 (1.238)\nEpoch: [40][150/400]\tTime 0.363 (0.397)\tData 0.000 (0.039)\tLoss 1.848 (1.271)\nEpoch: [40][160/400]\tTime 0.351 (0.394)\tData 0.001 (0.037)\tLoss 1.326 (1.304)\nEpoch: [40][170/400]\tTime 0.362 (0.392)\tData 0.000 (0.035)\tLoss 1.707 (1.325)\nEpoch: [40][180/400]\tTime 0.361 (0.400)\tData 0.000 (0.042)\tLoss 2.184 (1.337)\nEpoch: [40][190/400]\tTime 0.363 (0.398)\tData 0.001 (0.040)\tLoss 
1.782 (1.355)\nEpoch: [40][200/400]\tTime 0.354 (0.396)\tData 0.001 (0.038)\tLoss 1.454 (1.370)\nEpoch: [40][210/400]\tTime 0.364 (0.394)\tData 0.001 (0.037)\tLoss 1.549 (1.378)\nEpoch: [40][220/400]\tTime 0.361 (0.393)\tData 0.000 (0.035)\tLoss 1.658 (1.394)\nEpoch: [40][230/400]\tTime 0.362 (0.399)\tData 0.001 (0.041)\tLoss 1.838 (1.413)\nEpoch: [40][240/400]\tTime 0.350 (0.397)\tData 0.001 (0.039)\tLoss 1.166 (1.419)\nEpoch: [40][250/400]\tTime 0.363 (0.395)\tData 0.001 (0.038)\tLoss 0.982 (1.425)\nEpoch: [40][260/400]\tTime 0.364 (0.394)\tData 0.000 (0.036)\tLoss 1.744 (1.432)\nEpoch: [40][270/400]\tTime 0.353 (0.400)\tData 0.001 (0.042)\tLoss 2.232 (1.445)\nEpoch: [40][280/400]\tTime 0.361 (0.398)\tData 0.001 (0.040)\tLoss 1.813 (1.449)\nEpoch: [40][290/400]\tTime 0.363 (0.397)\tData 0.001 (0.039)\tLoss 1.564 (1.454)\nEpoch: [40][300/400]\tTime 0.374 (0.396)\tData 0.001 (0.038)\tLoss 1.757 (1.464)\nEpoch: [40][310/400]\tTime 0.347 (0.401)\tData 0.001 (0.042)\tLoss 1.468 (1.465)\nEpoch: [40][320/400]\tTime 0.354 (0.399)\tData 0.001 (0.041)\tLoss 1.230 (1.465)\nEpoch: [40][330/400]\tTime 0.353 (0.398)\tData 0.001 (0.040)\tLoss 2.178 (1.471)\nEpoch: [40][340/400]\tTime 0.362 (0.397)\tData 0.001 (0.038)\tLoss 1.000 (1.472)\nEpoch: [40][350/400]\tTime 0.363 (0.396)\tData 0.000 (0.037)\tLoss 2.050 (1.479)\nEpoch: [40][360/400]\tTime 0.353 (0.399)\tData 0.000 (0.041)\tLoss 1.524 (1.482)\nEpoch: [40][370/400]\tTime 0.362 (0.398)\tData 0.001 (0.040)\tLoss 1.736 (1.485)\nEpoch: [40][380/400]\tTime 0.353 (0.397)\tData 0.000 (0.039)\tLoss 1.418 (1.496)\nEpoch: [40][390/400]\tTime 0.363 (0.396)\tData 0.000 (0.038)\tLoss 1.328 (1.498)\nEpoch: [40][400/400]\tTime 0.351 (0.400)\tData 0.000 (0.042)\tLoss 1.822 (1.500)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.159)\tData 0.000 (0.026)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.482816457748413\n==> Statistics for epoch 41: 710 clusters\nEpoch: 
[41][10/400]\tTime 0.350 (0.405)\tData 0.000 (0.046)\tLoss 0.172 (0.208)\nEpoch: [41][20/400]\tTime 0.351 (0.378)\tData 0.000 (0.023)\tLoss 0.214 (0.187)\nEpoch: [41][30/400]\tTime 0.350 (0.371)\tData 0.000 (0.016)\tLoss 0.135 (0.180)\nEpoch: [41][40/400]\tTime 0.361 (0.366)\tData 0.000 (0.012)\tLoss 0.119 (0.171)\nEpoch: [41][50/400]\tTime 0.355 (0.400)\tData 0.000 (0.044)\tLoss 1.536 (0.341)\nEpoch: [41][60/400]\tTime 0.350 (0.392)\tData 0.000 (0.037)\tLoss 1.437 (0.581)\nEpoch: [41][70/400]\tTime 0.362 (0.386)\tData 0.000 (0.032)\tLoss 1.328 (0.758)\nEpoch: [41][80/400]\tTime 0.350 (0.383)\tData 0.001 (0.028)\tLoss 1.718 (0.879)\nEpoch: [41][90/400]\tTime 0.374 (0.401)\tData 0.001 (0.045)\tLoss 2.029 (0.992)\nEpoch: [41][100/400]\tTime 0.352 (0.397)\tData 0.001 (0.040)\tLoss 1.179 (1.076)\nEpoch: [41][110/400]\tTime 0.397 (0.393)\tData 0.001 (0.037)\tLoss 1.798 (1.127)\nEpoch: [41][120/400]\tTime 0.353 (0.390)\tData 0.001 (0.034)\tLoss 1.256 (1.164)\nEpoch: [41][130/400]\tTime 0.369 (0.388)\tData 0.000 (0.031)\tLoss 1.696 (1.195)\nEpoch: [41][140/400]\tTime 0.354 (0.398)\tData 0.000 (0.041)\tLoss 1.644 (1.228)\nEpoch: [41][150/400]\tTime 0.353 (0.395)\tData 0.001 (0.038)\tLoss 1.341 (1.247)\nEpoch: [41][160/400]\tTime 0.354 (0.393)\tData 0.001 (0.036)\tLoss 1.449 (1.266)\nEpoch: [41][170/400]\tTime 0.364 (0.391)\tData 0.000 (0.034)\tLoss 1.767 (1.280)\nEpoch: [41][180/400]\tTime 0.363 (0.400)\tData 0.001 (0.042)\tLoss 1.540 (1.295)\nEpoch: [41][190/400]\tTime 0.356 (0.398)\tData 0.001 (0.040)\tLoss 1.829 (1.315)\nEpoch: [41][200/400]\tTime 0.354 (0.395)\tData 0.001 (0.038)\tLoss 2.040 (1.340)\nEpoch: [41][210/400]\tTime 0.353 (0.394)\tData 0.001 (0.036)\tLoss 1.267 (1.351)\nEpoch: [41][220/400]\tTime 0.359 (0.392)\tData 0.000 (0.034)\tLoss 1.565 (1.362)\nEpoch: [41][230/400]\tTime 0.357 (0.398)\tData 0.001 (0.041)\tLoss 1.555 (1.368)\nEpoch: [41][240/400]\tTime 0.351 (0.397)\tData 0.001 (0.039)\tLoss 1.458 (1.377)\nEpoch: [41][250/400]\tTime 0.355 (0.395)\tData 
0.001 (0.038)\tLoss 1.978 (1.390)\nEpoch: [41][260/400]\tTime 0.364 (0.393)\tData 0.000 (0.036)\tLoss 1.634 (1.397)\nEpoch: [41][270/400]\tTime 0.353 (0.399)\tData 0.001 (0.041)\tLoss 1.419 (1.397)\nEpoch: [41][280/400]\tTime 0.352 (0.397)\tData 0.001 (0.040)\tLoss 1.578 (1.402)\nEpoch: [41][290/400]\tTime 0.353 (0.396)\tData 0.001 (0.038)\tLoss 1.037 (1.407)\nEpoch: [41][300/400]\tTime 0.354 (0.394)\tData 0.001 (0.037)\tLoss 1.339 (1.418)\nEpoch: [41][310/400]\tTime 0.414 (0.399)\tData 0.001 (0.041)\tLoss 1.368 (1.424)\nEpoch: [41][320/400]\tTime 0.362 (0.398)\tData 0.001 (0.040)\tLoss 1.917 (1.428)\nEpoch: [41][330/400]\tTime 0.363 (0.397)\tData 0.001 (0.039)\tLoss 1.997 (1.442)\nEpoch: [41][340/400]\tTime 0.353 (0.396)\tData 0.001 (0.038)\tLoss 2.095 (1.454)\nEpoch: [41][350/400]\tTime 0.365 (0.395)\tData 0.000 (0.037)\tLoss 1.098 (1.453)\nEpoch: [41][360/400]\tTime 0.353 (0.399)\tData 0.001 (0.041)\tLoss 1.919 (1.456)\nEpoch: [41][370/400]\tTime 0.355 (0.398)\tData 0.001 (0.039)\tLoss 1.852 (1.460)\nEpoch: [41][380/400]\tTime 0.351 (0.396)\tData 0.001 (0.038)\tLoss 1.355 (1.464)\nEpoch: [41][390/400]\tTime 0.364 (0.395)\tData 0.000 (0.037)\tLoss 1.235 (1.468)\nEpoch: [41][400/400]\tTime 0.352 (0.399)\tData 0.001 (0.041)\tLoss 1.871 (1.472)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.163)\tData 0.000 (0.030)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.165592432022095\n==> Statistics for epoch 42: 710 clusters\nEpoch: [42][10/400]\tTime 0.351 (0.403)\tData 0.001 (0.046)\tLoss 0.131 (0.209)\nEpoch: [42][20/400]\tTime 0.350 (0.377)\tData 0.001 (0.023)\tLoss 0.079 (0.190)\nEpoch: [42][30/400]\tTime 0.352 (0.369)\tData 0.001 (0.016)\tLoss 0.111 (0.185)\nEpoch: [42][40/400]\tTime 0.362 (0.366)\tData 0.000 (0.012)\tLoss 0.156 (0.177)\nEpoch: [42][50/400]\tTime 0.352 (0.397)\tData 0.001 (0.042)\tLoss 1.428 (0.371)\nEpoch: [42][60/400]\tTime 0.406 (0.390)\tData 0.001 (0.035)\tLoss 1.790 
(0.603)\nEpoch: [42][70/400]\tTime 0.352 (0.385)\tData 0.001 (0.030)\tLoss 1.906 (0.781)\nEpoch: [42][80/400]\tTime 0.351 (0.381)\tData 0.000 (0.027)\tLoss 2.170 (0.908)\nEpoch: [42][90/400]\tTime 0.374 (0.398)\tData 0.001 (0.042)\tLoss 1.977 (1.024)\nEpoch: [42][100/400]\tTime 0.362 (0.394)\tData 0.001 (0.038)\tLoss 2.042 (1.082)\nEpoch: [42][110/400]\tTime 0.362 (0.391)\tData 0.001 (0.035)\tLoss 1.651 (1.124)\nEpoch: [42][120/400]\tTime 0.353 (0.388)\tData 0.000 (0.032)\tLoss 1.648 (1.170)\nEpoch: [42][130/400]\tTime 0.360 (0.385)\tData 0.000 (0.030)\tLoss 1.396 (1.199)\nEpoch: [42][140/400]\tTime 0.352 (0.397)\tData 0.000 (0.040)\tLoss 1.425 (1.234)\nEpoch: [42][150/400]\tTime 0.352 (0.394)\tData 0.001 (0.038)\tLoss 1.445 (1.253)\nEpoch: [42][160/400]\tTime 0.351 (0.391)\tData 0.000 (0.035)\tLoss 1.912 (1.262)\nEpoch: [42][170/400]\tTime 0.355 (0.389)\tData 0.000 (0.033)\tLoss 1.455 (1.283)\nEpoch: [42][180/400]\tTime 0.350 (0.397)\tData 0.000 (0.041)\tLoss 1.667 (1.300)\nEpoch: [42][190/400]\tTime 0.350 (0.395)\tData 0.001 (0.039)\tLoss 1.394 (1.316)\nEpoch: [42][200/400]\tTime 0.352 (0.393)\tData 0.001 (0.037)\tLoss 1.452 (1.334)\nEpoch: [42][210/400]\tTime 0.352 (0.391)\tData 0.001 (0.035)\tLoss 1.692 (1.346)\nEpoch: [42][220/400]\tTime 0.362 (0.390)\tData 0.000 (0.034)\tLoss 1.589 (1.358)\nEpoch: [42][230/400]\tTime 0.351 (0.396)\tData 0.001 (0.040)\tLoss 1.390 (1.371)\nEpoch: [42][240/400]\tTime 0.354 (0.394)\tData 0.000 (0.038)\tLoss 1.540 (1.383)\nEpoch: [42][250/400]\tTime 0.351 (0.392)\tData 0.001 (0.037)\tLoss 1.383 (1.396)\nEpoch: [42][260/400]\tTime 0.363 (0.391)\tData 0.000 (0.035)\tLoss 1.978 (1.408)\nEpoch: [42][270/400]\tTime 0.351 (0.397)\tData 0.001 (0.041)\tLoss 1.521 (1.420)\nEpoch: [42][280/400]\tTime 0.352 (0.395)\tData 0.001 (0.039)\tLoss 1.836 (1.433)\nEpoch: [42][290/400]\tTime 0.352 (0.394)\tData 0.001 (0.038)\tLoss 1.459 (1.442)\nEpoch: [42][300/400]\tTime 0.352 (0.393)\tData 0.001 (0.037)\tLoss 1.829 (1.448)\nEpoch: 
[42][310/400]\tTime 0.350 (0.397)\tData 0.001 (0.041)\tLoss 1.895 (1.453)\nEpoch: [42][320/400]\tTime 0.353 (0.396)\tData 0.000 (0.040)\tLoss 1.434 (1.459)\nEpoch: [42][330/400]\tTime 0.363 (0.395)\tData 0.001 (0.039)\tLoss 1.398 (1.461)\nEpoch: [42][340/400]\tTime 0.352 (0.394)\tData 0.000 (0.038)\tLoss 1.933 (1.464)\nEpoch: [42][350/400]\tTime 0.363 (0.393)\tData 0.000 (0.037)\tLoss 1.301 (1.468)\nEpoch: [42][360/400]\tTime 0.354 (0.397)\tData 0.001 (0.041)\tLoss 1.466 (1.469)\nEpoch: [42][370/400]\tTime 0.348 (0.396)\tData 0.001 (0.040)\tLoss 2.005 (1.477)\nEpoch: [42][380/400]\tTime 0.351 (0.395)\tData 0.001 (0.039)\tLoss 2.237 (1.481)\nEpoch: [42][390/400]\tTime 0.363 (0.394)\tData 0.000 (0.038)\tLoss 1.526 (1.485)\nEpoch: [42][400/400]\tTime 0.351 (0.397)\tData 0.000 (0.041)\tLoss 1.959 (1.495)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.160)\tData 0.000 (0.028)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.162556648254395\n==> Statistics for epoch 43: 707 clusters\nEpoch: [43][10/400]\tTime 0.350 (0.415)\tData 0.001 (0.051)\tLoss 0.180 (0.211)\nEpoch: [43][20/400]\tTime 0.362 (0.386)\tData 0.001 (0.026)\tLoss 0.165 (0.194)\nEpoch: [43][30/400]\tTime 0.350 (0.375)\tData 0.001 (0.017)\tLoss 0.104 (0.188)\nEpoch: [43][40/400]\tTime 0.362 (0.370)\tData 0.000 (0.013)\tLoss 0.180 (0.183)\nEpoch: [43][50/400]\tTime 0.361 (0.405)\tData 0.000 (0.047)\tLoss 1.803 (0.381)\nEpoch: [43][60/400]\tTime 0.348 (0.398)\tData 0.000 (0.040)\tLoss 2.111 (0.618)\nEpoch: [43][70/400]\tTime 0.362 (0.392)\tData 0.001 (0.034)\tLoss 2.024 (0.790)\nEpoch: [43][80/400]\tTime 0.353 (0.388)\tData 0.000 (0.030)\tLoss 1.710 (0.911)\nEpoch: [43][90/400]\tTime 0.349 (0.405)\tData 0.001 (0.046)\tLoss 1.285 (1.008)\nEpoch: [43][100/400]\tTime 0.354 (0.400)\tData 0.000 (0.042)\tLoss 1.357 (1.059)\nEpoch: [43][110/400]\tTime 0.363 (0.396)\tData 0.001 (0.038)\tLoss 1.604 (1.109)\nEpoch: [43][120/400]\tTime 0.354 (0.393)\tData 
0.000 (0.035)\tLoss 1.543 (1.151)\nEpoch: [43][130/400]\tTime 0.363 (0.390)\tData 0.000 (0.032)\tLoss 1.795 (1.179)\nEpoch: [43][140/400]\tTime 0.351 (0.400)\tData 0.001 (0.042)\tLoss 1.096 (1.210)\nEpoch: [43][150/400]\tTime 0.351 (0.397)\tData 0.001 (0.039)\tLoss 1.239 (1.239)\nEpoch: [43][160/400]\tTime 0.353 (0.394)\tData 0.000 (0.037)\tLoss 2.171 (1.260)\nEpoch: [43][170/400]\tTime 0.363 (0.392)\tData 0.000 (0.034)\tLoss 1.504 (1.283)\nEpoch: [43][180/400]\tTime 0.351 (0.401)\tData 0.001 (0.043)\tLoss 1.284 (1.303)\nEpoch: [43][190/400]\tTime 0.353 (0.398)\tData 0.001 (0.041)\tLoss 1.322 (1.317)\nEpoch: [43][200/400]\tTime 0.353 (0.396)\tData 0.001 (0.039)\tLoss 1.440 (1.329)\nEpoch: [43][210/400]\tTime 0.352 (0.394)\tData 0.001 (0.037)\tLoss 1.088 (1.333)\nEpoch: [43][220/400]\tTime 0.369 (0.393)\tData 0.000 (0.035)\tLoss 1.715 (1.350)\nEpoch: [43][230/400]\tTime 0.362 (0.399)\tData 0.001 (0.042)\tLoss 1.376 (1.363)\nEpoch: [43][240/400]\tTime 0.352 (0.398)\tData 0.000 (0.040)\tLoss 2.252 (1.370)\nEpoch: [43][250/400]\tTime 0.354 (0.396)\tData 0.001 (0.039)\tLoss 1.209 (1.380)\nEpoch: [43][260/400]\tTime 0.364 (0.394)\tData 0.000 (0.037)\tLoss 2.110 (1.398)\nEpoch: [43][270/400]\tTime 0.353 (0.400)\tData 0.001 (0.042)\tLoss 1.596 (1.408)\nEpoch: [43][280/400]\tTime 0.351 (0.399)\tData 0.001 (0.041)\tLoss 1.804 (1.415)\nEpoch: [43][290/400]\tTime 0.354 (0.397)\tData 0.001 (0.040)\tLoss 1.572 (1.424)\nEpoch: [43][300/400]\tTime 0.351 (0.396)\tData 0.001 (0.038)\tLoss 2.234 (1.432)\nEpoch: [43][310/400]\tTime 0.349 (0.400)\tData 0.001 (0.043)\tLoss 2.016 (1.439)\nEpoch: [43][320/400]\tTime 0.351 (0.399)\tData 0.000 (0.041)\tLoss 1.284 (1.437)\nEpoch: [43][330/400]\tTime 0.351 (0.397)\tData 0.000 (0.040)\tLoss 1.405 (1.444)\nEpoch: [43][340/400]\tTime 0.352 (0.396)\tData 0.000 (0.039)\tLoss 0.942 (1.445)\nEpoch: [43][350/400]\tTime 0.359 (0.395)\tData 0.000 (0.038)\tLoss 1.682 (1.448)\nEpoch: [43][360/400]\tTime 0.354 (0.399)\tData 0.000 (0.041)\tLoss 1.433 
(1.452)\nEpoch: [43][370/400]\tTime 0.374 (0.397)\tData 0.001 (0.040)\tLoss 1.473 (1.455)\nEpoch: [43][380/400]\tTime 0.351 (0.396)\tData 0.001 (0.039)\tLoss 1.421 (1.462)\nEpoch: [43][390/400]\tTime 0.364 (0.395)\tData 0.000 (0.038)\tLoss 1.107 (1.467)\nEpoch: [43][400/400]\tTime 0.352 (0.399)\tData 0.000 (0.042)\tLoss 1.955 (1.477)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.128 (0.162)\tData 0.000 (0.029)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.062583684921265\n==> Statistics for epoch 44: 710 clusters\nEpoch: [44][10/400]\tTime 0.351 (0.416)\tData 0.001 (0.058)\tLoss 0.100 (0.171)\nEpoch: [44][20/400]\tTime 0.349 (0.386)\tData 0.001 (0.029)\tLoss 0.175 (0.184)\nEpoch: [44][30/400]\tTime 0.349 (0.375)\tData 0.001 (0.020)\tLoss 0.096 (0.175)\nEpoch: [44][40/400]\tTime 0.362 (0.371)\tData 0.000 (0.015)\tLoss 0.086 (0.161)\nEpoch: [44][50/400]\tTime 0.361 (0.403)\tData 0.001 (0.045)\tLoss 1.622 (0.371)\nEpoch: [44][60/400]\tTime 0.353 (0.394)\tData 0.000 (0.038)\tLoss 1.611 (0.603)\nEpoch: [44][70/400]\tTime 0.361 (0.389)\tData 0.001 (0.033)\tLoss 2.232 (0.758)\nEpoch: [44][80/400]\tTime 0.354 (0.384)\tData 0.000 (0.029)\tLoss 1.322 (0.880)\nEpoch: [44][90/400]\tTime 0.375 (0.404)\tData 0.001 (0.047)\tLoss 1.389 (0.979)\nEpoch: [44][100/400]\tTime 0.350 (0.399)\tData 0.001 (0.042)\tLoss 1.671 (1.035)\nEpoch: [44][110/400]\tTime 0.352 (0.395)\tData 0.001 (0.038)\tLoss 1.356 (1.089)\nEpoch: [44][120/400]\tTime 0.352 (0.391)\tData 0.001 (0.035)\tLoss 0.993 (1.121)\nEpoch: [44][130/400]\tTime 0.363 (0.389)\tData 0.000 (0.033)\tLoss 1.272 (1.156)\nEpoch: [44][140/400]\tTime 0.350 (0.400)\tData 0.001 (0.043)\tLoss 1.814 (1.187)\nEpoch: [44][150/400]\tTime 0.353 (0.397)\tData 0.001 (0.040)\tLoss 1.670 (1.219)\nEpoch: [44][160/400]\tTime 0.354 (0.394)\tData 0.000 (0.038)\tLoss 1.312 (1.232)\nEpoch: [44][170/400]\tTime 0.362 (0.392)\tData 0.000 (0.036)\tLoss 1.745 (1.254)\nEpoch: [44][180/400]\tTime 
0.350 (0.401)\tData 0.000 (0.044)\tLoss 2.123 (1.275)\nEpoch: [44][190/400]\tTime 0.362 (0.399)\tData 0.001 (0.041)\tLoss 1.214 (1.291)\nEpoch: [44][200/400]\tTime 0.379 (0.396)\tData 0.001 (0.039)\tLoss 1.859 (1.312)\nEpoch: [44][210/400]\tTime 0.362 (0.394)\tData 0.001 (0.038)\tLoss 1.947 (1.331)\nEpoch: [44][220/400]\tTime 0.362 (0.393)\tData 0.000 (0.036)\tLoss 1.031 (1.338)\nEpoch: [44][230/400]\tTime 0.350 (0.399)\tData 0.001 (0.042)\tLoss 1.748 (1.343)\nEpoch: [44][240/400]\tTime 0.359 (0.397)\tData 0.001 (0.040)\tLoss 1.870 (1.358)\nEpoch: [44][250/400]\tTime 0.350 (0.396)\tData 0.001 (0.038)\tLoss 1.781 (1.368)\nEpoch: [44][260/400]\tTime 0.362 (0.394)\tData 0.000 (0.037)\tLoss 1.317 (1.380)\nEpoch: [44][270/400]\tTime 0.351 (0.399)\tData 0.001 (0.042)\tLoss 1.718 (1.392)\nEpoch: [44][280/400]\tTime 0.374 (0.398)\tData 0.001 (0.041)\tLoss 1.636 (1.401)\nEpoch: [44][290/400]\tTime 0.352 (0.396)\tData 0.001 (0.039)\tLoss 1.231 (1.410)\nEpoch: [44][300/400]\tTime 0.349 (0.395)\tData 0.001 (0.038)\tLoss 1.921 (1.413)\nEpoch: [44][310/400]\tTime 0.361 (0.400)\tData 0.001 (0.042)\tLoss 1.383 (1.420)\nEpoch: [44][320/400]\tTime 0.350 (0.399)\tData 0.001 (0.041)\tLoss 1.726 (1.432)\nEpoch: [44][330/400]\tTime 0.402 (0.397)\tData 0.000 (0.040)\tLoss 1.563 (1.437)\nEpoch: [44][340/400]\tTime 0.352 (0.396)\tData 0.001 (0.039)\tLoss 1.032 (1.444)\nEpoch: [44][350/400]\tTime 0.362 (0.395)\tData 0.000 (0.038)\tLoss 1.573 (1.448)\nEpoch: [44][360/400]\tTime 0.351 (0.398)\tData 0.000 (0.041)\tLoss 1.933 (1.449)\nEpoch: [44][370/400]\tTime 0.361 (0.397)\tData 0.001 (0.040)\tLoss 1.482 (1.451)\nEpoch: [44][380/400]\tTime 0.362 (0.396)\tData 0.000 (0.039)\tLoss 1.352 (1.454)\nEpoch: [44][390/400]\tTime 0.361 (0.395)\tData 0.000 (0.038)\tLoss 1.528 (1.458)\nEpoch: [44][400/400]\tTime 0.350 (0.398)\tData 0.001 (0.041)\tLoss 1.346 (1.459)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.161)\tData 0.000 (0.029)\t\nComputing jaccard 
distance...\nJaccard distance computing time cost: 21.04724669456482\n==> Statistics for epoch 45: 711 clusters\nEpoch: [45][10/400]\tTime 0.349 (0.424)\tData 0.001 (0.063)\tLoss 0.153 (0.183)\nEpoch: [45][20/400]\tTime 0.350 (0.388)\tData 0.000 (0.032)\tLoss 0.216 (0.170)\nEpoch: [45][30/400]\tTime 0.361 (0.376)\tData 0.001 (0.021)\tLoss 0.097 (0.163)\nEpoch: [45][40/400]\tTime 0.361 (0.371)\tData 0.000 (0.016)\tLoss 0.113 (0.162)\nEpoch: [45][50/400]\tTime 0.352 (0.403)\tData 0.000 (0.048)\tLoss 1.543 (0.351)\nEpoch: [45][60/400]\tTime 0.350 (0.395)\tData 0.001 (0.040)\tLoss 2.020 (0.570)\nEpoch: [45][70/400]\tTime 0.354 (0.389)\tData 0.001 (0.034)\tLoss 1.971 (0.753)\nEpoch: [45][80/400]\tTime 0.350 (0.384)\tData 0.000 (0.030)\tLoss 1.234 (0.868)\nEpoch: [45][90/400]\tTime 0.342 (0.401)\tData 0.001 (0.046)\tLoss 1.229 (0.964)\nEpoch: [45][100/400]\tTime 0.353 (0.397)\tData 0.000 (0.042)\tLoss 1.315 (1.020)\nEpoch: [45][110/400]\tTime 0.363 (0.393)\tData 0.000 (0.038)\tLoss 1.671 (1.056)\nEpoch: [45][120/400]\tTime 0.353 (0.390)\tData 0.001 (0.035)\tLoss 1.119 (1.102)\nEpoch: [45][130/400]\tTime 0.363 (0.388)\tData 0.000 (0.032)\tLoss 1.328 (1.139)\nEpoch: [45][140/400]\tTime 0.353 (0.399)\tData 0.001 (0.043)\tLoss 0.924 (1.176)\nEpoch: [45][150/400]\tTime 0.353 (0.396)\tData 0.001 (0.040)\tLoss 1.637 (1.204)\nEpoch: [45][160/400]\tTime 0.369 (0.394)\tData 0.001 (0.038)\tLoss 1.266 (1.219)\nEpoch: [45][170/400]\tTime 0.364 (0.391)\tData 0.000 (0.036)\tLoss 1.894 (1.242)\nEpoch: [45][180/400]\tTime 0.350 (0.399)\tData 0.000 (0.043)\tLoss 1.150 (1.257)\nEpoch: [45][190/400]\tTime 0.352 (0.397)\tData 0.001 (0.041)\tLoss 2.173 (1.273)\nEpoch: [45][200/400]\tTime 0.351 (0.394)\tData 0.001 (0.039)\tLoss 1.493 (1.288)\nEpoch: [45][210/400]\tTime 0.353 (0.392)\tData 0.001 (0.037)\tLoss 1.505 (1.299)\nEpoch: [45][220/400]\tTime 0.363 (0.391)\tData 0.000 (0.035)\tLoss 1.737 (1.314)\nEpoch: [45][230/400]\tTime 0.368 (0.397)\tData 0.001 (0.041)\tLoss 2.321 (1.332)\nEpoch: 
[45][240/400]\tTime 0.353 (0.395)\tData 0.000 (0.039)\tLoss 1.704 (1.346)\nEpoch: [45][250/400]\tTime 0.364 (0.394)\tData 0.001 (0.038)\tLoss 1.752 (1.354)\nEpoch: [45][260/400]\tTime 0.365 (0.392)\tData 0.000 (0.036)\tLoss 2.220 (1.366)\nEpoch: [45][270/400]\tTime 0.351 (0.398)\tData 0.000 (0.042)\tLoss 1.526 (1.373)\nEpoch: [45][280/400]\tTime 0.365 (0.397)\tData 0.000 (0.040)\tLoss 1.280 (1.377)\nEpoch: [45][290/400]\tTime 0.352 (0.395)\tData 0.000 (0.039)\tLoss 1.631 (1.381)\nEpoch: [45][300/400]\tTime 0.374 (0.394)\tData 0.001 (0.038)\tLoss 1.594 (1.390)\nEpoch: [45][310/400]\tTime 0.377 (0.399)\tData 0.001 (0.042)\tLoss 1.777 (1.398)\nEpoch: [45][320/400]\tTime 0.352 (0.398)\tData 0.000 (0.041)\tLoss 1.706 (1.402)\nEpoch: [45][330/400]\tTime 0.351 (0.397)\tData 0.001 (0.040)\tLoss 1.670 (1.403)\nEpoch: [45][340/400]\tTime 0.351 (0.395)\tData 0.001 (0.039)\tLoss 1.918 (1.414)\nEpoch: [45][350/400]\tTime 0.365 (0.394)\tData 0.000 (0.038)\tLoss 2.267 (1.425)\nEpoch: [45][360/400]\tTime 0.355 (0.398)\tData 0.000 (0.041)\tLoss 1.025 (1.431)\nEpoch: [45][370/400]\tTime 0.350 (0.397)\tData 0.001 (0.040)\tLoss 1.433 (1.431)\nEpoch: [45][380/400]\tTime 0.370 (0.396)\tData 0.001 (0.039)\tLoss 1.009 (1.433)\nEpoch: [45][390/400]\tTime 0.363 (0.395)\tData 0.000 (0.038)\tLoss 1.035 (1.432)\nEpoch: [45][400/400]\tTime 0.348 (0.399)\tData 0.000 (0.042)\tLoss 1.340 (1.434)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.158)\tData 0.000 (0.027)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.08151888847351\n==> Statistics for epoch 46: 711 clusters\nEpoch: [46][10/400]\tTime 0.399 (0.430)\tData 0.001 (0.066)\tLoss 0.155 (0.199)\nEpoch: [46][20/400]\tTime 0.351 (0.391)\tData 0.001 (0.033)\tLoss 0.134 (0.200)\nEpoch: [46][30/400]\tTime 0.351 (0.378)\tData 0.001 (0.022)\tLoss 0.098 (0.187)\nEpoch: [46][40/400]\tTime 0.363 (0.372)\tData 0.000 (0.017)\tLoss 0.085 (0.180)\nEpoch: [46][50/400]\tTime 0.352 
(0.407)\tData 0.001 (0.051)\tLoss 1.418 (0.354)\nEpoch: [46][60/400]\tTime 0.350 (0.398)\tData 0.001 (0.043)\tLoss 1.741 (0.585)\nEpoch: [46][70/400]\tTime 0.354 (0.392)\tData 0.001 (0.037)\tLoss 1.096 (0.742)\nEpoch: [46][80/400]\tTime 0.342 (0.387)\tData 0.001 (0.032)\tLoss 1.535 (0.850)\nEpoch: [46][90/400]\tTime 0.353 (0.404)\tData 0.000 (0.049)\tLoss 1.479 (0.952)\nEpoch: [46][100/400]\tTime 0.351 (0.400)\tData 0.001 (0.044)\tLoss 1.431 (1.020)\nEpoch: [46][110/400]\tTime 0.363 (0.396)\tData 0.001 (0.040)\tLoss 1.496 (1.060)\nEpoch: [46][120/400]\tTime 0.355 (0.392)\tData 0.001 (0.037)\tLoss 1.418 (1.109)\nEpoch: [46][130/400]\tTime 0.362 (0.390)\tData 0.000 (0.034)\tLoss 1.688 (1.133)\nEpoch: [46][140/400]\tTime 0.351 (0.402)\tData 0.000 (0.045)\tLoss 1.732 (1.168)\nEpoch: [46][150/400]\tTime 0.351 (0.399)\tData 0.001 (0.042)\tLoss 1.358 (1.199)\nEpoch: [46][160/400]\tTime 0.353 (0.396)\tData 0.001 (0.040)\tLoss 1.810 (1.219)\nEpoch: [46][170/400]\tTime 0.361 (0.394)\tData 0.000 (0.037)\tLoss 1.586 (1.231)\nEpoch: [46][180/400]\tTime 0.351 (0.403)\tData 0.000 (0.046)\tLoss 1.650 (1.247)\nEpoch: [46][190/400]\tTime 0.404 (0.400)\tData 0.001 (0.043)\tLoss 1.439 (1.257)\nEpoch: [46][200/400]\tTime 0.352 (0.398)\tData 0.000 (0.041)\tLoss 1.318 (1.285)\nEpoch: [46][210/400]\tTime 0.364 (0.396)\tData 0.001 (0.039)\tLoss 2.036 (1.298)\nEpoch: [46][220/400]\tTime 0.363 (0.395)\tData 0.000 (0.037)\tLoss 1.743 (1.314)\nEpoch: [46][230/400]\tTime 0.362 (0.402)\tData 0.001 (0.044)\tLoss 1.166 (1.324)\nEpoch: [46][240/400]\tTime 0.352 (0.400)\tData 0.001 (0.042)\tLoss 1.346 (1.330)\nEpoch: [46][250/400]\tTime 0.362 (0.398)\tData 0.001 (0.040)\tLoss 1.890 (1.340)\nEpoch: [46][260/400]\tTime 0.363 (0.397)\tData 0.000 (0.039)\tLoss 1.414 (1.350)\nEpoch: [46][270/400]\tTime 0.353 (0.402)\tData 0.001 (0.044)\tLoss 1.933 (1.366)\nEpoch: [46][280/400]\tTime 0.394 (0.401)\tData 0.001 (0.043)\tLoss 1.450 (1.377)\nEpoch: [46][290/400]\tTime 0.353 (0.399)\tData 0.001 (0.041)\tLoss 
1.843 (1.385)\nEpoch: [46][300/400]\tTime 0.351 (0.397)\tData 0.001 (0.040)\tLoss 1.477 (1.392)\nEpoch: [46][310/400]\tTime 0.366 (0.402)\tData 0.001 (0.045)\tLoss 1.166 (1.396)\nEpoch: [46][320/400]\tTime 0.353 (0.401)\tData 0.000 (0.043)\tLoss 1.909 (1.408)\nEpoch: [46][330/400]\tTime 0.355 (0.399)\tData 0.001 (0.042)\tLoss 0.863 (1.415)\nEpoch: [46][340/400]\tTime 0.351 (0.398)\tData 0.000 (0.041)\tLoss 1.236 (1.418)\nEpoch: [46][350/400]\tTime 0.363 (0.397)\tData 0.000 (0.039)\tLoss 1.864 (1.420)\nEpoch: [46][360/400]\tTime 0.350 (0.401)\tData 0.001 (0.043)\tLoss 1.548 (1.426)\nEpoch: [46][370/400]\tTime 0.372 (0.399)\tData 0.001 (0.042)\tLoss 2.012 (1.433)\nEpoch: [46][380/400]\tTime 0.352 (0.398)\tData 0.001 (0.041)\tLoss 1.236 (1.436)\nEpoch: [46][390/400]\tTime 0.363 (0.397)\tData 0.000 (0.040)\tLoss 1.478 (1.440)\nEpoch: [46][400/400]\tTime 0.352 (0.401)\tData 0.000 (0.043)\tLoss 1.541 (1.445)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.162)\tData 0.000 (0.030)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.107317209243774\n==> Statistics for epoch 47: 711 clusters\nEpoch: [47][10/400]\tTime 0.352 (0.414)\tData 0.001 (0.053)\tLoss 0.259 (0.188)\nEpoch: [47][20/400]\tTime 0.350 (0.384)\tData 0.001 (0.027)\tLoss 0.162 (0.175)\nEpoch: [47][30/400]\tTime 0.360 (0.374)\tData 0.001 (0.018)\tLoss 0.084 (0.162)\nEpoch: [47][40/400]\tTime 0.360 (0.370)\tData 0.000 (0.014)\tLoss 0.134 (0.160)\nEpoch: [47][50/400]\tTime 0.362 (0.403)\tData 0.001 (0.046)\tLoss 1.716 (0.362)\nEpoch: [47][60/400]\tTime 0.353 (0.396)\tData 0.000 (0.038)\tLoss 1.623 (0.611)\nEpoch: [47][70/400]\tTime 0.350 (0.390)\tData 0.001 (0.033)\tLoss 1.651 (0.768)\nEpoch: [47][80/400]\tTime 0.353 (0.385)\tData 0.000 (0.029)\tLoss 1.758 (0.875)\nEpoch: [47][90/400]\tTime 0.365 (0.404)\tData 0.000 (0.047)\tLoss 1.572 (0.954)\nEpoch: [47][100/400]\tTime 0.352 (0.400)\tData 0.001 (0.042)\tLoss 1.840 (1.007)\nEpoch: 
[47][110/400]\tTime 0.351 (0.396)\tData 0.000 (0.038)\tLoss 1.568 (1.036)\nEpoch: [47][120/400]\tTime 0.352 (0.392)\tData 0.001 (0.035)\tLoss 1.460 (1.069)\nEpoch: [47][130/400]\tTime 0.362 (0.390)\tData 0.000 (0.033)\tLoss 1.767 (1.103)\nEpoch: [47][140/400]\tTime 0.352 (0.399)\tData 0.001 (0.042)\tLoss 1.280 (1.132)\nEpoch: [47][150/400]\tTime 0.362 (0.396)\tData 0.001 (0.040)\tLoss 1.597 (1.168)\nEpoch: [47][160/400]\tTime 0.352 (0.394)\tData 0.001 (0.037)\tLoss 1.718 (1.201)\nEpoch: [47][170/400]\tTime 0.362 (0.392)\tData 0.000 (0.035)\tLoss 0.969 (1.217)\nEpoch: [47][180/400]\tTime 0.353 (0.401)\tData 0.000 (0.044)\tLoss 1.759 (1.229)\nEpoch: [47][190/400]\tTime 0.360 (0.398)\tData 0.001 (0.041)\tLoss 1.096 (1.246)\nEpoch: [47][200/400]\tTime 0.353 (0.396)\tData 0.000 (0.039)\tLoss 1.713 (1.260)\nEpoch: [47][210/400]\tTime 0.361 (0.394)\tData 0.001 (0.037)\tLoss 1.732 (1.281)\nEpoch: [47][220/400]\tTime 0.362 (0.393)\tData 0.000 (0.036)\tLoss 1.835 (1.304)\nEpoch: [47][230/400]\tTime 0.356 (0.399)\tData 0.001 (0.042)\tLoss 1.515 (1.315)\nEpoch: [47][240/400]\tTime 0.353 (0.397)\tData 0.000 (0.040)\tLoss 1.414 (1.325)\nEpoch: [47][250/400]\tTime 0.363 (0.395)\tData 0.000 (0.038)\tLoss 1.460 (1.333)\nEpoch: [47][260/400]\tTime 0.362 (0.394)\tData 0.000 (0.037)\tLoss 1.325 (1.346)\nEpoch: [47][270/400]\tTime 0.353 (0.399)\tData 0.000 (0.042)\tLoss 1.254 (1.352)\nEpoch: [47][280/400]\tTime 0.355 (0.398)\tData 0.000 (0.040)\tLoss 1.654 (1.357)\nEpoch: [47][290/400]\tTime 0.352 (0.396)\tData 0.000 (0.039)\tLoss 1.070 (1.363)\nEpoch: [47][300/400]\tTime 0.354 (0.395)\tData 0.000 (0.038)\tLoss 1.354 (1.372)\nEpoch: [47][310/400]\tTime 0.370 (0.399)\tData 0.001 (0.042)\tLoss 1.563 (1.378)\nEpoch: [47][320/400]\tTime 0.353 (0.398)\tData 0.000 (0.041)\tLoss 1.460 (1.387)\nEpoch: [47][330/400]\tTime 0.359 (0.397)\tData 0.000 (0.039)\tLoss 1.687 (1.392)\nEpoch: [47][340/400]\tTime 0.355 (0.395)\tData 0.000 (0.038)\tLoss 1.769 (1.396)\nEpoch: [47][350/400]\tTime 0.363 
(0.394)\tData 0.000 (0.037)\tLoss 1.445 (1.403)\nEpoch: [47][360/400]\tTime 0.355 (0.399)\tData 0.000 (0.041)\tLoss 2.206 (1.409)\nEpoch: [47][370/400]\tTime 0.364 (0.398)\tData 0.000 (0.040)\tLoss 1.943 (1.410)\nEpoch: [47][380/400]\tTime 0.355 (0.397)\tData 0.000 (0.039)\tLoss 1.526 (1.411)\nEpoch: [47][390/400]\tTime 0.364 (0.396)\tData 0.000 (0.038)\tLoss 1.535 (1.410)\nEpoch: [47][400/400]\tTime 0.353 (0.399)\tData 0.000 (0.042)\tLoss 1.390 (1.411)\n==> Create pseudo labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.160)\tData 0.000 (0.027)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.148759126663208\n==> Statistics for epoch 48: 709 clusters\nEpoch: [48][10/400]\tTime 0.351 (0.418)\tData 0.000 (0.054)\tLoss 0.182 (0.190)\nEpoch: [48][20/400]\tTime 0.351 (0.386)\tData 0.000 (0.027)\tLoss 0.112 (0.171)\nEpoch: [48][30/400]\tTime 0.356 (0.375)\tData 0.000 (0.018)\tLoss 0.071 (0.161)\nEpoch: [48][40/400]\tTime 0.363 (0.370)\tData 0.000 (0.014)\tLoss 0.066 (0.155)\nEpoch: [48][50/400]\tTime 0.350 (0.407)\tData 0.000 (0.049)\tLoss 1.437 (0.322)\nEpoch: [48][60/400]\tTime 0.355 (0.399)\tData 0.000 (0.041)\tLoss 2.100 (0.550)\nEpoch: [48][70/400]\tTime 0.353 (0.393)\tData 0.000 (0.035)\tLoss 1.520 (0.708)\nEpoch: [48][80/400]\tTime 0.353 (0.389)\tData 0.000 (0.031)\tLoss 1.874 (0.848)\nEpoch: [48][90/400]\tTime 0.347 (0.406)\tData 0.001 (0.047)\tLoss 1.481 (0.942)\nEpoch: [48][100/400]\tTime 0.358 (0.401)\tData 0.000 (0.042)\tLoss 1.607 (1.003)\nEpoch: [48][110/400]\tTime 0.371 (0.397)\tData 0.000 (0.039)\tLoss 1.408 (1.061)\nEpoch: [48][120/400]\tTime 0.352 (0.393)\tData 0.000 (0.035)\tLoss 1.642 (1.106)\nEpoch: [48][130/400]\tTime 0.364 (0.391)\tData 0.000 (0.033)\tLoss 1.447 (1.133)\nEpoch: [48][140/400]\tTime 0.354 (0.401)\tData 0.000 (0.044)\tLoss 1.327 (1.156)\nEpoch: [48][150/400]\tTime 0.362 (0.399)\tData 0.001 (0.041)\tLoss 2.500 (1.184)\nEpoch: [48][160/400]\tTime 0.354 (0.396)\tData 0.000 (0.038)\tLoss 1.433 
(1.206)\nEpoch: [48][170/400]\tTime 0.364 (0.394)\tData 0.000 (0.036)\tLoss 0.955 (1.232)\nEpoch: [48][180/400]\tTime 0.344 (0.402)\tData 0.000 (0.044)\tLoss 1.659 (1.252)\nEpoch: [48][190/400]\tTime 0.362 (0.399)\tData 0.001 (0.041)\tLoss 0.991 (1.257)\nEpoch: [48][200/400]\tTime 0.353 (0.397)\tData 0.001 (0.039)\tLoss 2.188 (1.290)\nEpoch: [48][210/400]\tTime 0.352 (0.395)\tData 0.001 (0.037)\tLoss 1.919 (1.307)\nEpoch: [48][220/400]\tTime 0.363 (0.394)\tData 0.000 (0.036)\tLoss 1.209 (1.316)\nEpoch: [48][230/400]\tTime 0.363 (0.400)\tData 0.001 (0.042)\tLoss 2.199 (1.335)\nEpoch: [48][240/400]\tTime 0.355 (0.398)\tData 0.000 (0.040)\tLoss 1.371 (1.350)\nEpoch: [48][250/400]\tTime 0.355 (0.396)\tData 0.001 (0.038)\tLoss 1.599 (1.358)\nEpoch: [48][260/400]\tTime 0.364 (0.395)\tData 0.000 (0.037)\tLoss 1.218 (1.362)\nEpoch: [48][270/400]\tTime 0.355 (0.400)\tData 0.000 (0.042)\tLoss 2.434 (1.374)\nEpoch: [48][280/400]\tTime 0.355 (0.399)\tData 0.000 (0.041)\tLoss 1.247 (1.376)\nEpoch: [48][290/400]\tTime 0.355 (0.397)\tData 0.000 (0.039)\tLoss 1.527 (1.383)\nEpoch: [48][300/400]\tTime 0.355 (0.396)\tData 0.000 (0.038)\tLoss 1.490 (1.389)\nEpoch: [48][310/400]\tTime 0.377 (0.401)\tData 0.001 (0.043)\tLoss 1.530 (1.395)\nEpoch: [48][320/400]\tTime 0.353 (0.400)\tData 0.000 (0.041)\tLoss 1.689 (1.405)\nEpoch: [48][330/400]\tTime 0.352 (0.398)\tData 0.001 (0.040)\tLoss 1.653 (1.412)\nEpoch: [48][340/400]\tTime 0.354 (0.397)\tData 0.000 (0.039)\tLoss 1.388 (1.413)\nEpoch: [48][350/400]\tTime 0.364 (0.396)\tData 0.000 (0.038)\tLoss 1.497 (1.417)\nEpoch: [48][360/400]\tTime 0.354 (0.400)\tData 0.000 (0.042)\tLoss 1.494 (1.423)\nEpoch: [48][370/400]\tTime 0.363 (0.399)\tData 0.000 (0.041)\tLoss 1.442 (1.429)\nEpoch: [48][380/400]\tTime 0.369 (0.398)\tData 0.000 (0.039)\tLoss 1.703 (1.432)\nEpoch: [48][390/400]\tTime 0.364 (0.397)\tData 0.000 (0.038)\tLoss 1.562 (1.434)\nEpoch: [48][400/400]\tTime 0.377 (0.401)\tData 0.000 (0.042)\tLoss 1.649 (1.438)\n==> Create pseudo 
labels for unlabeled data\nExtract Features: [50/51]\tTime 0.129 (0.160)\tData 0.000 (0.028)\t\nComputing jaccard distance...\nJaccard distance computing time cost: 18.608444452285767\n==> Statistics for epoch 49: 709 clusters\nEpoch: [49][10/400]\tTime 0.351 (0.417)\tData 0.000 (0.054)\tLoss 0.151 (0.166)\nEpoch: [49][20/400]\tTime 0.352 (0.385)\tData 0.000 (0.027)\tLoss 0.313 (0.176)\nEpoch: [49][30/400]\tTime 0.351 (0.374)\tData 0.000 (0.018)\tLoss 0.198 (0.176)\nEpoch: [49][40/400]\tTime 0.363 (0.370)\tData 0.000 (0.014)\tLoss 0.085 (0.161)\nEpoch: [49][50/400]\tTime 0.352 (0.404)\tData 0.001 (0.047)\tLoss 1.400 (0.342)\nEpoch: [49][60/400]\tTime 0.353 (0.397)\tData 0.000 (0.039)\tLoss 2.070 (0.582)\nEpoch: [49][70/400]\tTime 0.355 (0.391)\tData 0.000 (0.034)\tLoss 1.482 (0.739)\nEpoch: [49][80/400]\tTime 0.353 (0.387)\tData 0.000 (0.030)\tLoss 2.275 (0.853)\nEpoch: [49][90/400]\tTime 0.363 (0.406)\tData 0.000 (0.047)\tLoss 1.722 (0.925)\nEpoch: [49][100/400]\tTime 0.353 (0.400)\tData 0.000 (0.043)\tLoss 1.248 (0.977)\nEpoch: [49][110/400]\tTime 0.369 (0.397)\tData 0.000 (0.039)\tLoss 1.271 (1.035)\nEpoch: [49][120/400]\tTime 0.354 (0.393)\tData 0.000 (0.036)\tLoss 1.528 (1.084)\nEpoch: [49][130/400]\tTime 0.364 (0.390)\tData 0.000 (0.033)\tLoss 1.895 (1.132)\nEpoch: [49][140/400]\tTime 0.354 (0.400)\tData 0.000 (0.043)\tLoss 1.840 (1.170)\nEpoch: [49][150/400]\tTime 0.352 (0.397)\tData 0.001 (0.040)\tLoss 1.704 (1.212)\nEpoch: [49][160/400]\tTime 0.398 (0.395)\tData 0.000 (0.037)\tLoss 2.271 (1.241)\nEpoch: [49][170/400]\tTime 0.363 (0.393)\tData 0.000 (0.035)\tLoss 2.027 (1.259)\nEpoch: [49][180/400]\tTime 0.363 (0.401)\tData 0.000 (0.043)\tLoss 2.174 (1.275)\nEpoch: [49][190/400]\tTime 0.353 (0.399)\tData 0.000 (0.041)\tLoss 2.098 (1.295)\nEpoch: [49][200/400]\tTime 0.376 (0.396)\tData 0.000 (0.039)\tLoss 1.620 (1.300)\nEpoch: [49][210/400]\tTime 0.353 (0.395)\tData 0.000 (0.037)\tLoss 1.721 (1.313)\nEpoch: [49][220/400]\tTime 0.363 (0.393)\tData 0.000 
(0.035)\tLoss 1.708 (1.329)\nEpoch: [49][230/400]\tTime 0.357 (0.399)\tData 0.001 (0.042)\tLoss 2.011 (1.338)\nEpoch: [49][240/400]\tTime 0.354 (0.398)\tData 0.001 (0.040)\tLoss 1.132 (1.342)\nEpoch: [49][250/400]\tTime 0.357 (0.396)\tData 0.001 (0.038)\tLoss 1.602 (1.350)\nEpoch: [49][260/400]\tTime 0.363 (0.395)\tData 0.000 (0.037)\tLoss 2.304 (1.359)\nEpoch: [49][270/400]\tTime 0.362 (0.400)\tData 0.000 (0.042)\tLoss 1.903 (1.366)\nEpoch: [49][280/400]\tTime 0.355 (0.398)\tData 0.000 (0.040)\tLoss 1.882 (1.369)\nEpoch: [49][290/400]\tTime 0.363 (0.397)\tData 0.001 (0.039)\tLoss 1.252 (1.375)\nEpoch: [49][300/400]\tTime 0.354 (0.395)\tData 0.000 (0.038)\tLoss 1.802 (1.379)\nEpoch: [49][310/400]\tTime 0.421 (0.400)\tData 0.000 (0.042)\tLoss 1.358 (1.386)\nEpoch: [49][320/400]\tTime 0.354 (0.399)\tData 0.000 (0.041)\tLoss 1.961 (1.395)\nEpoch: [49][330/400]\tTime 0.357 (0.398)\tData 0.000 (0.040)\tLoss 1.538 (1.406)\nEpoch: [49][340/400]\tTime 0.353 (0.397)\tData 0.000 (0.039)\tLoss 2.063 (1.406)\nEpoch: [49][350/400]\tTime 0.365 (0.395)\tData 0.000 (0.037)\tLoss 1.933 (1.405)\nEpoch: [49][360/400]\tTime 0.354 (0.399)\tData 0.000 (0.041)\tLoss 1.433 (1.406)\nEpoch: [49][370/400]\tTime 0.353 (0.398)\tData 0.000 (0.040)\tLoss 1.375 (1.409)\nEpoch: [49][380/400]\tTime 0.353 (0.397)\tData 0.000 (0.039)\tLoss 1.282 (1.410)\nEpoch: [49][390/400]\tTime 0.363 (0.396)\tData 0.000 (0.038)\tLoss 1.522 (1.420)\nEpoch: [49][400/400]\tTime 0.349 (0.400)\tData 0.001 (0.042)\tLoss 1.714 (1.428)\nExtract Features: [50/76]\tTime 0.133 (0.160)\tData 0.000 (0.027)\t\nMean AP: 87.0%\n\n * Finished epoch  49  model mAP: 87.0%  best: 87.0% *\n\n==> Test with the best model:\n=> Loaded checkpoint '/data0/developer/cluster-contrast/examples/logs/gem-hard/model_best.pth.tar'\nExtract Features: [50/76]\tTime 0.134 (0.162)\tData 0.000 (0.028)\t\nMean AP: 87.0%\nCMC Scores:\n  top-1          94.6%\n  top-5          98.2%\n  top-10         98.8%\nTotal running time:  2:44:06.629802\n"
  },
  {
    "path": "examples/test.py",
    "content": "from __future__ import print_function, absolute_import\nimport argparse\nimport os.path as osp\nimport random\nimport numpy as np\nimport sys\n\nimport torch\nfrom torch import nn\nfrom torch.backends import cudnn\nfrom torch.utils.data import DataLoader\n\nfrom clustercontrast import datasets\nfrom clustercontrast import models\nfrom clustercontrast.models.dsbn import convert_dsbn, convert_bn\nfrom clustercontrast.evaluators import Evaluator\nfrom clustercontrast.utils.data import transforms as T\nfrom clustercontrast.utils.data.preprocessor import Preprocessor\nfrom clustercontrast.utils.logging import Logger\nfrom clustercontrast.utils.serialization import load_checkpoint, save_checkpoint, copy_state_dict\n\n\ndef get_data(name, data_dir, height, width, batch_size, workers):\n    root = osp.join(data_dir, name)\n\n    dataset = datasets.create(name, root)\n\n    normalizer = T.Normalize(mean=[0.485, 0.456, 0.406],\n                             std=[0.229, 0.224, 0.225])\n\n    test_transformer = T.Compose([\n             T.Resize((height, width), interpolation=3),\n             T.ToTensor(),\n             normalizer\n         ])\n\n    test_loader = DataLoader(\n        Preprocessor(list(set(dataset.query) | set(dataset.gallery)),\n                     root=dataset.images_dir, transform=test_transformer),\n        batch_size=batch_size, num_workers=workers,\n        shuffle=False, pin_memory=True)\n    return dataset, test_loader\n\n\ndef main():\n    args = parser.parse_args()\n\n    if args.seed is not None:\n        random.seed(args.seed)\n        np.random.seed(args.seed)\n        torch.manual_seed(args.seed)\n        cudnn.deterministic = True\n\n    main_worker(args)\n\n\ndef main_worker(args):\n    cudnn.benchmark = True\n\n    log_dir = osp.dirname(args.resume)\n    sys.stdout = Logger(osp.join(log_dir, 'log_test.txt'))\n    print(\"==========\\nArgs:{}\\n==========\".format(args))\n\n    # Create data loaders\n    dataset, test_loader = 
get_data(args.dataset, args.data_dir, args.height,\n                                    args.width, args.batch_size, args.workers)\n\n    # Create model\n    model = models.create(args.arch, pretrained=False, num_features=args.features, dropout=args.dropout,\n                          num_classes=0, pooling_type=args.pooling_type)\n    if args.dsbn:\n        print(\"==> Load the model with domain-specific BNs\")\n        convert_dsbn(model)\n\n    # Load from checkpoint\n    checkpoint = load_checkpoint(args.resume)\n    copy_state_dict(checkpoint['state_dict'], model, strip='module.')\n\n    if args.dsbn:\n        print(\"==> Test with {}-domain BNs\".format(\"source\" if args.test_source else \"target\"))\n        convert_bn(model, use_target=(not args.test_source))\n\n    model.cuda()\n    model = nn.DataParallel(model)\n\n    # Evaluator\n    model.eval()\n    evaluator = Evaluator(model)\n    evaluator.evaluate(test_loader, dataset.query, dataset.gallery, cmc_flag=True, rerank=args.rerank)\n    return\n\n\nif __name__ == '__main__':\n    parser = argparse.ArgumentParser(description=\"Testing the model\")\n    # data\n    parser.add_argument('-d', '--dataset', type=str, default='market1501')\n    parser.add_argument('-b', '--batch-size', type=int, default=256)\n    parser.add_argument('-j', '--workers', type=int, default=4)\n    parser.add_argument('--height', type=int, default=256, help=\"input height\")\n    parser.add_argument('--width', type=int, default=128, help=\"input width\")\n    # model\n    parser.add_argument('-a', '--arch', type=str, default='resnet50',\n                        choices=models.names())\n    parser.add_argument('--features', type=int, default=0)\n    parser.add_argument('--dropout', type=float, default=0)\n\n    parser.add_argument('--resume', type=str,\n                        default=\"/media/yixuan/DATA/cluster-contrast/market-res50/logs/model_best.pth.tar\",\n                        metavar='PATH')\n    # testing configs\n    
parser.add_argument('--rerank', action='store_true',\n                        help=\"use re-ranking during evaluation\")\n    parser.add_argument('--dsbn', action='store_true',\n                        help=\"test on the model with domain-specific BN\")\n    parser.add_argument('--test-source', action='store_true',\n                        help=\"test on the source domain\")\n    parser.add_argument('--seed', type=int, default=1)\n    # path\n    working_dir = osp.dirname(osp.abspath(__file__))\n    parser.add_argument('--data-dir', type=str, metavar='PATH',\n                        default='/media/yixuan/Project/guangyuan/workpalces/SpCL/examples/data')\n    parser.add_argument('--pooling-type', type=str, default='gem')\n    parser.add_argument('--embedding_features_path', type=str,\n                        default='/media/yixuan/Project/guangyuan/workpalces/SpCL/embedding_features/mark1501_res50_ibn/')\n    main()\n"
  },
  {
    "path": "run_code.sh",
    "content": "CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d market1501 --iters 200 --momentum 0.1 --eps 0.6 --num-instances 16\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d market1501 --iters 200 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16\n\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d msmt17 --iters 400 --momentum 0.1 --eps 0.6 --num-instances 16\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d msmt17 --iters 400 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16\n\n\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d dukemtmcreid --iters 200 --momentum 0.1 --eps 0.6 --num-instances 16\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d dukemtmcreid --iters 200 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16\n\n\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl.py -b 256 -a resnet50 -d veri --iters 400 --momentum 0.1 --eps 0.6 --num-instances 16 --height 224 --width 224\n# CUDA_VISIBLE_DEVICES=0,1,2,3 python examples/cluster_contrast_train_usl_infomap.py -b 256 -a resnet50 -d veri --iters 400 --momentum 0.1 --eps 0.5 --k1 15 --k2 4 --num-instances 16 --height 224 --width 224\n"
  },
  {
    "path": "setup.py",
    "content": "from setuptools import setup, find_packages\n\n\nsetup(name='ClusterContrast',\n      version='1.0.0',\n      description='Cluster Contrast for Unsupervised Person Re-Identification',\n      author='GuangYuan wang',\n      author_email='yixuan.wgy@alibaba-inc.com',\n      # url='',\n      install_requires=[\n          'numpy', 'torch', 'torchvision',\n          'six', 'h5py', 'Pillow', 'scipy',\n          'scikit-learn', 'metric-learn', 'faiss_gpu'],\n      packages=find_packages(),\n      keywords=[\n          'Unsupervised Learning',\n          'Contrastive Learning',\n          'Object Re-identification'\n      ])\n"
  }
]