Repository: gangweiX/Fast-ACVNet
Branch: main
Commit: 61d5375e1816
Files: 45
Total size: 5.8 MB
Directory structure:
gitextract_85err5x8/
├── LICENSE.md
├── README.md
├── datasets/
│ ├── MiddleburyLoader.py
│ ├── __init__.py
│ ├── data_io.py
│ ├── eth3dLoader.py
│ ├── eth3d_dataset.py
│ ├── flow_transforms.py
│ ├── kitti_dataset_1215.py
│ ├── kitti_dataset_1215_augmentation.py
│ ├── kittiraw_loader.py
│ ├── listfiles.py
│ ├── middlebury_data.py
│ ├── middlebury_data_our.py
│ ├── middlebury_loader.py
│ ├── readpfm.py
│ ├── sceneflow_dataset.py
│ └── sceneflow_dataset_augmentation.py
├── filenames/
│ ├── kitti12_15_all.txt
│ ├── kitti12_15_train.txt
│ ├── kitti12_all.txt
│ ├── kitti12_test.txt
│ ├── kitti12_train.txt
│ ├── kitti12_val.txt
│ ├── kitti15_all.txt
│ ├── kitti15_test.txt
│ ├── kitti15_train.txt
│ ├── kitti15_val.txt
│ ├── sceneflow_test.txt
│ ├── sceneflow_train.txt
│ └── test_eth3d_files.txt
├── main_kitti.py
├── main_sceneflow.py
├── models/
│ ├── Fast_ACV.py
│ ├── Fast_ACV_plus.py
│ ├── __init__.py
│ ├── loss.py
│ └── submodule.py
├── save_disp.py
├── test_mid.py
└── utils/
├── __init__.py
├── experiment.py
├── metrics.py
├── misc.py
└── visualization.py
================================================
FILE CONTENTS
================================================
================================================
FILE: LICENSE.md
================================================
MIT License
Copyright (c) 2022 Gangwei Xu
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
<p align="center">
<h1 align="center">Accurate and Efficient Stereo Matching via Attention Concatenation Volume</h1>
<p align="center">
Gangwei Xu, Yun Wang, Junda Cheng, Jinhui Tang, Xin Yang
</p>
<h3 align="center">TPAMI 2023</h3>
<h3 align="center"><a href="https://arxiv.org/pdf/2209.12699.pdf">Paper</a>
<div align="center"></div>
</p>
<p align="center">
<a href="">
<img src="https://github.com/gangweiX/Fast-ACVNet/blob/main/imgs/Fast-ACV.png" alt="Logo" width="100%">
</a>
</p>
<p align="center">
Fast-ACVNet.
</p>
# Demo on KITTI raw data
A demo result on our RTX 3090 (Ubuntu 20.04).
<p align="center">
<img width="844" height="446" src="./demo/kittiraw_demo.gif" data-zoomable>
</p>
# How to use
## Environment
* Python 3.8
* PyTorch 1.10
## Install
### Create a virtual environment and activate it.
```
conda create -n fast_acv python=3.8
conda activate fast_acv
```
### Dependencies
```
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch -c nvidia
pip install opencv-python
pip install scikit-image
pip install tensorboard
pip install matplotlib
pip install tqdm
pip install timm==0.5.4
```
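Before training, it can help to confirm that the pinned dependencies actually resolved. A minimal stdlib-only check (a sketch; the package names simply mirror the pip/conda commands above):

```python
from importlib.metadata import version, PackageNotFoundError

def check_deps(pkgs):
    """Map each package name to its installed version, or None if missing."""
    found = {}
    for name in pkgs:
        try:
            found[name] = version(name)
        except PackageNotFoundError:
            found[name] = None
    return found

# Packages the install steps above set up; timm should report 0.5.4.
print(check_deps(["torch", "opencv-python", "timm"]))
```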
## Data Preparation
Download [Scene Flow Datasets](https://lmb.informatik.uni-freiburg.de/resources/datasets/SceneFlowDatasets.en.html), [KITTI 2012](http://www.cvlibs.net/datasets/kitti/eval_stereo_flow.php?benchmark=stereo), [KITTI 2015](http://www.cvlibs.net/datasets/kitti/eval_scene_flow.php?benchmark=stereo)
## Train
Use the following commands to train Fast-ACVNet+ or Fast-ACVNet on Scene Flow.
First, train the attention-weights generation network for 24 epochs:
```
python main_sceneflow.py --attention_weights_only True --logdir ./checkpoints/sceneflow/attention
```
Second, train the complete network for another 24 epochs:
```
python main_sceneflow.py --loadckpt ./checkpoints/sceneflow/attention/checkpoint_000023.ckpt --logdir ./checkpoints/sceneflow/complete
```
Use the following command to fine-tune Fast-ACVNet+ or Fast-ACVNet on KITTI, starting from the Scene Flow pretrained model:
```
python main_kitti.py --loadckpt ./checkpoints/sceneflow/complete/checkpoint_000023.ckpt --logdir ./checkpoints/kitti
```
## Submit to the KITTI benchmarks
Generate disparity maps for submission with:
```
python save_disp.py
```
# Evaluation on Scene Flow and KITTI
| Method | Scene Flow <br> (EPE) | KITTI 2012 <br> (3-all) | KITTI 2015 <br> (D1-all) | Runtime (ms) |
|:-:|:-:|:-:|:-:|:-:|
| Fast-ACVNet+ | 0.59 | 1.85 % | 2.01 % | 45 |
| HITNet | - | 1.89 % |1.98 % | 54 |
| CoEx | 0.69 | 1.93 % | 2.13 % | 33 |
| BGNet+ | - | 2.03 % | 2.19 % | 35 |
| AANet | 0.87 | 2.42 % | 2.55 % | 62 |
| DeepPrunerFast | 0.97 | - | 2.59 % | 50 |
Our Fast-ACVNet+ achieves accuracy comparable to HITNet on KITTI 2012 and KITTI 2015.
### Pretrained Model
[Fast-ACVNet](https://drive.google.com/drive/folders/1vLt_9W3F2K-MciV8Pmv8iRpU24lkMXGo?usp=share_link)
[Fast-ACVNet+](https://drive.google.com/drive/folders/1lcyzoKlkYoDL3tiPGCR6nob9WsusaTI8?usp=share_link)
# Qualitative results on Scene Flow.

# Qualitative results on KITTI.

# Generalization performance on the Middlebury 2014 dataset
All comparison methods are trained only on Scene Flow, without data augmentation.

# Citation
If you find this project helpful in your research, please consider citing:
```
@article{xu2023accurate,
title={Accurate and efficient stereo matching via attention concatenation volume},
author={Xu, Gangwei and Wang, Yun and Cheng, Junda and Tang, Jinhui and Yang, Xin},
journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
year={2023},
publisher={IEEE}
}
@inproceedings{xu2022attention,
title={Attention Concatenation Volume for Accurate and Efficient Stereo Matching},
author={Xu, Gangwei and Cheng, Junda and Guo, Peng and Yang, Xin},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
pages={12981--12990},
year={2022}
}
```
# Acknowledgements
Thanks to Antyanta Bangunharcana for open-sourcing his excellent work [Correlate-and-Excite](https://github.com/antabangun/coex). Thanks to Xiaoyang Guo for open-sourcing his excellent work [GwcNet](https://github.com/xy-guo/GwcNet).
================================================
FILE: datasets/MiddleburyLoader.py
================================================
import os, torch, torch.utils.data as data
from PIL import Image
import numpy as np
from . import flow_transforms
import pdb
import torchvision
import warnings
from . import readpfm as rp
from datasets.data_io import get_transform, read_all_lines
warnings.filterwarnings('ignore', '.*output shape of zoom.*')
import cv2
IMG_EXTENSIONS = [
'.jpg', '.JPG', '.jpeg', '.JPEG',
'.png', '.PNG', '.ppm', '.PPM', '.bmp', '.BMP']
def is_image_file(filename):
return any((filename.endswith(extension) for extension in IMG_EXTENSIONS))
def default_loader(path):
return Image.open(path).convert('RGB')
def disparity_loader(path):
if '.png' in path:
data = Image.open(path)
data = np.ascontiguousarray(data,dtype=np.float32)/256
return data
else:
data = rp.readPFM(path)[0]
data = np.ascontiguousarray(data, dtype=np.float32)
return data
class myImageFloder(data.Dataset):
def __init__(self, left, right, left_disparity, training, right_disparity=None, loader=default_loader, dploader=disparity_loader):
self.left = left
self.right = right
self.disp_L = left_disparity
self.disp_R = right_disparity
self.training = training
self.loader = loader
self.dploader = dploader
self.order = 0
def __getitem__(self, index):
left = self.left[index]
right = self.right[index]
left_img = self.loader(left)
right_img = self.loader(right)
if self.disp_L is not None:
disp_L = self.disp_L[index]
disparity = self.dploader(disp_L)
disparity[disparity == np.inf] = 0
else:
disparity = None
if self.training:
th, tw = 256, 512
#th, tw = 320, 704
random_brightness = np.random.uniform(0.5, 2.0, 2)
random_gamma = np.random.uniform(0.8, 1.2, 2)
random_contrast = np.random.uniform(0.8, 1.2, 2)
left_img = torchvision.transforms.functional.adjust_brightness(left_img, random_brightness[0])
left_img = torchvision.transforms.functional.adjust_gamma(left_img, random_gamma[0])
left_img = torchvision.transforms.functional.adjust_contrast(left_img, random_contrast[0])
right_img = torchvision.transforms.functional.adjust_brightness(right_img, random_brightness[1])
right_img = torchvision.transforms.functional.adjust_gamma(right_img, random_gamma[1])
right_img = torchvision.transforms.functional.adjust_contrast(right_img, random_contrast[1])
right_img = np.asarray(right_img)
left_img = np.asarray(left_img)
# w, h = left_img.size
# th, tw = 256, 512
#
# x1 = random.randint(0, w - tw)
# y1 = random.randint(0, h - th)
#
# left_img = left_img.crop((x1, y1, x1 + tw, y1 + th))
# right_img = right_img.crop((x1, y1, x1 + tw, y1 + th))
# dataL = dataL[y1:y1 + th, x1:x1 + tw]
# right_img = np.asarray(right_img)
# left_img = np.asarray(left_img)
# geometric unsymmetric-augmentation
angle = 0;
px = 0
if np.random.binomial(1, 0.5):
# angle = 0.1;
# px = 2
angle = 0.05
px = 1
co_transform = flow_transforms.Compose([
# flow_transforms.RandomVdisp(angle, px),
flow_transforms.Scale(0.5, order=self.order),
flow_transforms.RandomCrop((th, tw)),
])
augmented, disparity = co_transform([left_img, right_img], disparity)
left_img = augmented[0]
right_img = augmented[1]
right_img.flags.writeable = True
if np.random.binomial(1,0.2):
sx = int(np.random.uniform(35,100))
sy = int(np.random.uniform(25,75))
cx = int(np.random.uniform(sx,right_img.shape[0]-sx))
cy = int(np.random.uniform(sy,right_img.shape[1]-sy))
right_img[cx-sx:cx+sx,cy-sy:cy+sy] = np.mean(np.mean(right_img,0),0)[np.newaxis,np.newaxis]
# to tensor, normalize
disparity = np.ascontiguousarray(disparity, dtype=np.float32)
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
return {"left": left_img,
"right": right_img,
"disparity": disparity}
else:
# w, h = left_img.size
right_img = np.asarray(right_img)
left_img = np.asarray(left_img)
# co_transform = flow_transforms.Compose([
# # flow_transforms.RandomVdisp(angle, px),
# flow_transforms.Scale(0.5, order=self.order),
# # flow_transforms.RandomCrop((th, tw)),
# ])
# augmented, disparity = co_transform([left_img, right_img], disparity)
# left_img = augmented[0]
# right_img = augmented[1]
# right_img = cv2.resize(right_img, None, fx=0.5,fy=0.5 ,interpolation=cv2.INTER_CUBIC)
# left_img = cv2.resize(left_img, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_CUBIC)
disparity = np.ascontiguousarray(disparity, dtype=np.float32)
# normalize
h = left_img.shape[0]
w = left_img.shape[1]
processed = get_transform()
left_img = processed(left_img).numpy()
right_img = processed(right_img).numpy()
# h, w, _ = left_img.shape
# pad height and width up to the next multiple of 64
top_pad = 64 - (h % 64)
right_pad = 64 - (w % 64)
assert top_pad > 0 and right_pad > 0
# pad images
left_img = np.lib.pad(left_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
right_img = np.lib.pad(right_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant',
constant_values=0)
# pad disparity gt
# if disparity is not None:
# assert len(disparity.shape) == 2
# disparity = np.lib.pad(disparity, ((top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
if disparity is not None:
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"top_pad": top_pad,
"right_pad": right_pad,
"left_filename": self.left[index],
"right_filename": self.right[index]
}
else:
return {"left": left_img,
"right": right_img,
"top_pad": top_pad,
"right_pad": right_pad,
"left_filename": self.left[index],
"right_filename": self.right[index]}
def __len__(self):
return len(self.left)
================================================
FILE: datasets/__init__.py
================================================
from .kitti_dataset_1215 import KITTIDataset
from .sceneflow_dataset import SceneFlowDatset
__datasets__ = {
"sceneflow": SceneFlowDatset,
"kitti": KITTIDataset
}
================================================
FILE: datasets/data_io.py
================================================
import numpy as np
import re
import torchvision.transforms as transforms
def get_transform():
mean = [0.485, 0.456, 0.406]
std = [0.229, 0.224, 0.225]
return transforms.Compose([
transforms.ToTensor(),
transforms.Normalize(mean=mean, std=std),
])
def get_transform_aug():
mean = [0.485, 0.456, 0.406]
std = [0.229, 0.224, 0.225]
return transforms.Compose([
transforms.ToTensor(),
])
# read all lines in a file
def read_all_lines(filename):
with open(filename) as f:
lines = [line.rstrip() for line in f.readlines()]
return lines
# read a .pfm file into a numpy array; used to load Scene Flow disparity files
def pfm_imread(filename):
file = open(filename, 'rb')
color = None
width = None
height = None
scale = None
endian = None
header = file.readline().decode('utf-8').rstrip()
if header == 'PF':
color = True
elif header == 'Pf':
color = False
else:
raise Exception('Not a PFM file.')
dim_match = re.match(r'^(\d+)\s(\d+)\s$', file.readline().decode('utf-8'))
if dim_match:
width, height = map(int, dim_match.groups())
else:
raise Exception('Malformed PFM header.')
scale = float(file.readline().rstrip())
if scale < 0: # little-endian
endian = '<'
scale = -scale
else:
endian = '>' # big-endian
data = np.fromfile(file, endian + 'f')
shape = (height, width, 3) if color else (height, width)
data = np.reshape(data, shape)
data = np.flipud(data)
return data, scale
================================================
FILE: datasets/eth3dLoader.py
================================================
import os, torch, torch.utils.data as data
from PIL import Image
import numpy as np
from . import flow_transforms
import pdb
import torchvision
import warnings
from . import readpfm as rp
from datasets.data_io import get_transform, read_all_lines
import cv2
warnings.filterwarnings('ignore', '.*output shape of zoom.*')
IMG_EXTENSIONS = [
'.jpg', '.JPG', '.jpeg', '.JPEG',
'.png', '.PNG', '.ppm', '.PPM', '.bmp', '.BMP']
def is_image_file(filename):
return any((filename.endswith(extension) for extension in IMG_EXTENSIONS))
def default_loader(path):
return Image.open(path).convert('RGB')
def disparity_loader(path):
if '.png' in path:
data = Image.open(path)
data = np.ascontiguousarray(data,dtype=np.float32)/256
return data
else:
data = rp.readPFM(path)[0]
data = np.ascontiguousarray(data, dtype=np.float32)
return data
class myImageFloder(data.Dataset):
def __init__(self, left, right, left_disparity, training, right_disparity=None, loader=default_loader, dploader=disparity_loader):
self.left = left
self.right = right
self.disp_L = left_disparity
self.disp_R = right_disparity
self.training = training
self.loader = loader
self.dploader = dploader
def __getitem__(self, index):
left = self.left[index]
right = self.right[index]
left_img = self.loader(left)
right_img = self.loader(right)
if self.disp_L is not None:
disp_L = self.disp_L[index]
disparity = self.dploader(disp_L)
disparity[disparity == np.inf] = 0
else:
disparity = None
if self.training:
th, tw = 256, 512
#th, tw = 320, 704
random_brightness = np.random.uniform(0.5, 2.0, 2)
random_gamma = np.random.uniform(0.8, 1.2, 2)
random_contrast = np.random.uniform(0.8, 1.2, 2)
left_img = torchvision.transforms.functional.adjust_brightness(left_img, random_brightness[0])
left_img = torchvision.transforms.functional.adjust_gamma(left_img, random_gamma[0])
left_img = torchvision.transforms.functional.adjust_contrast(left_img, random_contrast[0])
right_img = torchvision.transforms.functional.adjust_brightness(right_img, random_brightness[1])
right_img = torchvision.transforms.functional.adjust_gamma(right_img, random_gamma[1])
right_img = torchvision.transforms.functional.adjust_contrast(right_img, random_contrast[1])
right_img = np.array(right_img)
left_img = np.array(left_img)
# w, h = left_img.size
# th, tw = 256, 512
#
# x1 = random.randint(0, w - tw)
# y1 = random.randint(0, h - th)
#
# left_img = left_img.crop((x1, y1, x1 + tw, y1 + th))
# right_img = right_img.crop((x1, y1, x1 + tw, y1 + th))
# dataL = dataL[y1:y1 + th, x1:x1 + tw]
# right_img = np.asarray(right_img)
# left_img = np.asarray(left_img)
# geometric unsymmetric-augmentation
angle = 0;
px = 0
if np.random.binomial(1, 0.5):
# angle = 0.1;
# px = 2
angle = 0.05
px = 1
co_transform = flow_transforms.Compose([
# flow_transforms.RandomVdisp(angle, px),
# flow_transforms.Scale(np.random.uniform(self.rand_scale[0], self.rand_scale[1]), order=self.order),
flow_transforms.RandomCrop((th, tw)),
])
augmented, disparity = co_transform([left_img, right_img], disparity)
left_img = augmented[0]
right_img = augmented[1]
right_img.flags.writeable = True
if np.random.binomial(1,0.2):
sx = int(np.random.uniform(35,100))
sy = int(np.random.uniform(25,75))
cx = int(np.random.uniform(sx,right_img.shape[0]-sx))
cy = int(np.random.uniform(sy,right_img.shape[1]-sy))
right_img[cx-sx:cx+sx,cy-sy:cy+sy] = np.mean(np.mean(right_img,0),0)[np.newaxis,np.newaxis]
# to tensor, normalize
disparity = np.ascontiguousarray(disparity, dtype=np.float32)
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
# print("disparity", disparity.shape)
disparity_low = cv2.resize(disparity, (tw//4, th//4), interpolation=cv2.INTER_NEAREST)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
'disparity_low': disparity_low}
else:
w, h = left_img.size
# normalize
processed = get_transform()
left_img = processed(left_img).numpy()
right_img = processed(right_img).numpy()
# pad height and width up to a multiple of 64
if h % 64 == 0:
top_pad = 0
else:
top_pad = 64 - (h % 64)
if w % 64 == 0:
right_pad = 0
else:
right_pad = 64 - (w % 64)
assert top_pad >= 0 and right_pad >= 0
# pad images
left_img = np.lib.pad(left_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
right_img = np.lib.pad(right_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant',
constant_values=0)
# pad disparity gt
if disparity is not None:
assert len(disparity.shape) == 2
disparity = np.lib.pad(disparity, ((top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
if disparity is not None:
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"top_pad": top_pad,
"right_pad": right_pad}
else:
return {"left": left_img,
"right": right_img,
"top_pad": top_pad,
"right_pad": right_pad,
"left_filename": self.left[index],
"right_filename": self.right[index]}
def __len__(self):
return len(self.left)
================================================
FILE: datasets/eth3d_dataset.py
================================================
# Copyright Niantic 2020. Patent Pending. All rights reserved.
#
# This software is licensed under the terms of the Stereo-from-mono licence
# which allows for non-commercial use only, the full terms of which are made
# available in the LICENSE file.
from __future__ import absolute_import, division, print_function
import os
os.environ["MKL_NUM_THREADS"] = "1"
os.environ["NUMEXPR_NUM_THREADS"] = "1"
os.environ["OMP_NUM_THREADS"] = "1"
import random
import numpy as np
from PIL import Image # using pillow-simd for increased speed
from torchvision import transforms
import cv2
cv2.setNumThreads(0)
from .base_dataset import BaseDataset
class ETH3DStereoDataset(BaseDataset):
def __init__(self,
data_path,
filenames,
height,
width,
is_train,
disable_normalisation=False,
**kwargs):
super(ETH3DStereoDataset, self).__init__(data_path, filenames, height, width,
is_train=is_train, has_gt=False,
disable_normalisation=disable_normalisation)
self.img_resizer = transforms.Resize(size=(height, width))
def load_images(self, idx, do_flip=False):
folder = self.filenames[idx]
image = self.loader(os.path.join(self.data_path, 'two_view_training',
folder, 'im0.png'))
stereo_image = self.loader(os.path.join(self.data_path, 'two_view_training',
folder, 'im1.png'))
return image, stereo_image
def load_disparity(self, idx, do_flip=False):
folder = self.filenames[idx]
disparity = self.read_pfm(os.path.join(self.data_path, 'two_view_training_gt',
folder, 'disp0GT.pfm'))
# loaded disparity contains infs for no reading
disparity[disparity == np.inf] = 0
return np.ascontiguousarray(disparity)
def __getitem__(self, idx):
inputs = {}
image, stereo_image = self.load_images(idx, do_flip=False)
image = self.img_resizer(image)
stereo_image = self.img_resizer(stereo_image)
disparity = self.load_disparity(idx)
inputs['image'] = image
inputs['stereo_image'] = stereo_image
inputs['disparity'] = disparity
self.preprocess(inputs)
return inputs
================================================
FILE: datasets/flow_transforms.py
================================================
from __future__ import division
import torch
import random
import numpy as np
import numbers
import pdb
import cv2
class Compose(object):
""" Composes several co_transforms together.
"""
def __init__(self, co_transforms):
self.co_transforms = co_transforms
def __call__(self, input, target):
for t in self.co_transforms:
input,target = t(input,target)
return input,target
class Compose_mask(object):
""" Composes several co_transforms together.
"""
def __init__(self, co_transforms):
self.co_transforms = co_transforms
def __call__(self, input, target, mask):
for t in self.co_transforms:
input,target,mask = t(input,target,mask)
return input,target,mask
class Scale(object):
""" Rescales the inputs and target arrays to the given 'size'.
"""
def __init__(self, size, order=2):
self.ratio = size
self.order = order
if order==0:
self.code=cv2.INTER_NEAREST
elif order==1:
self.code=cv2.INTER_LINEAR
elif order==2:
self.code=cv2.INTER_CUBIC
def __call__(self, inputs, target):
h, w, _ = inputs[0].shape
ratio = self.ratio
inputs[0] = cv2.resize(inputs[0], None, fx=ratio,fy=ratio,interpolation=cv2.INTER_CUBIC)
inputs[1] = cv2.resize(inputs[1], None, fx=ratio,fy=ratio,interpolation=cv2.INTER_CUBIC)
target = cv2.resize(target, None, fx=ratio,fy=ratio,interpolation=self.code) * ratio
return inputs, target
class RandomCrop(object):
""" Randomly crop images
"""
def __init__(self, size):
if isinstance(size, numbers.Number):
self.size = (int(size), int(size))
else:
self.size = size
def __call__(self, inputs,target):
h, w, _ = inputs[0].shape
th, tw = self.size
if w < tw: tw=w
if h < th: th=h
x1 = random.randint(0, w - tw)
y1 = random.randint(0, h - th)
inputs[0] = inputs[0][y1: y1 + th,x1: x1 + tw]
inputs[1] = inputs[1][y1: y1 + th,x1: x1 + tw]
return inputs, target[y1: y1 + th,x1: x1 + tw]
class RandomCrop_mask(object):
""" Randomly crop images
"""
def __init__(self, size):
if isinstance(size, numbers.Number):
self.size = (int(size), int(size))
else:
self.size = size
def __call__(self, inputs,target,mask):
h, w, _ = inputs[0].shape
th, tw = self.size
if w < tw: tw=w
if h < th: th=h
x1 = random.randint(0, w - tw)
y1 = random.randint(0, h - th)
inputs[0] = inputs[0][y1: y1 + th,x1: x1 + tw]
inputs[1] = inputs[1][y1: y1 + th,x1: x1 + tw]
mask = mask[y1: y1 + th,x1: x1 + tw]
return inputs, target[y1: y1 + th,x1: x1 + tw], mask
class RandomVdisp(object):
"""Random vertical disparity augmentation
"""
def __init__(self, angle, px, diff_angle=0, order=2, reshape=False):
self.angle = angle
self.reshape = reshape
self.order = order
self.diff_angle = diff_angle
self.px = px
def __call__(self, inputs,target):
px2 = random.uniform(-self.px,self.px)
angle2 = random.uniform(-self.angle,self.angle)
image_center = (np.random.uniform(0,inputs[1].shape[0]),\
np.random.uniform(0,inputs[1].shape[1]))
rot_mat = cv2.getRotationMatrix2D(image_center, angle2, 1.0)
inputs[1] = cv2.warpAffine(inputs[1], rot_mat, inputs[1].shape[1::-1], flags=cv2.INTER_LINEAR)
trans_mat = np.float32([[1,0,0],[0,1,px2]])
inputs[1] = cv2.warpAffine(inputs[1], trans_mat, inputs[1].shape[1::-1], flags=cv2.INTER_LINEAR)
return inputs,target
================================================
FILE: datasets/kitti_dataset_1215.py
================================================
import os
import random
from torch.utils.data import Dataset
from PIL import Image
import numpy as np
import cv2
from datasets.data_io import get_transform, read_all_lines, pfm_imread
import torchvision.transforms as transforms
import torch
import matplotlib.pyplot as plt
class KITTIDataset(Dataset):
def __init__(self, kitti15_datapath, kitti12_datapath, list_filename, training):
self.datapath_15 = kitti15_datapath
self.datapath_12 = kitti12_datapath
self.left_filenames, self.right_filenames, self.disp_filenames = self.load_path(list_filename)
self.training = training
if self.training:
assert self.disp_filenames is not None
def load_path(self, list_filename):
lines = read_all_lines(list_filename)
splits = [line.split() for line in lines]
left_images = [x[0] for x in splits]
right_images = [x[1] for x in splits]
if len(splits[0]) == 2: # ground truth not available
return left_images, right_images, None
else:
disp_images = [x[2] for x in splits]
return left_images, right_images, disp_images
def load_image(self, filename):
return Image.open(filename).convert('RGB')
def load_disp(self, filename):
data = Image.open(filename)
data = np.array(data, dtype=np.float32) / 256.
return data
def __len__(self):
return len(self.left_filenames)
def __getitem__(self, index):
left_name = self.left_filenames[index].split('/')[1]
if left_name.startswith('image'):
self.datapath = self.datapath_15
else:
self.datapath = self.datapath_12
left_img = self.load_image(os.path.join(self.datapath, self.left_filenames[index]))
right_img = self.load_image(os.path.join(self.datapath, self.right_filenames[index]))
if self.disp_filenames: # has disparity ground truth
disparity = self.load_disp(os.path.join(self.datapath, self.disp_filenames[index]))
else:
disparity = None
if self.training:
w, h = left_img.size
crop_w, crop_h = 512, 256
x1 = random.randint(0, w - crop_w)
# y1 = random.randint(0, h - crop_h)
if random.randint(0, 10) >= int(8):
y1 = random.randint(0, h - crop_h)
else:
y1 = random.randint(int(0.3 * h), h - crop_h)
# random crop
left_img = left_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
right_img = right_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
disparity = disparity[y1:y1 + crop_h, x1:x1 + crop_w]
disparity_low = cv2.resize(disparity, (crop_w//4, crop_h//4), interpolation=cv2.INTER_NEAREST)
# to tensor, normalize
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"disparity_low": disparity_low}
else:
w, h = left_img.size
# normalize
processed = get_transform()
left_img = processed(left_img).numpy()
right_img = processed(right_img).numpy()
# pad to size 1248x384
top_pad = 384 - h
right_pad = 1248 - w
assert top_pad > 0 and right_pad > 0
# pad images
left_img = np.lib.pad(left_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
right_img = np.lib.pad(right_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant',
constant_values=0)
# pad disparity gt
if disparity is not None:
assert len(disparity.shape) == 2
disparity = np.lib.pad(disparity, ((top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
if disparity is not None:
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"top_pad": top_pad,
"right_pad": right_pad}
else:
return {"left": left_img,
"right": right_img,
"top_pad": top_pad,
"right_pad": right_pad,
"left_filename": self.left_filenames[index],
"right_filename": self.right_filenames[index]}
================================================
FILE: datasets/kitti_dataset_1215_augmentation.py
================================================
import os
import random
from torch.utils.data import Dataset
from PIL import Image
import numpy as np
import cv2
from datasets.data_io import get_transform, read_all_lines, pfm_imread
# import torchvision.transforms as transforms
from . import flow_transforms
import torch
import matplotlib.pyplot as plt
import torchvision
class KITTIDataset(Dataset):
def __init__(self, kitti15_datapath, kitti12_datapath, list_filename, training):
self.datapath_15 = kitti15_datapath
self.datapath_12 = kitti12_datapath
self.left_filenames, self.right_filenames, self.disp_filenames = self.load_path(list_filename)
self.training = training
if self.training:
assert self.disp_filenames is not None
def load_path(self, list_filename):
lines = read_all_lines(list_filename)
splits = [line.split() for line in lines]
left_images = [x[0] for x in splits]
right_images = [x[1] for x in splits]
if len(splits[0]) == 2: # ground truth not available
return left_images, right_images, None
else:
disp_images = [x[2] for x in splits]
return left_images, right_images, disp_images
def load_image(self, filename):
return Image.open(filename).convert('RGB')
def load_disp(self, filename):
data = Image.open(filename)
data = np.array(data, dtype=np.float32) / 256.
return data
def __len__(self):
return len(self.left_filenames)
def __getitem__(self, index):
left_name = self.left_filenames[index].split('/')[1]
if left_name.startswith('image'):
self.datapath = self.datapath_15
else:
self.datapath = self.datapath_12
left_img = self.load_image(os.path.join(self.datapath, self.left_filenames[index]))
right_img = self.load_image(os.path.join(self.datapath, self.right_filenames[index]))
if self.disp_filenames: # has disparity ground truth
disparity = self.load_disp(os.path.join(self.datapath, self.disp_filenames[index]))
else:
disparity = None
if self.training:
th, tw = 256, 512
random_brightness = np.random.uniform(0.5, 2.0, 2)
random_gamma = np.random.uniform(0.8, 1.2, 2)
random_contrast = np.random.uniform(0.8, 1.2, 2)
random_saturation = np.random.uniform(0, 1.4, 2)
left_img = torchvision.transforms.functional.adjust_brightness(left_img, random_brightness[0])
left_img = torchvision.transforms.functional.adjust_gamma(left_img, random_gamma[0])
left_img = torchvision.transforms.functional.adjust_contrast(left_img, random_contrast[0])
right_img = torchvision.transforms.functional.adjust_brightness(right_img, random_brightness[1])
right_img = torchvision.transforms.functional.adjust_gamma(right_img, random_gamma[1])
right_img = torchvision.transforms.functional.adjust_contrast(right_img, random_contrast[1])
left_img = torchvision.transforms.functional.adjust_saturation(left_img, random_saturation[0])
right_img = torchvision.transforms.functional.adjust_saturation(right_img, random_saturation[1])
right_img = np.array(right_img)
left_img = np.array(left_img)
# geometric unsymmetric-augmentation
angle = 0;
px = 0
if np.random.binomial(1, 0.5):
# angle = 0.1;
# px = 2
angle = 0.05
px = 1
co_transform = flow_transforms.Compose([
# flow_transforms.RandomVdisp(angle, px),
# flow_transforms.Scale(np.random.uniform(self.rand_scale[0], self.rand_scale[1]), order=self.order),
flow_transforms.RandomCrop((th, tw)),
])
augmented, disparity = co_transform([left_img, right_img], disparity)
left_img = augmented[0]
right_img = augmented[1]
right_img.flags.writeable = True
if np.random.binomial(1,0.2):
sx = int(np.random.uniform(35,100))
sy = int(np.random.uniform(25,75))
cx = int(np.random.uniform(sx,right_img.shape[0]-sx))
cy = int(np.random.uniform(sy,right_img.shape[1]-sy))
right_img[cx-sx:cx+sx,cy-sy:cy+sy] = np.mean(np.mean(right_img,0),0)[np.newaxis,np.newaxis]
# to tensor, normalize
disparity = np.ascontiguousarray(disparity, dtype=np.float32)
disparity_low = cv2.resize(disparity, (tw//4, th//4), interpolation=cv2.INTER_NEAREST)
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"disparity_low": disparity_low}
else:
w, h = left_img.size
# normalize
processed = get_transform()
left_img = processed(left_img).numpy()
right_img = processed(right_img).numpy()
# pad to size 1248x384
top_pad = 384 - h
right_pad = 1248 - w
assert top_pad > 0 and right_pad > 0
# pad images
left_img = np.lib.pad(left_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
right_img = np.lib.pad(right_img, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
# pad disparity gt
if disparity is not None:
assert len(disparity.shape) == 2
disparity = np.lib.pad(disparity, ((top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
if disparity is not None:
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"top_pad": top_pad,
"right_pad": right_pad,
"left_filename": self.left_filenames[index],
"right_filename": self.right_filenames[index]
}
else:
return {"left": left_img,
"right": right_img,
"top_pad": top_pad,
"right_pad": right_pad,
"left_filename": self.left_filenames[index],
"right_filename": self.right_filenames[index]}
================================================
FILE: datasets/kittiraw_loader.py
================================================
import os
import random
from torch.utils.data import Dataset
from PIL import Image
import numpy as np
import cv2
from datasets.data_io import get_transform, read_all_lines, pfm_imread
import torchvision.transforms as transforms
import torch
import matplotlib.pyplot as plt
def listfiles(filepath):
#filepath = './data_demo/KITTI_raw/2011_09_26/2011_09_26_drive_0005_sync/'
left_fold = 'image_02/data'
right_fold = 'image_03/data'
datadir_left = '{}/{}'.format(filepath, left_fold)
datadir_right = '{}/{}'.format(filepath, right_fold)
img_names = os.listdir(datadir_left)
img_names.sort()
left_imgs = []
right_imgs = []
for i in range(len(img_names)):
left_imgs.append(os.path.join(datadir_left, img_names[i]))
right_imgs.append(os.path.join(datadir_right, img_names[i]))
return left_imgs, right_imgs
class ImageLoader(Dataset):
def __init__(self, left, right):
self.left = left
self.right = right
def load_path(self, list_filename):
lines = read_all_lines(list_filename)
splits = [line.split() for line in lines]
left_images = [x[0] for x in splits]
right_images = [x[1] for x in splits]
if len(splits[0]) == 2: # ground truth not available
return left_images, right_images, None
else:
disp_images = [x[2] for x in splits]
return left_images, right_images, disp_images
def load_image(self, filename):
return Image.open(filename).convert('RGB')
def load_disp(self, filename):
data = Image.open(filename)
data = np.array(data, dtype=np.float32) / 256.
return data
def __getitem__(self, index):
batch = dict()
left = self.left[index]
right = self.right[index]
left_img = self.load_image(left)
right_img = self.load_image(right)
w, h = left_img.size
# h, w, _ = left_img.shape
processed = get_transform()
left_img_p = processed(left_img).numpy()
right_img_p = processed(right_img).numpy()
top_pad = 384 - h
right_pad = 1248 - w
assert top_pad > 0 and right_pad > 0
# pad images
left_img_p = np.lib.pad(left_img_p, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
right_img_p = np.lib.pad(right_img_p, ((0, 0), (top_pad, 0), (0, right_pad)), mode='constant', constant_values=0)
# convert the raw PIL images to CHW float arrays (unpadded)
left_img_ = np.transpose(np.array(left_img), (2, 0, 1)).astype(np.float32)
right_img_ = np.transpose(np.array(right_img), (2, 0, 1)).astype(np.float32)
batch['imgL'], batch['imgR'] = left_img_p, right_img_p
batch['imgLRaw'], batch['imgRRaw'] = left_img_, right_img_
batch['top_pad'], batch['right_pad'] = top_pad, right_pad
return batch
def __len__(self):
return len(self.left)
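The KITTI loaders above pad every normalized CHW image to 384x1248 by adding rows at the top and columns on the right, so `top_pad` and `right_pad` can later be stripped from the prediction. A minimal sketch of that convention on a dummy array; the 375x1242 input resolution is an illustrative assumption, and `np.pad` is the same function as the `np.lib.pad` alias used above:

```python
import numpy as np

# A normalized CHW image at a typical raw KITTI resolution (illustrative).
h, w = 375, 1242
img = np.zeros((3, h, w), dtype=np.float32)

# Pad rows at the top and columns on the right up to 384 x 1248.
top_pad, right_pad = 384 - h, 1248 - w
padded = np.pad(img, ((0, 0), (top_pad, 0), (0, right_pad)),
                mode='constant', constant_values=0)

# The padding can later be stripped from a prediction of the same shape.
unpadded = padded[:, top_pad:, :w]
```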
================================================
FILE: datasets/listfiles.py
================================================
import torch.utils.data as data
import pdb
from PIL import Image
import os
import os.path
import numpy as np
import glob
def dataloader(filepath):
img_list = [i.split('/')[-1] for i in glob.glob('%s/*'%filepath) if os.path.isdir(i)]
left_train = ['%s/%s/im0.png'% (filepath,img) for img in img_list]
right_train = ['%s/%s/im1.png'% (filepath,img) for img in img_list]
disp_train_L = ['%s/%s/disp0GT.pfm' % (filepath,img) for img in img_list]
disp_train_R = ['%s/%s/disp1GT.pfm' % (filepath,img) for img in img_list]
return left_train, right_train, disp_train_L, disp_train_R
================================================
FILE: datasets/middlebury_data.py
================================================
import os
import numpy as np
from PIL import Image # using pillow-simd for increased speed
from torchvision import transforms
import cv2
import torch
import torch.utils.data as data
def pil_loader(path):
# open path as file to avoid ResourceWarning
# (https://github.com/python-pillow/Pillow/issues/835)
with open(path, 'rb') as f:
with Image.open(f) as img:
return img.convert('RGB')
def readlines(filename):
""" read lines of a text file """
with open(filename, 'r') as file_handler:
lines = file_handler.read().splitlines()
return lines
def read_pfm(file):
with open(file, 'rb') as fh:
fh.readline()
width, height = fh.readline().rstrip().decode().split()
fh.readline()
disp = np.fromfile(fh, '<f')
return np.flipud(disp.reshape(int(height), int(width)))
class MiddleburyStereoDataset:
def __init__(self,
data_path,
filenames,
height,
width,
is_train=False,
has_gt=True,
**kwargs):
self.img_resizer = transforms.Resize(size=(height, width))
self.height = height
self.width = width
self.loader = pil_loader
self.read_pfm = read_pfm
self.data_path = data_path
self.filenames = readlines(filenames)
self.to_tensor = transforms.ToTensor()
self.is_train = is_train
self.has_gt = has_gt
def load_images(self, idx, do_flip=False):
folder = self.filenames[idx]
left_img = self.loader(os.path.join(self.data_path, 'MiddEval3/trainingF',
folder, 'im0.png'))
right_img = self.loader(os.path.join(self.data_path, 'MiddEval3/trainingF',
folder, 'im1.png'))
return left_img, right_img
def load_disparity(self, idx, do_flip=False):
folder = self.filenames[idx]
disparity = self.read_pfm(os.path.join(self.data_path, 'MiddEval3/trainingF',
folder, 'disp0GT.pfm'))
# loaded disparity contains infs for no reading
disparity[disparity == np.inf] = 0
return np.ascontiguousarray(disparity)
def preprocess(self, inputs):
# convert to tensors and standardise using ImageNet
for key in ['left_img', 'right_img']:
inputs[key] = (self.to_tensor(inputs[key]) - 0.45) / 0.225
if self.has_gt:
inputs['disparity'] = torch.from_numpy(inputs['disparity']).float()
def __len__(self):
return len(self.filenames)
def __getitem__(self, idx):
inputs = {}
left_img, right_img = self.load_images(idx, do_flip=False)
left_img = self.img_resizer(left_img)
right_img = self.img_resizer(right_img)
disparity = self.load_disparity(idx)
# disparity = cv2.resize(disparity, (self.width,self.height))
inputs['left_img'] = left_img
inputs['right_img'] = right_img
inputs['disparity'] = disparity
self.preprocess(inputs)
left_img = inputs['left_img']
right_img = inputs['right_img']
disparity = inputs['disparity']
# print(left_img.shape)
# print(right_img.shape)
# print(disparity.shape)
return {"left": left_img,
"right": right_img,
"disparity": disparity}
================================================
FILE: datasets/middlebury_data_our.py
================================================
import os
import numpy as np
from PIL import Image # using pillow-simd for increased speed
from torchvision import transforms
import cv2
import torch
import torch.utils.data as data
import random
def pil_loader(path):
# open path as file to avoid ResourceWarning
# (https://github.com/python-pillow/Pillow/issues/835)
with open(path, 'rb') as f:
with Image.open(f) as img:
return img.convert('RGB')
def readlines(filename):
""" read lines of a text file """
with open(filename, 'r') as file_handler:
lines = file_handler.read().splitlines()
return lines
def read_pfm(file):
with open(file, 'rb') as fh:
fh.readline()
width, height = fh.readline().rstrip().decode().split()
fh.readline()
disp = np.fromfile(fh, '<f')
return np.flipud(disp.reshape(int(height), int(width)))
class MiddleburyStereoDataset:
def __init__(self,
data_path,
filenames,
height,
width,
is_train=False,
has_gt=True,
**kwargs):
self.img_resizer = transforms.Resize(size=(height, width))
self.height = height
self.width = width
self.loader = pil_loader
self.read_pfm = read_pfm
self.data_path = data_path
self.filenames = readlines(filenames)
self.to_tensor = transforms.ToTensor()
self.is_train = is_train
self.has_gt = has_gt
self.training = is_train
def load_images(self, idx, do_flip=False):
folder = self.filenames[idx]
left_img = self.loader(os.path.join(self.data_path, 'MiddEval3/trainingF',
folder, 'im0.png'))
right_img = self.loader(os.path.join(self.data_path, 'MiddEval3/trainingF',
folder, 'im1.png'))
return left_img, right_img
def load_disparity(self, idx, do_flip=False):
folder = self.filenames[idx]
disparity = self.read_pfm(os.path.join(self.data_path, 'MiddEval3/trainingF',
folder, 'disp0GT.pfm'))
# loaded disparity contains infs for no reading
disparity[disparity == np.inf] = 0
return np.ascontiguousarray(disparity)
def preprocess(self, inputs):
# convert to tensors and standardise using ImageNet
for key in ['left_img', 'right_img']:
inputs[key] = (self.to_tensor(inputs[key]) - 0.45) / 0.225
if self.has_gt:
inputs['disparity'] = torch.from_numpy(inputs['disparity']).float()
def __len__(self):
return len(self.filenames)
def __getitem__(self, idx):
inputs = {}
left_img, right_img = self.load_images(idx, do_flip=False)
disparity = self.load_disparity(idx)
if self.is_train:
w, h = left_img.size
crop_w, crop_h = self.width, self.height
x1 = random.randint(0, w - crop_w)
y1 = random.randint(0, h - crop_h)
left_img = left_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
right_img = right_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
disparity = disparity[y1:y1 + crop_h, x1:x1 + crop_w]
inputs['left_img'] = left_img
inputs['right_img'] = right_img
inputs['disparity'] = disparity
self.preprocess(inputs)
left_img = inputs['left_img']
right_img = inputs['right_img']
disparity = inputs['disparity']
return {"left": left_img,
"right": right_img,
"disparity": disparity}
else:
w, h = left_img.size
crop_w, crop_h = self.width, self.height
x1 = random.randint(0, w - crop_w)
y1 = random.randint(0, h - crop_h)
left_img = left_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
right_img = right_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
disparity = disparity[y1:y1 + crop_h, x1:x1 + crop_w]
inputs['left_img'] = left_img
inputs['right_img'] = right_img
inputs['disparity'] = disparity
self.preprocess(inputs)
left_img = inputs['left_img']
right_img = inputs['right_img']
disparity = inputs['disparity']
return {"left": left_img,
"right": right_img,
"disparity": disparity}
================================================
FILE: datasets/middlebury_loader.py
================================================
import os
from PIL import Image
from datasets import readpfm as rp
import torch.utils.data as data
import torchvision.transforms as transforms
import numpy as np
import random
def mb_loader(filepath, res):
train_path = os.path.join(filepath, 'training' + res)
test_path = os.path.join(filepath, 'test' + res)
# gt_path = train_path.replace('training' + res, 'Eval3_GT/training' + res)
gt_path = os.path.join(filepath, 'training' + res)
train_left = []
train_right = []
train_gt = []
for c in os.listdir(train_path):
train_left.append(os.path.join(train_path, c, 'im0.png'))
train_right.append(os.path.join(train_path, c, 'im1.png'))
train_gt.append(os.path.join(gt_path, c, 'disp0GT.pfm'))
test_left = []
test_right = []
for c in os.listdir(test_path):
test_left.append(os.path.join(test_path, c, 'im0.png'))
test_right.append(os.path.join(test_path, c, 'im1.png'))
train_left = sorted(train_left)
train_right = sorted(train_right)
train_gt = sorted(train_gt)
test_left = sorted(test_left)
test_right = sorted(test_right)
return train_left, train_right, train_gt, test_left, test_right
def img_loader(path):
return Image.open(path).convert('RGB')
def disparity_loader(path):
return rp.readPFM(path)
class myDataset(data.Dataset):
def __init__(self, left, right, left_disp, training, imgloader=img_loader, dploader = disparity_loader):
self.left = left
self.right = right
self.disp_L = left_disp
self.imgloader = imgloader
self.dploader = dploader
self.training = training
self.img_transorm = transforms.Compose([
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])])
def __getitem__(self, index):
left = self.left[index]
right = self.right[index]
disp_L = self.disp_L[index]
left_img = self.imgloader(left)
right_img = self.imgloader(right)
dataL, scaleL = self.dploader(disp_L)
dataL = Image.fromarray(np.ascontiguousarray(dataL, dtype=np.float32))
if self.training:
w, h = left_img.size
# random resize
s = np.random.uniform(0.95, 1.05)
rw, rh = int(round(w * s)), int(round(h * s))
left_img = left_img.resize((rw, rh), Image.NEAREST)
right_img = right_img.resize((rw, rh), Image.NEAREST)
dataL = dataL.resize((rw, rh), Image.NEAREST)
dataL = Image.fromarray(np.array(dataL) * s)
# random horizontal flip
if np.random.rand() >= 0.5:
left_img = horizontal_flip(left_img)
right_img = horizontal_flip(right_img)
dataL = horizontal_flip(dataL)
w, h = left_img.size
tw, th = 320, 240
x1 = random.randint(0, w - tw)
y1 = random.randint(0, h - th)
left_img = left_img.crop((x1, y1, x1+tw, y1+th))
right_img = right_img.crop((x1, y1, x1+tw, y1+th))
dataL = dataL.crop((x1, y1, x1+tw, y1+th))
left_img = self.img_transorm(left_img)
right_img = self.img_transorm(right_img)
dataL = np.array(dataL)
return left_img, right_img, dataL
else:
w, h = left_img.size
left_img = left_img.resize((w // 32 * 32, h // 32 * 32))
right_img = right_img.resize((w // 32 * 32, h // 32 * 32))
left_img = self.img_transorm(left_img)
right_img = self.img_transorm(right_img)
dataL = np.array(dataL)
return left_img, right_img, dataL
def __len__(self):
return len(self.left)
def horizontal_flip(img):
img_np = np.array(img)
img_np = np.flip(img_np, axis=1)
img = Image.fromarray(img_np)
return img
if __name__ == '__main__':
train_left, train_right, train_gt, _, _ = mb_loader('/media/data/dataset/MiddEval3-data-Q/', res='Q')
H, W = 0, 0
for l in train_right:
left_img = Image.open(l).convert('RGB')
w, h = left_img.size  # PIL .size is (width, height)
H += h
W += w
print(H / 15, W / 15)
================================================
FILE: datasets/readpfm.py
================================================
import re
import numpy as np
import sys
def readPFM(file):
file = open(file, 'rb')
color = None
width = None
height = None
scale = None
endian = None
header = file.readline().rstrip()
if sys.version_info[0] == 3:
header = header.decode('utf-8')
if header == 'PF':
color = True
elif header == 'Pf':
color = False
else:
raise Exception('Not a PFM file.')
if sys.version_info[0] == 3:
dim_match = re.match(r'^(\d+)\s(\d+)\s$', file.readline().decode('utf-8'))
else:
dim_match = re.match(r'^(\d+)\s(\d+)\s$', file.readline())
if dim_match:
width, height = map(int, dim_match.groups())
else:
raise Exception('Malformed PFM header.')
if sys.version_info[0] == 3:
scale = float(file.readline().rstrip().decode('utf-8'))
else:
scale = float(file.readline().rstrip())
if scale < 0: # little-endian
endian = '<'
scale = -scale
else:
endian = '>' # big-endian
data = np.fromfile(file, endian + 'f')
shape = (height, width, 3) if color else (height, width)
data = np.reshape(data, shape)
data = np.flipud(data)
return data, scale
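A self-contained round-trip sketch of the layout `readPFM` above expects: a text header ('PF' for color, 'Pf' for grayscale), a dimension line, then a scale whose sign encodes endianness, followed by float32 rows stored bottom-to-top. `parse_pfm` below mirrors `readPFM`'s logic on an in-memory buffer; both the helper name and the 2x2 sample are illustrative assumptions:

```python
import io
import numpy as np

def parse_pfm(buf):
    # Header line: 'PF' (3-channel) or 'Pf' (grayscale).
    color = buf.readline().rstrip().decode() == 'PF'
    # Dimension line: '<width> <height>'.
    width, height = map(int, buf.readline().decode().split())
    # Scale line: a negative value means little-endian data.
    scale = float(buf.readline().decode().rstrip())
    endian = '<' if scale < 0 else '>'
    data = np.frombuffer(buf.read(), dtype=endian + 'f4')
    shape = (height, width, 3) if color else (height, width)
    # PFM stores rows bottom-to-top, so flip vertically.
    return np.flipud(data.reshape(shape)), abs(scale)

disp = np.arange(4, dtype='<f4').reshape(2, 2)
raw = b'Pf\n2 2\n-1.0\n' + np.flipud(disp).tobytes()
data, scale = parse_pfm(io.BytesIO(raw))
```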
================================================
FILE: datasets/sceneflow_dataset.py
================================================
import os
import random
from PIL import Image
from torch.utils.data import Dataset
import numpy as np
import cv2
from datasets.data_io import get_transform, read_all_lines, pfm_imread
import torchvision.transforms as transforms
import torch
from PIL import ImageFile
ImageFile.LOAD_TRUNCATED_IMAGES = True
import matplotlib.pyplot as plt
class SceneFlowDatset(Dataset):
def __init__(self, datapath, list_filename, training):
self.datapath = datapath
self.left_filenames, self.right_filenames, self.disp_filenames = self.load_path(list_filename)
self.training = training
def load_path(self, list_filename):
lines = read_all_lines(list_filename)
splits = [line.split() for line in lines]
left_images = [x[0] for x in splits]
right_images = [x[1] for x in splits]
disp_images = [x[2] for x in splits]
return left_images, right_images, disp_images
def load_image(self, filename):
return Image.open(filename).convert('RGB')
def load_disp(self, filename):
data, scale = pfm_imread(filename)
data = np.ascontiguousarray(data, dtype=np.float32)
return data
def __len__(self):
return len(self.left_filenames)
def __getitem__(self, index):
left_img = self.load_image(os.path.join(self.datapath, self.left_filenames[index]))
right_img = self.load_image(os.path.join(self.datapath, self.right_filenames[index]))
disparity = self.load_disp(os.path.join(self.datapath, self.disp_filenames[index]))
left_img_np = np.array(left_img)
dx_imgL = cv2.Sobel(left_img_np, cv2.CV_32F, 1, 0, ksize=3)
dy_imgL = cv2.Sobel(left_img_np, cv2.CV_32F, 0, 1, ksize=3)
dxy_imgL = np.sqrt(np.sum(np.square(dx_imgL), axis=-1) + np.sum(np.square(dy_imgL), axis=-1))
dxy_imgL = dxy_imgL / (np.max(dxy_imgL) + 1e-5)
if self.training:
w, h = left_img.size
crop_w, crop_h = 512, 256
x1 = random.randint(0, w - crop_w)
y1 = random.randint(0, h - crop_h)
# random crop
left_img = left_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
right_img = right_img.crop((x1, y1, x1 + crop_w, y1 + crop_h))
disparity = disparity[y1:y1 + crop_h, x1:x1 + crop_w]
disparity_low = cv2.resize(disparity, (crop_w//4, crop_h//4), interpolation=cv2.INTER_NEAREST)
disparity_low_r8 = cv2.resize(disparity, (crop_w//8, crop_h//8), interpolation=cv2.INTER_NEAREST)
gradient_map = dxy_imgL[y1:y1 + crop_h, x1:x1 + crop_w]
# to tensor, normalize
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
gradient_map = torch.from_numpy(gradient_map)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"gradient_map":gradient_map,
"disparity_low":disparity_low,
"disparity_low_r8":disparity_low_r8}
else:
w, h = left_img.size
crop_w, crop_h = 960, 512
left_img = left_img.crop((w - crop_w, h - crop_h, w, h))
right_img = right_img.crop((w - crop_w, h - crop_h, w, h))
disparity = disparity[h - crop_h:h, w - crop_w: w]
gradient_map = dxy_imgL[h - crop_h:h, w - crop_w: w]
disparity_low = cv2.resize(disparity, (crop_w//4, crop_h//4), interpolation=cv2.INTER_NEAREST)
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"top_pad": 0,
"right_pad": 0,
"gradient_map":gradient_map,
"disparity_low":disparity_low,
"left_filename": self.left_filenames[index]}
================================================
FILE: datasets/sceneflow_dataset_augmentation.py
================================================
import os
import random
from torch.utils.data import Dataset
from PIL import Image
import numpy as np
from datasets.data_io import get_transform, read_all_lines, pfm_imread
from . import flow_transforms
import torchvision
import cv2
import copy
class SceneFlowDatset(Dataset):
def __init__(self, datapath, list_filename, training):
self.datapath = datapath
self.left_filenames, self.right_filenames, self.disp_filenames = self.load_path(list_filename)
self.training = training
def load_path(self, list_filename):
lines = read_all_lines(list_filename)
splits = [line.split() for line in lines]
left_images = [x[0] for x in splits]
right_images = [x[1] for x in splits]
disp_images = [x[2] for x in splits]
return left_images, right_images, disp_images
def load_image(self, filename):
return Image.open(filename).convert('RGB')
def load_disp(self, filename):
data, scale = pfm_imread(filename)
data = np.ascontiguousarray(data, dtype=np.float32)
return data
def __len__(self):
return len(self.left_filenames)
def __getitem__(self, index):
left_img = self.load_image(os.path.join(self.datapath, self.left_filenames[index]))
right_img = self.load_image(os.path.join(self.datapath, self.right_filenames[index]))
disparity = self.load_disp(os.path.join(self.datapath, self.disp_filenames[index]))
if self.training:
th, tw = 256, 512
#th, tw = 288, 512
random_brightness = np.random.uniform(0.5, 2.0, 2)
random_gamma = np.random.uniform(0.8, 1.2, 2)
random_contrast = np.random.uniform(0.8, 1.2, 2)
random_saturation = np.random.uniform(0, 1.4, 2)
left_img = torchvision.transforms.functional.adjust_brightness(left_img, random_brightness[0])
left_img = torchvision.transforms.functional.adjust_gamma(left_img, random_gamma[0])
left_img = torchvision.transforms.functional.adjust_contrast(left_img, random_contrast[0])
left_img = torchvision.transforms.functional.adjust_saturation(left_img, random_saturation[0])
right_img = torchvision.transforms.functional.adjust_brightness(right_img, random_brightness[1])
right_img = torchvision.transforms.functional.adjust_gamma(right_img, random_gamma[1])
right_img = torchvision.transforms.functional.adjust_contrast(right_img, random_contrast[1])
right_img = torchvision.transforms.functional.adjust_saturation(right_img, random_saturation[1])
left_img = np.array(left_img)
right_img = np.array(right_img)
# geometric unsymmetric-augmentation
angle = 0
px = 0
if np.random.binomial(1, 0.5):
# angle = 0.1;
# px = 2
angle = 0.05
px = 1
co_transform = flow_transforms.Compose([
flow_transforms.RandomCrop((th, tw)),
])
augmented, disparity = co_transform([left_img, right_img], disparity)
left_img = augmented[0]
right_img = augmented[1]
# randomly occlude a region
right_img.flags.writeable = True
if np.random.binomial(1, 0.5):
sx = int(np.random.uniform(35, 100))
sy = int(np.random.uniform(25, 75))
cx = int(np.random.uniform(sx, right_img.shape[0] - sx))
cy = int(np.random.uniform(sy, right_img.shape[1] - sy))
right_img[cx - sx:cx + sx, cy - sy:cy + sy] = np.mean(np.mean(right_img, 0), 0)[np.newaxis, np.newaxis]
# w, h = left_img.size
disparity = np.ascontiguousarray(disparity, dtype=np.float32)
disparity_low = cv2.resize(disparity, (tw//4, th//4), interpolation=cv2.INTER_NEAREST)
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"disparity_low": disparity_low}
else:
w, h = left_img.size
crop_w, crop_h = 960, 512
left_img = left_img.crop((w - crop_w, h - crop_h, w, h))
right_img = right_img.crop((w - crop_w, h - crop_h, w, h))
disparity = disparity[h - crop_h:h, w - crop_w: w]
processed = get_transform()
left_img = processed(left_img)
right_img = processed(right_img)
return {"left": left_img,
"right": right_img,
"disparity": disparity,
"top_pad": 0,
"right_pad": 0}
================================================
FILE: filenames/kitti12_15_all.txt
================================================
training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png
training/image_2/000002_10.png training/image_3/000002_10.png training/disp_occ_0/000002_10.png
training/image_2/000003_10.png training/image_3/000003_10.png training/disp_occ_0/000003_10.png
training/image_2/000004_10.png training/image_3/000004_10.png training/disp_occ_0/000004_10.png
training/image_2/000005_10.png training/image_3/000005_10.png training/disp_occ_0/000005_10.png
training/image_2/000007_10.png training/image_3/000007_10.png training/disp_occ_0/000007_10.png
training/image_2/000008_10.png training/image_3/000008_10.png training/disp_occ_0/000008_10.png
training/image_2/000009_10.png training/image_3/000009_10.png training/disp_occ_0/000009_10.png
training/image_2/000010_10.png training/image_3/000010_10.png training/disp_occ_0/000010_10.png
training/image_2/000011_10.png training/image_3/000011_10.png training/disp_occ_0/000011_10.png
training/image_2/000012_10.png training/image_3/000012_10.png training/disp_occ_0/000012_10.png
training/image_2/000013_10.png training/image_3/000013_10.png training/disp_occ_0/000013_10.png
training/image_2/000014_10.png training/image_3/000014_10.png training/disp_occ_0/000014_10.png
training/image_2/000015_10.png training/image_3/000015_10.png training/disp_occ_0/000015_10.png
training/image_2/000016_10.png training/image_3/000016_10.png training/disp_occ_0/000016_10.png
training/image_2/000017_10.png training/image_3/000017_10.png training/disp_occ_0/000017_10.png
training/image_2/000018_10.png training/image_3/000018_10.png training/disp_occ_0/000018_10.png
training/image_2/000019_10.png training/image_3/000019_10.png training/disp_occ_0/000019_10.png
training/image_2/000020_10.png training/image_3/000020_10.png training/disp_occ_0/000020_10.png
training/image_2/000021_10.png training/image_3/000021_10.png training/disp_occ_0/000021_10.png
training/image_2/000022_10.png training/image_3/000022_10.png training/disp_occ_0/000022_10.png
training/image_2/000023_10.png training/image_3/000023_10.png training/disp_occ_0/000023_10.png
training/image_2/000024_10.png training/image_3/000024_10.png training/disp_occ_0/000024_10.png
training/image_2/000025_10.png training/image_3/000025_10.png training/disp_occ_0/000025_10.png
training/image_2/000027_10.png training/image_3/000027_10.png training/disp_occ_0/000027_10.png
training/image_2/000028_10.png training/image_3/000028_10.png training/disp_occ_0/000028_10.png
training/image_2/000029_10.png training/image_3/000029_10.png training/disp_occ_0/000029_10.png
training/image_2/000030_10.png training/image_3/000030_10.png training/disp_occ_0/000030_10.png
training/image_2/000031_10.png training/image_3/000031_10.png training/disp_occ_0/000031_10.png
training/image_2/000032_10.png training/image_3/000032_10.png training/disp_occ_0/000032_10.png
training/image_2/000033_10.png training/image_3/000033_10.png training/disp_occ_0/000033_10.png
training/image_2/000034_10.png training/image_3/000034_10.png training/disp_occ_0/000034_10.png
training/image_2/000035_10.png training/image_3/000035_10.png training/disp_occ_0/000035_10.png
training/image_2/000036_10.png training/image_3/000036_10.png training/disp_occ_0/000036_10.png
training/image_2/000037_10.png training/image_3/000037_10.png training/disp_occ_0/000037_10.png
training/image_2/000039_10.png training/image_3/000039_10.png training/disp_occ_0/000039_10.png
training/image_2/000040_10.png training/image_3/000040_10.png training/disp_occ_0/000040_10.png
training/image_2/000041_10.png training/image_3/000041_10.png training/disp_occ_0/000041_10.png
training/image_2/000042_10.png training/image_3/000042_10.png training/disp_occ_0/000042_10.png
training/image_2/000044_10.png training/image_3/000044_10.png training/disp_occ_0/000044_10.png
training/image_2/000045_10.png training/image_3/000045_10.png training/disp_occ_0/000045_10.png
training/image_2/000046_10.png training/image_3/000046_10.png training/disp_occ_0/000046_10.png
training/image_2/000047_10.png training/image_3/000047_10.png training/disp_occ_0/000047_10.png
training/image_2/000048_10.png training/image_3/000048_10.png training/disp_occ_0/000048_10.png
training/image_2/000050_10.png training/image_3/000050_10.png training/disp_occ_0/000050_10.png
training/image_2/000051_10.png training/image_3/000051_10.png training/disp_occ_0/000051_10.png
training/image_2/000052_10.png training/image_3/000052_10.png training/disp_occ_0/000052_10.png
training/image_2/000053_10.png training/image_3/000053_10.png training/disp_occ_0/000053_10.png
training/image_2/000054_10.png training/image_3/000054_10.png training/disp_occ_0/000054_10.png
training/image_2/000055_10.png training/image_3/000055_10.png training/disp_occ_0/000055_10.png
training/image_2/000056_10.png training/image_3/000056_10.png training/disp_occ_0/000056_10.png
training/image_2/000057_10.png training/image_3/000057_10.png training/disp_occ_0/000057_10.png
training/image_2/000058_10.png training/image_3/000058_10.png training/disp_occ_0/000058_10.png
training/image_2/000059_10.png training/image_3/000059_10.png training/disp_occ_0/000059_10.png
training/image_2/000060_10.png training/image_3/000060_10.png training/disp_occ_0/000060_10.png
training/image_2/000061_10.png training/image_3/000061_10.png training/disp_occ_0/000061_10.png
training/image_2/000062_10.png training/image_3/000062_10.png training/disp_occ_0/000062_10.png
training/image_2/000063_10.png training/image_3/000063_10.png training/disp_occ_0/000063_10.png
training/image_2/000064_10.png training/image_3/000064_10.png training/disp_occ_0/000064_10.png
training/image_2/000065_10.png training/image_3/000065_10.png training/disp_occ_0/000065_10.png
training/image_2/000066_10.png training/image_3/000066_10.png training/disp_occ_0/000066_10.png
training/image_2/000068_10.png training/image_3/000068_10.png training/disp_occ_0/000068_10.png
training/image_2/000069_10.png training/image_3/000069_10.png training/disp_occ_0/000069_10.png
training/image_2/000070_10.png training/image_3/000070_10.png training/disp_occ_0/000070_10.png
training/image_2/000071_10.png training/image_3/000071_10.png training/disp_occ_0/000071_10.png
training/image_2/000072_10.png training/image_3/000072_10.png training/disp_occ_0/000072_10.png
training/image_2/000073_10.png training/image_3/000073_10.png training/disp_occ_0/000073_10.png
training/image_2/000074_10.png training/image_3/000074_10.png training/disp_occ_0/000074_10.png
training/image_2/000075_10.png training/image_3/000075_10.png training/disp_occ_0/000075_10.png
training/image_2/000076_10.png training/image_3/000076_10.png training/disp_occ_0/000076_10.png
training/image_2/000077_10.png training/image_3/000077_10.png training/disp_occ_0/000077_10.png
training/image_2/000078_10.png training/image_3/000078_10.png training/disp_occ_0/000078_10.png
training/image_2/000079_10.png training/image_3/000079_10.png training/disp_occ_0/000079_10.png
training/image_2/000080_10.png training/image_3/000080_10.png training/disp_occ_0/000080_10.png
training/image_2/000082_10.png training/image_3/000082_10.png training/disp_occ_0/000082_10.png
training/image_2/000083_10.png training/image_3/000083_10.png training/disp_occ_0/000083_10.png
training/image_2/000084_10.png training/image_3/000084_10.png training/disp_occ_0/000084_10.png
training/image_2/000085_10.png training/image_3/000085_10.png training/disp_occ_0/000085_10.png
training/image_2/000086_10.png training/image_3/000086_10.png training/disp_occ_0/000086_10.png
training/image_2/000087_10.png training/image_3/000087_10.png training/disp_occ_0/000087_10.png
training/image_2/000088_10.png training/image_3/000088_10.png training/disp_occ_0/000088_10.png
training/image_2/000090_10.png training/image_3/000090_10.png training/disp_occ_0/000090_10.png
training/image_2/000091_10.png training/image_3/000091_10.png training/disp_occ_0/000091_10.png
training/image_2/000092_10.png training/image_3/000092_10.png training/disp_occ_0/000092_10.png
training/image_2/000093_10.png training/image_3/000093_10.png training/disp_occ_0/000093_10.png
training/image_2/000094_10.png training/image_3/000094_10.png training/disp_occ_0/000094_10.png
training/image_2/000095_10.png training/image_3/000095_10.png training/disp_occ_0/000095_10.png
training/image_2/000096_10.png training/image_3/000096_10.png training/disp_occ_0/000096_10.png
training/image_2/000097_10.png training/image_3/000097_10.png training/disp_occ_0/000097_10.png
training/image_2/000098_10.png training/image_3/000098_10.png training/disp_occ_0/000098_10.png
training/image_2/000099_10.png training/image_3/000099_10.png training/disp_occ_0/000099_10.png
training/image_2/000100_10.png training/image_3/000100_10.png training/disp_occ_0/000100_10.png
training/image_2/000101_10.png training/image_3/000101_10.png training/disp_occ_0/000101_10.png
training/image_2/000102_10.png training/image_3/000102_10.png training/disp_occ_0/000102_10.png
training/image_2/000103_10.png training/image_3/000103_10.png training/disp_occ_0/000103_10.png
training/image_2/000104_10.png training/image_3/000104_10.png training/disp_occ_0/000104_10.png
training/image_2/000105_10.png training/image_3/000105_10.png training/disp_occ_0/000105_10.png
training/image_2/000106_10.png training/image_3/000106_10.png training/disp_occ_0/000106_10.png
training/image_2/000107_10.png training/image_3/000107_10.png training/disp_occ_0/000107_10.png
training/image_2/000108_10.png training/image_3/000108_10.png training/disp_occ_0/000108_10.png
training/image_2/000110_10.png training/image_3/000110_10.png training/disp_occ_0/000110_10.png
training/image_2/000111_10.png training/image_3/000111_10.png training/disp_occ_0/000111_10.png
training/image_2/000112_10.png training/image_3/000112_10.png training/disp_occ_0/000112_10.png
training/image_2/000113_10.png training/image_3/000113_10.png training/disp_occ_0/000113_10.png
training/image_2/000114_10.png training/image_3/000114_10.png training/disp_occ_0/000114_10.png
training/image_2/000115_10.png training/image_3/000115_10.png training/disp_occ_0/000115_10.png
training/image_2/000116_10.png training/image_3/000116_10.png training/disp_occ_0/000116_10.png
training/image_2/000117_10.png training/image_3/000117_10.png training/disp_occ_0/000117_10.png
training/image_2/000118_10.png training/image_3/000118_10.png training/disp_occ_0/000118_10.png
training/image_2/000119_10.png training/image_3/000119_10.png training/disp_occ_0/000119_10.png
training/image_2/000120_10.png training/image_3/000120_10.png training/disp_occ_0/000120_10.png
training/image_2/000121_10.png training/image_3/000121_10.png training/disp_occ_0/000121_10.png
training/image_2/000123_10.png training/image_3/000123_10.png training/disp_occ_0/000123_10.png
training/image_2/000124_10.png training/image_3/000124_10.png training/disp_occ_0/000124_10.png
training/image_2/000125_10.png training/image_3/000125_10.png training/disp_occ_0/000125_10.png
training/image_2/000126_10.png training/image_3/000126_10.png training/disp_occ_0/000126_10.png
training/image_2/000127_10.png training/image_3/000127_10.png training/disp_occ_0/000127_10.png
training/image_2/000128_10.png training/image_3/000128_10.png training/disp_occ_0/000128_10.png
training/image_2/000130_10.png training/image_3/000130_10.png training/disp_occ_0/000130_10.png
training/image_2/000131_10.png training/image_3/000131_10.png training/disp_occ_0/000131_10.png
training/image_2/000133_10.png training/image_3/000133_10.png training/disp_occ_0/000133_10.png
training/image_2/000134_10.png training/image_3/000134_10.png training/disp_occ_0/000134_10.png
training/image_2/000135_10.png training/image_3/000135_10.png training/disp_occ_0/000135_10.png
training/image_2/000136_10.png training/image_3/000136_10.png training/disp_occ_0/000136_10.png
training/image_2/000137_10.png training/image_3/000137_10.png training/disp_occ_0/000137_10.png
training/image_2/000138_10.png training/image_3/000138_10.png training/disp_occ_0/000138_10.png
training/image_2/000139_10.png training/image_3/000139_10.png training/disp_occ_0/000139_10.png
training/image_2/000140_10.png training/image_3/000140_10.png training/disp_occ_0/000140_10.png
training/image_2/000142_10.png training/image_3/000142_10.png training/disp_occ_0/000142_10.png
training/image_2/000143_10.png training/image_3/000143_10.png training/disp_occ_0/000143_10.png
training/image_2/000144_10.png training/image_3/000144_10.png training/disp_occ_0/000144_10.png
training/image_2/000145_10.png training/image_3/000145_10.png training/disp_occ_0/000145_10.png
training/image_2/000146_10.png training/image_3/000146_10.png training/disp_occ_0/000146_10.png
training/image_2/000147_10.png training/image_3/000147_10.png training/disp_occ_0/000147_10.png
training/image_2/000148_10.png training/image_3/000148_10.png training/disp_occ_0/000148_10.png
training/image_2/000149_10.png training/image_3/000149_10.png training/disp_occ_0/000149_10.png
training/image_2/000150_10.png training/image_3/000150_10.png training/disp_occ_0/000150_10.png
training/image_2/000151_10.png training/image_3/000151_10.png training/disp_occ_0/000151_10.png
training/image_2/000153_10.png training/image_3/000153_10.png training/disp_occ_0/000153_10.png
training/image_2/000154_10.png training/image_3/000154_10.png training/disp_occ_0/000154_10.png
training/image_2/000155_10.png training/image_3/000155_10.png training/disp_occ_0/000155_10.png
training/image_2/000156_10.png training/image_3/000156_10.png training/disp_occ_0/000156_10.png
training/image_2/000157_10.png training/image_3/000157_10.png training/disp_occ_0/000157_10.png
training/image_2/000158_10.png training/image_3/000158_10.png training/disp_occ_0/000158_10.png
training/image_2/000160_10.png training/image_3/000160_10.png training/disp_occ_0/000160_10.png
training/image_2/000161_10.png training/image_3/000161_10.png training/disp_occ_0/000161_10.png
training/image_2/000162_10.png training/image_3/000162_10.png training/disp_occ_0/000162_10.png
training/image_2/000163_10.png training/image_3/000163_10.png training/disp_occ_0/000163_10.png
training/image_2/000164_10.png training/image_3/000164_10.png training/disp_occ_0/000164_10.png
training/image_2/000165_10.png training/image_3/000165_10.png training/disp_occ_0/000165_10.png
training/image_2/000166_10.png training/image_3/000166_10.png training/disp_occ_0/000166_10.png
training/image_2/000167_10.png training/image_3/000167_10.png training/disp_occ_0/000167_10.png
training/image_2/000168_10.png training/image_3/000168_10.png training/disp_occ_0/000168_10.png
training/image_2/000169_10.png training/image_3/000169_10.png training/disp_occ_0/000169_10.png
training/image_2/000170_10.png training/image_3/000170_10.png training/disp_occ_0/000170_10.png
training/image_2/000172_10.png training/image_3/000172_10.png training/disp_occ_0/000172_10.png
training/image_2/000173_10.png training/image_3/000173_10.png training/disp_occ_0/000173_10.png
training/image_2/000174_10.png training/image_3/000174_10.png training/disp_occ_0/000174_10.png
training/image_2/000175_10.png training/image_3/000175_10.png training/disp_occ_0/000175_10.png
training/image_2/000176_10.png training/image_3/000176_10.png training/disp_occ_0/000176_10.png
training/image_2/000177_10.png training/image_3/000177_10.png training/disp_occ_0/000177_10.png
training/image_2/000178_10.png training/image_3/000178_10.png training/disp_occ_0/000178_10.png
training/image_2/000180_10.png training/image_3/000180_10.png training/disp_occ_0/000180_10.png
training/image_2/000181_10.png training/image_3/000181_10.png training/disp_occ_0/000181_10.png
training/image_2/000182_10.png training/image_3/000182_10.png training/disp_occ_0/000182_10.png
training/image_2/000183_10.png training/image_3/000183_10.png training/disp_occ_0/000183_10.png
training/image_2/000185_10.png training/image_3/000185_10.png training/disp_occ_0/000185_10.png
training/image_2/000186_10.png training/image_3/000186_10.png training/disp_occ_0/000186_10.png
training/image_2/000188_10.png training/image_3/000188_10.png training/disp_occ_0/000188_10.png
training/image_2/000189_10.png training/image_3/000189_10.png training/disp_occ_0/000189_10.png
training/image_2/000190_10.png training/image_3/000190_10.png training/disp_occ_0/000190_10.png
training/image_2/000191_10.png training/image_3/000191_10.png training/disp_occ_0/000191_10.png
training/image_2/000192_10.png training/image_3/000192_10.png training/disp_occ_0/000192_10.png
training/image_2/000193_10.png training/image_3/000193_10.png training/disp_occ_0/000193_10.png
training/image_2/000194_10.png training/image_3/000194_10.png training/disp_occ_0/000194_10.png
training/image_2/000195_10.png training/image_3/000195_10.png training/disp_occ_0/000195_10.png
training/image_2/000196_10.png training/image_3/000196_10.png training/disp_occ_0/000196_10.png
training/image_2/000197_10.png training/image_3/000197_10.png training/disp_occ_0/000197_10.png
training/image_2/000198_10.png training/image_3/000198_10.png training/disp_occ_0/000198_10.png
training/image_2/000199_10.png training/image_3/000199_10.png training/disp_occ_0/000199_10.png
training/image_2/000001_10.png training/image_3/000001_10.png training/disp_occ_0/000001_10.png
training/image_2/000006_10.png training/image_3/000006_10.png training/disp_occ_0/000006_10.png
training/image_2/000026_10.png training/image_3/000026_10.png training/disp_occ_0/000026_10.png
training/image_2/000038_10.png training/image_3/000038_10.png training/disp_occ_0/000038_10.png
training/image_2/000043_10.png training/image_3/000043_10.png training/disp_occ_0/000043_10.png
training/image_2/000049_10.png training/image_3/000049_10.png training/disp_occ_0/000049_10.png
training/image_2/000067_10.png training/image_3/000067_10.png training/disp_occ_0/000067_10.png
training/image_2/000081_10.png training/image_3/000081_10.png training/disp_occ_0/000081_10.png
training/image_2/000089_10.png training/image_3/000089_10.png training/disp_occ_0/000089_10.png
training/image_2/000109_10.png training/image_3/000109_10.png training/disp_occ_0/000109_10.png
training/image_2/000122_10.png training/image_3/000122_10.png training/disp_occ_0/000122_10.png
training/image_2/000129_10.png training/image_3/000129_10.png training/disp_occ_0/000129_10.png
training/image_2/000132_10.png training/image_3/000132_10.png training/disp_occ_0/000132_10.png
training/image_2/000141_10.png training/image_3/000141_10.png training/disp_occ_0/000141_10.png
training/image_2/000152_10.png training/image_3/000152_10.png training/disp_occ_0/000152_10.png
training/image_2/000159_10.png training/image_3/000159_10.png training/disp_occ_0/000159_10.png
training/image_2/000171_10.png training/image_3/000171_10.png training/disp_occ_0/000171_10.png
training/image_2/000179_10.png training/image_3/000179_10.png training/disp_occ_0/000179_10.png
training/image_2/000184_10.png training/image_3/000184_10.png training/disp_occ_0/000184_10.png
training/image_2/000187_10.png training/image_3/000187_10.png training/disp_occ_0/000187_10.png
training/colored_0/000000_10.png training/colored_1/000000_10.png training/disp_occ/000000_10.png
training/colored_0/000001_10.png training/colored_1/000001_10.png training/disp_occ/000001_10.png
training/colored_0/000002_10.png training/colored_1/000002_10.png training/disp_occ/000002_10.png
training/colored_0/000003_10.png training/colored_1/000003_10.png training/disp_occ/000003_10.png
training/colored_0/000004_10.png training/colored_1/000004_10.png training/disp_occ/000004_10.png
training/colored_0/000005_10.png training/colored_1/000005_10.png training/disp_occ/000005_10.png
training/colored_0/000006_10.png training/colored_1/000006_10.png training/disp_occ/000006_10.png
training/colored_0/000007_10.png training/colored_1/000007_10.png training/disp_occ/000007_10.png
training/colored_0/000008_10.png training/colored_1/000008_10.png training/disp_occ/000008_10.png
training/colored_0/000009_10.png training/colored_1/000009_10.png training/disp_occ/000009_10.png
training/colored_0/000010_10.png training/colored_1/000010_10.png training/disp_occ/000010_10.png
training/colored_0/000011_10.png training/colored_1/000011_10.png training/disp_occ/000011_10.png
training/colored_0/000012_10.png training/colored_1/000012_10.png training/disp_occ/000012_10.png
training/colored_0/000013_10.png training/colored_1/000013_10.png training/disp_occ/000013_10.png
training/colored_0/000014_10.png training/colored_1/000014_10.png training/disp_occ/000014_10.png
training/colored_0/000015_10.png training/colored_1/000015_10.png training/disp_occ/000015_10.png
training/colored_0/000016_10.png training/colored_1/000016_10.png training/disp_occ/000016_10.png
training/colored_0/000017_10.png training/colored_1/000017_10.png training/disp_occ/000017_10.png
training/colored_0/000018_10.png training/colored_1/000018_10.png training/disp_occ/000018_10.png
training/colored_0/000019_10.png training/colored_1/000019_10.png training/disp_occ/000019_10.png
training/colored_0/000020_10.png training/colored_1/000020_10.png training/disp_occ/000020_10.png
training/colored_0/000021_10.png training/colored_1/000021_10.png training/disp_occ/000021_10.png
training/colored_0/000022_10.png training/colored_1/000022_10.png training/disp_occ/000022_10.png
training/colored_0/000023_10.png training/colored_1/000023_10.png training/disp_occ/000023_10.png
training/colored_0/000024_10.png training/colored_1/000024_10.png training/disp_occ/000024_10.png
training/colored_0/000025_10.png training/colored_1/000025_10.png training/disp_occ/000025_10.png
training/colored_0/000026_10.png training/colored_1/000026_10.png training/disp_occ/000026_10.png
training/colored_0/000027_10.png training/colored_1/000027_10.png training/disp_occ/000027_10.png
training/colored_0/000028_10.png training/colored_1/000028_10.png training/disp_occ/000028_10.png
training/colored_0/000029_10.png training/colored_1/000029_10.png training/disp_occ/000029_10.png
training/colored_0/000030_10.png training/colored_1/000030_10.png training/disp_occ/000030_10.png
training/colored_0/000031_10.png training/colored_1/000031_10.png training/disp_occ/000031_10.png
training/colored_0/000032_10.png training/colored_1/000032_10.png training/disp_occ/000032_10.png
training/colored_0/000033_10.png training/colored_1/000033_10.png training/disp_occ/000033_10.png
training/colored_0/000034_10.png training/colored_1/000034_10.png training/disp_occ/000034_10.png
training/colored_0/000035_10.png training/colored_1/000035_10.png training/disp_occ/000035_10.png
training/colored_0/000036_10.png training/colored_1/000036_10.png training/disp_occ/000036_10.png
training/colored_0/000037_10.png training/colored_1/000037_10.png training/disp_occ/000037_10.png
training/colored_0/000038_10.png training/colored_1/000038_10.png training/disp_occ/000038_10.png
training/colored_0/000039_10.png training/colored_1/000039_10.png training/disp_occ/000039_10.png
training/colored_0/000040_10.png training/colored_1/000040_10.png training/disp_occ/000040_10.png
training/colored_0/000041_10.png training/colored_1/000041_10.png training/disp_occ/000041_10.png
training/colored_0/000042_10.png training/colored_1/000042_10.png training/disp_occ/000042_10.png
training/colored_0/000043_10.png training/colored_1/000043_10.png training/disp_occ/000043_10.png
training/colored_0/000044_10.png training/colored_1/000044_10.png training/disp_occ/000044_10.png
training/colored_0/000045_10.png training/colored_1/000045_10.png training/disp_occ/000045_10.png
training/colored_0/000046_10.png training/colored_1/000046_10.png training/disp_occ/000046_10.png
training/colored_0/000047_10.png training/colored_1/000047_10.png training/disp_occ/000047_10.png
training/colored_0/000048_10.png training/colored_1/000048_10.png training/disp_occ/000048_10.png
training/colored_0/000049_10.png training/colored_1/000049_10.png training/disp_occ/000049_10.png
training/colored_0/000050_10.png training/colored_1/000050_10.png training/disp_occ/000050_10.png
training/colored_0/000051_10.png training/colored_1/000051_10.png training/disp_occ/000051_10.png
training/colored_0/000052_10.png training/colored_1/000052_10.png training/disp_occ/000052_10.png
training/colored_0/000053_10.png training/colored_1/000053_10.png training/disp_occ/000053_10.png
training/colored_0/000054_10.png training/colored_1/000054_10.png training/disp_occ/000054_10.png
training/colored_0/000055_10.png training/colored_1/000055_10.png training/disp_occ/000055_10.png
training/colored_0/000056_10.png training/colored_1/000056_10.png training/disp_occ/000056_10.png
training/colored_0/000057_10.png training/colored_1/000057_10.png training/disp_occ/000057_10.png
training/colored_0/000058_10.png training/colored_1/000058_10.png training/disp_occ/000058_10.png
training/colored_0/000059_10.png training/colored_1/000059_10.png training/disp_occ/000059_10.png
training/colored_0/000060_10.png training/colored_1/000060_10.png training/disp_occ/000060_10.png
training/colored_0/000061_10.png training/colored_1/000061_10.png training/disp_occ/000061_10.png
training/colored_0/000062_10.png training/colored_1/000062_10.png training/disp_occ/000062_10.png
training/colored_0/000063_10.png training/colored_1/000063_10.png training/disp_occ/000063_10.png
training/colored_0/000064_10.png training/colored_1/000064_10.png training/disp_occ/000064_10.png
training/colored_0/000065_10.png training/colored_1/000065_10.png training/disp_occ/000065_10.png
training/colored_0/000066_10.png training/colored_1/000066_10.png training/disp_occ/000066_10.png
training/colored_0/000067_10.png training/colored_1/000067_10.png training/disp_occ/000067_10.png
training/colored_0/000068_10.png training/colored_1/000068_10.png training/disp_occ/000068_10.png
training/colored_0/000069_10.png training/colored_1/000069_10.png training/disp_occ/000069_10.png
training/colored_0/000070_10.png training/colored_1/000070_10.png training/disp_occ/000070_10.png
training/colored_0/000071_10.png training/colored_1/000071_10.png training/disp_occ/000071_10.png
training/colored_0/000072_10.png training/colored_1/000072_10.png training/disp_occ/000072_10.png
training/colored_0/000073_10.png training/colored_1/000073_10.png training/disp_occ/000073_10.png
training/colored_0/000074_10.png training/colored_1/000074_10.png training/disp_occ/000074_10.png
training/colored_0/000075_10.png training/colored_1/000075_10.png training/disp_occ/000075_10.png
training/colored_0/000076_10.png training/colored_1/000076_10.png training/disp_occ/000076_10.png
training/colored_0/000077_10.png training/colored_1/000077_10.png training/disp_occ/000077_10.png
training/colored_0/000078_10.png training/colored_1/000078_10.png training/disp_occ/000078_10.png
training/colored_0/000079_10.png training/colored_1/000079_10.png training/disp_occ/000079_10.png
training/colored_0/000080_10.png training/colored_1/000080_10.png training/disp_occ/000080_10.png
training/colored_0/000081_10.png training/colored_1/000081_10.png training/disp_occ/000081_10.png
training/colored_0/000082_10.png training/colored_1/000082_10.png training/disp_occ/000082_10.png
training/colored_0/000083_10.png training/colored_1/000083_10.png training/disp_occ/000083_10.png
training/colored_0/000084_10.png training/colored_1/000084_10.png training/disp_occ/000084_10.png
training/colored_0/000085_10.png training/colored_1/000085_10.png training/disp_occ/000085_10.png
training/colored_0/000086_10.png training/colored_1/000086_10.png training/disp_occ/000086_10.png
training/colored_0/000087_10.png training/colored_1/000087_10.png training/disp_occ/000087_10.png
training/colored_0/000088_10.png training/colored_1/000088_10.png training/disp_occ/000088_10.png
training/colored_0/000089_10.png training/colored_1/000089_10.png training/disp_occ/000089_10.png
training/colored_0/000090_10.png training/colored_1/000090_10.png training/disp_occ/000090_10.png
training/colored_0/000091_10.png training/colored_1/000091_10.png training/disp_occ/000091_10.png
training/colored_0/000092_10.png training/colored_1/000092_10.png training/disp_occ/000092_10.png
training/colored_0/000093_10.png training/colored_1/000093_10.png training/disp_occ/000093_10.png
training/colored_0/000094_10.png training/colored_1/000094_10.png training/disp_occ/000094_10.png
training/colored_0/000095_10.png training/colored_1/000095_10.png training/disp_occ/000095_10.png
training/colored_0/000096_10.png training/colored_1/000096_10.png training/disp_occ/000096_10.png
training/colored_0/000097_10.png training/colored_1/000097_10.png training/disp_occ/000097_10.png
training/colored_0/000098_10.png training/colored_1/000098_10.png training/disp_occ/000098_10.png
training/colored_0/000099_10.png training/colored_1/000099_10.png training/disp_occ/000099_10.png
training/colored_0/000100_10.png training/colored_1/000100_10.png training/disp_occ/000100_10.png
training/colored_0/000101_10.png training/colored_1/000101_10.png training/disp_occ/000101_10.png
training/colored_0/000102_10.png training/colored_1/000102_10.png training/disp_occ/000102_10.png
training/colored_0/000103_10.png training/colored_1/000103_10.png training/disp_occ/000103_10.png
training/colored_0/000104_10.png training/colored_1/000104_10.png training/disp_occ/000104_10.png
training/colored_0/000105_10.png training/colored_1/000105_10.png training/disp_occ/000105_10.png
training/colored_0/000106_10.png training/colored_1/000106_10.png training/disp_occ/000106_10.png
training/colored_0/000107_10.png training/colored_1/000107_10.png training/disp_occ/000107_10.png
training/colored_0/000108_10.png training/colored_1/000108_10.png training/disp_occ/000108_10.png
training/colored_0/000109_10.png training/colored_1/000109_10.png training/disp_occ/000109_10.png
training/colored_0/000110_10.png training/colored_1/000110_10.png training/disp_occ/000110_10.png
training/colored_0/000111_10.png training/colored_1/000111_10.png training/disp_occ/000111_10.png
training/colored_0/000112_10.png training/colored_1/000112_10.png training/disp_occ/000112_10.png
training/colored_0/000113_10.png training/colored_1/000113_10.png training/disp_occ/000113_10.png
training/colored_0/000114_10.png training/colored_1/000114_10.png training/disp_occ/000114_10.png
training/colored_0/000115_10.png training/colored_1/000115_10.png training/disp_occ/000115_10.png
training/colored_0/000116_10.png training/colored_1/000116_10.png training/disp_occ/000116_10.png
training/colored_0/000117_10.png training/colored_1/000117_10.png training/disp_occ/000117_10.png
training/colored_0/000118_10.png training/colored_1/000118_10.png training/disp_occ/000118_10.png
training/colored_0/000119_10.png training/colored_1/000119_10.png training/disp_occ/000119_10.png
training/colored_0/000120_10.png training/colored_1/000120_10.png training/disp_occ/000120_10.png
training/colored_0/000121_10.png training/colored_1/000121_10.png training/disp_occ/000121_10.png
training/colored_0/000122_10.png training/colored_1/000122_10.png training/disp_occ/000122_10.png
training/colored_0/000123_10.png training/colored_1/000123_10.png training/disp_occ/000123_10.png
training/colored_0/000124_10.png training/colored_1/000124_10.png training/disp_occ/000124_10.png
training/colored_0/000125_10.png training/colored_1/000125_10.png training/disp_occ/000125_10.png
training/colored_0/000126_10.png training/colored_1/000126_10.png training/disp_occ/000126_10.png
training/colored_0/000127_10.png training/colored_1/000127_10.png training/disp_occ/000127_10.png
training/colored_0/000128_10.png training/colored_1/000128_10.png training/disp_occ/000128_10.png
training/colored_0/000129_10.png training/colored_1/000129_10.png training/disp_occ/000129_10.png
training/colored_0/000130_10.png training/colored_1/000130_10.png training/disp_occ/000130_10.png
training/colored_0/000131_10.png training/colored_1/000131_10.png training/disp_occ/000131_10.png
training/colored_0/000132_10.png training/colored_1/000132_10.png training/disp_occ/000132_10.png
training/colored_0/000133_10.png training/colored_1/000133_10.png training/disp_occ/000133_10.png
training/colored_0/000134_10.png training/colored_1/000134_10.png training/disp_occ/000134_10.png
training/colored_0/000135_10.png training/colored_1/000135_10.png training/disp_occ/000135_10.png
training/colored_0/000136_10.png training/colored_1/000136_10.png training/disp_occ/000136_10.png
training/colored_0/000137_10.png training/colored_1/000137_10.png training/disp_occ/000137_10.png
training/colored_0/000138_10.png training/colored_1/000138_10.png training/disp_occ/000138_10.png
training/colored_0/000139_10.png training/colored_1/000139_10.png training/disp_occ/000139_10.png
training/colored_0/000140_10.png training/colored_1/000140_10.png training/disp_occ/000140_10.png
training/colored_0/000141_10.png training/colored_1/000141_10.png training/disp_occ/000141_10.png
training/colored_0/000142_10.png training/colored_1/000142_10.png training/disp_occ/000142_10.png
training/colored_0/000143_10.png training/colored_1/000143_10.png training/disp_occ/000143_10.png
training/colored_0/000144_10.png training/colored_1/000144_10.png training/disp_occ/000144_10.png
training/colored_0/000145_10.png training/colored_1/000145_10.png training/disp_occ/000145_10.png
training/colored_0/000146_10.png training/colored_1/000146_10.png training/disp_occ/000146_10.png
training/colored_0/000147_10.png training/colored_1/000147_10.png training/disp_occ/000147_10.png
training/colored_0/000148_10.png training/colored_1/000148_10.png training/disp_occ/000148_10.png
training/colored_0/000149_10.png training/colored_1/000149_10.png training/disp_occ/000149_10.png
training/colored_0/000150_10.png training/colored_1/000150_10.png training/disp_occ/000150_10.png
training/colored_0/000151_10.png training/colored_1/000151_10.png training/disp_occ/000151_10.png
training/colored_0/000152_10.png training/colored_1/000152_10.png training/disp_occ/000152_10.png
training/colored_0/000153_10.png training/colored_1/000153_10.png training/disp_occ/000153_10.png
training/colored_0/000154_10.png training/colored_1/000154_10.png training/disp_occ/000154_10.png
training/colored_0/000155_10.png training/colored_1/000155_10.png training/disp_occ/000155_10.png
training/colored_0/000156_10.png training/colored_1/000156_10.png training/disp_occ/000156_10.png
training/colored_0/000157_10.png training/colored_1/000157_10.png training/disp_occ/000157_10.png
training/colored_0/000158_10.png training/colored_1/000158_10.png training/disp_occ/000158_10.png
training/colored_0/000159_10.png training/colored_1/000159_10.png training/disp_occ/000159_10.png
training/colored_0/000160_10.png training/colored_1/000160_10.png training/disp_occ/000160_10.png
training/colored_0/000161_10.png training/colored_1/000161_10.png training/disp_occ/000161_10.png
training/colored_0/000162_10.png training/colored_1/000162_10.png training/disp_occ/000162_10.png
training/colored_0/000163_10.png training/colored_1/000163_10.png training/disp_occ/000163_10.png
training/colored_0/000164_10.png training/colored_1/000164_10.png training/disp_occ/000164_10.png
training/colored_0/000165_10.png training/colored_1/000165_10.png training/disp_occ/000165_10.png
training/colored_0/000166_10.png training/colored_1/000166_10.png training/disp_occ/000166_10.png
training/colored_0/000167_10.png training/colored_1/000167_10.png training/disp_occ/000167_10.png
training/colored_0/000168_10.png training/colored_1/000168_10.png training/disp_occ/000168_10.png
training/colored_0/000169_10.png training/colored_1/000169_10.png training/disp_occ/000169_10.png
training/colored_0/000170_10.png training/colored_1/000170_10.png training/disp_occ/000170_10.png
training/colored_0/000171_10.png training/colored_1/000171_10.png training/disp_occ/000171_10.png
training/colored_0/000172_10.png training/colored_1/000172_10.png training/disp_occ/000172_10.png
training/colored_0/000173_10.png training/colored_1/000173_10.png training/disp_occ/000173_10.png
training/colored_0/000174_10.png training/colored_1/000174_10.png training/disp_occ/000174_10.png
training/colored_0/000175_10.png training/colored_1/000175_10.png training/disp_occ/000175_10.png
training/colored_0/000176_10.png training/colored_1/000176_10.png training/disp_occ/000176_10.png
training/colored_0/000177_10.png training/colored_1/000177_10.png training/disp_occ/000177_10.png
training/colored_0/000178_10.png training/colored_1/000178_10.png training/disp_occ/000178_10.png
training/colored_0/000179_10.png training/colored_1/000179_10.png training/disp_occ/000179_10.png
training/colored_0/000180_10.png training/colored_1/000180_10.png training/disp_occ/000180_10.png
training/colored_0/000181_10.png training/colored_1/000181_10.png training/disp_occ/000181_10.png
training/colored_0/000182_10.png training/colored_1/000182_10.png training/disp_occ/000182_10.png
training/colored_0/000183_10.png training/colored_1/000183_10.png training/disp_occ/000183_10.png
training/colored_0/000184_10.png training/colored_1/000184_10.png training/disp_occ/000184_10.png
training/colored_0/000185_10.png training/colored_1/000185_10.png training/disp_occ/000185_10.png
training/colored_0/000186_10.png training/colored_1/000186_10.png training/disp_occ/000186_10.png
training/colored_0/000187_10.png training/colored_1/000187_10.png training/disp_occ/000187_10.png
training/colored_0/000188_10.png training/colored_1/000188_10.png training/disp_occ/000188_10.png
training/colored_0/000189_10.png training/colored_1/000189_10.png training/disp_occ/000189_10.png
training/colored_0/000190_10.png training/colored_1/000190_10.png training/disp_occ/000190_10.png
training/colored_0/000191_10.png training/colored_1/000191_10.png training/disp_occ/000191_10.png
training/colored_0/000192_10.png training/colored_1/000192_10.png training/disp_occ/000192_10.png
training/colored_0/000193_10.png training/colored_1/000193_10.png training/disp_occ/000193_10.png
================================================
FILE: filenames/kitti12_15_train.txt
================================================
training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png
training/image_2/000002_10.png training/image_3/000002_10.png training/disp_occ_0/000002_10.png
training/image_2/000003_10.png training/image_3/000003_10.png training/disp_occ_0/000003_10.png
training/image_2/000004_10.png training/image_3/000004_10.png training/disp_occ_0/000004_10.png
training/image_2/000005_10.png training/image_3/000005_10.png training/disp_occ_0/000005_10.png
training/image_2/000007_10.png training/image_3/000007_10.png training/disp_occ_0/000007_10.png
training/image_2/000008_10.png training/image_3/000008_10.png training/disp_occ_0/000008_10.png
training/image_2/000009_10.png training/image_3/000009_10.png training/disp_occ_0/000009_10.png
training/image_2/000010_10.png training/image_3/000010_10.png training/disp_occ_0/000010_10.png
training/image_2/000011_10.png training/image_3/000011_10.png training/disp_occ_0/000011_10.png
training/image_2/000012_10.png training/image_3/000012_10.png training/disp_occ_0/000012_10.png
training/image_2/000013_10.png training/image_3/000013_10.png training/disp_occ_0/000013_10.png
training/image_2/000014_10.png training/image_3/000014_10.png training/disp_occ_0/000014_10.png
training/image_2/000015_10.png training/image_3/000015_10.png training/disp_occ_0/000015_10.png
training/image_2/000016_10.png training/image_3/000016_10.png training/disp_occ_0/000016_10.png
training/image_2/000017_10.png training/image_3/000017_10.png training/disp_occ_0/000017_10.png
training/image_2/000018_10.png training/image_3/000018_10.png training/disp_occ_0/000018_10.png
training/image_2/000019_10.png training/image_3/000019_10.png training/disp_occ_0/000019_10.png
training/image_2/000020_10.png training/image_3/000020_10.png training/disp_occ_0/000020_10.png
training/image_2/000021_10.png training/image_3/000021_10.png training/disp_occ_0/000021_10.png
training/image_2/000022_10.png training/image_3/000022_10.png training/disp_occ_0/000022_10.png
training/image_2/000023_10.png training/image_3/000023_10.png training/disp_occ_0/000023_10.png
training/image_2/000024_10.png training/image_3/000024_10.png training/disp_occ_0/000024_10.png
training/image_2/000025_10.png training/image_3/000025_10.png training/disp_occ_0/000025_10.png
training/image_2/000027_10.png training/image_3/000027_10.png training/disp_occ_0/000027_10.png
training/image_2/000028_10.png training/image_3/000028_10.png training/disp_occ_0/000028_10.png
training/image_2/000029_10.png training/image_3/000029_10.png training/disp_occ_0/000029_10.png
training/image_2/000030_10.png training/image_3/000030_10.png training/disp_occ_0/000030_10.png
training/image_2/000031_10.png training/image_3/000031_10.png training/disp_occ_0/000031_10.png
training/image_2/000032_10.png training/image_3/000032_10.png training/disp_occ_0/000032_10.png
training/image_2/000033_10.png training/image_3/000033_10.png training/disp_occ_0/000033_10.png
training/image_2/000034_10.png training/image_3/000034_10.png training/disp_occ_0/000034_10.png
training/image_2/000035_10.png training/image_3/000035_10.png training/disp_occ_0/000035_10.png
training/image_2/000036_10.png training/image_3/000036_10.png training/disp_occ_0/000036_10.png
training/image_2/000037_10.png training/image_3/000037_10.png training/disp_occ_0/000037_10.png
training/image_2/000039_10.png training/image_3/000039_10.png training/disp_occ_0/000039_10.png
training/image_2/000040_10.png training/image_3/000040_10.png training/disp_occ_0/000040_10.png
training/image_2/000041_10.png training/image_3/000041_10.png training/disp_occ_0/000041_10.png
training/image_2/000042_10.png training/image_3/000042_10.png training/disp_occ_0/000042_10.png
training/image_2/000044_10.png training/image_3/000044_10.png training/disp_occ_0/000044_10.png
training/image_2/000045_10.png training/image_3/000045_10.png training/disp_occ_0/000045_10.png
training/image_2/000046_10.png training/image_3/000046_10.png training/disp_occ_0/000046_10.png
training/image_2/000047_10.png training/image_3/000047_10.png training/disp_occ_0/000047_10.png
training/image_2/000048_10.png training/image_3/000048_10.png training/disp_occ_0/000048_10.png
training/image_2/000050_10.png training/image_3/000050_10.png training/disp_occ_0/000050_10.png
training/image_2/000051_10.png training/image_3/000051_10.png training/disp_occ_0/000051_10.png
training/image_2/000052_10.png training/image_3/000052_10.png training/disp_occ_0/000052_10.png
training/image_2/000053_10.png training/image_3/000053_10.png training/disp_occ_0/000053_10.png
training/image_2/000054_10.png training/image_3/000054_10.png training/disp_occ_0/000054_10.png
training/image_2/000055_10.png training/image_3/000055_10.png training/disp_occ_0/000055_10.png
training/image_2/000056_10.png training/image_3/000056_10.png training/disp_occ_0/000056_10.png
training/image_2/000057_10.png training/image_3/000057_10.png training/disp_occ_0/000057_10.png
training/image_2/000058_10.png training/image_3/000058_10.png training/disp_occ_0/000058_10.png
training/image_2/000059_10.png training/image_3/000059_10.png training/disp_occ_0/000059_10.png
training/image_2/000060_10.png training/image_3/000060_10.png training/disp_occ_0/000060_10.png
training/image_2/000061_10.png training/image_3/000061_10.png training/disp_occ_0/000061_10.png
training/image_2/000062_10.png training/image_3/000062_10.png training/disp_occ_0/000062_10.png
training/image_2/000063_10.png training/image_3/000063_10.png training/disp_occ_0/000063_10.png
training/image_2/000064_10.png training/image_3/000064_10.png training/disp_occ_0/000064_10.png
training/image_2/000065_10.png training/image_3/000065_10.png training/disp_occ_0/000065_10.png
training/image_2/000066_10.png training/image_3/000066_10.png training/disp_occ_0/000066_10.png
training/image_2/000068_10.png training/image_3/000068_10.png training/disp_occ_0/000068_10.png
training/image_2/000069_10.png training/image_3/000069_10.png training/disp_occ_0/000069_10.png
training/image_2/000070_10.png training/image_3/000070_10.png training/disp_occ_0/000070_10.png
training/image_2/000071_10.png training/image_3/000071_10.png training/disp_occ_0/000071_10.png
training/image_2/000072_10.png training/image_3/000072_10.png training/disp_occ_0/000072_10.png
training/image_2/000073_10.png training/image_3/000073_10.png training/disp_occ_0/000073_10.png
training/image_2/000074_10.png training/image_3/000074_10.png training/disp_occ_0/000074_10.png
training/image_2/000075_10.png training/image_3/000075_10.png training/disp_occ_0/000075_10.png
training/image_2/000076_10.png training/image_3/000076_10.png training/disp_occ_0/000076_10.png
training/image_2/000077_10.png training/image_3/000077_10.png training/disp_occ_0/000077_10.png
training/image_2/000078_10.png training/image_3/000078_10.png training/disp_occ_0/000078_10.png
training/image_2/000079_10.png training/image_3/000079_10.png training/disp_occ_0/000079_10.png
training/image_2/000080_10.png training/image_3/000080_10.png training/disp_occ_0/000080_10.png
training/image_2/000082_10.png training/image_3/000082_10.png training/disp_occ_0/000082_10.png
training/image_2/000083_10.png training/image_3/000083_10.png training/disp_occ_0/000083_10.png
training/image_2/000084_10.png training/image_3/000084_10.png training/disp_occ_0/000084_10.png
training/image_2/000085_10.png training/image_3/000085_10.png training/disp_occ_0/000085_10.png
training/image_2/000086_10.png training/image_3/000086_10.png training/disp_occ_0/000086_10.png
training/image_2/000087_10.png training/image_3/000087_10.png training/disp_occ_0/000087_10.png
training/image_2/000088_10.png training/image_3/000088_10.png training/disp_occ_0/000088_10.png
training/image_2/000090_10.png training/image_3/000090_10.png training/disp_occ_0/000090_10.png
training/image_2/000091_10.png training/image_3/000091_10.png training/disp_occ_0/000091_10.png
training/image_2/000092_10.png training/image_3/000092_10.png training/disp_occ_0/000092_10.png
training/image_2/000093_10.png training/image_3/000093_10.png training/disp_occ_0/000093_10.png
training/image_2/000094_10.png training/image_3/000094_10.png training/disp_occ_0/000094_10.png
training/image_2/000095_10.png training/image_3/000095_10.png training/disp_occ_0/000095_10.png
training/image_2/000096_10.png training/image_3/000096_10.png training/disp_occ_0/000096_10.png
training/image_2/000097_10.png training/image_3/000097_10.png training/disp_occ_0/000097_10.png
training/image_2/000098_10.png training/image_3/000098_10.png training/disp_occ_0/000098_10.png
training/image_2/000099_10.png training/image_3/000099_10.png training/disp_occ_0/000099_10.png
training/image_2/000100_10.png training/image_3/000100_10.png training/disp_occ_0/000100_10.png
training/image_2/000101_10.png training/image_3/000101_10.png training/disp_occ_0/000101_10.png
training/image_2/000102_10.png training/image_3/000102_10.png training/disp_occ_0/000102_10.png
training/image_2/000103_10.png training/image_3/000103_10.png training/disp_occ_0/000103_10.png
training/image_2/000104_10.png training/image_3/000104_10.png training/disp_occ_0/000104_10.png
training/image_2/000105_10.png training/image_3/000105_10.png training/disp_occ_0/000105_10.png
training/image_2/000106_10.png training/image_3/000106_10.png training/disp_occ_0/000106_10.png
training/image_2/000107_10.png training/image_3/000107_10.png training/disp_occ_0/000107_10.png
training/image_2/000108_10.png training/image_3/000108_10.png training/disp_occ_0/000108_10.png
training/image_2/000110_10.png training/image_3/000110_10.png training/disp_occ_0/000110_10.png
training/image_2/000111_10.png training/image_3/000111_10.png training/disp_occ_0/000111_10.png
training/image_2/000112_10.png training/image_3/000112_10.png training/disp_occ_0/000112_10.png
training/image_2/000113_10.png training/image_3/000113_10.png training/disp_occ_0/000113_10.png
training/image_2/000114_10.png training/image_3/000114_10.png training/disp_occ_0/000114_10.png
training/image_2/000115_10.png training/image_3/000115_10.png training/disp_occ_0/000115_10.png
training/image_2/000116_10.png training/image_3/000116_10.png training/disp_occ_0/000116_10.png
training/image_2/000117_10.png training/image_3/000117_10.png training/disp_occ_0/000117_10.png
training/image_2/000118_10.png training/image_3/000118_10.png training/disp_occ_0/000118_10.png
training/image_2/000119_10.png training/image_3/000119_10.png training/disp_occ_0/000119_10.png
training/image_2/000120_10.png training/image_3/000120_10.png training/disp_occ_0/000120_10.png
training/image_2/000121_10.png training/image_3/000121_10.png training/disp_occ_0/000121_10.png
training/image_2/000123_10.png training/image_3/000123_10.png training/disp_occ_0/000123_10.png
training/image_2/000124_10.png training/image_3/000124_10.png training/disp_occ_0/000124_10.png
training/image_2/000125_10.png training/image_3/000125_10.png training/disp_occ_0/000125_10.png
training/image_2/000126_10.png training/image_3/000126_10.png training/disp_occ_0/000126_10.png
training/image_2/000127_10.png training/image_3/000127_10.png training/disp_occ_0/000127_10.png
training/image_2/000128_10.png training/image_3/000128_10.png training/disp_occ_0/000128_10.png
training/image_2/000130_10.png training/image_3/000130_10.png training/disp_occ_0/000130_10.png
training/image_2/000131_10.png training/image_3/000131_10.png training/disp_occ_0/000131_10.png
training/image_2/000133_10.png training/image_3/000133_10.png training/disp_occ_0/000133_10.png
training/image_2/000134_10.png training/image_3/000134_10.png training/disp_occ_0/000134_10.png
training/image_2/000135_10.png training/image_3/000135_10.png training/disp_occ_0/000135_10.png
training/image_2/000136_10.png training/image_3/000136_10.png training/disp_occ_0/000136_10.png
training/image_2/000137_10.png training/image_3/000137_10.png training/disp_occ_0/000137_10.png
training/image_2/000138_10.png training/image_3/000138_10.png training/disp_occ_0/000138_10.png
training/image_2/000139_10.png training/image_3/000139_10.png training/disp_occ_0/000139_10.png
training/image_2/000140_10.png training/image_3/000140_10.png training/disp_occ_0/000140_10.png
training/image_2/000142_10.png training/image_3/000142_10.png training/disp_occ_0/000142_10.png
training/image_2/000143_10.png training/image_3/000143_10.png training/disp_occ_0/000143_10.png
training/image_2/000144_10.png training/image_3/000144_10.png training/disp_occ_0/000144_10.png
training/image_2/000145_10.png training/image_3/000145_10.png training/disp_occ_0/000145_10.png
training/image_2/000146_10.png training/image_3/000146_10.png training/disp_occ_0/000146_10.png
training/image_2/000147_10.png training/image_3/000147_10.png training/disp_occ_0/000147_10.png
training/image_2/000148_10.png training/image_3/000148_10.png training/disp_occ_0/000148_10.png
training/image_2/000149_10.png training/image_3/000149_10.png training/disp_occ_0/000149_10.png
training/image_2/000150_10.png training/image_3/000150_10.png training/disp_occ_0/000150_10.png
training/image_2/000151_10.png training/image_3/000151_10.png training/disp_occ_0/000151_10.png
training/image_2/000153_10.png training/image_3/000153_10.png training/disp_occ_0/000153_10.png
training/image_2/000154_10.png training/image_3/000154_10.png training/disp_occ_0/000154_10.png
training/image_2/000155_10.png training/image_3/000155_10.png training/disp_occ_0/000155_10.png
training/image_2/000156_10.png training/image_3/000156_10.png training/disp_occ_0/000156_10.png
training/image_2/000157_10.png training/image_3/000157_10.png training/disp_occ_0/000157_10.png
training/image_2/000158_10.png training/image_3/000158_10.png training/disp_occ_0/000158_10.png
training/image_2/000160_10.png training/image_3/000160_10.png training/disp_occ_0/000160_10.png
training/image_2/000161_10.png training/image_3/000161_10.png training/disp_occ_0/000161_10.png
training/image_2/000162_10.png training/image_3/000162_10.png training/disp_occ_0/000162_10.png
training/image_2/000163_10.png training/image_3/000163_10.png training/disp_occ_0/000163_10.png
training/image_2/000164_10.png training/image_3/000164_10.png training/disp_occ_0/000164_10.png
training/image_2/000165_10.png training/image_3/000165_10.png training/disp_occ_0/000165_10.png
training/image_2/000166_10.png training/image_3/000166_10.png training/disp_occ_0/000166_10.png
training/image_2/000167_10.png training/image_3/000167_10.png training/disp_occ_0/000167_10.png
training/image_2/000168_10.png training/image_3/000168_10.png training/disp_occ_0/000168_10.png
training/image_2/000169_10.png training/image_3/000169_10.png training/disp_occ_0/000169_10.png
training/image_2/000170_10.png training/image_3/000170_10.png training/disp_occ_0/000170_10.png
training/image_2/000172_10.png training/image_3/000172_10.png training/disp_occ_0/000172_10.png
training/image_2/000173_10.png training/image_3/000173_10.png training/disp_occ_0/000173_10.png
training/image_2/000174_10.png training/image_3/000174_10.png training/disp_occ_0/000174_10.png
training/image_2/000175_10.png training/image_3/000175_10.png training/disp_occ_0/000175_10.png
training/image_2/000176_10.png training/image_3/000176_10.png training/disp_occ_0/000176_10.png
training/image_2/000177_10.png training/image_3/000177_10.png training/disp_occ_0/000177_10.png
training/image_2/000178_10.png training/image_3/000178_10.png training/disp_occ_0/000178_10.png
training/image_2/000180_10.png training/image_3/000180_10.png training/disp_occ_0/000180_10.png
training/image_2/000181_10.png training/image_3/000181_10.png training/disp_occ_0/000181_10.png
training/image_2/000182_10.png training/image_3/000182_10.png training/disp_occ_0/000182_10.png
training/image_2/000183_10.png training/image_3/000183_10.png training/disp_occ_0/000183_10.png
training/image_2/000185_10.png training/image_3/000185_10.png training/disp_occ_0/000185_10.png
training/image_2/000186_10.png training/image_3/000186_10.png training/disp_occ_0/000186_10.png
training/image_2/000188_10.png training/image_3/000188_10.png training/disp_occ_0/000188_10.png
training/image_2/000189_10.png training/image_3/000189_10.png training/disp_occ_0/000189_10.png
training/image_2/000190_10.png training/image_3/000190_10.png training/disp_occ_0/000190_10.png
training/image_2/000191_10.png training/image_3/000191_10.png training/disp_occ_0/000191_10.png
training/image_2/000192_10.png training/image_3/000192_10.png training/disp_occ_0/000192_10.png
training/image_2/000193_10.png training/image_3/000193_10.png training/disp_occ_0/000193_10.png
training/image_2/000194_10.png training/image_3/000194_10.png training/disp_occ_0/000194_10.png
training/image_2/000195_10.png training/image_3/000195_10.png training/disp_occ_0/000195_10.png
training/image_2/000196_10.png training/image_3/000196_10.png training/disp_occ_0/000196_10.png
training/image_2/000197_10.png training/image_3/000197_10.png training/disp_occ_0/000197_10.png
training/image_2/000198_10.png training/image_3/000198_10.png training/disp_occ_0/000198_10.png
training/image_2/000199_10.png training/image_3/000199_10.png training/disp_occ_0/000199_10.png
training/colored_0/000000_10.png training/colored_1/000000_10.png training/disp_occ/000000_10.png
training/colored_0/000001_10.png training/colored_1/000001_10.png training/disp_occ/000001_10.png
training/colored_0/000002_10.png training/colored_1/000002_10.png training/disp_occ/000002_10.png
training/colored_0/000003_10.png training/colored_1/000003_10.png training/disp_occ/000003_10.png
training/colored_0/000004_10.png training/colored_1/000004_10.png training/disp_occ/000004_10.png
training/colored_0/000005_10.png training/colored_1/000005_10.png training/disp_occ/000005_10.png
training/colored_0/000006_10.png training/colored_1/000006_10.png training/disp_occ/000006_10.png
training/colored_0/000007_10.png training/colored_1/000007_10.png training/disp_occ/000007_10.png
training/colored_0/000008_10.png training/colored_1/000008_10.png training/disp_occ/000008_10.png
training/colored_0/000009_10.png training/colored_1/000009_10.png training/disp_occ/000009_10.png
training/colored_0/000010_10.png training/colored_1/000010_10.png training/disp_occ/000010_10.png
training/colored_0/000011_10.png training/colored_1/000011_10.png training/disp_occ/000011_10.png
training/colored_0/000012_10.png training/colored_1/000012_10.png training/disp_occ/000012_10.png
training/colored_0/000013_10.png training/colored_1/000013_10.png training/disp_occ/000013_10.png
training/colored_0/000014_10.png training/colored_1/000014_10.png training/disp_occ/000014_10.png
training/colored_0/000015_10.png training/colored_1/000015_10.png training/disp_occ/000015_10.png
training/colored_0/000016_10.png training/colored_1/000016_10.png training/disp_occ/000016_10.png
training/colored_0/000017_10.png training/colored_1/000017_10.png training/disp_occ/000017_10.png
training/colored_0/000018_10.png training/colored_1/000018_10.png training/disp_occ/000018_10.png
training/colored_0/000019_10.png training/colored_1/000019_10.png training/disp_occ/000019_10.png
training/colored_0/000020_10.png training/colored_1/000020_10.png training/disp_occ/000020_10.png
training/colored_0/000021_10.png training/colored_1/000021_10.png training/disp_occ/000021_10.png
training/colored_0/000022_10.png training/colored_1/000022_10.png training/disp_occ/000022_10.png
training/colored_0/000023_10.png training/colored_1/000023_10.png training/disp_occ/000023_10.png
training/colored_0/000024_10.png training/colored_1/000024_10.png training/disp_occ/000024_10.png
training/colored_0/000025_10.png training/colored_1/000025_10.png training/disp_occ/000025_10.png
training/colored_0/000026_10.png training/colored_1/000026_10.png training/disp_occ/000026_10.png
training/colored_0/000027_10.png training/colored_1/000027_10.png training/disp_occ/000027_10.png
training/colored_0/000028_10.png training/colored_1/000028_10.png training/disp_occ/000028_10.png
training/colored_0/000029_10.png training/colored_1/000029_10.png training/disp_occ/000029_10.png
training/colored_0/000030_10.png training/colored_1/000030_10.png training/disp_occ/000030_10.png
training/colored_0/000031_10.png training/colored_1/000031_10.png training/disp_occ/000031_10.png
training/colored_0/000032_10.png training/colored_1/000032_10.png training/disp_occ/000032_10.png
training/colored_0/000033_10.png training/colored_1/000033_10.png training/disp_occ/000033_10.png
training/colored_0/000034_10.png training/colored_1/000034_10.png training/disp_occ/000034_10.png
training/colored_0/000035_10.png training/colored_1/000035_10.png training/disp_occ/000035_10.png
training/colored_0/000036_10.png training/colored_1/000036_10.png training/disp_occ/000036_10.png
training/colored_0/000037_10.png training/colored_1/000037_10.png training/disp_occ/000037_10.png
training/colored_0/000038_10.png training/colored_1/000038_10.png training/disp_occ/000038_10.png
training/colored_0/000039_10.png training/colored_1/000039_10.png training/disp_occ/000039_10.png
training/colored_0/000040_10.png training/colored_1/000040_10.png training/disp_occ/000040_10.png
training/colored_0/000041_10.png training/colored_1/000041_10.png training/disp_occ/000041_10.png
training/colored_0/000042_10.png training/colored_1/000042_10.png training/disp_occ/000042_10.png
training/colored_0/000043_10.png training/colored_1/000043_10.png training/disp_occ/000043_10.png
training/colored_0/000044_10.png training/colored_1/000044_10.png training/disp_occ/000044_10.png
training/colored_0/000045_10.png training/colored_1/000045_10.png training/disp_occ/000045_10.png
training/colored_0/000046_10.png training/colored_1/000046_10.png training/disp_occ/000046_10.png
training/colored_0/000047_10.png training/colored_1/000047_10.png training/disp_occ/000047_10.png
training/colored_0/000048_10.png training/colored_1/000048_10.png training/disp_occ/000048_10.png
training/colored_0/000049_10.png training/colored_1/000049_10.png training/disp_occ/000049_10.png
training/colored_0/000050_10.png training/colored_1/000050_10.png training/disp_occ/000050_10.png
training/colored_0/000051_10.png training/colored_1/000051_10.png training/disp_occ/000051_10.png
training/colored_0/000052_10.png training/colored_1/000052_10.png training/disp_occ/000052_10.png
training/colored_0/000053_10.png training/colored_1/000053_10.png training/disp_occ/000053_10.png
training/colored_0/000054_10.png training/colored_1/000054_10.png training/disp_occ/000054_10.png
training/colored_0/000055_10.png training/colored_1/000055_10.png training/disp_occ/000055_10.png
training/colored_0/000056_10.png training/colored_1/000056_10.png training/disp_occ/000056_10.png
training/colored_0/000057_10.png training/colored_1/000057_10.png training/disp_occ/000057_10.png
training/colored_0/000058_10.png training/colored_1/000058_10.png training/disp_occ/000058_10.png
training/colored_0/000059_10.png training/colored_1/000059_10.png training/disp_occ/000059_10.png
training/colored_0/000060_10.png training/colored_1/000060_10.png training/disp_occ/000060_10.png
training/colored_0/000061_10.png training/colored_1/000061_10.png training/disp_occ/000061_10.png
training/colored_0/000062_10.png training/colored_1/000062_10.png training/disp_occ/000062_10.png
training/colored_0/000063_10.png training/colored_1/000063_10.png training/disp_occ/000063_10.png
training/colored_0/000064_10.png training/colored_1/000064_10.png training/disp_occ/000064_10.png
training/colored_0/000065_10.png training/colored_1/000065_10.png training/disp_occ/000065_10.png
training/colored_0/000066_10.png training/colored_1/000066_10.png training/disp_occ/000066_10.png
training/colored_0/000067_10.png training/colored_1/000067_10.png training/disp_occ/000067_10.png
training/colored_0/000068_10.png training/colored_1/000068_10.png training/disp_occ/000068_10.png
training/colored_0/000069_10.png training/colored_1/000069_10.png training/disp_occ/000069_10.png
training/colored_0/000070_10.png training/colored_1/000070_10.png training/disp_occ/000070_10.png
training/colored_0/000071_10.png training/colored_1/000071_10.png training/disp_occ/000071_10.png
training/colored_0/000072_10.png training/colored_1/000072_10.png training/disp_occ/000072_10.png
training/colored_0/000073_10.png training/colored_1/000073_10.png training/disp_occ/000073_10.png
training/colored_0/000074_10.png training/colored_1/000074_10.png training/disp_occ/000074_10.png
training/colored_0/000075_10.png training/colored_1/000075_10.png training/disp_occ/000075_10.png
training/colored_0/000076_10.png training/colored_1/000076_10.png training/disp_occ/000076_10.png
training/colored_0/000077_10.png training/colored_1/000077_10.png training/disp_occ/000077_10.png
training/colored_0/000078_10.png training/colored_1/000078_10.png training/disp_occ/000078_10.png
training/colored_0/000079_10.png training/colored_1/000079_10.png training/disp_occ/000079_10.png
training/colored_0/000080_10.png training/colored_1/000080_10.png training/disp_occ/000080_10.png
training/colored_0/000081_10.png training/colored_1/000081_10.png training/disp_occ/000081_10.png
training/colored_0/000082_10.png training/colored_1/000082_10.png training/disp_occ/000082_10.png
training/colored_0/000083_10.png training/colored_1/000083_10.png training/disp_occ/000083_10.png
training/colored_0/000084_10.png training/colored_1/000084_10.png training/disp_occ/000084_10.png
training/colored_0/000085_10.png training/colored_1/000085_10.png training/disp_occ/000085_10.png
training/colored_0/000086_10.png training/colored_1/000086_10.png training/disp_occ/000086_10.png
training/colored_0/000087_10.png training/colored_1/000087_10.png training/disp_occ/000087_10.png
training/colored_0/000088_10.png training/colored_1/000088_10.png training/disp_occ/000088_10.png
training/colored_0/000089_10.png training/colored_1/000089_10.png training/disp_occ/000089_10.png
training/colored_0/000090_10.png training/colored_1/000090_10.png training/disp_occ/000090_10.png
training/colored_0/000091_10.png training/colored_1/000091_10.png training/disp_occ/000091_10.png
training/colored_0/000092_10.png training/colored_1/000092_10.png training/disp_occ/000092_10.png
training/colored_0/000093_10.png training/colored_1/000093_10.png training/disp_occ/000093_10.png
training/colored_0/000094_10.png training/colored_1/000094_10.png training/disp_occ/000094_10.png
training/colored_0/000095_10.png training/colored_1/000095_10.png training/disp_occ/000095_10.png
training/colored_0/000096_10.png training/colored_1/000096_10.png training/disp_occ/000096_10.png
training/colored_0/000097_10.png training/colored_1/000097_10.png training/disp_occ/000097_10.png
training/colored_0/000098_10.png training/colored_1/000098_10.png training/disp_occ/000098_10.png
training/colored_0/000099_10.png training/colored_1/000099_10.png training/disp_occ/000099_10.png
training/colored_0/000100_10.png training/colored_1/000100_10.png training/disp_occ/000100_10.png
training/colored_0/000101_10.png training/colored_1/000101_10.png training/disp_occ/000101_10.png
training/colored_0/000102_10.png training/colored_1/000102_10.png training/disp_occ/000102_10.png
training/colored_0/000103_10.png training/colored_1/000103_10.png training/disp_occ/000103_10.png
training/colored_0/000104_10.png training/colored_1/000104_10.png training/disp_occ/000104_10.png
training/colored_0/000105_10.png training/colored_1/000105_10.png training/disp_occ/000105_10.png
training/colored_0/000106_10.png training/colored_1/000106_10.png training/disp_occ/000106_10.png
training/colored_0/000107_10.png training/colored_1/000107_10.png training/disp_occ/000107_10.png
training/colored_0/000108_10.png training/colored_1/000108_10.png training/disp_occ/000108_10.png
training/colored_0/000109_10.png training/colored_1/000109_10.png training/disp_occ/000109_10.png
training/colored_0/000110_10.png training/colored_1/000110_10.png training/disp_occ/000110_10.png
training/colored_0/000111_10.png training/colored_1/000111_10.png training/disp_occ/000111_10.png
training/colored_0/000112_10.png training/colored_1/000112_10.png training/disp_occ/000112_10.png
training/colored_0/000113_10.png training/colored_1/000113_10.png training/disp_occ/000113_10.png
training/colored_0/000114_10.png training/colored_1/000114_10.png training/disp_occ/000114_10.png
training/colored_0/000115_10.png training/colored_1/000115_10.png training/disp_occ/000115_10.png
training/colored_0/000116_10.png training/colored_1/000116_10.png training/disp_occ/000116_10.png
training/colored_0/000117_10.png training/colored_1/000117_10.png training/disp_occ/000117_10.png
training/colored_0/000118_10.png training/colored_1/000118_10.png training/disp_occ/000118_10.png
training/colored_0/000119_10.png training/colored_1/000119_10.png training/disp_occ/000119_10.png
training/colored_0/000120_10.png training/colored_1/000120_10.png training/disp_occ/000120_10.png
training/colored_0/000121_10.png training/colored_1/000121_10.png training/disp_occ/000121_10.png
training/colored_0/000122_10.png training/colored_1/000122_10.png training/disp_occ/000122_10.png
training/colored_0/000123_10.png training/colored_1/000123_10.png training/disp_occ/000123_10.png
training/colored_0/000124_10.png training/colored_1/000124_10.png training/disp_occ/000124_10.png
training/colored_0/000125_10.png training/colored_1/000125_10.png training/disp_occ/000125_10.png
training/colored_0/000126_10.png training/colored_1/000126_10.png training/disp_occ/000126_10.png
training/colored_0/000127_10.png training/colored_1/000127_10.png training/disp_occ/000127_10.png
training/colored_0/000128_10.png training/colored_1/000128_10.png training/disp_occ/000128_10.png
training/colored_0/000129_10.png training/colored_1/000129_10.png training/disp_occ/000129_10.png
training/colored_0/000130_10.png training/colored_1/000130_10.png training/disp_occ/000130_10.png
training/colored_0/000131_10.png training/colored_1/000131_10.png training/disp_occ/000131_10.png
training/colored_0/000132_10.png training/colored_1/000132_10.png training/disp_occ/000132_10.png
training/colored_0/000133_10.png training/colored_1/000133_10.png training/disp_occ/000133_10.png
training/colored_0/000134_10.png training/colored_1/000134_10.png training/disp_occ/000134_10.png
training/colored_0/000135_10.png training/colored_1/000135_10.png training/disp_occ/000135_10.png
training/colored_0/000136_10.png training/colored_1/000136_10.png training/disp_occ/000136_10.png
training/colored_0/000137_10.png training/colored_1/000137_10.png training/disp_occ/000137_10.png
training/colored_0/000138_10.png training/colored_1/000138_10.png training/disp_occ/000138_10.png
training/colored_0/000139_10.png training/colored_1/000139_10.png training/disp_occ/000139_10.png
training/colored_0/000140_10.png training/colored_1/000140_10.png training/disp_occ/000140_10.png
training/colored_0/000141_10.png training/colored_1/000141_10.png training/disp_occ/000141_10.png
training/colored_0/000142_10.png training/colored_1/000142_10.png training/disp_occ/000142_10.png
training/colored_0/000143_10.png training/colored_1/000143_10.png training/disp_occ/000143_10.png
training/colored_0/000144_10.png training/colored_1/000144_10.png training/disp_occ/000144_10.png
training/colored_0/000145_10.png training/colored_1/000145_10.png training/disp_occ/000145_10.png
training/colored_0/000146_10.png training/colored_1/000146_10.png training/disp_occ/000146_10.png
training/colored_0/000147_10.png training/colored_1/000147_10.png training/disp_occ/000147_10.png
training/colored_0/000148_10.png training/colored_1/000148_10.png training/disp_occ/000148_10.png
training/colored_0/000149_10.png training/colored_1/000149_10.png training/disp_occ/000149_10.png
training/colored_0/000150_10.png training/colored_1/000150_10.png training/disp_occ/000150_10.png
training/colored_0/000151_10.png training/colored_1/000151_10.png training/disp_occ/000151_10.png
training/colored_0/000152_10.png training/colored_1/000152_10.png training/disp_occ/000152_10.png
training/colored_0/000153_10.png training/colored_1/000153_10.png training/disp_occ/000153_10.png
training/colored_0/000154_10.png training/colored_1/000154_10.png training/disp_occ/000154_10.png
training/colored_0/000155_10.png training/colored_1/000155_10.png training/disp_occ/000155_10.png
training/colored_0/000156_10.png training/colored_1/000156_10.png training/disp_occ/000156_10.png
training/colored_0/000157_10.png training/colored_1/000157_10.png training/disp_occ/000157_10.png
training/colored_0/000158_10.png training/colored_1/000158_10.png training/disp_occ/000158_10.png
training/colored_0/000159_10.png training/colored_1/000159_10.png training/disp_occ/000159_10.png
training/colored_0/000160_10.png training/colored_1/000160_10.png training/disp_occ/000160_10.png
training/colored_0/000161_10.png training/colored_1/000161_10.png training/disp_occ/000161_10.png
training/colored_0/000162_10.png training/colored_1/000162_10.png training/disp_occ/000162_10.png
training/colored_0/000163_10.png training/colored_1/000163_10.png training/disp_occ/000163_10.png
training/colored_0/000164_10.png training/colored_1/000164_10.png training/disp_occ/000164_10.png
training/colored_0/000165_10.png training/colored_1/000165_10.png training/disp_occ/000165_10.png
training/colored_0/000166_10.png training/colored_1/000166_10.png training/disp_occ/000166_10.png
training/colored_0/000167_10.png training/colored_1/000167_10.png training/disp_occ/000167_10.png
training/colored_0/000168_10.png training/colored_1/000168_10.png training/disp_occ/000168_10.png
training/colored_0/000169_10.png training/colored_1/000169_10.png training/disp_occ/000169_10.png
training/colored_0/000170_10.png training/colored_1/000170_10.png training/disp_occ/000170_10.png
training/colored_0/000171_10.png training/colored_1/000171_10.png training/disp_occ/000171_10.png
training/colored_0/000172_10.png training/colored_1/000172_10.png training/disp_occ/000172_10.png
training/colored_0/000173_10.png training/colored_1/000173_10.png training/disp_occ/000173_10.png
training/colored_0/000174_10.png training/colored_1/000174_10.png training/disp_occ/000174_10.png
training/colored_0/000175_10.png training/colored_1/000175_10.png training/disp_occ/000175_10.png
training/colored_0/000176_10.png training/colored_1/000176_10.png training/disp_occ/000176_10.png
training/colored_0/000177_10.png training/colored_1/000177_10.png training/disp_occ/000177_10.png
training/colored_0/000178_10.png training/colored_1/000178_10.png training/disp_occ/000178_10.png
training/colored_0/000179_10.png training/colored_1/000179_10.png training/disp_occ/000179_10.png
================================================
FILE: filenames/kitti12_all.txt
================================================
training/colored_0/000000_10.png training/colored_1/000000_10.png training/disp_occ/000000_10.png
training/colored_0/000001_10.png training/colored_1/000001_10.png training/disp_occ/000001_10.png
training/colored_0/000002_10.png training/colored_1/000002_10.png training/disp_occ/000002_10.png
training/colored_0/000003_10.png training/colored_1/000003_10.png training/disp_occ/000003_10.png
training/colored_0/000004_10.png training/colored_1/000004_10.png training/disp_occ/000004_10.png
training/colored_0/000005_10.png training/colored_1/000005_10.png training/disp_occ/000005_10.png
training/colored_0/000006_10.png training/colored_1/000006_10.png training/disp_occ/000006_10.png
training/colored_0/000007_10.png training/colored_1/000007_10.png training/disp_occ/000007_10.png
training/colored_0/000008_10.png training/colored_1/000008_10.png training/disp_occ/000008_10.png
training/colored_0/000009_10.png training/colored_1/000009_10.png training/disp_occ/000009_10.png
training/colored_0/000010_10.png training/colored_1/000010_10.png training/disp_occ/000010_10.png
training/colored_0/000011_10.png training/colored_1/000011_10.png training/disp_occ/000011_10.png
training/colored_0/000012_10.png training/colored_1/000012_10.png training/disp_occ/000012_10.png
training/colored_0/000013_10.png training/colored_1/000013_10.png training/disp_occ/000013_10.png
training/colored_0/000014_10.png training/colored_1/000014_10.png training/disp_occ/000014_10.png
training/colored_0/000015_10.png training/colored_1/000015_10.png training/disp_occ/000015_10.png
training/colored_0/000016_10.png training/colored_1/000016_10.png training/disp_occ/000016_10.png
training/colored_0/000017_10.png training/colored_1/000017_10.png training/disp_occ/000017_10.png
training/colored_0/000018_10.png training/colored_1/000018_10.png training/disp_occ/000018_10.png
training/colored_0/000019_10.png training/colored_1/000019_10.png training/disp_occ/000019_10.png
training/colored_0/000020_10.png training/colored_1/000020_10.png training/disp_occ/000020_10.png
training/colored_0/000021_10.png training/colored_1/000021_10.png training/disp_occ/000021_10.png
training/colored_0/000022_10.png training/colored_1/000022_10.png training/disp_occ/000022_10.png
training/colored_0/000023_10.png training/colored_1/000023_10.png training/disp_occ/000023_10.png
training/colored_0/000024_10.png training/colored_1/000024_10.png training/disp_occ/000024_10.png
training/colored_0/000025_10.png training/colored_1/000025_10.png training/disp_occ/000025_10.png
training/colored_0/000026_10.png training/colored_1/000026_10.png training/disp_occ/000026_10.png
training/colored_0/000027_10.png training/colored_1/000027_10.png training/disp_occ/000027_10.png
training/colored_0/000028_10.png training/colored_1/000028_10.png training/disp_occ/000028_10.png
training/colored_0/000029_10.png training/colored_1/000029_10.png training/disp_occ/000029_10.png
training/colored_0/000030_10.png training/colored_1/000030_10.png training/disp_occ/000030_10.png
training/colored_0/000031_10.png training/colored_1/000031_10.png training/disp_occ/000031_10.png
training/colored_0/000032_10.png training/colored_1/000032_10.png training/disp_occ/000032_10.png
training/colored_0/000033_10.png training/colored_1/000033_10.png training/disp_occ/000033_10.png
training/colored_0/000034_10.png training/colored_1/000034_10.png training/disp_occ/000034_10.png
training/colored_0/000035_10.png training/colored_1/000035_10.png training/disp_occ/000035_10.png
training/colored_0/000036_10.png training/colored_1/000036_10.png training/disp_occ/000036_10.png
training/colored_0/000037_10.png training/colored_1/000037_10.png training/disp_occ/000037_10.png
training/colored_0/000038_10.png training/colored_1/000038_10.png training/disp_occ/000038_10.png
training/colored_0/000039_10.png training/colored_1/000039_10.png training/disp_occ/000039_10.png
training/colored_0/000040_10.png training/colored_1/000040_10.png training/disp_occ/000040_10.png
training/colored_0/000041_10.png training/colored_1/000041_10.png training/disp_occ/000041_10.png
training/colored_0/000042_10.png training/colored_1/000042_10.png training/disp_occ/000042_10.png
training/colored_0/000043_10.png training/colored_1/000043_10.png training/disp_occ/000043_10.png
training/colored_0/000044_10.png training/colored_1/000044_10.png training/disp_occ/000044_10.png
training/colored_0/000045_10.png training/colored_1/000045_10.png training/disp_occ/000045_10.png
training/colored_0/000046_10.png training/colored_1/000046_10.png training/disp_occ/000046_10.png
training/colored_0/000047_10.png training/colored_1/000047_10.png training/disp_occ/000047_10.png
training/colored_0/000048_10.png training/colored_1/000048_10.png training/disp_occ/000048_10.png
training/colored_0/000049_10.png training/colored_1/000049_10.png training/disp_occ/000049_10.png
training/colored_0/000050_10.png training/colored_1/000050_10.png training/disp_occ/000050_10.png
training/colored_0/000051_10.png training/colored_1/000051_10.png training/disp_occ/000051_10.png
training/colored_0/000052_10.png training/colored_1/000052_10.png training/disp_occ/000052_10.png
training/colored_0/000053_10.png training/colored_1/000053_10.png training/disp_occ/000053_10.png
training/colored_0/000054_10.png training/colored_1/000054_10.png training/disp_occ/000054_10.png
training/colored_0/000055_10.png training/colored_1/000055_10.png training/disp_occ/000055_10.png
training/colored_0/000056_10.png training/colored_1/000056_10.png training/disp_occ/000056_10.png
training/colored_0/000057_10.png training/colored_1/000057_10.png training/disp_occ/000057_10.png
training/colored_0/000058_10.png training/colored_1/000058_10.png training/disp_occ/000058_10.png
training/colored_0/000059_10.png training/colored_1/000059_10.png training/disp_occ/000059_10.png
training/colored_0/000060_10.png training/colored_1/000060_10.png training/disp_occ/000060_10.png
training/colored_0/000061_10.png training/colored_1/000061_10.png training/disp_occ/000061_10.png
training/colored_0/000062_10.png training/colored_1/000062_10.png training/disp_occ/000062_10.png
training/colored_0/000063_10.png training/colored_1/000063_10.png training/disp_occ/000063_10.png
training/colored_0/000064_10.png training/colored_1/000064_10.png training/disp_occ/000064_10.png
training/colored_0/000065_10.png training/colored_1/000065_10.png training/disp_occ/000065_10.png
training/colored_0/000066_10.png training/colored_1/000066_10.png training/disp_occ/000066_10.png
training/colored_0/000067_10.png training/colored_1/000067_10.png training/disp_occ/000067_10.png
training/colored_0/000068_10.png training/colored_1/000068_10.png training/disp_occ/000068_10.png
training/colored_0/000069_10.png training/colored_1/000069_10.png training/disp_occ/000069_10.png
training/colored_0/000070_10.png training/colored_1/000070_10.png training/disp_occ/000070_10.png
training/colored_0/000071_10.png training/colored_1/000071_10.png training/disp_occ/000071_10.png
training/colored_0/000072_10.png training/colored_1/000072_10.png training/disp_occ/000072_10.png
training/colored_0/000073_10.png training/colored_1/000073_10.png training/disp_occ/000073_10.png
training/colored_0/000074_10.png training/colored_1/000074_10.png training/disp_occ/000074_10.png
training/colored_0/000075_10.png training/colored_1/000075_10.png training/disp_occ/000075_10.png
training/colored_0/000076_10.png training/colored_1/000076_10.png training/disp_occ/000076_10.png
training/colored_0/000077_10.png training/colored_1/000077_10.png training/disp_occ/000077_10.png
training/colored_0/000078_10.png training/colored_1/000078_10.png training/disp_occ/000078_10.png
training/colored_0/000079_10.png training/colored_1/000079_10.png training/disp_occ/000079_10.png
training/colored_0/000080_10.png training/colored_1/000080_10.png training/disp_occ/000080_10.png
training/colored_0/000081_10.png training/colored_1/000081_10.png training/disp_occ/000081_10.png
training/colored_0/000082_10.png training/colored_1/000082_10.png training/disp_occ/000082_10.png
training/colored_0/000083_10.png training/colored_1/000083_10.png training/disp_occ/000083_10.png
training/colored_0/000084_10.png training/colored_1/000084_10.png training/disp_occ/000084_10.png
training/colored_0/000085_10.png training/colored_1/000085_10.png training/disp_occ/000085_10.png
training/colored_0/000086_10.png training/colored_1/000086_10.png training/disp_occ/000086_10.png
training/colored_0/000087_10.png training/colored_1/000087_10.png training/disp_occ/000087_10.png
training/colored_0/000088_10.png training/colored_1/000088_10.png training/disp_occ/000088_10.png
training/colored_0/000089_10.png training/colored_1/000089_10.png training/disp_occ/000089_10.png
training/colored_0/000090_10.png training/colored_1/000090_10.png training/disp_occ/000090_10.png
training/colored_0/000091_10.png training/colored_1/000091_10.png training/disp_occ/000091_10.png
training/colored_0/000092_10.png training/colored_1/000092_10.png training/disp_occ/000092_10.png
training/colored_0/000093_10.png training/colored_1/000093_10.png training/disp_occ/000093_10.png
training/colored_0/000094_10.png training/colored_1/000094_10.png training/disp_occ/000094_10.png
training/colored_0/000095_10.png training/colored_1/000095_10.png training/disp_occ/000095_10.png
training/colored_0/000096_10.png training/colored_1/000096_10.png training/disp_occ/000096_10.png
training/colored_0/000097_10.png training/colored_1/000097_10.png training/disp_occ/000097_10.png
training/colored_0/000098_10.png training/colored_1/000098_10.png training/disp_occ/000098_10.png
training/colored_0/000099_10.png training/colored_1/000099_10.png training/disp_occ/000099_10.png
training/colored_0/000100_10.png training/colored_1/000100_10.png training/disp_occ/000100_10.png
training/colored_0/000101_10.png training/colored_1/000101_10.png training/disp_occ/000101_10.png
training/colored_0/000102_10.png training/colored_1/000102_10.png training/disp_occ/000102_10.png
training/colored_0/000103_10.png training/colored_1/000103_10.png training/disp_occ/000103_10.png
training/colored_0/000104_10.png training/colored_1/000104_10.png training/disp_occ/000104_10.png
training/colored_0/000105_10.png training/colored_1/000105_10.png training/disp_occ/000105_10.png
training/colored_0/000106_10.png training/colored_1/000106_10.png training/disp_occ/000106_10.png
training/colored_0/000107_10.png training/colored_1/000107_10.png training/disp_occ/000107_10.png
training/colored_0/000108_10.png training/colored_1/000108_10.png training/disp_occ/000108_10.png
training/colored_0/000109_10.png training/colored_1/000109_10.png training/disp_occ/000109_10.png
training/colored_0/000110_10.png training/colored_1/000110_10.png training/disp_occ/000110_10.png
training/colored_0/000111_10.png training/colored_1/000111_10.png training/disp_occ/000111_10.png
training/colored_0/000112_10.png training/colored_1/000112_10.png training/disp_occ/000112_10.png
training/colored_0/000113_10.png training/colored_1/000113_10.png training/disp_occ/000113_10.png
training/colored_0/000114_10.png training/colored_1/000114_10.png training/disp_occ/000114_10.png
training/colored_0/000115_10.png training/colored_1/000115_10.png training/disp_occ/000115_10.png
training/colored_0/000116_10.png training/colored_1/000116_10.png training/disp_occ/000116_10.png
training/colored_0/000117_10.png training/colored_1/000117_10.png training/disp_occ/000117_10.png
training/colored_0/000118_10.png training/colored_1/000118_10.png training/disp_occ/000118_10.png
training/colored_0/000119_10.png training/colored_1/000119_10.png training/disp_occ/000119_10.png
training/colored_0/000120_10.png training/colored_1/000120_10.png training/disp_occ/000120_10.png
training/colored_0/000121_10.png training/colored_1/000121_10.png training/disp_occ/000121_10.png
training/colored_0/000122_10.png training/colored_1/000122_10.png training/disp_occ/000122_10.png
training/colored_0/000123_10.png training/colored_1/000123_10.png training/disp_occ/000123_10.png
training/colored_0/000124_10.png training/colored_1/000124_10.png training/disp_occ/000124_10.png
training/colored_0/000125_10.png training/colored_1/000125_10.png training/disp_occ/000125_10.png
training/colored_0/000126_10.png training/colored_1/000126_10.png training/disp_occ/000126_10.png
training/colored_0/000127_10.png training/colored_1/000127_10.png training/disp_occ/000127_10.png
training/colored_0/000128_10.png training/colored_1/000128_10.png training/disp_occ/000128_10.png
training/colored_0/000129_10.png training/colored_1/000129_10.png training/disp_occ/000129_10.png
training/colored_0/000130_10.png training/colored_1/000130_10.png training/disp_occ/000130_10.png
training/colored_0/000131_10.png training/colored_1/000131_10.png training/disp_occ/000131_10.png
training/colored_0/000132_10.png training/colored_1/000132_10.png training/disp_occ/000132_10.png
training/colored_0/000133_10.png training/colored_1/000133_10.png training/disp_occ/000133_10.png
training/colored_0/000134_10.png training/colored_1/000134_10.png training/disp_occ/000134_10.png
training/colored_0/000135_10.png training/colored_1/000135_10.png training/disp_occ/000135_10.png
training/colored_0/000136_10.png training/colored_1/000136_10.png training/disp_occ/000136_10.png
training/colored_0/000137_10.png training/colored_1/000137_10.png training/disp_occ/000137_10.png
training/colored_0/000138_10.png training/colored_1/000138_10.png training/disp_occ/000138_10.png
training/colored_0/000139_10.png training/colored_1/000139_10.png training/disp_occ/000139_10.png
training/colored_0/000140_10.png training/colored_1/000140_10.png training/disp_occ/000140_10.png
training/colored_0/000141_10.png training/colored_1/000141_10.png training/disp_occ/000141_10.png
training/colored_0/000142_10.png training/colored_1/000142_10.png training/disp_occ/000142_10.png
training/colored_0/000143_10.png training/colored_1/000143_10.png training/disp_occ/000143_10.png
training/colored_0/000144_10.png training/colored_1/000144_10.png training/disp_occ/000144_10.png
training/colored_0/000145_10.png training/colored_1/000145_10.png training/disp_occ/000145_10.png
training/colored_0/000146_10.png training/colored_1/000146_10.png training/disp_occ/000146_10.png
training/colored_0/000147_10.png training/colored_1/000147_10.png training/disp_occ/000147_10.png
training/colored_0/000148_10.png training/colored_1/000148_10.png training/disp_occ/000148_10.png
training/colored_0/000149_10.png training/colored_1/000149_10.png training/disp_occ/000149_10.png
training/colored_0/000150_10.png training/colored_1/000150_10.png training/disp_occ/000150_10.png
training/colored_0/000151_10.png training/colored_1/000151_10.png training/disp_occ/000151_10.png
training/colored_0/000152_10.png training/colored_1/000152_10.png training/disp_occ/000152_10.png
training/colored_0/000153_10.png training/colored_1/000153_10.png training/disp_occ/000153_10.png
training/colored_0/000154_10.png training/colored_1/000154_10.png training/disp_occ/000154_10.png
training/colored_0/000155_10.png training/colored_1/000155_10.png training/disp_occ/000155_10.png
training/colored_0/000156_10.png training/colored_1/000156_10.png training/disp_occ/000156_10.png
training/colored_0/000157_10.png training/colored_1/000157_10.png training/disp_occ/000157_10.png
training/colored_0/000158_10.png training/colored_1/000158_10.png training/disp_occ/000158_10.png
training/colored_0/000159_10.png training/colored_1/000159_10.png training/disp_occ/000159_10.png
training/colored_0/000160_10.png training/colored_1/000160_10.png training/disp_occ/000160_10.png
training/colored_0/000161_10.png training/colored_1/000161_10.png training/disp_occ/000161_10.png
training/colored_0/000162_10.png training/colored_1/000162_10.png training/disp_occ/000162_10.png
training/colored_0/000163_10.png training/colored_1/000163_10.png training/disp_occ/000163_10.png
training/colored_0/000164_10.png training/colored_1/000164_10.png training/disp_occ/000164_10.png
training/colored_0/000165_10.png training/colored_1/000165_10.png training/disp_occ/000165_10.png
training/colored_0/000166_10.png training/colored_1/000166_10.png training/disp_occ/000166_10.png
training/colored_0/000167_10.png training/colored_1/000167_10.png training/disp_occ/000167_10.png
training/colored_0/000168_10.png training/colored_1/000168_10.png training/disp_occ/000168_10.png
training/colored_0/000169_10.png training/colored_1/000169_10.png training/disp_occ/000169_10.png
training/colored_0/000170_10.png training/colored_1/000170_10.png training/disp_occ/000170_10.png
training/colored_0/000171_10.png training/colored_1/000171_10.png training/disp_occ/000171_10.png
training/colored_0/000172_10.png training/colored_1/000172_10.png training/disp_occ/000172_10.png
training/colored_0/000173_10.png training/colored_1/000173_10.png training/disp_occ/000173_10.png
training/colored_0/000174_10.png training/colored_1/000174_10.png training/disp_occ/000174_10.png
training/colored_0/000175_10.png training/colored_1/000175_10.png training/disp_occ/000175_10.png
training/colored_0/000176_10.png training/colored_1/000176_10.png training/disp_occ/000176_10.png
training/colored_0/000177_10.png training/colored_1/000177_10.png training/disp_occ/000177_10.png
training/colored_0/000178_10.png training/colored_1/000178_10.png training/disp_occ/000178_10.png
training/colored_0/000179_10.png training/colored_1/000179_10.png training/disp_occ/000179_10.png
training/colored_0/000180_10.png training/colored_1/000180_10.png training/disp_occ/000180_10.png
training/colored_0/000181_10.png training/colored_1/000181_10.png training/disp_occ/000181_10.png
training/colored_0/000182_10.png training/colored_1/000182_10.png training/disp_occ/000182_10.png
training/colored_0/000183_10.png training/colored_1/000183_10.png training/disp_occ/000183_10.png
training/colored_0/000184_10.png training/colored_1/000184_10.png training/disp_occ/000184_10.png
training/colored_0/000185_10.png training/colored_1/000185_10.png training/disp_occ/000185_10.png
training/colored_0/000186_10.png training/colored_1/000186_10.png training/disp_occ/000186_10.png
training/colored_0/000187_10.png training/colored_1/000187_10.png training/disp_occ/000187_10.png
training/colored_0/000188_10.png training/colored_1/000188_10.png training/disp_occ/000188_10.png
training/colored_0/000189_10.png training/colored_1/000189_10.png training/disp_occ/000189_10.png
training/colored_0/000190_10.png training/colored_1/000190_10.png training/disp_occ/000190_10.png
training/colored_0/000191_10.png training/colored_1/000191_10.png training/disp_occ/000191_10.png
training/colored_0/000192_10.png training/colored_1/000192_10.png training/disp_occ/000192_10.png
training/colored_0/000193_10.png training/colored_1/000193_10.png training/disp_occ/000193_10.png
================================================
FILE: filenames/kitti12_test.txt
================================================
testing/colored_0/000000_10.png testing/colored_1/000000_10.png
testing/colored_0/000001_10.png testing/colored_1/000001_10.png
testing/colored_0/000002_10.png testing/colored_1/000002_10.png
testing/colored_0/000003_10.png testing/colored_1/000003_10.png
testing/colored_0/000004_10.png testing/colored_1/000004_10.png
testing/colored_0/000005_10.png testing/colored_1/000005_10.png
testing/colored_0/000006_10.png testing/colored_1/000006_10.png
testing/colored_0/000007_10.png testing/colored_1/000007_10.png
testing/colored_0/000008_10.png testing/colored_1/000008_10.png
testing/colored_0/000009_10.png testing/colored_1/000009_10.png
testing/colored_0/000010_10.png testing/colored_1/000010_10.png
testing/colored_0/000011_10.png testing/colored_1/000011_10.png
testing/colored_0/000012_10.png testing/colored_1/000012_10.png
testing/colored_0/000013_10.png testing/colored_1/000013_10.png
testing/colored_0/000014_10.png testing/colored_1/000014_10.png
testing/colored_0/000015_10.png testing/colored_1/000015_10.png
testing/colored_0/000016_10.png testing/colored_1/000016_10.png
testing/colored_0/000017_10.png testing/colored_1/000017_10.png
testing/colored_0/000018_10.png testing/colored_1/000018_10.png
testing/colored_0/000019_10.png testing/colored_1/000019_10.png
testing/colored_0/000020_10.png testing/colored_1/000020_10.png
testing/colored_0/000021_10.png testing/colored_1/000021_10.png
testing/colored_0/000022_10.png testing/colored_1/000022_10.png
testing/colored_0/000023_10.png testing/colored_1/000023_10.png
testing/colored_0/000024_10.png testing/colored_1/000024_10.png
testing/colored_0/000025_10.png testing/colored_1/000025_10.png
testing/colored_0/000026_10.png testing/colored_1/000026_10.png
testing/colored_0/000027_10.png testing/colored_1/000027_10.png
testing/colored_0/000028_10.png testing/colored_1/000028_10.png
testing/colored_0/000029_10.png testing/colored_1/000029_10.png
testing/colored_0/000030_10.png testing/colored_1/000030_10.png
testing/colored_0/000031_10.png testing/colored_1/000031_10.png
testing/colored_0/000032_10.png testing/colored_1/000032_10.png
testing/colored_0/000033_10.png testing/colored_1/000033_10.png
testing/colored_0/000034_10.png testing/colored_1/000034_10.png
testing/colored_0/000035_10.png testing/colored_1/000035_10.png
testing/colored_0/000036_10.png testing/colored_1/000036_10.png
testing/colored_0/000037_10.png testing/colored_1/000037_10.png
testing/colored_0/000038_10.png testing/colored_1/000038_10.png
testing/colored_0/000039_10.png testing/colored_1/000039_10.png
testing/colored_0/000040_10.png testing/colored_1/000040_10.png
testing/colored_0/000041_10.png testing/colored_1/000041_10.png
testing/colored_0/000042_10.png testing/colored_1/000042_10.png
testing/colored_0/000043_10.png testing/colored_1/000043_10.png
testing/colored_0/000044_10.png testing/colored_1/000044_10.png
testing/colored_0/000045_10.png testing/colored_1/000045_10.png
testing/colored_0/000046_10.png testing/colored_1/000046_10.png
testing/colored_0/000047_10.png testing/colored_1/000047_10.png
testing/colored_0/000048_10.png testing/colored_1/000048_10.png
testing/colored_0/000049_10.png testing/colored_1/000049_10.png
testing/colored_0/000050_10.png testing/colored_1/000050_10.png
testing/colored_0/000051_10.png testing/colored_1/000051_10.png
testing/colored_0/000052_10.png testing/colored_1/000052_10.png
testing/colored_0/000053_10.png testing/colored_1/000053_10.png
testing/colored_0/000054_10.png testing/colored_1/000054_10.png
testing/colored_0/000055_10.png testing/colored_1/000055_10.png
testing/colored_0/000056_10.png testing/colored_1/000056_10.png
testing/colored_0/000057_10.png testing/colored_1/000057_10.png
testing/colored_0/000058_10.png testing/colored_1/000058_10.png
testing/colored_0/000059_10.png testing/colored_1/000059_10.png
testing/colored_0/000060_10.png testing/colored_1/000060_10.png
testing/colored_0/000061_10.png testing/colored_1/000061_10.png
testing/colored_0/000062_10.png testing/colored_1/000062_10.png
testing/colored_0/000063_10.png testing/colored_1/000063_10.png
testing/colored_0/000064_10.png testing/colored_1/000064_10.png
testing/colored_0/000065_10.png testing/colored_1/000065_10.png
testing/colored_0/000066_10.png testing/colored_1/000066_10.png
testing/colored_0/000067_10.png testing/colored_1/000067_10.png
testing/colored_0/000068_10.png testing/colored_1/000068_10.png
testing/colored_0/000069_10.png testing/colored_1/000069_10.png
testing/colored_0/000070_10.png testing/colored_1/000070_10.png
testing/colored_0/000071_10.png testing/colored_1/000071_10.png
testing/colored_0/000072_10.png testing/colored_1/000072_10.png
testing/colored_0/000073_10.png testing/colored_1/000073_10.png
testing/colored_0/000074_10.png testing/colored_1/000074_10.png
testing/colored_0/000075_10.png testing/colored_1/000075_10.png
testing/colored_0/000076_10.png testing/colored_1/000076_10.png
testing/colored_0/000077_10.png testing/colored_1/000077_10.png
testing/colored_0/000078_10.png testing/colored_1/000078_10.png
testing/colored_0/000079_10.png testing/colored_1/000079_10.png
testing/colored_0/000080_10.png testing/colored_1/000080_10.png
testing/colored_0/000081_10.png testing/colored_1/000081_10.png
testing/colored_0/000082_10.png testing/colored_1/000082_10.png
testing/colored_0/000083_10.png testing/colored_1/000083_10.png
testing/colored_0/000084_10.png testing/colored_1/000084_10.png
testing/colored_0/000085_10.png testing/colored_1/000085_10.png
testing/colored_0/000086_10.png testing/colored_1/000086_10.png
testing/colored_0/000087_10.png testing/colored_1/000087_10.png
testing/colored_0/000088_10.png testing/colored_1/000088_10.png
testing/colored_0/000089_10.png testing/colored_1/000089_10.png
testing/colored_0/000090_10.png testing/colored_1/000090_10.png
testing/colored_0/000091_10.png testing/colored_1/000091_10.png
testing/colored_0/000092_10.png testing/colored_1/000092_10.png
testing/colored_0/000093_10.png testing/colored_1/000093_10.png
testing/colored_0/000094_10.png testing/colored_1/000094_10.png
testing/colored_0/000095_10.png testing/colored_1/000095_10.png
testing/colored_0/000096_10.png testing/colored_1/000096_10.png
testing/colored_0/000097_10.png testing/colored_1/000097_10.png
testing/colored_0/000098_10.png testing/colored_1/000098_10.png
testing/colored_0/000099_10.png testing/colored_1/000099_10.png
testing/colored_0/000100_10.png testing/colored_1/000100_10.png
testing/colored_0/000101_10.png testing/colored_1/000101_10.png
testing/colored_0/000102_10.png testing/colored_1/000102_10.png
testing/colored_0/000103_10.png testing/colored_1/000103_10.png
testing/colored_0/000104_10.png testing/colored_1/000104_10.png
testing/colored_0/000105_10.png testing/colored_1/000105_10.png
testing/colored_0/000106_10.png testing/colored_1/000106_10.png
testing/colored_0/000107_10.png testing/colored_1/000107_10.png
testing/colored_0/000108_10.png testing/colored_1/000108_10.png
testing/colored_0/000109_10.png testing/colored_1/000109_10.png
testing/colored_0/000110_10.png testing/colored_1/000110_10.png
testing/colored_0/000111_10.png testing/colored_1/000111_10.png
testing/colored_0/000112_10.png testing/colored_1/000112_10.png
testing/colored_0/000113_10.png testing/colored_1/000113_10.png
testing/colored_0/000114_10.png testing/colored_1/000114_10.png
testing/colored_0/000115_10.png testing/colored_1/000115_10.png
testing/colored_0/000116_10.png testing/colored_1/000116_10.png
testing/colored_0/000117_10.png testing/colored_1/000117_10.png
testing/colored_0/000118_10.png testing/colored_1/000118_10.png
testing/colored_0/000119_10.png testing/colored_1/000119_10.png
testing/colored_0/000120_10.png testing/colored_1/000120_10.png
testing/colored_0/000121_10.png testing/colored_1/000121_10.png
testing/colored_0/000122_10.png testing/colored_1/000122_10.png
testing/colored_0/000123_10.png testing/colored_1/000123_10.png
testing/colored_0/000124_10.png testing/colored_1/000124_10.png
testing/colored_0/000125_10.png testing/colored_1/000125_10.png
testing/colored_0/000126_10.png testing/colored_1/000126_10.png
testing/colored_0/000127_10.png testing/colored_1/000127_10.png
testing/colored_0/000128_10.png testing/colored_1/000128_10.png
testing/colored_0/000129_10.png testing/colored_1/000129_10.png
testing/colored_0/000130_10.png testing/colored_1/000130_10.png
testing/colored_0/000131_10.png testing/colored_1/000131_10.png
testing/colored_0/000132_10.png testing/colored_1/000132_10.png
testing/colored_0/000133_10.png testing/colored_1/000133_10.png
testing/colored_0/000134_10.png testing/colored_1/000134_10.png
testing/colored_0/000135_10.png testing/colored_1/000135_10.png
testing/colored_0/000136_10.png testing/colored_1/000136_10.png
testing/colored_0/000137_10.png testing/colored_1/000137_10.png
testing/colored_0/000138_10.png testing/colored_1/000138_10.png
testing/colored_0/000139_10.png testing/colored_1/000139_10.png
testing/colored_0/000140_10.png testing/colored_1/000140_10.png
testing/colored_0/000141_10.png testing/colored_1/000141_10.png
testing/colored_0/000142_10.png testing/colored_1/000142_10.png
testing/colored_0/000143_10.png testing/colored_1/000143_10.png
testing/colored_0/000144_10.png testing/colored_1/000144_10.png
testing/colored_0/000145_10.png testing/colored_1/000145_10.png
testing/colored_0/000146_10.png testing/colored_1/000146_10.png
testing/colored_0/000147_10.png testing/colored_1/000147_10.png
testing/colored_0/000148_10.png testing/colored_1/000148_10.png
testing/colored_0/000149_10.png testing/colored_1/000149_10.png
testing/colored_0/000150_10.png testing/colored_1/000150_10.png
testing/colored_0/000151_10.png testing/colored_1/000151_10.png
testing/colored_0/000152_10.png testing/colored_1/000152_10.png
testing/colored_0/000153_10.png testing/colored_1/000153_10.png
testing/colored_0/000154_10.png testing/colored_1/000154_10.png
testing/colored_0/000155_10.png testing/colored_1/000155_10.png
testing/colored_0/000156_10.png testing/colored_1/000156_10.png
testing/colored_0/000157_10.png testing/colored_1/000157_10.png
testing/colored_0/000158_10.png testing/colored_1/000158_10.png
testing/colored_0/000159_10.png testing/colored_1/000159_10.png
testing/colored_0/000160_10.png testing/colored_1/000160_10.png
testing/colored_0/000161_10.png testing/colored_1/000161_10.png
testing/colored_0/000162_10.png testing/colored_1/000162_10.png
testing/colored_0/000163_10.png testing/colored_1/000163_10.png
testing/colored_0/000164_10.png testing/colored_1/000164_10.png
testing/colored_0/000165_10.png testing/colored_1/000165_10.png
testing/colored_0/000166_10.png testing/colored_1/000166_10.png
testing/colored_0/000167_10.png testing/colored_1/000167_10.png
testing/colored_0/000168_10.png testing/colored_1/000168_10.png
testing/colored_0/000169_10.png testing/colored_1/000169_10.png
testing/colored_0/000170_10.png testing/colored_1/000170_10.png
testing/colored_0/000171_10.png testing/colored_1/000171_10.png
testing/colored_0/000172_10.png testing/colored_1/000172_10.png
testing/colored_0/000173_10.png testing/colored_1/000173_10.png
testing/colored_0/000174_10.png testing/colored_1/000174_10.png
testing/colored_0/000175_10.png testing/colored_1/000175_10.png
testing/colored_0/000176_10.png testing/colored_1/000176_10.png
testing/colored_0/000177_10.png testing/colored_1/000177_10.png
testing/colored_0/000178_10.png testing/colored_1/000178_10.png
testing/colored_0/000179_10.png testing/colored_1/000179_10.png
testing/colored_0/000180_10.png testing/colored_1/000180_10.png
testing/colored_0/000181_10.png testing/colored_1/000181_10.png
testing/colored_0/000182_10.png testing/colored_1/000182_10.png
testing/colored_0/000183_10.png testing/colored_1/000183_10.png
testing/colored_0/000184_10.png testing/colored_1/000184_10.png
testing/colored_0/000185_10.png testing/colored_1/000185_10.png
testing/colored_0/000186_10.png testing/colored_1/000186_10.png
testing/colored_0/000187_10.png testing/colored_1/000187_10.png
testing/colored_0/000188_10.png testing/colored_1/000188_10.png
testing/colored_0/000189_10.png testing/colored_1/000189_10.png
testing/colored_0/000190_10.png testing/colored_1/000190_10.png
testing/colored_0/000191_10.png testing/colored_1/000191_10.png
testing/colored_0/000192_10.png testing/colored_1/000192_10.png
testing/colored_0/000193_10.png testing/colored_1/000193_10.png
testing/colored_0/000194_10.png testing/colored_1/000194_10.png
================================================
FILE: filenames/kitti12_train.txt
================================================
training/colored_0/000000_10.png training/colored_1/000000_10.png training/disp_occ/000000_10.png
training/colored_0/000001_10.png training/colored_1/000001_10.png training/disp_occ/000001_10.png
training/colored_0/000002_10.png training/colored_1/000002_10.png training/disp_occ/000002_10.png
training/colored_0/000003_10.png training/colored_1/000003_10.png training/disp_occ/000003_10.png
training/colored_0/000004_10.png training/colored_1/000004_10.png training/disp_occ/000004_10.png
training/colored_0/000005_10.png training/colored_1/000005_10.png training/disp_occ/000005_10.png
training/colored_0/000006_10.png training/colored_1/000006_10.png training/disp_occ/000006_10.png
training/colored_0/000007_10.png training/colored_1/000007_10.png training/disp_occ/000007_10.png
training/colored_0/000008_10.png training/colored_1/000008_10.png training/disp_occ/000008_10.png
training/colored_0/000009_10.png training/colored_1/000009_10.png training/disp_occ/000009_10.png
training/colored_0/000010_10.png training/colored_1/000010_10.png training/disp_occ/000010_10.png
training/colored_0/000011_10.png training/colored_1/000011_10.png training/disp_occ/000011_10.png
training/colored_0/000012_10.png training/colored_1/000012_10.png training/disp_occ/000012_10.png
training/colored_0/000013_10.png training/colored_1/000013_10.png training/disp_occ/000013_10.png
training/colored_0/000014_10.png training/colored_1/000014_10.png training/disp_occ/000014_10.png
training/colored_0/000015_10.png training/colored_1/000015_10.png training/disp_occ/000015_10.png
training/colored_0/000016_10.png training/colored_1/000016_10.png training/disp_occ/000016_10.png
training/colored_0/000017_10.png training/colored_1/000017_10.png training/disp_occ/000017_10.png
training/colored_0/000018_10.png training/colored_1/000018_10.png training/disp_occ/000018_10.png
training/colored_0/000019_10.png training/colored_1/000019_10.png training/disp_occ/000019_10.png
training/colored_0/000020_10.png training/colored_1/000020_10.png training/disp_occ/000020_10.png
training/colored_0/000021_10.png training/colored_1/000021_10.png training/disp_occ/000021_10.png
training/colored_0/000022_10.png training/colored_1/000022_10.png training/disp_occ/000022_10.png
training/colored_0/000023_10.png training/colored_1/000023_10.png training/disp_occ/000023_10.png
training/colored_0/000024_10.png training/colored_1/000024_10.png training/disp_occ/000024_10.png
training/colored_0/000025_10.png training/colored_1/000025_10.png training/disp_occ/000025_10.png
training/colored_0/000026_10.png training/colored_1/000026_10.png training/disp_occ/000026_10.png
training/colored_0/000027_10.png training/colored_1/000027_10.png training/disp_occ/000027_10.png
training/colored_0/000028_10.png training/colored_1/000028_10.png training/disp_occ/000028_10.png
training/colored_0/000029_10.png training/colored_1/000029_10.png training/disp_occ/000029_10.png
training/colored_0/000030_10.png training/colored_1/000030_10.png training/disp_occ/000030_10.png
training/colored_0/000031_10.png training/colored_1/000031_10.png training/disp_occ/000031_10.png
training/colored_0/000032_10.png training/colored_1/000032_10.png training/disp_occ/000032_10.png
training/colored_0/000033_10.png training/colored_1/000033_10.png training/disp_occ/000033_10.png
training/colored_0/000034_10.png training/colored_1/000034_10.png training/disp_occ/000034_10.png
training/colored_0/000035_10.png training/colored_1/000035_10.png training/disp_occ/000035_10.png
training/colored_0/000036_10.png training/colored_1/000036_10.png training/disp_occ/000036_10.png
training/colored_0/000037_10.png training/colored_1/000037_10.png training/disp_occ/000037_10.png
training/colored_0/000038_10.png training/colored_1/000038_10.png training/disp_occ/000038_10.png
training/colored_0/000039_10.png training/colored_1/000039_10.png training/disp_occ/000039_10.png
training/colored_0/000040_10.png training/colored_1/000040_10.png training/disp_occ/000040_10.png
training/colored_0/000041_10.png training/colored_1/000041_10.png training/disp_occ/000041_10.png
training/colored_0/000042_10.png training/colored_1/000042_10.png training/disp_occ/000042_10.png
training/colored_0/000043_10.png training/colored_1/000043_10.png training/disp_occ/000043_10.png
training/colored_0/000044_10.png training/colored_1/000044_10.png training/disp_occ/000044_10.png
training/colored_0/000045_10.png training/colored_1/000045_10.png training/disp_occ/000045_10.png
training/colored_0/000046_10.png training/colored_1/000046_10.png training/disp_occ/000046_10.png
training/colored_0/000047_10.png training/colored_1/000047_10.png training/disp_occ/000047_10.png
training/colored_0/000048_10.png training/colored_1/000048_10.png training/disp_occ/000048_10.png
training/colored_0/000049_10.png training/colored_1/000049_10.png training/disp_occ/000049_10.png
training/colored_0/000050_10.png training/colored_1/000050_10.png training/disp_occ/000050_10.png
training/colored_0/000051_10.png training/colored_1/000051_10.png training/disp_occ/000051_10.png
training/colored_0/000052_10.png training/colored_1/000052_10.png training/disp_occ/000052_10.png
training/colored_0/000053_10.png training/colored_1/000053_10.png training/disp_occ/000053_10.png
training/colored_0/000054_10.png training/colored_1/000054_10.png training/disp_occ/000054_10.png
training/colored_0/000055_10.png training/colored_1/000055_10.png training/disp_occ/000055_10.png
training/colored_0/000056_10.png training/colored_1/000056_10.png training/disp_occ/000056_10.png
training/colored_0/000057_10.png training/colored_1/000057_10.png training/disp_occ/000057_10.png
training/colored_0/000058_10.png training/colored_1/000058_10.png training/disp_occ/000058_10.png
training/colored_0/000059_10.png training/colored_1/000059_10.png training/disp_occ/000059_10.png
training/colored_0/000060_10.png training/colored_1/000060_10.png training/disp_occ/000060_10.png
training/colored_0/000061_10.png training/colored_1/000061_10.png training/disp_occ/000061_10.png
training/colored_0/000062_10.png training/colored_1/000062_10.png training/disp_occ/000062_10.png
training/colored_0/000063_10.png training/colored_1/000063_10.png training/disp_occ/000063_10.png
training/colored_0/000064_10.png training/colored_1/000064_10.png training/disp_occ/000064_10.png
training/colored_0/000065_10.png training/colored_1/000065_10.png training/disp_occ/000065_10.png
training/colored_0/000066_10.png training/colored_1/000066_10.png training/disp_occ/000066_10.png
training/colored_0/000067_10.png training/colored_1/000067_10.png training/disp_occ/000067_10.png
training/colored_0/000068_10.png training/colored_1/000068_10.png training/disp_occ/000068_10.png
training/colored_0/000069_10.png training/colored_1/000069_10.png training/disp_occ/000069_10.png
training/colored_0/000070_10.png training/colored_1/000070_10.png training/disp_occ/000070_10.png
training/colored_0/000071_10.png training/colored_1/000071_10.png training/disp_occ/000071_10.png
training/colored_0/000072_10.png training/colored_1/000072_10.png training/disp_occ/000072_10.png
training/colored_0/000073_10.png training/colored_1/000073_10.png training/disp_occ/000073_10.png
training/colored_0/000074_10.png training/colored_1/000074_10.png training/disp_occ/000074_10.png
training/colored_0/000075_10.png training/colored_1/000075_10.png training/disp_occ/000075_10.png
training/colored_0/000076_10.png training/colored_1/000076_10.png training/disp_occ/000076_10.png
training/colored_0/000077_10.png training/colored_1/000077_10.png training/disp_occ/000077_10.png
training/colored_0/000078_10.png training/colored_1/000078_10.png training/disp_occ/000078_10.png
training/colored_0/000079_10.png training/colored_1/000079_10.png training/disp_occ/000079_10.png
training/colored_0/000080_10.png training/colored_1/000080_10.png training/disp_occ/000080_10.png
training/colored_0/000081_10.png training/colored_1/000081_10.png training/disp_occ/000081_10.png
training/colored_0/000082_10.png training/colored_1/000082_10.png training/disp_occ/000082_10.png
training/colored_0/000083_10.png training/colored_1/000083_10.png training/disp_occ/000083_10.png
training/colored_0/000084_10.png training/colored_1/000084_10.png training/disp_occ/000084_10.png
training/colored_0/000085_10.png training/colored_1/000085_10.png training/disp_occ/000085_10.png
training/colored_0/000086_10.png training/colored_1/000086_10.png training/disp_occ/000086_10.png
training/colored_0/000087_10.png training/colored_1/000087_10.png training/disp_occ/000087_10.png
training/colored_0/000088_10.png training/colored_1/000088_10.png training/disp_occ/000088_10.png
training/colored_0/000089_10.png training/colored_1/000089_10.png training/disp_occ/000089_10.png
training/colored_0/000090_10.png training/colored_1/000090_10.png training/disp_occ/000090_10.png
training/colored_0/000091_10.png training/colored_1/000091_10.png training/disp_occ/000091_10.png
training/colored_0/000092_10.png training/colored_1/000092_10.png training/disp_occ/000092_10.png
training/colored_0/000093_10.png training/colored_1/000093_10.png training/disp_occ/000093_10.png
training/colored_0/000094_10.png training/colored_1/000094_10.png training/disp_occ/000094_10.png
training/colored_0/000095_10.png training/colored_1/000095_10.png training/disp_occ/000095_10.png
training/colored_0/000096_10.png training/colored_1/000096_10.png training/disp_occ/000096_10.png
training/colored_0/000097_10.png training/colored_1/000097_10.png training/disp_occ/000097_10.png
training/colored_0/000098_10.png training/colored_1/000098_10.png training/disp_occ/000098_10.png
training/colored_0/000099_10.png training/colored_1/000099_10.png training/disp_occ/000099_10.png
training/colored_0/000100_10.png training/colored_1/000100_10.png training/disp_occ/000100_10.png
training/colored_0/000101_10.png training/colored_1/000101_10.png training/disp_occ/000101_10.png
training/colored_0/000102_10.png training/colored_1/000102_10.png training/disp_occ/000102_10.png
training/colored_0/000103_10.png training/colored_1/000103_10.png training/disp_occ/000103_10.png
training/colored_0/000104_10.png training/colored_1/000104_10.png training/disp_occ/000104_10.png
training/colored_0/000105_10.png training/colored_1/000105_10.png training/disp_occ/000105_10.png
training/colored_0/000106_10.png training/colored_1/000106_10.png training/disp_occ/000106_10.png
training/colored_0/000107_10.png training/colored_1/000107_10.png training/disp_occ/000107_10.png
training/colored_0/000108_10.png training/colored_1/000108_10.png training/disp_occ/000108_10.png
training/colored_0/000109_10.png training/colored_1/000109_10.png training/disp_occ/000109_10.png
training/colored_0/000110_10.png training/colored_1/000110_10.png training/disp_occ/000110_10.png
training/colored_0/000111_10.png training/colored_1/000111_10.png training/disp_occ/000111_10.png
training/colored_0/000112_10.png training/colored_1/000112_10.png training/disp_occ/000112_10.png
training/colored_0/000113_10.png training/colored_1/000113_10.png training/disp_occ/000113_10.png
training/colored_0/000114_10.png training/colored_1/000114_10.png training/disp_occ/000114_10.png
training/colored_0/000115_10.png training/colored_1/000115_10.png training/disp_occ/000115_10.png
training/colored_0/000116_10.png training/colored_1/000116_10.png training/disp_occ/000116_10.png
training/colored_0/000117_10.png training/colored_1/000117_10.png training/disp_occ/000117_10.png
training/colored_0/000118_10.png training/colored_1/000118_10.png training/disp_occ/000118_10.png
training/colored_0/000119_10.png training/colored_1/000119_10.png training/disp_occ/000119_10.png
training/colored_0/000120_10.png training/colored_1/000120_10.png training/disp_occ/000120_10.png
training/colored_0/000121_10.png training/colored_1/000121_10.png training/disp_occ/000121_10.png
training/colored_0/000122_10.png training/colored_1/000122_10.png training/disp_occ/000122_10.png
training/colored_0/000123_10.png training/colored_1/000123_10.png training/disp_occ/000123_10.png
training/colored_0/000124_10.png training/colored_1/000124_10.png training/disp_occ/000124_10.png
training/colored_0/000125_10.png training/colored_1/000125_10.png training/disp_occ/000125_10.png
training/colored_0/000126_10.png training/colored_1/000126_10.png training/disp_occ/000126_10.png
training/colored_0/000127_10.png training/colored_1/000127_10.png training/disp_occ/000127_10.png
training/colored_0/000128_10.png training/colored_1/000128_10.png training/disp_occ/000128_10.png
training/colored_0/000129_10.png training/colored_1/000129_10.png training/disp_occ/000129_10.png
training/colored_0/000130_10.png training/colored_1/000130_10.png training/disp_occ/000130_10.png
training/colored_0/000131_10.png training/colored_1/000131_10.png training/disp_occ/000131_10.png
training/colored_0/000132_10.png training/colored_1/000132_10.png training/disp_occ/000132_10.png
training/colored_0/000133_10.png training/colored_1/000133_10.png training/disp_occ/000133_10.png
training/colored_0/000134_10.png training/colored_1/000134_10.png training/disp_occ/000134_10.png
training/colored_0/000135_10.png training/colored_1/000135_10.png training/disp_occ/000135_10.png
training/colored_0/000136_10.png training/colored_1/000136_10.png training/disp_occ/000136_10.png
training/colored_0/000137_10.png training/colored_1/000137_10.png training/disp_occ/000137_10.png
training/colored_0/000138_10.png training/colored_1/000138_10.png training/disp_occ/000138_10.png
training/colored_0/000139_10.png training/colored_1/000139_10.png training/disp_occ/000139_10.png
training/colored_0/000140_10.png training/colored_1/000140_10.png training/disp_occ/000140_10.png
training/colored_0/000141_10.png training/colored_1/000141_10.png training/disp_occ/000141_10.png
training/colored_0/000142_10.png training/colored_1/000142_10.png training/disp_occ/000142_10.png
training/colored_0/000143_10.png training/colored_1/000143_10.png training/disp_occ/000143_10.png
training/colored_0/000144_10.png training/colored_1/000144_10.png training/disp_occ/000144_10.png
training/colored_0/000145_10.png training/colored_1/000145_10.png training/disp_occ/000145_10.png
training/colored_0/000146_10.png training/colored_1/000146_10.png training/disp_occ/000146_10.png
training/colored_0/000147_10.png training/colored_1/000147_10.png training/disp_occ/000147_10.png
training/colored_0/000148_10.png training/colored_1/000148_10.png training/disp_occ/000148_10.png
training/colored_0/000149_10.png training/colored_1/000149_10.png training/disp_occ/000149_10.png
training/colored_0/000150_10.png training/colored_1/000150_10.png training/disp_occ/000150_10.png
training/colored_0/000151_10.png training/colored_1/000151_10.png training/disp_occ/000151_10.png
training/colored_0/000152_10.png training/colored_1/000152_10.png training/disp_occ/000152_10.png
training/colored_0/000153_10.png training/colored_1/000153_10.png training/disp_occ/000153_10.png
training/colored_0/000154_10.png training/colored_1/000154_10.png training/disp_occ/000154_10.png
training/colored_0/000155_10.png training/colored_1/000155_10.png training/disp_occ/000155_10.png
training/colored_0/000156_10.png training/colored_1/000156_10.png training/disp_occ/000156_10.png
training/colored_0/000157_10.png training/colored_1/000157_10.png training/disp_occ/000157_10.png
training/colored_0/000158_10.png training/colored_1/000158_10.png training/disp_occ/000158_10.png
training/colored_0/000159_10.png training/colored_1/000159_10.png training/disp_occ/000159_10.png
training/colored_0/000160_10.png training/colored_1/000160_10.png training/disp_occ/000160_10.png
training/colored_0/000161_10.png training/colored_1/000161_10.png training/disp_occ/000161_10.png
training/colored_0/000162_10.png training/colored_1/000162_10.png training/disp_occ/000162_10.png
training/colored_0/000163_10.png training/colored_1/000163_10.png training/disp_occ/000163_10.png
training/colored_0/000164_10.png training/colored_1/000164_10.png training/disp_occ/000164_10.png
training/colored_0/000165_10.png training/colored_1/000165_10.png training/disp_occ/000165_10.png
training/colored_0/000166_10.png training/colored_1/000166_10.png training/disp_occ/000166_10.png
training/colored_0/000167_10.png training/colored_1/000167_10.png training/disp_occ/000167_10.png
training/colored_0/000168_10.png training/colored_1/000168_10.png training/disp_occ/000168_10.png
training/colored_0/000169_10.png training/colored_1/000169_10.png training/disp_occ/000169_10.png
training/colored_0/000170_10.png training/colored_1/000170_10.png training/disp_occ/000170_10.png
training/colored_0/000171_10.png training/colored_1/000171_10.png training/disp_occ/000171_10.png
training/colored_0/000172_10.png training/colored_1/000172_10.png training/disp_occ/000172_10.png
training/colored_0/000173_10.png training/colored_1/000173_10.png training/disp_occ/000173_10.png
training/colored_0/000174_10.png training/colored_1/000174_10.png training/disp_occ/000174_10.png
training/colored_0/000175_10.png training/colored_1/000175_10.png training/disp_occ/000175_10.png
training/colored_0/000176_10.png training/colored_1/000176_10.png training/disp_occ/000176_10.png
training/colored_0/000177_10.png training/colored_1/000177_10.png training/disp_occ/000177_10.png
training/colored_0/000178_10.png training/colored_1/000178_10.png training/disp_occ/000178_10.png
training/colored_0/000179_10.png training/colored_1/000179_10.png training/disp_occ/000179_10.png
================================================
FILE: filenames/kitti12_val.txt
================================================
training/colored_0/000180_10.png training/colored_1/000180_10.png training/disp_occ/000180_10.png
training/colored_0/000181_10.png training/colored_1/000181_10.png training/disp_occ/000181_10.png
training/colored_0/000182_10.png training/colored_1/000182_10.png training/disp_occ/000182_10.png
training/colored_0/000183_10.png training/colored_1/000183_10.png training/disp_occ/000183_10.png
training/colored_0/000184_10.png training/colored_1/000184_10.png training/disp_occ/000184_10.png
training/colored_0/000185_10.png training/colored_1/000185_10.png training/disp_occ/000185_10.png
training/colored_0/000186_10.png training/colored_1/000186_10.png training/disp_occ/000186_10.png
training/colored_0/000187_10.png training/colored_1/000187_10.png training/disp_occ/000187_10.png
training/colored_0/000188_10.png training/colored_1/000188_10.png training/disp_occ/000188_10.png
training/colored_0/000189_10.png training/colored_1/000189_10.png training/disp_occ/000189_10.png
training/colored_0/000190_10.png training/colored_1/000190_10.png training/disp_occ/000190_10.png
training/colored_0/000191_10.png training/colored_1/000191_10.png training/disp_occ/000191_10.png
training/colored_0/000192_10.png training/colored_1/000192_10.png training/disp_occ/000192_10.png
training/colored_0/000193_10.png training/colored_1/000193_10.png training/disp_occ/000193_10.png
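The filename lists above use a simple space-separated format: two columns (left image, right image) for test splits, and a third column with the ground-truth disparity map for training/validation splits. As a minimal parsing sketch (this is an illustration, not the repo's actual loader; the repo parses these lists in `datasets/data_io.py` and the dataset classes' `load_path` methods):

```python
def parse_file_list(lines):
    """Parse space-separated stereo filename lists.

    Each line holds a left-image path and a right-image path; training
    lists append a third column with the disparity-map path. Returns
    three parallel lists; disparity entries are None for test lists.
    """
    left, right, disp = [], [], []
    for line in lines:
        parts = line.split()
        if not parts:  # skip blank lines
            continue
        left.append(parts[0])
        right.append(parts[1])
        disp.append(parts[2] if len(parts) >= 3 else None)
    return left, right, disp
```

Keeping the three columns in parallel lists mirrors how the repo's dataset classes index samples by position.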
================================================
FILE: filenames/kitti15_all.txt
================================================
training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png
training/image_2/000002_10.png training/image_3/000002_10.png training/disp_occ_0/000002_10.png
training/image_2/000003_10.png training/image_3/000003_10.png training/disp_occ_0/000003_10.png
training/image_2/000004_10.png training/image_3/000004_10.png training/disp_occ_0/000004_10.png
training/image_2/000005_10.png training/image_3/000005_10.png training/disp_occ_0/000005_10.png
training/image_2/000007_10.png training/image_3/000007_10.png training/disp_occ_0/000007_10.png
training/image_2/000008_10.png training/image_3/000008_10.png training/disp_occ_0/000008_10.png
training/image_2/000009_10.png training/image_3/000009_10.png training/disp_occ_0/000009_10.png
training/image_2/000010_10.png training/image_3/000010_10.png training/disp_occ_0/000010_10.png
training/image_2/000011_10.png training/image_3/000011_10.png training/disp_occ_0/000011_10.png
training/image_2/000012_10.png training/image_3/000012_10.png training/disp_occ_0/000012_10.png
training/image_2/000013_10.png training/image_3/000013_10.png training/disp_occ_0/000013_10.png
training/image_2/000014_10.png training/image_3/000014_10.png training/disp_occ_0/000014_10.png
training/image_2/000015_10.png training/image_3/000015_10.png training/disp_occ_0/000015_10.png
training/image_2/000016_10.png training/image_3/000016_10.png training/disp_occ_0/000016_10.png
training/image_2/000017_10.png training/image_3/000017_10.png training/disp_occ_0/000017_10.png
training/image_2/000018_10.png training/image_3/000018_10.png training/disp_occ_0/000018_10.png
training/image_2/000019_10.png training/image_3/000019_10.png training/disp_occ_0/000019_10.png
training/image_2/000020_10.png training/image_3/000020_10.png training/disp_occ_0/000020_10.png
training/image_2/000021_10.png training/image_3/000021_10.png training/disp_occ_0/000021_10.png
training/image_2/000022_10.png training/image_3/000022_10.png training/disp_occ_0/000022_10.png
training/image_2/000023_10.png training/image_3/000023_10.png training/disp_occ_0/000023_10.png
training/image_2/000024_10.png training/image_3/000024_10.png training/disp_occ_0/000024_10.png
training/image_2/000025_10.png training/image_3/000025_10.png training/disp_occ_0/000025_10.png
training/image_2/000027_10.png training/image_3/000027_10.png training/disp_occ_0/000027_10.png
training/image_2/000028_10.png training/image_3/000028_10.png training/disp_occ_0/000028_10.png
training/image_2/000029_10.png training/image_3/000029_10.png training/disp_occ_0/000029_10.png
training/image_2/000030_10.png training/image_3/000030_10.png training/disp_occ_0/000030_10.png
training/image_2/000031_10.png training/image_3/000031_10.png training/disp_occ_0/000031_10.png
training/image_2/000032_10.png training/image_3/000032_10.png training/disp_occ_0/000032_10.png
training/image_2/000033_10.png training/image_3/000033_10.png training/disp_occ_0/000033_10.png
training/image_2/000034_10.png training/image_3/000034_10.png training/disp_occ_0/000034_10.png
training/image_2/000035_10.png training/image_3/000035_10.png training/disp_occ_0/000035_10.png
training/image_2/000036_10.png training/image_3/000036_10.png training/disp_occ_0/000036_10.png
training/image_2/000037_10.png training/image_3/000037_10.png training/disp_occ_0/000037_10.png
training/image_2/000039_10.png training/image_3/000039_10.png training/disp_occ_0/000039_10.png
training/image_2/000040_10.png training/image_3/000040_10.png training/disp_occ_0/000040_10.png
training/image_2/000041_10.png training/image_3/000041_10.png training/disp_occ_0/000041_10.png
training/image_2/000042_10.png training/image_3/000042_10.png training/disp_occ_0/000042_10.png
training/image_2/000044_10.png training/image_3/000044_10.png training/disp_occ_0/000044_10.png
training/image_2/000045_10.png training/image_3/000045_10.png training/disp_occ_0/000045_10.png
training/image_2/000046_10.png training/image_3/000046_10.png training/disp_occ_0/000046_10.png
training/image_2/000047_10.png training/image_3/000047_10.png training/disp_occ_0/000047_10.png
training/image_2/000048_10.png training/image_3/000048_10.png training/disp_occ_0/000048_10.png
training/image_2/000050_10.png training/image_3/000050_10.png training/disp_occ_0/000050_10.png
training/image_2/000051_10.png training/image_3/000051_10.png training/disp_occ_0/000051_10.png
training/image_2/000052_10.png training/image_3/000052_10.png training/disp_occ_0/000052_10.png
training/image_2/000053_10.png training/image_3/000053_10.png training/disp_occ_0/000053_10.png
training/image_2/000054_10.png training/image_3/000054_10.png training/disp_occ_0/000054_10.png
training/image_2/000055_10.png training/image_3/000055_10.png training/disp_occ_0/000055_10.png
training/image_2/000056_10.png training/image_3/000056_10.png training/disp_occ_0/000056_10.png
training/image_2/000057_10.png training/image_3/000057_10.png training/disp_occ_0/000057_10.png
training/image_2/000058_10.png training/image_3/000058_10.png training/disp_occ_0/000058_10.png
training/image_2/000059_10.png training/image_3/000059_10.png training/disp_occ_0/000059_10.png
training/image_2/000060_10.png training/image_3/000060_10.png training/disp_occ_0/000060_10.png
training/image_2/000061_10.png training/image_3/000061_10.png training/disp_occ_0/000061_10.png
training/image_2/000062_10.png training/image_3/000062_10.png training/disp_occ_0/000062_10.png
training/image_2/000063_10.png training/image_3/000063_10.png training/disp_occ_0/000063_10.png
training/image_2/000064_10.png training/image_3/000064_10.png training/disp_occ_0/000064_10.png
training/image_2/000065_10.png training/image_3/000065_10.png training/disp_occ_0/000065_10.png
training/image_2/000066_10.png training/image_3/000066_10.png training/disp_occ_0/000066_10.png
training/image_2/000068_10.png training/image_3/000068_10.png training/disp_occ_0/000068_10.png
training/image_2/000069_10.png training/image_3/000069_10.png training/disp_occ_0/000069_10.png
training/image_2/000070_10.png training/image_3/000070_10.png training/disp_occ_0/000070_10.png
training/image_2/000071_10.png training/image_3/000071_10.png training/disp_occ_0/000071_10.png
training/image_2/000072_10.png training/image_3/000072_10.png training/disp_occ_0/000072_10.png
training/image_2/000073_10.png training/image_3/000073_10.png training/disp_occ_0/000073_10.png
training/image_2/000074_10.png training/image_3/000074_10.png training/disp_occ_0/000074_10.png
training/image_2/000075_10.png training/image_3/000075_10.png training/disp_occ_0/000075_10.png
training/image_2/000076_10.png training/image_3/000076_10.png training/disp_occ_0/000076_10.png
training/image_2/000077_10.png training/image_3/000077_10.png training/disp_occ_0/000077_10.png
training/image_2/000078_10.png training/image_3/000078_10.png training/disp_occ_0/000078_10.png
training/image_2/000079_10.png training/image_3/000079_10.png training/disp_occ_0/000079_10.png
training/image_2/000080_10.png training/image_3/000080_10.png training/disp_occ_0/000080_10.png
training/image_2/000082_10.png training/image_3/000082_10.png training/disp_occ_0/000082_10.png
training/image_2/000083_10.png training/image_3/000083_10.png training/disp_occ_0/000083_10.png
training/image_2/000084_10.png training/image_3/000084_10.png training/disp_occ_0/000084_10.png
training/image_2/000085_10.png training/image_3/000085_10.png training/disp_occ_0/000085_10.png
training/image_2/000086_10.png train
================================================
SYMBOL INDEX (219 symbols across 27 files)
================================================
FILE: datasets/MiddleburyLoader.py
function is_image_file (line 17) | def is_image_file(filename):
function default_loader (line 21) | def default_loader(path):
function disparity_loader (line 25) | def disparity_loader(path):
class myImageFloder (line 36) | class myImageFloder(data.Dataset):
method __init__ (line 38) | def __init__(self, left, right, left_disparity, training, right_dispar...
method __getitem__ (line 48) | def __getitem__(self, index):
method __len__ (line 175) | def __len__(self):
FILE: datasets/data_io.py
function get_transform (line 6) | def get_transform():
function get_transform_aug (line 15) | def get_transform_aug():
function read_all_lines (line 25) | def read_all_lines(filename):
function pfm_imread (line 32) | def pfm_imread(filename):
FILE: datasets/eth3dLoader.py
function is_image_file (line 17) | def is_image_file(filename):
function default_loader (line 21) | def default_loader(path):
function disparity_loader (line 25) | def disparity_loader(path):
class myImageFloder (line 36) | class myImageFloder(data.Dataset):
method __init__ (line 38) | def __init__(self, left, right, left_disparity, training, right_dispar...
method __getitem__ (line 47) | def __getitem__(self, index):
method __len__ (line 165) | def __len__(self):
FILE: datasets/eth3d_dataset.py
class ETH3DStereoDataset (line 25) | class ETH3DStereoDataset(BaseDataset):
method __init__ (line 27) | def __init__(self,
method load_images (line 42) | def load_images(self, idx, do_flip=False):
method load_disparity (line 50) | def load_disparity(self, idx, do_flip=False):
method __getitem__ (line 58) | def __getitem__(self, idx):
FILE: datasets/flow_transforms.py
class Compose (line 10) | class Compose(object):
method __init__ (line 14) | def __init__(self, co_transforms):
method __call__ (line 17) | def __call__(self, input, target):
class Compose_mask (line 22) | class Compose_mask(object):
method __init__ (line 26) | def __init__(self, co_transforms):
method __call__ (line 29) | def __call__(self, input, target, mask):
class Scale (line 36) | class Scale(object):
method __init__ (line 40) | def __init__(self, size, order=2):
method __call__ (line 50) | def __call__(self, inputs, target):
class RandomCrop (line 61) | class RandomCrop(object):
method __init__ (line 65) | def __init__(self, size):
method __call__ (line 71) | def __call__(self, inputs,target):
class RandomCrop_mask (line 83) | class RandomCrop_mask(object):
method __init__ (line 87) | def __init__(self, size):
method __call__ (line 93) | def __call__(self, inputs,target,mask):
class RandomVdisp (line 106) | class RandomVdisp(object):
method __init__ (line 110) | def __init__(self, angle, px, diff_angle=0, order=2, reshape=False):
method __call__ (line 117) | def __call__(self, inputs,target):
FILE: datasets/kitti_dataset_1215.py
class KITTIDataset (line 13) | class KITTIDataset(Dataset):
method __init__ (line 14) | def __init__(self, kitti15_datapath, kitti12_datapath, list_filename, ...
method load_path (line 22) | def load_path(self, list_filename):
method load_image (line 33) | def load_image(self, filename):
method load_disp (line 36) | def load_disp(self, filename):
method __len__ (line 41) | def __len__(self):
method __getitem__ (line 44) | def __getitem__(self, index):
FILE: datasets/kitti_dataset_1215_augmentation.py
class KITTIDataset (line 15) | class KITTIDataset(Dataset):
method __init__ (line 16) | def __init__(self, kitti15_datapath, kitti12_datapath, list_filename, ...
method load_path (line 24) | def load_path(self, list_filename):
method load_image (line 35) | def load_image(self, filename):
method load_disp (line 38) | def load_disp(self, filename):
method __len__ (line 43) | def __len__(self):
method __getitem__ (line 46) | def __getitem__(self, index):
FILE: datasets/kittiraw_loader.py
function listfiles (line 14) | def listfiles(filepath):
class ImageLoader (line 32) | class ImageLoader(Dataset):
method __init__ (line 33) | def __init__(self, left, right):
method load_path (line 37) | def load_path(self, list_filename):
method load_image (line 48) | def load_image(self, filename):
method load_disp (line 51) | def load_disp(self, filename):
method __len__ (line 56) | def __len__(self):
method __getitem__ (line 59) | def __getitem__(self, index):
method __len__ (line 86) | def __len__(self):
FILE: datasets/listfiles.py
function dataloader (line 11) | def dataloader(filepath):
FILE: datasets/middlebury_data.py
function pil_loader (line 9) | def pil_loader(path):
function readlines (line 16) | def readlines(filename):
function read_pfm (line 22) | def read_pfm(file):
class MiddleburyStereoDataset (line 31) | class MiddleburyStereoDataset:
method __init__ (line 33) | def __init__(self,
method load_images (line 53) | def load_images(self, idx, do_flip=False):
method load_disparity (line 62) | def load_disparity(self, idx, do_flip=False):
method preprocess (line 70) | def preprocess(self, inputs):
method __len__ (line 78) | def __len__(self):
method __getitem__ (line 81) | def __getitem__(self, idx):
FILE: datasets/middlebury_data_our.py
function pil_loader (line 10) | def pil_loader(path):
function readlines (line 17) | def readlines(filename):
function read_pfm (line 23) | def read_pfm(file):
class MiddleburyStereoDataset (line 32) | class MiddleburyStereoDataset:
method __init__ (line 34) | def __init__(self,
method load_images (line 55) | def load_images(self, idx, do_flip=False):
method load_disparity (line 64) | def load_disparity(self, idx, do_flip=False):
method preprocess (line 72) | def preprocess(self, inputs):
method __len__ (line 80) | def __len__(self):
method __getitem__ (line 83) | def __getitem__(self, idx):
FILE: datasets/middlebury_loader.py
function mb_loader (line 10) | def mb_loader(filepath, res):
function img_loader (line 41) | def img_loader(path):
function disparity_loader (line 45) | def disparity_loader(path):
class myDataset (line 49) | class myDataset(data.Dataset):
method __init__ (line 51) | def __init__(self, left, right, left_disp, training, imgloader=img_loa...
method __getitem__ (line 62) | def __getitem__(self, index):
method __len__ (line 116) | def __len__(self):
function horizontal_flip (line 120) | def horizontal_flip(img):
FILE: datasets/readpfm.py
function readPFM (line 6) | def readPFM(file):
FILE: datasets/sceneflow_dataset.py
class SceneFlowDatset (line 14) | class SceneFlowDatset(Dataset):
method __init__ (line 15) | def __init__(self, datapath, list_filename, training):
method load_path (line 20) | def load_path(self, list_filename):
method load_image (line 28) | def load_image(self, filename):
method load_disp (line 31) | def load_disp(self, filename):
method __len__ (line 36) | def __len__(self):
method __getitem__ (line 39) | def __getitem__(self, index):
FILE: datasets/sceneflow_dataset_augmentation.py
class SceneFlowDatset (line 12) | class SceneFlowDatset(Dataset):
method __init__ (line 13) | def __init__(self, datapath, list_filename, training):
method load_path (line 18) | def load_path(self, list_filename):
method load_image (line 26) | def load_image(self, filename):
method load_disp (line 29) | def load_disp(self, filename):
method __len__ (line 34) | def __len__(self):
method __getitem__ (line 37) | def __getitem__(self, index):
FILE: main_kitti.py
function train (line 95) | def train():
function train_sample (line 147) | def train_sample(sample, compute_metrics=False):
function test_sample (line 179) | def test_sample(sample, compute_metrics=True):
FILE: main_sceneflow.py
function train (line 92) | def train():
function train_sample (line 148) | def train_sample(sample, compute_metrics=False):
function test_sample (line 179) | def test_sample(sample, compute_metrics=True):
FILE: models/Fast_ACV.py
class SubModule (line 13) | class SubModule(nn.Module):
method __init__ (line 14) | def __init__(self):
method weight_init (line 17) | def weight_init(self):
class Feature (line 33) | class Feature(SubModule):
method __init__ (line 34) | def __init__(self):
method forward (line 50) | def forward(self, x):
class FeatUp (line 59) | class FeatUp(SubModule):
method __init__ (line 60) | def __init__(self):
method forward (line 70) | def forward(self, featL, featR=None):
class channelAtt (line 88) | class channelAtt(SubModule):
method __init__ (line 89) | def __init__(self, cv_chan, im_chan):
method forward (line 98) | def forward(self, cv, im):
class hourglass (line 104) | class hourglass(nn.Module):
method __init__ (line 105) | def __init__(self, in_channels):
method forward (line 133) | def forward(self, x, imgs):
class hourglass_att (line 150) | class hourglass_att(nn.Module):
method __init__ (line 151) | def __init__(self, in_channels):
method forward (line 179) | def forward(self, x, imgs):
class Fast_ACVNet (line 196) | class Fast_ACVNet(nn.Module):
method __init__ (line 197) | def __init__(self, maxdisp, att_weights_only):
method concat_volume_generator (line 243) | def concat_volume_generator(self, left_input, right_input, disparity_s...
method forward (line 249) | def forward(self, left, right):
FILE: models/Fast_ACV_plus.py
class SubModule (line 13) | class SubModule(nn.Module):
method __init__ (line 14) | def __init__(self):
method weight_init (line 17) | def weight_init(self):
class Feature (line 33) | class Feature(SubModule):
method __init__ (line 34) | def __init__(self):
method forward (line 49) | def forward(self, x):
class FeatUp (line 58) | class FeatUp(SubModule):
method __init__ (line 59) | def __init__(self):
method forward (line 69) | def forward(self, featL, featR=None):
class channelAtt (line 84) | class channelAtt(SubModule):
method __init__ (line 85) | def __init__(self, cv_chan, im_chan):
method forward (line 94) | def forward(self, cv, im):
class hourglass (line 99) | class hourglass(nn.Module):
method __init__ (line 100) | def __init__(self, in_channels):
method forward (line 127) | def forward(self, x, imgs):
class hourglass_att (line 143) | class hourglass_att(nn.Module):
method __init__ (line 144) | def __init__(self, in_channels):
method forward (line 186) | def forward(self, x, imgs):
class Fast_ACVNet_plus (line 212) | class Fast_ACVNet_plus(nn.Module):
method __init__ (line 213) | def __init__(self, maxdisp, att_weights_only):
method concat_volume_generator (line 248) | def concat_volume_generator(self, left_input, right_input, disparity_s...
method forward (line 256) | def forward(self, left, right):
FILE: models/loss.py
function model_loss_train (line 4) | def model_loss_train(disp_ests, disp_gts, img_masks):
function model_loss_test (line 11) | def model_loss_test(disp_ests, disp_gts,img_masks):
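The index above only lists signatures, but the preview of models/loss.py (it opens with `weights = [1.0, ...]`) suggests the usual weighted multi-scale disparity loss. A minimal sketch of that pattern, assuming smooth-L1 per scale and illustrative weights (the repository's exact weights and scale count are not shown here):

```python
import torch
import torch.nn.functional as F

def weighted_disp_loss(disp_ests, disp_gts, img_masks, weights=(1.0, 0.3)):
    """Illustrative weighted multi-scale smooth-L1 disparity loss.

    disp_ests / disp_gts / img_masks are parallel lists with one
    prediction, ground truth, and boolean validity mask per scale.
    """
    total = 0.0
    for w, est, gt, mask in zip(weights, disp_ests, disp_gts, img_masks):
        # Only valid (masked-in) pixels contribute to the loss.
        total = total + w * F.smooth_l1_loss(est[mask], gt[mask],
                                             reduction='mean')
    return total
```

The loss is zero when every masked prediction matches its ground truth, and grows with the per-pixel error at each scale.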
FILE: models/submodule.py
class BasicConv (line 12) | class BasicConv(nn.Module):
method __init__ (line 14) | def __init__(self, in_channels, out_channels, deconv=False, is_3d=Fals...
method forward (line 32) | def forward(self, x):
class Conv2x (line 41) | class Conv2x(nn.Module):
method __init__ (line 43) | def __init__(self, in_channels, out_channels, deconv=False, is_3d=Fals...
method forward (line 68) | def forward(self, x, rem):
function disparity_regression (line 83) | def disparity_regression(x, maxdisp):
function build_concat_volume (line 90) | def build_concat_volume(refimg_fea, targetimg_fea, maxdisp):
function groupwise_correlation (line 104) | def groupwise_correlation(fea1, fea2, num_groups):
function build_gwc_volume (line 112) | def build_gwc_volume(refimg_fea, targetimg_fea, maxdisp, num_groups):
function groupwise_correlation_norm (line 124) | def groupwise_correlation_norm(fea1, fea2, num_groups):
function build_gwc_volume_norm (line 135) | def build_gwc_volume_norm(refimg_fea, targetimg_fea, maxdisp, num_groups):
function norm_correlation (line 148) | def norm_correlation(fea1, fea2):
function build_norm_correlation_volume (line 152) | def build_norm_correlation_volume(refimg_fea, targetimg_fea, maxdisp):
function disparity_variance (line 163) | def disparity_variance(x, maxdisp, disparity):
function SpatialTransformer_grid (line 171) | def SpatialTransformer_grid(x, y, disp_range_samples):
class Propagation (line 196) | class Propagation(nn.Module):
method __init__ (line 197) | def __init__(self):
method forward (line 201) | def forward(self, disparity_samples):
class Propagation_prob (line 216) | class Propagation_prob(nn.Module):
method __init__ (line 217) | def __init__(self):
method forward (line 221) | def forward(self, prob_volume):
function context_upsample (line 236) | def context_upsample(depth_low, up_weights):
function regression_topk (line 251) | def regression_topk(cost, disparity_samples, k):
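Several helpers in models/submodule.py (`disparity_regression`, `regression_topk`) turn a cost/probability volume into a disparity map. The core idea is soft-argmin regression: take the probability-weighted expectation of the candidate disparities. A minimal sketch, assuming the input volume is already softmax-normalized over the disparity dimension (the repository's actual implementation may apply the softmax internally or keep only the top-k candidates):

```python
import torch

def soft_argmin_regression(prob, maxdisp):
    """Illustrative soft-argmin disparity regression.

    prob: [B, D, H, W] probability volume, normalized over dim 1
    (D == maxdisp). Returns expected disparity per pixel, [B, H, W].
    """
    disp_values = torch.arange(maxdisp, dtype=prob.dtype,
                               device=prob.device).view(1, maxdisp, 1, 1)
    # Expectation over disparity candidates: sum_d d * p(d).
    return torch.sum(prob * disp_values, dim=1)
```

With a one-hot volume peaked at disparity d, the output is exactly d; broader distributions yield sub-pixel estimates between candidates.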
FILE: save_disp.py
function test (line 59) | def test():
function test_sample (line 82) | def test_sample(sample):
FILE: test_mid.py
function test_trainset (line 50) | def test_trainset():
FILE: utils/experiment.py
function make_iterative_func (line 13) | def make_iterative_func(func):
function make_nograd_func (line 27) | def make_nograd_func(func):
function tensor2float (line 37) | def tensor2float(vars):
function tensor2numpy (line 47) | def tensor2numpy(vars):
function check_allfloat (line 57) | def check_allfloat(vars):
function save_scalars (line 61) | def save_scalars(logger, mode_tag, scalar_dict, global_step):
function save_images (line 73) | def save_images(logger, mode_tag, images_dict, global_step):
function adjust_learning_rate (line 91) | def adjust_learning_rate(optimizer, epoch, base_lr, lrepochs):
class AverageMeter (line 112) | class AverageMeter(object):
method __init__ (line 113) | def __init__(self):
method update (line 117) | def update(self, x):
method mean (line 122) | def mean(self):
class AverageMeterDict (line 126) | class AverageMeterDict(object):
method __init__ (line 127) | def __init__(self):
method update (line 131) | def update(self, x):
method mean (line 146) | def mean(self):
function get_world_size (line 155) | def get_world_size():
function reduce_scalar_outputs (line 164) | def reduce_scalar_outputs(scalar_outputs):
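The `adjust_learning_rate(optimizer, epoch, base_lr, lrepochs)` helper listed above follows a convention common to GwcNet-derived codebases: `lrepochs` is a string such as `"20,32,40:2"`, meaning "divide the learning rate by 2 at epochs 20, 32, and 40". That exact string format is an assumption here; a sketch of the parsing and stepwise decay:

```python
def adjust_learning_rate(optimizer, epoch, base_lr, lrepochs):
    """Illustrative stepwise LR decay.

    lrepochs is assumed to look like "20,32,40:2": the comma list gives
    the milestone epochs, the value after ':' the downscale factor.
    """
    epochs_str, rate_str = lrepochs.split(':')
    milestones = [int(e) for e in epochs_str.split(',')]
    downscale = float(rate_str)
    lr = base_lr
    for m in milestones:
        if epoch >= m:
            lr /= downscale  # applied once per milestone already passed
    for group in optimizer.param_groups:
        group['lr'] = lr
    return lr
```

At epoch 35 with `"20,32,40:2"`, two milestones have passed, so the rate is `base_lr / 4`.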
FILE: utils/metrics.py
function check_shape_for_metric_computation (line 15) | def check_shape_for_metric_computation(*vars):
function compute_metric_for_each_image (line 22) | def compute_metric_for_each_image(metric_func):
function D1_metric (line 45) | def D1_metric(D_est, D_gt, mask):
function Thres_metric (line 53) | def Thres_metric(D_est, D_gt, mask, thres):
function EPE_metric (line 63) | def EPE_metric(D_est, D_gt, mask):
function D1_metric_mask (line 71) | def D1_metric_mask(D_est, D_gt, mask, mask_img):
function Thres_metric_mask (line 80) | def Thres_metric_mask(D_est, D_gt, mask, thres, mask_img):
function EPE_metric_mask (line 91) | def EPE_metric_mask(D_est, D_gt, mask, mask_img):
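The metrics listed above are the standard stereo-evaluation measures: EPE is the mean absolute disparity error over valid pixels, and D1 (from the KITTI benchmark) is the fraction of valid pixels whose error exceeds both 3 px and 5% of the ground-truth disparity. A minimal sketch matching those definitions (the repository's versions additionally wrap these in per-image and no-grad decorators):

```python
import torch

def epe_metric(d_est, d_gt, mask):
    """End-point error: mean |est - gt| over valid pixels."""
    return torch.mean(torch.abs(d_est[mask] - d_gt[mask]))

def d1_metric(d_est, d_gt, mask):
    """KITTI D1: share of valid pixels whose error exceeds
    both 3 px and 5% of the ground-truth disparity."""
    err = torch.abs(d_est[mask] - d_gt[mask])
    bad = (err > 3.0) & (err > 0.05 * torch.abs(d_gt[mask]))
    return torch.mean(bad.float())
```

For a pixel with ground truth 100 px, an error of 4 px is over the 3 px threshold but under the 5% (5 px) one, so it does not count as a D1 outlier; an error of 6 px does.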
FILE: utils/misc.py
function setup_for_distributed (line 5) | def setup_for_distributed(is_master):
function init_distributed_mode (line 20) | def init_distributed_mode(args):
FILE: utils/visualization.py
function gen_error_colormap (line 11) | def gen_error_colormap():
class disp_error_image_func (line 30) | class disp_error_image_func(Function):
method forward (line 31) | def forward(self, D_est_tensor, D_gt_tensor, abs_thres=3., rel_thres=0...
method backward (line 57) | def backward(self, grad_output):
Condensed preview — 45 files, each showing path, character count, and a content snippet.
[
{
"path": "LICENSE.md",
"chars": 1067,
"preview": "MIT License\n\nCopyright (c) 2022 Gangwei Xu\n\nPermission is hereby granted, free of charge, to any person obtaining a copy"
},
{
"path": "README.md",
"chars": 4420,
"preview": "<p align=\"center\">\n <h1 align=\"center\">Accurate and Efficient Stereo Matching via Attention Concatenation Volume</h1>\n "
},
{
"path": "datasets/MiddleburyLoader.py",
"chars": 7235,
"preview": "import os, torch, torch.utils.data as data\nfrom PIL import Image\nimport numpy as np\nfrom . import flow_transforms\nimport"
},
{
"path": "datasets/__init__.py",
"chars": 172,
"preview": "from .kitti_dataset_1215 import KITTIDataset\nfrom .sceneflow_dataset import SceneFlowDatset\n\n__datasets__ = {\n \"scene"
},
{
"path": "datasets/data_io.py",
"chars": 1601,
"preview": "import numpy as np\nimport re\nimport torchvision.transforms as transforms\n\n\ndef get_transform():\n mean = [0.485, 0.456"
},
{
"path": "datasets/eth3dLoader.py",
"chars": 6514,
"preview": "import os, torch, torch.utils.data as data\nfrom PIL import Image\nimport numpy as np\nfrom . import flow_transforms\nimport"
},
{
"path": "datasets/eth3d_dataset.py",
"chars": 2485,
"preview": "# Copyright Niantic 2020. Patent Pending. All rights reserved.\n#\n# This software is licensed under the terms of the Ster"
},
{
"path": "datasets/flow_transforms.py",
"chars": 3808,
"preview": "from __future__ import division\nimport torch\nimport random\nimport numpy as np\nimport numbers\nimport pdb\nimport cv2\n\n\ncla"
},
{
"path": "datasets/kitti_dataset_1215.py",
"chars": 4779,
"preview": "import os\r\nimport random\r\nfrom torch.utils.data import Dataset\r\nfrom PIL import Image\r\nimport numpy as np\r\nimport cv2\r\nf"
},
{
"path": "datasets/kitti_dataset_1215_augmentation.py",
"chars": 6824,
"preview": "import os\r\nimport random\r\nfrom torch.utils.data import Dataset\r\nfrom PIL import Image\r\nimport numpy as np\r\nimport cv2\r\nf"
},
{
"path": "datasets/kittiraw_loader.py",
"chars": 3025,
"preview": "import os\r\nimport random\r\nfrom torch.utils.data import Dataset\r\nfrom PIL import Image\r\nimport numpy as np\r\nimport cv2\r\nf"
},
{
"path": "datasets/listfiles.py",
"chars": 595,
"preview": "import torch.utils.data as data\n\nimport pdb\nfrom PIL import Image\nimport os\nimport os.path\nimport numpy as np\nimport glo"
},
{
"path": "datasets/middlebury_data.py",
"chars": 3584,
"preview": "import os\r\nimport numpy as np\r\nfrom PIL import Image # using pillow-simd for increased speed\r\nfrom torchvision import t"
},
{
"path": "datasets/middlebury_data_our.py",
"chars": 4703,
"preview": "import os\r\nimport numpy as np\r\nfrom PIL import Image # using pillow-simd for increased speed\r\nfrom torchvision import t"
},
{
"path": "datasets/middlebury_loader.py",
"chars": 4248,
"preview": "import os\nfrom PIL import Image\nfrom datasets import readpfm as rp\nimport torch.utils.data as data\nimport torchvision.tr"
},
{
"path": "datasets/readpfm.py",
"chars": 1234,
"preview": "import re\nimport numpy as np\nimport sys\n \n\ndef readPFM(file):\n file = open(file, 'rb')\n\n color = None\n width = "
},
{
"path": "datasets/sceneflow_dataset.py",
"chars": 4133,
"preview": "import os\r\nimport random\r\nfrom PIL import Image\r\nfrom torch.utils.data import Dataset\r\nimport numpy as np\r\nimport cv2\r\nf"
},
{
"path": "datasets/sceneflow_dataset_augmentation.py",
"chars": 4825,
"preview": "import os\nimport random\nfrom torch.utils.data import Dataset\nfrom PIL import Image\nimport numpy as np\nfrom datasets.data"
},
{
"path": "filenames/kitti12_15_all.txt",
"chars": 38211,
"preview": "training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png\ntraining/image_2/000002_"
},
{
"path": "filenames/kitti12_15_train.txt",
"chars": 34919,
"preview": "training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png\ntraining/image_2/000002_"
},
{
"path": "filenames/kitti12_all.txt",
"chars": 19206,
"preview": "training/colored_0/000000_10.png training/colored_1/000000_10.png training/disp_occ/000000_10.png\r\ntraining/colored_0/00"
},
{
"path": "filenames/kitti12_test.txt",
"chars": 12480,
"preview": "testing/colored_0/000000_10.png testing/colored_1/000000_10.png\ntesting/colored_0/000001_10.png testing/colored_1/000001"
},
{
"path": "filenames/kitti12_train.txt",
"chars": 17640,
"preview": "training/colored_0/000000_10.png training/colored_1/000000_10.png training/disp_occ/000000_10.png\ntraining/colored_0/000"
},
{
"path": "filenames/kitti12_val.txt",
"chars": 1372,
"preview": "training/colored_0/000180_10.png training/colored_1/000180_10.png training/disp_occ/000180_10.png\ntraining/colored_0/000"
},
{
"path": "filenames/kitti15_all.txt",
"chars": 19398,
"preview": "training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png\r\ntraining/image_2/000002"
},
{
"path": "filenames/kitti15_test.txt",
"chars": 12000,
"preview": "testing/image_2/000000_10.png testing/image_3/000000_10.png\ntesting/image_2/000001_10.png testing/image_3/000001_10.png\n"
},
{
"path": "filenames/kitti15_train.txt",
"chars": 17280,
"preview": "training/image_2/000000_10.png training/image_3/000000_10.png training/disp_occ_0/000000_10.png\ntraining/image_2/000002_"
},
{
"path": "filenames/kitti15_val.txt",
"chars": 1920,
"preview": "training/image_2/000001_10.png training/image_3/000001_10.png training/disp_occ_0/000001_10.png\ntraining/image_2/000006_"
},
{
"path": "filenames/sceneflow_test.txt",
"chars": 537510,
"preview": "frames_finalpass/TEST/C/0134/left/0008.png frames_finalpass/TEST/C/0134/right/0008.png disparity/TEST/C/0134/left/0008.p"
},
{
"path": "filenames/sceneflow_train.txt",
"chars": 5253744,
"preview": "frames_finalpass/TRAIN/15mm_focallength/scene_backwards/slow/left/0224.png frames_finalpass/TRAIN/15mm_focallength/scene"
},
{
"path": "filenames/test_eth3d_files.txt",
"chars": 352,
"preview": "terrace_1s\nplayground_3s\nforest_2s\ndelivery_area_1l\nterrains_1l\nelectro_3s\nforest_1s\nterrains_2l\ndelivery_area_2l\nterrac"
},
{
"path": "main_kitti.py",
"chars": 10560,
"preview": "# from __future__ import print_function, division\r\nimport argparse\r\nimport os\r\nimport torch\r\nimport torch.nn as nn\r\nimpo"
},
{
"path": "main_sceneflow.py",
"chars": 10465,
"preview": "# from __future__ import print_function, division\r\nimport argparse\r\nimport os\r\nos.environ['CUDA_VISIBLE_DEVICES'] = '0,1"
},
{
"path": "models/Fast_ACV.py",
"chars": 15540,
"preview": "from __future__ import print_function\r\nimport torch\r\nimport torch.nn as nn\r\nimport torch.utils.data\r\nfrom torch.autograd"
},
{
"path": "models/Fast_ACV_plus.py",
"chars": 15270,
"preview": "from __future__ import print_function\r\nimport torch\r\nimport torch.nn as nn\r\nimport torch.utils.data\r\nfrom torch.autograd"
},
{
"path": "models/__init__.py",
"chars": 222,
"preview": "from .Fast_ACV import Fast_ACVNet\nfrom .Fast_ACV_plus import Fast_ACVNet_plus\nfrom .loss import model_loss_train, model_"
},
{
"path": "models/loss.py",
"chars": 702,
"preview": "import torch.nn.functional as F\nimport torch\n\ndef model_loss_train(disp_ests, disp_gts, img_masks):\n weights = [1.0, "
},
{
"path": "models/submodule.py",
"chars": 9859,
"preview": "from __future__ import print_function\nimport torch\nimport torch.nn as nn\nimport torch.utils.data\nfrom torch.autograd imp"
},
{
"path": "save_disp.py",
"chars": 3588,
"preview": "from __future__ import print_function, division\r\nimport argparse\r\nimport os\r\nimport torch\r\nimport torch.nn as nn\r\nimport"
},
{
"path": "test_mid.py",
"chars": 4162,
"preview": "from __future__ import print_function, division\r\nimport argparse\r\nimport os\r\nimport torch\r\nimport torch.nn as nn\r\nimport"
},
{
"path": "utils/__init__.py",
"chars": 223,
"preview": "from utils.experiment import *\nfrom utils.visualization import *\nfrom utils.metrics import D1_metric, Thres_metric, EPE_"
},
{
"path": "utils/experiment.py",
"chars": 5714,
"preview": "from __future__ import print_function, division\nimport torch\nimport torch.nn as nn\nimport torch.nn.parallel\nimport torch"
},
{
"path": "utils/metrics.py",
"chars": 3524,
"preview": "import torch\nimport torch.nn.functional as F\nfrom utils.experiment import make_nograd_func\nfrom torch.autograd import Va"
},
{
"path": "utils/misc.py",
"chars": 1278,
"preview": "import os\nimport torch\n\n\ndef setup_for_distributed(is_master):\n \"\"\"\n This function disables printing when not in m"
},
{
"path": "utils/visualization.py",
"chars": 2198,
"preview": "from __future__ import print_function\nimport torch\nimport torch.nn as nn\nimport torch.utils.data\nfrom torch.autograd imp"
}
]
About this extraction
This page contains the full source code of the gangweiX/Fast-ACVNet GitHub repository, extracted as plain text: 45 files (5.8 MB, approximately 1.5M tokens) plus a symbol index of 219 extracted functions, classes, methods, constants, and types.