Repository: zju3dv/Coin3D
Branch: main
Commit: 3de63a14eb21
Files: 126
Total size: 1.9 MB
Directory structure:
gitextract_sa0or0_n/
├── CONDITION.md
├── README.md
├── blender_utils/
│ └── render_proxy.py
├── configs/
│ ├── coin3d_train.yaml
│ ├── ctrldemo.yaml
│ ├── nerf.yaml
│ ├── neus.yaml
│ └── syncdreamer.yaml
├── example/
│ ├── panda/
│ │ ├── mesh.obj
│ │ └── proxy.txt
│ ├── pumpkin/
│ │ ├── mesh.obj
│ │ └── proxy.txt
│ ├── teddybear/
│ │ ├── mesh.obj
│ │ └── proxy.txt
│ ├── toycar/
│ │ ├── mesh.obj
│ │ └── proxy.txt
│ └── turtle/
│ ├── mesh.obj
│ └── proxy.txt
├── externs/
│ ├── __init__.py
│ └── pvcnn/
│ └── modules/
│ ├── __init__.py
│ ├── ball_query.py
│ ├── frustum.py
│ ├── functional/
│ │ ├── __init__.py
│ │ ├── backend.py
│ │ ├── ball_query.py
│ │ ├── devoxelization.py
│ │ ├── grouping.py
│ │ ├── interpolatation.py
│ │ ├── loss.py
│ │ ├── sampling.py
│ │ ├── src/
│ │ │ ├── ball_query/
│ │ │ │ ├── ball_query.cpp
│ │ │ │ ├── ball_query.cu
│ │ │ │ ├── ball_query.cuh
│ │ │ │ └── ball_query.hpp
│ │ │ ├── bindings.cpp
│ │ │ ├── cuda_utils.cuh
│ │ │ ├── grouping/
│ │ │ │ ├── grouping.cpp
│ │ │ │ ├── grouping.cu
│ │ │ │ ├── grouping.cuh
│ │ │ │ └── grouping.hpp
│ │ │ ├── interpolate/
│ │ │ │ ├── neighbor_interpolate.cpp
│ │ │ │ ├── neighbor_interpolate.cu
│ │ │ │ ├── neighbor_interpolate.cuh
│ │ │ │ ├── neighbor_interpolate.hpp
│ │ │ │ ├── trilinear_devox.cpp
│ │ │ │ ├── trilinear_devox.cu
│ │ │ │ ├── trilinear_devox.cuh
│ │ │ │ └── trilinear_devox.hpp
│ │ │ ├── sampling/
│ │ │ │ ├── sampling.cpp
│ │ │ │ ├── sampling.cu
│ │ │ │ ├── sampling.cuh
│ │ │ │ └── sampling.hpp
│ │ │ ├── utils.hpp
│ │ │ └── voxelization/
│ │ │ ├── vox.cpp
│ │ │ ├── vox.cu
│ │ │ ├── vox.cuh
│ │ │ └── vox.hpp
│ │ └── voxelization.py
│ ├── loss.py
│ ├── pointnet.py
│ ├── pvconv.py
│ ├── se.py
│ ├── shared_mlp.py
│ └── voxelization.py
├── foreground_segment.py
├── generate.py
├── ldm/
│ ├── DPMPPScheduler.py
│ ├── base_utils.py
│ ├── data/
│ │ ├── __init__.py
│ │ ├── base.py
│ │ ├── coco.py
│ │ ├── control_sync_dreamer.py
│ │ ├── dummy.py
│ │ ├── imagenet.py
│ │ ├── inpainting/
│ │ │ ├── __init__.py
│ │ │ └── synthetic_mask.py
│ │ ├── laion.py
│ │ ├── lsun.py
│ │ ├── nerf_like.py
│ │ ├── simple.py
│ │ └── sync_dreamer.py
│ ├── lr_scheduler.py
│ ├── models/
│ │ ├── autoencoder.py
│ │ └── diffusion/
│ │ ├── __init__.py
│ │ ├── ctrldemo_sync_dreamer.py
│ │ ├── sync_dreamer.py
│ │ ├── sync_dreamer_attention.py
│ │ ├── sync_dreamer_network.py
│ │ └── sync_dreamer_utils.py
│ ├── modules/
│ │ ├── attention.py
│ │ ├── diffusionmodules/
│ │ │ ├── __init__.py
│ │ │ ├── model.py
│ │ │ ├── openaimodel.py
│ │ │ └── util.py
│ │ ├── distributions/
│ │ │ ├── __init__.py
│ │ │ └── distributions.py
│ │ ├── encoders/
│ │ │ ├── __init__.py
│ │ │ └── modules.py
│ │ └── x_transformer.py
│ ├── thirdp/
│ │ └── psp/
│ │ ├── helpers.py
│ │ ├── id_loss.py
│ │ └── model_irse.py
│ ├── typing.py
│ └── util.py
├── meta_info/
│ └── camera-16.pkl
├── misc.ipynb
├── raymarching/
│ ├── __init__.py
│ ├── backend.py
│ ├── raymarching.py
│ ├── setup.py
│ └── src/
│ ├── bindings.cpp
│ ├── raymarching.cu
│ └── raymarching.h
├── renderer/
│ ├── agg_net.py
│ ├── cost_reg_net.py
│ ├── dummy_dataset.py
│ ├── feature_net.py
│ ├── neus_networks.py
│ ├── ngp_renderer.py
│ └── renderer.py
├── requirements.txt
├── train_diffusion.py
├── train_renderer.py
└── workflow/
├── Coin3D_condition_workflow.json
├── Coin3D_condition_workflow_api.json
└── inference_comfyui_api.py
================================================
FILE CONTENTS
================================================
================================================
FILE: CONDITION.md
================================================
### Prepare condition image for inference
1. Render the proxy image
First, download [Blender](https://www.blender.org/) and extract it to any directory. We use [Blender 3.6](https://download.blender.org/release/Blender3.6/blender-3.6.12-linux-x64.tar.xz).
```
path/to/your/blender -b -P blender_utils/render_proxy.py -- --obj_path example/teddybear/mesh.obj
```
The example rendered result can be found in this [image](example/teddybear/condition.png). Optionally, if you use Blender to construct a coarse proxy, assigning a different base color to each primitive can improve soft-edge extraction in ControlNet.
2. Use ComfyUI and ControlNet to construct the condition image
**We have prepared a [workflow](workflow/Coin3D_condition_workflow.json) that uses a depth condition and a soft-edge condition. You can drag the workflow into the ComfyUI interface to use it. Other ControlNet conditions can be used as needed.**
The usage of ComfyUI is documented [here](https://github.com/comfyanonymous/ComfyUI).
To run the above workflow, you need to download the following pretrained models. Here we use the [Disney](https://civitai.com/models/65203/disney-pixar-cartoon-type-a) style base model as an example.
```
mkdir interactive_workflow
cd interactive_workflow
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI/custom_nodes
git clone https://github.com/BlenderNeko/ComfyUI_ADV_CLIP_emb.git
git clone https://github.com/Fannovel16/comfyui_controlnet_aux.git
cd ../models/checkpoints/
wget https://huggingface.co/BitStarWalkin/RPG_models/resolve/main/disneyPixarCartoon_v10.safetensors
cd ../controlnet/
wget https://huggingface.co/lllyasviel/ControlNet-v1-1/resolve/main/control_v11f1p_sd15_depth.pth
wget https://huggingface.co/lllyasviel/ControlNet-v1-1/resolve/main/control_v11p_sd15_softedge.pth
```
Make sure ComfyUI has the following custom nodes and pretrained models:
```bash
ComfyUI
|-- models
|   |-- checkpoints
|   |   |-- disneyPixarCartoon_v10.safetensors
|   |-- controlnet
|       |-- control_v11f1p_sd15_depth.pth
|       |-- control_v11p_sd15_softedge.pth
|-- custom_nodes
    |-- ComfyUI_ADV_CLIP_emb
    |-- comfyui_controlnet_aux
```
After loading the workflow, **load the rendered proxy image** into the Load Image node, **set the text prompt** in the CLIP Text Encode (Advanced) node, and finally **adjust the ControlNet parameters** in the Apply ControlNet node.
Apply ControlNet Parameters Explanation:
- `strength` controls how strongly ControlNet influences the generated image, usually in the range 0 to 1: at `1`, ControlNet fully controls the generation process; at `0`, it does not participate at all.
- `start_percent` Specifies the point in the generation process (as a percentage) when ControlNet starts to take effect. For example, a value of 0.3 means ControlNet starts influencing at 30% of the process. **Normally this parameter is set to 0.**
- `end_percent` Specifies the point in the generation process (as a percentage) when ControlNet stops taking effect. For example, a value of 0.7 means ControlNet stops influencing at 70% of the process.
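As a rough illustration of how `start_percent` and `end_percent` bound the sampler steps in which ControlNet is active (the rounding behavior below is an assumption for illustration; ComfyUI's internal scheduling may differ slightly):

```python
def controlnet_active_steps(total_steps: int, start_percent: float, end_percent: float) -> range:
    """Sampler step indices during which ControlNet guidance is applied."""
    start = int(round(total_steps * start_percent))
    end = int(round(total_steps * end_percent))
    return range(start, end)

# With 20 sampler steps, start_percent=0.0 and end_percent=0.7,
# ControlNet guides steps 0-13 and the final 6 steps run unguided.
```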
**ComfyUI API**
You can also obtain the condition image through the ComfyUI API; [inference_comfyui_api.py](workflow/inference_comfyui_api.py) contains example code based on the above [workflow](workflow/Coin3D_condition_workflow.json).
First, start ComfyUI:
```
python3 main.py --port 6621
```
After that, run
```
python3 inference_comfyui_api.py
```
### Acknowledgement
We deeply appreciate the authors of the following repositories for generously sharing their code, on which this project builds:
- [ComfyUI](https://github.com/comfyanonymous/ComfyUI)
================================================
FILE: README.md
================================================
# Coin3D: Controllable and Interactive 3D Assets Generation with Proxy-Guided Conditioning
### [Project Page](https://zju3dv.github.io/coin3d/) | [Video](https://www.youtube.com/watch?v=d6p3LLbmOnI) | [Paper](https://arxiv.org/abs/2405.08054)
### ToDo List
- [x] Inference code and pretrained models.
- [ ] Interactive workflow.
- [x] Training data.
- [ ] Blender Addons
### Preparation for inference
1. Install the packages in `requirements.txt`. We tested our model on an A100-80G GPU with CUDA 11.8 and PyTorch 2.0.1.
```bash
conda create -n coin3d
conda activate coin3d
pip install -r requirements.txt
```
2. Download checkpoints
```
mkdir ckpt
cd ckpt
wget https://huggingface.co/WenqiDong/Coin3D-v1/resolve/main/ViT-L-14.pt
wget https://huggingface.co/WenqiDong/Coin3D-v1/resolve/main/model.ckpt
```
### Inference
1. Make sure you have the following models.
```bash
Coin3D
|-- ckpt
|-- ViT-L-14.pt
|-- model.ckpt
```
2. We provide a workflow that uses a custom mesh and a text prompt to generate the input image; see [these instructions](CONDITION.md).
3. (Optional) Make sure the input image has a white background, with the predicted foreground mask stored as the alpha channel. Following [SyncDreamer](https://github.com/liuyuan-pal/SyncDreamer), we use [Paint3D](https://apps.microsoft.com/store/detail/paint-3d/9NBLGGH5FV99) to segment the foreground object interactively.
We also provide a script `foreground_segment.py` that uses `carvekit` to predict foreground masks. Crop the object region before feeding the image to `foreground_segment.py`, and double-check that the predicted masks are correct.
```bash
python3 foreground_segment.py --input <input.png> --output <output.png>
```
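For reference, composing the predicted mask into the RGBA input that `generate.py` expects (object blended over a white background, mask stored as alpha) can be sketched as follows; the function name and file layout are illustrative, not part of the repository:

```python
import numpy as np
from PIL import Image

def compose_rgba(image_path: str, mask_path: str, out_path: str) -> None:
    """Blend the object over a white background and store the mask as alpha."""
    rgb = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float32)
    alpha = np.asarray(Image.open(mask_path).convert("L"), dtype=np.float32) / 255.0
    # Force background pixels to white, as the inference pipeline expects.
    blended = rgb * alpha[..., None] + 255.0 * (1.0 - alpha[..., None])
    rgba = np.dstack([blended, alpha * 255.0]).astype(np.uint8)
    Image.fromarray(rgba, "RGBA").save(out_path)
```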
4. Use the coarse proxy to control the generation of multi-view images.
```bash
python3 generate.py \
--cfg configs/ctrldemo.yaml \
--ckpt ckpt/model.ckpt \
--input example/panda/input.png \
--input_proxy example/panda/proxy.txt \
--output output/custom \
--sample_num 1 \
--cfg_scale 2.0 \
--elevation 30 \
--ctrl_end_step 1.0 \
--sampler ddim_demo
```
Explanation:
- `--cfg` is the model configuration.
- `--ckpt` is the checkpoint to load.
- `--input` is the input image in RGBA format; the alpha channel is the foreground object mask.
- `--input_proxy` is the input coarse proxy, containing 256 points by default. [misc.ipynb](misc.ipynb) contains code for sampling a proxy from a coarse mesh.
- `--output` is the output directory. Results will be saved to `output/custom/0.png`; each `png` file contains 16 images of predefined viewpoints.
- `--sample_num` is the number of instances we will generate.
- `--cfg_scale` is the *classifier-free guidance* scale; `2.0` works for most cases.
- `--elevation` is the elevation angle of the input image in degrees; it must be set to 30.
- `--ctrl_end_step` is the timestep at which 3D control ends, ranging from `0` to `1.0`; usually set between `0.6` and `1.0`.
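[misc.ipynb](misc.ipynb) contains the repository's actual proxy-extraction code; for orientation, area-weighted sampling of 256 surface points from a coarse triangle mesh can be sketched in plain NumPy (the function name and exact sampling scheme here are illustrative assumptions, and whether `np.savetxt` output matches the `proxy.txt` format exactly should be checked against the notebook):

```python
import numpy as np

def sample_proxy_points(vertices, faces, n=256, seed=0):
    """Area-weighted uniform sampling of n points on a triangle mesh surface."""
    rng = np.random.default_rng(seed)
    tris = np.asarray(vertices)[np.asarray(faces)]            # (F, 3, 3)
    cross = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
    areas = 0.5 * np.linalg.norm(cross, axis=1)               # per-face area
    face = rng.choice(len(tris), size=n, p=areas / areas.sum())
    # Uniform barycentric coordinates (fold the unit square onto the triangle).
    u, v = rng.random(n), rng.random(n)
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    t = tris[face]
    return (1 - u - v)[:, None] * t[:, 0] + u[:, None] * t[:, 1] + v[:, None] * t[:, 2]
```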
5. Run a NeuS or a NeRF for 3D reconstruction.
```bash
# train a neus
python3 train_renderer.py -i output/custom/0.png \
-n custom-neus \
-b configs/neus.yaml \
-l output/renderer
# train a nerf
python3 train_renderer.py -i output/custom/0.png \
-n custom-nerf \
-b configs/nerf.yaml \
-l output/renderer
```
Explanation:
- `-i` is the set of multi-view images generated by SyncDreamer. Since SyncDreamer does not always produce good results, you may need to select a good generated image set (from `0.png` to `3.png`) for reconstruction.
- `-n` is the run name and `-l` is the log directory. Results will be saved to `<log_dir>/<name>`, i.e. `output/renderer/custom-neus` and `output/renderer/custom-nerf`.
### Dataset
We train the model on the [Objaverse LVIS](https://objaverse.allenai.org/docs/objaverse-1.0/) dataset. The preprocessed data can be found [here](https://huggingface.co/datasets/WenqiDong/Coin3D_Objaverse_LVIS). We use the multi-view rendering script from [SyncDreamer](https://github.com/liuyuan-pal/SyncDreamer). The script for extracting proxies from objects can be found in [misc.ipynb](misc.ipynb).
### Training
Please note that you need to set the data directory location in the [config file](configs/coin3d_train.yaml).
```
target_dir: path/to/renderings-v1 # renderings of target views
input_dir: path/to/renderings-random # renderings of input views
proxy_dir: path/to/proxy_256 # proxies of target objects
```
```bash
python3 train_diffusion.py -b configs/coin3d_train.yaml \
--finetune_from ckpt/syncdreamer-pretrain.ckpt \
-l ./logs/coin3d \
-c ./ckpt/coin3d \
--gpus 0
```
## Acknowledgement
We deeply appreciate the authors of the following repositories for generously sharing their code, on which this project builds:
- [SyncDreamer](https://github.com/liuyuan-pal/SyncDreamer)
- [latent-diffusion](https://github.com/CompVis/latent-diffusion)
- [threestudio](https://github.com/threestudio-project/threestudio)
- [pvcnn](https://github.com/mit-han-lab/pvcnn)
- [stable diffusion](https://github.com/CompVis/stable-diffusion)
- [zero123](https://github.com/cvlab-columbia/zero123)
- [COLMAP](https://colmap.github.io/)
- [NeuS](https://github.com/Totoro97/NeuS)
## Citation
If you find this repository useful in your project, please cite the following work. :)
```
@article{dong2024coin3d,
title={Coin3D: Controllable and Interactive 3D Assets Generation with Proxy-Guided Conditioning},
author={Dong, Wenqi and Yang, Bangbang and Ma, Lin and Liu, Xiao and Cui, Liyuan and Bao, Hujun and Ma, Yuewen and Cui, Zhaopeng},
year={2024},
eprint={2405.08054},
archivePrefix={arXiv},
primaryClass={cs.GR}
}
```
================================================
FILE: blender_utils/render_proxy.py
================================================
import argparse
import math
import os
import sys

import bpy
import numpy as np
from mathutils import Vector
def load_object(object_path: str) -> None:
"""Loads a 3D model into the scene."""
if object_path.endswith(".glb"):
bpy.ops.import_scene.gltf(filepath=object_path, merge_vertices=True)
elif object_path.endswith(".fbx"):
bpy.ops.import_scene.fbx(filepath=object_path)
elif object_path.endswith(".obj"):
bpy.ops.import_scene.obj(filepath=object_path)
else:
raise ValueError(f"Unsupported file type: {object_path}")
def az_el_to_points(azimuths, elevations):
x = np.cos(azimuths)*np.cos(elevations)
y = np.sin(azimuths)*np.cos(elevations)
z = np.sin(elevations)
return np.stack([x,y,z],-1) #
def points_to_az_el_dist(location):
dist = np.linalg.norm(location)
location /= dist
x, y, z = location
ele = np.arcsin(z)
azi = np.arctan2(y, x)
return azi, ele, dist
def set_camera(camera, az, el, dist):
# az, el in degree
distances = dist
azimuths = np.deg2rad(az).astype(np.float32)
elevations = np.deg2rad(el).astype(np.float32)
cam_pts = az_el_to_points(azimuths, elevations) * distances
x, y, z = cam_pts
camera.location = x, y, z
def get_camera(camera_name):
context = bpy.context
scene = context.scene
# Get or create the Empty object used as the camera's track target
empty = scene.objects.get("Empty")
if empty is None:
empty = bpy.data.objects.new("Empty", None)
scene.collection.objects.link(empty)
# Try to fetch the camera
camera = scene.objects.get(camera_name)
if camera is None:
# Create a new camera
camera_data = bpy.data.cameras.new(name=camera_name)
camera = bpy.data.objects.new(camera_name, camera_data)
context.collection.objects.link(camera)
camera.location = (0, 1.2, 0)
camera.data.lens = 35
camera.data.sensor_width = 32
constrs = camera.constraints.get('Track To', None)
if constrs is None:
cam_constraint = camera.constraints.new(type='TRACK_TO')
cam_constraint.track_axis = 'TRACK_NEGATIVE_Z'
cam_constraint.up_axis = 'UP_Y'
# Set the constraint target
cam_constraint.target = empty
return camera
def init_global(context):
scene = context.scene
render = scene.render
cam = get_camera("Camera")
set_camera(camera=cam, az=0, el=30, dist=1.5)
bpy.context.scene.camera = cam
render.engine = "CYCLES"
render.image_settings.file_format = "PNG"
render.image_settings.color_mode = "RGBA"
render.resolution_x = 256
render.resolution_y = 256
render.resolution_percentage = 100
scene.cycles.device = "GPU"
scene.cycles.samples = 128
scene.cycles.diffuse_bounces = 1
scene.cycles.glossy_bounces = 1
scene.cycles.transparent_max_bounces = 3
scene.cycles.transmission_bounces = 3
scene.cycles.filter_width = 0.01
scene.cycles.use_denoising = True
scene.render.film_transparent = True
bpy.context.preferences.addons["cycles"].preferences.get_devices()
# Set the device_type
bpy.context.preferences.addons["cycles"].preferences.compute_device_type = "NONE" # set to "CUDA" or "OPENCL" to render on the GPU
bpy.context.scene.cycles.tile_size = 8192
world_tree = bpy.context.scene.world.node_tree
back_node = world_tree.nodes['Background']
env_light = 0.5
back_node.inputs['Color'].default_value = Vector([env_light, env_light, env_light, 1.0])
back_node.inputs['Strength'].default_value = 1.0
def reset_scene() -> None:
"""Resets the scene to a clean state."""
# delete everything that isn't part of a camera or a light
for obj in bpy.data.objects:
if obj.type not in {"CAMERA", "LIGHT"}:
bpy.data.objects.remove(obj, do_unlink=True)
# delete all the materials
for material in bpy.data.materials:
bpy.data.materials.remove(material, do_unlink=True)
# delete all the textures
for texture in bpy.data.textures:
bpy.data.textures.remove(texture, do_unlink=True)
# delete all the images
for image in bpy.data.images:
bpy.data.images.remove(image, do_unlink=True)
def scene_root_objects():
for obj in bpy.context.scene.objects.values():
if not obj.parent and isinstance(obj.data, (bpy.types.Mesh, bpy.types.Light)):
yield obj
def scene_meshes():
for obj in bpy.context.scene.objects.values():
if isinstance(obj.data, (bpy.types.Mesh)):
yield obj
def selected_root_objects():
for obj in bpy.context.selected_objects:
if not obj.parent and isinstance(obj.data, (bpy.types.Mesh, bpy.types.Light)):
yield obj
def selected_meshes():
for obj in bpy.context.selected_objects:
if isinstance(obj.data, (bpy.types.Mesh)):
yield obj
def selected_objects_bbox(single_obj=None, ignore_matrix=False):
bbox_min = (math.inf,) * 3
bbox_max = (-math.inf,) * 3
found = False
for obj in scene_meshes() if single_obj is None else [single_obj]:
found = True
for coord in obj.bound_box:
coord = Vector(coord)
if not ignore_matrix:
coord = obj.matrix_world @ coord
bbox_min = tuple(min(x, y) for x, y in zip(bbox_min, coord))
bbox_max = tuple(max(x, y) for x, y in zip(bbox_max, coord))
if not found:
raise RuntimeError("no objects in scene to compute bounding box for")
return Vector(bbox_min), Vector(bbox_max)
def scene_bbox(single_obj=None, ignore_matrix=False):
bbox_min = (math.inf,) * 3
bbox_max = (-math.inf,) * 3
found = False
for obj in scene_meshes() if single_obj is None else [single_obj]:
found = True
for coord in obj.bound_box:
coord = Vector(coord)
if not ignore_matrix:
coord = obj.matrix_world @ coord
bbox_min = tuple(min(x, y) for x, y in zip(bbox_min, coord))
bbox_max = tuple(max(x, y) for x, y in zip(bbox_max, coord))
if not found:
raise RuntimeError("no objects in scene to compute bounding box for")
return Vector(bbox_min), Vector(bbox_max)
def normalize_scene():
bbox_min, bbox_max = scene_bbox()
scale = 1 / max(bbox_max - bbox_min)
for obj in scene_root_objects():
obj.scale = obj.scale * scale
obj.location=obj.location*scale
# Apply scale to matrix_world.
bpy.context.view_layer.update()
bbox_min, bbox_max = scene_bbox()
offset = -(bbox_min + bbox_max) / 2
for obj in scene_root_objects():
obj.matrix_world.translation += offset
bpy.ops.object.select_all(action="DESELECT")
if __name__ == "__main__":
parser = argparse.ArgumentParser(
description="Render mesh minimal scripts"
)
parser.add_argument('--obj_path', type=str)
parser.add_argument(
"--outdir",
default="./render_result",
type=str,
help="Path to save sampled data."
)
argv = sys.argv[sys.argv.index("--") + 1 :]
args = parser.parse_args(argv)
if not os.path.exists(args.outdir):
os.makedirs(args.outdir)
reset_scene()
init_global(bpy.context)
load_object(args.obj_path)
normalize_scene()
output_path = os.path.join(args.outdir, "condition.png")
bpy.context.scene.render.filepath = (output_path)
bpy.ops.render.render(write_still=True)
================================================
FILE: configs/coin3d_train.yaml
================================================
model:
base_learning_rate: 5.0e-05
target: ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo
params:
view_num: 16
image_size: 256
cfg_scale: 2.0
output_num: 8
batch_view_num: 4
finetune_unet: false
finetune_projection: false
drop_conditions: false
clip_image_encoder_path: ckpt/ViT-L-14.pt
feature_scale: 1
scheduler_config: # 10000 warmup steps
target: ldm.lr_scheduler.LambdaLinearScheduler
params:
warm_up_steps: [ 100 ]
cycle_lengths: [ 100000 ]
f_start: [ 0.02 ]
f_max: [ 1.0 ]
f_min: [ 1.0 ]
unet_config:
target: ldm.models.diffusion.sync_dreamer_attention.DepthWiseAttention
params:
volume_dims: [64, 128, 256, 512]
image_size: 32
in_channels: 8
out_channels: 4
model_channels: 320
attention_resolutions: [ 4, 2, 1 ]
num_res_blocks: 2
channel_mult: [ 1, 2, 4, 4 ]
num_heads: 8
use_spatial_transformer: True
transformer_depth: 1
context_dim: 768
use_checkpoint: True
legacy: False
data:
target: ldm.data.control_sync_dreamer.ControlSyncDreamerDataset
params:
target_dir: path/to/renderings-v1 # renderings of target views
input_dir: path/to/renderings-random # renderings of input views
proxy_dir: path/to/proxy_256 # proxies of target objects
validation_dir: path/to/renderings-v1 # directory of validation data
uid_set_pkl: path/to/proxy_256/train.pkl # a list of uids
valid_uid_set_pkl: path/to/proxy_256/test.pkl # a list of uids
batch_size: 8 # batch size for a single gpu
num_workers: 8
lightning:
modelcheckpoint:
params:
every_n_train_steps: 1000 # we will save models every 1k steps
callbacks:
{}
trainer:
benchmark: True
val_check_interval: 100000 # run validation every 100k steps; validation outputs images to <log_dir>/<name>/val
num_sanity_val_steps: 0
check_val_every_n_epoch: null
# max_epochs: 10000
================================================
FILE: configs/ctrldemo.yaml
================================================
model:
base_learning_rate: 5.0e-05
target: ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo
params:
view_num: 16
image_size: 256
cfg_scale: 2.0
output_num: 8
batch_view_num: 4
finetune_unet: false
finetune_projection: false
drop_conditions: false
clip_image_encoder_path: ckpt/ViT-L-14.pt
scheduler_config: # 10000 warmup steps
target: ldm.lr_scheduler.LambdaLinearScheduler
params:
warm_up_steps: [ 100 ]
cycle_lengths: [ 100000 ]
f_start: [ 0.02 ]
f_max: [ 1.0 ]
f_min: [ 1.0 ]
unet_config:
target: ldm.models.diffusion.sync_dreamer_attention.DepthWiseAttention
params:
volume_dims: [64, 128, 256, 512]
image_size: 32
in_channels: 8
out_channels: 4
model_channels: 320
attention_resolutions: [ 4, 2, 1 ]
num_res_blocks: 2
channel_mult: [ 1, 2, 4, 4 ]
num_heads: 8
use_spatial_transformer: True
transformer_depth: 1
context_dim: 768
use_checkpoint: True
legacy: False
data: {}
lightning:
trainer: {}
================================================
FILE: configs/nerf.yaml
================================================
model:
base_lr: 1.0e-2
target: renderer.renderer.RendererTrainer
params:
total_steps: 2000
warm_up_steps: 100
train_batch_num: 40960
test_batch_num: 40960
renderer: ngp
cube_bound: 0.6
use_mask: true
lambda_rgb_loss: 0.5
lambda_mask_loss: 10.0
data:
target: renderer.dummy_dataset.DummyDataset
params: {}
callbacks:
save_interval: 5000
trainer:
val_check_interval: 500
max_steps: 2000
================================================
FILE: configs/neus.yaml
================================================
model:
base_lr: 5.0e-4
target: renderer.renderer.RendererTrainer
params:
total_steps: 2000
warm_up_steps: 100
train_batch_num: 3584
train_batch_fg_num: 512
test_batch_num: 4096
use_mask: true
lambda_rgb_loss: 0.5
lambda_mask_loss: 1.0
lambda_eikonal_loss: 0.1
use_warm_up: true
data:
target: renderer.dummy_dataset.DummyDataset
params: {}
callbacks:
save_interval: 500
trainer:
val_check_interval: 500
max_steps: 2000
================================================
FILE: configs/syncdreamer.yaml
================================================
model:
base_learning_rate: 5.0e-05
target: ldm.models.diffusion.sync_dreamer.SyncMultiviewDiffusion
params:
view_num: 16
image_size: 256
cfg_scale: 2.0
output_num: 8
batch_view_num: 4
finetune_unet: false
finetune_projection: false
drop_conditions: false
clip_image_encoder_path: ckpt/ViT-L-14.pt
scheduler_config: # 10000 warmup steps
target: ldm.lr_scheduler.LambdaLinearScheduler
params:
warm_up_steps: [ 100 ]
cycle_lengths: [ 100000 ]
f_start: [ 0.02 ]
f_max: [ 1.0 ]
f_min: [ 1.0 ]
unet_config:
target: ldm.models.diffusion.sync_dreamer_attention.DepthWiseAttention
params:
volume_dims: [64, 128, 256, 512]
image_size: 32
in_channels: 8
out_channels: 4
model_channels: 320
attention_resolutions: [ 4, 2, 1 ]
num_res_blocks: 2
channel_mult: [ 1, 2, 4, 4 ]
num_heads: 8
use_spatial_transformer: True
transformer_depth: 1
context_dim: 768
use_checkpoint: True
legacy: False
data: {}
lightning:
trainer: {}
================================================
FILE: example/panda/mesh.obj
================================================
# Blender 3.6.1
# www.blender.org
o Sphere
v -0.090264 0.462556 -0.155620
v -0.090264 0.401258 -0.226042
v -0.090264 0.321167 -0.264155
v -0.090264 0.277823 -0.269059
v -0.090264 0.234478 -0.264155
v -0.090264 0.154388 -0.226042
v -0.081254 0.495731 -0.062652
v -0.072590 0.483088 -0.109615
v -0.064606 0.462556 -0.152895
v -0.057607 0.434926 -0.190831
v -0.051864 0.401258 -0.221964
v -0.047596 0.362846 -0.245099
v -0.044967 0.321167 -0.259344
v -0.044080 0.277823 -0.264155
v -0.044967 0.234478 -0.259344
v -0.047596 0.192799 -0.245099
v -0.051864 0.154388 -0.221964
v -0.057607 0.120720 -0.190831
v -0.064606 0.093089 -0.152895
v -0.072590 0.072558 -0.109615
v -0.081254 0.059914 -0.062652
v -0.072590 0.495731 -0.059819
v -0.055596 0.483088 -0.104056
v -0.039933 0.462556 -0.144826
v -0.026205 0.434926 -0.180561
v -0.014938 0.401258 -0.209887
v -0.006567 0.362846 -0.231679
v -0.001411 0.321167 -0.245099
v 0.000329 0.277823 -0.249630
v -0.001411 0.234478 -0.245099
v -0.006567 0.192799 -0.231679
v -0.014938 0.154388 -0.209887
v -0.026205 0.120720 -0.180561
v -0.039933 0.093089 -0.144826
v -0.055596 0.072558 -0.104056
v -0.072590 0.059914 -0.059819
v -0.064606 0.495731 -0.055217
v -0.039933 0.483088 -0.095030
v -0.017195 0.462556 -0.131721
v 0.002736 0.434926 -0.163882
v 0.019092 0.401258 -0.190275
v 0.031246 0.362846 -0.209887
v 0.038730 0.321167 -0.221964
v 0.041257 0.277823 -0.226042
v 0.038730 0.234478 -0.221964
v 0.031246 0.192799 -0.209887
v 0.019092 0.154388 -0.190275
v 0.002736 0.120720 -0.163882
v -0.017195 0.093089 -0.131721
v -0.039933 0.072558 -0.095030
v -0.064606 0.059914 -0.055217
v -0.057607 0.495731 -0.049024
v -0.026205 0.483088 -0.082882
v 0.002736 0.462556 -0.114086
v 0.028102 0.434926 -0.141436
v 0.048920 0.401258 -0.163882
v 0.064389 0.362846 -0.180561
v 0.073915 0.321167 -0.190831
v 0.077131 0.277823 -0.194299
v 0.073915 0.234478 -0.190831
v 0.064389 0.192799 -0.180561
v 0.048920 0.154388 -0.163882
v 0.028102 0.120720 -0.141436
v 0.002736 0.093089 -0.114086
v -0.026205 0.072558 -0.082882
v -0.057607 0.059914 -0.049024
v -0.051864 0.495731 -0.041478
v -0.014938 0.483088 -0.068080
v 0.019092 0.462556 -0.092597
v 0.048920 0.434926 -0.114086
v 0.073399 0.401258 -0.131721
v 0.091589 0.362846 -0.144826
v 0.102790 0.321167 -0.152895
v 0.106572 0.277823 -0.155620
v 0.102790 0.234478 -0.152895
v 0.091589 0.192799 -0.144826
v 0.073399 0.154388 -0.131721
v 0.048920 0.120720 -0.114086
v 0.019092 0.093089 -0.092597
v -0.014938 0.072558 -0.068080
v -0.051864 0.059914 -0.041478
v -0.090264 0.500000 -0.013813
v -0.047596 0.495731 -0.032869
v -0.006567 0.483088 -0.051193
v 0.031246 0.462556 -0.068080
v 0.064389 0.434926 -0.082882
v 0.091589 0.401258 -0.095030
v 0.111800 0.362846 -0.104056
v 0.124246 0.321167 -0.109615
v 0.128448 0.277823 -0.111491
v 0.124246 0.234478 -0.109615
v 0.111800 0.192799 -0.104056
v 0.091589 0.154388 -0.095030
v 0.064389 0.120720 -0.082882
v 0.031246 0.093089 -0.068080
v -0.006567 0.072558 -0.051193
v -0.047596 0.059914 -0.032869
v -0.044967 0.495731 -0.023528
v -0.001411 0.483088 -0.032869
v 0.038730 0.462556 -0.041478
v 0.073915 0.434926 -0.049024
v 0.102790 0.401258 -0.055217
v 0.124246 0.362846 -0.059819
v 0.137458 0.321167 -0.062652
v 0.141920 0.277823 -0.063609
v 0.137458 0.234478 -0.062652
v 0.124246 0.192799 -0.059819
v 0.102790 0.154388 -0.055217
v 0.073915 0.120720 -0.049024
v 0.038730 0.093089 -0.041478
v -0.001411 0.072558 -0.032869
v -0.044967 0.059914 -0.023528
v -0.044080 0.495731 -0.013813
v 0.000329 0.483088 -0.013813
v 0.041257 0.462556 -0.013813
v 0.077131 0.434926 -0.013813
v 0.106572 0.401258 -0.013813
v 0.128448 0.362846 -0.013813
v 0.141920 0.321167 -0.013813
v 0.146468 0.277823 -0.013813
v 0.141920 0.234478 -0.013813
v 0.128448 0.192799 -0.013813
v 0.106572 0.154388 -0.013813
v 0.077131 0.120720 -0.013813
v 0.041257 0.093089 -0.013813
v 0.000329 0.072558 -0.013813
v -0.044080 0.059914 -0.013813
v -0.044967 0.495731 -0.004098
v -0.001411 0.483088 0.005243
v 0.038730 0.462556 0.013852
v 0.073915 0.434926 0.021398
v 0.102790 0.401258 0.027591
v 0.124246 0.362846 0.032193
v 0.137458 0.321167 0.035026
v 0.141920 0.277823 0.035983
v 0.137458 0.234478 0.035026
v 0.124246 0.192799 0.032193
v 0.102790 0.154388 0.027591
v 0.073915 0.120720 0.021398
v 0.038730 0.093089 0.013852
v -0.001411 0.072558 0.005243
v -0.044967 0.059914 -0.004098
v -0.047596 0.495731 0.005243
v -0.006567 0.483088 0.023567
v 0.031246 0.462556 0.040454
v 0.064389 0.434926 0.055256
v 0.091589 0.401258 0.067404
v 0.111800 0.362846 0.076430
v 0.124246 0.321167 0.081989
v 0.128448 0.277823 0.083866
v 0.124246 0.234478 0.081989
v 0.111800 0.192799 0.076430
v 0.091589 0.154388 0.067404
v 0.064389 0.120720 0.055256
v 0.031246 0.093089 0.040454
v -0.006567 0.072558 0.023567
v -0.047596 0.059914 0.005243
v -0.051864 0.495731 0.013852
v -0.014938 0.483088 0.040454
v 0.019092 0.462556 0.064971
v 0.048920 0.434926 0.086460
v 0.073399 0.401258 0.104095
v 0.091589 0.362846 0.117200
v 0.102790 0.321167 0.125269
v 0.106572 0.277823 0.127994
v 0.102790 0.234478 0.125269
v 0.091589 0.192799 0.117200
v 0.073399 0.154388 0.104095
v 0.048920 0.120720 0.086460
v 0.019092 0.093089 0.064971
v -0.014938 0.072558 0.040454
v -0.051864 0.059914 0.013852
v -0.057607 0.495731 0.021398
v -0.026205 0.483088 0.055256
v 0.002736 0.462556 0.086460
v 0.028102 0.434926 0.113810
v 0.048920 0.401258 0.136256
v 0.064389 0.362846 0.152935
v 0.073915 0.321167 0.163205
v 0.077131 0.277823 0.166673
v 0.073915 0.234478 0.163205
v 0.064389 0.192799 0.152935
v 0.048920 0.154388 0.136256
v 0.028102 0.120720 0.113810
v 0.002736 0.093089 0.086460
v -0.026205 0.072558 0.055256
v -0.057607 0.059914 0.021398
v -0.064606 0.495731 0.027591
v -0.039933 0.483088 0.067404
v -0.017195 0.462556 0.104095
v 0.002736 0.434926 0.136256
v 0.019092 0.401258 0.162649
v 0.031246 0.362846 0.182261
v 0.038730 0.321167 0.194339
v 0.041257 0.277823 0.198416
v 0.038730 0.234478 0.194339
v 0.031246 0.192799 0.182261
v 0.019092 0.154388 0.162649
v 0.002736 0.120720 0.136256
v -0.017195 0.093089 0.104095
v -0.039933 0.072558 0.067404
v -0.064606 0.059914 0.027591
v -0.072590 0.495731 0.032193
v -0.055596 0.483088 0.076430
v -0.039933 0.462556 0.117200
v -0.026205 0.434926 0.152935
v -0.014938 0.401258 0.182261
v -0.006567 0.362846 0.204053
v -0.001411 0.321167 0.217473
v 0.000329 0.277823 0.222004
v -0.001411 0.234478 0.217473
v -0.006567 0.192799 0.204053
v -0.014938 0.154388 0.182261
v -0.026205 0.120720 0.152935
v -0.039933 0.093089 0.117200
v -0.055596 0.072558 0.076430
v -0.072590 0.059914 0.032193
v -0.081254 0.495731 0.035026
v -0.072590 0.483088 0.081989
v -0.064606 0.462556 0.125269
v -0.057607 0.434926 0.163205
v -0.051864 0.401258 0.194338
v -0.047596 0.362846 0.217473
v -0.044968 0.321167 0.231718
v -0.044080 0.277823 0.236529
v -0.044968 0.234478 0.231718
v -0.047596 0.192799 0.217473
v -0.051864 0.154388 0.194338
v -0.057607 0.120720 0.163205
v -0.064606 0.093089 0.125269
v -0.072590 0.072558 0.081989
v -0.081254 0.059914 0.035026
v -0.090264 0.495731 0.035983
v -0.090264 0.483088 0.083866
v -0.090264 0.462556 0.127994
v -0.090264 0.434926 0.166673
v -0.090264 0.401258 0.198416
v -0.090264 0.362846 0.222004
v -0.090264 0.321167 0.236529
v -0.090264 0.277823 0.241433
v -0.090264 0.234478 0.236529
v -0.090264 0.192799 0.222004
v -0.090264 0.154388 0.198416
v -0.090264 0.120720 0.166673
v -0.090264 0.093089 0.127994
v -0.090264 0.072558 0.083866
v -0.090264 0.059914 0.035983
v -0.099274 0.495731 0.035026
v -0.107938 0.483088 0.081989
v -0.115923 0.462556 0.125269
v -0.122922 0.434926 0.163205
v -0.128665 0.401258 0.194338
v -0.132933 0.362846 0.217473
v -0.135561 0.321167 0.231718
v -0.136449 0.277823 0.236529
v -0.135561 0.234478 0.231718
v -0.132933 0.192799 0.217473
v -0.128665 0.154388 0.194338
v -0.122922 0.120720 0.163205
v -0.115923 0.093089 0.125269
v -0.107938 0.072558 0.081989
v -0.099274 0.059914 0.035026
v -0.107938 0.495731 0.032193
v -0.124933 0.483088 0.076430
v -0.140596 0.462556 0.117200
v -0.154324 0.434926 0.152935
v -0.165590 0.401258 0.182261
v -0.173962 0.362846 0.204053
v -0.179117 0.321167 0.217473
v -0.180858 0.277823 0.222004
v -0.179117 0.234478 0.217473
v -0.173962 0.192799 0.204053
v -0.165590 0.154388 0.182261
v -0.154324 0.120720 0.152935
v -0.140596 0.093089 0.117200
v -0.124933 0.072558 0.076430
v -0.107938 0.059914 0.032193
v -0.115923 0.495731 0.027591
v -0.140596 0.483088 0.067404
v -0.163334 0.462556 0.104095
v -0.183264 0.434926 0.136256
v -0.199621 0.401258 0.162649
v -0.211775 0.362846 0.182261
v -0.219259 0.321167 0.194338
v -0.221786 0.277823 0.198416
v -0.219259 0.234478 0.194338
v -0.211775 0.192799 0.182261
v -0.199621 0.154388 0.162649
v -0.183264 0.120720 0.136256
v -0.163334 0.093089 0.104095
v -0.140596 0.072558 0.067404
v -0.115923 0.059914 0.027591
v -0.122922 0.495731 0.021398
v -0.154324 0.483088 0.055256
v -0.183264 0.462556 0.086460
v -0.208631 0.434926 0.113810
v -0.229448 0.401258 0.136256
v -0.244917 0.362846 0.152935
v -0.254443 0.321167 0.163205
v -0.257660 0.277823 0.166673
v -0.254443 0.234478 0.163205
v -0.244917 0.192799 0.152935
v -0.229448 0.154388 0.136256
v -0.208631 0.120720 0.113810
v -0.183264 0.093089 0.086460
v -0.154324 0.072558 0.055256
v -0.122922 0.059914 0.021398
v -0.090264 0.055645 -0.013813
v -0.128665 0.495731 0.013852
v -0.165590 0.483088 0.040454
v -0.199621 0.462556 0.064971
v -0.229449 0.434926 0.086460
v -0.253928 0.401258 0.104095
v -0.272117 0.362846 0.117200
v -0.283318 0.321167 0.125269
v -0.287100 0.277823 0.127994
v -0.283318 0.234478 0.125269
v -0.272117 0.192799 0.117200
v -0.253928 0.154388 0.104095
v -0.229449 0.120720 0.086460
v -0.199621 0.093089 0.064971
v -0.165590 0.072558 0.040454
v -0.128665 0.059914 0.013852
v -0.132933 0.495731 0.005243
v -0.173962 0.483088 0.023567
v -0.211774 0.462556 0.040454
v -0.244918 0.434926 0.055256
v -0.272117 0.401258 0.067404
v -0.292328 0.362846 0.076430
v -0.304774 0.321167 0.081989
v -0.308977 0.277823 0.083865
v -0.304774 0.234478 0.081989
v -0.292328 0.192799 0.076430
v -0.272117 0.154388 0.067404
v -0.244918 0.120720 0.055256
v -0.211774 0.093089 0.040454
v -0.173962 0.072558 0.023567
v -0.132933 0.059914 0.005243
v -0.135561 0.495731 -0.004098
v -0.179117 0.483088 0.005243
v -0.219259 0.462556 0.013852
v -0.254443 0.434926 0.021398
v -0.283318 0.401258 0.027591
v -0.304774 0.362846 0.032193
v -0.317987 0.321167 0.035026
v -0.322448 0.277823 0.035983
v -0.317987 0.234478 0.035026
v -0.304774 0.192799 0.032193
v -0.283318 0.154388 0.027591
v -0.254443 0.120720 0.021398
v -0.219259 0.093089 0.013852
v -0.179117 0.072558 0.005243
v -0.135561 0.059914 -0.004098
v -0.136449 0.495731 -0.013813
v -0.180858 0.483088 -0.013813
v -0.221786 0.462556 -0.013813
v -0.257660 0.434926 -0.013813
v -0.287100 0.401258 -0.013813
v -0.308977 0.362846 -0.013813
v -0.322448 0.321167 -0.013813
v -0.326997 0.277823 -0.013813
v -0.322448 0.234478 -0.013813
v -0.308977 0.192799 -0.013813
v -0.287100 0.154388 -0.013813
v -0.257660 0.120720 -0.013813
v -0.221786 0.093089 -0.013813
v -0.180858 0.072558 -0.013813
v -0.136449 0.059914 -0.013813
v -0.135561 0.495731 -0.023528
v -0.179117 0.483088 -0.032869
v -0.219259 0.462556 -0.041478
v -0.254443 0.434926 -0.049024
v -0.283318 0.401258 -0.055217
v -0.304774 0.362846 -0.059819
v -0.317987 0.321167 -0.062652
v -0.322448 0.277823 -0.063609
v -0.317987 0.234478 -0.062652
v -0.304774 0.192799 -0.059819
v -0.283318 0.154388 -0.055217
v -0.254443 0.120720 -0.049024
v -0.219259 0.093089 -0.041478
v -0.179117 0.072558 -0.032869
v -0.135561 0.059914 -0.023528
v -0.132933 0.495731 -0.032869
v -0.173962 0.483088 -0.051193
v -0.211774 0.462556 -0.068080
v -0.244917 0.434926 -0.082882
v -0.272117 0.401258 -0.095030
v -0.292328 0.362846 -0.104056
v -0.304774 0.321167 -0.109615
v -0.308977 0.277823 -0.111491
v -0.304774 0.234478 -0.109615
v -0.292328 0.192799 -0.104056
v -0.272117 0.154388 -0.095030
v -0.244917 0.120720 -0.082882
v -0.211774 0.093089 -0.068080
v -0.173962 0.072558 -0.051193
v -0.132933 0.059914 -0.032869
v -0.128665 0.495731 -0.041478
v -0.165590 0.483088 -0.068080
v -0.199621 0.462556 -0.092597
v -0.229448 0.434926 -0.114086
v -0.253928 0.401258 -0.131721
v -0.272117 0.362846 -0.144826
v -0.283318 0.321167 -0.152895
v -0.287100 0.277823 -0.155620
v -0.283318 0.234478 -0.152895
v -0.272117 0.192799 -0.144826
v -0.253928 0.154388 -0.131721
v -0.229448 0.120720 -0.114086
v -0.199621 0.093089 -0.092597
v -0.165590 0.072558 -0.068080
v -0.128665 0.059914 -0.041478
v -0.122922 0.495731 -0.049024
v -0.154324 0.483088 -0.082882
v -0.183264 0.462556 -0.114086
v -0.208631 0.434926 -0.141436
v -0.229448 0.401258 -0.163882
v -0.244917 0.362846 -0.180560
v -0.254443 0.321167 -0.190831
v -0.257659 0.277823 -0.194299
v -0.254443 0.234478 -0.190831
v -0.244917 0.192799 -0.180560
v -0.229448 0.154388 -0.163882
v -0.208631 0.120720 -0.141436
v -0.183264 0.093089 -0.114086
v -0.154324 0.072558 -0.082882
v -0.122922 0.059914 -0.049024
v -0.115923 0.495731 -0.055217
v -0.140595 0.483088 -0.095030
v -0.163334 0.462556 -0.131721
v -0.183264 0.434926 -0.163882
v -0.199621 0.401258 -0.190275
v -0.211775 0.362846 -0.209887
v -0.219259 0.321167 -0.221964
v -0.221786 0.277823 -0.226042
v -0.219259 0.234478 -0.221964
v -0.211775 0.192799 -0.209887
v -0.199621 0.154388 -0.190275
v -0.183264 0.120720 -0.163882
v -0.163334 0.093089 -0.131721
v -0.140595 0.072558 -0.095030
v -0.115923 0.059914 -0.055217
v -0.107938 0.495731 -0.059818
v -0.124933 0.483088 -0.104056
v -0.140595 0.462556 -0.144826
v -0.154324 0.434926 -0.180561
v -0.165590 0.401258 -0.209887
v -0.173962 0.362846 -0.231679
v -0.179117 0.321167 -0.245098
v -0.180858 0.277823 -0.249629
v -0.179117 0.234478 -0.245098
v -0.173962 0.192799 -0.231679
v -0.165590 0.154388 -0.209887
v -0.154324 0.120720 -0.180561
v -0.140595 0.093089 -0.144826
v -0.124933 0.072558 -0.104056
v -0.107938 0.059914 -0.059818
v -0.099274 0.495731 -0.062652
v -0.107938 0.483088 -0.109615
v -0.115923 0.462556 -0.152895
v -0.122922 0.434926 -0.190831
v -0.128665 0.401258 -0.221964
v -0.132933 0.362846 -0.245098
v -0.135561 0.321167 -0.259344
v -0.136448 0.277823 -0.264154
v -0.135561 0.234478 -0.259344
v -0.132933 0.192799 -0.245098
v -0.128665 0.154388 -0.221964
v -0.122922 0.120720 -0.190831
v -0.115923 0.093089 -0.152895
v -0.107938 0.072558 -0.109615
v -0.099274 0.059914 -0.062652
v -0.090264 0.495731 -0.063609
v -0.090264 0.483088 -0.111491
v -0.090264 0.434926 -0.194299
v -0.090264 0.362846 -0.249630
v -0.090264 0.192799 -0.249630
v -0.090264 0.120720 -0.194299
v -0.090264 0.093089 -0.155620
v -0.090264 0.072558 -0.111491
v -0.090264 0.059914 -0.063609
vn 0.0901 -0.5212 -0.8487
vn 0.0616 0.8122 -0.5802
vn 0.0770 -0.6840 -0.7254
vn 0.0770 0.6840 -0.7254
vn 0.0616 -0.8122 -0.5802
vn 0.0901 0.5212 -0.8487
vn 0.0448 -0.9058 -0.4214
vn 0.0998 0.3274 -0.9396
vn 0.0271 -0.9665 -0.2552
vn 0.1049 0.1118 -0.9882
vn 0.0091 0.9963 -0.0854
vn 0.0091 -0.9963 -0.0854
vn 0.1049 -0.1118 -0.9882
vn 0.0271 0.9665 -0.2552
vn 0.0998 -0.3274 -0.9396
vn 0.0448 0.9058 -0.4214
vn 0.2939 -0.3257 -0.8986
vn 0.1324 0.9048 -0.4048
vn 0.2657 -0.5189 -0.8125
vn 0.1821 0.8105 -0.5567
vn 0.2274 -0.6818 -0.6953
vn 0.2274 0.6818 -0.6953
vn 0.1821 -0.8105 -0.5567
vn 0.2657 0.5189 -0.8125
vn 0.1324 -0.9048 -0.4048
vn 0.2939 0.3257 -0.8986
vn 0.0802 -0.9661 -0.2453
vn 0.3089 0.1112 -0.9446
vn 0.0269 0.9963 -0.0821
vn 0.0269 -0.9963 -0.0821
vn 0.3089 -0.1112 -0.9446
vn 0.0802 0.9661 -0.2453
vn 0.2146 -0.9030 -0.3723
vn 0.4726 0.3225 -0.8201
vn 0.1302 -0.9654 -0.2259
vn 0.4963 0.1100 -0.8612
vn 0.0436 0.9962 -0.0757
vn 0.0436 -0.9962 -0.0757
vn 0.4963 -0.1100 -0.8612
vn 0.1302 0.9654 -0.2259
vn 0.4726 -0.3225 -0.8201
vn 0.2146 0.9030 -0.3723
vn 0.4281 -0.5147 -0.7428
vn 0.2946 0.8074 -0.5111
vn 0.3671 -0.6778 -0.6371
vn 0.3671 0.6778 -0.6371
vn 0.2946 -0.8074 -0.5111
vn 0.4281 0.5147 -0.7428
vn 0.2880 0.9006 -0.3255
vn 0.5702 -0.5095 -0.6444
vn 0.3945 0.8035 -0.4458
vn 0.4904 -0.6726 -0.5542
vn 0.4904 0.6726 -0.5542
vn 0.3945 -0.8035 -0.4458
vn 0.5702 0.5095 -0.6444
vn 0.2880 -0.9006 -0.3255
vn 0.6282 0.3185 -0.7099
vn 0.1750 -0.9645 -0.1978
vn 0.6588 0.1085 -0.7445
vn 0.0587 0.9961 -0.0663
vn 0.0587 -0.9961 -0.0663
vn 0.6588 -0.1085 -0.7445
vn 0.1750 0.9645 -0.1978
vn 0.6282 -0.3185 -0.7099
vn 0.7554 0.3143 -0.5750
vn 0.2131 -0.9635 -0.1622
vn 0.7912 0.1069 -0.6022
vn 0.0715 0.9960 -0.0544
vn 0.0715 -0.9960 -0.0544
vn 0.7912 -0.1069 -0.6022
vn 0.2131 0.9635 -0.1622
vn 0.7554 -0.3143 -0.5750
vn 0.3499 0.8981 -0.2664
vn 0.6873 -0.5039 -0.5231
vn 0.4782 0.7993 -0.3640
vn 0.5927 -0.6672 -0.4511
vn 0.5927 0.6672 -0.4511
vn 0.4782 -0.7993 -0.3640
vn 0.6873 0.5039 -0.5231
vn 0.3499 -0.8981 -0.2664
vn 0.7764 -0.4990 -0.3849
vn 0.5429 0.7955 -0.2692
vn 0.6712 -0.6623 -0.3328
vn 0.6712 0.6623 -0.3328
vn 0.5429 -0.7955 -0.2692
vn 0.7764 0.4990 -0.3849
vn 0.3982 -0.8958 -0.1974
vn 0.8516 0.3106 -0.4222
vn 0.2428 -0.9626 -0.1204
vn 0.8909 0.1055 -0.4417
vn 0.0816 0.9958 -0.0404
vn 0.0816 -0.9958 -0.0404
vn 0.8909 -0.1055 -0.4417
vn 0.2428 0.9626 -0.1204
vn 0.8516 -0.3106 -0.4222
vn 0.3982 0.8958 -0.1974
vn 0.2633 -0.9619 -0.0741
vn 0.9574 0.1045 -0.2693
vn 0.0885 0.9958 -0.0249
vn 0.0885 -0.9958 -0.0249
vn 0.9574 -0.1045 -0.2693
vn 0.2633 0.9619 -0.0741
vn 0.9159 -0.3079 -0.2577
vn 0.4313 0.8940 -0.1213
vn 0.8363 -0.4953 -0.2353
vn 0.5870 0.7926 -0.1651
vn 0.7243 -0.6587 -0.2038
vn 0.7243 0.6587 -0.2038
vn 0.5870 -0.7926 -0.1651
vn 0.8363 0.4953 -0.2353
vn 0.4313 -0.8940 -0.1213
vn 0.9159 0.3079 -0.2577
vn 0.7510 -0.6567 -0.0686
vn 0.7510 0.6567 -0.0686
vn 0.6093 -0.7910 -0.0557
vn 0.8662 0.4933 -0.0791
vn 0.4480 -0.8931 -0.0409
vn 0.9480 0.3064 -0.0866
vn 0.2737 -0.9615 -0.0250
vn 0.9905 0.1039 -0.0905
vn 0.0920 0.9957 -0.0084
vn 0.0920 -0.9957 -0.0084
vn 0.9905 -0.1039 -0.0905
vn 0.2737 0.9615 -0.0250
vn 0.9480 -0.3064 -0.0866
vn 0.4480 0.8931 -0.0409
vn 0.8662 -0.4933 -0.0791
vn 0.6093 0.7910 -0.0557
vn 0.0920 0.9957 0.0084
vn 0.0920 -0.9957 0.0084
vn 0.9905 -0.1039 0.0905
vn 0.2737 0.9615 0.0250
vn 0.9480 -0.3064 0.0866
vn 0.4480 0.8931 0.0409
vn 0.8662 -0.4933 0.0791
vn 0.6093 0.7910 0.0557
vn 0.7510 -0.6567 0.0686
vn 0.7510 0.6567 0.0686
vn 0.6093 -0.7910 0.0557
vn 0.8662 0.4933 0.0791
vn 0.4480 -0.8931 0.0409
vn 0.9480 0.3064 0.0866
vn 0.2737 -0.9615 0.0250
vn 0.9905 0.1039 0.0905
vn 0.7243 0.6587 0.2038
vn 0.5870 -0.7926 0.1651
vn 0.8363 0.4953 0.2353
vn 0.4313 -0.8940 0.1213
vn 0.9159 0.3079 0.2577
vn 0.2633 -0.9619 0.0741
vn 0.9574 0.1045 0.2693
vn 0.0885 0.9958 0.0249
vn 0.0885 -0.9958 0.0249
vn 0.9574 -0.1045 0.2693
vn 0.2633 0.9619 0.0741
vn 0.9159 -0.3079 0.2577
vn 0.4313 0.8940 0.1213
vn 0.8363 -0.4953 0.2353
vn 0.5870 0.7926 0.1651
vn 0.7243 -0.6587 0.2038
vn 0.8909 -0.1055 0.4417
vn 0.2428 0.9626 0.1204
vn 0.8516 -0.3106 0.4222
vn 0.3982 0.8958 0.1974
vn 0.7764 -0.4990 0.3849
vn 0.5429 0.7955 0.2692
vn 0.6712 -0.6623 0.3328
vn 0.6712 0.6623 0.3328
vn 0.5429 -0.7955 0.2692
vn 0.7764 0.4990 0.3849
vn 0.3982 -0.8958 0.1974
vn 0.8516 0.3106 0.4222
vn 0.2428 -0.9626 0.1204
vn 0.8909 0.1055 0.4417
vn 0.0816 0.9958 0.0404
vn 0.0816 -0.9958 0.0404
vn 0.4782 -0.7993 0.3640
vn 0.6873 0.5039 0.5231
vn 0.3499 -0.8981 0.2664
vn 0.7554 0.3143 0.5750
vn 0.2131 -0.9635 0.1622
vn 0.7912 0.1069 0.6022
vn 0.0715 0.9960 0.0544
vn 0.0715 -0.9960 0.0544
vn 0.7912 -0.1069 0.6022
vn 0.2131 0.9635 0.1622
vn 0.7554 -0.3143 0.5750
vn 0.3499 0.8981 0.2664
vn 0.6873 -0.5039 0.5231
vn 0.4782 0.7993 0.3640
vn 0.5927 -0.6672 0.4511
vn 0.5927 0.6672 0.4511
vn 0.1750 0.9645 0.1978
vn 0.6282 -0.3185 0.7099
vn 0.2880 0.9006 0.3255
vn 0.5702 -0.5095 0.6444
vn 0.3945 0.8035 0.4458
vn 0.4904 -0.6726 0.5542
vn 0.4904 0.6726 0.5542
vn 0.3945 -0.8035 0.4458
vn 0.5702 0.5095 0.6444
vn 0.2880 -0.9006 0.3255
vn 0.6282 0.3185 0.7099
vn 0.1750 -0.9645 0.1978
vn 0.6588 0.1084 0.7445
vn 0.0587 0.9961 0.0663
vn 0.0587 -0.9961 0.0663
vn 0.6588 -0.1084 0.7445
vn 0.4281 0.5147 0.7428
vn 0.2146 -0.9030 0.3723
vn 0.4726 0.3225 0.8201
vn 0.1302 -0.9654 0.2259
vn 0.4963 0.1100 0.8612
vn 0.0436 0.9962 0.0757
vn 0.0436 -0.9962 0.0757
vn 0.4963 -0.1100 0.8612
vn 0.1302 0.9654 0.2259
vn 0.4726 -0.3225 0.8201
vn 0.2146 0.9030 0.3723
vn 0.4281 -0.5147 0.7428
vn 0.2946 0.8074 0.5111
vn 0.3671 -0.6778 0.6371
vn 0.3671 0.6778 0.6371
vn 0.2946 -0.8074 0.5111
vn 0.2939 -0.3257 0.8986
vn 0.1324 0.9048 0.4048
vn 0.2657 -0.5189 0.8125
vn 0.1821 0.8105 0.5567
vn 0.2274 -0.6818 0.6953
vn 0.2274 0.6818 0.6953
vn 0.1821 -0.8105 0.5567
vn 0.2657 0.5189 0.8125
vn 0.1324 -0.9048 0.4048
vn 0.2939 0.3257 0.8986
vn 0.0802 -0.9661 0.2453
vn 0.3089 0.1111 0.9446
vn 0.0269 0.9963 0.0821
vn 0.0269 -0.9963 0.0821
vn 0.3089 -0.1111 0.9446
vn 0.0802 0.9661 0.2453
vn 0.0448 -0.9058 0.4214
vn 0.0998 0.3274 0.9396
vn 0.0271 -0.9665 0.2552
vn 0.1049 0.1118 0.9882
vn 0.0091 0.9963 0.0854
vn 0.0091 -0.9963 0.0854
vn 0.1049 -0.1118 0.9882
vn 0.0271 0.9665 0.2552
vn 0.0998 -0.3274 0.9396
vn 0.0448 0.9058 0.4214
vn 0.0901 -0.5212 0.8487
vn 0.0616 0.8122 0.5802
vn 0.0770 -0.6840 0.7254
vn 0.0770 0.6840 0.7254
vn 0.0616 -0.8122 0.5802
vn 0.0901 0.5212 0.8487
vn -0.0901 -0.5212 0.8487
vn -0.0616 0.8122 0.5802
vn -0.0770 -0.6840 0.7254
vn -0.0770 0.6840 0.7254
vn -0.0616 -0.8122 0.5802
vn -0.0901 0.5212 0.8487
vn -0.0448 -0.9058 0.4214
vn -0.0998 0.3274 0.9396
vn -0.0271 -0.9665 0.2552
vn -0.1049 0.1118 0.9882
vn -0.0091 0.9963 0.0854
vn -0.0091 -0.9963 0.0854
vn -0.1049 -0.1118 0.9882
vn -0.0271 0.9665 0.2552
vn -0.0998 -0.3274 0.9396
vn -0.0448 0.9058 0.4214
vn -0.0802 -0.9661 0.2453
vn -0.3089 0.1111 0.9446
vn -0.0269 0.9963 0.0821
vn -0.0269 -0.9963 0.0821
vn -0.3089 -0.1111 0.9446
vn -0.0802 0.9661 0.2453
vn -0.2939 -0.3257 0.8986
vn -0.1324 0.9048 0.4048
vn -0.2657 -0.5189 0.8125
vn -0.1821 0.8105 0.5567
vn -0.2274 -0.6818 0.6953
vn -0.2274 0.6818 0.6953
vn -0.1821 -0.8105 0.5567
vn -0.2657 0.5189 0.8125
vn -0.1324 -0.9048 0.4048
vn -0.2939 0.3257 0.8986
vn -0.2946 0.8074 0.5111
vn -0.3671 -0.6778 0.6371
vn -0.3671 0.6778 0.6371
vn -0.2946 -0.8074 0.5111
vn -0.4281 0.5147 0.7428
vn -0.2146 -0.9030 0.3723
vn -0.4726 0.3225 0.8201
vn -0.1302 -0.9654 0.2259
vn -0.4963 0.1100 0.8612
vn -0.0436 0.9962 0.0757
vn -0.0436 -0.9962 0.0757
vn -0.4963 -0.1100 0.8612
vn -0.1302 0.9654 0.2259
vn -0.4726 -0.3225 0.8201
vn -0.2146 0.9030 0.3723
vn -0.4281 -0.5147 0.7428
vn -0.6588 0.1084 0.7445
vn -0.0587 0.9961 0.0663
vn -0.0587 -0.9961 0.0663
vn -0.6588 -0.1084 0.7445
vn -0.1750 0.9645 0.1978
vn -0.6282 -0.3185 0.7099
vn -0.2880 0.9006 0.3255
vn -0.5702 -0.5095 0.6444
vn -0.3945 0.8035 0.4458
vn -0.4904 -0.6726 0.5542
vn -0.4904 0.6726 0.5542
vn -0.3945 -0.8035 0.4458
vn -0.5702 0.5095 0.6444
vn -0.2880 -0.9006 0.3255
vn -0.6282 0.3185 0.7099
vn -0.1750 -0.9645 0.1978
vn -0.5927 -0.6672 0.4511
vn -0.5927 0.6672 0.4511
vn -0.4782 -0.7993 0.3640
vn -0.6873 0.5039 0.5231
vn -0.3499 -0.8981 0.2664
vn -0.7554 0.3143 0.5750
vn -0.2131 -0.9635 0.1622
vn -0.7912 0.1069 0.6022
vn -0.0715 0.9960 0.0544
vn -0.0715 -0.9960 0.0544
vn -0.7912 -0.1069 0.6022
vn -0.2131 0.9635 0.1622
vn -0.7554 -0.3143 0.5750
vn -0.3500 0.8981 0.2664
vn -0.6873 -0.5039 0.5231
vn -0.4782 0.7993 0.3640
vn -0.0816 0.9958 0.0404
vn -0.0816 -0.9958 0.0404
vn -0.8909 -0.1055 0.4417
vn -0.2428 0.9626 0.1204
vn -0.8516 -0.3106 0.4222
vn -0.3982 0.8958 0.1974
vn -0.7764 -0.4990 0.3849
vn -0.5429 0.7955 0.2692
vn -0.6712 -0.6623 0.3328
vn -0.6712 0.6623 0.3328
vn -0.5429 -0.7955 0.2692
vn -0.7764 0.4990 0.3849
vn -0.3982 -0.8958 0.1974
vn -0.8516 0.3106 0.4222
vn -0.2428 -0.9626 0.1204
vn -0.8909 0.1055 0.4417
vn -0.7243 0.6587 0.2038
vn -0.5870 -0.7926 0.1651
vn -0.8363 0.4953 0.2353
vn -0.4313 -0.8940 0.1213
vn -0.9159 0.3079 0.2577
vn -0.2633 -0.9619 0.0741
vn -0.9574 0.1045 0.2693
vn -0.0885 0.9958 0.0249
vn -0.0885 -0.9958 0.0249
vn -0.9574 -0.1045 0.2693
vn -0.2633 0.9619 0.0741
vn -0.9159 -0.3079 0.2577
vn -0.4313 0.8940 0.1213
vn -0.8363 -0.4953 0.2353
vn -0.5870 0.7926 0.1651
vn -0.7243 -0.6587 0.2038
vn -0.9905 -0.1039 0.0905
vn -0.2737 0.9615 0.0250
vn -0.9480 -0.3064 0.0866
vn -0.4480 0.8931 0.0409
vn -0.8662 -0.4933 0.0791
vn -0.6093 0.7910 0.0557
vn -0.7510 -0.6567 0.0686
vn -0.7510 0.6567 0.0686
vn -0.6093 -0.7910 0.0557
vn -0.8662 0.4933 0.0791
vn -0.4480 -0.8931 0.0409
vn -0.9480 0.3064 0.0866
vn -0.2737 -0.9615 0.0250
vn -0.9905 0.1039 0.0905
vn -0.0920 0.9957 0.0084
vn -0.0920 -0.9957 0.0084
vn -0.6093 -0.7910 -0.0557
vn -0.8662 0.4933 -0.0791
vn -0.4480 -0.8931 -0.0409
vn -0.9480 0.3064 -0.0866
vn -0.2737 -0.9615 -0.0250
vn -0.9905 0.1039 -0.0905
vn -0.0920 0.9957 -0.0084
vn -0.0920 -0.9957 -0.0084
vn -0.9905 -0.1039 -0.0905
vn -0.2737 0.9615 -0.0250
vn -0.9480 -0.3064 -0.0866
vn -0.4480 0.8931 -0.0409
vn -0.8662 -0.4933 -0.0791
vn -0.6093 0.7910 -0.0557
vn -0.7510 -0.6567 -0.0686
vn -0.7510 0.6567 -0.0686
vn -0.9159 -0.3079 -0.2577
vn -0.4313 0.8940 -0.1213
vn -0.8363 -0.4953 -0.2353
vn -0.5870 0.7926 -0.1651
vn -0.7243 -0.6587 -0.2038
vn -0.7243 0.6587 -0.2038
vn -0.5870 -0.7926 -0.1651
vn -0.8363 0.4953 -0.2353
vn -0.4313 -0.8940 -0.1213
vn -0.9159 0.3079 -0.2577
vn -0.2633 -0.9619 -0.0741
vn -0.9574 0.1045 -0.2693
vn -0.0885 0.9958 -0.0249
vn -0.0885 -0.9958 -0.0249
vn -0.9574 -0.1045 -0.2693
vn -0.2633 0.9619 -0.0741
vn -0.3982 -0.8958 -0.1974
vn -0.8516 0.3106 -0.4222
vn -0.2428 -0.9626 -0.1204
vn -0.8909 0.1055 -0.4417
vn -0.0816 0.9958 -0.0404
vn -0.0816 -0.9958 -0.0404
vn -0.8909 -0.1055 -0.4417
vn -0.2428 0.9626 -0.1204
vn -0.8516 -0.3106 -0.4222
vn -0.3982 0.8958 -0.1974
vn -0.7764 -0.4990 -0.3849
vn -0.5429 0.7955 -0.2692
vn -0.6712 -0.6623 -0.3328
vn -0.6712 0.6623 -0.3328
vn -0.5429 -0.7955 -0.2692
vn -0.7764 0.4990 -0.3849
vn -0.3500 0.8981 -0.2664
vn -0.6873 -0.5039 -0.5231
vn -0.4782 0.7993 -0.3640
vn -0.5927 -0.6672 -0.4511
vn -0.5927 0.6672 -0.4511
vn -0.4782 -0.7993 -0.3640
vn -0.6873 0.5039 -0.5231
vn -0.3499 -0.8981 -0.2664
vn -0.7554 0.3143 -0.5750
vn -0.2131 -0.9635 -0.1622
vn -0.7912 0.1069 -0.6022
vn -0.0715 0.9960 -0.0544
vn -0.0715 -0.9960 -0.0544
vn -0.7912 -0.1069 -0.6022
vn -0.2131 0.9635 -0.1622
vn -0.7554 -0.3143 -0.5750
vn -0.6282 0.3185 -0.7099
vn -0.1750 -0.9645 -0.1978
vn -0.6588 0.1084 -0.7445
vn -0.0587 0.9961 -0.0663
vn -0.0587 -0.9961 -0.0663
vn -0.6588 -0.1084 -0.7445
vn -0.1750 0.9645 -0.1978
vn -0.6282 -0.3185 -0.7099
vn -0.2880 0.9006 -0.3255
vn -0.5702 -0.5095 -0.6444
vn -0.3945 0.8035 -0.4458
vn -0.4904 -0.6726 -0.5542
vn -0.4904 0.6726 -0.5542
vn -0.3945 -0.8035 -0.4458
vn -0.5702 0.5095 -0.6444
vn -0.2880 -0.9006 -0.3255
vn -0.4281 -0.5147 -0.7428
vn -0.2946 0.8074 -0.5111
vn -0.3671 -0.6778 -0.6371
vn -0.3671 0.6778 -0.6371
vn -0.2946 -0.8074 -0.5111
vn -0.4281 0.5147 -0.7428
vn -0.2146 -0.9030 -0.3723
vn -0.4726 0.3225 -0.8201
vn -0.1302 -0.9654 -0.2259
vn -0.4963 0.1100 -0.8612
vn -0.0436 0.9962 -0.0757
vn -0.0436 -0.9962 -0.0757
vn -0.4963 -0.1100 -0.8612
vn -0.1302 0.9654 -0.2259
vn -0.4726 -0.3225 -0.8201
vn -0.2146 0.9030 -0.3723
vn -0.0802 -0.9661 -0.2453
vn -0.3089 0.1111 -0.9446
vn -0.0269 0.9963 -0.0821
vn -0.0269 -0.9963 -0.0821
vn -0.3089 -0.1111 -0.9446
vn -0.0802 0.9661 -0.2453
vn -0.2939 -0.3257 -0.8986
vn -0.1324 0.9048 -0.4048
vn -0.2657 -0.5189 -0.8125
vn -0.1821 0.8105 -0.5567
vn -0.2274 -0.6818 -0.6953
vn -0.2274 0.6818 -0.6953
vn -0.1821 -0.8105 -0.5567
vn -0.2657 0.5189 -0.8125
vn -0.1324 -0.9048 -0.4048
vn -0.2939 0.3257 -0.8986
vn -0.0616 0.8122 -0.5802
vn -0.0770 -0.6840 -0.7254
vn -0.0770 0.6840 -0.7254
vn -0.0616 -0.8122 -0.5802
vn -0.0901 0.5212 -0.8487
vn -0.0448 -0.9058 -0.4214
vn -0.0998 0.3274 -0.9396
vn -0.0271 -0.9665 -0.2552
vn -0.1049 0.1118 -0.9882
vn -0.0091 0.9963 -0.0854
vn -0.0091 -0.9963 -0.0854
vn -0.1049 -0.1118 -0.9882
vn -0.0271 0.9665 -0.2552
vn -0.0998 -0.3274 -0.9396
vn -0.0448 0.9058 -0.4214
vn -0.0901 -0.5212 -0.8487
vn 0.0447 -0.9058 -0.4214
vn -0.3500 -0.8981 0.2664
vn -0.3499 0.8981 0.2664
vn -0.3499 0.8981 -0.2664
vn -0.3500 -0.8981 -0.2664
vt 0.750000 0.375000
vt 0.718750 0.312500
vt 0.750000 0.312500
vt 0.750000 0.750000
vt 0.718750 0.812500
vt 0.718750 0.750000
vt 0.718750 0.250000
vt 0.750000 0.250000
vt 0.750000 0.687500
vt 0.718750 0.687500
vt 0.718750 0.187500
vt 0.750000 0.187500
vt 0.750000 0.625000
vt 0.718750 0.625000
vt 0.718750 0.125000
vt 0.750000 0.125000
vt 0.750000 0.562500
vt 0.718750 0.562500
vt 0.718750 0.062500
vt 0.750000 0.062500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.734375 0.000000
vt 0.718750 0.437500
vt 0.750000 0.437500
vt 0.750000 0.875000
vt 0.718750 0.875000
vt 0.718750 0.375000
vt 0.750000 0.812500
vt 0.687500 0.375000
vt 0.687500 0.812500
vt 0.687500 0.312500
vt 0.687500 0.750000
vt 0.687500 0.250000
vt 0.687500 0.687500
vt 0.687500 0.187500
vt 0.687500 0.625000
vt 0.687500 0.125000
vt 0.687500 0.562500
vt 0.687500 0.062500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.703125 0.000000
vt 0.687500 0.437500
vt 0.687500 0.875000
vt 0.656250 0.125000
vt 0.656250 0.562500
vt 0.656250 0.062500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.671875 0.000000
vt 0.656250 0.437500
vt 0.656250 0.875000
vt 0.656250 0.375000
vt 0.656250 0.812500
vt 0.656250 0.312500
vt 0.656250 0.750000
vt 0.656250 0.250000
vt 0.656250 0.687500
vt 0.656250 0.187500
vt 0.656250 0.625000
vt 0.625000 0.812500
vt 0.625000 0.375000
vt 0.625000 0.312500
vt 0.625000 0.750000
vt 0.625000 0.250000
vt 0.625000 0.687500
vt 0.625000 0.187500
vt 0.625000 0.625000
vt 0.625000 0.125000
vt 0.625000 0.562500
vt 0.625000 0.062500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.640625 0.000000
vt 0.625000 0.437500
vt 0.625000 0.875000
vt 0.593750 0.625000
vt 0.593750 0.562500
vt 0.593750 0.062500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.609375 0.000000
vt 0.593750 0.437500
vt 0.593750 0.875000
vt 0.593750 0.375000
vt 0.593750 0.812500
vt 0.593750 0.312500
vt 0.593750 0.750000
vt 0.593750 0.250000
vt 0.593750 0.687500
vt 0.593750 0.187500
vt 0.593750 0.125000
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.562500 0.750000
vt 0.562500 0.250000
vt 0.562500 0.687500
vt 0.562500 0.187500
vt 0.562500 0.625000
vt 0.562500 0.125000
vt 0.562500 0.562500
vt 0.562500 0.062500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.578125 0.000000
vt 0.562500 0.437500
vt 0.562500 0.875000
vt 0.562500 0.812500
vt 0.531250 0.062500
vt 0.531250 0.562500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.546875 0.000000
vt 0.531250 0.437500
vt 0.531250 0.875000
vt 0.531250 0.375000
vt 0.531250 0.812500
vt 0.531250 0.312500
vt 0.531250 0.750000
vt 0.531250 0.250000
vt 0.531250 0.687500
vt 0.531250 0.187500
vt 0.531250 0.625000
vt 0.531250 0.125000
vt 0.500000 0.250000
vt 0.500000 0.750000
vt 0.500000 0.687500
vt 0.500000 0.187500
vt 0.500000 0.625000
vt 0.500000 0.125000
vt 0.500000 0.562500
vt 0.500000 0.062500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.515625 0.000000
vt 0.500000 0.437500
vt 0.500000 0.875000
vt 0.500000 0.375000
vt 0.500000 0.812500
vt 0.500000 0.312500
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.484375 0.000000
vt 0.468750 0.062500
vt 0.468750 0.437500
vt 0.468750 0.875000
vt 0.468750 0.375000
vt 0.468750 0.812500
vt 0.468750 0.312500
vt 0.468750 0.750000
vt 0.468750 0.250000
vt 0.468750 0.687500
vt 0.468750 0.187500
vt 0.468750 0.625000
vt 0.468750 0.125000
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.437500 0.750000
vt 0.437500 0.687500
vt 0.437500 0.250000
vt 0.437500 0.187500
vt 0.437500 0.625000
vt 0.437500 0.125000
vt 0.437500 0.562500
vt 0.437500 0.062500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.453125 0.000000
vt 0.437500 0.437500
vt 0.437500 0.875000
vt 0.437500 0.375000
vt 0.437500 0.812500
vt 0.437500 0.312500
vt 0.406250 0.437500
vt 0.406250 0.875000
vt 0.406250 0.375000
vt 0.406250 0.812500
vt 0.406250 0.312500
vt 0.406250 0.750000
vt 0.406250 0.250000
vt 0.406250 0.687500
vt 0.406250 0.187500
vt 0.406250 0.625000
vt 0.406250 0.125000
vt 0.406250 0.562500
vt 0.406250 0.062500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.421875 0.000000
vt 0.375000 0.187500
vt 0.375000 0.625000
vt 0.375000 0.125000
vt 0.375000 0.562500
vt 0.375000 0.062500
vt 0.375000 0.500000
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.390625 0.000000
vt 0.375000 0.437500
vt 0.375000 0.875000
vt 0.375000 0.375000
vt 0.375000 0.812500
vt 0.375000 0.312500
vt 0.375000 0.750000
vt 0.375000 0.250000
vt 0.375000 0.687500
vt 0.343750 0.937500
vt 0.343750 0.875000
vt 0.343750 0.375000
vt 0.343750 0.812500
vt 0.343750 0.312500
vt 0.343750 0.750000
vt 0.343750 0.250000
vt 0.343750 0.687500
vt 0.343750 0.187500
vt 0.343750 0.625000
vt 0.343750 0.125000
vt 0.343750 0.562500
vt 0.343750 0.062500
vt 0.343750 0.500000
vt 0.359375 1.000000
vt 0.359375 0.000000
vt 0.343750 0.437500
vt 0.312500 0.625000
vt 0.312500 0.125000
vt 0.312500 0.562500
vt 0.312500 0.062500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.312500 0.937500
vt 0.328125 0.000000
vt 0.312500 0.437500
vt 0.312500 0.875000
vt 0.312500 0.375000
vt 0.312500 0.812500
vt 0.312500 0.312500
vt 0.312500 0.750000
vt 0.312500 0.250000
vt 0.312500 0.687500
vt 0.312500 0.187500
vt 0.281250 0.375000
vt 0.281250 0.812500
vt 0.281250 0.312500
vt 0.281250 0.750000
vt 0.281250 0.250000
vt 0.281250 0.687500
vt 0.281250 0.187500
vt 0.281250 0.625000
vt 0.281250 0.125000
vt 0.281250 0.562500
vt 0.281250 0.062500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.296875 0.000000
vt 0.281250 0.437500
vt 0.281250 0.875000
vt 0.250000 0.125000
vt 0.250000 0.625000
vt 0.250000 0.562500
vt 0.250000 0.062500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.265625 0.000000
vt 0.250000 0.437500
vt 0.250000 0.875000
vt 0.250000 0.375000
vt 0.250000 0.812500
vt 0.250000 0.312500
vt 0.250000 0.750000
vt 0.250000 0.250000
vt 0.250000 0.687500
vt 0.250000 0.187500
vt 0.218750 0.375000
vt 0.218750 0.312500
vt 0.218750 0.750000
vt 0.218750 0.250000
vt 0.218750 0.687500
vt 0.218750 0.187500
vt 0.218750 0.625000
vt 0.218750 0.125000
vt 0.218750 0.562500
vt 0.218750 0.062500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.234375 0.000000
vt 0.218750 0.437500
vt 0.218750 0.875000
vt 0.218750 0.812500
vt 0.187500 0.062500
vt 0.187500 0.562500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.203125 0.000000
vt 0.187500 0.437500
vt 0.187500 0.875000
vt 0.187500 0.375000
vt 0.187500 0.812500
vt 0.187500 0.312500
vt 0.187500 0.750000
vt 0.187500 0.250000
vt 0.187500 0.687500
vt 0.187500 0.187500
vt 0.187500 0.625000
vt 0.187500 0.125000
vt 0.156250 0.750000
vt 0.156250 0.250000
vt 0.156250 0.687500
vt 0.156250 0.187500
vt 0.156250 0.625000
vt 0.156250 0.125000
vt 0.156250 0.562500
vt 0.156250 0.062500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.171875 0.000000
vt 0.156250 0.437500
vt 0.156250 0.875000
vt 0.156250 0.375000
vt 0.156250 0.812500
vt 0.156250 0.312500
vt 0.125000 0.562500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.140625 0.000000
vt 0.125000 0.062500
vt 0.125000 0.437500
vt 0.125000 0.875000
vt 0.125000 0.375000
vt 0.125000 0.812500
vt 0.125000 0.312500
vt 0.125000 0.750000
vt 0.125000 0.250000
vt 0.125000 0.687500
vt 0.125000 0.187500
vt 0.125000 0.625000
vt 0.125000 0.125000
vt 0.093750 0.250000
vt 0.093750 0.750000
vt 0.093750 0.687500
vt 0.093750 0.187500
vt 0.093750 0.625000
vt 0.093750 0.125000
vt 0.093750 0.562500
vt 0.093750 0.062500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.109375 0.000000
vt 0.093750 0.437500
vt 0.093750 0.875000
vt 0.093750 0.375000
vt 0.093750 0.812500
vt 0.093750 0.312500
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.078125 0.000000
vt 0.062500 0.062500
vt 0.062500 0.437500
vt 0.062500 0.875000
vt 0.062500 0.375000
vt 0.062500 0.812500
vt 0.062500 0.312500
vt 0.062500 0.750000
vt 0.062500 0.250000
vt 0.062500 0.687500
vt 0.062500 0.187500
vt 0.062500 0.625000
vt 0.062500 0.125000
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.031250 0.750000
vt 0.031250 0.687500
vt 0.031250 0.250000
vt 0.031250 0.187500
vt 0.031250 0.625000
vt 0.031250 0.125000
vt 0.031250 0.562500
vt 0.031250 0.062500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.046875 0.000000
vt 0.031250 0.437500
vt 0.031250 0.875000
vt 0.031250 0.375000
vt 0.031250 0.812500
vt 0.031250 0.312500
vt 0.000000 0.437500
vt 0.000000 0.875000
vt 0.000000 0.375000
vt 0.000000 0.812500
vt 0.000000 0.312500
vt 0.000000 0.750000
vt 0.000000 0.250000
vt 0.000000 0.687500
vt 0.000000 0.187500
vt 0.000000 0.625000
vt 0.000000 0.125000
vt 0.000000 0.562500
vt 0.000000 0.062500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.015625 0.000000
vt 1.000000 0.187500
vt 0.968750 0.250000
vt 0.968750 0.187500
vt 1.000000 0.687500
vt 0.968750 0.625000
vt 1.000000 0.625000
vt 0.968750 0.125000
vt 1.000000 0.125000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 0.968750 0.062500
vt 1.000000 0.062500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.968750 0.937500
vt 0.984375 0.000000
vt 0.968750 0.437500
vt 1.000000 0.437500
vt 0.968750 0.875000
vt 1.000000 0.875000
vt 0.968750 0.375000
vt 1.000000 0.375000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 1.000000 0.312500
vt 0.968750 0.312500
vt 0.968750 0.750000
vt 1.000000 0.750000
vt 1.000000 0.250000
vt 0.968750 0.687500
vt 0.937500 0.375000
vt 0.937500 0.875000
vt 0.937500 0.812500
vt 0.937500 0.312500
vt 0.937500 0.750000
vt 0.937500 0.250000
vt 0.937500 0.687500
vt 0.937500 0.187500
vt 0.937500 0.625000
vt 0.937500 0.125000
vt 0.937500 0.562500
vt 0.937500 0.062500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.953125 0.000000
vt 0.937500 0.437500
vt 0.906250 0.187500
vt 0.906250 0.125000
vt 0.906250 0.625000
vt 0.906250 0.562500
vt 0.906250 0.062500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.906250 0.937500
vt 0.921875 0.000000
vt 0.906250 0.437500
vt 0.906250 0.875000
vt 0.906250 0.375000
vt 0.906250 0.812500
vt 0.906250 0.312500
vt 0.906250 0.750000
vt 0.906250 0.250000
vt 0.906250 0.687500
vt 0.875000 0.875000
vt 0.875000 0.812500
vt 0.875000 0.375000
vt 0.875000 0.312500
vt 0.875000 0.750000
vt 0.875000 0.250000
vt 0.875000 0.687500
vt 0.875000 0.187500
vt 0.875000 0.625000
vt 0.875000 0.125000
vt 0.875000 0.562500
vt 0.875000 0.062500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.890625 0.000000
vt 0.875000 0.437500
vt 0.843750 0.625000
vt 0.843750 0.562500
vt 0.843750 0.062500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.859375 0.000000
vt 0.843750 0.437500
vt 0.843750 0.875000
vt 0.843750 0.375000
vt 0.843750 0.812500
vt 0.843750 0.312500
vt 0.843750 0.750000
vt 0.843750 0.250000
vt 0.843750 0.687500
vt 0.843750 0.187500
vt 0.843750 0.125000
vt 0.812500 0.375000
vt 0.812500 0.312500
vt 0.812500 0.750000
vt 0.812500 0.250000
vt 0.812500 0.687500
vt 0.812500 0.187500
vt 0.812500 0.625000
vt 0.812500 0.125000
vt 0.812500 0.562500
vt 0.812500 0.062500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.828125 0.000000
vt 0.812500 0.437500
vt 0.812500 0.875000
vt 0.812500 0.812500
vt 0.781250 0.125000
vt 0.781250 0.062500
vt 0.781250 0.562500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.796875 0.000000
vt 0.781250 0.437500
vt 0.781250 0.875000
vt 0.781250 0.375000
vt 0.781250 0.812500
vt 0.781250 0.312500
vt 0.781250 0.750000
vt 0.781250 0.250000
vt 0.781250 0.687500
vt 0.781250 0.187500
vt 0.781250 0.625000
vt 0.765625 1.000000
vt 0.765625 0.000000
s 0
f 478/1/1 17/2/1 6/3/1
f 476/4/2 9/5/2 10/6/2
f 6/3/3 18/7/3 479/8/3
f 2/9/4 10/6/4 11/10/4
f 479/8/5 19/11/5 480/12/5
f 477/13/6 11/10/6 12/14/6
f 480/12/7 20/15/7 481/16/7
f 3/17/8 12/14/8 13/18/8
f 481/16/9 21/19/9 482/20/9
f 3/17/10 14/21/10 4/22/10
f 474/23/11 82/24/11 7/25/11
f 308/26/12 482/20/12 21/19/12
f 4/22/13 15/27/13 5/28/13
f 475/29/14 7/25/14 8/30/14
f 5/28/15 16/31/15 478/1/15
f 1/32/16 8/30/16 9/5/16
f 15/27/17 31/33/17 16/31/17
f 8/30/18 24/34/18 9/5/18
f 16/31/19 32/35/19 17/2/19
f 9/5/20 25/36/20 10/6/20
f 17/2/21 33/37/21 18/7/21
f 10/6/22 26/38/22 11/10/22
f 19/11/23 33/37/23 34/39/23
f 11/10/24 27/40/24 12/14/24
f 20/15/25 34/39/25 35/41/25
f 13/18/26 27/40/26 28/42/26
f 20/15/27 36/43/27 21/19/27
f 13/18/28 29/44/28 14/21/28
f 7/25/29 82/45/29 22/46/29
f 308/47/30 21/19/30 36/43/30
f 14/21/31 30/48/31 15/27/31
f 7/25/32 23/49/32 8/30/32
f 34/39/33 50/50/33 35/41/33
f 27/40/34 43/51/34 28/42/34
f 35/41/35 51/52/35 36/43/35
f 28/42/36 44/53/36 29/44/36
f 22/46/37 82/54/37 37/55/37
f 308/56/38 36/43/38 51/52/38
f 30/48/39 44/53/39 45/57/39
f 22/46/40 38/58/40 23/49/40
f 30/48/41 46/59/41 31/33/41
f 23/49/42 39/60/42 24/34/42
f 32/35/43 46/59/43 47/61/43
f 24/34/44 40/62/44 25/36/44
f 33/37/45 47/61/45 48/63/45
f 25/36/46 41/64/46 26/38/46
f 34/39/47 48/63/47 49/65/47
f 26/38/48 42/66/48 27/40/48
f 38/58/49 54/67/49 39/60/49
f 47/61/50 61/68/50 62/69/50
f 39/60/51 55/70/51 40/62/51
f 47/61/52 63/71/52 48/63/52
f 40/62/53 56/72/53 41/64/53
f 48/63/54 64/73/54 49/65/54
f 41/64/55 57/74/55 42/66/55
f 49/65/56 65/75/56 50/50/56
f 43/51/57 57/74/57 58/76/57
f 50/50/58 66/77/58 51/52/58
f 44/53/59 58/76/59 59/78/59
f 37/55/60 82/79/60 52/80/60
f 308/81/61 51/52/61 66/77/61
f 44/53/62 60/82/62 45/57/62
f 37/55/63 53/83/63 38/58/63
f 45/57/64 61/68/64 46/59/64
f 58/76/65 72/84/65 73/85/65
f 65/75/66 81/86/66 66/77/66
f 59/78/67 73/85/67 74/87/67
f 52/80/68 82/88/68 67/89/68
f 308/90/69 66/77/69 81/86/69
f 59/78/70 75/91/70 60/82/70
f 52/80/71 68/92/71 53/83/71
f 60/82/72 76/93/72 61/68/72
f 53/83/73 69/94/73 54/67/73
f 62/69/74 76/93/74 77/95/74
f 54/67/75 70/96/75 55/70/75
f 62/69/76 78/97/76 63/71/76
f 56/72/77 70/96/77 71/98/77
f 63/71/78 79/99/78 64/73/78
f 56/72/79 72/84/79 57/74/79
f 65/75/80 79/99/80 80/100/80
f 77/95/81 92/101/81 93/102/81
f 69/94/82 86/103/82 70/96/82
f 77/95/83 94/104/83 78/97/83
f 71/98/84 86/103/84 87/105/84
f 78/97/85 95/106/85 79/99/85
f 71/98/86 88/107/86 72/84/86
f 79/99/87 96/108/87 80/100/87
f 73/85/88 88/107/88 89/109/88
f 81/86/89 96/108/89 97/110/89
f 74/87/90 89/109/90 90/111/90
f 67/89/91 82/112/91 83/113/91
f 308/114/92 81/86/92 97/110/92
f 74/87/93 91/115/93 75/91/93
f 67/89/94 84/116/94 68/92/94
f 75/91/95 92/101/95 76/93/95
f 68/92/96 85/117/96 69/94/96
f 96/108/97 112/118/97 97/110/97
f 90/111/98 104/119/98 105/120/98
f 83/113/99 82/121/99 98/122/99
f 308/123/100 97/110/100 112/118/100
f 90/111/101 106/124/101 91/115/101
f 84/116/102 98/122/102 99/125/102
f 91/115/103 107/126/103 92/101/103
f 84/116/104 100/127/104 85/117/104
f 93/102/105 107/126/105 108/128/105
f 85/117/106 101/129/106 86/103/106
f 93/102/107 109/130/107 94/104/107
f 87/105/108 101/129/108 102/131/108
f 95/106/109 109/130/109 110/132/109
f 87/105/110 103/133/110 88/107/110
f 96/108/111 110/132/111 111/134/111
f 89/109/112 103/133/112 104/119/112
f 108/128/113 124/135/113 109/130/113
f 102/131/114 116/136/114 117/137/114
f 110/132/115 124/135/115 125/138/115
f 102/131/116 118/139/116 103/133/116
f 110/132/117 126/140/117 111/134/117
f 104/119/118 118/139/118 119/141/118
f 111/134/119 127/142/119 112/118/119
f 105/120/120 119/141/120 120/143/120
f 98/122/121 82/144/121 113/145/121
f 308/146/122 112/118/122 127/142/122
f 105/120/123 121/147/123 106/124/123
f 99/125/124 113/145/124 114/148/124
f 106/124/125 122/149/125 107/126/125
f 99/125/126 115/150/126 100/127/126
f 108/128/127 122/149/127 123/151/127
f 100/127/128 116/136/128 101/129/128
f 113/145/129 82/152/129 128/153/129
f 308/154/130 127/142/130 142/155/130
f 120/143/131 136/156/131 121/147/131
f 113/145/132 129/157/132 114/148/132
f 121/147/133 137/158/133 122/149/133
f 114/148/134 130/159/134 115/150/134
f 123/151/135 137/158/135 138/160/135
f 115/150/136 131/161/136 116/136/136
f 123/151/137 139/162/137 124/135/137
f 117/137/138 131/161/138 132/163/138
f 125/138/139 139/162/139 140/164/139
f 117/137/140 133/165/140 118/139/140
f 125/138/141 141/166/141 126/140/141
f 119/141/142 133/165/142 134/167/142
f 126/140/143 142/155/143 127/142/143
f 120/143/144 134/167/144 135/168/144
f 132/163/145 146/169/145 147/170/145
f 140/164/146 154/171/146 155/172/146
f 132/163/147 148/173/147 133/165/147
f 140/164/148 156/174/148 141/166/148
f 134/167/149 148/173/149 149/175/149
f 142/155/150 156/174/150 157/176/150
f 135/168/151 149/175/151 150/177/151
f 128/153/152 82/178/152 143/179/152
f 308/180/153 142/155/153 157/176/153
f 135/168/154 151/181/154 136/156/154
f 128/153/155 144/182/155 129/157/155
f 136/156/156 152/183/156 137/158/156
f 130/159/157 144/182/157 145/184/157
f 138/160/158 152/183/158 153/185/158
f 130/159/159 146/169/159 131/161/159
f 138/160/160 154/171/160 139/162/160
f 150/177/161 166/186/161 151/181/161
f 143/179/162 159/187/162 144/182/162
f 151/181/163 167/188/163 152/183/163
f 145/184/164 159/187/164 160/189/164
f 153/185/165 167/188/165 168/190/165
f 145/184/166 161/191/166 146/169/166
f 153/185/167 169/192/167 154/171/167
f 147/170/168 161/191/168 162/193/168
f 155/172/169 169/192/169 170/194/169
f 147/170/170 163/195/170 148/173/170
f 155/172/171 171/196/171 156/174/171
f 149/175/172 163/195/172 164/197/172
f 156/174/173 172/198/173 157/176/173
f 150/177/174 164/197/174 165/199/174
f 143/179/175 82/200/175 158/201/175
f 308/202/176 157/176/176 172/198/176
f 169/192/177 185/203/177 170/194/177
f 162/193/178 178/204/178 163/195/178
f 171/196/179 185/203/179 186/205/179
f 164/197/180 178/204/180 179/206/180
f 172/198/181 186/205/181 187/207/181
f 165/199/182 179/206/182 180/208/182
f 158/201/183 82/209/183 173/210/183
f 308/211/184 172/198/184 187/207/184
f 165/199/185 181/212/185 166/186/185
f 158/201/186 174/213/186 159/187/186
f 166/186/187 182/214/187 167/188/187
f 159/187/188 175/215/188 160/189/188
f 168/190/189 182/214/189 183/216/189
f 160/189/190 176/217/190 161/191/190
f 168/190/191 184/218/191 169/192/191
f 162/193/192 176/217/192 177/219/192
f 174/213/193 188/220/193 189/221/193
f 181/212/194 197/222/194 182/214/194
f 175/215/195 189/221/195 190/223/195
f 183/216/196 197/222/196 198/224/196
f 175/215/197 191/225/197 176/217/197
f 183/216/198 199/226/198 184/218/198
f 177/219/199 191/225/199 192/227/199
f 185/203/200 199/226/200 200/228/200
f 177/219/201 193/229/201 178/204/201
f 185/203/202 201/230/202 186/205/202
f 179/206/203 193/229/203 194/231/203
f 186/205/204 202/232/204 187/207/204
f 180/208/205 194/231/205 195/233/205
f 173/210/206 82/234/206 188/220/206
f 308/235/207 187/207/207 202/232/207
f 180/208/208 196/236/208 181/212/208
f 192/227/209 208/237/209 193/229/209
f 200/228/210 216/238/210 201/230/210
f 194/231/211 208/237/211 209/239/211
f 201/230/212 217/240/212 202/232/212
f 195/233/213 209/239/213 210/241/213
f 188/220/214 82/242/214 203/243/214
f 308/244/215 202/232/215 217/240/215
f 195/233/216 211/245/216 196/236/216
f 188/220/217 204/246/217 189/221/217
f 196/236/218 212/247/218 197/222/218
f 189/221/219 205/248/219 190/223/219
f 198/224/220 212/247/220 213/249/220
f 190/223/221 206/250/221 191/225/221
f 198/224/222 214/251/222 199/226/222
f 192/227/223 206/250/223 207/252/223
f 200/228/224 214/251/224 215/253/224
f 211/245/225 227/254/225 212/247/225
f 204/246/226 220/255/226 205/248/226
f 213/249/227 227/254/227 228/256/227
f 205/248/228 221/257/228 206/250/228
f 213/249/229 229/258/229 214/251/229
f 207/252/230 221/257/230 222/259/230
f 215/253/231 229/258/231 230/260/231
f 207/252/232 223/261/232 208/237/232
f 215/253/233 231/262/233 216/238/233
f 209/239/234 223/261/234 224/263/234
f 216/238/235 232/264/235 217/240/235
f 210/241/236 224/263/236 225/265/236
f 203/243/237 82/266/237 218/267/237
f 308/268/238 217/240/238 232/264/238
f 210/241/239 226/269/239 211/245/239
f 203/243/240 219/270/240 204/246/240
f 230/260/241 246/271/241 231/262/241
f 224/263/242 238/272/242 239/273/242
f 232/264/243 246/271/243 247/274/243
f 225/265/244 239/273/244 240/275/244
f 218/267/245 82/276/245 233/277/245
f 308/278/246 232/264/246 247/274/246
f 225/265/247 241/279/247 226/269/247
f 218/267/248 234/280/248 219/270/248
f 226/269/249 242/281/249 227/254/249
f 220/255/250 234/280/250 235/282/250
f 228/256/251 242/281/251 243/283/251
f 220/255/252 236/284/252 221/257/252
f 228/256/253 244/285/253 229/258/253
f 222/259/254 236/284/254 237/286/254
f 230/260/255 244/285/255 245/287/255
f 222/259/256 238/272/256 223/261/256
f 243/283/257 257/288/257 258/289/257
f 235/282/258 251/290/258 236/284/258
f 243/283/259 259/291/259 244/285/259
f 237/286/260 251/290/260 252/292/260
f 245/287/261 259/291/261 260/293/261
f 237/286/262 253/294/262 238/272/262
f 245/287/263 261/295/263 246/271/263
f 239/273/264 253/294/264 254/296/264
f 246/271/265 262/297/265 247/274/265
f 240/275/266 254/296/266 255/298/266
f 233/277/267 82/299/267 248/300/267
f 308/301/268 247/274/268 262/297/268
f 240/275/269 256/302/269 241/279/269
f 233/277/270 249/303/270 234/280/270
f 241/279/271 257/288/271 242/281/271
f 234/280/272 250/304/272 235/282/272
f 261/295/273 277/305/273 262/297/273
f 255/298/274 269/306/274 270/307/274
f 248/300/275 82/308/275 263/309/275
f 308/310/276 262/297/276 277/305/276
f 255/298/277 271/311/277 256/302/277
f 249/303/278 263/309/278 264/312/278
f 256/302/279 272/313/279 257/288/279
f 249/303/280 265/314/280 250/304/280
f 258/289/281 272/313/281 273/315/281
f 250/304/282 266/316/282 251/290/282
f 258/289/283 274/317/283 259/291/283
f 252/292/284 266/316/284 267/318/284
f 260/293/285 274/317/285 275/319/285
f 252/292/286 268/320/286 253/294/286
f 260/293/287 276/321/287 261/295/287
f 254/296/288 268/320/288 269/306/288
f 265/314/289 281/322/289 266/316/289
f 273/315/290 289/323/290 274/317/290
f 267/318/291 281/322/291 282/324/291
f 275/319/292 289/323/292 290/325/292
f 267/318/293 283/326/293 268/320/293
f 275/319/294 291/327/294 276/321/294
f 269/306/295 283/326/295 284/328/295
f 277/305/296 291/327/296 292/329/296
f 270/307/297 284/328/297 285/330/297
f 263/309/298 82/331/298 278/332/298
f 308/333/299 277/305/299 292/329/299
f 270/307/300 286/334/300 271/311/300
f 264/312/301 278/332/301 279/335/301
f 271/311/302 287/336/302 272/313/302
f 264/312/303 280/337/303 265/314/303
f 273/315/304 287/336/304 288/338/304
f 285/330/305 299/339/305 300/340/305
f 278/332/306 82/341/306 293/342/306
f 308/343/307 292/329/307 307/344/307
f 285/330/308 301/345/308 286/334/308
f 278/332/309 294/346/309 279/335/309
f 286/334/310 302/347/310 287/336/310
f 279/335/311 295/348/311 280/337/311
f 288/338/312 302/347/312 303/349/312
f 280/337/313 296/350/313 281/322/313
f 288/338/314 304/351/314 289/323/314
f 282/324/315 296/350/315 297/352/315
f 290/325/316 304/351/316 305/353/316
f 282/324/317 298/354/317 283/326/317
f 290/325/318 306/355/318 291/327/318
f 284/328/319 298/354/319 299/339/319
f 292/329/320 306/355/320 307/344/320
f 303/349/321 320/356/321 304/351/321
f 297/352/322 312/357/322 313/358/322
f 305/353/323 320/356/323 321/359/323
f 297/352/324 314/360/324 298/354/324
f 305/353/325 322/361/325 306/355/325
f 299/339/326 314/360/326 315/362/326
f 306/355/327 323/363/327 307/344/327
f 300/340/328 315/362/328 316/364/328
f 293/342/329 82/365/329 309/366/329
f 308/367/330 307/344/330 323/363/330
f 300/340/331 317/368/331 301/345/331
f 293/342/332 310/369/332 294/346/332
f 301/345/333 318/370/333 302/347/333
f 294/346/334 311/371/334 295/348/334
f 303/349/335 318/370/335 319/372/335
f 295/348/336 312/357/336 296/350/336
f 309/366/337 82/373/337 324/374/337
f 308/375/338 323/363/338 338/376/338
f 316/364/339 332/377/339 317/368/339
f 309/366/340 325/378/340 310/369/340
f 317/368/341 333/379/341 318/370/341
f 310/369/342 326/380/342 311/371/342
f 319/372/343 333/379/343 334/381/343
f 311/371/344 327/382/344 312/357/344
f 319/372/345 335/383/345 320/356/345
f 312/357/346 328/384/346 313/358/346
f 321/359/347 335/383/347 336/385/347
f 313/358/348 329/386/348 314/360/348
f 321/359/349 337/387/349 322/361/349
f 315/362/350 329/386/350 330/388/350
f 322/361/351 338/376/351 323/363/351
f 316/364/352 330/388/352 331/389/352
f 328/384/353 342/390/353 343/391/353
f 336/385/354 350/392/354 351/393/354
f 328/384/355 344/394/355 329/386/355
f 336/385/356 352/395/356 337/387/356
f 330/388/357 344/394/357 345/396/357
f 337/387/358 353/397/358 338/376/358
f 331/389/359 345/396/359 346/398/359
f 324/374/360 82/399/360 339/400/360
f 308/401/361 338/376/361 353/397/361
f 331/389/362 347/402/362 332/377/362
f 324/374/363 340/403/363 325/378/363
f 332/377/364 348/404/364 333/379/364
f 325/378/365 341/405/365 326/380/365
f 334/381/366 348/404/366 349/406/366
f 326/380/367 342/390/367 327/382/367
f 334/381/368 350/392/368 335/383/368
f 346/398/369 362/407/369 347/402/369
f 339/400/370 355/408/370 340/403/370
f 347/402/371 363/409/371 348/404/371
f 341/405/372 355/408/372 356/410/372
f 349/406/373 363/409/373 364/411/373
f 341/405/374 357/412/374 342/390/374
f 349/406/375 365/413/375 350/392/375
f 343/391/376 357/412/376 358/414/376
f 351/393/377 365/413/377 366/415/377
f 343/391/378 359/416/378 344/394/378
f 351/393/379 367/417/379 352/395/379
f 345/396/380 359/416/380 360/418/380
f 352/395/381 368/419/381 353/397/381
f 346/398/382 360/418/382 361/420/382
f 339/400/383 82/421/383 354/422/383
f 308/423/384 353/397/384 368/419/384
f 366/424/385 380/425/385 381/426/385
f 358/427/386 374/428/386 359/429/386
f 366/424/387 382/430/387 367/431/387
f 360/432/388 374/428/388 375/433/388
f 367/431/389 383/434/389 368/435/389
f 361/436/390 375/433/390 376/437/390
f 354/438/391 82/439/391 369/440/391
f 308/441/392 368/435/392 383/434/392
f 361/436/393 377/442/393 362/443/393
f 354/438/394 370/444/394 355/445/394
f 362/443/395 378/446/395 363/447/395
f 356/448/396 370/444/396 371/449/396
f 364/450/397 378/446/397 379/451/397
f 356/448/398 372/452/398 357/453/398
f 364/450/399 380/425/399 365/454/399
f 358/427/400 372/452/400 373/455/400
f 377/442/401 393/456/401 378/446/401
f 371/449/402 385/457/402 386/458/402
f 379/451/403 393/456/403 394/459/403
f 371/449/404 387/460/404 372/452/404
f 379/451/405 395/461/405 380/425/405
f 373/455/406 387/460/406 388/462/406
f 381/426/407 395/461/407 396/463/407
f 373/455/408 389/464/408 374/428/408
f 381/426/409 397/465/409 382/430/409
f 375/433/410 389/464/410 390/466/410
f 382/430/411 398/467/411 383/434/411
f 376/437/412 390/466/412 391/468/412
f 369/440/413 82/469/413 384/470/413
f 308/471/414 383/434/414 398/467/414
f 376/437/415 392/472/415 377/442/415
f 369/440/416 385/457/416 370/444/416
f 397/465/417 411/473/417 412/474/417
f 390/466/418 404/475/418 405/476/418
f 397/465/419 413/477/419 398/467/419
f 391/468/420 405/476/420 406/478/420
f 384/470/421 82/479/421 399/480/421
f 308/481/422 398/467/422 413/477/422
f 391/468/423 407/482/423 392/472/423
f 385/457/424 399/480/424 400/483/424
f 392/472/425 408/484/425 393/456/425
f 385/457/426 401/485/426 386/458/426
f 394/459/427 408/484/427 409/486/427
f 386/458/428 402/487/428 387/460/428
f 394/459/429 410/488/429 395/461/429
f 388/462/430 402/487/430 403/489/430
f 396/463/431 410/488/431 411/473/431
f 388/462/432 404/475/432 389/464/432
f 401/485/433 415/490/433 416/491/433
f 409/486/434 423/492/434 424/493/434
f 401/485/435 417/494/435 402/487/435
f 409/486/436 425/495/436 410/488/436
f 403/489/437 417/494/437 418/496/437
f 411/473/438 425/495/438 426/497/438
f 403/489/439 419/498/439 404/475/439
f 411/473/440 427/499/440 412/474/440
f 405/476/441 419/498/441 420/500/441
f 412/474/442 428/501/442 413/477/442
f 406/478/443 420/500/443 421/502/443
f 399/480/444 82/503/444 414/504/444
f 308/505/445 413/477/445 428/501/445
f 406/478/446 422/506/446 407/482/446
f 399/480/447 415/490/447 400/483/447
f 407/482/448 423/492/448 408/484/448
f 420/500/449 434/507/449 435/508/449
f 427/499/450 443/509/450 428/501/450
f 421/502/451 435/508/451 436/510/451
f 414/504/452 82/511/452 429/512/452
f 308/513/453 428/501/453 443/509/453
f 421/502/454 437/514/454 422/506/454
f 414/504/455 430/515/455 415/490/455
f 422/506/456 438/516/456 423/492/456
f 416/491/457 430/515/457 431/517/457
f 424/493/458 438/516/458 439/518/458
f 416/491/459 432/519/459 417/494/459
f 424/493/460 440/520/460 425/495/460
f 418/496/461 432/519/461 433/521/461
f 426/497/462 440/520/462 441/522/462
f 418/496/463 434/507/463 419/498/463
f 426/497/464 442/523/464 427/499/464
f 439/518/465 453/524/465 454/525/465
f 431/517/466 447/526/466 432/519/466
f 439/518/467 455/527/467 440/520/467
f 433/521/468 447/526/468 448/528/468
f 441/522/469 455/527/469 456/529/469
f 433/521/470 449/530/470 434/507/470
f 441/522/471 457/531/471 442/523/471
f 435/508/472 449/530/472 450/532/472
f 442/523/473 458/533/473 443/509/473
f 436/510/474 450/532/474 451/534/474
f 429/512/475 82/535/475 444/536/475
f 308/537/476 443/509/476 458/533/476
f 436/510/477 452/538/477 437/514/477
f 430/515/478 444/536/478 445/539/478
f 437/514/479 453/524/479 438/516/479
f 430/515/480 446/540/480 431/517/480
f 458/533/481 472/541/481 473/542/481
f 451/534/482 465/543/482 466/544/482
f 444/536/483 82/545/483 459/546/483
f 308/547/484 458/533/484 473/542/484
f 451/534/485 467/548/485 452/538/485
f 444/536/486 460/549/486 445/539/486
f 452/538/487 468/550/487 453/524/487
f 446/540/488 460/549/488 461/551/488
f 454/525/489 468/550/489 469/552/489
f 446/540/490 462/553/490 447/526/490
f 454/525/491 470/554/491 455/527/491
f 448/528/492 462/553/492 463/555/492
f 456/529/493 470/554/493 471/556/493
f 448/528/494 464/557/494 449/530/494
f 456/529/495 472/541/495 457/531/495
f 450/532/496 464/557/496 465/543/496
f 462/553/497 1/32/497 476/4/497
f 469/552/498 479/8/498 470/554/498
f 463/555/499 476/4/499 2/9/499
f 471/556/500 479/8/500 480/12/500
f 464/557/501 2/9/501 477/13/501
f 471/556/502 481/16/502 472/541/502
f 465/543/503 477/13/503 3/17/503
f 472/541/504 482/20/504 473/542/504
f 466/544/505 3/17/505 4/22/505
f 459/546/506 82/558/506 474/23/506
f 308/559/507 473/542/507 482/20/507
f 466/544/508 5/28/508 467/548/508
f 460/549/509 474/23/509 475/29/509
f 467/548/510 478/1/510 468/550/510
f 461/551/511 475/29/511 1/32/511
f 468/550/512 6/3/512 469/552/512
f 478/1/1 16/31/1 17/2/1
f 476/4/2 1/32/2 9/5/2
f 6/3/3 17/2/3 18/7/3
f 2/9/4 476/4/4 10/6/4
f 479/8/5 18/7/5 19/11/5
f 477/13/6 2/9/6 11/10/6
f 480/12/513 19/11/513 20/15/513
f 3/17/8 477/13/8 12/14/8
f 481/16/9 20/15/9 21/19/9
f 3/17/10 13/18/10 14/21/10
f 4/22/13 14/21/13 15/27/13
f 475/29/14 474/23/14 7/25/14
f 5/28/15 15/27/15 16/31/15
f 1/32/16 475/29/16 8/30/16
f 15/27/17 30/48/17 31/33/17
f 8/30/18 23/49/18 24/34/18
f 16/31/19 31/33/19 32/35/19
f 9/5/20 24/34/20 25/36/20
f 17/2/21 32/35/21 33/37/21
f 10/6/22 25/36/22 26/38/22
f 19/11/23 18/7/23 33/37/23
f 11/10/24 26/38/24 27/40/24
f 20/15/25 19/11/25 34/39/25
f 13/18/26 12/14/26 27/40/26
f 20/15/27 35/41/27 36/43/27
f 13/18/28 28/42/28 29/44/28
f 14/21/31 29/44/31 30/48/31
f 7/25/32 22/46/32 23/49/32
f 34/39/33 49/65/33 50/50/33
f 27/40/34 42/66/34 43/51/34
f 35/41/35 50/50/35 51/52/35
f 28/42/36 43/51/36 44/53/36
f 30/48/39 29/44/39 44/53/39
f 22/46/40 37/55/40 38/58/40
f 30/48/41 45/57/41 46/59/41
f 23/49/42 38/58/42 39/60/42
f 32/35/43 31/33/43 46/59/43
f 24/34/44 39/60/44 40/62/44
f 33/37/45 32/35/45 47/61/45
f 25/36/46 40/62/46 41/64/46
f 34/39/47 33/37/47 48/63/47
f 26/38/48 41/64/48 42/66/48
f 38/58/49 53/83/49 54/67/49
f 47/61/50 46/59/50 61/68/50
f 39/60/51 54/67/51 55/70/51
f 47/61/52 62/69/52 63/71/52
f 40/62/53 55/70/53 56/72/53
f 48/63/54 63/71/54 64/73/54
f 41/64/55 56/72/55 57/74/55
f 49/65/56 64/73/56 65/75/56
f 43/51/57 42/66/57 57/74/57
f 50/50/58 65/75/58 66/77/58
f 44/53/59 43/51/59 58/76/59
f 44/53/62 59/78/62 60/82/62
f 37/55/63 52/80/63 53/83/63
f 45/57/64 60/82/64 61/68/64
f 58/76/65 57/74/65 72/84/65
f 65/75/66 80/100/66 81/86/66
f 59/78/67 58/76/67 73/85/67
f 59/78/70 74/87/70 75/91/70
f 52/80/71 67/89/71 68/92/71
f 60/82/72 75/91/72 76/93/72
f 53/83/73 68/92/73 69/94/73
f 62/69/74 61/68/74 76/93/74
f 54/67/75 69/94/75 70/96/75
f 62/69/76 77/95/76 78/97/76
f 56/72/77 55/70/77 70/96/77
f 63/71/78 78/97/78 79/99/78
f 56/72/79 71/98/79 72/84/79
f 65/75/80 64/73/80 79/99/80
f 77/95/81 76/93/81 92/101/81
f 69/94/82 85/117/82 86/103/82
f 77/95/83 93/102/83 94/104/83
f 71/98/84 70/96/84 86/103/84
f 78/97/85 94/104/85 95/106/85
f 71/98/86 87/105/86 88/107/86
f 79/99/87 95/106/87 96/108/87
f 73/85/88 72/84/88 88/107/88
f 81/86/89 80/100/89 96/108/89
f 74/87/90 73/85/90 89/109/90
f 74/87/93 90/111/93 91/115/93
f 67/89/94 83/113/94 84/116/94
f 75/91/95 91/115/95 92/101/95
f 68/92/96 84/116/96 85/117/96
f 96/108/97 111/134/97 112/118/97
f 90/111/98 89/109/98 104/119/98
f 90/111/101 105/120/101 106/124/101
f 84/116/102 83/113/102 98/122/102
f 91/115/103 106/124/103 107/126/103
f 84/116/104 99/125/104 100/127/104
f 93/102/105 92/101/105 107/126/105
f 85/117/106 100/127/106 101/129/106
f 93/102/107 108/128/107 109/130/107
f 87/105/108 86/103/108 101/129/108
f 95/106/109 94/104/109 109/130/109
f 87/105/110 102/131/110 103/133/110
f 96/108/111 95/106/111 110/132/111
f 89/109/112 88/107/112 103/133/112
f 108/128/113 123/151/113 124/135/113
f 102/131/114 101/129/114 116/136/114
f 110/132/115 109/130/115 124/135/115
f 102/131/116 117/137/116 118/139/116
f 110/132/117 125/138/117 126/140/117
f 104/119/118 103/133/118 118/139/118
f 111/134/119 126/140/119 127/142/119
f 105/120/120 104/119/120 119/141/120
f 105/120/123 120/143/123 121/147/123
f 99/125/124 98/122/124 113/145/124
f 106/124/125 121/147/125 122/149/125
f 99/125/126 114/148/126 115/150/126
f 108/128/127 107/126/127 122/149/127
f 100/127/128 115/150/128 116/136/128
f 120/143/131 135/168/131 136/156/131
f 113/145/132 128/153/132 129/157/132
f 121/147/133 136/156/133 137/158/133
f 114/148/134 129/157/134 130/159/134
f 123/151/135 122/149/135 137/158/135
f 115/150/136 130/159/136 131/161/136
f 123/151/137 138/160/137 139/162/137
f 117/137/138 116/136/138 131/161/138
f 125/138/139 124/135/139 139/162/139
f 117/137/140 132/163/140 133/165/140
f 125/138/141 140/164/141 141/166/141
f 119/141/142 118/139/142 133/165/142
f 126/140/143 141/166/143 142/155/143
f 120/143/144 119/141/144 134/167/144
f 132/163/145 131/161/145 146/169/145
f 140/164/146 139/162/146 154/171/146
f 132/163/147 147/170/147 148/173/147
f 140/164/148 155/172/148 156/174/148
f 134/167/149 133/165/149 148/173/149
f 142/155/150 141/166/150 156/174/150
f 135/168/151 134/167/151 149/175/151
f 135/168/154 150/177/154 151/181/154
f 128/153/155 143/179/155 144/182/155
f 136/156/156 151/181/156 152/183/156
f 130/159/157 129/157/157 144/182/157
f 138/160/158 137/158/158 152/183/158
f 130/159/159 145/184/159 146/169/159
f 138/160/160 153/185/160 154/171/160
f 150/177/161 165/199/161 166/186/161
f 143/179/162 158/201/162 159/187/162
f 151/181/163 166/186/163 167/188/163
f 145/184/164 144/182/164 159/187/164
f 153/185/165 152/183/165 167/188/165
f 145/184/166 160/189/166 161/191/166
f 153/185/167 168/190/167 169/192/167
f 147/170/168 146/169/168 161/191/168
f 155/172/169 154/171/169 169/192/169
f 147/170/170 162/193/170 163/195/170
f 155/172/171 170/194/171 171/196/171
f 149/175/172 148/173/172 163/195/172
f 156/174/173 171/196/173 172/198/173
f 150/177/174 149/175/174 164/197/174
f 169/192/177 184/218/177 185/203/177
f 162/193/178 177/219/178 178/204/178
f 171/196/179 170/194/179 185/203/179
f 164/197/180 163/195/180 178/204/180
f 172/198/181 171/196/181 186/205/181
f 165/199/182 164/197/182 179/206/182
f 165/199/185 180/208/185 181/212/185
f 158/201/186 173/210/186 174/213/186
f 166/186/187 181/212/187 182/214/187
f 159/187/188 174/213/188 175/215/188
f 168/190/189 167/188/189 182/214/189
f 160/189/190 175/215/190 176/217/190
f 168/190/191 183/216/191 184/218/191
f 162/193/192 161/191/192 176/217/192
f 174/213/193 173/210/193 188/220/193
f 181/212/194 196/236/194 197/222/194
f 175/215/195 174/213/195 189/221/195
f 183/216/196 182/214/196 197/222/196
f 175/215/197 190/223/197 191/225/197
f 183/216/198 198/224/198 199/226/198
f 177/219/199 176/217/199 191/225/199
f 185/203/200 184/218/200 199/226/200
f 177/219/201 192/227/201 193/229/201
f 185/203/202 200/228/202 201/230/202
f 179/206/203 178/204/203 193/229/203
f 186/205/204 201/230/204 202/232/204
f 180/208/205 179/206/205 194/231/205
f 180/208/208 195/233/208 196/236/208
f 192/227/209 207/252/209 208/237/209
f 200/228/210 215/253/210 216/238/210
f 194/231/211 193/229/211 208/237/211
f 201/230/212 216/238/212 217/240/212
f 195/233/213 194/231/213 209/239/213
f 195/233/216 210/241/216 211/245/216
f 188/220/217 203/243/217 204/246/217
f 196/236/218 211/245/218 212/247/218
f 189/221/219 204/246/219 205/248/219
f 198/224/220 197/222/220 212/247/220
f 190/223/221 205/248/221 206/250/221
f 198/224/222 213/249/222 214/251/222
f 192/227/223 191/225/223 206/250/223
f 200/228/224 199/226/224 214/251/224
f 211/245/225 226/269/225 227/254/225
f 204/246/226 219/270/226 220/255/226
f 213/249/227 212/247/227 227/254/227
f 205/248/228 220/255/228 221/257/228
f 213/249/229 228/256/229 229/258/229
f 207/252/230 206/250/230 221/257/230
f 215/253/231 214/251/231 229/258/231
f 207/252/232 222/259/232 223/261/232
f 215/253/233 230/260/233 231/262/233
f 209/239/234 208/237/234 223/261/234
f 216/238/235 231/262/235 232/264/235
f 210/241/236 209/239/236 224/263/236
f 210/241/239 225/265/239 226/269/239
f 203/243/240 218/267/240 219/270/240
f 230/260/241 245/287/241 246/271/241
f 224/263/242 223/261/242 238/272/242
f 232/264/243 231/262/243 246/271/243
f 225/265/244 224/263/244 239/273/244
f 225/265/247 240/275/247 241/279/247
f 218/267/248 233/277/248 234/280/248
f 226/269/249 241/279/249 242/281/249
f 220/255/250 219/270/250 234/280/250
f 228/256/251 227/254/251 242/281/251
f 220/255/252 235/282/252 236/284/252
f 228/256/253 243/283/253 244/285/253
f 222/259/254 221/257/254 236/284/254
f 230/260/255 229/258/255 244/285/255
f 222/259/256 237/286/256 238/272/256
f 243/283/257 242/281/257 257/288/257
f 235/282/258 250/304/258 251/290/258
f 243/283/259 258/289/259 259/291/259
f 237/286/260 236/284/260 251/290/260
f 245/287/261 244/285/261 259/291/261
f 237/286/262 252/292/262 253/294/262
f 245/287/263 260/293/263 261/295/263
f 239/273/264 238/272/264 253/294/264
f 246/271/265 261/295/265 262/297/265
f 240/275/266 239/273/266 254/296/266
f 240/275/269 255/298/269 256/302/269
f 233/277/270 248/300/270 249/303/270
f 241/279/271 256/302/271 257/288/271
f 234/280/272 249/303/272 250/304/272
f 261/295/273 276/321/273 277/305/273
f 255/298/274 254/296/274 269/306/274
f 255/298/277 270/307/277 271/311/277
f 249/303/278 248/300/278 263/309/278
f 256/302/279 271/311/279 272/313/279
f 249/303/280 264/312/280 265/314/280
f 258/289/281 257/288/281 272/313/281
f 250/304/282 265/314/282 266/316/282
f 258/289/283 273/315/283 274/317/283
f 252/292/284 251/290/284 266/316/284
f 260/293/285 259/291/285 274/317/285
f 252/292/286 267/318/286 268/320/286
f 260/293/287 275/319/287 276/321/287
f 254/296/288 253/294/288 268/320/288
f 265/314/289 280/337/289 281/322/289
f 273/315/290 288/338/290 289/323/290
f 267/318/291 266/316/291 281/322/291
f 275/319/292 274/317/292 289/323/292
f 267/318/293 282/324/293 283/326/293
f 275/319/294 290/325/294 291/327/294
f 269/306/295 268/320/295 283/326/295
f 277/305/296 276/321/296 291/327/296
f 270/307/297 269/306/297 284/328/297
f 270/307/300 285/330/300 286/334/300
f 264/312/301 263/309/301 278/332/301
f 271/311/302 286/334/302 287/336/302
f 264/312/303 279/335/303 280/337/303
f 273/315/304 272/313/304 287/336/304
f 285/330/305 284/328/305 299/339/305
f 285/330/308 300/340/308 301/345/308
f 278/332/309 293/342/309 294/346/309
f 286/334/310 301/345/310 302/347/310
f 279/335/311 294/346/311 295/348/311
f 288/338/312 287/336/312 302/347/312
f 280/337/313 295/348/313 296/350/313
f 288/338/314 303/349/314 304/351/314
f 282/324/315 281/322/315 296/350/315
f 290/325/316 289/323/316 304/351/316
f 282/324/317 297/352/317 298/354/317
f 290/325/318 305/353/318 306/355/318
f 284/328/319 283/326/319 298/354/319
f 292/329/320 291/327/320 306/355/320
f 303/349/321 319/372/321 320/356/321
f 297/352/322 296/350/322 312/357/322
f 305/353/323 304/351/323 320/356/323
f 297/352/324 313/358/324 314/360/324
f 305/353/514 321/359/514 322/361/514
f 299/339/326 298/354/326 314/360/326
f 306/355/327 322/361/327 323/363/327
f 300/340/328 299/339/328 315/362/328
f 300/340/331 316/364/331 317/368/331
f 293/342/332 309/366/332 310/369/332
f 301/345/333 317/368/333 318/370/333
f 294/346/515 310/369/515 311/371/515
f 303/349/335 302/347/335 318/370/335
f 295/348/336 311/371/336 312/357/336
f 316/364/339 331/389/339 332/377/339
f 309/366/340 324/374/340 325/378/340
f 317/368/341 332/377/341 333/379/341
f 310/369/342 325/378/342 326/380/342
f 319/372/343 318/370/343 333/379/343
f 311/371/344 326/380/344 327/382/344
f 319/372/345 334/381/345 335/383/345
f 312/357/346 327/382/346 328/384/346
f 321/359/347 320/356/347 335/383/347
f 313/358/348 328/384/348 329/386/348
f 321/359/349 336/385/349 337/387/349
f 315/362/350 314/360/350 329/386/350
f 322/361/351 337/387/351 338/376/351
f 316/364/352 315/362/352 330/388/352
f 328/384/353 327/382/353 342/390/353
f 336/385/354 335/383/354 350/392/354
f 328/384/355 343/391/355 344/394/355
f 336/385/356 351/393/356 352/395/356
f 330/388/357 329/386/357 344/394/357
f 337/387/358 352/395/358 353/397/358
f 331/389/359 330/388/359 345/396/359
f 331/389/362 346/398/362 347/402/362
f 324/374/363 339/400/363 340/403/363
f 332/377/364 347/402/364 348/404/364
f 325/378/365 340/403/365 341/405/365
f 334/381/366 333/379/366 348/404/366
f 326/380/367 341/405/367 342/390/367
f 334/381/368 349/406/368 350/392/368
f 346/398/369 361/420/369 362/407/369
f 339/400/370 354/422/370 355/408/370
f 347/402/371 362/407/371 363/409/371
f 341/405/372 340/403/372 355/408/372
f 349/406/373 348/404/373 363/409/373
f 341/405/374 356/410/374 357/412/374
f 349/406/375 364/411/375 365/413/375
f 343/391/376 342/390/376 357/412/376
f 351/393/377 350/392/377 365/413/377
f 343/391/378 358/414/378 359/416/378
f 351/393/379 366/415/379 367/417/379
f 345/396/380 344/394/380 359/416/380
f 352/395/381 367/417/381 368/419/381
f 346/398/382 345/396/382 360/418/382
f 366/424/385 365/454/385 380/425/385
f 358/427/386 373/455/386 374/428/386
f 366/424/387 381/426/387 382/430/387
f 360/432/388 359/429/388 374/428/388
f 367/431/389 382/430/389 383/434/389
f 361/436/390 360/432/390 375/433/390
f 361/436/393 376/437/393 377/442/393
f 354/438/394 369/440/394 370/444/394
f 362/443/395 377/442/395 378/446/395
f 356/448/396 355/445/396 370/444/396
f 364/450/397 363/447/397 378/446/397
f 356/448/398 371/449/398 372/452/398
f 364/450/399 379/451/399 380/425/399
f 358/427/400 357/453/400 372/452/400
f 377/442/401 392/472/401 393/456/401
f 371/449/402 370/444/402 385/457/402
f 379/451/403 378/446/403 393/456/403
f 371/449/404 386/458/404 387/460/404
f 379/451/405 394/459/405 395/461/405
f 373/455/406 372/452/406 387/460/406
f 381/426/407 380/425/407 395/461/407
f 373/455/408 388/462/408 389/464/408
f 381/426/409 396/463/409 397/465/409
f 375/433/410 374/428/410 389/464/410
f 382/430/411 397/465/411 398/467/411
f 376/437/412 375/433/412 390/466/412
f 376/437/415 391/468/415 392/472/415
f 369/440/416 384/470/416 385/457/416
f 397/465/417 396/463/417 411/473/417
f 390/466/418 389/464/418 404/475/418
f 397/465/419 412/474/419 413/477/419
f 391/468/420 390/466/420 405/476/420
f 391/468/423 406/478/423 407/482/423
f 385/457/424 384/470/424 399/480/424
f 392/472/425 407/482/425 408/484/425
f 385/457/426 400/483/426 401/485/426
f 394/459/427 393/456/427 408/484/427
f 386/458/428 401/485/428 402/487/428
f 394/459/429 409/486/429 410/488/429
f 388/462/430 387/460/430 402/487/430
f 396/463/431 395/461/431 410/488/431
f 388/462/432 403/489/432 404/475/432
f 401/485/516 400/483/516 415/490/516
f 409/486/434 408/484/434 423/492/434
f 401/485/435 416/491/435 417/494/435
f 409/486/436 424/493/436 425/495/436
f 403/489/437 402/487/437 417/494/437
f 411/473/438 410/488/438 425/495/438
f 403/489/439 418/496/439 419/498/439
f 411/473/517 426/497/517 427/499/517
f 405/476/441 404/475/441 419/498/441
f 412/474/442 427/499/442 428/501/442
f 406/478/443 405/476/443 420/500/443
f 406/478/446 421/502/446 422/506/446
f 399/480/447 414/504/447 415/490/447
f 407/482/448 422/506/448 423/492/448
f 420/500/449 419/498/449 434/507/449
f 427/499/450 442/523/450 443/509/450
f 421/502/451 420/500/451 435/508/451
f 421/502/454 436/510/454 437/514/454
f 414/504/455 429/512/455 430/515/455
f 422/506/456 437/514/456 438/516/456
f 416/491/457 415/490/457 430/515/457
f 424/493/458 423/492/458 438/516/458
f 416/491/459 431/517/459 432/519/459
f 424/493/460 439/518/460 440/520/460
f 418/496/461 417/494/461 432/519/461
f 426/497/462 425/495/462 440/520/462
f 418/496/463 433/521/463 434/507/463
f 426/497/464 441/522/464 442/523/464
f 439/518/465 438/516/465 453/524/465
f 431/517/466 446/540/466 447/526/466
f 439/518/467 454/525/467 455/527/467
f 433/521/468 432/519/468 447/526/468
f 441/522/469 440/520/469 455/527/469
f 433/521/470 448/528/470 449/530/470
f 441/522/471 456/529/471 457/531/471
f 435/508/472 434/507/472 449/530/472
f 442/523/473 457/531/473 458/533/473
f 436/510/474 435/508/474 450/532/474
f 436/510/477 451/534/477 452/538/477
f 430/515/478 429/512/478 444/536/478
f 437/514/479 452/538/479 453/524/479
f 430/515/480 445/539/480 446/540/480
f 458/533/481 457/531/481 472/541/481
f 451/534/482 450/532/482 465/543/482
f 451/534/485 466/544/485 467/548/485
f 444/536/486 459/546/486 460/549/486
f 452/538/487 467/548/487 468/550/487
f 446/540/488 445/539/488 460/549/488
f 454/525/489 453/524/489 468/550/489
f 446/540/490 461/551/490 462/553/490
f 454/525/491 469/552/491 470/554/491
f 448/528/492 447/526/492 462/553/492
f 456/529/493 455/527/493 470/554/493
f 448/528/494 463/555/494 464/557/494
f 456/529/495 471/556/495 472/541/495
f 450/532/496 449/530/496 464/557/496
f 462/553/497 461/551/497 1/32/497
f 469/552/498 6/3/498 479/8/498
f 463/555/499 462/553/499 476/4/499
f 471/556/500 470/554/500 479/8/500
f 464/557/501 463/555/501 2/9/501
f 471/556/502 480/12/502 481/16/502
f 465/543/503 464/557/503 477/13/503
f 472/541/504 481/16/504 482/20/504
f 466/544/505 465/543/505 3/17/505
f 466/544/508 4/22/508 5/28/508
f 460/549/509 459/546/509 474/23/509
f 467/548/510 5/28/510 478/1/510
f 461/551/511 460/549/511 475/29/511
f 468/550/512 478/1/512 6/3/512
o Sphere.001
v -0.076854 0.140504 -0.154382
v -0.076854 0.044016 -0.226220
v -0.076854 -0.082051 -0.265099
v -0.076854 -0.150279 -0.270102
v -0.076854 -0.218506 -0.265099
v -0.076854 -0.344573 -0.226220
v -0.066588 0.192723 -0.059544
v -0.056716 0.172822 -0.107451
v -0.047619 0.140504 -0.151602
v -0.039645 0.097012 -0.190301
v -0.033100 0.044016 -0.222060
v -0.028237 -0.016446 -0.245660
v -0.025243 -0.082051 -0.260192
v -0.024232 -0.150279 -0.265099
v -0.025243 -0.218506 -0.260192
v -0.028237 -0.284111 -0.245660
v -0.033100 -0.344573 -0.222060
v -0.039645 -0.397569 -0.190301
v -0.047619 -0.441061 -0.151602
v -0.056716 -0.473379 -0.107451
v -0.066588 -0.493280 -0.059544
v -0.056716 0.192723 -0.056653
v -0.037353 0.172822 -0.101781
v -0.019507 0.140504 -0.143370
v -0.003865 0.097012 -0.179824
v 0.008972 0.044016 -0.209740
v 0.018511 -0.016446 -0.231970
v 0.024385 -0.082051 -0.245660
v 0.026368 -0.150279 -0.250282
v 0.024385 -0.218506 -0.245660
v 0.018511 -0.284111 -0.231970
v 0.008972 -0.344573 -0.209740
v -0.003865 -0.397569 -0.179824
v -0.019507 -0.441061 -0.143370
v -0.037353 -0.473379 -0.101781
v -0.056716 -0.493280 -0.056653
v -0.047619 0.192723 -0.051959
v -0.019507 0.172822 -0.092573
v 0.006401 0.140504 -0.130002
v 0.029110 0.097012 -0.162810
v 0.047746 0.044016 -0.189734
v 0.061595 -0.016446 -0.209740
v 0.070122 -0.082051 -0.222060
v 0.073002 -0.150279 -0.226220
v 0.070122 -0.218506 -0.222060
v 0.061595 -0.284111 -0.209740
v 0.047746 -0.344573 -0.189734
v 0.029110 -0.397569 -0.162810
v 0.006401 -0.441061 -0.130002
v -0.019507 -0.473379 -0.092573
v -0.047619 -0.493280 -0.051959
v -0.039645 0.192723 -0.045642
v -0.003865 0.172822 -0.080181
v 0.029110 0.140504 -0.112012
v 0.058013 0.097012 -0.139912
v 0.081732 0.044016 -0.162810
v 0.099358 -0.016446 -0.179824
v 0.110211 -0.082051 -0.190301
v 0.113876 -0.150279 -0.193839
v 0.110211 -0.218506 -0.190301
v 0.099358 -0.284111 -0.179824
v 0.081732 -0.344573 -0.162810
v 0.058013 -0.397569 -0.139912
v 0.029110 -0.441061 -0.112012
v -0.003865 -0.473379 -0.080181
v -0.039645 -0.493280 -0.045642
v -0.033100 0.192723 -0.037944
v 0.008972 0.172822 -0.065081
v 0.047746 0.140504 -0.090091
v 0.081732 0.097012 -0.112012
v 0.109624 0.044016 -0.130002
v 0.130349 -0.016446 -0.143370
v 0.143112 -0.082051 -0.151602
v 0.147421 -0.150279 -0.154382
v 0.143112 -0.218506 -0.151602
v 0.130349 -0.284111 -0.143370
v 0.109624 -0.344573 -0.130002
v 0.081732 -0.397569 -0.112012
v 0.047746 -0.441061 -0.090091
v 0.008972 -0.473379 -0.065081
v -0.033100 -0.493280 -0.037944
v -0.076854 0.199443 -0.009722
v -0.028237 0.192723 -0.029162
v 0.018511 0.172822 -0.047854
v 0.061595 0.140504 -0.065081
v 0.099358 0.097012 -0.080181
v 0.130349 0.044016 -0.092573
v 0.153378 -0.016446 -0.101781
v 0.167559 -0.082051 -0.107451
v 0.172347 -0.150279 -0.109365
v 0.167559 -0.218506 -0.107451
v 0.153378 -0.284111 -0.101781
v 0.130349 -0.344573 -0.092573
v 0.099358 -0.397569 -0.080181
v 0.061595 -0.441061 -0.065081
v 0.018511 -0.473379 -0.047854
v -0.028237 -0.493280 -0.029162
v -0.025243 0.192723 -0.019633
v 0.024385 0.172822 -0.029162
v 0.070122 0.140504 -0.037944
v 0.110211 0.097012 -0.045642
v 0.143112 0.044016 -0.051959
v 0.167559 -0.016446 -0.056653
v 0.182613 -0.082051 -0.059544
v 0.187697 -0.150279 -0.060520
v 0.182613 -0.218506 -0.059544
v 0.167559 -0.284111 -0.056653
v 0.143112 -0.344573 -0.051959
v 0.110211 -0.397569 -0.045642
v 0.070122 -0.441061 -0.037944
v 0.024385 -0.473379 -0.029162
v -0.025243 -0.493280 -0.019633
v -0.024232 0.192723 -0.009722
v 0.026368 0.172822 -0.009722
v 0.073002 0.140504 -0.009722
v 0.113876 0.097012 -0.009722
v 0.147421 0.044016 -0.009722
v 0.172347 -0.016446 -0.009722
v 0.187697 -0.082051 -0.009722
v 0.192879 -0.150279 -0.009722
v 0.187697 -0.218506 -0.009722
v 0.172347 -0.284111 -0.009722
v 0.147421 -0.344573 -0.009722
v 0.113876 -0.397569 -0.009722
v 0.073002 -0.441061 -0.009722
v 0.026368 -0.473379 -0.009722
v -0.024232 -0.493280 -0.009722
v -0.025243 0.192723 0.000188
v 0.024385 0.172822 0.009717
v 0.070122 0.140504 0.018499
v 0.110211 0.097012 0.026197
v 0.143112 0.044016 0.032514
v 0.167559 -0.016446 0.037208
v 0.182613 -0.082051 0.040099
v 0.187696 -0.150279 0.041075
v 0.182613 -0.218506 0.040099
v 0.167559 -0.284111 0.037208
v 0.143112 -0.344573 0.032514
v 0.110211 -0.397569 0.026197
v 0.070122 -0.441061 0.018499
v 0.024385 -0.473379 0.009717
v -0.025243 -0.493280 0.000188
v -0.028237 0.192723 0.009717
v 0.018511 0.172822 0.028409
v 0.061595 0.140504 0.045636
v 0.099358 0.097012 0.060736
v 0.130349 0.044016 0.073128
v 0.153378 -0.016446 0.082336
v 0.167559 -0.082051 0.088006
v 0.172347 -0.150279 0.089921
v 0.167559 -0.218506 0.088006
v 0.153378 -0.284111 0.082336
v 0.130349 -0.344573 0.073128
v 0.099358 -0.397569 0.060736
v 0.061595 -0.441061 0.045636
v 0.018511 -0.473379 0.028409
v -0.028237 -0.493280 0.009717
v -0.033100 0.192723 0.018499
v 0.008972 0.172822 0.045636
v 0.047746 0.140504 0.070646
v 0.081732 0.097012 0.092567
v 0.109624 0.044016 0.110557
v 0.130349 -0.016446 0.123925
v 0.143112 -0.082051 0.132157
v 0.147421 -0.150279 0.134937
v 0.143112 -0.218506 0.132157
v 0.130349 -0.284111 0.123925
v 0.109624 -0.344573 0.110557
v 0.081732 -0.397569 0.092567
v 0.047746 -0.441061 0.070646
v 0.008972 -0.473379 0.045636
v -0.033100 -0.493280 0.018499
v -0.039645 0.192723 0.026197
v -0.003865 0.172822 0.060736
v 0.029110 0.140504 0.092567
v 0.058013 0.097012 0.120467
v 0.081732 0.044016 0.143365
v 0.099358 -0.016446 0.160379
v 0.110211 -0.082051 0.170856
v 0.113876 -0.150279 0.174394
v 0.110211 -0.218506 0.170856
v 0.099358 -0.284111 0.160379
v 0.081732 -0.344573 0.143365
v 0.058013 -0.397569 0.120467
v 0.029110 -0.441061 0.092567
v -0.003865 -0.473379 0.060736
v -0.039645 -0.493280 0.026197
v -0.047619 0.192723 0.032514
v -0.019507 0.172822 0.073128
v 0.006401 0.140504 0.110557
v 0.029110 0.097012 0.143365
v 0.047746 0.044016 0.170289
v 0.061595 -0.016446 0.190295
v 0.070122 -0.082051 0.202615
v 0.073002 -0.150279 0.206775
v 0.070122 -0.218506 0.202615
v 0.061595 -0.284111 0.190295
v 0.047746 -0.344573 0.170289
v 0.029110 -0.397569 0.143365
v 0.006401 -0.441061 0.110557
v -0.019507 -0.473379 0.073128
v -0.047619 -0.493280 0.032514
v -0.056717 0.192723 0.037208
v -0.037353 0.172822 0.082336
v -0.019507 0.140504 0.123925
v -0.003865 0.097012 0.160379
v 0.008972 0.044016 0.190295
v 0.018511 -0.016446 0.212526
v 0.024385 -0.082051 0.226215
v 0.026368 -0.150279 0.230837
v 0.024385 -0.218506 0.226215
v 0.018511 -0.284111 0.212526
v 0.008972 -0.344573 0.190295
v -0.003865 -0.397569 0.160379
v -0.019507 -0.441061 0.123925
v -0.037353 -0.473379 0.082336
v -0.056717 -0.493280 0.037208
v -0.066588 0.192723 0.040099
v -0.056717 0.172822 0.088006
v -0.047619 0.140504 0.132157
v -0.039645 0.097012 0.170856
v -0.033100 0.044016 0.202615
v -0.028237 -0.016446 0.226215
v -0.025243 -0.082051 0.240747
v -0.024232 -0.150279 0.245654
v -0.025243 -0.218506 0.240747
v -0.028237 -0.284111 0.226215
v -0.033100 -0.344573 0.202615
v -0.039645 -0.397569 0.170856
v -0.047619 -0.441061 0.132157
v -0.056717 -0.473379 0.088006
v -0.066588 -0.493280 0.040099
v -0.076854 0.192723 0.041075
v -0.076854 0.172822 0.089921
v -0.076854 0.140504 0.134937
v -0.076854 0.097012 0.174394
v -0.076854 0.044016 0.206775
v -0.076854 -0.016446 0.230837
v -0.076854 -0.082051 0.245654
v -0.076854 -0.150279 0.250657
v -0.076854 -0.218506 0.245654
v -0.076854 -0.284111 0.230837
v -0.076854 -0.344573 0.206775
v -0.076854 -0.397569 0.174394
v -0.076854 -0.441061 0.134937
v -0.076854 -0.473379 0.089921
v -0.076854 -0.493280 0.041075
v -0.087120 0.192723 0.040099
v -0.096992 0.172822 0.088006
v -0.106090 0.140504 0.132157
v -0.114064 0.097012 0.170856
v -0.120608 0.044016 0.202615
v -0.125471 -0.016446 0.226215
v -0.128466 -0.082051 0.240747
v -0.129477 -0.150279 0.245654
v -0.128466 -0.218506 0.240747
v -0.125471 -0.284111 0.226215
v -0.120608 -0.344573 0.202615
v -0.114064 -0.397569 0.170856
v -0.106090 -0.441061 0.132157
v -0.096992 -0.473379 0.088006
v -0.087120 -0.493280 0.040099
v -0.096992 0.192723 0.037208
v -0.116356 0.172822 0.082336
v -0.134202 0.140504 0.123925
v -0.149844 0.097012 0.160379
v -0.162681 0.044016 0.190295
v -0.172219 -0.016446 0.212526
v -0.178093 -0.082051 0.226215
v -0.180077 -0.150279 0.230837
v -0.178093 -0.218506 0.226215
v -0.172219 -0.284111 0.212526
v -0.162681 -0.344573 0.190295
v -0.149844 -0.397569 0.160379
v -0.134202 -0.441061 0.123925
v -0.116356 -0.473379 0.082336
v -0.096992 -0.493280 0.037208
v -0.106090 0.192723 0.032514
v -0.134202 0.172822 0.073128
v -0.160110 0.140504 0.110557
v -0.182818 0.097012 0.143365
v -0.201455 0.044016 0.170289
v -0.215303 -0.016446 0.190295
v -0.223831 -0.082051 0.202615
v -0.226710 -0.150279 0.206775
v -0.223831 -0.218506 0.202615
v -0.215303 -0.284111 0.190295
v -0.201455 -0.344573 0.170289
v -0.182818 -0.397569 0.143365
v -0.160110 -0.441061 0.110557
v -0.134202 -0.473379 0.073128
v -0.106090 -0.493280 0.032514
v -0.114064 0.192723 0.026197
v -0.149844 0.172822 0.060736
v -0.182818 0.140504 0.092567
v -0.211721 0.097012 0.120467
v -0.235441 0.044016 0.143365
v -0.253066 -0.016446 0.160379
v -0.263920 -0.082051 0.170856
v -0.267585 -0.150279 0.174394
v -0.263920 -0.218506 0.170856
v -0.253066 -0.284111 0.160379
v -0.235441 -0.344573 0.143365
v -0.211721 -0.397569 0.120467
v -0.182818 -0.441061 0.092567
v -0.149844 -0.473379 0.060736
v -0.114064 -0.493280 0.026197
v -0.076854 -0.500000 -0.009722
v -0.120608 0.192723 0.018499
v -0.162681 0.172822 0.045636
v -0.201455 0.140504 0.070646
v -0.235441 0.097012 0.092567
v -0.263332 0.044016 0.110557
v -0.284058 -0.016446 0.123925
v -0.296820 -0.082051 0.132157
v -0.301129 -0.150279 0.134937
v -0.296820 -0.218506 0.132157
v -0.284058 -0.284111 0.123925
v -0.263332 -0.344573 0.110557
v -0.235441 -0.397569 0.092567
v -0.201455 -0.441061 0.070646
v -0.162681 -0.473379 0.045636
v -0.120608 -0.493280 0.018499
v -0.125471 0.192723 0.009717
v -0.172219 0.172822 0.028409
v -0.215303 0.140504 0.045636
v -0.253066 0.097012 0.060736
v -0.284058 0.044016 0.073128
v -0.307086 -0.016446 0.082336
v -0.321267 -0.082051 0.088006
v -0.326055 -0.150279 0.089920
v -0.321267 -0.218506 0.088006
v -0.307086 -0.284111 0.082336
v -0.284058 -0.344573 0.073128
v -0.253066 -0.397569 0.060736
v -0.215303 -0.441061 0.045636
v -0.172219 -0.473379 0.028409
v -0.125471 -0.493280 0.009717
v -0.128465 0.192723 0.000188
v -0.178093 0.172822 0.009717
v -0.223831 0.140504 0.018499
v -0.263920 0.097012 0.026197
v -0.296820 0.044016 0.032514
v -0.321267 -0.016446 0.037208
v -0.336322 -0.082051 0.040099
v -0.341405 -0.150279 0.041075
v -0.336322 -0.218506 0.040099
v -0.321267 -0.284111 0.037208
v -0.296820 -0.344573 0.032514
v -0.263920 -0.397569 0.026197
v -0.223831 -0.441061 0.018499
v -0.178093 -0.473379 0.009717
v -0.128465 -0.493280 0.000188
v -0.129477 0.192723 -0.009722
v -0.180077 0.172822 -0.009722
v -0.226710 0.140504 -0.009722
v -0.267585 0.097012 -0.009722
v -0.301129 0.044016 -0.009722
v -0.326056 -0.016446 -0.009722
v -0.341405 -0.082051 -0.009723
v -0.346588 -0.150279 -0.009723
v -0.341405 -0.218506 -0.009723
v -0.326056 -0.284111 -0.009722
v -0.301129 -0.344573 -0.009722
v -0.267585 -0.397569 -0.009722
v -0.226710 -0.441061 -0.009722
v -0.180077 -0.473379 -0.009722
v -0.129477 -0.493280 -0.009722
v -0.128465 0.192723 -0.019633
v -0.178093 0.172822 -0.029162
v -0.223831 0.140504 -0.037944
v -0.263920 0.097012 -0.045642
v -0.296820 0.044016 -0.051959
v -0.321267 -0.016446 -0.056653
v -0.336322 -0.082051 -0.059544
v -0.341405 -0.150279 -0.060520
v -0.336322 -0.218506 -0.059544
v -0.321267 -0.284111 -0.056653
v -0.296820 -0.344573 -0.051959
v -0.263920 -0.397569 -0.045642
v -0.223831 -0.441061 -0.037944
v -0.178093 -0.473379 -0.029162
v -0.128465 -0.493280 -0.019633
v -0.125471 0.192723 -0.029162
v -0.172219 0.172822 -0.047854
v -0.215303 0.140504 -0.065081
v -0.253066 0.097012 -0.080181
v -0.284057 0.044016 -0.092573
v -0.307086 -0.016446 -0.101781
v -0.321267 -0.082051 -0.107451
v -0.326055 -0.150279 -0.109365
v -0.321267 -0.218506 -0.107451
v -0.307086 -0.284111 -0.101781
v -0.284057 -0.344573 -0.092573
v -0.253066 -0.397569 -0.080181
v -0.215303 -0.441061 -0.065081
v -0.172219 -0.473379 -0.047854
v -0.125471 -0.493280 -0.029162
v -0.120608 0.192723 -0.037944
v -0.162681 0.172822 -0.065081
v -0.201455 0.140504 -0.090091
v -0.235441 0.097012 -0.112012
v -0.263332 0.044016 -0.130002
v -0.284058 -0.016446 -0.143370
v -0.296820 -0.082051 -0.151602
v -0.301129 -0.150279 -0.154382
v -0.296820 -0.218506 -0.151602
v -0.284058 -0.284111 -0.143370
v -0.263332 -0.344573 -0.130002
v -0.235441 -0.397569 -0.112012
v -0.201455 -0.441061 -0.090091
v -0.162681 -0.473379 -0.065081
v -0.120608 -0.493280 -0.037944
v -0.114064 0.192723 -0.045642
v -0.149844 0.172822 -0.080181
v -0.182818 0.140504 -0.112012
v -0.211721 0.097012 -0.139912
v -0.235441 0.044016 -0.162809
v -0.253066 -0.016446 -0.179824
v -0.263920 -0.082051 -0.190301
v -0.267584 -0.150279 -0.193839
v -0.263920 -0.218506 -0.190301
v -0.253066 -0.284111 -0.179824
v -0.235441 -0.344573 -0.162809
v -0.211721 -0.397569 -0.139912
v -0.182818 -0.441061 -0.112012
v -0.149844 -0.473379 -0.080181
v -0.114064 -0.493280 -0.045642
v -0.106090 0.192723 -0.051959
v -0.134202 0.172822 -0.092573
v -0.160110 0.140504 -0.130002
v -0.182818 0.097012 -0.162810
v -0.201455 0.044016 -0.189734
v -0.215303 -0.016446 -0.209740
v -0.223831 -0.082051 -0.222060
v -0.226710 -0.150279 -0.226220
v -0.223831 -0.218506 -0.222060
v -0.215303 -0.284111 -0.209740
v -0.201455 -0.344573 -0.189734
v -0.182818 -0.397569 -0.162810
v -0.160110 -0.441061 -0.130002
v -0.134202 -0.473379 -0.092573
v -0.106090 -0.493280 -0.051959
v -0.096992 0.192723 -0.056653
v -0.116356 0.172822 -0.101781
v -0.134202 0.140504 -0.143370
v -0.149844 0.097012 -0.179824
v -0.162681 0.044016 -0.209740
v -0.172219 -0.016446 -0.231970
v -0.178093 -0.082051 -0.245660
v -0.180077 -0.150279 -0.250282
v -0.178093 -0.218506 -0.245660
v -0.172219 -0.284111 -0.231970
v -0.162681 -0.344573 -0.209740
v -0.149844 -0.397569 -0.179824
v -0.134202 -0.441061 -0.143370
v -0.116356 -0.473379 -0.101781
v -0.096992 -0.493280 -0.056653
v -0.087120 0.192723 -0.059544
v -0.096992 0.172822 -0.107451
v -0.106090 0.140504 -0.151602
v -0.114064 0.097012 -0.190301
v -0.120608 0.044016 -0.222060
v -0.125471 -0.016446 -0.245660
v -0.128465 -0.082051 -0.260192
v -0.129476 -0.150279 -0.265099
v -0.128465 -0.218506 -0.260192
v -0.125471 -0.284111 -0.245660
v -0.120608 -0.344573 -0.222060
v -0.114064 -0.397569 -0.190301
v -0.106090 -0.441061 -0.151602
v -0.096992 -0.473379 -0.107451
v -0.087120 -0.493280 -0.059544
v -0.076854 0.192723 -0.060520
v -0.076854 0.172822 -0.109365
v -0.076854 0.097012 -0.193839
v -0.076854 -0.016446 -0.250282
v -0.076854 -0.284111 -0.250282
v -0.076854 -0.397569 -0.193839
v -0.076854 -0.441061 -0.154381
v -0.076854 -0.473379 -0.109365
v -0.076854 -0.493280 -0.060520
vn 0.0923 -0.2194 -0.9713
vn 0.0554 0.8111 -0.5823
vn 0.0880 -0.3683 -0.9255
vn 0.0702 0.6703 -0.7388
vn 0.0809 -0.5197 -0.8505
vn 0.0809 0.5197 -0.8505
vn 0.0702 -0.6703 -0.7388
vn 0.0880 0.3683 -0.9255
vn 0.0554 -0.8111 -0.5823
vn 0.0923 0.2194 -0.9713
vn 0.0358 -0.9255 -0.3771
vn 0.0944 0.0728 -0.9929
vn 0.0125 0.9913 -0.1311
vn 0.0125 -0.9913 -0.1311
vn 0.0944 -0.0728 -0.9929
vn 0.0358 0.9255 -0.3771
vn 0.2803 -0.0730 -0.9571
vn 0.1062 0.9258 -0.3627
vn 0.2741 -0.2199 -0.9362
vn 0.1641 0.8118 -0.5604
vn 0.2612 -0.3691 -0.8919
vn 0.2083 0.6712 -0.7114
vn 0.2399 -0.5207 -0.8194
vn 0.2399 0.5207 -0.8194
vn 0.2083 -0.6712 -0.7114
vn 0.2612 0.3691 -0.8919
vn 0.1641 -0.8118 -0.5604
vn 0.2741 0.2199 -0.9362
vn 0.1062 -0.9258 -0.3627
vn 0.2803 0.0730 -0.9571
vn 0.0369 0.9913 -0.1261
vn 0.0369 -0.9913 -0.1261
vn 0.3392 -0.6729 -0.6573
vn 0.4259 0.3707 -0.8254
vn 0.2669 -0.8131 -0.5173
vn 0.4472 0.2209 -0.8667
vn 0.1726 -0.9265 -0.3345
vn 0.4573 0.0733 -0.8863
vn 0.0600 0.9914 -0.1162
vn 0.0600 -0.9914 -0.1162
vn 0.4573 -0.0733 -0.8863
vn 0.1726 0.9265 -0.3345
vn 0.4472 -0.2209 -0.8667
vn 0.2669 0.8131 -0.5173
vn 0.4259 -0.3707 -0.8254
vn 0.3392 0.6729 -0.6573
vn 0.3910 -0.5225 -0.7577
vn 0.3910 0.5225 -0.7577
vn 0.2325 0.9273 -0.2935
vn 0.6054 -0.2222 -0.7642
vn 0.3600 0.8148 -0.4544
vn 0.5762 -0.3727 -0.7274
vn 0.4580 0.6753 -0.5781
vn 0.5286 -0.5248 -0.6672
vn 0.5286 0.5248 -0.6672
vn 0.4580 -0.6753 -0.5781
vn 0.5762 0.3727 -0.7274
vn 0.3600 -0.8148 -0.4544
vn 0.6054 0.2222 -0.7642
vn 0.2325 -0.9273 -0.2935
vn 0.6193 0.0738 -0.7817
vn 0.0807 0.9915 -0.1019
vn 0.0807 -0.9915 -0.1019
vn 0.6193 -0.0738 -0.7817
vn 0.7063 0.3749 -0.6005
vn 0.4396 -0.8167 -0.3738
vn 0.7426 0.2237 -0.6313
vn 0.2835 -0.9282 -0.2411
vn 0.7598 0.0743 -0.6459
vn 0.0984 0.9916 -0.0836
vn 0.0984 -0.9916 -0.0836
vn 0.7598 -0.0743 -0.6459
vn 0.2836 0.9282 -0.2411
vn 0.7426 -0.2237 -0.6313
vn 0.4396 0.8167 -0.3738
vn 0.7063 -0.3749 -0.6005
vn 0.5602 0.6778 -0.4762
vn 0.6473 -0.5275 -0.5503
vn 0.6473 0.5275 -0.5503
vn 0.5602 -0.6778 -0.4762
vn 0.8524 -0.2250 -0.4720
vn 0.5027 0.8185 -0.2783
vn 0.8103 -0.3770 -0.4487
vn 0.6413 0.6801 -0.3551
vn 0.7419 -0.5299 -0.4108
vn 0.7419 0.5299 -0.4108
vn 0.6413 -0.6801 -0.3551
vn 0.8103 0.3770 -0.4487
vn 0.5027 -0.8185 -0.2783
vn 0.8524 0.2250 -0.4720
vn 0.3238 -0.9290 -0.1793
vn 0.8724 0.0748 -0.4831
vn 0.1122 0.9917 -0.0621
vn 0.1122 -0.9917 -0.0621
vn 0.8724 -0.0748 -0.4831
vn 0.3238 0.9290 -0.1793
vn 0.5463 -0.8198 -0.1717
vn 0.9293 0.2261 -0.2920
vn 0.3516 -0.9296 -0.1105
vn 0.9513 0.0752 -0.2989
vn 0.1218 0.9918 -0.0383
vn 0.1218 -0.9918 -0.0383
vn 0.9513 -0.0752 -0.2989
vn 0.3516 0.9296 -0.1105
vn 0.9293 -0.2261 -0.2920
vn 0.5463 0.8198 -0.1717
vn 0.8830 -0.3786 -0.2775
vn 0.6977 0.6820 -0.2193
vn 0.8079 -0.5318 -0.2539
vn 0.8079 0.5318 -0.2539
vn 0.6977 -0.6820 -0.2193
vn 0.8830 0.3786 -0.2775
vn 0.9204 -0.3795 -0.0939
vn 0.7267 0.6830 -0.0741
vn 0.8418 -0.5329 -0.0859
vn 0.8418 0.5329 -0.0859
vn 0.7267 -0.6830 -0.0741
vn 0.9204 0.3795 -0.0939
vn 0.5686 -0.8205 -0.0580
vn 0.9689 0.2267 -0.0989
vn 0.3658 -0.9300 -0.0373
vn 0.9920 0.0754 -0.1012
vn 0.1267 0.9919 -0.0129
vn 0.1267 -0.9919 -0.0129
vn 0.9920 -0.0754 -0.1012
vn 0.3658 0.9300 -0.0373
vn 0.9689 -0.2267 -0.0989
vn 0.5686 0.8205 -0.0580
vn 0.3658 -0.9300 0.0373
vn 0.9920 0.0754 0.1012
vn 0.1267 0.9919 0.0129
vn 0.1267 -0.9919 0.0129
vn 0.9920 -0.0754 0.1012
vn 0.3658 0.9300 0.0373
vn 0.9689 -0.2267 0.0989
vn 0.5686 0.8205 0.0580
vn 0.9204 -0.3795 0.0939
vn 0.7267 0.6830 0.0741
vn 0.8418 -0.5329 0.0859
vn 0.8418 0.5329 0.0859
vn 0.7267 -0.6830 0.0741
vn 0.9204 0.3795 0.0939
vn 0.5686 -0.8205 0.0580
vn 0.9689 0.2267 0.0989
vn 0.6978 0.6820 0.2193
vn 0.8079 -0.5318 0.2539
vn 0.8079 0.5318 0.2539
vn 0.6977 -0.6820 0.2193
vn 0.8830 0.3786 0.2775
vn 0.5463 -0.8198 0.1717
vn 0.9293 0.2261 0.2920
vn 0.3516 -0.9296 0.1105
vn 0.9513 0.0752 0.2989
vn 0.1218 0.9918 0.0383
vn 0.1218 -0.9918 0.0383
vn 0.9513 -0.0752 0.2989
vn 0.3516 0.9296 0.1105
vn 0.9293 -0.2261 0.2920
vn 0.5463 0.8198 0.1717
vn 0.8830 -0.3786 0.2775
vn 0.8724 0.0748 0.4831
vn 0.1122 0.9917 0.0621
vn 0.1122 -0.9917 0.0621
vn 0.8724 -0.0748 0.4831
vn 0.3238 0.9290 0.1793
vn 0.8524 -0.2250 0.4720
vn 0.5027 0.8185 0.2783
vn 0.8103 -0.3770 0.4487
vn 0.6413 0.6801 0.3551
vn 0.7419 -0.5299 0.4108
vn 0.7419 0.5299 0.4108
vn 0.6413 -0.6801 0.3551
vn 0.8103 0.3770 0.4487
vn 0.5027 -0.8185 0.2783
vn 0.8524 0.2250 0.4720
vn 0.3238 -0.9290 0.1793
vn 0.6473 -0.5275 0.5503
vn 0.6473 0.5275 0.5503
vn 0.5602 -0.6778 0.4762
vn 0.7063 0.3749 0.6005
vn 0.4396 -0.8167 0.3738
vn 0.7426 0.2237 0.6313
vn 0.2836 -0.9282 0.2411
vn 0.7598 0.0743 0.6459
vn 0.0984 0.9916 0.0836
vn 0.0984 -0.9916 0.0836
vn 0.7598 -0.0743 0.6459
vn 0.2836 0.9282 0.2411
vn 0.7426 -0.2237 0.6313
vn 0.4396 0.8167 0.3738
vn 0.7063 -0.3749 0.6005
vn 0.5602 0.6778 0.4762
vn 0.0807 0.9915 0.1019
vn 0.0807 -0.9915 0.1019
vn 0.6193 -0.0738 0.7817
vn 0.2325 0.9273 0.2935
vn 0.6054 -0.2222 0.7642
vn 0.3600 0.8148 0.4544
vn 0.5762 -0.3727 0.7274
vn 0.4580 0.6753 0.5781
vn 0.5286 -0.5248 0.6672
vn 0.5286 0.5248 0.6672
vn 0.4580 -0.6753 0.5781
vn 0.5762 0.3727 0.7274
vn 0.3600 -0.8148 0.4544
vn 0.6054 0.2222 0.7642
vn 0.2325 -0.9273 0.2935
vn 0.6193 0.0738 0.7817
vn 0.3910 0.5225 0.7577
vn 0.3392 -0.6729 0.6573
vn 0.4259 0.3707 0.8254
vn 0.2669 -0.8131 0.5173
vn 0.4472 0.2209 0.8667
vn 0.1726 -0.9265 0.3345
vn 0.4573 0.0733 0.8863
vn 0.0600 0.9914 0.1162
vn 0.0600 -0.9914 0.1162
vn 0.4573 -0.0733 0.8863
vn 0.1726 0.9265 0.3345
vn 0.4472 -0.2209 0.8667
vn 0.2669 0.8131 0.5173
vn 0.4259 -0.3707 0.8254
vn 0.3392 0.6729 0.6573
vn 0.3910 -0.5225 0.7577
vn 0.2803 -0.0730 0.9571
vn 0.1062 0.9258 0.3627
vn 0.2741 -0.2199 0.9362
vn 0.1641 0.8118 0.5604
vn 0.2612 -0.3691 0.8919
vn 0.2083 0.6712 0.7114
vn 0.2399 -0.5207 0.8194
vn 0.2399 0.5207 0.8194
vn 0.2083 -0.6712 0.7114
vn 0.2612 0.3691 0.8919
vn 0.1641 -0.8118 0.5604
vn 0.2741 0.2199 0.9362
vn 0.1062 -0.9258 0.3627
vn 0.2803 0.0730 0.9571
vn 0.0369 0.9913 0.1261
vn 0.0369 -0.9913 0.1261
vn 0.0702 -0.6703 0.7388
vn 0.0880 0.3683 0.9255
vn 0.0554 -0.8111 0.5823
vn 0.0923 0.2194 0.9713
vn 0.0359 -0.9255 0.3771
vn 0.0944 0.0728 0.9929
vn 0.0125 0.9913 0.1311
vn 0.0125 -0.9913 0.1311
vn 0.0944 -0.0728 0.9929
vn 0.0359 0.9255 0.3771
vn 0.0923 -0.2194 0.9713
vn 0.0554 0.8111 0.5823
vn 0.0880 -0.3683 0.9255
vn 0.0702 0.6703 0.7388
vn 0.0809 -0.5197 0.8505
vn 0.0809 0.5197 0.8505
vn -0.0923 -0.2194 0.9713
vn -0.0554 0.8111 0.5823
vn -0.0880 -0.3683 0.9255
vn -0.0702 0.6703 0.7388
vn -0.0809 -0.5197 0.8505
vn -0.0809 0.5197 0.8505
vn -0.0702 -0.6703 0.7388
vn -0.0880 0.3683 0.9255
vn -0.0554 -0.8111 0.5823
vn -0.0923 0.2194 0.9713
vn -0.0359 -0.9255 0.3771
vn -0.0944 0.0728 0.9929
vn -0.0125 0.9913 0.1311
vn -0.0125 -0.9913 0.1311
vn -0.0944 -0.0728 0.9929
vn -0.0359 0.9255 0.3771
vn -0.1641 -0.8118 0.5604
vn -0.2741 0.2199 0.9362
vn -0.1062 -0.9258 0.3627
vn -0.2803 0.0730 0.9571
vn -0.0369 0.9913 0.1261
vn -0.0369 -0.9913 0.1261
vn -0.2803 -0.0730 0.9571
vn -0.1062 0.9258 0.3627
vn -0.2741 -0.2199 0.9362
vn -0.1641 0.8118 0.5604
vn -0.2612 -0.3691 0.8919
vn -0.2083 0.6712 0.7114
vn -0.2399 -0.5207 0.8194
vn -0.2399 0.5207 0.8194
vn -0.2083 -0.6712 0.7114
vn -0.2612 0.3691 0.8919
vn -0.2669 0.8131 0.5173
vn -0.4259 -0.3707 0.8254
vn -0.3392 0.6729 0.6573
vn -0.3910 -0.5225 0.7577
vn -0.3910 0.5225 0.7577
vn -0.3392 -0.6729 0.6573
vn -0.4259 0.3707 0.8254
vn -0.2669 -0.8131 0.5173
vn -0.4472 0.2209 0.8667
vn -0.1726 -0.9265 0.3345
vn -0.4573 0.0733 0.8863
vn -0.0600 0.9914 0.1162
vn -0.0600 -0.9914 0.1162
vn -0.4573 -0.0733 0.8863
vn -0.1726 0.9265 0.3345
vn -0.4472 -0.2209 0.8667
vn -0.6054 0.2222 0.7642
vn -0.2325 -0.9273 0.2935
vn -0.6193 0.0738 0.7817
vn -0.0807 0.9915 0.1019
vn -0.0807 -0.9915 0.1019
vn -0.6193 -0.0738 0.7817
vn -0.2325 0.9273 0.2935
vn -0.6054 -0.2222 0.7642
vn -0.3600 0.8148 0.4544
vn -0.5762 -0.3727 0.7274
vn -0.4580 0.6753 0.5781
vn -0.5286 -0.5248 0.6672
vn -0.5286 0.5248 0.6672
vn -0.4580 -0.6753 0.5781
vn -0.5762 0.3727 0.7274
vn -0.3600 -0.8148 0.4544
vn -0.7063 -0.3749 0.6005
vn -0.5602 0.6778 0.4762
vn -0.6473 -0.5275 0.5503
vn -0.6473 0.5275 0.5503
vn -0.5602 -0.6778 0.4762
vn -0.7063 0.3749 0.6005
vn -0.4396 -0.8167 0.3738
vn -0.7426 0.2237 0.6313
vn -0.2835 -0.9282 0.2411
vn -0.7598 0.0743 0.6459
vn -0.0984 0.9916 0.0836
vn -0.0984 -0.9916 0.0836
vn -0.7598 -0.0743 0.6459
vn -0.2836 0.9282 0.2411
vn -0.7426 -0.2237 0.6313
vn -0.4396 0.8167 0.3738
vn -0.3238 -0.9290 0.1793
vn -0.8724 0.0748 0.4831
vn -0.1122 0.9917 0.0621
vn -0.1122 -0.9917 0.0621
vn -0.8724 -0.0748 0.4831
vn -0.3238 0.9290 0.1793
vn -0.8524 -0.2250 0.4720
vn -0.5027 0.8185 0.2783
vn -0.8103 -0.3770 0.4487
vn -0.6413 0.6801 0.3551
vn -0.7419 -0.5299 0.4108
vn -0.7419 0.5299 0.4108
vn -0.6413 -0.6801 0.3551
vn -0.8103 0.3770 0.4487
vn -0.5027 -0.8185 0.2783
vn -0.8524 0.2250 0.4720
vn -0.6977 0.6820 0.2193
vn -0.8079 -0.5318 0.2539
vn -0.8079 0.5318 0.2539
vn -0.6977 -0.6820 0.2193
vn -0.8830 0.3786 0.2775
vn -0.5463 -0.8198 0.1717
vn -0.9293 0.2261 0.2920
vn -0.3516 -0.9296 0.1105
vn -0.9513 0.0752 0.2989
vn -0.1218 0.9918 0.0383
vn -0.1218 -0.9918 0.0383
vn -0.9513 -0.0752 0.2989
vn -0.3516 0.9296 0.1105
vn -0.9293 -0.2261 0.2920
vn -0.5463 0.8198 0.1717
vn -0.8830 -0.3786 0.2775
vn -0.9920 0.0754 0.1012
vn -0.1267 0.9919 0.0129
vn -0.1267 -0.9919 0.0129
vn -0.9920 -0.0754 0.1012
vn -0.3658 0.9300 0.0373
vn -0.9689 -0.2267 0.0989
vn -0.5686 0.8205 0.0580
vn -0.9204 -0.3795 0.0939
vn -0.7267 0.6830 0.0741
vn -0.8418 -0.5329 0.0859
vn -0.8418 0.5329 0.0859
vn -0.7267 -0.6830 0.0741
vn -0.9204 0.3795 0.0939
vn -0.5686 -0.8205 0.0580
vn -0.9689 0.2267 0.0989
vn -0.3658 -0.9300 0.0373
vn -0.8418 -0.5329 -0.0859
vn -0.8418 0.5329 -0.0859
vn -0.7267 -0.6830 -0.0741
vn -0.9204 0.3795 -0.0939
vn -0.5686 -0.8205 -0.0580
vn -0.9689 0.2267 -0.0989
vn -0.3658 -0.9300 -0.0373
vn -0.9920 0.0754 -0.1012
vn -0.1267 0.9919 -0.0129
vn -0.1267 -0.9919 -0.0129
vn -0.9920 -0.0754 -0.1012
vn -0.3658 0.9300 -0.0373
vn -0.9689 -0.2267 -0.0989
vn -0.5686 0.8205 -0.0580
vn -0.9204 -0.3795 -0.0939
vn -0.7267 0.6830 -0.0741
vn -0.1218 -0.9918 -0.0383
vn -0.9513 -0.0752 -0.2989
vn -0.3516 0.9296 -0.1105
vn -0.9293 -0.2261 -0.2920
vn -0.5463 0.8198 -0.1717
vn -0.8830 -0.3786 -0.2775
vn -0.6977 0.6820 -0.2193
vn -0.8079 -0.5318 -0.2539
vn -0.8079 0.5318 -0.2539
vn -0.6977 -0.6820 -0.2193
vn -0.8830 0.3786 -0.2775
vn -0.5463 -0.8198 -0.1717
vn -0.9293 0.2261 -0.2920
vn -0.3516 -0.9296 -0.1105
vn -0.9513 0.0752 -0.2989
vn -0.1218 0.9918 -0.0383
vn -0.6413 -0.6801 -0.3551
vn -0.8103 0.3770 -0.4487
vn -0.5027 -0.8185 -0.2783
vn -0.8524 0.2250 -0.4720
vn -0.3238 -0.9290 -0.1793
vn -0.8724 0.0748 -0.4831
vn -0.1122 0.9917 -0.0621
vn -0.1122 -0.9917 -0.0621
vn -0.8724 -0.0748 -0.4831
vn -0.3238 0.9290 -0.1793
vn -0.8524 -0.2250 -0.4720
vn -0.5027 0.8185 -0.2783
vn -0.8103 -0.3770 -0.4487
vn -0.6413 0.6801 -0.3551
vn -0.7419 -0.5299 -0.4108
vn -0.7419 0.5299 -0.4108
vn -0.2836 0.9282 -0.2411
vn -0.7426 -0.2237 -0.6313
vn -0.4396 0.8167 -0.3738
vn -0.7063 -0.3749 -0.6005
vn -0.5602 0.6778 -0.4762
vn -0.6473 -0.5275 -0.5503
vn -0.6473 0.5275 -0.5503
vn -0.5602 -0.6778 -0.4762
vn -0.7063 0.3749 -0.6005
vn -0.4396 -0.8167 -0.3738
vn -0.7426 0.2237 -0.6313
vn -0.2836 -0.9282 -0.2411
vn -0.7598 0.0743 -0.6459
vn -0.0984 0.9916 -0.0836
vn -0.0984 -0.9916 -0.0836
vn -0.7598 -0.0743 -0.6459
vn -0.5762 0.3727 -0.7274
vn -0.3600 -0.8148 -0.4544
vn -0.6054 0.2222 -0.7642
vn -0.2325 -0.9273 -0.2935
vn -0.6193 0.0738 -0.7817
vn -0.0807 0.9915 -0.1019
vn -0.0807 -0.9915 -0.1019
vn -0.6193 -0.0738 -0.7817
vn -0.2325 0.9273 -0.2935
vn -0.6054 -0.2222 -0.7642
vn -0.3600 0.8148 -0.4544
vn -0.5762 -0.3727 -0.7274
vn -0.4580 0.6753 -0.5781
vn -0.5286 -0.5248 -0.6672
vn -0.5286 0.5248 -0.6672
vn -0.4580 -0.6753 -0.5781
vn -0.4472 -0.2209 -0.8667
vn -0.2669 0.8131 -0.5173
vn -0.4259 -0.3707 -0.8254
vn -0.3392 0.6729 -0.6573
vn -0.3910 -0.5225 -0.7577
vn -0.3910 0.5225 -0.7577
vn -0.3392 -0.6729 -0.6573
vn -0.4259 0.3707 -0.8254
vn -0.2669 -0.8131 -0.5173
vn -0.4472 0.2209 -0.8667
vn -0.1726 -0.9265 -0.3345
vn -0.4573 0.0733 -0.8863
vn -0.0600 0.9914 -0.1162
vn -0.0600 -0.9914 -0.1162
vn -0.4573 -0.0733 -0.8863
vn -0.1726 0.9265 -0.3345
vn -0.1641 -0.8118 -0.5604
vn -0.2741 0.2199 -0.9362
vn -0.1062 -0.9258 -0.3627
vn -0.2803 0.0730 -0.9571
vn -0.0369 0.9913 -0.1261
vn -0.0369 -0.9913 -0.1261
vn -0.2803 -0.0730 -0.9571
vn -0.1062 0.9258 -0.3627
vn -0.2741 -0.2199 -0.9362
vn -0.1641 0.8118 -0.5604
vn -0.2612 -0.3691 -0.8919
vn -0.2083 0.6712 -0.7114
vn -0.2399 -0.5207 -0.8194
vn -0.2399 0.5207 -0.8194
vn -0.2083 -0.6712 -0.7114
vn -0.2612 0.3691 -0.8919
vn -0.0554 0.8111 -0.5823
vn -0.0880 -0.3683 -0.9255
vn -0.0702 0.6703 -0.7388
vn -0.0809 -0.5197 -0.8505
vn -0.0809 0.5197 -0.8505
vn -0.0702 -0.6703 -0.7388
vn -0.0880 0.3683 -0.9255
vn -0.0554 -0.8111 -0.5823
vn -0.0923 0.2194 -0.9713
vn -0.0359 -0.9255 -0.3771
vn -0.0944 0.0728 -0.9929
vn -0.0125 0.9913 -0.1311
vn -0.0125 -0.9913 -0.1311
vn -0.0944 -0.0728 -0.9929
vn -0.0358 0.9255 -0.3771
vn -0.0923 -0.2194 -0.9713
vn 0.2836 -0.9282 -0.2411
vn 0.2835 0.9282 -0.2411
vn 0.6977 0.6820 0.2193
vn 0.6978 -0.6820 0.2193
vn -0.2836 -0.9282 0.2411
vn -0.0358 -0.9255 -0.3771
vn -0.0359 0.9255 -0.3771
vt 0.750000 0.437500
vt 0.718750 0.375000
vt 0.750000 0.375000
vt 0.750000 0.812500
vt 0.718750 0.875000
vt 0.718750 0.812500
vt 0.718750 0.312500
vt 0.750000 0.312500
vt 0.750000 0.750000
vt 0.718750 0.750000
vt 0.718750 0.250000
vt 0.750000 0.250000
vt 0.750000 0.687500
vt 0.718750 0.687500
vt 0.718750 0.187500
vt 0.750000 0.187500
vt 0.750000 0.625000
vt 0.718750 0.625000
vt 0.718750 0.125000
vt 0.750000 0.125000
vt 0.750000 0.562500
vt 0.718750 0.562500
vt 0.718750 0.062500
vt 0.750000 0.062500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.734375 0.000000
vt 0.718750 0.437500
vt 0.750000 0.875000
vt 0.687500 0.437500
vt 0.687500 0.875000
vt 0.687500 0.375000
vt 0.687500 0.812500
vt 0.687500 0.312500
vt 0.687500 0.750000
vt 0.687500 0.250000
vt 0.687500 0.687500
vt 0.687500 0.187500
vt 0.687500 0.625000
vt 0.687500 0.125000
vt 0.687500 0.562500
vt 0.687500 0.062500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.703125 0.000000
vt 0.656250 0.250000
vt 0.656250 0.187500
vt 0.656250 0.625000
vt 0.656250 0.125000
vt 0.656250 0.562500
vt 0.656250 0.062500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.671875 0.000000
vt 0.656250 0.437500
vt 0.656250 0.875000
vt 0.656250 0.375000
vt 0.656250 0.812500
vt 0.656250 0.312500
vt 0.656250 0.750000
vt 0.656250 0.687500
vt 0.625000 0.875000
vt 0.625000 0.375000
vt 0.625000 0.812500
vt 0.625000 0.312500
vt 0.625000 0.750000
vt 0.625000 0.250000
vt 0.625000 0.687500
vt 0.625000 0.187500
vt 0.625000 0.625000
vt 0.625000 0.125000
vt 0.625000 0.562500
vt 0.625000 0.062500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.640625 0.000000
vt 0.625000 0.437500
vt 0.593750 0.625000
vt 0.593750 0.187500
vt 0.593750 0.125000
vt 0.593750 0.562500
vt 0.593750 0.062500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.609375 0.000000
vt 0.593750 0.437500
vt 0.593750 0.875000
vt 0.593750 0.375000
vt 0.593750 0.812500
vt 0.593750 0.312500
vt 0.593750 0.750000
vt 0.593750 0.250000
vt 0.593750 0.687500
vt 0.562500 0.375000
vt 0.562500 0.812500
vt 0.562500 0.312500
vt 0.562500 0.750000
vt 0.562500 0.250000
vt 0.562500 0.687500
vt 0.562500 0.187500
vt 0.562500 0.625000
vt 0.562500 0.125000
vt 0.562500 0.562500
vt 0.562500 0.062500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.578125 0.000000
vt 0.562500 0.437500
vt 0.562500 0.875000
vt 0.531250 0.187500
vt 0.531250 0.125000
vt 0.531250 0.625000
vt 0.531250 0.562500
vt 0.531250 0.062500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.546875 0.000000
vt 0.531250 0.437500
vt 0.531250 0.875000
vt 0.531250 0.375000
vt 0.531250 0.812500
vt 0.531250 0.312500
vt 0.531250 0.750000
vt 0.531250 0.250000
vt 0.531250 0.687500
vt 0.500000 0.375000
vt 0.500000 0.312500
vt 0.500000 0.750000
vt 0.500000 0.250000
vt 0.500000 0.687500
vt 0.500000 0.187500
vt 0.500000 0.625000
vt 0.500000 0.125000
vt 0.500000 0.562500
vt 0.500000 0.062500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.515625 0.000000
vt 0.500000 0.437500
vt 0.500000 0.875000
vt 0.500000 0.812500
vt 0.468750 0.062500
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.484375 0.000000
vt 0.468750 0.437500
vt 0.468750 0.875000
vt 0.468750 0.375000
vt 0.468750 0.812500
vt 0.468750 0.312500
vt 0.468750 0.750000
vt 0.468750 0.250000
vt 0.468750 0.687500
vt 0.468750 0.187500
vt 0.468750 0.625000
vt 0.468750 0.125000
vt 0.437500 0.750000
vt 0.437500 0.250000
vt 0.437500 0.687500
vt 0.437500 0.187500
vt 0.437500 0.625000
vt 0.437500 0.125000
vt 0.437500 0.562500
vt 0.437500 0.062500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.453125 0.000000
vt 0.437500 0.437500
vt 0.437500 0.875000
vt 0.437500 0.375000
vt 0.437500 0.812500
vt 0.437500 0.312500
vt 0.406250 0.562500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.421875 0.000000
vt 0.406250 0.062500
vt 0.406250 0.437500
vt 0.406250 0.875000
vt 0.406250 0.375000
vt 0.406250 0.812500
vt 0.406250 0.312500
vt 0.406250 0.750000
vt 0.406250 0.250000
vt 0.406250 0.687500
vt 0.406250 0.187500
vt 0.406250 0.625000
vt 0.406250 0.125000
vt 0.375000 0.250000
vt 0.375000 0.750000
vt 0.375000 0.687500
vt 0.375000 0.187500
vt 0.375000 0.625000
vt 0.375000 0.125000
vt 0.375000 0.562500
vt 0.375000 0.062500
vt 0.375000 0.500000
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.390625 0.000000
vt 0.375000 0.437500
vt 0.375000 0.875000
vt 0.375000 0.375000
vt 0.375000 0.812500
vt 0.375000 0.312500
vt 0.359375 1.000000
vt 0.343750 0.937500
vt 0.359375 0.000000
vt 0.343750 0.062500
vt 0.343750 0.437500
vt 0.343750 0.875000
vt 0.343750 0.375000
vt 0.343750 0.812500
vt 0.343750 0.312500
vt 0.343750 0.750000
vt 0.343750 0.250000
vt 0.343750 0.687500
vt 0.343750 0.187500
vt 0.343750 0.625000
vt 0.343750 0.125000
vt 0.343750 0.562500
vt 0.343750 0.500000
vt 0.312500 0.750000
vt 0.312500 0.687500
vt 0.312500 0.250000
vt 0.312500 0.187500
vt 0.312500 0.625000
vt 0.312500 0.125000
vt 0.312500 0.562500
vt 0.312500 0.062500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.312500 0.937500
vt 0.328125 0.000000
vt 0.312500 0.437500
vt 0.312500 0.875000
vt 0.312500 0.375000
vt 0.312500 0.812500
vt 0.312500 0.312500
vt 0.281250 0.437500
vt 0.281250 0.875000
vt 0.281250 0.375000
vt 0.281250 0.812500
vt 0.281250 0.312500
vt 0.281250 0.750000
vt 0.281250 0.250000
vt 0.281250 0.687500
vt 0.281250 0.187500
vt 0.281250 0.625000
vt 0.281250 0.125000
vt 0.281250 0.562500
vt 0.281250 0.062500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.296875 0.000000
vt 0.250000 0.250000
vt 0.250000 0.187500
vt 0.250000 0.625000
vt 0.250000 0.125000
vt 0.250000 0.562500
vt 0.250000 0.062500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.265625 0.000000
vt 0.250000 0.437500
vt 0.250000 0.875000
vt 0.250000 0.375000
vt 0.250000 0.812500
vt 0.250000 0.312500
vt 0.250000 0.750000
vt 0.250000 0.687500
vt 0.218750 0.375000
vt 0.218750 0.812500
vt 0.218750 0.312500
vt 0.218750 0.750000
vt 0.218750 0.250000
vt 0.218750 0.687500
vt 0.218750 0.187500
vt 0.218750 0.625000
vt 0.218750 0.125000
vt 0.218750 0.562500
vt 0.218750 0.062500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.234375 0.000000
vt 0.218750 0.437500
vt 0.218750 0.875000
vt 0.187500 0.125000
vt 0.187500 0.625000
vt 0.187500 0.562500
vt 0.187500 0.062500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.203125 0.000000
vt 0.187500 0.437500
vt 0.187500 0.875000
vt 0.187500 0.375000
vt 0.187500 0.812500
vt 0.187500 0.312500
vt 0.187500 0.750000
vt 0.187500 0.250000
vt 0.187500 0.687500
vt 0.187500 0.187500
vt 0.156250 0.812500
vt 0.156250 0.375000
vt 0.156250 0.312500
vt 0.156250 0.750000
vt 0.156250 0.250000
vt 0.156250 0.687500
vt 0.156250 0.187500
vt 0.156250 0.625000
vt 0.156250 0.125000
vt 0.156250 0.562500
vt 0.156250 0.062500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.171875 0.000000
vt 0.156250 0.437500
vt 0.156250 0.875000
vt 0.125000 0.625000
vt 0.125000 0.562500
vt 0.125000 0.125000
vt 0.125000 0.062500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.140625 0.000000
vt 0.125000 0.437500
vt 0.125000 0.875000
vt 0.125000 0.375000
vt 0.125000 0.812500
vt 0.125000 0.312500
vt 0.125000 0.750000
vt 0.125000 0.250000
vt 0.125000 0.687500
vt 0.125000 0.187500
vt 0.093750 0.375000
vt 0.093750 0.312500
vt 0.093750 0.750000
vt 0.093750 0.250000
vt 0.093750 0.687500
vt 0.093750 0.187500
vt 0.093750 0.625000
vt 0.093750 0.125000
vt 0.093750 0.562500
vt 0.093750 0.062500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.109375 0.000000
vt 0.093750 0.437500
vt 0.093750 0.875000
vt 0.093750 0.812500
vt 0.062500 0.062500
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.078125 0.000000
vt 0.062500 0.437500
vt 0.062500 0.875000
vt 0.062500 0.375000
vt 0.062500 0.812500
vt 0.062500 0.312500
vt 0.062500 0.750000
vt 0.062500 0.250000
vt 0.062500 0.687500
vt 0.062500 0.187500
vt 0.062500 0.625000
vt 0.062500 0.125000
vt 0.031250 0.750000
vt 0.031250 0.250000
vt 0.031250 0.687500
vt 0.031250 0.187500
vt 0.031250 0.625000
vt 0.031250 0.125000
vt 0.031250 0.562500
vt 0.031250 0.062500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.046875 0.000000
vt 0.031250 0.437500
vt 0.031250 0.875000
vt 0.031250 0.375000
vt 0.031250 0.812500
vt 0.031250 0.312500
vt 0.000000 0.562500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.015625 0.000000
vt 0.000000 0.062500
vt 0.000000 0.437500
vt 0.000000 0.875000
vt 0.000000 0.375000
vt 0.000000 0.812500
vt 0.000000 0.312500
vt 0.000000 0.750000
vt 0.000000 0.250000
vt 0.000000 0.687500
vt 0.000000 0.187500
vt 0.000000 0.625000
vt 0.000000 0.125000
vt 1.000000 0.312500
vt 0.968750 0.250000
vt 1.000000 0.250000
vt 1.000000 0.687500
vt 0.968750 0.750000
vt 0.968750 0.687500
vt 1.000000 0.187500
vt 0.968750 0.187500
vt 0.968750 0.625000
vt 1.000000 0.625000
vt 0.968750 0.125000
vt 1.000000 0.125000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 0.968750 0.062500
vt 1.000000 0.062500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.968750 0.937500
vt 0.984375 0.000000
vt 0.968750 0.437500
vt 1.000000 0.437500
vt 0.968750 0.875000
vt 1.000000 0.875000
vt 0.968750 0.375000
vt 1.000000 0.375000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 0.968750 0.312500
vt 1.000000 0.750000
vt 0.953125 0.000000
vt 0.937500 0.062500
vt 0.937500 0.437500
vt 0.937500 0.875000
vt 0.937500 0.375000
vt 0.937500 0.812500
vt 0.937500 0.312500
vt 0.937500 0.750000
vt 0.937500 0.250000
vt 0.937500 0.687500
vt 0.937500 0.187500
vt 0.937500 0.625000
vt 0.937500 0.125000
vt 0.937500 0.562500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.906250 0.250000
vt 0.906250 0.187500
vt 0.906250 0.625000
vt 0.906250 0.125000
vt 0.906250 0.562500
vt 0.906250 0.062500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.906250 0.937500
vt 0.921875 0.000000
vt 0.906250 0.437500
vt 0.906250 0.875000
vt 0.906250 0.375000
vt 0.906250 0.812500
vt 0.906250 0.312500
vt 0.906250 0.750000
vt 0.906250 0.687500
vt 0.875000 0.875000
vt 0.875000 0.375000
vt 0.875000 0.812500
vt 0.875000 0.312500
vt 0.875000 0.750000
vt 0.875000 0.250000
vt 0.875000 0.687500
vt 0.875000 0.187500
vt 0.875000 0.625000
vt 0.875000 0.125000
vt 0.875000 0.562500
vt 0.875000 0.062500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.890625 0.000000
vt 0.875000 0.437500
vt 0.843750 0.625000
vt 0.843750 0.125000
vt 0.843750 0.562500
vt 0.843750 0.062500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.859375 0.000000
vt 0.843750 0.437500
vt 0.843750 0.875000
vt 0.843750 0.375000
vt 0.843750 0.812500
vt 0.843750 0.312500
vt 0.843750 0.750000
vt 0.843750 0.250000
vt 0.843750 0.687500
vt 0.843750 0.187500
vt 0.812500 0.375000
vt 0.812500 0.812500
vt 0.812500 0.312500
vt 0.812500 0.750000
vt 0.812500 0.250000
vt 0.812500 0.687500
vt 0.812500 0.187500
vt 0.812500 0.625000
vt 0.812500 0.125000
vt 0.812500 0.562500
vt 0.812500 0.062500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.828125 0.000000
vt 0.812500 0.437500
vt 0.812500 0.875000
vt 0.781250 0.125000
vt 0.781250 0.625000
vt 0.781250 0.562500
vt 0.781250 0.062500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.796875 0.000000
vt 0.781250 0.437500
vt 0.781250 0.875000
vt 0.781250 0.375000
vt 0.781250 0.812500
vt 0.781250 0.312500
vt 0.781250 0.750000
vt 0.781250 0.250000
vt 0.781250 0.687500
vt 0.781250 0.187500
vt 0.765625 1.000000
vt 0.765625 0.000000
s 0
f 487/560/518 498/561/518 960/562/518
f 483/563/519 490/564/519 491/565/519
f 960/562/520 499/566/520 488/567/520
f 958/568/521 491/565/521 492/569/521
f 488/567/522 500/570/522 961/571/522
f 484/572/523 492/569/523 493/573/523
f 961/571/524 501/574/524 962/575/524
f 959/576/525 493/573/525 494/577/525
f 962/575/526 502/578/526 963/579/526
f 485/580/527 494/577/527 495/581/527
f 963/579/528 503/582/528 964/583/528
f 485/580/529 496/584/529 486/585/529
f 956/586/530 564/587/530 489/588/530
f 790/589/531 964/583/531 503/582/531
f 486/585/532 497/590/532 487/560/532
f 957/591/533 489/588/533 490/564/533
f 496/584/534 512/592/534 497/590/534
f 489/588/535 505/593/535 490/564/535
f 497/590/536 513/594/536 498/561/536
f 490/564/537 506/595/537 491/565/537
f 498/561/538 514/596/538 499/566/538
f 491/565/539 507/597/539 492/569/539
f 499/566/540 515/598/540 500/570/540
f 492/569/541 508/599/541 493/573/541
f 501/574/542 515/598/542 516/600/542
f 493/573/543 509/601/543 494/577/543
f 502/578/544 516/600/544 517/602/544
f 495/581/545 509/601/545 510/603/545
f 502/578/546 518/604/546 503/582/546
f 495/581/547 511/605/547 496/584/547
f 489/588/548 564/606/548 504/607/548
f 790/608/549 503/582/549 518/604/549
f 516/600/550 530/609/550 531/610/550
f 508/599/551 524/611/551 509/601/551
f 516/600/552 532/612/552 517/602/552
f 509/601/553 525/613/553 510/603/553
f 517/602/554 533/614/554 518/604/554
f 510/603/555 526/615/555 511/605/555
f 504/607/556 564/616/556 519/617/556
f 790/618/557 518/604/557 533/614/557
f 512/592/558 526/615/558 527/619/558
f 504/607/559 520/620/559 505/593/559
f 512/592/560 528/621/560 513/594/560
f 505/593/561 521/622/561 506/595/561
f 514/596/562 528/621/562 529/623/562
f 506/595/563 522/624/563 507/597/563
f 515/598/564 529/623/564 530/609/564
f 507/597/565 523/625/565 508/599/565
f 519/617/566 535/626/566 520/620/566
f 527/619/567 543/627/567 528/621/567
f 520/620/568 536/628/568 521/622/568
f 529/623/569 543/627/569 544/629/569
f 521/622/570 537/630/570 522/624/570
f 529/623/571 545/631/571 530/609/571
f 522/624/572 538/632/572 523/625/572
f 530/609/573 546/633/573 531/610/573
f 523/625/574 539/634/574 524/611/574
f 531/610/575 547/635/575 532/612/575
f 525/613/576 539/634/576 540/636/576
f 532/612/577 548/637/577 533/614/577
f 526/615/578 540/636/578 541/638/578
f 519/617/579 564/639/579 534/640/579
f 790/641/580 533/614/580 548/637/580
f 526/615/581 542/642/581 527/619/581
f 538/632/582 554/643/582 539/634/582
f 547/635/583 561/644/583 562/645/583
f 540/636/584 554/643/584 555/646/584
f 547/635/585 563/647/585 548/637/585
f 541/638/586 555/646/586 556/648/586
f 534/640/587 564/649/587 549/650/587
f 790/651/588 548/637/588 563/647/588
f 541/638/589 557/652/589 542/642/589
f 534/640/590 550/653/590 535/626/590
f 542/642/591 558/654/591 543/627/591
f 535/626/592 551/655/592 536/628/592
f 544/629/593 558/654/593 559/656/593
f 536/628/594 552/657/594 537/630/594
f 544/629/595 560/658/595 545/631/595
f 538/632/596 552/657/596 553/659/596
f 545/631/597 561/644/597 546/633/597
f 557/652/598 574/660/598 558/654/598
f 550/653/599 567/661/599 551/655/599
f 559/656/600 574/660/600 575/662/600
f 551/655/601 568/663/601 552/657/601
f 559/656/602 576/664/602 560/658/602
f 553/659/603 568/663/603 569/665/603
f 560/658/604 577/666/604 561/644/604
f 553/659/605 570/667/605 554/643/605
f 561/644/606 578/668/606 562/645/606
f 555/646/607 570/667/607 571/669/607
f 563/647/608 578/668/608 579/670/608
f 556/648/609 571/669/609 572/671/609
f 549/650/610 564/672/610 565/673/610
f 790/674/611 563/647/611 579/670/611
f 556/648/612 573/675/612 557/652/612
f 549/650/613 566/676/613 550/653/613
f 578/668/614 592/677/614 593/678/614
f 571/669/615 585/679/615 586/680/615
f 578/668/616 594/681/616 579/670/616
f 572/671/617 586/680/617 587/682/617
f 565/673/618 564/683/618 580/684/618
f 790/685/619 579/670/619 594/681/619
f 572/671/620 588/686/620 573/675/620
f 566/676/621 580/684/621 581/687/621
f 573/675/622 589/688/622 574/660/622
f 566/676/623 582/689/623 567/661/623
f 575/662/624 589/688/624 590/690/624
f 567/661/625 583/691/625 568/663/625
f 575/662/626 591/692/626 576/664/626
f 569/665/627 583/691/627 584/693/627
f 577/666/628 591/692/628 592/677/628
f 569/665/629 585/679/629 570/667/629
f 590/690/630 604/694/630 605/695/630
f 582/689/631 598/696/631 583/691/631
f 590/690/632 606/697/632 591/692/632
f 584/693/633 598/696/633 599/698/633
f 592/677/634 606/697/634 607/699/634
f 584/693/635 600/700/635 585/679/635
f 592/677/636 608/701/636 593/678/636
f 586/680/637 600/700/637 601/702/637
f 593/678/638 609/703/638 594/681/638
f 587/682/639 601/702/639 602/704/639
f 580/684/640 564/705/640 595/706/640
f 790/707/641 594/681/641 609/703/641
f 587/682/642 603/708/642 588/686/642
f 581/687/643 595/706/643 596/709/643
f 588/686/644 604/694/644 589/688/644
f 581/687/645 597/710/645 582/689/645
f 608/701/646 624/711/646 609/703/646
f 602/704/647 616/712/647 617/713/647
f 595/706/648 564/714/648 610/715/648
f 790/716/649 609/703/649 624/711/649
f 602/704/650 618/717/650 603/708/650
f 595/706/651 611/718/651 596/709/651
f 603/708/652 619/719/652 604/694/652
f 596/709/653 612/720/653 597/710/653
f 605/695/654 619/719/654 620/721/654
f 597/710/655 613/722/655 598/696/655
f 605/695/656 621/723/656 606/697/656
f 599/698/657 613/722/657 614/724/657
f 607/699/658 621/723/658 622/725/658
f 599/698/659 615/726/659 600/700/659
f 607/699/660 623/727/660 608/701/660
f 601/702/661 615/726/661 616/712/661
f 612/720/662 628/728/662 613/722/662
f 620/721/663 636/729/663 621/723/663
f 614/724/664 628/728/664 629/730/664
f 622/725/665 636/729/665 637/731/665
f 614/724/666 630/732/666 615/726/666
f 622/725/667 638/733/667 623/727/667
f 616/712/668 630/732/668 631/734/668
f 624/711/669 638/733/669 639/735/669
f 617/713/670 631/734/670 632/736/670
f 610/715/671 564/737/671 625/738/671
f 790/739/672 624/711/672 639/735/672
f 617/713/673 633/740/673 618/717/673
f 610/715/674 626/741/674 611/718/674
f 618/717/675 634/742/675 619/719/675
f 612/720/676 626/741/676 627/743/676
f 620/721/677 634/742/677 635/744/677
f 632/736/678 646/745/678 647/746/678
f 625/738/679 564/747/679 640/748/679
f 790/749/680 639/735/680 654/750/680
f 632/736/681 648/751/681 633/740/681
f 625/738/682 641/752/682 626/741/682
f 633/740/683 649/753/683 634/742/683
f 627/743/684 641/752/684 642/754/684
f 635/744/685 649/753/685 650/755/685
f 627/743/686 643/756/686 628/728/686
f 635/744/687 651/757/687 636/729/687
f 629/730/688 643/756/688 644/758/688
f 637/731/689 651/757/689 652/759/689
f 629/730/690 645/760/690 630/732/690
f 637/731/691 653/761/691 638/733/691
f 631/734/692 645/760/692 646/745/692
f 638/733/693 654/750/693 639/735/693
f 650/755/694 666/762/694 651/757/694
f 644/758/695 658/763/695 659/764/695
f 651/757/696 667/765/696 652/759/696
f 644/758/697 660/766/697 645/760/697
f 653/761/698 667/765/698 668/767/698
f 646/745/699 660/766/699 661/768/699
f 654/750/700 668/767/700 669/769/700
f 647/746/701 661/768/701 662/770/701
f 640/748/702 564/771/702 655/772/702
f 790/773/703 654/750/703 669/769/703
f 647/746/704 663/774/704 648/751/704
f 640/748/705 656/775/705 641/752/705
f 648/751/706 664/776/706 649/753/706
f 641/752/707 657/777/707 642/754/707
f 650/755/708 664/776/708 665/778/708
f 642/754/709 658/763/709 643/756/709
f 655/772/710 564/779/710 670/780/710
f 790/781/711 669/769/711 684/782/711
f 662/770/712 678/783/712 663/774/712
f 656/775/713 670/780/713 671/784/713
f 663/774/714 679/785/714 664/776/714
f 657/777/715 671/784/715 672/786/715
f 665/778/716 679/785/716 680/787/716
f 657/777/717 673/788/717 658/763/717
f 665/778/718 681/789/718 666/762/718
f 659/764/719 673/788/719 674/790/719
f 667/765/720 681/789/720 682/791/720
f 659/764/721 675/792/721 660/766/721
f 667/765/722 683/793/722 668/767/722
f 661/768/723 675/792/723 676/794/723
f 668/767/724 684/782/724 669/769/724
f 662/770/725 676/794/725 677/795/725
f 674/790/726 688/796/726 689/797/726
f 682/791/727 696/798/727 697/799/727
f 674/790/728 690/800/728 675/792/728
f 682/791/729 698/801/729 683/793/729
f 676/794/730 690/800/730 691/802/730
f 683/793/731 699/803/731 684/782/731
f 677/795/732 691/802/732 692/804/732
f 670/780/733 564/805/733 685/806/733
f 790/807/734 684/782/734 699/803/734
f 677/795/735 693/808/735 678/783/735
f 670/780/736 686/809/736 671/784/736
f 678/783/737 694/810/737 679/785/737
f 671/784/738 687/811/738 672/786/738
f 680/787/739 694/810/739 695/812/739
f 672/786/740 688/796/740 673/788/740
f 680/787/741 696/798/741 681/789/741
f 692/804/742 708/813/742 693/808/742
f 685/806/743 701/814/743 686/809/743
f 693/808/744 709/815/744 694/810/744
f 686/809/745 702/816/745 687/811/745
f 695/812/746 709/815/746 710/817/746
f 687/811/747 703/818/747 688/796/747
f 695/812/748 711/819/748 696/798/748
f 689/797/749 703/818/749 704/820/749
f 697/799/750 711/819/750 712/821/750
f 689/797/751 705/822/751 690/800/751
f 697/799/752 713/823/752 698/801/752
f 691/802/753 705/822/753 706/824/753
f 698/801/754 714/825/754 699/803/754
f 692/804/755 706/824/755 707/826/755
f 685/806/756 564/827/756 700/828/756
f 790/829/757 699/803/757 714/825/757
f 712/821/758 726/830/758 727/831/758
f 704/820/759 720/832/759 705/822/759
f 712/821/760 728/833/760 713/823/760
f 706/824/761 720/832/761 721/834/761
f 714/825/762 728/833/762 729/835/762
f 707/826/763 721/834/763 722/836/763
f 700/828/764 564/837/764 715/838/764
f 790/839/765 714/825/765 729/835/765
f 707/826/766 723/840/766 708/813/766
f 700/828/767 716/841/767 701/814/767
f 708/813/768 724/842/768 709/815/768
f 702/816/769 716/841/769 717/843/769
f 710/817/770 724/842/770 725/844/770
f 702/816/771 718/845/771 703/818/771
f 710/817/772 726/830/772 711/819/772
f 704/820/773 718/845/773 719/846/773
f 723/840/774 739/847/774 724/842/774
f 716/841/775 732/848/775 717/843/775
f 725/844/776 739/847/776 740/849/776
f 717/843/777 733/850/777 718/845/777
f 725/844/778 741/851/778 726/830/778
f 719/846/779 733/850/779 734/852/779
f 727/831/780 741/851/780 742/853/780
f 719/846/781 735/854/781 720/832/781
f 727/831/782 743/855/782 728/833/782
f 721/834/783 735/854/783 736/856/783
f 728/833/784 744/857/784 729/835/784
f 722/836/785 736/856/785 737/858/785
f 715/838/786 564/859/786 730/860/786
f 790/861/787 729/835/787 744/857/787
f 722/836/788 738/862/788 723/840/788
f 715/838/789 731/863/789 716/841/789
f 742/853/790 758/864/790 743/855/790
f 736/856/791 750/865/791 751/866/791
f 743/855/792 759/867/792 744/857/792
f 737/858/793 751/866/793 752/868/793
f 730/860/794 564/869/794 745/870/794
f 790/871/795 744/857/795 759/867/795
f 737/858/796 753/872/796 738/862/796
f 731/863/797 745/870/797 746/873/797
f 738/862/798 754/874/798 739/847/798
f 731/863/799 747/875/799 732/848/799
f 740/849/800 754/874/800 755/876/800
f 732/848/801 748/877/801 733/850/801
f 740/849/802 756/878/802 741/851/802
f 734/852/803 748/877/803 749/879/803
f 742/853/804 756/878/804 757/880/804
f 734/852/805 750/865/805 735/854/805
f 746/873/806 762/881/806 747/875/806
f 755/876/807 769/882/807 770/883/807
f 747/875/808 763/884/808 748/877/808
f 755/876/809 771/885/809 756/878/809
f 749/879/810 763/884/810 764/886/810
f 757/880/811 771/885/811 772/887/811
f 749/879/812 765/888/812 750/865/812
f 757/880/813 773/889/813 758/864/813
f 751/866/814 765/888/814 766/890/814
f 759/867/815 773/889/815 774/891/815
f 752/868/816 766/890/816 767/892/816
f 745/870/817 564/893/817 760/894/817
f 790/895/818 759/867/818 774/891/818
f 752/868/819 768/896/819 753/872/819
f 746/873/820 760/894/820 761/897/820
f 753/872/821 769/882/821 754/874/821
f 766/890/822 780/898/822 781/899/822
f 774/891/823 788/900/823 789/901/823
f 767/892/824 781/899/824 782/902/824
f 760/894/825 564/903/825 775/904/825
f 790/905/826 774/891/826 789/901/826
f 767/892/827 783/906/827 768/896/827
f 760/894/828 776/907/828 761/897/828
f 768/896/829 784/908/829 769/882/829
f 761/897/830 777/909/830 762/881/830
f 770/883/831 784/908/831 785/910/831
f 762/881/832 778/911/832 763/884/832
f 770/883/833 786/912/833 771/885/833
f 764/886/834 778/911/834 779/913/834
f 772/887/835 786/912/835 787/914/835
f 764/886/836 780/898/836 765/888/836
f 772/887/837 788/900/837 773/889/837
f 785/910/838 800/915/838 801/916/838
f 777/909/839 794/917/839 778/911/839
f 785/910/840 802/918/840 786/912/840
f 779/913/841 794/917/841 795/919/841
f 787/914/842 802/918/842 803/920/842
f 779/913/843 796/921/843 780/898/843
f 787/914/844 804/922/844 788/900/844
f 781/899/845 796/921/845 797/923/845
f 788/900/846 805/924/846 789/901/846
f 782/902/847 797/923/847 798/925/847
f 775/904/848 564/926/848 791/927/848
f 790/928/849 789/901/849 805/924/849
f 782/902/850 799/929/850 783/906/850
f 775/904/851 792/930/851 776/907/851
f 783/906/852 800/915/852 784/908/852
f 776/907/853 793/931/853 777/909/853
f 804/922/854 820/932/854 805/924/854
f 798/925/855 812/933/855 813/934/855
f 791/927/856 564/935/856 806/936/856
f 790/937/857 805/924/857 820/932/857
f 798/925/858 814/938/858 799/929/858
f 791/927/859 807/939/859 792/930/859
f 799/929/860 815/940/860 800/915/860
f 792/930/861 808/941/861 793/931/861
f 801/916/862 815/940/862 816/942/862
f 793/931/863 809/943/863 794/917/863
f 801/916/864 817/944/864 802/918/864
f 794/917/865 810/945/865 795/919/865
f 803/920/866 817/944/866 818/946/866
f 795/919/867 811/947/867 796/921/867
f 803/920/868 819/948/868 804/922/868
f 797/923/869 811/947/869 812/933/869
f 808/941/870 824/949/870 809/943/870
f 816/942/871 832/950/871 817/944/871
f 810/945/872 824/949/872 825/951/872
f 818/946/873 832/950/873 833/952/873
f 810/945/874 826/953/874 811/947/874
f 818/946/875 834/954/875 819/948/875
f 812/933/876 826/953/876 827/955/876
f 819/948/877 835/956/877 820/932/877
f 813/934/878 827/955/878 828/957/878
f 806/936/879 564/958/879 821/959/879
f 790/960/880 820/932/880 835/956/880
f 813/934/881 829/961/881 814/938/881
f 806/936/882 822/962/882 807/939/882
f 814/938/883 830/963/883 815/940/883
f 807/939/884 823/964/884 808/941/884
f 816/942/885 830/963/885 831/965/885
f 828/957/886 842/966/886 843/967/886
f 821/959/887 564/968/887 836/969/887
f 790/970/888 835/956/888 850/971/888
f 828/957/889 844/972/889 829/961/889
f 821/959/890 837/973/890 822/962/890
f 829/961/891 845/974/891 830/963/891
f 823/964/892 837/973/892 838/975/892
f 831/965/893 845/974/893 846/976/893
f 823/964/894 839/977/894 824/949/894
f 831/965/895 847/978/895 832/950/895
f 825/951/896 839/977/896 840/979/896
f 833/952/897 847/978/897 848/980/897
f 825/951/898 841/981/898 826/953/898
f 833/952/899 849/982/899 834/954/899
f 827/955/900 841/981/900 842/966/900
f 834/954/901 850/971/901 835/956/901
f 846/983/902 862/984/902 847/985/902
f 840/986/903 854/987/903 855/988/903
f 848/989/904 862/984/904 863/990/904
f 840/986/905 856/991/905 841/992/905
f 848/989/906 864/993/906 849/994/906
f 842/995/907 856/991/907 857/996/907
f 849/994/908 865/997/908 850/998/908
f 843/999/909 857/996/909 858/1000/909
f 836/1001/910 564/1002/910 851/1003/910
f 790/1004/911 850/998/911 865/997/911
f 843/999/912 859/1005/912 844/1006/912
f 836/1001/913 852/1007/913 837/1008/913
f 844/1006/914 860/1009/914 845/1010/914
f 838/1011/915 852/1007/915 853/1012/915
f 846/983/916 860/1009/916 861/1013/916
f 838/1011/917 854/987/917 839/1014/917
f 790/1015/918 865/997/918 880/1016/918
f 858/1000/919 874/1017/919 859/1005/919
f 851/1003/920 867/1018/920 852/1007/920
f 859/1005/921 875/1019/921 860/1009/921
f 853/1012/922 867/1018/922 868/1020/922
f 861/1013/923 875/1019/923 876/1021/923
f 853/1012/924 869/1022/924 854/987/924
f 861/1013/925 877/1023/925 862/984/925
f 855/988/926 869/1022/926 870/1024/926
f 863/990/927 877/1023/927 878/1025/927
f 855/988/928 871/1026/928 856/991/928
f 863/990/929 879/1027/929 864/993/929
f 857/996/930 871/1026/930 872/1028/930
f 864/993/931 880/1016/931 865/997/931
f 858/1000/932 872/1028/932 873/1029/932
f 851/1003/933 564/1030/933 866/1031/933
f 878/1025/934 892/1032/934 893/1033/934
f 870/1024/935 886/1034/935 871/1026/935
f 879/1027/936 893/1033/936 894/1035/936
f 872/1028/937 886/1034/937 887/1036/937
f 879/1027/938 895/1037/938 880/1016/938
f 873/1029/939 887/1036/939 888/1038/939
f 866/1031/940 564/1039/940 881/1040/940
f 790/1041/941 880/1016/941 895/1037/941
f 873/1029/942 889/1042/942 874/1017/942
f 867/1018/943 881/1040/943 882/1043/943
f 874/1017/944 890/1044/944 875/1019/944
f 867/1018/945 883/1045/945 868/1020/945
f 876/1021/946 890/1044/946 891/1046/946
f 868/1020/947 884/1047/947 869/1022/947
f 876/1021/948 892/1032/948 877/1023/948
f 870/1024/949 884/1047/949 885/1048/949
f 881/1040/950 897/1049/950 882/1043/950
f 889/1042/951 905/1050/951 890/1044/951
f 883/1045/952 897/1049/952 898/1051/952
f 891/1046/953 905/1050/953 906/1052/953
f 883/1045/954 899/1053/954 884/1047/954
f 891/1046/955 907/1054/955 892/1032/955
f 885/1048/956 899/1053/956 900/1055/956
f 893/1033/957 907/1054/957 908/1056/957
f 885/1048/958 901/1057/958 886/1034/958
f 893/1033/959 909/1058/959 894/1035/959
f 887/1036/960 901/1057/960 902/1059/960
f 894/1035/961 910/1060/961 895/1037/961
f 888/1038/962 902/1059/962 903/1061/962
f 881/1040/963 564/1062/963 896/1063/963
f 790/1064/964 895/1037/964 910/1060/964
f 888/1038/965 904/1065/965 889/1042/965
f 900/1055/966 916/1066/966 901/1057/966
f 908/1056/967 924/1067/967 909/1058/967
f 902/1059/968 916/1066/968 917/1068/968
f 909/1058/969 925/1069/969 910/1060/969
f 903/1061/970 917/1068/970 918/1070/970
f 896/1063/971 564/1071/971 911/1072/971
f 790/1073/972 910/1060/972 925/1069/972
f 903/1061/973 919/1074/973 904/1065/973
f 896/1063/974 912/1075/974 897/1049/974
f 904/1065/975 920/1076/975 905/1050/975
f 898/1051/976 912/1075/976 913/1077/976
f 906/1052/977 920/1076/977 921/1078/977
f 898/1051/978 914/1079/978 899/1053/978
f 906/1052/979 922/1080/979 907/1054/979
f 900/1055/980 914/1079/980 915/1081/980
f 908/1056/981 922/1080/981 923/1082/981
f 919/1074/982 935/1083/982 920/1076/982
f 912/1075/983 928/1084/983 913/1077/983
f 921/1078/984 935/1083/984 936/1085/984
f 913/1077/985 929/1086/985 914/1079/985
f 921/1078/986 937/1087/986 922/1080/986
f 915/1081/987 929/1086/987 930/1088/987
f 923/1082/988 937/1087/988 938/1089/988
f 915/1081/989 931/1090/989 916/1066/989
f 923/1082/990 939/1091/990 924/1067/990
f 917/1068/991 931/1090/991 932/1092/991
f 924/1067/992 940/1093/992 925/1069/992
f 918/1070/993 932/1092/993 933/1094/993
f 911/1072/994 564/1095/994 926/1096/994
f 790/1097/995 925/1069/995 940/1093/995
f 918/1070/996 934/1098/996 919/1074/996
f 912/1075/997 926/1096/997 927/1099/997
f 938/1089/998 954/1100/998 939/1091/998
f 932/1092/999 946/1101/999 947/1102/999
f 940/1093/1000 954/1100/1000 955/1103/1000
f 933/1094/1001 947/1102/1001 948/1104/1001
f 926/1096/1002 564/1105/1002 941/1106/1002
f 790/1107/1003 940/1093/1003 955/1103/1003
f 933/1094/1004 949/1108/1004 934/1098/1004
f 926/1096/1005 942/1109/1005 927/1099/1005
f 934/1098/1006 950/1110/1006 935/1083/1006
f 928/1084/1007 942/1109/1007 943/1111/1007
f 936/1085/1008 950/1110/1008 951/1112/1008
f 928/1084/1009 944/1113/1009 929/1086/1009
f 936/1085/1010 952/1114/1010 937/1087/1010
f 930/1088/1011 944/1113/1011 945/1115/1011
f 938/1089/1012 952/1114/1012 953/1116/1012
f 930/1088/1013 946/1101/1013 931/1090/1013
f 943/1111/1014 957/591/1014 483/563/1014
f 950/1110/1015 488/567/1015 951/1112/1015
f 944/1113/1016 483/563/1016 958/568/1016
f 951/1112/1017 961/571/1017 952/1114/1017
f 945/1115/1018 958/568/1018 484/572/1018
f 953/1116/1019 961/571/1019 962/575/1019
f 946/1101/1020 484/572/1020 959/576/1020
f 953/1116/1021 963/579/1021 954/1100/1021
f 947/1102/1022 959/576/1022 485/580/1022
f 954/1100/1023 964/583/1023 955/1103/1023
f 948/1104/1024 485/580/1024 486/585/1024
f 941/1106/1025 564/1117/1025 956/586/1025
f 790/1118/1026 955/1103/1026 964/583/1026
f 948/1104/1027 487/560/1027 949/1108/1027
f 942/1109/1028 956/586/1028 957/591/1028
f 949/1108/1029 960/562/1029 950/1110/1029
f 487/560/518 497/590/518 498/561/518
f 483/563/519 957/591/519 490/564/519
f 960/562/520 498/561/520 499/566/520
f 958/568/521 483/563/521 491/565/521
f 488/567/522 499/566/522 500/570/522
f 484/572/523 958/568/523 492/569/523
f 961/571/524 500/570/524 501/574/524
f 959/576/525 484/572/525 493/573/525
f 962/575/526 501/574/526 502/578/526
f 485/580/527 959/576/527 494/577/527
f 963/579/528 502/578/528 503/582/528
f 485/580/529 495/581/529 496/584/529
f 486/585/532 496/584/532 497/590/532
f 957/591/533 956/586/533 489/588/533
f 496/584/534 511/605/534 512/592/534
f 489/588/535 504/607/535 505/593/535
f 497/590/536 512/592/536 513/594/536
f 490/564/537 505/593/537 506/595/537
f 498/561/538 513/594/538 514/596/538
f 491/565/539 506/595/539 507/597/539
f 499/566/540 514/596/540 515/598/540
f 492/569/541 507/597/541 508/599/541
f 501/574/542 500/570/542 515/598/542
f 493/573/543 508/599/543 509/601/543
f 502/578/544 501/574/544 516/600/544
f 495/581/545 494/577/545 509/601/545
f 502/578/546 517/602/546 518/604/546
f 495/581/547 510/603/547 511/605/547
f 516/600/550 515/598/550 530/609/550
f 508/599/551 523/625/551 524/611/551
f 516/600/552 531/610/552 532/612/552
f 509/601/553 524/611/553 525/613/553
f 517/602/554 532/612/554 533/614/554
f 510/603/555 525/613/555 526/615/555
f 512/592/558 511/605/558 526/615/558
f 504/607/559 519/617/559 520/620/559
f 512/592/560 527/619/560 528/621/560
f 505/593/561 520/620/561 521/622/561
f 514/596/562 513/594/562 528/621/562
f 506/595/563 521/622/563 522/624/563
f 515/598/564 514/596/564 529/623/564
f 507/597/565 522/624/565 523/625/565
f 519/617/566 534/640/566 535/626/566
f 527/619/567 542/642/567 543/627/567
f 520/620/568 535/626/568 536/628/568
f 529/623/569 528/621/569 543/627/569
f 521/622/570 536/628/570 537/630/570
f 529/623/571 544/629/571 545/631/571
f 522/624/572 537/630/572 538/632/572
f 530/609/573 545/631/573 546/633/573
f 523/625/574 538/632/574 539/634/574
f 531/610/575 546/633/575 547/635/575
f 525/613/576 524/611/576 539/634/576
f 532/612/577 547/635/577 548/637/577
f 526/615/578 525/613/578 540/636/578
f 526/615/581 541/638/581 542/642/581
f 538/632/582 553/659/582 554/643/582
f 547/635/583 546/633/583 561/644/583
f 540/636/584 539/634/584 554/643/584
f 547/635/1030 562/645/1030 563/647/1030
f 541/638/586 540/636/586 555/646/586
f 541/638/589 556/648/589 557/652/589
f 534/640/1031 549/650/1031 550/653/1031
f 542/642/591 557/652/591 558/654/591
f 535/626/592 550/653/592 551/655/592
f 544/629/593 543/627/593 558/654/593
f 536/628/594 551/655/594 552/657/594
f 544/629/595 559/656/595 560/658/595
f 538/632/596 537/630/596 552/657/596
f 545/631/597 560/658/597 561/644/597
f 557/652/598 573/675/598 574/660/598
f 550/653/599 566/676/599 567/661/599
f 559/656/600 558/654/600 574/660/600
f 551/655/601 567/661/601 568/663/601
f 559/656/602 575/662/602 576/664/602
f 553/659/603 552/657/603 568/663/603
f 560/658/604 576/664/604 577/666/604
f 553/659/605 569/665/605 570/667/605
f 561/644/606 577/666/606 578/668/606
f 555/646/607 554/643/607 570/667/607
f 563/647/608 562/645/608 578/668/608
f 556/648/609 555/646/609 571/669/609
f 556/648/612 572/671/612 573/675/612
f 549/650/613 565/673/613 566/676/613
f 578/668/614 577/666/614 592/677/614
f 571/669/615 570/667/615 585/679/615
f 578/668/616 593/678/616 594/681/616
f 572/671/617 571/669/617 586/680/617
f 572/671/620 587/682/620 588/686/620
f 566/676/621 565/673/621 580/684/621
f 573/675/622 588/686/622 589/688/622
f 566/676/623 581/687/623 582/689/623
f 575/662/624 574/660/624 589/688/624
f 567/661/625 582/689/625 583/691/625
f 575/662/626 590/690/626 591/692/626
f 569/665/627 568/663/627 583/691/627
f 577/666/628 576/664/628 591/692/628
f 569/665/629 584/693/629 585/679/629
f 590/690/630 589/688/630 604/694/630
f 582/689/631 597/710/631 598/696/631
f 590/690/632 605/695/632 606/697/632
f 584/693/633 583/691/633 598/696/633
f 592/677/634 591/692/634 606/697/634
f 584/693/635 599/698/635 600/700/635
f 592/677/636 607/699/636 608/701/636
f 586/680/637 585/679/637 600/700/637
f 593/678/638 608/701/638 609/703/638
f 587/682/639 586/680/639 601/702/639
f 587/682/642 602/704/642 603/708/642
f 581/687/643 580/684/643 595/706/643
f 588/686/644 603/708/644 604/694/644
f 581/687/645 596/709/645 597/710/645
f 608/701/646 623/727/646 624/711/646
f 602/704/647 601/702/647 616/712/647
f 602/704/650 617/713/650 618/717/650
f 595/706/651 610/715/651 611/718/651
f 603/708/652 618/717/652 619/719/652
f 596/709/653 611/718/653 612/720/653
f 605/695/654 604/694/654 619/719/654
f 597/710/655 612/720/655 613/722/655
f 605/695/656 620/721/656 621/723/656
f 599/698/657 598/696/657 613/722/657
f 607/699/658 606/697/658 621/723/658
f 599/698/659 614/724/659 615/726/659
f 607/699/660 622/725/660 623/727/660
f 601/702/661 600/700/661 615/726/661
f 612/720/1032 627/743/1032 628/728/1032
f 620/721/663 635/744/663 636/729/663
f 614/724/664 613/722/664 628/728/664
f 622/725/1033 621/723/1033 636/729/1033
f 614/724/666 629/730/666 630/732/666
f 622/725/667 637/731/667 638/733/667
f 616/712/668 615/726/668 630/732/668
f 624/711/669 623/727/669 638/733/669
f 617/713/670 616/712/670 631/734/670
f 617/713/673 632/736/673 633/740/673
f 610/715/674 625/738/674 626/741/674
f 618/717/675 633/740/675 634/742/675
f 612/720/676 611/718/676 626/741/676
f 620/721/677 619/719/677 634/742/677
f 632/736/678 631/734/678 646/745/678
f 632/736/681 647/746/681 648/751/681
f 625/738/682 640/748/682 641/752/682
f 633/740/683 648/751/683 649/753/683
f 627/743/684 626/741/684 641/752/684
f 635/744/685 634/742/685 649/753/685
f 627/743/686 642/754/686 643/756/686
f 635/744/687 650/755/687 651/757/687
f 629/730/688 628/728/688 643/756/688
f 637/731/689 636/729/689 651/757/689
f 629/730/690 644/758/690 645/760/690
f 637/731/691 652/759/691 653/761/691
f 631/734/692 630/732/692 645/760/692
f 638/733/693 653/761/693 654/750/693
f 650/755/694 665/778/694 666/762/694
f 644/758/695 643/756/695 658/763/695
f 651/757/696 666/762/696 667/765/696
f 644/758/697 659/764/697 660/766/697
f 653/761/698 652/759/698 667/765/698
f 646/745/699 645/760/699 660/766/699
f 654/750/700 653/761/700 668/767/700
f 647/746/701 646/745/701 661/768/701
f 647/746/704 662/770/704 663/774/704
f 640/748/705 655/772/705 656/775/705
f 648/751/706 663/774/706 664/776/706
f 641/752/707 656/775/707 657/777/707
f 650/755/708 649/753/708 664/776/708
f 642/754/709 657/777/709 658/763/709
f 662/770/712 677/795/712 678/783/712
f 656/775/713 655/772/713 670/780/713
f 663/774/714 678/783/714 679/785/714
f 657/777/715 656/775/715 671/784/715
f 665/778/716 664/776/716 679/785/716
f 657/777/717 672/786/717 673/788/717
f 665/778/718 680/787/718 681/789/718
f 659/764/719 658/763/719 673/788/719
f 667/765/720 666/762/720 681/789/720
f 659/764/721 674/790/721 675/792/721
f 667/765/722 682/791/722 683/793/722
f 661/768/723 660/766/723 675/792/723
f 668/767/724 683/793/724 684/782/724
f 662/770/725 661/768/725 676/794/725
f 674/790/726 673/788/726 688/796/726
f 682/791/727 681/789/727 696/798/727
f 674/790/728 689/797/728 690/800/728
f 682/791/729 697/799/729 698/801/729
f 676/794/730 675/792/730 690/800/730
f 683/793/731 698/801/731 699/803/731
f 677/795/732 676/794/732 691/802/732
f 677/795/735 692/804/735 693/808/735
f 670/780/736 685/806/736 686/809/736
f 678/783/737 693/808/737 694/810/737
f 671/784/738 686/809/738 687/811/738
f 680/787/739 679/785/739 694/810/739
f 672/786/740 687/811/740 688/796/740
f 680/787/741 695/812/741 696/798/741
f 692/804/742 707/826/742 708/813/742
f 685/806/743 700/828/743 701/814/743
f 693/808/744 708/813/744 709/815/744
f 686/809/745 701/814/745 702/816/745
f 695/812/746 694/810/746 709/815/746
f 687/811/747 702/816/747 703/818/747
f 695/812/748 710/817/748 711/819/748
f 689/797/749 688/796/749 703/818/749
f 697/799/750 696/798/750 711/819/750
f 689/797/751 704/820/751 705/822/751
f 697/799/752 712/821/752 713/823/752
f 691/802/753 690/800/753 705/822/753
f 698/801/754 713/823/754 714/825/754
f 692/804/755 691/802/755 706/824/755
f 712/821/758 711/819/758 726/830/758
f 704/820/759 719/846/759 720/832/759
f 712/821/760 727/831/760 728/833/760
f 706/824/761 705/822/761 720/832/761
f 714/825/762 713/823/762 728/833/762
f 707/826/763 706/824/763 721/834/763
f 707/826/766 722/836/766 723/840/766
f 700/828/767 715/838/767 716/841/767
f 708/813/768 723/840/768 724/842/768
f 702/816/769 701/814/769 716/841/769
f 710/817/770 709/815/770 724/842/770
f 702/816/771 717/843/771 718/845/771
f 710/817/772 725/844/772 726/830/772
f 704/820/773 703/818/773 718/845/773
f 723/840/774 738/862/774 739/847/774
f 716/841/775 731/863/775 732/848/775
f 725/844/776 724/842/776 739/847/776
f 717/843/777 732/848/777 733/850/777
f 725/844/778 740/849/778 741/851/778
f 719/846/779 718/845/779 733/850/779
f 727/831/780 726/830/780 741/851/780
f 719/846/781 734/852/781 735/854/781
f 727/831/782 742/853/782 743/855/782
f 721/834/783 720/832/783 735/854/783
f 728/833/784 743/855/784 744/857/784
f 722/836/785 721/834/785 736/856/785
f 722/836/788 737/858/788 738/862/788
f 715/838/789 730/860/789 731/863/789
f 742/853/790 757/880/790 758/864/790
f 736/856/791 735/854/791 750/865/791
f 743/855/792 758/864/792 759/867/792
f 737/858/793 736/856/793 751/866/793
f 737/858/796 752/868/796 753/872/796
f 731/863/797 730/860/797 745/870/797
f 738/862/798 753/872/798 754/874/798
f 731/863/799 746/873/799 747/875/799
f 740/849/800 739/847/800 754/874/800
f 732/848/801 747/875/801 748/877/801
f 740/849/802 755/876/802 756/878/802
f 734/852/803 733/850/803 748/877/803
f 742/853/804 741/851/804 756/878/804
f 734/852/805 749/879/805 750/865/805
f 746/873/806 761/897/806 762/881/806
f 755/876/807 754/874/807 769/882/807
f 747/875/808 762/881/808 763/884/808
f 755/876/809 770/883/809 771/885/809
f 749/879/810 748/877/810 763/884/810
f 757/880/811 756/878/811 771/885/811
f 749/879/812 764/886/812 765/888/812
f 757/880/813 772/887/813 773/889/813
f 751/866/814 750/865/814 765/888/814
f 759/867/815 758/864/815 773/889/815
f 752/868/816 751/866/816 766/890/816
f 752/868/819 767/892/819 768/896/819
f 746/873/820 745/870/820 760/894/820
f 753/872/821 768/896/821 769/882/821
f 766/890/822 765/888/822 780/898/822
f 774/891/823 773/889/823 788/900/823
f 767/892/824 766/890/824 781/899/824
f 767/892/827 782/902/827 783/906/827
f 760/894/828 775/904/828 776/907/828
f 768/896/829 783/906/829 784/908/829
f 761/897/830 776/907/830 777/909/830
f 770/883/831 769/882/831 784/908/831
f 762/881/832 777/909/832 778/911/832
f 770/883/833 785/910/833 786/912/833
f 764/886/834 763/884/834 778/911/834
f 772/887/835 771/885/835 786/912/835
f 764/886/836 779/913/836 780/898/836
f 772/887/837 787/914/837 788/900/837
f 785/910/838 784/908/838 800/915/838
f 777/909/839 793/931/839 794/917/839
f 785/910/840 801/916/840 802/918/840
f 779/913/841 778/911/841 794/917/841
f 787/914/842 786/912/842 802/918/842
f 779/913/843 795/919/843 796/921/843
f 787/914/844 803/920/844 804/922/844
f 781/899/845 780/898/845 796/921/845
f 788/900/1034 804/922/1034 805/924/1034
f 782/902/847 781/899/847 797/923/847
f 782/902/850 798/925/850 799/929/850
f 775/904/851 791/927/851 792/930/851
f 783/906/852 799/929/852 800/915/852
f 776/907/853 792/930/853 793/931/853
f 804/922/854 819/948/854 820/932/854
f 798/925/855 797/923/855 812/933/855
f 798/925/858 813/934/858 814/938/858
f 791/927/859 806/936/859 807/939/859
f 799/929/860 814/938/860 815/940/860
f 792/930/861 807/939/861 808/941/861
f 801/916/862 800/915/862 815/940/862
f 793/931/863 808/941/863 809/943/863
f 801/916/864 816/942/864 817/944/864
f 794/917/865 809/943/865 810/945/865
f 803/920/866 802/918/866 817/944/866
f 795/919/867 810/945/867 811/947/867
f 803/920/868 818/946/868 819/948/868
f 797/923/869 796/921/869 811/947/869
f 808/941/870 823/964/870 824/949/870
f 816/942/871 831/965/871 832/950/871
f 810/945/872 809/943/872 824/949/872
f 818/946/873 817/944/873 832/950/873
f 810/945/874 825/951/874 826/953/874
f 818/946/875 833/952/875 834/954/875
f 812/933/876 811/947/876 826/953/876
f 819/948/877 834/954/877 835/956/877
f 813/934/878 812/933/878 827/955/878
f 813/934/881 828/957/881 829/961/881
f 806/936/882 821/959/882 822/962/882
f 814/938/883 829/961/883 830/963/883
f 807/939/884 822/962/884 823/964/884
f 816/942/885 815/940/885 830/963/885
f 828/957/886 827/955/886 842/966/886
f 828/957/889 843/967/889 844/972/889
f 821/959/890 836/969/890 837/973/890
f 829/961/891 844/972/891 845/974/891
f 823/964/892 822/962/892 837/973/892
f 831/965/893 830/963/893 845/974/893
f 823/964/894 838/975/894 839/977/894
f 831/965/895 846/976/895 847/978/895
f 825/951/896 824/949/896 839/977/896
f 833/952/897 832/950/897 847/978/897
f 825/951/898 840/979/898 841/981/898
f 833/952/899 848/980/899 849/982/899
f 827/955/900 826/953/900 841/981/900
f 834/954/901 849/982/901 850/971/901
f 846/983/902 861/1013/902 862/984/902
f 840/986/903 839/1014/903 854/987/903
f 848/989/904 847/985/904 862/984/904
f 840/986/905 855/988/905 856/991/905
f 848/989/906 863/990/906 864/993/906
f 842/995/907 841/992/907 856/991/907
f 849/994/908 864/993/908 865/997/908
f 843/999/909 842/995/909 857/996/909
f 843/999/912 858/1000/912 859/1005/912
f 836/1001/913 851/1003/913 852/1007/913
f 844/1006/914 859/1005/914 860/1009/914
f 838/1011/915 837/1008/915 852/1007/915
f 846/983/916 845/1010/916 860/1009/916
f 838/1011/917 853/1012/917 854/987/917
f 858/1000/919 873/1029/919 874/1017/919
f 851/1003/920 866/1031/920 867/1018/920
f 859/1005/921 874/1017/921 875/1019/921
f 853/1012/922 852/1007/922 867/1018/922
f 861/1013/923 860/1009/923 875/1019/923
f 853/1012/924 868/1020/924 869/1022/924
f 861/1013/925 876/1021/925 877/1023/925
f 855/988/926 854/987/926 869/1022/926
f 863/990/927 862/984/927 877/1023/927
f 855/988/928 870/1024/928 871/1026/928
f 863/990/929 878/1025/929 879/1027/929
f 857/996/930 856/991/930 871/1026/930
f 864/993/931 879/1027/931 880/1016/931
f 858/1000/932 857/996/932 872/1028/932
f 878/1025/934 877/1023/934 892/1032/934
f 870/1024/935 885/1048/935 886/1034/935
f 879/1027/936 878/1025/936 893/1033/936
f 872/1028/937 871/1026/937 886/1034/937
f 879/1027/938 894/1035/938 895/1037/938
f 873/1029/939 872/1028/939 887/1036/939
f 873/1029/942 888/1038/942 889/1042/942
f 867/1018/943 866/1031/943 881/1040/943
f 874/1017/944 889/1042/944 890/1044/944
f 867/1018/945 882/1043/945 883/1045/945
f 876/1021/946 875/1019/946 890/1044/946
f 868/1020/947 883/1045/947 884/1047/947
f 876/1021/948 891/1046/948 892/1032/948
f 870/1024/949 869/1022/949 884/1047/949
f 881/1040/950 896/1063/950 897/1049/950
f 889/1042/951 904/1065/951 905/1050/951
f 883/1045/952 882/1043/952 897/1049/952
f 891/1046/953 890/1044/953 905/1050/953
f 883/1045/954 898/1051/954 899/1053/954
f 891/1046/955 906/1052/955 907/1054/955
f 885/1048/956 884/1047/956 899/1053/956
f 893/1033/957 892/1032/957 907/1054/957
f 885/1048/958 900/1055/958 901/1057/958
f 893/1033/959 908/1056/959 909/1058/959
f 887/1036/960 886/1034/960 901/1057/960
f 894/1035/961 909/1058/961 910/1060/961
f 888/1038/962 887/1036/962 902/1059/962
f 888/1038/965 903/1061/965 904/1065/965
f 900/1055/966 915/1081/966 916/1066/966
f 908/1056/967 923/1082/967 924/1067/967
f 902/1059/968 901/1057/968 916/1066/968
f 909/1058/969 924/1067/969 925/1069/969
f 903/1061/970 902/1059/970 917/1068/970
f 903/1061/973 918/1070/973 919/1074/973
f 896/1063/974 911/1072/974 912/1075/974
f 904/1065/975 919/1074/975 920/1076/975
f 898/1051/976 897/1049/976 912/1075/976
f 906/1052/977 905/1050/977 920/1076/977
f 898/1051/978 913/1077/978 914/1079/978
f 906/1052/979 921/1078/979 922/1080/979
f 900/1055/980 899/1053/980 914/1079/980
f 908/1056/981 907/1054/981 922/1080/981
f 919/1074/982 934/1098/982 935/1083/982
f 912/1075/983 927/1099/983 928/1084/983
f 921/1078/984 920/1076/984 935/1083/984
f 913/1077/985 928/1084/985 929/1086/985
f 921/1078/986 936/1085/986 937/1087/986
f 915/1081/987 914/1079/987 929/1086/987
f 923/1082/988 922/1080/988 937/1087/988
f 915/1081/989 930/1088/989 931/1090/989
f 923/1082/990 938/1089/990 939/1091/990
f 917/1068/991 916/1066/991 931/1090/991
f 924/1067/992 939/1091/992 940/1093/992
f 918/1070/993 917/1068/993 932/1092/993
f 918/1070/996 933/1094/996 934/1098/996
f 912/1075/997 911/1072/997 926/1096/997
f 938/1089/998 953/1116/998 954/1100/998
f 932/1092/999 931/1090/999 946/1101/999
f 940/1093/1000 939/1091/1000 954/1100/1000
f 933/1094/1001 932/1092/1001 947/1102/1001
f 933/1094/1004 948/1104/1004 949/1108/1004
f 926/1096/1005 941/1106/1005 942/1109/1005
f 934/1098/1006 949/1108/1006 950/1110/1006
f 928/1084/1007 927/1099/1007 942/1109/1007
f 936/1085/1008 935/1083/1008 950/1110/1008
f 928/1084/1009 943/1111/1009 944/1113/1009
f 936/1085/1010 951/1112/1010 952/1114/1010
f 930/1088/1011 929/1086/1011 944/1113/1011
f 938/1089/1012 937/1087/1012 952/1114/1012
f 930/1088/1013 945/1115/1013 946/1101/1013
f 943/1111/1014 942/1109/1014 957/591/1014
f 950/1110/1015 960/562/1015 488/567/1015
f 944/1113/1016 943/1111/1016 483/563/1016
f 951/1112/1017 488/567/1017 961/571/1017
f 945/1115/1018 944/1113/1018 958/568/1018
f 953/1116/1019 952/1114/1019 961/571/1019
f 946/1101/1020 945/1115/1020 484/572/1020
f 953/1116/1021 962/575/1021 963/579/1021
f 947/1102/1022 946/1101/1022 959/576/1022
f 954/1100/1035 963/579/1035 964/583/1035
f 948/1104/1024 947/1102/1024 485/580/1024
f 948/1104/1027 486/585/1027 487/560/1027
f 942/1109/1036 941/1106/1036 956/586/1036
f 949/1108/1029 487/560/1029 960/562/1029
o Cube.001
v 0.131002 0.093472 -0.288635
v -0.046766 -0.163106 -0.234637
v 0.099596 0.093472 -0.392024
v -0.078171 -0.163106 -0.338025
v 0.046229 0.157626 -0.262885
v -0.131538 -0.098952 -0.208886
v 0.014823 0.157626 -0.366273
v -0.162944 -0.098952 -0.312274
v 0.097034 0.005807 -0.259496
v 0.025928 -0.096824 -0.237897
v -0.015946 -0.096824 -0.375748
v 0.055161 0.005807 -0.397347
v -0.128977 -0.011287 -0.341414
v -0.057870 0.091344 -0.363013
v -0.087103 -0.011287 -0.203562
v -0.015996 0.091344 -0.225162
v 0.114758 0.092536 -0.373665
v 0.128483 0.092066 -0.344334
v 0.133391 0.092536 -0.312325
v 0.008017 -0.121847 -0.232859
v -0.010693 -0.143062 -0.229997
v -0.030367 -0.157173 -0.230979
v -0.049562 -0.171527 -0.256751
v -0.057075 -0.175757 -0.287969
v -0.068195 -0.171527 -0.318091
v 0.072466 0.031288 -0.402201
v 0.086941 0.055707 -0.403777
v 0.096168 0.077724 -0.399622
v 0.032365 0.148323 -0.379838
v 0.057664 0.131172 -0.391660
v 0.082660 0.110261 -0.395115
v -0.100293 -0.153803 -0.339541
v -0.127894 -0.136652 -0.335294
v -0.150589 -0.115741 -0.324264
v -0.039960 0.116367 -0.368051
v -0.021249 0.137582 -0.370913
v -0.001575 0.151693 -0.369931
v 0.036253 0.166047 -0.282818
v 0.025133 0.170277 -0.312941
v 0.017620 0.166047 -0.344159
v -0.165333 -0.098016 -0.288585
v -0.160425 -0.097546 -0.256575
v -0.146700 -0.098016 -0.227244
v 0.001690 0.116367 -0.230937
v 0.018832 0.137582 -0.238965
v 0.034635 0.151693 -0.250724
v 0.118646 0.110261 -0.276646
v 0.095952 0.131172 -0.265615
v 0.068351 0.148323 -0.261368
v -0.114602 -0.115741 -0.205794
v -0.089606 -0.136652 -0.209250
v -0.064307 -0.153803 -0.221072
v 0.132378 0.077724 -0.280414
v 0.127021 0.055707 -0.271829
v 0.114115 0.031288 -0.265088
v 0.079258 -0.019851 -0.254096
v 0.061481 -0.045509 -0.248696
v 0.043704 -0.071167 -0.243296
v -0.066577 -0.157173 -0.350186
v -0.050774 -0.143062 -0.361945
v -0.033632 -0.121847 -0.369972
v 0.001830 -0.071167 -0.381148
v 0.019607 -0.045509 -0.386548
v 0.037384 -0.019851 -0.391947
v -0.164320 -0.083204 -0.320495
v -0.158964 -0.061187 -0.329081
v -0.146057 -0.036768 -0.335822
v -0.111200 0.014371 -0.346813
v -0.093423 0.040029 -0.352213
v -0.075646 0.065687 -0.357613
v -0.128110 -0.083204 -0.201288
v -0.118883 -0.061187 -0.197133
v -0.104408 -0.036768 -0.198709
v -0.069326 0.014371 -0.208962
v -0.051549 0.040029 -0.214362
v -0.033773 0.065687 -0.219762
v 0.015501 0.071965 -0.213555
v 0.048371 0.048576 -0.216482
v 0.077314 0.025186 -0.232332
v -0.055606 -0.030666 -0.191956
v -0.022736 -0.054056 -0.194882
v 0.006208 -0.077445 -0.210732
v -0.064278 0.103373 -0.326953
v -0.058126 0.107383 -0.287650
v -0.041378 0.103373 -0.251566
v -0.135384 0.000742 -0.305353
v -0.129233 0.004751 -0.266050
v -0.112485 0.000742 -0.229966
v 0.023664 0.025186 -0.408954
v -0.009206 0.048576 -0.406027
v -0.038150 0.071965 -0.390177
v -0.047443 -0.077445 -0.387354
v -0.080313 -0.054056 -0.384428
v -0.109257 -0.030666 -0.368578
v 0.103442 -0.006222 -0.295556
v 0.097291 -0.010231 -0.334859
v 0.080542 -0.006222 -0.370944
v 0.032335 -0.108853 -0.273957
v 0.026184 -0.112863 -0.313260
v 0.009436 -0.108853 -0.349344
v 0.014371 -0.133850 -0.268722
v -0.004720 -0.154879 -0.264476
v -0.025889 -0.168281 -0.261517
v 0.008231 -0.137875 -0.307807
v -0.010775 -0.159011 -0.302033
v -0.032003 -0.172596 -0.295585
v -0.008405 -0.133850 -0.343703
v -0.026633 -0.154879 -0.336613
v -0.045870 -0.168281 -0.327298
v -0.064987 -0.102605 -0.381514
v -0.081136 -0.124782 -0.373035
v -0.094129 -0.141632 -0.359547
v -0.097705 -0.079377 -0.378600
v -0.112789 -0.102679 -0.370206
v -0.123394 -0.122130 -0.356659
v -0.126468 -0.056079 -0.362839
v -0.140284 -0.080021 -0.355068
v -0.148065 -0.100815 -0.343164
v -0.152459 -0.024835 -0.299945
v -0.165320 -0.049924 -0.294485
v -0.170234 -0.074167 -0.289521
v -0.146368 -0.020879 -0.260846
v -0.159650 -0.046347 -0.256811
v -0.165376 -0.071664 -0.255072
v -0.129683 -0.024835 -0.224964
v -0.143408 -0.049924 -0.222348
v -0.150252 -0.074167 -0.223741
v -0.073101 -0.056079 -0.187152
v -0.088905 -0.080021 -0.185925
v -0.101993 -0.100815 -0.191491
v -0.040431 -0.079377 -0.190052
v -0.057636 -0.102679 -0.188638
v -0.073984 -0.122130 -0.193998
v -0.011621 -0.102605 -0.205828
v -0.029757 -0.124782 -0.203892
v -0.048057 -0.141632 -0.207874
v 0.097926 0.117017 -0.370755
v 0.072470 0.140011 -0.364941
v 0.044323 0.157582 -0.354473
v 0.111174 0.118639 -0.339076
v 0.084983 0.142971 -0.331121
v 0.054691 0.161383 -0.321919
v 0.117784 0.117017 -0.305382
v 0.093395 0.140011 -0.296055
v 0.064181 0.157582 -0.289099
v -0.149727 -0.122497 -0.295528
v -0.125337 -0.145491 -0.304855
v -0.096124 -0.163062 -0.311811
v -0.143116 -0.124119 -0.261833
v -0.116925 -0.148451 -0.269789
v -0.086633 -0.166863 -0.278990
v -0.129869 -0.122497 -0.230154
v -0.104412 -0.145491 -0.235969
v -0.076266 -0.163062 -0.246437
v 0.062187 0.136152 -0.241362
v 0.049194 0.119302 -0.227874
v 0.033045 0.097125 -0.219395
v 0.091452 0.116650 -0.244251
v 0.080847 0.097199 -0.230704
v 0.065763 0.073897 -0.222310
v 0.116123 0.095335 -0.257746
v 0.108342 0.074541 -0.245841
v 0.094525 0.050599 -0.238071
v -0.002276 0.046307 -0.208156
v -0.020052 0.020649 -0.202756
v -0.037829 -0.005009 -0.197356
v 0.030594 0.022918 -0.211082
v 0.012817 -0.002740 -0.205682
v -0.004960 -0.028398 -0.200282
v 0.059538 -0.000471 -0.226932
v 0.041761 -0.026129 -0.221532
v 0.023984 -0.051787 -0.216132
v -0.006053 0.162801 -0.339392
v -0.027222 0.149399 -0.336434
v -0.046313 0.128370 -0.332188
v 0.000060 0.167116 -0.305325
v -0.021167 0.153531 -0.298876
v -0.040174 0.132395 -0.293103
v 0.013928 0.162801 -0.273612
v -0.005309 0.149399 -0.264297
v -0.023537 0.128370 -0.257207
v -0.082054 0.077715 -0.321553
v -0.099831 0.052057 -0.316153
v -0.117608 0.026400 -0.310753
v -0.075903 0.081725 -0.282250
v -0.093679 0.056067 -0.276850
v -0.111456 0.030409 -0.271450
v -0.059154 0.077715 -0.246166
v -0.076931 0.052057 -0.240766
v -0.094708 0.026400 -0.235366
v 0.070051 0.095335 -0.409419
v 0.056963 0.074541 -0.414984
v 0.041159 0.050599 -0.413758
v 0.042042 0.116650 -0.406912
v 0.025694 0.097199 -0.412272
v 0.008489 0.073897 -0.410858
v 0.016115 0.136152 -0.393035
v -0.002185 0.119302 -0.397017
v -0.020322 0.097125 -0.395082
v 0.005887 -0.000471 -0.403554
v -0.011890 -0.026129 -0.398154
v -0.029667 -0.051787 -0.392754
v -0.026983 0.022918 -0.400627
v -0.044759 -0.002740 -0.395228
v -0.062536 -0.028398 -0.389828
v -0.055926 0.046307 -0.384777
v -0.073703 0.020649 -0.379378
v -0.091480 -0.005009 -0.373978
v 0.138292 0.068687 -0.311389
v 0.133378 0.044444 -0.306425
v 0.120517 0.019355 -0.300965
v 0.133433 0.066184 -0.345838
v 0.127708 0.040867 -0.344099
v 0.114425 0.015399 -0.340064
v 0.118310 0.068687 -0.377169
v 0.111466 0.044444 -0.378562
v 0.097740 0.019355 -0.375946
v 0.085666 -0.031880 -0.290156
v 0.067889 -0.057538 -0.284757
v 0.050112 -0.083195 -0.279357
v 0.079514 -0.035889 -0.329459
v 0.061737 -0.061547 -0.324060
v 0.043960 -0.087205 -0.318660
v 0.062766 -0.031880 -0.365544
v 0.044989 -0.057538 -0.360144
v 0.027212 -0.083195 -0.354744
vn 0.7350 -0.6765 -0.0458
vn 0.5356 -0.8445 0.0066
vn 0.6363 -0.6765 -0.3708
vn 0.4414 -0.8445 -0.3033
vn 0.8011 -0.5104 0.3126
vn 0.7297 -0.5991 0.3298
vn 0.5330 -0.7678 0.3555
vn 0.2444 -0.8992 0.3629
vn 0.1535 -0.9834 0.0970
vn 0.0736 -0.9834 -0.1660
vn 0.0012 -0.8992 -0.4375
vn 0.2452 -0.7678 -0.5919
vn 0.4230 -0.5991 -0.6799
vn 0.4919 -0.5104 -0.7054
vn 0.7044 -0.5914 -0.3926
vn 0.8038 -0.5914 -0.0655
vn -0.2266 -0.1995 -0.9533
vn -0.3557 -0.4013 -0.8440
vn -0.4836 -0.0050 -0.8753
vn -0.6012 -0.2155 -0.7695
vn 0.1477 -0.3204 -0.9357
vn 0.0825 -0.4103 -0.9082
vn -0.0653 -0.5946 -0.8013
vn -0.2314 -0.7691 -0.5958
vn -0.4980 -0.6648 -0.5568
vn -0.7073 -0.5064 -0.4932
vn -0.8700 -0.2858 -0.4018
vn -0.8198 -0.0236 -0.5721
vn -0.7210 0.1977 -0.6641
vn -0.6624 0.2927 -0.6896
vn -0.4252 0.0845 -0.9011
vn -0.1665 -0.1113 -0.9797
vn -0.8783 0.4697 0.0893
vn -0.9672 0.2215 0.1245
vn -0.7796 0.4697 0.4143
vn -0.8730 0.2215 0.4345
vn -0.8204 0.4826 -0.3068
vn -0.8778 0.3853 -0.2848
vn -0.9633 0.1467 -0.2248
vn -0.9773 -0.1587 -0.1403
vn -0.9693 -0.1942 0.1508
vn -0.8894 -0.1942 0.4138
vn -0.7342 -0.1587 0.6602
vn -0.6755 0.1467 0.7226
vn -0.5711 0.3853 0.7249
vn -0.5111 0.4826 0.7112
vn -0.7232 0.5642 0.3984
vn -0.8226 0.5642 0.0712
vn 0.0849 -0.0050 0.9964
vn -0.0717 -0.2155 0.9739
vn 0.3420 -0.1995 0.9183
vn 0.1739 -0.4013 0.8993
vn -0.1669 0.2927 0.9415
vn -0.2298 0.1977 0.9529
vn -0.3631 -0.0236 0.9315
vn -0.4995 -0.2858 0.8178
vn -0.3135 -0.5064 0.8033
vn -0.1042 -0.6648 0.7397
vn 0.1391 -0.7691 0.6238
vn 0.3914 -0.5946 0.7023
vn 0.5737 -0.4103 0.7089
vn 0.6432 -0.3204 0.6955
vn 0.4066 -0.1113 0.9068
vn 0.1478 0.0845 0.9854
vn 0.6026 0.7435 -0.2901
vn 0.4478 0.8610 -0.2410
vn 0.6621 0.7435 -0.0941
vn 0.5063 0.8610 -0.0488
vn 0.6700 0.5232 -0.5266
vn 0.5089 0.6927 -0.5111
vn 0.3408 0.8199 -0.4601
vn 0.1635 0.8994 -0.4054
vn 0.2312 0.9551 -0.1852
vn 0.2952 0.9551 0.0253
vn 0.3614 0.8994 0.2460
vn 0.5391 0.8199 0.1928
vn 0.7072 0.6927 0.1417
vn 0.8497 0.5232 0.0650
vn 0.8181 0.5593 -0.1337
vn 0.7542 0.5593 -0.3439
vn -0.6621 -0.7435 0.0941
vn -0.5063 -0.8610 0.0488
vn -0.6026 -0.7435 0.2901
vn -0.4478 -0.8610 0.2410
vn -0.8497 -0.5232 -0.0650
vn -0.7072 -0.6927 -0.1417
vn -0.5391 -0.8199 -0.1928
vn -0.3614 -0.8994 -0.2460
vn -0.2952 -0.9551 -0.0253
vn -0.2312 -0.9551 0.1852
vn -0.1635 -0.8994 0.4054
vn -0.3408 -0.8199 0.4601
vn -0.5089 -0.6927 0.5111
vn -0.6700 -0.5232 0.5266
vn -0.7542 -0.5593 0.3439
vn -0.8181 -0.5593 0.1337
vn 0.3498 0.3994 0.8474
vn 0.2240 0.1987 0.9541
vn 0.6019 0.2087 0.7708
vn 0.4837 0.0022 0.8752
vn 0.2880 0.7762 0.5609
vn 0.0624 0.5939 0.8021
vn -0.0853 0.4094 0.9083
vn -0.1479 0.3203 0.9357
vn 0.1663 0.1113 0.9798
vn 0.4252 -0.0847 0.9011
vn 0.6623 -0.2929 0.6896
vn 0.7208 -0.2006 0.6635
vn 0.8200 0.0206 0.5720
vn 0.8523 0.3491 0.3895
vn 0.7067 0.5063 0.4942
vn 0.4984 0.6640 0.5575
vn 0.1572 0.0979 0.9827
vn 0.4160 -0.0979 0.9041
vn -0.1573 0.3066 0.9388
vn 0.6529 -0.3066 0.6927
vn -0.5410 0.8409 -0.0094
vn -0.7368 0.6747 0.0445
vn -0.4444 0.8409 0.3088
vn -0.6371 0.6747 0.3728
vn -0.1928 0.9249 -0.3278
vn -0.5348 0.7659 -0.3570
vn -0.7309 0.5967 -0.3311
vn -0.8011 0.5103 -0.3127
vn -0.8039 0.5912 0.0654
vn -0.7044 0.5912 0.3928
vn -0.4919 0.5103 0.7055
vn -0.4233 0.5967 0.6817
vn -0.2459 0.7659 0.5941
vn 0.0221 0.9249 0.3797
vn -0.0747 0.9834 0.1657
vn -0.1542 0.9834 -0.0961
vn -0.8133 0.5779 0.0683
vn -0.7139 0.5779 0.3956
vn -0.8108 0.4966 -0.3097
vn -0.5016 0.4966 0.7084
vn 0.0715 0.2087 -0.9754
vn -0.0849 0.0022 -0.9964
vn -0.1806 0.3994 -0.8988
vn -0.3445 0.1987 -0.9175
vn 0.4917 0.3491 -0.7977
vn 0.3633 0.0206 -0.9314
vn 0.2300 -0.2006 -0.9523
vn 0.1669 -0.2929 -0.9415
vn -0.1478 -0.0847 -0.9854
vn -0.4068 0.1113 -0.9067
vn -0.6433 0.3203 -0.6954
vn -0.5761 0.4094 -0.7074
vn -0.3942 0.5939 -0.7013
vn -0.0727 0.7762 -0.6263
vn 0.1041 0.6640 -0.7405
vn 0.3125 0.5063 -0.8038
vn -0.1572 -0.0979 -0.9827
vn -0.4160 0.0979 -0.9041
vn 0.1573 -0.3066 -0.9388
vn -0.6529 0.3066 -0.6927
vn 0.9665 -0.2268 -0.1198
vn 0.8775 -0.4715 -0.0873
vn 0.8699 -0.2268 -0.4380
vn 0.7778 -0.4715 -0.4155
vn 0.9748 0.2039 0.0903
vn 0.9625 -0.1486 0.2271
vn 0.8764 -0.3868 0.2870
vn 0.8203 -0.4827 0.3069
vn 0.8225 -0.5643 -0.0711
vn 0.7231 -0.5643 -0.3984
vn 0.5110 -0.4827 -0.7113
vn 0.5687 -0.3868 -0.7259
vn 0.6736 -0.1486 -0.7240
vn 0.7599 0.2039 -0.6172
vn 0.8899 0.1932 -0.4133
vn 0.9694 0.1932 -0.1515
vn 0.8133 -0.5779 -0.0683
vn 0.7139 -0.5779 -0.3956
vn 0.8108 -0.4966 0.3097
vn 0.5016 -0.4966 -0.7084
vn 0.7368 -0.6747 -0.0445
vn 0.5410 -0.8409 0.0094
vn 0.6371 -0.6747 -0.3728
vn 0.4444 -0.8409 -0.3088
vn 0.8011 -0.5103 0.3127
vn 0.7309 -0.5967 0.3311
vn 0.5348 -0.7659 0.3570
vn 0.1928 -0.9249 0.3278
vn 0.1542 -0.9834 0.0961
vn 0.0747 -0.9834 -0.1657
vn -0.0221 -0.9249 -0.3797
vn 0.2459 -0.7659 -0.5941
vn 0.4233 -0.5967 -0.6817
vn 0.4919 -0.5103 -0.7055
vn 0.7044 -0.5912 -0.3928
vn 0.8039 -0.5912 -0.0654
vn -0.2240 -0.1987 -0.9541
vn -0.3498 -0.3994 -0.8474
vn -0.4837 -0.0022 -0.8752
vn -0.6019 -0.2087 -0.7708
vn 0.1479 -0.3203 -0.9357
vn 0.0853 -0.4094 -0.9083
vn -0.0624 -0.5939 -0.8021
vn -0.2880 -0.7762 -0.5609
vn -0.4984 -0.6640 -0.5575
vn -0.7067 -0.5063 -0.4942
vn -0.8523 -0.3491 -0.3895
vn -0.8200 -0.0206 -0.5720
vn -0.7208 0.2006 -0.6635
vn -0.6623 0.2929 -0.6896
vn -0.4252 0.0847 -0.9011
vn -0.1663 -0.1113 -0.9798
vn -0.8775 0.4715 0.0873
vn -0.9665 0.2268 0.1198
vn -0.7778 0.4715 0.4155
vn -0.8699 0.2268 0.4380
vn -0.8203 0.4827 -0.3069
vn -0.8764 0.3868 -0.2870
vn -0.9625 0.1486 -0.2271
vn -0.9748 -0.2039 -0.0903
vn -0.9694 -0.1932 0.1515
vn -0.8899 -0.1932 0.4133
vn -0.7599 -0.2039 0.6172
vn -0.6736 0.1486 0.7240
vn -0.5687 0.3868 0.7259
vn -0.5110 0.4827 0.7113
vn -0.7231 0.5643 0.3984
vn -0.8225 0.5643 0.0711
vn 0.0849 -0.0022 0.9964
vn -0.0715 -0.2087 0.9754
vn 0.3445 -0.1987 0.9175
vn 0.1806 -0.3994 0.8988
vn -0.1669 0.2929 0.9415
vn -0.2300 0.2006 0.9523
vn -0.3633 -0.0206 0.9314
vn -0.4917 -0.3491 0.7977
vn -0.3125 -0.5063 0.8038
vn -0.1041 -0.6640 0.7405
vn 0.0727 -0.7762 0.6263
vn 0.3942 -0.5939 0.7013
vn 0.5761 -0.4094 0.7074
vn 0.6433 -0.3203 0.6954
vn 0.4068 -0.1113 0.9067
vn 0.1478 0.0847 0.9854
vn 0.6046 0.7424 -0.2887
vn 0.4488 0.8599 -0.2433
vn 0.6630 0.7424 -0.0964
vn 0.5083 0.8599 -0.0474
vn 0.6368 0.5412 -0.5492
vn 0.5088 0.6927 -0.5111
vn 0.3408 0.8199 -0.4600
vn 0.1487 0.9177 -0.3683
vn 0.2312 0.9551 -0.1851
vn 0.2951 0.9551 0.0252
vn 0.3284 0.9177 0.2234
vn 0.5391 0.8199 0.1927
vn 0.7072 0.6927 0.1418
vn 0.8346 0.5412 0.1022
vn 0.8181 0.5594 -0.1336
vn 0.7541 0.5594 -0.3440
vn -0.6630 -0.7424 0.0964
vn -0.5083 -0.8599 0.0474
vn -0.6046 -0.7424 0.2887
vn -0.4488 -0.8599 0.2433
vn -0.8346 -0.5412 -0.1022
vn -0.7072 -0.6927 -0.1418
vn -0.5391 -0.8199 -0.1927
vn -0.3284 -0.9177 -0.2234
vn -0.2951 -0.9551 -0.0252
vn -0.2312 -0.9551 0.1851
vn -0.1487 -0.9177 0.3683
vn -0.3408 -0.8199 0.4600
vn -0.5088 -0.6927 0.5111
vn -0.6368 -0.5412 0.5492
vn -0.7541 -0.5594 0.3440
vn -0.8181 -0.5594 0.1336
vn 0.3557 0.4013 0.8440
vn 0.2266 0.1995 0.9533
vn 0.6012 0.2155 0.7695
vn 0.4836 0.0050 0.8753
vn 0.2314 0.7691 0.5958
vn 0.0653 0.5946 0.8013
vn -0.0825 0.4103 0.9082
vn -0.1477 0.3204 0.9357
vn 0.1665 0.1113 0.9797
vn 0.4252 -0.0845 0.9011
vn 0.6624 -0.2927 0.6896
vn 0.7210 -0.1977 0.6641
vn 0.8198 0.0236 0.5721
vn 0.8700 0.2858 0.4018
vn 0.7073 0.5064 0.4932
vn 0.4980 0.6648 0.5568
vn -0.5356 0.8445 -0.0066
vn -0.7350 0.6765 0.0458
vn -0.4414 0.8445 0.3033
vn -0.6363 0.6765 0.3708
vn -0.2444 0.8992 -0.3629
vn -0.5330 0.7678 -0.3555
vn -0.7297 0.5991 -0.3298
vn -0.8011 0.5104 -0.3126
vn -0.8038 0.5914 0.0655
vn -0.7044 0.5914 0.3926
vn -0.4919 0.5104 0.7054
vn -0.4230 0.5991 0.6799
vn -0.2452 0.7678 0.5919
vn -0.0012 0.8992 0.4375
vn -0.0736 0.9834 0.1660
vn -0.1535 0.9834 -0.0970
vn 0.0717 0.2155 -0.9739
vn -0.0849 0.0050 -0.9964
vn -0.1739 0.4013 -0.8993
vn -0.3420 0.1995 -0.9183
vn 0.4995 0.2858 -0.8178
vn 0.3631 0.0236 -0.9315
vn 0.2298 -0.1977 -0.9529
vn 0.1669 -0.2927 -0.9415
vn -0.1478 -0.0845 -0.9854
vn -0.4066 0.1113 -0.9068
vn -0.6432 0.3204 -0.6955
vn -0.5737 0.4103 -0.7089
vn -0.3914 0.5946 -0.7023
vn -0.1391 0.7691 -0.6238
vn 0.1042 0.6648 -0.7397
vn 0.3135 0.5064 -0.8033
vn 0.9672 -0.2215 -0.1245
vn 0.8783 -0.4697 -0.0893
vn 0.8730 -0.2215 -0.4345
vn 0.7796 -0.4697 -0.4143
vn 0.9773 0.1587 0.1403
vn 0.9633 -0.1467 0.2248
vn 0.8778 -0.3853 0.2848
vn 0.8204 -0.4826 0.3068
vn 0.8226 -0.5642 -0.0712
vn 0.7232 -0.5642 -0.3984
vn 0.5111 -0.4826 -0.7112
vn 0.5711 -0.3853 -0.7249
vn 0.6755 -0.1467 -0.7226
vn 0.7342 0.1587 -0.6602
vn 0.8894 0.1942 -0.4138
vn 0.9693 0.1942 -0.1508
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1066/1119/1037 1068/1120/1037 1065/1121/1037
f 1067/1122/1038 1069/1123/1038 1066/1119/1038
f 1068/1120/1039 1072/1124/1039 1071/1125/1039
f 1069/1123/1040 1073/1126/1040 1072/1124/1040
f 984/1127/1041 1062/1128/1041 974/1129/1041
f 985/1130/1042 1065/1121/1042 984/1127/1042
f 986/1131/1043 1066/1119/1043 985/1130/1043
f 966/1132/1044 1067/1122/1044 986/1131/1044
f 1067/1122/1045 988/1133/1045 1070/1134/1045
f 988/1133/1046 1073/1126/1046 1070/1134/1046
f 1073/1126/1047 968/1135/1047 1023/1136/1047
f 1072/1124/1048 1023/1136/1048 1024/1137/1048
f 1071/1125/1049 1024/1137/1049 1025/1138/1049
f 1064/1139/1050 1025/1138/1050 975/1140/1050
f 1063/1141/1051 1071/1125/1051 1064/1139/1051
f 1065/1121/1052 1063/1141/1052 1062/1128/1052
f 1075/1142/1053 1077/1143/1053 1074/1144/1053
f 1076/1145/1054 1078/1146/1054 1075/1142/1054
f 1077/1143/1055 1081/1147/1055 1080/1148/1055
f 1078/1146/1056 1082/1149/1056 1081/1147/1056
f 1025/1138/1057 1056/1150/1057 975/1140/1057
f 1024/1137/1058 1074/1144/1058 1025/1138/1058
f 1023/1136/1059 1075/1142/1059 1024/1137/1059
f 968/1135/1060 1076/1145/1060 1023/1136/1060
f 1076/1145/1061 997/1151/1061 1079/1152/1061
f 997/1151/1062 1082/1149/1062 1079/1152/1062
f 1082/1149/1063 972/1153/1063 1029/1154/1063
f 1081/1147/1064 1029/1154/1064 1030/1155/1064
f 1080/1148/1065 1030/1155/1065 1031/1156/1065
f 1058/1157/1066 1031/1156/1066 977/1158/1066
f 1057/1159/1067 1080/1148/1067 1058/1157/1067
f 1074/1144/1068 1057/1159/1068 1056/1150/1068
f 1084/1160/1069 1086/1161/1069 1083/1162/1069
f 1085/1163/1070 1087/1164/1070 1084/1160/1070
f 1086/1161/1071 1090/1165/1071 1089/1166/1071
f 1087/1164/1072 1091/1167/1072 1090/1165/1072
f 1031/1156/1073 1050/1168/1073 977/1158/1073
f 1030/1155/1074 1083/1162/1074 1031/1156/1074
f 1029/1154/1075 1084/1160/1075 1030/1155/1075
f 972/1153/1076 1085/1163/1076 1029/1154/1076
f 1085/1163/1077 1006/1169/1077 1088/1170/1077
f 1006/1169/1078 1091/1167/1078 1088/1170/1078
f 1091/1167/1079 970/1171/1079 1035/1172/1079
f 1090/1165/1080 1035/1172/1080 1036/1173/1080
f 1089/1166/1081 1036/1173/1081 1037/1174/1081
f 1052/1175/1082 1037/1174/1082 979/1176/1082
f 1051/1177/1083 1089/1166/1083 1052/1175/1083
f 1083/1162/1084 1051/1177/1084 1050/1168/1084
f 1093/1178/1085 1095/1179/1085 1092/1180/1085
f 1094/1181/1086 1096/1182/1086 1093/1178/1086
f 1095/1179/1087 1099/1183/1087 1098/1184/1087
f 1096/1182/1088 1100/1185/1088 1099/1183/1088
f 1037/1174/1089 1044/1186/1089 979/1176/1089
f 1036/1173/1090 1092/1180/1090 1037/1174/1090
f 1035/1172/1091 1093/1178/1091 1036/1173/1091
f 970/1171/1092 1094/1181/1092 1035/1172/1092
f 1094/1181/1093 1015/1187/1093 1097/1188/1093
f 1015/1187/1094 1100/1185/1094 1097/1188/1094
f 1100/1185/1095 966/1189/1095 986/1190/1095
f 1099/1183/1096 986/1190/1096 985/1191/1096
f 1098/1184/1097 985/1191/1097 984/1192/1097
f 1046/1193/1098 984/1192/1098 974/1194/1098
f 1045/1195/1099 1098/1184/1099 1046/1193/1099
f 1092/1180/1100 1045/1195/1100 1044/1186/1100
f 1101/1196/1101 1105/1197/1101 1104/1198/1101
f 1103/1199/1102 1105/1197/1102 1102/1200/1102
f 1105/1197/1103 1107/1201/1103 1104/1198/1103
f 1105/1197/1104 1109/1202/1104 1108/1203/1104
f 967/1204/1105 1101/1196/1105 981/1205/1105
f 995/1206/1106 1102/1200/1106 1101/1196/1106
f 993/1207/1107 1102/1200/1107 994/1208/1107
f 971/1209/1108 1103/1199/1108 993/1207/1108
f 1004/1210/1109 1106/1211/1109 1103/1199/1109
f 1106/1211/1110 1002/1212/1110 1109/1202/1110
f 1109/1202/1111 969/1213/1111 1013/1214/1111
f 1108/1203/1112 1013/1214/1112 1012/1215/1112
f 1108/1203/1113 1011/1216/1113 1107/1201/1113
f 1107/1201/1114 965/1217/1114 983/1218/1114
f 1104/1198/1115 983/1218/1115 982/1219/1115
f 981/1205/1116 1104/1198/1116 982/1219/1116
f 1110/1220/1117 1114/1221/1117 1113/1222/1117
f 1112/1223/1118 1114/1221/1118 1111/1224/1118
f 1114/1221/1119 1116/1225/1119 1113/1222/1119
f 1114/1221/1120 1118/1226/1120 1117/1227/1120
f 972/1153/1121 1110/1220/1121 1005/1228/1121
f 998/1229/1122 1111/1224/1122 1110/1220/1122
f 996/1230/1123 1111/1224/1123 997/1231/1123
f 968/1232/1124 1112/1223/1124 996/1230/1124
f 989/1233/1125 1115/1234/1125 1112/1223/1125
f 1115/1234/1126 987/1235/1126 1118/1226/1126
f 1118/1226/1127 966/1236/1127 1016/1237/1127
f 1117/1227/1128 1016/1237/1128 1015/1238/1128
f 1117/1227/1129 1014/1239/1129 1116/1225/1129
f 1116/1225/1130 970/1171/1130 1007/1240/1130
f 1113/1222/1131 1007/1240/1131 1006/1169/1131
f 1005/1228/1132 1113/1222/1132 1006/1169/1132
f 1119/1241/1133 1123/1242/1133 1122/1243/1133
f 1120/1244/1134 1124/1245/1134 1123/1242/1134
f 1123/1242/1135 1125/1246/1135 1122/1243/1135
f 1124/1245/1136 1126/1247/1136 1123/1242/1136
f 969/1213/1137 1119/1241/1137 1013/1248/1137
f 1010/1249/1138 1120/1244/1138 1119/1241/1138
f 1009/1250/1139 1121/1251/1139 1120/1244/1139
f 1008/1252/1140 1041/1253/1140 1121/1251/1140
f 1121/1251/1141 1042/1254/1141 1124/1245/1141
f 1042/1254/1142 1127/1255/1142 1124/1245/1142
f 1043/1256/1143 1019/1257/1143 1127/1255/1143
f 1127/1255/1144 1018/1258/1144 1126/1247/1144
f 1126/1247/1145 1017/1259/1145 1125/1246/1145
f 1125/1246/1146 965/1260/1146 1011/1261/1146
f 1012/1262/1147 1125/1246/1147 1011/1261/1147
f 1119/1241/1148 1012/1262/1148 1013/1248/1148
f 1129/1263/1149 1131/1264/1149 1128/1265/1149
f 1130/1266/1149 1132/1267/1149 1129/1263/1149
f 1132/1267/1150 1134/1268/1150 1131/1264/1150
f 1133/1269/1150 1135/1270/1150 1132/1267/1150
f 1040/1271/1151 1041/1253/1151 980/1272/1151
f 1039/1273/1151 1128/1265/1151 1040/1271/1151
f 1039/1273/1151 1130/1266/1151 1129/1263/1151
f 979/1176/1151 1130/1266/1151 1038/1274/1151
f 1044/1186/1149 1133/1269/1149 1130/1266/1149
f 1045/1195/1150 1136/1275/1150 1133/1269/1150
f 1046/1193/1152 1022/1276/1152 1136/1275/1152
f 1136/1275/1152 1021/1277/1152 1135/1270/1152
f 1135/1270/1152 1020/1278/1152 1134/1268/1152
f 1134/1268/1152 973/1279/1152 1043/1256/1152
f 1131/1264/1150 1043/1256/1150 1042/1254/1150
f 1128/1265/1149 1042/1254/1149 1041/1253/1149
f 1137/1280/1153 1141/1281/1153 1140/1282/1153
f 1138/1283/1154 1142/1284/1154 1141/1281/1154
f 1141/1281/1155 1143/1285/1155 1140/1282/1155
f 1142/1284/1156 1144/1286/1156 1141/1281/1156
f 971/1209/1157 1137/1280/1157 1004/1210/1157
f 1001/1287/1158 1138/1283/1158 1137/1280/1158
f 1000/1288/1159 1139/1289/1159 1138/1283/1159
f 999/1290/1160 1047/1291/1160 1139/1289/1160
f 1139/1289/1161 1048/1292/1161 1142/1284/1161
f 1048/1292/1162 1145/1293/1162 1142/1284/1162
f 1049/1294/1163 1008/1252/1163 1145/1293/1163
f 1145/1293/1164 1009/1250/1164 1144/1286/1164
f 1144/1286/1165 1010/1249/1165 1143/1285/1165
f 1143/1285/1166 969/1213/1166 1002/1212/1166
f 1003/1295/1167 1143/1285/1167 1002/1212/1167
f 1137/1280/1168 1003/1295/1168 1004/1210/1168
f 1147/1296/1169 1149/1297/1169 1146/1298/1169
f 1148/1299/1169 1150/1300/1169 1147/1296/1169
f 1150/1300/1170 1152/1301/1170 1149/1297/1170
f 1151/1302/1170 1153/1303/1170 1150/1300/1170
f 1034/1304/1171 1047/1291/1171 978/1305/1171
f 1034/1304/1171 1147/1296/1171 1146/1298/1171
f 1032/1306/1171 1147/1296/1171 1033/1307/1171
f 1032/1306/1171 1050/1168/1171 1148/1299/1171
f 1148/1299/1169 1051/1308/1169 1151/1302/1169
f 1051/1308/1170 1154/1309/1170 1151/1302/1170
f 1052/1175/1172 1038/1310/1172 1154/1309/1172
f 1153/1303/1172 1038/1310/1172 1039/1311/1172
f 1153/1303/1172 1040/1271/1172 1152/1301/1172
f 1152/1301/1172 980/1312/1172 1049/1294/1172
f 1149/1297/1170 1049/1294/1170 1048/1313/1170
f 1047/1291/1169 1149/1297/1169 1048/1313/1169
f 1155/1314/1173 1159/1315/1173 1158/1316/1173
f 1156/1317/1174 1160/1318/1174 1159/1315/1174
f 1159/1315/1175 1161/1319/1175 1158/1316/1175
f 1160/1318/1176 1162/1320/1176 1159/1315/1176
f 967/1321/1177 1155/1314/1177 995/1322/1177
f 992/1323/1178 1156/1317/1178 1155/1314/1178
f 991/1324/1179 1157/1325/1179 1156/1317/1179
f 990/1326/1180 1053/1327/1180 1157/1325/1180
f 1157/1325/1181 1054/1328/1181 1160/1318/1181
f 1054/1328/1182 1163/1329/1182 1160/1318/1182
f 1055/1330/1183 999/1290/1183 1163/1329/1183
f 1163/1329/1184 1000/1288/1184 1162/1320/1184
f 1162/1320/1185 1001/1287/1185 1161/1319/1185
f 1161/1319/1186 971/1209/1186 993/1331/1186
f 994/1332/1187 1161/1319/1187 993/1331/1187
f 1155/1314/1188 994/1332/1188 995/1322/1188
f 1165/1333/1189 1167/1334/1189 1164/1335/1189
f 1166/1336/1189 1168/1337/1189 1165/1333/1189
f 1168/1337/1190 1170/1338/1190 1167/1334/1190
f 1169/1339/1190 1171/1340/1190 1168/1337/1190
f 1028/1341/1191 1053/1342/1191 976/1343/1191
f 1028/1341/1191 1165/1333/1191 1164/1335/1191
f 1026/1344/1191 1165/1333/1191 1027/1345/1191
f 1026/1344/1191 1056/1346/1191 1166/1336/1191
f 1166/1336/1189 1057/1347/1189 1169/1339/1189
f 1057/1347/1190 1172/1348/1190 1169/1339/1190
f 1058/1349/1192 1032/1306/1192 1172/1348/1192
f 1172/1348/1192 1033/1307/1192 1171/1340/1192
f 1170/1338/1192 1033/1307/1192 1034/1304/1192
f 1170/1338/1192 978/1305/1192 1055/1350/1192
f 1167/1334/1190 1055/1350/1190 1054/1351/1190
f 1053/1342/1189 1167/1334/1189 1054/1351/1189
f 1173/1352/1193 1177/1353/1193 1176/1354/1193
f 1174/1355/1194 1178/1356/1194 1177/1353/1194
f 1177/1353/1195 1179/1357/1195 1176/1354/1195
f 1178/1356/1196 1180/1358/1196 1177/1353/1196
f 965/1359/1197 1173/1352/1197 983/1360/1197
f 1017/1361/1198 1174/1355/1198 1173/1352/1198
f 1018/1362/1199 1175/1363/1199 1174/1355/1199
f 1019/1364/1200 1059/1365/1200 1175/1363/1200
f 1175/1363/1201 1060/1366/1201 1178/1356/1201
f 1060/1366/1202 1181/1367/1202 1178/1356/1202
f 1061/1368/1203 990/1326/1203 1181/1367/1203
f 1181/1367/1204 991/1324/1204 1180/1358/1204
f 1180/1358/1205 992/1323/1205 1179/1357/1205
f 1179/1357/1206 967/1321/1206 981/1369/1206
f 982/1370/1207 1179/1357/1207 981/1369/1207
f 1173/1352/1208 982/1370/1208 983/1360/1208
f 1183/1371/1209 1185/1372/1209 1182/1373/1209
f 1184/1374/1209 1186/1375/1209 1183/1371/1209
f 1186/1375/1210 1188/1376/1210 1185/1372/1210
f 1187/1377/1210 1189/1378/1210 1186/1375/1210
f 973/1379/1211 1182/1373/1211 1059/1365/1211
f 1021/1380/1211 1182/1373/1211 1020/1381/1211
f 1022/1382/1211 1183/1371/1211 1021/1380/1211
f 974/1129/1211 1184/1374/1211 1022/1382/1211
f 1062/1128/1209 1187/1377/1209 1184/1374/1209
f 1063/1141/1210 1190/1383/1210 1187/1377/1210
f 1064/1139/1212 1026/1384/1212 1190/1383/1212
f 1190/1383/1212 1027/1385/1212 1189/1378/1212
f 1189/1378/1212 1028/1386/1212 1188/1376/1212
f 1061/1368/1212 1028/1386/1212 976/1387/1212
f 1185/1372/1210 1061/1368/1210 1060/1366/1210
f 1059/1365/1209 1185/1372/1209 1060/1366/1209
f 1066/1119/1213 1069/1123/1213 1068/1120/1213
f 1067/1122/1214 1070/1134/1214 1069/1123/1214
f 1068/1120/1215 1069/1123/1215 1072/1124/1215
f 1069/1123/1216 1070/1134/1216 1073/1126/1216
f 984/1127/1217 1065/1121/1217 1062/1128/1217
f 985/1130/1218 1066/1119/1218 1065/1121/1218
f 986/1131/1219 1067/1122/1219 1066/1119/1219
f 966/1132/1220 987/1388/1220 1067/1122/1220
f 1067/1122/1221 987/1388/1221 988/1133/1221
f 988/1133/1222 989/1389/1222 1073/1126/1222
f 1073/1126/1223 989/1389/1223 968/1135/1223
f 1072/1124/1224 1073/1126/1224 1023/1136/1224
f 1071/1125/1225 1072/1124/1225 1024/1137/1225
f 1064/1139/1226 1071/1125/1226 1025/1138/1226
f 1063/1141/1227 1068/1120/1227 1071/1125/1227
f 1065/1121/1228 1068/1120/1228 1063/1141/1228
f 1075/1142/1229 1078/1146/1229 1077/1143/1229
f 1076/1145/1230 1079/1152/1230 1078/1146/1230
f 1077/1143/1231 1078/1146/1231 1081/1147/1231
f 1078/1146/1232 1079/1152/1232 1082/1149/1232
f 1025/1138/1233 1074/1144/1233 1056/1150/1233
f 1024/1137/1234 1075/1142/1234 1074/1144/1234
f 1023/1136/1235 1076/1145/1235 1075/1142/1235
f 968/1135/1236 996/1390/1236 1076/1145/1236
f 1076/1145/1237 996/1390/1237 997/1151/1237
f 997/1151/1238 998/1391/1238 1082/1149/1238
f 1082/1149/1239 998/1391/1239 972/1153/1239
f 1081/1147/1240 1082/1149/1240 1029/1154/1240
f 1080/1148/1241 1081/1147/1241 1030/1155/1241
f 1058/1157/1242 1080/1148/1242 1031/1156/1242
f 1057/1159/1243 1077/1143/1243 1080/1148/1243
f 1074/1144/1244 1077/1143/1244 1057/1159/1244
f 1084/1160/1245 1087/1164/1245 1086/1161/1245
f 1085/1163/1246 1088/1170/1246 1087/1164/1246
f 1086/1161/1247 1087/1164/1247 1090/1165/1247
f 1087/1164/1248 1088/1170/1248 1091/1167/1248
f 1031/1156/1249 1083/1162/1249 1050/1168/1249
f 1030/1155/1250 1084/1160/1250 1083/1162/1250
f 1029/1154/1251 1085/1163/1251 1084/1160/1251
f 972/1153/1252 1005/1228/1252 1085/1163/1252
f 1085/1163/1253 1005/1228/1253 1006/1169/1253
f 1006/1169/1254 1007/1240/1254 1091/1167/1254
f 1091/1167/1255 1007/1240/1255 970/1171/1255
f 1090/1165/1256 1091/1167/1256 1035/1172/1256
f 1089/1166/1257 1090/1165/1257 1036/1173/1257
f 1052/1175/1258 1089/1166/1258 1037/1174/1258
f 1051/1177/1259 1086/1161/1259 1089/1166/1259
f 1083/1162/1260 1086/1161/1260 1051/1177/1260
f 1093/1178/1261 1096/1182/1261 1095/1179/1261
f 1094/1181/1262 1097/1188/1262 1096/1182/1262
f 1095/1179/1263 1096/1182/1263 1099/1183/1263
f 1096/1182/1264 1097/1188/1264 1100/1185/1264
f 1037/1174/1265 1092/1180/1265 1044/1186/1265
f 1036/1173/1266 1093/1178/1266 1092/1180/1266
f 1035/1172/1267 1094/1181/1267 1093/1178/1267
f 970/1171/1268 1014/1392/1268 1094/1181/1268
f 1094/1181/1269 1014/1392/1269 1015/1187/1269
f 1015/1187/1270 1016/1393/1270 1100/1185/1270
f 1100/1185/1271 1016/1393/1271 966/1189/1271
f 1099/1183/1272 1100/1185/1272 986/1190/1272
f 1098/1184/1273 1099/1183/1273 985/1191/1273
f 1046/1193/1274 1098/1184/1274 984/1192/1274
f 1045/1195/1275 1095/1179/1275 1098/1184/1275
f 1092/1180/1276 1095/1179/1276 1045/1195/1276
f 1101/1196/1277 1102/1200/1277 1105/1197/1277
f 1103/1199/1278 1106/1211/1278 1105/1197/1278
f 1105/1197/1279 1108/1203/1279 1107/1201/1279
f 1105/1197/1280 1106/1211/1280 1109/1202/1280
f 967/1204/1281 995/1206/1281 1101/1196/1281
f 995/1206/1282 994/1208/1282 1102/1200/1282
f 993/1207/1283 1103/1199/1283 1102/1200/1283
f 971/1209/1284 1004/1210/1284 1103/1199/1284
f 1004/1210/1285 1003/1295/1285 1106/1211/1285
f 1106/1211/1286 1003/1295/1286 1002/1212/1286
f 1109/1202/1287 1002/1212/1287 969/1213/1287
f 1108/1203/1288 1109/1202/1288 1013/1214/1288
f 1108/1203/1289 1012/1215/1289 1011/1216/1289
f 1107/1201/1290 1011/1216/1290 965/1217/1290
f 1104/1198/1291 1107/1201/1291 983/1218/1291
f 981/1205/1292 1101/1196/1292 1104/1198/1292
f 1110/1220/1293 1111/1224/1293 1114/1221/1293
f 1112/1223/1294 1115/1234/1294 1114/1221/1294
f 1114/1221/1295 1117/1227/1295 1116/1225/1295
f 1114/1221/1296 1115/1234/1296 1118/1226/1296
f 972/1153/1297 998/1229/1297 1110/1220/1297
f 998/1229/1298 997/1231/1298 1111/1224/1298
f 996/1230/1299 1112/1223/1299 1111/1224/1299
f 968/1232/1300 989/1233/1300 1112/1223/1300
f 989/1233/1301 988/1394/1301 1115/1234/1301
f 1115/1234/1302 988/1394/1302 987/1235/1302
f 1118/1226/1303 987/1235/1303 966/1236/1303
f 1117/1227/1304 1118/1226/1304 1016/1237/1304
f 1117/1227/1305 1015/1238/1305 1014/1239/1305
f 1116/1225/1306 1014/1239/1306 970/1171/1306
f 1113/1222/1307 1116/1225/1307 1007/1240/1307
f 1005/1228/1308 1110/1220/1308 1113/1222/1308
f 1119/1241/1309 1120/1244/1309 1123/1242/1309
f 1120/1244/1310 1121/1251/1310 1124/1245/1310
f 1123/1242/1311 1126/1247/1311 1125/1246/1311
f 1124/1245/1312 1127/1255/1312 1126/1247/1312
f 969/1213/1313 1010/1249/1313 1119/1241/1313
f 1010/1249/1314 1009/1250/1314 1120/1244/1314
f 1009/1250/1315 1008/1252/1315 1121/1251/1315
f 1008/1252/1316 980/1272/1316 1041/1253/1316
f 1121/1251/1317 1041/1253/1317 1042/1254/1317
f 1042/1254/1318 1043/1256/1318 1127/1255/1318
f 1043/1256/1319 973/1279/1319 1019/1257/1319
f 1127/1255/1320 1019/1257/1320 1018/1258/1320
f 1126/1247/1321 1018/1258/1321 1017/1259/1321
f 1125/1246/1322 1017/1259/1322 965/1260/1322
f 1012/1262/1323 1122/1243/1323 1125/1246/1323
f 1119/1241/1324 1122/1243/1324 1012/1262/1324
f 1129/1263/1149 1132/1267/1149 1131/1264/1149
f 1130/1266/1149 1133/1269/1149 1132/1267/1149
f 1132/1267/1150 1135/1270/1150 1134/1268/1150
f 1133/1269/1150 1136/1275/1150 1135/1270/1150
f 1040/1271/1151 1128/1265/1151 1041/1253/1151
f 1039/1273/1151 1129/1263/1151 1128/1265/1151
f 1039/1273/1151 1038/1274/1151 1130/1266/1151
f 979/1176/1151 1044/1186/1151 1130/1266/1151
f 1044/1186/1149 1045/1195/1149 1133/1269/1149
f 1045/1195/1150 1046/1193/1150 1136/1275/1150
f 1046/1193/1152 974/1194/1152 1022/1276/1152
f 1136/1275/1152 1022/1276/1152 1021/1277/1152
f 1135/1270/1152 1021/1277/1152 1020/1278/1152
f 1134/1268/1152 1020/1278/1152 973/1279/1152
f 1131/1264/1150 1134/1268/1150 1043/1256/1150
f 1128/1265/1149 1131/1264/1149 1042/1254/1149
f 1137/1280/1325 1138/1283/1325 1141/1281/1325
f 1138/1283/1326 1139/1289/1326 1142/1284/1326
f 1141/1281/1327 1144/1286/1327 1143/1285/1327
f 1142/1284/1328 1145/1293/1328 1144/1286/1328
f 971/1209/1329 1001/1287/1329 1137/1280/1329
f 1001/1287/1330 1000/1288/1330 1138/1283/1330
f 1000/1288/1331 999/1290/1331 1139/1289/1331
f 999/1290/1332 978/1395/1332 1047/1291/1332
f 1139/1289/1333 1047/1291/1333 1048/1292/1333
f 1048/1292/1334 1049/1294/1334 1145/1293/1334
f 1049/1294/1335 980/1272/1335 1008/1252/1335
f 1145/1293/1336 1008/1252/1336 1009/1250/1336
f 1144/1286/1337 1009/1250/1337 1010/1249/1337
f 1143/1285/1338 1010/1249/1338 969/1213/1338
f 1003/1295/1339 1140/1282/1339 1143/1285/1339
f 1137/1280/1340 1140/1282/1340 1003/1295/1340
f 1147/1296/1169 1150/1300/1169 1149/1297/1169
f 1148/1299/1169 1151/1302/1169 1150/1300/1169
f 1150/1300/1170 1153/1303/1170 1152/1301/1170
f 1151/1302/1170 1154/1309/1170 1153/1303/1170
f 1034/1304/1171 1146/1298/1171 1047/1291/1171
f 1034/1304/1171 1033/1307/1171 1147/1296/1171
f 1032/1306/1171 1148/1299/1171 1147/1296/1171
f 1032/1306/1171 977/1396/1171 1050/1168/1171
f 1148/1299/1169 1050/1168/1169 1051/1308/1169
f 1051/1308/1170 1052/1175/1170 1154/1309/1170
f 1052/1175/1172 979/1397/1172 1038/1310/1172
f 1153/1303/1172 1154/1309/1172 1038/1310/1172
f 1153/1303/1172 1039/1311/1172 1040/1271/1172
f 1152/1301/1172 1040/1271/1172 980/1312/1172
f 1149/1297/1170 1152/1301/1170 1049/1294/1170
f 1047/1291/1169 1146/1298/1169 1149/1297/1169
f 1155/1314/1341 1156/1317/1341 1159/1315/1341
f 1156/1317/1342 1157/1325/1342 1160/1318/1342
f 1159/1315/1343 1162/1320/1343 1161/1319/1343
f 1160/1318/1344 1163/1329/1344 1162/1320/1344
f 967/1321/1345 992/1323/1345 1155/1314/1345
f 992/1323/1346 991/1324/1346 1156/1317/1346
f 991/1324/1347 990/1326/1347 1157/1325/1347
f 990/1326/1348 976/1387/1348 1053/1327/1348
f 1157/1325/1349 1053/1327/1349 1054/1328/1349
f 1054/1328/1350 1055/1330/1350 1163/1329/1350
f 1055/1330/1351 978/1395/1351 999/1290/1351
f 1163/1329/1352 999/1290/1352 1000/1288/1352
f 1162/1320/1353 1000/1288/1353 1001/1287/1353
f 1161/1319/1354 1001/1287/1354 971/1209/1354
f 994/1332/1355 1158/1316/1355 1161/1319/1355
f 1155/1314/1356 1158/1316/1356 994/1332/1356
f 1165/1333/1189 1168/1337/1189 1167/1334/1189
f 1166/1336/1189 1169/1339/1189 1168/1337/1189
f 1168/1337/1190 1171/1340/1190 1170/1338/1190
f 1169/1339/1190 1172/1348/1190 1171/1340/1190
f 1028/1341/1191 1164/1335/1191 1053/1342/1191
f 1028/1341/1191 1027/1345/1191 1165/1333/1191
f 1026/1344/1191 1166/1336/1191 1165/1333/1191
f 1026/1344/1191 975/1140/1191 1056/1346/1191
f 1166/1336/1189 1056/1346/1189 1057/1347/1189
f 1057/1347/1190 1058/1349/1190 1172/1348/1190
f 1058/1349/1192 977/1396/1192 1032/1306/1192
f 1172/1348/1192 1032/1306/1192 1033/1307/1192
f 1170/1338/1192 1171/1340/1192 1033/1307/1192
f 1170/1338/1192 1034/1304/1192 978/1305/1192
f 1167/1334/1190 1170/1338/1190 1055/1350/1190
f 1053/1342/1189 1164/1335/1189 1167/1334/1189
f 1173/1352/1357 1174/1355/1357 1177/1353/1357
f 1174/1355/1358 1175/1363/1358 1178/1356/1358
f 1177/1353/1359 1180/1358/1359 1179/1357/1359
f 1178/1356/1360 1181/1367/1360 1180/1358/1360
f 965/1359/1361 1017/1361/1361 1173/1352/1361
f 1017/1361/1362 1018/1362/1362 1174/1355/1362
f 1018/1362/1363 1019/1364/1363 1175/1363/1363
f 1019/1364/1364 973/1379/1364 1059/1365/1364
f 1175/1363/1365 1059/1365/1365 1060/1366/1365
f 1060/1366/1366 1061/1368/1366 1181/1367/1366
f 1061/1368/1367 976/1387/1367 990/1326/1367
f 1181/1367/1368 990/1326/1368 991/1324/1368
f 1180/1358/1369 991/1324/1369 992/1323/1369
f 1179/1357/1370 992/1323/1370 967/1321/1370
f 982/1370/1371 1176/1354/1371 1179/1357/1371
f 1173/1352/1372 1176/1354/1372 982/1370/1372
f 1183/1371/1209 1186/1375/1209 1185/1372/1209
f 1184/1374/1209 1187/1377/1209 1186/1375/1209
f 1186/1375/1210 1189/1378/1210 1188/1376/1210
f 1187/1377/1210 1190/1383/1210 1189/1378/1210
f 973/1379/1211 1020/1381/1211 1182/1373/1211
f 1021/1380/1211 1183/1371/1211 1182/1373/1211
f 1022/1382/1211 1184/1374/1211 1183/1371/1211
f 974/1129/1211 1062/1128/1211 1184/1374/1211
f 1062/1128/1209 1063/1141/1209 1187/1377/1209
f 1063/1141/1210 1064/1139/1210 1190/1383/1210
f 1064/1139/1212 975/1140/1212 1026/1384/1212
f 1190/1383/1212 1026/1384/1212 1027/1385/1212
f 1189/1378/1212 1027/1385/1212 1028/1386/1212
f 1061/1368/1212 1188/1376/1212 1028/1386/1212
f 1185/1372/1210 1188/1376/1210 1061/1368/1210
f 1059/1365/1209 1182/1373/1209 1185/1372/1209
o Cube.002
v 0.321424 -0.411795 -0.083474
v 0.004825 -0.403375 -0.083474
v 0.321424 -0.411795 -0.200379
v 0.004825 -0.403375 -0.200379
v 0.324012 -0.314486 -0.083474
v 0.007413 -0.306066 -0.083474
v 0.324012 -0.314486 -0.200379
v 0.007413 -0.306066 -0.200379
v 0.226013 -0.425487 -0.063989
v 0.099373 -0.422120 -0.063989
v 0.099373 -0.422120 -0.219864
v 0.226013 -0.425487 -0.219864
v 0.102824 -0.292374 -0.219864
v 0.229463 -0.295742 -0.219864
v 0.102824 -0.292374 -0.063989
v 0.229463 -0.295742 -0.063989
v 0.325853 -0.419015 -0.176607
v 0.328078 -0.422642 -0.141927
v 0.325853 -0.419015 -0.107247
v 0.068224 -0.420944 -0.064407
v 0.040134 -0.417764 -0.067327
v 0.018004 -0.411175 -0.074530
v 0.000018 -0.410350 -0.107247
v -0.002396 -0.413853 -0.141927
v 0.000018 -0.410350 -0.176607
v 0.257181 -0.425969 -0.219447
v 0.285400 -0.424287 -0.216526
v 0.307849 -0.418884 -0.209323
v 0.328104 -0.334396 -0.208906
v 0.329656 -0.363325 -0.213189
v 0.326568 -0.392130 -0.208906
v 0.000733 -0.383465 -0.208906
v -0.000819 -0.354536 -0.213189
v 0.002268 -0.325731 -0.208906
v 0.260613 -0.296918 -0.219447
v 0.288703 -0.300097 -0.216526
v 0.310833 -0.306686 -0.209323
v 0.328819 -0.307512 -0.107247
v 0.331233 -0.304008 -0.141927
v 0.328819 -0.307512 -0.176607
v 0.002983 -0.298846 -0.176607
v 0.000759 -0.295220 -0.141927
v 0.002983 -0.298846 -0.107247
v 0.260613 -0.296918 -0.064407
v 0.288703 -0.300097 -0.067327
v 0.310833 -0.306686 -0.074530
v 0.326568 -0.392130 -0.074947
v 0.329656 -0.363325 -0.070665
v 0.328104 -0.334396 -0.074947
v 0.002268 -0.325731 -0.074947
v -0.000819 -0.354536 -0.070665
v 0.000733 -0.383465 -0.074947
v 0.307849 -0.418884 -0.074530
v 0.285400 -0.424287 -0.067327
v 0.257181 -0.425969 -0.064407
v 0.194353 -0.424645 -0.063989
v 0.162693 -0.423803 -0.063989
v 0.131033 -0.422962 -0.063989
v 0.018004 -0.411175 -0.209323
v 0.040134 -0.417764 -0.216526
v 0.068224 -0.420944 -0.219447
v 0.131033 -0.422962 -0.219864
v 0.162693 -0.423803 -0.219864
v 0.194353 -0.424645 -0.219864
v 0.020988 -0.298978 -0.209323
v 0.043436 -0.293575 -0.216526
v 0.071655 -0.291893 -0.219447
v 0.134484 -0.293216 -0.219864
v 0.166144 -0.294058 -0.219864
v 0.197804 -0.294900 -0.219864
v 0.020988 -0.298978 -0.074530
v 0.043436 -0.293575 -0.067327
v 0.071655 -0.291893 -0.064407
v 0.134484 -0.293216 -0.063989
v 0.166144 -0.294058 -0.063989
v 0.197804 -0.294900 -0.063989
v 0.228682 -0.325137 -0.042070
v 0.227738 -0.360615 -0.034763
v 0.226795 -0.396092 -0.042070
v 0.102042 -0.321769 -0.042070
v 0.101099 -0.357247 -0.034763
v 0.100155 -0.392724 -0.042070
v 0.229949 -0.277496 -0.184548
v 0.230110 -0.271414 -0.141927
v 0.229949 -0.277496 -0.099305
v 0.103309 -0.274128 -0.184548
v 0.103471 -0.268047 -0.141927
v 0.103309 -0.274128 -0.099305
v 0.226795 -0.396092 -0.241784
v 0.227738 -0.360615 -0.249090
v 0.228682 -0.325137 -0.241784
v 0.100155 -0.392724 -0.241784
v 0.101099 -0.357247 -0.249090
v 0.102042 -0.321769 -0.241784
v 0.225528 -0.443733 -0.099305
v 0.225366 -0.449815 -0.141927
v 0.225528 -0.443733 -0.184548
v 0.098888 -0.440365 -0.099305
v 0.098726 -0.446447 -0.141927
v 0.098888 -0.440365 -0.184548
v 0.067698 -0.439095 -0.099534
v 0.039326 -0.435259 -0.101143
v 0.016320 -0.426420 -0.104736
v 0.067494 -0.445147 -0.141927
v 0.038829 -0.441097 -0.141927
v 0.015064 -0.431561 -0.141927
v 0.067698 -0.439095 -0.184319
v 0.039326 -0.435259 -0.182711
v 0.016320 -0.426420 -0.179117
v 0.068958 -0.391703 -0.241255
v 0.040540 -0.389608 -0.237555
v 0.017395 -0.385999 -0.227678
v 0.069854 -0.356416 -0.248526
v 0.041101 -0.355651 -0.244580
v 0.017100 -0.355013 -0.233891
v 0.070835 -0.321131 -0.241255
v 0.042346 -0.321713 -0.237555
v 0.019041 -0.324086 -0.227678
v 0.072095 -0.273739 -0.184319
v 0.043560 -0.276062 -0.182711
v 0.020116 -0.283665 -0.179117
v 0.072214 -0.267685 -0.141927
v 0.043373 -0.270205 -0.141927
v 0.019135 -0.278464 -0.141927
v 0.072095 -0.273739 -0.099534
v 0.043560 -0.276062 -0.101143
v 0.020116 -0.283665 -0.104736
v 0.070835 -0.321131 -0.042598
v 0.042346 -0.321713 -0.046298
v 0.019041 -0.324086 -0.056175
v 0.069854 -0.356416 -0.035327
v 0.041101 -0.355651 -0.039273
v 0.017100 -0.355013 -0.049963
v 0.068958 -0.391703 -0.042598
v 0.040540 -0.389608 -0.046298
v 0.017395 -0.385999 -0.056175
v 0.336399 -0.394291 -0.178887
v 0.340563 -0.363615 -0.180873
v 0.338036 -0.332761 -0.178887
v 0.339701 -0.396033 -0.141927
v 0.344215 -0.363712 -0.141927
v 0.341425 -0.331197 -0.141927
v 0.336399 -0.394291 -0.104966
v 0.340563 -0.363615 -0.102981
v 0.338036 -0.332761 -0.104966
v -0.007563 -0.323570 -0.178887
v -0.011726 -0.354246 -0.180873
v -0.009199 -0.385100 -0.178887
v -0.010864 -0.321829 -0.141927
v -0.015379 -0.354149 -0.141927
v -0.012589 -0.386664 -0.141927
v -0.007563 -0.323570 -0.104966
v -0.011726 -0.354246 -0.102981
v -0.009199 -0.385100 -0.104966
v 0.311442 -0.331862 -0.056175
v 0.288297 -0.328254 -0.046298
v 0.259878 -0.326158 -0.042598
v 0.311737 -0.362848 -0.049963
v 0.287736 -0.362210 -0.039273
v 0.258983 -0.361445 -0.035327
v 0.309796 -0.393775 -0.056175
v 0.286491 -0.396149 -0.046298
v 0.258002 -0.396730 -0.042598
v 0.197022 -0.324295 -0.042070
v 0.165362 -0.323453 -0.042070
v 0.133702 -0.322611 -0.042070
v 0.196078 -0.359773 -0.034763
v 0.164418 -0.358931 -0.034763
v 0.132759 -0.358089 -0.034763
v 0.195135 -0.395250 -0.042070
v 0.163475 -0.394408 -0.042070
v 0.131815 -0.393566 -0.042070
v 0.312517 -0.291441 -0.179117
v 0.289511 -0.282602 -0.182711
v 0.261139 -0.278766 -0.184319
v 0.313773 -0.286300 -0.141927
v 0.290008 -0.276764 -0.141927
v 0.261343 -0.272715 -0.141927
v 0.312517 -0.291441 -0.104736
v 0.289511 -0.282602 -0.101143
v 0.261139 -0.278766 -0.099534
v 0.198289 -0.276654 -0.184548
v 0.166629 -0.275812 -0.184548
v 0.134969 -0.274970 -0.184548
v 0.198451 -0.270572 -0.141927
v 0.166791 -0.269731 -0.141927
v 0.135131 -0.268889 -0.141927
v 0.198289 -0.276654 -0.099305
v 0.166629 -0.275812 -0.099305
v 0.134969 -0.274970 -0.099305
v 0.309796 -0.393775 -0.227678
v 0.286491 -0.396149 -0.237555
v 0.258002 -0.396730 -0.241255
v 0.311737 -0.362848 -0.233891
v 0.287736 -0.362210 -0.244580
v 0.258983 -0.361445 -0.248526
v 0.311442 -0.331862 -0.227678
v 0.288297 -0.328254 -0.237555
v 0.259878 -0.326158 -0.241255
v 0.195135 -0.395250 -0.241784
v 0.163475 -0.394408 -0.241784
v 0.131815 -0.393566 -0.241784
v 0.196078 -0.359773 -0.249090
v 0.164418 -0.358931 -0.249090
v 0.132759 -0.358089 -0.249090
v 0.197022 -0.324295 -0.241784
v 0.165362 -0.323453 -0.241784
v 0.133702 -0.322611 -0.241784
v 0.308721 -0.434196 -0.104736
v 0.285277 -0.441800 -0.101143
v 0.256741 -0.444123 -0.099534
v 0.309701 -0.439397 -0.141927
v 0.285463 -0.447656 -0.141927
v 0.256623 -0.450176 -0.141927
v 0.308721 -0.434196 -0.179117
v 0.285277 -0.441800 -0.182711
v 0.256741 -0.444123 -0.184319
v 0.193868 -0.442891 -0.099305
v 0.162208 -0.442049 -0.099305
v 0.130548 -0.441207 -0.099305
v 0.193706 -0.448973 -0.141927
v 0.162046 -0.448131 -0.141927
v 0.130386 -0.447289 -0.141927
v 0.193868 -0.442891 -0.184548
v 0.162208 -0.442049 -0.184548
v 0.130548 -0.441207 -0.184548
vn -0.1405 -0.9801 0.1406
vn -0.3737 -0.9175 0.1359
vn -0.1405 -0.9801 -0.1406
vn -0.3737 -0.9175 -0.1359
vn -0.0397 -0.8875 0.4591
vn -0.1468 -0.8779 0.4558
vn -0.3836 -0.8165 0.4316
vn -0.6466 -0.6659 0.3721
vn -0.7069 -0.6970 0.1202
vn -0.7069 -0.6970 -0.1202
vn -0.6466 -0.6659 -0.3721
vn -0.3836 -0.8165 -0.4316
vn -0.1468 -0.8779 -0.4558
vn -0.0397 -0.8875 -0.4591
vn -0.0413 -0.9891 -0.1413
vn -0.0413 -0.9891 0.1413
vn -0.1408 -0.1964 -0.9704
vn -0.4096 -0.1783 -0.8947
vn -0.1302 0.2036 -0.9704
vn -0.3995 0.1998 -0.8947
vn -0.0333 -0.5969 -0.8016
vn -0.1492 -0.5888 -0.7944
vn -0.4018 -0.5442 -0.7365
vn -0.6683 -0.4488 -0.5933
vn -0.7506 -0.1368 -0.6465
vn -0.7422 0.1765 -0.6465
vn -0.6435 0.4836 -0.5933
vn -0.3723 0.5648 -0.7365
vn -0.1177 0.5958 -0.7944
vn -0.0015 0.5978 -0.8016
vn -0.0125 0.2020 -0.9793
vn -0.0232 -0.2011 -0.9793
vn -0.0882 0.9861 -0.1406
vn -0.3244 0.9361 -0.1359
vn -0.0882 0.9861 0.1406
vn -0.3244 0.9361 0.1359
vn 0.0076 0.8884 -0.4591
vn -0.0999 0.8844 -0.4558
vn -0.3396 0.8357 -0.4316
vn -0.6103 0.6993 -0.3721
vn -0.6689 0.7336 -0.1202
vn -0.6689 0.7336 0.1202
vn -0.6103 0.6993 0.3721
vn -0.3396 0.8357 0.4316
vn -0.0999 0.8844 0.4558
vn 0.0076 0.8884 0.4591
vn 0.0113 0.9899 0.1413
vn 0.0113 0.9899 -0.1413
vn -0.1302 0.2036 0.9704
vn -0.3995 0.1998 0.8947
vn -0.1408 -0.1964 0.9704
vn -0.4096 -0.1783 0.8947
vn -0.0015 0.5978 0.8016
vn -0.1177 0.5958 0.7944
vn -0.3723 0.5648 0.7365
vn -0.6435 0.4836 0.5933
vn -0.7422 0.1765 0.6465
vn -0.7506 -0.1368 0.6465
vn -0.6683 -0.4488 0.5933
vn -0.4018 -0.5442 0.7365
vn -0.1492 -0.5888 0.7944
vn -0.0333 -0.5969 0.8016
vn -0.0232 -0.2011 0.9793
vn -0.0125 0.2020 0.9793
vn 0.9859 -0.1377 -0.0946
vn 0.9919 0.0872 -0.0928
vn 0.9859 -0.1377 0.0946
vn 0.9919 0.0872 0.0928
vn 0.8724 -0.3983 -0.2835
vn 0.9367 -0.1477 -0.3174
vn 0.9432 0.0976 -0.3175
vn 0.8948 0.3181 -0.3134
vn 0.9317 0.3492 -0.1002
vn 0.9317 0.3492 0.1002
vn 0.8948 0.3181 0.3134
vn 0.9432 0.0976 0.3175
vn 0.9367 -0.1477 0.3174
vn 0.8724 -0.3983 0.2835
vn 0.9118 -0.3983 0.1001
vn 0.9118 -0.3983 -0.1001
vn -0.9859 0.1377 -0.0946
vn -0.9919 -0.0872 -0.0928
vn -0.9859 0.1377 0.0946
vn -0.9919 -0.0872 0.0928
vn -0.8724 0.3983 -0.2835
vn -0.9367 0.1477 -0.3174
vn -0.9432 -0.0976 -0.3175
vn -0.8948 -0.3181 -0.3134
vn -0.9317 -0.3492 -0.1002
vn -0.9317 -0.3492 0.1002
vn -0.8948 -0.3181 0.3134
vn -0.9432 -0.0976 0.3175
vn -0.9367 0.1477 0.3174
vn -0.8724 0.3983 0.2835
vn -0.9118 0.3983 0.1001
vn -0.9118 0.3983 -0.1001
vn 0.4040 0.1835 0.8962
vn 0.1385 0.1984 0.9703
vn 0.3937 -0.2047 0.8962
vn 0.1277 -0.2055 0.9703
vn 0.7130 0.3947 0.5795
vn 0.3993 0.5464 0.7362
vn 0.1469 0.5905 0.7935
vn 0.0331 0.5970 0.8016
vn 0.0230 0.2012 0.9793
vn 0.0123 -0.2021 0.9793
vn 0.0013 -0.5979 0.8016
vn 0.1153 -0.5975 0.7935
vn 0.3697 -0.5668 0.7362
vn 0.6910 -0.4320 0.5795
vn 0.7417 -0.1757 0.6473
vn 0.7500 0.1361 0.6473
vn 0.0054 0.2016 0.9795
vn -0.0054 -0.2016 0.9795
vn 0.0159 0.5974 0.8018
vn -0.0159 -0.5974 0.8018
vn 0.3688 0.9190 -0.1395
vn 0.1385 0.9801 -0.1420
vn 0.3688 0.9190 0.1395
vn 0.1385 0.9801 0.1420
vn 0.6882 0.6467 -0.3289
vn 0.3814 0.8165 -0.4334
vn 0.1446 0.8774 -0.4574
vn 0.0395 0.8875 -0.4592
vn 0.0412 0.9891 -0.1414
vn 0.0412 0.9891 0.1414
vn 0.0395 0.8875 0.4592
vn 0.1446 0.8774 0.4574
vn 0.3814 0.8165 0.4334
vn 0.6882 0.6467 0.3289
vn 0.7063 0.6977 0.1197
vn 0.7063 0.6977 -0.1197
vn 0.0263 0.9896 -0.1413
vn 0.0263 0.9896 0.1413
vn 0.0236 0.8881 -0.4591
vn 0.0236 0.8881 0.4591
vn 0.3937 -0.2047 -0.8962
vn 0.1277 -0.2055 -0.9703
vn 0.4040 0.1835 -0.8962
vn 0.1385 0.1984 -0.9703
vn 0.6910 -0.4320 -0.5795
vn 0.3697 -0.5668 -0.7362
vn 0.1153 -0.5975 -0.7935
vn 0.0013 -0.5979 -0.8016
vn 0.0123 -0.2021 -0.9793
vn 0.0230 0.2012 -0.9793
vn 0.0331 0.5970 -0.8016
vn 0.1469 0.5905 -0.7935
vn 0.3993 0.5464 -0.7362
vn 0.7130 0.3947 -0.5795
vn 0.7500 0.1361 -0.6473
vn 0.7417 -0.1757 -0.6473
vn -0.0054 -0.2016 -0.9795
vn 0.0054 0.2016 -0.9795
vn -0.0159 -0.5974 -0.8018
vn 0.0159 0.5974 -0.8018
vn 0.3194 -0.9373 0.1395
vn 0.0862 -0.9861 0.1420
vn 0.3194 -0.9373 -0.1395
vn 0.0862 -0.9861 -0.1420
vn 0.6528 -0.6824 0.3289
vn 0.3375 -0.8357 0.4334
vn 0.0977 -0.8839 0.4574
vn -0.0077 -0.8883 0.4592
vn -0.0115 -0.9899 0.1414
vn -0.0115 -0.9899 -0.1414
vn -0.0077 -0.8883 -0.4592
vn 0.0977 -0.8839 -0.4574
vn 0.3375 -0.8357 -0.4334
vn 0.6528 -0.6824 -0.3289
vn 0.6682 -0.7343 -0.1197
vn 0.6682 -0.7343 0.1197
vn -0.0263 -0.9896 0.1413
vn -0.0263 -0.9896 -0.1413
vn -0.0236 -0.8881 0.4591
vn -0.0236 -0.8881 -0.4591
vn -0.1385 -0.9801 0.1420
vn -0.3688 -0.9190 0.1395
vn -0.1385 -0.9801 -0.1420
vn -0.3688 -0.9190 -0.1395
vn -0.0395 -0.8875 0.4592
vn -0.1446 -0.8774 0.4574
vn -0.3814 -0.8165 0.4334
vn -0.6882 -0.6467 0.3289
vn -0.7063 -0.6977 0.1197
vn -0.7063 -0.6977 -0.1197
vn -0.6882 -0.6467 -0.3289
vn -0.3814 -0.8165 -0.4334
vn -0.1446 -0.8774 -0.4574
vn -0.0395 -0.8875 -0.4592
vn -0.0412 -0.9891 -0.1414
vn -0.0412 -0.9891 0.1414
vn -0.1385 -0.1984 -0.9703
vn -0.4040 -0.1835 -0.8962
vn -0.1277 0.2055 -0.9703
vn -0.3937 0.2047 -0.8962
vn -0.0331 -0.5970 -0.8016
vn -0.1469 -0.5905 -0.7935
vn -0.3993 -0.5464 -0.7362
vn -0.7130 -0.3947 -0.5795
vn -0.7500 -0.1361 -0.6473
vn -0.7417 0.1757 -0.6473
vn -0.6910 0.4320 -0.5795
vn -0.3697 0.5668 -0.7362
vn -0.1153 0.5975 -0.7935
vn -0.0013 0.5979 -0.8016
vn -0.0123 0.2021 -0.9793
vn -0.0230 -0.2012 -0.9793
vn -0.0862 0.9861 -0.1420
vn -0.3194 0.9373 -0.1395
vn -0.0862 0.9861 0.1420
vn -0.3194 0.9373 0.1395
vn 0.0077 0.8883 -0.4592
vn -0.0977 0.8839 -0.4574
vn -0.3375 0.8357 -0.4334
vn -0.6528 0.6824 -0.3289
vn -0.6682 0.7343 -0.1197
vn -0.6682 0.7343 0.1197
vn -0.6528 0.6824 0.3289
vn -0.3375 0.8357 0.4334
vn -0.0977 0.8839 0.4574
vn 0.0077 0.8883 0.4592
vn 0.0115 0.9899 0.1414
vn 0.0115 0.9899 -0.1414
vn -0.1277 0.2055 0.9703
vn -0.3937 0.2047 0.8962
vn -0.1385 -0.1984 0.9703
vn -0.4040 -0.1835 0.8962
vn -0.0013 0.5979 0.8016
vn -0.1153 0.5975 0.7935
vn -0.3697 0.5668 0.7362
vn -0.6910 0.4320 0.5795
vn -0.7417 0.1757 0.6473
vn -0.7500 -0.1361 0.6473
vn -0.7130 -0.3947 0.5795
vn -0.3993 -0.5464 0.7362
vn -0.1469 -0.5905 0.7935
vn -0.0331 -0.5970 0.8016
vn -0.0230 -0.2012 0.9793
vn -0.0123 0.2021 0.9793
vn 0.9858 -0.1398 -0.0928
vn 0.9919 0.0851 -0.0946
vn 0.9858 -0.1398 0.0928
vn 0.9919 0.0851 0.0946
vn 0.8766 -0.3652 -0.3134
vn 0.9367 -0.1476 -0.3175
vn 0.9432 0.0977 -0.3174
vn 0.8923 0.3513 -0.2835
vn 0.9317 0.3492 -0.1001
vn 0.9317 0.3492 0.1001
vn 0.8923 0.3513 0.2835
vn 0.9432 0.0977 0.3174
vn 0.9367 -0.1476 0.3175
vn 0.8766 -0.3652 0.3134
vn 0.9118 -0.3982 0.1002
vn 0.9118 -0.3982 -0.1002
vn -0.9858 0.1398 -0.0928
vn -0.9919 -0.0851 -0.0946
vn -0.9858 0.1398 0.0928
vn -0.9919 -0.0851 0.0946
vn -0.8766 0.3652 -0.3134
vn -0.9367 0.1476 -0.3175
vn -0.9432 -0.0977 -0.3174
vn -0.8923 -0.3513 -0.2835
vn -0.9317 -0.3492 -0.1001
vn -0.9317 -0.3492 0.1001
vn -0.8923 -0.3513 0.2835
vn -0.9432 -0.0977 0.3174
vn -0.9367 0.1476 0.3175
vn -0.8766 0.3652 0.3134
vn -0.9118 0.3982 0.1002
vn -0.9118 0.3982 -0.1002
vn 0.4096 0.1783 0.8947
vn 0.1408 0.1964 0.9704
vn 0.3995 -0.1998 0.8947
vn 0.1302 -0.2036 0.9704
vn 0.6683 0.4488 0.5933
vn 0.4018 0.5442 0.7365
vn 0.1492 0.5888 0.7944
vn 0.0333 0.5969 0.8016
vn 0.0232 0.2011 0.9793
vn 0.0125 -0.2020 0.9793
vn 0.0015 -0.5978 0.8016
vn 0.1177 -0.5958 0.7944
vn 0.3723 -0.5648 0.7365
vn 0.6435 -0.4836 0.5933
vn 0.7422 -0.1765 0.6465
vn 0.7506 0.1368 0.6465
vn 0.3737 0.9175 -0.1359
vn 0.1405 0.9801 -0.1406
vn 0.3737 0.9175 0.1359
vn 0.1405 0.9801 0.1406
vn 0.6466 0.6659 -0.3721
vn 0.3836 0.8165 -0.4316
vn 0.1468 0.8779 -0.4558
vn 0.0397 0.8875 -0.4591
vn 0.0413 0.9891 -0.1413
vn 0.0413 0.9891 0.1413
vn 0.0397 0.8875 0.4591
vn 0.1468 0.8779 0.4558
vn 0.3836 0.8165 0.4316
vn 0.6466 0.6659 0.3721
vn 0.7069 0.6970 0.1202
vn 0.7069 0.6970 -0.1202
vn 0.3995 -0.1998 -0.8947
vn 0.1302 -0.2036 -0.9704
vn 0.4096 0.1783 -0.8947
vn 0.1408 0.1964 -0.9704
vn 0.6435 -0.4836 -0.5933
vn 0.3723 -0.5648 -0.7365
vn 0.1177 -0.5959 -0.7944
vn 0.0015 -0.5978 -0.8016
vn 0.0125 -0.2020 -0.9793
vn 0.0232 0.2011 -0.9793
vn 0.0333 0.5969 -0.8016
vn 0.1492 0.5887 -0.7944
vn 0.4018 0.5442 -0.7365
vn 0.6683 0.4488 -0.5933
vn 0.7506 0.1368 -0.6465
vn 0.7422 -0.1765 -0.6465
vn 0.3244 -0.9361 0.1359
vn 0.0882 -0.9861 0.1406
vn 0.3244 -0.9361 -0.1359
vn 0.0882 -0.9861 -0.1406
vn 0.6103 -0.6993 0.3721
vn 0.3396 -0.8357 0.4316
vn 0.0999 -0.8844 0.4558
vn -0.0076 -0.8884 0.4591
vn -0.0113 -0.9899 0.1413
vn -0.0113 -0.9899 -0.1413
vn -0.0076 -0.8884 -0.4591
vn 0.0999 -0.8844 -0.4558
vn 0.3396 -0.8357 -0.4316
vn 0.6103 -0.6993 -0.3721
vn 0.6689 -0.7336 -0.1202
vn 0.6689 -0.7336 0.1202
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1292/1398/1373 1294/1399/1373 1291/1400/1373
f 1293/1401/1374 1295/1402/1374 1292/1398/1374
f 1294/1399/1375 1298/1403/1375 1297/1404/1375
f 1295/1402/1376 1299/1405/1376 1298/1403/1376
f 1210/1406/1377 1288/1407/1377 1200/1408/1377
f 1211/1409/1378 1291/1400/1378 1210/1406/1378
f 1212/1410/1379 1292/1398/1379 1211/1409/1379
f 1192/1411/1380 1293/1401/1380 1212/1410/1380
f 1293/1401/1381 1214/1412/1381 1296/1413/1381
f 1214/1412/1382 1299/1405/1382 1296/1413/1382
f 1299/1405/1383 1194/1414/1383 1249/1415/1383
f 1298/1403/1384 1249/1415/1384 1250/1416/1384
f 1297/1404/1385 1250/1416/1385 1251/1417/1385
f 1290/1418/1386 1251/1417/1386 1201/1419/1386
f 1289/1420/1387 1297/1404/1387 1290/1418/1387
f 1291/1400/1388 1289/1420/1388 1288/1407/1388
f 1301/1421/1389 1303/1422/1389 1300/1423/1389
f 1302/1424/1390 1304/1425/1390 1301/1421/1390
f 1303/1422/1391 1307/1426/1391 1306/1427/1391
f 1304/1425/1392 1308/1428/1392 1307/1426/1392
f 1251/1417/1393 1282/1429/1393 1201/1419/1393
f 1250/1416/1394 1300/1423/1394 1251/1417/1394
f 1249/1415/1395 1301/1421/1395 1250/1416/1395
f 1194/1414/1396 1302/1424/1396 1249/1415/1396
f 1302/1424/1397 1223/1430/1397 1305/1431/1397
f 1223/1430/1398 1308/1428/1398 1305/1431/1398
f 1308/1428/1399 1198/1432/1399 1255/1433/1399
f 1307/1426/1400 1255/1433/1400 1256/1434/1400
f 1306/1427/1401 1256/1434/1401 1257/1435/1401
f 1284/1436/1402 1257/1435/1402 1203/1437/1402
f 1283/1438/1403 1306/1427/1403 1284/1436/1403
f 1300/1423/1404 1283/1438/1404 1282/1429/1404
f 1310/1439/1405 1312/1440/1405 1309/1441/1405
f 1311/1442/1406 1313/1443/1406 1310/1439/1406
f 1312/1440/1407 1316/1444/1407 1315/1445/1407
f 1313/1443/1408 1317/1446/1408 1316/1444/1408
f 1257/1435/1409 1276/1447/1409 1203/1437/1409
f 1256/1434/1410 1309/1441/1410 1257/1435/1410
f 1255/1433/1411 1310/1439/1411 1256/1434/1411
f 1198/1432/1412 1311/1442/1412 1255/1433/1412
f 1311/1442/1413 1232/1448/1413 1314/1449/1413
f 1232/1448/1414 1317/1446/1414 1314/1449/1414
f 1317/1446/1415 1196/1450/1415 1261/1451/1415
f 1316/1444/1416 1261/1451/1416 1262/1452/1416
f 1315/1445/1417 1262/1452/1417 1263/1453/1417
f 1278/1454/1418 1263/1453/1418 1205/1455/1418
f 1277/1456/1419 1315/1445/1419 1278/1454/1419
f 1309/1441/1420 1277/1456/1420 1276/1447/1420
f 1319/1457/1421 1321/1458/1421 1318/1459/1421
f 1320/1460/1422 1322/1461/1422 1319/1457/1422
f 1321/1458/1423 1325/1462/1423 1324/1463/1423
f 1322/1461/1424 1326/1464/1424 1325/1462/1424
f 1263/1453/1425 1270/1465/1425 1205/1455/1425
f 1262/1452/1426 1318/1459/1426 1263/1453/1426
f 1261/1451/1427 1319/1457/1427 1262/1452/1427
f 1196/1450/1428 1320/1460/1428 1261/1451/1428
f 1320/1460/1429 1241/1466/1429 1323/1467/1429
f 1241/1466/1430 1326/1464/1430 1323/1467/1430
f 1326/1464/1431 1192/1468/1431 1212/1469/1431
f 1325/1462/1432 1212/1469/1432 1211/1470/1432
f 1324/1463/1433 1211/1470/1433 1210/1471/1433
f 1272/1472/1434 1210/1471/1434 1200/1473/1434
f 1271/1474/1435 1324/1463/1435 1272/1472/1435
f 1318/1459/1436 1271/1474/1436 1270/1465/1436
f 1327/1475/1437 1331/1476/1437 1330/1477/1437
f 1329/1478/1438 1331/1476/1438 1328/1479/1438
f 1331/1476/1439 1333/1480/1439 1330/1477/1439
f 1331/1476/1440 1335/1481/1440 1334/1482/1440
f 1193/1483/1441 1327/1475/1441 1207/1484/1441
f 1221/1485/1442 1328/1479/1442 1327/1475/1442
f 1219/1486/1443 1328/1479/1443 1220/1487/1443
f 1197/1488/1444 1329/1478/1444 1219/1486/1444
f 1230/1489/1445 1332/1490/1445 1329/1478/1445
f 1332/1490/1446 1228/1491/1446 1335/1481/1446
f 1335/1481/1447 1195/1492/1447 1239/1493/1447
f 1334/1482/1448 1239/1493/1448 1238/1494/1448
f 1334/1482/1449 1237/1495/1449 1333/1480/1449
f 1333/1480/1450 1191/1496/1450 1209/1497/1450
f 1330/1477/1451 1209/1497/1451 1208/1498/1451
f 1207/1484/1452 1330/1477/1452 1208/1498/1452
f 1336/1499/1453 1340/1500/1453 1339/1501/1453
f 1338/1502/1454 1340/1500/1454 1337/1503/1454
f 1340/1500/1455 1342/1504/1455 1339/1501/1455
f 1340/1500/1456 1344/1505/1456 1343/1506/1456
f 1198/1432/1457 1336/1499/1457 1231/1507/1457
f 1224/1508/1458 1337/1503/1458 1336/1499/1458
f 1222/1509/1459 1337/1503/1459 1223/1510/1459
f 1194/1511/1460 1338/1502/1460 1222/1509/1460
f 1215/1512/1461 1341/1513/1461 1338/1502/1461
f 1341/1513/1462 1213/1514/1462 1344/1505/1462
f 1344/1505/1463 1192/1515/1463 1242/1516/1463
f 1343/1506/1464 1242/1516/1464 1241/1517/1464
f 1343/1506/1465 1240/1518/1465 1342/1504/1465
f 1342/1504/1466 1196/1450/1466 1233/1519/1466
f 1339/1501/1467 1233/1519/1467 1232/1448/1467
f 1231/1507/1468 1339/1501/1468 1232/1448/1468
f 1345/1520/1469 1349/1521/1469 1348/1522/1469
f 1346/1523/1470 1350/1524/1470 1349/1521/1470
f 1349/1521/1471 1351/1525/1471 1348/1522/1471
f 1350/1524/1472 1352/1526/1472 1349/1521/1472
f 1195/1492/1473 1345/1520/1473 1239/1527/1473
f 1236/1528/1474 1346/1523/1474 1345/1520/1474
f 1235/1529/1475 1347/1530/1475 1346/1523/1475
f 1234/1531/1476 1267/1532/1476 1347/1530/1476
f 1347/1530/1477 1268/1533/1477 1350/1524/1477
f 1268/1533/1478 1353/1534/1478 1350/1524/1478
f 1269/1535/1479 1245/1536/1479 1353/1534/1479
f 1353/1534/1480 1244/1537/1480 1352/1526/1480
f 1352/1526/1481 1243/1538/1481 1351/1525/1481
f 1351/1525/1482 1191/1539/1482 1237/1540/1482
f 1238/1541/1483 1351/1525/1483 1237/1540/1483
f 1345/1520/1484 1238/1541/1484 1239/1527/1484
f 1355/1542/1485 1357/1543/1485 1354/1544/1485
f 1356/1545/1485 1358/1546/1485 1355/1542/1485
f 1358/1546/1486 1360/1547/1486 1357/1543/1486
f 1359/1548/1486 1361/1549/1486 1358/1546/1486
f 1266/1550/1487 1267/1532/1487 1206/1551/1487
f 1265/1552/1487 1354/1544/1487 1266/1550/1487
f 1265/1552/1487 1356/1545/1487 1355/1542/1487
f 1205/1455/1487 1356/1545/1487 1264/1553/1487
f 1270/1465/1485 1359/1548/1485 1356/1545/1485
f 1271/1474/1486 1362/1554/1486 1359/1548/1486
f 1272/1472/1488 1248/1555/1488 1362/1554/1488
f 1362/1554/1488 1247/1556/1488 1361/1549/1488
f 1361/1549/1488 1246/1557/1488 1360/1547/1488
f 1360/1547/1488 1199/1558/1488 1269/1535/1488
f 1357/1543/1486 1269/1535/1486 1268/1533/1486
f 1354/1544/1485 1268/1533/1485 1267/1532/1485
f 1363/1559/1489 1367/1560/1489 1366/1561/1489
f 1364/1562/1490 1368/1563/1490 1367/1560/1490
f 1367/1560/1491 1369/1564/1491 1366/1561/1491
f 1368/1563/1492 1370/1565/1492 1367/1560/1492
f 1197/1488/1493 1363/1559/1493 1230/1489/1493
f 1227/1566/1494 1364/1562/1494 1363/1559/1494
f 1226/1567/1495 1365/1568/1495 1364/1562/1495
f 1225/1569/1496 1273/1570/1496 1365/1568/1496
f 1365/1568/1497 1274/1571/1497 1368/1563/1497
f 1274/1571/1498 1371/1572/1498 1368/1563/1498
f 1275/1573/1499 1234/1531/1499 1371/1572/1499
f 1371/1572/1500 1235/1529/1500 1370/1565/1500
f 1370/1565/1501 1236/1528/1501 1369/1564/1501
f 1369/1564/1502 1195/1492/1502 1228/1491/1502
f 1229/1574/1503 1369/1564/1503 1228/1491/1503
f 1363/1559/1504 1229/1574/1504 1230/1489/1504
f 1373/1575/1505 1375/1576/1505 1372/1577/1505
f 1374/1578/1505 1376/1579/1505 1373/1575/1505
f 1376/1579/1506 1378/1580/1506 1375/1576/1506
f 1377/1581/1506 1379/1582/1506 1376/1579/1506
f 1260/1583/1507 1273/1570/1507 1204/1584/1507
f 1260/1583/1507 1373/1575/1507 1372/1577/1507
f 1258/1585/1507 1373/1575/1507 1259/1586/1507
f 1258/1585/1507 1276/1447/1507 1374/1578/1507
f 1374/1578/1505 1277/1587/1505 1377/1581/1505
f 1277/1587/1506 1380/1588/1506 1377/1581/1506
f 1278/1454/1508 1264/1589/1508 1380/1588/1508
f 1379/1582/1508 1264/1589/1508 1265/1590/1508
f 1379/1582/1508 1266/1550/1508 1378/1580/1508
f 1378/1580/1508 1206/1591/1508 1275/1573/1508
f 1375/1576/1506 1275/1573/1506 1274/1592/1506
f 1273/1570/1505 1375/1576/1505 1274/1592/1505
f 1381/1593/1509 1385/1594/1509 1384/1595/1509
f 1382/1596/1510 1386/1597/1510 1385/1594/1510
f 1385/1594/1511 1387/1598/1511 1384/1595/1511
f 1386/1597/1512 1388/1599/1512 1385/1594/1512
f 1193/1600/1513 1381/1593/1513 1221/1601/1513
f 1218/1602/1514 1382/1596/1514 1381/1593/1514
f 1217/1603/1515 1383/1604/1515 1382/1596/1515
f 1216/1605/1516 1279/1606/1516 1383/1604/1516
f 1383/1604/1517 1280/1607/1517 1386/1597/1517
f 1280/1607/1518 1389/1608/1518 1386/1597/1518
f 1281/1609/1519 1225/1569/1519 1389/1608/1519
f 1389/1608/1520 1226/1567/1520 1388/1599/1520
f 1388/1599/1521 1227/1566/1521 1387/1598/1521
f 1387/1598/1522 1197/1488/1522 1219/1610/1522
f 1220/1611/1523 1387/1598/1523 1219/1610/1523
f 1381/1593/1524 1220/1611/1524 1221/1601/1524
f 1391/1612/1525 1393/1613/1525 1390/1614/1525
f 1392/1615/1525 1394/1616/1525 1391/1612/1525
f 1394/1616/1526 1396/1617/1526 1393/1613/1526
f 1395/1618/1526 1397/1619/1526 1394/1616/1526
f 1254/1620/1527 1279/1621/1527 1202/1622/1527
f 1254/1620/1527 1391/1612/1527 1390/1614/1527
f 1252/1623/1527 1391/1612/1527 1253/1624/1527
f 1252/1623/1527 1282/1625/1527 1392/1615/1527
f 1392/1615/1525 1283/1626/1525 1395/1618/1525
f 1283/1626/1526 1398/1627/1526 1395/1618/1526
f 1284/1628/1528 1258/1585/1528 1398/1627/1528
f 1398/1627/1528 1259/1586/1528 1397/1619/1528
f 1396/1617/1528 1259/1586/1528 1260/1583/1528
f 1396/1617/1528 1204/1584/1528 1281/1629/1528
f 1393/1613/1526 1281/1629/1526 1280/1630/1526
f 1279/1621/1525 1393/1613/1525 1280/1630/1525
f 1399/1631/1529 1403/1632/1529 1402/1633/1529
f 1400/1634/1530 1404/1635/1530 1403/1632/1530
f 1403/1632/1531 1405/1636/1531 1402/1633/1531
f 1404/1635/1532 1406/1637/1532 1403/1632/1532
f 1191/1638/1533 1399/1631/1533 1209/1639/1533
f 1243/1640/1534 1400/1634/1534 1399/1631/1534
f 1244/1641/1535 1401/1642/1535 1400/1634/1535
f 1245/1643/1536 1285/1644/1536 1401/1642/1536
f 1401/1642/1537 1286/1645/1537 1404/1635/1537
f 1286/1645/1538 1407/1646/1538 1404/1635/1538
f 1287/1647/1539 1216/1605/1539 1407/1646/1539
f 1407/1646/1540 1217/1603/1540 1406/1637/1540
f 1406/1637/1541 1218/1602/1541 1405/1636/1541
f 1405/1636/1542 1193/1600/1542 1207/1648/1542
f 1208/1649/1543 1405/1636/1543 1207/1648/1543
f 1399/1631/1544 1208/1649/1544 1209/1639/1544
f 1409/1650/1545 1411/1651/1545 1408/1652/1545
f 1410/1653/1545 1412/1654/1545 1409/1650/1545
f 1412/1654/1546 1414/1655/1546 1411/1651/1546
f 1413/1656/1546 1415/1657/1546 1412/1654/1546
f 1199/1658/1547 1408/1652/1547 1285/1644/1547
f 1247/1659/1547 1408/1652/1547 1246/1660/1547
f 1248/1661/1547 1409/1650/1547 1247/1659/1547
f 1200/1408/1547 1410/1653/1547 1248/1661/1547
f 1288/1407/1545 1413/1656/1545 1410/1653/1545
f 1289/1420/1546 1416/1662/1546 1413/1656/1546
f 1290/1418/1548 1252/1663/1548 1416/1662/1548
f 1416/1662/1548 1253/1664/1548 1415/1657/1548
f 1415/1657/1548 1254/1665/1548 1414/1655/1548
f 1287/1647/1548 1254/1665/1548 1202/1666/1548
f 1411/1651/1546 1287/1647/1546 1286/1645/1546
f 1285/1644/1545 1411/1651/1545 1286/1645/1545
f 1292/1398/1549 1295/1402/1549 1294/1399/1549
f 1293/1401/1550 1296/1413/1550 1295/1402/1550
f 1294/1399/1551 1295/1402/1551 1298/1403/1551
f 1295/1402/1552 1296/1413/1552 1299/1405/1552
f 1210/1406/1553 1291/1400/1553 1288/1407/1553
f 1211/1409/1554 1292/1398/1554 1291/1400/1554
f 1212/1410/1555 1293/1401/1555 1292/1398/1555
f 1192/1411/1556 1213/1667/1556 1293/1401/1556
f 1293/1401/1557 1213/1667/1557 1214/1412/1557
f 1214/1412/1558 1215/1668/1558 1299/1405/1558
f 1299/1405/1559 1215/1668/1559 1194/1414/1559
f 1298/1403/1560 1299/1405/1560 1249/1415/1560
f 1297/1404/1561 1298/1403/1561 1250/1416/1561
f 1290/1418/1562 1297/1404/1562 1251/1417/1562
f 1289/1420/1563 1294/1399/1563 1297/1404/1563
f 1291/1400/1564 1294/1399/1564 1289/1420/1564
f 1301/1421/1565 1304/1425/1565 1303/1422/1565
f 1302/1424/1566 1305/1431/1566 1304/1425/1566
f 1303/1422/1567 1304/1425/1567 1307/1426/1567
f 1304/1425/1568 1305/1431/1568 1308/1428/1568
f 1251/1417/1569 1300/1423/1569 1282/1429/1569
f 1250/1416/1570 1301/1421/1570 1300/1423/1570
f 1249/1415/1571 1302/1424/1571 1301/1421/1571
f 1194/1414/1572 1222/1669/1572 1302/1424/1572
f 1302/1424/1573 1222/1669/1573 1223/1430/1573
f 1223/1430/1574 1224/1670/1574 1308/1428/1574
f 1308/1428/1575 1224/1670/1575 1198/1432/1575
f 1307/1426/1576 1308/1428/1576 1255/1433/1576
f 1306/1427/1577 1307/1426/1577 1256/1434/1577
f 1284/1436/1578 1306/1427/1578 1257/1435/1578
f 1283/1438/1579 1303/1422/1579 1306/1427/1579
f 1300/1423/1580 1303/1422/1580 1283/1438/1580
f 1310/1439/1581 1313/1443/1581 1312/1440/1581
f 1311/1442/1582 1314/1449/1582 1313/1443/1582
f 1312/1440/1583 1313/1443/1583 1316/1444/1583
f 1313/1443/1584 1314/1449/1584 1317/1446/1584
f 1257/1435/1585 1309/1441/1585 1276/1447/1585
f 1256/1434/1586 1310/1439/1586 1309/1441/1586
f 1255/1433/1587 1311/1442/1587 1310/1439/1587
f 1198/1432/1588 1231/1507/1588 1311/1442/1588
f 1311/1442/1589 1231/1507/1589 1232/1448/1589
f 1232/1448/1590 1233/1519/1590 1317/1446/1590
f 1317/1446/1591 1233/1519/1591 1196/1450/1591
f 1316/1444/1592 1317/1446/1592 1261/1451/1592
f 1315/1445/1593 1316/1444/1593 1262/1452/1593
f 1278/1454/1594 1315/1445/1594 1263/1453/1594
f 1277/1456/1595 1312/1440/1595 1315/1445/1595
f 1309/1441/1596 1312/1440/1596 1277/1456/1596
f 1319/1457/1597 1322/1461/1597 1321/1458/1597
f 1320/1460/1598 1323/1467/1598 1322/1461/1598
f 1321/1458/1599 1322/1461/1599 1325/1462/1599
f 1322/1461/1600 1323/1467/1600 1326/1464/1600
f 1263/1453/1601 1318/1459/1601 1270/1465/1601
f 1262/1452/1602 1319/1457/1602 1318/1459/1602
f 1261/1451/1603 1320/1460/1603 1319/1457/1603
f 1196/1450/1604 1240/1671/1604 1320/1460/1604
f 1320/1460/1605 1240/1671/1605 1241/1466/1605
f 1241/1466/1606 1242/1672/1606 1326/1464/1606
f 1326/1464/1607 1242/1672/1607 1192/1468/1607
f 1325/1462/1608 1326/1464/1608 1212/1469/1608
f 1324/1463/1609 1325/1462/1609 1211/1470/1609
f 1272/1472/1610 1324/1463/1610 1210/1471/1610
f 1271/1474/1611 1321/1458/1611 1324/1463/1611
f 1318/1459/1612 1321/1458/1612 1271/1474/1612
f 1327/1475/1613 1328/1479/1613 1331/1476/1613
f 1329/1478/1614 1332/1490/1614 1331/1476/1614
f 1331/1476/1615 1334/1482/1615 1333/1480/1615
f 1331/1476/1616 1332/1490/1616 1335/1481/1616
f 1193/1483/1617 1221/1485/1617 1327/1475/1617
f 1221/1485/1618 1220/1487/1618 1328/1479/1618
f 1219/1486/1619 1329/1478/1619 1328/1479/1619
f 1197/1488/1620 1230/1489/1620 1329/1478/1620
f 1230/1489/1621 1229/1574/1621 1332/1490/1621
f 1332/1490/1622 1229/1574/1622 1228/1491/1622
f 1335/1481/1623 1228/1491/1623 1195/1492/1623
f 1334/1482/1624 1335/1481/1624 1239/1493/1624
f 1334/1482/1625 1238/1494/1625 1237/1495/1625
f 1333/1480/1626 1237/1495/1626 1191/1496/1626
f 1330/1477/1627 1333/1480/1627 1209/1497/1627
f 1207/1484/1628 1327/1475/1628 1330/1477/1628
f 1336/1499/1629 1337/1503/1629 1340/1500/1629
f 1338/1502/1630 1341/1513/1630 1340/1500/1630
f 1340/1500/1631 1343/1506/1631 1342/1504/1631
f 1340/1500/1632 1341/1513/1632 1344/1505/1632
f 1198/1432/1633 1224/1508/1633 1336/1499/1633
f 1224/1508/1634 1223/1510/1634 1337/1503/1634
f 1222/1509/1635 1338/1502/1635 1337/1503/1635
f 1194/1511/1636 1215/1512/1636 1338/1502/1636
f 1215/1512/1637 1214/1673/1637 1341/1513/1637
f 1341/1513/1638 1214/1673/1638 1213/1514/1638
f 1344/1505/1639 1213/1514/1639 1192/1515/1639
f 1343/1506/1640 1344/1505/1640 1242/1516/1640
f 1343/1506/1641 1241/1517/1641 1240/1518/1641
f 1342/1504/1642 1240/1518/1642 1196/1450/1642
f 1339/1501/1643 1342/1504/1643 1233/1519/1643
f 1231/1507/1644 1336/1499/1644 1339/1501/1644
f 1345/1520/1645 1346/1523/1645 1349/1521/1645
f 1346/1523/1646 1347/1530/1646 1350/1524/1646
f 1349/1521/1647 1352/1526/1647 1351/1525/1647
f 1350/1524/1648 1353/1534/1648 1352/1526/1648
f 1195/1492/1649 1236/1528/1649 1345/1520/1649
f 1236/1528/1650 1235/1529/1650 1346/1523/1650
f 1235/1529/1651 1234/1531/1651 1347/1530/1651
f 1234/1531/1652 1206/1551/1652 1267/1532/1652
f 1347/1530/1653 1267/1532/1653 1268/1533/1653
f 1268/1533/1654 1269/1535/1654 1353/1534/1654
f 1269/1535/1655 1199/1558/1655 1245/1536/1655
f 1353/1534/1656 1245/1536/1656 1244/1537/1656
f 1352/1526/1657 1244/1537/1657 1243/1538/1657
f 1351/1525/1658 1243/1538/1658 1191/1539/1658
f 1238/1541/1659 1348/1522/1659 1351/1525/1659
f 1345/1520/1660 1348/1522/1660 1238/1541/1660
f 1355/1542/1485 1358/1546/1485 1357/1543/1485
f 1356/1545/1485 1359/1548/1485 1358/1546/1485
f 1358/1546/1486 1361/1549/1486 1360/1547/1486
f 1359/1548/1486 1362/1554/1486 1361/1549/1486
f 1266/1550/1487 1354/1544/1487 1267/1532/1487
f 1265/1552/1487 1355/1542/1487 1354/1544/1487
f 1265/1552/1487 1264/1553/1487 1356/1545/1487
f 1205/1455/1487 1270/1465/1487 1356/1545/1487
f 1270/1465/1485 1271/1474/1485 1359/1548/1485
f 1271/1474/1486 1272/1472/1486 1362/1554/1486
f 1272/1472/1488 1200/1473/1488 1248/1555/1488
f 1362/1554/1488 1248/1555/1488 1247/1556/1488
f 1361/1549/1488 1247/1556/1488 1246/1557/1488
f 1360/1547/1488 1246/1557/1488 1199/1558/1488
f 1357/1543/1486 1360/1547/1486 1269/1535/1486
f 1354/1544/1485 1357/1543/1485 1268/1533/1485
f 1363/1559/1661 1364/1562/1661 1367/1560/1661
f 1364/1562/1662 1365/1568/1662 1368/1563/1662
f 1367/1560/1663 1370/1565/1663 1369/1564/1663
f 1368/1563/1664 1371/1572/1664 1370/1565/1664
f 1197/1488/1665 1227/1566/1665 1363/1559/1665
f 1227/1566/1666 1226/1567/1666 1364/1562/1666
f 1226/1567/1667 1225/1569/1667 1365/1568/1667
f 1225/1569/1668 1204/1674/1668 1273/1570/1668
f 1365/1568/1669 1273/1570/1669 1274/1571/1669
f 1274/1571/1670 1275/1573/1670 1371/1572/1670
f 1275/1573/1671 1206/1551/1671 1234/1531/1671
f 1371/1572/1672 1234/1531/1672 1235/1529/1672
f 1370/1565/1673 1235/1529/1673 1236/1528/1673
f 1369/1564/1674 1236/1528/1674 1195/1492/1674
f 1229/1574/1675 1366/1561/1675 1369/1564/1675
f 1363/1559/1676 1366/1561/1676 1229/1574/1676
f 1373/1575/1505 1376/1579/1505 1375/1576/1505
f 1374/1578/1505 1377/1581/1505 1376/1579/1505
f 1376/1579/1506 1379/1582/1506 1378/1580/1506
f 1377/1581/1506 1380/1588/1506 1379/1582/1506
f 1260/1583/1507 1372/1577/1507 1273/1570/1507
f 1260/1583/1507 1259/1586/1507 1373/1575/1507
f 1258/1585/1507 1374/1578/1507 1373/1575/1507
f 1258/1585/1507 1203/1675/1507 1276/1447/1507
f 1374/1578/1505 1276/1447/1505 1277/1587/1505
f 1277/1587/1506 1278/1454/1506 1380/1588/1506
f 1278/1454/1508 1205/1676/1508 1264/1589/1508
f 1379/1582/1508 1380/1588/1508 1264/1589/1508
f 1379/1582/1508 1265/1590/1508 1266/1550/1508
f 1378/1580/1508 1266/1550/1508 1206/1591/1508
f 1375/1576/1506 1378/1580/1506 1275/1573/1506
f 1273/1570/1505 1372/1577/1505 1375/1576/1505
f 1381/1593/1677 1382/1596/1677 1385/1594/1677
f 1382/1596/1678 1383/1604/1678 1386/1597/1678
f 1385/1594/1679 1388/1599/1679 1387/1598/1679
f 1386/1597/1680 1389/1608/1680 1388/1599/1680
f 1193/1600/1681 1218/1602/1681 1381/1593/1681
f 1218/1602/1682 1217/1603/1682 1382/1596/1682
f 1217/1603/1683 1216/1605/1683 1383/1604/1683
f 1216/1605/1684 1202/1666/1684 1279/1606/1684
f 1383/1604/1685 1279/1606/1685 1280/1607/1685
f 1280/1607/1686 1281/1609/1686 1389/1608/1686
f 1281/1609/1687 1204/1674/1687 1225/1569/1687
f 1389/1608/1688 1225/1569/1688 1226/1567/1688
f 1388/1599/1689 1226/1567/1689 1227/1566/1689
f 1387/1598/1690 1227/1566/1690 1197/1488/1690
f 1220/1611/1691 1384/1595/1691 1387/1598/1691
f 1381/1593/1692 1384/1595/1692 1220/1611/1692
f 1391/1612/1525 1394/1616/1525 1393/1613/1525
f 1392/1615/1525 1395/1618/1525 1394/1616/1525
f 1394/1616/1526 1397/1619/1526 1396/1617/1526
f 1395/1618/1526 1398/1627/1526 1397/1619/1526
f 1254/1620/1527 1390/1614/1527 1279/1621/1527
f 1254/1620/1527 1253/1624/1527 1391/1612/1527
f 1252/1623/1527 1392/1615/1527 1391/1612/1527
f 1252/1623/1527 1201/1419/1527 1282/1625/1527
f 1392/1615/1525 1282/1625/1525 1283/1626/1525
f 1283/1626/1526 1284/1628/1526 1398/1627/1526
f 1284/1628/1528 1203/1675/1528 1258/1585/1528
f 1398/1627/1528 1258/1585/1528 1259/1586/1528
f 1396/1617/1528 1397/1619/1528 1259/1586/1528
f 1396/1617/1528 1260/1583/1528 1204/1584/1528
f 1393/1613/1526 1396/1617/1526 1281/1629/1526
f 1279/1621/1525 1390/1614/1525 1393/1613/1525
f 1399/1631/1693 1400/1634/1693 1403/1632/1693
f 1400/1634/1694 1401/1642/1694 1404/1635/1694
f 1403/1632/1695 1406/1637/1695 1405/1636/1695
f 1404/1635/1696 1407/1646/1696 1406/1637/1696
f 1191/1638/1697 1243/1640/1697 1399/1631/1697
f 1243/1640/1698 1244/1641/1698 1400/1634/1698
f 1244/1641/1699 1245/1643/1699 1401/1642/1699
f 1245/1643/1700 1199/1658/1700 1285/1644/1700
f 1401/1642/1701 1285/1644/1701 1286/1645/1701
f 1286/1645/1702 1287/1647/1702 1407/1646/1702
f 1287/1647/1703 1202/1666/1703 1216/1605/1703
f 1407/1646/1704 1216/1605/1704 1217/1603/1704
f 1406/1637/1705 1217/1603/1705 1218/1602/1705
f 1405/1636/1706 1218/1602/1706 1193/1600/1706
f 1208/1649/1707 1402/1633/1707 1405/1636/1707
f 1399/1631/1708 1402/1633/1708 1208/1649/1708
f 1409/1650/1545 1412/1654/1545 1411/1651/1545
f 1410/1653/1545 1413/1656/1545 1412/1654/1545
f 1412/1654/1546 1415/1657/1546 1414/1655/1546
f 1413/1656/1546 1416/1662/1546 1415/1657/1546
f 1199/1658/1547 1246/1660/1547 1408/1652/1547
f 1247/1659/1547 1409/1650/1547 1408/1652/1547
f 1248/1661/1547 1410/1653/1547 1409/1650/1547
f 1200/1408/1547 1288/1407/1547 1410/1653/1547
f 1288/1407/1545 1289/1420/1545 1413/1656/1545
f 1289/1420/1546 1290/1418/1546 1416/1662/1546
f 1290/1418/1548 1201/1419/1548 1252/1663/1548
f 1416/1662/1548 1252/1663/1548 1253/1664/1548
f 1415/1657/1548 1253/1664/1548 1254/1665/1548
f 1287/1647/1548 1414/1655/1548 1254/1665/1548
f 1411/1651/1546 1414/1655/1546 1287/1647/1546
f 1285/1644/1545 1408/1652/1545 1411/1651/1545
o Cube.003
v 0.321424 -0.411795 0.226570
v 0.004825 -0.403375 0.226570
v 0.321424 -0.411795 0.109665
v 0.004825 -0.403375 0.109665
v 0.324012 -0.314486 0.226570
v 0.007413 -0.306066 0.226570
v 0.324012 -0.314486 0.109665
v 0.007413 -0.306066 0.109665
v 0.226013 -0.425487 0.246055
v 0.099373 -0.422120 0.246055
v 0.099373 -0.422120 0.090180
v 0.226013 -0.425487 0.090180
v 0.102824 -0.292374 0.090180
v 0.229463 -0.295742 0.090180
v 0.102824 -0.292374 0.246055
v 0.229463 -0.295742 0.246055
v 0.325853 -0.419015 0.133437
v 0.328078 -0.422642 0.168118
v 0.325853 -0.419015 0.202798
v 0.068224 -0.420944 0.245637
v 0.040134 -0.417764 0.242717
v 0.018004 -0.411175 0.235514
v 0.000018 -0.410350 0.202798
v -0.002396 -0.413853 0.168118
v 0.000018 -0.410350 0.133437
v 0.257181 -0.425969 0.090598
v 0.285400 -0.424287 0.093518
v 0.307849 -0.418884 0.100721
v 0.328104 -0.334396 0.101138
v 0.329656 -0.363325 0.096856
v 0.326568 -0.392130 0.101138
v 0.000733 -0.383465 0.101138
v -0.000819 -0.354536 0.096856
v 0.002268 -0.325731 0.101138
v 0.260613 -0.296918 0.090598
v 0.288703 -0.300097 0.093518
v 0.310833 -0.306686 0.100721
v 0.328819 -0.307512 0.202798
v 0.331233 -0.304008 0.168118
v 0.328819 -0.307512 0.133437
v 0.002983 -0.298846 0.133437
v 0.000759 -0.295220 0.168118
v 0.002983 -0.298846 0.202798
v 0.260613 -0.296918 0.245637
v 0.288703 -0.300097 0.242717
v 0.310833 -0.306686 0.235514
v 0.326568 -0.392130 0.235097
v 0.329656 -0.363325 0.239380
v 0.328104 -0.334396 0.235097
v 0.002268 -0.325731 0.235097
v -0.000819 -0.354536 0.239380
v 0.000733 -0.383465 0.235097
v 0.307849 -0.418884 0.235514
v 0.285400 -0.424287 0.242717
v 0.257181 -0.425969 0.245637
v 0.194353 -0.424645 0.246055
v 0.162693 -0.423803 0.246055
v 0.131033 -0.422962 0.246055
v 0.018004 -0.411175 0.100721
v 0.040134 -0.417764 0.093518
v 0.068224 -0.420944 0.090598
v 0.131033 -0.422962 0.090180
v 0.162693 -0.423803 0.090180
v 0.194353 -0.424645 0.090180
v 0.020988 -0.298978 0.100721
v 0.043436 -0.293575 0.093518
v 0.071655 -0.291893 0.090598
v 0.134484 -0.293216 0.090180
v 0.166144 -0.294058 0.090180
v 0.197804 -0.294900 0.090180
v 0.020988 -0.298978 0.235514
v 0.043436 -0.293575 0.242717
v 0.071655 -0.291893 0.245637
v 0.134484 -0.293216 0.246055
v 0.166144 -0.294058 0.246055
v 0.197804 -0.294900 0.246055
v 0.228682 -0.325137 0.267975
v 0.227738 -0.360615 0.275281
v 0.226795 -0.396092 0.267975
v 0.102042 -0.321769 0.267975
v 0.101099 -0.357247 0.275281
v 0.100155 -0.392724 0.267975
v 0.229949 -0.277496 0.125496
v 0.230110 -0.271414 0.168118
v 0.229949 -0.277496 0.210739
v 0.103309 -0.274128 0.125496
v 0.103471 -0.268047 0.168118
v 0.103309 -0.274128 0.210739
v 0.226795 -0.396092 0.068261
v 0.227738 -0.360615 0.060954
v 0.228682 -0.325137 0.068261
v 0.100155 -0.392724 0.068261
v 0.101099 -0.357247 0.060954
v 0.102042 -0.321769 0.068261
v 0.225528 -0.443733 0.210739
v 0.225366 -0.449815 0.168118
v 0.225528 -0.443733 0.125496
v 0.098888 -0.440365 0.210739
v 0.098726 -0.446447 0.168118
v 0.098888 -0.440365 0.125496
v 0.067698 -0.439095 0.210510
v 0.039326 -0.435259 0.208901
v 0.016320 -0.426420 0.205308
v 0.067494 -0.445147 0.168118
v 0.038829 -0.441097 0.168118
v 0.015064 -0.431561 0.168118
v 0.067698 -0.439095 0.125725
v 0.039326 -0.435259 0.127334
v 0.016320 -0.426420 0.130927
v 0.068958 -0.391703 0.068789
v 0.040540 -0.389608 0.072489
v 0.017395 -0.385999 0.082366
v 0.069854 -0.356416 0.061518
v 0.041101 -0.355651 0.065464
v 0.017100 -0.355013 0.076154
v 0.070835 -0.321131 0.068789
v 0.042346 -0.321713 0.072489
v 0.019041 -0.324086 0.082366
v 0.072095 -0.273739 0.125725
v 0.043560 -0.276062 0.127334
v 0.020116 -0.283665 0.130927
v 0.072214 -0.267685 0.168118
v 0.043373 -0.270205 0.168118
v 0.019135 -0.278464 0.168118
v 0.072095 -0.273739 0.210510
v 0.043560 -0.276062 0.208901
v 0.020116 -0.283665 0.205308
v 0.070835 -0.321131 0.267446
v 0.042346 -0.321713 0.263746
v 0.019041 -0.324086 0.253869
v 0.069854 -0.356416 0.274717
v 0.041101 -0.355651 0.270771
v 0.017100 -0.355013 0.260082
v 0.068958 -0.391703 0.267446
v 0.040540 -0.389608 0.263746
v 0.017395 -0.385999 0.253869
v 0.336399 -0.394291 0.131157
v 0.340563 -0.363615 0.129172
v 0.338036 -0.332761 0.131157
v 0.339701 -0.396033 0.168118
v 0.344215 -0.363712 0.168118
v 0.341425 -0.331197 0.168118
v 0.336399 -0.394291 0.205078
v 0.340563 -0.363615 0.207064
v 0.338036 -0.332761 0.205078
v -0.007563 -0.323570 0.131157
v -0.011726 -0.354246 0.129172
v -0.009199 -0.385100 0.131157
v -0.010864 -0.321829 0.168118
v -0.015379 -0.354149 0.168118
v -0.012589 -0.386664 0.168118
v -0.007563 -0.323570 0.205078
v -0.011726 -0.354246 0.207064
v -0.009199 -0.385100 0.205078
v 0.311442 -0.331862 0.253869
v 0.288297 -0.328254 0.263746
v 0.259878 -0.326158 0.267446
v 0.311737 -0.362848 0.260082
v 0.287736 -0.362210 0.270771
v 0.258983 -0.361445 0.274717
v 0.309796 -0.393775 0.253869
v 0.286491 -0.396149 0.263746
v 0.258002 -0.396730 0.267446
v 0.197022 -0.324295 0.267975
v 0.165362 -0.323453 0.267975
v 0.133702 -0.322611 0.267975
v 0.196078 -0.359773 0.275281
v 0.164418 -0.358931 0.275281
v 0.132759 -0.358089 0.275281
v 0.195135 -0.395250 0.267974
v 0.163475 -0.394408 0.267974
v 0.131815 -0.393566 0.267975
v 0.312517 -0.291441 0.130927
v 0.289511 -0.282602 0.127334
v 0.261139 -0.278766 0.125725
v 0.313773 -0.286300 0.168118
v 0.290008 -0.276764 0.168118
v 0.261343 -0.272715 0.168118
v 0.312517 -0.291441 0.205308
v 0.289511 -0.282602 0.208901
v 0.261139 -0.278766 0.210510
v 0.198289 -0.276654 0.125496
v 0.166629 -0.275812 0.125496
v 0.134969 -0.274970 0.125496
v 0.198451 -0.270572 0.168118
v 0.166791 -0.269731 0.168118
v 0.135131 -0.268889 0.168118
v 0.198289 -0.276654 0.210739
v 0.166629 -0.275812 0.210739
v 0.134969 -0.274970 0.210739
v 0.309796 -0.393775 0.082366
v 0.286491 -0.396149 0.072489
v 0.258002 -0.396730 0.068789
v 0.311737 -0.362848 0.076154
v 0.287736 -0.362210 0.065464
v 0.258983 -0.361445 0.061518
v 0.311442 -0.331862 0.082366
v 0.288297 -0.328254 0.072489
v 0.259878 -0.326158 0.068789
v 0.195135 -0.395250 0.068261
v 0.163475 -0.394408 0.068261
v 0.131815 -0.393566 0.068261
v 0.196078 -0.359773 0.060954
v 0.164418 -0.358931 0.060954
v 0.132759 -0.358089 0.060954
v 0.197022 -0.324295 0.068261
v 0.165362 -0.323453 0.068261
v 0.133702 -0.322611 0.068261
v 0.308721 -0.434196 0.205308
v 0.285277 -0.441800 0.208901
v 0.256741 -0.444123 0.210510
v 0.309701 -0.439397 0.168118
v 0.285463 -0.447656 0.168118
v 0.256623 -0.450176 0.168118
v 0.308721 -0.434196 0.130927
v 0.285277 -0.441800 0.127334
v 0.256741 -0.444123 0.125725
v 0.193868 -0.442891 0.210739
v 0.162208 -0.442049 0.210739
v 0.130548 -0.441207 0.210739
v 0.193706 -0.448973 0.168118
v 0.162046 -0.448131 0.168118
v 0.130386 -0.447289 0.168118
v 0.193868 -0.442891 0.125496
v 0.162208 -0.442049 0.125496
v 0.130548 -0.441207 0.125496
vn -0.1405 -0.9801 0.1406
vn -0.3737 -0.9175 0.1359
vn -0.1405 -0.9801 -0.1406
vn -0.3737 -0.9175 -0.1359
vn -0.0397 -0.8875 0.4591
vn -0.1468 -0.8779 0.4558
vn -0.3836 -0.8165 0.4316
vn -0.6466 -0.6659 0.3721
vn -0.7069 -0.6970 0.1202
vn -0.7069 -0.6970 -0.1202
vn -0.6466 -0.6659 -0.3721
vn -0.3836 -0.8165 -0.4316
vn -0.1468 -0.8779 -0.4558
vn -0.0397 -0.8875 -0.4591
vn -0.0413 -0.9891 -0.1413
vn -0.0413 -0.9891 0.1413
vn -0.1408 -0.1964 -0.9704
vn -0.4096 -0.1783 -0.8947
vn -0.1302 0.2036 -0.9704
vn -0.3995 0.1998 -0.8947
vn -0.0333 -0.5969 -0.8016
vn -0.1492 -0.5888 -0.7944
vn -0.4018 -0.5442 -0.7365
vn -0.6683 -0.4488 -0.5933
vn -0.7506 -0.1368 -0.6465
vn -0.7422 0.1765 -0.6465
vn -0.6435 0.4836 -0.5933
vn -0.3723 0.5648 -0.7365
vn -0.1177 0.5958 -0.7944
vn -0.0015 0.5978 -0.8016
vn -0.0125 0.2020 -0.9793
vn -0.0232 -0.2011 -0.9793
vn -0.0882 0.9861 -0.1406
vn -0.3244 0.9361 -0.1359
vn -0.0882 0.9861 0.1406
vn -0.3244 0.9361 0.1359
vn 0.0076 0.8884 -0.4591
vn -0.0999 0.8844 -0.4558
vn -0.3396 0.8357 -0.4316
vn -0.6103 0.6993 -0.3721
vn -0.6689 0.7336 -0.1202
vn -0.6689 0.7336 0.1202
vn -0.6103 0.6993 0.3721
vn -0.3396 0.8357 0.4316
vn -0.0999 0.8844 0.4558
vn 0.0076 0.8884 0.4591
vn 0.0113 0.9899 0.1413
vn 0.0113 0.9899 -0.1413
vn -0.1302 0.2036 0.9704
vn -0.3995 0.1998 0.8947
vn -0.1408 -0.1964 0.9704
vn -0.4096 -0.1783 0.8947
vn -0.0015 0.5978 0.8016
vn -0.1177 0.5958 0.7944
vn -0.3723 0.5648 0.7365
vn -0.6435 0.4836 0.5933
vn -0.7422 0.1765 0.6465
vn -0.7506 -0.1368 0.6465
vn -0.6683 -0.4488 0.5933
vn -0.4018 -0.5442 0.7365
vn -0.1492 -0.5888 0.7944
vn -0.0333 -0.5969 0.8016
vn -0.0232 -0.2011 0.9793
vn -0.0125 0.2020 0.9793
vn 0.9859 -0.1377 -0.0946
vn 0.9919 0.0872 -0.0928
vn 0.9859 -0.1377 0.0946
vn 0.9919 0.0872 0.0928
vn 0.8724 -0.3983 -0.2835
vn 0.9367 -0.1477 -0.3174
vn 0.9432 0.0976 -0.3175
vn 0.8948 0.3181 -0.3134
vn 0.9317 0.3492 -0.1002
vn 0.9317 0.3492 0.1002
vn 0.8948 0.3181 0.3134
vn 0.9432 0.0976 0.3175
vn 0.9367 -0.1477 0.3174
vn 0.8724 -0.3983 0.2835
vn 0.9118 -0.3983 0.1001
vn 0.9118 -0.3983 -0.1001
vn -0.9859 0.1377 -0.0946
vn -0.9919 -0.0872 -0.0928
vn -0.9859 0.1377 0.0946
vn -0.9919 -0.0872 0.0928
vn -0.8724 0.3983 -0.2835
vn -0.9367 0.1477 -0.3174
vn -0.9432 -0.0976 -0.3175
vn -0.8948 -0.3181 -0.3134
vn -0.9317 -0.3492 -0.1002
vn -0.9317 -0.3492 0.1002
vn -0.8948 -0.3181 0.3134
vn -0.9432 -0.0976 0.3175
vn -0.9367 0.1477 0.3174
vn -0.8724 0.3983 0.2835
vn -0.9118 0.3983 0.1001
vn -0.9118 0.3983 -0.1001
vn 0.4040 0.1835 0.8962
vn 0.1385 0.1984 0.9703
vn 0.3937 -0.2047 0.8962
vn 0.1277 -0.2055 0.9703
vn 0.7130 0.3947 0.5795
vn 0.3993 0.5464 0.7362
vn 0.1469 0.5905 0.7935
vn 0.0331 0.5970 0.8016
vn 0.0230 0.2012 0.9793
vn 0.0123 -0.2021 0.9793
vn 0.0013 -0.5979 0.8016
vn 0.1153 -0.5975 0.7935
vn 0.3697 -0.5668 0.7362
vn 0.6910 -0.4320 0.5795
vn 0.7417 -0.1757 0.6473
vn 0.7500 0.1361 0.6473
vn 0.0054 0.2016 0.9795
vn -0.0054 -0.2016 0.9795
vn 0.0159 0.5974 0.8018
vn -0.0159 -0.5974 0.8018
vn 0.3688 0.9190 -0.1395
vn 0.1385 0.9801 -0.1420
vn 0.3688 0.9190 0.1395
vn 0.1385 0.9801 0.1420
vn 0.6882 0.6467 -0.3289
vn 0.3814 0.8165 -0.4334
vn 0.1446 0.8774 -0.4574
vn 0.0395 0.8875 -0.4592
vn 0.0412 0.9891 -0.1414
vn 0.0412 0.9891 0.1414
vn 0.0395 0.8875 0.4592
vn 0.1446 0.8774 0.4574
vn 0.3814 0.8165 0.4334
vn 0.6882 0.6467 0.3289
vn 0.7063 0.6977 0.1197
vn 0.7063 0.6977 -0.1197
vn 0.0263 0.9896 -0.1413
vn 0.0263 0.9896 0.1413
vn 0.0236 0.8881 -0.4591
vn 0.0236 0.8881 0.4591
vn 0.3937 -0.2047 -0.8962
vn 0.1277 -0.2055 -0.9703
vn 0.4040 0.1835 -0.8962
vn 0.1385 0.1984 -0.9703
vn 0.6910 -0.4320 -0.5795
vn 0.3697 -0.5668 -0.7362
vn 0.1153 -0.5975 -0.7935
vn 0.0013 -0.5979 -0.8016
vn 0.0123 -0.2021 -0.9793
vn 0.0230 0.2012 -0.9793
vn 0.0331 0.5970 -0.8016
vn 0.1469 0.5905 -0.7935
vn 0.3993 0.5464 -0.7362
vn 0.7130 0.3947 -0.5795
vn 0.7500 0.1361 -0.6473
vn 0.7417 -0.1757 -0.6473
vn -0.0054 -0.2016 -0.9795
vn 0.0054 0.2016 -0.9795
vn -0.0159 -0.5974 -0.8018
vn 0.0159 0.5974 -0.8018
vn 0.3194 -0.9373 0.1395
vn 0.0862 -0.9861 0.1420
vn 0.3194 -0.9373 -0.1395
vn 0.0862 -0.9861 -0.1420
vn 0.6528 -0.6824 0.3289
vn 0.3375 -0.8357 0.4334
vn 0.0977 -0.8839 0.4574
vn -0.0077 -0.8883 0.4592
vn -0.0115 -0.9899 0.1414
vn -0.0115 -0.9899 -0.1414
vn -0.0077 -0.8883 -0.4592
vn 0.0977 -0.8839 -0.4574
vn 0.3375 -0.8357 -0.4334
vn 0.6528 -0.6824 -0.3289
vn 0.6682 -0.7343 -0.1197
vn 0.6682 -0.7343 0.1197
vn -0.0263 -0.9896 0.1413
vn -0.0263 -0.9896 -0.1413
vn -0.0236 -0.8881 0.4591
vn -0.0236 -0.8881 -0.4591
vn -0.1385 -0.9801 0.1420
vn -0.3688 -0.9190 0.1395
vn -0.1385 -0.9801 -0.1420
vn -0.3688 -0.9190 -0.1395
vn -0.0395 -0.8875 0.4592
vn -0.1446 -0.8774 0.4574
vn -0.3814 -0.8165 0.4334
vn -0.6882 -0.6467 0.3289
vn -0.7063 -0.6977 0.1197
vn -0.7063 -0.6977 -0.1197
vn -0.6882 -0.6467 -0.3289
vn -0.3814 -0.8165 -0.4334
vn -0.1446 -0.8774 -0.4574
vn -0.0395 -0.8875 -0.4592
vn -0.0412 -0.9891 -0.1414
vn -0.0412 -0.9891 0.1414
vn -0.1385 -0.1984 -0.9703
vn -0.4040 -0.1835 -0.8962
vn -0.1277 0.2055 -0.9703
vn -0.3937 0.2047 -0.8962
vn -0.0331 -0.5970 -0.8016
vn -0.1469 -0.5905 -0.7935
vn -0.3993 -0.5464 -0.7362
vn -0.7130 -0.3947 -0.5795
vn -0.7500 -0.1361 -0.6473
vn -0.7417 0.1757 -0.6473
vn -0.6910 0.4320 -0.5795
vn -0.3697 0.5668 -0.7362
vn -0.1153 0.5975 -0.7935
vn -0.0013 0.5979 -0.8016
vn -0.0123 0.2021 -0.9793
vn -0.0230 -0.2012 -0.9793
vn -0.0862 0.9861 -0.1420
vn -0.3194 0.9373 -0.1395
vn -0.0862 0.9861 0.1420
vn -0.3194 0.9373 0.1395
vn 0.0077 0.8883 -0.4592
vn -0.0977 0.8839 -0.4574
vn -0.3375 0.8357 -0.4334
vn -0.6528 0.6824 -0.3289
vn -0.6682 0.7343 -0.1197
vn -0.6682 0.7343 0.1197
vn -0.6528 0.6824 0.3289
vn -0.3375 0.8357 0.4334
vn -0.0977 0.8839 0.4574
vn 0.0077 0.8883 0.4592
vn 0.0115 0.9899 0.1414
vn 0.0115 0.9899 -0.1414
vn -0.1277 0.2055 0.9703
vn -0.3937 0.2047 0.8962
vn -0.1385 -0.1984 0.9703
vn -0.4040 -0.1835 0.8962
vn -0.0013 0.5979 0.8016
vn -0.1153 0.5975 0.7935
vn -0.3697 0.5668 0.7362
vn -0.6910 0.4320 0.5795
vn -0.7417 0.1757 0.6473
vn -0.7500 -0.1361 0.6473
vn -0.7130 -0.3947 0.5795
vn -0.3993 -0.5464 0.7362
vn -0.1469 -0.5905 0.7935
vn -0.0331 -0.5970 0.8016
vn -0.0230 -0.2012 0.9793
vn -0.0123 0.2021 0.9793
vn 0.9858 -0.1398 -0.0928
vn 0.9919 0.0851 -0.0946
vn 0.9858 -0.1398 0.0928
vn 0.9919 0.0851 0.0946
vn 0.8766 -0.3652 -0.3134
vn 0.9367 -0.1476 -0.3175
vn 0.9432 0.0977 -0.3174
vn 0.8923 0.3513 -0.2835
vn 0.9317 0.3492 -0.1001
vn 0.9317 0.3492 0.1001
vn 0.8923 0.3513 0.2835
vn 0.9432 0.0977 0.3174
vn 0.9367 -0.1476 0.3175
vn 0.8766 -0.3652 0.3134
vn 0.9118 -0.3982 0.1002
vn 0.9118 -0.3982 -0.1002
vn -0.9858 0.1398 -0.0928
vn -0.9919 -0.0851 -0.0946
vn -0.9858 0.1398 0.0928
vn -0.9919 -0.0851 0.0946
vn -0.8766 0.3652 -0.3134
vn -0.9367 0.1476 -0.3175
vn -0.9432 -0.0977 -0.3174
vn -0.8923 -0.3513 -0.2835
vn -0.9317 -0.3492 -0.1001
vn -0.9317 -0.3492 0.1001
vn -0.8923 -0.3513 0.2835
vn -0.9432 -0.0977 0.3174
vn -0.9367 0.1476 0.3175
vn -0.8766 0.3652 0.3134
vn -0.9118 0.3982 0.1002
vn -0.9118 0.3982 -0.1002
vn 0.4096 0.1783 0.8947
vn 0.1408 0.1964 0.9704
vn 0.3995 -0.1998 0.8947
vn 0.1302 -0.2036 0.9704
vn 0.6683 0.4488 0.5933
vn 0.4018 0.5442 0.7365
vn 0.1492 0.5888 0.7944
vn 0.0333 0.5969 0.8016
vn 0.0232 0.2011 0.9793
vn 0.0125 -0.2020 0.9793
vn 0.0015 -0.5978 0.8016
vn 0.1177 -0.5958 0.7944
vn 0.3723 -0.5648 0.7365
vn 0.6435 -0.4836 0.5933
vn 0.7422 -0.1765 0.6465
vn 0.7506 0.1368 0.6465
vn 0.3737 0.9175 -0.1359
vn 0.1405 0.9801 -0.1406
vn 0.3737 0.9175 0.1359
vn 0.1405 0.9801 0.1406
vn 0.6466 0.6659 -0.3721
vn 0.3836 0.8165 -0.4316
vn 0.1468 0.8779 -0.4558
vn 0.0397 0.8875 -0.4591
vn 0.0413 0.9891 -0.1413
vn 0.0413 0.9891 0.1413
vn 0.0397 0.8875 0.4591
vn 0.1468 0.8779 0.4558
vn 0.3836 0.8165 0.4316
vn 0.6466 0.6659 0.3721
vn 0.7069 0.6970 0.1202
vn 0.7069 0.6970 -0.1202
vn 0.3995 -0.1998 -0.8947
vn 0.1302 -0.2036 -0.9704
vn 0.4096 0.1783 -0.8947
vn 0.1408 0.1964 -0.9704
vn 0.6435 -0.4836 -0.5933
vn 0.3723 -0.5648 -0.7365
vn 0.1177 -0.5959 -0.7944
vn 0.0015 -0.5978 -0.8016
vn 0.0125 -0.2020 -0.9793
vn 0.0232 0.2011 -0.9793
vn 0.0333 0.5969 -0.8016
vn 0.1492 0.5887 -0.7944
vn 0.4018 0.5442 -0.7365
vn 0.6683 0.4488 -0.5933
vn 0.7506 0.1368 -0.6465
vn 0.7422 -0.1765 -0.6465
vn 0.3244 -0.9361 0.1359
vn 0.0882 -0.9861 0.1406
vn 0.3244 -0.9361 -0.1359
vn 0.0882 -0.9861 -0.1406
vn 0.6103 -0.6993 0.3721
vn 0.3396 -0.8357 0.4316
vn 0.0999 -0.8844 0.4558
vn -0.0076 -0.8884 0.4591
vn -0.0113 -0.9899 0.1413
vn -0.0113 -0.9899 -0.1413
vn -0.0076 -0.8884 -0.4591
vn 0.0999 -0.8844 -0.4558
vn 0.3396 -0.8357 -0.4316
vn 0.6103 -0.6993 -0.3721
vn 0.6689 -0.7336 -0.1202
vn 0.6689 -0.7336 0.1202
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1518/1677/1709 1520/1678/1709 1517/1679/1709
f 1519/1680/1710 1521/1681/1710 1518/1677/1710
f 1520/1678/1711 1524/1682/1711 1523/1683/1711
f 1521/1681/1712 1525/1684/1712 1524/1682/1712
f 1436/1685/1713 1514/1686/1713 1426/1687/1713
f 1437/1688/1714 1517/1679/1714 1436/1685/1714
f 1438/1689/1715 1518/1677/1715 1437/1688/1715
f 1418/1690/1716 1519/1680/1716 1438/1689/1716
f 1519/1680/1717 1440/1691/1717 1522/1692/1717
f 1440/1691/1718 1525/1684/1718 1522/1692/1718
f 1525/1684/1719 1420/1693/1719 1475/1694/1719
f 1524/1682/1720 1475/1694/1720 1476/1695/1720
f 1523/1683/1721 1476/1695/1721 1477/1696/1721
f 1516/1697/1722 1477/1696/1722 1427/1698/1722
f 1515/1699/1723 1523/1683/1723 1516/1697/1723
f 1517/1679/1724 1515/1699/1724 1514/1686/1724
f 1527/1700/1725 1529/1701/1725 1526/1702/1725
f 1528/1703/1726 1530/1704/1726 1527/1700/1726
f 1529/1701/1727 1533/1705/1727 1532/1706/1727
f 1530/1704/1728 1534/1707/1728 1533/1705/1728
f 1477/1696/1729 1508/1708/1729 1427/1698/1729
f 1476/1695/1730 1526/1702/1730 1477/1696/1730
f 1475/1694/1731 1527/1700/1731 1476/1695/1731
f 1420/1693/1732 1528/1703/1732 1475/1694/1732
f 1528/1703/1733 1449/1709/1733 1531/1710/1733
f 1449/1709/1734 1534/1707/1734 1531/1710/1734
f 1534/1707/1735 1424/1711/1735 1481/1712/1735
f 1533/1705/1736 1481/1712/1736 1482/1713/1736
f 1532/1706/1737 1482/1713/1737 1483/1714/1737
f 1510/1715/1738 1483/1714/1738 1429/1716/1738
f 1509/1717/1739 1532/1706/1739 1510/1715/1739
f 1526/1702/1740 1509/1717/1740 1508/1708/1740
f 1536/1718/1741 1538/1719/1741 1535/1720/1741
f 1537/1721/1742 1539/1722/1742 1536/1718/1742
f 1538/1719/1743 1542/1723/1743 1541/1724/1743
f 1539/1722/1744 1543/1725/1744 1542/1723/1744
f 1483/1714/1745 1502/1726/1745 1429/1716/1745
f 1482/1713/1746 1535/1720/1746 1483/1714/1746
f 1481/1712/1747 1536/1718/1747 1482/1713/1747
f 1424/1711/1748 1537/1721/1748 1481/1712/1748
f 1537/1721/1749 1458/1727/1749 1540/1728/1749
f 1458/1727/1750 1543/1725/1750 1540/1728/1750
f 1543/1725/1751 1422/1729/1751 1487/1730/1751
f 1542/1723/1752 1487/1730/1752 1488/1731/1752
f 1541/1724/1753 1488/1731/1753 1489/1732/1753
f 1504/1733/1754 1489/1732/1754 1431/1734/1754
f 1503/1735/1755 1541/1724/1755 1504/1733/1755
f 1535/1720/1756 1503/1735/1756 1502/1726/1756
f 1545/1736/1757 1547/1737/1757 1544/1738/1757
f 1546/1739/1758 1548/1740/1758 1545/1736/1758
f 1547/1737/1759 1551/1741/1759 1550/1742/1759
f 1548/1740/1760 1552/1743/1760 1551/1741/1760
f 1489/1732/1761 1496/1744/1761 1431/1734/1761
f 1488/1731/1762 1544/1738/1762 1489/1732/1762
f 1487/1730/1763 1545/1736/1763 1488/1731/1763
f 1422/1729/1764 1546/1739/1764 1487/1730/1764
f 1546/1739/1765 1467/1745/1765 1549/1746/1765
f 1467/1745/1766 1552/1743/1766 1549/1746/1766
f 1552/1743/1767 1418/1747/1767 1438/1748/1767
f 1551/1741/1768 1438/1748/1768 1437/1749/1768
f 1550/1742/1769 1437/1749/1769 1436/1750/1769
f 1498/1751/1770 1436/1750/1770 1426/1752/1770
f 1497/1753/1771 1550/1742/1771 1498/1751/1771
f 1544/1738/1772 1497/1753/1772 1496/1744/1772
f 1553/1754/1773 1557/1755/1773 1556/1756/1773
f 1555/1757/1774 1557/1755/1774 1554/1758/1774
f 1557/1755/1775 1559/1759/1775 1556/1756/1775
f 1557/1755/1776 1561/1760/1776 1560/1761/1776
f 1419/1762/1777 1553/1754/1777 1433/1763/1777
f 1447/1764/1778 1554/1758/1778 1553/1754/1778
f 1445/1765/1779 1554/1758/1779 1446/1766/1779
f 1423/1767/1780 1555/1757/1780 1445/1765/1780
f 1456/1768/1781 1558/1769/1781 1555/1757/1781
f 1558/1769/1782 1454/1770/1782 1561/1760/1782
f 1561/1760/1783 1421/1771/1783 1465/1772/1783
f 1560/1761/1784 1465/1772/1784 1464/1773/1784
f 1560/1761/1785 1463/1774/1785 1559/1759/1785
f 1559/1759/1786 1417/1775/1786 1435/1776/1786
f 1556/1756/1787 1435/1776/1787 1434/1777/1787
f 1433/1763/1788 1556/1756/1788 1434/1777/1788
f 1562/1778/1789 1566/1779/1789 1565/1780/1789
f 1564/1781/1790 1566/1779/1790 1563/1782/1790
f 1566/1779/1791 1568/1783/1791 1565/1780/1791
f 1566/1779/1792 1570/1784/1792 1569/1785/1792
f 1424/1711/1793 1562/1778/1793 1457/1786/1793
f 1450/1787/1794 1563/1782/1794 1562/1778/1794
f 1448/1788/1795 1563/1782/1795 1449/1789/1795
f 1420/1790/1796 1564/1781/1796 1448/1788/1796
f 1441/1791/1797 1567/1792/1797 1564/1781/1797
f 1567/1792/1798 1439/1793/1798 1570/1784/1798
f 1570/1784/1799 1418/1794/1799 1468/1795/1799
f 1569/1785/1800 1468/1795/1800 1467/1796/1800
f 1569/1785/1801 1466/1797/1801 1568/1783/1801
f 1568/1783/1802 1422/1729/1802 1459/1798/1802
f 1565/1780/1803 1459/1798/1803 1458/1727/1803
f 1457/1786/1804 1565/1780/1804 1458/1727/1804
f 1571/1799/1805 1575/1800/1805 1574/1801/1805
f 1572/1802/1806 1576/1803/1806 1575/1800/1806
f 1575/1800/1807 1577/1804/1807 1574/1801/1807
f 1576/1803/1808 1578/1805/1808 1575/1800/1808
f 1421/1771/1809 1571/1799/1809 1465/1806/1809
f 1462/1807/1810 1572/1802/1810 1571/1799/1810
f 1461/1808/1811 1573/1809/1811 1572/1802/1811
f 1460/1810/1812 1493/1811/1812 1573/1809/1812
f 1573/1809/1813 1494/1812/1813 1576/1803/1813
f 1494/1812/1814 1579/1813/1814 1576/1803/1814
f 1495/1814/1815 1471/1815/1815 1579/1813/1815
f 1579/1813/1816 1470/1816/1816 1578/1805/1816
f 1578/1805/1817 1469/1817/1817 1577/1804/1817
f 1577/1804/1818 1417/1818/1818 1463/1819/1818
f 1464/1820/1819 1577/1804/1819 1463/1819/1819
f 1571/1799/1820 1464/1820/1820 1465/1806/1820
f 1581/1821/1821 1583/1822/1821 1580/1823/1821
f 1582/1824/1821 1584/1825/1821 1581/1821/1821
f 1584/1825/1822 1586/1826/1822 1583/1822/1822
f 1585/1827/1822 1587/1828/1822 1584/1825/1822
f 1492/1829/1823 1493/1811/1823 1432/1830/1823
f 1491/1831/1823 1580/1823/1823 1492/1829/1823
f 1491/1831/1823 1582/1824/1823 1581/1821/1823
f 1431/1734/1823 1582/1824/1823 1490/1832/1823
f 1496/1744/1821 1585/1827/1821 1582/1824/1821
f 1497/1753/1822 1588/1833/1822 1585/1827/1822
f 1498/1751/1824 1474/1834/1824 1588/1833/1824
f 1588/1833/1824 1473/1835/1824 1587/1828/1824
f 1587/1828/1824 1472/1836/1824 1586/1826/1824
f 1586/1826/1824 1425/1837/1824 1495/1814/1824
f 1583/1822/1822 1495/1814/1822 1494/1812/1822
f 1580/1823/1821 1494/1812/1821 1493/1811/1821
f 1589/1838/1825 1593/1839/1825 1592/1840/1825
f 1590/1841/1826 1594/1842/1826 1593/1839/1826
f 1593/1839/1827 1595/1843/1827 1592/1840/1827
f 1594/1842/1828 1596/1844/1828 1593/1839/1828
f 1423/1767/1829 1589/1838/1829 1456/1768/1829
f 1453/1845/1830 1590/1841/1830 1589/1838/1830
f 1452/1846/1831 1591/1847/1831 1590/1841/1831
f 1451/1848/1832 1499/1849/1832 1591/1847/1832
f 1591/1847/1833 1500/1850/1833 1594/1842/1833
f 1500/1850/1834 1597/1851/1834 1594/1842/1834
f 1501/1852/1835 1460/1810/1835 1597/1851/1835
f 1597/1851/1836 1461/1808/1836 1596/1844/1836
f 1596/1844/1837 1462/1807/1837 1595/1843/1837
f 1595/1843/1838 1421/1771/1838 1454/1770/1838
f 1455/1853/1839 1595/1843/1839 1454/1770/1839
f 1589/1838/1840 1455/1853/1840 1456/1768/1840
f 1599/1854/1841 1601/1855/1841 1598/1856/1841
f 1600/1857/1841 1602/1858/1841 1599/1854/1841
f 1602/1858/1842 1604/1859/1842 1601/1855/1842
f 1603/1860/1842 1605/1861/1842 1602/1858/1842
f 1486/1862/1843 1499/1849/1843 1430/1863/1843
f 1486/1862/1843 1599/1854/1843 1598/1856/1843
f 1484/1864/1843 1599/1854/1843 1485/1865/1843
f 1484/1864/1843 1502/1726/1843 1600/1857/1843
f 1600/1857/1841 1503/1866/1841 1603/1860/1841
f 1503/1866/1842 1606/1867/1842 1603/1860/1842
f 1504/1733/1844 1490/1868/1844 1606/1867/1844
f 1605/1861/1844 1490/1868/1844 1491/1869/1844
f 1605/1861/1844 1492/1829/1844 1604/1859/1844
f 1604/1859/1844 1432/1870/1844 1501/1852/1844
f 1601/1855/1842 1501/1852/1842 1500/1871/1842
f 1499/1849/1841 1601/1855/1841 1500/1871/1841
f 1607/1872/1845 1611/1873/1845 1610/1874/1845
f 1608/1875/1846 1612/1876/1846 1611/1873/1846
f 1611/1873/1847 1613/1877/1847 1610/1874/1847
f 1612/1876/1848 1614/1878/1848 1611/1873/1848
f 1419/1879/1849 1607/1872/1849 1447/1880/1849
f 1444/1881/1850 1608/1875/1850 1607/1872/1850
f 1443/1882/1851 1609/1883/1851 1608/1875/1851
f 1442/1884/1852 1505/1885/1852 1609/1883/1852
f 1609/1883/1853 1506/1886/1853 1612/1876/1853
f 1506/1886/1854 1615/1887/1854 1612/1876/1854
f 1507/1888/1855 1451/1848/1855 1615/1887/1855
f 1615/1887/1856 1452/1846/1856 1614/1878/1856
f 1614/1878/1857 1453/1845/1857 1613/1877/1857
f 1613/1877/1858 1423/1767/1858 1445/1889/1858
f 1446/1890/1859 1613/1877/1859 1445/1889/1859
f 1607/1872/1860 1446/1890/1860 1447/1880/1860
f 1617/1891/1861 1619/1892/1861 1616/1893/1861
f 1618/1894/1861 1620/1895/1861 1617/1891/1861
f 1620/1895/1862 1622/1896/1862 1619/1892/1862
f 1621/1897/1862 1623/1898/1862 1620/1895/1862
f 1480/1899/1863 1505/1900/1863 1428/1901/1863
f 1480/1899/1863 1617/1891/1863 1616/1893/1863
f 1478/1902/1863 1617/1891/1863 1479/1903/1863
f 1478/1902/1863 1508/1904/1863 1618/1894/1863
f 1618/1894/1861 1509/1905/1861 1621/1897/1861
f 1509/1905/1862 1624/1906/1862 1621/1897/1862
f 1510/1907/1864 1484/1864/1864 1624/1906/1864
f 1624/1906/1864 1485/1865/1864 1623/1898/1864
f 1622/1896/1864 1485/1865/1864 1486/1862/1864
f 1622/1896/1864 1430/1863/1864 1507/1908/1864
f 1619/1892/1862 1507/1908/1862 1506/1909/1862
f 1505/1900/1861 1619/1892/1861 1506/1909/1861
f 1625/1910/1865 1629/1911/1865 1628/1912/1865
f 1626/1913/1866 1630/1914/1866 1629/1911/1866
f 1629/1911/1867 1631/1915/1867 1628/1912/1867
f 1630/1914/1868 1632/1916/1868 1629/1911/1868
f 1417/1917/1869 1625/1910/1869 1435/1918/1869
f 1469/1919/1870 1626/1913/1870 1625/1910/1870
f 1470/1920/1871 1627/1921/1871 1626/1913/1871
f 1471/1922/1872 1511/1923/1872 1627/1921/1872
f 1627/1921/1873 1512/1924/1873 1630/1914/1873
f 1512/1924/1874 1633/1925/1874 1630/1914/1874
f 1513/1926/1875 1442/1884/1875 1633/1925/1875
f 1633/1925/1876 1443/1882/1876 1632/1916/1876
f 1632/1916/1877 1444/1881/1877 1631/1915/1877
f 1631/1915/1878 1419/1879/1878 1433/1927/1878
f 1434/1928/1879 1631/1915/1879 1433/1927/1879
f 1625/1910/1880 1434/1928/1880 1435/1918/1880
f 1635/1929/1881 1637/1930/1881 1634/1931/1881
f 1636/1932/1881 1638/1933/1881 1635/1929/1881
f 1638/1933/1882 1640/1934/1882 1637/1930/1882
f 1639/1935/1882 1641/1936/1882 1638/1933/1882
f 1425/1937/1883 1634/1931/1883 1511/1923/1883
f 1473/1938/1883 1634/1931/1883 1472/1939/1883
f 1474/1940/1883 1635/1929/1883 1473/1938/1883
f 1426/1687/1883 1636/1932/1883 1474/1940/1883
f 1514/1686/1881 1639/1935/1881 1636/1932/1881
f 1515/1699/1882 1642/1941/1882 1639/1935/1882
f 1516/1697/1884 1478/1942/1884 1642/1941/1884
f 1642/1941/1884 1479/1943/1884 1641/1936/1884
f 1641/1936/1884 1480/1944/1884 1640/1934/1884
f 1513/1926/1884 1480/1944/1884 1428/1945/1884
f 1637/1930/1882 1513/1926/1882 1512/1924/1882
f 1511/1923/1881 1637/1930/1881 1512/1924/1881
f 1518/1677/1885 1521/1681/1885 1520/1678/1885
f 1519/1680/1886 1522/1692/1886 1521/1681/1886
f 1520/1678/1887 1521/1681/1887 1524/1682/1887
f 1521/1681/1888 1522/1692/1888 1525/1684/1888
f 1436/1685/1889 1517/1679/1889 1514/1686/1889
f 1437/1688/1890 1518/1677/1890 1517/1679/1890
f 1438/1689/1891 1519/1680/1891 1518/1677/1891
f 1418/1690/1892 1439/1946/1892 1519/1680/1892
f 1519/1680/1893 1439/1946/1893 1440/1691/1893
f 1440/1691/1894 1441/1947/1894 1525/1684/1894
f 1525/1684/1895 1441/1947/1895 1420/1693/1895
f 1524/1682/1896 1525/1684/1896 1475/1694/1896
f 1523/1683/1897 1524/1682/1897 1476/1695/1897
f 1516/1697/1898 1523/1683/1898 1477/1696/1898
f 1515/1699/1899 1520/1678/1899 1523/1683/1899
f 1517/1679/1900 1520/1678/1900 1515/1699/1900
f 1527/1700/1901 1530/1704/1901 1529/1701/1901
f 1528/1703/1902 1531/1710/1902 1530/1704/1902
f 1529/1701/1903 1530/1704/1903 1533/1705/1903
f 1530/1704/1904 1531/1710/1904 1534/1707/1904
f 1477/1696/1905 1526/1702/1905 1508/1708/1905
f 1476/1695/1906 1527/1700/1906 1526/1702/1906
f 1475/1694/1907 1528/1703/1907 1527/1700/1907
f 1420/1693/1908 1448/1948/1908 1528/1703/1908
f 1528/1703/1909 1448/1948/1909 1449/1709/1909
f 1449/1709/1910 1450/1949/1910 1534/1707/1910
f 1534/1707/1911 1450/1949/1911 1424/1711/1911
f 1533/1705/1912 1534/1707/1912 1481/1712/1912
f 1532/1706/1913 1533/1705/1913 1482/1713/1913
f 1510/1715/1914 1532/1706/1914 1483/1714/1914
f 1509/1717/1915 1529/1701/1915 1532/1706/1915
f 1526/1702/1916 1529/1701/1916 1509/1717/1916
f 1536/1718/1917 1539/1722/1917 1538/1719/1917
f 1537/1721/1918 1540/1728/1918 1539/1722/1918
f 1538/1719/1919 1539/1722/1919 1542/1723/1919
f 1539/1722/1920 1540/1728/1920 1543/1725/1920
f 1483/1714/1921 1535/1720/1921 1502/1726/1921
f 1482/1713/1922 1536/1718/1922 1535/1720/1922
f 1481/1712/1923 1537/1721/1923 1536/1718/1923
f 1424/1711/1924 1457/1786/1924 1537/1721/1924
f 1537/1721/1925 1457/1786/1925 1458/1727/1925
f 1458/1727/1926 1459/1798/1926 1543/1725/1926
f 1543/1725/1927 1459/1798/1927 1422/1729/1927
f 1542/1723/1928 1543/1725/1928 1487/1730/1928
f 1541/1724/1929 1542/1723/1929 1488/1731/1929
f 1504/1733/1930 1541/1724/1930 1489/1732/1930
f 1503/1735/1931 1538/1719/1931 1541/1724/1931
f 1535/1720/1932 1538/1719/1932 1503/1735/1932
f 1545/1736/1933 1548/1740/1933 1547/1737/1933
f 1546/1739/1934 1549/1746/1934 1548/1740/1934
f 1547/1737/1935 1548/1740/1935 1551/1741/1935
f 1548/1740/1936 1549/1746/1936 1552/1743/1936
f 1489/1732/1937 1544/1738/1937 1496/1744/1937
f 1488/1731/1938 1545/1736/1938 1544/1738/1938
f 1487/1730/1939 1546/1739/1939 1545/1736/1939
f 1422/1729/1940 1466/1950/1940 1546/1739/1940
f 1546/1739/1941 1466/1950/1941 1467/1745/1941
f 1467/1745/1942 1468/1951/1942 1552/1743/1942
f 1552/1743/1943 1468/1951/1943 1418/1747/1943
f 1551/1741/1944 1552/1743/1944 1438/1748/1944
f 1550/1742/1945 1551/1741/1945 1437/1749/1945
f 1498/1751/1946 1550/1742/1946 1436/1750/1946
f 1497/1753/1947 1547/1737/1947 1550/1742/1947
f 1544/1738/1948 1547/1737/1948 1497/1753/1948
f 1553/1754/1949 1554/1758/1949 1557/1755/1949
f 1555/1757/1950 1558/1769/1950 1557/1755/1950
f 1557/1755/1951 1560/1761/1951 1559/1759/1951
f 1557/1755/1952 1558/1769/1952 1561/1760/1952
f 1419/1762/1953 1447/1764/1953 1553/1754/1953
f 1447/1764/1954 1446/1766/1954 1554/1758/1954
f 1445/1765/1955 1555/1757/1955 1554/1758/1955
f 1423/1767/1956 1456/1768/1956 1555/1757/1956
f 1456/1768/1957 1455/1853/1957 1558/1769/1957
f 1558/1769/1958 1455/1853/1958 1454/1770/1958
f 1561/1760/1959 1454/1770/1959 1421/1771/1959
f 1560/1761/1960 1561/1760/1960 1465/1772/1960
f 1560/1761/1961 1464/1773/1961 1463/1774/1961
f 1559/1759/1962 1463/1774/1962 1417/1775/1962
f 1556/1756/1963 1559/1759/1963 1435/1776/1963
f 1433/1763/1964 1553/1754/1964 1556/1756/1964
f 1562/1778/1965 1563/1782/1965 1566/1779/1965
f 1564/1781/1966 1567/1792/1966 1566/1779/1966
f 1566/1779/1967 1569/1785/1967 1568/1783/1967
f 1566/1779/1968 1567/1792/1968 1570/1784/1968
f 1424/1711/1969 1450/1787/1969 1562/1778/1969
f 1450/1787/1970 1449/1789/1970 1563/1782/1970
f 1448/1788/1971 1564/1781/1971 1563/1782/1971
f 1420/1790/1972 1441/1791/1972 1564/1781/1972
f 1441/1791/1973 1440/1952/1973 1567/1792/1973
f 1567/1792/1974 1440/1952/1974 1439/1793/1974
f 1570/1784/1975 1439/1793/1975 1418/1794/1975
f 1569/1785/1976 1570/1784/1976 1468/1795/1976
f 1569/1785/1977 1467/1796/1977 1466/1797/1977
f 1568/1783/1978 1466/1797/1978 1422/1729/1978
f 1565/1780/1979 1568/1783/1979 1459/1798/1979
f 1457/1786/1980 1562/1778/1980 1565/1780/1980
f 1571/1799/1981 1572/1802/1981 1575/1800/1981
f 1572/1802/1982 1573/1809/1982 1576/1803/1982
f 1575/1800/1983 1578/1805/1983 1577/1804/1983
f 1576/1803/1984 1579/1813/1984 1578/1805/1984
f 1421/1771/1985 1462/1807/1985 1571/1799/1985
f 1462/1807/1986 1461/1808/1986 1572/1802/1986
f 1461/1808/1987 1460/1810/1987 1573/1809/1987
f 1460/1810/1988 1432/1830/1988 1493/1811/1988
f 1573/1809/1989 1493/1811/1989 1494/1812/1989
f 1494/1812/1990 1495/1814/1990 1579/1813/1990
f 1495/1814/1991 1425/1837/1991 1471/1815/1991
f 1579/1813/1992 1471/1815/1992 1470/1816/1992
f 1578/1805/1993 1470/1816/1993 1469/1817/1993
f 1577/1804/1994 1469/1817/1994 1417/1818/1994
f 1464/1820/1995 1574/1801/1995 1577/1804/1995
f 1571/1799/1996 1574/1801/1996 1464/1820/1996
f 1581/1821/1821 1584/1825/1821 1583/1822/1821
f 1582/1824/1821 1585/1827/1821 1584/1825/1821
f 1584/1825/1822 1587/1828/1822 1586/1826/1822
f 1585/1827/1822 1588/1833/1822 1587/1828/1822
f 1492/1829/1823 1580/1823/1823 1493/1811/1823
f 1491/1831/1823 1581/1821/1823 1580/1823/1823
f 1491/1831/1823 1490/1832/1823 1582/1824/1823
f 1431/1734/1823 1496/1744/1823 1582/1824/1823
f 1496/1744/1821 1497/1753/1821 1585/1827/1821
f 1497/1753/1822 1498/1751/1822 1588/1833/1822
f 1498/1751/1824 1426/1752/1824 1474/1834/1824
f 1588/1833/1824 1474/1834/1824 1473/1835/1824
f 1587/1828/1824 1473/1835/1824 1472/1836/1824
f 1586/1826/1824 1472/1836/1824 1425/1837/1824
f 1583/1822/1822 1586/1826/1822 1495/1814/1822
f 1580/1823/1821 1583/1822/1821 1494/1812/1821
f 1589/1838/1997 1590/1841/1997 1593/1839/1997
f 1590/1841/1998 1591/1847/1998 1594/1842/1998
f 1593/1839/1999 1596/1844/1999 1595/1843/1999
f 1594/1842/2000 1597/1851/2000 1596/1844/2000
f 1423/1767/2001 1453/1845/2001 1589/1838/2001
f 1453/1845/2002 1452/1846/2002 1590/1841/2002
f 1452/1846/2003 1451/1848/2003 1591/1847/2003
f 1451/1848/2004 1430/1953/2004 1499/1849/2004
f 1591/1847/2005 1499/1849/2005 1500/1850/2005
f 1500/1850/2006 1501/1852/2006 1597/1851/2006
f 1501/1852/2007 1432/1830/2007 1460/1810/2007
f 1597/1851/2008 1460/1810/2008 1461/1808/2008
f 1596/1844/2009 1461/1808/2009 1462/1807/2009
f 1595/1843/2010 1462/1807/2010 1421/1771/2010
f 1455/1853/2011 1592/1840/2011 1595/1843/2011
f 1589/1838/2012 1592/1840/2012 1455/1853/2012
f 1599/1854/1841 1602/1858/1841 1601/1855/1841
f 1600/1857/1841 1603/1860/1841 1602/1858/1841
f 1602/1858/1842 1605/1861/1842 1604/1859/1842
f 1603/1860/1842 1606/1867/1842 1605/1861/1842
f 1486/1862/1843 1598/1856/1843 1499/1849/1843
f 1486/1862/1843 1485/1865/1843 1599/1854/1843
f 1484/1864/1843 1600/1857/1843 1599/1854/1843
f 1484/1864/1843 1429/1954/1843 1502/1726/1843
f 1600/1857/1841 1502/1726/1841 1503/1866/1841
f 1503/1866/1842 1504/1733/1842 1606/1867/1842
f 1504/1733/1844 1431/1955/1844 1490/1868/1844
f 1605/1861/1844 1606/1867/1844 1490/1868/1844
f 1605/1861/1844 1491/1869/1844 1492/1829/1844
f 1604/1859/1844 1492/1829/1844 1432/1870/1844
f 1601/1855/1842 1604/1859/1842 1501/1852/1842
f 1499/1849/1841 1598/1856/1841 1601/1855/1841
f 1607/1872/2013 1608/1875/2013 1611/1873/2013
f 1608/1875/2014 1609/1883/2014 1612/1876/2014
f 1611/1873/2015 1614/1878/2015 1613/1877/2015
f 1612/1876/2016 1615/1887/2016 1614/1878/2016
f 1419/1879/2017 1444/1881/2017 1607/1872/2017
f 1444/1881/2018 1443/1882/2018 1608/1875/2018
f 1443/1882/2019 1442/1884/2019 1609/1883/2019
f 1442/1884/2020 1428/1945/2020 1505/1885/2020
f 1609/1883/2021 1505/1885/2021 1506/1886/2021
f 1506/1886/2022 1507/1888/2022 1615/1887/2022
f 1507/1888/2023 1430/1953/2023 1451/1848/2023
f 1615/1887/2024 1451/1848/2024 1452/1846/2024
f 1614/1878/2025 1452/1846/2025 1453/1845/2025
f 1613/1877/2026 1453/1845/2026 1423/1767/2026
f 1446/1890/2027 1610/1874/2027 1613/1877/2027
f 1607/1872/2028 1610/1874/2028 1446/1890/2028
f 1617/1891/1861 1620/1895/1861 1619/1892/1861
f 1618/1894/1861 1621/1897/1861 1620/1895/1861
f 1620/1895/1862 1623/1898/1862 1622/1896/1862
f 1621/1897/1862 1624/1906/1862 1623/1898/1862
f 1480/1899/1863 1616/1893/1863 1505/1900/1863
f 1480/1899/1863 1479/1903/1863 1617/1891/1863
f 1478/1902/1863 1618/1894/1863 1617/1891/1863
f 1478/1902/1863 1427/1698/1863 1508/1904/1863
f 1618/1894/1861 1508/1904/1861 1509/1905/1861
f 1509/1905/1862 1510/1907/1862 1624/1906/1862
f 1510/1907/1864 1429/1954/1864 1484/1864/1864
f 1624/1906/1864 1484/1864/1864 1485/1865/1864
f 1622/1896/1864 1623/1898/1864 1485/1865/1864
f 1622/1896/1864 1486/1862/1864 1430/1863/1864
f 1619/1892/1862 1622/1896/1862 1507/1908/1862
f 1505/1900/1861 1616/1893/1861 1619/1892/1861
f 1625/1910/2029 1626/1913/2029 1629/1911/2029
f 1626/1913/2030 1627/1921/2030 1630/1914/2030
f 1629/1911/2031 1632/1916/2031 1631/1915/2031
f 1630/1914/2032 1633/1925/2032 1632/1916/2032
f 1417/1917/2033 1469/1919/2033 1625/1910/2033
f 1469/1919/2034 1470/1920/2034 1626/1913/2034
f 1470/1920/2035 1471/1922/2035 1627/1921/2035
f 1471/1922/2036 1425/1937/2036 1511/1923/2036
f 1627/1921/2037 1511/1923/2037 1512/1924/2037
f 1512/1924/2038 1513/1926/2038 1633/1925/2038
f 1513/1926/2039 1428/1945/2039 1442/1884/2039
f 1633/1925/2040 1442/1884/2040 1443/1882/2040
f 1632/1916/2041 1443/1882/2041 1444/1881/2041
f 1631/1915/2042 1444/1881/2042 1419/1879/2042
f 1434/1928/2043 1628/1912/2043 1631/1915/2043
f 1625/1910/2044 1628/1912/2044 1434/1928/2044
f 1635/1929/1881 1638/1933/1881 1637/1930/1881
f 1636/1932/1881 1639/1935/1881 1638/1933/1881
f 1638/1933/1882 1641/1936/1882 1640/1934/1882
f 1639/1935/1882 1642/1941/1882 1641/1936/1882
f 1425/1937/1883 1472/1939/1883 1634/1931/1883
f 1473/1938/1883 1635/1929/1883 1634/1931/1883
f 1474/1940/1883 1636/1932/1883 1635/1929/1883
f 1426/1687/1883 1514/1686/1883 1636/1932/1883
f 1514/1686/1881 1515/1699/1881 1639/1935/1881
f 1515/1699/1882 1516/1697/1882 1642/1941/1882
f 1516/1697/1884 1427/1698/1884 1478/1942/1884
f 1642/1941/1884 1478/1942/1884 1479/1943/1884
f 1641/1936/1884 1479/1943/1884 1480/1944/1884
f 1513/1926/1884 1640/1934/1884 1480/1944/1884
f 1637/1930/1882 1640/1934/1882 1513/1926/1882
f 1511/1923/1881 1634/1931/1881 1637/1930/1881
o Cube.004
v -0.142272 0.020747 0.254680
v 0.124702 -0.073756 0.396607
v -0.094265 0.020747 0.164376
v 0.172709 -0.073756 0.306303
v -0.167681 -0.071320 0.241172
v 0.099294 -0.165823 0.383099
v -0.119674 -0.071320 0.150869
v 0.147300 -0.165823 0.292796
v -0.065946 0.007740 0.314560
v 0.040844 -0.030061 0.371330
v 0.104852 -0.030061 0.250926
v -0.001937 0.007740 0.194155
v 0.070974 -0.152816 0.232916
v -0.035816 -0.115015 0.176145
v 0.006966 -0.152816 0.353320
v -0.099824 -0.115015 0.296550
v -0.106069 0.028840 0.181654
v -0.121335 0.032905 0.207898
v -0.134551 0.028840 0.235231
v 0.067200 -0.039690 0.384928
v 0.091505 -0.050394 0.394956
v 0.111694 -0.062720 0.398552
v 0.140212 -0.068420 0.381299
v 0.157340 -0.065740 0.356045
v 0.168694 -0.068420 0.327722
v -0.028474 0.016713 0.180461
v -0.054050 0.022817 0.169758
v -0.077369 0.023797 0.164498
v -0.114900 -0.051220 0.144959
v -0.107560 -0.023216 0.144618
v -0.099825 0.003404 0.152973
v 0.174938 -0.093856 0.299041
v 0.171115 -0.121860 0.292765
v 0.159863 -0.148479 0.291027
v -0.062171 -0.105386 0.162547
v -0.086477 -0.094682 0.152519
v -0.106665 -0.082356 0.148924
v -0.163666 -0.076656 0.219754
v -0.152312 -0.079336 0.191430
v -0.135184 -0.076656 0.166177
v 0.139579 -0.173916 0.312244
v 0.126363 -0.177981 0.339578
v 0.111097 -0.173916 0.365821
v -0.125837 -0.105386 0.282307
v -0.147745 -0.094682 0.267768
v -0.162017 -0.082356 0.253044
v -0.154834 0.003404 0.256449
v -0.166087 -0.023216 0.254710
v -0.169909 -0.051220 0.248435
v 0.104854 -0.148479 0.394502
v 0.112589 -0.121860 0.402857
v 0.119929 -0.093856 0.402516
v -0.132721 0.023797 0.268618
v -0.115317 0.022817 0.285006
v -0.092140 0.016713 0.300221
v -0.039248 -0.001710 0.328752
v -0.012551 -0.011160 0.342945
v 0.014146 -0.020611 0.357138
v 0.167045 -0.062720 0.294432
v 0.152773 -0.050394 0.279708
v 0.130866 -0.039690 0.265168
v 0.078155 -0.020611 0.236733
v 0.051458 -0.011160 0.222540
v 0.024760 -0.001710 0.208348
v 0.137749 -0.168873 0.278857
v 0.120346 -0.167892 0.262469
v 0.097169 -0.161788 0.247254
v 0.044277 -0.143366 0.218723
v 0.017579 -0.133916 0.204530
v -0.009118 -0.124465 0.190338
v 0.082397 -0.168873 0.382977
v 0.059078 -0.167892 0.377717
v 0.033503 -0.161788 0.367014
v -0.019732 -0.143366 0.339128
v -0.046429 -0.133916 0.324935
v -0.073127 -0.124465 0.310742
v -0.101150 -0.087203 0.317562
v -0.094887 -0.053637 0.328130
v -0.082623 -0.020071 0.327411
v 0.005640 -0.125004 0.374333
v 0.011903 -0.091439 0.384901
v 0.024167 -0.057873 0.384182
v -0.055082 -0.132277 0.200891
v -0.074172 -0.138032 0.232970
v -0.090086 -0.132277 0.266738
v 0.051708 -0.170079 0.257662
v 0.032618 -0.175833 0.289741
v 0.016703 -0.170079 0.323509
v -0.000612 -0.020071 0.173143
v -0.006875 -0.053637 0.162574
v -0.019139 -0.087203 0.163294
v 0.106178 -0.057873 0.229914
v 0.099915 -0.091438 0.219345
v 0.087651 -0.125004 0.220064
v -0.046680 0.025003 0.289813
v -0.027589 0.030757 0.257734
v -0.011675 0.025003 0.223967
v 0.060110 -0.012798 0.346584
v 0.079200 -0.007044 0.314505
v 0.095115 -0.012798 0.280738
v 0.086401 -0.022528 0.360333
v 0.110251 -0.033935 0.371418
v 0.129164 -0.048646 0.377913
v 0.105425 -0.016815 0.328446
v 0.128813 -0.028505 0.340880
v 0.146729 -0.044087 0.350404
v 0.121217 -0.022528 0.294841
v 0.143746 -0.033935 0.308412
v 0.159708 -0.048646 0.320458
v 0.132222 -0.067367 0.244283
v 0.154347 -0.077127 0.259710
v 0.169095 -0.086890 0.277336
v 0.126031 -0.100765 0.233787
v 0.148656 -0.109347 0.249725
v 0.164506 -0.116512 0.268741
v 0.113795 -0.134137 0.234486
v 0.136619 -0.141364 0.250286
v 0.152929 -0.145466 0.268742
v 0.078040 -0.178976 0.271888
v 0.102177 -0.184555 0.286313
v 0.122433 -0.183710 0.300642
v 0.059088 -0.184715 0.303813
v 0.084191 -0.190190 0.317158
v 0.106754 -0.188936 0.329153
v 0.043224 -0.178976 0.337379
v 0.068682 -0.184555 0.349320
v 0.091889 -0.183710 0.358097
v 0.032218 -0.134137 0.387938
v 0.058081 -0.141364 0.398022
v 0.082503 -0.145466 0.401219
v 0.038482 -0.100765 0.398472
v 0.064349 -0.109347 0.408313
v 0.088977 -0.116512 0.410816
v 0.050645 -0.067367 0.397734
v 0.075809 -0.077127 0.407446
v 0.098669 -0.086890 0.409813
v -0.119989 0.008149 0.171995
v -0.130028 -0.019960 0.164691
v -0.136055 -0.050066 0.163454
v -0.137557 0.010711 0.199274
v -0.149101 -0.018870 0.193137
v -0.154486 -0.050631 0.190274
v -0.150344 0.008149 0.229095
v -0.162014 -0.019960 0.224858
v -0.166411 -0.050066 0.220554
v 0.155373 -0.153225 0.318381
v 0.167042 -0.125116 0.322617
v 0.171439 -0.095010 0.326922
v 0.142585 -0.155787 0.348201
v 0.154129 -0.126206 0.354338
v 0.159514 -0.094445 0.357201
v 0.125018 -0.153225 0.375481
v 0.135057 -0.125116 0.382785
v 0.141084 -0.095010 0.384022
v -0.164067 -0.058186 0.270139
v -0.149319 -0.067949 0.287765
v -0.127194 -0.077709 0.303193
v -0.159478 -0.028564 0.278734
v -0.143628 -0.035728 0.297750
v -0.121002 -0.044311 0.313688
v -0.147901 0.000390 0.278734
v -0.131591 -0.003712 0.297190
v -0.108767 -0.010939 0.312989
v -0.074452 -0.096654 0.331755
v -0.047755 -0.106104 0.345947
v -0.021057 -0.115554 0.360140
v -0.068189 -0.063088 0.342323
v -0.041492 -0.072538 0.356516
v -0.014794 -0.081988 0.370709
v -0.055925 -0.029522 0.341604
v -0.029228 -0.038972 0.355796
v -0.002530 -0.048422 0.369989
v -0.124136 -0.096430 0.169563
v -0.105222 -0.111141 0.176057
v -0.081372 -0.122548 0.187143
v -0.141701 -0.100988 0.197071
v -0.123785 -0.116571 0.206595
v -0.100397 -0.128261 0.219029
v -0.154680 -0.096430 0.227018
v -0.138717 -0.111141 0.239064
v -0.116188 -0.122548 0.252634
v -0.028384 -0.141728 0.215084
v -0.001687 -0.151178 0.229277
v 0.025011 -0.160628 0.243470
v -0.047475 -0.147482 0.247163
v -0.020777 -0.156932 0.261356
v 0.005920 -0.166382 0.275548
v -0.063389 -0.141728 0.280930
v -0.036691 -0.151178 0.295123
v -0.009994 -0.160628 0.309316
v -0.077474 0.000390 0.146257
v -0.053052 -0.003712 0.149454
v -0.027190 -0.010939 0.159537
v -0.083949 -0.028564 0.136659
v -0.059320 -0.035728 0.139162
v -0.033454 -0.044311 0.149003
v -0.093640 -0.058186 0.137662
v -0.070781 -0.067949 0.140029
v -0.045617 -0.077709 0.149741
v 0.026086 -0.029522 0.187335
v 0.052783 -0.038972 0.201528
v 0.079481 -0.048422 0.215721
v 0.019823 -0.063088 0.176767
v 0.046520 -0.072538 0.190960
v 0.073217 -0.081988 0.205152
v 0.007559 -0.096654 0.177486
v 0.034256 -0.106104 0.191679
v 0.060954 -0.115554 0.205872
v -0.117405 0.038634 0.246833
v -0.097149 0.039480 0.261162
v -0.073012 0.033900 0.275587
v -0.101726 0.043860 0.218322
v -0.079163 0.045114 0.230317
v -0.054059 0.039639 0.243663
v -0.086861 0.038634 0.189378
v -0.063654 0.039480 0.198155
v -0.038196 0.033900 0.210096
v -0.019982 0.015552 0.304006
v 0.006715 0.006102 0.318199
v 0.033412 -0.003348 0.332391
v -0.000892 0.021307 0.271927
v 0.025805 0.011856 0.286120
v 0.052503 0.002406 0.300312
v 0.015022 0.015552 0.238160
v 0.041720 0.006102 0.252352
v 0.068417 -0.003348 0.266545
vn 0.2791 0.9024 0.3282
vn 0.4628 0.7806 0.4201
vn 0.4282 0.9024 0.0478
vn 0.6071 0.7806 0.1487
vn 0.0032 0.8190 0.5738
vn 0.0902 0.7823 0.6163
vn 0.2832 0.6653 0.6908
vn 0.4990 0.4587 0.7352
vn 0.7008 0.4807 0.5271
vn 0.8289 0.4807 0.2861
vn 0.8886 0.4587 0.0025
vn 0.7310 0.6653 -0.1516
vn 0.5614 0.7823 -0.2699
vn 0.4774 0.8190 -0.3183
vn 0.3474 0.9377 0.0039
vn 0.1976 0.9377 0.2858
vn 0.6059 0.1341 -0.7841
vn 0.7806 0.0551 -0.6226
vn 0.5123 -0.2053 -0.8339
vn 0.6905 -0.2713 -0.6705
vn 0.5502 0.5199 -0.6534
vn 0.6356 0.4846 -0.6009
vn 0.8026 0.3869 -0.4541
vn 0.9438 0.2404 -0.2268
vn 0.9580 -0.0676 -0.2786
vn 0.8796 -0.3517 -0.3203
vn 0.7087 -0.6116 -0.3518
vn 0.5301 -0.6002 -0.5989
vn 0.3481 -0.5573 -0.7538
vn 0.2606 -0.5294 -0.8074
vn 0.4279 -0.1753 -0.8867
vn 0.5221 0.1660 -0.8366
vn -0.0886 -0.9699 -0.2269
vn 0.1194 -0.9867 -0.1106
vn -0.2377 -0.9699 0.0535
vn -0.0249 -0.9867 0.1609
vn 0.0228 -0.8282 -0.5600
vn 0.1100 -0.8532 -0.5099
vn 0.3062 -0.8740 -0.3774
vn 0.5345 -0.8245 -0.1858
vn 0.4510 -0.8884 0.0852
vn 0.3229 -0.8884 0.3262
vn 0.1449 -0.8245 0.5470
vn -0.1416 -0.8740 0.4649
vn -0.3611 -0.8532 0.3764
vn -0.4514 -0.8282 0.3321
vn -0.3224 -0.9465 0.0093
vn -0.1726 -0.9465 -0.2725
vn -0.4048 -0.2053 0.8911
vn -0.1696 -0.2713 0.9474
vn -0.3111 0.1341 0.9409
vn -0.0795 0.0551 0.9953
vn -0.5235 -0.5294 0.6676
vn -0.4302 -0.5573 0.7101
vn -0.1999 -0.6002 0.7744
vn 0.1048 -0.6116 0.7842
vn 0.2265 -0.3517 0.9083
vn 0.3049 -0.0676 0.9500
vn 0.3399 0.2404 0.9092
vn 0.0725 0.3869 0.9193
vn -0.1426 0.4846 0.8630
vn -0.2339 0.5199 0.8216
vn -0.4015 0.1660 0.9007
vn -0.4957 -0.1753 0.8506
vn -0.7522 0.4019 -0.5222
vn -0.8129 0.1854 -0.5521
vn -0.8536 0.4019 -0.3315
vn -0.9123 0.1854 -0.3650
vn -0.4854 0.6162 -0.6202
vn -0.5805 0.3927 -0.7133
vn -0.6447 0.1599 -0.7475
vn -0.6588 -0.0639 -0.7496
vn -0.8204 -0.0848 -0.5655
vn -0.9276 -0.0848 -0.3637
vn -0.9899 -0.0639 -0.1269
vn -0.9803 0.1599 -0.1163
vn -0.9160 0.3927 -0.0822
vn -0.7857 0.6162 -0.0555
vn -0.7293 0.6335 -0.2584
vn -0.6221 0.6335 -0.4600
vn 0.8536 -0.4019 0.3315
vn 0.9123 -0.1854 0.3650
vn 0.7522 -0.4019 0.5222
vn 0.8129 -0.1854 0.5521
vn 0.7857 -0.6162 0.0555
vn 0.9160 -0.3927 0.0822
vn 0.9803 -0.1599 0.1163
vn 0.9899 0.0639 0.1269
vn 0.9276 0.0848 0.3637
vn 0.8204 0.0848 0.5655
vn 0.6588 0.0639 0.7496
vn 0.6447 -0.1599 0.7475
vn 0.5805 -0.3927 0.7133
vn 0.4854 -0.6162 0.6202
vn 0.6221 -0.6335 0.4600
vn 0.7293 -0.6335 0.2584
vn -0.7779 -0.0609 0.6254
vn -0.6046 -0.1364 0.7848
vn -0.6854 0.2741 0.6746
vn -0.5100 0.2064 0.8351
vn -0.9629 -0.1796 0.2015
vn -0.8011 -0.3895 0.4546
vn -0.6338 -0.4869 0.6010
vn -0.5501 -0.5200 0.6534
vn -0.5220 -0.1661 0.8366
vn -0.4277 0.1754 0.8867
vn -0.2605 0.5294 0.8074
vn -0.3454 0.5583 0.7543
vn -0.5276 0.6014 0.5999
vn -0.7538 0.5779 0.3127
vn -0.8796 0.3509 0.3211
vn -0.9577 0.0681 0.2796
vn -0.5089 -0.1707 0.8437
vn -0.4147 0.1707 0.8938
vn -0.5369 -0.5247 0.6606
vn -0.2473 0.5247 0.8146
vn -0.4570 -0.7832 -0.4216
vn -0.2766 -0.9030 -0.3287
vn -0.6051 -0.7832 -0.1430
vn -0.4272 -0.9030 -0.0455
vn -0.5541 -0.4315 -0.7119
vn -0.2804 -0.6658 -0.6914
vn -0.0875 -0.7823 -0.6167
vn -0.0030 -0.8190 -0.5738
vn -0.1974 -0.9377 -0.2858
vn -0.3473 -0.9377 -0.0038
vn -0.4774 -0.8190 0.3184
vn -0.5602 -0.7823 0.2724
vn -0.7300 -0.6658 0.1542
vn -0.9000 -0.4315 -0.0612
vn -0.8282 -0.4816 -0.2864
vn -0.7007 -0.4816 -0.5264
vn -0.1851 -0.9422 -0.2792
vn -0.3350 -0.9422 0.0027
vn 0.0098 -0.8237 -0.5669
vn -0.4645 -0.8237 0.3252
vn 0.1758 0.2741 -0.9455
vn 0.4070 0.2064 -0.8898
vn 0.0833 -0.0609 -0.9947
vn 0.3124 -0.1364 -0.9401
vn -0.1624 0.5779 -0.7998
vn 0.2022 0.6014 -0.7729
vn 0.4321 0.5583 -0.7082
vn 0.5236 0.5294 -0.6675
vn 0.4958 0.1754 -0.8505
vn 0.4016 -0.1661 -0.9006
vn 0.2340 -0.5200 -0.8215
vn 0.1437 -0.4869 -0.8616
vn -0.0712 -0.3895 -0.9183
vn -0.3715 -0.1796 -0.9109
vn -0.3038 0.0681 -0.9503
vn -0.2258 0.3509 -0.9088
vn 0.5089 0.1707 -0.8437
vn 0.4147 -0.1707 -0.8938
vn 0.5369 0.5247 -0.6606
vn 0.2473 -0.5247 -0.8146
vn -0.1168 0.9863 0.1166
vn 0.0895 0.9692 0.2292
vn 0.0313 0.9863 -0.1620
vn 0.2401 0.9692 -0.0540
vn -0.5536 0.8236 0.1229
vn -0.3053 0.8731 0.3800
vn -0.1092 0.8519 0.5122
vn -0.0228 0.8281 0.5601
vn 0.1727 0.9465 0.2727
vn 0.3226 0.9465 -0.0094
vn 0.4516 0.8281 -0.3322
vn 0.3635 0.8519 -0.3770
vn 0.1443 0.8731 -0.4656
vn -0.2078 0.8236 -0.5277
vn -0.3225 0.8889 -0.3253
vn -0.4500 0.8889 -0.0854
vn 0.1851 0.9422 0.2792
vn 0.3350 0.9422 -0.0027
vn -0.0098 0.8237 0.5669
vn 0.4645 0.8237 -0.3252
vn 0.2766 0.9030 0.3287
vn 0.4570 0.7832 0.4216
vn 0.4272 0.9030 0.0455
vn 0.6051 0.7832 0.1430
vn 0.0030 0.8190 0.5738
vn 0.0875 0.7823 0.6167
vn 0.2804 0.6658 0.6914
vn 0.5541 0.4315 0.7119
vn 0.7007 0.4816 0.5264
vn 0.8282 0.4816 0.2864
vn 0.9000 0.4315 0.0612
vn 0.7300 0.6658 -0.1542
vn 0.5602 0.7823 -0.2724
vn 0.4774 0.8190 -0.3184
vn 0.3473 0.9377 0.0038
vn 0.1974 0.9377 0.2858
vn 0.6046 0.1364 -0.7848
vn 0.7779 0.0609 -0.6254
vn 0.5100 -0.2064 -0.8351
vn 0.6854 -0.2741 -0.6746
vn 0.5501 0.5200 -0.6534
vn 0.6338 0.4869 -0.6010
vn 0.8011 0.3895 -0.4546
vn 0.9629 0.1796 -0.2015
vn 0.9577 -0.0681 -0.2796
vn 0.8796 -0.3509 -0.3211
vn 0.7538 -0.5779 -0.3127
vn 0.5276 -0.6014 -0.5999
vn 0.3454 -0.5583 -0.7543
vn 0.2605 -0.5294 -0.8074
vn 0.4277 -0.1754 -0.8867
vn 0.5220 0.1661 -0.8366
vn -0.0895 -0.9692 -0.2292
vn 0.1168 -0.9863 -0.1166
vn -0.2401 -0.9692 0.0540
vn -0.0313 -0.9863 0.1620
vn 0.0228 -0.8281 -0.5601
vn 0.1092 -0.8519 -0.5122
vn 0.3053 -0.8731 -0.3800
vn 0.5536 -0.8236 -0.1229
vn 0.4500 -0.8889 0.0854
vn 0.3225 -0.8889 0.3253
vn 0.2078 -0.8236 0.5277
vn -0.1443 -0.8731 0.4656
vn -0.3635 -0.8519 0.3770
vn -0.4516 -0.8281 0.3322
vn -0.3226 -0.9465 0.0094
vn -0.1727 -0.9465 -0.2727
vn -0.4070 -0.2064 0.8898
vn -0.1758 -0.2741 0.9455
vn -0.3124 0.1364 0.9401
vn -0.0833 0.0609 0.9947
vn -0.5236 -0.5294 0.6675
vn -0.4321 -0.5583 0.7082
vn -0.2022 -0.6014 0.7729
vn 0.1624 -0.5779 0.7998
vn 0.2258 -0.3509 0.9088
vn 0.3038 -0.0681 0.9503
vn 0.3715 0.1796 0.9109
vn 0.0712 0.3895 0.9183
vn -0.1437 0.4869 0.8616
vn -0.2340 0.5200 0.8215
vn -0.4016 0.1661 0.9006
vn -0.4958 -0.1754 0.8505
vn -0.7526 0.4039 -0.5201
vn -0.8114 0.1875 -0.5536
vn -0.8520 0.4039 -0.3330
vn -0.9128 0.1875 -0.3630
vn -0.4799 0.5842 -0.6545
vn -0.5805 0.3926 -0.7134
vn -0.6448 0.1598 -0.7475
vn -0.6821 -0.0966 -0.7248
vn -0.8204 -0.0849 -0.5654
vn -0.9276 -0.0849 -0.3638
vn -0.9824 -0.0966 -0.1601
vn -0.9803 0.1598 -0.1164
vn -0.9160 0.3926 -0.0821
vn -0.8110 0.5842 -0.0318
vn -0.7294 0.6334 -0.2584
vn -0.6222 0.6334 -0.4601
vn 0.8520 -0.4039 0.3330
vn 0.9128 -0.1875 0.3630
vn 0.7526 -0.4039 0.5201
vn 0.8114 -0.1875 0.5536
vn 0.8110 -0.5842 0.0318
vn 0.9160 -0.3926 0.0821
vn 0.9803 -0.1598 0.1164
vn 0.9824 0.0966 0.1601
vn 0.9276 0.0849 0.3638
vn 0.8204 0.0849 0.5654
vn 0.6821 0.0966 0.7248
vn 0.6448 -0.1598 0.7475
vn 0.5805 -0.3926 0.7134
vn 0.4799 -0.5842 0.6545
vn 0.6222 -0.6334 0.4601
vn 0.7294 -0.6334 0.2584
vn -0.7806 -0.0551 0.6226
vn -0.6059 -0.1341 0.7841
vn -0.6905 0.2713 0.6705
vn -0.5123 0.2053 0.8339
vn -0.9438 -0.2404 0.2268
vn -0.8026 -0.3869 0.4541
vn -0.6356 -0.4846 0.6009
vn -0.5502 -0.5199 0.6534
vn -0.5221 -0.1660 0.8366
vn -0.4279 0.1753 0.8867
vn -0.2606 0.5294 0.8074
vn -0.3481 0.5573 0.7538
vn -0.5301 0.6002 0.5989
vn -0.7087 0.6116 0.3518
vn -0.8796 0.3517 0.3203
vn -0.9580 0.0676 0.2786
vn -0.4628 -0.7806 -0.4201
vn -0.2791 -0.9024 -0.3282
vn -0.6071 -0.7806 -0.1487
vn -0.4282 -0.9024 -0.0478
vn -0.4990 -0.4587 -0.7352
vn -0.2832 -0.6653 -0.6908
vn -0.0902 -0.7823 -0.6163
vn -0.0032 -0.8190 -0.5738
vn -0.1976 -0.9377 -0.2858
vn -0.3474 -0.9377 -0.0039
vn -0.4774 -0.8190 0.3183
vn -0.5614 -0.7823 0.2699
vn -0.7310 -0.6653 0.1516
vn -0.8886 -0.4587 -0.0025
vn -0.8289 -0.4807 -0.2861
vn -0.7008 -0.4807 -0.5271
vn 0.1696 0.2713 -0.9474
vn 0.4048 0.2053 -0.8911
vn 0.0795 -0.0551 -0.9953
vn 0.3111 -0.1341 -0.9409
vn -0.1048 0.6116 -0.7842
vn 0.1999 0.6002 -0.7744
vn 0.4302 0.5573 -0.7101
vn 0.5235 0.5294 -0.6676
vn 0.4957 0.1753 -0.8506
vn 0.4015 -0.1660 -0.9007
vn 0.2339 -0.5199 -0.8216
vn 0.1426 -0.4846 -0.8630
vn -0.0725 -0.3869 -0.9193
vn -0.3399 -0.2404 -0.9092
vn -0.3049 0.0676 -0.9500
vn -0.2265 0.3517 -0.9083
vn -0.1194 0.9867 0.1106
vn 0.0886 0.9699 0.2269
vn 0.0249 0.9867 -0.1609
vn 0.2377 0.9699 -0.0535
vn -0.5345 0.8245 0.1858
vn -0.3062 0.8740 0.3774
vn -0.1100 0.8532 0.5099
vn -0.0228 0.8282 0.5600
vn 0.1726 0.9465 0.2725
vn 0.3224 0.9465 -0.0093
vn 0.4514 0.8282 -0.3321
vn 0.3611 0.8532 -0.3764
vn 0.1416 0.8740 -0.4649
vn -0.1449 0.8245 -0.5470
vn -0.3229 0.8884 -0.3262
vn -0.4510 0.8884 -0.0852
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1744/1956/2045 1746/1957/2045 1743/1958/2045
f 1745/1959/2046 1747/1960/2046 1744/1956/2046
f 1746/1957/2047 1750/1961/2047 1749/1962/2047
f 1747/1960/2048 1751/1963/2048 1750/1961/2048
f 1662/1964/2049 1740/1965/2049 1652/1966/2049
f 1663/1967/2050 1743/1958/2050 1662/1964/2050
f 1664/1968/2051 1744/1956/2051 1663/1967/2051
f 1644/1969/2052 1745/1959/2052 1664/1968/2052
f 1745/1959/2053 1666/1970/2053 1748/1971/2053
f 1666/1970/2054 1751/1963/2054 1748/1971/2054
f 1751/1963/2055 1646/1972/2055 1701/1973/2055
f 1750/1961/2056 1701/1973/2056 1702/1974/2056
f 1749/1962/2057 1702/1974/2057 1703/1975/2057
f 1742/1976/2058 1703/1975/2058 1653/1977/2058
f 1741/1978/2059 1749/1962/2059 1742/1976/2059
f 1743/1958/2060 1741/1978/2060 1740/1965/2060
f 1753/1979/2061 1755/1980/2061 1752/1981/2061
f 1754/1982/2062 1756/1983/2062 1753/1979/2062
f 1755/1980/2063 1759/1984/2063 1758/1985/2063
f 1756/1983/2064 1760/1986/2064 1759/1984/2064
f 1703/1975/2065 1734/1987/2065 1653/1977/2065
f 1702/1974/2066 1752/1981/2066 1703/1975/2066
f 1701/1973/2067 1753/1979/2067 1702/1974/2067
f 1646/1972/2068 1754/1982/2068 1701/1973/2068
f 1754/1982/2069 1675/1988/2069 1757/1989/2069
f 1675/1988/2070 1760/1986/2070 1757/1989/2070
f 1760/1986/2071 1650/1990/2071 1707/1991/2071
f 1759/1984/2072 1707/1991/2072 1708/1992/2072
f 1758/1985/2073 1708/1992/2073 1709/1993/2073
f 1736/1994/2074 1709/1993/2074 1655/1995/2074
f 1735/1996/2075 1758/1985/2075 1736/1994/2075
f 1752/1981/2076 1735/1996/2076 1734/1987/2076
f 1762/1997/2077 1764/1998/2077 1761/1999/2077
f 1763/2000/2078 1765/2001/2078 1762/1997/2078
f 1764/1998/2079 1768/2002/2079 1767/2003/2079
f 1765/2001/2080 1769/2004/2080 1768/2002/2080
f 1709/1993/2081 1728/2005/2081 1655/1995/2081
f 1708/1992/2082 1761/1999/2082 1709/1993/2082
f 1707/1991/2083 1762/1997/2083 1708/1992/2083
f 1650/1990/2084 1763/2000/2084 1707/1991/2084
f 1763/2000/2085 1684/2006/2085 1766/2007/2085
f 1684/2006/2086 1769/2004/2086 1766/2007/2086
f 1769/2004/2087 1648/2008/2087 1713/2009/2087
f 1768/2002/2088 1713/2009/2088 1714/2010/2088
f 1767/2003/2089 1714/2010/2089 1715/2011/2089
f 1730/2012/2090 1715/2011/2090 1657/2013/2090
f 1729/2014/2091 1767/2003/2091 1730/2012/2091
f 1761/1999/2092 1729/2014/2092 1728/2005/2092
f 1771/2015/2093 1773/2016/2093 1770/2017/2093
f 1772/2018/2094 1774/2019/2094 1771/2015/2094
f 1773/2016/2095 1777/2020/2095 1776/2021/2095
f 1774/2019/2096 1778/2022/2096 1777/2020/2096
f 1715/2011/2097 1722/2023/2097 1657/2013/2097
f 1714/2010/2098 1770/2017/2098 1715/2011/2098
f 1713/2009/2099 1771/2015/2099 1714/2010/2099
f 1648/2008/2100 1772/2018/2100 1713/2009/2100
f 1772/2018/2101 1693/2024/2101 1775/2025/2101
f 1693/2024/2102 1778/2022/2102 1775/2025/2102
f 1778/2022/2103 1644/2026/2103 1664/2027/2103
f 1777/2020/2104 1664/2027/2104 1663/2028/2104
f 1776/2021/2105 1663/2028/2105 1662/2029/2105
f 1724/2030/2106 1662/2029/2106 1652/2031/2106
f 1723/2032/2107 1776/2021/2107 1724/2030/2107
f 1770/2017/2108 1723/2032/2108 1722/2023/2108
f 1779/2033/2109 1783/2034/2109 1782/2035/2109
f 1781/2036/2110 1783/2034/2110 1780/2037/2110
f 1783/2034/2111 1785/2038/2111 1782/2035/2111
f 1783/2034/2112 1787/2039/2112 1786/2040/2112
f 1645/2041/2113 1779/2033/2113 1659/2042/2113
f 1673/2043/2114 1780/2037/2114 1779/2033/2114
f 1671/2044/2115 1780/2037/2115 1672/2045/2115
f 1649/2046/2116 1781/2036/2116 1671/2044/2116
f 1682/2047/2117 1784/2048/2117 1781/2036/2117
f 1784/2048/2118 1680/2049/2118 1787/2039/2118
f 1787/2039/2119 1647/2050/2119 1691/2051/2119
f 1786/2040/2120 1691/2051/2120 1690/2052/2120
f 1786/2040/2121 1689/2053/2121 1785/2038/2121
f 1785/2038/2122 1643/2054/2122 1661/2055/2122
f 1782/2035/2123 1661/2055/2123 1660/2056/2123
f 1659/2042/2124 1782/2035/2124 1660/2056/2124
f 1788/2057/2125 1792/2058/2125 1791/2059/2125
f 1790/2060/2126 1792/2058/2126 1789/2061/2126
f 1792/2058/2127 1794/2062/2127 1791/2059/2127
f 1792/2058/2128 1796/2063/2128 1795/2064/2128
f 1650/1990/2129 1788/2057/2129 1683/2065/2129
f 1676/2066/2130 1789/2061/2130 1788/2057/2130
f 1674/2067/2131 1789/2061/2131 1675/2068/2131
f 1646/2069/2132 1790/2060/2132 1674/2067/2132
f 1667/2070/2133 1793/2071/2133 1790/2060/2133
f 1793/2071/2134 1665/2072/2134 1796/2063/2134
f 1796/2063/2135 1644/2073/2135 1694/2074/2135
f 1795/2064/2136 1694/2074/2136 1693/2075/2136
f 1795/2064/2137 1692/2076/2137 1794/2062/2137
f 1794/2062/2138 1648/2008/2138 1685/2077/2138
f 1791/2059/2139 1685/2077/2139 1684/2006/2139
f 1683/2065/2140 1791/2059/2140 1684/2006/2140
f 1797/2078/2141 1801/2079/2141 1800/2080/2141
f 1798/2081/2142 1802/2082/2142 1801/2079/2142
f 1801/2079/2143 1803/2083/2143 1800/2080/2143
f 1802/2082/2144 1804/2084/2144 1801/2079/2144
f 1647/2050/2145 1797/2078/2145 1691/2085/2145
f 1688/2086/2146 1798/2081/2146 1797/2078/2146
f 1687/2087/2147 1799/2088/2147 1798/2081/2147
f 1686/2089/2148 1719/2090/2148 1799/2088/2148
f 1799/2088/2149 1720/2091/2149 1802/2082/2149
f 1720/2091/2150 1805/2092/2150 1802/2082/2150
f 1721/2093/2151 1697/2094/2151 1805/2092/2151
f 1805/2092/2152 1696/2095/2152 1804/2084/2152
f 1804/2084/2153 1695/2096/2153 1803/2083/2153
f 1803/2083/2154 1643/2097/2154 1689/2098/2154
f 1690/2099/2155 1803/2083/2155 1689/2098/2155
f 1797/2078/2156 1690/2099/2156 1691/2085/2156
f 1807/2100/2157 1809/2101/2157 1806/2102/2157
f 1808/2103/2157 1810/2104/2157 1807/2100/2157
f 1810/2104/2158 1812/2105/2158 1809/2101/2158
f 1811/2106/2158 1813/2107/2158 1810/2104/2158
f 1718/2108/2159 1719/2090/2159 1658/2109/2159
f 1717/2110/2159 1806/2102/2159 1718/2108/2159
f 1717/2110/2159 1808/2103/2159 1807/2100/2159
f 1657/2013/2159 1808/2103/2159 1716/2111/2159
f 1722/2023/2157 1811/2106/2157 1808/2103/2157
f 1723/2032/2158 1814/2112/2158 1811/2106/2158
f 1724/2030/2160 1700/2113/2160 1814/2112/2160
f 1814/2112/2160 1699/2114/2160 1813/2107/2160
f 1813/2107/2160 1698/2115/2160 1812/2105/2160
f 1812/2105/2160 1651/2116/2160 1721/2093/2160
f 1809/2101/2158 1721/2093/2158 1720/2091/2158
f 1806/2102/2157 1720/2091/2157 1719/2090/2157
f 1815/2117/2161 1819/2118/2161 1818/2119/2161
f 1816/2120/2162 1820/2121/2162 1819/2118/2162
f 1819/2118/2163 1821/2122/2163 1818/2119/2163
f 1820/2121/2164 1822/2123/2164 1819/2118/2164
f 1649/2046/2165 1815/2117/2165 1682/2047/2165
f 1679/2124/2166 1816/2120/2166 1815/2117/2166
f 1678/2125/2167 1817/2126/2167 1816/2120/2167
f 1677/2127/2168 1725/2128/2168 1817/2126/2168
f 1817/2126/2169 1726/2129/2169 1820/2121/2169
f 1726/2129/2170 1823/2130/2170 1820/2121/2170
f 1727/2131/2171 1686/2089/2171 1823/2130/2171
f 1823/2130/2172 1687/2087/2172 1822/2123/2172
f 1822/2123/2173 1688/2086/2173 1821/2122/2173
f 1821/2122/2174 1647/2050/2174 1680/2049/2174
f 1681/2132/2175 1821/2122/2175 1680/2049/2175
f 1815/2117/2176 1681/2132/2176 1682/2047/2176
f 1825/2133/2177 1827/2134/2177 1824/2135/2177
f 1826/2136/2177 1828/2137/2177 1825/2133/2177
f 1828/2137/2178 1830/2138/2178 1827/2134/2178
f 1829/2139/2178 1831/2140/2178 1828/2137/2178
f 1712/2141/2179 1725/2128/2179 1656/2142/2179
f 1712/2141/2179 1825/2133/2179 1824/2135/2179
f 1710/2143/2179 1825/2133/2179 1711/2144/2179
f 1710/2143/2179 1728/2005/2179 1826/2136/2179
f 1826/2136/2177 1729/2145/2177 1829/2139/2177
f 1729/2145/2178 1832/2146/2178 1829/2139/2178
f 1730/2012/2180 1716/2147/2180 1832/2146/2180
f 1831/2140/2180 1716/2147/2180 1717/2148/2180
f 1831/2140/2180 1718/2108/2180 1830/2138/2180
f 1830/2138/2180 1658/2149/2180 1727/2131/2180
f 1827/2134/2178 1727/2131/2178 1726/2150/2178
f 1725/2128/2177 1827/2134/2177 1726/2150/2177
f 1833/2151/2181 1837/2152/2181 1836/2153/2181
f 1834/2154/2182 1838/2155/2182 1837/2152/2182
f 1837/2152/2183 1839/2156/2183 1836/2153/2183
f 1838/2155/2184 1840/2157/2184 1837/2152/2184
f 1645/2158/2185 1833/2151/2185 1673/2159/2185
f 1670/2160/2186 1834/2154/2186 1833/2151/2186
f 1669/2161/2187 1835/2162/2187 1834/2154/2187
f 1668/2163/2188 1731/2164/2188 1835/2162/2188
f 1835/2162/2189 1732/2165/2189 1838/2155/2189
f 1732/2165/2190 1841/2166/2190 1838/2155/2190
f 1733/2167/2191 1677/2127/2191 1841/2166/2191
f 1841/2166/2192 1678/2125/2192 1840/2157/2192
f 1840/2157/2193 1679/2124/2193 1839/2156/2193
f 1839/2156/2194 1649/2046/2194 1671/2168/2194
f 1672/2169/2195 1839/2156/2195 1671/2168/2195
f 1833/2151/2196 1672/2169/2196 1673/2159/2196
f 1843/2170/2197 1845/2171/2197 1842/2172/2197
f 1844/2173/2197 1846/2174/2197 1843/2170/2197
f 1846/2174/2198 1848/2175/2198 1845/2171/2198
f 1847/2176/2198 1849/2177/2198 1846/2174/2198
f 1706/2178/2199 1731/2179/2199 1654/2180/2199
f 1706/2178/2199 1843/2170/2199 1842/2172/2199
f 1704/2181/2199 1843/2170/2199 1705/2182/2199
f 1704/2181/2199 1734/2183/2199 1844/2173/2199
f 1844/2173/2197 1735/2184/2197 1847/2176/2197
f 1735/2184/2198 1850/2185/2198 1847/2176/2198
f 1736/2186/2200 1710/2143/2200 1850/2185/2200
f 1850/2185/2200 1711/2144/2200 1849/2177/2200
f 1848/2175/2200 1711/2144/2200 1712/2141/2200
f 1848/2175/2200 1656/2142/2200 1733/2187/2200
f 1845/2171/2198 1733/2187/2198 1732/2188/2198
f 1731/2179/2197 1845/2171/2197 1732/2188/2197
f 1851/2189/2201 1855/2190/2201 1854/2191/2201
f 1852/2192/2202 1856/2193/2202 1855/2190/2202
f 1855/2190/2203 1857/2194/2203 1854/2191/2203
f 1856/2193/2204 1858/2195/2204 1855/2190/2204
f 1643/2196/2205 1851/2189/2205 1661/2197/2205
f 1695/2198/2206 1852/2192/2206 1851/2189/2206
f 1696/2199/2207 1853/2200/2207 1852/2192/2207
f 1697/2201/2208 1737/2202/2208 1853/2200/2208
f 1853/2200/2209 1738/2203/2209 1856/2193/2209
f 1738/2203/2210 1859/2204/2210 1856/2193/2210
f 1739/2205/2211 1668/2163/2211 1859/2204/2211
f 1859/2204/2212 1669/2161/2212 1858/2195/2212
f 1858/2195/2213 1670/2160/2213 1857/2194/2213
f 1857/2194/2214 1645/2158/2214 1659/2206/2214
f 1660/2207/2215 1857/2194/2215 1659/2206/2215
f 1851/2189/2216 1660/2207/2216 1661/2197/2216
f 1861/2208/2217 1863/2209/2217 1860/2210/2217
f 1862/2211/2217 1864/2212/2217 1861/2208/2217
f 1864/2212/2218 1866/2213/2218 1863/2209/2218
f 1865/2214/2218 1867/2215/2218 1864/2212/2218
f 1651/2216/2219 1860/2210/2219 1737/2202/2219
f 1699/2217/2219 1860/2210/2219 1698/2218/2219
f 1700/2219/2219 1861/2208/2219 1699/2217/2219
f 1652/1966/2219 1862/2211/2219 1700/2219/2219
f 1740/1965/2217 1865/2214/2217 1862/2211/2217
f 1741/1978/2218 1868/2220/2218 1865/2214/2218
f 1742/1976/2220 1704/2221/2220 1868/2220/2220
f 1868/2220/2220 1705/2222/2220 1867/2215/2220
f 1867/2215/2220 1706/2223/2220 1866/2213/2220
f 1739/2205/2220 1706/2223/2220 1654/2224/2220
f 1863/2209/2218 1739/2205/2218 1738/2203/2218
f 1737/2202/2217 1863/2209/2217 1738/2203/2217
f 1744/1956/2221 1747/1960/2221 1746/1957/2221
f 1745/1959/2222 1748/1971/2222 1747/1960/2222
f 1746/1957/2223 1747/1960/2223 1750/1961/2223
f 1747/1960/2224 1748/1971/2224 1751/1963/2224
f 1662/1964/2225 1743/1958/2225 1740/1965/2225
f 1663/1967/2226 1744/1956/2226 1743/1958/2226
f 1664/1968/2227 1745/1959/2227 1744/1956/2227
f 1644/1969/2228 1665/2225/2228 1745/1959/2228
f 1745/1959/2229 1665/2225/2229 1666/1970/2229
f 1666/1970/2230 1667/2226/2230 1751/1963/2230
f 1751/1963/2231 1667/2226/2231 1646/1972/2231
f 1750/1961/2232 1751/1963/2232 1701/1973/2232
f 1749/1962/2233 1750/1961/2233 1702/1974/2233
f 1742/1976/2234 1749/1962/2234 1703/1975/2234
f 1741/1978/2235 1746/1957/2235 1749/1962/2235
f 1743/1958/2236 1746/1957/2236 1741/1978/2236
f 1753/1979/2237 1756/1983/2237 1755/1980/2237
f 1754/1982/2238 1757/1989/2238 1756/1983/2238
f 1755/1980/2239 1756/1983/2239 1759/1984/2239
f 1756/1983/2240 1757/1989/2240 1760/1986/2240
f 1703/1975/2241 1752/1981/2241 1734/1987/2241
f 1702/1974/2242 1753/1979/2242 1752/1981/2242
f 1701/1973/2243 1754/1982/2243 1753/1979/2243
f 1646/1972/2244 1674/2227/2244 1754/1982/2244
f 1754/1982/2245 1674/2227/2245 1675/1988/2245
f 1675/1988/2246 1676/2228/2246 1760/1986/2246
f 1760/1986/2247 1676/2228/2247 1650/1990/2247
f 1759/1984/2248 1760/1986/2248 1707/1991/2248
f 1758/1985/2249 1759/1984/2249 1708/1992/2249
f 1736/1994/2250 1758/1985/2250 1709/1993/2250
f 1735/1996/2251 1755/1980/2251 1758/1985/2251
f 1752/1981/2252 1755/1980/2252 1735/1996/2252
f 1762/1997/2253 1765/2001/2253 1764/1998/2253
f 1763/2000/2254 1766/2007/2254 1765/2001/2254
f 1764/1998/2255 1765/2001/2255 1768/2002/2255
f 1765/2001/2256 1766/2007/2256 1769/2004/2256
f 1709/1993/2257 1761/1999/2257 1728/2005/2257
f 1708/1992/2258 1762/1997/2258 1761/1999/2258
f 1707/1991/2259 1763/2000/2259 1762/1997/2259
f 1650/1990/2260 1683/2065/2260 1763/2000/2260
f 1763/2000/2261 1683/2065/2261 1684/2006/2261
f 1684/2006/2262 1685/2077/2262 1769/2004/2262
f 1769/2004/2263 1685/2077/2263 1648/2008/2263
f 1768/2002/2264 1769/2004/2264 1713/2009/2264
f 1767/2003/2265 1768/2002/2265 1714/2010/2265
f 1730/2012/2266 1767/2003/2266 1715/2011/2266
f 1729/2014/2267 1764/1998/2267 1767/2003/2267
f 1761/1999/2268 1764/1998/2268 1729/2014/2268
f 1771/2015/2269 1774/2019/2269 1773/2016/2269
f 1772/2018/2270 1775/2025/2270 1774/2019/2270
f 1773/2016/2271 1774/2019/2271 1777/2020/2271
f 1774/2019/2272 1775/2025/2272 1778/2022/2272
f 1715/2011/2273 1770/2017/2273 1722/2023/2273
f 1714/2010/2274 1771/2015/2274 1770/2017/2274
f 1713/2009/2275 1772/2018/2275 1771/2015/2275
f 1648/2008/2276 1692/2229/2276 1772/2018/2276
f 1772/2018/2277 1692/2229/2277 1693/2024/2277
f 1693/2024/2278 1694/2230/2278 1778/2022/2278
f 1778/2022/2279 1694/2230/2279 1644/2026/2279
f 1777/2020/2280 1778/2022/2280 1664/2027/2280
f 1776/2021/2281 1777/2020/2281 1663/2028/2281
f 1724/2030/2282 1776/2021/2282 1662/2029/2282
f 1723/2032/2283 1773/2016/2283 1776/2021/2283
f 1770/2017/2284 1773/2016/2284 1723/2032/2284
f 1779/2033/2285 1780/2037/2285 1783/2034/2285
f 1781/2036/2286 1784/2048/2286 1783/2034/2286
f 1783/2034/2287 1786/2040/2287 1785/2038/2287
f 1783/2034/2288 1784/2048/2288 1787/2039/2288
f 1645/2041/2289 1673/2043/2289 1779/2033/2289
f 1673/2043/2290 1672/2045/2290 1780/2037/2290
f 1671/2044/2291 1781/2036/2291 1780/2037/2291
f 1649/2046/2292 1682/2047/2292 1781/2036/2292
f 1682/2047/2293 1681/2132/2293 1784/2048/2293
f 1784/2048/2294 1681/2132/2294 1680/2049/2294
f 1787/2039/2295 1680/2049/2295 1647/2050/2295
f 1786/2040/2296 1787/2039/2296 1691/2051/2296
f 1786/2040/2297 1690/2052/2297 1689/2053/2297
f 1785/2038/2298 1689/2053/2298 1643/2054/2298
f 1782/2035/2299 1785/2038/2299 1661/2055/2299
f 1659/2042/2300 1779/2033/2300 1782/2035/2300
f 1788/2057/2301 1789/2061/2301 1792/2058/2301
f 1790/2060/2302 1793/2071/2302 1792/2058/2302
f 1792/2058/2303 1795/2064/2303 1794/2062/2303
f 1792/2058/2304 1793/2071/2304 1796/2063/2304
f 1650/1990/2305 1676/2066/2305 1788/2057/2305
f 1676/2066/2306 1675/2068/2306 1789/2061/2306
f 1674/2067/2307 1790/2060/2307 1789/2061/2307
f 1646/2069/2308 1667/2070/2308 1790/2060/2308
f 1667/2070/2309 1666/2231/2309 1793/2071/2309
f 1793/2071/2310 1666/2231/2310 1665/2072/2310
f 1796/2063/2311 1665/2072/2311 1644/2073/2311
f 1795/2064/2312 1796/2063/2312 1694/2074/2312
f 1795/2064/2313 1693/2075/2313 1692/2076/2313
f 1794/2062/2314 1692/2076/2314 1648/2008/2314
f 1791/2059/2315 1794/2062/2315 1685/2077/2315
f 1683/2065/2316 1788/2057/2316 1791/2059/2316
f 1797/2078/2317 1798/2081/2317 1801/2079/2317
f 1798/2081/2318 1799/2088/2318 1802/2082/2318
f 1801/2079/2319 1804/2084/2319 1803/2083/2319
f 1802/2082/2320 1805/2092/2320 1804/2084/2320
f 1647/2050/2321 1688/2086/2321 1797/2078/2321
f 1688/2086/2322 1687/2087/2322 1798/2081/2322
f 1687/2087/2323 1686/2089/2323 1799/2088/2323
f 1686/2089/2324 1658/2109/2324 1719/2090/2324
f 1799/2088/2325 1719/2090/2325 1720/2091/2325
f 1720/2091/2326 1721/2093/2326 1805/2092/2326
f 1721/2093/2327 1651/2116/2327 1697/2094/2327
f 1805/2092/2328 1697/2094/2328 1696/2095/2328
f 1804/2084/2329 1696/2095/2329 1695/2096/2329
f 1803/2083/2330 1695/2096/2330 1643/2097/2330
f 1690/2099/2331 1800/2080/2331 1803/2083/2331
f 1797/2078/2332 1800/2080/2332 1690/2099/2332
f 1807/2100/2157 1810/2104/2157 1809/2101/2157
f 1808/2103/2157 1811/2106/2157 1810/2104/2157
f 1810/2104/2158 1813/2107/2158 1812/2105/2158
f 1811/2106/2158 1814/2112/2158 1813/2107/2158
f 1718/2108/2159 1806/2102/2159 1719/2090/2159
f 1717/2110/2159 1807/2100/2159 1806/2102/2159
f 1717/2110/2159 1716/2111/2159 1808/2103/2159
f 1657/2013/2159 1722/2023/2159 1808/2103/2159
f 1722/2023/2157 1723/2032/2157 1811/2106/2157
f 1723/2032/2158 1724/2030/2158 1814/2112/2158
f 1724/2030/2160 1652/2031/2160 1700/2113/2160
f 1814/2112/2160 1700/2113/2160 1699/2114/2160
f 1813/2107/2160 1699/2114/2160 1698/2115/2160
f 1812/2105/2160 1698/2115/2160 1651/2116/2160
f 1809/2101/2158 1812/2105/2158 1721/2093/2158
f 1806/2102/2157 1809/2101/2157 1720/2091/2157
f 1815/2117/2333 1816/2120/2333 1819/2118/2333
f 1816/2120/2334 1817/2126/2334 1820/2121/2334
f 1819/2118/2335 1822/2123/2335 1821/2122/2335
f 1820/2121/2336 1823/2130/2336 1822/2123/2336
f 1649/2046/2337 1679/2124/2337 1815/2117/2337
f 1679/2124/2338 1678/2125/2338 1816/2120/2338
f 1678/2125/2339 1677/2127/2339 1817/2126/2339
f 1677/2127/2340 1656/2232/2340 1725/2128/2340
f 1817/2126/2341 1725/2128/2341 1726/2129/2341
f 1726/2129/2342 1727/2131/2342 1823/2130/2342
f 1727/2131/2343 1658/2109/2343 1686/2089/2343
f 1823/2130/2344 1686/2089/2344 1687/2087/2344
f 1822/2123/2345 1687/2087/2345 1688/2086/2345
f 1821/2122/2346 1688/2086/2346 1647/2050/2346
f 1681/2132/2347 1818/2119/2347 1821/2122/2347
f 1815/2117/2348 1818/2119/2348 1681/2132/2348
f 1825/2133/2177 1828/2137/2177 1827/2134/2177
f 1826/2136/2177 1829/2139/2177 1828/2137/2177
f 1828/2137/2178 1831/2140/2178 1830/2138/2178
f 1829/2139/2178 1832/2146/2178 1831/2140/2178
f 1712/2141/2179 1824/2135/2179 1725/2128/2179
f 1712/2141/2179 1711/2144/2179 1825/2133/2179
f 1710/2143/2179 1826/2136/2179 1825/2133/2179
f 1710/2143/2179 1655/2233/2179 1728/2005/2179
f 1826/2136/2177 1728/2005/2177 1729/2145/2177
f 1729/2145/2178 1730/2012/2178 1832/2146/2178
f 1730/2012/2180 1657/2234/2180 1716/2147/2180
f 1831/2140/2180 1832/2146/2180 1716/2147/2180
f 1831/2140/2180 1717/2148/2180 1718/2108/2180
f 1830/2138/2180 1718/2108/2180 1658/2149/2180
f 1827/2134/2178 1830/2138/2178 1727/2131/2178
f 1725/2128/2177 1824/2135/2177 1827/2134/2177
f 1833/2151/2349 1834/2154/2349 1837/2152/2349
f 1834/2154/2350 1835/2162/2350 1838/2155/2350
f 1837/2152/2351 1840/2157/2351 1839/2156/2351
f 1838/2155/2352 1841/2166/2352 1840/2157/2352
f 1645/2158/2353 1670/2160/2353 1833/2151/2353
f 1670/2160/2354 1669/2161/2354 1834/2154/2354
f 1669/2161/2355 1668/2163/2355 1835/2162/2355
f 1668/2163/2356 1654/2224/2356 1731/2164/2356
f 1835/2162/2357 1731/2164/2357 1732/2165/2357
f 1732/2165/2358 1733/2167/2358 1841/2166/2358
f 1733/2167/2359 1656/2232/2359 1677/2127/2359
f 1841/2166/2360 1677/2127/2360 1678/2125/2360
f 1840/2157/2361 1678/2125/2361 1679/2124/2361
f 1839/2156/2362 1679/2124/2362 1649/2046/2362
f 1672/2169/2363 1836/2153/2363 1839/2156/2363
f 1833/2151/2364 1836/2153/2364 1672/2169/2364
f 1843/2170/2197 1846/2174/2197 1845/2171/2197
f 1844/2173/2197 1847/2176/2197 1846/2174/2197
f 1846/2174/2198 1849/2177/2198 1848/2175/2198
f 1847/2176/2198 1850/2185/2198 1849/2177/2198
f 1706/2178/2199 1842/2172/2199 1731/2179/2199
f 1706/2178/2199 1705/2182/2199 1843/2170/2199
f 1704/2181/2199 1844/2173/2199 1843/2170/2199
f 1704/2181/2199 1653/1977/2199 1734/2183/2199
f 1844/2173/2197 1734/2183/2197 1735/2184/2197
f 1735/2184/2198 1736/2186/2198 1850/2185/2198
f 1736/2186/2200 1655/2233/2200 1710/2143/2200
f 1850/2185/2200 1710/2143/2200 1711/2144/2200
f 1848/2175/2200 1849/2177/2200 1711/2144/2200
f 1848/2175/2200 1712/2141/2200 1656/2142/2200
f 1845/2171/2198 1848/2175/2198 1733/2187/2198
f 1731/2179/2197 1842/2172/2197 1845/2171/2197
f 1851/2189/2365 1852/2192/2365 1855/2190/2365
f 1852/2192/2366 1853/2200/2366 1856/2193/2366
f 1855/2190/2367 1858/2195/2367 1857/2194/2367
f 1856/2193/2368 1859/2204/2368 1858/2195/2368
f 1643/2196/2369 1695/2198/2369 1851/2189/2369
f 1695/2198/2370 1696/2199/2370 1852/2192/2370
f 1696/2199/2371 1697/2201/2371 1853/2200/2371
f 1697/2201/2372 1651/2216/2372 1737/2202/2372
f 1853/2200/2373 1737/2202/2373 1738/2203/2373
f 1738/2203/2374 1739/2205/2374 1859/2204/2374
f 1739/2205/2375 1654/2224/2375 1668/2163/2375
f 1859/2204/2376 1668/2163/2376 1669/2161/2376
f 1858/2195/2377 1669/2161/2377 1670/2160/2377
f 1857/2194/2378 1670/2160/2378 1645/2158/2378
f 1660/2207/2379 1854/2191/2379 1857/2194/2379
f 1851/2189/2380 1854/2191/2380 1660/2207/2380
f 1861/2208/2217 1864/2212/2217 1863/2209/2217
f 1862/2211/2217 1865/2214/2217 1864/2212/2217
f 1864/2212/2218 1867/2215/2218 1866/2213/2218
f 1865/2214/2218 1868/2220/2218 1867/2215/2218
f 1651/2216/2219 1698/2218/2219 1860/2210/2219
f 1699/2217/2219 1861/2208/2219 1860/2210/2219
f 1700/2219/2219 1862/2211/2219 1861/2208/2219
f 1652/1966/2219 1740/1965/2219 1862/2211/2219
f 1740/1965/2217 1741/1978/2217 1865/2214/2217
f 1741/1978/2218 1742/1976/2218 1868/2220/2218
f 1742/1976/2220 1653/1977/2220 1704/2221/2220
f 1868/2220/2220 1704/2221/2220 1705/2222/2220
f 1867/2215/2220 1705/2222/2220 1706/2223/2220
f 1739/2205/2220 1866/2213/2220 1706/2223/2220
f 1863/2209/2218 1866/2213/2218 1739/2205/2218
f 1737/2202/2217 1860/2210/2217 1863/2209/2217
================================================
FILE: example/panda/proxy.txt
================================================
-5.550776502986174560e-02 8.231279402941545087e-02 -1.258323406064429106e-01
-3.998699747023472251e-02 3.020038030973061227e-01 -2.599704789464084564e-01
2.073124080213479496e-03 3.770118695120376895e-01 -2.168838484418099122e-01
5.750893906056839949e-02 2.883552490916799216e-01 -2.101273523705166069e-01
-3.476761380703526083e-02 4.834374151764500582e-01 -8.875533188111095484e-02
2.621275142798934477e-02 4.435870942052951293e-01 -1.248984240581483551e-01
1.101798104483094343e-01 2.418496348280122776e-01 -1.397495042337346671e-01
1.319851991971777161e-01 3.465081803292787699e-01 -3.289878265419945297e-02
1.336974251276483172e-01 2.202749102229801248e-01 -5.356574945673026750e-02
1.086184527498474384e-01 1.609235458707530264e-01 4.532679330515694202e-03
1.124615477246108525e-01 2.060799267972991955e-01 8.486694706963725243e-02
1.108263323668484124e-01 3.067028747214898821e-01 1.125133854568828168e-01
3.467248041426101296e-02 1.647063428200797319e-01 1.570200825594204042e-01
5.467205903680633083e-02 2.383217976424019668e-01 1.807923176314753244e-01
-1.874564039015293865e-02 1.964175410582183201e-01 2.093479902576223517e-01
-8.225799846755116329e-02 7.646714150453758074e-02 9.141778909875285242e-02
-5.224189994222040723e-02 3.949125850434637708e-01 1.982747426322316320e-01
-9.405994484432338154e-02 2.593093170156238192e-01 2.389352117168089817e-01
-9.703494708754141496e-02 5.829404013340751256e-02 1.610719828558321298e-02
-2.055200834886550532e-01 1.992344466413660808e-01 1.883966082339720294e-01
-1.356711670947795823e-01 4.887447652362277850e-01 4.418001738081579755e-02
-2.173799857791113010e-01 3.819825506574616725e-01 1.621729917257352660e-01
-2.787432499633716865e-01 3.248821230676776084e-01 1.292483467059716007e-01
-2.450270426904422316e-01 1.584248380129207323e-01 1.196774461102083353e-01
-2.878676416203407307e-01 2.432906162580208975e-01 1.181967670454668606e-01
-2.738120342378779215e-01 4.030445928469174599e-01 5.560421110246818038e-02
-3.130810722330712448e-01 2.938745572072704593e-01 6.305078821655751842e-02
-2.900091691531618832e-01 1.605951499766521962e-01 -6.958543919617858620e-03
-2.717992175119994247e-01 1.621162895758318789e-01 -1.056899670566608662e-01
-2.822868168892812557e-01 3.669708595544671770e-01 -1.189639037640026625e-01
-2.114300325877888076e-01 4.391127418038818786e-01 -1.285642937250763562e-01
-1.545111280485170213e-01 4.848961521022657761e-01 -7.189320777701599385e-02
-1.229855012261280439e-01 1.144157565148969102e-01 -1.816325621961639436e-01
-1.532148299108402612e-01 3.232779165147374734e-01 -2.528048557046997935e-01
-1.045429956927828724e-01 3.847695021971876961e-01 -2.346506189822485933e-01
-1.119364304491641537e-01 2.427564400919719478e-01 -2.627898221710761040e-01
-7.157083246404463694e-02 1.791651339035606194e-01 -2.392723688528716752e-01
-3.756669628577867515e-02 4.460785868262663323e-01 -1.680404449446399051e-01
5.418741668800247657e-02 1.920492558142700146e-01 -1.889949736685311854e-01
8.862197896480086823e-02 1.517676377888277917e-01 -9.579918481000744557e-02
9.629790456930117926e-02 3.561883225570809941e-01 -1.402252465633767597e-01
7.769766040132154716e-02 4.308292908908605834e-01 -4.683058806158802573e-02
-1.650320705265993088e-02 6.777278104619478827e-02 -1.408983456056614111e-02
1.418740874141209141e-01 2.644011773493079032e-01 2.106153578506128957e-02
5.039146841614738394e-03 4.801431579071926969e-01 -1.110328469672949674e-03
9.929583664508447038e-02 3.923613559930114181e-01 5.873973982447559805e-02
3.655372962407749116e-02 4.502507365863256950e-01 6.611435693657022328e-02
5.426811249069161264e-02 3.946426376428642802e-01 1.356021440899852215e-01
5.107282062075303719e-02 3.141208058035512285e-01 1.844436528632925065e-01
-6.212762610691983933e-02 4.899666979250824617e-01 5.079732036460833655e-02
-5.886325235218057422e-02 4.571426039724759138e-01 1.312719617818547124e-01
-1.571251398008543382e-02 3.009904072813025522e-01 2.245242758477692946e-01
-1.105376630224988305e-01 1.756978990046991762e-01 2.093494487887317401e-01
-1.418308141011307977e-01 3.924240001779347176e-01 1.956733328044601428e-01
-1.624546968236126321e-01 4.506657479746047912e-01 1.233850377025712819e-01
-2.001444731145569922e-01 3.067787104903851869e-01 2.071914218811163055e-01
-2.131031949385588931e-01 4.584919498633186485e-01 4.978459100188815933e-02
-1.888675957603991040e-01 8.248559414814389534e-02 3.854738602963610899e-02
-3.123775198565065248e-01 2.233428816374893322e-01 4.166007383791669322e-02
-3.108480335106847958e-01 3.521194789909793954e-01 -3.128124358998397858e-02
-3.119671767594359713e-01 2.154746516775981058e-01 -6.131003363328066391e-02
-2.566408760812657897e-01 4.306810105751979756e-01 -5.493290825089015145e-02
-1.489544366138922937e-01 7.010902990781905930e-02 -7.538793493780569788e-02
-2.712847985969045750e-01 2.770486221809733141e-01 -1.762602356884806765e-01
-2.201925101004638119e-01 1.500069479774868708e-01 -1.667541134031051697e-01
-1.880144755989196226e-01 3.878074674977167180e-01 -2.062840709220200852e-01
-1.563961531203472144e-01 1.870677896185135580e-01 -2.337640261544615405e-01
-1.161512059362490717e-01 4.571983027739320304e-01 -1.603710262715521695e-01
-5.109761322512303833e-02 1.734666661059509896e-01 -1.041602591896385044e-01
-5.749434767165484406e-02 -4.802923093816066080e-01 -9.003244462973614137e-02
-7.426083221293819370e-02 1.979886544564754547e-01 -2.039573394437437653e-02
3.923653135237548106e-02 -1.476519536313414815e-01 -2.434247844185520593e-01
2.898759172367905770e-02 -2.820774199326867016e-01 -2.270827014526285437e-01
1.062434236405538912e-01 -1.091549446469401718e-01 -1.960030823272940481e-01
7.462506252438656784e-02 -2.115380268523078811e-01 -2.191504889962407610e-01
1.077267130351568514e-01 -1.246008187360323101e-02 -1.674916204452387980e-01
4.737973589827712551e-02 -4.377353785729148350e-01 -9.525483132031775657e-02
1.553412247905231469e-01 -3.088354236180936296e-01 -6.180040370049259130e-02
1.316812611768390928e-01 5.308158351285514887e-02 -6.934336985558382793e-02
7.293107162963116674e-02 -4.402032108589321591e-01 -1.831674559411535994e-02
1.019577142943011230e-02 1.789941903667302825e-01 -1.441688249336608302e-02
1.347883560061164387e-01 5.960133450103863306e-02 1.740374193646921502e-02
1.233405631664254432e-01 2.235237609452735835e-03 1.205061879518037060e-01
1.207919818797827116e-01 -7.437310676425021305e-02 1.556902482235498208e-01
3.333711169507772021e-02 1.521397506197987581e-01 6.216924082309258082e-02
8.167635557237819255e-02 -2.127196891852091709e-01 1.940077909797579825e-01
-7.922176513590274000e-03 -2.008414427282950998e-01 2.370221583760520612e-01
-3.926097580430663869e-02 -7.956246720096604719e-02 2.415177463394604096e-01
-6.157823166656801323e-02 1.794979557309702156e-01 7.208259481587020456e-02
-9.064252285207617665e-02 -4.908812767019926637e-01 4.519023899358984298e-02
-1.615658610652523919e-01 1.028505387068022470e-01 1.483535096185837343e-01
-1.856083390824073698e-01 -2.021098036211639581e-01 2.236939874155560171e-01
-2.521846444483782368e-01 -1.100465839403525492e-01 1.827957620580193676e-01
-1.411180629849258239e-01 1.674540592912803771e-01 7.727491585457200995e-02
-1.874102472302851541e-01 -4.192614378958642307e-01 1.143903663090891365e-01
-3.054719825283451762e-01 -2.836364462067636483e-01 8.547719600716996435e-02
-1.966712725113235194e-01 -4.565533849220086426e-01 3.094476444199486392e-02
-3.325769729390443130e-01 -2.142967904085320563e-01 5.307425256307919670e-02
-2.073755297621894811e-01 1.536577752935968100e-01 -6.248630671590251970e-03
-3.134365110943921895e-01 1.319621029895936062e-02 -1.362956406195723127e-02
-3.302597436879254889e-01 -2.629106309851381162e-01 -1.713402274221870886e-02
-3.025119038047423548e-01 -3.384401689051037554e-01 -2.094933705209516689e-02
-2.914860344629453537e-01 1.395853391972036277e-02 -1.044096188399867564e-01
-3.150421118131560561e-01 -2.034782209865934055e-01 -1.210192845117787352e-01
-3.063949342938200870e-01 -2.832939361184677041e-01 -1.034186470063439972e-01
-1.737286540717478123e-01 1.530824731615720580e-01 -9.521882114696589294e-02
-2.294820571316612545e-01 -3.951116104417027319e-01 -1.213764205846953048e-01
-1.274760689568076866e-01 -4.894644144351057435e-01 -4.455677218461355504e-02
-1.883780035264805142e-01 5.136867836639960605e-02 -1.914117689693696223e-01
-1.433321802695492198e-01 -4.544472693282612386e-01 -1.176185819695205748e-01
-1.861209110668022859e-01 -1.306640638528994147e-01 -2.455403270050096065e-01
-1.255438649381262484e-01 -3.308568763444957872e-01 -2.262914805016349729e-01
-8.293709088078490832e-02 -7.409872502685851958e-02 -2.627245961667383778e-01
-4.488803173837272592e-02 -2.241254320686571211e-01 -2.607906531668653383e-01
-5.778871869538328154e-02 -3.051853877028548601e-01 -2.400823954967900298e-01
6.933445338944499436e-02 -3.553085890772759159e-01 -1.641864654730656181e-01
7.111602818333581655e-02 7.790943989140272641e-02 -1.428095505832082113e-01
1.525112891953009364e-01 -1.671772220211436566e-01 -1.425730385902093389e-01
1.635091738419923169e-01 -2.038581383228289801e-02 -7.259129809666196864e-02
1.863623081801869530e-01 -1.606691605382634946e-01 -6.215477803302005277e-02
1.860597960800501049e-01 -8.942066294581488273e-02 -3.125312549919054272e-02
1.775889425945270939e-01 -2.610350903930070965e-01 -1.126311690894651364e-02
-2.344530852828673184e-03 -4.843526491371444864e-01 -1.771665515139877762e-03
1.874771611551337980e-01 -1.746344797761466561e-01 2.508785818118245206e-02
1.673698763635819708e-01 -1.919017332593175168e-02 3.993461629869029389e-02
1.727239639120116965e-01 -9.728079191922173186e-02 7.539791998699200970e-02
1.665128570191360680e-01 -2.398893093849127078e-01 7.477779444444791757e-02
8.554297965405206761e-02 8.554345817627583004e-02 9.907725635580637291e-02
1.374989359402138955e-01 -1.654442203122245270e-01 1.448631144616539801e-01
1.039356910079401175e-01 -2.893834800076415803e-01 1.517027542469466528e-01
5.438731001724109171e-02 -3.558292982123242298e-01 1.561732813257111774e-01
3.977107071849014797e-02 3.777275649648492717e-02 1.772075415709397916e-01
4.590873630782910109e-02 -4.654621172417983122e-02 2.060602168536818102e-01
-3.911739318753588296e-02 8.070176686650581965e-02 1.807716190068306139e-01
-9.479836629895829758e-02 -2.981754513201692047e-01 2.235338189407063414e-01
-1.100390923997783266e-01 -3.853285868891334620e-01 1.787177106278184380e-01
-8.242936460454086678e-02 -4.478555864744457327e-01 1.249425775901668867e-01
-1.888650226570900426e-01 -2.855705533295971876e-01 2.032813129858487555e-01
-2.265427099231723806e-01 -2.634159763056893519e-02 1.842681888197268136e-01
-2.550827902725592011e-01 -2.565838064055482315e-01 1.667852298436166836e-01
-2.482112241496278460e-01 -3.481030410403000985e-01 1.249598292707110592e-01
-2.297985879068757398e-01 6.087823153217783445e-02 1.338388555900609234e-01
-2.957476590054699539e-01 -1.970419633881451138e-01 1.358875246111849178e-01
-3.119199946875143015e-01 -1.117417892092426313e-01 1.094828721846853037e-01
-2.870051418703375434e-01 -3.041021473069096398e-02 1.252605446213309048e-01
-2.797507465290141049e-01 -3.607317081877161047e-01 5.298546634207472461e-02
-3.449913209098987021e-01 -1.603767754118392563e-01 -1.592450024404423606e-03
-2.702243690174346291e-01 9.058667041744072890e-02 4.269668100568255013e-03
-3.263880428785118637e-01 -3.747339076041567019e-02 3.523707954466567815e-02
-2.584401925137772560e-01 -4.069771660701563265e-01 -1.268926338740771105e-02
-2.005025956799353359e-01 -4.533193987842354389e-01 -5.364111305908087629e-02
-3.287314812507123873e-01 -8.657125175611445067e-02 -8.483433757088074123e-02
-2.662392182619103265e-01 -2.568119279554665901e-01 -1.740010367826804738e-01
-2.804289937429688129e-01 -1.006646227632223278e-01 -1.730233146419092871e-01
-2.668405141313899276e-01 -1.735824191510288439e-01 -1.920330648734163981e-01
-2.525824237248898707e-01 -2.324085279657835817e-02 -1.821827892227929069e-01
-1.840250875899033522e-01 -2.232892774345062448e-01 -2.413800269728181846e-01
-1.988819222387389063e-01 -3.352238490689539718e-01 -1.952599600966745674e-01
-9.355341034860617522e-02 -4.004485492521940859e-01 -1.896387776120031510e-01
-1.135511577905180119e-01 -1.573767722888508580e-01 -2.660925622802430279e-01
-2.813875390087298631e-02 -1.635259475613462798e-01 -3.147323277676624120e-01
-1.676631262046321402e-02 -1.079960361406900327e-01 -3.682350934770836082e-01
-1.672365410687623732e-01 -6.284849773867773326e-02 -2.863768166859614328e-01
-1.314253088457320862e-01 -9.199806120652145869e-02 -2.070890620617938938e-01
-1.241286541559470658e-01 -1.078253514224595221e-02 -2.347827353061745503e-01
9.201051793553144398e-02 1.399799232378114711e-01 -2.920510722208433108e-01
1.211655923690682923e-01 9.458253137396505084e-02 -3.562868507952313690e-01
4.115163588079493590e-02 -4.874960108111341811e-02 -2.309689362023518155e-01
4.852723466605053659e-02 3.723330910471137761e-02 -2.177824527878361727e-01
-3.404961679122573553e-02 1.122606399542335764e-01 -2.528848643628236470e-01
-1.065033828550958628e-01 2.244844597753020699e-02 -3.461574265587462618e-01
6.919888814060355664e-02 8.740115000401535439e-02 -4.099269188987086920e-01
-8.072288583581499477e-03 6.823603547798220981e-02 -4.041228784155609932e-01
-3.381086987320603560e-02 -1.563812111732662199e-02 -3.956936075740030145e-01
5.137320248785325061e-02 -1.212605104729846429e-02 -3.874575361706760757e-01
-1.261228133420569941e-01 -8.160954968428942025e-02 -3.628899191346737774e-01
-4.052654592419473178e-02 -1.415180878565600597e-01 -2.120107194791421290e-01
4.656574598807482201e-02 1.485438727010804372e-01 -3.689228245248280524e-01
6.503619342049293817e-02 1.115706678142393826e-01 -2.308743500298740536e-01
1.285881472092311739e-01 6.710619562981345876e-02 -2.745451502763253115e-01
-5.702278579898233807e-02 -2.597417195676817239e-02 -1.937255719348669747e-01
4.296380999936167538e-03 1.660631317836948062e-01 -2.972090903885326041e-01
-5.885169880695314409e-02 4.475700100584593250e-02 -2.228472070792212878e-01
-7.501112417070433802e-02 8.647366148215318571e-02 -3.117930102299074635e-01
4.752100889209005807e-02 -8.415271595363915669e-02 -3.020920128352372336e-01
9.148144442025316070e-02 -2.134697714161457732e-02 -3.100027284651764581e-01
6.626177484956208286e-02 -4.303014755876416464e-01 -2.007927262613765640e-01
5.852483850399682608e-02 -2.938179110159915841e-01 -6.490886645457707993e-02
3.374254060812370004e-01 -3.923336580223399928e-01 -1.128168128810779630e-01
2.552557831023974932e-01 -4.155037762442775895e-01 -5.659741402802327681e-02
2.728606248768720710e-01 -2.830245499458966529e-01 -1.887835507532414914e-01
3.194857409937097992e-01 -2.963927462518259892e-01 -1.169946638267229944e-01
1.734568760427020462e-01 -2.711288713571128883e-01 -1.333777238799236098e-01
1.943417251864173467e-01 -2.860222711123543426e-01 -2.028709679802187249e-01
2.156584362201333649e-01 -3.689647550891873573e-01 -2.473055876178304469e-01
2.779244050448715364e-01 -4.482168436013012069e-01 -1.426067527497896781e-01
1.730486640180248573e-02 -3.756710736431007058e-01 -2.297525435980859176e-01
3.676433455927818200e-02 -2.739560620137703739e-01 -1.318540626234361857e-01
7.314933732017132872e-02 -3.713620332299126803e-01 -3.832026410277449002e-02
3.388873015090434482e-01 -3.518785468133447947e-01 -1.822407784337956582e-01
-8.196586983989565672e-03 -3.267244745397698269e-01 -1.769085376478596872e-01
-4.084161220566253844e-03 -4.053974280138745723e-01 -1.557141508364496829e-01
-3.654173467225249085e-03 -3.540696154696604814e-01 -7.924700368347148416e-02
2.745351214436949627e-01 -3.333271444116505666e-01 -4.327420975366753908e-02
1.790096840114032140e-01 -3.813246900680575813e-01 -3.929217572805278336e-02
2.503920320519980036e-01 -2.782837304219433761e-01 -9.976785915629624024e-02
1.130883503342663687e-01 -2.891601903657884942e-01 -2.131199118616968757e-01
2.774850750181143688e-01 -4.176240097268989482e-01 -2.226964641929417565e-01
1.420498851675272289e-01 -4.108012589104620416e-01 -2.291439346177294722e-01
1.239611174293918372e-01 -4.463023107030321679e-01 -1.362139380796755228e-01
2.075985808705454183e-01 -4.434556157605496485e-01 -1.831513307195437590e-01
5.869408343759982549e-02 -4.413766353361571193e-01 1.506274656265957912e-01
5.800594858045587249e-02 -4.209172538577762857e-01 2.423984931319135283e-01
6.529146201338115185e-02 -3.273488012482583254e-01 6.822825852834475679e-02
3.508789954216028073e-02 -2.776841636620446807e-01 1.998521624947844355e-01
1.804530694543642153e-03 -3.992247699724174037e-01 1.140764890335152770e-01
2.498181502103440643e-01 -3.639831082006405039e-01 2.743082854510320856e-01
3.107251084536175600e-01 -2.954251232736465616e-01 1.218432946945066930e-01
2.206682311764842086e-01 -2.730773671644081135e-01 1.815246457141029279e-01
1.147904978362478956e-01 -4.033254532127595882e-01 7.587030747612803772e-02
1.587601370074837592e-01 -3.217014195536745946e-01 6.943535248428396589e-02
2.651776443452864007e-01 -4.485475779005181352e-01 1.619997594039062350e-01
2.370590032568737140e-02 -2.892273834359271878e-01 1.174063432639800841e-01
7.751364446121310203e-02 -3.104779972276472400e-01 2.595106459875623495e-01
5.366983911521431769e-03 -3.426470613099035223e-01 2.432402748679551863e-01
3.346623759024219824e-01 -4.020963316924511255e-01 1.463669666724379259e-01
3.138347843442640395e-01 -4.026841227906329257e-01 2.422269688912635777e-01
1.684320066616068201e-01 -3.207957802464417862e-01 2.659340539580008489e-01
1.243855501634354432e-01 -3.898972110361811905e-01 2.686893463243060221e-01
3.030380145182849372e-01 -2.997150900185945321e-01 2.292539973829201549e-01
2.949694270862091017e-01 -4.068294848314123469e-01 8.496581012113588183e-02
2.509691903922973144e-01 -3.347079325304533892e-01 6.682279042077263198e-02
1.834499629133450604e-01 -4.474363240835148581e-01 1.769688991257585553e-01
1.955951475554616459e-01 -4.258420794384723540e-01 9.243146101738426712e-02
6.280332695674885335e-02 -1.320586570865656985e-02 3.460601442064881184e-01
1.483740785349419722e-01 -1.116310868174916898e-01 2.503590081480292495e-01
8.334883033473708402e-02 -1.433283877452027344e-01 4.018358000523979956e-01
4.857124584733577000e-02 -6.707222889993139092e-02 3.967553009142785747e-01
-1.569059121220639785e-01 -4.210229112410191626e-03 2.590931339004621248e-01
-1.548099427904343905e-02 -1.026011427477167837e-01 1.743223363538898285e-01
8.221762322791240085e-02 -8.642200763928496038e-03 2.728449393343670981e-01
1.460221927068068859e-01 -4.042968272413305364e-02 2.944411725200785090e-01
1.271191185609810981e-01 -1.796481001991364068e-01 3.182348436112846235e-01
2.989060776970349498e-02 -1.664072326250388423e-01 3.505882470443626997e-01
1.506511824685654033e-01 -7.964808311139018326e-02 3.678384511489326547e-01
-1.164781066079784849e-01 -7.985835709135620519e-02 3.094537728621728490e-01
-2.878260502009664934e-02 -9.318981959660427306e-02 3.600043942944177733e-01
-1.768255197697845768e-02 -1.897294347499440398e-02 3.464194946546278464e-01
-1.178029974999752205e-01 -8.053855411153146293e-02 1.553502043517525077e-01
-1.600253293893078577e-01 -8.612424738326149298e-02 2.362292483717431391e-01
-4.764319533982273869e-02 -1.408609410131720641e-01 3.056113886526913292e-01
-6.684423223009242077e-02 -5.242227006282698665e-03 1.465467098382926547e-01
7.425460083578695747e-02 -1.311821825664111207e-01 2.199763731511597542e-01
-9.054884388535158757e-02 1.832667141432457389e-02 2.978639568945842808e-01
-1.040193042525863615e-01 4.293274567439404887e-02 2.207938152551688071e-01
-5.948172901458232548e-03 2.302500629685206093e-02 2.441964598481220494e-01
================================================
FILE: example/pumpkin/mesh.obj
================================================
# Blender 3.6.1
# www.blender.org
o Cube.003
v -0.229940 -0.199760 0.283841
v -0.174661 -0.026964 0.192628
v -0.229940 -0.199760 -0.283841
v -0.174661 -0.026964 -0.192628
v 0.229940 -0.199760 0.283841
v 0.174661 -0.026964 0.192628
v 0.229940 -0.199760 -0.283841
v 0.174661 -0.026964 -0.192628
v -0.288160 -0.065191 0.348050
v -0.288160 -0.065191 -0.348050
v 0.288160 -0.065191 -0.348050
v 0.288160 -0.065191 0.348050
v -0.263480 -0.208177 -0.168403
v -0.280328 -0.212405 -0.000000
v -0.263480 -0.208177 0.168403
v -0.270580 -0.043014 0.319724
v -0.244257 -0.031665 0.281064
v -0.208876 -0.027523 0.234462
v -0.196551 -0.026979 0.113119
v -0.207263 -0.026987 -0.000000
v -0.196551 -0.026979 -0.113119
v -0.297172 -0.099702 -0.363602
v -0.291154 -0.139574 -0.358447
v -0.264833 -0.175982 -0.326794
v 0.136423 -0.208177 -0.325243
v -0.000000 -0.212405 -0.346040
v -0.136423 -0.208177 -0.325243
v -0.102919 -0.026979 -0.214806
v -0.000000 -0.026987 -0.225479
v 0.102919 -0.026979 -0.214806
v 0.297172 -0.099702 -0.363602
v 0.291154 -0.139574 -0.358447
v 0.264833 -0.175982 -0.326794
v 0.263480 -0.208177 0.168403
v 0.280328 -0.212405 -0.000000
v 0.263480 -0.208177 -0.168403
v 0.196551 -0.026979 -0.113119
v 0.207263 -0.026987 -0.000000
v 0.196551 -0.026979 0.113119
v 0.297172 -0.099702 0.363602
v 0.291154 -0.139574 0.358447
v 0.264833 -0.175982 0.326794
v -0.136423 -0.208177 0.325243
v 0.000000 -0.212405 0.346040
v 0.136423 -0.208177 0.325243
v 0.102919 -0.026979 0.214806
v -0.000000 -0.026987 0.225479
v -0.102919 -0.026979 0.214806
v -0.264833 -0.175982 0.326794
v -0.291154 -0.139574 0.358447
v -0.297172 -0.099702 0.363602
v -0.208876 -0.027523 -0.234462
v -0.244257 -0.031665 -0.281064
v -0.270580 -0.043014 -0.319724
v 0.208876 -0.027523 -0.234462
v 0.244257 -0.031665 -0.281064
v 0.270580 -0.043014 -0.319724
v 0.208876 -0.027523 0.234462
v 0.244257 -0.031665 0.281064
v 0.270580 -0.043014 0.319724
v 0.157588 -0.065191 0.445939
v -0.000000 -0.065191 0.478569
v -0.157588 -0.065191 0.445939
v 0.369205 -0.065191 -0.190340
v 0.396220 -0.065191 -0.000000
v 0.369205 -0.065191 0.190340
v -0.157588 -0.065191 -0.445939
v -0.000000 -0.065191 -0.478569
v 0.157588 -0.065191 -0.445939
v -0.369205 -0.065191 0.190340
v -0.396220 -0.065191 -0.000000
v -0.369205 -0.065191 -0.190340
v -0.346607 -0.043014 0.174884
v -0.312361 -0.031666 0.153987
v -0.263669 -0.027527 0.129996
v -0.371925 -0.043014 -0.000000
v -0.334876 -0.031667 -0.000000
v -0.281538 -0.027530 -0.000000
v -0.346607 -0.043014 -0.174884
v -0.312361 -0.031666 -0.153987
v -0.263669 -0.027527 -0.129996
v -0.147993 -0.043014 -0.409514
v -0.133735 -0.031666 -0.359056
v -0.115634 -0.027527 -0.294862
v -0.000000 -0.043014 -0.439403
v 0.000000 -0.031667 -0.384722
v -0.000000 -0.027530 -0.314189
v 0.147993 -0.043014 -0.409514
v 0.133735 -0.031666 -0.359056
v 0.115634 -0.027527 -0.294862
v 0.346607 -0.043014 -0.174884
v 0.312361 -0.031666 -0.153987
v 0.263669 -0.027527 -0.129996
v 0.371925 -0.043014 -0.000000
v 0.334876 -0.031667 -0.000000
v 0.281538 -0.027530 -0.000000
v 0.346607 -0.043014 0.174884
v 0.312361 -0.031666 0.153987
v 0.263669 -0.027527 0.129996
v 0.147993 -0.043014 0.409514
v 0.133735 -0.031666 0.359056
v 0.115634 -0.027527 0.294862
v -0.000000 -0.043014 0.439403
v -0.000000 -0.031667 0.384722
v -0.000000 -0.027530 0.314189
v -0.147993 -0.043014 0.409514
v -0.133735 -0.031666 0.359056
v -0.115634 -0.027527 0.294862
v -0.145394 -0.226186 -0.179476
v -0.000000 -0.232284 -0.189118
v 0.145394 -0.226186 -0.179476
v -0.153204 -0.232284 -0.000000
v 0.000000 -0.238940 -0.000000
v 0.153204 -0.232284 -0.000000
v -0.145394 -0.226186 0.179476
v 0.000000 -0.232284 0.189118
v 0.145394 -0.226186 0.179476
v 0.105842 -0.027011 -0.114213
v -0.000000 -0.027022 -0.117990
v -0.105842 -0.027011 -0.114213
v 0.110098 -0.027022 -0.000000
v -0.000000 -0.027034 -0.000000
v -0.110098 -0.027022 -0.000000
v 0.105842 -0.027011 0.114213
v -0.000000 -0.027022 0.117990
v -0.105842 -0.027011 0.114213
v 0.146141 -0.178311 0.415792
v 0.159175 -0.140197 0.459492
v 0.162510 -0.099779 0.465894
v 0.000000 -0.180349 0.445914
v -0.000000 -0.140820 0.493247
v -0.000000 -0.099857 0.500000
v -0.146141 -0.178311 0.415792
v -0.159175 -0.140197 0.459492
v -0.162510 -0.099779 0.465894
v 0.336958 -0.178311 -0.180332
v 0.373229 -0.140197 -0.195964
v 0.380775 -0.099779 -0.198837
v 0.361369 -0.180349 -0.000000
v 0.400647 -0.140820 -0.000000
v 0.408650 -0.099857 -0.000000
v 0.336958 -0.178311 0.180332
v 0.373229 -0.140197 0.195964
v 0.380775 -0.099779 0.198837
v -0.146140 -0.178311 -0.415792
v -0.159175 -0.140197 -0.459492
v -0.162510 -0.099779 -0.465894
v -0.000000 -0.180349 -0.445914
v -0.000000 -0.140820 -0.493247
v -0.000000 -0.099857 -0.500000
v 0.146141 -0.178311 -0.415792
v 0.159175 -0.140197 -0.459492
v 0.162510 -0.099779 -0.465894
v -0.336958 -0.178311 0.180332
v -0.373229 -0.140197 0.195964
v -0.380775 -0.099779 0.198837
v -0.361369 -0.180349 -0.000000
v -0.400647 -0.140820 -0.000000
v -0.408650 -0.099857 -0.000000
v -0.336958 -0.178311 -0.180332
v -0.373229 -0.140197 -0.195964
v -0.380775 -0.099779 -0.198837
vn -0.1717 0.9837 0.0535
vn -0.4598 0.8880 -0.0000
vn -0.4201 0.8973 0.1356
vn -0.0441 0.9989 0.0132
vn -0.1841 0.9829 -0.0000
vn -0.1717 0.9837 -0.0535
vn -0.4201 0.8973 -0.1356
vn -0.0441 0.9989 -0.0132
vn -0.3013 0.9185 0.2562
vn -0.7554 0.6084 0.2434
vn -0.5603 0.6771 0.4771
vn -0.1256 0.9865 0.1054
vn -0.0343 0.9990 0.0300
vn -0.0041 1.0000 0.0047
vn -0.0039 1.0000 0.0007
vn -0.0451 0.9990 -0.0000
vn -0.0039 1.0000 -0.0007
vn -0.0041 1.0000 -0.0047
vn -0.0343 0.9990 -0.0300
vn -0.1256 0.9865 -0.1054
vn -0.3013 0.9185 -0.2562
vn -0.7554 0.6084 -0.2434
vn -0.5603 0.6771 -0.4771
vn -0.8157 0.5784 -0.0000
vn -0.1450 0.9351 -0.3232
vn -0.0000 0.9907 -0.1360
vn -0.0000 0.9405 -0.3397
vn -0.0585 0.9896 -0.1314
vn -0.0000 0.9994 -0.0354
vn 0.1450 0.9351 -0.3232
vn 0.0585 0.9896 -0.1314
vn -0.2805 0.7293 -0.6240
vn -0.0149 0.9993 -0.0354
vn -0.0011 1.0000 -0.0036
vn -0.0000 1.0000 -0.0032
vn 0.0149 0.9993 -0.0354
vn 0.0011 1.0000 -0.0036
vn 0.0343 0.9990 -0.0300
vn 0.1256 0.9865 -0.1054
vn 0.3013 0.9185 -0.2562
vn 0.5603 0.6771 -0.4771
vn 0.2805 0.7293 -0.6240
vn -0.0000 0.7460 -0.6660
vn 0.1717 0.9837 -0.0535
vn 0.4598 0.8880 -0.0000
vn 0.4201 0.8973 -0.1356
vn 0.0441 0.9989 -0.0132
vn 0.1841 0.9829 -0.0000
vn 0.1717 0.9837 0.0535
vn 0.4201 0.8973 0.1356
vn 0.0441 0.9989 0.0132
vn 0.7554 0.6084 -0.2434
vn 0.0041 1.0000 -0.0047
vn 0.0039 1.0000 -0.0007
vn 0.0451 0.9990 -0.0000
vn 0.0039 1.0000 0.0007
vn 0.0041 1.0000 0.0047
vn 0.0343 0.9990 0.0300
vn 0.1256 0.9865 0.1054
vn 0.3013 0.9185 0.2562
vn 0.7554 0.6084 0.2434
vn 0.5603 0.6771 0.4771
vn 0.8157 0.5784 -0.0000
vn 0.1450 0.9351 0.3232
vn -0.0000 0.9907 0.1360
vn -0.0000 0.9405 0.3397
vn 0.0585 0.9896 0.1314
vn -0.0000 0.9994 0.0354
vn -0.1450 0.9351 0.3232
vn -0.0585 0.9896 0.1314
vn 0.2805 0.7293 0.6240
vn 0.0149 0.9993 0.0354
vn 0.0011 1.0000 0.0036
vn -0.0000 1.0000 0.0032
vn -0.0149 0.9993 0.0354
vn -0.0011 1.0000 0.0036
vn -0.2805 0.7293 0.6240
vn -0.0000 0.7460 0.6660
vn -0.0981 -0.9920 -0.0797
vn -0.0000 -1.0000 -0.0000
vn -0.0979 -0.9952 -0.0000
vn 0.0981 -0.9920 -0.0797
vn -0.0000 -0.9967 -0.0806
vn -0.0981 -0.9920 0.0797
vn 0.0981 -0.9920 0.0797
vn -0.0000 -0.9967 0.0806
vn -0.2399 -0.9495 -0.2021
vn -0.2497 -0.9641 -0.0909
vn -0.1183 -0.9704 -0.2104
vn 0.1183 -0.9704 -0.2104
vn -0.0000 -0.9771 -0.2129
vn 0.2399 -0.9495 -0.2021
vn 0.2497 -0.9641 -0.0909
vn 0.0979 -0.9952 -0.0000
vn 0.2497 -0.9641 0.0909
vn 0.2399 -0.9495 0.2021
vn 0.1183 -0.9704 0.2104
vn -0.0000 -0.9771 0.2129
vn -0.1183 -0.9704 0.2104
vn -0.2399 -0.9495 0.2021
vn -0.2497 -0.9641 0.0909
vn -0.2524 -0.9676 -0.0000
vn -0.0002 1.0000 0.0002
vn -0.0000 1.0000 -0.0000
vn -0.0002 1.0000 -0.0000
vn 0.0002 1.0000 0.0002
vn -0.0000 1.0000 0.0002
vn -0.0002 1.0000 -0.0002
vn 0.0002 1.0000 -0.0002
vn -0.0000 1.0000 -0.0002
vn 0.0002 1.0000 -0.0000
vn 0.0037 1.0000 -0.0000
vn 0.3510 -0.5399 0.7650
vn -0.0000 -0.8906 0.4549
vn 0.2032 -0.8799 0.4295
vn -0.0000 0.2110 0.9775
vn -0.0000 -0.5433 0.8396
vn -0.3510 -0.5399 0.7650
vn -0.2032 -0.8799 0.4295
vn 0.4045 -0.8498 0.3379
vn 0.6517 -0.5268 0.5457
vn 0.4057 0.1912 0.8938
vn 0.7565 0.1426 0.6382
vn -0.4057 0.1912 0.8938
vn -0.7565 0.1426 0.6382
vn -0.6517 -0.5268 0.5457
vn -0.4045 -0.8498 0.3379
vn 0.5024 -0.8501 -0.1578
vn 0.8649 -0.5019 -0.0000
vn 0.5191 -0.8547 -0.0000
vn 0.8197 -0.5105 -0.2597
vn 0.9967 0.0809 -0.0000
vn 0.5024 -0.8501 0.1578
vn 0.8197 -0.5105 0.2597
vn 0.4045 -0.8498 -0.3379
vn 0.6517 -0.5268 -0.5457
vn 0.9485 0.0971 -0.3015
vn 0.7565 0.1426 -0.6382
vn 0.9485 0.0971 0.3015
vn 0.2524 -0.9676 -0.0000
vn -0.3510 -0.5399 -0.7650
vn -0.0000 -0.8906 -0.4549
vn -0.2032 -0.8799 -0.4295
vn -0.0000 0.2110 -0.9775
vn -0.0000 -0.5433 -0.8396
vn 0.3510 -0.5399 -0.7650
vn 0.2032 -0.8799 -0.4295
vn -0.4045 -0.8498 -0.3379
vn -0.6517 -0.5268 -0.5457
vn -0.4057 0.1912 -0.8938
vn -0.7565 0.1426 -0.6382
vn 0.4057 0.1912 -0.8938
vn -0.5024 -0.8501 0.1578
vn -0.8649 -0.5019 -0.0000
vn -0.5191 -0.8547 -0.0000
vn -0.8197 -0.5105 0.2597
vn -0.9967 0.0809 -0.0000
vn -0.5024 -0.8501 -0.1578
vn -0.8197 -0.5105 -0.2597
vn -0.9485 0.0971 0.3015
vn -0.9485 0.0971 -0.3015
vn -0.0037 1.0000 -0.0000
vt 0.614045 0.062500
vt 0.598827 0.125000
vt 0.603756 0.062500
vt 0.620327 0.062500
vt 0.612461 0.125000
vt 0.611808 0.187500
vt 0.596951 0.187500
vt 0.620047 0.187500
vt 0.611784 0.000000
vt 0.582398 0.062500
vt 0.607378 0.000000
vt 0.616189 0.000000
vt 0.620595 0.000000
vt 0.625000 0.000000
vt 0.625000 0.062500
vt 0.620129 0.125000
vt 0.625000 0.187500
vt 0.625000 0.250000
vt 0.620035 0.250000
vt 0.611715 0.250000
vt 0.596684 0.250000
vt 0.571958 0.187500
vt 0.571585 0.250000
vt 0.574568 0.125000
vt 0.596684 0.312500
vt 0.611715 0.375000
vt 0.596684 0.375000
vt 0.611715 0.312500
vt 0.620035 0.375000
vt 0.596696 0.437500
vt 0.611816 0.437500
vt 0.571585 0.312500
vt 0.620035 0.312500
vt 0.625000 0.312500
vt 0.625000 0.375000
vt 0.620250 0.437500
vt 0.625000 0.437500
vt 0.622456 0.500000
vt 0.612522 0.500000
vt 0.596784 0.500000
vt 0.571585 0.500000
vt 0.571585 0.437500
vt 0.571585 0.375000
vt 0.614035 0.562500
vt 0.597087 0.625000
vt 0.596973 0.562500
vt 0.628646 0.562500
vt 0.614942 0.625000
vt 0.614035 0.687500
vt 0.596973 0.687500
vt 0.628646 0.687500
vt 0.571585 0.562500
vt 0.625000 0.500000
vt 0.651677 0.562500
vt 0.631735 0.625000
vt 0.651677 0.687500
vt 0.625000 0.750000
vt 0.622456 0.750000
vt 0.612522 0.750000
vt 0.596784 0.750000
vt 0.571585 0.687500
vt 0.571585 0.750000
vt 0.571585 0.625000
vt 0.596964 0.812500
vt 0.612461 0.875000
vt 0.598827 0.875000
vt 0.611909 0.812500
vt 0.620129 0.875000
vt 0.603756 0.937500
vt 0.614045 0.937500
vt 0.571958 0.812500
vt 0.620261 0.812500
vt 0.625000 0.812500
vt 0.625000 0.875000
vt 0.620327 0.937500
vt 0.625000 0.937500
vt 0.620595 1.000000
vt 0.616189 1.000000
vt 0.611784 1.000000
vt 0.607378 1.000000
vt 0.582398 0.937500
vt 0.574568 0.875000
vt 0.187479 0.562500
vt 0.249755 0.625000
vt 0.187469 0.625000
vt 0.311863 0.562500
vt 0.249832 0.562500
vt 0.187479 0.687500
vt 0.311863 0.687500
vt 0.249832 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.372977 0.562500
vt 0.311613 0.625000
vt 0.372977 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.695897 0.562500
vt 0.753227 0.625000
vt 0.699200 0.625000
vt 0.812777 0.562500
vt 0.752219 0.562500
vt 0.695897 0.687500
vt 0.812777 0.687500
vt 0.752219 0.687500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812903 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.657275 0.625000
vt 0.486801 0.812500
vt 0.432629 0.875000
vt 0.432531 0.812500
vt 0.536327 0.875000
vt 0.487461 0.875000
vt 0.489045 0.937500
vt 0.432827 0.937500
vt 0.432352 0.750000
vt 0.375000 0.812500
vt 0.486654 0.750000
vt 0.534451 0.812500
vt 0.534176 0.750000
vt 0.541256 0.937500
vt 0.549284 1.000000
vt 0.491189 1.000000
vt 0.433095 1.000000
vt 0.375000 0.937500
vt 0.375000 1.000000
vt 0.375000 0.875000
vt 0.431882 0.562500
vt 0.486470 0.625000
vt 0.431648 0.625000
vt 0.486539 0.562500
vt 0.534153 0.625000
vt 0.431882 0.687500
vt 0.486539 0.687500
vt 0.432352 0.500000
vt 0.486654 0.500000
vt 0.534162 0.562500
vt 0.534176 0.500000
vt 0.534162 0.687500
vt 0.372553 0.625000
vt 0.486715 0.312500
vt 0.432535 0.375000
vt 0.432535 0.312500
vt 0.534184 0.375000
vt 0.486715 0.375000
vt 0.486707 0.437500
vt 0.432519 0.437500
vt 0.432535 0.250000
vt 0.375000 0.312500
vt 0.375000 0.250000
vt 0.486715 0.250000
vt 0.534184 0.312500
vt 0.534184 0.250000
vt 0.534183 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.432827 0.062500
vt 0.487461 0.125000
vt 0.432629 0.125000
vt 0.489045 0.062500
vt 0.536327 0.125000
vt 0.432547 0.187500
vt 0.486808 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.433095 0.000000
vt 0.491189 0.000000
vt 0.541256 0.062500
vt 0.549284 0.000000
vt 0.534451 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.625000 0.125000
vt 0.625000 1.000000
vt 0.875000 0.625000
s 1
f 74/1/1 76/2/2 73/3/3
f 75/4/4 77/5/5 74/1/1
f 76/2/2 80/6/6 79/7/7
f 77/5/5 81/8/8 80/6/6
f 16/9/9 70/10/10 9/11/11
f 17/12/12 73/3/3 16/9/9
f 18/13/13 74/1/1 17/12/12
f 2/14/14 75/4/4 18/13/13
f 19/15/15 78/16/16 75/4/4
f 78/16/16 21/17/17 81/8/8
f 81/8/8 4/18/18 52/19/19
f 80/6/6 52/19/19 53/20/20
f 79/7/7 53/20/20 54/21/21
f 72/22/22 54/21/21 10/23/23
f 71/24/24 79/7/7 72/22/22
f 73/3/3 71/24/24 70/10/10
f 82/25/25 86/26/26 85/27/27
f 83/28/28 87/29/29 86/26/26
f 86/26/26 88/30/30 85/27/27
f 87/29/29 89/31/31 86/26/26
f 10/23/23 82/25/25 67/32/32
f 54/21/21 83/28/28 82/25/25
f 53/20/20 84/33/33 83/28/28
f 52/19/19 28/34/34 84/33/33
f 84/33/33 29/35/35 87/29/29
f 29/35/35 90/36/36 87/29/29
f 30/37/37 55/38/38 90/36/36
f 90/36/36 56/39/39 89/31/31
f 89/31/31 57/40/40 88/30/30
f 88/30/30 11/41/41 69/42/42
f 85/27/27 69/42/42 68/43/43
f 67/32/32 85/27/27 68/43/43
f 92/44/44 94/45/45 91/46/46
f 93/47/47 95/48/48 92/44/44
f 94/45/45 98/49/49 97/50/50
f 95/48/48 99/51/51 98/49/49
f 57/40/40 64/52/52 11/41/41
f 56/39/39 91/46/46 57/40/40
f 55/38/38 92/44/44 56/39/39
f 8/53/53 93/47/47 55/38/38
f 37/54/54 96/55/55 93/47/47
f 96/55/55 39/56/56 99/51/51
f 99/51/51 6/57/57 58/58/58
f 98/49/49 58/58/58 59/59/59
f 97/50/50 59/59/59 60/60/60
f 66/61/61 60/60/60 12/62/62
f 65/63/63 97/50/50 66/61/61
f 91/46/46 65/63/63 64/52/52
f 100/64/64 104/65/65 103/66/66
f 101/67/67 105/68/68 104/65/65
f 104/65/65 106/69/69 103/66/66
f 105/68/68 107/70/70 104/65/65
f 12/62/62 100/64/64 61/71/71
f 60/60/60 101/67/67 100/64/64
f 59/59/59 102/72/72 101/67/67
f 58/58/58 46/73/73 102/72/72
f 102/72/72 47/74/74 105/68/68
f 47/74/74 108/75/75 105/68/68
f 48/76/76 18/77/13 108/75/75
f 108/75/75 17/78/12 107/70/70
f 107/70/70 16/79/9 106/69/69
f 106/69/69 9/80/11 63/81/77
f 103/66/66 63/81/77 62/82/78
f 61/71/71 103/66/66 62/82/78
f 109/83/79 113/84/80 112/85/81
f 111/86/82 113/84/80 110/87/83
f 113/84/80 115/88/84 112/85/81
f 113/84/80 117/89/85 116/90/86
f 3/91/87 109/83/79 13/92/88
f 27/93/89 110/87/83 109/83/79
f 25/94/90 110/87/83 26/95/91
f 7/96/92 111/86/82 25/94/90
f 36/97/93 114/98/94 111/86/82
f 114/98/94 34/99/95 117/89/85
f 117/89/85 5/100/96 45/101/97
f 116/90/86 45/101/97 44/102/98
f 116/90/86 43/103/99 115/88/84
f 115/88/84 1/104/100 15/105/101
f 112/85/81 15/105/101 14/106/102
f 13/92/88 112/85/81 14/106/102
f 118/107/103 122/108/104 121/109/105
f 120/110/106 122/108/104 119/111/107
f 122/108/104 124/112/108 121/109/105
f 122/108/104 126/113/109 125/114/110
f 8/53/53 118/107/103 37/54/54
f 30/115/37 119/111/107 118/107/103
f 28/116/34 119/111/107 29/117/35
f 4/118/18 120/110/106 28/116/34
f 21/119/17 123/120/111 120/110/106
f 123/120/111 19/121/15 126/113/109
f 126/113/109 2/122/14 48/123/76
f 125/114/110 48/123/76 47/124/74
f 125/114/110 46/125/73 124/112/108
f 124/112/108 6/57/57 39/56/56
f 121/109/105 39/56/56 38/126/112
f 37/54/54 121/109/105 38/126/112
f 128/127/113 130/128/114 127/129/115
f 128/127/113 132/130/116 131/131/117
f 130/128/114 134/132/118 133/133/119
f 132/130/116 134/132/118 131/131/117
f 42/134/120 45/135/97 5/100/96
f 41/136/121 127/129/115 42/134/120
f 41/136/121 129/137/122 128/127/113
f 40/138/123 61/71/71 129/137/122
f 129/137/122 62/82/78 132/130/116
f 62/82/78 135/139/124 132/130/116
f 63/81/77 51/140/125 135/139/124
f 135/139/124 50/141/126 134/132/118
f 133/133/119 50/141/126 49/142/127
f 43/143/99 49/142/127 1/144/100
f 44/145/98 133/133/119 43/143/99
f 127/129/115 44/145/98 45/135/97
f 136/146/128 140/147/129 139/148/130
f 137/149/131 141/150/132 140/147/129
f 140/147/129 142/151/133 139/148/130
f 141/150/132 143/152/134 140/147/129
f 7/96/92 136/146/128 36/97/93
f 33/153/135 137/149/131 136/146/128
f 32/154/136 138/155/137 137/149/131
f 11/41/41 138/155/137 31/156/138
f 64/52/52 141/150/132 138/155/137
f 141/150/132 66/61/61 144/157/139
f 144/157/139 12/62/62 40/138/123
f 144/157/139 41/136/121 143/152/134
f 143/152/134 42/134/120 142/151/133
f 142/151/133 5/100/96 34/99/95
f 139/148/130 34/99/95 35/158/140
f 36/97/93 139/148/130 35/158/140
f 146/159/141 148/160/142 145/161/143
f 146/159/141 150/162/144 149/163/145
f 148/160/142 152/164/146 151/165/147
f 150/162/144 152/164/146 149/163/145
f 24/166/148 27/167/89 3/168/87
f 23/169/149 145/161/143 24/166/148
f 23/169/149 147/170/150 146/159/141
f 22/171/151 67/32/32 147/170/150
f 147/170/150 68/43/43 150/162/144
f 68/43/43 153/172/152 150/162/144
f 69/42/42 31/156/138 153/172/152
f 153/172/152 32/154/136 152/164/146
f 151/165/147 32/154/136 33/153/135
f 25/173/90 33/153/135 7/96/92
f 26/174/91 151/165/147 25/173/90
f 145/161/143 26/174/91 27/167/89
f 154/175/153 158/176/154 157/177/155
f 155/178/156 159/179/157 158/176/154
f 158/176/154 160/180/158 157/177/155
f 159/179/157 161/181/159 158/176/154
f 1/182/100 154/175/153 15/183/101
f 49/184/127 155/178/156 154/175/153
f 50/185/126 156/186/160 155/178/156
f 9/11/11 156/186/160 51/187/125
f 70/10/10 159/179/157 156/186/160
f 159/179/157 72/22/22 162/188/161
f 162/188/161 10/23/23 22/171/151
f 162/188/161 23/169/149 161/181/159
f 161/181/159 24/166/148 160/180/158
f 160/180/158 3/168/87 13/189/88
f 157/177/155 13/189/88 14/190/102
f 15/183/101 157/177/155 14/190/102
f 74/1/1 77/5/5 76/2/2
f 75/4/4 78/16/16 77/5/5
f 76/2/2 77/5/5 80/6/6
f 77/5/5 78/16/16 81/8/8
f 16/9/9 73/3/3 70/10/10
f 17/12/12 74/1/1 73/3/3
f 18/13/13 75/4/4 74/1/1
f 2/14/14 19/15/15 75/4/4
f 19/15/15 20/191/162 78/16/16
f 78/16/16 20/191/162 21/17/17
f 81/8/8 21/17/17 4/18/18
f 80/6/6 81/8/8 52/19/19
f 79/7/7 80/6/6 53/20/20
f 72/22/22 79/7/7 54/21/21
f 71/24/24 76/2/2 79/7/7
f 73/3/3 76/2/2 71/24/24
f 82/25/25 83/28/28 86/26/26
f 83/28/28 84/33/33 87/29/29
f 86/26/26 89/31/31 88/30/30
f 87/29/29 90/36/36 89/31/31
f 10/23/23 54/21/21 82/25/25
f 54/21/21 53/20/20 83/28/28
f 53/20/20 52/19/19 84/33/33
f 52/19/19 4/18/18 28/34/34
f 84/33/33 28/34/34 29/35/35
f 29/35/35 30/37/37 90/36/36
f 30/37/37 8/53/53 55/38/38
f 90/36/36 55/38/38 56/39/39
f 89/31/31 56/39/39 57/40/40
f 88/30/30 57/40/40 11/41/41
f 85/27/27 88/30/30 69/42/42
f 67/32/32 82/25/25 85/27/27
f 92/44/44 95/48/48 94/45/45
f 93/47/47 96/55/55 95/48/48
f 94/45/45 95/48/48 98/49/49
f 95/48/48 96/55/55 99/51/51
f 57/40/40 91/46/46 64/52/52
f 56/39/39 92/44/44 91/46/46
f 55/38/38 93/47/47 92/44/44
f 8/53/53 37/54/54 93/47/47
f 37/54/54 38/126/112 96/55/55
f 96/55/55 38/126/112 39/56/56
f 99/51/51 39/56/56 6/57/57
f 98/49/49 99/51/51 58/58/58
f 97/50/50 98/49/49 59/59/59
f 66/61/61 97/50/50 60/60/60
f 65/63/63 94/45/45 97/50/50
f 91/46/46 94/45/45 65/63/63
f 100/64/64 101/67/67 104/65/65
f 101/67/67 102/72/72 105/68/68
f 104/65/65 107/70/70 106/69/69
f 105/68/68 108/75/75 107/70/70
f 12/62/62 60/60/60 100/64/64
f 60/60/60 59/59/59 101/67/67
f 59/59/59 58/58/58 102/72/72
f 58/58/58 6/57/57 46/73/73
f 102/72/72 46/73/73 47/74/74
f 47/74/74 48/76/76 108/75/75
f 48/76/76 2/192/14 18/77/13
f 108/75/75 18/77/13 17/78/12
f 107/70/70 17/78/12 16/79/9
f 106/69/69 16/79/9 9/80/11
f 103/66/66 106/69/69 63/81/77
f 61/71/71 100/64/64 103/66/66
f 109/83/79 110/87/83 113/84/80
f 111/86/82 114/98/94 113/84/80
f 113/84/80 116/90/86 115/88/84
f 113/84/80 114/98/94 117/89/85
f 3/91/87 27/93/89 109/83/79
f 27/93/89 26/95/91 110/87/83
f 25/94/90 111/86/82 110/87/83
f 7/96/92 36/97/93 111/86/82
f 36/97/93 35/158/140 114/98/94
f 114/98/94 35/158/140 34/99/95
f 117/89/85 34/99/95 5/100/96
f 116/90/86 117/89/85 45/101/97
f 116/90/86 44/102/98 43/103/99
f 115/88/84 43/103/99 1/104/100
f 112/85/81 115/88/84 15/105/101
f 13/92/88 109/83/79 112/85/81
f 118/107/103 119/111/107 122/108/104
f 120/110/106 123/120/111 122/108/104
f 122/108/104 125/114/110 124/112/108
f 122/108/104 123/120/111 126/113/109
f 8/53/53 30/115/37 118/107/103
f 30/115/37 29/117/35 119/111/107
f 28/116/34 120/110/106 119/111/107
f 4/118/18 21/119/17 120/110/106
f 21/119/17 20/193/162 123/120/111
f 123/120/111 20/193/162 19/121/15
f 126/113/109 19/121/15 2/122/14
f 125/114/110 126/113/109 48/123/76
f 125/114/110 47/124/74 46/125/73
f 124/112/108 46/125/73 6/57/57
f 121/109/105 124/112/108 39/56/56
f 37/54/54 118/107/103 121/109/105
f 128/127/113 131/131/117 130/128/114
f 128/127/113 129/137/122 132/130/116
f 130/128/114 131/131/117 134/132/118
f 132/130/116 135/139/124 134/132/118
f 42/134/120 127/129/115 45/135/97
f 41/136/121 128/127/113 127/129/115
f 41/136/121 40/138/123 129/137/122
f 40/138/123 12/62/62 61/71/71
f 129/137/122 61/71/71 62/82/78
f 62/82/78 63/81/77 135/139/124
f 63/81/77 9/80/11 51/140/125
f 135/139/124 51/140/125 50/141/126
f 133/133/119 134/132/118 50/141/126
f 43/143/99 133/133/119 49/142/127
f 44/145/98 130/128/114 133/133/119
f 127/129/115 130/128/114 44/145/98
f 136/146/128 137/149/131 140/147/129
f 137/149/131 138/155/137 141/150/132
f 140/147/129 143/152/134 142/151/133
f 141/150/132 144/157/139 143/152/134
f 7/96/92 33/153/135 136/146/128
f 33/153/135 32/154/136 137/149/131
f 32/154/136 31/156/138 138/155/137
f 11/41/41 64/52/52 138/155/137
f 64/52/52 65/63/63 141/150/132
f 141/150/132 65/63/63 66/61/61
f 144/157/139 66/61/61 12/62/62
f 144/157/139 40/138/123 41/136/121
f 143/152/134 41/136/121 42/134/120
f 142/151/133 42/134/120 5/100/96
f 139/148/130 142/151/133 34/99/95
f 36/97/93 136/146/128 139/148/130
f 146/159/141 149/163/145 148/160/142
f 146/159/141 147/170/150 150/162/144
f 148/160/142 149/163/145 152/164/146
f 150/162/144 153/172/152 152/164/146
f 24/166/148 145/161/143 27/167/89
f 23/169/149 146/159/141 145/161/143
f 23/169/149 22/171/151 147/170/150
f 22/171/151 10/23/23 67/32/32
f 147/170/150 67/32/32 68/43/43
f 68/43/43 69/42/42 153/172/152
f 69/42/42 11/41/41 31/156/138
f 153/172/152 31/156/138 32/154/136
f 151/165/147 152/164/146 32/154/136
f 25/173/90 151/165/147 33/153/135
f 26/174/91 148/160/142 151/165/147
f 145/161/143 148/160/142 26/174/91
f 154/175/153 155/178/156 158/176/154
f 155/178/156 156/186/160 159/179/157
f 158/176/154 161/181/159 160/180/158
f 159/179/157 162/188/161 161/181/159
f 1/182/100 49/184/127 154/175/153
f 49/184/127 50/185/126 155/178/156
f 50/185/126 51/187/125 156/186/160
f 9/11/11 70/10/10 156/186/160
f 70/10/10 71/24/24 159/179/157
f 159/179/157 71/24/24 72/22/22
f 162/188/161 72/22/22 10/23/23
f 162/188/161 22/171/151 23/169/149
f 161/181/159 23/169/149 24/166/148
f 160/180/158 24/166/148 3/168/87
f 157/177/155 160/180/158 13/189/88
f 15/183/101 154/175/153 157/177/155
o Cube.002
v -0.077255 -0.050590 0.245657
v -0.077255 0.214111 0.245657
v -0.077255 -0.050590 0.069190
v -0.077255 0.214111 0.069190
v 0.099213 -0.050590 0.245657
v 0.099213 0.214111 0.245657
v 0.099213 -0.050590 0.069190
v 0.099213 0.214111 0.069190
v -0.106666 0.094345 0.275068
v -0.106666 0.094345 0.039778
v 0.128624 0.094345 0.039778
v 0.128624 0.094345 0.275068
v -0.090125 -0.058127 0.105074
v -0.096590 -0.061912 0.157423
v -0.090125 -0.058127 0.209773
v -0.106036 0.136153 0.274439
v -0.101628 0.171659 0.270030
v -0.090755 0.198504 0.259157
v -0.090125 0.219445 0.209773
v -0.096590 0.222124 0.157423
v -0.090125 0.219445 0.105074
v -0.106036 0.049315 0.040408
v -0.101628 0.006357 0.044817
v -0.090755 -0.028917 0.055690
v 0.063328 -0.058127 0.056319
v 0.010979 -0.061912 0.049855
v -0.041370 -0.058127 0.056319
v -0.041370 0.219445 0.056319
v 0.010979 0.222124 0.049855
v 0.063328 0.219445 0.056319
v 0.127994 0.049315 0.040408
v 0.123586 0.006357 0.044817
v 0.112713 -0.028917 0.055690
v 0.112083 -0.058127 0.209773
v 0.118548 -0.061912 0.157423
v 0.112083 -0.058127 0.105074
v 0.112083 0.219445 0.105074
v 0.118548 0.222124 0.157423
v 0.112083 0.219445 0.209773
v 0.127994 0.049315 0.274439
v 0.123586 0.006357 0.270030
v 0.112713 -0.028917 0.259157
v -0.041370 -0.058127 0.258527
v 0.010979 -0.061912 0.264992
v 0.063328 -0.058127 0.258527
v 0.063328 0.219445 0.258527
v 0.010979 0.222124 0.264992
v -0.041370 0.219445 0.258527
v -0.090755 -0.028917 0.259157
v -0.101628 0.006357 0.270030
v -0.106036 0.049315 0.274439
v -0.090755 0.198504 0.055690
v -0.101628 0.171659 0.044817
v -0.106036 0.136153 0.040408
v 0.112713 0.198504 0.055690
v 0.123586 0.171659 0.044817
v 0.127994 0.136153 0.040408
v 0.112713 0.198504 0.259157
v 0.123586 0.171659 0.270030
v 0.127994 0.136153 0.274439
v 0.075316 0.094345 0.308156
v 0.010979 0.094345 0.319185
v -0.053358 0.094345 0.308156
v 0.161712 0.094345 0.093086
v 0.172741 0.094345 0.157423
v 0.161712 0.094345 0.221761
v -0.053358 0.094345 0.006691
v 0.010979 0.094345 -0.004338
v 0.075316 0.094345 0.006691
v -0.139754 0.094345 0.221761
v -0.150783 0.094345 0.157423
v -0.139754 0.094345 0.093086
v -0.138956 0.136202 0.221414
v -0.133371 0.172054 0.218986
v -0.118462 0.199980 0.213562
v -0.149932 0.136252 0.157423
v -0.143975 0.172449 0.157423
v -0.127839 0.201271 0.157423
v -0.138956 0.136202 0.093433
v -0.133371 0.172054 0.095861
v -0.118462 0.199980 0.101285
v -0.053011 0.136202 0.007489
v -0.050584 0.172054 0.013073
v -0.045159 0.199980 0.027983
v 0.010979 0.136252 -0.003487
v 0.010979 0.172449 0.002470
v 0.010979 0.201271 0.018605
v 0.074969 0.136202 0.007489
v 0.072542 0.172054 0.013073
v 0.067117 0.199980 0.027983
v 0.160914 0.136202 0.093433
v 0.155329 0.172054 0.095861
v 0.140420 0.199980 0.101285
v 0.171890 0.136252 0.157423
v 0.165933 0.172449 0.157423
v 0.149797 0.201271 0.157423
v 0.160914 0.136202 0.221414
v 0.155329 0.172054 0.218986
v 0.140420 0.199980 0.213562
v 0.074969 0.136202 0.307358
v 0.072542 0.172054 0.301773
v 0.067117 0.199980 0.286864
v 0.010979 0.136252 0.318334
v 0.010979 0.172449 0.312377
v 0.010979 0.201271 0.296242
v -0.053011 0.136202 0.307358
v -0.050584 0.172054 0.301773
v -0.045159 0.199980 0.286864
v -0.044812 -0.074253 0.101632
v 0.010979 -0.079713 0.098635
v 0.066770 -0.074253 0.101632
v -0.047810 -0.079713 0.157423
v 0.010979 -0.085673 0.157423
v 0.069767 -0.079713 0.157423
v -0.044812 -0.074253 0.213215
v 0.010979 -0.079713 0.216212
v 0.066770 -0.074253 0.213215
v 0.066770 0.230858 0.101632
v 0.010979 0.234722 0.098635
v -0.044812 0.230858 0.101632
v 0.069767 0.234722 0.157423
v 0.010979 0.238940 0.157423
v -0.047810 0.234722 0.157423
v 0.066770 0.230858 0.213215
v 0.010979 0.234722 0.216212
v -0.044812 0.230858 0.213215
v 0.067117 -0.031002 0.286864
v 0.072542 0.005799 0.301773
v 0.074969 0.049245 0.307358
v 0.010979 -0.032827 0.296242
v 0.010979 0.005241 0.312377
v 0.010979 0.049175 0.318334
v -0.045159 -0.031003 0.286864
v -0.050584 0.005799 0.301773
v -0.053011 0.049245 0.307358
v 0.140420 -0.031002 0.101285
v 0.155329 0.005799 0.095861
v 0.160914 0.049245 0.093433
v 0.149797 -0.032827 0.157423
v 0.165933 0.005241 0.157423
v 0.171890 0.049175 0.157423
v 0.140420 -0.031003 0.213562
v 0.155329 0.005799 0.218986
v 0.160914 0.049245 0.221414
v -0.045159 -0.031002 0.027983
v -0.050584 0.005799 0.013073
v -0.053011 0.049245 0.007489
v 0.010979 -0.032827 0.018605
v 0.010979 0.005241 0.002470
v 0.010979 0.049175 -0.003487
v 0.067117 -0.031003 0.027983
v 0.072542 0.005799 0.013073
v 0.074969 0.049245 0.007489
v -0.118462 -0.031002 0.213562
v -0.133371 0.005799 0.218986
v -0.138956 0.049245 0.221414
v -0.127839 -0.032827 0.157423
v -0.143975 0.005241 0.157423
v -0.149932 0.049175 0.157423
v -0.118462 -0.031003 0.101285
v -0.133371 0.005799 0.095861
v -0.138956 0.049245 0.093433
vn -0.9724 0.1628 0.1669
vn -0.8595 0.4882 0.1512
vn -0.9724 0.1628 -0.1669
vn -0.8595 0.4882 -0.1512
vn -0.8495 0.0207 0.5272
vn -0.8374 0.1685 0.5200
vn -0.7398 0.4875 0.4638
vn -0.5365 0.7675 0.3508
vn -0.5516 0.8267 0.1112
vn -0.5516 0.8267 -0.1112
vn -0.5365 0.7675 -0.3508
vn -0.7398 0.4875 -0.4638
vn -0.8374 0.1685 -0.5200
vn -0.8495 0.0207 -0.5272
vn -0.9854 0.0202 -0.1689
vn -0.9854 0.0202 0.1689
vn -0.1669 0.1628 -0.9724
vn -0.1512 0.4882 -0.8595
vn 0.1669 0.1628 -0.9724
vn 0.1512 0.4882 -0.8595
vn -0.5272 0.0207 -0.8495
vn -0.5200 0.1685 -0.8374
vn -0.4638 0.4875 -0.7398
vn -0.3508 0.7675 -0.5365
vn -0.1112 0.8267 -0.5516
vn 0.1112 0.8267 -0.5516
vn 0.3508 0.7675 -0.5365
vn 0.4638 0.4875 -0.7398
vn 0.5200 0.1685 -0.8374
vn 0.5272 0.0207 -0.8495
vn 0.1689 0.0202 -0.9854
vn -0.1689 0.0202 -0.9854
vn 0.9724 0.1628 -0.1669
vn 0.8595 0.4882 -0.1512
vn 0.9724 0.1628 0.1669
vn 0.8595 0.4882 0.1512
vn 0.8495 0.0207 -0.5272
vn 0.8374 0.1685 -0.5200
vn 0.7398 0.4875 -0.4638
vn 0.5365 0.7675 -0.3508
vn 0.5516 0.8267 -0.1112
vn 0.5516 0.8267 0.1112
vn 0.5365 0.7675 0.3508
vn 0.7398 0.4875 0.4638
vn 0.8374 0.1685 0.5200
vn 0.8495 0.0207 0.5272
vn 0.9854 0.0202 0.1689
vn 0.9854 0.0202 -0.1689
vn 0.1669 0.1628 0.9724
vn 0.1512 0.4882 0.8595
vn -0.1669 0.1628 0.9724
vn -0.1512 0.4882 0.8595
vn 0.5272 0.0207 0.8495
vn 0.5200 0.1685 0.8374
vn 0.4638 0.4875 0.7398
vn 0.3508 0.7675 0.5365
vn 0.1112 0.8267 0.5516
vn -0.1112 0.8267 0.5516
vn -0.3508 0.7675 0.5365
vn -0.4638 0.4875 0.7398
vn -0.5200 0.1685 0.8374
vn -0.5272 0.0207 0.8495
vn -0.1689 0.0202 0.9854
vn 0.1689 0.0202 0.9854
vn -0.1003 -0.9897 -0.1022
vn 0.1022 -0.9897 -0.1003
vn -0.1003 -0.9897 0.1022
vn 0.1022 -0.9897 0.1003
vn -0.3397 -0.8885 -0.3084
vn -0.1097 -0.9338 -0.3406
vn 0.1096 -0.9337 -0.3407
vn 0.3084 -0.8885 -0.3397
vn 0.3406 -0.9338 -0.1097
vn 0.3406 -0.9338 0.1097
vn 0.3084 -0.8885 0.3397
vn 0.1096 -0.9337 0.3407
vn -0.1097 -0.9338 0.3406
vn -0.3397 -0.8885 0.3084
vn -0.3407 -0.9337 0.1096
vn -0.3407 -0.9337 -0.1096
vn 0.0714 0.9948 -0.0727
vn -0.0727 0.9948 -0.0714
vn 0.0714 0.9948 0.0727
vn -0.0727 0.9948 0.0714
vn 0.2541 0.9393 -0.2307
vn 0.0802 0.9651 -0.2492
vn -0.0802 0.9651 -0.2492
vn -0.2307 0.9393 -0.2541
vn -0.2492 0.9651 -0.0802
vn -0.2492 0.9651 0.0802
vn -0.2307 0.9393 0.2541
vn -0.0802 0.9651 0.2492
vn 0.0802 0.9651 0.2492
vn 0.2541 0.9393 0.2307
vn 0.2492 0.9651 0.0802
vn 0.2492 0.9651 -0.0802
vn 0.1642 -0.3849 0.9082
vn 0.1694 -0.1324 0.9766
vn -0.1642 -0.3849 0.9082
vn -0.1694 -0.1324 0.9766
vn 0.3674 -0.6959 0.6170
vn 0.4915 -0.3883 0.7796
vn 0.5242 -0.1373 0.8405
vn 0.5274 -0.0191 0.8494
vn 0.1690 -0.0186 0.9854
vn -0.1690 -0.0186 0.9854
vn -0.5274 -0.0191 0.8494
vn -0.5242 -0.1373 0.8405
vn -0.4915 -0.3883 0.7796
vn -0.3674 -0.6959 0.6170
vn -0.1358 -0.7247 0.6755
vn 0.1358 -0.7247 0.6755
vn 0.9082 -0.3849 -0.1642
vn 0.9766 -0.1324 -0.1694
vn 0.9082 -0.3849 0.1642
vn 0.9766 -0.1324 0.1694
vn 0.6170 -0.6959 -0.3674
vn 0.7796 -0.3883 -0.4915
vn 0.8405 -0.1373 -0.5242
vn 0.8494 -0.0191 -0.5274
vn 0.9854 -0.0186 -0.1690
vn 0.9854 -0.0186 0.1690
vn 0.8494 -0.0191 0.5274
vn 0.8405 -0.1373 0.5242
vn 0.7796 -0.3883 0.4915
vn 0.6170 -0.6959 0.3674
vn 0.6755 -0.7247 0.1358
vn 0.6755 -0.7247 -0.1358
vn -0.1642 -0.3849 -0.9082
vn -0.1694 -0.1324 -0.9766
vn 0.1642 -0.3849 -0.9082
vn 0.1694 -0.1324 -0.9766
vn -0.3674 -0.6959 -0.6170
vn -0.4915 -0.3883 -0.7796
vn -0.5242 -0.1373 -0.8405
vn -0.5274 -0.0191 -0.8494
vn -0.1690 -0.0186 -0.9854
vn 0.1690 -0.0186 -0.9854
vn 0.5274 -0.0191 -0.8494
vn 0.5242 -0.1373 -0.8405
vn 0.4915 -0.3883 -0.7796
vn 0.3674 -0.6959 -0.6170
vn 0.1358 -0.7247 -0.6755
vn -0.1358 -0.7247 -0.6755
vn -0.9082 -0.3849 0.1642
vn -0.9766 -0.1324 0.1694
vn -0.9082 -0.3849 -0.1642
vn -0.9766 -0.1324 -0.1694
vn -0.6170 -0.6959 0.3674
vn -0.7796 -0.3883 0.4915
vn -0.8405 -0.1373 0.5242
vn -0.8494 -0.0191 0.5274
vn -0.9854 -0.0186 0.1690
vn -0.9854 -0.0186 -0.1690
vn -0.8494 -0.0191 -0.5274
vn -0.8405 -0.1373 -0.5242
vn -0.7796 -0.3883 -0.4915
vn -0.6170 -0.6959 -0.3674
vn -0.6755 -0.7247 -0.1358
vn -0.6755 -0.7247 0.1358
vn -0.9726 0.1601 0.1686
vn -0.8620 0.4826 0.1551
vn -0.9726 0.1601 -0.1686
vn -0.8620 0.4826 -0.1551
vn -0.8494 0.0206 0.5274
vn -0.8369 0.1657 0.5217
vn -0.7400 0.4855 0.4654
vn -0.5117 0.8039 0.3030
vn -0.5527 0.8261 0.1105
vn -0.5526 0.8261 -0.1105
vn -0.5117 0.8039 -0.3030
vn -0.7400 0.4855 -0.4654
vn -0.8369 0.1657 -0.5217
vn -0.8494 0.0206 -0.5274
vn -0.9854 0.0200 -0.1690
vn -0.9854 0.0200 0.1690
vn -0.1686 0.1601 -0.9726
vn -0.1551 0.4826 -0.8620
vn 0.1686 0.1601 -0.9726
vn 0.1551 0.4826 -0.8620
vn -0.5274 0.0206 -0.8494
vn -0.5217 0.1657 -0.8369
vn -0.4654 0.4855 -0.7400
vn -0.3030 0.8039 -0.5117
vn -0.1105 0.8261 -0.5526
vn 0.1105 0.8261 -0.5527
vn 0.3030 0.8039 -0.5117
vn 0.4654 0.4855 -0.7400
vn 0.5217 0.1657 -0.8369
vn 0.5274 0.0206 -0.8494
vn 0.1690 0.0200 -0.9854
vn -0.1690 0.0200 -0.9854
vn 0.9726 0.1601 -0.1686
vn 0.8620 0.4826 -0.1551
vn 0.9726 0.1601 0.1686
vn 0.8620 0.4826 0.1551
vn 0.8494 0.0206 -0.5274
vn 0.8369 0.1657 -0.5217
vn 0.7400 0.4855 -0.4654
vn 0.5117 0.8039 -0.3030
vn 0.5527 0.8261 -0.1105
vn 0.5527 0.8261 0.1105
vn 0.5117 0.8039 0.3030
vn 0.7400 0.4855 0.4654
vn 0.8369 0.1657 0.5217
vn 0.8494 0.0206 0.5274
vn 0.9854 0.0200 0.1690
vn 0.9854 0.0200 -0.1690
vn 0.1686 0.1601 0.9726
vn 0.1551 0.4826 0.8620
vn -0.1686 0.1601 0.9726
vn -0.1551 0.4826 0.8620
vn 0.5274 0.0206 0.8494
vn 0.5217 0.1657 0.8369
vn 0.4654 0.4855 0.7400
vn 0.3030 0.8039 0.5117
vn 0.1105 0.8261 0.5526
vn -0.1105 0.8261 0.5527
vn -0.3030 0.8039 0.5117
vn -0.4654 0.4855 0.7400
vn -0.5217 0.1657 0.8369
vn -0.5274 0.0206 0.8494
vn -0.1690 0.0200 0.9854
vn 0.1690 0.0200 0.9854
vn -0.1022 -0.9897 -0.1003
vn 0.1003 -0.9897 -0.1022
vn -0.1022 -0.9897 0.1003
vn 0.1003 -0.9897 0.1022
vn -0.3084 -0.8885 -0.3397
vn -0.1096 -0.9337 -0.3407
vn 0.1097 -0.9338 -0.3406
vn 0.3397 -0.8885 -0.3084
vn 0.3407 -0.9337 -0.1096
vn 0.3407 -0.9337 0.1096
vn 0.3397 -0.8885 0.3084
vn 0.1097 -0.9338 0.3406
vn -0.1096 -0.9337 0.3407
vn -0.3084 -0.8885 0.3397
vn -0.3406 -0.9338 0.1097
vn -0.3406 -0.9338 -0.1097
vn 0.0727 0.9948 -0.0714
vn -0.0714 0.9948 -0.0727
vn 0.0727 0.9948 0.0714
vn -0.0714 0.9948 0.0727
vn 0.2307 0.9393 -0.2541
vn -0.2541 0.9393 -0.2307
vn -0.2541 0.9393 0.2307
vn 0.2307 0.9393 0.2541
vn 0.1597 -0.3908 0.9065
vn 0.1677 -0.1349 0.9766
vn -0.1597 -0.3908 0.9065
vn -0.1677 -0.1349 0.9766
vn 0.4149 -0.6531 0.6336
vn 0.4891 -0.3911 0.7796
vn 0.5224 -0.1399 0.8411
vn 0.5273 -0.0193 0.8495
vn 0.1689 -0.0187 0.9854
vn -0.1689 -0.0187 0.9854
vn -0.5273 -0.0193 0.8495
vn -0.5224 -0.1399 0.8411
vn -0.4891 -0.3911 0.7796
vn -0.4149 -0.6531 0.6336
vn -0.1363 -0.7252 0.6749
vn 0.1363 -0.7252 0.6749
vn 0.9065 -0.3908 -0.1597
vn 0.9766 -0.1349 -0.1677
vn 0.9065 -0.3908 0.1597
vn 0.9766 -0.1349 0.1677
vn 0.6336 -0.6531 -0.4149
vn 0.7796 -0.3911 -0.4891
vn 0.8411 -0.1399 -0.5224
vn 0.8495 -0.0193 -0.5273
vn 0.9854 -0.0187 -0.1689
vn 0.9854 -0.0187 0.1689
vn 0.8495 -0.0193 0.5273
vn 0.8411 -0.1399 0.5224
vn 0.7796 -0.3911 0.4891
vn 0.6336 -0.6531 0.4149
vn 0.6749 -0.7252 0.1363
vn 0.6749 -0.7252 -0.1363
vn -0.1597 -0.3908 -0.9065
vn -0.1677 -0.1349 -0.9766
vn 0.1597 -0.3908 -0.9065
vn 0.1677 -0.1349 -0.9766
vn -0.4149 -0.6531 -0.6336
vn -0.4891 -0.3911 -0.7796
vn -0.5224 -0.1399 -0.8411
vn -0.5273 -0.0193 -0.8495
vn -0.1689 -0.0187 -0.9854
vn 0.1689 -0.0187 -0.9854
vn 0.5273 -0.0193 -0.8495
vn 0.5224 -0.1399 -0.8411
vn 0.4891 -0.3911 -0.7796
vn 0.4149 -0.6531 -0.6336
vn 0.1363 -0.7252 -0.6749
vn -0.1363 -0.7252 -0.6749
vn -0.9065 -0.3908 0.1597
vn -0.9766 -0.1349 0.1677
vn -0.9065 -0.3908 -0.1597
vn -0.9766 -0.1349 -0.1677
vn -0.6336 -0.6531 0.4149
vn -0.7796 -0.3911 0.4891
vn -0.8411 -0.1399 0.5224
vn -0.8495 -0.0193 0.5273
vn -0.9854 -0.0187 0.1689
vn -0.9854 -0.0187 -0.1689
vn -0.8495 -0.0193 -0.5273
vn -0.8411 -0.1399 -0.5224
vn -0.7796 -0.3911 -0.4891
vn -0.6336 -0.6531 -0.4149
vn -0.6749 -0.7252 -0.1363
vn -0.6749 -0.7252 0.1363
vt 0.572770 0.062500
vt 0.544714 0.125000
vt 0.545696 0.062500
vt 0.599045 0.062500
vt 0.572454 0.125000
vt 0.572324 0.187500
vt 0.544340 0.187500
vt 0.598989 0.187500
vt 0.547295 0.000000
vt 0.516416 0.062500
vt 0.521393 0.000000
vt 0.573197 0.000000
vt 0.599098 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.599005 0.125000
vt 0.625000 0.250000
vt 0.598987 0.250000
vt 0.572305 0.250000
vt 0.544287 0.250000
vt 0.514337 0.187500
vt 0.514262 0.250000
vt 0.514857 0.125000
vt 0.572305 0.312500
vt 0.544287 0.375000
vt 0.544287 0.312500
vt 0.598987 0.312500
vt 0.572305 0.375000
vt 0.572369 0.437500
vt 0.544295 0.437500
vt 0.599122 0.437500
vt 0.514262 0.312500
vt 0.625000 0.375000
vt 0.598987 0.375000
vt 0.625000 0.500000
vt 0.600512 0.500000
vt 0.572814 0.500000
vt 0.544350 0.500000
vt 0.514262 0.437500
vt 0.514262 0.500000
vt 0.514262 0.375000
vt 0.573767 0.562500
vt 0.544541 0.625000
vt 0.544469 0.562500
vt 0.604412 0.562500
vt 0.574338 0.625000
vt 0.573767 0.687500
vt 0.544469 0.687500
vt 0.604412 0.687500
vt 0.514262 0.562500
vt 0.645332 0.625000
vt 0.606357 0.625000
vt 0.625000 0.750000
vt 0.600512 0.750000
vt 0.572814 0.750000
vt 0.544350 0.750000
vt 0.514262 0.687500
vt 0.514262 0.750000
vt 0.514262 0.625000
vt 0.572387 0.812500
vt 0.544714 0.875000
vt 0.544348 0.812500
vt 0.599124 0.812500
vt 0.572454 0.875000
vt 0.572770 0.937500
vt 0.545696 0.937500
vt 0.599045 0.937500
vt 0.514337 0.812500
vt 0.625000 0.875000
vt 0.599005 0.875000
vt 0.625000 1.000000
vt 0.599098 1.000000
vt 0.573197 1.000000
vt 0.547295 1.000000
vt 0.516416 0.937500
vt 0.521393 1.000000
vt 0.514857 0.875000
vt 0.187376 0.562500
vt 0.248561 0.625000
vt 0.187320 0.625000
vt 0.308756 0.562500
vt 0.249011 0.562500
vt 0.187376 0.687500
vt 0.308756 0.687500
vt 0.249011 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.363106 0.562500
vt 0.307284 0.625000
vt 0.363106 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.692790 0.562500
vt 0.752033 0.625000
vt 0.694870 0.625000
vt 0.812675 0.562500
vt 0.751398 0.562500
vt 0.692790 0.687500
vt 0.812675 0.687500
vt 0.751398 0.687500
vt 0.641806 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812754 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.641806 0.687500
vt 0.411394 0.812500
vt 0.447454 0.875000
vt 0.411506 0.875000
vt 0.447279 0.812500
vt 0.482214 0.875000
vt 0.411545 0.937500
vt 0.447770 0.937500
vt 0.375000 0.812500
vt 0.410408 0.750000
vt 0.446946 0.750000
vt 0.481834 0.812500
vt 0.481742 0.750000
vt 0.483196 0.937500
vt 0.484795 1.000000
vt 0.448197 1.000000
vt 0.411598 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.407648 0.562500
vt 0.445866 0.625000
vt 0.406271 0.625000
vt 0.446271 0.562500
vt 0.481607 0.625000
vt 0.407648 0.687500
vt 0.446271 0.687500
vt 0.410408 0.500000
vt 0.446946 0.500000
vt 0.481657 0.562500
vt 0.481742 0.500000
vt 0.481657 0.687500
vt 0.360610 0.625000
vt 0.411487 0.312500
vt 0.447305 0.375000
vt 0.411487 0.375000
vt 0.447305 0.312500
vt 0.481787 0.375000
vt 0.411391 0.437500
vt 0.447260 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.411487 0.250000
vt 0.447305 0.250000
vt 0.481787 0.312500
vt 0.481787 0.250000
vt 0.481781 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.411545 0.062500
vt 0.447454 0.125000
vt 0.411506 0.125000
vt 0.447770 0.062500
vt 0.482214 0.125000
vt 0.411489 0.187500
vt 0.447324 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.411598 0.000000
vt 0.448197 0.000000
vt 0.483196 0.062500
vt 0.484795 0.000000
vt 0.481840 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
s 0
f 236/194/163 238/195/163 235/196/163
f 237/197/164 239/198/164 236/194/164
f 238/195/165 242/199/165 241/200/165
f 239/198/166 243/201/166 242/199/166
f 178/202/167 232/203/167 171/204/167
f 179/205/168 235/196/168 178/202/168
f 180/206/169 236/194/169 179/205/169
f 164/207/170 237/197/170 180/206/170
f 237/197/171 182/208/171 240/209/171
f 182/208/172 243/201/172 240/209/172
f 243/201/173 166/210/173 214/211/173
f 242/199/174 214/211/174 215/212/174
f 241/200/175 215/212/175 216/213/175
f 234/214/176 216/213/176 172/215/176
f 233/216/177 241/200/177 234/214/177
f 235/196/178 233/216/178 232/203/178
f 245/217/179 247/218/179 244/219/179
f 246/220/180 248/221/180 245/217/180
f 247/218/181 251/222/181 250/223/181
f 248/221/182 252/224/182 251/222/182
f 216/213/183 229/225/183 172/215/183
f 215/212/184 244/219/184 216/213/184
f 214/211/185 245/217/185 215/212/185
f 166/210/186 246/220/186 214/211/186
f 246/220/187 191/226/187 249/227/187
f 191/226/188 252/224/188 249/227/188
f 252/224/189 170/228/189 217/229/189
f 251/222/190 217/229/190 218/230/190
f 250/223/191 218/230/191 219/231/191
f 231/232/192 219/231/192 173/233/192
f 230/234/193 250/223/193 231/232/193
f 244/219/194 230/234/194 229/225/194
f 254/235/195 256/236/195 253/237/195
f 255/238/196 257/239/196 254/235/196
f 256/236/197 260/240/197 259/241/197
f 257/239/198 261/242/198 260/240/198
f 219/231/199 226/243/199 173/233/199
f 218/230/200 253/237/200 219/231/200
f 217/229/201 254/235/201 218/230/201
f 170/228/202 255/238/202 217/229/202
f 255/238/203 200/244/203 258/245/203
f 200/244/204 261/242/204 258/245/204
f 261/242/205 168/246/205 220/247/205
f 260/240/206 220/247/206 221/248/206
f 259/241/207 221/248/207 222/249/207
f 228/250/208 222/249/208 174/251/208
f 227/252/209 259/241/209 228/250/209
f 253/237/210 227/252/210 226/243/210
f 263/253/211 265/254/211 262/255/211
f 264/256/212 266/257/212 263/253/212
f 265/254/213 269/258/213 268/259/213
f 266/257/214 270/260/214 269/258/214
f 222/249/215 223/261/215 174/251/215
f 221/248/216 262/255/216 222/249/216
f 220/247/217 263/253/217 221/248/217
f 168/246/218 264/256/218 220/247/218
f 264/256/219 209/262/219 267/263/219
f 209/262/220 270/260/220 267/263/220
f 270/260/221 164/264/221 180/265/221
f 269/258/222 180/265/222 179/266/222
f 268/259/223 179/266/223 178/267/223
f 225/268/224 178/267/224 171/269/224
f 224/270/225 268/259/225 225/268/225
f 262/255/226 224/270/226 223/261/226
f 271/271/227 275/272/227 274/273/227
f 273/274/228 275/272/228 272/275/228
f 275/272/229 277/276/229 274/273/229
f 275/272/230 279/277/230 278/278/230
f 165/279/231 271/271/231 175/280/231
f 189/281/232 272/275/232 271/271/232
f 187/282/233 272/275/233 188/283/233
f 169/284/234 273/274/234 187/282/234
f 198/285/235 276/286/235 273/274/235
f 276/286/236 196/287/236 279/277/236
f 279/277/237 167/288/237 207/289/237
f 278/278/238 207/289/238 206/290/238
f 278/278/239 205/291/239 277/276/239
f 277/276/240 163/292/240 177/293/240
f 274/273/241 177/293/241 176/294/241
f 175/280/242 274/273/242 176/294/242
f 280/295/243 284/296/243 283/297/243
f 282/298/244 284/296/244 281/299/244
f 284/296/245 286/300/245 283/297/245
f 284/296/246 288/301/246 287/302/246
f 170/228/247 280/295/247 199/303/247
f 192/304/248 281/299/248 280/295/248
f 190/305/249 281/299/249 191/306/249
f 166/307/250 282/298/250 190/305/250
f 183/308/251 285/309/251 282/298/251
f 285/309/252 181/310/252 288/301/252
f 288/301/253 164/311/253 210/312/253
f 287/302/254 210/312/254 209/313/254
f 287/302/255 208/314/255 286/300/255
f 286/300/256 168/246/256 201/315/256
f 283/297/257 201/315/257 200/244/257
f 199/303/258 283/297/258 200/244/258
f 289/316/259 293/317/259 292/318/259
f 290/319/260 294/320/260 293/317/260
f 293/317/261 295/321/261 292/318/261
f 294/320/262 296/322/262 293/317/262
f 167/288/263 289/316/263 207/323/263
f 204/324/264 290/319/264 289/316/264
f 203/325/265 291/326/265 290/319/265
f 202/327/266 223/261/266 291/326/266
f 291/326/267 224/270/267 294/320/267
f 224/270/268 297/328/268 294/320/268
f 225/268/269 213/329/269 297/328/269
f 297/328/270 212/330/270 296/322/270
f 296/322/271 211/331/271 295/321/271
f 295/321/272 163/332/272 205/333/272
f 206/334/273 295/321/273 205/333/273
f 289/316/274 206/334/274 207/323/274
f 298/335/275 302/336/275 301/337/275
f 299/338/276 303/339/276 302/336/276
f 302/336/277 304/340/277 301/337/277
f 303/339/278 305/341/278 302/336/278
f 169/284/279 298/335/279 198/285/279
f 195/342/280 299/338/280 298/335/280
f 194/343/281 300/344/281 299/338/281
f 193/345/282 226/243/282 300/344/282
f 300/344/283 227/252/283 303/339/283
f 227/252/284 306/346/284 303/339/284
f 228/250/285 202/327/285 306/346/285
f 306/346/286 203/325/286 305/341/286
f 305/341/287 204/324/287 304/340/287
f 304/340/288 167/288/288 196/287/288
f 197/347/289 304/340/289 196/287/289
f 298/335/290 197/347/290 198/285/290
f 307/348/291 311/349/291 310/350/291
f 308/351/292 312/352/292 311/349/292
f 311/349/293 313/353/293 310/350/293
f 312/352/294 314/354/294 311/349/294
f 165/355/295 307/348/295 189/356/295
f 186/357/296 308/351/296 307/348/296
f 185/358/297 309/359/297 308/351/297
f 184/360/298 229/225/298 309/359/298
f 309/359/299 230/234/299 312/352/299
f 230/234/300 315/361/300 312/352/300
f 231/232/301 193/345/301 315/361/301
f 315/361/302 194/343/302 314/354/302
f 314/354/303 195/342/303 313/353/303
f 313/353/304 169/284/304 187/362/304
f 188/363/305 313/353/305 187/362/305
f 307/348/306 188/363/306 189/356/306
f 316/364/307 320/365/307 319/366/307
f 317/367/308 321/368/308 320/365/308
f 320/365/309 322/369/309 319/366/309
f 321/368/310 323/370/310 320/365/310
f 163/371/311 316/364/311 177/372/311
f 211/373/312 317/367/312 316/364/312
f 212/374/313 318/375/313 317/367/313
f 213/376/314 232/203/314 318/375/314
f 318/375/315 233/216/315 321/368/315
f 233/216/316 324/377/316 321/368/316
f 234/214/317 184/360/317 324/377/317
f 324/377/318 185/358/318 323/370/318
f 323/370/319 186/357/319 322/369/319
f 322/369/320 165/355/320 175/378/320
f 176/379/321 322/369/321 175/378/321
f 316/364/322 176/379/322 177/372/322
f 236/194/323 239/198/323 238/195/323
f 237/197/324 240/209/324 239/198/324
f 238/195/325 239/198/325 242/199/325
f 239/198/326 240/209/326 243/201/326
f 178/202/327 235/196/327 232/203/327
f 179/205/328 236/194/328 235/196/328
f 180/206/329 237/197/329 236/194/329
f 164/207/330 181/380/330 237/197/330
f 237/197/331 181/380/331 182/208/331
f 182/208/332 183/381/332 243/201/332
f 243/201/333 183/381/333 166/210/333
f 242/199/334 243/201/334 214/211/334
f 241/200/335 242/199/335 215/212/335
f 234/214/336 241/200/336 216/213/336
f 233/216/337 238/195/337 241/200/337
f 235/196/338 238/195/338 233/216/338
f 245/217/339 248/221/339 247/218/339
f 246/220/340 249/227/340 248/221/340
f 247/218/341 248/221/341 251/222/341
f 248/221/342 249/227/342 252/224/342
f 216/213/343 244/219/343 229/225/343
f 215/212/344 245/217/344 244/219/344
f 214/211/345 246/220/345 245/217/345
f 166/210/346 190/382/346 246/220/346
f 246/220/347 190/382/347 191/226/347
f 191/226/348 192/383/348 252/224/348
f 252/224/349 192/383/349 170/228/349
f 251/222/350 252/224/350 217/229/350
f 250/223/351 251/222/351 218/230/351
f 231/232/352 250/223/352 219/231/352
f 230/234/353 247/218/353 250/223/353
f 244/219/354 247/218/354 230/234/354
f 254/235/355 257/239/355 256/236/355
f 255/238/356 258/245/356 257/239/356
f 256/236/357 257/239/357 260/240/357
f 257/239/358 258/245/358 261/242/358
f 219/231/359 253/237/359 226/243/359
f 218/230/360 254/235/360 253/237/360
f 217/229/361 255/238/361 254/235/361
f 170/228/362 199/303/362 255/238/362
f 255/238/363 199/303/363 200/244/363
f 200/244/364 201/315/364 261/242/364
f 261/242/365 201/315/365 168/246/365
f 260/240/366 261/242/366 220/247/366
f 259/241/367 260/240/367 221/248/367
f 228/250/368 259/241/368 222/249/368
f 227/252/369 256/236/369 259/241/369
f 253/237/370 256/236/370 227/252/370
f 263/253/371 266/257/371 265/254/371
f 264/256/372 267/263/372 266/257/372
f 265/254/373 266/257/373 269/258/373
f 266/257/374 267/263/374 270/260/374
f 222/249/375 262/255/375 223/261/375
f 221/248/376 263/253/376 262/255/376
f 220/247/377 264/256/377 263/253/377
f 168/246/378 208/384/378 264/256/378
f 264/256/379 208/384/379 209/262/379
f 209/262/380 210/385/380 270/260/380
f 270/260/381 210/385/381 164/264/381
f 269/258/382 270/260/382 180/265/382
f 268/259/383 269/258/383 179/266/383
f 225/268/384 268/259/384 178/267/384
f 224/270/385 265/254/385 268/259/385
f 262/255/386 265/254/386 224/270/386
f 271/271/387 272/275/387 275/272/387
f 273/274/388 276/286/388 275/272/388
f 275/272/389 278/278/389 277/276/389
f 275/272/390 276/286/390 279/277/390
f 165/279/391 189/281/391 271/271/391
f 189/281/392 188/283/392 272/275/392
f 187/282/393 273/274/393 272/275/393
f 169/284/394 198/285/394 273/274/394
f 198/285/395 197/347/395 276/286/395
f 276/286/396 197/347/396 196/287/396
f 279/277/397 196/287/397 167/288/397
f 278/278/398 279/277/398 207/289/398
f 278/278/399 206/290/399 205/291/399
f 277/276/400 205/291/400 163/292/400
f 274/273/401 277/276/401 177/293/401
f 175/280/402 271/271/402 274/273/402
f 280/295/403 281/299/403 284/296/403
f 282/298/404 285/309/404 284/296/404
f 284/296/405 287/302/405 286/300/405
f 284/296/406 285/309/406 288/301/406
f 170/228/407 192/304/407 280/295/407
f 192/304/248 191/306/248 281/299/248
f 190/305/249 282/298/249 281/299/249
f 166/307/408 183/308/408 282/298/408
f 183/308/251 182/386/251 285/309/251
f 285/309/252 182/386/252 181/310/252
f 288/301/409 181/310/409 164/311/409
f 287/302/254 288/301/254 210/312/254
f 287/302/255 209/313/255 208/314/255
f 286/300/410 208/314/410 168/246/410
f 283/297/257 286/300/257 201/315/257
f 199/303/258 280/295/258 283/297/258
f 289/316/411 290/319/411 293/317/411
f 290/319/412 291/326/412 294/320/412
f 293/317/413 296/322/413 295/321/413
f 294/320/414 297/328/414 296/322/414
f 167/288/415 204/324/415 289/316/415
f 204/324/416 203/325/416 290/319/416
f 203/325/417 202/327/417 291/326/417
f 202/327/418 174/251/418 223/261/418
f 291/326/419 223/261/419 224/270/419
f 224/270/420 225/268/420 297/328/420
f 225/268/421 171/269/421 213/329/421
f 297/328/422 213/329/422 212/330/422
f 296/322/423 212/330/423 211/331/423
f 295/321/424 211/331/424 163/332/424
f 206/334/425 292/318/425 295/321/425
f 289/316/426 292/318/426 206/334/426
f 298/335/427 299/338/427 302/336/427
f 299/338/428 300/344/428 303/339/428
f 302/336/429 305/341/429 304/340/429
f 303/339/430 306/346/430 305/341/430
f 169/284/431 195/342/431 298/335/431
f 195/342/432 194/343/432 299/338/432
f 194/343/433 193/345/433 300/344/433
f 193/345/434 173/233/434 226/243/434
f 300/344/435 226/243/435 227/252/435
f 227/252/436 228/250/436 306/346/436
f 228/250/437 174/251/437 202/327/437
f 306/346/438 202/327/438 203/325/438
f 305/341/439 203/325/439 204/324/439
f 304/340/440 204/324/440 167/288/440
f 197/347/441 301/337/441 304/340/441
f 298/335/442 301/337/442 197/347/442
f 307/348/443 308/351/443 311/349/443
f 308/351/444 309/359/444 312/352/444
f 311/349/445 314/354/445 313/353/445
f 312/352/446 315/361/446 314/354/446
f 165/355/447 186/357/447 307/348/447
f 186/357/448 185/358/448 308/351/448
f 185/358/449 184/360/449 309/359/449
f 184/360/450 172/215/450 229/225/450
f 309/359/451 229/225/451 230/234/451
f 230/234/452 231/232/452 315/361/452
f 231/232/453 173/233/453 193/345/453
f 315/361/454 193/345/454 194/343/454
f 314/354/455 194/343/455 195/342/455
f 313/353/456 195/342/456 169/284/456
f 188/363/457 310/350/457 313/353/457
f 307/348/458 310/350/458 188/363/458
f 316/364/459 317/367/459 320/365/459
f 317/367/460 318/375/460 321/368/460
f 320/365/461 323/370/461 322/369/461
f 321/368/462 324/377/462 323/370/462
f 163/371/463 211/373/463 316/364/463
f 211/373/464 212/374/464 317/367/464
f 212/374/465 213/376/465 318/375/465
f 213/376/466 171/204/466 232/203/466
f 318/375/467 232/203/467 233/216/467
f 233/216/468 234/214/468 324/377/468
f 234/214/469 172/215/469 184/360/469
f 324/377/470 184/360/470 185/358/470
f 323/370/471 185/358/471 186/357/471
f 322/369/472 186/357/472 165/355/472
f 176/379/473 319/366/473 322/369/473
f 316/364/474 319/366/474 176/379/474
o Cube.001
v -0.077255 -0.050590 -0.069076
v -0.077255 0.214111 -0.069076
v -0.077255 -0.050590 -0.245544
v -0.077255 0.214111 -0.245544
v 0.099213 -0.050590 -0.069076
v 0.099213 0.214111 -0.069076
v 0.099213 -0.050590 -0.245544
v 0.099213 0.214111 -0.245544
v -0.106666 0.094345 -0.039665
v -0.106666 0.094345 -0.274955
v 0.128624 0.094345 -0.274955
v 0.128624 0.094345 -0.039665
v -0.090125 -0.058127 -0.209659
v -0.096590 -0.061912 -0.157310
v -0.090125 -0.058127 -0.104961
v -0.106036 0.136153 -0.040295
v -0.101628 0.171659 -0.044703
v -0.090755 0.198504 -0.055576
v -0.090125 0.219445 -0.104961
v -0.096590 0.222124 -0.157310
v -0.090125 0.219445 -0.209659
v -0.106036 0.049315 -0.274325
v -0.101628 0.006357 -0.269917
v -0.090755 -0.028917 -0.259044
v 0.063328 -0.058127 -0.258414
v 0.010979 -0.061912 -0.264879
v -0.041370 -0.058127 -0.258414
v -0.041370 0.219445 -0.258414
v 0.010979 0.222124 -0.264879
v 0.063328 0.219445 -0.258414
v 0.127994 0.049315 -0.274325
v 0.123586 0.006357 -0.269917
v 0.112713 -0.028917 -0.259044
v 0.112083 -0.058127 -0.104961
v 0.118548 -0.061912 -0.157310
v 0.112083 -0.058127 -0.209659
v 0.112083 0.219445 -0.209659
v 0.118548 0.222124 -0.157310
v 0.112083 0.219445 -0.104961
v 0.127994 0.049315 -0.040295
v 0.123586 0.006357 -0.044703
v 0.112713 -0.028917 -0.055576
v -0.041370 -0.058127 -0.056206
v 0.010979 -0.061912 -0.049741
v 0.063328 -0.058127 -0.056206
v 0.063328 0.219445 -0.056206
v 0.010979 0.222124 -0.049741
v -0.041370 0.219445 -0.056206
v -0.090755 -0.028917 -0.055576
v -0.101628 0.006357 -0.044703
v -0.106036 0.049315 -0.040295
v -0.090755 0.198504 -0.259044
v -0.101628 0.171659 -0.269917
v -0.106036 0.136153 -0.274325
v 0.112713 0.198504 -0.259044
v 0.123586 0.171659 -0.269917
v 0.127994 0.136153 -0.274325
v 0.112713 0.198504 -0.055576
v 0.123586 0.171659 -0.044703
v 0.127994 0.136153 -0.040295
v 0.075316 0.094345 -0.006578
v 0.010979 0.094345 0.004452
v -0.053358 0.094345 -0.006578
v 0.161712 0.094345 -0.221647
v 0.172741 0.094345 -0.157310
v 0.161712 0.094345 -0.092973
v -0.053358 0.094345 -0.308043
v 0.010979 0.094345 -0.319072
v 0.075316 0.094345 -0.308043
v -0.139754 0.094345 -0.092973
v -0.150783 0.094345 -0.157310
v -0.139754 0.094345 -0.221647
v -0.138956 0.136202 -0.093320
v -0.133371 0.172054 -0.095747
v -0.118462 0.199980 -0.101172
v -0.149932 0.136252 -0.157310
v -0.143975 0.172449 -0.157310
v -0.127839 0.201271 -0.157310
v -0.138956 0.136202 -0.221301
v -0.133371 0.172054 -0.218873
v -0.118462 0.199980 -0.213448
v -0.053011 0.136202 -0.307245
v -0.050584 0.172054 -0.301660
v -0.045159 0.199980 -0.286751
v 0.010979 0.136252 -0.318221
v 0.010979 0.172449 -0.312264
v 0.010979 0.201271 -0.296129
v 0.074969 0.136202 -0.307245
v 0.072542 0.172054 -0.301660
v 0.067117 0.199980 -0.286751
v 0.160914 0.136202 -0.221301
v 0.155329 0.172054 -0.218873
v 0.140420 0.199980 -0.213448
v 0.171890 0.136252 -0.157310
v 0.165933 0.172449 -0.157310
v 0.149797 0.201271 -0.157310
v 0.160914 0.136202 -0.093320
v 0.155329 0.172054 -0.095747
v 0.140420 0.199980 -0.101172
v 0.074969 0.136202 -0.007375
v 0.072542 0.172054 -0.012960
v 0.067117 0.199980 -0.027869
v 0.010979 0.136252 0.003601
v 0.010979 0.172449 -0.002357
v 0.010979 0.201271 -0.018492
v -0.053011 0.136202 -0.007375
v -0.050584 0.172054 -0.012960
v -0.045159 0.199980 -0.027869
v -0.044812 -0.074253 -0.213102
v 0.010979 -0.079713 -0.216099
v 0.066770 -0.074253 -0.213102
v -0.047810 -0.079713 -0.157310
v 0.010979 -0.085673 -0.157310
v 0.069767 -0.079713 -0.157310
v -0.044812 -0.074253 -0.101519
v 0.010979 -0.079713 -0.098522
v 0.066770 -0.074253 -0.101519
v 0.066770 0.230858 -0.213102
v 0.010979 0.234722 -0.216099
v -0.044812 0.230858 -0.213102
v 0.069767 0.234722 -0.157310
v 0.010979 0.238940 -0.157310
v -0.047810 0.234722 -0.157310
v 0.066770 0.230858 -0.101519
v 0.010979 0.234722 -0.098522
v -0.044812 0.230858 -0.101519
v 0.067117 -0.031002 -0.027869
v 0.072542 0.005799 -0.012960
v 0.074969 0.049245 -0.007375
v 0.010979 -0.032827 -0.018492
v 0.010979 0.005241 -0.002357
v 0.010979 0.049175 0.003601
v -0.045159 -0.031003 -0.027869
v -0.050584 0.005799 -0.012960
v -0.053011 0.049245 -0.007375
v 0.140420 -0.031002 -0.213448
v 0.155329 0.005799 -0.218873
v 0.160914 0.049245 -0.221301
v 0.149797 -0.032827 -0.157310
v 0.165933 0.005241 -0.157310
v 0.171890 0.049175 -0.157310
v 0.140420 -0.031003 -0.101172
v 0.155329 0.005799 -0.095747
v 0.160914 0.049245 -0.093320
v -0.045159 -0.031002 -0.286751
v -0.050584 0.005799 -0.301660
v -0.053011 0.049245 -0.307245
v 0.010979 -0.032827 -0.296129
v 0.010979 0.005241 -0.312264
v 0.010979 0.049175 -0.318221
v 0.067117 -0.031003 -0.286751
v 0.072542 0.005799 -0.301660
v 0.074969 0.049245 -0.307245
v -0.118462 -0.031002 -0.101172
v -0.133371 0.005799 -0.095747
v -0.138956 0.049245 -0.093320
v -0.127839 -0.032827 -0.157310
v -0.143975 0.005241 -0.157310
v -0.149932 0.049175 -0.157310
v -0.118462 -0.031003 -0.213448
v -0.133371 0.005799 -0.218873
v -0.138956 0.049245 -0.221301
vn -0.9724 0.1628 0.1669
vn -0.8595 0.4882 0.1512
vn -0.9724 0.1628 -0.1669
vn -0.8595 0.4882 -0.1512
vn -0.8495 0.0207 0.5272
vn -0.8374 0.1685 0.5200
vn -0.7398 0.4875 0.4638
vn -0.5365 0.7675 0.3508
vn -0.5516 0.8267 0.1112
vn -0.5516 0.8267 -0.1112
vn -0.5365 0.7675 -0.3508
vn -0.7398 0.4875 -0.4638
vn -0.8374 0.1685 -0.5200
vn -0.8495 0.0207 -0.5272
vn -0.9854 0.0202 -0.1689
vn -0.9854 0.0202 0.1689
vn -0.1669 0.1628 -0.9724
vn -0.1512 0.4882 -0.8595
vn 0.1669 0.1628 -0.9724
vn 0.1512 0.4882 -0.8595
vn -0.5272 0.0207 -0.8495
vn -0.5200 0.1685 -0.8374
vn -0.4638 0.4875 -0.7398
vn -0.3508 0.7675 -0.5365
vn -0.1112 0.8267 -0.5516
vn 0.1112 0.8267 -0.5516
vn 0.3508 0.7675 -0.5365
vn 0.4638 0.4875 -0.7398
vn 0.5200 0.1685 -0.8374
vn 0.5272 0.0207 -0.8495
vn 0.1689 0.0202 -0.9854
vn -0.1689 0.0202 -0.9854
vn 0.9724 0.1628 -0.1669
vn 0.8595 0.4882 -0.1512
vn 0.9724 0.1628 0.1669
vn 0.8595 0.4882 0.1512
vn 0.8495 0.0207 -0.5272
vn 0.8374 0.1685 -0.5200
vn 0.7398 0.4875 -0.4638
vn 0.5365 0.7675 -0.3508
vn 0.5516 0.8267 -0.1112
vn 0.5516 0.8267 0.1112
vn 0.5365 0.7675 0.3508
vn 0.7398 0.4875 0.4638
vn 0.8374 0.1685 0.5200
vn 0.8495 0.0207 0.5272
vn 0.9854 0.0202 0.1689
vn 0.9854 0.0202 -0.1689
vn 0.1669 0.1628 0.9724
vn 0.1512 0.4882 0.8595
vn -0.1669 0.1628 0.9724
vn -0.1512 0.4882 0.8595
vn 0.5272 0.0207 0.8495
vn 0.5200 0.1685 0.8374
vn 0.4638 0.4875 0.7398
vn 0.3508 0.7675 0.5365
vn 0.1112 0.8267 0.5516
vn -0.1112 0.8267 0.5516
vn -0.3508 0.7675 0.5365
vn -0.4638 0.4875 0.7398
vn -0.5200 0.1685 0.8374
vn -0.5272 0.0207 0.8495
vn -0.1689 0.0202 0.9854
vn 0.1689 0.0202 0.9854
vn -0.1003 -0.9897 -0.1022
vn 0.1022 -0.9897 -0.1003
vn -0.1003 -0.9897 0.1022
vn 0.1022 -0.9897 0.1003
vn -0.3397 -0.8885 -0.3084
vn -0.1097 -0.9338 -0.3406
vn 0.1096 -0.9337 -0.3407
vn 0.3084 -0.8885 -0.3397
vn 0.3406 -0.9338 -0.1097
vn 0.3406 -0.9338 0.1097
vn 0.3084 -0.8885 0.3397
vn 0.1096 -0.9337 0.3407
vn -0.1097 -0.9338 0.3406
vn -0.3397 -0.8885 0.3084
vn -0.3407 -0.9337 0.1096
vn -0.3407 -0.9337 -0.1096
vn 0.0714 0.9948 -0.0727
vn -0.0727 0.9948 -0.0714
vn 0.0714 0.9948 0.0727
vn -0.0727 0.9948 0.0714
vn 0.2541 0.9393 -0.2307
vn 0.0802 0.9651 -0.2492
vn -0.0802 0.9651 -0.2492
vn -0.2307 0.9393 -0.2541
vn -0.2492 0.9651 -0.0802
vn -0.2492 0.9651 0.0802
vn -0.2307 0.9393 0.2541
vn -0.0802 0.9651 0.2492
vn 0.0802 0.9651 0.2492
vn 0.2541 0.9393 0.2307
vn 0.2492 0.9651 0.0802
vn 0.2492 0.9651 -0.0802
vn 0.1642 -0.3849 0.9082
vn 0.1694 -0.1324 0.9766
vn -0.1642 -0.3849 0.9082
vn -0.1694 -0.1324 0.9766
vn 0.3674 -0.6959 0.6170
vn 0.4915 -0.3883 0.7796
vn 0.5242 -0.1373 0.8405
vn 0.5274 -0.0191 0.8494
vn 0.1690 -0.0186 0.9854
vn -0.1690 -0.0186 0.9854
vn -0.5274 -0.0191 0.8494
vn -0.5242 -0.1373 0.8405
vn -0.4915 -0.3883 0.7796
vn -0.3674 -0.6959 0.6170
vn -0.1358 -0.7247 0.6755
vn 0.1358 -0.7247 0.6755
vn 0.9082 -0.3849 -0.1642
vn 0.9766 -0.1324 -0.1694
vn 0.9082 -0.3849 0.1642
vn 0.9766 -0.1324 0.1694
vn 0.6170 -0.6959 -0.3674
vn 0.7796 -0.3883 -0.4915
vn 0.8405 -0.1373 -0.5242
vn 0.8494 -0.0191 -0.5274
vn 0.9854 -0.0186 -0.1690
vn 0.9854 -0.0186 0.1690
vn 0.8494 -0.0191 0.5274
vn 0.8405 -0.1373 0.5242
vn 0.7796 -0.3883 0.4915
vn 0.6170 -0.6959 0.3674
vn 0.6755 -0.7247 0.1358
vn 0.6755 -0.7247 -0.1358
vn -0.1642 -0.3849 -0.9082
vn -0.1694 -0.1324 -0.9766
vn 0.1642 -0.3849 -0.9082
vn 0.1694 -0.1324 -0.9766
vn -0.3674 -0.6959 -0.6170
vn -0.4915 -0.3883 -0.7796
vn -0.5242 -0.1373 -0.8405
vn -0.5274 -0.0191 -0.8494
vn -0.1690 -0.0186 -0.9854
vn 0.1690 -0.0186 -0.9854
vn 0.5274 -0.0191 -0.8494
vn 0.5242 -0.1373 -0.8405
vn 0.4915 -0.3883 -0.7796
vn 0.3674 -0.6959 -0.6170
vn 0.1358 -0.7247 -0.6755
vn -0.1358 -0.7247 -0.6755
vn -0.9082 -0.3849 0.1642
vn -0.9766 -0.1324 0.1694
vn -0.9082 -0.3849 -0.1642
vn -0.9766 -0.1324 -0.1694
vn -0.6170 -0.6959 0.3674
vn -0.7796 -0.3883 0.4915
vn -0.8405 -0.1373 0.5242
vn -0.8494 -0.0191 0.5274
vn -0.9854 -0.0186 0.1690
vn -0.9854 -0.0186 -0.1690
vn -0.8494 -0.0191 -0.5274
vn -0.8405 -0.1373 -0.5242
vn -0.7796 -0.3883 -0.4915
vn -0.6170 -0.6959 -0.3674
vn -0.6755 -0.7247 -0.1358
vn -0.6755 -0.7247 0.1358
vn -0.9726 0.1601 0.1686
vn -0.8620 0.4826 0.1551
vn -0.9726 0.1601 -0.1686
vn -0.8620 0.4826 -0.1551
vn -0.8494 0.0206 0.5274
vn -0.8369 0.1657 0.5217
vn -0.7400 0.4855 0.4654
vn -0.5117 0.8039 0.3030
vn -0.5527 0.8261 0.1105
vn -0.5526 0.8261 -0.1105
vn -0.5117 0.8039 -0.3030
vn -0.7400 0.4855 -0.4654
vn -0.8369 0.1657 -0.5217
vn -0.8494 0.0206 -0.5274
vn -0.9854 0.0200 -0.1690
vn -0.9854 0.0200 0.1690
vn -0.1686 0.1601 -0.9726
vn -0.1551 0.4826 -0.8620
vn 0.1686 0.1601 -0.9726
vn 0.1551 0.4826 -0.8620
vn -0.5274 0.0206 -0.8494
vn -0.5217 0.1657 -0.8369
vn -0.4654 0.4855 -0.7400
vn -0.3030 0.8039 -0.5117
vn -0.1105 0.8261 -0.5526
vn 0.1105 0.8261 -0.5527
vn 0.3030 0.8039 -0.5117
vn 0.4654 0.4855 -0.7400
vn 0.5217 0.1657 -0.8369
vn 0.5274 0.0206 -0.8494
vn 0.1690 0.0200 -0.9854
vn -0.1690 0.0200 -0.9854
vn 0.9726 0.1601 -0.1686
vn 0.8620 0.4826 -0.1551
vn 0.9726 0.1601 0.1686
vn 0.8620 0.4826 0.1551
vn 0.8494 0.0206 -0.5274
vn 0.8369 0.1657 -0.5217
vn 0.7400 0.4855 -0.4654
vn 0.5117 0.8039 -0.3030
vn 0.5527 0.8261 -0.1105
vn 0.5527 0.8261 0.1105
vn 0.5117 0.8039 0.3030
vn 0.7400 0.4855 0.4654
vn 0.8369 0.1657 0.5217
vn 0.8494 0.0206 0.5274
vn 0.9854 0.0200 0.1690
vn 0.9854 0.0200 -0.1690
vn 0.1686 0.1601 0.9726
vn 0.1551 0.4826 0.8620
vn -0.1686 0.1601 0.9726
vn -0.1551 0.4826 0.8620
vn 0.5274 0.0206 0.8494
vn 0.5217 0.1657 0.8369
vn 0.4654 0.4855 0.7400
vn 0.3030 0.8039 0.5117
vn 0.1105 0.8261 0.5526
vn -0.1105 0.8261 0.5527
vn -0.3030 0.8039 0.5117
vn -0.4654 0.4855 0.7400
vn -0.5217 0.1657 0.8369
vn -0.5274 0.0206 0.8494
vn -0.1690 0.0200 0.9854
vn 0.1690 0.0200 0.9854
vn -0.1022 -0.9897 -0.1003
vn 0.1003 -0.9897 -0.1022
vn -0.1022 -0.9897 0.1003
vn 0.1003 -0.9897 0.1022
vn -0.3084 -0.8885 -0.3397
vn -0.1096 -0.9337 -0.3407
vn 0.1097 -0.9338 -0.3406
vn 0.3397 -0.8885 -0.3084
vn 0.3407 -0.9337 -0.1096
vn 0.3407 -0.9337 0.1096
vn 0.3397 -0.8885 0.3084
vn 0.1097 -0.9338 0.3406
vn -0.1096 -0.9337 0.3407
vn -0.3084 -0.8885 0.3397
vn -0.3406 -0.9338 0.1097
vn -0.3406 -0.9338 -0.1097
vn 0.0727 0.9948 -0.0714
vn -0.0714 0.9948 -0.0727
vn 0.0727 0.9948 0.0714
vn -0.0714 0.9948 0.0727
vn 0.2307 0.9393 -0.2541
vn -0.2541 0.9393 -0.2307
vn -0.2541 0.9393 0.2307
vn 0.2307 0.9393 0.2541
vn 0.1597 -0.3908 0.9065
vn 0.1677 -0.1349 0.9766
vn -0.1597 -0.3908 0.9065
vn -0.1677 -0.1349 0.9766
vn 0.4149 -0.6531 0.6336
vn 0.4891 -0.3911 0.7796
vn 0.5224 -0.1399 0.8411
vn 0.5273 -0.0193 0.8495
vn 0.1689 -0.0187 0.9854
vn -0.1689 -0.0187 0.9854
vn -0.5273 -0.0193 0.8495
vn -0.5224 -0.1399 0.8411
vn -0.4891 -0.3911 0.7796
vn -0.4149 -0.6531 0.6336
vn -0.1363 -0.7252 0.6749
vn 0.1363 -0.7252 0.6749
vn 0.9065 -0.3908 -0.1597
vn 0.9766 -0.1349 -0.1677
vn 0.9065 -0.3908 0.1597
vn 0.9766 -0.1349 0.1677
vn 0.6336 -0.6531 -0.4149
vn 0.7796 -0.3911 -0.4891
vn 0.8411 -0.1399 -0.5224
vn 0.8495 -0.0193 -0.5273
vn 0.9854 -0.0187 -0.1689
vn 0.9854 -0.0187 0.1689
vn 0.8495 -0.0193 0.5273
vn 0.8411 -0.1399 0.5224
vn 0.7796 -0.3911 0.4891
vn 0.6336 -0.6531 0.4149
vn 0.6749 -0.7252 0.1363
vn 0.6749 -0.7252 -0.1363
vn -0.1597 -0.3908 -0.9065
vn -0.1677 -0.1349 -0.9766
vn 0.1597 -0.3908 -0.9065
vn 0.1677 -0.1349 -0.9766
vn -0.4149 -0.6531 -0.6336
vn -0.4891 -0.3911 -0.7796
vn -0.5224 -0.1399 -0.8411
vn -0.5273 -0.0193 -0.8495
vn -0.1689 -0.0187 -0.9854
vn 0.1689 -0.0187 -0.9854
vn 0.5273 -0.0193 -0.8495
vn 0.5224 -0.1399 -0.8411
vn 0.4891 -0.3911 -0.7796
vn 0.4149 -0.6531 -0.6336
vn 0.1363 -0.7252 -0.6749
vn -0.1363 -0.7252 -0.6749
vn -0.9065 -0.3908 0.1597
vn -0.9766 -0.1349 0.1677
vn -0.9065 -0.3908 -0.1597
vn -0.9766 -0.1349 -0.1677
vn -0.6336 -0.6531 0.4149
vn -0.7796 -0.3911 0.4891
vn -0.8411 -0.1399 0.5224
vn -0.8495 -0.0193 0.5273
vn -0.9854 -0.0187 0.1689
vn -0.9854 -0.0187 -0.1689
vn -0.8495 -0.0193 -0.5273
vn -0.8411 -0.1399 -0.5224
vn -0.7796 -0.3911 -0.4891
vn -0.6336 -0.6531 -0.4149
vn -0.6749 -0.7252 -0.1363
vn -0.6749 -0.7252 0.1363
vt 0.572770 0.062500
vt 0.544714 0.125000
vt 0.545696 0.062500
vt 0.599045 0.062500
vt 0.572454 0.125000
vt 0.572324 0.187500
vt 0.544340 0.187500
vt 0.598989 0.187500
vt 0.547295 0.000000
vt 0.516416 0.062500
vt 0.521393 0.000000
vt 0.573197 0.000000
vt 0.599098 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.599005 0.125000
vt 0.625000 0.250000
vt 0.598987 0.250000
vt 0.572305 0.250000
vt 0.544287 0.250000
vt 0.514337 0.187500
vt 0.514262 0.250000
vt 0.514857 0.125000
vt 0.572305 0.312500
vt 0.544287 0.375000
vt 0.544287 0.312500
vt 0.598987 0.312500
vt 0.572305 0.375000
vt 0.572369 0.437500
vt 0.544295 0.437500
vt 0.599122 0.437500
vt 0.514262 0.312500
vt 0.625000 0.375000
vt 0.598987 0.375000
vt 0.625000 0.500000
vt 0.600512 0.500000
vt 0.572814 0.500000
vt 0.544350 0.500000
vt 0.514262 0.437500
vt 0.514262 0.500000
vt 0.514262 0.375000
vt 0.573767 0.562500
vt 0.544541 0.625000
vt 0.544469 0.562500
vt 0.604412 0.562500
vt 0.574338 0.625000
vt 0.573767 0.687500
vt 0.544469 0.687500
vt 0.604412 0.687500
vt 0.514262 0.562500
vt 0.645332 0.625000
vt 0.606357 0.625000
vt 0.625000 0.750000
vt 0.600512 0.750000
vt 0.572814 0.750000
vt 0.544350 0.750000
vt 0.514262 0.687500
vt 0.514262 0.750000
vt 0.514262 0.625000
vt 0.572387 0.812500
vt 0.544714 0.875000
vt 0.544348 0.812500
vt 0.599124 0.812500
vt 0.572454 0.875000
vt 0.572770 0.937500
vt 0.545696 0.937500
vt 0.599045 0.937500
vt 0.514337 0.812500
vt 0.625000 0.875000
vt 0.599005 0.875000
vt 0.625000 1.000000
vt 0.599098 1.000000
vt 0.573197 1.000000
vt 0.547295 1.000000
vt 0.516416 0.937500
vt 0.521393 1.000000
vt 0.514857 0.875000
vt 0.187376 0.562500
vt 0.248561 0.625000
vt 0.187320 0.625000
vt 0.308756 0.562500
vt 0.249011 0.562500
vt 0.187376 0.687500
vt 0.308756 0.687500
vt 0.249011 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.363106 0.562500
vt 0.307284 0.625000
vt 0.363106 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.692790 0.562500
vt 0.752033 0.625000
vt 0.694870 0.625000
vt 0.812675 0.562500
vt 0.751398 0.562500
vt 0.692790 0.687500
vt 0.812675 0.687500
vt 0.751398 0.687500
vt 0.641806 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812754 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.641806 0.687500
vt 0.411394 0.812500
vt 0.447454 0.875000
vt 0.411506 0.875000
vt 0.447279 0.812500
vt 0.482214 0.875000
vt 0.411545 0.937500
vt 0.447770 0.937500
vt 0.375000 0.812500
vt 0.410408 0.750000
vt 0.446946 0.750000
vt 0.481834 0.812500
vt 0.481742 0.750000
vt 0.483196 0.937500
vt 0.484795 1.000000
vt 0.448197 1.000000
vt 0.411598 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.407648 0.562500
vt 0.445866 0.625000
vt 0.406271 0.625000
vt 0.446271 0.562500
vt 0.481607 0.625000
vt 0.407648 0.687500
vt 0.446271 0.687500
vt 0.410408 0.500000
vt 0.446946 0.500000
vt 0.481657 0.562500
vt 0.481742 0.500000
vt 0.481657 0.687500
vt 0.360610 0.625000
vt 0.411487 0.312500
vt 0.447305 0.375000
vt 0.411487 0.375000
vt 0.447305 0.312500
vt 0.481787 0.375000
vt 0.411391 0.437500
vt 0.447260 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.411487 0.250000
vt 0.447305 0.250000
vt 0.481787 0.312500
vt 0.481787 0.250000
vt 0.481781 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.411545 0.062500
vt 0.447454 0.125000
vt 0.411506 0.125000
vt 0.447770 0.062500
vt 0.482214 0.125000
vt 0.411489 0.187500
vt 0.447324 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.411598 0.000000
vt 0.448197 0.000000
vt 0.483196 0.062500
vt 0.484795 0.000000
vt 0.481840 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
s 0
f 398/387/475 400/388/475 397/389/475
f 399/390/476 401/391/476 398/387/476
f 400/388/477 404/392/477 403/393/477
f 401/391/478 405/394/478 404/392/478
f 340/395/479 394/396/479 333/397/479
f 341/398/480 397/389/480 340/395/480
f 342/399/481 398/387/481 341/398/481
f 326/400/482 399/390/482 342/399/482
f 399/390/483 344/401/483 402/402/483
f 344/401/484 405/394/484 402/402/484
f 405/394/485 328/403/485 376/404/485
f 404/392/486 376/404/486 377/405/486
f 403/393/487 377/405/487 378/406/487
f 396/407/488 378/406/488 334/408/488
f 395/409/489 403/393/489 396/407/489
f 397/389/490 395/409/490 394/396/490
f 407/410/491 409/411/491 406/412/491
f 408/413/492 410/414/492 407/410/492
f 409/411/493 413/415/493 412/416/493
f 410/414/494 414/417/494 413/415/494
f 378/406/495 391/418/495 334/408/495
f 377/405/496 406/412/496 378/406/496
f 376/404/497 407/410/497 377/405/497
f 328/403/498 408/413/498 376/404/498
f 408/413/499 353/419/499 411/420/499
f 353/419/500 414/417/500 411/420/500
f 414/417/501 332/421/501 379/422/501
f 413/415/502 379/422/502 380/423/502
f 412/416/503 380/423/503 381/424/503
f 393/425/504 381/424/504 335/426/504
f 392/427/505 412/416/505 393/425/505
f 406/412/506 392/427/506 391/418/506
f 416/428/507 418/429/507 415/430/507
f 417/431/508 419/432/508 416/428/508
f 418/429/509 422/433/509 421/434/509
f 419/432/510 423/435/510 422/433/510
f 381/424/511 388/436/511 335/426/511
f 380/423/512 415/430/512 381/424/512
f 379/422/513 416/428/513 380/423/513
f 332/421/514 417/431/514 379/422/514
f 417/431/515 362/437/515 420/438/515
f 362/437/516 423/435/516 420/438/516
f 423/435/517 330/439/517 382/440/517
f 422/433/518 382/440/518 383/441/518
f 421/434/519 383/441/519 384/442/519
f 390/443/520 384/442/520 336/444/520
f 389/445/521 421/434/521 390/443/521
f 415/430/522 389/445/522 388/436/522
f 425/446/523 427/447/523 424/448/523
f 426/449/524 428/450/524 425/446/524
f 427/447/525 431/451/525 430/452/525
f 428/450/526 432/453/526 431/451/526
f 384/442/527 385/454/527 336/444/527
f 383/441/528 424/448/528 384/442/528
f 382/440/529 425/446/529 383/441/529
f 330/439/530 426/449/530 382/440/530
f 426/449/531 371/455/531 429/456/531
f 371/455/532 432/453/532 429/456/532
f 432/453/533 326/457/533 342/458/533
f 431/451/534 342/458/534 341/459/534
f 430/452/535 341/459/535 340/460/535
f 387/461/536 340/460/536 333/462/536
f 386/463/537 430/452/537 387/461/537
f 424/448/538 386/463/538 385/454/538
f 433/464/539 437/465/539 436/466/539
f 435/467/540 437/465/540 434/468/540
f 437/465/541 439/469/541 436/466/541
f 437/465/542 441/470/542 440/471/542
f 327/472/543 433/464/543 337/473/543
f 351/474/544 434/468/544 433/464/544
f 349/475/545 434/468/545 350/476/545
f 331/477/546 435/467/546 349/475/546
f 360/478/547 438/479/547 435/467/547
f 438/479/548 358/480/548 441/470/548
f 441/470/549 329/481/549 369/482/549
f 440/471/550 369/482/550 368/483/550
f 440/471/551 367/484/551 439/469/551
f 439/469/552 325/485/552 339/486/552
f 436/466/553 339/486/553 338/487/553
f 337/473/554 436/466/554 338/487/554
f 442/488/555 446/489/555 445/490/555
f 444/491/556 446/489/556 443/492/556
f 446/489/557 448/493/557 445/490/557
f 446/489/558 450/494/558 449/495/558
f 332/421/559 442/488/559 361/496/559
f 354/497/560 443/492/560 442/488/560
f 352/498/561 443/492/561 353/499/561
f 328/500/562 444/491/562 352/498/562
f 345/501/563 447/502/563 444/491/563
f 447/502/564 343/503/564 450/494/564
f 450/494/565 326/504/565 372/505/565
f 449/495/566 372/505/566 371/506/566
f 449/495/567 370/507/567 448/493/567
f 448/493/568 330/439/568 363/508/568
f 445/490/569 363/508/569 362/437/569
f 361/496/570 445/490/570 362/437/570
f 451/509/571 455/510/571 454/511/571
f 452/512/572 456/513/572 455/510/572
f 455/510/573 457/514/573 454/511/573
f 456/513/574 458/515/574 455/510/574
f 329/481/575 451/509/575 369/516/575
f 366/517/576 452/512/576 451/509/576
f 365/518/577 453/519/577 452/512/577
f 364/520/578 385/454/578 453/519/578
f 453/519/579 386/463/579 456/513/579
f 386/463/580 459/521/580 456/513/580
f 387/461/581 375/522/581 459/521/581
f 459/521/582 374/523/582 458/515/582
f 458/515/583 373/524/583 457/514/583
f 457/514/584 325/525/584 367/526/584
f 368/527/585 457/514/585 367/526/585
f 451/509/586 368/527/586 369/516/586
f 460/528/587 464/529/587 463/530/587
f 461/531/588 465/532/588 464/529/588
f 464/529/589 466/533/589 463/530/589
f 465/532/590 467/534/590 464/529/590
f 331/477/591 460/528/591 360/478/591
f 357/535/592 461/531/592 460/528/592
f 356/536/593 462/537/593 461/531/593
f 355/538/594 388/436/594 462/537/594
f 462/537/595 389/445/595 465/532/595
f 389/445/596 468/539/596 465/532/596
f 390/443/597 364/520/597 468/539/597
f 468/539/598 365/518/598 467/534/598
f 467/534/599 366/517/599 466/533/599
f 466/533/600 329/481/600 358/480/600
f 359/540/601 466/533/601 358/480/601
f 460/528/602 359/540/602 360/478/602
f 469/541/603 473/542/603 472/543/603
f 470/544/604 474/545/604 473/542/604
f 473/542/605 475/546/605 472/543/605
f 474/545/606 476/547/606 473/542/606
f 327/548/607 469/541/607 351/549/607
f 348/550/608 470/544/608 469/541/608
f 347/551/609 471/552/609 470/544/609
f 346/553/610 391/418/610 471/552/610
f 471/552/611 392/427/611 474/545/611
f 392/427/612 477/554/612 474/545/612
f 393/425/613 355/538/613 477/554/613
f 477/554/614 356/536/614 476/547/614
f 476/547/615 357/535/615 475/546/615
f 475/546/616 331/477/616 349/555/616
f 350/556/617 475/546/617 349/555/617
f 469/541/618 350/556/618 351/549/618
f 478/557/619 482/558/619 481/559/619
f 479/560/620 483/561/620 482/558/620
f 482/558/621 484/562/621 481/559/621
f 483/561/622 485/563/622 482/558/622
f 325/564/623 478/557/623 339/565/623
f 373/566/624 479/560/624 478/557/624
f 374/567/625 480/568/625 479/560/625
f 375/569/626 394/396/626 480/568/626
f 480/568/627 395/409/627 483/561/627
f 395/409/628 486/570/628 483/561/628
f 396/407/629 346/553/629 486/570/629
f 486/570/630 347/551/630 485/563/630
f 485/563/631 348/550/631 484/562/631
f 484/562/632 327/548/632 337/571/632
f 338/572/633 484/562/633 337/571/633
f 478/557/634 338/572/634 339/565/634
f 398/387/635 401/391/635 400/388/635
f 399/390/636 402/402/636 401/391/636
f 400/388/637 401/391/637 404/392/637
f 401/391/638 402/402/638 405/394/638
f 340/395/639 397/389/639 394/396/639
f 341/398/640 398/387/640 397/389/640
f 342/399/641 399/390/641 398/387/641
f 326/400/642 343/573/642 399/390/642
f 399/390/643 343/573/643 344/401/643
f 344/401/644 345/574/644 405/394/644
f 405/394/645 345/574/645 328/403/645
f 404/392/646 405/394/646 376/404/646
f 403/393/647 404/392/647 377/405/647
f 396/407/648 403/393/648 378/406/648
f 395/409/649 400/388/649 403/393/649
f 397/389/650 400/388/650 395/409/650
f 407/410/651 410/414/651 409/411/651
f 408/413/652 411/420/652 410/414/652
f 409/411/653 410/414/653 413/415/653
f 410/414/654 411/420/654 414/417/654
f 378/406/655 406/412/655 391/418/655
f 377/405/656 407/410/656 406/412/656
f 376/404/657 408/413/657 407/410/657
f 328/403/658 352/575/658 408/413/658
f 408/413/659 352/575/659 353/419/659
f 353/419/660 354/576/660 414/417/660
f 414/417/661 354/576/661 332/421/661
f 413/415/662 414/417/662 379/422/662
f 412/416/663 413/415/663 380/423/663
f 393/425/664 412/416/664 381/424/664
f 392/427/665 409/411/665 412/416/665
f 406/412/666 409/411/666 392/427/666
f 416/428/667 419/432/667 418/429/667
f 417/431/668 420/438/668 419/432/668
f 418/429/669 419/432/669 422/433/669
f 419/432/670 420/438/670 423/435/670
f 381/424/671 415/430/671 388/436/671
f 380/423/672 416/428/672 415/430/672
f 379/422/673 417/431/673 416/428/673
f 332/421/674 361/496/674 417/431/674
f 417/431/675 361/496/675 362/437/675
f 362/437/676 363/508/676 423/435/676
f 423/435/677 363/508/677 330/439/677
f 422/433/678 423/435/678 382/440/678
f 421/434/679 422/433/679 383/441/679
f 390/443/680 421/434/680 384/442/680
f 389/445/681 418/429/681 421/434/681
f 415/430/682 418/429/682 389/445/682
f 425/446/683 428/450/683 427/447/683
f 426/449/684 429/456/684 428/450/684
f 427/447/685 428/450/685 431/451/685
f 428/450/686 429/456/686 432/453/686
f 384/442/687 424/448/687 385/454/687
f 383/441/688 425/446/688 424/448/688
f 382/440/689 426/449/689 425/446/689
f 330/439/690 370/577/690 426/449/690
f 426/449/691 370/577/691 371/455/691
f 371/455/692 372/578/692 432/453/692
f 432/453/693 372/578/693 326/457/693
f 431/451/694 432/453/694 342/458/694
f 430/452/695 431/451/695 341/459/695
f 387/461/696 430/452/696 340/460/696
f 386/463/697 427/447/697 430/452/697
f 424/448/698 427/447/698 386/463/698
f 433/464/699 434/468/699 437/465/699
f 435/467/700 438/479/700 437/465/700
f 437/465/701 440/471/701 439/469/701
f 437/465/702 438/479/702 441/470/702
f 327/472/703 351/474/703 433/464/703
f 351/474/704 350/476/704 434/468/704
f 349/475/705 435/467/705 434/468/705
f 331/477/706 360/478/706 435/467/706
f 360/478/707 359/540/707 438/479/707
f 438/479/708 359/540/708 358/480/708
f 441/470/709 358/480/709 329/481/709
f 440/471/710 441/470/710 369/482/710
f 440/471/711 368/483/711 367/484/711
f 439/469/712 367/484/712 325/485/712
f 436/466/713 439/469/713 339/486/713
f 337/473/714 433/464/714 436/466/714
f 442/488/715 443/492/715 446/489/715
f 444/491/716 447/502/716 446/489/716
f 446/489/717 449/495/717 448/493/717
f 446/489/718 447/502/718 450/494/718
f 332/421/719 354/497/719 442/488/719
f 354/497/560 353/499/560 443/492/560
f 352/498/561 444/491/561 443/492/561
f 328/500/720 345/501/720 444/491/720
f 345/501/563 344/579/563 447/502/563
f 447/502/564 344/579/564 343/503/564
f 450/494/721 343/503/721 326/504/721
f 449/495/566 450/494/566 372/505/566
f 449/495/567 371/506/567 370/507/567
f 448/493/722 370/507/722 330/439/722
f 445/490/569 448/493/569 363/508/569
f 361/496/570 442/488/570 445/490/570
f 451/509/723 452/512/723 455/510/723
f 452/512/724 453/519/724 456/513/724
f 455/510/725 458/515/725 457/514/725
f 456/513/726 459/521/726 458/515/726
f 329/481/727 366/517/727 451/509/727
f 366/517/728 365/518/728 452/512/728
f 365/518/729 364/520/729 453/519/729
f 364/520/730 336/444/730 385/454/730
f 453/519/731 385/454/731 386/463/731
f 386/463/732 387/461/732 459/521/732
f 387/461/733 333/462/733 375/522/733
f 459/521/734 375/522/734 374/523/734
f 458/515/735 374/523/735 373/524/735
f 457/514/736 373/524/736 325/525/736
f 368/527/737 454/511/737 457/514/737
f 451/509/738 454/511/738 368/527/738
f 460/528/739 461/531/739 464/529/739
f 461/531/740 462/537/740 465/532/740
f 464/529/741 467/534/741 466/533/741
f 465/532/742 468/539/742 467/534/742
f 331/477/743 357/535/743 460/528/743
f 357/535/744 356/536/744 461/531/744
f 356/536/745 355/538/745 462/537/745
f 355/538/746 335/426/746 388/436/746
f 462/537/747 388/436/747 389/445/747
f 389/445/748 390/443/748 468/539/748
f 390/443/749 336/444/749 364/520/749
f 468/539/750 364/520/750 365/518/750
f 467/534/751 365/518/751 366/517/751
f 466/533/752 366/517/752 329/481/752
f 359/540/753 463/530/753 466/533/753
f 460/528/754 463/530/754 359/540/754
f 469/541/755 470/544/755 473/542/755
f 470/544/756 471/552/756 474/545/756
f 473/542/757 476/547/757 475/546/757
f 474/545/758 477/554/758 476/547/758
f 327/548/759 348/550/759 469/541/759
f 348/550/760 347/551/760 470/544/760
f 347/551/761 346/553/761 471/552/761
f 346/553/762 334/408/762 391/418/762
f 471/552/763 391/418/763 392/427/763
f 392/427/764 393/425/764 477/554/764
f 393/425/765 335/426/765 355/538/765
f 477/554/766 355/538/766 356/536/766
f 476/547/767 356/536/767 357/535/767
f 475/546/768 357/535/768 331/477/768
f 350/556/769 472/543/769 475/546/769
f 469/541/770 472/543/770 350/556/770
f 478/557/771 479/560/771 482/558/771
f 479/560/772 480/568/772 483/561/772
f 482/558/773 485/563/773 484/562/773
f 483/561/774 486/570/774 485/563/774
f 325/564/775 373/566/775 478/557/775
f 373/566/776 374/567/776 479/560/776
f 374/567/777 375/569/777 480/568/777
f 375/569/778 333/397/778 394/396/778
f 480/568/779 394/396/779 395/409/779
f 395/409/780 396/407/780 486/570/780
f 396/407/781 334/408/781 346/553/781
f 486/570/782 346/553/782 347/551/782
f 485/563/783 347/551/783 348/550/783
f 484/562/784 348/550/784 327/548/784
f 338/572/785 481/559/785 484/562/785
f 478/557/786 481/559/786 338/572/786
================================================
FILE: example/pumpkin/proxy.txt
================================================
-3.305625800631857847e-01 -3.582112260109803631e-02 1.225265727345806049e-01
-2.891790282159942826e-01 -2.954317631027934127e-02 -1.294240719588450883e-01
-3.255246921920186276e-01 -4.155946782889213126e-02 2.036661504014122825e-01
-2.441065782332441925e-01 -2.731126873209656358e-02 7.236268670205919795e-02
-2.025180245905243703e-01 -2.720551599774094498e-02 -1.855916254562235135e-01
-2.436628185665528634e-01 -3.046851632537571353e-02 -2.496366469409554978e-01
-3.089458382843916495e-01 -5.292315989920232111e-02 -2.730081384307621595e-01
-8.471090479034153187e-03 -4.019958040651864378e-02 -4.241295108967495509e-01
-5.501207336562838335e-02 -3.049473248009932150e-02 -3.541849065228990234e-01
-1.908201509676434315e-01 -5.873567319355568750e-02 -4.083285628689202751e-01
-1.536966445407488280e-01 -3.144088413276975580e-02 -3.407870313926928407e-01
-1.531105106617371425e-01 -2.744841681644258694e-02 -2.581957847036964737e-01
1.796578818573656422e-01 -4.068024441433408522e-02 -3.737961708364442304e-01
2.573452178738866447e-01 -6.332605493076551295e-02 -3.674836203446764094e-01
-9.332002178651825630e-02 -5.873228258895412529e-02 -4.478397509765706941e-01
3.169548208890685670e-01 -3.126403605466101382e-02 -8.786119073032652715e-02
2.802891571144007221e-01 -4.172353050193016583e-02 -2.911287504375464907e-01
2.775639744505939421e-01 -2.752826330244070169e-02 -2.790444552353136404e-02
2.553371862067686515e-01 -2.745859299965242029e-02 -1.275428081622160259e-01
2.460822651232012492e-01 -2.732090019046799073e-02 6.674237608564001500e-02
3.035484206585786771e-01 -4.126686872396988920e-02 2.432437687129069270e-01
2.771507300538731400e-01 -4.640461604315829214e-02 3.164986714923510580e-01
3.686724288785246384e-01 -6.266184270688805735e-02 -1.741716530145940300e-01
-4.751587649756402326e-02 -3.138743650058427315e-02 3.708425782980062646e-01
1.891401783715063178e-01 -4.061099168751930444e-02 3.664791129064343123e-01
1.549016520333406366e-01 -2.949030114296951471e-02 3.036659977337358707e-01
1.753442215011407485e-02 -2.735805258370840182e-02 2.832415874420205482e-01
-1.430253008752081501e-01 -3.303987503444182766e-02 3.596244173000321953e-01
-1.696246105486682587e-01 -5.883896173214703262e-02 4.244218777522742458e-01
-9.887400484426756009e-02 -6.492769648652627068e-02 4.576312476848998778e-01
-1.209853290393276687e-01 -2.285595146590622684e-01 -1.428669278071643678e-01
-6.992882994332651203e-02 -2.338588237461467212e-01 -5.696273797340099115e-02
-2.702199447258085696e-02 -2.376867877871111923e-01 -2.209036590968088860e-03
1.014020967147686580e-01 -2.295352522721610111e-01 -1.396550389542065285e-01
1.194774145303961581e-02 -2.354654036694441288e-01 -8.369398983224093924e-02
-8.910501616632258981e-02 -2.339184935037838808e-01 3.207126934832438964e-02
1.143960018047741369e-02 -2.356360278944247266e-01 7.948525990890326076e-02
4.934131813650789039e-02 -2.306682528365183449e-01 1.729551472830221370e-01
-2.538315477258969999e-01 -2.066971479937628198e-01 -1.937985361421760144e-01
-1.865501823522728309e-01 -2.192312942230380535e-01 -1.812436784730549477e-01
-1.047883247514028726e-01 -2.163489923868006648e-01 -2.732904418529299440e-01
5.496959794811225736e-02 -2.158550629352785355e-01 -2.969777892211177828e-01
2.138216384724743180e-01 -2.028954106560132697e-01 2.782817397541323956e-01
1.960127629161294641e-02 -2.140285384193262030e-01 3.254405248744442480e-01
6.082162856387602484e-02 -2.205154480423411612e-01 2.578655826917355953e-01
-1.171916319863917644e-01 -2.269369474847552914e-01 1.847565918611689950e-01
-1.670965742486373584e-01 -2.294668242954858162e-01 1.582094440905804436e-02
-2.439142319232562861e-01 -2.172359602936806133e-01 -2.118311803619243525e-02
1.639860104160309473e-01 -2.698130006256336047e-02 -1.455538818415267333e-01
-1.773414412748603938e-02 -2.698669772323706098e-02 -2.203330703797440338e-01
-1.299896309082801404e-01 -2.698775728342539190e-02 -1.623637070733678911e-01
-1.570081626555516752e-01 -2.699673584904519613e-02 7.901938692937451880e-02
1.682840429129706938e-01 -2.698068845196061744e-02 1.423470430984331569e-01
1.396570950372809095e-01 -2.701024145875048632e-02 -1.059869390109448480e-02
1.315223046254042072e-01 -1.540125290284823367e-01 4.487685197858553021e-01
2.283039245939131323e-01 -1.735853176292315725e-01 3.586421624511269002e-01
2.490805574785096010e-01 -9.969941305101201467e-02 4.001126468713680495e-01
1.700740774079824158e-01 -8.690154844157810254e-02 4.513304076134090348e-01
8.157522350750109896e-03 -8.356372085197648347e-02 4.882176698323373865e-01
-2.495438027280154902e-01 -9.235806283830172436e-02 3.947322950477944459e-01
-1.663379590359393423e-01 -1.370488578318499184e-01 4.546982330959173368e-01
-1.381254170063012376e-01 -1.915549192307260395e-01 3.765610623936284096e-01
1.407416588391514067e-01 -1.882592141454490497e-01 3.861688261217740004e-01
6.902968407481942237e-02 -1.984469846590416753e-01 3.723063253278254825e-01
3.998164040422186316e-01 -1.178132258964468687e-01 -3.789438519880029138e-02
3.845311004474321059e-01 -1.303950047414662305e-01 -1.291685497697919183e-01
3.410797000169266679e-01 -9.109293080193105185e-02 -2.687173715435760557e-01
3.996474451705566144e-01 -9.368900570781388271e-02 4.803516475324882268e-02
3.523006000042921970e-01 -1.618224250228668670e-01 1.877770260555521042e-01
2.899371625327846758e-01 -1.734078392944408964e-01 2.835086421511506805e-01
2.578021100560222156e-01 -2.033841466915202734e-01 2.064093640050335132e-01
2.898899318958663707e-01 -2.024401234163081542e-01 -9.558827010001304247e-02
-8.777890389316084577e-03 -1.228680038271652009e-01 -4.943393742272799707e-01
1.195358026197241158e-01 -1.716873880435449695e-01 -4.298293393215076863e-01
-2.750386185569220210e-01 -1.616286209317618072e-01 -3.394058731898144110e-01
1.988540598852654440e-01 -8.793930378772643608e-02 -4.301906580181821815e-01
7.520440246857960920e-02 -2.009783645378427075e-01 -3.628895572928060864e-01
-6.732680390141004834e-02 -2.085259276837536091e-01 -3.413560820036655175e-01
-4.040064510441064294e-01 -1.113904272319502925e-01 -1.700832129518019958e-02
-2.632792379521032844e-01 -2.023641611689527275e-01 2.006855007828953763e-01
-3.203519626483323224e-01 -1.619998070634730003e-01 2.499062601186932686e-01
-3.454962451290883307e-01 -9.091326581703522425e-02 2.598358351106671571e-01
-3.704063004041687202e-01 -9.849487547247184893e-02 -2.180410437057650197e-01
-3.301804477794455517e-01 -1.393097460734714632e-01 -2.814333939305925791e-01
-3.280863489799584753e-01 -1.775554159291257983e-01 -1.994135620783232532e-01
-2.711881476394304435e-01 -2.089654305418714997e-01 1.090736923270640424e-01
-3.158019028314587029e-01 -3.055698747331550905e-02 3.472415982598335560e-02
-2.835616593609171510e-01 -4.377826451498847277e-02 2.970087541543939280e-01
-1.982408772222689630e-01 -2.710887687952115432e-02 1.738173509971586150e-01
-2.413370828964726822e-01 -2.725989743677229710e-02 -3.827952044720363267e-02
-3.055686452732203628e-01 -3.620904547018346381e-02 -2.006110128361907052e-01
-3.563937901016004384e-01 -4.601749801152235642e-02 -1.300098188508494734e-01
-3.770490183108183291e-01 -5.604914201691892084e-02 -6.324527920087956445e-02
-3.832647356044346920e-01 -5.939737860877412207e-02 4.564698398933567797e-02
6.080693004191590340e-02 -3.213593855761701784e-02 -3.752524030945674771e-01
1.153207548901037105e-01 -2.851125831490283064e-02 -3.108972420024397887e-01
-2.442775087197170403e-01 -5.068653380942817616e-02 -3.532442979292333418e-01
1.505213090327508496e-01 -2.720771881932066913e-02 -2.221852439261247647e-01
2.054365763550519697e-01 -2.989964601104738717e-02 -2.765634605265429591e-01
1.337931355430545810e-01 -5.808558169537104271e-02 -4.384533780979759099e-01
3.357119288210857477e-01 -3.410675673906195249e-02 4.877201363699408543e-02
3.094154404828375893e-01 -3.597574105009419820e-02 -1.916893225438434656e-01
2.340870991215324159e-01 -2.852333282173991416e-02 -2.146027057824338169e-01
2.002854793605655381e-01 -2.698250489775235253e-02 -7.483544186209509230e-02
2.147780642217593516e-01 -2.704206335388145005e-02 -1.990204026503152729e-04
2.719870875209396965e-01 -2.930241422101933363e-02 1.642748665381852935e-01
3.512717998072626968e-01 -4.451454720412477417e-02 1.540168340622229681e-01
3.706842733950488911e-01 -4.610297735545280939e-02 -3.194517600711695854e-02
1.151098256494980376e-01 -3.516462991310732172e-02 3.790296892660807160e-01
2.202462014942504209e-01 -2.952330098414216938e-02 2.606663789000972509e-01
-1.044989314767669764e-01 -2.746537481529816305e-02 2.867538752963254556e-01
-1.989426832934155231e-01 -2.965895541359769849e-02 2.767270118930632572e-01
4.049273617380014234e-02 -4.412492229083160578e-02 4.331467205245070895e-01
-3.137788738406491890e-02 -2.325589842336807100e-01 -1.418314490628821700e-01
6.202212385312601273e-02 -2.356791134910740104e-01 -1.578915312481333696e-02
1.292365542454456950e-01 -2.313687870919403644e-01 -5.454799315185274167e-02
-3.256500467520254127e-02 -2.316838186823682866e-01 1.652042584250706381e-01
-7.813222035275049770e-02 -2.317249056876990454e-01 1.067135118069878624e-01
9.784782601902672350e-02 -2.327245096507502553e-01 5.477027766367833367e-02
1.353725430300343757e-01 -2.282242092488345753e-01 1.347884368627367824e-01
-2.151924781904381856e-01 -2.037754887521022740e-01 -2.701130462456166059e-01
-1.344925721819555109e-02 -2.217016441366123636e-01 -2.673130149573822445e-01
8.968693145554113011e-02 -2.253308791044686954e-01 -2.083710817226646084e-01
1.276837853111382481e-01 -2.137564996263749961e-01 -2.846603804593182407e-01
2.164993687623353869e-01 -2.144055364192126445e-01 -1.805786105062300306e-01
2.206774472220860706e-01 -2.044265723295558002e-01 -2.577073573562868036e-01
2.213302710390497308e-01 -2.191542572406049716e-01 -6.077146196266507844e-02
2.580835276157992797e-01 -2.150552567686734307e-01 2.032380319223095130e-02
2.218501546088469945e-01 -2.182626773714566593e-01 8.065519883455118944e-02
1.704943366446019237e-01 -2.286959884014477251e-01 2.169837100893840312e-02
1.870501105147664433e-01 -2.189311758788914652e-01 1.830545783457043774e-01
1.249710611017846862e-01 -2.104501269072529546e-01 3.118461463922129195e-01
1.165978603961838472e-01 -2.224977583068915021e-01 2.200449450540328811e-01
-6.589468434515249884e-02 -2.141455108645391736e-01 3.061345093472855883e-01
-1.713659635150624819e-02 -2.261057392802833244e-01 2.310835027297528654e-01
-1.679501217580823824e-01 -2.114068393971962612e-01 2.655627855785916425e-01
-1.922433775077869167e-01 -2.227866554451403414e-01 8.323442436302475045e-02
-1.987890409420966786e-01 -2.187599105162857938e-01 1.568855048049888068e-01
-2.070193001679832556e-01 -2.197796967444373140e-01 -1.003267813836422329e-01
-9.011638378923053638e-02 -2.702173458207241322e-02 2.434305173505488598e-02
3.676251017729197101e-02 -2.700504049328936867e-02 -1.574862026446524343e-01
-1.606195574805306392e-01 -2.699911589925605565e-02 -4.469457894737035447e-02
-4.110629936631302744e-02 -2.701177910488326492e-02 1.350283759700281194e-01
3.750997410370584401e-02 -2.699907655449122063e-02 1.755456902147471132e-01
1.444430711363597353e-01 -2.700202797570856392e-02 7.132325245123308233e-02
4.529989173573109595e-02 -1.681876794473774750e-01 4.506575697954532389e-01
-6.491249018584235864e-02 -1.506157513341619403e-01 4.674476164548339030e-01
9.498346923774823869e-02 -8.144524948114294227e-02 4.687583997887148080e-01
-2.074231358544243453e-01 -1.601039702409000343e-01 3.941493288416589857e-01
-3.501853488368755069e-02 -1.971394824192844175e-01 3.848620218763441625e-01
3.730855525222085411e-01 -1.518468729346029999e-01 1.147278973415375714e-01
2.692233718902010131e-01 -1.745028812922591321e-01 -3.215629738556439743e-01
3.150573275080216784e-01 -1.564213568749146521e-01 -2.729934042191223265e-01
3.110680244535929395e-01 -1.021030005874500324e-01 -3.351966871312938578e-01
3.590843749714444644e-01 -8.928171736380806356e-02 2.316341040758693737e-01
3.169982475881958361e-01 -1.169858679899417331e-01 3.171613496430101264e-01
3.041409191181249083e-01 -1.951471855157803081e-01 1.211140958711664217e-01
3.484709972353600693e-01 -1.827934108150049419e-01 4.102295384695937852e-02
3.310613892025754956e-01 -1.810496500239620055e-01 -1.741063152241782430e-01
3.506403760061989061e-01 -1.799902314070233045e-01 -7.098233162843682609e-02
-6.932729894866025699e-02 -1.498209261095712896e-01 -4.674424546837963601e-01
-1.452202512289278435e-01 -1.153023649196300293e-01 -4.667963199657306217e-01
-1.395339294987379963e-01 -2.012425050032476403e-01 -3.455710810294131408e-01
-1.777159512869863900e-01 -1.659466639410638045e-01 -4.088488249767995297e-01
-2.953553447673876731e-01 -9.363062449893722050e-02 -3.610393551088005304e-01
6.808750452950484977e-02 -7.754980268785302555e-02 -4.719652508479208475e-01
2.023394674656796843e-01 -1.561767378747227486e-01 -4.036152892225279842e-01
1.747511389852907659e-01 -1.968772750325933352e-01 -3.312779306330214668e-01
5.978649020725995515e-03 -2.048313533406285747e-01 -3.681444751250163572e-01
1.894546864328535904e-02 -1.801250778626042603e-01 -4.418835568830709049e-01
-9.695874310970198362e-02 -1.852045511529024358e-01 -4.065883694146569804e-01
-3.884906077532972457e-01 -1.117579930411666256e-01 1.270347892653528876e-01
-3.915349801920816608e-01 -1.122249133455866660e-01 -1.046760531627188018e-01
-2.837875821269904564e-01 -1.651095663909021294e-01 3.144756650011065746e-01
-3.041822276248227208e-01 -1.027435280862289968e-01 3.484912220784160408e-01
-3.488261395888587191e-01 -1.821074172470286967e-01 -4.942681924820786776e-02
-2.974875340733483808e-01 -1.981379379365961202e-01 -1.156206840676578529e-01
-3.508304512779751305e-01 -1.808584714905668300e-01 5.644391595477935009e-02
-3.423610191134343061e-01 -1.793281365630007840e-01 1.316958135600726776e-01
-1.319897690503380738e-01 1.754717180439386293e-01 2.158012439977385621e-01
-1.435816268499228998e-01 1.005028359133968152e-01 1.986965239606864331e-01
1.347248783772648506e-01 9.558618820822367690e-02 4.965593701135093130e-02
1.165347695419952229e-01 2.072576708235475684e-01 8.068364297059311963e-02
5.776312299418161211e-02 1.776354034069785315e-01 3.012019937804593095e-01
1.149778303243851607e-01 1.176711951297298858e-01 2.829690018852685229e-01
-1.028386496772261682e-01 1.132846727348972987e-01 2.769815344683271841e-01
-9.547587231926611337e-03 -8.221809211004779683e-02 1.707219625306566624e-01
-3.018552579676806943e-02 2.225670819190275207e-01 6.481051322568852358e-02
1.012693231230462759e-01 2.252014978327522843e-01 1.740939797926491783e-01
3.821121953686514722e-02 6.990822490645572096e-02 3.140529909035473244e-01
-5.293193364774259613e-02 3.024323475458115529e-02 3.043026219102830665e-01
1.662886118218065423e-01 5.964044371131071304e-02 1.236215076056688122e-01
1.394453379846917196e-01 -2.003898025814663045e-02 2.237689870235865519e-01
-9.559991427523040208e-02 5.804828013837265432e-02 3.373251205587998042e-02
8.646070559871310568e-02 3.417705919771805989e-02 1.711777024371714459e-02
6.995850327811024838e-02 -5.463700095324379447e-02 5.633159368416223306e-02
-2.906530499061847628e-02 -5.059088638627483314e-02 4.576105683430349569e-02
-1.486858006057061310e-01 4.531920329695175026e-02 1.532530845340624170e-01
-1.275740485437322769e-01 1.226637545009406222e-02 2.299755237629698135e-01
-1.430823925837653721e-01 1.539806710971617365e-01 1.347348734817741256e-01
-9.798080123413679954e-02 2.070942053619747036e-01 8.557323461838599321e-02
-1.237687007504249814e-01 1.340534137854593189e-01 6.888756486693849157e-02
5.246029771714385420e-02 2.165771632880491016e-01 4.985932472856234587e-02
1.608370012970008478e-01 1.809252039511601295e-01 1.554742895310599893e-01
1.547426303823718208e-01 1.580622635724614478e-01 2.243704592976560053e-01
-1.282000217260746179e-02 1.533508950906588286e-01 3.113955977365212568e-01
-5.120038536500005899e-02 2.102236755254659195e-01 2.671930274450788345e-01
1.000285767974507939e-01 -6.607655216137237053e-02 1.353262000196704951e-01
6.364225708753830368e-02 -7.082179559046279660e-02 2.236275110821917822e-01
-4.604496199893044761e-02 -5.765011311113092674e-02 2.555292417459290588e-01
2.744635036753043447e-02 2.363262772256533584e-01 1.377747416259905711e-01
-7.494693328372623031e-02 2.256419678904104664e-01 1.823614094142129993e-01
2.360869898393843508e-02 2.302920620514914973e-01 2.293025507344369218e-01
1.629703865267970175e-01 6.260470379544913477e-02 2.108997841266970208e-01
1.295106110765656959e-01 4.760623987836992005e-02 2.715393291493486405e-01
-1.226510536320311352e-02 4.474933902063980840e-02 1.114761264290658940e-03
-1.224386860475650957e-01 -3.726210663748124458e-02 1.542774562002475480e-01
-1.328816435434715948e-01 1.830213943290144862e-01 -1.283843227553914945e-01
-1.345837361223578554e-01 9.679346760651408310e-02 -8.473954285426194000e-02
1.851100502999972197e-02 1.743925140474581748e-01 -3.098351782798887122e-01
-1.058437656843327079e-01 1.320163450023412455e-01 -2.745453566535794865e-01
1.564170823937309496e-01 1.686032577901786234e-01 -2.158995260181230946e-01
1.431610668138317832e-01 2.028075909491668383e-01 -1.358044125394144153e-01
1.210895727919429898e-01 1.852317885850674473e-01 -5.498676138529555446e-02
1.008758195291719617e-01 1.066774252014654756e-01 -2.274354807856414709e-02
-8.117769894757481675e-02 1.086526855784436252e-01 -2.419444087275701011e-02
-1.742208124581172066e-02 -8.151994770537875079e-02 -1.696393867498376951e-01
-8.184016646418906804e-02 -5.574461014149986737e-02 -8.897461488014062059e-02
1.046948645587655463e-01 2.189393383130335269e-01 -2.198533420934463689e-01
9.083224401517464228e-03 2.300777524979654154e-01 -2.334719410435462938e-01
7.246569957990674216e-02 2.340136837230473443e-01 -1.571732102008595866e-01
-3.791957522193244862e-02 -2.957109583864182822e-02 -2.595315838497347719e-02
-1.914627963632503996e-02 2.734613357738581788e-02 -3.100352398182016356e-01
4.189423681175058478e-02 7.233347982454539637e-02 -3.133538730826487817e-01
5.246762196364802844e-02 -5.525686192454042772e-02 -2.636767518498276397e-01
-1.225228259465742375e-01 1.964013803558621873e-01 -2.020126159349932837e-01
-1.483719825902979905e-01 1.150924126137438247e-01 -1.689090516035883782e-01
-7.884111037862416571e-02 1.955758208019078292e-01 -2.684576654828355058e-01
1.128520486118354538e-01 1.665639404483068020e-01 -2.776173528995279360e-01
1.555055693150068918e-01 1.313967667651513294e-01 -8.442143200452448260e-02
-7.357487091108268207e-02 1.978602168573886377e-01 -4.434906191472039183e-02
-4.257925503248858462e-04 1.325614989042110670e-01 1.719533986311384214e-03
1.053680621744673834e-01 -5.511355181724285346e-02 -8.888540129446406701e-02
2.964331135448011051e-02 -7.741098624968822950e-02 -9.822137072767482546e-02
-6.412208807192112947e-02 2.288885959290906102e-01 -1.768189785988466545e-01
-3.331085409809492193e-02 2.286353240608652859e-01 -8.920711586148354078e-02
4.861085416423354538e-02 2.203279739112889990e-01 -5.489110234310007330e-02
4.056639554249451968e-02 -3.658360245705029323e-02 -2.850365469324300793e-02
1.709590969149854534e-01 4.702228103117960661e-02 -1.536197699198297018e-01
1.575062539583833576e-01 9.316738838528770827e-02 -2.283798533349565563e-01
-7.942581546907487300e-02 -3.617140482819730662e-02 -2.589847335112204219e-01
-4.636487526879109339e-02 9.384821000417009618e-02 -3.092323575506465971e-01
1.149380034618625035e-01 6.797180258515375662e-02 -2.828518350448032126e-01
-1.402761648002541772e-01 3.857605438780374596e-02 -2.050259350051908680e-01
-1.131284019590445000e-01 2.357154210630460489e-02 -5.860979792341050598e-02
-1.049088377491786128e-01 3.208445780259096963e-02 -2.715250317771705779e-01
================================================
FILE: example/teddybear/mesh.obj
================================================
# Blender 3.6.1
# www.blender.org
o Sphere
v -0.090264 0.462556 -0.155620
v -0.090264 0.401258 -0.226042
v -0.090264 0.321167 -0.264155
v -0.090264 0.277823 -0.269059
v -0.090264 0.234478 -0.264155
v -0.090264 0.154388 -0.226042
v -0.081254 0.495731 -0.062652
v -0.072590 0.483088 -0.109615
v -0.064606 0.462556 -0.152895
v -0.057607 0.434926 -0.190831
v -0.051864 0.401258 -0.221964
v -0.047596 0.362846 -0.245099
v -0.044967 0.321167 -0.259344
v -0.044080 0.277823 -0.264155
v -0.044967 0.234478 -0.259344
v -0.047596 0.192799 -0.245099
v -0.051864 0.154388 -0.221964
v -0.057607 0.120720 -0.190831
v -0.064606 0.093089 -0.152895
v -0.072590 0.072558 -0.109615
v -0.081254 0.059914 -0.062652
v -0.072590 0.495731 -0.059819
v -0.055596 0.483088 -0.104056
v -0.039933 0.462556 -0.144826
v -0.026205 0.434926 -0.180561
v -0.014938 0.401258 -0.209887
v -0.006567 0.362846 -0.231679
v -0.001411 0.321167 -0.245099
v 0.000329 0.277823 -0.249630
v -0.001411 0.234478 -0.245099
v -0.006567 0.192799 -0.231679
v -0.014938 0.154388 -0.209887
v -0.026205 0.120720 -0.180561
v -0.039933 0.093089 -0.144826
v -0.055596 0.072558 -0.104056
v -0.072590 0.059914 -0.059819
v -0.064606 0.495731 -0.055217
v -0.039933 0.483088 -0.095030
v -0.017195 0.462556 -0.131721
v 0.002736 0.434926 -0.163882
v 0.019092 0.401258 -0.190275
v 0.031246 0.362846 -0.209887
v 0.038730 0.321167 -0.221964
v 0.041257 0.277823 -0.226042
v 0.038730 0.234478 -0.221964
v 0.031246 0.192799 -0.209887
v 0.019092 0.154388 -0.190275
v 0.002736 0.120720 -0.163882
v -0.017195 0.093089 -0.131721
v -0.039933 0.072558 -0.095030
v -0.064606 0.059914 -0.055217
v -0.057607 0.495731 -0.049024
v -0.026205 0.483088 -0.082882
v 0.002736 0.462556 -0.114086
v 0.028102 0.434926 -0.141436
v 0.048920 0.401258 -0.163882
v 0.064389 0.362846 -0.180561
v 0.073915 0.321167 -0.190831
v 0.077131 0.277823 -0.194299
v 0.073915 0.234478 -0.190831
v 0.064389 0.192799 -0.180561
v 0.048920 0.154388 -0.163882
v 0.028102 0.120720 -0.141436
v 0.002736 0.093089 -0.114086
v -0.026205 0.072558 -0.082882
v -0.057607 0.059914 -0.049024
v -0.051864 0.495731 -0.041478
v -0.014938 0.483088 -0.068080
v 0.019092 0.462556 -0.092597
v 0.048920 0.434926 -0.114086
v 0.073399 0.401258 -0.131721
v 0.091589 0.362846 -0.144826
v 0.102790 0.321167 -0.152895
v 0.106572 0.277823 -0.155620
v 0.102790 0.234478 -0.152895
v 0.091589 0.192799 -0.144826
v 0.073399 0.154388 -0.131721
v 0.048920 0.120720 -0.114086
v 0.019092 0.093089 -0.092597
v -0.014938 0.072558 -0.068080
v -0.051864 0.059914 -0.041478
v -0.090264 0.500000 -0.013813
v -0.047596 0.495731 -0.032869
v -0.006567 0.483088 -0.051193
v 0.031246 0.462556 -0.068080
v 0.064389 0.434926 -0.082882
v 0.091589 0.401258 -0.095030
v 0.111800 0.362846 -0.104056
v 0.124246 0.321167 -0.109615
v 0.128448 0.277823 -0.111491
v 0.124246 0.234478 -0.109615
v 0.111800 0.192799 -0.104056
v 0.091589 0.154388 -0.095030
v 0.064389 0.120720 -0.082882
v 0.031246 0.093089 -0.068080
v -0.006567 0.072558 -0.051193
v -0.047596 0.059914 -0.032869
v -0.044967 0.495731 -0.023528
v -0.001411 0.483088 -0.032869
v 0.038730 0.462556 -0.041478
v 0.073915 0.434926 -0.049024
v 0.102790 0.401258 -0.055217
v 0.124246 0.362846 -0.059819
v 0.137458 0.321167 -0.062652
v 0.141920 0.277823 -0.063609
v 0.137458 0.234478 -0.062652
v 0.124246 0.192799 -0.059819
v 0.102790 0.154388 -0.055217
v 0.073915 0.120720 -0.049024
v 0.038730 0.093089 -0.041478
v -0.001411 0.072558 -0.032869
v -0.044967 0.059914 -0.023528
v -0.044080 0.495731 -0.013813
v 0.000329 0.483088 -0.013813
v 0.041257 0.462556 -0.013813
v 0.077131 0.434926 -0.013813
v 0.106572 0.401258 -0.013813
v 0.128448 0.362846 -0.013813
v 0.141920 0.321167 -0.013813
v 0.146468 0.277823 -0.013813
v 0.141920 0.234478 -0.013813
v 0.128448 0.192799 -0.013813
v 0.106572 0.154388 -0.013813
v 0.077131 0.120720 -0.013813
v 0.041257 0.093089 -0.013813
v 0.000329 0.072558 -0.013813
v -0.044080 0.059914 -0.013813
v -0.044967 0.495731 -0.004098
v -0.001411 0.483088 0.005243
v 0.038730 0.462556 0.013852
v 0.073915 0.434926 0.021398
v 0.102790 0.401258 0.027591
v 0.124246 0.362846 0.032193
v 0.137458 0.321167 0.035026
v 0.141920 0.277823 0.035983
v 0.137458 0.234478 0.035026
v 0.124246 0.192799 0.032193
v 0.102790 0.154388 0.027591
v 0.073915 0.120720 0.021398
v 0.038730 0.093089 0.013852
v -0.001411 0.072558 0.005243
v -0.044967 0.059914 -0.004098
v -0.047596 0.495731 0.005243
v -0.006567 0.483088 0.023567
v 0.031246 0.462556 0.040454
v 0.064389 0.434926 0.055256
v 0.091589 0.401258 0.067404
v 0.111800 0.362846 0.076430
v 0.124246 0.321167 0.081989
v 0.128448 0.277823 0.083866
v 0.124246 0.234478 0.081989
v 0.111800 0.192799 0.076430
v 0.091589 0.154388 0.067404
v 0.064389 0.120720 0.055256
v 0.031246 0.093089 0.040454
v -0.006567 0.072558 0.023567
v -0.047596 0.059914 0.005243
v -0.051864 0.495731 0.013852
v -0.014938 0.483088 0.040454
v 0.019092 0.462556 0.064971
v 0.048920 0.434926 0.086460
v 0.073399 0.401258 0.104095
v 0.091589 0.362846 0.117200
v 0.102790 0.321167 0.125269
v 0.106572 0.277823 0.127994
v 0.102790 0.234478 0.125269
v 0.091589 0.192799 0.117200
v 0.073399 0.154388 0.104095
v 0.048920 0.120720 0.086460
v 0.019092 0.093089 0.064971
v -0.014938 0.072558 0.040454
v -0.051864 0.059914 0.013852
v -0.057607 0.495731 0.021398
v -0.026205 0.483088 0.055256
v 0.002736 0.462556 0.086460
v 0.028102 0.434926 0.113810
v 0.048920 0.401258 0.136256
v 0.064389 0.362846 0.152935
v 0.073915 0.321167 0.163205
v 0.077131 0.277823 0.166673
v 0.073915 0.234478 0.163205
v 0.064389 0.192799 0.152935
v 0.048920 0.154388 0.136256
v 0.028102 0.120720 0.113810
v 0.002736 0.093089 0.086460
v -0.026205 0.072558 0.055256
v -0.057607 0.059914 0.021398
v -0.064606 0.495731 0.027591
v -0.039933 0.483088 0.067404
v -0.017195 0.462556 0.104095
v 0.002736 0.434926 0.136256
v 0.019092 0.401258 0.162649
v 0.031246 0.362846 0.182261
v 0.038730 0.321167 0.194339
v 0.041257 0.277823 0.198416
v 0.038730 0.234478 0.194339
v 0.031246 0.192799 0.182261
v 0.019092 0.154388 0.162649
v 0.002736 0.120720 0.136256
v -0.017195 0.093089 0.104095
v -0.039933 0.072558 0.067404
v -0.064606 0.059914 0.027591
v -0.072590 0.495731 0.032193
v -0.055596 0.483088 0.076430
v -0.039933 0.462556 0.117200
v -0.026205 0.434926 0.152935
v -0.014938 0.401258 0.182261
v -0.006567 0.362846 0.204053
v -0.001411 0.321167 0.217473
v 0.000329 0.277823 0.222004
v -0.001411 0.234478 0.217473
v -0.006567 0.192799 0.204053
v -0.014938 0.154388 0.182261
v -0.026205 0.120720 0.152935
v -0.039933 0.093089 0.117200
v -0.055596 0.072558 0.076430
v -0.072590 0.059914 0.032193
v -0.081254 0.495731 0.035026
v -0.072590 0.483088 0.081989
v -0.064606 0.462556 0.125269
v -0.057607 0.434926 0.163205
v -0.051864 0.401258 0.194338
v -0.047596 0.362846 0.217473
v -0.044968 0.321167 0.231718
v -0.044080 0.277823 0.236529
v -0.044968 0.234478 0.231718
v -0.047596 0.192799 0.217473
v -0.051864 0.154388 0.194338
v -0.057607 0.120720 0.163205
v -0.064606 0.093089 0.125269
v -0.072590 0.072558 0.081989
v -0.081254 0.059914 0.035026
v -0.090264 0.495731 0.035983
v -0.090264 0.483088 0.083866
v -0.090264 0.462556 0.127994
v -0.090264 0.434926 0.166673
v -0.090264 0.401258 0.198416
v -0.090264 0.362846 0.222004
v -0.090264 0.321167 0.236529
v -0.090264 0.277823 0.241433
v -0.090264 0.234478 0.236529
v -0.090264 0.192799 0.222004
v -0.090264 0.154388 0.198416
v -0.090264 0.120720 0.166673
v -0.090264 0.093089 0.127994
v -0.090264 0.072558 0.083866
v -0.090264 0.059914 0.035983
v -0.099274 0.495731 0.035026
v -0.107938 0.483088 0.081989
v -0.115923 0.462556 0.125269
v -0.122922 0.434926 0.163205
v -0.128665 0.401258 0.194338
v -0.132933 0.362846 0.217473
v -0.135561 0.321167 0.231718
v -0.136449 0.277823 0.236529
v -0.135561 0.234478 0.231718
v -0.132933 0.192799 0.217473
v -0.128665 0.154388 0.194338
v -0.122922 0.120720 0.163205
v -0.115923 0.093089 0.125269
v -0.107938 0.072558 0.081989
v -0.099274 0.059914 0.035026
v -0.107938 0.495731 0.032193
v -0.124933 0.483088 0.076430
v -0.140596 0.462556 0.117200
v -0.154324 0.434926 0.152935
v -0.165590 0.401258 0.182261
v -0.173962 0.362846 0.204053
v -0.179117 0.321167 0.217473
v -0.180858 0.277823 0.222004
v -0.179117 0.234478 0.217473
v -0.173962 0.192799 0.204053
v -0.165590 0.154388 0.182261
v -0.154324 0.120720 0.152935
v -0.140596 0.093089 0.117200
v -0.124933 0.072558 0.076430
v -0.107938 0.059914 0.032193
v -0.115923 0.495731 0.027591
v -0.140596 0.483088 0.067404
v -0.163334 0.462556 0.104095
v -0.183264 0.434926 0.136256
v -0.199621 0.401258 0.162649
v -0.211775 0.362846 0.182261
v -0.219259 0.321167 0.194338
v -0.221786 0.277823 0.198416
v -0.219259 0.234478 0.194338
v -0.211775 0.192799 0.182261
v -0.199621 0.154388 0.162649
v -0.183264 0.120720 0.136256
v -0.163334 0.093089 0.104095
v -0.140596 0.072558 0.067404
v -0.115923 0.059914 0.027591
v -0.122922 0.495731 0.021398
v -0.154324 0.483088 0.055256
v -0.183264 0.462556 0.086460
v -0.208631 0.434926 0.113810
v -0.229448 0.401258 0.136256
v -0.244917 0.362846 0.152935
v -0.254443 0.321167 0.163205
v -0.257660 0.277823 0.166673
v -0.254443 0.234478 0.163205
v -0.244917 0.192799 0.152935
v -0.229448 0.154388 0.136256
v -0.208631 0.120720 0.113810
v -0.183264 0.093089 0.086460
v -0.154324 0.072558 0.055256
v -0.122922 0.059914 0.021398
v -0.090264 0.055645 -0.013813
v -0.128665 0.495731 0.013852
v -0.165590 0.483088 0.040454
v -0.199621 0.462556 0.064971
v -0.229449 0.434926 0.086460
v -0.253928 0.401258 0.104095
v -0.272117 0.362846 0.117200
v -0.283318 0.321167 0.125269
v -0.287100 0.277823 0.127994
v -0.283318 0.234478 0.125269
v -0.272117 0.192799 0.117200
v -0.253928 0.154388 0.104095
v -0.229449 0.120720 0.086460
v -0.199621 0.093089 0.064971
v -0.165590 0.072558 0.040454
v -0.128665 0.059914 0.013852
v -0.132933 0.495731 0.005243
v -0.173962 0.483088 0.023567
v -0.211774 0.462556 0.040454
v -0.244918 0.434926 0.055256
v -0.272117 0.401258 0.067404
v -0.292328 0.362846 0.076430
v -0.304774 0.321167 0.081989
v -0.308977 0.277823 0.083865
v -0.304774 0.234478 0.081989
v -0.292328 0.192799 0.076430
v -0.272117 0.154388 0.067404
v -0.244918 0.120720 0.055256
v -0.211774 0.093089 0.040454
v -0.173962 0.072558 0.023567
v -0.132933 0.059914 0.005243
v -0.135561 0.495731 -0.004098
v -0.179117 0.483088 0.005243
v -0.219259 0.462556 0.013852
v -0.254443 0.434926 0.021398
v -0.283318 0.401258 0.027591
v -0.304774 0.362846 0.032193
v -0.317987 0.321167 0.035026
v -0.322448 0.277823 0.035983
v -0.317987 0.234478 0.035026
v -0.304774 0.192799 0.032193
v -0.283318 0.154388 0.027591
v -0.254443 0.120720 0.021398
v -0.219259 0.093089 0.013852
v -0.179117 0.072558 0.005243
v -0.135561 0.059914 -0.004098
v -0.136449 0.495731 -0.013813
v -0.180858 0.483088 -0.013813
v -0.221786 0.462556 -0.013813
v -0.257660 0.434926 -0.013813
v -0.287100 0.401258 -0.013813
v -0.308977 0.362846 -0.013813
v -0.322448 0.321167 -0.013813
v -0.326997 0.277823 -0.013813
v -0.322448 0.234478 -0.013813
v -0.308977 0.192799 -0.013813
v -0.287100 0.154388 -0.013813
v -0.257660 0.120720 -0.013813
v -0.221786 0.093089 -0.013813
v -0.180858 0.072558 -0.013813
v -0.136449 0.059914 -0.013813
v -0.135561 0.495731 -0.023528
v -0.179117 0.483088 -0.032869
v -0.219259 0.462556 -0.041478
v -0.254443 0.434926 -0.049024
v -0.283318 0.401258 -0.055217
v -0.304774 0.362846 -0.059819
v -0.317987 0.321167 -0.062652
v -0.322448 0.277823 -0.063609
v -0.317987 0.234478 -0.062652
v -0.304774 0.192799 -0.059819
v -0.283318 0.154388 -0.055217
v -0.254443 0.120720 -0.049024
v -0.219259 0.093089 -0.041478
v -0.179117 0.072558 -0.032869
v -0.135561 0.059914 -0.023528
v -0.132933 0.495731 -0.032869
v -0.173962 0.483088 -0.051193
v -0.211774 0.462556 -0.068080
v -0.244917 0.434926 -0.082882
v -0.272117 0.401258 -0.095030
v -0.292328 0.362846 -0.104056
v -0.304774 0.321167 -0.109615
v -0.308977 0.277823 -0.111491
v -0.304774 0.234478 -0.109615
v -0.292328 0.192799 -0.104056
v -0.272117 0.154388 -0.095030
v -0.244917 0.120720 -0.082882
v -0.211774 0.093089 -0.068080
v -0.173962 0.072558 -0.051193
v -0.132933 0.059914 -0.032869
v -0.128665 0.495731 -0.041478
v -0.165590 0.483088 -0.068080
v -0.199621 0.462556 -0.092597
v -0.229448 0.434926 -0.114086
v -0.253928 0.401258 -0.131721
v -0.272117 0.362846 -0.144826
v -0.283318 0.321167 -0.152895
v -0.287100 0.277823 -0.155620
v -0.283318 0.234478 -0.152895
v -0.272117 0.192799 -0.144826
v -0.253928 0.154388 -0.131721
v -0.229448 0.120720 -0.114086
v -0.199621 0.093089 -0.092597
v -0.165590 0.072558 -0.068080
v -0.128665 0.059914 -0.041478
v -0.122922 0.495731 -0.049024
v -0.154324 0.483088 -0.082882
v -0.183264 0.462556 -0.114086
v -0.208631 0.434926 -0.141436
v -0.229448 0.401258 -0.163882
v -0.244917 0.362846 -0.180560
v -0.254443 0.321167 -0.190831
v -0.257659 0.277823 -0.194299
v -0.254443 0.234478 -0.190831
v -0.244917 0.192799 -0.180560
v -0.229448 0.154388 -0.163882
v -0.208631 0.120720 -0.141436
v -0.183264 0.093089 -0.114086
v -0.154324 0.072558 -0.082882
v -0.122922 0.059914 -0.049024
v -0.115923 0.495731 -0.055217
v -0.140595 0.483088 -0.095030
v -0.163334 0.462556 -0.131721
v -0.183264 0.434926 -0.163882
v -0.199621 0.401258 -0.190275
v -0.211775 0.362846 -0.209887
v -0.219259 0.321167 -0.221964
v -0.221786 0.277823 -0.226042
v -0.219259 0.234478 -0.221964
v -0.211775 0.192799 -0.209887
v -0.199621 0.154388 -0.190275
v -0.183264 0.120720 -0.163882
v -0.163334 0.093089 -0.131721
v -0.140595 0.072558 -0.095030
v -0.115923 0.059914 -0.055217
v -0.107938 0.495731 -0.059818
v -0.124933 0.483088 -0.104056
v -0.140595 0.462556 -0.144826
v -0.154324 0.434926 -0.180561
v -0.165590 0.401258 -0.209887
v -0.173962 0.362846 -0.231679
v -0.179117 0.321167 -0.245098
v -0.180858 0.277823 -0.249629
v -0.179117 0.234478 -0.245098
v -0.173962 0.192799 -0.231679
v -0.165590 0.154388 -0.209887
v -0.154324 0.120720 -0.180561
v -0.140595 0.093089 -0.144826
v -0.124933 0.072558 -0.104056
v -0.107938 0.059914 -0.059818
v -0.099274 0.495731 -0.062652
v -0.107938 0.483088 -0.109615
v -0.115923 0.462556 -0.152895
v -0.122922 0.434926 -0.190831
v -0.128665 0.401258 -0.221964
v -0.132933 0.362846 -0.245098
v -0.135561 0.321167 -0.259344
v -0.136448 0.277823 -0.264154
v -0.135561 0.234478 -0.259344
v -0.132933 0.192799 -0.245098
v -0.128665 0.154388 -0.221964
v -0.122922 0.120720 -0.190831
v -0.115923 0.093089 -0.152895
v -0.107938 0.072558 -0.109615
v -0.099274 0.059914 -0.062652
v -0.090264 0.495731 -0.063609
v -0.090264 0.483088 -0.111491
v -0.090264 0.434926 -0.194299
v -0.090264 0.362846 -0.249630
v -0.090264 0.192799 -0.249630
v -0.090264 0.120720 -0.194299
v -0.090264 0.093089 -0.155620
v -0.090264 0.072558 -0.111491
v -0.090264 0.059914 -0.063609
vn 0.0901 -0.5212 -0.8487
vn 0.0616 0.8122 -0.5802
vn 0.0770 -0.6840 -0.7254
vn 0.0770 0.6840 -0.7254
vn 0.0616 -0.8122 -0.5802
vn 0.0901 0.5212 -0.8487
vn 0.0448 -0.9058 -0.4214
vn 0.0998 0.3274 -0.9396
vn 0.0271 -0.9665 -0.2552
vn 0.1049 0.1118 -0.9882
vn 0.0091 0.9963 -0.0854
vn 0.0091 -0.9963 -0.0854
vn 0.1049 -0.1118 -0.9882
vn 0.0271 0.9665 -0.2552
vn 0.0998 -0.3274 -0.9396
vn 0.0448 0.9058 -0.4214
vn 0.2939 -0.3257 -0.8986
vn 0.1324 0.9048 -0.4048
vn 0.2657 -0.5189 -0.8125
vn 0.1821 0.8105 -0.5567
vn 0.2274 -0.6818 -0.6953
vn 0.2274 0.6818 -0.6953
vn 0.1821 -0.8105 -0.5567
vn 0.2657 0.5189 -0.8125
vn 0.1324 -0.9048 -0.4048
vn 0.2939 0.3257 -0.8986
vn 0.0802 -0.9661 -0.2453
vn 0.3089 0.1112 -0.9446
vn 0.0269 0.9963 -0.0821
vn 0.0269 -0.9963 -0.0821
vn 0.3089 -0.1112 -0.9446
vn 0.0802 0.9661 -0.2453
vn 0.2146 -0.9030 -0.3723
vn 0.4726 0.3225 -0.8201
vn 0.1302 -0.9654 -0.2259
vn 0.4963 0.1100 -0.8612
vn 0.0436 0.9962 -0.0757
vn 0.0436 -0.9962 -0.0757
vn 0.4963 -0.1100 -0.8612
vn 0.1302 0.9654 -0.2259
vn 0.4726 -0.3225 -0.8201
vn 0.2146 0.9030 -0.3723
vn 0.4281 -0.5147 -0.7428
vn 0.2946 0.8074 -0.5111
vn 0.3671 -0.6778 -0.6371
vn 0.3671 0.6778 -0.6371
vn 0.2946 -0.8074 -0.5111
vn 0.4281 0.5147 -0.7428
vn 0.2880 0.9006 -0.3255
vn 0.5702 -0.5095 -0.6444
vn 0.3945 0.8035 -0.4458
vn 0.4904 -0.6726 -0.5542
vn 0.4904 0.6726 -0.5542
vn 0.3945 -0.8035 -0.4458
vn 0.5702 0.5095 -0.6444
vn 0.2880 -0.9006 -0.3255
vn 0.6282 0.3185 -0.7099
vn 0.1750 -0.9645 -0.1978
vn 0.6588 0.1085 -0.7445
vn 0.0587 0.9961 -0.0663
vn 0.0587 -0.9961 -0.0663
vn 0.6588 -0.1085 -0.7445
vn 0.1750 0.9645 -0.1978
vn 0.6282 -0.3185 -0.7099
vn 0.7554 0.3143 -0.5750
vn 0.2131 -0.9635 -0.1622
vn 0.7912 0.1069 -0.6022
vn 0.0715 0.9960 -0.0544
vn 0.0715 -0.9960 -0.0544
vn 0.7912 -0.1069 -0.6022
vn 0.2131 0.9635 -0.1622
vn 0.7554 -0.3143 -0.5750
vn 0.3499 0.8981 -0.2664
vn 0.6873 -0.5039 -0.5231
vn 0.4782 0.7993 -0.3640
vn 0.5927 -0.6672 -0.4511
vn 0.5927 0.6672 -0.4511
vn 0.4782 -0.7993 -0.3640
vn 0.6873 0.5039 -0.5231
vn 0.3499 -0.8981 -0.2664
vn 0.7764 -0.4990 -0.3849
vn 0.5429 0.7955 -0.2692
vn 0.6712 -0.6623 -0.3328
vn 0.6712 0.6623 -0.3328
vn 0.5429 -0.7955 -0.2692
vn 0.7764 0.4990 -0.3849
vn 0.3982 -0.8958 -0.1974
vn 0.8516 0.3106 -0.4222
vn 0.2428 -0.9626 -0.1204
vn 0.8909 0.1055 -0.4417
vn 0.0816 0.9958 -0.0404
vn 0.0816 -0.9958 -0.0404
vn 0.8909 -0.1055 -0.4417
vn 0.2428 0.9626 -0.1204
vn 0.8516 -0.3106 -0.4222
vn 0.3982 0.8958 -0.1974
vn 0.2633 -0.9619 -0.0741
vn 0.9574 0.1045 -0.2693
vn 0.0885 0.9958 -0.0249
vn 0.0885 -0.9958 -0.0249
vn 0.9574 -0.1045 -0.2693
vn 0.2633 0.9619 -0.0741
vn 0.9159 -0.3079 -0.2577
vn 0.4313 0.8940 -0.1213
vn 0.8363 -0.4953 -0.2353
vn 0.5870 0.7926 -0.1651
vn 0.7243 -0.6587 -0.2038
vn 0.7243 0.6587 -0.2038
vn 0.5870 -0.7926 -0.1651
vn 0.8363 0.4953 -0.2353
vn 0.4313 -0.8940 -0.1213
vn 0.9159 0.3079 -0.2577
vn 0.7510 -0.6567 -0.0686
vn 0.7510 0.6567 -0.0686
vn 0.6093 -0.7910 -0.0557
vn 0.8662 0.4933 -0.0791
vn 0.4480 -0.8931 -0.0409
vn 0.9480 0.3064 -0.0866
vn 0.2737 -0.9615 -0.0250
vn 0.9905 0.1039 -0.0905
vn 0.0920 0.9957 -0.0084
vn 0.0920 -0.9957 -0.0084
vn 0.9905 -0.1039 -0.0905
vn 0.2737 0.9615 -0.0250
vn 0.9480 -0.3064 -0.0866
vn 0.4480 0.8931 -0.0409
vn 0.8662 -0.4933 -0.0791
vn 0.6093 0.7910 -0.0557
vn 0.0920 0.9957 0.0084
vn 0.0920 -0.9957 0.0084
vn 0.9905 -0.1039 0.0905
vn 0.2737 0.9615 0.0250
vn 0.9480 -0.3064 0.0866
vn 0.4480 0.8931 0.0409
vn 0.8662 -0.4933 0.0791
vn 0.6093 0.7910 0.0557
vn 0.7510 -0.6567 0.0686
vn 0.7510 0.6567 0.0686
vn 0.6093 -0.7910 0.0557
vn 0.8662 0.4933 0.0791
vn 0.4480 -0.8931 0.0409
vn 0.9480 0.3064 0.0866
vn 0.2737 -0.9615 0.0250
vn 0.9905 0.1039 0.0905
vn 0.7243 0.6587 0.2038
vn 0.5870 -0.7926 0.1651
vn 0.8363 0.4953 0.2353
vn 0.4313 -0.8940 0.1213
vn 0.9159 0.3079 0.2577
vn 0.2633 -0.9619 0.0741
vn 0.9574 0.1045 0.2693
vn 0.0885 0.9958 0.0249
vn 0.0885 -0.9958 0.0249
vn 0.9574 -0.1045 0.2693
vn 0.2633 0.9619 0.0741
vn 0.9159 -0.3079 0.2577
vn 0.4313 0.8940 0.1213
vn 0.8363 -0.4953 0.2353
vn 0.5870 0.7926 0.1651
vn 0.7243 -0.6587 0.2038
vn 0.8909 -0.1055 0.4417
vn 0.2428 0.9626 0.1204
vn 0.8516 -0.3106 0.4222
vn 0.3982 0.8958 0.1974
vn 0.7764 -0.4990 0.3849
vn 0.5429 0.7955 0.2692
vn 0.6712 -0.6623 0.3328
vn 0.6712 0.6623 0.3328
vn 0.5429 -0.7955 0.2692
vn 0.7764 0.4990 0.3849
vn 0.3982 -0.8958 0.1974
vn 0.8516 0.3106 0.4222
vn 0.2428 -0.9626 0.1204
vn 0.8909 0.1055 0.4417
vn 0.0816 0.9958 0.0404
vn 0.0816 -0.9958 0.0404
vn 0.4782 -0.7993 0.3640
vn 0.6873 0.5039 0.5231
vn 0.3499 -0.8981 0.2664
vn 0.7554 0.3143 0.5750
vn 0.2131 -0.9635 0.1622
vn 0.7912 0.1069 0.6022
vn 0.0715 0.9960 0.0544
vn 0.0715 -0.9960 0.0544
vn 0.7912 -0.1069 0.6022
vn 0.2131 0.9635 0.1622
vn 0.7554 -0.3143 0.5750
vn 0.3499 0.8981 0.2664
vn 0.6873 -0.5039 0.5231
vn 0.4782 0.7993 0.3640
vn 0.5927 -0.6672 0.4511
vn 0.5927 0.6672 0.4511
vn 0.1750 0.9645 0.1978
vn 0.6282 -0.3185 0.7099
vn 0.2880 0.9006 0.3255
vn 0.5702 -0.5095 0.6444
vn 0.3945 0.8035 0.4458
vn 0.4904 -0.6726 0.5542
vn 0.4904 0.6726 0.5542
vn 0.3945 -0.8035 0.4458
vn 0.5702 0.5095 0.6444
vn 0.2880 -0.9006 0.3255
vn 0.6282 0.3185 0.7099
vn 0.1750 -0.9645 0.1978
vn 0.6588 0.1084 0.7445
vn 0.0587 0.9961 0.0663
vn 0.0587 -0.9961 0.0663
vn 0.6588 -0.1084 0.7445
vn 0.4281 0.5147 0.7428
vn 0.2146 -0.9030 0.3723
vn 0.4726 0.3225 0.8201
vn 0.1302 -0.9654 0.2259
vn 0.4963 0.1100 0.8612
vn 0.0436 0.9962 0.0757
vn 0.0436 -0.9962 0.0757
vn 0.4963 -0.1100 0.8612
vn 0.1302 0.9654 0.2259
vn 0.4726 -0.3225 0.8201
vn 0.2146 0.9030 0.3723
vn 0.4281 -0.5147 0.7428
vn 0.2946 0.8074 0.5111
vn 0.3671 -0.6778 0.6371
vn 0.3671 0.6778 0.6371
vn 0.2946 -0.8074 0.5111
vn 0.2939 -0.3257 0.8986
vn 0.1324 0.9048 0.4048
vn 0.2657 -0.5189 0.8125
vn 0.1821 0.8105 0.5567
vn 0.2274 -0.6818 0.6953
vn 0.2274 0.6818 0.6953
vn 0.1821 -0.8105 0.5567
vn 0.2657 0.5189 0.8125
vn 0.1324 -0.9048 0.4048
vn 0.2939 0.3257 0.8986
vn 0.0802 -0.9661 0.2453
vn 0.3089 0.1111 0.9446
vn 0.0269 0.9963 0.0821
vn 0.0269 -0.9963 0.0821
vn 0.3089 -0.1111 0.9446
vn 0.0802 0.9661 0.2453
vn 0.0448 -0.9058 0.4214
vn 0.0998 0.3274 0.9396
vn 0.0271 -0.9665 0.2552
vn 0.1049 0.1118 0.9882
vn 0.0091 0.9963 0.0854
vn 0.0091 -0.9963 0.0854
vn 0.1049 -0.1118 0.9882
vn 0.0271 0.9665 0.2552
vn 0.0998 -0.3274 0.9396
vn 0.0448 0.9058 0.4214
vn 0.0901 -0.5212 0.8487
vn 0.0616 0.8122 0.5802
vn 0.0770 -0.6840 0.7254
vn 0.0770 0.6840 0.7254
vn 0.0616 -0.8122 0.5802
vn 0.0901 0.5212 0.8487
vn -0.0901 -0.5212 0.8487
vn -0.0616 0.8122 0.5802
vn -0.0770 -0.6840 0.7254
vn -0.0770 0.6840 0.7254
vn -0.0616 -0.8122 0.5802
vn -0.0901 0.5212 0.8487
vn -0.0448 -0.9058 0.4214
vn -0.0998 0.3274 0.9396
vn -0.0271 -0.9665 0.2552
vn -0.1049 0.1118 0.9882
vn -0.0091 0.9963 0.0854
vn -0.0091 -0.9963 0.0854
vn -0.1049 -0.1118 0.9882
vn -0.0271 0.9665 0.2552
vn -0.0998 -0.3274 0.9396
vn -0.0448 0.9058 0.4214
vn -0.0802 -0.9661 0.2453
vn -0.3089 0.1111 0.9446
vn -0.0269 0.9963 0.0821
vn -0.0269 -0.9963 0.0821
vn -0.3089 -0.1111 0.9446
vn -0.0802 0.9661 0.2453
vn -0.2939 -0.3257 0.8986
vn -0.1324 0.9048 0.4048
vn -0.2657 -0.5189 0.8125
vn -0.1821 0.8105 0.5567
vn -0.2274 -0.6818 0.6953
vn -0.2274 0.6818 0.6953
vn -0.1821 -0.8105 0.5567
vn -0.2657 0.5189 0.8125
vn -0.1324 -0.9048 0.4048
vn -0.2939 0.3257 0.8986
vn -0.2946 0.8074 0.5111
vn -0.3671 -0.6778 0.6371
vn -0.3671 0.6778 0.6371
vn -0.2946 -0.8074 0.5111
vn -0.4281 0.5147 0.7428
vn -0.2146 -0.9030 0.3723
vn -0.4726 0.3225 0.8201
vn -0.1302 -0.9654 0.2259
vn -0.4963 0.1100 0.8612
vn -0.0436 0.9962 0.0757
vn -0.0436 -0.9962 0.0757
vn -0.4963 -0.1100 0.8612
vn -0.1302 0.9654 0.2259
vn -0.4726 -0.3225 0.8201
vn -0.2146 0.9030 0.3723
vn -0.4281 -0.5147 0.7428
vn -0.6588 0.1084 0.7445
vn -0.0587 0.9961 0.0663
vn -0.0587 -0.9961 0.0663
vn -0.6588 -0.1084 0.7445
vn -0.1750 0.9645 0.1978
vn -0.6282 -0.3185 0.7099
vn -0.2880 0.9006 0.3255
vn -0.5702 -0.5095 0.6444
vn -0.3945 0.8035 0.4458
vn -0.4904 -0.6726 0.5542
vn -0.4904 0.6726 0.5542
vn -0.3945 -0.8035 0.4458
vn -0.5702 0.5095 0.6444
vn -0.2880 -0.9006 0.3255
vn -0.6282 0.3185 0.7099
vn -0.1750 -0.9645 0.1978
vn -0.5927 -0.6672 0.4511
vn -0.5927 0.6672 0.4511
vn -0.4782 -0.7993 0.3640
vn -0.6873 0.5039 0.5231
vn -0.3499 -0.8981 0.2664
vn -0.7554 0.3143 0.5750
vn -0.2131 -0.9635 0.1622
vn -0.7912 0.1069 0.6022
vn -0.0715 0.9960 0.0544
vn -0.0715 -0.9960 0.0544
vn -0.7912 -0.1069 0.6022
vn -0.2131 0.9635 0.1622
vn -0.7554 -0.3143 0.5750
vn -0.3500 0.8981 0.2664
vn -0.6873 -0.5039 0.5231
vn -0.4782 0.7993 0.3640
vn -0.0816 0.9958 0.0404
vn -0.0816 -0.9958 0.0404
vn -0.8909 -0.1055 0.4417
vn -0.2428 0.9626 0.1204
vn -0.8516 -0.3106 0.4222
vn -0.3982 0.8958 0.1974
vn -0.7764 -0.4990 0.3849
vn -0.5429 0.7955 0.2692
vn -0.6712 -0.6623 0.3328
vn -0.6712 0.6623 0.3328
vn -0.5429 -0.7955 0.2692
vn -0.7764 0.4990 0.3849
vn -0.3982 -0.8958 0.1974
vn -0.8516 0.3106 0.4222
vn -0.2428 -0.9626 0.1204
vn -0.8909 0.1055 0.4417
vn -0.7243 0.6587 0.2038
vn -0.5870 -0.7926 0.1651
vn -0.8363 0.4953 0.2353
vn -0.4313 -0.8940 0.1213
vn -0.9159 0.3079 0.2577
vn -0.2633 -0.9619 0.0741
vn -0.9574 0.1045 0.2693
vn -0.0885 0.9958 0.0249
vn -0.0885 -0.9958 0.0249
vn -0.9574 -0.1045 0.2693
vn -0.2633 0.9619 0.0741
vn -0.9159 -0.3079 0.2577
vn -0.4313 0.8940 0.1213
vn -0.8363 -0.4953 0.2353
vn -0.5870 0.7926 0.1651
vn -0.7243 -0.6587 0.2038
vn -0.9905 -0.1039 0.0905
vn -0.2737 0.9615 0.0250
vn -0.9480 -0.3064 0.0866
vn -0.4480 0.8931 0.0409
vn -0.8662 -0.4933 0.0791
vn -0.6093 0.7910 0.0557
vn -0.7510 -0.6567 0.0686
vn -0.7510 0.6567 0.0686
vn -0.6093 -0.7910 0.0557
vn -0.8662 0.4933 0.0791
vn -0.4480 -0.8931 0.0409
vn -0.9480 0.3064 0.0866
vn -0.2737 -0.9615 0.0250
vn -0.9905 0.1039 0.0905
vn -0.0920 0.9957 0.0084
vn -0.0920 -0.9957 0.0084
vn -0.6093 -0.7910 -0.0557
vn -0.8662 0.4933 -0.0791
vn -0.4480 -0.8931 -0.0409
vn -0.9480 0.3064 -0.0866
vn -0.2737 -0.9615 -0.0250
vn -0.9905 0.1039 -0.0905
vn -0.0920 0.9957 -0.0084
vn -0.0920 -0.9957 -0.0084
vn -0.9905 -0.1039 -0.0905
vn -0.2737 0.9615 -0.0250
vn -0.9480 -0.3064 -0.0866
vn -0.4480 0.8931 -0.0409
vn -0.8662 -0.4933 -0.0791
vn -0.6093 0.7910 -0.0557
vn -0.7510 -0.6567 -0.0686
vn -0.7510 0.6567 -0.0686
vn -0.9159 -0.3079 -0.2577
vn -0.4313 0.8940 -0.1213
vn -0.8363 -0.4953 -0.2353
vn -0.5870 0.7926 -0.1651
vn -0.7243 -0.6587 -0.2038
vn -0.7243 0.6587 -0.2038
vn -0.5870 -0.7926 -0.1651
vn -0.8363 0.4953 -0.2353
vn -0.4313 -0.8940 -0.1213
vn -0.9159 0.3079 -0.2577
vn -0.2633 -0.9619 -0.0741
vn -0.9574 0.1045 -0.2693
vn -0.0885 0.9958 -0.0249
vn -0.0885 -0.9958 -0.0249
vn -0.9574 -0.1045 -0.2693
vn -0.2633 0.9619 -0.0741
vn -0.3982 -0.8958 -0.1974
vn -0.8516 0.3106 -0.4222
vn -0.2428 -0.9626 -0.1204
vn -0.8909 0.1055 -0.4417
vn -0.0816 0.9958 -0.0404
vn -0.0816 -0.9958 -0.0404
vn -0.8909 -0.1055 -0.4417
vn -0.2428 0.9626 -0.1204
vn -0.8516 -0.3106 -0.4222
vn -0.3982 0.8958 -0.1974
vn -0.7764 -0.4990 -0.3849
vn -0.5429 0.7955 -0.2692
vn -0.6712 -0.6623 -0.3328
vn -0.6712 0.6623 -0.3328
vn -0.5429 -0.7955 -0.2692
vn -0.7764 0.4990 -0.3849
vn -0.3500 0.8981 -0.2664
vn -0.6873 -0.5039 -0.5231
vn -0.4782 0.7993 -0.3640
vn -0.5927 -0.6672 -0.4511
vn -0.5927 0.6672 -0.4511
vn -0.4782 -0.7993 -0.3640
vn -0.6873 0.5039 -0.5231
vn -0.3499 -0.8981 -0.2664
vn -0.7554 0.3143 -0.5750
vn -0.2131 -0.9635 -0.1622
vn -0.7912 0.1069 -0.6022
vn -0.0715 0.9960 -0.0544
vn -0.0715 -0.9960 -0.0544
vn -0.7912 -0.1069 -0.6022
vn -0.2131 0.9635 -0.1622
vn -0.7554 -0.3143 -0.5750
vn -0.6282 0.3185 -0.7099
vn -0.1750 -0.9645 -0.1978
vn -0.6588 0.1084 -0.7445
vn -0.0587 0.9961 -0.0663
vn -0.0587 -0.9961 -0.0663
vn -0.6588 -0.1084 -0.7445
vn -0.1750 0.9645 -0.1978
vn -0.6282 -0.3185 -0.7099
vn -0.2880 0.9006 -0.3255
vn -0.5702 -0.5095 -0.6444
vn -0.3945 0.8035 -0.4458
vn -0.4904 -0.6726 -0.5542
vn -0.4904 0.6726 -0.5542
vn -0.3945 -0.8035 -0.4458
vn -0.5702 0.5095 -0.6444
vn -0.2880 -0.9006 -0.3255
vn -0.4281 -0.5147 -0.7428
vn -0.2946 0.8074 -0.5111
vn -0.3671 -0.6778 -0.6371
vn -0.3671 0.6778 -0.6371
vn -0.2946 -0.8074 -0.5111
vn -0.4281 0.5147 -0.7428
vn -0.2146 -0.9030 -0.3723
vn -0.4726 0.3225 -0.8201
vn -0.1302 -0.9654 -0.2259
vn -0.4963 0.1100 -0.8612
vn -0.0436 0.9962 -0.0757
vn -0.0436 -0.9962 -0.0757
vn -0.4963 -0.1100 -0.8612
vn -0.1302 0.9654 -0.2259
vn -0.4726 -0.3225 -0.8201
vn -0.2146 0.9030 -0.3723
vn -0.0802 -0.9661 -0.2453
vn -0.3089 0.1111 -0.9446
vn -0.0269 0.9963 -0.0821
vn -0.0269 -0.9963 -0.0821
vn -0.3089 -0.1111 -0.9446
vn -0.0802 0.9661 -0.2453
vn -0.2939 -0.3257 -0.8986
vn -0.1324 0.9048 -0.4048
vn -0.2657 -0.5189 -0.8125
vn -0.1821 0.8105 -0.5567
vn -0.2274 -0.6818 -0.6953
vn -0.2274 0.6818 -0.6953
vn -0.1821 -0.8105 -0.5567
vn -0.2657 0.5189 -0.8125
vn -0.1324 -0.9048 -0.4048
vn -0.2939 0.3257 -0.8986
vn -0.0616 0.8122 -0.5802
vn -0.0770 -0.6840 -0.7254
vn -0.0770 0.6840 -0.7254
vn -0.0616 -0.8122 -0.5802
vn -0.0901 0.5212 -0.8487
vn -0.0448 -0.9058 -0.4214
vn -0.0998 0.3274 -0.9396
vn -0.0271 -0.9665 -0.2552
vn -0.1049 0.1118 -0.9882
vn -0.0091 0.9963 -0.0854
vn -0.0091 -0.9963 -0.0854
vn -0.1049 -0.1118 -0.9882
vn -0.0271 0.9665 -0.2552
vn -0.0998 -0.3274 -0.9396
vn -0.0448 0.9058 -0.4214
vn -0.0901 -0.5212 -0.8487
vn 0.0447 -0.9058 -0.4214
vn -0.3500 -0.8981 0.2664
vn -0.3499 0.8981 0.2664
vn -0.3499 0.8981 -0.2664
vn -0.3500 -0.8981 -0.2664
vt 0.750000 0.375000
vt 0.718750 0.312500
vt 0.750000 0.312500
vt 0.750000 0.750000
vt 0.718750 0.812500
vt 0.718750 0.750000
vt 0.718750 0.250000
vt 0.750000 0.250000
vt 0.750000 0.687500
vt 0.718750 0.687500
vt 0.718750 0.187500
vt 0.750000 0.187500
vt 0.750000 0.625000
vt 0.718750 0.625000
vt 0.718750 0.125000
vt 0.750000 0.125000
vt 0.750000 0.562500
vt 0.718750 0.562500
vt 0.718750 0.062500
vt 0.750000 0.062500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.734375 0.000000
vt 0.718750 0.437500
vt 0.750000 0.437500
vt 0.750000 0.875000
vt 0.718750 0.875000
vt 0.718750 0.375000
vt 0.750000 0.812500
vt 0.687500 0.375000
vt 0.687500 0.812500
vt 0.687500 0.312500
vt 0.687500 0.750000
vt 0.687500 0.250000
vt 0.687500 0.687500
vt 0.687500 0.187500
vt 0.687500 0.625000
vt 0.687500 0.125000
vt 0.687500 0.562500
vt 0.687500 0.062500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.703125 0.000000
vt 0.687500 0.437500
vt 0.687500 0.875000
vt 0.656250 0.125000
vt 0.656250 0.562500
vt 0.656250 0.062500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.671875 0.000000
vt 0.656250 0.437500
vt 0.656250 0.875000
vt 0.656250 0.375000
vt 0.656250 0.812500
vt 0.656250 0.312500
vt 0.656250 0.750000
vt 0.656250 0.250000
vt 0.656250 0.687500
vt 0.656250 0.187500
vt 0.656250 0.625000
vt 0.625000 0.812500
vt 0.625000 0.375000
vt 0.625000 0.312500
vt 0.625000 0.750000
vt 0.625000 0.250000
vt 0.625000 0.687500
vt 0.625000 0.187500
vt 0.625000 0.625000
vt 0.625000 0.125000
vt 0.625000 0.562500
vt 0.625000 0.062500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.640625 0.000000
vt 0.625000 0.437500
vt 0.625000 0.875000
vt 0.593750 0.625000
vt 0.593750 0.562500
vt 0.593750 0.062500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.609375 0.000000
vt 0.593750 0.437500
vt 0.593750 0.875000
vt 0.593750 0.375000
vt 0.593750 0.812500
vt 0.593750 0.312500
vt 0.593750 0.750000
vt 0.593750 0.250000
vt 0.593750 0.687500
vt 0.593750 0.187500
vt 0.593750 0.125000
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.562500 0.750000
vt 0.562500 0.250000
vt 0.562500 0.687500
vt 0.562500 0.187500
vt 0.562500 0.625000
vt 0.562500 0.125000
vt 0.562500 0.562500
vt 0.562500 0.062500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.578125 0.000000
vt 0.562500 0.437500
vt 0.562500 0.875000
vt 0.562500 0.812500
vt 0.531250 0.062500
vt 0.531250 0.562500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.546875 0.000000
vt 0.531250 0.437500
vt 0.531250 0.875000
vt 0.531250 0.375000
vt 0.531250 0.812500
vt 0.531250 0.312500
vt 0.531250 0.750000
vt 0.531250 0.250000
vt 0.531250 0.687500
vt 0.531250 0.187500
vt 0.531250 0.625000
vt 0.531250 0.125000
vt 0.500000 0.250000
vt 0.500000 0.750000
vt 0.500000 0.687500
vt 0.500000 0.187500
vt 0.500000 0.625000
vt 0.500000 0.125000
vt 0.500000 0.562500
vt 0.500000 0.062500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.515625 0.000000
vt 0.500000 0.437500
vt 0.500000 0.875000
vt 0.500000 0.375000
vt 0.500000 0.812500
vt 0.500000 0.312500
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.484375 0.000000
vt 0.468750 0.062500
vt 0.468750 0.437500
vt 0.468750 0.875000
vt 0.468750 0.375000
vt 0.468750 0.812500
vt 0.468750 0.312500
vt 0.468750 0.750000
vt 0.468750 0.250000
vt 0.468750 0.687500
vt 0.468750 0.187500
vt 0.468750 0.625000
vt 0.468750 0.125000
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.437500 0.750000
vt 0.437500 0.687500
vt 0.437500 0.250000
vt 0.437500 0.187500
vt 0.437500 0.625000
vt 0.437500 0.125000
vt 0.437500 0.562500
vt 0.437500 0.062500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.453125 0.000000
vt 0.437500 0.437500
vt 0.437500 0.875000
vt 0.437500 0.375000
vt 0.437500 0.812500
vt 0.437500 0.312500
vt 0.406250 0.437500
vt 0.406250 0.875000
vt 0.406250 0.375000
vt 0.406250 0.812500
vt 0.406250 0.312500
vt 0.406250 0.750000
vt 0.406250 0.250000
vt 0.406250 0.687500
vt 0.406250 0.187500
vt 0.406250 0.625000
vt 0.406250 0.125000
vt 0.406250 0.562500
vt 0.406250 0.062500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.421875 0.000000
vt 0.375000 0.187500
vt 0.375000 0.625000
vt 0.375000 0.125000
vt 0.375000 0.562500
vt 0.375000 0.062500
vt 0.375000 0.500000
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.390625 0.000000
vt 0.375000 0.437500
vt 0.375000 0.875000
vt 0.375000 0.375000
vt 0.375000 0.812500
vt 0.375000 0.312500
vt 0.375000 0.750000
vt 0.375000 0.250000
vt 0.375000 0.687500
vt 0.343750 0.937500
vt 0.343750 0.875000
vt 0.343750 0.375000
vt 0.343750 0.812500
vt 0.343750 0.312500
vt 0.343750 0.750000
vt 0.343750 0.250000
vt 0.343750 0.687500
vt 0.343750 0.187500
vt 0.343750 0.625000
vt 0.343750 0.125000
vt 0.343750 0.562500
vt 0.343750 0.062500
vt 0.343750 0.500000
vt 0.359375 1.000000
vt 0.359375 0.000000
vt 0.343750 0.437500
vt 0.312500 0.625000
vt 0.312500 0.125000
vt 0.312500 0.562500
vt 0.312500 0.062500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.312500 0.937500
vt 0.328125 0.000000
vt 0.312500 0.437500
vt 0.312500 0.875000
vt 0.312500 0.375000
vt 0.312500 0.812500
vt 0.312500 0.312500
vt 0.312500 0.750000
vt 0.312500 0.250000
vt 0.312500 0.687500
vt 0.312500 0.187500
vt 0.281250 0.375000
vt 0.281250 0.812500
vt 0.281250 0.312500
vt 0.281250 0.750000
vt 0.281250 0.250000
vt 0.281250 0.687500
vt 0.281250 0.187500
vt 0.281250 0.625000
vt 0.281250 0.125000
vt 0.281250 0.562500
vt 0.281250 0.062500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.296875 0.000000
vt 0.281250 0.437500
vt 0.281250 0.875000
vt 0.250000 0.125000
vt 0.250000 0.625000
vt 0.250000 0.562500
vt 0.250000 0.062500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.265625 0.000000
vt 0.250000 0.437500
vt 0.250000 0.875000
vt 0.250000 0.375000
vt 0.250000 0.812500
vt 0.250000 0.312500
vt 0.250000 0.750000
vt 0.250000 0.250000
vt 0.250000 0.687500
vt 0.250000 0.187500
vt 0.218750 0.375000
vt 0.218750 0.312500
vt 0.218750 0.750000
vt 0.218750 0.250000
vt 0.218750 0.687500
vt 0.218750 0.187500
vt 0.218750 0.625000
vt 0.218750 0.125000
vt 0.218750 0.562500
vt 0.218750 0.062500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.234375 0.000000
vt 0.218750 0.437500
vt 0.218750 0.875000
vt 0.218750 0.812500
vt 0.187500 0.062500
vt 0.187500 0.562500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.203125 0.000000
vt 0.187500 0.437500
vt 0.187500 0.875000
vt 0.187500 0.375000
vt 0.187500 0.812500
vt 0.187500 0.312500
vt 0.187500 0.750000
vt 0.187500 0.250000
vt 0.187500 0.687500
vt 0.187500 0.187500
vt 0.187500 0.625000
vt 0.187500 0.125000
vt 0.156250 0.750000
vt 0.156250 0.250000
vt 0.156250 0.687500
vt 0.156250 0.187500
vt 0.156250 0.625000
vt 0.156250 0.125000
vt 0.156250 0.562500
vt 0.156250 0.062500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.171875 0.000000
vt 0.156250 0.437500
vt 0.156250 0.875000
vt 0.156250 0.375000
vt 0.156250 0.812500
vt 0.156250 0.312500
vt 0.125000 0.562500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.140625 0.000000
vt 0.125000 0.062500
vt 0.125000 0.437500
vt 0.125000 0.875000
vt 0.125000 0.375000
vt 0.125000 0.812500
vt 0.125000 0.312500
vt 0.125000 0.750000
vt 0.125000 0.250000
vt 0.125000 0.687500
vt 0.125000 0.187500
vt 0.125000 0.625000
vt 0.125000 0.125000
vt 0.093750 0.250000
vt 0.093750 0.750000
vt 0.093750 0.687500
vt 0.093750 0.187500
vt 0.093750 0.625000
vt 0.093750 0.125000
vt 0.093750 0.562500
vt 0.093750 0.062500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.109375 0.000000
vt 0.093750 0.437500
vt 0.093750 0.875000
vt 0.093750 0.375000
vt 0.093750 0.812500
vt 0.093750 0.312500
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.078125 0.000000
vt 0.062500 0.062500
vt 0.062500 0.437500
vt 0.062500 0.875000
vt 0.062500 0.375000
vt 0.062500 0.812500
vt 0.062500 0.312500
vt 0.062500 0.750000
vt 0.062500 0.250000
vt 0.062500 0.687500
vt 0.062500 0.187500
vt 0.062500 0.625000
vt 0.062500 0.125000
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.031250 0.750000
vt 0.031250 0.687500
vt 0.031250 0.250000
vt 0.031250 0.187500
vt 0.031250 0.625000
vt 0.031250 0.125000
vt 0.031250 0.562500
vt 0.031250 0.062500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.046875 0.000000
vt 0.031250 0.437500
vt 0.031250 0.875000
vt 0.031250 0.375000
vt 0.031250 0.812500
vt 0.031250 0.312500
vt 0.000000 0.437500
vt 0.000000 0.875000
vt 0.000000 0.375000
vt 0.000000 0.812500
vt 0.000000 0.312500
vt 0.000000 0.750000
vt 0.000000 0.250000
vt 0.000000 0.687500
vt 0.000000 0.187500
vt 0.000000 0.625000
vt 0.000000 0.125000
vt 0.000000 0.562500
vt 0.000000 0.062500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.015625 0.000000
vt 1.000000 0.187500
vt 0.968750 0.250000
vt 0.968750 0.187500
vt 1.000000 0.687500
vt 0.968750 0.625000
vt 1.000000 0.625000
vt 0.968750 0.125000
vt 1.000000 0.125000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 0.968750 0.062500
vt 1.000000 0.062500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.968750 0.937500
vt 0.984375 0.000000
vt 0.968750 0.437500
vt 1.000000 0.437500
vt 0.968750 0.875000
vt 1.000000 0.875000
vt 0.968750 0.375000
vt 1.000000 0.375000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 1.000000 0.312500
vt 0.968750 0.312500
vt 0.968750 0.750000
vt 1.000000 0.750000
vt 1.000000 0.250000
vt 0.968750 0.687500
vt 0.937500 0.375000
vt 0.937500 0.875000
vt 0.937500 0.812500
vt 0.937500 0.312500
vt 0.937500 0.750000
vt 0.937500 0.250000
vt 0.937500 0.687500
vt 0.937500 0.187500
vt 0.937500 0.625000
vt 0.937500 0.125000
vt 0.937500 0.562500
vt 0.937500 0.062500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.953125 0.000000
vt 0.937500 0.437500
vt 0.906250 0.187500
vt 0.906250 0.125000
vt 0.906250 0.625000
vt 0.906250 0.562500
vt 0.906250 0.062500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.906250 0.937500
vt 0.921875 0.000000
vt 0.906250 0.437500
vt 0.906250 0.875000
vt 0.906250 0.375000
vt 0.906250 0.812500
vt 0.906250 0.312500
vt 0.906250 0.750000
vt 0.906250 0.250000
vt 0.906250 0.687500
vt 0.875000 0.875000
vt 0.875000 0.812500
vt 0.875000 0.375000
vt 0.875000 0.312500
vt 0.875000 0.750000
vt 0.875000 0.250000
vt 0.875000 0.687500
vt 0.875000 0.187500
vt 0.875000 0.625000
vt 0.875000 0.125000
vt 0.875000 0.562500
vt 0.875000 0.062500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.890625 0.000000
vt 0.875000 0.437500
vt 0.843750 0.625000
vt 0.843750 0.562500
vt 0.843750 0.062500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.859375 0.000000
vt 0.843750 0.437500
vt 0.843750 0.875000
vt 0.843750 0.375000
vt 0.843750 0.812500
vt 0.843750 0.312500
vt 0.843750 0.750000
vt 0.843750 0.250000
vt 0.843750 0.687500
vt 0.843750 0.187500
vt 0.843750 0.125000
vt 0.812500 0.375000
vt 0.812500 0.312500
vt 0.812500 0.750000
vt 0.812500 0.250000
vt 0.812500 0.687500
vt 0.812500 0.187500
vt 0.812500 0.625000
vt 0.812500 0.125000
vt 0.812500 0.562500
vt 0.812500 0.062500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.828125 0.000000
vt 0.812500 0.437500
vt 0.812500 0.875000
vt 0.812500 0.812500
vt 0.781250 0.125000
vt 0.781250 0.062500
vt 0.781250 0.562500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.796875 0.000000
vt 0.781250 0.437500
vt 0.781250 0.875000
vt 0.781250 0.375000
vt 0.781250 0.812500
vt 0.781250 0.312500
vt 0.781250 0.750000
vt 0.781250 0.250000
vt 0.781250 0.687500
vt 0.781250 0.187500
vt 0.781250 0.625000
vt 0.765625 1.000000
vt 0.765625 0.000000
s 0
f 478/1/1 17/2/1 6/3/1
f 476/4/2 9/5/2 10/6/2
f 6/3/3 18/7/3 479/8/3
f 2/9/4 10/6/4 11/10/4
f 479/8/5 19/11/5 480/12/5
f 477/13/6 11/10/6 12/14/6
f 480/12/7 20/15/7 481/16/7
f 3/17/8 12/14/8 13/18/8
f 481/16/9 21/19/9 482/20/9
f 3/17/10 14/21/10 4/22/10
f 474/23/11 82/24/11 7/25/11
f 308/26/12 482/20/12 21/19/12
f 4/22/13 15/27/13 5/28/13
f 475/29/14 7/25/14 8/30/14
f 5/28/15 16/31/15 478/1/15
f 1/32/16 8/30/16 9/5/16
f 15/27/17 31/33/17 16/31/17
f 8/30/18 24/34/18 9/5/18
f 16/31/19 32/35/19 17/2/19
f 9/5/20 25/36/20 10/6/20
f 17/2/21 33/37/21 18/7/21
f 10/6/22 26/38/22 11/10/22
f 19/11/23 33/37/23 34/39/23
f 11/10/24 27/40/24 12/14/24
f 20/15/25 34/39/25 35/41/25
f 13/18/26 27/40/26 28/42/26
f 20/15/27 36/43/27 21/19/27
f 13/18/28 29/44/28 14/21/28
f 7/25/29 82/45/29 22/46/29
f 308/47/30 21/19/30 36/43/30
f 14/21/31 30/48/31 15/27/31
f 7/25/32 23/49/32 8/30/32
f 34/39/33 50/50/33 35/41/33
f 27/40/34 43/51/34 28/42/34
f 35/41/35 51/52/35 36/43/35
f 28/42/36 44/53/36 29/44/36
f 22/46/37 82/54/37 37/55/37
f 308/56/38 36/43/38 51/52/38
f 30/48/39 44/53/39 45/57/39
f 22/46/40 38/58/40 23/49/40
f 30/48/41 46/59/41 31/33/41
f 23/49/42 39/60/42 24/34/42
f 32/35/43 46/59/43 47/61/43
f 24/34/44 40/62/44 25/36/44
f 33/37/45 47/61/45 48/63/45
f 25/36/46 41/64/46 26/38/46
f 34/39/47 48/63/47 49/65/47
f 26/38/48 42/66/48 27/40/48
f 38/58/49 54/67/49 39/60/49
f 47/61/50 61/68/50 62/69/50
f 39/60/51 55/70/51 40/62/51
f 47/61/52 63/71/52 48/63/52
f 40/62/53 56/72/53 41/64/53
f 48/63/54 64/73/54 49/65/54
f 41/64/55 57/74/55 42/66/55
f 49/65/56 65/75/56 50/50/56
f 43/51/57 57/74/57 58/76/57
f 50/50/58 66/77/58 51/52/58
f 44/53/59 58/76/59 59/78/59
f 37/55/60 82/79/60 52/80/60
f 308/81/61 51/52/61 66/77/61
f 44/53/62 60/82/62 45/57/62
f 37/55/63 53/83/63 38/58/63
f 45/57/64 61/68/64 46/59/64
f 58/76/65 72/84/65 73/85/65
f 65/75/66 81/86/66 66/77/66
f 59/78/67 73/85/67 74/87/67
f 52/80/68 82/88/68 67/89/68
f 308/90/69 66/77/69 81/86/69
f 59/78/70 75/91/70 60/82/70
f 52/80/71 68/92/71 53/83/71
f 60/82/72 76/93/72 61/68/72
f 53/83/73 69/94/73 54/67/73
f 62/69/74 76/93/74 77/95/74
f 54/67/75 70/96/75 55/70/75
f 62/69/76 78/97/76 63/71/76
f 56/72/77 70/96/77 71/98/77
f 63/71/78 79/99/78 64/73/78
f 56/72/79 72/84/79 57/74/79
f 65/75/80 79/99/80 80/100/80
f 77/95/81 92/101/81 93/102/81
f 69/94/82 86/103/82 70/96/82
f 77/95/83 94/104/83 78/97/83
f 71/98/84 86/103/84 87/105/84
f 78/97/85 95/106/85 79/99/85
f 71/98/86 88/107/86 72/84/86
f 79/99/87 96/108/87 80/100/87
f 73/85/88 88/107/88 89/109/88
f 81/86/89 96/108/89 97/110/89
f 74/87/90 89/109/90 90/111/90
f 67/89/91 82/112/91 83/113/91
f 308/114/92 81/86/92 97/110/92
f 74/87/93 91/115/93 75/91/93
f 67/89/94 84/116/94 68/92/94
f 75/91/95 92/101/95 76/93/95
f 68/92/96 85/117/96 69/94/96
f 96/108/97 112/118/97 97/110/97
f 90/111/98 104/119/98 105/120/98
f 83/113/99 82/121/99 98/122/99
f 308/123/100 97/110/100 112/118/100
f 90/111/101 106/124/101 91/115/101
f 84/116/102 98/122/102 99/125/102
f 91/115/103 107/126/103 92/101/103
f 84/116/104 100/127/104 85/117/104
f 93/102/105 107/126/105 108/128/105
f 85/117/106 101/129/106 86/103/106
f 93/102/107 109/130/107 94/104/107
f 87/105/108 101/129/108 102/131/108
f 95/106/109 109/130/109 110/132/109
f 87/105/110 103/133/110 88/107/110
f 96/108/111 110/132/111 111/134/111
f 89/109/112 103/133/112 104/119/112
f 108/128/113 124/135/113 109/130/113
f 102/131/114 116/136/114 117/137/114
f 110/132/115 124/135/115 125/138/115
f 102/131/116 118/139/116 103/133/116
f 110/132/117 126/140/117 111/134/117
f 104/119/118 118/139/118 119/141/118
f 111/134/119 127/142/119 112/118/119
f 105/120/120 119/141/120 120/143/120
f 98/122/121 82/144/121 113/145/121
f 308/146/122 112/118/122 127/142/122
f 105/120/123 121/147/123 106/124/123
f 99/125/124 113/145/124 114/148/124
f 106/124/125 122/149/125 107/126/125
f 99/125/126 115/150/126 100/127/126
f 108/128/127 122/149/127 123/151/127
f 100/127/128 116/136/128 101/129/128
f 113/145/129 82/152/129 128/153/129
f 308/154/130 127/142/130 142/155/130
f 120/143/131 136/156/131 121/147/131
f 113/145/132 129/157/132 114/148/132
f 121/147/133 137/158/133 122/149/133
f 114/148/134 130/159/134 115/150/134
f 123/151/135 137/158/135 138/160/135
f 115/150/136 131/161/136 116/136/136
f 123/151/137 139/162/137 124/135/137
f 117/137/138 131/161/138 132/163/138
f 125/138/139 139/162/139 140/164/139
f 117/137/140 133/165/140 118/139/140
f 125/138/141 141/166/141 126/140/141
f 119/141/142 133/165/142 134/167/142
f 126/140/143 142/155/143 127/142/143
f 120/143/144 134/167/144 135/168/144
f 132/163/145 146/169/145 147/170/145
f 140/164/146 154/171/146 155/172/146
f 132/163/147 148/173/147 133/165/147
f 140/164/148 156/174/148 141/166/148
f 134/167/149 148/173/149 149/175/149
f 142/155/150 156/174/150 157/176/150
f 135/168/151 149/175/151 150/177/151
f 128/153/152 82/178/152 143/179/152
f 308/180/153 142/155/153 157/176/153
f 135/168/154 151/181/154 136/156/154
f 128/153/155 144/182/155 129/157/155
f 136/156/156 152/183/156 137/158/156
f 130/159/157 144/182/157 145/184/157
f 138/160/158 152/183/158 153/185/158
f 130/159/159 146/169/159 131/161/159
f 138/160/160 154/171/160 139/162/160
f 150/177/161 166/186/161 151/181/161
f 143/179/162 159/187/162 144/182/162
f 151/181/163 167/188/163 152/183/163
f 145/184/164 159/187/164 160/189/164
f 153/185/165 167/188/165 168/190/165
f 145/184/166 161/191/166 146/169/166
f 153/185/167 169/192/167 154/171/167
f 147/170/168 161/191/168 162/193/168
f 155/172/169 169/192/169 170/194/169
f 147/170/170 163/195/170 148/173/170
f 155/172/171 171/196/171 156/174/171
f 149/175/172 163/195/172 164/197/172
f 156/174/173 172/198/173 157/176/173
f 150/177/174 164/197/174 165/199/174
f 143/179/175 82/200/175 158/201/175
f 308/202/176 157/176/176 172/198/176
f 169/192/177 185/203/177 170/194/177
f 162/193/178 178/204/178 163/195/178
f 171/196/179 185/203/179 186/205/179
f 164/197/180 178/204/180 179/206/180
f 172/198/181 186/205/181 187/207/181
f 165/199/182 179/206/182 180/208/182
f 158/201/183 82/209/183 173/210/183
f 308/211/184 172/198/184 187/207/184
f 165/199/185 181/212/185 166/186/185
f 158/201/186 174/213/186 159/187/186
f 166/186/187 182/214/187 167/188/187
f 159/187/188 175/215/188 160/189/188
f 168/190/189 182/214/189 183/216/189
f 160/189/190 176/217/190 161/191/190
f 168/190/191 184/218/191 169/192/191
f 162/193/192 176/217/192 177/219/192
f 174/213/193 188/220/193 189/221/193
f 181/212/194 197/222/194 182/214/194
f 175/215/195 189/221/195 190/223/195
f 183/216/196 197/222/196 198/224/196
f 175/215/197 191/225/197 176/217/197
f 183/216/198 199/226/198 184/218/198
f 177/219/199 191/225/199 192/227/199
f 185/203/200 199/226/200 200/228/200
f 177/219/201 193/229/201 178/204/201
f 185/203/202 201/230/202 186/205/202
f 179/206/203 193/229/203 194/231/203
f 186/205/204 202/232/204 187/207/204
f 180/208/205 194/231/205 195/233/205
f 173/210/206 82/234/206 188/220/206
f 308/235/207 187/207/207 202/232/207
f 180/208/208 196/236/208 181/212/208
f 192/227/209 208/237/209 193/229/209
f 200/228/210 216/238/210 201/230/210
f 194/231/211 208/237/211 209/239/211
f 201/230/212 217/240/212 202/232/212
f 195/233/213 209/239/213 210/241/213
f 188/220/214 82/242/214 203/243/214
f 308/244/215 202/232/215 217/240/215
f 195/233/216 211/245/216 196/236/216
f 188/220/217 204/246/217 189/221/217
f 196/236/218 212/247/218 197/222/218
f 189/221/219 205/248/219 190/223/219
f 198/224/220 212/247/220 213/249/220
f 190/223/221 206/250/221 191/225/221
f 198/224/222 214/251/222 199/226/222
f 192/227/223 206/250/223 207/252/223
f 200/228/224 214/251/224 215/253/224
f 211/245/225 227/254/225 212/247/225
f 204/246/226 220/255/226 205/248/226
f 213/249/227 227/254/227 228/256/227
f 205/248/228 221/257/228 206/250/228
f 213/249/229 229/258/229 214/251/229
f 207/252/230 221/257/230 222/259/230
f 215/253/231 229/258/231 230/260/231
f 207/252/232 223/261/232 208/237/232
f 215/253/233 231/262/233 216/238/233
f 209/239/234 223/261/234 224/263/234
f 216/238/235 232/264/235 217/240/235
f 210/241/236 224/263/236 225/265/236
f 203/243/237 82/266/237 218/267/237
f 308/268/238 217/240/238 232/264/238
f 210/241/239 226/269/239 211/245/239
f 203/243/240 219/270/240 204/246/240
f 230/260/241 246/271/241 231/262/241
f 224/263/242 238/272/242 239/273/242
f 232/264/243 246/271/243 247/274/243
f 225/265/244 239/273/244 240/275/244
f 218/267/245 82/276/245 233/277/245
f 308/278/246 232/264/246 247/274/246
f 225/265/247 241/279/247 226/269/247
f 218/267/248 234/280/248 219/270/248
f 226/269/249 242/281/249 227/254/249
f 220/255/250 234/280/250 235/282/250
f 228/256/251 242/281/251 243/283/251
f 220/255/252 236/284/252 221/257/252
f 228/256/253 244/285/253 229/258/253
f 222/259/254 236/284/254 237/286/254
f 230/260/255 244/285/255 245/287/255
f 222/259/256 238/272/256 223/261/256
f 243/283/257 257/288/257 258/289/257
f 235/282/258 251/290/258 236/284/258
f 243/283/259 259/291/259 244/285/259
f 237/286/260 251/290/260 252/292/260
f 245/287/261 259/291/261 260/293/261
f 237/286/262 253/294/262 238/272/262
f 245/287/263 261/295/263 246/271/263
f 239/273/264 253/294/264 254/296/264
f 246/271/265 262/297/265 247/274/265
f 240/275/266 254/296/266 255/298/266
f 233/277/267 82/299/267 248/300/267
f 308/301/268 247/274/268 262/297/268
f 240/275/269 256/302/269 241/279/269
f 233/277/270 249/303/270 234/280/270
f 241/279/271 257/288/271 242/281/271
f 234/280/272 250/304/272 235/282/272
f 261/295/273 277/305/273 262/297/273
f 255/298/274 269/306/274 270/307/274
f 248/300/275 82/308/275 263/309/275
f 308/310/276 262/297/276 277/305/276
f 255/298/277 271/311/277 256/302/277
f 249/303/278 263/309/278 264/312/278
f 256/302/279 272/313/279 257/288/279
f 249/303/280 265/314/280 250/304/280
f 258/289/281 272/313/281 273/315/281
f 250/304/282 266/316/282 251/290/282
f 258/289/283 274/317/283 259/291/283
f 252/292/284 266/316/284 267/318/284
f 260/293/285 274/317/285 275/319/285
f 252/292/286 268/320/286 253/294/286
f 260/293/287 276/321/287 261/295/287
f 254/296/288 268/320/288 269/306/288
f 265/314/289 281/322/289 266/316/289
f 273/315/290 289/323/290 274/317/290
f 267/318/291 281/322/291 282/324/291
f 275/319/292 289/323/292 290/325/292
f 267/318/293 283/326/293 268/320/293
f 275/319/294 291/327/294 276/321/294
f 269/306/295 283/326/295 284/328/295
f 277/305/296 291/327/296 292/329/296
f 270/307/297 284/328/297 285/330/297
f 263/309/298 82/331/298 278/332/298
f 308/333/299 277/305/299 292/329/299
f 270/307/300 286/334/300 271/311/300
f 264/312/301 278/332/301 279/335/301
f 271/311/302 287/336/302 272/313/302
f 264/312/303 280/337/303 265/314/303
f 273/315/304 287/336/304 288/338/304
f 285/330/305 299/339/305 300/340/305
f 278/332/306 82/341/306 293/342/306
f 308/343/307 292/329/307 307/344/307
f 285/330/308 301/345/308 286/334/308
f 278/332/309 294/346/309 279/335/309
f 286/334/310 302/347/310 287/336/310
f 279/335/311 295/348/311 280/337/311
f 288/338/312 302/347/312 303/349/312
f 280/337/313 296/350/313 281/322/313
f 288/338/314 304/351/314 289/323/314
f 282/324/315 296/350/315 297/352/315
f 290/325/316 304/351/316 305/353/316
f 282/324/317 298/354/317 283/326/317
f 290/325/318 306/355/318 291/327/318
f 284/328/319 298/354/319 299/339/319
f 292/329/320 306/355/320 307/344/320
f 303/349/321 320/356/321 304/351/321
f 297/352/322 312/357/322 313/358/322
f 305/353/323 320/356/323 321/359/323
f 297/352/324 314/360/324 298/354/324
f 305/353/325 322/361/325 306/355/325
f 299/339/326 314/360/326 315/362/326
f 306/355/327 323/363/327 307/344/327
f 300/340/328 315/362/328 316/364/328
f 293/342/329 82/365/329 309/366/329
f 308/367/330 307/344/330 323/363/330
f 300/340/331 317/368/331 301/345/331
f 293/342/332 310/369/332 294/346/332
f 301/345/333 318/370/333 302/347/333
f 294/346/334 311/371/334 295/348/334
f 303/349/335 318/370/335 319/372/335
f 295/348/336 312/357/336 296/350/336
f 309/366/337 82/373/337 324/374/337
f 308/375/338 323/363/338 338/376/338
f 316/364/339 332/377/339 317/368/339
f 309/366/340 325/378/340 310/369/340
f 317/368/341 333/379/341 318/370/341
f 310/369/342 326/380/342 311/371/342
f 319/372/343 333/379/343 334/381/343
f 311/371/344 327/382/344 312/357/344
f 319/372/345 335/383/345 320/356/345
f 312/357/346 328/384/346 313/358/346
f 321/359/347 335/383/347 336/385/347
f 313/358/348 329/386/348 314/360/348
f 321/359/349 337/387/349 322/361/349
f 315/362/350 329/386/350 330/388/350
f 322/361/351 338/376/351 323/363/351
f 316/364/352 330/388/352 331/389/352
f 328/384/353 342/390/353 343/391/353
f 336/385/354 350/392/354 351/393/354
f 328/384/355 344/394/355 329/386/355
f 336/385/356 352/395/356 337/387/356
f 330/388/357 344/394/357 345/396/357
f 337/387/358 353/397/358 338/376/358
f 331/389/359 345/396/359 346/398/359
f 324/374/360 82/399/360 339/400/360
f 308/401/361 338/376/361 353/397/361
f 331/389/362 347/402/362 332/377/362
f 324/374/363 340/403/363 325/378/363
f 332/377/364 348/404/364 333/379/364
f 325/378/365 341/405/365 326/380/365
f 334/381/366 348/404/366 349/406/366
f 326/380/367 342/390/367 327/382/367
f 334/381/368 350/392/368 335/383/368
f 346/398/369 362/407/369 347/402/369
f 339/400/370 355/408/370 340/403/370
f 347/402/371 363/409/371 348/404/371
f 341/405/372 355/408/372 356/410/372
f 349/406/373 363/409/373 364/411/373
f 341/405/374 357/412/374 342/390/374
f 349/406/375 365/413/375 350/392/375
f 343/391/376 357/412/376 358/414/376
f 351/393/377 365/413/377 366/415/377
f 343/391/378 359/416/378 344/394/378
f 351/393/379 367/417/379 352/395/379
f 345/396/380 359/416/380 360/418/380
f 352/395/381 368/419/381 353/397/381
f 346/398/382 360/418/382 361/420/382
f 339/400/383 82/421/383 354/422/383
f 308/423/384 353/397/384 368/419/384
f 366/424/385 380/425/385 381/426/385
f 358/427/386 374/428/386 359/429/386
f 366/424/387 382/430/387 367/431/387
f 360/432/388 374/428/388 375/433/388
f 367/431/389 383/434/389 368/435/389
f 361/436/390 375/433/390 376/437/390
f 354/438/391 82/439/391 369/440/391
f 308/441/392 368/435/392 383/434/392
f 361/436/393 377/442/393 362/443/393
f 354/438/394 370/444/394 355/445/394
f 362/443/395 378/446/395 363/447/395
f 356/448/396 370/444/396 371/449/396
f 364/450/397 378/446/397 379/451/397
f 356/448/398 372/452/398 357/453/398
f 364/450/399 380/425/399 365/454/399
f 358/427/400 372/452/400 373/455/400
f 377/442/401 393/456/401 378/446/401
f 371/449/402 385/457/402 386/458/402
f 379/451/403 393/456/403 394/459/403
f 371/449/404 387/460/404 372/452/404
f 379/451/405 395/461/405 380/425/405
f 373/455/406 387/460/406 388/462/406
f 381/426/407 395/461/407 396/463/407
f 373/455/408 389/464/408 374/428/408
f 381/426/409 397/465/409 382/430/409
f 375/433/410 389/464/410 390/466/410
f 382/430/411 398/467/411 383/434/411
f 376/437/412 390/466/412 391/468/412
f 369/440/413 82/469/413 384/470/413
f 308/471/414 383/434/414 398/467/414
f 376/437/415 392/472/415 377/442/415
f 369/440/416 385/457/416 370/444/416
f 397/465/417 411/473/417 412/474/417
f 390/466/418 404/475/418 405/476/418
f 397/465/419 413/477/419 398/467/419
f 391/468/420 405/476/420 406/478/420
f 384/470/421 82/479/421 399/480/421
f 308/481/422 398/467/422 413/477/422
f 391/468/423 407/482/423 392/472/423
f 385/457/424 399/480/424 400/483/424
f 392/472/425 408/484/425 393/456/425
f 385/457/426 401/485/426 386/458/426
f 394/459/427 408/484/427 409/486/427
f 386/458/428 402/487/428 387/460/428
f 394/459/429 410/488/429 395/461/429
f 388/462/430 402/487/430 403/489/430
f 396/463/431 410/488/431 411/473/431
f 388/462/432 404/475/432 389/464/432
f 401/485/433 415/490/433 416/491/433
f 409/486/434 423/492/434 424/493/434
f 401/485/435 417/494/435 402/487/435
f 409/486/436 425/495/436 410/488/436
f 403/489/437 417/494/437 418/496/437
f 411/473/438 425/495/438 426/497/438
f 403/489/439 419/498/439 404/475/439
f 411/473/440 427/499/440 412/474/440
f 405/476/441 419/498/441 420/500/441
f 412/474/442 428/501/442 413/477/442
f 406/478/443 420/500/443 421/502/443
f 399/480/444 82/503/444 414/504/444
f 308/505/445 413/477/445 428/501/445
f 406/478/446 422/506/446 407/482/446
f 399/480/447 415/490/447 400/483/447
f 407/482/448 423/492/448 408/484/448
f 420/500/449 434/507/449 435/508/449
f 427/499/450 443/509/450 428/501/450
f 421/502/451 435/508/451 436/510/451
f 414/504/452 82/511/452 429/512/452
f 308/513/453 428/501/453 443/509/453
f 421/502/454 437/514/454 422/506/454
f 414/504/455 430/515/455 415/490/455
f 422/506/456 438/516/456 423/492/456
f 416/491/457 430/515/457 431/517/457
f 424/493/458 438/516/458 439/518/458
f 416/491/459 432/519/459 417/494/459
f 424/493/460 440/520/460 425/495/460
f 418/496/461 432/519/461 433/521/461
f 426/497/462 440/520/462 441/522/462
f 418/496/463 434/507/463 419/498/463
f 426/497/464 442/523/464 427/499/464
f 439/518/465 453/524/465 454/525/465
f 431/517/466 447/526/466 432/519/466
f 439/518/467 455/527/467 440/520/467
f 433/521/468 447/526/468 448/528/468
f 441/522/469 455/527/469 456/529/469
f 433/521/470 449/530/470 434/507/470
f 441/522/471 457/531/471 442/523/471
f 435/508/472 449/530/472 450/532/472
f 442/523/473 458/533/473 443/509/473
f 436/510/474 450/532/474 451/534/474
f 429/512/475 82/535/475 444/536/475
f 308/537/476 443/509/476 458/533/476
f 436/510/477 452/538/477 437/514/477
f 430/515/478 444/536/478 445/539/478
f 437/514/479 453/524/479 438/516/479
f 430/515/480 446/540/480 431/517/480
f 458/533/481 472/541/481 473/542/481
f 451/534/482 465/543/482 466/544/482
f 444/536/483 82/545/483 459/546/483
f 308/547/484 458/533/484 473/542/484
f 451/534/485 467/548/485 452/538/485
f 444/536/486 460/549/486 445/539/486
f 452/538/487 468/550/487 453/524/487
f 446/540/488 460/549/488 461/551/488
f 454/525/489 468/550/489 469/552/489
f 446/540/490 462/553/490 447/526/490
f 454/525/491 470/554/491 455/527/491
f 448/528/492 462/553/492 463/555/492
f 456/529/493 470/554/493 471/556/493
f 448/528/494 464/557/494 449/530/494
f 456/529/495 472/541/495 457/531/495
f 450/532/496 464/557/496 465/543/496
f 462/553/497 1/32/497 476/4/497
f 469/552/498 479/8/498 470/554/498
f 463/555/499 476/4/499 2/9/499
f 471/556/500 479/8/500 480/12/500
f 464/557/501 2/9/501 477/13/501
f 471/556/502 481/16/502 472/541/502
f 465/543/503 477/13/503 3/17/503
f 472/541/504 482/20/504 473/542/504
f 466/544/505 3/17/505 4/22/505
f 459/546/506 82/558/506 474/23/506
f 308/559/507 473/542/507 482/20/507
f 466/544/508 5/28/508 467/548/508
f 460/549/509 474/23/509 475/29/509
f 467/548/510 478/1/510 468/550/510
f 461/551/511 475/29/511 1/32/511
f 468/550/512 6/3/512 469/552/512
f 478/1/1 16/31/1 17/2/1
f 476/4/2 1/32/2 9/5/2
f 6/3/3 17/2/3 18/7/3
f 2/9/4 476/4/4 10/6/4
f 479/8/5 18/7/5 19/11/5
f 477/13/6 2/9/6 11/10/6
f 480/12/513 19/11/513 20/15/513
f 3/17/8 477/13/8 12/14/8
f 481/16/9 20/15/9 21/19/9
f 3/17/10 13/18/10 14/21/10
f 4/22/13 14/21/13 15/27/13
f 475/29/14 474/23/14 7/25/14
f 5/28/15 15/27/15 16/31/15
f 1/32/16 475/29/16 8/30/16
f 15/27/17 30/48/17 31/33/17
f 8/30/18 23/49/18 24/34/18
f 16/31/19 31/33/19 32/35/19
f 9/5/20 24/34/20 25/36/20
f 17/2/21 32/35/21 33/37/21
f 10/6/22 25/36/22 26/38/22
f 19/11/23 18/7/23 33/37/23
f 11/10/24 26/38/24 27/40/24
f 20/15/25 19/11/25 34/39/25
f 13/18/26 12/14/26 27/40/26
f 20/15/27 35/41/27 36/43/27
f 13/18/28 28/42/28 29/44/28
f 14/21/31 29/44/31 30/48/31
f 7/25/32 22/46/32 23/49/32
f 34/39/33 49/65/33 50/50/33
f 27/40/34 42/66/34 43/51/34
f 35/41/35 50/50/35 51/52/35
f 28/42/36 43/51/36 44/53/36
f 30/48/39 29/44/39 44/53/39
f 22/46/40 37/55/40 38/58/40
f 30/48/41 45/57/41 46/59/41
f 23/49/42 38/58/42 39/60/42
f 32/35/43 31/33/43 46/59/43
f 24/34/44 39/60/44 40/62/44
f 33/37/45 32/35/45 47/61/45
f 25/36/46 40/62/46 41/64/46
f 34/39/47 33/37/47 48/63/47
f 26/38/48 41/64/48 42/66/48
f 38/58/49 53/83/49 54/67/49
f 47/61/50 46/59/50 61/68/50
f 39/60/51 54/67/51 55/70/51
f 47/61/52 62/69/52 63/71/52
f 40/62/53 55/70/53 56/72/53
f 48/63/54 63/71/54 64/73/54
f 41/64/55 56/72/55 57/74/55
f 49/65/56 64/73/56 65/75/56
f 43/51/57 42/66/57 57/74/57
f 50/50/58 65/75/58 66/77/58
f 44/53/59 43/51/59 58/76/59
f 44/53/62 59/78/62 60/82/62
f 37/55/63 52/80/63 53/83/63
f 45/57/64 60/82/64 61/68/64
f 58/76/65 57/74/65 72/84/65
f 65/75/66 80/100/66 81/86/66
f 59/78/67 58/76/67 73/85/67
f 59/78/70 74/87/70 75/91/70
f 52/80/71 67/89/71 68/92/71
f 60/82/72 75/91/72 76/93/72
f 53/83/73 68/92/73 69/94/73
f 62/69/74 61/68/74 76/93/74
f 54/67/75 69/94/75 70/96/75
f 62/69/76 77/95/76 78/97/76
f 56/72/77 55/70/77 70/96/77
f 63/71/78 78/97/78 79/99/78
f 56/72/79 71/98/79 72/84/79
f 65/75/80 64/73/80 79/99/80
f 77/95/81 76/93/81 92/101/81
f 69/94/82 85/117/82 86/103/82
f 77/95/83 93/102/83 94/104/83
f 71/98/84 70/96/84 86/103/84
f 78/97/85 94/104/85 95/106/85
f 71/98/86 87/105/86 88/107/86
f 79/99/87 95/106/87 96/108/87
f 73/85/88 72/84/88 88/107/88
f 81/86/89 80/100/89 96/108/89
f 74/87/90 73/85/90 89/109/90
f 74/87/93 90/111/93 91/115/93
f 67/89/94 83/113/94 84/116/94
f 75/91/95 91/115/95 92/101/95
f 68/92/96 84/116/96 85/117/96
f 96/108/97 111/134/97 112/118/97
f 90/111/98 89/109/98 104/119/98
f 90/111/101 105/120/101 106/124/101
f 84/116/102 83/113/102 98/122/102
f 91/115/103 106/124/103 107/126/103
f 84/116/104 99/125/104 100/127/104
f 93/102/105 92/101/105 107/126/105
f 85/117/106 100/127/106 101/129/106
f 93/102/107 108/128/107 109/130/107
f 87/105/108 86/103/108 101/129/108
f 95/106/109 94/104/109 109/130/109
f 87/105/110 102/131/110 103/133/110
f 96/108/111 95/106/111 110/132/111
f 89/109/112 88/107/112 103/133/112
f 108/128/113 123/151/113 124/135/113
f 102/131/114 101/129/114 116/136/114
f 110/132/115 109/130/115 124/135/115
f 102/131/116 117/137/116 118/139/116
f 110/132/117 125/138/117 126/140/117
f 104/119/118 103/133/118 118/139/118
f 111/134/119 126/140/119 127/142/119
f 105/120/120 104/119/120 119/141/120
f 105/120/123 120/143/123 121/147/123
f 99/125/124 98/122/124 113/145/124
f 106/124/125 121/147/125 122/149/125
f 99/125/126 114/148/126 115/150/126
f 108/128/127 107/126/127 122/149/127
f 100/127/128 115/150/128 116/136/128
f 120/143/131 135/168/131 136/156/131
f 113/145/132 128/153/132 129/157/132
f 121/147/133 136/156/133 137/158/133
f 114/148/134 129/157/134 130/159/134
f 123/151/135 122/149/135 137/158/135
f 115/150/136 130/159/136 131/161/136
f 123/151/137 138/160/137 139/162/137
f 117/137/138 116/136/138 131/161/138
f 125/138/139 124/135/139 139/162/139
f 117/137/140 132/163/140 133/165/140
f 125/138/141 140/164/141 141/166/141
f 119/141/142 118/139/142 133/165/142
f 126/140/143 141/166/143 142/155/143
f 120/143/144 119/141/144 134/167/144
f 132/163/145 131/161/145 146/169/145
f 140/164/146 139/162/146 154/171/146
f 132/163/147 147/170/147 148/173/147
f 140/164/148 155/172/148 156/174/148
f 134/167/149 133/165/149 148/173/149
f 142/155/150 141/166/150 156/174/150
f 135/168/151 134/167/151 149/175/151
f 135/168/154 150/177/154 151/181/154
f 128/153/155 143/179/155 144/182/155
f 136/156/156 151/181/156 152/183/156
f 130/159/157 129/157/157 144/182/157
f 138/160/158 137/158/158 152/183/158
f 130/159/159 145/184/159 146/169/159
f 138/160/160 153/185/160 154/171/160
f 150/177/161 165/199/161 166/186/161
f 143/179/162 158/201/162 159/187/162
f 151/181/163 166/186/163 167/188/163
f 145/184/164 144/182/164 159/187/164
f 153/185/165 152/183/165 167/188/165
f 145/184/166 160/189/166 161/191/166
f 153/185/167 168/190/167 169/192/167
f 147/170/168 146/169/168 161/191/168
f 155/172/169 154/171/169 169/192/169
f 147/170/170 162/193/170 163/195/170
f 155/172/171 170/194/171 171/196/171
f 149/175/172 148/173/172 163/195/172
f 156/174/173 171/196/173 172/198/173
f 150/177/174 149/175/174 164/197/174
f 169/192/177 184/218/177 185/203/177
f 162/193/178 177/219/178 178/204/178
f 171/196/179 170/194/179 185/203/179
f 164/197/180 163/195/180 178/204/180
f 172/198/181 171/196/181 186/205/181
f 165/199/182 164/197/182 179/206/182
f 165/199/185 180/208/185 181/212/185
f 158/201/186 173/210/186 174/213/186
f 166/186/187 181/212/187 182/214/187
f 159/187/188 174/213/188 175/215/188
f 168/190/189 167/188/189 182/214/189
f 160/189/190 175/215/190 176/217/190
f 168/190/191 183/216/191 184/218/191
f 162/193/192 161/191/192 176/217/192
f 174/213/193 173/210/193 188/220/193
f 181/212/194 196/236/194 197/222/194
f 175/215/195 174/213/195 189/221/195
f 183/216/196 182/214/196 197/222/196
f 175/215/197 190/223/197 191/225/197
f 183/216/198 198/224/198 199/226/198
f 177/219/199 176/217/199 191/225/199
f 185/203/200 184/218/200 199/226/200
f 177/219/201 192/227/201 193/229/201
f 185/203/202 200/228/202 201/230/202
f 179/206/203 178/204/203 193/229/203
f 186/205/204 201/230/204 202/232/204
f 180/208/205 179/206/205 194/231/205
f 180/208/208 195/233/208 196/236/208
f 192/227/209 207/252/209 208/237/209
f 200/228/210 215/253/210 216/238/210
f 194/231/211 193/229/211 208/237/211
f 201/230/212 216/238/212 217/240/212
f 195/233/213 194/231/213 209/239/213
f 195/233/216 210/241/216 211/245/216
f 188/220/217 203/243/217 204/246/217
f 196/236/218 211/245/218 212/247/218
f 189/221/219 204/246/219 205/248/219
f 198/224/220 197/222/220 212/247/220
f 190/223/221 205/248/221 206/250/221
f 198/224/222 213/249/222 214/251/222
f 192/227/223 191/225/223 206/250/223
f 200/228/224 199/226/224 214/251/224
f 211/245/225 226/269/225 227/254/225
f 204/246/226 219/270/226 220/255/226
f 213/249/227 212/247/227 227/254/227
f 205/248/228 220/255/228 221/257/228
f 213/249/229 228/256/229 229/258/229
f 207/252/230 206/250/230 221/257/230
f 215/253/231 214/251/231 229/258/231
f 207/252/232 222/259/232 223/261/232
f 215/253/233 230/260/233 231/262/233
f 209/239/234 208/237/234 223/261/234
f 216/238/235 231/262/235 232/264/235
f 210/241/236 209/239/236 224/263/236
f 210/241/239 225/265/239 226/269/239
f 203/243/240 218/267/240 219/270/240
f 230/260/241 245/287/241 246/271/241
f 224/263/242 223/261/242 238/272/242
f 232/264/243 231/262/243 246/271/243
f 225/265/244 224/263/244 239/273/244
f 225/265/247 240/275/247 241/279/247
f 218/267/248 233/277/248 234/280/248
f 226/269/249 241/279/249 242/281/249
f 220/255/250 219/270/250 234/280/250
f 228/256/251 227/254/251 242/281/251
f 220/255/252 235/282/252 236/284/252
f 228/256/253 243/283/253 244/285/253
f 222/259/254 221/257/254 236/284/254
f 230/260/255 229/258/255 244/285/255
f 222/259/256 237/286/256 238/272/256
f 243/283/257 242/281/257 257/288/257
f 235/282/258 250/304/258 251/290/258
f 243/283/259 258/289/259 259/291/259
f 237/286/260 236/284/260 251/290/260
f 245/287/261 244/285/261 259/291/261
f 237/286/262 252/292/262 253/294/262
f 245/287/263 260/293/263 261/295/263
f 239/273/264 238/272/264 253/294/264
f 246/271/265 261/295/265 262/297/265
f 240/275/266 239/273/266 254/296/266
f 240/275/269 255/298/269 256/302/269
f 233/277/270 248/300/270 249/303/270
f 241/279/271 256/302/271 257/288/271
f 234/280/272 249/303/272 250/304/272
f 261/295/273 276/321/273 277/305/273
f 255/298/274 254/296/274 269/306/274
f 255/298/277 270/307/277 271/311/277
f 249/303/278 248/300/278 263/309/278
f 256/302/279 271/311/279 272/313/279
f 249/303/280 264/312/280 265/314/280
f 258/289/281 257/288/281 272/313/281
f 250/304/282 265/314/282 266/316/282
f 258/289/283 273/315/283 274/317/283
f 252/292/284 251/290/284 266/316/284
f 260/293/285 259/291/285 274/317/285
f 252/292/286 267/318/286 268/320/286
f 260/293/287 275/319/287 276/321/287
f 254/296/288 253/294/288 268/320/288
f 265/314/289 280/337/289 281/322/289
f 273/315/290 288/338/290 289/323/290
f 267/318/291 266/316/291 281/322/291
f 275/319/292 274/317/292 289/323/292
f 267/318/293 282/324/293 283/326/293
f 275/319/294 290/325/294 291/327/294
f 269/306/295 268/320/295 283/326/295
f 277/305/296 276/321/296 291/327/296
f 270/307/297 269/306/297 284/328/297
f 270/307/300 285/330/300 286/334/300
f 264/312/301 263/309/301 278/332/301
f 271/311/302 286/334/302 287/336/302
f 264/312/303 279/335/303 280/337/303
f 273/315/304 272/313/304 287/336/304
f 285/330/305 284/328/305 299/339/305
f 285/330/308 300/340/308 301/345/308
f 278/332/309 293/342/309 294/346/309
f 286/334/310 301/345/310 302/347/310
f 279/335/311 294/346/311 295/348/311
f 288/338/312 287/336/312 302/347/312
f 280/337/313 295/348/313 296/350/313
f 288/338/314 303/349/314 304/351/314
f 282/324/315 281/322/315 296/350/315
f 290/325/316 289/323/316 304/351/316
f 282/324/317 297/352/317 298/354/317
f 290/325/318 305/353/318 306/355/318
f 284/328/319 283/326/319 298/354/319
f 292/329/320 291/327/320 306/355/320
f 303/349/321 319/372/321 320/356/321
f 297/352/322 296/350/322 312/357/322
f 305/353/323 304/351/323 320/356/323
f 297/352/324 313/358/324 314/360/324
f 305/353/514 321/359/514 322/361/514
f 299/339/326 298/354/326 314/360/326
f 306/355/327 322/361/327 323/363/327
f 300/340/328 299/339/328 315/362/328
f 300/340/331 316/364/331 317/368/331
f 293/342/332 309/366/332 310/369/332
f 301/345/333 317/368/333 318/370/333
f 294/346/515 310/369/515 311/371/515
f 303/349/335 302/347/335 318/370/335
f 295/348/336 311/371/336 312/357/336
f 316/364/339 331/389/339 332/377/339
f 309/366/340 324/374/340 325/378/340
f 317/368/341 332/377/341 333/379/341
f 310/369/342 325/378/342 326/380/342
f 319/372/343 318/370/343 333/379/343
f 311/371/344 326/380/344 327/382/344
f 319/372/345 334/381/345 335/383/345
f 312/357/346 327/382/346 328/384/346
f 321/359/347 320/356/347 335/383/347
f 313/358/348 328/384/348 329/386/348
f 321/359/349 336/385/349 337/387/349
f 315/362/350 314/360/350 329/386/350
f 322/361/351 337/387/351 338/376/351
f 316/364/352 315/362/352 330/388/352
f 328/384/353 327/382/353 342/390/353
f 336/385/354 335/383/354 350/392/354
f 328/384/355 343/391/355 344/394/355
f 336/385/356 351/393/356 352/395/356
f 330/388/357 329/386/357 344/394/357
f 337/387/358 352/395/358 353/397/358
f 331/389/359 330/388/359 345/396/359
f 331/389/362 346/398/362 347/402/362
f 324/374/363 339/400/363 340/403/363
f 332/377/364 347/402/364 348/404/364
f 325/378/365 340/403/365 341/405/365
f 334/381/366 333/379/366 348/404/366
f 326/380/367 341/405/367 342/390/367
f 334/381/368 349/406/368 350/392/368
f 346/398/369 361/420/369 362/407/369
f 339/400/370 354/422/370 355/408/370
f 347/402/371 362/407/371 363/409/371
f 341/405/372 340/403/372 355/408/372
f 349/406/373 348/404/373 363/409/373
f 341/405/374 356/410/374 357/412/374
f 349/406/375 364/411/375 365/413/375
f 343/391/376 342/390/376 357/412/376
f 351/393/377 350/392/377 365/413/377
f 343/391/378 358/414/378 359/416/378
f 351/393/379 366/415/379 367/417/379
f 345/396/380 344/394/380 359/416/380
f 352/395/381 367/417/381 368/419/381
f 346/398/382 345/396/382 360/418/382
f 366/424/385 365/454/385 380/425/385
f 358/427/386 373/455/386 374/428/386
f 366/424/387 381/426/387 382/430/387
f 360/432/388 359/429/388 374/428/388
f 367/431/389 382/430/389 383/434/389
f 361/436/390 360/432/390 375/433/390
f 361/436/393 376/437/393 377/442/393
f 354/438/394 369/440/394 370/444/394
f 362/443/395 377/442/395 378/446/395
f 356/448/396 355/445/396 370/444/396
f 364/450/397 363/447/397 378/446/397
f 356/448/398 371/449/398 372/452/398
f 364/450/399 379/451/399 380/425/399
f 358/427/400 357/453/400 372/452/400
f 377/442/401 392/472/401 393/456/401
f 371/449/402 370/444/402 385/457/402
f 379/451/403 378/446/403 393/456/403
f 371/449/404 386/458/404 387/460/404
f 379/451/405 394/459/405 395/461/405
f 373/455/406 372/452/406 387/460/406
f 381/426/407 380/425/407 395/461/407
f 373/455/408 388/462/408 389/464/408
f 381/426/409 396/463/409 397/465/409
f 375/433/410 374/428/410 389/464/410
f 382/430/411 397/465/411 398/467/411
f 376/437/412 375/433/412 390/466/412
f 376/437/415 391/468/415 392/472/415
f 369/440/416 384/470/416 385/457/416
f 397/465/417 396/463/417 411/473/417
f 390/466/418 389/464/418 404/475/418
f 397/465/419 412/474/419 413/477/419
f 391/468/420 390/466/420 405/476/420
f 391/468/423 406/478/423 407/482/423
f 385/457/424 384/470/424 399/480/424
f 392/472/425 407/482/425 408/484/425
f 385/457/426 400/483/426 401/485/426
f 394/459/427 393/456/427 408/484/427
f 386/458/428 401/485/428 402/487/428
f 394/459/429 409/486/429 410/488/429
f 388/462/430 387/460/430 402/487/430
f 396/463/431 395/461/431 410/488/431
f 388/462/432 403/489/432 404/475/432
f 401/485/516 400/483/516 415/490/516
f 409/486/434 408/484/434 423/492/434
f 401/485/435 416/491/435 417/494/435
f 409/486/436 424/493/436 425/495/436
f 403/489/437 402/487/437 417/494/437
f 411/473/438 410/488/438 425/495/438
f 403/489/439 418/496/439 419/498/439
f 411/473/517 426/497/517 427/499/517
f 405/476/441 404/475/441 419/498/441
f 412/474/442 427/499/442 428/501/442
f 406/478/443 405/476/443 420/500/443
f 406/478/446 421/502/446 422/506/446
f 399/480/447 414/504/447 415/490/447
f 407/482/448 422/506/448 423/492/448
f 420/500/449 419/498/449 434/507/449
f 427/499/450 442/523/450 443/509/450
f 421/502/451 420/500/451 435/508/451
f 421/502/454 436/510/454 437/514/454
f 414/504/455 429/512/455 430/515/455
f 422/506/456 437/514/456 438/516/456
f 416/491/457 415/490/457 430/515/457
f 424/493/458 423/492/458 438/516/458
f 416/491/459 431/517/459 432/519/459
f 424/493/460 439/518/460 440/520/460
f 418/496/461 417/494/461 432/519/461
f 426/497/462 425/495/462 440/520/462
f 418/496/463 433/521/463 434/507/463
f 426/497/464 441/522/464 442/523/464
f 439/518/465 438/516/465 453/524/465
f 431/517/466 446/540/466 447/526/466
f 439/518/467 454/525/467 455/527/467
f 433/521/468 432/519/468 447/526/468
f 441/522/469 440/520/469 455/527/469
f 433/521/470 448/528/470 449/530/470
f 441/522/471 456/529/471 457/531/471
f 435/508/472 434/507/472 449/530/472
f 442/523/473 457/531/473 458/533/473
f 436/510/474 435/508/474 450/532/474
f 436/510/477 451/534/477 452/538/477
f 430/515/478 429/512/478 444/536/478
f 437/514/479 452/538/479 453/524/479
f 430/515/480 445/539/480 446/540/480
f 458/533/481 457/531/481 472/541/481
f 451/534/482 450/532/482 465/543/482
f 451/534/485 466/544/485 467/548/485
f 444/536/486 459/546/486 460/549/486
f 452/538/487 467/548/487 468/550/487
f 446/540/488 445/539/488 460/549/488
f 454/525/489 453/524/489 468/550/489
f 446/540/490 461/551/490 462/553/490
f 454/525/491 469/552/491 470/554/491
f 448/528/492 447/526/492 462/553/492
f 456/529/493 455/527/493 470/554/493
f 448/528/494 463/555/494 464/557/494
f 456/529/495 471/556/495 472/541/495
f 450/532/496 449/530/496 464/557/496
f 462/553/497 461/551/497 1/32/497
f 469/552/498 6/3/498 479/8/498
f 463/555/499 462/553/499 476/4/499
f 471/556/500 470/554/500 479/8/500
f 464/557/501 463/555/501 2/9/501
f 471/556/502 480/12/502 481/16/502
f 465/543/503 464/557/503 477/13/503
f 472/541/504 481/16/504 482/20/504
f 466/544/505 465/543/505 3/17/505
f 466/544/508 4/22/508 5/28/508
f 460/549/509 459/546/509 474/23/509
f 467/548/510 5/28/510 478/1/510
f 461/551/511 460/549/511 475/29/511
f 468/550/512 478/1/512 6/3/512
o Sphere.001
v -0.076854 0.140504 -0.154382
v -0.076854 0.044016 -0.226220
v -0.076854 -0.082051 -0.265099
v -0.076854 -0.150279 -0.270102
v -0.076854 -0.218506 -0.265099
v -0.076854 -0.344573 -0.226220
v -0.066588 0.192723 -0.059544
v -0.056716 0.172822 -0.107451
v -0.047619 0.140504 -0.151602
v -0.039645 0.097012 -0.190301
v -0.033100 0.044016 -0.222060
v -0.028237 -0.016446 -0.245660
v -0.025243 -0.082051 -0.260192
v -0.024232 -0.150279 -0.265099
v -0.025243 -0.218506 -0.260192
v -0.028237 -0.284111 -0.245660
v -0.033100 -0.344573 -0.222060
v -0.039645 -0.397569 -0.190301
v -0.047619 -0.441061 -0.151602
v -0.056716 -0.473379 -0.107451
v -0.066588 -0.493280 -0.059544
v -0.056716 0.192723 -0.056653
v -0.037353 0.172822 -0.101781
v -0.019507 0.140504 -0.143370
v -0.003865 0.097012 -0.179824
v 0.008972 0.044016 -0.209740
v 0.018511 -0.016446 -0.231970
v 0.024385 -0.082051 -0.245660
v 0.026368 -0.150279 -0.250282
v 0.024385 -0.218506 -0.245660
v 0.018511 -0.284111 -0.231970
v 0.008972 -0.344573 -0.209740
v -0.003865 -0.397569 -0.179824
v -0.019507 -0.441061 -0.143370
v -0.037353 -0.473379 -0.101781
v -0.056716 -0.493280 -0.056653
v -0.047619 0.192723 -0.051959
v -0.019507 0.172822 -0.092573
v 0.006401 0.140504 -0.130002
v 0.029110 0.097012 -0.162810
v 0.047746 0.044016 -0.189734
v 0.061595 -0.016446 -0.209740
v 0.070122 -0.082051 -0.222060
v 0.073002 -0.150279 -0.226220
v 0.070122 -0.218506 -0.222060
v 0.061595 -0.284111 -0.209740
v 0.047746 -0.344573 -0.189734
v 0.029110 -0.397569 -0.162810
v 0.006401 -0.441061 -0.130002
v -0.019507 -0.473379 -0.092573
v -0.047619 -0.493280 -0.051959
v -0.039645 0.192723 -0.045642
v -0.003865 0.172822 -0.080181
v 0.029110 0.140504 -0.112012
v 0.058013 0.097012 -0.139912
v 0.081732 0.044016 -0.162810
v 0.099358 -0.016446 -0.179824
v 0.110211 -0.082051 -0.190301
v 0.113876 -0.150279 -0.193839
v 0.110211 -0.218506 -0.190301
v 0.099358 -0.284111 -0.179824
v 0.081732 -0.344573 -0.162810
v 0.058013 -0.397569 -0.139912
v 0.029110 -0.441061 -0.112012
v -0.003865 -0.473379 -0.080181
v -0.039645 -0.493280 -0.045642
v -0.033100 0.192723 -0.037944
v 0.008972 0.172822 -0.065081
v 0.047746 0.140504 -0.090091
v 0.081732 0.097012 -0.112012
v 0.109624 0.044016 -0.130002
v 0.130349 -0.016446 -0.143370
v 0.143112 -0.082051 -0.151602
v 0.147421 -0.150279 -0.154382
v 0.143112 -0.218506 -0.151602
v 0.130349 -0.284111 -0.143370
v 0.109624 -0.344573 -0.130002
v 0.081732 -0.397569 -0.112012
v 0.047746 -0.441061 -0.090091
v 0.008972 -0.473379 -0.065081
v -0.033100 -0.493280 -0.037944
v -0.076854 0.199443 -0.009722
v -0.028237 0.192723 -0.029162
v 0.018511 0.172822 -0.047854
v 0.061595 0.140504 -0.065081
v 0.099358 0.097012 -0.080181
v 0.130349 0.044016 -0.092573
v 0.153378 -0.016446 -0.101781
v 0.167559 -0.082051 -0.107451
v 0.172347 -0.150279 -0.109365
v 0.167559 -0.218506 -0.107451
v 0.153378 -0.284111 -0.101781
v 0.130349 -0.344573 -0.092573
v 0.099358 -0.397569 -0.080181
v 0.061595 -0.441061 -0.065081
v 0.018511 -0.473379 -0.047854
v -0.028237 -0.493280 -0.029162
v -0.025243 0.192723 -0.019633
v 0.024385 0.172822 -0.029162
v 0.070122 0.140504 -0.037944
v 0.110211 0.097012 -0.045642
v 0.143112 0.044016 -0.051959
v 0.167559 -0.016446 -0.056653
v 0.182613 -0.082051 -0.059544
v 0.187697 -0.150279 -0.060520
v 0.182613 -0.218506 -0.059544
v 0.167559 -0.284111 -0.056653
v 0.143112 -0.344573 -0.051959
v 0.110211 -0.397569 -0.045642
v 0.070122 -0.441061 -0.037944
v 0.024385 -0.473379 -0.029162
v -0.025243 -0.493280 -0.019633
v -0.024232 0.192723 -0.009722
v 0.026368 0.172822 -0.009722
v 0.073002 0.140504 -0.009722
v 0.113876 0.097012 -0.009722
v 0.147421 0.044016 -0.009722
v 0.172347 -0.016446 -0.009722
v 0.187697 -0.082051 -0.009722
v 0.192879 -0.150279 -0.009722
v 0.187697 -0.218506 -0.009722
v 0.172347 -0.284111 -0.009722
v 0.147421 -0.344573 -0.009722
v 0.113876 -0.397569 -0.009722
v 0.073002 -0.441061 -0.009722
v 0.026368 -0.473379 -0.009722
v -0.024232 -0.493280 -0.009722
v -0.025243 0.192723 0.000188
v 0.024385 0.172822 0.009717
v 0.070122 0.140504 0.018499
v 0.110211 0.097012 0.026197
v 0.143112 0.044016 0.032514
v 0.167559 -0.016446 0.037208
v 0.182613 -0.082051 0.040099
v 0.187696 -0.150279 0.041075
v 0.182613 -0.218506 0.040099
v 0.167559 -0.284111 0.037208
v 0.143112 -0.344573 0.032514
v 0.110211 -0.397569 0.026197
v 0.070122 -0.441061 0.018499
v 0.024385 -0.473379 0.009717
v -0.025243 -0.493280 0.000188
v -0.028237 0.192723 0.009717
v 0.018511 0.172822 0.028409
v 0.061595 0.140504 0.045636
v 0.099358 0.097012 0.060736
v 0.130349 0.044016 0.073128
v 0.153378 -0.016446 0.082336
v 0.167559 -0.082051 0.088006
v 0.172347 -0.150279 0.089921
v 0.167559 -0.218506 0.088006
v 0.153378 -0.284111 0.082336
v 0.130349 -0.344573 0.073128
v 0.099358 -0.397569 0.060736
v 0.061595 -0.441061 0.045636
v 0.018511 -0.473379 0.028409
v -0.028237 -0.493280 0.009717
v -0.033100 0.192723 0.018499
v 0.008972 0.172822 0.045636
v 0.047746 0.140504 0.070646
v 0.081732 0.097012 0.092567
v 0.109624 0.044016 0.110557
v 0.130349 -0.016446 0.123925
v 0.143112 -0.082051 0.132157
v 0.147421 -0.150279 0.134937
v 0.143112 -0.218506 0.132157
v 0.130349 -0.284111 0.123925
v 0.109624 -0.344573 0.110557
v 0.081732 -0.397569 0.092567
v 0.047746 -0.441061 0.070646
v 0.008972 -0.473379 0.045636
v -0.033100 -0.493280 0.018499
v -0.039645 0.192723 0.026197
v -0.003865 0.172822 0.060736
v 0.029110 0.140504 0.092567
v 0.058013 0.097012 0.120467
v 0.081732 0.044016 0.143365
v 0.099358 -0.016446 0.160379
v 0.110211 -0.082051 0.170856
v 0.113876 -0.150279 0.174394
v 0.110211 -0.218506 0.170856
v 0.099358 -0.284111 0.160379
v 0.081732 -0.344573 0.143365
v 0.058013 -0.397569 0.120467
v 0.029110 -0.441061 0.092567
v -0.003865 -0.473379 0.060736
v -0.039645 -0.493280 0.026197
v -0.047619 0.192723 0.032514
v -0.019507 0.172822 0.073128
v 0.006401 0.140504 0.110557
v 0.029110 0.097012 0.143365
v 0.047746 0.044016 0.170289
v 0.061595 -0.016446 0.190295
v 0.070122 -0.082051 0.202615
v 0.073002 -0.150279 0.206775
v 0.070122 -0.218506 0.202615
v 0.061595 -0.284111 0.190295
v 0.047746 -0.344573 0.170289
v 0.029110 -0.397569 0.143365
v 0.006401 -0.441061 0.110557
v -0.019507 -0.473379 0.073128
v -0.047619 -0.493280 0.032514
v -0.056717 0.192723 0.037208
v -0.037353 0.172822 0.082336
v -0.019507 0.140504 0.123925
v -0.003865 0.097012 0.160379
v 0.008972 0.044016 0.190295
v 0.018511 -0.016446 0.212526
v 0.024385 -0.082051 0.226215
v 0.026368 -0.150279 0.230837
v 0.024385 -0.218506 0.226215
v 0.018511 -0.284111 0.212526
v 0.008972 -0.344573 0.190295
v -0.003865 -0.397569 0.160379
v -0.019507 -0.441061 0.123925
v -0.037353 -0.473379 0.082336
v -0.056717 -0.493280 0.037208
v -0.066588 0.192723 0.040099
v -0.056717 0.172822 0.088006
v -0.047619 0.140504 0.132157
v -0.039645 0.097012 0.170856
v -0.033100 0.044016 0.202615
v -0.028237 -0.016446 0.226215
v -0.025243 -0.082051 0.240747
v -0.024232 -0.150279 0.245654
v -0.025243 -0.218506 0.240747
v -0.028237 -0.284111 0.226215
v -0.033100 -0.344573 0.202615
v -0.039645 -0.397569 0.170856
v -0.047619 -0.441061 0.132157
v -0.056717 -0.473379 0.088006
v -0.066588 -0.493280 0.040099
v -0.076854 0.192723 0.041075
v -0.076854 0.172822 0.089921
v -0.076854 0.140504 0.134937
v -0.076854 0.097012 0.174394
v -0.076854 0.044016 0.206775
v -0.076854 -0.016446 0.230837
v -0.076854 -0.082051 0.245654
v -0.076854 -0.150279 0.250657
v -0.076854 -0.218506 0.245654
v -0.076854 -0.284111 0.230837
v -0.076854 -0.344573 0.206775
v -0.076854 -0.397569 0.174394
v -0.076854 -0.441061 0.134937
v -0.076854 -0.473379 0.089921
v -0.076854 -0.493280 0.041075
v -0.087120 0.192723 0.040099
v -0.096992 0.172822 0.088006
v -0.106090 0.140504 0.132157
v -0.114064 0.097012 0.170856
v -0.120608 0.044016 0.202615
v -0.125471 -0.016446 0.226215
v -0.128466 -0.082051 0.240747
v -0.129477 -0.150279 0.245654
v -0.128466 -0.218506 0.240747
v -0.125471 -0.284111 0.226215
v -0.120608 -0.344573 0.202615
v -0.114064 -0.397569 0.170856
v -0.106090 -0.441061 0.132157
v -0.096992 -0.473379 0.088006
v -0.087120 -0.493280 0.040099
v -0.096992 0.192723 0.037208
v -0.116356 0.172822 0.082336
v -0.134202 0.140504 0.123925
v -0.149844 0.097012 0.160379
v -0.162681 0.044016 0.190295
v -0.172219 -0.016446 0.212526
v -0.178093 -0.082051 0.226215
v -0.180077 -0.150279 0.230837
v -0.178093 -0.218506 0.226215
v -0.172219 -0.284111 0.212526
v -0.162681 -0.344573 0.190295
v -0.149844 -0.397569 0.160379
v -0.134202 -0.441061 0.123925
v -0.116356 -0.473379 0.082336
v -0.096992 -0.493280 0.037208
v -0.106090 0.192723 0.032514
v -0.134202 0.172822 0.073128
v -0.160110 0.140504 0.110557
v -0.182818 0.097012 0.143365
v -0.201455 0.044016 0.170289
v -0.215303 -0.016446 0.190295
v -0.223831 -0.082051 0.202615
v -0.226710 -0.150279 0.206775
v -0.223831 -0.218506 0.202615
v -0.215303 -0.284111 0.190295
v -0.201455 -0.344573 0.170289
v -0.182818 -0.397569 0.143365
v -0.160110 -0.441061 0.110557
v -0.134202 -0.473379 0.073128
v -0.106090 -0.493280 0.032514
v -0.114064 0.192723 0.026197
v -0.149844 0.172822 0.060736
v -0.182818 0.140504 0.092567
v -0.211721 0.097012 0.120467
v -0.235441 0.044016 0.143365
v -0.253066 -0.016446 0.160379
v -0.263920 -0.082051 0.170856
v -0.267585 -0.150279 0.174394
v -0.263920 -0.218506 0.170856
v -0.253066 -0.284111 0.160379
v -0.235441 -0.344573 0.143365
v -0.211721 -0.397569 0.120467
v -0.182818 -0.441061 0.092567
v -0.149844 -0.473379 0.060736
v -0.114064 -0.493280 0.026197
v -0.076854 -0.500000 -0.009722
v -0.120608 0.192723 0.018499
v -0.162681 0.172822 0.045636
v -0.201455 0.140504 0.070646
v -0.235441 0.097012 0.092567
v -0.263332 0.044016 0.110557
v -0.284058 -0.016446 0.123925
v -0.296820 -0.082051 0.132157
v -0.301129 -0.150279 0.134937
v -0.296820 -0.218506 0.132157
v -0.284058 -0.284111 0.123925
v -0.263332 -0.344573 0.110557
v -0.235441 -0.397569 0.092567
v -0.201455 -0.441061 0.070646
v -0.162681 -0.473379 0.045636
v -0.120608 -0.493280 0.018499
v -0.125471 0.192723 0.009717
v -0.172219 0.172822 0.028409
v -0.215303 0.140504 0.045636
v -0.253066 0.097012 0.060736
v -0.284058 0.044016 0.073128
v -0.307086 -0.016446 0.082336
v -0.321267 -0.082051 0.088006
v -0.326055 -0.150279 0.089920
v -0.321267 -0.218506 0.088006
v -0.307086 -0.284111 0.082336
v -0.284058 -0.344573 0.073128
v -0.253066 -0.397569 0.060736
v -0.215303 -0.441061 0.045636
v -0.172219 -0.473379 0.028409
v -0.125471 -0.493280 0.009717
v -0.128465 0.192723 0.000188
v -0.178093 0.172822 0.009717
v -0.223831 0.140504 0.018499
v -0.263920 0.097012 0.026197
v -0.296820 0.044016 0.032514
v -0.321267 -0.016446 0.037208
v -0.336322 -0.082051 0.040099
v -0.341405 -0.150279 0.041075
v -0.336322 -0.218506 0.040099
v -0.321267 -0.284111 0.037208
v -0.296820 -0.344573 0.032514
v -0.263920 -0.397569 0.026197
v -0.223831 -0.441061 0.018499
v -0.178093 -0.473379 0.009717
v -0.128465 -0.493280 0.000188
v -0.129477 0.192723 -0.009722
v -0.180077 0.172822 -0.009722
v -0.226710 0.140504 -0.009722
v -0.267585 0.097012 -0.009722
v -0.301129 0.044016 -0.009722
v -0.326056 -0.016446 -0.009722
v -0.341405 -0.082051 -0.009723
v -0.346588 -0.150279 -0.009723
v -0.341405 -0.218506 -0.009723
v -0.326056 -0.284111 -0.009722
v -0.301129 -0.344573 -0.009722
v -0.267585 -0.397569 -0.009722
v -0.226710 -0.441061 -0.009722
v -0.180077 -0.473379 -0.009722
v -0.129477 -0.493280 -0.009722
v -0.128465 0.192723 -0.019633
v -0.178093 0.172822 -0.029162
v -0.223831 0.140504 -0.037944
v -0.263920 0.097012 -0.045642
v -0.296820 0.044016 -0.051959
v -0.321267 -0.016446 -0.056653
v -0.336322 -0.082051 -0.059544
v -0.341405 -0.150279 -0.060520
v -0.336322 -0.218506 -0.059544
v -0.321267 -0.284111 -0.056653
v -0.296820 -0.344573 -0.051959
v -0.263920 -0.397569 -0.045642
v -0.223831 -0.441061 -0.037944
v -0.178093 -0.473379 -0.029162
v -0.128465 -0.493280 -0.019633
v -0.125471 0.192723 -0.029162
v -0.172219 0.172822 -0.047854
v -0.215303 0.140504 -0.065081
v -0.253066 0.097012 -0.080181
v -0.284057 0.044016 -0.092573
v -0.307086 -0.016446 -0.101781
v -0.321267 -0.082051 -0.107451
v -0.326055 -0.150279 -0.109365
v -0.321267 -0.218506 -0.107451
v -0.307086 -0.284111 -0.101781
v -0.284057 -0.344573 -0.092573
v -0.253066 -0.397569 -0.080181
v -0.215303 -0.441061 -0.065081
v -0.172219 -0.473379 -0.047854
v -0.125471 -0.493280 -0.029162
v -0.120608 0.192723 -0.037944
v -0.162681 0.172822 -0.065081
v -0.201455 0.140504 -0.090091
v -0.235441 0.097012 -0.112012
v -0.263332 0.044016 -0.130002
v -0.284058 -0.016446 -0.143370
v -0.296820 -0.082051 -0.151602
v -0.301129 -0.150279 -0.154382
v -0.296820 -0.218506 -0.151602
v -0.284058 -0.284111 -0.143370
v -0.263332 -0.344573 -0.130002
v -0.235441 -0.397569 -0.112012
v -0.201455 -0.441061 -0.090091
v -0.162681 -0.473379 -0.065081
v -0.120608 -0.493280 -0.037944
v -0.114064 0.192723 -0.045642
v -0.149844 0.172822 -0.080181
v -0.182818 0.140504 -0.112012
v -0.211721 0.097012 -0.139912
v -0.235441 0.044016 -0.162809
v -0.253066 -0.016446 -0.179824
v -0.263920 -0.082051 -0.190301
v -0.267584 -0.150279 -0.193839
v -0.263920 -0.218506 -0.190301
v -0.253066 -0.284111 -0.179824
v -0.235441 -0.344573 -0.162809
v -0.211721 -0.397569 -0.139912
v -0.182818 -0.441061 -0.112012
v -0.149844 -0.473379 -0.080181
v -0.114064 -0.493280 -0.045642
v -0.106090 0.192723 -0.051959
v -0.134202 0.172822 -0.092573
v -0.160110 0.140504 -0.130002
v -0.182818 0.097012 -0.162810
v -0.201455 0.044016 -0.189734
v -0.215303 -0.016446 -0.209740
v -0.223831 -0.082051 -0.222060
v -0.226710 -0.150279 -0.226220
v -0.223831 -0.218506 -0.222060
v -0.215303 -0.284111 -0.209740
v -0.201455 -0.344573 -0.189734
v -0.182818 -0.397569 -0.162810
v -0.160110 -0.441061 -0.130002
v -0.134202 -0.473379 -0.092573
v -0.106090 -0.493280 -0.051959
v -0.096992 0.192723 -0.056653
v -0.116356 0.172822 -0.101781
v -0.134202 0.140504 -0.143370
v -0.149844 0.097012 -0.179824
v -0.162681 0.044016 -0.209740
v -0.172219 -0.016446 -0.231970
v -0.178093 -0.082051 -0.245660
v -0.180077 -0.150279 -0.250282
v -0.178093 -0.218506 -0.245660
v -0.172219 -0.284111 -0.231970
v -0.162681 -0.344573 -0.209740
v -0.149844 -0.397569 -0.179824
v -0.134202 -0.441061 -0.143370
v -0.116356 -0.473379 -0.101781
v -0.096992 -0.493280 -0.056653
v -0.087120 0.192723 -0.059544
v -0.096992 0.172822 -0.107451
v -0.106090 0.140504 -0.151602
v -0.114064 0.097012 -0.190301
v -0.120608 0.044016 -0.222060
v -0.125471 -0.016446 -0.245660
v -0.128465 -0.082051 -0.260192
v -0.129476 -0.150279 -0.265099
v -0.128465 -0.218506 -0.260192
v -0.125471 -0.284111 -0.245660
v -0.120608 -0.344573 -0.222060
v -0.114064 -0.397569 -0.190301
v -0.106090 -0.441061 -0.151602
v -0.096992 -0.473379 -0.107451
v -0.087120 -0.493280 -0.059544
v -0.076854 0.192723 -0.060520
v -0.076854 0.172822 -0.109365
v -0.076854 0.097012 -0.193839
v -0.076854 -0.016446 -0.250282
v -0.076854 -0.284111 -0.250282
v -0.076854 -0.397569 -0.193839
v -0.076854 -0.441061 -0.154381
v -0.076854 -0.473379 -0.109365
v -0.076854 -0.493280 -0.060520
vn 0.0923 -0.2194 -0.9713
vn 0.0554 0.8111 -0.5823
vn 0.0880 -0.3683 -0.9255
vn 0.0702 0.6703 -0.7388
vn 0.0809 -0.5197 -0.8505
vn 0.0809 0.5197 -0.8505
vn 0.0702 -0.6703 -0.7388
vn 0.0880 0.3683 -0.9255
vn 0.0554 -0.8111 -0.5823
vn 0.0923 0.2194 -0.9713
vn 0.0358 -0.9255 -0.3771
vn 0.0944 0.0728 -0.9929
vn 0.0125 0.9913 -0.1311
vn 0.0125 -0.9913 -0.1311
vn 0.0944 -0.0728 -0.9929
vn 0.0358 0.9255 -0.3771
vn 0.2803 -0.0730 -0.9571
vn 0.1062 0.9258 -0.3627
vn 0.2741 -0.2199 -0.9362
vn 0.1641 0.8118 -0.5604
vn 0.2612 -0.3691 -0.8919
vn 0.2083 0.6712 -0.7114
vn 0.2399 -0.5207 -0.8194
vn 0.2399 0.5207 -0.8194
vn 0.2083 -0.6712 -0.7114
vn 0.2612 0.3691 -0.8919
vn 0.1641 -0.8118 -0.5604
vn 0.2741 0.2199 -0.9362
vn 0.1062 -0.9258 -0.3627
vn 0.2803 0.0730 -0.9571
vn 0.0369 0.9913 -0.1261
vn 0.0369 -0.9913 -0.1261
vn 0.3392 -0.6729 -0.6573
vn 0.4259 0.3707 -0.8254
vn 0.2669 -0.8131 -0.5173
vn 0.4472 0.2209 -0.8667
vn 0.1726 -0.9265 -0.3345
vn 0.4573 0.0733 -0.8863
vn 0.0600 0.9914 -0.1162
vn 0.0600 -0.9914 -0.1162
vn 0.4573 -0.0733 -0.8863
vn 0.1726 0.9265 -0.3345
vn 0.4472 -0.2209 -0.8667
vn 0.2669 0.8131 -0.5173
vn 0.4259 -0.3707 -0.8254
vn 0.3392 0.6729 -0.6573
vn 0.3910 -0.5225 -0.7577
vn 0.3910 0.5225 -0.7577
vn 0.2325 0.9273 -0.2935
vn 0.6054 -0.2222 -0.7642
vn 0.3600 0.8148 -0.4544
vn 0.5762 -0.3727 -0.7274
vn 0.4580 0.6753 -0.5781
vn 0.5286 -0.5248 -0.6672
vn 0.5286 0.5248 -0.6672
vn 0.4580 -0.6753 -0.5781
vn 0.5762 0.3727 -0.7274
vn 0.3600 -0.8148 -0.4544
vn 0.6054 0.2222 -0.7642
vn 0.2325 -0.9273 -0.2935
vn 0.6193 0.0738 -0.7817
vn 0.0807 0.9915 -0.1019
vn 0.0807 -0.9915 -0.1019
vn 0.6193 -0.0738 -0.7817
vn 0.7063 0.3749 -0.6005
vn 0.4396 -0.8167 -0.3738
vn 0.7426 0.2237 -0.6313
vn 0.2835 -0.9282 -0.2411
vn 0.7598 0.0743 -0.6459
vn 0.0984 0.9916 -0.0836
vn 0.0984 -0.9916 -0.0836
vn 0.7598 -0.0743 -0.6459
vn 0.2836 0.9282 -0.2411
vn 0.7426 -0.2237 -0.6313
vn 0.4396 0.8167 -0.3738
vn 0.7063 -0.3749 -0.6005
vn 0.5602 0.6778 -0.4762
vn 0.6473 -0.5275 -0.5503
vn 0.6473 0.5275 -0.5503
vn 0.5602 -0.6778 -0.4762
vn 0.8524 -0.2250 -0.4720
vn 0.5027 0.8185 -0.2783
vn 0.8103 -0.3770 -0.4487
vn 0.6413 0.6801 -0.3551
vn 0.7419 -0.5299 -0.4108
vn 0.7419 0.5299 -0.4108
vn 0.6413 -0.6801 -0.3551
vn 0.8103 0.3770 -0.4487
vn 0.5027 -0.8185 -0.2783
vn 0.8524 0.2250 -0.4720
vn 0.3238 -0.9290 -0.1793
vn 0.8724 0.0748 -0.4831
vn 0.1122 0.9917 -0.0621
vn 0.1122 -0.9917 -0.0621
vn 0.8724 -0.0748 -0.4831
vn 0.3238 0.9290 -0.1793
vn 0.5463 -0.8198 -0.1717
vn 0.9293 0.2261 -0.2920
vn 0.3516 -0.9296 -0.1105
vn 0.9513 0.0752 -0.2989
vn 0.1218 0.9918 -0.0383
vn 0.1218 -0.9918 -0.0383
vn 0.9513 -0.0752 -0.2989
vn 0.3516 0.9296 -0.1105
vn 0.9293 -0.2261 -0.2920
vn 0.5463 0.8198 -0.1717
vn 0.8830 -0.3786 -0.2775
vn 0.6977 0.6820 -0.2193
vn 0.8079 -0.5318 -0.2539
vn 0.8079 0.5318 -0.2539
vn 0.6977 -0.6820 -0.2193
vn 0.8830 0.3786 -0.2775
vn 0.9204 -0.3795 -0.0939
vn 0.7267 0.6830 -0.0741
vn 0.8418 -0.5329 -0.0859
vn 0.8418 0.5329 -0.0859
vn 0.7267 -0.6830 -0.0741
vn 0.9204 0.3795 -0.0939
vn 0.5686 -0.8205 -0.0580
vn 0.9689 0.2267 -0.0989
vn 0.3658 -0.9300 -0.0373
vn 0.9920 0.0754 -0.1012
vn 0.1267 0.9919 -0.0129
vn 0.1267 -0.9919 -0.0129
vn 0.9920 -0.0754 -0.1012
vn 0.3658 0.9300 -0.0373
vn 0.9689 -0.2267 -0.0989
vn 0.5686 0.8205 -0.0580
vn 0.3658 -0.9300 0.0373
vn 0.9920 0.0754 0.1012
vn 0.1267 0.9919 0.0129
vn 0.1267 -0.9919 0.0129
vn 0.9920 -0.0754 0.1012
vn 0.3658 0.9300 0.0373
vn 0.9689 -0.2267 0.0989
vn 0.5686 0.8205 0.0580
vn 0.9204 -0.3795 0.0939
vn 0.7267 0.6830 0.0741
vn 0.8418 -0.5329 0.0859
vn 0.8418 0.5329 0.0859
vn 0.7267 -0.6830 0.0741
vn 0.9204 0.3795 0.0939
vn 0.5686 -0.8205 0.0580
vn 0.9689 0.2267 0.0989
vn 0.6978 0.6820 0.2193
vn 0.8079 -0.5318 0.2539
vn 0.8079 0.5318 0.2539
vn 0.6977 -0.6820 0.2193
vn 0.8830 0.3786 0.2775
vn 0.5463 -0.8198 0.1717
vn 0.9293 0.2261 0.2920
vn 0.3516 -0.9296 0.1105
vn 0.9513 0.0752 0.2989
vn 0.1218 0.9918 0.0383
vn 0.1218 -0.9918 0.0383
vn 0.9513 -0.0752 0.2989
vn 0.3516 0.9296 0.1105
vn 0.9293 -0.2261 0.2920
vn 0.5463 0.8198 0.1717
vn 0.8830 -0.3786 0.2775
vn 0.8724 0.0748 0.4831
vn 0.1122 0.9917 0.0621
vn 0.1122 -0.9917 0.0621
vn 0.8724 -0.0748 0.4831
vn 0.3238 0.9290 0.1793
vn 0.8524 -0.2250 0.4720
vn 0.5027 0.8185 0.2783
vn 0.8103 -0.3770 0.4487
vn 0.6413 0.6801 0.3551
vn 0.7419 -0.5299 0.4108
vn 0.7419 0.5299 0.4108
vn 0.6413 -0.6801 0.3551
vn 0.8103 0.3770 0.4487
vn 0.5027 -0.8185 0.2783
vn 0.8524 0.2250 0.4720
vn 0.3238 -0.9290 0.1793
vn 0.6473 -0.5275 0.5503
vn 0.6473 0.5275 0.5503
vn 0.5602 -0.6778 0.4762
vn 0.7063 0.3749 0.6005
vn 0.4396 -0.8167 0.3738
vn 0.7426 0.2237 0.6313
vn 0.2836 -0.9282 0.2411
vn 0.7598 0.0743 0.6459
vn 0.0984 0.9916 0.0836
vn 0.0984 -0.9916 0.0836
vn 0.7598 -0.0743 0.6459
vn 0.2836 0.9282 0.2411
vn 0.7426 -0.2237 0.6313
vn 0.4396 0.8167 0.3738
vn 0.7063 -0.3749 0.6005
vn 0.5602 0.6778 0.4762
vn 0.0807 0.9915 0.1019
vn 0.0807 -0.9915 0.1019
vn 0.6193 -0.0738 0.7817
vn 0.2325 0.9273 0.2935
vn 0.6054 -0.2222 0.7642
vn 0.3600 0.8148 0.4544
vn 0.5762 -0.3727 0.7274
vn 0.4580 0.6753 0.5781
vn 0.5286 -0.5248 0.6672
vn 0.5286 0.5248 0.6672
vn 0.4580 -0.6753 0.5781
vn 0.5762 0.3727 0.7274
vn 0.3600 -0.8148 0.4544
vn 0.6054 0.2222 0.7642
vn 0.2325 -0.9273 0.2935
vn 0.6193 0.0738 0.7817
vn 0.3910 0.5225 0.7577
vn 0.3392 -0.6729 0.6573
vn 0.4259 0.3707 0.8254
vn 0.2669 -0.8131 0.5173
vn 0.4472 0.2209 0.8667
vn 0.1726 -0.9265 0.3345
vn 0.4573 0.0733 0.8863
vn 0.0600 0.9914 0.1162
vn 0.0600 -0.9914 0.1162
vn 0.4573 -0.0733 0.8863
vn 0.1726 0.9265 0.3345
vn 0.4472 -0.2209 0.8667
vn 0.2669 0.8131 0.5173
vn 0.4259 -0.3707 0.8254
vn 0.3392 0.6729 0.6573
vn 0.3910 -0.5225 0.7577
vn 0.2803 -0.0730 0.9571
vn 0.1062 0.9258 0.3627
vn 0.2741 -0.2199 0.9362
vn 0.1641 0.8118 0.5604
vn 0.2612 -0.3691 0.8919
vn 0.2083 0.6712 0.7114
vn 0.2399 -0.5207 0.8194
vn 0.2399 0.5207 0.8194
vn 0.2083 -0.6712 0.7114
vn 0.2612 0.3691 0.8919
vn 0.1641 -0.8118 0.5604
vn 0.2741 0.2199 0.9362
vn 0.1062 -0.9258 0.3627
vn 0.2803 0.0730 0.9571
vn 0.0369 0.9913 0.1261
vn 0.0369 -0.9913 0.1261
vn 0.0702 -0.6703 0.7388
vn 0.0880 0.3683 0.9255
vn 0.0554 -0.8111 0.5823
vn 0.0923 0.2194 0.9713
vn 0.0359 -0.9255 0.3771
vn 0.0944 0.0728 0.9929
vn 0.0125 0.9913 0.1311
vn 0.0125 -0.9913 0.1311
vn 0.0944 -0.0728 0.9929
vn 0.0359 0.9255 0.3771
vn 0.0923 -0.2194 0.9713
vn 0.0554 0.8111 0.5823
vn 0.0880 -0.3683 0.9255
vn 0.0702 0.6703 0.7388
vn 0.0809 -0.5197 0.8505
vn 0.0809 0.5197 0.8505
vn -0.0923 -0.2194 0.9713
vn -0.0554 0.8111 0.5823
vn -0.0880 -0.3683 0.9255
vn -0.0702 0.6703 0.7388
vn -0.0809 -0.5197 0.8505
vn -0.0809 0.5197 0.8505
vn -0.0702 -0.6703 0.7388
vn -0.0880 0.3683 0.9255
vn -0.0554 -0.8111 0.5823
vn -0.0923 0.2194 0.9713
vn -0.0359 -0.9255 0.3771
vn -0.0944 0.0728 0.9929
vn -0.0125 0.9913 0.1311
vn -0.0125 -0.9913 0.1311
vn -0.0944 -0.0728 0.9929
vn -0.0359 0.9255 0.3771
vn -0.1641 -0.8118 0.5604
vn -0.2741 0.2199 0.9362
vn -0.1062 -0.9258 0.3627
vn -0.2803 0.0730 0.9571
vn -0.0369 0.9913 0.1261
vn -0.0369 -0.9913 0.1261
vn -0.2803 -0.0730 0.9571
vn -0.1062 0.9258 0.3627
vn -0.2741 -0.2199 0.9362
vn -0.1641 0.8118 0.5604
vn -0.2612 -0.3691 0.8919
vn -0.2083 0.6712 0.7114
vn -0.2399 -0.5207 0.8194
vn -0.2399 0.5207 0.8194
vn -0.2083 -0.6712 0.7114
vn -0.2612 0.3691 0.8919
vn -0.2669 0.8131 0.5173
vn -0.4259 -0.3707 0.8254
vn -0.3392 0.6729 0.6573
vn -0.3910 -0.5225 0.7577
vn -0.3910 0.5225 0.7577
vn -0.3392 -0.6729 0.6573
vn -0.4259 0.3707 0.8254
vn -0.2669 -0.8131 0.5173
vn -0.4472 0.2209 0.8667
vn -0.1726 -0.9265 0.3345
vn -0.4573 0.0733 0.8863
vn -0.0600 0.9914 0.1162
vn -0.0600 -0.9914 0.1162
vn -0.4573 -0.0733 0.8863
vn -0.1726 0.9265 0.3345
vn -0.4472 -0.2209 0.8667
vn -0.6054 0.2222 0.7642
vn -0.2325 -0.9273 0.2935
vn -0.6193 0.0738 0.7817
vn -0.0807 0.9915 0.1019
vn -0.0807 -0.9915 0.1019
vn -0.6193 -0.0738 0.7817
vn -0.2325 0.9273 0.2935
vn -0.6054 -0.2222 0.7642
vn -0.3600 0.8148 0.4544
vn -0.5762 -0.3727 0.7274
vn -0.4580 0.6753 0.5781
vn -0.5286 -0.5248 0.6672
vn -0.5286 0.5248 0.6672
vn -0.4580 -0.6753 0.5781
vn -0.5762 0.3727 0.7274
vn -0.3600 -0.8148 0.4544
vn -0.7063 -0.3749 0.6005
vn -0.5602 0.6778 0.4762
vn -0.6473 -0.5275 0.5503
vn -0.6473 0.5275 0.5503
vn -0.5602 -0.6778 0.4762
vn -0.7063 0.3749 0.6005
vn -0.4396 -0.8167 0.3738
vn -0.7426 0.2237 0.6313
vn -0.2835 -0.9282 0.2411
vn -0.7598 0.0743 0.6459
vn -0.0984 0.9916 0.0836
vn -0.0984 -0.9916 0.0836
vn -0.7598 -0.0743 0.6459
vn -0.2836 0.9282 0.2411
vn -0.7426 -0.2237 0.6313
vn -0.4396 0.8167 0.3738
vn -0.3238 -0.9290 0.1793
vn -0.8724 0.0748 0.4831
vn -0.1122 0.9917 0.0621
vn -0.1122 -0.9917 0.0621
vn -0.8724 -0.0748 0.4831
vn -0.3238 0.9290 0.1793
vn -0.8524 -0.2250 0.4720
vn -0.5027 0.8185 0.2783
vn -0.8103 -0.3770 0.4487
vn -0.6413 0.6801 0.3551
vn -0.7419 -0.5299 0.4108
vn -0.7419 0.5299 0.4108
vn -0.6413 -0.6801 0.3551
vn -0.8103 0.3770 0.4487
vn -0.5027 -0.8185 0.2783
vn -0.8524 0.2250 0.4720
vn -0.6977 0.6820 0.2193
vn -0.8079 -0.5318 0.2539
vn -0.8079 0.5318 0.2539
vn -0.6977 -0.6820 0.2193
vn -0.8830 0.3786 0.2775
vn -0.5463 -0.8198 0.1717
vn -0.9293 0.2261 0.2920
vn -0.3516 -0.9296 0.1105
vn -0.9513 0.0752 0.2989
vn -0.1218 0.9918 0.0383
vn -0.1218 -0.9918 0.0383
vn -0.9513 -0.0752 0.2989
vn -0.3516 0.9296 0.1105
vn -0.9293 -0.2261 0.2920
vn -0.5463 0.8198 0.1717
vn -0.8830 -0.3786 0.2775
vn -0.9920 0.0754 0.1012
vn -0.1267 0.9919 0.0129
vn -0.1267 -0.9919 0.0129
vn -0.9920 -0.0754 0.1012
vn -0.3658 0.9300 0.0373
vn -0.9689 -0.2267 0.0989
vn -0.5686 0.8205 0.0580
vn -0.9204 -0.3795 0.0939
vn -0.7267 0.6830 0.0741
vn -0.8418 -0.5329 0.0859
vn -0.8418 0.5329 0.0859
vn -0.7267 -0.6830 0.0741
vn -0.9204 0.3795 0.0939
vn -0.5686 -0.8205 0.0580
vn -0.9689 0.2267 0.0989
vn -0.3658 -0.9300 0.0373
vn -0.8418 -0.5329 -0.0859
vn -0.8418 0.5329 -0.0859
vn -0.7267 -0.6830 -0.0741
vn -0.9204 0.3795 -0.0939
vn -0.5686 -0.8205 -0.0580
vn -0.9689 0.2267 -0.0989
vn -0.3658 -0.9300 -0.0373
vn -0.9920 0.0754 -0.1012
vn -0.1267 0.9919 -0.0129
vn -0.1267 -0.9919 -0.0129
vn -0.9920 -0.0754 -0.1012
vn -0.3658 0.9300 -0.0373
vn -0.9689 -0.2267 -0.0989
vn -0.5686 0.8205 -0.0580
vn -0.9204 -0.3795 -0.0939
vn -0.7267 0.6830 -0.0741
vn -0.1218 -0.9918 -0.0383
vn -0.9513 -0.0752 -0.2989
vn -0.3516 0.9296 -0.1105
vn -0.9293 -0.2261 -0.2920
vn -0.5463 0.8198 -0.1717
vn -0.8830 -0.3786 -0.2775
vn -0.6977 0.6820 -0.2193
vn -0.8079 -0.5318 -0.2539
vn -0.8079 0.5318 -0.2539
vn -0.6977 -0.6820 -0.2193
vn -0.8830 0.3786 -0.2775
vn -0.5463 -0.8198 -0.1717
vn -0.9293 0.2261 -0.2920
vn -0.3516 -0.9296 -0.1105
vn -0.9513 0.0752 -0.2989
vn -0.1218 0.9918 -0.0383
vn -0.6413 -0.6801 -0.3551
vn -0.8103 0.3770 -0.4487
vn -0.5027 -0.8185 -0.2783
vn -0.8524 0.2250 -0.4720
vn -0.3238 -0.9290 -0.1793
vn -0.8724 0.0748 -0.4831
vn -0.1122 0.9917 -0.0621
vn -0.1122 -0.9917 -0.0621
vn -0.8724 -0.0748 -0.4831
vn -0.3238 0.9290 -0.1793
vn -0.8524 -0.2250 -0.4720
vn -0.5027 0.8185 -0.2783
vn -0.8103 -0.3770 -0.4487
vn -0.6413 0.6801 -0.3551
vn -0.7419 -0.5299 -0.4108
vn -0.7419 0.5299 -0.4108
vn -0.2836 0.9282 -0.2411
vn -0.7426 -0.2237 -0.6313
vn -0.4396 0.8167 -0.3738
vn -0.7063 -0.3749 -0.6005
vn -0.5602 0.6778 -0.4762
vn -0.6473 -0.5275 -0.5503
vn -0.6473 0.5275 -0.5503
vn -0.5602 -0.6778 -0.4762
vn -0.7063 0.3749 -0.6005
vn -0.4396 -0.8167 -0.3738
vn -0.7426 0.2237 -0.6313
vn -0.2836 -0.9282 -0.2411
vn -0.7598 0.0743 -0.6459
vn -0.0984 0.9916 -0.0836
vn -0.0984 -0.9916 -0.0836
vn -0.7598 -0.0743 -0.6459
vn -0.5762 0.3727 -0.7274
vn -0.3600 -0.8148 -0.4544
vn -0.6054 0.2222 -0.7642
vn -0.2325 -0.9273 -0.2935
vn -0.6193 0.0738 -0.7817
vn -0.0807 0.9915 -0.1019
vn -0.0807 -0.9915 -0.1019
vn -0.6193 -0.0738 -0.7817
vn -0.2325 0.9273 -0.2935
vn -0.6054 -0.2222 -0.7642
vn -0.3600 0.8148 -0.4544
vn -0.5762 -0.3727 -0.7274
vn -0.4580 0.6753 -0.5781
vn -0.5286 -0.5248 -0.6672
vn -0.5286 0.5248 -0.6672
vn -0.4580 -0.6753 -0.5781
vn -0.4472 -0.2209 -0.8667
vn -0.2669 0.8131 -0.5173
vn -0.4259 -0.3707 -0.8254
vn -0.3392 0.6729 -0.6573
vn -0.3910 -0.5225 -0.7577
vn -0.3910 0.5225 -0.7577
vn -0.3392 -0.6729 -0.6573
vn -0.4259 0.3707 -0.8254
vn -0.2669 -0.8131 -0.5173
vn -0.4472 0.2209 -0.8667
vn -0.1726 -0.9265 -0.3345
vn -0.4573 0.0733 -0.8863
vn -0.0600 0.9914 -0.1162
vn -0.0600 -0.9914 -0.1162
vn -0.4573 -0.0733 -0.8863
vn -0.1726 0.9265 -0.3345
vn -0.1641 -0.8118 -0.5604
vn -0.2741 0.2199 -0.9362
vn -0.1062 -0.9258 -0.3627
vn -0.2803 0.0730 -0.9571
vn -0.0369 0.9913 -0.1261
vn -0.0369 -0.9913 -0.1261
vn -0.2803 -0.0730 -0.9571
vn -0.1062 0.9258 -0.3627
vn -0.2741 -0.2199 -0.9362
vn -0.1641 0.8118 -0.5604
vn -0.2612 -0.3691 -0.8919
vn -0.2083 0.6712 -0.7114
vn -0.2399 -0.5207 -0.8194
vn -0.2399 0.5207 -0.8194
vn -0.2083 -0.6712 -0.7114
vn -0.2612 0.3691 -0.8919
vn -0.0554 0.8111 -0.5823
vn -0.0880 -0.3683 -0.9255
vn -0.0702 0.6703 -0.7388
vn -0.0809 -0.5197 -0.8505
vn -0.0809 0.5197 -0.8505
vn -0.0702 -0.6703 -0.7388
vn -0.0880 0.3683 -0.9255
vn -0.0554 -0.8111 -0.5823
vn -0.0923 0.2194 -0.9713
vn -0.0359 -0.9255 -0.3771
vn -0.0944 0.0728 -0.9929
vn -0.0125 0.9913 -0.1311
vn -0.0125 -0.9913 -0.1311
vn -0.0944 -0.0728 -0.9929
vn -0.0358 0.9255 -0.3771
vn -0.0923 -0.2194 -0.9713
vn 0.2836 -0.9282 -0.2411
vn 0.2835 0.9282 -0.2411
vn 0.6977 0.6820 0.2193
vn 0.6978 -0.6820 0.2193
vn -0.2836 -0.9282 0.2411
vn -0.0358 -0.9255 -0.3771
vn -0.0359 0.9255 -0.3771
vt 0.750000 0.437500
vt 0.718750 0.375000
vt 0.750000 0.375000
vt 0.750000 0.812500
vt 0.718750 0.875000
vt 0.718750 0.812500
vt 0.718750 0.312500
vt 0.750000 0.312500
vt 0.750000 0.750000
vt 0.718750 0.750000
vt 0.718750 0.250000
vt 0.750000 0.250000
vt 0.750000 0.687500
vt 0.718750 0.687500
vt 0.718750 0.187500
vt 0.750000 0.187500
vt 0.750000 0.625000
vt 0.718750 0.625000
vt 0.718750 0.125000
vt 0.750000 0.125000
vt 0.750000 0.562500
vt 0.718750 0.562500
vt 0.718750 0.062500
vt 0.750000 0.062500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.734375 0.000000
vt 0.718750 0.437500
vt 0.750000 0.875000
vt 0.687500 0.437500
vt 0.687500 0.875000
vt 0.687500 0.375000
vt 0.687500 0.812500
vt 0.687500 0.312500
vt 0.687500 0.750000
vt 0.687500 0.250000
vt 0.687500 0.687500
vt 0.687500 0.187500
vt 0.687500 0.625000
vt 0.687500 0.125000
vt 0.687500 0.562500
vt 0.687500 0.062500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.703125 0.000000
vt 0.656250 0.250000
vt 0.656250 0.187500
vt 0.656250 0.625000
vt 0.656250 0.125000
vt 0.656250 0.562500
vt 0.656250 0.062500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.671875 0.000000
vt 0.656250 0.437500
vt 0.656250 0.875000
vt 0.656250 0.375000
vt 0.656250 0.812500
vt 0.656250 0.312500
vt 0.656250 0.750000
vt 0.656250 0.687500
vt 0.625000 0.875000
vt 0.625000 0.375000
vt 0.625000 0.812500
vt 0.625000 0.312500
vt 0.625000 0.750000
vt 0.625000 0.250000
vt 0.625000 0.687500
vt 0.625000 0.187500
vt 0.625000 0.625000
vt 0.625000 0.125000
vt 0.625000 0.562500
vt 0.625000 0.062500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.640625 0.000000
vt 0.625000 0.437500
vt 0.593750 0.625000
vt 0.593750 0.187500
vt 0.593750 0.125000
vt 0.593750 0.562500
vt 0.593750 0.062500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.609375 0.000000
vt 0.593750 0.437500
vt 0.593750 0.875000
vt 0.593750 0.375000
vt 0.593750 0.812500
vt 0.593750 0.312500
vt 0.593750 0.750000
vt 0.593750 0.250000
vt 0.593750 0.687500
vt 0.562500 0.375000
vt 0.562500 0.812500
vt 0.562500 0.312500
vt 0.562500 0.750000
vt 0.562500 0.250000
vt 0.562500 0.687500
vt 0.562500 0.187500
vt 0.562500 0.625000
vt 0.562500 0.125000
vt 0.562500 0.562500
vt 0.562500 0.062500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.578125 0.000000
vt 0.562500 0.437500
vt 0.562500 0.875000
vt 0.531250 0.187500
vt 0.531250 0.125000
vt 0.531250 0.625000
vt 0.531250 0.562500
vt 0.531250 0.062500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.546875 0.000000
vt 0.531250 0.437500
vt 0.531250 0.875000
vt 0.531250 0.375000
vt 0.531250 0.812500
vt 0.531250 0.312500
vt 0.531250 0.750000
vt 0.531250 0.250000
vt 0.531250 0.687500
vt 0.500000 0.375000
vt 0.500000 0.312500
vt 0.500000 0.750000
vt 0.500000 0.250000
vt 0.500000 0.687500
vt 0.500000 0.187500
vt 0.500000 0.625000
vt 0.500000 0.125000
vt 0.500000 0.562500
vt 0.500000 0.062500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.515625 0.000000
vt 0.500000 0.437500
vt 0.500000 0.875000
vt 0.500000 0.812500
vt 0.468750 0.062500
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.484375 0.000000
vt 0.468750 0.437500
vt 0.468750 0.875000
vt 0.468750 0.375000
vt 0.468750 0.812500
vt 0.468750 0.312500
vt 0.468750 0.750000
vt 0.468750 0.250000
vt 0.468750 0.687500
vt 0.468750 0.187500
vt 0.468750 0.625000
vt 0.468750 0.125000
vt 0.437500 0.750000
vt 0.437500 0.250000
vt 0.437500 0.687500
vt 0.437500 0.187500
vt 0.437500 0.625000
vt 0.437500 0.125000
vt 0.437500 0.562500
vt 0.437500 0.062500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.453125 0.000000
vt 0.437500 0.437500
vt 0.437500 0.875000
vt 0.437500 0.375000
vt 0.437500 0.812500
vt 0.437500 0.312500
vt 0.406250 0.562500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.421875 0.000000
vt 0.406250 0.062500
vt 0.406250 0.437500
vt 0.406250 0.875000
vt 0.406250 0.375000
vt 0.406250 0.812500
vt 0.406250 0.312500
vt 0.406250 0.750000
vt 0.406250 0.250000
vt 0.406250 0.687500
vt 0.406250 0.187500
vt 0.406250 0.625000
vt 0.406250 0.125000
vt 0.375000 0.250000
vt 0.375000 0.750000
vt 0.375000 0.687500
vt 0.375000 0.187500
vt 0.375000 0.625000
vt 0.375000 0.125000
vt 0.375000 0.562500
vt 0.375000 0.062500
vt 0.375000 0.500000
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.390625 0.000000
vt 0.375000 0.437500
vt 0.375000 0.875000
vt 0.375000 0.375000
vt 0.375000 0.812500
vt 0.375000 0.312500
vt 0.359375 1.000000
vt 0.343750 0.937500
vt 0.359375 0.000000
vt 0.343750 0.062500
vt 0.343750 0.437500
vt 0.343750 0.875000
vt 0.343750 0.375000
vt 0.343750 0.812500
vt 0.343750 0.312500
vt 0.343750 0.750000
vt 0.343750 0.250000
vt 0.343750 0.687500
vt 0.343750 0.187500
vt 0.343750 0.625000
vt 0.343750 0.125000
vt 0.343750 0.562500
vt 0.343750 0.500000
vt 0.312500 0.750000
vt 0.312500 0.687500
vt 0.312500 0.250000
vt 0.312500 0.187500
vt 0.312500 0.625000
vt 0.312500 0.125000
vt 0.312500 0.562500
vt 0.312500 0.062500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.312500 0.937500
vt 0.328125 0.000000
vt 0.312500 0.437500
vt 0.312500 0.875000
vt 0.312500 0.375000
vt 0.312500 0.812500
vt 0.312500 0.312500
vt 0.281250 0.437500
vt 0.281250 0.875000
vt 0.281250 0.375000
vt 0.281250 0.812500
vt 0.281250 0.312500
vt 0.281250 0.750000
vt 0.281250 0.250000
vt 0.281250 0.687500
vt 0.281250 0.187500
vt 0.281250 0.625000
vt 0.281250 0.125000
vt 0.281250 0.562500
vt 0.281250 0.062500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.296875 0.000000
vt 0.250000 0.250000
vt 0.250000 0.187500
vt 0.250000 0.625000
vt 0.250000 0.125000
vt 0.250000 0.562500
vt 0.250000 0.062500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.265625 0.000000
vt 0.250000 0.437500
vt 0.250000 0.875000
vt 0.250000 0.375000
vt 0.250000 0.812500
vt 0.250000 0.312500
vt 0.250000 0.750000
vt 0.250000 0.687500
vt 0.218750 0.375000
vt 0.218750 0.812500
vt 0.218750 0.312500
vt 0.218750 0.750000
vt 0.218750 0.250000
vt 0.218750 0.687500
vt 0.218750 0.187500
vt 0.218750 0.625000
vt 0.218750 0.125000
vt 0.218750 0.562500
vt 0.218750 0.062500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.234375 0.000000
vt 0.218750 0.437500
vt 0.218750 0.875000
vt 0.187500 0.125000
vt 0.187500 0.625000
vt 0.187500 0.562500
vt 0.187500 0.062500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.203125 0.000000
vt 0.187500 0.437500
vt 0.187500 0.875000
vt 0.187500 0.375000
vt 0.187500 0.812500
vt 0.187500 0.312500
vt 0.187500 0.750000
vt 0.187500 0.250000
vt 0.187500 0.687500
vt 0.187500 0.187500
vt 0.156250 0.812500
vt 0.156250 0.375000
vt 0.156250 0.312500
vt 0.156250 0.750000
vt 0.156250 0.250000
vt 0.156250 0.687500
vt 0.156250 0.187500
vt 0.156250 0.625000
vt 0.156250 0.125000
vt 0.156250 0.562500
vt 0.156250 0.062500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.171875 0.000000
vt 0.156250 0.437500
vt 0.156250 0.875000
vt 0.125000 0.625000
vt 0.125000 0.562500
vt 0.125000 0.125000
vt 0.125000 0.062500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.140625 0.000000
vt 0.125000 0.437500
vt 0.125000 0.875000
vt 0.125000 0.375000
vt 0.125000 0.812500
vt 0.125000 0.312500
vt 0.125000 0.750000
vt 0.125000 0.250000
vt 0.125000 0.687500
vt 0.125000 0.187500
vt 0.093750 0.375000
vt 0.093750 0.312500
vt 0.093750 0.750000
vt 0.093750 0.250000
vt 0.093750 0.687500
vt 0.093750 0.187500
vt 0.093750 0.625000
vt 0.093750 0.125000
vt 0.093750 0.562500
vt 0.093750 0.062500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.109375 0.000000
vt 0.093750 0.437500
vt 0.093750 0.875000
vt 0.093750 0.812500
vt 0.062500 0.062500
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.078125 0.000000
vt 0.062500 0.437500
vt 0.062500 0.875000
vt 0.062500 0.375000
vt 0.062500 0.812500
vt 0.062500 0.312500
vt 0.062500 0.750000
vt 0.062500 0.250000
vt 0.062500 0.687500
vt 0.062500 0.187500
vt 0.062500 0.625000
vt 0.062500 0.125000
vt 0.031250 0.750000
vt 0.031250 0.250000
vt 0.031250 0.687500
vt 0.031250 0.187500
vt 0.031250 0.625000
vt 0.031250 0.125000
vt 0.031250 0.562500
vt 0.031250 0.062500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.046875 0.000000
vt 0.031250 0.437500
vt 0.031250 0.875000
vt 0.031250 0.375000
vt 0.031250 0.812500
vt 0.031250 0.312500
vt 0.000000 0.562500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.015625 0.000000
vt 0.000000 0.062500
vt 0.000000 0.437500
vt 0.000000 0.875000
vt 0.000000 0.375000
vt 0.000000 0.812500
vt 0.000000 0.312500
vt 0.000000 0.750000
vt 0.000000 0.250000
vt 0.000000 0.687500
vt 0.000000 0.187500
vt 0.000000 0.625000
vt 0.000000 0.125000
vt 1.000000 0.312500
vt 0.968750 0.250000
vt 1.000000 0.250000
vt 1.000000 0.687500
vt 0.968750 0.750000
vt 0.968750 0.687500
vt 1.000000 0.187500
vt 0.968750 0.187500
vt 0.968750 0.625000
vt 1.000000 0.625000
vt 0.968750 0.125000
vt 1.000000 0.125000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 0.968750 0.062500
vt 1.000000 0.062500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.968750 0.937500
vt 0.984375 0.000000
vt 0.968750 0.437500
vt 1.000000 0.437500
vt 0.968750 0.875000
vt 1.000000 0.875000
vt 0.968750 0.375000
vt 1.000000 0.375000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 0.968750 0.312500
vt 1.000000 0.750000
vt 0.953125 0.000000
vt 0.937500 0.062500
vt 0.937500 0.437500
vt 0.937500 0.875000
vt 0.937500 0.375000
vt 0.937500 0.812500
vt 0.937500 0.312500
vt 0.937500 0.750000
vt 0.937500 0.250000
vt 0.937500 0.687500
vt 0.937500 0.187500
vt 0.937500 0.625000
vt 0.937500 0.125000
vt 0.937500 0.562500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.906250 0.250000
vt 0.906250 0.187500
vt 0.906250 0.625000
vt 0.906250 0.125000
vt 0.906250 0.562500
vt 0.906250 0.062500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.906250 0.937500
vt 0.921875 0.000000
vt 0.906250 0.437500
vt 0.906250 0.875000
vt 0.906250 0.375000
vt 0.906250 0.812500
vt 0.906250 0.312500
vt 0.906250 0.750000
vt 0.906250 0.687500
vt 0.875000 0.875000
vt 0.875000 0.375000
vt 0.875000 0.812500
vt 0.875000 0.312500
vt 0.875000 0.750000
vt 0.875000 0.250000
vt 0.875000 0.687500
vt 0.875000 0.187500
vt 0.875000 0.625000
vt 0.875000 0.125000
vt 0.875000 0.562500
vt 0.875000 0.062500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.890625 0.000000
vt 0.875000 0.437500
vt 0.843750 0.625000
vt 0.843750 0.125000
vt 0.843750 0.562500
vt 0.843750 0.062500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.859375 0.000000
vt 0.843750 0.437500
vt 0.843750 0.875000
vt 0.843750 0.375000
vt 0.843750 0.812500
vt 0.843750 0.312500
vt 0.843750 0.750000
vt 0.843750 0.250000
vt 0.843750 0.687500
vt 0.843750 0.187500
vt 0.812500 0.375000
vt 0.812500 0.812500
vt 0.812500 0.312500
vt 0.812500 0.750000
vt 0.812500 0.250000
vt 0.812500 0.687500
vt 0.812500 0.187500
vt 0.812500 0.625000
vt 0.812500 0.125000
vt 0.812500 0.562500
vt 0.812500 0.062500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.828125 0.000000
vt 0.812500 0.437500
vt 0.812500 0.875000
vt 0.781250 0.125000
vt 0.781250 0.625000
vt 0.781250 0.562500
vt 0.781250 0.062500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.796875 0.000000
vt 0.781250 0.437500
vt 0.781250 0.875000
vt 0.781250 0.375000
vt 0.781250 0.812500
vt 0.781250 0.312500
vt 0.781250 0.750000
vt 0.781250 0.250000
vt 0.781250 0.687500
vt 0.781250 0.187500
vt 0.765625 1.000000
vt 0.765625 0.000000
s 0
f 487/560/518 498/561/518 960/562/518
f 483/563/519 490/564/519 491/565/519
f 960/562/520 499/566/520 488/567/520
f 958/568/521 491/565/521 492/569/521
f 488/567/522 500/570/522 961/571/522
f 484/572/523 492/569/523 493/573/523
f 961/571/524 501/574/524 962/575/524
f 959/576/525 493/573/525 494/577/525
f 962/575/526 502/578/526 963/579/526
f 485/580/527 494/577/527 495/581/527
f 963/579/528 503/582/528 964/583/528
f 485/580/529 496/584/529 486/585/529
f 956/586/530 564/587/530 489/588/530
f 790/589/531 964/583/531 503/582/531
f 486/585/532 497/590/532 487/560/532
f 957/591/533 489/588/533 490/564/533
f 496/584/534 512/592/534 497/590/534
f 489/588/535 505/593/535 490/564/535
f 497/590/536 513/594/536 498/561/536
f 490/564/537 506/595/537 491/565/537
f 498/561/538 514/596/538 499/566/538
f 491/565/539 507/597/539 492/569/539
f 499/566/540 515/598/540 500/570/540
f 492/569/541 508/599/541 493/573/541
f 501/574/542 515/598/542 516/600/542
f 493/573/543 509/601/543 494/577/543
f 502/578/544 516/600/544 517/602/544
f 495/581/545 509/601/545 510/603/545
f 502/578/546 518/604/546 503/582/546
f 495/581/547 511/605/547 496/584/547
f 489/588/548 564/606/548 504/607/548
f 790/608/549 503/582/549 518/604/549
f 516/600/550 530/609/550 531/610/550
f 508/599/551 524/611/551 509/601/551
f 516/600/552 532/612/552 517/602/552
f 509/601/553 525/613/553 510/603/553
f 517/602/554 533/614/554 518/604/554
f 510/603/555 526/615/555 511/605/555
f 504/607/556 564/616/556 519/617/556
f 790/618/557 518/604/557 533/614/557
f 512/592/558 526/615/558 527/619/558
f 504/607/559 520/620/559 505/593/559
f 512/592/560 528/621/560 513/594/560
f 505/593/561 521/622/561 506/595/561
f 514/596/562 528/621/562 529/623/562
f 506/595/563 522/624/563 507/597/563
f 515/598/564 529/623/564 530/609/564
f 507/597/565 523/625/565 508/599/565
f 519/617/566 535/626/566 520/620/566
f 527/619/567 543/627/567 528/621/567
f 520/620/568 536/628/568 521/622/568
f 529/623/569 543/627/569 544/629/569
f 521/622/570 537/630/570 522/624/570
f 529/623/571 545/631/571 530/609/571
f 522/624/572 538/632/572 523/625/572
f 530/609/573 546/633/573 531/610/573
f 523/625/574 539/634/574 524/611/574
f 531/610/575 547/635/575 532/612/575
f 525/613/576 539/634/576 540/636/576
f 532/612/577 548/637/577 533/614/577
f 526/615/578 540/636/578 541/638/578
f 519/617/579 564/639/579 534/640/579
f 790/641/580 533/614/580 548/637/580
f 526/615/581 542/642/581 527/619/581
f 538/632/582 554/643/582 539/634/582
f 547/635/583 561/644/583 562/645/583
f 540/636/584 554/643/584 555/646/584
f 547/635/585 563/647/585 548/637/585
f 541/638/586 555/646/586 556/648/586
f 534/640/587 564/649/587 549/650/587
f 790/651/588 548/637/588 563/647/588
f 541/638/589 557/652/589 542/642/589
f 534/640/590 550/653/590 535/626/590
f 542/642/591 558/654/591 543/627/591
f 535/626/592 551/655/592 536/628/592
f 544/629/593 558/654/593 559/656/593
f 536/628/594 552/657/594 537/630/594
f 544/629/595 560/658/595 545/631/595
f 538/632/596 552/657/596 553/659/596
f 545/631/597 561/644/597 546/633/597
f 557/652/598 574/660/598 558/654/598
f 550/653/599 567/661/599 551/655/599
f 559/656/600 574/660/600 575/662/600
f 551/655/601 568/663/601 552/657/601
f 559/656/602 576/664/602 560/658/602
f 553/659/603 568/663/603 569/665/603
f 560/658/604 577/666/604 561/644/604
f 553/659/605 570/667/605 554/643/605
f 561/644/606 578/668/606 562/645/606
f 555/646/607 570/667/607 571/669/607
f 563/647/608 578/668/608 579/670/608
f 556/648/609 571/669/609 572/671/609
f 549/650/610 564/672/610 565/673/610
f 790/674/611 563/647/611 579/670/611
f 556/648/612 573/675/612 557/652/612
f 549/650/613 566/676/613 550/653/613
f 578/668/614 592/677/614 593/678/614
f 571/669/615 585/679/615 586/680/615
f 578/668/616 594/681/616 579/670/616
f 572/671/617 586/680/617 587/682/617
f 565/673/618 564/683/618 580/684/618
f 790/685/619 579/670/619 594/681/619
f 572/671/620 588/686/620 573/675/620
f 566/676/621 580/684/621 581/687/621
f 573/675/622 589/688/622 574/660/622
f 566/676/623 582/689/623 567/661/623
f 575/662/624 589/688/624 590/690/624
f 567/661/625 583/691/625 568/663/625
f 575/662/626 591/692/626 576/664/626
f 569/665/627 583/691/627 584/693/627
f 577/666/628 591/692/628 592/677/628
f 569/665/629 585/679/629 570/667/629
f 590/690/630 604/694/630 605/695/630
f 582/689/631 598/696/631 583/691/631
f 590/690/632 606/697/632 591/692/632
f 584/693/633 598/696/633 599/698/633
f 592/677/634 606/697/634 607/699/634
f 584/693/635 600/700/635 585/679/635
f 592/677/636 608/701/636 593/678/636
f 586/680/637 600/700/637 601/702/637
f 593/678/638 609/703/638 594/681/638
f 587/682/639 601/702/639 602/704/639
f 580/684/640 564/705/640 595/706/640
f 790/707/641 594/681/641 609/703/641
f 587/682/642 603/708/642 588/686/642
f 581/687/643 595/706/643 596/709/643
f 588/686/644 604/694/644 589/688/644
f 581/687/645 597/710/645 582/689/645
f 608/701/646 624/711/646 609/703/646
f 602/704/647 616/712/647 617/713/647
f 595/706/648 564/714/648 610/715/648
f 790/716/649 609/703/649 624/711/649
f 602/704/650 618/717/650 603/708/650
f 595/706/651 611/718/651 596/709/651
f 603/708/652 619/719/652 604/694/652
f 596/709/653 612/720/653 597/710/653
f 605/695/654 619/719/654 620/721/654
f 597/710/655 613/722/655 598/696/655
f 605/695/656 621/723/656 606/697/656
f 599/698/657 613/722/657 614/724/657
f 607/699/658 621/723/658 622/725/658
f 599/698/659 615/726/659 600/700/659
f 607/699/660 623/727/660 608/701/660
f 601/702/661 615/726/661 616/712/661
f 612/720/662 628/728/662 613/722/662
f 620/721/663 636/729/663 621/723/663
f 614/724/664 628/728/664 629/730/664
f 622/725/665 636/729/665 637/731/665
f 614/724/666 630/732/666 615/726/666
f 622/725/667 638/733/667 623/727/667
f 616/712/668 630/732/668 631/734/668
f 624/711/669 638/733/669 639/735/669
f 617/713/670 631/734/670 632/736/670
f 610/715/671 564/737/671 625/738/671
f 790/739/672 624/711/672 639/735/672
f 617/713/673 633/740/673 618/717/673
f 610/715/674 626/741/674 611/718/674
f 618/717/675 634/742/675 619/719/675
f 612/720/676 626/741/676 627/743/676
f 620/721/677 634/742/677 635/744/677
f 632/736/678 646/745/678 647/746/678
f 625/738/679 564/747/679 640/748/679
f 790/749/680 639/735/680 654/750/680
f 632/736/681 648/751/681 633/740/681
f 625/738/682 641/752/682 626/741/682
f 633/740/683 649/753/683 634/742/683
f 627/743/684 641/752/684 642/754/684
f 635/744/685 649/753/685 650/755/685
f 627/743/686 643/756/686 628/728/686
f 635/744/687 651/757/687 636/729/687
f 629/730/688 643/756/688 644/758/688
f 637/731/689 651/757/689 652/759/689
f 629/730/690 645/760/690 630/732/690
f 637/731/691 653/761/691 638/733/691
f 631/734/692 645/760/692 646/745/692
f 638/733/693 654/750/693 639/735/693
f 650/755/694 666/762/694 651/757/694
f 644/758/695 658/763/695 659/764/695
f 651/757/696 667/765/696 652/759/696
f 644/758/697 660/766/697 645/760/697
f 653/761/698 667/765/698 668/767/698
f 646/745/699 660/766/699 661/768/699
f 654/750/700 668/767/700 669/769/700
f 647/746/701 661/768/701 662/770/701
f 640/748/702 564/771/702 655/772/702
f 790/773/703 654/750/703 669/769/703
f 647/746/704 663/774/704 648/751/704
f 640/748/705 656/775/705 641/752/705
f 648/751/706 664/776/706 649/753/706
f 641/752/707 657/777/707 642/754/707
f 650/755/708 664/776/708 665/778/708
f 642/754/709 658/763/709 643/756/709
f 655/772/710 564/779/710 670/780/710
f 790/781/711 669/769/711 684/782/711
f 662/770/712 678/783/712 663/774/712
f 656/775/713 670/780/713 671/784/713
f 663/774/714 679/785/714 664/776/714
f 657/777/715 671/784/715 672/786/715
f 665/778/716 679/785/716 680/787/716
f 657/777/717 673/788/717 658/763/717
f 665/778/718 681/789/718 666/762/718
f 659/764/719 673/788/719 674/790/719
f 667/765/720 681/789/720 682/791/720
f 659/764/721 675/792/721 660/766/721
f 667/765/722 683/793/722 668/767/722
f 661/768/723 675/792/723 676/794/723
f 668/767/724 684/782/724 669/769/724
f 662/770/725 676/794/725 677/795/725
f 674/790/726 688/796/726 689/797/726
f 682/791/727 696/798/727 697/799/727
f 674/790/728 690/800/728 675/792/728
f 682/791/729 698/801/729 683/793/729
f 676/794/730 690/800/730 691/802/730
f 683/793/731 699/803/731 684/782/731
f 677/795/732 691/802/732 692/804/732
f 670/780/733 564/805/733 685/806/733
f 790/807/734 684/782/734 699/803/734
f 677/795/735 693/808/735 678/783/735
f 670/780/736 686/809/736 671/784/736
f 678/783/737 694/810/737 679/785/737
f 671/784/738 687/811/738 672/786/738
f 680/787/739 694/810/739 695/812/739
f 672/786/740 688/796/740 673/788/740
f 680/787/741 696/798/741 681/789/741
f 692/804/742 708/813/742 693/808/742
f 685/806/743 701/814/743 686/809/743
f 693/808/744 709/815/744 694/810/744
f 686/809/745 702/816/745 687/811/745
f 695/812/746 709/815/746 710/817/746
f 687/811/747 703/818/747 688/796/747
f 695/812/748 711/819/748 696/798/748
f 689/797/749 703/818/749 704/820/749
f 697/799/750 711/819/750 712/821/750
f 689/797/751 705/822/751 690/800/751
f 697/799/752 713/823/752 698/801/752
f 691/802/753 705/822/753 706/824/753
f 698/801/754 714/825/754 699/803/754
f 692/804/755 706/824/755 707/826/755
f 685/806/756 564/827/756 700/828/756
f 790/829/757 699/803/757 714/825/757
f 712/821/758 726/830/758 727/831/758
f 704/820/759 720/832/759 705/822/759
f 712/821/760 728/833/760 713/823/760
f 706/824/761 720/832/761 721/834/761
f 714/825/762 728/833/762 729/835/762
f 707/826/763 721/834/763 722/836/763
f 700/828/764 564/837/764 715/838/764
f 790/839/765 714/825/765 729/835/765
f 707/826/766 723/840/766 708/813/766
f 700/828/767 716/841/767 701/814/767
f 708/813/768 724/842/768 709/815/768
f 702/816/769 716/841/769 717/843/769
f 710/817/770 724/842/770 725/844/770
f 702/816/771 718/845/771 703/818/771
f 710/817/772 726/830/772 711/819/772
f 704/820/773 718/845/773 719/846/773
f 723/840/774 739/847/774 724/842/774
f 716/841/775 732/848/775 717/843/775
f 725/844/776 739/847/776 740/849/776
f 717/843/777 733/850/777 718/845/777
f 725/844/778 741/851/778 726/830/778
f 719/846/779 733/850/779 734/852/779
f 727/831/780 741/851/780 742/853/780
f 719/846/781 735/854/781 720/832/781
f 727/831/782 743/855/782 728/833/782
f 721/834/783 735/854/783 736/856/783
f 728/833/784 744/857/784 729/835/784
f 722/836/785 736/856/785 737/858/785
f 715/838/786 564/859/786 730/860/786
f 790/861/787 729/835/787 744/857/787
f 722/836/788 738/862/788 723/840/788
f 715/838/789 731/863/789 716/841/789
f 742/853/790 758/864/790 743/855/790
f 736/856/791 750/865/791 751/866/791
f 743/855/792 759/867/792 744/857/792
f 737/858/793 751/866/793 752/868/793
f 730/860/794 564/869/794 745/870/794
f 790/871/795 744/857/795 759/867/795
f 737/858/796 753/872/796 738/862/796
f 731/863/797 745/870/797 746/873/797
f 738/862/798 754/874/798 739/847/798
f 731/863/799 747/875/799 732/848/799
f 740/849/800 754/874/800 755/876/800
f 732/848/801 748/877/801 733/850/801
f 740/849/802 756/878/802 741/851/802
f 734/852/803 748/877/803 749/879/803
f 742/853/804 756/878/804 757/880/804
f 734/852/805 750/865/805 735/854/805
f 746/873/806 762/881/806 747/875/806
f 755/876/807 769/882/807 770/883/807
f 747/875/808 763/884/808 748/877/808
f 755/876/809 771/885/809 756/878/809
f 749/879/810 763/884/810 764/886/810
f 757/880/811 771/885/811 772/887/811
f 749/879/812 765/888/812 750/865/812
f 757/880/813 773/889/813 758/864/813
f 751/866/814 765/888/814 766/890/814
f 759/867/815 773/889/815 774/891/815
f 752/868/816 766/890/816 767/892/816
f 745/870/817 564/893/817 760/894/817
f 790/895/818 759/867/818 774/891/818
f 752/868/819 768/896/819 753/872/819
f 746/873/820 760/894/820 761/897/820
f 753/872/821 769/882/821 754/874/821
f 766/890/822 780/898/822 781/899/822
f 774/891/823 788/900/823 789/901/823
f 767/892/824 781/899/824 782/902/824
f 760/894/825 564/903/825 775/904/825
f 790/905/826 774/891/826 789/901/826
f 767/892/827 783/906/827 768/896/827
f 760/894/828 776/907/828 761/897/828
f 768/896/829 784/908/829 769/882/829
f 761/897/830 777/909/830 762/881/830
f 770/883/831 784/908/831 785/910/831
f 762/881/832 778/911/832 763/884/832
f 770/883/833 786/912/833 771/885/833
f 764/886/834 778/911/834 779/913/834
f 772/887/835 786/912/835 787/914/835
f 764/886/836 780/898/836 765/888/836
f 772/887/837 788/900/837 773/889/837
f 785/910/838 800/915/838 801/916/838
f 777/909/839 794/917/839 778/911/839
f 785/910/840 802/918/840 786/912/840
f 779/913/841 794/917/841 795/919/841
f 787/914/842 802/918/842 803/920/842
f 779/913/843 796/921/843 780/898/843
f 787/914/844 804/922/844 788/900/844
f 781/899/845 796/921/845 797/923/845
f 788/900/846 805/924/846 789/901/846
f 782/902/847 797/923/847 798/925/847
f 775/904/848 564/926/848 791/927/848
f 790/928/849 789/901/849 805/924/849
f 782/902/850 799/929/850 783/906/850
f 775/904/851 792/930/851 776/907/851
f 783/906/852 800/915/852 784/908/852
f 776/907/853 793/931/853 777/909/853
f 804/922/854 820/932/854 805/924/854
f 798/925/855 812/933/855 813/934/855
f 791/927/856 564/935/856 806/936/856
f 790/937/857 805/924/857 820/932/857
f 798/925/858 814/938/858 799/929/858
f 791/927/859 807/939/859 792/930/859
f 799/929/860 815/940/860 800/915/860
f 792/930/861 808/941/861 793/931/861
f 801/916/862 815/940/862 816/942/862
f 793/931/863 809/943/863 794/917/863
f 801/916/864 817/944/864 802/918/864
f 794/917/865 810/945/865 795/919/865
f 803/920/866 817/944/866 818/946/866
f 795/919/867 811/947/867 796/921/867
f 803/920/868 819/948/868 804/922/868
f 797/923/869 811/947/869 812/933/869
f 808/941/870 824/949/870 809/943/870
f 816/942/871 832/950/871 817/944/871
f 810/945/872 824/949/872 825/951/872
f 818/946/873 832/950/873 833/952/873
f 810/945/874 826/953/874 811/947/874
f 818/946/875 834/954/875 819/948/875
f 812/933/876 826/953/876 827/955/876
f 819/948/877 835/956/877 820/932/877
f 813/934/878 827/955/878 828/957/878
f 806/936/879 564/958/879 821/959/879
f 790/960/880 820/932/880 835/956/880
f 813/934/881 829/961/881 814/938/881
f 806/936/882 822/962/882 807/939/882
f 814/938/883 830/963/883 815/940/883
f 807/939/884 823/964/884 808/941/884
f 816/942/885 830/963/885 831/965/885
f 828/957/886 842/966/886 843/967/886
f 821/959/887 564/968/887 836/969/887
f 790/970/888 835/956/888 850/971/888
f 828/957/889 844/972/889 829/961/889
f 821/959/890 837/973/890 822/962/890
f 829/961/891 845/974/891 830/963/891
f 823/964/892 837/973/892 838/975/892
f 831/965/893 845/974/893 846/976/893
f 823/964/894 839/977/894 824/949/894
f 831/965/895 847/978/895 832/950/895
f 825/951/896 839/977/896 840/979/896
f 833/952/897 847/978/897 848/980/897
f 825/951/898 841/981/898 826/953/898
f 833/952/899 849/982/899 834/954/899
f 827/955/900 841/981/900 842/966/900
f 834/954/901 850/971/901 835/956/901
f 846/983/902 862/984/902 847/985/902
f 840/986/903 854/987/903 855/988/903
f 848/989/904 862/984/904 863/990/904
f 840/986/905 856/991/905 841/992/905
f 848/989/906 864/993/906 849/994/906
f 842/995/907 856/991/907 857/996/907
f 849/994/908 865/997/908 850/998/908
f 843/999/909 857/996/909 858/1000/909
f 836/1001/910 564/1002/910 851/1003/910
f 790/1004/911 850/998/911 865/997/911
f 843/999/912 859/1005/912 844/1006/912
f 836/1001/913 852/1007/913 837/1008/913
f 844/1006/914 860/1009/914 845/1010/914
f 838/1011/915 852/1007/915 853/1012/915
f 846/983/916 860/1009/916 861/1013/916
f 838/1011/917 854/987/917 839/1014/917
f 790/1015/918 865/997/918 880/1016/918
f 858/1000/919 874/1017/919 859/1005/919
f 851/1003/920 867/1018/920 852/1007/920
f 859/1005/921 875/1019/921 860/1009/921
f 853/1012/922 867/1018/922 868/1020/922
f 861/1013/923 875/1019/923 876/1021/923
f 853/1012/924 869/1022/924 854/987/924
f 861/1013/925 877/1023/925 862/984/925
f 855/988/926 869/1022/926 870/1024/926
f 863/990/927 877/1023/927 878/1025/927
f 855/988/928 871/1026/928 856/991/928
f 863/990/929 879/1027/929 864/993/929
f 857/996/930 871/1026/930 872/1028/930
f 864/993/931 880/1016/931 865/997/931
f 858/1000/932 872/1028/932 873/1029/932
f 851/1003/933 564/1030/933 866/1031/933
f 878/1025/934 892/1032/934 893/1033/934
f 870/1024/935 886/1034/935 871/1026/935
f 879/1027/936 893/1033/936 894/1035/936
f 872/1028/937 886/1034/937 887/1036/937
f 879/1027/938 895/1037/938 880/1016/938
f 873/1029/939 887/1036/939 888/1038/939
f 866/1031/940 564/1039/940 881/1040/940
f 790/1041/941 880/1016/941 895/1037/941
f 873/1029/942 889/1042/942 874/1017/942
f 867/1018/943 881/1040/943 882/1043/943
f 874/1017/944 890/1044/944 875/1019/944
f 867/1018/945 883/1045/945 868/1020/945
f 876/1021/946 890/1044/946 891/1046/946
f 868/1020/947 884/1047/947 869/1022/947
f 876/1021/948 892/1032/948 877/1023/948
f 870/1024/949 884/1047/949 885/1048/949
f 881/1040/950 897/1049/950 882/1043/950
f 889/1042/951 905/1050/951 890/1044/951
f 883/1045/952 897/1049/952 898/1051/952
f 891/1046/953 905/1050/953 906/1052/953
f 883/1045/954 899/1053/954 884/1047/954
f 891/1046/955 907/1054/955 892/1032/955
f 885/1048/956 899/1053/956 900/1055/956
f 893/1033/957 907/1054/957 908/1056/957
f 885/1048/958 901/1057/958 886/1034/958
f 893/1033/959 909/1058/959 894/1035/959
f 887/1036/960 901/1057/960 902/1059/960
f 894/1035/961 910/1060/961 895/1037/961
f 888/1038/962 902/1059/962 903/1061/962
f 881/1040/963 564/1062/963 896/1063/963
f 790/1064/964 895/1037/964 910/1060/964
f 888/1038/965 904/1065/965 889/1042/965
f 900/1055/966 916/1066/966 901/1057/966
f 908/1056/967 924/1067/967 909/1058/967
f 902/1059/968 916/1066/968 917/1068/968
f 909/1058/969 925/1069/969 910/1060/969
f 903/1061/970 917/1068/970 918/1070/970
f 896/1063/971 564/1071/971 911/1072/971
f 790/1073/972 910/1060/972 925/1069/972
f 903/1061/973 919/1074/973 904/1065/973
f 896/1063/974 912/1075/974 897/1049/974
f 904/1065/975 920/1076/975 905/1050/975
f 898/1051/976 912/1075/976 913/1077/976
f 906/1052/977 920/1076/977 921/1078/977
f 898/1051/978 914/1079/978 899/1053/978
f 906/1052/979 922/1080/979 907/1054/979
f 900/1055/980 914/1079/980 915/1081/980
f 908/1056/981 922/1080/981 923/1082/981
f 919/1074/982 935/1083/982 920/1076/982
f 912/1075/983 928/1084/983 913/1077/983
f 921/1078/984 935/1083/984 936/1085/984
f 913/1077/985 929/1086/985 914/1079/985
f 921/1078/986 937/1087/986 922/1080/986
f 915/1081/987 929/1086/987 930/1088/987
f 923/1082/988 937/1087/988 938/1089/988
f 915/1081/989 931/1090/989 916/1066/989
f 923/1082/990 939/1091/990 924/1067/990
f 917/1068/991 931/1090/991 932/1092/991
f 924/1067/992 940/1093/992 925/1069/992
f 918/1070/993 932/1092/993 933/1094/993
f 911/1072/994 564/1095/994 926/1096/994
f 790/1097/995 925/1069/995 940/1093/995
f 918/1070/996 934/1098/996 919/1074/996
f 912/1075/997 926/1096/997 927/1099/997
f 938/1089/998 954/1100/998 939/1091/998
f 932/1092/999 946/1101/999 947/1102/999
f 940/1093/1000 954/1100/1000 955/1103/1000
f 933/1094/1001 947/1102/1001 948/1104/1001
f 926/1096/1002 564/1105/1002 941/1106/1002
f 790/1107/1003 940/1093/1003 955/1103/1003
f 933/1094/1004 949/1108/1004 934/1098/1004
f 926/1096/1005 942/1109/1005 927/1099/1005
f 934/1098/1006 950/1110/1006 935/1083/1006
f 928/1084/1007 942/1109/1007 943/1111/1007
f 936/1085/1008 950/1110/1008 951/1112/1008
f 928/1084/1009 944/1113/1009 929/1086/1009
f 936/1085/1010 952/1114/1010 937/1087/1010
f 930/1088/1011 944/1113/1011 945/1115/1011
f 938/1089/1012 952/1114/1012 953/1116/1012
f 930/1088/1013 946/1101/1013 931/1090/1013
f 943/1111/1014 957/591/1014 483/563/1014
f 950/1110/1015 488/567/1015 951/1112/1015
f 944/1113/1016 483/563/1016 958/568/1016
f 951/1112/1017 961/571/1017 952/1114/1017
f 945/1115/1018 958/568/1018 484/572/1018
f 953/1116/1019 961/571/1019 962/575/1019
f 946/1101/1020 484/572/1020 959/576/1020
f 953/1116/1021 963/579/1021 954/1100/1021
f 947/1102/1022 959/576/1022 485/580/1022
f 954/1100/1023 964/583/1023 955/1103/1023
f 948/1104/1024 485/580/1024 486/585/1024
f 941/1106/1025 564/1117/1025 956/586/1025
f 790/1118/1026 955/1103/1026 964/583/1026
f 948/1104/1027 487/560/1027 949/1108/1027
f 942/1109/1028 956/586/1028 957/591/1028
f 949/1108/1029 960/562/1029 950/1110/1029
f 487/560/518 497/590/518 498/561/518
f 483/563/519 957/591/519 490/564/519
f 960/562/520 498/561/520 499/566/520
f 958/568/521 483/563/521 491/565/521
f 488/567/522 499/566/522 500/570/522
f 484/572/523 958/568/523 492/569/523
f 961/571/524 500/570/524 501/574/524
f 959/576/525 484/572/525 493/573/525
f 962/575/526 501/574/526 502/578/526
f 485/580/527 959/576/527 494/577/527
f 963/579/528 502/578/528 503/582/528
f 485/580/529 495/581/529 496/584/529
f 486/585/532 496/584/532 497/590/532
f 957/591/533 956/586/533 489/588/533
f 496/584/534 511/605/534 512/592/534
f 489/588/535 504/607/535 505/593/535
f 497/590/536 512/592/536 513/594/536
f 490/564/537 505/593/537 506/595/537
f 498/561/538 513/594/538 514/596/538
f 491/565/539 506/595/539 507/597/539
f 499/566/540 514/596/540 515/598/540
f 492/569/541 507/597/541 508/599/541
f 501/574/542 500/570/542 515/598/542
f 493/573/543 508/599/543 509/601/543
f 502/578/544 501/574/544 516/600/544
f 495/581/545 494/577/545 509/601/545
f 502/578/546 517/602/546 518/604/546
f 495/581/547 510/603/547 511/605/547
f 516/600/550 515/598/550 530/609/550
f 508/599/551 523/625/551 524/611/551
f 516/600/552 531/610/552 532/612/552
f 509/601/553 524/611/553 525/613/553
f 517/602/554 532/612/554 533/614/554
f 510/603/555 525/613/555 526/615/555
f 512/592/558 511/605/558 526/615/558
f 504/607/559 519/617/559 520/620/559
f 512/592/560 527/619/560 528/621/560
f 505/593/561 520/620/561 521/622/561
f 514/596/562 513/594/562 528/621/562
f 506/595/563 521/622/563 522/624/563
f 515/598/564 514/596/564 529/623/564
f 507/597/565 522/624/565 523/625/565
f 519/617/566 534/640/566 535/626/566
f 527/619/567 542/642/567 543/627/567
f 520/620/568 535/626/568 536/628/568
f 529/623/569 528/621/569 543/627/569
f 521/622/570 536/628/570 537/630/570
f 529/623/571 544/629/571 545/631/571
f 522/624/572 537/630/572 538/632/572
f 530/609/573 545/631/573 546/633/573
f 523/625/574 538/632/574 539/634/574
f 531/610/575 546/633/575 547/635/575
f 525/613/576 524/611/576 539/634/576
f 532/612/577 547/635/577 548/637/577
f 526/615/578 525/613/578 540/636/578
f 526/615/581 541/638/581 542/642/581
f 538/632/582 553/659/582 554/643/582
f 547/635/583 546/633/583 561/644/583
f 540/636/584 539/634/584 554/643/584
f 547/635/1030 562/645/1030 563/647/1030
f 541/638/586 540/636/586 555/646/586
f 541/638/589 556/648/589 557/652/589
f 534/640/1031 549/650/1031 550/653/1031
f 542/642/591 557/652/591 558/654/591
f 535/626/592 550/653/592 551/655/592
f 544/629/593 543/627/593 558/654/593
f 536/628/594 551/655/594 552/657/594
f 544/629/595 559/656/595 560/658/595
f 538/632/596 537/630/596 552/657/596
f 545/631/597 560/658/597 561/644/597
f 557/652/598 573/675/598 574/660/598
f 550/653/599 566/676/599 567/661/599
f 559/656/600 558/654/600 574/660/600
f 551/655/601 567/661/601 568/663/601
f 559/656/602 575/662/602 576/664/602
f 553/659/603 552/657/603 568/663/603
f 560/658/604 576/664/604 577/666/604
f 553/659/605 569/665/605 570/667/605
f 561/644/606 577/666/606 578/668/606
f 555/646/607 554/643/607 570/667/607
f 563/647/608 562/645/608 578/668/608
f 556/648/609 555/646/609 571/669/609
f 556/648/612 572/671/612 573/675/612
f 549/650/613 565/673/613 566/676/613
f 578/668/614 577/666/614 592/677/614
f 571/669/615 570/667/615 585/679/615
f 578/668/616 593/678/616 594/681/616
f 572/671/617 571/669/617 586/680/617
f 572/671/620 587/682/620 588/686/620
f 566/676/621 565/673/621 580/684/621
f 573/675/622 588/686/622 589/688/622
f 566/676/623 581/687/623 582/689/623
f 575/662/624 574/660/624 589/688/624
f 567/661/625 582/689/625 583/691/625
f 575/662/626 590/690/626 591/692/626
f 569/665/627 568/663/627 583/691/627
f 577/666/628 576/664/628 591/692/628
f 569/665/629 584/693/629 585/679/629
f 590/690/630 589/688/630 604/694/630
f 582/689/631 597/710/631 598/696/631
f 590/690/632 605/695/632 606/697/632
f 584/693/633 583/691/633 598/696/633
f 592/677/634 591/692/634 606/697/634
f 584/693/635 599/698/635 600/700/635
f 592/677/636 607/699/636 608/701/636
f 586/680/637 585/679/637 600/700/637
f 593/678/638 608/701/638 609/703/638
f 587/682/639 586/680/639 601/702/639
f 587/682/642 602/704/642 603/708/642
f 581/687/643 580/684/643 595/706/643
f 588/686/644 603/708/644 604/694/644
f 581/687/645 596/709/645 597/710/645
f 608/701/646 623/727/646 624/711/646
f 602/704/647 601/702/647 616/712/647
f 602/704/650 617/713/650 618/717/650
f 595/706/651 610/715/651 611/718/651
f 603/708/652 618/717/652 619/719/652
f 596/709/653 611/718/653 612/720/653
f 605/695/654 604/694/654 619/719/654
f 597/710/655 612/720/655 613/722/655
f 605/695/656 620/721/656 621/723/656
f 599/698/657 598/696/657 613/722/657
f 607/699/658 606/697/658 621/723/658
f 599/698/659 614/724/659 615/726/659
f 607/699/660 622/725/660 623/727/660
f 601/702/661 600/700/661 615/726/661
f 612/720/1032 627/743/1032 628/728/1032
f 620/721/663 635/744/663 636/729/663
f 614/724/664 613/722/664 628/728/664
f 622/725/1033 621/723/1033 636/729/1033
f 614/724/666 629/730/666 630/732/666
f 622/725/667 637/731/667 638/733/667
f 616/712/668 615/726/668 630/732/668
f 624/711/669 623/727/669 638/733/669
f 617/713/670 616/712/670 631/734/670
f 617/713/673 632/736/673 633/740/673
f 610/715/674 625/738/674 626/741/674
f 618/717/675 633/740/675 634/742/675
f 612/720/676 611/718/676 626/741/676
f 620/721/677 619/719/677 634/742/677
f 632/736/678 631/734/678 646/745/678
f 632/736/681 647/746/681 648/751/681
f 625/738/682 640/748/682 641/752/682
f 633/740/683 648/751/683 649/753/683
f 627/743/684 626/741/684 641/752/684
f 635/744/685 634/742/685 649/753/685
f 627/743/686 642/754/686 643/756/686
f 635/744/687 650/755/687 651/757/687
f 629/730/688 628/728/688 643/756/688
f 637/731/689 636/729/689 651/757/689
f 629/730/690 644/758/690 645/760/690
f 637/731/691 652/759/691 653/761/691
f 631/734/692 630/732/692 645/760/692
f 638/733/693 653/761/693 654/750/693
f 650/755/694 665/778/694 666/762/694
f 644/758/695 643/756/695 658/763/695
f 651/757/696 666/762/696 667/765/696
f 644/758/697 659/764/697 660/766/697
f 653/761/698 652/759/698 667/765/698
f 646/745/699 645/760/699 660/766/699
f 654/750/700 653/761/700 668/767/700
f 647/746/701 646/745/701 661/768/701
f 647/746/704 662/770/704 663/774/704
f 640/748/705 655/772/705 656/775/705
f 648/751/706 663/774/706 664/776/706
f 641/752/707 656/775/707 657/777/707
f 650/755/708 649/753/708 664/776/708
f 642/754/709 657/777/709 658/763/709
f 662/770/712 677/795/712 678/783/712
f 656/775/713 655/772/713 670/780/713
f 663/774/714 678/783/714 679/785/714
f 657/777/715 656/775/715 671/784/715
f 665/778/716 664/776/716 679/785/716
f 657/777/717 672/786/717 673/788/717
f 665/778/718 680/787/718 681/789/718
f 659/764/719 658/763/719 673/788/719
f 667/765/720 666/762/720 681/789/720
f 659/764/721 674/790/721 675/792/721
f 667/765/722 682/791/722 683/793/722
f 661/768/723 660/766/723 675/792/723
f 668/767/724 683/793/724 684/782/724
f 662/770/725 661/768/725 676/794/725
f 674/790/726 673/788/726 688/796/726
f 682/791/727 681/789/727 696/798/727
f 674/790/728 689/797/728 690/800/728
f 682/791/729 697/799/729 698/801/729
f 676/794/730 675/792/730 690/800/730
f 683/793/731 698/801/731 699/803/731
f 677/795/732 676/794/732 691/802/732
f 677/795/735 692/804/735 693/808/735
f 670/780/736 685/806/736 686/809/736
f 678/783/737 693/808/737 694/810/737
f 671/784/738 686/809/738 687/811/738
f 680/787/739 679/785/739 694/810/739
f 672/786/740 687/811/740 688/796/740
f 680/787/741 695/812/741 696/798/741
f 692/804/742 707/826/742 708/813/742
f 685/806/743 700/828/743 701/814/743
f 693/808/744 708/813/744 709/815/744
f 686/809/745 701/814/745 702/816/745
f 695/812/746 694/810/746 709/815/746
f 687/811/747 702/816/747 703/818/747
f 695/812/748 710/817/748 711/819/748
f 689/797/749 688/796/749 703/818/749
f 697/799/750 696/798/750 711/819/750
f 689/797/751 704/820/751 705/822/751
f 697/799/752 712/821/752 713/823/752
f 691/802/753 690/800/753 705/822/753
f 698/801/754 713/823/754 714/825/754
f 692/804/755 691/802/755 706/824/755
f 712/821/758 711/819/758 726/830/758
f 704/820/759 719/846/759 720/832/759
f 712/821/760 727/831/760 728/833/760
f 706/824/761 705/822/761 720/832/761
f 714/825/762 713/823/762 728/833/762
f 707/826/763 706/824/763 721/834/763
f 707/826/766 722/836/766 723/840/766
f 700/828/767 715/838/767 716/841/767
f 708/813/768 723/840/768 724/842/768
f 702/816/769 701/814/769 716/841/769
f 710/817/770 709/815/770 724/842/770
f 702/816/771 717/843/771 718/845/771
f 710/817/772 725/844/772 726/830/772
f 704/820/773 703/818/773 718/845/773
f 723/840/774 738/862/774 739/847/774
f 716/841/775 731/863/775 732/848/775
f 725/844/776 724/842/776 739/847/776
f 717/843/777 732/848/777 733/850/777
f 725/844/778 740/849/778 741/851/778
f 719/846/779 718/845/779 733/850/779
f 727/831/780 726/830/780 741/851/780
f 719/846/781 734/852/781 735/854/781
f 727/831/782 742/853/782 743/855/782
f 721/834/783 720/832/783 735/854/783
f 728/833/784 743/855/784 744/857/784
f 722/836/785 721/834/785 736/856/785
f 722/836/788 737/858/788 738/862/788
f 715/838/789 730/860/789 731/863/789
f 742/853/790 757/880/790 758/864/790
f 736/856/791 735/854/791 750/865/791
f 743/855/792 758/864/792 759/867/792
f 737/858/793 736/856/793 751/866/793
f 737/858/796 752/868/796 753/872/796
f 731/863/797 730/860/797 745/870/797
f 738/862/798 753/872/798 754/874/798
f 731/863/799 746/873/799 747/875/799
f 740/849/800 739/847/800 754/874/800
f 732/848/801 747/875/801 748/877/801
f 740/849/802 755/876/802 756/878/802
f 734/852/803 733/850/803 748/877/803
f 742/853/804 741/851/804 756/878/804
f 734/852/805 749/879/805 750/865/805
f 746/873/806 761/897/806 762/881/806
f 755/876/807 754/874/807 769/882/807
f 747/875/808 762/881/808 763/884/808
f 755/876/809 770/883/809 771/885/809
f 749/879/810 748/877/810 763/884/810
f 757/880/811 756/878/811 771/885/811
f 749/879/812 764/886/812 765/888/812
f 757/880/813 772/887/813 773/889/813
f 751/866/814 750/865/814 765/888/814
f 759/867/815 758/864/815 773/889/815
f 752/868/816 751/866/816 766/890/816
f 752/868/819 767/892/819 768/896/819
f 746/873/820 745/870/820 760/894/820
f 753/872/821 768/896/821 769/882/821
f 766/890/822 765/888/822 780/898/822
f 774/891/823 773/889/823 788/900/823
f 767/892/824 766/890/824 781/899/824
f 767/892/827 782/902/827 783/906/827
f 760/894/828 775/904/828 776/907/828
f 768/896/829 783/906/829 784/908/829
f 761/897/830 776/907/830 777/909/830
f 770/883/831 769/882/831 784/908/831
f 762/881/832 777/909/832 778/911/832
f 770/883/833 785/910/833 786/912/833
f 764/886/834 763/884/834 778/911/834
f 772/887/835 771/885/835 786/912/835
f 764/886/836 779/913/836 780/898/836
f 772/887/837 787/914/837 788/900/837
f 785/910/838 784/908/838 800/915/838
f 777/909/839 793/931/839 794/917/839
f 785/910/840 801/916/840 802/918/840
f 779/913/841 778/911/841 794/917/841
f 787/914/842 786/912/842 802/918/842
f 779/913/843 795/919/843 796/921/843
f 787/914/844 803/920/844 804/922/844
f 781/899/845 780/898/845 796/921/845
f 788/900/1034 804/922/1034 805/924/1034
f 782/902/847 781/899/847 797/923/847
f 782/902/850 798/925/850 799/929/850
f 775/904/851 791/927/851 792/930/851
f 783/906/852 799/929/852 800/915/852
f 776/907/853 792/930/853 793/931/853
f 804/922/854 819/948/854 820/932/854
f 798/925/855 797/923/855 812/933/855
f 798/925/858 813/934/858 814/938/858
f 791/927/859 806/936/859 807/939/859
f 799/929/860 814/938/860 815/940/860
f 792/930/861 807/939/861 808/941/861
f 801/916/862 800/915/862 815/940/862
f 793/931/863 808/941/863 809/943/863
f 801/916/864 816/942/864 817/944/864
f 794/917/865 809/943/865 810/945/865
f 803/920/866 802/918/866 817/944/866
f 795/919/867 810/945/867 811/947/867
f 803/920/868 818/946/868 819/948/868
f 797/923/869 796/921/869 811/947/869
f 808/941/870 823/964/870 824/949/870
f 816/942/871 831/965/871 832/950/871
f 810/945/872 809/943/872 824/949/872
f 818/946/873 817/944/873 832/950/873
f 810/945/874 825/951/874 826/953/874
f 818/946/875 833/952/875 834/954/875
f 812/933/876 811/947/876 826/953/876
f 819/948/877 834/954/877 835/956/877
f 813/934/878 812/933/878 827/955/878
f 813/934/881 828/957/881 829/961/881
f 806/936/882 821/959/882 822/962/882
f 814/938/883 829/961/883 830/963/883
f 807/939/884 822/962/884 823/964/884
f 816/942/885 815/940/885 830/963/885
f 828/957/886 827/955/886 842/966/886
f 828/957/889 843/967/889 844/972/889
f 821/959/890 836/969/890 837/973/890
f 829/961/891 844/972/891 845/974/891
f 823/964/892 822/962/892 837/973/892
f 831/965/893 830/963/893 845/974/893
f 823/964/894 838/975/894 839/977/894
f 831/965/895 846/976/895 847/978/895
f 825/951/896 824/949/896 839/977/896
f 833/952/897 832/950/897 847/978/897
f 825/951/898 840/979/898 841/981/898
f 833/952/899 848/980/899 849/982/899
f 827/955/900 826/953/900 841/981/900
f 834/954/901 849/982/901 850/971/901
f 846/983/902 861/1013/902 862/984/902
f 840/986/903 839/1014/903 854/987/903
f 848/989/904 847/985/904 862/984/904
f 840/986/905 855/988/905 856/991/905
f 848/989/906 863/990/906 864/993/906
f 842/995/907 841/992/907 856/991/907
f 849/994/908 864/993/908 865/997/908
f 843/999/909 842/995/909 857/996/909
f 843/999/912 858/1000/912 859/1005/912
f 836/1001/913 851/1003/913 852/1007/913
f 844/1006/914 859/1005/914 860/1009/914
f 838/1011/915 837/1008/915 852/1007/915
f 846/983/916 845/1010/916 860/1009/916
f 838/1011/917 853/1012/917 854/987/917
f 858/1000/919 873/1029/919 874/1017/919
f 851/1003/920 866/1031/920 867/1018/920
f 859/1005/921 874/1017/921 875/1019/921
f 853/1012/922 852/1007/922 867/1018/922
f 861/1013/923 860/1009/923 875/1019/923
f 853/1012/924 868/1020/924 869/1022/924
f 861/1013/925 876/1021/925 877/1023/925
f 855/988/926 854/987/926 869/1022/926
f 863/990/927 862/984/927 877/1023/927
f 855/988/928 870/1024/928 871/1026/928
f 863/990/929 878/1025/929 879/1027/929
f 857/996/930 856/991/930 871/1026/930
f 864/993/931 879/1027/931 880/1016/931
f 858/1000/932 857/996/932 872/1028/932
f 878/1025/934 877/1023/934 892/1032/934
f 870/1024/935 885/1048/935 886/1034/935
f 879/1027/936 878/1025/936 893/1033/936
f 872/1028/937 871/1026/937 886/1034/937
f 879/1027/938 894/1035/938 895/1037/938
f 873/1029/939 872/1028/939 887/1036/939
f 873/1029/942 888/1038/942 889/1042/942
f 867/1018/943 866/1031/943 881/1040/943
f 874/1017/944 889/1042/944 890/1044/944
f 867/1018/945 882/1043/945 883/1045/945
f 876/1021/946 875/1019/946 890/1044/946
f 868/1020/947 883/1045/947 884/1047/947
f 876/1021/948 891/1046/948 892/1032/948
f 870/1024/949 869/1022/949 884/1047/949
f 881/1040/950 896/1063/950 897/1049/950
f 889/1042/951 904/1065/951 905/1050/951
f 883/1045/952 882/1043/952 897/1049/952
f 891/1046/953 890/1044/953 905/1050/953
f 883/1045/954 898/1051/954 899/1053/954
f 891/1046/955 906/1052/955 907/1054/955
f 885/1048/956 884/1047/956 899/1053/956
f 893/1033/957 892/1032/957 907/1054/957
f 885/1048/958 900/1055/958 901/1057/958
f 893/1033/959 908/1056/959 909/1058/959
f 887/1036/960 886/1034/960 901/1057/960
f 894/1035/961 909/1058/961 910/1060/961
f 888/1038/962 887/1036/962 902/1059/962
f 888/1038/965 903/1061/965 904/1065/965
f 900/1055/966 915/1081/966 916/1066/966
f 908/1056/967 923/1082/967 924/1067/967
f 902/1059/968 901/1057/968 916/1066/968
f 909/1058/969 924/1067/969 925/1069/969
f 903/1061/970 902/1059/970 917/1068/970
f 903/1061/973 918/1070/973 919/1074/973
f 896/1063/974 911/1072/974 912/1075/974
f 904/1065/975 919/1074/975 920/1076/975
f 898/1051/976 897/1049/976 912/1075/976
f 906/1052/977 905/1050/977 920/1076/977
f 898/1051/978 913/1077/978 914/1079/978
f 906/1052/979 921/1078/979 922/1080/979
f 900/1055/980 899/1053/980 914/1079/980
f 908/1056/981 907/1054/981 922/1080/981
f 919/1074/982 934/1098/982 935/1083/982
f 912/1075/983 927/1099/983 928/1084/983
f 921/1078/984 920/1076/984 935/1083/984
f 913/1077/985 928/1084/985 929/1086/985
f 921/1078/986 936/1085/986 937/1087/986
f 915/1081/987 914/1079/987 929/1086/987
f 923/1082/988 922/1080/988 937/1087/988
f 915/1081/989 930/1088/989 931/1090/989
f 923/1082/990 938/1089/990 939/1091/990
f 917/1068/991 916/1066/991 931/1090/991
f 924/1067/992 939/1091/992 940/1093/992
f 918/1070/993 917/1068/993 932/1092/993
f 918/1070/996 933/1094/996 934/1098/996
f 912/1075/997 911/1072/997 926/1096/997
f 938/1089/998 953/1116/998 954/1100/998
f 932/1092/999 931/1090/999 946/1101/999
f 940/1093/1000 939/1091/1000 954/1100/1000
f 933/1094/1001 932/1092/1001 947/1102/1001
f 933/1094/1004 948/1104/1004 949/1108/1004
f 926/1096/1005 941/1106/1005 942/1109/1005
f 934/1098/1006 949/1108/1006 950/1110/1006
f 928/1084/1007 927/1099/1007 942/1109/1007
f 936/1085/1008 935/1083/1008 950/1110/1008
f 928/1084/1009 943/1111/1009 944/1113/1009
f 936/1085/1010 951/1112/1010 952/1114/1010
f 930/1088/1011 929/1086/1011 944/1113/1011
f 938/1089/1012 937/1087/1012 952/1114/1012
f 930/1088/1013 945/1115/1013 946/1101/1013
f 943/1111/1014 942/1109/1014 957/591/1014
f 950/1110/1015 960/562/1015 488/567/1015
f 944/1113/1016 943/1111/1016 483/563/1016
f 951/1112/1017 488/567/1017 961/571/1017
f 945/1115/1018 944/1113/1018 958/568/1018
f 953/1116/1019 952/1114/1019 961/571/1019
f 946/1101/1020 945/1115/1020 484/572/1020
f 953/1116/1021 962/575/1021 963/579/1021
f 947/1102/1022 946/1101/1022 959/576/1022
f 954/1100/1035 963/579/1035 964/583/1035
f 948/1104/1024 947/1102/1024 485/580/1024
f 948/1104/1027 486/585/1027 487/560/1027
f 942/1109/1036 941/1106/1036 956/586/1036
f 949/1108/1029 487/560/1029 960/562/1029
o Cube.001
v 0.131002 0.093472 -0.288635
v -0.046766 -0.163106 -0.234637
v 0.099596 0.093472 -0.392024
v -0.078171 -0.163106 -0.338025
v 0.046229 0.157626 -0.262885
v -0.131538 -0.098952 -0.208886
v 0.014823 0.157626 -0.366273
v -0.162944 -0.098952 -0.312274
v 0.097034 0.005807 -0.259496
v 0.025928 -0.096824 -0.237897
v -0.015946 -0.096824 -0.375748
v 0.055161 0.005807 -0.397347
v -0.128977 -0.011287 -0.341414
v -0.057870 0.091344 -0.363013
v -0.087103 -0.011287 -0.203562
v -0.015996 0.091344 -0.225162
v 0.114758 0.092536 -0.373665
v 0.128483 0.092066 -0.344334
v 0.133391 0.092536 -0.312325
v 0.008017 -0.121847 -0.232859
v -0.010693 -0.143062 -0.229997
v -0.030367 -0.157173 -0.230979
v -0.049562 -0.171527 -0.256751
v -0.057075 -0.175757 -0.287969
v -0.068195 -0.171527 -0.318091
v 0.072466 0.031288 -0.402201
v 0.086941 0.055707 -0.403777
v 0.096168 0.077724 -0.399622
v 0.032365 0.148323 -0.379838
v 0.057664 0.131172 -0.391660
v 0.082660 0.110261 -0.395115
v -0.100293 -0.153803 -0.339541
v -0.127894 -0.136652 -0.335294
v -0.150589 -0.115741 -0.324264
v -0.039960 0.116367 -0.368051
v -0.021249 0.137582 -0.370913
v -0.001575 0.151693 -0.369931
v 0.036253 0.166047 -0.282818
v 0.025133 0.170277 -0.312941
v 0.017620 0.166047 -0.344159
v -0.165333 -0.098016 -0.288585
v -0.160425 -0.097546 -0.256575
v -0.146700 -0.098016 -0.227244
v 0.001690 0.116367 -0.230937
v 0.018832 0.137582 -0.238965
v 0.034635 0.151693 -0.250724
v 0.118646 0.110261 -0.276646
v 0.095952 0.131172 -0.265615
v 0.068351 0.148323 -0.261368
v -0.114602 -0.115741 -0.205794
v -0.089606 -0.136652 -0.209250
v -0.064307 -0.153803 -0.221072
v 0.132378 0.077724 -0.280414
v 0.127021 0.055707 -0.271829
v 0.114115 0.031288 -0.265088
v 0.079258 -0.019851 -0.254096
v 0.061481 -0.045509 -0.248696
v 0.043704 -0.071167 -0.243296
v -0.066577 -0.157173 -0.350186
v -0.050774 -0.143062 -0.361945
v -0.033632 -0.121847 -0.369972
v 0.001830 -0.071167 -0.381148
v 0.019607 -0.045509 -0.386548
v 0.037384 -0.019851 -0.391947
v -0.164320 -0.083204 -0.320495
v -0.158964 -0.061187 -0.329081
v -0.146057 -0.036768 -0.335822
v -0.111200 0.014371 -0.346813
v -0.093423 0.040029 -0.352213
v -0.075646 0.065687 -0.357613
v -0.128110 -0.083204 -0.201288
v -0.118883 -0.061187 -0.197133
v -0.104408 -0.036768 -0.198709
v -0.069326 0.014371 -0.208962
v -0.051549 0.040029 -0.214362
v -0.033773 0.065687 -0.219762
v 0.015501 0.071965 -0.213555
v 0.048371 0.048576 -0.216482
v 0.077314 0.025186 -0.232332
v -0.055606 -0.030666 -0.191956
v -0.022736 -0.054056 -0.194882
v 0.006208 -0.077445 -0.210732
v -0.064278 0.103373 -0.326953
v -0.058126 0.107383 -0.287650
v -0.041378 0.103373 -0.251566
v -0.135384 0.000742 -0.305353
v -0.129233 0.004751 -0.266050
v -0.112485 0.000742 -0.229966
v 0.023664 0.025186 -0.408954
v -0.009206 0.048576 -0.406027
v -0.038150 0.071965 -0.390177
v -0.047443 -0.077445 -0.387354
v -0.080313 -0.054056 -0.384428
v -0.109257 -0.030666 -0.368578
v 0.103442 -0.006222 -0.295556
v 0.097291 -0.010231 -0.334859
v 0.080542 -0.006222 -0.370944
v 0.032335 -0.108853 -0.273957
v 0.026184 -0.112863 -0.313260
v 0.009436 -0.108853 -0.349344
v 0.014371 -0.133850 -0.268722
v -0.004720 -0.154879 -0.264476
v -0.025889 -0.168281 -0.261517
v 0.008231 -0.137875 -0.307807
v -0.010775 -0.159011 -0.302033
v -0.032003 -0.172596 -0.295585
v -0.008405 -0.133850 -0.343703
v -0.026633 -0.154879 -0.336613
v -0.045870 -0.168281 -0.327298
v -0.064987 -0.102605 -0.381514
v -0.081136 -0.124782 -0.373035
v -0.094129 -0.141632 -0.359547
v -0.097705 -0.079377 -0.378600
v -0.112789 -0.102679 -0.370206
v -0.123394 -0.122130 -0.356659
v -0.126468 -0.056079 -0.362839
v -0.140284 -0.080021 -0.355068
v -0.148065 -0.100815 -0.343164
v -0.152459 -0.024835 -0.299945
v -0.165320 -0.049924 -0.294485
v -0.170234 -0.074167 -0.289521
v -0.146368 -0.020879 -0.260846
v -0.159650 -0.046347 -0.256811
v -0.165376 -0.071664 -0.255072
v -0.129683 -0.024835 -0.224964
v -0.143408 -0.049924 -0.222348
v -0.150252 -0.074167 -0.223741
v -0.073101 -0.056079 -0.187152
v -0.088905 -0.080021 -0.185925
v -0.101993 -0.100815 -0.191491
v -0.040431 -0.079377 -0.190052
v -0.057636 -0.102679 -0.188638
v -0.073984 -0.122130 -0.193998
v -0.011621 -0.102605 -0.205828
v -0.029757 -0.124782 -0.203892
v -0.048057 -0.141632 -0.207874
v 0.097926 0.117017 -0.370755
v 0.072470 0.140011 -0.364941
v 0.044323 0.157582 -0.354473
v 0.111174 0.118639 -0.339076
v 0.084983 0.142971 -0.331121
v 0.054691 0.161383 -0.321919
v 0.117784 0.117017 -0.305382
v 0.093395 0.140011 -0.296055
v 0.064181 0.157582 -0.289099
v -0.149727 -0.122497 -0.295528
v -0.125337 -0.145491 -0.304855
v -0.096124 -0.163062 -0.311811
v -0.143116 -0.124119 -0.261833
v -0.116925 -0.148451 -0.269789
v -0.086633 -0.166863 -0.278990
v -0.129869 -0.122497 -0.230154
v -0.104412 -0.145491 -0.235969
v -0.076266 -0.163062 -0.246437
v 0.062187 0.136152 -0.241362
v 0.049194 0.119302 -0.227874
v 0.033045 0.097125 -0.219395
v 0.091452 0.116650 -0.244251
v 0.080847 0.097199 -0.230704
v 0.065763 0.073897 -0.222310
v 0.116123 0.095335 -0.257746
v 0.108342 0.074541 -0.245841
v 0.094525 0.050599 -0.238071
v -0.002276 0.046307 -0.208156
v -0.020052 0.020649 -0.202756
v -0.037829 -0.005009 -0.197356
v 0.030594 0.022918 -0.211082
v 0.012817 -0.002740 -0.205682
v -0.004960 -0.028398 -0.200282
v 0.059538 -0.000471 -0.226932
v 0.041761 -0.026129 -0.221532
v 0.023984 -0.051787 -0.216132
v -0.006053 0.162801 -0.339392
v -0.027222 0.149399 -0.336434
v -0.046313 0.128370 -0.332188
v 0.000060 0.167116 -0.305325
v -0.021167 0.153531 -0.298876
v -0.040174 0.132395 -0.293103
v 0.013928 0.162801 -0.273612
v -0.005309 0.149399 -0.264297
v -0.023537 0.128370 -0.257207
v -0.082054 0.077715 -0.321553
v -0.099831 0.052057 -0.316153
v -0.117608 0.026400 -0.310753
v -0.075903 0.081725 -0.282250
v -0.093679 0.056067 -0.276850
v -0.111456 0.030409 -0.271450
v -0.059154 0.077715 -0.246166
v -0.076931 0.052057 -0.240766
v -0.094708 0.026400 -0.235366
v 0.070051 0.095335 -0.409419
v 0.056963 0.074541 -0.414984
v 0.041159 0.050599 -0.413758
v 0.042042 0.116650 -0.406912
v 0.025694 0.097199 -0.412272
v 0.008489 0.073897 -0.410858
v 0.016115 0.136152 -0.393035
v -0.002185 0.119302 -0.397017
v -0.020322 0.097125 -0.395082
v 0.005887 -0.000471 -0.403554
v -0.011890 -0.026129 -0.398154
v -0.029667 -0.051787 -0.392754
v -0.026983 0.022918 -0.400627
v -0.044759 -0.002740 -0.395228
v -0.062536 -0.028398 -0.389828
v -0.055926 0.046307 -0.384777
v -0.073703 0.020649 -0.379378
v -0.091480 -0.005009 -0.373978
v 0.138292 0.068687 -0.311389
v 0.133378 0.044444 -0.306425
v 0.120517 0.019355 -0.300965
v 0.133433 0.066184 -0.345838
v 0.127708 0.040867 -0.344099
v 0.114425 0.015399 -0.340064
v 0.118310 0.068687 -0.377169
v 0.111466 0.044444 -0.378562
v 0.097740 0.019355 -0.375946
v 0.085666 -0.031880 -0.290156
v 0.067889 -0.057538 -0.284757
v 0.050112 -0.083195 -0.279357
v 0.079514 -0.035889 -0.329459
v 0.061737 -0.061547 -0.324060
v 0.043960 -0.087205 -0.318660
v 0.062766 -0.031880 -0.365544
v 0.044989 -0.057538 -0.360144
v 0.027212 -0.083195 -0.354744
vn 0.7350 -0.6765 -0.0458
vn 0.5356 -0.8445 0.0066
vn 0.6363 -0.6765 -0.3708
vn 0.4414 -0.8445 -0.3033
vn 0.8011 -0.5104 0.3126
vn 0.7297 -0.5991 0.3298
vn 0.5330 -0.7678 0.3555
vn 0.2444 -0.8992 0.3629
vn 0.1535 -0.9834 0.0970
vn 0.0736 -0.9834 -0.1660
vn 0.0012 -0.8992 -0.4375
vn 0.2452 -0.7678 -0.5919
vn 0.4230 -0.5991 -0.6799
vn 0.4919 -0.5104 -0.7054
vn 0.7044 -0.5914 -0.3926
vn 0.8038 -0.5914 -0.0655
vn -0.2266 -0.1995 -0.9533
vn -0.3557 -0.4013 -0.8440
vn -0.4836 -0.0050 -0.8753
vn -0.6012 -0.2155 -0.7695
vn 0.1477 -0.3204 -0.9357
vn 0.0825 -0.4103 -0.9082
vn -0.0653 -0.5946 -0.8013
vn -0.2314 -0.7691 -0.5958
vn -0.4980 -0.6648 -0.5568
vn -0.7073 -0.5064 -0.4932
vn -0.8700 -0.2858 -0.4018
vn -0.8198 -0.0236 -0.5721
vn -0.7210 0.1977 -0.6641
vn -0.6624 0.2927 -0.6896
vn -0.4252 0.0845 -0.9011
vn -0.1665 -0.1113 -0.9797
vn -0.8783 0.4697 0.0893
vn -0.9672 0.2215 0.1245
vn -0.7796 0.4697 0.4143
vn -0.8730 0.2215 0.4345
vn -0.8204 0.4826 -0.3068
vn -0.8778 0.3853 -0.2848
vn -0.9633 0.1467 -0.2248
vn -0.9773 -0.1587 -0.1403
vn -0.9693 -0.1942 0.1508
vn -0.8894 -0.1942 0.4138
vn -0.7342 -0.1587 0.6602
vn -0.6755 0.1467 0.7226
vn -0.5711 0.3853 0.7249
vn -0.5111 0.4826 0.7112
vn -0.7232 0.5642 0.3984
vn -0.8226 0.5642 0.0712
vn 0.0849 -0.0050 0.9964
vn -0.0717 -0.2155 0.9739
vn 0.3420 -0.1995 0.9183
vn 0.1739 -0.4013 0.8993
vn -0.1669 0.2927 0.9415
vn -0.2298 0.1977 0.9529
vn -0.3631 -0.0236 0.9315
vn -0.4995 -0.2858 0.8178
vn -0.3135 -0.5064 0.8033
vn -0.1042 -0.6648 0.7397
vn 0.1391 -0.7691 0.6238
vn 0.3914 -0.5946 0.7023
vn 0.5737 -0.4103 0.7089
vn 0.6432 -0.3204 0.6955
vn 0.4066 -0.1113 0.9068
vn 0.1478 0.0845 0.9854
vn 0.6026 0.7435 -0.2901
vn 0.4478 0.8610 -0.2410
vn 0.6621 0.7435 -0.0941
vn 0.5063 0.8610 -0.0488
vn 0.6700 0.5232 -0.5266
vn 0.5089 0.6927 -0.5111
vn 0.3408 0.8199 -0.4601
vn 0.1635 0.8994 -0.4054
vn 0.2312 0.9551 -0.1852
vn 0.2952 0.9551 0.0253
vn 0.3614 0.8994 0.2460
vn 0.5391 0.8199 0.1928
vn 0.7072 0.6927 0.1417
vn 0.8497 0.5232 0.0650
vn 0.8181 0.5593 -0.1337
vn 0.7542 0.5593 -0.3439
vn -0.6621 -0.7435 0.0941
vn -0.5063 -0.8610 0.0488
vn -0.6026 -0.7435 0.2901
vn -0.4478 -0.8610 0.2410
vn -0.8497 -0.5232 -0.0650
vn -0.7072 -0.6927 -0.1417
vn -0.5391 -0.8199 -0.1928
vn -0.3614 -0.8994 -0.2460
vn -0.2952 -0.9551 -0.0253
vn -0.2312 -0.9551 0.1852
vn -0.1635 -0.8994 0.4054
vn -0.3408 -0.8199 0.4601
vn -0.5089 -0.6927 0.5111
vn -0.6700 -0.5232 0.5266
vn -0.7542 -0.5593 0.3439
vn -0.8181 -0.5593 0.1337
vn 0.3498 0.3994 0.8474
vn 0.2240 0.1987 0.9541
vn 0.6019 0.2087 0.7708
vn 0.4837 0.0022 0.8752
vn 0.2880 0.7762 0.5609
vn 0.0624 0.5939 0.8021
vn -0.0853 0.4094 0.9083
vn -0.1479 0.3203 0.9357
vn 0.1663 0.1113 0.9798
vn 0.4252 -0.0847 0.9011
vn 0.6623 -0.2929 0.6896
vn 0.7208 -0.2006 0.6635
vn 0.8200 0.0206 0.5720
vn 0.8523 0.3491 0.3895
vn 0.7067 0.5063 0.4942
vn 0.4984 0.6640 0.5575
vn 0.1572 0.0979 0.9827
vn 0.4160 -0.0979 0.9041
vn -0.1573 0.3066 0.9388
vn 0.6529 -0.3066 0.6927
vn -0.5410 0.8409 -0.0094
vn -0.7368 0.6747 0.0445
vn -0.4444 0.8409 0.3088
vn -0.6371 0.6747 0.3728
vn -0.1928 0.9249 -0.3278
vn -0.5348 0.7659 -0.3570
vn -0.7309 0.5967 -0.3311
vn -0.8011 0.5103 -0.3127
vn -0.8039 0.5912 0.0654
vn -0.7044 0.5912 0.3928
vn -0.4919 0.5103 0.7055
vn -0.4233 0.5967 0.6817
vn -0.2459 0.7659 0.5941
vn 0.0221 0.9249 0.3797
vn -0.0747 0.9834 0.1657
vn -0.1542 0.9834 -0.0961
vn -0.8133 0.5779 0.0683
vn -0.7139 0.5779 0.3956
vn -0.8108 0.4966 -0.3097
vn -0.5016 0.4966 0.7084
vn 0.0715 0.2087 -0.9754
vn -0.0849 0.0022 -0.9964
vn -0.1806 0.3994 -0.8988
vn -0.3445 0.1987 -0.9175
vn 0.4917 0.3491 -0.7977
vn 0.3633 0.0206 -0.9314
vn 0.2300 -0.2006 -0.9523
vn 0.1669 -0.2929 -0.9415
vn -0.1478 -0.0847 -0.9854
vn -0.4068 0.1113 -0.9067
vn -0.6433 0.3203 -0.6954
vn -0.5761 0.4094 -0.7074
vn -0.3942 0.5939 -0.7013
vn -0.0727 0.7762 -0.6263
vn 0.1041 0.6640 -0.7405
vn 0.3125 0.5063 -0.8038
vn -0.1572 -0.0979 -0.9827
vn -0.4160 0.0979 -0.9041
vn 0.1573 -0.3066 -0.9388
vn -0.6529 0.3066 -0.6927
vn 0.9665 -0.2268 -0.1198
vn 0.8775 -0.4715 -0.0873
vn 0.8699 -0.2268 -0.4380
vn 0.7778 -0.4715 -0.4155
vn 0.9748 0.2039 0.0903
vn 0.9625 -0.1486 0.2271
vn 0.8764 -0.3868 0.2870
vn 0.8203 -0.4827 0.3069
vn 0.8225 -0.5643 -0.0711
vn 0.7231 -0.5643 -0.3984
vn 0.5110 -0.4827 -0.7113
vn 0.5687 -0.3868 -0.7259
vn 0.6736 -0.1486 -0.7240
vn 0.7599 0.2039 -0.6172
vn 0.8899 0.1932 -0.4133
vn 0.9694 0.1932 -0.1515
vn 0.8133 -0.5779 -0.0683
vn 0.7139 -0.5779 -0.3956
vn 0.8108 -0.4966 0.3097
vn 0.5016 -0.4966 -0.7084
vn 0.7368 -0.6747 -0.0445
vn 0.5410 -0.8409 0.0094
vn 0.6371 -0.6747 -0.3728
vn 0.4444 -0.8409 -0.3088
vn 0.8011 -0.5103 0.3127
vn 0.7309 -0.5967 0.3311
vn 0.5348 -0.7659 0.3570
vn 0.1928 -0.9249 0.3278
vn 0.1542 -0.9834 0.0961
vn 0.0747 -0.9834 -0.1657
vn -0.0221 -0.9249 -0.3797
vn 0.2459 -0.7659 -0.5941
vn 0.4233 -0.5967 -0.6817
vn 0.4919 -0.5103 -0.7055
vn 0.7044 -0.5912 -0.3928
vn 0.8039 -0.5912 -0.0654
vn -0.2240 -0.1987 -0.9541
vn -0.3498 -0.3994 -0.8474
vn -0.4837 -0.0022 -0.8752
vn -0.6019 -0.2087 -0.7708
vn 0.1479 -0.3203 -0.9357
vn 0.0853 -0.4094 -0.9083
vn -0.0624 -0.5939 -0.8021
vn -0.2880 -0.7762 -0.5609
vn -0.4984 -0.6640 -0.5575
vn -0.7067 -0.5063 -0.4942
vn -0.8523 -0.3491 -0.3895
vn -0.8200 -0.0206 -0.5720
vn -0.7208 0.2006 -0.6635
vn -0.6623 0.2929 -0.6896
vn -0.4252 0.0847 -0.9011
vn -0.1663 -0.1113 -0.9798
vn -0.8775 0.4715 0.0873
vn -0.9665 0.2268 0.1198
vn -0.7778 0.4715 0.4155
vn -0.8699 0.2268 0.4380
vn -0.8203 0.4827 -0.3069
vn -0.8764 0.3868 -0.2870
vn -0.9625 0.1486 -0.2271
vn -0.9748 -0.2039 -0.0903
vn -0.9694 -0.1932 0.1515
vn -0.8899 -0.1932 0.4133
vn -0.7599 -0.2039 0.6172
vn -0.6736 0.1486 0.7240
vn -0.5687 0.3868 0.7259
vn -0.5110 0.4827 0.7113
vn -0.7231 0.5643 0.3984
vn -0.8225 0.5643 0.0711
vn 0.0849 -0.0022 0.9964
vn -0.0715 -0.2087 0.9754
vn 0.3445 -0.1987 0.9175
vn 0.1806 -0.3994 0.8988
vn -0.1669 0.2929 0.9415
vn -0.2300 0.2006 0.9523
vn -0.3633 -0.0206 0.9314
vn -0.4917 -0.3491 0.7977
vn -0.3125 -0.5063 0.8038
vn -0.1041 -0.6640 0.7405
vn 0.0727 -0.7762 0.6263
vn 0.3942 -0.5939 0.7013
vn 0.5761 -0.4094 0.7074
vn 0.6433 -0.3203 0.6954
vn 0.4068 -0.1113 0.9067
vn 0.1478 0.0847 0.9854
vn 0.6046 0.7424 -0.2887
vn 0.4488 0.8599 -0.2433
vn 0.6630 0.7424 -0.0964
vn 0.5083 0.8599 -0.0474
vn 0.6368 0.5412 -0.5492
vn 0.5088 0.6927 -0.5111
vn 0.3408 0.8199 -0.4600
vn 0.1487 0.9177 -0.3683
vn 0.2312 0.9551 -0.1851
vn 0.2951 0.9551 0.0252
vn 0.3284 0.9177 0.2234
vn 0.5391 0.8199 0.1927
vn 0.7072 0.6927 0.1418
vn 0.8346 0.5412 0.1022
vn 0.8181 0.5594 -0.1336
vn 0.7541 0.5594 -0.3440
vn -0.6630 -0.7424 0.0964
vn -0.5083 -0.8599 0.0474
vn -0.6046 -0.7424 0.2887
vn -0.4488 -0.8599 0.2433
vn -0.8346 -0.5412 -0.1022
vn -0.7072 -0.6927 -0.1418
vn -0.5391 -0.8199 -0.1927
vn -0.3284 -0.9177 -0.2234
vn -0.2951 -0.9551 -0.0252
vn -0.2312 -0.9551 0.1851
vn -0.1487 -0.9177 0.3683
vn -0.3408 -0.8199 0.4600
vn -0.5088 -0.6927 0.5111
vn -0.6368 -0.5412 0.5492
vn -0.7541 -0.5594 0.3440
vn -0.8181 -0.5594 0.1336
vn 0.3557 0.4013 0.8440
vn 0.2266 0.1995 0.9533
vn 0.6012 0.2155 0.7695
vn 0.4836 0.0050 0.8753
vn 0.2314 0.7691 0.5958
vn 0.0653 0.5946 0.8013
vn -0.0825 0.4103 0.9082
vn -0.1477 0.3204 0.9357
vn 0.1665 0.1113 0.9797
vn 0.4252 -0.0845 0.9011
vn 0.6624 -0.2927 0.6896
vn 0.7210 -0.1977 0.6641
vn 0.8198 0.0236 0.5721
vn 0.8700 0.2858 0.4018
vn 0.7073 0.5064 0.4932
vn 0.4980 0.6648 0.5568
vn -0.5356 0.8445 -0.0066
vn -0.7350 0.6765 0.0458
vn -0.4414 0.8445 0.3033
vn -0.6363 0.6765 0.3708
vn -0.2444 0.8992 -0.3629
vn -0.5330 0.7678 -0.3555
vn -0.7297 0.5991 -0.3298
vn -0.8011 0.5104 -0.3126
vn -0.8038 0.5914 0.0655
vn -0.7044 0.5914 0.3926
vn -0.4919 0.5104 0.7054
vn -0.4230 0.5991 0.6799
vn -0.2452 0.7678 0.5919
vn -0.0012 0.8992 0.4375
vn -0.0736 0.9834 0.1660
vn -0.1535 0.9834 -0.0970
vn 0.0717 0.2155 -0.9739
vn -0.0849 0.0050 -0.9964
vn -0.1739 0.4013 -0.8993
vn -0.3420 0.1995 -0.9183
vn 0.4995 0.2858 -0.8178
vn 0.3631 0.0236 -0.9315
vn 0.2298 -0.1977 -0.9529
vn 0.1669 -0.2927 -0.9415
vn -0.1478 -0.0845 -0.9854
vn -0.4066 0.1113 -0.9068
vn -0.6432 0.3204 -0.6955
vn -0.5737 0.4103 -0.7089
vn -0.3914 0.5946 -0.7023
vn -0.1391 0.7691 -0.6238
vn 0.1042 0.6648 -0.7397
vn 0.3135 0.5064 -0.8033
vn 0.9672 -0.2215 -0.1245
vn 0.8783 -0.4697 -0.0893
vn 0.8730 -0.2215 -0.4345
vn 0.7796 -0.4697 -0.4143
vn 0.9773 0.1587 0.1403
vn 0.9633 -0.1467 0.2248
vn 0.8778 -0.3853 0.2848
vn 0.8204 -0.4826 0.3068
vn 0.8226 -0.5642 -0.0712
vn 0.7232 -0.5642 -0.3984
vn 0.5111 -0.4826 -0.7112
vn 0.5711 -0.3853 -0.7249
vn 0.6755 -0.1467 -0.7226
vn 0.7342 0.1587 -0.6602
vn 0.8894 0.1942 -0.4138
vn 0.9693 0.1942 -0.1508
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1066/1119/1037 1068/1120/1037 1065/1121/1037
f 1067/1122/1038 1069/1123/1038 1066/1119/1038
f 1068/1120/1039 1072/1124/1039 1071/1125/1039
f 1069/1123/1040 1073/1126/1040 1072/1124/1040
f 984/1127/1041 1062/1128/1041 974/1129/1041
f 985/1130/1042 1065/1121/1042 984/1127/1042
f 986/1131/1043 1066/1119/1043 985/1130/1043
f 966/1132/1044 1067/1122/1044 986/1131/1044
f 1067/1122/1045 988/1133/1045 1070/1134/1045
f 988/1133/1046 1073/1126/1046 1070/1134/1046
f 1073/1126/1047 968/1135/1047 1023/1136/1047
f 1072/1124/1048 1023/1136/1048 1024/1137/1048
f 1071/1125/1049 1024/1137/1049 1025/1138/1049
f 1064/1139/1050 1025/1138/1050 975/1140/1050
f 1063/1141/1051 1071/1125/1051 1064/1139/1051
f 1065/1121/1052 1063/1141/1052 1062/1128/1052
f 1075/1142/1053 1077/1143/1053 1074/1144/1053
f 1076/1145/1054 1078/1146/1054 1075/1142/1054
f 1077/1143/1055 1081/1147/1055 1080/1148/1055
f 1078/1146/1056 1082/1149/1056 1081/1147/1056
f 1025/1138/1057 1056/1150/1057 975/1140/1057
f 1024/1137/1058 1074/1144/1058 1025/1138/1058
f 1023/1136/1059 1075/1142/1059 1024/1137/1059
f 968/1135/1060 1076/1145/1060 1023/1136/1060
f 1076/1145/1061 997/1151/1061 1079/1152/1061
f 997/1151/1062 1082/1149/1062 1079/1152/1062
f 1082/1149/1063 972/1153/1063 1029/1154/1063
f 1081/1147/1064 1029/1154/1064 1030/1155/1064
f 1080/1148/1065 1030/1155/1065 1031/1156/1065
f 1058/1157/1066 1031/1156/1066 977/1158/1066
f 1057/1159/1067 1080/1148/1067 1058/1157/1067
f 1074/1144/1068 1057/1159/1068 1056/1150/1068
f 1084/1160/1069 1086/1161/1069 1083/1162/1069
f 1085/1163/1070 1087/1164/1070 1084/1160/1070
f 1086/1161/1071 1090/1165/1071 1089/1166/1071
f 1087/1164/1072 1091/1167/1072 1090/1165/1072
f 1031/1156/1073 1050/1168/1073 977/1158/1073
f 1030/1155/1074 1083/1162/1074 1031/1156/1074
f 1029/1154/1075 1084/1160/1075 1030/1155/1075
f 972/1153/1076 1085/1163/1076 1029/1154/1076
f 1085/1163/1077 1006/1169/1077 1088/1170/1077
f 1006/1169/1078 1091/1167/1078 1088/1170/1078
f 1091/1167/1079 970/1171/1079 1035/1172/1079
f 1090/1165/1080 1035/1172/1080 1036/1173/1080
f 1089/1166/1081 1036/1173/1081 1037/1174/1081
f 1052/1175/1082 1037/1174/1082 979/1176/1082
f 1051/1177/1083 1089/1166/1083 1052/1175/1083
f 1083/1162/1084 1051/1177/1084 1050/1168/1084
f 1093/1178/1085 1095/1179/1085 1092/1180/1085
f 1094/1181/1086 1096/1182/1086 1093/1178/1086
f 1095/1179/1087 1099/1183/1087 1098/1184/1087
f 1096/1182/1088 1100/1185/1088 1099/1183/1088
f 1037/1174/1089 1044/1186/1089 979/1176/1089
f 1036/1173/1090 1092/1180/1090 1037/1174/1090
f 1035/1172/1091 1093/1178/1091 1036/1173/1091
f 970/1171/1092 1094/1181/1092 1035/1172/1092
f 1094/1181/1093 1015/1187/1093 1097/1188/1093
f 1015/1187/1094 1100/1185/1094 1097/1188/1094
f 1100/1185/1095 966/1189/1095 986/1190/1095
f 1099/1183/1096 986/1190/1096 985/1191/1096
f 1098/1184/1097 985/1191/1097 984/1192/1097
f 1046/1193/1098 984/1192/1098 974/1194/1098
f 1045/1195/1099 1098/1184/1099 1046/1193/1099
f 1092/1180/1100 1045/1195/1100 1044/1186/1100
f 1101/1196/1101 1105/1197/1101 1104/1198/1101
f 1103/1199/1102 1105/1197/1102 1102/1200/1102
f 1105/1197/1103 1107/1201/1103 1104/1198/1103
f 1105/1197/1104 1109/1202/1104 1108/1203/1104
f 967/1204/1105 1101/1196/1105 981/1205/1105
f 995/1206/1106 1102/1200/1106 1101/1196/1106
f 993/1207/1107 1102/1200/1107 994/1208/1107
f 971/1209/1108 1103/1199/1108 993/1207/1108
f 1004/1210/1109 1106/1211/1109 1103/1199/1109
f 1106/1211/1110 1002/1212/1110 1109/1202/1110
f 1109/1202/1111 969/1213/1111 1013/1214/1111
f 1108/1203/1112 1013/1214/1112 1012/1215/1112
f 1108/1203/1113 1011/1216/1113 1107/1201/1113
f 1107/1201/1114 965/1217/1114 983/1218/1114
f 1104/1198/1115 983/1218/1115 982/1219/1115
f 981/1205/1116 1104/1198/1116 982/1219/1116
f 1110/1220/1117 1114/1221/1117 1113/1222/1117
f 1112/1223/1118 1114/1221/1118 1111/1224/1118
f 1114/1221/1119 1116/1225/1119 1113/1222/1119
f 1114/1221/1120 1118/1226/1120 1117/1227/1120
f 972/1153/1121 1110/1220/1121 1005/1228/1121
f 998/1229/1122 1111/1224/1122 1110/1220/1122
f 996/1230/1123 1111/1224/1123 997/1231/1123
f 968/1232/1124 1112/1223/1124 996/1230/1124
f 989/1233/1125 1115/1234/1125 1112/1223/1125
f 1115/1234/1126 987/1235/1126 1118/1226/1126
f 1118/1226/1127 966/1236/1127 1016/1237/1127
f 1117/1227/1128 1016/1237/1128 1015/1238/1128
f 1117/1227/1129 1014/1239/1129 1116/1225/1129
f 1116/1225/1130 970/1171/1130 1007/1240/1130
f 1113/1222/1131 1007/1240/1131 1006/1169/1131
f 1005/1228/1132 1113/1222/1132 1006/1169/1132
f 1119/1241/1133 1123/1242/1133 1122/1243/1133
f 1120/1244/1134 1124/1245/1134 1123/1242/1134
f 1123/1242/1135 1125/1246/1135 1122/1243/1135
f 1124/1245/1136 1126/1247/1136 1123/1242/1136
f 969/1213/1137 1119/1241/1137 1013/1248/1137
f 1010/1249/1138 1120/1244/1138 1119/1241/1138
f 1009/1250/1139 1121/1251/1139 1120/1244/1139
f 1008/1252/1140 1041/1253/1140 1121/1251/1140
f 1121/1251/1141 1042/1254/1141 1124/1245/1141
f 1042/1254/1142 1127/1255/1142 1124/1245/1142
f 1043/1256/1143 1019/1257/1143 1127/1255/1143
f 1127/1255/1144 1018/1258/1144 1126/1247/1144
f 1126/1247/1145 1017/1259/1145 1125/1246/1145
f 1125/1246/1146 965/1260/1146 1011/1261/1146
f 1012/1262/1147 1125/1246/1147 1011/1261/1147
f 1119/1241/1148 1012/1262/1148 1013/1248/1148
f 1129/1263/1149 1131/1264/1149 1128/1265/1149
f 1130/1266/1149 1132/1267/1149 1129/1263/1149
f 1132/1267/1150 1134/1268/1150 1131/1264/1150
f 1133/1269/1150 1135/1270/1150 1132/1267/1150
f 1040/1271/1151 1041/1253/1151 980/1272/1151
f 1039/1273/1151 1128/1265/1151 1040/1271/1151
f 1039/1273/1151 1130/1266/1151 1129/1263/1151
f 979/1176/1151 1130/1266/1151 1038/1274/1151
f 1044/1186/1149 1133/1269/1149 1130/1266/1149
f 1045/1195/1150 1136/1275/1150 1133/1269/1150
f 1046/1193/1152 1022/1276/1152 1136/1275/1152
f 1136/1275/1152 1021/1277/1152 1135/1270/1152
f 1135/1270/1152 1020/1278/1152 1134/1268/1152
f 1134/1268/1152 973/1279/1152 1043/1256/1152
f 1131/1264/1150 1043/1256/1150 1042/1254/1150
f 1128/1265/1149 1042/1254/1149 1041/1253/1149
f 1137/1280/1153 1141/1281/1153 1140/1282/1153
f 1138/1283/1154 1142/1284/1154 1141/1281/1154
f 1141/1281/1155 1143/1285/1155 1140/1282/1155
f 1142/1284/1156 1144/1286/1156 1141/1281/1156
f 971/1209/1157 1137/1280/1157 1004/1210/1157
f 1001/1287/1158 1138/1283/1158 1137/1280/1158
f 1000/1288/1159 1139/1289/1159 1138/1283/1159
f 999/1290/1160 1047/1291/1160 1139/1289/1160
f 1139/1289/1161 1048/1292/1161 1142/1284/1161
f 1048/1292/1162 1145/1293/1162 1142/1284/1162
f 1049/1294/1163 1008/1252/1163 1145/1293/1163
f 1145/1293/1164 1009/1250/1164 1144/1286/1164
f 1144/1286/1165 1010/1249/1165 1143/1285/1165
f 1143/1285/1166 969/1213/1166 1002/1212/1166
f 1003/1295/1167 1143/1285/1167 1002/1212/1167
f 1137/1280/1168 1003/1295/1168 1004/1210/1168
f 1147/1296/1169 1149/1297/1169 1146/1298/1169
f 1148/1299/1169 1150/1300/1169 1147/1296/1169
f 1150/1300/1170 1152/1301/1170 1149/1297/1170
f 1151/1302/1170 1153/1303/1170 1150/1300/1170
f 1034/1304/1171 1047/1291/1171 978/1305/1171
f 1034/1304/1171 1147/1296/1171 1146/1298/1171
f 1032/1306/1171 1147/1296/1171 1033/1307/1171
f 1032/1306/1171 1050/1168/1171 1148/1299/1171
f 1148/1299/1169 1051/1308/1169 1151/1302/1169
f 1051/1308/1170 1154/1309/1170 1151/1302/1170
f 1052/1175/1172 1038/1310/1172 1154/1309/1172
f 1153/1303/1172 1038/1310/1172 1039/1311/1172
f 1153/1303/1172 1040/1271/1172 1152/1301/1172
f 1152/1301/1172 980/1312/1172 1049/1294/1172
f 1149/1297/1170 1049/1294/1170 1048/1313/1170
f 1047/1291/1169 1149/1297/1169 1048/1313/1169
f 1155/1314/1173 1159/1315/1173 1158/1316/1173
f 1156/1317/1174 1160/1318/1174 1159/1315/1174
f 1159/1315/1175 1161/1319/1175 1158/1316/1175
f 1160/1318/1176 1162/1320/1176 1159/1315/1176
f 967/1321/1177 1155/1314/1177 995/1322/1177
f 992/1323/1178 1156/1317/1178 1155/1314/1178
f 991/1324/1179 1157/1325/1179 1156/1317/1179
f 990/1326/1180 1053/1327/1180 1157/1325/1180
f 1157/1325/1181 1054/1328/1181 1160/1318/1181
f 1054/1328/1182 1163/1329/1182 1160/1318/1182
f 1055/1330/1183 999/1290/1183 1163/1329/1183
f 1163/1329/1184 1000/1288/1184 1162/1320/1184
f 1162/1320/1185 1001/1287/1185 1161/1319/1185
f 1161/1319/1186 971/1209/1186 993/1331/1186
f 994/1332/1187 1161/1319/1187 993/1331/1187
f 1155/1314/1188 994/1332/1188 995/1322/1188
f 1165/1333/1189 1167/1334/1189 1164/1335/1189
f 1166/1336/1189 1168/1337/1189 1165/1333/1189
f 1168/1337/1190 1170/1338/1190 1167/1334/1190
f 1169/1339/1190 1171/1340/1190 1168/1337/1190
f 1028/1341/1191 1053/1342/1191 976/1343/1191
f 1028/1341/1191 1165/1333/1191 1164/1335/1191
f 1026/1344/1191 1165/1333/1191 1027/1345/1191
f 1026/1344/1191 1056/1346/1191 1166/1336/1191
f 1166/1336/1189 1057/1347/1189 1169/1339/1189
f 1057/1347/1190 1172/1348/1190 1169/1339/1190
f 1058/1349/1192 1032/1306/1192 1172/1348/1192
f 1172/1348/1192 1033/1307/1192 1171/1340/1192
f 1170/1338/1192 1033/1307/1192 1034/1304/1192
f 1170/1338/1192 978/1305/1192 1055/1350/1192
f 1167/1334/1190 1055/1350/1190 1054/1351/1190
f 1053/1342/1189 1167/1334/1189 1054/1351/1189
f 1173/1352/1193 1177/1353/1193 1176/1354/1193
f 1174/1355/1194 1178/1356/1194 1177/1353/1194
f 1177/1353/1195 1179/1357/1195 1176/1354/1195
f 1178/1356/1196 1180/1358/1196 1177/1353/1196
f 965/1359/1197 1173/1352/1197 983/1360/1197
f 1017/1361/1198 1174/1355/1198 1173/1352/1198
f 1018/1362/1199 1175/1363/1199 1174/1355/1199
f 1019/1364/1200 1059/1365/1200 1175/1363/1200
f 1175/1363/1201 1060/1366/1201 1178/1356/1201
f 1060/1366/1202 1181/1367/1202 1178/1356/1202
f 1061/1368/1203 990/1326/1203 1181/1367/1203
f 1181/1367/1204 991/1324/1204 1180/1358/1204
f 1180/1358/1205 992/1323/1205 1179/1357/1205
f 1179/1357/1206 967/1321/1206 981/1369/1206
f 982/1370/1207 1179/1357/1207 981/1369/1207
f 1173/1352/1208 982/1370/1208 983/1360/1208
f 1183/1371/1209 1185/1372/1209 1182/1373/1209
f 1184/1374/1209 1186/1375/1209 1183/1371/1209
f 1186/1375/1210 1188/1376/1210 1185/1372/1210
f 1187/1377/1210 1189/1378/1210 1186/1375/1210
f 973/1379/1211 1182/1373/1211 1059/1365/1211
f 1021/1380/1211 1182/1373/1211 1020/1381/1211
f 1022/1382/1211 1183/1371/1211 1021/1380/1211
f 974/1129/1211 1184/1374/1211 1022/1382/1211
f 1062/1128/1209 1187/1377/1209 1184/1374/1209
f 1063/1141/1210 1190/1383/1210 1187/1377/1210
f 1064/1139/1212 1026/1384/1212 1190/1383/1212
f 1190/1383/1212 1027/1385/1212 1189/1378/1212
f 1189/1378/1212 1028/1386/1212 1188/1376/1212
f 1061/1368/1212 1028/1386/1212 976/1387/1212
f 1185/1372/1210 1061/1368/1210 1060/1366/1210
f 1059/1365/1209 1185/1372/1209 1060/1366/1209
f 1066/1119/1213 1069/1123/1213 1068/1120/1213
f 1067/1122/1214 1070/1134/1214 1069/1123/1214
f 1068/1120/1215 1069/1123/1215 1072/1124/1215
f 1069/1123/1216 1070/1134/1216 1073/1126/1216
f 984/1127/1217 1065/1121/1217 1062/1128/1217
f 985/1130/1218 1066/1119/1218 1065/1121/1218
f 986/1131/1219 1067/1122/1219 1066/1119/1219
f 966/1132/1220 987/1388/1220 1067/1122/1220
f 1067/1122/1221 987/1388/1221 988/1133/1221
f 988/1133/1222 989/1389/1222 1073/1126/1222
f 1073/1126/1223 989/1389/1223 968/1135/1223
f 1072/1124/1224 1073/1126/1224 1023/1136/1224
f 1071/1125/1225 1072/1124/1225 1024/1137/1225
f 1064/1139/1226 1071/1125/1226 1025/1138/1226
f 1063/1141/1227 1068/1120/1227 1071/1125/1227
f 1065/1121/1228 1068/1120/1228 1063/1141/1228
f 1075/1142/1229 1078/1146/1229 1077/1143/1229
f 1076/1145/1230 1079/1152/1230 1078/1146/1230
f 1077/1143/1231 1078/1146/1231 1081/1147/1231
f 1078/1146/1232 1079/1152/1232 1082/1149/1232
f 1025/1138/1233 1074/1144/1233 1056/1150/1233
f 1024/1137/1234 1075/1142/1234 1074/1144/1234
f 1023/1136/1235 1076/1145/1235 1075/1142/1235
f 968/1135/1236 996/1390/1236 1076/1145/1236
f 1076/1145/1237 996/1390/1237 997/1151/1237
f 997/1151/1238 998/1391/1238 1082/1149/1238
f 1082/1149/1239 998/1391/1239 972/1153/1239
f 1081/1147/1240 1082/1149/1240 1029/1154/1240
f 1080/1148/1241 1081/1147/1241 1030/1155/1241
f 1058/1157/1242 1080/1148/1242 1031/1156/1242
f 1057/1159/1243 1077/1143/1243 1080/1148/1243
f 1074/1144/1244 1077/1143/1244 1057/1159/1244
f 1084/1160/1245 1087/1164/1245 1086/1161/1245
f 1085/1163/1246 1088/1170/1246 1087/1164/1246
f 1086/1161/1247 1087/1164/1247 1090/1165/1247
f 1087/1164/1248 1088/1170/1248 1091/1167/1248
f 1031/1156/1249 1083/1162/1249 1050/1168/1249
f 1030/1155/1250 1084/1160/1250 1083/1162/1250
f 1029/1154/1251 1085/1163/1251 1084/1160/1251
f 972/1153/1252 1005/1228/1252 1085/1163/1252
f 1085/1163/1253 1005/1228/1253 1006/1169/1253
f 1006/1169/1254 1007/1240/1254 1091/1167/1254
f 1091/1167/1255 1007/1240/1255 970/1171/1255
f 1090/1165/1256 1091/1167/1256 1035/1172/1256
f 1089/1166/1257 1090/1165/1257 1036/1173/1257
f 1052/1175/1258 1089/1166/1258 1037/1174/1258
f 1051/1177/1259 1086/1161/1259 1089/1166/1259
f 1083/1162/1260 1086/1161/1260 1051/1177/1260
f 1093/1178/1261 1096/1182/1261 1095/1179/1261
f 1094/1181/1262 1097/1188/1262 1096/1182/1262
f 1095/1179/1263 1096/1182/1263 1099/1183/1263
f 1096/1182/1264 1097/1188/1264 1100/1185/1264
f 1037/1174/1265 1092/1180/1265 1044/1186/1265
f 1036/1173/1266 1093/1178/1266 1092/1180/1266
f 1035/1172/1267 1094/1181/1267 1093/1178/1267
f 970/1171/1268 1014/1392/1268 1094/1181/1268
f 1094/1181/1269 1014/1392/1269 1015/1187/1269
f 1015/1187/1270 1016/1393/1270 1100/1185/1270
f 1100/1185/1271 1016/1393/1271 966/1189/1271
f 1099/1183/1272 1100/1185/1272 986/1190/1272
f 1098/1184/1273 1099/1183/1273 985/1191/1273
f 1046/1193/1274 1098/1184/1274 984/1192/1274
f 1045/1195/1275 1095/1179/1275 1098/1184/1275
f 1092/1180/1276 1095/1179/1276 1045/1195/1276
f 1101/1196/1277 1102/1200/1277 1105/1197/1277
f 1103/1199/1278 1106/1211/1278 1105/1197/1278
f 1105/1197/1279 1108/1203/1279 1107/1201/1279
f 1105/1197/1280 1106/1211/1280 1109/1202/1280
f 967/1204/1281 995/1206/1281 1101/1196/1281
f 995/1206/1282 994/1208/1282 1102/1200/1282
f 993/1207/1283 1103/1199/1283 1102/1200/1283
f 971/1209/1284 1004/1210/1284 1103/1199/1284
f 1004/1210/1285 1003/1295/1285 1106/1211/1285
f 1106/1211/1286 1003/1295/1286 1002/1212/1286
f 1109/1202/1287 1002/1212/1287 969/1213/1287
f 1108/1203/1288 1109/1202/1288 1013/1214/1288
f 1108/1203/1289 1012/1215/1289 1011/1216/1289
f 1107/1201/1290 1011/1216/1290 965/1217/1290
f 1104/1198/1291 1107/1201/1291 983/1218/1291
f 981/1205/1292 1101/1196/1292 1104/1198/1292
f 1110/1220/1293 1111/1224/1293 1114/1221/1293
f 1112/1223/1294 1115/1234/1294 1114/1221/1294
f 1114/1221/1295 1117/1227/1295 1116/1225/1295
f 1114/1221/1296 1115/1234/1296 1118/1226/1296
f 972/1153/1297 998/1229/1297 1110/1220/1297
f 998/1229/1298 997/1231/1298 1111/1224/1298
f 996/1230/1299 1112/1223/1299 1111/1224/1299
f 968/1232/1300 989/1233/1300 1112/1223/1300
f 989/1233/1301 988/1394/1301 1115/1234/1301
f 1115/1234/1302 988/1394/1302 987/1235/1302
f 1118/1226/1303 987/1235/1303 966/1236/1303
f 1117/1227/1304 1118/1226/1304 1016/1237/1304
f 1117/1227/1305 1015/1238/1305 1014/1239/1305
f 1116/1225/1306 1014/1239/1306 970/1171/1306
f 1113/1222/1307 1116/1225/1307 1007/1240/1307
f 1005/1228/1308 1110/1220/1308 1113/1222/1308
f 1119/1241/1309 1120/1244/1309 1123/1242/1309
f 1120/1244/1310 1121/1251/1310 1124/1245/1310
f 1123/1242/1311 1126/1247/1311 1125/1246/1311
f 1124/1245/1312 1127/1255/1312 1126/1247/1312
f 969/1213/1313 1010/1249/1313 1119/1241/1313
f 1010/1249/1314 1009/1250/1314 1120/1244/1314
f 1009/1250/1315 1008/1252/1315 1121/1251/1315
f 1008/1252/1316 980/1272/1316 1041/1253/1316
f 1121/1251/1317 1041/1253/1317 1042/1254/1317
f 1042/1254/1318 1043/1256/1318 1127/1255/1318
f 1043/1256/1319 973/1279/1319 1019/1257/1319
f 1127/1255/1320 1019/1257/1320 1018/1258/1320
f 1126/1247/1321 1018/1258/1321 1017/1259/1321
f 1125/1246/1322 1017/1259/1322 965/1260/1322
f 1012/1262/1323 1122/1243/1323 1125/1246/1323
f 1119/1241/1324 1122/1243/1324 1012/1262/1324
f 1129/1263/1149 1132/1267/1149 1131/1264/1149
f 1130/1266/1149 1133/1269/1149 1132/1267/1149
f 1132/1267/1150 1135/1270/1150 1134/1268/1150
f 1133/1269/1150 1136/1275/1150 1135/1270/1150
f 1040/1271/1151 1128/1265/1151 1041/1253/1151
f 1039/1273/1151 1129/1263/1151 1128/1265/1151
f 1039/1273/1151 1038/1274/1151 1130/1266/1151
f 979/1176/1151 1044/1186/1151 1130/1266/1151
f 1044/1186/1149 1045/1195/1149 1133/1269/1149
f 1045/1195/1150 1046/1193/1150 1136/1275/1150
f 1046/1193/1152 974/1194/1152 1022/1276/1152
f 1136/1275/1152 1022/1276/1152 1021/1277/1152
f 1135/1270/1152 1021/1277/1152 1020/1278/1152
f 1134/1268/1152 1020/1278/1152 973/1279/1152
f 1131/1264/1150 1134/1268/1150 1043/1256/1150
f 1128/1265/1149 1131/1264/1149 1042/1254/1149
f 1137/1280/1325 1138/1283/1325 1141/1281/1325
f 1138/1283/1326 1139/1289/1326 1142/1284/1326
f 1141/1281/1327 1144/1286/1327 1143/1285/1327
f 1142/1284/1328 1145/1293/1328 1144/1286/1328
f 971/1209/1329 1001/1287/1329 1137/1280/1329
f 1001/1287/1330 1000/1288/1330 1138/1283/1330
f 1000/1288/1331 999/1290/1331 1139/1289/1331
f 999/1290/1332 978/1395/1332 1047/1291/1332
f 1139/1289/1333 1047/1291/1333 1048/1292/1333
f 1048/1292/1334 1049/1294/1334 1145/1293/1334
f 1049/1294/1335 980/1272/1335 1008/1252/1335
f 1145/1293/1336 1008/1252/1336 1009/1250/1336
f 1144/1286/1337 1009/1250/1337 1010/1249/1337
f 1143/1285/1338 1010/1249/1338 969/1213/1338
f 1003/1295/1339 1140/1282/1339 1143/1285/1339
f 1137/1280/1340 1140/1282/1340 1003/1295/1340
f 1147/1296/1169 1150/1300/1169 1149/1297/1169
f 1148/1299/1169 1151/1302/1169 1150/1300/1169
f 1150/1300/1170 1153/1303/1170 1152/1301/1170
f 1151/1302/1170 1154/1309/1170 1153/1303/1170
f 1034/1304/1171 1146/1298/1171 1047/1291/1171
f 1034/1304/1171 1033/1307/1171 1147/1296/1171
f 1032/1306/1171 1148/1299/1171 1147/1296/1171
f 1032/1306/1171 977/1396/1171 1050/1168/1171
f 1148/1299/1169 1050/1168/1169 1051/1308/1169
f 1051/1308/1170 1052/1175/1170 1154/1309/1170
f 1052/1175/1172 979/1397/1172 1038/1310/1172
f 1153/1303/1172 1154/1309/1172 1038/1310/1172
f 1153/1303/1172 1039/1311/1172 1040/1271/1172
f 1152/1301/1172 1040/1271/1172 980/1312/1172
f 1149/1297/1170 1152/1301/1170 1049/1294/1170
f 1047/1291/1169 1146/1298/1169 1149/1297/1169
f 1155/1314/1341 1156/1317/1341 1159/1315/1341
f 1156/1317/1342 1157/1325/1342 1160/1318/1342
f 1159/1315/1343 1162/1320/1343 1161/1319/1343
f 1160/1318/1344 1163/1329/1344 1162/1320/1344
f 967/1321/1345 992/1323/1345 1155/1314/1345
f 992/1323/1346 991/1324/1346 1156/1317/1346
f 991/1324/1347 990/1326/1347 1157/1325/1347
f 990/1326/1348 976/1387/1348 1053/1327/1348
f 1157/1325/1349 1053/1327/1349 1054/1328/1349
f 1054/1328/1350 1055/1330/1350 1163/1329/1350
f 1055/1330/1351 978/1395/1351 999/1290/1351
f 1163/1329/1352 999/1290/1352 1000/1288/1352
f 1162/1320/1353 1000/1288/1353 1001/1287/1353
f 1161/1319/1354 1001/1287/1354 971/1209/1354
f 994/1332/1355 1158/1316/1355 1161/1319/1355
f 1155/1314/1356 1158/1316/1356 994/1332/1356
f 1165/1333/1189 1168/1337/1189 1167/1334/1189
f 1166/1336/1189 1169/1339/1189 1168/1337/1189
f 1168/1337/1190 1171/1340/1190 1170/1338/1190
f 1169/1339/1190 1172/1348/1190 1171/1340/1190
f 1028/1341/1191 1164/1335/1191 1053/1342/1191
f 1028/1341/1191 1027/1345/1191 1165/1333/1191
f 1026/1344/1191 1166/1336/1191 1165/1333/1191
f 1026/1344/1191 975/1140/1191 1056/1346/1191
f 1166/1336/1189 1056/1346/1189 1057/1347/1189
f 1057/1347/1190 1058/1349/1190 1172/1348/1190
f 1058/1349/1192 977/1396/1192 1032/1306/1192
f 1172/1348/1192 1032/1306/1192 1033/1307/1192
f 1170/1338/1192 1171/1340/1192 1033/1307/1192
f 1170/1338/1192 1034/1304/1192 978/1305/1192
f 1167/1334/1190 1170/1338/1190 1055/1350/1190
f 1053/1342/1189 1164/1335/1189 1167/1334/1189
f 1173/1352/1357 1174/1355/1357 1177/1353/1357
f 1174/1355/1358 1175/1363/1358 1178/1356/1358
f 1177/1353/1359 1180/1358/1359 1179/1357/1359
f 1178/1356/1360 1181/1367/1360 1180/1358/1360
f 965/1359/1361 1017/1361/1361 1173/1352/1361
f 1017/1361/1362 1018/1362/1362 1174/1355/1362
f 1018/1362/1363 1019/1364/1363 1175/1363/1363
f 1019/1364/1364 973/1379/1364 1059/1365/1364
f 1175/1363/1365 1059/1365/1365 1060/1366/1365
f 1060/1366/1366 1061/1368/1366 1181/1367/1366
f 1061/1368/1367 976/1387/1367 990/1326/1367
f 1181/1367/1368 990/1326/1368 991/1324/1368
f 1180/1358/1369 991/1324/1369 992/1323/1369
f 1179/1357/1370 992/1323/1370 967/1321/1370
f 982/1370/1371 1176/1354/1371 1179/1357/1371
f 1173/1352/1372 1176/1354/1372 982/1370/1372
f 1183/1371/1209 1186/1375/1209 1185/1372/1209
f 1184/1374/1209 1187/1377/1209 1186/1375/1209
f 1186/1375/1210 1189/1378/1210 1188/1376/1210
f 1187/1377/1210 1190/1383/1210 1189/1378/1210
f 973/1379/1211 1020/1381/1211 1182/1373/1211
f 1021/1380/1211 1183/1371/1211 1182/1373/1211
f 1022/1382/1211 1184/1374/1211 1183/1371/1211
f 974/1129/1211 1062/1128/1211 1184/1374/1211
f 1062/1128/1209 1063/1141/1209 1187/1377/1209
f 1063/1141/1210 1064/1139/1210 1190/1383/1210
f 1064/1139/1212 975/1140/1212 1026/1384/1212
f 1190/1383/1212 1026/1384/1212 1027/1385/1212
f 1189/1378/1212 1027/1385/1212 1028/1386/1212
f 1061/1368/1212 1188/1376/1212 1028/1386/1212
f 1185/1372/1210 1188/1376/1210 1061/1368/1210
f 1059/1365/1209 1182/1373/1209 1185/1372/1209
o Cube.002
v 0.321424 -0.411795 -0.083474
v 0.004825 -0.403375 -0.083474
v 0.321424 -0.411795 -0.200379
v 0.004825 -0.403375 -0.200379
v 0.324012 -0.314486 -0.083474
v 0.007413 -0.306066 -0.083474
v 0.324012 -0.314486 -0.200379
v 0.007413 -0.306066 -0.200379
v 0.226013 -0.425487 -0.063989
v 0.099373 -0.422120 -0.063989
v 0.099373 -0.422120 -0.219864
v 0.226013 -0.425487 -0.219864
v 0.102824 -0.292374 -0.219864
v 0.229463 -0.295742 -0.219864
v 0.102824 -0.292374 -0.063989
v 0.229463 -0.295742 -0.063989
v 0.325853 -0.419015 -0.176607
v 0.328078 -0.422642 -0.141927
v 0.325853 -0.419015 -0.107247
v 0.068224 -0.420944 -0.064407
v 0.040134 -0.417764 -0.067327
v 0.018004 -0.411175 -0.074530
v 0.000018 -0.410350 -0.107247
v -0.002396 -0.413853 -0.141927
v 0.000018 -0.410350 -0.176607
v 0.257181 -0.425969 -0.219447
v 0.285400 -0.424287 -0.216526
v 0.307849 -0.418884 -0.209323
v 0.328104 -0.334396 -0.208906
v 0.329656 -0.363325 -0.213189
v 0.326568 -0.392130 -0.208906
v 0.000733 -0.383465 -0.208906
v -0.000819 -0.354536 -0.213189
v 0.002268 -0.325731 -0.208906
v 0.260613 -0.296918 -0.219447
v 0.288703 -0.300097 -0.216526
v 0.310833 -0.306686 -0.209323
v 0.328819 -0.307512 -0.107247
v 0.331233 -0.304008 -0.141927
v 0.328819 -0.307512 -0.176607
v 0.002983 -0.298846 -0.176607
v 0.000759 -0.295220 -0.141927
v 0.002983 -0.298846 -0.107247
v 0.260613 -0.296918 -0.064407
v 0.288703 -0.300097 -0.067327
v 0.310833 -0.306686 -0.074530
v 0.326568 -0.392130 -0.074947
v 0.329656 -0.363325 -0.070665
v 0.328104 -0.334396 -0.074947
v 0.002268 -0.325731 -0.074947
v -0.000819 -0.354536 -0.070665
v 0.000733 -0.383465 -0.074947
v 0.307849 -0.418884 -0.074530
v 0.285400 -0.424287 -0.067327
v 0.257181 -0.425969 -0.064407
v 0.194353 -0.424645 -0.063989
v 0.162693 -0.423803 -0.063989
v 0.131033 -0.422962 -0.063989
v 0.018004 -0.411175 -0.209323
v 0.040134 -0.417764 -0.216526
v 0.068224 -0.420944 -0.219447
v 0.131033 -0.422962 -0.219864
v 0.162693 -0.423803 -0.219864
v 0.194353 -0.424645 -0.219864
v 0.020988 -0.298978 -0.209323
v 0.043436 -0.293575 -0.216526
v 0.071655 -0.291893 -0.219447
v 0.134484 -0.293216 -0.219864
v 0.166144 -0.294058 -0.219864
v 0.197804 -0.294900 -0.219864
v 0.020988 -0.298978 -0.074530
v 0.043436 -0.293575 -0.067327
v 0.071655 -0.291893 -0.064407
v 0.134484 -0.293216 -0.063989
v 0.166144 -0.294058 -0.063989
v 0.197804 -0.294900 -0.063989
v 0.228682 -0.325137 -0.042070
v 0.227738 -0.360615 -0.034763
v 0.226795 -0.396092 -0.042070
v 0.102042 -0.321769 -0.042070
v 0.101099 -0.357247 -0.034763
v 0.100155 -0.392724 -0.042070
v 0.229949 -0.277496 -0.184548
v 0.230110 -0.271414 -0.141927
v 0.229949 -0.277496 -0.099305
v 0.103309 -0.274128 -0.184548
v 0.103471 -0.268047 -0.141927
v 0.103309 -0.274128 -0.099305
v 0.226795 -0.396092 -0.241784
v 0.227738 -0.360615 -0.249090
v 0.228682 -0.325137 -0.241784
v 0.100155 -0.392724 -0.241784
v 0.101099 -0.357247 -0.249090
v 0.102042 -0.321769 -0.241784
v 0.225528 -0.443733 -0.099305
v 0.225366 -0.449815 -0.141927
v 0.225528 -0.443733 -0.184548
v 0.098888 -0.440365 -0.099305
v 0.098726 -0.446447 -0.141927
v 0.098888 -0.440365 -0.184548
v 0.067698 -0.439095 -0.099534
v 0.039326 -0.435259 -0.101143
v 0.016320 -0.426420 -0.104736
v 0.067494 -0.445147 -0.141927
v 0.038829 -0.441097 -0.141927
v 0.015064 -0.431561 -0.141927
v 0.067698 -0.439095 -0.184319
v 0.039326 -0.435259 -0.182711
v 0.016320 -0.426420 -0.179117
v 0.068958 -0.391703 -0.241255
v 0.040540 -0.389608 -0.237555
v 0.017395 -0.385999 -0.227678
v 0.069854 -0.356416 -0.248526
v 0.041101 -0.355651 -0.244580
v 0.017100 -0.355013 -0.233891
v 0.070835 -0.321131 -0.241255
v 0.042346 -0.321713 -0.237555
v 0.019041 -0.324086 -0.227678
v 0.072095 -0.273739 -0.184319
v 0.043560 -0.276062 -0.182711
v 0.020116 -0.283665 -0.179117
v 0.072214 -0.267685 -0.141927
v 0.043373 -0.270205 -0.141927
v 0.019135 -0.278464 -0.141927
v 0.072095 -0.273739 -0.099534
v 0.043560 -0.276062 -0.101143
v 0.020116 -0.283665 -0.104736
v 0.070835 -0.321131 -0.042598
v 0.042346 -0.321713 -0.046298
v 0.019041 -0.324086 -0.056175
v 0.069854 -0.356416 -0.035327
v 0.041101 -0.355651 -0.039273
v 0.017100 -0.355013 -0.049963
v 0.068958 -0.391703 -0.042598
v 0.040540 -0.389608 -0.046298
v 0.017395 -0.385999 -0.056175
v 0.336399 -0.394291 -0.178887
v 0.340563 -0.363615 -0.180873
v 0.338036 -0.332761 -0.178887
v 0.339701 -0.396033 -0.141927
v 0.344215 -0.363712 -0.141927
v 0.341425 -0.331197 -0.141927
v 0.336399 -0.394291 -0.104966
v 0.340563 -0.363615 -0.102981
v 0.338036 -0.332761 -0.104966
v -0.007563 -0.323570 -0.178887
v -0.011726 -0.354246 -0.180873
v -0.009199 -0.385100 -0.178887
v -0.010864 -0.321829 -0.141927
v -0.015379 -0.354149 -0.141927
v -0.012589 -0.386664 -0.141927
v -0.007563 -0.323570 -0.104966
v -0.011726 -0.354246 -0.102981
v -0.009199 -0.385100 -0.104966
v 0.311442 -0.331862 -0.056175
v 0.288297 -0.328254 -0.046298
v 0.259878 -0.326158 -0.042598
v 0.311737 -0.362848 -0.049963
v 0.287736 -0.362210 -0.039273
v 0.258983 -0.361445 -0.035327
v 0.309796 -0.393775 -0.056175
v 0.286491 -0.396149 -0.046298
v 0.258002 -0.396730 -0.042598
v 0.197022 -0.324295 -0.042070
v 0.165362 -0.323453 -0.042070
v 0.133702 -0.322611 -0.042070
v 0.196078 -0.359773 -0.034763
v 0.164418 -0.358931 -0.034763
v 0.132759 -0.358089 -0.034763
v 0.195135 -0.395250 -0.042070
v 0.163475 -0.394408 -0.042070
v 0.131815 -0.393566 -0.042070
v 0.312517 -0.291441 -0.179117
v 0.289511 -0.282602 -0.182711
v 0.261139 -0.278766 -0.184319
v 0.313773 -0.286300 -0.141927
v 0.290008 -0.276764 -0.141927
v 0.261343 -0.272715 -0.141927
v 0.312517 -0.291441 -0.104736
v 0.289511 -0.282602 -0.101143
v 0.261139 -0.278766 -0.099534
v 0.198289 -0.276654 -0.184548
v 0.166629 -0.275812 -0.184548
v 0.134969 -0.274970 -0.184548
v 0.198451 -0.270572 -0.141927
v 0.166791 -0.269731 -0.141927
v 0.135131 -0.268889 -0.141927
v 0.198289 -0.276654 -0.099305
v 0.166629 -0.275812 -0.099305
v 0.134969 -0.274970 -0.099305
v 0.309796 -0.393775 -0.227678
v 0.286491 -0.396149 -0.237555
v 0.258002 -0.396730 -0.241255
v 0.311737 -0.362848 -0.233891
v 0.287736 -0.362210 -0.244580
v 0.258983 -0.361445 -0.248526
v 0.311442 -0.331862 -0.227678
v 0.288297 -0.328254 -0.237555
v 0.259878 -0.326158 -0.241255
v 0.195135 -0.395250 -0.241784
v 0.163475 -0.394408 -0.241784
v 0.131815 -0.393566 -0.241784
v 0.196078 -0.359773 -0.249090
v 0.164418 -0.358931 -0.249090
v 0.132759 -0.358089 -0.249090
v 0.197022 -0.324295 -0.241784
v 0.165362 -0.323453 -0.241784
v 0.133702 -0.322611 -0.241784
v 0.308721 -0.434196 -0.104736
v 0.285277 -0.441800 -0.101143
v 0.256741 -0.444123 -0.099534
v 0.309701 -0.439397 -0.141927
v 0.285463 -0.447656 -0.141927
v 0.256623 -0.450176 -0.141927
v 0.308721 -0.434196 -0.179117
v 0.285277 -0.441800 -0.182711
v 0.256741 -0.444123 -0.184319
v 0.193868 -0.442891 -0.099305
v 0.162208 -0.442049 -0.099305
v 0.130548 -0.441207 -0.099305
v 0.193706 -0.448973 -0.141927
v 0.162046 -0.448131 -0.141927
v 0.130386 -0.447289 -0.141927
v 0.193868 -0.442891 -0.184548
v 0.162208 -0.442049 -0.184548
v 0.130548 -0.441207 -0.184548
vn -0.1405 -0.9801 0.1406
vn -0.3737 -0.9175 0.1359
vn -0.1405 -0.9801 -0.1406
vn -0.3737 -0.9175 -0.1359
vn -0.0397 -0.8875 0.4591
vn -0.1468 -0.8779 0.4558
vn -0.3836 -0.8165 0.4316
vn -0.6466 -0.6659 0.3721
vn -0.7069 -0.6970 0.1202
vn -0.7069 -0.6970 -0.1202
vn -0.6466 -0.6659 -0.3721
vn -0.3836 -0.8165 -0.4316
vn -0.1468 -0.8779 -0.4558
vn -0.0397 -0.8875 -0.4591
vn -0.0413 -0.9891 -0.1413
vn -0.0413 -0.9891 0.1413
vn -0.1408 -0.1964 -0.9704
vn -0.4096 -0.1783 -0.8947
vn -0.1302 0.2036 -0.9704
vn -0.3995 0.1998 -0.8947
vn -0.0333 -0.5969 -0.8016
vn -0.1492 -0.5888 -0.7944
vn -0.4018 -0.5442 -0.7365
vn -0.6683 -0.4488 -0.5933
vn -0.7506 -0.1368 -0.6465
vn -0.7422 0.1765 -0.6465
vn -0.6435 0.4836 -0.5933
vn -0.3723 0.5648 -0.7365
vn -0.1177 0.5958 -0.7944
vn -0.0015 0.5978 -0.8016
vn -0.0125 0.2020 -0.9793
vn -0.0232 -0.2011 -0.9793
vn -0.0882 0.9861 -0.1406
vn -0.3244 0.9361 -0.1359
vn -0.0882 0.9861 0.1406
vn -0.3244 0.9361 0.1359
vn 0.0076 0.8884 -0.4591
vn -0.0999 0.8844 -0.4558
vn -0.3396 0.8357 -0.4316
vn -0.6103 0.6993 -0.3721
vn -0.6689 0.7336 -0.1202
vn -0.6689 0.7336 0.1202
vn -0.6103 0.6993 0.3721
vn -0.3396 0.8357 0.4316
vn -0.0999 0.8844 0.4558
vn 0.0076 0.8884 0.4591
vn 0.0113 0.9899 0.1413
vn 0.0113 0.9899 -0.1413
vn -0.1302 0.2036 0.9704
vn -0.3995 0.1998 0.8947
vn -0.1408 -0.1964 0.9704
vn -0.4096 -0.1783 0.8947
vn -0.0015 0.5978 0.8016
vn -0.1177 0.5958 0.7944
vn -0.3723 0.5648 0.7365
vn -0.6435 0.4836 0.5933
vn -0.7422 0.1765 0.6465
vn -0.7506 -0.1368 0.6465
vn -0.6683 -0.4488 0.5933
vn -0.4018 -0.5442 0.7365
vn -0.1492 -0.5888 0.7944
vn -0.0333 -0.5969 0.8016
vn -0.0232 -0.2011 0.9793
vn -0.0125 0.2020 0.9793
vn 0.9859 -0.1377 -0.0946
vn 0.9919 0.0872 -0.0928
vn 0.9859 -0.1377 0.0946
vn 0.9919 0.0872 0.0928
vn 0.8724 -0.3983 -0.2835
vn 0.9367 -0.1477 -0.3174
vn 0.9432 0.0976 -0.3175
vn 0.8948 0.3181 -0.3134
vn 0.9317 0.3492 -0.1002
vn 0.9317 0.3492 0.1002
vn 0.8948 0.3181 0.3134
vn 0.9432 0.0976 0.3175
vn 0.9367 -0.1477 0.3174
vn 0.8724 -0.3983 0.2835
vn 0.9118 -0.3983 0.1001
vn 0.9118 -0.3983 -0.1001
vn -0.9859 0.1377 -0.0946
vn -0.9919 -0.0872 -0.0928
vn -0.9859 0.1377 0.0946
vn -0.9919 -0.0872 0.0928
vn -0.8724 0.3983 -0.2835
vn -0.9367 0.1477 -0.3174
vn -0.9432 -0.0976 -0.3175
vn -0.8948 -0.3181 -0.3134
vn -0.9317 -0.3492 -0.1002
vn -0.9317 -0.3492 0.1002
vn -0.8948 -0.3181 0.3134
vn -0.9432 -0.0976 0.3175
vn -0.9367 0.1477 0.3174
vn -0.8724 0.3983 0.2835
vn -0.9118 0.3983 0.1001
vn -0.9118 0.3983 -0.1001
vn 0.4040 0.1835 0.8962
vn 0.1385 0.1984 0.9703
vn 0.3937 -0.2047 0.8962
vn 0.1277 -0.2055 0.9703
vn 0.7130 0.3947 0.5795
vn 0.3993 0.5464 0.7362
vn 0.1469 0.5905 0.7935
vn 0.0331 0.5970 0.8016
vn 0.0230 0.2012 0.9793
vn 0.0123 -0.2021 0.9793
vn 0.0013 -0.5979 0.8016
vn 0.1153 -0.5975 0.7935
vn 0.3697 -0.5668 0.7362
vn 0.6910 -0.4320 0.5795
vn 0.7417 -0.1757 0.6473
vn 0.7500 0.1361 0.6473
vn 0.0054 0.2016 0.9795
vn -0.0054 -0.2016 0.9795
vn 0.0159 0.5974 0.8018
vn -0.0159 -0.5974 0.8018
vn 0.3688 0.9190 -0.1395
vn 0.1385 0.9801 -0.1420
vn 0.3688 0.9190 0.1395
vn 0.1385 0.9801 0.1420
vn 0.6882 0.6467 -0.3289
vn 0.3814 0.8165 -0.4334
vn 0.1446 0.8774 -0.4574
vn 0.0395 0.8875 -0.4592
vn 0.0412 0.9891 -0.1414
vn 0.0412 0.9891 0.1414
vn 0.0395 0.8875 0.4592
vn 0.1446 0.8774 0.4574
vn 0.3814 0.8165 0.4334
vn 0.6882 0.6467 0.3289
vn 0.7063 0.6977 0.1197
vn 0.7063 0.6977 -0.1197
vn 0.0263 0.9896 -0.1413
vn 0.0263 0.9896 0.1413
vn 0.0236 0.8881 -0.4591
vn 0.0236 0.8881 0.4591
vn 0.3937 -0.2047 -0.8962
vn 0.1277 -0.2055 -0.9703
vn 0.4040 0.1835 -0.8962
vn 0.1385 0.1984 -0.9703
vn 0.6910 -0.4320 -0.5795
vn 0.3697 -0.5668 -0.7362
vn 0.1153 -0.5975 -0.7935
vn 0.0013 -0.5979 -0.8016
vn 0.0123 -0.2021 -0.9793
vn 0.0230 0.2012 -0.9793
vn 0.0331 0.5970 -0.8016
vn 0.1469 0.5905 -0.7935
vn 0.3993 0.5464 -0.7362
vn 0.7130 0.3947 -0.5795
vn 0.7500 0.1361 -0.6473
vn 0.7417 -0.1757 -0.6473
vn -0.0054 -0.2016 -0.9795
vn 0.0054 0.2016 -0.9795
vn -0.0159 -0.5974 -0.8018
vn 0.0159 0.5974 -0.8018
vn 0.3194 -0.9373 0.1395
vn 0.0862 -0.9861 0.1420
vn 0.3194 -0.9373 -0.1395
vn 0.0862 -0.9861 -0.1420
vn 0.6528 -0.6824 0.3289
vn 0.3375 -0.8357 0.4334
vn 0.0977 -0.8839 0.4574
vn -0.0077 -0.8883 0.4592
vn -0.0115 -0.9899 0.1414
vn -0.0115 -0.9899 -0.1414
vn -0.0077 -0.8883 -0.4592
vn 0.0977 -0.8839 -0.4574
vn 0.3375 -0.8357 -0.4334
vn 0.6528 -0.6824 -0.3289
vn 0.6682 -0.7343 -0.1197
vn 0.6682 -0.7343 0.1197
vn -0.0263 -0.9896 0.1413
vn -0.0263 -0.9896 -0.1413
vn -0.0236 -0.8881 0.4591
vn -0.0236 -0.8881 -0.4591
vn -0.1385 -0.9801 0.1420
vn -0.3688 -0.9190 0.1395
vn -0.1385 -0.9801 -0.1420
vn -0.3688 -0.9190 -0.1395
vn -0.0395 -0.8875 0.4592
vn -0.1446 -0.8774 0.4574
vn -0.3814 -0.8165 0.4334
vn -0.6882 -0.6467 0.3289
vn -0.7063 -0.6977 0.1197
vn -0.7063 -0.6977 -0.1197
vn -0.6882 -0.6467 -0.3289
vn -0.3814 -0.8165 -0.4334
vn -0.1446 -0.8774 -0.4574
vn -0.0395 -0.8875 -0.4592
vn -0.0412 -0.9891 -0.1414
vn -0.0412 -0.9891 0.1414
vn -0.1385 -0.1984 -0.9703
vn -0.4040 -0.1835 -0.8962
vn -0.1277 0.2055 -0.9703
vn -0.3937 0.2047 -0.8962
vn -0.0331 -0.5970 -0.8016
vn -0.1469 -0.5905 -0.7935
vn -0.3993 -0.5464 -0.7362
vn -0.7130 -0.3947 -0.5795
vn -0.7500 -0.1361 -0.6473
vn -0.7417 0.1757 -0.6473
vn -0.6910 0.4320 -0.5795
vn -0.3697 0.5668 -0.7362
vn -0.1153 0.5975 -0.7935
vn -0.0013 0.5979 -0.8016
vn -0.0123 0.2021 -0.9793
vn -0.0230 -0.2012 -0.9793
vn -0.0862 0.9861 -0.1420
vn -0.3194 0.9373 -0.1395
vn -0.0862 0.9861 0.1420
vn -0.3194 0.9373 0.1395
vn 0.0077 0.8883 -0.4592
vn -0.0977 0.8839 -0.4574
vn -0.3375 0.8357 -0.4334
vn -0.6528 0.6824 -0.3289
vn -0.6682 0.7343 -0.1197
vn -0.6682 0.7343 0.1197
vn -0.6528 0.6824 0.3289
vn -0.3375 0.8357 0.4334
vn -0.0977 0.8839 0.4574
vn 0.0077 0.8883 0.4592
vn 0.0115 0.9899 0.1414
vn 0.0115 0.9899 -0.1414
vn -0.1277 0.2055 0.9703
vn -0.3937 0.2047 0.8962
vn -0.1385 -0.1984 0.9703
vn -0.4040 -0.1835 0.8962
vn -0.0013 0.5979 0.8016
vn -0.1153 0.5975 0.7935
vn -0.3697 0.5668 0.7362
vn -0.6910 0.4320 0.5795
vn -0.7417 0.1757 0.6473
vn -0.7500 -0.1361 0.6473
vn -0.7130 -0.3947 0.5795
vn -0.3993 -0.5464 0.7362
vn -0.1469 -0.5905 0.7935
vn -0.0331 -0.5970 0.8016
vn -0.0230 -0.2012 0.9793
vn -0.0123 0.2021 0.9793
vn 0.9858 -0.1398 -0.0928
vn 0.9919 0.0851 -0.0946
vn 0.9858 -0.1398 0.0928
vn 0.9919 0.0851 0.0946
vn 0.8766 -0.3652 -0.3134
vn 0.9367 -0.1476 -0.3175
vn 0.9432 0.0977 -0.3174
vn 0.8923 0.3513 -0.2835
vn 0.9317 0.3492 -0.1001
vn 0.9317 0.3492 0.1001
vn 0.8923 0.3513 0.2835
vn 0.9432 0.0977 0.3174
vn 0.9367 -0.1476 0.3175
vn 0.8766 -0.3652 0.3134
vn 0.9118 -0.3982 0.1002
vn 0.9118 -0.3982 -0.1002
vn -0.9858 0.1398 -0.0928
vn -0.9919 -0.0851 -0.0946
vn -0.9858 0.1398 0.0928
vn -0.9919 -0.0851 0.0946
vn -0.8766 0.3652 -0.3134
vn -0.9367 0.1476 -0.3175
vn -0.9432 -0.0977 -0.3174
vn -0.8923 -0.3513 -0.2835
vn -0.9317 -0.3492 -0.1001
vn -0.9317 -0.3492 0.1001
vn -0.8923 -0.3513 0.2835
vn -0.9432 -0.0977 0.3174
vn -0.9367 0.1476 0.3175
vn -0.8766 0.3652 0.3134
vn -0.9118 0.3982 0.1002
vn -0.9118 0.3982 -0.1002
vn 0.4096 0.1783 0.8947
vn 0.1408 0.1964 0.9704
vn 0.3995 -0.1998 0.8947
vn 0.1302 -0.2036 0.9704
vn 0.6683 0.4488 0.5933
vn 0.4018 0.5442 0.7365
vn 0.1492 0.5888 0.7944
vn 0.0333 0.5969 0.8016
vn 0.0232 0.2011 0.9793
vn 0.0125 -0.2020 0.9793
vn 0.0015 -0.5978 0.8016
vn 0.1177 -0.5958 0.7944
vn 0.3723 -0.5648 0.7365
vn 0.6435 -0.4836 0.5933
vn 0.7422 -0.1765 0.6465
vn 0.7506 0.1368 0.6465
vn 0.3737 0.9175 -0.1359
vn 0.1405 0.9801 -0.1406
vn 0.3737 0.9175 0.1359
vn 0.1405 0.9801 0.1406
vn 0.6466 0.6659 -0.3721
vn 0.3836 0.8165 -0.4316
vn 0.1468 0.8779 -0.4558
vn 0.0397 0.8875 -0.4591
vn 0.0413 0.9891 -0.1413
vn 0.0413 0.9891 0.1413
vn 0.0397 0.8875 0.4591
vn 0.1468 0.8779 0.4558
vn 0.3836 0.8165 0.4316
vn 0.6466 0.6659 0.3721
vn 0.7069 0.6970 0.1202
vn 0.7069 0.6970 -0.1202
vn 0.3995 -0.1998 -0.8947
vn 0.1302 -0.2036 -0.9704
vn 0.4096 0.1783 -0.8947
vn 0.1408 0.1964 -0.9704
vn 0.6435 -0.4836 -0.5933
vn 0.3723 -0.5648 -0.7365
vn 0.1177 -0.5959 -0.7944
vn 0.0015 -0.5978 -0.8016
vn 0.0125 -0.2020 -0.9793
vn 0.0232 0.2011 -0.9793
vn 0.0333 0.5969 -0.8016
vn 0.1492 0.5887 -0.7944
vn 0.4018 0.5442 -0.7365
vn 0.6683 0.4488 -0.5933
vn 0.7506 0.1368 -0.6465
vn 0.7422 -0.1765 -0.6465
vn 0.3244 -0.9361 0.1359
vn 0.0882 -0.9861 0.1406
vn 0.3244 -0.9361 -0.1359
vn 0.0882 -0.9861 -0.1406
vn 0.6103 -0.6993 0.3721
vn 0.3396 -0.8357 0.4316
vn 0.0999 -0.8844 0.4558
vn -0.0076 -0.8884 0.4591
vn -0.0113 -0.9899 0.1413
vn -0.0113 -0.9899 -0.1413
vn -0.0076 -0.8884 -0.4591
vn 0.0999 -0.8844 -0.4558
vn 0.3396 -0.8357 -0.4316
vn 0.6103 -0.6993 -0.3721
vn 0.6689 -0.7336 -0.1202
vn 0.6689 -0.7336 0.1202
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1292/1398/1373 1294/1399/1373 1291/1400/1373
f 1293/1401/1374 1295/1402/1374 1292/1398/1374
f 1294/1399/1375 1298/1403/1375 1297/1404/1375
f 1295/1402/1376 1299/1405/1376 1298/1403/1376
f 1210/1406/1377 1288/1407/1377 1200/1408/1377
f 1211/1409/1378 1291/1400/1378 1210/1406/1378
f 1212/1410/1379 1292/1398/1379 1211/1409/1379
f 1192/1411/1380 1293/1401/1380 1212/1410/1380
f 1293/1401/1381 1214/1412/1381 1296/1413/1381
f 1214/1412/1382 1299/1405/1382 1296/1413/1382
f 1299/1405/1383 1194/1414/1383 1249/1415/1383
f 1298/1403/1384 1249/1415/1384 1250/1416/1384
f 1297/1404/1385 1250/1416/1385 1251/1417/1385
f 1290/1418/1386 1251/1417/1386 1201/1419/1386
f 1289/1420/1387 1297/1404/1387 1290/1418/1387
f 1291/1400/1388 1289/1420/1388 1288/1407/1388
f 1301/1421/1389 1303/1422/1389 1300/1423/1389
f 1302/1424/1390 1304/1425/1390 1301/1421/1390
f 1303/1422/1391 1307/1426/1391 1306/1427/1391
f 1304/1425/1392 1308/1428/1392 1307/1426/1392
f 1251/1417/1393 1282/1429/1393 1201/1419/1393
f 1250/1416/1394 1300/1423/1394 1251/1417/1394
f 1249/1415/1395 1301/1421/1395 1250/1416/1395
f 1194/1414/1396 1302/1424/1396 1249/1415/1396
f 1302/1424/1397 1223/1430/1397 1305/1431/1397
f 1223/1430/1398 1308/1428/1398 1305/1431/1398
f 1308/1428/1399 1198/1432/1399 1255/1433/1399
f 1307/1426/1400 1255/1433/1400 1256/1434/1400
f 1306/1427/1401 1256/1434/1401 1257/1435/1401
f 1284/1436/1402 1257/1435/1402 1203/1437/1402
f 1283/1438/1403 1306/1427/1403 1284/1436/1403
f 1300/1423/1404 1283/1438/1404 1282/1429/1404
f 1310/1439/1405 1312/1440/1405 1309/1441/1405
f 1311/1442/1406 1313/1443/1406 1310/1439/1406
f 1312/1440/1407 1316/1444/1407 1315/1445/1407
f 1313/1443/1408 1317/1446/1408 1316/1444/1408
f 1257/1435/1409 1276/1447/1409 1203/1437/1409
f 1256/1434/1410 1309/1441/1410 1257/1435/1410
f 1255/1433/1411 1310/1439/1411 1256/1434/1411
f 1198/1432/1412 1311/1442/1412 1255/1433/1412
f 1311/1442/1413 1232/1448/1413 1314/1449/1413
f 1232/1448/1414 1317/1446/1414 1314/1449/1414
f 1317/1446/1415 1196/1450/1415 1261/1451/1415
f 1316/1444/1416 1261/1451/1416 1262/1452/1416
f 1315/1445/1417 1262/1452/1417 1263/1453/1417
f 1278/1454/1418 1263/1453/1418 1205/1455/1418
f 1277/1456/1419 1315/1445/1419 1278/1454/1419
f 1309/1441/1420 1277/1456/1420 1276/1447/1420
f 1319/1457/1421 1321/1458/1421 1318/1459/1421
f 1320/1460/1422 1322/1461/1422 1319/1457/1422
f 1321/1458/1423 1325/1462/1423 1324/1463/1423
f 1322/1461/1424 1326/1464/1424 1325/1462/1424
f 1263/1453/1425 1270/1465/1425 1205/1455/1425
f 1262/1452/1426 1318/1459/1426 1263/1453/1426
f 1261/1451/1427 1319/1457/1427 1262/1452/1427
f 1196/1450/1428 1320/1460/1428 1261/1451/1428
f 1320/1460/1429 1241/1466/1429 1323/1467/1429
f 1241/1466/1430 1326/1464/1430 1323/1467/1430
f 1326/1464/1431 1192/1468/1431 1212/1469/1431
f 1325/1462/1432 1212/1469/1432 1211/1470/1432
f 1324/1463/1433 1211/1470/1433 1210/1471/1433
f 1272/1472/1434 1210/1471/1434 1200/1473/1434
f 1271/1474/1435 1324/1463/1435 1272/1472/1435
f 1318/1459/1436 1271/1474/1436 1270/1465/1436
f 1327/1475/1437 1331/1476/1437 1330/1477/1437
f 1329/1478/1438 1331/1476/1438 1328/1479/1438
f 1331/1476/1439 1333/1480/1439 1330/1477/1439
f 1331/1476/1440 1335/1481/1440 1334/1482/1440
f 1193/1483/1441 1327/1475/1441 1207/1484/1441
f 1221/1485/1442 1328/1479/1442 1327/1475/1442
f 1219/1486/1443 1328/1479/1443 1220/1487/1443
f 1197/1488/1444 1329/1478/1444 1219/1486/1444
f 1230/1489/1445 1332/1490/1445 1329/1478/1445
f 1332/1490/1446 1228/1491/1446 1335/1481/1446
f 1335/1481/1447 1195/1492/1447 1239/1493/1447
f 1334/1482/1448 1239/1493/1448 1238/1494/1448
f 1334/1482/1449 1237/1495/1449 1333/1480/1449
f 1333/1480/1450 1191/1496/1450 1209/1497/1450
f 1330/1477/1451 1209/1497/1451 1208/1498/1451
f 1207/1484/1452 1330/1477/1452 1208/1498/1452
f 1336/1499/1453 1340/1500/1453 1339/1501/1453
f 1338/1502/1454 1340/1500/1454 1337/1503/1454
f 1340/1500/1455 1342/1504/1455 1339/1501/1455
f 1340/1500/1456 1344/1505/1456 1343/1506/1456
f 1198/1432/1457 1336/1499/1457 1231/1507/1457
f 1224/1508/1458 1337/1503/1458 1336/1499/1458
f 1222/1509/1459 1337/1503/1459 1223/1510/1459
f 1194/1511/1460 1338/1502/1460 1222/1509/1460
f 1215/1512/1461 1341/1513/1461 1338/1502/1461
f 1341/1513/1462 1213/1514/1462 1344/1505/1462
f 1344/1505/1463 1192/1515/1463 1242/1516/1463
f 1343/1506/1464 1242/1516/1464 1241/1517/1464
f 1343/1506/1465 1240/1518/1465 1342/1504/1465
f 1342/1504/1466 1196/1450/1466 1233/1519/1466
f 1339/1501/1467 1233/1519/1467 1232/1448/1467
f 1231/1507/1468 1339/1501/1468 1232/1448/1468
f 1345/1520/1469 1349/1521/1469 1348/1522/1469
f 1346/1523/1470 1350/1524/1470 1349/1521/1470
f 1349/1521/1471 1351/1525/1471 1348/1522/1471
f 1350/1524/1472 1352/1526/1472 1349/1521/1472
f 1195/1492/1473 1345/1520/1473 1239/1527/1473
f 1236/1528/1474 1346/1523/1474 1345/1520/1474
f 1235/1529/1475 1347/1530/1475 1346/1523/1475
f 1234/1531/1476 1267/1532/1476 1347/1530/1476
f 1347/1530/1477 1268/1533/1477 1350/1524/1477
f 1268/1533/1478 1353/1534/1478 1350/1524/1478
f 1269/1535/1479 1245/1536/1479 1353/1534/1479
f 1353/1534/1480 1244/1537/1480 1352/1526/1480
f 1352/1526/1481 1243/1538/1481 1351/1525/1481
f 1351/1525/1482 1191/1539/1482 1237/1540/1482
f 1238/1541/1483 1351/1525/1483 1237/1540/1483
f 1345/1520/1484 1238/1541/1484 1239/1527/1484
f 1355/1542/1485 1357/1543/1485 1354/1544/1485
f 1356/1545/1485 1358/1546/1485 1355/1542/1485
f 1358/1546/1486 1360/1547/1486 1357/1543/1486
f 1359/1548/1486 1361/1549/1486 1358/1546/1486
f 1266/1550/1487 1267/1532/1487 1206/1551/1487
f 1265/1552/1487 1354/1544/1487 1266/1550/1487
f 1265/1552/1487 1356/1545/1487 1355/1542/1487
f 1205/1455/1487 1356/1545/1487 1264/1553/1487
f 1270/1465/1485 1359/1548/1485 1356/1545/1485
f 1271/1474/1486 1362/1554/1486 1359/1548/1486
f 1272/1472/1488 1248/1555/1488 1362/1554/1488
f 1362/1554/1488 1247/1556/1488 1361/1549/1488
f 1361/1549/1488 1246/1557/1488 1360/1547/1488
f 1360/1547/1488 1199/1558/1488 1269/1535/1488
f 1357/1543/1486 1269/1535/1486 1268/1533/1486
f 1354/1544/1485 1268/1533/1485 1267/1532/1485
f 1363/1559/1489 1367/1560/1489 1366/1561/1489
f 1364/1562/1490 1368/1563/1490 1367/1560/1490
f 1367/1560/1491 1369/1564/1491 1366/1561/1491
f 1368/1563/1492 1370/1565/1492 1367/1560/1492
f 1197/1488/1493 1363/1559/1493 1230/1489/1493
f 1227/1566/1494 1364/1562/1494 1363/1559/1494
f 1226/1567/1495 1365/1568/1495 1364/1562/1495
f 1225/1569/1496 1273/1570/1496 1365/1568/1496
f 1365/1568/1497 1274/1571/1497 1368/1563/1497
f 1274/1571/1498 1371/1572/1498 1368/1563/1498
f 1275/1573/1499 1234/1531/1499 1371/1572/1499
f 1371/1572/1500 1235/1529/1500 1370/1565/1500
f 1370/1565/1501 1236/1528/1501 1369/1564/1501
f 1369/1564/1502 1195/1492/1502 1228/1491/1502
f 1229/1574/1503 1369/1564/1503 1228/1491/1503
f 1363/1559/1504 1229/1574/1504 1230/1489/1504
f 1373/1575/1505 1375/1576/1505 1372/1577/1505
f 1374/1578/1505 1376/1579/1505 1373/1575/1505
f 1376/1579/1506 1378/1580/1506 1375/1576/1506
f 1377/1581/1506 1379/1582/1506 1376/1579/1506
f 1260/1583/1507 1273/1570/1507 1204/1584/1507
f 1260/1583/1507 1373/1575/1507 1372/1577/1507
f 1258/1585/1507 1373/1575/1507 1259/1586/1507
f 1258/1585/1507 1276/1447/1507 1374/1578/1507
f 1374/1578/1505 1277/1587/1505 1377/1581/1505
f 1277/1587/1506 1380/1588/1506 1377/1581/1506
f 1278/1454/1508 1264/1589/1508 1380/1588/1508
f 1379/1582/1508 1264/1589/1508 1265/1590/1508
f 1379/1582/1508 1266/1550/1508 1378/1580/1508
f 1378/1580/1508 1206/1591/1508 1275/1573/1508
f 1375/1576/1506 1275/1573/1506 1274/1592/1506
f 1273/1570/1505 1375/1576/1505 1274/1592/1505
f 1381/1593/1509 1385/1594/1509 1384/1595/1509
f 1382/1596/1510 1386/1597/1510 1385/1594/1510
f 1385/1594/1511 1387/1598/1511 1384/1595/1511
f 1386/1597/1512 1388/1599/1512 1385/1594/1512
f 1193/1600/1513 1381/1593/1513 1221/1601/1513
f 1218/1602/1514 1382/1596/1514 1381/1593/1514
f 1217/1603/1515 1383/1604/1515 1382/1596/1515
f 1216/1605/1516 1279/1606/1516 1383/1604/1516
f 1383/1604/1517 1280/1607/1517 1386/1597/1517
f 1280/1607/1518 1389/1608/1518 1386/1597/1518
f 1281/1609/1519 1225/1569/1519 1389/1608/1519
f 1389/1608/1520 1226/1567/1520 1388/1599/1520
f 1388/1599/1521 1227/1566/1521 1387/1598/1521
f 1387/1598/1522 1197/1488/1522 1219/1610/1522
f 1220/1611/1523 1387/1598/1523 1219/1610/1523
f 1381/1593/1524 1220/1611/1524 1221/1601/1524
f 1391/1612/1525 1393/1613/1525 1390/1614/1525
f 1392/1615/1525 1394/1616/1525 1391/1612/1525
f 1394/1616/1526 1396/1617/1526 1393/1613/1526
f 1395/1618/1526 1397/1619/1526 1394/1616/1526
f 1254/1620/1527 1279/1621/1527 1202/1622/1527
f 1254/1620/1527 1391/1612/1527 1390/1614/1527
f 1252/1623/1527 1391/1612/1527 1253/1624/1527
f 1252/1623/1527 1282/1625/1527 1392/1615/1527
f 1392/1615/1525 1283/1626/1525 1395/1618/1525
f 1283/1626/1526 1398/1627/1526 1395/1618/1526
f 1284/1628/1528 1258/1585/1528 1398/1627/1528
f 1398/1627/1528 1259/1586/1528 1397/1619/1528
f 1396/1617/1528 1259/1586/1528 1260/1583/1528
f 1396/1617/1528 1204/1584/1528 1281/1629/1528
f 1393/1613/1526 1281/1629/1526 1280/1630/1526
f 1279/1621/1525 1393/1613/1525 1280/1630/1525
f 1399/1631/1529 1403/1632/1529 1402/1633/1529
f 1400/1634/1530 1404/1635/1530 1403/1632/1530
f 1403/1632/1531 1405/1636/1531 1402/1633/1531
f 1404/1635/1532 1406/1637/1532 1403/1632/1532
f 1191/1638/1533 1399/1631/1533 1209/1639/1533
f 1243/1640/1534 1400/1634/1534 1399/1631/1534
f 1244/1641/1535 1401/1642/1535 1400/1634/1535
f 1245/1643/1536 1285/1644/1536 1401/1642/1536
f 1401/1642/1537 1286/1645/1537 1404/1635/1537
f 1286/1645/1538 1407/1646/1538 1404/1635/1538
f 1287/1647/1539 1216/1605/1539 1407/1646/1539
f 1407/1646/1540 1217/1603/1540 1406/1637/1540
f 1406/1637/1541 1218/1602/1541 1405/1636/1541
f 1405/1636/1542 1193/1600/1542 1207/1648/1542
f 1208/1649/1543 1405/1636/1543 1207/1648/1543
f 1399/1631/1544 1208/1649/1544 1209/1639/1544
f 1409/1650/1545 1411/1651/1545 1408/1652/1545
f 1410/1653/1545 1412/1654/1545 1409/1650/1545
f 1412/1654/1546 1414/1655/1546 1411/1651/1546
f 1413/1656/1546 1415/1657/1546 1412/1654/1546
f 1199/1658/1547 1408/1652/1547 1285/1644/1547
f 1247/1659/1547 1408/1652/1547 1246/1660/1547
f 1248/1661/1547 1409/1650/1547 1247/1659/1547
f 1200/1408/1547 1410/1653/1547 1248/1661/1547
f 1288/1407/1545 1413/1656/1545 1410/1653/1545
f 1289/1420/1546 1416/1662/1546 1413/1656/1546
f 1290/1418/1548 1252/1663/1548 1416/1662/1548
f 1416/1662/1548 1253/1664/1548 1415/1657/1548
f 1415/1657/1548 1254/1665/1548 1414/1655/1548
f 1287/1647/1548 1254/1665/1548 1202/1666/1548
f 1411/1651/1546 1287/1647/1546 1286/1645/1546
f 1285/1644/1545 1411/1651/1545 1286/1645/1545
f 1292/1398/1549 1295/1402/1549 1294/1399/1549
f 1293/1401/1550 1296/1413/1550 1295/1402/1550
f 1294/1399/1551 1295/1402/1551 1298/1403/1551
f 1295/1402/1552 1296/1413/1552 1299/1405/1552
f 1210/1406/1553 1291/1400/1553 1288/1407/1553
f 1211/1409/1554 1292/1398/1554 1291/1400/1554
f 1212/1410/1555 1293/1401/1555 1292/1398/1555
f 1192/1411/1556 1213/1667/1556 1293/1401/1556
f 1293/1401/1557 1213/1667/1557 1214/1412/1557
f 1214/1412/1558 1215/1668/1558 1299/1405/1558
f 1299/1405/1559 1215/1668/1559 1194/1414/1559
f 1298/1403/1560 1299/1405/1560 1249/1415/1560
f 1297/1404/1561 1298/1403/1561 1250/1416/1561
f 1290/1418/1562 1297/1404/1562 1251/1417/1562
f 1289/1420/1563 1294/1399/1563 1297/1404/1563
f 1291/1400/1564 1294/1399/1564 1289/1420/1564
f 1301/1421/1565 1304/1425/1565 1303/1422/1565
f 1302/1424/1566 1305/1431/1566 1304/1425/1566
f 1303/1422/1567 1304/1425/1567 1307/1426/1567
f 1304/1425/1568 1305/1431/1568 1308/1428/1568
f 1251/1417/1569 1300/1423/1569 1282/1429/1569
f 1250/1416/1570 1301/1421/1570 1300/1423/1570
f 1249/1415/1571 1302/1424/1571 1301/1421/1571
f 1194/1414/1572 1222/1669/1572 1302/1424/1572
f 1302/1424/1573 1222/1669/1573 1223/1430/1573
f 1223/1430/1574 1224/1670/1574 1308/1428/1574
f 1308/1428/1575 1224/1670/1575 1198/1432/1575
f 1307/1426/1576 1308/1428/1576 1255/1433/1576
f 1306/1427/1577 1307/1426/1577 1256/1434/1577
f 1284/1436/1578 1306/1427/1578 1257/1435/1578
f 1283/1438/1579 1303/1422/1579 1306/1427/1579
f 1300/1423/1580 1303/1422/1580 1283/1438/1580
f 1310/1439/1581 1313/1443/1581 1312/1440/1581
f 1311/1442/1582 1314/1449/1582 1313/1443/1582
f 1312/1440/1583 1313/1443/1583 1316/1444/1583
f 1313/1443/1584 1314/1449/1584 1317/1446/1584
f 1257/1435/1585 1309/1441/1585 1276/1447/1585
f 1256/1434/1586 1310/1439/1586 1309/1441/1586
f 1255/1433/1587 1311/1442/1587 1310/1439/1587
f 1198/1432/1588 1231/1507/1588 1311/1442/1588
f 1311/1442/1589 1231/1507/1589 1232/1448/1589
f 1232/1448/1590 1233/1519/1590 1317/1446/1590
f 1317/1446/1591 1233/1519/1591 1196/1450/1591
f 1316/1444/1592 1317/1446/1592 1261/1451/1592
f 1315/1445/1593 1316/1444/1593 1262/1452/1593
f 1278/1454/1594 1315/1445/1594 1263/1453/1594
f 1277/1456/1595 1312/1440/1595 1315/1445/1595
f 1309/1441/1596 1312/1440/1596 1277/1456/1596
f 1319/1457/1597 1322/1461/1597 1321/1458/1597
f 1320/1460/1598 1323/1467/1598 1322/1461/1598
f 1321/1458/1599 1322/1461/1599 1325/1462/1599
f 1322/1461/1600 1323/1467/1600 1326/1464/1600
f 1263/1453/1601 1318/1459/1601 1270/1465/1601
f 1262/1452/1602 1319/1457/1602 1318/1459/1602
f 1261/1451/1603 1320/1460/1603 1319/1457/1603
f 1196/1450/1604 1240/1671/1604 1320/1460/1604
f 1320/1460/1605 1240/1671/1605 1241/1466/1605
f 1241/1466/1606 1242/1672/1606 1326/1464/1606
f 1326/1464/1607 1242/1672/1607 1192/1468/1607
f 1325/1462/1608 1326/1464/1608 1212/1469/1608
f 1324/1463/1609 1325/1462/1609 1211/1470/1609
f 1272/1472/1610 1324/1463/1610 1210/1471/1610
f 1271/1474/1611 1321/1458/1611 1324/1463/1611
f 1318/1459/1612 1321/1458/1612 1271/1474/1612
f 1327/1475/1613 1328/1479/1613 1331/1476/1613
f 1329/1478/1614 1332/1490/1614 1331/1476/1614
f 1331/1476/1615 1334/1482/1615 1333/1480/1615
f 1331/1476/1616 1332/1490/1616 1335/1481/1616
f 1193/1483/1617 1221/1485/1617 1327/1475/1617
f 1221/1485/1618 1220/1487/1618 1328/1479/1618
f 1219/1486/1619 1329/1478/1619 1328/1479/1619
f 1197/1488/1620 1230/1489/1620 1329/1478/1620
f 1230/1489/1621 1229/1574/1621 1332/1490/1621
f 1332/1490/1622 1229/1574/1622 1228/1491/1622
f 1335/1481/1623 1228/1491/1623 1195/1492/1623
f 1334/1482/1624 1335/1481/1624 1239/1493/1624
f 1334/1482/1625 1238/1494/1625 1237/1495/1625
f 1333/1480/1626 1237/1495/1626 1191/1496/1626
f 1330/1477/1627 1333/1480/1627 1209/1497/1627
f 1207/1484/1628 1327/1475/1628 1330/1477/1628
f 1336/1499/1629 1337/1503/1629 1340/1500/1629
f 1338/1502/1630 1341/1513/1630 1340/1500/1630
f 1340/1500/1631 1343/1506/1631 1342/1504/1631
f 1340/1500/1632 1341/1513/1632 1344/1505/1632
f 1198/1432/1633 1224/1508/1633 1336/1499/1633
f 1224/1508/1634 1223/1510/1634 1337/1503/1634
f 1222/1509/1635 1338/1502/1635 1337/1503/1635
f 1194/1511/1636 1215/1512/1636 1338/1502/1636
f 1215/1512/1637 1214/1673/1637 1341/1513/1637
f 1341/1513/1638 1214/1673/1638 1213/1514/1638
f 1344/1505/1639 1213/1514/1639 1192/1515/1639
f 1343/1506/1640 1344/1505/1640 1242/1516/1640
f 1343/1506/1641 1241/1517/1641 1240/1518/1641
f 1342/1504/1642 1240/1518/1642 1196/1450/1642
f 1339/1501/1643 1342/1504/1643 1233/1519/1643
f 1231/1507/1644 1336/1499/1644 1339/1501/1644
f 1345/1520/1645 1346/1523/1645 1349/1521/1645
f 1346/1523/1646 1347/1530/1646 1350/1524/1646
f 1349/1521/1647 1352/1526/1647 1351/1525/1647
f 1350/1524/1648 1353/1534/1648 1352/1526/1648
f 1195/1492/1649 1236/1528/1649 1345/1520/1649
f 1236/1528/1650 1235/1529/1650 1346/1523/1650
f 1235/1529/1651 1234/1531/1651 1347/1530/1651
f 1234/1531/1652 1206/1551/1652 1267/1532/1652
f 1347/1530/1653 1267/1532/1653 1268/1533/1653
f 1268/1533/1654 1269/1535/1654 1353/1534/1654
f 1269/1535/1655 1199/1558/1655 1245/1536/1655
f 1353/1534/1656 1245/1536/1656 1244/1537/1656
f 1352/1526/1657 1244/1537/1657 1243/1538/1657
f 1351/1525/1658 1243/1538/1658 1191/1539/1658
f 1238/1541/1659 1348/1522/1659 1351/1525/1659
f 1345/1520/1660 1348/1522/1660 1238/1541/1660
f 1355/1542/1485 1358/1546/1485 1357/1543/1485
f 1356/1545/1485 1359/1548/1485 1358/1546/1485
f 1358/1546/1486 1361/1549/1486 1360/1547/1486
f 1359/1548/1486 1362/1554/1486 1361/1549/1486
f 1266/1550/1487 1354/1544/1487 1267/1532/1487
f 1265/1552/1487 1355/1542/1487 1354/1544/1487
f 1265/1552/1487 1264/1553/1487 1356/1545/1487
f 1205/1455/1487 1270/1465/1487 1356/1545/1487
f 1270/1465/1485 1271/1474/1485 1359/1548/1485
f 1271/1474/1486 1272/1472/1486 1362/1554/1486
f 1272/1472/1488 1200/1473/1488 1248/1555/1488
f 1362/1554/1488 1248/1555/1488 1247/1556/1488
f 1361/1549/1488 1247/1556/1488 1246/1557/1488
f 1360/1547/1488 1246/1557/1488 1199/1558/1488
f 1357/1543/1486 1360/1547/1486 1269/1535/1486
f 1354/1544/1485 1357/1543/1485 1268/1533/1485
f 1363/1559/1661 1364/1562/1661 1367/1560/1661
f 1364/1562/1662 1365/1568/1662 1368/1563/1662
f 1367/1560/1663 1370/1565/1663 1369/1564/1663
f 1368/1563/1664 1371/1572/1664 1370/1565/1664
f 1197/1488/1665 1227/1566/1665 1363/1559/1665
f 1227/1566/1666 1226/1567/1666 1364/1562/1666
f 1226/1567/1667 1225/1569/1667 1365/1568/1667
f 1225/1569/1668 1204/1674/1668 1273/1570/1668
f 1365/1568/1669 1273/1570/1669 1274/1571/1669
f 1274/1571/1670 1275/1573/1670 1371/1572/1670
f 1275/1573/1671 1206/1551/1671 1234/1531/1671
f 1371/1572/1672 1234/1531/1672 1235/1529/1672
f 1370/1565/1673 1235/1529/1673 1236/1528/1673
f 1369/1564/1674 1236/1528/1674 1195/1492/1674
f 1229/1574/1675 1366/1561/1675 1369/1564/1675
f 1363/1559/1676 1366/1561/1676 1229/1574/1676
f 1373/1575/1505 1376/1579/1505 1375/1576/1505
f 1374/1578/1505 1377/1581/1505 1376/1579/1505
f 1376/1579/1506 1379/1582/1506 1378/1580/1506
f 1377/1581/1506 1380/1588/1506 1379/1582/1506
f 1260/1583/1507 1372/1577/1507 1273/1570/1507
f 1260/1583/1507 1259/1586/1507 1373/1575/1507
f 1258/1585/1507 1374/1578/1507 1373/1575/1507
f 1258/1585/1507 1203/1675/1507 1276/1447/1507
f 1374/1578/1505 1276/1447/1505 1277/1587/1505
f 1277/1587/1506 1278/1454/1506 1380/1588/1506
f 1278/1454/1508 1205/1676/1508 1264/1589/1508
f 1379/1582/1508 1380/1588/1508 1264/1589/1508
f 1379/1582/1508 1265/1590/1508 1266/1550/1508
f 1378/1580/1508 1266/1550/1508 1206/1591/1508
f 1375/1576/1506 1378/1580/1506 1275/1573/1506
f 1273/1570/1505 1372/1577/1505 1375/1576/1505
f 1381/1593/1677 1382/1596/1677 1385/1594/1677
f 1382/1596/1678 1383/1604/1678 1386/1597/1678
f 1385/1594/1679 1388/1599/1679 1387/1598/1679
f 1386/1597/1680 1389/1608/1680 1388/1599/1680
f 1193/1600/1681 1218/1602/1681 1381/1593/1681
f 1218/1602/1682 1217/1603/1682 1382/1596/1682
f 1217/1603/1683 1216/1605/1683 1383/1604/1683
f 1216/1605/1684 1202/1666/1684 1279/1606/1684
f 1383/1604/1685 1279/1606/1685 1280/1607/1685
f 1280/1607/1686 1281/1609/1686 1389/1608/1686
f 1281/1609/1687 1204/1674/1687 1225/1569/1687
f 1389/1608/1688 1225/1569/1688 1226/1567/1688
f 1388/1599/1689 1226/1567/1689 1227/1566/1689
f 1387/1598/1690 1227/1566/1690 1197/1488/1690
f 1220/1611/1691 1384/1595/1691 1387/1598/1691
f 1381/1593/1692 1384/1595/1692 1220/1611/1692
f 1391/1612/1525 1394/1616/1525 1393/1613/1525
f 1392/1615/1525 1395/1618/1525 1394/1616/1525
f 1394/1616/1526 1397/1619/1526 1396/1617/1526
f 1395/1618/1526 1398/1627/1526 1397/1619/1526
f 1254/1620/1527 1390/1614/1527 1279/1621/1527
f 1254/1620/1527 1253/1624/1527 1391/1612/1527
f 1252/1623/1527 1392/1615/1527 1391/1612/1527
f 1252/1623/1527 1201/1419/1527 1282/1625/1527
f 1392/1615/1525 1282/1625/1525 1283/1626/1525
f 1283/1626/1526 1284/1628/1526 1398/1627/1526
f 1284/1628/1528 1203/1675/1528 1258/1585/1528
f 1398/1627/1528 1258/1585/1528 1259/1586/1528
f 1396/1617/1528 1397/1619/1528 1259/1586/1528
f 1396/1617/1528 1260/1583/1528 1204/1584/1528
f 1393/1613/1526 1396/1617/1526 1281/1629/1526
f 1279/1621/1525 1390/1614/1525 1393/1613/1525
f 1399/1631/1693 1400/1634/1693 1403/1632/1693
f 1400/1634/1694 1401/1642/1694 1404/1635/1694
f 1403/1632/1695 1406/1637/1695 1405/1636/1695
f 1404/1635/1696 1407/1646/1696 1406/1637/1696
f 1191/1638/1697 1243/1640/1697 1399/1631/1697
f 1243/1640/1698 1244/1641/1698 1400/1634/1698
f 1244/1641/1699 1245/1643/1699 1401/1642/1699
f 1245/1643/1700 1199/1658/1700 1285/1644/1700
f 1401/1642/1701 1285/1644/1701 1286/1645/1701
f 1286/1645/1702 1287/1647/1702 1407/1646/1702
f 1287/1647/1703 1202/1666/1703 1216/1605/1703
f 1407/1646/1704 1216/1605/1704 1217/1603/1704
f 1406/1637/1705 1217/1603/1705 1218/1602/1705
f 1405/1636/1706 1218/1602/1706 1193/1600/1706
f 1208/1649/1707 1402/1633/1707 1405/1636/1707
f 1399/1631/1708 1402/1633/1708 1208/1649/1708
f 1409/1650/1545 1412/1654/1545 1411/1651/1545
f 1410/1653/1545 1413/1656/1545 1412/1654/1545
f 1412/1654/1546 1415/1657/1546 1414/1655/1546
f 1413/1656/1546 1416/1662/1546 1415/1657/1546
f 1199/1658/1547 1246/1660/1547 1408/1652/1547
f 1247/1659/1547 1409/1650/1547 1408/1652/1547
f 1248/1661/1547 1410/1653/1547 1409/1650/1547
f 1200/1408/1547 1288/1407/1547 1410/1653/1547
f 1288/1407/1545 1289/1420/1545 1413/1656/1545
f 1289/1420/1546 1290/1418/1546 1416/1662/1546
f 1290/1418/1548 1201/1419/1548 1252/1663/1548
f 1416/1662/1548 1252/1663/1548 1253/1664/1548
f 1415/1657/1548 1253/1664/1548 1254/1665/1548
f 1287/1647/1548 1414/1655/1548 1254/1665/1548
f 1411/1651/1546 1414/1655/1546 1287/1647/1546
f 1285/1644/1545 1408/1652/1545 1411/1651/1545
o Cube.003
v 0.321424 -0.411795 0.226570
v 0.004825 -0.403375 0.226570
v 0.321424 -0.411795 0.109665
v 0.004825 -0.403375 0.109665
v 0.324012 -0.314486 0.226570
v 0.007413 -0.306066 0.226570
v 0.324012 -0.314486 0.109665
v 0.007413 -0.306066 0.109665
v 0.226013 -0.425487 0.246055
v 0.099373 -0.422120 0.246055
v 0.099373 -0.422120 0.090180
v 0.226013 -0.425487 0.090180
v 0.102824 -0.292374 0.090180
v 0.229463 -0.295742 0.090180
v 0.102824 -0.292374 0.246055
v 0.229463 -0.295742 0.246055
v 0.325853 -0.419015 0.133437
v 0.328078 -0.422642 0.168118
v 0.325853 -0.419015 0.202798
v 0.068224 -0.420944 0.245637
v 0.040134 -0.417764 0.242717
v 0.018004 -0.411175 0.235514
v 0.000018 -0.410350 0.202798
v -0.002396 -0.413853 0.168118
v 0.000018 -0.410350 0.133437
v 0.257181 -0.425969 0.090598
v 0.285400 -0.424287 0.093518
v 0.307849 -0.418884 0.100721
v 0.328104 -0.334396 0.101138
v 0.329656 -0.363325 0.096856
v 0.326568 -0.392130 0.101138
v 0.000733 -0.383465 0.101138
v -0.000819 -0.354536 0.096856
v 0.002268 -0.325731 0.101138
v 0.260613 -0.296918 0.090598
v 0.288703 -0.300097 0.093518
v 0.310833 -0.306686 0.100721
v 0.328819 -0.307512 0.202798
v 0.331233 -0.304008 0.168118
v 0.328819 -0.307512 0.133437
v 0.002983 -0.298846 0.133437
v 0.000759 -0.295220 0.168118
v 0.002983 -0.298846 0.202798
v 0.260613 -0.296918 0.245637
v 0.288703 -0.300097 0.242717
v 0.310833 -0.306686 0.235514
v 0.326568 -0.392130 0.235097
v 0.329656 -0.363325 0.239380
v 0.328104 -0.334396 0.235097
v 0.002268 -0.325731 0.235097
v -0.000819 -0.354536 0.239380
v 0.000733 -0.383465 0.235097
v 0.307849 -0.418884 0.235514
v 0.285400 -0.424287 0.242717
v 0.257181 -0.425969 0.245637
v 0.194353 -0.424645 0.246055
v 0.162693 -0.423803 0.246055
v 0.131033 -0.422962 0.246055
v 0.018004 -0.411175 0.100721
v 0.040134 -0.417764 0.093518
v 0.068224 -0.420944 0.090598
v 0.131033 -0.422962 0.090180
v 0.162693 -0.423803 0.090180
v 0.194353 -0.424645 0.090180
v 0.020988 -0.298978 0.100721
v 0.043436 -0.293575 0.093518
v 0.071655 -0.291893 0.090598
v 0.134484 -0.293216 0.090180
v 0.166144 -0.294058 0.090180
v 0.197804 -0.294900 0.090180
v 0.020988 -0.298978 0.235514
v 0.043436 -0.293575 0.242717
v 0.071655 -0.291893 0.245637
v 0.134484 -0.293216 0.246055
v 0.166144 -0.294058 0.246055
v 0.197804 -0.294900 0.246055
v 0.228682 -0.325137 0.267975
v 0.227738 -0.360615 0.275281
v 0.226795 -0.396092 0.267975
v 0.102042 -0.321769 0.267975
v 0.101099 -0.357247 0.275281
v 0.100155 -0.392724 0.267975
v 0.229949 -0.277496 0.125496
v 0.230110 -0.271414 0.168118
v 0.229949 -0.277496 0.210739
v 0.103309 -0.274128 0.125496
v 0.103471 -0.268047 0.168118
v 0.103309 -0.274128 0.210739
v 0.226795 -0.396092 0.068261
v 0.227738 -0.360615 0.060954
v 0.228682 -0.325137 0.068261
v 0.100155 -0.392724 0.068261
v 0.101099 -0.357247 0.060954
v 0.102042 -0.321769 0.068261
v 0.225528 -0.443733 0.210739
v 0.225366 -0.449815 0.168118
v 0.225528 -0.443733 0.125496
v 0.098888 -0.440365 0.210739
v 0.098726 -0.446447 0.168118
v 0.098888 -0.440365 0.125496
v 0.067698 -0.439095 0.210510
v 0.039326 -0.435259 0.208901
v 0.016320 -0.426420 0.205308
v 0.067494 -0.445147 0.168118
v 0.038829 -0.441097 0.168118
v 0.015064 -0.431561 0.168118
v 0.067698 -0.439095 0.125725
v 0.039326 -0.435259 0.127334
v 0.016320 -0.426420 0.130927
v 0.068958 -0.391703 0.068789
v 0.040540 -0.389608 0.072489
v 0.017395 -0.385999 0.082366
v 0.069854 -0.356416 0.061518
v 0.041101 -0.355651 0.065464
v 0.017100 -0.355013 0.076154
v 0.070835 -0.321131 0.068789
v 0.042346 -0.321713 0.072489
v 0.019041 -0.324086 0.082366
v 0.072095 -0.273739 0.125725
v 0.043560 -0.276062 0.127334
v 0.020116 -0.283665 0.130927
v 0.072214 -0.267685 0.168118
v 0.043373 -0.270205 0.168118
v 0.019135 -0.278464 0.168118
v 0.072095 -0.273739 0.210510
v 0.043560 -0.276062 0.208901
v 0.020116 -0.283665 0.205308
v 0.070835 -0.321131 0.267446
v 0.042346 -0.321713 0.263746
v 0.019041 -0.324086 0.253869
v 0.069854 -0.356416 0.274717
v 0.041101 -0.355651 0.270771
v 0.017100 -0.355013 0.260082
v 0.068958 -0.391703 0.267446
v 0.040540 -0.389608 0.263746
v 0.017395 -0.385999 0.253869
v 0.336399 -0.394291 0.131157
v 0.340563 -0.363615 0.129172
v 0.338036 -0.332761 0.131157
v 0.339701 -0.396033 0.168118
v 0.344215 -0.363712 0.168118
v 0.341425 -0.331197 0.168118
v 0.336399 -0.394291 0.205078
v 0.340563 -0.363615 0.207064
v 0.338036 -0.332761 0.205078
v -0.007563 -0.323570 0.131157
v -0.011726 -0.354246 0.129172
v -0.009199 -0.385100 0.131157
v -0.010864 -0.321829 0.168118
v -0.015379 -0.354149 0.168118
v -0.012589 -0.386664 0.168118
v -0.007563 -0.323570 0.205078
v -0.011726 -0.354246 0.207064
v -0.009199 -0.385100 0.205078
v 0.311442 -0.331862 0.253869
v 0.288297 -0.328254 0.263746
v 0.259878 -0.326158 0.267446
v 0.311737 -0.362848 0.260082
v 0.287736 -0.362210 0.270771
v 0.258983 -0.361445 0.274717
v 0.309796 -0.393775 0.253869
v 0.286491 -0.396149 0.263746
v 0.258002 -0.396730 0.267446
v 0.197022 -0.324295 0.267975
v 0.165362 -0.323453 0.267975
v 0.133702 -0.322611 0.267975
v 0.196078 -0.359773 0.275281
v 0.164418 -0.358931 0.275281
v 0.132759 -0.358089 0.275281
v 0.195135 -0.395250 0.267974
v 0.163475 -0.394408 0.267974
v 0.131815 -0.393566 0.267975
v 0.312517 -0.291441 0.130927
v 0.289511 -0.282602 0.127334
v 0.261139 -0.278766 0.125725
v 0.313773 -0.286300 0.168118
v 0.290008 -0.276764 0.168118
v 0.261343 -0.272715 0.168118
v 0.312517 -0.291441 0.205308
v 0.289511 -0.282602 0.208901
v 0.261139 -0.278766 0.210510
v 0.198289 -0.276654 0.125496
v 0.166629 -0.275812 0.125496
v 0.134969 -0.274970 0.125496
v 0.198451 -0.270572 0.168118
v 0.166791 -0.269731 0.168118
v 0.135131 -0.268889 0.168118
v 0.198289 -0.276654 0.210739
v 0.166629 -0.275812 0.210739
v 0.134969 -0.274970 0.210739
v 0.309796 -0.393775 0.082366
v 0.286491 -0.396149 0.072489
v 0.258002 -0.396730 0.068789
v 0.311737 -0.362848 0.076154
v 0.287736 -0.362210 0.065464
v 0.258983 -0.361445 0.061518
v 0.311442 -0.331862 0.082366
v 0.288297 -0.328254 0.072489
v 0.259878 -0.326158 0.068789
v 0.195135 -0.395250 0.068261
v 0.163475 -0.394408 0.068261
v 0.131815 -0.393566 0.068261
v 0.196078 -0.359773 0.060954
v 0.164418 -0.358931 0.060954
v 0.132759 -0.358089 0.060954
v 0.197022 -0.324295 0.068261
v 0.165362 -0.323453 0.068261
v 0.133702 -0.322611 0.068261
v 0.308721 -0.434196 0.205308
v 0.285277 -0.441800 0.208901
v 0.256741 -0.444123 0.210510
v 0.309701 -0.439397 0.168118
v 0.285463 -0.447656 0.168118
v 0.256623 -0.450176 0.168118
v 0.308721 -0.434196 0.130927
v 0.285277 -0.441800 0.127334
v 0.256741 -0.444123 0.125725
v 0.193868 -0.442891 0.210739
v 0.162208 -0.442049 0.210739
v 0.130548 -0.441207 0.210739
v 0.193706 -0.448973 0.168118
v 0.162046 -0.448131 0.168118
v 0.130386 -0.447289 0.168118
v 0.193868 -0.442891 0.125496
v 0.162208 -0.442049 0.125496
v 0.130548 -0.441207 0.125496
vn -0.1405 -0.9801 0.1406
vn -0.3737 -0.9175 0.1359
vn -0.1405 -0.9801 -0.1406
vn -0.3737 -0.9175 -0.1359
vn -0.0397 -0.8875 0.4591
vn -0.1468 -0.8779 0.4558
vn -0.3836 -0.8165 0.4316
vn -0.6466 -0.6659 0.3721
vn -0.7069 -0.6970 0.1202
vn -0.7069 -0.6970 -0.1202
vn -0.6466 -0.6659 -0.3721
vn -0.3836 -0.8165 -0.4316
vn -0.1468 -0.8779 -0.4558
vn -0.0397 -0.8875 -0.4591
vn -0.0413 -0.9891 -0.1413
vn -0.0413 -0.9891 0.1413
vn -0.1408 -0.1964 -0.9704
vn -0.4096 -0.1783 -0.8947
vn -0.1302 0.2036 -0.9704
vn -0.3995 0.1998 -0.8947
vn -0.0333 -0.5969 -0.8016
vn -0.1492 -0.5888 -0.7944
vn -0.4018 -0.5442 -0.7365
vn -0.6683 -0.4488 -0.5933
vn -0.7506 -0.1368 -0.6465
vn -0.7422 0.1765 -0.6465
vn -0.6435 0.4836 -0.5933
vn -0.3723 0.5648 -0.7365
vn -0.1177 0.5958 -0.7944
vn -0.0015 0.5978 -0.8016
vn -0.0125 0.2020 -0.9793
vn -0.0232 -0.2011 -0.9793
vn -0.0882 0.9861 -0.1406
vn -0.3244 0.9361 -0.1359
vn -0.0882 0.9861 0.1406
vn -0.3244 0.9361 0.1359
vn 0.0076 0.8884 -0.4591
vn -0.0999 0.8844 -0.4558
vn -0.3396 0.8357 -0.4316
vn -0.6103 0.6993 -0.3721
vn -0.6689 0.7336 -0.1202
vn -0.6689 0.7336 0.1202
vn -0.6103 0.6993 0.3721
vn -0.3396 0.8357 0.4316
vn -0.0999 0.8844 0.4558
vn 0.0076 0.8884 0.4591
vn 0.0113 0.9899 0.1413
vn 0.0113 0.9899 -0.1413
vn -0.1302 0.2036 0.9704
vn -0.3995 0.1998 0.8947
vn -0.1408 -0.1964 0.9704
vn -0.4096 -0.1783 0.8947
vn -0.0015 0.5978 0.8016
vn -0.1177 0.5958 0.7944
vn -0.3723 0.5648 0.7365
vn -0.6435 0.4836 0.5933
vn -0.7422 0.1765 0.6465
vn -0.7506 -0.1368 0.6465
vn -0.6683 -0.4488 0.5933
vn -0.4018 -0.5442 0.7365
vn -0.1492 -0.5888 0.7944
vn -0.0333 -0.5969 0.8016
vn -0.0232 -0.2011 0.9793
vn -0.0125 0.2020 0.9793
vn 0.9859 -0.1377 -0.0946
vn 0.9919 0.0872 -0.0928
vn 0.9859 -0.1377 0.0946
vn 0.9919 0.0872 0.0928
vn 0.8724 -0.3983 -0.2835
vn 0.9367 -0.1477 -0.3174
vn 0.9432 0.0976 -0.3175
vn 0.8948 0.3181 -0.3134
vn 0.9317 0.3492 -0.1002
vn 0.9317 0.3492 0.1002
vn 0.8948 0.3181 0.3134
vn 0.9432 0.0976 0.3175
vn 0.9367 -0.1477 0.3174
vn 0.8724 -0.3983 0.2835
vn 0.9118 -0.3983 0.1001
vn 0.9118 -0.3983 -0.1001
vn -0.9859 0.1377 -0.0946
vn -0.9919 -0.0872 -0.0928
vn -0.9859 0.1377 0.0946
vn -0.9919 -0.0872 0.0928
vn -0.8724 0.3983 -0.2835
vn -0.9367 0.1477 -0.3174
vn -0.9432 -0.0976 -0.3175
vn -0.8948 -0.3181 -0.3134
vn -0.9317 -0.3492 -0.1002
vn -0.9317 -0.3492 0.1002
vn -0.8948 -0.3181 0.3134
vn -0.9432 -0.0976 0.3175
vn -0.9367 0.1477 0.3174
vn -0.8724 0.3983 0.2835
vn -0.9118 0.3983 0.1001
vn -0.9118 0.3983 -0.1001
vn 0.4040 0.1835 0.8962
vn 0.1385 0.1984 0.9703
vn 0.3937 -0.2047 0.8962
vn 0.1277 -0.2055 0.9703
vn 0.7130 0.3947 0.5795
vn 0.3993 0.5464 0.7362
vn 0.1469 0.5905 0.7935
vn 0.0331 0.5970 0.8016
vn 0.0230 0.2012 0.9793
vn 0.0123 -0.2021 0.9793
vn 0.0013 -0.5979 0.8016
vn 0.1153 -0.5975 0.7935
vn 0.3697 -0.5668 0.7362
vn 0.6910 -0.4320 0.5795
vn 0.7417 -0.1757 0.6473
vn 0.7500 0.1361 0.6473
vn 0.0054 0.2016 0.9795
vn -0.0054 -0.2016 0.9795
vn 0.0159 0.5974 0.8018
vn -0.0159 -0.5974 0.8018
vn 0.3688 0.9190 -0.1395
vn 0.1385 0.9801 -0.1420
vn 0.3688 0.9190 0.1395
vn 0.1385 0.9801 0.1420
vn 0.6882 0.6467 -0.3289
vn 0.3814 0.8165 -0.4334
vn 0.1446 0.8774 -0.4574
vn 0.0395 0.8875 -0.4592
vn 0.0412 0.9891 -0.1414
vn 0.0412 0.9891 0.1414
vn 0.0395 0.8875 0.4592
vn 0.1446 0.8774 0.4574
vn 0.3814 0.8165 0.4334
vn 0.6882 0.6467 0.3289
vn 0.7063 0.6977 0.1197
vn 0.7063 0.6977 -0.1197
vn 0.0263 0.9896 -0.1413
vn 0.0263 0.9896 0.1413
vn 0.0236 0.8881 -0.4591
vn 0.0236 0.8881 0.4591
vn 0.3937 -0.2047 -0.8962
vn 0.1277 -0.2055 -0.9703
vn 0.4040 0.1835 -0.8962
vn 0.1385 0.1984 -0.9703
vn 0.6910 -0.4320 -0.5795
vn 0.3697 -0.5668 -0.7362
vn 0.1153 -0.5975 -0.7935
vn 0.0013 -0.5979 -0.8016
vn 0.0123 -0.2021 -0.9793
vn 0.0230 0.2012 -0.9793
vn 0.0331 0.5970 -0.8016
vn 0.1469 0.5905 -0.7935
vn 0.3993 0.5464 -0.7362
vn 0.7130 0.3947 -0.5795
vn 0.7500 0.1361 -0.6473
vn 0.7417 -0.1757 -0.6473
vn -0.0054 -0.2016 -0.9795
vn 0.0054 0.2016 -0.9795
vn -0.0159 -0.5974 -0.8018
vn 0.0159 0.5974 -0.8018
vn 0.3194 -0.9373 0.1395
vn 0.0862 -0.9861 0.1420
vn 0.3194 -0.9373 -0.1395
vn 0.0862 -0.9861 -0.1420
vn 0.6528 -0.6824 0.3289
vn 0.3375 -0.8357 0.4334
vn 0.0977 -0.8839 0.4574
vn -0.0077 -0.8883 0.4592
vn -0.0115 -0.9899 0.1414
vn -0.0115 -0.9899 -0.1414
vn -0.0077 -0.8883 -0.4592
vn 0.0977 -0.8839 -0.4574
vn 0.3375 -0.8357 -0.4334
vn 0.6528 -0.6824 -0.3289
vn 0.6682 -0.7343 -0.1197
vn 0.6682 -0.7343 0.1197
vn -0.0263 -0.9896 0.1413
vn -0.0263 -0.9896 -0.1413
vn -0.0236 -0.8881 0.4591
vn -0.0236 -0.8881 -0.4591
vn -0.1385 -0.9801 0.1420
vn -0.3688 -0.9190 0.1395
vn -0.1385 -0.9801 -0.1420
vn -0.3688 -0.9190 -0.1395
vn -0.0395 -0.8875 0.4592
vn -0.1446 -0.8774 0.4574
vn -0.3814 -0.8165 0.4334
vn -0.6882 -0.6467 0.3289
vn -0.7063 -0.6977 0.1197
vn -0.7063 -0.6977 -0.1197
vn -0.6882 -0.6467 -0.3289
vn -0.3814 -0.8165 -0.4334
vn -0.1446 -0.8774 -0.4574
vn -0.0395 -0.8875 -0.4592
vn -0.0412 -0.9891 -0.1414
vn -0.0412 -0.9891 0.1414
vn -0.1385 -0.1984 -0.9703
vn -0.4040 -0.1835 -0.8962
vn -0.1277 0.2055 -0.9703
vn -0.3937 0.2047 -0.8962
vn -0.0331 -0.5970 -0.8016
vn -0.1469 -0.5905 -0.7935
vn -0.3993 -0.5464 -0.7362
vn -0.7130 -0.3947 -0.5795
vn -0.7500 -0.1361 -0.6473
vn -0.7417 0.1757 -0.6473
vn -0.6910 0.4320 -0.5795
vn -0.3697 0.5668 -0.7362
vn -0.1153 0.5975 -0.7935
vn -0.0013 0.5979 -0.8016
vn -0.0123 0.2021 -0.9793
vn -0.0230 -0.2012 -0.9793
vn -0.0862 0.9861 -0.1420
vn -0.3194 0.9373 -0.1395
vn -0.0862 0.9861 0.1420
vn -0.3194 0.9373 0.1395
vn 0.0077 0.8883 -0.4592
vn -0.0977 0.8839 -0.4574
vn -0.3375 0.8357 -0.4334
vn -0.6528 0.6824 -0.3289
vn -0.6682 0.7343 -0.1197
vn -0.6682 0.7343 0.1197
vn -0.6528 0.6824 0.3289
vn -0.3375 0.8357 0.4334
vn -0.0977 0.8839 0.4574
vn 0.0077 0.8883 0.4592
vn 0.0115 0.9899 0.1414
vn 0.0115 0.9899 -0.1414
vn -0.1277 0.2055 0.9703
vn -0.3937 0.2047 0.8962
vn -0.1385 -0.1984 0.9703
vn -0.4040 -0.1835 0.8962
vn -0.0013 0.5979 0.8016
vn -0.1153 0.5975 0.7935
vn -0.3697 0.5668 0.7362
vn -0.6910 0.4320 0.5795
vn -0.7417 0.1757 0.6473
vn -0.7500 -0.1361 0.6473
vn -0.7130 -0.3947 0.5795
vn -0.3993 -0.5464 0.7362
vn -0.1469 -0.5905 0.7935
vn -0.0331 -0.5970 0.8016
vn -0.0230 -0.2012 0.9793
vn -0.0123 0.2021 0.9793
vn 0.9858 -0.1398 -0.0928
vn 0.9919 0.0851 -0.0946
vn 0.9858 -0.1398 0.0928
vn 0.9919 0.0851 0.0946
vn 0.8766 -0.3652 -0.3134
vn 0.9367 -0.1476 -0.3175
vn 0.9432 0.0977 -0.3174
vn 0.8923 0.3513 -0.2835
vn 0.9317 0.3492 -0.1001
vn 0.9317 0.3492 0.1001
vn 0.8923 0.3513 0.2835
vn 0.9432 0.0977 0.3174
vn 0.9367 -0.1476 0.3175
vn 0.8766 -0.3652 0.3134
vn 0.9118 -0.3982 0.1002
vn 0.9118 -0.3982 -0.1002
vn -0.9858 0.1398 -0.0928
vn -0.9919 -0.0851 -0.0946
vn -0.9858 0.1398 0.0928
vn -0.9919 -0.0851 0.0946
vn -0.8766 0.3652 -0.3134
vn -0.9367 0.1476 -0.3175
vn -0.9432 -0.0977 -0.3174
vn -0.8923 -0.3513 -0.2835
vn -0.9317 -0.3492 -0.1001
vn -0.9317 -0.3492 0.1001
vn -0.8923 -0.3513 0.2835
vn -0.9432 -0.0977 0.3174
vn -0.9367 0.1476 0.3175
vn -0.8766 0.3652 0.3134
vn -0.9118 0.3982 0.1002
vn -0.9118 0.3982 -0.1002
vn 0.4096 0.1783 0.8947
vn 0.1408 0.1964 0.9704
vn 0.3995 -0.1998 0.8947
vn 0.1302 -0.2036 0.9704
vn 0.6683 0.4488 0.5933
vn 0.4018 0.5442 0.7365
vn 0.1492 0.5888 0.7944
vn 0.0333 0.5969 0.8016
vn 0.0232 0.2011 0.9793
vn 0.0125 -0.2020 0.9793
vn 0.0015 -0.5978 0.8016
vn 0.1177 -0.5958 0.7944
vn 0.3723 -0.5648 0.7365
vn 0.6435 -0.4836 0.5933
vn 0.7422 -0.1765 0.6465
vn 0.7506 0.1368 0.6465
vn 0.3737 0.9175 -0.1359
vn 0.1405 0.9801 -0.1406
vn 0.3737 0.9175 0.1359
vn 0.1405 0.9801 0.1406
vn 0.6466 0.6659 -0.3721
vn 0.3836 0.8165 -0.4316
vn 0.1468 0.8779 -0.4558
vn 0.0397 0.8875 -0.4591
vn 0.0413 0.9891 -0.1413
vn 0.0413 0.9891 0.1413
vn 0.0397 0.8875 0.4591
vn 0.1468 0.8779 0.4558
vn 0.3836 0.8165 0.4316
vn 0.6466 0.6659 0.3721
vn 0.7069 0.6970 0.1202
vn 0.7069 0.6970 -0.1202
vn 0.3995 -0.1998 -0.8947
vn 0.1302 -0.2036 -0.9704
vn 0.4096 0.1783 -0.8947
vn 0.1408 0.1964 -0.9704
vn 0.6435 -0.4836 -0.5933
vn 0.3723 -0.5648 -0.7365
vn 0.1177 -0.5959 -0.7944
vn 0.0015 -0.5978 -0.8016
vn 0.0125 -0.2020 -0.9793
vn 0.0232 0.2011 -0.9793
vn 0.0333 0.5969 -0.8016
vn 0.1492 0.5887 -0.7944
vn 0.4018 0.5442 -0.7365
vn 0.6683 0.4488 -0.5933
vn 0.7506 0.1368 -0.6465
vn 0.7422 -0.1765 -0.6465
vn 0.3244 -0.9361 0.1359
vn 0.0882 -0.9861 0.1406
vn 0.3244 -0.9361 -0.1359
vn 0.0882 -0.9861 -0.1406
vn 0.6103 -0.6993 0.3721
vn 0.3396 -0.8357 0.4316
vn 0.0999 -0.8844 0.4558
vn -0.0076 -0.8884 0.4591
vn -0.0113 -0.9899 0.1413
vn -0.0113 -0.9899 -0.1413
vn -0.0076 -0.8884 -0.4591
vn 0.0999 -0.8844 -0.4558
vn 0.3396 -0.8357 -0.4316
vn 0.6103 -0.6993 -0.3721
vn 0.6689 -0.7336 -0.1202
vn 0.6689 -0.7336 0.1202
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1518/1677/1709 1520/1678/1709 1517/1679/1709
f 1519/1680/1710 1521/1681/1710 1518/1677/1710
f 1520/1678/1711 1524/1682/1711 1523/1683/1711
f 1521/1681/1712 1525/1684/1712 1524/1682/1712
f 1436/1685/1713 1514/1686/1713 1426/1687/1713
f 1437/1688/1714 1517/1679/1714 1436/1685/1714
f 1438/1689/1715 1518/1677/1715 1437/1688/1715
f 1418/1690/1716 1519/1680/1716 1438/1689/1716
f 1519/1680/1717 1440/1691/1717 1522/1692/1717
f 1440/1691/1718 1525/1684/1718 1522/1692/1718
f 1525/1684/1719 1420/1693/1719 1475/1694/1719
f 1524/1682/1720 1475/1694/1720 1476/1695/1720
f 1523/1683/1721 1476/1695/1721 1477/1696/1721
f 1516/1697/1722 1477/1696/1722 1427/1698/1722
f 1515/1699/1723 1523/1683/1723 1516/1697/1723
f 1517/1679/1724 1515/1699/1724 1514/1686/1724
f 1527/1700/1725 1529/1701/1725 1526/1702/1725
f 1528/1703/1726 1530/1704/1726 1527/1700/1726
f 1529/1701/1727 1533/1705/1727 1532/1706/1727
f 1530/1704/1728 1534/1707/1728 1533/1705/1728
f 1477/1696/1729 1508/1708/1729 1427/1698/1729
f 1476/1695/1730 1526/1702/1730 1477/1696/1730
f 1475/1694/1731 1527/1700/1731 1476/1695/1731
f 1420/1693/1732 1528/1703/1732 1475/1694/1732
f 1528/1703/1733 1449/1709/1733 1531/1710/1733
f 1449/1709/1734 1534/1707/1734 1531/1710/1734
f 1534/1707/1735 1424/1711/1735 1481/1712/1735
f 1533/1705/1736 1481/1712/1736 1482/1713/1736
f 1532/1706/1737 1482/1713/1737 1483/1714/1737
f 1510/1715/1738 1483/1714/1738 1429/1716/1738
f 1509/1717/1739 1532/1706/1739 1510/1715/1739
f 1526/1702/1740 1509/1717/1740 1508/1708/1740
f 1536/1718/1741 1538/1719/1741 1535/1720/1741
f 1537/1721/1742 1539/1722/1742 1536/1718/1742
f 1538/1719/1743 1542/1723/1743 1541/1724/1743
f 1539/1722/1744 1543/1725/1744 1542/1723/1744
f 1483/1714/1745 1502/1726/1745 1429/1716/1745
f 1482/1713/1746 1535/1720/1746 1483/1714/1746
f 1481/1712/1747 1536/1718/1747 1482/1713/1747
f 1424/1711/1748 1537/1721/1748 1481/1712/1748
f 1537/1721/1749 1458/1727/1749 1540/1728/1749
f 1458/1727/1750 1543/1725/1750 1540/1728/1750
f 1543/1725/1751 1422/1729/1751 1487/1730/1751
f 1542/1723/1752 1487/1730/1752 1488/1731/1752
f 1541/1724/1753 1488/1731/1753 1489/1732/1753
f 1504/1733/1754 1489/1732/1754 1431/1734/1754
f 1503/1735/1755 1541/1724/1755 1504/1733/1755
f 1535/1720/1756 1503/1735/1756 1502/1726/1756
f 1545/1736/1757 1547/1737/1757 1544/1738/1757
f 1546/1739/1758 1548/1740/1758 1545/1736/1758
f 1547/1737/1759 1551/1741/1759 1550/1742/1759
f 1548/1740/1760 1552/1743/1760 1551/1741/1760
f 1489/1732/1761 1496/1744/1761 1431/1734/1761
f 1488/1731/1762 1544/1738/1762 1489/1732/1762
f 1487/1730/1763 1545/1736/1763 1488/1731/1763
f 1422/1729/1764 1546/1739/1764 1487/1730/1764
f 1546/1739/1765 1467/1745/1765 1549/1746/1765
f 1467/1745/1766 1552/1743/1766 1549/1746/1766
f 1552/1743/1767 1418/1747/1767 1438/1748/1767
f 1551/1741/1768 1438/1748/1768 1437/1749/1768
f 1550/1742/1769 1437/1749/1769 1436/1750/1769
f 1498/1751/1770 1436/1750/1770 1426/1752/1770
f 1497/1753/1771 1550/1742/1771 1498/1751/1771
f 1544/1738/1772 1497/1753/1772 1496/1744/1772
f 1553/1754/1773 1557/1755/1773 1556/1756/1773
f 1555/1757/1774 1557/1755/1774 1554/1758/1774
f 1557/1755/1775 1559/1759/1775 1556/1756/1775
f 1557/1755/1776 1561/1760/1776 1560/1761/1776
f 1419/1762/1777 1553/1754/1777 1433/1763/1777
f 1447/1764/1778 1554/1758/1778 1553/1754/1778
f 1445/1765/1779 1554/1758/1779 1446/1766/1779
f 1423/1767/1780 1555/1757/1780 1445/1765/1780
f 1456/1768/1781 1558/1769/1781 1555/1757/1781
f 1558/1769/1782 1454/1770/1782 1561/1760/1782
f 1561/1760/1783 1421/1771/1783 1465/1772/1783
f 1560/1761/1784 1465/1772/1784 1464/1773/1784
f 1560/1761/1785 1463/1774/1785 1559/1759/1785
f 1559/1759/1786 1417/1775/1786 1435/1776/1786
f 1556/1756/1787 1435/1776/1787 1434/1777/1787
f 1433/1763/1788 1556/1756/1788 1434/1777/1788
f 1562/1778/1789 1566/1779/1789 1565/1780/1789
f 1564/1781/1790 1566/1779/1790 1563/1782/1790
f 1566/1779/1791 1568/1783/1791 1565/1780/1791
f 1566/1779/1792 1570/1784/1792 1569/1785/1792
f 1424/1711/1793 1562/1778/1793 1457/1786/1793
f 1450/1787/1794 1563/1782/1794 1562/1778/1794
f 1448/1788/1795 1563/1782/1795 1449/1789/1795
f 1420/1790/1796 1564/1781/1796 1448/1788/1796
f 1441/1791/1797 1567/1792/1797 1564/1781/1797
f 1567/1792/1798 1439/1793/1798 1570/1784/1798
f 1570/1784/1799 1418/1794/1799 1468/1795/1799
f 1569/1785/1800 1468/1795/1800 1467/1796/1800
f 1569/1785/1801 1466/1797/1801 1568/1783/1801
f 1568/1783/1802 1422/1729/1802 1459/1798/1802
f 1565/1780/1803 1459/1798/1803 1458/1727/1803
f 1457/1786/1804 1565/1780/1804 1458/1727/1804
f 1571/1799/1805 1575/1800/1805 1574/1801/1805
f 1572/1802/1806 1576/1803/1806 1575/1800/1806
f 1575/1800/1807 1577/1804/1807 1574/1801/1807
f 1576/1803/1808 1578/1805/1808 1575/1800/1808
f 1421/1771/1809 1571/1799/1809 1465/1806/1809
f 1462/1807/1810 1572/1802/1810 1571/1799/1810
f 1461/1808/1811 1573/1809/1811 1572/1802/1811
f 1460/1810/1812 1493/1811/1812 1573/1809/1812
f 1573/1809/1813 1494/1812/1813 1576/1803/1813
f 1494/1812/1814 1579/1813/1814 1576/1803/1814
f 1495/1814/1815 1471/1815/1815 1579/1813/1815
f 1579/1813/1816 1470/1816/1816 1578/1805/1816
f 1578/1805/1817 1469/1817/1817 1577/1804/1817
f 1577/1804/1818 1417/1818/1818 1463/1819/1818
f 1464/1820/1819 1577/1804/1819 1463/1819/1819
f 1571/1799/1820 1464/1820/1820 1465/1806/1820
f 1581/1821/1821 1583/1822/1821 1580/1823/1821
f 1582/1824/1821 1584/1825/1821 1581/1821/1821
f 1584/1825/1822 1586/1826/1822 1583/1822/1822
f 1585/1827/1822 1587/1828/1822 1584/1825/1822
f 1492/1829/1823 1493/1811/1823 1432/1830/1823
f 1491/1831/1823 1580/1823/1823 1492/1829/1823
f 1491/1831/1823 1582/1824/1823 1581/1821/1823
f 1431/1734/1823 1582/1824/1823 1490/1832/1823
f 1496/1744/1821 1585/1827/1821 1582/1824/1821
f 1497/1753/1822 1588/1833/1822 1585/1827/1822
f 1498/1751/1824 1474/1834/1824 1588/1833/1824
f 1588/1833/1824 1473/1835/1824 1587/1828/1824
f 1587/1828/1824 1472/1836/1824 1586/1826/1824
f 1586/1826/1824 1425/1837/1824 1495/1814/1824
f 1583/1822/1822 1495/1814/1822 1494/1812/1822
f 1580/1823/1821 1494/1812/1821 1493/1811/1821
f 1589/1838/1825 1593/1839/1825 1592/1840/1825
f 1590/1841/1826 1594/1842/1826 1593/1839/1826
f 1593/1839/1827 1595/1843/1827 1592/1840/1827
f 1594/1842/1828 1596/1844/1828 1593/1839/1828
f 1423/1767/1829 1589/1838/1829 1456/1768/1829
f 1453/1845/1830 1590/1841/1830 1589/1838/1830
f 1452/1846/1831 1591/1847/1831 1590/1841/1831
f 1451/1848/1832 1499/1849/1832 1591/1847/1832
f 1591/1847/1833 1500/1850/1833 1594/1842/1833
f 1500/1850/1834 1597/1851/1834 1594/1842/1834
f 1501/1852/1835 1460/1810/1835 1597/1851/1835
f 1597/1851/1836 1461/1808/1836 1596/1844/1836
f 1596/1844/1837 1462/1807/1837 1595/1843/1837
f 1595/1843/1838 1421/1771/1838 1454/1770/1838
f 1455/1853/1839 1595/1843/1839 1454/1770/1839
f 1589/1838/1840 1455/1853/1840 1456/1768/1840
f 1599/1854/1841 1601/1855/1841 1598/1856/1841
f 1600/1857/1841 1602/1858/1841 1599/1854/1841
f 1602/1858/1842 1604/1859/1842 1601/1855/1842
f 1603/1860/1842 1605/1861/1842 1602/1858/1842
f 1486/1862/1843 1499/1849/1843 1430/1863/1843
f 1486/1862/1843 1599/1854/1843 1598/1856/1843
f 1484/1864/1843 1599/1854/1843 1485/1865/1843
f 1484/1864/1843 1502/1726/1843 1600/1857/1843
f 1600/1857/1841 1503/1866/1841 1603/1860/1841
f 1503/1866/1842 1606/1867/1842 1603/1860/1842
f 1504/1733/1844 1490/1868/1844 1606/1867/1844
f 1605/1861/1844 1490/1868/1844 1491/1869/1844
f 1605/1861/1844 1492/1829/1844 1604/1859/1844
f 1604/1859/1844 1432/1870/1844 1501/1852/1844
f 1601/1855/1842 1501/1852/1842 1500/1871/1842
f 1499/1849/1841 1601/1855/1841 1500/1871/1841
f 1607/1872/1845 1611/1873/1845 1610/1874/1845
f 1608/1875/1846 1612/1876/1846 1611/1873/1846
f 1611/1873/1847 1613/1877/1847 1610/1874/1847
f 1612/1876/1848 1614/1878/1848 1611/1873/1848
f 1419/1879/1849 1607/1872/1849 1447/1880/1849
f 1444/1881/1850 1608/1875/1850 1607/1872/1850
f 1443/1882/1851 1609/1883/1851 1608/1875/1851
f 1442/1884/1852 1505/1885/1852 1609/1883/1852
f 1609/1883/1853 1506/1886/1853 1612/1876/1853
f 1506/1886/1854 1615/1887/1854 1612/1876/1854
f 1507/1888/1855 1451/1848/1855 1615/1887/1855
f 1615/1887/1856 1452/1846/1856 1614/1878/1856
f 1614/1878/1857 1453/1845/1857 1613/1877/1857
f 1613/1877/1858 1423/1767/1858 1445/1889/1858
f 1446/1890/1859 1613/1877/1859 1445/1889/1859
f 1607/1872/1860 1446/1890/1860 1447/1880/1860
f 1617/1891/1861 1619/1892/1861 1616/1893/1861
f 1618/1894/1861 1620/1895/1861 1617/1891/1861
f 1620/1895/1862 1622/1896/1862 1619/1892/1862
f 1621/1897/1862 1623/1898/1862 1620/1895/1862
f 1480/1899/1863 1505/1900/1863 1428/1901/1863
f 1480/1899/1863 1617/1891/1863 1616/1893/1863
f 1478/1902/1863 1617/1891/1863 1479/1903/1863
f 1478/1902/1863 1508/1904/1863 1618/1894/1863
f 1618/1894/1861 1509/1905/1861 1621/1897/1861
f 1509/1905/1862 1624/1906/1862 1621/1897/1862
f 1510/1907/1864 1484/1864/1864 1624/1906/1864
f 1624/1906/1864 1485/1865/1864 1623/1898/1864
f 1622/1896/1864 1485/1865/1864 1486/1862/1864
f 1622/1896/1864 1430/1863/1864 1507/1908/1864
f 1619/1892/1862 1507/1908/1862 1506/1909/1862
f 1505/1900/1861 1619/1892/1861 1506/1909/1861
f 1625/1910/1865 1629/1911/1865 1628/1912/1865
f 1626/1913/1866 1630/1914/1866 1629/1911/1866
f 1629/1911/1867 1631/1915/1867 1628/1912/1867
f 1630/1914/1868 1632/1916/1868 1629/1911/1868
f 1417/1917/1869 1625/1910/1869 1435/1918/1869
f 1469/1919/1870 1626/1913/1870 1625/1910/1870
f 1470/1920/1871 1627/1921/1871 1626/1913/1871
f 1471/1922/1872 1511/1923/1872 1627/1921/1872
f 1627/1921/1873 1512/1924/1873 1630/1914/1873
f 1512/1924/1874 1633/1925/1874 1630/1914/1874
f 1513/1926/1875 1442/1884/1875 1633/1925/1875
f 1633/1925/1876 1443/1882/1876 1632/1916/1876
f 1632/1916/1877 1444/1881/1877 1631/1915/1877
f 1631/1915/1878 1419/1879/1878 1433/1927/1878
f 1434/1928/1879 1631/1915/1879 1433/1927/1879
f 1625/1910/1880 1434/1928/1880 1435/1918/1880
f 1635/1929/1881 1637/1930/1881 1634/1931/1881
f 1636/1932/1881 1638/1933/1881 1635/1929/1881
f 1638/1933/1882 1640/1934/1882 1637/1930/1882
f 1639/1935/1882 1641/1936/1882 1638/1933/1882
f 1425/1937/1883 1634/1931/1883 1511/1923/1883
f 1473/1938/1883 1634/1931/1883 1472/1939/1883
f 1474/1940/1883 1635/1929/1883 1473/1938/1883
f 1426/1687/1883 1636/1932/1883 1474/1940/1883
f 1514/1686/1881 1639/1935/1881 1636/1932/1881
f 1515/1699/1882 1642/1941/1882 1639/1935/1882
f 1516/1697/1884 1478/1942/1884 1642/1941/1884
f 1642/1941/1884 1479/1943/1884 1641/1936/1884
f 1641/1936/1884 1480/1944/1884 1640/1934/1884
f 1513/1926/1884 1480/1944/1884 1428/1945/1884
f 1637/1930/1882 1513/1926/1882 1512/1924/1882
f 1511/1923/1881 1637/1930/1881 1512/1924/1881
f 1518/1677/1885 1521/1681/1885 1520/1678/1885
f 1519/1680/1886 1522/1692/1886 1521/1681/1886
f 1520/1678/1887 1521/1681/1887 1524/1682/1887
f 1521/1681/1888 1522/1692/1888 1525/1684/1888
f 1436/1685/1889 1517/1679/1889 1514/1686/1889
f 1437/1688/1890 1518/1677/1890 1517/1679/1890
f 1438/1689/1891 1519/1680/1891 1518/1677/1891
f 1418/1690/1892 1439/1946/1892 1519/1680/1892
f 1519/1680/1893 1439/1946/1893 1440/1691/1893
f 1440/1691/1894 1441/1947/1894 1525/1684/1894
f 1525/1684/1895 1441/1947/1895 1420/1693/1895
f 1524/1682/1896 1525/1684/1896 1475/1694/1896
f 1523/1683/1897 1524/1682/1897 1476/1695/1897
f 1516/1697/1898 1523/1683/1898 1477/1696/1898
f 1515/1699/1899 1520/1678/1899 1523/1683/1899
f 1517/1679/1900 1520/1678/1900 1515/1699/1900
f 1527/1700/1901 1530/1704/1901 1529/1701/1901
f 1528/1703/1902 1531/1710/1902 1530/1704/1902
f 1529/1701/1903 1530/1704/1903 1533/1705/1903
f 1530/1704/1904 1531/1710/1904 1534/1707/1904
f 1477/1696/1905 1526/1702/1905 1508/1708/1905
f 1476/1695/1906 1527/1700/1906 1526/1702/1906
f 1475/1694/1907 1528/1703/1907 1527/1700/1907
f 1420/1693/1908 1448/1948/1908 1528/1703/1908
f 1528/1703/1909 1448/1948/1909 1449/1709/1909
f 1449/1709/1910 1450/1949/1910 1534/1707/1910
f 1534/1707/1911 1450/1949/1911 1424/1711/1911
f 1533/1705/1912 1534/1707/1912 1481/1712/1912
f 1532/1706/1913 1533/1705/1913 1482/1713/1913
f 1510/1715/1914 1532/1706/1914 1483/1714/1914
f 1509/1717/1915 1529/1701/1915 1532/1706/1915
f 1526/1702/1916 1529/1701/1916 1509/1717/1916
f 1536/1718/1917 1539/1722/1917 1538/1719/1917
f 1537/1721/1918 1540/1728/1918 1539/1722/1918
f 1538/1719/1919 1539/1722/1919 1542/1723/1919
f 1539/1722/1920 1540/1728/1920 1543/1725/1920
f 1483/1714/1921 1535/1720/1921 1502/1726/1921
f 1482/1713/1922 1536/1718/1922 1535/1720/1922
f 1481/1712/1923 1537/1721/1923 1536/1718/1923
f 1424/1711/1924 1457/1786/1924 1537/1721/1924
f 1537/1721/1925 1457/1786/1925 1458/1727/1925
f 1458/1727/1926 1459/1798/1926 1543/1725/1926
f 1543/1725/1927 1459/1798/1927 1422/1729/1927
f 1542/1723/1928 1543/1725/1928 1487/1730/1928
f 1541/1724/1929 1542/1723/1929 1488/1731/1929
f 1504/1733/1930 1541/1724/1930 1489/1732/1930
f 1503/1735/1931 1538/1719/1931 1541/1724/1931
f 1535/1720/1932 1538/1719/1932 1503/1735/1932
f 1545/1736/1933 1548/1740/1933 1547/1737/1933
f 1546/1739/1934 1549/1746/1934 1548/1740/1934
f 1547/1737/1935 1548/1740/1935 1551/1741/1935
f 1548/1740/1936 1549/1746/1936 1552/1743/1936
f 1489/1732/1937 1544/1738/1937 1496/1744/1937
f 1488/1731/1938 1545/1736/1938 1544/1738/1938
f 1487/1730/1939 1546/1739/1939 1545/1736/1939
f 1422/1729/1940 1466/1950/1940 1546/1739/1940
f 1546/1739/1941 1466/1950/1941 1467/1745/1941
f 1467/1745/1942 1468/1951/1942 1552/1743/1942
f 1552/1743/1943 1468/1951/1943 1418/1747/1943
f 1551/1741/1944 1552/1743/1944 1438/1748/1944
f 1550/1742/1945 1551/1741/1945 1437/1749/1945
f 1498/1751/1946 1550/1742/1946 1436/1750/1946
f 1497/1753/1947 1547/1737/1947 1550/1742/1947
f 1544/1738/1948 1547/1737/1948 1497/1753/1948
f 1553/1754/1949 1554/1758/1949 1557/1755/1949
f 1555/1757/1950 1558/1769/1950 1557/1755/1950
f 1557/1755/1951 1560/1761/1951 1559/1759/1951
f 1557/1755/1952 1558/1769/1952 1561/1760/1952
f 1419/1762/1953 1447/1764/1953 1553/1754/1953
f 1447/1764/1954 1446/1766/1954 1554/1758/1954
f 1445/1765/1955 1555/1757/1955 1554/1758/1955
f 1423/1767/1956 1456/1768/1956 1555/1757/1956
f 1456/1768/1957 1455/1853/1957 1558/1769/1957
f 1558/1769/1958 1455/1853/1958 1454/1770/1958
f 1561/1760/1959 1454/1770/1959 1421/1771/1959
f 1560/1761/1960 1561/1760/1960 1465/1772/1960
f 1560/1761/1961 1464/1773/1961 1463/1774/1961
f 1559/1759/1962 1463/1774/1962 1417/1775/1962
f 1556/1756/1963 1559/1759/1963 1435/1776/1963
f 1433/1763/1964 1553/1754/1964 1556/1756/1964
f 1562/1778/1965 1563/1782/1965 1566/1779/1965
f 1564/1781/1966 1567/1792/1966 1566/1779/1966
f 1566/1779/1967 1569/1785/1967 1568/1783/1967
f 1566/1779/1968 1567/1792/1968 1570/1784/1968
f 1424/1711/1969 1450/1787/1969 1562/1778/1969
f 1450/1787/1970 1449/1789/1970 1563/1782/1970
f 1448/1788/1971 1564/1781/1971 1563/1782/1971
f 1420/1790/1972 1441/1791/1972 1564/1781/1972
f 1441/1791/1973 1440/1952/1973 1567/1792/1973
f 1567/1792/1974 1440/1952/1974 1439/1793/1974
f 1570/1784/1975 1439/1793/1975 1418/1794/1975
f 1569/1785/1976 1570/1784/1976 1468/1795/1976
f 1569/1785/1977 1467/1796/1977 1466/1797/1977
f 1568/1783/1978 1466/1797/1978 1422/1729/1978
f 1565/1780/1979 1568/1783/1979 1459/1798/1979
f 1457/1786/1980 1562/1778/1980 1565/1780/1980
f 1571/1799/1981 1572/1802/1981 1575/1800/1981
f 1572/1802/1982 1573/1809/1982 1576/1803/1982
f 1575/1800/1983 1578/1805/1983 1577/1804/1983
f 1576/1803/1984 1579/1813/1984 1578/1805/1984
f 1421/1771/1985 1462/1807/1985 1571/1799/1985
f 1462/1807/1986 1461/1808/1986 1572/1802/1986
f 1461/1808/1987 1460/1810/1987 1573/1809/1987
f 1460/1810/1988 1432/1830/1988 1493/1811/1988
f 1573/1809/1989 1493/1811/1989 1494/1812/1989
f 1494/1812/1990 1495/1814/1990 1579/1813/1990
f 1495/1814/1991 1425/1837/1991 1471/1815/1991
f 1579/1813/1992 1471/1815/1992 1470/1816/1992
f 1578/1805/1993 1470/1816/1993 1469/1817/1993
f 1577/1804/1994 1469/1817/1994 1417/1818/1994
f 1464/1820/1995 1574/1801/1995 1577/1804/1995
f 1571/1799/1996 1574/1801/1996 1464/1820/1996
f 1581/1821/1821 1584/1825/1821 1583/1822/1821
f 1582/1824/1821 1585/1827/1821 1584/1825/1821
f 1584/1825/1822 1587/1828/1822 1586/1826/1822
f 1585/1827/1822 1588/1833/1822 1587/1828/1822
f 1492/1829/1823 1580/1823/1823 1493/1811/1823
f 1491/1831/1823 1581/1821/1823 1580/1823/1823
f 1491/1831/1823 1490/1832/1823 1582/1824/1823
f 1431/1734/1823 1496/1744/1823 1582/1824/1823
f 1496/1744/1821 1497/1753/1821 1585/1827/1821
f 1497/1753/1822 1498/1751/1822 1588/1833/1822
f 1498/1751/1824 1426/1752/1824 1474/1834/1824
f 1588/1833/1824 1474/1834/1824 1473/1835/1824
f 1587/1828/1824 1473/1835/1824 1472/1836/1824
f 1586/1826/1824 1472/1836/1824 1425/1837/1824
f 1583/1822/1822 1586/1826/1822 1495/1814/1822
f 1580/1823/1821 1583/1822/1821 1494/1812/1821
f 1589/1838/1997 1590/1841/1997 1593/1839/1997
f 1590/1841/1998 1591/1847/1998 1594/1842/1998
f 1593/1839/1999 1596/1844/1999 1595/1843/1999
f 1594/1842/2000 1597/1851/2000 1596/1844/2000
f 1423/1767/2001 1453/1845/2001 1589/1838/2001
f 1453/1845/2002 1452/1846/2002 1590/1841/2002
f 1452/1846/2003 1451/1848/2003 1591/1847/2003
f 1451/1848/2004 1430/1953/2004 1499/1849/2004
f 1591/1847/2005 1499/1849/2005 1500/1850/2005
f 1500/1850/2006 1501/1852/2006 1597/1851/2006
f 1501/1852/2007 1432/1830/2007 1460/1810/2007
f 1597/1851/2008 1460/1810/2008 1461/1808/2008
f 1596/1844/2009 1461/1808/2009 1462/1807/2009
f 1595/1843/2010 1462/1807/2010 1421/1771/2010
f 1455/1853/2011 1592/1840/2011 1595/1843/2011
f 1589/1838/2012 1592/1840/2012 1455/1853/2012
f 1599/1854/1841 1602/1858/1841 1601/1855/1841
f 1600/1857/1841 1603/1860/1841 1602/1858/1841
f 1602/1858/1842 1605/1861/1842 1604/1859/1842
f 1603/1860/1842 1606/1867/1842 1605/1861/1842
f 1486/1862/1843 1598/1856/1843 1499/1849/1843
f 1486/1862/1843 1485/1865/1843 1599/1854/1843
f 1484/1864/1843 1600/1857/1843 1599/1854/1843
f 1484/1864/1843 1429/1954/1843 1502/1726/1843
f 1600/1857/1841 1502/1726/1841 1503/1866/1841
f 1503/1866/1842 1504/1733/1842 1606/1867/1842
f 1504/1733/1844 1431/1955/1844 1490/1868/1844
f 1605/1861/1844 1606/1867/1844 1490/1868/1844
f 1605/1861/1844 1491/1869/1844 1492/1829/1844
f 1604/1859/1844 1492/1829/1844 1432/1870/1844
f 1601/1855/1842 1604/1859/1842 1501/1852/1842
f 1499/1849/1841 1598/1856/1841 1601/1855/1841
f 1607/1872/2013 1608/1875/2013 1611/1873/2013
f 1608/1875/2014 1609/1883/2014 1612/1876/2014
f 1611/1873/2015 1614/1878/2015 1613/1877/2015
f 1612/1876/2016 1615/1887/2016 1614/1878/2016
f 1419/1879/2017 1444/1881/2017 1607/1872/2017
f 1444/1881/2018 1443/1882/2018 1608/1875/2018
f 1443/1882/2019 1442/1884/2019 1609/1883/2019
f 1442/1884/2020 1428/1945/2020 1505/1885/2020
f 1609/1883/2021 1505/1885/2021 1506/1886/2021
f 1506/1886/2022 1507/1888/2022 1615/1887/2022
f 1507/1888/2023 1430/1953/2023 1451/1848/2023
f 1615/1887/2024 1451/1848/2024 1452/1846/2024
f 1614/1878/2025 1452/1846/2025 1453/1845/2025
f 1613/1877/2026 1453/1845/2026 1423/1767/2026
f 1446/1890/2027 1610/1874/2027 1613/1877/2027
f 1607/1872/2028 1610/1874/2028 1446/1890/2028
f 1617/1891/1861 1620/1895/1861 1619/1892/1861
f 1618/1894/1861 1621/1897/1861 1620/1895/1861
f 1620/1895/1862 1623/1898/1862 1622/1896/1862
f 1621/1897/1862 1624/1906/1862 1623/1898/1862
f 1480/1899/1863 1616/1893/1863 1505/1900/1863
f 1480/1899/1863 1479/1903/1863 1617/1891/1863
f 1478/1902/1863 1618/1894/1863 1617/1891/1863
f 1478/1902/1863 1427/1698/1863 1508/1904/1863
f 1618/1894/1861 1508/1904/1861 1509/1905/1861
f 1509/1905/1862 1510/1907/1862 1624/1906/1862
f 1510/1907/1864 1429/1954/1864 1484/1864/1864
f 1624/1906/1864 1484/1864/1864 1485/1865/1864
f 1622/1896/1864 1623/1898/1864 1485/1865/1864
f 1622/1896/1864 1486/1862/1864 1430/1863/1864
f 1619/1892/1862 1622/1896/1862 1507/1908/1862
f 1505/1900/1861 1616/1893/1861 1619/1892/1861
f 1625/1910/2029 1626/1913/2029 1629/1911/2029
f 1626/1913/2030 1627/1921/2030 1630/1914/2030
f 1629/1911/2031 1632/1916/2031 1631/1915/2031
f 1630/1914/2032 1633/1925/2032 1632/1916/2032
f 1417/1917/2033 1469/1919/2033 1625/1910/2033
f 1469/1919/2034 1470/1920/2034 1626/1913/2034
f 1470/1920/2035 1471/1922/2035 1627/1921/2035
f 1471/1922/2036 1425/1937/2036 1511/1923/2036
f 1627/1921/2037 1511/1923/2037 1512/1924/2037
f 1512/1924/2038 1513/1926/2038 1633/1925/2038
f 1513/1926/2039 1428/1945/2039 1442/1884/2039
f 1633/1925/2040 1442/1884/2040 1443/1882/2040
f 1632/1916/2041 1443/1882/2041 1444/1881/2041
f 1631/1915/2042 1444/1881/2042 1419/1879/2042
f 1434/1928/2043 1628/1912/2043 1631/1915/2043
f 1625/1910/2044 1628/1912/2044 1434/1928/2044
f 1635/1929/1881 1638/1933/1881 1637/1930/1881
f 1636/1932/1881 1639/1935/1881 1638/1933/1881
f 1638/1933/1882 1641/1936/1882 1640/1934/1882
f 1639/1935/1882 1642/1941/1882 1641/1936/1882
f 1425/1937/1883 1472/1939/1883 1634/1931/1883
f 1473/1938/1883 1635/1929/1883 1634/1931/1883
f 1474/1940/1883 1636/1932/1883 1635/1929/1883
f 1426/1687/1883 1514/1686/1883 1636/1932/1883
f 1514/1686/1881 1515/1699/1881 1639/1935/1881
f 1515/1699/1882 1516/1697/1882 1642/1941/1882
f 1516/1697/1884 1427/1698/1884 1478/1942/1884
f 1642/1941/1884 1478/1942/1884 1479/1943/1884
f 1641/1936/1884 1479/1943/1884 1480/1944/1884
f 1513/1926/1884 1640/1934/1884 1480/1944/1884
f 1637/1930/1882 1640/1934/1882 1513/1926/1882
f 1511/1923/1881 1634/1931/1881 1637/1930/1881
o Cube.004
v -0.142272 0.020747 0.254680
v 0.124702 -0.073756 0.396607
v -0.094265 0.020747 0.164376
v 0.172709 -0.073756 0.306303
v -0.167681 -0.071320 0.241172
v 0.099294 -0.165823 0.383099
v -0.119674 -0.071320 0.150869
v 0.147300 -0.165823 0.292796
v -0.065946 0.007740 0.314560
v 0.040844 -0.030061 0.371330
v 0.104852 -0.030061 0.250926
v -0.001937 0.007740 0.194155
v 0.070974 -0.152816 0.232916
v -0.035816 -0.115015 0.176145
v 0.006966 -0.152816 0.353320
v -0.099824 -0.115015 0.296550
v -0.106069 0.028840 0.181654
v -0.121335 0.032905 0.207898
v -0.134551 0.028840 0.235231
v 0.067200 -0.039690 0.384928
v 0.091505 -0.050394 0.394956
v 0.111694 -0.062720 0.398552
v 0.140212 -0.068420 0.381299
v 0.157340 -0.065740 0.356045
v 0.168694 -0.068420 0.327722
v -0.028474 0.016713 0.180461
v -0.054050 0.022817 0.169758
v -0.077369 0.023797 0.164498
v -0.114900 -0.051220 0.144959
v -0.107560 -0.023216 0.144618
v -0.099825 0.003404 0.152973
v 0.174938 -0.093856 0.299041
v 0.171115 -0.121860 0.292765
v 0.159863 -0.148479 0.291027
v -0.062171 -0.105386 0.162547
v -0.086477 -0.094682 0.152519
v -0.106665 -0.082356 0.148924
v -0.163666 -0.076656 0.219754
v -0.152312 -0.079336 0.191430
v -0.135184 -0.076656 0.166177
v 0.139579 -0.173916 0.312244
v 0.126363 -0.177981 0.339578
v 0.111097 -0.173916 0.365821
v -0.125837 -0.105386 0.282307
v -0.147745 -0.094682 0.267768
v -0.162017 -0.082356 0.253044
v -0.154834 0.003404 0.256449
v -0.166087 -0.023216 0.254710
v -0.169909 -0.051220 0.248435
v 0.104854 -0.148479 0.394502
v 0.112589 -0.121860 0.402857
v 0.119929 -0.093856 0.402516
v -0.132721 0.023797 0.268618
v -0.115317 0.022817 0.285006
v -0.092140 0.016713 0.300221
v -0.039248 -0.001710 0.328752
v -0.012551 -0.011160 0.342945
v 0.014146 -0.020611 0.357138
v 0.167045 -0.062720 0.294432
v 0.152773 -0.050394 0.279708
v 0.130866 -0.039690 0.265168
v 0.078155 -0.020611 0.236733
v 0.051458 -0.011160 0.222540
v 0.024760 -0.001710 0.208348
v 0.137749 -0.168873 0.278857
v 0.120346 -0.167892 0.262469
v 0.097169 -0.161788 0.247254
v 0.044277 -0.143366 0.218723
v 0.017579 -0.133916 0.204530
v -0.009118 -0.124465 0.190338
v 0.082397 -0.168873 0.382977
v 0.059078 -0.167892 0.377717
v 0.033503 -0.161788 0.367014
v -0.019732 -0.143366 0.339128
v -0.046429 -0.133916 0.324935
v -0.073127 -0.124465 0.310742
v -0.101150 -0.087203 0.317562
v -0.094887 -0.053637 0.328130
v -0.082623 -0.020071 0.327411
v 0.005640 -0.125004 0.374333
v 0.011903 -0.091439 0.384901
v 0.024167 -0.057873 0.384182
v -0.055082 -0.132277 0.200891
v -0.074172 -0.138032 0.232970
v -0.090086 -0.132277 0.266738
v 0.051708 -0.170079 0.257662
v 0.032618 -0.175833 0.289741
v 0.016703 -0.170079 0.323509
v -0.000612 -0.020071 0.173143
v -0.006875 -0.053637 0.162574
v -0.019139 -0.087203 0.163294
v 0.106178 -0.057873 0.229914
v 0.099915 -0.091438 0.219345
v 0.087651 -0.125004 0.220064
v -0.046680 0.025003 0.289813
v -0.027589 0.030757 0.257734
v -0.011675 0.025003 0.223967
v 0.060110 -0.012798 0.346584
v 0.079200 -0.007044 0.314505
v 0.095115 -0.012798 0.280738
v 0.086401 -0.022528 0.360333
v 0.110251 -0.033935 0.371418
v 0.129164 -0.048646 0.377913
v 0.105425 -0.016815 0.328446
v 0.128813 -0.028505 0.340880
v 0.146729 -0.044087 0.350404
v 0.121217 -0.022528 0.294841
v 0.143746 -0.033935 0.308412
v 0.159708 -0.048646 0.320458
v 0.132222 -0.067367 0.244283
v 0.154347 -0.077127 0.259710
v 0.169095 -0.086890 0.277336
v 0.126031 -0.100765 0.233787
v 0.148656 -0.109347 0.249725
v 0.164506 -0.116512 0.268741
v 0.113795 -0.134137 0.234486
v 0.136619 -0.141364 0.250286
v 0.152929 -0.145466 0.268742
v 0.078040 -0.178976 0.271888
v 0.102177 -0.184555 0.286313
v 0.122433 -0.183710 0.300642
v 0.059088 -0.184715 0.303813
v 0.084191 -0.190190 0.317158
v 0.106754 -0.188936 0.329153
v 0.043224 -0.178976 0.337379
v 0.068682 -0.184555 0.349320
v 0.091889 -0.183710 0.358097
v 0.032218 -0.134137 0.387938
v 0.058081 -0.141364 0.398022
v 0.082503 -0.145466 0.401219
v 0.038482 -0.100765 0.398472
v 0.064349 -0.109347 0.408313
v 0.088977 -0.116512 0.410816
v 0.050645 -0.067367 0.397734
v 0.075809 -0.077127 0.407446
v 0.098669 -0.086890 0.409813
v -0.119989 0.008149 0.171995
v -0.130028 -0.019960 0.164691
v -0.136055 -0.050066 0.163454
v -0.137557 0.010711 0.199274
v -0.149101 -0.018870 0.193137
v -0.154486 -0.050631 0.190274
v -0.150344 0.008149 0.229095
v -0.162014 -0.019960 0.224858
v -0.166411 -0.050066 0.220554
v 0.155373 -0.153225 0.318381
v 0.167042 -0.125116 0.322617
v 0.171439 -0.095010 0.326922
v 0.142585 -0.155787 0.348201
v 0.154129 -0.126206 0.354338
v 0.159514 -0.094445 0.357201
v 0.125018 -0.153225 0.375481
v 0.135057 -0.125116 0.382785
v 0.141084 -0.095010 0.384022
v -0.164067 -0.058186 0.270139
v -0.149319 -0.067949 0.287765
v -0.127194 -0.077709 0.303193
v -0.159478 -0.028564 0.278734
v -0.143628 -0.035728 0.297750
v -0.121002 -0.044311 0.313688
v -0.147901 0.000390 0.278734
v -0.131591 -0.003712 0.297190
v -0.108767 -0.010939 0.312989
v -0.074452 -0.096654 0.331755
v -0.047755 -0.106104 0.345947
v -0.021057 -0.115554 0.360140
v -0.068189 -0.063088 0.342323
v -0.041492 -0.072538 0.356516
v -0.014794 -0.081988 0.370709
v -0.055925 -0.029522 0.341604
v -0.029228 -0.038972 0.355796
v -0.002530 -0.048422 0.369989
v -0.124136 -0.096430 0.169563
v -0.105222 -0.111141 0.176057
v -0.081372 -0.122548 0.187143
v -0.141701 -0.100988 0.197071
v -0.123785 -0.116571 0.206595
v -0.100397 -0.128261 0.219029
v -0.154680 -0.096430 0.227018
v -0.138717 -0.111141 0.239064
v -0.116188 -0.122548 0.252634
v -0.028384 -0.141728 0.215084
v -0.001687 -0.151178 0.229277
v 0.025011 -0.160628 0.243470
v -0.047475 -0.147482 0.247163
v -0.020777 -0.156932 0.261356
v 0.005920 -0.166382 0.275548
v -0.063389 -0.141728 0.280930
v -0.036691 -0.151178 0.295123
v -0.009994 -0.160628 0.309316
v -0.077474 0.000390 0.146257
v -0.053052 -0.003712 0.149454
v -0.027190 -0.010939 0.159537
v -0.083949 -0.028564 0.136659
v -0.059320 -0.035728 0.139162
v -0.033454 -0.044311 0.149003
v -0.093640 -0.058186 0.137662
v -0.070781 -0.067949 0.140029
v -0.045617 -0.077709 0.149741
v 0.026086 -0.029522 0.187335
v 0.052783 -0.038972 0.201528
v 0.079481 -0.048422 0.215721
v 0.019823 -0.063088 0.176767
v 0.046520 -0.072538 0.190960
v 0.073217 -0.081988 0.205152
v 0.007559 -0.096654 0.177486
v 0.034256 -0.106104 0.191679
v 0.060954 -0.115554 0.205872
v -0.117405 0.038634 0.246833
v -0.097149 0.039480 0.261162
v -0.073012 0.033900 0.275587
v -0.101726 0.043860 0.218322
v -0.079163 0.045114 0.230317
v -0.054059 0.039639 0.243663
v -0.086861 0.038634 0.189378
v -0.063654 0.039480 0.198155
v -0.038196 0.033900 0.210096
v -0.019982 0.015552 0.304006
v 0.006715 0.006102 0.318199
v 0.033412 -0.003348 0.332391
v -0.000892 0.021307 0.271927
v 0.025805 0.011856 0.286120
v 0.052503 0.002406 0.300312
v 0.015022 0.015552 0.238160
v 0.041720 0.006102 0.252352
v 0.068417 -0.003348 0.266545
vn 0.2791 0.9024 0.3282
vn 0.4628 0.7806 0.4201
vn 0.4282 0.9024 0.0478
vn 0.6071 0.7806 0.1487
vn 0.0032 0.8190 0.5738
vn 0.0902 0.7823 0.6163
vn 0.2832 0.6653 0.6908
vn 0.4990 0.4587 0.7352
vn 0.7008 0.4807 0.5271
vn 0.8289 0.4807 0.2861
vn 0.8886 0.4587 0.0025
vn 0.7310 0.6653 -0.1516
vn 0.5614 0.7823 -0.2699
vn 0.4774 0.8190 -0.3183
vn 0.3474 0.9377 0.0039
vn 0.1976 0.9377 0.2858
vn 0.6059 0.1341 -0.7841
vn 0.7806 0.0551 -0.6226
vn 0.5123 -0.2053 -0.8339
vn 0.6905 -0.2713 -0.6705
vn 0.5502 0.5199 -0.6534
vn 0.6356 0.4846 -0.6009
vn 0.8026 0.3869 -0.4541
vn 0.9438 0.2404 -0.2268
vn 0.9580 -0.0676 -0.2786
vn 0.8796 -0.3517 -0.3203
vn 0.7087 -0.6116 -0.3518
vn 0.5301 -0.6002 -0.5989
vn 0.3481 -0.5573 -0.7538
vn 0.2606 -0.5294 -0.8074
vn 0.4279 -0.1753 -0.8867
vn 0.5221 0.1660 -0.8366
vn -0.0886 -0.9699 -0.2269
vn 0.1194 -0.9867 -0.1106
vn -0.2377 -0.9699 0.0535
vn -0.0249 -0.9867 0.1609
vn 0.0228 -0.8282 -0.5600
vn 0.1100 -0.8532 -0.5099
vn 0.3062 -0.8740 -0.3774
vn 0.5345 -0.8245 -0.1858
vn 0.4510 -0.8884 0.0852
vn 0.3229 -0.8884 0.3262
vn 0.1449 -0.8245 0.5470
vn -0.1416 -0.8740 0.4649
vn -0.3611 -0.8532 0.3764
vn -0.4514 -0.8282 0.3321
vn -0.3224 -0.9465 0.0093
vn -0.1726 -0.9465 -0.2725
vn -0.4048 -0.2053 0.8911
vn -0.1696 -0.2713 0.9474
vn -0.3111 0.1341 0.9409
vn -0.0795 0.0551 0.9953
vn -0.5235 -0.5294 0.6676
vn -0.4302 -0.5573 0.7101
vn -0.1999 -0.6002 0.7744
vn 0.1048 -0.6116 0.7842
vn 0.2265 -0.3517 0.9083
vn 0.3049 -0.0676 0.9500
vn 0.3399 0.2404 0.9092
vn 0.0725 0.3869 0.9193
vn -0.1426 0.4846 0.8630
vn -0.2339 0.5199 0.8216
vn -0.4015 0.1660 0.9007
vn -0.4957 -0.1753 0.8506
vn -0.7522 0.4019 -0.5222
vn -0.8129 0.1854 -0.5521
vn -0.8536 0.4019 -0.3315
vn -0.9123 0.1854 -0.3650
vn -0.4854 0.6162 -0.6202
vn -0.5805 0.3927 -0.7133
vn -0.6447 0.1599 -0.7475
vn -0.6588 -0.0639 -0.7496
vn -0.8204 -0.0848 -0.5655
vn -0.9276 -0.0848 -0.3637
vn -0.9899 -0.0639 -0.1269
vn -0.9803 0.1599 -0.1163
vn -0.9160 0.3927 -0.0822
vn -0.7857 0.6162 -0.0555
vn -0.7293 0.6335 -0.2584
vn -0.6221 0.6335 -0.4600
vn 0.8536 -0.4019 0.3315
vn 0.9123 -0.1854 0.3650
vn 0.7522 -0.4019 0.5222
vn 0.8129 -0.1854 0.5521
vn 0.7857 -0.6162 0.0555
vn 0.9160 -0.3927 0.0822
vn 0.9803 -0.1599 0.1163
vn 0.9899 0.0639 0.1269
vn 0.9276 0.0848 0.3637
vn 0.8204 0.0848 0.5655
vn 0.6588 0.0639 0.7496
vn 0.6447 -0.1599 0.7475
vn 0.5805 -0.3927 0.7133
vn 0.4854 -0.6162 0.6202
vn 0.6221 -0.6335 0.4600
vn 0.7293 -0.6335 0.2584
vn -0.7779 -0.0609 0.6254
vn -0.6046 -0.1364 0.7848
vn -0.6854 0.2741 0.6746
vn -0.5100 0.2064 0.8351
vn -0.9629 -0.1796 0.2015
vn -0.8011 -0.3895 0.4546
vn -0.6338 -0.4869 0.6010
vn -0.5501 -0.5200 0.6534
vn -0.5220 -0.1661 0.8366
vn -0.4277 0.1754 0.8867
vn -0.2605 0.5294 0.8074
vn -0.3454 0.5583 0.7543
vn -0.5276 0.6014 0.5999
vn -0.7538 0.5779 0.3127
vn -0.8796 0.3509 0.3211
vn -0.9577 0.0681 0.2796
vn -0.5089 -0.1707 0.8437
vn -0.4147 0.1707 0.8938
vn -0.5369 -0.5247 0.6606
vn -0.2473 0.5247 0.8146
vn -0.4570 -0.7832 -0.4216
vn -0.2766 -0.9030 -0.3287
vn -0.6051 -0.7832 -0.1430
vn -0.4272 -0.9030 -0.0455
vn -0.5541 -0.4315 -0.7119
vn -0.2804 -0.6658 -0.6914
vn -0.0875 -0.7823 -0.6167
vn -0.0030 -0.8190 -0.5738
vn -0.1974 -0.9377 -0.2858
vn -0.3473 -0.9377 -0.0038
vn -0.4774 -0.8190 0.3184
vn -0.5602 -0.7823 0.2724
vn -0.7300 -0.6658 0.1542
vn -0.9000 -0.4315 -0.0612
vn -0.8282 -0.4816 -0.2864
vn -0.7007 -0.4816 -0.5264
vn -0.1851 -0.9422 -0.2792
vn -0.3350 -0.9422 0.0027
vn 0.0098 -0.8237 -0.5669
vn -0.4645 -0.8237 0.3252
vn 0.1758 0.2741 -0.9455
vn 0.4070 0.2064 -0.8898
vn 0.0833 -0.0609 -0.9947
vn 0.3124 -0.1364 -0.9401
vn -0.1624 0.5779 -0.7998
vn 0.2022 0.6014 -0.7729
vn 0.4321 0.5583 -0.7082
vn 0.5236 0.5294 -0.6675
vn 0.4958 0.1754 -0.8505
vn 0.4016 -0.1661 -0.9006
vn 0.2340 -0.5200 -0.8215
vn 0.1437 -0.4869 -0.8616
vn -0.0712 -0.3895 -0.9183
vn -0.3715 -0.1796 -0.9109
vn -0.3038 0.0681 -0.9503
vn -0.2258 0.3509 -0.9088
vn 0.5089 0.1707 -0.8437
vn 0.4147 -0.1707 -0.8938
vn 0.5369 0.5247 -0.6606
vn 0.2473 -0.5247 -0.8146
vn -0.1168 0.9863 0.1166
vn 0.0895 0.9692 0.2292
vn 0.0313 0.9863 -0.1620
vn 0.2401 0.9692 -0.0540
vn -0.5536 0.8236 0.1229
vn -0.3053 0.8731 0.3800
vn -0.1092 0.8519 0.5122
vn -0.0228 0.8281 0.5601
vn 0.1727 0.9465 0.2727
vn 0.3226 0.9465 -0.0094
vn 0.4516 0.8281 -0.3322
vn 0.3635 0.8519 -0.3770
vn 0.1443 0.8731 -0.4656
vn -0.2078 0.8236 -0.5277
vn -0.3225 0.8889 -0.3253
vn -0.4500 0.8889 -0.0854
vn 0.1851 0.9422 0.2792
vn 0.3350 0.9422 -0.0027
vn -0.0098 0.8237 0.5669
vn 0.4645 0.8237 -0.3252
vn 0.2766 0.9030 0.3287
vn 0.4570 0.7832 0.4216
vn 0.4272 0.9030 0.0455
vn 0.6051 0.7832 0.1430
vn 0.0030 0.8190 0.5738
vn 0.0875 0.7823 0.6167
vn 0.2804 0.6658 0.6914
vn 0.5541 0.4315 0.7119
vn 0.7007 0.4816 0.5264
vn 0.8282 0.4816 0.2864
vn 0.9000 0.4315 0.0612
vn 0.7300 0.6658 -0.1542
vn 0.5602 0.7823 -0.2724
vn 0.4774 0.8190 -0.3184
vn 0.3473 0.9377 0.0038
vn 0.1974 0.9377 0.2858
vn 0.6046 0.1364 -0.7848
vn 0.7779 0.0609 -0.6254
vn 0.5100 -0.2064 -0.8351
vn 0.6854 -0.2741 -0.6746
vn 0.5501 0.5200 -0.6534
vn 0.6338 0.4869 -0.6010
vn 0.8011 0.3895 -0.4546
vn 0.9629 0.1796 -0.2015
vn 0.9577 -0.0681 -0.2796
vn 0.8796 -0.3509 -0.3211
vn 0.7538 -0.5779 -0.3127
vn 0.5276 -0.6014 -0.5999
vn 0.3454 -0.5583 -0.7543
vn 0.2605 -0.5294 -0.8074
vn 0.4277 -0.1754 -0.8867
vn 0.5220 0.1661 -0.8366
vn -0.0895 -0.9692 -0.2292
vn 0.1168 -0.9863 -0.1166
vn -0.2401 -0.9692 0.0540
vn -0.0313 -0.9863 0.1620
vn 0.0228 -0.8281 -0.5601
vn 0.1092 -0.8519 -0.5122
vn 0.3053 -0.8731 -0.3800
vn 0.5536 -0.8236 -0.1229
vn 0.4500 -0.8889 0.0854
vn 0.3225 -0.8889 0.3253
vn 0.2078 -0.8236 0.5277
vn -0.1443 -0.8731 0.4656
vn -0.3635 -0.8519 0.3770
vn -0.4516 -0.8281 0.3322
vn -0.3226 -0.9465 0.0094
vn -0.1727 -0.9465 -0.2727
vn -0.4070 -0.2064 0.8898
vn -0.1758 -0.2741 0.9455
vn -0.3124 0.1364 0.9401
vn -0.0833 0.0609 0.9947
vn -0.5236 -0.5294 0.6675
vn -0.4321 -0.5583 0.7082
vn -0.2022 -0.6014 0.7729
vn 0.1624 -0.5779 0.7998
vn 0.2258 -0.3509 0.9088
vn 0.3038 -0.0681 0.9503
vn 0.3715 0.1796 0.9109
vn 0.0712 0.3895 0.9183
vn -0.1437 0.4869 0.8616
vn -0.2340 0.5200 0.8215
vn -0.4016 0.1661 0.9006
vn -0.4958 -0.1754 0.8505
vn -0.7526 0.4039 -0.5201
vn -0.8114 0.1875 -0.5536
vn -0.8520 0.4039 -0.3330
vn -0.9128 0.1875 -0.3630
vn -0.4799 0.5842 -0.6545
vn -0.5805 0.3926 -0.7134
vn -0.6448 0.1598 -0.7475
vn -0.6821 -0.0966 -0.7248
vn -0.8204 -0.0849 -0.5654
vn -0.9276 -0.0849 -0.3638
vn -0.9824 -0.0966 -0.1601
vn -0.9803 0.1598 -0.1164
vn -0.9160 0.3926 -0.0821
vn -0.8110 0.5842 -0.0318
vn -0.7294 0.6334 -0.2584
vn -0.6222 0.6334 -0.4601
vn 0.8520 -0.4039 0.3330
vn 0.9128 -0.1875 0.3630
vn 0.7526 -0.4039 0.5201
vn 0.8114 -0.1875 0.5536
vn 0.8110 -0.5842 0.0318
vn 0.9160 -0.3926 0.0821
vn 0.9803 -0.1598 0.1164
vn 0.9824 0.0966 0.1601
vn 0.9276 0.0849 0.3638
vn 0.8204 0.0849 0.5654
vn 0.6821 0.0966 0.7248
vn 0.6448 -0.1598 0.7475
vn 0.5805 -0.3926 0.7134
vn 0.4799 -0.5842 0.6545
vn 0.6222 -0.6334 0.4601
vn 0.7294 -0.6334 0.2584
vn -0.7806 -0.0551 0.6226
vn -0.6059 -0.1341 0.7841
vn -0.6905 0.2713 0.6705
vn -0.5123 0.2053 0.8339
vn -0.9438 -0.2404 0.2268
vn -0.8026 -0.3869 0.4541
vn -0.6356 -0.4846 0.6009
vn -0.5502 -0.5199 0.6534
vn -0.5221 -0.1660 0.8366
vn -0.4279 0.1753 0.8867
vn -0.2606 0.5294 0.8074
vn -0.3481 0.5573 0.7538
vn -0.5301 0.6002 0.5989
vn -0.7087 0.6116 0.3518
vn -0.8796 0.3517 0.3203
vn -0.9580 0.0676 0.2786
vn -0.4628 -0.7806 -0.4201
vn -0.2791 -0.9024 -0.3282
vn -0.6071 -0.7806 -0.1487
vn -0.4282 -0.9024 -0.0478
vn -0.4990 -0.4587 -0.7352
vn -0.2832 -0.6653 -0.6908
vn -0.0902 -0.7823 -0.6163
vn -0.0032 -0.8190 -0.5738
vn -0.1976 -0.9377 -0.2858
vn -0.3474 -0.9377 -0.0039
vn -0.4774 -0.8190 0.3183
vn -0.5614 -0.7823 0.2699
vn -0.7310 -0.6653 0.1516
vn -0.8886 -0.4587 -0.0025
vn -0.8289 -0.4807 -0.2861
vn -0.7008 -0.4807 -0.5271
vn 0.1696 0.2713 -0.9474
vn 0.4048 0.2053 -0.8911
vn 0.0795 -0.0551 -0.9953
vn 0.3111 -0.1341 -0.9409
vn -0.1048 0.6116 -0.7842
vn 0.1999 0.6002 -0.7744
vn 0.4302 0.5573 -0.7101
vn 0.5235 0.5294 -0.6676
vn 0.4957 0.1753 -0.8506
vn 0.4015 -0.1660 -0.9007
vn 0.2339 -0.5199 -0.8216
vn 0.1426 -0.4846 -0.8630
vn -0.0725 -0.3869 -0.9193
vn -0.3399 -0.2404 -0.9092
vn -0.3049 0.0676 -0.9500
vn -0.2265 0.3517 -0.9083
vn -0.1194 0.9867 0.1106
vn 0.0886 0.9699 0.2269
vn 0.0249 0.9867 -0.1609
vn 0.2377 0.9699 -0.0535
vn -0.5345 0.8245 0.1858
vn -0.3062 0.8740 0.3774
vn -0.1100 0.8532 0.5099
vn -0.0228 0.8282 0.5600
vn 0.1726 0.9465 0.2725
vn 0.3224 0.9465 -0.0093
vn 0.4514 0.8282 -0.3321
vn 0.3611 0.8532 -0.3764
vn 0.1416 0.8740 -0.4649
vn -0.1449 0.8245 -0.5470
vn -0.3229 0.8884 -0.3262
vn -0.4510 0.8884 -0.0852
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1744/1956/2045 1746/1957/2045 1743/1958/2045
f 1745/1959/2046 1747/1960/2046 1744/1956/2046
f 1746/1957/2047 1750/1961/2047 1749/1962/2047
f 1747/1960/2048 1751/1963/2048 1750/1961/2048
f 1662/1964/2049 1740/1965/2049 1652/1966/2049
f 1663/1967/2050 1743/1958/2050 1662/1964/2050
f 1664/1968/2051 1744/1956/2051 1663/1967/2051
f 1644/1969/2052 1745/1959/2052 1664/1968/2052
f 1745/1959/2053 1666/1970/2053 1748/1971/2053
f 1666/1970/2054 1751/1963/2054 1748/1971/2054
f 1751/1963/2055 1646/1972/2055 1701/1973/2055
f 1750/1961/2056 1701/1973/2056 1702/1974/2056
f 1749/1962/2057 1702/1974/2057 1703/1975/2057
f 1742/1976/2058 1703/1975/2058 1653/1977/2058
f 1741/1978/2059 1749/1962/2059 1742/1976/2059
f 1743/1958/2060 1741/1978/2060 1740/1965/2060
f 1753/1979/2061 1755/1980/2061 1752/1981/2061
f 1754/1982/2062 1756/1983/2062 1753/1979/2062
f 1755/1980/2063 1759/1984/2063 1758/1985/2063
f 1756/1983/2064 1760/1986/2064 1759/1984/2064
f 1703/1975/2065 1734/1987/2065 1653/1977/2065
f 1702/1974/2066 1752/1981/2066 1703/1975/2066
f 1701/1973/2067 1753/1979/2067 1702/1974/2067
f 1646/1972/2068 1754/1982/2068 1701/1973/2068
f 1754/1982/2069 1675/1988/2069 1757/1989/2069
f 1675/1988/2070 1760/1986/2070 1757/1989/2070
f 1760/1986/2071 1650/1990/2071 1707/1991/2071
f 1759/1984/2072 1707/1991/2072 1708/1992/2072
f 1758/1985/2073 1708/1992/2073 1709/1993/2073
f 1736/1994/2074 1709/1993/2074 1655/1995/2074
f 1735/1996/2075 1758/1985/2075 1736/1994/2075
f 1752/1981/2076 1735/1996/2076 1734/1987/2076
f 1762/1997/2077 1764/1998/2077 1761/1999/2077
f 1763/2000/2078 1765/2001/2078 1762/1997/2078
f 1764/1998/2079 1768/2002/2079 1767/2003/2079
f 1765/2001/2080 1769/2004/2080 1768/2002/2080
f 1709/1993/2081 1728/2005/2081 1655/1995/2081
f 1708/1992/2082 1761/1999/2082 1709/1993/2082
f 1707/1991/2083 1762/1997/2083 1708/1992/2083
f 1650/1990/2084 1763/2000/2084 1707/1991/2084
f 1763/2000/2085 1684/2006/2085 1766/2007/2085
f 1684/2006/2086 1769/2004/2086 1766/2007/2086
f 1769/2004/2087 1648/2008/2087 1713/2009/2087
f 1768/2002/2088 1713/2009/2088 1714/2010/2088
f 1767/2003/2089 1714/2010/2089 1715/2011/2089
f 1730/2012/2090 1715/2011/2090 1657/2013/2090
f 1729/2014/2091 1767/2003/2091 1730/2012/2091
f 1761/1999/2092 1729/2014/2092 1728/2005/2092
f 1771/2015/2093 1773/2016/2093 1770/2017/2093
f 1772/2018/2094 1774/2019/2094 1771/2015/2094
f 1773/2016/2095 1777/2020/2095 1776/2021/2095
f 1774/2019/2096 1778/2022/2096 1777/2020/2096
f 1715/2011/2097 1722/2023/2097 1657/2013/2097
f 1714/2010/2098 1770/2017/2098 1715/2011/2098
f 1713/2009/2099 1771/2015/2099 1714/2010/2099
f 1648/2008/2100 1772/2018/2100 1713/2009/2100
f 1772/2018/2101 1693/2024/2101 1775/2025/2101
f 1693/2024/2102 1778/2022/2102 1775/2025/2102
f 1778/2022/2103 1644/2026/2103 1664/2027/2103
f 1777/2020/2104 1664/2027/2104 1663/2028/2104
f 1776/2021/2105 1663/2028/2105 1662/2029/2105
f 1724/2030/2106 1662/2029/2106 1652/2031/2106
f 1723/2032/2107 1776/2021/2107 1724/2030/2107
f 1770/2017/2108 1723/2032/2108 1722/2023/2108
f 1779/2033/2109 1783/2034/2109 1782/2035/2109
f 1781/2036/2110 1783/2034/2110 1780/2037/2110
f 1783/2034/2111 1785/2038/2111 1782/2035/2111
f 1783/2034/2112 1787/2039/2112 1786/2040/2112
f 1645/2041/2113 1779/2033/2113 1659/2042/2113
f 1673/2043/2114 1780/2037/2114 1779/2033/2114
f 1671/2044/2115 1780/2037/2115 1672/2045/2115
f 1649/2046/2116 1781/2036/2116 1671/2044/2116
f 1682/2047/2117 1784/2048/2117 1781/2036/2117
f 1784/2048/2118 1680/2049/2118 1787/2039/2118
f 1787/2039/2119 1647/2050/2119 1691/2051/2119
f 1786/2040/2120 1691/2051/2120 1690/2052/2120
f 1786/2040/2121 1689/2053/2121 1785/2038/2121
f 1785/2038/2122 1643/2054/2122 1661/2055/2122
f 1782/2035/2123 1661/2055/2123 1660/2056/2123
f 1659/2042/2124 1782/2035/2124 1660/2056/2124
f 1788/2057/2125 1792/2058/2125 1791/2059/2125
f 1790/2060/2126 1792/2058/2126 1789/2061/2126
f 1792/2058/2127 1794/2062/2127 1791/2059/2127
f 1792/2058/2128 1796/2063/2128 1795/2064/2128
f 1650/1990/2129 1788/2057/2129 1683/2065/2129
f 1676/2066/2130 1789/2061/2130 1788/2057/2130
f 1674/2067/2131 1789/2061/2131 1675/2068/2131
f 1646/2069/2132 1790/2060/2132 1674/2067/2132
f 1667/2070/2133 1793/2071/2133 1790/2060/2133
f 1793/2071/2134 1665/2072/2134 1796/2063/2134
f 1796/2063/2135 1644/2073/2135 1694/2074/2135
f 1795/2064/2136 1694/2074/2136 1693/2075/2136
f 1795/2064/2137 1692/2076/2137 1794/2062/2137
f 1794/2062/2138 1648/2008/2138 1685/2077/2138
f 1791/2059/2139 1685/2077/2139 1684/2006/2139
f 1683/2065/2140 1791/2059/2140 1684/2006/2140
f 1797/2078/2141 1801/2079/2141 1800/2080/2141
f 1798/2081/2142 1802/2082/2142 1801/2079/2142
f 1801/2079/2143 1803/2083/2143 1800/2080/2143
f 1802/2082/2144 1804/2084/2144 1801/2079/2144
f 1647/2050/2145 1797/2078/2145 1691/2085/2145
f 1688/2086/2146 1798/2081/2146 1797/2078/2146
f 1687/2087/2147 1799/2088/2147 1798/2081/2147
f 1686/2089/2148 1719/2090/2148 1799/2088/2148
f 1799/2088/2149 1720/2091/2149 1802/2082/2149
f 1720/2091/2150 1805/2092/2150 1802/2082/2150
f 1721/2093/2151 1697/2094/2151 1805/2092/2151
f 1805/2092/2152 1696/2095/2152 1804/2084/2152
f 1804/2084/2153 1695/2096/2153 1803/2083/2153
f 1803/2083/2154 1643/2097/2154 1689/2098/2154
f 1690/2099/2155 1803/2083/2155 1689/2098/2155
f 1797/2078/2156 1690/2099/2156 1691/2085/2156
f 1807/2100/2157 1809/2101/2157 1806/2102/2157
f 1808/2103/2157 1810/2104/2157 1807/2100/2157
f 1810/2104/2158 1812/2105/2158 1809/2101/2158
f 1811/2106/2158 1813/2107/2158 1810/2104/2158
f 1718/2108/2159 1719/2090/2159 1658/2109/2159
f 1717/2110/2159 1806/2102/2159 1718/2108/2159
f 1717/2110/2159 1808/2103/2159 1807/2100/2159
f 1657/2013/2159 1808/2103/2159 1716/2111/2159
f 1722/2023/2157 1811/2106/2157 1808/2103/2157
f 1723/2032/2158 1814/2112/2158 1811/2106/2158
f 1724/2030/2160 1700/2113/2160 1814/2112/2160
f 1814/2112/2160 1699/2114/2160 1813/2107/2160
f 1813/2107/2160 1698/2115/2160 1812/2105/2160
f 1812/2105/2160 1651/2116/2160 1721/2093/2160
f 1809/2101/2158 1721/2093/2158 1720/2091/2158
f 1806/2102/2157 1720/2091/2157 1719/2090/2157
f 1815/2117/2161 1819/2118/2161 1818/2119/2161
f 1816/2120/2162 1820/2121/2162 1819/2118/2162
f 1819/2118/2163 1821/2122/2163 1818/2119/2163
f 1820/2121/2164 1822/2123/2164 1819/2118/2164
f 1649/2046/2165 1815/2117/2165 1682/2047/2165
f 1679/2124/2166 1816/2120/2166 1815/2117/2166
f 1678/2125/2167 1817/2126/2167 1816/2120/2167
f 1677/2127/2168 1725/2128/2168 1817/2126/2168
f 1817/2126/2169 1726/2129/2169 1820/2121/2169
f 1726/2129/2170 1823/2130/2170 1820/2121/2170
f 1727/2131/2171 1686/2089/2171 1823/2130/2171
f 1823/2130/2172 1687/2087/2172 1822/2123/2172
f 1822/2123/2173 1688/2086/2173 1821/2122/2173
f 1821/2122/2174 1647/2050/2174 1680/2049/2174
f 1681/2132/2175 1821/2122/2175 1680/2049/2175
f 1815/2117/2176 1681/2132/2176 1682/2047/2176
f 1825/2133/2177 1827/2134/2177 1824/2135/2177
f 1826/2136/2177 1828/2137/2177 1825/2133/2177
f 1828/2137/2178 1830/2138/2178 1827/2134/2178
f 1829/2139/2178 1831/2140/2178 1828/2137/2178
f 1712/2141/2179 1725/2128/2179 1656/2142/2179
f 1712/2141/2179 1825/2133/2179 1824/2135/2179
f 1710/2143/2179 1825/2133/2179 1711/2144/2179
f 1710/2143/2179 1728/2005/2179 1826/2136/2179
f 1826/2136/2177 1729/2145/2177 1829/2139/2177
f 1729/2145/2178 1832/2146/2178 1829/2139/2178
f 1730/2012/2180 1716/2147/2180 1832/2146/2180
f 1831/2140/2180 1716/2147/2180 1717/2148/2180
f 1831/2140/2180 1718/2108/2180 1830/2138/2180
f 1830/2138/2180 1658/2149/2180 1727/2131/2180
f 1827/2134/2178 1727/2131/2178 1726/2150/2178
f 1725/2128/2177 1827/2134/2177 1726/2150/2177
f 1833/2151/2181 1837/2152/2181 1836/2153/2181
f 1834/2154/2182 1838/2155/2182 1837/2152/2182
f 1837/2152/2183 1839/2156/2183 1836/2153/2183
f 1838/2155/2184 1840/2157/2184 1837/2152/2184
f 1645/2158/2185 1833/2151/2185 1673/2159/2185
f 1670/2160/2186 1834/2154/2186 1833/2151/2186
f 1669/2161/2187 1835/2162/2187 1834/2154/2187
f 1668/2163/2188 1731/2164/2188 1835/2162/2188
f 1835/2162/2189 1732/2165/2189 1838/2155/2189
f 1732/2165/2190 1841/2166/2190 1838/2155/2190
f 1733/2167/2191 1677/2127/2191 1841/2166/2191
f 1841/2166/2192 1678/2125/2192 1840/2157/2192
f 1840/2157/2193 1679/2124/2193 1839/2156/2193
f 1839/2156/2194 1649/2046/2194 1671/2168/2194
f 1672/2169/2195 1839/2156/2195 1671/2168/2195
f 1833/2151/2196 1672/2169/2196 1673/2159/2196
f 1843/2170/2197 1845/2171/2197 1842/2172/2197
f 1844/2173/2197 1846/2174/2197 1843/2170/2197
f 1846/2174/2198 1848/2175/2198 1845/2171/2198
f 1847/2176/2198 1849/2177/2198 1846/2174/2198
f 1706/2178/2199 1731/2179/2199 1654/2180/2199
f 1706/2178/2199 1843/2170/2199 1842/2172/2199
f 1704/2181/2199 1843/2170/2199 1705/2182/2199
f 1704/2181/2199 1734/2183/2199 1844/2173/2199
f 1844/2173/2197 1735/2184/2197 1847/2176/2197
f 1735/2184/2198 1850/2185/2198 1847/2176/2198
f 1736/2186/2200 1710/2143/2200 1850/2185/2200
f 1850/2185/2200 1711/2144/2200 1849/2177/2200
f 1848/2175/2200 1711/2144/2200 1712/2141/2200
f 1848/2175/2200 1656/2142/2200 1733/2187/2200
f 1845/2171/2198 1733/2187/2198 1732/2188/2198
f 1731/2179/2197 1845/2171/2197 1732/2188/2197
f 1851/2189/2201 1855/2190/2201 1854/2191/2201
f 1852/2192/2202 1856/2193/2202 1855/2190/2202
f 1855/2190/2203 1857/2194/2203 1854/2191/2203
f 1856/2193/2204 1858/2195/2204 1855/2190/2204
f 1643/2196/2205 1851/2189/2205 1661/2197/2205
f 1695/2198/2206 1852/2192/2206 1851/2189/2206
f 1696/2199/2207 1853/2200/2207 1852/2192/2207
f 1697/2201/2208 1737/2202/2208 1853/2200/2208
f 1853/2200/2209 1738/2203/2209 1856/2193/2209
f 1738/2203/2210 1859/2204/2210 1856/2193/2210
f 1739/2205/2211 1668/2163/2211 1859/2204/2211
f 1859/2204/2212 1669/2161/2212 1858/2195/2212
f 1858/2195/2213 1670/2160/2213 1857/2194/2213
f 1857/2194/2214 1645/2158/2214 1659/2206/2214
f 1660/2207/2215 1857/2194/2215 1659/2206/2215
f 1851/2189/2216 1660/2207/2216 1661/2197/2216
f 1861/2208/2217 1863/2209/2217 1860/2210/2217
f 1862/2211/2217 1864/2212/2217 1861/2208/2217
f 1864/2212/2218 1866/2213/2218 1863/2209/2218
f 1865/2214/2218 1867/2215/2218 1864/2212/2218
f 1651/2216/2219 1860/2210/2219 1737/2202/2219
f 1699/2217/2219 1860/2210/2219 1698/2218/2219
f 1700/2219/2219 1861/2208/2219 1699/2217/2219
f 1652/1966/2219 1862/2211/2219 1700/2219/2219
f 1740/1965/2217 1865/2214/2217 1862/2211/2217
f 1741/1978/2218 1868/2220/2218 1865/2214/2218
f 1742/1976/2220 1704/2221/2220 1868/2220/2220
f 1868/2220/2220 1705/2222/2220 1867/2215/2220
f 1867/2215/2220 1706/2223/2220 1866/2213/2220
f 1739/2205/2220 1706/2223/2220 1654/2224/2220
f 1863/2209/2218 1739/2205/2218 1738/2203/2218
f 1737/2202/2217 1863/2209/2217 1738/2203/2217
f 1744/1956/2221 1747/1960/2221 1746/1957/2221
f 1745/1959/2222 1748/1971/2222 1747/1960/2222
f 1746/1957/2223 1747/1960/2223 1750/1961/2223
f 1747/1960/2224 1748/1971/2224 1751/1963/2224
f 1662/1964/2225 1743/1958/2225 1740/1965/2225
f 1663/1967/2226 1744/1956/2226 1743/1958/2226
f 1664/1968/2227 1745/1959/2227 1744/1956/2227
f 1644/1969/2228 1665/2225/2228 1745/1959/2228
f 1745/1959/2229 1665/2225/2229 1666/1970/2229
f 1666/1970/2230 1667/2226/2230 1751/1963/2230
f 1751/1963/2231 1667/2226/2231 1646/1972/2231
f 1750/1961/2232 1751/1963/2232 1701/1973/2232
f 1749/1962/2233 1750/1961/2233 1702/1974/2233
f 1742/1976/2234 1749/1962/2234 1703/1975/2234
f 1741/1978/2235 1746/1957/2235 1749/1962/2235
f 1743/1958/2236 1746/1957/2236 1741/1978/2236
f 1753/1979/2237 1756/1983/2237 1755/1980/2237
f 1754/1982/2238 1757/1989/2238 1756/1983/2238
f 1755/1980/2239 1756/1983/2239 1759/1984/2239
f 1756/1983/2240 1757/1989/2240 1760/1986/2240
f 1703/1975/2241 1752/1981/2241 1734/1987/2241
f 1702/1974/2242 1753/1979/2242 1752/1981/2242
f 1701/1973/2243 1754/1982/2243 1753/1979/2243
f 1646/1972/2244 1674/2227/2244 1754/1982/2244
f 1754/1982/2245 1674/2227/2245 1675/1988/2245
f 1675/1988/2246 1676/2228/2246 1760/1986/2246
f 1760/1986/2247 1676/2228/2247 1650/1990/2247
f 1759/1984/2248 1760/1986/2248 1707/1991/2248
f 1758/1985/2249 1759/1984/2249 1708/1992/2249
f 1736/1994/2250 1758/1985/2250 1709/1993/2250
f 1735/1996/2251 1755/1980/2251 1758/1985/2251
f 1752/1981/2252 1755/1980/2252 1735/1996/2252
f 1762/1997/2253 1765/2001/2253 1764/1998/2253
f 1763/2000/2254 1766/2007/2254 1765/2001/2254
f 1764/1998/2255 1765/2001/2255 1768/2002/2255
f 1765/2001/2256 1766/2007/2256 1769/2004/2256
f 1709/1993/2257 1761/1999/2257 1728/2005/2257
f 1708/1992/2258 1762/1997/2258 1761/1999/2258
f 1707/1991/2259 1763/2000/2259 1762/1997/2259
f 1650/1990/2260 1683/2065/2260 1763/2000/2260
f 1763/2000/2261 1683/2065/2261 1684/2006/2261
f 1684/2006/2262 1685/2077/2262 1769/2004/2262
f 1769/2004/2263 1685/2077/2263 1648/2008/2263
f 1768/2002/2264 1769/2004/2264 1713/2009/2264
f 1767/2003/2265 1768/2002/2265 1714/2010/2265
f 1730/2012/2266 1767/2003/2266 1715/2011/2266
f 1729/2014/2267 1764/1998/2267 1767/2003/2267
f 1761/1999/2268 1764/1998/2268 1729/2014/2268
f 1771/2015/2269 1774/2019/2269 1773/2016/2269
f 1772/2018/2270 1775/2025/2270 1774/2019/2270
f 1773/2016/2271 1774/2019/2271 1777/2020/2271
f 1774/2019/2272 1775/2025/2272 1778/2022/2272
f 1715/2011/2273 1770/2017/2273 1722/2023/2273
f 1714/2010/2274 1771/2015/2274 1770/2017/2274
f 1713/2009/2275 1772/2018/2275 1771/2015/2275
f 1648/2008/2276 1692/2229/2276 1772/2018/2276
f 1772/2018/2277 1692/2229/2277 1693/2024/2277
f 1693/2024/2278 1694/2230/2278 1778/2022/2278
f 1778/2022/2279 1694/2230/2279 1644/2026/2279
f 1777/2020/2280 1778/2022/2280 1664/2027/2280
f 1776/2021/2281 1777/2020/2281 1663/2028/2281
f 1724/2030/2282 1776/2021/2282 1662/2029/2282
f 1723/2032/2283 1773/2016/2283 1776/2021/2283
f 1770/2017/2284 1773/2016/2284 1723/2032/2284
f 1779/2033/2285 1780/2037/2285 1783/2034/2285
f 1781/2036/2286 1784/2048/2286 1783/2034/2286
f 1783/2034/2287 1786/2040/2287 1785/2038/2287
f 1783/2034/2288 1784/2048/2288 1787/2039/2288
f 1645/2041/2289 1673/2043/2289 1779/2033/2289
f 1673/2043/2290 1672/2045/2290 1780/2037/2290
f 1671/2044/2291 1781/2036/2291 1780/2037/2291
f 1649/2046/2292 1682/2047/2292 1781/2036/2292
f 1682/2047/2293 1681/2132/2293 1784/2048/2293
f 1784/2048/2294 1681/2132/2294 1680/2049/2294
f 1787/2039/2295 1680/2049/2295 1647/2050/2295
f 1786/2040/2296 1787/2039/2296 1691/2051/2296
f 1786/2040/2297 1690/2052/2297 1689/2053/2297
f 1785/2038/2298 1689/2053/2298 1643/2054/2298
f 1782/2035/2299 1785/2038/2299 1661/2055/2299
f 1659/2042/2300 1779/2033/2300 1782/2035/2300
f 1788/2057/2301 1789/2061/2301 1792/2058/2301
f 1790/2060/2302 1793/2071/2302 1792/2058/2302
f 1792/2058/2303 1795/2064/2303 1794/2062/2303
f 1792/2058/2304 1793/2071/2304 1796/2063/2304
f 1650/1990/2305 1676/2066/2305 1788/2057/2305
f 1676/2066/2306 1675/2068/2306 1789/2061/2306
f 1674/2067/2307 1790/2060/2307 1789/2061/2307
f 1646/2069/2308 1667/2070/2308 1790/2060/2308
f 1667/2070/2309 1666/2231/2309 1793/2071/2309
f 1793/2071/2310 1666/2231/2310 1665/2072/2310
f 1796/2063/2311 1665/2072/2311 1644/2073/2311
f 1795/2064/2312 1796/2063/2312 1694/2074/2312
f 1795/2064/2313 1693/2075/2313 1692/2076/2313
f 1794/2062/2314 1692/2076/2314 1648/2008/2314
f 1791/2059/2315 1794/2062/2315 1685/2077/2315
f 1683/2065/2316 1788/2057/2316 1791/2059/2316
f 1797/2078/2317 1798/2081/2317 1801/2079/2317
f 1798/2081/2318 1799/2088/2318 1802/2082/2318
f 1801/2079/2319 1804/2084/2319 1803/2083/2319
f 1802/2082/2320 1805/2092/2320 1804/2084/2320
f 1647/2050/2321 1688/2086/2321 1797/2078/2321
f 1688/2086/2322 1687/2087/2322 1798/2081/2322
f 1687/2087/2323 1686/2089/2323 1799/2088/2323
f 1686/2089/2324 1658/2109/2324 1719/2090/2324
f 1799/2088/2325 1719/2090/2325 1720/2091/2325
f 1720/2091/2326 1721/2093/2326 1805/2092/2326
f 1721/2093/2327 1651/2116/2327 1697/2094/2327
f 1805/2092/2328 1697/2094/2328 1696/2095/2328
f 1804/2084/2329 1696/2095/2329 1695/2096/2329
f 1803/2083/2330 1695/2096/2330 1643/2097/2330
f 1690/2099/2331 1800/2080/2331 1803/2083/2331
f 1797/2078/2332 1800/2080/2332 1690/2099/2332
f 1807/2100/2157 1810/2104/2157 1809/2101/2157
f 1808/2103/2157 1811/2106/2157 1810/2104/2157
f 1810/2104/2158 1813/2107/2158 1812/2105/2158
f 1811/2106/2158 1814/2112/2158 1813/2107/2158
f 1718/2108/2159 1806/2102/2159 1719/2090/2159
f 1717/2110/2159 1807/2100/2159 1806/2102/2159
f 1717/2110/2159 1716/2111/2159 1808/2103/2159
f 1657/2013/2159 1722/2023/2159 1808/2103/2159
f 1722/2023/2157 1723/2032/2157 1811/2106/2157
f 1723/2032/2158 1724/2030/2158 1814/2112/2158
f 1724/2030/2160 1652/2031/2160 1700/2113/2160
f 1814/2112/2160 1700/2113/2160 1699/2114/2160
f 1813/2107/2160 1699/2114/2160 1698/2115/2160
f 1812/2105/2160 1698/2115/2160 1651/2116/2160
f 1809/2101/2158 1812/2105/2158 1721/2093/2158
f 1806/2102/2157 1809/2101/2157 1720/2091/2157
f 1815/2117/2333 1816/2120/2333 1819/2118/2333
f 1816/2120/2334 1817/2126/2334 1820/2121/2334
f 1819/2118/2335 1822/2123/2335 1821/2122/2335
f 1820/2121/2336 1823/2130/2336 1822/2123/2336
f 1649/2046/2337 1679/2124/2337 1815/2117/2337
f 1679/2124/2338 1678/2125/2338 1816/2120/2338
f 1678/2125/2339 1677/2127/2339 1817/2126/2339
f 1677/2127/2340 1656/2232/2340 1725/2128/2340
f 1817/2126/2341 1725/2128/2341 1726/2129/2341
f 1726/2129/2342 1727/2131/2342 1823/2130/2342
f 1727/2131/2343 1658/2109/2343 1686/2089/2343
f 1823/2130/2344 1686/2089/2344 1687/2087/2344
f 1822/2123/2345 1687/2087/2345 1688/2086/2345
f 1821/2122/2346 1688/2086/2346 1647/2050/2346
f 1681/2132/2347 1818/2119/2347 1821/2122/2347
f 1815/2117/2348 1818/2119/2348 1681/2132/2348
f 1825/2133/2177 1828/2137/2177 1827/2134/2177
f 1826/2136/2177 1829/2139/2177 1828/2137/2177
f 1828/2137/2178 1831/2140/2178 1830/2138/2178
f 1829/2139/2178 1832/2146/2178 1831/2140/2178
f 1712/2141/2179 1824/2135/2179 1725/2128/2179
f 1712/2141/2179 1711/2144/2179 1825/2133/2179
f 1710/2143/2179 1826/2136/2179 1825/2133/2179
f 1710/2143/2179 1655/2233/2179 1728/2005/2179
f 1826/2136/2177 1728/2005/2177 1729/2145/2177
f 1729/2145/2178 1730/2012/2178 1832/2146/2178
f 1730/2012/2180 1657/2234/2180 1716/2147/2180
f 1831/2140/2180 1832/2146/2180 1716/2147/2180
f 1831/2140/2180 1717/2148/2180 1718/2108/2180
f 1830/2138/2180 1718/2108/2180 1658/2149/2180
f 1827/2134/2178 1830/2138/2178 1727/2131/2178
f 1725/2128/2177 1824/2135/2177 1827/2134/2177
f 1833/2151/2349 1834/2154/2349 1837/2152/2349
f 1834/2154/2350 1835/2162/2350 1838/2155/2350
f 1837/2152/2351 1840/2157/2351 1839/2156/2351
f 1838/2155/2352 1841/2166/2352 1840/2157/2352
f 1645/2158/2353 1670/2160/2353 1833/2151/2353
f 1670/2160/2354 1669/2161/2354 1834/2154/2354
f 1669/2161/2355 1668/2163/2355 1835/2162/2355
f 1668/2163/2356 1654/2224/2356 1731/2164/2356
f 1835/2162/2357 1731/2164/2357 1732/2165/2357
f 1732/2165/2358 1733/2167/2358 1841/2166/2358
f 1733/2167/2359 1656/2232/2359 1677/2127/2359
f 1841/2166/2360 1677/2127/2360 1678/2125/2360
f 1840/2157/2361 1678/2125/2361 1679/2124/2361
f 1839/2156/2362 1679/2124/2362 1649/2046/2362
f 1672/2169/2363 1836/2153/2363 1839/2156/2363
f 1833/2151/2364 1836/2153/2364 1672/2169/2364
f 1843/2170/2197 1846/2174/2197 1845/2171/2197
f 1844/2173/2197 1847/2176/2197 1846/2174/2197
f 1846/2174/2198 1849/2177/2198 1848/2175/2198
f 1847/2176/2198 1850/2185/2198 1849/2177/2198
f 1706/2178/2199 1842/2172/2199 1731/2179/2199
f 1706/2178/2199 1705/2182/2199 1843/2170/2199
f 1704/2181/2199 1844/2173/2199 1843/2170/2199
f 1704/2181/2199 1653/1977/2199 1734/2183/2199
f 1844/2173/2197 1734/2183/2197 1735/2184/2197
f 1735/2184/2198 1736/2186/2198 1850/2185/2198
f 1736/2186/2200 1655/2233/2200 1710/2143/2200
f 1850/2185/2200 1710/2143/2200 1711/2144/2200
f 1848/2175/2200 1849/2177/2200 1711/2144/2200
f 1848/2175/2200 1712/2141/2200 1656/2142/2200
f 1845/2171/2198 1848/2175/2198 1733/2187/2198
f 1731/2179/2197 1842/2172/2197 1845/2171/2197
f 1851/2189/2365 1852/2192/2365 1855/2190/2365
f 1852/2192/2366 1853/2200/2366 1856/2193/2366
f 1855/2190/2367 1858/2195/2367 1857/2194/2367
f 1856/2193/2368 1859/2204/2368 1858/2195/2368
f 1643/2196/2369 1695/2198/2369 1851/2189/2369
f 1695/2198/2370 1696/2199/2370 1852/2192/2370
f 1696/2199/2371 1697/2201/2371 1853/2200/2371
f 1697/2201/2372 1651/2216/2372 1737/2202/2372
f 1853/2200/2373 1737/2202/2373 1738/2203/2373
f 1738/2203/2374 1739/2205/2374 1859/2204/2374
f 1739/2205/2375 1654/2224/2375 1668/2163/2375
f 1859/2204/2376 1668/2163/2376 1669/2161/2376
f 1858/2195/2377 1669/2161/2377 1670/2160/2377
f 1857/2194/2378 1670/2160/2378 1645/2158/2378
f 1660/2207/2379 1854/2191/2379 1857/2194/2379
f 1851/2189/2380 1854/2191/2380 1660/2207/2380
f 1861/2208/2217 1864/2212/2217 1863/2209/2217
f 1862/2211/2217 1865/2214/2217 1864/2212/2217
f 1864/2212/2218 1867/2215/2218 1866/2213/2218
f 1865/2214/2218 1868/2220/2218 1867/2215/2218
f 1651/2216/2219 1698/2218/2219 1860/2210/2219
f 1699/2217/2219 1861/2208/2219 1860/2210/2219
f 1700/2219/2219 1862/2211/2219 1861/2208/2219
f 1652/1966/2219 1740/1965/2219 1862/2211/2219
f 1740/1965/2217 1741/1978/2217 1865/2214/2217
f 1741/1978/2218 1742/1976/2218 1868/2220/2218
f 1742/1976/2220 1653/1977/2220 1704/2221/2220
f 1868/2220/2220 1704/2221/2220 1705/2222/2220
f 1867/2215/2220 1705/2222/2220 1706/2223/2220
f 1739/2205/2220 1866/2213/2220 1706/2223/2220
f 1863/2209/2218 1866/2213/2218 1739/2205/2218
f 1737/2202/2217 1860/2210/2217 1863/2209/2217
================================================
FILE: example/teddybear/proxy.txt
================================================
-5.550776502986174560e-02 8.231279402941545087e-02 -1.258323406064429106e-01
-3.998699747023472251e-02 3.020038030973061227e-01 -2.599704789464084564e-01
2.073124080213479496e-03 3.770118695120376895e-01 -2.168838484418099122e-01
5.750893906056839949e-02 2.883552490916799216e-01 -2.101273523705166069e-01
-3.476761380703526083e-02 4.834374151764500582e-01 -8.875533188111095484e-02
2.621275142798934477e-02 4.435870942052951293e-01 -1.248984240581483551e-01
1.101798104483094343e-01 2.418496348280122776e-01 -1.397495042337346671e-01
1.319851991971777161e-01 3.465081803292787699e-01 -3.289878265419945297e-02
1.336974251276483172e-01 2.202749102229801248e-01 -5.356574945673026750e-02
1.086184527498474384e-01 1.609235458707530264e-01 4.532679330515694202e-03
1.124615477246108525e-01 2.060799267972991955e-01 8.486694706963725243e-02
1.108263323668484124e-01 3.067028747214898821e-01 1.125133854568828168e-01
3.467248041426101296e-02 1.647063428200797319e-01 1.570200825594204042e-01
5.467205903680633083e-02 2.383217976424019668e-01 1.807923176314753244e-01
-1.874564039015293865e-02 1.964175410582183201e-01 2.093479902576223517e-01
-8.225799846755116329e-02 7.646714150453758074e-02 9.141778909875285242e-02
-5.224189994222040723e-02 3.949125850434637708e-01 1.982747426322316320e-01
-9.405994484432338154e-02 2.593093170156238192e-01 2.389352117168089817e-01
-9.703494708754141496e-02 5.829404013340751256e-02 1.610719828558321298e-02
-2.055200834886550532e-01 1.992344466413660808e-01 1.883966082339720294e-01
-1.356711670947795823e-01 4.887447652362277850e-01 4.418001738081579755e-02
-2.173799857791113010e-01 3.819825506574616725e-01 1.621729917257352660e-01
-2.787432499633716865e-01 3.248821230676776084e-01 1.292483467059716007e-01
-2.450270426904422316e-01 1.584248380129207323e-01 1.196774461102083353e-01
-2.878676416203407307e-01 2.432906162580208975e-01 1.181967670454668606e-01
-2.738120342378779215e-01 4.030445928469174599e-01 5.560421110246818038e-02
-3.130810722330712448e-01 2.938745572072704593e-01 6.305078821655751842e-02
-2.900091691531618832e-01 1.605951499766521962e-01 -6.958543919617858620e-03
-2.717992175119994247e-01 1.621162895758318789e-01 -1.056899670566608662e-01
-2.822868168892812557e-01 3.669708595544671770e-01 -1.189639037640026625e-01
-2.114300325877888076e-01 4.391127418038818786e-01 -1.285642937250763562e-01
-1.545111280485170213e-01 4.848961521022657761e-01 -7.189320777701599385e-02
-1.229855012261280439e-01 1.144157565148969102e-01 -1.816325621961639436e-01
-1.532148299108402612e-01 3.232779165147374734e-01 -2.528048557046997935e-01
-1.045429956927828724e-01 3.847695021971876961e-01 -2.346506189822485933e-01
-1.119364304491641537e-01 2.427564400919719478e-01 -2.627898221710761040e-01
-7.157083246404463694e-02 1.791651339035606194e-01 -2.392723688528716752e-01
-3.756669628577867515e-02 4.460785868262663323e-01 -1.680404449446399051e-01
5.418741668800247657e-02 1.920492558142700146e-01 -1.889949736685311854e-01
8.862197896480086823e-02 1.517676377888277917e-01 -9.579918481000744557e-02
9.629790456930117926e-02 3.561883225570809941e-01 -1.402252465633767597e-01
7.769766040132154716e-02 4.308292908908605834e-01 -4.683058806158802573e-02
-1.650320705265993088e-02 6.777278104619478827e-02 -1.408983456056614111e-02
1.418740874141209141e-01 2.644011773493079032e-01 2.106153578506128957e-02
5.039146841614738394e-03 4.801431579071926969e-01 -1.110328469672949674e-03
9.929583664508447038e-02 3.923613559930114181e-01 5.873973982447559805e-02
3.655372962407749116e-02 4.502507365863256950e-01 6.611435693657022328e-02
5.426811249069161264e-02 3.946426376428642802e-01 1.356021440899852215e-01
5.107282062075303719e-02 3.141208058035512285e-01 1.844436528632925065e-01
-6.212762610691983933e-02 4.899666979250824617e-01 5.079732036460833655e-02
-5.886325235218057422e-02 4.571426039724759138e-01 1.312719617818547124e-01
-1.571251398008543382e-02 3.009904072813025522e-01 2.245242758477692946e-01
-1.105376630224988305e-01 1.756978990046991762e-01 2.093494487887317401e-01
-1.418308141011307977e-01 3.924240001779347176e-01 1.956733328044601428e-01
-1.624546968236126321e-01 4.506657479746047912e-01 1.233850377025712819e-01
-2.001444731145569922e-01 3.067787104903851869e-01 2.071914218811163055e-01
-2.131031949385588931e-01 4.584919498633186485e-01 4.978459100188815933e-02
-1.888675957603991040e-01 8.248559414814389534e-02 3.854738602963610899e-02
-3.123775198565065248e-01 2.233428816374893322e-01 4.166007383791669322e-02
-3.108480335106847958e-01 3.521194789909793954e-01 -3.128124358998397858e-02
-3.119671767594359713e-01 2.154746516775981058e-01 -6.131003363328066391e-02
-2.566408760812657897e-01 4.306810105751979756e-01 -5.493290825089015145e-02
-1.489544366138922937e-01 7.010902990781905930e-02 -7.538793493780569788e-02
-2.712847985969045750e-01 2.770486221809733141e-01 -1.762602356884806765e-01
-2.201925101004638119e-01 1.500069479774868708e-01 -1.667541134031051697e-01
-1.880144755989196226e-01 3.878074674977167180e-01 -2.062840709220200852e-01
-1.563961531203472144e-01 1.870677896185135580e-01 -2.337640261544615405e-01
-1.161512059362490717e-01 4.571983027739320304e-01 -1.603710262715521695e-01
-5.109761322512303833e-02 1.734666661059509896e-01 -1.041602591896385044e-01
-5.749434767165484406e-02 -4.802923093816066080e-01 -9.003244462973614137e-02
-7.426083221293819370e-02 1.979886544564754547e-01 -2.039573394437437653e-02
3.923653135237548106e-02 -1.476519536313414815e-01 -2.434247844185520593e-01
2.898759172367905770e-02 -2.820774199326867016e-01 -2.270827014526285437e-01
1.062434236405538912e-01 -1.091549446469401718e-01 -1.960030823272940481e-01
7.462506252438656784e-02 -2.115380268523078811e-01 -2.191504889962407610e-01
1.077267130351568514e-01 -1.246008187360323101e-02 -1.674916204452387980e-01
4.737973589827712551e-02 -4.377353785729148350e-01 -9.525483132031775657e-02
1.553412247905231469e-01 -3.088354236180936296e-01 -6.180040370049259130e-02
1.316812611768390928e-01 5.308158351285514887e-02 -6.934336985558382793e-02
7.293107162963116674e-02 -4.402032108589321591e-01 -1.831674559411535994e-02
1.019577142943011230e-02 1.789941903667302825e-01 -1.441688249336608302e-02
1.347883560061164387e-01 5.960133450103863306e-02 1.740374193646921502e-02
1.233405631664254432e-01 2.235237609452735835e-03 1.205061879518037060e-01
1.207919818797827116e-01 -7.437310676425021305e-02 1.556902482235498208e-01
3.333711169507772021e-02 1.521397506197987581e-01 6.216924082309258082e-02
8.167635557237819255e-02 -2.127196891852091709e-01 1.940077909797579825e-01
-7.922176513590274000e-03 -2.008414427282950998e-01 2.370221583760520612e-01
-3.926097580430663869e-02 -7.956246720096604719e-02 2.415177463394604096e-01
-6.157823166656801323e-02 1.794979557309702156e-01 7.208259481587020456e-02
-9.064252285207617665e-02 -4.908812767019926637e-01 4.519023899358984298e-02
-1.615658610652523919e-01 1.028505387068022470e-01 1.483535096185837343e-01
-1.856083390824073698e-01 -2.021098036211639581e-01 2.236939874155560171e-01
-2.521846444483782368e-01 -1.100465839403525492e-01 1.827957620580193676e-01
-1.411180629849258239e-01 1.674540592912803771e-01 7.727491585457200995e-02
-1.874102472302851541e-01 -4.192614378958642307e-01 1.143903663090891365e-01
-3.054719825283451762e-01 -2.836364462067636483e-01 8.547719600716996435e-02
-1.966712725113235194e-01 -4.565533849220086426e-01 3.094476444199486392e-02
-3.325769729390443130e-01 -2.142967904085320563e-01 5.307425256307919670e-02
-2.073755297621894811e-01 1.536577752935968100e-01 -6.248630671590251970e-03
-3.134365110943921895e-01 1.319621029895936062e-02 -1.362956406195723127e-02
-3.302597436879254889e-01 -2.629106309851381162e-01 -1.713402274221870886e-02
-3.025119038047423548e-01 -3.384401689051037554e-01 -2.094933705209516689e-02
-2.914860344629453537e-01 1.395853391972036277e-02 -1.044096188399867564e-01
-3.150421118131560561e-01 -2.034782209865934055e-01 -1.210192845117787352e-01
-3.063949342938200870e-01 -2.832939361184677041e-01 -1.034186470063439972e-01
-1.737286540717478123e-01 1.530824731615720580e-01 -9.521882114696589294e-02
-2.294820571316612545e-01 -3.951116104417027319e-01 -1.213764205846953048e-01
-1.274760689568076866e-01 -4.894644144351057435e-01 -4.455677218461355504e-02
-1.883780035264805142e-01 5.136867836639960605e-02 -1.914117689693696223e-01
-1.433321802695492198e-01 -4.544472693282612386e-01 -1.176185819695205748e-01
-1.861209110668022859e-01 -1.306640638528994147e-01 -2.455403270050096065e-01
-1.255438649381262484e-01 -3.308568763444957872e-01 -2.262914805016349729e-01
-8.293709088078490832e-02 -7.409872502685851958e-02 -2.627245961667383778e-01
-4.488803173837272592e-02 -2.241254320686571211e-01 -2.607906531668653383e-01
-5.778871869538328154e-02 -3.051853877028548601e-01 -2.400823954967900298e-01
6.933445338944499436e-02 -3.553085890772759159e-01 -1.641864654730656181e-01
7.111602818333581655e-02 7.790943989140272641e-02 -1.428095505832082113e-01
1.525112891953009364e-01 -1.671772220211436566e-01 -1.425730385902093389e-01
1.635091738419923169e-01 -2.038581383228289801e-02 -7.259129809666196864e-02
1.863623081801869530e-01 -1.606691605382634946e-01 -6.215477803302005277e-02
1.860597960800501049e-01 -8.942066294581488273e-02 -3.125312549919054272e-02
1.775889425945270939e-01 -2.610350903930070965e-01 -1.126311690894651364e-02
-2.344530852828673184e-03 -4.843526491371444864e-01 -1.771665515139877762e-03
1.874771611551337980e-01 -1.746344797761466561e-01 2.508785818118245206e-02
1.673698763635819708e-01 -1.919017332593175168e-02 3.993461629869029389e-02
1.727239639120116965e-01 -9.728079191922173186e-02 7.539791998699200970e-02
1.665128570191360680e-01 -2.398893093849127078e-01 7.477779444444791757e-02
8.554297965405206761e-02 8.554345817627583004e-02 9.907725635580637291e-02
1.374989359402138955e-01 -1.654442203122245270e-01 1.448631144616539801e-01
1.039356910079401175e-01 -2.893834800076415803e-01 1.517027542469466528e-01
5.438731001724109171e-02 -3.558292982123242298e-01 1.561732813257111774e-01
3.977107071849014797e-02 3.777275649648492717e-02 1.772075415709397916e-01
4.590873630782910109e-02 -4.654621172417983122e-02 2.060602168536818102e-01
-3.911739318753588296e-02 8.070176686650581965e-02 1.807716190068306139e-01
-9.479836629895829758e-02 -2.981754513201692047e-01 2.235338189407063414e-01
-1.100390923997783266e-01 -3.853285868891334620e-01 1.787177106278184380e-01
-8.242936460454086678e-02 -4.478555864744457327e-01 1.249425775901668867e-01
-1.888650226570900426e-01 -2.855705533295971876e-01 2.032813129858487555e-01
-2.265427099231723806e-01 -2.634159763056893519e-02 1.842681888197268136e-01
-2.550827902725592011e-01 -2.565838064055482315e-01 1.667852298436166836e-01
-2.482112241496278460e-01 -3.481030410403000985e-01 1.249598292707110592e-01
-2.297985879068757398e-01 6.087823153217783445e-02 1.338388555900609234e-01
-2.957476590054699539e-01 -1.970419633881451138e-01 1.358875246111849178e-01
-3.119199946875143015e-01 -1.117417892092426313e-01 1.094828721846853037e-01
-2.870051418703375434e-01 -3.041021473069096398e-02 1.252605446213309048e-01
-2.797507465290141049e-01 -3.607317081877161047e-01 5.298546634207472461e-02
-3.449913209098987021e-01 -1.603767754118392563e-01 -1.592450024404423606e-03
-2.702243690174346291e-01 9.058667041744072890e-02 4.269668100568255013e-03
-3.263880428785118637e-01 -3.747339076041567019e-02 3.523707954466567815e-02
-2.584401925137772560e-01 -4.069771660701563265e-01 -1.268926338740771105e-02
-2.005025956799353359e-01 -4.533193987842354389e-01 -5.364111305908087629e-02
-3.287314812507123873e-01 -8.657125175611445067e-02 -8.483433757088074123e-02
-2.662392182619103265e-01 -2.568119279554665901e-01 -1.740010367826804738e-01
-2.804289937429688129e-01 -1.006646227632223278e-01 -1.730233146419092871e-01
-2.668405141313899276e-01 -1.735824191510288439e-01 -1.920330648734163981e-01
-2.525824237248898707e-01 -2.324085279657835817e-02 -1.821827892227929069e-01
-1.840250875899033522e-01 -2.232892774345062448e-01 -2.413800269728181846e-01
-1.988819222387389063e-01 -3.352238490689539718e-01 -1.952599600966745674e-01
-9.355341034860617522e-02 -4.004485492521940859e-01 -1.896387776120031510e-01
-1.135511577905180119e-01 -1.573767722888508580e-01 -2.660925622802430279e-01
-2.813875390087298631e-02 -1.635259475613462798e-01 -3.147323277676624120e-01
-1.676631262046321402e-02 -1.079960361406900327e-01 -3.682350934770836082e-01
-1.672365410687623732e-01 -6.284849773867773326e-02 -2.863768166859614328e-01
-1.314253088457320862e-01 -9.199806120652145869e-02 -2.070890620617938938e-01
-1.241286541559470658e-01 -1.078253514224595221e-02 -2.347827353061745503e-01
9.201051793553144398e-02 1.399799232378114711e-01 -2.920510722208433108e-01
1.211655923690682923e-01 9.458253137396505084e-02 -3.562868507952313690e-01
4.115163588079493590e-02 -4.874960108111341811e-02 -2.309689362023518155e-01
4.852723466605053659e-02 3.723330910471137761e-02 -2.177824527878361727e-01
-3.404961679122573553e-02 1.122606399542335764e-01 -2.528848643628236470e-01
-1.065033828550958628e-01 2.244844597753020699e-02 -3.461574265587462618e-01
6.919888814060355664e-02 8.740115000401535439e-02 -4.099269188987086920e-01
-8.072288583581499477e-03 6.823603547798220981e-02 -4.041228784155609932e-01
-3.381086987320603560e-02 -1.563812111732662199e-02 -3.956936075740030145e-01
5.137320248785325061e-02 -1.212605104729846429e-02 -3.874575361706760757e-01
-1.261228133420569941e-01 -8.160954968428942025e-02 -3.628899191346737774e-01
-4.052654592419473178e-02 -1.415180878565600597e-01 -2.120107194791421290e-01
4.656574598807482201e-02 1.485438727010804372e-01 -3.689228245248280524e-01
6.503619342049293817e-02 1.115706678142393826e-01 -2.308743500298740536e-01
1.285881472092311739e-01 6.710619562981345876e-02 -2.745451502763253115e-01
-5.702278579898233807e-02 -2.597417195676817239e-02 -1.937255719348669747e-01
4.296380999936167538e-03 1.660631317836948062e-01 -2.972090903885326041e-01
-5.885169880695314409e-02 4.475700100584593250e-02 -2.228472070792212878e-01
-7.501112417070433802e-02 8.647366148215318571e-02 -3.117930102299074635e-01
4.752100889209005807e-02 -8.415271595363915669e-02 -3.020920128352372336e-01
9.148144442025316070e-02 -2.134697714161457732e-02 -3.100027284651764581e-01
6.626177484956208286e-02 -4.303014755876416464e-01 -2.007927262613765640e-01
5.852483850399682608e-02 -2.938179110159915841e-01 -6.490886645457707993e-02
3.374254060812370004e-01 -3.923336580223399928e-01 -1.128168128810779630e-01
2.552557831023974932e-01 -4.155037762442775895e-01 -5.659741402802327681e-02
2.728606248768720710e-01 -2.830245499458966529e-01 -1.887835507532414914e-01
3.194857409937097992e-01 -2.963927462518259892e-01 -1.169946638267229944e-01
1.734568760427020462e-01 -2.711288713571128883e-01 -1.333777238799236098e-01
1.943417251864173467e-01 -2.860222711123543426e-01 -2.028709679802187249e-01
2.156584362201333649e-01 -3.689647550891873573e-01 -2.473055876178304469e-01
2.779244050448715364e-01 -4.482168436013012069e-01 -1.426067527497896781e-01
1.730486640180248573e-02 -3.756710736431007058e-01 -2.297525435980859176e-01
3.676433455927818200e-02 -2.739560620137703739e-01 -1.318540626234361857e-01
7.314933732017132872e-02 -3.713620332299126803e-01 -3.832026410277449002e-02
3.388873015090434482e-01 -3.518785468133447947e-01 -1.822407784337956582e-01
-8.196586983989565672e-03 -3.267244745397698269e-01 -1.769085376478596872e-01
-4.084161220566253844e-03 -4.053974280138745723e-01 -1.557141508364496829e-01
-3.654173467225249085e-03 -3.540696154696604814e-01 -7.924700368347148416e-02
2.745351214436949627e-01 -3.333271444116505666e-01 -4.327420975366753908e-02
1.790096840114032140e-01 -3.813246900680575813e-01 -3.929217572805278336e-02
2.503920320519980036e-01 -2.782837304219433761e-01 -9.976785915629624024e-02
1.130883503342663687e-01 -2.891601903657884942e-01 -2.131199118616968757e-01
2.774850750181143688e-01 -4.176240097268989482e-01 -2.226964641929417565e-01
1.420498851675272289e-01 -4.108012589104620416e-01 -2.291439346177294722e-01
1.239611174293918372e-01 -4.463023107030321679e-01 -1.362139380796755228e-01
2.075985808705454183e-01 -4.434556157605496485e-01 -1.831513307195437590e-01
5.869408343759982549e-02 -4.413766353361571193e-01 1.506274656265957912e-01
5.800594858045587249e-02 -4.209172538577762857e-01 2.423984931319135283e-01
6.529146201338115185e-02 -3.273488012482583254e-01 6.822825852834475679e-02
3.508789954216028073e-02 -2.776841636620446807e-01 1.998521624947844355e-01
1.804530694543642153e-03 -3.992247699724174037e-01 1.140764890335152770e-01
2.498181502103440643e-01 -3.639831082006405039e-01 2.743082854510320856e-01
3.107251084536175600e-01 -2.954251232736465616e-01 1.218432946945066930e-01
2.206682311764842086e-01 -2.730773671644081135e-01 1.815246457141029279e-01
1.147904978362478956e-01 -4.033254532127595882e-01 7.587030747612803772e-02
1.587601370074837592e-01 -3.217014195536745946e-01 6.943535248428396589e-02
2.651776443452864007e-01 -4.485475779005181352e-01 1.619997594039062350e-01
2.370590032568737140e-02 -2.892273834359271878e-01 1.174063432639800841e-01
7.751364446121310203e-02 -3.104779972276472400e-01 2.595106459875623495e-01
5.366983911521431769e-03 -3.426470613099035223e-01 2.432402748679551863e-01
3.346623759024219824e-01 -4.020963316924511255e-01 1.463669666724379259e-01
3.138347843442640395e-01 -4.026841227906329257e-01 2.422269688912635777e-01
1.684320066616068201e-01 -3.207957802464417862e-01 2.659340539580008489e-01
1.243855501634354432e-01 -3.898972110361811905e-01 2.686893463243060221e-01
3.030380145182849372e-01 -2.997150900185945321e-01 2.292539973829201549e-01
2.949694270862091017e-01 -4.068294848314123469e-01 8.496581012113588183e-02
2.509691903922973144e-01 -3.347079325304533892e-01 6.682279042077263198e-02
1.834499629133450604e-01 -4.474363240835148581e-01 1.769688991257585553e-01
1.955951475554616459e-01 -4.258420794384723540e-01 9.243146101738426712e-02
6.280332695674885335e-02 -1.320586570865656985e-02 3.460601442064881184e-01
1.483740785349419722e-01 -1.116310868174916898e-01 2.503590081480292495e-01
8.334883033473708402e-02 -1.433283877452027344e-01 4.018358000523979956e-01
4.857124584733577000e-02 -6.707222889993139092e-02 3.967553009142785747e-01
-1.569059121220639785e-01 -4.210229112410191626e-03 2.590931339004621248e-01
-1.548099427904343905e-02 -1.026011427477167837e-01 1.743223363538898285e-01
8.221762322791240085e-02 -8.642200763928496038e-03 2.728449393343670981e-01
1.460221927068068859e-01 -4.042968272413305364e-02 2.944411725200785090e-01
1.271191185609810981e-01 -1.796481001991364068e-01 3.182348436112846235e-01
2.989060776970349498e-02 -1.664072326250388423e-01 3.505882470443626997e-01
1.506511824685654033e-01 -7.964808311139018326e-02 3.678384511489326547e-01
-1.164781066079784849e-01 -7.985835709135620519e-02 3.094537728621728490e-01
-2.878260502009664934e-02 -9.318981959660427306e-02 3.600043942944177733e-01
-1.768255197697845768e-02 -1.897294347499440398e-02 3.464194946546278464e-01
-1.178029974999752205e-01 -8.053855411153146293e-02 1.553502043517525077e-01
-1.600253293893078577e-01 -8.612424738326149298e-02 2.362292483717431391e-01
-4.764319533982273869e-02 -1.408609410131720641e-01 3.056113886526913292e-01
-6.684423223009242077e-02 -5.242227006282698665e-03 1.465467098382926547e-01
7.425460083578695747e-02 -1.311821825664111207e-01 2.199763731511597542e-01
-9.054884388535158757e-02 1.832667141432457389e-02 2.978639568945842808e-01
-1.040193042525863615e-01 4.293274567439404887e-02 2.207938152551688071e-01
-5.948172901458232548e-03 2.302500629685206093e-02 2.441964598481220494e-01
================================================
FILE: example/toycar/mesh.obj
================================================
# Blender v3.6.1 OBJ File: 'car.blend'
# www.blender.org
mtllib 0000_Collection.mtl
o Cylinder_Cylinder.004
v 0.115303 -0.149343 0.309622
v 0.070618 -0.149343 0.387019
v 0.113553 -0.169863 0.308611
v 0.068868 -0.169863 0.386008
v 0.108369 -0.189595 0.305618
v 0.063684 -0.189595 0.383015
v 0.099951 -0.207780 0.300758
v 0.055266 -0.207780 0.378155
v 0.088623 -0.223720 0.294218
v 0.043938 -0.223720 0.371615
v 0.074819 -0.236801 0.286248
v 0.030134 -0.236801 0.363645
v 0.059070 -0.246521 0.277156
v 0.014385 -0.246521 0.354552
v 0.041982 -0.252506 0.267290
v -0.002703 -0.252506 0.344686
v 0.024211 -0.254527 0.257029
v -0.020474 -0.254527 0.334426
v 0.006439 -0.252506 0.246769
v -0.038246 -0.252506 0.324166
v -0.010649 -0.246521 0.236903
v -0.055334 -0.246521 0.314300
v -0.026398 -0.236801 0.227811
v -0.071083 -0.236801 0.305207
v -0.040202 -0.223720 0.219841
v -0.084887 -0.223720 0.297238
v -0.051530 -0.207780 0.213300
v -0.096215 -0.207780 0.290697
v -0.059948 -0.189595 0.208440
v -0.104633 -0.189595 0.285837
v -0.065132 -0.169863 0.205448
v -0.109817 -0.169863 0.282844
v -0.066882 -0.149343 0.204437
v -0.111567 -0.149343 0.281834
v -0.065132 -0.128822 0.205448
v -0.109817 -0.128822 0.282844
v -0.059948 -0.109090 0.208440
v -0.104633 -0.109090 0.285837
v -0.051530 -0.090905 0.213300
v -0.096215 -0.090905 0.290697
v -0.040202 -0.074966 0.219841
v -0.084887 -0.074966 0.297238
v -0.026398 -0.061885 0.227811
v -0.071083 -0.061885 0.305207
v -0.010649 -0.052165 0.236903
v -0.055334 -0.052165 0.314300
v 0.006439 -0.046179 0.246769
v -0.038246 -0.046179 0.324166
v 0.024211 -0.044158 0.257029
v -0.020474 -0.044158 0.334426
v 0.041982 -0.046179 0.267290
v -0.002703 -0.046179 0.344686
v 0.059070 -0.052165 0.277156
v 0.014385 -0.052165 0.354552
v 0.074819 -0.061885 0.286248
v 0.030134 -0.061885 0.363645
v 0.088623 -0.074966 0.294218
v 0.043938 -0.074966 0.371615
v 0.099951 -0.090905 0.300758
v 0.055266 -0.090905 0.378155
v 0.108369 -0.109090 0.305618
v 0.063684 -0.109090 0.383015
v 0.113553 -0.128822 0.308611
v 0.068868 -0.128822 0.386008
vt 1.000000 1.000000
vt 0.968750 0.500000
vt 1.000000 0.500000
vt 0.968750 1.000000
vt 0.937500 0.500000
vt 0.937500 1.000000
vt 0.906250 0.500000
vt 0.906250 1.000000
vt 0.875000 0.500000
vt 0.875000 1.000000
vt 0.843750 0.500000
vt 0.843750 1.000000
vt 0.812500 0.500000
vt 0.812500 1.000000
vt 0.781250 0.500000
vt 0.781250 1.000000
vt 0.750000 0.500000
vt 0.750000 1.000000
vt 0.718750 0.500000
vt 0.718750 1.000000
vt 0.687500 0.500000
vt 0.687500 1.000000
vt 0.656250 0.500000
vt 0.656250 1.000000
vt 0.625000 0.500000
vt 0.625000 1.000000
vt 0.593750 0.500000
vt 0.593750 1.000000
vt 0.562500 0.500000
vt 0.562500 1.000000
vt 0.531250 0.500000
vt 0.531250 1.000000
vt 0.500000 0.500000
vt 0.500000 1.000000
vt 0.468750 0.500000
vt 0.468750 1.000000
vt 0.437500 0.500000
vt 0.437500 1.000000
vt 0.406250 0.500000
vt 0.406250 1.000000
vt 0.375000 0.500000
vt 0.375000 1.000000
vt 0.343750 0.500000
vt 0.343750 1.000000
vt 0.312500 0.500000
vt 0.312500 1.000000
vt 0.281250 0.500000
vt 0.281250 1.000000
vt 0.250000 0.500000
vt 0.250000 1.000000
vt 0.218750 0.500000
vt 0.218750 1.000000
vt 0.187500 0.500000
vt 0.187500 1.000000
vt 0.156250 0.500000
vt 0.156250 1.000000
vt 0.125000 0.500000
vt 0.125000 1.000000
vt 0.093750 0.500000
vt 0.093750 1.000000
vt 0.062500 0.500000
vt 0.158156 0.028269
vt 0.471731 0.158156
vt 0.341844 0.471731
vt 0.062500 1.000000
vt 0.031250 0.500000
vt 0.031250 1.000000
vt 0.000000 0.500000
vt 0.796822 0.014612
vt 0.514612 0.203178
vt 0.703178 0.485388
vt 0.296822 0.485388
vt 0.250000 0.490000
vt 0.203178 0.485388
vt 0.158156 0.471731
vt 0.116663 0.449553
vt 0.080294 0.419706
vt 0.050447 0.383337
vt 0.028269 0.341844
vt 0.014612 0.296822
vt 0.010000 0.250000
vt 0.014612 0.203178
vt 0.028269 0.158156
vt 0.050447 0.116663
vt 0.080294 0.080294
vt 0.116663 0.050447
vt 0.203178 0.014612
vt 0.341844 0.028269
vt 0.250000 0.010000
vt 0.296822 0.014612
vt 0.383337 0.050447
vt 0.419706 0.080294
vt 0.449553 0.116663
vt 0.485388 0.203178
vt 0.471731 0.341844
vt 0.490000 0.250000
vt 0.485388 0.296822
vt 0.449553 0.383337
vt 0.419706 0.419706
vt 0.383337 0.449553
vt 0.000000 1.000000
vt 0.750000 0.490000
vt 0.796822 0.485388
vt 0.841844 0.471731
vt 0.883337 0.449553
vt 0.919706 0.419706
vt 0.949553 0.383337
vt 0.971731 0.341844
vt 0.985388 0.296822
vt 0.990000 0.250000
vt 0.985388 0.203178
vt 0.971731 0.158156
vt 0.949553 0.116663
vt 0.919706 0.080294
vt 0.883337 0.050447
vt 0.841844 0.028269
vt 0.750000 0.010000
vt 0.703178 0.014612
vt 0.658156 0.028269
vt 0.616663 0.050447
vt 0.580294 0.080294
vt 0.550447 0.116663
vt 0.528269 0.158156
vt 0.510000 0.250000
vt 0.514612 0.296822
vt 0.528269 0.341844
vt 0.550447 0.383337
vt 0.580294 0.419706
vt 0.616663 0.449553
vt 0.658156 0.471731
vn 0.8619 -0.0980 0.4976
vn 0.8287 -0.2903 0.4785
vn 0.7638 -0.4714 0.4410
vn 0.6694 -0.6344 0.3865
vn 0.5494 -0.7730 0.3172
vn 0.4082 -0.8819 0.2357
vn 0.2514 -0.9569 0.1451
vn 0.0849 -0.9952 0.0490
vn -0.0849 -0.9952 -0.0490
vn -0.2514 -0.9569 -0.1451
vn -0.4082 -0.8819 -0.2357
vn -0.5494 -0.7730 -0.3172
vn -0.6694 -0.6344 -0.3865
vn -0.7638 -0.4714 -0.4410
vn -0.8287 -0.2903 -0.4785
vn -0.8619 -0.0980 -0.4976
vn -0.8619 0.0980 -0.4976
vn -0.8287 0.2903 -0.4785
vn -0.7638 0.4714 -0.4410
vn -0.6694 0.6344 -0.3865
vn -0.5494 0.7730 -0.3172
vn -0.4082 0.8819 -0.2357
vn -0.2514 0.9569 -0.1451
vn -0.0849 0.9952 -0.0490
vn 0.0849 0.9952 0.0490
vn 0.2514 0.9569 0.1451
vn 0.4082 0.8819 0.2357
vn 0.5494 0.7730 0.3172
vn 0.6694 0.6344 0.3865
vn 0.7638 0.4714 0.4410
vn -0.5000 -0.0000 0.8660
vn 0.8287 0.2903 0.4785
vn 0.8619 0.0980 0.4976
vn 0.5000 0.0000 -0.8660
usemtl Material.006
s off
f 2/1/1 3/2/1 1/3/1
f 4/4/2 5/5/2 3/2/2
f 6/6/3 7/7/3 5/5/3
f 8/8/4 9/9/4 7/7/4
f 10/10/5 11/11/5 9/9/5
f 12/12/6 13/13/6 11/11/6
f 14/14/7 15/15/7 13/13/7
f 16/16/8 17/17/8 15/15/8
f 18/18/9 19/19/9 17/17/9
f 20/20/10 21/21/10 19/19/10
f 22/22/11 23/23/11 21/21/11
f 24/24/12 25/25/12 23/23/12
f 26/26/13 27/27/13 25/25/13
f 28/28/14 29/29/14 27/27/14
f 30/30/15 31/31/15 29/29/15
f 32/32/16 33/33/16 31/31/16
f 34/34/17 35/35/17 33/33/17
f 36/36/18 37/37/18 35/35/18
f 38/38/19 39/39/19 37/37/19
f 40/40/20 41/41/20 39/39/20
f 42/42/21 43/43/21 41/41/21
f 44/44/22 45/45/22 43/43/22
f 46/46/23 47/47/23 45/45/23
f 48/48/24 49/49/24 47/47/24
f 50/50/25 51/51/25 49/49/25
f 52/52/26 53/53/26 51/51/26
f 54/54/27 55/55/27 53/53/27
f 56/56/28 57/57/28 55/55/28
f 58/58/29 59/59/29 57/57/29
f 60/60/30 61/61/30 59/59/30
f 38/62/31 22/63/31 6/64/31
f 62/65/32 63/66/32 61/61/32
f 64/67/33 1/68/33 63/66/33
f 31/69/34 47/70/34 63/71/34
f 2/1/1 4/4/1 3/2/1
f 4/4/2 6/6/2 5/5/2
f 6/6/3 8/8/3 7/7/3
f 8/8/4 10/10/4 9/9/4
f 10/10/5 12/12/5 11/11/5
f 12/12/6 14/14/6 13/13/6
f 14/14/7 16/16/7 15/15/7
f 16/16/8 18/18/8 17/17/8
f 18/18/9 20/20/9 19/19/9
f 20/20/10 22/22/10 21/21/10
f 22/22/11 24/24/11 23/23/11
f 24/24/12 26/26/12 25/25/12
f 26/26/13 28/28/13 27/27/13
f 28/28/14 30/30/14 29/29/14
f 30/30/15 32/32/15 31/31/15
f 32/32/16 34/34/16 33/33/16
f 34/34/17 36/36/17 35/35/17
f 36/36/18 38/38/18 37/37/18
f 38/38/19 40/40/19 39/39/19
f 40/40/20 42/42/20 41/41/20
f 42/42/21 44/44/21 43/43/21
f 44/44/22 46/46/22 45/45/22
f 46/46/23 48/48/23 47/47/23
f 48/48/24 50/50/24 49/49/24
f 50/50/25 52/52/25 51/51/25
f 52/52/26 54/54/26 53/53/26
f 54/54/27 56/56/27 55/55/27
f 56/56/28 58/58/28 57/57/28
f 58/58/29 60/60/29 59/59/29
f 60/60/30 62/65/30 61/61/30
f 6/64/31 4/72/31 2/73/31
f 2/73/31 64/74/31 6/64/31
f 64/74/31 62/75/31 6/64/31
f 62/75/31 60/76/31 58/77/31
f 58/77/31 56/78/31 54/79/31
f 54/79/31 52/80/31 50/81/31
f 50/81/31 48/82/31 54/79/31
f 48/82/31 46/83/31 54/79/31
f 46/83/31 44/84/31 42/85/31
f 42/85/31 40/86/31 38/62/31
f 38/62/31 36/87/31 30/88/31
f 36/87/31 34/89/31 30/88/31
f 34/89/31 32/90/31 30/88/31
f 30/88/31 28/91/31 26/92/31
f 26/92/31 24/93/31 22/63/31
f 22/63/31 20/94/31 14/95/31
f 20/94/31 18/96/31 14/95/31
f 18/96/31 16/97/31 14/95/31
f 14/95/31 12/98/31 10/99/31
f 10/99/31 8/100/31 6/64/31
f 62/75/31 58/77/31 6/64/31
f 58/77/31 54/79/31 6/64/31
f 46/83/31 42/85/31 54/79/31
f 42/85/31 38/62/31 54/79/31
f 30/88/31 26/92/31 38/62/31
f 26/92/31 22/63/31 38/62/31
f 14/95/31 10/99/31 22/63/31
f 10/99/31 6/64/31 22/63/31
f 6/64/31 54/79/31 38/62/31
f 62/65/32 64/67/32 63/66/32
f 64/67/33 2/101/33 1/68/33
f 63/71/34 1/102/34 3/103/34
f 3/103/34 5/104/34 7/105/34
f 7/105/34 9/106/34 11/107/34
f 11/107/34 13/108/34 15/109/34
f 15/109/34 17/110/34 19/111/34
f 19/111/34 21/112/34 23/113/34
f 23/113/34 25/114/34 27/115/34
f 27/115/34 29/116/34 31/69/34
f 31/69/34 33/117/34 35/118/34
f 35/118/34 37/119/34 39/120/34
f 39/120/34 41/121/34 43/122/34
f 43/122/34 45/123/34 47/70/34
f 47/70/34 49/124/34 51/125/34
f 51/125/34 53/126/34 55/127/34
f 55/127/34 57/128/34 59/129/34
f 59/129/34 61/130/34 63/71/34
f 63/71/34 3/103/34 15/109/34
f 3/103/34 7/105/34 15/109/34
f 7/105/34 11/107/34 15/109/34
f 15/109/34 19/111/34 31/69/34
f 19/111/34 23/113/34 31/69/34
f 23/113/34 27/115/34 31/69/34
f 31/69/34 35/118/34 47/70/34
f 35/118/34 39/120/34 47/70/34
f 39/120/34 43/122/34 47/70/34
f 47/70/34 51/125/34 63/71/34
f 51/125/34 55/127/34 63/71/34
f 55/127/34 59/129/34 63/71/34
f 63/71/34 15/109/34 31/69/34
o Cube.002
v 0.500000 -0.142879 0.005417
v 0.254691 -0.142879 0.430304
v -0.254691 -0.142879 -0.430304
v -0.500000 -0.142879 -0.005417
v -0.123508 -0.142879 0.211951
v 0.121801 -0.142879 -0.212937
v -0.054179 -0.142879 -0.314538
v -0.179676 -0.142879 -0.386994
v -0.424985 -0.142879 0.037894
v -0.299487 -0.142879 0.110349
v 0.030356 -0.142879 0.300784
v 0.156422 -0.142879 0.373569
v 0.401731 -0.142879 -0.051319
v 0.275665 -0.142879 -0.124103
v 0.294594 0.112775 0.003191
v 0.500000 0.024242 0.005417
v 0.317592 0.112219 0.003441
v 0.340301 0.110556 0.003687
v 0.362435 0.107807 0.003926
v 0.383716 0.104008 0.004157
v 0.403876 0.099205 0.004375
v 0.422662 0.093460 0.004579
v 0.439838 0.086844 0.004765
v 0.455187 0.079442 0.004931
v 0.468516 0.071345 0.005076
v 0.479658 0.062655 0.005196
v 0.488473 0.053483 0.005292
v 0.494850 0.043943 0.005361
v 0.498708 0.034155 0.005403
v 0.150061 0.112775 0.253530
v 0.254691 0.024242 0.430304
v 0.161776 0.112219 0.273322
v 0.173343 0.110556 0.292866
v 0.184618 0.107807 0.311915
v 0.195458 0.104008 0.330229
v 0.205727 0.099205 0.347579
v 0.215297 0.093460 0.363747
v 0.224046 0.086844 0.378528
v 0.231864 0.079442 0.391738
v 0.238654 0.071345 0.403209
v 0.244329 0.062655 0.412798
v 0.248820 0.053483 0.420384
v 0.252068 0.043943 0.425872
v 0.254033 0.034155 0.429193
v -0.150061 0.107542 -0.253530
v -0.254691 0.019046 -0.430304
v -0.161776 0.106985 -0.273322
v -0.173343 0.105323 -0.292866
v -0.184618 0.102576 -0.311915
v -0.195458 0.098778 -0.330229
v -0.205727 0.093977 -0.347579
v -0.215297 0.088235 -0.363747
v -0.224046 0.081622 -0.378528
v -0.231864 0.074222 -0.391738
v -0.238654 0.066128 -0.403209
v -0.244329 0.057443 -0.412798
v -0.248820 0.048274 -0.420384
v -0.252068 0.038738 -0.425872
v -0.254033 0.028954 -0.429193
v -0.294594 0.107542 -0.003191
v -0.500000 0.019046 -0.005417
v -0.317592 0.106985 -0.003441
v -0.340301 0.105323 -0.003687
v -0.362435 0.102576 -0.003926
v -0.383716 0.098778 -0.004157
v -0.403876 0.093977 -0.004375
v -0.422662 0.088235 -0.004579
v -0.439838 0.081622 -0.004765
v -0.455187 0.074222 -0.004931
v -0.468516 0.066128 -0.005076
v -0.479658 0.057443 -0.005196
v -0.488473 0.048274 -0.005292
v -0.494850 0.038738 -0.005361
v -0.498708 0.028954 -0.005403
v -0.073120 0.254527 0.124676
v -0.123508 0.165160 0.211951
v -0.078762 0.253965 0.134448
v -0.084333 0.252287 0.144097
v -0.089762 0.249512 0.153501
v -0.094983 0.245677 0.162543
v -0.099928 0.240830 0.171109
v -0.104537 0.235030 0.179091
v -0.108750 0.228352 0.186389
v -0.112515 0.220880 0.192910
v -0.115785 0.212706 0.198574
v -0.118518 0.203935 0.203308
v -0.120681 0.194676 0.207053
v -0.122245 0.185046 0.209763
v -0.123191 0.175166 0.211402
v 0.071413 0.254527 -0.125662
v 0.121801 0.165160 -0.212937
v 0.077054 0.253965 -0.135434
v 0.082625 0.252287 -0.145083
v 0.088055 0.249512 -0.154487
v 0.093275 0.245677 -0.163529
v 0.098221 0.240830 -0.172095
v 0.102829 0.235030 -0.180077
v 0.107042 0.228352 -0.187375
v 0.110808 0.220880 -0.193896
v 0.114077 0.212706 -0.199560
v 0.116811 0.203935 -0.204294
v 0.118973 0.194676 -0.208039
v 0.120537 0.185046 -0.210749
v 0.121484 0.175166 -0.212388
v -0.249099 0.254527 0.023075
v -0.299487 0.164852 0.110349
v -0.254741 0.253964 0.032846
v -0.260312 0.252279 0.042495
v -0.265741 0.249495 0.051900
v -0.270962 0.245647 0.060942
v -0.275907 0.240782 0.069508
v -0.280516 0.234963 0.077490
v -0.284729 0.228262 0.084787
v -0.288494 0.220764 0.091309
v -0.291764 0.212562 0.096972
v -0.294497 0.203761 0.101706
v -0.296660 0.194470 0.105452
v -0.298224 0.184807 0.108161
v -0.299171 0.174892 0.109801
v -0.424985 0.018496 0.037894
v -0.374597 0.107542 -0.049381
v -0.424668 0.028466 0.037345
v -0.423721 0.038311 0.035705
v -0.422157 0.047906 0.032996
v -0.419995 0.057132 0.029251
v -0.417261 0.065871 0.024516
v -0.413992 0.074015 0.018853
v -0.410226 0.081461 0.012331
v -0.406013 0.088115 0.005034
v -0.401405 0.093893 -0.002948
v -0.396459 0.098724 -0.011514
v -0.391239 0.102545 -0.020556
v -0.385809 0.105309 -0.029961
v -0.380238 0.106982 -0.039610
v -0.230064 0.107542 -0.299719
v -0.179676 0.018496 -0.386994
v -0.224422 0.106982 -0.309491
v -0.218851 0.105309 -0.319140
v -0.213422 0.102545 -0.328544
v -0.208201 0.098724 -0.337587
v -0.203256 0.093893 -0.346152
v -0.198647 0.088115 -0.354134
v -0.194434 0.081461 -0.361432
v -0.190669 0.074015 -0.367954
v -0.187399 0.065871 -0.373617
v -0.184666 0.057132 -0.378351
v -0.182503 0.047906 -0.382097
v -0.180939 0.038311 -0.384806
v -0.179993 0.028466 -0.386445
v -0.054179 0.164852 -0.314538
v -0.104567 0.254527 -0.227264
v -0.054495 0.174892 -0.313990
v -0.055442 0.184807 -0.312350
v -0.057006 0.194470 -0.309641
v -0.059168 0.203761 -0.305895
v -0.061902 0.212562 -0.301161
v -0.065172 0.220764 -0.295498
v -0.068937 0.228262 -0.288976
v -0.073150 0.234963 -0.281679
v -0.077758 0.240782 -0.273697
v -0.082704 0.245647 -0.265131
v -0.087924 0.249495 -0.256089
v -0.093354 0.252279 -0.246684
v -0.098925 0.253964 -0.237035
v 0.225277 0.254527 -0.036829
v 0.275665 0.164849 -0.124103
v 0.230918 0.253964 -0.046600
v 0.236489 0.252279 -0.056249
v 0.241919 0.249495 -0.065654
v 0.247139 0.245646 -0.074696
v 0.252085 0.240782 -0.083262
v 0.256693 0.234962 -0.091243
v 0.260906 0.228261 -0.098541
v 0.264672 0.220763 -0.105063
v 0.267942 0.212561 -0.110726
v 0.270675 0.203759 -0.115460
v 0.272837 0.194468 -0.119206
v 0.274401 0.184804 -0.121915
v 0.275348 0.174890 -0.123554
v 0.401731 0.023690 -0.051319
v 0.351343 0.112775 0.035956
v 0.401414 0.033664 -0.050770
v 0.400468 0.043513 -0.049131
v 0.398904 0.053113 -0.046421
v 0.396741 0.062342 -0.042676
v 0.394008 0.071086 -0.037942
v 0.390738 0.079234 -0.032278
v 0.386973 0.086683 -0.025757
v 0.382760 0.093340 -0.018459
v 0.378151 0.099121 -0.010477
v 0.373206 0.103953 -0.001911
v 0.367985 0.107776 0.007131
v 0.362556 0.110542 0.016535
v 0.356985 0.112215 0.026184
v 0.206810 0.112775 0.286294
v 0.156422 0.023690 0.373569
v 0.201169 0.112215 0.296066
v 0.195598 0.110542 0.305715
v 0.190168 0.107776 0.315119
v 0.184948 0.103953 0.324161
v 0.180002 0.099121 0.332727
v 0.175394 0.093340 0.340709
v 0.171181 0.086683 0.348007
v 0.167415 0.079234 0.354528
v 0.164146 0.071086 0.360192
v 0.161412 0.062342 0.364926
v 0.159250 0.053113 0.368671
v 0.157686 0.043513 0.371381
v 0.156739 0.033664 0.373020
v 0.030356 0.164849 0.300784
v 0.080744 0.254527 0.213510
v 0.030673 0.174890 0.300236
v 0.031619 0.184804 0.298596
v 0.033184 0.194468 0.295887
v 0.035346 0.203759 0.292141
v 0.038079 0.212561 0.287407
v 0.041349 0.220763 0.281744
v 0.045114 0.228261 0.275222
v 0.049328 0.234962 0.267925
v 0.053936 0.240782 0.259943
v 0.058881 0.245647 0.251377
v 0.064102 0.249495 0.242335
v 0.069532 0.252279 0.232930
v 0.075102 0.253964 0.223281
vt 0.823648 0.551352
vt 0.842447 0.698648
vt 0.842447 0.551352
vt 0.800686 0.551352
vt 0.749717 0.698648
vt 0.749717 0.551352
vt 0.375000 0.750000
vt 0.536104 0.774850
vt 0.375000 0.774850
vt 0.538425 0.000000
vt 0.375000 0.250000
vt 0.375000 0.000000
vt 0.375000 0.500000
vt 0.350150 0.750000
vt 0.350150 0.500000
vt 0.537885 0.282553
vt 0.375000 0.282553
vt 0.568585 0.324314
vt 0.375000 0.375283
vt 0.375000 0.324314
vt 0.250283 0.500000
vt 0.199314 0.750000
vt 0.199314 0.500000
vt 0.375000 0.874717
vt 0.568585 0.925686
vt 0.375000 0.925686
vt 0.568587 0.433578
vt 0.375000 0.433578
vt 0.536104 0.475150
vt 0.375000 0.475150
vt 0.537885 0.967447
vt 0.375000 0.967447
vt 0.536652 0.500000
vt 0.375000 0.816422
vt 0.308578 0.500000
vt 0.250283 0.750000
vt 0.308578 0.750000
vt 0.568587 0.816422
vt 0.157553 0.500000
vt 0.125000 0.750000
vt 0.125000 0.500000
vt 0.157553 0.750000
vt 0.691422 0.698648
vt 0.691422 0.551352
vt 0.800686 0.698648
vt 0.829398 0.704398
vt 0.823648 0.698648
vt 0.829398 0.545602
vt 0.835075 0.710075
vt 0.835075 0.539925
vt 0.840609 0.715609
vt 0.845929 0.529071
vt 0.845929 0.720929
vt 0.850969 0.725969
vt 0.850969 0.524031
vt 0.855666 0.730666
vt 0.855666 0.519334
vt 0.875000 0.734959
vt 0.625000 0.234959
vt 0.592404 0.011203
vt 0.625000 0.015041
vt 0.592404 0.238797
vt 0.584486 0.007871
vt 0.584486 0.242129
vt 0.575988 0.005085
vt 0.575988 0.244915
vt 0.567019 0.002882
vt 0.567019 0.247118
vt 0.557690 0.001287
vt 0.557690 0.248713
vt 0.548118 0.000323
vt 0.548118 0.249677
vt 0.649850 0.551352
vt 0.670602 0.545602
vt 0.649850 0.545602
vt 0.664925 0.539925
vt 0.649850 0.539925
vt 0.659391 0.534391
vt 0.649850 0.534391
vt 0.654071 0.529071
vt 0.649850 0.529071
vt 0.649031 0.524031
vt 0.649850 0.524031
vt 0.649850 0.519334
vt 0.644334 0.519334
vt 0.649850 0.500000
vt 0.625000 0.484959
vt 0.591530 0.475150
vt 0.625000 0.475150
vt 0.591736 0.488797
vt 0.583400 0.475150
vt 0.583656 0.492129
vt 0.574675 0.475150
vt 0.574985 0.494915
vt 0.565464 0.475150
vt 0.565832 0.497118
vt 0.555885 0.475150
vt 0.556312 0.498713
vt 0.546057 0.475150
vt 0.546544 0.499677
vt 0.676352 0.698648
vt 0.676352 0.551352
vt 0.670602 0.704398
vt 0.664925 0.710075
vt 0.659391 0.715609
vt 0.565832 0.747118
vt 0.556312 0.501287
vt 0.565832 0.502882
vt 0.556312 0.748713
vt 0.546544 0.500323
vt 0.536652 0.750000
vt 0.842447 0.704398
vt 0.842447 0.710075
vt 0.842447 0.715609
vt 0.842447 0.720929
vt 0.842447 0.725969
vt 0.842447 0.730666
vt 0.842447 0.750000
vt 0.625000 0.984959
vt 0.592200 0.967447
vt 0.625000 0.967447
vt 0.592404 0.988797
vt 0.584233 0.967447
vt 0.584486 0.992129
vt 0.575683 0.967447
vt 0.575988 0.994915
vt 0.566657 0.967447
vt 0.567019 0.997118
vt 0.557270 0.967447
vt 0.557690 0.998712
vt 0.547639 0.967447
vt 0.548118 0.999677
vt 0.800687 0.545602
vt 0.749717 0.539925
vt 0.800686 0.539925
vt 0.749717 0.534391
vt 0.800686 0.534391
vt 0.749717 0.529071
vt 0.800686 0.529071
vt 0.749717 0.524031
vt 0.800686 0.524031
vt 0.749717 0.519334
vt 0.800686 0.519334
vt 0.749717 0.500000
vt 0.800686 0.500000
vt 0.625000 0.324314
vt 0.603833 0.375283
vt 0.603759 0.324314
vt 0.598691 0.375283
vt 0.598600 0.324314
vt 0.593173 0.375283
vt 0.593063 0.324314
vt 0.587349 0.375283
vt 0.587218 0.324314
vt 0.581291 0.375283
vt 0.581139 0.324314
vt 0.575075 0.375283
vt 0.574902 0.324314
vt 0.568781 0.375283
vt 0.691422 0.704398
vt 0.749717 0.710075
vt 0.691422 0.710075
vt 0.749717 0.715609
vt 0.691422 0.715609
vt 0.749717 0.720929
vt 0.691422 0.720929
vt 0.749717 0.725969
vt 0.691422 0.725969
vt 0.749717 0.730666
vt 0.691422 0.730666
vt 0.749717 0.750000
vt 0.691422 0.750000
vt 0.625000 0.816422
vt 0.603833 0.874717
vt 0.603760 0.816422
vt 0.598691 0.874717
vt 0.598601 0.816422
vt 0.593173 0.874717
vt 0.593064 0.816422
vt 0.587349 0.874717
vt 0.587219 0.816422
vt 0.581291 0.874717
vt 0.581140 0.816422
vt 0.575075 0.874717
vt 0.574903 0.816422
vt 0.568781 0.874717
vt 0.749717 0.545602
vt 0.691422 0.545602
vt 0.691422 0.539925
vt 0.691422 0.534391
vt 0.691422 0.529071
vt 0.691422 0.524031
vt 0.691422 0.519334
vt 0.625000 0.433578
vt 0.625000 0.375283
vt 0.603760 0.433578
vt 0.598601 0.433578
vt 0.593064 0.433578
vt 0.587219 0.433578
vt 0.581140 0.433578
vt 0.574903 0.433578
vt 0.691289 0.545602
vt 0.649718 0.545602
vt 0.690893 0.539925
vt 0.649324 0.539925
vt 0.690238 0.534391
vt 0.648674 0.534391
vt 0.689333 0.529071
vt 0.647775 0.529071
vt 0.688188 0.524031
vt 0.646639 0.524031
vt 0.686819 0.519334
vt 0.645279 0.519334
vt 0.685243 0.500000
vt 0.625000 0.439757
vt 0.649850 0.698648
vt 0.649850 0.704398
vt 0.649850 0.710075
vt 0.649850 0.715609
vt 0.654071 0.720929
vt 0.546544 0.750323
vt 0.749717 0.704398
vt 0.800686 0.704398
vt 0.800686 0.710075
vt 0.800686 0.715609
vt 0.800686 0.720929
vt 0.800686 0.725969
vt 0.800686 0.730666
vt 0.625000 0.925686
vt 0.625000 0.874717
vt 0.603759 0.925686
vt 0.598600 0.925686
vt 0.593063 0.925686
vt 0.587218 0.925686
vt 0.581139 0.925686
vt 0.574902 0.925686
vt 0.800823 0.704398
vt 0.842583 0.704398
vt 0.801230 0.710075
vt 0.842987 0.710075
vt 0.801902 0.715609
vt 0.843655 0.715609
vt 0.802832 0.720929
vt 0.844579 0.720929
vt 0.804008 0.725969
vt 0.845747 0.725969
vt 0.805414 0.730666
vt 0.847143 0.730666
vt 0.807033 0.750000
vt 0.625000 0.932033
vt 0.842447 0.545602
vt 0.842447 0.539925
vt 0.840609 0.534391
vt 0.842447 0.529071
vt 0.842447 0.524031
vt 0.842447 0.519334
vt 0.842447 0.500000
vt 0.859959 0.500000
vt 0.625000 0.265041
vt 0.592200 0.282553
vt 0.592404 0.261203
vt 0.584233 0.282553
vt 0.584486 0.257871
vt 0.575683 0.282553
vt 0.575988 0.255085
vt 0.566657 0.282553
vt 0.567019 0.252882
vt 0.557270 0.282553
vt 0.557690 0.251287
vt 0.547639 0.282553
vt 0.548118 0.250323
vt 0.538425 0.250000
vt 0.800823 0.545602
vt 0.842583 0.545602
vt 0.801230 0.539925
vt 0.842987 0.539925
vt 0.801902 0.534391
vt 0.843655 0.534391
vt 0.802832 0.529071
vt 0.844579 0.529071
vt 0.804008 0.524031
vt 0.845747 0.524031
vt 0.805414 0.519334
vt 0.847143 0.519334
vt 0.807033 0.500000
vt 0.625000 0.282553
vt 0.375000 1.000000
vt 0.875000 0.515041
vt 0.640041 0.500000
vt 0.546544 0.749677
vt 0.859959 0.750000
vt 0.538425 1.000000
vt 0.691422 0.500000
vt 0.649850 0.720929
vt 0.546057 0.774850
vt 0.800686 0.750000
vt 0.842447 0.534391
vt 0.625000 0.317967
vt 0.649031 0.725969
vt 0.644334 0.730666
vt 0.625000 0.515041
vt 0.625000 0.734959
vt 0.591736 0.511203
vt 0.591736 0.738797
vt 0.583656 0.507871
vt 0.583656 0.742129
vt 0.574985 0.505085
vt 0.574985 0.744915
vt 0.649850 0.725969
vt 0.649850 0.730666
vt 0.649850 0.750000
vt 0.640041 0.750000
vt 0.625000 0.765041
vt 0.591530 0.774850
vt 0.591736 0.761203
vt 0.583400 0.774850
vt 0.583656 0.757871
vt 0.574675 0.774850
vt 0.574985 0.755085
vt 0.565464 0.774850
vt 0.565832 0.752882
vt 0.555885 0.774850
vt 0.556312 0.751288
vt 0.691289 0.704398
vt 0.649718 0.704398
vt 0.690893 0.710075
vt 0.649324 0.710075
vt 0.690238 0.715609
vt 0.648674 0.715609
vt 0.689333 0.720929
vt 0.647775 0.720929
vt 0.688188 0.725969
vt 0.646639 0.725969
vt 0.686819 0.730666
vt 0.645279 0.730666
vt 0.685242 0.750000
vt 0.625000 0.774850
vt 0.625000 0.810242
vn 0.0000 -1.0000 0.0000
vn 0.0000 1.0000 0.0000
vn 0.5000 0.0000 -0.8660
vn 0.8660 0.0000 0.5000
vn -0.5000 0.0000 0.8660
vn -0.8660 0.0000 -0.5000
vn 0.6042 0.7164 0.3488
vn 0.0240 0.9996 0.0139
vn 0.0725 0.9965 0.0419
vn 0.1222 0.9900 0.0705
vn 0.1738 0.9797 0.1004
vn 0.2284 0.9646 0.1318
vn 0.2868 0.9436 0.1656
vn 0.3501 0.9146 0.2021
vn 0.4194 0.8749 0.2421
vn 0.4952 0.8204 0.2859
vn 0.5775 0.7452 0.3334
vn 0.6639 0.6421 0.3833
vn 0.7483 0.5033 0.4320
vn 0.8190 0.3249 0.4729
vn 0.8605 0.1128 0.4968
vn 0.0248 -0.9988 -0.0429
vn 0.0745 -0.9889 -0.1284
vn 0.1241 -0.9691 -0.2132
vn 0.1749 -0.9391 -0.2959
vn -0.1964 0.8983 0.3931
vn -0.2439 0.8470 0.4724
vn -0.3056 0.7848 0.5391
vn -0.3486 0.7111 0.6106
vn -0.3875 0.6261 0.6766
vn -0.4219 0.5303 0.7354
vn -0.4511 0.4245 0.7851
vn -0.4741 0.3100 0.8241
vn -0.4902 0.1888 0.8509
vn -0.4987 0.0634 0.8645
vn -0.0240 0.9996 -0.0139
vn -0.0725 0.9965 -0.0419
vn -0.1221 0.9900 -0.0705
vn -0.1737 0.9797 -0.1003
vn -0.7483 0.5035 -0.4320
vn -0.8190 0.3250 -0.4729
vn -0.8605 0.1129 -0.4968
vn -0.0249 -0.9988 0.0429
vn -0.0747 -0.9889 0.1284
vn -0.1275 -0.9691 0.2114
vn 0.1683 0.9390 -0.2999
vn 0.2160 0.8985 -0.3822
vn 0.2636 0.8471 -0.4614
vn 0.3082 0.7847 -0.5379
vn 0.3500 0.7109 -0.6100
vn 0.3884 0.6260 -0.6762
vn 0.4226 0.5301 -0.7351
vn 0.4516 0.4243 -0.7849
vn 0.4744 0.3099 -0.8240
vn 0.4904 0.1887 -0.8508
vn 0.4988 0.0634 -0.8644
vn -0.0250 0.9988 0.0432
vn -0.0747 0.9888 0.1295
vn -0.1241 0.9687 0.2151
vn -0.1728 0.9383 0.2996
vn -0.2205 0.8973 0.3823
vn -0.2667 0.8456 0.4625
vn -0.3109 0.7827 0.5392
vn -0.3524 0.7086 0.6113
vn -0.3905 0.6234 0.6774
vn -0.4243 0.5276 0.7359
vn -0.4529 0.4220 0.7854
vn -0.4753 0.3080 0.8241
vn -0.4909 0.1875 0.8508
vn -0.4989 0.0630 0.8644
vn 0.0250 0.9988 -0.0432
vn 0.0747 0.9888 -0.1295
vn 0.1241 0.9687 -0.2151
vn 0.1728 0.9383 -0.2996
vn 0.2205 0.8974 -0.3823
vn 0.2667 0.8456 -0.4625
vn 0.3109 0.7827 -0.5392
vn 0.3524 0.7086 -0.6113
vn 0.3906 0.6234 -0.6773
vn 0.4244 0.5276 -0.7359
vn 0.4529 0.4220 -0.7853
vn 0.4754 0.3080 -0.8241
vn 0.4909 0.1876 -0.8508
vn 0.4989 0.0630 -0.8644
vn -0.0249 0.9988 0.0431
vn -0.0745 0.9888 0.1290
vn -0.1238 0.9689 0.2143
vn -0.1725 0.9387 0.2986
vn -0.2202 0.8980 0.3811
vn -0.2665 0.8464 0.4611
vn -0.3108 0.7837 0.5378
vn -0.3525 0.7098 0.6099
vn -0.3907 0.6248 0.6760
vn -0.4247 0.5289 0.7348
vn -0.4533 0.4232 0.7845
vn -0.4758 0.3090 0.8235
vn -0.4913 0.1882 0.8504
vn -0.4991 0.0632 0.8643
vn -0.6339 0.7016 -0.3255
vn -0.6660 0.6982 -0.2627
vn -0.6955 0.6910 -0.1970
vn -0.7222 0.6798 -0.1277
vn -0.7460 0.6637 -0.0538
vn -0.7663 0.6420 0.0257
vn -0.7820 0.6132 0.1116
vn -0.7916 0.5757 0.2048
vn -0.7929 0.5272 0.3056
vn -0.7826 0.4654 0.4134
vn -0.7570 0.3881 0.5257
vn -0.7117 0.2941 0.6379
vn -0.6439 0.1843 0.7426
vn -0.5530 0.0629 0.8308
vn -0.0246 -0.9988 0.0427
vn -0.0737 -0.9891 0.1277
vn -0.1222 -0.9696 0.2122
vn -0.1695 -0.9403 0.2953
vn 0.4993 0.0638 -0.8641
vn 0.0249 0.9988 -0.0431
vn 0.0745 0.9888 -0.1290
vn 0.1238 0.9689 -0.2143
vn 0.1725 0.9387 -0.2986
vn 0.2202 0.8979 -0.3811
vn 0.2665 0.8464 -0.4611
vn 0.3108 0.7837 -0.5377
vn 0.3525 0.7098 -0.6098
vn 0.3908 0.6248 -0.6760
vn 0.4247 0.5289 -0.7348
vn 0.4534 0.4232 -0.7844
vn 0.4758 0.3090 -0.8235
vn 0.4913 0.1882 -0.8504
vn 0.4991 0.0632 -0.8642
vn 0.6217 0.7160 0.3176
vn 0.6545 0.7123 0.2535
vn 0.6845 0.7047 0.1866
vn 0.7118 0.6928 0.1161
vn 0.7359 0.6758 0.0411
vn 0.7564 0.6529 -0.0393
vn 0.7723 0.6227 -0.1259
vn 0.7819 0.5835 -0.2194
vn 0.7832 0.5331 -0.3199
vn 0.7731 0.4695 -0.4265
vn 0.7479 0.3905 -0.5368
vn 0.7039 0.2951 -0.6461
vn 0.6383 0.1845 -0.7474
vn 0.5509 0.0629 -0.8322
vn 0.0246 -0.9988 -0.0427
vn 0.0737 -0.9891 -0.1278
vn 0.1218 -0.9697 -0.2120
vn -0.1779 0.9373 0.2998
vn -0.2224 0.8978 0.3802
vn -0.2678 0.8469 0.4595
vn -0.3117 0.7848 0.5357
vn -0.3531 0.7114 0.6077
vn -0.3913 0.6267 0.6739
vn -0.4252 0.5311 0.7329
vn -0.4538 0.4254 0.7830
vn -0.4763 0.3110 0.8225
vn -0.4916 0.1896 0.8499
vn -0.4993 0.0638 0.8641
vn 0.5860 0.7160 0.3794
vn 0.5472 0.7124 0.4395
vn 0.5045 0.7049 0.4986
vn 0.4574 0.6931 0.5571
vn 0.4048 0.6764 0.6153
vn 0.3458 0.6537 0.6731
vn 0.2790 0.6238 0.7301
vn 0.2032 0.5848 0.7853
vn 0.1170 0.5348 0.8369
vn 0.0196 0.4713 0.8818
vn -0.0887 0.3923 0.9155
vn -0.2058 0.2967 0.9325
vn -0.3270 0.1857 0.9266
vn -0.4449 0.0633 0.8933
vn 0.7483 0.5033 0.4321
vn 0.0738 -0.9891 -0.1277
vn 0.1226 -0.9696 -0.2119
vn 0.1710 -0.9403 -0.2944
vn 0.2188 -0.9020 -0.3723
vn -0.2632 0.8445 0.4664
vn -0.3075 0.7835 0.5400
vn -0.3492 0.7105 0.6109
vn -0.4217 0.5307 0.7352
vn -0.4508 0.4252 0.7849
vn -0.4738 0.3109 0.8239
vn -0.4900 0.1896 0.8509
vn -0.4986 0.0638 0.8645
vn -0.0738 -0.9891 0.1277
vn -0.1227 -0.9697 0.2115
vn -0.1713 -0.9423 0.2877
vn 0.2180 0.8978 -0.3827
vn 0.2640 0.8469 -0.4617
vn 0.3081 0.7848 -0.5378
vn 0.3497 0.7114 -0.6097
vn 0.3880 0.6267 -0.6758
vn 0.4222 0.5311 -0.7347
vn 0.4512 0.4254 -0.7845
vn 0.4741 0.3110 -0.8237
vn 0.4902 0.1896 -0.8507
vn 0.4987 0.0638 -0.8644
vn -0.1237 0.9689 0.2144
vn -0.1723 0.9387 0.2987
vn -0.2199 0.8980 0.3812
vn -0.2661 0.8464 0.4613
vn -0.3103 0.7837 0.5380
vn -0.3519 0.7098 0.6102
vn -0.3901 0.6248 0.6764
vn -0.4240 0.5289 0.7352
vn -0.4527 0.4232 0.7849
vn -0.4752 0.3090 0.8238
vn -0.4908 0.1882 0.8507
vn -0.4989 0.0632 0.8643
vn 0.1237 0.9689 -0.2144
vn 0.1723 0.9387 -0.2986
vn 0.2199 0.8979 -0.3812
vn 0.2661 0.8464 -0.4613
vn 0.3103 0.7837 -0.5380
vn 0.3519 0.7098 -0.6102
vn 0.3901 0.6248 -0.6764
vn 0.4240 0.5289 -0.7352
vn 0.4527 0.4232 -0.7848
vn 0.4753 0.3090 -0.8238
vn 0.4909 0.1882 -0.8507
vn 0.4989 0.0632 -0.8643
vn -0.0748 0.9888 0.1294
vn -0.1242 0.9687 0.2150
vn -0.1730 0.9383 0.2994
vn -0.2208 0.8974 0.3821
vn -0.2672 0.8456 0.4622
vn -0.3115 0.7827 0.5389
vn -0.3531 0.7086 0.6109
vn -0.3913 0.6234 0.6769
vn -0.4251 0.5276 0.7355
vn -0.4537 0.4220 0.7849
vn -0.4760 0.3080 0.8237
vn -0.4913 0.1875 0.8505
vn -0.4991 0.0630 0.8643
vn -0.6337 0.7016 -0.3257
vn -0.6657 0.6982 -0.2633
vn -0.6950 0.6912 -0.1981
vn -0.7216 0.6801 -0.1292
vn -0.7454 0.6643 -0.0557
vn -0.7657 0.6428 0.0233
vn -0.7815 0.6143 0.1089
vn -0.7914 0.5771 0.2019
vn -0.7929 0.5288 0.3027
vn -0.7830 0.4673 0.4106
vn -0.7576 0.3900 0.5233
vn -0.7126 0.2958 0.6362
vn -0.6446 0.1855 0.7417
vn -0.5533 0.0633 0.8305
vn -0.0247 -0.9988 0.0429
vn -0.0740 -0.9889 0.1287
vn -0.1226 -0.9691 0.2141
vn -0.1688 -0.9391 0.2994
vn 0.4993 0.0634 -0.8641
vn 0.0748 0.9888 -0.1294
vn 0.1243 0.9687 -0.2150
vn 0.1731 0.9383 -0.2994
vn 0.2209 0.8973 -0.3821
vn 0.2672 0.8456 -0.4622
vn 0.3116 0.7827 -0.5388
vn 0.3532 0.7086 -0.6108
vn 0.3914 0.6234 -0.6769
vn 0.4252 0.5276 -0.7354
vn 0.4537 0.4220 -0.7849
vn 0.4761 0.3080 -0.8237
vn 0.4914 0.1875 -0.8505
vn 0.4991 0.0630 -0.8643
vn 0.6216 0.7160 0.3178
vn 0.6542 0.7124 0.2541
vn 0.6841 0.7049 0.1877
vn 0.7112 0.6931 0.1175
vn 0.7353 0.6764 0.0429
vn 0.7558 0.6537 -0.0371
vn 0.7718 0.6238 -0.1234
vn 0.7817 0.5848 -0.2167
vn 0.7832 0.5348 -0.3171
vn 0.7734 0.4713 -0.4239
vn 0.7485 0.3923 -0.5346
vn 0.7047 0.2967 -0.6445
vn 0.6390 0.1857 -0.7465
vn 0.5512 0.0633 -0.8320
vn 0.0247 -0.9988 -0.0430
vn 0.0738 -0.9889 -0.1289
vn 0.1193 -0.9691 -0.2161
vn 0.1680 -0.9390 -0.3000
vn -0.2230 0.8985 0.3781
vn -0.2678 0.8471 0.4590
vn -0.3117 0.7847 0.5358
vn -0.3533 0.7109 0.6081
vn -0.3914 0.6260 0.6745
vn -0.4253 0.5301 0.7335
vn -0.4540 0.4243 0.7835
vn -0.4764 0.3099 0.8228
vn -0.4917 0.1887 0.8501
vn -0.4992 0.0634 0.8641
vn 0.5859 0.7160 0.3796
vn 0.5468 0.7123 0.4401
vn 0.5039 0.7047 0.4995
vn 0.4564 0.6928 0.5584
vn 0.4035 0.6758 0.6168
vn 0.3441 0.6529 0.6747
vn 0.2771 0.6227 0.7318
vn 0.2009 0.5835 0.7869
vn 0.1146 0.5331 0.8382
vn 0.0172 0.4695 0.8828
vn -0.0909 0.3905 0.9161
vn -0.2076 0.2951 0.9326
vn -0.3281 0.1845 0.9264
vn -0.4453 0.0629 0.8932
vn -0.6167 0.7021 -0.3561
vn -0.2283 0.9646 -0.1318
vn -0.2867 0.9436 -0.1655
vn -0.3500 0.9147 -0.2021
vn -0.4192 0.8750 -0.2420
vn -0.4951 0.8205 -0.2858
vn -0.5774 0.7453 -0.3334
vn -0.6638 0.6422 -0.3833
vn 0.2592 0.8804 -0.3970
vn 0.2723 0.8445 -0.4612
vn 0.3139 0.7835 -0.5363
vn 0.3545 0.7105 -0.6079
vn 0.3922 0.6261 -0.6739
vn 0.4258 0.5307 -0.7328
vn 0.4543 0.4252 -0.7828
vn 0.4766 0.3109 -0.8223
vn 0.4919 0.1896 -0.8498
vn -0.5989 0.7017 -0.3860
vn -0.5609 0.6982 -0.4449
vn -0.5190 0.6912 -0.5028
vn -0.4727 0.6801 -0.5603
vn -0.4210 0.6643 -0.6176
vn -0.3626 0.6428 -0.6747
vn -0.2964 0.6143 -0.7313
vn -0.2208 0.5771 -0.7863
vn -0.1343 0.5288 -0.8380
vn -0.0359 0.4673 -0.8834
vn 0.0744 0.3900 -0.9178
vn 0.1946 0.2958 -0.9352
vn 0.3200 0.1855 -0.9291
vn 0.4426 0.0633 -0.8945
vn -0.2165 -0.8986 0.3817
vn 0.2871 0.8470 -0.4474
vn 0.3141 0.7848 -0.5342
vn 0.3544 0.7111 -0.6072
vn 0.4259 0.5303 -0.7331
vn 0.4544 0.4245 -0.7832
vn 0.4767 0.3100 -0.8226
vn 0.4919 0.1888 -0.8500
vn -0.5988 0.7016 -0.3862
vn -0.5605 0.6982 -0.4455
vn -0.5184 0.6910 -0.5038
vn -0.4717 0.6798 -0.5616
vn -0.4196 0.6638 -0.6192
vn -0.3609 0.6420 -0.6764
vn -0.2943 0.6132 -0.7330
vn -0.2184 0.5757 -0.7880
vn -0.1318 0.5272 -0.8395
vn -0.0333 0.4654 -0.8845
vn 0.0768 0.3881 -0.9184
vn 0.1966 0.2941 -0.9353
vn 0.3212 0.1843 -0.9289
vn 0.4430 0.0629 -0.8943
usemtl Material.005
s off
f 94/131/35 245/132/35 259/133/35
f 275/134/36 154/135/36 139/136/36
f 67/137/37 200/138/37 72/139/37
f 80/140/38 66/141/38 65/142/38
f 68/143/35 72/144/35 73/145/35
f 66/141/39 260/146/39 76/147/39
f 274/148/39 69/149/39 75/150/39
f 69/151/35 78/152/35 75/153/35
f 70/154/37 230/155/37 78/156/37
f 69/149/39 170/157/39 74/158/39
f 184/159/39 68/143/39 73/160/39
f 78/156/37 244/161/37 77/162/37
f 125/163/40 67/137/40 68/143/40
f 200/138/37 71/164/37 72/139/37
f 74/165/35 70/166/35 69/151/35
f 73/145/35 71/167/35 74/165/35
f 214/168/37 70/154/37 71/164/37
f 76/169/35 65/170/35 66/171/35
f 75/153/35 77/172/35 76/169/35
f 139/136/36 215/173/36 169/174/36
f 260/146/39 75/150/39 76/147/39
f 74/158/39 184/159/39 73/160/39
f 259/133/41 229/175/41 275/134/41
f 94/131/42 81/176/42 79/177/42
f 96/178/43 82/179/43 81/176/43
f 97/180/44 83/181/44 82/179/44
f 83/181/45 99/182/45 84/183/45
f 99/182/46 85/184/46 84/183/46
f 100/185/47 86/186/47 85/184/47
f 101/187/48 87/188/48 86/186/48
f 102/189/49 88/190/49 87/191/49
f 103/192/50 89/193/50 88/190/50
f 104/194/51 90/195/51 89/193/51
f 105/196/52 91/197/52 90/195/52
f 106/198/53 92/199/53 91/197/53
f 107/200/54 93/201/54 92/199/54
f 108/202/55 80/140/55 93/201/55
f 185/203/56 126/204/56 198/205/56
f 198/205/57 127/206/57 197/207/57
f 197/207/58 128/208/58 196/209/58
f 196/209/59 129/210/59 195/211/59
f 195/211/60 130/212/60 194/213/60
f 130/212/61 193/214/61 194/213/61
f 131/215/62 192/216/62 193/214/62
f 132/217/63 191/218/63 192/219/63
f 133/220/64 190/221/64 191/218/64
f 134/222/65 189/223/65 190/221/65
f 135/224/66 188/225/66 189/223/66
f 136/226/67 187/227/67 188/225/67
f 137/228/68 186/229/68 187/227/68
f 138/230/69 184/159/69 186/229/69
f 109/231/70 126/204/70 124/232/70
f 111/233/71 127/206/71 126/204/71
f 112/234/72 128/208/72 127/206/72
f 113/235/73 129/210/73 128/208/73
f 121/236/74 137/237/74 136/238/74
f 122/239/75 138/240/75 137/237/75
f 138/240/76 110/241/76 125/163/76
f 245/132/77 81/176/77 258/242/77
f 258/242/78 82/179/78 257/243/78
f 257/243/79 83/181/79 256/244/79
f 256/244/80 84/183/80 255/245/80
f 84/183/81 254/246/81 255/245/81
f 85/184/82 253/247/82 254/246/82
f 86/186/83 252/248/83 253/247/83
f 87/249/84 251/250/84 252/251/84
f 88/252/85 250/253/85 251/250/85
f 89/254/86 249/255/86 250/253/86
f 90/256/87 248/257/87 249/255/87
f 91/258/88 247/259/88 248/257/88
f 92/260/89 246/261/89 247/259/89
f 93/262/90 244/161/90 246/261/90
f 139/136/91 288/263/91 275/134/91
f 288/263/92 142/264/92 287/265/92
f 287/265/93 143/266/93 286/267/93
f 286/267/94 144/268/94 285/269/94
f 285/269/95 145/270/95 284/271/95
f 284/271/96 146/272/96 283/273/96
f 283/273/97 147/274/97 282/275/97
f 282/276/98 148/277/98 281/278/98
f 281/278/99 149/279/99 280/280/99
f 280/280/100 150/281/100 279/282/100
f 279/282/101 151/283/101 278/284/101
f 278/284/102 152/285/102 277/286/102
f 277/286/103 153/287/103 276/288/103
f 276/288/104 140/289/104 274/148/104
f 154/135/105 228/290/105 215/173/105
f 228/290/106 157/291/106 227/292/106
f 227/292/107 158/293/107 226/294/107
f 226/294/108 159/295/108 225/296/108
f 225/296/109 160/297/109 224/298/109
f 224/298/110 161/299/110 223/300/110
f 223/300/111 162/301/111 222/302/111
f 222/303/112 163/304/112 221/305/112
f 221/305/113 164/306/113 220/307/113
f 220/307/114 165/308/114 219/309/114
f 219/309/115 166/310/115 218/311/115
f 218/311/116 167/312/116 217/313/116
f 217/313/117 168/314/117 216/315/117
f 216/315/118 155/316/118 214/168/118
f 169/174/119 141/317/119 139/136/119
f 171/318/120 142/264/120 141/317/120
f 172/319/121 143/266/121 142/264/121
f 173/320/122 144/268/122 143/266/122
f 174/321/123 145/270/123 144/268/123
f 175/322/124 146/272/124 145/270/124
f 176/323/125 147/274/125 146/272/125
f 177/324/126 148/277/126 147/325/126
f 178/326/127 149/279/127 148/277/127
f 179/327/128 150/281/128 149/279/128
f 180/328/129 151/283/129 150/281/129
f 181/329/130 152/285/130 151/283/130
f 182/330/131 153/287/131 152/285/131
f 183/331/132 140/289/132 153/287/132
f 185/203/133 171/332/133 169/174/133
f 198/333/134 172/334/134 171/332/134
f 197/335/135 173/336/135 172/334/135
f 196/337/136 174/338/136 173/336/136
f 195/339/137 175/340/137 174/338/137
f 194/341/138 176/342/138 175/340/138
f 193/343/139 177/344/139 176/342/139
f 192/219/140 178/326/140 177/345/140
f 191/218/141 179/327/141 178/326/141
f 190/221/142 180/328/142 179/327/142
f 189/223/143 181/329/143 180/328/143
f 188/225/144 182/330/144 181/329/144
f 187/227/145 183/331/145 182/330/145
f 186/229/146 170/157/146 183/331/146
f 199/346/147 111/233/147 109/231/147
f 201/347/148 112/234/148 111/233/148
f 202/348/149 113/235/149 112/234/149
f 203/349/150 114/350/150 113/235/150
f 123/351/151 200/138/151 110/241/151
f 229/175/152 156/352/152 154/135/152
f 231/353/153 157/291/153 156/352/153
f 232/354/154 158/293/154 157/291/154
f 233/355/155 159/295/155 158/293/155
f 234/356/156 160/297/156 159/295/156
f 235/357/157 161/299/157 160/297/157
f 236/358/158 162/301/158 161/299/158
f 237/359/159 163/304/159 162/360/159
f 238/361/160 164/306/160 163/304/160
f 239/362/161 165/308/161 164/306/161
f 240/363/162 166/310/162 165/308/162
f 241/364/163 167/312/163 166/310/163
f 242/365/164 168/314/164 167/312/164
f 243/366/165 155/316/165 168/314/165
f 245/132/166 231/367/166 229/175/166
f 258/368/167 232/369/167 231/367/167
f 257/370/168 233/371/168 232/369/168
f 256/372/169 234/373/169 233/371/169
f 255/374/170 235/375/170 234/373/170
f 254/376/171 236/377/171 235/375/171
f 253/378/172 237/379/172 236/377/172
f 252/251/173 238/361/173 237/380/173
f 251/250/174 239/362/174 238/361/174
f 250/253/175 240/363/175 239/362/175
f 249/255/176 241/364/176 240/363/176
f 248/257/177 242/365/177 241/364/177
f 247/259/178 243/366/178 242/365/178
f 246/261/179 230/155/179 243/366/179
f 259/133/180 96/178/180 94/131/180
f 261/381/181 97/180/181 96/178/181
f 262/382/182 98/383/182 97/180/182
f 98/383/183 264/384/183 99/182/183
f 99/182/184 265/385/184 100/185/184
f 100/185/185 266/386/185 101/187/185
f 101/187/186 267/387/186 102/388/186
f 102/389/187 268/390/187 103/391/187
f 103/391/188 269/392/188 104/393/188
f 104/393/189 270/394/189 105/395/189
f 105/395/190 271/396/190 106/397/190
f 106/397/191 272/398/191 107/399/191
f 107/399/192 273/400/192 108/401/192
f 108/401/193 260/146/193 95/402/193
f 259/133/194 288/403/194 261/404/194
f 261/404/195 287/405/195 262/406/195
f 262/406/196 286/407/196 263/408/196
f 263/408/197 285/409/197 264/410/197
f 264/410/198 284/411/198 265/412/198
f 265/412/199 283/413/199 266/414/199
f 266/414/200 282/415/200 267/387/200
f 267/416/201 281/278/201 268/390/201
f 268/390/202 280/280/202 269/392/202
f 269/392/203 279/282/203 270/394/203
f 270/394/204 278/284/204 271/396/204
f 271/396/205 277/286/205 272/398/205
f 272/398/206 276/288/206 273/400/206
f 273/400/207 274/148/207 260/146/207
f 244/161/37 65/417/37 77/162/37
f 94/131/35 79/177/35 245/132/35
f 275/134/36 229/175/36 154/135/36
f 67/137/37 110/241/37 200/138/37
f 80/140/38 95/402/38 66/141/38
f 68/143/35 67/137/35 72/144/35
f 66/141/39 95/402/39 260/146/39
f 274/148/39 140/289/39 69/149/39
f 69/151/35 70/166/35 78/152/35
f 70/154/37 155/316/37 230/155/37
f 69/149/39 140/289/39 170/157/39
f 184/159/39 125/163/39 68/143/39
f 78/156/37 230/155/37 244/161/37
f 125/163/40 110/241/40 67/137/40
f 200/138/37 214/168/37 71/164/37
f 74/165/35 71/167/35 70/166/35
f 73/145/35 72/144/35 71/167/35
f 214/168/37 155/316/37 70/154/37
f 76/169/35 77/172/35 65/170/35
f 75/153/35 78/152/35 77/172/35
f 139/136/36 154/135/36 215/173/36
f 260/146/39 274/148/39 75/150/39
f 74/158/39 170/157/39 184/159/39
f 259/133/41 245/132/41 229/175/41
f 94/131/42 96/178/42 81/176/42
f 96/178/43 97/180/43 82/179/43
f 97/180/44 98/383/44 83/181/44
f 83/181/45 98/383/45 99/182/45
f 99/182/46 100/185/46 85/184/46
f 100/185/47 101/187/47 86/186/47
f 101/187/48 102/418/48 87/188/48
f 102/189/49 103/192/49 88/190/49
f 103/192/50 104/194/50 89/193/50
f 104/194/51 105/196/51 90/195/51
f 105/196/52 106/198/52 91/197/52
f 106/198/208 107/200/208 92/199/208
f 107/200/54 108/202/54 93/201/54
f 108/202/55 95/402/55 80/140/55
f 185/203/180 124/232/180 126/204/180
f 198/205/209 126/204/209 127/206/209
f 197/207/210 127/206/210 128/208/210
f 196/209/211 128/208/211 129/210/211
f 195/211/212 129/210/212 130/212/212
f 130/212/213 131/215/213 193/214/213
f 131/215/214 132/419/214 192/216/214
f 132/217/215 133/220/215 191/218/215
f 133/220/64 134/222/64 190/221/64
f 134/222/216 135/224/216 189/223/216
f 135/224/217 136/226/217 188/225/217
f 136/226/218 137/228/218 187/227/218
f 137/228/219 138/230/219 186/229/219
f 138/230/220 125/163/220 184/159/220
f 109/231/70 111/233/70 126/204/70
f 111/233/71 112/234/71 127/206/71
f 112/234/72 113/235/72 128/208/72
f 113/235/73 114/350/73 129/210/73
f 121/236/74 122/239/74 137/237/74
f 122/239/75 123/420/75 138/240/75
f 138/240/76 123/420/76 110/241/76
f 245/132/147 79/177/147 81/176/147
f 258/242/221 81/176/221 82/179/221
f 257/243/222 82/179/222 83/181/222
f 256/244/223 83/181/223 84/183/223
f 84/183/224 85/184/224 254/246/224
f 85/184/225 86/186/225 253/247/225
f 86/186/226 87/421/226 252/248/226
f 87/249/227 88/252/227 251/250/227
f 88/252/228 89/254/228 250/253/228
f 89/254/229 90/256/229 249/255/229
f 90/256/230 91/258/230 248/257/230
f 91/258/231 92/260/231 247/259/231
f 92/260/232 93/262/232 246/261/232
f 93/262/233 80/422/233 244/161/233
f 139/136/119 141/317/119 288/263/119
f 288/263/120 141/317/120 142/264/120
f 287/265/234 142/264/234 143/266/234
f 286/267/235 143/266/235 144/268/235
f 285/269/236 144/268/236 145/270/236
f 284/271/237 145/270/237 146/272/237
f 283/273/238 146/272/238 147/274/238
f 282/276/239 147/325/239 148/277/239
f 281/278/240 148/277/240 149/279/240
f 280/280/241 149/279/241 150/281/241
f 279/282/242 150/281/242 151/283/242
f 278/284/243 151/283/243 152/285/243
f 277/286/244 152/285/244 153/287/244
f 276/288/245 153/287/245 140/289/245
f 154/135/152 156/352/152 228/290/152
f 228/290/153 156/352/153 157/291/153
f 227/292/246 157/291/246 158/293/246
f 226/294/247 158/293/247 159/295/247
f 225/296/248 159/295/248 160/297/248
f 224/298/249 160/297/249 161/299/249
f 223/300/250 161/299/250 162/301/250
f 222/303/251 162/360/251 163/304/251
f 221/305/252 163/304/252 164/306/252
f 220/307/253 164/306/253 165/308/253
f 219/309/254 165/308/254 166/310/254
f 218/311/255 166/310/255 167/312/255
f 217/313/256 167/312/256 168/314/256
f 216/315/257 168/314/257 155/316/257
f 169/174/91 171/318/91 141/317/91
f 171/318/258 172/319/258 142/264/258
f 172/319/259 173/320/259 143/266/259
f 173/320/260 174/321/260 144/268/260
f 174/321/261 175/322/261 145/270/261
f 175/322/262 176/323/262 146/272/262
f 176/323/263 177/423/263 147/274/263
f 177/324/264 178/326/264 148/277/264
f 178/326/265 179/327/265 149/279/265
f 179/327/266 180/328/266 150/281/266
f 180/328/267 181/329/267 151/283/267
f 181/329/268 182/330/268 152/285/268
f 182/330/269 183/331/269 153/287/269
f 183/331/270 170/157/270 140/289/270
f 185/203/271 198/333/271 171/332/271
f 198/333/272 197/335/272 172/334/272
f 197/335/273 196/337/273 173/336/273
f 196/337/274 195/339/274 174/338/274
f 195/339/275 194/341/275 175/340/275
f 194/341/276 193/343/276 176/342/276
f 193/343/277 192/216/277 177/344/277
f 192/219/278 191/218/278 178/326/278
f 191/218/279 190/221/279 179/327/279
f 190/221/280 189/223/280 180/328/280
f 189/223/281 188/225/281 181/329/281
f 188/225/282 187/227/282 182/330/282
f 187/227/283 186/229/283 183/331/283
f 186/229/284 184/159/284 170/157/284
f 199/346/285 201/347/285 111/233/285
f 201/347/286 202/348/286 112/234/286
f 202/348/287 203/349/287 113/235/287
f 203/349/288 204/424/288 114/350/288
f 123/351/289 213/425/289 200/138/289
f 229/175/105 231/353/105 156/352/105
f 231/353/290 232/354/290 157/291/290
f 232/354/291 233/355/291 158/293/291
f 233/355/292 234/356/292 159/295/292
f 234/356/293 235/357/293 160/297/293
f 235/357/294 236/358/294 161/299/294
f 236/358/295 237/426/295 162/301/295
f 237/359/296 238/361/296 163/304/296
f 238/361/297 239/362/297 164/306/297
f 239/362/298 240/363/298 165/308/298
f 240/363/299 241/364/299 166/310/299
f 241/364/300 242/365/300 167/312/300
f 242/365/301 243/366/301 168/314/301
f 243/366/302 230/155/302 155/316/302
f 245/132/303 258/368/303 231/367/303
f 258/368/304 257/370/304 232/369/304
f 257/370/305 256/372/305 233/371/305
f 256/372/306 255/374/306 234/373/306
f 255/374/307 254/376/307 235/375/307
f 254/376/308 253/378/308 236/377/308
f 253/378/309 252/248/309 237/379/309
f 252/251/310 251/250/310 238/361/310
f 251/250/311 250/253/311 239/362/311
f 250/253/312 249/255/312 240/363/312
f 249/255/313 248/257/313 241/364/313
f 248/257/314 247/259/314 242/365/314
f 247/259/315 246/261/315 243/366/315
f 246/261/316 244/161/316 230/155/316
f 259/133/317 261/381/317 96/178/317
f 261/381/318 262/382/318 97/180/318
f 262/382/319 263/427/319 98/383/319
f 98/383/320 263/427/320 264/384/320
f 99/182/321 264/384/321 265/385/321
f 100/185/322 265/385/322 266/386/322
f 101/187/323 266/386/323 267/387/323
f 102/389/324 267/416/324 268/390/324
f 103/391/325 268/390/325 269/392/325
f 104/393/326 269/392/326 270/394/326
f 105/395/327 270/394/327 271/396/327
f 106/397/328 271/396/328 272/398/328
f 107/399/329 272/398/329 273/400/329
f 108/401/330 273/400/330 260/146/330
f 259/133/331 275/134/331 288/403/331
f 261/404/332 288/403/332 287/405/332
f 262/406/333 287/405/333 286/407/333
f 263/408/334 286/407/334 285/409/334
f 264/410/335 285/409/335 284/411/335
f 265/412/336 284/411/336 283/413/336
f 266/414/337 283/413/337 282/415/337
f 267/416/338 282/428/338 281/278/338
f 268/390/339 281/278/339 280/280/339
f 269/392/340 280/280/340 279/282/340
f 270/394/341 279/282/341 278/284/341
f 271/396/342 278/284/342 277/286/342
f 272/398/343 277/286/343 276/288/343
f 273/400/344 276/288/344 274/148/344
f 244/161/37 80/422/37 65/417/37
usemtl Material.008
f 169/174/345 199/346/345 185/203/345
f 185/203/35 109/231/35 124/232/35
f 114/350/346 130/212/346 129/210/346
f 115/429/347 131/215/347 130/212/347
f 116/430/348 132/431/348 131/215/348
f 117/432/349 133/433/349 132/431/349
f 118/434/350 134/435/350 133/433/350
f 119/436/351 135/437/351 134/435/351
f 120/438/352 136/238/352 135/437/352
f 114/350/353 205/439/353 115/429/353
f 115/429/354 206/440/354 116/430/354
f 116/430/355 207/441/355 117/442/355
f 117/443/356 208/444/356 118/445/356
f 118/445/357 209/446/357 119/447/357
f 119/447/358 210/448/358 120/449/358
f 120/449/359 211/450/359 121/451/359
f 121/451/360 212/452/360 122/453/360
f 122/453/361 213/425/361 123/351/361
f 199/346/362 228/454/362 201/455/362
f 201/455/363 227/456/363 202/457/363
f 202/457/364 226/458/364 203/459/364
f 203/459/365 225/460/365 204/461/365
f 204/461/366 224/462/366 205/463/366
f 205/463/367 223/464/367 206/465/367
f 206/465/368 222/466/368 207/441/368
f 207/467/369 221/305/369 208/444/369
f 208/444/370 220/307/370 209/446/370
f 209/446/371 219/309/371 210/448/371
f 210/448/372 218/311/372 211/450/372
f 211/450/373 217/313/373 212/452/373
f 212/452/374 216/315/374 213/425/374
f 213/425/375 214/168/375 200/138/375
f 169/174/345 215/173/345 199/346/345
f 185/203/35 199/346/35 109/231/35
f 114/350/346 115/429/346 130/212/346
f 115/429/347 116/430/347 131/215/347
f 116/430/348 117/432/348 132/431/348
f 117/432/349 118/434/349 133/433/349
f 118/434/350 119/436/350 134/435/350
f 119/436/351 120/438/351 135/437/351
f 120/438/352 121/236/352 136/238/352
f 114/350/376 204/424/376 205/439/376
f 115/429/377 205/439/377 206/440/377
f 116/430/378 206/440/378 207/441/378
f 117/443/379 207/467/379 208/444/379
f 118/445/357 208/444/357 209/446/357
f 119/447/380 209/446/380 210/448/380
f 120/449/381 210/448/381 211/450/381
f 121/451/382 211/450/382 212/452/382
f 122/453/383 212/452/383 213/425/383
f 199/346/384 215/173/384 228/454/384
f 201/455/385 228/454/385 227/456/385
f 202/457/386 227/456/386 226/458/386
f 203/459/387 226/458/387 225/460/387
f 204/461/388 225/460/388 224/462/388
f 205/463/389 224/462/389 223/464/389
f 206/465/390 223/464/390 222/466/390
f 207/467/391 222/468/391 221/305/391
f 208/444/392 221/305/392 220/307/392
f 209/446/393 220/307/393 219/309/393
f 210/448/394 219/309/394 218/311/394
f 211/450/395 218/311/395 217/313/395
f 212/452/396 217/313/396 216/315/396
f 213/425/397 216/315/397 214/168/397
o Cylinder.004_Cylinder.005
v -0.198660 -0.149343 0.128355
v -0.243345 -0.149343 0.205752
v -0.200411 -0.169863 0.127344
v -0.245096 -0.169863 0.204741
v -0.205594 -0.189595 0.124351
v -0.250279 -0.189595 0.201748
v -0.214012 -0.207780 0.119491
v -0.258697 -0.207780 0.196888
v -0.225341 -0.223720 0.112951
v -0.270026 -0.223720 0.190348
v -0.239145 -0.236801 0.104981
v -0.283830 -0.236801 0.182378
v -0.254893 -0.246521 0.095889
v -0.299578 -0.246521 0.173285
v -0.271982 -0.252506 0.086023
v -0.316667 -0.252506 0.163419
v -0.289753 -0.254527 0.075762
v -0.334438 -0.254527 0.153159
v -0.307524 -0.252506 0.065502
v -0.352209 -0.252506 0.142899
v -0.324613 -0.246521 0.055636
v -0.369298 -0.246521 0.133033
v -0.340361 -0.236801 0.046544
v -0.385046 -0.236801 0.123940
v -0.354165 -0.223720 0.038574
v -0.398850 -0.223720 0.115971
v -0.365494 -0.207780 0.032033
v -0.410179 -0.207780 0.109430
v -0.373912 -0.189595 0.027173
v -0.418597 -0.189595 0.104570
v -0.379095 -0.169863 0.024181
v -0.423780 -0.169863 0.101577
v -0.380846 -0.149343 0.023170
v -0.425531 -0.149343 0.100567
v -0.379095 -0.128822 0.024181
v -0.423780 -0.128822 0.101577
v -0.373912 -0.109090 0.027173
v -0.418597 -0.109090 0.104570
v -0.365494 -0.090905 0.032033
v -0.410179 -0.090905 0.109430
v -0.354165 -0.074966 0.038574
v -0.398850 -0.074966 0.115971
v -0.340361 -0.061885 0.046544
v -0.385046 -0.061885 0.123940
v -0.324613 -0.052165 0.055636
v -0.369298 -0.052165 0.133033
v -0.307524 -0.046179 0.065502
v -0.352209 -0.046179 0.142899
v -0.289753 -0.044158 0.075762
v -0.334438 -0.044158 0.153159
v -0.271982 -0.046179 0.086023
v -0.316667 -0.046179 0.163419
v -0.254893 -0.052165 0.095889
v -0.299578 -0.052165 0.173285
v -0.239145 -0.061885 0.104981
v -0.283830 -0.061885 0.182378
v -0.225341 -0.074966 0.112951
v -0.270026 -0.074966 0.190348
v -0.214012 -0.090905 0.119491
v -0.258697 -0.090905 0.196888
v -0.205594 -0.109090 0.124351
v -0.250279 -0.109090 0.201748
v -0.200411 -0.128822 0.127344
v -0.245096 -0.128822 0.204741
vt 1.000000 1.000000
vt 0.968750 0.500000
vt 1.000000 0.500000
vt 0.968750 1.000000
vt 0.937500 0.500000
vt 0.937500 1.000000
vt 0.906250 0.500000
vt 0.906250 1.000000
vt 0.875000 0.500000
vt 0.875000 1.000000
vt 0.843750 0.500000
vt 0.843750 1.000000
vt 0.812500 0.500000
vt 0.812500 1.000000
vt 0.781250 0.500000
vt 0.781250 1.000000
vt 0.750000 0.500000
vt 0.750000 1.000000
vt 0.718750 0.500000
vt 0.718750 1.000000
vt 0.687500 0.500000
vt 0.687500 1.000000
vt 0.656250 0.500000
vt 0.656250 1.000000
vt 0.625000 0.500000
vt 0.625000 1.000000
vt 0.593750 0.500000
vt 0.593750 1.000000
vt 0.562500 0.500000
vt 0.562500 1.000000
vt 0.531250 0.500000
vt 0.531250 1.000000
vt 0.500000 0.500000
vt 0.500000 1.000000
vt 0.468750 0.500000
vt 0.468750 1.000000
vt 0.437500 0.500000
vt 0.437500 1.000000
vt 0.406250 0.500000
vt 0.406250 1.000000
vt 0.375000 0.500000
vt 0.375000 1.000000
vt 0.343750 0.500000
vt 0.343750 1.000000
vt 0.312500 0.500000
vt 0.312500 1.000000
vt 0.281250 0.500000
vt 0.281250 1.000000
vt 0.250000 0.500000
vt 0.250000 1.000000
vt 0.218750 0.500000
vt 0.218750 1.000000
vt 0.187500 0.500000
vt 0.187500 1.000000
vt 0.156250 0.500000
vt 0.156250 1.000000
vt 0.125000 0.500000
vt 0.125000 1.000000
vt 0.093750 0.500000
vt 0.093750 1.000000
vt 0.062500 0.500000
vt 0.158156 0.028269
vt 0.471731 0.158156
vt 0.341844 0.471731
vt 0.062500 1.000000
vt 0.031250 0.500000
vt 0.031250 1.000000
vt 0.000000 0.500000
vt 0.796822 0.014612
vt 0.514612 0.203178
vt 0.703178 0.485388
vt 0.296822 0.485388
vt 0.250000 0.490000
vt 0.203178 0.485388
vt 0.158156 0.471731
vt 0.116663 0.449553
vt 0.080294 0.419706
vt 0.050447 0.383337
vt 0.028269 0.341844
vt 0.014612 0.296822
vt 0.010000 0.250000
vt 0.014612 0.203178
vt 0.028269 0.158156
vt 0.050447 0.116663
vt 0.080294 0.080294
vt 0.116663 0.050447
vt 0.203178 0.014612
vt 0.341844 0.028269
vt 0.250000 0.010000
vt 0.296822 0.014612
vt 0.383337 0.050447
vt 0.419706 0.080294
vt 0.449553 0.116663
vt 0.485388 0.203178
vt 0.471731 0.341844
vt 0.490000 0.250000
vt 0.485388 0.296822
vt 0.449553 0.383337
vt 0.419706 0.419706
vt 0.383337 0.449553
vt 0.000000 1.000000
vt 0.750000 0.490000
vt 0.796822 0.485388
vt 0.841844 0.471731
vt 0.883337 0.449553
vt 0.919706 0.419706
vt 0.949553 0.383337
vt 0.971731 0.341844
vt 0.985388 0.296822
vt 0.990000 0.250000
vt 0.985388 0.203178
vt 0.971731 0.158156
vt 0.949553 0.116663
vt 0.919706 0.080294
vt 0.883337 0.050447
vt 0.841844 0.028269
vt 0.750000 0.010000
vt 0.703178 0.014612
vt 0.658156 0.028269
vt 0.616663 0.050447
vt 0.580294 0.080294
vt 0.550447 0.116663
vt 0.528269 0.158156
vt 0.510000 0.250000
vt 0.514612 0.296822
vt 0.528269 0.341844
vt 0.550447 0.383337
vt 0.580294 0.419706
vt 0.616663 0.449553
vt 0.658156 0.471731
vn 0.8619 -0.0980 0.4976
vn 0.8287 -0.2903 0.4785
vn 0.7638 -0.4714 0.4410
vn 0.6694 -0.6344 0.3865
vn 0.5494 -0.7730 0.3172
vn 0.4082 -0.8819 0.2357
vn 0.2514 -0.9569 0.1451
vn 0.0849 -0.9952 0.0490
vn -0.0849 -0.9952 -0.0490
vn -0.2514 -0.9569 -0.1451
vn -0.4082 -0.8819 -0.2357
vn -0.5494 -0.7730 -0.3172
vn -0.6694 -0.6344 -0.3865
vn -0.7638 -0.4714 -0.4410
vn -0.8287 -0.2903 -0.4785
vn -0.8619 -0.0980 -0.4976
vn -0.8619 0.0980 -0.4976
vn -0.8287 0.2903 -0.4785
vn -0.7638 0.4714 -0.4410
vn -0.6694 0.6344 -0.3865
vn -0.5494 0.7730 -0.3172
vn -0.4082 0.8819 -0.2357
vn -0.2514 0.9569 -0.1451
vn -0.0849 0.9952 -0.0490
vn 0.0849 0.9952 0.0490
vn 0.2514 0.9569 0.1451
vn 0.4082 0.8819 0.2357
vn 0.5494 0.7730 0.3172
vn 0.6694 0.6344 0.3865
vn 0.7638 0.4714 0.4410
vn -0.5000 0.0000 0.8660
vn 0.8287 0.2903 0.4785
vn 0.8619 0.0980 0.4976
vn 0.5000 0.0000 -0.8660
usemtl Material.007
s off
f 290/469/398 291/470/398 289/471/398
f 292/472/399 293/473/399 291/470/399
f 294/474/400 295/475/400 293/473/400
f 296/476/401 297/477/401 295/475/401
f 298/478/402 299/479/402 297/477/402
f 300/480/403 301/481/403 299/479/403
f 302/482/404 303/483/404 301/481/404
f 304/484/405 305/485/405 303/483/405
f 306/486/406 307/487/406 305/485/406
f 308/488/407 309/489/407 307/487/407
f 310/490/408 311/491/408 309/489/408
f 312/492/409 313/493/409 311/491/409
f 314/494/410 315/495/410 313/493/410
f 316/496/411 317/497/411 315/495/411
f 318/498/412 319/499/412 317/497/412
f 320/500/413 321/501/413 319/499/413
f 322/502/414 323/503/414 321/501/414
f 324/504/415 325/505/415 323/503/415
f 326/506/416 327/507/416 325/505/416
f 328/508/417 329/509/417 327/507/417
f 330/510/418 331/511/418 329/509/418
f 332/512/419 333/513/419 331/511/419
f 334/514/420 335/515/420 333/513/420
f 336/516/421 337/517/421 335/515/421
f 338/518/422 339/519/422 337/517/422
f 340/520/423 341/521/423 339/519/423
f 342/522/424 343/523/424 341/521/424
f 344/524/425 345/525/425 343/523/425
f 346/526/426 347/527/426 345/525/426
f 348/528/427 349/529/427 347/527/427
f 326/530/428 310/531/428 294/532/428
f 350/533/429 351/534/429 349/529/429
f 352/535/430 289/536/430 351/534/430
f 319/537/431 335/538/431 351/539/431
f 290/469/398 292/472/398 291/470/398
f 292/472/399 294/474/399 293/473/399
f 294/474/400 296/476/400 295/475/400
f 296/476/401 298/478/401 297/477/401
f 298/478/402 300/480/402 299/479/402
f 300/480/403 302/482/403 301/481/403
f 302/482/404 304/484/404 303/483/404
f 304/484/405 306/486/405 305/485/405
f 306/486/406 308/488/406 307/487/406
f 308/488/407 310/490/407 309/489/407
f 310/490/408 312/492/408 311/491/408
f 312/492/409 314/494/409 313/493/409
f 314/494/410 316/496/410 315/495/410
f 316/496/411 318/498/411 317/497/411
f 318/498/412 320/500/412 319/499/412
f 320/500/413 322/502/413 321/501/413
f 322/502/414 324/504/414 323/503/414
f 324/504/415 326/506/415 325/505/415
f 326/506/416 328/508/416 327/507/416
f 328/508/417 330/510/417 329/509/417
f 330/510/418 332/512/418 331/511/418
f 332/512/419 334/514/419 333/513/419
f 334/514/420 336/516/420 335/515/420
f 336/516/421 338/518/421 337/517/421
f 338/518/422 340/520/422 339/519/422
f 340/520/423 342/522/423 341/521/423
f 342/522/424 344/524/424 343/523/424
f 344/524/425 346/526/425 345/525/425
f 346/526/426 348/528/426 347/527/426
f 348/528/427 350/533/427 349/529/427
f 294/532/428 292/540/428 290/541/428
f 290/541/428 352/542/428 294/532/428
f 352/542/428 350/543/428 294/532/428
f 350/543/428 348/544/428 346/545/428
f 346/545/428 344/546/428 342/547/428
f 342/547/428 340/548/428 338/549/428
f 338/549/428 336/550/428 342/547/428
f 336/550/428 334/551/428 342/547/428
f 334/551/428 332/552/428 330/553/428
f 330/553/428 328/554/428 326/530/428
f 326/530/428 324/555/428 318/556/428
f 324/555/428 322/557/428 318/556/428
f 322/557/428 320/558/428 318/556/428
f 318/556/428 316/559/428 314/560/428
f 314/560/428 312/561/428 310/531/428
f 310/531/428 308/562/428 302/563/428
f 308/562/428 306/564/428 302/563/428
f 306/564/428 304/565/428 302/563/428
f 302/563/428 300/566/428 298/567/428
f 298/567/428 296/568/428 294/532/428
f 350/543/428 346/545/428 294/532/428
f 346/545/428 342/547/428 294/532/428
f 334/551/428 330/553/428 342/547/428
f 330/553/428 326/530/428 342/547/428
f 318/556/428 314/560/428 326/530/428
f 314/560/428 310/531/428 326/530/428
f 302/563/428 298/567/428 310/531/428
f 298/567/428 294/532/428 310/531/428
f 294/532/428 342/547/428 326/530/428
f 350/533/429 352/535/429 351/534/429
f 352/535/430 290/569/430 289/536/430
f 351/539/431 289/570/431 291/571/431
f 291/571/431 293/572/431 295/573/431
f 295/573/431 297/574/431 299/575/431
f 299/575/431 301/576/431 303/577/431
f 303/577/431 305/578/431 307/579/431
f 307/579/431 309/580/431 311/581/431
f 311/581/431 313/582/431 315/583/431
f 315/583/431 317/584/431 319/537/431
f 319/537/431 321/585/431 323/586/431
f 323/586/431 325/587/431 327/588/431
f 327/588/431 329/589/431 331/590/431
f 331/590/431 333/591/431 335/538/431
f 335/538/431 337/592/431 339/593/431
f 339/593/431 341/594/431 343/595/431
f 343/595/431 345/596/431 347/597/431
f 347/597/431 349/598/431 351/539/431
f 351/539/431 291/571/431 303/577/431
f 291/571/431 295/573/431 303/577/431
f 295/573/431 299/575/431 303/577/431
f 303/577/431 307/579/431 319/537/431
f 307/579/431 311/581/431 319/537/431
f 311/581/431 315/583/431 319/537/431
f 319/537/431 323/586/431 335/538/431
f 323/586/431 327/588/431 335/538/431
f 327/588/431 331/590/431 335/538/431
f 335/538/431 339/593/431 351/539/431
f 339/593/431 343/595/431 351/539/431
f 343/595/431 347/597/431 351/539/431
f 351/539/431 303/577/431 319/537/431
o Cylinder.005_Cylinder.006
v 0.368780 -0.149343 -0.129413
v 0.324095 -0.149343 -0.052016
v 0.367030 -0.169863 -0.130424
v 0.322345 -0.169863 -0.053027
v 0.361846 -0.189595 -0.133416
v 0.317161 -0.189595 -0.056020
v 0.353428 -0.207780 -0.138276
v 0.308743 -0.207780 -0.060880
v 0.342100 -0.223720 -0.144817
v 0.297415 -0.223720 -0.067420
v 0.328296 -0.236801 -0.152787
v 0.283611 -0.236801 -0.075390
v 0.312547 -0.246521 -0.161879
v 0.267862 -0.246521 -0.084482
v 0.295459 -0.252506 -0.171745
v 0.250774 -0.252506 -0.094348
v 0.277688 -0.254527 -0.182005
v 0.233002 -0.254527 -0.104609
v 0.259916 -0.252506 -0.192266
v 0.215231 -0.252506 -0.114869
v 0.242828 -0.246521 -0.202132
v 0.198143 -0.246521 -0.124735
v 0.227079 -0.236801 -0.211224
v 0.182394 -0.236801 -0.133827
v 0.213275 -0.223720 -0.219194
v 0.168590 -0.223720 -0.141797
v 0.201947 -0.207780 -0.225734
v 0.157262 -0.207780 -0.148338
v 0.193529 -0.189595 -0.230594
v 0.148844 -0.189595 -0.153198
v 0.188345 -0.169863 -0.233587
v 0.143660 -0.169863 -0.156190
v 0.186595 -0.149343 -0.234598
v 0.141910 -0.149343 -0.157201
v 0.188345 -0.128822 -0.233587
v 0.143660 -0.128822 -0.156190
v 0.193529 -0.109090 -0.230594
v 0.148844 -0.109090 -0.153198
v 0.201947 -0.090905 -0.225734
v 0.157262 -0.090905 -0.148338
v 0.213275 -0.074966 -0.219194
v 0.168590 -0.074966 -0.141797
v 0.227079 -0.061885 -0.211224
v 0.182394 -0.061885 -0.133827
v 0.242828 -0.052165 -0.202132
v 0.198143 -0.052165 -0.124735
v 0.259916 -0.046179 -0.192266
v 0.215231 -0.046179 -0.114869
v 0.277687 -0.044158 -0.182005
v 0.233002 -0.044158 -0.104609
v 0.295459 -0.046179 -0.171745
v 0.250774 -0.046179 -0.094348
v 0.312547 -0.052165 -0.161879
v 0.267862 -0.052165 -0.084482
v 0.328296 -0.061885 -0.152787
v 0.283611 -0.061885 -0.075390
v 0.342100 -0.074966 -0.144817
v 0.297415 -0.074966 -0.067420
v 0.353428 -0.090905 -0.138276
v 0.308743 -0.090905 -0.060880
v 0.361846 -0.109090 -0.133416
v 0.317161 -0.109090 -0.056020
v 0.367030 -0.128822 -0.130424
v 0.322345 -0.128822 -0.053027
vt 1.000000 1.000000
vt 0.968750 0.500000
vt 1.000000 0.500000
vt 0.968750 1.000000
vt 0.937500 0.500000
vt 0.937500 1.000000
vt 0.906250 0.500000
vt 0.906250 1.000000
vt 0.875000 0.500000
vt 0.875000 1.000000
vt 0.843750 0.500000
vt 0.843750 1.000000
vt 0.812500 0.500000
vt 0.812500 1.000000
vt 0.781250 0.500000
vt 0.781250 1.000000
vt 0.750000 0.500000
vt 0.750000 1.000000
vt 0.718750 0.500000
vt 0.718750 1.000000
vt 0.687500 0.500000
vt 0.687500 1.000000
vt 0.656250 0.500000
vt 0.656250 1.000000
vt 0.625000 0.500000
vt 0.625000 1.000000
vt 0.593750 0.500000
vt 0.593750 1.000000
vt 0.562500 0.500000
vt 0.562500 1.000000
vt 0.531250 0.500000
vt 0.531250 1.000000
vt 0.500000 0.500000
vt 0.500000 1.000000
vt 0.468750 0.500000
vt 0.468750 1.000000
vt 0.437500 0.500000
vt 0.437500 1.000000
vt 0.406250 0.500000
vt 0.406250 1.000000
vt 0.375000 0.500000
vt 0.375000 1.000000
vt 0.343750 0.500000
vt 0.343750 1.000000
vt 0.312500 0.500000
vt 0.312500 1.000000
vt 0.281250 0.500000
vt 0.281250 1.000000
vt 0.250000 0.500000
vt 0.250000 1.000000
vt 0.218750 0.500000
vt 0.218750 1.000000
vt 0.187500 0.500000
vt 0.187500 1.000000
vt 0.156250 0.500000
vt 0.156250 1.000000
vt 0.125000 0.500000
vt 0.125000 1.000000
vt 0.093750 0.500000
vt 0.093750 1.000000
vt 0.062500 0.500000
vt 0.158156 0.028269
vt 0.471731 0.158156
vt 0.341844 0.471731
vt 0.062500 1.000000
vt 0.031250 0.500000
vt 0.031250 1.000000
vt 0.000000 0.500000
vt 0.796822 0.014612
vt 0.514612 0.203178
vt 0.703178 0.485388
vt 0.296822 0.485388
vt 0.250000 0.490000
vt 0.203178 0.485388
vt 0.158156 0.471731
vt 0.116663 0.449553
vt 0.080294 0.419706
vt 0.050447 0.383337
vt 0.028269 0.341844
vt 0.014612 0.296822
vt 0.010000 0.250000
vt 0.014612 0.203178
vt 0.028269 0.158156
vt 0.050447 0.116663
vt 0.080294 0.080294
vt 0.116663 0.050447
vt 0.203178 0.014612
vt 0.341844 0.028269
vt 0.250000 0.010000
vt 0.296822 0.014612
vt 0.383337 0.050447
vt 0.419706 0.080294
vt 0.449553 0.116663
vt 0.485388 0.203178
vt 0.471731 0.341844
vt 0.490000 0.250000
vt 0.485388 0.296822
vt 0.449553 0.383337
vt 0.419706 0.419706
vt 0.383337 0.449553
vt 0.000000 1.000000
vt 0.750000 0.490000
vt 0.796822 0.485388
vt 0.841844 0.471731
vt 0.883337 0.449553
vt 0.919706 0.419706
vt 0.949553 0.383337
vt 0.971731 0.341844
vt 0.985388 0.296822
vt 0.990000 0.250000
vt 0.985388 0.203178
vt 0.971731 0.158156
vt 0.949553 0.116663
vt 0.919706 0.080294
vt 0.883337 0.050447
vt 0.841844 0.028269
vt 0.750000 0.010000
vt 0.703178 0.014612
vt 0.658156 0.028269
vt 0.616663 0.050447
vt 0.580294 0.080294
vt 0.550447 0.116663
vt 0.528269 0.158156
vt 0.510000 0.250000
vt 0.514612 0.296822
vt 0.528269 0.341844
vt 0.550447 0.383337
vt 0.580294 0.419706
vt 0.616663 0.449553
vt 0.658156 0.471731
vn 0.8619 -0.0980 0.4976
vn 0.8287 -0.2903 0.4785
vn 0.7638 -0.4714 0.4410
vn 0.6694 -0.6344 0.3865
vn 0.5494 -0.7730 0.3172
vn 0.4082 -0.8819 0.2357
vn 0.2514 -0.9569 0.1451
vn 0.0849 -0.9952 0.0490
vn -0.0849 -0.9952 -0.0490
vn -0.2514 -0.9569 -0.1451
vn -0.4082 -0.8819 -0.2357
vn -0.5494 -0.7730 -0.3172
vn -0.6694 -0.6344 -0.3865
vn -0.7638 -0.4714 -0.4410
vn -0.8287 -0.2903 -0.4785
vn -0.8619 -0.0980 -0.4976
vn -0.8619 0.0980 -0.4976
vn -0.8287 0.2903 -0.4785
vn -0.7638 0.4714 -0.4410
vn -0.6694 0.6344 -0.3865
vn -0.5494 0.7730 -0.3172
vn -0.4082 0.8819 -0.2357
vn -0.2514 0.9569 -0.1451
vn -0.0849 0.9952 -0.0490
vn 0.0849 0.9952 0.0490
vn 0.2514 0.9569 0.1451
vn 0.4082 0.8819 0.2357
vn 0.5494 0.7730 0.3172
vn 0.6694 0.6344 0.3865
vn 0.7638 0.4714 0.4410
vn -0.5000 0.0000 0.8660
vn 0.8287 0.2903 0.4785
vn 0.8619 0.0980 0.4976
vn 0.5000 0.0000 -0.8660
usemtl Material.003
s off
f 354/599/432 355/600/432 353/601/432
f 356/602/433 357/603/433 355/600/433
f 358/604/434 359/605/434 357/603/434
f 360/606/435 361/607/435 359/605/435
f 362/608/436 363/609/436 361/607/436
f 364/610/437 365/611/437 363/609/437
f 366/612/438 367/613/438 365/611/438
f 368/614/439 369/615/439 367/613/439
f 370/616/440 371/617/440 369/615/440
f 372/618/441 373/619/441 371/617/441
f 374/620/442 375/621/442 373/619/442
f 376/622/443 377/623/443 375/621/443
f 378/624/444 379/625/444 377/623/444
f 380/626/445 381/627/445 379/625/445
f 382/628/446 383/629/446 381/627/446
f 384/630/447 385/631/447 383/629/447
f 386/632/448 387/633/448 385/631/448
f 388/634/449 389/635/449 387/633/449
f 390/636/450 391/637/450 389/635/450
f 392/638/451 393/639/451 391/637/451
f 394/640/452 395/641/452 393/639/452
f 396/642/453 397/643/453 395/641/453
f 398/644/454 399/645/454 397/643/454
f 400/646/455 401/647/455 399/645/455
f 402/648/456 403/649/456 401/647/456
f 404/650/457 405/651/457 403/649/457
f 406/652/458 407/653/458 405/651/458
f 408/654/459 409/655/459 407/653/459
f 410/656/460 411/657/460 409/655/460
f 412/658/461 413/659/461 411/657/461
f 390/660/462 374/661/462 358/662/462
f 414/663/463 415/664/463 413/659/463
f 416/665/464 353/666/464 415/664/464
f 383/667/465 399/668/465 415/669/465
f 354/599/432 356/602/432 355/600/432
f 356/602/433 358/604/433 357/603/433
f 358/604/434 360/606/434 359/605/434
f 360/606/435 362/608/435 361/607/435
f 362/608/436 364/610/436 363/609/436
f 364/610/437 366/612/437 365/611/437
f 366/612/438 368/614/438 367/613/438
f 368/614/439 370/616/439 369/615/439
f 370/616/440 372/618/440 371/617/440
f 372/618/441 374/620/441 373/619/441
f 374/620/442 376/622/442 375/621/442
f 376/622/443 378/624/443 377/623/443
f 378/624/444 380/626/444 379/625/444
f 380/626/445 382/628/445 381/627/445
f 382/628/446 384/630/446 383/629/446
f 384/630/447 386/632/447 385/631/447
f 386/632/448 388/634/448 387/633/448
f 388/634/449 390/636/449 389/635/449
f 390/636/450 392/638/450 391/637/450
f 392/638/451 394/640/451 393/639/451
f 394/640/452 396/642/452 395/641/452
f 396/642/453 398/644/453 397/643/453
f 398/644/454 400/646/454 399/645/454
f 400/646/455 402/648/455 401/647/455
f 402/648/456 404/650/456 403/649/456
f 404/650/457 406/652/457 405/651/457
f 406/652/458 408/654/458 407/653/458
f 408/654/459 410/656/459 409/655/459
f 410/656/460 412/658/460 411/657/460
f 412/658/461 414/663/461 413/659/461
f 358/662/462 356/670/462 354/671/462
f 354/671/462 416/672/462 358/662/462
f 416/672/462 414/673/462 358/662/462
f 414/673/462 412/674/462 410/675/462
f 410/675/462 408/676/462 406/677/462
f 406/677/462 404/678/462 402/679/462
f 402/679/462 400/680/462 406/677/462
f 400/680/462 398/681/462 406/677/462
f 398/681/462 396/682/462 394/683/462
f 394/683/462 392/684/462 390/660/462
f 390/660/462 388/685/462 382/686/462
f 388/685/462 386/687/462 382/686/462
f 386/687/462 384/688/462 382/686/462
f 382/686/462 380/689/462 378/690/462
f 378/690/462 376/691/462 374/661/462
f 374/661/462 372/692/462 366/693/462
f 372/692/462 370/694/462 366/693/462
f 370/694/462 368/695/462 366/693/462
f 366/693/462 364/696/462 362/697/462
f 362/697/462 360/698/462 358/662/462
f 414/673/462 410/675/462 358/662/462
f 410/675/462 406/677/462 358/662/462
f 398/681/462 394/683/462 406/677/462
f 394/683/462 390/660/462 406/677/462
f 382/686/462 378/690/462 390/660/462
f 378/690/462 374/661/462 390/660/462
f 366/693/462 362/697/462 374/661/462
f 362/697/462 358/662/462 374/661/462
f 358/662/462 406/677/462 390/660/462
f 414/663/463 416/665/463 415/664/463
f 416/665/464 354/699/464 353/666/464
f 415/669/465 353/700/465 355/701/465
f 355/701/465 357/702/465 359/703/465
f 359/703/465 361/704/465 363/705/465
f 363/705/465 365/706/465 367/707/465
f 367/707/465 369/708/465 371/709/465
f 371/709/465 373/710/465 375/711/465
f 375/711/465 377/712/465 379/713/465
f 379/713/465 381/714/465 383/667/465
f 383/667/465 385/715/465 387/716/465
f 387/716/465 389/717/465 391/718/465
f 391/718/465 393/719/465 395/720/465
f 395/720/465 397/721/465 399/668/465
f 399/668/465 401/722/465 403/723/465
f 403/723/465 405/724/465 407/725/465
f 407/725/465 409/726/465 411/727/465
f 411/727/465 413/728/465 415/669/465
f 415/669/465 355/701/465 367/707/465
f 355/701/465 359/703/465 367/707/465
f 359/703/465 363/705/465 367/707/465
f 367/707/465 371/709/465 383/667/465
f 371/709/465 375/711/465 383/667/465
f 375/711/465 379/713/465 383/667/465
f 383/667/465 387/716/465 399/668/465
f 387/716/465 391/718/465 399/668/465
f 391/718/465 395/720/465 399/668/465
f 399/668/465 403/723/465 415/669/465
f 403/723/465 407/725/465 415/669/465
f 407/725/465 411/727/465 415/669/465
f 415/669/465 367/707/465 383/667/465
o Cylinder.006_Cylinder.001
v 0.054817 -0.149343 -0.310680
v 0.010132 -0.149343 -0.233283
v 0.053066 -0.169863 -0.311691
v 0.008381 -0.169863 -0.234294
v 0.047883 -0.189595 -0.314683
v 0.003198 -0.189595 -0.237287
v 0.039465 -0.207780 -0.319544
v -0.005220 -0.207780 -0.242147
v 0.028136 -0.223720 -0.326084
v -0.016549 -0.223720 -0.248687
v 0.014332 -0.236801 -0.334054
v -0.030353 -0.236801 -0.256657
v -0.001416 -0.246521 -0.343146
v -0.046101 -0.246521 -0.265750
v -0.018505 -0.252506 -0.353012
v -0.063190 -0.252506 -0.275616
v -0.036276 -0.254527 -0.363273
v -0.080961 -0.254527 -0.285876
v -0.054047 -0.252506 -0.373533
v -0.098732 -0.252506 -0.296136
v -0.071136 -0.246521 -0.383399
v -0.115821 -0.246521 -0.306002
v -0.086884 -0.236801 -0.392491
v -0.131570 -0.236801 -0.315095
v -0.100688 -0.223720 -0.400461
v -0.145373 -0.223720 -0.323064
v -0.112017 -0.207780 -0.407001
v -0.156702 -0.207780 -0.329605
v -0.120435 -0.189595 -0.411862
v -0.165120 -0.189595 -0.334465
v -0.125618 -0.169863 -0.414854
v -0.170303 -0.169863 -0.337458
v -0.127369 -0.149343 -0.415865
v -0.172054 -0.149343 -0.338468
v -0.125618 -0.128822 -0.414854
v -0.170303 -0.128822 -0.337458
v -0.120435 -0.109090 -0.411862
v -0.165120 -0.109090 -0.334465
v -0.112017 -0.090905 -0.407001
v -0.156702 -0.090905 -0.329605
v -0.100688 -0.074966 -0.400461
v -0.145373 -0.074966 -0.323064
v -0.086884 -0.061885 -0.392491
v -0.131570 -0.061885 -0.315095
v -0.071136 -0.052165 -0.383399
v -0.115821 -0.052165 -0.306002
v -0.054047 -0.046179 -0.373533
v -0.098732 -0.046179 -0.296136
v -0.036276 -0.044158 -0.363273
v -0.080961 -0.044158 -0.285876
v -0.018505 -0.046179 -0.353012
v -0.063190 -0.046179 -0.275616
v -0.001416 -0.052165 -0.343146
v -0.046101 -0.052165 -0.265750
v 0.014332 -0.061885 -0.334054
v -0.030353 -0.061885 -0.256657
v 0.028136 -0.074966 -0.326084
v -0.016549 -0.074966 -0.248687
v 0.039465 -0.090905 -0.319544
v -0.005220 -0.090905 -0.242147
v 0.047883 -0.109090 -0.314683
v 0.003198 -0.109090 -0.237287
v 0.053066 -0.128822 -0.311691
v 0.008381 -0.128822 -0.234294
vt 1.000000 1.000000
vt 0.968750 0.500000
vt 1.000000 0.500000
vt 0.968750 1.000000
vt 0.937500 0.500000
vt 0.937500 1.000000
vt 0.906250 0.500000
vt 0.906250 1.000000
vt 0.875000 0.500000
vt 0.875000 1.000000
vt 0.843750 0.500000
vt 0.843750 1.000000
vt 0.812500 0.500000
vt 0.812500 1.000000
vt 0.781250 0.500000
vt 0.781250 1.000000
vt 0.750000 0.500000
vt 0.750000 1.000000
vt 0.718750 0.500000
vt 0.718750 1.000000
vt 0.687500 0.500000
vt 0.687500 1.000000
vt 0.656250 0.500000
vt 0.656250 1.000000
vt 0.625000 0.500000
vt 0.625000 1.000000
vt 0.593750 0.500000
vt 0.593750 1.000000
vt 0.562500 0.500000
vt 0.562500 1.000000
vt 0.531250 0.500000
vt 0.531250 1.000000
vt 0.500000 0.500000
vt 0.500000 1.000000
vt 0.468750 0.500000
vt 0.468750 1.000000
vt 0.437500 0.500000
vt 0.437500 1.000000
vt 0.406250 0.500000
vt 0.406250 1.000000
vt 0.375000 0.500000
vt 0.375000 1.000000
vt 0.343750 0.500000
vt 0.343750 1.000000
vt 0.312500 0.500000
vt 0.312500 1.000000
vt 0.281250 0.500000
vt 0.281250 1.000000
vt 0.250000 0.500000
vt 0.250000 1.000000
vt 0.218750 0.500000
vt 0.218750 1.000000
vt 0.187500 0.500000
vt 0.187500 1.000000
vt 0.156250 0.500000
vt 0.156250 1.000000
vt 0.125000 0.500000
vt 0.125000 1.000000
vt 0.093750 0.500000
vt 0.093750 1.000000
vt 0.062500 0.500000
vt 0.158156 0.028269
vt 0.471731 0.158156
vt 0.341844 0.471731
vt 0.062500 1.000000
vt 0.031250 0.500000
vt 0.031250 1.000000
vt 0.000000 0.500000
vt 0.796822 0.014612
vt 0.514612 0.203178
vt 0.703178 0.485388
vt 0.296822 0.485388
vt 0.250000 0.490000
vt 0.203178 0.485388
vt 0.158156 0.471731
vt 0.116663 0.449553
vt 0.080294 0.419706
vt 0.050447 0.383337
vt 0.028269 0.341844
vt 0.014612 0.296822
vt 0.010000 0.250000
vt 0.014612 0.203178
vt 0.028269 0.158156
vt 0.050447 0.116663
vt 0.080294 0.080294
vt 0.116663 0.050447
vt 0.203178 0.014612
vt 0.341844 0.028269
vt 0.250000 0.010000
vt 0.296822 0.014612
vt 0.383337 0.050447
vt 0.419706 0.080294
vt 0.449553 0.116663
vt 0.485388 0.203178
vt 0.471731 0.341844
vt 0.490000 0.250000
vt 0.485388 0.296822
vt 0.449553 0.383337
vt 0.419706 0.419706
vt 0.383337 0.449553
vt 0.000000 1.000000
vt 0.750000 0.490000
vt 0.796822 0.485388
vt 0.841844 0.471731
vt 0.883337 0.449553
vt 0.919706 0.419706
vt 0.949553 0.383337
vt 0.971731 0.341844
vt 0.985388 0.296822
vt 0.990000 0.250000
vt 0.985388 0.203178
vt 0.971731 0.158156
vt 0.949553 0.116663
vt 0.919706 0.080294
vt 0.883337 0.050447
vt 0.841844 0.028269
vt 0.750000 0.010000
vt 0.703178 0.014612
vt 0.658156 0.028269
vt 0.616663 0.050447
vt 0.580294 0.080294
vt 0.550447 0.116663
vt 0.528269 0.158156
vt 0.510000 0.250000
vt 0.514612 0.296822
vt 0.528269 0.341844
vt 0.550447 0.383337
vt 0.580294 0.419706
vt 0.616663 0.449553
vt 0.658156 0.471731
vn 0.8619 -0.0980 0.4976
vn 0.8287 -0.2903 0.4785
vn 0.7638 -0.4714 0.4410
vn 0.6694 -0.6344 0.3865
vn 0.5494 -0.7730 0.3172
vn 0.4082 -0.8819 0.2357
vn 0.2514 -0.9569 0.1451
vn 0.0849 -0.9952 0.0490
vn -0.0849 -0.9952 -0.0490
vn -0.2514 -0.9569 -0.1451
vn -0.4082 -0.8819 -0.2357
vn -0.5494 -0.7730 -0.3172
vn -0.6694 -0.6344 -0.3865
vn -0.7638 -0.4714 -0.4410
vn -0.8287 -0.2903 -0.4785
vn -0.8619 -0.0980 -0.4976
vn -0.8619 0.0980 -0.4976
vn -0.8287 0.2903 -0.4785
vn -0.7638 0.4714 -0.4410
vn -0.6694 0.6344 -0.3865
vn -0.5494 0.7730 -0.3172
vn -0.4082 0.8819 -0.2357
vn -0.2514 0.9569 -0.1451
vn -0.0849 0.9952 -0.0490
vn 0.0849 0.9952 0.0490
vn 0.2514 0.9569 0.1451
vn 0.4082 0.8819 0.2357
vn 0.5494 0.7730 0.3172
vn 0.6694 0.6344 0.3865
vn 0.7638 0.4714 0.4410
vn -0.5000 0.0000 0.8660
vn 0.8287 0.2903 0.4785
vn 0.8619 0.0980 0.4976
vn 0.5000 0.0000 -0.8660
usemtl Material.004
s off
f 418/729/466 419/730/466 417/731/466
f 420/732/467 421/733/467 419/730/467
f 422/734/468 423/735/468 421/733/468
f 424/736/469 425/737/469 423/735/469
f 426/738/470 427/739/470 425/737/470
f 428/740/471 429/741/471 427/739/471
f 430/742/472 431/743/472 429/741/472
f 432/744/473 433/745/473 431/743/473
f 434/746/474 435/747/474 433/745/474
f 436/748/475 437/749/475 435/747/475
f 438/750/476 439/751/476 437/749/476
f 440/752/477 441/753/477 439/751/477
f 442/754/478 443/755/478 441/753/478
f 444/756/479 445/757/479 443/755/479
f 446/758/480 447/759/480 445/757/480
f 448/760/481 449/761/481 447/759/481
f 450/762/482 451/763/482 449/761/482
f 452/764/483 453/765/483 451/763/483
f 454/766/484 455/767/484 453/765/484
f 456/768/485 457/769/485 455/767/485
f 458/770/486 459/771/486 457/769/486
f 460/772/487 461/773/487 459/771/487
f 462/774/488 463/775/488 461/773/488
f 464/776/489 465/777/489 463/775/489
f 466/778/490 467/779/490 465/777/490
f 468/780/491 469/781/491 467/779/491
f 470/782/492 471/783/492 469/781/492
f 472/784/493 473/785/493 471/783/493
f 474/786/494 475/787/494 473/785/494
f 476/788/495 477/789/495 475/787/495
f 454/790/496 438/791/496 422/792/496
f 478/793/497 479/794/497 477/789/497
f 480/795/498 417/796/498 479/794/498
f 447/797/499 463/798/499 479/799/499
f 418/729/466 420/732/466 419/730/466
f 420/732/467 422/734/467 421/733/467
f 422/734/468 424/736/468 423/735/468
f 424/736/469 426/738/469 425/737/469
f 426/738/470 428/740/470 427/739/470
f 428/740/471 430/742/471 429/741/471
f 430/742/472 432/744/472 431/743/472
f 432/744/473 434/746/473 433/745/473
f 434/746/474 436/748/474 435/747/474
f 436/748/475 438/750/475 437/749/475
f 438/750/476 440/752/476 439/751/476
f 440/752/477 442/754/477 441/753/477
f 442/754/478 444/756/478 443/755/478
f 444/756/479 446/758/479 445/757/479
f 446/758/480 448/760/480 447/759/480
f 448/760/481 450/762/481 449/761/481
f 450/762/482 452/764/482 451/763/482
f 452/764/483 454/766/483 453/765/483
f 454/766/484 456/768/484 455/767/484
f 456/768/485 458/770/485 457/769/485
f 458/770/486 460/772/486 459/771/486
f 460/772/487 462/774/487 461/773/487
f 462/774/488 464/776/488 463/775/488
f 464/776/489 466/778/489 465/777/489
f 466/778/490 468/780/490 467/779/490
f 468/780/491 470/782/491 469/781/491
f 470/782/492 472/784/492 471/783/492
f 472/784/493 474/786/493 473/785/493
f 474/786/494 476/788/494 475/787/494
f 476/788/495 478/793/495 477/789/495
f 422/792/496 420/800/496 418/801/496
f 418/801/496 480/802/496 422/792/496
f 480/802/496 478/803/496 422/792/496
f 478/803/496 476/804/496 474/805/496
f 474/805/496 472/806/496 470/807/496
f 470/807/496 468/808/496 466/809/496
f 466/809/496 464/810/496 470/807/496
f 464/810/496 462/811/496 470/807/496
f 462/811/496 460/812/496 458/813/496
f 458/813/496 456/814/496 454/790/496
f 454/790/496 452/815/496 446/816/496
f 452/815/496 450/817/496 446/816/496
f 450/817/496 448/818/496 446/816/496
f 446/816/496 444/819/496 442/820/496
f 442/820/496 440/821/496 438/791/496
f 438/791/496 436/822/496 430/823/496
f 436/822/496 434/824/496 430/823/496
f 434/824/496 432/825/496 430/823/496
f 430/823/496 428/826/496 426/827/496
f 426/827/496 424/828/496 422/792/496
f 478/803/496 474/805/496 422/792/496
f 474/805/496 470/807/496 422/792/496
f 462/811/496 458/813/496 470/807/496
f 458/813/496 454/790/496 470/807/496
f 446/816/496 442/820/496 454/790/496
f 442/820/496 438/791/496 454/790/496
f 430/823/496 426/827/496 438/791/496
f 426/827/496 422/792/496 438/791/496
f 422/792/496 470/807/496 454/790/496
f 478/793/497 480/795/497 479/794/497
f 480/795/498 418/829/498 417/796/498
f 479/799/499 417/830/499 419/831/499
f 419/831/499 421/832/499 423/833/499
f 423/833/499 425/834/499 427/835/499
f 427/835/499 429/836/499 431/837/499
f 431/837/499 433/838/499 435/839/499
f 435/839/499 437/840/499 439/841/499
f 439/841/499 441/842/499 443/843/499
f 443/843/499 445/844/499 447/797/499
f 447/797/499 449/845/499 451/846/499
f 451/846/499 453/847/499 455/848/499
f 455/848/499 457/849/499 459/850/499
f 459/850/499 461/851/499 463/798/499
f 463/798/499 465/852/499 467/853/499
f 467/853/499 469/854/499 471/855/499
f 471/855/499 473/856/499 475/857/499
f 475/857/499 477/858/499 479/799/499
f 479/799/499 419/831/499 431/837/499
f 419/831/499 423/833/499 431/837/499
f 423/833/499 427/835/499 431/837/499
f 431/837/499 435/839/499 447/797/499
f 435/839/499 439/841/499 447/797/499
f 439/841/499 443/843/499 447/797/499
f 447/797/499 451/846/499 463/798/499
f 451/846/499 455/848/499 463/798/499
f 455/848/499 459/850/499 463/798/499
f 463/798/499 467/853/499 479/799/499
f 467/853/499 471/855/499 479/799/499
f 471/855/499 475/857/499 479/799/499
f 479/799/499 431/837/499 447/797/499
================================================
FILE: example/toycar/proxy.txt
================================================
-3.008041751643567713e-02 -1.599077930202584819e-01 3.288800368815070208e-01
1.013531047142105096e-01 -1.094049703092424136e-01 3.179610295033679646e-01
2.034530673947447521e-02 -2.447102350337516108e-01 3.510041414025183437e-01
-8.049394250692221142e-02 -2.270424101683294893e-01 2.977256683724919251e-01
-1.035831822649268680e-01 -1.529982836124034740e-01 2.687254279384550526e-01
-7.801099839981905093e-02 -8.021783864686982124e-02 2.767081740553794522e-01
-1.578350006917617412e-02 -4.440321033169543352e-02 3.312812981087256414e-01
5.142522412232817391e-02 -9.874635320864461052e-02 3.759376126933716700e-01
5.239386321784535866e-02 -1.781467799375712480e-01 3.764966235751158741e-01
9.217898455935311097e-02 -2.028351108527517987e-01 2.962708753801160944e-01
-3.995189079805573640e-02 -1.923918974548007865e-01 2.199856349152522206e-01
-4.369320124635300839e-02 -1.178801277431206357e-01 2.178255354504805563e-01
5.455798080443195752e-02 -5.950989374390465436e-02 2.745506387778722801e-01
2.605886069166897523e-02 -2.347891860684041265e-01 2.580968028993643748e-01
1.907982499403319543e-01 1.127749979496002197e-01 2.231034851099198968e-01
3.949222337383241632e-03 2.545270025730133057e-01 -4.655564414018188713e-03
5.415562364869509759e-02 2.545270025730133057e-01 4.647495877018791022e-02
1.366487802351336323e-02 2.545270025730133057e-01 1.320197606793362399e-01
-2.430159497857822259e-01 -1.336503345951309540e-01 -4.235633867332915803e-01
4.148035139471684118e-01 -6.251759223867692228e-02 1.529814158855632433e-01
4.312497778163669748e-01 -1.380651770694928115e-01 1.244956950161623527e-01
4.705860874872060950e-01 -7.572064866862228405e-02 5.636331271780845031e-02
3.784600118530122015e-01 -9.082486238364399778e-02 2.159301113646339143e-01
-2.605984668721801900e-01 -1.428789943456649780e-01 -2.692520992901657473e-01
-2.973503666921596400e-01 -1.428789943456649780e-01 -2.104874941013184508e-01
-3.657282212543175870e-01 -1.428789943456649780e-01 -1.470715554115350487e-01
2.254026951730743467e-01 -9.629995046930588110e-02 4.133945725863367171e-01
-2.043614550653164927e-02 5.910098291360462580e-03 2.714592757789404676e-01
7.472556515896443263e-02 -1.428789943456649780e-01 2.218007723627005157e-01
3.634551373786878981e-02 -1.428789943456650058e-01 8.281396092391980768e-02
9.547915123966020656e-02 -1.428789943456649780e-01 1.579314600597122864e-01
1.411394974303601391e-01 -1.428789943456649780e-01 8.044173091732546854e-02
1.811012713208198233e-01 -1.428789943456649780e-01 2.074111740904903411e-02
1.468570432498366232e-02 -1.428789943456649780e-01 2.700278989966745646e-01
-1.262033495734976585e-02 -1.428789943456649780e-01 1.277334994610299646e-01
2.755141121667840087e-01 -9.894225106639023548e-03 -1.241901093934618394e-01
2.287573880347565369e-01 4.577518963883588088e-02 -1.511852928496118520e-01
-2.797824133870719931e-01 1.193901972922412347e-01 1.217254988595727477e-01
-2.105137036425477337e-01 -6.714871189910018301e-02 1.617179969380473925e-01
-2.555646215562525914e-01 -4.996055922964065532e-04 1.357077138716764886e-01
-4.497862167869904337e-01 -1.027860740606224565e-01 2.357465349763013598e-02
3.939787359150183077e-01 -1.258293227146771720e-01 -5.579476380954233317e-02
-4.692477527168905649e-01 -6.214819451458810512e-02 -5.868137297758706239e-02
-4.289918892110392012e-01 -6.847918240758138486e-02 -1.284064667924819658e-01
-3.593118678433361302e-01 -1.383802812742367661e-01 -2.490956187263556887e-01
-1.655931479124009842e-01 -1.216285426479010723e-02 -3.788632366808567520e-01
-4.878454830840295992e-02 -1.428789943456649780e-01 3.443439515744133039e-02
-1.789680688822672117e-01 -1.428789943456649780e-01 8.055605089341308367e-02
-1.501914949639726882e-01 -1.428789943456649780e-01 1.705229140192431325e-01
-5.444385482892021705e-02 -1.428789943456649780e-01 -4.660065495366726152e-02
8.260351842869512184e-02 -1.428789943456649780e-01 -1.533432347360902948e-01
-1.161623123561457971e-01 -1.428789943456649780e-01 1.049495630597240547e-01
-1.996999125300886746e-01 -1.428789943456649780e-01 -7.240452576207294377e-02
-1.463767459633658619e-01 -1.428789943456649780e-01 -1.769580915486140826e-01
-2.912881505187286946e-01 -1.428789943456650058e-01 -7.675701262425024818e-02
-3.293793167328769589e-01 -1.428789943456649780e-01 8.203691285107776732e-03
7.008685976606088275e-02 -7.470981589908523568e-02 -2.427938464370544558e-01
8.312333124709664345e-04 3.546329003550421216e-03 -2.827781702430718913e-01
1.105013378770156873e-01 -1.341047278400405163e-01 -2.194607879985099042e-01
2.500444809513239219e-01 -1.428789943456649780e-01 2.884718128745404897e-01
2.835024345970282300e-01 -1.428789943456649780e-01 3.569411634360946106e-01
1.895541485459726339e-01 -1.428789943456649780e-01 3.651491847741398789e-01
3.435330574973436413e-01 -1.428789943456649780e-01 2.620451658544767359e-01
1.682868722357639957e-01 -1.428789943456649780e-01 2.529453126371088434e-01
3.254740107520199821e-01 -1.428789943456649780e-01 7.072386039482242426e-02
-9.408063322905441117e-02 2.545270025730133057e-01 -5.006765994726418423e-02
-1.941749711094575281e-01 2.545270025730133057e-01 -5.814426166684079009e-03
-9.462577157485194124e-02 2.545270025730133057e-01 8.605907026345040178e-02
1.477647316463838179e-01 -9.333362517890712862e-02 3.685706796749736447e-01
1.031591772436869503e-01 2.429597187009943926e-01 1.984431510147755817e-01
1.392263548929987271e-01 2.489180454518133412e-01 1.237353433974014150e-01
1.583073161227711578e-01 1.909417855785882912e-01 2.097605422515985696e-01
3.369996084262928404e-01 9.659439884027823386e-02 1.350867793740902489e-01
3.007415964700749234e-01 7.734695152541309904e-02 2.784494533414554884e-01
4.242117711351314213e-01 7.120868642227390355e-02 8.211783816108243761e-02
-4.196047184130055063e-01 7.770209147877035760e-02 1.135375084250377087e-02
-1.927431244846800340e-01 1.075333320606519094e-01 -1.802260620310700556e-01
-2.582052310537700235e-01 1.061204907791518243e-01 -1.268894194190239577e-01
-1.274272709428995662e-02 2.530649370803728093e-01 -1.941713814030470997e-01
-6.242112273996372473e-02 2.332726251465234157e-01 -2.779461943784671818e-01
8.386974044986750254e-02 1.862282391161536321e-01 -2.314583170612571628e-01
-3.397276177276529197e-01 1.551572900475051697e-01 -1.428186360541509534e-02
-3.186203470583528485e-01 1.900407375708308977e-01 5.733144518954495783e-02
1.197065047365872964e-01 2.538531237085444592e-01 -1.116640355436684107e-01
2.182680246885777275e-01 2.279683475880480148e-01 -1.235282244620659886e-01
1.511028310807542574e-01 2.116942203755817764e-01 -1.788863984922892392e-01
2.461605944710053007e-01 2.439843982265741673e-01 -6.040415890926174175e-02
2.778871650073769795e-01 2.007803790260917487e-01 -1.056664112637188679e-01
3.189841076107164941e-01 1.385615591988948736e-01 -9.446383897235638427e-02
1.189669381589373159e-01 1.251166804254961740e-01 3.319798854604533167e-01
1.334717195755382724e-01 6.148428227289885750e-02 3.592350615608139175e-01
4.136644572179863921e-01 -3.148325842303777156e-02 -4.442917744860291895e-02
4.872767539507336099e-01 -1.248004925158877065e-01 -1.928817923386060527e-03
2.598131514658370556e-01 1.127749979496002058e-01 7.325551388992862512e-02
2.399449834590868247e-01 1.127749979496002197e-01 1.489211658068920197e-01
1.637306261874847957e-01 2.545270025730133057e-01 -5.106195456459958898e-02
1.347541331961190081e-01 2.545270025730133057e-01 5.492540483294672260e-02
8.676192136681346034e-02 2.545270025730133057e-01 -3.344917492862551495e-02
-2.325764979574476421e-01 -5.058706411309332046e-02 -4.175361592750670425e-01
4.417279914642022742e-01 -5.504306126151409240e-03 1.063469244773902200e-01
2.706575842492505712e-01 -7.384417160236255578e-03 4.026491064012981402e-01
3.537159531353744613e-01 2.058152089813053767e-02 2.587880124303453266e-01
3.403789917704680423e-01 -6.186520513927418891e-02 2.818882716568940072e-01
3.057312595629535501e-01 -4.127418748628245565e-02 3.418998120450700284e-01
3.803239479742143159e-01 -1.992992323647464065e-02 2.127016842586820222e-01
-4.261034935157498316e-01 -1.428789943456649780e-01 -1.010039896288800476e-01
-3.122390692199679685e-01 -1.428789943456649780e-01 -3.289908214016046273e-01
-2.330165784510331295e-01 -1.428789943456649780e-01 -3.368451095082249935e-01
-4.782602657034688565e-01 -1.428789943456650058e-01 -4.192712584261983760e-02
1.920451739789227619e-01 -3.668612122380034146e-02 3.941358210265608619e-01
-1.150784960732098849e-01 -9.567204891104040543e-02 2.168177545607453405e-01
-1.089900743418871776e-01 -2.533046948310040125e-02 2.203328894155480999e-01
-1.078630637909309409e-01 8.672530960008487111e-02 2.209835661006100960e-01
-3.454440276906847740e-02 1.049184016567535210e-01 2.633139092073489840e-01
1.556824013415305796e-01 -1.428789943456649780e-01 -1.050165460270968409e-01
7.991567145689520979e-02 -1.428789943456649780e-01 2.248887233850625084e-02
-7.167941732491081575e-02 -1.428789943456649780e-01 1.613096720801515671e-01
7.325542128028081357e-02 -1.428789943456649780e-01 -7.130972694788569044e-02
1.372036190991233473e-01 -1.428789943456649780e-01 -2.988922683012296133e-02
1.729484254488463518e-02 -1.428789943456650058e-01 -1.415370855345507439e-02
1.577782687795628580e-01 2.191164906679177876e-02 -1.921653739271357575e-01
1.893638750093445955e-01 1.385529372308320073e-01 -1.739292966446681077e-01
1.313930308932143853e-01 1.377144703787054614e-01 -2.073989975502296135e-01
1.472306186484788615e-01 -6.810335656305760432e-02 -1.982551023160597781e-01
2.545022876503374998e-01 1.314636963132230774e-01 -1.363213711124874605e-01
-1.868203767525979386e-01 1.786503428035846747e-02 1.753974105378350856e-01
-2.151942230670570444e-01 8.127334966633947833e-02 1.590156849593584998e-01
-1.748978709222643313e-01 1.510833573849916789e-01 1.822809051672108216e-01
-4.791448865167574067e-01 -2.467139965026992482e-02 6.624001175274988121e-03
3.241447509200608978e-01 5.567935342742133531e-02 -9.611329372392368731e-02
-2.726755912605816068e-01 1.068777784634767497e-02 -3.991538210955010713e-01
-3.096693640763498179e-01 -4.568018542517236769e-02 -3.350788254176311765e-01
-3.936657342674756421e-01 -2.420378425809303208e-02 -1.895930680239139576e-01
-3.578482027716477498e-01 -4.318888317251705355e-02 -2.516307571024639089e-01
-4.295496224950269837e-01 4.900452119352645408e-03 -1.274404458911024274e-01
-7.775392594473359842e-02 1.529188315411551119e-02 -3.281490432592755746e-01
-6.054801764763761973e-02 1.093957238003369775e-01 -3.182151692810418209e-01
-1.232378189343877745e-01 7.090703747702713222e-02 -3.544092820793880994e-01
-1.262378997549711279e-01 -1.428789943456649780e-01 -2.781453358869208359e-02
-4.414624715687544276e-02 -1.428789943456649780e-01 -2.111827615323477736e-01
2.651351355979224972e-02 -1.428789943456649780e-01 -1.960732102764978502e-01
-2.546172036532631244e-01 -1.428789943456649780e-01 5.848859574834407160e-02
-5.880049691055850430e-02 -1.428789943456649780e-01 -2.900926310199781755e-01
2.486785117628207212e-02 -1.428789943456649780e-01 -2.661460498579929501e-01
-2.127349085281885210e-01 -1.428789943456649780e-01 -2.342893989891133011e-03
-2.688791971700063566e-02 -1.428789943456649780e-01 -1.269991954670846024e-01
-3.602734222357412697e-01 -1.428789943456649780e-01 -5.638119508547503184e-02
-2.283993155744980796e-01 -1.428789943456649780e-01 -1.712802826600821893e-01
-1.684064935726128642e-01 -1.428789943456649780e-01 -2.561711855814242655e-01
2.891834990487993431e-02 6.976702040393384374e-02 -2.665622406194935734e-01
8.693538896316693743e-02 3.162575630992555537e-02 -2.330664483674740883e-01
9.444231881550420316e-03 1.491078990886201550e-01 -2.778055067735180073e-01
2.756087072625218459e-01 -1.428789943456649780e-01 2.155095689474603171e-01
4.006236009569492662e-01 -1.428789943456649780e-01 4.056265458202543139e-02
3.507182302090796622e-01 -1.428789943456649780e-01 1.571384272949141458e-01
2.498907615813568406e-01 -1.428789943456649780e-01 -1.173257342499487918e-02
3.287783020285070523e-01 -1.428789943456649780e-01 -5.299012469293713778e-03
2.450240322898214196e-01 -1.428789943456649780e-01 9.155930905348749760e-02
1.879429033557862283e-01 -1.428789943456649780e-01 1.503281130312500158e-01
-8.265023970787990781e-02 2.545270025730133057e-01 1.656358147354532184e-02
-2.094310554941566579e-02 2.545270025730133057e-01 -7.142346276683707307e-02
-5.347168678086042931e-02 2.545270025730133057e-01 -1.414801937408750043e-01
4.376479270522913578e-02 2.545270025730133057e-01 -9.768977877178744440e-02
-9.780628139040459246e-02 2.545270025730133057e-01 -1.927153495590092080e-01
7.006834983514051429e-02 3.241020236654239967e-02 3.237121732067975710e-01
4.830811011963696833e-02 9.553118299532642466e-02 3.111487586529718130e-01
9.461701177451975475e-02 -3.635246682806635848e-02 3.378855012913410949e-01
-3.126430926456684745e-01 -1.208117264506714944e-01 1.027534593362453286e-01
-3.368659328087125759e-01 2.697906729008384130e-02 8.876864737214465451e-02
-4.134339176297990859e-01 1.674118243149566934e-02 4.456289515706475524e-02
2.397515623021320152e-01 1.813241571009908681e-01 8.844801006956456990e-02
2.570711183495646424e-01 2.048164171525036215e-01 1.020015336763511679e-02
3.282870814547192984e-01 1.355040024490944406e-01 2.920799836825857879e-02
2.236424555273570647e-01 1.013672892978989409e-01 3.007302708228469790e-01
3.009054959624005710e-01 9.522351928696556889e-02 2.054169773461973769e-01
2.569810993670229027e-01 6.837646895596272145e-02 3.780989067380177149e-01
3.924458142133960825e-01 5.659190446541730035e-02 1.664083501106428109e-01
3.047400013506773031e-01 3.173638331450114730e-02 3.419151561242582127e-01
-3.636720723858299920e-01 4.258917515241231189e-02 -2.280796360208950935e-01
-3.203772037942676509e-01 3.553182992108781801e-02 -3.097604569419006304e-01
4.204518094672002881e-01 8.080103028450966174e-02 -1.340627841337434581e-02
4.906450474262099570e-01 4.975833532237387141e-02 5.136059119054338547e-03
-7.412972330301498269e-02 2.515102291643478427e-01 1.534947721551515598e-01
-2.019878130697033702e-02 2.485970187160612965e-01 1.965155059410869098e-01
-2.996918805559973392e-02 1.942658716807009467e-01 2.595333305020108838e-01
-1.109741162557867089e-01 1.715929260153348235e-01 2.187152473603267699e-01
7.775113536581725049e-02 2.492984698799731724e-01 -1.611040103463334539e-01
1.168697998445394556e-02 2.206054156973578928e-01 -2.513167521132639304e-01
-1.730762378033310900e-01 2.529951035960633643e-01 8.742268077670814486e-02
-2.635816937459818377e-01 2.231629321910354458e-01 1.029280522902447093e-01
-2.126571113693401238e-01 2.064930993201087861e-01 1.470545445431984954e-01
-2.729933654588342362e-01 2.372939622364097223e-01 3.481841743030971575e-02
-3.719247667438944083e-01 1.069835647135208012e-01 6.169001714315359985e-02
3.098564848075335276e-01 1.743266277807886122e-01 -3.360441751003080679e-02
3.585003636803585358e-01 1.197267566739983607e-01 -3.679925968302116823e-02
3.853594974274682228e-01 4.688823796079921724e-02 -6.040070461254069822e-02
1.942715408580548975e-01 5.001888235395933469e-02 3.901103337589299502e-01
1.278223314691873669e-01 1.774210768273642436e-01 2.796719130756642469e-01
4.924609614322386641e-02 2.304823579720638727e-01 2.717676939177878226e-01
3.693980686025515048e-02 1.661105503727139943e-01 3.039778079662383004e-01
4.991717076269726605e-01 -2.469559581500465478e-02 4.938782059417700862e-03
-2.856089965686683385e-01 1.994113191796140216e-01 -2.236486538739460364e-02
-3.024192785217422186e-01 1.473134704940958217e-01 -9.597501001968400391e-02
-2.580226595222575581e-01 1.356559038775583814e-01 -1.958584014117798200e-01
-2.433270353032729594e-01 1.984520228149331544e-01 -9.749075928488135090e-02
-3.571781203861900678e-01 9.536698004416349628e-02 -7.508542776191258483e-02
-3.176674592942919095e-01 9.302152719097141198e-02 -1.591408369890882357e-01
-4.755664982463075341e-01 5.492915780381281421e-02 -1.649502092853008181e-02
-1.516455389917267249e-01 1.445457732525340100e-01 -3.256328309101772511e-01
-1.982086699292902321e-01 1.753670152460271581e-01 -2.211576054616488096e-01
-1.818971584355340010e-01 2.297407404114726326e-01 -1.421964440604682189e-01
-1.915641082715185961e-01 1.075420007109642029e-01 -2.761363600472928059e-01
-2.673572353181769556e-01 8.856399064879554028e-02 -2.716996179717711679e-01
-2.416311264286911120e-01 8.282539076326148053e-02 -3.426223111674723265e-01
-4.090870243927528516e-01 6.632994816486115219e-02 -1.074304478101620886e-01
-2.195762721858716682e-01 5.502769548714448594e-02 -3.997460444408329483e-01
-1.369698589919506326e-01 2.125300039373475558e-01 -2.533224461372173275e-01
-8.948110666560515902e-02 1.753797901771273204e-01 -3.196861167701421369e-01
-2.114484414496873854e-01 -1.788227252722176530e-01 1.410261365362814878e-01
-3.519655667935992494e-01 -6.786675487644851235e-02 5.206582515355913376e-02
-3.416793595032953101e-01 -1.988347594950012898e-01 1.489783852841617406e-01
-2.759873298857198343e-01 -1.859437006929063052e-01 1.869053873427343471e-01
-3.018495936775782984e-01 -2.481503412206051418e-01 1.664756376843169738e-01
-4.051398730365156631e-01 -2.122160043228668580e-01 1.079830886435250159e-01
-3.978261833611123621e-01 -1.529169680684189236e-01 5.328480608171727584e-02
-4.080384739579243636e-01 -8.799597965390372478e-02 1.106658699008653407e-01
-2.541897579859542233e-01 -1.244874882215506462e-01 1.994903110246473432e-01
-3.138272957274786390e-01 -5.551717141042094433e-02 1.650582659743432545e-01
-3.450504653647801789e-01 -2.234674961503602120e-01 4.383649668397857080e-02
-2.539560210662785700e-01 -7.823689166047237409e-02 9.642987639956755730e-02
-2.664763343034576848e-01 -2.349108522099726537e-01 8.920163299073329000e-02
2.790345177430171297e-01 -2.530610187096446473e-01 -1.545677670982837926e-01
1.736033640188132665e-01 -1.824732327764439499e-01 -2.004030367542102842e-01
2.088692474447170488e-01 -5.193928747294235260e-02 -1.418254253609498861e-01
3.600428296122910043e-01 -1.055644670743211000e-01 -1.340617343024393116e-01
2.574133602677568522e-01 -1.526437980508147385e-01 -1.937106824061379795e-01
2.646361395875617695e-01 -8.104813499679636157e-02 -1.895407824776961236e-01
2.961838783751849502e-01 -2.268783914588647643e-01 -7.298486723863251147e-02
1.773156314213608642e-01 -2.137034342064950965e-01 -1.367595413992551134e-01
2.403467345185559267e-01 -1.516439790642750696e-01 -1.003687897206735347e-01
2.728150902802282673e-01 -7.871683828938171035e-02 -8.162249964532684865e-02
2.014078898154263997e-01 -1.058330469747980201e-01 -2.260452389893339575e-01
3.246449162331409566e-01 -1.769881157986896436e-01 -1.548947648054231352e-01
-7.788669715009227668e-02 -2.441419751283070982e-01 -3.806077785350738818e-01
-5.160112790645378367e-02 -4.577384902151235280e-02 -3.695420008820210489e-01
2.494531597038172899e-02 -7.201135650198439708e-02 -3.277582127309514037e-01
-9.447946519723325609e-02 -2.316043750234147991e-01 -2.936805843800783800e-01
-6.362454839605888313e-02 -1.172946859798069674e-01 -3.790623054349018806e-01
-1.561110138167150774e-01 -1.151749201080639745e-01 -3.537604559853596009e-01
-1.723325528776649754e-02 -8.750021807197132240e-02 -2.490823030283906836e-01
-7.653089415271821516e-02 -6.424596867740077488e-02 -2.833185158078170840e-01
-1.582701354136146210e-01 -1.857498462706901887e-01 -3.305102019737782282e-01
-2.820405286733868380e-02 -2.076023097634119297e-01 -2.554167095516339803e-01
1.954743515300430903e-03 -2.420209404858527091e-01 -3.411998447192706729e-01
-4.866162666691160876e-03 -1.601395174364015350e-01 -3.451379835336461088e-01
-1.119421595143438170e-01 -1.689076634149663525e-01 -4.069582526720981197e-01
================================================
FILE: example/turtle/mesh.obj
================================================
# Blender v3.6.1 OBJ File: 'turbo.blend'
# www.blender.org
mtllib 0000_Collection.mtl
o Plane
v -0.119246 -0.093755 0.346267
v 0.355785 -0.093755 0.023394
v 0.352602 -0.093755 -0.015520
v -0.494059 -0.093755 0.000350
v 0.008464 -0.093755 -0.325465
v 0.355785 -0.093755 -0.003287
v 0.325765 -0.093755 -0.118672
v 0.272564 -0.093755 -0.185467
v 0.199248 -0.093755 -0.256904
v 0.125469 -0.093755 -0.290065
v -0.273907 -0.093755 -0.287529
v -0.367872 -0.093755 -0.231661
v -0.433396 -0.093755 -0.167171
v -0.471304 -0.093755 -0.091541
v -0.000555 -0.093755 0.346267
v 0.095981 -0.093755 0.325165
v 0.152612 -0.093755 0.299450
v 0.212431 -0.093755 0.264595
v 0.271383 -0.093755 0.215006
v 0.311769 -0.093755 0.158043
v 0.339950 -0.093755 0.091125
v -0.209241 -0.093755 0.331396
v -0.292510 -0.093755 0.301898
v -0.362768 -0.093755 0.259424
v -0.427082 -0.093755 0.193280
v -0.473980 -0.093755 0.110233
v -0.069137 0.139619 -0.180971
v -0.069137 0.062181 -0.276008
v -0.069137 -0.038997 -0.327441
v -0.069137 -0.093755 -0.334060
v -0.052808 0.181528 -0.055509
v -0.037105 0.165556 -0.118886
v -0.022634 0.139619 -0.177294
v -0.009950 0.104713 -0.228489
v 0.000459 0.062181 -0.270504
v 0.008194 0.013656 -0.301724
v 0.012957 -0.038997 -0.320949
v 0.014566 -0.093755 -0.327441
v -0.037105 0.181528 -0.051685
v -0.006305 0.165556 -0.111384
v 0.022081 0.139619 -0.166404
v 0.046962 0.104713 -0.214629
v 0.067381 0.062181 -0.254206
v 0.082554 0.013656 -0.283615
v 0.091897 -0.038997 -0.301724
v 0.095052 -0.093755 -0.307839
v -0.022634 0.181528 -0.045475
v 0.022081 0.165556 -0.099203
v 0.063292 0.139619 -0.148719
v 0.099413 0.104713 -0.192120
v 0.129057 0.062181 -0.227739
v 0.151084 0.013656 -0.254206
v 0.164648 -0.038997 -0.270504
v 0.169229 -0.093755 -0.276008
v -0.009950 0.181528 -0.037117
v 0.046962 0.165556 -0.082809
v 0.099413 0.139619 -0.124920
v 0.145386 0.104713 -0.161829
v 0.183116 0.062181 -0.192120
v 0.211151 0.013656 -0.214629
v 0.228415 -0.038997 -0.228489
v 0.234245 -0.093755 -0.233169
v 0.000459 0.181528 -0.026934
v 0.067381 0.165556 -0.062834
v 0.129057 0.139619 -0.095920
v 0.183116 0.104713 -0.124920
v 0.227481 0.062181 -0.148719
v 0.260447 0.013656 -0.166404
v 0.280748 -0.038997 -0.177294
v 0.287602 -0.093755 -0.180971
v -0.069137 0.186922 0.010401
v 0.008194 0.181528 -0.015316
v 0.082554 0.165556 -0.040044
v 0.151084 0.139619 -0.062834
v 0.211151 0.104713 -0.082809
v 0.260447 0.062181 -0.099203
v 0.297077 0.013656 -0.111384
v 0.319634 -0.038997 -0.118886
v 0.327251 -0.093755 -0.121418
v 0.012957 0.181528 -0.002709
v 0.091897 0.165556 -0.015316
v 0.164648 0.139619 -0.026934
v 0.228415 0.104713 -0.037117
v 0.280748 0.062181 -0.045475
v 0.319634 0.013656 -0.051685
v 0.343580 -0.038997 -0.055509
v 0.351666 -0.093755 -0.056800
v 0.014566 0.181528 0.010401
v 0.095052 0.165556 0.010401
v 0.169229 0.139619 0.010401
v 0.234245 0.104713 0.010401
v 0.287602 0.062181 0.010401
v 0.327251 0.013656 0.010401
v 0.351666 -0.038997 0.010401
v 0.359910 -0.093755 0.010401
v 0.012957 0.181528 0.023511
v 0.091897 0.165556 0.036118
v 0.164648 0.139619 0.047736
v 0.228415 0.104713 0.057919
v 0.280748 0.062181 0.066276
v 0.319634 0.013656 0.072486
v 0.343580 -0.038997 0.076311
v 0.351666 -0.093755 0.077602
v 0.008194 0.181528 0.036118
v 0.082554 0.165556 0.060846
v 0.151084 0.139619 0.083636
v 0.211151 0.104713 0.103611
v 0.260447 0.062181 0.120005
v 0.297077 0.013656 0.132186
v 0.319634 -0.038997 0.139687
v 0.327250 -0.093755 0.142220
v 0.000459 0.181528 0.047736
v 0.067381 0.165556 0.083636
v 0.129057 0.139619 0.116721
v 0.183116 0.104713 0.145721
v 0.227481 0.062181 0.169521
v 0.260447 0.013656 0.187206
v 0.280748 -0.038997 0.198096
v 0.287602 -0.093755 0.201773
v -0.009950 0.181528 0.057919
v 0.046962 0.165556 0.103611
v 0.099413 0.139619 0.145721
v 0.145386 0.104713 0.182631
v 0.183116 0.062181 0.212922
v 0.211151 0.013656 0.235431
v 0.228415 -0.038997 0.249291
v 0.234245 -0.093755 0.253971
v -0.022634 0.181528 0.066276
v 0.022081 0.165556 0.120005
v 0.063292 0.139619 0.169521
v 0.099413 0.104713 0.212922
v 0.129057 0.062181 0.248541
v 0.151084 0.013656 0.275008
v 0.164648 -0.038997 0.291306
v 0.169228 -0.093755 0.296809
v -0.037106 0.181528 0.072486
v -0.006305 0.165556 0.132186
v 0.022081 0.139619 0.187205
v 0.046962 0.104713 0.235431
v 0.067381 0.062181 0.275008
v 0.082554 0.013656 0.304416
v 0.091897 -0.038997 0.322526
v 0.095052 -0.093755 0.328641
v -0.052808 0.181528 0.076311
v -0.037106 0.165556 0.139687
v -0.022634 0.139619 0.198096
v -0.009950 0.104713 0.249291
v 0.000459 0.062181 0.291306
v 0.008194 0.013656 0.322526
v 0.012957 -0.038997 0.341751
v 0.014566 -0.093755 0.348242
v -0.069137 0.181528 0.077602
v -0.069137 0.165556 0.142220
v -0.069137 0.139619 0.201773
v -0.069137 0.104713 0.253971
v -0.069137 0.062181 0.296809
v -0.069137 0.013656 0.328641
v -0.069137 -0.038997 0.348243
v -0.069137 -0.093755 0.354861
v -0.085467 0.181528 0.076310
v -0.101169 0.165556 0.139687
v -0.115640 0.139619 0.198096
v -0.128324 0.104713 0.249291
v -0.138734 0.062181 0.291306
v -0.146469 0.013656 0.322526
v -0.151232 -0.038997 0.341751
v -0.152840 -0.093755 0.348242
v -0.101169 0.181528 0.072486
v -0.131970 0.165556 0.132186
v -0.160356 0.139619 0.187205
v -0.185237 0.104713 0.235431
v -0.205656 0.062181 0.275008
v -0.220828 0.013656 0.304416
v -0.230172 -0.038997 0.322526
v -0.233326 -0.093755 0.328641
v -0.115640 0.181528 0.066276
v -0.160356 0.165556 0.120005
v -0.201566 0.139619 0.169521
v -0.237687 0.104713 0.212922
v -0.267331 0.062181 0.248541
v -0.289358 0.013656 0.275008
v -0.302923 -0.038997 0.291306
v -0.307503 -0.093755 0.296809
v -0.128324 0.181528 0.057919
v -0.185237 0.165556 0.103611
v -0.237687 0.139619 0.145721
v -0.283661 0.104713 0.182631
v -0.321390 0.062181 0.212922
v -0.349426 0.013656 0.235430
v -0.366690 -0.038997 0.249291
v -0.372519 -0.093755 0.253971
v -0.138734 0.181528 0.047736
v -0.205656 0.165556 0.083636
v -0.267331 0.139619 0.116721
v -0.321390 0.104713 0.145721
v -0.365755 0.062181 0.169521
v -0.398722 0.013656 0.187205
v -0.419022 -0.038997 0.198096
v -0.425877 -0.093755 0.201773
v -0.146469 0.181528 0.036118
v -0.220828 0.165556 0.060846
v -0.289358 0.139619 0.083636
v -0.349426 0.104713 0.103611
v -0.398721 0.062181 0.120005
v -0.435352 0.013656 0.132186
v -0.457908 -0.038997 0.139687
v -0.465525 -0.093755 0.142220
v -0.151232 0.181528 0.023511
v -0.230172 0.165556 0.036118
v -0.302923 0.139619 0.047736
v -0.366690 0.104713 0.057919
v -0.419022 0.062181 0.066276
v -0.457908 0.013656 0.072486
v -0.481855 -0.038997 0.076310
v -0.489940 -0.093755 0.077602
v -0.152840 0.181528 0.010401
v -0.233326 0.165556 0.010401
v -0.307503 0.139619 0.010401
v -0.372519 0.104713 0.010401
v -0.425877 0.062181 0.010401
v -0.465525 0.013656 0.010401
v -0.489940 -0.038997 0.010401
v -0.498184 -0.093755 0.010401
v -0.151232 0.181528 -0.002709
v -0.230172 0.165556 -0.015316
v -0.302923 0.139619 -0.026934
v -0.366690 0.104713 -0.037117
v -0.419022 0.062181 -0.045475
v -0.457908 0.013656 -0.051685
v -0.481854 -0.038997 -0.055509
v -0.489940 -0.093755 -0.056800
v -0.146469 0.181528 -0.015316
v -0.220828 0.165556 -0.040044
v -0.289358 0.139619 -0.062834
v -0.349426 0.104713 -0.082809
v -0.398721 0.062181 -0.099203
v -0.435352 0.013656 -0.111384
v -0.457908 -0.038997 -0.118886
v -0.465525 -0.093755 -0.121418
v -0.138734 0.181528 -0.026934
v -0.205655 0.165556 -0.062834
v -0.267331 0.139619 -0.095920
v -0.321390 0.104713 -0.124920
v -0.365755 0.062181 -0.148719
v -0.398721 0.013656 -0.166404
v -0.419022 -0.038997 -0.177294
v -0.425876 -0.093755 -0.180971
v -0.128324 0.181528 -0.037117
v -0.185236 0.165556 -0.082809
v -0.237687 0.139619 -0.124919
v -0.283661 0.104713 -0.161829
v -0.321390 0.062181 -0.192120
v -0.349426 0.013656 -0.214629
v -0.366690 -0.038997 -0.228489
v -0.372519 -0.093755 -0.233169
v -0.115640 0.181528 -0.045475
v -0.160356 0.165556 -0.099203
v -0.201566 0.139619 -0.148719
v -0.237687 0.104713 -0.192120
v -0.267331 0.062181 -0.227739
v -0.289358 0.013656 -0.254206
v -0.302923 -0.038997 -0.270504
v -0.307503 -0.093755 -0.276007
v -0.101169 0.181528 -0.051685
v -0.131970 0.165556 -0.111384
v -0.160356 0.139619 -0.166404
v -0.185236 0.104713 -0.214629
v -0.205655 0.062181 -0.254206
v -0.220828 0.013656 -0.283614
v -0.230171 -0.038997 -0.301724
v -0.233326 -0.093755 -0.307839
v -0.085467 0.181528 -0.055509
v -0.101169 0.165556 -0.118886
v -0.115640 0.139619 -0.177294
v -0.128324 0.104713 -0.228489
v -0.138734 0.062181 -0.270504
v -0.146469 0.013656 -0.301724
v -0.151232 -0.038997 -0.320949
v -0.152840 -0.093755 -0.327441
v -0.069137 0.181528 -0.056800
v -0.069137 0.165556 -0.121418
v -0.069137 0.104713 -0.233169
v -0.069137 0.013656 -0.307839
vt 0.996255 0.538588
vt 1.000000 0.480658
vt 1.000000 0.520378
vt 0.902075 0.791587
vt 0.026776 0.651760
vt 0.981367 0.379827
vt 0.750000 0.625000
vt 0.718750 0.562500
vt 0.750000 0.562500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.718750 0.875000
vt 0.750000 0.875000
vt 0.718750 0.812500
vt 0.750000 0.812500
vt 0.718750 0.750000
vt 0.750000 0.750000
vt 0.718750 0.687500
vt 0.750000 0.687500
vt 0.718750 0.625000
vt 0.687500 0.625000
vt 0.687500 0.562500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.687500 0.875000
vt 0.687500 0.812500
vt 0.687500 0.750000
vt 0.687500 0.687500
vt 0.656250 0.875000
vt 0.656250 0.812500
vt 0.656250 0.750000
vt 0.656250 0.687500
vt 0.656250 0.625000
vt 0.656250 0.562500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.625000 0.625000
vt 0.625000 0.562500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.625000 0.875000
vt 0.625000 0.812500
vt 0.625000 0.750000
vt 0.625000 0.687500
vt 0.593750 0.812500
vt 0.593750 0.750000
vt 0.593750 0.687500
vt 0.593750 0.625000
vt 0.593750 0.562500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.593750 0.875000
vt 0.562500 0.562500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.562500 0.875000
vt 0.562500 0.812500
vt 0.562500 0.750000
vt 0.562500 0.687500
vt 0.562500 0.625000
vt 0.531250 0.750000
vt 0.531250 0.687500
vt 0.531250 0.625000
vt 0.531250 0.562500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.531250 0.875000
vt 0.531250 0.812500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.500000 0.875000
vt 0.500000 0.812500
vt 0.500000 0.750000
vt 0.500000 0.687500
vt 0.500000 0.625000
vt 0.500000 0.562500
vt 0.468750 0.812500
vt 0.468750 0.750000
vt 0.468750 0.687500
vt 0.468750 0.625000
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.468750 0.875000
vt 0.437500 0.562500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.437500 0.875000
vt 0.437500 0.812500
vt 0.437500 0.750000
vt 0.437500 0.687500
vt 0.437500 0.625000
vt 0.406250 0.750000
vt 0.406250 0.687500
vt 0.406250 0.625000
vt 0.406250 0.562500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.406250 0.875000
vt 0.406250 0.812500
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.375000 0.812500
vt 0.375000 0.750000
vt 0.375000 0.687500
vt 0.375000 0.625000
vt 0.375000 0.562500
vt 0.375000 0.500000
vt 0.343750 0.750000
vt 0.343750 0.687500
vt 0.343750 0.625000
vt 0.343750 0.562500
vt 0.343750 0.500000
vt 0.359375 1.000000
vt 0.343750 0.937500
vt 0.343750 0.875000
vt 0.343750 0.812500
vt 0.312500 0.937500
vt 0.312500 0.875000
vt 0.312500 0.812500
vt 0.312500 0.750000
vt 0.312500 0.687500
vt 0.312500 0.625000
vt 0.312500 0.562500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.281250 0.687500
vt 0.281250 0.625000
vt 0.281250 0.562500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.281250 0.875000
vt 0.281250 0.812500
vt 0.281250 0.750000
vt 0.250000 0.875000
vt 0.250000 0.812500
vt 0.250000 0.750000
vt 0.250000 0.687500
vt 0.250000 0.625000
vt 0.250000 0.562500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.218750 0.562500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.218750 0.875000
vt 0.218750 0.812500
vt 0.218750 0.750000
vt 0.218750 0.687500
vt 0.218750 0.625000
vt 0.187500 0.812500
vt 0.187500 0.750000
vt 0.187500 0.687500
vt 0.187500 0.625000
vt 0.187500 0.562500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.187500 0.875000
vt 0.156250 0.562500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.156250 0.875000
vt 0.156250 0.812500
vt 0.156250 0.750000
vt 0.156250 0.687500
vt 0.156250 0.625000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.125000 0.562500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.125000 0.875000
vt 0.125000 0.812500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.093750 0.875000
vt 0.093750 0.812500
vt 0.093750 0.750000
vt 0.093750 0.687500
vt 0.093750 0.625000
vt 0.093750 0.562500
vt 0.062500 0.750000
vt 0.062500 0.687500
vt 0.062500 0.625000
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.062500 0.875000
vt 0.062500 0.812500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.031250 0.875000
vt 0.031250 0.812500
vt 0.031250 0.750000
vt 0.031250 0.687500
vt 0.031250 0.625000
vt 0.031250 0.562500
vt 0.000000 0.687500
vt 0.000000 0.625000
vt 0.000000 0.562500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.000000 0.875000
vt 0.000000 0.812500
vt 0.000000 0.750000
vt 1.000000 0.875000
vt 0.968750 0.937500
vt 0.968750 0.875000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 1.000000 0.750000
vt 0.968750 0.750000
vt 1.000000 0.687500
vt 0.968750 0.687500
vt 1.000000 0.625000
vt 0.968750 0.625000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.937500 0.687500
vt 0.937500 0.625000
vt 0.937500 0.562500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.937500 0.875000
vt 0.937500 0.812500
vt 0.937500 0.750000
vt 0.906250 0.937500
vt 0.906250 0.875000
vt 0.906250 0.812500
vt 0.906250 0.750000
vt 0.906250 0.687500
vt 0.906250 0.625000
vt 0.906250 0.562500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.875000 0.687500
vt 0.875000 0.625000
vt 0.875000 0.562500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.875000 0.875000
vt 0.875000 0.812500
vt 0.875000 0.750000
vt 0.843750 0.875000
vt 0.843750 0.812500
vt 0.843750 0.750000
vt 0.843750 0.687500
vt 0.843750 0.625000
vt 0.843750 0.562500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.812500 0.625000
vt 0.812500 0.562500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.812500 0.875000
vt 0.812500 0.812500
vt 0.812500 0.750000
vt 0.812500 0.687500
vt 0.781250 0.875000
vt 0.781250 0.812500
vt 0.781250 0.750000
vt 0.781250 0.687500
vt 0.781250 0.625000
vt 0.781250 0.562500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.765625 1.000000
vt 0.000000 0.514963
vt 0.023627 0.351382
vt 0.078811 0.227751
vt 0.441038 0.000000
vt 0.154488 0.129283
vt 0.237160 0.066052
vt 0.335142 0.022138
vt 0.580700 0.000000
vt 0.760930 0.069696
vt 0.694293 0.031415
vt 0.831318 0.121584
vt 0.948207 0.280207
vt 0.900685 0.195407
vt 0.964676 0.692150
vt 0.815806 0.897934
vt 0.728991 0.947299
vt 0.591312 1.000000
vt 0.259050 0.943524
vt 0.148483 0.860355
vt 0.071381 0.764349
vn 0.0000 1.0000 0.0000
vn 0.0739 0.3479 -0.9346
vn 0.0783 0.1196 -0.9897
vn 0.0063 0.9968 -0.0800
vn 0.0190 0.9706 -0.2399
vn 0.0316 0.9164 -0.3991
vn 0.0439 0.8305 -0.5553
vn 0.0556 0.7085 -0.7035
vn 0.0660 0.5473 -0.8343
vn 0.1972 0.5525 -0.8099
vn 0.2215 0.3521 -0.9094
vn 0.2349 0.1212 -0.9644
vn 0.0187 0.9969 -0.0769
vn 0.0562 0.9714 -0.2309
vn 0.0937 0.9183 -0.3846
vn 0.1306 0.8339 -0.5362
vn 0.1658 0.7133 -0.6810
vn 0.0914 0.9727 -0.2131
vn 0.1527 0.9220 -0.3559
vn 0.2137 0.8404 -0.4980
vn 0.2727 0.7223 -0.6355
vn 0.3261 0.5624 -0.7598
vn 0.3679 0.3601 -0.8573
vn 0.3913 0.1244 -0.9118
vn 0.0304 0.9970 -0.0709
vn 0.4496 0.5763 -0.6824
vn 0.5108 0.3716 -0.7753
vn 0.5456 0.1289 -0.8281
vn 0.0410 0.9972 -0.0622
vn 0.1233 0.9746 -0.1871
vn 0.2066 0.9268 -0.3135
vn 0.2906 0.8491 -0.4411
vn 0.3733 0.7346 -0.5666
vn 0.2532 0.9322 -0.2588
vn 0.3582 0.8589 -0.3661
vn 0.4636 0.7487 -0.4739
vn 0.5633 0.5925 -0.5758
vn 0.6453 0.3852 -0.6597
vn 0.6930 0.1343 -0.7084
vn 0.0499 0.9974 -0.0510
vn 0.1505 0.9766 -0.1539
vn 0.7631 0.3993 -0.5081
vn 0.8242 0.1400 -0.5487
vn 0.0570 0.9977 -0.0379
vn 0.1721 0.9784 -0.1146
vn 0.2904 0.9372 -0.1933
vn 0.4131 0.8682 -0.2750
vn 0.5386 0.7625 -0.3586
vn 0.6603 0.6088 -0.4396
vn 0.4520 0.8755 -0.1708
vn 0.5929 0.7735 -0.2240
vn 0.7323 0.6222 -0.2767
vn 0.8527 0.4112 -0.3222
vn 0.9256 0.1449 -0.3497
vn 0.0618 0.9978 -0.0234
vn 0.1870 0.9798 -0.0706
vn 0.3164 0.9411 -0.1196
vn 0.9817 0.1478 -0.1204
vn 0.0643 0.9979 -0.0079
vn 0.1946 0.9806 -0.0239
vn 0.3298 0.9432 -0.0405
vn 0.4722 0.8796 -0.0579
vn 0.6215 0.7797 -0.0762
vn 0.7709 0.6299 -0.0946
vn 0.9016 0.4181 -0.1106
vn 0.4722 0.8796 0.0579
vn 0.6215 0.7797 0.0762
vn 0.7709 0.6299 0.0946
vn 0.9016 0.4181 0.1106
vn 0.9817 0.1478 0.1204
vn 0.0643 0.9979 0.0079
vn 0.1946 0.9806 0.0239
vn 0.3298 0.9432 0.0405
vn 0.9256 0.1449 0.3497
vn 0.0618 0.9978 0.0234
vn 0.1870 0.9798 0.0706
vn 0.3164 0.9411 0.1196
vn 0.4520 0.8755 0.1708
vn 0.5929 0.7735 0.2240
vn 0.7323 0.6222 0.2767
vn 0.8527 0.4112 0.3222
vn 0.5386 0.7625 0.3586
vn 0.6603 0.6088 0.4396
vn 0.7631 0.3993 0.5081
vn 0.8242 0.1400 0.5487
vn 0.0570 0.9977 0.0379
vn 0.1721 0.9784 0.1146
vn 0.2904 0.9372 0.1933
vn 0.4131 0.8682 0.2750
vn 0.0499 0.9974 0.0510
vn 0.1505 0.9766 0.1539
vn 0.2532 0.9322 0.2588
vn 0.3582 0.8589 0.3661
vn 0.4636 0.7487 0.4739
vn 0.5633 0.5925 0.5758
vn 0.6453 0.3852 0.6597
vn 0.6930 0.1343 0.7084
vn 0.3733 0.7346 0.5666
vn 0.4496 0.5763 0.6824
vn 0.5108 0.3716 0.7753
vn 0.5456 0.1289 0.8281
vn 0.0410 0.9972 0.0622
vn 0.1233 0.9746 0.1871
vn 0.2066 0.9268 0.3135
vn 0.2906 0.8491 0.4411
vn 0.0914 0.9727 0.2131
vn 0.1527 0.9220 0.3559
vn 0.2137 0.8404 0.4980
vn 0.2727 0.7223 0.6355
vn 0.3261 0.5624 0.7598
vn 0.3679 0.3601 0.8573
vn 0.3913 0.1244 0.9118
vn 0.0304 0.9970 0.0709
vn 0.1972 0.5525 0.8099
vn 0.2215 0.3521 0.9094
vn 0.2349 0.1212 0.9644
vn 0.0187 0.9969 0.0769
vn 0.0562 0.9714 0.2309
vn 0.0937 0.9183 0.3846
vn 0.1306 0.8339 0.5362
vn 0.1658 0.7133 0.6810
vn 0.0316 0.9164 0.3991
vn 0.0439 0.8305 0.5553
vn 0.0556 0.7085 0.7035
vn 0.0660 0.5473 0.8343
vn 0.0739 0.3479 0.9346
vn 0.0783 0.1196 0.9897
vn 0.0063 0.9968 0.0800
vn 0.0190 0.9706 0.2399
vn -0.0739 0.3479 0.9346
vn -0.0783 0.1196 0.9897
vn -0.0063 0.9968 0.0800
vn -0.0190 0.9706 0.2399
vn -0.0316 0.9164 0.3991
vn -0.0439 0.8305 0.5553
vn -0.0556 0.7085 0.7035
vn -0.0660 0.5473 0.8343
vn -0.0937 0.9183 0.3846
vn -0.1306 0.8339 0.5362
vn -0.1658 0.7133 0.6810
vn -0.1972 0.5525 0.8099
vn -0.2215 0.3521 0.9094
vn -0.2349 0.1212 0.9644
vn -0.0187 0.9969 0.0769
vn -0.0562 0.9714 0.2309
vn -0.3679 0.3601 0.8573
vn -0.3913 0.1244 0.9118
vn -0.0304 0.9970 0.0709
vn -0.0914 0.9727 0.2131
vn -0.1527 0.9220 0.3559
vn -0.2137 0.8404 0.4980
vn -0.2727 0.7223 0.6355
vn -0.3261 0.5624 0.7598
vn -0.2906 0.8491 0.4411
vn -0.3733 0.7346 0.5666
vn -0.4496 0.5763 0.6824
vn -0.5108 0.3716 0.7753
vn -0.5456 0.1289 0.8281
vn -0.0410 0.9972 0.0622
vn -0.1233 0.9746 0.1871
vn -0.2066 0.9268 0.3135
vn -0.6930 0.1343 0.7084
vn -0.0499 0.9974 0.0510
vn -0.1505 0.9766 0.1539
vn -0.2532 0.9322 0.2588
vn -0.3582 0.8589 0.3661
vn -0.4636 0.7487 0.4739
vn -0.5633 0.5925 0.5758
vn -0.6453 0.3852 0.6597
vn -0.4131 0.8682 0.2750
vn -0.5386 0.7625 0.3586
vn -0.6603 0.6088 0.4396
vn -0.7631 0.3993 0.5081
vn -0.8242 0.1400 0.5487
vn -0.0570 0.9977 0.0379
vn -0.1721 0.9784 0.1146
vn -0.2904 0.9372 0.1933
vn -0.9256 0.1449 0.3497
vn -0.0618 0.9978 0.0234
vn -0.1870 0.9798 0.0706
vn -0.3164 0.9411 0.1196
vn -0.4520 0.8755 0.1708
vn -0.5929 0.7735 0.2240
vn -0.7323 0.6222 0.2767
vn -0.8527 0.4112 0.3222
vn -0.6215 0.7797 0.0762
vn -0.7709 0.6299 0.0946
vn -0.9016 0.4181 0.1106
vn -0.9817 0.1478 0.1204
vn -0.0643 0.9979 0.0079
vn -0.1946 0.9806 0.0239
vn -0.3298 0.9432 0.0405
vn -0.4722 0.8796 0.0579
vn -0.1946 0.9806 -0.0239
vn -0.3298 0.9432 -0.0405
vn -0.4722 0.8796 -0.0579
vn -0.6215 0.7797 -0.0762
vn -0.7709 0.6299 -0.0946
vn -0.9016 0.4181 -0.1106
vn -0.9817 0.1478 -0.1204
vn -0.0643 0.9979 -0.0079
vn -0.7323 0.6222 -0.2767
vn -0.8527 0.4112 -0.3222
vn -0.9256 0.1449 -0.3497
vn -0.0618 0.9978 -0.0234
vn -0.1870 0.9798 -0.0706
vn -0.3164 0.9411 -0.1196
vn -0.4520 0.8755 -0.1708
vn -0.5929 0.7735 -0.2240
vn -0.1721 0.9784 -0.1146
vn -0.2904 0.9372 -0.1933
vn -0.4131 0.8682 -0.2750
vn -0.5386 0.7624 -0.3586
vn -0.6603 0.6088 -0.4396
vn -0.7631 0.3993 -0.5081
vn -0.8242 0.1400 -0.5487
vn -0.0570 0.9977 -0.0379
vn -0.5633 0.5925 -0.5758
vn -0.6453 0.3852 -0.6597
vn -0.6930 0.1343 -0.7084
vn -0.0499 0.9974 -0.0510
vn -0.1505 0.9766 -0.1539
vn -0.2532 0.9322 -0.2588
vn -0.3582 0.8589 -0.3661
vn -0.4636 0.7487 -0.4739
vn -0.2066 0.9268 -0.3135
vn -0.2906 0.8491 -0.4411
vn -0.3733 0.7346 -0.5666
vn -0.4496 0.5763 -0.6824
vn -0.5108 0.3716 -0.7753
vn -0.5456 0.1289 -0.8281
vn -0.0410 0.9972 -0.0622
vn -0.1233 0.9746 -0.1871
vn -0.3679 0.3601 -0.8573
vn -0.3913 0.1244 -0.9118
vn -0.0304 0.9970 -0.0709
vn -0.0914 0.9727 -0.2131
vn -0.1527 0.9220 -0.3559
vn -0.2137 0.8404 -0.4980
vn -0.2727 0.7223 -0.6355
vn -0.3261 0.5624 -0.7598
vn -0.0937 0.9183 -0.3846
vn -0.1306 0.8339 -0.5362
vn -0.1658 0.7133 -0.6810
vn -0.1972 0.5525 -0.8099
vn -0.2215 0.3521 -0.9094
vn -0.2349 0.1212 -0.9644
vn -0.0187 0.9969 -0.0769
vn -0.0562 0.9714 -0.2309
vn -0.0739 0.3479 -0.9346
vn -0.0783 0.1196 -0.9897
vn -0.0063 0.9968 -0.0800
vn -0.0190 0.9706 -0.2399
vn -0.0316 0.9164 -0.3991
vn -0.0439 0.8305 -0.5553
vn -0.0556 0.7085 -0.7035
vn -0.0660 0.5473 -0.8343
vn -0.5386 0.7624 0.3586
usemtl Material.001
s off
f 3/1/1 2/2/1 6/3/1
f 8/4/1 14/5/1 21/6/1
f 283/7/2 37/8/2 29/9/2
f 29/9/3 38/10/3 30/11/3
f 280/12/4 71/13/4 31/14/4
f 280/12/5 32/15/5 281/16/5
f 281/16/6 33/17/6 27/18/6
f 27/18/7 34/19/7 282/20/7
f 282/20/8 35/21/8 28/22/8
f 28/22/9 36/23/9 283/7/9
f 35/21/10 44/24/10 36/23/10
f 36/23/11 45/25/11 37/8/11
f 37/8/12 46/26/12 38/10/12
f 31/14/13 71/27/13 39/28/13
f 31/14/14 40/29/14 32/15/14
f 32/15/15 41/30/15 33/17/15
f 33/17/16 42/31/16 34/19/16
f 34/19/17 43/32/17 35/21/17
f 39/28/18 48/33/18 40/29/18
f 40/29/19 49/34/19 41/30/19
f 41/30/20 50/35/20 42/31/20
f 42/31/21 51/36/21 43/32/21
f 43/32/22 52/37/22 44/24/22
f 44/24/23 53/38/23 45/25/23
f 45/25/24 54/39/24 46/26/24
f 39/28/25 71/40/25 47/41/25
f 51/36/26 60/42/26 52/37/26
f 52/37/27 61/43/27 53/38/27
f 53/38/28 62/44/28 54/39/28
f 47/41/29 71/45/29 55/46/29
f 47/41/30 56/47/30 48/33/30
f 48/33/31 57/48/31 49/34/31
f 49/34/32 58/49/32 50/35/32
f 50/35/33 59/50/33 51/36/33
f 56/47/34 65/51/34 57/48/34
f 57/48/35 66/52/35 58/49/35
f 58/49/36 67/53/36 59/50/36
f 59/50/37 68/54/37 60/42/37
f 60/42/38 69/55/38 61/43/38
f 61/43/39 70/56/39 62/44/39
f 55/46/40 71/57/40 63/58/40
f 55/46/41 64/59/41 56/47/41
f 68/54/42 78/60/42 69/55/42
f 69/55/43 79/61/43 70/56/43
f 63/58/44 71/62/44 72/63/44
f 63/58/45 73/64/45 64/59/45
f 64/59/46 74/65/46 65/51/46
f 65/51/47 75/66/47 66/52/47
f 66/52/48 76/67/48 67/53/48
f 67/53/49 77/68/49 68/54/49
f 74/65/50 83/69/50 75/66/50
f 75/66/51 84/70/51 76/67/51
f 76/67/52 85/71/52 77/68/52
f 77/68/53 86/72/53 78/60/53
f 78/60/54 87/73/54 79/61/54
f 72/63/55 71/74/55 80/75/55
f 72/63/56 81/76/56 73/64/56
f 73/64/57 82/77/57 74/65/57
f 86/72/58 95/78/58 87/73/58
f 80/75/59 71/79/59 88/80/59
f 80/75/60 89/81/60 81/76/60
f 81/76/61 90/82/61 82/77/61
f 82/77/62 91/83/62 83/69/62
f 83/69/63 92/84/63 84/70/63
f 84/70/64 93/85/64 85/71/64
f 85/71/65 94/86/65 86/72/65
f 91/83/66 98/87/66 99/88/66
f 92/84/67 99/88/67 100/89/67
f 93/85/68 100/89/68 101/90/68
f 94/86/69 101/90/69 102/91/69
f 95/78/70 102/91/70 103/92/70
f 88/80/71 71/93/71 96/94/71
f 89/81/72 96/94/72 97/95/72
f 90/82/73 97/95/73 98/87/73
f 103/92/74 110/96/74 111/97/74
f 96/94/75 71/98/75 104/99/75
f 97/95/76 104/99/76 105/100/76
f 98/87/77 105/100/77 106/101/77
f 99/88/78 106/101/78 107/102/78
f 100/89/79 107/102/79 108/103/79
f 101/90/80 108/103/80 109/104/80
f 102/91/81 109/104/81 110/96/81
f 108/103/82 115/105/82 116/106/82
f 109/104/83 116/106/83 117/107/83
f 110/96/84 117/107/84 118/108/84
f 111/97/85 118/108/85 119/109/85
f 104/99/86 71/110/86 112/111/86
f 105/100/87 112/111/87 113/112/87
f 106/101/88 113/112/88 114/113/88
f 107/102/89 114/113/89 115/105/89
f 112/111/90 71/114/90 120/115/90
f 113/112/91 120/115/91 121/116/91
f 114/113/92 121/116/92 122/117/92
f 115/105/93 122/117/93 123/118/93
f 116/106/94 123/118/94 124/119/94
f 117/107/95 124/119/95 125/120/95
f 118/108/96 125/120/96 126/121/96
f 119/109/97 126/121/97 127/122/97
f 124/119/98 131/123/98 132/124/98
f 125/120/99 132/124/99 133/125/99
f 126/121/100 133/125/100 134/126/100
f 127/122/101 134/126/101 135/127/101
f 120/115/102 71/128/102 128/129/102
f 121/116/103 128/129/103 129/130/103
f 122/117/104 129/130/104 130/131/104
f 123/118/105 130/131/105 131/123/105
f 129/130/106 136/132/106 137/133/106
f 130/131/107 137/133/107 138/134/107
f 131/123/108 138/134/108 139/135/108
f 132/124/109 139/135/109 140/136/109
f 133/125/110 140/136/110 141/137/110
f 134/126/111 141/137/111 142/138/111
f 135/127/112 142/138/112 143/139/112
f 128/129/113 71/140/113 136/132/113
f 141/137/114 148/141/114 149/142/114
f 142/138/115 149/142/115 150/143/115
f 143/139/116 150/143/116 151/144/116
f 136/132/117 71/145/117 144/146/117
f 137/133/118 144/146/118 145/147/118
f 138/134/119 145/147/119 146/148/119
f 139/135/120 146/148/120 147/149/120
f 140/136/121 147/149/121 148/141/121
f 146/148/122 153/150/122 154/151/122
f 147/149/123 154/151/123 155/152/123
f 148/141/124 155/152/124 156/153/124
f 149/142/125 156/153/125 157/154/125
f 150/143/126 157/154/126 158/155/126
f 151/144/127 158/155/127 159/156/127
f 144/146/128 71/157/128 152/158/128
f 145/147/129 152/158/129 153/150/129
f 157/154/130 166/159/130 158/155/130
f 158/155/131 167/160/131 159/156/131
f 152/158/132 71/161/132 160/162/132
f 152/158/133 161/163/133 153/150/133
f 153/150/134 162/164/134 154/151/134
f 154/151/135 163/165/135 155/152/135
f 155/152/136 164/166/136 156/153/136
f 156/153/137 165/167/137 157/154/137
f 161/163/138 170/168/138 162/164/138
f 162/164/139 171/169/139 163/165/139
f 163/165/140 172/170/140 164/166/140
f 164/166/141 173/171/141 165/167/141
f 165/167/142 174/172/142 166/159/142
f 166/159/143 175/173/143 167/160/143
f 160/162/144 71/174/144 168/175/144
f 160/162/145 169/176/145 161/163/145
f 173/171/146 182/177/146 174/172/146
f 174/172/147 183/178/147 175/173/147
f 168/175/148 71/179/148 176/180/148
f 168/175/149 177/181/149 169/176/149
f 169/176/150 178/182/150 170/168/150
f 170/168/151 179/183/151 171/169/151
f 171/169/152 180/184/152 172/170/152
f 172/170/153 181/185/153 173/171/153
f 178/182/154 187/186/154 179/183/154
f 179/183/155 188/187/155 180/184/155
f 180/184/156 189/188/156 181/185/156
f 181/185/157 190/189/157 182/177/157
f 182/177/158 191/190/158 183/178/158
f 176/180/159 71/191/159 184/192/159
f 176/180/160 185/193/160 177/181/160
f 177/181/161 186/194/161 178/182/161
f 190/189/162 199/195/162 191/190/162
f 184/192/163 71/196/163 192/197/163
f 184/192/164 193/198/164 185/193/164
f 185/193/165 194/199/165 186/194/165
f 186/194/166 195/200/166 187/186/166
f 187/186/167 196/201/167 188/187/167
f 188/187/168 197/202/168 189/188/168
f 189/188/169 198/203/169 190/189/169
f 194/199/170 203/204/170 195/200/170
f 195/200/171 204/205/171 196/201/171
f 196/201/172 205/206/172 197/202/172
f 197/202/173 206/207/173 198/203/173
f 198/203/174 207/208/174 199/195/174
f 192/197/175 71/209/175 200/210/175
f 192/197/176 201/211/176 193/198/176
f 193/198/177 202/212/177 194/199/177
f 206/207/178 215/213/178 207/208/178
f 200/210/179 71/214/179 208/215/179
f 200/210/180 209/216/180 201/211/180
f 201/211/181 210/217/181 202/212/181
f 202/212/182 211/218/182 203/204/182
f 203/204/183 212/219/183 204/205/183
f 204/205/184 213/220/184 205/206/184
f 205/206/185 214/221/185 206/207/185
f 211/218/186 220/222/186 212/219/186
f 212/219/187 221/223/187 213/220/187
f 213/220/188 222/224/188 214/221/188
f 214/221/189 223/225/189 215/213/189
f 208/215/190 71/226/190 216/227/190
f 208/215/191 217/228/191 209/216/191
f 209/216/192 218/229/192 210/217/192
f 210/217/193 219/230/193 211/218/193
f 217/231/194 224/232/194 225/233/194
f 218/234/195 225/233/195 226/235/195
f 219/236/196 226/235/196 227/237/196
f 220/238/197 227/237/197 228/239/197
f 221/240/198 228/239/198 229/241/198
f 222/242/199 229/241/199 230/243/199
f 223/244/200 230/243/200 231/245/200
f 216/246/201 71/247/201 224/232/201
f 229/241/202 236/248/202 237/249/202
f 230/243/203 237/249/203 238/250/203
f 231/245/204 238/250/204 239/251/204
f 224/232/205 71/252/205 232/253/205
f 225/233/206 232/253/206 233/254/206
f 226/235/207 233/254/207 234/255/207
f 227/237/208 234/255/208 235/256/208
f 228/239/209 235/256/209 236/248/209
f 233/254/210 240/257/210 241/258/210
f 234/255/211 241/258/211 242/259/211
f 235/256/212 242/259/212 243/260/212
f 236/248/213 243/260/213 244/261/213
f 237/249/214 244/261/214 245/262/214
f 238/250/215 245/262/215 246/263/215
f 239/251/216 246/263/216 247/264/216
f 232/253/217 71/265/217 240/257/217
f 245/262/218 252/266/218 253/267/218
f 246/263/219 253/267/219 254/268/219
f 247/264/220 254/268/220 255/269/220
f 240/257/221 71/270/221 248/271/221
f 241/258/222 248/271/222 249/272/222
f 242/259/223 249/272/223 250/273/223
f 243/260/224 250/273/224 251/274/224
f 244/261/225 251/274/225 252/266/225
f 250/273/226 257/275/226 258/276/226
f 251/274/227 258/276/227 259/277/227
f 252/266/228 259/277/228 260/278/228
f 253/267/229 260/278/229 261/279/229
f 254/268/230 261/279/230 262/280/230
f 255/269/231 262/280/231 263/281/231
f 248/271/232 71/282/232 256/283/232
f 249/272/233 256/283/233 257/275/233
f 262/280/234 269/284/234 270/285/234
f 263/281/235 270/285/235 271/286/235
f 256/283/236 71/287/236 264/288/236
f 257/275/237 264/288/237 265/289/237
f 258/276/238 265/289/238 266/290/238
f 259/277/239 266/290/239 267/291/239
f 260/278/240 267/291/240 268/292/240
f 261/279/241 268/292/241 269/284/241
f 266/290/242 273/293/242 274/294/242
f 267/291/243 274/294/243 275/295/243
f 268/292/244 275/295/244 276/296/244
f 269/284/245 276/296/245 277/297/245
f 270/285/246 277/297/246 278/298/246
f 271/286/247 278/298/247 279/299/247
f 264/288/248 71/300/248 272/301/248
f 265/289/249 272/301/249 273/293/249
f 278/298/250 283/7/250 29/9/250
f 279/299/251 29/9/251 30/11/251
f 272/301/252 71/302/252 280/12/252
f 273/293/253 280/12/253 281/16/253
f 274/294/254 281/16/254 27/18/254
f 275/295/255 27/18/255 282/20/255
f 276/296/256 282/20/256 28/22/256
f 277/297/257 28/22/257 283/7/257
f 14/5/1 4/303/1 26/304/1
f 26/304/1 25/305/1 1/306/1
f 25/305/1 24/307/1 23/308/1
f 22/309/1 25/305/1 23/308/1
f 22/309/1 1/306/1 25/305/1
f 1/306/1 15/310/1 17/311/1
f 15/310/1 16/312/1 17/311/1
f 17/311/1 18/313/1 20/314/1
f 18/313/1 19/315/1 20/314/1
f 20/314/1 21/6/1 17/311/1
f 21/6/1 2/2/1 3/1/1
f 3/1/1 7/316/1 8/4/1
f 8/4/1 9/317/1 10/318/1
f 10/318/1 5/319/1 11/320/1
f 11/320/1 12/321/1 14/5/1
f 12/321/1 13/322/1 14/5/1
f 14/5/1 26/304/1 21/6/1
f 11/320/1 14/5/1 8/4/1
f 1/306/1 17/311/1 21/6/1
f 26/304/1 1/306/1 21/6/1
f 21/6/1 3/1/1 8/4/1
f 8/4/1 10/318/1 11/320/1
f 283/7/2 36/23/2 37/8/2
f 29/9/3 37/8/3 38/10/3
f 280/12/5 31/14/5 32/15/5
f 281/16/6 32/15/6 33/17/6
f 27/18/7 33/17/7 34/19/7
f 282/20/8 34/19/8 35/21/8
f 28/22/9 35/21/9 36/23/9
f 35/21/10 43/32/10 44/24/10
f 36/23/11 44/24/11 45/25/11
f 37/8/12 45/25/12 46/26/12
f 31/14/14 39/28/14 40/29/14
f 32/15/15 40/29/15 41/30/15
f 33/17/16 41/30/16 42/31/16
f 34/19/17 42/31/17 43/32/17
f 39/28/18 47/41/18 48/33/18
f 40/29/19 48/33/19 49/34/19
f 41/30/20 49/34/20 50/35/20
f 42/31/21 50/35/21 51/36/21
f 43/32/22 51/36/22 52/37/22
f 44/24/23 52/37/23 53/38/23
f 45/25/24 53/38/24 54/39/24
f 51/36/26 59/50/26 60/42/26
f 52/37/27 60/42/27 61/43/27
f 53/38/28 61/43/28 62/44/28
f 47/41/30 55/46/30 56/47/30
f 48/33/31 56/47/31 57/48/31
f 49/34/32 57/48/32 58/49/32
f 50/35/33 58/49/33 59/50/33
f 56/47/34 64/59/34 65/51/34
f 57/48/35 65/51/35 66/52/35
f 58/49/36 66/52/36 67/53/36
f 59/50/37 67/53/37 68/54/37
f 60/42/38 68/54/38 69/55/38
f 61/43/39 69/55/39 70/56/39
f 55/46/41 63/58/41 64/59/41
f 68/54/42 77/68/42 78/60/42
f 69/55/43 78/60/43 79/61/43
f 63/58/45 72/63/45 73/64/45
f 64/59/46 73/64/46 74/65/46
f 65/51/47 74/65/47 75/66/47
f 66/52/48 75/66/48 76/67/48
f 67/53/49 76/67/49 77/68/49
f 74/65/50 82/77/50 83/69/50
f 75/66/51 83/69/51 84/70/51
f 76/67/52 84/70/52 85/71/52
f 77/68/53 85/71/53 86/72/53
f 78/60/54 86/72/54 87/73/54
f 72/63/56 80/75/56 81/76/56
f 73/64/57 81/76/57 82/77/57
f 86/72/58 94/86/58 95/78/58
f 80/75/60 88/80/60 89/81/60
f 81/76/61 89/81/61 90/82/61
f 82/77/62 90/82/62 91/83/62
f 83/69/63 91/83/63 92/84/63
f 84/70/64 92/84/64 93/85/64
f 85/71/65 93/85/65 94/86/65
f 91/83/66 90/82/66 98/87/66
f 92/84/67 91/83/67 99/88/67
f 93/85/68 92/84/68 100/89/68
f 94/86/69 93/85/69 101/90/69
f 95/78/70 94/86/70 102/91/70
f 89/81/72 88/80/72 96/94/72
f 90/82/73 89/81/73 97/95/73
f 103/92/74 102/91/74 110/96/74
f 97/95/76 96/94/76 104/99/76
f 98/87/77 97/95/77 105/100/77
f 99/88/78 98/87/78 106/101/78
f 100/89/79 99/88/79 107/102/79
f 101/90/80 100/89/80 108/103/80
f 102/91/81 101/90/81 109/104/81
f 108/103/82 107/102/82 115/105/82
f 109/104/83 108/103/83 116/106/83
f 110/96/84 109/104/84 117/107/84
f 111/97/85 110/96/85 118/108/85
f 105/100/87 104/99/87 112/111/87
f 106/101/88 105/100/88 113/112/88
f 107/102/89 106/101/89 114/113/89
f 113/112/91 112/111/91 120/115/91
f 114/113/92 113/112/92 121/116/92
f 115/105/93 114/113/93 122/117/93
f 116/106/94 115/105/94 123/118/94
f 117/107/95 116/106/95 124/119/95
f 118/108/96 117/107/96 125/120/96
f 119/109/97 118/108/97 126/121/97
f 124/119/98 123/118/98 131/123/98
f 125/120/99 124/119/99 132/124/99
f 126/121/100 125/120/100 133/125/100
f 127/122/101 126/121/101 134/126/101
f 121/116/103 120/115/103 128/129/103
f 122/117/104 121/116/104 129/130/104
f 123/118/105 122/117/105 130/131/105
f 129/130/106 128/129/106 136/132/106
f 130/131/107 129/130/107 137/133/107
f 131/123/108 130/131/108 138/134/108
f 132/124/109 131/123/109 139/135/109
f 133/125/110 132/124/110 140/136/110
f 134/126/111 133/125/111 141/137/111
f 135/127/112 134/126/112 142/138/112
f 141/137/114 140/136/114 148/141/114
f 142/138/115 141/137/115 149/142/115
f 143/139/116 142/138/116 150/143/116
f 137/133/118 136/132/118 144/146/118
f 138/134/119 137/133/119 145/147/119
f 139/135/120 138/134/120 146/148/120
f 140/136/121 139/135/121 147/149/121
f 146/148/122 145/147/122 153/150/122
f 147/149/123 146/148/123 154/151/123
f 148/141/124 147/149/124 155/152/124
f 149/142/125 148/141/125 156/153/125
f 150/143/126 149/142/126 157/154/126
f 151/144/127 150/143/127 158/155/127
f 145/147/129 144/146/129 152/158/129
f 157/154/130 165/167/130 166/159/130
f 158/155/131 166/159/131 167/160/131
f 152/158/133 160/162/133 161/163/133
f 153/150/134 161/163/134 162/164/134
f 154/151/135 162/164/135 163/165/135
f 155/152/136 163/165/136 164/166/136
f 156/153/137 164/166/137 165/167/137
f 161/163/138 169/176/138 170/168/138
f 162/164/139 170/168/139 171/169/139
f 163/165/140 171/169/140 172/170/140
f 164/166/141 172/170/141 173/171/141
f 165/167/142 173/171/142 174/172/142
f 166/159/143 174/172/143 175/173/143
f 160/162/145 168/175/145 169/176/145
f 173/171/146 181/185/146 182/177/146
f 174/172/147 182/177/147 183/178/147
f 168/175/149 176/180/149 177/181/149
f 169/176/150 177/181/150 178/182/150
f 170/168/151 178/182/151 179/183/151
f 171/169/152 179/183/152 180/184/152
f 172/170/153 180/184/153 181/185/153
f 178/182/154 186/194/154 187/186/154
f 179/183/155 187/186/155 188/187/155
f 180/184/156 188/187/156 189/188/156
f 181/185/157 189/188/157 190/189/157
f 182/177/158 190/189/158 191/190/158
f 176/180/160 184/192/160 185/193/160
f 177/181/161 185/193/161 186/194/161
f 190/189/162 198/203/162 199/195/162
f 184/192/164 192/197/164 193/198/164
f 185/193/165 193/198/165 194/199/165
f 186/194/166 194/199/166 195/200/166
f 187/186/167 195/200/167 196/201/167
f 188/187/168 196/201/168 197/202/168
f 189/188/169 197/202/169 198/203/169
f 194/199/170 202/212/170 203/204/170
f 195/200/258 203/204/258 204/205/258
f 196/201/172 204/205/172 205/206/172
f 197/202/173 205/206/173 206/207/173
f 198/203/174 206/207/174 207/208/174
f 192/197/176 200/210/176 201/211/176
f 193/198/177 201/211/177 202/212/177
f 206/207/178 214/221/178 215/213/178
f 200/210/180 208/215/180 209/216/180
f 201/211/181 209/216/181 210/217/181
f 202/212/182 210/217/182 211/218/182
f 203/204/183 211/218/183 212/219/183
f 204/205/184 212/219/184 213/220/184
f 205/206/185 213/220/185 214/221/185
f 211/218/186 219/230/186 220/222/186
f 212/219/187 220/222/187 221/223/187
f 213/220/188 221/223/188 222/224/188
f 214/221/189 222/224/189 223/225/189
f 208/215/191 216/227/191 217/228/191
f 209/216/192 217/228/192 218/229/192
f 210/217/193 218/229/193 219/230/193
f 217/231/194 216/246/194 224/232/194
f 218/234/195 217/231/195 225/233/195
f 219/236/196 218/234/196 226/235/196
f 220/238/197 219/236/197 227/237/197
f 221/240/198 220/238/198 228/239/198
f 222/242/199 221/240/199 229/241/199
f 223/244/200 222/242/200 230/243/200
f 229/241/202 228/239/202 236/248/202
f 230/243/203 229/241/203 237/249/203
f 231/245/204 230/243/204 238/250/204
f 225/233/206 224/232/206 232/253/206
f 226/235/207 225/233/207 233/254/207
f 227/237/208 226/235/208 234/255/208
f 228/239/209 227/237/209 235/256/209
f 233/254/210 232/253/210 240/257/210
f 234/255/211 233/254/211 241/258/211
f 235/256/212 234/255/212 242/259/212
f 236/248/213 235/256/213 243/260/213
f 237/249/214 236/248/214 244/261/214
f 238/250/215 237/249/215 245/262/215
f 239/251/216 238/250/216 246/263/216
f 245/262/218 244/261/218 252/266/218
f 246/263/219 245/262/219 253/267/219
f 247/264/220 246/263/220 254/268/220
f 241/258/222 240/257/222 248/271/222
f 242/259/223 241/258/223 249/272/223
f 243/260/224 242/259/224 250/273/224
f 244/261/225 243/260/225 251/274/225
f 250/273/226 249/272/226 257/275/226
f 251/274/227 250/273/227 258/276/227
f 252/266/228 251/274/228 259/277/228
f 253/267/229 252/266/229 260/278/229
f 254/268/230 253/267/230 261/279/230
f 255/269/231 254/268/231 262/280/231
f 249/272/233 248/271/233 256/283/233
f 262/280/234 261/279/234 269/284/234
f 263/281/235 262/280/235 270/285/235
f 257/275/237 256/283/237 264/288/237
f 258/276/238 257/275/238 265/289/238
f 259/277/239 258/276/239 266/290/239
f 260/278/240 259/277/240 267/291/240
f 261/279/241 260/278/241 268/292/241
f 266/290/242 265/289/242 273/293/242
f 267/291/243 266/290/243 274/294/243
f 268/292/244 267/291/244 275/295/244
f 269/284/245 268/292/245 276/296/245
f 270/285/246 269/284/246 277/297/246
f 271/286/247 270/285/247 278/298/247
f 265/289/249 264/288/249 272/301/249
f 278/298/250 277/297/250 283/7/250
f 279/299/251 278/298/251 29/9/251
f 273/293/253 272/301/253 280/12/253
f 274/294/254 273/293/254 281/16/254
f 275/295/255 274/294/255 27/18/255
f 276/296/256 275/295/256 282/20/256
f 277/297/257 276/296/257 28/22/257
o Cube.001_Cube.002
v 0.281191 -0.099813 0.044360
v 0.281191 -0.011160 0.044360
v 0.281191 -0.099813 -0.049764
v 0.281191 -0.011160 -0.049764
v 0.489878 -0.099813 0.044360
v 0.489878 -0.011160 0.044360
v 0.489878 -0.099813 -0.049764
v 0.489878 -0.011160 -0.049764
v 0.445160 -0.114588 -0.065451
v 0.385534 -0.114588 -0.065451
v 0.325909 -0.114588 -0.065451
v 0.325909 0.003615 -0.065451
v 0.385534 0.003615 -0.065451
v 0.445160 0.003615 -0.065451
v 0.325909 -0.114588 0.060047
v 0.385534 -0.114588 0.060047
v 0.445160 -0.114588 0.060047
v 0.445160 0.003615 0.060047
v 0.385534 0.003615 0.060047
v 0.325909 0.003615 0.060047
v 0.279016 -0.106278 -0.030624
v 0.277924 -0.109526 -0.002702
v 0.279016 -0.106278 0.025220
v 0.279016 -0.081785 0.051225
v 0.277924 -0.055486 0.054673
v 0.279016 -0.029188 0.051225
v 0.279016 -0.004694 0.025220
v 0.277924 -0.001447 -0.002702
v 0.279016 -0.004694 -0.030624
v 0.279016 -0.029188 -0.056629
v 0.277924 -0.055486 -0.060077
v 0.279016 -0.081785 -0.056629
v 0.311239 -0.114272 -0.065115
v 0.297983 -0.112057 -0.062764
v 0.287489 -0.106595 -0.056964
v 0.459830 0.003299 -0.065115
v 0.473086 0.001084 -0.062764
v 0.483580 -0.004378 -0.056964
v 0.492053 -0.029188 -0.056629
v 0.493145 -0.055486 -0.060077
v 0.492053 -0.081785 -0.056629
v 0.492053 -0.106278 0.025220
v 0.493145 -0.109526 -0.002702
v 0.492053 -0.106278 -0.030624
v 0.492053 -0.004694 -0.030624
v 0.493145 -0.001447 -0.002702
v 0.492053 -0.004694 0.025220
v 0.492053 -0.029188 0.051225
v 0.493145 -0.055486 0.054673
v 0.492053 -0.081785 0.051225
v 0.459830 -0.114272 0.059711
v 0.473086 -0.112057 0.057360
v 0.483580 -0.106595 0.051561
v 0.311239 0.003299 0.059711
v 0.297983 0.001084 0.057360
v 0.287489 -0.004378 0.051561
v 0.483580 -0.106595 -0.056964
v 0.473086 -0.112057 -0.062764
v 0.459830 -0.114272 -0.065115
v 0.430253 -0.114588 -0.065451
v 0.415347 -0.114588 -0.065451
v 0.400441 -0.114588 -0.065451
v 0.370628 -0.114588 -0.065451
v 0.355722 -0.114588 -0.065451
v 0.340816 -0.114588 -0.065451
v 0.287489 -0.004378 -0.056964
v 0.297983 0.001084 -0.062764
v 0.311239 0.003299 -0.065115
v 0.340816 0.003615 -0.065451
v 0.355722 0.003615 -0.065451
v 0.370628 0.003615 -0.065451
v 0.400441 0.003615 -0.065451
v 0.415347 0.003615 -0.065451
v 0.430253 0.003615 -0.065451
v 0.287489 -0.106595 0.051561
v 0.297983 -0.112057 0.057360
v 0.311239 -0.114272 0.059711
v 0.340816 -0.114588 0.060047
v 0.355722 -0.114588 0.060047
v 0.370628 -0.114588 0.060047
v 0.400441 -0.114588 0.060047
v 0.415347 -0.114588 0.060047
v 0.430253 -0.114588 0.060047
v 0.483580 -0.004378 0.051561
v 0.473086 0.001084 0.057360
v 0.459830 0.003299 0.059711
v 0.430253 0.003615 0.060047
v 0.415347 0.003615 0.060047
v 0.400441 0.003615 0.060047
v 0.370628 0.003615 0.060047
v 0.355722 0.003615 0.060047
v 0.340816 0.003615 0.060047
v 0.445160 0.020238 -0.037018
v 0.445160 0.025778 -0.002702
v 0.445160 0.020238 0.031614
v 0.385534 0.020238 -0.037018
v 0.385534 0.025778 -0.002702
v 0.385534 0.020238 0.031614
v 0.325909 0.020238 -0.037018
v 0.325909 0.025778 -0.002702
v 0.325909 0.020238 0.031614
v 0.325909 -0.131210 -0.037018
v 0.325909 -0.136751 -0.002702
v 0.325909 -0.131210 0.031614
v 0.385534 -0.131210 -0.037018
v 0.385534 -0.136751 -0.002702
v 0.385534 -0.131210 0.031614
v 0.445160 -0.131210 -0.037018
v 0.445160 -0.136751 -0.002702
v 0.445160 -0.131210 0.031614
v 0.445160 -0.023165 0.077695
v 0.445160 -0.055486 0.083578
v 0.445160 -0.087808 0.077695
v 0.385534 -0.023165 0.077695
v 0.385534 -0.055486 0.083578
v 0.385534 -0.087808 0.077695
v 0.325909 -0.023165 0.077695
v 0.325909 -0.055486 0.083578
v 0.325909 -0.087808 0.077695
v 0.325909 -0.023165 -0.083099
v 0.325909 -0.055486 -0.088982
v 0.325909 -0.087808 -0.083099
v 0.385534 -0.023165 -0.083099
v 0.385534 -0.055486 -0.088982
v 0.385534 -0.087808 -0.083099
v 0.445160 -0.023165 -0.083099
v 0.445160 -0.055486 -0.088982
v 0.445160 -0.087808 -0.083099
v 0.274364 -0.083514 0.027056
v 0.272788 -0.055486 0.028654
v 0.274364 -0.027458 0.027056
v 0.272788 -0.085020 -0.002702
v 0.271069 -0.055486 -0.002702
v 0.272788 -0.025953 -0.002702
v 0.274364 -0.083514 -0.032460
v 0.272788 -0.055486 -0.034058
v 0.274364 -0.027458 -0.032460
v 0.459850 -0.087633 -0.082674
v 0.459870 -0.055486 -0.088528
v 0.459850 -0.023339 -0.082674
v 0.473247 -0.086414 -0.079695
v 0.473408 -0.055486 -0.085351
v 0.473247 -0.024559 -0.079695
v 0.484182 -0.083689 -0.071743
v 0.484708 -0.055486 -0.076745
v 0.484182 -0.027284 -0.071743
v 0.496705 -0.083514 -0.032460
v 0.498281 -0.055486 -0.034058
v 0.496705 -0.027458 -0.032460
v 0.498280 -0.085020 -0.002702
v 0.500000 -0.055486 -0.002702
v 0.498280 -0.025953 -0.002702
v 0.496705 -0.083514 0.027056
v 0.498280 -0.055486 0.028654
v 0.496705 -0.027458 0.027056
v 0.311219 -0.087633 0.077270
v 0.311199 -0.055486 0.083124
v 0.311219 -0.023339 0.077270
v 0.297822 -0.086414 0.074291
v 0.297661 -0.055486 0.079947
v 0.297822 -0.024559 0.074291
v 0.286887 -0.083689 0.066339
v 0.286361 -0.055486 0.071341
v 0.286887 -0.027284 0.066339
v 0.459850 -0.130809 -0.036833
v 0.473247 -0.128004 -0.035538
v 0.484182 -0.120514 -0.032645
v 0.459870 -0.136324 -0.002702
v 0.473408 -0.133331 -0.002702
v 0.484708 -0.125225 -0.002702
v 0.459850 -0.130809 0.031429
v 0.473247 -0.128004 0.030134
v 0.484182 -0.120514 0.027241
v 0.311219 0.019837 -0.036833
v 0.297822 0.017031 -0.035538
v 0.286887 0.009541 -0.032645
v 0.311199 0.025351 -0.002702
v 0.297661 0.022358 -0.002702
v 0.286361 0.014252 -0.002702
v 0.311219 0.019837 0.031429
v 0.297822 0.017031 0.030134
v 0.286887 0.009541 0.027241
v 0.484182 0.009541 -0.032645
v 0.473247 0.017031 -0.035538
v 0.459850 0.019837 -0.036833
v 0.484708 0.014252 -0.002702
v 0.473408 0.022358 -0.002702
v 0.459870 0.025351 -0.002702
v 0.484182 0.009541 0.027241
v 0.473247 0.017031 0.030134
v 0.459850 0.019837 0.031429
v 0.430253 0.020238 -0.037018
v 0.415347 0.020238 -0.037018
v 0.400441 0.020238 -0.037018
v 0.430253 0.025778 -0.002702
v 0.415347 0.025778 -0.002702
v 0.400441 0.025778 -0.002702
v 0.430253 0.020238 0.031614
v 0.415347 0.020238 0.031614
v 0.400441 0.020238 0.031614
v 0.370628 0.020238 -0.037018
v 0.355722 0.020238 -0.037018
v 0.340816 0.020238 -0.037018
v 0.370628 0.025778 -0.002702
v 0.355722 0.025778 -0.002702
v 0.340816 0.025778 -0.002702
v 0.370628 0.020238 0.031614
v 0.355722 0.020238 0.031614
v 0.340816 0.020238 0.031614
v 0.286887 -0.120514 -0.032645
v 0.297822 -0.128004 -0.035538
v 0.311219 -0.130809 -0.036833
v 0.286361 -0.125225 -0.002702
v 0.297661 -0.133331 -0.002702
v 0.311199 -0.136324 -0.002702
v 0.286887 -0.120514 0.027241
v 0.297822 -0.128004 0.030134
v 0.311219 -0.130809 0.031429
v 0.340816 -0.131210 -0.037018
v 0.355722 -0.131210 -0.037018
v 0.370628 -0.131210 -0.037018
v 0.340816 -0.136751 -0.002702
v 0.355722 -0.136751 -0.002702
v 0.370628 -0.136751 -0.002702
v 0.340816 -0.131210 0.031614
v 0.355722 -0.131210 0.031614
v 0.370628 -0.131210 0.031614
v 0.400441 -0.131210 -0.037018
v 0.415347 -0.131210 -0.037018
v 0.430253 -0.131210 -0.037018
v 0.400441 -0.136751 -0.002702
v 0.415347 -0.136751 -0.002702
v 0.430253 -0.136751 -0.002702
v 0.400441 -0.131210 0.031614
v 0.415347 -0.131210 0.031614
v 0.430253 -0.131210 0.031614
v 0.484182 -0.083689 0.066339
v 0.484708 -0.055486 0.071341
v 0.484182 -0.027284 0.066339
v 0.473247 -0.086414 0.074291
v 0.473408 -0.055486 0.079947
v 0.473247 -0.024559 0.074291
v 0.459850 -0.087633 0.077270
v 0.459870 -0.055486 0.083124
v 0.459850 -0.023339 0.077270
v 0.430253 -0.087808 0.077695
v 0.430253 -0.055486 0.083578
v 0.430253 -0.023165 0.077695
v 0.415347 -0.087808 0.077695
v 0.415347 -0.055486 0.083578
v 0.415347 -0.023165 0.077695
v 0.400441 -0.087808 0.077695
v 0.400441 -0.055486 0.083578
v 0.400441 -0.023165 0.077695
v 0.370628 -0.087808 0.077695
v 0.370628 -0.055486 0.083578
v 0.370628 -0.023165 0.077695
v 0.355722 -0.087808 0.077695
v 0.355722 -0.055486 0.083578
v 0.355722 -0.023165 0.077695
v 0.340816 -0.087808 0.077695
v 0.340816 -0.055486 0.083578
v 0.340816 -0.023165 0.077695
v 0.286887 -0.083689 -0.071743
v 0.286361 -0.055486 -0.076745
v 0.286887 -0.027284 -0.071743
v 0.297822 -0.086414 -0.079695
v 0.297661 -0.055486 -0.085351
v 0.297822 -0.024559 -0.079695
v 0.311219 -0.087633 -0.082674
v 0.311199 -0.055486 -0.088528
v 0.311219 -0.023339 -0.082674
v 0.340816 -0.087808 -0.083099
v 0.340816 -0.055486 -0.088982
v 0.340816 -0.023165 -0.083099
v 0.355722 -0.087808 -0.083099
v 0.355722 -0.055486 -0.088982
v 0.355722 -0.023165 -0.083099
v 0.370628 -0.087808 -0.083099
v 0.370628 -0.055486 -0.088982
v 0.370628 -0.023165 -0.083099
v 0.400441 -0.087808 -0.083099
v 0.400441 -0.055486 -0.088982
v 0.400441 -0.023165 -0.083099
v 0.415347 -0.087808 -0.083099
v 0.415347 -0.055486 -0.088982
v 0.415347 -0.023165 -0.083099
v 0.430253 -0.087808 -0.083099
v 0.430253 -0.055486 -0.088982
v 0.430253 -0.023165 -0.083099
vt 0.437500 0.062276
vt 0.500000 0.122396
vt 0.437500 0.123210
vt 0.562500 0.062276
vt 0.500000 0.062174
vt 0.437500 0.180725
vt 0.562500 0.180725
vt 0.500000 0.178060
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.437500 0.000000
vt 0.562500 0.000000
vt 0.500000 0.000000
vt 0.625000 0.000000
vt 0.625000 0.062500
vt 0.562500 0.123210
vt 0.625000 0.187500
vt 0.625000 0.250000
vt 0.562500 0.228516
vt 0.500000 0.223958
vt 0.437500 0.228516
vt 0.375000 0.250000
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.453451
vt 0.437500 0.470540
vt 0.437500 0.453349
vt 0.562500 0.470540
vt 0.500000 0.471354
vt 0.437663 0.491150
vt 0.562337 0.491150
vt 0.500000 0.493815
vt 0.437500 0.437500
vt 0.375000 0.453125
vt 0.375000 0.437500
vt 0.500000 0.437500
vt 0.562500 0.453349
vt 0.562500 0.437500
vt 0.625000 0.453125
vt 0.625000 0.468750
vt 0.625000 0.484375
vt 0.625000 0.500000
vt 0.560628 0.521525
vt 0.500000 0.526042
vt 0.439372 0.521525
vt 0.375000 0.500000
vt 0.375000 0.484375
vt 0.375000 0.468750
vt 0.444214 0.569214
vt 0.500000 0.625000
vt 0.446615 0.625000
vt 0.555786 0.569214
vt 0.500000 0.571615
vt 0.444214 0.680786
vt 0.555786 0.680786
vt 0.500000 0.678385
vt 0.396525 0.564372
vt 0.603475 0.564372
vt 0.553385 0.625000
vt 0.603475 0.685628
vt 0.625000 0.750000
vt 0.560628 0.728475
vt 0.500000 0.723958
vt 0.439372 0.728475
vt 0.375000 0.750000
vt 0.396525 0.685628
vt 0.401042 0.625000
vt 0.500000 0.953125
vt 0.437500 0.968750
vt 0.437500 0.953125
vt 0.562500 0.968750
vt 0.500000 0.968750
vt 0.437500 0.984375
vt 0.562500 0.984375
vt 0.500000 0.984375
vt 0.437500 0.937500
vt 0.375000 0.953125
vt 0.375000 0.937500
vt 0.500000 0.937500
vt 0.562500 0.953125
vt 0.562500 0.937500
vt 0.625000 0.953125
vt 0.625000 0.968750
vt 0.625000 0.984375
vt 0.625000 1.000000
vt 0.562500 1.000000
vt 0.500000 1.000000
vt 0.437500 1.000000
vt 0.375000 1.000000
vt 0.375000 0.984375
vt 0.375000 0.968750
vt 0.345540 0.562500
vt 0.328451 0.625000
vt 0.328349 0.562500
vt 0.366150 0.562663
vt 0.346354 0.625000
vt 0.345540 0.687500
vt 0.328349 0.687500
vt 0.366150 0.687337
vt 0.328125 0.500000
vt 0.312500 0.562500
vt 0.312500 0.500000
vt 0.343750 0.500000
vt 0.359375 0.500000
vt 0.368815 0.625000
vt 0.359375 0.750000
vt 0.343750 0.750000
vt 0.328125 0.750000
vt 0.312500 0.687500
vt 0.312500 0.750000
vt 0.312500 0.625000
vt 0.843750 0.562500
vt 0.828125 0.625000
vt 0.828125 0.562500
vt 0.859375 0.562500
vt 0.843750 0.625000
vt 0.843750 0.687500
vt 0.828125 0.687500
vt 0.859375 0.687500
vt 0.828125 0.500000
vt 0.812500 0.562500
vt 0.812500 0.500000
vt 0.843750 0.500000
vt 0.859375 0.500000
vt 0.875000 0.500000
vt 0.875000 0.625000
vt 0.859375 0.625000
vt 0.875000 0.750000
vt 0.859375 0.750000
vt 0.843750 0.750000
vt 0.828125 0.750000
vt 0.812500 0.687500
vt 0.812500 0.750000
vt 0.812500 0.625000
vt 0.633850 0.562663
vt 0.653646 0.625000
vt 0.631185 0.625000
vt 0.654460 0.562500
vt 0.671550 0.625000
vt 0.633850 0.687337
vt 0.654460 0.687500
vt 0.640625 0.500000
vt 0.656250 0.500000
vt 0.671651 0.562500
vt 0.671875 0.500000
vt 0.687500 0.562500
vt 0.687500 0.625000
vt 0.671651 0.687500
vt 0.687500 0.687500
vt 0.671875 0.750000
vt 0.656250 0.750000
vt 0.640625 0.750000
vt 0.598958 0.625000
vt 0.718750 0.562500
vt 0.703125 0.625000
vt 0.703125 0.562500
vt 0.734375 0.562500
vt 0.718750 0.625000
vt 0.703125 0.687500
vt 0.734375 0.625000
vt 0.718750 0.687500
vt 0.703125 0.500000
vt 0.687500 0.500000
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.734375 0.500000
vt 0.750000 0.562500
vt 0.750000 0.625000
vt 0.734375 0.687500
vt 0.750000 0.687500
vt 0.734375 0.750000
vt 0.718750 0.750000
vt 0.703125 0.750000
vt 0.687500 0.750000
vt 0.781250 0.562500
vt 0.765625 0.625000
vt 0.765625 0.562500
vt 0.796875 0.562500
vt 0.781250 0.625000
vt 0.765625 0.687500
vt 0.796875 0.687500
vt 0.781250 0.687500
vt 0.781250 0.500000
vt 0.765625 0.500000
vt 0.796875 0.500000
vt 0.796875 0.625000
vt 0.796875 0.750000
vt 0.781250 0.750000
vt 0.765625 0.750000
vt 0.750000 0.750000
vt 0.140625 0.562500
vt 0.156250 0.625000
vt 0.140625 0.625000
vt 0.156250 0.562500
vt 0.171875 0.625000
vt 0.140625 0.687500
vt 0.156250 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.140625 0.500000
vt 0.156250 0.500000
vt 0.171875 0.562500
vt 0.171875 0.500000
vt 0.187500 0.562500
vt 0.187500 0.625000
vt 0.171875 0.687500
vt 0.187500 0.687500
vt 0.171875 0.750000
vt 0.156250 0.750000
vt 0.140625 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.218750 0.562500
vt 0.203125 0.625000
vt 0.203125 0.562500
vt 0.234375 0.562500
vt 0.218750 0.625000
vt 0.203125 0.687500
vt 0.234375 0.625000
vt 0.218750 0.687500
vt 0.187500 0.500000
vt 0.218750 0.500000
vt 0.203125 0.500000
vt 0.250000 0.500000
vt 0.234375 0.500000
vt 0.250000 0.562500
vt 0.250000 0.625000
vt 0.234375 0.687500
vt 0.250000 0.687500
vt 0.234375 0.750000
vt 0.218750 0.750000
vt 0.203125 0.750000
vt 0.187500 0.750000
vt 0.281250 0.562500
vt 0.265625 0.625000
vt 0.265625 0.562500
vt 0.296875 0.562500
vt 0.281250 0.625000
vt 0.265625 0.687500
vt 0.296875 0.687500
vt 0.281250 0.687500
vt 0.281250 0.500000
vt 0.265625 0.500000
vt 0.296875 0.500000
vt 0.296875 0.625000
vt 0.296875 0.750000
vt 0.281250 0.750000
vt 0.265625 0.750000
vt 0.250000 0.750000
vt 0.437663 0.758850
vt 0.500000 0.778646
vt 0.437500 0.779460
vt 0.562337 0.758850
vt 0.500000 0.756185
vt 0.500000 0.796549
vt 0.437500 0.796651
vt 0.562500 0.779460
vt 0.375000 0.765625
vt 0.625000 0.765625
vt 0.625000 0.781250
vt 0.562500 0.796651
vt 0.625000 0.796875
vt 0.562500 0.812500
vt 0.500000 0.812500
vt 0.437500 0.812500
vt 0.375000 0.796875
vt 0.375000 0.812500
vt 0.375000 0.781250
vt 0.500000 0.828125
vt 0.437500 0.843750
vt 0.437500 0.828125
vt 0.562500 0.828125
vt 0.500000 0.843750
vt 0.437500 0.859375
vt 0.562500 0.843750
vt 0.500000 0.859375
vt 0.375000 0.828125
vt 0.625000 0.812500
vt 0.625000 0.828125
vt 0.625000 0.859375
vt 0.562500 0.859375
vt 0.562500 0.875000
vt 0.500000 0.875000
vt 0.437500 0.875000
vt 0.375000 0.875000
vt 0.375000 0.859375
vt 0.375000 0.843750
vt 0.500000 0.890625
vt 0.437500 0.906250
vt 0.437500 0.890625
vt 0.562500 0.890625
vt 0.500000 0.906250
vt 0.437500 0.921875
vt 0.562500 0.906250
vt 0.500000 0.921875
vt 0.375000 0.890625
vt 0.625000 0.875000
vt 0.625000 0.890625
vt 0.625000 0.906250
vt 0.562500 0.921875
vt 0.625000 0.921875
vt 0.375000 0.921875
vt 0.375000 0.906250
vt 0.437500 0.258850
vt 0.500000 0.278646
vt 0.437500 0.279460
vt 0.562500 0.258850
vt 0.500000 0.256185
vt 0.500000 0.296549
vt 0.437500 0.296651
vt 0.562500 0.279460
vt 0.375000 0.265625
vt 0.625000 0.265625
vt 0.625000 0.281250
vt 0.562500 0.296651
vt 0.625000 0.296875
vt 0.562500 0.312500
vt 0.500000 0.312500
vt 0.437500 0.312500
vt 0.375000 0.296875
vt 0.375000 0.312500
vt 0.375000 0.281250
vt 0.500000 0.328125
vt 0.437500 0.343750
vt 0.437500 0.328125
vt 0.562500 0.328125
vt 0.500000 0.343750
vt 0.437500 0.359375
vt 0.562500 0.343750
vt 0.500000 0.359375
vt 0.375000 0.328125
vt 0.625000 0.312500
vt 0.625000 0.343750
vt 0.562500 0.359375
vt 0.625000 0.359375
vt 0.562500 0.375000
vt 0.500000 0.375000
vt 0.437500 0.375000
vt 0.375000 0.375000
vt 0.375000 0.359375
vt 0.375000 0.343750
vt 0.500000 0.390625
vt 0.437500 0.406250
vt 0.437500 0.390625
vt 0.562500 0.390625
vt 0.500000 0.406250
vt 0.437500 0.421875
vt 0.562500 0.406250
vt 0.500000 0.421875
vt 0.375000 0.390625
vt 0.625000 0.375000
vt 0.625000 0.406250
vt 0.562500 0.421875
vt 0.625000 0.421875
vt 0.375000 0.421875
vt 0.375000 0.406250
vt 0.625000 0.125000
vt 0.625000 0.437500
vt 0.625000 0.937500
vt 0.875000 0.562500
vt 0.875000 0.687500
vt 0.625000 0.843750
vt 0.625000 0.328125
vt 0.625000 0.390625
vn -0.9968 -0.0580 0.0557
vn -0.9968 0.0591 0.0547
vn -0.9968 -0.0580 -0.0557
vn -0.9968 0.0591 -0.0547
vn -0.9607 -0.2109 0.1804
vn -0.9789 -0.0660 0.1932
vn -0.9789 0.0660 0.1932
vn -0.9611 0.1916 0.1987
vn -0.9769 0.2047 0.0621
vn -0.9769 0.2047 -0.0621
vn -0.9611 0.1916 -0.1987
vn -0.9789 0.0660 -0.1932
vn -0.9789 -0.0660 -0.1932
vn -0.9607 -0.2109 -0.1804
vn -0.9769 -0.2047 -0.0620
vn -0.9769 -0.2047 0.0620
vn 0.2288 -0.1745 -0.9577
vn 0.2249 0.1764 -0.9583
vn 0.6054 -0.1462 -0.7824
vn 0.5990 0.1507 -0.7865
vn 0.0310 -0.5500 -0.8346
vn 0.0306 -0.1790 -0.9834
vn 0.0303 0.1791 -0.9834
vn 0.0307 0.5501 -0.8345
vn 0.2294 0.5373 -0.8116
vn 0.5964 0.4462 -0.6672
vn 0.8686 0.2643 -0.4191
vn 0.8881 0.0963 -0.4495
vn 0.8878 -0.0959 -0.4502
vn 0.8408 -0.3089 -0.4445
vn 0.5992 -0.4437 -0.6664
vn 0.2334 -0.5353 -0.8118
vn 0.9968 -0.0580 -0.0557
vn 0.9968 0.0591 -0.0547
vn 0.9968 -0.0580 0.0557
vn 0.9968 0.0591 0.0547
vn 0.9607 -0.2109 -0.1804
vn 0.9789 -0.0660 -0.1932
vn 0.9789 0.0660 -0.1932
vn 0.9611 0.1916 -0.1987
vn 0.9769 0.2047 -0.0621
vn 0.9769 0.2047 0.0621
vn 0.9611 0.1916 0.1987
vn 0.9789 0.0660 0.1932
vn 0.9789 -0.0660 0.1932
vn 0.9607 -0.2109 0.1804
vn 0.9769 -0.2047 0.0620
vn 0.9769 -0.2047 -0.0620
vn -0.2288 -0.1745 0.9577
vn -0.2249 0.1764 0.9583
vn -0.6054 -0.1462 0.7824
vn -0.5990 0.1507 0.7865
vn -0.0310 -0.5500 0.8346
vn -0.0306 -0.1790 0.9834
vn -0.0303 0.1791 0.9834
vn -0.0307 0.5501 0.8345
vn -0.2294 0.5373 0.8116
vn -0.5964 0.4462 0.6672
vn -0.8686 0.2643 0.4191
vn -0.8881 0.0963 0.4495
vn -0.8878 -0.0959 0.4502
vn -0.8408 -0.3089 0.4445
vn -0.5992 -0.4437 0.6664
vn -0.2334 -0.5353 0.8118
vn 0.2169 -0.9637 -0.1558
vn 0.5838 -0.8010 -0.1328
vn 0.2169 -0.9637 0.1558
vn 0.5838 -0.8010 0.1328
vn 0.0302 -0.8629 -0.5045
vn 0.2276 -0.8405 -0.4916
vn 0.5892 -0.6957 -0.4110
vn 0.8348 -0.4686 -0.2889
vn 0.8773 -0.4715 -0.0896
vn 0.8773 -0.4715 0.0896
vn 0.8348 -0.4686 0.2889
vn 0.5892 -0.6957 0.4110
vn 0.2276 -0.8405 0.4916
vn 0.0302 -0.8629 0.5045
vn 0.0289 -0.9868 0.1593
vn 0.0289 -0.9868 -0.1593
vn -0.2169 0.9637 -0.1558
vn -0.5838 0.8010 -0.1328
vn -0.2169 0.9637 0.1558
vn -0.5838 0.8010 0.1328
vn -0.0302 0.8629 -0.5045
vn -0.2276 0.8405 -0.4916
vn -0.5892 0.6957 -0.4110
vn -0.8348 0.4686 -0.2889
vn -0.8773 0.4715 -0.0896
vn -0.8773 0.4715 0.0896
vn -0.8348 0.4686 0.2889
vn -0.5892 0.6957 0.4110
vn -0.2276 0.8405 0.4916
vn -0.0302 0.8629 0.5045
vn -0.0289 0.9868 0.1593
vn -0.0289 0.9868 -0.1593
vn 0.5774 0.8049 -0.1368
vn 0.2132 0.9642 -0.1575
vn 0.5774 0.8049 0.1368
vn 0.2132 0.9642 0.1575
vn 0.8624 0.4417 -0.2472
vn 0.5865 0.6966 -0.4132
vn 0.2237 0.8404 -0.4936
vn 0.0299 0.8629 -0.5046
vn 0.0287 0.9868 -0.1594
vn 0.0287 0.9868 0.1594
vn 0.0299 0.8629 0.5046
vn 0.2237 0.8404 0.4936
vn 0.5865 0.6966 0.4132
vn 0.8624 0.4417 0.2472
vn 0.8770 0.4722 0.0892
vn 0.8770 0.4722 -0.0892
vn 0.0000 0.9872 -0.1594
vn 0.0000 0.9872 0.1594
vn 0.0000 0.8633 -0.5047
vn 0.0000 0.8633 0.5047
vn -0.5774 -0.8049 -0.1368
vn -0.2132 -0.9642 -0.1575
vn -0.5774 -0.8049 0.1368
vn -0.2132 -0.9642 0.1575
vn -0.8624 -0.4417 -0.2472
vn -0.5865 -0.6966 -0.4132
vn -0.2237 -0.8404 -0.4936
vn -0.0299 -0.8629 -0.5046
vn -0.0287 -0.9868 -0.1594
vn -0.0287 -0.9868 0.1594
vn -0.0299 -0.8629 0.5046
vn -0.2237 -0.8404 0.4936
vn -0.5865 -0.6966 0.4132
vn -0.8624 -0.4417 0.2472
vn -0.8770 -0.4722 0.0892
vn -0.8770 -0.4722 -0.0892
vn 0.0000 -0.9872 -0.1594
vn 0.0000 -0.9872 0.1594
vn 0.0000 -0.8633 -0.5047
vn 0.0000 -0.8633 0.5047
vn 0.6054 -0.1462 0.7824
vn 0.5990 0.1507 0.7865
vn 0.2288 -0.1746 0.9577
vn 0.2249 0.1764 0.9583
vn 0.8408 -0.3089 0.4445
vn 0.8878 -0.0959 0.4502
vn 0.8881 0.0963 0.4495
vn 0.8686 0.2643 0.4191
vn 0.5964 0.4462 0.6672
vn 0.2294 0.5373 0.8116
vn 0.0307 0.5501 0.8345
vn 0.0303 0.1791 0.9834
vn 0.0306 -0.1790 0.9834
vn 0.0310 -0.5500 0.8346
vn 0.2334 -0.5353 0.8118
vn 0.5992 -0.4437 0.6664
vn 0.0000 -0.1791 0.9838
vn 0.0000 0.1791 0.9838
vn 0.0000 -0.5503 0.8350
vn 0.0000 0.5503 0.8350
vn -0.6054 -0.1462 -0.7824
vn -0.5990 0.1507 -0.7865
vn -0.2288 -0.1746 -0.9577
vn -0.2249 0.1764 -0.9583
vn -0.8408 -0.3089 -0.4445
vn -0.8878 -0.0959 -0.4502
vn -0.8881 0.0963 -0.4495
vn -0.8686 0.2643 -0.4191
vn -0.5964 0.4462 -0.6672
vn -0.2294 0.5373 -0.8116
vn -0.0307 0.5501 -0.8345
vn -0.0303 0.1791 -0.9834
vn -0.0306 -0.1790 -0.9834
vn -0.0310 -0.5500 -0.8346
vn -0.2334 -0.5353 -0.8118
vn -0.5992 -0.4437 -0.6664
vn 0.0000 -0.1791 -0.9838
vn 0.0000 0.1791 -0.9838
vn 0.0000 -0.5503 -0.8350
vn 0.0000 0.5503 -0.8350
vn -0.9968 -0.0591 0.0547
vn -0.9968 0.0580 0.0557
vn -0.9968 -0.0591 -0.0547
vn -0.9968 0.0580 -0.0557
vn -0.9611 -0.1916 0.1987
vn -0.9607 0.2109 0.1804
vn -0.9769 0.2047 0.0620
vn -0.9769 0.2047 -0.0620
vn -0.9607 0.2109 -0.1804
vn -0.9611 -0.1916 -0.1987
vn -0.9769 -0.2047 -0.0621
vn -0.9769 -0.2047 0.0621
vn 0.2249 -0.1764 -0.9583
vn 0.2288 0.1746 -0.9577
vn 0.5990 -0.1507 -0.7865
vn 0.6054 0.1462 -0.7824
vn 0.0307 -0.5501 -0.8345
vn 0.0303 -0.1791 -0.9834
vn 0.0306 0.1790 -0.9834
vn 0.0310 0.5500 -0.8346
vn 0.2334 0.5353 -0.8118
vn 0.5992 0.4437 -0.6664
vn 0.8408 0.3089 -0.4445
vn 0.8878 0.0959 -0.4502
vn 0.8881 -0.0963 -0.4495
vn 0.8686 -0.2643 -0.4191
vn 0.5964 -0.4462 -0.6672
vn 0.2294 -0.5373 -0.8116
vn 0.9968 -0.0591 -0.0547
vn 0.9968 0.0580 -0.0557
vn 0.9968 -0.0591 0.0547
vn 0.9968 0.0580 0.0557
vn 0.9611 -0.1916 -0.1987
vn 0.9607 0.2109 -0.1804
vn 0.9769 0.2047 -0.0620
vn 0.9769 0.2047 0.0620
vn 0.9607 0.2109 0.1804
vn 0.9611 -0.1916 0.1987
vn 0.9769 -0.2047 0.0621
vn 0.9769 -0.2047 -0.0621
vn -0.2249 -0.1764 0.9583
vn -0.2288 0.1745 0.9577
vn -0.5990 -0.1507 0.7865
vn -0.6054 0.1462 0.7824
vn -0.0307 -0.5501 0.8345
vn -0.0303 -0.1791 0.9834
vn -0.0306 0.1790 0.9834
vn -0.0310 0.5500 0.8346
vn -0.2334 0.5353 0.8118
vn -0.5992 0.4437 0.6664
vn -0.8408 0.3089 0.4445
vn -0.8878 0.0959 0.4502
vn -0.8881 -0.0963 0.4495
vn -0.8686 -0.2643 0.4191
vn -0.5964 -0.4462 0.6672
vn -0.2294 -0.5373 0.8116
vn 0.2132 -0.9642 -0.1575
vn 0.5774 -0.8049 -0.1368
vn 0.2132 -0.9642 0.1575
vn 0.5774 -0.8049 0.1368
vn 0.0299 -0.8629 -0.5046
vn 0.2237 -0.8404 -0.4936
vn 0.5865 -0.6966 -0.4132
vn 0.8624 -0.4417 -0.2472
vn 0.8770 -0.4722 -0.0892
vn 0.8770 -0.4722 0.0892
vn 0.8624 -0.4417 0.2472
vn 0.5865 -0.6966 0.4132
vn 0.2237 -0.8404 0.4936
vn 0.0299 -0.8629 0.5046
vn 0.0287 -0.9868 0.1594
vn 0.0287 -0.9868 -0.1594
vn -0.2132 0.9642 -0.1575
vn -0.5774 0.8049 -0.1368
vn -0.2132 0.9642 0.1575
vn -0.5774 0.8049 0.1368
vn -0.0299 0.8629 -0.5046
vn -0.2237 0.8404 -0.4936
vn -0.5865 0.6966 -0.4132
vn -0.8624 0.4417 -0.2472
vn -0.8770 0.4722 -0.0892
vn -0.8770 0.4722 0.0892
vn -0.8624 0.4417 0.2472
vn -0.5865 0.6966 0.4132
vn -0.2237 0.8404 0.4936
vn -0.0299 0.8629 0.5046
vn -0.0287 0.9868 0.1594
vn -0.0287 0.9868 -0.1594
vn 0.5838 0.8010 -0.1328
vn 0.2169 0.9637 -0.1558
vn 0.5838 0.8010 0.1328
vn 0.2169 0.9637 0.1558
vn 0.8349 0.4686 -0.2889
vn 0.5892 0.6957 -0.4110
vn 0.2276 0.8405 -0.4916
vn 0.0302 0.8629 -0.5045
vn 0.0289 0.9868 -0.1593
vn 0.0289 0.9868 0.1593
vn 0.0302 0.8629 0.5045
vn 0.2276 0.8405 0.4916
vn 0.5892 0.6957 0.4110
vn 0.8348 0.4686 0.2889
vn 0.8773 0.4715 0.0896
vn 0.8773 0.4715 -0.0896
vn -0.5838 -0.8010 -0.1328
vn -0.2169 -0.9637 -0.1558
vn -0.5838 -0.8010 0.1328
vn -0.2169 -0.9637 0.1558
vn -0.8348 -0.4686 -0.2889
vn -0.5892 -0.6957 -0.4110
vn -0.2276 -0.8405 -0.4916
vn -0.0302 -0.8629 -0.5045
vn -0.0289 -0.9868 -0.1593
vn -0.0289 -0.9868 0.1593
vn -0.0302 -0.8629 0.5045
vn -0.2276 -0.8405 0.4916
vn -0.5892 -0.6957 0.4110
vn -0.8348 -0.4686 0.2889
vn -0.8773 -0.4715 0.0896
vn -0.8773 -0.4715 -0.0896
vn 0.5990 -0.1507 0.7865
vn 0.6054 0.1462 0.7824
vn 0.2249 -0.1764 0.9583
vn 0.2288 0.1746 0.9577
vn 0.8686 -0.2643 0.4191
vn 0.8881 -0.0963 0.4495
vn 0.8878 0.0959 0.4502
vn 0.8408 0.3089 0.4445
vn 0.5992 0.4437 0.6664
vn 0.2334 0.5353 0.8118
vn 0.0310 0.5500 0.8346
vn 0.0306 0.1790 0.9834
vn 0.0303 -0.1791 0.9834
vn 0.0307 -0.5501 0.8345
vn 0.2294 -0.5373 0.8116
vn 0.5964 -0.4462 0.6672
vn -0.5990 -0.1507 -0.7865
vn -0.6054 0.1462 -0.7824
vn -0.2249 -0.1764 -0.9583
vn -0.2288 0.1745 -0.9577
vn -0.8686 -0.2643 -0.4191
vn -0.8881 -0.0963 -0.4495
vn -0.8878 0.0959 -0.4502
vn -0.8408 0.3089 -0.4445
vn -0.5992 0.4437 -0.6664
vn -0.2334 0.5353 -0.8118
vn -0.0310 0.5500 -0.8346
vn -0.0306 0.1790 -0.9834
vn -0.0303 -0.1791 -0.9834
vn -0.0307 -0.5501 -0.8345
vn -0.2294 -0.5373 -0.8116
vn -0.5964 -0.4462 -0.6672
usemtl Material.002
s off
f 412/323/259 416/324/259 415/325/259
f 414/326/260 416/324/260 413/327/260
f 416/324/261 418/328/261 415/325/261
f 416/324/262 420/329/262 419/330/262
f 284/331/263 412/323/263 306/332/263
f 307/333/264 413/327/264 412/323/264
f 309/334/265 413/327/265 308/335/265
f 285/336/266 414/326/266 309/334/266
f 310/337/267 417/338/267 414/326/267
f 417/338/268 312/339/268 420/329/268
f 420/329/269 287/340/269 313/341/269
f 419/330/270 313/341/270 314/342/270
f 419/330/271 315/343/271 418/328/271
f 418/328/272 286/344/272 304/345/272
f 415/325/273 304/345/273 305/346/273
f 306/332/274 415/325/274 305/346/274
f 422/347/275 424/348/275 421/349/275
f 422/347/276 426/350/276 425/351/276
f 425/351/277 427/352/277 424/348/277
f 425/351/278 429/353/278 428/354/278
f 411/355/279 342/356/279 292/357/279
f 410/358/280 421/349/280 411/355/280
f 410/358/281 423/359/281 422/347/281
f 409/360/282 319/361/282 423/359/282
f 423/359/283 320/362/283 426/350/283
f 426/350/284 321/363/284 429/353/284
f 429/353/285 291/364/285 322/365/285
f 429/353/286 323/366/286 428/354/286
f 427/352/287 323/366/287 324/367/287
f 427/352/288 290/368/288 340/369/288
f 424/348/289 340/369/289 341/370/289
f 421/349/290 341/370/290 342/356/290
f 430/371/291 434/372/291 433/373/291
f 432/374/292 434/372/292 431/375/292
f 434/372/293 436/376/293 433/373/293
f 434/372/294 438/377/294 437/378/294
f 290/368/295 430/371/295 327/379/295
f 324/367/296 431/375/296 430/371/296
f 322/365/297 431/375/297 323/366/297
f 291/364/298 432/374/298 322/365/298
f 328/380/299 435/381/299 432/374/299
f 435/381/300 330/382/300 438/377/300
f 438/377/301 289/383/301 331/384/301
f 437/378/302 331/384/302 332/385/302
f 437/378/303 333/386/303 436/376/303
f 436/376/304 288/387/304 325/388/304
f 433/373/305 325/388/305 326/389/305
f 327/379/306 433/373/306 326/389/306
f 440/390/307 442/391/307 439/392/307
f 440/390/308 444/393/308 443/394/308
f 443/394/309 445/395/309 442/391/309
f 443/394/310 447/396/310 446/397/310
f 402/398/311 360/399/311 298/400/311
f 401/401/312 439/392/312 402/398/312
f 401/401/313 441/402/313 440/390/313
f 400/403/314 337/404/314 441/402/314
f 441/402/315 338/405/315 444/393/315
f 444/393/316 339/406/316 447/396/316
f 447/396/317 285/407/317 309/408/317
f 447/396/318 308/409/318 446/397/318
f 445/395/319 308/409/319 307/410/319
f 445/395/320 284/411/320 358/412/320
f 442/391/321 358/412/321 359/413/321
f 439/392/322 359/413/322 360/399/322
f 449/414/323 451/415/323 448/416/323
f 450/417/324 452/418/324 449/414/324
f 451/415/325 455/419/325 454/420/325
f 452/418/326 456/421/326 455/419/326
f 342/422/327 391/423/327 292/424/327
f 341/425/328 448/416/328 342/422/328
f 340/426/329 449/414/329 341/425/329
f 290/368/330 450/417/330 340/426/330
f 450/417/331 326/389/331 453/427/331
f 326/389/332 456/421/332 453/427/332
f 456/421/333 288/387/333 336/428/333
f 455/419/334 336/428/334 335/429/334
f 454/420/335 335/429/335 334/430/335
f 393/431/336 334/430/336 300/432/336
f 392/433/337 454/420/337 393/431/337
f 448/416/338 392/433/338 391/423/338
f 458/434/339 460/435/339 457/436/339
f 459/437/340 461/438/340 458/434/340
f 460/435/341 464/439/341 463/440/341
f 461/438/342 465/441/342 464/439/342
f 351/442/343 382/443/343 295/444/343
f 350/445/344 457/436/344 351/442/344
f 349/446/345 458/434/345 350/445/345
f 287/447/346 459/437/346 349/446/346
f 459/437/347 311/448/347 462/449/347
f 311/448/348 465/441/348 462/449/348
f 465/441/349 285/450/349 339/451/349
f 464/439/350 339/451/350 338/452/350
f 463/440/351 338/452/351 337/453/351
f 384/454/352 337/453/352 303/455/352
f 383/456/353 463/440/353 384/454/353
f 457/436/354 383/456/354 382/443/354
f 466/457/355 470/458/355 469/459/355
f 467/460/356 471/461/356 470/458/356
f 470/458/357 472/462/357 469/459/357
f 471/461/358 473/463/358 470/458/358
f 291/364/359 466/457/359 328/380/359
f 321/464/360 467/460/360 466/457/360
f 320/465/361 468/466/361 467/460/361
f 319/467/362 376/468/362 468/466/362
f 468/466/363 377/469/363 471/461/363
f 377/469/364 474/470/364 471/461/364
f 378/471/365 369/472/365 474/470/365
f 474/470/366 368/473/366 473/463/366
f 473/463/367 367/474/367 472/462/367
f 472/462/368 289/383/368 330/382/368
f 329/475/369 472/462/369 330/382/369
f 466/457/370 329/475/370 328/380/370
f 476/476/371 478/477/371 475/478/371
f 477/479/371 479/480/371 476/476/371
f 479/480/372 481/481/372 478/477/372
f 480/482/372 482/483/372 479/480/372
f 357/484/373 376/468/373 297/485/373
f 356/486/373 475/478/373 357/484/373
f 356/486/373 477/479/373 476/476/373
f 296/487/373 477/479/373 355/488/373
f 379/489/371 480/482/371 477/479/371
f 380/490/372 483/491/372 480/482/372
f 381/492/374 372/493/374 483/491/374
f 483/491/374 371/494/374 482/483/374
f 482/483/374 370/495/374 481/481/374
f 481/481/374 301/496/374 378/471/374
f 478/477/372 378/471/372 377/469/372
f 475/478/371 377/469/371 376/468/371
f 485/497/371 487/498/371 484/499/371
f 486/500/371 488/501/371 485/497/371
f 488/501/372 490/502/372 487/498/372
f 488/501/372 492/503/372 491/504/372
f 296/487/373 484/499/373 379/489/373
f 353/505/373 484/499/373 354/506/373
f 352/507/373 485/497/373 353/505/373
f 295/444/373 486/500/373 352/507/373
f 382/443/371 489/508/371 486/500/371
f 383/456/372 492/503/372 489/508/372
f 384/454/374 375/509/374 492/503/374
f 492/503/374 374/510/374 491/504/374
f 491/504/374 373/511/374 490/502/374
f 490/502/374 302/512/374 381/492/374
f 487/498/372 381/492/372 380/490/372
f 484/499/371 380/490/371 379/489/371
f 493/513/375 497/514/375 496/515/375
f 494/516/376 498/517/376 497/514/376
f 497/514/377 499/518/377 496/515/377
f 498/517/378 500/519/378 497/514/378
f 286/520/379 493/513/379 304/521/379
f 318/522/380 494/516/380 493/513/380
f 317/523/381 495/524/381 494/516/381
f 316/525/382 385/526/382 495/524/382
f 495/524/383 386/527/383 498/517/383
f 386/527/384 501/528/384 498/517/384
f 387/529/385 360/530/385 501/528/385
f 501/528/386 359/531/386 500/519/386
f 500/519/387 358/532/387 499/518/387
f 499/518/388 284/533/388 306/534/388
f 305/535/389 499/518/389 306/534/389
f 493/513/390 305/535/390 304/521/390
f 503/536/391 505/537/391 502/538/391
f 504/539/391 506/540/391 503/536/391
f 506/540/392 508/541/392 505/537/392
f 507/542/392 509/543/392 506/540/392
f 294/544/393 502/538/393 385/526/393
f 347/545/393 502/538/393 348/546/393
f 347/545/393 504/539/393 503/536/393
f 293/547/393 504/539/393 346/548/393
f 388/549/391 507/542/391 504/539/391
f 389/550/392 510/551/392 507/542/392
f 390/552/394 363/553/394 510/551/394
f 510/551/394 362/554/394 509/543/394
f 509/543/394 361/555/394 508/541/394
f 508/541/394 298/556/394 387/529/394
f 505/537/392 387/529/392 386/527/392
f 502/538/391 386/527/391 385/526/391
f 512/557/391 514/558/391 511/559/391
f 513/560/391 515/561/391 512/557/391
f 515/561/392 517/562/392 514/558/392
f 515/561/392 519/563/392 518/564/392
f 293/547/393 511/559/393 388/549/393
f 344/565/393 511/559/393 345/566/393
f 343/567/393 512/557/393 344/565/393
f 292/424/393 513/560/393 343/567/393
f 391/423/391 516/568/391 513/560/391
f 392/433/392 519/563/392 516/568/392
f 393/431/394 366/569/394 519/563/394
f 519/563/394 365/570/394 518/564/394
f 518/564/394 364/571/394 517/562/394
f 517/562/394 299/572/394 390/552/394
f 514/558/392 390/552/392 389/550/392
f 511/559/391 389/550/391 388/549/391
f 520/573/395 524/574/395 523/575/395
f 522/576/396 524/574/396 521/577/396
f 523/575/397 527/578/397 526/579/397
f 525/580/398 527/578/398 524/574/398
f 288/387/399 520/573/399 336/581/399
f 332/385/400 520/573/400 333/386/400
f 332/385/401 522/576/401 521/577/401
f 289/383/402 522/576/402 331/384/402
f 367/582/403 525/580/403 522/576/403
f 368/583/404 528/584/404 525/580/404
f 369/585/405 394/586/405 528/584/405
f 528/584/406 395/587/406 527/578/406
f 526/579/407 395/587/407 396/588/407
f 334/589/408 396/588/408 300/590/408
f 335/591/409 526/579/409 334/589/409
f 336/581/410 523/575/410 335/591/410
f 530/592/411 532/593/411 529/594/411
f 531/595/412 533/596/412 530/592/412
f 533/596/411 535/597/411 532/593/411
f 534/598/412 536/599/412 533/596/412
f 396/588/413 366/600/413 300/590/413
f 395/587/411 529/594/411 396/588/411
f 394/586/412 530/592/412 395/587/412
f 301/601/414 531/595/414 394/586/414
f 370/602/414 534/598/414 531/595/414
f 534/598/414 372/603/414 537/604/414
f 372/603/414 397/605/414 537/604/414
f 537/604/412 398/606/412 536/599/412
f 536/599/411 399/607/411 535/597/411
f 535/597/413 299/608/413 364/609/413
f 532/593/413 364/609/413 365/610/413
f 529/594/413 365/610/413 366/600/413
f 539/611/411 541/612/411 538/613/411
f 540/614/412 542/615/412 539/611/412
f 542/615/411 544/616/411 541/612/411
f 543/617/412 545/618/412 542/615/412
f 399/607/413 363/619/413 299/608/413
f 398/606/411 538/613/411 399/607/411
f 397/605/412 539/611/412 398/606/412
f 302/620/414 540/614/414 397/605/414
f 373/621/414 543/617/414 540/614/414
f 374/622/414 546/623/414 543/617/414
f 375/624/414 400/403/414 546/623/414
f 546/623/412 401/401/412 545/618/412
f 545/618/411 402/398/411 544/616/411
f 544/616/413 298/400/413 361/625/413
f 541/612/413 361/625/413 362/626/413
f 363/619/413 541/612/413 362/626/413
f 547/627/415 551/628/415 550/629/415
f 549/630/416 551/628/416 548/631/416
f 550/629/417 554/632/417 553/633/417
f 552/634/418 554/632/418 551/628/418
f 286/344/419 547/627/419 318/635/419
f 314/342/420 547/627/420 315/343/420
f 314/342/421 549/630/421 548/631/421
f 287/340/422 549/630/422 313/341/422
f 349/636/423 552/634/423 549/630/423
f 350/637/424 555/638/424 552/634/424
f 351/639/425 403/640/425 555/638/425
f 555/638/426 404/641/426 554/632/426
f 553/633/427 404/641/427 405/642/427
f 316/643/428 405/642/428 294/644/428
f 317/645/429 553/633/429 316/643/429
f 318/635/430 550/629/430 317/645/430
f 557/646/431 559/647/431 556/648/431
f 558/649/432 560/650/432 557/646/432
f 560/650/431 562/651/431 559/647/431
f 561/652/432 563/653/432 560/650/432
f 405/642/433 348/654/433 294/644/433
f 404/641/431 556/648/431 405/642/431
f 403/640/432 557/646/432 404/641/432
f 295/655/434 558/649/434 403/640/434
f 558/649/434 353/656/434 561/652/434
f 353/656/434 564/657/434 561/652/434
f 354/658/434 406/659/434 564/657/434
f 564/657/432 407/660/432 563/653/432
f 563/653/431 408/661/431 562/651/431
f 562/651/433 293/662/433 346/663/433
f 347/664/433 562/651/433 346/663/433
f 556/648/433 347/664/433 348/654/433
f 566/665/431 568/666/431 565/667/431
f 567/668/432 569/669/432 566/665/432
f 569/669/431 571/670/431 568/666/431
f 570/671/432 572/672/432 569/669/432
f 293/662/433 565/667/433 345/673/433
f 407/660/431 565/667/431 408/661/431
f 406/659/432 566/665/432 407/660/432
f 296/674/434 567/668/434 406/659/434
f 567/668/434 356/675/434 570/671/434
f 356/675/434 573/676/434 570/671/434
f 357/677/434 409/360/434 573/676/434
f 573/676/432 410/358/432 572/672/432
f 572/672/431 411/355/431 571/670/431
f 343/678/433 411/355/433 292/357/433
f 568/666/433 343/678/433 344/679/433
f 565/667/433 344/679/433 345/673/433
f 412/323/435 413/327/435 416/324/435
f 414/326/436 417/338/436 416/324/436
f 416/324/437 419/330/437 418/328/437
f 416/324/438 417/338/438 420/329/438
f 284/331/439 307/333/439 412/323/439
f 307/333/264 308/335/264 413/327/264
f 309/334/265 414/326/265 413/327/265
f 285/336/440 310/337/440 414/326/440
f 310/337/441 311/680/441 417/338/441
f 417/338/442 311/680/442 312/339/442
f 420/329/443 312/339/443 287/340/443
f 419/330/270 420/329/270 313/341/270
f 419/330/271 314/342/271 315/343/271
f 418/328/444 315/343/444 286/344/444
f 415/325/445 418/328/445 304/345/445
f 306/332/446 412/323/446 415/325/446
f 422/347/447 425/351/447 424/348/447
f 422/347/448 423/359/448 426/350/448
f 425/351/449 428/354/449 427/352/449
f 425/351/450 426/350/450 429/353/450
f 411/355/451 421/349/451 342/356/451
f 410/358/452 422/347/452 421/349/452
f 410/358/453 409/360/453 423/359/453
f 409/360/454 297/681/454 319/361/454
f 423/359/455 319/361/455 320/362/455
f 426/350/456 320/362/456 321/363/456
f 429/353/457 321/363/457 291/364/457
f 429/353/458 322/365/458 323/366/458
f 427/352/459 428/354/459 323/366/459
f 427/352/460 324/367/460 290/368/460
f 424/348/461 427/352/461 340/369/461
f 421/349/462 424/348/462 341/370/462
f 430/371/463 431/375/463 434/372/463
f 432/374/464 435/381/464 434/372/464
f 434/372/465 437/378/465 436/376/465
f 434/372/466 435/381/466 438/377/466
f 290/368/467 324/367/467 430/371/467
f 324/367/296 323/366/296 431/375/296
f 322/365/297 432/374/297 431/375/297
f 291/364/468 328/380/468 432/374/468
f 328/380/469 329/475/469 435/381/469
f 435/381/470 329/475/470 330/382/470
f 438/377/471 330/382/471 289/383/471
f 437/378/302 438/377/302 331/384/302
f 437/378/303 332/385/303 333/386/303
f 436/376/472 333/386/472 288/387/472
f 433/373/473 436/376/473 325/388/473
f 327/379/474 430/371/474 433/373/474
f 440/390/475 443/394/475 442/391/475
f 440/390/476 441/402/476 444/393/476
f 443/394/477 446/397/477 445/395/477
f 443/394/478 444/393/478 447/396/478
f 402/398/479 439/392/479 360/399/479
f 401/401/480 440/390/480 439/392/480
f 401/401/481 400/403/481 441/402/481
f 400/403/482 303/682/482 337/404/482
f 441/402/483 337/404/483 338/405/483
f 444/393/484 338/405/484 339/406/484
f 447/396/485 339/406/485 285/407/485
f 447/396/486 309/408/486 308/409/486
f 445/395/487 446/397/487 308/409/487
f 445/395/488 307/410/488 284/411/488
f 442/391/489 445/395/489 358/412/489
f 439/392/490 442/391/490 359/413/490
f 449/414/491 452/418/491 451/415/491
f 450/417/492 453/427/492 452/418/492
f 451/415/493 452/418/493 455/419/493
f 452/418/494 453/427/494 456/421/494
f 342/422/495 448/416/495 391/423/495
f 341/425/496 449/414/496 448/416/496
f 340/426/497 450/417/497 449/414/497
f 290/368/498 327/379/498 450/417/498
f 450/417/499 327/379/499 326/389/499
f 326/389/500 325/388/500 456/421/500
f 456/421/501 325/388/501 288/387/501
f 455/419/502 456/421/502 336/428/502
f 454/420/503 455/419/503 335/429/503
f 393/431/504 454/420/504 334/430/504
f 392/433/505 451/415/505 454/420/505
f 448/416/506 451/415/506 392/433/506
f 458/434/507 461/438/507 460/435/507
f 459/437/508 462/449/508 461/438/508
f 460/435/509 461/438/509 464/439/509
f 461/438/510 462/449/510 465/441/510
f 351/442/511 457/436/511 382/443/511
f 350/445/512 458/434/512 457/436/512
f 349/446/513 459/437/513 458/434/513
f 287/447/514 312/683/514 459/437/514
f 459/437/515 312/683/515 311/448/515
f 311/448/516 310/684/516 465/441/516
f 465/441/517 310/684/517 285/450/517
f 464/439/518 465/441/518 339/451/518
f 463/440/519 464/439/519 338/452/519
f 384/454/520 463/440/520 337/453/520
f 383/456/521 460/435/521 463/440/521
f 457/436/522 460/435/522 383/456/522
f 466/457/523 467/460/523 470/458/523
f 467/460/524 468/466/524 471/461/524
f 470/458/525 473/463/525 472/462/525
f 471/461/526 474/470/526 473/463/526
f 291/364/527 321/464/527 466/457/527
f 321/464/528 320/465/528 467/460/528
f 320/465/529 319/467/529 468/466/529
f 319/467/530 297/485/530 376/468/530
f 468/466/531 376/468/531 377/469/531
f 377/469/532 378/471/532 474/470/532
f 378/471/533 301/496/533 369/472/533
f 474/470/534 369/472/534 368/473/534
f 473/463/535 368/473/535 367/474/535
f 472/462/536 367/474/536 289/383/536
f 329/475/537 469/459/537 472/462/537
f 466/457/538 469/459/538 329/475/538
f 476/476/371 479/480/371 478/477/371
f 477/479/371 480/482/371 479/480/371
f 479/480/372 482/483/372 481/481/372
f 480/482/372 483/491/372 482/483/372
f 357/484/373 475/478/373 376/468/373
f 356/486/373 476/476/373 475/478/373
f 356/486/373 355/488/373 477/479/373
f 296/487/373 379/489/373 477/479/373
f 379/489/371 380/490/371 480/482/371
f 380/490/372 381/492/372 483/491/372
f 381/492/374 302/512/374 372/493/374
f 483/491/374 372/493/374 371/494/374
f 482/483/374 371/494/374 370/495/374
f 481/481/374 370/495/374 301/496/374
f 478/477/372 481/481/372 378/471/372
f 475/478/371 478/477/371 377/469/371
f 485/497/371 488/501/371 487/498/371
f 486/500/371 489/508/371 488/501/371
f 488/501/372 491/504/372 490/502/372
f 488/501/372 489/508/372 492/503/372
f 296/487/373 354/506/373 484/499/373
f 353/505/373 485/497/373 484/499/373
f 352/507/373 486/500/373 485/497/373
f 295/444/373 382/443/373 486/500/373
f 382/443/371 383/456/371 489/508/371
f 383/456/372 384/454/372 492/503/372
f 384/454/374 303/455/374 375/509/374
f 492/503/374 375/509/374 374/510/374
f 491/504/374 374/510/374 373/511/374
f 490/502/374 373/511/374 302/512/374
f 487/498/372 490/502/372 381/492/372
f 484/499/371 487/498/371 380/490/371
f 493/513/539 494/516/539 497/514/539
f 494/516/540 495/524/540 498/517/540
f 497/514/541 500/519/541 499/518/541
f 498/517/542 501/528/542 500/519/542
f 286/520/543 318/522/543 493/513/543
f 318/522/544 317/523/544 494/516/544
f 317/523/545 316/525/545 495/524/545
f 316/525/546 294/544/546 385/526/546
f 495/524/547 385/526/547 386/527/547
f 386/527/548 387/529/548 501/528/548
f 387/529/549 298/556/549 360/530/549
f 501/528/550 360/530/550 359/531/550
f 500/519/551 359/531/551 358/532/551
f 499/518/552 358/532/552 284/533/552
f 305/535/553 496/515/553 499/518/553
f 493/513/554 496/515/554 305/535/554
f 503/536/391 506/540/391 505/537/391
f 504/539/391 507/542/391 506/540/391
f 506/540/392 509/543/392 508/541/392
f 507/542/392 510/551/392 509/543/392
f 294/544/393 348/546/393 502/538/393
f 347/545/393 503/536/393 502/538/393
f 347/545/393 346/548/393 504/539/393
f 293/547/393 388/549/393 504/539/393
f 388/549/391 389/550/391 507/542/391
f 389/550/392 390/552/392 510/551/392
f 390/552/394 299/572/394 363/553/394
f 510/551/394 363/553/394 362/554/394
f 509/543/394 362/554/394 361/555/394
f 508/541/394 361/555/394 298/556/394
f 505/537/392 508/541/392 387/529/392
f 502/538/391 505/537/391 386/527/391
f 512/557/391 515/561/391 514/558/391
f 513/560/391 516/568/391 515/561/391
f 515/561/392 518/564/392 517/562/392
f 515/561/392 516/568/392 519/563/392
f 293/547/393 345/566/393 511/559/393
f 344/565/393 512/557/393 511/559/393
f 343/567/393 513/560/393 512/557/393
f 292/424/393 391/423/393 513/560/393
f 391/423/391 392/433/391 516/568/391
f 392/433/392 393/431/392 519/563/392
f 393/431/394 300/432/394 366/569/394
f 519/563/394 366/569/394 365/570/394
f 518/564/394 365/570/394 364/571/394
f 517/562/394 364/571/394 299/572/394
f 514/558/392 517/562/392 390/552/392
f 511/559/391 514/558/391 389/550/391
f 520/573/555 521/577/555 524/574/555
f 522/576/556 525/580/556 524/574/556
f 523/575/557 524/574/557 527/578/557
f 525/580/558 528/584/558 527/578/558
f 288/387/559 333/386/559 520/573/559
f 332/385/560 521/577/560 520/573/560
f 332/385/561 331/384/561 522/576/561
f 289/383/562 367/582/562 522/576/562
f 367/582/563 368/583/563 525/580/563
f 368/583/564 369/585/564 528/584/564
f 369/585/565 301/601/565 394/586/565
f 528/584/566 394/586/566 395/587/566
f 526/579/567 527/578/567 395/587/567
f 334/589/568 526/579/568 396/588/568
f 335/591/569 523/575/569 526/579/569
f 336/581/570 520/573/570 523/575/570
f 530/592/411 533/596/411 532/593/411
f 531/595/412 534/598/412 533/596/412
f 533/596/411 536/599/411 535/597/411
f 534/598/412 537/604/412 536/599/412
f 396/588/413 529/594/413 366/600/413
f 395/587/411 530/592/411 529/594/411
f 394/586/412 531/595/412 530/592/412
f 301/601/414 370/602/414 531/595/414
f 370/602/414 371/685/414 534/598/414
f 534/598/414 371/685/414 372/603/414
f 372/603/414 302/620/414 397/605/414
f 537/604/412 397/605/412 398/606/412
f 536/599/411 398/606/411 399/607/411
f 535/597/413 399/607/413 299/608/413
f 532/593/413 535/597/413 364/609/413
f 529/594/413 532/593/413 365/610/413
f 539/611/411 542/615/411 541/612/411
f 540/614/412 543/617/412 542/615/412
f 542/615/411 545/618/411 544/616/411
f 543/617/412 546/623/412 545/618/412
f 399/607/413 538/613/413 363/619/413
f 398/606/411 539/611/411 538/613/411
f 397/605/412 540/614/412 539/611/412
f 302/620/414 373/621/414 540/614/414
f 373/621/414 374/622/414 543/617/414
f 374/622/414 375/624/414 546/623/414
f 375/624/414 303/682/414 400/403/414
f 546/623/412 400/403/412 401/401/412
f 545/618/411 401/401/411 402/398/411
f 544/616/413 402/398/413 298/400/413
f 541/612/413 544/616/413 361/625/413
f 363/619/413 538/613/413 541/612/413
f 547/627/571 548/631/571 551/628/571
f 549/630/572 552/634/572 551/628/572
f 550/629/573 551/628/573 554/632/573
f 552/634/574 555/638/574 554/632/574
f 286/344/575 315/343/575 547/627/575
f 314/342/576 548/631/576 547/627/576
f 314/342/577 313/341/577 549/630/577
f 287/340/578 349/636/578 549/630/578
f 349/636/579 350/637/579 552/634/579
f 350/637/580 351/639/580 555/638/580
f 351/639/581 295/655/581 403/640/581
f 555/638/582 403/640/582 404/641/582
f 553/633/583 554/632/583 404/641/583
f 316/643/584 553/633/584 405/642/584
f 317/645/585 550/629/585 553/633/585
f 318/635/586 547/627/586 550/629/586
f 557/646/431 560/650/431 559/647/431
f 558/649/432 561/652/432 560/650/432
f 560/650/431 563/653/431 562/651/431
f 561/652/432 564/657/432 563/653/432
f 405/642/433 556/648/433 348/654/433
f 404/641/431 557/646/431 556/648/431
f 403/640/432 558/649/432 557/646/432
f 295/655/434 352/686/434 558/649/434
f 558/649/434 352/686/434 353/656/434
f 353/656/434 354/658/434 564/657/434
f 354/658/434 296/674/434 406/659/434
f 564/657/432 406/659/432 407/660/432
f 563/653/431 407/660/431 408/661/431
f 562/651/433 408/661/433 293/662/433
f 347/664/433 559/647/433 562/651/433
f 556/648/433 559/647/433 347/664/433
f 566/665/431 569/669/431 568/666/431
f 567/668/432 570/671/432 569/669/432
f 569/669/431 572/672/431 571/670/431
f 570/671/432 573/676/432 572/672/432
f 293/662/433 408/661/433 565/667/433
f 407/660/431 566/665/431 565/667/431
f 406/659/432 567/668/432 566/665/432
f 296/674/434 355/687/434 567/668/434
f 567/668/434 355/687/434 356/675/434
f 356/675/434 357/677/434 573/676/434
f 357/677/434 297/681/434 409/360/434
f 573/676/432 409/360/432 410/358/432
f 572/672/431 410/358/431 411/355/431
f 343/678/433 571/670/433 411/355/433
f 568/666/433 571/670/433 343/678/433
f 565/667/433 568/666/433 344/679/433
o Cube.002_Cube.003
v 0.180624 -0.099813 -0.214865
v 0.180624 -0.011160 -0.214865
v 0.099110 -0.099813 -0.261927
v 0.099110 -0.011160 -0.261927
v 0.281899 -0.099813 -0.390279
v 0.281899 -0.011160 -0.390279
v 0.200386 -0.099813 -0.437341
v 0.200386 -0.011160 -0.437341
v 0.165098 -0.114588 -0.407596
v 0.136162 -0.114588 -0.357478
v 0.107227 -0.114588 -0.307360
v 0.107227 0.003615 -0.307360
v 0.136162 0.003615 -0.357478
v 0.165098 0.003615 -0.407596
v 0.215911 -0.114588 -0.244610
v 0.244847 -0.114588 -0.294729
v 0.273783 -0.114588 -0.344847
v 0.273783 0.003615 -0.344847
v 0.244847 0.003615 -0.294729
v 0.215911 0.003615 -0.244610
v 0.114631 -0.106278 -0.250530
v 0.138282 -0.109526 -0.235651
v 0.162993 -0.106278 -0.222608
v 0.185514 -0.081785 -0.209605
v 0.187970 -0.055486 -0.206963
v 0.185514 -0.029188 -0.209605
v 0.162993 -0.004694 -0.222608
v 0.138282 -0.001447 -0.235651
v 0.114631 -0.004694 -0.250529
v 0.092110 -0.029188 -0.263532
v 0.088594 -0.055486 -0.264338
v 0.092110 -0.081785 -0.263532
v 0.100398 -0.114272 -0.294860
v 0.096001 -0.112057 -0.282542
v 0.095931 -0.106595 -0.270821
v 0.172509 0.003299 -0.419759
v 0.180978 0.001084 -0.429726
v 0.191093 -0.004378 -0.435647
v 0.195496 -0.029188 -0.442601
v 0.193039 -0.055486 -0.445243
v 0.195496 -0.081785 -0.442601
v 0.266379 -0.106278 -0.401677
v 0.242728 -0.109526 -0.416556
v 0.218017 -0.106278 -0.429599
v 0.218017 -0.004694 -0.429599
v 0.242728 -0.001447 -0.416556
v 0.266379 -0.004694 -0.401677
v 0.288899 -0.029188 -0.388674
v 0.292416 -0.055486 -0.387868
v 0.288899 -0.081785 -0.388674
v 0.280612 -0.114272 -0.357346
v 0.285008 -0.112057 -0.369664
v 0.285079 -0.106595 -0.381385
v 0.208501 0.003299 -0.232447
v 0.200032 0.001084 -0.222480
v 0.189916 -0.004378 -0.216559
v 0.191093 -0.106595 -0.435647
v 0.180978 -0.112057 -0.429726
v 0.172509 -0.114272 -0.419759
v 0.157864 -0.114588 -0.395066
v 0.150630 -0.114588 -0.382537
v 0.143396 -0.114588 -0.370007
v 0.128928 -0.114588 -0.344948
v 0.121694 -0.114588 -0.332419
v 0.114460 -0.114588 -0.319889
v 0.095931 -0.004378 -0.270821
v 0.096001 0.001084 -0.282542
v 0.100398 0.003299 -0.294860
v 0.114460 0.003615 -0.319889
v 0.121694 0.003615 -0.332419
v 0.128928 0.003615 -0.344948
v 0.143396 0.003615 -0.370007
v 0.150630 0.003615 -0.382537
v 0.157864 0.003615 -0.395066
v 0.189916 -0.106595 -0.216559
v 0.200032 -0.112057 -0.222480
v 0.208501 -0.114272 -0.232447
v 0.223145 -0.114588 -0.257140
v 0.230379 -0.114588 -0.269669
v 0.237613 -0.114588 -0.282199
v 0.252081 -0.114588 -0.307258
v 0.259315 -0.114588 -0.319788
v 0.266549 -0.114588 -0.332317
v 0.285079 -0.004378 -0.381385
v 0.285008 0.001084 -0.369664
v 0.280612 0.003299 -0.357346
v 0.266549 0.003615 -0.332317
v 0.259315 0.003615 -0.319788
v 0.252081 0.003615 -0.307258
v 0.237613 0.003615 -0.282199
v 0.230379 0.003615 -0.269669
v 0.223145 0.003615 -0.257140
v 0.189722 0.020238 -0.393379
v 0.219441 0.025778 -0.376221
v 0.249159 0.020238 -0.359063
v 0.160786 0.020238 -0.343261
v 0.190505 0.025778 -0.326103
v 0.220223 0.020238 -0.308945
v 0.131850 0.020238 -0.293143
v 0.161569 0.025778 -0.275985
v 0.191287 0.020238 -0.258827
v 0.131850 -0.131210 -0.293143
v 0.161569 -0.136751 -0.275985
v 0.191287 -0.131210 -0.258827
v 0.160786 -0.131210 -0.343261
v 0.190505 -0.136751 -0.326103
v 0.220223 -0.131210 -0.308945
v 0.189722 -0.131210 -0.393379
v 0.219441 -0.136751 -0.376221
v 0.249159 -0.131210 -0.359063
v 0.289067 -0.023165 -0.336023
v 0.294161 -0.055486 -0.333081
v 0.289067 -0.087808 -0.336023
v 0.260131 -0.023165 -0.285904
v 0.265226 -0.055486 -0.282963
v 0.260131 -0.087808 -0.285904
v 0.231195 -0.023165 -0.235786
v 0.236290 -0.055486 -0.232845
v 0.231195 -0.087808 -0.235786
v 0.091943 -0.023165 -0.316184
v 0.086848 -0.055486 -0.319125
v 0.091943 -0.087808 -0.316184
v 0.120879 -0.023165 -0.366302
v 0.115784 -0.055486 -0.369243
v 0.120879 -0.087808 -0.366302
v 0.149814 -0.023165 -0.416420
v 0.144720 -0.055486 -0.419361
v 0.149814 -0.087808 -0.416420
v 0.162325 -0.083514 -0.217779
v 0.162945 -0.055486 -0.215656
v 0.162325 -0.027458 -0.217779
v 0.135790 -0.085020 -0.231334
v 0.134955 -0.055486 -0.229888
v 0.135790 -0.025953 -0.231334
v 0.110783 -0.083514 -0.247537
v 0.108634 -0.055486 -0.247012
v 0.110783 -0.027458 -0.247537
v 0.157312 -0.087633 -0.428556
v 0.152252 -0.055486 -0.431500
v 0.157312 -0.023339 -0.428556
v 0.166393 -0.086414 -0.438327
v 0.161573 -0.055486 -0.441290
v 0.166393 -0.024559 -0.438327
v 0.178587 -0.083689 -0.443542
v 0.174511 -0.055486 -0.446486
v 0.178587 -0.027284 -0.443542
v 0.218684 -0.083514 -0.434427
v 0.218064 -0.055486 -0.436551
v 0.218684 -0.027458 -0.434427
v 0.245220 -0.085020 -0.420872
v 0.246054 -0.055486 -0.422318
v 0.245220 -0.025953 -0.420872
v 0.270226 -0.083514 -0.404669
v 0.272375 -0.055486 -0.405194
v 0.270226 -0.027458 -0.404669
v 0.223697 -0.087633 -0.223651
v 0.228758 -0.055486 -0.220707
v 0.223697 -0.023339 -0.223651
v 0.214616 -0.086414 -0.213879
v 0.219436 -0.055486 -0.210916
v 0.214616 -0.024559 -0.213879
v 0.202423 -0.083689 -0.208664
v 0.206499 -0.055486 -0.205721
v 0.202423 -0.027284 -0.208664
v 0.197011 -0.130809 -0.405635
v 0.204634 -0.128004 -0.416248
v 0.212447 -0.120514 -0.423993
v 0.226580 -0.136324 -0.388587
v 0.233149 -0.133331 -0.399966
v 0.238633 -0.125225 -0.409464
v 0.256128 -0.130809 -0.371504
v 0.261508 -0.128004 -0.383412
v 0.264309 -0.120514 -0.394050
v 0.124881 0.019837 -0.280702
v 0.119501 0.017031 -0.268794
v 0.116700 0.009541 -0.258156
v 0.154430 0.025351 -0.263620
v 0.147860 0.022358 -0.252241
v 0.142376 0.014252 -0.242742
v 0.183998 0.019837 -0.246571
v 0.176375 0.017031 -0.235958
v 0.168563 0.009541 -0.228213
v 0.212447 0.009541 -0.423993
v 0.204634 0.017031 -0.416248
v 0.197011 0.019837 -0.405635
v 0.238633 0.014252 -0.409464
v 0.233149 0.022358 -0.399966
v 0.226580 0.025351 -0.388587
v 0.264309 0.009541 -0.394050
v 0.261508 0.017031 -0.383412
v 0.256128 0.019837 -0.371504
v 0.182488 0.020238 -0.380850
v 0.175254 0.020238 -0.368320
v 0.168020 0.020238 -0.355791
v 0.212207 0.025778 -0.363692
v 0.204973 0.025778 -0.351162
v 0.197739 0.025778 -0.338633
v 0.241925 0.020238 -0.346534
v 0.234691 0.020238 -0.334004
v 0.227457 0.020238 -0.321475
v 0.153552 0.020238 -0.330732
v 0.146318 0.020238 -0.318202
v 0.139084 0.020238 -0.305672
v 0.183271 0.025778 -0.313574
v 0.176037 0.025778 -0.301044
v 0.168803 0.025778 -0.288514
v 0.212989 0.020238 -0.296416
v 0.205755 0.020238 -0.283886
v 0.198521 0.020238 -0.271356
v 0.116700 -0.120514 -0.258156
v 0.119501 -0.128004 -0.268794
v 0.124881 -0.130809 -0.280702
v 0.142376 -0.125225 -0.242742
v 0.147860 -0.133331 -0.252241
v 0.154430 -0.136324 -0.263620
v 0.168563 -0.120514 -0.228213
v 0.176375 -0.128004 -0.235958
v 0.183998 -0.130809 -0.246571
v 0.139084 -0.131210 -0.305672
v 0.146318 -0.131210 -0.318202
v 0.153552 -0.131210 -0.330732
v 0.168803 -0.136751 -0.288514
v 0.176037 -0.136751 -0.301044
v 0.183271 -0.136751 -0.313574
v 0.198521 -0.131210 -0.271356
v 0.205755 -0.131210 -0.283886
v 0.212989 -0.131210 -0.296416
v 0.168020 -0.131210 -0.355791
v 0.175254 -0.131210 -0.368320
v 0.182488 -0.131210 -0.380850
v 0.197739 -0.136751 -0.338633
v 0.204973 -0.136751 -0.351162
v 0.212207 -0.136751 -0.363692
v 0.227457 -0.131210 -0.321475
v 0.234691 -0.131210 -0.334004
v 0.241925 -0.131210 -0.346534
v 0.298169 -0.083689 -0.374501
v 0.302756 -0.055486 -0.372443
v 0.298169 -0.027284 -0.374501
v 0.299749 -0.086414 -0.361334
v 0.304725 -0.055486 -0.358641
v 0.299749 -0.024559 -0.361334
v 0.295827 -0.087633 -0.348584
v 0.300907 -0.055486 -0.345673
v 0.295827 -0.023339 -0.348584
v 0.281833 -0.087808 -0.323493
v 0.286927 -0.055486 -0.320552
v 0.281833 -0.023165 -0.323493
v 0.274599 -0.087808 -0.310964
v 0.279693 -0.055486 -0.308022
v 0.274599 -0.023165 -0.310964
v 0.267365 -0.087808 -0.298434
v 0.272459 -0.055486 -0.295493
v 0.267365 -0.023165 -0.298434
v 0.252897 -0.087808 -0.273375
v 0.257992 -0.055486 -0.270434
v 0.252897 -0.023165 -0.273375
v 0.245663 -0.087808 -0.260845
v 0.250758 -0.055486 -0.257904
v 0.245663 -0.023165 -0.260845
v 0.238429 -0.087808 -0.248316
v 0.243524 -0.055486 -0.245374
v 0.238429 -0.023165 -0.248316
v 0.082840 -0.083689 -0.277705
v 0.078253 -0.055486 -0.279763
v 0.082840 -0.027284 -0.277705
v 0.081260 -0.086414 -0.290872
v 0.076284 -0.055486 -0.293565
v 0.081260 -0.024559 -0.290872
v 0.085182 -0.087633 -0.303623
v 0.080102 -0.055486 -0.306533
v 0.085182 -0.023339 -0.303623
v 0.099177 -0.087808 -0.328713
v 0.094082 -0.055486 -0.331655
v 0.099177 -0.023165 -0.328713
v 0.106411 -0.087808 -0.341243
v 0.101316 -0.055486 -0.344184
v 0.106411 -0.023165 -0.341243
v 0.113645 -0.087808 -0.353772
v 0.108550 -0.055486 -0.356714
v 0.113645 -0.023165 -0.353772
v 0.128112 -0.087808 -0.378831
v 0.123018 -0.055486 -0.381773
v 0.128112 -0.023165 -0.378831
v 0.135346 -0.087808 -0.391361
v 0.130252 -0.055486 -0.394302
v 0.135346 -0.023165 -0.391361
v 0.142580 -0.087808 -0.403891
v 0.137486 -0.055486 -0.406832
v 0.142580 -0.023165 -0.403891
vt 0.437500 0.062276
vt 0.500000 0.122396
vt 0.437500 0.123210
vt 0.562500 0.062276
vt 0.500000 0.062174
vt 0.437500 0.180725
vt 0.562500 0.180725
vt 0.500000 0.178060
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.437500 0.000000
vt 0.562500 0.000000
vt 0.500000 0.000000
vt 0.625000 0.000000
vt 0.625000 0.062500
vt 0.562500 0.123210
vt 0.625000 0.187500
vt 0.625000 0.250000
vt 0.562500 0.228516
vt 0.500000 0.223958
vt 0.437500 0.228516
vt 0.375000 0.250000
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.453451
vt 0.437500 0.470540
vt 0.437500 0.453349
vt 0.562500 0.470540
vt 0.500000 0.471354
vt 0.437663 0.491150
vt 0.562337 0.491150
vt 0.500000 0.493815
vt 0.437500 0.437500
vt 0.375000 0.453125
vt 0.375000 0.437500
vt 0.500000 0.437500
vt 0.562500 0.453349
vt 0.562500 0.437500
vt 0.625000 0.453125
vt 0.625000 0.468750
vt 0.625000 0.484375
vt 0.625000 0.500000
vt 0.560628 0.521525
vt 0.500000 0.526042
vt 0.439372 0.521525
vt 0.375000 0.500000
vt 0.375000 0.484375
vt 0.375000 0.468750
vt 0.444214 0.569214
vt 0.500000 0.625000
vt 0.446615 0.625000
vt 0.555786 0.569214
vt 0.500000 0.571615
vt 0.444214 0.680786
vt 0.555786 0.680786
vt 0.500000 0.678385
vt 0.396525 0.564372
vt 0.603475 0.564372
vt 0.553385 0.625000
vt 0.603475 0.685628
vt 0.625000 0.750000
vt 0.560628 0.728475
vt 0.500000 0.723958
vt 0.439372 0.728475
vt 0.375000 0.750000
vt 0.396525 0.685628
vt 0.401042 0.625000
vt 0.500000 0.953125
vt 0.437500 0.968750
vt 0.437500 0.953125
vt 0.562500 0.968750
vt 0.500000 0.968750
vt 0.437500 0.984375
vt 0.562500 0.984375
vt 0.500000 0.984375
vt 0.437500 0.937500
vt 0.375000 0.953125
vt 0.375000 0.937500
vt 0.500000 0.937500
vt 0.562500 0.953125
vt 0.562500 0.937500
vt 0.625000 0.953125
vt 0.625000 0.968750
vt 0.625000 0.984375
vt 0.625000 1.000000
vt 0.562500 1.000000
vt 0.500000 1.000000
vt 0.437500 1.000000
vt 0.375000 1.000000
vt 0.375000 0.984375
vt 0.375000 0.968750
vt 0.345540 0.562500
vt 0.328451 0.625000
vt 0.328349 0.562500
vt 0.366150 0.562663
vt 0.346354 0.625000
vt 0.345540 0.687500
vt 0.328349 0.687500
vt 0.366150 0.687337
vt 0.328125 0.500000
vt 0.312500 0.562500
vt 0.312500 0.500000
vt 0.343750 0.500000
vt 0.359375 0.500000
vt 0.368815 0.625000
vt 0.359375 0.750000
vt 0.343750 0.750000
vt 0.328125 0.750000
vt 0.312500 0.687500
vt 0.312500 0.750000
vt 0.312500 0.625000
vt 0.843750 0.562500
vt 0.828125 0.625000
vt 0.828125 0.562500
vt 0.859375 0.562500
vt 0.843750 0.625000
vt 0.843750 0.687500
vt 0.828125 0.687500
vt 0.859375 0.687500
vt 0.828125 0.500000
vt 0.812500 0.562500
vt 0.812500 0.500000
vt 0.843750 0.500000
vt 0.859375 0.500000
vt 0.875000 0.500000
vt 0.875000 0.625000
vt 0.859375 0.625000
vt 0.875000 0.750000
vt 0.859375 0.750000
vt 0.843750 0.750000
vt 0.828125 0.750000
vt 0.812500 0.687500
vt 0.812500 0.750000
vt 0.812500 0.625000
vt 0.633850 0.562663
vt 0.653646 0.625000
vt 0.631185 0.625000
vt 0.654460 0.562500
vt 0.671550 0.625000
vt 0.633850 0.687337
vt 0.654460 0.687500
vt 0.640625 0.500000
vt 0.656250 0.500000
vt 0.671651 0.562500
vt 0.671875 0.500000
vt 0.687500 0.562500
vt 0.687500 0.625000
vt 0.671651 0.687500
vt 0.687500 0.687500
vt 0.671875 0.750000
vt 0.656250 0.750000
vt 0.640625 0.750000
vt 0.598958 0.625000
vt 0.718750 0.562500
vt 0.703125 0.625000
vt 0.703125 0.562500
vt 0.734375 0.562500
vt 0.718750 0.625000
vt 0.703125 0.687500
vt 0.734375 0.625000
vt 0.718750 0.687500
vt 0.703125 0.500000
vt 0.687500 0.500000
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.734375 0.500000
vt 0.750000 0.562500
vt 0.750000 0.625000
vt 0.734375 0.687500
vt 0.750000 0.687500
vt 0.734375 0.750000
vt 0.718750 0.750000
vt 0.703125 0.750000
vt 0.687500 0.750000
vt 0.781250 0.562500
vt 0.765625 0.625000
vt 0.765625 0.562500
vt 0.796875 0.562500
vt 0.781250 0.625000
vt 0.765625 0.687500
vt 0.796875 0.687500
vt 0.781250 0.687500
vt 0.781250 0.500000
vt 0.765625 0.500000
vt 0.796875 0.500000
vt 0.796875 0.625000
vt 0.796875 0.750000
vt 0.781250 0.750000
vt 0.765625 0.750000
vt 0.750000 0.750000
vt 0.140625 0.562500
vt 0.156250 0.625000
vt 0.140625 0.625000
vt 0.156250 0.562500
vt 0.171875 0.625000
vt 0.140625 0.687500
vt 0.156250 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.140625 0.500000
vt 0.156250 0.500000
vt 0.171875 0.562500
vt 0.171875 0.500000
vt 0.187500 0.562500
vt 0.187500 0.625000
vt 0.171875 0.687500
vt 0.187500 0.687500
vt 0.171875 0.750000
vt 0.156250 0.750000
vt 0.140625 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.218750 0.562500
vt 0.203125 0.625000
vt 0.203125 0.562500
vt 0.234375 0.562500
vt 0.218750 0.625000
vt 0.203125 0.687500
vt 0.234375 0.625000
vt 0.218750 0.687500
vt 0.187500 0.500000
vt 0.218750 0.500000
vt 0.203125 0.500000
vt 0.250000 0.500000
vt 0.234375 0.500000
vt 0.250000 0.562500
vt 0.250000 0.625000
vt 0.234375 0.687500
vt 0.250000 0.687500
vt 0.234375 0.750000
vt 0.218750 0.750000
vt 0.203125 0.750000
vt 0.187500 0.750000
vt 0.281250 0.562500
vt 0.265625 0.625000
vt 0.265625 0.562500
vt 0.296875 0.562500
vt 0.281250 0.625000
vt 0.265625 0.687500
vt 0.296875 0.687500
vt 0.281250 0.687500
vt 0.281250 0.500000
vt 0.265625 0.500000
vt 0.296875 0.500000
vt 0.296875 0.625000
vt 0.296875 0.750000
vt 0.281250 0.750000
vt 0.265625 0.750000
vt 0.250000 0.750000
vt 0.437663 0.758850
vt 0.500000 0.778646
vt 0.437500 0.779460
vt 0.562337 0.758850
vt 0.500000 0.756185
vt 0.500000 0.796549
vt 0.437500 0.796651
vt 0.562500 0.779460
vt 0.375000 0.765625
vt 0.625000 0.765625
vt 0.625000 0.781250
vt 0.562500 0.796651
vt 0.625000 0.796875
vt 0.562500 0.812500
vt 0.500000 0.812500
vt 0.437500 0.812500
vt 0.375000 0.796875
vt 0.375000 0.812500
vt 0.375000 0.781250
vt 0.500000 0.828125
vt 0.437500 0.843750
vt 0.437500 0.828125
vt 0.562500 0.828125
vt 0.500000 0.843750
vt 0.437500 0.859375
vt 0.562500 0.843750
vt 0.500000 0.859375
vt 0.375000 0.828125
vt 0.625000 0.812500
vt 0.625000 0.828125
vt 0.625000 0.859375
vt 0.562500 0.859375
vt 0.562500 0.875000
vt 0.500000 0.875000
vt 0.437500 0.875000
vt 0.375000 0.875000
vt 0.375000 0.859375
vt 0.375000 0.843750
vt 0.500000 0.890625
vt 0.437500 0.906250
vt 0.437500 0.890625
vt 0.562500 0.890625
vt 0.500000 0.906250
vt 0.437500 0.921875
vt 0.562500 0.906250
vt 0.500000 0.921875
vt 0.375000 0.890625
vt 0.625000 0.875000
vt 0.625000 0.890625
vt 0.625000 0.906250
vt 0.562500 0.921875
vt 0.625000 0.921875
vt 0.375000 0.921875
vt 0.375000 0.906250
vt 0.437500 0.258850
vt 0.500000 0.278646
vt 0.437500 0.279460
vt 0.562500 0.258850
vt 0.500000 0.256185
vt 0.500000 0.296549
vt 0.437500 0.296651
vt 0.562500 0.279460
vt 0.375000 0.265625
vt 0.625000 0.265625
vt 0.625000 0.281250
vt 0.562500 0.296651
vt 0.625000 0.296875
vt 0.562500 0.312500
vt 0.500000 0.312500
vt 0.437500 0.312500
vt 0.375000 0.296875
vt 0.375000 0.312500
vt 0.375000 0.281250
vt 0.500000 0.328125
vt 0.437500 0.343750
vt 0.437500 0.328125
vt 0.562500 0.328125
vt 0.500000 0.343750
vt 0.437500 0.359375
vt 0.562500 0.343750
vt 0.500000 0.359375
vt 0.375000 0.328125
vt 0.625000 0.312500
vt 0.625000 0.343750
vt 0.562500 0.359375
vt 0.625000 0.359375
vt 0.562500 0.375000
vt 0.500000 0.375000
vt 0.437500 0.375000
vt 0.375000 0.375000
vt 0.375000 0.359375
vt 0.375000 0.343750
vt 0.500000 0.390625
vt 0.437500 0.406250
vt 0.437500 0.390625
vt 0.562500 0.390625
vt 0.500000 0.406250
vt 0.437500 0.421875
vt 0.562500 0.406250
vt 0.500000 0.421875
vt 0.375000 0.390625
vt 0.625000 0.375000
vt 0.625000 0.406250
vt 0.562500 0.421875
vt 0.625000 0.421875
vt 0.375000 0.421875
vt 0.375000 0.406250
vt 0.625000 0.125000
vt 0.625000 0.437500
vt 0.625000 0.937500
vt 0.875000 0.562500
vt 0.875000 0.687500
vt 0.625000 0.843750
vt 0.625000 0.328125
vt 0.625000 0.390625
vn -0.4516 -0.0563 0.8904
vn -0.4525 0.0574 0.8899
vn -0.5453 -0.0563 0.8363
vn -0.5444 0.0574 0.8368
vn -0.3295 -0.2052 0.9216
vn -0.3275 -0.0642 0.9427
vn -0.3274 0.0641 0.9427
vn -0.3142 0.1864 0.9309
vn -0.4368 0.1989 0.8773
vn -0.5413 0.1989 0.8170
vn -0.6490 0.1864 0.7376
vn -0.6527 0.0641 0.7549
vn -0.6526 -0.0642 0.7550
vn -0.6334 -0.2052 0.7461
vn -0.5413 -0.1990 0.8170
vn -0.4369 -0.1990 0.8772
vn -0.7104 -0.1743 -0.6819
vn -0.7129 0.1761 -0.6788
vn -0.3616 -0.1446 -0.9210
vn -0.3685 0.1490 -0.9176
vn -0.7068 -0.5500 -0.4449
vn -0.8358 -0.1790 -0.5190
vn -0.8360 0.1791 -0.5187
vn -0.7069 0.5501 -0.4446
vn -0.5838 0.5364 -0.6095
vn -0.2677 0.4414 -0.8565
vn 0.0827 0.2584 -0.9625
vn 0.0666 0.0940 -0.9933
vn 0.0658 -0.0937 -0.9934
vn 0.0472 -0.3024 -0.9520
vn -0.2655 -0.4389 -0.8584
vn -0.5818 -0.5344 -0.6131
vn 0.4516 -0.0563 -0.8904
vn 0.4525 0.0574 -0.8899
vn 0.5453 -0.0563 -0.8363
vn 0.5444 0.0574 -0.8368
vn 0.3295 -0.2052 -0.9216
vn 0.3275 -0.0642 -0.9427
vn 0.3274 0.0641 -0.9427
vn 0.3142 0.1864 -0.9309
vn 0.4368 0.1989 -0.8773
vn 0.5413 0.1989 -0.8170
vn 0.6490 0.1864 -0.7376
vn 0.6527 0.0641 -0.7549
vn 0.6526 -0.0642 -0.7550
vn 0.6334 -0.2052 -0.7461
vn 0.5413 -0.1990 -0.8170
vn 0.4369 -0.1990 -0.8772
vn 0.7104 -0.1743 0.6819
vn 0.7129 0.1761 0.6788
vn 0.3616 -0.1446 0.9210
vn 0.3685 0.1490 0.9176
vn 0.7068 -0.5500 0.4449
vn 0.8358 -0.1790 0.5190
vn 0.8360 0.1791 0.5187
vn 0.7069 0.5501 0.4446
vn 0.5838 0.5364 0.6095
vn 0.2677 0.4414 0.8565
vn -0.0827 0.2584 0.9625
vn -0.0666 0.0940 0.9933
vn -0.0658 -0.0937 0.9934
vn -0.0472 -0.3024 0.9520
vn 0.2655 -0.4389 0.8584
vn 0.5818 -0.5344 0.6131
vn -0.0232 -0.9623 -0.2710
vn 0.1838 -0.7927 -0.5812
vn 0.2463 -0.9623 -0.1154
vn 0.4115 -0.7927 -0.4498
vn -0.4213 -0.8629 -0.2791
vn -0.3080 -0.8392 -0.4482
vn -0.0518 -0.6884 -0.7235
vn 0.1762 -0.4589 -0.8709
vn 0.3658 -0.4607 -0.8087
vn 0.5174 -0.4607 -0.7211
vn 0.6661 -0.4589 -0.5880
vn 0.6525 -0.6884 -0.3169
vn 0.5422 -0.8392 0.0427
vn 0.4524 -0.8629 0.2253
vn 0.1529 -0.9868 0.0539
vn -0.1231 -0.9868 -0.1055
vn -0.2463 0.9623 0.1154
vn -0.4115 0.7927 0.4498
vn 0.0232 0.9623 0.2710
vn -0.1838 0.7927 0.5812
vn -0.4524 0.8629 -0.2253
vn -0.5422 0.8392 -0.0426
vn -0.6525 0.6884 0.3169
vn -0.6661 0.4589 0.5880
vn -0.5174 0.4607 0.7211
vn -0.3658 0.4607 0.8087
vn -0.1762 0.4589 0.8709
vn 0.0518 0.6884 0.7235
vn 0.3080 0.8392 0.4482
vn 0.4213 0.8629 0.2791
vn 0.1231 0.9868 0.1055
vn -0.1529 0.9868 -0.0539
vn 0.1772 0.7968 -0.5777
vn -0.0265 0.9629 -0.2686
vn 0.4117 0.7968 -0.4423
vn 0.2458 0.9629 -0.1113
vn 0.2251 0.4320 -0.8733
vn -0.0552 0.6894 -0.7223
vn -0.3117 0.8391 -0.4457
vn -0.4216 0.8628 -0.2790
vn -0.1233 0.9868 -0.1053
vn 0.1528 0.9868 0.0541
vn 0.4524 0.8628 0.2256
vn 0.5419 0.8391 0.0471
vn 0.6531 0.6894 -0.3134
vn 0.6438 0.4320 -0.6316
vn 0.5170 0.4614 -0.7210
vn 0.3659 0.4614 -0.8082
vn -0.1380 0.9872 -0.0797
vn 0.1380 0.9872 0.0797
vn -0.4371 0.8633 -0.2523
vn 0.4371 0.8633 0.2523
vn -0.4117 -0.7968 0.4423
vn -0.2458 -0.9629 0.1113
vn -0.1772 -0.7968 0.5777
vn 0.0265 -0.9629 0.2686
vn -0.6438 -0.4320 0.6316
vn -0.6531 -0.6894 0.3134
vn -0.5419 -0.8391 -0.0471
vn -0.4524 -0.8628 -0.2256
vn -0.1528 -0.9868 -0.0541
vn 0.1233 -0.9868 0.1053
vn 0.4216 -0.8628 0.2790
vn 0.3117 -0.8391 0.4457
vn 0.0552 -0.6894 0.7223
vn -0.2251 -0.4320 0.8733
vn -0.3659 -0.4614 0.8082
vn -0.5170 -0.4614 0.7210
vn -0.1380 -0.9872 -0.0797
vn 0.1380 -0.9872 0.0797
vn -0.4371 -0.8633 -0.2523
vn 0.4371 -0.8633 0.2523
vn 0.9785 -0.1446 -0.1473
vn 0.9789 0.1490 -0.1397
vn 0.9457 -0.1743 0.2742
vn 0.9443 0.1761 0.2780
vn 0.8009 -0.3024 -0.5169
vn 0.8274 -0.0937 -0.5537
vn 0.8269 0.0940 -0.5544
vn 0.7922 0.2584 -0.5528
vn 0.8756 0.4414 -0.1964
vn 0.8197 0.5364 0.2008
vn 0.7385 0.5501 0.3899
vn 0.8672 0.1791 0.4646
vn 0.8674 -0.1790 0.4644
vn 0.7387 -0.5500 0.3897
vn 0.8219 -0.5344 0.1973
vn 0.8762 -0.4389 -0.1993
vn 0.8520 -0.1791 0.4919
vn 0.8520 0.1791 0.4919
vn 0.7231 -0.5503 0.4175
vn 0.7231 0.5503 0.4175
vn -0.9785 -0.1446 0.1473
vn -0.9789 0.1490 0.1397
vn -0.9457 -0.1743 -0.2742
vn -0.9443 0.1761 -0.2780
vn -0.8009 -0.3024 0.5169
vn -0.8274 -0.0937 0.5537
vn -0.8269 0.0940 0.5544
vn -0.7922 0.2584 0.5528
vn -0.8756 0.4414 0.1964
vn -0.8197 0.5364 -0.2008
vn -0.7385 0.5501 -0.3899
vn -0.8672 0.1791 -0.4646
vn -0.8674 -0.1790 -0.4644
vn -0.7387 -0.5500 -0.3897
vn -0.8219 -0.5344 -0.1973
vn -0.8762 -0.4389 0.1993
vn -0.8520 -0.1791 -0.4919
vn -0.8520 0.1791 -0.4919
vn -0.7231 -0.5503 -0.4175
vn -0.7231 0.5503 -0.4175
vn -0.4525 -0.0574 0.8899
vn -0.4516 0.0563 0.8904
vn -0.5444 -0.0574 0.8368
vn -0.5453 0.0563 0.8363
vn -0.3142 -0.1864 0.9309
vn -0.3274 -0.0641 0.9427
vn -0.3275 0.0642 0.9427
vn -0.3295 0.2052 0.9216
vn -0.4369 0.1990 0.8772
vn -0.5413 0.1990 0.8170
vn -0.6334 0.2052 0.7461
vn -0.6526 0.0642 0.7550
vn -0.6527 -0.0641 0.7549
vn -0.6490 -0.1864 0.7376
vn -0.5413 -0.1989 0.8170
vn -0.4368 -0.1989 0.8773
vn -0.7129 -0.1761 -0.6788
vn -0.7104 0.1743 -0.6819
vn -0.3685 -0.1490 -0.9176
vn -0.3616 0.1446 -0.9210
vn -0.7069 -0.5501 -0.4446
vn -0.8360 -0.1791 -0.5187
vn -0.8358 0.1790 -0.5190
vn -0.7068 0.5500 -0.4449
vn -0.5818 0.5344 -0.6131
vn -0.2655 0.4389 -0.8584
vn 0.0472 0.3024 -0.9520
vn 0.0658 0.0937 -0.9934
vn 0.0666 -0.0940 -0.9933
vn 0.0827 -0.2584 -0.9625
vn -0.2677 -0.4414 -0.8565
vn -0.5838 -0.5364 -0.6095
vn 0.4525 -0.0574 -0.8899
vn 0.4516 0.0563 -0.8904
vn 0.5444 -0.0574 -0.8368
vn 0.5453 0.0563 -0.8363
vn 0.3142 -0.1864 -0.9309
vn 0.3274 -0.0641 -0.9427
vn 0.3275 0.0642 -0.9427
vn 0.3295 0.2052 -0.9216
vn 0.4369 0.1990 -0.8772
vn 0.5413 0.1990 -0.8170
vn 0.6334 0.2052 -0.7461
vn 0.6526 0.0642 -0.7550
vn 0.6527 -0.0641 -0.7549
vn 0.6490 -0.1864 -0.7376
vn 0.5413 -0.1989 -0.8170
vn 0.4368 -0.1989 -0.8773
vn 0.7129 -0.1761 0.6788
vn 0.7104 0.1743 0.6819
vn 0.3685 -0.1490 0.9176
vn 0.3616 0.1446 0.9210
vn 0.7069 -0.5501 0.4446
vn 0.8360 -0.1791 0.5187
vn 0.8358 0.1790 0.5190
vn 0.7068 0.5500 0.4449
vn 0.5818 0.5344 0.6131
vn 0.2655 0.4389 0.8584
vn -0.0472 0.3024 0.9520
vn -0.0658 0.0937 0.9934
vn -0.0666 -0.0940 0.9933
vn -0.0827 -0.2584 0.9625
vn 0.2677 -0.4414 0.8565
vn 0.5838 -0.5364 0.6095
vn -0.0265 -0.9629 -0.2686
vn 0.1772 -0.7968 -0.5777
vn 0.2458 -0.9629 -0.1113
vn 0.4117 -0.7968 -0.4423
vn -0.4216 -0.8628 -0.2790
vn -0.3117 -0.8391 -0.4457
vn -0.0551 -0.6894 -0.7223
vn 0.2251 -0.4320 -0.8733
vn 0.3659 -0.4614 -0.8082
vn 0.5170 -0.4614 -0.7210
vn 0.6438 -0.4320 -0.6316
vn 0.6531 -0.6894 -0.3134
vn 0.5419 -0.8391 0.0471
vn 0.4524 -0.8628 0.2256
vn 0.1528 -0.9868 0.0541
vn -0.1233 -0.9868 -0.1053
vn -0.2458 0.9629 0.1113
vn -0.4117 0.7968 0.4423
vn 0.0265 0.9629 0.2686
vn -0.1772 0.7968 0.5777
vn -0.4524 0.8628 -0.2256
vn -0.5419 0.8391 -0.0471
vn -0.6531 0.6894 0.3134
vn -0.6438 0.4320 0.6316
vn -0.5170 0.4614 0.7210
vn -0.3659 0.4614 0.8082
vn -0.2251 0.4320 0.8733
vn 0.0552 0.6894 0.7223
vn 0.3117 0.8391 0.4457
vn 0.4216 0.8628 0.2790
vn 0.1233 0.9868 0.1053
vn -0.1528 0.9868 -0.0541
vn 0.1838 0.7927 -0.5812
vn -0.0232 0.9623 -0.2710
vn 0.4115 0.7927 -0.4498
vn 0.2463 0.9623 -0.1154
vn 0.1762 0.4589 -0.8709
vn -0.0518 0.6884 -0.7235
vn -0.3080 0.8392 -0.4482
vn -0.4213 0.8629 -0.2791
vn -0.1231 0.9868 -0.1055
vn 0.1529 0.9868 0.0539
vn 0.4524 0.8629 0.2253
vn 0.5422 0.8392 0.0427
vn 0.6525 0.6884 -0.3169
vn 0.6661 0.4589 -0.5880
vn 0.5174 0.4607 -0.7211
vn 0.3658 0.4607 -0.8087
vn -0.4115 -0.7927 0.4498
vn -0.2463 -0.9623 0.1154
vn -0.1838 -0.7927 0.5812
vn 0.0232 -0.9623 0.2710
vn -0.6661 -0.4589 0.5880
vn -0.6525 -0.6884 0.3169
vn -0.5422 -0.8392 -0.0427
vn -0.4524 -0.8629 -0.2253
vn -0.1529 -0.9868 -0.0539
vn 0.1231 -0.9868 0.1055
vn 0.4213 -0.8629 0.2791
vn 0.3080 -0.8392 0.4482
vn 0.0518 -0.6884 0.7235
vn -0.1762 -0.4589 0.8709
vn -0.3658 -0.4607 0.8087
vn -0.5174 -0.4607 0.7211
vn 0.9789 -0.1490 -0.1397
vn 0.9785 0.1446 -0.1473
vn 0.9443 -0.1761 0.2780
vn 0.9457 0.1743 0.2742
vn 0.7922 -0.2584 -0.5528
vn 0.8269 -0.0940 -0.5544
vn 0.8274 0.0937 -0.5537
vn 0.8009 0.3024 -0.5169
vn 0.8762 0.4389 -0.1993
vn 0.8219 0.5344 0.1973
vn 0.7387 0.5500 0.3897
vn 0.8674 0.1790 0.4644
vn 0.8672 -0.1791 0.4646
vn 0.7385 -0.5501 0.3899
vn 0.8197 -0.5364 0.2008
vn 0.8756 -0.4414 -0.1964
vn -0.9789 -0.1490 0.1397
vn -0.9785 0.1446 0.1473
vn -0.9443 -0.1761 -0.2780
vn -0.9457 0.1743 -0.2742
vn -0.7922 -0.2584 0.5528
vn -0.8269 -0.0940 0.5544
vn -0.8274 0.0937 0.5537
vn -0.8009 0.3024 0.5169
vn -0.8762 0.4389 0.1993
vn -0.8219 0.5344 -0.1973
vn -0.7387 0.5500 -0.3897
vn -0.8674 0.1790 -0.4644
vn -0.8672 -0.1791 -0.4646
vn -0.7385 -0.5501 -0.3899
vn -0.8197 -0.5364 -0.2008
vn -0.8756 -0.4414 0.1964
usemtl Material.002
s off
f 702/688/587 706/689/587 705/690/587
f 704/691/588 706/689/588 703/692/588
f 706/689/589 708/693/589 705/690/589
f 706/689/590 710/694/590 709/695/590
f 574/696/591 702/688/591 596/697/591
f 597/698/592 703/692/592 702/688/592
f 599/699/593 703/692/593 598/700/593
f 575/701/594 704/691/594 599/699/594
f 600/702/595 707/703/595 704/691/595
f 707/703/596 602/704/596 710/694/596
f 710/694/597 577/705/597 603/706/597
f 709/695/598 603/706/598 604/707/598
f 709/695/599 605/708/599 708/693/599
f 708/693/600 576/709/600 594/710/600
f 705/690/601 594/710/601 595/711/601
f 596/697/602 705/690/602 595/711/602
f 712/712/603 714/713/603 711/714/603
f 712/712/604 716/715/604 715/716/604
f 715/716/605 717/717/605 714/713/605
f 715/716/606 719/718/606 718/719/606
f 701/720/607 632/721/607 582/722/607
f 700/723/608 711/714/608 701/720/608
f 700/723/609 713/724/609 712/712/609
f 699/725/610 609/726/610 713/724/610
f 713/724/611 610/727/611 716/715/611
f 716/715/612 611/728/612 719/718/612
f 719/718/613 581/729/613 612/730/613
f 719/718/614 613/731/614 718/719/614
f 717/717/615 613/731/615 614/732/615
f 717/717/616 580/733/616 630/734/616
f 714/713/617 630/734/617 631/735/617
f 711/714/618 631/735/618 632/721/618
f 720/736/619 724/737/619 723/738/619
f 722/739/620 724/737/620 721/740/620
f 724/737/621 726/741/621 723/738/621
f 724/737/622 728/742/622 727/743/622
f 580/733/623 720/736/623 617/744/623
f 614/732/624 721/740/624 720/736/624
f 612/730/625 721/740/625 613/731/625
f 581/729/626 722/739/626 612/730/626
f 618/745/627 725/746/627 722/739/627
f 725/746/628 620/747/628 728/742/628
f 728/742/629 579/748/629 621/749/629
f 727/743/630 621/749/630 622/750/630
f 727/743/631 623/751/631 726/741/631
f 726/741/632 578/752/632 615/753/632
f 723/738/633 615/753/633 616/754/633
f 617/744/634 723/738/634 616/754/634
f 730/755/635 732/756/635 729/757/635
f 730/755/636 734/758/636 733/759/636
f 733/759/637 735/760/637 732/756/637
f 733/759/638 737/761/638 736/762/638
f 692/763/639 650/764/639 588/765/639
f 691/766/640 729/757/640 692/763/640
f 691/766/641 731/767/641 730/755/641
f 690/768/642 627/769/642 731/767/642
f 731/767/643 628/770/643 734/758/643
f 734/758/644 629/771/644 737/761/644
f 737/761/645 575/772/645 599/773/645
f 737/761/646 598/774/646 736/762/646
f 735/760/647 598/774/647 597/775/647
f 735/760/648 574/776/648 648/777/648
f 732/756/649 648/777/649 649/778/649
f 729/757/650 649/778/650 650/764/650
f 739/779/651 741/780/651 738/781/651
f 740/782/652 742/783/652 739/779/652
f 741/780/653 745/784/653 744/785/653
f 742/783/654 746/786/654 745/784/654
f 632/787/655 681/788/655 582/789/655
f 631/790/656 738/781/656 632/787/656
f 630/791/657 739/779/657 631/790/657
f 580/733/658 740/782/658 630/791/658
f 740/782/659 616/754/659 743/792/659
f 616/754/660 746/786/660 743/792/660
f 746/786/661 578/752/661 626/793/661
f 745/784/662 626/793/662 625/794/662
f 744/785/663 625/794/663 624/795/663
f 683/796/664 624/795/664 590/797/664
f 682/798/665 744/785/665 683/796/665
f 738/781/666 682/798/666 681/788/666
f 748/799/667 750/800/667 747/801/667
f 749/802/668 751/803/668 748/799/668
f 750/800/669 754/804/669 753/805/669
f 751/803/670 755/806/670 754/804/670
f 641/807/671 672/808/671 585/809/671
f 640/810/672 747/801/672 641/807/672
f 639/811/673 748/799/673 640/810/673
f 577/812/674 749/802/674 639/811/674
f 749/802/675 601/813/675 752/814/675
f 601/813/676 755/806/676 752/814/676
f 755/806/677 575/815/677 629/816/677
f 754/804/678 629/816/678 628/817/678
f 753/805/679 628/817/679 627/818/679
f 674/819/680 627/818/680 593/820/680
f 673/821/681 753/805/681 674/819/681
f 747/801/682 673/821/682 672/808/682
f 756/822/683 760/823/683 759/824/683
f 757/825/684 761/826/684 760/823/684
f 760/823/685 762/827/685 759/824/685
f 761/826/686 763/828/686 760/823/686
f 581/729/687 756/822/687 618/745/687
f 611/829/688 757/825/688 756/822/688
f 610/830/689 758/831/689 757/825/689
f 609/832/690 666/833/690 758/831/690
f 758/831/691 667/834/691 761/826/691
f 667/834/692 764/835/692 761/826/692
f 668/836/693 659/837/693 764/835/693
f 764/835/694 658/838/694 763/828/694
f 763/828/695 657/839/695 762/827/695
f 762/827/696 579/748/696 620/747/696
f 619/840/697 762/827/697 620/747/697
f 756/822/698 619/840/698 618/745/698
f 766/841/699 768/842/699 765/843/699
f 767/844/699 769/845/699 766/841/699
f 769/845/700 771/846/700 768/842/700
f 770/847/700 772/848/700 769/845/700
f 647/849/701 666/833/701 587/850/701
f 646/851/701 765/843/701 647/849/701
f 646/851/701 767/844/701 766/841/701
f 586/852/701 767/844/701 645/853/701
f 669/854/699 770/847/699 767/844/699
f 670/855/700 773/856/700 770/847/700
f 671/857/702 662/858/702 773/856/702
f 773/856/702 661/859/702 772/848/702
f 772/848/702 660/860/702 771/846/702
f 771/846/702 591/861/702 668/836/702
f 768/842/700 668/836/700 667/834/700
f 765/843/699 667/834/699 666/833/699
f 775/862/699 777/863/699 774/864/699
f 776/865/699 778/866/699 775/862/699
f 778/866/700 780/867/700 777/863/700
f 778/866/700 782/868/700 781/869/700
f 586/852/701 774/864/701 669/854/701
f 643/870/701 774/864/701 644/871/701
f 642/872/701 775/862/701 643/870/701
f 585/809/701 776/865/701 642/872/701
f 672/808/699 779/873/699 776/865/699
f 673/821/700 782/868/700 779/873/700
f 674/819/702 665/874/702 782/868/702
f 782/868/702 664/875/702 781/869/702
f 781/869/702 663/876/702 780/867/702
f 780/867/702 592/877/702 671/857/702
f 777/863/700 671/857/700 670/855/700
f 774/864/699 670/855/699 669/854/699
f 783/878/703 787/879/703 786/880/703
f 784/881/704 788/882/704 787/879/704
f 787/879/705 789/883/705 786/880/705
f 788/882/706 790/884/706 787/879/706
f 576/885/707 783/878/707 594/886/707
f 608/887/708 784/881/708 783/878/708
f 607/888/709 785/889/709 784/881/709
f 606/890/710 675/891/710 785/889/710
f 785/889/711 676/892/711 788/882/711
f 676/892/712 791/893/712 788/882/712
f 677/894/713 650/895/713 791/893/713
f 791/893/714 649/896/714 790/884/714
f 790/884/715 648/897/715 789/883/715
f 789/883/716 574/898/716 596/899/716
f 595/900/717 789/883/717 596/899/717
f 783/878/718 595/900/718 594/886/718
f 793/901/719 795/902/719 792/903/719
f 794/904/719 796/905/719 793/901/719
f 796/905/720 798/906/720 795/902/720
f 797/907/720 799/908/720 796/905/720
f 584/909/721 792/903/721 675/891/721
f 637/910/721 792/903/721 638/911/721
f 637/910/721 794/904/721 793/901/721
f 583/912/721 794/904/721 636/913/721
f 678/914/719 797/907/719 794/904/719
f 679/915/720 800/916/720 797/907/720
f 680/917/722 653/918/722 800/916/722
f 800/916/722 652/919/722 799/908/722
f 799/908/722 651/920/722 798/906/722
f 798/906/722 588/921/722 677/894/722
f 795/902/720 677/894/720 676/892/720
f 792/903/719 676/892/719 675/891/719
f 802/922/719 804/923/719 801/924/719
f 803/925/719 805/926/719 802/922/719
f 805/926/720 807/927/720 804/923/720
f 805/926/720 809/928/720 808/929/720
f 583/912/721 801/924/721 678/914/721
f 634/930/721 801/924/721 635/931/721
f 633/932/721 802/922/721 634/930/721
f 582/789/721 803/925/721 633/932/721
f 681/788/719 806/933/719 803/925/719
f 682/798/720 809/928/720 806/933/720
f 683/796/722 656/934/722 809/928/722
f 809/928/722 655/935/722 808/929/722
f 808/929/722 654/936/722 807/927/722
f 807/927/722 589/937/722 680/917/722
f 804/923/720 680/917/720 679/915/720
f 801/924/719 679/915/719 678/914/719
f 810/938/723 814/939/723 813/940/723
f 812/941/724 814/939/724 811/942/724
f 813/940/725 817/943/725 816/944/725
f 815/945/726 817/943/726 814/939/726
f 578/752/727 810/938/727 626/946/727
f 622/750/728 810/938/728 623/751/728
f 622/750/729 812/941/729 811/942/729
f 579/748/730 812/941/730 621/749/730
f 657/947/731 815/945/731 812/941/731
f 658/948/732 818/949/732 815/945/732
f 659/950/733 684/951/733 818/949/733
f 818/949/734 685/952/734 817/943/734
f 816/944/735 685/952/735 686/953/735
f 624/954/736 686/953/736 590/955/736
f 625/956/737 816/944/737 624/954/737
f 626/946/738 813/940/738 625/956/738
f 820/957/739 822/958/739 819/959/739
f 821/960/740 823/961/740 820/957/740
f 823/961/739 825/962/739 822/958/739
f 824/963/740 826/964/740 823/961/740
f 686/953/741 656/965/741 590/955/741
f 685/952/739 819/959/739 686/953/739
f 684/951/740 820/957/740 685/952/740
f 591/966/742 821/960/742 684/951/742
f 660/967/742 824/963/742 821/960/742
f 824/963/742 662/968/742 827/969/742
f 662/968/742 687/970/742 827/969/742
f 827/969/740 688/971/740 826/964/740
f 826/964/739 689/972/739 825/962/739
f 825/962/741 589/973/741 654/974/741
f 822/958/741 654/974/741 655/975/741
f 819/959/741 655/975/741 656/965/741
f 829/976/739 831/977/739 828/978/739
f 830/979/740 832/980/740 829/976/740
f 832/980/739 834/981/739 831/977/739
f 833/982/740 835/983/740 832/980/740
f 689/972/741 653/984/741 589/973/741
f 688/971/739 828/978/739 689/972/739
f 687/970/740 829/976/740 688/971/740
f 592/985/742 830/979/742 687/970/742
f 663/986/742 833/982/742 830/979/742
f 664/987/742 836/988/742 833/982/742
f 665/989/742 690/768/742 836/988/742
f 836/988/740 691/766/740 835/983/740
f 835/983/739 692/763/739 834/981/739
f 834/981/741 588/765/741 651/990/741
f 831/977/741 651/990/741 652/991/741
f 653/984/741 831/977/741 652/991/741
f 837/992/743 841/993/743 840/994/743
f 839/995/744 841/993/744 838/996/744
f 840/994/745 844/997/745 843/998/745
f 842/999/746 844/997/746 841/993/746
f 576/709/747 837/992/747 608/1000/747
f 604/707/748 837/992/748 605/708/748
f 604/707/749 839/995/749 838/996/749
f 577/705/750 839/995/750 603/706/750
f 639/1001/751 842/999/751 839/995/751
f 640/1002/752 845/1003/752 842/999/752
f 641/1004/753 693/1005/753 845/1003/753
f 845/1003/754 694/1006/754 844/997/754
f 843/998/755 694/1006/755 695/1007/755
f 606/1008/756 695/1007/756 584/1009/756
f 607/1010/757 843/998/757 606/1008/757
f 608/1000/758 840/994/758 607/1010/758
f 847/1011/759 849/1012/759 846/1013/759
f 848/1014/760 850/1015/760 847/1011/760
f 850/1015/759 852/1016/759 849/1012/759
f 851/1017/760 853/1018/760 850/1015/760
f 695/1007/761 638/1019/761 584/1009/761
f 694/1006/759 846/1013/759 695/1007/759
f 693/1005/760 847/1011/760 694/1006/760
f 585/1020/762 848/1014/762 693/1005/762
f 848/1014/762 643/1021/762 851/1017/762
f 643/1021/762 854/1022/762 851/1017/762
f 644/1023/762 696/1024/762 854/1022/762
f 854/1022/760 697/1025/760 853/1018/760
f 853/1018/759 698/1026/759 852/1016/759
f 852/1016/761 583/1027/761 636/1028/761
f 637/1029/761 852/1016/761 636/1028/761
f 846/1013/761 637/1029/761 638/1019/761
f 856/1030/759 858/1031/759 855/1032/759
f 857/1033/760 859/1034/760 856/1030/760
f 859/1034/759 861/1035/759 858/1031/759
f 860/1036/760 862/1037/760 859/1034/760
f 583/1027/761 855/1032/761 635/1038/761
f 697/1025/759 855/1032/759 698/1026/759
f 696/1024/760 856/1030/760 697/1025/760
f 586/1039/762 857/1033/762 696/1024/762
f 857/1033/762 646/1040/762 860/1036/762
f 646/1040/762 863/1041/762 860/1036/762
f 647/1042/762 699/725/762 863/1041/762
f 863/1041/760 700/723/760 862/1037/760
f 862/1037/759 701/720/759 861/1035/759
f 633/1043/761 701/720/761 582/722/761
f 858/1031/761 633/1043/761 634/1044/761
f 855/1032/761 634/1044/761 635/1038/761
f 702/688/763 703/692/763 706/689/763
f 704/691/764 707/703/764 706/689/764
f 706/689/765 709/695/765 708/693/765
f 706/689/766 707/703/766 710/694/766
f 574/696/767 597/698/767 702/688/767
f 597/698/768 598/700/768 703/692/768
f 599/699/769 704/691/769 703/692/769
f 575/701/770 600/702/770 704/691/770
f 600/702/771 601/1045/771 707/703/771
f 707/703/772 601/1045/772 602/704/772
f 710/694/773 602/704/773 577/705/773
f 709/695/774 710/694/774 603/706/774
f 709/695/775 604/707/775 605/708/775
f 708/693/776 605/708/776 576/709/776
f 705/690/777 708/693/777 594/710/777
f 596/697/778 702/688/778 705/690/778
f 712/712/779 715/716/779 714/713/779
f 712/712/780 713/724/780 716/715/780
f 715/716/781 718/719/781 717/717/781
f 715/716/782 716/715/782 719/718/782
f 701/720/783 711/714/783 632/721/783
f 700/723/784 712/712/784 711/714/784
f 700/723/785 699/725/785 713/724/785
f 699/725/786 587/1046/786 609/726/786
f 713/724/787 609/726/787 610/727/787
f 716/715/788 610/727/788 611/728/788
f 719/718/789 611/728/789 581/729/789
f 719/718/790 612/730/790 613/731/790
f 717/717/791 718/719/791 613/731/791
f 717/717/792 614/732/792 580/733/792
f 714/713/793 717/717/793 630/734/793
f 711/714/794 714/713/794 631/735/794
f 720/736/795 721/740/795 724/737/795
f 722/739/796 725/746/796 724/737/796
f 724/737/797 727/743/797 726/741/797
f 724/737/798 725/746/798 728/742/798
f 580/733/799 614/732/799 720/736/799
f 614/732/800 613/731/800 721/740/800
f 612/730/801 722/739/801 721/740/801
f 581/729/802 618/745/802 722/739/802
f 618/745/803 619/840/803 725/746/803
f 725/746/804 619/840/804 620/747/804
f 728/742/805 620/747/805 579/748/805
f 727/743/806 728/742/806 621/749/806
f 727/743/807 622/750/807 623/751/807
f 726/741/808 623/751/808 578/752/808
f 723/738/809 726/741/809 615/753/809
f 617/744/810 720/736/810 723/738/810
f 730/755/811 733/759/811 732/756/811
f 730/755/812 731/767/812 734/758/812
f 733/759/813 736/762/813 735/760/813
f 733/759/814 734/758/814 737/761/814
f 692/763/815 729/757/815 650/764/815
f 691/766/816 730/755/816 729/757/816
f 691/766/817 690/768/817 731/767/817
f 690/768/818 593/1047/818 627/769/818
f 731/767/819 627/769/819 628/770/819
f 734/758/820 628/770/820 629/771/820
f 737/761/821 629/771/821 575/772/821
f 737/761/822 599/773/822 598/774/822
f 735/760/823 736/762/823 598/774/823
f 735/760/824 597/775/824 574/776/824
f 732/756/825 735/760/825 648/777/825
f 729/757/826 732/756/826 649/778/826
f 739/779/827 742/783/827 741/780/827
f 740/782/828 743/792/828 742/783/828
f 741/780/829 742/783/829 745/784/829
f 742/783/830 743/792/830 746/786/830
f 632/787/831 738/781/831 681/788/831
f 631/790/832 739/779/832 738/781/832
f 630/791/833 740/782/833 739/779/833
f 580/733/834 617/744/834 740/782/834
f 740/782/835 617/744/835 616/754/835
f 616/754/836 615/753/836 746/786/836
f 746/786/837 615/753/837 578/752/837
f 745/784/838 746/786/838 626/793/838
f 744/785/839 745/784/839 625/794/839
f 683/796/840 744/785/840 624/795/840
f 682/798/841 741/780/841 744/785/841
f 738/781/842 741/780/842 682/798/842
f 748/799/843 751/803/843 750/800/843
f 749/802/844 752/814/844 751/803/844
f 750/800/845 751/803/845 754/804/845
f 751/803/846 752/814/846 755/806/846
f 641/807/847 747/801/847 672/808/847
f 640/810/848 748/799/848 747/801/848
f 639/811/849 749/802/849 748/799/849
f 577/812/850 602/1048/850 749/802/850
f 749/802/851 602/1048/851 601/813/851
f 601/813/852 600/1049/852 755/806/852
f 755/806/853 600/1049/853 575/815/853
f 754/804/854 755/806/854 629/816/854
f 753/805/855 754/804/855 628/817/855
f 674/819/856 753/805/856 627/818/856
f 673/821/857 750/800/857 753/805/857
f 747/801/858 750/800/858 673/821/858
f 756/822/859 757/825/859 760/823/859
f 757/825/860 758/831/860 761/826/860
f 760/823/861 763/828/861 762/827/861
f 761/826/862 764/835/862 763/828/862
f 581/729/863 611/829/863 756/822/863
f 611/829/864 610/830/864 757/825/864
f 610/830/865 609/832/865 758/831/865
f 609/832/866 587/850/866 666/833/866
f 758/831/867 666/833/867 667/834/867
f 667/834/868 668/836/868 764/835/868
f 668/836/869 591/861/869 659/837/869
f 764/835/870 659/837/870 658/838/870
f 763/828/871 658/838/871 657/839/871
f 762/827/872 657/839/872 579/748/872
f 619/840/873 759/824/873 762/827/873
f 756/822/874 759/824/874 619/840/874
f 766/841/699 769/845/699 768/842/699
f 767/844/699 770/847/699 769/845/699
f 769/845/700 772/848/700 771/846/700
f 770/847/700 773/856/700 772/848/700
f 647/849/701 765/843/701 666/833/701
f 646/851/701 766/841/701 765/843/701
f 646/851/701 645/853/701 767/844/701
f 586/852/701 669/854/701 767/844/701
f 669/854/699 670/855/699 770/847/699
f 670/855/700 671/857/700 773/856/700
f 671/857/702 592/877/702 662/858/702
f 773/856/702 662/858/702 661/859/702
f 772/848/702 661/859/702 660/860/702
f 771/846/702 660/860/702 591/861/702
f 768/842/700 771/846/700 668/836/700
f 765/843/699 768/842/699 667/834/699
f 775/862/699 778/866/699 777/863/699
f 776/865/699 779/873/699 778/866/699
f 778/866/700 781/869/700 780/867/700
f 778/866/700 779/873/700 782/868/700
f 586/852/701 644/871/701 774/864/701
f 643/870/701 775/862/701 774/864/701
f 642/872/701 776/865/701 775/862/701
f 585/809/701 672/808/701 776/865/701
f 672/808/699 673/821/699 779/873/699
f 673/821/700 674/819/700 782/868/700
f 674/819/702 593/820/702 665/874/702
f 782/868/702 665/874/702 664/875/702
f 781/869/702 664/875/702 663/876/702
f 780/867/702 663/876/702 592/877/702
f 777/863/700 780/867/700 671/857/700
f 774/864/699 777/863/699 670/855/699
f 783/878/875 784/881/875 787/879/875
f 784/881/876 785/889/876 788/882/876
f 787/879/877 790/884/877 789/883/877
f 788/882/878 791/893/878 790/884/878
f 576/885/879 608/887/879 783/878/879
f 608/887/880 607/888/880 784/881/880
f 607/888/881 606/890/881 785/889/881
f 606/890/882 584/909/882 675/891/882
f 785/889/883 675/891/883 676/892/883
f 676/892/884 677/894/884 791/893/884
f 677/894/885 588/921/885 650/895/885
f 791/893/886 650/895/886 649/896/886
f 790/884/887 649/896/887 648/897/887
f 789/883/888 648/897/888 574/898/888
f 595/900/889 786/880/889 789/883/889
f 783/878/890 786/880/890 595/900/890
f 793/901/719 796/905/719 795/902/719
f 794/904/719 797/907/719 796/905/719
f 796/905/720 799/908/720 798/906/720
f 797/907/720 800/916/720 799/908/720
f 584/909/721 638/911/721 792/903/721
f 637/910/721 793/901/721 792/903/721
f 637/910/721 636/913/721 794/904/721
f 583/912/721 678/914/721 794/904/721
f 678/914/719 679/915/719 797/907/719
f 679/915/720 680/917/720 800/916/720
f 680/917/722 589/937/722 653/918/722
f 800/916/722 653/918/722 652/919/722
f 799/908/722 652/919/722 651/920/722
f 798/906/722 651/920/722 588/921/722
f 795/902/720 798/906/720 677/894/720
f 792/903/719 795/902/719 676/892/719
f 802/922/719 805/926/719 804/923/719
f 803/925/719 806/933/719 805/926/719
f 805/926/720 808/929/720 807/927/720
f 805/926/720 806/933/720 809/928/720
f 583/912/721 635/931/721 801/924/721
f 634/930/721 802/922/721 801/924/721
f 633/932/721 803/925/721 802/922/721
f 582/789/721 681/788/721 803/925/721
f 681/788/719 682/798/719 806/933/719
f 682/798/720 683/796/720 809/928/720
f 683/796/722 590/797/722 656/934/722
f 809/928/722 656/934/722 655/935/722
f 808/929/722 655/935/722 654/936/722
f 807/927/722 654/936/722 589/937/722
f 804/923/720 807/927/720 680/917/720
f 801/924/719 804/923/719 679/915/719
f 810/938/891 811/942/891 814/939/891
f 812/941/892 815/945/892 814/939/892
f 813/940/893 814/939/893 817/943/893
f 815/945/894 818/949/894 817/943/894
f 578/752/895 623/751/895 810/938/895
f 622/750/896 811/942/896 810/938/896
f 622/750/897 621/749/897 812/941/897
f 579/748/898 657/947/898 812/941/898
f 657/947/899 658/948/899 815/945/899
f 658/948/900 659/950/900 818/949/900
f 659/950/901 591/966/901 684/951/901
f 818/949/902 684/951/902 685/952/902
f 816/944/903 817/943/903 685/952/903
f 624/954/904 816/944/904 686/953/904
f 625/956/905 813/940/905 816/944/905
f 626/946/906 810/938/906 813/940/906
f 820/957/739 823/961/739 822/958/739
f 821/960/740 824/963/740 823/961/740
f 823/961/739 826/964/739 825/962/739
f 824/963/740 827/969/740 826/964/740
f 686/953/741 819/959/741 656/965/741
f 685/952/739 820/957/739 819/959/739
f 684/951/740 821/960/740 820/957/740
f 591/966/742 660/967/742 821/960/742
f 660/967/742 661/1050/742 824/963/742
f 824/963/742 661/1050/742 662/968/742
f 662/968/742 592/985/742 687/970/742
f 827/969/740 687/970/740 688/971/740
f 826/964/739 688/971/739 689/972/739
f 825/962/741 689/972/741 589/973/741
f 822/958/741 825/962/741 654/974/741
f 819/959/741 822/958/741 655/975/741
f 829/976/739 832/980/739 831/977/739
f 830/979/740 833/982/740 832/980/740
f 832/980/739 835/983/739 834/981/739
f 833/982/740 836/988/740 835/983/740
f 689/972/741 828/978/741 653/984/741
f 688/971/739 829/976/739 828/978/739
f 687/970/740 830/979/740 829/976/740
f 592/985/742 663/986/742 830/979/742
f 663/986/742 664/987/742 833/982/742
f 664/987/742 665/989/742 836/988/742
f 665/989/742 593/1047/742 690/768/742
f 836/988/740 690/768/740 691/766/740
f 835/983/739 691/766/739 692/763/739
f 834/981/741 692/763/741 588/765/741
f 831/977/741 834/981/741 651/990/741
f 653/984/741 828/978/741 831/977/741
f 837/992/907 838/996/907 841/993/907
f 839/995/908 842/999/908 841/993/908
f 840/994/909 841/993/909 844/997/909
f 842/999/910 845/1003/910 844/997/910
f 576/709/911 605/708/911 837/992/911
f 604/707/912 838/996/912 837/992/912
f 604/707/913 603/706/913 839/995/913
f 577/705/914 639/1001/914 839/995/914
f 639/1001/915 640/1002/915 842/999/915
f 640/1002/916 641/1004/916 845/1003/916
f 641/1004/917 585/1020/917 693/1005/917
f 845/1003/918 693/1005/918 694/1006/918
f 843/998/919 844/997/919 694/1006/919
f 606/1008/920 843/998/920 695/1007/920
f 607/1010/921 840/994/921 843/998/921
f 608/1000/922 837/992/922 840/994/922
f 847/1011/759 850/1015/759 849/1012/759
f 848/1014/760 851/1017/760 850/1015/760
f 850/1015/759 853/1018/759 852/1016/759
f 851/1017/760 854/1022/760 853/1018/760
f 695/1007/761 846/1013/761 638/1019/761
f 694/1006/759 847/1011/759 846/1013/759
f 693/1005/760 848/1014/760 847/1011/760
f 585/1020/762 642/1051/762 848/1014/762
f 848/1014/762 642/1051/762 643/1021/762
f 643/1021/762 644/1023/762 854/1022/762
f 644/1023/762 586/1039/762 696/1024/762
f 854/1022/760 696/1024/760 697/1025/760
f 853/1018/759 697/1025/759 698/1026/759
f 852/1016/761 698/1026/761 583/1027/761
f 637/1029/761 849/1012/761 852/1016/761
f 846/1013/761 849/1012/761 637/1029/761
f 856/1030/759 859/1034/759 858/1031/759
f 857/1033/760 860/1036/760 859/1034/760
f 859/1034/759 862/1037/759 861/1035/759
f 860/1036/760 863/1041/760 862/1037/760
f 583/1027/761 698/1026/761 855/1032/761
f 697/1025/759 856/1030/759 855/1032/759
f 696/1024/760 857/1033/760 856/1030/760
f 586/1039/762 645/1052/762 857/1033/762
f 857/1033/762 645/1052/762 646/1040/762
f 646/1040/762 647/1042/762 863/1041/762
f 647/1042/762 587/1046/762 699/725/762
f 863/1041/760 699/725/760 700/723/760
f 862/1037/759 700/723/759 701/720/759
f 633/1043/761 861/1035/761 701/720/761
f 858/1031/761 861/1035/761 633/1043/761
f 855/1032/761 858/1031/761 634/1044/761
o Cube.003_Cube.004
v 0.296396 -0.099813 0.394917
v 0.296396 -0.011160 0.394917
v 0.214882 -0.099813 0.441978
v 0.214882 -0.011160 0.441978
v 0.197941 -0.099813 0.224389
v 0.197941 -0.011160 0.224389
v 0.116428 -0.099813 0.271451
v 0.116428 -0.011160 0.271451
v 0.123939 -0.114588 0.315836
v 0.152069 -0.114588 0.364558
v 0.180199 -0.114588 0.413280
v 0.180199 0.003615 0.413280
v 0.152069 0.003615 0.364558
v 0.123939 0.003615 0.315836
v 0.288884 -0.114588 0.350531
v 0.260754 -0.114588 0.301809
v 0.232624 -0.114588 0.253087
v 0.232624 0.003615 0.253087
v 0.260754 0.003615 0.301809
v 0.288884 0.003615 0.350531
v 0.232484 -0.106278 0.434185
v 0.257180 -0.109526 0.421117
v 0.280846 -0.106278 0.406263
v 0.303366 -0.081785 0.393261
v 0.306868 -0.055486 0.392429
v 0.303366 -0.029188 0.393261
v 0.280846 -0.004694 0.406263
v 0.257180 -0.001447 0.421117
v 0.232484 -0.004694 0.434185
v 0.209963 -0.029188 0.447188
v 0.207492 -0.055486 0.449804
v 0.209963 -0.081785 0.447188
v 0.187411 -0.114272 0.425100
v 0.195701 -0.112057 0.434757
v 0.205675 -0.106595 0.440432
v 0.117309 0.003299 0.303680
v 0.113092 0.001084 0.291673
v 0.113163 -0.004378 0.280198
v 0.109457 -0.029188 0.273106
v 0.105955 -0.055486 0.273938
v 0.109457 -0.081785 0.273106
v 0.180340 -0.106278 0.232182
v 0.155643 -0.109526 0.245251
v 0.131978 -0.106278 0.260104
v 0.131978 -0.004694 0.260104
v 0.155643 -0.001447 0.245251
v 0.180340 -0.004694 0.232182
v 0.202861 -0.029188 0.219180
v 0.205332 -0.055486 0.216563
v 0.202861 -0.081785 0.219180
v 0.225412 -0.114272 0.241267
v 0.217122 -0.112057 0.231611
v 0.207149 -0.106595 0.225935
v 0.295514 0.003299 0.362687
v 0.299732 0.001084 0.374695
v 0.299660 -0.004378 0.386170
v 0.113163 -0.106595 0.280198
v 0.113092 -0.112057 0.291673
v 0.117309 -0.114272 0.303680
v 0.130972 -0.114588 0.328017
v 0.138004 -0.114588 0.340197
v 0.145037 -0.114588 0.352378
v 0.159102 -0.114588 0.376739
v 0.166134 -0.114588 0.388919
v 0.173167 -0.114588 0.401100
v 0.205675 -0.004378 0.440432
v 0.195701 0.001084 0.434757
v 0.187411 0.003299 0.425100
v 0.173167 0.003615 0.401100
v 0.166134 0.003615 0.388919
v 0.159102 0.003615 0.376739
v 0.145037 0.003615 0.352378
v 0.138004 0.003615 0.340197
v 0.130972 0.003615 0.328017
v 0.299660 -0.106595 0.386170
v 0.299732 -0.112057 0.374695
v 0.295514 -0.114272 0.362687
v 0.281851 -0.114588 0.338351
v 0.274819 -0.114588 0.326170
v 0.267787 -0.114588 0.313990
v 0.253722 -0.114588 0.289628
v 0.246689 -0.114588 0.277448
v 0.239657 -0.114588 0.265267
v 0.207149 -0.004378 0.225935
v 0.217122 0.001084 0.231611
v 0.225412 0.003299 0.241267
v 0.239657 0.003615 0.265267
v 0.246689 0.003615 0.277448
v 0.253722 0.003615 0.289628
v 0.267787 0.003615 0.313990
v 0.274819 0.003615 0.326170
v 0.281851 0.003615 0.338351
v 0.148563 0.020238 0.301619
v 0.178282 0.025778 0.284461
v 0.208000 0.020238 0.267303
v 0.176693 0.020238 0.350342
v 0.206412 0.025778 0.333184
v 0.236130 0.020238 0.316026
v 0.204823 0.020238 0.399064
v 0.234541 0.025778 0.381906
v 0.264260 0.020238 0.364748
v 0.204823 -0.131210 0.399064
v 0.234541 -0.136751 0.381906
v 0.264260 -0.131210 0.364748
v 0.176693 -0.131210 0.350342
v 0.206412 -0.136751 0.333184
v 0.236130 -0.131210 0.316026
v 0.148563 -0.131210 0.301619
v 0.178282 -0.136751 0.284461
v 0.208000 -0.131210 0.267303
v 0.247908 -0.023165 0.244263
v 0.253003 -0.055486 0.241321
v 0.247908 -0.087808 0.244263
v 0.276038 -0.023165 0.292985
v 0.281132 -0.055486 0.290044
v 0.276038 -0.087808 0.292985
v 0.304168 -0.023165 0.341707
v 0.309262 -0.055486 0.338766
v 0.304168 -0.087808 0.341707
v 0.164915 -0.023165 0.422105
v 0.159821 -0.055486 0.425046
v 0.164915 -0.087808 0.422105
v 0.136785 -0.023165 0.373382
v 0.131691 -0.055486 0.376324
v 0.136785 -0.087808 0.373382
v 0.108656 -0.023165 0.324660
v 0.103561 -0.055486 0.327602
v 0.108656 -0.087808 0.324660
v 0.284631 -0.083514 0.409147
v 0.286758 -0.055486 0.409635
v 0.284631 -0.027458 0.409147
v 0.259603 -0.085020 0.425313
v 0.260414 -0.055486 0.426718
v 0.259603 -0.025953 0.425313
v 0.233088 -0.083514 0.438905
v 0.232447 -0.055486 0.440991
v 0.233088 -0.027458 0.438905
v 0.102093 -0.087633 0.312443
v 0.097014 -0.055486 0.315354
v 0.102093 -0.023339 0.312443
v 0.098353 -0.086414 0.300007
v 0.093379 -0.055486 0.302703
v 0.098353 -0.024559 0.300007
v 0.100081 -0.083689 0.287095
v 0.095501 -0.055486 0.289166
v 0.100081 -0.027284 0.287095
v 0.128193 -0.083514 0.257220
v 0.126065 -0.055486 0.256732
v 0.128193 -0.027458 0.257220
v 0.153221 -0.085020 0.241054
v 0.152409 -0.055486 0.239649
v 0.153221 -0.025953 0.241054
v 0.179735 -0.083514 0.227462
v 0.180376 -0.055486 0.225376
v 0.179735 -0.027458 0.227462
v 0.310730 -0.087633 0.353924
v 0.315809 -0.055486 0.351014
v 0.310730 -0.023339 0.353924
v 0.314470 -0.086414 0.366361
v 0.319444 -0.055486 0.363664
v 0.314470 -0.024559 0.366361
v 0.312742 -0.083689 0.379272
v 0.317322 -0.055486 0.377202
v 0.312742 -0.027284 0.379272
v 0.141793 -0.130809 0.289523
v 0.136594 -0.128004 0.277928
v 0.133941 -0.120514 0.267546
v 0.171342 -0.136324 0.272441
v 0.164955 -0.133331 0.261379
v 0.159624 -0.125225 0.252144
v 0.200909 -0.130809 0.255392
v 0.193468 -0.128004 0.245092
v 0.185803 -0.120514 0.237603
v 0.211914 0.019837 0.410976
v 0.219355 0.017031 0.421275
v 0.227020 0.009541 0.428764
v 0.241482 0.025351 0.393927
v 0.247868 0.022358 0.404989
v 0.253200 0.014252 0.414223
v 0.271030 0.019837 0.376845
v 0.276229 0.017031 0.388439
v 0.278883 0.009541 0.398821
v 0.133941 0.009541 0.267546
v 0.136594 0.017031 0.277928
v 0.141793 0.019837 0.289523
v 0.159624 0.014252 0.252144
v 0.164955 0.022358 0.261379
v 0.171342 0.025351 0.272441
v 0.185803 0.009541 0.237603
v 0.193468 0.017031 0.245092
v 0.200909 0.019837 0.255392
v 0.155596 0.020238 0.313800
v 0.162628 0.020238 0.325981
v 0.169661 0.020238 0.338161
v 0.185314 0.025778 0.296642
v 0.192347 0.025778 0.308823
v 0.199379 0.025778 0.321003
v 0.215033 0.020238 0.279484
v 0.222065 0.020238 0.291665
v 0.229098 0.020238 0.303845
v 0.183726 0.020238 0.362522
v 0.190758 0.020238 0.374703
v 0.197790 0.020238 0.386883
v 0.213444 0.025778 0.345364
v 0.220477 0.025778 0.357545
v 0.227509 0.025778 0.369725
v 0.243163 0.020238 0.328206
v 0.250195 0.020238 0.340387
v 0.257227 0.020238 0.352567
v 0.227020 -0.120514 0.428764
v 0.219355 -0.128004 0.421275
v 0.211914 -0.130809 0.410976
v 0.253200 -0.125225 0.414223
v 0.247868 -0.133331 0.404989
v 0.241482 -0.136324 0.393927
v 0.278883 -0.120514 0.398821
v 0.276229 -0.128004 0.388439
v 0.271030 -0.130809 0.376845
v 0.197790 -0.131210 0.386883
v 0.190758 -0.131210 0.374703
v 0.183726 -0.131210 0.362522
v 0.227509 -0.136751 0.369725
v 0.220477 -0.136751 0.357545
v 0.213444 -0.136751 0.345364
v 0.257227 -0.131210 0.352567
v 0.250195 -0.131210 0.340387
v 0.243163 -0.131210 0.328206
v 0.169661 -0.131210 0.338161
v 0.162628 -0.131210 0.325981
v 0.155596 -0.131210 0.313800
v 0.199379 -0.136751 0.321003
v 0.192347 -0.136751 0.308823
v 0.185314 -0.136751 0.296642
v 0.229098 -0.131210 0.303845
v 0.222065 -0.131210 0.291665
v 0.215033 -0.131210 0.279484
v 0.219663 -0.083689 0.218054
v 0.223746 -0.055486 0.215123
v 0.219663 -0.027284 0.218054
v 0.231709 -0.086414 0.223014
v 0.236531 -0.055486 0.220054
v 0.231709 -0.024559 0.223014
v 0.240609 -0.087633 0.232471
v 0.245669 -0.055486 0.229528
v 0.240609 -0.023339 0.232471
v 0.254941 -0.087808 0.256443
v 0.260035 -0.055486 0.253502
v 0.254941 -0.023165 0.256443
v 0.261973 -0.087808 0.268624
v 0.267068 -0.055486 0.265682
v 0.261973 -0.023165 0.268624
v 0.269005 -0.087808 0.280804
v 0.274100 -0.055486 0.277863
v 0.269005 -0.023165 0.280804
v 0.283070 -0.087808 0.305165
v 0.288165 -0.055486 0.302224
v 0.283070 -0.023165 0.305165
v 0.290103 -0.087808 0.317346
v 0.295197 -0.055486 0.314405
v 0.290103 -0.023165 0.317346
v 0.297135 -0.087808 0.329527
v 0.302230 -0.055486 0.326585
v 0.297135 -0.023165 0.329527
v 0.193160 -0.083689 0.448313
v 0.189077 -0.055486 0.451244
v 0.193160 -0.027284 0.448313
v 0.181114 -0.086414 0.443354
v 0.176292 -0.055486 0.446313
v 0.181114 -0.024559 0.443354
v 0.172214 -0.087633 0.433896
v 0.167154 -0.055486 0.436840
v 0.172214 -0.023339 0.433896
v 0.157883 -0.087808 0.409924
v 0.152788 -0.055486 0.412865
v 0.157883 -0.023165 0.409924
v 0.150850 -0.087808 0.397743
v 0.145756 -0.055486 0.400685
v 0.150850 -0.023165 0.397743
v 0.143818 -0.087808 0.385563
v 0.138723 -0.055486 0.388504
v 0.143818 -0.023165 0.385563
v 0.129753 -0.087808 0.361202
v 0.124658 -0.055486 0.364143
v 0.129753 -0.023165 0.361202
v 0.122721 -0.087808 0.349021
v 0.117626 -0.055486 0.351963
v 0.122721 -0.023165 0.349021
v 0.115688 -0.087808 0.336841
v 0.110593 -0.055486 0.339782
v 0.115688 -0.023165 0.336841
vt 0.437500 0.062276
vt 0.500000 0.122396
vt 0.437500 0.123210
vt 0.562500 0.062276
vt 0.500000 0.062174
vt 0.437500 0.180725
vt 0.562500 0.180725
vt 0.500000 0.178060
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.437500 0.000000
vt 0.562500 0.000000
vt 0.500000 0.000000
vt 0.625000 0.000000
vt 0.625000 0.062500
vt 0.562500 0.123210
vt 0.625000 0.187500
vt 0.625000 0.250000
vt 0.562500 0.228516
vt 0.500000 0.223958
vt 0.437500 0.228516
vt 0.375000 0.250000
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.453451
vt 0.437500 0.470540
vt 0.437500 0.453349
vt 0.562500 0.470540
vt 0.500000 0.471354
vt 0.437663 0.491150
vt 0.562337 0.491150
vt 0.500000 0.493815
vt 0.437500 0.437500
vt 0.375000 0.453125
vt 0.375000 0.437500
vt 0.500000 0.437500
vt 0.562500 0.453349
vt 0.562500 0.437500
vt 0.625000 0.453125
vt 0.625000 0.468750
vt 0.625000 0.484375
vt 0.625000 0.500000
vt 0.560628 0.521525
vt 0.500000 0.526042
vt 0.439372 0.521525
vt 0.375000 0.500000
vt 0.375000 0.484375
vt 0.375000 0.468750
vt 0.444214 0.569214
vt 0.500000 0.625000
vt 0.446615 0.625000
vt 0.555786 0.569214
vt 0.500000 0.571615
vt 0.444214 0.680786
vt 0.555786 0.680786
vt 0.500000 0.678385
vt 0.396525 0.564372
vt 0.603475 0.564372
vt 0.553385 0.625000
vt 0.603475 0.685628
vt 0.625000 0.750000
vt 0.560628 0.728475
vt 0.500000 0.723958
vt 0.439372 0.728475
vt 0.375000 0.750000
vt 0.396525 0.685628
vt 0.401042 0.625000
vt 0.500000 0.953125
vt 0.437500 0.968750
vt 0.437500 0.953125
vt 0.562500 0.968750
vt 0.500000 0.968750
vt 0.437500 0.984375
vt 0.562500 0.984375
vt 0.500000 0.984375
vt 0.437500 0.937500
vt 0.375000 0.953125
vt 0.375000 0.937500
vt 0.500000 0.937500
vt 0.562500 0.953125
vt 0.562500 0.937500
vt 0.625000 0.953125
vt 0.625000 0.968750
vt 0.625000 0.984375
vt 0.625000 1.000000
vt 0.562500 1.000000
vt 0.500000 1.000000
vt 0.437500 1.000000
vt 0.375000 1.000000
vt 0.375000 0.984375
vt 0.375000 0.968750
vt 0.345540 0.562500
vt 0.328451 0.625000
vt 0.328349 0.562500
vt 0.366150 0.562663
vt 0.346354 0.625000
vt 0.345540 0.687500
vt 0.328349 0.687500
vt 0.366150 0.687337
vt 0.328125 0.500000
vt 0.312500 0.562500
vt 0.312500 0.500000
vt 0.343750 0.500000
vt 0.359375 0.500000
vt 0.368815 0.625000
vt 0.359375 0.750000
vt 0.343750 0.750000
vt 0.328125 0.750000
vt 0.312500 0.687500
vt 0.312500 0.750000
vt 0.312500 0.625000
vt 0.843750 0.562500
vt 0.828125 0.625000
vt 0.828125 0.562500
vt 0.859375 0.562500
vt 0.843750 0.625000
vt 0.843750 0.687500
vt 0.828125 0.687500
vt 0.859375 0.687500
vt 0.828125 0.500000
vt 0.812500 0.562500
vt 0.812500 0.500000
vt 0.843750 0.500000
vt 0.859375 0.500000
vt 0.875000 0.500000
vt 0.875000 0.625000
vt 0.859375 0.625000
vt 0.875000 0.750000
vt 0.859375 0.750000
vt 0.843750 0.750000
vt 0.828125 0.750000
vt 0.812500 0.687500
vt 0.812500 0.750000
vt 0.812500 0.625000
vt 0.633850 0.562663
vt 0.653646 0.625000
vt 0.631185 0.625000
vt 0.654460 0.562500
vt 0.671550 0.625000
vt 0.633850 0.687337
vt 0.654460 0.687500
vt 0.640625 0.500000
vt 0.656250 0.500000
vt 0.671651 0.562500
vt 0.671875 0.500000
vt 0.687500 0.562500
vt 0.687500 0.625000
vt 0.671651 0.687500
vt 0.687500 0.687500
vt 0.671875 0.750000
vt 0.656250 0.750000
vt 0.640625 0.750000
vt 0.598958 0.625000
vt 0.718750 0.562500
vt 0.703125 0.625000
vt 0.703125 0.562500
vt 0.734375 0.562500
vt 0.718750 0.625000
vt 0.703125 0.687500
vt 0.734375 0.625000
vt 0.718750 0.687500
vt 0.703125 0.500000
vt 0.687500 0.500000
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.734375 0.500000
vt 0.750000 0.562500
vt 0.750000 0.625000
vt 0.734375 0.687500
vt 0.750000 0.687500
vt 0.734375 0.750000
vt 0.718750 0.750000
vt 0.703125 0.750000
vt 0.687500 0.750000
vt 0.781250 0.562500
vt 0.765625 0.625000
vt 0.765625 0.562500
vt 0.796875 0.562500
vt 0.781250 0.625000
vt 0.765625 0.687500
vt 0.796875 0.687500
vt 0.781250 0.687500
vt 0.781250 0.500000
vt 0.765625 0.500000
vt 0.796875 0.500000
vt 0.796875 0.625000
vt 0.796875 0.750000
vt 0.781250 0.750000
vt 0.765625 0.750000
vt 0.750000 0.750000
vt 0.140625 0.562500
vt 0.156250 0.625000
vt 0.140625 0.625000
vt 0.156250 0.562500
vt 0.171875 0.625000
vt 0.140625 0.687500
vt 0.156250 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.140625 0.500000
vt 0.156250 0.500000
vt 0.171875 0.562500
vt 0.171875 0.500000
vt 0.187500 0.562500
vt 0.187500 0.625000
vt 0.171875 0.687500
vt 0.187500 0.687500
vt 0.171875 0.750000
vt 0.156250 0.750000
vt 0.140625 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.218750 0.562500
vt 0.203125 0.625000
vt 0.203125 0.562500
vt 0.234375 0.562500
vt 0.218750 0.625000
vt 0.203125 0.687500
vt 0.234375 0.625000
vt 0.218750 0.687500
vt 0.187500 0.500000
vt 0.218750 0.500000
vt 0.203125 0.500000
vt 0.250000 0.500000
vt 0.234375 0.500000
vt 0.250000 0.562500
vt 0.250000 0.625000
vt 0.234375 0.687500
vt 0.250000 0.687500
vt 0.234375 0.750000
vt 0.218750 0.750000
vt 0.203125 0.750000
vt 0.187500 0.750000
vt 0.281250 0.562500
vt 0.265625 0.625000
vt 0.265625 0.562500
vt 0.296875 0.562500
vt 0.281250 0.625000
vt 0.265625 0.687500
vt 0.296875 0.687500
vt 0.281250 0.687500
vt 0.281250 0.500000
vt 0.265625 0.500000
vt 0.296875 0.500000
vt 0.296875 0.625000
vt 0.296875 0.750000
vt 0.281250 0.750000
vt 0.265625 0.750000
vt 0.250000 0.750000
vt 0.437663 0.758850
vt 0.500000 0.778646
vt 0.437500 0.779460
vt 0.562337 0.758850
vt 0.500000 0.756185
vt 0.500000 0.796549
vt 0.437500 0.796651
vt 0.562500 0.779460
vt 0.375000 0.765625
vt 0.625000 0.765625
vt 0.625000 0.781250
vt 0.562500 0.796651
vt 0.625000 0.796875
vt 0.562500 0.812500
vt 0.500000 0.812500
vt 0.437500 0.812500
vt 0.375000 0.796875
vt 0.375000 0.812500
vt 0.375000 0.781250
vt 0.500000 0.828125
vt 0.437500 0.843750
vt 0.437500 0.828125
vt 0.562500 0.828125
vt 0.500000 0.843750
vt 0.437500 0.859375
vt 0.562500 0.843750
vt 0.500000 0.859375
vt 0.375000 0.828125
vt 0.625000 0.812500
vt 0.625000 0.828125
vt 0.625000 0.859375
vt 0.562500 0.859375
vt 0.562500 0.875000
vt 0.500000 0.875000
vt 0.437500 0.875000
vt 0.375000 0.875000
vt 0.375000 0.859375
vt 0.375000 0.843750
vt 0.500000 0.890625
vt 0.437500 0.906250
vt 0.437500 0.890625
vt 0.562500 0.890625
vt 0.500000 0.906250
vt 0.437500 0.921875
vt 0.562500 0.906250
vt 0.500000 0.921875
vt 0.375000 0.890625
vt 0.625000 0.875000
vt 0.625000 0.890625
vt 0.625000 0.906250
vt 0.562500 0.921875
vt 0.625000 0.921875
vt 0.375000 0.921875
vt 0.375000 0.906250
vt 0.437500 0.258850
vt 0.500000 0.278646
vt 0.437500 0.279460
vt 0.562500 0.258850
vt 0.500000 0.256185
vt 0.500000 0.296549
vt 0.437500 0.296651
vt 0.562500 0.279460
vt 0.375000 0.265625
vt 0.625000 0.265625
vt 0.625000 0.281250
vt 0.562500 0.296651
vt 0.625000 0.296875
vt 0.562500 0.312500
vt 0.500000 0.312500
vt 0.437500 0.312500
vt 0.375000 0.296875
vt 0.375000 0.312500
vt 0.375000 0.281250
vt 0.500000 0.328125
vt 0.437500 0.343750
vt 0.437500 0.328125
vt 0.562500 0.328125
vt 0.500000 0.343750
vt 0.437500 0.359375
vt 0.562500 0.343750
vt 0.500000 0.359375
vt 0.375000 0.328125
vt 0.625000 0.312500
vt 0.625000 0.343750
vt 0.562500 0.359375
vt 0.625000 0.359375
vt 0.562500 0.375000
vt 0.500000 0.375000
vt 0.437500 0.375000
vt 0.375000 0.375000
vt 0.375000 0.359375
vt 0.375000 0.343750
vt 0.500000 0.390625
vt 0.437500 0.406250
vt 0.437500 0.390625
vt 0.562500 0.390625
vt 0.500000 0.406250
vt 0.437500 0.421875
vt 0.562500 0.406250
vt 0.500000 0.421875
vt 0.375000 0.390625
vt 0.625000 0.375000
vt 0.625000 0.406250
vt 0.562500 0.421875
vt 0.625000 0.421875
vt 0.375000 0.421875
vt 0.375000 0.406250
vt 0.625000 0.125000
vt 0.625000 0.437500
vt 0.625000 0.937500
vt 0.875000 0.562500
vt 0.875000 0.687500
vt 0.625000 0.843750
vt 0.625000 0.328125
vt 0.625000 0.390625
vn 0.5441 -0.0548 0.8372
vn 0.5432 0.0558 0.8377
vn 0.4530 -0.0548 0.8898
vn 0.4539 0.0558 0.8893
vn 0.6304 -0.1998 0.7501
vn 0.6488 -0.0625 0.7584
vn 0.6488 0.0624 0.7584
vn 0.6457 0.1815 0.7417
vn 0.5405 0.1936 0.8188
vn 0.4388 0.1936 0.8775
vn 0.3195 0.1815 0.9300
vn 0.3323 0.0624 0.9411
vn 0.3324 -0.0625 0.9411
vn 0.3344 -0.1998 0.9210
vn 0.4388 -0.1936 0.8774
vn 0.5405 -0.1936 0.8188
vn -0.9476 -0.1740 0.2679
vn -0.9461 0.1759 0.2719
vn -0.9766 -0.1430 -0.1609
vn -0.9771 0.1474 -0.1532
vn -0.7391 -0.5500 0.3888
vn -0.8678 -0.1790 0.4636
vn -0.8676 0.1791 0.4638
vn -0.7390 0.5501 0.3891
vn -0.8218 0.5356 0.1946
vn -0.8749 0.4367 -0.2093
vn -0.7874 0.2528 -0.5622
vn -0.8209 0.0919 -0.5636
vn -0.8214 -0.0916 -0.5630
vn -0.7966 -0.2962 -0.5270
vn -0.8755 -0.4342 -0.2122
vn -0.8240 -0.5335 0.1910
vn -0.5441 -0.0548 -0.8372
vn -0.5432 0.0558 -0.8377
vn -0.4530 -0.0548 -0.8898
vn -0.4539 0.0558 -0.8893
vn -0.6304 -0.1998 -0.7501
vn -0.6488 -0.0625 -0.7584
vn -0.6488 0.0624 -0.7584
vn -0.6457 0.1815 -0.7417
vn -0.5405 0.1936 -0.8188
vn -0.4388 0.1936 -0.8775
vn -0.3195 0.1815 -0.9300
vn -0.3323 0.0624 -0.9411
vn -0.3324 -0.0625 -0.9411
vn -0.3344 -0.1998 -0.9210
vn -0.4388 -0.1936 -0.8774
vn -0.5405 -0.1936 -0.8188
vn 0.9476 -0.1740 -0.2679
vn 0.9461 0.1759 -0.2719
vn 0.9766 -0.1430 0.1609
vn 0.9771 0.1474 0.1532
vn 0.7391 -0.5500 -0.3889
vn 0.8678 -0.1790 -0.4636
vn 0.8676 0.1791 -0.4638
vn 0.7389 0.5501 -0.3891
vn 0.8218 0.5356 -0.1946
vn 0.8749 0.4367 0.2093
vn 0.7874 0.2528 0.5622
vn 0.8209 0.0919 0.5636
vn 0.8214 -0.0916 0.5630
vn 0.7966 -0.2962 0.5270
vn 0.8755 -0.4342 0.2122
vn 0.8240 -0.5335 -0.1910
vn -0.2491 -0.9609 -0.1208
vn -0.4157 -0.7847 -0.4598
vn 0.0200 -0.9609 -0.2762
vn -0.1904 -0.7847 -0.5899
vn -0.4528 -0.8629 0.2245
vn -0.5447 -0.8378 0.0368
vn -0.6543 -0.6813 -0.3284
vn -0.6646 -0.4497 -0.5967
vn -0.5185 -0.4506 -0.7267
vn -0.3701 -0.4506 -0.8124
vn -0.1845 -0.4497 -0.8739
vn 0.0428 -0.6812 -0.7308
vn 0.3042 -0.8378 -0.4533
vn 0.4209 -0.8629 -0.2799
vn 0.1226 -0.9868 -0.1062
vn -0.1533 -0.9868 0.0531
vn -0.0200 0.9609 0.2762
vn 0.1904 0.7847 0.5899
vn 0.2491 0.9609 0.1208
vn 0.4157 0.7847 0.4598
vn -0.4209 0.8629 0.2799
vn -0.3042 0.8378 0.4533
vn -0.0428 0.6812 0.7308
vn 0.1845 0.4497 0.8739
vn 0.3701 0.4506 0.8124
vn 0.5185 0.4506 0.7267
vn 0.6646 0.4497 0.5967
vn 0.6543 0.6812 0.3284
vn 0.5447 0.8378 -0.0368
vn 0.4528 0.8629 -0.2245
vn 0.1533 0.9868 -0.0531
vn -0.1226 0.9868 0.1062
vn -0.4160 0.7889 -0.4523
vn -0.2486 0.9616 -0.1166
vn -0.1838 0.7889 -0.5864
vn 0.0234 0.9616 -0.2736
vn -0.6423 0.4228 -0.6393
vn -0.6549 0.6823 -0.3249
vn -0.5443 0.8379 0.0413
vn -0.4528 0.8628 0.2248
vn -0.1533 0.9867 0.0534
vn 0.1229 0.9867 -0.1060
vn 0.4211 0.8628 -0.2797
vn 0.3080 0.8379 -0.4507
vn 0.0461 0.6823 -0.7296
vn -0.2325 0.4228 -0.8759
vn -0.3703 0.4513 -0.8119
vn -0.5180 0.4513 -0.7266
vn -0.1380 0.9872 0.0797
vn 0.1380 0.9872 -0.0797
vn -0.4371 0.8633 0.2523
vn 0.4371 0.8633 -0.2523
vn 0.1838 -0.7889 0.5864
vn -0.0234 -0.9616 0.2736
vn 0.4160 -0.7889 0.4523
vn 0.2486 -0.9616 0.1166
vn 0.2325 -0.4228 0.8759
vn -0.0461 -0.6823 0.7296
vn -0.3080 -0.8379 0.4507
vn -0.4211 -0.8628 0.2797
vn -0.1229 -0.9867 0.1060
vn 0.1533 -0.9867 -0.0534
vn 0.4528 -0.8628 -0.2248
vn 0.5443 -0.8379 -0.0413
vn 0.6549 -0.6823 0.3249
vn 0.6423 -0.4228 0.6393
vn 0.5180 -0.4513 0.7266
vn 0.3703 -0.4513 0.8119
vn -0.1380 -0.9872 0.0797
vn 0.1380 -0.9872 -0.0797
vn -0.4371 -0.8633 0.2523
vn 0.4371 -0.8633 -0.2523
vn 0.3490 -0.1430 -0.9262
vn 0.3559 0.1474 -0.9228
vn 0.7058 -0.1740 -0.6867
vn 0.7085 0.1759 -0.6834
vn -0.0581 -0.2962 -0.9533
vn -0.0769 -0.0916 -0.9928
vn -0.0776 0.0919 -0.9927
vn -0.0932 0.2528 -0.9630
vn 0.2562 0.4367 -0.8623
vn 0.5794 0.5356 -0.6143
vn 0.7064 0.5501 -0.4454
vn 0.8355 0.1791 -0.5195
vn 0.8354 -0.1790 -0.5198
vn 0.7063 -0.5500 -0.4457
vn 0.5774 -0.5335 -0.6181
vn 0.2540 -0.4342 -0.8642
vn 0.8520 -0.1791 -0.4919
vn 0.8520 0.1791 -0.4919
vn 0.7231 -0.5503 -0.4175
vn 0.7231 0.5503 -0.4175
vn -0.3490 -0.1430 0.9262
vn -0.3559 0.1474 0.9228
vn -0.7058 -0.1740 0.6867
vn -0.7085 0.1759 0.6834
vn 0.0581 -0.2962 0.9533
vn 0.0769 -0.0916 0.9928
vn 0.0776 0.0919 0.9927
vn 0.0932 0.2528 0.9630
vn -0.2563 0.4367 0.8623
vn -0.5794 0.5356 0.6143
vn -0.7064 0.5501 0.4454
vn -0.8355 0.1791 0.5195
vn -0.8354 -0.1790 0.5198
vn -0.7063 -0.5500 0.4457
vn -0.5774 -0.5335 0.6181
vn -0.2540 -0.4342 0.8643
vn -0.8520 -0.1791 0.4919
vn -0.8520 0.1791 0.4919
vn -0.7231 -0.5503 0.4175
vn -0.7231 0.5503 0.4175
vn 0.5432 -0.0558 0.8377
vn 0.5441 0.0548 0.8372
vn 0.4539 -0.0558 0.8893
vn 0.4530 0.0548 0.8898
vn 0.6457 -0.1815 0.7417
vn 0.6488 -0.0624 0.7584
vn 0.6488 0.0625 0.7584
vn 0.6304 0.1998 0.7501
vn 0.4388 0.1936 0.8774
vn 0.3344 0.1998 0.9210
vn 0.3324 0.0625 0.9411
vn 0.3323 -0.0624 0.9411
vn 0.3195 -0.1815 0.9300
vn 0.4388 -0.1936 0.8775
vn -0.9461 -0.1759 0.2719
vn -0.9476 0.1740 0.2679
vn -0.9771 -0.1474 -0.1532
vn -0.9766 0.1430 -0.1609
vn -0.7390 -0.5501 0.3891
vn -0.8676 -0.1791 0.4638
vn -0.8678 0.1790 0.4636
vn -0.7391 0.5500 0.3888
vn -0.8240 0.5335 0.1910
vn -0.8755 0.4342 -0.2122
vn -0.7966 0.2962 -0.5270
vn -0.8214 0.0916 -0.5630
vn -0.8209 -0.0919 -0.5636
vn -0.7874 -0.2528 -0.5622
vn -0.8749 -0.4367 -0.2093
vn -0.8218 -0.5356 0.1946
vn -0.5432 -0.0558 -0.8377
vn -0.5441 0.0548 -0.8372
vn -0.4539 -0.0558 -0.8893
vn -0.4530 0.0548 -0.8898
vn -0.6457 -0.1815 -0.7417
vn -0.6488 -0.0624 -0.7584
vn -0.6488 0.0625 -0.7584
vn -0.6304 0.1998 -0.7501
vn -0.4388 0.1936 -0.8774
vn -0.3344 0.1998 -0.9210
vn -0.3324 0.0625 -0.9411
vn -0.3323 -0.0624 -0.9411
vn -0.3195 -0.1815 -0.9300
vn -0.4388 -0.1936 -0.8775
vn 0.9461 -0.1759 -0.2719
vn 0.9476 0.1740 -0.2679
vn 0.9771 -0.1474 0.1532
vn 0.9766 0.1430 0.1609
vn 0.7389 -0.5501 -0.3891
vn 0.8676 -0.1791 -0.4638
vn 0.8678 0.1790 -0.4636
vn 0.7391 0.5500 -0.3889
vn 0.8240 0.5335 -0.1910
vn 0.8755 0.4342 0.2122
vn 0.7966 0.2962 0.5270
vn 0.8214 0.0916 0.5630
vn 0.8209 -0.0919 0.5636
vn 0.7874 -0.2528 0.5622
vn 0.8749 -0.4367 0.2093
vn 0.8218 -0.5356 -0.1946
vn -0.2486 -0.9616 -0.1166
vn -0.4160 -0.7889 -0.4523
vn 0.0234 -0.9616 -0.2736
vn -0.1838 -0.7889 -0.5864
vn -0.4528 -0.8628 0.2248
vn -0.5443 -0.8379 0.0413
vn -0.6549 -0.6823 -0.3249
vn -0.6423 -0.4228 -0.6393
vn -0.5180 -0.4513 -0.7266
vn -0.3703 -0.4513 -0.8119
vn -0.2325 -0.4228 -0.8759
vn 0.0461 -0.6823 -0.7296
vn 0.3080 -0.8379 -0.4507
vn 0.4211 -0.8628 -0.2797
vn 0.1229 -0.9867 -0.1060
vn -0.1533 -0.9867 0.0534
vn -0.0234 0.9616 0.2736
vn 0.1838 0.7889 0.5864
vn 0.2486 0.9616 0.1166
vn 0.4160 0.7889 0.4523
vn -0.4211 0.8628 0.2797
vn -0.3080 0.8379 0.4507
vn -0.0461 0.6823 0.7296
vn 0.2325 0.4228 0.8759
vn 0.3703 0.4513 0.8119
vn 0.5180 0.4513 0.7266
vn 0.6423 0.4228 0.6393
vn 0.6549 0.6823 0.3249
vn 0.5443 0.8379 -0.0413
vn 0.4528 0.8628 -0.2248
vn 0.1533 0.9867 -0.0534
vn -0.1229 0.9867 0.1060
vn -0.4157 0.7847 -0.4598
vn -0.2491 0.9609 -0.1208
vn -0.1904 0.7847 -0.5899
vn 0.0200 0.9609 -0.2762
vn -0.6646 0.4497 -0.5967
vn -0.6543 0.6812 -0.3284
vn -0.5447 0.8378 0.0368
vn -0.4528 0.8629 0.2245
vn -0.1533 0.9868 0.0531
vn 0.1227 0.9868 -0.1062
vn 0.4209 0.8629 -0.2799
vn 0.3042 0.8378 -0.4533
vn 0.0428 0.6812 -0.7308
vn -0.1845 0.4497 -0.8739
vn -0.3701 0.4506 -0.8124
vn -0.5185 0.4506 -0.7267
vn 0.1904 -0.7847 0.5899
vn -0.0200 -0.9609 0.2762
vn 0.4157 -0.7847 0.4598
vn 0.2491 -0.9609 0.1208
vn 0.1845 -0.4497 0.8739
vn -0.0428 -0.6812 0.7308
vn -0.3042 -0.8378 0.4533
vn -0.4209 -0.8629 0.2799
vn -0.1226 -0.9868 0.1062
vn 0.1533 -0.9868 -0.0531
vn 0.4528 -0.8629 -0.2245
vn 0.5447 -0.8378 -0.0368
vn 0.6543 -0.6813 0.3284
vn 0.6646 -0.4497 0.5967
vn 0.5185 -0.4506 0.7267
vn 0.3701 -0.4506 0.8124
vn 0.3559 -0.1474 -0.9228
vn 0.3490 0.1430 -0.9262
vn 0.7085 -0.1759 -0.6834
vn 0.7058 0.1740 -0.6867
vn -0.0932 -0.2528 -0.9630
vn -0.0776 -0.0919 -0.9927
vn -0.0769 0.0916 -0.9928
vn -0.0581 0.2962 -0.9533
vn 0.2540 0.4342 -0.8643
vn 0.5774 0.5335 -0.6181
vn 0.7063 0.5500 -0.4457
vn 0.8354 0.1790 -0.5198
vn 0.8355 -0.1791 -0.5195
vn 0.7064 -0.5501 -0.4454
vn 0.5794 -0.5356 -0.6143
vn 0.2562 -0.4367 -0.8623
vn -0.3559 -0.1474 0.9228
vn -0.3490 0.1430 0.9262
vn -0.7085 -0.1759 0.6834
vn -0.7058 0.1740 0.6867
vn 0.0932 -0.2528 0.9630
vn 0.0776 -0.0919 0.9927
vn 0.0769 0.0916 0.9928
vn 0.0581 0.2962 0.9533
vn -0.2540 0.4342 0.8643
vn -0.5774 0.5335 0.6181
vn -0.7063 0.5500 0.4457
vn -0.8354 0.1790 0.5198
vn -0.8355 -0.1791 0.5195
vn -0.7064 -0.5501 0.4454
vn -0.5794 -0.5356 0.6143
vn -0.2562 -0.4367 0.8623
usemtl Material.002
s off
f 992/1053/923 996/1054/923 995/1055/923
f 994/1056/924 996/1054/924 993/1057/924
f 996/1054/925 998/1058/925 995/1055/925
f 996/1054/926 1000/1059/926 999/1060/926
f 864/1061/927 992/1053/927 886/1062/927
f 887/1063/928 993/1057/928 992/1053/928
f 889/1064/929 993/1057/929 888/1065/929
f 865/1066/930 994/1056/930 889/1064/930
f 890/1067/931 997/1068/931 994/1056/931
f 997/1068/932 892/1069/932 1000/1059/932
f 1000/1059/933 867/1070/933 893/1071/933
f 999/1060/934 893/1071/934 894/1072/934
f 999/1060/935 895/1073/935 998/1058/935
f 998/1058/936 866/1074/936 884/1075/936
f 995/1055/937 884/1075/937 885/1076/937
f 886/1062/938 995/1055/938 885/1076/938
f 1002/1077/939 1004/1078/939 1001/1079/939
f 1002/1077/940 1006/1080/940 1005/1081/940
f 1005/1081/941 1007/1082/941 1004/1078/941
f 1005/1081/942 1009/1083/942 1008/1084/942
f 991/1085/943 922/1086/943 872/1087/943
f 990/1088/944 1001/1079/944 991/1085/944
f 990/1088/945 1003/1089/945 1002/1077/945
f 989/1090/946 899/1091/946 1003/1089/946
f 1003/1089/947 900/1092/947 1006/1080/947
f 1006/1080/948 901/1093/948 1009/1083/948
f 1009/1083/949 871/1094/949 902/1095/949
f 1009/1083/950 903/1096/950 1008/1084/950
f 1007/1082/951 903/1096/951 904/1097/951
f 1007/1082/952 870/1098/952 920/1099/952
f 1004/1078/953 920/1099/953 921/1100/953
f 1001/1079/954 921/1100/954 922/1086/954
f 1010/1101/955 1014/1102/955 1013/1103/955
f 1012/1104/956 1014/1102/956 1011/1105/956
f 1014/1102/957 1016/1106/957 1013/1103/957
f 1014/1102/958 1018/1107/958 1017/1108/958
f 870/1098/959 1010/1101/959 907/1109/959
f 904/1097/960 1011/1105/960 1010/1101/960
f 902/1095/961 1011/1105/961 903/1096/961
f 871/1094/962 1012/1104/962 902/1095/962
f 908/1110/963 1015/1111/963 1012/1104/963
f 1015/1111/964 910/1112/964 1018/1107/964
f 1018/1107/965 869/1113/965 911/1114/965
f 1017/1108/966 911/1114/966 912/1115/966
f 1017/1108/967 913/1116/967 1016/1106/967
f 1016/1106/968 868/1117/968 905/1118/968
f 1013/1103/969 905/1118/969 906/1119/969
f 907/1109/970 1013/1103/970 906/1119/970
f 1020/1120/971 1022/1121/971 1019/1122/971
f 1020/1120/972 1024/1123/972 1023/1124/972
f 1023/1124/973 1025/1125/973 1022/1121/973
f 1023/1124/974 1027/1126/974 1026/1127/974
f 982/1128/975 940/1129/975 878/1130/975
f 981/1131/976 1019/1122/976 982/1128/976
f 981/1131/977 1021/1132/977 1020/1120/977
f 980/1133/978 917/1134/978 1021/1132/978
f 1021/1132/979 918/1135/979 1024/1123/979
f 1024/1123/980 919/1136/980 1027/1126/980
f 1027/1126/981 865/1137/981 889/1138/981
f 1027/1126/982 888/1139/982 1026/1127/982
f 1025/1125/983 888/1139/983 887/1140/983
f 1025/1125/984 864/1141/984 938/1142/984
f 1022/1121/985 938/1142/985 939/1143/985
f 1019/1122/986 939/1143/986 940/1129/986
f 1029/1144/987 1031/1145/987 1028/1146/987
f 1030/1147/988 1032/1148/988 1029/1144/988
f 1031/1145/989 1035/1149/989 1034/1150/989
f 1032/1148/990 1036/1151/990 1035/1149/990
f 922/1152/991 971/1153/991 872/1154/991
f 921/1155/992 1028/1146/992 922/1152/992
f 920/1156/993 1029/1144/993 921/1155/993
f 870/1098/994 1030/1147/994 920/1156/994
f 1030/1147/995 906/1119/995 1033/1157/995
f 906/1119/996 1036/1151/996 1033/1157/996
f 1036/1151/997 868/1117/997 916/1158/997
f 1035/1149/998 916/1158/998 915/1159/998
f 1034/1150/999 915/1159/999 914/1160/999
f 973/1161/1000 914/1160/1000 880/1162/1000
f 972/1163/1001 1034/1150/1001 973/1161/1001
f 1028/1146/1002 972/1163/1002 971/1153/1002
f 1038/1164/1003 1040/1165/1003 1037/1166/1003
f 1039/1167/1004 1041/1168/1004 1038/1164/1004
f 1040/1165/1005 1044/1169/1005 1043/1170/1005
f 1041/1168/1006 1045/1171/1006 1044/1169/1006
f 931/1172/1007 962/1173/1007 875/1174/1007
f 930/1175/1008 1037/1166/1008 931/1172/1008
f 929/1176/1009 1038/1164/1009 930/1175/1009
f 867/1177/1010 1039/1167/1010 929/1176/1010
f 1039/1167/1011 891/1178/1011 1042/1179/1011
f 891/1178/1012 1045/1171/1012 1042/1179/1012
f 1045/1171/1013 865/1180/1013 919/1181/1013
f 1044/1169/1014 919/1181/1014 918/1182/1014
f 1043/1170/1015 918/1182/1015 917/1183/1015
f 964/1184/1016 917/1183/1016 883/1185/1016
f 963/1186/1017 1043/1170/1017 964/1184/1017
f 1037/1166/1018 963/1186/1018 962/1173/1018
f 1046/1187/1019 1050/1188/1019 1049/1189/1019
f 1047/1190/1020 1051/1191/1020 1050/1188/1020
f 1050/1188/1021 1052/1192/1021 1049/1189/1021
f 1051/1191/1022 1053/1193/1022 1050/1188/1022
f 871/1094/1023 1046/1187/1023 908/1110/1023
f 901/1194/1024 1047/1190/1024 1046/1187/1024
f 900/1195/1025 1048/1196/1025 1047/1190/1025
f 899/1197/1026 956/1198/1026 1048/1196/1026
f 1048/1196/1027 957/1199/1027 1051/1191/1027
f 957/1199/1028 1054/1200/1028 1051/1191/1028
f 958/1201/1029 949/1202/1029 1054/1200/1029
f 1054/1200/1030 948/1203/1030 1053/1193/1030
f 1053/1193/1031 947/1204/1031 1052/1192/1031
f 1052/1192/1032 869/1113/1032 910/1112/1032
f 909/1205/1033 1052/1192/1033 910/1112/1033
f 1046/1187/1034 909/1205/1034 908/1110/1034
f 1056/1206/1035 1058/1207/1035 1055/1208/1035
f 1057/1209/1035 1059/1210/1035 1056/1206/1035
f 1059/1210/1036 1061/1211/1036 1058/1207/1036
f 1060/1212/1036 1062/1213/1036 1059/1210/1036
f 937/1214/1037 956/1198/1037 877/1215/1037
f 936/1216/1037 1055/1208/1037 937/1214/1037
f 936/1216/1037 1057/1209/1037 1056/1206/1037
f 876/1217/1037 1057/1209/1037 935/1218/1037
f 959/1219/1035 1060/1212/1035 1057/1209/1035
f 960/1220/1036 1063/1221/1036 1060/1212/1036
f 961/1222/1038 952/1223/1038 1063/1221/1038
f 1063/1221/1038 951/1224/1038 1062/1213/1038
f 1062/1213/1038 950/1225/1038 1061/1211/1038
f 1061/1211/1038 881/1226/1038 958/1201/1038
f 1058/1207/1036 958/1201/1036 957/1199/1036
f 1055/1208/1035 957/1199/1035 956/1198/1035
f 1065/1227/1035 1067/1228/1035 1064/1229/1035
f 1066/1230/1035 1068/1231/1035 1065/1227/1035
f 1068/1231/1036 1070/1232/1036 1067/1228/1036
f 1068/1231/1036 1072/1233/1036 1071/1234/1036
f 876/1217/1037 1064/1229/1037 959/1219/1037
f 933/1235/1037 1064/1229/1037 934/1236/1037
f 932/1237/1037 1065/1227/1037 933/1235/1037
f 875/1174/1037 1066/1230/1037 932/1237/1037
f 962/1173/1035 1069/1238/1035 1066/1230/1035
f 963/1186/1036 1072/1233/1036 1069/1238/1036
f 964/1184/1038 955/1239/1038 1072/1233/1038
f 1072/1233/1038 954/1240/1038 1071/1234/1038
f 1071/1234/1038 953/1241/1038 1070/1232/1038
f 1070/1232/1038 882/1242/1038 961/1222/1038
f 1067/1228/1036 961/1222/1036 960/1220/1036
f 1064/1229/1035 960/1220/1035 959/1219/1035
f 1073/1243/1039 1077/1244/1039 1076/1245/1039
f 1074/1246/1040 1078/1247/1040 1077/1244/1040
f 1077/1244/1041 1079/1248/1041 1076/1245/1041
f 1078/1247/1042 1080/1249/1042 1077/1244/1042
f 866/1250/1043 1073/1243/1043 884/1251/1043
f 898/1252/1044 1074/1246/1044 1073/1243/1044
f 897/1253/1045 1075/1254/1045 1074/1246/1045
f 896/1255/1046 965/1256/1046 1075/1254/1046
f 1075/1254/1047 966/1257/1047 1078/1247/1047
f 966/1257/1048 1081/1258/1048 1078/1247/1048
f 967/1259/1049 940/1260/1049 1081/1258/1049
f 1081/1258/1050 939/1261/1050 1080/1249/1050
f 1080/1249/1051 938/1262/1051 1079/1248/1051
f 1079/1248/1052 864/1263/1052 886/1264/1052
f 885/1265/1053 1079/1248/1053 886/1264/1053
f 1073/1243/1054 885/1265/1054 884/1251/1054
f 1083/1266/1055 1085/1267/1055 1082/1268/1055
f 1084/1269/1055 1086/1270/1055 1083/1266/1055
f 1086/1270/1056 1088/1271/1056 1085/1267/1056
f 1087/1272/1056 1089/1273/1056 1086/1270/1056
f 874/1274/1057 1082/1268/1057 965/1256/1057
f 927/1275/1057 1082/1268/1057 928/1276/1057
f 927/1275/1057 1084/1269/1057 1083/1266/1057
f 873/1277/1057 1084/1269/1057 926/1278/1057
f 968/1279/1055 1087/1272/1055 1084/1269/1055
f 969/1280/1056 1090/1281/1056 1087/1272/1056
f 970/1282/1058 943/1283/1058 1090/1281/1058
f 1090/1281/1058 942/1284/1058 1089/1273/1058
f 1089/1273/1058 941/1285/1058 1088/1271/1058
f 1088/1271/1058 878/1286/1058 967/1259/1058
f 1085/1267/1056 967/1259/1056 966/1257/1056
f 1082/1268/1055 966/1257/1055 965/1256/1055
f 1092/1287/1055 1094/1288/1055 1091/1289/1055
f 1093/1290/1055 1095/1291/1055 1092/1287/1055
f 1095/1291/1056 1097/1292/1056 1094/1288/1056
f 1095/1291/1056 1099/1293/1056 1098/1294/1056
f 873/1277/1057 1091/1289/1057 968/1279/1057
f 924/1295/1057 1091/1289/1057 925/1296/1057
f 923/1297/1057 1092/1287/1057 924/1295/1057
f 872/1154/1057 1093/1290/1057 923/1297/1057
f 971/1153/1055 1096/1298/1055 1093/1290/1055
f 972/1163/1056 1099/1293/1056 1096/1298/1056
f 973/1161/1058 946/1299/1058 1099/1293/1058
f 1099/1293/1058 945/1300/1058 1098/1294/1058
f 1098/1294/1058 944/1301/1058 1097/1292/1058
f 1097/1292/1058 879/1302/1058 970/1282/1058
f 1094/1288/1056 970/1282/1056 969/1280/1056
f 1091/1289/1055 969/1280/1055 968/1279/1055
f 1100/1303/1059 1104/1304/1059 1103/1305/1059
f 1102/1306/1060 1104/1304/1060 1101/1307/1060
f 1103/1305/1061 1107/1308/1061 1106/1309/1061
f 1105/1310/1062 1107/1308/1062 1104/1304/1062
f 868/1117/1063 1100/1303/1063 916/1311/1063
f 912/1115/1064 1100/1303/1064 913/1116/1064
f 912/1115/1065 1102/1306/1065 1101/1307/1065
f 869/1113/1066 1102/1306/1066 911/1114/1066
f 947/1312/1067 1105/1310/1067 1102/1306/1067
f 948/1313/1068 1108/1314/1068 1105/1310/1068
f 949/1315/1069 974/1316/1069 1108/1314/1069
f 1108/1314/1070 975/1317/1070 1107/1308/1070
f 1106/1309/1071 975/1317/1071 976/1318/1071
f 914/1319/1072 976/1318/1072 880/1320/1072
f 915/1321/1073 1106/1309/1073 914/1319/1073
f 916/1311/1074 1103/1305/1074 915/1321/1074
f 1110/1322/1075 1112/1323/1075 1109/1324/1075
f 1111/1325/1076 1113/1326/1076 1110/1322/1076
f 1113/1326/1075 1115/1327/1075 1112/1323/1075
f 1114/1328/1076 1116/1329/1076 1113/1326/1076
f 976/1318/1077 946/1330/1077 880/1320/1077
f 975/1317/1075 1109/1324/1075 976/1318/1075
f 974/1316/1076 1110/1322/1076 975/1317/1076
f 881/1331/1078 1111/1325/1078 974/1316/1078
f 950/1332/1078 1114/1328/1078 1111/1325/1078
f 1114/1328/1078 952/1333/1078 1117/1334/1078
f 952/1333/1078 977/1335/1078 1117/1334/1078
f 1117/1334/1076 978/1336/1076 1116/1329/1076
f 1116/1329/1075 979/1337/1075 1115/1327/1075
f 1115/1327/1077 879/1338/1077 944/1339/1077
f 1112/1323/1077 944/1339/1077 945/1340/1077
f 1109/1324/1077 945/1340/1077 946/1330/1077
f 1119/1341/1075 1121/1342/1075 1118/1343/1075
f 1120/1344/1076 1122/1345/1076 1119/1341/1076
f 1122/1345/1075 1124/1346/1075 1121/1342/1075
f 1123/1347/1076 1125/1348/1076 1122/1345/1076
f 979/1337/1077 943/1349/1077 879/1338/1077
f 978/1336/1075 1118/1343/1075 979/1337/1075
f 977/1335/1076 1119/1341/1076 978/1336/1076
f 882/1350/1078 1120/1344/1078 977/1335/1078
f 953/1351/1078 1123/1347/1078 1120/1344/1078
f 954/1352/1078 1126/1353/1078 1123/1347/1078
f 955/1354/1078 980/1133/1078 1126/1353/1078
f 1126/1353/1076 981/1131/1076 1125/1348/1076
f 1125/1348/1075 982/1128/1075 1124/1346/1075
f 1124/1346/1077 878/1130/1077 941/1355/1077
f 1121/1342/1077 941/1355/1077 942/1356/1077
f 943/1349/1077 1121/1342/1077 942/1356/1077
f 1127/1357/1079 1131/1358/1079 1130/1359/1079
f 1129/1360/1080 1131/1358/1080 1128/1361/1080
f 1130/1359/1081 1134/1362/1081 1133/1363/1081
f 1132/1364/1082 1134/1362/1082 1131/1358/1082
f 866/1074/1083 1127/1357/1083 898/1365/1083
f 894/1072/1084 1127/1357/1084 895/1073/1084
f 894/1072/1085 1129/1360/1085 1128/1361/1085
f 867/1070/1086 1129/1360/1086 893/1071/1086
f 929/1366/1087 1132/1364/1087 1129/1360/1087
f 930/1367/1088 1135/1368/1088 1132/1364/1088
f 931/1369/1089 983/1370/1089 1135/1368/1089
f 1135/1368/1090 984/1371/1090 1134/1362/1090
f 1133/1363/1091 984/1371/1091 985/1372/1091
f 896/1373/1092 985/1372/1092 874/1374/1092
f 897/1375/1093 1133/1363/1093 896/1373/1093
f 898/1365/1094 1130/1359/1094 897/1375/1094
f 1137/1376/1095 1139/1377/1095 1136/1378/1095
f 1138/1379/1096 1140/1380/1096 1137/1376/1096
f 1140/1380/1095 1142/1381/1095 1139/1377/1095
f 1141/1382/1096 1143/1383/1096 1140/1380/1096
f 985/1372/1097 928/1384/1097 874/1374/1097
f 984/1371/1095 1136/1378/1095 985/1372/1095
f 983/1370/1096 1137/1376/1096 984/1371/1096
f 875/1385/1098 1138/1379/1098 983/1370/1098
f 1138/1379/1098 933/1386/1098 1141/1382/1098
f 933/1386/1098 1144/1387/1098 1141/1382/1098
f 934/1388/1098 986/1389/1098 1144/1387/1098
f 1144/1387/1096 987/1390/1096 1143/1383/1096
f 1143/1383/1095 988/1391/1095 1142/1381/1095
f 1142/1381/1097 873/1392/1097 926/1393/1097
f 927/1394/1097 1142/1381/1097 926/1393/1097
f 1136/1378/1097 927/1394/1097 928/1384/1097
f 1146/1395/1095 1148/1396/1095 1145/1397/1095
f 1147/1398/1096 1149/1399/1096 1146/1395/1096
f 1149/1399/1095 1151/1400/1095 1148/1396/1095
f 1150/1401/1096 1152/1402/1096 1149/1399/1096
f 873/1392/1097 1145/1397/1097 925/1403/1097
f 987/1390/1095 1145/1397/1095 988/1391/1095
f 986/1389/1096 1146/1395/1096 987/1390/1096
f 876/1404/1098 1147/1398/1098 986/1389/1098
f 1147/1398/1098 936/1405/1098 1150/1401/1098
f 936/1405/1098 1153/1406/1098 1150/1401/1098
f 937/1407/1098 989/1090/1098 1153/1406/1098
f 1153/1406/1096 990/1088/1096 1152/1402/1096
f 1152/1402/1095 991/1085/1095 1151/1400/1095
f 923/1408/1097 991/1085/1097 872/1087/1097
f 1148/1396/1097 923/1408/1097 924/1409/1097
f 1145/1397/1097 924/1409/1097 925/1403/1097
f 992/1053/1099 993/1057/1099 996/1054/1099
f 994/1056/1100 997/1068/1100 996/1054/1100
f 996/1054/1101 999/1060/1101 998/1058/1101
f 996/1054/1102 997/1068/1102 1000/1059/1102
f 864/1061/1103 887/1063/1103 992/1053/1103
f 887/1063/1104 888/1065/1104 993/1057/1104
f 889/1064/1105 994/1056/1105 993/1057/1105
f 865/1066/1106 890/1067/1106 994/1056/1106
f 890/1067/931 891/1410/931 997/1068/931
f 997/1068/1107 891/1410/1107 892/1069/1107
f 1000/1059/1108 892/1069/1108 867/1070/1108
f 999/1060/1109 1000/1059/1109 893/1071/1109
f 999/1060/1110 894/1072/1110 895/1073/1110
f 998/1058/1111 895/1073/1111 866/1074/1111
f 995/1055/1112 998/1058/1112 884/1075/1112
f 886/1062/938 992/1053/938 995/1055/938
f 1002/1077/1113 1005/1081/1113 1004/1078/1113
f 1002/1077/1114 1003/1089/1114 1006/1080/1114
f 1005/1081/1115 1008/1084/1115 1007/1082/1115
f 1005/1081/1116 1006/1080/1116 1009/1083/1116
f 991/1085/1117 1001/1079/1117 922/1086/1117
f 990/1088/1118 1002/1077/1118 1001/1079/1118
f 990/1088/1119 989/1090/1119 1003/1089/1119
f 989/1090/1120 877/1411/1120 899/1091/1120
f 1003/1089/1121 899/1091/1121 900/1092/1121
f 1006/1080/1122 900/1092/1122 901/1093/1122
f 1009/1083/1123 901/1093/1123 871/1094/1123
f 1009/1083/1124 902/1095/1124 903/1096/1124
f 1007/1082/1125 1008/1084/1125 903/1096/1125
f 1007/1082/1126 904/1097/1126 870/1098/1126
f 1004/1078/1127 1007/1082/1127 920/1099/1127
f 1001/1079/1128 1004/1078/1128 921/1100/1128
f 1010/1101/1129 1011/1105/1129 1014/1102/1129
f 1012/1104/1130 1015/1111/1130 1014/1102/1130
f 1014/1102/1131 1017/1108/1131 1016/1106/1131
f 1014/1102/1132 1015/1111/1132 1018/1107/1132
f 870/1098/1133 904/1097/1133 1010/1101/1133
f 904/1097/1134 903/1096/1134 1011/1105/1134
f 902/1095/1135 1012/1104/1135 1011/1105/1135
f 871/1094/1136 908/1110/1136 1012/1104/1136
f 908/1110/963 909/1205/963 1015/1111/963
f 1015/1111/1137 909/1205/1137 910/1112/1137
f 1018/1107/1138 910/1112/1138 869/1113/1138
f 1017/1108/1139 1018/1107/1139 911/1114/1139
f 1017/1108/1140 912/1115/1140 913/1116/1140
f 1016/1106/1141 913/1116/1141 868/1117/1141
f 1013/1103/1142 1016/1106/1142 905/1118/1142
f 907/1109/970 1010/1101/970 1013/1103/970
f 1020/1120/1143 1023/1124/1143 1022/1121/1143
f 1020/1120/1144 1021/1132/1144 1024/1123/1144
f 1023/1124/1145 1026/1127/1145 1025/1125/1145
f 1023/1124/1146 1024/1123/1146 1027/1126/1146
f 982/1128/1147 1019/1122/1147 940/1129/1147
f 981/1131/1148 1020/1120/1148 1019/1122/1148
f 981/1131/1149 980/1133/1149 1021/1132/1149
f 980/1133/1150 883/1412/1150 917/1134/1150
f 1021/1132/1151 917/1134/1151 918/1135/1151
f 1024/1123/1152 918/1135/1152 919/1136/1152
f 1027/1126/1153 919/1136/1153 865/1137/1153
f 1027/1126/1154 889/1138/1154 888/1139/1154
f 1025/1125/1155 1026/1127/1155 888/1139/1155
f 1025/1125/1156 887/1140/1156 864/1141/1156
f 1022/1121/1157 1025/1125/1157 938/1142/1157
f 1019/1122/1158 1022/1121/1158 939/1143/1158
f 1029/1144/1159 1032/1148/1159 1031/1145/1159
f 1030/1147/1160 1033/1157/1160 1032/1148/1160
f 1031/1145/1161 1032/1148/1161 1035/1149/1161
f 1032/1148/1162 1033/1157/1162 1036/1151/1162
f 922/1152/1163 1028/1146/1163 971/1153/1163
f 921/1155/1164 1029/1144/1164 1028/1146/1164
f 920/1156/1165 1030/1147/1165 1029/1144/1165
f 870/1098/1166 907/1109/1166 1030/1147/1166
f 1030/1147/1167 907/1109/1167 906/1119/1167
f 906/1119/1168 905/1118/1168 1036/1151/1168
f 1036/1151/1169 905/1118/1169 868/1117/1169
f 1035/1149/1170 1036/1151/1170 916/1158/1170
f 1034/1150/1171 1035/1149/1171 915/1159/1171
f 973/1161/1172 1034/1150/1172 914/1160/1172
f 972/1163/1173 1031/1145/1173 1034/1150/1173
f 1028/1146/1174 1031/1145/1174 972/1163/1174
f 1038/1164/1175 1041/1168/1175 1040/1165/1175
f 1039/1167/1176 1042/1179/1176 1041/1168/1176
f 1040/1165/1177 1041/1168/1177 1044/1169/1177
f 1041/1168/1178 1042/1179/1178 1045/1171/1178
f 931/1172/1179 1037/1166/1179 962/1173/1179
f 930/1175/1180 1038/1164/1180 1037/1166/1180
f 929/1176/1181 1039/1167/1181 1038/1164/1181
f 867/1177/1182 892/1413/1182 1039/1167/1182
f 1039/1167/1183 892/1413/1183 891/1178/1183
f 891/1178/1184 890/1414/1184 1045/1171/1184
f 1045/1171/1185 890/1414/1185 865/1180/1185
f 1044/1169/1186 1045/1171/1186 919/1181/1186
f 1043/1170/1187 1044/1169/1187 918/1182/1187
f 964/1184/1188 1043/1170/1188 917/1183/1188
f 963/1186/1189 1040/1165/1189 1043/1170/1189
f 1037/1166/1190 1040/1165/1190 963/1186/1190
f 1046/1187/1191 1047/1190/1191 1050/1188/1191
f 1047/1190/1192 1048/1196/1192 1051/1191/1192
f 1050/1188/1193 1053/1193/1193 1052/1192/1193
f 1051/1191/1194 1054/1200/1194 1053/1193/1194
f 871/1094/1195 901/1194/1195 1046/1187/1195
f 901/1194/1196 900/1195/1196 1047/1190/1196
f 900/1195/1197 899/1197/1197 1048/1196/1197
f 899/1197/1198 877/1215/1198 956/1198/1198
f 1048/1196/1199 956/1198/1199 957/1199/1199
f 957/1199/1200 958/1201/1200 1054/1200/1200
f 958/1201/1201 881/1226/1201 949/1202/1201
f 1054/1200/1202 949/1202/1202 948/1203/1202
f 1053/1193/1203 948/1203/1203 947/1204/1203
f 1052/1192/1204 947/1204/1204 869/1113/1204
f 909/1205/1205 1049/1189/1205 1052/1192/1205
f 1046/1187/1206 1049/1189/1206 909/1205/1206
f 1056/1206/1035 1059/1210/1035 1058/1207/1035
f 1057/1209/1035 1060/1212/1035 1059/1210/1035
f 1059/1210/1036 1062/1213/1036 1061/1211/1036
f 1060/1212/1036 1063/1221/1036 1062/1213/1036
f 937/1214/1037 1055/1208/1037 956/1198/1037
f 936/1216/1037 1056/1206/1037 1055/1208/1037
f 936/1216/1037 935/1218/1037 1057/1209/1037
f 876/1217/1037 959/1219/1037 1057/1209/1037
f 959/1219/1035 960/1220/1035 1060/1212/1035
f 960/1220/1036 961/1222/1036 1063/1221/1036
f 961/1222/1038 882/1242/1038 952/1223/1038
f 1063/1221/1038 952/1223/1038 951/1224/1038
f 1062/1213/1038 951/1224/1038 950/1225/1038
f 1061/1211/1038 950/1225/1038 881/1226/1038
f 1058/1207/1036 1061/1211/1036 958/1201/1036
f 1055/1208/1035 1058/1207/1035 957/1199/1035
f 1065/1227/1035 1068/1231/1035 1067/1228/1035
f 1066/1230/1035 1069/1238/1035 1068/1231/1035
f 1068/1231/1036 1071/1234/1036 1070/1232/1036
f 1068/1231/1036 1069/1238/1036 1072/1233/1036
f 876/1217/1037 934/1236/1037 1064/1229/1037
f 933/1235/1037 1065/1227/1037 1064/1229/1037
f 932/1237/1037 1066/1230/1037 1065/1227/1037
f 875/1174/1037 962/1173/1037 1066/1230/1037
f 962/1173/1035 963/1186/1035 1069/1238/1035
f 963/1186/1036 964/1184/1036 1072/1233/1036
f 964/1184/1038 883/1185/1038 955/1239/1038
f 1072/1233/1038 955/1239/1038 954/1240/1038
f 1071/1234/1038 954/1240/1038 953/1241/1038
f 1070/1232/1038 953/1241/1038 882/1242/1038
f 1067/1228/1036 1070/1232/1036 961/1222/1036
f 1064/1229/1035 1067/1228/1035 960/1220/1035
f 1073/1243/1207 1074/1246/1207 1077/1244/1207
f 1074/1246/1208 1075/1254/1208 1078/1247/1208
f 1077/1244/1209 1080/1249/1209 1079/1248/1209
f 1078/1247/1210 1081/1258/1210 1080/1249/1210
f 866/1250/1211 898/1252/1211 1073/1243/1211
f 898/1252/1212 897/1253/1212 1074/1246/1212
f 897/1253/1213 896/1255/1213 1075/1254/1213
f 896/1255/1214 874/1274/1214 965/1256/1214
f 1075/1254/1215 965/1256/1215 966/1257/1215
f 966/1257/1216 967/1259/1216 1081/1258/1216
f 967/1259/1217 878/1286/1217 940/1260/1217
f 1081/1258/1218 940/1260/1218 939/1261/1218
f 1080/1249/1219 939/1261/1219 938/1262/1219
f 1079/1248/1220 938/1262/1220 864/1263/1220
f 885/1265/1221 1076/1245/1221 1079/1248/1221
f 1073/1243/1222 1076/1245/1222 885/1265/1222
f 1083/1266/1055 1086/1270/1055 1085/1267/1055
f 1084/1269/1055 1087/1272/1055 1086/1270/1055
f 1086/1270/1056 1089/1273/1056 1088/1271/1056
f 1087/1272/1056 1090/1281/1056 1089/1273/1056
f 874/1274/1057 928/1276/1057 1082/1268/1057
f 927/1275/1057 1083/1266/1057 1082/1268/1057
f 927/1275/1057 926/1278/1057 1084/1269/1057
f 873/1277/1057 968/1279/1057 1084/1269/1057
f 968/1279/1055 969/1280/1055 1087/1272/1055
f 969/1280/1056 970/1282/1056 1090/1281/1056
f 970/1282/1058 879/1302/1058 943/1283/1058
f 1090/1281/1058 943/1283/1058 942/1284/1058
f 1089/1273/1058 942/1284/1058 941/1285/1058
f 1088/1271/1058 941/1285/1058 878/1286/1058
f 1085/1267/1056 1088/1271/1056 967/1259/1056
f 1082/1268/1055 1085/1267/1055 966/1257/1055
f 1092/1287/1055 1095/1291/1055 1094/1288/1055
f 1093/1290/1055 1096/1298/1055 1095/1291/1055
f 1095/1291/1056 1098/1294/1056 1097/1292/1056
f 1095/1291/1056 1096/1298/1056 1099/1293/1056
f 873/1277/1057 925/1296/1057 1091/1289/1057
f 924/1295/1057 1092/1287/1057 1091/1289/1057
f 923/1297/1057 1093/1290/1057 1092/1287/1057
f 872/1154/1057 971/1153/1057 1093/1290/1057
f 971/1153/1055 972/1163/1055 1096/1298/1055
f 972/1163/1056 973/1161/1056 1099/1293/1056
f 973/1161/1058 880/1162/1058 946/1299/1058
f 1099/1293/1058 946/1299/1058 945/1300/1058
f 1098/1294/1058 945/1300/1058 944/1301/1058
f 1097/1292/1058 944/1301/1058 879/1302/1058
f 1094/1288/1056 1097/1292/1056 970/1282/1056
f 1091/1289/1055 1094/1288/1055 969/1280/1055
f 1100/1303/1223 1101/1307/1223 1104/1304/1223
f 1102/1306/1224 1105/1310/1224 1104/1304/1224
f 1103/1305/1225 1104/1304/1225 1107/1308/1225
f 1105/1310/1226 1108/1314/1226 1107/1308/1226
f 868/1117/1227 913/1116/1227 1100/1303/1227
f 912/1115/1228 1101/1307/1228 1100/1303/1228
f 912/1115/1229 911/1114/1229 1102/1306/1229
f 869/1113/1230 947/1312/1230 1102/1306/1230
f 947/1312/1231 948/1313/1231 1105/1310/1231
f 948/1313/1232 949/1315/1232 1108/1314/1232
f 949/1315/1233 881/1331/1233 974/1316/1233
f 1108/1314/1234 974/1316/1234 975/1317/1234
f 1106/1309/1235 1107/1308/1235 975/1317/1235
f 914/1319/1236 1106/1309/1236 976/1318/1236
f 915/1321/1237 1103/1305/1237 1106/1309/1237
f 916/1311/1238 1100/1303/1238 1103/1305/1238
f 1110/1322/1075 1113/1326/1075 1112/1323/1075
f 1111/1325/1076 1114/1328/1076 1113/1326/1076
f 1113/1326/1075 1116/1329/1075 1115/1327/1075
f 1114/1328/1076 1117/1334/1076 1116/1329/1076
f 976/1318/1077 1109/1324/1077 946/1330/1077
f 975/1317/1075 1110/1322/1075 1109/1324/1075
f 974/1316/1076 1111/1325/1076 1110/1322/1076
f 881/1331/1078 950/1332/1078 1111/1325/1078
f 950/1332/1078 951/1415/1078 1114/1328/1078
f 1114/1328/1078 951/1415/1078 952/1333/1078
f 952/1333/1078 882/1350/1078 977/1335/1078
f 1117/1334/1076 977/1335/1076 978/1336/1076
f 1116/1329/1075 978/1336/1075 979/1337/1075
f 1115/1327/1077 979/1337/1077 879/1338/1077
f 1112/1323/1077 1115/1327/1077 944/1339/1077
f 1109/1324/1077 1112/1323/1077 945/1340/1077
f 1119/1341/1075 1122/1345/1075 1121/1342/1075
f 1120/1344/1076 1123/1347/1076 1122/1345/1076
f 1122/1345/1075 1125/1348/1075 1124/1346/1075
f 1123/1347/1076 1126/1353/1076 1125/1348/1076
f 979/1337/1077 1118/1343/1077 943/1349/1077
f 978/1336/1075 1119/1341/1075 1118/1343/1075
f 977/1335/1076 1120/1344/1076 1119/1341/1076
f 882/1350/1078 953/1351/1078 1120/1344/1078
f 953/1351/1078 954/1352/1078 1123/1347/1078
f 954/1352/1078 955/1354/1078 1126/1353/1078
f 955/1354/1078 883/1412/1078 980/1133/1078
f 1126/1353/1076 980/1133/1076 981/1131/1076
f 1125/1348/1075 981/1131/1075 982/1128/1075
f 1124/1346/1077 982/1128/1077 878/1130/1077
f 1121/1342/1077 1124/1346/1077 941/1355/1077
f 943/1349/1077 1118/1343/1077 1121/1342/1077
f 1127/1357/1239 1128/1361/1239 1131/1358/1239
f 1129/1360/1240 1132/1364/1240 1131/1358/1240
f 1130/1359/1241 1131/1358/1241 1134/1362/1241
f 1132/1364/1242 1135/1368/1242 1134/1362/1242
f 866/1074/1243 895/1073/1243 1127/1357/1243
f 894/1072/1244 1128/1361/1244 1127/1357/1244
f 894/1072/1245 893/1071/1245 1129/1360/1245
f 867/1070/1246 929/1366/1246 1129/1360/1246
f 929/1366/1247 930/1367/1247 1132/1364/1247
f 930/1367/1248 931/1369/1248 1135/1368/1248
f 931/1369/1249 875/1385/1249 983/1370/1249
f 1135/1368/1250 983/1370/1250 984/1371/1250
f 1133/1363/1251 1134/1362/1251 984/1371/1251
f 896/1373/1252 1133/1363/1252 985/1372/1252
f 897/1375/1253 1130/1359/1253 1133/1363/1253
f 898/1365/1254 1127/1357/1254 1130/1359/1254
f 1137/1376/1095 1140/1380/1095 1139/1377/1095
f 1138/1379/1096 1141/1382/1096 1140/1380/1096
f 1140/1380/1095 1143/1383/1095 1142/1381/1095
f 1141/1382/1096 1144/1387/1096 1143/1383/1096
f 985/1372/1097 1136/1378/1097 928/1384/1097
f 984/1371/1095 1137/1376/1095 1136/1378/1095
f 983/1370/1096 1138/1379/1096 1137/1376/1096
f 875/1385/1098 932/1416/1098 1138/1379/1098
f 1138/1379/1098 932/1416/1098 933/1386/1098
f 933/1386/1098 934/1388/1098 1144/1387/1098
f 934/1388/1098 876/1404/1098 986/1389/1098
f 1144/1387/1096 986/1389/1096 987/1390/1096
f 1143/1383/1095 987/1390/1095 988/1391/1095
f 1142/1381/1097 988/1391/1097 873/1392/1097
f 927/1394/1097 1139/1377/1097 1142/1381/1097
f 1136/1378/1097 1139/1377/1097 927/1394/1097
f 1146/1395/1095 1149/1399/1095 1148/1396/1095
f 1147/1398/1096 1150/1401/1096 1149/1399/1096
f 1149/1399/1095 1152/1402/1095 1151/1400/1095
f 1150/1401/1096 1153/1406/1096 1152/1402/1096
f 873/1392/1097 988/1391/1097 1145/1397/1097
f 987/1390/1095 1146/1395/1095 1145/1397/1095
f 986/1389/1096 1147/1398/1096 1146/1395/1096
f 876/1404/1098 935/1417/1098 1147/1398/1098
f 1147/1398/1098 935/1417/1098 936/1405/1098
f 936/1405/1098 937/1407/1098 1153/1406/1098
f 937/1407/1098 877/1411/1098 989/1090/1098
f 1153/1406/1096 989/1090/1096 990/1088/1096
f 1152/1402/1095 990/1088/1095 991/1085/1095
f 923/1408/1097 1151/1400/1097 991/1085/1097
f 1148/1396/1097 1151/1400/1097 923/1408/1097
f 1145/1397/1097 1148/1396/1097 924/1409/1097
o Cube.004_Cube.005
v -0.353687 -0.044343 -0.194894
v -0.314485 -0.120887 -0.216419
v -0.280577 0.003097 -0.230441
v -0.241374 -0.073447 -0.251966
v -0.448599 -0.039772 -0.384005
v -0.409397 -0.116317 -0.405531
v -0.375489 0.007668 -0.419552
v -0.336286 -0.068877 -0.441077
v -0.349499 0.027353 -0.381365
v -0.322381 0.026047 -0.327333
v -0.295264 0.024741 -0.273301
v -0.242993 -0.077319 -0.302002
v -0.270111 -0.076013 -0.356034
v -0.297229 -0.074707 -0.410065
v -0.392745 -0.038513 -0.225906
v -0.419862 -0.037207 -0.279938
v -0.446980 -0.035901 -0.333970
v -0.394710 -0.137960 -0.362670
v -0.367592 -0.139266 -0.308638
v -0.340474 -0.140572 -0.254606
v -0.297314 -0.001015 -0.219672
v -0.319942 -0.012308 -0.207349
v -0.340691 -0.029161 -0.198582
v -0.350059 -0.063416 -0.194708
v -0.340611 -0.087885 -0.198802
v -0.326800 -0.108830 -0.207479
v -0.295770 -0.116871 -0.223247
v -0.272148 -0.105626 -0.233591
v -0.252393 -0.088725 -0.244337
v -0.243025 -0.054470 -0.248211
v -0.251479 -0.030049 -0.242138
v -0.266284 -0.009056 -0.235440
v -0.288712 0.023977 -0.259957
v -0.283531 0.020589 -0.247595
v -0.280847 0.012720 -0.237221
v -0.304302 -0.074282 -0.423156
v -0.313136 -0.073264 -0.433742
v -0.324830 -0.071241 -0.439736
v -0.339915 -0.049804 -0.441263
v -0.349363 -0.025335 -0.437169
v -0.363174 -0.004390 -0.428492
v -0.437580 -0.024495 -0.391634
v -0.417825 -0.007594 -0.402380
v -0.394204 0.003651 -0.412724
v -0.349283 -0.084059 -0.437389
v -0.370032 -0.100912 -0.428622
v -0.392660 -0.112205 -0.416299
v -0.423690 -0.104164 -0.400531
v -0.438494 -0.083171 -0.393833
v -0.446949 -0.058750 -0.387760
v -0.453251 -0.035683 -0.347467
v -0.456475 -0.036120 -0.360905
v -0.454327 -0.037684 -0.373932
v -0.333681 -0.140451 -0.241362
v -0.326805 -0.137644 -0.229700
v -0.319943 -0.130235 -0.221054
v -0.370030 0.017015 -0.414917
v -0.363168 0.024424 -0.406271
v -0.356292 0.027231 -0.394609
v -0.342720 0.027026 -0.367857
v -0.335940 0.026700 -0.354349
v -0.329161 0.026373 -0.340841
v -0.315602 0.025720 -0.313825
v -0.308822 0.025394 -0.300317
v -0.302043 0.025067 -0.286809
v -0.235646 -0.075536 -0.262039
v -0.233499 -0.077099 -0.275066
v -0.236722 -0.077536 -0.288504
v -0.249773 -0.076992 -0.315510
v -0.256552 -0.076666 -0.329018
v -0.263332 -0.076339 -0.342526
v -0.276891 -0.075686 -0.369541
v -0.283670 -0.075360 -0.383049
v -0.290450 -0.075033 -0.396557
v -0.365144 -0.041978 -0.196235
v -0.376837 -0.039955 -0.202229
v -0.385671 -0.038938 -0.212815
v -0.399524 -0.038186 -0.239414
v -0.406303 -0.037860 -0.252922
v -0.413083 -0.037533 -0.266430
v -0.426642 -0.036880 -0.293446
v -0.433421 -0.036554 -0.306954
v -0.440201 -0.036227 -0.320462
v -0.409126 -0.125940 -0.398751
v -0.406443 -0.133809 -0.388377
v -0.401261 -0.137197 -0.376014
v -0.387931 -0.138287 -0.349162
v -0.381151 -0.138613 -0.335654
v -0.374372 -0.138940 -0.322146
v -0.360813 -0.139593 -0.295130
v -0.354033 -0.139919 -0.281622
v -0.347254 -0.140246 -0.268114
v -0.311964 -0.103390 -0.403363
v -0.336169 -0.125470 -0.391749
v -0.365274 -0.137982 -0.377444
v -0.284846 -0.104696 -0.349331
v -0.309051 -0.126776 -0.337717
v -0.338156 -0.139288 -0.323412
v -0.257729 -0.106002 -0.295300
v -0.281933 -0.128082 -0.283685
v -0.311038 -0.140593 -0.269380
v -0.324700 0.024762 -0.258527
v -0.353805 0.012250 -0.244222
v -0.378009 -0.009830 -0.232608
v -0.351817 0.026068 -0.312559
v -0.380922 0.013556 -0.298254
v -0.405127 -0.008524 -0.286640
v -0.378935 0.027374 -0.366591
v -0.408040 0.014862 -0.352286
v -0.432245 -0.007218 -0.340672
v -0.420261 -0.123732 -0.349502
v -0.439123 -0.098791 -0.339433
v -0.448846 -0.067919 -0.333807
v -0.393143 -0.125038 -0.295471
v -0.412005 -0.100096 -0.285401
v -0.421728 -0.069225 -0.279775
v -0.366025 -0.126344 -0.241439
v -0.384887 -0.101402 -0.231369
v -0.394610 -0.070530 -0.225743
v -0.241128 -0.045301 -0.302164
v -0.250851 -0.014429 -0.296538
v -0.269713 0.010513 -0.286469
v -0.268245 -0.043995 -0.356196
v -0.277969 -0.013123 -0.350570
v -0.296831 0.011819 -0.340501
v -0.295363 -0.042689 -0.410228
v -0.305086 -0.011817 -0.404602
v -0.323948 0.013125 -0.394532
v -0.329934 -0.049843 -0.199200
v -0.318066 -0.074883 -0.203974
v -0.305146 -0.098243 -0.212811
v -0.306769 -0.033579 -0.208645
v -0.292927 -0.059117 -0.214258
v -0.280649 -0.084579 -0.222987
v -0.283706 -0.019846 -0.221677
v -0.269353 -0.043275 -0.227658
v -0.258917 -0.068246 -0.235287
v -0.330883 0.013082 -0.407726
v -0.312129 -0.011724 -0.417761
v -0.302452 -0.042431 -0.423337
v -0.338751 0.010821 -0.419038
v -0.320754 -0.013029 -0.428829
v -0.311398 -0.042586 -0.434056
v -0.348696 0.004699 -0.426605
v -0.332579 -0.017119 -0.435819
v -0.323753 -0.044002 -0.440301
v -0.384827 -0.014976 -0.423161
v -0.371908 -0.038336 -0.431997
v -0.360039 -0.063377 -0.436771
v -0.409324 -0.028640 -0.412984
v -0.397046 -0.054103 -0.421713
v -0.383204 -0.079641 -0.427326
v -0.431056 -0.044973 -0.400684
v -0.420620 -0.069945 -0.408313
v -0.406268 -0.093374 -0.414295
v -0.387521 -0.070788 -0.212634
v -0.377844 -0.101496 -0.218210
v -0.359090 -0.126301 -0.228245
v -0.378575 -0.070633 -0.201915
v -0.369219 -0.100191 -0.207143
v -0.351223 -0.124040 -0.216934
v -0.366220 -0.069218 -0.195671
v -0.357395 -0.096101 -0.200152
v -0.341278 -0.117919 -0.209366
v -0.385583 0.027256 -0.379931
v -0.391441 0.024475 -0.392263
v -0.395349 0.016789 -0.402898
v -0.414542 0.014815 -0.365720
v -0.419375 0.012528 -0.378715
v -0.420930 0.005776 -0.390923
v -0.438605 -0.007149 -0.354151
v -0.442452 -0.008625 -0.367461
v -0.441866 -0.013395 -0.380282
v -0.251368 -0.106071 -0.281820
v -0.247522 -0.104594 -0.268510
v -0.248108 -0.099825 -0.255689
v -0.275432 -0.128035 -0.270251
v -0.270598 -0.125747 -0.257257
v -0.269043 -0.118996 -0.245048
v -0.304391 -0.140476 -0.256040
v -0.298533 -0.137694 -0.243708
v -0.294624 -0.130008 -0.233073
v -0.337838 -0.095504 -0.434476
v -0.327306 -0.100752 -0.427478
v -0.318966 -0.102815 -0.416509
v -0.359253 -0.114652 -0.424789
v -0.350529 -0.121898 -0.416517
v -0.343048 -0.124778 -0.404976
v -0.384355 -0.125687 -0.411860
v -0.378317 -0.133852 -0.402677
v -0.371989 -0.137221 -0.390729
v -0.305185 -0.103716 -0.389855
v -0.298405 -0.104043 -0.376347
v -0.291626 -0.104369 -0.362839
v -0.329389 -0.125796 -0.378241
v -0.322610 -0.126123 -0.364733
v -0.315830 -0.126449 -0.351225
v -0.358494 -0.138308 -0.363936
v -0.351715 -0.138635 -0.350428
v -0.344936 -0.138961 -0.336920
v -0.278067 -0.105022 -0.335824
v -0.271287 -0.105349 -0.322316
v -0.264508 -0.105675 -0.308808
v -0.302272 -0.127102 -0.324209
v -0.295492 -0.127429 -0.310701
v -0.288713 -0.127755 -0.297193
v -0.331377 -0.139614 -0.309904
v -0.324597 -0.139940 -0.296396
v -0.317818 -0.140267 -0.282888
v -0.305619 0.012468 -0.224111
v -0.311657 0.020632 -0.233295
v -0.317985 0.024001 -0.245242
v -0.330721 0.001432 -0.211182
v -0.339445 0.008678 -0.219455
v -0.346925 0.011559 -0.230995
v -0.352135 -0.017716 -0.201495
v -0.362668 -0.012468 -0.208493
v -0.371007 -0.010404 -0.219463
v -0.331479 0.025088 -0.272035
v -0.338259 0.025415 -0.285543
v -0.345038 0.025741 -0.299051
v -0.360584 0.012577 -0.257730
v -0.367364 0.012903 -0.271238
v -0.374143 0.013230 -0.284746
v -0.384789 -0.009503 -0.246116
v -0.391568 -0.009177 -0.259624
v -0.398348 -0.008850 -0.273132
v -0.358597 0.026394 -0.326067
v -0.365376 0.026721 -0.339575
v -0.372156 0.027047 -0.353083
v -0.387702 0.013883 -0.311762
v -0.394481 0.014209 -0.325270
v -0.401261 0.014536 -0.338778
v -0.411907 -0.008197 -0.300148
v -0.418686 -0.007871 -0.313656
v -0.425466 -0.007544 -0.327164
v -0.455951 -0.064896 -0.374458
v -0.447604 -0.091756 -0.379893
v -0.431008 -0.113598 -0.388153
v -0.458359 -0.066791 -0.360883
v -0.449150 -0.096342 -0.366403
v -0.431007 -0.120198 -0.375902
v -0.455120 -0.067533 -0.347322
v -0.445461 -0.098240 -0.352935
v -0.426688 -0.123046 -0.362933
v -0.442066 -0.068245 -0.320299
v -0.432343 -0.099117 -0.325925
v -0.413481 -0.124059 -0.335994
v -0.435287 -0.068572 -0.306791
v -0.425564 -0.099444 -0.312417
v -0.406702 -0.124385 -0.322487
v -0.428508 -0.068898 -0.293283
v -0.418784 -0.099770 -0.298909
v -0.399922 -0.124712 -0.308979
v -0.414949 -0.069551 -0.266267
v -0.405225 -0.100423 -0.271893
v -0.386363 -0.125365 -0.281963
v -0.408169 -0.069878 -0.252759
v -0.398446 -0.100749 -0.258385
v -0.379584 -0.125691 -0.268455
v -0.401390 -0.070204 -0.239251
v -0.391667 -0.101076 -0.244877
v -0.372805 -0.126018 -0.254947
v -0.258965 0.000378 -0.247818
v -0.242369 -0.021463 -0.256078
v -0.234023 -0.048323 -0.261514
v -0.258967 0.006978 -0.260069
v -0.240824 -0.016878 -0.269569
v -0.231614 -0.046429 -0.275088
v -0.263285 0.009826 -0.273038
v -0.244513 -0.014980 -0.283036
v -0.234854 -0.045687 -0.288649
v -0.276492 0.010839 -0.299977
v -0.257630 -0.014103 -0.310046
v -0.247907 -0.044974 -0.315672
v -0.283272 0.011166 -0.313485
v -0.264410 -0.013776 -0.323554
v -0.254687 -0.044648 -0.329180
v -0.290051 0.011492 -0.326993
v -0.271189 -0.013450 -0.337062
v -0.261466 -0.044321 -0.342688
v -0.303610 0.012145 -0.354009
v -0.284748 -0.012797 -0.364078
v -0.275025 -0.043669 -0.369704
v -0.310390 0.012472 -0.367517
v -0.291528 -0.012470 -0.377586
v -0.281804 -0.043342 -0.383212
v -0.317169 0.012798 -0.381024
v -0.298307 -0.012144 -0.391094
v -0.288584 -0.043016 -0.396720
vt 0.437500 0.062276
vt 0.500000 0.122396
vt 0.437500 0.123210
vt 0.562500 0.062276
vt 0.500000 0.062174
vt 0.437500 0.180725
vt 0.562500 0.180725
vt 0.500000 0.178060
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.437500 0.000000
vt 0.562500 0.000000
vt 0.500000 0.000000
vt 0.625000 0.000000
vt 0.625000 0.062500
vt 0.562500 0.123210
vt 0.625000 0.187500
vt 0.625000 0.250000
vt 0.562500 0.228516
vt 0.500000 0.223958
vt 0.437500 0.228516
vt 0.375000 0.250000
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.453451
vt 0.437500 0.470540
vt 0.437500 0.453349
vt 0.562500 0.470540
vt 0.500000 0.471354
vt 0.437663 0.491150
vt 0.562337 0.491150
vt 0.500000 0.493815
vt 0.437500 0.437500
vt 0.375000 0.453125
vt 0.375000 0.437500
vt 0.500000 0.437500
vt 0.562500 0.453349
vt 0.562500 0.437500
vt 0.625000 0.453125
vt 0.625000 0.468750
vt 0.625000 0.484375
vt 0.625000 0.500000
vt 0.560628 0.521525
vt 0.500000 0.526042
vt 0.439372 0.521525
vt 0.375000 0.500000
vt 0.375000 0.484375
vt 0.375000 0.468750
vt 0.444214 0.569214
vt 0.500000 0.625000
vt 0.446615 0.625000
vt 0.555786 0.569214
vt 0.500000 0.571615
vt 0.444214 0.680786
vt 0.555786 0.680786
vt 0.500000 0.678385
vt 0.396525 0.564372
vt 0.603475 0.564372
vt 0.553385 0.625000
vt 0.603475 0.685628
vt 0.625000 0.750000
vt 0.560628 0.728475
vt 0.500000 0.723958
vt 0.439372 0.728475
vt 0.375000 0.750000
vt 0.396525 0.685628
vt 0.401042 0.625000
vt 0.500000 0.953125
vt 0.437500 0.968750
vt 0.437500 0.953125
vt 0.562500 0.968750
vt 0.500000 0.968750
vt 0.437500 0.984375
vt 0.562500 0.984375
vt 0.500000 0.984375
vt 0.437500 0.937500
vt 0.375000 0.953125
vt 0.375000 0.937500
vt 0.500000 0.937500
vt 0.562500 0.953125
vt 0.562500 0.937500
vt 0.625000 0.953125
vt 0.625000 0.968750
vt 0.625000 0.984375
vt 0.625000 1.000000
vt 0.562500 1.000000
vt 0.500000 1.000000
vt 0.437500 1.000000
vt 0.375000 1.000000
vt 0.375000 0.984375
vt 0.375000 0.968750
vt 0.345540 0.562500
vt 0.328451 0.625000
vt 0.328349 0.562500
vt 0.366150 0.562663
vt 0.346354 0.625000
vt 0.345540 0.687500
vt 0.328349 0.687500
vt 0.366150 0.687337
vt 0.328125 0.500000
vt 0.312500 0.562500
vt 0.312500 0.500000
vt 0.343750 0.500000
vt 0.359375 0.500000
vt 0.368815 0.625000
vt 0.359375 0.750000
vt 0.343750 0.750000
vt 0.328125 0.750000
vt 0.312500 0.687500
vt 0.312500 0.750000
vt 0.312500 0.625000
vt 0.843750 0.562500
vt 0.828125 0.625000
vt 0.828125 0.562500
vt 0.859375 0.562500
vt 0.843750 0.625000
vt 0.843750 0.687500
vt 0.828125 0.687500
vt 0.859375 0.687500
vt 0.828125 0.500000
vt 0.812500 0.562500
vt 0.812500 0.500000
vt 0.843750 0.500000
vt 0.859375 0.500000
vt 0.875000 0.500000
vt 0.875000 0.625000
vt 0.859375 0.625000
vt 0.875000 0.750000
vt 0.859375 0.750000
vt 0.843750 0.750000
vt 0.828125 0.750000
vt 0.812500 0.687500
vt 0.812500 0.750000
vt 0.812500 0.625000
vt 0.633850 0.562663
vt 0.653646 0.625000
vt 0.631185 0.625000
vt 0.654460 0.562500
vt 0.671550 0.625000
vt 0.633850 0.687337
vt 0.654460 0.687500
vt 0.640625 0.500000
vt 0.656250 0.500000
vt 0.671651 0.562500
vt 0.671875 0.500000
vt 0.687500 0.562500
vt 0.687500 0.625000
vt 0.671651 0.687500
vt 0.687500 0.687500
vt 0.671875 0.750000
vt 0.656250 0.750000
vt 0.640625 0.750000
vt 0.598958 0.625000
vt 0.718750 0.562500
vt 0.703125 0.625000
vt 0.703125 0.562500
vt 0.734375 0.562500
vt 0.718750 0.625000
vt 0.703125 0.687500
vt 0.734375 0.625000
vt 0.718750 0.687500
vt 0.703125 0.500000
vt 0.687500 0.500000
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.734375 0.500000
vt 0.750000 0.562500
vt 0.750000 0.625000
vt 0.734375 0.687500
vt 0.750000 0.687500
vt 0.734375 0.750000
vt 0.718750 0.750000
vt 0.703125 0.750000
vt 0.687500 0.750000
vt 0.781250 0.562500
vt 0.765625 0.625000
vt 0.765625 0.562500
vt 0.796875 0.562500
vt 0.781250 0.625000
vt 0.765625 0.687500
vt 0.796875 0.687500
vt 0.781250 0.687500
vt 0.781250 0.500000
vt 0.765625 0.500000
vt 0.796875 0.500000
vt 0.796875 0.625000
vt 0.796875 0.750000
vt 0.781250 0.750000
vt 0.765625 0.750000
vt 0.750000 0.750000
vt 0.140625 0.562500
vt 0.156250 0.625000
vt 0.140625 0.625000
vt 0.156250 0.562500
vt 0.171875 0.625000
vt 0.140625 0.687500
vt 0.156250 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.140625 0.500000
vt 0.156250 0.500000
vt 0.171875 0.562500
vt 0.171875 0.500000
vt 0.187500 0.562500
vt 0.187500 0.625000
vt 0.171875 0.687500
vt 0.187500 0.687500
vt 0.171875 0.750000
vt 0.156250 0.750000
vt 0.140625 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.218750 0.562500
vt 0.203125 0.625000
vt 0.203125 0.562500
vt 0.234375 0.562500
vt 0.218750 0.625000
vt 0.203125 0.687500
vt 0.234375 0.625000
vt 0.218750 0.687500
vt 0.187500 0.500000
vt 0.218750 0.500000
vt 0.203125 0.500000
vt 0.250000 0.500000
vt 0.234375 0.500000
vt 0.250000 0.562500
vt 0.250000 0.625000
vt 0.234375 0.687500
vt 0.250000 0.687500
vt 0.234375 0.750000
vt 0.218750 0.750000
vt 0.203125 0.750000
vt 0.187500 0.750000
vt 0.281250 0.562500
vt 0.265625 0.625000
vt 0.265625 0.562500
vt 0.296875 0.562500
vt 0.281250 0.625000
vt 0.265625 0.687500
vt 0.296875 0.687500
vt 0.281250 0.687500
vt 0.281250 0.500000
vt 0.265625 0.500000
vt 0.296875 0.500000
vt 0.296875 0.625000
vt 0.296875 0.750000
vt 0.281250 0.750000
vt 0.265625 0.750000
vt 0.250000 0.750000
vt 0.437663 0.758850
vt 0.500000 0.778646
vt 0.437500 0.779460
vt 0.562337 0.758850
vt 0.500000 0.756185
vt 0.500000 0.796549
vt 0.437500 0.796651
vt 0.562500 0.779460
vt 0.375000 0.765625
vt 0.625000 0.765625
vt 0.625000 0.781250
vt 0.562500 0.796651
vt 0.625000 0.796875
vt 0.562500 0.812500
vt 0.500000 0.812500
vt 0.437500 0.812500
vt 0.375000 0.796875
vt 0.375000 0.812500
vt 0.375000 0.781250
vt 0.500000 0.828125
vt 0.437500 0.843750
vt 0.437500 0.828125
vt 0.562500 0.828125
vt 0.500000 0.843750
vt 0.437500 0.859375
vt 0.562500 0.843750
vt 0.500000 0.859375
vt 0.375000 0.828125
vt 0.625000 0.812500
vt 0.625000 0.828125
vt 0.625000 0.859375
vt 0.562500 0.859375
vt 0.562500 0.875000
vt 0.500000 0.875000
vt 0.437500 0.875000
vt 0.375000 0.875000
vt 0.375000 0.859375
vt 0.375000 0.843750
vt 0.500000 0.890625
vt 0.437500 0.906250
vt 0.437500 0.890625
vt 0.562500 0.890625
vt 0.500000 0.906250
vt 0.437500 0.921875
vt 0.562500 0.906250
vt 0.500000 0.921875
vt 0.375000 0.890625
vt 0.625000 0.875000
vt 0.625000 0.890625
vt 0.625000 0.906250
vt 0.562500 0.921875
vt 0.625000 0.921875
vt 0.375000 0.921875
vt 0.375000 0.906250
vt 0.437500 0.258850
vt 0.500000 0.278646
vt 0.437500 0.279460
vt 0.562500 0.258850
vt 0.500000 0.256185
vt 0.500000 0.296549
vt 0.437500 0.296651
vt 0.562500 0.279460
vt 0.375000 0.265625
vt 0.625000 0.265625
vt 0.625000 0.281250
vt 0.562500 0.296651
vt 0.625000 0.296875
vt 0.562500 0.312500
vt 0.500000 0.312500
vt 0.437500 0.312500
vt 0.375000 0.296875
vt 0.375000 0.312500
vt 0.375000 0.281250
vt 0.500000 0.328125
vt 0.437500 0.343750
vt 0.437500 0.328125
vt 0.562500 0.328125
vt 0.500000 0.343750
vt 0.437500 0.359375
vt 0.562500 0.343750
vt 0.500000 0.359375
vt 0.375000 0.328125
vt 0.625000 0.312500
vt 0.625000 0.343750
vt 0.562500 0.359375
vt 0.625000 0.359375
vt 0.562500 0.375000
vt 0.500000 0.375000
vt 0.437500 0.375000
vt 0.375000 0.375000
vt 0.375000 0.359375
vt 0.375000 0.343750
vt 0.500000 0.390625
vt 0.437500 0.406250
vt 0.437500 0.390625
vt 0.562500 0.390625
vt 0.500000 0.406250
vt 0.437500 0.421875
vt 0.562500 0.406250
vt 0.500000 0.421875
vt 0.375000 0.390625
vt 0.625000 0.375000
vt 0.625000 0.406250
vt 0.562500 0.421875
vt 0.625000 0.421875
vt 0.375000 0.421875
vt 0.375000 0.406250
vt 0.625000 0.125000
vt 0.625000 0.437500
vt 0.625000 0.937500
vt 0.875000 0.562500
vt 0.875000 0.687500
vt 0.625000 0.843750
vt 0.625000 0.328125
vt 0.625000 0.390625
vn 0.3771 0.0008 0.9262
vn 0.4304 -0.1012 0.8969
vn 0.4648 0.0578 0.8835
vn 0.5165 -0.0454 0.8551
vn 0.1940 0.0716 0.9784
vn 0.2571 -0.0620 0.9644
vn 0.3162 -0.1776 0.9319
vn 0.3600 -0.2898 0.8868
vn 0.4807 -0.2319 0.8457
vn 0.5784 -0.1685 0.7982
vn 0.6728 -0.0869 0.7347
vn 0.6204 0.0198 0.7840
vn 0.5612 0.1353 0.8165
vn 0.4778 0.2558 0.8404
vn 0.3949 0.1897 0.8989
vn 0.2972 0.1264 0.9464
vn 0.5659 0.6387 -0.5213
vn 0.7234 0.3357 -0.6033
vn 0.2767 0.5362 -0.7974
vn 0.4147 0.2805 -0.8656
vn 0.3914 0.8962 -0.2089
vn 0.6712 0.6508 -0.3549
vn 0.8296 0.3416 -0.4416
vn 0.8779 -0.0537 -0.4758
vn 0.7671 -0.0500 -0.6395
vn 0.4541 -0.0364 -0.8902
vn 0.0589 0.0015 -0.9983
vn -0.0010 0.1641 -0.9864
vn -0.0862 0.3323 -0.9392
vn -0.1647 0.5137 -0.8420
vn 0.0567 0.7354 -0.6752
vn 0.2909 0.8770 -0.3826
vn -0.4291 0.1008 -0.8976
vn -0.3774 -0.0023 -0.9261
vn -0.5169 0.0439 -0.8549
vn -0.4635 -0.0582 -0.8842
vn -0.3829 0.2973 -0.8746
vn -0.3163 0.1776 -0.9319
vn -0.2571 0.0621 -0.9644
vn -0.1884 -0.0454 -0.9811
vn -0.2972 -0.1263 -0.9464
vn -0.3949 -0.1897 -0.8989
vn -0.5011 -0.2483 -0.8290
vn -0.5613 -0.1353 -0.8165
vn -0.6204 -0.0198 -0.7840
vn -0.6668 0.1131 -0.7366
vn -0.5784 0.1685 -0.7982
vn -0.4807 0.2319 -0.8457
vn -0.7204 -0.3371 0.6061
vn -0.5673 -0.6405 0.5176
vn -0.4067 -0.2824 0.8688
vn -0.2808 -0.5419 0.7921
vn -0.8778 0.0536 0.4760
vn -0.8295 -0.3418 0.4418
vn -0.6712 -0.6509 0.3546
vn -0.3914 -0.8963 0.2087
vn -0.2916 -0.8785 0.3784
vn -0.0575 -0.7379 0.6725
vn 0.1774 -0.4628 0.8685
vn 0.0871 -0.3323 0.9392
vn 0.0004 -0.1648 0.9863
vn -0.1112 0.0250 0.9935
vn -0.4511 0.0347 0.8918
vn -0.7646 0.0481 0.6427
vn -0.4013 0.9158 -0.0160
vn -0.5116 0.7746 -0.3718
vn -0.6435 0.7586 0.1018
vn -0.7189 0.6401 -0.2710
vn -0.0031 1.0000 -0.0076
vn -0.0905 0.9791 -0.1823
vn -0.2502 0.8243 -0.5079
vn -0.3555 0.5735 -0.7380
vn -0.5325 0.4760 -0.6999
vn -0.6733 0.3847 -0.6314
vn -0.8086 0.2795 -0.5177
vn -0.8917 0.4080 -0.1960
vn -0.8548 0.4831 0.1893
vn -0.7868 0.4914 0.3735
vn -0.5729 0.7723 0.2743
vn -0.3254 0.9330 0.1539
vn 0.6435 -0.7586 -0.1018
vn 0.7189 -0.6401 0.2710
vn 0.4013 -0.9158 0.0160
vn 0.5116 -0.7746 0.3718
vn 0.7868 -0.4914 -0.3735
vn 0.8548 -0.4831 -0.1893
vn 0.8917 -0.4080 0.1960
vn 0.8086 -0.2795 0.5177
vn 0.6733 -0.3847 0.6314
vn 0.5325 -0.4760 0.6999
vn 0.3555 -0.5735 0.7380
vn 0.2502 -0.8243 0.5079
vn 0.0905 -0.9791 0.1823
vn 0.0031 -1.0000 0.0076
vn 0.3254 -0.9330 -0.1539
vn 0.5729 -0.7723 -0.2743
vn 0.2078 -0.6166 -0.7593
vn 0.4547 -0.7491 -0.4817
vn -0.0057 -0.7551 -0.6555
vn 0.2099 -0.9080 -0.3627
vn 0.0061 -0.2409 -0.9705
vn 0.3714 -0.3825 -0.8460
vn 0.6566 -0.4724 -0.5880
vn 0.7603 -0.4901 -0.4264
vn 0.5475 -0.7711 -0.3251
vn 0.2998 -0.9318 -0.2047
vn -0.0236 -0.9987 -0.0453
vn -0.1108 -0.9703 -0.2149
vn -0.2736 -0.8011 -0.5324
vn -0.3820 -0.4927 -0.7819
vn -0.2510 -0.4387 -0.8629
vn -0.1109 -0.3478 -0.9310
vn 0.5604 -0.7720 -0.2999
vn 0.3127 -0.9327 -0.1795
vn 0.7738 -0.4910 -0.4002
vn -0.0103 -0.9998 -0.0190
vn 0.0057 0.7551 0.6555
vn -0.2099 0.9080 0.3627
vn -0.2078 0.6166 0.7593
vn -0.4547 0.7491 0.4817
vn 0.3820 0.4927 0.7819
vn 0.2736 0.8011 0.5324
vn 0.1108 0.9703 0.2149
vn 0.0236 0.9987 0.0453
vn -0.2998 0.9318 0.2047
vn -0.5475 0.7711 0.3251
vn -0.7603 0.4901 0.4264
vn -0.6566 0.4724 0.5880
vn -0.3715 0.3825 0.8460
vn -0.0060 0.2409 0.9705
vn 0.1109 0.3478 0.9310
vn 0.2510 0.4387 0.8629
vn -0.3127 0.9327 0.1795
vn -0.5604 0.7720 0.2999
vn 0.0103 0.9998 0.0190
vn -0.7738 0.4910 0.4002
vn -0.9449 -0.2565 -0.2035
vn -0.8132 -0.5163 -0.2686
vn -0.9229 -0.3273 0.2026
vn -0.7663 -0.6310 0.1210
vn -0.8622 0.0612 -0.5029
vn -0.7934 -0.1266 -0.5954
vn -0.7070 -0.2940 -0.6431
vn -0.5990 -0.4254 -0.6784
vn -0.5876 -0.7123 -0.3837
vn -0.4946 -0.8687 -0.0261
vn -0.4185 -0.8949 0.1546
vn -0.6981 -0.6496 0.3012
vn -0.8565 -0.3405 0.3879
vn -0.9052 0.0549 0.4215
vn -0.9712 0.0580 0.2311
vn -0.9837 0.0603 -0.1694
vn -0.8434 -0.3413 0.4150
vn -0.6850 -0.6505 0.3281
vn -0.8919 0.0543 0.4489
vn -0.4053 -0.8960 0.1817
vn -0.8919 0.0543 0.4490
vn 0.8149 0.5103 0.2748
vn 0.9471 0.2548 0.1951
vn 0.7684 0.6290 -0.1178
vn 0.9225 0.3261 -0.2067
vn 0.5863 0.4775 0.6544
vn 0.7076 0.2940 0.6425
vn 0.7932 0.1259 0.5959
vn 0.8353 -0.0359 0.5487
vn 0.9842 -0.0619 0.1660
vn 0.9701 -0.0598 -0.2350
vn 0.9051 -0.0550 -0.4217
vn 0.8565 0.3403 -0.3881
vn 0.6982 0.6495 -0.3010
vn 0.4188 0.8949 -0.1544
vn 0.4975 0.8670 0.0291
vn 0.5893 0.7098 0.3860
vn 0.6850 0.6505 -0.3281
vn 0.8434 0.3413 -0.4150
vn 0.4053 0.8960 -0.1817
vn 0.8919 -0.0543 -0.4489
vn 0.3774 0.0023 0.9261
vn 0.4291 -0.1008 0.8976
vn 0.4635 0.0582 0.8842
vn 0.5169 -0.0439 0.8549
vn 0.1884 0.0454 0.9811
vn 0.2571 -0.0621 0.9644
vn 0.3163 -0.1776 0.9319
vn 0.3829 -0.2973 0.8746
vn 0.6668 -0.1131 0.7366
vn 0.5613 0.1353 0.8165
vn 0.5011 0.2483 0.8290
vn 0.2972 0.1263 0.9464
vn 0.5673 0.6405 -0.5176
vn 0.7204 0.3371 -0.6061
vn 0.2808 0.5419 -0.7921
vn 0.4067 0.2824 -0.8688
vn 0.3914 0.8963 -0.2087
vn 0.6712 0.6509 -0.3546
vn 0.8295 0.3418 -0.4418
vn 0.8778 -0.0536 -0.4760
vn 0.7646 -0.0481 -0.6427
vn 0.4511 -0.0347 -0.8918
vn 0.1112 -0.0250 -0.9935
vn -0.0005 0.1648 -0.9863
vn -0.0871 0.3323 -0.9392
vn -0.1774 0.4628 -0.8685
vn 0.0575 0.7379 -0.6725
vn 0.2916 0.8785 -0.3784
vn -0.4304 0.1012 -0.8969
vn -0.3771 -0.0008 -0.9262
vn -0.5165 0.0454 -0.8551
vn -0.4648 -0.0578 -0.8835
vn -0.3600 0.2898 -0.8868
vn -0.3162 0.1776 -0.9319
vn -0.2571 0.0620 -0.9644
vn -0.1940 -0.0716 -0.9784
vn -0.2972 -0.1264 -0.9464
vn -0.4778 -0.2558 -0.8404
vn -0.5612 -0.1353 -0.8165
vn -0.6728 0.0869 -0.7347
vn -0.7234 -0.3357 0.6033
vn -0.5659 -0.6387 0.5213
vn -0.4147 -0.2805 0.8656
vn -0.2768 -0.5362 0.7974
vn -0.8779 0.0537 0.4758
vn -0.8296 -0.3416 0.4416
vn -0.6712 -0.6508 0.3549
vn -0.3914 -0.8962 0.2089
vn -0.2909 -0.8770 0.3826
vn -0.0567 -0.7354 0.6752
vn 0.1647 -0.5137 0.8420
vn 0.0862 -0.3323 0.9392
vn 0.0010 -0.1641 0.9864
vn -0.0589 -0.0015 0.9983
vn -0.4541 0.0364 0.8902
vn -0.7671 0.0500 0.6395
vn -0.3986 0.9170 -0.0132
vn -0.5074 0.7799 -0.3666
vn -0.6434 0.7582 0.1058
vn -0.7209 0.6413 -0.2628
vn -0.0029 1.0000 -0.0074
vn -0.0872 0.9799 -0.1796
vn -0.2476 0.8262 -0.5061
vn -0.3887 0.5298 -0.7538
vn -0.5330 0.4765 -0.6992
vn -0.6731 0.3855 -0.6311
vn -0.7767 0.2781 -0.5651
vn -0.8926 0.4076 -0.1925
vn -0.8546 0.4820 0.1935
vn -0.7867 0.4913 0.3737
vn -0.5729 0.7723 0.2745
vn -0.3252 0.9330 0.1541
vn 0.6434 -0.7582 -0.1058
vn 0.7209 -0.6413 0.2628
vn 0.3986 -0.9170 0.0132
vn 0.5074 -0.7799 0.3666
vn 0.7867 -0.4913 -0.3737
vn 0.8546 -0.4820 -0.1935
vn 0.8926 -0.4076 0.1925
vn 0.7768 -0.2781 0.5651
vn 0.6731 -0.3855 0.6311
vn 0.5330 -0.4765 0.6992
vn 0.3887 -0.5298 0.7538
vn 0.2476 -0.8262 0.5061
vn 0.0872 -0.9799 0.1796
vn 0.0029 -1.0000 0.0074
vn 0.3252 -0.9330 -0.1541
vn 0.5729 -0.7723 -0.2745
vn 0.2002 -0.6151 -0.7626
vn 0.4516 -0.7494 -0.4842
vn -0.0071 -0.7496 -0.6618
vn 0.2093 -0.9066 -0.3665
vn 0.0630 -0.2436 -0.9678
vn 0.3681 -0.3828 -0.8473
vn 0.6534 -0.4734 -0.5907
vn 0.7601 -0.4902 -0.4266
vn 0.5473 -0.7711 -0.3253
vn 0.2998 -0.9317 -0.2049
vn -0.0236 -0.9987 -0.0456
vn -0.1109 -0.9694 -0.2191
vn -0.2734 -0.7991 -0.5354
vn -0.3901 -0.5376 -0.7475
vn -0.2518 -0.4383 -0.8629
vn -0.1111 -0.3470 -0.9313
vn 0.0071 0.7496 0.6618
vn -0.2093 0.9066 0.3665
vn -0.2002 0.6151 0.7626
vn -0.4516 0.7494 0.4842
vn 0.3901 0.5376 0.7475
vn 0.2734 0.7991 0.5354
vn 0.1109 0.9694 0.2191
vn 0.0236 0.9987 0.0456
vn -0.2998 0.9317 0.2049
vn -0.5473 0.7711 0.3253
vn -0.7601 0.4902 0.4266
vn -0.6534 0.4734 0.5907
vn -0.3681 0.3828 0.8473
vn -0.0630 0.2436 0.9678
vn 0.1111 0.3470 0.9313
vn 0.2518 0.4383 0.8629
vn -0.9471 -0.2548 -0.1951
vn -0.8149 -0.5103 -0.2748
vn -0.9225 -0.3261 0.2067
vn -0.7684 -0.6290 0.1178
vn -0.8353 0.0359 -0.5487
vn -0.7932 -0.1259 -0.5959
vn -0.7076 -0.2940 -0.6425
vn -0.5863 -0.4775 -0.6544
vn -0.5893 -0.7098 -0.3860
vn -0.4975 -0.8670 -0.0291
vn -0.4188 -0.8949 0.1544
vn -0.6982 -0.6495 0.3010
vn -0.8565 -0.3403 0.3881
vn -0.9051 0.0550 0.4217
vn -0.9701 0.0598 0.2350
vn -0.9842 0.0619 -0.1660
vn 0.8132 0.5163 0.2686
vn 0.9449 0.2565 0.2035
vn 0.7663 0.6310 -0.1210
vn 0.9229 0.3273 -0.2026
vn 0.5990 0.4254 0.6784
vn 0.7070 0.2940 0.6431
vn 0.7934 0.1266 0.5954
vn 0.8622 -0.0612 0.5029
vn 0.9837 -0.0603 0.1694
vn 0.9712 -0.0580 -0.2311
vn 0.9052 -0.0549 -0.4215
vn 0.8565 0.3405 -0.3879
vn 0.6981 0.6496 -0.3012
vn 0.4185 0.8949 -0.1546
vn 0.4946 0.8687 0.0261
vn 0.5876 0.7123 0.3837
usemtl Material.002
s off
f 1282/1418/1255 1286/1419/1255 1285/1420/1255
f 1284/1421/1256 1286/1419/1256 1283/1422/1256
f 1286/1419/1257 1288/1423/1257 1285/1420/1257
f 1286/1419/1258 1290/1424/1258 1289/1425/1258
f 1154/1426/1259 1282/1418/1259 1176/1427/1259
f 1177/1428/1260 1283/1422/1260 1282/1418/1260
f 1179/1429/1261 1283/1422/1261 1178/1430/1261
f 1155/1431/1262 1284/1421/1262 1179/1429/1262
f 1180/1432/1263 1287/1433/1263 1284/1421/1263
f 1287/1433/1264 1182/1434/1264 1290/1424/1264
f 1290/1424/1265 1157/1435/1265 1183/1436/1265
f 1289/1425/1266 1183/1436/1266 1184/1437/1266
f 1289/1425/1267 1185/1438/1267 1288/1423/1267
f 1288/1423/1268 1156/1439/1268 1174/1440/1268
f 1285/1420/1269 1174/1440/1269 1175/1441/1269
f 1176/1427/1270 1285/1420/1270 1175/1441/1270
f 1292/1442/1271 1294/1443/1271 1291/1444/1271
f 1292/1442/1272 1296/1445/1272 1295/1446/1272
f 1295/1446/1273 1297/1447/1273 1294/1443/1273
f 1295/1446/1274 1299/1448/1274 1298/1449/1274
f 1281/1450/1275 1212/1451/1275 1162/1452/1275
f 1280/1453/1276 1291/1444/1276 1281/1450/1276
f 1280/1453/1277 1293/1454/1277 1292/1442/1277
f 1279/1455/1278 1189/1456/1278 1293/1454/1278
f 1293/1454/1279 1190/1457/1279 1296/1445/1279
f 1296/1445/1280 1191/1458/1280 1299/1448/1280
f 1299/1448/1281 1161/1459/1281 1192/1460/1281
f 1299/1448/1282 1193/1461/1282 1298/1449/1282
f 1297/1447/1283 1193/1461/1283 1194/1462/1283
f 1297/1447/1284 1160/1463/1284 1210/1464/1284
f 1294/1443/1285 1210/1464/1285 1211/1465/1285
f 1291/1444/1286 1211/1465/1286 1212/1451/1286
f 1300/1466/1287 1304/1467/1287 1303/1468/1287
f 1302/1469/1288 1304/1467/1288 1301/1470/1288
f 1304/1467/1289 1306/1471/1289 1303/1468/1289
f 1304/1467/1290 1308/1472/1290 1307/1473/1290
f 1160/1463/1291 1300/1466/1291 1197/1474/1291
f 1194/1462/1292 1301/1470/1292 1300/1466/1292
f 1192/1460/1293 1301/1470/1293 1193/1461/1293
f 1161/1459/1294 1302/1469/1294 1192/1460/1294
f 1198/1475/1295 1305/1476/1295 1302/1469/1295
f 1305/1476/1296 1200/1477/1296 1308/1472/1296
f 1308/1472/1297 1159/1478/1297 1201/1479/1297
f 1307/1473/1298 1201/1479/1298 1202/1480/1298
f 1307/1473/1299 1203/1481/1299 1306/1471/1299
f 1306/1471/1300 1158/1482/1300 1195/1483/1300
f 1303/1468/1301 1195/1483/1301 1196/1484/1301
f 1197/1474/1302 1303/1468/1302 1196/1484/1302
f 1310/1485/1303 1312/1486/1303 1309/1487/1303
f 1310/1485/1304 1314/1488/1304 1313/1489/1304
f 1313/1489/1305 1315/1490/1305 1312/1486/1305
f 1313/1489/1306 1317/1491/1306 1316/1492/1306
f 1272/1493/1307 1230/1494/1307 1168/1495/1307
f 1271/1496/1308 1309/1487/1308 1272/1493/1308
f 1271/1496/1309 1311/1497/1309 1310/1485/1309
f 1270/1498/1310 1207/1499/1310 1311/1497/1310
f 1311/1497/1311 1208/1500/1311 1314/1488/1311
f 1314/1488/1312 1209/1501/1312 1317/1491/1312
f 1317/1491/1313 1155/1502/1313 1179/1503/1313
f 1317/1491/1314 1178/1504/1314 1316/1492/1314
f 1315/1490/1315 1178/1504/1315 1177/1505/1315
f 1315/1490/1316 1154/1506/1316 1228/1507/1316
f 1312/1486/1317 1228/1507/1317 1229/1508/1317
f 1309/1487/1318 1229/1508/1318 1230/1494/1318
f 1319/1509/1319 1321/1510/1319 1318/1511/1319
f 1320/1512/1320 1322/1513/1320 1319/1509/1320
f 1321/1510/1321 1325/1514/1321 1324/1515/1321
f 1322/1513/1322 1326/1516/1322 1325/1514/1322
f 1212/1517/1323 1261/1518/1323 1162/1519/1323
f 1211/1520/1324 1318/1511/1324 1212/1517/1324
f 1210/1521/1325 1319/1509/1325 1211/1520/1325
f 1160/1463/1326 1320/1512/1326 1210/1521/1326
f 1320/1512/1327 1196/1484/1327 1323/1522/1327
f 1196/1484/1328 1326/1516/1328 1323/1522/1328
f 1326/1516/1329 1158/1482/1329 1206/1523/1329
f 1325/1514/1330 1206/1523/1330 1205/1524/1330
f 1324/1515/1331 1205/1524/1331 1204/1525/1331
f 1263/1526/1332 1204/1525/1332 1170/1527/1332
f 1262/1528/1333 1324/1515/1333 1263/1526/1333
f 1318/1511/1334 1262/1528/1334 1261/1518/1334
f 1328/1529/1335 1330/1530/1335 1327/1531/1335
f 1329/1532/1336 1331/1533/1336 1328/1529/1336
f 1330/1530/1337 1334/1534/1337 1333/1535/1337
f 1331/1533/1338 1335/1536/1338 1334/1534/1338
f 1221/1537/1339 1252/1538/1339 1165/1539/1339
f 1220/1540/1340 1327/1531/1340 1221/1537/1340
f 1219/1541/1341 1328/1529/1341 1220/1540/1341
f 1157/1542/1342 1329/1532/1342 1219/1541/1342
f 1329/1532/1343 1181/1543/1343 1332/1544/1343
f 1181/1543/1344 1335/1536/1344 1332/1544/1344
f 1335/1536/1345 1155/1545/1345 1209/1546/1345
f 1334/1534/1346 1209/1546/1346 1208/1547/1346
f 1333/1535/1347 1208/1547/1347 1207/1548/1347
f 1254/1549/1348 1207/1548/1348 1173/1550/1348
f 1253/1551/1349 1333/1535/1349 1254/1549/1349
f 1327/1531/1350 1253/1551/1350 1252/1538/1350
f 1336/1552/1351 1340/1553/1351 1339/1554/1351
f 1337/1555/1352 1341/1556/1352 1340/1553/1352
f 1340/1553/1353 1342/1557/1353 1339/1554/1353
f 1341/1556/1354 1343/1558/1354 1340/1553/1354
f 1161/1459/1355 1336/1552/1355 1198/1475/1355
f 1191/1559/1356 1337/1555/1356 1336/1552/1356
f 1190/1560/1357 1338/1561/1357 1337/1555/1357
f 1189/1562/1358 1246/1563/1358 1338/1561/1358
f 1338/1561/1359 1247/1564/1359 1341/1556/1359
f 1247/1564/1360 1344/1565/1360 1341/1556/1360
f 1248/1566/1361 1239/1567/1361 1344/1565/1361
f 1344/1565/1362 1238/1568/1362 1343/1558/1362
f 1343/1558/1363 1237/1569/1363 1342/1557/1363
f 1342/1557/1364 1159/1478/1364 1200/1477/1364
f 1199/1570/1365 1342/1557/1365 1200/1477/1365
f 1336/1552/1366 1199/1570/1366 1198/1475/1366
f 1346/1571/1367 1348/1572/1367 1345/1573/1367
f 1347/1574/1367 1349/1575/1367 1346/1571/1367
f 1349/1575/1368 1351/1576/1368 1348/1572/1368
f 1350/1577/1368 1352/1578/1368 1349/1575/1368
f 1227/1579/1369 1246/1563/1369 1167/1580/1369
f 1226/1581/1369 1345/1573/1369 1227/1579/1369
f 1226/1581/1369 1347/1574/1369 1346/1571/1369
f 1166/1582/1369 1347/1574/1369 1225/1583/1369
f 1249/1584/1367 1350/1577/1367 1347/1574/1367
f 1250/1585/1368 1353/1586/1368 1350/1577/1368
f 1251/1587/1370 1242/1588/1370 1353/1586/1370
f 1353/1586/1370 1241/1589/1370 1352/1578/1370
f 1352/1578/1370 1240/1590/1370 1351/1576/1370
f 1351/1576/1370 1171/1591/1370 1248/1566/1370
f 1348/1572/1368 1248/1566/1368 1247/1564/1368
f 1345/1573/1367 1247/1564/1367 1246/1563/1367
f 1355/1592/1367 1357/1593/1367 1354/1594/1367
f 1356/1595/1367 1358/1596/1367 1355/1592/1367
f 1358/1596/1368 1360/1597/1368 1357/1593/1368
f 1358/1596/1368 1362/1598/1368 1361/1599/1368
f 1166/1582/1369 1354/1594/1369 1249/1584/1369
f 1223/1600/1369 1354/1594/1369 1224/1601/1369
f 1222/1602/1369 1355/1592/1369 1223/1600/1369
f 1165/1539/1369 1356/1595/1369 1222/1602/1369
f 1252/1538/1367 1359/1603/1367 1356/1595/1367
f 1253/1551/1368 1362/1598/1368 1359/1603/1368
f 1254/1549/1370 1245/1604/1370 1362/1598/1370
f 1362/1598/1370 1244/1605/1370 1361/1599/1370
f 1361/1599/1370 1243/1606/1370 1360/1597/1370
f 1360/1597/1370 1172/1607/1370 1251/1587/1370
f 1357/1593/1368 1251/1587/1368 1250/1585/1368
f 1354/1594/1367 1250/1585/1367 1249/1584/1367
f 1363/1608/1371 1367/1609/1371 1366/1610/1371
f 1364/1611/1372 1368/1612/1372 1367/1609/1372
f 1367/1609/1373 1369/1613/1373 1366/1610/1373
f 1368/1612/1374 1370/1614/1374 1367/1609/1374
f 1156/1615/1375 1363/1608/1375 1174/1616/1375
f 1188/1617/1376 1364/1611/1376 1363/1608/1376
f 1187/1618/1377 1365/1619/1377 1364/1611/1377
f 1186/1620/1378 1255/1621/1378 1365/1619/1378
f 1365/1619/1379 1256/1622/1379 1368/1612/1379
f 1256/1622/1380 1371/1623/1380 1368/1612/1380
f 1257/1624/1381 1230/1625/1381 1371/1623/1381
f 1371/1623/1382 1229/1626/1382 1370/1614/1382
f 1370/1614/1383 1228/1627/1383 1369/1613/1383
f 1369/1613/1384 1154/1628/1384 1176/1629/1384
f 1175/1630/1385 1369/1613/1385 1176/1629/1385
f 1363/1608/1386 1175/1630/1386 1174/1616/1386
f 1373/1631/1387 1375/1632/1387 1372/1633/1387
f 1374/1634/1387 1376/1635/1387 1373/1631/1387
f 1376/1635/1388 1378/1636/1388 1375/1632/1388
f 1377/1637/1388 1379/1638/1388 1376/1635/1388
f 1164/1639/1389 1372/1633/1389 1255/1621/1389
f 1217/1640/1389 1372/1633/1389 1218/1641/1389
f 1217/1640/1389 1374/1634/1389 1373/1631/1389
f 1163/1642/1389 1374/1634/1389 1216/1643/1389
f 1258/1644/1387 1377/1637/1387 1374/1634/1387
f 1259/1645/1388 1380/1646/1388 1377/1637/1388
f 1260/1647/1390 1233/1648/1390 1380/1646/1390
f 1380/1646/1390 1232/1649/1390 1379/1638/1390
f 1379/1638/1390 1231/1650/1390 1378/1636/1390
f 1378/1636/1390 1168/1651/1390 1257/1624/1390
f 1375/1632/1388 1257/1624/1388 1256/1622/1388
f 1372/1633/1387 1256/1622/1387 1255/1621/1387
f 1382/1652/1387 1384/1653/1387 1381/1654/1387
f 1383/1655/1387 1385/1656/1387 1382/1652/1387
f 1385/1656/1388 1387/1657/1388 1384/1653/1388
f 1385/1656/1388 1389/1658/1388 1388/1659/1388
f 1163/1642/1389 1381/1654/1389 1258/1644/1389
f 1214/1660/1389 1381/1654/1389 1215/1661/1389
f 1213/1662/1389 1382/1652/1389 1214/1660/1389
f 1162/1519/1389 1383/1655/1389 1213/1662/1389
f 1261/1518/1387 1386/1663/1387 1383/1655/1387
f 1262/1528/1388 1389/1658/1388 1386/1663/1388
f 1263/1526/1390 1236/1664/1390 1389/1658/1390
f 1389/1658/1390 1235/1665/1390 1388/1659/1390
f 1388/1659/1390 1234/1666/1390 1387/1657/1390
f 1387/1657/1390 1169/1667/1390 1260/1647/1390
f 1384/1653/1388 1260/1647/1388 1259/1645/1388
f 1381/1654/1387 1259/1645/1387 1258/1644/1387
f 1390/1668/1391 1394/1669/1391 1393/1670/1391
f 1392/1671/1392 1394/1669/1392 1391/1672/1392
f 1393/1670/1393 1397/1673/1393 1396/1674/1393
f 1395/1675/1394 1397/1673/1394 1394/1669/1394
f 1158/1482/1395 1390/1668/1395 1206/1676/1395
f 1202/1480/1396 1390/1668/1396 1203/1481/1396
f 1202/1480/1397 1392/1671/1397 1391/1672/1397
f 1159/1478/1398 1392/1671/1398 1201/1479/1398
f 1237/1677/1399 1395/1675/1399 1392/1671/1399
f 1238/1678/1400 1398/1679/1400 1395/1675/1400
f 1239/1680/1401 1264/1681/1401 1398/1679/1401
f 1398/1679/1402 1265/1682/1402 1397/1673/1402
f 1396/1674/1403 1265/1682/1403 1266/1683/1403
f 1204/1684/1404 1266/1683/1404 1170/1685/1404
f 1205/1686/1405 1396/1674/1405 1204/1684/1405
f 1206/1676/1406 1393/1670/1406 1205/1686/1406
f 1400/1687/1407 1402/1688/1407 1399/1689/1407
f 1401/1690/1408 1403/1691/1408 1400/1687/1408
f 1403/1691/1407 1405/1692/1407 1402/1688/1407
f 1404/1693/1408 1406/1694/1408 1403/1691/1408
f 1266/1683/1409 1236/1695/1409 1170/1685/1409
f 1265/1682/1407 1399/1689/1407 1266/1683/1407
f 1264/1681/1408 1400/1687/1408 1265/1682/1408
f 1171/1696/1410 1401/1690/1410 1264/1681/1410
f 1240/1697/1410 1404/1693/1410 1401/1690/1410
f 1404/1693/1410 1242/1698/1410 1407/1699/1410
f 1242/1698/1410 1267/1700/1410 1407/1699/1410
f 1407/1699/1408 1268/1701/1408 1406/1694/1408
f 1406/1694/1407 1269/1702/1407 1405/1692/1407
f 1405/1692/1409 1169/1703/1409 1234/1704/1409
f 1402/1688/1411 1234/1704/1411 1235/1705/1411
f 1399/1689/1409 1235/1705/1409 1236/1695/1409
f 1409/1706/1407 1411/1707/1407 1408/1708/1407
f 1410/1709/1408 1412/1710/1408 1409/1706/1408
f 1412/1710/1407 1414/1711/1407 1411/1707/1407
f 1413/1712/1408 1415/1713/1408 1412/1710/1408
f 1269/1702/1411 1233/1714/1411 1169/1703/1411
f 1268/1701/1407 1408/1708/1407 1269/1702/1407
f 1267/1700/1408 1409/1706/1408 1268/1701/1408
f 1172/1715/1410 1410/1709/1410 1267/1700/1410
f 1243/1716/1410 1413/1712/1410 1410/1709/1410
f 1244/1717/1410 1416/1718/1410 1413/1712/1410
f 1245/1719/1410 1270/1498/1410 1416/1718/1410
f 1416/1718/1408 1271/1496/1408 1415/1713/1408
f 1415/1713/1407 1272/1493/1407 1414/1711/1407
f 1414/1711/1409 1168/1495/1409 1231/1720/1409
f 1411/1707/1409 1231/1720/1409 1232/1721/1409
f 1233/1714/1409 1411/1707/1409 1232/1721/1409
f 1417/1722/1412 1421/1723/1412 1420/1724/1412
f 1419/1725/1413 1421/1723/1413 1418/1726/1413
f 1420/1724/1414 1424/1727/1414 1423/1728/1414
f 1422/1729/1415 1424/1727/1415 1421/1723/1415
f 1156/1439/1416 1417/1722/1416 1188/1730/1416
f 1184/1437/1417 1417/1722/1417 1185/1438/1417
f 1184/1437/1418 1419/1725/1418 1418/1726/1418
f 1157/1435/1419 1419/1725/1419 1183/1436/1419
f 1219/1731/1420 1422/1729/1420 1419/1725/1420
f 1220/1732/1421 1425/1733/1421 1422/1729/1421
f 1221/1734/1422 1273/1735/1422 1425/1733/1422
f 1425/1733/1423 1274/1736/1423 1424/1727/1423
f 1423/1728/1424 1274/1736/1424 1275/1737/1424
f 1186/1738/1425 1275/1737/1425 1164/1739/1425
f 1187/1740/1426 1423/1728/1426 1186/1738/1426
f 1188/1730/1427 1420/1724/1427 1187/1740/1427
f 1427/1741/1428 1429/1742/1428 1426/1743/1428
f 1428/1744/1429 1430/1745/1429 1427/1741/1429
f 1430/1745/1428 1432/1746/1428 1429/1742/1428
f 1431/1747/1429 1433/1748/1429 1430/1745/1429
f 1275/1737/1430 1218/1749/1430 1164/1739/1430
f 1274/1736/1428 1426/1743/1428 1275/1737/1428
f 1273/1735/1429 1427/1741/1429 1274/1736/1429
f 1165/1750/1431 1428/1744/1431 1273/1735/1431
f 1428/1744/1431 1223/1751/1431 1431/1747/1431
f 1223/1751/1431 1434/1752/1431 1431/1747/1431
f 1224/1753/1431 1276/1754/1431 1434/1752/1431
f 1434/1752/1429 1277/1755/1429 1433/1748/1429
f 1433/1748/1428 1278/1756/1428 1432/1746/1428
f 1432/1746/1430 1163/1757/1430 1216/1758/1430
f 1217/1759/1430 1432/1746/1430 1216/1758/1430
f 1426/1743/1430 1217/1759/1430 1218/1749/1430
f 1436/1760/1428 1438/1761/1428 1435/1762/1428
f 1437/1763/1429 1439/1764/1429 1436/1760/1429
f 1439/1764/1428 1441/1765/1428 1438/1761/1428
f 1440/1766/1429 1442/1767/1429 1439/1764/1429
f 1163/1757/1430 1435/1762/1430 1215/1768/1430
f 1277/1755/1428 1435/1762/1428 1278/1756/1428
f 1276/1754/1429 1436/1760/1429 1277/1755/1429
f 1166/1769/1431 1437/1763/1431 1276/1754/1431
f 1437/1763/1431 1226/1770/1431 1440/1766/1431
f 1226/1770/1431 1443/1771/1431 1440/1766/1431
f 1227/1772/1431 1279/1455/1431 1443/1771/1431
f 1443/1771/1429 1280/1453/1429 1442/1767/1429
f 1442/1767/1428 1281/1450/1428 1441/1765/1428
f 1213/1773/1430 1281/1450/1430 1162/1452/1430
f 1438/1761/1430 1213/1773/1430 1214/1774/1430
f 1435/1762/1430 1214/1774/1430 1215/1768/1430
f 1282/1418/1432 1283/1422/1432 1286/1419/1432
f 1284/1421/1433 1287/1433/1433 1286/1419/1433
f 1286/1419/1434 1289/1425/1434 1288/1423/1434
f 1286/1419/1435 1287/1433/1435 1290/1424/1435
f 1154/1426/1436 1177/1428/1436 1282/1418/1436
f 1177/1428/1437 1178/1430/1437 1283/1422/1437
f 1179/1429/1438 1284/1421/1438 1283/1422/1438
f 1155/1431/1439 1180/1432/1439 1284/1421/1439
f 1180/1432/1263 1181/1775/1263 1287/1433/1263
f 1287/1433/1264 1181/1775/1264 1182/1434/1264
f 1290/1424/1440 1182/1434/1440 1157/1435/1440
f 1289/1425/1266 1290/1424/1266 1183/1436/1266
f 1289/1425/1441 1184/1437/1441 1185/1438/1441
f 1288/1423/1442 1185/1438/1442 1156/1439/1442
f 1285/1420/1269 1288/1423/1269 1174/1440/1269
f 1176/1427/1443 1282/1418/1443 1285/1420/1443
f 1292/1442/1444 1295/1446/1444 1294/1443/1444
f 1292/1442/1445 1293/1454/1445 1296/1445/1445
f 1295/1446/1446 1298/1449/1446 1297/1447/1446
f 1295/1446/1447 1296/1445/1447 1299/1448/1447
f 1281/1450/1448 1291/1444/1448 1212/1451/1448
f 1280/1453/1449 1292/1442/1449 1291/1444/1449
f 1280/1453/1450 1279/1455/1450 1293/1454/1450
f 1279/1455/1451 1167/1776/1451 1189/1456/1451
f 1293/1454/1452 1189/1456/1452 1190/1457/1452
f 1296/1445/1453 1190/1457/1453 1191/1458/1453
f 1299/1448/1454 1191/1458/1454 1161/1459/1454
f 1299/1448/1455 1192/1460/1455 1193/1461/1455
f 1297/1447/1456 1298/1449/1456 1193/1461/1456
f 1297/1447/1457 1194/1462/1457 1160/1463/1457
f 1294/1443/1458 1297/1447/1458 1210/1464/1458
f 1291/1444/1459 1294/1443/1459 1211/1465/1459
f 1300/1466/1460 1301/1470/1460 1304/1467/1460
f 1302/1469/1461 1305/1476/1461 1304/1467/1461
f 1304/1467/1462 1307/1473/1462 1306/1471/1462
f 1304/1467/1463 1305/1476/1463 1308/1472/1463
f 1160/1463/1464 1194/1462/1464 1300/1466/1464
f 1194/1462/1465 1193/1461/1465 1301/1470/1465
f 1192/1460/1466 1302/1469/1466 1301/1470/1466
f 1161/1459/1467 1198/1475/1467 1302/1469/1467
f 1198/1475/1468 1199/1570/1468 1305/1476/1468
f 1305/1476/1296 1199/1570/1296 1200/1477/1296
f 1308/1472/1469 1200/1477/1469 1159/1478/1469
f 1307/1473/1470 1308/1472/1470 1201/1479/1470
f 1307/1473/1299 1202/1480/1299 1203/1481/1299
f 1306/1471/1471 1203/1481/1471 1158/1482/1471
f 1303/1468/1301 1306/1471/1301 1195/1483/1301
f 1197/1474/1302 1300/1466/1302 1303/1468/1302
f 1310/1485/1472 1313/1489/1472 1312/1486/1472
f 1310/1485/1473 1311/1497/1473 1314/1488/1473
f 1313/1489/1474 1316/1492/1474 1315/1490/1474
f 1313/1489/1475 1314/1488/1475 1317/1491/1475
f 1272/1493/1476 1309/1487/1476 1230/1494/1476
f 1271/1496/1477 1310/1485/1477 1309/1487/1477
f 1271/1496/1478 1270/1498/1478 1311/1497/1478
f 1270/1498/1479 1173/1777/1479 1207/1499/1479
f 1311/1497/1480 1207/1499/1480 1208/1500/1480
f 1314/1488/1481 1208/1500/1481 1209/1501/1481
f 1317/1491/1482 1209/1501/1482 1155/1502/1482
f 1317/1491/1483 1179/1503/1483 1178/1504/1483
f 1315/1490/1484 1316/1492/1484 1178/1504/1484
f 1315/1490/1485 1177/1505/1485 1154/1506/1485
f 1312/1486/1486 1315/1490/1486 1228/1507/1486
f 1309/1487/1487 1312/1486/1487 1229/1508/1487
f 1319/1509/1488 1322/1513/1488 1321/1510/1488
f 1320/1512/1489 1323/1522/1489 1322/1513/1489
f 1321/1510/1490 1322/1513/1490 1325/1514/1490
f 1322/1513/1491 1323/1522/1491 1326/1516/1491
f 1212/1517/1492 1318/1511/1492 1261/1518/1492
f 1211/1520/1493 1319/1509/1493 1318/1511/1493
f 1210/1521/1494 1320/1512/1494 1319/1509/1494
f 1160/1463/1495 1197/1474/1495 1320/1512/1495
f 1320/1512/1496 1197/1474/1496 1196/1484/1496
f 1196/1484/1497 1195/1483/1497 1326/1516/1497
f 1326/1516/1498 1195/1483/1498 1158/1482/1498
f 1325/1514/1499 1326/1516/1499 1206/1523/1499
f 1324/1515/1500 1325/1514/1500 1205/1524/1500
f 1263/1526/1501 1324/1515/1501 1204/1525/1501
f 1262/1528/1502 1321/1510/1502 1324/1515/1502
f 1318/1511/1503 1321/1510/1503 1262/1528/1503
f 1328/1529/1504 1331/1533/1504 1330/1530/1504
f 1329/1532/1505 1332/1544/1505 1331/1533/1505
f 1330/1530/1506 1331/1533/1506 1334/1534/1506
f 1331/1533/1507 1332/1544/1507 1335/1536/1507
f 1221/1537/1508 1327/1531/1508 1252/1538/1508
f 1220/1540/1509 1328/1529/1509 1327/1531/1509
f 1219/1541/1510 1329/1532/1510 1328/1529/1510
f 1157/1542/1511 1182/1778/1511 1329/1532/1511
f 1329/1532/1512 1182/1778/1512 1181/1543/1512
f 1181/1543/1513 1180/1779/1513 1335/1536/1513
f 1335/1536/1514 1180/1779/1514 1155/1545/1514
f 1334/1534/1515 1335/1536/1515 1209/1546/1515
f 1333/1535/1516 1334/1534/1516 1208/1547/1516
f 1254/1549/1517 1333/1535/1517 1207/1548/1517
f 1253/1551/1518 1330/1530/1518 1333/1535/1518
f 1327/1531/1519 1330/1530/1519 1253/1551/1519
f 1336/1552/1520 1337/1555/1520 1340/1553/1520
f 1337/1555/1521 1338/1561/1521 1341/1556/1521
f 1340/1553/1522 1343/1558/1522 1342/1557/1522
f 1341/1556/1523 1344/1565/1523 1343/1558/1523
f 1161/1459/1524 1191/1559/1524 1336/1552/1524
f 1191/1559/1525 1190/1560/1525 1337/1555/1525
f 1190/1560/1526 1189/1562/1526 1338/1561/1526
f 1189/1562/1527 1167/1580/1527 1246/1563/1527
f 1338/1561/1528 1246/1563/1528 1247/1564/1528
f 1247/1564/1529 1248/1566/1529 1344/1565/1529
f 1248/1566/1530 1171/1591/1530 1239/1567/1530
f 1344/1565/1531 1239/1567/1531 1238/1568/1531
f 1343/1558/1532 1238/1568/1532 1237/1569/1532
f 1342/1557/1533 1237/1569/1533 1159/1478/1533
f 1199/1570/1534 1339/1554/1534 1342/1557/1534
f 1336/1552/1535 1339/1554/1535 1199/1570/1535
f 1346/1571/1367 1349/1575/1367 1348/1572/1367
f 1347/1574/1367 1350/1577/1367 1349/1575/1367
f 1349/1575/1368 1352/1578/1368 1351/1576/1368
f 1350/1577/1368 1353/1586/1368 1352/1578/1368
f 1227/1579/1369 1345/1573/1369 1246/1563/1369
f 1226/1581/1369 1346/1571/1369 1345/1573/1369
f 1226/1581/1369 1225/1583/1369 1347/1574/1369
f 1166/1582/1369 1249/1584/1369 1347/1574/1369
f 1249/1584/1367 1250/1585/1367 1350/1577/1367
f 1250/1585/1368 1251/1587/1368 1353/1586/1368
f 1251/1587/1370 1172/1607/1370 1242/1588/1370
f 1353/1586/1370 1242/1588/1370 1241/1589/1370
f 1352/1578/1370 1241/1589/1370 1240/1590/1370
f 1351/1576/1370 1240/1590/1370 1171/1591/1370
f 1348/1572/1368 1351/1576/1368 1248/1566/1368
f 1345/1573/1367 1348/1572/1367 1247/1564/1367
f 1355/1592/1367 1358/1596/1367 1357/1593/1367
f 1356/1595/1367 1359/1603/1367 1358/1596/1367
f 1358/1596/1368 1361/1599/1368 1360/1597/1368
f 1358/1596/1368 1359/1603/1368 1362/1598/1368
f 1166/1582/1369 1224/1601/1369 1354/1594/1369
f 1223/1600/1369 1355/1592/1369 1354/1594/1369
f 1222/1602/1369 1356/1595/1369 1355/1592/1369
f 1165/1539/1369 1252/1538/1369 1356/1595/1369
f 1252/1538/1367 1253/1551/1367 1359/1603/1367
f 1253/1551/1368 1254/1549/1368 1362/1598/1368
f 1254/1549/1370 1173/1550/1370 1245/1604/1370
f 1362/1598/1370 1245/1604/1370 1244/1605/1370
f 1361/1599/1370 1244/1605/1370 1243/1606/1370
f 1360/1597/1370 1243/1606/1370 1172/1607/1370
f 1357/1593/1368 1360/1597/1368 1251/1587/1368
f 1354/1594/1367 1357/1593/1367 1250/1585/1367
f 1363/1608/1536 1364/1611/1536 1367/1609/1536
f 1364/1611/1537 1365/1619/1537 1368/1612/1537
f 1367/1609/1538 1370/1614/1538 1369/1613/1538
f 1368/1612/1539 1371/1623/1539 1370/1614/1539
f 1156/1615/1540 1188/1617/1540 1363/1608/1540
f 1188/1617/1541 1187/1618/1541 1364/1611/1541
f 1187/1618/1542 1186/1620/1542 1365/1619/1542
f 1186/1620/1543 1164/1639/1543 1255/1621/1543
f 1365/1619/1544 1255/1621/1544 1256/1622/1544
f 1256/1622/1545 1257/1624/1545 1371/1623/1545
f 1257/1624/1546 1168/1651/1546 1230/1625/1546
f 1371/1623/1547 1230/1625/1547 1229/1626/1547
f 1370/1614/1548 1229/1626/1548 1228/1627/1548
f 1369/1613/1549 1228/1627/1549 1154/1628/1549
f 1175/1630/1550 1366/1610/1550 1369/1613/1550
f 1363/1608/1551 1366/1610/1551 1175/1630/1551
f 1373/1631/1387 1376/1635/1387 1375/1632/1387
f 1374/1634/1387 1377/1637/1387 1376/1635/1387
f 1376/1635/1388 1379/1638/1388 1378/1636/1388
f 1377/1637/1388 1380/1646/1388 1379/1638/1388
f 1164/1639/1389 1218/1641/1389 1372/1633/1389
f 1217/1640/1389 1373/1631/1389 1372/1633/1389
f 1217/1640/1389 1216/1643/1389 1374/1634/1389
f 1163/1642/1389 1258/1644/1389 1374/1634/1389
f 1258/1644/1387 1259/1645/1387 1377/1637/1387
f 1259/1645/1388 1260/1647/1388 1380/1646/1388
f 1260/1647/1390 1169/1667/1390 1233/1648/1390
f 1380/1646/1390 1233/1648/1390 1232/1649/1390
f 1379/1638/1390 1232/1649/1390 1231/1650/1390
f 1378/1636/1390 1231/1650/1390 1168/1651/1390
f 1375/1632/1388 1378/1636/1388 1257/1624/1388
f 1372/1633/1387 1375/1632/1387 1256/1622/1387
f 1382/1652/1387 1385/1656/1387 1384/1653/1387
f 1383/1655/1387 1386/1663/1387 1385/1656/1387
f 1385/1656/1388 1388/1659/1388 1387/1657/1388
f 1385/1656/1388 1386/1663/1388 1389/1658/1388
f 1163/1642/1389 1215/1661/1389 1381/1654/1389
f 1214/1660/1389 1382/1652/1389 1381/1654/1389
f 1213/1662/1389 1383/1655/1389 1382/1652/1389
f 1162/1519/1389 1261/1518/1389 1383/1655/1389
f 1261/1518/1387 1262/1528/1387 1386/1663/1387
f 1262/1528/1388 1263/1526/1388 1389/1658/1388
f 1263/1526/1390 1170/1527/1390 1236/1664/1390
f 1389/1658/1390 1236/1664/1390 1235/1665/1390
f 1388/1659/1390 1235/1665/1390 1234/1666/1390
f 1387/1657/1390 1234/1666/1390 1169/1667/1390
f 1384/1653/1388 1387/1657/1388 1260/1647/1388
f 1381/1654/1387 1384/1653/1387 1259/1645/1387
f 1390/1668/1552 1391/1672/1552 1394/1669/1552
f 1392/1671/1553 1395/1675/1553 1394/1669/1553
f 1393/1670/1554 1394/1669/1554 1397/1673/1554
f 1395/1675/1555 1398/1679/1555 1397/1673/1555
f 1158/1482/1556 1203/1481/1556 1390/1668/1556
f 1202/1480/1557 1391/1672/1557 1390/1668/1557
f 1202/1480/1558 1201/1479/1558 1392/1671/1558
f 1159/1478/1559 1237/1677/1559 1392/1671/1559
f 1237/1677/1560 1238/1678/1560 1395/1675/1560
f 1238/1678/1561 1239/1680/1561 1398/1679/1561
f 1239/1680/1562 1171/1696/1562 1264/1681/1562
f 1398/1679/1563 1264/1681/1563 1265/1682/1563
f 1396/1674/1564 1397/1673/1564 1265/1682/1564
f 1204/1684/1565 1396/1674/1565 1266/1683/1565
f 1205/1686/1566 1393/1670/1566 1396/1674/1566
f 1206/1676/1567 1390/1668/1567 1393/1670/1567
f 1400/1687/1407 1403/1691/1407 1402/1688/1407
f 1401/1690/1408 1404/1693/1408 1403/1691/1408
f 1403/1691/1407 1406/1694/1407 1405/1692/1407
f 1404/1693/1408 1407/1699/1408 1406/1694/1408
f 1266/1683/1409 1399/1689/1409 1236/1695/1409
f 1265/1682/1407 1400/1687/1407 1399/1689/1407
f 1264/1681/1408 1401/1690/1408 1400/1687/1408
f 1171/1696/1410 1240/1697/1410 1401/1690/1410
f 1240/1697/1410 1241/1780/1410 1404/1693/1410
f 1404/1693/1410 1241/1780/1410 1242/1698/1410
f 1242/1698/1410 1172/1715/1410 1267/1700/1410
f 1407/1699/1408 1267/1700/1408 1268/1701/1408
f 1406/1694/1407 1268/1701/1407 1269/1702/1407
f 1405/1692/1409 1269/1702/1409 1169/1703/1409
f 1402/1688/1409 1405/1692/1409 1234/1704/1409
f 1399/1689/1409 1402/1688/1409 1235/1705/1409
f 1409/1706/1407 1412/1710/1407 1411/1707/1407
f 1410/1709/1408 1413/1712/1408 1412/1710/1408
f 1412/1710/1407 1415/1713/1407 1414/1711/1407
f 1413/1712/1408 1416/1718/1408 1415/1713/1408
f 1269/1702/1409 1408/1708/1409 1233/1714/1409
f 1268/1701/1407 1409/1706/1407 1408/1708/1407
f 1267/1700/1408 1410/1709/1408 1409/1706/1408
f 1172/1715/1410 1243/1716/1410 1410/1709/1410
f 1243/1716/1410 1244/1717/1410 1413/1712/1410
f 1244/1717/1410 1245/1719/1410 1416/1718/1410
f 1245/1719/1410 1173/1777/1410 1270/1498/1410
f 1416/1718/1408 1270/1498/1408 1271/1496/1408
f 1415/1713/1407 1271/1496/1407 1272/1493/1407
f 1414/1711/1409 1272/1493/1409 1168/1495/1409
f 1411/1707/1409 1414/1711/1409 1231/1720/1409
f 1233/1714/1409 1408/1708/1409 1411/1707/1409
f 1417/1722/1568 1418/1726/1568 1421/1723/1568
f 1419/1725/1569 1422/1729/1569 1421/1723/1569
f 1420/1724/1570 1421/1723/1570 1424/1727/1570
f 1422/1729/1571 1425/1733/1571 1424/1727/1571
f 1156/1439/1572 1185/1438/1572 1417/1722/1572
f 1184/1437/1573 1418/1726/1573 1417/1722/1573
f 1184/1437/1574 1183/1436/1574 1419/1725/1574
f 1157/1435/1575 1219/1731/1575 1419/1725/1575
f 1219/1731/1576 1220/1732/1576 1422/1729/1576
f 1220/1732/1577 1221/1734/1577 1425/1733/1577
f 1221/1734/1578 1165/1750/1578 1273/1735/1578
f 1425/1733/1579 1273/1735/1579 1274/1736/1579
f 1423/1728/1580 1424/1727/1580 1274/1736/1580
f 1186/1738/1581 1423/1728/1581 1275/1737/1581
f 1187/1740/1582 1420/1724/1582 1423/1728/1582
f 1188/1730/1583 1417/1722/1583 1420/1724/1583
f 1427/1741/1428 1430/1745/1428 1429/1742/1428
f 1428/1744/1429 1431/1747/1429 1430/1745/1429
f 1430/1745/1428 1433/1748/1428 1432/1746/1428
f 1431/1747/1429 1434/1752/1429 1433/1748/1429
f 1275/1737/1430 1426/1743/1430 1218/1749/1430
f 1274/1736/1428 1427/1741/1428 1426/1743/1428
f 1273/1735/1429 1428/1744/1429 1427/1741/1429
f 1165/1750/1431 1222/1781/1431 1428/1744/1431
f 1428/1744/1431 1222/1781/1431 1223/1751/1431
f 1223/1751/1431 1224/1753/1431 1434/1752/1431
f 1224/1753/1431 1166/1769/1431 1276/1754/1431
f 1434/1752/1429 1276/1754/1429 1277/1755/1429
f 1433/1748/1428 1277/1755/1428 1278/1756/1428
f 1432/1746/1430 1278/1756/1430 1163/1757/1430
f 1217/1759/1430 1429/1742/1430 1432/1746/1430
f 1426/1743/1430 1429/1742/1430 1217/1759/1430
f 1436/1760/1428 1439/1764/1428 1438/1761/1428
f 1437/1763/1429 1440/1766/1429 1439/1764/1429
f 1439/1764/1428 1442/1767/1428 1441/1765/1428
f 1440/1766/1429 1443/1771/1429 1442/1767/1429
f 1163/1757/1430 1278/1756/1430 1435/1762/1430
f 1277/1755/1428 1436/1760/1428 1435/1762/1428
f 1276/1754/1429 1437/1763/1429 1436/1760/1429
f 1166/1769/1431 1225/1782/1431 1437/1763/1431
f 1437/1763/1431 1225/1782/1431 1226/1770/1431
f 1226/1770/1431 1227/1772/1431 1443/1771/1431
f 1227/1772/1431 1167/1776/1431 1279/1455/1431
f 1443/1771/1429 1279/1455/1429 1280/1453/1429
f 1442/1767/1428 1280/1453/1428 1281/1450/1428
f 1213/1773/1430 1441/1765/1430 1281/1450/1430
f 1438/1761/1430 1441/1765/1430 1213/1773/1430
f 1435/1762/1430 1438/1761/1430 1214/1774/1430
o Cube.005_Cube.001
v -0.441926 -0.057000 0.390238
v -0.402723 -0.135524 0.402745
v -0.368815 -0.014001 0.431045
v -0.329613 -0.092525 0.443552
v -0.356076 -0.041278 0.219858
v -0.316874 -0.119802 0.232365
v -0.282966 0.001722 0.260665
v -0.243763 -0.076802 0.273172
v -0.295711 0.018606 0.301891
v -0.320239 0.014114 0.350571
v -0.344768 0.009622 0.399251
v -0.292498 -0.095076 0.415927
v -0.267969 -0.090584 0.367247
v -0.243441 -0.086092 0.318567
v -0.442249 -0.047711 0.344843
v -0.417720 -0.043218 0.296163
v -0.393192 -0.038726 0.247483
v -0.340922 -0.143425 0.264159
v -0.365450 -0.147917 0.312839
v -0.389978 -0.152409 0.361519
v -0.387436 -0.017182 0.423610
v -0.411010 -0.027143 0.411938
v -0.430813 -0.042693 0.399399
v -0.440181 -0.076268 0.391581
v -0.431679 -0.101220 0.394688
v -0.416922 -0.122856 0.399001
v -0.385892 -0.132671 0.413731
v -0.363217 -0.122874 0.427186
v -0.342515 -0.107159 0.437941
v -0.333147 -0.073585 0.445760
v -0.342547 -0.048797 0.444436
v -0.356406 -0.026996 0.438339
v -0.350924 0.008083 0.411128
v -0.357224 0.004049 0.421243
v -0.363630 -0.004230 0.428068
v -0.237806 -0.084860 0.306399
v -0.235159 -0.082974 0.294245
v -0.237762 -0.079994 0.282392
v -0.245508 -0.057534 0.271829
v -0.254010 -0.032583 0.268722
v -0.268767 -0.010946 0.264409
v -0.343174 -0.026643 0.225469
v -0.322473 -0.010928 0.236224
v -0.299798 -0.001132 0.249679
v -0.254877 -0.091109 0.264011
v -0.274679 -0.106659 0.251472
v -0.298253 -0.116621 0.239800
v -0.329283 -0.106806 0.225071
v -0.343142 -0.085005 0.218974
v -0.352542 -0.060218 0.217650
v -0.386756 -0.037748 0.235696
v -0.378497 -0.037637 0.226205
v -0.367259 -0.039035 0.220922
v -0.395893 -0.153081 0.373597
v -0.400499 -0.151044 0.385127
v -0.402727 -0.144346 0.395438
v -0.282963 0.010544 0.267972
v -0.285191 0.017241 0.278283
v -0.289797 0.019278 0.289813
v -0.301843 0.017483 0.314061
v -0.307975 0.016360 0.326231
v -0.314107 0.015237 0.338401
v -0.326371 0.012991 0.362741
v -0.332503 0.011868 0.374911
v -0.338636 0.010745 0.387081
v -0.318430 -0.094768 0.442488
v -0.307192 -0.096166 0.437205
v -0.298934 -0.096055 0.427714
v -0.286366 -0.093953 0.403757
v -0.280233 -0.092830 0.391587
v -0.274101 -0.091707 0.379417
v -0.261837 -0.089461 0.355077
v -0.255705 -0.088338 0.342907
v -0.249573 -0.087215 0.330737
v -0.447927 -0.053808 0.381018
v -0.450530 -0.050829 0.369165
v -0.447883 -0.048943 0.357011
v -0.436116 -0.046588 0.332673
v -0.429984 -0.045465 0.320503
v -0.423852 -0.044341 0.308333
v -0.411588 -0.042095 0.283993
v -0.405456 -0.040972 0.271823
v -0.399324 -0.039849 0.259653
v -0.322059 -0.129573 0.235342
v -0.328465 -0.137851 0.242167
v -0.334765 -0.141886 0.252282
v -0.347054 -0.144548 0.276329
v -0.353186 -0.145671 0.288499
v -0.359318 -0.146794 0.300669
v -0.371582 -0.149040 0.325009
v -0.377714 -0.150163 0.337179
v -0.383846 -0.151286 0.349349
v -0.258176 -0.113805 0.308585
v -0.282380 -0.134389 0.294490
v -0.311486 -0.145158 0.278831
v -0.282704 -0.118297 0.357265
v -0.306909 -0.138881 0.343170
v -0.336014 -0.149651 0.327511
v -0.307233 -0.122789 0.405945
v -0.331437 -0.143374 0.391850
v -0.360542 -0.154143 0.376191
v -0.374204 0.011356 0.384579
v -0.403309 0.000587 0.368920
v -0.427514 -0.019998 0.354825
v -0.349675 0.015848 0.335899
v -0.378780 0.005079 0.320240
v -0.402985 -0.015506 0.306145
v -0.325147 0.020340 0.287219
v -0.354252 0.009571 0.271560
v -0.378457 -0.011014 0.257465
v -0.366472 -0.127766 0.252729
v -0.385334 -0.101825 0.245619
v -0.395057 -0.070509 0.243610
v -0.391001 -0.132259 0.301409
v -0.409863 -0.106318 0.294299
v -0.419586 -0.075002 0.292290
v -0.415529 -0.136751 0.350089
v -0.434391 -0.110810 0.342979
v -0.444114 -0.079494 0.340970
v -0.290632 -0.063293 0.419800
v -0.300355 -0.031977 0.417791
v -0.319217 -0.006036 0.410681
v -0.266103 -0.058801 0.371120
v -0.275827 -0.027485 0.369111
v -0.294689 -0.001544 0.362001
v -0.241575 -0.054309 0.322440
v -0.251298 -0.022993 0.320431
v -0.270160 0.002948 0.313321
v -0.424086 -0.064046 0.405613
v -0.413582 -0.089720 0.410161
v -0.399298 -0.113697 0.413522
v -0.402286 -0.049236 0.419588
v -0.389933 -0.075525 0.425159
v -0.376166 -0.101555 0.427921
v -0.377858 -0.036856 0.431416
v -0.364870 -0.061071 0.437349
v -0.353069 -0.086508 0.439324
v -0.264370 0.003706 0.301167
v -0.245599 -0.022092 0.308224
v -0.235939 -0.053242 0.310237
v -0.260634 0.002274 0.289110
v -0.242498 -0.022524 0.295794
v -0.233281 -0.052513 0.297836
v -0.261107 -0.002948 0.277119
v -0.244534 -0.025604 0.282836
v -0.236165 -0.052909 0.285076
v -0.286391 -0.020105 0.249888
v -0.272107 -0.044082 0.253249
v -0.261603 -0.069757 0.257797
v -0.309523 -0.032248 0.235489
v -0.295756 -0.058277 0.238251
v -0.283403 -0.084566 0.243822
v -0.332620 -0.047294 0.224086
v -0.320819 -0.072732 0.226061
v -0.307832 -0.096946 0.231994
v -0.449750 -0.080560 0.353173
v -0.440090 -0.111711 0.355186
v -0.421319 -0.137509 0.362243
v -0.452408 -0.081289 0.365574
v -0.443191 -0.111279 0.367617
v -0.425056 -0.136077 0.374300
v -0.449525 -0.080894 0.378334
v -0.441155 -0.108199 0.380574
v -0.424582 -0.130854 0.386291
v -0.319070 0.021008 0.275202
v -0.313324 0.018940 0.264099
v -0.307761 0.011808 0.254973
v -0.348011 0.010301 0.259610
v -0.341119 0.008670 0.248980
v -0.332885 0.002341 0.240897
v -0.372092 -0.010177 0.245607
v -0.364335 -0.011062 0.235627
v -0.354277 -0.015550 0.229010
v -0.313597 -0.123625 0.417803
v -0.321354 -0.122741 0.427783
v -0.331412 -0.118252 0.434400
v -0.337678 -0.144103 0.403800
v -0.344571 -0.142472 0.414430
v -0.352804 -0.136144 0.422513
v -0.366619 -0.154810 0.388208
v -0.372366 -0.152743 0.399311
v -0.377929 -0.145611 0.408437
v -0.250250 -0.103388 0.273321
v -0.249189 -0.109524 0.284560
v -0.252453 -0.112427 0.296455
v -0.271208 -0.121200 0.260575
v -0.272272 -0.129232 0.270944
v -0.276518 -0.132902 0.282419
v -0.296766 -0.130746 0.247358
v -0.300200 -0.139526 0.256088
v -0.305476 -0.143612 0.266860
v -0.264308 -0.114928 0.320755
v -0.270440 -0.116051 0.332925
v -0.276572 -0.117174 0.345095
v -0.288513 -0.135512 0.306660
v -0.294645 -0.136635 0.318830
v -0.300777 -0.137758 0.331000
v -0.317618 -0.146281 0.291001
v -0.323750 -0.147405 0.303171
v -0.329882 -0.148528 0.315341
v -0.288836 -0.119420 0.369435
v -0.294968 -0.120543 0.381605
v -0.301101 -0.121666 0.393775
v -0.313041 -0.140004 0.355340
v -0.319173 -0.141127 0.367510
v -0.325305 -0.142251 0.379680
v -0.342146 -0.150774 0.339681
v -0.348278 -0.151897 0.351851
v -0.354410 -0.153020 0.364021
v -0.388923 -0.003056 0.416052
v -0.385490 0.005724 0.407322
v -0.380214 0.009810 0.396550
v -0.414481 -0.012602 0.402836
v -0.413417 -0.004571 0.392466
v -0.409171 -0.000900 0.380991
v -0.435440 -0.030414 0.390089
v -0.436501 -0.024278 0.378850
v -0.433236 -0.021375 0.366955
v -0.368072 0.012479 0.372409
v -0.361939 0.013602 0.360239
v -0.355807 0.014725 0.348069
v -0.397177 0.001710 0.356750
v -0.391045 0.002833 0.344580
v -0.384912 0.003956 0.332410
v -0.421381 -0.018875 0.342655
v -0.415249 -0.017752 0.330485
v -0.409117 -0.016629 0.318315
v -0.343543 0.016971 0.323729
v -0.337411 0.018094 0.311559
v -0.331279 0.019217 0.299389
v -0.372648 0.006202 0.308070
v -0.366516 0.007325 0.295900
v -0.360384 0.008448 0.283730
v -0.396853 -0.014383 0.293975
v -0.390721 -0.013260 0.281805
v -0.384589 -0.012137 0.269635
v -0.368362 -0.066030 0.217255
v -0.359559 -0.093255 0.218636
v -0.343419 -0.115990 0.225213
v -0.380242 -0.068073 0.222351
v -0.370893 -0.098038 0.224131
v -0.352890 -0.122861 0.231077
v -0.388606 -0.069363 0.231825
v -0.378930 -0.100510 0.233806
v -0.360175 -0.126311 0.240895
v -0.401190 -0.071632 0.255780
v -0.391466 -0.102948 0.257789
v -0.372604 -0.128889 0.264899
v -0.407322 -0.072756 0.267950
v -0.397599 -0.104071 0.269959
v -0.378736 -0.130012 0.277069
v -0.413454 -0.073879 0.280120
v -0.403731 -0.105194 0.282129
v -0.384869 -0.131135 0.289239
v -0.425718 -0.076125 0.304460
v -0.415995 -0.107441 0.306469
v -0.397133 -0.133382 0.313579
v -0.431850 -0.077248 0.316630
v -0.422127 -0.108564 0.318639
v -0.403265 -0.134505 0.325749
v -0.437982 -0.078371 0.328800
v -0.428259 -0.109687 0.330809
v -0.409397 -0.135628 0.337920
v -0.342270 -0.017813 0.438197
v -0.326130 -0.040547 0.444774
v -0.317327 -0.067773 0.446155
v -0.332800 -0.010942 0.432333
v -0.314796 -0.035764 0.439279
v -0.305447 -0.065730 0.441059
v -0.325514 -0.007492 0.422515
v -0.306759 -0.033293 0.429604
v -0.297083 -0.064440 0.431585
v -0.313085 -0.004913 0.398511
v -0.294223 -0.030854 0.405621
v -0.284500 -0.062170 0.407630
v -0.306953 -0.003790 0.386341
v -0.288091 -0.029731 0.393451
v -0.278368 -0.061047 0.395460
v -0.300821 -0.002667 0.374171
v -0.281959 -0.028608 0.381281
v -0.272235 -0.059924 0.383290
v -0.288556 -0.000421 0.349831
v -0.269694 -0.026362 0.356941
v -0.259971 -0.057678 0.358950
v -0.282424 0.000702 0.337661
v -0.263562 -0.025239 0.344771
v -0.253839 -0.056555 0.346780
v -0.276292 0.001825 0.325491
v -0.257430 -0.024116 0.332601
v -0.247707 -0.055432 0.334610
vt 0.437500 0.062276
vt 0.500000 0.122396
vt 0.437500 0.123210
vt 0.562500 0.062276
vt 0.500000 0.062174
vt 0.437500 0.180725
vt 0.562500 0.180725
vt 0.500000 0.178060
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.437500 0.000000
vt 0.562500 0.000000
vt 0.500000 0.000000
vt 0.625000 0.000000
vt 0.625000 0.062500
vt 0.562500 0.123210
vt 0.625000 0.187500
vt 0.625000 0.250000
vt 0.562500 0.228516
vt 0.500000 0.223958
vt 0.437500 0.228516
vt 0.375000 0.250000
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.453451
vt 0.437500 0.470540
vt 0.437500 0.453349
vt 0.562500 0.470540
vt 0.500000 0.471354
vt 0.437663 0.491150
vt 0.562337 0.491150
vt 0.500000 0.493815
vt 0.437500 0.437500
vt 0.375000 0.453125
vt 0.375000 0.437500
vt 0.500000 0.437500
vt 0.562500 0.453349
vt 0.562500 0.437500
vt 0.625000 0.453125
vt 0.625000 0.468750
vt 0.625000 0.484375
vt 0.625000 0.500000
vt 0.560628 0.521525
vt 0.500000 0.526042
vt 0.439372 0.521525
vt 0.375000 0.500000
vt 0.375000 0.484375
vt 0.375000 0.468750
vt 0.444214 0.569214
vt 0.500000 0.625000
vt 0.446615 0.625000
vt 0.555786 0.569214
vt 0.500000 0.571615
vt 0.444214 0.680786
vt 0.555786 0.680786
vt 0.500000 0.678385
vt 0.396525 0.564372
vt 0.603475 0.564372
vt 0.553385 0.625000
vt 0.603475 0.685628
vt 0.625000 0.750000
vt 0.560628 0.728475
vt 0.500000 0.723958
vt 0.439372 0.728475
vt 0.375000 0.750000
vt 0.396525 0.685628
vt 0.401042 0.625000
vt 0.500000 0.953125
vt 0.437500 0.968750
vt 0.437500 0.953125
vt 0.562500 0.968750
vt 0.500000 0.968750
vt 0.437500 0.984375
vt 0.562500 0.984375
vt 0.500000 0.984375
vt 0.437500 0.937500
vt 0.375000 0.953125
vt 0.375000 0.937500
vt 0.500000 0.937500
vt 0.562500 0.953125
vt 0.562500 0.937500
vt 0.625000 0.953125
vt 0.625000 0.968750
vt 0.625000 0.984375
vt 0.625000 1.000000
vt 0.562500 1.000000
vt 0.500000 1.000000
vt 0.437500 1.000000
vt 0.375000 1.000000
vt 0.375000 0.984375
vt 0.375000 0.968750
vt 0.345540 0.562500
vt 0.328451 0.625000
vt 0.328349 0.562500
vt 0.366150 0.562663
vt 0.346354 0.625000
vt 0.345540 0.687500
vt 0.328349 0.687500
vt 0.366150 0.687337
vt 0.328125 0.500000
vt 0.312500 0.562500
vt 0.312500 0.500000
vt 0.343750 0.500000
vt 0.359375 0.500000
vt 0.368815 0.625000
vt 0.359375 0.750000
vt 0.343750 0.750000
vt 0.328125 0.750000
vt 0.312500 0.687500
vt 0.312500 0.750000
vt 0.312500 0.625000
vt 0.843750 0.562500
vt 0.828125 0.625000
vt 0.828125 0.562500
vt 0.859375 0.562500
vt 0.843750 0.625000
vt 0.843750 0.687500
vt 0.828125 0.687500
vt 0.859375 0.687500
vt 0.828125 0.500000
vt 0.812500 0.562500
vt 0.812500 0.500000
vt 0.843750 0.500000
vt 0.859375 0.500000
vt 0.875000 0.500000
vt 0.875000 0.625000
vt 0.859375 0.625000
vt 0.875000 0.750000
vt 0.859375 0.750000
vt 0.843750 0.750000
vt 0.828125 0.750000
vt 0.812500 0.687500
vt 0.812500 0.750000
vt 0.812500 0.625000
vt 0.633850 0.562663
vt 0.653646 0.625000
vt 0.631185 0.625000
vt 0.654460 0.562500
vt 0.671550 0.625000
vt 0.633850 0.687337
vt 0.654460 0.687500
vt 0.640625 0.500000
vt 0.656250 0.500000
vt 0.671651 0.562500
vt 0.671875 0.500000
vt 0.687500 0.562500
vt 0.687500 0.625000
vt 0.671651 0.687500
vt 0.687500 0.687500
vt 0.671875 0.750000
vt 0.656250 0.750000
vt 0.640625 0.750000
vt 0.598958 0.625000
vt 0.718750 0.562500
vt 0.703125 0.625000
vt 0.703125 0.562500
vt 0.734375 0.562500
vt 0.718750 0.625000
vt 0.703125 0.687500
vt 0.734375 0.625000
vt 0.718750 0.687500
vt 0.703125 0.500000
vt 0.687500 0.500000
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.734375 0.500000
vt 0.750000 0.562500
vt 0.750000 0.625000
vt 0.734375 0.687500
vt 0.750000 0.687500
vt 0.734375 0.750000
vt 0.718750 0.750000
vt 0.703125 0.750000
vt 0.687500 0.750000
vt 0.781250 0.562500
vt 0.765625 0.625000
vt 0.765625 0.562500
vt 0.796875 0.562500
vt 0.781250 0.625000
vt 0.765625 0.687500
vt 0.796875 0.687500
vt 0.781250 0.687500
vt 0.781250 0.500000
vt 0.765625 0.500000
vt 0.796875 0.500000
vt 0.796875 0.625000
vt 0.796875 0.750000
vt 0.781250 0.750000
vt 0.765625 0.750000
vt 0.750000 0.750000
vt 0.140625 0.562500
vt 0.156250 0.625000
vt 0.140625 0.625000
vt 0.156250 0.562500
vt 0.171875 0.625000
vt 0.140625 0.687500
vt 0.156250 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.140625 0.500000
vt 0.156250 0.500000
vt 0.171875 0.562500
vt 0.171875 0.500000
vt 0.187500 0.562500
vt 0.187500 0.625000
vt 0.171875 0.687500
vt 0.187500 0.687500
vt 0.171875 0.750000
vt 0.156250 0.750000
vt 0.140625 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.218750 0.562500
vt 0.203125 0.625000
vt 0.203125 0.562500
vt 0.234375 0.562500
vt 0.218750 0.625000
vt 0.203125 0.687500
vt 0.234375 0.625000
vt 0.218750 0.687500
vt 0.187500 0.500000
vt 0.218750 0.500000
vt 0.203125 0.500000
vt 0.250000 0.500000
vt 0.234375 0.500000
vt 0.250000 0.562500
vt 0.250000 0.625000
vt 0.234375 0.687500
vt 0.250000 0.687500
vt 0.234375 0.750000
vt 0.218750 0.750000
vt 0.203125 0.750000
vt 0.187500 0.750000
vt 0.281250 0.562500
vt 0.265625 0.625000
vt 0.265625 0.562500
vt 0.296875 0.562500
vt 0.281250 0.625000
vt 0.265625 0.687500
vt 0.296875 0.687500
vt 0.281250 0.687500
vt 0.281250 0.500000
vt 0.265625 0.500000
vt 0.296875 0.500000
vt 0.296875 0.625000
vt 0.296875 0.750000
vt 0.281250 0.750000
vt 0.265625 0.750000
vt 0.250000 0.750000
vt 0.437663 0.758850
vt 0.500000 0.778646
vt 0.437500 0.779460
vt 0.562337 0.758850
vt 0.500000 0.756185
vt 0.500000 0.796549
vt 0.437500 0.796651
vt 0.562500 0.779460
vt 0.375000 0.765625
vt 0.625000 0.765625
vt 0.625000 0.781250
vt 0.562500 0.796651
vt 0.625000 0.796875
vt 0.562500 0.812500
vt 0.500000 0.812500
vt 0.437500 0.812500
vt 0.375000 0.796875
vt 0.375000 0.812500
vt 0.375000 0.781250
vt 0.500000 0.828125
vt 0.437500 0.843750
vt 0.437500 0.828125
vt 0.562500 0.828125
vt 0.500000 0.843750
vt 0.437500 0.859375
vt 0.562500 0.843750
vt 0.500000 0.859375
vt 0.375000 0.828125
vt 0.625000 0.812500
vt 0.625000 0.828125
vt 0.625000 0.859375
vt 0.562500 0.859375
vt 0.562500 0.875000
vt 0.500000 0.875000
vt 0.437500 0.875000
vt 0.375000 0.875000
vt 0.375000 0.859375
vt 0.375000 0.843750
vt 0.500000 0.890625
vt 0.437500 0.906250
vt 0.437500 0.890625
vt 0.562500 0.890625
vt 0.500000 0.906250
vt 0.437500 0.921875
vt 0.562500 0.906250
vt 0.500000 0.921875
vt 0.375000 0.890625
vt 0.625000 0.875000
vt 0.625000 0.890625
vt 0.625000 0.906250
vt 0.562500 0.921875
vt 0.625000 0.921875
vt 0.375000 0.921875
vt 0.375000 0.906250
vt 0.437500 0.258850
vt 0.500000 0.278646
vt 0.437500 0.279460
vt 0.562500 0.258850
vt 0.500000 0.256185
vt 0.500000 0.296549
vt 0.437500 0.296651
vt 0.562500 0.279460
vt 0.375000 0.265625
vt 0.625000 0.265625
vt 0.625000 0.281250
vt 0.562500 0.296651
vt 0.625000 0.296875
vt 0.562500 0.312500
vt 0.500000 0.312500
vt 0.437500 0.312500
vt 0.375000 0.296875
vt 0.375000 0.312500
vt 0.375000 0.281250
vt 0.500000 0.328125
vt 0.437500 0.343750
vt 0.437500 0.328125
vt 0.562500 0.328125
vt 0.500000 0.343750
vt 0.437500 0.359375
vt 0.562500 0.343750
vt 0.500000 0.359375
vt 0.375000 0.328125
vt 0.625000 0.312500
vt 0.625000 0.343750
vt 0.562500 0.359375
vt 0.625000 0.359375
vt 0.562500 0.375000
vt 0.500000 0.375000
vt 0.437500 0.375000
vt 0.375000 0.375000
vt 0.375000 0.359375
vt 0.375000 0.343750
vt 0.500000 0.390625
vt 0.437500 0.406250
vt 0.437500 0.390625
vt 0.562500 0.390625
vt 0.500000 0.406250
vt 0.437500 0.421875
vt 0.562500 0.406250
vt 0.500000 0.421875
vt 0.375000 0.390625
vt 0.625000 0.375000
vt 0.625000 0.406250
vt 0.562500 0.421875
vt 0.625000 0.421875
vt 0.375000 0.421875
vt 0.375000 0.406250
vt 0.625000 0.125000
vt 0.625000 0.437500
vt 0.625000 0.937500
vt 0.875000 0.562500
vt 0.875000 0.687500
vt 0.625000 0.843750
vt 0.625000 0.328125
vt 0.625000 0.390625
vn -0.5105 -0.0581 0.8579
vn -0.4622 -0.1529 0.8735
vn -0.4311 -0.0114 0.9022
vn -0.3843 -0.1071 0.9170
vn -0.6489 0.0170 0.7607
vn -0.6054 -0.1081 0.7885
vn -0.5517 -0.2157 0.8056
vn -0.4979 -0.3198 0.8061
vn -0.4007 -0.2735 0.8744
vn -0.3120 -0.2213 0.9240
vn -0.2130 -0.1523 0.9651
vn -0.2755 -0.0532 0.9598
vn -0.3292 0.0544 0.9427
vn -0.3903 0.1691 0.9050
vn -0.4787 0.1125 0.8708
vn -0.5674 0.0603 0.8212
vn 0.7748 0.6096 0.1677
vn 0.9279 0.3002 0.2211
vn 0.8115 0.5234 -0.2600
vn 0.9391 0.2705 -0.2119
vn 0.4202 0.8711 0.2542
vn 0.6996 0.6105 0.3714
vn 0.8578 0.2933 0.4221
vn 0.9064 -0.1032 0.4096
vn 0.9753 -0.0842 0.2041
vn 0.9750 -0.0358 -0.2192
vn 0.8113 0.0328 -0.5837
vn 0.7706 0.1862 -0.6095
vn 0.6917 0.3454 -0.6342
vn 0.5822 0.5185 -0.6263
vn 0.5945 0.7269 -0.3437
vn 0.5054 0.8615 0.0497
vn 0.4634 0.1524 -0.8729
vn 0.5102 0.0567 -0.8582
vn 0.3840 0.1057 -0.9173
vn 0.4323 0.0109 -0.9017
vn 0.4767 0.3279 -0.8156
vn 0.5517 0.2157 -0.8057
vn 0.6055 0.1081 -0.7885
vn 0.6543 0.0066 -0.7562
vn 0.5674 -0.0603 -0.8212
vn 0.4786 -0.1125 -0.8708
vn 0.3694 -0.1610 -0.9152
vn 0.3292 -0.0544 -0.9427
vn 0.2755 0.0533 -0.9598
vn 0.2181 0.1757 -0.9600
vn 0.3120 0.2214 -0.9239
vn 0.4007 0.2735 -0.8744
vn -0.9284 -0.3019 -0.2167
vn -0.7726 -0.6113 -0.1715
vn -0.9365 -0.2728 0.2201
vn -0.8102 -0.5288 0.2530
vn -0.9065 0.1031 -0.4093
vn -0.8579 -0.2934 -0.4218
vn -0.6994 -0.6105 -0.3716
vn -0.4199 -0.8712 -0.2544
vn -0.5025 -0.8630 -0.0532
vn -0.5930 -0.7294 0.3411
vn -0.5925 -0.4710 0.6535
vn -0.6912 -0.3454 0.6348
vn -0.7709 -0.1868 0.6090
vn -0.8388 -0.0045 0.5444
vn -0.9743 0.0338 0.2225
vn -0.9764 0.0819 -0.2000
vn -0.1982 0.9400 -0.2776
vn 0.0333 0.7972 -0.6028
vn -0.4392 0.7983 -0.4121
vn -0.1667 0.6796 -0.7144
vn 0.0250 0.9974 0.0677
vn 0.1209 0.9847 -0.1257
vn 0.2903 0.8300 -0.4763
vn 0.3998 0.5846 -0.7060
vn 0.2710 0.5019 -0.8213
vn 0.1409 0.4254 -0.8940
vn -0.0220 0.3364 -0.9414
vn -0.3282 0.4662 -0.8215
vn -0.6392 0.5376 -0.5499
vn -0.7586 0.5365 -0.3697
vn -0.5459 0.8038 -0.2363
vn -0.2984 0.9494 -0.0982
vn 0.4392 -0.7983 0.4121
vn 0.1667 -0.6796 0.7144
vn 0.1982 -0.9400 0.2776
vn -0.0333 -0.7972 0.6028
vn 0.7586 -0.5365 0.3697
vn 0.6392 -0.5376 0.5499
vn 0.3282 -0.4662 0.8215
vn 0.0220 -0.3364 0.9414
vn -0.1409 -0.4254 0.8940
vn -0.2710 -0.5019 0.8213
vn -0.3999 -0.5846 0.7060
vn -0.2903 -0.8300 0.4763
vn -0.1209 -0.9847 0.1257
vn -0.0250 -0.9974 -0.0677
vn 0.2984 -0.9494 0.0982
vn 0.5459 -0.8038 0.2363
vn 0.7221 -0.5808 -0.3757
vn 0.6501 -0.7598 -0.0025
vn 0.5160 -0.7021 -0.4908
vn 0.4065 -0.9031 -0.1385
vn 0.7576 -0.1884 -0.6249
vn 0.8874 -0.3641 -0.2826
vn 0.8604 -0.4966 0.1150
vn 0.7880 -0.5310 0.3114
vn 0.5742 -0.7986 0.1805
vn 0.3265 -0.9442 0.0423
vn 0.0042 -0.9920 -0.1260
vn 0.0972 -0.9454 -0.3110
vn 0.2653 -0.7300 -0.6298
vn 0.3980 -0.3999 -0.8257
vn 0.5311 -0.3556 -0.7691
vn 0.6606 -0.2794 -0.6968
vn 0.5604 -0.8016 0.2084
vn 0.3127 -0.9472 0.0702
vn 0.7738 -0.5341 0.3406
vn -0.0103 -0.9952 -0.0970
vn -0.5160 0.7021 0.4908
vn -0.4065 0.9031 0.1385
vn -0.7221 0.5808 0.3757
vn -0.6501 0.7598 0.0025
vn -0.3980 0.3999 0.8257
vn -0.2653 0.7300 0.6298
vn -0.0972 0.9454 0.3110
vn -0.0042 0.9920 0.1260
vn -0.3265 0.9442 -0.0423
vn -0.5742 0.7986 -0.1805
vn -0.7880 0.5310 -0.3114
vn -0.8604 0.4966 -0.1150
vn -0.8874 0.3641 0.2826
vn -0.7576 0.1884 0.6249
vn -0.6606 0.2794 0.6968
vn -0.5311 0.3556 0.7691
vn -0.3127 0.9472 -0.0702
vn -0.5604 0.8016 -0.2084
vn 0.0103 0.9952 0.0970
vn -0.7738 0.5341 -0.3406
vn -0.3640 -0.1680 -0.9161
vn -0.2434 -0.4250 -0.8719
vn -0.7057 -0.2611 -0.6586
vn -0.5537 -0.5712 -0.6059
vn -0.0665 0.1370 -0.9883
vn 0.0391 -0.0385 -0.9985
vn 0.1191 -0.1970 -0.9731
vn 0.2022 -0.3254 -0.9237
vn -0.0285 -0.6260 -0.7793
vn -0.2793 -0.8221 -0.4962
vn -0.3899 -0.8657 -0.3140
vn -0.6697 -0.6051 -0.4305
vn -0.8279 -0.2879 -0.4812
vn -0.8763 0.1086 -0.4694
vn -0.7493 0.1235 -0.6506
vn -0.4073 0.1377 -0.9028
vn -0.8434 -0.2908 -0.4518
vn -0.6850 -0.6081 -0.4013
vn -0.8919 0.1059 -0.4396
vn -0.4053 -0.8689 -0.2844
vn -0.4053 -0.8688 -0.2844
vn -0.8434 -0.2909 -0.4518
vn 0.2390 0.4185 0.8762
vn 0.3723 0.1667 0.9130
vn 0.5521 0.5688 0.6096
vn 0.7090 0.2601 0.6554
vn -0.1901 0.3771 0.9065
vn -0.1182 0.1970 0.9732
vn -0.0396 0.0378 0.9985
vn 0.0166 -0.1127 0.9935
vn 0.4105 -0.1392 0.9012
vn 0.7522 -0.1251 0.6470
vn 0.8764 -0.1087 0.4692
vn 0.8281 0.2879 0.4810
vn 0.6697 0.6050 0.4307
vn 0.3899 0.8656 0.3143
vn 0.2783 0.8199 0.5003
vn 0.0275 0.6231 0.7817
vn 0.6850 0.6081 0.4013
vn 0.8434 0.2908 0.4518
vn 0.4053 0.8689 0.2844
vn 0.8919 -0.1059 0.4396
vn 0.8434 0.2909 0.4518
vn 0.4053 0.8688 0.2844
vn -0.5102 -0.0567 0.8582
vn -0.4634 -0.1524 0.8729
vn -0.4323 -0.0109 0.9017
vn -0.3840 -0.1057 0.9173
vn -0.6543 -0.0066 0.7562
vn -0.6055 -0.1081 0.7885
vn -0.5517 -0.2157 0.8057
vn -0.4767 -0.3279 0.8156
vn -0.3120 -0.2214 0.9239
vn -0.2181 -0.1757 0.9600
vn -0.2755 -0.0533 0.9598
vn -0.3694 0.1610 0.9152
vn -0.4786 0.1125 0.8708
vn 0.7726 0.6113 0.1715
vn 0.9284 0.3019 0.2167
vn 0.8102 0.5288 -0.2530
vn 0.9365 0.2728 -0.2201
vn 0.4199 0.8712 0.2544
vn 0.6994 0.6105 0.3716
vn 0.8579 0.2934 0.4218
vn 0.9065 -0.1031 0.4093
vn 0.9764 -0.0819 0.2000
vn 0.9743 -0.0338 -0.2225
vn 0.8388 0.0045 -0.5444
vn 0.7709 0.1868 -0.6090
vn 0.6912 0.3454 -0.6348
vn 0.5925 0.4710 -0.6535
vn 0.5930 0.7294 -0.3411
vn 0.5025 0.8630 0.0532
vn 0.4622 0.1529 -0.8735
vn 0.5105 0.0581 -0.8579
vn 0.3843 0.1071 -0.9170
vn 0.4311 0.0114 -0.9022
vn 0.4979 0.3198 -0.8061
vn 0.5517 0.2157 -0.8056
vn 0.6054 0.1081 -0.7885
vn 0.6489 -0.0170 -0.7607
vn 0.4787 -0.1125 -0.8708
vn 0.3903 -0.1691 -0.9050
vn 0.2755 0.0532 -0.9598
vn 0.2130 0.1523 -0.9651
vn 0.3120 0.2213 -0.9240
vn -0.9279 -0.3002 -0.2211
vn -0.7748 -0.6096 -0.1677
vn -0.9391 -0.2705 0.2119
vn -0.8115 -0.5234 0.2600
vn -0.9064 0.1032 -0.4096
vn -0.8578 -0.2933 -0.4221
vn -0.6996 -0.6105 -0.3714
vn -0.4202 -0.8711 -0.2542
vn -0.5054 -0.8615 -0.0497
vn -0.5945 -0.7269 0.3437
vn -0.5822 -0.5185 0.6263
vn -0.6917 -0.3454 0.6342
vn -0.7706 -0.1862 0.6095
vn -0.8113 -0.0328 0.5837
vn -0.9750 0.0358 0.2192
vn -0.9753 0.0842 -0.2041
vn -0.1990 0.9411 -0.2734
vn 0.0316 0.8023 -0.5960
vn -0.4426 0.7978 -0.4094
vn -0.1745 0.6811 -0.7111
vn 0.0250 0.9974 0.0680
vn 0.1206 0.9853 -0.1211
vn 0.2904 0.8318 -0.4731
vn 0.3917 0.5445 -0.7417
vn 0.2703 0.5024 -0.8213
vn 0.1408 0.4262 -0.8936
vn 0.0321 0.3330 -0.9424
vn -0.3317 0.4659 -0.8203
vn -0.6426 0.5364 -0.5470
vn -0.7588 0.5364 -0.3695
vn -0.5461 0.8037 -0.2361
vn -0.2985 0.9494 -0.0979
vn 0.4426 -0.7978 0.4094
vn 0.1745 -0.6811 0.7111
vn 0.1990 -0.9411 0.2734
vn -0.0316 -0.8023 0.5960
vn 0.7588 -0.5364 0.3695
vn 0.6426 -0.5364 0.5470
vn 0.3317 -0.4659 0.8203
vn -0.0321 -0.3330 0.9424
vn -0.1408 -0.4262 0.8936
vn -0.2703 -0.5024 0.8213
vn -0.3917 -0.5445 0.7417
vn -0.2904 -0.8318 0.4731
vn -0.1206 -0.9853 0.1211
vn -0.0250 -0.9974 -0.0680
vn 0.2985 -0.9494 0.0979
vn 0.5461 -0.8037 0.2361
vn 0.7200 -0.5783 -0.3837
vn 0.6503 -0.7596 -0.0069
vn 0.5200 -0.6959 -0.4953
vn 0.4093 -0.9014 -0.1414
vn 0.7895 -0.1959 -0.5817
vn 0.8864 -0.3640 -0.2862
vn 0.8607 -0.4971 0.1103
vn 0.7881 -0.5311 0.3112
vn 0.5742 -0.7986 0.1802
vn 0.3267 -0.9442 0.0421
vn 0.0045 -0.9920 -0.1262
vn 0.1006 -0.9441 -0.3139
vn 0.2678 -0.7278 -0.6314
vn 0.3676 -0.4440 -0.8171
vn 0.5306 -0.3551 -0.7696
vn 0.6607 -0.2786 -0.6970
vn -0.5200 0.6959 0.4953
vn -0.4093 0.9014 0.1414
vn -0.7200 0.5783 0.3837
vn -0.6503 0.7596 0.0069
vn -0.3676 0.4440 0.8171
vn -0.2678 0.7278 0.6314
vn -0.1006 0.9441 0.3139
vn -0.0045 0.9920 0.1262
vn -0.3267 0.9442 -0.0421
vn -0.5742 0.7986 -0.1802
vn -0.7881 0.5311 -0.3112
vn -0.8607 0.4971 -0.1103
vn -0.8864 0.3640 0.2862
vn -0.7895 0.1959 0.5817
vn -0.6607 0.2786 0.6970
vn -0.5306 0.3551 0.7696
vn -0.3723 -0.1667 -0.9130
vn -0.2390 -0.4185 -0.8762
vn -0.7090 -0.2601 -0.6554
vn -0.5521 -0.5688 -0.6096
vn -0.0166 0.1127 -0.9935
vn 0.0396 -0.0378 -0.9985
vn 0.1182 -0.1970 -0.9732
vn 0.1901 -0.3771 -0.9065
vn -0.0275 -0.6231 -0.7817
vn -0.2783 -0.8199 -0.5003
vn -0.3899 -0.8656 -0.3143
vn -0.6697 -0.6050 -0.4307
vn -0.8281 -0.2879 -0.4810
vn -0.8764 0.1087 -0.4691
vn -0.7522 0.1251 -0.6470
vn -0.4105 0.1392 -0.9012
vn 0.2434 0.4250 0.8719
vn 0.3640 0.1680 0.9161
vn 0.5537 0.5712 0.6059
vn 0.7057 0.2611 0.6586
vn -0.2022 0.3254 0.9237
vn -0.1191 0.1970 0.9731
vn -0.0391 0.0385 0.9985
vn 0.0665 -0.1370 0.9883
vn 0.4073 -0.1377 0.9028
vn 0.7493 -0.1235 0.6506
vn 0.8763 -0.1086 0.4694
vn 0.8279 0.2879 0.4812
vn 0.6697 0.6051 0.4305
vn 0.3899 0.8657 0.3140
vn 0.2793 0.8221 0.4962
vn 0.0285 0.6260 0.7793
usemtl Material.002
s off
f 1572/1783/1584 1576/1784/1584 1575/1785/1584
f 1574/1786/1585 1576/1784/1585 1573/1787/1585
f 1576/1784/1586 1578/1788/1586 1575/1785/1586
f 1576/1784/1587 1580/1789/1587 1579/1790/1587
f 1444/1791/1588 1572/1783/1588 1466/1792/1588
f 1467/1793/1589 1573/1787/1589 1572/1783/1589
f 1469/1794/1590 1573/1787/1590 1468/1795/1590
f 1445/1796/1591 1574/1786/1591 1469/1794/1591
f 1470/1797/1592 1577/1798/1592 1574/1786/1592
f 1577/1798/1593 1472/1799/1593 1580/1789/1593
f 1580/1789/1594 1447/1800/1594 1473/1801/1594
f 1579/1790/1595 1473/1801/1595 1474/1802/1595
f 1579/1790/1596 1475/1803/1596 1578/1788/1596
f 1578/1788/1597 1446/1804/1597 1464/1805/1597
f 1575/1785/1598 1464/1805/1598 1465/1806/1598
f 1466/1792/1599 1575/1785/1599 1465/1806/1599
f 1582/1807/1600 1584/1808/1600 1581/1809/1600
f 1582/1807/1601 1586/1810/1601 1585/1811/1601
f 1585/1811/1602 1587/1812/1602 1584/1808/1602
f 1585/1811/1603 1589/1813/1603 1588/1814/1603
f 1571/1815/1604 1502/1816/1604 1452/1817/1604
f 1570/1818/1605 1581/1809/1605 1571/1815/1605
f 1570/1818/1606 1583/1819/1606 1582/1807/1606
f 1569/1820/1607 1479/1821/1607 1583/1819/1607
f 1583/1819/1608 1480/1822/1608 1586/1810/1608
f 1586/1810/1609 1481/1823/1609 1589/1813/1609
f 1589/1813/1610 1451/1824/1610 1482/1825/1610
f 1589/1813/1611 1483/1826/1611 1588/1814/1611
f 1587/1812/1612 1483/1826/1612 1484/1827/1612
f 1587/1812/1613 1450/1828/1613 1500/1829/1613
f 1584/1808/1614 1500/1829/1614 1501/1830/1614
f 1581/1809/1615 1501/1830/1615 1502/1816/1615
f 1590/1831/1616 1594/1832/1616 1593/1833/1616
f 1592/1834/1617 1594/1832/1617 1591/1835/1617
f 1594/1832/1618 1596/1836/1618 1593/1833/1618
f 1594/1832/1619 1598/1837/1619 1597/1838/1619
f 1450/1828/1620 1590/1831/1620 1487/1839/1620
f 1484/1827/1621 1591/1835/1621 1590/1831/1621
f 1482/1825/1622 1591/1835/1622 1483/1826/1622
f 1451/1824/1623 1592/1834/1623 1482/1825/1623
f 1488/1840/1624 1595/1841/1624 1592/1834/1624
f 1595/1841/1625 1490/1842/1625 1598/1837/1625
f 1598/1837/1626 1449/1843/1626 1491/1844/1626
f 1597/1838/1627 1491/1844/1627 1492/1845/1627
f 1597/1838/1628 1493/1846/1628 1596/1836/1628
f 1596/1836/1629 1448/1847/1629 1485/1848/1629
f 1593/1833/1630 1485/1848/1630 1486/1849/1630
f 1487/1839/1631 1593/1833/1631 1486/1849/1631
f 1600/1850/1632 1602/1851/1632 1599/1852/1632
f 1600/1850/1633 1604/1853/1633 1603/1854/1633
f 1603/1854/1634 1605/1855/1634 1602/1851/1634
f 1603/1854/1635 1607/1856/1635 1606/1857/1635
f 1562/1858/1636 1520/1859/1636 1458/1860/1636
f 1561/1861/1637 1599/1852/1637 1562/1858/1637
f 1561/1861/1638 1601/1862/1638 1600/1850/1638
f 1560/1863/1639 1497/1864/1639 1601/1862/1639
f 1601/1862/1640 1498/1865/1640 1604/1853/1640
f 1604/1853/1641 1499/1866/1641 1607/1856/1641
f 1607/1856/1642 1445/1867/1642 1469/1868/1642
f 1607/1856/1643 1468/1869/1643 1606/1857/1643
f 1605/1855/1644 1468/1869/1644 1467/1870/1644
f 1605/1855/1645 1444/1871/1645 1518/1872/1645
f 1602/1851/1646 1518/1872/1646 1519/1873/1646
f 1599/1852/1647 1519/1873/1647 1520/1859/1647
f 1609/1874/1648 1611/1875/1648 1608/1876/1648
f 1610/1877/1649 1612/1878/1649 1609/1874/1649
f 1611/1875/1650 1615/1879/1650 1614/1880/1650
f 1612/1878/1651 1616/1881/1651 1615/1879/1651
f 1502/1882/1652 1551/1883/1652 1452/1884/1652
f 1501/1885/1653 1608/1876/1653 1502/1882/1653
f 1500/1886/1654 1609/1874/1654 1501/1885/1654
f 1450/1828/1655 1610/1877/1655 1500/1886/1655
f 1610/1877/1656 1486/1849/1656 1613/1887/1656
f 1486/1849/1657 1616/1881/1657 1613/1887/1657
f 1616/1881/1658 1448/1847/1658 1496/1888/1658
f 1615/1879/1659 1496/1888/1659 1495/1889/1659
f 1614/1880/1660 1495/1889/1660 1494/1890/1660
f 1553/1891/1661 1494/1890/1661 1460/1892/1661
f 1552/1893/1662 1614/1880/1662 1553/1891/1662
f 1608/1876/1663 1552/1893/1663 1551/1883/1663
f 1618/1894/1664 1620/1895/1664 1617/1896/1664
f 1619/1897/1665 1621/1898/1665 1618/1894/1665
f 1620/1895/1666 1624/1899/1666 1623/1900/1666
f 1621/1898/1667 1625/1901/1667 1624/1899/1667
f 1511/1902/1668 1542/1903/1668 1455/1904/1668
f 1510/1905/1669 1617/1896/1669 1511/1902/1669
f 1509/1906/1670 1618/1894/1670 1510/1905/1670
f 1447/1907/1671 1619/1897/1671 1509/1906/1671
f 1619/1897/1672 1471/1908/1672 1622/1909/1672
f 1471/1908/1673 1625/1901/1673 1622/1909/1673
f 1625/1901/1674 1445/1910/1674 1499/1911/1674
f 1624/1899/1675 1499/1911/1675 1498/1912/1675
f 1623/1900/1676 1498/1912/1676 1497/1913/1676
f 1544/1914/1677 1497/1913/1677 1463/1915/1677
f 1543/1916/1678 1623/1900/1678 1544/1914/1678
f 1617/1896/1679 1543/1916/1679 1542/1903/1679
f 1626/1917/1680 1630/1918/1680 1629/1919/1680
f 1627/1920/1681 1631/1921/1681 1630/1918/1681
f 1630/1918/1682 1632/1922/1682 1629/1919/1682
f 1631/1921/1683 1633/1923/1683 1630/1918/1683
f 1451/1824/1684 1626/1917/1684 1488/1840/1684
f 1481/1924/1685 1627/1920/1685 1626/1917/1685
f 1480/1925/1686 1628/1926/1686 1627/1920/1686
f 1479/1927/1687 1536/1928/1687 1628/1926/1687
f 1628/1926/1688 1537/1929/1688 1631/1921/1688
f 1537/1929/1689 1634/1930/1689 1631/1921/1689
f 1538/1931/1690 1529/1932/1690 1634/1930/1690
f 1634/1930/1691 1528/1933/1691 1633/1923/1691
f 1633/1923/1692 1527/1934/1692 1632/1922/1692
f 1632/1922/1693 1449/1843/1693 1490/1842/1693
f 1489/1935/1694 1632/1922/1694 1490/1842/1694
f 1626/1917/1695 1489/1935/1695 1488/1840/1695
f 1636/1936/1696 1638/1937/1696 1635/1938/1696
f 1637/1939/1696 1639/1940/1696 1636/1936/1696
f 1639/1940/1697 1641/1941/1697 1638/1937/1697
f 1640/1942/1697 1642/1943/1697 1639/1940/1697
f 1517/1944/1698 1536/1928/1698 1457/1945/1698
f 1516/1946/1698 1635/1938/1698 1517/1944/1698
f 1516/1946/1698 1637/1939/1698 1636/1936/1698
f 1456/1947/1698 1637/1939/1698 1515/1948/1698
f 1539/1949/1696 1640/1942/1696 1637/1939/1696
f 1540/1950/1697 1643/1951/1697 1640/1942/1697
f 1541/1952/1699 1532/1953/1699 1643/1951/1699
f 1643/1951/1699 1531/1954/1699 1642/1943/1699
f 1642/1943/1699 1530/1955/1699 1641/1941/1699
f 1641/1941/1699 1461/1956/1699 1538/1931/1699
f 1638/1937/1697 1538/1931/1697 1537/1929/1697
f 1635/1938/1696 1537/1929/1696 1536/1928/1696
f 1645/1957/1696 1647/1958/1696 1644/1959/1696
f 1646/1960/1696 1648/1961/1696 1645/1957/1696
f 1648/1961/1697 1650/1962/1697 1647/1958/1697
f 1648/1961/1697 1652/1963/1697 1651/1964/1697
f 1456/1947/1698 1644/1959/1698 1539/1949/1698
f 1513/1965/1698 1644/1959/1698 1514/1966/1698
f 1512/1967/1698 1645/1957/1698 1513/1965/1698
f 1455/1904/1698 1646/1960/1698 1512/1967/1698
f 1542/1903/1696 1649/1968/1696 1646/1960/1696
f 1543/1916/1697 1652/1963/1697 1649/1968/1697
f 1544/1914/1699 1535/1969/1699 1652/1963/1699
f 1652/1963/1699 1534/1970/1699 1651/1964/1699
f 1651/1964/1699 1533/1971/1699 1650/1962/1699
f 1650/1962/1699 1462/1972/1699 1541/1952/1699
f 1647/1958/1697 1541/1952/1697 1540/1950/1697
f 1644/1959/1696 1540/1950/1696 1539/1949/1696
f 1653/1973/1700 1657/1974/1700 1656/1975/1700
f 1654/1976/1701 1658/1977/1701 1657/1974/1701
f 1657/1974/1702 1659/1978/1702 1656/1975/1702
f 1658/1977/1703 1660/1979/1703 1657/1974/1703
f 1446/1980/1704 1653/1973/1704 1464/1981/1704
f 1478/1982/1705 1654/1976/1705 1653/1973/1705
f 1477/1983/1706 1655/1984/1706 1654/1976/1706
f 1476/1985/1707 1545/1986/1707 1655/1984/1707
f 1655/1984/1708 1546/1987/1708 1658/1977/1708
f 1546/1987/1709 1661/1988/1709 1658/1977/1709
f 1547/1989/1710 1520/1990/1710 1661/1988/1710
f 1661/1988/1711 1519/1991/1711 1660/1979/1711
f 1660/1979/1712 1518/1992/1712 1659/1978/1712
f 1659/1978/1713 1444/1993/1713 1466/1994/1713
f 1465/1995/1714 1659/1978/1714 1466/1994/1714
f 1653/1973/1715 1465/1995/1715 1464/1981/1715
f 1663/1996/1716 1665/1997/1716 1662/1998/1716
f 1664/1999/1716 1666/2000/1716 1663/1996/1716
f 1666/2000/1717 1668/2001/1717 1665/1997/1717
f 1667/2002/1717 1669/2003/1717 1666/2000/1717
f 1454/2004/1718 1662/1998/1718 1545/1986/1718
f 1507/2005/1718 1662/1998/1718 1508/2006/1718
f 1507/2005/1718 1664/1999/1718 1663/1996/1718
f 1453/2007/1718 1664/1999/1718 1506/2008/1718
f 1548/2009/1716 1667/2002/1716 1664/1999/1716
f 1549/2010/1717 1670/2011/1717 1667/2002/1717
f 1550/2012/1719 1523/2013/1719 1670/2011/1719
f 1670/2011/1719 1522/2014/1719 1669/2003/1719
f 1669/2003/1719 1521/2015/1719 1668/2001/1719
f 1668/2001/1719 1458/2016/1719 1547/1989/1719
f 1665/1997/1717 1547/1989/1717 1546/1987/1717
f 1662/1998/1716 1546/1987/1716 1545/1986/1716
f 1672/2017/1716 1674/2018/1716 1671/2019/1716
f 1673/2020/1716 1675/2021/1716 1672/2017/1716
f 1675/2021/1717 1677/2022/1717 1674/2018/1717
f 1675/2021/1717 1679/2023/1717 1678/2024/1717
f 1453/2007/1718 1671/2019/1718 1548/2009/1718
f 1504/2025/1718 1671/2019/1718 1505/2026/1718
f 1503/2027/1718 1672/2017/1718 1504/2025/1718
f 1452/1884/1718 1673/2020/1718 1503/2027/1718
f 1551/1883/1716 1676/2028/1716 1673/2020/1716
f 1552/1893/1717 1679/2023/1717 1676/2028/1717
f 1553/1891/1719 1526/2029/1719 1679/2023/1719
f 1679/2023/1719 1525/2030/1719 1678/2024/1719
f 1678/2024/1719 1524/2031/1719 1677/2022/1719
f 1677/2022/1719 1459/2032/1719 1550/2012/1719
f 1674/2018/1717 1550/2012/1717 1549/2010/1717
f 1671/2019/1716 1549/2010/1716 1548/2009/1716
f 1680/2033/1720 1684/2034/1720 1683/2035/1720
f 1682/2036/1721 1684/2034/1721 1681/2037/1721
f 1683/2035/1722 1687/2038/1722 1686/2039/1722
f 1685/2040/1723 1687/2038/1723 1684/2034/1723
f 1448/1847/1724 1680/2033/1724 1496/2041/1724
f 1492/1845/1725 1680/2033/1725 1493/1846/1725
f 1492/1845/1726 1682/2036/1726 1681/2037/1726
f 1449/1843/1727 1682/2036/1727 1491/1844/1727
f 1527/2042/1728 1685/2040/1728 1682/2036/1728
f 1528/2043/1729 1688/2044/1729 1685/2040/1729
f 1529/2045/1730 1554/2046/1730 1688/2044/1730
f 1688/2044/1731 1555/2047/1731 1687/2038/1731
f 1686/2039/1732 1555/2047/1732 1556/2048/1732
f 1494/2049/1733 1556/2048/1733 1460/2050/1733
f 1495/2051/1734 1686/2039/1734 1494/2049/1734
f 1496/2041/1735 1683/2035/1735 1495/2051/1735
f 1690/2052/1736 1692/2053/1736 1689/2054/1736
f 1691/2055/1737 1693/2056/1737 1690/2052/1737
f 1693/2056/1736 1695/2057/1736 1692/2053/1736
f 1694/2058/1737 1696/2059/1737 1693/2056/1737
f 1556/2048/1738 1526/2060/1738 1460/2050/1738
f 1555/2047/1736 1689/2054/1736 1556/2048/1736
f 1554/2046/1737 1690/2052/1737 1555/2047/1737
f 1461/2061/1739 1691/2055/1739 1554/2046/1739
f 1530/2062/1739 1694/2058/1739 1691/2055/1739
f 1694/2058/1740 1532/2063/1740 1697/2064/1740
f 1532/2063/1739 1557/2065/1739 1697/2064/1739
f 1697/2064/1737 1558/2066/1737 1696/2059/1737
f 1696/2059/1736 1559/2067/1736 1695/2057/1736
f 1695/2057/1738 1459/2068/1738 1524/2069/1738
f 1692/2053/1738 1524/2069/1738 1525/2070/1738
f 1689/2054/1738 1525/2070/1738 1526/2060/1738
f 1699/2071/1736 1701/2072/1736 1698/2073/1736
f 1700/2074/1737 1702/2075/1737 1699/2071/1737
f 1702/2075/1741 1704/2076/1741 1701/2072/1741
f 1703/2077/1737 1705/2078/1737 1702/2075/1737
f 1559/2067/1738 1523/2079/1738 1459/2068/1738
f 1558/2066/1736 1698/2073/1736 1559/2067/1736
f 1557/2065/1737 1699/2071/1737 1558/2066/1737
f 1462/2080/1739 1700/2074/1739 1557/2065/1739
f 1533/2081/1739 1703/2077/1739 1700/2074/1739
f 1534/2082/1739 1706/2083/1739 1703/2077/1739
f 1535/2084/1739 1560/1863/1739 1706/2083/1739
f 1706/2083/1737 1561/1861/1737 1705/2078/1737
f 1705/2078/1736 1562/1858/1736 1704/2076/1736
f 1704/2076/1738 1458/1860/1738 1521/2085/1738
f 1701/2072/1738 1521/2085/1738 1522/2086/1738
f 1523/2079/1738 1701/2072/1738 1522/2086/1738
f 1707/2087/1742 1711/2088/1742 1710/2089/1742
f 1709/2090/1743 1711/2088/1743 1708/2091/1743
f 1710/2089/1744 1714/2092/1744 1713/2093/1744
f 1712/2094/1745 1714/2092/1745 1711/2088/1745
f 1446/1804/1746 1707/2087/1746 1478/2095/1746
f 1474/1802/1747 1707/2087/1747 1475/1803/1747
f 1474/1802/1748 1709/2090/1748 1708/2091/1748
f 1447/1800/1749 1709/2090/1749 1473/1801/1749
f 1509/2096/1750 1712/2094/1750 1709/2090/1750
f 1510/2097/1751 1715/2098/1751 1712/2094/1751
f 1511/2099/1752 1563/2100/1752 1715/2098/1752
f 1715/2098/1753 1564/2101/1753 1714/2092/1753
f 1713/2093/1754 1564/2101/1754 1565/2102/1754
f 1476/2103/1755 1565/2102/1755 1454/2104/1755
f 1477/2105/1756 1713/2093/1756 1476/2103/1756
f 1478/2095/1757 1710/2089/1757 1477/2105/1757
f 1717/2106/1758 1719/2107/1758 1716/2108/1758
f 1718/2109/1759 1720/2110/1759 1717/2106/1759
f 1720/2110/1758 1722/2111/1758 1719/2107/1758
f 1721/2112/1759 1723/2113/1759 1720/2110/1759
f 1565/2102/1760 1508/2114/1760 1454/2104/1760
f 1564/2101/1758 1716/2108/1758 1565/2102/1758
f 1563/2100/1759 1717/2106/1759 1564/2101/1759
f 1455/2115/1761 1718/2109/1761 1563/2100/1761
f 1718/2109/1761 1513/2116/1761 1721/2112/1761
f 1513/2116/1761 1724/2117/1761 1721/2112/1761
f 1514/2118/1761 1566/2119/1761 1724/2117/1761
f 1724/2117/1762 1567/2120/1762 1723/2113/1762
f 1723/2113/1758 1568/2121/1758 1722/2111/1758
f 1722/2111/1760 1453/2122/1760 1506/2123/1760
f 1507/2124/1763 1722/2111/1763 1506/2123/1763
f 1716/2108/1760 1507/2124/1760 1508/2114/1760
f 1726/2125/1758 1728/2126/1758 1725/2127/1758
f 1727/2128/1759 1729/2129/1759 1726/2125/1759
f 1729/2129/1758 1731/2130/1758 1728/2126/1758
f 1730/2131/1762 1732/2132/1762 1729/2129/1762
f 1453/2122/1763 1725/2127/1763 1505/2133/1763
f 1567/2120/1758 1725/2127/1758 1568/2121/1758
f 1566/2119/1759 1726/2125/1759 1567/2120/1759
f 1456/2134/1761 1727/2128/1761 1566/2119/1761
f 1727/2128/1761 1516/2135/1761 1730/2131/1761
f 1516/2135/1761 1733/2136/1761 1730/2131/1761
f 1517/2137/1761 1569/1820/1761 1733/2136/1761
f 1733/2136/1759 1570/1818/1759 1732/2132/1759
f 1732/2132/1758 1571/1815/1758 1731/2130/1758
f 1503/2138/1760 1571/1815/1760 1452/1817/1760
f 1728/2126/1760 1503/2138/1760 1504/2139/1760
f 1725/2127/1760 1504/2139/1760 1505/2133/1760
f 1572/1783/1764 1573/1787/1764 1576/1784/1764
f 1574/1786/1765 1577/1798/1765 1576/1784/1765
f 1576/1784/1766 1579/1790/1766 1578/1788/1766
f 1576/1784/1767 1577/1798/1767 1580/1789/1767
f 1444/1791/1768 1467/1793/1768 1572/1783/1768
f 1467/1793/1769 1468/1795/1769 1573/1787/1769
f 1469/1794/1770 1574/1786/1770 1573/1787/1770
f 1445/1796/1771 1470/1797/1771 1574/1786/1771
f 1470/1797/1592 1471/2140/1592 1577/1798/1592
f 1577/1798/1772 1471/2140/1772 1472/1799/1772
f 1580/1789/1773 1472/1799/1773 1447/1800/1773
f 1579/1790/1774 1580/1789/1774 1473/1801/1774
f 1579/1790/1596 1474/1802/1596 1475/1803/1596
f 1578/1788/1775 1475/1803/1775 1446/1804/1775
f 1575/1785/1776 1578/1788/1776 1464/1805/1776
f 1466/1792/1599 1572/1783/1599 1575/1785/1599
f 1582/1807/1777 1585/1811/1777 1584/1808/1777
f 1582/1807/1778 1583/1819/1778 1586/1810/1778
f 1585/1811/1779 1588/1814/1779 1587/1812/1779
f 1585/1811/1780 1586/1810/1780 1589/1813/1780
f 1571/1815/1781 1581/1809/1781 1502/1816/1781
f 1570/1818/1782 1582/1807/1782 1581/1809/1782
f 1570/1818/1783 1569/1820/1783 1583/1819/1783
f 1569/1820/1784 1457/2141/1784 1479/1821/1784
f 1583/1819/1785 1479/1821/1785 1480/1822/1785
f 1586/1810/1786 1480/1822/1786 1481/1823/1786
f 1589/1813/1787 1481/1823/1787 1451/1824/1787
f 1589/1813/1788 1482/1825/1788 1483/1826/1788
f 1587/1812/1789 1588/1814/1789 1483/1826/1789
f 1587/1812/1790 1484/1827/1790 1450/1828/1790
f 1584/1808/1791 1587/1812/1791 1500/1829/1791
f 1581/1809/1792 1584/1808/1792 1501/1830/1792
f 1590/1831/1793 1591/1835/1793 1594/1832/1793
f 1592/1834/1794 1595/1841/1794 1594/1832/1794
f 1594/1832/1795 1597/1838/1795 1596/1836/1795
f 1594/1832/1796 1595/1841/1796 1598/1837/1796
f 1450/1828/1797 1484/1827/1797 1590/1831/1797
f 1484/1827/1798 1483/1826/1798 1591/1835/1798
f 1482/1825/1799 1592/1834/1799 1591/1835/1799
f 1451/1824/1800 1488/1840/1800 1592/1834/1800
f 1488/1840/1624 1489/1935/1624 1595/1841/1624
f 1595/1841/1801 1489/1935/1801 1490/1842/1801
f 1598/1837/1802 1490/1842/1802 1449/1843/1802
f 1597/1838/1627 1598/1837/1627 1491/1844/1627
f 1597/1838/1803 1492/1845/1803 1493/1846/1803
f 1596/1836/1804 1493/1846/1804 1448/1847/1804
f 1593/1833/1805 1596/1836/1805 1485/1848/1805
f 1487/1839/1631 1590/1831/1631 1593/1833/1631
f 1600/1850/1806 1603/1854/1806 1602/1851/1806
f 1600/1850/1807 1601/1862/1807 1604/1853/1807
f 1603/1854/1808 1606/1857/1808 1605/1855/1808
f 1603/1854/1809 1604/1853/1809 1607/1856/1809
f 1562/1858/1810 1599/1852/1810 1520/1859/1810
f 1561/1861/1811 1600/1850/1811 1599/1852/1811
f 1561/1861/1812 1560/1863/1812 1601/1862/1812
f 1560/1863/1813 1463/2142/1813 1497/1864/1813
f 1601/1862/1814 1497/1864/1814 1498/1865/1814
f 1604/1853/1815 1498/1865/1815 1499/1866/1815
f 1607/1856/1816 1499/1866/1816 1445/1867/1816
f 1607/1856/1817 1469/1868/1817 1468/1869/1817
f 1605/1855/1818 1606/1857/1818 1468/1869/1818
f 1605/1855/1819 1467/1870/1819 1444/1871/1819
f 1602/1851/1820 1605/1855/1820 1518/1872/1820
f 1599/1852/1821 1602/1851/1821 1519/1873/1821
f 1609/1874/1822 1612/1878/1822 1611/1875/1822
f 1610/1877/1823 1613/1887/1823 1612/1878/1823
f 1611/1875/1824 1612/1878/1824 1615/1879/1824
f 1612/1878/1825 1613/1887/1825 1616/1881/1825
f 1502/1882/1826 1608/1876/1826 1551/1883/1826
f 1501/1885/1827 1609/1874/1827 1608/1876/1827
f 1500/1886/1828 1610/1877/1828 1609/1874/1828
f 1450/1828/1829 1487/1839/1829 1610/1877/1829
f 1610/1877/1830 1487/1839/1830 1486/1849/1830
f 1486/1849/1831 1485/1848/1831 1616/1881/1831
f 1616/1881/1832 1485/1848/1832 1448/1847/1832
f 1615/1879/1833 1616/1881/1833 1496/1888/1833
f 1614/1880/1834 1615/1879/1834 1495/1889/1834
f 1553/1891/1835 1614/1880/1835 1494/1890/1835
f 1552/1893/1836 1611/1875/1836 1614/1880/1836
f 1608/1876/1837 1611/1875/1837 1552/1893/1837
f 1618/1894/1838 1621/1898/1838 1620/1895/1838
f 1619/1897/1839 1622/1909/1839 1621/1898/1839
f 1620/1895/1840 1621/1898/1840 1624/1899/1840
f 1621/1898/1841 1622/1909/1841 1625/1901/1841
f 1511/1902/1842 1617/1896/1842 1542/1903/1842
f 1510/1905/1843 1618/1894/1843 1617/1896/1843
f 1509/1906/1844 1619/1897/1844 1618/1894/1844
f 1447/1907/1845 1472/2143/1845 1619/1897/1845
f 1619/1897/1846 1472/2143/1846 1471/1908/1846
f 1471/1908/1847 1470/2144/1847 1625/1901/1847
f 1625/1901/1848 1470/2144/1848 1445/1910/1848
f 1624/1899/1849 1625/1901/1849 1499/1911/1849
f 1623/1900/1850 1624/1899/1850 1498/1912/1850
f 1544/1914/1851 1623/1900/1851 1497/1913/1851
f 1543/1916/1852 1620/1895/1852 1623/1900/1852
f 1617/1896/1853 1620/1895/1853 1543/1916/1853
f 1626/1917/1854 1627/1920/1854 1630/1918/1854
f 1627/1920/1855 1628/1926/1855 1631/1921/1855
f 1630/1918/1856 1633/1923/1856 1632/1922/1856
f 1631/1921/1857 1634/1930/1857 1633/1923/1857
f 1451/1824/1858 1481/1924/1858 1626/1917/1858
f 1481/1924/1859 1480/1925/1859 1627/1920/1859
f 1480/1925/1860 1479/1927/1860 1628/1926/1860
f 1479/1927/1861 1457/1945/1861 1536/1928/1861
f 1628/1926/1862 1536/1928/1862 1537/1929/1862
f 1537/1929/1863 1538/1931/1863 1634/1930/1863
f 1538/1931/1864 1461/1956/1864 1529/1932/1864
f 1634/1930/1865 1529/1932/1865 1528/1933/1865
f 1633/1923/1866 1528/1933/1866 1527/1934/1866
f 1632/1922/1867 1527/1934/1867 1449/1843/1867
f 1489/1935/1868 1629/1919/1868 1632/1922/1868
f 1626/1917/1869 1629/1919/1869 1489/1935/1869
f 1636/1936/1696 1639/1940/1696 1638/1937/1696
f 1637/1939/1696 1640/1942/1696 1639/1940/1696
f 1639/1940/1697 1642/1943/1697 1641/1941/1697
f 1640/1942/1697 1643/1951/1697 1642/1943/1697
f 1517/1944/1698 1635/1938/1698 1536/1928/1698
f 1516/1946/1698 1636/1936/1698 1635/1938/1698
f 1516/1946/1698 1515/1948/1698 1637/1939/1698
f 1456/1947/1698 1539/1949/1698 1637/1939/1698
f 1539/1949/1696 1540/1950/1696 1640/1942/1696
f 1540/1950/1697 1541/1952/1697 1643/1951/1697
f 1541/1952/1699 1462/1972/1699 1532/1953/1699
f 1643/1951/1699 1532/1953/1699 1531/1954/1699
f 1642/1943/1699 1531/1954/1699 1530/1955/1699
f 1641/1941/1699 1530/1955/1699 1461/1956/1699
f 1638/1937/1697 1641/1941/1697 1538/1931/1697
f 1635/1938/1696 1638/1937/1696 1537/1929/1696
f 1645/1957/1696 1648/1961/1696 1647/1958/1696
f 1646/1960/1696 1649/1968/1696 1648/1961/1696
f 1648/1961/1697 1651/1964/1697 1650/1962/1697
f 1648/1961/1697 1649/1968/1697 1652/1963/1697
f 1456/1947/1698 1514/1966/1698 1644/1959/1698
f 1513/1965/1698 1645/1957/1698 1644/1959/1698
f 1512/1967/1698 1646/1960/1698 1645/1957/1698
f 1455/1904/1698 1542/1903/1698 1646/1960/1698
f 1542/1903/1696 1543/1916/1696 1649/1968/1696
f 1543/1916/1697 1544/1914/1697 1652/1963/1697
f 1544/1914/1699 1463/1915/1699 1535/1969/1699
f 1652/1963/1699 1535/1969/1699 1534/1970/1699
f 1651/1964/1699 1534/1970/1699 1533/1971/1699
f 1650/1962/1699 1533/1971/1699 1462/1972/1699
f 1647/1958/1697 1650/1962/1697 1541/1952/1697
f 1644/1959/1696 1647/1958/1696 1540/1950/1696
f 1653/1973/1870 1654/1976/1870 1657/1974/1870
f 1654/1976/1871 1655/1984/1871 1658/1977/1871
f 1657/1974/1872 1660/1979/1872 1659/1978/1872
f 1658/1977/1873 1661/1988/1873 1660/1979/1873
f 1446/1980/1874 1478/1982/1874 1653/1973/1874
f 1478/1982/1875 1477/1983/1875 1654/1976/1875
f 1477/1983/1876 1476/1985/1876 1655/1984/1876
f 1476/1985/1877 1454/2004/1877 1545/1986/1877
f 1655/1984/1878 1545/1986/1878 1546/1987/1878
f 1546/1987/1879 1547/1989/1879 1661/1988/1879
f 1547/1989/1880 1458/2016/1880 1520/1990/1880
f 1661/1988/1881 1520/1990/1881 1519/1991/1881
f 1660/1979/1882 1519/1991/1882 1518/1992/1882
f 1659/1978/1883 1518/1992/1883 1444/1993/1883
f 1465/1995/1884 1656/1975/1884 1659/1978/1884
f 1653/1973/1885 1656/1975/1885 1465/1995/1885
f 1663/1996/1716 1666/2000/1716 1665/1997/1716
f 1664/1999/1716 1667/2002/1716 1666/2000/1716
f 1666/2000/1717 1669/2003/1717 1668/2001/1717
f 1667/2002/1717 1670/2011/1717 1669/2003/1717
f 1454/2004/1718 1508/2006/1718 1662/1998/1718
f 1507/2005/1718 1663/1996/1718 1662/1998/1718
f 1507/2005/1718 1506/2008/1718 1664/1999/1718
f 1453/2007/1718 1548/2009/1718 1664/1999/1718
f 1548/2009/1716 1549/2010/1716 1667/2002/1716
f 1549/2010/1717 1550/2012/1717 1670/2011/1717
f 1550/2012/1719 1459/2032/1719 1523/2013/1719
f 1670/2011/1719 1523/2013/1719 1522/2014/1719
f 1669/2003/1719 1522/2014/1719 1521/2015/1719
f 1668/2001/1719 1521/2015/1719 1458/2016/1719
f 1665/1997/1717 1668/2001/1717 1547/1989/1717
f 1662/1998/1716 1665/1997/1716 1546/1987/1716
f 1672/2017/1716 1675/2021/1716 1674/2018/1716
f 1673/2020/1716 1676/2028/1716 1675/2021/1716
f 1675/2021/1717 1678/2024/1717 1677/2022/1717
f 1675/2021/1717 1676/2028/1717 1679/2023/1717
f 1453/2007/1718 1505/2026/1718 1671/2019/1718
f 1504/2025/1718 1672/2017/1718 1671/2019/1718
f 1503/2027/1718 1673/2020/1718 1672/2017/1718
f 1452/1884/1718 1551/1883/1718 1673/2020/1718
f 1551/1883/1716 1552/1893/1716 1676/2028/1716
f 1552/1893/1717 1553/1891/1717 1679/2023/1717
f 1553/1891/1719 1460/1892/1719 1526/2029/1719
f 1679/2023/1719 1526/2029/1719 1525/2030/1719
f 1678/2024/1719 1525/2030/1719 1524/2031/1719
f 1677/2022/1719 1524/2031/1719 1459/2032/1719
f 1674/2018/1717 1677/2022/1717 1550/2012/1717
f 1671/2019/1716 1674/2018/1716 1549/2010/1716
f 1680/2033/1886 1681/2037/1886 1684/2034/1886
f 1682/2036/1887 1685/2040/1887 1684/2034/1887
f 1683/2035/1888 1684/2034/1888 1687/2038/1888
f 1685/2040/1889 1688/2044/1889 1687/2038/1889
f 1448/1847/1890 1493/1846/1890 1680/2033/1890
f 1492/1845/1891 1681/2037/1891 1680/2033/1891
f 1492/1845/1892 1491/1844/1892 1682/2036/1892
f 1449/1843/1893 1527/2042/1893 1682/2036/1893
f 1527/2042/1894 1528/2043/1894 1685/2040/1894
f 1528/2043/1895 1529/2045/1895 1688/2044/1895
f 1529/2045/1896 1461/2061/1896 1554/2046/1896
f 1688/2044/1897 1554/2046/1897 1555/2047/1897
f 1686/2039/1898 1687/2038/1898 1555/2047/1898
f 1494/2049/1899 1686/2039/1899 1556/2048/1899
f 1495/2051/1900 1683/2035/1900 1686/2039/1900
f 1496/2041/1901 1680/2033/1901 1683/2035/1901
f 1690/2052/1736 1693/2056/1736 1692/2053/1736
f 1691/2055/1737 1694/2058/1737 1693/2056/1737
f 1693/2056/1736 1696/2059/1736 1695/2057/1736
f 1694/2058/1737 1697/2064/1737 1696/2059/1737
f 1556/2048/1738 1689/2054/1738 1526/2060/1738
f 1555/2047/1736 1690/2052/1736 1689/2054/1736
f 1554/2046/1737 1691/2055/1737 1690/2052/1737
f 1461/2061/1739 1530/2062/1739 1691/2055/1739
f 1530/2062/1739 1531/2145/1739 1694/2058/1739
f 1694/2058/1739 1531/2145/1739 1532/2063/1739
f 1532/2063/1740 1462/2080/1740 1557/2065/1740
f 1697/2064/1737 1557/2065/1737 1558/2066/1737
f 1696/2059/1736 1558/2066/1736 1559/2067/1736
f 1695/2057/1738 1559/2067/1738 1459/2068/1738
f 1692/2053/1738 1695/2057/1738 1524/2069/1738
f 1689/2054/1738 1692/2053/1738 1525/2070/1738
f 1699/2071/1736 1702/2075/1736 1701/2072/1736
f 1700/2074/1737 1703/2077/1737 1702/2075/1737
f 1702/2075/1736 1705/2078/1736 1704/2076/1736
f 1703/2077/1737 1706/2083/1737 1705/2078/1737
f 1559/2067/1738 1698/2073/1738 1523/2079/1738
f 1558/2066/1736 1699/2071/1736 1698/2073/1736
f 1557/2065/1737 1700/2074/1737 1699/2071/1737
f 1462/2080/1740 1533/2081/1740 1700/2074/1740
f 1533/2081/1739 1534/2082/1739 1703/2077/1739
f 1534/2082/1740 1535/2084/1740 1706/2083/1740
f 1535/2084/1739 1463/2142/1739 1560/1863/1739
f 1706/2083/1737 1560/1863/1737 1561/1861/1737
f 1705/2078/1736 1561/1861/1736 1562/1858/1736
f 1704/2076/1738 1562/1858/1738 1458/1860/1738
f 1701/2072/1738 1704/2076/1738 1521/2085/1738
f 1523/2079/1738 1698/2073/1738 1701/2072/1738
f 1707/2087/1902 1708/2091/1902 1711/2088/1902
f 1709/2090/1903 1712/2094/1903 1711/2088/1903
f 1710/2089/1904 1711/2088/1904 1714/2092/1904
f 1712/2094/1905 1715/2098/1905 1714/2092/1905
f 1446/1804/1906 1475/1803/1906 1707/2087/1906
f 1474/1802/1907 1708/2091/1907 1707/2087/1907
f 1474/1802/1908 1473/1801/1908 1709/2090/1908
f 1447/1800/1909 1509/2096/1909 1709/2090/1909
f 1509/2096/1910 1510/2097/1910 1712/2094/1910
f 1510/2097/1911 1511/2099/1911 1715/2098/1911
f 1511/2099/1912 1455/2115/1912 1563/2100/1912
f 1715/2098/1913 1563/2100/1913 1564/2101/1913
f 1713/2093/1914 1714/2092/1914 1564/2101/1914
f 1476/2103/1915 1713/2093/1915 1565/2102/1915
f 1477/2105/1916 1710/2089/1916 1713/2093/1916
f 1478/2095/1917 1707/2087/1917 1710/2089/1917
f 1717/2106/1758 1720/2110/1758 1719/2107/1758
f 1718/2109/1759 1721/2112/1759 1720/2110/1759
f 1720/2110/1758 1723/2113/1758 1722/2111/1758
f 1721/2112/1759 1724/2117/1759 1723/2113/1759
f 1565/2102/1763 1716/2108/1763 1508/2114/1763
f 1564/2101/1758 1717/2106/1758 1716/2108/1758
f 1563/2100/1759 1718/2109/1759 1717/2106/1759
f 1455/2115/1761 1512/2146/1761 1718/2109/1761
f 1718/2109/1761 1512/2146/1761 1513/2116/1761
f 1513/2116/1761 1514/2118/1761 1724/2117/1761
f 1514/2118/1761 1456/2134/1761 1566/2119/1761
f 1724/2117/1759 1566/2119/1759 1567/2120/1759
f 1723/2113/1758 1567/2120/1758 1568/2121/1758
f 1722/2111/1763 1568/2121/1763 1453/2122/1763
f 1507/2124/1760 1719/2107/1760 1722/2111/1760
f 1716/2108/1763 1719/2107/1763 1507/2124/1763
f 1726/2125/1758 1729/2129/1758 1728/2126/1758
f 1727/2128/1759 1730/2131/1759 1729/2129/1759
f 1729/2129/1758 1732/2132/1758 1731/2130/1758
f 1730/2131/1759 1733/2136/1759 1732/2132/1759
f 1453/2122/1760 1568/2121/1760 1725/2127/1760
f 1567/2120/1758 1726/2125/1758 1725/2127/1758
f 1566/2119/1759 1727/2128/1759 1726/2125/1759
f 1456/2134/1761 1515/2147/1761 1727/2128/1761
f 1727/2128/1761 1515/2147/1761 1516/2135/1761
f 1516/2135/1761 1517/2137/1761 1733/2136/1761
f 1517/2137/1761 1457/2141/1761 1569/1820/1761
f 1733/2136/1762 1569/1820/1762 1570/1818/1762
f 1732/2132/1758 1570/1818/1758 1571/1815/1758
f 1503/2138/1763 1731/2130/1763 1571/1815/1763
f 1728/2126/1760 1731/2130/1760 1503/2138/1760
f 1725/2127/1763 1728/2126/1763 1504/2139/1763
================================================
FILE: example/turtle/proxy.txt
================================================
-1.860431563537060518e-01 -9.375499933958053589e-02 -8.959774656980318275e-02
-2.498418255269513610e-01 -9.375499933958054977e-02 -6.165322398079144117e-02
-1.200808296033092637e-01 -9.375499933958053589e-02 -1.773027579858070524e-02
1.501076852552826746e-01 -9.375499933958053589e-02 2.974227030481994827e-03
-1.343040758267255441e-02 -9.375499933958053589e-02 -2.462572452093138298e-02
1.270904367814460201e-01 -9.375499933958052201e-02 -1.373087932116676957e-01
2.405638385694613413e-01 -9.375499933958053589e-02 -8.097912522434463911e-02
-3.134770655791869198e-01 -9.375499933958053589e-02 -9.242291908877144080e-02
-6.818934097652129545e-02 -9.375499933958052201e-02 -7.561906067866898395e-02
7.936774489828646306e-02 -9.375499933958054977e-02 -8.535093962733797390e-02
1.970200613244025012e-01 -9.375499933958053589e-02 -1.323127980110465896e-01
2.117959376905650948e-01 -9.375499933958053589e-02 -1.411535404779480640e-02
1.116524942484813099e-02 -9.375499933958053589e-02 -8.996902750890534151e-02
6.489733801241262534e-02 -3.330609682580047692e-02 -3.060961966648226840e-01
-4.121693139113432464e-02 1.739484537824630139e-01 -8.457650562505869551e-02
-1.811119551802135852e-02 1.323123013724633312e-01 -1.875555304119838063e-01
7.015021756118672291e-02 1.138056082132338082e-01 -1.893345024250213116e-01
8.900859940262764725e-02 2.191494184281394392e-02 -2.747314313948817643e-01
1.342323818851942074e-01 5.400989979702855437e-02 -2.312296781694712366e-01
3.671586802652455211e-02 1.659520184209754279e-01 -8.749746131686583772e-02
1.828099877712539845e-01 7.014268249691811563e-02 -1.798397896729040657e-01
-2.434189971633327065e-02 1.834363586543601432e-01 -1.389962167460675503e-02
2.823241547019043063e-01 -2.076646375465641958e-02 -1.605982354045662475e-01
2.165910594501582187e-01 1.084603898802034005e-01 -4.919929501117145665e-02
2.920764285327639986e-01 2.420466155036143929e-02 -1.008960818417290461e-01
1.632037579923094617e-01 1.406485749933163443e-01 -1.470714510236647253e-02
2.796016959490815190e-01 6.329117192004658776e-02 -4.346589503618935452e-02
3.434932624666194312e-01 -3.485098456428201152e-02 -4.054549756471095140e-02
2.259380750970955698e-01 1.078284188187847992e-01 3.081096674527630114e-02
2.786820212366718508e-01 6.682662806571783298e-02 3.560634792062615517e-02
2.230964572517442623e-01 9.497064549475489614e-02 1.056359926526540954e-01
2.805018048364913974e-01 4.288217768725097884e-02 1.103281399304365529e-01
5.699903833187319746e-02 1.664402743101776327e-01 8.818014924902448093e-02
1.658753150481697969e-01 1.066596706903404634e-01 1.580204142729377537e-01
2.110275903330577907e-01 4.154683974101299609e-02 2.068505134186872374e-01
1.325725540106177103e-01 7.047736401130322292e-02 2.354667546746644080e-01
1.797204209746041115e-01 -6.811814586142339589e-02 2.859065773613035932e-01
-9.727976160809084089e-03 1.771153368092256986e-01 8.075601222758418962e-02
9.579903511219448053e-02 1.218312288085003081e-01 1.823473626238044376e-01
4.938464759304937723e-02 -2.516986556725907265e-02 3.275261697635321045e-01
-1.717583867164509948e-02 1.491660910889109903e-01 1.739694792622591413e-01
-1.285378191738280607e-01 -3.694029487294894082e-02 3.427799592284718755e-01
-6.989574485300832640e-02 1.834726194663790777e-01 5.331499214220113136e-02
-9.571137824201830790e-02 8.708560280075736126e-02 2.696239976842324526e-01
-9.278741869454942837e-02 2.504751993100641505e-02 3.192981065625964954e-01
-1.018591815467107670e-01 1.698487312860050003e-01 1.214575914322627848e-01
-1.951213034206627783e-01 1.193760319638146189e-01 2.064453344784600297e-01
-3.323976326557028238e-01 -6.657102958734778531e-02 2.761763224013820128e-01
-2.182526813384641773e-01 1.456931403419915450e-01 1.405711848702565492e-01
-1.713164544541035950e-01 1.722802665200824368e-01 7.455281037119734777e-02
-3.504948151941115753e-01 3.463510105496448582e-02 2.127957610152082513e-01
-1.264570485691579316e-01 1.830715156546970090e-01 2.315582002999195393e-02
-2.928701685267122157e-01 1.416001996997146017e-01 5.874646028071679738e-02
-3.493415379246794794e-01 1.065492399321200795e-01 9.442058137632566850e-02
-3.959345633709739909e-01 6.537079512170879125e-02 1.163655852171053273e-01
-4.502720223154033641e-01 1.978725119574940849e-02 7.890716477860010292e-02
-3.902551854425744815e-01 8.741119628346694093e-02 4.276055432042798893e-02
-4.906511184724942076e-01 -9.363841001227397876e-02 7.166214540554770307e-02
-3.197330501374738287e-01 1.306994091210644249e-01 -2.533314024658973873e-02
-3.834861403224371079e-01 9.169278322863608222e-02 -3.335141911068510712e-02
-4.883934199633858109e-01 -3.726657189797012021e-02 4.335163260344320271e-03
-2.524504332888009017e-01 1.506261063574758363e-01 -6.491554406522327680e-02
-3.145384141205939676e-01 1.184802042220245849e-01 -9.174817494337242363e-02
-3.484331119823236844e-01 8.274976590464555581e-02 -1.310012460630048670e-01
-4.242354256009298608e-01 -3.891244692985043607e-02 -1.693968108565207797e-01
-3.696357694333340382e-01 3.007761623506943066e-02 -1.779586349527673139e-01
-1.587017898981122332e-01 1.747510158711874928e-01 -5.040997046798948583e-02
-2.871843985980629399e-01 1.019456291127285447e-01 -1.627545855595900481e-01
-3.093768587552697680e-01 -6.075052151550072721e-02 -2.696366386703361595e-01
-1.557408909127799002e-01 1.468324269544531435e-01 -1.496948418799297353e-01
-2.048654560964580373e-01 8.432969994644878842e-02 -2.293718918303163101e-01
-1.891048067364720364e-01 2.940912152225361870e-02 -2.805933208654052824e-01
-1.960680563167367385e-01 -3.836492802493384618e-02 -3.097847921070085997e-01
-1.003568056543185927e-01 1.714882709920681481e-01 -9.412407171752597279e-02
-8.637580330852334676e-02 -7.092768956719358586e-02 -3.299375089567067598e-01
-7.527203880831626059e-02 1.484810268105103415e-01 -1.601381109712492479e-01
-9.324701281054756374e-02 1.057863841431460339e-01 -2.296574629351103292e-01
-1.294044358391055749e-01 6.292297952416336937e-02 -2.704944816819070952e-01
-7.329000857904567623e-02 1.637049359461439924e-02 -3.057299779594481737e-01
-2.043088150057514596e-01 -9.375499933958053589e-02 3.076583516557729325e-01
7.819254069847120237e-02 -9.375499933958053589e-02 3.188571570243572961e-01
3.148867742120960433e-01 -9.375499933958053589e-02 1.393477139379799556e-01
-2.389206740824085379e-02 -9.375499933958053589e-02 -3.120334896279683412e-01
4.763266504436482196e-02 -9.375499933958052201e-02 -3.112597911737781109e-01
-2.076240069512276243e-01 -9.375499933958053589e-02 2.488576119536334980e-03
-1.741238119144605800e-02 -9.375499933958053589e-02 8.530330398735234965e-02
-1.908570790555867980e-01 -9.375499933958054977e-02 9.489021891365519157e-02
-4.265449318787620792e-01 -9.375499933958053589e-02 -5.914330468975769438e-03
-1.226817013098559228e-01 -9.375499933958053589e-02 7.886356074104297620e-02
-4.707718746546875610e-01 -9.375499933958053589e-02 -8.577976644321101685e-02
-4.236594687046559282e-01 -9.375499933958053589e-02 8.575588361274107119e-02
-2.757818023670280394e-01 -9.375499933958053589e-02 6.113697420674555488e-02
5.254473579644225456e-02 -9.375499933958053589e-02 8.982703299254217455e-02
1.716241208701068666e-01 -9.375499933958053589e-02 6.856403773111084676e-02
-2.972766585701486219e-01 -9.375499933958053589e-02 -8.277007350300430444e-03
3.183558957960036517e-02 -9.375499933958053589e-02 2.893897672896666301e-02
-3.651474940569264804e-01 -9.375499933958053589e-02 1.845314188714979736e-02
-3.357944186801581909e-01 -9.375499933958053589e-02 -1.836194696282413552e-01
-2.024316480350380476e-01 -9.375499933958053589e-02 -2.036948498569505217e-01
6.668894154037013910e-02 -9.375499933958053589e-02 -1.635307031380314469e-01
-4.139399389466690837e-01 -9.375499933958053589e-02 -1.361928215575411716e-01
-1.513958042991385700e-01 -9.375499933958053589e-02 -1.578528059600970979e-01
-4.649915267890766812e-02 -9.375499933958053589e-02 -2.385914332019863082e-01
-1.358944510931561767e-02 -9.375499933958053589e-02 -1.697550348144101773e-01
-1.275761712419489213e-01 -9.375499933958053589e-02 -2.594370367147575407e-01
-2.604258932374214353e-01 -9.375499933958053589e-02 -1.617785508217988721e-01
1.970579331416150159e-02 -9.375499933958053589e-02 -2.264209050270949153e-01
-1.780244155186798058e-02 -9.375499933958053589e-02 3.240475941348807121e-01
2.250133709278038796e-01 -9.375499933958053589e-02 1.867379152361540173e-01
-1.761607217180501672e-01 -9.375499933958052201e-02 2.063537163446757250e-01
1.472266643600420311e-01 -9.375499933958053589e-02 1.754900415611973519e-01
-7.734968187053320454e-02 -9.375499933958053589e-02 2.561453003794688099e-01
3.896990906499026019e-02 -9.375499933958053589e-02 1.687480523054968595e-01
8.902837338232995334e-02 -9.375499933958053589e-02 2.151063196907729846e-01
-9.811899261496626057e-02 -9.375499933958053589e-02 1.896421708068657952e-01
-7.318769369098945821e-02 -9.375499933958053589e-02 1.264309967549283320e-01
-2.206201766959710875e-01 -9.375499933958053589e-02 1.591098834813287344e-01
-3.689598433346492623e-01 -9.375499933958053589e-02 1.425017261727299367e-01
-9.495929064672549436e-02 -9.375499933958053589e-02 3.289659754083842369e-01
-1.408341952919641149e-01 -9.375499933958053589e-02 1.427524350996520752e-01
-3.195829092783342151e-02 -9.375499933958053589e-02 1.946749446149557972e-01
1.224105496724692821e-01 -9.375499933958053589e-02 1.175552827400128814e-01
2.170227406581196894e-01 -9.375499933958053589e-02 1.147764920467069377e-01
2.316895940599368597e-02 -9.375499933958053589e-02 2.523925472225752031e-01
-1.440623772161684912e-01 -9.375499933958053589e-02 2.842362267986382474e-01
-2.868372392851628039e-01 -9.375499933958053589e-02 1.545174779505040230e-01
-7.370626987287484319e-03 -2.042436708421872110e-02 -3.156422011401836536e-01
-1.110971236540774128e-02 7.201006397765707945e-02 -2.615189604792024225e-01
5.599022145087218655e-02 7.333545929132007091e-02 -2.452964138767055879e-01
1.519662545642616047e-01 -8.216682698114074687e-03 -2.630158276206001622e-01
1.349446699781769732e-01 1.102025252968786562e-01 -1.581403156637983254e-01
2.321739491140916989e-01 1.036987310995187740e-02 -1.959819242868854339e-01
2.553561831359457690e-01 -5.955451459360938671e-02 -2.060313070256233892e-01
3.213063861222689721e-01 -8.189315348502068659e-02 -1.273201140718790425e-01
1.089450981314405154e-01 1.539468485173163259e-01 -5.667658348532179230e-02
2.337017572017148770e-01 7.709919761591660370e-02 -1.076543270565436528e-01
5.420925161962778654e-02 1.736064741205722850e-01 8.161538662897751747e-03
1.081029419154218296e-01 1.582287496271103100e-01 5.090350959408891651e-02
1.657208998537774147e-01 1.349043418878396461e-01 6.906739835924538551e-02
3.231306001254384830e-01 -1.446473187013176502e-02 9.912237326433576134e-02
2.831967366235733086e-01 -5.626283616254269121e-03 1.681897393754434034e-01
2.773884515951954843e-01 -6.029241522776411510e-02 2.054200548917299307e-01
5.907267702617438609e-02 3.147756461639295011e-02 2.979769090924243580e-01
5.531840180170546212e-02 8.451758424985422735e-02 2.545494167630201754e-01
-7.694183029814278055e-03 9.387268335929292717e-02 2.600309509084783866e-01
-1.748611812596843895e-02 4.490332572502758746e-02 3.040588006373921570e-01
-2.797598369288978315e-02 -1.038360613134673219e-02 3.343356237336884140e-01
-1.160149866501790639e-01 1.302538899477720669e-01 2.120707973926118650e-01
-1.747125141142764071e-01 8.495940886785063872e-02 2.586846838528217285e-01
-2.108744581409553454e-01 3.797221269659262211e-02 2.902519396443474786e-01
-1.959415327945767948e-01 -1.606545901261789006e-02 3.219843476786297121e-01
-2.562213399677788495e-01 6.988835762093700832e-02 2.445487452842378118e-01
-4.193179951072763534e-01 -5.020634090101434854e-02 1.999319237057300691e-01
-3.027567717962714733e-01 1.138222155083711828e-01 1.425804873986508681e-01
-4.230254851783949999e-01 1.163851585861450100e-02 1.522862292322789690e-01
-4.623705466710700818e-01 -7.270244575689925604e-02 1.415855931534435730e-01
-2.325839818332749720e-01 1.607498822467819943e-01 6.648485881214287463e-02
-1.954942642065244374e-01 1.729790419664102297e-01 6.931131736553275066e-03
-4.402577147234222887e-01 4.060304297868574941e-02 -1.608888028530178205e-02
-4.670559373108364998e-01 -1.586404403153918521e-02 -6.515006312524174170e-02
-4.326126866385179848e-01 1.231164819843223734e-02 -1.165550197035742919e-01
-2.303104606143392130e-01 1.386154357915534707e-01 -1.317114552039031217e-01
-2.710122411846504487e-01 3.894330472130718562e-02 -2.433604048586610891e-01
-1.369603373319397965e-01 -3.540755290432770312e-03 -3.088778837050802362e-01
2.785069454648551046e-01 -3.837169884591475066e-02 5.178215801612809366e-02
2.778943761512815414e-01 -5.099227916351952578e-02 -5.839244542144712141e-02
2.784664571858187498e-01 -1.056610219737671441e-01 1.860104939606303104e-02
2.812616775888418519e-01 3.139104396186996604e-03 -1.125073049776563992e-02
4.828324004000987824e-01 5.831291068054977184e-03 -4.081424398910750428e-02
3.988376576177645738e-01 2.036965683142957090e-02 -3.620249451203449986e-02
4.435084215752267411e-01 2.120697048709793697e-02 2.561198527673296671e-02
3.376066412141415229e-01 2.167360498154051462e-02 2.272154578224079916e-02
3.052587370706342229e-01 -1.233769542694298504e-01 -4.678657713598334544e-02
3.852422498778499405e-01 -1.061885771346036608e-02 6.942710021819090138e-02
3.949931692060190680e-01 -1.006758332796097027e-01 -7.461910859325462941e-02
4.836663165998829061e-01 -7.437793907765555268e-02 -7.391937910204396389e-02
4.976941831105498593e-01 -3.463302144169536079e-02 1.683066270993298710e-02
4.908579588481093436e-01 -9.462859671482039270e-02 4.461957210684532743e-02
4.702614866367590962e-01 -1.315950079841333442e-01 -1.759124371761112970e-02
3.511323211074167117e-01 -1.170619244146455240e-01 5.581519366245690089e-02
4.059857911963787869e-01 -1.330611356191775951e-01 2.014971136767790619e-02
4.457400337631272080e-01 -4.140735788679160423e-02 8.099740143981418172e-02
4.089417240747618698e-01 -8.973427335877949551e-02 7.642558354366187079e-02
4.047972288848448175e-01 -2.690106129495416568e-02 -8.377903009583542726e-02
1.809945533124085182e-01 -4.626674998783097265e-02 -2.100131930841627748e-01
9.940890625472739717e-02 -5.245798027976281142e-02 -2.552449554566567080e-01
1.525498840948594781e-01 -4.749275117891912923e-02 -4.292203439324554504e-01
1.996575273386476113e-01 1.485510670157822767e-02 -4.168641476709116778e-01
2.689669128091414230e-01 -4.115063245909535324e-03 -3.986429069127946367e-01
1.481386014760457948e-01 -1.289227472038371736e-01 -2.460751343600004470e-01
1.960598566116956154e-01 -1.335214315634338789e-01 -3.757267810240700889e-01
2.664790741097942672e-01 -1.902642046047463634e-02 -3.023543325377593405e-01
2.544688124243909555e-01 -7.273994539906245005e-02 -2.706123766768037120e-01
1.183767543346085915e-01 -9.365581834727548793e-02 -3.542605276341739828e-01
2.189402391253137681e-01 -6.260580131973983442e-02 -4.356459687593546293e-01
2.638252293401285309e-01 -1.085914380477200625e-01 -4.020274363249907168e-01
1.920348868066918402e-01 2.398089764266729887e-02 -3.510163988604907415e-01
1.983677182728074417e-01 -1.337587130964885129e-01 -3.026596509233392962e-01
2.558112899863689282e-01 -1.167662214117130420e-01 -3.211713524961597455e-01
3.011600228022974779e-01 -7.279179361643618729e-02 -3.574963920524310557e-01
2.257287736971043957e-01 3.534320693474214334e-03 -2.615086731179784296e-01
1.353621111664655152e-01 -9.684318936018615834e-03 -3.736207451268089841e-01
1.102792790822891555e-01 -3.543152368610201497e-02 2.718886349845660577e-01
3.174655310297035959e-01 -4.561035640776993361e-02 3.631674099062820460e-01
2.891460417115786496e-01 3.217669767492860756e-04 3.943383771368278445e-01
2.211708296363547599e-01 2.434433783925239042e-02 3.765079078873692642e-01
2.000067061677232860e-01 -1.149977795496933963e-01 4.322322989323445497e-01
2.579261628961177499e-01 -2.006737062662277554e-02 2.656970439807755935e-01
2.561698157122047625e-01 -1.056217967445000766e-01 2.820506437200243077e-01
1.778050435020407938e-01 -2.366842009299651578e-02 4.394566223898574986e-01
1.459354489428807689e-01 -9.428639853693736306e-02 3.806919455948048481e-01
2.534971184097606356e-01 -4.175388600060523508e-02 4.293942546326390275e-01
1.614758565378808242e-01 -5.538740639465783606e-03 2.414306187508593604e-01
1.819353505332144127e-01 -1.097445961323214453e-01 2.334316935564443862e-01
2.672457090863264995e-01 8.378270833800719603e-03 3.293477417692347586e-01
2.520949426334829213e-01 -1.340010847679045936e-01 3.905135945037990775e-01
2.035371536099716550e-01 -1.366005178490889149e-01 3.300686009771867657e-01
2.116705221830927908e-01 -3.655638071039334080e-02 2.178179273660457715e-01
1.448164601778190730e-01 -2.186312479162568715e-03 3.596423872917061115e-01
-2.536511864920266146e-01 -4.775981588654182319e-02 -2.399710740973545475e-01
-4.116335699976809703e-01 -3.857752862025257828e-02 -4.120976132357596633e-01
-3.525770849768632975e-01 -3.011099158253966249e-02 -4.366198517793056788e-01
-3.288531453953378869e-01 -8.911748298988234063e-02 -4.334186533068129510e-01
-3.157607974912419735e-01 -1.386375049508999924e-01 -2.877710367261094859e-01
-2.874377369153893191e-01 -1.110566532709635068e-01 -3.377985468613581288e-01
-3.515461945033509505e-01 2.586797583186107446e-02 -3.110479465888724704e-01
-4.174509211171775203e-01 2.516873093246825826e-03 -3.380887014586181905e-01
-4.013410104627518349e-01 -1.263651624166866327e-01 -3.202938887190230344e-01
-4.192364484101244182e-01 -7.511846229707266898e-02 -2.795578688024576985e-01
-2.765772334656038667e-01 -3.939575516091134433e-02 -3.693446400549337261e-01
-2.622089042241810275e-01 -1.157436628149050989e-01 -2.558569280635389909e-01
-3.440068943177087002e-01 -1.320654062818378549e-01 -3.711327908632061634e-01
-3.860541498855264431e-01 -5.407052204934433004e-03 -2.590256845040037614e-01
-4.332346421030544015e-01 -9.910319481044080336e-02 -3.923342627888369294e-01
-4.521805855044513933e-01 -5.537587661371606851e-02 -3.426000725562363125e-01
-3.758246738301168621e-01 -1.164240824407460351e-01 -2.422311613786269358e-01
-2.556601141385121401e-01 -1.080225907931375677e-02 -2.993885716182593248e-01
-3.195226604172674567e-01 1.400014224332035012e-02 -3.803460349170191268e-01
-2.513581193707176142e-01 -6.866615978212382843e-02 2.666707994617733002e-01
-2.904174617986132745e-01 9.500892187586273269e-03 2.628859853999375717e-01
-2.887517460814272297e-01 -1.353562010860441811e-01 2.625034439742843784e-01
-3.972614691318802627e-01 3.007361357334977208e-03 3.762730215713971926e-01
-3.761197641218651522e-01 -2.890866948065009927e-03 2.824278006419007059e-01
-4.036512518079895528e-01 -1.374505326577874498e-01 3.353001101265009054e-01
-3.213816647269887028e-01 -7.363755623650207471e-02 4.455573877738163713e-01
-3.520589009868082542e-01 -5.432598633957197790e-03 4.286192787759486311e-01
-2.536176272641700868e-01 -7.649395447807467996e-02 3.415262051837291146e-01
-3.536040139768901192e-01 -3.777067977544978222e-02 2.211815558412430216e-01
-3.728332288664484295e-01 -1.354616623957085875e-01 4.163219393693274872e-01
-3.167578443212177297e-01 -1.370181966166331144e-01 3.768222631994208038e-01
-4.474436532702092406e-01 -4.890466883263665243e-02 3.762780511744564516e-01
-3.219247811662552250e-01 1.667685816925452633e-02 3.244574879990358984e-01
-3.733629309308951205e-01 -1.070696907645360962e-01 2.348848130208958107e-01
-3.525453902562136199e-01 -1.443841930130304041e-01 2.836545167023694636e-01
-4.043478439788935619e-01 -6.057544595206282995e-02 2.648509860612776046e-01
-2.778049867788772165e-01 -1.680346764366544982e-02 3.563026618345577212e-01
================================================
FILE: externs/__init__.py
================================================
================================================
FILE: externs/pvcnn/modules/__init__.py
================================================
# from modules.ball_query import BallQuery
# from modules.frustum import FrustumPointNetLoss
# from modules.loss import KLLoss
# from modules.pointnet import PointNetAModule, PointNetSAModule, PointNetFPModule
from externs.pvcnn.modules.pvconv import PVConv, ProxyVoxelConv
from externs.pvcnn.modules.se import SE3d
from externs.pvcnn.modules.shared_mlp import SharedMLP
from externs.pvcnn.modules.voxelization import Voxelization
================================================
FILE: externs/pvcnn/modules/ball_query.py
================================================
import torch
import torch.nn as nn
import externs.pvcnn.modules.functional as F
__all__ = ['BallQuery']
class BallQuery(nn.Module):
def __init__(self, radius, num_neighbors, include_coordinates=True):
super().__init__()
self.radius = radius
self.num_neighbors = num_neighbors
self.include_coordinates = include_coordinates
def forward(self, points_coords, centers_coords, points_features=None):
points_coords = points_coords.contiguous()
centers_coords = centers_coords.contiguous()
neighbor_indices = F.ball_query(centers_coords, points_coords, self.radius, self.num_neighbors)
neighbor_coordinates = F.grouping(points_coords, neighbor_indices)
neighbor_coordinates = neighbor_coordinates - centers_coords.unsqueeze(-1)
if points_features is None:
assert self.include_coordinates, 'No Features For Grouping'
neighbor_features = neighbor_coordinates
else:
neighbor_features = F.grouping(points_features, neighbor_indices)
if self.include_coordinates:
neighbor_features = torch.cat([neighbor_coordinates, neighbor_features], dim=1)
return neighbor_features
def extra_repr(self):
return 'radius={}, num_neighbors={}{}'.format(
self.radius, self.num_neighbors, ', include coordinates' if self.include_coordinates else '')
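The `F.ball_query` call above is backed by a CUDA kernel. As a minimal pure-Python sketch of its semantics (the function name `ball_query_ref` and the list-of-tuples layout are illustrative, not part of the repo): each center collects up to `num_neighbors` point indices within `radius`, and unused slots are padded with the first in-radius index found.

```python
def ball_query_ref(centers, points, radius, num_neighbors):
    """Reference for ball-query semantics on one batch element.

    centers, points: lists of (x, y, z) tuples.
    Returns a list of `num_neighbors` indices per center; slots beyond
    the number of in-radius hits repeat the first hit.
    """
    r2 = radius * radius
    out = []
    for cx, cy, cz in centers:
        slots = [0] * num_neighbors
        cnt = 0
        for k, (px, py, pz) in enumerate(points):
            d2 = (cx - px) ** 2 + (cy - py) ** 2 + (cz - pz) ** 2
            if d2 < r2:
                if cnt == 0:
                    slots = [k] * num_neighbors  # pad all slots with first hit
                slots[cnt] = k
                cnt += 1
                if cnt == num_neighbors:
                    break
        out.append(slots)
    return out
```

The padding matters downstream: `F.grouping` gathers features at these indices, so duplicated padding indices simply repeat the first neighbor's feature rather than introducing garbage.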
================================================
FILE: externs/pvcnn/modules/frustum.py
================================================
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
import externs.pvcnn.modules.functional as PF
__all__ = ['FrustumPointNetLoss', 'get_box_corners_3d']
class FrustumPointNetLoss(nn.Module):
def __init__(self, num_heading_angle_bins, num_size_templates, size_templates, box_loss_weight=1.0,
corners_loss_weight=10.0, heading_residual_loss_weight=20.0, size_residual_loss_weight=20.0):
super().__init__()
self.box_loss_weight = box_loss_weight
self.corners_loss_weight = corners_loss_weight
self.heading_residual_loss_weight = heading_residual_loss_weight
self.size_residual_loss_weight = size_residual_loss_weight
self.num_heading_angle_bins = num_heading_angle_bins
self.num_size_templates = num_size_templates
self.register_buffer('size_templates', size_templates.view(self.num_size_templates, 3))
self.register_buffer(
'heading_angle_bin_centers', torch.arange(0, 2 * np.pi, 2 * np.pi / self.num_heading_angle_bins)
)
def forward(self, inputs, targets):
mask_logits = inputs['mask_logits'] # (B, 2, N)
center_reg = inputs['center_reg'] # (B, 3)
center = inputs['center'] # (B, 3)
heading_scores = inputs['heading_scores'] # (B, NH)
heading_residuals_normalized = inputs['heading_residuals_normalized'] # (B, NH)
heading_residuals = inputs['heading_residuals'] # (B, NH)
size_scores = inputs['size_scores'] # (B, NS)
size_residuals_normalized = inputs['size_residuals_normalized'] # (B, NS, 3)
size_residuals = inputs['size_residuals'] # (B, NS, 3)
mask_logits_target = targets['mask_logits'] # (B, N)
center_target = targets['center'] # (B, 3)
heading_bin_id_target = targets['heading_bin_id'] # (B, )
heading_residual_target = targets['heading_residual'] # (B, )
size_template_id_target = targets['size_template_id'] # (B, )
size_residual_target = targets['size_residual'] # (B, 3)
batch_size = center.size(0)
batch_id = torch.arange(batch_size, device=center.device)
# Basic Classification and Regression losses
mask_loss = F.cross_entropy(mask_logits, mask_logits_target)
heading_loss = F.cross_entropy(heading_scores, heading_bin_id_target)
size_loss = F.cross_entropy(size_scores, size_template_id_target)
center_loss = PF.huber_loss(torch.norm(center_target - center, dim=-1), delta=2.0)
center_reg_loss = PF.huber_loss(torch.norm(center_target - center_reg, dim=-1), delta=1.0)
# Refinement losses for size/heading
heading_residuals_normalized = heading_residuals_normalized[batch_id, heading_bin_id_target] # (B, )
heading_residual_normalized_target = heading_residual_target / (np.pi / self.num_heading_angle_bins)
heading_residual_normalized_loss = PF.huber_loss(
heading_residuals_normalized - heading_residual_normalized_target, delta=1.0
)
size_residuals_normalized = size_residuals_normalized[batch_id, size_template_id_target] # (B, 3)
size_residual_normalized_target = size_residual_target / self.size_templates[size_template_id_target]
size_residual_normalized_loss = PF.huber_loss(
torch.norm(size_residual_normalized_target - size_residuals_normalized, dim=-1), delta=1.0
)
# Bounding box losses
heading = (heading_residuals[batch_id, heading_bin_id_target]
+ self.heading_angle_bin_centers[heading_bin_id_target]) # (B, )
# Warning: in the original code, size_residuals are added twice (issues #43 and #49 in charlesq34/frustum-pointnets)
size = (size_residuals[batch_id, size_template_id_target]
+ self.size_templates[size_template_id_target]) # (B, 3)
corners = get_box_corners_3d(centers=center, headings=heading, sizes=size, with_flip=False) # (B, 3, 8)
heading_target = self.heading_angle_bin_centers[heading_bin_id_target] + heading_residual_target # (B, )
size_target = self.size_templates[size_template_id_target] + size_residual_target # (B, 3)
corners_target, corners_target_flip = get_box_corners_3d(centers=center_target, headings=heading_target,
sizes=size_target, with_flip=True) # (B, 3, 8)
corners_loss = PF.huber_loss(torch.min(
torch.norm(corners - corners_target, dim=1), torch.norm(corners - corners_target_flip, dim=1)
), delta=1.0)
# Summing up
loss = mask_loss + self.box_loss_weight * (
center_loss + center_reg_loss + heading_loss + size_loss
+ self.heading_residual_loss_weight * heading_residual_normalized_loss
+ self.size_residual_loss_weight * size_residual_normalized_loss
+ self.corners_loss_weight * corners_loss
)
return loss
def get_box_corners_3d(centers, headings, sizes, with_flip=False):
"""
:param centers: coords of box centers, FloatTensor[N, 3]
:param headings: heading angles, FloatTensor[N, ]
:param sizes: box sizes, FloatTensor[N, 3]
:param with_flip: bool, whether to return flipped box (headings + np.pi)
:return:
coords of box corners, FloatTensor[N, 3, 8]
NOTE: corner points are in counter clockwise order, e.g.,
2--1
3--0 5
7--4
"""
l = sizes[:, 0] # (N,)
w = sizes[:, 1] # (N,)
h = sizes[:, 2] # (N,)
x_corners = torch.stack([l/2, l/2, -l/2, -l/2, l/2, l/2, -l/2, -l/2], dim=1) # (N, 8)
y_corners = torch.stack([h/2, h/2, h/2, h/2, -h/2, -h/2, -h/2, -h/2], dim=1) # (N, 8)
z_corners = torch.stack([w/2, -w/2, -w/2, w/2, w/2, -w/2, -w/2, w/2], dim=1) # (N, 8)
c = torch.cos(headings) # (N,)
s = torch.sin(headings) # (N,)
o = torch.ones_like(headings) # (N,)
z = torch.zeros_like(headings) # (N,)
centers = centers.unsqueeze(-1)  # (N, 3, 1)
corners = torch.stack([x_corners, y_corners, z_corners], dim=1) # (N, 3, 8)
R = torch.stack([c, z, s, z, o, z, -s, z, c], dim=1).view(-1, 3, 3) # roty matrix: (N, 3, 3)
if with_flip:
R_flip = torch.stack([-c, z, -s, z, o, z, s, z, -c], dim=1).view(-1, 3, 3)
return torch.matmul(R, corners) + centers, torch.matmul(R_flip, corners) + centers
else:
return torch.matmul(R, corners) + centers
# centers = centers.unsqueeze(1) # (B, 1, 3)
# corners = torch.stack([x_corners, y_corners, z_corners], dim=-1) # (N, 8, 3)
# RT = torch.stack([c, z, -s, z, o, z, s, z, c], dim=1).view(-1, 3, 3) # (N, 3, 3)
# if with_flip:
# RT_flip = torch.stack([-c, z, s, z, o, z, -s, z, -c], dim=1).view(-1, 3, 3) # (N, 3, 3)
# return torch.matmul(corners, RT) + centers, torch.matmul(corners, RT_flip) + centers # (N, 8, 3)
# else:
# return torch.matmul(corners, RT) + centers # (N, 8, 3)
# corners = torch.stack([x_corners, y_corners, z_corners], dim=1) # (N, 3, 8)
# R = torch.stack([c, z, s, z, o, z, -s, z, c], dim=1).view(-1, 3, 3) # (N, 3, 3)
# corners = torch.matmul(R, corners) + centers.unsqueeze(2) # (N, 3, 8)
# corners = corners.transpose(1, 2) # (N, 8, 3)
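A scalar sketch of what `get_box_corners_3d` computes per sample may help: rotate the eight axis-aligned corners about the y axis by `heading`, then translate by the center. The function name `box_corners` and the tuple-based layout are illustrative only; the corner ordering mirrors the stacks above.

```python
import math

def box_corners(center, heading, size):
    """Corners of one oriented 3D box (rotation about the y axis)."""
    l, w, h = size
    xs = [ l/2,  l/2, -l/2, -l/2,  l/2,  l/2, -l/2, -l/2]
    ys = [ h/2,  h/2,  h/2,  h/2, -h/2, -h/2, -h/2, -h/2]
    zs = [ w/2, -w/2, -w/2,  w/2,  w/2, -w/2, -w/2,  w/2]
    c, s = math.cos(heading), math.sin(heading)
    corners = []
    for x, y, z in zip(xs, ys, zs):
        # roty: [c 0 s; 0 1 0; -s 0 c] applied to (x, y, z), then translate
        corners.append((c * x + s * z + center[0],
                        y + center[1],
                        -s * x + c * z + center[2]))
    return corners
```

With `heading = 0` the box stays axis-aligned, so the first corner sits at `(l/2, h/2, w/2)` relative to the center, matching the batched torch version.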
================================================
FILE: externs/pvcnn/modules/functional/__init__.py
================================================
# from modules.functional.ball_query import ball_query
from externs.pvcnn.modules.functional.devoxelization import trilinear_devoxelize
# from modules.functional.grouping import grouping
# from modules.functional.interpolatation import nearest_neighbor_interpolate
# from modules.functional.loss import kl_loss, huber_loss
# from modules.functional.sampling import gather, furthest_point_sample, logits_mask
from externs.pvcnn.modules.functional.voxelization import avg_voxelize
================================================
FILE: externs/pvcnn/modules/functional/backend.py
================================================
import os
from torch.utils.cpp_extension import load
_src_path = os.path.dirname(os.path.abspath(__file__))
_backend = load(name='_pvcnn_backend',
extra_cflags=['-O3', '-std=c++17'],
sources=[os.path.join(_src_path,'src', f) for f in [
'ball_query/ball_query.cpp',
'ball_query/ball_query.cu',
'grouping/grouping.cpp',
'grouping/grouping.cu',
'interpolate/neighbor_interpolate.cpp',
'interpolate/neighbor_interpolate.cu',
'interpolate/trilinear_devox.cpp',
'interpolate/trilinear_devox.cu',
'sampling/sampling.cpp',
'sampling/sampling.cu',
'voxelization/vox.cpp',
'voxelization/vox.cu',
'bindings.cpp',
]]
)
__all__ = ['_backend']
================================================
FILE: externs/pvcnn/modules/functional/ball_query.py
================================================
from torch.autograd import Function
from externs.pvcnn.modules.functional.backend import _backend
__all__ = ['ball_query']
def ball_query(centers_coords, points_coords, radius, num_neighbors):
"""
:param centers_coords: coordinates of centers, FloatTensor[B, 3, M]
:param points_coords: coordinates of points, FloatTensor[B, 3, N]
:param radius: float, radius of ball query
:param num_neighbors: int, maximum number of neighbors
:return:
neighbor_indices: indices of neighbors, IntTensor[B, M, U]
"""
centers_coords = centers_coords.contiguous()
points_coords = points_coords.contiguous()
return _backend.ball_query(centers_coords, points_coords, radius, num_neighbors)
================================================
FILE: externs/pvcnn/modules/functional/devoxelization.py
================================================
from torch.autograd import Function
from externs.pvcnn.modules.functional.backend import _backend
__all__ = ['trilinear_devoxelize']
class TrilinearDevoxelization(Function):
@staticmethod
def forward(ctx, features, coords, resolution, is_training=True):
"""
:param ctx:
:param coords: the coordinates of points, FloatTensor[B, 3, N]
:param features: FloatTensor[B, C, R, R, R]
:param resolution: int, the voxel resolution
:param is_training: bool, training mode
:return:
FloatTensor[B, C, N]
"""
B, C = features.shape[:2]
features = features.contiguous().view(B, C, -1)
coords = coords.contiguous()
outs, inds, wgts = _backend.trilinear_devoxelize_forward(resolution, is_training, coords, features)
if is_training:
ctx.save_for_backward(inds, wgts)
ctx.r = resolution
return outs
@staticmethod
def backward(ctx, grad_output):
"""
:param ctx:
:param grad_output: gradient of outputs, FloatTensor[B, C, N]
:return:
gradient of inputs, FloatTensor[B, C, R, R, R]
"""
inds, wgts = ctx.saved_tensors
grad_inputs = _backend.trilinear_devoxelize_backward(grad_output.contiguous(), inds, wgts, ctx.r)
return grad_inputs.view(grad_output.size(0), grad_output.size(1), ctx.r, ctx.r, ctx.r), None, None, None
trilinear_devoxelize = TrilinearDevoxelization.apply
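For intuition about what the backend's `trilinear_devoxelize_forward` interpolates, here is a pure-Python sketch of the per-point weights (the helper name `trilinear_weights` is illustrative): each point in voxel-grid coordinates reads from its eight surrounding voxel corners with weights given by the fractional offsets.

```python
def trilinear_weights(x, y, z):
    """Corner indices and trilinear weights for one point in grid coords.

    Returns a list of ((ix, iy, iz), weight) pairs; the weights sum to 1.
    """
    x0, y0, z0 = int(x), int(y), int(z)
    fx, fy, fz = x - x0, y - y0, z - z0
    out = []
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                # each axis contributes f or (1 - f) depending on the corner
                w = ((fx if dx else 1 - fx) *
                     (fy if dy else 1 - fy) *
                     (fz if dz else 1 - fz))
                out.append(((x0 + dx, y0 + dy, z0 + dz), w))
    return out
```

The backward pass saves exactly these indices and weights (`inds`, `wgts` above) so gradients can be scattered back to the same eight corners.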
================================================
FILE: externs/pvcnn/modules/functional/grouping.py
================================================
from torch.autograd import Function
from externs.pvcnn.modules.functional.backend import _backend
__all__ = ['grouping']
class Grouping(Function):
@staticmethod
def forward(ctx, features, indices):
"""
:param ctx:
:param features: features of points, FloatTensor[B, C, N]
:param indices: neighbor indices of centers, IntTensor[B, M, U], M is #centers, U is #neighbors
:return:
grouped_features: grouped features, FloatTensor[B, C, M, U]
"""
features = features.contiguous()
indices = indices.contiguous()
ctx.save_for_backward(indices)
ctx.num_points = features.size(-1)
return _backend.grouping_forward(features, indices)
@staticmethod
def backward(ctx, grad_output):
indices, = ctx.saved_tensors
grad_features = _backend.grouping_backward(grad_output.contiguous(), indices, ctx.num_points)
return grad_features, None
grouping = Grouping.apply
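The gather performed by `grouping_forward` is simple to state per batch element; the following sketch (with an illustrative name, `grouping_ref`, and nested lists in place of tensors) makes the index pattern explicit: `out[c][j][k] = features[c][indices[j][k]]`.

```python
def grouping_ref(features, indices):
    """Grouping semantics for one batch element.

    features: [C][N] per-channel point features.
    indices:  [M][U] neighbor indices per center.
    Returns [C][M][U] grouped features.
    """
    return [[[feat_c[idx] for idx in nbrs] for nbrs in indices]
            for feat_c in features]
```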
================================================
FILE: externs/pvcnn/modules/functional/interpolatation.py
================================================
from torch.autograd import Function
from externs.pvcnn.modules.functional.backend import _backend
__all__ = ['nearest_neighbor_interpolate']
class NeighborInterpolation(Function):
@staticmethod
def forward(ctx, points_coords, centers_coords, centers_features):
"""
:param ctx:
:param points_coords: coordinates of points, FloatTensor[B, 3, N]
:param centers_coords: coordinates of centers, FloatTensor[B, 3, M]
:param centers_features: features of centers, FloatTensor[B, C, M]
:return:
points_features: features of points, FloatTensor[B, C, N]
"""
centers_coords = centers_coords.contiguous()
points_coords = points_coords.contiguous()
centers_features = centers_features.contiguous()
points_features, indices, weights = _backend.three_nearest_neighbors_interpolate_forward(
points_coords, centers_coords, centers_features
)
ctx.save_for_backward(indices, weights)
ctx.num_centers = centers_coords.size(-1)
return points_features
@staticmethod
def backward(ctx, grad_output):
indices, weights = ctx.saved_tensors
grad_centers_features = _backend.three_nearest_neighbors_interpolate_backward(
grad_output.contiguous(), indices, weights, ctx.num_centers
)
return None, None, grad_centers_features
nearest_neighbor_interpolate = NeighborInterpolation.apply
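The CUDA backend interpolates each point from its three nearest centers with inverse-distance-squared weights. A pure-Python sketch for a single point (the name `three_nn_interpolate` and the clamping constants, which follow the kernel later in this dump, are assumptions of this illustration):

```python
def three_nn_interpolate(point, centers, centers_features):
    """Interpolate one point's features from its 3 nearest centers.

    point: (x, y, z); centers: list of (x, y, z);
    centers_features: [C][M] per-channel center features.
    Weight for neighbor i is proportional to the product of the other
    two squared distances, so closer centers dominate.
    """
    d2 = [max(min(sum((p - c) ** 2 for p, c in zip(point, ctr)), 1e10), 1e-10)
          for ctr in centers]
    order = sorted(range(len(centers)), key=lambda i: d2[i])[:3]
    d0, d1, d2_ = (d2[i] for i in order)
    denom = d0 * d1 + d0 * d2_ + d1 * d2_
    w = [d1 * d2_ / denom, d0 * d2_ / denom, d0 * d1 / denom]
    return [sum(wi * f[i] for wi, i in zip(w, order)) for f in centers_features]
```

When the point coincides with a center, the clamped distance `1e-10` gives that center essentially all the weight, so the interpolation reduces to a nearest-neighbor copy.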
================================================
FILE: externs/pvcnn/modules/functional/loss.py
================================================
import torch
import torch.nn.functional as F
__all__ = ['kl_loss', 'huber_loss']
def kl_loss(x, y):
x = F.softmax(x.detach(), dim=1)
y = F.log_softmax(y, dim=1)
return torch.mean(torch.sum(x * (torch.log(x) - y), dim=1))
def huber_loss(error, delta):
abs_error = torch.abs(error)
quadratic = torch.min(abs_error, torch.full_like(abs_error, fill_value=delta))
losses = 0.5 * (quadratic ** 2) + delta * (abs_error - quadratic)
return torch.mean(losses)
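The `huber_loss` above is quadratic for errors inside `delta` and linear outside. A stdlib mirror (the name `huber` is illustrative) makes the two regimes easy to check by hand:

```python
def huber(errors, delta):
    """Mean Huber loss over a list of scalar errors.

    |e| <= delta: 0.5 * e**2
    |e| >  delta: delta * (|e| - 0.5 * delta)
    """
    total = 0.0
    for e in errors:
        a = abs(e)
        q = min(a, delta)  # the "quadratic" part, capped at delta
        total += 0.5 * q * q + delta * (a - q)
    return total / len(errors)
```

For example, with `delta = 1`: an error of 0.5 lies in the quadratic regime (0.5 * 0.25 = 0.125), while an error of 2 lies in the linear regime (0.5 + 1 * (2 - 1) = 1.5).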
================================================
FILE: externs/pvcnn/modules/functional/sampling.py
================================================
import numpy as np
import torch
from torch.autograd import Function
from externs.pvcnn.modules.functional.backend import _backend
__all__ = ['gather', 'furthest_point_sample', 'logits_mask']
class Gather(Function):
@staticmethod
def forward(ctx, features, indices):
"""
Gather
:param ctx:
:param features: features of points, FloatTensor[B, C, N]
:param indices: centers' indices in points, IntTensor[B, M]
:return:
centers_coords: coordinates of sampled centers, FloatTensor[B, C, M]
"""
features = features.contiguous()
indices = indices.int().contiguous()
ctx.save_for_backward(indices)
ctx.num_points = features.size(-1)
return _backend.gather_features_forward(features, indices)
@staticmethod
def backward(ctx, grad_output):
indices, = ctx.saved_tensors
grad_features = _backend.gather_features_backward(grad_output.contiguous(), indices, ctx.num_points)
return grad_features, None
gather = Gather.apply
def furthest_point_sample(coords, num_samples):
"""
Uses iterative furthest point sampling to select a set of npoint features that have the largest
minimum distance to the sampled point set
:param coords: coordinates of points, FloatTensor[B, 3, N]
:param num_samples: int, M
:return:
centers_coords: coordinates of sampled centers, FloatTensor[B, 3, M]
"""
coords = coords.contiguous()
indices = _backend.furthest_point_sampling(coords, num_samples)
return gather(coords, indices)
def logits_mask(coords, logits, num_points_per_object):
"""
Use logits to sample points
:param coords: coords of points, FloatTensor[B, 3, N]
:param logits: binary classification logits, FloatTensor[B, 2, N]
:param num_points_per_object: M, #points per object after masking, int
:return:
selected_coords: FloatTensor[B, 3, M]
masked_coords_mean: mean coords of selected points, FloatTensor[B, 3]
mask: mask to select points, BoolTensor[B, N]
"""
batch_size, _, num_points = coords.shape
mask = torch.lt(logits[:, 0, :], logits[:, 1, :]) # [B, N]
num_candidates = torch.sum(mask, dim=-1, keepdim=True) # [B, 1]
masked_coords = coords * mask.view(batch_size, 1, num_points) # [B, C, N]
masked_coords_mean = torch.sum(masked_coords, dim=-1) / torch.max(num_candidates,
torch.ones_like(num_candidates)).float() # [B, C]
selected_indices = torch.zeros((batch_size, num_points_per_object), device=coords.device, dtype=torch.int32)
for i in range(batch_size):
current_mask = mask[i] # [N]
current_candidates = current_mask.nonzero().view(-1)
current_num_candidates = current_candidates.numel()
if current_num_candidates >= num_points_per_object:
choices = np.random.choice(current_num_candidates, num_points_per_object, replace=False)
selected_indices[i] = current_candidates[choices]
elif current_num_candidates > 0:
choices = np.concatenate([
np.arange(current_num_candidates).repeat(num_points_per_object // current_num_candidates),
np.random.choice(current_num_candidates, num_points_per_object % current_num_candidates, replace=False)
])
np.random.shuffle(choices)
selected_indices[i] = current_candidates[choices]
selected_coords = gather(masked_coords - masked_coords_mean.view(batch_size, -1, 1), selected_indices)
return selected_coords, masked_coords_mean, mask
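`furthest_point_sample` defers to a CUDA kernel; as a minimal sketch of the greedy algorithm for one batch element (the name `furthest_point_sample_ref` is illustrative, and starting from index 0 is an assumption matching the usual implementation):

```python
def furthest_point_sample_ref(coords, num_samples):
    """Greedy farthest point sampling over a list of (x, y, z) points.

    Maintains, for every point, its squared distance to the nearest
    already-selected point, and repeatedly selects the point that
    maximizes that distance.
    """
    selected = [0]
    min_d2 = [sum((a - b) ** 2 for a, b in zip(p, coords[0])) for p in coords]
    while len(selected) < num_samples:
        nxt = max(range(len(coords)), key=lambda i: min_d2[i])
        selected.append(nxt)
        for i, p in enumerate(coords):
            d2 = sum((a - b) ** 2 for a, b in zip(p, coords[nxt]))
            min_d2[i] = min(min_d2[i], d2)
    return selected
```

On ten evenly spaced points along a line, the sampler first jumps to the far end and then to a point near the middle, which is the "largest minimum distance" behavior described in the docstring above.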
================================================
FILE: externs/pvcnn/modules/functional/src/ball_query/ball_query.cpp
================================================
#include "ball_query.hpp"
#include "ball_query.cuh"
#include "../utils.hpp"
at::Tensor ball_query_forward(at::Tensor centers_coords,
at::Tensor points_coords, const float radius,
const int num_neighbors) {
CHECK_CUDA(centers_coords);
CHECK_CUDA(points_coords);
CHECK_CONTIGUOUS(centers_coords);
CHECK_CONTIGUOUS(points_coords);
CHECK_IS_FLOAT(centers_coords);
CHECK_IS_FLOAT(points_coords);
int b = centers_coords.size(0);
int m = centers_coords.size(2);
int n = points_coords.size(2);
at::Tensor neighbors_indices = torch::zeros(
{b, m, num_neighbors},
at::device(centers_coords.device()).dtype(at::ScalarType::Int));
ball_query(b, n, m, radius * radius, num_neighbors,
             centers_coords.data_ptr<float>(),
             points_coords.data_ptr<float>(),
             neighbors_indices.data_ptr<int>());
return neighbors_indices;
}
================================================
FILE: externs/pvcnn/modules/functional/src/ball_query/ball_query.cu
================================================
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include "../cuda_utils.cuh"
/*
Function: ball query
Args:
b : batch size
n : number of points in point clouds
m : number of query centers
r2 : ball query radius ** 2
u : maximum number of neighbors
centers_coords: coordinates of centers, FloatTensor[b, 3, m]
points_coords : coordinates of points, FloatTensor[b, 3, n]
neighbors_indices : neighbor indices in points, IntTensor[b, m, u]
*/
__global__ void ball_query_kernel(int b, int n, int m, float r2, int u,
const float *__restrict__ centers_coords,
const float *__restrict__ points_coords,
int *__restrict__ neighbors_indices) {
int batch_index = blockIdx.x;
int index = threadIdx.x;
int stride = blockDim.x;
points_coords += batch_index * n * 3;
centers_coords += batch_index * m * 3;
neighbors_indices += batch_index * m * u;
for (int j = index; j < m; j += stride) {
float center_x = centers_coords[j];
float center_y = centers_coords[j + m];
float center_z = centers_coords[j + m + m];
for (int k = 0, cnt = 0; k < n && cnt < u; ++k) {
float dx = center_x - points_coords[k];
float dy = center_y - points_coords[k + n];
float dz = center_z - points_coords[k + n + n];
float d2 = dx * dx + dy * dy + dz * dz;
if (d2 < r2) {
if (cnt == 0) {
for (int v = 0; v < u; ++v) {
neighbors_indices[j * u + v] = k;
}
}
neighbors_indices[j * u + cnt] = k;
++cnt;
}
}
}
}
void ball_query(int b, int n, int m, float r2, int u,
const float *centers_coords, const float *points_coords,
int *neighbors_indices) {
ball_query_kernel<<<b, optimal_num_threads(m), 0, at::cuda::getCurrentCUDAStream()>>>(
b, n, m, r2, u, centers_coords, points_coords, neighbors_indices);
CUDA_CHECK_ERRORS();
}
================================================
FILE: externs/pvcnn/modules/functional/src/ball_query/ball_query.cuh
================================================
#ifndef _BALL_QUERY_CUH
#define _BALL_QUERY_CUH
void ball_query(int b, int n, int m, float r2, int u,
const float *centers_coords, const float *points_coords,
int *neighbors_indices);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/ball_query/ball_query.hpp
================================================
#ifndef _BALL_QUERY_HPP
#define _BALL_QUERY_HPP
#include <torch/extension.h>
at::Tensor ball_query_forward(at::Tensor centers_coords,
at::Tensor points_coords, const float radius,
const int num_neighbors);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/bindings.cpp
================================================
#include <pybind11/pybind11.h>
#include "ball_query/ball_query.hpp"
#include "grouping/grouping.hpp"
#include "interpolate/neighbor_interpolate.hpp"
#include "interpolate/trilinear_devox.hpp"
#include "sampling/sampling.hpp"
#include "voxelization/vox.hpp"
PYBIND11_MODULE(_pvcnn_backend, m) {
m.def("gather_features_forward", &gather_features_forward,
"Gather Centers' Features forward (CUDA)");
m.def("gather_features_backward", &gather_features_backward,
"Gather Centers' Features backward (CUDA)");
m.def("furthest_point_sampling", &furthest_point_sampling_forward,
"Furthest Point Sampling (CUDA)");
m.def("ball_query", &ball_query_forward, "Ball Query (CUDA)");
m.def("grouping_forward", &grouping_forward,
"Grouping Features forward (CUDA)");
m.def("grouping_backward", &grouping_backward,
"Grouping Features backward (CUDA)");
m.def("three_nearest_neighbors_interpolate_forward",
&three_nearest_neighbors_interpolate_forward,
"3 Nearest Neighbors Interpolate forward (CUDA)");
m.def("three_nearest_neighbors_interpolate_backward",
&three_nearest_neighbors_interpolate_backward,
"3 Nearest Neighbors Interpolate backward (CUDA)");
m.def("trilinear_devoxelize_forward", &trilinear_devoxelize_forward,
"Trilinear Devoxelization forward (CUDA)");
m.def("trilinear_devoxelize_backward", &trilinear_devoxelize_backward,
"Trilinear Devoxelization backward (CUDA)");
m.def("avg_voxelize_forward", &avg_voxelize_forward,
"Voxelization forward with average pooling (CUDA)");
m.def("avg_voxelize_backward", &avg_voxelize_backward,
"Voxelization backward (CUDA)");
}
================================================
FILE: externs/pvcnn/modules/functional/src/cuda_utils.cuh
================================================
#ifndef _CUDA_UTILS_H
#define _CUDA_UTILS_H
#include <cmath>
#include <cstdio>
#include <cstdlib>
#include <cuda.h>
#include <cuda_runtime.h>
#include <ATen/cuda/CUDAContext.h>
#define MAXIMUM_THREADS 512
inline int optimal_num_threads(int work_size) {
const int pow_2 = std::log2(static_cast<double>(work_size));
return max(min(1 << pow_2, MAXIMUM_THREADS), 1);
}
inline dim3 optimal_block_config(int x, int y) {
const int x_threads = optimal_num_threads(x);
const int y_threads =
max(min(optimal_num_threads(y), MAXIMUM_THREADS / x_threads), 1);
dim3 block_config(x_threads, y_threads, 1);
return block_config;
}
#define CUDA_CHECK_ERRORS() \
{ \
cudaError_t err = cudaGetLastError(); \
if (cudaSuccess != err) { \
fprintf(stderr, "CUDA kernel failed : %s\n%s at L:%d in %s\n", \
cudaGetErrorString(err), __PRETTY_FUNCTION__, __LINE__, \
__FILE__); \
exit(-1); \
} \
}
#endif
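The launch-configuration helpers above pick the largest power of two that fits the work size, capped at `MAXIMUM_THREADS`. A Python mirror of `optimal_num_threads` (name and default reproduced from the header above; this is a sketch for intuition, not repo code):

```python
import math

def optimal_num_threads(work_size, maximum_threads=512):
    """Largest power of two <= work_size, clamped to [1, maximum_threads]."""
    pow_2 = int(math.log2(work_size))  # floor of log2
    return max(min(1 << pow_2, maximum_threads), 1)
```

So a kernel over 100 centers launches 64 threads per block, and anything at or beyond 512 units of work saturates the 512-thread cap.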
================================================
FILE: externs/pvcnn/modules/functional/src/grouping/grouping.cpp
================================================
#include "grouping.hpp"
#include "grouping.cuh"
#include "../utils.hpp"
at::Tensor grouping_forward(at::Tensor features, at::Tensor indices) {
CHECK_CUDA(features);
CHECK_CUDA(indices);
CHECK_CONTIGUOUS(features);
CHECK_CONTIGUOUS(indices);
CHECK_IS_FLOAT(features);
CHECK_IS_INT(indices);
int b = features.size(0);
int c = features.size(1);
int n = features.size(2);
int m = indices.size(1);
int u = indices.size(2);
at::Tensor output = torch::zeros(
{b, c, m, u}, at::device(features.device()).dtype(at::ScalarType::Float));
grouping(b, c, n, m, u, features.data_ptr<float>(), indices.data_ptr<int>(),
         output.data_ptr<float>());
return output;
}
at::Tensor grouping_backward(at::Tensor grad_y, at::Tensor indices,
const int n) {
CHECK_CUDA(grad_y);
CHECK_CUDA(indices);
CHECK_CONTIGUOUS(grad_y);
CHECK_CONTIGUOUS(indices);
CHECK_IS_FLOAT(grad_y);
CHECK_IS_INT(indices);
int b = grad_y.size(0);
int c = grad_y.size(1);
int m = indices.size(1);
int u = indices.size(2);
at::Tensor grad_x = torch::zeros(
{b, c, n}, at::device(grad_y.device()).dtype(at::ScalarType::Float));
grouping_grad(b, c, n, m, u, grad_y.data_ptr<float>(),
              indices.data_ptr<int>(), grad_x.data_ptr<float>());
return grad_x;
}
================================================
FILE: externs/pvcnn/modules/functional/src/grouping/grouping.cu
================================================
#include <stdio.h>
#include <stdlib.h>
#include "../cuda_utils.cuh"
/*
Function: grouping features of neighbors (forward)
Args:
b : batch size
c : #channels of features
n : number of points in point clouds
m : number of query centers
u : maximum number of neighbors
features: points' features, FloatTensor[b, c, n]
indices : neighbor indices in points, IntTensor[b, m, u]
out : gathered features, FloatTensor[b, c, m, u]
*/
__global__ void grouping_kernel(int b, int c, int n, int m, int u,
const float *__restrict__ features,
const int *__restrict__ indices,
float *__restrict__ out) {
int batch_index = blockIdx.x;
features += batch_index * n * c;
indices += batch_index * m * u;
out += batch_index * m * u * c;
const int index = threadIdx.y * blockDim.x + threadIdx.x;
const int stride = blockDim.y * blockDim.x;
for (int i = index; i < c * m; i += stride) {
const int l = i / m;
const int j = i % m;
for (int k = 0; k < u; ++k) {
out[(l * m + j) * u + k] = features[l * n + indices[j * u + k]];
}
}
}
void grouping(int b, int c, int n, int m, int u, const float *features,
const int *indices, float *out) {
grouping_kernel<<<b, optimal_block_config(m, c), 0, at::cuda::getCurrentCUDAStream()>>>(b, c, n, m, u, features,
indices, out);
CUDA_CHECK_ERRORS();
}
/*
Function: grouping features of neighbors (backward)
Args:
b : batch size
c : #channels of features
n : number of points in point clouds
m : number of query centers
u : maximum number of neighbors
grad_y : grad of gathered features, FloatTensor[b, c, m, u]
indices : neighbor indices in points, IntTensor[b, m, u]
grad_x: grad of points' features, FloatTensor[b, c, n]
*/
__global__ void grouping_grad_kernel(int b, int c, int n, int m, int u,
const float *__restrict__ grad_y,
const int *__restrict__ indices,
float *__restrict__ grad_x) {
int batch_index = blockIdx.x;
grad_y += batch_index * m * u * c;
indices += batch_index * m * u;
grad_x += batch_index * n * c;
const int index = threadIdx.y * blockDim.x + threadIdx.x;
const int stride = blockDim.y * blockDim.x;
for (int i = index; i < c * m; i += stride) {
const int l = i / m;
const int j = i % m;
for (int k = 0; k < u; ++k) {
atomicAdd(grad_x + l * n + indices[j * u + k],
grad_y[(l * m + j) * u + k]);
}
}
}
void grouping_grad(int b, int c, int n, int m, int u, const float *grad_y,
const int *indices, float *grad_x) {
grouping_grad_kernel<<<b, optimal_block_config(m, c), 0, at::cuda::getCurrentCUDAStream()>>>(
b, c, n, m, u, grad_y, indices, grad_x);
CUDA_CHECK_ERRORS();
}
================================================
FILE: externs/pvcnn/modules/functional/src/grouping/grouping.cuh
================================================
#ifndef _GROUPING_CUH
#define _GROUPING_CUH
void grouping(int b, int c, int n, int m, int u, const float *features,
const int *indices, float *out);
void grouping_grad(int b, int c, int n, int m, int u, const float *grad_y,
const int *indices, float *grad_x);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/grouping/grouping.hpp
================================================
#ifndef _GROUPING_HPP
#define _GROUPING_HPP
#include <torch/extension.h>
at::Tensor grouping_forward(at::Tensor features, at::Tensor indices);
at::Tensor grouping_backward(at::Tensor grad_y, at::Tensor indices,
const int n);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cpp
================================================
#include "neighbor_interpolate.hpp"
#include "neighbor_interpolate.cuh"
#include "../utils.hpp"
std::vector<at::Tensor>
three_nearest_neighbors_interpolate_forward(at::Tensor points_coords,
at::Tensor centers_coords,
at::Tensor centers_features) {
CHECK_CUDA(points_coords);
CHECK_CUDA(centers_coords);
CHECK_CUDA(centers_features);
CHECK_CONTIGUOUS(points_coords);
CHECK_CONTIGUOUS(centers_coords);
CHECK_CONTIGUOUS(centers_features);
CHECK_IS_FLOAT(points_coords);
CHECK_IS_FLOAT(centers_coords);
CHECK_IS_FLOAT(centers_features);
int b = centers_features.size(0);
int c = centers_features.size(1);
int m = centers_features.size(2);
int n = points_coords.size(2);
at::Tensor indices = torch::zeros(
{b, 3, n}, at::device(points_coords.device()).dtype(at::ScalarType::Int));
at::Tensor weights = torch::zeros(
{b, 3, n},
at::device(points_coords.device()).dtype(at::ScalarType::Float));
at::Tensor output = torch::zeros(
{b, c, n},
at::device(centers_features.device()).dtype(at::ScalarType::Float));
three_nearest_neighbors_interpolate(
    b, c, m, n, points_coords.data_ptr<float>(),
    centers_coords.data_ptr<float>(), centers_features.data_ptr<float>(),
    indices.data_ptr<int>(), weights.data_ptr<float>(),
    output.data_ptr<float>());
return {output, indices, weights};
}
at::Tensor three_nearest_neighbors_interpolate_backward(at::Tensor grad_y,
at::Tensor indices,
at::Tensor weights,
const int m) {
CHECK_CUDA(grad_y);
CHECK_CUDA(indices);
CHECK_CUDA(weights);
CHECK_CONTIGUOUS(grad_y);
CHECK_CONTIGUOUS(indices);
CHECK_CONTIGUOUS(weights);
CHECK_IS_FLOAT(grad_y);
CHECK_IS_INT(indices);
CHECK_IS_FLOAT(weights);
int b = grad_y.size(0);
int c = grad_y.size(1);
int n = grad_y.size(2);
at::Tensor grad_x = torch::zeros(
{b, c, m}, at::device(grad_y.device()).dtype(at::ScalarType::Float));
three_nearest_neighbors_interpolate_grad(
    b, c, n, m, grad_y.data_ptr<float>(), indices.data_ptr<int>(),
    weights.data_ptr<float>(), grad_x.data_ptr<float>());
return grad_x;
}
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cu
================================================
#include <math.h>
#include <stdio.h>
#include <stdlib.h>
#include "../cuda_utils.cuh"
/*
Function: three nearest neighbors
Args:
b : batch size
n : number of points in point clouds
m : number of query centers
points_coords : coordinates of points, FloatTensor[b, 3, n]
centers_coords: coordinates of centers, FloatTensor[b, 3, m]
weights : weights of nearest 3 centers to the point,
FloatTensor[b, 3, n]
indices : indices of nearest 3 centers to the point,
IntTensor[b, 3, n]
*/
__global__ void three_nearest_neighbors_kernel(
int b, int n, int m, const float *__restrict__ points_coords,
const float *__restrict__ centers_coords, float *__restrict__ weights,
int *__restrict__ indices) {
int batch_index = blockIdx.x;
int index = threadIdx.x;
int stride = blockDim.x;
points_coords += batch_index * 3 * n;
weights += batch_index * 3 * n;
indices += batch_index * 3 * n;
centers_coords += batch_index * 3 * m;
for (int j = index; j < n; j += stride) {
float ux = points_coords[j];
float uy = points_coords[j + n];
float uz = points_coords[j + n + n];
double best0 = 1e40, best1 = 1e40, best2 = 1e40;
int besti0 = 0, besti1 = 0, besti2 = 0;
for (int k = 0; k < m; ++k) {
float x = centers_coords[k];
float y = centers_coords[k + m];
float z = centers_coords[k + m + m];
float d = (ux - x) * (ux - x) + (uy - y) * (uy - y) + (uz - z) * (uz - z);
if (d < best2) {
best2 = d;
besti2 = k;
if (d < best1) {
best2 = best1;
besti2 = besti1;
best1 = d;
besti1 = k;
if (d < best0) {
best1 = best0;
besti1 = besti0;
best0 = d;
besti0 = k;
}
}
}
}
best0 = max(min(1e10f, best0), 1e-10f);
best1 = max(min(1e10f, best1), 1e-10f);
best2 = max(min(1e10f, best2), 1e-10f);
float d0d1 = best0 * best1;
float d0d2 = best0 * best2;
float d1d2 = best1 * best2;
float d0d1d2 = 1.0f / (d0d1 + d0d2 + d1d2);
weights[j] = d1d2 * d0d1d2;
indices[j] = besti0;
weights[j + n] = d0d2 * d0d1d2;
indices[j + n] = besti1;
weights[j + n + n] = d0d1 * d0d1d2;
indices[j + n + n] = besti2;
}
}
/*
Function: interpolate three nearest neighbors (forward)
Args:
b : batch size
c : #channels of features
m : number of query centers
n : number of points in point clouds
centers_features: features of centers, FloatTensor[b, c, m]
indices : indices of nearest 3 centers to the point,
IntTensor[b, 3, n]
weights : weights for interpolation, FloatTensor[b, 3, n]
out : features of points, FloatTensor[b, c, n]
*/
__global__ void three_nearest_neighbors_interpolate_kernel(
int b, int c, int m, int n, const float *__restrict__ centers_features,
const int *__restrict__ indices, const float *__restrict__ weights,
float *__restrict__ out) {
int batch_index = blockIdx.x;
centers_features += batch_index * m * c;
indices += batch_index * n * 3;
weights += batch_index * n * 3;
out += batch_index * n * c;
const int index = threadIdx.y * blockDim.x + threadIdx.x;
const int stride = blockDim.y * blockDim.x;
for (int i = index; i < c * n; i += stride) {
const int l = i / n;
const int j = i % n;
float w1 = weights[j];
float w2 = weights[j + n];
float w3 = weights[j + n + n];
int i1 = indices[j];
int i2 = indices[j + n];
int i3 = indices[j + n + n];
out[i] = centers_features[l * m + i1] * w1 +
centers_features[l * m + i2] * w2 +
centers_features[l * m + i3] * w3;
}
}
void three_nearest_neighbors_interpolate(int b, int c, int m, int n,
const float *points_coords,
const float *centers_coords,
const float *centers_features,
int *indices, float *weights,
float *out) {
three_nearest_neighbors_kernel<<<b, optimal_num_threads(n), 0,
at::cuda::getCurrentCUDAStream()>>>(
b, n, m, points_coords, centers_coords, weights, indices);
three_nearest_neighbors_interpolate_kernel<<<
b, optimal_block_config(n, c), 0, at::cuda::getCurrentCUDAStream()>>>(
b, c, m, n, centers_features, indices, weights, out);
CUDA_CHECK_ERRORS();
}
/*
Function: interpolate three nearest neighbors (backward)
Args:
b : batch size
c : #channels of features
m : number of query centers
n : number of points in point clouds
grad_y : grad of features of points, FloatTensor[b, c, n]
indices : indices of nearest 3 centers to the point, IntTensor[b, 3, n]
weights : weights for interpolation, FloatTensor[b, 3, n]
grad_x : grad of features of centers, FloatTensor[b, c, m]
*/
__global__ void three_nearest_neighbors_interpolate_grad_kernel(
int b, int c, int n, int m, const float *__restrict__ grad_y,
const int *__restrict__ indices, const float *__restrict__ weights,
float *__restrict__ grad_x) {
int batch_index = blockIdx.x;
grad_y += batch_index * n * c;
indices += batch_index * n * 3;
weights += batch_index * n * 3;
grad_x += batch_index * m * c;
const int index = threadIdx.y * blockDim.x + threadIdx.x;
const int stride = blockDim.y * blockDim.x;
for (int i = index; i < c * n; i += stride) {
const int l = i / n;
const int j = i % n;
float w1 = weights[j];
float w2 = weights[j + n];
float w3 = weights[j + n + n];
int i1 = indices[j];
int i2 = indices[j + n];
int i3 = indices[j + n + n];
atomicAdd(grad_x + l * m + i1, grad_y[i] * w1);
atomicAdd(grad_x + l * m + i2, grad_y[i] * w2);
atomicAdd(grad_x + l * m + i3, grad_y[i] * w3);
}
}
void three_nearest_neighbors_interpolate_grad(int b, int c, int n, int m,
const float *grad_y,
const int *indices,
const float *weights,
float *grad_x) {
three_nearest_neighbors_interpolate_grad_kernel<<<
b, optimal_block_config(n, c), 0, at::cuda::getCurrentCUDAStream()>>>(
b, c, n, m, grad_y, indices, weights, grad_x);
CUDA_CHECK_ERRORS();
}
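For reference, the inverse-squared-distance weighting that `three_nearest_neighbors_kernel` and the interpolation kernel above compute can be sketched in NumPy. This is a single-batch sketch for clarity, not the CUDA implementation; the function name `three_nn_interpolate` is illustrative. Shapes follow the CUDA code: `points_coords [3, n]`, `centers_coords [3, m]`, `centers_features [c, m]`.

```python
import numpy as np

def three_nn_interpolate(points_coords, centers_coords, centers_features):
    # squared distances between every point and every center: [n, m]
    d = ((points_coords[:, :, None] - centers_coords[:, None, :]) ** 2).sum(axis=0)
    idx = np.argsort(d, axis=1)[:, :3]            # 3 nearest centers per point
    d3 = np.take_along_axis(d, idx, axis=1)       # their squared distances
    d3 = np.clip(d3, 1e-10, 1e10)                 # same clamp as the kernel
    # inverse-distance weights w_i = (1/d_i) / sum_j (1/d_j), written as
    # products of the other two distances, as in the kernel (avoids divisions)
    w = np.stack([d3[:, 1] * d3[:, 2],
                  d3[:, 0] * d3[:, 2],
                  d3[:, 0] * d3[:, 1]], axis=1)
    w /= w.sum(axis=1, keepdims=True)
    # interpolate center features onto the points: out is [c, n]
    out = (centers_features[:, idx] * w[None]).sum(axis=2)
    return out, idx, w
```

A point that coincides with a center gets (after the 1e-10 clamp) a weight of essentially 1 for that center, so its feature is copied through unchanged.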
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cuh
================================================
#ifndef _NEIGHBOR_INTERPOLATE_CUH
#define _NEIGHBOR_INTERPOLATE_CUH
void three_nearest_neighbors_interpolate(int b, int c, int m, int n,
const float *points_coords,
const float *centers_coords,
const float *centers_features,
int *indices, float *weights,
float *out);
void three_nearest_neighbors_interpolate_grad(int b, int c, int n, int m,
const float *grad_y,
const int *indices,
const float *weights,
float *grad_x);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.hpp
================================================
#ifndef _NEIGHBOR_INTERPOLATE_HPP
#define _NEIGHBOR_INTERPOLATE_HPP
#include <torch/extension.h>
#include <vector>
std::vector<at::Tensor>
three_nearest_neighbors_interpolate_forward(at::Tensor points_coords,
at::Tensor centers_coords,
at::Tensor centers_features);
at::Tensor three_nearest_neighbors_interpolate_backward(at::Tensor grad_y,
at::Tensor indices,
at::Tensor weights,
const int m);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cpp
================================================
#include "trilinear_devox.hpp"
#include "trilinear_devox.cuh"
#include "../utils.hpp"
/*
Function: trilinear devoxelization (forward)
Args:
r : voxel resolution
is_training : whether in training mode
coords : the coordinates of points, FloatTensor[b, 3, n]
features : features, FloatTensor[b, c, s], s = r ** 3
Return:
outs : outputs, FloatTensor[b, c, n]
inds : the voxel coordinates of point cube, IntTensor[b, 8, n]
wgts : weight for trilinear interpolation, FloatTensor[b, 8, n]
*/
std::vector<at::Tensor>
trilinear_devoxelize_forward(const int r, const bool is_training,
const at::Tensor coords,
const at::Tensor features) {
CHECK_CUDA(features);
CHECK_CUDA(coords);
CHECK_CONTIGUOUS(features);
CHECK_CONTIGUOUS(coords);
CHECK_IS_FLOAT(features);
CHECK_IS_FLOAT(coords);
int b = features.size(0);
int c = features.size(1);
int n = coords.size(2);
int r2 = r * r;
int r3 = r2 * r;
at::Tensor outs = torch::zeros(
{b, c, n}, at::device(features.device()).dtype(at::ScalarType::Float));
if (is_training) {
at::Tensor inds = torch::zeros(
{b, 8, n}, at::device(features.device()).dtype(at::ScalarType::Int));
at::Tensor wgts = torch::zeros(
{b, 8, n}, at::device(features.device()).dtype(at::ScalarType::Float));
trilinear_devoxelize(b, c, n, r, r2, r3, true, coords.data_ptr<float>(),
features.data_ptr<float>(), inds.data_ptr<int>(),
wgts.data_ptr<float>(), outs.data_ptr<float>());
return {outs, inds, wgts};
} else {
at::Tensor inds = torch::zeros(
{1}, at::device(features.device()).dtype(at::ScalarType::Int));
at::Tensor wgts = torch::zeros(
{1}, at::device(features.device()).dtype(at::ScalarType::Float));
trilinear_devoxelize(b, c, n, r, r2, r3, false, coords.data_ptr<float>(),
features.data_ptr<float>(), inds.data_ptr<int>(),
wgts.data_ptr<float>(), outs.data_ptr<float>());
return {outs, inds, wgts};
}
}
/*
Function: trilinear devoxelization (backward)
Args:
grad_y : grad outputs, FloatTensor[b, c, n]
indices : the voxel coordinates of point cube, IntTensor[b, 8, n]
weights : weight for trilinear interpolation, FloatTensor[b, 8, n]
r : voxel resolution
Return:
grad_x : grad inputs, FloatTensor[b, c, s], s = r ** 3
*/
at::Tensor trilinear_devoxelize_backward(const at::Tensor grad_y,
const at::Tensor indices,
const at::Tensor weights,
const int r) {
CHECK_CUDA(grad_y);
CHECK_CUDA(weights);
CHECK_CUDA(indices);
CHECK_CONTIGUOUS(grad_y);
CHECK_CONTIGUOUS(weights);
CHECK_CONTIGUOUS(indices);
CHECK_IS_FLOAT(grad_y);
CHECK_IS_FLOAT(weights);
CHECK_IS_INT(indices);
int b = grad_y.size(0);
int c = grad_y.size(1);
int n = grad_y.size(2);
int r3 = r * r * r;
at::Tensor grad_x = torch::zeros(
{b, c, r3}, at::device(grad_y.device()).dtype(at::ScalarType::Float));
trilinear_devoxelize_grad(b, c, n, r3, indices.data_ptr<int>(),
weights.data_ptr<float>(), grad_y.data_ptr<float>(),
grad_x.data_ptr<float>());
return grad_x;
}
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cu
================================================
#include <stdio.h>
#include <stdlib.h>
#include "../cuda_utils.cuh"
/*
Function: trilinear devoxelization (forward)
Args:
b : batch size
c : #channels
n : number of points
r : voxel resolution
r2 : r ** 2
r3 : r ** 3
coords : the coordinates of points, FloatTensor[b, 3, n]
feat : features, FloatTensor[b, c, r3]
inds : the voxel indices of point cube, IntTensor[b, 8, n]
wgts : weight for trilinear interpolation, FloatTensor[b, 8, n]
outs : outputs, FloatTensor[b, c, n]
*/
__global__ void trilinear_devoxelize_kernel(int b, int c, int n, int r, int r2,
int r3, bool is_training,
const float *__restrict__ coords,
const float *__restrict__ feat,
int *__restrict__ inds,
float *__restrict__ wgts,
float *__restrict__ outs) {
int batch_index = blockIdx.x;
int stride = blockDim.x;
int index = threadIdx.x;
coords += batch_index * n * 3;
inds += batch_index * n * 8;
wgts += batch_index * n * 8;
feat += batch_index * c * r3;
outs += batch_index * c * n;
for (int i = index; i < n; i += stride) {
float x = coords[i];
float y = coords[i + n];
float z = coords[i + n + n];
float x_lo_f = floorf(x);
float y_lo_f = floorf(y);
float z_lo_f = floorf(z);
float x_d_1 = x - x_lo_f; // / (x_hi_f - x_lo_f + 1e-8f)
float y_d_1 = y - y_lo_f;
float z_d_1 = z - z_lo_f;
float x_d_0 = 1.0f - x_d_1;
float y_d_0 = 1.0f - y_d_1;
float z_d_0 = 1.0f - z_d_1;
float wgt000 = x_d_0 * y_d_0 * z_d_0;
float wgt001 = x_d_0 * y_d_0 * z_d_1;
float wgt010 = x_d_0 * y_d_1 * z_d_0;
float wgt011 = x_d_0 * y_d_1 * z_d_1;
float wgt100 = x_d_1 * y_d_0 * z_d_0;
float wgt101 = x_d_1 * y_d_0 * z_d_1;
float wgt110 = x_d_1 * y_d_1 * z_d_0;
float wgt111 = x_d_1 * y_d_1 * z_d_1;
int x_lo = static_cast<int>(x_lo_f);
int y_lo = static_cast<int>(y_lo_f);
int z_lo = static_cast<int>(z_lo_f);
int x_hi = (x_d_1 > 0) ? -1 : 0;
int y_hi = (y_d_1 > 0) ? -1 : 0;
int z_hi = (z_d_1 > 0) ? 1 : 0;
int idx000 = x_lo * r2 + y_lo * r + z_lo;
int idx001 = idx000 + z_hi; // x_lo * r2 + y_lo * r + z_hi;
int idx010 = idx000 + (y_hi & r); // x_lo * r2 + y_hi * r + z_lo;
int idx011 = idx010 + z_hi; // x_lo * r2 + y_hi * r + z_hi;
int idx100 = idx000 + (x_hi & r2); // x_hi * r2 + y_lo * r + z_lo;
int idx101 = idx100 + z_hi; // x_hi * r2 + y_lo * r + z_hi;
int idx110 = idx100 + (y_hi & r); // x_hi * r2 + y_hi * r + z_lo;
int idx111 = idx110 + z_hi; // x_hi * r2 + y_hi * r + z_hi;
if (is_training) {
wgts[i] = wgt000;
wgts[i + n] = wgt001;
wgts[i + n * 2] = wgt010;
wgts[i + n * 3] = wgt011;
wgts[i + n * 4] = wgt100;
wgts[i + n * 5] = wgt101;
wgts[i + n * 6] = wgt110;
wgts[i + n * 7] = wgt111;
inds[i] = idx000;
inds[i + n] = idx001;
inds[i + n * 2] = idx010;
inds[i + n * 3] = idx011;
inds[i + n * 4] = idx100;
inds[i + n * 5] = idx101;
inds[i + n * 6] = idx110;
inds[i + n * 7] = idx111;
}
for (int j = 0; j < c; j++) {
int jr3 = j * r3;
outs[j * n + i] =
wgt000 * feat[jr3 + idx000] + wgt001 * feat[jr3 + idx001] +
wgt010 * feat[jr3 + idx010] + wgt011 * feat[jr3 + idx011] +
wgt100 * feat[jr3 + idx100] + wgt101 * feat[jr3 + idx101] +
wgt110 * feat[jr3 + idx110] + wgt111 * feat[jr3 + idx111];
}
}
}
/*
Function: trilinear devoxelization (backward)
Args:
b : batch size
c : #channels
n : number of points
r3 : voxel cube size = voxel resolution ** 3
inds : the voxel indices of point cube, IntTensor[b, 8, n]
wgts : weight for trilinear interpolation, FloatTensor[b, 8, n]
grad_y : grad outputs, FloatTensor[b, c, n]
grad_x : grad inputs, FloatTensor[b, c, r3]
*/
__global__ void trilinear_devoxelize_grad_kernel(
int b, int c, int n, int r3, const int *__restrict__ inds,
const float *__restrict__ wgts, const float *__restrict__ grad_y,
float *__restrict__ grad_x) {
int batch_index = blockIdx.x;
int stride = blockDim.x;
int index = threadIdx.x;
inds += batch_index * n * 8;
wgts += batch_index * n * 8;
grad_x += batch_index * c * r3;
grad_y += batch_index * c * n;
for (int i = index; i < n; i += stride) {
int idx000 = inds[i];
int idx001 = inds[i + n];
int idx010 = inds[i + n * 2];
int idx011 = inds[i + n * 3];
int idx100 = inds[i + n * 4];
int idx101 = inds[i + n * 5];
int idx110 = inds[i + n * 6];
int idx111 = inds[i + n * 7];
float wgt000 = wgts[i];
float wgt001 = wgts[i + n];
float wgt010 = wgts[i + n * 2];
float wgt011 = wgts[i + n * 3];
float wgt100 = wgts[i + n * 4];
float wgt101 = wgts[i + n * 5];
float wgt110 = wgts[i + n * 6];
float wgt111 = wgts[i + n * 7];
for (int j = 0; j < c; j++) {
int jr3 = j * r3;
float g = grad_y[j * n + i];
atomicAdd(grad_x + jr3 + idx000, wgt000 * g);
atomicAdd(grad_x + jr3 + idx001, wgt001 * g);
atomicAdd(grad_x + jr3 + idx010, wgt010 * g);
atomicAdd(grad_x + jr3 + idx011, wgt011 * g);
atomicAdd(grad_x + jr3 + idx100, wgt100 * g);
atomicAdd(grad_x + jr3 + idx101, wgt101 * g);
atomicAdd(grad_x + jr3 + idx110, wgt110 * g);
atomicAdd(grad_x + jr3 + idx111, wgt111 * g);
}
}
}
void trilinear_devoxelize(int b, int c, int n, int r, int r2, int r3,
bool training, const float *coords, const float *feat,
int *inds, float *wgts, float *outs) {
trilinear_devoxelize_kernel<<<b, optimal_num_threads(n), 0,
at::cuda::getCurrentCUDAStream()>>>(
b, c, n, r, r2, r3, training, coords, feat, inds, wgts, outs);
CUDA_CHECK_ERRORS();
}
void trilinear_devoxelize_grad(int b, int c, int n, int r3, const int *inds,
const float *wgts, const float *grad_y,
float *grad_x) {
trilinear_devoxelize_grad_kernel<<<b, optimal_num_threads(n), 0,
at::cuda::getCurrentCUDAStream()>>>(
b, c, n, r3, inds, wgts, grad_y, grad_x);
CUDA_CHECK_ERRORS();
}
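The trilinear interpolation performed by `trilinear_devoxelize_kernel` can be sketched in NumPy as follows. This is a hedged single-batch reference, not the CUDA code: `trilinear_devoxelize_ref` is an illustrative name, and the corner indices are clamped to the grid with `min(..., r - 1)` as a safe stand-in for the kernel's `x_hi`/`y_hi`/`z_hi` bit masks (equivalent for coords in `[0, r - 1]`, since an out-of-range corner always carries zero weight).

```python
import numpy as np

def trilinear_devoxelize_ref(coords, feat, r):
    # coords: [3, n] float voxel-space coordinates; feat: [c, r**3]; out: [c, n]
    c, n = feat.shape[0], coords.shape[1]
    out = np.zeros((c, n))
    for i in range(n):
        x, y, z = coords[:, i]
        x0, y0, z0 = int(np.floor(x)), int(np.floor(y)), int(np.floor(z))
        xd, yd, zd = x - x0, y - y0, z - z0      # fractional offsets in the cell
        for dx in (0, 1):
            for dy in (0, 1):
                for dz in (0, 1):
                    # trilinear weight of this corner of the voxel cube
                    w = ((xd if dx else 1 - xd) *
                         (yd if dy else 1 - yd) *
                         (zd if dz else 1 - zd))
                    xi = min(x0 + dx, r - 1)
                    yi = min(y0 + dy, r - 1)
                    zi = min(z0 + dz, r - 1)
                    out[:, i] += w * feat[:, xi * r * r + yi * r + zi]
    return out
```

A point at exact integer coordinates reads back a single voxel; a point at a cell center averages all eight corners.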
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cuh
================================================
#ifndef _TRILINEAR_DEVOX_CUH
#define _TRILINEAR_DEVOX_CUH
// CUDA function declarations
void trilinear_devoxelize(int b, int c, int n, int r, int r2, int r3,
bool is_training, const float *coords,
const float *feat, int *inds, float *wgts,
float *outs);
void trilinear_devoxelize_grad(int b, int c, int n, int r3, const int *inds,
const float *wgts, const float *grad_y,
float *grad_x);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.hpp
================================================
#ifndef _TRILINEAR_DEVOX_HPP
#define _TRILINEAR_DEVOX_HPP
#include <torch/extension.h>
#include <vector>
std::vector<at::Tensor> trilinear_devoxelize_forward(const int r,
const bool is_training,
const at::Tensor coords,
const at::Tensor features);
at::Tensor trilinear_devoxelize_backward(const at::Tensor grad_y,
const at::Tensor indices,
const at::Tensor weights, const int r);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/sampling/sampling.cpp
================================================
#include "sampling.hpp"
#include "sampling.cuh"
#include "../utils.hpp"
at::Tensor gather_features_forward(at::Tensor features, at::Tensor indices) {
CHECK_CUDA(features);
CHECK_CUDA(indices);
CHECK_CONTIGUOUS(features);
CHECK_CONTIGUOUS(indices);
CHECK_IS_FLOAT(features);
CHECK_IS_INT(indices);
int b = features.size(0);
int c = features.size(1);
int n = features.size(2);
int m = indices.size(1);
at::Tensor output = torch::zeros(
{b, c, m}, at::device(features.device()).dtype(at::ScalarType::Float));
gather_features(b, c, n, m, features.data_ptr<float>(),
indices.data_ptr<int>(), output.data_ptr<float>());
return output;
}
at::Tensor gather_features_backward(at::Tensor grad_y, at::Tensor indices,
const int n) {
CHECK_CUDA(grad_y);
CHECK_CUDA(indices);
CHECK_CONTIGUOUS(grad_y);
CHECK_CONTIGUOUS(indices);
CHECK_IS_FLOAT(grad_y);
CHECK_IS_INT(indices);
int b = grad_y.size(0);
int c = grad_y.size(1);
at::Tensor grad_x = torch::zeros(
{b, c, n}, at::device(grad_y.device()).dtype(at::ScalarType::Float));
gather_features_grad(b, c, n, indices.size(1), grad_y.data_ptr<float>(),
indices.data_ptr<int>(), grad_x.data_ptr<float>());
return grad_x;
}
at::Tensor furthest_point_sampling_forward(at::Tensor coords,
const int num_samples) {
CHECK_CUDA(coords);
CHECK_CONTIGUOUS(coords);
CHECK_IS_FLOAT(coords);
int b = coords.size(0);
int n = coords.size(2);
at::Tensor indices = torch::zeros(
{b, num_samples}, at::device(coords.device()).dtype(at::ScalarType::Int));
at::Tensor distances = torch::full(
{b, n}, 1e38f, at::device(coords.device()).dtype(at::ScalarType::Float));
furthest_point_sampling(b, n, num_samples, coords.data_ptr<float>(),
distances.data_ptr<float>(), indices.data_ptr<int>());
return indices;
}
================================================
FILE: externs/pvcnn/modules/functional/src/sampling/sampling.cu
================================================
#include <stdio.h>
#include <stdlib.h>
#include "../cuda_utils.cuh"
/*
Function: gather centers' features (forward)
Args:
b : batch size
c : #channels of features
n : number of points in point clouds
m : number of query/sampled centers
features: points' features, FloatTensor[b, c, n]
indices : centers' indices in points, IntTensor[b, m]
out : gathered features, FloatTensor[b, c, m]
*/
__global__ void gather_features_kernel(int b, int c, int n, int m,
const float *__restrict__ features,
const int *__restrict__ indices,
float *__restrict__ out) {
int batch_index = blockIdx.x;
int channel_index = blockIdx.y;
int temp_index = batch_index * c + channel_index;
features += temp_index * n;
indices += batch_index * m;
out += temp_index * m;
for (int j = threadIdx.x; j < m; j += blockDim.x) {
out[j] = features[indices[j]];
}
}
void gather_features(int b, int c, int n, int m, const float *features,
const int *indices, float *out) {
gather_features_kernel<<<dim3(b, c, 1), optimal_num_threads(m), 0,
at::cuda::getCurrentCUDAStream()>>>(
b, c, n, m, features, indices, out);
CUDA_CHECK_ERRORS();
}
/*
Function: gather centers' features (backward)
Args:
b : batch size
c : #channels of features
n : number of points in point clouds
m : number of query/sampled centers
grad_y : grad of gathered features, FloatTensor[b, c, m]
indices : centers' indices in points, IntTensor[b, m]
grad_x : grad of points' features, FloatTensor[b, c, n]
*/
__global__ void gather_features_grad_kernel(int b, int c, int n, int m,
const float *__restrict__ grad_y,
const int *__restrict__ indices,
float *__restrict__ grad_x) {
int batch_index = blockIdx.x;
int channel_index = blockIdx.y;
int temp_index = batch_index * c + channel_index;
grad_y += temp_index * m;
indices += batch_index * m;
grad_x += temp_index * n;
for (int j = threadIdx.x; j < m; j += blockDim.x) {
atomicAdd(grad_x + indices[j], grad_y[j]);
}
}
void gather_features_grad(int b, int c, int n, int m, const float *grad_y,
const int *indices, float *grad_x) {
gather_features_grad_kernel<<<dim3(b, c, 1), optimal_num_threads(m), 0,
at::cuda::getCurrentCUDAStream()>>>(
b, c, n, m, grad_y, indices, grad_x);
CUDA_CHECK_ERRORS();
}
/*
Function: furthest point sampling
Args:
b : batch size
n : number of points in point clouds
m : number of query/sampled centers
coords : points' coords, FloatTensor[b, 3, n]
distances : minimum distance of a point to the sampled set, FloatTensor[b, n]
indices : sampled centers' indices in points, IntTensor[b, m]
*/
__global__ void furthest_point_sampling_kernel(int b, int n, int m,
const float *__restrict__ coords,
float *__restrict__ distances,
int *__restrict__ indices) {
if (m <= 0)
return;
int batch_index = blockIdx.x;
coords += batch_index * n * 3;
distances += batch_index * n;
indices += batch_index * m;
const int BlockSize = 512;
__shared__ float dists[BlockSize];
__shared__ int dists_i[BlockSize];
const int BufferSize = 3072;
__shared__ float buf[BufferSize * 3];
int old = 0;
if (threadIdx.x == 0)
indices[0] = old;
for (int j = threadIdx.x; j < min(BufferSize, n); j += blockDim.x) {
buf[j] = coords[j];
buf[j + BufferSize] = coords[j + n];
buf[j + BufferSize + BufferSize] = coords[j + n + n];
}
__syncthreads();
for (int j = 1; j < m; j++) {
int besti = 0; // best index
float best = -1; // farthest distance
// calculating the distance with the latest sampled point
float x1 = coords[old];
float y1 = coords[old + n];
float z1 = coords[old + n + n];
for (int k = threadIdx.x; k < n; k += blockDim.x) {
// fetch distance at block n, thread k
float td = distances[k];
float x2, y2, z2;
if (k < BufferSize) {
x2 = buf[k];
y2 = buf[k + BufferSize];
z2 = buf[k + BufferSize + BufferSize];
} else {
x2 = coords[k];
y2 = coords[k + n];
z2 = coords[k + n + n];
}
float d =
(x2 - x1) * (x2 - x1) + (y2 - y1) * (y2 - y1) + (z2 - z1) * (z2 - z1);
float d2 = min(d, td);
// update "point-to-set" distance
if (d2 != td)
distances[k] = d2;
// update the farthest distance at sample step j
if (d2 > best) {
best = d2;
besti = k;
}
}
dists[threadIdx.x] = best;
dists_i[threadIdx.x] = besti;
for (int u = 0; (1 << u) < blockDim.x; u++) {
__syncthreads();
if (threadIdx.x < (blockDim.x >> (u + 1))) {
int i1 = (threadIdx.x * 2) << u;
int i2 = (threadIdx.x * 2 + 1) << u;
if (dists[i1] < dists[i2]) {
dists[i1] = dists[i2];
dists_i[i1] = dists_i[i2];
}
}
}
__syncthreads();
// finish sample step j; old is the sampled index
old = dists_i[0];
if (threadIdx.x == 0)
indices[j] = old;
}
}
void furthest_point_sampling(int b, int n, int m, const float *coords,
float *distances, int *indices) {
furthest_point_sampling_kernel<<<b, 512, 0,
at::cuda::getCurrentCUDAStream()>>>(b, n, m, coords, distances, indices);
CUDA_CHECK_ERRORS();
}
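The greedy farthest-point selection that `furthest_point_sampling_kernel` parallelizes (each pass updates the point-to-set distances, then takes the argmax via a shared-memory reduction) can be sketched sequentially in NumPy. Single-batch sketch with an illustrative function name; like the kernel, point 0 seeds the sample set.

```python
import numpy as np

def furthest_point_sampling_ref(coords, m):
    # coords: [3, n]; returns indices of m sampled points, starting from point 0
    n = coords.shape[1]
    dist = np.full(n, np.inf)          # min squared distance to the sampled set
    idx = np.zeros(m, dtype=int)       # idx[0] = 0, matching the kernel
    for j in range(1, m):
        last = coords[:, idx[j - 1]]                      # latest sampled point
        d = ((coords - last[:, None]) ** 2).sum(axis=0)
        dist = np.minimum(dist, d)                        # update set distances
        idx[j] = int(dist.argmax())                       # farthest from the set
    return idx
```

Each iteration costs O(n), so sampling m centers is O(mn); the CUDA kernel keeps the same structure but distributes the distance update and the argmax over one thread block per batch.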
================================================
FILE: externs/pvcnn/modules/functional/src/sampling/sampling.cuh
================================================
#ifndef _SAMPLING_CUH
#define _SAMPLING_CUH
void gather_features(int b, int c, int n, int m, const float *features,
const int *indices, float *out);
void gather_features_grad(int b, int c, int n, int m, const float *grad_y,
const int *indices, float *grad_x);
void furthest_point_sampling(int b, int n, int m, const float *coords,
float *distances, int *indices);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/sampling/sampling.hpp
================================================
#ifndef _SAMPLING_HPP
#define _SAMPLING_HPP
#include <torch/extension.h>
at::Tensor gather_features_forward(at::Tensor features, at::Tensor indices);
at::Tensor gather_features_backward(at::Tensor grad_y, at::Tensor indices,
const int n);
at::Tensor furthest_point_sampling_forward(at::Tensor coords,
const int num_samples);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/utils.hpp
================================================
#ifndef _UTILS_HPP
#define _UTILS_HPP
#include <ATen/ATen.h>
#include <torch/extension.h>
#define CHECK_CUDA(x) TORCH_CHECK(x.device().is_cuda(), #x " must be a CUDA tensor")
#define CHECK_CONTIGUOUS(x) \
TORCH_CHECK(x.is_contiguous(), #x " must be a contiguous tensor")
#define CHECK_IS_INT(x) \
TORCH_CHECK(x.scalar_type() == at::ScalarType::Int, \
#x " must be an int tensor")
#define CHECK_IS_FLOAT(x) \
TORCH_CHECK(x.scalar_type() == at::ScalarType::Float, \
#x " must be a float tensor")
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/voxelization/vox.cpp
================================================
#include "vox.hpp"
#include "vox.cuh"
#include "../utils.hpp"
/*
Function: average pool voxelization (forward)
Args:
features: features, FloatTensor[b, c, n]
coords : coords of each point, IntTensor[b, 3, n]
resolution : voxel resolution
Return:
out : outputs, FloatTensor[b, c, s], s = r ** 3
ind : voxel index of each point, IntTensor[b, n]
cnt : #points in each voxel index, IntTensor[b, s]
*/
std::vector<at::Tensor> avg_voxelize_forward(const at::Tensor features,
const at::Tensor coords,
const int resolution) {
CHECK_CUDA(features);
CHECK_CUDA(coords);
CHECK_CONTIGUOUS(features);
CHECK_CONTIGUOUS(coords);
CHECK_IS_FLOAT(features);
CHECK_IS_INT(coords);
int b = features.size(0);
int c = features.size(1);
int n = features.size(2);
int r = resolution;
int r2 = r * r;
int r3 = r2 * r;
at::Tensor ind = torch::zeros(
{b, n}, at::device(features.device()).dtype(at::ScalarType::Int));
at::Tensor out = torch::zeros(
{b, c, r3}, at::device(features.device()).dtype(at::ScalarType::Float));
at::Tensor cnt = torch::zeros(
{b, r3}, at::device(features.device()).dtype(at::ScalarType::Int));
avg_voxelize(b, c, n, r, r2, r3, coords.data_ptr<int>(),
features.data_ptr<float>(), ind.data_ptr<int>(),
cnt.data_ptr<int>(), out.data_ptr<float>());
return {out, ind, cnt};
}
/*
Function: average pool voxelization (backward)
Args:
grad_y : grad outputs, FloatTensor[b, c, s]
indices: voxel index of each point, IntTensor[b, n]
cnt : #points in each voxel index, IntTensor[b, s]
Return:
grad_x : grad inputs, FloatTensor[b, c, n]
*/
at::Tensor avg_voxelize_backward(const at::Tensor grad_y,
const at::Tensor indices,
const at::Tensor cnt) {
CHECK_CUDA(grad_y);
CHECK_CUDA(indices);
CHECK_CUDA(cnt);
CHECK_CONTIGUOUS(grad_y);
CHECK_CONTIGUOUS(indices);
CHECK_CONTIGUOUS(cnt);
CHECK_IS_FLOAT(grad_y);
CHECK_IS_INT(indices);
CHECK_IS_INT(cnt);
int b = grad_y.size(0);
int c = grad_y.size(1);
int s = grad_y.size(2);
int n = indices.size(1);
at::Tensor grad_x = torch::zeros(
{b, c, n}, at::device(grad_y.device()).dtype(at::ScalarType::Float));
avg_voxelize_grad(b, c, n, s, indices.data_ptr<int>(), cnt.data_ptr<int>(),
grad_y.data_ptr<float>(), grad_x.data_ptr<float>());
return grad_x;
}
================================================
FILE: externs/pvcnn/modules/functional/src/voxelization/vox.cu
================================================
#include <stdio.h>
#include <stdlib.h>
#include "../cuda_utils.cuh"
/*
Function: get how many points in each voxel grid
Args:
b : batch size
n : number of points
r : voxel resolution
r2 : = r * r
r3 : s, voxel cube size = r ** 3
coords : coords of each point, IntTensor[b, 3, n]
ind : voxel index of each point, IntTensor[b, n]
cnt : #points in each voxel index, IntTensor[b, s]
*/
__global__ void grid_stats_kernel(int b, int n, int r, int r2, int r3,
const int *__restrict__ coords,
int *__restrict__ ind, int *cnt) {
int batch_index = blockIdx.x;
int stride = blockDim.x;
int index = threadIdx.x;
coords += batch_index * n * 3;
ind += batch_index * n;
cnt += batch_index * r3;
for (int i = index; i < n; i += stride) {
// if (ind[i] == -1)
// continue;
ind[i] = coords[i] * r2 + coords[i + n] * r + coords[i + n + n];
atomicAdd(cnt + ind[i], 1);
}
}
/*
Function: average pool voxelization (forward)
Args:
b : batch size
c : #channels
n : number of points
s : voxel cube size = voxel resolution ** 3
ind : voxel index of each point, IntTensor[b, n]
cnt : #points in each voxel index, IntTensor[b, s]
feat: features, FloatTensor[b, c, n]
out : outputs, FloatTensor[b, c, s]
*/
__global__ void avg_voxelize_kernel(int b, int c, int n, int s,
const int *__restrict__ ind,
const int *__restrict__ cnt,
const float *__restrict__ feat,
float *__restrict__ out) {
int batch_index = blockIdx.x;
int stride = blockDim.x;
int index = threadIdx.x;
ind += batch_index * n;
feat += batch_index * c * n;
out += batch_index * c * s;
cnt += batch_index * s;
for (int i = index; i < n; i += stride) {
int pos = ind[i];
// if (pos == -1)
// continue;
int cur_cnt = cnt[pos];
if (cur_cnt > 0) {
float div_cur_cnt = 1.0f / static_cast<float>(cur_cnt);
for (int j = 0; j < c; j++) {
atomicAdd(out + j * s + pos, feat[j * n + i] * div_cur_cnt);
}
}
}
}
/*
Function: average pool voxelization (backward)
Args:
b : batch size
c : #channels
n : number of points
r3 : voxel cube size = voxel resolution ** 3
ind : voxel index of each point, IntTensor[b, n]
cnt : #points in each voxel index, IntTensor[b, s]
grad_y : grad outputs, FloatTensor[b, c, s]
grad_x : grad inputs, FloatTensor[b, c, n]
*/
__global__ void avg_voxelize_grad_kernel(int b, int c, int n, int r3,
const int *__restrict__ ind,
const int *__restrict__ cnt,
const float *__restrict__ grad_y,
float *__restrict__ grad_x) {
int batch_index = blockIdx.x;
int stride = blockDim.x;
int index = threadIdx.x;
ind += batch_index * n;
grad_x += batch_index * c * n;
grad_y += batch_index * c * r3;
cnt += batch_index * r3;
for (int i = index; i < n; i += stride) {
int pos = ind[i];
// if (pos == -1)
// continue;
int cur_cnt = cnt[pos];
if (cur_cnt > 0) {
float div_cur_cnt = 1.0f / static_cast<float>(cur_cnt);
for (int j = 0; j < c; j++) {
atomicAdd(grad_x + j * n + i, grad_y[j * r3 + pos] * div_cur_cnt);
}
}
}
}
void avg_voxelize(int b, int c, int n, int r, int r2, int r3, const int *coords,
const float *feat, int *ind, int *cnt, float *out) {
grid_stats_kernel<<<b, optimal_num_threads(n), 0,
at::cuda::getCurrentCUDAStream()>>>(b, n, r, r2, r3, coords, ind, cnt);
avg_voxelize_kernel<<<b, optimal_num_threads(n), 0,
at::cuda::getCurrentCUDAStream()>>>(b, c, n, r3, ind, cnt, feat, out);
CUDA_CHECK_ERRORS();
}
void avg_voxelize_grad(int b, int c, int n, int s, const int *ind,
const int *cnt, const float *grad_y, float *grad_x) {
avg_voxelize_grad_kernel<<<b, optimal_num_threads(n), 0,
at::cuda::getCurrentCUDAStream()>>>(b, c, n, s, ind, cnt, grad_y, grad_x);
CUDA_CHECK_ERRORS();
}
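The two-pass average voxelization above (first count points per voxel with `grid_stats_kernel`, then scatter-average features with `avg_voxelize_kernel`) can be sketched in NumPy. Single-batch reference with an illustrative function name, not the CUDA implementation; `coords` are already-quantized integer voxel coordinates in `[0, r)`, as in the C++ entry point.

```python
import numpy as np

def avg_voxelize_ref(feat, coords, r):
    # feat: [c, n] float features; coords: [3, n] int voxel coords; out: [c, r**3]
    c, n = feat.shape
    out = np.zeros((c, r ** 3))
    cnt = np.zeros(r ** 3, dtype=int)
    # flat voxel index per point: x * r^2 + y * r + z (same as grid_stats_kernel)
    ind = coords[0] * r * r + coords[1] * r + coords[2]
    for i in range(n):
        cnt[ind[i]] += 1
        out[:, ind[i]] += feat[:, i]
    nonzero = cnt > 0
    out[:, nonzero] /= cnt[nonzero]     # average over the points in each voxel
    return out, ind, cnt
```

The returned `ind` and `cnt` are exactly what the autograd Function in `voxelization.py` saves for the backward pass, which redistributes each voxel's gradient equally to its contributing points.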
================================================
FILE: externs/pvcnn/modules/functional/src/voxelization/vox.cuh
================================================
#ifndef _VOX_CUH
#define _VOX_CUH
// CUDA function declarations
void avg_voxelize(int b, int c, int n, int r, int r2, int r3, const int *coords,
const float *feat, int *ind, int *cnt, float *out);
void avg_voxelize_grad(int b, int c, int n, int s, const int *idx,
const int *cnt, const float *grad_y, float *grad_x);
#endif
================================================
FILE: externs/pvcnn/modules/functional/src/voxelization/vox.hpp
================================================
#ifndef _VOX_HPP
#define _VOX_HPP
#include <torch/extension.h>
#include <vector>
std::vector<at::Tensor> avg_voxelize_forward(const at::Tensor features,
const at::Tensor coords,
const int resolution);
at::Tensor avg_voxelize_backward(const at::Tensor grad_y,
const at::Tensor indices,
const at::Tensor cnt);
#endif
================================================
FILE: externs/pvcnn/modules/functional/voxelization.py
================================================
from torch.autograd import Function
from externs.pvcnn.modules.functional.backend import _backend
__all__ = ['avg_voxelize']
class AvgVoxelization(Function):
@staticmethod
def forward(ctx, features, coords, resolution):
"""
:param ctx:
:param features: Features of the point cloud, FloatTensor[B, C, N]
:param coords: Voxelized Coordinates of each point, IntTensor[B, 3, N]
:param resolution: Voxel resolution
:return:
Voxelized Features, FloatTensor[B, C, R, R, R]
"""
features = features.contiguous()
coords = coords.int().contiguous()
b, c, _ = features.shape
out, indices, counts = _backend.avg_voxelize_forward(features, coords, resolution)
ctx.save_for_backward(indices, counts)
return out.view(b, c, resolution, resolution, resolution)
@staticmethod
def backward(ctx, grad_output):
"""
:param ctx:
:param grad_output: gradient of output, FloatTensor[B, C, R, R, R]
:return:
gradient of inputs, FloatTensor[B, C, N]
"""
b, c = grad_output.shape[:2]
indices, counts = ctx.saved_tensors
grad_features = _backend.avg_voxelize_backward(grad_output.contiguous().view(b, c, -1), indices, counts)
return grad_features, None, None
avg_voxelize = AvgVoxelization.apply
================================================
FILE: externs/pvcnn/modules/loss.py
================================================
import torch.nn as nn
import externs.pvcnn.modules.functional as F
__all__ = ['KLLoss']
class KLLoss(nn.Module):
def forward(self, x, y):
return F.kl_loss(x, y)
================================================
FILE: externs/pvcnn/modules/pointnet.py
================================================
import torch
import torch.nn as nn
import externs.pvcnn.modules.functional as F
from externs.pvcnn.modules.ball_query import BallQuery
from externs.pvcnn.modules.shared_mlp import SharedMLP
__all__ = ['PointNetAModule', 'PointNetSAModule', 'PointNetFPModule']
class PointNetAModule(nn.Module):
def __init__(self, in_channels, out_channels, include_coordinates=True):
super().__init__()
if not isinstance(out_channels, (list, tuple)):
out_channels = [[out_channels]]
elif not isinstance(out_channels[0], (list, tuple)):
out_channels = [out_channels]
mlps = []
total_out_channels = 0
for _out_channels in out_channels:
mlps.append(
SharedMLP(in_channels=in_channels + (3 if include_coordinates else 0),
out_channels=_out_channels, dim=1)
)
total_out_channels += _out_channels[-1]
self.include_coordinates = include_coordinates
self.out_channels = total_out_channels
self.mlps = nn.ModuleList(mlps)
def forward(self, inputs):
features, coords = inputs
if self.include_coordinates:
features = torch.cat([features, coords], dim=1)
coords = torch.zeros((coords.size(0), 3, 1), device=coords.device)
if len(self.mlps) > 1:
features_list = []
for mlp in self.mlps:
features_list.append(mlp(features).max(dim=-1, keepdim=True).values)
return torch.cat(features_list, dim=1), coords
else:
return self.mlps[0](features).max(dim=-1, keepdim=True).values, coords
def extra_repr(self):
return f'out_channels={self.out_channels}, include_coordinates={self.include_coordinates}'
class PointNetSAModule(nn.Module):
def __init__(self, num_centers, radius, num_neighbors, in_channels, out_channels, include_coordinates=True):
super().__init__()
if not isinstance(radius, (list, tuple)):
radius = [radius]
if not isinstance(num_neighbors, (list, tuple)):
num_neighbors = [num_neighbors] * len(radius)
assert len(radius) == len(num_neighbors)
if not isinstance(out_channels, (list, tuple)):
out_channels = [[out_channels]] * len(radius)
elif not isinstance(out_channels[0], (list, tuple)):
out_channels = [out_channels] * len(radius)
assert len(radius) == len(out_channels)
groupers, mlps = [], []
total_out_channels = 0
for _radius, _out_channels, _num_neighbors in zip(radius, out_channels, num_neighbors):
groupers.append(
BallQuery(radius=_radius, num_neighbors=_num_neighbors, include_coordinates=include_coordinates)
)
mlps.append(
SharedMLP(in_channels=in_channels + (3 if include_coordinates else 0),
out_channels=_out_channels, dim=2)
)
total_out_channels += _out_channels[-1]
self.num_centers = num_centers
self.out_channels = total_out_channels
self.groupers = nn.ModuleList(groupers)
self.mlps = nn.ModuleList(mlps)
def forward(self, inputs):
features, coords = inputs
centers_coords = F.furthest_point_sample(coords, self.num_centers)
features_list = []
for grouper, mlp in zip(self.groupers, self.mlps):
features_list.append(mlp(grouper(coords, centers_coords, features)).max(dim=-1).values)
if len(features_list) > 1:
return torch.cat(features_list, dim=1), centers_coords
else:
return features_list[0], centers_coords
def extra_repr(self):
return f'num_centers={self.num_centers}, out_channels={self.out_channels}'
class PointNetFPModule(nn.Module):
def __init__(self, in_channels, out_channels):
super().__init__()
self.mlp = SharedMLP(in_channels=in_channels, out_channels=out_channels, dim=1)
def forward(self, inputs):
if len(inputs) == 3:
points_coords, centers_coords, centers_features = inputs
points_features = None
else:
points_coords, centers_coords, centers_features, points_features = inputs
interpolated_features = F.nearest_neighbor_interpolate(points_coords, centers_coords, centers_features)
if points_features is not None:
interpolated_features = torch.cat(
[interpolated_features, points_features], dim=1
)
return self.mlp(interpolated_features), points_coords
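`PointNetSAModule` broadcasts scalar or flat-list constructor arguments into parallel per-branch lists before building its groupers and MLPs. The same normalization, isolated as a standalone sketch (the helper name is illustrative):

```python
def normalize_sa_args(radius, num_neighbors, out_channels):
    """Broadcast scalar/flat arguments into per-branch lists, as done in
    PointNetSAModule.__init__: one grouper + one MLP spec per radius."""
    if not isinstance(radius, (list, tuple)):
        radius = [radius]
    if not isinstance(num_neighbors, (list, tuple)):
        num_neighbors = [num_neighbors] * len(radius)
    if not isinstance(out_channels, (list, tuple)):
        out_channels = [[out_channels]] * len(radius)
    elif not isinstance(out_channels[0], (list, tuple)):
        out_channels = [out_channels] * len(radius)
    assert len(radius) == len(num_neighbors) == len(out_channels)
    return radius, num_neighbors, out_channels
```

So a single radius with a flat `out_channels` list gives one branch with a multi-layer MLP, while multiple radii replicate the spec across branches.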
================================================
FILE: externs/pvcnn/modules/pvconv.py
================================================
import torch.nn as nn
import externs.pvcnn.modules.functional as F
from externs.pvcnn.modules.voxelization import Voxelization
from externs.pvcnn.modules.shared_mlp import SharedMLP
from externs.pvcnn.modules.se import SE3d
__all__ = ['PVConv']
class PVConv(nn.Module):
def __init__(self, in_channels, out_channels, kernel_size, resolution, with_se=False, normalize=True, eps=0):
super().__init__()
self.in_channels = in_channels
self.out_channels = out_channels
self.kernel_size = kernel_size
self.resolution = resolution
self.voxelization = Voxelization(resolution, normalize=normalize, eps=eps)
voxel_layers = [
nn.Conv3d(in_channels, out_channels, kernel_size, stride=1, padding=kernel_size // 2),
nn.BatchNorm3d(out_channels, eps=1e-4),
nn.LeakyReLU(0.1, True),
nn.Conv3d(out_channels, out_channels, kernel_size, stride=1, padding=kernel_size // 2),
nn.BatchNorm3d(out_channels, eps=1e-4),
nn.LeakyReLU(0.1, True),
]
if with_se:
voxel_layers.append(SE3d(out_channels))
self.voxel_layers = nn.Sequential(*voxel_layers)
self.point_features = SharedMLP(in_channels, out_channels)
def forward(self, inputs):
features, coords = inputs
voxel_features, voxel_coords = self.voxelization(features, coords)
voxel_features = self.voxel_layers(voxel_features)
voxel_features = F.trilinear_devoxelize(voxel_features, voxel_coords, self.resolution, self.training)
fused_features = voxel_features + self.point_features(features)
return fused_features, coords
class ProxyVoxelConv(nn.Module):
def __init__(self, in_channels, out_channels, kernel_size, resolution, with_se=False, normalize=True, eps=0):
super().__init__()
self.in_channels = in_channels
self.voxelization = Voxelization(resolution, normalize=normalize, eps=eps)
# self.expansion_layer = nn.Conv3d(in_channels=self.in_channels, out_channels=out_channels, kernel_size=kernel_size, stride=1, padding=kernel_size//2, bias=False)
# self.expansion_layer.weight.requires_grad=False
# self.expansion_layer.weight[...] = 1
# self.expansion_layer = nn.MaxPool3d(kernel_size=kernel_size, stride=1, padding=kernel_size//2)
def forward(self, inputs):
features, coords = inputs
voxel_features, voxel_coords = self.voxelization(features, coords)
# voxel_features = self.expansion_layer(voxel_features)
return voxel_features, voxel_coords
================================================
FILE: externs/pvcnn/modules/se.py
================================================
import torch.nn as nn
__all__ = ['SE3d']
class SE3d(nn.Module):
def __init__(self, channel, reduction=8):
super().__init__()
self.fc = nn.Sequential(
nn.Linear(channel, channel // reduction, bias=False),
nn.ReLU(inplace=True),
nn.Linear(channel // reduction, channel, bias=False),
nn.Sigmoid()
)
def forward(self, inputs):
return inputs * self.fc(inputs.mean(-1).mean(-1).mean(-1)).view(inputs.shape[0], inputs.shape[1], 1, 1, 1)
================================================
FILE: externs/pvcnn/modules/shared_mlp.py
================================================
import torch.nn as nn
__all__ = ['SharedMLP']
class SharedMLP(nn.Module):
def __init__(self, in_channels, out_channels, dim=1):
super().__init__()
if dim == 1:
conv = nn.Conv1d
bn = nn.BatchNorm1d
elif dim == 2:
conv = nn.Conv2d
bn = nn.BatchNorm2d
else:
raise ValueError
if not isinstance(out_channels, (list, tuple)):
out_channels = [out_channels]
layers = []
for oc in out_channels:
layers.extend([
conv(in_channels, oc, 1),
bn(oc),
nn.ReLU(True),
])
in_channels = oc
self.layers = nn.Sequential(*layers)
def forward(self, inputs):
if isinstance(inputs, (list, tuple)):
return (self.layers(inputs[0]), *inputs[1:])
else:
return self.layers(inputs)
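`SharedMLP` is a stack of 1x1 convolutions, so one small MLP is applied independently to every point (`dim=1`) or every grouped neighbor (`dim=2`) with shared weights. A minimal shape check of the `dim=1` pattern, built directly from `torch.nn` layers:

```python
import torch
import torch.nn as nn

# A 1x1 Conv1d shares its weights across the N points: this is the
# "shared MLP" used throughout PointNet-style networks.
mlp = nn.Sequential(nn.Conv1d(6, 32, 1), nn.BatchNorm1d(32), nn.ReLU(inplace=True))
pts = torch.randn(4, 6, 128)   # [B, C_in, N] per-point features
out = mlp(pts)                 # [B, 32, N]: one MLP evaluation per point
```

The `dim=2` case is identical but operates on `[B, C, K, M]` grouped-neighbor tensors with `Conv2d`.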
================================================
FILE: externs/pvcnn/modules/voxelization.py
================================================
import torch
import torch.nn as nn
import externs.pvcnn.modules.functional as F
__all__ = ['Voxelization']
class Voxelization(nn.Module):
def __init__(self, resolution, normalize=True, eps=0):
super().__init__()
self.r = int(resolution)
self.normalize = normalize
self.eps = eps
def forward(self, features, coords):
coords = coords.detach()
# norm_coords = coords - coords.mean(2, keepdim=True)
# if self.normalize:
# norm_coords = norm_coords / (norm_coords.norm(dim=1, keepdim=True).max(dim=2, keepdim=True).values * 2.0 + self.eps) + 0.5
# else:
# norm_coords = (norm_coords + 1) / 2.0
norm_coords = coords
norm_coords = torch.clamp(norm_coords * self.r, 0, self.r - 1)
vox_coords = torch.round(norm_coords).to(torch.int32)
return F.avg_voxelize(features, vox_coords, self.r), norm_coords
def extra_repr(self):
return 'resolution={}{}'.format(self.r, ', normalized eps = {}'.format(self.eps) if self.normalize else '')
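With the mean-centering branch commented out, `Voxelization.forward` assumes incoming coords already lie in [0, 1] and only scales, clamps, and rounds them to integer voxel indices. A numpy sketch of that quantization step (the function name is illustrative):

```python
import numpy as np

def quantize(coords, r):
    """Map normalized coordinates in [0, 1] to voxel indices in [0, r-1],
    matching torch.clamp(coords * r, 0, r - 1) followed by rounding."""
    norm = np.clip(coords * r, 0, r - 1)
    return np.round(norm).astype(np.int32)
```

Note that a coordinate of exactly 1.0 maps to `r - 1` (not `r`) because of the clamp, so no point falls outside the grid.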
================================================
FILE: foreground_segment.py
================================================
import cv2
import argparse
import numpy as np
import torch
from PIL import Image
class BackgroundRemoval:
def __init__(self, device='cuda'):
from carvekit.api.high import HiInterface
self.interface = HiInterface(
object_type="object", # Can be "object" or "hairs-like".
batch_size_seg=5,
batch_size_matting=1,
device=device,
seg_mask_size=640, # Use 640 for Tracer B7 and 320 for U2Net
matting_mask_size=2048,
trimap_prob_threshold=231,
trimap_dilation=30,
trimap_erosion_iters=5,
fp16=True,
)
@torch.no_grad()
def __call__(self, image):
# image: [H, W, 3] array in [0, 255].
image = Image.fromarray(image)
image = self.interface([image])[0]
image = np.array(image)
return image
def process(image_path, mask_path):
mask_predictor = BackgroundRemoval()
image = cv2.imread(image_path, cv2.IMREAD_UNCHANGED)
if image.shape[-1] == 4:
image = cv2.cvtColor(image, cv2.COLOR_BGRA2RGB)
else:
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
rgba = mask_predictor(image) # [H, W, 4]
cv2.imwrite(mask_path, cv2.cvtColor(rgba, cv2.COLOR_RGBA2BGRA))
if __name__ == '__main__':
parser = argparse.ArgumentParser()
parser.add_argument('--input', required=True, type=str)
parser.add_argument('--output', required=True, type=str)
opt = parser.parse_args()
process(opt.input, opt.output)
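The carvekit interface returns an RGBA image whose alpha channel encodes the foreground mask, and `process` saves it as-is. A common follow-up step (hypothetical here, not part of this script) is alpha-compositing the result onto a white background before feeding it to a model:

```python
import numpy as np

def composite_on_white(rgba):
    """Composite an [H, W, 4] uint8 RGBA image onto a white background."""
    rgb = rgba[..., :3].astype(np.float32)
    alpha = rgba[..., 3:].astype(np.float32) / 255.0  # [H, W, 1] in [0, 1]
    return (rgb * alpha + 255.0 * (1.0 - alpha)).astype(np.uint8)
```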
================================================
FILE: generate.py
================================================
import argparse
from pathlib import Path
import numpy as np
import torch
from omegaconf import OmegaConf
from skimage.io import imsave
import sys
import os
# os.chdir(os.path.dirname(__file__))
sys.path.insert(0, os.path.dirname(__file__))
from ldm.models.diffusion.sync_dreamer import SyncMultiviewDiffusion, SyncDDIMSampler
from ldm.models.diffusion.ctrldemo_sync_dreamer import CtrlDemo, CtrlDemoSampler
from ldm.util import instantiate_from_config, prepare_inputs, prepare_proxy
from ldm.util import Ctrl3DParams
def load_model(cfg,ckpt,strict=True):
config = OmegaConf.load(cfg)
model = instantiate_from_config(config.model)
print(f'loading model from {ckpt} ...')
ckpt = torch.load(ckpt,map_location='cpu')
model.load_state_dict(ckpt['state_dict'],strict=strict)
model = model.cuda().eval()
return model
def main():
parser = argparse.ArgumentParser()
parser.add_argument('--cfg',type=str, default='configs/syncdreamer.yaml')
parser.add_argument('--ckpt',type=str, default='ckpt/syncdreamer-step80k.ckpt')
parser.add_argument('--output', type=str, required=True)
parser.add_argument('--input', type=str, required=True)
parser.add_argument('--input_proxy', type=str, default=None)
parser.add_argument('--start_view', type=int, default=0)
parser.add_argument('--elevation', type=float, required=True)
parser.add_argument('--sample_num', type=int, default=4)
parser.add_argument('--crop_size', type=int, default=-1)
parser.add_argument('--cfg_scale', type=float, default=2.0)
parser.add_argument('--ctrl_start_step', type=float, default=0.0)
parser.add_argument('--ctrl_end_step', type=float, default=1.0)
parser.add_argument('--batch_view_num', type=int, default=8)
parser.add_argument('--seed', type=int, default=6033)
parser.add_argument('--sampler', type=str, default='ddim_sync')
parser.add_argument('--sample_steps', type=int, default=50)
flags = parser.parse_args()
torch.random.manual_seed(flags.seed)
np.random.seed(flags.seed)
model = load_model(flags.cfg, flags.ckpt, strict=False)
if flags.input_proxy is not None:
assert isinstance(model, CtrlDemo)
else:
assert isinstance(model, SyncMultiviewDiffusion)
Path(f'{flags.output}').mkdir(exist_ok=True, parents=True)
# prepare data
if flags.elevation != 30:
raise ValueError("The elevation needs to be set to 30.")
data = prepare_inputs(flags.input, flags.elevation, flags.crop_size)
if flags.input_proxy is not None:
data['proxy'] = prepare_proxy(flags.input_proxy, flags.start_view)
for k, v in data.items():
data[k] = v.unsqueeze(0).cuda()
data[k] = torch.repeat_interleave(data[k], flags.sample_num, dim=0)
if flags.sampler=='ddim_sync':
sampler = SyncDDIMSampler(model, flags.sample_steps)
elif flags.sampler=='ddim_demo':
data['proxy'] = [data['proxy']]
sampler = CtrlDemoSampler(model, flags.sample_steps)
ctrl3D_params = [Ctrl3DParams(256, flags.ctrl_start_step, flags.ctrl_end_step)]
sampler.set_ctrl3D_params(ctrl3D_params, 1.0)
else:
raise NotImplementedError
x_sample = model.inference(sampler, data, flags.cfg_scale, flags.batch_view_num)[0]
images = model.decode_latents(x_sample[0]).unsqueeze(0)
B, N, _, H, W = images.shape
images = (torch.clamp(images,max=1.0,min=-1.0) + 1) * 0.5
images = (images.permute(0, 1, 3, 4, 2).cpu().numpy() * 255).astype(np.uint8)
for bi in range(B):
output_fn = Path(flags.output)/ f'{bi}.png'
imsave(output_fn, np.concatenate([images[bi,ni] for ni in range(N)], 1))
if __name__=="__main__":
main()
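`main` maps the decoded latents from [-1, 1] to uint8 pixels with `(clamp(x) + 1) * 0.5 * 255`. The same mapping, isolated in numpy:

```python
import numpy as np

def to_uint8(x):
    """Map model outputs in [-1, 1] to uint8 pixel values, as in main()."""
    x = np.clip(x, -1.0, 1.0)
    return ((x + 1.0) * 0.5 * 255).astype(np.uint8)
```

The `astype(np.uint8)` truncates rather than rounds, so 0.0 lands on 127, matching the behavior of the script.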
================================================
FILE: ldm/DPMPPScheduler.py
================================================
import math
from typing import List, Optional, Tuple, Union
from dataclasses import dataclass
import numpy as np
import torch
from diffusers import DDIMScheduler, DPMSolverMultistepScheduler
from diffusers.schedulers.scheduling_utils import KarrasDiffusionSchedulers, SchedulerMixin, SchedulerOutput
from diffusers.utils.torch_utils import randn_tensor
from diffusers.utils import BaseOutput
@dataclass
class DPMPPSchedulerOutput(BaseOutput):
"""
Output class for the scheduler's `step` function output.
Args:
prev_sample (`torch.FloatTensor` of shape `(batch_size, num_channels, height, width)` for images):
Computed sample `(x_{t-1})` of previous timestep. `prev_sample` should be used as next model input in the
denoising loop.
pred_original_sample (`torch.FloatTensor` of shape `(batch_size, num_channels, height, width)` for images):
The predicted denoised sample `(x_{0})` based on the model output from the current timestep.
`pred_original_sample` can be used to preview progress or for guidance.
"""
prev_sample: torch.FloatTensor
pred_original_sample: Optional[torch.FloatTensor] = None
class DPMPPScheduler(DPMSolverMultistepScheduler):
def step(
self,
model_output: torch.FloatTensor,
timestep: int,
sample: torch.FloatTensor,
generator=None,
return_dict: bool = True,
) -> Union[SchedulerOutput, Tuple]:
"""
Predict the sample from the previous timestep by reversing the SDE. This function propagates the sample with
the multistep DPMSolver.
Args:
model_output (`torch.FloatTensor`):
The direct output from learned diffusion model.
timestep (`int`):
The current discrete timestep in the diffusion chain.
sample (`torch.FloatTensor`):
A current instance of a sample created by the diffusion process.
generator (`torch.Generator`, *optional*):
A random number generator.
return_dict (`bool`):
Whether or not to return a [`~schedulers.scheduling_utils.SchedulerOutput`] or `tuple`.
Returns:
[`~schedulers.scheduling_utils.SchedulerOutput`] or `tuple`:
If return_dict is `True`, [`~schedulers.scheduling_utils.SchedulerOutput`] is returned, otherwise a
tuple is returned where the first element is the sample tensor.
"""
if self.num_inference_steps is None:
raise ValueError(
"Number of inference steps is 'None', you need to run 'set_timesteps' after creating the scheduler"
)
if self.step_index is None:
self._init_step_index(timestep)
lower_order_final = (
(self.step_index == len(self.timesteps) - 1) and self.config.lower_order_final and len(self.timesteps) < 15
)
lower_order_second = (
(self.step_index == len(self.timesteps) - 2) and self.config.lower_order_final and len(self.timesteps) < 15
)
model_output = self.convert_model_output(model_output, sample=sample)
for i in range(self.config.solver_order - 1):
self.model_outputs[i] = self.model_outputs[i + 1]
self.model_outputs[-1] = model_output
if self.config.algorithm_type in ["sde-dpmsolver", "sde-dpmsolver++"]:
noise = randn_tensor(
model_output.shape, generator=generator, device=model_output.device, dtype=model_output.dtype
)
else:
noise = None
if self.config.solver_order == 1 or self.lower_order_nums < 1 or lower_order_final:
prev_sample = self.dpm_solver_first_order_update(model_output, sample=sample, noise=noise)
elif self.config.solver_order == 2 or self.lower_order_nums < 2 or lower_order_second:
prev_sample = self.multistep_dpm_solver_second_order_update(self.model_outputs, sample=sample, noise=noise)
else:
prev_sample = self.multistep_dpm_solver_third_order_update(self.model_outputs, sample=sample)
if self.lower_order_nums < self.config.solver_order:
self.lower_order_nums += 1
# upon completion increase step index by one
self._step_index += 1
if not return_dict:
return (prev_sample,)
return DPMPPSchedulerOutput(prev_sample=prev_sample, pred_original_sample=model_output)
    def reinit(self):
        # The original body (`cls(self, )`) referenced an undefined name and
        # would raise NameError; rebuild a fresh scheduler from the stored
        # config instead so the multistep state is reset between runs.
        return self.__class__.from_config(self.config)
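`step()` falls back to lower-order DPM-Solver updates during warm-up (while `lower_order_nums` is still small) and at the tail of short schedules. That selection rule, extracted into a standalone sketch:

```python
def solver_order_for_step(step_index, num_steps, solver_order,
                          lower_order_nums, lower_order_final=True):
    """Which update order DPMPPScheduler.step() uses at a given step."""
    last = lower_order_final and num_steps < 15 and step_index == num_steps - 1
    second_last = lower_order_final and num_steps < 15 and step_index == num_steps - 2
    if solver_order == 1 or lower_order_nums < 1 or last:
        return 1  # first-order update (no history, or final step)
    if solver_order == 2 or lower_order_nums < 2 or second_last:
        return 2  # second-order multistep update
    return 3      # third-order multistep update
```

Forcing first/second order at the last steps of short schedules stabilizes the final denoising steps, which is why the `len(self.timesteps) < 15` guard exists.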
================================================
FILE: ldm/base_utils.py
================================================
import pickle
import numpy as np
import cv2
from skimage.io import imread
def save_pickle(data, pkl_path):
# os.system('mkdir -p {}'.format(os.path.dirname(pkl_path)))
with open(pkl_path, 'wb') as f:
pickle.dump(data, f)
def read_pickle(pkl_path):
with open(pkl_path, 'rb') as f:
return pickle.load(f)
def draw_epipolar_line(F, img0, img1, pt0, color):
h1,w1=img1.shape[:2]
hpt = np.asarray([pt0[0], pt0[1], 1], dtype=np.float32)[:, None]
l = F @ hpt
l = l[:, 0]
a, b, c = l[0], l[1], l[2]
pt1 = np.asarray([0, -c / b]).astype(np.int32)
pt2 = np.asarray([w1, (-a * w1 - c) / b]).astype(np.int32)
img0 = cv2.circle(img0, tuple(pt0.astype(np.int32)), 5, color, 2)
img1 = cv2.line(img1, tuple(pt1), tuple(pt2), color, 2)
return img0, img1
def draw_epipolar_lines(F, img0, img1,num=20):
img0,img1=img0.copy(),img1.copy()
h0, w0, _ = img0.shape
h1, w1, _ = img1.shape
for k in range(num):
color = np.random.randint(0, 255, [3], dtype=np.int32)
color = [int(c) for c in color]
pt = np.random.uniform(0, 1, 2)
pt[0] *= w0
pt[1] *= h0
pt = pt.astype(np.int32)
img0, img1 = draw_epipolar_line(F, img0, img1, pt, color)
return img0, img1
def compute_F(K1, K2, Rt0, Rt1=None):
if Rt1 is None:
R, t = Rt0[:,:3], Rt0[:,3:]
else:
Rt = compute_dR_dt(Rt0,Rt1)
R, t = Rt[:,:3], Rt[:,3:]
A = K1 @ R.T @ t # [3,1]
C = np.asarray([[0,-A[2,0],A[1,0]],
[A[2,0],0,-A[0,0]],
[-A[1,0],A[0,0],0]])
F = (np.linalg.inv(K2)).T @ R @ K1.T @ C
return F
def compute_dR_dt(Rt0, Rt1):
R0, t0 = Rt0[:,:3], Rt0[:,3:]
R1, t1 = Rt1[:,:3], Rt1[:,3:]
dR = np.dot(R1, R0.T)
dt = t1 - np.dot(dR, t0)
return np.concatenate([dR, dt], -1)
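`compute_dR_dt` returns the relative pose `[dR|dt]` that maps camera-0 coordinates to camera-1 coordinates, i.e. applying pose 1 to a world point equals applying `[dR|dt]` after pose 0. A quick numerical check of that identity (standalone copy of the function for testing):

```python
import numpy as np

def compute_dR_dt(Rt0, Rt1):
    R0, t0 = Rt0[:, :3], Rt0[:, 3:]
    R1, t1 = Rt1[:, :3], Rt1[:, 3:]
    dR = R1 @ R0.T
    dt = t1 - dR @ t0
    return np.concatenate([dR, dt], -1)

def rot_z(deg):
    c, s = np.cos(np.deg2rad(deg)), np.sin(np.deg2rad(deg))
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

Rt0 = np.concatenate([rot_z(30), np.array([[1.0], [2.0], [3.0]])], -1)
Rt1 = np.concatenate([rot_z(75), np.array([[-1.0], [0.5], [2.0]])], -1)
dRt = compute_dR_dt(Rt0, Rt1)
p = np.array([[0.3], [0.7], [-0.2]])
lhs = Rt1[:, :3] @ p + Rt1[:, 3:]                              # pose 1 applied directly
rhs = dRt[:, :3] @ (Rt0[:, :3] @ p + Rt0[:, 3:]) + dRt[:, 3:]  # pose 0 then [dR|dt]
```

Algebraically, `dR (R0 p + t0) + t1 - dR t0 = R1 R0^T R0 p + t1 = R1 p + t1`, so the two sides agree.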
def concat_images(img0, img1, vert=False):
    if not vert:
        h0, h1 = img0.shape[0], img1.shape[0]
        if h0 < h1: img0 = cv2.copyMakeBorder(img0, 0, h1 - h0, 0, 0, borderType=cv2.BORDER_CONSTANT, value=0)
        if h1 < h0: img1 = cv2.copyMakeBorder(img1, 0, h0 - h1, 0, 0, borderType=cv2.BORDER_CONSTANT, value=0)
        img = np.concatenate([img0, img1], axis=1)
    else:
        w0, w1 = img0.shape[1], img1.shape[1]
        if w0 < w1: img0 = cv2.copyMakeBorder(img0, 0, 0, 0, w1 - w0, borderType=cv2.BORDER_CONSTANT, value=0)
        if w1 < w0: img1 = cv2.copyMakeBorder(img1, 0, 0, 0, w0 - w1, borderType=cv2.BORDER_CONSTANT, value=0)
        img = np.concatenate([img0, img1], axis=0)
    return img

def pose_inverse(pose):
    # Imported by ldm/data/control_sync_dreamer.py; inverts a [3, 4] pose.
    R = pose[:, :3].T
    t = -R @ pose[:, 3:]
    return np.concatenate([R, t], -1)

def project_points(pts, RT, K):
    pts = np.matmul(pts, RT[:, :3].transpose()) + RT[:, 3:].transpose()
    pts = np.matmul(pts, K.transpose())
    dpt = pts[:, 2]
    mask0 = (np.abs(dpt) < 1e-4) & (np.abs(dpt) > 0)
    if np.sum(mask0) > 0: dpt[mask0] = 1e-4
    mask1 = (np.abs(dpt) > -1e-4) & (np.abs(dpt) < 0)
    if np.sum(mask1) > 0: dpt[mask1] = -1e-4
    pts2d = pts[:, :2] / dpt[:, None]
    return pts2d, dpt
def draw_keypoints(img, kps, colors=None, radius=2):
out_img=img.copy()
for pi, pt in enumerate(kps):
pt = np.round(pt).astype(np.int32)
if colors is not None:
color=[int(c) for c in colors[pi]]
cv2.circle(out_img, tuple(pt), radius, color, -1)
else:
cv2.circle(out_img, tuple(pt), radius, (0,255,0), -1)
return out_img
def output_points(fn,pts,colors=None):
with open(fn, 'w') as f:
for pi, pt in enumerate(pts):
f.write(f'{pt[0]:.6f} {pt[1]:.6f} {pt[2]:.6f} ')
if colors is not None:
f.write(f'{int(colors[pi,0])} {int(colors[pi,1])} {int(colors[pi,2])}')
f.write('\n')
def mask_depth_to_pts(mask,depth,K,rgb=None):
hs,ws=np.nonzero(mask)
depth=depth[hs,ws]
pts=np.asarray([ws,hs,depth],np.float32).transpose()
pts[:,:2]*=pts[:,2:]
if rgb is not None:
return np.dot(pts, np.linalg.inv(K).transpose()), rgb[hs,ws]
else:
return np.dot(pts, np.linalg.inv(K).transpose())
def transform_points_pose(pts, pose):
R, t = pose[:, :3], pose[:, 3]
if len(pts.shape)==1:
return (R @ pts[:,None] + t[:,None])[:,0]
return pts @ R.T + t[None,:]
def pose_apply(pose,pts):
return transform_points_pose(pts, pose)
def downsample_gaussian_blur(img, ratio):
sigma = (1 / ratio) / 3
# ksize=np.ceil(2*sigma)
ksize = int(np.ceil(((sigma - 0.8) / 0.3 + 1) * 2 + 1))
ksize = ksize + 1 if ksize % 2 == 0 else ksize
img = cv2.GaussianBlur(img, (ksize, ksize), sigma, borderType=cv2.BORDER_REFLECT101)
return img
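`downsample_gaussian_blur` picks its kernel size by inverting OpenCV's default sigma-from-ksize relation (`sigma = 0.3 * ((ksize - 1) * 0.5 - 1) + 0.8`) and rounding up to an odd value. The same computation isolated:

```python
import math

def gaussian_ksize(ratio):
    """Kernel size used by downsample_gaussian_blur for a given downsample
    ratio: sigma = (1/ratio)/3, then invert OpenCV's sigma(ksize) rule and
    force the result odd (GaussianBlur requires odd kernel sizes)."""
    sigma = (1.0 / ratio) / 3.0
    ksize = int(math.ceil(((sigma - 0.8) / 0.3 + 1.0) * 2.0 + 1.0))
    return ksize + 1 if ksize % 2 == 0 else ksize
```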
================================================
FILE: ldm/data/__init__.py
================================================
================================================
FILE: ldm/data/base.py
================================================
import os
import numpy as np
from abc import abstractmethod
from torch.utils.data import Dataset, ConcatDataset, ChainDataset, IterableDataset
class Txt2ImgIterableBaseDataset(IterableDataset):
'''
Define an interface to make the IterableDatasets for text2img data chainable
'''
def __init__(self, num_records=0, valid_ids=None, size=256):
super().__init__()
self.num_records = num_records
self.valid_ids = valid_ids
self.sample_ids = valid_ids
self.size = size
print(f'{self.__class__.__name__} dataset contains {self.__len__()} examples.')
def __len__(self):
return self.num_records
@abstractmethod
def __iter__(self):
pass
class PRNGMixin(object):
"""
Adds a prng property which is a numpy RandomState which gets
reinitialized whenever the pid changes to avoid synchronized sampling
behavior when used in conjunction with multiprocessing.
"""
@property
def prng(self):
currentpid = os.getpid()
if getattr(self, "_initpid", None) != currentpid:
self._initpid = currentpid
self._prng = np.random.RandomState()
return self._prng
================================================
FILE: ldm/data/coco.py
================================================
import os
import json
import albumentations
import numpy as np
from PIL import Image
from tqdm import tqdm
from torch.utils.data import Dataset
from abc import abstractmethod
class CocoBase(Dataset):
"""needed for (image, caption, segmentation) pairs"""
def __init__(self, size=None, dataroot="", datajson="", onehot_segmentation=False, use_stuffthing=False,
crop_size=None, force_no_crop=False, given_files=None, use_segmentation=True,crop_type=None):
self.split = self.get_split()
self.size = size
if crop_size is None:
self.crop_size = size
else:
self.crop_size = crop_size
assert crop_type in [None, 'random', 'center']
self.crop_type = crop_type
self.use_segmenation = use_segmentation
self.onehot = onehot_segmentation # return segmentation as rgb or one hot
self.stuffthing = use_stuffthing # include thing in segmentation
if self.onehot and not self.stuffthing:
            raise NotImplementedError("One hot mode is only supported for the "
                                      "stuffthings version because labels are stored "
                                      "a bit differently.")
data_json = datajson
with open(data_json) as json_file:
self.json_data = json.load(json_file)
self.img_id_to_captions = dict()
self.img_id_to_filepath = dict()
self.img_id_to_segmentation_filepath = dict()
assert data_json.split("/")[-1] in [f"captions_train{self.year()}.json",
f"captions_val{self.year()}.json"]
# TODO currently hardcoded paths, would be better to follow logic in
# cocstuff pixelmaps
if self.use_segmenation:
if self.stuffthing:
self.segmentation_prefix = (
f"data/cocostuffthings/val{self.year()}" if
data_json.endswith(f"captions_val{self.year()}.json") else
f"data/cocostuffthings/train{self.year()}")
else:
self.segmentation_prefix = (
f"data/coco/annotations/stuff_val{self.year()}_pixelmaps" if
data_json.endswith(f"captions_val{self.year()}.json") else
f"data/coco/annotations/stuff_train{self.year()}_pixelmaps")
imagedirs = self.json_data["images"]
self.labels = {"image_ids": list()}
for imgdir in tqdm(imagedirs, desc="ImgToPath"):
self.img_id_to_filepath[imgdir["id"]] = os.path.join(dataroot, imgdir["file_name"])
self.img_id_to_captions[imgdir["id"]] = list()
pngfilename = imgdir["file_name"].replace("jpg", "png")
if self.use_segmenation:
self.img_id_to_segmentation_filepath[imgdir["id"]] = os.path.join(
self.segmentation_prefix, pngfilename)
if given_files is not None:
if pngfilename in given_files:
self.labels["image_ids"].append(imgdir["id"])
else:
self.labels["image_ids"].append(imgdir["id"])
capdirs = self.json_data["annotations"]
for capdir in tqdm(capdirs, desc="ImgToCaptions"):
# there are in average 5 captions per image
#self.img_id_to_captions[capdir["image_id"]].append(np.array([capdir["caption"]]))
self.img_id_to_captions[capdir["image_id"]].append(capdir["caption"])
self.rescaler = albumentations.SmallestMaxSize(max_size=self.size)
if self.split=="validation":
self.cropper = albumentations.CenterCrop(height=self.crop_size, width=self.crop_size)
else:
# default option for train is random crop
if self.crop_type in [None, 'random']:
self.cropper = albumentations.RandomCrop(height=self.crop_size, width=self.crop_size)
else:
self.cropper = albumentations.CenterCrop(height=self.crop_size, width=self.crop_size)
self.preprocessor = albumentations.Compose(
[self.rescaler, self.cropper],
additional_targets={"segmentation": "image"})
if force_no_crop:
self.rescaler = albumentations.Resize(height=self.size, width=self.size)
self.preprocessor = albumentations.Compose(
[self.rescaler],
additional_targets={"segmentation": "image"})
@abstractmethod
def year(self):
raise NotImplementedError()
def __len__(self):
return len(self.labels["image_ids"])
def preprocess_image(self, image_path, segmentation_path=None):
image = Image.open(image_path)
if not image.mode == "RGB":
image = image.convert("RGB")
image = np.array(image).astype(np.uint8)
if segmentation_path:
segmentation = Image.open(segmentation_path)
if not self.onehot and not segmentation.mode == "RGB":
segmentation = segmentation.convert("RGB")
segmentation = np.array(segmentation).astype(np.uint8)
if self.onehot:
assert self.stuffthing
# stored in caffe format: unlabeled==255. stuff and thing from
# 0-181. to be compatible with the labels in
# https://github.com/nightrome/cocostuff/blob/master/labels.txt
# we shift stuffthing one to the right and put unlabeled in zero
# as long as segmentation is uint8 shifting to right handles the
# latter too
assert segmentation.dtype == np.uint8
segmentation = segmentation + 1
processed = self.preprocessor(image=image, segmentation=segmentation)
image, segmentation = processed["image"], processed["segmentation"]
else:
image = self.preprocessor(image=image,)['image']
image = (image / 127.5 - 1.0).astype(np.float32)
if segmentation_path:
if self.onehot:
assert segmentation.dtype == np.uint8
# make it one hot
n_labels = 183
flatseg = np.ravel(segmentation)
                onehot = np.zeros((flatseg.size, n_labels), dtype=bool)  # np.bool was removed in NumPy 1.24
onehot[np.arange(flatseg.size), flatseg] = True
onehot = onehot.reshape(segmentation.shape + (n_labels,)).astype(int)
segmentation = onehot
else:
segmentation = (segmentation / 127.5 - 1.0).astype(np.float32)
return image, segmentation
else:
return image
def __getitem__(self, i):
img_path = self.img_id_to_filepath[self.labels["image_ids"][i]]
if self.use_segmenation:
seg_path = self.img_id_to_segmentation_filepath[self.labels["image_ids"][i]]
image, segmentation = self.preprocess_image(img_path, seg_path)
else:
image = self.preprocess_image(img_path)
captions = self.img_id_to_captions[self.labels["image_ids"][i]]
# randomly draw one of all available captions per image
caption = captions[np.random.randint(0, len(captions))]
example = {"image": image,
#"caption": [str(caption[0])],
"caption": caption,
"img_path": img_path,
"filename_": img_path.split(os.sep)[-1]
}
if self.use_segmenation:
example.update({"seg_path": seg_path, 'segmentation': segmentation})
return example
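The one-hot branch in `preprocess_image` turns a uint8 label map into an `(H, W, n_labels)` indicator volume by flattening, fancy-indexing a boolean array, and reshaping. The same trick in isolation (with a small `n_labels` for the example; the dataset uses 183):

```python
import numpy as np

def to_onehot(segmentation, n_labels=183):
    """One-hot encode a label map, as done in CocoBase.preprocess_image."""
    flat = np.ravel(segmentation)
    onehot = np.zeros((flat.size, n_labels), dtype=bool)
    onehot[np.arange(flat.size), flat] = True  # set the class bit per pixel
    return onehot.reshape(segmentation.shape + (n_labels,)).astype(int)
```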
class CocoImagesAndCaptionsTrain2017(CocoBase):
"""returns a pair of (image, caption)"""
def __init__(self, size, onehot_segmentation=False, use_stuffthing=False, crop_size=None, force_no_crop=False,):
super().__init__(size=size,
dataroot="data/coco/train2017",
datajson="data/coco/annotations/captions_train2017.json",
onehot_segmentation=onehot_segmentation,
use_stuffthing=use_stuffthing, crop_size=crop_size, force_no_crop=force_no_crop)
def get_split(self):
return "train"
def year(self):
return '2017'
class CocoImagesAndCaptionsValidation2017(CocoBase):
"""returns a pair of (image, caption)"""
def __init__(self, size, onehot_segmentation=False, use_stuffthing=False, crop_size=None, force_no_crop=False,
given_files=None):
super().__init__(size=size,
dataroot="data/coco/val2017",
datajson="data/coco/annotations/captions_val2017.json",
onehot_segmentation=onehot_segmentation,
use_stuffthing=use_stuffthing, crop_size=crop_size, force_no_crop=force_no_crop,
given_files=given_files)
def get_split(self):
return "validation"
def year(self):
return '2017'
class CocoImagesAndCaptionsTrain2014(CocoBase):
"""returns a pair of (image, caption)"""
def __init__(self, size, onehot_segmentation=False, use_stuffthing=False, crop_size=None, force_no_crop=False,crop_type='random'):
super().__init__(size=size,
dataroot="data/coco/train2014",
datajson="data/coco/annotations2014/annotations/captions_train2014.json",
onehot_segmentation=onehot_segmentation,
use_stuffthing=use_stuffthing, crop_size=crop_size, force_no_crop=force_no_crop,
use_segmentation=False,
crop_type=crop_type)
def get_split(self):
return "train"
def year(self):
return '2014'
class CocoImagesAndCaptionsValidation2014(CocoBase):
"""returns a pair of (image, caption)"""
def __init__(self, size, onehot_segmentation=False, use_stuffthing=False, crop_size=None, force_no_crop=False,
given_files=None,crop_type='center',**kwargs):
super().__init__(size=size,
dataroot="data/coco/val2014",
datajson="data/coco/annotations2014/annotations/captions_val2014.json",
onehot_segmentation=onehot_segmentation,
use_stuffthing=use_stuffthing, crop_size=crop_size, force_no_crop=force_no_crop,
given_files=given_files,
use_segmentation=False,
crop_type=crop_type)
def get_split(self):
return "validation"
def year(self):
return '2014'
if __name__ == '__main__':
with open("data/coco/annotations2014/annotations/captions_val2014.json", "r") as json_file:
json_data = json.load(json_file)
capdirs = json_data["annotations"]
import pudb; pudb.set_trace()
#d2 = CocoImagesAndCaptionsTrain2014(size=256)
d2 = CocoImagesAndCaptionsValidation2014(size=256)
print("constructed dataset.")
print(f"length of {d2.__class__.__name__}: {len(d2)}")
ex2 = d2[0]
# ex3 = d3[0]
# print(ex1["image"].shape)
print(ex2["image"].shape)
# print(ex3["image"].shape)
# print(ex1["segmentation"].shape)
print(ex2["caption"].__class__.__name__)
================================================
FILE: ldm/data/control_sync_dreamer.py
================================================
import pytorch_lightning as pl
import numpy as np
import torch
import PIL
import os
from skimage.io import imread
import webdataset as wds
import PIL.Image as Image
from torch.utils.data import Dataset
from torch.utils.data.distributed import DistributedSampler
from pathlib import Path
from ldm.base_utils import read_pickle, pose_inverse
from ldm.data.sync_dreamer import SyncDreamerTrainData, SyncDreamerDataset
import torchvision.transforms as transforms
import torchvision
from einops import rearrange
from ldm.util import prepare_inputs, prepare_proxy
class ControlSyncDreamerTrainData(SyncDreamerTrainData):
def __init__(self, target_dir, input_dir, proxy_dir, uid_set_pkl, image_size=256):
self.default_image_size = 256
self.image_size = image_size
self.target_dir = Path(target_dir)
self.input_dir = Path(input_dir)
self.proxy_dir = Path(proxy_dir)
self.proxy_uids = read_pickle(uid_set_pkl) # e.g. self.proxy_uids = ['0012053f094f4309808f52b3efb88977.txt']
self.uids = [i.split('.')[0] for i in self.proxy_uids]
assert len(self.proxy_uids) == len(self.uids)
print('============= length of dataset %d =============' % len(self.uids))
image_transforms = []
image_transforms.extend([transforms.ToTensor(), transforms.Lambda(lambda x: rearrange(x * 2. - 1., 'c h w -> h w c'))])
self.image_transforms = torchvision.transforms.Compose(image_transforms)
self.num_images = 16
def get_data_for_index(self, index):
target_dir = os.path.join(self.target_dir, self.uids[index])
input_dir = os.path.join(self.input_dir, self.uids[index])
views = np.arange(0, self.num_images)
start_view_index = np.random.randint(0, self.num_images)
# start_view_index = 0
views = (views + start_view_index) % self.num_images
target_images = []
for si, target_index in enumerate(views):
img = self.load_index(target_dir, target_index)
target_images.append(img)
target_images = torch.stack(target_images, 0)
input_img = self.load_index(input_dir, start_view_index)
        K, azimuths, elevations, distances, cam_poses = read_pickle(os.path.join(input_dir, 'meta.pkl'))
input_elevation = torch.from_numpy(elevations[start_view_index:start_view_index+1].astype(np.float32))
result = {"target_image": target_images, "input_image": input_img, "input_elevation": input_elevation}
proxy_path = os.path.join(self.proxy_dir, self.proxy_uids[index])
proxy = prepare_proxy(proxy_path)
rot_rad = np.deg2rad(-22.5*start_view_index)
rotate_matrix = torch.from_numpy(np.array([[np.cos(rot_rad), -np.sin(rot_rad), 0], [np.sin(rot_rad), np.cos(rot_rad), 0], [0, 0, 1]]))
proxy = (rotate_matrix * proxy[:, None, :]).sum(-1).float()
result['proxy'] = proxy
return result
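The broadcasted multiply-and-sum used to rotate the proxy above is an ordinary matrix product written element-wise; a minimal numpy sketch (with made-up points, not real proxy data) showing the equivalence:

```python
import numpy as np

# Hypothetical proxy points of shape (N, 3); the real ones come from prepare_proxy.
proxy = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.5]])

# Z-axis rotation by -22.5 degrees per view step, as in get_data_for_index.
rot_rad = np.deg2rad(-22.5)
R = np.array([[np.cos(rot_rad), -np.sin(rot_rad), 0.0],
              [np.sin(rot_rad),  np.cos(rot_rad), 0.0],
              [0.0,              0.0,             1.0]])

# Broadcast form used above: (3,3) * (N,1,3) -> (N,3,3), then sum over the last axis.
rotated_broadcast = (R * proxy[:, None, :]).sum(-1)

# Equivalent ordinary matrix product.
rotated_matmul = proxy @ R.T

assert np.allclose(rotated_broadcast, rotated_matmul)
```

Writing it as `(R * p[:, None, :]).sum(-1)` avoids an explicit transpose, but the result is exactly `p @ R.T`.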
class ControlSyncDreamerEvalData(Dataset):
def __init__(self, image_dir, proxy_dir, uid_set_pkl):
self.image_size = 256
self.image_dir = Path(image_dir)
self.proxy_dir = Path(proxy_dir)
self.crop_size = 20
self.proxy_uids = read_pickle(uid_set_pkl) # e.g. self.proxy_uids = ['0012053f094f4309808f52b3efb88977.txt']
self.uids = [i.split('.')[0] for i in self.proxy_uids]
assert len(self.proxy_uids) == len(self.uids)
print('============= length of dataset %d =============' % len(self.proxy_uids))
def __len__(self):
return len(self.uids)
def get_data_for_index(self, index):
input_img_path = os.path.join(self.image_dir, self.uids[index], '000.png')
proxy_path = os.path.join(self.proxy_dir, self.proxy_uids[index])
elevation = 30
result = prepare_inputs(input_img_path, elevation, 200)
result['proxy'] = prepare_proxy(proxy_path)
return result
def __getitem__(self, index):
return self.get_data_for_index(index)
class ControlSyncDreamerDataset(SyncDreamerDataset):
def __init__(self, target_dir, input_dir, validation_dir, proxy_dir, batch_size, uid_set_pkl, valid_uid_set_pkl, image_size=256, num_workers=4, seed=0, **kwargs):
pl.LightningDataModule.__init__(self)
self.target_dir = target_dir
self.input_dir = input_dir
self.validation_dir = validation_dir
self.batch_size = batch_size
self.num_workers = num_workers
self.uid_set_pkl = uid_set_pkl
self.valid_uid_set_pkl = valid_uid_set_pkl
self.seed = seed
self.additional_args = kwargs
self.image_size = image_size
# --------------------------
self.proxy_dir = proxy_dir
def setup(self, stage):
if stage in ['fit']:
self.train_dataset = ControlSyncDreamerTrainData(self.target_dir, self.input_dir, self.proxy_dir, uid_set_pkl=self.uid_set_pkl, image_size=256)
self.val_dataset = ControlSyncDreamerEvalData(image_dir=self.validation_dir, proxy_dir=self.proxy_dir, uid_set_pkl=self.valid_uid_set_pkl)
else:
raise NotImplementedError
================================================
FILE: ldm/data/dummy.py
================================================
import numpy as np
import random
import string
from torch.utils.data import Dataset, Subset
class DummyData(Dataset):
def __init__(self, length, size):
self.length = length
self.size = size
def __len__(self):
return self.length
def __getitem__(self, i):
x = np.random.randn(*self.size)
        letters = string.ascii_lowercase
        y = ''.join(random.choice(letters) for _ in range(10))
return {"jpg": x, "txt": y}
class DummyDataWithEmbeddings(Dataset):
def __init__(self, length, size, emb_size):
self.length = length
self.size = size
self.emb_size = emb_size
def __len__(self):
return self.length
def __getitem__(self, i):
x = np.random.randn(*self.size)
y = np.random.randn(*self.emb_size).astype(np.float32)
return {"jpg": x, "txt": y}
================================================
FILE: ldm/data/imagenet.py
================================================
import os, yaml, pickle, shutil, tarfile, glob
import cv2
import albumentations
import PIL
import numpy as np
import torchvision.transforms.functional as TF
from omegaconf import OmegaConf
from functools import partial
from PIL import Image
from tqdm import tqdm
from torch.utils.data import Dataset, Subset
import taming.data.utils as tdu
from taming.data.imagenet import str_to_indices, give_synsets_from_indices, download, retrieve
from taming.data.imagenet import ImagePaths
from ldm.modules.image_degradation import degradation_fn_bsr, degradation_fn_bsr_light
def synset2idx(path_to_yaml="data/index_synset.yaml"):
with open(path_to_yaml) as f:
        di2s = yaml.safe_load(f)
return dict((v,k) for k,v in di2s.items())
class ImageNetBase(Dataset):
def __init__(self, config=None):
self.config = config or OmegaConf.create()
        if not isinstance(self.config, dict):
self.config = OmegaConf.to_container(self.config)
self.keep_orig_class_label = self.config.get("keep_orig_class_label", False)
self.process_images = True # if False we skip loading & processing images and self.data contains filepaths
self._prepare()
self._prepare_synset_to_human()
self._prepare_idx_to_synset()
self._prepare_human_to_integer_label()
self._load()
def __len__(self):
return len(self.data)
def __getitem__(self, i):
return self.data[i]
def _prepare(self):
raise NotImplementedError()
def _filter_relpaths(self, relpaths):
ignore = set([
"n06596364_9591.JPEG",
])
relpaths = [rpath for rpath in relpaths if not rpath.split("/")[-1] in ignore]
if "sub_indices" in self.config:
indices = str_to_indices(self.config["sub_indices"])
synsets = give_synsets_from_indices(indices, path_to_yaml=self.idx2syn) # returns a list of strings
self.synset2idx = synset2idx(path_to_yaml=self.idx2syn)
files = []
for rpath in relpaths:
syn = rpath.split("/")[0]
if syn in synsets:
files.append(rpath)
return files
else:
return relpaths
def _prepare_synset_to_human(self):
SIZE = 2655750
URL = "https://heibox.uni-heidelberg.de/f/9f28e956cd304264bb82/?dl=1"
self.human_dict = os.path.join(self.root, "synset_human.txt")
if (not os.path.exists(self.human_dict) or
not os.path.getsize(self.human_dict)==SIZE):
download(URL, self.human_dict)
def _prepare_idx_to_synset(self):
URL = "https://heibox.uni-heidelberg.de/f/d835d5b6ceda4d3aa910/?dl=1"
self.idx2syn = os.path.join(self.root, "index_synset.yaml")
if (not os.path.exists(self.idx2syn)):
download(URL, self.idx2syn)
def _prepare_human_to_integer_label(self):
URL = "https://heibox.uni-heidelberg.de/f/2362b797d5be43b883f6/?dl=1"
self.human2integer = os.path.join(self.root, "imagenet1000_clsidx_to_labels.txt")
if (not os.path.exists(self.human2integer)):
download(URL, self.human2integer)
with open(self.human2integer, "r") as f:
lines = f.read().splitlines()
assert len(lines) == 1000
self.human2integer_dict = dict()
for line in lines:
value, key = line.split(":")
self.human2integer_dict[key] = int(value)
def _load(self):
with open(self.txt_filelist, "r") as f:
self.relpaths = f.read().splitlines()
l1 = len(self.relpaths)
self.relpaths = self._filter_relpaths(self.relpaths)
print("Removed {} files from filelist during filtering.".format(l1 - len(self.relpaths)))
self.synsets = [p.split("/")[0] for p in self.relpaths]
self.abspaths = [os.path.join(self.datadir, p) for p in self.relpaths]
unique_synsets = np.unique(self.synsets)
class_dict = dict((synset, i) for i, synset in enumerate(unique_synsets))
if not self.keep_orig_class_label:
self.class_labels = [class_dict[s] for s in self.synsets]
else:
self.class_labels = [self.synset2idx[s] for s in self.synsets]
with open(self.human_dict, "r") as f:
human_dict = f.read().splitlines()
human_dict = dict(line.split(maxsplit=1) for line in human_dict)
self.human_labels = [human_dict[s] for s in self.synsets]
labels = {
"relpath": np.array(self.relpaths),
"synsets": np.array(self.synsets),
"class_label": np.array(self.class_labels),
"human_label": np.array(self.human_labels),
}
if self.process_images:
self.size = retrieve(self.config, "size", default=256)
self.data = ImagePaths(self.abspaths,
labels=labels,
size=self.size,
random_crop=self.random_crop,
)
else:
self.data = self.abspaths
class ImageNetTrain(ImageNetBase):
NAME = "ILSVRC2012_train"
URL = "http://www.image-net.org/challenges/LSVRC/2012/"
AT_HASH = "a306397ccf9c2ead27155983c254227c0fd938e2"
FILES = [
"ILSVRC2012_img_train.tar",
]
SIZES = [
147897477120,
]
def __init__(self, process_images=True, data_root=None, **kwargs):
self.process_images = process_images
self.data_root = data_root
super().__init__(**kwargs)
def _prepare(self):
if self.data_root:
self.root = os.path.join(self.data_root, self.NAME)
else:
cachedir = os.environ.get("XDG_CACHE_HOME", os.path.expanduser("~/.cache"))
self.root = os.path.join(cachedir, "autoencoders/data", self.NAME)
self.datadir = os.path.join(self.root, "data")
self.txt_filelist = os.path.join(self.root, "filelist.txt")
self.expected_length = 1281167
self.random_crop = retrieve(self.config, "ImageNetTrain/random_crop",
default=True)
if not tdu.is_prepared(self.root):
# prep
print("Preparing dataset {} in {}".format(self.NAME, self.root))
datadir = self.datadir
if not os.path.exists(datadir):
path = os.path.join(self.root, self.FILES[0])
if not os.path.exists(path) or not os.path.getsize(path)==self.SIZES[0]:
import academictorrents as at
atpath = at.get(self.AT_HASH, datastore=self.root)
assert atpath == path
print("Extracting {} to {}".format(path, datadir))
os.makedirs(datadir, exist_ok=True)
with tarfile.open(path, "r:") as tar:
tar.extractall(path=datadir)
print("Extracting sub-tars.")
subpaths = sorted(glob.glob(os.path.join(datadir, "*.tar")))
for subpath in tqdm(subpaths):
subdir = subpath[:-len(".tar")]
os.makedirs(subdir, exist_ok=True)
with tarfile.open(subpath, "r:") as tar:
tar.extractall(path=subdir)
filelist = glob.glob(os.path.join(datadir, "**", "*.JPEG"))
filelist = [os.path.relpath(p, start=datadir) for p in filelist]
filelist = sorted(filelist)
filelist = "\n".join(filelist)+"\n"
with open(self.txt_filelist, "w") as f:
f.write(filelist)
tdu.mark_prepared(self.root)
class ImageNetValidation(ImageNetBase):
NAME = "ILSVRC2012_validation"
URL = "http://www.image-net.org/challenges/LSVRC/2012/"
AT_HASH = "5d6d0df7ed81efd49ca99ea4737e0ae5e3a5f2e5"
VS_URL = "https://heibox.uni-heidelberg.de/f/3e0f6e9c624e45f2bd73/?dl=1"
FILES = [
"ILSVRC2012_img_val.tar",
"validation_synset.txt",
]
SIZES = [
6744924160,
1950000,
]
def __init__(self, process_images=True, data_root=None, **kwargs):
self.data_root = data_root
self.process_images = process_images
super().__init__(**kwargs)
def _prepare(self):
if self.data_root:
self.root = os.path.join(self.data_root, self.NAME)
else:
cachedir = os.environ.get("XDG_CACHE_HOME", os.path.expanduser("~/.cache"))
self.root = os.path.join(cachedir, "autoencoders/data", self.NAME)
self.datadir = os.path.join(self.root, "data")
self.txt_filelist = os.path.join(self.root, "filelist.txt")
self.expected_length = 50000
self.random_crop = retrieve(self.config, "ImageNetValidation/random_crop",
default=False)
if not tdu.is_prepared(self.root):
# prep
print("Preparing dataset {} in {}".format(self.NAME, self.root))
datadir = self.datadir
if not os.path.exists(datadir):
path = os.path.join(self.root, self.FILES[0])
if not os.path.exists(path) or not os.path.getsize(path)==self.SIZES[0]:
import academictorrents as at
atpath = at.get(self.AT_HASH, datastore=self.root)
assert atpath == path
print("Extracting {} to {}".format(path, datadir))
os.makedirs(datadir, exist_ok=True)
with tarfile.open(path, "r:") as tar:
tar.extractall(path=datadir)
vspath = os.path.join(self.root, self.FILES[1])
if not os.path.exists(vspath) or not os.path.getsize(vspath)==self.SIZES[1]:
download(self.VS_URL, vspath)
with open(vspath, "r") as f:
synset_dict = f.read().splitlines()
synset_dict = dict(line.split() for line in synset_dict)
print("Reorganizing into synset folders")
synsets = np.unique(list(synset_dict.values()))
for s in synsets:
os.makedirs(os.path.join(datadir, s), exist_ok=True)
for k, v in synset_dict.items():
src = os.path.join(datadir, k)
dst = os.path.join(datadir, v)
shutil.move(src, dst)
filelist = glob.glob(os.path.join(datadir, "**", "*.JPEG"))
filelist = [os.path.relpath(p, start=datadir) for p in filelist]
filelist = sorted(filelist)
filelist = "\n".join(filelist)+"\n"
with open(self.txt_filelist, "w") as f:
f.write(filelist)
tdu.mark_prepared(self.root)
class ImageNetSR(Dataset):
def __init__(self, size=None,
degradation=None, downscale_f=4, min_crop_f=0.5, max_crop_f=1.,
random_crop=True):
"""
Imagenet Superresolution Dataloader
Performs following ops in order:
1. crops a crop of size s from image either as random or center crop
2. resizes crop to size with cv2.area_interpolation
3. degrades resized crop with degradation_fn
:param size: resizing to size after cropping
:param degradation: degradation_fn, e.g. cv_bicubic or bsrgan_light
:param downscale_f: Low Resolution Downsample factor
        :param min_crop_f: lower bound on the crop factor c; the crop side is
            s = c * min_img_side_len with c sampled uniformly from (min_crop_f, max_crop_f)
        :param max_crop_f: upper bound on the crop factor c
        :param random_crop: take a random crop if True, otherwise a center crop
"""
self.base = self.get_base()
assert size
assert (size / downscale_f).is_integer()
self.size = size
self.LR_size = int(size / downscale_f)
self.min_crop_f = min_crop_f
self.max_crop_f = max_crop_f
        assert max_crop_f <= 1.
self.center_crop = not random_crop
self.image_rescaler = albumentations.SmallestMaxSize(max_size=size, interpolation=cv2.INTER_AREA)
        self.pil_interpolation = False  # reset below if the interpolation op comes from PIL
if degradation == "bsrgan":
self.degradation_process = partial(degradation_fn_bsr, sf=downscale_f)
elif degradation == "bsrgan_light":
self.degradation_process = partial(degradation_fn_bsr_light, sf=downscale_f)
else:
interpolation_fn = {
"cv_nearest": cv2.INTER_NEAREST,
"cv_bilinear": cv2.INTER_LINEAR,
"cv_bicubic": cv2.INTER_CUBIC,
"cv_area": cv2.INTER_AREA,
"cv_lanczos": cv2.INTER_LANCZOS4,
"pil_nearest": PIL.Image.NEAREST,
"pil_bilinear": PIL.Image.BILINEAR,
"pil_bicubic": PIL.Image.BICUBIC,
"pil_box": PIL.Image.BOX,
"pil_hamming": PIL.Image.HAMMING,
"pil_lanczos": PIL.Image.LANCZOS,
}[degradation]
self.pil_interpolation = degradation.startswith("pil_")
if self.pil_interpolation:
self.degradation_process = partial(TF.resize, size=self.LR_size, interpolation=interpolation_fn)
else:
self.degradation_process = albumentations.SmallestMaxSize(max_size=self.LR_size,
interpolation=interpolation_fn)
def __len__(self):
return len(self.base)
def __getitem__(self, i):
example = self.base[i]
image = Image.open(example["file_path_"])
if not image.mode == "RGB":
image = image.convert("RGB")
image = np.array(image).astype(np.uint8)
min_side_len = min(image.shape[:2])
crop_side_len = min_side_len * np.random.uniform(self.min_crop_f, self.max_crop_f, size=None)
crop_side_len = int(crop_side_len)
if self.center_crop:
self.cropper = albumentations.CenterCrop(height=crop_side_len, width=crop_side_len)
else:
self.cropper = albumentations.RandomCrop(height=crop_side_len, width=crop_side_len)
image = self.cropper(image=image)["image"]
image = self.image_rescaler(image=image)["image"]
if self.pil_interpolation:
image_pil = PIL.Image.fromarray(image)
LR_image = self.degradation_process(image_pil)
LR_image = np.array(LR_image).astype(np.uint8)
else:
LR_image = self.degradation_process(image=image)["image"]
example["image"] = (image/127.5 - 1.0).astype(np.float32)
example["LR_image"] = (LR_image/127.5 - 1.0).astype(np.float32)
example["caption"] = example["human_label"] # dummy caption
return example
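The crop-size sampling in `__getitem__` can be sketched in isolation; the values below mirror the `ImageNetSR` defaults (`min_crop_f=0.5`, `max_crop_f=1.0`) with a made-up image shape:

```python
import numpy as np

# Hypothetical image shape and the default crop-fraction bounds.
img_h, img_w = 384, 512
min_crop_f, max_crop_f = 0.5, 1.0

min_side_len = min(img_h, img_w)
# c is drawn uniformly from [min_crop_f, max_crop_f); the square crop side is s = c * min_side_len.
rng = np.random.default_rng(0)
crop_side_len = int(min_side_len * rng.uniform(min_crop_f, max_crop_f))

# The square crop always fits inside the image.
assert min_side_len * min_crop_f - 1 <= crop_side_len <= min_side_len
```

After this crop, `image_rescaler` resizes the square to `size`, and the degradation function produces the `LR_size` counterpart.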
class ImageNetSRTrain(ImageNetSR):
def __init__(self, **kwargs):
super().__init__(**kwargs)
def get_base(self):
with open("data/imagenet_train_hr_indices.p", "rb") as f:
indices = pickle.load(f)
dset = ImageNetTrain(process_images=False,)
return Subset(dset, indices)
class ImageNetSRValidation(ImageNetSR):
def __init__(self, **kwargs):
super().__init__(**kwargs)
def get_base(self):
with open("data/imagenet_val_hr_indices.p", "rb") as f:
indices = pickle.load(f)
dset = ImageNetValidation(process_images=False,)
return Subset(dset, indices)
================================================
FILE: ldm/data/inpainting/__init__.py
================================================
================================================
FILE: ldm/data/inpainting/synthetic_mask.py
================================================
from PIL import Image, ImageDraw
import numpy as np
settings = {
"256narrow": {
"p_irr": 1,
"min_n_irr": 4,
"max_n_irr": 50,
"max_l_irr": 40,
"max_w_irr": 10,
"min_n_box": None,
"max_n_box": None,
"min_s_box": None,
"max_s_box": None,
"marg": None,
},
"256train": {
"p_irr": 0.5,
"min_n_irr": 1,
"max_n_irr": 5,
"max_l_irr": 200,
"max_w_irr": 100,
"min_n_box": 1,
"max_n_box": 4,
"min_s_box": 30,
"max_s_box": 150,
"marg": 10,
},
"512train": { # TODO: experimental
"p_irr": 0.5,
"min_n_irr": 1,
"max_n_irr": 5,
"max_l_irr": 450,
"max_w_irr": 250,
"min_n_box": 1,
"max_n_box": 4,
"min_s_box": 30,
"max_s_box": 300,
"marg": 10,
},
"512train-large": { # TODO: experimental
"p_irr": 0.5,
"min_n_irr": 1,
"max_n_irr": 5,
"max_l_irr": 450,
"max_w_irr": 400,
"min_n_box": 1,
"max_n_box": 4,
"min_s_box": 75,
"max_s_box": 450,
"marg": 10,
},
}
def gen_segment_mask(mask, start, end, brush_width):
mask = mask > 0
mask = (255 * mask).astype(np.uint8)
mask = Image.fromarray(mask)
draw = ImageDraw.Draw(mask)
draw.line([start, end], fill=255, width=brush_width, joint="curve")
mask = np.array(mask) / 255
return mask
def gen_box_mask(mask, masked):
x_0, y_0, w, h = masked
mask[y_0:y_0 + h, x_0:x_0 + w] = 1
return mask
def gen_round_mask(mask, masked, radius):
x_0, y_0, w, h = masked
    xy = [(x_0, y_0), (x_0 + w, y_0 + h)]
mask = mask > 0
mask = (255 * mask).astype(np.uint8)
mask = Image.fromarray(mask)
draw = ImageDraw.Draw(mask)
draw.rounded_rectangle(xy, radius=radius, fill=255)
mask = np.array(mask) / 255
return mask
def gen_large_mask(prng, img_h, img_w,
marg, p_irr, min_n_irr, max_n_irr, max_l_irr, max_w_irr,
min_n_box, max_n_box, min_s_box, max_s_box):
"""
img_h: int, an image height
img_w: int, an image width
marg: int, a margin for a box starting coordinate
p_irr: float, 0 <= p_irr <= 1, a probability of a polygonal chain mask
min_n_irr: int, min number of segments
max_n_irr: int, max number of segments
max_l_irr: max length of a segment in polygonal chain
max_w_irr: max width of a segment in polygonal chain
min_n_box: int, min bound for the number of box primitives
max_n_box: int, max bound for the number of box primitives
min_s_box: int, min length of a box side
max_s_box: int, max length of a box side
"""
mask = np.zeros((img_h, img_w))
uniform = prng.randint
    if prng.uniform(0, 1) < p_irr: # generate polygonal chain
n = uniform(min_n_irr, max_n_irr) # sample number of segments
for _ in range(n):
y = uniform(0, img_h) # sample a starting point
x = uniform(0, img_w)
            a = uniform(0, 360) # sample angle in degrees
            l = uniform(10, max_l_irr) # sample segment length
            w = uniform(5, max_w_irr) # sample a segment width
            # draw segment starting from (x,y) to (x_,y_) using brush of width w
            a_rad = np.deg2rad(a)
            x_ = x + l * np.sin(a_rad)
            y_ = y + l * np.cos(a_rad)
mask = gen_segment_mask(mask, start=(x, y), end=(x_, y_), brush_width=w)
x, y = x_, y_
else: # generate Box masks
n = uniform(min_n_box, max_n_box) # sample number of rectangles
for _ in range(n):
h = uniform(min_s_box, max_s_box) # sample box shape
w = uniform(min_s_box, max_s_box)
x_0 = uniform(marg, img_w - marg - w) # sample upper-left coordinates of box
y_0 = uniform(marg, img_h - marg - h)
            if prng.uniform(0, 1) < 0.5:
mask = gen_box_mask(mask, masked=(x_0, y_0, w, h))
else:
r = uniform(0, 60) # sample radius
mask = gen_round_mask(mask, masked=(x_0, y_0, w, h), radius=r)
return mask
make_lama_mask = lambda prng, h, w: gen_large_mask(prng, h, w, **settings["256train"])
make_narrow_lama_mask = lambda prng, h, w: gen_large_mask(prng, h, w, **settings["256narrow"])
make_512_lama_mask = lambda prng, h, w: gen_large_mask(prng, h, w, **settings["512train"])
make_512_lama_mask_large = lambda prng, h, w: gen_large_mask(prng, h, w, **settings["512train-large"])
MASK_MODES = {
"256train": make_lama_mask,
"256narrow": make_narrow_lama_mask,
"512train": make_512_lama_mask,
"512train-large": make_512_lama_mask_large
}
if __name__ == "__main__":
import sys
out = sys.argv[1]
prng = np.random.RandomState(1)
kwargs = settings["256train"]
mask = gen_large_mask(prng, 256, 256, **kwargs)
mask = (255 * mask).astype(np.uint8)
mask = Image.fromarray(mask)
mask.save(out)
================================================
FILE: ldm/data/laion.py
================================================
import webdataset as wds
import kornia
from PIL import Image
import io
import os
import torchvision
from PIL import Image
import glob
import random
import numpy as np
import pytorch_lightning as pl
from tqdm import tqdm
from omegaconf import OmegaConf
from einops import rearrange
import torch
import torch.nn as nn
import mmh3
from webdataset.handlers import warn_and_continue
from ldm.util import instantiate_from_config
from ldm.data.inpainting.synthetic_mask import gen_large_mask, MASK_MODES
from ldm.data.base import PRNGMixin
class DataWithWings(torch.utils.data.IterableDataset):
def __init__(self, min_size, transform=None, target_transform=None):
self.min_size = min_size
self.transform = transform if transform is not None else nn.Identity()
self.target_transform = target_transform if target_transform is not None else nn.Identity()
self.kv = OnDiskKV(file='/home/ubuntu/laion5B-watermark-safety-ordered', key_format='q', value_format='ee')
self.kv_aesthetic = OnDiskKV(file='/home/ubuntu/laion5B-aesthetic-tags-kv', key_format='q', value_format='e')
self.pwatermark_threshold = 0.8
self.punsafe_threshold = 0.5
self.aesthetic_threshold = 5.
self.total_samples = 0
self.samples = 0
location = 'pipe:aws s3 cp --quiet s3://s-datasets/laion5b/laion2B-data/{000000..231349}.tar -'
self.inner_dataset = wds.DataPipeline(
wds.ResampledShards(location),
wds.tarfile_to_samples(handler=wds.warn_and_continue),
wds.shuffle(1000, handler=wds.warn_and_continue),
wds.decode('pilrgb', handler=wds.warn_and_continue),
wds.map(self._add_tags, handler=wds.ignore_and_continue),
wds.select(self._filter_predicate),
wds.map_dict(jpg=self.transform, txt=self.target_transform, punsafe=self._punsafe_to_class, handler=wds.warn_and_continue),
wds.to_tuple('jpg', 'txt', 'punsafe', handler=wds.warn_and_continue),
)
@staticmethod
def _compute_hash(url, text):
if url is None:
url = ''
if text is None:
text = ''
total = (url + text).encode('utf-8')
return mmh3.hash64(total)[0]
def _add_tags(self, x):
hsh = self._compute_hash(x['json']['url'], x['txt'])
pwatermark, punsafe = self.kv[hsh]
aesthetic = self.kv_aesthetic[hsh][0]
return {**x, 'pwatermark': pwatermark, 'punsafe': punsafe, 'aesthetic': aesthetic}
def _punsafe_to_class(self, punsafe):
return torch.tensor(punsafe >= self.punsafe_threshold).long()
def _filter_predicate(self, x):
try:
return x['pwatermark'] < self.pwatermark_threshold and x['aesthetic'] >= self.aesthetic_threshold and x['json']['original_width'] >= self.min_size and x['json']['original_height'] >= self.min_size
        except Exception:
return False
def __iter__(self):
return iter(self.inner_dataset)
def dict_collation_fn(samples, combine_tensors=True, combine_scalars=True):
"""Take a list of samples (as dictionary) and create a batch, preserving the keys.
If `tensors` is True, `ndarray` objects are combined into
tensor batches.
:param dict samples: list of samples
:param bool tensors: whether to turn lists of ndarrays into a single ndarray
:returns: single sample consisting of a batch
:rtype: dict
"""
keys = set.intersection(*[set(sample.keys()) for sample in samples])
batched = {key: [] for key in keys}
for s in samples:
[batched[key].append(s[key]) for key in batched]
result = {}
for key in batched:
if isinstance(batched[key][0], (int, float)):
if combine_scalars:
result[key] = np.array(list(batched[key]))
elif isinstance(batched[key][0], torch.Tensor):
if combine_tensors:
result[key] = torch.stack(list(batched[key]))
elif isinstance(batched[key][0], np.ndarray):
if combine_tensors:
result[key] = np.array(list(batched[key]))
else:
result[key] = list(batched[key])
return result
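A condensed, self-contained sketch of what `dict_collation_fn` does (the `torch.Tensor` branch is omitted, and the sample values are made up):

```python
import numpy as np

def collate_dicts(samples):
    """Condensed sketch of dict_collation_fn: batch dict samples key-by-key."""
    # Only keys present in every sample survive.
    keys = set.intersection(*[set(s.keys()) for s in samples])
    batched = {k: [s[k] for s in samples] for k in keys}
    out = {}
    for k, vals in batched.items():
        if isinstance(vals[0], (int, float)):
            out[k] = np.array(vals)   # scalars -> 1-D array
        elif isinstance(vals[0], np.ndarray):
            out[k] = np.array(vals)   # ndarrays -> stacked array
        else:
            out[k] = vals             # anything else stays a list
    return out

samples = [
    {"jpg": np.zeros((2, 2)), "score": 1.0, "txt": "a cat", "extra": 1},
    {"jpg": np.ones((2, 2)),  "score": 2.0, "txt": "a dog"},
]
batch = collate_dicts(samples)

assert "extra" not in batch                   # not shared by all samples
assert batch["jpg"].shape == (2, 2, 2)        # stacked along a new batch axis
assert batch["score"].tolist() == [1.0, 2.0]
assert batch["txt"] == ["a cat", "a dog"]
```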
class WebDataModuleFromConfig(pl.LightningDataModule):
def __init__(self, tar_base, batch_size, train=None, validation=None,
test=None, num_workers=4, multinode=True, min_size=None,
max_pwatermark=1.0,
**kwargs):
        super().__init__()
print(f'Setting tar base to {tar_base}')
self.tar_base = tar_base
self.batch_size = batch_size
self.num_workers = num_workers
self.train = train
self.validation = validation
self.test = test
self.multinode = multinode
self.min_size = min_size # filter out very small images
self.max_pwatermark = max_pwatermark # filter out watermarked images
def make_loader(self, dataset_config, train=True):
if 'image_transforms' in dataset_config:
image_transforms = [instantiate_from_config(tt) for tt in dataset_config.image_transforms]
else:
image_transforms = []
image_transforms.extend([torchvision.transforms.ToTensor(),
torchvision.transforms.Lambda(lambda x: rearrange(x * 2. - 1., 'c h w -> h w c'))])
image_transforms = torchvision.transforms.Compose(image_transforms)
if 'transforms' in dataset_config:
transforms_config = OmegaConf.to_container(dataset_config.transforms)
else:
transforms_config = dict()
transform_dict = {dkey: load_partial_from_config(transforms_config[dkey])
if transforms_config[dkey] != 'identity' else identity
for dkey in transforms_config}
img_key = dataset_config.get('image_key', 'jpeg')
transform_dict.update({img_key: image_transforms})
if 'postprocess' in dataset_config:
postprocess = instantiate_from_config(dataset_config['postprocess'])
else:
postprocess = None
shuffle = dataset_config.get('shuffle', 0)
shardshuffle = shuffle > 0
nodesplitter = wds.shardlists.split_by_node if self.multinode else wds.shardlists.single_node_only
if self.tar_base == "__improvedaesthetic__":
print("## Warning, loading the same improved aesthetic dataset "
"for all splits and ignoring shards parameter.")
tars = "pipe:aws s3 cp s3://s-laion/improved-aesthetics-laion-2B-en-subsets/aesthetics_tars/{000000..060207}.tar -"
else:
tars = os.path.join(self.tar_base, dataset_config.shards)
dset = wds.WebDataset(
tars,
nodesplitter=nodesplitter,
shardshuffle=shardshuffle,
handler=wds.warn_and_continue).repeat().shuffle(shuffle)
print(f'Loading webdataset with {len(dset.pipeline[0].urls)} shards.')
dset = (dset
.select(self.filter_keys)
.decode('pil', handler=wds.warn_and_continue)
.select(self.filter_size)
.map_dict(**transform_dict, handler=wds.warn_and_continue)
)
if postprocess is not None:
dset = dset.map(postprocess)
dset = (dset
.batched(self.batch_size, partial=False,
collation_fn=dict_collation_fn)
)
loader = wds.WebLoader(dset, batch_size=None, shuffle=False,
num_workers=self.num_workers)
return loader
def filter_size(self, x):
try:
valid = True
if self.min_size is not None and self.min_size > 1:
try:
valid = valid and x['json']['original_width'] >= self.min_size and x['json']['original_height'] >= self.min_size
except Exception:
valid = False
if self.max_pwatermark is not None and self.max_pwatermark < 1.0:
try:
valid = valid and x['json']['pwatermark'] <= self.max_pwatermark
except Exception:
valid = False
return valid
except Exception:
return False
def filter_keys(self, x):
try:
return ("jpg" in x) and ("txt" in x)
except Exception:
return False
def train_dataloader(self):
return self.make_loader(self.train)
def val_dataloader(self):
return self.make_loader(self.validation, train=False)
def test_dataloader(self):
return self.make_loader(self.test, train=False)
from ldm.modules.image_degradation import degradation_fn_bsr_light
import cv2
class AddLR(object):
def __init__(self, factor, output_size, initial_size=None, image_key="jpg"):
self.factor = factor
self.output_size = output_size
self.image_key = image_key
self.initial_size = initial_size
def pt2np(self, x):
x = ((x+1.0)*127.5).clamp(0, 255).to(dtype=torch.uint8).detach().cpu().numpy()
return x
def np2pt(self, x):
x = torch.from_numpy(x)/127.5-1.0
return x
def __call__(self, sample):
# sample['jpg'] is tensor hwc in [-1, 1] at this point
x = self.pt2np(sample[self.image_key])
if self.initial_size is not None:
x = cv2.resize(x, (self.initial_size, self.initial_size), interpolation=2)
x = degradation_fn_bsr_light(x, sf=self.factor)['image']
x = cv2.resize(x, (self.output_size, self.output_size), interpolation=2)
x = self.np2pt(x)
sample['lr'] = x
return sample
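`AddLR`'s `pt2np`/`np2pt` helpers move between the `[-1, 1]` float range used by the model and `uint8` pixels; a numpy-only sketch of the same mapping (the real helpers operate on torch tensors):

```python
import numpy as np

def pt2np(x):
    # [-1, 1] float -> [0, 255] uint8, mirroring AddLR.pt2np.
    return np.clip((x + 1.0) * 127.5, 0, 255).astype(np.uint8)

def np2pt(x):
    # [0, 255] uint8 -> [-1, 1] float, mirroring AddLR.np2pt.
    return x.astype(np.float32) / 127.5 - 1.0

x = np.array([-1.0, 0.0, 1.0])
roundtrip = np2pt(pt2np(x))

# Quantization to uint8 costs at most one grey level (~1/127.5).
assert np.all(np.abs(roundtrip - x) <= 1.0 / 127.5)
```

The round trip is lossy only by the `uint8` quantization step, which is why the degradation is applied in pixel space and the result converted back.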
class AddBW(object):
def __init__(self, image_key="jpg"):
self.image_key = image_key
def pt2np(self, x):
x = ((x+1.0)*127.5).clamp(0, 255).to(dtype=torch.uint8).detach().cpu().numpy()
return x
def np2pt(self, x):
x = torch.from_numpy(x)/127.5-1.0
return x
def __call__(self, sample):
# sample['jpg'] is tensor hwc in [-1, 1] at this point
x = sample[self.image_key]
w = torch.rand(3, device=x.device)
w /= w.sum()
out = torch.einsum('hwc,c->hw', x, w)
# Keep as 3ch so we can pass to encoder, also we might want to add hints
sample['lr'] = out.unsqueeze(-1).tile(1,1,3)
return sample
class AddMask(PRNGMixin):
def __init__(self, mode="512train", p_drop=0.):
super().__init__()
assert mode in list(MASK_MODES.keys()), f'unknown mask generation mode "{mode}"'
self.make_mask = MASK_MODES[mode]
self.p_drop = p_drop
def __call__(self, sample):
# sample['jpg'] is tensor hwc in [-1, 1] at this point
x = sample['jpg']
mask = self.make_mask(self.prng, x.shape[0], x.shape[1])
if self.prng.choice(2, p=[1 - self.p_drop, self.p_drop]):
mask = np.ones_like(mask)
mask[mask < 0.5] = 0
mask[mask > 0.5] = 1
mask = torch.from_numpy(mask[..., None])
sample['mask'] = mask
sample['masked_image'] = x * (mask < 0.5)
return sample
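`AddMask` keeps pixels where the mask is below 0.5 and zeroes the masked region; a tiny numpy illustration of `x * (mask < 0.5)` with made-up values (the real code uses torch tensors):

```python
import numpy as np

# Hypothetical single-channel "image" in [-1, 1] and a binary mask.
x = np.array([[0.5, -0.5],
              [1.0, -1.0]])
mask = np.array([[0.0, 1.0],
                 [1.0, 0.0]])

# mask == 1 marks the region to inpaint; those pixels are zeroed out.
masked_image = x * (mask < 0.5)

assert masked_image.tolist() == [[0.5, 0.0], [0.0, -1.0]]
```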
class AddEdge(PRNGMixin):
def __init__(self, mode="512train", mask_edges=True):
super().__init__()
assert mode in list(MASK_MODES.keys()), f'unknown mask generation mode "{mode}"'
self.make_mask = MASK_MODES[mode]
self.n_down_choices = [0]
self.sigma_choices = [1, 2]
self.mask_edges = mask_edges
@torch.no_grad()
def __call__(self, sample):
# sample['jpg'] is tensor hwc in [-1, 1] at this point
x = sample['jpg']
mask = self.make_mask(self.prng, x.shape[0], x.shape[1])
mask[mask < 0.5] = 0
mask[mask > 0.5] = 1
mask = torch.from_numpy(mask[..., None])
sample['mask'] = mask
n_down_idx = self.prng.choice(len(self.n_down_choices))
sigma_idx = self.prng.choice(len(self.sigma_choices))
n_choices = len(self.n_down_choices)*len(self.sigma_choices)
raveled_idx = np.ravel_multi_index((n_down_idx, sigma_idx),
(len(self.n_down_choices), len(self.sigma_choices)))
normalized_idx = raveled_idx/max(1, n_choices-1)
n_down = self.n_down_choices[n_down_idx]
sigma = self.sigma_choices[sigma_idx]
kernel_size = 4*sigma+1
kernel_size = (kernel_size, kernel_size)
sigma = (sigma, sigma)
canny = kornia.filters.Canny(
low_threshold=0.1,
high_threshold=0.2,
kernel_size=kernel_size,
sigma=sigma,
hysteresis=True,
)
y = (x+1.0)/2.0 # in 01
y = y.unsqueeze(0).permute(0, 3, 1, 2).contiguous()
# down
for i_down in range(n_down):
size = min(y.shape[-2], y.shape[-1])//2
y = kornia.geometry.transform.resize(y, size, antialias=True)
# edge
_, y = canny(y)
if n_down > 0:
size = x.shape[0], x.shape[1]
y = kornia.geometry.transform.resize(y, size, interpolation="nearest")
y = y.permute(0, 2, 3, 1)[0].expand(-1, -1, 3).contiguous()
y = y*2.0-1.0
if self.mask_edges:
sample['masked_image'] = y * (mask < 0.5)
else:
sample['masked_image'] = y
sample['mask'] = torch.zeros_like(sample['mask'])
# concat normalized idx
sample['smoothing_strength'] = torch.ones_like(sample['mask'])*normalized_idx
return sample
def example00():
url = "pipe:aws s3 cp s3://s-datasets/laion5b/laion2B-data/000000.tar -"
dataset = wds.WebDataset(url)
example = next(iter(dataset))
for k in example:
print(k, type(example[k]))
print(example["__key__"])
for k in ["json", "txt"]:
print(example[k].decode())
image = Image.open(io.BytesIO(example["jpg"]))
outdir = "tmp"
os.makedirs(outdir, exist_ok=True)
image.save(os.path.join(outdir, example["__key__"] + ".png"))
def load_example(example):
return {
"key": example["__key__"],
"image": Image.open(io.BytesIO(example["jpg"])),
"text": example["txt"].decode(),
}
for i, example in tqdm(enumerate(dataset)):
ex = load_example(example)
print(ex["image"].size, ex["text"])
if i >= 100:
break
def example01():
# the first laion shards contain ~10k examples each
url = "pipe:aws s3 cp s3://s-datasets/laion5b/laion2B-data/{000000..000002}.tar -"
batch_size = 3
shuffle_buffer = 10000
dset = wds.WebDataset(
url,
nodesplitter=wds.shardlists.split_by_node,
shardshuffle=True,
)
dset = (dset
.shuffle(shuffle_buffer, initial=shuffle_buffer)
.decode('pil', handler=wds.warn_and_continue)
.batched(batch_size, partial=False,
collation_fn=dict_collation_fn)
)
num_workers = 2
loader = wds.WebLoader(dset, batch_size=None, shuffle=False, num_workers=num_workers)
batch_sizes = list()
keys_per_epoch = list()
for epoch in range(5):
keys = list()
for batch in tqdm(loader):
batch_sizes.append(len(batch["__key__"]))
keys.append(batch["__key__"])
for bs in batch_sizes:
assert bs==batch_size
print(f"{len(batch_sizes)} batches of size {batch_size}.")
batch_sizes = list()
keys_per_epoch.append(keys)
for i_batch in [0, 1, -1]:
print(f"Batch {i_batch} of epoch {epoch}:")
print(keys[i_batch])
print("next epoch.")
def example02():
from omegaconf import OmegaConf
from torch.utils.data.distributed import DistributedSampler
from torch.utils.data import IterableDataset
from torch.utils.data import DataLoader, RandomSampler, Sampler, SequentialSampler
from pytorch_lightning.trainer.supporters import CombinedLoader, CycleIterator
#config = OmegaConf.load("configs/stable-diffusion/txt2img-1p4B-multinode-clip-encoder-high-res-512.yaml")
#config = OmegaConf.load("configs/stable-diffusion/txt2img-upscale-clip-encoder-f16-1024.yaml")
config = OmegaConf.load("configs/stable-diffusion/txt2img-v2-clip-encoder-improved_aesthetics-256.yaml")
datamod = WebDataModuleFromConfig(**config["data"]["params"])
dataloader = datamod.train_dataloader()
for batch in dataloader:
print(batch.keys())
print(batch["jpg"].shape)
break
def example03():
# improved aesthetics
tars = "pipe:aws s3 cp s3://s-laion/improved-aesthetics-laion-2B-en-subsets/aesthetics_tars/{000000..060207}.tar -"
dataset = wds.WebDataset(tars)
def filter_keys(x):
try:
return ("jpg" in x) and ("txt" in x)
except Exception:
return False
def filter_size(x):
try:
return x['json']['original_width'] >= 512 and x['json']['original_height'] >= 512
except Exception:
return False
def filter_watermark(x):
try:
return x['json']['pwatermark'] < 0.5
except Exception:
return False
dataset = (dataset
.select(filter_keys)
.decode('pil', handler=wds.warn_and_continue))
n_save = 20
n_total = 0
n_large = 0
n_large_nowm = 0
for i, example in enumerate(dataset):
n_total += 1
if filter_size(example):
n_large += 1
if filter_watermark(example):
n_large_nowm += 1
if n_large_nowm < n_save+1:
image = example["jpg"]
image.save(os.path.join("tmp", f"{n_large_nowm-1:06}.png"))
if i%500 == 0:
print(i)
print(f"Large: {n_large}/{n_total} | {n_large/n_total*100:.2f}%")
if n_large > 0:
print(f"No Watermark: {n_large_nowm}/{n_large} | {n_large_nowm/n_large*100:.2f}%")
def example04():
# improved aesthetics
for i_shard in reversed(range(60208)):
print(i_shard)
tars = "pipe:aws s3 cp s3://s-laion/improved-aesthetics-laion-2B-en-subsets/aesthetics_tars/{:06}.tar -".format(i_shard)
dataset = wds.WebDataset(tars)
def filter_keys(x):
try:
return ("jpg" in x) and ("txt" in x)
except Exception:
return False
def filter_size(x):
try:
return x['json']['original_width'] >= 512 and x['json']['original_height'] >= 512
except Exception:
return False
dataset = (dataset
.select(filter_keys)
.decode('pil', handler=wds.warn_and_continue))
try:
example = next(iter(dataset))
except Exception:
print(f"Error @ {i_shard}")
if __name__ == "__main__":
#example01()
#example02()
example03()
#example04()
================================================
FILE: ldm/data/lsun.py
================================================
import os
import numpy as np
import PIL
from PIL import Image
from torch.utils.data import Dataset
from torchvision import transforms
class LSUNBase(Dataset):
def __init__(self,
txt_file,
data_root,
size=None,
interpolation="bicubic",
flip_p=0.5
):
self.data_paths = txt_file
self.data_root = data_root
with open(self.data_paths, "r") as f:
self.image_paths = f.read().splitlines()
self._length = len(self.image_paths)
self.labels = {
"relative_file_path_": [l for l in self.image_paths],
"file_path_": [os.path.join(self.data_root, l)
for l in self.image_paths],
}
self.size = size
self.interpolation = {"linear": PIL.Image.LINEAR,
"bilinear": PIL.Image.BILINEAR,
"bicubic": PIL.Image.BICUBIC,
"lanczos": PIL.Image.LANCZOS,
}[interpolation]
self.flip = transforms.RandomHorizontalFlip(p=flip_p)
def __len__(self):
return self._length
def __getitem__(self, i):
example = dict((k, self.labels[k][i]) for k in self.labels)
image = Image.open(example["file_path_"])
if not image.mode == "RGB":
image = image.convert("RGB")
# default to score-sde preprocessing
img = np.array(image).astype(np.uint8)
crop = min(img.shape[0], img.shape[1])
h, w = img.shape[0], img.shape[1]
img = img[(h - crop) // 2:(h + crop) // 2,
(w - crop) // 2:(w + crop) // 2]
image = Image.fromarray(img)
if self.size is not None:
image = image.resize((self.size, self.size), resample=self.interpolation)
image = self.flip(image)
image = np.array(image).astype(np.uint8)
example["image"] = (image / 127.5 - 1.0).astype(np.float32)
return example
class LSUNChurchesTrain(LSUNBase):
def __init__(self, **kwargs):
super().__init__(txt_file="data/lsun/church_outdoor_train.txt", data_root="data/lsun/churches", **kwargs)
class LSUNChurchesValidation(LSUNBase):
def __init__(self, flip_p=0., **kwargs):
super().__init__(txt_file="data/lsun/church_outdoor_val.txt", data_root="data/lsun/churches",
flip_p=flip_p, **kwargs)
class LSUNBedroomsTrain(LSUNBase):
def __init__(self, **kwargs):
super().__init__(txt_file="data/lsun/bedrooms_train.txt", data_root="data/lsun/bedrooms", **kwargs)
class LSUNBedroomsValidation(LSUNBase):
def __init__(self, flip_p=0.0, **kwargs):
super().__init__(txt_file="data/lsun/bedrooms_val.txt", data_root="data/lsun/bedrooms",
flip_p=flip_p, **kwargs)
class LSUNCatsTrain(LSUNBase):
def __init__(self, **kwargs):
super().__init__(txt_file="data/lsun/cat_train.txt", data_root="data/lsun/cats", **kwargs)
class LSUNCatsValidation(LSUNBase):
def __init__(self, flip_p=0., **kwargs):
super().__init__(txt_file="data/lsun/cat_val.txt", data_root="data/lsun/cats",
flip_p=flip_p, **kwargs)
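`LSUNBase.__getitem__` follows the score-sde preprocessing: center-crop to the shorter side, then resize. The crop arithmetic in isolation, on a toy array:

```python
import numpy as np

# toy "image": h=10, w=6, each cell holds its flat index
img = np.arange(10 * 6).reshape(10, 6)
crop = min(img.shape[0], img.shape[1])  # shorter side: 6
h, w = img.shape[0], img.shape[1]
# take a centered crop of size (crop, crop), same slicing as LSUNBase
cropped = img[(h - crop) // 2:(h + crop) // 2,
              (w - crop) // 2:(w + crop) // 2]
print(cropped.shape)  # (6, 6): rows 2..7, all 6 columns
```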
================================================
FILE: ldm/data/nerf_like.py
================================================
from torch.utils.data import Dataset
import os
import json
import numpy as np
import torch
import imageio
import math
import cv2
from torchvision import transforms
def cartesian_to_spherical(xyz):
xy = xyz[:,0]**2 + xyz[:,1]**2
z = np.sqrt(xy + xyz[:,2]**2)
theta = np.arctan2(np.sqrt(xy), xyz[:,2]) # elevation angle measured down from the Z-axis
azimuth = np.arctan2(xyz[:,1], xyz[:,0])
return np.array([theta, azimuth, z])
def get_T(T_target, T_cond):
theta_cond, azimuth_cond, z_cond = cartesian_to_spherical(T_cond[None, :])
theta_target, azimuth_target, z_target = cartesian_to_spherical(T_target[None, :])
d_theta = theta_target - theta_cond
d_azimuth = (azimuth_target - azimuth_cond) % (2 * math.pi)
d_z = z_target - z_cond
d_T = torch.tensor([d_theta.item(), math.sin(d_azimuth.item()), math.cos(d_azimuth.item()), d_z.item()])
return d_T
def get_spherical(T_target, T_cond):
theta_cond, azimuth_cond, z_cond = cartesian_to_spherical(T_cond[None, :])
theta_target, azimuth_target, z_target = cartesian_to_spherical(T_target[None, :])
d_theta = theta_target - theta_cond
d_azimuth = (azimuth_target - azimuth_cond) % (2 * math.pi)
d_z = z_target - z_cond
d_T = torch.tensor([math.degrees(d_theta.item()), math.degrees(d_azimuth.item()), d_z.item()])
return d_T
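A quick self-check of the spherical convention used by `get_T` and `get_spherical`, restated here so it runs standalone: `theta` is the elevation angle measured down from the Z-axis, `azimuth` lies in the XY-plane, and the last component is the radius.

```python
import math
import numpy as np

def cartesian_to_spherical(xyz):
    # same convention as above: theta down from +Z, azimuth in XY, radius last
    xy = xyz[:, 0]**2 + xyz[:, 1]**2
    z = np.sqrt(xy + xyz[:, 2]**2)
    theta = np.arctan2(np.sqrt(xy), xyz[:, 2])
    azimuth = np.arctan2(xyz[:, 1], xyz[:, 0])
    return np.array([theta, azimuth, z])

# a point on the +Z axis has zero elevation angle and radius 2
theta, azimuth, r = cartesian_to_spherical(np.array([[0., 0., 2.]]))
print(theta[0], r[0])  # 0.0 2.0

# a point on the +X axis sits 90 degrees down from +Z
theta, azimuth, r = cartesian_to_spherical(np.array([[1., 0., 0.]]))
print(math.degrees(theta[0]))  # 90.0
```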
class RTMV(Dataset):
def __init__(self, root_dir='datasets/RTMV/google_scanned',\
first_K=64, resolution=256, load_target=False):
self.root_dir = root_dir
self.scene_list = sorted(next(os.walk(root_dir))[1])
self.resolution = resolution
self.first_K = first_K
self.load_target = load_target
def __len__(self):
return len(self.scene_list)
def __getitem__(self, idx):
scene_dir = os.path.join(self.root_dir, self.scene_list[idx])
with open(os.path.join(scene_dir, 'transforms.json'), "r") as f:
meta = json.load(f)
imgs = []
poses = []
for i_img in range(self.first_K):
meta_img = meta['frames'][i_img]
if i_img == 0 or self.load_target:
img_path = os.path.join(scene_dir, meta_img['file_path'])
img = imageio.imread(img_path)
img = cv2.resize(img, (self.resolution, self.resolution), interpolation = cv2.INTER_LINEAR)
imgs.append(img)
c2w = meta_img['transform_matrix']
poses.append(c2w)
imgs = (np.array(imgs) / 255.).astype(np.float32) # (RGBA) imgs
imgs = torch.tensor(self.blend_rgba(imgs)).permute(0, 3, 1, 2)
imgs = imgs * 2 - 1. # convert to stable diffusion range
poses = torch.tensor(np.array(poses).astype(np.float32))
return imgs, poses
def blend_rgba(self, img):
img = img[..., :3] * img[..., -1:] + (1. - img[..., -1:]) # blend A to RGB
return img
class GSO(Dataset):
def __init__(self, root_dir='datasets/GoogleScannedObjects',\
split='val', first_K=5, resolution=256, load_target=False, name='render_mvs'):
self.root_dir = root_dir
with open(os.path.join(root_dir, '%s.json' % split), "r") as f:
self.scene_list = json.load(f)
self.resolution = resolution
self.first_K = first_K
self.load_target = load_target
self.name = name
def __len__(self):
return len(self.scene_list)
def __getitem__(self, idx):
scene_dir = os.path.join(self.root_dir, self.scene_list[idx])
with open(os.path.join(scene_dir, 'transforms_%s.json' % self.name), "r") as f:
meta = json.load(f)
imgs = []
poses = []
for i_img in range(self.first_K):
meta_img = meta['frames'][i_img]
if i_img == 0 or self.load_target:
img_path = os.path.join(scene_dir, meta_img['file_path'])
img = imageio.imread(img_path)
img = cv2.resize(img, (self.resolution, self.resolution), interpolation = cv2.INTER_LINEAR)
imgs.append(img)
c2w = meta_img['transform_matrix']
poses.append(c2w)
imgs = (np.array(imgs) / 255.).astype(np.float32) # (RGBA) imgs
mask = imgs[:, :, :, -1]
imgs = torch.tensor(self.blend_rgba(imgs)).permute(0, 3, 1, 2)
imgs = imgs * 2 - 1. # convert to stable diffusion range
poses = torch.tensor(np.array(poses).astype(np.float32))
return imgs, poses
def blend_rgba(self, img):
img = img[..., :3] * img[..., -1:] + (1. - img[..., -1:]) # blend A to RGB
return img
class WILD(Dataset):
def __init__(self, root_dir='data/nerf_wild',\
first_K=33, resolution=256, load_target=False):
self.root_dir = root_dir
self.scene_list = sorted(next(os.walk(root_dir))[1])
self.resolution = resolution
self.first_K = first_K
self.load_target = load_target
def __len__(self):
return len(self.scene_list)
def __getitem__(self, idx):
scene_dir = os.path.join(self.root_dir, self.scene_list[idx])
with open(os.path.join(scene_dir, 'transforms_train.json'), "r") as f:
meta = json.load(f)
imgs = []
poses = []
for i_img in range(self.first_K):
meta_img = meta['frames'][i_img]
if i_img == 0 or self.load_target:
img_path = os.path.join(scene_dir, meta_img['file_path'])
img = imageio.imread(img_path + '.png')
img = cv2.resize(img, (self.resolution, self.resolution), interpolation = cv2.INTER_LINEAR)
imgs.append(img)
c2w = meta_img['transform_matrix']
poses.append(c2w)
imgs = (np.array(imgs) / 255.).astype(np.float32) # (RGBA) imgs
imgs = torch.tensor(self.blend_rgba(imgs)).permute(0, 3, 1, 2)
imgs = imgs * 2 - 1. # convert to stable diffusion range
poses = torch.tensor(np.array(poses).astype(np.float32))
return imgs, poses
def blend_rgba(self, img):
img = img[..., :3] * img[..., -1:] + (1. - img[..., -1:]) # blend A to RGB
return img
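`blend_rgba`, shared by all three datasets above, is plain alpha compositing onto a white background; a standalone check of the two extreme cases:

```python
import numpy as np

def blend_rgba(img):
    # composite RGBA onto white: rgb * alpha + (1 - alpha) * 1.0
    return img[..., :3] * img[..., -1:] + (1. - img[..., -1:])

rgba = np.array([
    [[0.2, 0.4, 0.6, 1.0],   # fully opaque pixel: RGB kept as-is
     [0.2, 0.4, 0.6, 0.0]],  # fully transparent pixel: becomes white
], dtype=np.float32)
rgb = blend_rgba(rgba)
print(rgb[0, 0])  # [0.2 0.4 0.6]
print(rgb[0, 1])  # [1. 1. 1.]
```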
================================================
FILE: ldm/data/simple.py
================================================
from typing import Dict
import copy
import csv
import json
import math
import os
import random
import cv2
import matplotlib.pyplot as plt
import numpy as np
import pytorch_lightning as pl
import torch
import torchvision
import webdataset as wds
from datasets import load_dataset
from einops import rearrange
from omegaconf import DictConfig, ListConfig
from pathlib import Path
from PIL import Image
from torch.utils.data import DataLoader, Dataset
from torch.utils.data.distributed import DistributedSampler
from torchvision import transforms
from ldm.util import instantiate_from_config
# Some hacky things to make experimentation easier
def make_transform_multi_folder_data(paths, caption_files=None, **kwargs):
ds = make_multi_folder_data(paths, caption_files, **kwargs)
return TransformDataset(ds)
def make_nfp_data(base_path):
dirs = list(Path(base_path).glob("*/"))
print(f"Found {len(dirs)} folders")
print(dirs)
tforms = [transforms.Resize(512), transforms.CenterCrop(512)]
datasets = [NfpDataset(x, image_transforms=copy.copy(tforms), default_caption="A view from a train window") for x in dirs]
return torch.utils.data.ConcatDataset(datasets)
class VideoDataset(Dataset):
def __init__(self, root_dir, image_transforms, caption_file, offset=8, n=2):
self.root_dir = Path(root_dir)
self.caption_file = caption_file
self.n = n
ext = "mp4"
self.paths = sorted(list(self.root_dir.rglob(f"*.{ext}")))
self.offset = offset
if isinstance(image_transforms, ListConfig):
image_transforms = [instantiate_from_config(tt) for tt in image_transforms]
image_transforms.extend([transforms.ToTensor(),
transforms.Lambda(lambda x: rearrange(x * 2. - 1., 'c h w -> h w c'))])
image_transforms = transforms.Compose(image_transforms)
self.tform = image_transforms
with open(self.caption_file) as f:
reader = csv.reader(f)
rows = [row for row in reader]
self.captions = dict(rows)
def __len__(self):
return len(self.paths)
def __getitem__(self, index):
for i in range(10):
try:
return self._load_sample(index)
except Exception:
# Not really good enough but...
print("uh oh")
def _load_sample(self, index):
n = self.n
filename = self.paths[index]
min_frame = 2*self.offset + 2
vid = cv2.VideoCapture(str(filename))
max_frames = int(vid.get(cv2.CAP_PROP_FRAME_COUNT))
curr_frame_n = random.randint(min_frame, max_frames - 1) # frames are indexed 0..max_frames-1
vid.set(cv2.CAP_PROP_POS_FRAMES,curr_frame_n)
_, curr_frame = vid.read()
prev_frames = []
for i in range(n):
prev_frame_n = curr_frame_n - (i+1)*self.offset
vid.set(cv2.CAP_PROP_POS_FRAMES,prev_frame_n)
_, prev_frame = vid.read()
prev_frame = self.tform(Image.fromarray(prev_frame[...,::-1]))
prev_frames.append(prev_frame)
vid.release()
caption = self.captions[filename.name]
data = {
"image": self.tform(Image.fromarray(curr_frame[...,::-1])),
"prev": torch.cat(prev_frames, dim=-1),
"txt": caption
}
return data
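The frame sampling in `_load_sample` above reduces to simple index arithmetic: walk backwards from the current frame in fixed strides of `offset` to collect the `n` conditioning frames. A sketch (`frame_indices` is a hypothetical helper, not part of the class):

```python
# collect the n previous frame indices, each `offset` frames apart
def frame_indices(curr_frame_n, n, offset):
    return [curr_frame_n - (i + 1) * offset for i in range(n)]

# with n=2 previous frames and offset=8, min_frame = 2*offset + 2 = 18
# guarantees every previous index stays non-negative
print(frame_indices(18, 2, 8))  # [10, 2]
```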
# end hacky things
def make_tranforms(image_transforms):
# NOTE: any transforms passed in are currently ignored; only ToTensor
# plus the rescale to [-1, 1] (laid out as h w c) is applied
image_transforms = []
image_transforms.extend([transforms.ToTensor(),
transforms.Lambda(lambda x: rearrange(x * 2. - 1., 'c h w -> h w c'))])
image_transforms = transforms.Compose(image_transforms)
return image_transforms
def make_multi_folder_data(paths, caption_files=None, **kwargs):
"""Make a concat dataset from multiple folders
Doesn't support captions yet
If paths is a list, that's ok; if it's a Dict, interpret it as:
k=folder, v=number of times to repeat that folder
"""
list_of_paths = []
if isinstance(paths, (Dict, DictConfig)):
assert caption_files is None, \
"Caption files not yet supported for repeats"
for folder_path, repeats in paths.items():
list_of_paths.extend([folder_path]*repeats)
paths = list_of_paths
if caption_files is not None:
datasets = [FolderData(p, caption_file=c, **kwargs) for (p, c) in zip(paths, caption_files)]
else:
datasets = [FolderData(p, **kwargs) for p in paths]
return torch.utils.data.ConcatDataset(datasets)
class NfpDataset(Dataset):
def __init__(self,
root_dir,
image_transforms=[],
ext="jpg",
default_caption="",
) -> None:
"""assume sequential frames and a deterministic transform"""
self.root_dir = Path(root_dir)
self.default_caption = default_caption
self.paths = sorted(list(self.root_dir.rglob(f"*.{ext}")))
self.tform = make_tranforms(image_transforms)
def __len__(self):
return len(self.paths) - 1
def __getitem__(self, index):
prev = self.paths[index]
curr = self.paths[index+1]
data = {}
data["image"] = self._load_im(curr)
data["prev"] = self._load_im(prev)
data["txt"] = self.default_caption
return data
def _load_im(self, filename):
im = Image.open(filename).convert("RGB")
return self.tform(im)
class ObjaverseDataModuleFromConfig(pl.LightningDataModule):
def __init__(self, root_dir, batch_size, total_view, train=None, validation=None,
test=None, num_workers=4, **kwargs):
super().__init__()
self.root_dir = root_dir
self.batch_size = batch_size
self.num_workers = num_workers
self.total_view = total_view
if train is not None:
dataset_config = train
if validation is not None:
dataset_config = validation
if 'image_transforms' in dataset_config:
image_transforms = [torchvision.transforms.Resize(dataset_config.image_transforms.size)]
else:
image_transforms = []
image_transforms.extend([transforms.ToTensor(),
transforms.Lambda(lambda x: rearrange(x * 2. - 1., 'c h w -> h w c'))])
self.image_transforms = torchvision.transforms.Compose(image_transforms)
def train_dataloader(self):
dataset = ObjaverseData(root_dir=self.root_dir, total_view=self.total_view, validation=False, \
image_transforms=self.image_transforms)
sampler = DistributedSampler(dataset)
return wds.WebLoader(dataset, batch_size=self.batch_size, num_workers=self.num_workers, shuffle=False, sampler=sampler)
def val_dataloader(self):
dataset = ObjaverseData(root_dir=self.root_dir, total_view=self.total_view, validation=True, \
image_transforms=self.image_transforms)
return wds.WebLoader(dataset, batch_size=self.batch_size, num_workers=self.num_workers, shuffle=False)
def test_dataloader(self):
return wds.WebLoader(ObjaverseData(root_dir=self.root_dir, total_view=self.total_view, validation=True),\
batch_size=self.batch_size, num_workers=self.num_workers, shuffle=False)
class ObjaverseData(Dataset):
def __init__(self,
root_dir='.objaverse/hf-objaverse-v1/views',
image_transforms=[],
ext="png",
default_trans=torch.zeros(3),
postprocess=None,
return_paths=False,
total_view=4,
validation=False
) -> None:
"""Create a dataset from a folder of images.
If you pass in a root directory it will be searched for images
ending in ext (ext can be a list)
"""
self.root_dir = Path(root_dir)
self.default_trans = default_trans
self.return_paths = return_paths
if isinstance(postprocess, DictConfig):
postprocess = instantiate_from_config(postprocess)
self.postprocess = postprocess
self.total_view = total_view
if not isinstance(ext, (tuple, list, ListConfig)):
ext = [ext]
with open(os.path.join(root_dir, 'valid_paths.json')) as f:
self.paths = json.load(f)
total_objects = len(self.paths)
if validation:
self.paths = self.paths[math.floor(total_objects / 100. * 99.):] # use the last 1% for validation
else:
self.paths = self.paths[:math.floor(total_objects / 100. * 99.)] # use the first 99% for training
print('============= length of dataset %d =============' % len(self.paths))
self.tform = image_transforms
def __len__(self):
return len(self.paths)
def cartesian_to_spherical(self, xyz):
xy = xyz[:,0]**2 + xyz[:,1]**2
z = np.sqrt(xy + xyz[:,2]**2)
theta = np.arctan2(np.sqrt(xy), xyz[:,2]) # elevation angle measured down from the Z-axis
azimuth = np.arctan2(xyz[:,1], xyz[:,0])
return np.array([theta, azimuth, z])
def get_T(self, target_RT, cond_RT):
R, T = target_RT[:3, :3], target_RT[:, -1]
T_target = -R.T @ T
R, T = cond_RT[:3, :3], cond_RT[:, -1]
T_cond = -R.T @ T
theta_cond, azimuth_cond, z_cond = self.cartesian_to_spherical(T_cond[None, :])
theta_target, azimuth_target, z_target = self.cartesian_to_spherical(T_target[None, :])
d_theta = theta_target - theta_cond
d_azimuth = (azimuth_target - azimuth_cond) % (2 * math.pi)
d_z = z_target - z_cond
d_T = torch.tensor([d_theta.item(), math.sin(d_azimuth.item()), math.cos(d_azimuth.item()), d_z.item()])
return d_T
def load_im(self, path, color):
'''
replace background pixels with the given color in the rendering
'''
try:
img = plt.imread(path)
except Exception:
print(f"failed to read image: {path}")
raise
img[img[:, :, -1] == 0.] = color
img = Image.fromarray(np.uint8(img[:, :, :3] * 255.))
return img
def __getitem__(self, index):
data = {}
if self.paths[index][-2:] == '_1': # dirty fix for rendering dataset twice
total_view = 8
else:
total_view = 4
index_target, index_cond = random.sample(range(total_view), 2) # without replacement
filename = os.path.join(self.root_dir, self.paths[index])
# print(self.paths[index])
if self.return_paths:
data["path"] = str(filename)
color = [1., 1., 1., 1.]
try:
target_im = self.process_im(self.load_im(os.path.join(filename, '%03d.png' % index_target), color))
cond_im = self.process_im(self.load_im(os.path.join(filename, '%03d.png' % index_cond), color))
target_RT = np.load(os.path.join(filename, '%03d.npy' % index_target))
cond_RT = np.load(os.path.join(filename, '%03d.npy' % index_cond))
except Exception:
# very hacky solution, sorry about this
filename = os.path.join(self.root_dir, '692db5f2d3a04bb286cb977a7dba903e_1') # this one we know is valid
target_im = self.process_im(self.load_im(os.path.join(filename, '%03d.png' % index_target), color))
cond_im = self.process_im(self.load_im(os.path.join(filename, '%03d.png' % index_cond), color))
target_RT = np.load(os.path.join(filename, '%03d.npy' % index_target))
cond_RT = np.load(os.path.join(filename, '%03d.npy' % index_cond))
target_im = torch.zeros_like(target_im)
cond_im = torch.zeros_like(cond_im)
data["image_target"] = target_im
data["image_cond"] = cond_im
data["T"] = self.get_T(target_RT, cond_RT)
if self.postprocess is not None:
data = self.postprocess(data)
return data
def process_im(self, im):
im = im.convert("RGB")
return self.tform(im)
class FolderData(Dataset):
def __init__(self,
root_dir,
caption_file=None,
image_transforms=[],
ext="jpg",
default_caption="",
postprocess=None,
return_paths=False,
) -> None:
"""Create a dataset from a folder of images.
If you pass in a root directory it will be searched for images
ending in ext (ext can be a list)
"""
self.root_dir = Path(root_dir)
self.default_caption = default_caption
self.return_paths = return_paths
if isinstance(postprocess, DictConfig):
postprocess = instantiate_from_config(postprocess)
self.postprocess = postprocess
if caption_file is not None:
with open(caption_file, "rt") as f:
ext = Path(caption_file).suffix.lower()
if ext == ".json":
captions = json.load(f)
elif ext == ".jsonl":
lines = f.readlines()
lines = [json.loads(x) for x in lines]
captions = {x["file_name"]: x["text"].strip("\n") for x in lines}
else:
raise ValueError(f"Unrecognised format: {ext}")
self.captions = captions
else:
self.captions = None
if not isinstance(ext, (tuple, list, ListConfig)):
ext = [ext]
# Only used if there is no caption file
self.paths = []
for e in ext:
self.paths.extend(sorted(list(self.root_dir.rglob(f"*.{e}"))))
self.tform = make_tranforms(image_transforms)
def __len__(self):
if self.captions is not None:
return len(self.captions.keys())
else:
return len(self.paths)
def __getitem__(self, index):
data = {}
if self.captions is not None:
chosen = list(self.captions.keys())[index]
caption = self.captions.get(chosen, None)
if caption is None:
caption = self.default_caption
filename = self.root_dir/chosen
else:
filename = self.paths[index]
if self.return_paths:
data["path"] = str(filename)
im = Image.open(filename).convert("RGB")
im = self.process_im(im)
data["image"] = im
if self.captions is not None:
data["txt"] = caption
else:
data["txt"] = self.default_caption
if self.postprocess is not None:
data = self.postprocess(data)
return data
def process_im(self, im):
im = im.convert("RGB")
return self.tform(im)
class TransformDataset():
def __init__(self, ds, extra_label="sksbspic"):
self.ds = ds
self.extra_label = extra_label
self.transforms = {
"align": transforms.Resize(768),
"centerzoom": transforms.CenterCrop(768),
"randzoom": transforms.RandomCrop(768),
}
def __getitem__(self, index):
data = self.ds[index]
im = data['image']
im = im.permute(2,0,1)
# In case data is smaller than expected
im = transforms.Resize(1024)(im)
tform_name = random.choice(list(self.transforms.keys()))
im = self.transforms[tform_name](im)
im = im.permute(1,2,0)
data['image'] = im
data['txt'] = data['txt'] + f" {self.extra_label} {tform_name}"
return data
def __len__(self):
return len(self.ds)
def hf_dataset(
name,
image_transforms=[],
image_column="image",
text_column="text",
split='train',
image_key='image',
caption_key='txt',
):
"""Make huggingface dataset with appropriate list of transforms applied
"""
ds = load_dataset(name, split=split)
tform = make_tranforms(image_transforms)
assert image_column in ds.column_names, f"Didn't find column {image_column} in {ds.column_names}"
assert text_column in ds.column_names, f"Didn't find column {text_column} in {ds.column_names}"
def pre_process(examples):
processed = {}
processed[image_key] = [tform(im) for im in examples[image_column]]
processed[caption_key] = examples[text_column]
return processed
ds.set_transform(pre_process)
return ds
class TextOnly(Dataset):
def __init__(self, captions, output_size, image_key="image", caption_key="txt", n_gpus=1):
"""Returns only captions with dummy images"""
self.output_size = output_size
self.image_key = image_key
self.caption_key = caption_key
if isinstance(captions, Path):
self.captions = self._load_caption_file(captions)
else:
self.captions = captions
if n_gpus > 1:
# hack to make sure that all the captions appear on each gpu
repeated = [n_gpus*[x] for x in self.captions]
self.captions = []
[self.captions.extend(x) for x in repeated]
def __len__(self):
return len(self.captions)
def __getitem__(self, index):
dummy_im = torch.zeros(3, self.output_size, self.output_size)
dummy_im = rearrange(dummy_im * 2. - 1., 'c h w -> h w c')
return {self.image_key: dummy_im, self.caption_key: self.captions[index]}
def _load_caption_file(self, filename):
with open(filename, 'rt') as f:
captions = f.readlines()
return [x.strip('\n') for x in captions]
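The `n_gpus > 1` branch in `TextOnly` repeats each caption consecutively so that every GPU's shard still sees all captions. The list logic restated standalone (`repeat_captions` is a hypothetical helper, not part of the class):

```python
# duplicate each caption n_gpus times, keeping duplicates adjacent:
# ["a", "b"] with n_gpus=2 -> ["a", "a", "b", "b"], not ["a", "b", "a", "b"]
def repeat_captions(captions, n_gpus):
    repeated = [n_gpus * [x] for x in captions]
    flat = []
    for group in repeated:
        flat.extend(group)
    return flat

print(repeat_captions(["a cat", "a dog"], 2))
# ['a cat', 'a cat', 'a dog', 'a dog']
```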
class IdRetreivalDataset(FolderData):
def __init__(self, ret_file, *args, **kwargs):
super().__init__(*args, **kwargs)
with open(ret_file, "rt") as f:
self.ret = json.load(f)
def __getitem__(self, index):
data = super().__getitem__(index)
key = self.paths[index].name
matches = self.ret[key]
if len(matches) > 0:
retrieved = random.choice(matches)
else:
retrieved = key
filename = self.root_dir/retrieved
im = Image.open(filename).convert("RGB")
im = self.process_im(im)
# data["match"] = im
data["match"] = torch.cat((data["image"], im), dim=-1)
return data
================================================
FILE: ldm/data/sync_dreamer.py
================================================
import pytorch_lightning as pl
import numpy as np
import torch
import PIL
import os
from skimage.io import imread
import webdataset as wds
import PIL.Image as Image
from torch.utils.data import Dataset
from torch.utils.data.distributed import DistributedSampler
from pathlib import Path
from ldm.base_utils import read_pickle, pose_inverse
import torchvision.transforms as transforms
import torchvision
from einops import rearrange
from ldm.util import prepare_inputs
class SyncDreamerTrainData(Dataset):
def __init__(self, target_dir, input_dir, uid_set_pkl, image_size=256):
self.default_image_size = 256
self.image_size = image_size
self.target_dir = Path(target_dir)
self.input_dir = Path(input_dir)
self.uids = read_pickle(uid_set_pkl)
print('============= length of dataset %d =============' % len(self.uids))
image_transforms = []
image_transforms.extend([transforms.ToTensor(), transforms.Lambda(lambda x: rearrange(x * 2. - 1., 'c h w -> h w c'))])
self.image_transforms = torchvision.transforms.Compose(image_transforms)
self.num_images = 16
def __len__(self):
return len(self.uids)
def load_im(self, path):
img = imread(path)
img = img.astype(np.float32) / 255.0
mask = img[:,:,3:]
img[:,:,:3] = img[:,:,:3] * mask + 1 - mask # white background
img = Image.fromarray(np.uint8(img[:, :, :3] * 255.))
return img, mask
def process_im(self, im):
im = im.convert("RGB")
im = im.resize((self.image_size, self.image_size), resample=PIL.Image.BICUBIC)
return self.image_transforms(im)
def load_index(self, filename, index):
img, _ = self.load_im(os.path.join(filename, '%03d.png' % index))
img = self.process_im(img)
return img
def get_data_for_index(self, index):
target_dir = os.path.join(self.target_dir, self.uids[index])
input_dir = os.path.join(self.input_dir, self.uids[index])
views = np.arange(0, self.num_images)
start_view_index = np.random.randint(0, self.num_images)
views = (views + start_view_index) % self.num_images
target_images = []
for si, target_index in enumerate(views):
img = self.load_index(target_dir, target_index)
target_images.append(img)
target_images = torch.stack(target_images, 0)
input_img = self.load_index(input_dir, start_view_index)
K, azimuths, elevations, distances, cam_poses = read_pickle(os.path.join(input_dir, 'meta.pkl'))
input_elevation = torch.from_numpy(elevations[start_view_index:start_view_index+1].astype(np.float32))
return {"target_image": target_images, "input_image": input_img, "input_elevation": input_elevation}
def __getitem__(self, index):
data = self.get_data_for_index(index)
return data
class SyncDreamerEvalData(Dataset):
def __init__(self, image_dir):
self.image_size = 256
self.image_dir = Path(image_dir)
self.crop_size = 20
self.fns = []
for fn in Path(image_dir).iterdir():
if fn.suffix=='.png':
self.fns.append(fn)
print('============= length of dataset %d =============' % len(self.fns))
def __len__(self):
return len(self.fns)
def get_data_for_index(self, index):
input_img_fn = self.fns[index]
elevation = int(Path(input_img_fn).stem.split('-')[-1])
return prepare_inputs(input_img_fn, elevation, 200)
def __getitem__(self, index):
return self.get_data_for_index(index)
class SyncDreamerDataset(pl.LightningDataModule):
def __init__(self, target_dir, input_dir, validation_dir, batch_size, uid_set_pkl, image_size=256, num_workers=4, seed=0, **kwargs):
super().__init__()
self.target_dir = target_dir
self.input_dir = input_dir
self.validation_dir = validation_dir
self.batch_size = batch_size
self.num_workers = num_workers
self.uid_set_pkl = uid_set_pkl
self.seed = seed
self.additional_args = kwargs
self.image_size = image_size
def setup(self, stage):
if stage in ['fit']:
self.train_dataset = SyncDreamerTrainData(self.target_dir, self.input_dir, uid_set_pkl=self.uid_set_pkl, image_size=self.image_size)
self.val_dataset = SyncDreamerEvalData(image_dir=self.validation_dir)
else:
raise NotImplementedError
def train_dataloader(self):
sampler = DistributedSampler(self.train_dataset, seed=self.seed)
return wds.WebLoader(self.train_dataset, batch_size=self.batch_size, num_workers=self.num_workers, shuffle=False, sampler=sampler)
def val_dataloader(self):
loader = wds.WebLoader(self.val_dataset, batch_size=self.batch_size, num_workers=self.num_workers, shuffle=False)
return loader
def test_dataloader(self):
return wds.WebLoader(self.val_dataset, batch_size=self.batch_size, num_workers=self.num_workers, shuffle=False)
================================================
FILE: ldm/lr_scheduler.py
================================================
import numpy as np
class LambdaWarmUpCosineScheduler:
"""
note: use with a base_lr of 1.0
"""
def __init__(self, warm_up_steps, lr_min, lr_max, lr_start, max_decay_steps, verbosity_interval=0):
self.lr_warm_up_steps = warm_up_steps
self.lr_start = lr_start
self.lr_min = lr_min
self.lr_max = lr_max
self.lr_max_decay_steps = max_decay_steps
self.last_lr = 0.
self.verbosity_interval = verbosity_interval
def schedule(self, n, **kwargs):
        if self.verbosity_interval > 0 and n % self.verbosity_interval == 0:
            print(f"current step: {n}, recent lr-multiplier: {self.last_lr}")
if n < self.lr_warm_up_steps:
lr = (self.lr_max - self.lr_start) / self.lr_warm_up_steps * n + self.lr_start
self.last_lr = lr
return lr
else:
t = (n - self.lr_warm_up_steps) / (self.lr_max_decay_steps - self.lr_warm_up_steps)
t = min(t, 1.0)
lr = self.lr_min + 0.5 * (self.lr_max - self.lr_min) * (
1 + np.cos(t * np.pi))
self.last_lr = lr
return lr
def __call__(self, n, **kwargs):
return self.schedule(n,**kwargs)
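The two branches of `schedule` above (linear ramp, then cosine decay) can be checked numerically; a standalone re-derivation of the same multiplier math with hypothetical parameter values:

```python
import numpy as np

def warmup_cosine(n, warm_up_steps=100, lr_start=0.0, lr_min=0.01,
                  lr_max=1.0, max_decay_steps=1000):
    # mirrors LambdaWarmUpCosineScheduler.schedule (a multiplier for base_lr=1.0)
    if n < warm_up_steps:
        return (lr_max - lr_start) / warm_up_steps * n + lr_start
    t = min((n - warm_up_steps) / (max_decay_steps - warm_up_steps), 1.0)
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + np.cos(t * np.pi))

assert warmup_cosine(0) == 0.0                    # starts at lr_start
assert warmup_cosine(50) == 0.5                   # halfway up the ramp
assert abs(warmup_cosine(100) - 1.0) < 1e-12      # peak right after warmup
assert abs(warmup_cosine(10_000) - 0.01) < 1e-12  # t clamped to 1.0, held at lr_min
```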
class LambdaWarmUpCosineScheduler2:
"""
supports repeated iterations, configurable via lists
note: use with a base_lr of 1.0.
"""
def __init__(self, warm_up_steps, f_min, f_max, f_start, cycle_lengths, verbosity_interval=0):
assert len(warm_up_steps) == len(f_min) == len(f_max) == len(f_start) == len(cycle_lengths)
self.lr_warm_up_steps = warm_up_steps
self.f_start = f_start
self.f_min = f_min
self.f_max = f_max
self.cycle_lengths = cycle_lengths
self.cum_cycles = np.cumsum([0] + list(self.cycle_lengths))
self.last_f = 0.
self.verbosity_interval = verbosity_interval
def find_in_interval(self, n):
interval = 0
for cl in self.cum_cycles[1:]:
if n <= cl:
return interval
interval += 1
def schedule(self, n, **kwargs):
cycle = self.find_in_interval(n)
n = n - self.cum_cycles[cycle]
        if self.verbosity_interval > 0 and n % self.verbosity_interval == 0:
            print(f"current step: {n}, recent lr-multiplier: {self.last_f}, current cycle {cycle}")
if n < self.lr_warm_up_steps[cycle]:
f = (self.f_max[cycle] - self.f_start[cycle]) / self.lr_warm_up_steps[cycle] * n + self.f_start[cycle]
self.last_f = f
return f
else:
t = (n - self.lr_warm_up_steps[cycle]) / (self.cycle_lengths[cycle] - self.lr_warm_up_steps[cycle])
t = min(t, 1.0)
f = self.f_min[cycle] + 0.5 * (self.f_max[cycle] - self.f_min[cycle]) * (
1 + np.cos(t * np.pi))
self.last_f = f
return f
def __call__(self, n, **kwargs):
return self.schedule(n, **kwargs)
class LambdaLinearScheduler(LambdaWarmUpCosineScheduler2):
def schedule(self, n, **kwargs):
cycle = self.find_in_interval(n)
n = n - self.cum_cycles[cycle]
        if self.verbosity_interval > 0 and n % self.verbosity_interval == 0:
            print(f"current step: {n}, recent lr-multiplier: {self.last_f}, current cycle {cycle}")
if n < self.lr_warm_up_steps[cycle]:
f = (self.f_max[cycle] - self.f_start[cycle]) / self.lr_warm_up_steps[cycle] * n + self.f_start[cycle]
self.last_f = f
return f
else:
f = self.f_min[cycle] + (self.f_max[cycle] - self.f_min[cycle]) * (self.cycle_lengths[cycle] - n) / (self.cycle_lengths[cycle])
self.last_f = f
return f
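`find_in_interval` above walks the cumulative cycle boundaries (`cum_cycles`) to decide which warmup/decay cycle a global step falls in; a small standalone sketch with hypothetical cycle lengths, including the quiet fall-through when a step lies past the last boundary:

```python
import numpy as np

def find_in_interval(cum_cycles, n):
    # mirrors LambdaWarmUpCosineScheduler2.find_in_interval
    interval = 0
    for cl in cum_cycles[1:]:
        if n <= cl:
            return interval
        interval += 1
    # a step past the last boundary falls through and returns None

cum = np.cumsum([0, 10, 20])            # boundaries at 0, 10, 30
assert find_in_interval(cum, 5) == 0
assert find_in_interval(cum, 10) == 0   # a boundary step belongs to the earlier cycle
assert find_in_interval(cum, 11) == 1
assert find_in_interval(cum, 30) == 1
assert find_in_interval(cum, 31) is None
```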
================================================
FILE: ldm/models/autoencoder.py
================================================
import torch
import numpy as np
import pytorch_lightning as pl
import torch.nn.functional as F
from contextlib import contextmanager
from packaging import version
from torch.optim.lr_scheduler import LambdaLR
from taming.modules.vqvae.quantize import VectorQuantizer2 as VectorQuantizer
from ldm.modules.diffusionmodules.model import Encoder, Decoder
from ldm.modules.distributions.distributions import DiagonalGaussianDistribution
from ldm.modules.ema import LitEma
from ldm.util import instantiate_from_config
class VQModel(pl.LightningModule):
def __init__(self,
ddconfig,
lossconfig,
n_embed,
embed_dim,
ckpt_path=None,
ignore_keys=[],
image_key="image",
colorize_nlabels=None,
monitor=None,
batch_resize_range=None,
scheduler_config=None,
lr_g_factor=1.0,
remap=None,
sane_index_shape=False, # tell vector quantizer to return indices as bhw
use_ema=False
):
super().__init__()
self.embed_dim = embed_dim
self.n_embed = n_embed
self.image_key = image_key
self.encoder = Encoder(**ddconfig)
self.decoder = Decoder(**ddconfig)
self.loss = instantiate_from_config(lossconfig)
self.quantize = VectorQuantizer(n_embed, embed_dim, beta=0.25,
remap=remap,
sane_index_shape=sane_index_shape)
self.quant_conv = torch.nn.Conv2d(ddconfig["z_channels"], embed_dim, 1)
self.post_quant_conv = torch.nn.Conv2d(embed_dim, ddconfig["z_channels"], 1)
if colorize_nlabels is not None:
assert type(colorize_nlabels)==int
self.register_buffer("colorize", torch.randn(3, colorize_nlabels, 1, 1))
if monitor is not None:
self.monitor = monitor
self.batch_resize_range = batch_resize_range
if self.batch_resize_range is not None:
print(f"{self.__class__.__name__}: Using per-batch resizing in range {batch_resize_range}.")
self.use_ema = use_ema
if self.use_ema:
self.model_ema = LitEma(self)
print(f"Keeping EMAs of {len(list(self.model_ema.buffers()))}.")
if ckpt_path is not None:
self.init_from_ckpt(ckpt_path, ignore_keys=ignore_keys)
self.scheduler_config = scheduler_config
self.lr_g_factor = lr_g_factor
@contextmanager
def ema_scope(self, context=None):
if self.use_ema:
self.model_ema.store(self.parameters())
self.model_ema.copy_to(self)
if context is not None:
print(f"{context}: Switched to EMA weights")
try:
yield None
finally:
if self.use_ema:
self.model_ema.restore(self.parameters())
if context is not None:
print(f"{context}: Restored training weights")
def init_from_ckpt(self, path, ignore_keys=list()):
sd = torch.load(path, map_location="cpu")["state_dict"]
keys = list(sd.keys())
for k in keys:
for ik in ignore_keys:
if k.startswith(ik):
print("Deleting key {} from state_dict.".format(k))
del sd[k]
missing, unexpected = self.load_state_dict(sd, strict=False)
print(f"Restored from {path} with {len(missing)} missing and {len(unexpected)} unexpected keys")
if len(missing) > 0:
print(f"Missing Keys: {missing}")
print(f"Unexpected Keys: {unexpected}")
def on_train_batch_end(self, *args, **kwargs):
if self.use_ema:
self.model_ema(self)
def encode(self, x):
h = self.encoder(x)
h = self.quant_conv(h)
quant, emb_loss, info = self.quantize(h)
return quant, emb_loss, info
def encode_to_prequant(self, x):
h = self.encoder(x)
h = self.quant_conv(h)
return h
def decode(self, quant):
quant = self.post_quant_conv(quant)
dec = self.decoder(quant)
return dec
def decode_code(self, code_b):
quant_b = self.quantize.embed_code(code_b)
dec = self.decode(quant_b)
return dec
def forward(self, input, return_pred_indices=False):
quant, diff, (_,_,ind) = self.encode(input)
dec = self.decode(quant)
if return_pred_indices:
return dec, diff, ind
return dec, diff
def get_input(self, batch, k):
x = batch[k]
if len(x.shape) == 3:
x = x[..., None]
x = x.permute(0, 3, 1, 2).to(memory_format=torch.contiguous_format).float()
if self.batch_resize_range is not None:
lower_size = self.batch_resize_range[0]
upper_size = self.batch_resize_range[1]
if self.global_step <= 4:
# do the first few batches with max size to avoid later oom
new_resize = upper_size
else:
new_resize = np.random.choice(np.arange(lower_size, upper_size+16, 16))
if new_resize != x.shape[2]:
x = F.interpolate(x, size=new_resize, mode="bicubic")
x = x.detach()
return x
def training_step(self, batch, batch_idx, optimizer_idx):
# https://github.com/pytorch/pytorch/issues/37142
# try not to fool the heuristics
x = self.get_input(batch, self.image_key)
xrec, qloss, ind = self(x, return_pred_indices=True)
if optimizer_idx == 0:
# autoencode
aeloss, log_dict_ae = self.loss(qloss, x, xrec, optimizer_idx, self.global_step,
last_layer=self.get_last_layer(), split="train",
predicted_indices=ind)
self.log_dict(log_dict_ae, prog_bar=False, logger=True, on_step=True, on_epoch=True)
return aeloss
if optimizer_idx == 1:
# discriminator
discloss, log_dict_disc = self.loss(qloss, x, xrec, optimizer_idx, self.global_step,
last_layer=self.get_last_layer(), split="train")
self.log_dict(log_dict_disc, prog_bar=False, logger=True, on_step=True, on_epoch=True)
return discloss
def validation_step(self, batch, batch_idx):
log_dict = self._validation_step(batch, batch_idx)
with self.ema_scope():
log_dict_ema = self._validation_step(batch, batch_idx, suffix="_ema")
return log_dict
def _validation_step(self, batch, batch_idx, suffix=""):
x = self.get_input(batch, self.image_key)
xrec, qloss, ind = self(x, return_pred_indices=True)
aeloss, log_dict_ae = self.loss(qloss, x, xrec, 0,
self.global_step,
last_layer=self.get_last_layer(),
split="val"+suffix,
predicted_indices=ind
)
discloss, log_dict_disc = self.loss(qloss, x, xrec, 1,
self.global_step,
last_layer=self.get_last_layer(),
split="val"+suffix,
predicted_indices=ind
)
rec_loss = log_dict_ae[f"val{suffix}/rec_loss"]
self.log(f"val{suffix}/rec_loss", rec_loss,
prog_bar=True, logger=True, on_step=False, on_epoch=True, sync_dist=True)
self.log(f"val{suffix}/aeloss", aeloss,
prog_bar=True, logger=True, on_step=False, on_epoch=True, sync_dist=True)
if version.parse(pl.__version__) >= version.parse('1.4.0'):
del log_dict_ae[f"val{suffix}/rec_loss"]
self.log_dict(log_dict_ae)
self.log_dict(log_dict_disc)
return self.log_dict
def configure_optimizers(self):
lr_d = self.learning_rate
lr_g = self.lr_g_factor*self.learning_rate
print("lr_d", lr_d)
print("lr_g", lr_g)
opt_ae = torch.optim.Adam(list(self.encoder.parameters())+
list(self.decoder.parameters())+
list(self.quantize.parameters())+
list(self.quant_conv.parameters())+
list(self.post_quant_conv.parameters()),
lr=lr_g, betas=(0.5, 0.9))
opt_disc = torch.optim.Adam(self.loss.discriminator.parameters(),
lr=lr_d, betas=(0.5, 0.9))
if self.scheduler_config is not None:
scheduler = instantiate_from_config(self.scheduler_config)
print("Setting up LambdaLR scheduler...")
scheduler = [
{
'scheduler': LambdaLR(opt_ae, lr_lambda=scheduler.schedule),
'interval': 'step',
'frequency': 1
},
{
'scheduler': LambdaLR(opt_disc, lr_lambda=scheduler.schedule),
'interval': 'step',
'frequency': 1
},
]
return [opt_ae, opt_disc], scheduler
return [opt_ae, opt_disc], []
def get_last_layer(self):
return self.decoder.conv_out.weight
def log_images(self, batch, only_inputs=False, plot_ema=False, **kwargs):
log = dict()
x = self.get_input(batch, self.image_key)
x = x.to(self.device)
if only_inputs:
log["inputs"] = x
return log
xrec, _ = self(x)
if x.shape[1] > 3:
# colorize with random projection
assert xrec.shape[1] > 3
x = self.to_rgb(x)
xrec = self.to_rgb(xrec)
log["inputs"] = x
log["reconstructions"] = xrec
if plot_ema:
with self.ema_scope():
xrec_ema, _ = self(x)
if x.shape[1] > 3: xrec_ema = self.to_rgb(xrec_ema)
log["reconstructions_ema"] = xrec_ema
return log
def to_rgb(self, x):
assert self.image_key == "segmentation"
if not hasattr(self, "colorize"):
self.register_buffer("colorize", torch.randn(3, x.shape[1], 1, 1).to(x))
x = F.conv2d(x, weight=self.colorize)
x = 2.*(x-x.min())/(x.max()-x.min()) - 1.
return x
class VQModelInterface(VQModel):
def __init__(self, embed_dim, *args, **kwargs):
super().__init__(embed_dim=embed_dim, *args, **kwargs)
self.embed_dim = embed_dim
def encode(self, x):
h = self.encoder(x)
h = self.quant_conv(h)
return h
def decode(self, h, force_not_quantize=False):
# also go through quantization layer
if not force_not_quantize:
quant, emb_loss, info = self.quantize(h)
else:
quant = h
quant = self.post_quant_conv(quant)
dec = self.decoder(quant)
return dec
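The `self.quantize(h)` call in `VQModel.encode` snaps each latent vector to its nearest codebook entry. A numpy sketch of that nearest-neighbour lookup, using a toy codebook rather than the actual `VectorQuantizer` weights:

```python
import numpy as np

codebook = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])  # n_embed=3, embed_dim=2
z = np.array([[0.9, 1.1], [1.8, 2.1]])                     # flattened latent vectors

# squared L2 distance from every latent to every code, then argmin over codes
d = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
indices = d.argmin(1)
quant = codebook[indices]                                  # quantized latents

assert indices.tolist() == [1, 2]
assert np.allclose(quant, [[1.0, 1.0], [2.0, 2.0]])
```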
class AutoencoderKL(pl.LightningModule):
def __init__(self,
ddconfig,
lossconfig,
embed_dim,
ckpt_path=None,
ignore_keys=[],
image_key="image",
colorize_nlabels=None,
monitor=None,
):
super().__init__()
self.image_key = image_key
self.encoder = Encoder(**ddconfig)
self.decoder = Decoder(**ddconfig)
self.loss = instantiate_from_config(lossconfig)
assert ddconfig["double_z"]
self.quant_conv = torch.nn.Conv2d(2*ddconfig["z_channels"], 2*embed_dim, 1)
self.post_quant_conv = torch.nn.Conv2d(embed_dim, ddconfig["z_channels"], 1)
self.embed_dim = embed_dim
if colorize_nlabels is not None:
assert type(colorize_nlabels)==int
self.register_buffer("colorize", torch.randn(3, colorize_nlabels, 1, 1))
if monitor is not None:
self.monitor = monitor
if ckpt_path is not None:
self.init_from_ckpt(ckpt_path, ignore_keys=ignore_keys)
def init_from_ckpt(self, path, ignore_keys=list()):
sd = torch.load(path, map_location="cpu")["state_dict"]
keys = list(sd.keys())
for k in keys:
for ik in ignore_keys:
if k.startswith(ik):
print("Deleting key {} from state_dict.".format(k))
del sd[k]
self.load_state_dict(sd, strict=False)
print(f"Restored from {path}")
def encode(self, x):
h = self.encoder(x)
moments = self.quant_conv(h)
posterior = DiagonalGaussianDistribution(moments)
return posterior
def decode(self, z):
z = self.post_quant_conv(z)
dec = self.decoder(z)
return dec
def forward(self, input, sample_posterior=True):
posterior = self.encode(input)
if sample_posterior:
z = posterior.sample()
else:
z = posterior.mode()
dec = self.decode(z)
return dec, posterior
def get_input(self, batch, k):
x = batch[k]
if len(x.shape) == 3:
x = x[..., None]
x = x.permute(0, 3, 1, 2).to(memory_format=torch.contiguous_format).float()
return x
def training_step(self, batch, batch_idx, optimizer_idx):
inputs = self.get_input(batch, self.image_key)
reconstructions, posterior = self(inputs)
if optimizer_idx == 0:
# train encoder+decoder+logvar
aeloss, log_dict_ae = self.loss(inputs, reconstructions, posterior, optimizer_idx, self.global_step,
last_layer=self.get_last_layer(), split="train")
self.log("aeloss", aeloss, prog_bar=True, logger=True, on_step=True, on_epoch=True)
self.log_dict(log_dict_ae, prog_bar=False, logger=True, on_step=True, on_epoch=False)
return aeloss
if optimizer_idx == 1:
# train the discriminator
discloss, log_dict_disc = self.loss(inputs, reconstructions, posterior, optimizer_idx, self.global_step,
last_layer=self.get_last_layer(), split="train")
self.log("discloss", discloss, prog_bar=True, logger=True, on_step=True, on_epoch=True)
self.log_dict(log_dict_disc, prog_bar=False, logger=True, on_step=True, on_epoch=False)
return discloss
def validation_step(self, batch, batch_idx):
inputs = self.get_input(batch, self.image_key)
reconstructions, posterior = self(inputs)
aeloss, log_dict_ae = self.loss(inputs, reconstructions, posterior, 0, self.global_step,
last_layer=self.get_last_layer(), split="val")
discloss, log_dict_disc = self.loss(inputs, reconstructions, posterior, 1, self.global_step,
last_layer=self.get_last_layer(), split="val")
self.log("val/rec_loss", log_dict_ae["val/rec_loss"])
self.log_dict(log_dict_ae)
self.log_dict(log_dict_disc)
return self.log_dict
def configure_optimizers(self):
lr = self.learning_rate
opt_ae = torch.optim.Adam(list(self.encoder.parameters())+
list(self.decoder.parameters())+
list(self.quant_conv.parameters())+
list(self.post_quant_conv.parameters()),
lr=lr, betas=(0.5, 0.9))
opt_disc = torch.optim.Adam(self.loss.discriminator.parameters(),
lr=lr, betas=(0.5, 0.9))
return [opt_ae, opt_disc], []
def get_last_layer(self):
return self.decoder.conv_out.weight
@torch.no_grad()
def log_images(self, batch, only_inputs=False, **kwargs):
log = dict()
x = self.get_input(batch, self.image_key)
x = x.to(self.device)
if not only_inputs:
xrec, posterior = self(x)
if x.shape[1] > 3:
# colorize with random projection
assert xrec.shape[1] > 3
x = self.to_rgb(x)
xrec = self.to_rgb(xrec)
log["samples"] = self.decode(torch.randn_like(posterior.sample()))
log["reconstructions"] = xrec
log["inputs"] = x
return log
def to_rgb(self, x):
assert self.image_key == "segmentation"
if not hasattr(self, "colorize"):
self.register_buffer("colorize", torch.randn(3, x.shape[1], 1, 1).to(x))
x = F.conv2d(x, weight=self.colorize)
x = 2.*(x-x.min())/(x.max()-x.min()) - 1.
return x
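`AutoencoderKL.encode` feeds the `quant_conv` output (2*embed_dim moment channels) into `DiagonalGaussianDistribution`; assuming the standard split-and-reparameterize behaviour of that class, a numpy sketch of sampling and the KL term:

```python
import numpy as np

rng = np.random.default_rng(0)
moments = rng.normal(size=(1, 8))            # 2 * embed_dim moment channels
mean, logvar = np.split(moments, 2, axis=1)  # first half mean, second half log-variance

eps = rng.normal(size=mean.shape)
z = mean + np.exp(0.5 * logvar) * eps        # posterior.sample(): reparameterization trick

# KL(N(mean, var) || N(0, 1)) summed over elements, as posterior.kl() would accumulate
kl = 0.5 * np.sum(mean ** 2 + np.exp(logvar) - 1.0 - logvar)

assert z.shape == mean.shape
assert kl >= 0.0                             # each per-element term is non-negative
```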
class IdentityFirstStage(torch.nn.Module):
    def __init__(self, *args, vq_interface=False, **kwargs):
        super().__init__()
        self.vq_interface = vq_interface  # TODO: Should be true by default but check to not break older stuff
def encode(self, x, *args, **kwargs):
return x
def decode(self, x, *args, **kwargs):
return x
def quantize(self, x, *args, **kwargs):
if self.vq_interface:
return x, None, [None, None, None]
return x
def forward(self, x, *args, **kwargs):
return x
================================================
FILE: ldm/models/diffusion/__init__.py
================================================
================================================
FILE: ldm/models/diffusion/ctrldemo_sync_dreamer.py
================================================
from pathlib import Path
import pytorch_lightning as pl
import torch
import torch.nn as nn
import torch.nn.functional as F
import numpy as np
from skimage.io import imsave
from torch.optim.lr_scheduler import LambdaLR
from tqdm import tqdm
import imageio
from ldm.base_utils import read_pickle, concat_images_list
from ldm.models.diffusion.sync_dreamer_utils import get_warp_coordinates, create_target_volume, get_proxy_warp_coordinates
from ldm.models.diffusion.sync_dreamer_network import NoisyTargetViewEncoder, ControlSpatialTime3DNet, FrustumTV3DNet
from ldm.modules.diffusionmodules.util import make_ddim_timesteps, timestep_embedding
from ldm.modules.encoders.modules import FrozenCLIPImageEmbedder
from ldm.util import instantiate_from_config, get_3x4_RT_matrix_from_az_el, save_pickle, read_pickle
from ldm.models.diffusion.sync_dreamer import SyncMultiviewDiffusion, disable_training_module, disabled_train, repeat_to_batch, UNetWrapper, SyncDDIMSampler, SpatialVolumeNet
from externs.pvcnn.modules import ProxyVoxelConv
from diffusers import DDIMScheduler, DPMSolverMultistepScheduler
from ldm.DPMPPScheduler import DPMPPScheduler
class ControlSpatialVolumeNet(SpatialVolumeNet):
def __init__(self, time_dim, view_dim, view_num,
input_image_size=256, frustum_volume_depth=48,
spatial_volume_size=32, spatial_volume_length=0.5,
frustum_volume_length=0.86603, # sqrt(3)/2
block=(1, 1, 3, 32), feature_scale=1
):
super().__init__(time_dim, view_dim, view_num,
input_image_size, frustum_volume_depth,
spatial_volume_size, spatial_volume_length,
frustum_volume_length)
self.feature_scale = feature_scale
if block is not None:
            in_channels, out_channels, kernel_size, resolution = block
            self.pvcnn = ProxyVoxelConv(in_channels, out_channels, kernel_size, resolution)
self.controlnet = ControlSpatialTime3DNet(input_dim=16 * view_num, time_dim=time_dim, proxy_input_dim=1, dims=(64, 128, 256, 512))
def construct_spatial_volume(self, x, t_embed, v_embed, target_poses, target_Ks, proxy=None):
"""
@param x: B,N,4,H,W
@param t_embed: B,t_dim
@param v_embed: B,N,v_dim
@param target_poses: N,3,4
@param target_Ks: N,3,3
@return:
"""
B, N, _, H, W = x.shape
V = self.spatial_volume_size
device = x.device
spatial_volume_verts = torch.linspace(-self.spatial_volume_length, self.spatial_volume_length, V, dtype=torch.float32, device=device)
spatial_volume_verts = torch.stack(torch.meshgrid(spatial_volume_verts, spatial_volume_verts, spatial_volume_verts), -1)
spatial_volume_verts = spatial_volume_verts.reshape(1, V ** 3, 3)[:, :, (2, 1, 0)]
spatial_volume_verts = spatial_volume_verts.view(1, V, V, V, 3).permute(0, 4, 1, 2, 3).repeat(B, 1, 1, 1, 1)
# encode source features
t_embed_ = t_embed.view(B, 1, self.time_dim).repeat(1, N, 1).view(B, N, self.time_dim)
# v_embed_ = v_embed.view(1, N, self.view_dim).repeat(B, 1, 1).view(B, N, self.view_dim)
v_embed_ = v_embed
target_Ks = target_Ks.unsqueeze(0).repeat(B, 1, 1, 1)
target_poses = target_poses.unsqueeze(0).repeat(B, 1, 1, 1)
proxy_video = []
# extract 2D image features
spatial_volume_feats = []
# project source features
for ni in range(0, N):
pose_source_ = target_poses[:, ni]
K_source_ = target_Ks[:, ni]
x_ = self.target_encoder(x[:, ni], t_embed_[:, ni], v_embed_[:, ni])
C = x_.shape[1]
coords_source = get_warp_coordinates(spatial_volume_verts, x_.shape[-1], self.input_image_size, K_source_, pose_source_).view(B, V, V * V, 2)
unproj_feats_ = F.grid_sample(x_, coords_source, mode='bilinear', padding_mode='zeros', align_corners=True)
unproj_feats_ = unproj_feats_.view(B, C, V, V, V)
spatial_volume_feats.append(unproj_feats_)
spatial_volume_feats = torch.stack(spatial_volume_feats, 1) # B,N,C,V,V,V
N = spatial_volume_feats.shape[1]
spatial_volume_feats = spatial_volume_feats.view(B, N*C, V, V, V)
        if proxy is not None:
            # proxy coordinates lie in [-0.5, 0.5]; shift into [0, 1] without mutating the caller's tensor
            _, num_proxy, _ = proxy.shape
            proxy = proxy + 0.5
            proxy = proxy.permute(0, 2, 1)
            proxy_feature = torch.ones([B, 1, num_proxy], dtype=proxy.dtype, device=proxy.device) * self.feature_scale
            proxy_feature, _ = self.pvcnn([proxy_feature, proxy])
proxy_feature = proxy_feature.permute(0, 1, 4, 3, 2)
proxy_residual = self.controlnet(spatial_volume_feats, t_embed, proxy_feature)
else:
proxy_residual = None
spatial_volume_feats = self.spatial_volume_feats(spatial_volume_feats, t_embed, proxy_residual) # b,64,32,32,32
return spatial_volume_feats
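The spatial-volume vertices built at the top of `construct_spatial_volume` form a dense V³ grid spanning `[-spatial_volume_length, spatial_volume_length]`, with the axis order flipped by `[:, :, (2, 1, 0)]`. A numpy sketch of that grid construction, using a small V for readability:

```python
import numpy as np

V, L = 4, 0.5
axis = np.linspace(-L, L, V, dtype=np.float32)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), -1)  # V,V,V,3
verts = grid.reshape(V ** 3, 3)[:, (2, 1, 0)]                      # flip (z,y,x) -> (x,y,z)

assert verts.shape == (64, 3)                    # V**3 vertices, 3 coordinates each
assert verts.min() == -L and verts.max() == L    # grid spans the full volume
```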
class CtrlDemo(SyncMultiviewDiffusion):
def __init__(self, unet_config, scheduler_config,
finetune_unet=False, finetune_projection=True,
view_num=16, image_size=256,
cfg_scale=3.0, output_num=8, batch_view_num=4,
drop_conditions=False, drop_scheme='default',
clip_image_encoder_path="/apdcephfs/private_rondyliu/projects/clip/ViT-L-14.pt",
sample_type='ddim', sample_steps=200, feature_scale=1):
pl.LightningModule.__init__(self)
self.finetune_unet = finetune_unet
self.finetune_projection = finetune_projection
self.view_num = view_num
self.viewpoint_dim = 4
self.output_num = output_num
self.image_size = image_size
self.batch_view_num = batch_view_num
self.cfg_scale = cfg_scale
self.clip_image_encoder_path = clip_image_encoder_path
self._init_time_step_embedding()
self._init_first_stage()
self._init_schedule()
self._init_multiview()
self._init_clip_image_encoder()
self._init_clip_projection()
self.spatial_volume = ControlSpatialVolumeNet(self.time_embed_dim, self.viewpoint_dim, self.view_num, feature_scale=feature_scale)
self.model = UNetWrapper(unet_config, drop_conditions=drop_conditions, drop_scheme=drop_scheme)
self.scheduler_config = scheduler_config
latent_size = image_size//8
self._init_sampler(latent_size, sample_steps)
def _init_sampler(self, latent_size, sample_steps):
        self.sampler = CtrlDemoSampler(self, sample_steps, 'ddim', "uniform", 1.0, latent_size=latent_size)
def prepare(self, batch):
x, clip_embed, input_info = super().prepare(batch)
if 'proxy' in batch:
input_info['proxy'] = batch['proxy']
return x, clip_embed, input_info
def inference(self, sampler, batch, cfg_scale, batch_view_num, return_inter_results=False, inter_interval=50, inter_view_interval=2, callback=None):
_, clip_embed, input_info = self.prepare(batch)
x_sample, _, total_spatial_volume = sampler.inference(input_info, clip_embed, unconditional_scale=cfg_scale, log_every_t=inter_interval, batch_view_num=batch_view_num,callback=callback)
return x_sample, total_spatial_volume
def decode_latents(self, x_sample):
images = self.decode_first_stage(x_sample)
return images
def get_target_view_feats(self, x_input, spatial_volume, clip_embed, t_embed, v_embed, target_index, spatial_volume_params):
"""
@param x_input: B,4,H,W
@param spatial_volume: B,C,V,V,V
@param clip_embed: B,1,768
@param t_embed: B,t_dim
@param v_embed: B,N,v_dim
@param target_index: B,TN
@return:
tensors of size B*TN,*
"""
B, _, H, W = x_input.shape
frustum_volume_feats, frustum_volume_depth = self.spatial_volume.construct_view_frustum_volume(spatial_volume, t_embed, v_embed, **spatial_volume_params)
# clip
TN = target_index.shape[1]
v_embed_ = v_embed[torch.arange(B)[:,None], target_index].view(B*TN, self.viewpoint_dim) # B*TN,v_dim
clip_embed_ = clip_embed.unsqueeze(1).repeat(1,TN,1,1).view(B*TN,1,768)
clip_embed_ = self.cc_projection(torch.cat([clip_embed_, v_embed_.unsqueeze(1)], -1)) # B*TN,1,768
x_input_ = x_input.unsqueeze(1).repeat(1, TN, 1, 1, 1).view(B * TN, 4, H, W)
x_concat = x_input_
return clip_embed_, frustum_volume_feats, x_concat
def training_step(self, batch):
B = batch['target_image'].shape[0]
time_steps = torch.randint(0, self.num_timesteps, (B,), device=self.device).long()
x, clip_embed, input_info = self.prepare(batch)
x_noisy, noise = self.add_noise(x, time_steps) # B,N,4,H,W
N = self.view_num
target_index = torch.randint(0, N, (B, 1), device=self.device).long() # B, 1
v_embed = self.get_viewpoint_embedding(B, input_info['elevation']) # N,v_dim
proxy_ = input_info['proxy'].detach().clone()
t_embed = self.embed_time(time_steps)
spatial_volume = self.spatial_volume.construct_spatial_volume(x_noisy, t_embed, v_embed, self.poses, self.Ks, proxy=proxy_)
spatial_volume_params = {'poses': self.poses, 'Ks': self.Ks, 'target_indices': target_index}
clip_embed, volume_feats, x_concat = self.get_target_view_feats(input_info['x'], spatial_volume, clip_embed, t_embed, v_embed, target_index, spatial_volume_params=spatial_volume_params)
x_noisy_ = x_noisy[torch.arange(B)[:,None],target_index][:,0] # B,4,H,W
noise_predict = self.model(x_noisy_, time_steps, clip_embed, volume_feats, x_concat, is_train=True) # B,4,H,W
noise_target = noise[torch.arange(B)[:,None],target_index][:,0] # B,4,H,W
# loss simple for diffusion
loss_simple = torch.nn.functional.mse_loss(noise_target, noise_predict, reduction='none')
loss = loss_simple.mean()
self.log('sim', loss_simple.mean(), prog_bar=True, logger=True, on_step=True, on_epoch=True, rank_zero_only=True)
# log others
lr = self.optimizers().param_groups[0]['lr']
self.log('lr', lr, prog_bar=True, logger=True, on_step=True, on_epoch=False, rank_zero_only=True)
self.log("step", self.global_step, prog_bar=True, logger=True, on_step=True, on_epoch=False, rank_zero_only=True)
return loss
def configure_optimizers(self):
lr = self.learning_rate
print(f'setting learning rate to {lr:.4f} ...')
paras = []
paras.append({"params": self.spatial_volume.controlnet.parameters(), "lr": lr},)
opt = torch.optim.AdamW(paras, lr=lr)
scheduler = instantiate_from_config(self.scheduler_config)
print("Setting up LambdaLR scheduler...")
scheduler = [{'scheduler': LambdaLR(opt, lr_lambda=scheduler.schedule), 'interval': 'step', 'frequency': 1}]
return [opt], scheduler
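`training_step` above draws a random timestep, noises the target latents, and supervises the UNet with a plain MSE on the added noise. The objective in isolation, with toy numpy arrays standing in for the real `add_noise` and UNet:

```python
import numpy as np

rng = np.random.default_rng(0)
x0 = rng.normal(size=(2, 4))   # clean latents
eps = rng.normal(size=(2, 4))  # the noise the model must recover
alpha_bar = 0.7                # cumulative noise-schedule value at the sampled t

# forward diffusion: x_t = sqrt(alpha_bar) * x0 + sqrt(1 - alpha_bar) * eps
x_t = np.sqrt(alpha_bar) * x0 + np.sqrt(1 - alpha_bar) * eps

# a perfect epsilon-predictor would output eps exactly, driving the MSE to zero
loss = np.mean((eps - eps) ** 2)
assert loss == 0.0

# knowing the true noise inverts the forward step and recovers x0
x0_rec = (x_t - np.sqrt(1 - alpha_bar) * eps) / np.sqrt(alpha_bar)
assert np.allclose(x0_rec, x0)
```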
class CtrlDemoSampler:
def __init__(self, model: CtrlDemo, scheduler_steps, scheduler_name='ddim', ddim_discretize="uniform", ddim_eta=1.0, latent_size=32):
self.model = model
self.ddpm_num_timesteps = model.num_timesteps
self.latent_size = latent_size
self.eta = ddim_eta
self.scheduler_name = scheduler_name
self.scheduler_steps = scheduler_steps
if scheduler_name == 'ddim':
self.scheduler=DDIMScheduler(num_train_timesteps=self.ddpm_num_timesteps, beta_start=0.00085, beta_end=0.0120, beta_schedule="scaled_linear", set_alpha_to_one=False, clip_sample=False, steps_offset=1, trained_betas=None)
elif scheduler_name == 'dpm++':
self.scheduler=DPMPPScheduler(num_train_timesteps=self.ddpm_num_timesteps, beta_start=0.00085, beta_end=0.0120, beta_schedule="scaled_linear", use_karras_sigmas=True)
# self.scheduler=DPMSolverMultistepScheduler(num_train_timesteps=self.ddpm_num_timesteps, beta_start=0.00085, beta_end=0.0120, beta_schedule="scaled_linear", use_karras_sigmas=True)
self.scheduler.set_timesteps(scheduler_steps, device=self.model.device)
self.set_ctrl3D_params([{"start_percent":0.0, "end_percent":1.0}], strength=1.0)
def parameterization(self):
sampler_params = {
'scheduler_steps' : self.scheduler_steps,
'scheduler_name': self.scheduler_name,
'ddim_eta': self.eta,
'latent_size': self.latent_size,
}
return sampler_params
# save_pickle(sampler_params, save_path)
@classmethod
def from_pkl(cls, model, pkl_dir):
params = read_pickle(pkl_dir)
return cls(model, **params)
def set_ctrl3D_params(self, ctrl3D_params_list, strength: float=1.0):
self.ctrl3D_params_list = ctrl3D_params_list
self.model.spatial_volume.controlnet.ctrl_strength = strength
    def concat_proxy(self, proxys, inference_step):
        step = inference_step / 1000
        valid_proxys = []
        for params, pxy in zip(self.ctrl3D_params_list, proxys):
            if not isinstance(pxy, torch.Tensor):
                continue
            # ctrl3D params are stored as dicts (see set_ctrl3D_params), so index by key
            if (1 - params['end_percent']) < step < (1 - params['start_percent']):
                valid_proxys.append(pxy)
        if len(valid_proxys) == 0:
            return None
        return torch.cat(valid_proxys, dim=1)
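`concat_proxy` gates each proxy by how far denoising has progressed: the scheduler's timestep `t` runs from high noise (`t ≈ 1000`) down to 0, so `t/1000` is the remaining-noise fraction and a proxy with `start_percent`/`end_percent` is applied only while `(1 - end) < t/1000 < (1 - start)`. A standalone sketch of that window test, with hypothetical parameter values:

```python
def proxy_active(t, start_percent, end_percent, num_train_timesteps=1000):
    # mirrors the window check in concat_proxy
    step = t / num_train_timesteps
    return (1 - end_percent) < step < (1 - start_percent)

# a proxy covering the whole trajectory is active at every interior step
assert proxy_active(900, start_percent=0.0, end_percent=1.0)
assert proxy_active(100, start_percent=0.0, end_percent=1.0)

# end_percent=0.5 keeps the proxy only for the early, high-noise half
assert proxy_active(900, start_percent=0.0, end_percent=0.5)
assert not proxy_active(400, start_percent=0.0, end_percent=0.5)
```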
@torch.no_grad()
def denoise_apply_impl(self, x_target_noisy, time_steps, noise_pred, is_step0=False):
if self.scheduler_name == 'ddim':
result = self.scheduler.step(noise_pred, time_steps, x_target_noisy, return_dict=True, eta=self.eta if not is_step0 else 0)
elif self.scheduler_name == 'dpm++':
result = self.scheduler.step(noise_pred, time_steps, x_target_noisy, return_dict=True)
x_pred, x_origin = result[0], result[1]
return x_pred, x_origin
@torch.no_grad()
def denoise_apply(self, x_target_noisy, input_info, v_embed, clip_embed, time_steps, index, unconditional_scale, batch_view_num=1, is_step0=False, spatial_volume=None):
"""
@param x_target_noisy: B,N,4,H,W
@param input_info:
@param clip_embed: B,M,768
@param time_steps: B,
@param index: int
@param unconditional_scale:
@param batch_view_num: int
@param is_step0: bool
@return:
"""
x_input = input_info['x']
B, N, C, H, W = x_target_noisy.shape
# construct source data
t_embed = self.model.embed_time(time_steps) # B,t_dim
if spatial_volume is None:
proxy = None if 'proxy' not in input_info else input_info['proxy'].detach().clone()
spatial_volume = self.model.spatial_volume.construct_spatial_volume(x_target_noisy, t_embed, v_embed, self.model.poses, self.model.Ks, proxy=proxy)
e_t = []
target_indices = torch.arange(N) # N
for ni in range(0, N, batch_view_num):
x_target_noisy_ = x_target_noisy[:, ni:ni + batch_view_num]
VN = x_target_noisy_.shape[1]
x_target_noisy_ = x_target_noisy_.reshape(B*VN,C,H,W)
time_steps_ = repeat_to_batch(time_steps, B, VN)
target_indices_ = target_indices[ni:ni+batch_view_num].unsqueeze(0).repeat(B,1)
spatial_volume_params = {'poses': self.model.poses, 'Ks': self.model.Ks, 'target_indices': target_indices_}
clip_embed_, volume_feats_, x_concat_ = self.model.get_target_view_feats(x_input, spatial_volume, clip_embed, t_embed, v_embed, target_indices_, spatial_volume_params)
if unconditional_scale!=1.0:
noise = self.model.model.predict_with_unconditional_scale(x_target_noisy_, time_steps_, clip_embed_, volume_feats_, x_concat_, unconditional_scale)
else:
noise = self.model.model(x_target_noisy_, time_steps_, clip_embed_, volume_feats_, x_concat_, is_train=False)
e_t.append(noise.view(B,VN,4,H,W))
e_t = torch.cat(e_t, 1)
x_prev, _ = self.denoise_apply_impl(x_target_noisy, int(time_steps[0]), e_t, is_step0)
return x_prev, spatial_volume
@torch.no_grad()
def inference(self, input_info, clip_embed, unconditional_scale=1.0, log_every_t=50, batch_view_num=1, callback=None):
"""
@param input_info: x, elevation
@param clip_embed: B,M,768
@param unconditional_scale:
@param log_every_t:
@param batch_view_num:
@return:
"""
print(f"unconditional scale {unconditional_scale:.1f}")
C, H, W = 4, self.latent_size, self.latent_size
B = clip_embed.shape[0]
N = self.model.view_num
device = self.model.device
x_target_noisy = torch.randn([B, N, C, H, W], device=device)
elevation_input = input_info['elevation']
v_embed = self.model.get_viewpoint_embedding(B, elevation_input) # B,N,v_dim
timesteps = self.scheduler.timesteps
intermediates = {'x_inter': []}
# time_range = np.flip(timesteps)
total_steps = timesteps.shape[0]
iterator = tqdm(timesteps, desc='DDIM Sampler', total=total_steps)
condition_name = ['proxy']
total_volume_feature = []
for i, step in enumerate(iterator):
index = total_steps - i - 1 # index in ddim state
time_steps = torch.full((B,), step, device=device, dtype=torch.long)
t_input_info = {k:v for k, v in input_info.items() if k not in condition_name}
valid_proxy = self.concat_proxy(input_info['proxy'], int(step))
if valid_proxy is not None:
t_input_info['proxy'] = valid_proxy
x_target_noisy, spatial_volume = self.denoise_apply(x_target_noisy, t_input_info, v_embed, clip_embed, time_steps, index, unconditional_scale, batch_view_num=batch_view_num, is_step0=index==0)
total_volume_feature.append(spatial_volume)
if index % log_every_t == 0 or index == total_steps - 1:
intermediates['x_inter'].append(x_target_noisy)
if callback is not None:
callback(i, total_steps)
return x_target_noisy, intermediates, total_volume_feature
def get_clip_feature(self, x_input, clip_embed, v_embed, target_index):
B, _, H, W = x_input.shape
TN = target_index.shape[1]
viewpoint_dim = 4
v_embed_ = v_embed[torch.arange(B)[:,None], target_index].view(B*TN, self.model.viewpoint_dim) # B*TN,v_dim
clip_embed_ = clip_embed.unsqueeze(1).repeat(1,TN,1,1).view(B*TN,1,768)
clip_embed_ = self.model.cc_projection(torch.cat([clip_embed_, v_embed_.unsqueeze(1)], -1)) # B*TN,1,768
x_input_ = x_input.unsqueeze(1).repeat(1, TN, 1, 1, 1).view(B * TN, 4, H, W)
x_concat = x_input_
return clip_embed_, x_concat
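The `cc_projection` used above maps the concatenation of a 768-d CLIP image embedding and a 4-d viewpoint embedding back to 768 dimensions, once per target view. A minimal shape-only sketch of that flow (the batch size, view count, and random inputs below are toy values for illustration, not from the repo):

```python
import torch
import torch.nn as nn

B, TN = 2, 3  # toy batch size and number of target views
clip_embed = torch.randn(B, 1, 768)  # one CLIP token per input image
v_embed = torch.randn(B, TN, 4)      # per-view (d_e, sin d_a, cos d_a, d_z)

cc_projection = nn.Linear(772, 768)  # 768 + 4 -> 768, as in _init_clip_projection

# repeat the CLIP token for every target view, then append the viewpoint code
clip_ = clip_embed.repeat(1, TN, 1).view(B * TN, 1, 768)
v_ = v_embed.view(B * TN, 1, 4)
out = cc_projection(torch.cat([clip_, v_], -1))
print(out.shape)  # torch.Size([6, 1, 768])
```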
================================================
FILE: ldm/models/diffusion/sync_dreamer.py
================================================
from pathlib import Path
import pytorch_lightning as pl
import torch
import torch.nn as nn
import torch.nn.functional as F
import numpy as np
from skimage.io import imsave
from torch.optim.lr_scheduler import LambdaLR
from tqdm import tqdm
from ldm.base_utils import read_pickle, concat_images_list
from ldm.models.diffusion.sync_dreamer_utils import get_warp_coordinates, create_target_volume
from ldm.models.diffusion.sync_dreamer_network import NoisyTargetViewEncoder, SpatialTime3DNet, FrustumTV3DNet
from ldm.modules.diffusionmodules.util import make_ddim_timesteps, timestep_embedding
from ldm.modules.encoders.modules import FrozenCLIPImageEmbedder
from ldm.util import instantiate_from_config
from ldm.typing import *
def disabled_train(self, mode=True):
"""Overwrite model.train with this function to make sure train/eval mode
does not change anymore."""
return self
def disable_training_module(module: nn.Module):
module = module.eval()
module.train = disabled_train
for para in module.parameters():
para.requires_grad = False
return module
def repeat_to_batch(tensor, B, VN):
t_shape = tensor.shape
ones = [1 for _ in range(len(t_shape)-1)]
tensor_new = tensor.view(B,1,*t_shape[1:]).repeat(1,VN,*ones).view(B*VN,*t_shape[1:])
return tensor_new
class UNetWrapper(nn.Module):
def __init__(self, diff_model_config, drop_conditions=False, drop_scheme='default', use_zero_123=True):
super().__init__()
self.diffusion_model = instantiate_from_config(diff_model_config)
self.drop_conditions = drop_conditions
self.drop_scheme=drop_scheme
self.use_zero_123 = use_zero_123
def drop(self, cond, mask):
shape = cond.shape
B = shape[0]
cond = mask.view(B,*[1 for _ in range(len(shape)-1)]) * cond
return cond
def get_trainable_parameters(self):
return self.diffusion_model.get_trainable_parameters()
def get_drop_scheme(self, B, device):
if self.drop_scheme=='default':
random = torch.rand(B, dtype=torch.float32, device=device)
drop_clip = (random > 0.15) & (random <= 0.2)
drop_volume = (random > 0.1) & (random <= 0.15)
drop_concat = (random > 0.05) & (random <= 0.1)
drop_all = random <= 0.05
else:
raise NotImplementedError
return drop_clip, drop_volume, drop_concat, drop_all
def forward(self, x, t, clip_embed, volume_feats, x_concat, is_train=False):
"""
@param x: B,4,H,W
@param t: B,
@param clip_embed: B,M,768
@param volume_feats: B,C,D,H,W
@param x_concat: B,C,H,W
@param is_train:
@return:
"""
if self.drop_conditions and is_train:
B = x.shape[0]
drop_clip, drop_volume, drop_concat, drop_all = self.get_drop_scheme(B, x.device)
clip_mask = 1.0 - (drop_clip | drop_all).float()
clip_embed = self.drop(clip_embed, clip_mask)
volume_mask = 1.0 - (drop_volume | drop_all).float()
for k, v in volume_feats.items():
volume_feats[k] = self.drop(v, mask=volume_mask)
concat_mask = 1.0 - (drop_concat | drop_all).float()
x_concat = self.drop(x_concat, concat_mask)
if self.use_zero_123:
# zero123 does not multiply by the scale factor when encoding; this may be a bug in zero123
first_stage_scale_factor = 0.18215
x_concat_ = x_concat * 1.0 # copy so the input tensor is not modified in place
x_concat_[:, :4] = x_concat_[:, :4] / first_stage_scale_factor
else:
x_concat_ = x_concat
x = torch.cat([x, x_concat_], 1)
pred = self.diffusion_model(x, t, clip_embed, source_dict=volume_feats)
return pred
def predict_with_unconditional_scale(self, x, t, clip_embed, volume_feats, x_concat, unconditional_scale):
x_ = torch.cat([x] * 2, 0)
t_ = torch.cat([t] * 2, 0)
clip_embed_ = torch.cat([clip_embed, torch.zeros_like(clip_embed)], 0)
v_ = {}
for k, v in volume_feats.items():
v_[k] = torch.cat([v, torch.zeros_like(v)], 0)
x_concat_ = torch.cat([x_concat, torch.zeros_like(x_concat)], 0)
if self.use_zero_123:
# zero123 does not multiply by the scale factor when encoding; this may be a bug in zero123
first_stage_scale_factor = 0.18215
x_concat_[:, :4] = x_concat_[:, :4] / first_stage_scale_factor
x_ = torch.cat([x_, x_concat_], 1)
s, s_uc = self.diffusion_model(x_, t_, clip_embed_, source_dict=v_).chunk(2)
s = s_uc + unconditional_scale * (s - s_uc)
return s
def predict_for_threestudio(self, x, t, clip_embed, volume_feats, x_concat):
x_ = torch.cat([x] * 2, 0)
t_ = torch.cat([t] * 2, 0)
v_ = {}
for k, v in volume_feats.items():
v_[k] = torch.cat([torch.zeros_like(v), v], 0)
# if self.use_zero_123:
# # zero123 does not multiply this when encoding, maybe a bug for zero123
# first_stage_scale_factor = 0.18215
# x_concat_[:, :4] = x_concat_[:, :4] / first_stage_scale_factor
x_ = torch.cat([x_, x_concat[0]], 1)
s_uc, s = self.diffusion_model(x_, t_, clip_embed[0], source_dict=v_).chunk(2)
return s_uc, s
def predict_with_unconditional_scale_mv(self, x, t, clip_embed, volume_feats, x_concat, unconditional_scale, t_x_concat, t_clip_embed, merge_weight):
merge_weight = merge_weight.squeeze(0).view(-1, 1, 1, 1).to(clip_embed.device)
s_view1 = self.predict_with_unconditional_scale(x, t, clip_embed, volume_feats, x_concat, unconditional_scale)
s_view2 = self.predict_with_unconditional_scale(x, t, t_clip_embed, volume_feats, t_x_concat, unconditional_scale)
s = merge_weight*s_view1+(1-merge_weight)*s_view2
return s
class SpatialVolumeNet(nn.Module):
def __init__(self, time_dim, view_dim, view_num,
input_image_size=256, frustum_volume_depth=48,
spatial_volume_size=32, spatial_volume_length=0.5,
frustum_volume_length=0.86603 # sqrt(3)/2
):
super().__init__()
self.target_encoder = NoisyTargetViewEncoder(time_dim, view_dim, output_dim=16)
self.spatial_volume_feats = SpatialTime3DNet(input_dim=16 * view_num, time_dim=time_dim, dims=(64, 128, 256, 512))
self.frustum_volume_feats = FrustumTV3DNet(64, time_dim, view_dim, dims=(64, 128, 256, 512))
self.frustum_volume_length = frustum_volume_length
self.input_image_size = input_image_size
self.spatial_volume_size = spatial_volume_size
self.spatial_volume_length = spatial_volume_length
self.frustum_volume_size = self.input_image_size // 8
self.frustum_volume_depth = frustum_volume_depth
self.time_dim = time_dim
self.view_dim = view_dim
self.default_origin_depth = 1.5 # we assume the camera is 1.5 units away from the origin, matching our rendered images
def construct_spatial_volume(self, x, t_embed, v_embed, target_poses, target_Ks):
"""
@param x: B,N,4,H,W
@param t_embed: B,t_dim
@param v_embed: B,N,v_dim
@param target_poses: N,3,4
@param target_Ks: N,3,3
@return:
"""
B, N, _, H, W = x.shape
V = self.spatial_volume_size
device = x.device
spatial_volume_verts = torch.linspace(-self.spatial_volume_length, self.spatial_volume_length, V, dtype=torch.float32, device=device)
spatial_volume_verts = torch.stack(torch.meshgrid(spatial_volume_verts, spatial_volume_verts, spatial_volume_verts), -1)
spatial_volume_verts = spatial_volume_verts.reshape(1, V ** 3, 3)[:, :, (2, 1, 0)]
spatial_volume_verts = spatial_volume_verts.view(1, V, V, V, 3).permute(0, 4, 1, 2, 3).repeat(B, 1, 1, 1, 1)
# encode source features
t_embed_ = t_embed.view(B, 1, self.time_dim).repeat(1, N, 1).view(B, N, self.time_dim)
# v_embed_ = v_embed.view(1, N, self.view_dim).repeat(B, 1, 1).view(B, N, self.view_dim)
v_embed_ = v_embed
target_Ks = target_Ks.unsqueeze(0).repeat(B, 1, 1, 1)
target_poses = target_poses.unsqueeze(0).repeat(B, 1, 1, 1)
# extract 2D image features
spatial_volume_feats = []
# project source features
for ni in range(0, N):
pose_source_ = target_poses[:, ni]
K_source_ = target_Ks[:, ni]
x_ = self.target_encoder(x[:, ni], t_embed_[:, ni], v_embed_[:, ni])
C = x_.shape[1]
coords_source = get_warp_coordinates(spatial_volume_verts, x_.shape[-1], self.input_image_size, K_source_, pose_source_).view(B, V, V * V, 2)
unproj_feats_ = F.grid_sample(x_, coords_source, mode='bilinear', padding_mode='zeros', align_corners=True)
unproj_feats_ = unproj_feats_.view(B, C, V, V, V)
spatial_volume_feats.append(unproj_feats_)
spatial_volume_feats = torch.stack(spatial_volume_feats, 1) # B,N,C,V,V,V
N = spatial_volume_feats.shape[1]
spatial_volume_feats = spatial_volume_feats.view(B, N*C, V, V, V)
spatial_volume_feats = self.spatial_volume_feats(spatial_volume_feats, t_embed) # b,64,32,32,32
return spatial_volume_feats
def construct_view_frustum_volume(self, spatial_volume, t_embed, v_embed, poses, Ks, target_indices):
"""
@param spatial_volume: B,C,V,V,V
@param t_embed: B,t_dim
@param v_embed: B,N,v_dim
@param poses: N,3,4
@param Ks: N,3,3
@param target_indices: B,TN
@return: B*TN,C,H,W
"""
B, TN = target_indices.shape
H, W = self.frustum_volume_size, self.frustum_volume_size
D = self.frustum_volume_depth
V = self.spatial_volume_size
near = torch.ones(B * TN, 1, H, W, dtype=spatial_volume.dtype, device=spatial_volume.device) * self.default_origin_depth - self.frustum_volume_length
far = torch.ones(B * TN, 1, H, W, dtype=spatial_volume.dtype, device=spatial_volume.device) * self.default_origin_depth + self.frustum_volume_length
target_indices = target_indices.view(B*TN) # B*TN
poses_ = poses[target_indices] # B*TN,3,4
Ks_ = Ks[target_indices] # B*TN,3,3
volume_xyz, volume_depth = create_target_volume(D, self.frustum_volume_size, self.input_image_size, poses_, Ks_, near, far) # B*TN,3 or 1,D,H,W
volume_xyz_ = volume_xyz / self.spatial_volume_length # since the spatial volume is constructed in [-spatial_volume_length,spatial_volume_length]
volume_xyz_ = volume_xyz_.permute(0, 2, 3, 4, 1) # B*TN,D,H,W,3
spatial_volume_ = spatial_volume.unsqueeze(1).repeat(1, TN, 1, 1, 1, 1).view(B * TN, -1, V, V, V)
volume_feats = F.grid_sample(spatial_volume_, volume_xyz_, mode='bilinear', padding_mode='zeros', align_corners=True) # B*TN,C,D,H,W
v_embed_ = v_embed[torch.arange(B)[:,None], target_indices.view(B,TN)].view(B*TN, -1) # B*TN,v_dim
t_embed_ = t_embed.unsqueeze(1).repeat(1,TN,1).view(B*TN,-1)
volume_feats_dict = self.frustum_volume_feats(volume_feats, t_embed_, v_embed_)
return volume_feats_dict, volume_depth
class SyncMultiviewDiffusion(pl.LightningModule):
def __init__(self, unet_config, scheduler_config,
finetune_unet=False, finetune_projection=True,
view_num=16, image_size=256,
cfg_scale=3.0, output_num=8, batch_view_num=4,
drop_conditions=False, drop_scheme='default',
clip_image_encoder_path="/apdcephfs/private_rondyliu/projects/clip/ViT-L-14.pt",
sample_type='ddim', sample_steps=200):
super().__init__()
self.finetune_unet = finetune_unet
self.finetune_projection = finetune_projection
self.view_num = view_num
self.viewpoint_dim = 4
self.output_num = output_num
self.image_size = image_size
self.batch_view_num = batch_view_num
self.cfg_scale = cfg_scale
self.clip_image_encoder_path = clip_image_encoder_path
self._init_time_step_embedding()
self._init_first_stage()
self._init_schedule()
self._init_multiview()
self._init_clip_image_encoder()
self._init_clip_projection()
self.spatial_volume = SpatialVolumeNet(self.time_embed_dim, self.viewpoint_dim, self.view_num)
self.model = UNetWrapper(unet_config, drop_conditions=drop_conditions, drop_scheme=drop_scheme)
self.scheduler_config = scheduler_config
latent_size = image_size//8
if sample_type=='ddim':
self.sampler = SyncDDIMSampler(self, sample_steps , "uniform", 1.0, latent_size=latent_size)
else:
raise NotImplementedError
def _init_clip_projection(self):
self.cc_projection = nn.Linear(772, 768)
nn.init.eye_(self.cc_projection.weight[:768, :768])
nn.init.zeros_(self.cc_projection.bias)
self.cc_projection.requires_grad_(True)
if not self.finetune_projection:
disable_training_module(self.cc_projection)
def _init_multiview(self):
K, azs, _, _, poses = read_pickle(f'meta_info/camera-{self.view_num}.pkl')
default_image_size = 256
ratio = self.image_size/default_image_size
K = np.diag([ratio,ratio,1]) @ K
K = torch.from_numpy(K.astype(np.float32)) # [3,3]
K = K.unsqueeze(0).repeat(self.view_num,1,1) # N,3,3
poses = torch.from_numpy(poses.astype(np.float32)) # N,3,4
self.register_buffer('poses', poses)
self.register_buffer('Ks', K)
azs = (azs + np.pi) % (np.pi * 2) - np.pi # wrap to [-pi,pi]; index 0 has azimuth 0
self.register_buffer('azimuth', torch.from_numpy(azs.astype(np.float32)))
def get_viewpoint_embedding(self, batch_size, elevation_ref):
"""
@param batch_size:
@param elevation_ref: B
@return:
"""
azimuth_input = self.azimuth[0].unsqueeze(0) # 1
azimuth_target = self.azimuth # N
elevation_input = -elevation_ref # note that zero123 uses a negative elevation here!
elevation_target = -np.deg2rad(30)
d_e = elevation_target - elevation_input # B
N = self.azimuth.shape[0]
B = batch_size
d_e = d_e.unsqueeze(1).repeat(1, N)
d_a = azimuth_target - azimuth_input # N
d_a = d_a.unsqueeze(0).repeat(B, 1)
d_z = torch.zeros_like(d_a)
embedding = torch.stack([d_e, torch.sin(d_a), torch.cos(d_a), d_z], -1) # B,N,4
return embedding
def _init_first_stage(self):
first_stage_config={
"target": "ldm.models.autoencoder.AutoencoderKL",
"params": {
"embed_dim": 4,
"monitor": "val/rec_loss",
"ddconfig":{
"double_z": True,
"z_channels": 4,
"resolution": self.image_size,
"in_channels": 3,
"out_ch": 3,
"ch": 128,
"ch_mult": [1,2,4,4],
"num_res_blocks": 2,
"attn_resolutions": [],
"dropout": 0.0
},
"lossconfig": {"target": "torch.nn.Identity"},
}
}
self.first_stage_scale_factor = 0.18215
self.first_stage_model = instantiate_from_config(first_stage_config)
self.first_stage_model = disable_training_module(self.first_stage_model)
def _init_clip_image_encoder(self):
self.clip_image_encoder = FrozenCLIPImageEmbedder(model=self.clip_image_encoder_path)
self.clip_image_encoder = disable_training_module(self.clip_image_encoder)
def _init_schedule(self):
self.num_timesteps = 1000
linear_start = 0.00085
linear_end = 0.0120
num_timesteps = 1000
betas = torch.linspace(linear_start ** 0.5, linear_end ** 0.5, num_timesteps, dtype=torch.float32) ** 2 # T
assert betas.shape[0] == self.num_timesteps
# note: the schedule is computed in float32 here
alphas = 1. - betas
alphas_cumprod = torch.cumprod(alphas, dim=0) # T
alphas_cumprod_prev = torch.cat([torch.ones(1, dtype=alphas_cumprod.dtype), alphas_cumprod[:-1]], 0)
posterior_variance = betas * (1. - alphas_cumprod_prev) / (1. - alphas_cumprod) # T
posterior_log_variance_clipped = torch.log(torch.clamp(posterior_variance, min=1e-20))
posterior_log_variance_clipped = torch.clamp(posterior_log_variance_clipped, min=-10)
self.register_buffer("betas", betas.float())
self.register_buffer("alphas", alphas.float())
self.register_buffer("alphas_cumprod", alphas_cumprod.float())
self.register_buffer("sqrt_alphas_cumprod", torch.sqrt(alphas_cumprod).float())
self.register_buffer("sqrt_one_minus_alphas_cumprod", torch.sqrt(1 - alphas_cumprod).float())
self.register_buffer("posterior_variance", posterior_variance.float())
self.register_buffer('posterior_log_variance_clipped', posterior_log_variance_clipped.float())
def _init_time_step_embedding(self):
self.time_embed_dim = 256
self.time_embed = nn.Sequential(
nn.Linear(self.time_embed_dim, self.time_embed_dim),
nn.SiLU(True),
nn.Linear(self.time_embed_dim, self.time_embed_dim),
)
def encode_first_stage(self, x, sample=True):
with torch.no_grad():
posterior = self.first_stage_model.encode(x) # b,4,h//8,w//8
if sample:
return posterior.sample().detach() * self.first_stage_scale_factor
else:
return posterior.mode().detach() * self.first_stage_scale_factor
def decode_first_stage(self, z):
with torch.no_grad():
z = 1. / self.first_stage_scale_factor * z
return self.first_stage_model.decode(z)
def prepare(self, batch):
# encode target
if 'target_image' in batch:
image_target = batch['target_image'].permute(0, 1, 4, 2, 3) # b,n,3,h,w
N = image_target.shape[1]
x = [self.encode_first_stage(image_target[:,ni], True) for ni in range(N)]
x = torch.stack(x, 1) # b,n,4,h//8,w//8
else:
x = None
image_input = batch['input_image'].permute(0, 3, 1, 2)
elevation_input = batch['input_elevation'][:, 0] # b
x_input = self.encode_first_stage(image_input)
input_info = {'image': image_input, 'elevation': elevation_input, 'x': x_input}
with torch.no_grad():
clip_embed = self.clip_image_encoder.encode(image_input)
return x, clip_embed, input_info
def embed_time(self, t):
t_embed = timestep_embedding(t, self.time_embed_dim, repeat_only=False) # B,TED
t_embed = self.time_embed(t_embed) # B,TED
return t_embed
def get_target_view_feats(self, x_input, spatial_volume, clip_embed, t_embed, v_embed, target_index):
"""
@param x_input: B,4,H,W
@param spatial_volume: B,C,V,V,V
@param clip_embed: B,1,768
@param t_embed: B,t_dim
@param v_embed: B,N,v_dim
@param target_index: B,TN
@return:
tensors of size B*TN,*
"""
B, _, H, W = x_input.shape
frustum_volume_feats, frustum_volume_depth = self.spatial_volume.construct_view_frustum_volume(spatial_volume, t_embed, v_embed, self.poses, self.Ks, target_index)
# clip
TN = target_index.shape[1]
v_embed_ = v_embed[torch.arange(B)[:,None], target_index].view(B*TN, self.viewpoint_dim) # B*TN,v_dim
clip_embed_ = clip_embed.unsqueeze(1).repeat(1,TN,1,1).view(B*TN,1,768)
clip_embed_ = self.cc_projection(torch.cat([clip_embed_, v_embed_.unsqueeze(1)], -1)) # B*TN,1,768
x_input_ = x_input.unsqueeze(1).repeat(1, TN, 1, 1, 1).view(B * TN, 4, H, W)
x_concat = x_input_
return clip_embed_, frustum_volume_feats, x_concat
def training_step(self, batch):
B = batch['target_image'].shape[0]
time_steps = torch.randint(0, self.num_timesteps, (B,), device=self.device).long()
x, clip_embed, input_info = self.prepare(batch)
x_noisy, noise = self.add_noise(x, time_steps) # B,N,4,H,W
N = self.view_num
target_index = torch.randint(0, N, (B, 1), device=self.device).long() # B, 1
v_embed = self.get_viewpoint_embedding(B, input_info['elevation']) # N,v_dim
t_embed = self.embed_time(time_steps)
spatial_volume = self.spatial_volume.construct_spatial_volume(x_noisy, t_embed, v_embed, self.poses, self.Ks)
clip_embed, volume_feats, x_concat = self.get_target_view_feats(input_info['x'], spatial_volume, clip_embed, t_embed, v_embed, target_index)
x_noisy_ = x_noisy[torch.arange(B)[:,None],target_index][:,0] # B,4,H,W
noise_predict = self.model(x_noisy_, time_steps, clip_embed, volume_feats, x_concat, is_train=True) # B,4,H,W
noise_target = noise[torch.arange(B)[:,None],target_index][:,0] # B,4,H,W
# loss simple for diffusion
loss_simple = torch.nn.functional.mse_loss(noise_target, noise_predict, reduction='none')
loss = loss_simple.mean()
self.log('sim', loss_simple.mean(), prog_bar=True, logger=True, on_step=True, on_epoch=True, rank_zero_only=True)
# log others
lr = self.optimizers().param_groups[0]['lr']
self.log('lr', lr, prog_bar=True, logger=True, on_step=True, on_epoch=False, rank_zero_only=True)
self.log("step", self.global_step, prog_bar=True, logger=True, on_step=True, on_epoch=False, rank_zero_only=True)
return loss
def add_noise(self, x_start, t):
"""
@param x_start: B,*
@param t: B,
@return:
"""
B = x_start.shape[0]
noise = torch.randn_like(x_start) # B,*
sqrt_alphas_cumprod_ = self.sqrt_alphas_cumprod[t] # B,
sqrt_one_minus_alphas_cumprod_ = self.sqrt_one_minus_alphas_cumprod[t] # B
sqrt_alphas_cumprod_ = sqrt_alphas_cumprod_.view(B, *[1 for _ in range(len(x_start.shape)-1)])
sqrt_one_minus_alphas_cumprod_ = sqrt_one_minus_alphas_cumprod_.view(B, *[1 for _ in range(len(x_start.shape)-1)])
x_noisy = sqrt_alphas_cumprod_ * x_start + sqrt_one_minus_alphas_cumprod_ * noise
return x_noisy, noise
def sample(self, sampler, batch, cfg_scale, batch_view_num, return_inter_results=False, inter_interval=50, inter_view_interval=2):
_, clip_embed, input_info = self.prepare(batch)
x_sample, inter = sampler.sample(input_info, clip_embed, unconditional_scale=cfg_scale, log_every_t=inter_interval, batch_view_num=batch_view_num)
N = x_sample.shape[1]
x_sample = torch.stack([self.decode_first_stage(x_sample[:, ni]) for ni in range(N)], 1)
if return_inter_results:
torch.cuda.synchronize()
torch.cuda.empty_cache()
inter = torch.stack(inter['x_inter'], 2) # B,N,T,C,H,W
B,N,T,C,H,W = inter.shape
inter_results = []
for ni in tqdm(range(0, N, inter_view_interval)):
inter_results_ = []
for ti in range(T):
inter_results_.append(self.decode_first_stage(inter[:, ni, ti]))
inter_results.append(torch.stack(inter_results_, 1)) # B,T,3,H,W
inter_results = torch.stack(inter_results,1) # B,N,T,3,H,W
return x_sample, inter_results
else:
return x_sample
def decode_latents(self, x_sample):
images = self.decode_first_stage(x_sample)
return images
def inference(self, sampler, batch, cfg_scale, batch_view_num, return_inter_results=False, inter_interval=50, inter_view_interval=2):
_, clip_embed, input_info = self.prepare(batch)
x_sample, inter = sampler.sample(input_info, clip_embed, unconditional_scale=cfg_scale, log_every_t=inter_interval, batch_view_num=batch_view_num)
return x_sample, inter
def log_image(self, x_sample, batch, step, output_dir):
process = lambda x: ((torch.clip(x, min=-1, max=1).cpu().numpy() * 0.5 + 0.5) * 255).astype(np.uint8)
B = x_sample.shape[0]
N = x_sample.shape[1]
image_cond = []
for bi in range(B):
img_pr_ = concat_images_list(process(batch['input_image'][bi]),*[process(x_sample[bi, ni].permute(1, 2, 0)) for ni in range(N)])
image_cond.append(img_pr_)
output_dir = Path(output_dir)
imsave(str(output_dir/f'{step}.jpg'), concat_images_list(*image_cond, vert=True))
@torch.no_grad()
def validation_step(self, batch, batch_idx):
if batch_idx==0 and self.global_rank==0:
self.eval()
step = self.global_step
batch_ = {}
for k, v in batch.items(): batch_[k] = v[:self.output_num]
x_sample = self.sample(self.sampler, batch_, self.cfg_scale, self.batch_view_num)
output_dir = Path(self.image_dir) / 'images' / 'val'
output_dir.mkdir(exist_ok=True, parents=True)
self.log_image(x_sample, batch, step, output_dir=output_dir)
def configure_optimizers(self):
lr = self.learning_rate
print(f'setting learning rate to {lr:.2e} ...')
paras = []
if self.finetune_projection:
paras.append({"params": self.cc_projection.parameters(), "lr": lr},)
if self.finetune_unet:
paras.append({"params": self.model.parameters(), "lr": lr},)
else:
paras.append({"params": self.model.get_trainable_parameters(), "lr": lr},)
paras.append({"params": self.time_embed.parameters(), "lr": lr*10.0},)
paras.append({"params": self.spatial_volume.parameters(), "lr": lr*10.0},)
opt = torch.optim.AdamW(paras, lr=lr)
scheduler = instantiate_from_config(self.scheduler_config)
print("Setting up LambdaLR scheduler...")
scheduler = [{'scheduler': LambdaLR(opt, lr_lambda=scheduler.schedule), 'interval': 'step', 'frequency': 1}]
return [opt], scheduler
class SyncDDIMSampler:
def __init__(self, model: SyncMultiviewDiffusion, ddim_num_steps, ddim_discretize="uniform", ddim_eta=1.0, latent_size=32):
super().__init__()
self.model = model
self.ddpm_num_timesteps = model.num_timesteps
self.latent_size = latent_size
self._make_schedule(ddim_num_steps, ddim_discretize, ddim_eta)
self.eta = ddim_eta
def _make_schedule(self, ddim_num_steps, ddim_discretize="uniform", ddim_eta=0., verbose=True):
self.ddim_timesteps = make_ddim_timesteps(ddim_discr_method=ddim_discretize, num_ddim_timesteps=ddim_num_steps, num_ddpm_timesteps=self.ddpm_num_timesteps, verbose=verbose) # DT
ddim_timesteps_ = torch.from_numpy(self.ddim_timesteps.astype(np.int64)) # DT
alphas_cumprod = self.model.alphas_cumprod # T
assert alphas_cumprod.shape[0] == self.ddpm_num_timesteps, 'alphas have to be defined for each timestep'
self.ddim_alphas = alphas_cumprod[ddim_timesteps_].double() # DT
self.ddim_alphas_prev = torch.cat([alphas_cumprod[0:1], alphas_cumprod[ddim_timesteps_[:-1]]], 0) # DT
self.ddim_sigmas = ddim_eta * torch.sqrt((1 - self.ddim_alphas_prev) / (1 - self.ddim_alphas) * (1 - self.ddim_alphas / self.ddim_alphas_prev))
self.ddim_alphas_raw = self.model.alphas[ddim_timesteps_].float() # DT
self.ddim_sigmas = self.ddim_sigmas.float()
self.ddim_alphas = self.ddim_alphas.float()
self.ddim_alphas_prev = self.ddim_alphas_prev.float()
self.ddim_sqrt_one_minus_alphas = torch.sqrt(1. - self.ddim_alphas).float()
@torch.no_grad()
def denoise_apply_impl(self, x_target_noisy, index, noise_pred, is_step0=False):
"""
@param x_target_noisy: B,N,4,H,W
@param index: index
@param noise_pred: B,N,4,H,W
@param is_step0: bool
@return:
"""
device = x_target_noisy.device
B,N,_,H,W = x_target_noisy.shape
# apply noise
a_t = self.ddim_alphas[index].to(device).float().view(1,1,1,1,1)
a_prev = self.ddim_alphas_prev[index].to(device).float().view(1,1,1,1,1)
sqrt_one_minus_at = self.ddim_sqrt_one_minus_alphas[index].to(device).float().view(1,1,1,1,1)
sigma_t = self.ddim_sigmas[index].to(device).float().view(1,1,1,1,1)
pred_x0 = (x_target_noisy - sqrt_one_minus_at * noise_pred) / a_t.sqrt()
dir_xt = torch.clamp(1. - a_prev - sigma_t**2, min=1e-7).sqrt() * noise_pred
x_prev = a_prev.sqrt() * pred_x0 + dir_xt
if not is_step0:
noise = sigma_t * torch.randn_like(x_target_noisy)
x_prev = x_prev + noise
return x_prev
@torch.no_grad()
def denoise_apply(self, x_target_noisy, input_info, clip_embed, time_steps, index, unconditional_scale, batch_view_num=1, is_step0=False):
"""
@param x_target_noisy: B,N,4,H,W
@param input_info:
@param clip_embed: B,M,768
@param time_steps: B,
@param index: int
@param unconditional_scale:
@param batch_view_num: int
@param is_step0: bool
@return:
"""
x_input, elevation_input = input_info['x'], input_info['elevation']
B, N, C, H, W = x_target_noisy.shape
# construct source data
v_embed = self.model.get_viewpoint_embedding(B, elevation_input) # B,N,v_dim
t_embed = self.model.embed_time(time_steps) # B,t_dim
spatial_volume = self.model.spatial_volume.construct_spatial_volume(x_target_noisy, t_embed, v_embed, self.model.poses, self.model.Ks)
e_t = []
target_indices = torch.arange(N) # N
for ni in range(0, N, batch_view_num):
x_target_noisy_ = x_target_noisy[:, ni:ni + batch_view_num]
VN = x_target_noisy_.shape[1]
x_target_noisy_ = x_target_noisy_.reshape(B*VN,C,H,W)
time_steps_ = repeat_to_batch(time_steps, B, VN)
target_indices_ = target_indices[ni:ni+batch_view_num].unsqueeze(0).repeat(B,1)
clip_embed_, volume_feats_, x_concat_ = self.model.get_target_view_feats(x_input, spatial_volume, clip_embed, t_embed, v_embed, target_indices_)
if unconditional_scale!=1.0:
noise = self.model.model.predict_with_unconditional_scale(x_target_noisy_, time_steps_, clip_embed_, volume_feats_, x_concat_, unconditional_scale)
else:
noise = self.model.model(x_target_noisy_, time_steps_, clip_embed_, volume_feats_, x_concat_, is_train=False)
e_t.append(noise.view(B,VN,4,H,W))
e_t = torch.cat(e_t, 1)
x_prev = self.denoise_apply_impl(x_target_noisy, index, e_t, is_step0)
return x_prev
@torch.no_grad()
def sample(self, input_info, clip_embed, unconditional_scale=1.0, log_every_t=50, batch_view_num=1):
"""
@param input_info: x, elevation
@param clip_embed: B,M,768
@param unconditional_scale:
@param log_every_t:
@param batch_view_num:
@return:
"""
print(f"unconditional scale {unconditional_scale:.1f}")
C, H, W = 4, self.latent_size, self.latent_size
B = clip_embed.shape[0]
N = self.model.view_num
device = self.model.device
x_target_noisy = torch.randn([B, N, C, H, W], device=device)
timesteps = self.ddim_timesteps
intermediates = {'x_inter': []}
time_range = np.flip(timesteps)
total_steps = timesteps.shape[0]
iterator = tqdm(time_range, desc='DDIM Sampler', total=total_steps)
for i, step in enumerate(iterator):
index = total_steps - i - 1 # index into the DDIM schedule
time_steps = torch.full((B,), step, device=device, dtype=torch.long)
x_target_noisy = self.denoise_apply(x_target_noisy, input_info, clip_embed, time_steps, index, unconditional_scale, batch_view_num=batch_view_num, is_step0=index==0)
if index % log_every_t == 0 or index == total_steps - 1:
intermediates['x_inter'].append(x_target_noisy)
return x_target_noisy, intermediates
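The update in `denoise_apply_impl` is the standard DDIM step: estimate the clean latent `pred_x0` from the noisy latent and the predicted noise, then recombine with the previous-step alpha. A small sketch under toy schedule values (all numbers below are illustrative assumptions); with the true noise plugged in, the `x0` estimate is recovered exactly:

```python
import torch

# toy values standing in for ddim_alphas[index], ddim_alphas_prev[index], ddim_sigmas[index]
a_t, a_prev, sigma_t = torch.tensor(0.5), torch.tensor(0.7), torch.tensor(0.0)
x0 = torch.randn(1, 4, 8, 8)
eps = torch.randn_like(x0)
x_t = a_t.sqrt() * x0 + (1 - a_t).sqrt() * eps  # forward noising, as in add_noise

# DDIM step: recover the x0 estimate, then move toward the previous timestep
pred_x0 = (x_t - (1 - a_t).sqrt() * eps) / a_t.sqrt()
dir_xt = (1. - a_prev - sigma_t ** 2).clamp(min=1e-7).sqrt() * eps
x_prev = a_prev.sqrt() * pred_x0 + dir_xt

assert torch.allclose(pred_x0, x0, atol=1e-5)  # with the true eps, x0 is recovered
```

With `sigma_t = 0` (eta = 0) the step is fully deterministic; the sampler above adds `sigma_t * randn` noise on all steps except the last, matching the `is_step0` branch.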
================================================
FILE: ldm/models/diffusion/sync_dreamer_attention.py
================================================
import torch
import torch.nn as nn
from ldm.modules.attention import default, zero_module, checkpoint
from ldm.modules.diffusionmodules.openaimodel import UNetModel
from ldm.modules.diffusionmodules.util import timestep_embedding
class DepthAttention(nn.Module):
def __init__(self, query_dim, context_dim, heads, dim_head, output_bias=True):
super().__init__()
inner_dim = dim_head * heads
context_dim = default(context_dim, query_dim)
self.scale = dim_head ** -0.5
self.heads = heads
self.dim_head = dim_head
self.to_q = nn.Conv2d(query_dim, inner_dim, 1, 1, bias=False)
self.to_k = nn.Conv3d(context_dim, inner_dim, 1, 1, bias=False)
self.to_v = nn.Conv3d(context_dim, inner_dim, 1, 1, bias=False)
if output_bias:
self.to_out = nn.Conv2d(inner_dim, query_dim, 1, 1)
else:
self.to_out = nn.Conv2d(inner_dim, query_dim, 1, 1, bias=False)
def forward(self, x, context):
"""
@param x: b,f0,h,w
@param context: b,f1,d,h,w
@return:
"""
hn, hd = self.heads, self.dim_head
b, _, h, w = x.shape
b, _, d, h, w = context.shape
q = self.to_q(x).reshape(b,hn,hd,h,w) # b,hn,hd,h,w
k = self.to_k(context).reshape(b,hn,hd,d,h,w) # b,hn,hd,d,h,w
v = self.to_v(context).reshape(b,hn,hd,d,h,w) # b,hn,hd,d,h,w
sim = torch.sum(q.unsqueeze(3) * k, 2) * self.scale # b,hn,d,h,w
attn = sim.softmax(dim=2)
# b,hn,hd,d,h,w * b,hn,1,d,h,w
out = torch.sum(v * attn.unsqueeze(2), 3) # b,hn,hd,h,w
out = out.reshape(b,hn*hd,h,w)
return self.to_out(out)
class DepthTransformer(nn.Module):
def __init__(self, dim, n_heads, d_head, context_dim=None, checkpoint=True):
super().__init__()
inner_dim = n_heads * d_head
self.proj_in = nn.Sequential(
nn.Conv2d(dim, inner_dim, 1, 1),
nn.GroupNorm(8, inner_dim),
nn.SiLU(True),
)
self.proj_context = nn.Sequential(
nn.Conv3d(context_dim, context_dim, 1, 1, bias=False), # no bias
nn.GroupNorm(8, context_dim),
nn.ReLU(True), # ReLU only, so that a zero input yields a zero output
)
self.depth_attn = DepthAttention(query_dim=inner_dim, heads=n_heads, dim_head=d_head, context_dim=context_dim, output_bias=False) # cross-attention from 2D features to the frustum volume
self.proj_out = nn.Sequential(
nn.GroupNorm(8, inner_dim),
nn.ReLU(True),
nn.Conv2d(inner_dim, inner_dim, 3, 1, 1, bias=False),
nn.GroupNorm(8, inner_dim),
nn.ReLU(True),
zero_module(nn.Conv2d(inner_dim, dim, 3, 1, 1, bias=False)),
)
self.checkpoint = checkpoint
def forward(self, x, context=None):
return checkpoint(self._forward, (x, context), self.parameters(), self.checkpoint)
def _forward(self, x, context):
x_in = x
x = self.proj_in(x)
context = self.proj_context(context)
x = self.depth_attn(x, context)
x = self.proj_out(x) + x_in
return x
class DepthWiseAttention(UNetModel):
def __init__(self, volume_dims=(5,16,32,64), *args, **kwargs):
super().__init__(*args, **kwargs)
# num_heads = 4
model_channels = kwargs['model_channels']
channel_mult = kwargs['channel_mult']
d0,d1,d2,d3 = volume_dims
# 4
ch = model_channels*channel_mult[2]
self.middle_conditions = DepthTransformer(ch, 4, d3 // 2, context_dim=d3)
self.output_conditions=nn.ModuleList()
self.output_b2c = {3:0,4:1,5:2,6:3,7:4,8:5,9:6,10:7,11:8}
# 8
ch = model_channels*channel_mult[2]
self.output_conditions.append(DepthTransformer(ch, 4, d2 // 2, context_dim=d2)) # 0
self.output_conditions.append(DepthTransformer(ch, 4, d2 // 2, context_dim=d2)) # 1
# 16
self.output_conditions.append(DepthTransformer(ch, 4, d1 // 2, context_dim=d1)) # 2
ch = model_channels*channel_mult[1]
self.output_conditions.append(DepthTransformer(ch, 4, d1 // 2, context_dim=d1)) # 3
self.output_conditions.append(DepthTransformer(ch, 4, d1 // 2, context_dim=d1)) # 4
# 32
self.output_conditions.append(DepthTransformer(ch, 4, d0 // 2, context_dim=d0)) # 5
ch = model_channels*channel_mult[0]
self.output_conditions.append(DepthTransformer(ch, 4, d0 // 2, context_dim=d0)) # 6
self.output_conditions.append(DepthTransformer(ch, 4, d0 // 2, context_dim=d0)) # 7
self.output_conditions.append(DepthTransformer(ch, 4, d0 // 2, context_dim=d0)) # 8
def forward(self, x, timesteps=None, context=None, source_dict=None, **kwargs):
hs = []
t_emb = timestep_embedding(timesteps, self.model_channels, repeat_only=False)
emb = self.time_embed(t_emb)
h = x.type(self.dtype)
for index, module in enumerate(self.input_blocks):
h = module(h, emb, context)
hs.append(h)
h = self.middle_block(h, emb, context)
h = self.middle_conditions(h, context=source_dict[h.shape[-1]])
for index, module in enumerate(self.output_blocks):
h = torch.cat([h, hs.pop()], dim=1)
h = module(h, emb, context)
if index in self.output_b2c:
layer = self.output_conditions[self.output_b2c[index]]
h = layer(h, context=source_dict[h.shape[-1]])
h = h.type(x.dtype)
return self.out(h)
def get_trainable_parameters(self):
        paras = list(self.middle_conditions.parameters()) + list(self.output_conditions.parameters())
return paras
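The `output_b2c` table above maps decoder (output) block indices 3..11 onto the nine `DepthTransformer` conditioners 0..8: the first three output blocks are left unconditioned, and every later index is simply shifted down by 3. A quick pure-Python sanity check of that mapping (a sketch, not part of the model):

```python
# Same table as DepthWiseAttention.output_b2c.
output_b2c = {3: 0, 4: 1, 5: 2, 6: 3, 7: 4, 8: 5, 9: 6, 10: 7, 11: 8}

# The table is exactly "output block index minus 3" for blocks 3..11,
# so conditioning skips the first three output blocks.
assert output_b2c == {i: i - 3 for i in range(3, 12)}
print(len(output_b2c))  # 9 conditioners
```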
================================================
FILE: ldm/models/diffusion/sync_dreamer_network.py
================================================
import torch
import torch.nn as nn
from ldm.modules.attention import default, zero_module, checkpoint
from ldm.modules.diffusionmodules.util import conv_nd
class Image2DResBlockWithTV(nn.Module):
def __init__(self, dim, tdim, vdim):
super().__init__()
norm = lambda c: nn.GroupNorm(8, c)
self.time_embed = nn.Conv2d(tdim, dim, 1, 1)
self.view_embed = nn.Conv2d(vdim, dim, 1, 1)
self.conv = nn.Sequential(
norm(dim),
nn.SiLU(True),
nn.Conv2d(dim, dim, 3, 1, 1),
norm(dim),
nn.SiLU(True),
nn.Conv2d(dim, dim, 3, 1, 1),
)
def forward(self, x, t, v):
return x+self.conv(x+self.time_embed(t)+self.view_embed(v))
class NoisyTargetViewEncoder(nn.Module):
def __init__(self, time_embed_dim, viewpoint_dim, run_dim=16, output_dim=8):
super().__init__()
self.init_conv = nn.Conv2d(4, run_dim, 3, 1, 1)
self.out_conv0 = Image2DResBlockWithTV(run_dim, time_embed_dim, viewpoint_dim)
self.out_conv1 = Image2DResBlockWithTV(run_dim, time_embed_dim, viewpoint_dim)
self.out_conv2 = Image2DResBlockWithTV(run_dim, time_embed_dim, viewpoint_dim)
self.final_out = nn.Sequential(
nn.GroupNorm(8, run_dim),
nn.SiLU(True),
nn.Conv2d(run_dim, output_dim, 3, 1, 1)
)
def forward(self, x, t, v):
B, DT = t.shape
t = t.view(B, DT, 1, 1)
B, DV = v.shape
v = v.view(B, DV, 1, 1)
x = self.init_conv(x)
x = self.out_conv0(x, t, v)
x = self.out_conv1(x, t, v)
x = self.out_conv2(x, t, v)
x = self.final_out(x)
return x
class SpatialUpTimeBlock(nn.Module):
def __init__(self, x_in_dim, t_in_dim, out_dim):
super().__init__()
norm_act = lambda c: nn.GroupNorm(8, c)
self.t_conv = nn.Conv3d(t_in_dim, x_in_dim, 1, 1) # 16
self.norm = norm_act(x_in_dim)
self.silu = nn.SiLU(True)
self.conv = nn.ConvTranspose3d(x_in_dim, out_dim, kernel_size=3, padding=1, output_padding=1, stride=2)
def forward(self, x, t):
x = x + self.t_conv(t)
return self.conv(self.silu(self.norm(x)))
class SpatialTimeBlock(nn.Module):
def __init__(self, x_in_dim, t_in_dim, out_dim, stride):
super().__init__()
norm_act = lambda c: nn.GroupNorm(8, c)
self.t_conv = nn.Conv3d(t_in_dim, x_in_dim, 1, 1) # 16
self.bn = norm_act(x_in_dim)
self.silu = nn.SiLU(True)
self.conv = nn.Conv3d(x_in_dim, out_dim, 3, stride=stride, padding=1)
def forward(self, x, t):
x = x + self.t_conv(t)
return self.conv(self.silu(self.bn(x)))
class SpatialTime3DNet(nn.Module):
def __init__(self, time_dim=256, input_dim=128, dims=(32, 64, 128, 256)):
super().__init__()
d0, d1, d2, d3 = dims
dt = time_dim
self.init_conv = nn.Conv3d(input_dim, d0, 3, 1, 1) # 32
self.conv0 = SpatialTimeBlock(d0, dt, d0, stride=1)
self.conv1 = SpatialTimeBlock(d0, dt, d1, stride=2)
self.conv2_0 = SpatialTimeBlock(d1, dt, d1, stride=1)
self.conv2_1 = SpatialTimeBlock(d1, dt, d1, stride=1)
self.conv3 = SpatialTimeBlock(d1, dt, d2, stride=2)
self.conv4_0 = SpatialTimeBlock(d2, dt, d2, stride=1)
self.conv4_1 = SpatialTimeBlock(d2, dt, d2, stride=1)
self.conv5 = SpatialTimeBlock(d2, dt, d3, stride=2)
self.conv6_0 = SpatialTimeBlock(d3, dt, d3, stride=1)
self.conv6_1 = SpatialTimeBlock(d3, dt, d3, stride=1)
self.conv7 = SpatialUpTimeBlock(d3, dt, d2)
self.conv8 = SpatialUpTimeBlock(d2, dt, d1)
self.conv9 = SpatialUpTimeBlock(d1, dt, d0)
def forward(self, x, t, control=None):
B, C = t.shape
t = t.view(B, C, 1, 1, 1)
x = self.init_conv(x)
conv0 = self.conv0(x, t)
x = self.conv1(conv0, t)
x = self.conv2_0(x, t)
conv2 = self.conv2_1(x, t)
x = self.conv3(conv2, t)
x = self.conv4_0(x, t)
conv4 = self.conv4_1(x, t)
x = self.conv5(conv4, t)
x = self.conv6_0(x, t)
x = self.conv6_1(x, t)
if control is not None:
x += control.pop()
if control is not None:
conv4 += control.pop()
x = conv4 + self.conv7(x, t)
if control is not None:
conv2 += control.pop()
x = conv2 + self.conv8(x, t)
if control is not None:
conv0 += control.pop()
x = conv0 + self.conv9(x, t)
return x
class ControlSpatialTime3DNet(nn.Module):
def __init__(self, time_dim=256, input_dim=128, proxy_input_dim=3, dims=(32, 64, 128, 256)):
super().__init__()
d0, d1, d2, d3 = dims
dt = time_dim
self.ctrl_strength = 1.0
self.proxy_proj_in = nn.Sequential(
nn.Conv3d(proxy_input_dim, d0//2, 3, 1, 1),
nn.GroupNorm(8, d0//2),
nn.SiLU(True),
zero_module(nn.Conv3d(d0//2, d0, 3, 1, 1)),
)
self.zero_convs = nn.ModuleList()
self.init_conv = nn.Conv3d(input_dim, d0, 3, 1, 1) # 32
self.conv0 = SpatialTimeBlock(d0, dt, d0, stride=1)
self.zero_convs.append(self.make_zero_conv(3, d0))
self.conv1 = SpatialTimeBlock(d0, dt, d1, stride=2)
self.conv2_0 = SpatialTimeBlock(d1, dt, d1, stride=1)
self.conv2_1 = SpatialTimeBlock(d1, dt, d1, stride=1)
self.zero_convs.append(self.make_zero_conv(3, d1))
self.conv3 = SpatialTimeBlock(d1, dt, d2, stride=2)
self.conv4_0 = SpatialTimeBlock(d2, dt, d2, stride=1)
self.conv4_1 = SpatialTimeBlock(d2, dt, d2, stride=1)
self.zero_convs.append(self.make_zero_conv(3, d2))
self.conv5 = SpatialTimeBlock(d2, dt, d3, stride=2)
self.conv6_0 = SpatialTimeBlock(d3, dt, d3, stride=1)
self.conv6_1 = SpatialTimeBlock(d3, dt, d3, stride=1)
self.zero_convs.append(self.make_zero_conv(3, d3))
def forward(self, x, t, proxy_feature):
B, C = t.shape
t = t.view(B, C, 1, 1, 1)
outs = []
x = self.init_conv(x)
hint = self.proxy_proj_in(proxy_feature)
x = x + hint
conv0 = self.conv0(x, t)
outs.append(self.zero_convs[0](conv0))
x = self.conv1(conv0, t)
x = self.conv2_0(x, t)
conv2 = self.conv2_1(x, t)
outs.append(self.zero_convs[1](conv2))
x = self.conv3(conv2, t)
x = self.conv4_0(x, t)
conv4 = self.conv4_1(x, t)
outs.append(self.zero_convs[2](conv4))
x = self.conv5(conv4, t)
x = self.conv6_0(x, t)
x = self.conv6_1(x, t)
outs.append(self.zero_convs[3](x))
return [o*self.ctrl_strength for o in outs]
def make_zero_conv(self, dims, channels):
return zero_module(conv_nd(dims, channels, channels, 1, padding=0))
class FrustumTVBlock(nn.Module):
def __init__(self, x_dim, t_dim, v_dim, out_dim, stride):
super().__init__()
norm_act = lambda c: nn.GroupNorm(8, c)
self.t_conv = nn.Conv3d(t_dim, x_dim, 1, 1) # 16
self.v_conv = nn.Conv3d(v_dim, x_dim, 1, 1) # 16
self.bn = norm_act(x_dim)
self.silu = nn.SiLU(True)
self.conv = nn.Conv3d(x_dim, out_dim, 3, stride=stride, padding=1)
def forward(self, x, t, v):
x = x + self.t_conv(t) + self.v_conv(v)
return self.conv(self.silu(self.bn(x)))
class FrustumTVUpBlock(nn.Module):
def __init__(self, x_dim, t_dim, v_dim, out_dim):
super().__init__()
norm_act = lambda c: nn.GroupNorm(8, c)
self.t_conv = nn.Conv3d(t_dim, x_dim, 1, 1) # 16
self.v_conv = nn.Conv3d(v_dim, x_dim, 1, 1) # 16
self.norm = norm_act(x_dim)
self.silu = nn.SiLU(True)
self.conv = nn.ConvTranspose3d(x_dim, out_dim, kernel_size=3, padding=1, output_padding=1, stride=2)
def forward(self, x, t, v):
x = x + self.t_conv(t) + self.v_conv(v)
return self.conv(self.silu(self.norm(x)))
class FrustumTV3DNet(nn.Module):
def __init__(self, in_dim, t_dim, v_dim, dims=(32, 64, 128, 256)):
super().__init__()
self.conv0 = nn.Conv3d(in_dim, dims[0], 3, 1, 1) # 32
self.conv1 = FrustumTVBlock(dims[0], t_dim, v_dim, dims[1], 2)
self.conv2 = FrustumTVBlock(dims[1], t_dim, v_dim, dims[1], 1)
self.conv3 = FrustumTVBlock(dims[1], t_dim, v_dim, dims[2], 2)
self.conv4 = FrustumTVBlock(dims[2], t_dim, v_dim, dims[2], 1)
self.conv5 = FrustumTVBlock(dims[2], t_dim, v_dim, dims[3], 2)
self.conv6 = FrustumTVBlock(dims[3], t_dim, v_dim, dims[3], 1)
self.up0 = FrustumTVUpBlock(dims[3], t_dim, v_dim, dims[2])
self.up1 = FrustumTVUpBlock(dims[2], t_dim, v_dim, dims[1])
self.up2 = FrustumTVUpBlock(dims[1], t_dim, v_dim, dims[0])
def forward(self, x, t, v):
B,DT = t.shape
t = t.view(B,DT,1,1,1)
B,DV = v.shape
v = v.view(B,DV,1,1,1)
b, _, d, h, w = x.shape
x0 = self.conv0(x)
x1 = self.conv2(self.conv1(x0, t, v), t, v)
x2 = self.conv4(self.conv3(x1, t, v), t, v)
x3 = self.conv6(self.conv5(x2, t, v), t, v)
x2 = self.up0(x3, t, v) + x2
x1 = self.up1(x2, t, v) + x1
x0 = self.up2(x1, t, v) + x0
return {w: x0, w//2: x1, w//4: x2, w//8: x3}
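`FrustumTV3DNet` returns its feature pyramid as a dict keyed by spatial width (`w`, `w//2`, `w//4`, `w//8`); `DepthWiseAttention` then fetches the matching level via `source_dict[h.shape[-1]]`, i.e. by the width of the current UNet feature map. A minimal sketch of that resolution-keyed lookup (placeholder strings stand in for the feature tensors):

```python
def build_pyramid(w: int) -> dict:
    # Mimic FrustumTV3DNet's return value: one entry per level,
    # keyed by the spatial width of that level's feature map.
    return {w: "x0", w // 2: "x1", w // 4: "x2", w // 8: "x3"}

pyramid = build_pyramid(32)
# A 16-wide UNet feature map picks up the 16-wide volume features.
print(pyramid[16])       # x1
print(sorted(pyramid))   # [4, 8, 16, 32]
```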
================================================
FILE: ldm/models/diffusion/sync_dreamer_utils.py
================================================
import torch
from kornia import create_meshgrid
def project_and_normalize(ref_grid, src_proj, length):
"""
@param ref_grid: b 3 n
@param src_proj: b 4 4
@param length: int
@return: b, n, 2
"""
src_grid = src_proj[:, :3, :3] @ ref_grid + src_proj[:, :3, 3:] # b 3 n
div_val = src_grid[:, -1:]
div_val[div_val<1e-4] = 1e-4
src_grid = src_grid[:, :2] / div_val # divide by depth (b, 2, n)
src_grid[:, 0] = src_grid[:, 0]/((length - 1) / 2) - 1 # scale to -1~1
src_grid[:, 1] = src_grid[:, 1]/((length - 1) / 2) - 1 # scale to -1~1
src_grid = src_grid.permute(0, 2, 1) # (b, n, 2)
return src_grid
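`project_and_normalize` projects points with a 4x4 matrix, divides by the (clamped) depth, then rescales pixel coordinates into the [-1, 1] range that `F.grid_sample` expects with `align_corners=True`: pixel 0 maps to -1 and pixel `length - 1` maps to +1. A pure-Python sketch of that rescaling step (`length=64` is an illustrative value):

```python
def normalize_pixel(u: float, length: int) -> float:
    # Same formula as the tensor code: u / ((length - 1) / 2) - 1
    return u / ((length - 1) / 2.0) - 1.0

length = 64
print(normalize_pixel(0.0, length))    # -1.0  (left/top edge)
print(normalize_pixel(63.0, length))   #  1.0  (right/bottom edge)
print(normalize_pixel(31.5, length))   #  0.0  (image center)
```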
def project_(ref_grid, src_proj):
    """
    @param ref_grid: b 3 n
    @param src_proj: b 4 4
    @return: b, n, 2
    """
src_grid = src_proj[:, :3, :3] @ ref_grid + src_proj[:, :3, 3:] # b 3 n
div_val = src_grid[:, -1:]
div_val[div_val<1e-4] = 1e-4
src_grid = src_grid[:, :2] / div_val # divide by depth (b, 2, n)
src_grid = src_grid.permute(0, 2, 1) # (b, n, 2)
return src_grid
def construct_project_matrix(x_ratio, y_ratio, Ks, poses):
"""
@param x_ratio: float
@param y_ratio: float
@param Ks: b,3,3
@param poses: b,3,4
@return:
"""
rfn = Ks.shape[0]
scale_m = torch.tensor([x_ratio, y_ratio, 1.0], dtype=torch.float32, device=Ks.device)
scale_m = torch.diag(scale_m)
ref_prj = scale_m[None, :, :] @ Ks @ poses # rfn,3,4
pad_vals = torch.zeros([rfn, 1, 4], dtype=torch.float32, device=ref_prj.device)
pad_vals[:, :, 3] = 1.0
ref_prj = torch.cat([ref_prj, pad_vals], 1) # rfn,4,4
return ref_prj
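`construct_project_matrix` composes `scale @ K @ pose` into a 3x4 projection and pads a `[0, 0, 0, 1]` row so the result is an invertible 4x4 (which `create_target_volume` later inverts to unproject). A pure-Python check of that composition on toy values (the intrinsics and pose below are made up for illustration):

```python
def matmul(A, B):
    # Naive row-by-column matrix multiply for small matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Toy intrinsics, a w2c pose with t = (0, 0, 2), and half-resolution ratios.
K = [[100.0, 0.0, 32.0], [0.0, 100.0, 32.0], [0.0, 0.0, 1.0]]
pose = [[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 2.0]]
scale = [[0.5, 0.0, 0.0], [0.0, 0.5, 0.0], [0.0, 0.0, 1.0]]

proj = matmul(matmul(scale, K), pose)   # 3x4 projection
proj.append([0.0, 0.0, 0.0, 1.0])       # pad to an invertible 4x4
print(proj[0])  # [50.0, 0.0, 16.0, 32.0]
print(proj[3])  # [0.0, 0.0, 0.0, 1.0]
```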
def get_warp_coordinates(volume_xyz, warp_size, input_size, Ks, warp_pose):
B, _, D, H, W = volume_xyz.shape
ratio = warp_size / input_size
warp_proj = construct_project_matrix(ratio, ratio, Ks, warp_pose) # B,4,4
warp_coords = project_and_normalize(volume_xyz.view(B,3,D*H*W), warp_proj, warp_size).view(B, D, H, W, 2)
return warp_coords
def get_proxy_warp_coordinates(proxy_xyz, warp_size, input_size, Ks, warp_pose):
B, num_proxy, _ = proxy_xyz.shape
ratio = warp_size / input_size
warp_proj = construct_project_matrix(ratio, ratio, Ks, warp_pose) # B,4,4
warp_coords = project_(proxy_xyz.permute(0, 2, 1), warp_proj)
return warp_coords
def create_target_volume(depth_size, volume_size, input_image_size, pose_target, K, near=None, far=None):
device, dtype = pose_target.device, pose_target.dtype
# compute a depth range on the unit sphere
H, W, D, B = volume_size, volume_size, depth_size, pose_target.shape[0]
if near is not None and far is not None :
# near, far b,1,h,w
depth_values = torch.linspace(0, 1, steps=depth_size).to(near.device).to(near.dtype) # d
depth_values = depth_values.view(1, D, 1, 1) # 1,d,1,1
depth_values = depth_values * (far - near) + near # b d h w
depth_values = depth_values.view(B, 1, D, H * W)
else:
near, far = near_far_from_unit_sphere_using_camera_poses(pose_target) # b 1
depth_values = torch.linspace(0, 1, steps=depth_size).to(near.device).to(near.dtype) # d
depth_values = depth_values[None,:,None] * (far[:,None,:] - near[:,None,:]) + near[:,None,:] # b d 1
depth_values = depth_values.view(B, 1, D, 1).expand(B, 1, D, H*W)
ratio = volume_size / input_image_size
    # create a grid on the target (reference) view
    # H, W, D, B = volume_size, volume_size, depth_values.shape[1], depth_values.shape[0]
    # create a mesh grid; note "reference" here also means "target"
ref_grid = create_meshgrid(H, W, normalized_coordinates=False) # (1, H, W, 2)
ref_grid = ref_grid.to(device).to(dtype)
ref_grid = ref_grid.permute(0, 3, 1, 2) # (1, 2, H, W)
ref_grid = ref_grid.reshape(1, 2, H*W) # (1, 2, H*W)
ref_grid = ref_grid.expand(B, -1, -1) # (B, 2, H*W)
ref_grid = torch.cat((ref_grid, torch.ones(B, 1, H*W, dtype=ref_grid.dtype, device=ref_grid.device)), dim=1) # (B, 3, H*W)
ref_grid = ref_grid.unsqueeze(2) * depth_values # (B, 3, D, H*W)
# unproject to space and transfer to world coordinates.
Ks = K
ref_proj = construct_project_matrix(ratio, ratio, Ks, pose_target) # B,4,4
ref_proj_inv = torch.inverse(ref_proj) # B,4,4
ref_grid = ref_proj_inv[:,:3,:3] @ ref_grid.view(B,3,D*H*W) + ref_proj_inv[:,:3,3:] # B,3,3 @ B,3,DHW + B,3,1 => B,3,DHW
return ref_grid.reshape(B,3,D,H,W), depth_values.view(B,1,D,H,W)
def near_far_from_unit_sphere_using_camera_poses(camera_poses):
"""
@param camera_poses: b 3 4
@return:
near: b,1
far: b,1
"""
R_w2c = camera_poses[..., :3, :3] # b 3 3
t_w2c = camera_poses[..., :3, 3:] # b 3 1
camera_origin = -R_w2c.permute(0,2,1) @ t_w2c # b 3 1
# R_w2c.T @ (0,0,1) = z_dir
camera_orient = R_w2c.permute(0,2,1)[...,:3,2:3] # b 3 1
camera_origin, camera_orient = camera_origin[...,0], camera_orient[..., 0] # b 3
a = torch.sum(camera_orient ** 2, dim=-1, keepdim=True) # b 1
b = -torch.sum(camera_orient * camera_origin, dim=-1, keepdim=True) # b 1
mid = b / a # b 1
near, far = mid - 1.0, mid + 1.0
return near, far
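`near_far_from_unit_sphere_using_camera_poses` solves, along the viewing ray `o + s*d`, for the parameter `mid = -(d . o) / (d . d)` at which the ray passes closest to the origin, then brackets the unit sphere with `near = mid - 1`, `far = mid + 1`. A pure-Python check with an identity-rotation camera placed two units behind the origin:

```python
def near_far_unit_sphere(origin, orient):
    # mid is the ray parameter of the closest approach to the origin:
    # minimize |o + s*d|^2  ->  s = -(d . o) / (d . d)
    a = sum(di * di for di in orient)
    b = -sum(di * oi for di, oi in zip(orient, origin))
    mid = b / a
    return mid - 1.0, mid + 1.0

# w2c pose with R = I, t = (0, 0, 2): camera origin = -R^T t = (0, 0, -2),
# viewing direction = R^T (0, 0, 1) = (0, 0, 1), looking at the origin.
near, far = near_far_unit_sphere([0.0, 0.0, -2.0], [0.0, 0.0, 1.0])
print(near, far)  # 1.0 3.0
```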
================================================
FILE: ldm/modules/attention.py
================================================
from inspect import isfunction
import math
import torch
import torch.nn.functional as F
from torch import nn, einsum
from einops import rearrange, repeat
from ldm.modules.diffusionmodules.util import checkpoint
import xformers
import xformers.ops
def exists(val):
return val is not None
def uniq(arr):
    return {el: True for el in arr}.keys()
def default(val, d):
if exists(val):
return val
return d() if isfunction(d) else d
def max_neg_value(t):
return -torch.finfo(t.dtype).max
def init_(tensor):
dim = tensor.shape[-1]
std = 1 / math.sqrt(dim)
tensor.uniform_(-std, std)
return tensor
# feedforward
class GEGLU(nn.Module):
def __init__(self, dim_in, dim_out):
super().__init__()
self.proj = nn.Linear(dim_in, dim_out * 2)
def forward(self, x):
x, gate = self.proj(x).chunk(2, dim=-1)
return x * F.gelu(gate)
# feedforward
class ConvGEGLU(nn.Module):
def __init__(self, dim_in, dim_out):
super().__init__()
self.proj = nn.Conv2d(dim_in, dim_out * 2, 1, 1, 0)
def forward(self, x):
x, gate = self.proj(x).chunk(2, dim=1)
return x * F.gelu(gate)
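Both `GEGLU` variants above implement the same gating rule, `out = value * GELU(gate)`: the projection doubles the channel count, and the result is split into a linear value branch and a GELU-squashed gate. A pure-Python sketch of the gate math using the exact-erf GELU (the default of `nn.GELU`):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: 0.5 * x * (1 + erf(x / sqrt(2)))
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

def geglu(value: float, gate: float) -> float:
    # GEGLU keeps the value branch linear and squashes only the gate.
    return value * gelu(gate)

# A closed gate (gate == 0) zeroes the output regardless of the value.
print(geglu(2.0, 0.0))            # 0.0
print(round(geglu(2.0, 1.0), 4))  # 2 * gelu(1) ~ 1.6827
```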
class FeedForward(nn.Module):
def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.):
super().__init__()
inner_dim = int(dim * mult)
dim_out = default(dim_out, dim)
project_in = nn.Sequential(
nn.Linear(dim, inner_dim),
nn.GELU()
) if not glu else GEGLU(dim, inner_dim)
self.net = nn.Sequential(
project_in,
nn.Dropout(dropout),
nn.Linear(inner_dim, dim_out)
)
def forward(self, x):
return self.net(x)
def zero_module(module):
"""
Zero out the parameters of a module and return it.
"""
for p in module.parameters():
p.detach().zero_()
return module
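`zero_module` is the ControlNet-style initialization used throughout this repo (e.g. the final conv of `DepthTransformer.proj_out` and the zero convs in `ControlSpatialTime3DNet`): zeroing the last layer makes the whole residual branch output zero at initialization, so the block starts as an identity mapping and training departs smoothly from the base model. A toy sketch of why:

```python
def zero_branch(x):
    # Stand-in for a branch whose final layer is zero-initialized:
    # whatever features precede it, the branch outputs 0.
    return [0.0 for _ in x]

def residual_block(x):
    # y = x + branch(x); with a zeroed branch this is exactly identity.
    return [xi + bi for xi, bi in zip(x, zero_branch(x))]

print(residual_block([1.0, -2.0, 3.5]))  # [1.0, -2.0, 3.5]
```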
def Normalize(in_channels):
return torch.nn.GroupNorm(num_groups=32, num_channels=in_channels, eps=1e-6, affine=True)
class LinearAttention(nn.Module):
def __init__(self, dim, heads=4, dim_head=32):
super().__init__()
self.heads = heads
hidden_dim = dim_head * heads
self.to_qkv = nn.Conv2d(dim, hidden_dim * 3, 1, bias = False)
self.to_out = nn.Conv2d(hidden_dim, dim, 1)
def forward(self, x):
b, c, h, w = x.shape
qkv = self.to_qkv(x)
q, k, v = rearrange(qkv, 'b (qkv heads c) h w -> qkv b heads c (h w)', heads = self.heads, qkv=3)
k = k.softmax(dim=-1)
context = torch.einsum('bhdn,bhen->bhde', k, v)
out = torch.einsum('bhde,bhdn->bhen', context, q)
out = rearrange(out, 'b heads c (h w) -> b (heads c) h w', heads=self.heads, h=h, w=w)
return self.to_out(out)
class SpatialSelfAttention(nn.Module):
def __init__(self, in_channels):
super().__init__()
self.in_channels = in_channels
self.norm = Normalize(in_channels)
self.q = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
self.k = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
self.v = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
self.proj_out = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
def forward(self, x):
h_ = x
h_ = self.norm(h_)
q = self.q(h_)
k = self.k(h_)
v = self.v(h_)
# compute attention
b,c,h,w = q.shape
q = rearrange(q, 'b c h w -> b (h w) c')
k = rearrange(k, 'b c h w -> b c (h w)')
w_ = torch.einsum('bij,bjk->bik', q, k)
w_ = w_ * (int(c)**(-0.5))
w_ = torch.nn.functional.softmax(w_, dim=2)
# attend to values
v = rearrange(v, 'b c h w -> b c (h w)')
w_ = rearrange(w_, 'b i j -> b j i')
h_ = torch.einsum('bij,bjk->bik', v, w_)
h_ = rearrange(h_, 'b c (h w) -> b c h w', h=h)
h_ = self.proj_out(h_)
return x+h_
class CrossAttention(nn.Module):
def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64, dropout=0.):
super().__init__()
inner_dim = dim_head * heads
context_dim = default(context_dim, query_dim)
self.scale = dim_head ** -0.5
self.heads = heads
self.to_q = nn.Linear(query_dim, inner_dim, bias=False)
self.to_k = nn.Linear(context_dim, inner_dim, bias=False)
self.to_v = nn.Linear(context_dim, inner_dim, bias=False)
self.to_out = nn.Sequential(
nn.Linear(inner_dim, query_dim),
nn.Dropout(dropout)
)
def forward(self, x, context=None, mask=None):
h = self.heads
q = self.to_q(x)
context = default(context, x)
k = self.to_k(context)
v = self.to_v(context)
q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> (b h) n d', h=h), (q, k, v))
out = xformers.ops.memory_efficient_attention(
q, k, v, attn_bias=mask, scale=self.scale
)
# q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> (b h) n d', h=h), (q, k, v))
# sim = einsum('b i d, b j d -> b i j', q, k) * self.scale
# if exists(mask):
# mask = mask>0
# mask = rearrange(mask, 'b ... -> b (...)')
# max_neg_value = -torch.finfo(sim.dtype).max
# mask = repeat(mask, 'b j -> (b h) () j', h=h)
# sim.masked_fill_(~mask, max_neg_value)
# # attention, what we cannot get enough of
# attn = sim.softmax(dim=-1)
# out = einsum('b i j, b j d -> b i d', attn, v)
out = rearrange(out, '(b h) n d -> b n (h d)', h=h)
return self.to_out(out)
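`CrossAttention` delegates `softmax(Q K^T / sqrt(d)) V` to `xformers.ops.memory_efficient_attention`; the commented-out einsum block inside `forward` is the equivalent reference math. For intuition, here is a single-query, single-head pure-Python version of that computation on toy 2-d vectors:

```python
import math

def softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(q, ks, vs):
    # softmax(q . k / sqrt(d))-weighted sum of values: one query, one head.
    d = len(q)
    scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in ks]
    w = softmax(scores)
    return [sum(wi * v[j] for wi, v in zip(w, vs)) for j in range(len(vs[0]))]

# Identical keys -> uniform weights -> output is the mean of the values.
out = attention([1.0, 0.0], [[1.0, 1.0], [1.0, 1.0]], [[2.0, 0.0], [0.0, 2.0]])
print(out)  # [1.0, 1.0]
```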
class BasicSpatialTransformer(nn.Module):
def __init__(self, dim, n_heads, d_head, context_dim=None, checkpoint=True):
super().__init__()
inner_dim = n_heads * d_head
self.proj_in = nn.Sequential(
nn.GroupNorm(8, dim),
nn.Conv2d(dim, inner_dim, kernel_size=1, stride=1, padding=0),
nn.GroupNorm(8, inner_dim),
nn.ReLU(True),
)
        self.attn = CrossAttention(query_dim=inner_dim, heads=n_heads, dim_head=d_head, context_dim=context_dim)  # self-attention when context is None
self.out_conv = nn.Sequential(
nn.GroupNorm(8, inner_dim),
nn.ReLU(True),
nn.Conv2d(inner_dim, inner_dim, 1, 1),
)
self.proj_out = nn.Sequential(
nn.GroupNorm(8, inner_dim),
nn.ReLU(True),
zero_module(nn.Conv2d(inner_dim, dim, kernel_size=1, stride=1, padding=0)),
)
self.checkpoint = checkpoint
def forward(self, x, context=None):
return checkpoint(self._forward, (x, context), self.parameters(), self.checkpoint)
def _forward(self, x, context):
# input
b,_,h,w = x.shape
x_in = x
x = self.proj_in(x)
# attention
x = rearrange(x, 'b c h w -> b (h w) c').contiguous()
context = rearrange(context, 'b c h w -> b (h w) c').contiguous()
x = self.attn(x, context) + x
x = rearrange(x, 'b (h w) c -> b c h w', h=h, w=w).contiguous()
# output
x = self.out_conv(x) + x
x = self.proj_out(x) + x_in
return x
class BasicTransformerBlock(nn.Module):
def __init__(self, dim, n_heads, d_head, dropout=0., context_dim=None, gated_ff=True, checkpoint=True, disable_self_attn=False):
super().__init__()
self.disable_self_attn = disable_self_attn
self.attn1 = CrossAttention(query_dim=dim, heads=n_heads, dim_head=d_head, dropout=dropout,
context_dim=context_dim if self.disable_self_attn else None) # is a self-attention if not self.disable_self_attn
self.ff = FeedForward(dim, dropout=dropout, glu=gated_ff)
self.attn2 = CrossAttention(query_dim=dim, context_dim=context_dim,
heads=n_heads, dim_head=d_head, dropout=dropout) # is self-attn if context is none
self.norm1 = nn.LayerNorm(dim)
self.norm2 = nn.LayerNorm(dim)
self.norm3 = nn.LayerNorm(dim)
self.checkpoint = checkpoint
def forward(self, x, context=None):
return checkpoint(self._forward, (x, context), self.parameters(), self.checkpoint)
def _forward(self, x, context=None):
x = self.attn1(self.norm1(x), context=context if self.disable_self_attn else None) + x
x = self.attn2(self.norm2(x), context=context) + x
x = self.ff(self.norm3(x)) + x
return x
class ConvFeedForward(nn.Module):
def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.):
super().__init__()
inner_dim = int(dim * mult)
dim_out = default(dim_out, dim)
project_in = nn.Sequential(
nn.Conv2d(dim, inner_dim, 1, 1, 0),
nn.GELU()
) if not glu else ConvGEGLU(dim, inner_dim)
self.net = nn.Sequential(
project_in,
nn.Dropout(dropout),
nn.Conv2d(inner_dim, dim_out, 1, 1, 0)
)
def forward(self, x):
return self.net(x)
class SpatialTransformer(nn.Module):
    """
    Transformer block for image-like data.
    First, project the input (aka embedding) and reshape to (b, t, d).
    Then apply standard transformer blocks.
    Finally, reshape back to an image.
    """
def __init__(self, in_channels, n_heads, d_head,
depth=1, dropout=0., context_dim=None,
disable_self_attn=False):
super().__init__()
self.in_channels = in_channels
inner_dim = n_heads * d_head
self.norm = Normalize(in_channels)
self.proj_in = nn.Conv2d(in_channels,
inner_dim,
kernel_size=1,
stride=1,
padding=0)
self.transformer_blocks = nn.ModuleList(
[BasicTransformerBlock(inner_dim, n_heads, d_head, dropout=dropout, context_dim=context_dim,
disable_self_attn=disable_self_attn)
for d in range(depth)]
)
self.proj_out = zero_module(nn.Conv2d(inner_dim,
in_channels,
kernel_size=1,
stride=1,
padding=0))
def forward(self, x, context=None):
# note: if no context is given, cross-attention defaults to self-attention
b, c, h, w = x.shape
x_in = x
x = self.norm(x)
x = self.proj_in(x)
x = rearrange(x, 'b c h w -> b (h w) c').contiguous()
for block in self.transformer_blocks:
x = block(x, context=context)
x = rearrange(x, 'b (h w) c -> b c h w', h=h, w=w).contiguous()
x = self.proj_out(x)
return x + x_in
================================================
FILE: ldm/modules/diffusionmodules/__init__.py
================================================
================================================
FILE: ldm/modules/diffusionmodules/model.py
================================================
# pytorch_diffusion + derived encoder decoder
import math
import torch
import torch.nn as nn
import numpy as np
from einops import rearrange
from ldm.util import instantiate_from_config
from ldm.modules.attention import LinearAttention
def get_timestep_embedding(timesteps, embedding_dim):
    """
    Build sinusoidal timestep embeddings.
    This matches the implementation in Denoising Diffusion Probabilistic Models
    (taken from Fairseq/tensor2tensor), but differs slightly from the
    description in Section 3.5 of "Attention Is All You Need".
    """
assert len(timesteps.shape) == 1
half_dim = embedding_dim // 2
emb = math.log(10000) / (half_dim - 1)
emb = torch.exp(torch.arange(half_dim, dtype=torch.float32) * -emb)
emb = emb.to(device=timesteps.device)
emb = timesteps.float()[:, None] * emb[None, :]
emb = torch.cat([torch.sin(emb), torch.cos(emb)], dim=1)
if embedding_dim % 2 == 1: # zero pad
emb = torch.nn.functional.pad(emb, (0,1,0,0))
return emb
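The sinusoidal embedding above can be reproduced in pure Python for a single timestep: each of the `half_dim` frequencies decays geometrically from 1 down to 1/10000, and the sines and cosines of `t * freq` are concatenated. A minimal sketch:

```python
import math

def timestep_embedding(t: float, embedding_dim: int):
    # Mirrors get_timestep_embedding for a single scalar timestep.
    half_dim = embedding_dim // 2
    step = math.log(10000) / (half_dim - 1)
    freqs = [math.exp(-step * i) for i in range(half_dim)]
    args = [t * f for f in freqs]
    return [math.sin(a) for a in args] + [math.cos(a) for a in args]

emb = timestep_embedding(0.0, 8)
print(emb)  # t=0: all sines are 0, all cosines are 1
```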
def nonlinearity(x):
# swish
return x*torch.sigmoid(x)
def Normalize(in_channels, num_groups=32):
return torch.nn.GroupNorm(num_groups=num_groups, num_channels=in_channels, eps=1e-6, affine=True)
class Upsample(nn.Module):
def __init__(self, in_channels, with_conv):
super().__init__()
self.with_conv = with_conv
if self.with_conv:
self.conv = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=3,
stride=1,
padding=1)
def forward(self, x):
x = torch.nn.functional.interpolate(x, scale_factor=2.0, mode="nearest")
if self.with_conv:
x = self.conv(x)
return x
class Downsample(nn.Module):
def __init__(self, in_channels, with_conv):
super().__init__()
self.with_conv = with_conv
if self.with_conv:
# no asymmetric padding in torch conv, must do it ourselves
self.conv = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=3,
stride=2,
padding=0)
def forward(self, x):
if self.with_conv:
pad = (0,1,0,1)
x = torch.nn.functional.pad(x, pad, mode="constant", value=0)
x = self.conv(x)
else:
x = torch.nn.functional.avg_pool2d(x, kernel_size=2, stride=2)
return x
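`Downsample` pads one extra pixel on the right and bottom only (`pad = (0,1,0,1)`) before the stride-2, kernel-3 conv, emulating the asymmetric ("SAME"-style) padding that PyTorch's symmetric `padding=` argument cannot express. The output-size arithmetic, using the standard conv formula with the pad summed over both sides:

```python
def conv_out_len(h: int, k: int, s: int, pad_total: int) -> int:
    # Standard conv output length: floor((H + pad - k) / s) + 1
    return (h + pad_total - k) // s + 1

# kernel 3, stride 2: a single extra pixel of padding (right/bottom only)
# halves an even input exactly.
for h in (8, 16, 32):
    print(h, "->", conv_out_len(h, 3, 2, 1))  # 4, 8, 16
```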
class ResnetBlock(nn.Module):
def __init__(self, *, in_channels, out_channels=None, conv_shortcut=False,
dropout, temb_channels=512):
super().__init__()
self.in_channels = in_channels
out_channels = in_channels if out_channels is None else out_channels
self.out_channels = out_channels
self.use_conv_shortcut = conv_shortcut
self.norm1 = Normalize(in_channels)
self.conv1 = torch.nn.Conv2d(in_channels,
out_channels,
kernel_size=3,
stride=1,
padding=1)
if temb_channels > 0:
self.temb_proj = torch.nn.Linear(temb_channels,
out_channels)
self.norm2 = Normalize(out_channels)
self.dropout = torch.nn.Dropout(dropout)
self.conv2 = torch.nn.Conv2d(out_channels,
out_channels,
kernel_size=3,
stride=1,
padding=1)
if self.in_channels != self.out_channels:
if self.use_conv_shortcut:
self.conv_shortcut = torch.nn.Conv2d(in_channels,
out_channels,
kernel_size=3,
stride=1,
padding=1)
else:
self.nin_shortcut = torch.nn.Conv2d(in_channels,
out_channels,
kernel_size=1,
stride=1,
padding=0)
def forward(self, x, temb):
h = x
h = self.norm1(h)
h = nonlinearity(h)
h = self.conv1(h)
if temb is not None:
h = h + self.temb_proj(nonlinearity(temb))[:,:,None,None]
h = self.norm2(h)
h = nonlinearity(h)
h = self.dropout(h)
h = self.conv2(h)
if self.in_channels != self.out_channels:
if self.use_conv_shortcut:
x = self.conv_shortcut(x)
else:
x = self.nin_shortcut(x)
return x+h
class LinAttnBlock(LinearAttention):
"""to match AttnBlock usage"""
def __init__(self, in_channels):
super().__init__(dim=in_channels, heads=1, dim_head=in_channels)
class AttnBlock(nn.Module):
def __init__(self, in_channels):
super().__init__()
self.in_channels = in_channels
self.norm = Normalize(in_channels)
self.q = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
self.k = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
self.v = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
self.proj_out = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=1,
stride=1,
padding=0)
def forward(self, x):
h_ = x
h_ = self.norm(h_)
q = self.q(h_)
k = self.k(h_)
v = self.v(h_)
# compute attention
b,c,h,w = q.shape
q = q.reshape(b,c,h*w)
q = q.permute(0,2,1) # b,hw,c
k = k.reshape(b,c,h*w) # b,c,hw
w_ = torch.bmm(q,k) # b,hw,hw w[b,i,j]=sum_c q[b,i,c]k[b,c,j]
w_ = w_ * (int(c)**(-0.5))
w_ = torch.nn.functional.softmax(w_, dim=2)
# attend to values
v = v.reshape(b,c,h*w)
w_ = w_.permute(0,2,1) # b,hw,hw (first hw of k, second of q)
h_ = torch.bmm(v,w_) # b, c,hw (hw of q) h_[b,c,j] = sum_i v[b,c,i] w_[b,i,j]
h_ = h_.reshape(b,c,h,w)
h_ = self.proj_out(h_)
return x+h_
def make_attn(in_channels, attn_type="vanilla"):
assert attn_type in ["vanilla", "linear", "none"], f'attn_type {attn_type} unknown'
print(f"making attention of type '{attn_type}' with {in_channels} in_channels")
if attn_type == "vanilla":
return AttnBlock(in_channels)
elif attn_type == "none":
return nn.Identity(in_channels)
else:
return LinAttnBlock(in_channels)
class Model(nn.Module):
def __init__(self, *, ch, out_ch, ch_mult=(1,2,4,8), num_res_blocks,
attn_resolutions, dropout=0.0, resamp_with_conv=True, in_channels,
resolution, use_timestep=True, use_linear_attn=False, attn_type="vanilla"):
super().__init__()
if use_linear_attn: attn_type = "linear"
self.ch = ch
self.temb_ch = self.ch*4
self.num_resolutions = len(ch_mult)
self.num_res_blocks = num_res_blocks
self.resolution = resolution
self.in_channels = in_channels
self.use_timestep = use_timestep
if self.use_timestep:
# timestep embedding
self.temb = nn.Module()
self.temb.dense = nn.ModuleList([
torch.nn.Linear(self.ch,
self.temb_ch),
torch.nn.Linear(self.temb_ch,
self.temb_ch),
])
# downsampling
self.conv_in = torch.nn.Conv2d(in_channels,
self.ch,
kernel_size=3,
stride=1,
padding=1)
curr_res = resolution
in_ch_mult = (1,)+tuple(ch_mult)
self.down = nn.ModuleList()
for i_level in range(self.num_resolutions):
block = nn.ModuleList()
attn = nn.ModuleList()
block_in = ch*in_ch_mult[i_level]
block_out = ch*ch_mult[i_level]
for i_block in range(self.num_res_blocks):
block.append(ResnetBlock(in_channels=block_in,
out_channels=block_out,
temb_channels=self.temb_ch,
dropout=dropout))
block_in = block_out
if curr_res in attn_resolutions:
attn.append(make_attn(block_in, attn_type=attn_type))
down = nn.Module()
down.block = block
down.attn = attn
if i_level != self.num_resolutions-1:
down.downsample = Downsample(block_in, resamp_with_conv)
curr_res = curr_res // 2
self.down.append(down)
# middle
self.mid = nn.Module()
self.mid.block_1 = ResnetBlock(in_channels=block_in,
out_channels=block_in,
temb_channels=self.temb_ch,
dropout=dropout)
self.mid.attn_1 = make_attn(block_in, attn_type=attn_type)
self.mid.block_2 = ResnetBlock(in_channels=block_in,
out_channels=block_in,
temb_channels=self.temb_ch,
dropout=dropout)
# upsampling
self.up = nn.ModuleList()
for i_level in reversed(range(self.num_resolutions)):
block = nn.ModuleList()
attn = nn.ModuleList()
block_out = ch*ch_mult[i_level]
skip_in = ch*ch_mult[i_level]
for i_block in range(self.num_res_blocks+1):
if i_block == self.num_res_blocks:
skip_in = ch*in_ch_mult[i_level]
block.append(ResnetBlock(in_channels=block_in+skip_in,
out_channels=block_out,
temb_channels=self.temb_ch,
dropout=dropout))
block_in = block_out
if curr_res in attn_resolutions:
attn.append(make_attn(block_in, attn_type=attn_type))
up = nn.Module()
up.block = block
up.attn = attn
if i_level != 0:
up.upsample = Upsample(block_in, resamp_with_conv)
curr_res = curr_res * 2
self.up.insert(0, up) # prepend to get consistent order
# end
self.norm_out = Normalize(block_in)
self.conv_out = torch.nn.Conv2d(block_in,
out_ch,
kernel_size=3,
stride=1,
padding=1)
def forward(self, x, t=None, context=None):
#assert x.shape[2] == x.shape[3] == self.resolution
if context is not None:
# assume aligned context, cat along channel axis
x = torch.cat((x, context), dim=1)
if self.use_timestep:
# timestep embedding
assert t is not None
temb = get_timestep_embedding(t, self.ch)
temb = self.temb.dense[0](temb)
temb = nonlinearity(temb)
temb = self.temb.dense[1](temb)
else:
temb = None
# downsampling
hs = [self.conv_in(x)]
for i_level in range(self.num_resolutions):
for i_block in range(self.num_res_blocks):
h = self.down[i_level].block[i_block](hs[-1], temb)
if len(self.down[i_level].attn) > 0:
h = self.down[i_level].attn[i_block](h)
hs.append(h)
if i_level != self.num_resolutions-1:
hs.append(self.down[i_level].downsample(hs[-1]))
# middle
h = hs[-1]
h = self.mid.block_1(h, temb)
h = self.mid.attn_1(h)
h = self.mid.block_2(h, temb)
# upsampling
for i_level in reversed(range(self.num_resolutions)):
for i_block in range(self.num_res_blocks+1):
h = self.up[i_level].block[i_block](
torch.cat([h, hs.pop()], dim=1), temb)
if len(self.up[i_level].attn) > 0:
h = self.up[i_level].attn[i_block](h)
if i_level != 0:
h = self.up[i_level].upsample(h)
# end
h = self.norm_out(h)
h = nonlinearity(h)
h = self.conv_out(h)
return h
def get_last_layer(self):
return self.conv_out.weight
class Encoder(nn.Module):
def __init__(self, *, ch, out_ch, ch_mult=(1,2,4,8), num_res_blocks,
attn_resolutions, dropout=0.0, resamp_with_conv=True, in_channels,
resolution, z_channels, double_z=True, use_linear_attn=False, attn_type="vanilla",
**ignore_kwargs):
super().__init__()
if use_linear_attn: attn_type = "linear"
self.ch = ch
self.temb_ch = 0
self.num_resolutions = len(ch_mult)
self.num_res_blocks = num_res_blocks
self.resolution = resolution
self.in_channels = in_channels
# downsampling
self.conv_in = torch.nn.Conv2d(in_channels,
self.ch,
kernel_size=3,
stride=1,
padding=1)
curr_res = resolution
in_ch_mult = (1,)+tuple(ch_mult)
self.in_ch_mult = in_ch_mult
self.down = nn.ModuleList()
for i_level in range(self.num_resolutions):
block = nn.ModuleList()
attn = nn.ModuleList()
block_in = ch*in_ch_mult[i_level]
block_out = ch*ch_mult[i_level]
for i_block in range(self.num_res_blocks):
block.append(ResnetBlock(in_channels=block_in,
out_channels=block_out,
temb_channels=self.temb_ch,
dropout=dropout))
block_in = block_out
if curr_res in attn_resolutions:
attn.append(make_attn(block_in, attn_type=attn_type))
down = nn.Module()
down.block = block
down.attn = attn
if i_level != self.num_resolutions-1:
down.downsample = Downsample(block_in, resamp_with_conv)
curr_res = curr_res // 2
self.down.append(down)
# middle
self.mid = nn.Module()
self.mid.block_1 = ResnetBlock(in_channels=block_in,
out_channels=block_in,
temb_channels=self.temb_ch,
dropout=dropout)
self.mid.attn_1 = make_attn(block_in, attn_type=attn_type)
self.mid.block_2 = ResnetBlock(in_channels=block_in,
out_channels=block_in,
temb_channels=self.temb_ch,
dropout=dropout)
# end
self.norm_out = Normalize(block_in)
self.conv_out = torch.nn.Conv2d(block_in,
2*z_channels if double_z else z_channels,
kernel_size=3,
stride=1,
padding=1)
def forward(self, x):
# timestep embedding
temb = None
# downsampling
hs = [self.conv_in(x)]
for i_level in range(self.num_resolutions):
for i_block in range(self.num_res_blocks):
h = self.down[i_level].block[i_block](hs[-1], temb)
if len(self.down[i_level].attn) > 0:
h = self.down[i_level].attn[i_block](h)
hs.append(h)
if i_level != self.num_resolutions-1:
hs.append(self.down[i_level].downsample(hs[-1]))
# middle
h = hs[-1]
h = self.mid.block_1(h, temb)
h = self.mid.attn_1(h)
h = self.mid.block_2(h, temb)
# end
h = self.norm_out(h)
h = nonlinearity(h)
h = self.conv_out(h)
return h
class Decoder(nn.Module):
def __init__(self, *, ch, out_ch, ch_mult=(1,2,4,8), num_res_blocks,
attn_resolutions, dropout=0.0, resamp_with_conv=True, in_channels,
resolution, z_channels, give_pre_end=False, tanh_out=False, use_linear_attn=False,
attn_type="vanilla", **ignorekwargs):
super().__init__()
if use_linear_attn: attn_type = "linear"
self.ch = ch
self.temb_ch = 0
self.num_resolutions = len(ch_mult)
self.num_res_blocks = num_res_blocks
self.resolution = resolution
self.in_channels = in_channels
self.give_pre_end = give_pre_end
self.tanh_out = tanh_out
# compute in_ch_mult, block_in and curr_res at lowest res
in_ch_mult = (1,)+tuple(ch_mult)
block_in = ch*ch_mult[self.num_resolutions-1]
curr_res = resolution // 2**(self.num_resolutions-1)
self.z_shape = (1,z_channels,curr_res,curr_res)
print("Working with z of shape {} = {} dimensions.".format(
self.z_shape, np.prod(self.z_shape)))
# z to block_in
self.conv_in = torch.nn.Conv2d(z_channels,
block_in,
kernel_size=3,
stride=1,
padding=1)
# middle
self.mid = nn.Module()
self.mid.block_1 = ResnetBlock(in_channels=block_in,
out_channels=block_in,
temb_channels=self.temb_ch,
dropout=dropout)
self.mid.attn_1 = make_attn(block_in, attn_type=attn_type)
self.mid.block_2 = ResnetBlock(in_channels=block_in,
out_channels=block_in,
temb_channels=self.temb_ch,
dropout=dropout)
# upsampling
self.up = nn.ModuleList()
for i_level in reversed(range(self.num_resolutions)):
block = nn.ModuleList()
attn = nn.ModuleList()
block_out = ch*ch_mult[i_level]
for i_block in range(self.num_res_blocks+1):
block.append(ResnetBlock(in_channels=block_in,
out_channels=block_out,
temb_channels=self.temb_ch,
dropout=dropout))
block_in = block_out
if curr_res in attn_resolutions:
attn.append(make_attn(block_in, attn_type=attn_type))
up = nn.Module()
up.block = block
up.attn = attn
if i_level != 0:
up.upsample = Upsample(block_in, resamp_with_conv)
curr_res = curr_res * 2
self.up.insert(0, up) # prepend to get consistent order
# end
self.norm_out = Normalize(block_in)
self.conv_out = torch.nn.Conv2d(block_in,
out_ch,
kernel_size=3,
stride=1,
padding=1)
def forward(self, z):
#assert z.shape[1:] == self.z_shape[1:]
self.last_z_shape = z.shape
# timestep embedding
temb = None
# z to block_in
h = self.conv_in(z)
# middle
h = self.mid.block_1(h, temb)
h = self.mid.attn_1(h)
h = self.mid.block_2(h, temb)
# upsampling
for i_level in reversed(range(self.num_resolutions)):
for i_block in range(self.num_res_blocks+1):
h = self.up[i_level].block[i_block](h, temb)
if len(self.up[i_level].attn) > 0:
h = self.up[i_level].attn[i_block](h)
if i_level != 0:
h = self.up[i_level].upsample(h)
# end
if self.give_pre_end:
return h
h = self.norm_out(h)
h = nonlinearity(h)
h = self.conv_out(h)
if self.tanh_out:
h = torch.tanh(h)
return h
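For reference, the spatial bookkeeping shared by `Encoder` and `Decoder` above can be checked without instantiating either network. A minimal sketch (hypothetical helper names, not part of this file) mirroring how `curr_res`, `z_shape`, and the per-level `block_in`/`block_out` widths are derived:

```python
# Sketch of the Encoder/Decoder spatial bookkeeping above (hypothetical
# helpers, not part of this repo): the encoder halves the resolution once
# per level except the last, so the latent is resolution // 2**(levels-1).

def latent_shape(resolution, z_channels, ch_mult):
    """Return the (z_channels, h, w) latent shape the Decoder prints."""
    num_resolutions = len(ch_mult)
    curr_res = resolution // 2 ** (num_resolutions - 1)
    return (z_channels, curr_res, curr_res)

def channel_schedule(ch, ch_mult):
    """Per-level (block_in, block_out) widths, mirroring __init__ above."""
    in_ch_mult = (1,) + tuple(ch_mult)
    return [(ch * in_ch_mult[i], ch * ch_mult[i]) for i in range(len(ch_mult))]

# e.g. an SD-style f=8 autoencoder configuration:
print(latent_shape(256, 4, (1, 2, 4, 4)))   # (4, 32, 32)
print(channel_schedule(128, (1, 2, 4, 4)))
```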
class SimpleDecoder(nn.Module):
def __init__(self, in_channels, out_channels, *args, **kwargs):
super().__init__()
self.model = nn.ModuleList([nn.Conv2d(in_channels, in_channels, 1),
ResnetBlock(in_channels=in_channels,
out_channels=2 * in_channels,
temb_channels=0, dropout=0.0),
ResnetBlock(in_channels=2 * in_channels,
out_channels=4 * in_channels,
temb_channels=0, dropout=0.0),
ResnetBlock(in_channels=4 * in_channels,
out_channels=2 * in_channels,
temb_channels=0, dropout=0.0),
nn.Conv2d(2*in_channels, in_channels, 1),
Upsample(in_channels, with_conv=True)])
# end
self.norm_out = Normalize(in_channels)
self.conv_out = torch.nn.Conv2d(in_channels,
out_channels,
kernel_size=3,
stride=1,
padding=1)
def forward(self, x):
for i, layer in enumerate(self.model):
if i in [1,2,3]:
x = layer(x, None)
else:
x = layer(x)
h = self.norm_out(x)
h = nonlinearity(h)
x = self.conv_out(h)
return x
class UpsampleDecoder(nn.Module):
def __init__(self, in_channels, out_channels, ch, num_res_blocks, resolution,
ch_mult=(2,2), dropout=0.0):
super().__init__()
# upsampling
self.temb_ch = 0
self.num_resolutions = len(ch_mult)
self.num_res_blocks = num_res_blocks
block_in = in_channels
curr_res = resolution // 2 ** (self.num_resolutions - 1)
self.res_blocks = nn.ModuleList()
self.upsample_blocks = nn.ModuleList()
for i_level in range(self.num_resolutions):
res_block = []
block_out = ch * ch_mult[i_level]
for i_block in range(self.num_res_blocks + 1):
res_block.append(ResnetBlock(in_channels=block_in,
out_channels=block_out,
temb_channels=self.temb_ch,
dropout=dropout))
block_in = block_out
self.res_blocks.append(nn.ModuleList(res_block))
if i_level != self.num_resolutions - 1:
self.upsample_blocks.append(Upsample(block_in, True))
curr_res = curr_res * 2
# end
self.norm_out = Normalize(block_in)
self.conv_out = torch.nn.Conv2d(block_in,
out_channels,
kernel_size=3,
stride=1,
padding=1)
def forward(self, x):
# upsampling
h = x
for k, i_level in enumerate(range(self.num_resolutions)):
for i_block in range(self.num_res_blocks + 1):
h = self.res_blocks[i_level][i_block](h, None)
if i_level != self.num_resolutions - 1:
h = self.upsample_blocks[k](h)
h = self.norm_out(h)
h = nonlinearity(h)
h = self.conv_out(h)
return h
class LatentRescaler(nn.Module):
def __init__(self, factor, in_channels, mid_channels, out_channels, depth=2):
super().__init__()
# residual block, interpolate, residual block
self.factor = factor
self.conv_in = nn.Conv2d(in_channels,
mid_channels,
kernel_size=3,
stride=1,
padding=1)
self.res_block1 = nn.ModuleList([ResnetBlock(in_channels=mid_channels,
out_channels=mid_channels,
temb_channels=0,
dropout=0.0) for _ in range(depth)])
self.attn = AttnBlock(mid_channels)
self.res_block2 = nn.ModuleList([ResnetBlock(in_channels=mid_channels,
out_channels=mid_channels,
temb_channels=0,
dropout=0.0) for _ in range(depth)])
self.conv_out = nn.Conv2d(mid_channels,
out_channels,
kernel_size=1,
)
def forward(self, x):
x = self.conv_in(x)
for block in self.res_block1:
x = block(x, None)
x = torch.nn.functional.interpolate(x, size=(int(round(x.shape[2]*self.factor)), int(round(x.shape[3]*self.factor))))
x = self.attn(x)
for block in self.res_block2:
x = block(x, None)
x = self.conv_out(x)
return x
class MergedRescaleEncoder(nn.Module):
def __init__(self, in_channels, ch, resolution, out_ch, num_res_blocks,
attn_resolutions, dropout=0.0, resamp_with_conv=True,
ch_mult=(1,2,4,8), rescale_factor=1.0, rescale_module_depth=1):
super().__init__()
intermediate_chn = ch * ch_mult[-1]
self.encoder = Encoder(in_channels=in_channels, num_res_blocks=num_res_blocks, ch=ch, ch_mult=ch_mult,
z_channels=intermediate_chn, double_z=False, resolution=resolution,
attn_resolutions=attn_resolutions, dropout=dropout, resamp_with_conv=resamp_with_conv,
out_ch=None)
self.rescaler = LatentRescaler(factor=rescale_factor, in_channels=intermediate_chn,
mid_channels=intermediate_chn, out_channels=out_ch, depth=rescale_module_depth)
def forward(self, x):
x = self.encoder(x)
x = self.rescaler(x)
return x
class MergedRescaleDecoder(nn.Module):
def __init__(self, z_channels, out_ch, resolution, num_res_blocks, attn_resolutions, ch, ch_mult=(1,2,4,8),
dropout=0.0, resamp_with_conv=True, rescale_factor=1.0, rescale_module_depth=1):
super().__init__()
tmp_chn = z_channels*ch_mult[-1]
self.decoder = Decoder(out_ch=out_ch, z_channels=tmp_chn, attn_resolutions=attn_resolutions, dropout=dropout,
resamp_with_conv=resamp_with_conv, in_channels=None, num_res_blocks=num_res_blocks,
ch_mult=ch_mult, resolution=resolution, ch=ch)
self.rescaler = LatentRescaler(factor=rescale_factor, in_channels=z_channels, mid_channels=tmp_chn,
out_channels=tmp_chn, depth=rescale_module_depth)
def forward(self, x):
x = self.rescaler(x)
x = self.decoder(x)
return x
class Upsampler(nn.Module):
def __init__(self, in_size, out_size, in_channels, out_channels, ch_mult=2):
super().__init__()
assert out_size >= in_size
num_blocks = int(np.log2(out_size//in_size))+1
factor_up = 1.+ (out_size % in_size)
print(f"Building {self.__class__.__name__} with in_size: {in_size} --> out_size {out_size} and factor {factor_up}")
self.rescaler = LatentRescaler(factor=factor_up, in_channels=in_channels, mid_channels=2*in_channels,
out_channels=in_channels)
self.decoder = Decoder(out_ch=out_channels, resolution=out_size, z_channels=in_channels, num_res_blocks=2,
attn_resolutions=[], in_channels=None, ch=in_channels,
ch_mult=[ch_mult for _ in range(num_blocks)])
def forward(self, x):
x = self.rescaler(x)
x = self.decoder(x)
return x
class Resize(nn.Module):
def __init__(self, in_channels=None, learned=False, mode="bilinear"):
super().__init__()
self.with_conv = learned
self.mode = mode
if self.with_conv:
print(f"Note: {self.__class__.__name__} uses learned downsampling and will ignore the fixed {mode} mode")
raise NotImplementedError()
assert in_channels is not None
# no asymmetric padding in torch conv, must do it ourselves
self.conv = torch.nn.Conv2d(in_channels,
in_channels,
kernel_size=4,
stride=2,
padding=1)
def forward(self, x, scale_factor=1.0):
if scale_factor==1.0:
return x
else:
x = torch.nn.functional.interpolate(x, mode=self.mode, align_corners=False, scale_factor=scale_factor)
return x
class FirstStagePostProcessor(nn.Module):
def __init__(self, ch_mult:list, in_channels,
pretrained_model:nn.Module=None,
reshape=False,
n_channels=None,
dropout=0.,
pretrained_config=None):
super().__init__()
if pretrained_config is None:
assert pretrained_model is not None, 'Either "pretrained_model" or "pretrained_config" must not be None'
self.pretrained_model = pretrained_model
else:
assert pretrained_config is not None, 'Either "pretrained_model" or "pretrained_config" must not be None'
self.instantiate_pretrained(pretrained_config)
self.do_reshape = reshape
if n_channels is None:
n_channels = self.pretrained_model.encoder.ch
self.proj_norm = Normalize(in_channels,num_groups=in_channels//2)
self.proj = nn.Conv2d(in_channels,n_channels,kernel_size=3,
stride=1,padding=1)
blocks = []
downs = []
ch_in = n_channels
for m in ch_mult:
blocks.append(ResnetBlock(in_channels=ch_in,out_channels=m*n_channels,dropout=dropout))
ch_in = m * n_channels
downs.append(Downsample(ch_in, with_conv=False))
self.model = nn.ModuleList(blocks)
self.downsampler = nn.ModuleList(downs)
def instantiate_pretrained(self, config):
model = instantiate_from_config(config)
self.pretrained_model = model.eval()
# self.pretrained_model.train = False
for param in self.pretrained_model.parameters():
param.requires_grad = False
@torch.no_grad()
def encode_with_pretrained(self,x):
c = self.pretrained_model.encode(x)
if isinstance(c, DiagonalGaussianDistribution):
c = c.mode()
return c
def forward(self,x):
z_fs = self.encode_with_pretrained(x)
z = self.proj_norm(z_fs)
z = self.proj(z)
z = nonlinearity(z)
for submodel, downmodel in zip(self.model,self.downsampler):
z = submodel(z,temb=None)
z = downmodel(z)
if self.do_reshape:
z = rearrange(z,'b c h w -> b (h w) c')
return z
================================================
FILE: ldm/modules/diffusionmodules/openaimodel.py
================================================
from abc import abstractmethod
from functools import partial
import math
from typing import Iterable
import numpy as np
import torch as th
import torch.nn as nn
import torch.nn.functional as F
from ldm.modules.diffusionmodules.util import (
checkpoint,
conv_nd,
linear,
avg_pool_nd,
zero_module,
normalization,
timestep_embedding,
)
from ldm.modules.attention import SpatialTransformer
from ldm.util import exists
# dummy replace
def convert_module_to_f16(x):
pass
def convert_module_to_f32(x):
pass
## go
class AttentionPool2d(nn.Module):
"""
Adapted from CLIP: https://github.com/openai/CLIP/blob/main/clip/model.py
"""
def __init__(
self,
spacial_dim: int,
embed_dim: int,
num_heads_channels: int,
output_dim: int = None,
):
super().__init__()
self.positional_embedding = nn.Parameter(th.randn(embed_dim, spacial_dim ** 2 + 1) / embed_dim ** 0.5)
self.qkv_proj = conv_nd(1, embed_dim, 3 * embed_dim, 1)
self.c_proj = conv_nd(1, embed_dim, output_dim or embed_dim, 1)
self.num_heads = embed_dim // num_heads_channels
self.attention = QKVAttention(self.num_heads)
def forward(self, x):
b, c, *_spatial = x.shape
x = x.reshape(b, c, -1) # NC(HW)
x = th.cat([x.mean(dim=-1, keepdim=True), x], dim=-1) # NC(HW+1)
x = x + self.positional_embedding[None, :, :].to(x.dtype) # NC(HW+1)
x = self.qkv_proj(x)
x = self.attention(x)
x = self.c_proj(x)
return x[:, :, 0]
class TimestepBlock(nn.Module):
"""
Any module where forward() takes timestep embeddings as a second argument.
"""
@abstractmethod
def forward(self, x, emb):
"""
Apply the module to `x` given `emb` timestep embeddings.
"""
class TimestepEmbedSequential(nn.Sequential, TimestepBlock):
"""
A sequential module that passes timestep embeddings to the children that
support it as an extra input.
"""
def forward(self, x, emb, context=None):
for layer in self:
if isinstance(layer, TimestepBlock):
x = layer(x, emb)
elif isinstance(layer, SpatialTransformer):
x = layer(x, context)
else:
x = layer(x)
return x
class Upsample(nn.Module):
"""
An upsampling layer with an optional convolution.
:param channels: channels in the inputs and outputs.
:param use_conv: a bool determining if a convolution is applied.
:param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then
upsampling occurs in the inner-two dimensions.
"""
def __init__(self, channels, use_conv, dims=2, out_channels=None, padding=1):
super().__init__()
self.channels = channels
self.out_channels = out_channels or channels
self.use_conv = use_conv
self.dims = dims
if use_conv:
self.conv = conv_nd(dims, self.channels, self.out_channels, 3, padding=padding)
def forward(self, x):
assert x.shape[1] == self.channels
if self.dims == 3:
x = F.interpolate(
x, (x.shape[2], x.shape[3] * 2, x.shape[4] * 2), mode="nearest"
)
else:
x = F.interpolate(x, scale_factor=2, mode="nearest")
if self.use_conv:
x = self.conv(x)
return x
class TransposedUpsample(nn.Module):
'Learned 2x upsampling without padding'
def __init__(self, channels, out_channels=None, ks=5):
super().__init__()
self.channels = channels
self.out_channels = out_channels or channels
self.up = nn.ConvTranspose2d(self.channels,self.out_channels,kernel_size=ks,stride=2)
def forward(self,x):
return self.up(x)
class Downsample(nn.Module):
"""
A downsampling layer with an optional convolution.
:param channels: channels in the inputs and outputs.
:param use_conv: a bool determining if a convolution is applied.
:param dims: determines if the signal is 1D, 2D, or 3D. If 3D, then
downsampling occurs in the inner-two dimensions.
"""
def __init__(self, channels, use_conv, dims=2, out_channels=None,padding=1):
super().__init__()
self.channels = channels
self.out_channels = out_channels or channels
self.use_conv = use_conv
self.dims = dims
stride = 2 if dims != 3 else (1, 2, 2)
if use_conv:
self.op = conv_nd(
dims, self.channels, self.out_channels, 3, stride=stride, padding=padding
)
else:
assert self.channels == self.out_channels
self.op = avg_pool_nd(dims, kernel_size=stride, stride=stride)
def forward(self, x):
assert x.shape[1] == self.channels
return self.op(x)
class ResBlock(TimestepBlock):
"""
A residual block that can optionally change the number of channels.
:param channels: the number of input channels.
:param emb_channels: the number of timestep embedding channels.
:param dropout: the rate of dropout.
:param out_channels: if specified, the number of out channels.
:param use_conv: if True and out_channels is specified, use a spatial
convolution instead of a smaller 1x1 convolution to change the
channels in the skip connection.
:param dims: determines if the signal is 1D, 2D, or 3D.
:param use_checkpoint: if True, use gradient checkpointing on this module.
:param up: if True, use this block for upsampling.
:param down: if True, use this block for downsampling.
"""
def __init__(
self,
channels,
emb_channels,
dropout,
out_channels=None,
use_conv=False,
use_scale_shift_norm=False,
dims=2,
use_checkpoint=False,
up=False,
down=False,
):
super().__init__()
self.channels = channels
self.emb_channels = emb_channels
self.dropout = dropout
self.out_channels = out_channels or channels
self.use_conv = use_conv
self.use_checkpoint = use_checkpoint
self.use_scale_shift_norm = use_scale_shift_norm
self.in_layers = nn.Sequential(
normalization(channels),
nn.SiLU(),
conv_nd(dims, channels, self.out_channels, 3, padding=1),
)
self.updown = up or down
if up:
self.h_upd = Upsample(channels, False, dims)
self.x_upd = Upsample(channels, False, dims)
elif down:
self.h_upd = Downsample(channels, False, dims)
self.x_upd = Downsample(channels, False, dims)
else:
self.h_upd = self.x_upd = nn.Identity()
self.emb_layers = nn.Sequential(
nn.SiLU(),
linear(
emb_channels,
2 * self.out_channels if use_scale_shift_norm else self.out_channels,
),
)
self.out_layers = nn.Sequential(
normalization(self.out_channels),
nn.SiLU(),
nn.Dropout(p=dropout),
zero_module(
conv_nd(dims, self.out_channels, self.out_channels, 3, padding=1)
),
)
if self.out_channels == channels:
self.skip_connection = nn.Identity()
elif use_conv:
self.skip_connection = conv_nd(
dims, channels, self.out_channels, 3, padding=1
)
else:
self.skip_connection = conv_nd(dims, channels, self.out_channels, 1)
def forward(self, x, emb):
"""
Apply the block to a Tensor, conditioned on a timestep embedding.
:param x: an [N x C x ...] Tensor of features.
:param emb: an [N x emb_channels] Tensor of timestep embeddings.
:return: an [N x C x ...] Tensor of outputs.
"""
return checkpoint(
self._forward, (x, emb), self.parameters(), self.use_checkpoint
)
def _forward(self, x, emb):
if self.updown:
in_rest, in_conv = self.in_layers[:-1], self.in_layers[-1]
h = in_rest(x)
h = self.h_upd(h)
x = self.x_upd(x)
h = in_conv(h)
else:
h = self.in_layers(x)
emb_out = self.emb_layers(emb).type(h.dtype)
while len(emb_out.shape) < len(h.shape):
emb_out = emb_out[..., None]
if self.use_scale_shift_norm: # False
out_norm, out_rest = self.out_layers[0], self.out_layers[1:]
scale, shift = th.chunk(emb_out, 2, dim=1)
h = out_norm(h) * (1 + scale) + shift
h = out_rest(h)
else:
h = h + emb_out
h = self.out_layers(h)
return self.skip_connection(x) + h
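The two conditioning paths in `ResBlock._forward` above reduce to simple arithmetic: with `use_scale_shift_norm` the embedding is chunked into `(scale, shift)` and applied FiLM-style after normalization; otherwise it is added to the features. A numpy sketch (illustrative only, normalization omitted):

```python
# Numpy sketch of the ResBlock conditioning paths above (illustrative).
import numpy as np

def scale_shift(h, emb_out):
    scale, shift = np.split(emb_out, 2, axis=1)   # th.chunk(emb_out, 2, dim=1)
    return h * (1 + scale) + shift                # norm omitted for brevity

def additive(h, emb_out):
    return h + emb_out

h = np.ones((1, 2, 2, 2))
emb = np.arange(4, dtype=float).reshape(1, 4, 1, 1)  # broadcast like emb_out[..., None]
print(scale_shift(h, emb)[0, :, 0, 0])       # [1*(1+0)+2, 1*(1+1)+3] = [3. 5.]
print(additive(h, emb[:, :2])[0, :, 0, 0])   # [1. 2.]
```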
class AttentionBlock(nn.Module):
"""
An attention block that allows spatial positions to attend to each other.
Originally ported from here, but adapted to the N-d case.
https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66.
"""
def __init__(
self,
channels,
num_heads=1,
num_head_channels=-1,
use_checkpoint=False,
use_new_attention_order=False,
):
super().__init__()
self.channels = channels
if num_head_channels == -1:
self.num_heads = num_heads
else:
assert (
channels % num_head_channels == 0
), f"q,k,v channels {channels} is not divisible by num_head_channels {num_head_channels}"
self.num_heads = channels // num_head_channels
self.use_checkpoint = use_checkpoint
self.norm = normalization(channels)
self.qkv = conv_nd(1, channels, channels * 3, 1)
if use_new_attention_order:
# split qkv before split heads
self.attention = QKVAttention(self.num_heads)
else:
# split heads before split qkv
self.attention = QKVAttentionLegacy(self.num_heads)
self.proj_out = zero_module(conv_nd(1, channels, channels, 1))
def forward(self, x):
return checkpoint(self._forward, (x,), self.parameters(), True)  # TODO: verify checkpoint usage (hardcoded True) and fix the .half call
#return pt_checkpoint(self._forward, x) # pytorch
def _forward(self, x):
b, c, *spatial = x.shape
x = x.reshape(b, c, -1)
qkv = self.qkv(self.norm(x))
h = self.attention(qkv)
h = self.proj_out(h)
return (x + h).reshape(b, c, *spatial)
def count_flops_attn(model, _x, y):
"""
A counter for the `thop` package to count the operations in an
attention operation.
Meant to be used like:
macs, params = thop.profile(
model,
inputs=(inputs, timestamps),
custom_ops={QKVAttention: QKVAttention.count_flops},
)
"""
b, c, *spatial = y[0].shape
num_spatial = int(np.prod(spatial))
# We perform two matmuls with the same number of ops.
# The first computes the weight matrix, the second computes
# the combination of the value vectors.
matmul_ops = 2 * b * (num_spatial ** 2) * c
model.total_ops += th.DoubleTensor([matmul_ops])
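The estimate in `count_flops_attn` above comes from the two matmuls named in its comment, each costing `b * T**2 * c` multiply-accumulates (splitting into heads does not change the total). A worked example:

```python
# Worked example of the FLOP estimate in count_flops_attn above: the
# weight matmul and the value combination each cost b * T^2 * c, hence
# the factor of 2. Pure-Python re-derivation, no torch needed.
def attn_matmul_ops(b, c, spatial):
    num_spatial = 1
    for s in spatial:
        num_spatial *= s
    return 2 * b * num_spatial ** 2 * c

print(attn_matmul_ops(b=2, c=64, spatial=(16, 16)))  # 2*2*256**2*64 = 16777216
```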
class QKVAttentionLegacy(nn.Module):
"""
A module which performs QKV attention. Matches legacy QKVAttention + input/output heads shaping
"""
def __init__(self, n_heads):
super().__init__()
self.n_heads = n_heads
def forward(self, qkv):
"""
Apply QKV attention.
:param qkv: an [N x (H * 3 * C) x T] tensor of Qs, Ks, and Vs.
:return: an [N x (H * C) x T] tensor after attention.
"""
bs, width, length = qkv.shape
assert width % (3 * self.n_heads) == 0
ch = width // (3 * self.n_heads)
q, k, v = qkv.reshape(bs * self.n_heads, ch * 3, length).split(ch, dim=1)
scale = 1 / math.sqrt(math.sqrt(ch))
weight = th.einsum(
"bct,bcs->bts", q * scale, k * scale
) # More stable with f16 than dividing afterwards
weight = th.softmax(weight.float(), dim=-1).type(weight.dtype)
a = th.einsum("bts,bcs->bct", weight, v)
return a.reshape(bs, -1, length)
@staticmethod
def count_flops(model, _x, y):
return count_flops_attn(model, _x, y)
class QKVAttention(nn.Module):
"""
A module which performs QKV attention and splits in a different order.
"""
def __init__(self, n_heads):
super().__init__()
self.n_heads = n_heads
def forward(self, qkv):
"""
Apply QKV attention.
:param qkv: an [N x (3 * H * C) x T] tensor of Qs, Ks, and Vs.
:return: an [N x (H * C) x T] tensor after attention.
"""
bs, width, length = qkv.shape
assert width % (3 * self.n_heads) == 0
ch = width // (3 * self.n_heads)
q, k, v = qkv.chunk(3, dim=1)
scale = 1 / math.sqrt(math.sqrt(ch))
weight = th.einsum(
"bct,bcs->bts",
(q * scale).view(bs * self.n_heads, ch, length),
(k * scale).view(bs * self.n_heads, ch, length),
) # More stable with f16 than dividing afterwards
weight = th.softmax(weight.float(), dim=-1).type(weight.dtype)
a = th.einsum("bts,bcs->bct", weight, v.reshape(bs * self.n_heads, ch, length))
return a.reshape(bs, -1, length)
@staticmethod
def count_flops(model, _x, y):
return count_flops_attn(model, _x, y)
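Both attention classes above scale `q` and `k` each by `1 / sqrt(sqrt(ch))` before the einsum rather than dividing the logits by `sqrt(ch)` afterwards; as the inline comment notes, this keeps intermediates smaller and is friendlier to fp16. The two forms are mathematically identical, which a short numpy check confirms:

```python
# Numpy check of the scaling trick used in QKVAttention(Legacy) above:
# scaling q and k each by ch**-0.25 before the matmul equals the usual
# 1/sqrt(ch) scaling of the logits, since the two factors multiply.
import numpy as np

rng = np.random.default_rng(0)
ch, length = 8, 5
q = rng.standard_normal((ch, length))
k = rng.standard_normal((ch, length))

scale = 1 / np.sqrt(np.sqrt(ch))               # as in the code above
split_scaled = np.einsum("ct,cs->ts", q * scale, k * scale)
post_scaled = np.einsum("ct,cs->ts", q, k) / np.sqrt(ch)

print(np.allclose(split_scaled, post_scaled))  # True
```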
class UNetModel(nn.Module):
"""
The full UNet model with attention and timestep embedding.
:param in_channels: channels in the input Tensor.
:param model_channels: base channel count for the model.
:param out_channels: channels in the output Tensor.
:param num_res_blocks: number of residual blocks per downsample.
:param attention_resolutions: a collection of downsample rates at which
attention will take place. May be a set, list, or tuple.
For example, if this contains 4, then at 4x downsampling, attention
will be used.
:param dropout: the dropout probability.
:param channel_mult: channel multiplier for each level of the UNet.
:param conv_resample: if True, use learned convolutions for upsampling and
downsampling.
:param dims: determines if the signal is 1D, 2D, or 3D.
:param num_classes: if specified (as an int), then this model will be
class-conditional with `num_classes` classes.
:param use_checkpoint: use gradient checkpointing to reduce memory usage.
:param num_heads: the number of attention heads in each attention layer.
:param num_head_channels: if specified, ignore num_heads and instead use
a fixed channel width per attention head.
:param num_heads_upsample: works with num_heads to set a different number
of heads for upsampling. Deprecated.
:param use_scale_shift_norm: use a FiLM-like conditioning mechanism.
:param resblock_updown: use residual blocks for up/downsampling.
:param use_new_attention_order: use a different attention pattern for potentially
increased efficiency.
"""
def __init__(
self,
image_size,
in_channels,
model_channels,
out_channels,
num_res_blocks,
attention_resolutions,
dropout=0,
channel_mult=(1, 2, 4, 8),
conv_resample=True,
dims=2,
num_classes=None,
use_checkpoint=False,
use_fp16=False,
num_heads=-1,
num_head_channels=-1,
num_heads_upsample=-1,
use_scale_shift_norm=False,
resblock_updown=False,
use_new_attention_order=False,
use_spatial_transformer=False, # custom transformer support
transformer_depth=1, # custom transformer support
context_dim=None, # custom transformer support
n_embed=None, # custom support for prediction of discrete ids into codebook of first stage vq model
legacy=True,
disable_self_attentions=None,
num_attention_blocks=None
):
super().__init__()
if use_spatial_transformer:
assert context_dim is not None, 'Fool!! You forgot to include the dimension of your cross-attention conditioning...'
if context_dim is not None:
assert use_spatial_transformer, 'Fool!! You forgot to use the spatial transformer for your cross-attention conditioning...'
from omegaconf.listconfig import ListConfig
if type(context_dim) == ListConfig:
context_dim = list(context_dim)
if num_heads_upsample == -1:
num_heads_upsample = num_heads
if num_heads == -1:
assert num_head_channels != -1, 'Either num_heads or num_head_channels has to be set'
if num_head_channels == -1:
assert num_heads != -1, 'Either num_heads or num_head_channels has to be set'
self.image_size = image_size
self.in_channels = in_channels
self.model_channels = model_channels
self.out_channels = out_channels
if isinstance(num_res_blocks, int):
self.num_res_blocks = len(channel_mult) * [num_res_blocks]
else:
if len(num_res_blocks) != len(channel_mult):
raise ValueError("provide num_res_blocks either as an int (globally constant) or "
"as a list/tuple (per-level) with the same length as channel_mult")
self.num_res_blocks = num_res_blocks
if disable_self_attentions is not None:
# should be a list of booleans, indicating whether to disable self-attention in TransformerBlocks or not
assert len(disable_self_attentions) == len(channel_mult)
if num_attention_blocks is not None:
assert len(num_attention_blocks) == len(self.num_res_blocks)
assert all(map(lambda i: self.num_res_blocks[i] >= num_attention_blocks[i], range(len(num_attention_blocks))))
print(f"Constructor of UNetModel received num_attention_blocks={num_attention_blocks}. "
f"This option has LESS priority than attention_resolutions {attention_resolutions}, "
f"i.e., in cases where num_attention_blocks[i] > 0 but 2**i not in attention_resolutions, "
f"attention will still not be set.") # todo: convert to warning
self.attention_resolutions = attention_resolutions
self.dropout = dropout
self.channel_mult = channel_mult
self.conv_resample = conv_resample
self.num_classes = num_classes
self.use_checkpoint = use_checkpoint
self.dtype = th.float16 if use_fp16 else th.float32
self.num_heads = num_heads
self.num_head_channels = num_head_channels
self.num_heads_upsample = num_heads_upsample
self.predict_codebook_ids = n_embed is not None
time_embed_dim = model_channels * 4
self.time_embed = nn.Sequential(
linear(model_channels, time_embed_dim),
nn.SiLU(),
linear(time_embed_dim, time_embed_dim),
)
if self.num_classes is not None:
self.label_emb = nn.Embedding(num_classes, time_embed_dim)
self.input_blocks = nn.ModuleList(
[
TimestepEmbedSequential(
conv_nd(dims, in_channels, model_channels, 3, padding=1)
)
]
) # 0
self._feature_size = model_channels
input_block_chans = [model_channels]
ch = model_channels
ds = 1
for level, mult in enumerate(channel_mult):
for nr in range(self.num_res_blocks[level]):
layers = [
ResBlock(
ch,
time_embed_dim,
dropout,
out_channels=mult * model_channels,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
)
]
ch = mult * model_channels
if ds in attention_resolutions: # always True
if num_head_channels == -1:
dim_head = ch // num_heads
else:
num_heads = ch // num_head_channels
dim_head = num_head_channels
if legacy:
#num_heads = 1
dim_head = ch // num_heads if use_spatial_transformer else num_head_channels
if exists(disable_self_attentions):
disabled_sa = disable_self_attentions[level]
else:
disabled_sa = False
if not exists(num_attention_blocks) or nr < num_attention_blocks[level]:
layers.append(
AttentionBlock(
ch,
use_checkpoint=use_checkpoint,
num_heads=num_heads,
num_head_channels=dim_head,
use_new_attention_order=use_new_attention_order,
) if not use_spatial_transformer else SpatialTransformer(
ch, num_heads, dim_head, depth=transformer_depth, context_dim=context_dim,
disable_self_attn=disabled_sa
)
)
self.input_blocks.append(TimestepEmbedSequential(*layers))
self._feature_size += ch
input_block_chans.append(ch)
if level != len(channel_mult) - 1:
out_ch = ch
self.input_blocks.append(
TimestepEmbedSequential(
ResBlock(
ch,
time_embed_dim,
dropout,
out_channels=out_ch,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
down=True,
)
if resblock_updown
else Downsample(
ch, conv_resample, dims=dims, out_channels=out_ch
)
)
)
ch = out_ch
input_block_chans.append(ch)
ds *= 2
self._feature_size += ch
if num_head_channels == -1:
dim_head = ch // num_heads
else:
num_heads = ch // num_head_channels
dim_head = num_head_channels
if legacy:
#num_heads = 1
dim_head = ch // num_heads if use_spatial_transformer else num_head_channels
self.middle_block = TimestepEmbedSequential(
ResBlock(
ch,
time_embed_dim,
dropout,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
),
AttentionBlock(
ch,
use_checkpoint=use_checkpoint,
num_heads=num_heads,
num_head_channels=dim_head,
use_new_attention_order=use_new_attention_order,
) if not use_spatial_transformer else SpatialTransformer( # always uses a self-attn
ch, num_heads, dim_head, depth=transformer_depth, context_dim=context_dim
),
ResBlock(
ch,
time_embed_dim,
dropout,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
),
)
self._feature_size += ch
self.output_blocks = nn.ModuleList([])
for level, mult in list(enumerate(channel_mult))[::-1]:
for i in range(self.num_res_blocks[level] + 1):
ich = input_block_chans.pop()
layers = [
ResBlock(
ch + ich,
time_embed_dim,
dropout,
out_channels=model_channels * mult,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
)
]
ch = model_channels * mult
if ds in attention_resolutions:
if num_head_channels == -1:
dim_head = ch // num_heads
else:
num_heads = ch // num_head_channels
dim_head = num_head_channels
if legacy:
#num_heads = 1
dim_head = ch // num_heads if use_spatial_transformer else num_head_channels
if exists(disable_self_attentions):
disabled_sa = disable_self_attentions[level]
else:
disabled_sa = False
if not exists(num_attention_blocks) or i < num_attention_blocks[level]:
layers.append(
AttentionBlock(
ch,
use_checkpoint=use_checkpoint,
num_heads=num_heads_upsample,
num_head_channels=dim_head,
use_new_attention_order=use_new_attention_order,
) if not use_spatial_transformer else SpatialTransformer(
ch, num_heads, dim_head, depth=transformer_depth, context_dim=context_dim,
disable_self_attn=disabled_sa
)
)
if level and i == self.num_res_blocks[level]:
out_ch = ch
layers.append(
ResBlock(
ch,
time_embed_dim,
dropout,
out_channels=out_ch,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
up=True,
)
if resblock_updown
else Upsample(ch, conv_resample, dims=dims, out_channels=out_ch)
)
ds //= 2
self.output_blocks.append(TimestepEmbedSequential(*layers))
self._feature_size += ch
self.out = nn.Sequential(
normalization(ch),
nn.SiLU(),
zero_module(conv_nd(dims, model_channels, out_channels, 3, padding=1)),
)
if self.predict_codebook_ids:
self.id_predictor = nn.Sequential(
normalization(ch),
conv_nd(dims, model_channels, n_embed, 1),
#nn.LogSoftmax(dim=1) # change to cross_entropy and produce non-normalized logits
)
def convert_to_fp16(self):
"""
Convert the torso of the model to float16.
"""
self.input_blocks.apply(convert_module_to_f16)
self.middle_block.apply(convert_module_to_f16)
self.output_blocks.apply(convert_module_to_f16)
def convert_to_fp32(self):
"""
Convert the torso of the model to float32.
"""
self.input_blocks.apply(convert_module_to_f32)
self.middle_block.apply(convert_module_to_f32)
self.output_blocks.apply(convert_module_to_f32)
def forward(self, x, timesteps=None, context=None, y=None,**kwargs):
"""
Apply the model to an input batch.
:param x: an [N x C x ...] Tensor of inputs.
:param timesteps: a 1-D batch of timesteps.
:param context: conditioning plugged in via crossattn
:param y: an [N] Tensor of labels, if class-conditional.
:return: an [N x C x ...] Tensor of outputs.
"""
assert (y is not None) == (
self.num_classes is not None
), "must specify y if and only if the model is class-conditional"
hs = []
        t_emb = timestep_embedding(timesteps, self.model_channels, repeat_only=False)  # [N, model_channels]
        emb = self.time_embed(t_emb)  # [N, time_embed_dim]
if self.num_classes is not None:
assert y.shape == (x.shape[0],)
emb = emb + self.label_emb(y)
h = x.type(self.dtype)
for module in self.input_blocks:
            h = module(h, emb, context)
hs.append(h)
h = self.middle_block(h, emb, context)
for module in self.output_blocks:
h = th.cat([h, hs.pop()], dim=1)
h = module(h, emb, context)
h = h.type(x.dtype)
if self.predict_codebook_ids:
return self.id_predictor(h)
else:
return self.out(h)
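The constructor above threads a running channel count `ch` through the down path and records it in `input_block_chans`; the up path then pops those counts to size each skip concatenation. A pure-Python sketch of that bookkeeping (attention blocks ignored; `model_channels=64`, `channel_mult=(1, 2, 4)`, `num_res_blocks=2` are illustrative values, not any particular config):

```python
# Down path: append the channel count after every block; the downsample
# at the end of each non-final level keeps the width.
model_channels = 64
channel_mult = (1, 2, 4)
num_res_blocks = 2

ch = model_channels
input_block_chans = [model_channels]   # the stem conv
for level, mult in enumerate(channel_mult):
    for _ in range(num_res_blocks):
        ch = mult * model_channels     # ResBlock changes the width
        input_block_chans.append(ch)
    if level != len(channel_mult) - 1:
        input_block_chans.append(ch)   # Downsample keeps the width

# Up path: each output block consumes ch plus one popped skip width.
concat_widths = []
for level, mult in reversed(list(enumerate(channel_mult))):
    for _ in range(num_res_blocks + 1):
        ich = input_block_chans.pop()
        concat_widths.append(ch + ich)
        ch = mult * model_channels
```

The stack empties exactly when the last output block runs, which is why the push/pop counts in `__init__` must match.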
class EncoderUNetModel(nn.Module):
"""
The half UNet model with attention and timestep embedding.
For usage, see UNet.
"""
def __init__(
self,
image_size,
in_channels,
model_channels,
out_channels,
num_res_blocks,
attention_resolutions,
dropout=0,
channel_mult=(1, 2, 4, 8),
conv_resample=True,
dims=2,
use_checkpoint=False,
use_fp16=False,
num_heads=1,
num_head_channels=-1,
num_heads_upsample=-1,
use_scale_shift_norm=False,
resblock_updown=False,
use_new_attention_order=False,
pool="adaptive",
*args,
**kwargs
):
super().__init__()
if num_heads_upsample == -1:
num_heads_upsample = num_heads
self.in_channels = in_channels
self.model_channels = model_channels
self.out_channels = out_channels
self.num_res_blocks = num_res_blocks
self.attention_resolutions = attention_resolutions
self.dropout = dropout
self.channel_mult = channel_mult
self.conv_resample = conv_resample
self.use_checkpoint = use_checkpoint
self.dtype = th.float16 if use_fp16 else th.float32
self.num_heads = num_heads
self.num_head_channels = num_head_channels
self.num_heads_upsample = num_heads_upsample
time_embed_dim = model_channels * 4
self.time_embed = nn.Sequential(
linear(model_channels, time_embed_dim),
nn.SiLU(),
linear(time_embed_dim, time_embed_dim),
)
self.input_blocks = nn.ModuleList(
[
TimestepEmbedSequential(
conv_nd(dims, in_channels, model_channels, 3, padding=1)
)
]
)
self._feature_size = model_channels
input_block_chans = [model_channels]
ch = model_channels
ds = 1
for level, mult in enumerate(channel_mult):
for _ in range(num_res_blocks):
layers = [
ResBlock(
ch,
time_embed_dim,
dropout,
out_channels=mult * model_channels,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
)
]
ch = mult * model_channels
if ds in attention_resolutions:
layers.append(
AttentionBlock(
ch,
use_checkpoint=use_checkpoint,
num_heads=num_heads,
num_head_channels=num_head_channels,
use_new_attention_order=use_new_attention_order,
)
)
self.input_blocks.append(TimestepEmbedSequential(*layers))
self._feature_size += ch
input_block_chans.append(ch)
if level != len(channel_mult) - 1:
out_ch = ch
self.input_blocks.append(
TimestepEmbedSequential(
ResBlock(
ch,
time_embed_dim,
dropout,
out_channels=out_ch,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
down=True,
)
if resblock_updown
else Downsample(
ch, conv_resample, dims=dims, out_channels=out_ch
)
)
)
ch = out_ch
input_block_chans.append(ch)
ds *= 2
self._feature_size += ch
self.middle_block = TimestepEmbedSequential(
ResBlock(
ch,
time_embed_dim,
dropout,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
),
AttentionBlock(
ch,
use_checkpoint=use_checkpoint,
num_heads=num_heads,
num_head_channels=num_head_channels,
use_new_attention_order=use_new_attention_order,
),
ResBlock(
ch,
time_embed_dim,
dropout,
dims=dims,
use_checkpoint=use_checkpoint,
use_scale_shift_norm=use_scale_shift_norm,
),
)
self._feature_size += ch
self.pool = pool
if pool == "adaptive":
self.out = nn.Sequential(
normalization(ch),
nn.SiLU(),
nn.AdaptiveAvgPool2d((1, 1)),
zero_module(conv_nd(dims, ch, out_channels, 1)),
nn.Flatten(),
)
elif pool == "attention":
assert num_head_channels != -1
self.out = nn.Sequential(
normalization(ch),
nn.SiLU(),
AttentionPool2d(
(image_size // ds), ch, num_head_channels, out_channels
),
)
elif pool == "spatial":
self.out = nn.Sequential(
nn.Linear(self._feature_size, 2048),
nn.ReLU(),
nn.Linear(2048, self.out_channels),
)
elif pool == "spatial_v2":
self.out = nn.Sequential(
nn.Linear(self._feature_size, 2048),
normalization(2048),
nn.SiLU(),
nn.Linear(2048, self.out_channels),
)
else:
raise NotImplementedError(f"Unexpected {pool} pooling")
def convert_to_fp16(self):
"""
Convert the torso of the model to float16.
"""
self.input_blocks.apply(convert_module_to_f16)
self.middle_block.apply(convert_module_to_f16)
def convert_to_fp32(self):
"""
Convert the torso of the model to float32.
"""
self.input_blocks.apply(convert_module_to_f32)
self.middle_block.apply(convert_module_to_f32)
def forward(self, x, timesteps):
"""
Apply the model to an input batch.
:param x: an [N x C x ...] Tensor of inputs.
:param timesteps: a 1-D batch of timesteps.
:return: an [N x K] Tensor of outputs.
"""
emb = self.time_embed(timestep_embedding(timesteps, self.model_channels))
results = []
h = x.type(self.dtype)
for module in self.input_blocks:
h = module(h, emb)
if self.pool.startswith("spatial"):
results.append(h.type(x.dtype).mean(dim=(2, 3)))
h = self.middle_block(h, emb)
if self.pool.startswith("spatial"):
results.append(h.type(x.dtype).mean(dim=(2, 3)))
h = th.cat(results, axis=-1)
return self.out(h)
else:
h = h.type(x.dtype)
return self.out(h)
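With `pool="spatial"`, the forward above averages each saved feature map over its spatial dimensions and concatenates the results along the channel axis before the linear head. A numpy-only sketch of that aggregation (numpy assumed available; the feature shapes are made up for illustration):

```python
import numpy as np

# pool="spatial": mean over (H, W) for each collected feature map,
# then concatenate along the channel axis, as in forward() above.
feature_maps = [np.ones((2, 64, 16, 16)), 2 * np.ones((2, 128, 8, 8))]
results = [h.mean(axis=(2, 3)) for h in feature_maps]
h = np.concatenate(results, axis=-1)   # shape [N, 64 + 128]
```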
================================================
FILE: ldm/modules/diffusionmodules/util.py
================================================
# adopted from
# https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py
# and
# https://github.com/lucidrains/denoising-diffusion-pytorch/blob/7706bdfc6f527f58d33f84b7b522e61e6e3164b3/denoising_diffusion_pytorch/denoising_diffusion_pytorch.py
# and
# https://github.com/openai/guided-diffusion/blob/0ba878e517b276c45d1195eb29f6f5f72659a05b/guided_diffusion/nn.py
#
# thanks!
import os
import math
import torch
import torch.nn as nn
import numpy as np
from einops import repeat
from ldm.util import instantiate_from_config
def make_beta_schedule(schedule, n_timestep, linear_start=1e-4, linear_end=2e-2, cosine_s=8e-3):
if schedule == "linear":
betas = (
torch.linspace(linear_start ** 0.5, linear_end ** 0.5, n_timestep, dtype=torch.float64) ** 2
)
elif schedule == "cosine":
timesteps = (
torch.arange(n_timestep + 1, dtype=torch.float64) / n_timestep + cosine_s
)
alphas = timesteps / (1 + cosine_s) * np.pi / 2
alphas = torch.cos(alphas).pow(2)
alphas = alphas / alphas[0]
betas = 1 - alphas[1:] / alphas[:-1]
        # keep this as a torch op: np.clip would convert the tensor to an
        # ndarray and break the betas.numpy() call below
        betas = torch.clamp(betas, min=0, max=0.999)
elif schedule == "sqrt_linear":
betas = torch.linspace(linear_start, linear_end, n_timestep, dtype=torch.float64)
elif schedule == "sqrt":
betas = torch.linspace(linear_start, linear_end, n_timestep, dtype=torch.float64) ** 0.5
else:
raise ValueError(f"schedule '{schedule}' unknown.")
return betas.numpy()
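The `"linear"` branch above is linear in sqrt-beta space; the cumulative product of `1 - beta` then gives the `alpha_bar` values the samplers index into. A numpy-only sketch of the same arithmetic (numpy assumed available):

```python
import numpy as np

# "linear" schedule: linspace in sqrt(beta) space, then square,
# matching make_beta_schedule(schedule="linear", ...) above.
n_timestep, linear_start, linear_end = 1000, 1e-4, 2e-2
betas = np.linspace(linear_start ** 0.5, linear_end ** 0.5, n_timestep,
                    dtype=np.float64) ** 2
alphas_cumprod = np.cumprod(1.0 - betas)  # alpha_bar_t, strictly decreasing
```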
def make_ddim_timesteps(ddim_discr_method, num_ddim_timesteps, num_ddpm_timesteps, verbose=True):
if ddim_discr_method == 'uniform':
c = num_ddpm_timesteps // num_ddim_timesteps
ddim_timesteps = np.asarray(list(range(0, num_ddpm_timesteps, c)))
elif ddim_discr_method == 'quad':
ddim_timesteps = ((np.linspace(0, np.sqrt(num_ddpm_timesteps * .8), num_ddim_timesteps)) ** 2).astype(int)
else:
raise NotImplementedError(f'There is no ddim discretization method called "{ddim_discr_method}"')
# assert ddim_timesteps.shape[0] == num_ddim_timesteps
# add one to get the final alpha values right (the ones from first scale to data during sampling)
steps_out = ddim_timesteps + 1
if verbose:
print(f'Selected timesteps for ddim sampler: {steps_out}')
return steps_out
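The `'uniform'` branch strides through the DDPM steps and then shifts by one so the final alpha is indexed correctly during sampling. A pure-Python sketch with the common 1000-to-50 subsampling:

```python
# Uniform DDIM timestep selection, as in make_ddim_timesteps above.
num_ddpm_timesteps, num_ddim_timesteps = 1000, 50
c = num_ddpm_timesteps // num_ddim_timesteps        # stride of 20
ddim_timesteps = list(range(0, num_ddpm_timesteps, c))
steps_out = [t + 1 for t in ddim_timesteps]         # shift by one
```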
def make_ddim_sampling_parameters(alphacums, ddim_timesteps, eta, verbose=True):
# select alphas for computing the variance schedule
alphas = alphacums[ddim_timesteps]
alphas_prev = np.asarray([alphacums[0]] + alphacums[ddim_timesteps[:-1]].tolist())
    # according to the formula provided in https://arxiv.org/abs/2010.02502
sigmas = eta * np.sqrt((1 - alphas_prev) / (1 - alphas) * (1 - alphas / alphas_prev))
if verbose:
print(f'Selected alphas for ddim sampler: a_t: {alphas}; a_(t-1): {alphas_prev}')
print(f'For the chosen value of eta, which is {eta}, '
f'this results in the following sigma_t schedule for ddim sampler {sigmas}')
return sigmas, alphas, alphas_prev
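The sigma schedule above is Eq. 16 of the DDIM paper; setting eta to 0 recovers the fully deterministic sampler. A numpy sketch with toy `alpha_bar` values (the linspace is illustrative only):

```python
import numpy as np

# DDIM sigma schedule, mirroring make_ddim_sampling_parameters above.
alphacums = np.linspace(0.999, 0.01, 1000)          # toy alpha_bar values
ddim_timesteps = np.arange(0, 1000, 100)
alphas = alphacums[ddim_timesteps]
alphas_prev = np.asarray([alphacums[0]] + alphacums[ddim_timesteps[:-1]].tolist())

def sigmas_for(eta):
    return eta * np.sqrt((1 - alphas_prev) / (1 - alphas)
                         * (1 - alphas / alphas_prev))
```

With eta = 0 every sigma_t vanishes (deterministic DDIM); eta = 1 recovers DDPM-like noise levels.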
def betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.999):
"""
Create a beta schedule that discretizes the given alpha_t_bar function,
which defines the cumulative product of (1-beta) over time from t = [0,1].
:param num_diffusion_timesteps: the number of betas to produce.
:param alpha_bar: a lambda that takes an argument t from 0 to 1 and
produces the cumulative product of (1-beta) up to that
part of the diffusion process.
:param max_beta: the maximum beta to use; use values lower than 1 to
prevent singularities.
"""
betas = []
for i in range(num_diffusion_timesteps):
t1 = i / num_diffusion_timesteps
t2 = (i + 1) / num_diffusion_timesteps
betas.append(min(1 - alpha_bar(t2) / alpha_bar(t1), max_beta))
return np.array(betas)
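A common `alpha_bar` to feed into the function above is the cosine schedule from "Improved DDPM" (Nichol & Dhariwal). A pure-Python sketch of that pairing:

```python
import math

# Cosine alpha_bar from Improved DDPM, used with betas_for_alpha_bar above.
def alpha_bar(t, s=0.008):
    return math.cos((t + s) / (1 + s) * math.pi / 2) ** 2

num_steps, max_beta = 100, 0.999
betas = [min(1 - alpha_bar((i + 1) / num_steps) / alpha_bar(i / num_steps),
             max_beta)
         for i in range(num_steps)]
```

Because alpha_bar(1) is essentially zero, the final betas saturate at `max_beta`, which is exactly the singularity the clamp guards against.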
def extract_into_tensor(a, t, x_shape):
b, *_ = t.shape
out = a.gather(-1, t)
return out.reshape(b, *((1,) * (len(x_shape) - 1)))
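`extract_into_tensor` gathers one scheduler value per batch element and reshapes it so it broadcasts over an image batch. The same operation in numpy (shapes are illustrative):

```python
import numpy as np

# Numpy equivalent of extract_into_tensor above.
a = np.linspace(1.0, 0.0, 10)       # e.g. per-timestep scheduler values
t = np.array([0, 9])                # one timestep index per sample
x_shape = (2, 3, 8, 8)              # [N, C, H, W]
out = a[t].reshape(len(t), *((1,) * (len(x_shape) - 1)))  # [N, 1, 1, 1]
```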
def checkpoint(func, inputs, params, flag):
"""
Evaluate a function without caching intermediate activations, allowing for
reduced memory at the expense of extra compute in the backward pass.
:param func: the function to evaluate.
:param inputs: the argument sequence to pass to `func`.
:param params: a sequence of parameters `func` depends on but does not
explicitly take as arguments.
:param flag: if False, disable gradient checkpointing.
"""
if flag:
args = tuple(inputs) + tuple(params)
return CheckpointFunction.apply(func, len(inputs), *args)
else:
return func(*inputs)
class CheckpointFunction(torch.autograd.Function):
@staticmethod
def forward(ctx, run_function, length, *args):
ctx.run_function = run_function
ctx.input_tensors = list(args[:length])
ctx.input_params = list(args[length:])
with torch.no_grad():
output_tensors = ctx.run_function(*ctx.input_tensors)
return output_tensors
@staticmethod
def backward(ctx, *output_grads):
ctx.input_tensors = [x.detach().requires_grad_(True) for x in ctx.input_tensors]
with torch.enable_grad():
# Fixes a bug where the first op in run_function modifies the
# Tensor storage in place, which is not allowed for detach()'d
# Tensors.
shallow_copies = [x.view_as(x) for x in ctx.input_tensors]
output_tensors = ctx.run_function(*shallow_copies)
input_grads = torch.autograd.grad(
output_tensors,
ctx.input_tensors + ctx.input_params,
output_grads,
allow_unused=True,
)
del ctx.input_tensors
del ctx.input_params
del output_tensors
return (None, None) + input_grads
def timestep_embedding(timesteps, dim, max_period=10000, repeat_only=False):
"""
Create sinusoidal timestep embeddings.
:param timesteps: a 1-D Tensor of N indices, one per batch element.
These may be fractional.
:param dim: the dimension of the output.
    :param max_period: controls the minimum frequency of the embeddings.
    :param repeat_only: if True, skip the sinusoids and just repeat the raw
                        timesteps across dim.
:return: an [N x dim] Tensor of positional embeddings.
"""
if not repeat_only:
half = dim // 2
freqs = torch.exp(
-math.log(max_period) * torch.arange(start=0, end=half, dtype=torch.float32) / half
).to(device=timesteps.device)
args = timesteps[:, None].float() * freqs[None]
embedding = torch.cat([torch.cos(args), torch.sin(args)], dim=-1)
if dim % 2:
embedding = torch.cat([embedding, torch.zeros_like(embedding[:, :1])], dim=-1)
else:
embedding = repeat(timesteps, 'b -> b d', d=dim)
return embedding
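The embedding above uses a geometric ladder of frequencies from 1 down to 1/max_period, with the cosine half concatenated before the sine half. A numpy sketch of the same computation (even `dim`, no padding branch):

```python
import numpy as np

# Numpy sketch of the sinusoidal timestep embedding above.
def embed(timesteps, dim, max_period=10000):
    half = dim // 2
    freqs = np.exp(-np.log(max_period) * np.arange(half) / half)
    args = timesteps[:, None].astype(np.float64) * freqs[None]
    return np.concatenate([np.cos(args), np.sin(args)], axis=-1)

emb = embed(np.array([0, 500]), dim=8)  # t=0 gives [1,1,1,1, 0,0,0,0]
```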
def zero_module(module):
"""
Zero out the parameters of a module and return it.
"""
for p in module.parameters():
p.detach().zero_()
return module
def scale_module(module, scale):
"""
Scale the parameters of a module and return it.
"""
for p in module.parameters():
p.detach().mul_(scale)
return module
def mean_flat(tensor):
"""
Take the mean over all non-batch dimensions.
"""
return tensor.mean(dim=list(range(1, len(tensor.shape))))
def normalization(channels):
"""
Make a standard normalization layer.
:param channels: number of input channels.
:return: an nn.Module for normalization.
"""
return GroupNorm32(32, channels)
# PyTorch 1.7 has SiLU, but we support PyTorch 1.5.
class SiLU(nn.Module):
def forward(self, x):
return x * torch.sigmoid(x)
class GroupNorm32(nn.GroupNorm):
def forward(self, x):
return super().forward(x.float()).type(x.dtype)
def conv_nd(dims, *args, **kwargs):
"""
Create a 1D, 2D, or 3D convolution module.
"""
if dims == 1:
return nn.Conv1d(*args, **kwargs)
elif dims == 2:
return nn.Conv2d(*args, **kwargs)
elif dims == 3:
return nn.Conv3d(*args, **kwargs)
raise ValueError(f"unsupported dimensions: {dims}")
def linear(*args, **kwargs):
"""
Create a linear module.
"""
return nn.Linear(*args, **kwargs)
def avg_pool_nd(dims, *args, **kwargs):
"""
Create a 1D, 2D, or 3D average pooling module.
"""
if dims == 1:
return nn.AvgPool1d(*args, **kwargs)
elif dims == 2:
return nn.AvgPool2d(*args, **kwargs)
elif dims == 3:
return nn.AvgPool3d(*args, **kwargs)
raise ValueError(f"unsupported dimensions: {dims}")
class HybridConditioner(nn.Module):
def __init__(self, c_concat_config, c_crossattn_config):
super().__init__()
self.concat_conditioner = instantiate_from_config(c_concat_config)
self.crossattn_conditioner = instantiate_from_config(c_crossattn_config)
def forward(self, c_concat, c_crossattn):
c_concat = self.concat_conditioner(c_concat)
c_crossattn = self.crossattn_conditioner(c_crossattn)
return {'c_concat': [c_concat], 'c_crossattn': [c_crossattn]}
def noise_like(shape, device, repeat=False):
repeat_noise = lambda: torch.randn((1, *shape[1:]), device=device).repeat(shape[0], *((1,) * (len(shape) - 1)))
noise = lambda: torch.randn(shape, device=device)
return repeat_noise() if repeat else noise()
================================================
FILE: ldm/modules/distributions/__init__.py
================================================
================================================
FILE: ldm/modules/distributions/distributions.py
================================================
import torch
import numpy as np
class AbstractDistribution:
def sample(self):
raise NotImplementedError()
def mode(self):
raise NotImplementedError()
class DiracDistribution(AbstractDistribution):
def __init__(self, value):
self.value = value
def sample(self):
return self.value
def mode(self):
return self.value
class DiagonalGaussianDistribution(object):
def __init__(self, parameters, deterministic=False):
self.parameters = parameters
self.mean, self.logvar = torch.chunk(parameters, 2, dim=1)
self.logvar = torch.clamp(self.logvar, -30.0, 20.0)
self.deterministic = deterministic
self.std = torch.exp(0.5 * self.logvar)
self.var = torch.exp(self.logvar)
if self.deterministic:
self.var = self.std = torch.zeros_like(self.mean).to(device=self.parameters.device)
def sample(self):
x = self.mean + self.std * torch.randn(self.mean.shape).to(device=self.parameters.device)
return x
def kl(self, other=None):
if self.deterministic:
return torch.Tensor([0.])
else:
if other is None:
return 0.5 * torch.sum(torch.pow(self.mean, 2)
+ self.var - 1.0 - self.logvar,
dim=[1, 2, 3])
else:
return 0.5 * torch.sum(
torch.pow(self.mean - other.mean, 2) / other.var
+ self.var / other.var - 1.0 - self.logvar + other.logvar,
dim=[1, 2, 3])
def nll(self, sample, dims=[1,2,3]):
if self.deterministic:
return torch.Tensor([0.])
logtwopi = np.log(2.0 * np.pi)
return 0.5 * torch.sum(
logtwopi + self.logvar + torch.pow(sample - self.mean, 2) / self.var,
dim=dims)
def mode(self):
return self.mean
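`kl(other=None)` above is the closed-form KL divergence from N(mean, var) to the standard normal, summed over non-batch dimensions. A numpy sanity check of that formula:

```python
import numpy as np

# Closed-form KL( N(mean, var) || N(0, I) ), as in
# DiagonalGaussianDistribution.kl(other=None) above.
mean = np.zeros((2, 4))
logvar = np.zeros((2, 4))
var = np.exp(logvar)

kl = 0.5 * np.sum(mean ** 2 + var - 1.0 - logvar, axis=1)
# N(0, I) against N(0, I): zero divergence.

kl_shift = 0.5 * np.sum((mean + 1.0) ** 2 + var - 1.0 - logvar, axis=1)
# Shifting the mean by 1 adds 0.5 per dimension: 2.0 for 4 dims.
```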
def normal_kl(mean1, logvar1, mean2, logvar2):
"""
source: https://github.com/openai/guided-diffusion/blob/27c20a8fab9cb472df5d6bdd6c8d11c8f430b924/guided_diffusion/losses.py#L12
Compute the KL divergence between two gaussians.
Shapes are automatically broadcasted, so batches can be compared to
scalars, among other use cases.
"""
tensor = None
for obj in (mean1, logvar1, mean2, logvar2):
if isinstance(obj, torch.Tensor):
tensor = obj
break
assert tensor is not None, "at least one argument must be a Tensor"
# Force variances to be Tensors. Broadcasting helps convert scalars to
# Tensors, but it does not work for torch.exp().
logvar1, logvar2 = [
x if isinstance(x, torch.Tensor) else torch.tensor(x).to(tensor)
for x in (logvar1, logvar2)
]
return 0.5 * (
-1.0
+ logvar2
- logvar1
+ torch.exp(logvar1 - logvar2)
+ ((mean1 - mean2) ** 2) * torch.exp(-logvar2)
)
================================================
FILE: ldm/modules/encoders/__init__.py
================================================
================================================
FILE: ldm/modules/encoders/modules.py
================================================
import torch
import torch.nn as nn
import numpy as np
from functools import partial
import kornia
from ldm.modules.x_transformer import Encoder, TransformerWrapper  # TODO: can we directly rely on lucidrains code and simply add this as a requirement? --> test
from ldm.util import default
import clip
class AbstractEncoder(nn.Module):
def __init__(self):
super().__init__()
def encode(self, *args, **kwargs):
raise NotImplementedError
class IdentityEncoder(AbstractEncoder):
def encode(self, x):
return x
class FaceClipEncoder(AbstractEncoder):
def __init__(self, augment=True, retreival_key=None):
super().__init__()
self.encoder = FrozenCLIPImageEmbedder()
self.augment = augment
self.retreival_key = retreival_key
def forward(self, img):
encodings = []
with torch.no_grad():
x_offset = 125
if self.retreival_key:
                # Assumes retrieved images are packed into the second half of channels
face = img[:,3:,190:440,x_offset:(512-x_offset)]
other = img[:,:3,...].clone()
else:
face = img[:,:,190:440,x_offset:(512-x_offset)]
other = img.clone()
if self.augment:
face = K.RandomHorizontalFlip()(face)
other[:,:,190:440,x_offset:(512-x_offset)] *= 0
encodings = [
self.encoder.encode(face),
self.encoder.encode(other),
]
return torch.cat(encodings, dim=1)
def encode(self, img):
if isinstance(img, list):
# Uncondition
return torch.zeros((1, 2, 768), device=self.encoder.model.visual.conv1.weight.device)
return self(img)
class FaceIdClipEncoder(AbstractEncoder):
def __init__(self):
super().__init__()
self.encoder = FrozenCLIPImageEmbedder()
for p in self.encoder.parameters():
p.requires_grad = False
self.id = FrozenFaceEncoder("/home/jpinkney/code/stable-diffusion/model_ir_se50.pth", augment=True)
def forward(self, img):
encodings = []
with torch.no_grad():
face = kornia.geometry.resize(img, (256, 256),
interpolation='bilinear', align_corners=True)
other = img.clone()
other[:,:,184:452,122:396] *= 0
encodings = [
self.id.encode(face),
self.encoder.encode(other),
]
return torch.cat(encodings, dim=1)
def encode(self, img):
if isinstance(img, list):
# Uncondition
return torch.zeros((1, 2, 768), device=self.encoder.model.visual.conv1.weight.device)
return self(img)
class ClassEmbedder(nn.Module):
def __init__(self, embed_dim, n_classes=1000, key='class'):
super().__init__()
self.key = key
self.embedding = nn.Embedding(n_classes, embed_dim)
def forward(self, batch, key=None):
if key is None:
key = self.key
# this is for use in crossattn
c = batch[key][:, None]
c = self.embedding(c)
return c
class TransformerEmbedder(AbstractEncoder):
"""Some transformer encoder layers"""
def __init__(self, n_embed, n_layer, vocab_size, max_seq_len=77, device="cuda"):
super().__init__()
self.device = device
self.transformer = TransformerWrapper(num_tokens=vocab_size, max_seq_len=max_seq_len,
attn_layers=Encoder(dim=n_embed, depth=n_layer))
def forward(self, tokens):
tokens = tokens.to(self.device) # meh
z = self.transformer(tokens, return_embeddings=True)
return z
def encode(self, x):
return self(x)
class BERTTokenizer(AbstractEncoder):
    """Uses a pretrained BERT tokenizer from huggingface (bert-base-uncased, vocab size 30522)."""
def __init__(self, device="cuda", vq_interface=True, max_length=77):
super().__init__()
        from transformers import BertTokenizerFast  # TODO: add to requirements
self.tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
self.device = device
self.vq_interface = vq_interface
self.max_length = max_length
def forward(self, text):
batch_encoding = self.tokenizer(text, truncation=True, max_length=self.max_length, return_length=True,
return_overflowing_tokens=False, padding="max_length", return_tensors="pt")
tokens = batch_encoding["input_ids"].to(self.device)
return tokens
@torch.no_grad()
def encode(self, text):
tokens = self(text)
if not self.vq_interface:
return tokens
return None, None, [None, None, tokens]
def decode(self, text):
return text
class BERTEmbedder(AbstractEncoder):
    """Uses the BERT tokenizer model and adds some transformer encoder layers"""
def __init__(self, n_embed, n_layer, vocab_size=30522, max_seq_len=77,
device="cuda",use_tokenizer=True, embedding_dropout=0.0):
super().__init__()
self.use_tknz_fn = use_tokenizer
if self.use_tknz_fn:
self.tknz_fn = BERTTokenizer(vq_interface=False, max_length=max_seq_len)
self.device = device
self.transformer = TransformerWrapper(num_tokens=vocab_size, max_seq_len=max_seq_len,
attn_layers=Encoder(dim=n_embed, depth=n_layer),
emb_dropout=embedding_dropout)
def forward(self, text):
if self.use_tknz_fn:
tokens = self.tknz_fn(text)#.to(self.device)
else:
tokens = text
z = self.transformer(tokens, return_embeddings=True)
return z
def encode(self, text):
# output of length 77
return self(text)
from transformers import T5Tokenizer, T5EncoderModel, CLIPTokenizer, CLIPTextModel
def disabled_train(self, mode=True):
"""Overwrite model.train with this function to make sure train/eval mode
does not change anymore."""
return self
class FrozenT5Embedder(AbstractEncoder):
"""Uses the T5 transformer encoder for text"""
def __init__(self, version="google/t5-v1_1-large", device="cuda", max_length=77): # others are google/t5-v1_1-xl and google/t5-v1_1-xxl
super().__init__()
self.tokenizer = T5Tokenizer.from_pretrained(version, cache_dir='/apdcephfs/private_rondyliu/projects/huggingface_models')
self.transformer = T5EncoderModel.from_pretrained(version, cache_dir='/apdcephfs/private_rondyliu/projects/huggingface_models')
self.device = device
self.max_length = max_length # TODO: typical value?
self.freeze()
def freeze(self):
self.transformer = self.transformer.eval()
#self.train = disabled_train
for param in self.parameters():
param.requires_grad = False
def forward(self, text):
batch_encoding = self.tokenizer(text, truncation=True, max_length=self.max_length, return_length=True,
return_overflowing_tokens=False, padding="max_length", return_tensors="pt")
tokens = batch_encoding["input_ids"].to(self.device)
outputs = self.transformer(input_ids=tokens)
z = outputs.last_hidden_state
return z
def encode(self, text):
return self(text)
from ldm.thirdp.psp.id_loss import IDFeatures
import kornia.augmentation as K
class FrozenFaceEncoder(AbstractEncoder):
def __init__(self, model_path, augment=False):
super().__init__()
self.loss_fn = IDFeatures(model_path)
# face encoder is frozen
for p in self.loss_fn.parameters():
p.requires_grad = False
# Mapper is trainable
self.mapper = torch.nn.Linear(512, 768)
p = 0.25
if augment:
self.augment = K.AugmentationSequential(
K.RandomHorizontalFlip(p=0.5),
K.RandomEqualize(p=p),
# K.RandomPlanckianJitter(p=p),
# K.RandomPlasmaBrightness(p=p),
# K.RandomPlasmaContrast(p=p),
# K.ColorJiggle(0.02, 0.2, 0.2, p=p),
)
else:
self.augment = False
def forward(self, img):
if isinstance(img, list):
# Uncondition
return torch.zeros((1, 1, 768), device=self.mapper.weight.device)
        if self.augment:  # False (not None) when augmentation is disabled
# Transforms require 0-1
img = self.augment((img + 1)/2)
img = 2*img - 1
feat = self.loss_fn(img, crop=True)
feat = self.mapper(feat.unsqueeze(1))
return feat
def encode(self, img):
return self(img)
class FrozenCLIPEmbedder(AbstractEncoder):
"""Uses the CLIP transformer encoder for text (from huggingface)"""
def __init__(self, version="openai/clip-vit-large-patch14", device="cuda", max_length=77): # clip-vit-base-patch32
super().__init__()
self.tokenizer = CLIPTokenizer.from_pretrained(version, cache_dir='/apdcephfs/private_rondyliu/projects/huggingface_models')
self.transformer = CLIPTextModel.from_pretrained(version, cache_dir='/apdcephfs/private_rondyliu/projects/huggingface_models')
self.device = device
self.max_length = max_length # TODO: typical value?
self.freeze()
def freeze(self):
self.transformer = self.transformer.eval()
#self.train = disabled_train
for param in self.parameters():
param.requires_grad = False
def forward(self, text):
batch_encoding = self.tokenizer(text, truncation=True, max_length=self.max_length, return_length=True,
return_overflowing_tokens=False, padding="max_length", return_tensors="pt")
tokens = batch_encoding["input_ids"].to(self.device)
outputs = self.transformer(input_ids=tokens)
z = outputs.last_hidden_state
return z
def encode(self, text):
return self(text)
import torch.nn.functional as F
from transformers import CLIPVisionModel
class ClipImageProjector(AbstractEncoder):
"""
Uses the CLIP image encoder.
"""
def __init__(self, version="openai/clip-vit-large-patch14", max_length=77): # clip-vit-base-patch32
super().__init__()
self.model = CLIPVisionModel.from_pretrained(version)
self.model.train()
self.max_length = max_length # TODO: typical value?
self.antialias = True
self.mapper = torch.nn.Linear(1024, 768)
self.register_buffer('mean', torch.Tensor([0.48145466, 0.4578275, 0.40821073]), persistent=False)
self.register_buffer('std', torch.Tensor([0.26862954, 0.26130258, 0.27577711]), persistent=False)
null_cond = self.get_null_cond(version, max_length)
self.register_buffer('null_cond', null_cond)
@torch.no_grad()
def get_null_cond(self, version, max_length):
device = self.mean.device
embedder = FrozenCLIPEmbedder(version=version, device=device, max_length=max_length)
null_cond = embedder([""])
return null_cond
def preprocess(self, x):
# Expects inputs in the range -1, 1
x = kornia.geometry.resize(x, (224, 224),
interpolation='bicubic',align_corners=True,
antialias=self.antialias)
x = (x + 1.) / 2.
# renormalize according to clip
x = kornia.enhance.normalize(x, self.mean, self.std)
return x
def forward(self, x):
if isinstance(x, list):
return self.null_cond
# x is assumed to be in range [-1,1]
x = self.preprocess(x)
outputs = self.model(pixel_values=x)
last_hidden_state = outputs.last_hidden_state
last_hidden_state = self.mapper(last_hidden_state)
return F.pad(last_hidden_state, [0,0, 0,self.max_length-last_hidden_state.shape[1], 0,0])
def encode(self, im):
return self(im)
class ProjectedFrozenCLIPEmbedder(AbstractEncoder):
def __init__(self, version="openai/clip-vit-large-patch14", device="cuda", max_length=77): # clip-vit-base-patch32
super().__init__()
self.embedder = FrozenCLIPEmbedder(version=version, device=device, max_length=max_length)
self.projection = torch.nn.Linear(768, 768)
def forward(self, text):
z = self.embedder(text)
return self.projection(z)
def encode(self, text):
return self(text)
class FrozenCLIPImageEmbedder(AbstractEncoder):
"""
Uses the CLIP image encoder.
    Not actually frozen; if you want that, set cond_stage_trainable=False in the config.
"""
def __init__(
self,
model='ViT-L/14',
jit=False,
device='cpu',
antialias=False,
):
super().__init__()
self.model, _ = clip.load(name=model, device=device, jit=jit)
# We don't use the text part so delete it
del self.model.transformer
self.antialias = antialias
self.register_buffer('mean', torch.Tensor([0.48145466, 0.4578275, 0.40821073]), persistent=False)
self.register_buffer('std', torch.Tensor([0.26862954, 0.26130258, 0.27577711]), persistent=False)
def preprocess(self, x):
# Expects inputs in the range -1, 1
x = kornia.geometry.resize(x, (224, 224),
interpolation='bicubic',align_corners=True,
antialias=self.antialias)
x = (x + 1.) / 2.
# renormalize according to clip
x = kornia.enhance.normalize(x, self.mean, self.std)
return x
def forward(self, x):
# x is assumed to be in range [-1,1]
if isinstance(x, list):
# [""] denotes condition dropout for ucg
device = self.model.visual.conv1.weight.device
return torch.zeros(1, 768, device=device)
return self.model.encode_image(self.preprocess(x)).float()
def encode(self, im):
return self(im).unsqueeze(1)
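`preprocess` above maps the model's [-1, 1] inputs back to [0, 1] and renormalizes with CLIP's dataset statistics. The channel arithmetic in numpy, with the resize step omitted (input values are illustrative):

```python
import numpy as np

# CLIP renormalization from preprocess() above, minus the resize:
# [-1, 1] -> [0, 1] -> (x - mean) / std per channel.
mean = np.array([0.48145466, 0.4578275, 0.40821073]).reshape(1, 3, 1, 1)
std = np.array([0.26862954, 0.26130258, 0.27577711]).reshape(1, 3, 1, 1)

x = np.zeros((1, 3, 4, 4))            # mid-gray in [-1, 1]
x = (x + 1.0) / 2.0                   # now 0.5 everywhere
x = (x - mean) / std                  # CLIP-normalized
```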
from torchvision import transforms
import random
class FrozenCLIPImageMutliEmbedder(AbstractEncoder):
"""
Uses the CLIP image encoder.
    Not actually frozen; if you want that, set cond_stage_trainable=False in the config.
"""
def __init__(
self,
model='ViT-L/14',
jit=False,
device='cpu',
antialias=True,
max_crops=5,
):
super().__init__()
self.model, _ = clip.load(name=model, device=device, jit=jit)
# We don't use the text part so delete it
del self.model.transformer
self.antialias = antialias
self.register_buffer('mean', torch.Tensor([0.48145466, 0.4578275, 0.40821073]), persistent=False)
self.register_buffer('std', torch.Tensor([0.26862954, 0.26130258, 0.27577711]), persistent=False)
self.max_crops = max_crops
def preprocess(self, x):
# Expects inputs in the range -1, 1
randcrop = transforms.RandomResizedCrop(224, scale=(0.085, 1.0), ratio=(1,1))
max_crops = self.max_crops
patches = []
crops = [randcrop(x) for _ in range(max_crops)]
patches.extend(crops)
x = torch.cat(patches, dim=0)
x = (x + 1.) / 2.
# renormalize according to clip
x = kornia.enhance.normalize(x, self.mean, self.std)
return x
def forward(self, x):
# x is assumed to be in range [-1,1]
if isinstance(x, list):
# [""] denotes condition dropout for ucg
device = self.model.visual.conv1.weight.device
return torch.zeros(1, self.max_crops, 768, device=device)
batch_tokens = []
for im in x:
patches = self.preprocess(im.unsqueeze(0))
tokens = self.model.encode_image(patches).float()
for t in tokens:
if random.random() < 0.1:
t *= 0
batch_tokens.append(tokens.unsqueeze(0))
return torch.cat(batch_tokens, dim=0)
def encode(self, im):
return self(im)
class SpatialRescaler(nn.Module):
def __init__(self,
n_stages=1,
method='bilinear',
multiplier=0.5,
in_channels=3,
out_channels=None,
bias=False):
super().__init__()
self.n_stages = n_stages
assert self.n_stages >= 0
assert method in ['nearest','linear','bilinear','trilinear','bicubic','area']
self.multiplier = multiplier
self.interpolator = partial(torch.nn.functional.interpolate, mode=method)
self.remap_output = out_channels is not None
if self.remap_output:
print(f'Spatial Rescaler mapping from {in_channels} to {out_channels} channels after resizing.')
self.channel_mapper = nn.Conv2d(in_channels,out_channels,1,bias=bias)
def forward(self,x):
for stage in range(self.n_stages):
x = self.interpolator(x, scale_factor=self.multiplier)
if self.remap_output:
x = self.channel_mapper(x)
return x
def encode(self, x):
return self(x)
from ldm.util import instantiate_from_config
from ldm.modules.diffusionmodules.util import make_beta_schedule, extract_into_tensor, noise_like
class LowScaleEncoder(nn.Module):
def __init__(self, model_config, linear_start, linear_end, timesteps=1000, max_noise_level=250, output_size=64,
scale_factor=1.0):
super().__init__()
self.max_noise_level = max_noise_level
self.model = instantiate_from_config(model_config)
self.augmentation_schedule = self.register_schedule(timesteps=timesteps, linear_start=linear_start,
        # note: register_schedule returns None; it registers buffers as a side effect
linear_end=linear_end)
self.out_size = output_size
self.scale_factor = scale_factor
def register_schedule(self, beta_schedule="linear", timesteps=1000,
linear_start=1e-4, linear_end=2e-2, cosine_s=8e-3):
betas = make_beta_schedule(beta_schedule, timesteps, linear_start=linear_start, linear_end=linear_end,
cosine_s=cosine_s)
alphas = 1. - betas
alphas_cumprod = np.cumprod(alphas, axis=0)
alphas_cumprod_prev = np.append(1., alphas_cumprod[:-1])
timesteps, = betas.shape
self.num_timesteps = int(timesteps)
self.linear_start = linear_start
self.linear_end = linear_end
assert alphas_cumprod.shape[0] == self.num_timesteps, 'alphas have to be defined for each timestep'
to_torch = partial(torch.tensor, dtype=torch.float32)
self.register_buffer('betas', to_torch(betas))
self.register_buffer('alphas_cumprod', to_torch(alphas_cumprod))
self.register_buffer('alphas_cumprod_prev', to_torch(alphas_cumprod_prev))
# calculations for diffusion q(x_t | x_{t-1}) and others
self.register_buffer('sqrt_alphas_cumprod', to_torch(np.sqrt(alphas_cumprod)))
self.register_buffer('sqrt_one_minus_alphas_cumprod', to_torch(np.sqrt(1. - alphas_cumprod)))
self.register_buffer('log_one_minus_alphas_cumprod', to_torch(np.log(1. - alphas_cumprod)))
self.register_buffer('sqrt_recip_alphas_cumprod', to_torch(np.sqrt(1. / alphas_cumprod)))
self.register_buffer('sqrt_recipm1_alphas_cumprod', to_torch(np.sqrt(1. / alphas_cumprod - 1)))
def q_sample(self, x_start, t, noise=None):
noise = default(noise, lambda: torch.randn_like(x_start))
return (extract_into_tensor(self.sqrt_alphas_cumprod, t, x_start.shape) * x_start +
extract_into_tensor(self.sqrt_one_minus_alphas_cumprod, t, x_start.shape) * noise)
def forward(self, x):
z = self.model.encode(x).sample()
z = z * self.scale_factor
noise_level = torch.randint(0, self.max_noise_level, (x.shape[0],), device=x.device).long()
z = self.q_sample(z, noise_level)
if self.out_size is not None:
z = torch.nn.functional.interpolate(z, size=self.out_size, mode="nearest") # TODO: experiment with mode
# z = z.repeat_interleave(2, -2).repeat_interleave(2, -1)
return z, noise_level
def decode(self, z):
z = z / self.scale_factor
return self.model.decode(z)
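The closed form implemented by `q_sample` is x_t = sqrt(abar_t) x_0 + sqrt(1 - abar_t) eps, with abar_t the cumulative product of alphas. A stdlib-only sketch of the schedule arithmetic; note this uses plain linear betas for illustration, whereas the repo's `make_beta_schedule("linear")` actually interpolates linearly in sqrt(beta) space:

```python
import math

# toy schedule: linear betas between linear_start=1e-4 and linear_end=2e-2
T = 10
betas = [1e-4 + (2e-2 - 1e-4) * t / (T - 1) for t in range(T)]

alphas_cumprod = []
acc = 1.0
for b in betas:
    acc *= 1.0 - b          # alpha_t = 1 - beta_t, accumulated product
    alphas_cumprod.append(acc)

def q_sample(x0, t, eps):
    """x_t = sqrt(abar_t) * x0 + sqrt(1 - abar_t) * eps."""
    a = alphas_cumprod[t]
    return math.sqrt(a) * x0 + math.sqrt(1.0 - a) * eps
```

Since abar_t is monotonically decreasing, larger `noise_level` values mix in more noise, which is exactly how `forward` augments the low-scale latent.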
if __name__ == "__main__":
from ldm.util import count_params
sentences = ["a hedgehog drinking a whiskey", "der mond ist aufgegangen", "Ein Satz mit vielen Sonderzeichen: äöü ß ?! : 'xx-y/@s'"]
model = FrozenT5Embedder(version="google/t5-v1_1-xl").cuda()
count_params(model, True)
z = model(sentences)
print(z.shape)
model = FrozenCLIPEmbedder().cuda()
count_params(model, True)
z = model(sentences)
print(z.shape)
print("done.")
================================================
FILE: ldm/modules/x_transformer.py
================================================
"""shout-out to https://github.com/lucidrains/x-transformers/tree/main/x_transformers"""
import torch
from torch import nn, einsum
import torch.nn.functional as F
from functools import partial
from inspect import isfunction
from collections import namedtuple
from einops import rearrange, repeat, reduce
# constants
DEFAULT_DIM_HEAD = 64
Intermediates = namedtuple('Intermediates', [
'pre_softmax_attn',
'post_softmax_attn'
])
LayerIntermediates = namedtuple('LayerIntermediates', [

'hiddens',
'attn_intermediates'
])
class AbsolutePositionalEmbedding(nn.Module):
def __init__(self, dim, max_seq_len):
super().__init__()
self.emb = nn.Embedding(max_seq_len, dim)
self.init_()
def init_(self):
nn.init.normal_(self.emb.weight, std=0.02)
def forward(self, x):
n = torch.arange(x.shape[1], device=x.device)
return self.emb(n)[None, :, :]
class FixedPositionalEmbedding(nn.Module):
def __init__(self, dim):
super().__init__()
inv_freq = 1. / (10000 ** (torch.arange(0, dim, 2).float() / dim))
self.register_buffer('inv_freq', inv_freq)
def forward(self, x, seq_dim=1, offset=0):
t = torch.arange(x.shape[seq_dim], device=x.device).type_as(self.inv_freq) + offset
sinusoid_inp = torch.einsum('i , j -> i j', t, self.inv_freq)
emb = torch.cat((sinusoid_inp.sin(), sinusoid_inp.cos()), dim=-1)
return emb[None, :, :]
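`FixedPositionalEmbedding` builds the classic sinusoidal table: positions scaled by inverse frequencies, with sines and cosines concatenated along the feature dimension. A stdlib-only sketch of the same table for a single sequence:

```python
import math

def fixed_pos_emb(seq_len, dim):
    """Sinusoidal table: first dim//2 columns are sin, last dim//2 are cos."""
    inv_freq = [1.0 / (10000 ** (i / dim)) for i in range(0, dim, 2)]
    table = []
    for t in range(seq_len):
        sines = [math.sin(t * f) for f in inv_freq]
        cosines = [math.cos(t * f) for f in inv_freq]
        table.append(sines + cosines)
    return table
```

Position 0 is always all zeros in the sine half and all ones in the cosine half, which is a quick sanity check on any implementation.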
# helpers
def exists(val):
return val is not None
def default(val, d):
if exists(val):
return val
return d() if isfunction(d) else d
def always(val):
def inner(*args, **kwargs):
return val
return inner
def not_equals(val):
def inner(x):
return x != val
return inner
def equals(val):
def inner(x):
return x == val
return inner
def max_neg_value(tensor):
return -torch.finfo(tensor.dtype).max
# keyword argument helpers
def pick_and_pop(keys, d):
values = list(map(lambda key: d.pop(key), keys))
return dict(zip(keys, values))
def group_dict_by_key(cond, d):
return_val = [dict(), dict()]
for key in d.keys():
match = bool(cond(key))
ind = int(not match)
return_val[ind][key] = d[key]
return (*return_val,)
def string_begins_with(prefix, str):
return str.startswith(prefix)
def group_by_key_prefix(prefix, d):
return group_dict_by_key(partial(string_begins_with, prefix), d)
def groupby_prefix_and_trim(prefix, d):
kwargs_with_prefix, kwargs = group_dict_by_key(partial(string_begins_with, prefix), d)
kwargs_without_prefix = dict(map(lambda x: (x[0][len(prefix):], x[1]), tuple(kwargs_with_prefix.items())))
return kwargs_without_prefix, kwargs
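`AttentionLayers` below uses these helpers to route keyword arguments by prefix (`ff_*` to feedforwards, `attn_*` to attention). Restated compactly with the same behaviour, stdlib only:

```python
def group_dict_by_key(cond, d):
    # two buckets: keys matching cond, then the rest
    matched, rest = {}, {}
    for key, val in d.items():
        (matched if cond(key) else rest)[key] = val
    return matched, rest

def groupby_prefix_and_trim(prefix, d):
    with_prefix, without_prefix = group_dict_by_key(
        lambda k: k.startswith(prefix), d)
    trimmed = {k[len(prefix):]: v for k, v in with_prefix.items()}
    return trimmed, without_prefix
```

For example, `groupby_prefix_and_trim('attn_', {'attn_dim_head': 32, 'ff_mult': 2})` yields `({'dim_head': 32}, {'ff_mult': 2})`, so the stripped keys can be passed straight to `Attention(**attn_kwargs)`.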
# classes
class Scale(nn.Module):
def __init__(self, value, fn):
super().__init__()
self.value = value
self.fn = fn
def forward(self, x, **kwargs):
x, *rest = self.fn(x, **kwargs)
return (x * self.value, *rest)
class Rezero(nn.Module):
def __init__(self, fn):
super().__init__()
self.fn = fn
self.g = nn.Parameter(torch.zeros(1))
def forward(self, x, **kwargs):
x, *rest = self.fn(x, **kwargs)
return (x * self.g, *rest)
class ScaleNorm(nn.Module):
def __init__(self, dim, eps=1e-5):
super().__init__()
self.scale = dim ** -0.5
self.eps = eps
self.g = nn.Parameter(torch.ones(1))
def forward(self, x):
norm = torch.norm(x, dim=-1, keepdim=True) * self.scale
return x / norm.clamp(min=self.eps) * self.g
class RMSNorm(nn.Module):
def __init__(self, dim, eps=1e-8):
super().__init__()
self.scale = dim ** -0.5
self.eps = eps
self.g = nn.Parameter(torch.ones(dim))
def forward(self, x):
norm = torch.norm(x, dim=-1, keepdim=True) * self.scale
return x / norm.clamp(min=self.eps) * self.g
class Residual(nn.Module):
def forward(self, x, residual):
return x + residual
class GRUGating(nn.Module):
def __init__(self, dim):
super().__init__()
self.gru = nn.GRUCell(dim, dim)
def forward(self, x, residual):
gated_output = self.gru(
rearrange(x, 'b n d -> (b n) d'),
rearrange(residual, 'b n d -> (b n) d')
)
return gated_output.reshape_as(x)
# feedforward
class GEGLU(nn.Module):
def __init__(self, dim_in, dim_out):
super().__init__()
self.proj = nn.Linear(dim_in, dim_out * 2)
def forward(self, x):
x, gate = self.proj(x).chunk(2, dim=-1)
return x * F.gelu(gate)
class FeedForward(nn.Module):
def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.):
super().__init__()
inner_dim = int(dim * mult)
dim_out = default(dim_out, dim)
project_in = nn.Sequential(
nn.Linear(dim, inner_dim),
nn.GELU()
) if not glu else GEGLU(dim, inner_dim)
self.net = nn.Sequential(
project_in,
nn.Dropout(dropout),
nn.Linear(inner_dim, dim_out)
)
def forward(self, x):
return self.net(x)
# attention.
class Attention(nn.Module):
def __init__(
self,
dim,
dim_head=DEFAULT_DIM_HEAD,
heads=8,
causal=False,
mask=None,
talking_heads=False,
sparse_topk=None,
use_entmax15=False,
num_mem_kv=0,
dropout=0.,
on_attn=False
):
super().__init__()
if use_entmax15:
raise NotImplementedError("Check out entmax activation instead of softmax activation!")
self.scale = dim_head ** -0.5
self.heads = heads
self.causal = causal
self.mask = mask
inner_dim = dim_head * heads
self.to_q = nn.Linear(dim, inner_dim, bias=False)
self.to_k = nn.Linear(dim, inner_dim, bias=False)
self.to_v = nn.Linear(dim, inner_dim, bias=False)
self.dropout = nn.Dropout(dropout)
# talking heads
self.talking_heads = talking_heads
if talking_heads:
self.pre_softmax_proj = nn.Parameter(torch.randn(heads, heads))
self.post_softmax_proj = nn.Parameter(torch.randn(heads, heads))
# explicit topk sparse attention
self.sparse_topk = sparse_topk
# entmax
#self.attn_fn = entmax15 if use_entmax15 else F.softmax
self.attn_fn = F.softmax
# add memory key / values
self.num_mem_kv = num_mem_kv
if num_mem_kv > 0:
self.mem_k = nn.Parameter(torch.randn(heads, num_mem_kv, dim_head))
self.mem_v = nn.Parameter(torch.randn(heads, num_mem_kv, dim_head))
# attention on attention
self.attn_on_attn = on_attn
self.to_out = nn.Sequential(nn.Linear(inner_dim, dim * 2), nn.GLU()) if on_attn else nn.Linear(inner_dim, dim)
def forward(
self,
x,
context=None,
mask=None,
context_mask=None,
rel_pos=None,
sinusoidal_emb=None,
prev_attn=None,
mem=None
):
b, n, _, h, talking_heads, device = *x.shape, self.heads, self.talking_heads, x.device
kv_input = default(context, x)
q_input = x
k_input = kv_input
v_input = kv_input
if exists(mem):
k_input = torch.cat((mem, k_input), dim=-2)
v_input = torch.cat((mem, v_input), dim=-2)
if exists(sinusoidal_emb):
# in shortformer, the query would start at a position offset depending on the past cached memory
offset = k_input.shape[-2] - q_input.shape[-2]
q_input = q_input + sinusoidal_emb(q_input, offset=offset)
k_input = k_input + sinusoidal_emb(k_input)
q = self.to_q(q_input)
k = self.to_k(k_input)
v = self.to_v(v_input)
q, k, v = map(lambda t: rearrange(t, 'b n (h d) -> b h n d', h=h), (q, k, v))
input_mask = None
if any(map(exists, (mask, context_mask))):
q_mask = default(mask, lambda: torch.ones((b, n), device=device).bool())
k_mask = q_mask if not exists(context) else context_mask
k_mask = default(k_mask, lambda: torch.ones((b, k.shape[-2]), device=device).bool())
q_mask = rearrange(q_mask, 'b i -> b () i ()')
k_mask = rearrange(k_mask, 'b j -> b () () j')
input_mask = q_mask * k_mask
if self.num_mem_kv > 0:
mem_k, mem_v = map(lambda t: repeat(t, 'h n d -> b h n d', b=b), (self.mem_k, self.mem_v))
k = torch.cat((mem_k, k), dim=-2)
v = torch.cat((mem_v, v), dim=-2)
if exists(input_mask):
input_mask = F.pad(input_mask, (self.num_mem_kv, 0), value=True)
dots = einsum('b h i d, b h j d -> b h i j', q, k) * self.scale
mask_value = max_neg_value(dots)
if exists(prev_attn):
dots = dots + prev_attn
pre_softmax_attn = dots
if talking_heads:
dots = einsum('b h i j, h k -> b k i j', dots, self.pre_softmax_proj).contiguous()
if exists(rel_pos):
dots = rel_pos(dots)
if exists(input_mask):
dots.masked_fill_(~input_mask, mask_value)
del input_mask
if self.causal:
i, j = dots.shape[-2:]
r = torch.arange(i, device=device)
mask = rearrange(r, 'i -> () () i ()') < rearrange(r, 'j -> () () () j')
mask = F.pad(mask, (j - i, 0), value=False)
dots.masked_fill_(mask, mask_value)
del mask
if exists(self.sparse_topk) and self.sparse_topk < dots.shape[-1]:
top, _ = dots.topk(self.sparse_topk, dim=-1)
vk = top[..., -1].unsqueeze(-1).expand_as(dots)
mask = dots < vk
dots.masked_fill_(mask, mask_value)
del mask
attn = self.attn_fn(dots, dim=-1)
post_softmax_attn = attn
attn = self.dropout(attn)
if talking_heads:
attn = einsum('b h i j, h k -> b k i j', attn, self.post_softmax_proj).contiguous()
out = einsum('b h i j, b h j d -> b h i d', attn, v)
out = rearrange(out, 'b h n d -> b n (h d)')
intermediates = Intermediates(
pre_softmax_attn=pre_softmax_attn,
post_softmax_attn=post_softmax_attn
)
return self.to_out(out), intermediates
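Stripped of masking, memory tokens, and talking heads, the core of `Attention.forward` reduces to scaled dot-product attention: dots = q k^T * scale, attn = softmax(dots), out = attn v. A list-based sketch for a single head, using only the stdlib:

```python
import math

def softmax(row):
    m = max(row)                       # subtract max for numerical stability
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def attend(q, k, v, scale):
    """dots = q @ k^T * scale; attn = softmax(dots); out = attn @ v."""
    out = []
    for qi in q:
        dots = [scale * sum(a * b for a, b in zip(qi, kj)) for kj in k]
        attn = softmax(dots)
        out.append([sum(w * vj[d] for w, vj in zip(attn, v))
                    for d in range(len(v[0]))])
    return out
```

With `scale = dim_head ** -0.5` as in the module, each output row is a convex combination of the value rows, weighted by query-key similarity.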
class AttentionLayers(nn.Module):
def __init__(
self,
dim,
depth,
heads=8,
causal=False,
cross_attend=False,
only_cross=False,
use_scalenorm=False,
use_rmsnorm=False,
use_rezero=False,
rel_pos_num_buckets=32,
rel_pos_max_distance=128,
position_infused_attn=False,
custom_layers=None,
sandwich_coef=None,
par_ratio=None,
residual_attn=False,
cross_residual_attn=False,
macaron=False,
pre_norm=True,
gate_residual=False,
**kwargs
):
super().__init__()
ff_kwargs, kwargs = groupby_prefix_and_trim('ff_', kwargs)
attn_kwargs, _ = groupby_prefix_and_trim('attn_', kwargs)
dim_head = attn_kwargs.get('dim_head', DEFAULT_DIM_HEAD)
self.dim = dim
self.depth = depth
self.layers = nn.ModuleList([])
self.has_pos_emb = position_infused_attn
self.pia_pos_emb = FixedPositionalEmbedding(dim) if position_infused_attn else None
self.rotary_pos_emb = always(None)
assert rel_pos_num_buckets <= rel_pos_max_distance, 'number of relative position buckets must be less than the relative position max distance'
self.rel_pos = None
self.pre_norm = pre_norm
self.residual_attn = residual_attn
self.cross_residual_attn = cross_residual_attn
norm_class = ScaleNorm if use_scalenorm else nn.LayerNorm
norm_class = RMSNorm if use_rmsnorm else norm_class
norm_fn = partial(norm_class, dim)
norm_fn = nn.Identity if use_rezero else norm_fn
branch_fn = Rezero if use_rezero else None
if cross_attend and not only_cross:
default_block = ('a', 'c', 'f')
elif cross_attend and only_cross:
default_block = ('c', 'f')
else:
default_block = ('a', 'f')
if macaron:
default_block = ('f',) + default_block
if exists(custom_layers):
layer_types = custom_layers
elif exists(par_ratio):
par_depth = depth * len(default_block)
assert 1 < par_ratio <= par_depth, 'par ratio out of range'
default_block = tuple(filter(not_equals('f'), default_block))
par_attn = par_depth // par_ratio
depth_cut = par_depth * 2 // 3 # 2 / 3 attention layer cutoff suggested by PAR paper
par_width = (depth_cut + depth_cut // par_attn) // par_attn
assert len(default_block) <= par_width, 'default block is too large for par_ratio'
par_block = default_block + ('f',) * (par_width - len(default_block))
par_head = par_block * par_attn
layer_types = par_head + ('f',) * (par_depth - len(par_head))
elif exists(sandwich_coef):
            assert sandwich_coef > 0 and sandwich_coef <= depth, 'sandwich coefficient should be positive and no greater than the depth'
layer_types = ('a',) * sandwich_coef + default_block * (depth - sandwich_coef) + ('f',) * sandwich_coef
else:
layer_types = default_block * depth
self.layer_types = layer_types
self.num_attn_layers = len(list(filter(equals('a'), layer_types)))
for layer_type in self.layer_types:
if layer_type == 'a':
layer = Attention(dim, heads=heads, causal=causal, **attn_kwargs)
elif layer_type == 'c':
layer = Attention(dim, heads=heads, **attn_kwargs)
elif layer_type == 'f':
layer = FeedForward(dim, **ff_kwargs)
layer = layer if not macaron else Scale(0.5, layer)
else:
raise Exception(f'invalid layer type {layer_type}')
if isinstance(layer, Attention) and exists(branch_fn):
layer = branch_fn(layer)
if gate_residual:
residual_fn = GRUGating(dim)
else:
residual_fn = Residual()
self.layers.append(nn.ModuleList([
norm_fn(),
layer,
residual_fn
]))
def forward(
self,
x,
context=None,
mask=None,
context_mask=None,
mems=None,
return_hiddens=False
):
hiddens = []
intermediates = []
prev_attn = None
prev_cross_attn = None
mems = mems.copy() if exists(mems) else [None] * self.num_attn_layers
for ind, (layer_type, (norm, block, residual_fn)) in enumerate(zip(self.layer_types, self.layers)):
is_last = ind == (len(self.layers) - 1)
if layer_type == 'a':
hiddens.append(x)
layer_mem = mems.pop(0)
residual = x
if self.pre_norm:
x = norm(x)
if layer_type == 'a':
out, inter = block(x, mask=mask, sinusoidal_emb=self.pia_pos_emb, rel_pos=self.rel_pos,
prev_attn=prev_attn, mem=layer_mem)
elif layer_type == 'c':
out, inter = block(x, context=context, mask=mask, context_mask=context_mask, prev_attn=prev_cross_attn)
elif layer_type == 'f':
out = block(x)
x = residual_fn(out, residual)
if layer_type in ('a', 'c'):
intermediates.append(inter)
if layer_type == 'a' and self.residual_attn:
prev_attn = inter.pre_softmax_attn
elif layer_type == 'c' and self.cross_residual_attn:
prev_cross_attn = inter.pre_softmax_attn
if not self.pre_norm and not is_last:
x = norm(x)
if return_hiddens:
intermediates = LayerIntermediates(
hiddens=hiddens,
attn_intermediates=intermediates
)
return x, intermediates
return x
class Encoder(AttentionLayers):
def __init__(self, **kwargs):
assert 'causal' not in kwargs, 'cannot set causality on encoder'
super().__init__(causal=False, **kwargs)
class TransformerWrapper(nn.Module):
def __init__(
self,
*,
num_tokens,
max_seq_len,
attn_layers,
emb_dim=None,
max_mem_len=0.,
emb_dropout=0.,
num_memory_tokens=None,
tie_embedding=False,
use_pos_emb=True
):
super().__init__()
assert isinstance(attn_layers, AttentionLayers), 'attention layers must be one of Encoder or Decoder'
dim = attn_layers.dim
emb_dim = default(emb_dim, dim)
self.max_seq_len = max_seq_len
self.max_mem_len = max_mem_len
self.num_tokens = num_tokens
self.token_emb = nn.Embedding(num_tokens, emb_dim)
self.pos_emb = AbsolutePositionalEmbedding(emb_dim, max_seq_len) if (
use_pos_emb and not attn_layers.has_pos_emb) else always(0)
self.emb_dropout = nn.Dropout(emb_dropout)
self.project_emb = nn.Linear(emb_dim, dim) if emb_dim != dim else nn.Identity()
self.attn_layers = attn_layers
self.norm = nn.LayerNorm(dim)
self.init_()
self.to_logits = nn.Linear(dim, num_tokens) if not tie_embedding else lambda t: t @ self.token_emb.weight.t()
# memory tokens (like [cls]) from Memory Transformers paper
num_memory_tokens = default(num_memory_tokens, 0)
self.num_memory_tokens = num_memory_tokens
if num_memory_tokens > 0:
self.memory_tokens = nn.Parameter(torch.randn(num_memory_tokens, dim))
# let funnel encoder know number of memory tokens, if specified
if hasattr(attn_layers, 'num_memory_tokens'):
attn_layers.num_memory_tokens = num_memory_tokens
def init_(self):
nn.init.normal_(self.token_emb.weight, std=0.02)
def forward(
self,
x,
return_embeddings=False,
mask=None,
return_mems=False,
return_attn=False,
mems=None,
**kwargs
):
b, n, device, num_mem = *x.shape, x.device, self.num_memory_tokens
x = self.token_emb(x)
x += self.pos_emb(x)
x = self.emb_dropout(x)
x = self.project_emb(x)
if num_mem > 0:
mem = repeat(self.memory_tokens, 'n d -> b n d', b=b)
x = torch.cat((mem, x), dim=1)
# auto-handle masking after appending memory tokens
if exists(mask):
mask = F.pad(mask, (num_mem, 0), value=True)
x, intermediates = self.attn_layers(x, mask=mask, mems=mems, return_hiddens=True, **kwargs)
x = self.norm(x)
mem, x = x[:, :num_mem], x[:, num_mem:]
out = self.to_logits(x) if not return_embeddings else x
if return_mems:
hiddens = intermediates.hiddens
new_mems = list(map(lambda pair: torch.cat(pair, dim=-2), zip(mems, hiddens))) if exists(mems) else hiddens
new_mems = list(map(lambda t: t[..., -self.max_mem_len:, :].detach(), new_mems))
return out, new_mems
if return_attn:
attn_maps = list(map(lambda t: t.post_softmax_attn, intermediates.attn_intermediates))
return out, attn_maps
return out
================================================
FILE: ldm/thirdp/psp/helpers.py
================================================
# https://github.com/eladrich/pixel2style2pixel
from collections import namedtuple
import torch
from torch.nn import Conv2d, BatchNorm2d, PReLU, ReLU, Sigmoid, MaxPool2d, AdaptiveAvgPool2d, Sequential, Module
"""
ArcFace implementation from [TreB1eN](https://github.com/TreB1eN/InsightFace_Pytorch)
"""
class Flatten(Module):
def forward(self, input):
return input.view(input.size(0), -1)
def l2_norm(input, axis=1):
norm = torch.norm(input, 2, axis, True)
output = torch.div(input, norm)
return output
class Bottleneck(namedtuple('Block', ['in_channel', 'depth', 'stride'])):
""" A named tuple describing a ResNet block. """
def get_block(in_channel, depth, num_units, stride=2):
return [Bottleneck(in_channel, depth, stride)] + [Bottleneck(depth, depth, 1) for i in range(num_units - 1)]
def get_blocks(num_layers):
if num_layers == 50:
blocks = [
get_block(in_channel=64, depth=64, num_units=3),
get_block(in_channel=64, depth=128, num_units=4),
get_block(in_channel=128, depth=256, num_units=14),
get_block(in_channel=256, depth=512, num_units=3)
]
elif num_layers == 100:
blocks = [
get_block(in_channel=64, depth=64, num_units=3),
get_block(in_channel=64, depth=128, num_units=13),
get_block(in_channel=128, depth=256, num_units=30),
get_block(in_channel=256, depth=512, num_units=3)
]
elif num_layers == 152:
blocks = [
get_block(in_channel=64, depth=64, num_units=3),
get_block(in_channel=64, depth=128, num_units=8),
get_block(in_channel=128, depth=256, num_units=36),
get_block(in_channel=256, depth=512, num_units=3)
]
else:
raise ValueError("Invalid number of layers: {}. Must be one of [50, 100, 152]".format(num_layers))
return blocks
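Each entry returned by `get_blocks` is a stage whose first unit downsamples (stride 2) while the remaining units keep resolution, exactly as in the standard ResNet layout. A self-contained restatement of the spec logic, verifying the 50-layer variant has 3 + 4 + 14 + 3 = 24 residual units:

```python
from collections import namedtuple

Bottleneck = namedtuple('Block', ['in_channel', 'depth', 'stride'])

def get_block(in_channel, depth, num_units, stride=2):
    # first unit downsamples (stride 2), the rest keep resolution
    return ([Bottleneck(in_channel, depth, stride)]
            + [Bottleneck(depth, depth, 1) for _ in range(num_units - 1)])

# 50-layer variant from get_blocks(50)
blocks = [
    get_block(64, 64, 3),
    get_block(64, 128, 4),
    get_block(128, 256, 14),
    get_block(256, 512, 3),
]
```

`Backbone` later flattens these specs into a single `Sequential` of `bottleneck_IR` (or `bottleneck_IR_SE`) modules.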
class SEModule(Module):
def __init__(self, channels, reduction):
super(SEModule, self).__init__()
self.avg_pool = AdaptiveAvgPool2d(1)
self.fc1 = Conv2d(channels, channels // reduction, kernel_size=1, padding=0, bias=False)
self.relu = ReLU(inplace=True)
self.fc2 = Conv2d(channels // reduction, channels, kernel_size=1, padding=0, bias=False)
self.sigmoid = Sigmoid()
def forward(self, x):
module_input = x
x = self.avg_pool(x)
x = self.fc1(x)
x = self.relu(x)
x = self.fc2(x)
x = self.sigmoid(x)
return module_input * x
class bottleneck_IR(Module):
def __init__(self, in_channel, depth, stride):
super(bottleneck_IR, self).__init__()
if in_channel == depth:
self.shortcut_layer = MaxPool2d(1, stride)
else:
self.shortcut_layer = Sequential(
Conv2d(in_channel, depth, (1, 1), stride, bias=False),
BatchNorm2d(depth)
)
self.res_layer = Sequential(
BatchNorm2d(in_channel),
Conv2d(in_channel, depth, (3, 3), (1, 1), 1, bias=False), PReLU(depth),
Conv2d(depth, depth, (3, 3), stride, 1, bias=False), BatchNorm2d(depth)
)
def forward(self, x):
shortcut = self.shortcut_layer(x)
res = self.res_layer(x)
return res + shortcut
class bottleneck_IR_SE(Module):
def __init__(self, in_channel, depth, stride):
super(bottleneck_IR_SE, self).__init__()
if in_channel == depth:
self.shortcut_layer = MaxPool2d(1, stride)
else:
self.shortcut_layer = Sequential(
Conv2d(in_channel, depth, (1, 1), stride, bias=False),
BatchNorm2d(depth)
)
self.res_layer = Sequential(
BatchNorm2d(in_channel),
Conv2d(in_channel, depth, (3, 3), (1, 1), 1, bias=False),
PReLU(depth),
Conv2d(depth, depth, (3, 3), stride, 1, bias=False),
BatchNorm2d(depth),
SEModule(depth, 16)
)
def forward(self, x):
shortcut = self.shortcut_layer(x)
res = self.res_layer(x)
return res + shortcut
================================================
FILE: ldm/thirdp/psp/id_loss.py
================================================
# https://github.com/eladrich/pixel2style2pixel
import torch
from torch import nn
from ldm.thirdp.psp.model_irse import Backbone
class IDFeatures(nn.Module):
def __init__(self, model_path):
super(IDFeatures, self).__init__()
print('Loading ResNet ArcFace')
self.facenet = Backbone(input_size=112, num_layers=50, drop_ratio=0.6, mode='ir_se')
self.facenet.load_state_dict(torch.load(model_path, map_location="cpu"))
self.face_pool = torch.nn.AdaptiveAvgPool2d((112, 112))
self.facenet.eval()
def forward(self, x, crop=False):
# Not sure of the image range here
if crop:
x = torch.nn.functional.interpolate(x, (256, 256), mode="area")
x = x[:, :, 35:223, 32:220]
x = self.face_pool(x)
x_feats = self.facenet(x)
return x_feats
================================================
FILE: ldm/thirdp/psp/model_irse.py
================================================
# https://github.com/eladrich/pixel2style2pixel
from torch.nn import Linear, Conv2d, BatchNorm1d, BatchNorm2d, PReLU, Dropout, Sequential, Module
from ldm.thirdp.psp.helpers import get_blocks, Flatten, bottleneck_IR, bottleneck_IR_SE, l2_norm
"""
Modified Backbone implementation from [TreB1eN](https://github.com/TreB1eN/InsightFace_Pytorch)
"""
class Backbone(Module):
def __init__(self, input_size, num_layers, mode='ir', drop_ratio=0.4, affine=True):
super(Backbone, self).__init__()
assert input_size in [112, 224], "input_size should be 112 or 224"
assert num_layers in [50, 100, 152], "num_layers should be 50, 100 or 152"
assert mode in ['ir', 'ir_se'], "mode should be ir or ir_se"
blocks = get_blocks(num_layers)
if mode == 'ir':
unit_module = bottleneck_IR
elif mode == 'ir_se':
unit_module = bottleneck_IR_SE
self.input_layer = Sequential(Conv2d(3, 64, (3, 3), 1, 1, bias=False),
BatchNorm2d(64),
PReLU(64))
if input_size == 112:
self.output_layer = Sequential(BatchNorm2d(512),
Dropout(drop_ratio),
Flatten(),
Linear(512 * 7 * 7, 512),
BatchNorm1d(512, affine=affine))
else:
self.output_layer = Sequential(BatchNorm2d(512),
Dropout(drop_ratio),
Flatten(),
Linear(512 * 14 * 14, 512),
BatchNorm1d(512, affine=affine))
modules = []
for block in blocks:
for bottleneck in block:
modules.append(unit_module(bottleneck.in_channel,
bottleneck.depth,
bottleneck.stride))
self.body = Sequential(*modules)
def forward(self, x):
x = self.input_layer(x)
x = self.body(x)
x = self.output_layer(x)
return l2_norm(x)
def IR_50(input_size):
"""Constructs a ir-50 model."""
model = Backbone(input_size, num_layers=50, mode='ir', drop_ratio=0.4, affine=False)
return model
def IR_101(input_size):
"""Constructs a ir-101 model."""
model = Backbone(input_size, num_layers=100, mode='ir', drop_ratio=0.4, affine=False)
return model
def IR_152(input_size):
"""Constructs a ir-152 model."""
model = Backbone(input_size, num_layers=152, mode='ir', drop_ratio=0.4, affine=False)
return model
def IR_SE_50(input_size):
"""Constructs a ir_se-50 model."""
model = Backbone(input_size, num_layers=50, mode='ir_se', drop_ratio=0.4, affine=False)
return model
def IR_SE_101(input_size):
"""Constructs a ir_se-101 model."""
model = Backbone(input_size, num_layers=100, mode='ir_se', drop_ratio=0.4, affine=False)
return model
def IR_SE_152(input_size):
"""Constructs a ir_se-152 model."""
model = Backbone(input_size, num_layers=152, mode='ir_se', drop_ratio=0.4, affine=False)
return model
================================================
FILE: ldm/typing.py
================================================
# Basic types
from typing import (
Any,
Callable,
Dict,
Iterable,
List,
Literal,
NamedTuple,
NewType,
Optional,
Sized,
Tuple,
Type,
TypeVar,
Union,
)
# PyTorch Tensor type
from torch import Tensor
================================================
FILE: ldm/util.py
================================================
import importlib
import math
import os
import pickle
import time
from dataclasses import dataclass, field
from inspect import isfunction

import cv2
import matplotlib.pyplot as plt
import numpy as np
import open3d as o3d
import PIL
from PIL import Image, ImageDraw, ImageFont
import torch
import torchvision
from torch import optim
def normalize(vec):
return vec / (np.linalg.norm(vec, axis=-1, keepdims=True) + 1e-9)
# All the following functions follow the opencv convention for camera coordinates.
def look_at(cam_location, point):
# Cam points in positive z direction
forward = point - cam_location
forward = normalize(forward)
up = np.array([0., 0., 1.])
right = np.cross(forward, up)
right = normalize(right)
up = np.cross(right, forward)
up = normalize(up)
mat = np.stack((right, up, -forward, cam_location), axis=-1)
hom_vec = np.array([[0., 0., 0., 1.]])
if len(mat.shape) > 2:
hom_vec = np.tile(hom_vec, [mat.shape[0], 1, 1])
mat = np.concatenate((mat, hom_vec), axis=-2)
return mat
def az_el_to_points(azimuths, elevations):
x = np.cos(azimuths)*np.cos(elevations)
y = np.sin(azimuths)*np.cos(elevations)
z = np.sin(elevations)
    return np.stack([x,y,z],-1)
def get_3x4_RT_matrix_from_az_el(az, el, distance):
cam_pose = az_el_to_points(az, el) * distance
c2w = look_at(cam_pose, np.array([0,0,0]))
R = c2w[..., :3, :3]
t = c2w[..., :3, 3]
cam_rec = np.asarray([[1, 0, 0], [0, -1, 0], [0, 0, -1]], np.float32)
R = R.T
t = -R @ t
R_world2cv = cam_rec @ R
t_world2cv = cam_rec @ t
RT = np.concatenate([R_world2cv,t_world2cv[:,None]],1)
return RT
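`az_el_to_points` is the standard spherical-to-Cartesian conversion with elevation measured from the xy-plane; `look_at` then orients the camera toward the origin. The point conversion alone, sketched without numpy for a single azimuth/elevation pair:

```python
import math

def az_el_to_point(azimuth, elevation):
    """Unit vector on the sphere for azimuth/elevation in radians."""
    return (math.cos(azimuth) * math.cos(elevation),
            math.sin(azimuth) * math.cos(elevation),
            math.sin(elevation))
```

Azimuth 0, elevation 0 lands on the +x axis, and elevation pi/2 on the +z axis, matching the world-up convention used by `look_at`.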
def pil_rectangle_crop(im):
width, height = im.size # Get dimensions
if width <= height:
left = 0
right = width
top = (height - width)/2
bottom = (height + width)/2
    else:
        top = 0
        bottom = height
        left = (width - height) / 2
        right = (width + height) / 2
# Crop the center of the image
im = im.crop((left, top, right, bottom))
return im
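The crop-box arithmetic can be checked in isolation: for any (width, height) the box should be the largest centered square. A minimal stdlib sketch of that computation:

```python
def center_square_box(width, height):
    """(left, top, right, bottom) of the largest centered square crop."""
    if width <= height:
        left, right = 0, width
        top = (height - width) / 2
        bottom = (height + width) / 2
    else:
        top, bottom = 0, height
        left = (width - height) / 2
        right = (width + height) / 2
    return left, top, right, bottom
```

The box side always equals min(width, height), and the margins on the long axis split evenly.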
def add_margin(pil_img, color=0, size=256):
width, height = pil_img.size
result = Image.new(pil_img.mode, (size, size), color)
result.paste(pil_img, ((size - width) // 2, (size - height) // 2))
return result
def create_carvekit_interface():
from carvekit.api.high import HiInterface
# Check doc strings for more information
interface = HiInterface(object_type="object", # Can be "object" or "hairs-like".
batch_size_seg=5,
batch_size_matting=1,
device='cuda' if torch.cuda.is_available() else 'cpu',
seg_mask_size=640, # Use 640 for Tracer B7 and 320 for U2Net
matting_mask_size=2048,
trimap_prob_threshold=231,
trimap_dilation=30,
trimap_erosion_iters=5,
fp16=False)
return interface
def load_and_preprocess(interface, input_im):
'''
:param input_im (PIL Image).
:return image (H, W, 3) array in [0, 1].
'''
# See https://github.com/Ir1d/image-background-remove-tool
image = input_im.convert('RGB')
image_without_background = interface([image])[0]
image_without_background = np.array(image_without_background)
est_seg = image_without_background > 127
image = np.array(image)
foreground = est_seg[:, : , -1].astype(np.bool_)
image[~foreground] = [255., 255., 255.]
x, y, w, h = cv2.boundingRect(foreground.astype(np.uint8))
image = image[y:y+h, x:x+w, :]
image = PIL.Image.fromarray(np.array(image))
    # resize image such that the long edge is at most 200 pixels
image.thumbnail([200, 200], Image.LANCZOS)
image = add_margin(image, (255, 255, 255), size=256)
image = np.array(image)
return image
def log_txt_as_img(wh, xc, size=10):
# wh a tuple of (width, height)
# xc a list of captions to plot
b = len(xc)
txts = list()
for bi in range(b):
txt = Image.new("RGB", wh, color="white")
draw = ImageDraw.Draw(txt)
font = ImageFont.truetype('data/DejaVuSans.ttf', size=size)
nc = int(40 * (wh[0] / 256))
lines = "\n".join(xc[bi][start:start + nc] for start in range(0, len(xc[bi]), nc))
try:
draw.text((0, 0), lines, fill="black", font=font)
except UnicodeEncodeError:
print("Can't encode string for logging. Skipping.")
txt = np.array(txt).transpose(2, 0, 1) / 127.5 - 1.0
txts.append(txt)
txts = np.stack(txts)
txts = torch.tensor(txts)
return txts
def ismap(x):
if not isinstance(x, torch.Tensor):
return False
return (len(x.shape) == 4) and (x.shape[1] > 3)
def isimage(x):
if not isinstance(x, torch.Tensor):
return False
return (len(x.shape) == 4) and (x.shape[1] == 3 or x.shape[1] == 1)
def exists(x):
return x is not None
def default(val, d):
if exists(val):
return val
return d() if isfunction(d) else d
def mean_flat(tensor):
"""
https://github.com/openai/guided-diffusion/blob/27c20a8fab9cb472df5d6bdd6c8d11c8f430b924/guided_diffusion/nn.py#L86
Take the mean over all non-batch dimensions.
"""
return tensor.mean(dim=list(range(1, len(tensor.shape))))
def count_params(model, verbose=False):
total_params = sum(p.numel() for p in model.parameters())
if verbose:
print(f"{model.__class__.__name__} has {total_params*1.e-6:.2f} M params.")
return total_params
def instantiate_from_config(config):
if "target" not in config:
if config == '__is_first_stage__':
return None
elif config == "__is_unconditional__":
return None
raise KeyError("Expected key `target` to instantiate.")
return get_obj_from_str(config["target"])(**config.get("params", dict()))
def get_obj_from_str(string, reload=False):
module, cls = string.rsplit(".", 1)
if reload:
module_imp = importlib.import_module(module)
importlib.reload(module_imp)
return getattr(importlib.import_module(module, package=None), cls)
class AdamWwithEMAandWings(optim.Optimizer):
# credit to https://gist.github.com/crowsonkb/65f7265353f403714fce3b2595e0b298
def __init__(self, params, lr=1.e-3, betas=(0.9, 0.999), eps=1.e-8, # TODO: check hyperparameters before using
weight_decay=1.e-2, amsgrad=False, ema_decay=0.9999, # ema decay to match previous code
ema_power=1., param_names=()):
"""AdamW that saves EMA versions of the parameters."""
if not 0.0 <= lr:
raise ValueError("Invalid learning rate: {}".format(lr))
if not 0.0 <= eps:
raise ValueError("Invalid epsilon value: {}".format(eps))
if not 0.0 <= betas[0] < 1.0:
raise ValueError("Invalid beta parameter at index 0: {}".format(betas[0]))
if not 0.0 <= betas[1] < 1.0:
raise ValueError("Invalid beta parameter at index 1: {}".format(betas[1]))
if not 0.0 <= weight_decay:
raise ValueError("Invalid weight_decay value: {}".format(weight_decay))
if not 0.0 <= ema_decay <= 1.0:
raise ValueError("Invalid ema_decay value: {}".format(ema_decay))
defaults = dict(lr=lr, betas=betas, eps=eps,
weight_decay=weight_decay, amsgrad=amsgrad, ema_decay=ema_decay,
ema_power=ema_power, param_names=param_names)
super().__init__(params, defaults)
def __setstate__(self, state):
super().__setstate__(state)
for group in self.param_groups:
group.setdefault('amsgrad', False)
@torch.no_grad()
def step(self, closure=None):
"""Performs a single optimization step.
Args:
closure (callable, optional): A closure that reevaluates the model
and returns the loss.
"""
loss = None
if closure is not None:
with torch.enable_grad():
loss = closure()
for group in self.param_groups:
params_with_grad = []
grads = []
exp_avgs = []
exp_avg_sqs = []
ema_params_with_grad = []
state_sums = []
max_exp_avg_sqs = []
state_steps = []
amsgrad = group['amsgrad']
beta1, beta2 = group['betas']
ema_decay = group['ema_decay']
ema_power = group['ema_power']
for p in group['params']:
if p.grad is None:
continue
params_with_grad.append(p)
if p.grad.is_sparse:
raise RuntimeError('AdamW does not support sparse gradients')
grads.append(p.grad)
state = self.state[p]
# State initialization
if len(state) == 0:
state['step'] = 0
# Exponential moving average of gradient values
state['exp_avg'] = torch.zeros_like(p, memory_format=torch.preserve_format)
# Exponential moving average of squared gradient values
state['exp_avg_sq'] = torch.zeros_like(p, memory_format=torch.preserve_format)
if amsgrad:
# Maintains max of all exp. moving avg. of sq. grad. values
state['max_exp_avg_sq'] = torch.zeros_like(p, memory_format=torch.preserve_format)
# Exponential moving average of parameter values
state['param_exp_avg'] = p.detach().float().clone()
exp_avgs.append(state['exp_avg'])
exp_avg_sqs.append(state['exp_avg_sq'])
ema_params_with_grad.append(state['param_exp_avg'])
if amsgrad:
max_exp_avg_sqs.append(state['max_exp_avg_sq'])
# update the steps for each param group update
state['step'] += 1
# record the step after step update
state_steps.append(state['step'])
optim._functional.adamw(params_with_grad,
grads,
exp_avgs,
exp_avg_sqs,
max_exp_avg_sqs,
state_steps,
amsgrad=amsgrad,
beta1=beta1,
beta2=beta2,
lr=group['lr'],
weight_decay=group['weight_decay'],
eps=group['eps'],
maximize=False)
cur_ema_decay = min(ema_decay, 1 - state['step'] ** -ema_power)
for param, ema_param in zip(params_with_grad, ema_params_with_grad):
ema_param.mul_(cur_ema_decay).add_(param.float(), alpha=1 - cur_ema_decay)
return loss
def prepare_inputs(image_input, elevation_input, crop_size=-1, image_size=256):
if isinstance(image_input, str):
image_input = Image.open(image_input)
if crop_size != -1:
alpha_np = np.asarray(image_input)[:, :, 3]
coords = np.stack(np.nonzero(alpha_np), 1)[:, (1, 0)]
min_x, min_y = np.min(coords, 0)
max_x, max_y = np.max(coords, 0)
ref_img_ = image_input.crop((min_x, min_y, max_x, max_y))
h, w = ref_img_.height, ref_img_.width
scale = crop_size / max(h, w)
h_, w_ = int(scale * h), int(scale * w)
ref_img_ = ref_img_.resize((w_, h_), resample=Image.BICUBIC)
image_input = add_margin(ref_img_, size=image_size)
else:
image_input = add_margin(image_input, size=max(image_input.height, image_input.width))
image_input = image_input.resize((image_size, image_size), resample=Image.BICUBIC)
image_input = np.asarray(image_input)
image_input = image_input.astype(np.float32) / 255.0
if image_input.shape[-1] == 4:
ref_mask = image_input[:, :, 3:]
image_input[:, :, :3] = image_input[:, :, :3] * ref_mask + 1 - ref_mask # white background
image_input = image_input[:, :, :3] * 2.0 - 1.0
image_input = torch.from_numpy(image_input.astype(np.float32))
elevation_input = torch.from_numpy(np.asarray([np.deg2rad(elevation_input)], np.float32))
return {"input_image": image_input, "input_elevation": elevation_input}
def prepare_proxy(proxy_path, start_view_index=0):
if isinstance(proxy_path, str):
proxy = np.loadtxt(proxy_path)[:, None, :]
proxy = torch.from_numpy(proxy)
axis_mat = torch.tensor([1, 0, 0, 0, 0, -1, 0, 1, 0]).reshape(3, 3)
proxy = (proxy * axis_mat).sum(-1)
proxy = proxy.float()
rot_rad = np.deg2rad(-22.5*start_view_index)
rotate_matrix = torch.from_numpy(np.array([[np.cos(rot_rad), -np.sin(rot_rad), 0], [np.sin(rot_rad), np.cos(rot_rad), 0], [0, 0, 1]]))
proxy = (rotate_matrix * proxy[:, None, :]).sum(-1).float()
return proxy
def save_pickle(data, pkl_path):
# os.system('mkdir -p {}'.format(os.path.dirname(pkl_path)))
with open(pkl_path, 'wb') as f:
pickle.dump(data, f)
def read_pickle(pkl_path):
with open(pkl_path, 'rb') as f:
return pickle.load(f)
@dataclass
class Ctrl3DParams:
num_proxy: int = 256
start_percent: float = 0.0
end_percent: float = 1.0
proxy: int = -1  # needs an annotation to count as a dataclass field
def sample_proxy(object_dir, num_proxy=256, overwrite=False):
obj_path = os.path.join(object_dir, 'mesh.obj')
proxy_path = os.path.join(object_dir, 'proxy.txt')
if os.path.exists(proxy_path) and not overwrite:
return proxy_path
print(obj_path)
mesh = o3d.io.read_triangle_mesh(obj_path)
proxy = mesh.sample_points_poisson_disk(num_proxy)
proxy = np.asarray(proxy.points)
np.savetxt(proxy_path, proxy)
return proxy_path
================================================
FILE: misc.ipynb
================================================
(Jupyter notebook JSON: a single code cell whose outputs are large base64-encoded
image/jpeg and image/png blobs; the binary payload is omitted here.)
rplMXhOK7SNlc5Hy9lRssWeyVNOqyOUrk0lB3JlXJOq/O4LPpEI6AR0AhMQ8But8NAHAS4KGFJRRinVCaTkVOb7dgLuiQrlUrZbJawmV6k8cuBO4tfcjnsqvsy4lQRBCSLCOSXw+FwUBa/Ikr/agRAwGGBBZNJ23e+7tr1csnrzb3xLdYrr7UUCgodq6Pc/0NX/9espXQ+dFVp7cesTr+lXHJY7YfSRz/T94XXcgfaHS2fbPv/XRK6IF/Oa0A1AhoBjcB0BKCf0dHRb337W7/5/t9MJpO3fe97v/Wbv+nxeBiQcTjsxWIRnovFYt/81jc/+NsfDAQCP/v5z4OBwKWXXkp8oVBwuVxHjh752c9+/oHf+i1Fe+Uyib/9ne8kEvG3vvWtpLnttu/19Kx+5zve+YMf/rC/v/91v/ALTpfrJz/9SUtz8y//0i/7/f7+gf5vfOMblPj+973/qaeeevzJJzZu2Pi2t74VmtSjQtPbqwZjbNyJ1p//2P/4w85s2j026rntm+XeoxZerGDB+GueI3/jLg66ygnf+H9a+79tsTrxhZbK5c8O/PMj2WdTlsyu/N7PDPx9tBCzzehWrUFEdZU1AhqB4xGAb7DnHnn00ZdefvmZZ5559rlnj/b2/uOX/vHL//Llw4cP//O/fPmrX/ta/8DAocOH/+Zv/x8sODgwMDw87HQ6B4cGv/yVf/niP35xYmJi165df/3Zv7733nuJj8fjF1xwfmdn5xNPPHHnnXf+wk039fX1P/300+vXr7vwwgsfePDBbCbztre8dffuPUNDQ/Do//zP/2zYsMHvD9x7330ut/stb7718SceT6VSYi8er6w+q0UEbOVi0X74gMXpVJ5P/AV4Rwf6MBMhQktqv6OcKlucZQs057InXy6XS9w6iUJyT+6g1+omkcfq7isODeVG7ZaZxhdrEVJdZ42ARqAaAey2zZs2P/30U3v27t2yZcuDDz4I1U1MjL/8yivxWCyfz6XT6fq6+ptuuBGapJNxuV12J71R2uvx7t2378iRo80tzbfe+pYHHnwATu3p6Wluaj5y9OgtN98SjyfWb1hfX18H4W3duvXZZ5/B1Lvqqqt2vvji+eedt2bNGlSJxeM9q3s6Ozriifj111//0MMPvf7mW+rq6tCqWlF9XpMI2HDYFztXWfJ5C34KnBRuj6W1zaLuj6LF21OE6SwFq6VsKeeK3k1Wqw1Pgt/uW+PszpRzxGYtuVZbU7OzsWjRt1RN3kG60hqBWSBQLBXb2lptNns4XB8MBDHm8F62trZ1dXa2t7dDaUePHsF12djYyFif1WZ7+OFHbrvttueef+7gwYOM/EGNOFfvuvuu1rY2t8tN+r/4y7+02+wDAwObN2/693//dwzKlpaWT3/mM+lMFor9wQ9/8N93/LfVZiU7/tKtW8658yc/fvTxx6DGL33pH597/vlEMol/VVuEs2i6mkhiTSSS5VjU9h9fde15teT2ZF//ZusNr4MRVe3xjvZ+yz34dWspkwtcWlr/J1ZXnTFG6Nid3P9nfX+7r3Ckxdb4ybYPXVt/mR4jrIn7RVdSIzB3BOAbJsKMj497vV54Dvusra0NO69cKm3cuHH//v2FYnHtmjUjIyNNTU0kc7vdvb29jCxi+RGAMiHIyMREPJFYt25tXaguGo0ePHSoWCgQ39HRgbuVXwh19+7dhWKBUUbIEm8q7IshODY2tnbt2hdffBGn6NZzztm7d28ylbJYyls2b8GI1GOEc2/PFZjDmojFcYSWisXy6IjV7bGGw5MzZaSyVmc5O1wupq2eDqvNbikbBMkcG6sjXUwP5kYaHPX1zjrNgivw1tBV0gjMHwJwIcQmrEMYnyTTXhDPXBgGZIghAEcybUaSyVVcVDI7hvTEk0xm1hCQaZ/EczBqKPESSSkcpEc+8QT45RKRZnFSNDHzV0UtaRkjYBAhb0dUgfuG22K601wtIsQ5CgUed9PIOsISGcraKbqM7wCtukZAI6
ARqHEE1LuYgqDEAKHw2bTFqcd4rvpS0TAQa3NNPeOjvGlyzPIG4p2U18/ZYVVmjZUSq15XqzGfZXHHkiHguBeYY1fmO1S2TS3bOqVkoOBF/rQquDilVNVAtQimCQ9LuVTG4ValOTdDVYYTnc7uBjhRbiO+8o6YbbFVAstYZ9SFo/oCDTNlJHFVmslIaCwz4NK0LErC8WqABlmAa/KeN7JMliXSj09fpcO8nlY0XJl5gdUNN69lLYKwKfwXD8BFqNSSKMKRYpqodLlnrA83vEEM+D3o9M9Y3NIWQE8yNDTMAP4suZB1Th6P1+xlTlI5YMzl8sg3nDkqIZ2I4fM5hmkFvIpRsMpFIClxEzEHz0hQpjNyOJzMsjtRcZRiiFfdFBUhV7FYMKXNmMvsO49pY6RDwuHDR6ampE9dVP8qXYx+cDIrIDA2s2ZNj5Q4YykniqQUJhAmkwk8bZVpED1VpBnNomknpRi+NTNyzgEgcrncsWiaKf2pVDoYDHR0tPh8oJoxW5PaGRWcFI4yDps9D19WHzTWNDWn0lA1cd+p5lTH1IWKf0ljXBJEzaaoSDG7YCqVzOfzLOmTB1YYi9K9Hg9Ng/8QMfAZ95LP5+dxTiQSgOAP+Cm9WODpZvY4+IuK1WqgZKFY6hsc6Gxr87jdhUI+m80hk3pRRCAY5G2AW7TqJQ8pbpZmMStvFlVgMaHTQSl4VqtLN3NLwyXimf7+4WQyFQj429tbAgFXNpfhbcZMxkL8GaGmdtMby7iXjuU1hUw1ihnB08QdMQkR8sEEqEGAFAiRdMgHHMGBSNJMtvwJFCIBPQNyWXxZhZ4SCBIzqGYUNf2SwDY9/fT46TGivZTIrxIigiSpeXk5BRy8G6KvWQNpaeXtZM0Ec7zsdhpPvUmd5KBhjGQyHs4DxvQtFrGSkeNE+ZDJY0YR8tTNmIyrtPopSp/KOan5iUucSjg//1JcJBo5dPiQQVFKJveCgpG71biPJ0+nSuvs6vQ7FJgqCX5okvFnHDxF9HxTCRVnFEppRmFBn0sURDf0/BPPU5AcxHAADqdgGA6Hg8GgpISHHn3s0csvv5xLpNmzZ89PfvKT3/u936NIpt6R5hiYVguT7khPb8jM9VwuxwQEcq1evVrNNXC7iTFVIoCuqKj+ylYJ2K3lyVvHMIhQe2h4aGx8TK0oJZFqOzX9j6FlAoV8hqXRqtJ0B2WLz+tds3aNVLCyFDOMNA4U5jAjlRpW6/DIMLMqqJ3EK/vDYi1YCi6bi8tSQbQFUrr1k5Si7i3VAkYrGE1wXEmGdOCNxdJf/8bX9u/7XkPDHo8HuILR6Lnnn/+rb33rW1xuRraUoc/dDmKiD7+pYuan/Y+8o/smM0YC6MZMxapIOaU1ufrSK6+ATENDAzP7iTfrrhTlzxjiopl4ZIihUA4zzYxizUjSmykp65GHHmKKyn0PPMDivFw+z7oCj9d76NChgN9/w/XXbzv3XKxeqrZ+/fqbXve63sO9THJJpFL9fb39vUd3XHY5y/hCdXXJRIJpLNxn9eGwSj3VSzhcrvu//W3b3Xc/f9FFm6+/KZfmCgaikybjNevQ0Uc2bOjZvnWrPPjcH06HE0LIlkvfsBYbcoXXO13Gdh6m7tUBu81yZND24wdc112c37oeXq5OwDl3ciqV/853v7Vr13caGl4LBhP5fGBg4JxzzvmVd7zjHV6vc7J0m21gcJAqkGXy2QQpAyuavr6+Hp05JTHokaCvv48XBSMJ97WdWoMAjzT4qLeEEreipciDY7NGI1Fm/aAGklU/ZrPx1txQX8djqCYHGQLpJ7dt24ZYEtAQJKbdefo8Xo9skUOhckgNlSb0DHb+U8+yRB73y9tBfhp41I7EVR0jMcRzU021mpIDF6AqKwjUy8GUfAIsrps0pitKIy8SZOuViuhlGlTtJFhL67IolfajbWgqdf9arTt27GBGlnmXm/Wk8Yx+yYLB8ZrRtFyiR+YGYu4WpMhTxCPN/UMySUxRSC
YZtxdP3UMPPXT11VeTDO6kIFOyBIhh3lcmnejsaKMp8sbDX5Wm8pQVtT6vvampOZdXpUwXWJkYmqB9p1dK0gAI2dHZUFs9Icdui6nuiRiuUlnpkaU4oAA3HhXuaTOvWa6Jc75YHEql+INs6rzeVr+/gV02zHQKJl65KVa9QdMcDz74IM8MQJGegyK4RHJojJXCPEtvetObwJDuCVRZX3zBBRfIGil6Z9qOlAF/4K//+q9ZZfW6170OCcTQs7IY+Stf+cq73vWurq4uute77rrr5Zdffve7381MPKTdeOONkxga77OxgiNRcBCExlCVgMNadttKLlvRZc272HTWaFY6NazQxlWbvXXNIEBvQX+BnHw2nUtEEpGR+NigvTw5c4FacMmEBQkcxHBQRybBUwufz8ep1FcSADiwC7zE8OxOWCfutN95+Z7L6RRZN80lkQl6ZORAQnUpFstoOk0TRNJpt83WEgy2YuUpklZNj1jSu1zO/ft7v/jFD91884O//QFLICDlM/vx/h/+8P4/+7Of/tEffb6hISB3r6kPidgp5cXonnetfp1kMH9FB/PUDFAi98J3v/u9cLgRbisUeIB2vfe976P6ZEF/XlZ4oEgmNhzvAcBCuVylCxaFTWlVAdLY7A4YiHizauoudTqvuOyyxx5/HDlvvfXW7du23fb97z/19NPf+8EP4MVQKEQXQF5ewriLmOSZLRZ/8n//b+jw4eE1az0+H2/H3Eg0RCQS4UZatWoVyKMqxECLj4+NtZZKYyPxfB47rA1EMeKHhwcPHj6QvPO2iNve+Fd/3dbc7La6j473/8XPv/jrb3rf11y7/9M19A7XubcUtpWVo2LagTZUwMaGasUvfcfz4m7Hs7sc//iprJPdHisMQ1KhVW/vyN/93e9ee+3PP/tZSyhk6e21vPxy5oILHvrZzx760z+986Mf/Ye2tnDe4AwmozJV1QSHEjjoGagX3dfzzz9/5MiR66+/nnahvjxizc3qxh4dG3vp+ec3bd/+429/e2047AuF2GeS56Lo893wtrfxBkZ7gRswcic5PJ4H77xz4PbbL/jwh1s7O1999VW5S5nIysOLwqIze+Jwb/MrfQglkp1fmljSqBt7qmcQdFBV3d50xfl86AMfSP/qr+Ze/3puJsVwHGRzuehq6z/+8dhnP1vs6SnlcmRRDpkHHvD9/d/H/vVfS+EwtUUyrqRSPO7/xCdiN95YfNObnAYRlt1u77/9m2Pnzvjf/i3SKExJRiwl2u11f/ZnhZaWzB/+YTGZRKyotEx/HYODg4I1FQBT+la6UQJYCby98vJy++2333zzzRgKVbTBclj6fe65UMj3wgsvAMRll13G6wxPKTOeuXu4jTBWjO2TyiMjER5F7ozm5npgRD4C3/CGNyCctu/u7q4SjjIel+3zX330+w83vOOaA+97a9fatT25HMQ8A84Uncskfu2Tj6Usnb9202u//MZzAqEwN9P0tjFakL7Z+vKu15wO+5qeHkonZaVQSUPvwE3PKiVUhYSoFxpyu0D+POq8kZGFlPwSR2Jw45f3ADovoOOUJ8EAJyTJkIOEUqEYy+cORCIhlyuRyXTxxPb3Wdrb+2LRjfVhdTsah4kGq6AeffRRgEVJ+j5aCoVJxi9FPPLII7fccgudI0UTee+99z733HOf//znKZpnDHqDBeE5KvIf//EfzFNncTFNhiYwBE8jJMoT8eyzz5KM2+Cmm24SsUjbt2/fFVdcYWyCxUtueTjtnCi4/C6FE0Qo3VSubM0WHS/3pjbUZTe1uoslg2yAq2d9Y/d6nGMGgSrLDJic3pDVFXx2yF5XHAnYFODUkboytZ1SqJoSbdyBdEAoQ6dz4MCB1tZW3sqptVp2xmup0ZULnkZy9QMvs1Vu2B5OJBPxaJx3frNBSUll+cWDTU2RI7mI2RuJgCMs2O7zcbumnc6Xk8nVwWC9x8PERZKB59BQ9O///r0f//hTW7ZIPvVWzR3Q0GD5wA
csDz/8w899rvCpT/0LdjulTKaY+geDfip47F/KNVvWjAUEm817332f27LlC4cOtQ4OdgwPr7/++ndxy0lFUB7NpfVJTEY6btQjkqbn4CmTeFPmsQAeBae7f/9LL97znave/UfeQJ2CBD2ooN2OZNoaYOncuaW5z8mYzeX4Qz5qUS9uCe7nKEycTge3bh31erdCCfk8lzhID2mzO0xfXx9LFDLZ7FMPP5wulNde+ovZC2+6vqn51V0vxibGMpk0idvaOjZu3LIv3FCyFtGCt5Zv99/2g9SPHig+/NDR5+q3tf5R5Mo3W+ryLguPyrEqSAid1XZXZXukt9jUydZXyhqx2m2vPWYN1xfa1lmKeCrQBySskUjyb//2A7/3e/dfcIHKjJrvepfl8cctP/uZ5dd/3bJ160/+5m9yn/rUf3g86tnHVvXX1bkNn5YJI7Dw1PCwsDADCXhW6BWJhJPo1rA3n7z99lVPPLF/fPzCK68ce/DBc0dG1jkcvKLeAYm+/e2IpaHp7mg13toe/OlPOx555M1r197+yCON7373+eefz00O2jynckuQmGeTpuRZAEbK5enmFYS+FGyB7txzz6WZJDH6yF2EPpgKyAFMGy6cK64IfuADE9/7Xv7SS22yLytZhodDv/7r/71nz6v/8i9v/YM/aGpoIK/a1HX7dq/PF/yDP4j+278VsFaHhsaGhi78+79P3n//12KxK9ev37p5s7r9cGpfdZX3b//W/yd/EvuLv+CGoAPEE8VM3+Df/M2Rf/iH237t17Y/8MCVl14q96qCe3keyrdG84O1oMxDheXBKSjTEqCMkUECbnTeCqUB6IcZHjt06J2dnaqH2b//i7mcI5OZ4JGgu+Gx4V2etgQQ0jNqvm/f05nM+8Lh/J49zaOj32xpaYIReKJ5/G688YZPfepT9N081SggGFIc4Uy2/J5bt/r8Rx7ZFfjzr4585JfT27edk8ky+jWZrDKx3en5kw9suOPBsf96vPno4Isffd/WuvpG/OlVjUJlE7Hojx946u5Hn7t0Y8va1d10tTAEb3A82yKZLgAj7I477oCEqD63Mu9lqMrv4Ojoq729RL3tjW/kNiI9FYQsucQaJtJwHwMdjxB5QY9KmQoIpAMjI0fzxbWBYCyXC9kcQbevva6x3uYazmef7x9YH/DjcSILkjnIAgvCBFACjwTvFsQQTwIClMubBJSGGYcaAM57KxY2jxBpvva1r73vfe/bvHkzzSFXL774YkiaZiU7dSQZi7QuueQSXmL+6Z/+Cb6k1agIpiGi6GpJKc8eLtDxtGM0bw16bW1Bm8NqiefKE6nyeKo0nizuH86sC5RLuNKMZuGHwZiQm7fMUjJbfmWknMd8pQOz2COx+KtHJq6op3xVQaqAYugvtx+RlPvaa6+hBm9gKEAClEQffqksPQWNRaTKz91DR2h0lxCh1+LtKnRt2bAlX8TYVm9L0JW83pilUGXClMu7xp6JiYDL1eH3Y7/31NeXUylbLrelufnlkZEN9fVe/E7Kt+b8xje+/J73HGPB4WEL5sX/+3+qcI6rr7bs23f7D3/4ul/5lXdJo6jYUx2oJ0kgGMZzUTaTyT377HcvuODvN27M3X770cHBoxdd9L9ox2w2TUp0pn+k4bgnaQ5uBu4uAjynQEe/DItzmxGesWS703V0785dP/gzS7DT4cHW5BUEwlBNBSCDQ0MEEEX/CMgv79rF6ZrVqxubmniLZOEC9aI3hwLv+ou/8J9zztVvexvvKBRKYt5UhCNxM9Cv8l5L//j8Qw95vvmNgVDT+o//FXAPDfXTpk8++fAtt7ylsRFDSg0NDL7xbZs3d7c2Ng8kBu6q/3KpbXydy9r2ROlfN3+21dVcZBhMja1SdJ5pSccqhVU7uCf0t3/ueuTpwttvfuOW3/mfJy5/58QXW9/xR2WrM3fTjsj/+VwxvBoKdbs93/72v77pTZMsaNTU8ra3Wbq6oEAlb8cOy8033/3d737rt3
7rN9PZzNFnntn13e9e/sEPNtTXox4JuCG5+QkT4Klhqf+6deuoL20nj8/O554LPffcVatXH9yz55Hx8a63v/3Jhx7KkIznf8MGTG3SkRi0MWF/wrv+HXdc2No6Viy6Xnvt+WeeueqKK372s5/RiLw3k4yDQikOMxEDlA5HDpqVLpdm4q0UHSBIUtIiHCSWAP0tkdzSsYmJoV/+5fWJRMOv//rQD34QWbUqEYkUotHzPvYx1mM+dM01bc89d2D/fiFChYLPd/Dzn2+8+ebopz99V0fH8DPPXL9/v+/w4fyv/EqHzdbZ3U19KYKnqM/rXfXd7zb90i8VGxr6fvu3x/r7h8bGtt53X/2XvvTVc85pveKKl1944bIdO6QWSvLyPLDaPbS38LlUBug55DGjMejT6ZUefPBBwiRW6FhYhh/t6dl/3nmFZNLy6KO/UyqtsdludrnUSDsOB+540EAaB8ITiaHzzhtsb7esWjX2/PNv6+9Xi1h5Fduz51cuvvj3r7vuejprni7zLqQN3G5XLl/sWdX28Q+0v3Xvvv/4aelH941EIo9s375demcTbW5NtOIN7IqLN151cf7eh178n2dbv/1fT157ac/6deu4XyQlNcIQxE6689Fd9Kof+Y1br7vmaroY9pj479tvb2hsvPyyy5S733gSYH3sIZSE3sj+yssvk2xVZ2dzV9fazZuzQ0NsS8HiXBJQOyTTN3HvyinKoyE4EI9uJCB+Eo1CoT+TrWvuybv9sfio3+WNO70OX2tC3ZaWiYmB8XQCO4DU4EYWsvP6ed111xGAHsRuRhoHiOHE5ioEOSncWHFFq5GYmM985jMoQDdEmMjf/M3fJEyHxSkHlxBCp8abJs4ZFMZlunPnTrqz++67j1/sRRKgBr+wGIfR+1s9TpvXac0oaw/uo89iNAGLUTU0RKiSGSQnRUA9ETUvQblNGTfMy7ihoQA/Ipx+Foi4/VCJVwcMROLpYWkIelvuCuJJAMJc4j5UqiAN6orHS83NFkbmiCnmC2UsZq/FphqFLSTHx61NTfSJqhSy0xyTpVit4zm0trS4XAORSD6TOTwwkMD5kUw2hELdgcCReHwD7yJWK761aPS7N900pa7F8tBDli9/2fKe91guvHAyEjvjox/9j2j0jV6v51i6E4eUpkbLOhyugwehsIfS6RfS6V2trc8cPmz57nctQ0PeTZs+cOON70ynk0ZFJ4mQDTNZBg7lAAIvprQXhfCWQ8sCGrBIg1aVbHc4B4/s3/Vf/9eWGmq9+v04FYt5dT/w2otwDnZjUac226p77umZmNiSSJQ6OppaWtp++lPmxuC/w5mGaTPY3x/E/xMI8Fy0NDZyy8kLEy50Xlx4/+PmoQVz2SwW0D5vMFvf+OorOwu5DJ7eiy66bPv2Cw8c2FtXFz56dP+ePTtvuunK5uambK6YKrz2v9c1HTnietoaiftjfa+8MuppGBx7OefodVi9mzpvCYc6S0V1M6uDVne4Sg677eCY63PfvqnlJ5c2fvOmvf9sKeSs7nxm20W5UJulkOdeGR6O9vV96+Mfl2wWxmRfe82ybZvlnHMUF8px662WP/iDb46OvsvF/cU8oP5+tnnDfwUmqijjlxuGZ5m3ZHDmmaKC3JNEZnK57//jP24YHi6k1ZuK4+jRO1966dY//EPM3se+850ta9bQtdHKHNjcu3ft+unnP//m7u7XxsZIPDE2tvv++3GwARpvt+wqjlhS0o4Ip1x8Mzh1eD2VBuWXXETybOIU4ZTExPArSkq3mS4Ubvu//zfPa9x733vpzp1tb3/7C5/7HKP9Gz/yERye3Fjr7r+/we8/d+tWOgEy0j3d85OfRO+55+odOzb8x3+884YbeP929/Vl3/KWvfm875pr6OiZ4MRtds8993C/8Sq98cor3/W5z+2+++7hdet6XnkFhWzvfOemUql3bOxNr389+EjPiW7L9HDgkubFhIOagG9Vfejf8cvxSkKvxLs53ZbREs
qC7OuzB4PqZrDbh/r7rYFA/MiR/fKyzMuQ8T4kQ1m8GZUPHmRAi3ECRpWPcpvxRwO98sp3A4EbrrrqUo+HqWVqQEJ6q7vvvrejs/2iC87HpGM8i+GHrtDdodVveq3vxYaGo1gGcjfQXdKVP/7EUxfvuGDVqtXpdJb7afuW7qdefrlt4xue3HlfR3ury63c64jFWXnPI0/3cVNd/JbSY99rbqinwaD2HRee//grfY/tHPvhHZ/+y09/0uv1yU0GudJ7DvT37z1woNTTwwO999VXe59++nc++MF8RwdPiCQDBziJHoGugZtS3WROJzG4PggDF79yZ/AWh+eEQcVV9Zjg+ULR3ujHx2plX4JEId7gbvTYQv0DMdqAbloqyC8H7MUvJXJI68jjgVjuPy4RqdIZT4iUxS9VJt4sHXjNS3IVPRFIm5KMq/Ru/EK3CEd/kCQvMpUEiBBRyFR9BF2z6pTkIIauhyFNVbwZO3WVfxVLGhkVFxp/RpemUpADJWlB7jHQk7JQhp4d9NCNihNQkotFmkOyoI86sCOZ2nPkSJk3s7Y2e0vIBj0nbYODuILoLHA2MMloshTuE+pFW8iQ21AiMYqnntEsj4c/3MpUfCIW23XwIHdthvdfRmVcvLvvOeeco3I/I4hXiMFBy/vfb/niFy3//u9KMgcGf1vbq4cO9bW3N0rMsV8z57Eo6o+hTK/neu65xwqF3z3//P6dOy1PPGF58cWWCy74pXXruq699vzNm3HtHpuPSl1pDtDgVQD3C3SIlcD7GW4AelIokAdTbgBSVhbFi9/EcN/O//xLf+TVAc/WizZfnFezJSctHlJy8wxh5FosjQ0N99TXf7Cv73V03Oed9/AVV5z3zDPbHn+8bc+eXZs3ZwqFc887b7Sra4sxWAWS3OfoI6g+/vjjdB041dEE66G5tS36rve77K5m9UTUcy8ZN561o6Prpz/9PmfMBYHUsUG5XRos+fABz8BjE7d2ddrDUUvsY/27yt0X+nvwMCjz76GBsd9vWXVhqaiGtNXt4m8b+suvBm78btPH/sTbF/nfkc+sKx20uK2Rv/z98ff9qS2WsJTU87V374F16w6YFvL4uOXv/s5y222WX/1Vy+tfryRxuFyWnh52tjl4wQWbAued59yyhdmtPOwgyVXUJgCwhOVVkgD3KgjzRD3+6KPXhUJpn29DR0fY7T44MUHXmf7e94rbtl320Y+2NDTwvklihLCBzs7773/vFVf0ZTLdnZ0BxlOPHDknEOAtk3cajEJwAx/KgvyoH6XwVAIOTcMDiBCeSgI8Hdy6vL7DhZJMfmkFmmPy/bhQSEUi4UDA8aUv2T760Zv++Z8La9Y4eKK/9a3dfX07zj9/DY7rNIyp+oFyLrf1/PMffOihJpzqr3994/33U7XUVVelwuFHIpHrL7+cZw+taGI0odOxP/DATeeeW3j962+96y5GB60HDuTe+Mb99fWDoRBv2IVslmSIXdbH5MCAAETlqx4n6obZwRuK+byRxuhjS4w/09fgTenpsV1//ajH82XMDEx8po/E47nBwY+tXo0/AXJy010fOWLhlWj/fuuGDfjkudXU3/btB53Omw8d4kX438477xp6e9qe8bXG1p68remRx5659JLz0Yieyul093R4bc1rcrnDRs8/+TzTaW7aeuVLe/qZQdLTswolj/YNbuipX9+W3B+xR2PxxkZFRU6H49+/+Z1sx5Xrt100cHDXpl/87a/87KfnPPncL1x7+Q/ufc655trkK99ubGjmngMHmp8scuzmZXL16lt+9Veb/f6BSy+94x/+gXkB2847D/sCHDi43eFL7ksOYSy6bPFsyINk3hwkzhdLGWvmR6nv58uFtfktsUzIai8/U3h6LDe2JrZqs3VrwaJ21gBEpKEAv0Yhx/0gsPISaeRZIpFZ1ikDJAZq6ou5j9uHXgxbmeeNsRDee5hoSoyUTlliEapyj5crpzwXSh/sQ+Vyq0pyfAbjDDrkIC
U6oLnqQI3tRUCSvp4Hm0sAyLsFATyBPO2YQYyFYCvzDqQyU1Onq7xtq7VUYDgof2Bw4InDB53MIhiu2xDedI6DKbSUUsgdK4V7WErBHRlLp5uDwZ6mJu42+h4/L/5YM7nc9rVrmRj51OAgtONxOsfHh1tajkH67LMWRgrf/nbLlVeqmRembdHYmGTnaMBUik0dTrstH0+U9u5XepgHVTZsTUyWWOyP3/nOfkatRkcVxe7YEe/qsr3jHb+Oh5vpk2YOAoDAfUV3jLsYHCBFICIGQMxkIEljmacSwNJ48fmn7Nmxo3H7phve6vaFkMwMKa6iLf/w7OBu5fSiiy66ZNu21r1709iIzc1jOM02bNjy/POOdLqJQUTjA0mMyHE/c3CT0OuhDHx833338RqNwcQ9T1Oial1dqDEUY8YYLzPx+AQtjIeTLj0cZs1JmPTr12+47LLLkUBim+PKu7/8s7r43otuvcBRzlpjGduG+mJz28Sot3Sgd/WrrzzZ/6XA+7/A7tugoJBgftbwgOfJJy2xPHW4OPeMqmnB6r/txwVPOH7T20uuoKNcGh8faW42uFNdpnfCdFaBN7xB/ZpHUxOTnsbADXsLPARAaUd+uRsffvhhfrnlqJrk4pQub+yBB96Gi95u//mBA+vD4SORyNr6+mfy+fYHHvjxa6/9rw9/GKyoHe8Hr7zySs/Bg5d1dkZzubt278bf0Fgs+qJRGOhXfumX2BYc3IACsQRoYtiRjKYysOD3vvc91GOsiqvcogwNoCdpeHJRiZ6KVyIkMMnr1o9/HPcpfRCqD/31X3ffcANA9d5zzyiDQePjzA5giq9Zd4TwRFz54Q/f+Y//+JaGhmZ6Axyh3d1PMjjyhjcEedExhhi5za655pqevr6H+voYsQwwzjIy4n300cxFF+XOOecnu3df+d73sle6QGcKX6YBB08UL8u0BPgKyjPWhHblhiCB0eFCA4yNWXih7OiwrV/PTQolpIThsIJefTXQ03MdxEZ6rHlue4gwHueRs65apcQbdxQNyfQlHI3r29q25vNqEgc3AaKiidKWiy/8nzsGWlsOYeoxynjLDdsef/onLa0djMEhk0OUJEss6+nacsu993zjxnJhZHScDvSCLcF0al8xMxqJ1IUZmTcY4pIdF/7o7vv2PvLDRlc+euSyzdf98mDv/r/59v3nXPO2h37yXdvE3o984vfoIkSyZAEQ3oxGSqXvPPss9gjdRIzi6HmN9zVSkgA1uB3pnvCy0kcQprOAGsUW5PY1aqSSARyTKkaiyUzJXnAmk46J18afS5dTJTsbmTsOJoeGUuXz6jdTPXpfgYJfDlXbqYNTRMmjYl4yA4LJbH6RQDL05Fm99tpr0Z9ukQcP45t+DSLkzVTKpI5qEoJ6PisEqwpNHoYolGRSzKmJkOadMhEV2SOCLhU15PYjQPPxwPMEUiz14kC+mIzitFD6qPunPDpYnJiwZ7LtqWJbYGu6LfTkZeNDtvie8qFQ0ecvMsWzUVlplEJHQyn0HZgyxNBPpe32OBPf8Udhb/EGw1wnxkIwubC9uInx2yv3rGvKkaxqeuedlrVrlZ+N6//xH5Y//mMVyZHNIm9yVYzE8Psv+6y5vk35+39QtKkOa/LAfXvDtbxXHTr03I4dr0YiFrwpeErg6PPPT9955xe6unCVXS6wTOVR/3K/Ce3JMCHcg1OVuhCPjchdByRVuVBpYmJs0/mX7XH56jteXXfB9cocNG5XBAIpUwOwtrlXgaWts9M7MeGKx4su1yjvH1Zr89AQg6Z5l2u4sbGnUKAsGoKMlIImZMFqwUChX/7Qhz6EkUq80VYqzZo1XSgGd1OckznEalo1OZiY0/x//s8f8y1AdJtU2G5xvDZx1fpr3f3bLEkmOjp2vbDv4DNPhX0NneVMqPt855NP9F38wppLL8MfwsxX58Shjg+/3/nQfkpRR9hZdjmsQ2nn0/ubnv503SVfH/zc5wvrLmFwF9+4+dqG4s
yRwaFw6aWSbfI3m2W2kVpEgVuoEhkuc0oVmC3PL/Me4CHhQurCJMC+3bufWbUKVwzza/7lqafWMt5jt29573v37Np13fr17qlVmHSpO598suHgwVwspuqbSv3Tnj1vvPjilNP5S+9+97PPPMPgi/mUMVqP85MXHSw/kOSOpWWZqAhZ8kTQ7ps2baK90EeeCH4FQ94deVnkFGfmuVu28CmP3QcPXvI//0Mbpy+8sOELX3jqve/tYYTIOCrrz1PWVF+/7j3vGectv6uL4RzbHXcM/K//de3VV2M4VqZcu3q16/3v/9bnPve72WyQxTZr1rj373+OhSVvetPqzk4oszLx8g075CbmV5Dld8bK0LPTrUtiHiTufEY1cG+ef75yGRnEpkw9oxspHz58y3XXMddfYWQQYRHLqr5eTUBneJ40QoRkhx1TqV9mQ/p8XqGPZKbbjwz3Dz7wXCIRCwUUjXEXen3+S3fwDZdnOrZv55RIUZKu86WnnrHXlZ12NipUQ9z06fwaL1DKx23W7rzt2xLxGPffNdf+4pOPPXbbX7z3Tb/zucve8ts/v+3LE/uf/stP/C4vA2QUschHE27HQCg0dOjQfdzHPDTx+I7Bwc4dO1AAsRxgxUGvTb9AN01/wSm5iOGuRZTQoSkTMqQM+55YyO1r6Q47EpbReCTO/J98zl7Iejxpf71hBRo9DrmkCKMoVZZxM1cTIZFckjRSEJoTY4bRh6tyav4aLaj8jTxmPHvUiMmWn/3sZz/wgQ+QHlKktzLLVUQI4Ian0ZAwJd3obJTzE7YuMGOvkirNoo4LkMJMJFohWQoy1eaUtwoeVDGAhCOpkehsBPiStPUHP3CvWxcPh185f/tap8/2nMVVaN/OIo3RffsCzz/fNDSUZjRPueCUA9b8Bb6Qw4Gvmd6OTiUFC2azozhLC4VgJlPvdHJv0U9Temdn11NPBTCcqMCBA0hStiDHxz5mYSb5H/6hGtblOHKk9fLLmwFQnRjHF19LPTGc/9pb3+i2v3EqbvJftbbmUG863Qf58YIOjv39lk2bIEcl/4UXHr7kkmoiRHne+iFyWgQp3PCYCIIGwojEMuNUNdDUQQvCcHyfgQ872F2+C173HoYBKzWEl1566SUWEUqOxx57bC1DRMVi2u1mYuJ5r7564WOP8ZTuuuSSgaam1caMaB4oxJKR+5zHhG6aOwe/OjOTsZnw8jGWpoa3pyiZVhPh3J64QlCPXEw9YzTx/vvvZ1YI9GOxW8MXvS7/VM6y+g2WTDZmyb1yz663v+efbO0tpcefK6fiIbdn7/7+nisMmi+Wst6WkV/7rdbDf2U/HLUE7AOf+1Ru1br2P/6k65lDlFUI1aUbupl61N7ecdddYYtlXBTAfMf/fN55jLAoVwI9jxz4lc85pw2tuDcmo4x/AJwDuKgLrsv77rsPpoeoqAIxPatWPXfZZWPnntvc1MSU0zVPPvnGW2/leakPBq/+7d/+6U9+wroRHihSQifXv/GNzw0N7WB1YCBwaGzM/oY3XPHGN770wgvtra0Mq/70pz/loRMaVpJ7ehiBOu+88ygdYPHNYCAyZfTKK6/EC8VEORBGVdGWX7Lwi1aiGGF07jnnnPRf/VXd979/5Gtfg7G63/WuS77ylcSXvkQlVeUrD4ZnsKrvuCMaCNy7bduleIBuu+3mw4cTJOPv2CPO61q6tb39YoaBP/jB0gUXRK+9NvDYY2sefDD8h3/ISySdVKXU5Rt2cPuivTwngu+JKgPiAj1NRXDfPstVV/FGzBjGJLHxqOJ8P3KEtWu/bNxLCiN6VN5jSPyWt1iiUZpPEaHBhVa3u/Taa50XXPD6QuGYfQ0Bve7acx97+qUXExt++PNDb7upUN/QxJROrKxrrrkGBUy6IszT+LqrCj958JkDE1usjz23tjuAbpTBE4vtT6VEZ9SgOXH68Xy+9tru53futLoaPT7/A7d//eDz9//5xz4Ak/GomxVHMl0hBa3q6WEw3H
3oUNLrDaZSeM+2bt/OlAEYzoSCzohSyIsESuQSNze/PB6mQAKk51Zt9QSydd1tDqs9VwrbfHVeq4UZHnjnSsVxiydktalb1rAIJa/QGDWia4MeKIgKEolulMVBDIcSbmTkKgfpOQhQa/osRhE4BRNJjGQuEYawGbQ30qrlYhhe0t8hioEc4gmoq+oVufr1yDwXURSupoaSzCQ6qcAJfkkpwimUJFSESlE6dgbvEJgdoMqdiRpUnM4XJuBUsgA23ewHPpCLRGL33//K9u3roCHYmiEi6vjTJ564/Ior6t7yFuYsUArVRLhZCtrVOxx7UqkWYyU4Att4F8bo9Hg2hkK7IxGukofW7+np+K//umRo6D5m5PzbvzHPcHLOIaT1yU9avv51y4c+pHrYUun6xsYgcqSi/7Q79dBw/utXhPzH3hsqIFA9FxZqgEdGcqA5pjaLzXgKR0ePctOZosxseI+5P8GBkSTaC92oF29dmO+gRJWpoJkYGEn5wx/+EKuxPlz3G7/+G6xpNR8ZMxnL5y/eseOKyy8nBldM98AA7cEIxPrdux3Z7MHNmw+tXXuwtbUTbYx7gFIolAbiJoSMeUcRqsNPy62OL4E2wnKarjwZKR09yU5ZJEMI7MKEZ6ZahW+5/JFnn+u5e2DTtvOcXfmUc5slv6n4qtXe9rrIS4dGMvG9/7m3qWtfxyWdON8RFbnh3YU1G9v+8k9ib3xz9Bf+V7lYPvrV77R/9k99z+8c/NRflnyNpWyqs7NpYuKqQ4fuwCnKwRAsaF93neUb31DeUVqTY+9eXsGvbmtrkJd1FWUc3Mw8FDw16MmdQ2WJZtQA21feqnm03/aWt9xzzz2dLS1YY5s3bqxncsTEBOOCPJ/cpSADdRmwFRkoPf997/vxv//7L6XTL9psWy6/HO5kEh03OSAAGs8md6YkJsAdDvPxZsPiehwzUiI6AB060PSiFSpJo6AtMWbrl7ze8B13bL3ttvs/9KHG7dsZY7z7Ix+55S//0vtXfzXwiU/A7ZX0hrXe9JWveO68c/Ab39j3+OOuxx479Pa3v/vhh72f/vTAJz9J72AmJmW5v/+Wf/7n5zZtemXDhmsslttXr17v8Zz34Q/v+9rX8u3tuFUFvWX966BVaAB5MQFfwJ2xPtzKXJWelF/mbTCHgzmV4+OytnKS3gBwdPSaSy7Zgn9L5JCY+fz4tHnaJybUFAYxB3myjMnbb2tqYvk25uCxcj1e/w1XnTcRf+n2Z1bd+VTshm2H3//Oc9iAAgWm69bY1PbmGxz/+p+Hf/jUxre58pc1seJbHdJBc5cQllzc4kw6xZYfGR079xde/9JTD77w4J1/+ru/2t3VmTbGtyUZ6SFR1tjJQHQ3XNjXh+uzsbt7w6ZNvKnBGfRNIplS6Bd4cojkpuSUMH0EkHKjEyZGFFDQWcpN7DhlU91Lndub99qxso+OjPCG2JvJbODLMU5loJMF4QI4eckHHxAj42coSSQPEqd0drzaoy1pyIgmFPqFL3zh4x//OM8VfSLeFYYZPv3pT9OJoDkaojnZSUz3xGwL2v3BBx9k7h+eGfpWnKKIRRpUJDoYyisdpsAxRwonUYX6jPuCdVySRNHsVKj6X2MdxWRTIxMNqRrlCslRZZ5/HEFMVKMi6AAUvF/T49M7yJuN4MktxKqfeDzJB+3MMriUzWUvOO+8AK48UvBnoEcpVIr+WvUj3BgWS7vHszsaXc06vGKR10BKaqyvf40ql0pdXi/2PjLhx6uv/sAXvvD4Rz5Ct2gWYmGW7h/9kXppZnjvn/6p5U1v+lW1KQoFWSxf3pO6bzD39Svr/CwxmekgGYlDofX79/u3b1ejNjhFmJMRjSqBgUBIyZmCulIAVUB/ukJsBVqZ5qPhCJCesJkSBKDJL37xi9/5zncw1C6//PKBgUH6U+DlfuCqpKSI1atWEYVYFWO3r8a9UygMrF591w03qLlGpOSpoYNDH+OGpCCy01KUizud6R74DG
ksHhOm9WMX8kvRZhGmSix1ZewKNyN6smiYRuTSL/7iL3L7UXpdYyj/yfPu//6zL7wQ+ZWm1+23uJ8bHtq6uiMasfSWG4aC517RekHyYLR0STGZSFEuxvHQeLbjI38+nswXH3ssXF/PVkYdv/p7sQuecDLb5uAjzILmQ0833vj+z3/+/r/7uzi9zf79Spef/lTN9RUWhBG+8IX6X/iF9xpOgckbWxSmmhisPFwojH3G84UVi8NaHkbSUAue7uuuu+7ee++lCm9+85tJTDzUyLMG1Jh0xIs02LE5HN78a7/25S9/uXXHDqxGnil6oR/96EdAhwnIG6oJGo3CKcsngIurQE0MhiDroIhHH1qQ7BymMhQE7/JoUBw9bPDee9v/+I97P/Wp4pVXPvngg1xdu2XL4X/+557f+A0WOA//1m9xk8kdALf5nnuu9W/+5tA//EOuufmSG2743rPPXnjttUPXXNP1gQ9Er746fuWVvGsrsbjoJiZW/e7vphje/upX7/vSlwZeeGEPjtrPfCb+sY+t+uAHD37lKwVm3Mx00woIy+VXufu5fcFXVbtcphuiMabf0JhB9CPcBzQPB2Nv27adt2/fXlJKn2P8IqDl8st/y9h7aPL55J2ru5vnrvuVV6KViXnW7PaNN930dlgQgZV4UQop33bzueu6D/7Xg/m9R5m+kHW6Jl8qK1MSJnEgWP+hX3Lc+/Cr+WyqWFqLRgjkDuOq0rVCOMx0wYUXfvgP//CuB5544L4ff+L33rtl88Z0epKzTcl0HNzi3EnUh0juPLkESsRwX0pnRBjhUCDPJ0M18o7MGxwzIcnLDUrnZZau9CiVGDxc7XYz2wEHGfqxDr/AfkmF/Gq2fnE4sL+k7vxSIsXx2PBgoA+FijKiibxR8hjQKDx7nEJ1LKiA7d7+9rejMACCAD6oT3ziE0ijA8V/hSuGZxix9EG8vaIkDxhiyUuWG2+8kQAriImnGyVm8q5Q5hbruyAwbCflCkW4Mv8Av1wOONlfBltXfZdZNCyq7RPVem3VzaohQbKqHcLwBfhtWVYlCpeiFenhOaRJQTzqKEw8tgWv2xAh3QFDIKSRfoFLJJZS8E/TlaAnkXwFjI14iKcnokYK6Cl6kFJoHbMUdAvYbN1eb18yGS8UjjJUZpjy9TZbxxQLkoaMl1yy/dCh//PFL/75n/4pYyjEqYP5OliETFX5sz8Lbd365+vWdXBTcX/0p8s7xwv/fkVd4AQsKNnBtqOj86mn3hMMfuX55y0MtGPdHj2qRh8vuOAizFfKlZTmL5qDBnYeLwST1AWgzPqJxTjlHqtMSSSwfOxjH4M1+RA89yRtCixAUpmMBuMgO/Ypk9n8vKIyLmXMDuJ9U1LyD5CiD3c7h0SSCy7k1oLSuCc5mG+FfEYu4YxK5dGQZGzXwGwsnHvXX389QrhRaU0IZlV399DI6NPP7g84vFFr+kD8OesLkT3Re27fH91X33IknXnMfiC7tn5L16bLN3hxIXFbkp0XmpdferG0dRsdArfszhdewA4bWrXa7mpz7d9HcRAhj8y2bRsOHPjUJz7xp5/8ZJqtD3hfZwXhxRerGvD68ld/5V+9+tPnnLNWOEzqJb9ya/HLgQMJ5wSF8jjLQKykAUlu1Kthi3gcbCmUx5PaYVGQjKeVZjLRpgiG4q740IeYfsXQN2JBg5ucd1Ak8xZI92smpjmYF8opcrhPKO7CCy+kOqDKs4D3S7WX0TqkQRQIE6lieB5TqeYvf3ngQx8aedOb6lKpPbff7hsZaf34x1Pbth34/OdXfepT4zfemF29mpdWxJI3V1e375//OXHJJSxuY67NW3//91n+yHcg01/7WolVZHgdjBum5HY33HEH4QOf/Sz73NRt2XJk/XpcENzwhz7zmTUf/nD4ttsGfud3hDUFn2X6a+VOrVSdp3r6/UECukXaG+glMQG2qIzH2ft4MoZ48PX7g2w0w/Os+r6pgzTRaBy+OT6xpa6unm2t6L
QqE09lUv+y+QtTfZmPWCrTD5kWR2WSyTC3SDLBRJaog/mExkAdjyt9BPHTUzOg/dgTT0QjkZtv/kWcUdMTSEyltpVpkM8pV/Fs8N5AETwMgMYdySWA4jnhCSGNpJS8dBn0ShKDTiTFJcWvi1kZBoZm9ZBMrydyeOPmpqdTIyMFEclBgEhoj9W+2LjybBAPC/Jo3XDDDTw8//Iv/4L3iRWBZISbWVb40Y9+FG6WxBRBGt43YXH8MA8//DAsKCTE80z6d77znQgUbdGfiTAFulyvr7k+6HW7IulCJJWPpwvpbD5kiflsOXGJ0sFx/9Dx+gKhYH3Y4Qu/OobfuGBjeicTUDIJd6qXpfY8xHSd6HYihKXV5Co6yCEwEkkp4FOZl5GKQ9ZD68vrJznWSEouSqkkD5EwKcf4h6nhNBtdKosZeXc4xhVTSZmufO+9D7/wwpcuv/zliy5SDkwo8Ikn7M8/v+O66/4ApqTlaQv4mC5pKtMJ/+U+wazhMv3Mrl0PvPjiE08+ud/tPnrJJTGb7Ya3vOX/z3oa1K7KT0Ngr9MuZOe9R1gNQiKe8UJpNTMLsNDD8otWpKmErjLNU48/zt3LqH4olVp35MhF993HTPlXrrpq7/r1Q7yaGDrwMtHe2Xmh8WJh5iWAcDRBDcLoQHPg0IN0K/t0SU87Ugq3JR06rzIkZrwN9aCBm2684bFndn768wc3xNqvL3u3Wcec2Z8Hm5713ppOvzF3R0Pu2x77YPAdlsF3fGuv88KeniefeRpC4u6iUhgkMCJyuA2ooFSZhx1vBwNpAiAjKQ899OSTT37x8st3XnUV85CVKf/UU7annz7/yit//8ord9BwlZWSsFSNRwMqIgwbYRXwLMvQXWV6rlI7lCFAPFXjrRRDGSdqVYtwVdKYLUti3h54dWAUkILMeAJcIj1tJ2VJRnTgsUUHMJeegbpLgmO/5GUT2oYG1fBW68+//vVUb++1H/xgS3Mz7zoOds1kobO5psQoicJMAqMgUUOlAVJebKVT4rlIJIgkO+394ksv0V3wTF1++eXcW3R8TIgo4AmbdtMeU2yZhKx4xipVBRGOyhgzbDaPxHAfTE8JmlXJTpKYlIK+WcSMAUqZTTL0IbvJRoRPkoubG7Hy2jVjoaeMJDtmFm+FpCTMISUSoNzpRfO+zO1bGa8ykKsyyigVCSjGL0oCEZ2OVIowDx4HAWKQxlNHwMikfniKAEHy8g5O3yQzyriEoUAfylWzNOSTnlPS83RJaxKDWJ43SjFTinxDUfpoZduplxKoz5iMz79mz40/E21Jz11AdtVXTCY2kpCPGbJGkKoBCDqI8Nn/koVSqt7VDJ2gsWoiozpAdJJSlEa010lvFQCZmEi98MKLvb27MpkYKzvWrGFl3VZ6Brl/qCnmEa+Jp6wF6XlTMZABfPoyOpzc+PhEKhVvbe1wu1VzzyiEVuaAVCgFRiENDQSGBISQZsx1okiq3Ge8TNAVdg0Nrdq7FzcliSl+vKtrN8uhDDXQk+Lajbk5VaIUaFN9n8DL74zKc1/RCnI7kQYmgwtxpfJu19s3cPcDuwN5T5PFFirnbKVhu2vC3l4qtZcHnKWjdkvK3mHJdr2jcXVbQxhPFE4paAMJIABXIZlTFOPu5eD25t238g6n4WKxzAsvvHTkyCus7GUEbfVqls+dGwx65C6tqpScoiSH1IUABSGnUuyMuSSSxLwEz4jDjLkQe8rE6MDdTlsDI0Lk6abRZ8hII9IoxpsBtxGjzXgDJpNNXZpRjVNEGveG8t2rmY9qBBr8J1tfdQHqkxkoyVXzljiFwCV52crrxpJUbBkoxeNN9zRLRXmc5HaZZXozWWUuM0yA226Gh2EqWxWZnfyRM5/zM7yVeUpPotKUaupf9AeQypjZhxenlEp90NbgG8UWxoOvPIuVWBGuPK3MWxUG6soYJBsHHaJyJ1ReqgqTbMZGPHmuKiHmKaIQyCl9GCtF5H2E4tEDE95MhvBZNqiZ5e
QBCqVDRybExjRvp8Om3O7qxQoVeEmyqZeZIhusydfAcfqxnF69VSGWjOiDBLPKEiCGA5lmvKkD8VAIZUoMjt6qhjNTniiAhOli5yXxiYRMj5+uw/SY2eSanuY0YmYsmkhEzR6o0yh3obMo/95Cl7GC5csdMJsKLuu7ZDYVlDSLA8jilDL7WuuUGgGNwLJGoGLuNaY3xI6JM/3NlFcq/nh1q9wG16g3e9Q7LHa8UoVpl5Y1Llp5jYBGQCOgEagRBAwnMuY/jpejRyx8LaWji1nSig7Nw+q0pPutucGyu8vibrGUj80ucVodE/nowezRsL2ux9NdZF3PSae0mCJ1QCNQawjgoxMXNK7dqrpj4E53GIgT1fRMcoqEkwxuVcnUpxoBjcDsEVC7TEFe7p/8qO7RB5kFle7qjv7q+y0NTXxdQEmxOu2jPw8P/4OjHMvZmiLt/7tUf6mlpJ5kvrj2amrfp0b/7nC532vx/GbgXb/ccCtcOPuydUqNQI0gAIc98MADzK5kfikz7hgfZXQZ/oPemHLFcmlWYXNKJIxIYmBRszqNZZRwIcmYZMiCenZmkaGyGsFNV1MjsDgIONTnjQ/sCz90n1r25XAEDh/M3fvzxLvew+gzc/wsufGGkX91W+Jlq8tbHikO/eu4fxveUGba4Q79auS2/ZajPqsnZ819LfH9y70XrnJ38kGcxVFdl6IRWEYIsGvBNddcwyRt5jqyYwDLLaBA5hlezYcNLWw6epCtztjdnqt33XUXKylJz+KWD37wg1AjXMgKCjbqvPbaa81FOMuo7lpVjcASR8DBuJ99YtxeLFjsxspcPv08Nsr0/Ml57vmooxzF/DOq4XSUxkr5lM0ZMj5HVxgsjriMS/ayPWXJjObHV7nURohLvM5aPY3AIiMAk3GwKpxZi5h37ELCHHSmxTPb/qGHHmJKOiwIBd55552sxmN5GZNUYUpZ+yFmIvteslhbJj3qR2yRm08Xt+IRcPAF8VxHZ9YfdKeTrBMpF/LpdWpRKs+tmhDraErZ19YVXrRa+QJqJuW6tGwPGOvlLU6L43zHObvy+7wssLUUusttqxwd+WJh+nKuFQ+irqBG4OQIGDxYYiUyn0FmvwKWfrP0kN0YZOBQfKRsucKuCywVZyCQGCw/TEZZPYZw6BDiFDmaCE+Otr6qEZgrAlZeTllX63z1lfr77+Z7QvFztiZvvJnVNzJ3lPU8tvSRutGvugtHM86N0ebftLibZe4oaylTxfRXo7c9W3y5ydrwfv+7tvk2a7/oXBtAp68FBLAC2VOG7bUwBHF4MljIfpL33Xcf9MZmKA8//DCDf3zklh3CWGnOPpZsk0ZiNg9inTj8B/OxPQI7CZBYRhBrATRdR43AoiFgZaMHCuOT0Ba2iGSyqMej9qOrdG/y9VhOixmLHduPpMemw7CxMXtop0ppl9XF55M0Cy5as+mClh0CEJg4OcX+I4z/E4OPsUDmkeIOpUaSgJRi+UGf7NrFOCKXMBbhyOkzTpcdDlphjcASRMDKLvKTahm7AxxHgcfpqzjwuIipE2OnLa7NfHUqlf5XI6ARmAEBuBBqZPBvhmsMzht7AXIJjjRGK2ZMpSM1AhqBM0LAcdo7XZ1RsTqzRkAjUIHAKR9D7RGtQEsHNQLzjIDjtttum2eRWpxGQCOgEdAIaAQ0AhoBjYBGQCOgEdAILAsEGJ5QG4frQyOgEdAIaAQ0ArWJACvlNRHWZtPrWmsENAIaAY2AQkATob4PNAIaAY2ARqCmEeCrlTVdf115jYBGQCOgEahxBLRFWOM3gK6+RkAjoBGodQQ0Edb6HaDrrxHQCGgEahwB7Rqt8RtAV18joBHQCNQ6AtoirPU7QNdfI6AR0AjUOAKaCGv8BtDV1whoBDQCtY6Ado3W+h2g668R0AhoBGocAW0R1vgNoKuvEdAIaARqHQFNhLV+B+j6awQ0AhqBGkdAu0Zr/AbQ1dcIaAQ0ArWOgLYIa/0O0PXXCGgENA
I1joCtxuuvq68R0AhoBDQCNY4ArlG92WiN3wO6+hoBjYBGoKYR0K7Rmm5+XXmNgEZAI6AR0K5RfQ9oBDQCGgGNQE0joF2jNd38uvIaAY2ARkAjoF2j+h7QCGgENAIagZpGQLtGa7r5deU1AhoBjYBGQLtG9T2gEdAIaAQ0AjWNgHaN1nTz68prBDQCGgGNgHaN6ntAI6AR0AhoBGoaAW0R1nTz68prBDQCGgGNgN50W98DGgGNgEZAI1DTCGiLsKabX1deI6AR0AhoBDQR6ntAI6AR0AhoBGoaAb18oqabX1deI6AR0AhoBPSsUX0PaAQ0AhoBjUBNI6BdozXd/LryGgGNgEZAI6Bnjep7QCOgEdAIaARqGgFtEdZ08+vKawQ0AhoBjYAmQn0PaAQ0AhoBjUBNI6BdozXd/LryGgGNgEZAI6AtQn0PaAQ0AhoBjUBNI6CJsKabX1deI6AR0AhoBLRrVN8DGgGNgEZAI1DTCGiLsKabX1deI6AR0AhoBDQR6ntAI6AR0AhoBGoaAe0arenm15XXCGgENAIaAW0R6ntAI6AR0AhoBGoaAb3pdk03v668RkAjoBHQCOAatWoUNAIaAY2ARkAjULMIaNdozTa9rrhGQCOgEdAIKAS0a1TfBxoBjYBGQCNQ0who12hNN7+uvEZAI6AR0Aho16i+BzQCGgGNgEagphHQrtGabn5deY2ARkAjoBHQrlF9D2gENAIaAY1ATSOgXaM13fy68hoBjYBGQCOgXaP6HtAIaAQ0AhqBmkZAu0Zruvl15TUCGgGNgEZAu0b1PaAR0AhoBDQCNY2Ado3WdPPrymsENAIaAY2Atgj1PaAR0AhoBDQCNY0AY4Q1XX9deY2ARkAjoBGocQS0RVjjN4CuvkZAI6ARqHUENBHW+h2g668R0AhoBGocAe0arfEbQFdfI6AR0AjUOgLaIqz1O0DXXyOgEdAI1DgCmghr/AbQ1V8SCDBnLZ/PW622UqnkcOhtLpZEo2glagcB7RqtnbbWNV26CJRK5f/9v3/3xz++++qrL3v22Z3PPLPT6eQlVR8aAY3AYiCgLcLFQFmXoRE4OQLZbHbXrr1/9Vd/Eo3GvvWt/8QotFj0wqaTY6avagTmDQH9vM0blFqQRuD0EMAd2t7e+sQTz/7+7//mbbf9CC70er2nJ0rn0ghoBE4DAe0aPQ3QdBaNwHwikMvlb7nlxlwu9/jjzxw61ItTVG9zMZ/4alkagVMhoC3CUyGkr2sEFhgBm832pjf9QqFQ/I3f+APCTqdrgQvU4jUCGoHjENBEeBwc+kQjsMgIlErF5ubGCy44126319fXjY6O2e08lfrQCGgEFg8B7YRZPKx1SRqB6Qjk88ULL9zW2NjApauuuuSHP7zTmCkzPaGO0QhoBBYKAW0RLhSyWq5GYDYIlMvl17/+Rkn5hjfc9MMf/kTPF50NbjqNRmAeEdBEOI9galEagbkhwHzRYDBw2WUXSTYsQqaPjo6O4yadmyCdWiOgETgDBLRr9AzA01k1AmeGQLFYuPDCCzdtWidiGCy87rrLb7vtvx0OTYRnhqzOrRGYCwLaIpwLWjqtRmBeEbDZ7Mlk+nvfu2PL9g0er/u5J17q7R3Uq+nnFWMtTCNwagSsbW3nnzqVTqER0AgsCALWAkeqUH+LzxIoR76ftvvsTqfTYikvSGlaqEZAIzATArhG9U5OMwGj4zQCi4IAy+ftHujP53Tb056SzSXPo34qFwV9XYhGwEBAu0b1jaARWCQEmBqDocc0UcqzqRdQa9lS5h/+t6Vstjqbsb2opsBFag5djEbARAAi1IdGQCOwgAhAfxwYfg31oVDA5/f5LFZrMhEvFIt2hzOaSCUi6b54fz6QDRQbIcuZVBF2VAyKKPHiVP7OlEXHaQQ0ArNFQLtGZ4uUTqcRmCsCxVLJbrM1h4NN9SG3y5lIpcdjiVgyDaEVi8o6bGzwr+qqb93W3NdxJNGYquvvyNvz7LImSwll1I
IxRKiUMF8rjMWiLpcrFovb7bZUKkV8oZAvFouwo1ibsKMcc1VVp9cI1DIC2jVay62v675QCGAElorFzpbGjuaGbL4QS2djuWKhWK5vbC4UC0yHKZfKfIk3my8e7h8+cHCg1Vnf07xqohxPJZLDoxOwpOE7VeohyXCmWmxEWW18mALa83p97MRWV1eXTqcDgUAikfR43JlMJplMQpC5XBaiVT5X49Ar9BeqmbXclYKAtaNjx0qpi66HRuBsIgA/caABtl5dwLdj60aHy3VwcDSbg/LyhXwB0sLzSRjrjS1GSUlyZeoVrNbzi6XWvP1nvubO+rqgf2h0YiISN0xDlUAOoUN4UQ0ulktwHPGKGw2GxFKEXyFIDhZgKJbNZuFFCs3nc6LYFC9OCtT/aAQ0AoKAtbPzYo2FRkAjcNoIwDEQG75KbDJoyOvxbFrd4bLZ9h4d6B8ZzWVz4rpEvkldVWVBhJlzk8X2rP9njQUH9qKju7Mlk80NDI5CcyYRVuWqOlUkzNwbNYKI7Whzudx+vw9e9Hg8lMs3nmDERCKRTuNQLQl3VknQpxqBmkUAIrykZiuvK64ROG0EDOIpY4zBfXw1wu12Z7O5Qj6/pbu9b2Rsz+E++Iad0sRsO0UpBYvtwqKtq1S43WVxiWBLZ0cz5Hq0d4i8wqCnEHL85Ukpaj2iFQMRh6pPTdPxOhxOfKeRyAS/hlmp/K3HZ9VnGoGaQ0ATYc01ua7wmSBgWF3KLYmlVV8fgmCYrhKNxmOxhMNuu2hjz4GBkYGxCHNEZ19KKV9qu6wxtNq3+7u9dvckLTHI19RUX18fOHRoAIvzDNlqihctkGIoxNhiHX5UHKdTjGiOJs5ea51SI7ByELB2dV26cmqja6IRWDAEZGGD2+0KhYKBgJ9TyC8eTzD8B80EfF5Y8KWDvcwLhRHnpIXhGo0X2rKBu5rLrmPLJ+DCcDhYXx88fHgQgWfIhaKSEDkK8/nfkKLEeqgxmUxEIhGxEbXXdE5tpxOvDAT0rNGV0Y66FguHgBoCxAUaDmNHhSgmHk/29Q3mcnnCxENQdX7vRZvWPrfnUDSZcpzOhyOwApUc9WssqpfK4FmNRBJYn11dLb3KR3rc1dOrMIVYrcpaZbbOuDrGhBHb2topDkYcGxvNZLKaDk8PXp1rmSLAM3bZMlVdq60RWGgEjHkl9nAYyynIBMxIJMZAIJM94QkpmgQhv/f89auf23s4nkqzavA0VCrny/WX+UPrvIe/OWqbco2acqDhtrZGCh0cHDPLNa/OS0CMXRgRqm9oaKCOw8NDGIjGdFTYVx8agRWOgLW7+/IVXkVdPY3AHBHAfwg38C0kRgGDQX8qlRkfj+ACrSIGKMrjdu7YuObZPQdT2dzpsSCqlfMW9/lW1ypb/Pai1T2DrijT2dmSSmUnJmLKBl2YQ2qNAcrIZHNzCwv5R0aGE4k4pS0QAS9MPbRUjcCcEdBbrM0ZMp1hZSNgUKCjuTkcCPjwgvJdJCgBJmAOZ1XFcTNevHHNriP9qWz+tFkQmXBPtpgv+EpWiwefZVUpnFL6wMDoqlVt6XQmm83Py2Dh9FIQK3VkvJBJNIFAsLW1DZfp6OhINBphWFHT4XTQdMzKQECPEa6MdtS1mAcEoEDW8DU3N7AmnYkwhw8PsASQ3h8SnC6dnc1gwf6x6EgkflrjgseJtFvtTLHB61o5RliZAh4aGppobW08coSJMwtlFEqJYnQyp3T//r0su2htbW9ubp2YUAOKQKTpsLJddHhlIKCJcGW0o67FGSEAzZC/qSnMrtgMBB450s+MzRNRIJZTvlDc0NmGFbi3b8hhP1O3Co7YOobn3N4DlmH5AsX0ykBO6XQ2k8k1NtaPjkYWgY2wU1nOz/Y0hw4dYJVkS0vbxo3N0CH+UuDi0nQldYxGYJkiwKbby1RzrbZGYH4QwMphpTnGViqVxgo0jB7rdEeoWRhDg011wZ7Wxgdf3M044pk/QJBKNBdNZqJ2q0N9mu
dCQBPhiZDR8UsIAeiKyTLMOPH7rWzJPVdDJJcvh3ze8igLFY4zfaSGSEsmW5LJaDCYx52IRUU8ZANDQAAGDSkGopMVUiHAwSWSwRMSmD1YpCc7Nl9/fz+TSGE+DhyzUoRcNX8RK65RAlKQyV5QHUoqYuTLEcZeMKx8R2cI9YUXXoAXg8Egg3zM9mSGjuwDjgVJ7WSwk5QmySEZNpVTfqUuUhBFmMoAhbAmSXgrMMCcAc/ZQzEvKdEzGHSxqTh7zJ32etN50UQLWb4IaNfo8m27mtM8Hrc0NjJeCDnNgQuhrFS2UOdTI20nYFAMI+/YWJPfP5bJpMXZKOYXYSEGfuEbYQvB3ZCmaEB8knAbYYnk12RKs5GQIPFQIF+oYAM2YsQdSiliiRqkpha/UzpX5SAXAfk1iyYj5ZKReCJJAEWRi7yQIuzFL5YlhuP+/fuJRx9YnGWFLNXgQ1FMpWF1I85V4sliipVSEC5iiecqkkVPTklPGHXyeT8lnwBPkbcYv1AyW2yzp9rQUGLq5WQxytVlrDAEtEW4whp0xVaHPpdeDy4MhSwTE3MwRMhYKJYDfq/VGp/RIgQy/ILJZFsikbfZ+Git+o4u1CIBARRigBKINPElzAHl8GtGmokruUQioRBSwn8ceD6hJTjJzAirGQSjZMJbHBSHAvAZv5TCKZHqslGcMJb8moVi6nHAqcRIMkOSMmeRI+5WzFB8qo8++ij2IhuwsbrxoosuwlJUco3DVEmEwIImKRJDGDypdLEIniqiMv3ih1E5HPbRcEwV0ubg4uO/YkrURLhimnLlV4Sel/kyGGkM5LH7mtERn7rWJMvlSwyy8W34k6a2DQ62u1z73O40REh/L/QjBh8ZxeQybSMRReR0mdMZBWlQ0bPPPosXFDMOUhQHrKRU3GLIF1EUMd3aEyI0qVHYEZ5DAYSQESEE5FfkEBZRVAcfKaeUS6UwSZl6gzN23759hw4deumll66++mo2Y6NQSpG8J/klDTv+FIsMPapiT5JyoS9Rb+PzGPaxMdYOnk1NFrqmWv5CIwAR6kMjsGwQoPNl1gybnMCIM3HQzBUplpUDk+/UF0vV1puZAR5hrujgYJvNdgSjip1B4Rj4BhtLmIZfOAA5cioZoSJTwskDDz74IKN3sgs2/CRiRRSnsFRVdi7JVeK5igVJwLT2uGRaeyY7oh48Z1JjpUDSozkxyOHABCRMSgYpcZ/efvvtDCvecMMNPT09RE5XRkRJiYxLZrNMVUXaCcGsLHohw5iDnokJtVnrQpaiZa98BNTAycqvpa7hCkIAOwAHKW7F8fHZ1gqrKV8ouV2OVOZku1DChalU89hYwulkiX0TXCW8Is8IvzCNSU5SNqcc0x+iykh4jkFBfJKwICQE02CiYZxVkijpEWgWNL1ikkB+5app7UkuLiEZmegMNVayoxRUmRcJ6EAFGTVkKBE6ZHPw//qv/7rmmmt27NiBHCmCLFSZ7ATwrBLmjEKy2UbMS5ZyTNdz0WJoU5ZMwPvZ7Onvyb5o2uqCljgC2jW6xBtIq1eNAG9uzCDlo+4MFrKy7hT+TiM3LJPOFf1uRzI98/cizDLgwmh0lcezy2aL+HwB6ArCMN2YwjQwkJmeADwhxlZlpBkW+ty7dy97iq5du1byMkuFAM5SEQjNwF4ciCJGGOskpGgKl5TySyRZ4DahN8LEi3whMzyiHBRKKZTOwVVyMVjIAR2yQPC+++5DDUYN+eUSCcgiksmITGOVozWTCRJ5dv2ibDMQCLiNOTIn93gbmuofjcBJEdCu0ZPCoy8uSQTgQjb8CodZTaE+W3hKpwbjR8lsMehzDkUyp6wQTsTx8TVu9268o4zWsSYPRyJUxyFmHAGTe5AmdFIl1kxAFowtFjZgDkJRxJNdZntiHZKLSNIIaRkml3JvwjqYdHIKb5GMBLOnRknPr1AjBVEo6ygQhXCZNcMwIfJFJglYX0F63KRPPPEEE0pZcVFVL1GASakjI0FW7Z
9dc5A3m3BY7a8NlZ+y9amXPjQCJ0dAu0ZPjo++ukQRoPuLRllNge+OWS2n6A2ZSZHMFJpCarGBdP0nrRXzRUNjY11OZ28oFIY54AyMQpiJvJwy8QTqwqISUTAE0ghLYLpk2AWmMc1KmK8qsWREJtZn1YoIuIoSKVp+hR2hKLJwiAJS+vRyJUaSkYXEHGLgogOimLnD/jUUQV1IxreWiISzWXHBZwhNIiQXYRRAkbExNrjpkaUiJypxoePBG9cyqyagcj1HZqHRrhH52jVaIw29AqtJhxiNWkOh8vj4Kca5Yc10tuS0qXGt2Tj0ILVEon1iIu5yJbzeIPYZTCZsgTkFMQiZEYC9MLYkwK95VLIUWUgj81xoBvKajSHpzVNyEZZfAlylXA4++yDxEJJYdSYvCjUSSS4OESgBUyyRkp0AlySMWMw+zEShQy5Rl9bWVkQxnIl86ktYKBAHKW8D7PI2POwrFllBiE/1FJibpS9AgDky3okJjPuzqMMCVEuLPHsIaNfo2cNel3xmCNC9Ywvygff6enx6pzAK83x6Al5xsvX2rKZ4IHxiYq3NtqejI2O3u1lswEAa+kIYcA9kpmjHGISDMzCqoDq5SgI5hCyJhEcJc5CeSxKQqnMqgRP9koVL8ktAsiNBwqIAXIUCaCWH6CMcJiVWCYfhRCZXCbOcEbHQIZHUAtJlsJBBTeLhP2RylQAu1dFRezTaAZlWCVzMU3Sur2fjbxZHTn5weDFL12WtVAS0a3SltmxN1AseYTIHY22YTMnkSbmwZMkWSn6PYyKRM+zCU+NTLjtHRzexCL67O804HZSAX1G4B7LBoiKMFHgCKhJxxEgkp0RiY0FIsIjsbSZmllCmpCeBBGb/a8qXAKVDZsgUn6pEVlIjhcoh1CgsWFUczl5ihAuxXAmw6h+6RQ4CWdwfjY739bl6e7vZZMYgwlPwd5X8+ToFb6+XmjqGhtKnAd18qaHlrDwEtGt05bVpbdUILpTBwmKR2Ywn5kLmy6QLAa9zPH6qEcUp/IweHy7sKRQOdnQkoDy4gf4XboAUxSyTtEIYcNJUVvUvpxyQEOYaDEp6TqErViyQngNmElIU9uKXBJUSZh82JUi5KEmJHAgU5kMNdJYlEKKSZJESCWPvcpUDPeF4LGAMUezMeDw6MhI9fLg+ElGzac4qC2K/MkPKPTwsM55OE6vZo6pT1g4C2jVaO229YmsKfbDpWjgMtbBzysxcSI8fzxRa61gJPvuDrpZ9nG3R6LpCobe5+SiTLuvrG6AWaAx6gzOETv6/9s6suXErPcNcQHATqbXbltyetmcmjl2TpJJKJZWb/IP84KRSlYvc5nJqMvGMx24vvVKiuBMkCDDPwZHQFFtia4FIgnxRMozl4JzvPFDj1fedjRxjVYtzt7eQH26hf/Y6z5Kh1SceR6u4TspIGc3OHttn2VuhivP86EGcPs7BCiHaxobaUTqlEK1F6tgwjzx5CiMxBiEkAf16iI7+/vff0jV3OCz2eicEhnkuKn2F8pM9Oiq1WvC8M5aPclOCLSeg0OiW/wJsTvU7nfze3pSRhdeuWZiP+suUi7h0fPbv9jUn+WDw+du3hSB4gYRUq/VCwUFOrK7Y3DidQ2llzCpNHA61smRTGjsiS9hbdbTXeZCNDNnQJ3vMnrvs7SNzZd10GieefZyyiH+y4RrSAQbxQ/YoCKXkFlnhvp6eEn78dRCg9CYWSq9S5PKmUpZwHbr7+2aJieFQw+eXwHvripBHuHWvfFMrjP6127m9vfD8nJY5BGO+ouMJjXYZBgPids3f+9g5YuB5J+/euYeHf5lOWwQ40Q8a/3CeEDk2FM5u5BTLDweIGVfwCK0UWaW5trRY5HiKzUpjLGCRLF7suMhRfIvE12Y4a0mcwCaOn8ULxDbUHVHEfpxUjklMj9GoOZC/GuYFPs5qaQe8rmrVISzdaOCw3ljZpdmjgjaPgNoIN++dbmmNkAPiop0O6/eGrRYtefMcmG
h0NA5Kbp7B9ff4miIJ4/FRo0EL3/fZLCMLq2gGgoR4UBICw+Qs1qlCw+yGO4jkWK/OpkFsYrMWCJhNY+UqfpBneSTWMKuFZG7lkevxrTjn+MBmOHcaqylVsJljP7WwUh4VdQ9Ocf0SO2B8Sr3OurtmapvEMlVGIjBDQKHRGRg6TDmBSAtNl5mDg2sGF3LX88NaudD3AkYU3qOuaAMrFzYa3/T7Z0dHp3t7yJBZsBePkNxoNWTPaZwz6sJF9InWOI7ZZj/ls8fxI4sPrNSRhmfJjQM0LNY/K2ycWunl1BbBFXswVyLX2WZLJIG9wuVIdO5DaTbDhx9jyeFhsdViMvG7hYUfXrRy2B4CCo1uz7veipry+WZABXuW2MMv5MMeV5uvfM8Ld6vuq6b3Xqzi27c9IEPmNPlkMDhkmMHhYaNS6VcqVcKKZIACWX2ymSEqNmqKHLLhIKKIXEQsjSpGShZrz23Lv5ouVjLyIUP2bJdiZoYJcmxdxlgXY9viZ7EHy+2z3CX8OJ2Ww5CFlt7Tu1rsks6IKx8eup4XeB5h7dWr8pKqrWKWTkCh0aUjV4GPTAAVHAz4qof1etDtvg9F8iFt9/3PDsrRJ/VBX9Wo5Sx3fo4W7lYqLFXxdn9/SnMbcoio2GCprSWnxBsJojJWD+FBb1BEq0BoJE10JIg3jOapWMnuysk+Hssbp+TMns1ejI85xQyssgmsW8ldTGKPCWFYpaGQwZB3tSHB9LiADJbAwk4Ht/tB7ytBq5TVRhJQaHQjX+u2VyrSwjwD9uhHSg+aGMfIN1/2suswuP7hX9aoL2V+MDgZDp80m29qtZf0xGSCa9arYIw74/EQFdsbhZ411oZYpTjFRbOiGGmP0T8++vhw7O3GFbuRePZBm9VH93PPkgPZXmZpevHEx7NZcRHP1ff3MWeFHmGkgiWmNz87MyM6Zi3UsQgkTkCh0cSRKsO1IIAW9vu5apUZucyYCmsTMtjzgnql8K7l3a+Z8MO6Rd6h43m/Go0+dd3zcrnDD51O6EMTfcCNN8awvdHoieMUUL65HBAee4VkNobJgb1odWtOGknMXdLM5bP49Pbp8VmjKb7tQkuLc32su1SuVsM7RQUXrR/5WMUr3+0joNDo9r3zralxpIW4Phl6tbTbtJZlCLC1+v5BzX3bYpDAw33C9ygjOWTswSf8dDq0FDJP95AR6rS+FYvEIIcvXvx0dHTAtDLIG4K3QJliaSQZBVjt5KLdeHx24yJp2H+Yob3F3Q9vvbf76pGN3LIcRRDUIncwSURXi7rxjNdUqTAzTq7Z/MjikTdmoRsicEcCCo3eEZiSp4oAMkFUkqAgCzYx+0wmyHWGk189qdgYYeJVuYwl5iaTCj/9vhUjlAhF+Z6F73G2UEXkkNF7Vs+sSi3Qqlk9I5kVUXuR/awo2uP4lq1dnDPX7S173aqsPeY6p6ggthGyHY8rmUzFKmziiBZniLXlMn58/uxMKrgYle4mSUCh0SRpKq81JMAHnU4hGMZYe2KkIx8tmZbdvDemkeyx7LWKeKmLeGxMAvcr5v7MZJqNxttWq0jgDy1kqXq60rDZXioomdWtWL2utS/WM5KhXmwcGJWLtlgayZPHuRbvZ3OzQmhlNVpcwudBsoqE8HAlPWXwBfnzYHfXbTTMxG/aRGBpBBQaXRpqFbQyApEW0jmFj2zQaufbg2C3UhiOljpAm5EUr1592evVv/iiyTBHpi3tdn0UiCZANvpt0q0GRbR7e3GW1z2k0WYSCyE6xxWbJxftzDic2mN7nWWAWVFqMDiKTh/tzwRb2NW9+eukzCR5hXfv7LrzSy39qi062zoCCo1u3Svfzgqjhaxl3+2yeGHoBeO9aun1uVlpb5k0KK3TOfr22/rxcX9vz9vZwYAQcSIqGQ2VMz1LMQnFwpmLHEXXCiR7knGLzSriYl20+diqxelnHyGf2I8kWRwmZdHBVmvP92nIpHlyeXBQwUrF2d3N4wvi3KqXqH132i
+NgEKjS0OtglZMwGphq5U/2J8c7edyr5b3oY9rTuxzPHZ/+IH17vn0+4eHw6dPJ9UqC1mEZqRcmGPwHoplV03iINK+ixUHrb+Iy4hMsllpjOUtPojLuv0BxZGYZZrOz/FTn8cR3dvn8JCUdAmiC9HOTu70dMLxcv84eYjhenZzCCg0ujnvUjX5KAHzkZ1m3jWyXx6GJ08zbxo4WMv+8mKDnVKm32eFo+LPP09dd1IpT6pPB5Xn7d2zarnKQhBMCmPG/CGFWIj3Rt9RgplWGm2Q07qMNpT6ocv4URQksCpLnggh3WRarUa7/SQIqriLS3MHqd3uLgs9MqE2LZ3GpNtYrjQikCwBQqPJZqjcRGDdCWRzmcb5tFwo7u4NBn3HN2PV7jYsL5EaXhbKAPbCaOie5qftb3548u+/cSojFoSoVv1aLWRfqYT0skE7I100k4kjXegi6sUE2VbM8A7xEWNRRCM5RR1ndQX5jE9tDoif1ddutzsa9c/PC8Ph88t1BxOp4qJMrOzt75vxLc2m6bikb9EiXrr3mATkET4mXeW9lgRYm/Cs5//6aeXF2/H+XkCHleEw1ogVWGw0IEeLXK6QdTgajxl4yDTTxhI6cjqOXyqF5TJtikG97pfLRhfpEMpsaPhtcy4jj8TSaNsXUURkD41ELNFCZo2hRRARZUMFucja9YybaLdZg/DrMGRR+2W0DqKCNHru7+eYRLTXIwJsDDcV1iYCqyAgIVwFdZW5UgJ8dnvDIJ/LuHmHxQuZkhSxsLOSrsopwSF1c+4XzrNzo2SzvhHrVzisjNTtVt++ZXHaMU6g40wqFVzGoFod1Wqso8uYfaMrkZhfrD9lo6monVV4tJCDOR+RK6jg6em7ly9LjcYXrCgVOamPLkioIM4qvmCnwyqPRrlX+uugwkUgo16j+iXYSgLZTLM/+WTX/bHhdToOonJwYGbopmfpSj7LlDqaDl+Ovv80e0yz4OwrsfYgHviFYeiwvgWKNRpVmk1SsSYDC1yMy+VJuezTDbVWYxDCBF3EZUQaCahG01WbhkY2PEI2mknpp4oEttvnjUbvzZuTbvc4ukuiR9ekaJhEdmeHgZUEeJn+9NFLNDXXJgILCajX6EI8urmhBPj+Njr+159VfzplNCGzkuZZQ7dWQx5YYsk4VcuXQyfrHBX2F/Aul43bivKRJm5f5HAyKXc6DMzIvHnDdcZjMPH3hBUvKhV0EWeRQRoDxihGHWp6LAbV74+6XarM/KiV0ejz8XgnCoeaXBeU/vBbVozrdVbnYLVIxmzIF3w4VOWQDAGFRpPhqFzSRYBP/mAU0oO0WspzgOwRRCRMGi1YMen1bA+a5dUJcXOzzn7+6CzDbGzzgoSE0I+U5SKudVgxfka2UfGi55WGw512O/PqFRFR2vzMiEnS5HJDnEhW8w0CJlGz//ZJsAxHkCowC/nuLo2UzHVnwM7YvDzOKkkEriWgXqPXYtHFzSeAMuAUPq273zc8+p3Y7zIryY/Hucg1zPZ6zMNi+3E8Og2M8abjn0Z/Psl+MjWjF+a2bKUSYJuRs6uB07l09vRqGmpRtt5YGJZtgsitRP/4efTNFl2rZUulbLvNMErCoY9eqAoQgTsRkEd4J1xKvDkE6Cxz2vV/96zy4vS9B4YcslR7q+UwaGF/3yd+OF7S/DPZfCb/tHQQCfJ7eyxuYokcELa9t/d26X7NKt98KY/xalFBerfW60zrkzk7o8uObbN8jKKUpwjcn4CE8P7s9GTaCXj+dBJm6mWnOwwupeKiTrQaMlU3IxZcN4sr9ugznqBKo9ywN4iWAbwiUYjHpTuIbVdurTl/LK9W6eNDl1fmPTcSOAd5ze2XedtDQKHR7XnXquk8AT7MjQ59RwtMw+1cjddFrmGO1e3pq8lyhkzQPRzS/HaryOR8Mbc6z47C4Efv5d9l/zmcETu0xHV5ntZB4qKz/tytMl1JImzmh26rkSOYbTaN2ZfTfa/EIhUqAh8hII/wI4
B0e4MJEB09601ODqpOnnX45jfrvnheDteQYexmtm6PRRsepU8pyrFTLP794VeBUcEZJcxkmYkU9zQyZvb6vLVrck5FaALEEUS/cQTHY2K9KTB7TejJjFURkBCuirzKXQsCo8m0OwjoMvO6NWZ2zw9tsp9xoqMM/SZEubs7YSYau8Bhgl94shr5QbvbOTBL+F6YgajQWRQ3NGod/NC09bqCtdRiZ4dZbMxiyM2mGbyYIKL1qq2s2SwCCo1u1vtUbe5IwMllfzkf//Vx6W2HSVhufJhbfNbpR8qqD4y+RxEZbog+JdetlK6i4ZvJuyOGO7w3w7iDFErpC2y70ehl3YAMW6UyLZXoFJpjdIRtUlXv0GW9AZXzUALyCB9KUM+nmgACMxiH40nmqFagEyn9OW6qjlUjPvHMREM3TrQQzaLhkE//w10fcthxS//65B+bpnzzH1dKJSZKo3VwgVE3Gbuk67bijPRHAhmISZOqJHBJ6FVMogQkhIniVGYpJIDD9fPZ+MunRTrO3MbzQhHtR58ZSsvlgLnNxmMCp2jA/ZvDeHIwHn13+qcvps8JKEKRUhCYdpuYzY3avELYSCAb+ocjyGqGnU4+CIzNa2nsCjmp6HQQQAi1icBWE+Db3fXCIJzuVZ3OYH4cxbVo7Oced63bZe14/CHTdojrhoN47eQv12Yyf5EhjPmLqU6RGUZujEYM21g7HcQ2uoDSmdZOdtPpUGVJ4PzL1Hm6CKzp35vpgihr006A7qOvzv1nB+4fBsM7BSJRRDqzDAbOcMgqECFT0hAb9DzWNrpbvBT/quIU/2XvHxrGqWKy7Clr1rfb7vo0s6F/VJaYMKqPEBIQ7nZz/CnAxfUxMu2/h7J/VQSuDp5alRUqVwRWSoCveWsQOrlcrcTqfXc2hcfZ0L9Wyx0OWSA33N1lLQizuhPXUcqP5on8DSbefzf+J2sSszgDE6oVPvrUnQ29+wPYgD3RiAgGkExwBKNqOozoWENv9e710xMiYAiojVC/ByJgCCBmv+AUHrp/fDm0o/buysXKIQFSXCWUA5cO2cC3w22iEZGp2hAVm+a6nBGbXK3CXKCMWfTRGFyumxNfl0Gi16wGUwvXpRYTDhgx0ukwW/dFFSLb1rHxMlEMymxbCCg0ui1vWvVcTIBl61mh8GS/cFgrnPcf1FPTClgkfqyaZBQRH5FOJTQfIm8oJeFTu81KXSFX+Lx6nC/QASfEs+TBJW9W/DApnzftf47DYofMvErDJ2txGGu4tXyrlgxBxW0nAXWW2c73rlpfQwAX7C9vGVPotgYfrv9wTfrFl6zIoS70eeGHCdJQl0gRfYKNdDBBXZAZnD+jQNNM4Gfe+M3f1ViDnqAobW93D9EuNui6uxRkL+O5onyFAuFcxI85snM0fGIetmHJrGBfl42uiUC6CSg0mu73J+sTJIAmDMfTs174/Kj4w7vRtRPN3KO4SxXJMmEpAcbIrzJ9YaLYqemkGoY5H0+xkDupV8c+AxONakZaeI/SFj1ifT7j2hltm9LnBecvn6fzi5E6xG88dgaDC4fVmh3tFQJdRFX3NoCAQqMb8BJVhcQIOPnMy3P/bz8v75Tyw3HCYxestGArImcnaeOYpQHpU+Pkp7XdYFpkaHoumxtNp7hidD3FKWR8HnvbvmjkMRIz4zDyrBG0C3Ez7mN0SubmbnTMxTglqwCagiLNQ/nM4zijtF8GgZkxjgOusJGx4p8WhfbbQ0Ch0e1516rpLQlkXzQYX+/+4edRLF23fPL2yeKckS360TB+MD8Z/rnVeNZ2ciwTnGH5JzRpYhvqJpOgUHDb7U7RbOV8nqBlOBh4vu+7rus4Lk+jf6PR0HFy1WrZ84bjsbe/vxeGQavVrNfrTFETaapZ4z4IXBuPpejIYOMOxvbcvgpKKQIbQ0Ch0Y15lapIMgSYebszDId+5tP9wutzP6kA6QLjEKFafTQalY/zJ6FpLKRnClFKnuCfp1UpE8acTqth6BDARN
TCMLS+IMpXLObzeTxI0xmHyywBH4ZG6lot+uQwyOGg1SKr94MxYs27PLjwBRdYqFsisNkEFBrd7Per2t2HgJPP/tjw/+aZ2x+FPS981HEMhDbrdX88KniZaT/s10yA09h8VaVw47hSZDKzKF5qQqbZbJUY5nA4HQwQPNMPNZutECvt97mLyUXjBBodveh+epmhyVybCIjALIGl99GeLVzHIrCuBNCnP73xf/O0UCyYnpOPtBGcrFQmtM8xPv3ZSfXf/unrg4MSXt1NxSFmUfOfFUuSGX+Rld/tDyrIg1GCOAN5ezEKHYjAjQQUGr0RjW5sMwEExvMz37+bfHXs/u8vPoHIxCWFLKvVMRpmJix1ws7pNPsf3/QHr2Mfbpv5q+4isEwChEaXWZzKEoHUEKBnZdebvj4Pvjou/N+rcbL/UlDBWm2Mr8moQfpz4sj1ev5//tdPBc4UpknN74gM3RAC8gg35EWqGo9BgMm4T7thwcn+9tPCd2/o2nm5ePzDCrPtglFElA6fRgXJj8xdN/+wjPW0CIjAfQhICO9DTc9sDwG08PV5+Oww/9Wx88O7wA/oiHL/2iOBjOTb2Rn7PkPXicdcqOD9c9STIiACDyag0OiDESqDTSdgRtk3g71q7rfHzstm2BmEqOM9NsKh5fKkXJr0BwWmj4kiovfIRo+IgAgkTEAeYcJAld1GEkD52oMpc808f5KrFrNvzBC9O7iGSCCOYG2HZSWm7Q79Qk0g1EZENxKXKiUC6SIgIUzX+5K1KyNARNSfZL57HZ4c5P7qOH/WDVv9DCMdFjQc2nEXTGxWKU+KhWmP9Xs9M49opIIrq4gKFgERmCOg0OgcEJ2KwI0ErIC9aoZFN3uwk/2ylul7mVYvQ8MhE7ugifyPh6ORfFOEk2m1d8pcnvaHuV6f8YgM+DMJtImACKwVAXmEa/U6ZEwKCDC8YTzJvG7S7SWzW8l8dpT3Rl6xWBoMR5lcIZzSgsicLsZV9LxJd1AcmcnSIsdR4dAUvF6ZuI0EJITb+NZV5wcSQNGYuQz3r9njhznY3HyO9YwKLCKB2+dP8kHItJ8Uwr8vI4HaREAE1pmAQqPr/HZk27oTiJYzMkYGofkZ+xeih/hpXPy6vzzZJwKXBOQRXpLQ/0XgAQTk9j0Anh4VgRUTkBCu+AWoeBEQAREQgdUSUGh0tfxVugiIgAiIwIoJyCNc8QtQ8SIgAiIgAqslICFcLX+VLgIiIAIisGICCo2u+AWoeBEQAREQgdUSkEe4Wv4qXQREQAREYMUE7jWL/optVvEiIAIiIAIikBgBQqOa9yIxmspIBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQTkEabulclgERABERCBJAmojTBJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQUGk3dK5PBIiACIiACSRJQaDRJmspLBERABEQgdQQIjabOZhksAiIgAiIgAokRkEeYGEplJAIiIAIikEYCEsI0vjXZLAIiIAIikBgBhUYTQ6mMREAEREAE0khAHmEa35psFgEREAERSIyAhk8khlIZiYAIiIAIpJGABtSn8a3JZhEQAREQgcQIKDSaGEplJAIiIAIikEYCEsI0vjXZLAIiIAIikBgBtREmhlIZiYAIiIAIpJGA2gjT+NZkswiIgAiIQGIE5BEmhlIZiYAIiIAIpJGA2gjT+NZkswiIgAiIQGIEFBpNDKUyEgEREAERSCMBhUbT+NZkswiIgAiIQGIEFBpNDKUyEg
EREAERSCMBhUbT+NZkswiIgAiIQGIE/h9thFomZy+F4wAAAABJRU5ErkJggg==",
"text/plain": [
""
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"from PIL import Image\n",
"example = Image.open(\"assets/example_mesh_orientation.png\").resize((600, 400))\n",
"example.show()\n",
"# Make sure the input mesh's forward direction corresponds to +X and the upward direction corresponds to +Y"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"example/pumpkin/mesh.obj\n"
]
},
{
"data": {
"text/plain": [
"'example/pumpkin/proxy.txt'"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"import os\n",
"import trimesh\n",
"from ldm.util import sample_proxy\n",
"\n",
"# Make sure the input mesh's forward direction corresponds to +X and the upward direction corresponds to +Y\n",
"# Make sure the input image and the mesh are aligned in the same orientation\n",
"root_dir = \"example/pumpkin\"\n",
"sample_proxy(root_dir)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.2"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
================================================
FILE: raymarching/__init__.py
================================================
from .raymarching import *
================================================
FILE: raymarching/backend.py
================================================
import os
from torch.utils.cpp_extension import load
_src_path = os.path.dirname(os.path.abspath(__file__))
nvcc_flags = [
'-O3', '-std=c++14',
'-U__CUDA_NO_HALF_OPERATORS__', '-U__CUDA_NO_HALF_CONVERSIONS__', '-U__CUDA_NO_HALF2_OPERATORS__',
]
if os.name == "posix":
c_flags = ['-O3', '-std=c++14']
elif os.name == "nt":
c_flags = ['/O2', '/std:c++17']
# find cl.exe
def find_cl_path():
import glob
for edition in ["Enterprise", "Professional", "BuildTools", "Community"]:
paths = sorted(glob.glob(r"C:\\Program Files (x86)\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64" % edition), reverse=True)
if paths:
return paths[0]
# If cl.exe is not on path, try to find it.
if os.system("where cl.exe >nul 2>nul") != 0:
cl_path = find_cl_path()
if cl_path is None:
raise RuntimeError("Could not locate a supported Microsoft Visual C++ installation")
os.environ["PATH"] += ";" + cl_path
_backend = load(name='_raymarching',
extra_cflags=c_flags,
extra_cuda_cflags=nvcc_flags,
sources=[os.path.join(_src_path, 'src', f) for f in [
'raymarching.cu',
'bindings.cpp',
]],
)
__all__ = ['_backend']
================================================
FILE: raymarching/raymarching.py
================================================
import numpy as np
import time
import torch
import torch.nn as nn
from torch.autograd import Function
from torch.cuda.amp import custom_bwd, custom_fwd
try:
import _raymarching as _backend
except ImportError:
from .backend import _backend
# ----------------------------------------
# utils
# ----------------------------------------
class _near_far_from_aabb(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32)
def forward(ctx, rays_o, rays_d, aabb, min_near=0.2):
''' near_far_from_aabb, CUDA implementation
Calculate rays' intersection time (near and far) with aabb
Args:
rays_o: float, [N, 3]
rays_d: float, [N, 3]
aabb: float, [6], (xmin, ymin, zmin, xmax, ymax, zmax)
min_near: float, scalar
Returns:
nears: float, [N]
fars: float, [N]
'''
if not rays_o.is_cuda: rays_o = rays_o.cuda()
if not rays_d.is_cuda: rays_d = rays_d.cuda()
rays_o = rays_o.contiguous().view(-1, 3)
rays_d = rays_d.contiguous().view(-1, 3)
N = rays_o.shape[0] # num rays
nears = torch.empty(N, dtype=rays_o.dtype, device=rays_o.device)
fars = torch.empty(N, dtype=rays_o.dtype, device=rays_o.device)
_backend.near_far_from_aabb(rays_o, rays_d, aabb, N, min_near, nears, fars)
return nears, fars
near_far_from_aabb = _near_far_from_aabb.apply
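# For reference, the near/far computation above is the classic slab-method ray/AABB
# intersection; a minimal pure-Python sketch for a single ray (function name and
# return convention for misses are illustrative, not part of the extension API):

```python
def near_far_from_aabb_py(o, d, aabb, min_near=0.2):
    """Slab-method ray/AABB intersection for a single ray.

    o, d: 3-tuples (origin, direction); aabb: (xmin, ymin, zmin, xmax, ymax, zmax).
    Returns (near, far); near is clamped to min_near, and near > far means a miss.
    """
    near, far = -float("inf"), float("inf")
    for i in range(3):
        if d[i] == 0.0:
            # Ray parallel to this slab: miss if the origin lies outside it.
            if not (aabb[i] <= o[i] <= aabb[i + 3]):
                return float("inf"), -float("inf")
            continue
        t0 = (aabb[i] - o[i]) / d[i]
        t1 = (aabb[i + 3] - o[i]) / d[i]
        if t0 > t1:
            t0, t1 = t1, t0
        near = max(near, t0)  # latest entry across all slabs
        far = min(far, t1)    # earliest exit across all slabs
    return max(near, min_near), far
```

# e.g. a ray at (0, 0, -2) looking down +z through the unit cube [-0.5, 0.5]^3
# enters at t = 1.5 and exits at t = 2.5.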
class _sph_from_ray(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32)
def forward(ctx, rays_o, rays_d, radius):
''' sph_from_ray, CUDA implementation
get spherical coordinate on the background sphere from rays.
Assume rays_o are inside the Sphere(radius).
Args:
rays_o: [N, 3]
rays_d: [N, 3]
radius: scalar, float
Return:
coords: [N, 2], in [-1, 1], theta and phi on the sphere (the farther intersection with the surface).
'''
if not rays_o.is_cuda: rays_o = rays_o.cuda()
if not rays_d.is_cuda: rays_d = rays_d.cuda()
rays_o = rays_o.contiguous().view(-1, 3)
rays_d = rays_d.contiguous().view(-1, 3)
N = rays_o.shape[0] # num rays
coords = torch.empty(N, 2, dtype=rays_o.dtype, device=rays_o.device)
_backend.sph_from_ray(rays_o, rays_d, radius, N, coords)
return coords
sph_from_ray = _sph_from_ray.apply
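# Geometrically, the lookup above intersects each ray with the background sphere and
# takes the farther root of the quadratic |o + t*d|^2 = r^2; a pure-Python sketch of
# that step (the kernel's theta/phi normalization into [-1, 1] is not reproduced here):

```python
import math

def sphere_far_hit(o, d, radius):
    """Farther intersection point of the ray o + t*d with a sphere at the origin.

    Assumes o lies inside the sphere (as the kernel does), so the larger
    root of |o + t*d|^2 = radius^2 is the one in front of the ray.
    """
    dd = sum(x * x for x in d)
    od = sum(a * b for a, b in zip(o, d))
    oo = sum(x * x for x in o)
    disc = od * od - dd * (oo - radius * radius)  # >= 0 when o is inside
    t = (-od + math.sqrt(disc)) / dd  # larger root
    return tuple(a + t * b for a, b in zip(o, d))
```

# The hit point always lies on the sphere surface, so converting it to spherical
# coordinates gives the (theta, phi) the kernel writes into `coords`.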
class _morton3D(Function):
@staticmethod
def forward(ctx, coords):
''' morton3D, CUDA implementation
Args:
coords: [N, 3], int32, in [0, 128) (for some reason there is no uint32 tensor in torch...)
TODO: check if the coord range is valid! (current 128 is safe)
Returns:
indices: [N], int32, in [0, 128^3)
'''
if not coords.is_cuda: coords = coords.cuda()
N = coords.shape[0]
indices = torch.empty(N, dtype=torch.int32, device=coords.device)
_backend.morton3D(coords.int(), N, indices)
return indices
morton3D = _morton3D.apply
class _morton3D_invert(Function):
@staticmethod
def forward(ctx, indices):
''' morton3D_invert, CUDA implementation
Args:
indices: [N], int32, in [0, 128^3)
Returns:
coords: [N, 3], int32, in [0, 128)
'''
if not indices.is_cuda: indices = indices.cuda()
N = indices.shape[0]
coords = torch.empty(N, 3, dtype=torch.int32, device=indices.device)
_backend.morton3D_invert(indices.int(), N, coords)
return coords
morton3D_invert = _morton3D_invert.apply
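# Morton (Z-order) encoding interleaves the bits of x, y, z so that nearby grid cells
# map to nearby indices. A plain-Python roundtrip sketch for the 7-bit [0, 128) range
# used above; the bit-order convention (x in the lowest bit) is an assumption and may
# differ from the CUDA kernel's:

```python
def morton3d_py(x, y, z, bits=7):
    """Interleave the low `bits` bits of x, y, z into one Z-order index."""
    idx = 0
    for i in range(bits):
        idx |= ((x >> i) & 1) << (3 * i)
        idx |= ((y >> i) & 1) << (3 * i + 1)
        idx |= ((z >> i) & 1) << (3 * i + 2)
    return idx

def morton3d_invert_py(idx, bits=7):
    """De-interleave a Z-order index back into (x, y, z)."""
    x = y = z = 0
    for i in range(bits):
        x |= ((idx >> (3 * i)) & 1) << i
        y |= ((idx >> (3 * i + 1)) & 1) << i
        z |= ((idx >> (3 * i + 2)) & 1) << i
    return x, y, z
```

# With 7 bits per axis the index range is exactly [0, 128^3), matching the docstrings.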
class _packbits(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32)
def forward(ctx, grid, thresh, bitfield=None):
''' packbits, CUDA implementation
Pack up the density grid into a bit field to accelerate ray marching.
Args:
grid: float, [C, H * H * H], assume H % 2 == 0
thresh: float, threshold
Returns:
bitfield: uint8, [C, H * H * H / 8]
'''
if not grid.is_cuda: grid = grid.cuda()
grid = grid.contiguous()
C = grid.shape[0]
H3 = grid.shape[1]
N = C * H3 // 8
if bitfield is None:
bitfield = torch.empty(N, dtype=torch.uint8, device=grid.device)
_backend.packbits(grid, N, thresh, bitfield)
return bitfield
packbits = _packbits.apply
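# The packing itself is simple: each byte of the bitfield stores the occupancy of 8
# consecutive grid cells. A pure-Python sketch of the same layout (the exact bit order
# within a byte is an assumption here):

```python
def packbits_py(grid, thresh):
    """Pack a flat list of densities into bytes, one bit per cell.

    Bit k of byte j corresponds to cell 8*j + k; a set bit means
    grid value > thresh (the cell is considered occupied).
    """
    assert len(grid) % 8 == 0
    out = bytearray(len(grid) // 8)
    for j in range(len(out)):
        b = 0
        for k in range(8):
            if grid[8 * j + k] > thresh:
                b |= 1 << k
        out[j] = b
    return bytes(out)
```

# During marching, testing a cell then costs a single bit read instead of a float
# compare, which is why the bitfield accelerates occupancy queries.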
# ----------------------------------------
# train functions
# ----------------------------------------
class _march_rays_train(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32)
def forward(ctx, rays_o, rays_d, bound, density_bitfield, C, H, nears, fars, step_counter=None, mean_count=-1, perturb=False, align=-1, force_all_rays=False, dt_gamma=0, max_steps=1024):
''' march rays to generate points (forward only)
Args:
rays_o/d: float, [N, 3]
bound: float, scalar
density_bitfield: uint8: [CHHH // 8]
C: int
H: int
nears/fars: float, [N]
step_counter: int32, (2), used to count the actual number of generated points.
mean_count: int32, estimated mean steps to accelerate training. (but will randomly drop rays if the actual point count exceeds this threshold.)
perturb: bool
align: int, pad output so its size is divisible by align; set to -1 to disable.
force_all_rays: bool, ignore step_counter and mean_count, always calculate all rays. Useful if rendering the whole image, instead of some rays.
dt_gamma: float, called cone_angle in instant-ngp, exponentially accelerates ray marching if > 0. (very significant speedup, but generally leads to worse quality)
max_steps: int, max number of sampled points along each ray, also affect min_stepsize.
Returns:
xyzs: float, [M, 3], all generated points' coords. (all rays concatenated; use `rays` to extract the points belonging to each ray)
dirs: float, [M, 3], all generated points' view dirs.
deltas: float, [M, 2], all generated points' deltas. (first for RGB, second for Depth)
rays: int32, [N, 3], all rays' (index, point_offset, point_count), e.g., xyzs[rays[i, 1]:rays[i, 1] + rays[i, 2]] --> points belonging to rays[i, 0]
'''
if not rays_o.is_cuda: rays_o = rays_o.cuda()
if not rays_d.is_cuda: rays_d = rays_d.cuda()
if not density_bitfield.is_cuda: density_bitfield = density_bitfield.cuda()
rays_o = rays_o.contiguous().view(-1, 3)
rays_d = rays_d.contiguous().view(-1, 3)
density_bitfield = density_bitfield.contiguous()
N = rays_o.shape[0] # num rays
M = N * max_steps # init max points number in total
# running average based on previous epoch (mimic `measured_batch_size_before_compaction` in instant-ngp)
# It estimates the max number of points to enable faster training, but rays will be randomly ignored if it is underestimated.
if not force_all_rays and mean_count > 0:
if align > 0:
mean_count += align - mean_count % align
M = mean_count
xyzs = torch.zeros(M, 3, dtype=rays_o.dtype, device=rays_o.device)
dirs = torch.zeros(M, 3, dtype=rays_o.dtype, device=rays_o.device)
deltas = torch.zeros(M, 2, dtype=rays_o.dtype, device=rays_o.device)
rays = torch.empty(N, 3, dtype=torch.int32, device=rays_o.device) # id, offset, num_steps
if step_counter is None:
step_counter = torch.zeros(2, dtype=torch.int32, device=rays_o.device) # point counter, ray counter
if perturb:
noises = torch.rand(N, dtype=rays_o.dtype, device=rays_o.device)
else:
noises = torch.zeros(N, dtype=rays_o.dtype, device=rays_o.device)
_backend.march_rays_train(rays_o, rays_d, density_bitfield, bound, dt_gamma, max_steps, N, C, H, M, nears, fars, xyzs, dirs, deltas, rays, step_counter, noises) # m is the actually used points number
#print(step_counter, M)
# only used at the first (few) epochs.
if force_all_rays or mean_count <= 0:
m = step_counter[0].item() # D2H copy
if align > 0:
m += align - m % align
xyzs = xyzs[:m]
dirs = dirs[:m]
deltas = deltas[:m]
torch.cuda.empty_cache()
return xyzs, dirs, deltas, rays
march_rays_train = _march_rays_train.apply
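# The `rays` tensor returned above is essentially a CSR-style index: each row stores
# (ray id, offset, count) into the flat point buffers. A pure-Python sketch of how a
# consumer slices per-ray points back out of the packed arrays (names are illustrative):

```python
def split_points_by_ray(xyzs, rays):
    """Group a packed point list back into per-ray lists.

    xyzs: flat list of points; rays: list of (ray_id, offset, count) triples.
    Returns {ray_id: [points]}.
    """
    return {rid: xyzs[off:off + cnt] for rid, off, cnt in rays}
```

# Packing all rays' samples into one flat buffer keeps the network batch dense even
# though each ray generates a different number of samples.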
class _composite_rays_train(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32)
def forward(ctx, sigmas, rgbs, deltas, rays, T_thresh=1e-4):
''' composite rays' rgbs, according to the ray marching formula.
Args:
rgbs: float, [M, 3]
sigmas: float, [M,]
deltas: float, [M, 2]
rays: int32, [N, 3]
Returns:
weights_sum: float, [N,], the alpha channel
depth: float, [N, ], the Depth
image: float, [N, 3], the RGB channel (after multiplying alpha!)
'''
sigmas = sigmas.contiguous()
rgbs = rgbs.contiguous()
M = sigmas.shape[0]
N = rays.shape[0]
weights_sum = torch.empty(N, dtype=sigmas.dtype, device=sigmas.device)
depth = torch.empty(N, dtype=sigmas.dtype, device=sigmas.device)
image = torch.empty(N, 3, dtype=sigmas.dtype, device=sigmas.device)
_backend.composite_rays_train_forward(sigmas, rgbs, deltas, rays, M, N, T_thresh, weights_sum, depth, image)
ctx.save_for_backward(sigmas, rgbs, deltas, rays, weights_sum, depth, image)
ctx.dims = [M, N, T_thresh]
return weights_sum, depth, image
@staticmethod
@custom_bwd
def backward(ctx, grad_weights_sum, grad_depth, grad_image):
# NOTE: grad_depth is not used now! It won't be propagated to sigmas.
grad_weights_sum = grad_weights_sum.contiguous()
grad_image = grad_image.contiguous()
sigmas, rgbs, deltas, rays, weights_sum, depth, image = ctx.saved_tensors
M, N, T_thresh = ctx.dims
grad_sigmas = torch.zeros_like(sigmas)
grad_rgbs = torch.zeros_like(rgbs)
_backend.composite_rays_train_backward(grad_weights_sum, grad_image, sigmas, rgbs, deltas, rays, weights_sum, image, M, N, T_thresh, grad_sigmas, grad_rgbs)
return grad_sigmas, grad_rgbs, None, None, None
composite_rays_train = _composite_rays_train.apply
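# The forward pass above implements the standard volume-rendering quadrature:
# alpha_i = 1 - exp(-sigma_i * delta_i), weight w_i = T_i * alpha_i with transmittance
# T_i = prod_{j<i}(1 - alpha_j), stopping early once T falls below T_thresh.
# A minimal single-ray sketch of that recurrence:

```python
import math

def composite_ray_py(sigmas, rgbs, deltas, T_thresh=1e-4):
    """Composite one ray's samples front to back.

    sigmas: densities; rgbs: per-sample (r, g, b) colors; deltas: step sizes.
    Returns (weights_sum, rgb) where rgb is premultiplied by alpha.
    """
    T = 1.0                 # transmittance so far
    weights_sum = 0.0       # accumulated alpha
    rgb = [0.0, 0.0, 0.0]
    for sigma, color, delta in zip(sigmas, rgbs, deltas):
        alpha = 1.0 - math.exp(-sigma * delta)
        w = T * alpha
        weights_sum += w
        for c in range(3):
            rgb[c] += w * color[c]
        T *= 1.0 - alpha
        if T < T_thresh:    # early termination, matching the kernel
            break
    return weights_sum, rgb
```

# A single fully opaque sample therefore returns its own color with alpha 1, and an
# empty ray returns zeros, which is why `image` is described as alpha-premultiplied.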
# ----------------------------------------
# infer functions
# ----------------------------------------
class _march_rays(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32)
def forward(ctx, n_alive, n_step, rays_alive, rays_t, rays_o, rays_d, bound, density_bitfield, C, H, near, far, align=-1, perturb=False, dt_gamma=0, max_steps=1024):
''' march rays to generate points (forward only, for inference)
Args:
n_alive: int, number of alive rays
n_step: int, how many steps we march
rays_alive: int, [N], the alive rays' IDs in N (N >= n_alive, but we only use first n_alive)
rays_t: float, [N], the alive rays' time, we only use the first n_alive.
rays_o/d: float, [N, 3]
bound: float, scalar
density_bitfield: uint8: [CHHH // 8]
C: int
H: int
nears/fars: float, [N]
align: int, pad output so its size is divisible by align; set to -1 to disable.
perturb: bool/int, int > 0 is used as the random seed.
dt_gamma: float, called cone_angle in instant-ngp, exponentially accelerates ray marching if > 0. (very significant speedup, but generally leads to worse quality)
max_steps: int, max number of sampled points along each ray, also affect min_stepsize.
Returns:
xyzs: float, [n_alive * n_step, 3], all generated points' coords
dirs: float, [n_alive * n_step, 3], all generated points' view dirs.
deltas: float, [n_alive * n_step, 2], all generated points' deltas (here we record two deltas, the first is for RGB, the second for depth).
'''
if not rays_o.is_cuda: rays_o = rays_o.cuda()
if not rays_d.is_cuda: rays_d = rays_d.cuda()
rays_o = rays_o.contiguous().view(-1, 3)
rays_d = rays_d.contiguous().view(-1, 3)
M = n_alive * n_step
if align > 0:
M += align - (M % align)
xyzs = torch.zeros(M, 3, dtype=rays_o.dtype, device=rays_o.device)
dirs = torch.zeros(M, 3, dtype=rays_o.dtype, device=rays_o.device)
deltas = torch.zeros(M, 2, dtype=rays_o.dtype, device=rays_o.device) # 2 vals, one for rgb, one for depth
if perturb:
# torch.manual_seed(perturb) # test_gui uses spp index as seed
noises = torch.rand(n_alive, dtype=rays_o.dtype, device=rays_o.device)
else:
noises = torch.zeros(n_alive, dtype=rays_o.dtype, device=rays_o.device)
_backend.march_rays(n_alive, n_step, rays_alive, rays_t, rays_o, rays_d, bound, dt_gamma, max_steps, C, H, density_bitfield, near, far, xyzs, dirs, deltas, noises)
return xyzs, dirs, deltas
march_rays = _march_rays.apply
class _composite_rays(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32) # need to cast sigmas & rgbs to float
def forward(ctx, n_alive, n_step, rays_alive, rays_t, sigmas, rgbs, deltas, weights_sum, depth, image, T_thresh=1e-2):
''' composite rays' rgbs, according to the ray marching formula. (for inference)
Args:
n_alive: int, number of alive rays
n_step: int, how many steps we march
rays_alive: int, [n_alive], the alive rays' IDs in N (N >= n_alive)
rays_t: float, [N], the alive rays' time
sigmas: float, [n_alive * n_step,]
rgbs: float, [n_alive * n_step, 3]
deltas: float, [n_alive * n_step, 2], all generated points' deltas (here we record two deltas, the first is for RGB, the second for depth).
In-place Outputs:
weights_sum: float, [N,], the alpha channel
depth: float, [N,], the depth value
image: float, [N, 3], the RGB channel (after multiplying alpha!)
'''
_backend.composite_rays(n_alive, n_step, T_thresh, rays_alive, rays_t, sigmas, rgbs, deltas, weights_sum, depth, image)
return tuple()
composite_rays = _composite_rays.apply
================================================
FILE: raymarching/setup.py
================================================
import os
from setuptools import setup
from torch.utils.cpp_extension import BuildExtension, CUDAExtension
_src_path = os.path.dirname(os.path.abspath(__file__))
nvcc_flags = [
'-O3', '-std=c++14',
'-U__CUDA_NO_HALF_OPERATORS__', '-U__CUDA_NO_HALF_CONVERSIONS__', '-U__CUDA_NO_HALF2_OPERATORS__',
]
if os.name == "posix":
c_flags = ['-O3', '-std=c++14']
elif os.name == "nt":
c_flags = ['/O2', '/std:c++17']
# find cl.exe
def find_cl_path():
import glob
for edition in ["Enterprise", "Professional", "BuildTools", "Community"]:
paths = sorted(glob.glob(r"C:\\Program Files (x86)\\Microsoft Visual Studio\\*\\%s\\VC\\Tools\\MSVC\\*\\bin\\Hostx64\\x64" % edition), reverse=True)
if paths:
return paths[0]
# If cl.exe is not on path, try to find it.
if os.system("where cl.exe >nul 2>nul") != 0:
cl_path = find_cl_path()
if cl_path is None:
raise RuntimeError("Could not locate a supported Microsoft Visual C++ installation")
os.environ["PATH"] += ";" + cl_path
'''
Usage:
python setup.py build_ext --inplace # build extensions locally, do not install (can only be used from the parent directory)
python setup.py install # build extensions and install (copy) to PATH.
pip install . # ditto but better (e.g., dependency & metadata handling)
python setup.py develop # build extensions and install (symbolic) to PATH.
pip install -e . # ditto but better (e.g., dependency & metadata handling)
'''
setup(
name='raymarching', # package name, import this to use python API
ext_modules=[
CUDAExtension(
name='_raymarching', # extension name, import this to use CUDA API
sources=[os.path.join(_src_path, 'src', f) for f in [
'raymarching.cu',
'bindings.cpp',
]],
extra_compile_args={
'cxx': c_flags,
'nvcc': nvcc_flags,
}
),
],
cmdclass={
'build_ext': BuildExtension,
}
)
================================================
FILE: raymarching/src/bindings.cpp
================================================
#include <torch/extension.h>
#include "raymarching.h"
PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {
// utils
m.def("packbits", &packbits, "packbits (CUDA)");
m.def("near_far_from_aabb", &near_far_from_aabb, "near_far_from_aabb (CUDA)");
m.def("sph_from_ray", &sph_from_ray, "sph_from_ray (CUDA)");
m.def("morton3D", &morton3D, "morton3D (CUDA)");
m.def("morton3D_invert", &morton3D_invert, "morton3D_invert (CUDA)");
// train
m.def("march_rays_train", &march_rays_train, "march_rays_train (CUDA)");
m.def("composite_rays_train_forward", &composite_rays_train_forward, "composite_rays_train_forward (CUDA)");
m.def("composite_rays_train_backward", &composite_rays_train_backward, "composite_rays_train_backward (CUDA)");
// infer
m.def("march_rays", &march_rays, "march rays (CUDA)");
m.def("composite_rays", &composite_rays, "composite rays (CUDA)");
}
================================================
FILE: raymarching/src/raymarching.cu
================================================
#include <cuda.h>
#include <cuda_fp16.h>
#include <cuda_runtime.h>
#include <ATen/cuda/CUDAContext.h>
#include <torch/torch.h>
#include <cstdio>
#include <stdint.h>
#include <stdexcept>
#include <limits>
#define CHECK_CUDA(x) TORCH_CHECK(x.device().is_cuda(), #x " must be a CUDA tensor")
#define CHECK_CONTIGUOUS(x) TORCH_CHECK(x.is_contiguous(), #x " must be a contiguous tensor")
#define CHECK_IS_INT(x) TORCH_CHECK(x.scalar_type() == at::ScalarType::Int, #x " must be an int tensor")
#define CHECK_IS_FLOATING(x) TORCH_CHECK(x.scalar_type() == at::ScalarType::Float || x.scalar_type() == at::ScalarType::Half || x.scalar_type() == at::ScalarType::Double, #x " must be a floating tensor")
inline constexpr __device__ float SQRT3() { return 1.7320508075688772f; }
inline constexpr __device__ float RSQRT3() { return 0.5773502691896258f; }
inline constexpr __device__ float PI() { return 3.141592653589793f; }
inline constexpr __device__ float RPI() { return 0.3183098861837907f; }
template <typename T>
inline __host__ __device__ T div_round_up(T val, T divisor) {
return (val + divisor - 1) / divisor;
}
inline __host__ __device__ float signf(const float x) {
return copysignf(1.0, x);
}
inline __host__ __device__ float clamp(const float x, const float min, const float max) {
return fminf(max, fmaxf(min, x));
}
inline __host__ __device__ void swapf(float& a, float& b) {
float c = a; a = b; b = c;
}
inline __device__ int mip_from_pos(const float x, const float y, const float z, const float max_cascade) {
const float mx = fmaxf(fabsf(x), fmaxf(fabs(y), fabs(z)));
int exponent;
frexpf(mx, &exponent); // [0, 0.5) --> -1, [0.5, 1) --> 0, [1, 2) --> 1, [2, 4) --> 2, ...
return fminf(max_cascade - 1, fmaxf(0, exponent));
}
inline __device__ int mip_from_dt(const float dt, const float H, const float max_cascade) {
const float mx = dt * H * 0.5;
int exponent;
frexpf(mx, &exponent);
return fminf(max_cascade - 1, fmaxf(0, exponent));
}
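The two mip-selection helpers above both reduce to "take the binary exponent of a magnitude, clamped to the valid cascade range": `frexpf` writes the exponent `e` such that the input equals `m * 2^e` with `m` in `[0.5, 1)`. A minimal Python sketch of the same rule, using `math.frexp` (function names mirror the kernel helpers; this is an illustration, not part of the extension):

```python
import math

def mip_from_pos(x, y, z, max_cascade):
    # Pick the cascade whose [-2^k, 2^k] box contains the point:
    # frexp yields the binary exponent of the largest |coordinate|.
    mx = max(abs(x), abs(y), abs(z))
    _, exponent = math.frexp(mx)          # mx = m * 2**exponent, m in [0.5, 1)
    return min(max_cascade - 1, max(0, exponent))

def mip_from_dt(dt, H, max_cascade):
    # Same rule driven by the step size: coarser steps read coarser cascades.
    _, exponent = math.frexp(dt * H * 0.5)
    return min(max_cascade - 1, max(0, exponent))
```

For example, a point at `|x| = 0.3` maps to cascade 0 (`[-1, 1]` box) while `|x| = 1.5` maps to cascade 1 (`[-2, 2]` box).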
inline __host__ __device__ uint32_t __expand_bits(uint32_t v)
{
v = (v * 0x00010001u) & 0xFF0000FFu;
v = (v * 0x00000101u) & 0x0F00F00Fu;
v = (v * 0x00000011u) & 0xC30C30C3u;
v = (v * 0x00000005u) & 0x49249249u;
return v;
}
inline __host__ __device__ uint32_t __morton3D(uint32_t x, uint32_t y, uint32_t z)
{
uint32_t xx = __expand_bits(x);
uint32_t yy = __expand_bits(y);
uint32_t zz = __expand_bits(z);
return xx | (yy << 1) | (zz << 2);
}
inline __host__ __device__ uint32_t __morton3D_invert(uint32_t x)
{
x = x & 0x49249249;
x = (x | (x >> 2)) & 0xc30c30c3;
x = (x | (x >> 4)) & 0x0f00f00f;
x = (x | (x >> 8)) & 0xff0000ff;
x = (x | (x >> 16)) & 0x0000ffff;
return x;
}
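The two helpers above are the standard 10-bit-per-axis 3D Morton (Z-order) encode/decode: `__expand_bits` spreads each coordinate's bits three apart with a chain of multiply-and-mask steps, and `__morton3D_invert` compacts every third bit back. A direct Python port with the same magic constants shows the round trip:

```python
def expand_bits(v):
    # Spread the low 10 bits of v so consecutive bits land 3 positions apart.
    v = (v * 0x00010001) & 0xFF0000FF
    v = (v * 0x00000101) & 0x0F00F00F
    v = (v * 0x00000011) & 0xC30C30C3
    v = (v * 0x00000005) & 0x49249249
    return v

def morton3d(x, y, z):
    # Interleave x, y, z bits: x in bit 0, y in bit 1, z in bit 2 of each triple.
    return expand_bits(x) | (expand_bits(y) << 1) | (expand_bits(z) << 2)

def morton3d_invert(x):
    # Compact every third bit back into a dense integer.
    x &= 0x49249249
    x = (x | (x >> 2)) & 0xC30C30C3
    x = (x | (x >> 4)) & 0x0F00F00F
    x = (x | (x >> 8)) & 0xFF0000FF
    x = (x | (x >> 16)) & 0x0000FFFF
    return x
```

Since Python masks with 32-bit constants after each multiply, the unbounded-int arithmetic matches the kernel's `uint32_t` behaviour for coordinates below 1024.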
////////////////////////////////////////////////////
///////////// utils /////////////
////////////////////////////////////////////////////
// rays_o/d: [N, 3]
// nears/fars: [N]
// scalar_t should always be float in use.
template <typename scalar_t>
__global__ void kernel_near_far_from_aabb(
const scalar_t * __restrict__ rays_o,
const scalar_t * __restrict__ rays_d,
const scalar_t * __restrict__ aabb,
const uint32_t N,
const float min_near,
scalar_t * nears, scalar_t * fars
) {
// parallel per ray
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
rays_o += n * 3;
rays_d += n * 3;
const float ox = rays_o[0], oy = rays_o[1], oz = rays_o[2];
const float dx = rays_d[0], dy = rays_d[1], dz = rays_d[2];
const float rdx = 1 / dx, rdy = 1 / dy, rdz = 1 / dz;
// get near far (assume cube scene)
float near = (aabb[0] - ox) * rdx;
float far = (aabb[3] - ox) * rdx;
if (near > far) swapf(near, far);
float near_y = (aabb[1] - oy) * rdy;
float far_y = (aabb[4] - oy) * rdy;
if (near_y > far_y) swapf(near_y, far_y);
if (near > far_y || near_y > far) {
nears[n] = fars[n] = std::numeric_limits<scalar_t>::max();
return;
}
if (near_y > near) near = near_y;
if (far_y < far) far = far_y;
float near_z = (aabb[2] - oz) * rdz;
float far_z = (aabb[5] - oz) * rdz;
if (near_z > far_z) swapf(near_z, far_z);
if (near > far_z || near_z > far) {
nears[n] = fars[n] = std::numeric_limits<scalar_t>::max();
return;
}
if (near_z > near) near = near_z;
if (far_z < far) far = far_z;
if (near < min_near) near = min_near;
nears[n] = near;
fars[n] = far;
}
void near_far_from_aabb(const at::Tensor rays_o, const at::Tensor rays_d, const at::Tensor aabb, const uint32_t N, const float min_near, at::Tensor nears, at::Tensor fars) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
rays_o.scalar_type(), "near_far_from_aabb", ([&] {
kernel_near_far_from_aabb<scalar_t><<<div_round_up(N, N_THREAD), N_THREAD>>>(rays_o.data_ptr<scalar_t>(), rays_d.data_ptr<scalar_t>(), aabb.data_ptr<scalar_t>(), N, min_near, nears.data_ptr<scalar_t>(), fars.data_ptr<scalar_t>());
}));
}
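`kernel_near_far_from_aabb` is the classic slab test: intersect the ray with each pair of axis-aligned planes, keep the running max of entry times and min of exit times, and declare a miss as soon as they cross. A scalar Python sketch of the same logic (the function name is illustrative; like the kernel, it assumes no direction component is exactly zero):

```python
def ray_aabb_near_far(o, d, aabb, min_near=0.05):
    # o, d: ray origin/direction; aabb = (xmin, ymin, zmin, xmax, ymax, zmax).
    near, far = -float("inf"), float("inf")
    for axis in range(3):
        t0 = (aabb[axis] - o[axis]) / d[axis]
        t1 = (aabb[axis + 3] - o[axis]) / d[axis]
        if t0 > t1:
            t0, t1 = t1, t0            # order entry/exit for this slab
        near, far = max(near, t0), min(far, t1)
        if near > far:                 # slabs don't overlap: ray misses the box
            return float("inf"), float("inf")
    return max(near, min_near), far    # clamp near like the kernel's min_near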
// rays_o/d: [N, 3]
// radius: float
// coords: [N, 2]
template <typename scalar_t>
__global__ void kernel_sph_from_ray(
const scalar_t * __restrict__ rays_o,
const scalar_t * __restrict__ rays_d,
const float radius,
const uint32_t N,
scalar_t * coords
) {
// parallel per ray
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
rays_o += n * 3;
rays_d += n * 3;
coords += n * 2;
const float ox = rays_o[0], oy = rays_o[1], oz = rays_o[2];
const float dx = rays_d[0], dy = rays_d[1], dz = rays_d[2];
const float rdx = 1 / dx, rdy = 1 / dy, rdz = 1 / dz;
// solve t from || o + td || = radius
const float A = dx * dx + dy * dy + dz * dz;
const float B = ox * dx + oy * dy + oz * dz; // in fact B / 2
const float C = ox * ox + oy * oy + oz * oz - radius * radius;
const float t = (- B + sqrtf(B * B - A * C)) / A; // always use the larger solution (positive)
// solve theta, phi (assume y is the up axis)
const float x = ox + t * dx, y = oy + t * dy, z = oz + t * dz;
const float theta = atan2(sqrtf(x * x + z * z), y); // [0, PI)
const float phi = atan2(z, x); // [-PI, PI)
// normalize to [-1, 1]
coords[0] = 2 * theta * RPI() - 1;
coords[1] = phi * RPI();
}
void sph_from_ray(const at::Tensor rays_o, const at::Tensor rays_d, const float radius, const uint32_t N, at::Tensor coords) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
rays_o.scalar_type(), "sph_from_ray", ([&] {
kernel_sph_from_ray<scalar_t><<<div_round_up(N, N_THREAD), N_THREAD>>>(rays_o.data_ptr<scalar_t>(), rays_d.data_ptr<scalar_t>(), radius, N, coords.data_ptr<scalar_t>());
}));
}
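The sphere intersection above solves the quadratic `||o + t d||^2 = radius^2`; because `B` stores half the usual linear coefficient, the discriminant simplifies to `B^2 - AC` and the larger root is `(-B + sqrt(B^2 - AC)) / A`. The hit point is then converted to spherical coordinates with y up and both angles normalized to `[-1, 1]`. A Python sketch of the same math (illustrative only):

```python
import math

def sph_from_ray(o, d, radius):
    # Solve ||o + t*d|| = radius, taking the larger (positive) root.
    A = sum(c * c for c in d)
    B = sum(oc * dc for oc, dc in zip(o, d))      # half the linear coefficient
    C = sum(c * c for c in o) - radius * radius
    t = (-B + math.sqrt(B * B - A * C)) / A
    x, y, z = (oc + t * dc for oc, dc in zip(o, d))
    theta = math.atan2(math.hypot(x, z), y)       # polar angle, y is up
    phi = math.atan2(z, x)                        # azimuth in [-pi, pi]
    return 2 * theta / math.pi - 1, phi / math.pi # both mapped to [-1, 1]
```

A ray from the origin straight up hits the "north pole", giving `theta = 0` and a first coordinate of -1.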
// coords: int32, [N, 3]
// indices: int32, [N]
__global__ void kernel_morton3D(
const int * __restrict__ coords,
const uint32_t N,
int * indices
) {
// parallel
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
coords += n * 3;
indices[n] = __morton3D(coords[0], coords[1], coords[2]);
}
void morton3D(const at::Tensor coords, const uint32_t N, at::Tensor indices) {
static constexpr uint32_t N_THREAD = 128;
kernel_morton3D<<<div_round_up(N, N_THREAD), N_THREAD>>>(coords.data_ptr<int>(), N, indices.data_ptr<int>());
}
// indices: int32, [N]
// coords: int32, [N, 3]
__global__ void kernel_morton3D_invert(
const int * __restrict__ indices,
const uint32_t N,
int * coords
) {
// parallel
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
coords += n * 3;
const int ind = indices[n];
coords[0] = __morton3D_invert(ind >> 0);
coords[1] = __morton3D_invert(ind >> 1);
coords[2] = __morton3D_invert(ind >> 2);
}
void morton3D_invert(const at::Tensor indices, const uint32_t N, at::Tensor coords) {
static constexpr uint32_t N_THREAD = 128;
kernel_morton3D_invert<<<div_round_up(N, N_THREAD), N_THREAD>>>(indices.data_ptr<int>(), N, coords.data_ptr<int>());
}
// grid: float, [C, H, H, H]
// N: int, C * H * H * H / 8
// density_thresh: float
// bitfield: uint8, [N]
template <typename scalar_t>
__global__ void kernel_packbits(
const scalar_t * __restrict__ grid,
const uint32_t N,
const float density_thresh,
uint8_t * bitfield
) {
// parallel per byte
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
grid += n * 8;
uint8_t bits = 0;
#pragma unroll
for (uint8_t i = 0; i < 8; i++) {
bits |= (grid[i] > density_thresh) ? ((uint8_t)1 << i) : 0;
}
bitfield[n] = bits;
}
void packbits(const at::Tensor grid, const uint32_t N, const float density_thresh, at::Tensor bitfield) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
grid.scalar_type(), "packbits", ([&] {
kernel_packbits<scalar_t><<<div_round_up(N, N_THREAD), N_THREAD>>>(grid.data_ptr<scalar_t>(), N, density_thresh, bitfield.data_ptr<uint8_t>());
}));
}
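`kernel_packbits` threshold-tests eight consecutive grid cells per thread and packs the results LSB-first into one byte, so cell `8*n + i` lands in bit `i` of byte `n`. A Python sketch of the same layout (illustrative function, not part of the extension):

```python
def packbits(grid, density_thresh):
    # grid: flat list of densities, length divisible by 8.
    # Bit i of output byte n covers grid[8*n + i] (LSB-first, as in the kernel).
    out = []
    for n in range(len(grid) // 8):
        bits = 0
        for i in range(8):
            if grid[8 * n + i] > density_thresh:
                bits |= 1 << i
        out.append(bits)
    return out
```

The marching kernels later test occupancy with `grid[index / 8] & (1 << (index % 8))`, which is exactly the inverse of this packing.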
////////////////////////////////////////////////////
///////////// training /////////////
////////////////////////////////////////////////////
// rays_o/d: [N, 3]
// grid: [CHHH / 8]
// xyzs, dirs, deltas: [M, 3], [M, 3], [M, 2]
// dirs: [M, 3]
// rays: [N, 3], idx, offset, num_steps
template <typename scalar_t>
__global__ void kernel_march_rays_train(
const scalar_t * __restrict__ rays_o,
const scalar_t * __restrict__ rays_d,
const uint8_t * __restrict__ grid,
const float bound,
const float dt_gamma, const uint32_t max_steps,
const uint32_t N, const uint32_t C, const uint32_t H, const uint32_t M,
const scalar_t* __restrict__ nears,
const scalar_t* __restrict__ fars,
scalar_t * xyzs, scalar_t * dirs, scalar_t * deltas,
int * rays,
int * counter,
const scalar_t* __restrict__ noises
) {
// parallel per ray
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
rays_o += n * 3;
rays_d += n * 3;
// ray marching
const float ox = rays_o[0], oy = rays_o[1], oz = rays_o[2];
const float dx = rays_d[0], dy = rays_d[1], dz = rays_d[2];
const float rdx = 1 / dx, rdy = 1 / dy, rdz = 1 / dz;
const float rH = 1 / (float)H;
const float H3 = H * H * H;
const float near = nears[n];
const float far = fars[n];
const float noise = noises[n];
const float dt_min = 2 * SQRT3() / max_steps;
const float dt_max = 2 * SQRT3() * (1 << (C - 1)) / H;
float t0 = near;
// perturb
t0 += clamp(t0 * dt_gamma, dt_min, dt_max) * noise;
// first pass: estimation of num_steps
float t = t0;
uint32_t num_steps = 0;
//if (t < far) printf("valid ray %d t=%f near=%f far=%f \n", n, t, near, far);
while (t < far && num_steps < max_steps) {
// current point
const float x = clamp(ox + t * dx, -bound, bound);
const float y = clamp(oy + t * dy, -bound, bound);
const float z = clamp(oz + t * dz, -bound, bound);
const float dt = clamp(t * dt_gamma, dt_min, dt_max);
// get mip level
const int level = max(mip_from_pos(x, y, z, C), mip_from_dt(dt, H, C)); // range in [0, C - 1]
const float mip_bound = fminf(scalbnf(1.0f, level), bound);
const float mip_rbound = 1 / mip_bound;
// convert to nearest grid position
const int nx = clamp(0.5 * (x * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const int ny = clamp(0.5 * (y * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const int nz = clamp(0.5 * (z * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const uint32_t index = level * H3 + __morton3D(nx, ny, nz);
const bool occ = grid[index / 8] & (1 << (index % 8));
// if occupied, advance a small step, and write to output
//if (n == 0) printf("t=%f density=%f vs thresh=%f step=%d\n", t, density, density_thresh, num_steps);
if (occ) {
num_steps++;
t += dt;
// else, skip a large step (basically skip a voxel grid)
} else {
// calc distance to next voxel
const float tx = (((nx + 0.5f + 0.5f * signf(dx)) * rH * 2 - 1) * mip_bound - x) * rdx;
const float ty = (((ny + 0.5f + 0.5f * signf(dy)) * rH * 2 - 1) * mip_bound - y) * rdy;
const float tz = (((nz + 0.5f + 0.5f * signf(dz)) * rH * 2 - 1) * mip_bound - z) * rdz;
const float tt = t + fmaxf(0.0f, fminf(tx, fminf(ty, tz)));
// step until next voxel
do {
t += clamp(t * dt_gamma, dt_min, dt_max);
} while (t < tt);
}
}
//printf("[n=%d] num_steps=%d, near=%f, far=%f, dt=%f, max_steps=%f\n", n, num_steps, near, far, dt_min, (far - near) / dt_min);
// second pass: really locate and write points & dirs
uint32_t point_index = atomicAdd(counter, num_steps);
uint32_t ray_index = atomicAdd(counter + 1, 1);
//printf("[n=%d] num_steps=%d, point_index=%d, ray_index=%d\n", n, num_steps, point_index, ray_index);
// write rays
rays[ray_index * 3] = n;
rays[ray_index * 3 + 1] = point_index;
rays[ray_index * 3 + 2] = num_steps;
if (num_steps == 0) return;
if (point_index + num_steps > M) return;
xyzs += point_index * 3;
dirs += point_index * 3;
deltas += point_index * 2;
t = t0;
uint32_t step = 0;
float last_t = t;
while (t < far && step < num_steps) {
// current point
const float x = clamp(ox + t * dx, -bound, bound);
const float y = clamp(oy + t * dy, -bound, bound);
const float z = clamp(oz + t * dz, -bound, bound);
const float dt = clamp(t * dt_gamma, dt_min, dt_max);
// get mip level
const int level = max(mip_from_pos(x, y, z, C), mip_from_dt(dt, H, C)); // range in [0, C - 1]
const float mip_bound = fminf(scalbnf(1.0f, level), bound);
const float mip_rbound = 1 / mip_bound;
// convert to nearest grid position
const int nx = clamp(0.5 * (x * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const int ny = clamp(0.5 * (y * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const int nz = clamp(0.5 * (z * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
// query grid
const uint32_t index = level * H3 + __morton3D(nx, ny, nz);
const bool occ = grid[index / 8] & (1 << (index % 8));
// if occupied, advance a small step, and write to output
if (occ) {
// write step
xyzs[0] = x;
xyzs[1] = y;
xyzs[2] = z;
dirs[0] = dx;
dirs[1] = dy;
dirs[2] = dz;
t += dt;
deltas[0] = dt;
deltas[1] = t - last_t; // used to calc depth
last_t = t;
xyzs += 3;
dirs += 3;
deltas += 2;
step++;
// else, skip a large step (basically skip a voxel grid)
} else {
// calc distance to next voxel
const float tx = (((nx + 0.5f + 0.5f * signf(dx)) * rH * 2 - 1) * mip_bound - x) * rdx;
const float ty = (((ny + 0.5f + 0.5f * signf(dy)) * rH * 2 - 1) * mip_bound - y) * rdy;
const float tz = (((nz + 0.5f + 0.5f * signf(dz)) * rH * 2 - 1) * mip_bound - z) * rdz;
const float tt = t + fmaxf(0.0f, fminf(tx, fminf(ty, tz)));
// step until next voxel
do {
t += clamp(t * dt_gamma, dt_min, dt_max);
} while (t < tt);
}
}
}
void march_rays_train(const at::Tensor rays_o, const at::Tensor rays_d, const at::Tensor grid, const float bound, const float dt_gamma, const uint32_t max_steps, const uint32_t N, const uint32_t C, const uint32_t H, const uint32_t M, const at::Tensor nears, const at::Tensor fars, at::Tensor xyzs, at::Tensor dirs, at::Tensor deltas, at::Tensor rays, at::Tensor counter, at::Tensor noises) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
rays_o.scalar_type(), "march_rays_train", ([&] {
kernel_march_rays_train<scalar_t><<<div_round_up(N, N_THREAD), N_THREAD>>>(rays_o.data_ptr<scalar_t>(), rays_d.data_ptr<scalar_t>(), grid.data_ptr<uint8_t>(), bound, dt_gamma, max_steps, N, C, H, M, nears.data_ptr<scalar_t>(), fars.data_ptr<scalar_t>(), xyzs.data_ptr<scalar_t>(), dirs.data_ptr<scalar_t>(), deltas.data_ptr<scalar_t>(), rays.data_ptr<int>(), counter.data_ptr<int>(), noises.data_ptr<scalar_t>());
}));
}
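`kernel_march_rays_train` works in two passes per ray: the first pass only counts samples, then a single `atomicAdd` on a global counter reserves a contiguous slice of the flat output buffers, and the second pass writes into that slice. Rays whose reservation spills past the buffer capacity `M` are recorded but skipped when writing. A Python sketch of just this allocation bookkeeping (`two_pass_compact` is a hypothetical helper; a sequential loop stands in for the atomic counter):

```python
def two_pass_compact(per_ray_samples, M):
    # per_ray_samples: num_steps counted in pass 1, one entry per ray.
    # Each ray reserves [offset, offset + num_steps) in the flat buffers.
    counter = 0
    rays = []  # (ray_id, offset, num_steps), matching the `rays` tensor layout
    for n, num_steps in enumerate(per_ray_samples):
        offset = counter
        counter += num_steps          # stands in for atomicAdd(counter, num_steps)
        rays.append((n, offset, num_steps))
    # empty rays and rays whose slice exceeds M are dropped by the bounds check
    valid = [r for r in rays if r[2] > 0 and r[1] + r[2] <= M]
    return rays, valid
```

This is why the kernel can write `rays[ray_index]` unconditionally but bails out before touching `xyzs`/`dirs`/`deltas` when `point_index + num_steps > M`.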
// sigmas: [M]
// rgbs: [M, 3]
// deltas: [M, 2]
// rays: [N, 3], idx, offset, num_steps
// weights_sum: [N], final pixel alpha
// depth: [N,]
// image: [N, 3]
template <typename scalar_t>
__global__ void kernel_composite_rays_train_forward(
const scalar_t * __restrict__ sigmas,
const scalar_t * __restrict__ rgbs,
const scalar_t * __restrict__ deltas,
const int * __restrict__ rays,
const uint32_t M, const uint32_t N, const float T_thresh,
scalar_t * weights_sum,
scalar_t * depth,
scalar_t * image
) {
// parallel per ray
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
uint32_t index = rays[n * 3];
uint32_t offset = rays[n * 3 + 1];
uint32_t num_steps = rays[n * 3 + 2];
// empty ray, or ray that exceed max step count.
if (num_steps == 0 || offset + num_steps > M) {
weights_sum[index] = 0;
depth[index] = 0;
image[index * 3] = 0;
image[index * 3 + 1] = 0;
image[index * 3 + 2] = 0;
return;
}
sigmas += offset;
rgbs += offset * 3;
deltas += offset * 2;
// accumulate
uint32_t step = 0;
scalar_t T = 1.0f;
scalar_t r = 0, g = 0, b = 0, ws = 0, t = 0, d = 0;
while (step < num_steps) {
const scalar_t alpha = 1.0f - __expf(- sigmas[0] * deltas[0]);
const scalar_t weight = alpha * T;
r += weight * rgbs[0];
g += weight * rgbs[1];
b += weight * rgbs[2];
t += deltas[1]; // real delta
d += weight * t;
ws += weight;
T *= 1.0f - alpha;
// minimal remaining transmittance
if (T < T_thresh) break;
//printf("[n=%d] num_steps=%d, alpha=%f, w=%f, T=%f, sum_dt=%f, d=%f\n", n, step, alpha, weight, T, sum_delta, d);
// locate
sigmas++;
rgbs += 3;
deltas += 2;
step++;
}
//printf("[n=%d] rgb=(%f, %f, %f), d=%f\n", n, r, g, b, d);
// write
weights_sum[index] = ws; // weights_sum
depth[index] = d;
image[index * 3] = r;
image[index * 3 + 1] = g;
image[index * 3 + 2] = b;
}
void composite_rays_train_forward(const at::Tensor sigmas, const at::Tensor rgbs, const at::Tensor deltas, const at::Tensor rays, const uint32_t M, const uint32_t N, const float T_thresh, at::Tensor weights_sum, at::Tensor depth, at::Tensor image) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
sigmas.scalar_type(), "composite_rays_train_forward", ([&] {
kernel_composite_rays_train_forward<scalar_t><<<div_round_up(N, N_THREAD), N_THREAD>>>(sigmas.data_ptr<scalar_t>(), rgbs.data_ptr<scalar_t>(), deltas.data_ptr<scalar_t>(), rays.data_ptr<int>(), M, N, T_thresh, weights_sum.data_ptr<scalar_t>(), depth.data_ptr<scalar_t>(), image.data_ptr<scalar_t>());
}));
}
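The forward compositing kernel is standard emission-absorption volume rendering: per sample, `alpha = 1 - exp(-sigma * dt)`, the contribution weight is `alpha * T` with `T` the accumulated transmittance, and the loop terminates early once `T` drops below `T_thresh`. A single-ray Python sketch of the same accumulation (illustrative, not the extension's API):

```python
import math

def composite_ray(sigmas, rgbs, deltas, T_thresh=1e-4):
    # sigmas: [S]; rgbs: [S] of (r, g, b); deltas: [S] of (dt, real_dt).
    T = 1.0
    r = g = b = ws = t = d = 0.0
    for sigma, rgb, (dt, real_dt) in zip(sigmas, rgbs, deltas):
        alpha = 1.0 - math.exp(-sigma * dt)
        w = alpha * T                      # contribution of this sample
        r += w * rgb[0]; g += w * rgb[1]; b += w * rgb[2]
        t += real_dt                       # marched distance, used for depth
        d += w * t
        ws += w
        T *= 1.0 - alpha
        if T < T_thresh:                   # early termination: nearly opaque
            break
    return (r, g, b), d, ws
```

A single effectively opaque sample (huge `sigma`) yields `alpha = 1`, so the ray returns that sample's color with `weights_sum = 1`.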
// grad_weights_sum: [N,]
// grad: [N, 3]
// sigmas: [M]
// rgbs: [M, 3]
// deltas: [M, 2]
// rays: [N, 3], idx, offset, num_steps
// weights_sum: [N,], weights_sum here
// image: [N, 3]
// grad_sigmas: [M]
// grad_rgbs: [M, 3]
template <typename scalar_t>
__global__ void kernel_composite_rays_train_backward(
const scalar_t * __restrict__ grad_weights_sum,
const scalar_t * __restrict__ grad_image,
const scalar_t * __restrict__ sigmas,
const scalar_t * __restrict__ rgbs,
const scalar_t * __restrict__ deltas,
const int * __restrict__ rays,
const scalar_t * __restrict__ weights_sum,
const scalar_t * __restrict__ image,
const uint32_t M, const uint32_t N, const float T_thresh,
scalar_t * grad_sigmas,
scalar_t * grad_rgbs
) {
// parallel per ray
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= N) return;
// locate
uint32_t index = rays[n * 3];
uint32_t offset = rays[n * 3 + 1];
uint32_t num_steps = rays[n * 3 + 2];
if (num_steps == 0 || offset + num_steps > M) return;
grad_weights_sum += index;
grad_image += index * 3;
weights_sum += index;
image += index * 3;
sigmas += offset;
rgbs += offset * 3;
deltas += offset * 2;
grad_sigmas += offset;
grad_rgbs += offset * 3;
// accumulate
uint32_t step = 0;
scalar_t T = 1.0f;
const scalar_t r_final = image[0], g_final = image[1], b_final = image[2], ws_final = weights_sum[0];
scalar_t r = 0, g = 0, b = 0, ws = 0;
while (step < num_steps) {
const scalar_t alpha = 1.0f - __expf(- sigmas[0] * deltas[0]);
const scalar_t weight = alpha * T;
r += weight * rgbs[0];
g += weight * rgbs[1];
b += weight * rgbs[2];
ws += weight;
T *= 1.0f - alpha;
// check https://note.kiui.moe/others/nerf_gradient/ for the gradient calculation.
// write grad_rgbs
grad_rgbs[0] = grad_image[0] * weight;
grad_rgbs[1] = grad_image[1] * weight;
grad_rgbs[2] = grad_image[2] * weight;
// write grad_sigmas
grad_sigmas[0] = deltas[0] * (
grad_image[0] * (T * rgbs[0] - (r_final - r)) +
grad_image[1] * (T * rgbs[1] - (g_final - g)) +
grad_image[2] * (T * rgbs[2] - (b_final - b)) +
grad_weights_sum[0] * (1 - ws_final)
);
//printf("[n=%d] num_steps=%d, T=%f, grad_sigmas=%f, r_final=%f, r=%f\n", n, step, T, grad_sigmas[0], r_final, r);
// minimal remaining transmittance
if (T < T_thresh) break;
// locate
sigmas++;
rgbs += 3;
deltas += 2;
grad_sigmas++;
grad_rgbs += 3;
step++;
}
}
void composite_rays_train_backward(const at::Tensor grad_weights_sum, const at::Tensor grad_image, const at::Tensor sigmas, const at::Tensor rgbs, const at::Tensor deltas, const at::Tensor rays, const at::Tensor weights_sum, const at::Tensor image, const uint32_t M, const uint32_t N, const float T_thresh, at::Tensor grad_sigmas, at::Tensor grad_rgbs) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
grad_image.scalar_type(), "composite_rays_train_backward", ([&] {
kernel_composite_rays_train_backward<scalar_t><<<div_round_up(N, N_THREAD), N_THREAD>>>(grad_weights_sum.data_ptr<scalar_t>(), grad_image.data_ptr<scalar_t>(), sigmas.data_ptr<scalar_t>(), rgbs.data_ptr<scalar_t>(), deltas.data_ptr<scalar_t>(), rays.data_ptr<int>(), weights_sum.data_ptr<scalar_t>(), image.data_ptr<scalar_t>(), M, N, T_thresh, grad_sigmas.data_ptr<scalar_t>(), grad_rgbs.data_ptr<scalar_t>());
}));
}
////////////////////////////////////////////////////
///////////// inference /////////////
////////////////////////////////////////////////////
template <typename scalar_t>
__global__ void kernel_march_rays(
const uint32_t n_alive,
const uint32_t n_step,
const int* __restrict__ rays_alive,
const scalar_t* __restrict__ rays_t,
const scalar_t* __restrict__ rays_o,
const scalar_t* __restrict__ rays_d,
const float bound,
const float dt_gamma, const uint32_t max_steps,
const uint32_t C, const uint32_t H,
const uint8_t * __restrict__ grid,
const scalar_t* __restrict__ nears,
const scalar_t* __restrict__ fars,
scalar_t* xyzs, scalar_t* dirs, scalar_t* deltas,
const scalar_t* __restrict__ noises
) {
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= n_alive) return;
const int index = rays_alive[n]; // ray id
const float noise = noises[n];
// locate
rays_o += index * 3;
rays_d += index * 3;
xyzs += n * n_step * 3;
dirs += n * n_step * 3;
deltas += n * n_step * 2;
const float ox = rays_o[0], oy = rays_o[1], oz = rays_o[2];
const float dx = rays_d[0], dy = rays_d[1], dz = rays_d[2];
const float rdx = 1 / dx, rdy = 1 / dy, rdz = 1 / dz;
const float rH = 1 / (float)H;
const float H3 = H * H * H;
float t = rays_t[index]; // current ray's t
const float near = nears[index], far = fars[index];
const float dt_min = 2 * SQRT3() / max_steps;
const float dt_max = 2 * SQRT3() * (1 << (C - 1)) / H;
// march for n_step steps, record points
uint32_t step = 0;
// introduce some randomness
t += clamp(t * dt_gamma, dt_min, dt_max) * noise;
float last_t = t;
while (t < far && step < n_step) {
// current point
const float x = clamp(ox + t * dx, -bound, bound);
const float y = clamp(oy + t * dy, -bound, bound);
const float z = clamp(oz + t * dz, -bound, bound);
const float dt = clamp(t * dt_gamma, dt_min, dt_max);
// get mip level
const int level = max(mip_from_pos(x, y, z, C), mip_from_dt(dt, H, C)); // range in [0, C - 1]
const float mip_bound = fminf(scalbnf(1, level), bound);
const float mip_rbound = 1 / mip_bound;
// convert to nearest grid position
const int nx = clamp(0.5 * (x * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const int ny = clamp(0.5 * (y * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const int nz = clamp(0.5 * (z * mip_rbound + 1) * H, 0.0f, (float)(H - 1));
const uint32_t index = level * H3 + __morton3D(nx, ny, nz);
const bool occ = grid[index / 8] & (1 << (index % 8));
// if occupied, advance a small step, and write to output
if (occ) {
// write step
xyzs[0] = x;
xyzs[1] = y;
xyzs[2] = z;
dirs[0] = dx;
dirs[1] = dy;
dirs[2] = dz;
// calc dt
t += dt;
deltas[0] = dt;
deltas[1] = t - last_t; // used to calc depth
last_t = t;
// step
xyzs += 3;
dirs += 3;
deltas += 2;
step++;
// else, skip a large step (basically skip a voxel grid)
} else {
// calc distance to next voxel
const float tx = (((nx + 0.5f + 0.5f * signf(dx)) * rH * 2 - 1) * mip_bound - x) * rdx;
const float ty = (((ny + 0.5f + 0.5f * signf(dy)) * rH * 2 - 1) * mip_bound - y) * rdy;
const float tz = (((nz + 0.5f + 0.5f * signf(dz)) * rH * 2 - 1) * mip_bound - z) * rdz;
const float tt = t + fmaxf(0.0f, fminf(tx, fminf(ty, tz)));
// step until next voxel
do {
t += clamp(t * dt_gamma, dt_min, dt_max);
} while (t < tt);
}
}
}
void march_rays(const uint32_t n_alive, const uint32_t n_step, const at::Tensor rays_alive, const at::Tensor rays_t, const at::Tensor rays_o, const at::Tensor rays_d, const float bound, const float dt_gamma, const uint32_t max_steps, const uint32_t C, const uint32_t H, const at::Tensor grid, const at::Tensor near, const at::Tensor far, at::Tensor xyzs, at::Tensor dirs, at::Tensor deltas, at::Tensor noises) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
rays_o.scalar_type(), "march_rays", ([&] {
kernel_march_rays<scalar_t><<<div_round_up(n_alive, N_THREAD), N_THREAD>>>(n_alive, n_step, rays_alive.data_ptr<int>(), rays_t.data_ptr<scalar_t>(), rays_o.data_ptr<scalar_t>(), rays_d.data_ptr<scalar_t>(), bound, dt_gamma, max_steps, C, H, grid.data_ptr<uint8_t>(), near.data_ptr<scalar_t>(), far.data_ptr<scalar_t>(), xyzs.data_ptr<scalar_t>(), dirs.data_ptr<scalar_t>(), deltas.data_ptr<scalar_t>(), noises.data_ptr<scalar_t>());
}));
}
template <typename scalar_t>
__global__ void kernel_composite_rays(
const uint32_t n_alive,
const uint32_t n_step,
const float T_thresh,
int* rays_alive,
scalar_t* rays_t,
const scalar_t* __restrict__ sigmas,
const scalar_t* __restrict__ rgbs,
const scalar_t* __restrict__ deltas,
scalar_t* weights_sum, scalar_t* depth, scalar_t* image
) {
const uint32_t n = threadIdx.x + blockIdx.x * blockDim.x;
if (n >= n_alive) return;
const int index = rays_alive[n]; // ray id
// locate
sigmas += n * n_step;
rgbs += n * n_step * 3;
deltas += n * n_step * 2;
rays_t += index;
weights_sum += index;
depth += index;
image += index * 3;
scalar_t t = rays_t[0]; // current ray's t
scalar_t weight_sum = weights_sum[0];
scalar_t d = depth[0];
scalar_t r = image[0];
scalar_t g = image[1];
scalar_t b = image[2];
// accumulate
uint32_t step = 0;
while (step < n_step) {
// ray is terminated if delta == 0
if (deltas[0] == 0) break;
const scalar_t alpha = 1.0f - __expf(- sigmas[0] * deltas[0]);
/*
T_0 = 1; T_i = \prod_{j=0}^{i-1} (1 - alpha_j)
w_i = alpha_i * T_i
-->
T_i = 1 - \sum_{j=0}^{i-1} w_j
*/
const scalar_t T = 1 - weight_sum;
const scalar_t weight = alpha * T;
weight_sum += weight;
t += deltas[1]; // real delta
d += weight * t;
r += weight * rgbs[0];
g += weight * rgbs[1];
b += weight * rgbs[2];
//printf("[n=%d] num_steps=%d, alpha=%f, w=%f, T=%f, sum_dt=%f, d=%f\n", n, step, alpha, weight, T, sum_delta, d);
// ray is terminated if T is too small
// use a larger bound to further accelerate inference
if (T < T_thresh) break;
// locate
sigmas++;
rgbs += 3;
deltas += 2;
step++;
}
//printf("[n=%d] rgb=(%f, %f, %f), d=%f\n", n, r, g, b, d);
// rays_alive = -1 means ray is terminated early.
if (step < n_step) {
rays_alive[n] = -1;
} else {
rays_t[0] = t;
}
weights_sum[0] = weight_sum; // this is the thing I needed!
depth[0] = d;
image[0] = r;
image[1] = g;
image[2] = b;
}
void composite_rays(const uint32_t n_alive, const uint32_t n_step, const float T_thresh, at::Tensor rays_alive, at::Tensor rays_t, at::Tensor sigmas, at::Tensor rgbs, at::Tensor deltas, at::Tensor weights, at::Tensor depth, at::Tensor image) {
static constexpr uint32_t N_THREAD = 128;
AT_DISPATCH_FLOATING_TYPES_AND_HALF(
image.scalar_type(), "composite_rays", ([&] {
kernel_composite_rays<scalar_t><<<div_round_up(n_alive, N_THREAD), N_THREAD>>>(n_alive, n_step, T_thresh, rays_alive.data_ptr<int>(), rays_t.data_ptr<scalar_t>(), sigmas.data_ptr<scalar_t>(), rgbs.data_ptr<scalar_t>(), deltas.data_ptr<scalar_t>(), weights.data_ptr<scalar_t>(), depth.data_ptr<scalar_t>(), image.data_ptr<scalar_t>());
}));
}
================================================
FILE: raymarching/src/raymarching.h
================================================
#pragma once
#include <stdint.h>
#include <torch/torch.h>
void near_far_from_aabb(const at::Tensor rays_o, const at::Tensor rays_d, const at::Tensor aabb, const uint32_t N, const float min_near, at::Tensor nears, at::Tensor fars);
void sph_from_ray(const at::Tensor rays_o, const at::Tensor rays_d, const float radius, const uint32_t N, at::Tensor coords);
void morton3D(const at::Tensor coords, const uint32_t N, at::Tensor indices);
void morton3D_invert(const at::Tensor indices, const uint32_t N, at::Tensor coords);
void packbits(const at::Tensor grid, const uint32_t N, const float density_thresh, at::Tensor bitfield);
void march_rays_train(const at::Tensor rays_o, const at::Tensor rays_d, const at::Tensor grid, const float bound, const float dt_gamma, const uint32_t max_steps, const uint32_t N, const uint32_t C, const uint32_t H, const uint32_t M, const at::Tensor nears, const at::Tensor fars, at::Tensor xyzs, at::Tensor dirs, at::Tensor deltas, at::Tensor rays, at::Tensor counter, at::Tensor noises);
void composite_rays_train_forward(const at::Tensor sigmas, const at::Tensor rgbs, const at::Tensor deltas, const at::Tensor rays, const uint32_t M, const uint32_t N, const float T_thresh, at::Tensor weights_sum, at::Tensor depth, at::Tensor image);
void composite_rays_train_backward(const at::Tensor grad_weights_sum, const at::Tensor grad_image, const at::Tensor sigmas, const at::Tensor rgbs, const at::Tensor deltas, const at::Tensor rays, const at::Tensor weights_sum, const at::Tensor image, const uint32_t M, const uint32_t N, const float T_thresh, at::Tensor grad_sigmas, at::Tensor grad_rgbs);
void march_rays(const uint32_t n_alive, const uint32_t n_step, const at::Tensor rays_alive, const at::Tensor rays_t, const at::Tensor rays_o, const at::Tensor rays_d, const float bound, const float dt_gamma, const uint32_t max_steps, const uint32_t C, const uint32_t H, const at::Tensor grid, const at::Tensor nears, const at::Tensor fars, at::Tensor xyzs, at::Tensor dirs, at::Tensor deltas, at::Tensor noises);
void composite_rays(const uint32_t n_alive, const uint32_t n_step, const float T_thresh, at::Tensor rays_alive, at::Tensor rays_t, at::Tensor sigmas, at::Tensor rgbs, at::Tensor deltas, at::Tensor weights_sum, at::Tensor depth, at::Tensor image);
================================================
FILE: renderer/agg_net.py
================================================
import torch.nn.functional as F
import torch.nn as nn
import torch
def weights_init(m):
if isinstance(m, nn.Linear):
nn.init.kaiming_normal_(m.weight.data)
if m.bias is not None:
nn.init.zeros_(m.bias.data)
class NeRF(nn.Module):
def __init__(self, vol_n=8+8, feat_ch=8+16+32+3, hid_n=64):
super(NeRF, self).__init__()
self.hid_n = hid_n
self.agg = Agg(feat_ch)
self.lr0 = nn.Sequential(nn.Linear(vol_n+16, hid_n), nn.ReLU())
self.sigma = nn.Sequential(nn.Linear(hid_n, 1), nn.Softplus())
self.color = nn.Sequential(
nn.Linear(16+vol_n+feat_ch+hid_n+4, hid_n), # agg_feats+vox_feat+img_feat+lr0_feats+dir
nn.ReLU(),
nn.Linear(hid_n, 1)
)
self.lr0.apply(weights_init)
self.sigma.apply(weights_init)
self.color.apply(weights_init)
def forward(self, vox_feat, img_feat_rgb_dir, source_img_mask):
# assert torch.sum(torch.sum(source_img_mask,1)<2)==0
b, d, n, _ = img_feat_rgb_dir.shape # b,d,n,f=8+16+32+3+4
agg_feat = self.agg(img_feat_rgb_dir, source_img_mask) # b,d,f=16
x = self.lr0(torch.cat((vox_feat, agg_feat), dim=-1)) # b,d,f=64
sigma = self.sigma(x) # b,d,1
x = torch.cat((x, vox_feat, agg_feat), dim=-1) # b,d,f=16+16+64
x = x.view(b, d, 1, x.shape[-1]).repeat(1, 1, n, 1)
x = torch.cat((x, img_feat_rgb_dir), dim=-1)
logits = self.color(x)
source_img_mask_ = source_img_mask.reshape(b, 1, n, 1).repeat(1, logits.shape[1], 1, 1) == 0
logits[source_img_mask_] = -1e7
color_weight = F.softmax(logits, dim=-2)
color = torch.sum((img_feat_rgb_dir[..., -7:-4] * color_weight), dim=-2)
return color, sigma
class Agg(nn.Module):
def __init__(self, feat_ch):
super(Agg, self).__init__()
self.feat_ch = feat_ch
self.view_fc = nn.Sequential(nn.Linear(4, feat_ch), nn.ReLU())
self.view_fc.apply(weights_init)
self.global_fc = nn.Sequential(nn.Linear(feat_ch*3, 32), nn.ReLU())
self.agg_w_fc = nn.Linear(32, 1)
self.fc = nn.Linear(32, 16)
self.global_fc.apply(weights_init)
self.agg_w_fc.apply(weights_init)
self.fc.apply(weights_init)
def masked_mean_var(self, img_feat_rgb, source_img_mask):
# img_feat_rgb: b,d,n,f source_img_mask: b,n
b, n = source_img_mask.shape
source_img_mask = source_img_mask.view(b, 1, n, 1)
mean = torch.sum(source_img_mask * img_feat_rgb, dim=-2)/ (torch.sum(source_img_mask, dim=-2) + 1e-5)
var = torch.sum((img_feat_rgb - mean.unsqueeze(-2)) ** 2 * source_img_mask, dim=-2) / (torch.sum(source_img_mask, dim=-2) + 1e-5)
return mean, var
def forward(self, img_feat_rgb_dir, source_img_mask):
# img_feat_rgb_dir b,d,n,f
b, d, n, _ = img_feat_rgb_dir.shape
view_feat = self.view_fc(img_feat_rgb_dir[..., -4:]) # b,d,n,f-4
img_feat_rgb = img_feat_rgb_dir[..., :-4] + view_feat
mean_feat, var_feat = self.masked_mean_var(img_feat_rgb, source_img_mask)
var_feat = var_feat.view(b, -1, 1, self.feat_ch).repeat(1, 1, n, 1)
avg_feat = mean_feat.view(b, -1, 1, self.feat_ch).repeat(1, 1, n, 1)
feat = torch.cat([img_feat_rgb, var_feat, avg_feat], dim=-1) # b,d,n,f
global_feat = self.global_fc(feat) # b,d,n,f
logits = self.agg_w_fc(global_feat) # b,d,n,1
source_img_mask_ = source_img_mask.reshape(b, 1, n, 1).repeat(1, logits.shape[1], 1, 1) == 0
logits[source_img_mask_] = -1e7
agg_w = F.softmax(logits, dim=-2)
im_feat = (global_feat * agg_w).sum(dim=-2)
return self.fc(im_feat)
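Both `Agg.forward` and the color head above use the same masked-softmax pattern: logits of padded/invalid source views are filled with `-1e7` so that after `softmax` their weight underflows to zero and the valid entries renormalize. A minimal pure-Python sketch of that pattern (the helper name `masked_softmax` is illustrative, not part of the codebase):

```python
import math

def masked_softmax(logits, mask):
    """Softmax over logits, forcing masked-out entries to ~zero weight.

    Mirrors the pattern in Agg.forward: masked logits are filled with a
    large negative value, so exp() underflows and valid entries renormalize.
    """
    filled = [l if m else -1e7 for l, m in zip(logits, mask)]
    top = max(filled)
    exps = [math.exp(l - top) for l in filled]
    total = sum(exps)
    return [e / total for e in exps]

# third source view is masked out; its weight vanishes
weights = masked_softmax([1.0, 2.0, 3.0], [True, True, False])
```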
================================================
FILE: renderer/cost_reg_net.py
================================================
import torch.nn as nn
class ConvBnReLU3D(nn.Module):
def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, pad=1, norm_act=nn.BatchNorm3d):
super(ConvBnReLU3D, self).__init__()
self.conv = nn.Conv3d(in_channels, out_channels, kernel_size, stride=stride, padding=pad, bias=False)
self.bn = norm_act(out_channels)
self.relu = nn.ReLU(inplace=True)
def forward(self, x):
return self.relu(self.bn(self.conv(x)))
class CostRegNet(nn.Module):
def __init__(self, in_channels, norm_act=nn.BatchNorm3d):
super(CostRegNet, self).__init__()
self.conv0 = ConvBnReLU3D(in_channels, 8, norm_act=norm_act)
self.conv1 = ConvBnReLU3D(8, 16, stride=2, norm_act=norm_act)
self.conv2 = ConvBnReLU3D(16, 16, norm_act=norm_act)
self.conv3 = ConvBnReLU3D(16, 32, stride=2, norm_act=norm_act)
self.conv4 = ConvBnReLU3D(32, 32, norm_act=norm_act)
self.conv5 = ConvBnReLU3D(32, 64, stride=2, norm_act=norm_act)
self.conv6 = ConvBnReLU3D(64, 64, norm_act=norm_act)
self.conv7 = nn.Sequential(
nn.ConvTranspose3d(64, 32, 3, padding=1, output_padding=1, stride=2, bias=False),
norm_act(32)
)
self.conv9 = nn.Sequential(
nn.ConvTranspose3d(32, 16, 3, padding=1, output_padding=1, stride=2, bias=False),
norm_act(16)
)
self.conv11 = nn.Sequential(
nn.ConvTranspose3d(16, 8, 3, padding=1, output_padding=1, stride=2, bias=False),
norm_act(8)
)
self.depth_conv = nn.Sequential(nn.Conv3d(8, 1, 3, padding=1, bias=False))
self.feat_conv = nn.Sequential(nn.Conv3d(8, 8, 3, padding=1, bias=False))
def forward(self, x):
conv0 = self.conv0(x)
conv2 = self.conv2(self.conv1(conv0))
conv4 = self.conv4(self.conv3(conv2))
x = self.conv6(self.conv5(conv4))
x = conv4 + self.conv7(x)
del conv4
x = conv2 + self.conv9(x)
del conv2
x = conv0 + self.conv11(x)
del conv0
feat = self.feat_conv(x)
depth = self.depth_conv(x)
return feat, depth
class MinCostRegNet(nn.Module):
def __init__(self, in_channels, norm_act=nn.BatchNorm3d):
super(MinCostRegNet, self).__init__()
self.conv0 = ConvBnReLU3D(in_channels, 8, norm_act=norm_act)
self.conv1 = ConvBnReLU3D(8, 16, stride=2, norm_act=norm_act)
self.conv2 = ConvBnReLU3D(16, 16, norm_act=norm_act)
self.conv3 = ConvBnReLU3D(16, 32, stride=2, norm_act=norm_act)
self.conv4 = ConvBnReLU3D(32, 32, norm_act=norm_act)
self.conv9 = nn.Sequential(
nn.ConvTranspose3d(32, 16, 3, padding=1, output_padding=1,
stride=2, bias=False),
norm_act(16))
self.conv11 = nn.Sequential(
nn.ConvTranspose3d(16, 8, 3, padding=1, output_padding=1,
stride=2, bias=False),
norm_act(8))
self.depth_conv = nn.Sequential(nn.Conv3d(8, 1, 3, padding=1, bias=False))
self.feat_conv = nn.Sequential(nn.Conv3d(8, 8, 3, padding=1, bias=False))
def forward(self, x):
conv0 = self.conv0(x)
conv2 = self.conv2(self.conv1(conv0))
conv4 = self.conv4(self.conv3(conv2))
x = conv4
x = conv2 + self.conv9(x)
del conv2
x = conv0 + self.conv11(x)
del conv0
feat = self.feat_conv(x)
depth = self.depth_conv(x)
return feat, depth
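The skip additions in both regularization networks (`conv4 + self.conv7(x)`, `conv2 + self.conv9(x)`, `conv0 + self.conv11(x)`) only work because the stride-2 convolutions exactly halve the volume resolution and the transposed convolutions (kernel 3, padding 1, output_padding 1, stride 2) exactly double it back. A quick sanity check of that arithmetic using the standard PyTorch size formulas (these helpers are illustrative):

```python
def conv3d_out(size, kernel=3, stride=1, pad=1):
    # floor((size + 2*pad - kernel) / stride) + 1
    return (size + 2 * pad - kernel) // stride + 1

def convtranspose3d_out(size, kernel=3, stride=2, pad=1, output_pad=1):
    # (size - 1)*stride - 2*pad + kernel + output_pad
    return (size - 1) * stride - 2 * pad + kernel + output_pad

d = 64                          # example cost-volume side length
d1 = conv3d_out(d, stride=2)    # conv1: 64 -> 32
d3 = conv3d_out(d1, stride=2)   # conv3: 32 -> 16
d5 = conv3d_out(d3, stride=2)   # conv5: 16 -> 8
u7 = convtranspose3d_out(d5)    # conv7: 8 -> 16, matches conv4's output size
```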
================================================
FILE: renderer/dummy_dataset.py
================================================
import pytorch_lightning as pl
from torch.utils.data import Dataset
import webdataset as wds
from torch.utils.data.distributed import DistributedSampler
class DummyDataset(pl.LightningDataModule):
def __init__(self, seed):  # note: seed is accepted but unused
super().__init__()
def setup(self, stage):
if stage in ['fit']:
self.train_dataset = DummyData(True)
self.val_dataset = DummyData(False)
else:
raise NotImplementedError
def train_dataloader(self):
return wds.WebLoader(self.train_dataset, batch_size=1, num_workers=0, shuffle=False)
def val_dataloader(self):
return wds.WebLoader(self.val_dataset, batch_size=1, num_workers=0, shuffle=False)
def test_dataloader(self):
return wds.WebLoader(DummyData(False))
class DummyData(Dataset):
def __init__(self,is_train):
self.is_train=is_train
def __len__(self):
if self.is_train:
return 99999999
else:
return 1
def __getitem__(self, index):
return {}
================================================
FILE: renderer/feature_net.py
================================================
import torch.nn as nn
import torch.nn.functional as F
class ConvBnReLU(nn.Module):
def __init__(self, in_channels, out_channels, kernel_size=3, stride=1, pad=1, norm_act=nn.BatchNorm2d):
super(ConvBnReLU, self).__init__()
self.conv = nn.Conv2d(in_channels, out_channels, kernel_size, stride=stride, padding=pad, bias=False)
self.bn = norm_act(out_channels)
self.relu = nn.ReLU(inplace=True)
def forward(self, x):
return self.relu(self.bn(self.conv(x)))
class FeatureNet(nn.Module):
def __init__(self, norm_act=nn.BatchNorm2d):
super(FeatureNet, self).__init__()
self.conv0 = nn.Sequential(ConvBnReLU(3, 8, 3, 1, 1, norm_act=norm_act), ConvBnReLU(8, 8, 3, 1, 1, norm_act=norm_act))
self.conv1 = nn.Sequential(ConvBnReLU(8, 16, 5, 2, 2, norm_act=norm_act), ConvBnReLU(16, 16, 3, 1, 1, norm_act=norm_act))
self.conv2 = nn.Sequential(ConvBnReLU(16, 32, 5, 2, 2, norm_act=norm_act), ConvBnReLU(32, 32, 3, 1, 1, norm_act=norm_act))
self.toplayer = nn.Conv2d(32, 32, 1)
self.lat1 = nn.Conv2d(16, 32, 1)
self.lat0 = nn.Conv2d(8, 32, 1)
self.smooth1 = nn.Conv2d(32, 16, 3, padding=1)
self.smooth0 = nn.Conv2d(32, 8, 3, padding=1)
def _upsample_add(self, x, y):
return F.interpolate(x, scale_factor=2, mode='bilinear', align_corners=True) + y
def forward(self, x):
conv0 = self.conv0(x)
conv1 = self.conv1(conv0)
conv2 = self.conv2(conv1)
feat2 = self.toplayer(conv2)
feat1 = self._upsample_add(feat2, self.lat1(conv1))
feat0 = self._upsample_add(feat1, self.lat0(conv0))
feat1 = self.smooth1(feat1)
feat0 = self.smooth0(feat0)
return feat2, feat1, feat0
================================================
FILE: renderer/neus_networks.py
================================================
import math
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
import tinycudann as tcnn
# Positional encoding embedding. Code was taken from https://github.com/bmild/nerf.
class Embedder:
def __init__(self, **kwargs):
self.kwargs = kwargs
self.create_embedding_fn()
def create_embedding_fn(self):
embed_fns = []
d = self.kwargs['input_dims']
out_dim = 0
if self.kwargs['include_input']:
embed_fns.append(lambda x: x)
out_dim += d
max_freq = self.kwargs['max_freq_log2']
N_freqs = self.kwargs['num_freqs']
if self.kwargs['log_sampling']:
freq_bands = 2. ** torch.linspace(0., max_freq, N_freqs)
else:
freq_bands = torch.linspace(2. ** 0., 2. ** max_freq, N_freqs)
for freq in freq_bands:
for p_fn in self.kwargs['periodic_fns']:
embed_fns.append(lambda x, p_fn=p_fn, freq=freq: p_fn(x * freq))
out_dim += d
self.embed_fns = embed_fns
self.out_dim = out_dim
def embed(self, inputs):
return torch.cat([fn(inputs) for fn in self.embed_fns], -1)
def get_embedder(multires, input_dims=3):
embed_kwargs = {
'include_input': True,
'input_dims': input_dims,
'max_freq_log2': multires - 1,
'num_freqs': multires,
'log_sampling': True,
'periodic_fns': [torch.sin, torch.cos],
}
embedder_obj = Embedder(**embed_kwargs)
def embed(x, eo=embedder_obj): return eo.embed(x)
return embed, embedder_obj.out_dim
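With `include_input` set, the embedding dimension returned by `get_embedder` is `input_dims * (1 + 2 * num_freqs)`: the raw coordinates plus one sin and one cos band per frequency, with log-sampled frequencies 2^0 .. 2^(multires-1). A minimal pure-Python version of the same encoding (the function name is illustrative):

```python
import math

def positional_encode(x, num_freqs):
    """Frequency encoding as in Embedder:
    [x, sin(2^0 x), cos(2^0 x), ..., sin(2^(L-1) x), cos(2^(L-1) x)]."""
    out = list(x)  # include_input
    for i in range(num_freqs):
        freq = 2.0 ** i  # log_sampling: frequencies 2^0 .. 2^(L-1)
        for fn in (math.sin, math.cos):
            out.extend(fn(v * freq) for v in x)
    return out

enc = positional_encode([0.1, 0.2, 0.3], num_freqs=6)
# length is 3 * (1 + 2*6) = 39, matching embedder_obj.out_dim for multires=6
```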
class SDFNetwork(nn.Module):
def __init__(self, d_in, d_out, d_hidden, n_layers, skip_in=(4,), multires=0, bias=0.5,
scale=1, geometric_init=True, weight_norm=True, inside_outside=False):
super(SDFNetwork, self).__init__()
dims = [d_in] + [d_hidden for _ in range(n_layers)] + [d_out]
self.embed_fn_fine = None
if multires > 0:
embed_fn, input_ch = get_embedder(multires, input_dims=d_in)
self.embed_fn_fine = embed_fn
dims[0] = input_ch
self.num_layers = len(dims)
self.skip_in = skip_in
self.scale = scale
for l in range(0, self.num_layers - 1):
if l + 1 in self.skip_in:
out_dim = dims[l + 1] - dims[0]
else:
out_dim = dims[l + 1]
lin = nn.Linear(dims[l], out_dim)
if geometric_init:
if l == self.num_layers - 2:
if not inside_outside:
torch.nn.init.normal_(lin.weight, mean=np.sqrt(np.pi) / np.sqrt(dims[l]), std=0.0001)
torch.nn.init.constant_(lin.bias, -bias)
else:
torch.nn.init.normal_(lin.weight, mean=-np.sqrt(np.pi) / np.sqrt(dims[l]), std=0.0001)
torch.nn.init.constant_(lin.bias, bias)
elif multires > 0 and l == 0:
torch.nn.init.constant_(lin.bias, 0.0)
torch.nn.init.constant_(lin.weight[:, 3:], 0.0)
torch.nn.init.normal_(lin.weight[:, :3], 0.0, np.sqrt(2) / np.sqrt(out_dim))
elif multires > 0 and l in self.skip_in:
torch.nn.init.constant_(lin.bias, 0.0)
torch.nn.init.normal_(lin.weight, 0.0, np.sqrt(2) / np.sqrt(out_dim))
torch.nn.init.constant_(lin.weight[:, -(dims[0] - 3):], 0.0)
else:
torch.nn.init.constant_(lin.bias, 0.0)
torch.nn.init.normal_(lin.weight, 0.0, np.sqrt(2) / np.sqrt(out_dim))
if weight_norm:
lin = nn.utils.weight_norm(lin)
setattr(self, "lin" + str(l), lin)
self.activation = nn.Softplus(beta=100)
def forward(self, inputs):
inputs = inputs * self.scale
if self.embed_fn_fine is not None:
inputs = self.embed_fn_fine(inputs)
x = inputs
for l in range(0, self.num_layers - 1):
lin = getattr(self, "lin" + str(l))
if l in self.skip_in:
x = torch.cat([x, inputs], -1) / np.sqrt(2)
x = lin(x)
if l < self.num_layers - 2:
x = self.activation(x)
return x
def sdf(self, x):
return self.forward(x)[..., :1]
def sdf_hidden_appearance(self, x):
return self.forward(x)
def gradient(self, x):
x.requires_grad_(True)
with torch.enable_grad():
y = self.sdf(x)
d_output = torch.ones_like(y, requires_grad=False, device=y.device)
gradients = torch.autograd.grad(
outputs=y,
inputs=x,
grad_outputs=d_output,
create_graph=True,
retain_graph=True,
only_inputs=True)[0]
return gradients
def sdf_normal(self, x):
x.requires_grad_(True)
with torch.enable_grad():
y = self.sdf(x)
d_output = torch.ones_like(y, requires_grad=False, device=y.device)
gradients = torch.autograd.grad(
outputs=y,
inputs=x,
grad_outputs=d_output,
create_graph=True,
retain_graph=True,
only_inputs=True)[0]
return y[..., :1].detach(), gradients.detach()
class SDFNetworkWithFeature(nn.Module):
def __init__(self, cube, dp_in, df_in, d_out, d_hidden, n_layers, skip_in=(4,), multires=0, bias=0.5,
scale=1, geometric_init=True, weight_norm=True, inside_outside=False, cube_length=0.5):
super().__init__()
self.register_buffer("cube", cube)
self.cube_length = cube_length
dims = [dp_in+df_in] + [d_hidden for _ in range(n_layers)] + [d_out]
self.embed_fn_fine = None
if multires > 0:
embed_fn, input_ch = get_embedder(multires, input_dims=dp_in)
self.embed_fn_fine = embed_fn
dims[0] = input_ch + df_in
self.num_layers = len(dims)
self.skip_in = skip_in
self.scale = scale
for l in range(0, self.num_layers - 1):
if l + 1 in self.skip_in:
out_dim = dims[l + 1] - dims[0]
else:
out_dim = dims[l + 1]
lin = nn.Linear(dims[l], out_dim)
if geometric_init:
if l == self.num_layers - 2:
if not inside_outside:
torch.nn.init.normal_(lin.weight, mean=np.sqrt(np.pi) / np.sqrt(dims[l]), std=0.0001)
torch.nn.init.constant_(lin.bias, -bias)
else:
torch.nn.init.normal_(lin.weight, mean=-np.sqrt(np.pi) / np.sqrt(dims[l]), std=0.0001)
torch.nn.init.constant_(lin.bias, bias)
elif multires > 0 and l == 0:
torch.nn.init.constant_(lin.bias, 0.0)
torch.nn.init.constant_(lin.weight[:, 3:], 0.0)
torch.nn.init.normal_(lin.weight[:, :3], 0.0, np.sqrt(2) / np.sqrt(out_dim))
elif multires > 0 and l in self.skip_in:
torch.nn.init.constant_(lin.bias, 0.0)
torch.nn.init.normal_(lin.weight, 0.0, np.sqrt(2) / np.sqrt(out_dim))
torch.nn.init.constant_(lin.weight[:, -(dims[0] - 3):], 0.0)
else:
torch.nn.init.constant_(lin.bias, 0.0)
torch.nn.init.normal_(lin.weight, 0.0, np.sqrt(2) / np.sqrt(out_dim))
if weight_norm:
lin = nn.utils.weight_norm(lin)
setattr(self, "lin" + str(l), lin)
self.activation = nn.Softplus(beta=100)
def forward(self, points):
points = points * self.scale
# normalize points into grid_sample's [-1, 1] range: the cube spans [-cube_length, cube_length] (default 0.5, so this equals points * 2)
with torch.no_grad():
feats = F.grid_sample(self.cube, points.view(1,-1,1,1,3)/self.cube_length, mode='bilinear', align_corners=True, padding_mode='zeros').detach()
feats = feats.view(self.cube.shape[1], -1).permute(1,0).view(*points.shape[:-1], -1)
if self.embed_fn_fine is not None:
points = self.embed_fn_fine(points)
x = torch.cat([points, feats], -1)
for l in range(0, self.num_layers - 1):
lin = getattr(self, "lin" + str(l))
if l in self.skip_in:
x = torch.cat([x, points, feats], -1) / np.sqrt(2)
x = lin(x)
if l < self.num_layers - 2:
x = self.activation(x)
# concat feats
x = torch.cat([x, feats], -1)
return x
def sdf(self, x):
return self.forward(x)[..., :1]
def sdf_hidden_appearance(self, x):
return self.forward(x)
def gradient(self, x):
x.requires_grad_(True)
with torch.enable_grad():
y = self.sdf(x)
d_output = torch.ones_like(y, requires_grad=False, device=y.device)
gradients = torch.autograd.grad(
outputs=y,
inputs=x,
grad_outputs=d_output,
create_graph=True,
retain_graph=True,
only_inputs=True)[0]
return gradients
def sdf_normal(self, x):
x.requires_grad_(True)
with torch.enable_grad():
y = self.sdf(x)
d_output = torch.ones_like(y, requires_grad=False, device=y.device)
gradients = torch.autograd.grad(
outputs=y,
inputs=x,
grad_outputs=d_output,
create_graph=True,
retain_graph=True,
only_inputs=True)[0]
return y[..., :1].detach(), gradients.detach()
class VanillaMLP(nn.Module):
def __init__(self, dim_in, dim_out, n_neurons, n_hidden_layers):
super().__init__()
self.n_neurons, self.n_hidden_layers = n_neurons, n_hidden_layers
self.sphere_init, self.weight_norm = True, True
self.sphere_init_radius = 0.5
self.layers = [self.make_linear(dim_in, self.n_neurons, is_first=True, is_last=False), self.make_activation()]
for i in range(self.n_hidden_layers - 1):
self.layers += [self.make_linear(self.n_neurons, self.n_neurons, is_first=False, is_last=False), self.make_activation()]
self.layers += [self.make_linear(self.n_neurons, dim_out, is_first=False, is_last=True)]
self.layers = nn.Sequential(*self.layers)
@torch.cuda.amp.autocast(False)
def forward(self, x):
x = self.layers(x.float())
return x
def make_linear(self, dim_in, dim_out, is_first, is_last):
layer = nn.Linear(dim_in, dim_out, bias=True) # network without bias will degrade quality
if self.sphere_init:
if is_last:
torch.nn.init.constant_(layer.bias, -self.sphere_init_radius)
torch.nn.init.normal_(layer.weight, mean=math.sqrt(math.pi) / math.sqrt(dim_in), std=0.0001)
elif is_first:
torch.nn.init.constant_(layer.bias, 0.0)
torch.nn.init.constant_(layer.weight[:, 3:], 0.0)
torch.nn.init.normal_(layer.weight[:, :3], 0.0, math.sqrt(2) / math.sqrt(dim_out))
else:
torch.nn.init.constant_(layer.bias, 0.0)
torch.nn.init.normal_(layer.weight, 0.0, math.sqrt(2) / math.sqrt(dim_out))
else:
torch.nn.init.constant_(layer.bias, 0.0)
torch.nn.init.kaiming_uniform_(layer.weight, nonlinearity='relu')
if self.weight_norm:
layer = nn.utils.weight_norm(layer)
return layer
def make_activation(self):
if self.sphere_init:
return nn.Softplus(beta=100)
else:
return nn.ReLU(inplace=True)
class SDFHashGridNetwork(nn.Module):
def __init__(self, bound=0.5, feats_dim=13):
super().__init__()
self.bound = bound
# max_resolution = 32
# base_resolution = 16
# n_levels = 4
# log2_hashmap_size = 16
# n_features_per_level = 8
max_resolution = 2048
base_resolution = 16
n_levels = 16
log2_hashmap_size = 19
n_features_per_level = 2
# max_res = base_res * t^(k-1)
per_level_scale = (max_resolution / base_resolution) ** (1 / (n_levels - 1))
self.encoder = tcnn.Encoding(
n_input_dims=3,
encoding_config={
"otype": "HashGrid",
"n_levels": n_levels,
"n_features_per_level": n_features_per_level,
"log2_hashmap_size": log2_hashmap_size,
"base_resolution": base_resolution,
"per_level_scale": per_level_scale,
},
)
self.sdf_mlp = VanillaMLP(n_levels*n_features_per_level+3,feats_dim,64,1)
def forward(self, x):
shape = x.shape[:-1]
x = x.reshape(-1, 3)
x_ = (x + self.bound) / (2 * self.bound)
feats = self.encoder(x_)
feats = torch.cat([x, feats], 1)
feats = self.sdf_mlp(feats)
feats = feats.reshape(*shape,-1)
return feats
def sdf(self, x):
return self(x)[...,:1]
def gradient(self, x):
x.requires_grad_(True)
with torch.enable_grad():
y = self.sdf(x)
d_output = torch.ones_like(y, requires_grad=False, device=y.device)
gradients = torch.autograd.grad(
outputs=y,
inputs=x,
grad_outputs=d_output,
create_graph=True,
retain_graph=True,
only_inputs=True)[0]
return gradients
def sdf_normal(self, x):
x.requires_grad_(True)
with torch.enable_grad():
y = self.sdf(x)
d_output = torch.ones_like(y, requires_grad=False, device=y.device)
gradients = torch.autograd.grad(
outputs=y,
inputs=x,
grad_outputs=d_output,
create_graph=True,
retain_graph=True,
only_inputs=True)[0]
return y[..., :1].detach(), gradients.detach()
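The `per_level_scale` used in `SDFHashGridNetwork` solves `max_res = base_res * s^(n_levels - 1)` for the per-level growth factor `s`, giving a geometric progression of hash-grid resolutions from `base_resolution` up to `max_resolution`. Checking that arithmetic in plain Python:

```python
base_resolution, max_resolution, n_levels = 16, 2048, 16
per_level_scale = (max_resolution / base_resolution) ** (1 / (n_levels - 1))

# resolution of each hash-grid level
resolutions = [base_resolution * per_level_scale ** i for i in range(n_levels)]
# first level sits at base_resolution, last at max_resolution
```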
class RenderingFFNetwork(nn.Module):
def __init__(self, in_feats_dim=12):
super().__init__()
self.dir_encoder = tcnn.Encoding(
n_input_dims=3,
encoding_config={
"otype": "SphericalHarmonics",
"degree": 4,
},
)
self.color_mlp = tcnn.Network(
n_input_dims = in_feats_dim + 3 + self.dir_encoder.n_output_dims,
n_output_dims = 3,
network_config={
"otype": "FullyFusedMLP",
"activation": "ReLU",
"output_activation": "none",
"n_neurons": 64,
"n_hidden_layers": 2,
},
)
def forward(self, points, normals, view_dirs, feature_vectors):
normals = F.normalize(normals, dim=-1)
view_dirs = F.normalize(view_dirs, dim=-1)
reflective = torch.sum(view_dirs * normals, -1, keepdim=True) * normals * 2 - view_dirs
x = torch.cat([feature_vectors, normals, self.dir_encoder(reflective)], -1)
colors = self.color_mlp(x).float()
colors = torch.sigmoid(colors)  # F.sigmoid is deprecated in recent PyTorch
return colors
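Both rendering networks condition color on the reflection of the (normalized) view direction about the surface normal, `r = 2(d·n)n - d`, which is what the `reflective` line computes. The same formula in plain Python (the function name is illustrative):

```python
def reflect(view_dir, normal):
    """r = 2 (d . n) n - d, assuming both inputs are unit vectors."""
    d = sum(v * n for v, n in zip(view_dir, normal))
    return [2.0 * d * n - v for v, n in zip(view_dir, normal)]

# head-on view of a z-up surface: the reflection equals the view direction
r = reflect([0.0, 0.0, 1.0], [0.0, 0.0, 1.0])
# grazing view along +x: the reflection flips to -x
r2 = reflect([1.0, 0.0, 0.0], [0.0, 0.0, 1.0])
```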
# This implementation is borrowed from IDR: https://github.com/lioryariv/idr
class RenderingNetwork(nn.Module):
def __init__(self, d_feature, d_in, d_out, d_hidden,
n_layers, weight_norm=True, multires_view=0, squeeze_out=True, use_view_dir=True):
super().__init__()
self.squeeze_out = squeeze_out
self.rgb_act = torch.sigmoid  # F.sigmoid is deprecated in recent PyTorch
self.use_view_dir=use_view_dir
dims = [d_in + d_feature] + [d_hidden for _ in range(n_layers)] + [d_out]
self.embedview_fn = None
if multires_view > 0:
embedview_fn, input_ch = get_embedder(multires_view)
self.embedview_fn = embedview_fn
dims[0] += (input_ch - 3)
self.num_layers = len(dims)
for l in range(0, self.num_layers - 1):
out_dim = dims[l + 1]
lin = nn.Linear(dims[l], out_dim)
if weight_norm:
lin = nn.utils.weight_norm(lin)
setattr(self, "lin" + str(l), lin)
self.relu = nn.ReLU()
def forward(self, points, normals, view_dirs, feature_vectors):
if self.use_view_dir:
view_dirs = F.normalize(view_dirs, dim=-1)
normals = F.normalize(normals, dim=-1)
reflective = torch.sum(view_dirs*normals, -1, keepdim=True) * normals * 2 - view_dirs
if self.embedview_fn is not None: reflective = self.embedview_fn(reflective)
rendering_input = torch.cat([points, reflective, normals, feature_vectors], dim=-1)
else:
rendering_input = torch.cat([points, normals, feature_vectors], dim=-1)
x = rendering_input
for l in range(0, self.num_layers - 1):
lin = getattr(self, "lin" + str(l))
x = lin(x)
if l < self.num_layers - 2:
x = self.relu(x)
if self.squeeze_out:
x = self.rgb_act(x)
return x
class SingleVarianceNetwork(nn.Module):
def __init__(self, init_val, activation='exp'):
super(SingleVarianceNetwork, self).__init__()
self.act = activation
self.register_parameter('variance', nn.Parameter(torch.tensor(init_val)))
def forward(self, x):
device = x.device
if self.act=='exp':
return torch.ones([*x.shape[:-1], 1], dtype=torch.float32, device=device) * torch.exp(self.variance * 10.0)
else:
raise NotImplementedError
def warp(self, x, inv_s):
device = x.device
return torch.ones([*x.shape[:-1], 1], dtype=torch.float32, device=device) * inv_s
================================================
FILE: renderer/ngp_renderer.py
================================================
import math
import trimesh
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from packaging import version as pver
import tinycudann as tcnn
from torch.autograd import Function
from torch.cuda.amp import custom_bwd, custom_fwd
import raymarching
def custom_meshgrid(*args):
# ref: https://pytorch.org/docs/stable/generated/torch.meshgrid.html?highlight=meshgrid#torch.meshgrid
if pver.parse(torch.__version__) < pver.parse('1.10'):
return torch.meshgrid(*args)
else:
return torch.meshgrid(*args, indexing='ij')
def sample_pdf(bins, weights, n_samples, det=False):
# This implementation is from NeRF
# bins: [B, T], old_z_vals
# weights: [B, T - 1], bin weights.
# return: [B, n_samples], new_z_vals
# Get pdf
weights = weights + 1e-5 # prevent nans
pdf = weights / torch.sum(weights, -1, keepdim=True)
cdf = torch.cumsum(pdf, -1)
cdf = torch.cat([torch.zeros_like(cdf[..., :1]), cdf], -1)
# Take uniform samples
if det:
u = torch.linspace(0. + 0.5 / n_samples, 1. - 0.5 / n_samples, steps=n_samples).to(weights.device)
u = u.expand(list(cdf.shape[:-1]) + [n_samples])
else:
u = torch.rand(list(cdf.shape[:-1]) + [n_samples]).to(weights.device)
# Invert CDF
u = u.contiguous()
inds = torch.searchsorted(cdf, u, right=True)
below = torch.max(torch.zeros_like(inds - 1), inds - 1)
above = torch.min((cdf.shape[-1] - 1) * torch.ones_like(inds), inds)
inds_g = torch.stack([below, above], -1) # (B, n_samples, 2)
matched_shape = [inds_g.shape[0], inds_g.shape[1], cdf.shape[-1]]
cdf_g = torch.gather(cdf.unsqueeze(1).expand(matched_shape), 2, inds_g)
bins_g = torch.gather(bins.unsqueeze(1).expand(matched_shape), 2, inds_g)
denom = (cdf_g[..., 1] - cdf_g[..., 0])
denom = torch.where(denom < 1e-5, torch.ones_like(denom), denom)
t = (u - cdf_g[..., 0]) / denom
samples = bins_g[..., 0] + t * (bins_g[..., 1] - bins_g[..., 0])
return samples
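`sample_pdf` is the standard NeRF inverse-transform sampler: build a CDF over the coarse bins, draw (stratified or random) uniform samples, then invert the CDF by a sorted search plus linear interpolation within the matched bin. A scalar pure-Python sketch of the same idea for the deterministic (`det=True`) case (names are illustrative):

```python
import bisect

def sample_pdf_1d(bins, weights, n_samples):
    """Deterministic inverse-CDF sampling over histogram bins.

    bins: n+1 edges; weights: n per-bin weights (unnormalized).
    Returns n_samples positions distributed proportionally to the weights.
    """
    total = sum(weights) + 1e-5 * len(weights)  # same 1e-5 nan guard as sample_pdf
    pdf = [(w + 1e-5) / total for w in weights]
    cdf = [0.0]
    for p in pdf:
        cdf.append(cdf[-1] + p)
    samples = []
    for i in range(n_samples):
        u = (i + 0.5) / n_samples  # det=True: midpoints of uniform strata
        j = min(bisect.bisect_right(cdf, u), len(cdf) - 1) - 1
        denom = max(cdf[j + 1] - cdf[j], 1e-5)
        t = (u - cdf[j]) / denom
        samples.append(bins[j] + t * (bins[j + 1] - bins[j]))
    return samples

# uniform weights over [0, 2] yield evenly spaced samples
s = sample_pdf_1d([0.0, 1.0, 2.0], [1.0, 1.0], 4)
```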
def plot_pointcloud(pc, color=None):
# pc: [N, 3]
# color: [N, 3/4]
print('[visualize points]', pc.shape, pc.dtype, pc.min(0), pc.max(0))
pc = trimesh.PointCloud(pc, color)
# axis
axes = trimesh.creation.axis(axis_length=4)
# sphere
sphere = trimesh.creation.icosphere(radius=1)
trimesh.Scene([pc, axes, sphere]).show()
class NGPRenderer(nn.Module):
def __init__(self,
bound=1,
cuda_ray=True,
density_scale=1, # scale up deltas (or sigmas), to make the density grid more sharp. larger value than 1 usually improves performance.
min_near=0.2,
density_thresh=0.01,
bg_radius=-1,
):
super().__init__()
self.bound = bound
self.cascade = 1
self.grid_size = 128
self.density_scale = density_scale
self.min_near = min_near
self.density_thresh = density_thresh
self.bg_radius = bg_radius # radius of the background sphere.
# prepare aabb with a 6D tensor (xmin, ymin, zmin, xmax, ymax, zmax)
# NOTE: aabb (can be rectangular) is only used to generate points, we still rely on bound (always cubic) to calculate density grid and hashing.
aabb_train = torch.FloatTensor([-bound, -bound, -bound, bound, bound, bound])
aabb_infer = aabb_train.clone()
self.register_buffer('aabb_train', aabb_train)
self.register_buffer('aabb_infer', aabb_infer)
# extra state for cuda raymarching
self.cuda_ray = cuda_ray
if cuda_ray:
# density grid
density_grid = torch.zeros([self.cascade, self.grid_size ** 3]) # [CAS, H * H * H]
density_bitfield = torch.zeros(self.cascade * self.grid_size ** 3 // 8, dtype=torch.uint8) # [CAS * H * H * H // 8]
self.register_buffer('density_grid', density_grid)
self.register_buffer('density_bitfield', density_bitfield)
self.mean_density = 0
self.iter_density = 0
# step counter
step_counter = torch.zeros(16, 2, dtype=torch.int32) # 16 is hardcoded for averaging...
self.register_buffer('step_counter', step_counter)
self.mean_count = 0
self.local_step = 0
def forward(self, x, d):
raise NotImplementedError()
# separated density and color query (can accelerate non-cuda-ray mode.)
def density(self, x):
raise NotImplementedError()
def color(self, x, d, mask=None, **kwargs):
raise NotImplementedError()
def reset_extra_state(self):
if not self.cuda_ray:
return
# density grid
self.density_grid.zero_()
self.mean_density = 0
self.iter_density = 0
# step counter
self.step_counter.zero_()
self.mean_count = 0
self.local_step = 0
def run(self, rays_o, rays_d, num_steps=128, upsample_steps=128, bg_color=None, perturb=False, **kwargs):
# rays_o, rays_d: [B, N, 3], assumes B == 1
# bg_color: [3] in range [0, 1]
# return: image: [B, N, 3], depth: [B, N]
prefix = rays_o.shape[:-1]
rays_o = rays_o.contiguous().view(-1, 3)
rays_d = rays_d.contiguous().view(-1, 3)
N = rays_o.shape[0] # N = B * N, in fact
device = rays_o.device
# choose aabb
aabb = self.aabb_train if self.training else self.aabb_infer
# sample steps
nears, fars = raymarching.near_far_from_aabb(rays_o, rays_d, aabb, self.min_near)
nears.unsqueeze_(-1)
fars.unsqueeze_(-1)
#print(f'nears = {nears.min().item()} ~ {nears.max().item()}, fars = {fars.min().item()} ~ {fars.max().item()}')
z_vals = torch.linspace(0.0, 1.0, num_steps, device=device).unsqueeze(0) # [1, T]
z_vals = z_vals.expand((N, num_steps)) # [N, T]
z_vals = nears + (fars - nears) * z_vals # [N, T], in [nears, fars]
# perturb z_vals
sample_dist = (fars - nears) / num_steps
if perturb:
z_vals = z_vals + (torch.rand(z_vals.shape, device=device) - 0.5) * sample_dist
#z_vals = z_vals.clamp(nears, fars) # avoid out of bounds xyzs.
# generate xyzs
xyzs = rays_o.unsqueeze(-2) + rays_d.unsqueeze(-2) * z_vals.unsqueeze(-1) # [N, 1, 3] * [N, T, 1] -> [N, T, 3]
xyzs = torch.min(torch.max(xyzs, aabb[:3]), aabb[3:]) # a manual clip.
#plot_pointcloud(xyzs.reshape(-1, 3).detach().cpu().numpy())
# query SDF and RGB
density_outputs = self.density(xyzs.reshape(-1, 3))
#sigmas = density_outputs['sigma'].view(N, num_steps) # [N, T]
for k, v in density_outputs.items():
density_outputs[k] = v.view(N, num_steps, -1)
# upsample z_vals (nerf-like)
if upsample_steps > 0:
with torch.no_grad():
deltas = z_vals[..., 1:] - z_vals[..., :-1] # [N, T-1]
deltas = torch.cat([deltas, sample_dist * torch.ones_like(deltas[..., :1])], dim=-1)
alphas = 1 - torch.exp(-deltas * self.density_scale * density_outputs['sigma'].squeeze(-1)) # [N, T]
alphas_shifted = torch.cat([torch.ones_like(alphas[..., :1]), 1 - alphas + 1e-15], dim=-1) # [N, T+1]
weights = alphas * torch.cumprod(alphas_shifted, dim=-1)[..., :-1] # [N, T]
# sample new z_vals
z_vals_mid = (z_vals[..., :-1] + 0.5 * deltas[..., :-1]) # [N, T-1]
new_z_vals = sample_pdf(z_vals_mid, weights[:, 1:-1], upsample_steps, det=not self.training).detach() # [N, t]
new_xyzs = rays_o.unsqueeze(-2) + rays_d.unsqueeze(-2) * new_z_vals.unsqueeze(-1) # [N, 1, 3] * [N, t, 1] -> [N, t, 3]
new_xyzs = torch.min(torch.max(new_xyzs, aabb[:3]), aabb[3:]) # a manual clip.
# only forward new points to save computation
new_density_outputs = self.density(new_xyzs.reshape(-1, 3))
#new_sigmas = new_density_outputs['sigma'].view(N, upsample_steps) # [N, t]
for k, v in new_density_outputs.items():
new_density_outputs[k] = v.view(N, upsample_steps, -1)
# re-order
z_vals = torch.cat([z_vals, new_z_vals], dim=1) # [N, T+t]
z_vals, z_index = torch.sort(z_vals, dim=1)
xyzs = torch.cat([xyzs, new_xyzs], dim=1) # [N, T+t, 3]
xyzs = torch.gather(xyzs, dim=1, index=z_index.unsqueeze(-1).expand_as(xyzs))
for k in density_outputs:
tmp_output = torch.cat([density_outputs[k], new_density_outputs[k]], dim=1)
density_outputs[k] = torch.gather(tmp_output, dim=1, index=z_index.unsqueeze(-1).expand_as(tmp_output))
deltas = z_vals[..., 1:] - z_vals[..., :-1] # [N, T+t-1]
deltas = torch.cat([deltas, sample_dist * torch.ones_like(deltas[..., :1])], dim=-1)
alphas = 1 - torch.exp(-deltas * self.density_scale * density_outputs['sigma'].squeeze(-1)) # [N, T+t]
alphas_shifted = torch.cat([torch.ones_like(alphas[..., :1]), 1 - alphas + 1e-15], dim=-1) # [N, T+t+1]
weights = alphas * torch.cumprod(alphas_shifted, dim=-1)[..., :-1] # [N, T+t]
dirs = rays_d.view(-1, 1, 3).expand_as(xyzs)
for k, v in density_outputs.items():
density_outputs[k] = v.view(-1, v.shape[-1])
mask = weights > 1e-4 # hard coded
rgbs = self.color(xyzs.reshape(-1, 3), dirs.reshape(-1, 3), mask=mask.reshape(-1), **density_outputs)
rgbs = rgbs.view(N, -1, 3) # [N, T+t, 3]
#print(xyzs.shape, 'valid_rgb:', mask.sum().item())
# calculate weight_sum (mask)
weights_sum = weights.sum(dim=-1) # [N]
# calculate depth
ori_z_vals = ((z_vals - nears) / (fars - nears)).clamp(0, 1)
depth = torch.sum(weights * ori_z_vals, dim=-1)
# calculate color
image = torch.sum(weights.unsqueeze(-1) * rgbs, dim=-2) # [N, 3], in [0, 1]
# mix background color
if self.bg_radius > 0:
# use the bg model to calculate bg_color
sph = raymarching.sph_from_ray(rays_o, rays_d, self.bg_radius) # [N, 2] in [-1, 1]
bg_color = self.background(sph, rays_d.reshape(-1, 3)) # [N, 3]
elif bg_color is None:
bg_color = 1
image = image + (1 - weights_sum).unsqueeze(-1) * bg_color
image = image.view(*prefix, 3)
depth = depth.view(*prefix)
# tmp: reg loss in mip-nerf 360
# z_vals_shifted = torch.cat([z_vals[..., 1:], sample_dist * torch.ones_like(z_vals[..., :1])], dim=-1)
# mid_zs = (z_vals + z_vals_shifted) / 2 # [N, T]
# loss_dist = (torch.abs(mid_zs.unsqueeze(1) - mid_zs.unsqueeze(2)) * (weights.unsqueeze(1) * weights.unsqueeze(2))).sum() + 1/3 * ((z_vals_shifted - z_vals) * (weights ** 2)).sum()
return {
'depth': depth,
'image': image,
'weights_sum': weights_sum,
}
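The rendering weights computed in `run` follow the usual volume-rendering quadrature: per-sample opacity `alpha_i = 1 - exp(-sigma_i * delta_i)` and transmittance-weighted contribution `w_i = alpha_i * prod_{j<i}(1 - alpha_j)`, which is what the `alphas_shifted` / `cumprod` lines implement. A scalar pure-Python sketch (the function name is illustrative):

```python
import math

def composite_weights(sigmas, deltas):
    """w_i = alpha_i * T_i, with transmittance T_i = prod_{j<i} (1 - alpha_j)."""
    weights, transmittance = [], 1.0
    for sigma, delta in zip(sigmas, deltas):
        alpha = 1.0 - math.exp(-sigma * delta)
        weights.append(alpha * transmittance)
        transmittance *= (1.0 - alpha)
    return weights

# a very dense first sample absorbs nearly all of the ray's weight
w = composite_weights([100.0, 1.0, 1.0], [0.1, 0.1, 0.1])
```

Note that the weights sum to `1 - exp(-sum(sigma_i * delta_i))`, i.e. one minus the final transmittance, which is exactly the `weights_sum` used to blend in the background color.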
def run_cuda(self, rays_o, rays_d, dt_gamma=0, bg_color=None, perturb=False, force_all_rays=False, max_steps=1024, T_thresh=1e-4, **kwargs):
# rays_o, rays_d: [B, N, 3], assumes B == 1
# return: image: [B, N, 3], depth: [B, N]
prefix = rays_o.shape[:-1]
rays_o = rays_o.contiguous().view(-1, 3)
rays_d = rays_d.contiguous().view(-1, 3)
N = rays_o.shape[0] # N = B * N, in fact
device = rays_o.device
# pre-calculate near far
nears, fars = raymarching.near_far_from_aabb(rays_o, rays_d, self.aabb_train if self.training else self.aabb_infer, self.min_near)
# mix background color
if self.bg_radius > 0:
# use the bg model to calculate bg_color
sph = raymarching.sph_from_ray(rays_o, rays_d, self.bg_radius) # [N, 2] in [-1, 1]
bg_color = self.background(sph, rays_d) # [N, 3]
elif bg_color is None:
bg_color = 1
results = {}
if self.training:
# setup counter
counter = self.step_counter[self.local_step % 16]
counter.zero_() # set to 0
self.local_step += 1
xyzs, dirs, deltas, rays = raymarching.march_rays_train(rays_o, rays_d, self.bound, self.density_bitfield, self.cascade, self.grid_size, nears, fars, counter, self.mean_count, perturb, 128, force_all_rays, dt_gamma, max_steps)
#plot_pointcloud(xyzs.reshape(-1, 3).detach().cpu().numpy())
sigmas, rgbs = self(xyzs, dirs)
sigmas = self.density_scale * sigmas
weights_sum, depth, image = raymarching.composite_rays_train(sigmas, rgbs, deltas, rays, T_thresh)
image = image + (1 - weights_sum).unsqueeze(-1) * bg_color
depth = torch.clamp(depth - nears, min=0) / (fars - nears)
image = image.view(*prefix, 3)
depth = depth.view(*prefix)
else:
# allocate outputs
# if use autocast, must init as half so it won't be autocasted and lose reference.
#dtype = torch.half if torch.is_autocast_enabled() else torch.float32
# output should always be float32! only network inference uses half.
dtype = torch.float32
weights_sum = torch.zeros(N, dtype=dtype, device=device)
depth = torch.zeros(N, dtype=dtype, device=device)
image = torch.zeros(N, 3, dtype=dtype, device=device)
n_alive = N
rays_alive = torch.arange(n_alive, dtype=torch.int32, device=device) # [N]
rays_t = nears.clone() # [N]
step = 0
while step < max_steps:
# count alive rays
n_alive = rays_alive.shape[0]
# exit loop
if n_alive <= 0:
break
# decide compact_steps
n_step = max(min(N // n_alive, 8), 1)
xyzs, dirs, deltas = raymarching.march_rays(n_alive, n_step, rays_alive, rays_t, rays_o, rays_d, self.bound, self.density_bitfield, self.cascade, self.grid_size, nears, fars, 128, perturb if step == 0 else False, dt_gamma, max_steps)
sigmas, rgbs = self(xyzs, dirs)
# density_outputs = self.density(xyzs) # [M,], use a dict since it may include extra things, like geo_feat for rgb.
# sigmas = density_outputs['sigma']
# rgbs = self.color(xyzs, dirs, **density_outputs)
sigmas = self.density_scale * sigmas
raymarching.composite_rays(n_alive, n_step, rays_alive, rays_t, sigmas, rgbs, deltas, weights_sum, depth, image, T_thresh)
rays_alive = rays_alive[rays_alive >= 0]
#print(f'step = {step}, n_step = {n_step}, n_alive = {n_alive}, xyzs: {xyzs.shape}')
step += n_step
image = image + (1 - weights_sum).unsqueeze(-1) * bg_color
depth = torch.clamp(depth - nears, min=0) / (fars - nears)
image = image.view(*prefix, 3)
depth = depth.view(*prefix)
results['weights_sum'] = weights_sum
results['depth'] = depth
results['image'] = image
return results
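`run_cuda`'s training branch delegates per-ray accumulation to the fused `composite_rays_train` kernel. The math it implements is standard front-to-back alpha compositing; a minimal NumPy sketch of the same accumulation for a single ray (illustrative names, not the repo's API):

```python
import numpy as np

def composite_ray(sigmas, rgbs, deltas, bg_color=1.0):
    """Front-to-back alpha compositing for one ray (NumPy sketch).

    sigmas: [S] densities, rgbs: [S, 3] colors, deltas: [S] step sizes.
    Returns (weights_sum, rgb) with a constant background blended in,
    mirroring `image + (1 - weights_sum) * bg_color` in run_cuda.
    """
    alphas = 1.0 - np.exp(-sigmas * deltas)  # per-sample opacity
    # transmittance before each sample: product of (1 - alpha) of earlier samples
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas]))[:-1]
    weights = alphas * trans
    rgb = (weights[:, None] * rgbs).sum(axis=0)
    weights_sum = weights.sum()
    return weights_sum, rgb + (1.0 - weights_sum) * bg_color
```

A fully opaque sample returns its own color with `weights_sum` near 1; an empty ray returns the background.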
@torch.no_grad()
def mark_untrained_grid(self, poses, intrinsic, S=64):
# poses: [B, 4, 4]
# intrinsic: [3, 3]
if not self.cuda_ray:
return
if isinstance(poses, np.ndarray):
poses = torch.from_numpy(poses)
B = poses.shape[0]
fx, fy, cx, cy = intrinsic
X = torch.arange(self.grid_size, dtype=torch.int32, device=self.density_bitfield.device).split(S)
Y = torch.arange(self.grid_size, dtype=torch.int32, device=self.density_bitfield.device).split(S)
Z = torch.arange(self.grid_size, dtype=torch.int32, device=self.density_bitfield.device).split(S)
count = torch.zeros_like(self.density_grid)
poses = poses.to(count.device)
# 5-level loop, forgive me...
for xs in X:
for ys in Y:
for zs in Z:
# construct points
xx, yy, zz = custom_meshgrid(xs, ys, zs)
coords = torch.cat([xx.reshape(-1, 1), yy.reshape(-1, 1), zz.reshape(-1, 1)], dim=-1) # [N, 3], in [0, 128)
indices = raymarching.morton3D(coords).long() # [N]
world_xyzs = (2 * coords.float() / (self.grid_size - 1) - 1).unsqueeze(0) # [1, N, 3] in [-1, 1]
# cascading
for cas in range(self.cascade):
bound = min(2 ** cas, self.bound)
half_grid_size = bound / self.grid_size
# scale to current cascade's resolution
cas_world_xyzs = world_xyzs * (bound - half_grid_size)
# split batch to avoid OOM
head = 0
while head < B:
tail = min(head + S, B)
# world2cam transform (poses is c2w, so we need to transpose it. Another transpose is needed for batched matmul, so the final form is without transpose.)
cam_xyzs = cas_world_xyzs - poses[head:tail, :3, 3].unsqueeze(1)
cam_xyzs = cam_xyzs @ poses[head:tail, :3, :3] # [S, N, 3]
# query if point is covered by any camera
mask_z = cam_xyzs[:, :, 2] > 0 # [S, N]
mask_x = torch.abs(cam_xyzs[:, :, 0]) < cx / fx * cam_xyzs[:, :, 2] + half_grid_size * 2
mask_y = torch.abs(cam_xyzs[:, :, 1]) < cy / fy * cam_xyzs[:, :, 2] + half_grid_size * 2
mask = (mask_z & mask_x & mask_y).sum(0).reshape(-1) # [N]
# update count
count[cas, indices] += mask
head += S
# mark untrained grid as -1
self.density_grid[count == 0] = -1
print(f'[mark untrained grid] {(count == 0).sum()} from {self.grid_size ** 3 * self.cascade}')
@torch.no_grad()
def update_extra_state(self, decay=0.95, S=128):
# call before each epoch to update extra states.
if not self.cuda_ray:
return
### update density grid
tmp_grid = - torch.ones_like(self.density_grid)
# full update.
if self.iter_density < 16:
#if True:
X = torch.arange(self.grid_size, dtype=torch.int32, device=self.density_bitfield.device).split(S)
Y = torch.arange(self.grid_size, dtype=torch.int32, device=self.density_bitfield.device).split(S)
Z = torch.arange(self.grid_size, dtype=torch.int32, device=self.density_bitfield.device).split(S)
for xs in X:
for ys in Y:
for zs in Z:
# construct points
xx, yy, zz = custom_meshgrid(xs, ys, zs)
coords = torch.cat([xx.reshape(-1, 1), yy.reshape(-1, 1), zz.reshape(-1, 1)], dim=-1) # [N, 3], in [0, 128)
indices = raymarching.morton3D(coords).long() # [N]
xyzs = 2 * coords.float() / (self.grid_size - 1) - 1 # [N, 3] in [-1, 1]
# cascading
for cas in range(self.cascade):
bound = min(2 ** cas, self.bound)
half_grid_size = bound / self.grid_size
# scale to current cascade's resolution
cas_xyzs = xyzs * (bound - half_grid_size)
# add noise in [-hgs, hgs]
cas_xyzs += (torch.rand_like(cas_xyzs) * 2 - 1) * half_grid_size
# query density
sigmas = self.density(cas_xyzs)['sigma'].reshape(-1).detach()
sigmas *= self.density_scale
# assign
tmp_grid[cas, indices] = sigmas
# partial update (half the computation)
# TODO: why is maxpool not needed here?
else:
N = self.grid_size ** 3 // 4 # H * H * H / 4
for cas in range(self.cascade):
# random sample some positions
coords = torch.randint(0, self.grid_size, (N, 3), device=self.density_bitfield.device) # [N, 3], in [0, 128)
indices = raymarching.morton3D(coords).long() # [N]
# random sample occupied positions
occ_indices = torch.nonzero(self.density_grid[cas] > 0).squeeze(-1) # [Nz]
rand_mask = torch.randint(0, occ_indices.shape[0], [N], dtype=torch.long, device=self.density_bitfield.device)
occ_indices = occ_indices[rand_mask] # [Nz] --> [N], allow for duplication
occ_coords = raymarching.morton3D_invert(occ_indices) # [N, 3]
# concat
indices = torch.cat([indices, occ_indices], dim=0)
coords = torch.cat([coords, occ_coords], dim=0)
# same below
xyzs = 2 * coords.float() / (self.grid_size - 1) - 1 # [N, 3] in [-1, 1]
bound = min(2 ** cas, self.bound)
half_grid_size = bound / self.grid_size
# scale to current cascade's resolution
cas_xyzs = xyzs * (bound - half_grid_size)
# add noise in [-hgs, hgs]
cas_xyzs += (torch.rand_like(cas_xyzs) * 2 - 1) * half_grid_size
# query density
sigmas = self.density(cas_xyzs)['sigma'].reshape(-1).detach()
sigmas *= self.density_scale
# assign
tmp_grid[cas, indices] = sigmas
## max-pool on tmp_grid for less aggressive culling [No significant improvement...]
# invalid_mask = tmp_grid < 0
# tmp_grid = F.max_pool3d(tmp_grid.view(self.cascade, 1, self.grid_size, self.grid_size, self.grid_size), kernel_size=3, stride=1, padding=1).view(self.cascade, -1)
# tmp_grid[invalid_mask] = -1
# ema update
valid_mask = (self.density_grid >= 0) & (tmp_grid >= 0)
self.density_grid[valid_mask] = torch.maximum(self.density_grid[valid_mask] * decay, tmp_grid[valid_mask])
self.mean_density = torch.mean(self.density_grid.clamp(min=0)).item() # -1 regions are viewed as 0 density.
#self.mean_density = torch.mean(self.density_grid[self.density_grid > 0]).item() # do not count -1 regions
self.iter_density += 1
# convert to bitfield
density_thresh = min(self.mean_density, self.density_thresh)
self.density_bitfield = raymarching.packbits(self.density_grid, density_thresh, self.density_bitfield)
### update step counter
total_step = min(16, self.local_step)
if total_step > 0:
self.mean_count = int(self.step_counter[:total_step, 0].sum().item() / total_step)
self.local_step = 0
#print(f'[density grid] min={self.density_grid.min().item():.4f}, max={self.density_grid.max().item():.4f}, mean={self.mean_density:.4f}, occ_rate={(self.density_grid > 0.01).sum() / (128**3 * self.cascade):.3f} | [step counter] mean={self.mean_count}')
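The EMA step at the end of `update_extra_state` decays each valid cell toward zero but refreshes it with the newly queried density, while cells marked `-1` (untrained or not re-sampled this round) are left untouched. A NumPy sketch of just that update rule (illustrative helper, not the repo's API):

```python
import numpy as np

def ema_update_density(density_grid, tmp_grid, decay=0.95):
    """EMA-style occupancy-grid update (NumPy sketch of the logic above).

    Cells with value -1 in either grid are skipped; valid cells take the
    max of the decayed old value and the freshly queried density.
    """
    grid = density_grid.copy()
    valid = (density_grid >= 0) & (tmp_grid >= 0)
    grid[valid] = np.maximum(density_grid[valid] * decay, tmp_grid[valid])
    return grid
```

Taking the max (rather than a plain average) keeps occupied cells alive even when a single noisy query returns a low density.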
def render(self, rays_o, rays_d, staged=False, max_ray_batch=4096, **kwargs):
# rays_o, rays_d: [B, N, 3], assumes B == 1
# return: pred_rgb: [B, N, 3]
if self.cuda_ray:
_run = self.run_cuda
else:
_run = self.run
results = _run(rays_o, rays_d, **kwargs)
return results
class _trunc_exp(Function):
@staticmethod
@custom_fwd(cast_inputs=torch.float32) # cast to float32
def forward(ctx, x):
ctx.save_for_backward(x)
return torch.exp(x)
@staticmethod
@custom_bwd
def backward(ctx, g):
x = ctx.saved_tensors[0]
return g * torch.exp(x.clamp(-15, 15))
trunc_exp = _trunc_exp.apply
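`trunc_exp` behaves exactly like `exp` in the forward pass but clamps the saved input to `[-15, 15]` when computing the gradient, so very large pre-activations cannot blow up the backward pass. A NumPy sketch of the two passes (hypothetical helper names, outside autograd):

```python
import numpy as np

def trunc_exp_fwd(x):
    # forward pass: plain exponential
    return np.exp(x)

def trunc_exp_bwd(x, g):
    # backward pass: the gradient uses the clamped input, so it stays
    # bounded by exp(15) no matter how large x gets
    return g * np.exp(np.clip(x, -15, 15))
```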
class NGPNetwork(NGPRenderer):
def __init__(self,
num_layers=2,
hidden_dim=64,
geo_feat_dim=15,
num_layers_color=3,
hidden_dim_color=64,
bound=0.5,
max_resolution=128,
base_resolution=16,
n_levels=16,
**kwargs
):
super().__init__(bound, **kwargs)
# sigma network
self.num_layers = num_layers
self.hidden_dim = hidden_dim
self.geo_feat_dim = geo_feat_dim
self.bound = bound
log2_hashmap_size = 19
n_features_per_level = 2
per_level_scale = np.exp2(np.log2(max_resolution / base_resolution) / (n_levels - 1))
self.encoder = tcnn.Encoding(
n_input_dims=3,
encoding_config={
"otype": "HashGrid",
"n_levels": n_levels,
"n_features_per_level": n_features_per_level,
"log2_hashmap_size": log2_hashmap_size,
"base_resolution": base_resolution,
"per_level_scale": per_level_scale,
},
)
self.sigma_net = tcnn.Network(
n_input_dims = n_levels * 2,
n_output_dims=1 + self.geo_feat_dim,
network_config={
"otype": "FullyFusedMLP",
"activation": "ReLU",
"output_activation": "None",
"n_neurons": hidden_dim,
"n_hidden_layers": num_layers - 1,
},
)
# color network
self.num_layers_color = num_layers_color
self.hidden_dim_color = hidden_dim_color
self.encoder_dir = tcnn.Encoding(
n_input_dims=3,
encoding_config={
"otype": "SphericalHarmonics",
"degree": 4,
},
)
self.in_dim_color = self.encoder_dir.n_output_dims + self.geo_feat_dim
self.color_net = tcnn.Network(
n_input_dims = self.in_dim_color,
n_output_dims=3,
network_config={
"otype": "FullyFusedMLP",
"activation": "ReLU",
"output_activation": "None",
"n_neurons": hidden_dim_color,
"n_hidden_layers": num_layers_color - 1,
},
)
self.density_scale, self.density_std = 10.0, 0.25
def forward(self, x, d):
# x: [N, 3], in [-bound, bound]
# d: [N, 3], normalized in [-1, 1]
# sigma
x_raw = x
x = (x + self.bound) / (2 * self.bound) # to [0, 1]
x = self.encoder(x)
h = self.sigma_net(x)
# sigma = F.relu(h[..., 0])
density = h[..., 0]
# add density bias
dist = torch.norm(x_raw, dim=-1)
density_bias = (1 - dist / self.density_std) * self.density_scale
density = density_bias + density
sigma = F.softplus(density)
geo_feat = h[..., 1:]
# color
d = (d + 1) / 2 # tcnn SH encoding requires inputs to be in [0, 1]
d = self.encoder_dir(d)
# p = torch.zeros_like(geo_feat[..., :1]) # manual input padding
h = torch.cat([d, geo_feat], dim=-1)
h = self.color_net(h)
# sigmoid activation for rgb
color = torch.sigmoid(h)
return sigma, color
def density(self, x):
# x: [N, 3], in [-bound, bound]
x_raw = x
x = (x + self.bound) / (2 * self.bound) # to [0, 1]
x = self.encoder(x)
h = self.sigma_net(x)
# sigma = F.relu(h[..., 0])
density = h[..., 0]
# add density bias
dist = torch.norm(x_raw, dim=-1)
density_bias = (1 - dist / self.density_std) * self.density_scale
density = density_bias + density
sigma = F.softplus(density)
geo_feat = h[..., 1:]
return {
'sigma': sigma,
'geo_feat': geo_feat,
}
# allow masked inference
def color(self, x, d, mask=None, geo_feat=None, **kwargs):
# x: [N, 3] in [-bound, bound]
# mask: [N,], bool, indicates where we actually need to compute rgb.
x = (x + self.bound) / (2 * self.bound) # to [0, 1]
if mask is not None:
rgbs = torch.zeros(mask.shape[0], 3, dtype=x.dtype, device=x.device) # [N, 3]
# in case of empty mask
if not mask.any():
return rgbs
x = x[mask]
d = d[mask]
geo_feat = geo_feat[mask]
# color
d = (d + 1) / 2 # tcnn SH encoding requires inputs to be in [0, 1]
d = self.encoder_dir(d)
h = torch.cat([d, geo_feat], dim=-1)
h = self.color_net(h)
# sigmoid activation for rgb
h = torch.sigmoid(h)
if mask is not None:
rgbs[mask] = h.to(rgbs.dtype) # fp16 --> fp32
else:
rgbs = h
return rgbs
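`NGPNetwork` adds a hand-tuned radial prior before the softplus: `density_bias = (1 - ||x|| / density_std) * density_scale` is positive inside radius `density_std` (0.25) and negative outside, biasing the field toward a centered blob early in training. A NumPy sketch of just the bias term:

```python
import numpy as np

def density_bias(x, density_scale=10.0, density_std=0.25):
    """Radial density prior used in NGPNetwork.forward (NumPy sketch).

    Positive inside radius `density_std`, negative outside, so
    softplus(bias + raw density) starts out as a centered blob.
    """
    dist = np.linalg.norm(x, axis=-1)
    return (1.0 - dist / density_std) * density_scale
```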
================================================
FILE: renderer/renderer.py
================================================
import abc
import os
from pathlib import Path
import cv2
import numpy as np
import pytorch_lightning as pl
import torch
import torch.nn as nn
import torch.nn.functional as F
from omegaconf import OmegaConf
from skimage.io import imread, imsave
from PIL import Image
from torch.optim.lr_scheduler import LambdaLR
from ldm.base_utils import read_pickle, concat_images_list
from renderer.neus_networks import SDFNetwork, RenderingNetwork, SingleVarianceNetwork, SDFHashGridNetwork, RenderingFFNetwork
from renderer.ngp_renderer import NGPNetwork
from ldm.util import instantiate_from_config
DEFAULT_RADIUS = np.sqrt(3)/2
DEFAULT_SIDE_LENGTH = 0.6
def sample_pdf(bins, weights, n_samples, det=True):
device = bins.device
dtype = bins.dtype
# This implementation is from NeRF
# Get pdf
weights = weights + 1e-5 # prevent nans
pdf = weights / torch.sum(weights, -1, keepdim=True)
cdf = torch.cumsum(pdf, -1)
cdf = torch.cat([torch.zeros_like(cdf[..., :1]), cdf], -1)
# Take uniform samples
if det:
u = torch.linspace(0. + 0.5 / n_samples, 1. - 0.5 / n_samples, steps=n_samples, dtype=dtype, device=device)
u = u.expand(list(cdf.shape[:-1]) + [n_samples])
else:
u = torch.rand(list(cdf.shape[:-1]) + [n_samples], dtype=dtype, device=device)
# Invert CDF
u = u.contiguous()
inds = torch.searchsorted(cdf, u, right=True)
below = torch.max(torch.zeros_like(inds - 1), inds - 1)
above = torch.min((cdf.shape[-1] - 1) * torch.ones_like(inds), inds)
inds_g = torch.stack([below, above], -1) # (batch, N_samples, 2)
matched_shape = [inds_g.shape[0], inds_g.shape[1], cdf.shape[-1]]
cdf_g = torch.gather(cdf.unsqueeze(1).expand(matched_shape), 2, inds_g)
bins_g = torch.gather(bins.unsqueeze(1).expand(matched_shape), 2, inds_g)
denom = (cdf_g[..., 1] - cdf_g[..., 0])
denom = torch.where(denom < 1e-5, torch.ones_like(denom), denom)
t = (u - cdf_g[..., 0]) / denom
samples = bins_g[..., 0] + t * (bins_g[..., 1] - bins_g[..., 0])
return samples
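`sample_pdf` is NeRF's inverse-transform sampling of the per-ray weight histogram: build the CDF over depth bins, draw (stratified) uniforms, and invert. A compact NumPy sketch of the same idea for a single ray (edge handling simplified; names are illustrative):

```python
import numpy as np

def inverse_cdf_sample(bins, weights, n_samples):
    """Deterministic inverse-CDF sampling (NumPy sketch of `sample_pdf`).

    bins: [B+1] depth bin edges, weights: [B] per-bin weights.
    Returns n_samples depths concentrated where the weights are large.
    """
    weights = weights + 1e-5                       # prevent nans
    pdf = weights / weights.sum()
    cdf = np.concatenate([[0.0], np.cumsum(pdf)])  # [B+1], starts at 0
    # stratified, deterministic uniforms in (0, 1)
    u = np.linspace(0.5 / n_samples, 1.0 - 0.5 / n_samples, n_samples)
    idx = np.searchsorted(cdf, u, side='right')
    below = np.clip(idx - 1, 0, len(bins) - 2)
    above = np.clip(idx, 0, len(bins) - 1)
    denom = cdf[above] - cdf[below]
    denom = np.where(denom < 1e-5, 1.0, denom)
    t = (u - cdf[below]) / denom
    return bins[below] + t * (bins[above] - bins[below])
```

With all the weight in one bin, every sample lands inside that bin.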
def near_far_from_sphere(rays_o, rays_d, radius=DEFAULT_RADIUS):
a = torch.sum(rays_d ** 2, dim=-1, keepdim=True)
b = torch.sum(rays_o * rays_d, dim=-1, keepdim=True)
mid = -b / a
near = mid - radius
far = mid + radius
return near, far
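Note that `near_far_from_sphere` does not intersect the sphere exactly: it projects the origin onto the ray (`mid = -b / a`) and pads by the radius, which is exact only for rays through the center. A NumPy transcription of the same math for a quick check (illustrative name):

```python
import numpy as np

def near_far_from_sphere_np(rays_o, rays_d, radius=np.sqrt(3) / 2):
    # mid = depth of the ray point closest to the origin; pad by the radius
    a = np.sum(rays_d ** 2, axis=-1, keepdims=True)
    b = np.sum(rays_o * rays_d, axis=-1, keepdims=True)
    mid = -b / a
    return mid - radius, mid + radius
```

For a ray from (0, 0, -2) through the center of a unit sphere, this gives near = 1 and far = 3, matching the true intersections.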
class BackgroundRemoval:
def __init__(self, device='cuda'):
from carvekit.api.high import HiInterface
self.interface = HiInterface(
object_type="object", # Can be "object" or "hairs-like".
batch_size_seg=5,
batch_size_matting=1,
device=device,
seg_mask_size=640, # Use 640 for Tracer B7 and 320 for U2Net
matting_mask_size=2048,
trimap_prob_threshold=231,
trimap_dilation=30,
trimap_erosion_iters=5,
fp16=True,
)
@torch.no_grad()
def __call__(self, image):
# image: [H, W, 3] array in [0, 255].
image = Image.fromarray(image)
image = self.interface([image])[0]
image = np.array(image)
return image
class BaseRenderer(nn.Module):
def __init__(self, train_batch_num, test_batch_num):
super().__init__()
self.train_batch_num = train_batch_num
self.test_batch_num = test_batch_num
@abc.abstractmethod
def render_impl(self, ray_batch, is_train, step):
pass
@abc.abstractmethod
def render_with_loss(self, ray_batch, is_train, step):
pass
def render(self, ray_batch, is_train, step):
batch_num = self.train_batch_num if is_train else self.test_batch_num
ray_num = ray_batch['rays_o'].shape[0]
outputs = {}
for ri in range(0, ray_num, batch_num):
cur_ray_batch = {}
for k, v in ray_batch.items():
cur_ray_batch[k] = v[ri:ri + batch_num]
cur_outputs = self.render_impl(cur_ray_batch, is_train, step)
for k, v in cur_outputs.items():
if k not in outputs: outputs[k] = []
outputs[k].append(v)
for k, v in outputs.items():
outputs[k] = torch.cat(v, 0)
return outputs
class NeuSRenderer(BaseRenderer):
def __init__(self, train_batch_num, test_batch_num, lambda_eikonal_loss=0.1, use_mask=True,
lambda_rgb_loss=1.0, lambda_mask_loss=0.0, rgb_loss='soft_l1', coarse_sn=64, fine_sn=64):
super().__init__(train_batch_num, test_batch_num)
self.n_samples = coarse_sn
self.n_importance = fine_sn
self.up_sample_steps = 4
self.anneal_end = 200
self.use_mask = use_mask
self.lambda_eikonal_loss = lambda_eikonal_loss
self.lambda_rgb_loss = lambda_rgb_loss
self.lambda_mask_loss = lambda_mask_loss
self.rgb_loss = rgb_loss
self.sdf_network = SDFNetwork(d_out=257, d_in=3, d_hidden=256, n_layers=8, skip_in=[4], multires=6, bias=0.5, scale=1.0, geometric_init=True, weight_norm=True)
self.color_network = RenderingNetwork(d_feature=256, d_in=9, d_out=3, d_hidden=256, n_layers=4, weight_norm=True, multires_view=4, squeeze_out=True)
self.default_dtype = torch.float32
self.deviation_network = SingleVarianceNetwork(0.3)
@torch.no_grad()
def get_vertex_colors(self, vertices):
"""
@param vertices: n,3
@return:
"""
V = vertices.shape[0]
bn = 20480
verts_colors = []
with torch.no_grad():
for vi in range(0, V, bn):
verts = torch.from_numpy(vertices[vi:vi+bn].astype(np.float32)).cuda()
feats = self.sdf_network(verts)[..., 1:]
gradients = self.sdf_network.gradient(verts) # ...,3
gradients = F.normalize(gradients, dim=-1)
colors = self.color_network(verts, gradients, gradients, feats)
colors = torch.clamp(colors,min=0,max=1).cpu().numpy()
verts_colors.append(colors)
verts_colors = (np.concatenate(verts_colors, 0)*255).astype(np.uint8)
return verts_colors
def upsample(self, rays_o, rays_d, z_vals, sdf, n_importance, inv_s):
"""
Up sampling given a fixed inv_s
"""
device = rays_o.device
batch_size, n_samples = z_vals.shape
pts = rays_o[:, None, :] + rays_d[:, None, :] * z_vals[..., :, None] # n_rays, n_samples, 3
inner_mask = self.get_inner_mask(pts)
# radius = torch.linalg.norm(pts, ord=2, dim=-1, keepdim=False)
inside_sphere = inner_mask[:, :-1] | inner_mask[:, 1:]
sdf = sdf.reshape(batch_size, n_samples)
prev_sdf, next_sdf = sdf[:, :-1], sdf[:, 1:]
prev_z_vals, next_z_vals = z_vals[:, :-1], z_vals[:, 1:]
mid_sdf = (prev_sdf + next_sdf) * 0.5
cos_val = (next_sdf - prev_sdf) / (next_z_vals - prev_z_vals + 1e-5)
prev_cos_val = torch.cat([torch.zeros([batch_size, 1], dtype=self.default_dtype, device=device), cos_val[:, :-1]], dim=-1)
cos_val = torch.stack([prev_cos_val, cos_val], dim=-1)
cos_val, _ = torch.min(cos_val, dim=-1, keepdim=False)
cos_val = cos_val.clip(-1e3, 0.0) * inside_sphere
dist = (next_z_vals - prev_z_vals)
prev_esti_sdf = mid_sdf - cos_val * dist * 0.5
next_esti_sdf = mid_sdf + cos_val * dist * 0.5
prev_cdf = torch.sigmoid(prev_esti_sdf * inv_s)
next_cdf = torch.sigmoid(next_esti_sdf * inv_s)
alpha = (prev_cdf - next_cdf + 1e-5) / (prev_cdf + 1e-5)
weights = alpha * torch.cumprod(
torch.cat([torch.ones([batch_size, 1], dtype=self.default_dtype, device=device), 1. - alpha + 1e-7], -1), -1)[:, :-1]
z_samples = sample_pdf(z_vals, weights, n_importance, det=True).detach()
return z_samples
def cat_z_vals(self, rays_o, rays_d, z_vals, new_z_vals, sdf, last=False):
batch_size, n_samples = z_vals.shape
_, n_importance = new_z_vals.shape
pts = rays_o[:, None, :] + rays_d[:, None, :] * new_z_vals[..., :, None]
z_vals = torch.cat([z_vals, new_z_vals], dim=-1)
z_vals, index = torch.sort(z_vals, dim=-1)
if not last:
device = pts.device
new_sdf = self.sdf_network.sdf(pts.reshape(-1, 3)).reshape(batch_size, n_importance)
sdf = torch.cat([sdf, new_sdf], dim=-1)
xx = torch.arange(batch_size)[:, None].expand(batch_size, n_samples + n_importance).reshape(-1).to(device)
index = index.reshape(-1)
sdf = sdf[(xx, index)].reshape(batch_size, n_samples + n_importance)
return z_vals, sdf
def sample_depth(self, rays_o, rays_d, near, far, perturb):
n_samples = self.n_samples
n_importance = self.n_importance
up_sample_steps = self.up_sample_steps
device = rays_o.device
# sample points
batch_size = len(rays_o)
z_vals = torch.linspace(0.0, 1.0, n_samples, dtype=self.default_dtype, device=device) # sn
z_vals = near + (far - near) * z_vals[None, :] # rn,sn
if perturb > 0:
t_rand = (torch.rand([batch_size, 1]).to(device) - 0.5)
z_vals = z_vals + t_rand * 2.0 / n_samples
# Up sample
with torch.no_grad():
pts = rays_o[:, None, :] + rays_d[:, None, :] * z_vals[..., :, None]
sdf = self.sdf_network.sdf(pts).reshape(batch_size, n_samples)
for i in range(up_sample_steps):
rn, sn = z_vals.shape
inv_s = torch.ones(rn, sn - 1, dtype=self.default_dtype, device=device) * 64 * 2 ** i
new_z_vals = self.upsample(rays_o, rays_d, z_vals, sdf, n_importance // up_sample_steps, inv_s)
z_vals, sdf = self.cat_z_vals(rays_o, rays_d, z_vals, new_z_vals, sdf, last=(i + 1 == up_sample_steps))
return z_vals
def compute_sdf_alpha(self, points, dists, dirs, cos_anneal_ratio, step):
# points [...,3] dists [...] dirs[...,3]
sdf_nn_output = self.sdf_network(points)
sdf = sdf_nn_output[..., 0]
feature_vector = sdf_nn_output[..., 1:]
gradients = self.sdf_network.gradient(points) # ...,3
inv_s = self.deviation_network(points).clip(1e-6, 1e6) # ...,1
inv_s = inv_s[..., 0]
true_cos = (dirs * gradients).sum(-1) # [...]
iter_cos = -(F.relu(-true_cos * 0.5 + 0.5) * (1.0 - cos_anneal_ratio) +
F.relu(-true_cos) * cos_anneal_ratio) # always non-positive
# Estimate signed distances at section points
estimated_next_sdf = sdf + iter_cos * dists * 0.5
estimated_prev_sdf = sdf - iter_cos * dists * 0.5
prev_cdf = torch.sigmoid(estimated_prev_sdf * inv_s)
next_cdf = torch.sigmoid(estimated_next_sdf * inv_s)
p = prev_cdf - next_cdf
c = prev_cdf
alpha = ((p + 1e-5) / (c + 1e-5)).clip(0.0, 1.0) # [...]
return alpha, gradients, feature_vector, inv_s, sdf
def get_anneal_val(self, step):
if self.anneal_end < 0:
return 1.0
else:
return np.min([1.0, step / self.anneal_end])
def get_inner_mask(self, points):
return torch.sum(torch.abs(points)<=DEFAULT_SIDE_LENGTH,-1)==3
def render_impl(self, ray_batch, is_train, step):
near, far = near_far_from_sphere(ray_batch['rays_o'], ray_batch['rays_d'])
rays_o, rays_d = ray_batch['rays_o'], ray_batch['rays_d']
z_vals = self.sample_depth(rays_o, rays_d, near, far, is_train)
batch_size, n_samples = z_vals.shape
# section length in original space
dists = z_vals[..., 1:] - z_vals[..., :-1] # rn,sn-1
dists = torch.cat([dists, dists[..., -1:]], -1) # rn,sn
mid_z_vals = z_vals + dists * 0.5
points = rays_o.unsqueeze(-2) + rays_d.unsqueeze(-2) * mid_z_vals.unsqueeze(-1) # rn, sn, 3
inner_mask = self.get_inner_mask(points)
dirs = rays_d.unsqueeze(-2).expand(batch_size, n_samples, 3)
dirs = F.normalize(dirs, dim=-1)
device = rays_o.device
alpha, sampled_color, gradient_error, normal = torch.zeros(batch_size, n_samples, dtype=self.default_dtype, device=device), \
torch.zeros(batch_size, n_samples, 3, dtype=self.default_dtype, device=device), \
torch.zeros([batch_size, n_samples], dtype=self.default_dtype, device=device), \
torch.zeros([batch_size, n_samples, 3], dtype=self.default_dtype, device=device)
if torch.sum(inner_mask) > 0:
cos_anneal_ratio = self.get_anneal_val(step) if is_train else 1.0
alpha[inner_mask], gradients, feature_vector, inv_s, sdf = self.compute_sdf_alpha(points[inner_mask], dists[inner_mask], dirs[inner_mask], cos_anneal_ratio, step)
sampled_color[inner_mask] = self.color_network(points[inner_mask], gradients, -dirs[inner_mask], feature_vector)
# Eikonal loss
gradient_error[inner_mask] = (torch.linalg.norm(gradients, ord=2, dim=-1) - 1.0) ** 2 # rn,sn
normal[inner_mask] = F.normalize(gradients, dim=-1)
weights = alpha * torch.cumprod(torch.cat([torch.ones([batch_size, 1], dtype=self.default_dtype, device=device), 1. - alpha + 1e-7], -1), -1)[..., :-1] # rn,sn
mask = torch.sum(weights,dim=1).unsqueeze(-1) # rn,1
color = (sampled_color * weights[..., None]).sum(dim=1) + (1 - mask) # add white background
normal = (normal * weights[..., None]).sum(dim=1)
outputs = {
'rgb': color, # rn,3
'gradient_error': gradient_error, # rn,sn
'inner_mask': inner_mask, # rn,sn
'normal': normal, # rn,3
'mask': mask, # rn,1
}
return outputs
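`compute_sdf_alpha` above converts SDF section estimates into opacity via NeuS's logistic-CDF ratio: alpha stays near zero away from the surface and spikes where the estimated SDF crosses zero along the ray. A scalar NumPy sketch of that conversion (illustrative helper, not the repo's API):

```python
import numpy as np

def sdf_to_alpha(sdf, iter_cos, dists, inv_s):
    """NeuS-style opacity from one SDF sample (NumPy sketch of compute_sdf_alpha).

    iter_cos is the (non-positive) cosine between ray and SDF gradient, so
    the estimated SDF decreases along the section; alpha is the normalized
    drop of the logistic CDF over that section.
    """
    estimated_next_sdf = sdf + iter_cos * dists * 0.5
    estimated_prev_sdf = sdf - iter_cos * dists * 0.5
    prev_cdf = 1.0 / (1.0 + np.exp(-estimated_prev_sdf * inv_s))  # sigmoid
    next_cdf = 1.0 / (1.0 + np.exp(-estimated_next_sdf * inv_s))
    return np.clip((prev_cdf - next_cdf + 1e-5) / (prev_cdf + 1e-5), 0.0, 1.0)
```

A section straddling the zero level set yields alpha close to 1; a section far from the surface yields alpha close to 0.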
def render_with_loss(self, ray_batch, is_train, step):
render_outputs = self.render(ray_batch, is_train, step)
rgb_gt = ray_batch['rgb']
rgb_pr = render_outputs['rgb']
if self.rgb_loss == 'soft_l1':
epsilon = 0.001
rgb_loss = torch.sqrt(torch.sum((rgb_gt - rgb_pr) ** 2, dim=-1) + epsilon)
elif self.rgb_loss =='mse':
rgb_loss = F.mse_loss(rgb_pr, rgb_gt, reduction='none')
else:
raise NotImplementedError
rgb_loss = torch.mean(rgb_loss)
eikonal_loss = torch.sum(render_outputs['gradient_error'] * render_outputs['inner_mask']) / torch.sum(render_outputs['inner_mask'] + 1e-5)
loss = rgb_loss * self.lambda_rgb_loss + eikonal_loss * self.lambda_eikonal_loss
loss_batch = {
'eikonal': eikonal_loss,
'rendering': rgb_loss,
# 'mask': mask_loss,
}
if self.lambda_mask_loss>0 and self.use_mask:
mask_loss = F.mse_loss(render_outputs['mask'], ray_batch['mask'], reduction='none').mean()
loss += mask_loss * self.lambda_mask_loss
loss_batch['mask'] = mask_loss
return loss, loss_batch
class NeRFRenderer(BaseRenderer):
def __init__(self, train_batch_num, test_batch_num, bound=0.5, use_mask=False, lambda_rgb_loss=1.0, lambda_mask_loss=0.0):
super().__init__(train_batch_num, test_batch_num)
self.train_batch_num = train_batch_num
self.test_batch_num = test_batch_num
self.use_mask = use_mask
self.field = NGPNetwork(bound=bound)
self.update_interval = 16
self.fp16 = True
self.lambda_rgb_loss = lambda_rgb_loss
self.lambda_mask_loss = lambda_mask_loss
def render_impl(self, ray_batch, is_train, step):
rays_o, rays_d = ray_batch['rays_o'], ray_batch['rays_d']
with torch.cuda.amp.autocast(enabled=self.fp16):
if step % self.update_interval==0:
self.field.update_extra_state()
outputs = self.field.render(rays_o, rays_d,)
renderings={
'rgb': outputs['image'],
'depth': outputs['depth'],
'mask': outputs['weights_sum'].unsqueeze(-1),
}
return renderings
def render_with_loss(self, ray_batch, is_train, step):
render_outputs = self.render(ray_batch, is_train, step)
rgb_gt = ray_batch['rgb']
rgb_pr = render_outputs['rgb']
epsilon = 0.001
rgb_loss = torch.sqrt(torch.sum((rgb_gt - rgb_pr) ** 2, dim=-1) + epsilon)
rgb_loss = torch.mean(rgb_loss)
loss = rgb_loss * self.lambda_rgb_loss
loss_batch = {'rendering': rgb_loss}
if self.use_mask:
mask_loss = F.mse_loss(render_outputs['mask'], ray_batch['mask'], reduction='none')
mask_loss = torch.mean(mask_loss)
loss = loss + mask_loss * self.lambda_mask_loss
loss_batch['mask'] = mask_loss
return loss, loss_batch
class RendererTrainer(pl.LightningModule):
def __init__(self, image_path, total_steps, warm_up_steps, log_dir, train_batch_fg_num=0,
use_cube_feats=False, cube_ckpt=None, cube_cfg=None, cube_bound=0.5,
train_batch_num=4096, test_batch_num=8192, use_warm_up=True, use_mask=True,
lambda_rgb_loss=1.0, lambda_mask_loss=0.0, renderer='neus',
# used in neus
lambda_eikonal_loss=0.1,
coarse_sn=64, fine_sn=64):
super().__init__()
self.num_images = 16
self.image_size = 256
self.log_dir = log_dir
(Path(log_dir)/'images').mkdir(exist_ok=True, parents=True)
self.train_batch_num = train_batch_num
self.train_batch_fg_num = train_batch_fg_num
self.test_batch_num = test_batch_num
self.image_path = image_path
self.total_steps = total_steps
self.warm_up_steps = warm_up_steps
self.use_mask = use_mask
self.lambda_eikonal_loss = lambda_eikonal_loss
self.lambda_rgb_loss = lambda_rgb_loss
self.lambda_mask_loss = lambda_mask_loss
self.use_warm_up = use_warm_up
self.use_cube_feats, self.cube_cfg, self.cube_ckpt = use_cube_feats, cube_cfg, cube_ckpt
self._init_dataset()
if renderer=='neus':
self.renderer = NeuSRenderer(train_batch_num, test_batch_num,
lambda_rgb_loss=lambda_rgb_loss,
lambda_eikonal_loss=lambda_eikonal_loss,
lambda_mask_loss=lambda_mask_loss,
coarse_sn=coarse_sn, fine_sn=fine_sn)
elif renderer=='ngp':
self.renderer = NeRFRenderer(train_batch_num, test_batch_num, bound=cube_bound, use_mask=use_mask, lambda_mask_loss=lambda_mask_loss, lambda_rgb_loss=lambda_rgb_loss,)
else:
raise NotImplementedError
self.validation_index = 0
def _construct_ray_batch(self, images_info):
image_num = images_info['images'].shape[0]
_, h, w, _ = images_info['images'].shape
coords = torch.stack(torch.meshgrid(torch.arange(h), torch.arange(w)), -1)[:, :, (1, 0)] # h,w,2
coords = coords.float()[None, :, :, :].repeat(image_num, 1, 1, 1) # imn,h,w,2
coords = coords.reshape(image_num, h * w, 2)
coords = torch.cat([coords, torch.ones(image_num, h * w, 1, dtype=torch.float32)], 2) # imn,h*w,3
# imn,h*w,3 @ imn,3,3 => imn,h*w,3
rays_d = coords @ torch.inverse(images_info['Ks']).permute(0, 2, 1)
poses = images_info['poses'] # imn,3,4
R, t = poses[:, :, :3], poses[:, :, 3:]
rays_d = rays_d @ R
rays_d = F.normalize(rays_d, dim=-1)
rays_o = -R.permute(0,2,1) @ t # imn,3,3 @ imn,3,1
rays_o = rays_o.permute(0, 2, 1).repeat(1, h*w, 1) # imn,h*w,3
ray_batch = {
'rgb': images_info['images'].reshape(image_num*h*w,3),
'mask': images_info['masks'].reshape(image_num*h*w,1),
'rays_o': rays_o.reshape(image_num*h*w,3).float(),
'rays_d': rays_d.reshape(image_num*h*w,3).float(),
}
return ray_batch
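`_construct_ray_batch` treats `poses` as world-to-camera `[R|t]`: homogeneous pixel coordinates are unprojected with `K^{-1}`, rotated into world space with `R^T` (written as a right-multiplication on row vectors), and the camera center is `-R^T t`. A NumPy sketch of the same unprojection for one camera (hypothetical helper name):

```python
import numpy as np

def pixel_rays(K, pose, h, w):
    """Unproject pixel coordinates to world-space rays (NumPy sketch).

    `pose` is a [3,4] world-to-camera matrix [R|t], as in _construct_ray_batch:
    the camera center is -R^T t and directions are rotated by R^T.
    """
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
    pix = np.stack([xs, ys, np.ones_like(xs)], axis=-1).reshape(-1, 3).astype(np.float64)
    R, t = pose[:, :3], pose[:, 3]
    dirs_cam = pix @ np.linalg.inv(K).T       # camera-space directions
    dirs = dirs_cam @ R                       # row-vector form of R^T @ d
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    origin = -R.T @ t                         # camera center in world space
    return np.broadcast_to(origin, dirs.shape).copy(), dirs
```

With an identity `K` and `R`, pixel (0, 0) maps to the direction (0, 0, 1) from the camera center.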
@staticmethod
def load_model(cfg, ckpt):
config = OmegaConf.load(cfg)
model = instantiate_from_config(config.model)
print(f'loading model from {ckpt} ...')
ckpt = torch.load(ckpt)
model.load_state_dict(ckpt['state_dict'])
model = model.cuda().eval()
return model
def _init_dataset(self):
mask_predictor = BackgroundRemoval()
self.K, self.azs, self.els, self.dists, self.poses = read_pickle(f'meta_info/camera-{self.num_images}.pkl')
self.images_info = {'images': [] ,'masks': [], 'Ks': [], 'poses':[]}
img = imread(self.image_path)
for index in range(self.num_images):
rgb = np.copy(img[:,index*self.image_size:(index+1)*self.image_size,:])
# predict mask
if self.use_mask:
imsave(f'{self.log_dir}/input-{index}.png', rgb)
masked_image = mask_predictor(rgb)
imsave(f'{self.log_dir}/masked-{index}.png', masked_image)
mask = masked_image[:,:,3].astype(np.float32)/255
else:
h, w, _ = rgb.shape
mask = np.zeros([h,w], np.float32)
rgb = rgb.astype(np.float32)/255
K, pose = np.copy(self.K), self.poses[index]
self.images_info['images'].append(torch.from_numpy(rgb.astype(np.float32))) # h,w,3
self.images_info['masks'].append(torch.from_numpy(mask.astype(np.float32))) # h,w
self.images_info['Ks'].append(torch.from_numpy(K.astype(np.float32)))
self.images_info['poses'].append(torch.from_numpy(pose.astype(np.float32)))
for k, v in self.images_info.items(): self.images_info[k] = torch.stack(v, 0) # stack all values
self.train_batch = self._construct_ray_batch(self.images_info)
self.train_batch_pseudo_fg = {}
pseudo_fg_mask = torch.sum(self.train_batch['rgb']>0.99,1)!=3
for k, v in self.train_batch.items():
self.train_batch_pseudo_fg[k] = v[pseudo_fg_mask]
self.train_ray_fg_num = int(torch.sum(pseudo_fg_mask).cpu().numpy())
self.train_ray_num = self.num_images * self.image_size ** 2
self._shuffle_train_batch()
self._shuffle_train_fg_batch()
def _shuffle_train_batch(self):
self.train_batch_i = 0
shuffle_idxs = torch.randperm(self.train_ray_num, device='cpu') # shuffle
for k, v in self.train_batch.items():
self.train_batch[k] = v[shuffle_idxs]
def _shuffle_train_fg_batch(self):
self.train_batch_fg_i = 0
shuffle_idxs = torch.randperm(self.train_ray_fg_num, device='cpu') # shuffle
for k, v in self.train_batch_pseudo_fg.items():
self.train_batch_pseudo_fg[k] = v[shuffle_idxs]
def training_step(self, batch, batch_idx):
train_ray_batch = {k: v[self.train_batch_i:self.train_batch_i + self.train_batch_num].cuda() for k, v in self.train_batch.items()}
self.train_batch_i += self.train_batch_num
if self.train_batch_i + self.train_batch_num >= self.train_ray_num: self._shuffle_train_batch()
if self.train_batch_fg_num>0:
train_ray_batch_fg = {k: v[self.train_batch_fg_i:self.train_batch_fg_i+self.train_batch_fg_num].cuda() for k, v in self.train_batch_pseudo_fg.items()}
self.train_batch_fg_i += self.train_batch_fg_num
if self.train_batch_fg_i + self.train_batch_fg_num >= self.train_ray_fg_num: self._shuffle_train_fg_batch()
for k, v in train_ray_batch_fg.items():
train_ray_batch[k] = torch.cat([train_ray_batch[k], v], 0)
loss, loss_batch = self.renderer.render_with_loss(train_ray_batch, is_train=True, step=self.global_step)
self.log_dict(loss_batch, prog_bar=True, logger=True, on_step=True, on_epoch=False, rank_zero_only=True)
self.log('step', self.global_step, prog_bar=True, on_step=True, on_epoch=False, logger=False, rank_zero_only=True)
lr = self.optimizers().param_groups[0]['lr']
self.log('lr', lr, prog_bar=True, logger=True, on_step=True, on_epoch=False, rank_zero_only=True)
return loss
def _slice_images_info(self, index):
return {k:v[index:index+1] for k, v in self.images_info.items()}
@torch.no_grad()
def validation_step(self, batch, batch_idx):
with torch.no_grad():
if self.global_rank==0:
# we output a rendered image
images_info = self._slice_images_info(self.validation_index)
self.validation_index += 1
self.validation_index %= self.num_images
test_ray_batch = self._construct_ray_batch(images_info)
test_ray_batch = {k: v.cuda() for k,v in test_ray_batch.items()}
test_ray_batch['near'], test_ray_batch['far'] = near_far_from_sphere(test_ray_batch['rays_o'], test_ray_batch['rays_d'])
render_outputs = self.renderer.render(test_ray_batch, False, self.global_step)
process = lambda x: (x.cpu().numpy() * 255).astype(np.uint8)
h, w = self.image_size, self.image_size
rgb = torch.clamp(render_outputs['rgb'].reshape(h, w, 3), max=1.0, min=0.0)
mask = torch.clamp(render_outputs['mask'].reshape(h, w, 1), max=1.0, min=0.0)
mask_ = torch.repeat_interleave(mask, 3, dim=-1)
output_image = concat_images_list(process(rgb), process(mask_))
if 'normal' in render_outputs:
normal = torch.clamp((render_outputs['normal'].reshape(h, w, 3) + 1) / 2, max=1.0, min=0.0)
normal = normal * mask # we only show foreground normals
output_image = concat_images_list(output_image, process(normal))
# save images
imsave(f'{self.log_dir}/images/{self.global_step}.jpg', output_image)
def configure_optimizers(self):
lr = self.learning_rate
opt = torch.optim.AdamW([{"params": self.renderer.parameters(), "lr": lr},], lr=lr)
def schedule_fn(step):
total_step = self.total_steps
warm_up_step = self.warm_up_steps
warm_up_init = 0.02
warm_up_end = 1.0
final_lr = 0.02
interval = 1000
times = total_step // interval
ratio = np.power(final_lr, 1/times)
            if step < warm_up_step:
                learning_rate = warm_up_init + (warm_up_end - warm_up_init) * step / warm_up_step
            else:
                learning_rate = ratio ** (step // interval)
            return learning_rate
        scheduler = torch.optim.lr_scheduler.LambdaLR(opt, lr_lambda=schedule_fn)
        return [opt], [{'scheduler': scheduler, 'interval': 'step'}]
    @rank_zero_only
    def log_to_logger(self, pl_module, images, split):
        for k in images:
            grid = torchvision.utils.make_grid(images[k], nrow=4)
            grid = (grid + 1.0) / 2.0 # -1,1 -> 0,1; c,h,w
tag = f"{split}/{k}"
pl_module.logger.experiment.add_image(tag, grid, global_step=pl_module.global_step)
@rank_zero_only
def log_to_file(self, save_dir, split, images, global_step, current_epoch):
root = os.path.join(save_dir, "images", split)
for k in images:
grid = torchvision.utils.make_grid(images[k], nrow=4)
grid = (grid + 1.0) / 2.0 # -1,1 -> 0,1; c,h,w
grid = grid.transpose(0, 1).transpose(1, 2).squeeze(-1)
grid = grid.numpy()
grid = (grid * 255).astype(np.uint8)
filename = "{:06}-{:06}-{}.jpg".format(global_step, current_epoch, k)
path = os.path.join(root, filename)
os.makedirs(os.path.split(path)[0], exist_ok=True)
Image.fromarray(grid).save(path)
@rank_zero_only
def log_img(self, pl_module, batch, split="train"):
if split == "val": should_log = True
else: should_log = self.check_frequency(pl_module.global_step)
if should_log:
is_train = pl_module.training
if is_train: pl_module.eval()
with torch.no_grad():
images = pl_module.log_images(batch, split=split, **self.log_images_kwargs)
for k in images:
N = min(images[k].shape[0], self.max_images)
images[k] = images[k][:N]
if isinstance(images[k], torch.Tensor):
images[k] = images[k].detach().cpu()
images[k] = torch.clamp(images[k], -1., 1.)
self.log_to_file(pl_module.logger.save_dir, split, images, pl_module.global_step, pl_module.current_epoch)
# self.log_to_logger(pl_module, images, split)
if is_train: pl_module.train()
def check_frequency(self, check_idx):
if (check_idx % self.batch_freq) == 0 and check_idx > 0:
return True
else:
return False
def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch_idx):
self.log_img(pl_module, batch, split="train")
@rank_zero_only
def on_validation_batch_end(self, trainer, pl_module, outputs, batch, batch_idx, dataloader_idx=0):
# print('validation ....')
# print(dataloader_idx)
# print(batch_idx)
if batch_idx==0: self.log_img(pl_module, batch, split="val")
class CUDACallback(Callback):
# see https://github.com/SeanNaren/minGPT/blob/master/mingpt/callback.py
def on_train_epoch_start(self, trainer, pl_module):
# Reset the memory use counter
torch.cuda.reset_peak_memory_stats(trainer.strategy.root_device.index)
torch.cuda.synchronize(trainer.strategy.root_device.index)
self.start_time = time.time()
def on_train_epoch_end(self, trainer, pl_module):
torch.cuda.synchronize(trainer.strategy.root_device.index)
max_memory = torch.cuda.max_memory_allocated(trainer.strategy.root_device.index) / 2 ** 20
epoch_time = time.time() - self.start_time
try:
max_memory = trainer.strategy.reduce(max_memory)
epoch_time = trainer.strategy.reduce(epoch_time)
rank_zero_info(f"Average Epoch time: {epoch_time:.2f} seconds")
rank_zero_info(f"Average Peak memory {max_memory:.2f}MiB")
except AttributeError:
pass
def get_node_name(name, parent_name):
if len(name) <= len(parent_name):
return False, ''
p = name[:len(parent_name)]
if p != parent_name:
return False, ''
return True, name[len(parent_name):]
class ResumeCallBacks(Callback):
def on_train_start(self, trainer, pl_module):
pl_module.optimizers().param_groups = pl_module.optimizers()._optimizer.param_groups
def load_pretrain_stable_diffusion(new_model, finetune_from):
rank_zero_print(f"Attempting to load state from {finetune_from}")
old_state = torch.load(finetune_from, map_location="cpu")
if "state_dict" in old_state: old_state = old_state["state_dict"]
in_filters_load = old_state["model.diffusion_model.input_blocks.0.0.weight"]
new_state = new_model.state_dict()
if "model.diffusion_model.input_blocks.0.0.weight" in new_state:
in_filters_current = new_state["model.diffusion_model.input_blocks.0.0.weight"]
in_shape = in_filters_current.shape
## because the model adopts additional inputs as conditions.
if in_shape != in_filters_load.shape:
input_keys = ["model.diffusion_model.input_blocks.0.0.weight", "model_ema.diffusion_modelinput_blocks00weight",]
for input_key in input_keys:
if input_key not in old_state or input_key not in new_state:
continue
input_weight = new_state[input_key]
if input_weight.size() != old_state[input_key].size():
print(f"Manual init: {input_key}")
input_weight.zero_()
input_weight[:, :4, :, :].copy_(old_state[input_key])
old_state[input_key] = torch.nn.parameter.Parameter(input_weight)
new_model.load_state_dict(old_state, strict=False)
if hasattr(new_model.spatial_volume, 'controlnet'):
controlnet_state = {k.replace('spatial_volume.spatial_volume_feats.', ''):v for (k, v) in old_state.items() if k.startswith('spatial_volume.spatial_volume_feats')}
new_model.spatial_volume.controlnet.load_state_dict(controlnet_state, strict=False)
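The input-conv surgery in `load_pretrain_stable_diffusion` (copying the pretrained 4-channel weight into a wider, zero-initialised conv so the extra condition channels contribute nothing at the start of finetuning) can be sketched shape-wise with NumPy; the channel counts below are illustrative, not read from any checkpoint:

```python
import numpy as np

# Pretrained SD input conv has shape (out_ch, 4, 3, 3); the new model expects
# extra condition channels, so the widened weight is zero-initialised and the
# pretrained slice is copied into channels [:4] (channel counts illustrative).
old_w = np.random.randn(320, 4, 3, 3).astype(np.float32)
new_w = np.zeros((320, 8, 3, 3), dtype=np.float32)
new_w[:, :4] = old_w
```

With this initialisation the widened conv initially computes exactly what the pretrained conv did on the first 4 channels, so finetuning starts from the pretrained behaviour.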
def get_optional_dict(name, config):
if name in config:
cfg = config[name]
else:
cfg = OmegaConf.create()
return cfg
if __name__ == "__main__":
# now = datetime.datetime.now().strftime("%Y-%m-%dT%H-%M-%S")
sys.path.append(os.getcwd())
opt = get_parser().parse_args()
assert opt.base != ''
name = os.path.split(opt.base)[-1]
name = os.path.splitext(name)[0]
logdir = os.path.join(opt.logdir, name)
# logdir: checkpoints+configs
ckptdir = os.path.join(opt.ckptdir, name)
cfgdir = os.path.join(logdir, "configs")
if opt.resume:
ckpt = os.path.join(ckptdir, "last.ckpt")
opt.resume_from_checkpoint = ckpt
opt.finetune_from = "" # disable finetune checkpoint
seed_everything(opt.seed)
###################config#####################
config = OmegaConf.load(opt.base) # load default configs
lightning_config = config.lightning
trainer_config = config.lightning.trainer
for k in trainer_args(opt): # overwrite trainer configs
trainer_config[k] = getattr(opt, k)
###################trainer#####################
# training framework
gpuinfo = trainer_config["gpus"]
rank_zero_print(f"Running on GPUs {gpuinfo}")
ngpu = len(trainer_config.gpus.strip(",").split(','))
trainer_config['devices'] = ngpu
###################model#####################
model = instantiate_from_config(config.model)
model.cpu()
# load stable diffusion parameters
if opt.finetune_from != "":
load_pretrain_stable_diffusion(model, opt.finetune_from)
###################logger#####################
# default logger configs
default_logger_cfg = {"target": "pytorch_lightning.loggers.TensorBoardLogger",
"params": {"save_dir": logdir, "name": "tensorboard_logs", }}
logger_cfg = OmegaConf.create(default_logger_cfg)
logger = instantiate_from_config(logger_cfg)
###################callbacks#####################
# default ckpt callbacks
default_modelckpt_cfg = {"target": "pytorch_lightning.callbacks.ModelCheckpoint",
"params": {"dirpath": ckptdir, "filename": "{epoch:06}", "verbose": True, "save_last": True, "every_n_train_steps": 5000}}
modelckpt_cfg = OmegaConf.merge(default_modelckpt_cfg, get_optional_dict("modelcheckpoint", lightning_config)) # overwrite checkpoint configs
default_modelckpt_cfg_repeat = {"target": "pytorch_lightning.callbacks.ModelCheckpoint",
"params": {"dirpath": ckptdir, "filename": "{step:08}", "verbose": True, "save_last": False, "every_n_train_steps": 5000, "save_top_k": -1}}
modelckpt_cfg_repeat = OmegaConf.merge(default_modelckpt_cfg_repeat)
# add callback which sets up log directory
default_callbacks_cfg = {
"setup_callback": {
"target": "train_diffusion.SetupCallback",
"params": {"resume": opt.resume, "logdir": logdir, "ckptdir": ckptdir, "cfgdir": cfgdir, "config": config}
},
"learning_rate_logger": {
"target": "train_diffusion.LearningRateMonitor",
"params": {"logging_interval": "step"}
},
"cuda_callback": {"target": "train_diffusion.CUDACallback"},
}
callbacks_cfg = OmegaConf.merge(default_callbacks_cfg, get_optional_dict("callbacks", lightning_config))
callbacks_cfg['model_ckpt'] = modelckpt_cfg # add checkpoint
callbacks_cfg['model_ckpt_repeat'] = modelckpt_cfg_repeat # add checkpoint
callbacks = [instantiate_from_config(callbacks_cfg[k]) for k in callbacks_cfg] # construct all callbacks
if opt.resume:
callbacks.append(ResumeCallBacks())
trainer = Trainer.from_argparse_args(args=argparse.Namespace(), **trainer_config,
accelerator='cuda', strategy=DDPStrategy(find_unused_parameters=False), logger=logger, callbacks=callbacks)
trainer.logdir = logdir
###################data#####################
config.data.params.seed = opt.seed
data = instantiate_from_config(config.data)
data.prepare_data()
data.setup('fit')
####################lr#####################
bs, base_lr = config.data.params.batch_size, config.model.base_learning_rate
accumulate_grad_batches = trainer_config.accumulate_grad_batches if hasattr(trainer_config, "accumulate_grad_batches") else 1
rank_zero_print(f"accumulate_grad_batches = {accumulate_grad_batches}")
model.learning_rate = base_lr
rank_zero_print("++++ NOT USING LR SCALING ++++")
rank_zero_print(f"Setting learning rate to {model.learning_rate:.2e}")
model.image_dir = logdir # used to save output images during training
# run
trainer.fit(model, data)
================================================
FILE: train_renderer.py
================================================
import argparse
import imageio
import numpy as np
import torch
import torch.nn.functional as F
from pathlib import Path
import trimesh
from omegaconf import OmegaConf
from pytorch_lightning.callbacks import ModelCheckpoint, LearningRateMonitor, Callback
from pytorch_lightning.loggers import TensorBoardLogger
from pytorch_lightning import Trainer
from skimage.io import imsave
from tqdm import tqdm
import mcubes
from ldm.base_utils import read_pickle, output_points
from renderer.renderer import NeuSRenderer, DEFAULT_SIDE_LENGTH
from ldm.util import instantiate_from_config
class ResumeCallBacks(Callback):
def __init__(self):
pass
def on_train_start(self, trainer, pl_module):
pl_module.optimizers().param_groups = pl_module.optimizers()._optimizer.param_groups
def render_images(model, output,):
# render from model
n = 180
azimuths = (np.arange(n) / n * np.pi * 2).astype(np.float32)
elevations = np.deg2rad(np.asarray([30] * n).astype(np.float32))
K, _, _, _, poses = read_pickle(f'meta_info/camera-16.pkl')
h, w = 256, 256
default_size = 256
K = np.diag([w/default_size,h/default_size,1.0]) @ K
imgs = []
for ni in tqdm(range(n)):
# R = euler2mat(azimuths[ni], elevations[ni], 0, 'szyx')
# R = np.asarray([[0,-1,0],[0,0,-1],[1,0,0]]) @ R
e, a = elevations[ni], azimuths[ni]
row1 = np.asarray([np.sin(e)*np.cos(a),np.sin(e)*np.sin(a),-np.cos(e)])
row0 = np.asarray([-np.sin(a),np.cos(a), 0])
row2 = np.cross(row0, row1)
R = np.stack([row0,row1,row2],0)
t = np.asarray([0,0,1.5])
pose = np.concatenate([R,t[:,None]],1)
pose_ = torch.from_numpy(pose.astype(np.float32)).unsqueeze(0)
K_ = torch.from_numpy(K.astype(np.float32)).unsqueeze(0) # [1,3,3]
coords = torch.stack(torch.meshgrid(torch.arange(h), torch.arange(w)), -1)[:, :, (1, 0)] # h,w,2
coords = coords.float()[None, :, :, :].repeat(1, 1, 1, 1) # imn,h,w,2
coords = coords.reshape(1, h * w, 2)
coords = torch.cat([coords, torch.ones(1, h * w, 1, dtype=torch.float32)], 2) # imn,h*w,3
# imn,h*w,3 @ imn,3,3 => imn,h*w,3
rays_d = coords @ torch.inverse(K_).permute(0, 2, 1)
R, t = pose_[:, :, :3], pose_[:, :, 3:]
rays_d = rays_d @ R
rays_d = F.normalize(rays_d, dim=-1)
rays_o = -R.permute(0, 2, 1) @ t # imn,3,3 @ imn,3,1
rays_o = rays_o.permute(0, 2, 1).repeat(1, h * w, 1) # imn,h*w,3
ray_batch = {
'rays_o': rays_o.reshape(-1,3).cuda(),
'rays_d': rays_d.reshape(-1,3).cuda(),
}
with torch.no_grad():
image = model.renderer.render(ray_batch,False,5000)['rgb'].reshape(h,w,3)
image = (image.cpu().numpy() * 255).astype(np.uint8)
imgs.append(image)
imageio.mimsave(f'{output}/rendering.mp4', imgs, fps=30)
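The ray construction inside `render_images` can be checked in isolation with NumPy: homogeneous pixel coordinates are back-projected through K^{-1} into camera space, rotated into world space, and the camera center is recovered as -R^T t. The intrinsics and pose below are toy values for illustration only:

```python
import numpy as np

h, w = 4, 4
K = np.array([[2.0, 0.0, w / 2], [0.0, 2.0, h / 2], [0.0, 0.0, 1.0]], dtype=np.float32)
R = np.eye(3, dtype=np.float32)                # world-to-camera rotation
t = np.array([0.0, 0.0, 1.5], dtype=np.float32)

ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing='ij')
coords = np.stack([xs, ys, np.ones_like(xs)], -1).reshape(-1, 3).astype(np.float32)
rays_d = coords @ np.linalg.inv(K).T           # back-project pixels to camera-space rays
rays_d = rays_d @ R                            # row-vector form of R^T @ d: camera -> world
rays_d /= np.linalg.norm(rays_d, axis=-1, keepdims=True)
rays_o = -R.T @ t                              # camera center in world coordinates
```

With the identity rotation the camera center lands at (0, 0, -1.5), matching the `rays_o = -R.permute(0, 2, 1) @ t` expression above.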
def extract_fields(bound_min, bound_max, resolution, query_func, batch_size=64, outside_val=1.0):
N = batch_size
X = torch.linspace(bound_min[0], bound_max[0], resolution).split(N)
Y = torch.linspace(bound_min[1], bound_max[1], resolution).split(N)
Z = torch.linspace(bound_min[2], bound_max[2], resolution).split(N)
u = np.zeros([resolution, resolution, resolution], dtype=np.float32)
with torch.no_grad():
for xi, xs in enumerate(X):
for yi, ys in enumerate(Y):
for zi, zs in enumerate(Z):
xx, yy, zz = torch.meshgrid(xs, ys, zs)
pts = torch.cat([xx.reshape(-1, 1), yy.reshape(-1, 1), zz.reshape(-1, 1)], dim=-1).cuda()
val = query_func(pts).detach()
outside_mask = torch.norm(pts,dim=-1)>=1.0
val[outside_mask]=outside_val
val = val.reshape(len(xs), len(ys), len(zs)).cpu().numpy()
u[xi * N: xi * N + len(xs), yi * N: yi * N + len(ys), zi * N: zi * N + len(zs)] = val
return u
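`extract_fields` evaluates the SDF chunk by chunk along each axis to bound GPU memory, and clamps everything outside the unit sphere to `outside_val`. The same chunking can be sketched on CPU with NumPy against an analytic sphere SDF (chunk size and resolution here are illustrative):

```python
import numpy as np

def extract_fields_np(bound_min, bound_max, resolution, query_func, chunk=32, outside_val=1.0):
    # Same idea as extract_fields above: fill the grid chunk by chunk along
    # each axis, overriding points outside the unit sphere with outside_val.
    axes = [np.linspace(bound_min[i], bound_max[i], resolution, dtype=np.float32) for i in range(3)]
    u = np.zeros([resolution] * 3, dtype=np.float32)
    for xi in range(0, resolution, chunk):
        for yi in range(0, resolution, chunk):
            for zi in range(0, resolution, chunk):
                xs, ys, zs = axes[0][xi:xi + chunk], axes[1][yi:yi + chunk], axes[2][zi:zi + chunk]
                xx, yy, zz = np.meshgrid(xs, ys, zs, indexing='ij')  # torch.meshgrid defaults to 'ij'
                pts = np.stack([xx, yy, zz], -1).reshape(-1, 3)
                val = query_func(pts)
                val[np.linalg.norm(pts, axis=-1) >= 1.0] = outside_val
                u[xi:xi + len(xs), yi:yi + len(ys), zi:zi + len(zs)] = val.reshape(len(xs), len(ys), len(zs))
    return u

sphere_sdf = lambda p: np.linalg.norm(p, axis=-1) - 0.5  # analytic SDF, radius 0.5
u = extract_fields_np(np.array([-1.0] * 3), np.array([1.0] * 3), 64, sphere_sdf)
```

The resulting field is negative inside the sphere and `outside_val` outside the unit ball, which is what `mcubes.marching_cubes` then thresholds in `extract_geometry`.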
def extract_geometry(bound_min, bound_max, resolution, threshold, query_func, color_func, outside_val=1.0):
u = extract_fields(bound_min, bound_max, resolution, query_func, outside_val=outside_val)
vertices, triangles = mcubes.marching_cubes(u, threshold)
b_max_np = bound_max.detach().cpu().numpy()
b_min_np = bound_min.detach().cpu().numpy()
vertices = vertices / (resolution - 1.0) * (b_max_np - b_min_np)[None, :] + b_min_np[None, :]
vertex_colors = color_func(vertices)
return vertices, triangles, vertex_colors
def extract_mesh(model, output, resolution=512):
if not isinstance(model.renderer, NeuSRenderer): return
bbox_min = -torch.ones(3)*DEFAULT_SIDE_LENGTH
bbox_max = torch.ones(3)*DEFAULT_SIDE_LENGTH
with torch.no_grad():
vertices, triangles, vertex_colors = extract_geometry(bbox_min, bbox_max, resolution, 0, lambda x: model.renderer.sdf_network.sdf(x), lambda x: model.renderer.get_vertex_colors(x))
# output geometry
mesh = trimesh.Trimesh(vertices, triangles, vertex_colors=vertex_colors)
mesh.export(str(f'{output}/mesh.ply'))
def main():
parser = argparse.ArgumentParser()
parser.add_argument('-i', '--image_path', type=str, required=True)
parser.add_argument('-n', '--name', type=str, required=True)
parser.add_argument('-b', '--base', type=str, default='configs/neus.yaml')
parser.add_argument('-l', '--log', type=str, default='output/renderer')
parser.add_argument('-s', '--seed', type=int, default=6033)
parser.add_argument('-g', '--gpus', type=str, default='0,')
parser.add_argument('-r', '--resume', action='store_true', default=False, dest='resume')
parser.add_argument('--fp16', action='store_true', default=False, dest='fp16')
opt = parser.parse_args()
# seed_everything(opt.seed)
# configs
cfg = OmegaConf.load(opt.base)
name = opt.name
log_dir, ckpt_dir = Path(opt.log) / name, Path(opt.log) / name / 'ckpt'
cfg.model.params['image_path'] = opt.image_path
cfg.model.params['log_dir'] = log_dir
# setup
log_dir.mkdir(exist_ok=True, parents=True)
ckpt_dir.mkdir(exist_ok=True, parents=True)
trainer_config = cfg.trainer
callback_config = cfg.callbacks
model_config = cfg.model
data_config = cfg.data
data_config.params.seed = opt.seed
data = instantiate_from_config(data_config)
data.prepare_data()
data.setup('fit')
model = instantiate_from_config(model_config,)
model.cpu()
model.learning_rate = model_config.base_lr
# logger
logger = TensorBoardLogger(save_dir=log_dir, name='tensorboard_logs')
callbacks=[]
callbacks.append(LearningRateMonitor(logging_interval='step'))
callbacks.append(ModelCheckpoint(dirpath=ckpt_dir, filename="{epoch:06}", verbose=True, save_last=True, every_n_train_steps=callback_config.save_interval))
# trainer
trainer_config.update({
"accelerator": "cuda", "check_val_every_n_epoch": None,
"benchmark": True, "num_sanity_val_steps": 0,
"devices": 1, "gpus": opt.gpus,
})
if opt.fp16:
trainer_config['precision']=16
if opt.resume:
callbacks.append(ResumeCallBacks())
trainer_config['resume_from_checkpoint'] = str(ckpt_dir / 'last.ckpt')
else:
if (ckpt_dir / 'last.ckpt').exists():
raise RuntimeError(f"checkpoint {ckpt_dir / 'last.ckpt'} already exists; use --resume to continue training")
trainer = Trainer.from_argparse_args(args=argparse.Namespace(), **trainer_config, logger=logger, callbacks=callbacks)
trainer.fit(model, data)
model = model.cuda().eval()
render_images(model, log_dir)
extract_mesh(model, log_dir)
if __name__=="__main__":
main()
================================================
FILE: workflow/Coin3D_condition_workflow.json
================================================
{
"last_node_id": 189,
"last_link_id": 685,
"nodes": [
{
"id": 5,
"type": "EmptyLatentImage",
"pos": [
1246.018310546875,
-7.2605719566345215
],
"size": [
315,
106
],
"flags": {},
"order": 0,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
132
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "EmptyLatentImage"
},
"widgets_values": [
512,
512,
9
]
},
{
"id": 17,
"type": "BNK_CLIPTextEncodeAdvanced",
"pos": [
-272.9209899902344,
73.68206024169922
],
"size": [
373.8340148925781,
148.20401000976562
],
"flags": {
"collapsed": false
},
"order": 9,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 16
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"links": [
469,
477
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "BNK_CLIPTextEncodeAdvanced"
},
"widgets_values": [
"ugly, global light",
"none",
"A1111"
]
},
{
"id": 15,
"type": "CLIPSetLastLayer",
"pos": [
-269.6702575683594,
292.7852783203125
],
"size": [
388.526611328125,
109.5339126586914
],
"flags": {},
"order": 5,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 14
}
],
"outputs": [
{
"name": "CLIP",
"type": "CLIP",
"links": [
15,
16
],
"shape": 3
}
],
"properties": {
"Node name for S&R": "CLIPSetLastLayer"
},
"widgets_values": [
-2
]
},
{
"id": 16,
"type": "BNK_CLIPTextEncodeAdvanced",
"pos": [
-282.92095947265625,
-183.3180389404297
],
"size": [
400,
200
],
"flags": {},
"order": 8,
"mode": 0,
"inputs": [
{
"name": "clip",
"type": "CLIP",
"link": 15
}
],
"outputs": [
{
"name": "CONDITIONING",
"type": "CONDITIONING",
"links": [
468,
476
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "BNK_CLIPTextEncodeAdvanced"
},
"widgets_values": [
"a lovely teddy bear",
"none",
"A1111"
]
},
{
"id": 4,
"type": "CheckpointLoaderSimple",
"pos": [
1239.7662353515625,
-172.13328552246094
],
"size": [
315,
98
],
"flags": {},
"order": 1,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "MODEL",
"type": "MODEL",
"links": [
339
],
"slot_index": 0,
"shape": 3
},
{
"name": "CLIP",
"type": "CLIP",
"links": [
14
],
"slot_index": 1,
"shape": 3
},
{
"name": "VAE",
"type": "VAE",
"links": [
387
],
"slot_index": 2,
"shape": 3
}
],
"properties": {
"Node name for S&R": "CheckpointLoaderSimple"
},
"widgets_values": [
"disneyPixarCartoon_v10.safetensors"
]
},
{
"id": 176,
"type": "ControlNetApplyAdvanced",
"pos": [
798.8362426757812,
116.40131378173828
],
"size": [
315,
186
],
"flags": {},
"order": 12,
"mode": 0,
"inputs": [
{
"name": "positive",
"type": "CONDITIONING",
"link": 468
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 469
},
{
"name": "control_net",
"type": "CONTROL_NET",
"link": 470
},
{
"name": "image",
"type": "IMAGE",
"link": 471
},
{
"name": "vae",
"type": "VAE",
"link": null,
"shape": 7
}
],
"outputs": [
{
"name": "positive",
"type": "CONDITIONING",
"links": [],
"slot_index": 0,
"shape": 3
},
{
"name": "negative",
"type": "CONDITIONING",
"links": [],
"slot_index": 1,
"shape": 3
}
],
"properties": {
"Node name for S&R": "ControlNetApplyAdvanced"
},
"widgets_values": [
0.8300000000000001,
0,
0.791
]
},
{
"id": 175,
"type": "ControlNetLoader",
"pos": [
801.2661743164062,
-175.59864807128906
],
"size": [
315,
58
],
"flags": {},
"order": 2,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "CONTROL_NET",
"type": "CONTROL_NET",
"links": [
470
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "ControlNetLoader"
},
"widgets_values": [
"control_v11f1p_sd15_depth.pth"
]
},
{
"id": 172,
"type": "MiDaS-DepthMapPreprocessor",
"pos": [
800.8279418945312,
-58.614768981933594
],
"size": [
315,
106
],
"flags": {},
"order": 6,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 683
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
467,
471
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "MiDaS-DepthMapPreprocessor"
},
"widgets_values": [
6.283185307179586,
0.4,
512
]
},
{
"id": 14,
"type": "ControlNetLoader",
"pos": [
243.1699981689453,
-178.82888793945312
],
"size": [
401.38616943359375,
58
],
"flags": {},
"order": 3,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "CONTROL_NET",
"type": "CONTROL_NET",
"links": [
12
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "ControlNetLoader"
},
"widgets_values": [
"control_v11p_sd15_softedge.pth"
]
},
{
"id": 11,
"type": "PiDiNetPreprocessor",
"pos": [
242.2834930419922,
-59.89323043823242
],
"size": [
397.4411926269531,
82
],
"flags": {},
"order": 7,
"mode": 0,
"inputs": [
{
"name": "image",
"type": "IMAGE",
"link": 684
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
390,
639
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "PiDiNetPreprocessor"
},
"widgets_values": [
"enable",
512
]
},
{
"id": 13,
"type": "ControlNetApplyAdvanced",
"pos": [
246.46913146972656,
82.37593841552734
],
"size": [
402.00360107421875,
186
],
"flags": {},
"order": 13,
"mode": 0,
"inputs": [
{
"name": "positive",
"type": "CONDITIONING",
"link": 476
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 477
},
{
"name": "control_net",
"type": "CONTROL_NET",
"link": 12
},
{
"name": "image",
"type": "IMAGE",
"link": 639
},
{
"name": "vae",
"type": "VAE",
"link": null,
"shape": 7
}
],
"outputs": [
{
"name": "positive",
"type": "CONDITIONING",
"links": [
635
],
"slot_index": 0,
"shape": 3
},
{
"name": "negative",
"type": "CONDITIONING",
"links": [
636
],
"slot_index": 1,
"shape": 3
}
],
"properties": {
"Node name for S&R": "ControlNetApplyAdvanced"
},
"widgets_values": [
0.87,
0,
0.665
]
},
{
"id": 174,
"type": "PreviewImage",
"pos": [
910.1175537109375,
-588.312255859375
],
"size": [
210,
246
],
"flags": {},
"order": 10,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 467
}
],
"outputs": [],
"properties": {
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 165,
"type": "PreviewImage",
"pos": [
407.0728759765625,
-587.5983276367188
],
"size": [
210,
246
],
"flags": {},
"order": 11,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 390
}
],
"outputs": [],
"properties": {
"Node name for S&R": "PreviewImage"
},
"widgets_values": []
},
{
"id": 189,
"type": "LoadImage",
"pos": [
-237.53085327148438,
-655.7085571289062
],
"size": [
315,
314
],
"flags": {},
"order": 4,
"mode": 0,
"inputs": [],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
683,
684
],
"slot_index": 0
},
{
"name": "MASK",
"type": "MASK",
"links": null
}
],
"properties": {
"Node name for S&R": "LoadImage"
},
"widgets_values": [
"condition.png",
"image"
]
},
{
"id": 9,
"type": "SaveImage",
"pos": [
2038.4168701171875,
-220.16629028320312
],
"size": [
423.10552978515625,
411.7587585449219
],
"flags": {},
"order": 16,
"mode": 0,
"inputs": [
{
"name": "images",
"type": "IMAGE",
"link": 7
}
],
"outputs": [],
"properties": {},
"widgets_values": [
"tmp/ComfyUI"
]
},
{
"id": 8,
"type": "VAEDecode",
"pos": [
1268.514404296875,
180.10263061523438
],
"size": [
302.3266906738281,
46
],
"flags": {},
"order": 15,
"mode": 0,
"inputs": [
{
"name": "samples",
"type": "LATENT",
"link": 685
},
{
"name": "vae",
"type": "VAE",
"link": 387
}
],
"outputs": [
{
"name": "IMAGE",
"type": "IMAGE",
"links": [
7
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "VAEDecode"
},
"widgets_values": []
},
{
"id": 3,
"type": "KSampler",
"pos": [
1632.62890625,
-180.86990356445312
],
"size": [
252.04791259765625,
410.6546936035156
],
"flags": {},
"order": 14,
"mode": 0,
"inputs": [
{
"name": "model",
"type": "MODEL",
"link": 339
},
{
"name": "positive",
"type": "CONDITIONING",
"link": 635
},
{
"name": "negative",
"type": "CONDITIONING",
"link": 636
},
{
"name": "latent_image",
"type": "LATENT",
"link": 132
}
],
"outputs": [
{
"name": "LATENT",
"type": "LATENT",
"links": [
685
],
"slot_index": 0,
"shape": 3
}
],
"properties": {
"Node name for S&R": "KSampler"
},
"widgets_values": [
934049932825402,
"randomize",
20,
7,
"dpmpp_2m",
"karras",
1
]
}
],
"links": [
[
7,
8,
0,
9,
0,
"IMAGE"
],
[
12,
14,
0,
13,
2,
"CONTROL_NET"
],
[
14,
4,
1,
15,
0,
"CLIP"
],
[
15,
15,
0,
16,
0,
"CLIP"
],
[
16,
15,
0,
17,
0,
"CLIP"
],
[
132,
5,
0,
3,
3,
"LATENT"
],
[
339,
4,
0,
3,
0,
"MODEL"
],
[
387,
4,
2,
8,
1,
"VAE"
],
[
390,
11,
0,
165,
0,
"IMAGE"
],
[
467,
172,
0,
174,
0,
"IMAGE"
],
[
468,
16,
0,
176,
0,
"CONDITIONING"
],
[
469,
17,
0,
176,
1,
"CONDITIONING"
],
[
470,
175,
0,
176,
2,
"CONTROL_NET"
],
[
471,
172,
0,
176,
3,
"IMAGE"
],
[
476,
16,
0,
13,
0,
"CONDITIONING"
],
[
477,
17,
0,
13,
1,
"CONDITIONING"
],
[
635,
13,
0,
3,
1,
"CONDITIONING"
],
[
636,
13,
1,
3,
2,
"CONDITIONING"
],
[
639,
11,
0,
13,
3,
"IMAGE"
],
[
683,
189,
0,
172,
0,
"IMAGE"
],
[
684,
189,
0,
11,
0,
"IMAGE"
],
[
685,
3,
0,
8,
0,
"LATENT"
]
],
"groups": [
{
"id": 1,
"title": "Controlnet-softedge",
"bounding": [
225.7006072998047,
-259.8983154296875,
448.1227722167969,
549.4774169921875
],
"color": "#A88",
"font_size": 24,
"flags": {}
},
{
"id": 2,
"title": "Prompt",
"bounding": [
-307.9209899902344,
-273.3179016113281,
441,
667
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
},
{
"id": 3,
"title": "Sampler",
"bounding": [
1221.836669921875,
-255.61981201171875,
696.4837646484375,
513.9542236328125
],
"color": "#b06634",
"font_size": 24,
"flags": {}
},
{
"id": 4,
"title": "Controlnet-depth",
"bounding": [
767.096435546875,
-258.400390625,
380.5092468261719,
589.0125122070312
],
"color": "#3f789e",
"font_size": 24,
"flags": {}
}
],
"config": {},
"extra": {
"ds": {
"scale": 0.6209213230591555,
"offset": [
718.8585520989319,
691.0548916137849
]
}
},
"version": 0.4
}
================================================
FILE: workflow/Coin3D_condition_workflow_api.json
================================================
{
"3": {
"inputs": {
"seed": 934049932825402,
"steps": 20,
"cfg": 7,
"sampler_name": "dpmpp_2m",
"scheduler": "karras",
"denoise": 1,
"model": [
"4",
0
],
"positive": [
"13",
0
],
"negative": [
"13",
1
],
"latent_image": [
"5",
0
]
},
"class_type": "KSampler",
"_meta": {
"title": "KSampler"
}
},
"4": {
"inputs": {
"ckpt_name": "disneyPixarCartoon_v10.safetensors"
},
"class_type": "CheckpointLoaderSimple",
"_meta": {
"title": "Load Checkpoint"
}
},
"5": {
"inputs": {
"width": 512,
"height": 512,
"batch_size": 9
},
"class_type": "EmptyLatentImage",
"_meta": {
"title": "Empty Latent Image"
}
},
"8": {
"inputs": {
"samples": [
"3",
0
],
"vae": [
"4",
2
]
},
"class_type": "VAEDecode",
"_meta": {
"title": "VAE Decode"
}
},
"9": {
"inputs": {
"filename_prefix": "tmp/ComfyUI",
"images": [
"8",
0
]
},
"class_type": "SaveImage",
"_meta": {
"title": "Save Image"
}
},
"11": {
"inputs": {
"safe": "enable",
"resolution": 512,
"image": [
"189",
0
]
},
"class_type": "PiDiNetPreprocessor",
"_meta": {
"title": "PiDiNet Soft-Edge Lines"
}
},
"13": {
"inputs": {
"strength": 0.87,
"start_percent": 0,
"end_percent": 0.665,
"positive": [
"16",
0
],
"negative": [
"17",
0
],
"control_net": [
"14",
0
],
"image": [
"11",
0
]
},
"class_type": "ControlNetApplyAdvanced",
"_meta": {
"title": "Apply ControlNet"
}
},
"14": {
"inputs": {
"control_net_name": "control_v11p_sd15_softedge.pth"
},
"class_type": "ControlNetLoader",
"_meta": {
"title": "Load ControlNet Model"
}
},
"15": {
"inputs": {
"stop_at_clip_layer": -2,
"clip": [
"4",
1
]
},
"class_type": "CLIPSetLastLayer",
"_meta": {
"title": "CLIP Set Last Layer"
}
},
"16": {
"inputs": {
"text": "a lovely teddy bear",
"token_normalization": "none",
"weight_interpretation": "A1111",
"clip": [
"15",
0
]
},
"class_type": "BNK_CLIPTextEncodeAdvanced",
"_meta": {
"title": "CLIP Text Encode (Advanced)"
}
},
"17": {
"inputs": {
"text": "ugly, global light",
"token_normalization": "none",
"weight_interpretation": "A1111",
"clip": [
"15",
0
]
},
"class_type": "BNK_CLIPTextEncodeAdvanced",
"_meta": {
"title": "CLIP Text Encode (Advanced)"
}
},
"84": {
"inputs": {
"vae_name": "vae-ft-mse-840000-ema-pruned.safetensors"
},
"class_type": "VAELoader",
"_meta": {
"title": "Load VAE"
}
},
"165": {
"inputs": {
"images": [
"11",
0
]
},
"class_type": "PreviewImage",
"_meta": {
"title": "Preview Image"
}
},
"172": {
"inputs": {
"a": 6.283185307179586,
"bg_threshold": 0.4,
"resolution": 512,
"image": [
"189",
0
]
},
"class_type": "MiDaS-DepthMapPreprocessor",
"_meta": {
"title": "MiDaS Depth Map"
}
},
"174": {
"inputs": {
"images": [
"172",
0
]
},
"class_type": "PreviewImage",
"_meta": {
"title": "Preview Image"
}
},
"175": {
"inputs": {
"control_net_name": "control_v11f1p_sd15_depth.pth"
},
"class_type": "ControlNetLoader",
"_meta": {
"title": "Load ControlNet Model"
}
},
"176": {
"inputs": {
"strength": 0.8300000000000001,
"start_percent": 0,
"end_percent": 0.791,
"positive": [
"16",
0
],
"negative": [
"17",
0
],
"control_net": [
"175",
0
],
"image": [
"172",
0
]
},
"class_type": "ControlNetApplyAdvanced",
"_meta": {
"title": "Apply ControlNet"
}
},
"189": {
"inputs": {
"image": "condition.png",
"upload": "image"
},
"class_type": "LoadImage",
"_meta": {
"title": "Load Image"
}
}
}
================================================
FILE: workflow/inference_comfyui_api.py
================================================
import websocket #NOTE: websocket-client (https://github.com/websocket-client/websocket-client)
import uuid
import json
import urllib.request
import urllib.parse
import numpy as np
from PIL import Image
import io
MAX_SEED=np.iinfo(np.int32).max
# Set to ComfyUI running address and port
server_address = "127.0.0.1:6621"
client_id = str(uuid.uuid4())
def queue_prompt(prompt):
p = {"prompt": prompt, "client_id": client_id}
data = json.dumps(p).encode('utf-8')
req = urllib.request.Request("http://{}/prompt".format(server_address), data=data)
return json.loads(urllib.request.urlopen(req).read())
def get_image(filename, subfolder, folder_type):
data = {"filename": filename, "subfolder": subfolder, "type": folder_type}
url_values = urllib.parse.urlencode(data)
with urllib.request.urlopen("http://{}/view?{}".format(server_address, url_values)) as response:
return response.read()
def get_history(prompt_id):
with urllib.request.urlopen("http://{}/history/{}".format(server_address, prompt_id)) as response:
return json.loads(response.read())
def get_images(ws, prompt):
prompt_id = queue_prompt(prompt)['prompt_id']
output_images = {}
while True:
out = ws.recv()
if isinstance(out, str):
message = json.loads(out)
if message['type'] == 'executing':
data = message['data']
if data['node'] is None and data['prompt_id'] == prompt_id:
break #Execution is done
else:
continue #previews are binary data
history = get_history(prompt_id)[prompt_id]
    for node_id in history['outputs']:
        node_output = history['outputs'][node_id]
        if 'images' in node_output:
            images_output = []
            for image in node_output['images']:
                image_data = get_image(image['filename'], image['subfolder'], image['type'])
                images_output.append(image_data)
            output_images[node_id] = images_output
return output_images
with open("Coin3D_condition_workflow_api.json", 'r') as f:
prompt = json.load(f)
# Load Image Node, set to your condition image path.
prompt["189"]['inputs']['image'] = "path/to/your/condition.png" # /mnt/projects/Coin3D/example/teddybear/condition.png
# Text Encode Node, set to your text prompt (positive prompt).
prompt["16"]['inputs']['text'] = "a lovely teddy bear"
# Negative prompt
# prompt["17"]['inputs']['text'] = "ugly, global light"
# Depth-condition: ControlNetApplyAdvanced Node, set the depth ControlNet strength and active range.
prompt["176"]['inputs']['strength'] = 0.83
prompt["176"]['inputs']['end_percent'] = 0.791
prompt["176"]['inputs']['start_percent'] = 0.0
# Softedge-condition: ControlNetApplyAdvanced Node, set the soft-edge ControlNet strength and active range.
prompt["13"]['inputs']['strength'] = 0.87
prompt["13"]['inputs']['end_percent'] = 0.665
prompt["13"]['inputs']['start_percent'] = 0.0
prompt["3"]['inputs']['seed'] = np.random.randint(0, MAX_SEED)
ws = websocket.WebSocket()
ws.connect("ws://{}/ws?clientId={}".format(server_address, client_id))
images = get_images(ws, prompt)['9'] # node '9' (SaveImage) holds the generated images.
for idx, image_data in enumerate(images):
image = Image.open(io.BytesIO(image_data))
image.save(f"{idx}.png")
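Since the API-format workflow is just a dict of `{node_id: {"inputs": ...}}` entries, any other widget can be overridden the same way before queueing. The node ids `"5"` (EmptyLatentImage) and `"3"` (KSampler) are taken from the workflow JSON above; the override values are illustrative:

```python
import json

# Minimal stand-in for the loaded workflow dict (only the two nodes we touch).
prompt = {
    "5": {"inputs": {"width": 512, "height": 512, "batch_size": 9},
          "class_type": "EmptyLatentImage"},
    "3": {"inputs": {"seed": 0, "steps": 20, "cfg": 7},
          "class_type": "KSampler"},
}
prompt["5"]["inputs"]["batch_size"] = 4   # generate 4 samples instead of 9
prompt["3"]["inputs"]["steps"] = 30       # spend more denoising steps per sample
payload = json.dumps({"prompt": prompt}).encode("utf-8")  # body for POST /prompt
```

`queue_prompt` above builds exactly this kind of payload (plus the `client_id`) before posting it to the ComfyUI server.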