Full Code of zju3dv/Coin3D for AI

Repository: zju3dv/Coin3D
Branch: main
Commit: 3de63a14eb21
Files: 126
Total size: 1.9 MB

Directory structure:
Coin3D/

├── CONDITION.md
├── README.md
├── blender_utils/
│   └── render_proxy.py
├── configs/
│   ├── coin3d_train.yaml
│   ├── ctrldemo.yaml
│   ├── nerf.yaml
│   ├── neus.yaml
│   └── syncdreamer.yaml
├── example/
│   ├── panda/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   ├── pumpkin/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   ├── teddybear/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   ├── toycar/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   └── turtle/
│       ├── mesh.obj
│       └── proxy.txt
├── externs/
│   ├── __init__.py
│   └── pvcnn/
│       └── modules/
│           ├── __init__.py
│           ├── ball_query.py
│           ├── frustum.py
│           ├── functional/
│           │   ├── __init__.py
│           │   ├── backend.py
│           │   ├── ball_query.py
│           │   ├── devoxelization.py
│           │   ├── grouping.py
│           │   ├── interpolatation.py
│           │   ├── loss.py
│           │   ├── sampling.py
│           │   ├── src/
│           │   │   ├── ball_query/
│           │   │   │   ├── ball_query.cpp
│           │   │   │   ├── ball_query.cu
│           │   │   │   ├── ball_query.cuh
│           │   │   │   └── ball_query.hpp
│           │   │   ├── bindings.cpp
│           │   │   ├── cuda_utils.cuh
│           │   │   ├── grouping/
│           │   │   │   ├── grouping.cpp
│           │   │   │   ├── grouping.cu
│           │   │   │   ├── grouping.cuh
│           │   │   │   └── grouping.hpp
│           │   │   ├── interpolate/
│           │   │   │   ├── neighbor_interpolate.cpp
│           │   │   │   ├── neighbor_interpolate.cu
│           │   │   │   ├── neighbor_interpolate.cuh
│           │   │   │   ├── neighbor_interpolate.hpp
│           │   │   │   ├── trilinear_devox.cpp
│           │   │   │   ├── trilinear_devox.cu
│           │   │   │   ├── trilinear_devox.cuh
│           │   │   │   └── trilinear_devox.hpp
│           │   │   ├── sampling/
│           │   │   │   ├── sampling.cpp
│           │   │   │   ├── sampling.cu
│           │   │   │   ├── sampling.cuh
│           │   │   │   └── sampling.hpp
│           │   │   ├── utils.hpp
│           │   │   └── voxelization/
│           │   │       ├── vox.cpp
│           │   │       ├── vox.cu
│           │   │       ├── vox.cuh
│           │   │       └── vox.hpp
│           │   └── voxelization.py
│           ├── loss.py
│           ├── pointnet.py
│           ├── pvconv.py
│           ├── se.py
│           ├── shared_mlp.py
│           └── voxelization.py
├── foreground_segment.py
├── generate.py
├── ldm/
│   ├── DPMPPScheduler.py
│   ├── base_utils.py
│   ├── data/
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── coco.py
│   │   ├── control_sync_dreamer.py
│   │   ├── dummy.py
│   │   ├── imagenet.py
│   │   ├── inpainting/
│   │   │   ├── __init__.py
│   │   │   └── synthetic_mask.py
│   │   ├── laion.py
│   │   ├── lsun.py
│   │   ├── nerf_like.py
│   │   ├── simple.py
│   │   └── sync_dreamer.py
│   ├── lr_scheduler.py
│   ├── models/
│   │   ├── autoencoder.py
│   │   └── diffusion/
│   │       ├── __init__.py
│   │       ├── ctrldemo_sync_dreamer.py
│   │       ├── sync_dreamer.py
│   │       ├── sync_dreamer_attention.py
│   │       ├── sync_dreamer_network.py
│   │       └── sync_dreamer_utils.py
│   ├── modules/
│   │   ├── attention.py
│   │   ├── diffusionmodules/
│   │   │   ├── __init__.py
│   │   │   ├── model.py
│   │   │   ├── openaimodel.py
│   │   │   └── util.py
│   │   ├── distributions/
│   │   │   ├── __init__.py
│   │   │   └── distributions.py
│   │   ├── encoders/
│   │   │   ├── __init__.py
│   │   │   └── modules.py
│   │   └── x_transformer.py
│   ├── thirdp/
│   │   └── psp/
│   │       ├── helpers.py
│   │       ├── id_loss.py
│   │       └── model_irse.py
│   ├── typing.py
│   └── util.py
├── meta_info/
│   └── camera-16.pkl
├── misc.ipynb
├── raymarching/
│   ├── __init__.py
│   ├── backend.py
│   ├── raymarching.py
│   ├── setup.py
│   └── src/
│       ├── bindings.cpp
│       ├── raymarching.cu
│       └── raymarching.h
├── renderer/
│   ├── agg_net.py
│   ├── cost_reg_net.py
│   ├── dummy_dataset.py
│   ├── feature_net.py
│   ├── neus_networks.py
│   ├── ngp_renderer.py
│   └── renderer.py
├── requirements.txt
├── train_diffusion.py
├── train_renderer.py
└── workflow/
    ├── Coin3D_condition_workflow.json
    ├── Coin3D_condition_workflow_api.json
    └── inference_comfyui_api.py

================================================
FILE CONTENTS
================================================

================================================
FILE: CONDITION.md
================================================
### Prepare condition image for inference
1. Render the proxy image

First, download [Blender](https://www.blender.org/) and extract it to any directory. The version we use is [Blender 3.6](https://download.blender.org/release/Blender3.6/blender-3.6.12-linux-x64.tar.xz).
```
path/to/your/blender -b -P blender_utils/render_proxy.py -- --obj_path example/teddybear/mesh.obj
```
The example rendered result can be found in this [image](example/teddybear/condition.png). Optionally, if you construct the coarse proxy in Blender, assigning each primitive a different base color can improve the quality of the softedges that ControlNet extracts.

2. Use ComfyUI and ControlNet to construct the condition image

**We have prepared a [Workflow](workflow/Coin3D_condition_workflow.json) that uses Depth and Softedge conditions. You can drag the workflow file into the ComfyUI interface to load it. Users can also use other ControlNet conditions as needed.**
The usage of ComfyUI can be found [here](https://github.com/comfyanonymous/ComfyUI).

To run the above workflow, you need to download the following pretrained models. Here, we use the [Disney](https://civitai.com/models/65203/disney-pixar-cartoon-type-a)-style base model as an example.
```
mkdir interactive_workflow
cd interactive_workflow
git clone https://github.com/comfyanonymous/ComfyUI.git
cd ComfyUI/custom_nodes
git clone https://github.com/BlenderNeko/ComfyUI_ADV_CLIP_emb.git
git clone https://github.com/Fannovel16/comfyui_controlnet_aux.git

cd ../models/checkpoints/
wget https://huggingface.co/BitStarWalkin/RPG_models/resolve/main/disneyPixarCartoon_v10.safetensors

cd ../controlnet/
wget https://huggingface.co/lllyasviel/ControlNet-v1-1/resolve/main/control_v11f1p_sd15_depth.pth
wget https://huggingface.co/lllyasviel/ControlNet-v1-1/resolve/main/control_v11p_sd15_softedge.pth
```

Make sure ComfyUI has the following custom_nodes and pretrained models:
```bash
ComfyUI
|-- models
    |-- checkpoints
        |--disneyPixarCartoon_v10.safetensors
    |-- controlnet
        |--control_v11f1p_sd15_depth.pth
        |--control_v11p_sd15_softedge.pth
|-- custom_nodes
    |-- ComfyUI_ADV_CLIP_emb
    |-- comfyui_controlnet_aux
```
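The layout above can be verified programmatically. Below is a small sanity-check helper (written for this guide, not part of the repo; `missing_assets` and the path list are ours, mirroring the tree above):

```python
import os

# Required files/directories relative to the ComfyUI root (from the layout above)
REQUIRED = [
    "models/checkpoints/disneyPixarCartoon_v10.safetensors",
    "models/controlnet/control_v11f1p_sd15_depth.pth",
    "models/controlnet/control_v11p_sd15_softedge.pth",
    "custom_nodes/ComfyUI_ADV_CLIP_emb",
    "custom_nodes/comfyui_controlnet_aux",
]

def missing_assets(comfyui_root: str) -> list:
    """Return every required model/custom node not found under the ComfyUI root."""
    return [p for p in REQUIRED
            if not os.path.exists(os.path.join(comfyui_root, p))]

# e.g. print(missing_assets("interactive_workflow/ComfyUI"))
```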

After loading the workflow, it should look like the screenshot below.
<div align=center>
<img src="assets/example_condition_workflow.png" width="100%"/>
</div>

You need to **load the rendered proxy image** into the Load Image Node, **set Text Prompt** in the CLIP Text Encode (Advance) Node, and finally **adjust the appropriate ControlNet parameters** in the Apply ControlNet Node.

Apply ControlNet parameters explanation:
- `strength` controls how strongly ControlNet influences the generated image, typically in the range 0 to 1: at `1`, ControlNet fully controls the generation process; at `0`, it has no effect.
- `start_percent` specifies the point in the generation process (as a fraction) at which ControlNet starts to take effect. For example, `0.3` means ControlNet starts influencing at 30% of the process. **Normally this parameter is set to 0.**
- `end_percent` specifies the point in the generation process (as a fraction) at which ControlNet stops taking effect. For example, `0.7` means ControlNet stops influencing at 70% of the process.
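If you drive the workflow programmatically rather than through the UI, these parameters live in the `inputs` of the Apply ControlNet node in the API-format workflow JSON. A minimal sketch (the helper and the two-node fragment are illustrative; check the node ids in your own `Coin3D_condition_workflow_api.json`):

```python
import json

def set_controlnet_params(workflow: dict, strength: float,
                          start_percent: float = 0.0,
                          end_percent: float = 1.0) -> dict:
    """Overwrite the three parameters on every Apply ControlNet node
    in an API-format workflow dict (loaded from a *_api.json file)."""
    for node in workflow.values():
        if node.get("class_type", "").startswith("ControlNetApply"):
            inputs = node["inputs"]
            inputs["strength"] = strength
            # start/end only exist on the advanced apply node
            if "start_percent" in inputs:
                inputs["start_percent"] = start_percent
            if "end_percent" in inputs:
                inputs["end_percent"] = end_percent
    return workflow

# Hypothetical one-node fragment, for illustration only
wf = {
    "12": {"class_type": "ControlNetApplyAdvanced",
           "inputs": {"strength": 1.0, "start_percent": 0.0, "end_percent": 1.0}},
}
wf = set_controlnet_params(wf, strength=0.8, end_percent=0.7)
```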

**ComfyUI API**

Alternatively, you can obtain the condition image through the ComfyUI API. Here is example code based on the above [workflow](workflow/Coin3D_condition_workflow.json).

First, start ComfyUI (from the ComfyUI directory)
```
python3 main.py --port 6621
```
Then, in another terminal, run
```
python3 workflow/inference_comfyui_api.py
```
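Under the hood, clients like `inference_comfyui_api.py` submit the API-format workflow to ComfyUI's `/prompt` HTTP endpoint. A minimal sketch of that interaction (helper names are ours; the port must match the one passed to `main.py`):

```python
import json
import urllib.request

COMFYUI_URL = "http://127.0.0.1:6621"  # must match the --port passed to main.py

def build_prompt_request(workflow: dict) -> urllib.request.Request:
    """Wrap an API-format workflow in the payload the /prompt endpoint expects."""
    payload = json.dumps({"prompt": workflow}).encode("utf-8")
    return urllib.request.Request(f"{COMFYUI_URL}/prompt", data=payload,
                                  headers={"Content-Type": "application/json"})

def queue_workflow(path: str) -> dict:
    """Load a *_api.json workflow, submit it, and return the server's response."""
    with open(path) as f:
        workflow = json.load(f)
    with urllib.request.urlopen(build_prompt_request(workflow)) as resp:
        return json.loads(resp.read())  # includes the queued prompt_id

# Requires a running ComfyUI server:
# queue_workflow("workflow/Coin3D_condition_workflow_api.json")
```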

### Acknowledgement
We deeply appreciate the authors of the following repositories for generously sharing their code, which we have extensively utilized. Their contributions have been invaluable to our work, and we are grateful for their openness and willingness to share their expertise. Our project has greatly benefited from their efforts and dedication.
- [ComfyUI](https://github.com/comfyanonymous/ComfyUI)

================================================
FILE: README.md
================================================
# Coin3D: Controllable and Interactive 3D Assets Generation with Proxy-Guided Conditioning
### [Project Page](https://zju3dv.github.io/coin3d/) | [Video](https://www.youtube.com/watch?v=d6p3LLbmOnI) | [Paper](https://arxiv.org/abs/2405.08054)
<div align=center>
<img src="media/coin3d_highlight.gif" width="100%"/>
</div>



### ToDo List
- [x] Inference code and pretrained models.
- [ ] Interactive workflow.
- [x] Training data.
- [ ] Blender Addons



### Preparation for inference
1. Install the packages in `requirements.txt`. We tested our model on an A100-80G GPU with CUDA 11.8 and PyTorch 2.0.1.
```bash
conda create -n coin3d
conda activate coin3d
pip install -r requirements.txt
```
2. Download checkpoints 

```
mkdir ckpt
cd ckpt
wget https://huggingface.co/WenqiDong/Coin3D-v1/resolve/main/ViT-L-14.pt

wget https://huggingface.co/WenqiDong/Coin3D-v1/resolve/main/model.ckpt
```

### Inference
1. Make sure you have the following models.
```bash
Coin3D
|-- ckpt
    |-- ViT-L-14.pt
    |-- model.ckpt
```
2. We provide a workflow that uses a custom mesh and text prompt to generate the input image. You can refer to [this instruction](CONDITION.md).
3. (Optional) Make sure the input image has a white background. Following [SyncDreamer](https://github.com/liuyuan-pal/SyncDreamer), we use the tools below for foreground segmentation and store the predicted foreground mask as the alpha channel. We use [Paint3D](https://apps.microsoft.com/store/detail/paint-3d/9NBLGGH5FV99) to segment the foreground object interactively.
We also provide a script `foreground_segment.py` that uses `carvekit` to predict foreground masks; crop the object region first before feeding the image to `foreground_segment.py`. You may need to double-check that the predicted masks are correct.
```bash
python3 foreground_segment.py --input <image-file-to-input> --output <image-file-in-png-format-to-output>
```
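If you prefer to script the mask-to-alpha step yourself, it amounts to compositing the object onto white and writing the mask into the alpha channel. A minimal NumPy/PIL sketch (function names are ours; assumes an 8-bit grayscale mask image):

```python
import numpy as np
from PIL import Image

def apply_mask(rgb: np.ndarray, alpha: np.ndarray) -> np.ndarray:
    """Composite the object onto a white background and store the
    foreground mask (values in [0, 1]) as the alpha channel."""
    a = alpha[..., None].astype(np.float32)                 # (H, W, 1)
    blended = rgb.astype(np.float32) * a + 255.0 * (1.0 - a)
    return np.concatenate([blended, a * 255.0], axis=-1).astype(np.uint8)

def save_rgba(image_path: str, mask_path: str, out_path: str) -> None:
    rgb = np.asarray(Image.open(image_path).convert("RGB"))
    alpha = np.asarray(Image.open(mask_path).convert("L")) / 255.0
    Image.fromarray(apply_mask(rgb, alpha), mode="RGBA").save(out_path)
```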
4. Use the coarse proxy to control the 3D generation of multi-view images.
```bash
python3 generate.py \
        --cfg configs/ctrldemo.yaml \
        --ckpt ckpt/model.ckpt \
        --input example/panda/input.png \
        --input_proxy example/panda/proxy.txt \
        --output output/custom \
        --sample_num 1 \
        --cfg_scale 2.0 \
        --elevation 30 \
        --ctrl_end_step 1.0 \
        --sampler ddim_demo
```
Explanation: 
- `--cfg` is the model configuration.
- `--ckpt` is the checkpoint to load.
- `--input` is the input image in RGBA format; the alpha channel is the foreground object mask.
- `--input_proxy` is the input coarse proxy, which contains 256 points by default. [misc.ipynb](misc.ipynb) contains code for sampling a proxy from a coarse mesh.
- `--output` is the output directory. Results are saved to `output/custom/0.png`; each `png` file contains 16 images from predefined viewpoints.
- `--sample_num` is the number of instances to generate.
- `--cfg_scale` is the *classifier-free guidance* scale; `2.0` works for most cases.
- `--elevation` is the elevation angle of the input image in degrees; it must be set to 30.
- `--ctrl_end_step` is the fraction of the sampling process at which 3D control ends, from `0` to `1.0`; usually set between `0.6` and `1.0`.
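To work with the individual views, the saved png can be split back into its 16 tiles. A minimal sketch, assuming the views are concatenated left-to-right as equal square tiles (verify the layout of your own output first):

```python
import numpy as np

def split_views(grid: np.ndarray, view_num: int = 16) -> list:
    """Split a horizontal strip of `view_num` square tiles into individual views."""
    tile = grid.shape[1] // view_num
    return [grid[:, i * tile:(i + 1) * tile] for i in range(view_num)]

# e.g. views = split_views(np.asarray(Image.open("output/custom/0.png")))
```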

5. Run NeuS or NeRF for 3D reconstruction.
```bash
# train a neus
python3 train_renderer.py -i output/custom/0.png \
                         -n custom-neus \
                         -b configs/neus.yaml \
                         -l output/renderer 
# train a nerf
python3 train_renderer.py -i output/custom/0.png \
                         -n custom-nerf \
                         -b configs/nerf.yaml \
                         -l output/renderer
```
Explanation:
- `-i` is the multi-view image file generated in the previous step. Since the diffusion model does not always produce good results, we may need to select a good generated image set (from `0.png` to `3.png`) for reconstruction.
- `-n` means the name. `-l` means the log dir. Results will be saved to `<log_dir>/<name>` i.e. `output/renderer/custom-neus` and `output/renderer/custom-nerf`.

### Dataset
We train the model on the [Objaverse LVIS](https://objaverse.allenai.org/docs/objaverse-1.0/) dataset. The preprocessed data can be found [here](https://huggingface.co/datasets/WenqiDong/Coin3D_Objaverse_LVIS). We use the multi-view image rendering script from [SyncDreamer](https://github.com/liuyuan-pal/SyncDreamer). The script for extracting proxies from objects can be found in [misc.ipynb](misc.ipynb).

### Training
Please note that you need to set the data directory location in the [config file](configs/coin3d_train.yaml).
```
target_dir: path/to/renderings-v1 # renderings of target views
input_dir: path/to/renderings-random # renderings of input views
proxy_dir: path/to/proxy_256 # proxies of target objects
```

```bash
python3 train_diffusion.py -b configs/coin3d_train.yaml \
                           --finetune_from ckpt/syncdreamer-pretrain.ckpt \
                           -l ./logs/coin3d \
                           -c ./ckpt/coin3d \
                           --gpus 0
```

## Acknowledgement

We deeply appreciate the authors of the following repositories for generously sharing their code, which we have extensively utilized. Their contributions have been invaluable to our work, and we are grateful for their openness and willingness to share their expertise. Our project has greatly benefited from their efforts and dedication.

- [SyncDreamer](https://github.com/liuyuan-pal/SyncDreamer)
- [latent-diffusion](https://github.com/CompVis/latent-diffusion)
- [threestudio](https://github.com/threestudio-project/threestudio)
- [pvcnn](https://github.com/mit-han-lab/pvcnn)
- [stable diffusion](https://github.com/CompVis/stable-diffusion)
- [zero123](https://github.com/cvlab-columbia/zero123)
- [COLMAP](https://colmap.github.io/)
- [NeuS](https://github.com/Totoro97/NeuS)

## Citation
If you find this repository useful in your project, please cite the following work. :)
```
@article{dong2024coin3d,
  title={Coin3D: Controllable and Interactive 3D Assets Generation with Proxy-Guided Conditioning},
  author={Dong, Wenqi and Yang, Bangbang and Ma, Lin and Liu, Xiao and Cui, Liyuan and Bao, Hujun and Ma, Yuewen and Cui, Zhaopeng},
  year={2024},
  eprint={2405.08054},
  archivePrefix={arXiv},
  primaryClass={cs.GR}
}
```


================================================
FILE: blender_utils/render_proxy.py
================================================
import argparse
import math
import os
import sys

import numpy as np
import bpy
from mathutils import Vector

def load_object(object_path: str) -> None:
    """Loads a 3D model into the scene."""
    if object_path.endswith(".glb"):
        bpy.ops.import_scene.gltf(filepath=object_path, merge_vertices=True)
    elif object_path.endswith(".fbx"):
        bpy.ops.import_scene.fbx(filepath=object_path)
    elif object_path.endswith(".obj"):
        bpy.ops.import_scene.obj(filepath=object_path)
    else:
        raise ValueError(f"Unsupported file type: {object_path}")

def az_el_to_points(azimuths, elevations):
    x = np.cos(azimuths)*np.cos(elevations)
    y = np.sin(azimuths)*np.cos(elevations)
    z = np.sin(elevations)
    return np.stack([x, y, z], -1)

def points_to_az_el_dist(location):
    dist = np.linalg.norm(location)
    location /= dist
    x, y, z = location
    ele = np.arcsin(z)
    azi = np.arctan2(y, x)
    return azi, ele, dist

def set_camera(camera, az, el, dist):
    # az, el in degree
    distances = dist

    azimuths = np.deg2rad(az).astype(np.float32)
    elevations = np.deg2rad(el).astype(np.float32)

    cam_pts = az_el_to_points(azimuths, elevations) * distances
    x, y, z = cam_pts
    camera.location = x, y, z


def get_camera(camera_name):
    context = bpy.context
    scene = context.scene
    # Get or create the Empty object
    empty = scene.objects.get("Empty")
    if empty is None:
        empty = bpy.data.objects.new("Empty", None)
        scene.collection.objects.link(empty)

    # Try to fetch an existing camera
    camera = scene.objects.get(camera_name)
    if camera is None:
        # Create a new camera
        camera_data = bpy.data.cameras.new(name=camera_name)
        camera = bpy.data.objects.new(camera_name, camera_data)
        context.collection.objects.link(camera)
        camera.location = (0, 1.2, 0)
    camera.data.lens = 35
    camera.data.sensor_width = 32

    constrs = camera.constraints.get('Track To', None)
    if constrs is None:
        cam_constraint = camera.constraints.new(type='TRACK_TO')
        cam_constraint.track_axis = 'TRACK_NEGATIVE_Z'
        cam_constraint.up_axis = 'UP_Y'
        # Set the constraint target to the empty
        cam_constraint.target = empty
    return camera


def init_global(context):
    scene = context.scene
    render = scene.render

    cam = get_camera("Camera")
    set_camera(camera=cam, az=0, el=30, dist=1.5)
    bpy.context.scene.camera = cam

    render.engine = "CYCLES"
    render.image_settings.file_format = "PNG"
    render.image_settings.color_mode = "RGBA"
    render.resolution_x = 256
    render.resolution_y = 256
    render.resolution_percentage = 100

    scene.cycles.device = "GPU"
    scene.cycles.samples = 128
    scene.cycles.diffuse_bounces = 1
    scene.cycles.glossy_bounces = 1
    scene.cycles.transparent_max_bounces = 3
    scene.cycles.transmission_bounces = 3
    scene.cycles.filter_width = 0.01
    scene.cycles.use_denoising = True
    scene.render.film_transparent = True

    bpy.context.preferences.addons["cycles"].preferences.get_devices()
    # Set the compute device type: "CUDA" or "OPENCL" renders on GPU; "NONE" falls back to CPU
    bpy.context.preferences.addons["cycles"].preferences.compute_device_type = "NONE"
    bpy.context.scene.cycles.tile_size = 8192

    world_tree = bpy.context.scene.world.node_tree
    back_node = world_tree.nodes['Background']
    env_light = 0.5
    back_node.inputs['Color'].default_value = Vector([env_light, env_light, env_light, 1.0])
    back_node.inputs['Strength'].default_value = 1.0

def reset_scene() -> None:
    """Resets the scene to a clean state."""
    # delete everything that isn't part of a camera or a light
    for obj in bpy.data.objects:
        if obj.type not in {"CAMERA", "LIGHT"}:
            bpy.data.objects.remove(obj, do_unlink=True)
    # delete all the materials
    for material in bpy.data.materials:
        bpy.data.materials.remove(material, do_unlink=True)
    # delete all the textures
    for texture in bpy.data.textures:
        bpy.data.textures.remove(texture, do_unlink=True)
    # delete all the images
    for image in bpy.data.images:
        bpy.data.images.remove(image, do_unlink=True)

def scene_root_objects():
    for obj in bpy.context.scene.objects.values():
        if not obj.parent and isinstance(obj.data, (bpy.types.Mesh, bpy.types.Light)):
            yield obj

def scene_meshes():
    for obj in bpy.context.scene.objects.values():
        if isinstance(obj.data, (bpy.types.Mesh)):
            yield obj

def selected_root_objects():
    for obj in bpy.context.selected_objects:
        if not obj.parent and isinstance(obj.data, (bpy.types.Mesh, bpy.types.Light)):
            yield obj

def selected_meshes():
    for obj in bpy.context.selected_objects:
        if isinstance(obj.data, (bpy.types.Mesh)):
            yield obj

def selected_objects_bbox(single_obj=None, ignore_matrix=False):
    bbox_min = (math.inf,) * 3
    bbox_max = (-math.inf,) * 3
    found = False
    for obj in selected_meshes() if single_obj is None else [single_obj]:
        found = True
        for coord in obj.bound_box:
            coord = Vector(coord)
            if not ignore_matrix:
                coord = obj.matrix_world @ coord
            bbox_min = tuple(min(x, y) for x, y in zip(bbox_min, coord))
            bbox_max = tuple(max(x, y) for x, y in zip(bbox_max, coord))
    if not found:
        raise RuntimeError("no objects in scene to compute bounding box for")
    return Vector(bbox_min), Vector(bbox_max)

def scene_bbox(single_obj=None, ignore_matrix=False):
    bbox_min = (math.inf,) * 3
    bbox_max = (-math.inf,) * 3
    found = False
    for obj in scene_meshes() if single_obj is None else [single_obj]:
        found = True
        for coord in obj.bound_box:
            coord = Vector(coord)
            if not ignore_matrix:
                coord = obj.matrix_world @ coord
            bbox_min = tuple(min(x, y) for x, y in zip(bbox_min, coord))
            bbox_max = tuple(max(x, y) for x, y in zip(bbox_max, coord))
    if not found:
        raise RuntimeError("no objects in scene to compute bounding box for")
    return Vector(bbox_min), Vector(bbox_max)

def normalize_scene():

    bbox_min, bbox_max = scene_bbox()
    scale = 1 / max(bbox_max - bbox_min)
    for obj in scene_root_objects():
        obj.scale = obj.scale * scale
        obj.location = obj.location * scale
    # Apply scale to matrix_world.
    bpy.context.view_layer.update()
    bbox_min, bbox_max = scene_bbox()
    offset = -(bbox_min + bbox_max) / 2
    for obj in scene_root_objects():
        obj.matrix_world.translation += offset
    bpy.ops.object.select_all(action="DESELECT")


if __name__ == "__main__":
    parser = argparse.ArgumentParser(
        description="Render mesh minimal scripts"
    )
    parser.add_argument('--obj_path', type=str)
    parser.add_argument(
        "--outdir",
        default="./render_result",
        type=str,
        help="Path to save sampled data."
    )
    
    argv = sys.argv[sys.argv.index("--") + 1 :]
    args = parser.parse_args(argv)

    if not os.path.exists(args.outdir):
        os.makedirs(args.outdir)
    
    reset_scene()
    init_global(bpy.context)
    load_object(args.obj_path)
    normalize_scene()
    output_path = os.path.join(args.outdir, "condition.png")
    bpy.context.scene.render.filepath = (output_path)
    bpy.ops.render.render(write_still=True)
    

================================================
FILE: configs/coin3d_train.yaml
================================================
model:
  base_learning_rate: 5.0e-05
  target: ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo
  params:
    view_num: 16
    image_size: 256
    cfg_scale: 2.0
    output_num: 8
    batch_view_num: 4
    finetune_unet: false
    finetune_projection: false
    drop_conditions: false
    clip_image_encoder_path: ckpt/ViT-L-14.pt
    feature_scale: 1
    scheduler_config: # 100 warmup steps
      target: ldm.lr_scheduler.LambdaLinearScheduler
      params:
        warm_up_steps: [ 100 ]
        cycle_lengths: [ 100000 ]
        f_start: [ 0.02 ]
        f_max: [ 1.0 ]
        f_min: [ 1.0 ]

    unet_config:
      target: ldm.models.diffusion.sync_dreamer_attention.DepthWiseAttention
      params:
        volume_dims: [64, 128, 256, 512]
        image_size: 32
        in_channels: 8
        out_channels: 4
        model_channels: 320
        attention_resolutions: [ 4, 2, 1 ]
        num_res_blocks: 2
        channel_mult: [ 1, 2, 4, 4 ]
        num_heads: 8
        use_spatial_transformer: True
        transformer_depth: 1
        context_dim: 768
        use_checkpoint: True
        legacy: False

data:
  target: ldm.data.control_sync_dreamer.ControlSyncDreamerDataset
  params:
    target_dir: path/to/renderings-v1 # renderings of target views
    input_dir: path/to/renderings-random # renderings of input views
    proxy_dir: path/to/proxy_256 # proxies of target objects
    validation_dir: path/to/renderings-v1 # directory of validation data
    uid_set_pkl: path/to/proxy_256/train.pkl # a list of uids
    valid_uid_set_pkl: path/to/proxy_256/test.pkl # a list of uids
    batch_size: 8 # batch size for a single gpu
    num_workers: 8

lightning:
  modelcheckpoint:
    params:
      every_n_train_steps: 1000 # we will save models every 1k steps
  callbacks:
    {}

  trainer:
    benchmark: True
    val_check_interval: 100000 # we will run validation every 100k steps; the validation will output images to <log_dir>/images/val
    num_sanity_val_steps: 0
    check_val_every_n_epoch: null
    # max_epochs: 10000


================================================
FILE: configs/ctrldemo.yaml
================================================
model:
  base_learning_rate: 5.0e-05
  target: ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo
  params:
    view_num: 16
    image_size: 256
    cfg_scale: 2.0
    output_num: 8
    batch_view_num: 4
    finetune_unet: false
    finetune_projection: false
    drop_conditions: false
    clip_image_encoder_path: ckpt/ViT-L-14.pt

    scheduler_config: # 100 warmup steps
      target: ldm.lr_scheduler.LambdaLinearScheduler
      params:
        warm_up_steps: [ 100 ]
        cycle_lengths: [ 100000 ]
        f_start: [ 0.02 ]
        f_max: [ 1.0 ]
        f_min: [ 1.0 ]

    unet_config:
      target: ldm.models.diffusion.sync_dreamer_attention.DepthWiseAttention
      params:
        volume_dims: [64, 128, 256, 512]
        image_size: 32
        in_channels: 8
        out_channels: 4
        model_channels: 320
        attention_resolutions: [ 4, 2, 1 ]
        num_res_blocks: 2
        channel_mult: [ 1, 2, 4, 4 ]
        num_heads: 8
        use_spatial_transformer: True
        transformer_depth: 1
        context_dim: 768
        use_checkpoint: True
        legacy: False

data: {}

lightning:
  trainer: {}

================================================
FILE: configs/nerf.yaml
================================================
model:
  base_lr: 1.0e-2
  target: renderer.renderer.RendererTrainer
  params:
    total_steps: 2000
    warm_up_steps: 100
    train_batch_num: 40960
    test_batch_num: 40960
    renderer: ngp
    cube_bound: 0.6
    use_mask: true
    lambda_rgb_loss: 0.5
    lambda_mask_loss: 10.0

data:
  target: renderer.dummy_dataset.DummyDataset
  params: {}

callbacks:
  save_interval: 5000

trainer:
  val_check_interval: 500
  max_steps: 2000



================================================
FILE: configs/neus.yaml
================================================
model:
  base_lr: 5.0e-4
  target: renderer.renderer.RendererTrainer
  params:
    total_steps: 2000
    warm_up_steps: 100
    train_batch_num: 3584
    train_batch_fg_num: 512
    test_batch_num: 4096
    use_mask: true
    lambda_rgb_loss: 0.5
    lambda_mask_loss: 1.0
    lambda_eikonal_loss: 0.1
    use_warm_up: true

data:
  target: renderer.dummy_dataset.DummyDataset
  params: {}

callbacks:
  save_interval: 500

trainer:
  val_check_interval: 500
  max_steps: 2000



================================================
FILE: configs/syncdreamer.yaml
================================================
model:
  base_learning_rate: 5.0e-05
  target: ldm.models.diffusion.sync_dreamer.SyncMultiviewDiffusion
  params:
    view_num: 16
    image_size: 256
    cfg_scale: 2.0
    output_num: 8
    batch_view_num: 4
    finetune_unet: false
    finetune_projection: false
    drop_conditions: false
    clip_image_encoder_path: ckpt/ViT-L-14.pt

    scheduler_config: # 100 warmup steps
      target: ldm.lr_scheduler.LambdaLinearScheduler
      params:
        warm_up_steps: [ 100 ]
        cycle_lengths: [ 100000 ]
        f_start: [ 0.02 ]
        f_max: [ 1.0 ]
        f_min: [ 1.0 ]

    unet_config:
      target: ldm.models.diffusion.sync_dreamer_attention.DepthWiseAttention
      params:
        volume_dims: [64, 128, 256, 512]
        image_size: 32
        in_channels: 8
        out_channels: 4
        model_channels: 320
        attention_resolutions: [ 4, 2, 1 ]
        num_res_blocks: 2
        channel_mult: [ 1, 2, 4, 4 ]
        num_heads: 8
        use_spatial_transformer: True
        transformer_depth: 1
        context_dim: 768
        use_checkpoint: True
        legacy: False

data: {}

lightning:
  trainer: {}


================================================
FILE: example/panda/mesh.obj
================================================
# Blender 3.6.1
# www.blender.org
o Sphere
v -0.090264 0.462556 -0.155620
v -0.090264 0.401258 -0.226042
v -0.090264 0.321167 -0.264155
v -0.090264 0.277823 -0.269059
v -0.090264 0.234478 -0.264155
v -0.090264 0.154388 -0.226042
v -0.081254 0.495731 -0.062652
v -0.072590 0.483088 -0.109615
v -0.064606 0.462556 -0.152895
v -0.057607 0.434926 -0.190831
v -0.051864 0.401258 -0.221964
v -0.047596 0.362846 -0.245099
v -0.044967 0.321167 -0.259344
v -0.044080 0.277823 -0.264155
v -0.044967 0.234478 -0.259344
v -0.047596 0.192799 -0.245099
v -0.051864 0.154388 -0.221964
v -0.057607 0.120720 -0.190831
v -0.064606 0.093089 -0.152895
v -0.072590 0.072558 -0.109615
v -0.081254 0.059914 -0.062652
v -0.072590 0.495731 -0.059819
v -0.055596 0.483088 -0.104056
v -0.039933 0.462556 -0.144826
v -0.026205 0.434926 -0.180561
v -0.014938 0.401258 -0.209887
v -0.006567 0.362846 -0.231679
v -0.001411 0.321167 -0.245099
v 0.000329 0.277823 -0.249630
v -0.001411 0.234478 -0.245099
v -0.006567 0.192799 -0.231679
v -0.014938 0.154388 -0.209887
v -0.026205 0.120720 -0.180561
v -0.039933 0.093089 -0.144826
v -0.055596 0.072558 -0.104056
v -0.072590 0.059914 -0.059819
v -0.064606 0.495731 -0.055217
v -0.039933 0.483088 -0.095030
v -0.017195 0.462556 -0.131721
v 0.002736 0.434926 -0.163882
v 0.019092 0.401258 -0.190275
v 0.031246 0.362846 -0.209887
v 0.038730 0.321167 -0.221964
v 0.041257 0.277823 -0.226042
v 0.038730 0.234478 -0.221964
v 0.031246 0.192799 -0.209887
v 0.019092 0.154388 -0.190275
v 0.002736 0.120720 -0.163882
v -0.017195 0.093089 -0.131721
v -0.039933 0.072558 -0.095030
v -0.064606 0.059914 -0.055217
v -0.057607 0.495731 -0.049024
v -0.026205 0.483088 -0.082882
v 0.002736 0.462556 -0.114086
v 0.028102 0.434926 -0.141436
v 0.048920 0.401258 -0.163882
v 0.064389 0.362846 -0.180561
v 0.073915 0.321167 -0.190831
v 0.077131 0.277823 -0.194299
v 0.073915 0.234478 -0.190831
v 0.064389 0.192799 -0.180561
v 0.048920 0.154388 -0.163882
v 0.028102 0.120720 -0.141436
v 0.002736 0.093089 -0.114086
v -0.026205 0.072558 -0.082882
v -0.057607 0.059914 -0.049024
v -0.051864 0.495731 -0.041478
v -0.014938 0.483088 -0.068080
v 0.019092 0.462556 -0.092597
v 0.048920 0.434926 -0.114086
v 0.073399 0.401258 -0.131721
v 0.091589 0.362846 -0.144826
v 0.102790 0.321167 -0.152895
v 0.106572 0.277823 -0.155620
v 0.102790 0.234478 -0.152895
v 0.091589 0.192799 -0.144826
v 0.073399 0.154388 -0.131721
v 0.048920 0.120720 -0.114086
v 0.019092 0.093089 -0.092597
v -0.014938 0.072558 -0.068080
v -0.051864 0.059914 -0.041478
v -0.090264 0.500000 -0.013813
v -0.047596 0.495731 -0.032869
v -0.006567 0.483088 -0.051193
v 0.031246 0.462556 -0.068080
v 0.064389 0.434926 -0.082882
v 0.091589 0.401258 -0.095030
v 0.111800 0.362846 -0.104056
v 0.124246 0.321167 -0.109615
v 0.128448 0.277823 -0.111491
v 0.124246 0.234478 -0.109615
v 0.111800 0.192799 -0.104056
v 0.091589 0.154388 -0.095030
v 0.064389 0.120720 -0.082882
v 0.031246 0.093089 -0.068080
v -0.006567 0.072558 -0.051193
v -0.047596 0.059914 -0.032869
v -0.044967 0.495731 -0.023528
v -0.001411 0.483088 -0.032869
v 0.038730 0.462556 -0.041478
v 0.073915 0.434926 -0.049024
v 0.102790 0.401258 -0.055217
v 0.124246 0.362846 -0.059819
v 0.137458 0.321167 -0.062652
v 0.141920 0.277823 -0.063609
v 0.137458 0.234478 -0.062652
v 0.124246 0.192799 -0.059819
v 0.102790 0.154388 -0.055217
v 0.073915 0.120720 -0.049024
v 0.038730 0.093089 -0.041478
v -0.001411 0.072558 -0.032869
v -0.044967 0.059914 -0.023528
v -0.044080 0.495731 -0.013813
v 0.000329 0.483088 -0.013813
v 0.041257 0.462556 -0.013813
v 0.077131 0.434926 -0.013813
v 0.106572 0.401258 -0.013813
v 0.128448 0.362846 -0.013813
v 0.141920 0.321167 -0.013813
v 0.146468 0.277823 -0.013813
v 0.141920 0.234478 -0.013813
v 0.128448 0.192799 -0.013813
v 0.106572 0.154388 -0.013813
v 0.077131 0.120720 -0.013813
v 0.041257 0.093089 -0.013813
v 0.000329 0.072558 -0.013813
v -0.044080 0.059914 -0.013813
v -0.044967 0.495731 -0.004098
v -0.001411 0.483088 0.005243
v 0.038730 0.462556 0.013852
v 0.073915 0.434926 0.021398
v 0.102790 0.401258 0.027591
v 0.124246 0.362846 0.032193
v 0.137458 0.321167 0.035026
v 0.141920 0.277823 0.035983
v 0.137458 0.234478 0.035026
v 0.124246 0.192799 0.032193
v 0.102790 0.154388 0.027591
v 0.073915 0.120720 0.021398
v 0.038730 0.093089 0.013852
v -0.001411 0.072558 0.005243
v -0.044967 0.059914 -0.004098
v -0.047596 0.495731 0.005243
v -0.006567 0.483088 0.023567
v 0.031246 0.462556 0.040454
v 0.064389 0.434926 0.055256
v 0.091589 0.401258 0.067404
v 0.111800 0.362846 0.076430
v 0.124246 0.321167 0.081989
v 0.128448 0.277823 0.083866
v 0.124246 0.234478 0.081989
v 0.111800 0.192799 0.076430
v 0.091589 0.154388 0.067404
v 0.064389 0.120720 0.055256
v 0.031246 0.093089 0.040454
v -0.006567 0.072558 0.023567
v -0.047596 0.059914 0.005243
v -0.051864 0.495731 0.013852
v -0.014938 0.483088 0.040454
v 0.019092 0.462556 0.064971
v 0.048920 0.434926 0.086460
v 0.073399 0.401258 0.104095
v 0.091589 0.362846 0.117200
v 0.102790 0.321167 0.125269
v 0.106572 0.277823 0.127994
v 0.102790 0.234478 0.125269
v 0.091589 0.192799 0.117200
v 0.073399 0.154388 0.104095
v 0.048920 0.120720 0.086460
v 0.019092 0.093089 0.064971
v -0.014938 0.072558 0.040454
v -0.051864 0.059914 0.013852
v -0.057607 0.495731 0.021398
v -0.026205 0.483088 0.055256
v 0.002736 0.462556 0.086460
v 0.028102 0.434926 0.113810
v 0.048920 0.401258 0.136256
v 0.064389 0.362846 0.152935
v 0.073915 0.321167 0.163205
v 0.077131 0.277823 0.166673
v 0.073915 0.234478 0.163205
v 0.064389 0.192799 0.152935
v 0.048920 0.154388 0.136256
v 0.028102 0.120720 0.113810
v 0.002736 0.093089 0.086460
v -0.026205 0.072558 0.055256
v -0.057607 0.059914 0.021398
v -0.064606 0.495731 0.027591
v -0.039933 0.483088 0.067404
v -0.017195 0.462556 0.104095
v 0.002736 0.434926 0.136256
v 0.019092 0.401258 0.162649
v 0.031246 0.362846 0.182261
v 0.038730 0.321167 0.194339
v 0.041257 0.277823 0.198416
v 0.038730 0.234478 0.194339
v 0.031246 0.192799 0.182261
v 0.019092 0.154388 0.162649
v 0.002736 0.120720 0.136256
v -0.017195 0.093089 0.104095
v -0.039933 0.072558 0.067404
v -0.064606 0.059914 0.027591
v -0.072590 0.495731 0.032193
v -0.055596 0.483088 0.076430
v -0.039933 0.462556 0.117200
v -0.026205 0.434926 0.152935
v -0.014938 0.401258 0.182261
v -0.006567 0.362846 0.204053
v -0.001411 0.321167 0.217473
v 0.000329 0.277823 0.222004
v -0.001411 0.234478 0.217473
v -0.006567 0.192799 0.204053
v -0.014938 0.154388 0.182261
v -0.026205 0.120720 0.152935
v -0.039933 0.093089 0.117200
v -0.055596 0.072558 0.076430
v -0.072590 0.059914 0.032193
v -0.081254 0.495731 0.035026
v -0.072590 0.483088 0.081989
v -0.064606 0.462556 0.125269
v -0.057607 0.434926 0.163205
v -0.051864 0.401258 0.194338
v -0.047596 0.362846 0.217473
v -0.044968 0.321167 0.231718
v -0.044080 0.277823 0.236529
v -0.044968 0.234478 0.231718
v -0.047596 0.192799 0.217473
v -0.051864 0.154388 0.194338
v -0.057607 0.120720 0.163205
v -0.064606 0.093089 0.125269
v -0.072590 0.072558 0.081989
v -0.081254 0.059914 0.035026
v -0.090264 0.495731 0.035983
v -0.090264 0.483088 0.083866
v -0.090264 0.462556 0.127994
v -0.090264 0.434926 0.166673
v -0.090264 0.401258 0.198416
v -0.090264 0.362846 0.222004
v -0.090264 0.321167 0.236529
v -0.090264 0.277823 0.241433
v -0.090264 0.234478 0.236529
v -0.090264 0.192799 0.222004
v -0.090264 0.154388 0.198416
v -0.090264 0.120720 0.166673
v -0.090264 0.093089 0.127994
v -0.090264 0.072558 0.083866
v -0.090264 0.059914 0.035983
v -0.099274 0.495731 0.035026
v -0.107938 0.483088 0.081989
v -0.115923 0.462556 0.125269
v -0.122922 0.434926 0.163205
v -0.128665 0.401258 0.194338
v -0.132933 0.362846 0.217473
v -0.135561 0.321167 0.231718
v -0.136449 0.277823 0.236529
v -0.135561 0.234478 0.231718
v -0.132933 0.192799 0.217473
v -0.128665 0.154388 0.194338
v -0.122922 0.120720 0.163205
v -0.115923 0.093089 0.125269
v -0.107938 0.072558 0.081989
v -0.099274 0.059914 0.035026
v -0.107938 0.495731 0.032193
v -0.124933 0.483088 0.076430
v -0.140596 0.462556 0.117200
v -0.154324 0.434926 0.152935
v -0.165590 0.401258 0.182261
v -0.173962 0.362846 0.204053
v -0.179117 0.321167 0.217473
v -0.180858 0.277823 0.222004
v -0.179117 0.234478 0.217473
v -0.173962 0.192799 0.204053
v -0.165590 0.154388 0.182261
v -0.154324 0.120720 0.152935
v -0.140596 0.093089 0.117200
v -0.124933 0.072558 0.076430
v -0.107938 0.059914 0.032193
v -0.115923 0.495731 0.027591
v -0.140596 0.483088 0.067404
v -0.163334 0.462556 0.104095
v -0.183264 0.434926 0.136256
v -0.199621 0.401258 0.162649
v -0.211775 0.362846 0.182261
v -0.219259 0.321167 0.194338
v -0.221786 0.277823 0.198416
v -0.219259 0.234478 0.194338
v -0.211775 0.192799 0.182261
v -0.199621 0.154388 0.162649
v -0.183264 0.120720 0.136256
v -0.163334 0.093089 0.104095
v -0.140596 0.072558 0.067404
v -0.115923 0.059914 0.027591
v -0.122922 0.495731 0.021398
v -0.154324 0.483088 0.055256
v -0.183264 0.462556 0.086460
v -0.208631 0.434926 0.113810
v -0.229448 0.401258 0.136256
v -0.244917 0.362846 0.152935
v -0.254443 0.321167 0.163205
v -0.257660 0.277823 0.166673
v -0.254443 0.234478 0.163205
v -0.244917 0.192799 0.152935
v -0.229448 0.154388 0.136256
v -0.208631 0.120720 0.113810
v -0.183264 0.093089 0.086460
v -0.154324 0.072558 0.055256
v -0.122922 0.059914 0.021398
v -0.090264 0.055645 -0.013813
v -0.128665 0.495731 0.013852
v -0.165590 0.483088 0.040454
v -0.199621 0.462556 0.064971
v -0.229449 0.434926 0.086460
v -0.253928 0.401258 0.104095
v -0.272117 0.362846 0.117200
v -0.283318 0.321167 0.125269
v -0.287100 0.277823 0.127994
v -0.283318 0.234478 0.125269
v -0.272117 0.192799 0.117200
v -0.253928 0.154388 0.104095
v -0.229449 0.120720 0.086460
v -0.199621 0.093089 0.064971
v -0.165590 0.072558 0.040454
v -0.128665 0.059914 0.013852
v -0.132933 0.495731 0.005243
v -0.173962 0.483088 0.023567
v -0.211774 0.462556 0.040454
v -0.244918 0.434926 0.055256
v -0.272117 0.401258 0.067404
v -0.292328 0.362846 0.076430
v -0.304774 0.321167 0.081989
v -0.308977 0.277823 0.083865
v -0.304774 0.234478 0.081989
v -0.292328 0.192799 0.076430
v -0.272117 0.154388 0.067404
v -0.244918 0.120720 0.055256
v -0.211774 0.093089 0.040454
v -0.173962 0.072558 0.023567
v -0.132933 0.059914 0.005243
v -0.135561 0.495731 -0.004098
v -0.179117 0.483088 0.005243
v -0.219259 0.462556 0.013852
v -0.254443 0.434926 0.021398
v -0.283318 0.401258 0.027591
v -0.304774 0.362846 0.032193
v -0.317987 0.321167 0.035026
v -0.322448 0.277823 0.035983
v -0.317987 0.234478 0.035026
v -0.304774 0.192799 0.032193
v -0.283318 0.154388 0.027591
v -0.254443 0.120720 0.021398
v -0.219259 0.093089 0.013852
v -0.179117 0.072558 0.005243
v -0.135561 0.059914 -0.004098
v -0.136449 0.495731 -0.013813
v -0.180858 0.483088 -0.013813
v -0.221786 0.462556 -0.013813
v -0.257660 0.434926 -0.013813
v -0.287100 0.401258 -0.013813
v -0.308977 0.362846 -0.013813
v -0.322448 0.321167 -0.013813
v -0.326997 0.277823 -0.013813
v -0.322448 0.234478 -0.013813
v -0.308977 0.192799 -0.013813
v -0.287100 0.154388 -0.013813
v -0.257660 0.120720 -0.013813
v -0.221786 0.093089 -0.013813
v -0.180858 0.072558 -0.013813
v -0.136449 0.059914 -0.013813
v -0.135561 0.495731 -0.023528
v -0.179117 0.483088 -0.032869
v -0.219259 0.462556 -0.041478
v -0.254443 0.434926 -0.049024
v -0.283318 0.401258 -0.055217
v -0.304774 0.362846 -0.059819
v -0.317987 0.321167 -0.062652
v -0.322448 0.277823 -0.063609
v -0.317987 0.234478 -0.062652
v -0.304774 0.192799 -0.059819
v -0.283318 0.154388 -0.055217
v -0.254443 0.120720 -0.049024
v -0.219259 0.093089 -0.041478
v -0.179117 0.072558 -0.032869
v -0.135561 0.059914 -0.023528
v -0.132933 0.495731 -0.032869
v -0.173962 0.483088 -0.051193
v -0.211774 0.462556 -0.068080
v -0.244917 0.434926 -0.082882
v -0.272117 0.401258 -0.095030
v -0.292328 0.362846 -0.104056
v -0.304774 0.321167 -0.109615
v -0.308977 0.277823 -0.111491
v -0.304774 0.234478 -0.109615
v -0.292328 0.192799 -0.104056
v -0.272117 0.154388 -0.095030
v -0.244917 0.120720 -0.082882
v -0.211774 0.093089 -0.068080
v -0.173962 0.072558 -0.051193
v -0.132933 0.059914 -0.032869
v -0.128665 0.495731 -0.041478
v -0.165590 0.483088 -0.068080
v -0.199621 0.462556 -0.092597
v -0.229448 0.434926 -0.114086
v -0.253928 0.401258 -0.131721
v -0.272117 0.362846 -0.144826
v -0.283318 0.321167 -0.152895
v -0.287100 0.277823 -0.155620
v -0.283318 0.234478 -0.152895
v -0.272117 0.192799 -0.144826
v -0.253928 0.154388 -0.131721
v -0.229448 0.120720 -0.114086
v -0.199621 0.093089 -0.092597
v -0.165590 0.072558 -0.068080
v -0.128665 0.059914 -0.041478
v -0.122922 0.495731 -0.049024
v -0.154324 0.483088 -0.082882
v -0.183264 0.462556 -0.114086
v -0.208631 0.434926 -0.141436
v -0.229448 0.401258 -0.163882
v -0.244917 0.362846 -0.180560
v -0.254443 0.321167 -0.190831
v -0.257659 0.277823 -0.194299
v -0.254443 0.234478 -0.190831
v -0.244917 0.192799 -0.180560
v -0.229448 0.154388 -0.163882
v -0.208631 0.120720 -0.141436
v -0.183264 0.093089 -0.114086
v -0.154324 0.072558 -0.082882
v -0.122922 0.059914 -0.049024
v -0.115923 0.495731 -0.055217
v -0.140595 0.483088 -0.095030
v -0.163334 0.462556 -0.131721
v -0.183264 0.434926 -0.163882
v -0.199621 0.401258 -0.190275
v -0.211775 0.362846 -0.209887
v -0.219259 0.321167 -0.221964
v -0.221786 0.277823 -0.226042
v -0.219259 0.234478 -0.221964
v -0.211775 0.192799 -0.209887
v -0.199621 0.154388 -0.190275
v -0.183264 0.120720 -0.163882
v -0.163334 0.093089 -0.131721
v -0.140595 0.072558 -0.095030
v -0.115923 0.059914 -0.055217
v -0.107938 0.495731 -0.059818
v -0.124933 0.483088 -0.104056
v -0.140595 0.462556 -0.144826
v -0.154324 0.434926 -0.180561
v -0.165590 0.401258 -0.209887
v -0.173962 0.362846 -0.231679
v -0.179117 0.321167 -0.245098
v -0.180858 0.277823 -0.249629
v -0.179117 0.234478 -0.245098
v -0.173962 0.192799 -0.231679
v -0.165590 0.154388 -0.209887
v -0.154324 0.120720 -0.180561
v -0.140595 0.093089 -0.144826
v -0.124933 0.072558 -0.104056
v -0.107938 0.059914 -0.059818
v -0.099274 0.495731 -0.062652
v -0.107938 0.483088 -0.109615
v -0.115923 0.462556 -0.152895
v -0.122922 0.434926 -0.190831
v -0.128665 0.401258 -0.221964
v -0.132933 0.362846 -0.245098
v -0.135561 0.321167 -0.259344
v -0.136448 0.277823 -0.264154
v -0.135561 0.234478 -0.259344
v -0.132933 0.192799 -0.245098
v -0.128665 0.154388 -0.221964
v -0.122922 0.120720 -0.190831
v -0.115923 0.093089 -0.152895
v -0.107938 0.072558 -0.109615
v -0.099274 0.059914 -0.062652
v -0.090264 0.495731 -0.063609
v -0.090264 0.483088 -0.111491
v -0.090264 0.434926 -0.194299
v -0.090264 0.362846 -0.249630
v -0.090264 0.192799 -0.249630
v -0.090264 0.120720 -0.194299
v -0.090264 0.093089 -0.155620
v -0.090264 0.072558 -0.111491
v -0.090264 0.059914 -0.063609
vn 0.0901 -0.5212 -0.8487
vn 0.0616 0.8122 -0.5802
vn 0.0770 -0.6840 -0.7254
vn 0.0770 0.6840 -0.7254
vn 0.0616 -0.8122 -0.5802
vn 0.0901 0.5212 -0.8487
vn 0.0448 -0.9058 -0.4214
vn 0.0998 0.3274 -0.9396
vn 0.0271 -0.9665 -0.2552
vn 0.1049 0.1118 -0.9882
vn 0.0091 0.9963 -0.0854
vn 0.0091 -0.9963 -0.0854
vn 0.1049 -0.1118 -0.9882
vn 0.0271 0.9665 -0.2552
vn 0.0998 -0.3274 -0.9396
vn 0.0448 0.9058 -0.4214
vn 0.2939 -0.3257 -0.8986
vn 0.1324 0.9048 -0.4048
vn 0.2657 -0.5189 -0.8125
vn 0.1821 0.8105 -0.5567
vn 0.2274 -0.6818 -0.6953
vn 0.2274 0.6818 -0.6953
vn 0.1821 -0.8105 -0.5567
vn 0.2657 0.5189 -0.8125
vn 0.1324 -0.9048 -0.4048
vn 0.2939 0.3257 -0.8986
vn 0.0802 -0.9661 -0.2453
vn 0.3089 0.1112 -0.9446
vn 0.0269 0.9963 -0.0821
vn 0.0269 -0.9963 -0.0821
vn 0.3089 -0.1112 -0.9446
vn 0.0802 0.9661 -0.2453
vn 0.2146 -0.9030 -0.3723
vn 0.4726 0.3225 -0.8201
vn 0.1302 -0.9654 -0.2259
vn 0.4963 0.1100 -0.8612
vn 0.0436 0.9962 -0.0757
vn 0.0436 -0.9962 -0.0757
vn 0.4963 -0.1100 -0.8612
vn 0.1302 0.9654 -0.2259
vn 0.4726 -0.3225 -0.8201
vn 0.2146 0.9030 -0.3723
vn 0.4281 -0.5147 -0.7428
vn 0.2946 0.8074 -0.5111
vn 0.3671 -0.6778 -0.6371
vn 0.3671 0.6778 -0.6371
vn 0.2946 -0.8074 -0.5111
vn 0.4281 0.5147 -0.7428
vn 0.2880 0.9006 -0.3255
vn 0.5702 -0.5095 -0.6444
vn 0.3945 0.8035 -0.4458
vn 0.4904 -0.6726 -0.5542
vn 0.4904 0.6726 -0.5542
vn 0.3945 -0.8035 -0.4458
vn 0.5702 0.5095 -0.6444
vn 0.2880 -0.9006 -0.3255
vn 0.6282 0.3185 -0.7099
vn 0.1750 -0.9645 -0.1978
vn 0.6588 0.1085 -0.7445
vn 0.0587 0.9961 -0.0663
vn 0.0587 -0.9961 -0.0663
vn 0.6588 -0.1085 -0.7445
vn 0.1750 0.9645 -0.1978
vn 0.6282 -0.3185 -0.7099
vn 0.7554 0.3143 -0.5750
vn 0.2131 -0.9635 -0.1622
vn 0.7912 0.1069 -0.6022
vn 0.0715 0.9960 -0.0544
vn 0.0715 -0.9960 -0.0544
vn 0.7912 -0.1069 -0.6022
vn 0.2131 0.9635 -0.1622
vn 0.7554 -0.3143 -0.5750
vn 0.3499 0.8981 -0.2664
vn 0.6873 -0.5039 -0.5231
vn 0.4782 0.7993 -0.3640
vn 0.5927 -0.6672 -0.4511
vn 0.5927 0.6672 -0.4511
vn 0.4782 -0.7993 -0.3640
vn 0.6873 0.5039 -0.5231
vn 0.3499 -0.8981 -0.2664
vn 0.7764 -0.4990 -0.3849
vn 0.5429 0.7955 -0.2692
vn 0.6712 -0.6623 -0.3328
vn 0.6712 0.6623 -0.3328
vn 0.5429 -0.7955 -0.2692
vn 0.7764 0.4990 -0.3849
vn 0.3982 -0.8958 -0.1974
vn 0.8516 0.3106 -0.4222
vn 0.2428 -0.9626 -0.1204
vn 0.8909 0.1055 -0.4417
vn 0.0816 0.9958 -0.0404
vn 0.0816 -0.9958 -0.0404
vn 0.8909 -0.1055 -0.4417
vn 0.2428 0.9626 -0.1204
vn 0.8516 -0.3106 -0.4222
vn 0.3982 0.8958 -0.1974
vn 0.2633 -0.9619 -0.0741
vn 0.9574 0.1045 -0.2693
vn 0.0885 0.9958 -0.0249
vn 0.0885 -0.9958 -0.0249
vn 0.9574 -0.1045 -0.2693
vn 0.2633 0.9619 -0.0741
vn 0.9159 -0.3079 -0.2577
vn 0.4313 0.8940 -0.1213
vn 0.8363 -0.4953 -0.2353
vn 0.5870 0.7926 -0.1651
vn 0.7243 -0.6587 -0.2038
vn 0.7243 0.6587 -0.2038
vn 0.5870 -0.7926 -0.1651
vn 0.8363 0.4953 -0.2353
vn 0.4313 -0.8940 -0.1213
vn 0.9159 0.3079 -0.2577
vn 0.7510 -0.6567 -0.0686
vn 0.7510 0.6567 -0.0686
vn 0.6093 -0.7910 -0.0557
vn 0.8662 0.4933 -0.0791
vn 0.4480 -0.8931 -0.0409
vn 0.9480 0.3064 -0.0866
vn 0.2737 -0.9615 -0.0250
vn 0.9905 0.1039 -0.0905
vn 0.0920 0.9957 -0.0084
vn 0.0920 -0.9957 -0.0084
vn 0.9905 -0.1039 -0.0905
vn 0.2737 0.9615 -0.0250
vn 0.9480 -0.3064 -0.0866
vn 0.4480 0.8931 -0.0409
vn 0.8662 -0.4933 -0.0791
vn 0.6093 0.7910 -0.0557
vn 0.0920 0.9957 0.0084
vn 0.0920 -0.9957 0.0084
vn 0.9905 -0.1039 0.0905
vn 0.2737 0.9615 0.0250
vn 0.9480 -0.3064 0.0866
vn 0.4480 0.8931 0.0409
vn 0.8662 -0.4933 0.0791
vn 0.6093 0.7910 0.0557
vn 0.7510 -0.6567 0.0686
vn 0.7510 0.6567 0.0686
vn 0.6093 -0.7910 0.0557
vn 0.8662 0.4933 0.0791
vn 0.4480 -0.8931 0.0409
vn 0.9480 0.3064 0.0866
vn 0.2737 -0.9615 0.0250
vn 0.9905 0.1039 0.0905
vn 0.7243 0.6587 0.2038
vn 0.5870 -0.7926 0.1651
vn 0.8363 0.4953 0.2353
vn 0.4313 -0.8940 0.1213
vn 0.9159 0.3079 0.2577
vn 0.2633 -0.9619 0.0741
vn 0.9574 0.1045 0.2693
vn 0.0885 0.9958 0.0249
vn 0.0885 -0.9958 0.0249
vn 0.9574 -0.1045 0.2693
vn 0.2633 0.9619 0.0741
vn 0.9159 -0.3079 0.2577
vn 0.4313 0.8940 0.1213
vn 0.8363 -0.4953 0.2353
vn 0.5870 0.7926 0.1651
vn 0.7243 -0.6587 0.2038
vn 0.8909 -0.1055 0.4417
vn 0.2428 0.9626 0.1204
vn 0.8516 -0.3106 0.4222
vn 0.3982 0.8958 0.1974
vn 0.7764 -0.4990 0.3849
vn 0.5429 0.7955 0.2692
vn 0.6712 -0.6623 0.3328
vn 0.6712 0.6623 0.3328
vn 0.5429 -0.7955 0.2692
vn 0.7764 0.4990 0.3849
vn 0.3982 -0.8958 0.1974
vn 0.8516 0.3106 0.4222
vn 0.2428 -0.9626 0.1204
vn 0.8909 0.1055 0.4417
vn 0.0816 0.9958 0.0404
vn 0.0816 -0.9958 0.0404
vn 0.4782 -0.7993 0.3640
vn 0.6873 0.5039 0.5231
vn 0.3499 -0.8981 0.2664
vn 0.7554 0.3143 0.5750
vn 0.2131 -0.9635 0.1622
vn 0.7912 0.1069 0.6022
vn 0.0715 0.9960 0.0544
vn 0.0715 -0.9960 0.0544
vn 0.7912 -0.1069 0.6022
vn 0.2131 0.9635 0.1622
vn 0.7554 -0.3143 0.5750
vn 0.3499 0.8981 0.2664
vn 0.6873 -0.5039 0.5231
vn 0.4782 0.7993 0.3640
vn 0.5927 -0.6672 0.4511
vn 0.5927 0.6672 0.4511
vn 0.1750 0.9645 0.1978
vn 0.6282 -0.3185 0.7099
vn 0.2880 0.9006 0.3255
vn 0.5702 -0.5095 0.6444
vn 0.3945 0.8035 0.4458
vn 0.4904 -0.6726 0.5542
vn 0.4904 0.6726 0.5542
vn 0.3945 -0.8035 0.4458
vn 0.5702 0.5095 0.6444
vn 0.2880 -0.9006 0.3255
vn 0.6282 0.3185 0.7099
vn 0.1750 -0.9645 0.1978
vn 0.6588 0.1084 0.7445
vn 0.0587 0.9961 0.0663
vn 0.0587 -0.9961 0.0663
vn 0.6588 -0.1084 0.7445
vn 0.4281 0.5147 0.7428
vn 0.2146 -0.9030 0.3723
vn 0.4726 0.3225 0.8201
vn 0.1302 -0.9654 0.2259
vn 0.4963 0.1100 0.8612
vn 0.0436 0.9962 0.0757
vn 0.0436 -0.9962 0.0757
vn 0.4963 -0.1100 0.8612
vn 0.1302 0.9654 0.2259
vn 0.4726 -0.3225 0.8201
vn 0.2146 0.9030 0.3723
vn 0.4281 -0.5147 0.7428
vn 0.2946 0.8074 0.5111
vn 0.3671 -0.6778 0.6371
vn 0.3671 0.6778 0.6371
vn 0.2946 -0.8074 0.5111
vn 0.2939 -0.3257 0.8986
vn 0.1324 0.9048 0.4048
vn 0.2657 -0.5189 0.8125
vn 0.1821 0.8105 0.5567
vn 0.2274 -0.6818 0.6953
vn 0.2274 0.6818 0.6953
vn 0.1821 -0.8105 0.5567
vn 0.2657 0.5189 0.8125
vn 0.1324 -0.9048 0.4048
vn 0.2939 0.3257 0.8986
vn 0.0802 -0.9661 0.2453
vn 0.3089 0.1111 0.9446
vn 0.0269 0.9963 0.0821
vn 0.0269 -0.9963 0.0821
vn 0.3089 -0.1111 0.9446
vn 0.0802 0.9661 0.2453
vn 0.0448 -0.9058 0.4214
vn 0.0998 0.3274 0.9396
vn 0.0271 -0.9665 0.2552
vn 0.1049 0.1118 0.9882
vn 0.0091 0.9963 0.0854
vn 0.0091 -0.9963 0.0854
vn 0.1049 -0.1118 0.9882
vn 0.0271 0.9665 0.2552
vn 0.0998 -0.3274 0.9396
vn 0.0448 0.9058 0.4214
vn 0.0901 -0.5212 0.8487
vn 0.0616 0.8122 0.5802
vn 0.0770 -0.6840 0.7254
vn 0.0770 0.6840 0.7254
vn 0.0616 -0.8122 0.5802
vn 0.0901 0.5212 0.8487
vn -0.0901 -0.5212 0.8487
vn -0.0616 0.8122 0.5802
vn -0.0770 -0.6840 0.7254
vn -0.0770 0.6840 0.7254
vn -0.0616 -0.8122 0.5802
vn -0.0901 0.5212 0.8487
vn -0.0448 -0.9058 0.4214
vn -0.0998 0.3274 0.9396
vn -0.0271 -0.9665 0.2552
vn -0.1049 0.1118 0.9882
vn -0.0091 0.9963 0.0854
vn -0.0091 -0.9963 0.0854
vn -0.1049 -0.1118 0.9882
vn -0.0271 0.9665 0.2552
vn -0.0998 -0.3274 0.9396
vn -0.0448 0.9058 0.4214
vn -0.0802 -0.9661 0.2453
vn -0.3089 0.1111 0.9446
vn -0.0269 0.9963 0.0821
vn -0.0269 -0.9963 0.0821
vn -0.3089 -0.1111 0.9446
vn -0.0802 0.9661 0.2453
vn -0.2939 -0.3257 0.8986
vn -0.1324 0.9048 0.4048
vn -0.2657 -0.5189 0.8125
vn -0.1821 0.8105 0.5567
vn -0.2274 -0.6818 0.6953
vn -0.2274 0.6818 0.6953
vn -0.1821 -0.8105 0.5567
vn -0.2657 0.5189 0.8125
vn -0.1324 -0.9048 0.4048
vn -0.2939 0.3257 0.8986
vn -0.2946 0.8074 0.5111
vn -0.3671 -0.6778 0.6371
vn -0.3671 0.6778 0.6371
vn -0.2946 -0.8074 0.5111
vn -0.4281 0.5147 0.7428
vn -0.2146 -0.9030 0.3723
vn -0.4726 0.3225 0.8201
vn -0.1302 -0.9654 0.2259
vn -0.4963 0.1100 0.8612
vn -0.0436 0.9962 0.0757
vn -0.0436 -0.9962 0.0757
vn -0.4963 -0.1100 0.8612
vn -0.1302 0.9654 0.2259
vn -0.4726 -0.3225 0.8201
vn -0.2146 0.9030 0.3723
vn -0.4281 -0.5147 0.7428
vn -0.6588 0.1084 0.7445
vn -0.0587 0.9961 0.0663
vn -0.0587 -0.9961 0.0663
vn -0.6588 -0.1084 0.7445
vn -0.1750 0.9645 0.1978
vn -0.6282 -0.3185 0.7099
vn -0.2880 0.9006 0.3255
vn -0.5702 -0.5095 0.6444
vn -0.3945 0.8035 0.4458
vn -0.4904 -0.6726 0.5542
vn -0.4904 0.6726 0.5542
vn -0.3945 -0.8035 0.4458
vn -0.5702 0.5095 0.6444
vn -0.2880 -0.9006 0.3255
vn -0.6282 0.3185 0.7099
vn -0.1750 -0.9645 0.1978
vn -0.5927 -0.6672 0.4511
vn -0.5927 0.6672 0.4511
vn -0.4782 -0.7993 0.3640
vn -0.6873 0.5039 0.5231
vn -0.3499 -0.8981 0.2664
vn -0.7554 0.3143 0.5750
vn -0.2131 -0.9635 0.1622
vn -0.7912 0.1069 0.6022
vn -0.0715 0.9960 0.0544
vn -0.0715 -0.9960 0.0544
vn -0.7912 -0.1069 0.6022
vn -0.2131 0.9635 0.1622
vn -0.7554 -0.3143 0.5750
vn -0.3500 0.8981 0.2664
vn -0.6873 -0.5039 0.5231
vn -0.4782 0.7993 0.3640
vn -0.0816 0.9958 0.0404
vn -0.0816 -0.9958 0.0404
vn -0.8909 -0.1055 0.4417
vn -0.2428 0.9626 0.1204
vn -0.8516 -0.3106 0.4222
vn -0.3982 0.8958 0.1974
vn -0.7764 -0.4990 0.3849
vn -0.5429 0.7955 0.2692
vn -0.6712 -0.6623 0.3328
vn -0.6712 0.6623 0.3328
vn -0.5429 -0.7955 0.2692
vn -0.7764 0.4990 0.3849
vn -0.3982 -0.8958 0.1974
vn -0.8516 0.3106 0.4222
vn -0.2428 -0.9626 0.1204
vn -0.8909 0.1055 0.4417
vn -0.7243 0.6587 0.2038
vn -0.5870 -0.7926 0.1651
vn -0.8363 0.4953 0.2353
vn -0.4313 -0.8940 0.1213
vn -0.9159 0.3079 0.2577
vn -0.2633 -0.9619 0.0741
vn -0.9574 0.1045 0.2693
vn -0.0885 0.9958 0.0249
vn -0.0885 -0.9958 0.0249
vn -0.9574 -0.1045 0.2693
vn -0.2633 0.9619 0.0741
vn -0.9159 -0.3079 0.2577
vn -0.4313 0.8940 0.1213
vn -0.8363 -0.4953 0.2353
vn -0.5870 0.7926 0.1651
vn -0.7243 -0.6587 0.2038
vn -0.9905 -0.1039 0.0905
vn -0.2737 0.9615 0.0250
vn -0.9480 -0.3064 0.0866
vn -0.4480 0.8931 0.0409
vn -0.8662 -0.4933 0.0791
vn -0.6093 0.7910 0.0557
vn -0.7510 -0.6567 0.0686
vn -0.7510 0.6567 0.0686
vn -0.6093 -0.7910 0.0557
vn -0.8662 0.4933 0.0791
vn -0.4480 -0.8931 0.0409
vn -0.9480 0.3064 0.0866
vn -0.2737 -0.9615 0.0250
vn -0.9905 0.1039 0.0905
vn -0.0920 0.9957 0.0084
vn -0.0920 -0.9957 0.0084
vn -0.6093 -0.7910 -0.0557
vn -0.8662 0.4933 -0.0791
vn -0.4480 -0.8931 -0.0409
vn -0.9480 0.3064 -0.0866
vn -0.2737 -0.9615 -0.0250
vn -0.9905 0.1039 -0.0905
vn -0.0920 0.9957 -0.0084
vn -0.0920 -0.9957 -0.0084
vn -0.9905 -0.1039 -0.0905
vn -0.2737 0.9615 -0.0250
vn -0.9480 -0.3064 -0.0866
vn -0.4480 0.8931 -0.0409
vn -0.8662 -0.4933 -0.0791
vn -0.6093 0.7910 -0.0557
vn -0.7510 -0.6567 -0.0686
vn -0.7510 0.6567 -0.0686
vn -0.9159 -0.3079 -0.2577
vn -0.4313 0.8940 -0.1213
vn -0.8363 -0.4953 -0.2353
vn -0.5870 0.7926 -0.1651
vn -0.7243 -0.6587 -0.2038
vn -0.7243 0.6587 -0.2038
vn -0.5870 -0.7926 -0.1651
vn -0.8363 0.4953 -0.2353
vn -0.4313 -0.8940 -0.1213
vn -0.9159 0.3079 -0.2577
vn -0.2633 -0.9619 -0.0741
vn -0.9574 0.1045 -0.2693
vn -0.0885 0.9958 -0.0249
vn -0.0885 -0.9958 -0.0249
vn -0.9574 -0.1045 -0.2693
vn -0.2633 0.9619 -0.0741
vn -0.3982 -0.8958 -0.1974
vn -0.8516 0.3106 -0.4222
vn -0.2428 -0.9626 -0.1204
vn -0.8909 0.1055 -0.4417
vn -0.0816 0.9958 -0.0404
vn -0.0816 -0.9958 -0.0404
vn -0.8909 -0.1055 -0.4417
vn -0.2428 0.9626 -0.1204
vn -0.8516 -0.3106 -0.4222
vn -0.3982 0.8958 -0.1974
vn -0.7764 -0.4990 -0.3849
vn -0.5429 0.7955 -0.2692
vn -0.6712 -0.6623 -0.3328
vn -0.6712 0.6623 -0.3328
vn -0.5429 -0.7955 -0.2692
vn -0.7764 0.4990 -0.3849
vn -0.3500 0.8981 -0.2664
vn -0.6873 -0.5039 -0.5231
vn -0.4782 0.7993 -0.3640
vn -0.5927 -0.6672 -0.4511
vn -0.5927 0.6672 -0.4511
vn -0.4782 -0.7993 -0.3640
vn -0.6873 0.5039 -0.5231
vn -0.3499 -0.8981 -0.2664
vn -0.7554 0.3143 -0.5750
vn -0.2131 -0.9635 -0.1622
vn -0.7912 0.1069 -0.6022
vn -0.0715 0.9960 -0.0544
vn -0.0715 -0.9960 -0.0544
vn -0.7912 -0.1069 -0.6022
vn -0.2131 0.9635 -0.1622
vn -0.7554 -0.3143 -0.5750
vn -0.6282 0.3185 -0.7099
vn -0.1750 -0.9645 -0.1978
vn -0.6588 0.1084 -0.7445
vn -0.0587 0.9961 -0.0663
vn -0.0587 -0.9961 -0.0663
vn -0.6588 -0.1084 -0.7445
vn -0.1750 0.9645 -0.1978
vn -0.6282 -0.3185 -0.7099
vn -0.2880 0.9006 -0.3255
vn -0.5702 -0.5095 -0.6444
vn -0.3945 0.8035 -0.4458
vn -0.4904 -0.6726 -0.5542
vn -0.4904 0.6726 -0.5542
vn -0.3945 -0.8035 -0.4458
vn -0.5702 0.5095 -0.6444
vn -0.2880 -0.9006 -0.3255
vn -0.4281 -0.5147 -0.7428
vn -0.2946 0.8074 -0.5111
vn -0.3671 -0.6778 -0.6371
vn -0.3671 0.6778 -0.6371
vn -0.2946 -0.8074 -0.5111
vn -0.4281 0.5147 -0.7428
vn -0.2146 -0.9030 -0.3723
vn -0.4726 0.3225 -0.8201
vn -0.1302 -0.9654 -0.2259
vn -0.4963 0.1100 -0.8612
vn -0.0436 0.9962 -0.0757
vn -0.0436 -0.9962 -0.0757
vn -0.4963 -0.1100 -0.8612
vn -0.1302 0.9654 -0.2259
vn -0.4726 -0.3225 -0.8201
vn -0.2146 0.9030 -0.3723
vn -0.0802 -0.9661 -0.2453
vn -0.3089 0.1111 -0.9446
vn -0.0269 0.9963 -0.0821
vn -0.0269 -0.9963 -0.0821
vn -0.3089 -0.1111 -0.9446
vn -0.0802 0.9661 -0.2453
vn -0.2939 -0.3257 -0.8986
vn -0.1324 0.9048 -0.4048
vn -0.2657 -0.5189 -0.8125
vn -0.1821 0.8105 -0.5567
vn -0.2274 -0.6818 -0.6953
vn -0.2274 0.6818 -0.6953
vn -0.1821 -0.8105 -0.5567
vn -0.2657 0.5189 -0.8125
vn -0.1324 -0.9048 -0.4048
vn -0.2939 0.3257 -0.8986
vn -0.0616 0.8122 -0.5802
vn -0.0770 -0.6840 -0.7254
vn -0.0770 0.6840 -0.7254
vn -0.0616 -0.8122 -0.5802
vn -0.0901 0.5212 -0.8487
vn -0.0448 -0.9058 -0.4214
vn -0.0998 0.3274 -0.9396
vn -0.0271 -0.9665 -0.2552
vn -0.1049 0.1118 -0.9882
vn -0.0091 0.9963 -0.0854
vn -0.0091 -0.9963 -0.0854
vn -0.1049 -0.1118 -0.9882
vn -0.0271 0.9665 -0.2552
vn -0.0998 -0.3274 -0.9396
vn -0.0448 0.9058 -0.4214
vn -0.0901 -0.5212 -0.8487
vn 0.0447 -0.9058 -0.4214
vn -0.3500 -0.8981 0.2664
vn -0.3499 0.8981 0.2664
vn -0.3499 0.8981 -0.2664
vn -0.3500 -0.8981 -0.2664
vt 0.750000 0.375000
vt 0.718750 0.312500
vt 0.750000 0.312500
vt 0.750000 0.750000
vt 0.718750 0.812500
vt 0.718750 0.750000
vt 0.718750 0.250000
vt 0.750000 0.250000
vt 0.750000 0.687500
vt 0.718750 0.687500
vt 0.718750 0.187500
vt 0.750000 0.187500
vt 0.750000 0.625000
vt 0.718750 0.625000
vt 0.718750 0.125000
vt 0.750000 0.125000
vt 0.750000 0.562500
vt 0.718750 0.562500
vt 0.718750 0.062500
vt 0.750000 0.062500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.734375 0.000000
vt 0.718750 0.437500
vt 0.750000 0.437500
vt 0.750000 0.875000
vt 0.718750 0.875000
vt 0.718750 0.375000
vt 0.750000 0.812500
vt 0.687500 0.375000
vt 0.687500 0.812500
vt 0.687500 0.312500
vt 0.687500 0.750000
vt 0.687500 0.250000
vt 0.687500 0.687500
vt 0.687500 0.187500
vt 0.687500 0.625000
vt 0.687500 0.125000
vt 0.687500 0.562500
vt 0.687500 0.062500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.703125 0.000000
vt 0.687500 0.437500
vt 0.687500 0.875000
vt 0.656250 0.125000
vt 0.656250 0.562500
vt 0.656250 0.062500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.671875 0.000000
vt 0.656250 0.437500
vt 0.656250 0.875000
vt 0.656250 0.375000
vt 0.656250 0.812500
vt 0.656250 0.312500
vt 0.656250 0.750000
vt 0.656250 0.250000
vt 0.656250 0.687500
vt 0.656250 0.187500
vt 0.656250 0.625000
vt 0.625000 0.812500
vt 0.625000 0.375000
vt 0.625000 0.312500
vt 0.625000 0.750000
vt 0.625000 0.250000
vt 0.625000 0.687500
vt 0.625000 0.187500
vt 0.625000 0.625000
vt 0.625000 0.125000
vt 0.625000 0.562500
vt 0.625000 0.062500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.640625 0.000000
vt 0.625000 0.437500
vt 0.625000 0.875000
vt 0.593750 0.625000
vt 0.593750 0.562500
vt 0.593750 0.062500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.609375 0.000000
vt 0.593750 0.437500
vt 0.593750 0.875000
vt 0.593750 0.375000
vt 0.593750 0.812500
vt 0.593750 0.312500
vt 0.593750 0.750000
vt 0.593750 0.250000
vt 0.593750 0.687500
vt 0.593750 0.187500
vt 0.593750 0.125000
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.562500 0.750000
vt 0.562500 0.250000
vt 0.562500 0.687500
vt 0.562500 0.187500
vt 0.562500 0.625000
vt 0.562500 0.125000
vt 0.562500 0.562500
vt 0.562500 0.062500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.578125 0.000000
vt 0.562500 0.437500
vt 0.562500 0.875000
vt 0.562500 0.812500
vt 0.531250 0.062500
vt 0.531250 0.562500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.546875 0.000000
vt 0.531250 0.437500
vt 0.531250 0.875000
vt 0.531250 0.375000
vt 0.531250 0.812500
vt 0.531250 0.312500
vt 0.531250 0.750000
vt 0.531250 0.250000
vt 0.531250 0.687500
vt 0.531250 0.187500
vt 0.531250 0.625000
vt 0.531250 0.125000
vt 0.500000 0.250000
vt 0.500000 0.750000
vt 0.500000 0.687500
vt 0.500000 0.187500
vt 0.500000 0.625000
vt 0.500000 0.125000
vt 0.500000 0.562500
vt 0.500000 0.062500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.515625 0.000000
vt 0.500000 0.437500
vt 0.500000 0.875000
vt 0.500000 0.375000
vt 0.500000 0.812500
vt 0.500000 0.312500
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.484375 0.000000
vt 0.468750 0.062500
vt 0.468750 0.437500
vt 0.468750 0.875000
vt 0.468750 0.375000
vt 0.468750 0.812500
vt 0.468750 0.312500
vt 0.468750 0.750000
vt 0.468750 0.250000
vt 0.468750 0.687500
vt 0.468750 0.187500
vt 0.468750 0.625000
vt 0.468750 0.125000
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.437500 0.750000
vt 0.437500 0.687500
vt 0.437500 0.250000
vt 0.437500 0.187500
vt 0.437500 0.625000
vt 0.437500 0.125000
vt 0.437500 0.562500
vt 0.437500 0.062500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.453125 0.000000
vt 0.437500 0.437500
vt 0.437500 0.875000
vt 0.437500 0.375000
vt 0.437500 0.812500
vt 0.437500 0.312500
vt 0.406250 0.437500
vt 0.406250 0.875000
vt 0.406250 0.375000
vt 0.406250 0.812500
vt 0.406250 0.312500
vt 0.406250 0.750000
vt 0.406250 0.250000
vt 0.406250 0.687500
vt 0.406250 0.187500
vt 0.406250 0.625000
vt 0.406250 0.125000
vt 0.406250 0.562500
vt 0.406250 0.062500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.421875 0.000000
vt 0.375000 0.187500
vt 0.375000 0.625000
vt 0.375000 0.125000
vt 0.375000 0.562500
vt 0.375000 0.062500
vt 0.375000 0.500000
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.390625 0.000000
vt 0.375000 0.437500
vt 0.375000 0.875000
vt 0.375000 0.375000
vt 0.375000 0.812500
vt 0.375000 0.312500
vt 0.375000 0.750000
vt 0.375000 0.250000
vt 0.375000 0.687500
vt 0.343750 0.937500
vt 0.343750 0.875000
vt 0.343750 0.375000
vt 0.343750 0.812500
vt 0.343750 0.312500
vt 0.343750 0.750000
vt 0.343750 0.250000
vt 0.343750 0.687500
vt 0.343750 0.187500
vt 0.343750 0.625000
vt 0.343750 0.125000
vt 0.343750 0.562500
vt 0.343750 0.062500
vt 0.343750 0.500000
vt 0.359375 1.000000
vt 0.359375 0.000000
vt 0.343750 0.437500
vt 0.312500 0.625000
vt 0.312500 0.125000
vt 0.312500 0.562500
vt 0.312500 0.062500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.312500 0.937500
vt 0.328125 0.000000
vt 0.312500 0.437500
vt 0.312500 0.875000
vt 0.312500 0.375000
vt 0.312500 0.812500
vt 0.312500 0.312500
vt 0.312500 0.750000
vt 0.312500 0.250000
vt 0.312500 0.687500
vt 0.312500 0.187500
vt 0.281250 0.375000
vt 0.281250 0.812500
vt 0.281250 0.312500
vt 0.281250 0.750000
vt 0.281250 0.250000
vt 0.281250 0.687500
vt 0.281250 0.187500
vt 0.281250 0.625000
vt 0.281250 0.125000
vt 0.281250 0.562500
vt 0.281250 0.062500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.296875 0.000000
vt 0.281250 0.437500
vt 0.281250 0.875000
vt 0.250000 0.125000
vt 0.250000 0.625000
vt 0.250000 0.562500
vt 0.250000 0.062500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.265625 0.000000
vt 0.250000 0.437500
vt 0.250000 0.875000
vt 0.250000 0.375000
vt 0.250000 0.812500
vt 0.250000 0.312500
vt 0.250000 0.750000
vt 0.250000 0.250000
vt 0.250000 0.687500
vt 0.250000 0.187500
vt 0.218750 0.375000
vt 0.218750 0.312500
vt 0.218750 0.750000
vt 0.218750 0.250000
vt 0.218750 0.687500
vt 0.218750 0.187500
vt 0.218750 0.625000
vt 0.218750 0.125000
vt 0.218750 0.562500
vt 0.218750 0.062500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.234375 0.000000
vt 0.218750 0.437500
vt 0.218750 0.875000
vt 0.218750 0.812500
vt 0.187500 0.062500
vt 0.187500 0.562500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.203125 0.000000
vt 0.187500 0.437500
vt 0.187500 0.875000
vt 0.187500 0.375000
vt 0.187500 0.812500
vt 0.187500 0.312500
vt 0.187500 0.750000
vt 0.187500 0.250000
vt 0.187500 0.687500
vt 0.187500 0.187500
vt 0.187500 0.625000
vt 0.187500 0.125000
vt 0.156250 0.750000
vt 0.156250 0.250000
vt 0.156250 0.687500
vt 0.156250 0.187500
vt 0.156250 0.625000
vt 0.156250 0.125000
vt 0.156250 0.562500
vt 0.156250 0.062500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.171875 0.000000
vt 0.156250 0.437500
vt 0.156250 0.875000
vt 0.156250 0.375000
vt 0.156250 0.812500
vt 0.156250 0.312500
vt 0.125000 0.562500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.140625 0.000000
vt 0.125000 0.062500
vt 0.125000 0.437500
vt 0.125000 0.875000
vt 0.125000 0.375000
vt 0.125000 0.812500
vt 0.125000 0.312500
vt 0.125000 0.750000
vt 0.125000 0.250000
vt 0.125000 0.687500
vt 0.125000 0.187500
vt 0.125000 0.625000
vt 0.125000 0.125000
vt 0.093750 0.250000
vt 0.093750 0.750000
vt 0.093750 0.687500
vt 0.093750 0.187500
vt 0.093750 0.625000
vt 0.093750 0.125000
vt 0.093750 0.562500
vt 0.093750 0.062500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.109375 0.000000
vt 0.093750 0.437500
vt 0.093750 0.875000
vt 0.093750 0.375000
vt 0.093750 0.812500
vt 0.093750 0.312500
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.078125 0.000000
vt 0.062500 0.062500
vt 0.062500 0.437500
vt 0.062500 0.875000
vt 0.062500 0.375000
vt 0.062500 0.812500
vt 0.062500 0.312500
vt 0.062500 0.750000
vt 0.062500 0.250000
vt 0.062500 0.687500
vt 0.062500 0.187500
vt 0.062500 0.625000
vt 0.062500 0.125000
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.031250 0.750000
vt 0.031250 0.687500
vt 0.031250 0.250000
vt 0.031250 0.187500
vt 0.031250 0.625000
vt 0.031250 0.125000
vt 0.031250 0.562500
vt 0.031250 0.062500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.046875 0.000000
vt 0.031250 0.437500
vt 0.031250 0.875000
vt 0.031250 0.375000
vt 0.031250 0.812500
vt 0.031250 0.312500
vt 0.000000 0.437500
vt 0.000000 0.875000
vt 0.000000 0.375000
vt 0.000000 0.812500
vt 0.000000 0.312500
vt 0.000000 0.750000
vt 0.000000 0.250000
vt 0.000000 0.687500
vt 0.000000 0.187500
vt 0.000000 0.625000
vt 0.000000 0.125000
vt 0.000000 0.562500
vt 0.000000 0.062500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.015625 0.000000
vt 1.000000 0.187500
vt 0.968750 0.250000
vt 0.968750 0.187500
vt 1.000000 0.687500
vt 0.968750 0.625000
vt 1.000000 0.625000
vt 0.968750 0.125000
vt 1.000000 0.125000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 0.968750 0.062500
vt 1.000000 0.062500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.968750 0.937500
vt 0.984375 0.000000
vt 0.968750 0.437500
vt 1.000000 0.437500
vt 0.968750 0.875000
vt 1.000000 0.875000
vt 0.968750 0.375000
vt 1.000000 0.375000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 1.000000 0.312500
vt 0.968750 0.312500
vt 0.968750 0.750000
vt 1.000000 0.750000
vt 1.000000 0.250000
vt 0.968750 0.687500
vt 0.937500 0.375000
vt 0.937500 0.875000
vt 0.937500 0.812500
vt 0.937500 0.312500
vt 0.937500 0.750000
vt 0.937500 0.250000
vt 0.937500 0.687500
vt 0.937500 0.187500
vt 0.937500 0.625000
vt 0.937500 0.125000
vt 0.937500 0.562500
vt 0.937500 0.062500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.953125 0.000000
vt 0.937500 0.437500
vt 0.906250 0.187500
vt 0.906250 0.125000
vt 0.906250 0.625000
vt 0.906250 0.562500
vt 0.906250 0.062500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.906250 0.937500
vt 0.921875 0.000000
vt 0.906250 0.437500
vt 0.906250 0.875000
vt 0.906250 0.375000
vt 0.906250 0.812500
vt 0.906250 0.312500
vt 0.906250 0.750000
vt 0.906250 0.250000
vt 0.906250 0.687500
vt 0.875000 0.875000
vt 0.875000 0.812500
vt 0.875000 0.375000
vt 0.875000 0.312500
vt 0.875000 0.750000
vt 0.875000 0.250000
vt 0.875000 0.687500
vt 0.875000 0.187500
vt 0.875000 0.625000
vt 0.875000 0.125000
vt 0.875000 0.562500
vt 0.875000 0.062500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.890625 0.000000
vt 0.875000 0.437500
vt 0.843750 0.625000
vt 0.843750 0.562500
vt 0.843750 0.062500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.859375 0.000000
vt 0.843750 0.437500
vt 0.843750 0.875000
vt 0.843750 0.375000
vt 0.843750 0.812500
vt 0.843750 0.312500
vt 0.843750 0.750000
vt 0.843750 0.250000
vt 0.843750 0.687500
vt 0.843750 0.187500
vt 0.843750 0.125000
vt 0.812500 0.375000
vt 0.812500 0.312500
vt 0.812500 0.750000
vt 0.812500 0.250000
vt 0.812500 0.687500
vt 0.812500 0.187500
vt 0.812500 0.625000
vt 0.812500 0.125000
vt 0.812500 0.562500
vt 0.812500 0.062500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.828125 0.000000
vt 0.812500 0.437500
vt 0.812500 0.875000
vt 0.812500 0.812500
vt 0.781250 0.125000
vt 0.781250 0.062500
vt 0.781250 0.562500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.796875 0.000000
vt 0.781250 0.437500
vt 0.781250 0.875000
vt 0.781250 0.375000
vt 0.781250 0.812500
vt 0.781250 0.312500
vt 0.781250 0.750000
vt 0.781250 0.250000
vt 0.781250 0.687500
vt 0.781250 0.187500
vt 0.781250 0.625000
vt 0.765625 1.000000
vt 0.765625 0.000000
s 0
f 478/1/1 17/2/1 6/3/1
f 476/4/2 9/5/2 10/6/2
f 6/3/3 18/7/3 479/8/3
f 2/9/4 10/6/4 11/10/4
f 479/8/5 19/11/5 480/12/5
f 477/13/6 11/10/6 12/14/6
f 480/12/7 20/15/7 481/16/7
f 3/17/8 12/14/8 13/18/8
f 481/16/9 21/19/9 482/20/9
f 3/17/10 14/21/10 4/22/10
f 474/23/11 82/24/11 7/25/11
f 308/26/12 482/20/12 21/19/12
f 4/22/13 15/27/13 5/28/13
f 475/29/14 7/25/14 8/30/14
f 5/28/15 16/31/15 478/1/15
f 1/32/16 8/30/16 9/5/16
f 15/27/17 31/33/17 16/31/17
f 8/30/18 24/34/18 9/5/18
f 16/31/19 32/35/19 17/2/19
f 9/5/20 25/36/20 10/6/20
f 17/2/21 33/37/21 18/7/21
f 10/6/22 26/38/22 11/10/22
f 19/11/23 33/37/23 34/39/23
f 11/10/24 27/40/24 12/14/24
f 20/15/25 34/39/25 35/41/25
f 13/18/26 27/40/26 28/42/26
f 20/15/27 36/43/27 21/19/27
f 13/18/28 29/44/28 14/21/28
f 7/25/29 82/45/29 22/46/29
f 308/47/30 21/19/30 36/43/30
f 14/21/31 30/48/31 15/27/31
f 7/25/32 23/49/32 8/30/32
f 34/39/33 50/50/33 35/41/33
f 27/40/34 43/51/34 28/42/34
f 35/41/35 51/52/35 36/43/35
f 28/42/36 44/53/36 29/44/36
f 22/46/37 82/54/37 37/55/37
f 308/56/38 36/43/38 51/52/38
f 30/48/39 44/53/39 45/57/39
f 22/46/40 38/58/40 23/49/40
f 30/48/41 46/59/41 31/33/41
f 23/49/42 39/60/42 24/34/42
f 32/35/43 46/59/43 47/61/43
f 24/34/44 40/62/44 25/36/44
f 33/37/45 47/61/45 48/63/45
f 25/36/46 41/64/46 26/38/46
f 34/39/47 48/63/47 49/65/47
f 26/38/48 42/66/48 27/40/48
f 38/58/49 54/67/49 39/60/49
f 47/61/50 61/68/50 62/69/50
f 39/60/51 55/70/51 40/62/51
f 47/61/52 63/71/52 48/63/52
f 40/62/53 56/72/53 41/64/53
f 48/63/54 64/73/54 49/65/54
f 41/64/55 57/74/55 42/66/55
f 49/65/56 65/75/56 50/50/56
f 43/51/57 57/74/57 58/76/57
f 50/50/58 66/77/58 51/52/58
f 44/53/59 58/76/59 59/78/59
f 37/55/60 82/79/60 52/80/60
f 308/81/61 51/52/61 66/77/61
f 44/53/62 60/82/62 45/57/62
f 37/55/63 53/83/63 38/58/63
f 45/57/64 61/68/64 46/59/64
f 58/76/65 72/84/65 73/85/65
f 65/75/66 81/86/66 66/77/66
f 59/78/67 73/85/67 74/87/67
f 52/80/68 82/88/68 67/89/68
f 308/90/69 66/77/69 81/86/69
f 59/78/70 75/91/70 60/82/70
f 52/80/71 68/92/71 53/83/71
f 60/82/72 76/93/72 61/68/72
f 53/83/73 69/94/73 54/67/73
f 62/69/74 76/93/74 77/95/74
f 54/67/75 70/96/75 55/70/75
f 62/69/76 78/97/76 63/71/76
f 56/72/77 70/96/77 71/98/77
f 63/71/78 79/99/78 64/73/78
f 56/72/79 72/84/79 57/74/79
f 65/75/80 79/99/80 80/100/80
f 77/95/81 92/101/81 93/102/81
f 69/94/82 86/103/82 70/96/82
f 77/95/83 94/104/83 78/97/83
f 71/98/84 86/103/84 87/105/84
f 78/97/85 95/106/85 79/99/85
f 71/98/86 88/107/86 72/84/86
f 79/99/87 96/108/87 80/100/87
f 73/85/88 88/107/88 89/109/88
f 81/86/89 96/108/89 97/110/89
f 74/87/90 89/109/90 90/111/90
f 67/89/91 82/112/91 83/113/91
f 308/114/92 81/86/92 97/110/92
f 74/87/93 91/115/93 75/91/93
f 67/89/94 84/116/94 68/92/94
f 75/91/95 92/101/95 76/93/95
f 68/92/96 85/117/96 69/94/96
f 96/108/97 112/118/97 97/110/97
f 90/111/98 104/119/98 105/120/98
f 83/113/99 82/121/99 98/122/99
f 308/123/100 97/110/100 112/118/100
f 90/111/101 106/124/101 91/115/101
f 84/116/102 98/122/102 99/125/102
f 91/115/103 107/126/103 92/101/103
f 84/116/104 100/127/104 85/117/104
f 93/102/105 107/126/105 108/128/105
f 85/117/106 101/129/106 86/103/106
f 93/102/107 109/130/107 94/104/107
f 87/105/108 101/129/108 102/131/108
f 95/106/109 109/130/109 110/132/109
f 87/105/110 103/133/110 88/107/110
f 96/108/111 110/132/111 111/134/111
f 89/109/112 103/133/112 104/119/112
f 108/128/113 124/135/113 109/130/113
f 102/131/114 116/136/114 117/137/114
f 110/132/115 124/135/115 125/138/115
f 102/131/116 118/139/116 103/133/116
f 110/132/117 126/140/117 111/134/117
f 104/119/118 118/139/118 119/141/118
f 111/134/119 127/142/119 112/118/119
f 105/120/120 119/141/120 120/143/120
f 98/122/121 82/144/121 113/145/121
f 308/146/122 112/118/122 127/142/122
f 105/120/123 121/147/123 106/124/123
f 99/125/124 113/145/124 114/148/124
f 106/124/125 122/149/125 107/126/125
f 99/125/126 115/150/126 100/127/126
f 108/128/127 122/149/127 123/151/127
f 100/127/128 116/136/128 101/129/128
f 113/145/129 82/152/129 128/153/129
f 308/154/130 127/142/130 142/155/130
f 120/143/131 136/156/131 121/147/131
f 113/145/132 129/157/132 114/148/132
f 121/147/133 137/158/133 122/149/133
f 114/148/134 130/159/134 115/150/134
f 123/151/135 137/158/135 138/160/135
f 115/150/136 131/161/136 116/136/136
f 123/151/137 139/162/137 124/135/137
f 117/137/138 131/161/138 132/163/138
f 125/138/139 139/162/139 140/164/139
f 117/137/140 133/165/140 118/139/140
f 125/138/141 141/166/141 126/140/141
f 119/141/142 133/165/142 134/167/142
f 126/140/143 142/155/143 127/142/143
f 120/143/144 134/167/144 135/168/144
f 132/163/145 146/169/145 147/170/145
f 140/164/146 154/171/146 155/172/146
f 132/163/147 148/173/147 133/165/147
f 140/164/148 156/174/148 141/166/148
f 134/167/149 148/173/149 149/175/149
f 142/155/150 156/174/150 157/176/150
f 135/168/151 149/175/151 150/177/151
f 128/153/152 82/178/152 143/179/152
f 308/180/153 142/155/153 157/176/153
f 135/168/154 151/181/154 136/156/154
f 128/153/155 144/182/155 129/157/155
f 136/156/156 152/183/156 137/158/156
f 130/159/157 144/182/157 145/184/157
f 138/160/158 152/183/158 153/185/158
f 130/159/159 146/169/159 131/161/159
f 138/160/160 154/171/160 139/162/160
f 150/177/161 166/186/161 151/181/161
f 143/179/162 159/187/162 144/182/162
f 151/181/163 167/188/163 152/183/163
f 145/184/164 159/187/164 160/189/164
f 153/185/165 167/188/165 168/190/165
f 145/184/166 161/191/166 146/169/166
f 153/185/167 169/192/167 154/171/167
f 147/170/168 161/191/168 162/193/168
f 155/172/169 169/192/169 170/194/169
f 147/170/170 163/195/170 148/173/170
f 155/172/171 171/196/171 156/174/171
f 149/175/172 163/195/172 164/197/172
f 156/174/173 172/198/173 157/176/173
f 150/177/174 164/197/174 165/199/174
f 143/179/175 82/200/175 158/201/175
f 308/202/176 157/176/176 172/198/176
f 169/192/177 185/203/177 170/194/177
f 162/193/178 178/204/178 163/195/178
f 171/196/179 185/203/179 186/205/179
f 164/197/180 178/204/180 179/206/180
f 172/198/181 186/205/181 187/207/181
f 165/199/182 179/206/182 180/208/182
f 158/201/183 82/209/183 173/210/183
f 308/211/184 172/198/184 187/207/184
f 165/199/185 181/212/185 166/186/185
f 158/201/186 174/213/186 159/187/186
f 166/186/187 182/214/187 167/188/187
f 159/187/188 175/215/188 160/189/188
f 168/190/189 182/214/189 183/216/189
f 160/189/190 176/217/190 161/191/190
f 168/190/191 184/218/191 169/192/191
f 162/193/192 176/217/192 177/219/192
f 174/213/193 188/220/193 189/221/193
f 181/212/194 197/222/194 182/214/194
f 175/215/195 189/221/195 190/223/195
f 183/216/196 197/222/196 198/224/196
f 175/215/197 191/225/197 176/217/197
f 183/216/198 199/226/198 184/218/198
f 177/219/199 191/225/199 192/227/199
f 185/203/200 199/226/200 200/228/200
f 177/219/201 193/229/201 178/204/201
f 185/203/202 201/230/202 186/205/202
f 179/206/203 193/229/203 194/231/203
f 186/205/204 202/232/204 187/207/204
f 180/208/205 194/231/205 195/233/205
f 173/210/206 82/234/206 188/220/206
f 308/235/207 187/207/207 202/232/207
f 180/208/208 196/236/208 181/212/208
f 192/227/209 208/237/209 193/229/209
f 200/228/210 216/238/210 201/230/210
f 194/231/211 208/237/211 209/239/211
f 201/230/212 217/240/212 202/232/212
f 195/233/213 209/239/213 210/241/213
f 188/220/214 82/242/214 203/243/214
f 308/244/215 202/232/215 217/240/215
f 195/233/216 211/245/216 196/236/216
f 188/220/217 204/246/217 189/221/217
f 196/236/218 212/247/218 197/222/218
f 189/221/219 205/248/219 190/223/219
f 198/224/220 212/247/220 213/249/220
f 190/223/221 206/250/221 191/225/221
f 198/224/222 214/251/222 199/226/222
f 192/227/223 206/250/223 207/252/223
f 200/228/224 214/251/224 215/253/224
f 211/245/225 227/254/225 212/247/225
f 204/246/226 220/255/226 205/248/226
f 213/249/227 227/254/227 228/256/227
f 205/248/228 221/257/228 206/250/228
f 213/249/229 229/258/229 214/251/229
f 207/252/230 221/257/230 222/259/230
f 215/253/231 229/258/231 230/260/231
f 207/252/232 223/261/232 208/237/232
f 215/253/233 231/262/233 216/238/233
f 209/239/234 223/261/234 224/263/234
f 216/238/235 232/264/235 217/240/235
f 210/241/236 224/263/236 225/265/236
f 203/243/237 82/266/237 218/267/237
f 308/268/238 217/240/238 232/264/238
f 210/241/239 226/269/239 211/245/239
f 203/243/240 219/270/240 204/246/240
f 230/260/241 246/271/241 231/262/241
f 224/263/242 238/272/242 239/273/242
f 232/264/243 246/271/243 247/274/243
f 225/265/244 239/273/244 240/275/244
f 218/267/245 82/276/245 233/277/245
f 308/278/246 232/264/246 247/274/246
f 225/265/247 241/279/247 226/269/247
f 218/267/248 234/280/248 219/270/248
f 226/269/249 242/281/249 227/254/249
f 220/255/250 234/280/250 235/282/250
f 228/256/251 242/281/251 243/283/251
f 220/255/252 236/284/252 221/257/252
f 228/256/253 244/285/253 229/258/253
f 222/259/254 236/284/254 237/286/254
f 230/260/255 244/285/255 245/287/255
f 222/259/256 238/272/256 223/261/256
f 243/283/257 257/288/257 258/289/257
f 235/282/258 251/290/258 236/284/258
f 243/283/259 259/291/259 244/285/259
f 237/286/260 251/290/260 252/292/260
f 245/287/261 259/291/261 260/293/261
f 237/286/262 253/294/262 238/272/262
f 245/287/263 261/295/263 246/271/263
f 239/273/264 253/294/264 254/296/264
f 246/271/265 262/297/265 247/274/265
f 240/275/266 254/296/266 255/298/266
f 233/277/267 82/299/267 248/300/267
f 308/301/268 247/274/268 262/297/268
f 240/275/269 256/302/269 241/279/269
f 233/277/270 249/303/270 234/280/270
f 241/279/271 257/288/271 242/281/271
f 234/280/272 250/304/272 235/282/272
f 261/295/273 277/305/273 262/297/273
f 255/298/274 269/306/274 270/307/274
f 248/300/275 82/308/275 263/309/275
f 308/310/276 262/297/276 277/305/276
f 255/298/277 271/311/277 256/302/277
f 249/303/278 263/309/278 264/312/278
f 256/302/279 272/313/279 257/288/279
f 249/303/280 265/314/280 250/304/280
f 258/289/281 272/313/281 273/315/281
f 250/304/282 266/316/282 251/290/282
f 258/289/283 274/317/283 259/291/283
f 252/292/284 266/316/284 267/318/284
f 260/293/285 274/317/285 275/319/285
f 252/292/286 268/320/286 253/294/286
f 260/293/287 276/321/287 261/295/287
f 254/296/288 268/320/288 269/306/288
f 265/314/289 281/322/289 266/316/289
f 273/315/290 289/323/290 274/317/290
f 267/318/291 281/322/291 282/324/291
f 275/319/292 289/323/292 290/325/292
f 267/318/293 283/326/293 268/320/293
f 275/319/294 291/327/294 276/321/294
f 269/306/295 283/326/295 284/328/295
f 277/305/296 291/327/296 292/329/296
f 270/307/297 284/328/297 285/330/297
f 263/309/298 82/331/298 278/332/298
f 308/333/299 277/305/299 292/329/299
f 270/307/300 286/334/300 271/311/300
f 264/312/301 278/332/301 279/335/301
f 271/311/302 287/336/302 272/313/302
f 264/312/303 280/337/303 265/314/303
f 273/315/304 287/336/304 288/338/304
f 285/330/305 299/339/305 300/340/305
f 278/332/306 82/341/306 293/342/306
f 308/343/307 292/329/307 307/344/307
f 285/330/308 301/345/308 286/334/308
f 278/332/309 294/346/309 279/335/309
f 286/334/310 302/347/310 287/336/310
f 279/335/311 295/348/311 280/337/311
f 288/338/312 302/347/312 303/349/312
f 280/337/313 296/350/313 281/322/313
f 288/338/314 304/351/314 289/323/314
f 282/324/315 296/350/315 297/352/315
f 290/325/316 304/351/316 305/353/316
f 282/324/317 298/354/317 283/326/317
f 290/325/318 306/355/318 291/327/318
f 284/328/319 298/354/319 299/339/319
f 292/329/320 306/355/320 307/344/320
f 303/349/321 320/356/321 304/351/321
f 297/352/322 312/357/322 313/358/322
f 305/353/323 320/356/323 321/359/323
f 297/352/324 314/360/324 298/354/324
f 305/353/325 322/361/325 306/355/325
f 299/339/326 314/360/326 315/362/326
f 306/355/327 323/363/327 307/344/327
f 300/340/328 315/362/328 316/364/328
f 293/342/329 82/365/329 309/366/329
f 308/367/330 307/344/330 323/363/330
f 300/340/331 317/368/331 301/345/331
f 293/342/332 310/369/332 294/346/332
f 301/345/333 318/370/333 302/347/333
f 294/346/334 311/371/334 295/348/334
f 303/349/335 318/370/335 319/372/335
f 295/348/336 312/357/336 296/350/336
f 309/366/337 82/373/337 324/374/337
f 308/375/338 323/363/338 338/376/338
f 316/364/339 332/377/339 317/368/339
f 309/366/340 325/378/340 310/369/340
f 317/368/341 333/379/341 318/370/341
f 310/369/342 326/380/342 311/371/342
f 319/372/343 333/379/343 334/381/343
f 311/371/344 327/382/344 312/357/344
f 319/372/345 335/383/345 320/356/345
f 312/357/346 328/384/346 313/358/346
f 321/359/347 335/383/347 336/385/347
f 313/358/348 329/386/348 314/360/348
f 321/359/349 337/387/349 322/361/349
f 315/362/350 329/386/350 330/388/350
f 322/361/351 338/376/351 323/363/351
f 316/364/352 330/388/352 331/389/352
f 328/384/353 342/390/353 343/391/353
f 336/385/354 350/392/354 351/393/354
f 328/384/355 344/394/355 329/386/355
f 336/385/356 352/395/356 337/387/356
f 330/388/357 344/394/357 345/396/357
f 337/387/358 353/397/358 338/376/358
f 331/389/359 345/396/359 346/398/359
f 324/374/360 82/399/360 339/400/360
f 308/401/361 338/376/361 353/397/361
f 331/389/362 347/402/362 332/377/362
f 324/374/363 340/403/363 325/378/363
f 332/377/364 348/404/364 333/379/364
f 325/378/365 341/405/365 326/380/365
f 334/381/366 348/404/366 349/406/366
f 326/380/367 342/390/367 327/382/367
f 334/381/368 350/392/368 335/383/368
f 346/398/369 362/407/369 347/402/369
f 339/400/370 355/408/370 340/403/370
f 347/402/371 363/409/371 348/404/371
f 341/405/372 355/408/372 356/410/372
f 349/406/373 363/409/373 364/411/373
f 341/405/374 357/412/374 342/390/374
f 349/406/375 365/413/375 350/392/375
f 343/391/376 357/412/376 358/414/376
f 351/393/377 365/413/377 366/415/377
f 343/391/378 359/416/378 344/394/378
f 351/393/379 367/417/379 352/395/379
f 345/396/380 359/416/380 360/418/380
f 352/395/381 368/419/381 353/397/381
f 346/398/382 360/418/382 361/420/382
f 339/400/383 82/421/383 354/422/383
f 308/423/384 353/397/384 368/419/384
f 366/424/385 380/425/385 381/426/385
f 358/427/386 374/428/386 359/429/386
f 366/424/387 382/430/387 367/431/387
f 360/432/388 374/428/388 375/433/388
f 367/431/389 383/434/389 368/435/389
f 361/436/390 375/433/390 376/437/390
f 354/438/391 82/439/391 369/440/391
f 308/441/392 368/435/392 383/434/392
f 361/436/393 377/442/393 362/443/393
f 354/438/394 370/444/394 355/445/394
f 362/443/395 378/446/395 363/447/395
f 356/448/396 370/444/396 371/449/396
f 364/450/397 378/446/397 379/451/397
f 356/448/398 372/452/398 357/453/398
f 364/450/399 380/425/399 365/454/399
f 358/427/400 372/452/400 373/455/400
f 377/442/401 393/456/401 378/446/401
f 371/449/402 385/457/402 386/458/402
f 379/451/403 393/456/403 394/459/403
f 371/449/404 387/460/404 372/452/404
f 379/451/405 395/461/405 380/425/405
f 373/455/406 387/460/406 388/462/406
f 381/426/407 395/461/407 396/463/407
f 373/455/408 389/464/408 374/428/408
f 381/426/409 397/465/409 382/430/409
f 375/433/410 389/464/410 390/466/410
f 382/430/411 398/467/411 383/434/411
f 376/437/412 390/466/412 391/468/412
f 369/440/413 82/469/413 384/470/413
f 308/471/414 383/434/414 398/467/414
f 376/437/415 392/472/415 377/442/415
f 369/440/416 385/457/416 370/444/416
f 397/465/417 411/473/417 412/474/417
f 390/466/418 404/475/418 405/476/418
f 397/465/419 413/477/419 398/467/419
f 391/468/420 405/476/420 406/478/420
f 384/470/421 82/479/421 399/480/421
f 308/481/422 398/467/422 413/477/422
f 391/468/423 407/482/423 392/472/423
f 385/457/424 399/480/424 400/483/424
f 392/472/425 408/484/425 393/456/425
f 385/457/426 401/485/426 386/458/426
f 394/459/427 408/484/427 409/486/427
f 386/458/428 402/487/428 387/460/428
f 394/459/429 410/488/429 395/461/429
f 388/462/430 402/487/430 403/489/430
f 396/463/431 410/488/431 411/473/431
f 388/462/432 404/475/432 389/464/432
f 401/485/433 415/490/433 416/491/433
f 409/486/434 423/492/434 424/493/434
f 401/485/435 417/494/435 402/487/435
f 409/486/436 425/495/436 410/488/436
f 403/489/437 417/494/437 418/496/437
f 411/473/438 425/495/438 426/497/438
f 403/489/439 419/498/439 404/475/439
f 411/473/440 427/499/440 412/474/440
f 405/476/441 419/498/441 420/500/441
f 412/474/442 428/501/442 413/477/442
f 406/478/443 420/500/443 421/502/443
f 399/480/444 82/503/444 414/504/444
f 308/505/445 413/477/445 428/501/445
f 406/478/446 422/506/446 407/482/446
f 399/480/447 415/490/447 400/483/447
f 407/482/448 423/492/448 408/484/448
f 420/500/449 434/507/449 435/508/449
f 427/499/450 443/509/450 428/501/450
f 421/502/451 435/508/451 436/510/451
f 414/504/452 82/511/452 429/512/452
f 308/513/453 428/501/453 443/509/453
f 421/502/454 437/514/454 422/506/454
f 414/504/455 430/515/455 415/490/455
f 422/506/456 438/516/456 423/492/456
f 416/491/457 430/515/457 431/517/457
f 424/493/458 438/516/458 439/518/458
f 416/491/459 432/519/459 417/494/459
f 424/493/460 440/520/460 425/495/460
f 418/496/461 432/519/461 433/521/461
f 426/497/462 440/520/462 441/522/462
f 418/496/463 434/507/463 419/498/463
f 426/497/464 442/523/464 427/499/464
f 439/518/465 453/524/465 454/525/465
f 431/517/466 447/526/466 432/519/466
f 439/518/467 455/527/467 440/520/467
f 433/521/468 447/526/468 448/528/468
f 441/522/469 455/527/469 456/529/469
f 433/521/470 449/530/470 434/507/470
f 441/522/471 457/531/471 442/523/471
f 435/508/472 449/530/472 450/532/472
f 442/523/473 458/533/473 443/509/473
f 436/510/474 450/532/474 451/534/474
f 429/512/475 82/535/475 444/536/475
f 308/537/476 443/509/476 458/533/476
f 436/510/477 452/538/477 437/514/477
f 430/515/478 444/536/478 445/539/478
f 437/514/479 453/524/479 438/516/479
f 430/515/480 446/540/480 431/517/480
f 458/533/481 472/541/481 473/542/481
f 451/534/482 465/543/482 466/544/482
f 444/536/483 82/545/483 459/546/483
f 308/547/484 458/533/484 473/542/484
f 451/534/485 467/548/485 452/538/485
f 444/536/486 460/549/486 445/539/486
f 452/538/487 468/550/487 453/524/487
f 446/540/488 460/549/488 461/551/488
f 454/525/489 468/550/489 469/552/489
f 446/540/490 462/553/490 447/526/490
f 454/525/491 470/554/491 455/527/491
f 448/528/492 462/553/492 463/555/492
f 456/529/493 470/554/493 471/556/493
f 448/528/494 464/557/494 449/530/494
f 456/529/495 472/541/495 457/531/495
f 450/532/496 464/557/496 465/543/496
f 462/553/497 1/32/497 476/4/497
f 469/552/498 479/8/498 470/554/498
f 463/555/499 476/4/499 2/9/499
f 471/556/500 479/8/500 480/12/500
f 464/557/501 2/9/501 477/13/501
f 471/556/502 481/16/502 472/541/502
f 465/543/503 477/13/503 3/17/503
f 472/541/504 482/20/504 473/542/504
f 466/544/505 3/17/505 4/22/505
f 459/546/506 82/558/506 474/23/506
f 308/559/507 473/542/507 482/20/507
f 466/544/508 5/28/508 467/548/508
f 460/549/509 474/23/509 475/29/509
f 467/548/510 478/1/510 468/550/510
f 461/551/511 475/29/511 1/32/511
f 468/550/512 6/3/512 469/552/512
f 478/1/1 16/31/1 17/2/1
f 476/4/2 1/32/2 9/5/2
f 6/3/3 17/2/3 18/7/3
f 2/9/4 476/4/4 10/6/4
f 479/8/5 18/7/5 19/11/5
f 477/13/6 2/9/6 11/10/6
f 480/12/513 19/11/513 20/15/513
f 3/17/8 477/13/8 12/14/8
f 481/16/9 20/15/9 21/19/9
f 3/17/10 13/18/10 14/21/10
f 4/22/13 14/21/13 15/27/13
f 475/29/14 474/23/14 7/25/14
f 5/28/15 15/27/15 16/31/15
f 1/32/16 475/29/16 8/30/16
f 15/27/17 30/48/17 31/33/17
f 8/30/18 23/49/18 24/34/18
f 16/31/19 31/33/19 32/35/19
f 9/5/20 24/34/20 25/36/20
f 17/2/21 32/35/21 33/37/21
f 10/6/22 25/36/22 26/38/22
f 19/11/23 18/7/23 33/37/23
f 11/10/24 26/38/24 27/40/24
f 20/15/25 19/11/25 34/39/25
f 13/18/26 12/14/26 27/40/26
f 20/15/27 35/41/27 36/43/27
f 13/18/28 28/42/28 29/44/28
f 14/21/31 29/44/31 30/48/31
f 7/25/32 22/46/32 23/49/32
f 34/39/33 49/65/33 50/50/33
f 27/40/34 42/66/34 43/51/34
f 35/41/35 50/50/35 51/52/35
f 28/42/36 43/51/36 44/53/36
f 30/48/39 29/44/39 44/53/39
f 22/46/40 37/55/40 38/58/40
f 30/48/41 45/57/41 46/59/41
f 23/49/42 38/58/42 39/60/42
f 32/35/43 31/33/43 46/59/43
f 24/34/44 39/60/44 40/62/44
f 33/37/45 32/35/45 47/61/45
f 25/36/46 40/62/46 41/64/46
f 34/39/47 33/37/47 48/63/47
f 26/38/48 41/64/48 42/66/48
f 38/58/49 53/83/49 54/67/49
f 47/61/50 46/59/50 61/68/50
f 39/60/51 54/67/51 55/70/51
f 47/61/52 62/69/52 63/71/52
f 40/62/53 55/70/53 56/72/53
f 48/63/54 63/71/54 64/73/54
f 41/64/55 56/72/55 57/74/55
f 49/65/56 64/73/56 65/75/56
f 43/51/57 42/66/57 57/74/57
f 50/50/58 65/75/58 66/77/58
f 44/53/59 43/51/59 58/76/59
f 44/53/62 59/78/62 60/82/62
f 37/55/63 52/80/63 53/83/63
f 45/57/64 60/82/64 61/68/64
f 58/76/65 57/74/65 72/84/65
f 65/75/66 80/100/66 81/86/66
f 59/78/67 58/76/67 73/85/67
f 59/78/70 74/87/70 75/91/70
f 52/80/71 67/89/71 68/92/71
f 60/82/72 75/91/72 76/93/72
f 53/83/73 68/92/73 69/94/73
f 62/69/74 61/68/74 76/93/74
f 54/67/75 69/94/75 70/96/75
f 62/69/76 77/95/76 78/97/76
f 56/72/77 55/70/77 70/96/77
f 63/71/78 78/97/78 79/99/78
f 56/72/79 71/98/79 72/84/79
f 65/75/80 64/73/80 79/99/80
f 77/95/81 76/93/81 92/101/81
f 69/94/82 85/117/82 86/103/82
f 77/95/83 93/102/83 94/104/83
f 71/98/84 70/96/84 86/103/84
f 78/97/85 94/104/85 95/106/85
f 71/98/86 87/105/86 88/107/86
f 79/99/87 95/106/87 96/108/87
f 73/85/88 72/84/88 88/107/88
f 81/86/89 80/100/89 96/108/89
f 74/87/90 73/85/90 89/109/90
f 74/87/93 90/111/93 91/115/93
f 67/89/94 83/113/94 84/116/94
f 75/91/95 91/115/95 92/101/95
f 68/92/96 84/116/96 85/117/96
f 96/108/97 111/134/97 112/118/97
f 90/111/98 89/109/98 104/119/98
f 90/111/101 105/120/101 106/124/101
f 84/116/102 83/113/102 98/122/102
f 91/115/103 106/124/103 107/126/103
f 84/116/104 99/125/104 100/127/104
f 93/102/105 92/101/105 107/126/105
f 85/117/106 100/127/106 101/129/106
f 93/102/107 108/128/107 109/130/107
f 87/105/108 86/103/108 101/129/108
f 95/106/109 94/104/109 109/130/109
f 87/105/110 102/131/110 103/133/110
f 96/108/111 95/106/111 110/132/111
f 89/109/112 88/107/112 103/133/112
f 108/128/113 123/151/113 124/135/113
f 102/131/114 101/129/114 116/136/114
f 110/132/115 109/130/115 124/135/115
f 102/131/116 117/137/116 118/139/116
f 110/132/117 125/138/117 126/140/117
f 104/119/118 103/133/118 118/139/118
f 111/134/119 126/140/119 127/142/119
f 105/120/120 104/119/120 119/141/120
f 105/120/123 120/143/123 121/147/123
f 99/125/124 98/122/124 113/145/124
f 106/124/125 121/147/125 122/149/125
f 99/125/126 114/148/126 115/150/126
f 108/128/127 107/126/127 122/149/127
f 100/127/128 115/150/128 116/136/128
f 120/143/131 135/168/131 136/156/131
f 113/145/132 128/153/132 129/157/132
f 121/147/133 136/156/133 137/158/133
f 114/148/134 129/157/134 130/159/134
f 123/151/135 122/149/135 137/158/135
f 115/150/136 130/159/136 131/161/136
f 123/151/137 138/160/137 139/162/137
f 117/137/138 116/136/138 131/161/138
f 125/138/139 124/135/139 139/162/139
f 117/137/140 132/163/140 133/165/140
f 125/138/141 140/164/141 141/166/141
f 119/141/142 118/139/142 133/165/142
f 126/140/143 141/166/143 142/155/143
f 120/143/144 119/141/144 134/167/144
f 132/163/145 131/161/145 146/169/145
f 140/164/146 139/162/146 154/171/146
f 132/163/147 147/170/147 148/173/147
f 140/164/148 155/172/148 156/174/148
f 134/167/149 133/165/149 148/173/149
f 142/155/150 141/166/150 156/174/150
f 135/168/151 134/167/151 149/175/151
f 135/168/154 150/177/154 151/181/154
f 128/153/155 143/179/155 144/182/155
f 136/156/156 151/181/156 152/183/156
f 130/159/157 129/157/157 144/182/157
f 138/160/158 137/158/158 152/183/158
f 130/159/159 145/184/159 146/169/159
f 138/160/160 153/185/160 154/171/160
f 150/177/161 165/199/161 166/186/161
f 143/179/162 158/201/162 159/187/162
f 151/181/163 166/186/163 167/188/163
f 145/184/164 144/182/164 159/187/164
f 153/185/165 152/183/165 167/188/165
f 145/184/166 160/189/166 161/191/166
f 153/185/167 168/190/167 169/192/167
f 147/170/168 146/169/168 161/191/168
f 155/172/169 154/171/169 169/192/169
f 147/170/170 162/193/170 163/195/170
f 155/172/171 170/194/171 171/196/171
f 149/175/172 148/173/172 163/195/172
f 156/174/173 171/196/173 172/198/173
f 150/177/174 149/175/174 164/197/174
f 169/192/177 184/218/177 185/203/177
f 162/193/178 177/219/178 178/204/178
f 171/196/179 170/194/179 185/203/179
f 164/197/180 163/195/180 178/204/180
f 172/198/181 171/196/181 186/205/181
f 165/199/182 164/197/182 179/206/182
f 165/199/185 180/208/185 181/212/185
f 158/201/186 173/210/186 174/213/186
f 166/186/187 181/212/187 182/214/187
f 159/187/188 174/213/188 175/215/188
f 168/190/189 167/188/189 182/214/189
f 160/189/190 175/215/190 176/217/190
f 168/190/191 183/216/191 184/218/191
f 162/193/192 161/191/192 176/217/192
f 174/213/193 173/210/193 188/220/193
f 181/212/194 196/236/194 197/222/194
f 175/215/195 174/213/195 189/221/195
f 183/216/196 182/214/196 197/222/196
f 175/215/197 190/223/197 191/225/197
f 183/216/198 198/224/198 199/226/198
f 177/219/199 176/217/199 191/225/199
f 185/203/200 184/218/200 199/226/200
f 177/219/201 192/227/201 193/229/201
f 185/203/202 200/228/202 201/230/202
f 179/206/203 178/204/203 193/229/203
f 186/205/204 201/230/204 202/232/204
f 180/208/205 179/206/205 194/231/205
f 180/208/208 195/233/208 196/236/208
f 192/227/209 207/252/209 208/237/209
f 200/228/210 215/253/210 216/238/210
f 194/231/211 193/229/211 208/237/211
f 201/230/212 216/238/212 217/240/212
f 195/233/213 194/231/213 209/239/213
f 195/233/216 210/241/216 211/245/216
f 188/220/217 203/243/217 204/246/217
f 196/236/218 211/245/218 212/247/218
f 189/221/219 204/246/219 205/248/219
f 198/224/220 197/222/220 212/247/220
f 190/223/221 205/248/221 206/250/221
f 198/224/222 213/249/222 214/251/222
f 192/227/223 191/225/223 206/250/223
f 200/228/224 199/226/224 214/251/224
f 211/245/225 226/269/225 227/254/225
f 204/246/226 219/270/226 220/255/226
f 213/249/227 212/247/227 227/254/227
f 205/248/228 220/255/228 221/257/228
f 213/249/229 228/256/229 229/258/229
f 207/252/230 206/250/230 221/257/230
f 215/253/231 214/251/231 229/258/231
f 207/252/232 222/259/232 223/261/232
f 215/253/233 230/260/233 231/262/233
f 209/239/234 208/237/234 223/261/234
f 216/238/235 231/262/235 232/264/235
f 210/241/236 209/239/236 224/263/236
f 210/241/239 225/265/239 226/269/239
f 203/243/240 218/267/240 219/270/240
f 230/260/241 245/287/241 246/271/241
f 224/263/242 223/261/242 238/272/242
f 232/264/243 231/262/243 246/271/243
f 225/265/244 224/263/244 239/273/244
f 225/265/247 240/275/247 241/279/247
f 218/267/248 233/277/248 234/280/248
f 226/269/249 241/279/249 242/281/249
f 220/255/250 219/270/250 234/280/250
f 228/256/251 227/254/251 242/281/251
f 220/255/252 235/282/252 236/284/252
f 228/256/253 243/283/253 244/285/253
f 222/259/254 221/257/254 236/284/254
f 230/260/255 229/258/255 244/285/255
f 222/259/256 237/286/256 238/272/256
f 243/283/257 242/281/257 257/288/257
f 235/282/258 250/304/258 251/290/258
f 243/283/259 258/289/259 259/291/259
f 237/286/260 236/284/260 251/290/260
f 245/287/261 244/285/261 259/291/261
f 237/286/262 252/292/262 253/294/262
f 245/287/263 260/293/263 261/295/263
f 239/273/264 238/272/264 253/294/264
f 246/271/265 261/295/265 262/297/265
f 240/275/266 239/273/266 254/296/266
f 240/275/269 255/298/269 256/302/269
f 233/277/270 248/300/270 249/303/270
f 241/279/271 256/302/271 257/288/271
f 234/280/272 249/303/272 250/304/272
f 261/295/273 276/321/273 277/305/273
f 255/298/274 254/296/274 269/306/274
f 255/298/277 270/307/277 271/311/277
f 249/303/278 248/300/278 263/309/278
f 256/302/279 271/311/279 272/313/279
f 249/303/280 264/312/280 265/314/280
f 258/289/281 257/288/281 272/313/281
f 250/304/282 265/314/282 266/316/282
f 258/289/283 273/315/283 274/317/283
f 252/292/284 251/290/284 266/316/284
f 260/293/285 259/291/285 274/317/285
f 252/292/286 267/318/286 268/320/286
f 260/293/287 275/319/287 276/321/287
f 254/296/288 253/294/288 268/320/288
f 265/314/289 280/337/289 281/322/289
f 273/315/290 288/338/290 289/323/290
f 267/318/291 266/316/291 281/322/291
f 275/319/292 274/317/292 289/323/292
f 267/318/293 282/324/293 283/326/293
f 275/319/294 290/325/294 291/327/294
f 269/306/295 268/320/295 283/326/295
f 277/305/296 276/321/296 291/327/296
f 270/307/297 269/306/297 284/328/297
f 270/307/300 285/330/300 286/334/300
f 264/312/301 263/309/301 278/332/301
f 271/311/302 286/334/302 287/336/302
f 264/312/303 279/335/303 280/337/303
f 273/315/304 272/313/304 287/336/304
f 285/330/305 284/328/305 299/339/305
f 285/330/308 300/340/308 301/345/308
f 278/332/309 293/342/309 294/346/309
f 286/334/310 301/345/310 302/347/310
f 279/335/311 294/346/311 295/348/311
f 288/338/312 287/336/312 302/347/312
f 280/337/313 295/348/313 296/350/313
f 288/338/314 303/349/314 304/351/314
f 282/324/315 281/322/315 296/350/315
f 290/325/316 289/323/316 304/351/316
f 282/324/317 297/352/317 298/354/317
f 290/325/318 305/353/318 306/355/318
f 284/328/319 283/326/319 298/354/319
f 292/329/320 291/327/320 306/355/320
f 303/349/321 319/372/321 320/356/321
f 297/352/322 296/350/322 312/357/322
f 305/353/323 304/351/323 320/356/323
f 297/352/324 313/358/324 314/360/324
f 305/353/514 321/359/514 322/361/514
f 299/339/326 298/354/326 314/360/326
f 306/355/327 322/361/327 323/363/327
f 300/340/328 299/339/328 315/362/328
f 300/340/331 316/364/331 317/368/331
f 293/342/332 309/366/332 310/369/332
f 301/345/333 317/368/333 318/370/333
f 294/346/515 310/369/515 311/371/515
f 303/349/335 302/347/335 318/370/335
f 295/348/336 311/371/336 312/357/336
f 316/364/339 331/389/339 332/377/339
f 309/366/340 324/374/340 325/378/340
f 317/368/341 332/377/341 333/379/341
f 310/369/342 325/378/342 326/380/342
f 319/372/343 318/370/343 333/379/343
f 311/371/344 326/380/344 327/382/344
f 319/372/345 334/381/345 335/383/345
f 312/357/346 327/382/346 328/384/346
f 321/359/347 320/356/347 335/383/347
f 313/358/348 328/384/348 329/386/348
f 321/359/349 336/385/349 337/387/349
f 315/362/350 314/360/350 329/386/350
f 322/361/351 337/387/351 338/376/351
f 316/364/352 315/362/352 330/388/352
f 328/384/353 327/382/353 342/390/353
f 336/385/354 335/383/354 350/392/354
f 328/384/355 343/391/355 344/394/355
f 336/385/356 351/393/356 352/395/356
f 330/388/357 329/386/357 344/394/357
f 337/387/358 352/395/358 353/397/358
f 331/389/359 330/388/359 345/396/359
f 331/389/362 346/398/362 347/402/362
f 324/374/363 339/400/363 340/403/363
f 332/377/364 347/402/364 348/404/364
f 325/378/365 340/403/365 341/405/365
f 334/381/366 333/379/366 348/404/366
f 326/380/367 341/405/367 342/390/367
f 334/381/368 349/406/368 350/392/368
f 346/398/369 361/420/369 362/407/369
f 339/400/370 354/422/370 355/408/370
f 347/402/371 362/407/371 363/409/371
f 341/405/372 340/403/372 355/408/372
f 349/406/373 348/404/373 363/409/373
f 341/405/374 356/410/374 357/412/374
f 349/406/375 364/411/375 365/413/375
f 343/391/376 342/390/376 357/412/376
f 351/393/377 350/392/377 365/413/377
f 343/391/378 358/414/378 359/416/378
f 351/393/379 366/415/379 367/417/379
f 345/396/380 344/394/380 359/416/380
f 352/395/381 367/417/381 368/419/381
f 346/398/382 345/396/382 360/418/382
f 366/424/385 365/454/385 380/425/385
f 358/427/386 373/455/386 374/428/386
f 366/424/387 381/426/387 382/430/387
f 360/432/388 359/429/388 374/428/388
f 367/431/389 382/430/389 383/434/389
f 361/436/390 360/432/390 375/433/390
f 361/436/393 376/437/393 377/442/393
f 354/438/394 369/440/394 370/444/394
f 362/443/395 377/442/395 378/446/395
f 356/448/396 355/445/396 370/444/396
f 364/450/397 363/447/397 378/446/397
f 356/448/398 371/449/398 372/452/398
f 364/450/399 379/451/399 380/425/399
f 358/427/400 357/453/400 372/452/400
f 377/442/401 392/472/401 393/456/401
f 371/449/402 370/444/402 385/457/402
f 379/451/403 378/446/403 393/456/403
f 371/449/404 386/458/404 387/460/404
f 379/451/405 394/459/405 395/461/405
f 373/455/406 372/452/406 387/460/406
f 381/426/407 380/425/407 395/461/407
f 373/455/408 388/462/408 389/464/408
f 381/426/409 396/463/409 397/465/409
f 375/433/410 374/428/410 389/464/410
f 382/430/411 397/465/411 398/467/411
f 376/437/412 375/433/412 390/466/412
f 376/437/415 391/468/415 392/472/415
f 369/440/416 384/470/416 385/457/416
f 397/465/417 396/463/417 411/473/417
f 390/466/418 389/464/418 404/475/418
f 397/465/419 412/474/419 413/477/419
f 391/468/420 390/466/420 405/476/420
f 391/468/423 406/478/423 407/482/423
f 385/457/424 384/470/424 399/480/424
f 392/472/425 407/482/425 408/484/425
f 385/457/426 400/483/426 401/485/426
f 394/459/427 393/456/427 408/484/427
f 386/458/428 401/485/428 402/487/428
f 394/459/429 409/486/429 410/488/429
f 388/462/430 387/460/430 402/487/430
f 396/463/431 395/461/431 410/488/431
f 388/462/432 403/489/432 404/475/432
f 401/485/516 400/483/516 415/490/516
f 409/486/434 408/484/434 423/492/434
f 401/485/435 416/491/435 417/494/435
f 409/486/436 424/493/436 425/495/436
f 403/489/437 402/487/437 417/494/437
f 411/473/438 410/488/438 425/495/438
f 403/489/439 418/496/439 419/498/439
f 411/473/517 426/497/517 427/499/517
f 405/476/441 404/475/441 419/498/441
f 412/474/442 427/499/442 428/501/442
f 406/478/443 405/476/443 420/500/443
f 406/478/446 421/502/446 422/506/446
f 399/480/447 414/504/447 415/490/447
f 407/482/448 422/506/448 423/492/448
f 420/500/449 419/498/449 434/507/449
f 427/499/450 442/523/450 443/509/450
f 421/502/451 420/500/451 435/508/451
f 421/502/454 436/510/454 437/514/454
f 414/504/455 429/512/455 430/515/455
f 422/506/456 437/514/456 438/516/456
f 416/491/457 415/490/457 430/515/457
f 424/493/458 423/492/458 438/516/458
f 416/491/459 431/517/459 432/519/459
f 424/493/460 439/518/460 440/520/460
f 418/496/461 417/494/461 432/519/461
f 426/497/462 425/495/462 440/520/462
f 418/496/463 433/521/463 434/507/463
f 426/497/464 441/522/464 442/523/464
f 439/518/465 438/516/465 453/524/465
f 431/517/466 446/540/466 447/526/466
f 439/518/467 454/525/467 455/527/467
f 433/521/468 432/519/468 447/526/468
f 441/522/469 440/520/469 455/527/469
f 433/521/470 448/528/470 449/530/470
f 441/522/471 456/529/471 457/531/471
f 435/508/472 434/507/472 449/530/472
f 442/523/473 457/531/473 458/533/473
f 436/510/474 435/508/474 450/532/474
f 436/510/477 451/534/477 452/538/477
f 430/515/478 429/512/478 444/536/478
f 437/514/479 452/538/479 453/524/479
f 430/515/480 445/539/480 446/540/480
f 458/533/481 457/531/481 472/541/481
f 451/534/482 450/532/482 465/543/482
f 451/534/485 466/544/485 467/548/485
f 444/536/486 459/546/486 460/549/486
f 452/538/487 467/548/487 468/550/487
f 446/540/488 445/539/488 460/549/488
f 454/525/489 453/524/489 468/550/489
f 446/540/490 461/551/490 462/553/490
f 454/525/491 469/552/491 470/554/491
f 448/528/492 447/526/492 462/553/492
f 456/529/493 455/527/493 470/554/493
f 448/528/494 463/555/494 464/557/494
f 456/529/495 471/556/495 472/541/495
f 450/532/496 449/530/496 464/557/496
f 462/553/497 461/551/497 1/32/497
f 469/552/498 6/3/498 479/8/498
f 463/555/499 462/553/499 476/4/499
f 471/556/500 470/554/500 479/8/500
f 464/557/501 463/555/501 2/9/501
f 471/556/502 480/12/502 481/16/502
f 465/543/503 464/557/503 477/13/503
f 472/541/504 481/16/504 482/20/504
f 466/544/505 465/543/505 3/17/505
f 466/544/508 4/22/508 5/28/508
f 460/549/509 459/546/509 474/23/509
f 467/548/510 5/28/510 478/1/510
f 461/551/511 460/549/511 475/29/511
f 468/550/512 478/1/512 6/3/512
o Sphere.001
v -0.076854 0.140504 -0.154382
v -0.076854 0.044016 -0.226220
v -0.076854 -0.082051 -0.265099
v -0.076854 -0.150279 -0.270102
v -0.076854 -0.218506 -0.265099
v -0.076854 -0.344573 -0.226220
v -0.066588 0.192723 -0.059544
v -0.056716 0.172822 -0.107451
v -0.047619 0.140504 -0.151602
v -0.039645 0.097012 -0.190301
v -0.033100 0.044016 -0.222060
v -0.028237 -0.016446 -0.245660
v -0.025243 -0.082051 -0.260192
v -0.024232 -0.150279 -0.265099
v -0.025243 -0.218506 -0.260192
v -0.028237 -0.284111 -0.245660
v -0.033100 -0.344573 -0.222060
v -0.039645 -0.397569 -0.190301
v -0.047619 -0.441061 -0.151602
v -0.056716 -0.473379 -0.107451
v -0.066588 -0.493280 -0.059544
v -0.056716 0.192723 -0.056653
v -0.037353 0.172822 -0.101781
v -0.019507 0.140504 -0.143370
v -0.003865 0.097012 -0.179824
v 0.008972 0.044016 -0.209740
v 0.018511 -0.016446 -0.231970
v 0.024385 -0.082051 -0.245660
v 0.026368 -0.150279 -0.250282
v 0.024385 -0.218506 -0.245660
v 0.018511 -0.284111 -0.231970
v 0.008972 -0.344573 -0.209740
v -0.003865 -0.397569 -0.179824
v -0.019507 -0.441061 -0.143370
v -0.037353 -0.473379 -0.101781
v -0.056716 -0.493280 -0.056653
v -0.047619 0.192723 -0.051959
v -0.019507 0.172822 -0.092573
v 0.006401 0.140504 -0.130002
v 0.029110 0.097012 -0.162810
v 0.047746 0.044016 -0.189734
v 0.061595 -0.016446 -0.209740
v 0.070122 -0.082051 -0.222060
v 0.073002 -0.150279 -0.226220
v 0.070122 -0.218506 -0.222060
v 0.061595 -0.284111 -0.209740
v 0.047746 -0.344573 -0.189734
v 0.029110 -0.397569 -0.162810
v 0.006401 -0.441061 -0.130002
v -0.019507 -0.473379 -0.092573
v -0.047619 -0.493280 -0.051959
v -0.039645 0.192723 -0.045642
v -0.003865 0.172822 -0.080181
v 0.029110 0.140504 -0.112012
v 0.058013 0.097012 -0.139912
v 0.081732 0.044016 -0.162810
v 0.099358 -0.016446 -0.179824
v 0.110211 -0.082051 -0.190301
v 0.113876 -0.150279 -0.193839
v 0.110211 -0.218506 -0.190301
v 0.099358 -0.284111 -0.179824
v 0.081732 -0.344573 -0.162810
v 0.058013 -0.397569 -0.139912
v 0.029110 -0.441061 -0.112012
v -0.003865 -0.473379 -0.080181
v -0.039645 -0.493280 -0.045642
v -0.033100 0.192723 -0.037944
v 0.008972 0.172822 -0.065081
v 0.047746 0.140504 -0.090091
v 0.081732 0.097012 -0.112012
v 0.109624 0.044016 -0.130002
v 0.130349 -0.016446 -0.143370
v 0.143112 -0.082051 -0.151602
v 0.147421 -0.150279 -0.154382
v 0.143112 -0.218506 -0.151602
v 0.130349 -0.284111 -0.143370
v 0.109624 -0.344573 -0.130002
v 0.081732 -0.397569 -0.112012
v 0.047746 -0.441061 -0.090091
v 0.008972 -0.473379 -0.065081
v -0.033100 -0.493280 -0.037944
v -0.076854 0.199443 -0.009722
v -0.028237 0.192723 -0.029162
v 0.018511 0.172822 -0.047854
v 0.061595 0.140504 -0.065081
v 0.099358 0.097012 -0.080181
v 0.130349 0.044016 -0.092573
v 0.153378 -0.016446 -0.101781
v 0.167559 -0.082051 -0.107451
v 0.172347 -0.150279 -0.109365
v 0.167559 -0.218506 -0.107451
v 0.153378 -0.284111 -0.101781
v 0.130349 -0.344573 -0.092573
v 0.099358 -0.397569 -0.080181
v 0.061595 -0.441061 -0.065081
v 0.018511 -0.473379 -0.047854
v -0.028237 -0.493280 -0.029162
v -0.025243 0.192723 -0.019633
v 0.024385 0.172822 -0.029162
v 0.070122 0.140504 -0.037944
v 0.110211 0.097012 -0.045642
v 0.143112 0.044016 -0.051959
v 0.167559 -0.016446 -0.056653
v 0.182613 -0.082051 -0.059544
v 0.187697 -0.150279 -0.060520
v 0.182613 -0.218506 -0.059544
v 0.167559 -0.284111 -0.056653
v 0.143112 -0.344573 -0.051959
v 0.110211 -0.397569 -0.045642
v 0.070122 -0.441061 -0.037944
v 0.024385 -0.473379 -0.029162
v -0.025243 -0.493280 -0.019633
v -0.024232 0.192723 -0.009722
v 0.026368 0.172822 -0.009722
v 0.073002 0.140504 -0.009722
v 0.113876 0.097012 -0.009722
v 0.147421 0.044016 -0.009722
v 0.172347 -0.016446 -0.009722
v 0.187697 -0.082051 -0.009722
v 0.192879 -0.150279 -0.009722
v 0.187697 -0.218506 -0.009722
v 0.172347 -0.284111 -0.009722
v 0.147421 -0.344573 -0.009722
v 0.113876 -0.397569 -0.009722
v 0.073002 -0.441061 -0.009722
v 0.026368 -0.473379 -0.009722
v -0.024232 -0.493280 -0.009722
v -0.025243 0.192723 0.000188
v 0.024385 0.172822 0.009717
v 0.070122 0.140504 0.018499
v 0.110211 0.097012 0.026197
v 0.143112 0.044016 0.032514
v 0.167559 -0.016446 0.037208
v 0.182613 -0.082051 0.040099
v 0.187696 -0.150279 0.041075
v 0.182613 -0.218506 0.040099
v 0.167559 -0.284111 0.037208
v 0.143112 -0.344573 0.032514
v 0.110211 -0.397569 0.026197
v 0.070122 -0.441061 0.018499
v 0.024385 -0.473379 0.009717
v -0.025243 -0.493280 0.000188
v -0.028237 0.192723 0.009717
v 0.018511 0.172822 0.028409
v 0.061595 0.140504 0.045636
v 0.099358 0.097012 0.060736
v 0.130349 0.044016 0.073128
v 0.153378 -0.016446 0.082336
v 0.167559 -0.082051 0.088006
v 0.172347 -0.150279 0.089921
v 0.167559 -0.218506 0.088006
v 0.153378 -0.284111 0.082336
v 0.130349 -0.344573 0.073128
v 0.099358 -0.397569 0.060736
v 0.061595 -0.441061 0.045636
v 0.018511 -0.473379 0.028409
v -0.028237 -0.493280 0.009717
v -0.033100 0.192723 0.018499
v 0.008972 0.172822 0.045636
v 0.047746 0.140504 0.070646
v 0.081732 0.097012 0.092567
v 0.109624 0.044016 0.110557
v 0.130349 -0.016446 0.123925
v 0.143112 -0.082051 0.132157
v 0.147421 -0.150279 0.134937
v 0.143112 -0.218506 0.132157
v 0.130349 -0.284111 0.123925
v 0.109624 -0.344573 0.110557
v 0.081732 -0.397569 0.092567
v 0.047746 -0.441061 0.070646
v 0.008972 -0.473379 0.045636
v -0.033100 -0.493280 0.018499
v -0.039645 0.192723 0.026197
v -0.003865 0.172822 0.060736
v 0.029110 0.140504 0.092567
v 0.058013 0.097012 0.120467
v 0.081732 0.044016 0.143365
v 0.099358 -0.016446 0.160379
v 0.110211 -0.082051 0.170856
v 0.113876 -0.150279 0.174394
v 0.110211 -0.218506 0.170856
v 0.099358 -0.284111 0.160379
v 0.081732 -0.344573 0.143365
v 0.058013 -0.397569 0.120467
v 0.029110 -0.441061 0.092567
v -0.003865 -0.473379 0.060736
v -0.039645 -0.493280 0.026197
v -0.047619 0.192723 0.032514
v -0.019507 0.172822 0.073128
v 0.006401 0.140504 0.110557
v 0.029110 0.097012 0.143365
v 0.047746 0.044016 0.170289
v 0.061595 -0.016446 0.190295
v 0.070122 -0.082051 0.202615
v 0.073002 -0.150279 0.206775
v 0.070122 -0.218506 0.202615
v 0.061595 -0.284111 0.190295
v 0.047746 -0.344573 0.170289
v 0.029110 -0.397569 0.143365
v 0.006401 -0.441061 0.110557
v -0.019507 -0.473379 0.073128
v -0.047619 -0.493280 0.032514
v -0.056717 0.192723 0.037208
v -0.037353 0.172822 0.082336
v -0.019507 0.140504 0.123925
v -0.003865 0.097012 0.160379
v 0.008972 0.044016 0.190295
v 0.018511 -0.016446 0.212526
v 0.024385 -0.082051 0.226215
v 0.026368 -0.150279 0.230837
v 0.024385 -0.218506 0.226215
v 0.018511 -0.284111 0.212526
v 0.008972 -0.344573 0.190295
v -0.003865 -0.397569 0.160379
v -0.019507 -0.441061 0.123925
v -0.037353 -0.473379 0.082336
v -0.056717 -0.493280 0.037208
v -0.066588 0.192723 0.040099
v -0.056717 0.172822 0.088006
v -0.047619 0.140504 0.132157
v -0.039645 0.097012 0.170856
v -0.033100 0.044016 0.202615
v -0.028237 -0.016446 0.226215
v -0.025243 -0.082051 0.240747
v -0.024232 -0.150279 0.245654
v -0.025243 -0.218506 0.240747
v -0.028237 -0.284111 0.226215
v -0.033100 -0.344573 0.202615
v -0.039645 -0.397569 0.170856
v -0.047619 -0.441061 0.132157
v -0.056717 -0.473379 0.088006
v -0.066588 -0.493280 0.040099
v -0.076854 0.192723 0.041075
v -0.076854 0.172822 0.089921
v -0.076854 0.140504 0.134937
v -0.076854 0.097012 0.174394
v -0.076854 0.044016 0.206775
v -0.076854 -0.016446 0.230837
v -0.076854 -0.082051 0.245654
v -0.076854 -0.150279 0.250657
v -0.076854 -0.218506 0.245654
v -0.076854 -0.284111 0.230837
v -0.076854 -0.344573 0.206775
v -0.076854 -0.397569 0.174394
v -0.076854 -0.441061 0.134937
v -0.076854 -0.473379 0.089921
v -0.076854 -0.493280 0.041075
v -0.087120 0.192723 0.040099
v -0.096992 0.172822 0.088006
v -0.106090 0.140504 0.132157
v -0.114064 0.097012 0.170856
v -0.120608 0.044016 0.202615
v -0.125471 -0.016446 0.226215
v -0.128466 -0.082051 0.240747
v -0.129477 -0.150279 0.245654
v -0.128466 -0.218506 0.240747
v -0.125471 -0.284111 0.226215
v -0.120608 -0.344573 0.202615
v -0.114064 -0.397569 0.170856
v -0.106090 -0.441061 0.132157
v -0.096992 -0.473379 0.088006
v -0.087120 -0.493280 0.040099
v -0.096992 0.192723 0.037208
v -0.116356 0.172822 0.082336
v -0.134202 0.140504 0.123925
v -0.149844 0.097012 0.160379
v -0.162681 0.044016 0.190295
v -0.172219 -0.016446 0.212526
v -0.178093 -0.082051 0.226215
v -0.180077 -0.150279 0.230837
v -0.178093 -0.218506 0.226215
v -0.172219 -0.284111 0.212526
v -0.162681 -0.344573 0.190295
v -0.149844 -0.397569 0.160379
v -0.134202 -0.441061 0.123925
v -0.116356 -0.473379 0.082336
v -0.096992 -0.493280 0.037208
v -0.106090 0.192723 0.032514
v -0.134202 0.172822 0.073128
v -0.160110 0.140504 0.110557
v -0.182818 0.097012 0.143365
v -0.201455 0.044016 0.170289
v -0.215303 -0.016446 0.190295
v -0.223831 -0.082051 0.202615
v -0.226710 -0.150279 0.206775
v -0.223831 -0.218506 0.202615
v -0.215303 -0.284111 0.190295
v -0.201455 -0.344573 0.170289
v -0.182818 -0.397569 0.143365
v -0.160110 -0.441061 0.110557
v -0.134202 -0.473379 0.073128
v -0.106090 -0.493280 0.032514
v -0.114064 0.192723 0.026197
v -0.149844 0.172822 0.060736
v -0.182818 0.140504 0.092567
v -0.211721 0.097012 0.120467
v -0.235441 0.044016 0.143365
v -0.253066 -0.016446 0.160379
v -0.263920 -0.082051 0.170856
v -0.267585 -0.150279 0.174394
v -0.263920 -0.218506 0.170856
v -0.253066 -0.284111 0.160379
v -0.235441 -0.344573 0.143365
v -0.211721 -0.397569 0.120467
v -0.182818 -0.441061 0.092567
v -0.149844 -0.473379 0.060736
v -0.114064 -0.493280 0.026197
v -0.076854 -0.500000 -0.009722
v -0.120608 0.192723 0.018499
v -0.162681 0.172822 0.045636
v -0.201455 0.140504 0.070646
v -0.235441 0.097012 0.092567
v -0.263332 0.044016 0.110557
v -0.284058 -0.016446 0.123925
v -0.296820 -0.082051 0.132157
v -0.301129 -0.150279 0.134937
v -0.296820 -0.218506 0.132157
v -0.284058 -0.284111 0.123925
v -0.263332 -0.344573 0.110557
v -0.235441 -0.397569 0.092567
v -0.201455 -0.441061 0.070646
v -0.162681 -0.473379 0.045636
v -0.120608 -0.493280 0.018499
v -0.125471 0.192723 0.009717
v -0.172219 0.172822 0.028409
v -0.215303 0.140504 0.045636
v -0.253066 0.097012 0.060736
v -0.284058 0.044016 0.073128
v -0.307086 -0.016446 0.082336
v -0.321267 -0.082051 0.088006
v -0.326055 -0.150279 0.089920
v -0.321267 -0.218506 0.088006
v -0.307086 -0.284111 0.082336
v -0.284058 -0.344573 0.073128
v -0.253066 -0.397569 0.060736
v -0.215303 -0.441061 0.045636
v -0.172219 -0.473379 0.028409
v -0.125471 -0.493280 0.009717
v -0.128465 0.192723 0.000188
v -0.178093 0.172822 0.009717
v -0.223831 0.140504 0.018499
v -0.263920 0.097012 0.026197
v -0.296820 0.044016 0.032514
v -0.321267 -0.016446 0.037208
v -0.336322 -0.082051 0.040099
v -0.341405 -0.150279 0.041075
v -0.336322 -0.218506 0.040099
v -0.321267 -0.284111 0.037208
v -0.296820 -0.344573 0.032514
v -0.263920 -0.397569 0.026197
v -0.223831 -0.441061 0.018499
v -0.178093 -0.473379 0.009717
v -0.128465 -0.493280 0.000188
v -0.129477 0.192723 -0.009722
v -0.180077 0.172822 -0.009722
v -0.226710 0.140504 -0.009722
v -0.267585 0.097012 -0.009722
v -0.301129 0.044016 -0.009722
v -0.326056 -0.016446 -0.009722
v -0.341405 -0.082051 -0.009723
v -0.346588 -0.150279 -0.009723
v -0.341405 -0.218506 -0.009723
v -0.326056 -0.284111 -0.009722
v -0.301129 -0.344573 -0.009722
v -0.267585 -0.397569 -0.009722
v -0.226710 -0.441061 -0.009722
v -0.180077 -0.473379 -0.009722
v -0.129477 -0.493280 -0.009722
v -0.128465 0.192723 -0.019633
v -0.178093 0.172822 -0.029162
v -0.223831 0.140504 -0.037944
v -0.263920 0.097012 -0.045642
v -0.296820 0.044016 -0.051959
v -0.321267 -0.016446 -0.056653
v -0.336322 -0.082051 -0.059544
v -0.341405 -0.150279 -0.060520
v -0.336322 -0.218506 -0.059544
v -0.321267 -0.284111 -0.056653
v -0.296820 -0.344573 -0.051959
v -0.263920 -0.397569 -0.045642
v -0.223831 -0.441061 -0.037944
v -0.178093 -0.473379 -0.029162
v -0.128465 -0.493280 -0.019633
v -0.125471 0.192723 -0.029162
v -0.172219 0.172822 -0.047854
v -0.215303 0.140504 -0.065081
v -0.253066 0.097012 -0.080181
v -0.284057 0.044016 -0.092573
v -0.307086 -0.016446 -0.101781
v -0.321267 -0.082051 -0.107451
v -0.326055 -0.150279 -0.109365
v -0.321267 -0.218506 -0.107451
v -0.307086 -0.284111 -0.101781
v -0.284057 -0.344573 -0.092573
v -0.253066 -0.397569 -0.080181
v -0.215303 -0.441061 -0.065081
v -0.172219 -0.473379 -0.047854
v -0.125471 -0.493280 -0.029162
v -0.120608 0.192723 -0.037944
v -0.162681 0.172822 -0.065081
v -0.201455 0.140504 -0.090091
v -0.235441 0.097012 -0.112012
v -0.263332 0.044016 -0.130002
v -0.284058 -0.016446 -0.143370
v -0.296820 -0.082051 -0.151602
v -0.301129 -0.150279 -0.154382
v -0.296820 -0.218506 -0.151602
v -0.284058 -0.284111 -0.143370
v -0.263332 -0.344573 -0.130002
v -0.235441 -0.397569 -0.112012
v -0.201455 -0.441061 -0.090091
v -0.162681 -0.473379 -0.065081
v -0.120608 -0.493280 -0.037944
v -0.114064 0.192723 -0.045642
v -0.149844 0.172822 -0.080181
v -0.182818 0.140504 -0.112012
v -0.211721 0.097012 -0.139912
v -0.235441 0.044016 -0.162809
v -0.253066 -0.016446 -0.179824
v -0.263920 -0.082051 -0.190301
v -0.267584 -0.150279 -0.193839
v -0.263920 -0.218506 -0.190301
v -0.253066 -0.284111 -0.179824
v -0.235441 -0.344573 -0.162809
v -0.211721 -0.397569 -0.139912
v -0.182818 -0.441061 -0.112012
v -0.149844 -0.473379 -0.080181
v -0.114064 -0.493280 -0.045642
v -0.106090 0.192723 -0.051959
v -0.134202 0.172822 -0.092573
v -0.160110 0.140504 -0.130002
v -0.182818 0.097012 -0.162810
v -0.201455 0.044016 -0.189734
v -0.215303 -0.016446 -0.209740
v -0.223831 -0.082051 -0.222060
v -0.226710 -0.150279 -0.226220
v -0.223831 -0.218506 -0.222060
v -0.215303 -0.284111 -0.209740
v -0.201455 -0.344573 -0.189734
v -0.182818 -0.397569 -0.162810
v -0.160110 -0.441061 -0.130002
v -0.134202 -0.473379 -0.092573
v -0.106090 -0.493280 -0.051959
v -0.096992 0.192723 -0.056653
v -0.116356 0.172822 -0.101781
v -0.134202 0.140504 -0.143370
v -0.149844 0.097012 -0.179824
v -0.162681 0.044016 -0.209740
v -0.172219 -0.016446 -0.231970
v -0.178093 -0.082051 -0.245660
v -0.180077 -0.150279 -0.250282
v -0.178093 -0.218506 -0.245660
v -0.172219 -0.284111 -0.231970
v -0.162681 -0.344573 -0.209740
v -0.149844 -0.397569 -0.179824
v -0.134202 -0.441061 -0.143370
v -0.116356 -0.473379 -0.101781
v -0.096992 -0.493280 -0.056653
v -0.087120 0.192723 -0.059544
v -0.096992 0.172822 -0.107451
v -0.106090 0.140504 -0.151602
v -0.114064 0.097012 -0.190301
v -0.120608 0.044016 -0.222060
v -0.125471 -0.016446 -0.245660
v -0.128465 -0.082051 -0.260192
v -0.129476 -0.150279 -0.265099
v -0.128465 -0.218506 -0.260192
v -0.125471 -0.284111 -0.245660
v -0.120608 -0.344573 -0.222060
v -0.114064 -0.397569 -0.190301
v -0.106090 -0.441061 -0.151602
v -0.096992 -0.473379 -0.107451
v -0.087120 -0.493280 -0.059544
v -0.076854 0.192723 -0.060520
v -0.076854 0.172822 -0.109365
v -0.076854 0.097012 -0.193839
v -0.076854 -0.016446 -0.250282
v -0.076854 -0.284111 -0.250282
v -0.076854 -0.397569 -0.193839
v -0.076854 -0.441061 -0.154381
v -0.076854 -0.473379 -0.109365
v -0.076854 -0.493280 -0.060520
vn 0.0923 -0.2194 -0.9713
vn 0.0554 0.8111 -0.5823
vn 0.0880 -0.3683 -0.9255
vn 0.0702 0.6703 -0.7388
vn 0.0809 -0.5197 -0.8505
vn 0.0809 0.5197 -0.8505
vn 0.0702 -0.6703 -0.7388
vn 0.0880 0.3683 -0.9255
vn 0.0554 -0.8111 -0.5823
vn 0.0923 0.2194 -0.9713
vn 0.0358 -0.9255 -0.3771
vn 0.0944 0.0728 -0.9929
vn 0.0125 0.9913 -0.1311
vn 0.0125 -0.9913 -0.1311
vn 0.0944 -0.0728 -0.9929
vn 0.0358 0.9255 -0.3771
vn 0.2803 -0.0730 -0.9571
vn 0.1062 0.9258 -0.3627
vn 0.2741 -0.2199 -0.9362
vn 0.1641 0.8118 -0.5604
vn 0.2612 -0.3691 -0.8919
vn 0.2083 0.6712 -0.7114
vn 0.2399 -0.5207 -0.8194
vn 0.2399 0.5207 -0.8194
vn 0.2083 -0.6712 -0.7114
vn 0.2612 0.3691 -0.8919
vn 0.1641 -0.8118 -0.5604
vn 0.2741 0.2199 -0.9362
vn 0.1062 -0.9258 -0.3627
vn 0.2803 0.0730 -0.9571
vn 0.0369 0.9913 -0.1261
vn 0.0369 -0.9913 -0.1261
vn 0.3392 -0.6729 -0.6573
vn 0.4259 0.3707 -0.8254
vn 0.2669 -0.8131 -0.5173
vn 0.4472 0.2209 -0.8667
vn 0.1726 -0.9265 -0.3345
vn 0.4573 0.0733 -0.8863
vn 0.0600 0.9914 -0.1162
vn 0.0600 -0.9914 -0.1162
vn 0.4573 -0.0733 -0.8863
vn 0.1726 0.9265 -0.3345
vn 0.4472 -0.2209 -0.8667
vn 0.2669 0.8131 -0.5173
vn 0.4259 -0.3707 -0.8254
vn 0.3392 0.6729 -0.6573
vn 0.3910 -0.5225 -0.7577
vn 0.3910 0.5225 -0.7577
vn 0.2325 0.9273 -0.2935
vn 0.6054 -0.2222 -0.7642
vn 0.3600 0.8148 -0.4544
vn 0.5762 -0.3727 -0.7274
vn 0.4580 0.6753 -0.5781
vn 0.5286 -0.5248 -0.6672
vn 0.5286 0.5248 -0.6672
vn 0.4580 -0.6753 -0.5781
vn 0.5762 0.3727 -0.7274
vn 0.3600 -0.8148 -0.4544
vn 0.6054 0.2222 -0.7642
vn 0.2325 -0.9273 -0.2935
vn 0.6193 0.0738 -0.7817
vn 0.0807 0.9915 -0.1019
vn 0.0807 -0.9915 -0.1019
vn 0.6193 -0.0738 -0.7817
vn 0.7063 0.3749 -0.6005
vn 0.4396 -0.8167 -0.3738
vn 0.7426 0.2237 -0.6313
vn 0.2835 -0.9282 -0.2411
vn 0.7598 0.0743 -0.6459
vn 0.0984 0.9916 -0.0836
vn 0.0984 -0.9916 -0.0836
vn 0.7598 -0.0743 -0.6459
vn 0.2836 0.9282 -0.2411
vn 0.7426 -0.2237 -0.6313
vn 0.4396 0.8167 -0.3738
vn 0.7063 -0.3749 -0.6005
vn 0.5602 0.6778 -0.4762
vn 0.6473 -0.5275 -0.5503
vn 0.6473 0.5275 -0.5503
vn 0.5602 -0.6778 -0.4762
vn 0.8524 -0.2250 -0.4720
vn 0.5027 0.8185 -0.2783
vn 0.8103 -0.3770 -0.4487
vn 0.6413 0.6801 -0.3551
vn 0.7419 -0.5299 -0.4108
vn 0.7419 0.5299 -0.4108
vn 0.6413 -0.6801 -0.3551
vn 0.8103 0.3770 -0.4487
vn 0.5027 -0.8185 -0.2783
vn 0.8524 0.2250 -0.4720
vn 0.3238 -0.9290 -0.1793
vn 0.8724 0.0748 -0.4831
vn 0.1122 0.9917 -0.0621
vn 0.1122 -0.9917 -0.0621
vn 0.8724 -0.0748 -0.4831
vn 0.3238 0.9290 -0.1793
vn 0.5463 -0.8198 -0.1717
vn 0.9293 0.2261 -0.2920
vn 0.3516 -0.9296 -0.1105
vn 0.9513 0.0752 -0.2989
vn 0.1218 0.9918 -0.0383
vn 0.1218 -0.9918 -0.0383
vn 0.9513 -0.0752 -0.2989
vn 0.3516 0.9296 -0.1105
vn 0.9293 -0.2261 -0.2920
vn 0.5463 0.8198 -0.1717
vn 0.8830 -0.3786 -0.2775
vn 0.6977 0.6820 -0.2193
vn 0.8079 -0.5318 -0.2539
vn 0.8079 0.5318 -0.2539
vn 0.6977 -0.6820 -0.2193
vn 0.8830 0.3786 -0.2775
vn 0.9204 -0.3795 -0.0939
vn 0.7267 0.6830 -0.0741
vn 0.8418 -0.5329 -0.0859
vn 0.8418 0.5329 -0.0859
vn 0.7267 -0.6830 -0.0741
vn 0.9204 0.3795 -0.0939
vn 0.5686 -0.8205 -0.0580
vn 0.9689 0.2267 -0.0989
vn 0.3658 -0.9300 -0.0373
vn 0.9920 0.0754 -0.1012
vn 0.1267 0.9919 -0.0129
vn 0.1267 -0.9919 -0.0129
vn 0.9920 -0.0754 -0.1012
vn 0.3658 0.9300 -0.0373
vn 0.9689 -0.2267 -0.0989
vn 0.5686 0.8205 -0.0580
vn 0.3658 -0.9300 0.0373
vn 0.9920 0.0754 0.1012
vn 0.1267 0.9919 0.0129
vn 0.1267 -0.9919 0.0129
vn 0.9920 -0.0754 0.1012
vn 0.3658 0.9300 0.0373
vn 0.9689 -0.2267 0.0989
vn 0.5686 0.8205 0.0580
vn 0.9204 -0.3795 0.0939
vn 0.7267 0.6830 0.0741
vn 0.8418 -0.5329 0.0859
vn 0.8418 0.5329 0.0859
vn 0.7267 -0.6830 0.0741
vn 0.9204 0.3795 0.0939
vn 0.5686 -0.8205 0.0580
vn 0.9689 0.2267 0.0989
vn 0.6978 0.6820 0.2193
vn 0.8079 -0.5318 0.2539
vn 0.8079 0.5318 0.2539
vn 0.6977 -0.6820 0.2193
vn 0.8830 0.3786 0.2775
vn 0.5463 -0.8198 0.1717
vn 0.9293 0.2261 0.2920
vn 0.3516 -0.9296 0.1105
vn 0.9513 0.0752 0.2989
vn 0.1218 0.9918 0.0383
vn 0.1218 -0.9918 0.0383
vn 0.9513 -0.0752 0.2989
vn 0.3516 0.9296 0.1105
vn 0.9293 -0.2261 0.2920
vn 0.5463 0.8198 0.1717
vn 0.8830 -0.3786 0.2775
vn 0.8724 0.0748 0.4831
vn 0.1122 0.9917 0.0621
vn 0.1122 -0.9917 0.0621
vn 0.8724 -0.0748 0.4831
vn 0.3238 0.9290 0.1793
vn 0.8524 -0.2250 0.4720
vn 0.5027 0.8185 0.2783
vn 0.8103 -0.3770 0.4487
vn 0.6413 0.6801 0.3551
vn 0.7419 -0.5299 0.4108
vn 0.7419 0.5299 0.4108
vn 0.6413 -0.6801 0.3551
vn 0.8103 0.3770 0.4487
vn 0.5027 -0.8185 0.2783
vn 0.8524 0.2250 0.4720
vn 0.3238 -0.9290 0.1793
vn 0.6473 -0.5275 0.5503
vn 0.6473 0.5275 0.5503
vn 0.5602 -0.6778 0.4762
vn 0.7063 0.3749 0.6005
vn 0.4396 -0.8167 0.3738
vn 0.7426 0.2237 0.6313
vn 0.2836 -0.9282 0.2411
vn 0.7598 0.0743 0.6459
vn 0.0984 0.9916 0.0836
vn 0.0984 -0.9916 0.0836
vn 0.7598 -0.0743 0.6459
vn 0.2836 0.9282 0.2411
vn 0.7426 -0.2237 0.6313
vn 0.4396 0.8167 0.3738
vn 0.7063 -0.3749 0.6005
vn 0.5602 0.6778 0.4762
vn 0.0807 0.9915 0.1019
vn 0.0807 -0.9915 0.1019
vn 0.6193 -0.0738 0.7817
vn 0.2325 0.9273 0.2935
vn 0.6054 -0.2222 0.7642
vn 0.3600 0.8148 0.4544
vn 0.5762 -0.3727 0.7274
vn 0.4580 0.6753 0.5781
vn 0.5286 -0.5248 0.6672
vn 0.5286 0.5248 0.6672
vn 0.4580 -0.6753 0.5781
vn 0.5762 0.3727 0.7274
vn 0.3600 -0.8148 0.4544
vn 0.6054 0.2222 0.7642
vn 0.2325 -0.9273 0.2935
vn 0.6193 0.0738 0.7817
vn 0.3910 0.5225 0.7577
vn 0.3392 -0.6729 0.6573
vn 0.4259 0.3707 0.8254
vn 0.2669 -0.8131 0.5173
vn 0.4472 0.2209 0.8667
vn 0.1726 -0.9265 0.3345
vn 0.4573 0.0733 0.8863
vn 0.0600 0.9914 0.1162
vn 0.0600 -0.9914 0.1162
vn 0.4573 -0.0733 0.8863
vn 0.1726 0.9265 0.3345
vn 0.4472 -0.2209 0.8667
vn 0.2669 0.8131 0.5173
vn 0.4259 -0.3707 0.8254
vn 0.3392 0.6729 0.6573
vn 0.3910 -0.5225 0.7577
vn 0.2803 -0.0730 0.9571
vn 0.1062 0.9258 0.3627
vn 0.2741 -0.2199 0.9362
vn 0.1641 0.8118 0.5604
vn 0.2612 -0.3691 0.8919
vn 0.2083 0.6712 0.7114
vn 0.2399 -0.5207 0.8194
vn 0.2399 0.5207 0.8194
vn 0.2083 -0.6712 0.7114
vn 0.2612 0.3691 0.8919
vn 0.1641 -0.8118 0.5604
vn 0.2741 0.2199 0.9362
vn 0.1062 -0.9258 0.3627
vn 0.2803 0.0730 0.9571
vn 0.0369 0.9913 0.1261
vn 0.0369 -0.9913 0.1261
vn 0.0702 -0.6703 0.7388
vn 0.0880 0.3683 0.9255
vn 0.0554 -0.8111 0.5823
vn 0.0923 0.2194 0.9713
vn 0.0359 -0.9255 0.3771
vn 0.0944 0.0728 0.9929
vn 0.0125 0.9913 0.1311
vn 0.0125 -0.9913 0.1311
vn 0.0944 -0.0728 0.9929
vn 0.0359 0.9255 0.3771
vn 0.0923 -0.2194 0.9713
vn 0.0554 0.8111 0.5823
vn 0.0880 -0.3683 0.9255
vn 0.0702 0.6703 0.7388
vn 0.0809 -0.5197 0.8505
vn 0.0809 0.5197 0.8505
vn -0.0923 -0.2194 0.9713
vn -0.0554 0.8111 0.5823
vn -0.0880 -0.3683 0.9255
vn -0.0702 0.6703 0.7388
vn -0.0809 -0.5197 0.8505
vn -0.0809 0.5197 0.8505
vn -0.0702 -0.6703 0.7388
vn -0.0880 0.3683 0.9255
vn -0.0554 -0.8111 0.5823
vn -0.0923 0.2194 0.9713
vn -0.0359 -0.9255 0.3771
vn -0.0944 0.0728 0.9929
vn -0.0125 0.9913 0.1311
vn -0.0125 -0.9913 0.1311
vn -0.0944 -0.0728 0.9929
vn -0.0359 0.9255 0.3771
vn -0.1641 -0.8118 0.5604
vn -0.2741 0.2199 0.9362
vn -0.1062 -0.9258 0.3627
vn -0.2803 0.0730 0.9571
vn -0.0369 0.9913 0.1261
vn -0.0369 -0.9913 0.1261
vn -0.2803 -0.0730 0.9571
vn -0.1062 0.9258 0.3627
vn -0.2741 -0.2199 0.9362
vn -0.1641 0.8118 0.5604
vn -0.2612 -0.3691 0.8919
vn -0.2083 0.6712 0.7114
vn -0.2399 -0.5207 0.8194
vn -0.2399 0.5207 0.8194
vn -0.2083 -0.6712 0.7114
vn -0.2612 0.3691 0.8919
vn -0.2669 0.8131 0.5173
vn -0.4259 -0.3707 0.8254
vn -0.3392 0.6729 0.6573
vn -0.3910 -0.5225 0.7577
vn -0.3910 0.5225 0.7577
vn -0.3392 -0.6729 0.6573
vn -0.4259 0.3707 0.8254
vn -0.2669 -0.8131 0.5173
vn -0.4472 0.2209 0.8667
vn -0.1726 -0.9265 0.3345
vn -0.4573 0.0733 0.8863
vn -0.0600 0.9914 0.1162
vn -0.0600 -0.9914 0.1162
vn -0.4573 -0.0733 0.8863
vn -0.1726 0.9265 0.3345
vn -0.4472 -0.2209 0.8667
vn -0.6054 0.2222 0.7642
vn -0.2325 -0.9273 0.2935
vn -0.6193 0.0738 0.7817
vn -0.0807 0.9915 0.1019
vn -0.0807 -0.9915 0.1019
vn -0.6193 -0.0738 0.7817
vn -0.2325 0.9273 0.2935
vn -0.6054 -0.2222 0.7642
vn -0.3600 0.8148 0.4544
vn -0.5762 -0.3727 0.7274
vn -0.4580 0.6753 0.5781
vn -0.5286 -0.5248 0.6672
vn -0.5286 0.5248 0.6672
vn -0.4580 -0.6753 0.5781
vn -0.5762 0.3727 0.7274
vn -0.3600 -0.8148 0.4544
vn -0.7063 -0.3749 0.6005
vn -0.5602 0.6778 0.4762
vn -0.6473 -0.5275 0.5503
vn -0.6473 0.5275 0.5503
vn -0.5602 -0.6778 0.4762
vn -0.7063 0.3749 0.6005
vn -0.4396 -0.8167 0.3738
vn -0.7426 0.2237 0.6313
vn -0.2835 -0.9282 0.2411
vn -0.7598 0.0743 0.6459
vn -0.0984 0.9916 0.0836
vn -0.0984 -0.9916 0.0836
vn -0.7598 -0.0743 0.6459
vn -0.2836 0.9282 0.2411
vn -0.7426 -0.2237 0.6313
vn -0.4396 0.8167 0.3738
vn -0.3238 -0.9290 0.1793
vn -0.8724 0.0748 0.4831
vn -0.1122 0.9917 0.0621
vn -0.1122 -0.9917 0.0621
vn -0.8724 -0.0748 0.4831
vn -0.3238 0.9290 0.1793
vn -0.8524 -0.2250 0.4720
vn -0.5027 0.8185 0.2783
vn -0.8103 -0.3770 0.4487
vn -0.6413 0.6801 0.3551
vn -0.7419 -0.5299 0.4108
vn -0.7419 0.5299 0.4108
vn -0.6413 -0.6801 0.3551
vn -0.8103 0.3770 0.4487
vn -0.5027 -0.8185 0.2783
vn -0.8524 0.2250 0.4720
vn -0.6977 0.6820 0.2193
vn -0.8079 -0.5318 0.2539
vn -0.8079 0.5318 0.2539
vn -0.6977 -0.6820 0.2193
vn -0.8830 0.3786 0.2775
vn -0.5463 -0.8198 0.1717
vn -0.9293 0.2261 0.2920
vn -0.3516 -0.9296 0.1105
vn -0.9513 0.0752 0.2989
vn -0.1218 0.9918 0.0383
vn -0.1218 -0.9918 0.0383
vn -0.9513 -0.0752 0.2989
vn -0.3516 0.9296 0.1105
vn -0.9293 -0.2261 0.2920
vn -0.5463 0.8198 0.1717
vn -0.8830 -0.3786 0.2775
vn -0.9920 0.0754 0.1012
vn -0.1267 0.9919 0.0129
vn -0.1267 -0.9919 0.0129
vn -0.9920 -0.0754 0.1012
vn -0.3658 0.9300 0.0373
vn -0.9689 -0.2267 0.0989
vn -0.5686 0.8205 0.0580
vn -0.9204 -0.3795 0.0939
vn -0.7267 0.6830 0.0741
vn -0.8418 -0.5329 0.0859
vn -0.8418 0.5329 0.0859
vn -0.7267 -0.6830 0.0741
vn -0.9204 0.3795 0.0939
vn -0.5686 -0.8205 0.0580
vn -0.9689 0.2267 0.0989
vn -0.3658 -0.9300 0.0373
vn -0.8418 -0.5329 -0.0859
vn -0.8418 0.5329 -0.0859
vn -0.7267 -0.6830 -0.0741
vn -0.9204 0.3795 -0.0939
vn -0.5686 -0.8205 -0.0580
vn -0.9689 0.2267 -0.0989
vn -0.3658 -0.9300 -0.0373
vn -0.9920 0.0754 -0.1012
vn -0.1267 0.9919 -0.0129
vn -0.1267 -0.9919 -0.0129
vn -0.9920 -0.0754 -0.1012
vn -0.3658 0.9300 -0.0373
vn -0.9689 -0.2267 -0.0989
vn -0.5686 0.8205 -0.0580
vn -0.9204 -0.3795 -0.0939
vn -0.7267 0.6830 -0.0741
vn -0.1218 -0.9918 -0.0383
vn -0.9513 -0.0752 -0.2989
vn -0.3516 0.9296 -0.1105
vn -0.9293 -0.2261 -0.2920
vn -0.5463 0.8198 -0.1717
vn -0.8830 -0.3786 -0.2775
vn -0.6977 0.6820 -0.2193
vn -0.8079 -0.5318 -0.2539
vn -0.8079 0.5318 -0.2539
vn -0.6977 -0.6820 -0.2193
vn -0.8830 0.3786 -0.2775
vn -0.5463 -0.8198 -0.1717
vn -0.9293 0.2261 -0.2920
vn -0.3516 -0.9296 -0.1105
vn -0.9513 0.0752 -0.2989
vn -0.1218 0.9918 -0.0383
vn -0.6413 -0.6801 -0.3551
vn -0.8103 0.3770 -0.4487
vn -0.5027 -0.8185 -0.2783
vn -0.8524 0.2250 -0.4720
vn -0.3238 -0.9290 -0.1793
vn -0.8724 0.0748 -0.4831
vn -0.1122 0.9917 -0.0621
vn -0.1122 -0.9917 -0.0621
vn -0.8724 -0.0748 -0.4831
vn -0.3238 0.9290 -0.1793
vn -0.8524 -0.2250 -0.4720
vn -0.5027 0.8185 -0.2783
vn -0.8103 -0.3770 -0.4487
vn -0.6413 0.6801 -0.3551
vn -0.7419 -0.5299 -0.4108
vn -0.7419 0.5299 -0.4108
vn -0.2836 0.9282 -0.2411
vn -0.7426 -0.2237 -0.6313
vn -0.4396 0.8167 -0.3738
vn -0.7063 -0.3749 -0.6005
vn -0.5602 0.6778 -0.4762
vn -0.6473 -0.5275 -0.5503
vn -0.6473 0.5275 -0.5503
vn -0.5602 -0.6778 -0.4762
vn -0.7063 0.3749 -0.6005
vn -0.4396 -0.8167 -0.3738
vn -0.7426 0.2237 -0.6313
vn -0.2836 -0.9282 -0.2411
vn -0.7598 0.0743 -0.6459
vn -0.0984 0.9916 -0.0836
vn -0.0984 -0.9916 -0.0836
vn -0.7598 -0.0743 -0.6459
vn -0.5762 0.3727 -0.7274
vn -0.3600 -0.8148 -0.4544
vn -0.6054 0.2222 -0.7642
vn -0.2325 -0.9273 -0.2935
vn -0.6193 0.0738 -0.7817
vn -0.0807 0.9915 -0.1019
vn -0.0807 -0.9915 -0.1019
vn -0.6193 -0.0738 -0.7817
vn -0.2325 0.9273 -0.2935
vn -0.6054 -0.2222 -0.7642
vn -0.3600 0.8148 -0.4544
vn -0.5762 -0.3727 -0.7274
vn -0.4580 0.6753 -0.5781
vn -0.5286 -0.5248 -0.6672
vn -0.5286 0.5248 -0.6672
vn -0.4580 -0.6753 -0.5781
vn -0.4472 -0.2209 -0.8667
vn -0.2669 0.8131 -0.5173
vn -0.4259 -0.3707 -0.8254
vn -0.3392 0.6729 -0.6573
vn -0.3910 -0.5225 -0.7577
vn -0.3910 0.5225 -0.7577
vn -0.3392 -0.6729 -0.6573
vn -0.4259 0.3707 -0.8254
vn -0.2669 -0.8131 -0.5173
vn -0.4472 0.2209 -0.8667
vn -0.1726 -0.9265 -0.3345
vn -0.4573 0.0733 -0.8863
vn -0.0600 0.9914 -0.1162
vn -0.0600 -0.9914 -0.1162
vn -0.4573 -0.0733 -0.8863
vn -0.1726 0.9265 -0.3345
vn -0.1641 -0.8118 -0.5604
vn -0.2741 0.2199 -0.9362
vn -0.1062 -0.9258 -0.3627
vn -0.2803 0.0730 -0.9571
vn -0.0369 0.9913 -0.1261
vn -0.0369 -0.9913 -0.1261
vn -0.2803 -0.0730 -0.9571
vn -0.1062 0.9258 -0.3627
vn -0.2741 -0.2199 -0.9362
vn -0.1641 0.8118 -0.5604
vn -0.2612 -0.3691 -0.8919
vn -0.2083 0.6712 -0.7114
vn -0.2399 -0.5207 -0.8194
vn -0.2399 0.5207 -0.8194
vn -0.2083 -0.6712 -0.7114
vn -0.2612 0.3691 -0.8919
vn -0.0554 0.8111 -0.5823
vn -0.0880 -0.3683 -0.9255
vn -0.0702 0.6703 -0.7388
vn -0.0809 -0.5197 -0.8505
vn -0.0809 0.5197 -0.8505
vn -0.0702 -0.6703 -0.7388
vn -0.0880 0.3683 -0.9255
vn -0.0554 -0.8111 -0.5823
vn -0.0923 0.2194 -0.9713
vn -0.0359 -0.9255 -0.3771
vn -0.0944 0.0728 -0.9929
vn -0.0125 0.9913 -0.1311
vn -0.0125 -0.9913 -0.1311
vn -0.0944 -0.0728 -0.9929
vn -0.0358 0.9255 -0.3771
vn -0.0923 -0.2194 -0.9713
vn 0.2836 -0.9282 -0.2411
vn 0.2835 0.9282 -0.2411
vn 0.6977 0.6820 0.2193
vn 0.6978 -0.6820 0.2193
vn -0.2836 -0.9282 0.2411
vn -0.0358 -0.9255 -0.3771
vn -0.0359 0.9255 -0.3771
vt 0.750000 0.437500
vt 0.718750 0.375000
vt 0.750000 0.375000
vt 0.750000 0.812500
vt 0.718750 0.875000
vt 0.718750 0.812500
vt 0.718750 0.312500
vt 0.750000 0.312500
vt 0.750000 0.750000
vt 0.718750 0.750000
vt 0.718750 0.250000
vt 0.750000 0.250000
vt 0.750000 0.687500
vt 0.718750 0.687500
vt 0.718750 0.187500
vt 0.750000 0.187500
vt 0.750000 0.625000
vt 0.718750 0.625000
vt 0.718750 0.125000
vt 0.750000 0.125000
vt 0.750000 0.562500
vt 0.718750 0.562500
vt 0.718750 0.062500
vt 0.750000 0.062500
vt 0.718750 0.500000
vt 0.750000 0.500000
vt 0.750000 0.937500
vt 0.734375 1.000000
vt 0.718750 0.937500
vt 0.734375 0.000000
vt 0.718750 0.437500
vt 0.750000 0.875000
vt 0.687500 0.437500
vt 0.687500 0.875000
vt 0.687500 0.375000
vt 0.687500 0.812500
vt 0.687500 0.312500
vt 0.687500 0.750000
vt 0.687500 0.250000
vt 0.687500 0.687500
vt 0.687500 0.187500
vt 0.687500 0.625000
vt 0.687500 0.125000
vt 0.687500 0.562500
vt 0.687500 0.062500
vt 0.687500 0.500000
vt 0.703125 1.000000
vt 0.687500 0.937500
vt 0.703125 0.000000
vt 0.656250 0.250000
vt 0.656250 0.187500
vt 0.656250 0.625000
vt 0.656250 0.125000
vt 0.656250 0.562500
vt 0.656250 0.062500
vt 0.656250 0.500000
vt 0.671875 1.000000
vt 0.656250 0.937500
vt 0.671875 0.000000
vt 0.656250 0.437500
vt 0.656250 0.875000
vt 0.656250 0.375000
vt 0.656250 0.812500
vt 0.656250 0.312500
vt 0.656250 0.750000
vt 0.656250 0.687500
vt 0.625000 0.875000
vt 0.625000 0.375000
vt 0.625000 0.812500
vt 0.625000 0.312500
vt 0.625000 0.750000
vt 0.625000 0.250000
vt 0.625000 0.687500
vt 0.625000 0.187500
vt 0.625000 0.625000
vt 0.625000 0.125000
vt 0.625000 0.562500
vt 0.625000 0.062500
vt 0.625000 0.500000
vt 0.640625 1.000000
vt 0.625000 0.937500
vt 0.640625 0.000000
vt 0.625000 0.437500
vt 0.593750 0.625000
vt 0.593750 0.187500
vt 0.593750 0.125000
vt 0.593750 0.562500
vt 0.593750 0.062500
vt 0.593750 0.500000
vt 0.609375 1.000000
vt 0.593750 0.937500
vt 0.609375 0.000000
vt 0.593750 0.437500
vt 0.593750 0.875000
vt 0.593750 0.375000
vt 0.593750 0.812500
vt 0.593750 0.312500
vt 0.593750 0.750000
vt 0.593750 0.250000
vt 0.593750 0.687500
vt 0.562500 0.375000
vt 0.562500 0.812500
vt 0.562500 0.312500
vt 0.562500 0.750000
vt 0.562500 0.250000
vt 0.562500 0.687500
vt 0.562500 0.187500
vt 0.562500 0.625000
vt 0.562500 0.125000
vt 0.562500 0.562500
vt 0.562500 0.062500
vt 0.562500 0.500000
vt 0.578125 1.000000
vt 0.562500 0.937500
vt 0.578125 0.000000
vt 0.562500 0.437500
vt 0.562500 0.875000
vt 0.531250 0.187500
vt 0.531250 0.125000
vt 0.531250 0.625000
vt 0.531250 0.562500
vt 0.531250 0.062500
vt 0.531250 0.500000
vt 0.546875 1.000000
vt 0.531250 0.937500
vt 0.546875 0.000000
vt 0.531250 0.437500
vt 0.531250 0.875000
vt 0.531250 0.375000
vt 0.531250 0.812500
vt 0.531250 0.312500
vt 0.531250 0.750000
vt 0.531250 0.250000
vt 0.531250 0.687500
vt 0.500000 0.375000
vt 0.500000 0.312500
vt 0.500000 0.750000
vt 0.500000 0.250000
vt 0.500000 0.687500
vt 0.500000 0.187500
vt 0.500000 0.625000
vt 0.500000 0.125000
vt 0.500000 0.562500
vt 0.500000 0.062500
vt 0.500000 0.500000
vt 0.515625 1.000000
vt 0.500000 0.937500
vt 0.515625 0.000000
vt 0.500000 0.437500
vt 0.500000 0.875000
vt 0.500000 0.812500
vt 0.468750 0.062500
vt 0.468750 0.562500
vt 0.468750 0.500000
vt 0.484375 1.000000
vt 0.468750 0.937500
vt 0.484375 0.000000
vt 0.468750 0.437500
vt 0.468750 0.875000
vt 0.468750 0.375000
vt 0.468750 0.812500
vt 0.468750 0.312500
vt 0.468750 0.750000
vt 0.468750 0.250000
vt 0.468750 0.687500
vt 0.468750 0.187500
vt 0.468750 0.625000
vt 0.468750 0.125000
vt 0.437500 0.750000
vt 0.437500 0.250000
vt 0.437500 0.687500
vt 0.437500 0.187500
vt 0.437500 0.625000
vt 0.437500 0.125000
vt 0.437500 0.562500
vt 0.437500 0.062500
vt 0.437500 0.500000
vt 0.453125 1.000000
vt 0.437500 0.937500
vt 0.453125 0.000000
vt 0.437500 0.437500
vt 0.437500 0.875000
vt 0.437500 0.375000
vt 0.437500 0.812500
vt 0.437500 0.312500
vt 0.406250 0.562500
vt 0.406250 0.500000
vt 0.421875 1.000000
vt 0.406250 0.937500
vt 0.421875 0.000000
vt 0.406250 0.062500
vt 0.406250 0.437500
vt 0.406250 0.875000
vt 0.406250 0.375000
vt 0.406250 0.812500
vt 0.406250 0.312500
vt 0.406250 0.750000
vt 0.406250 0.250000
vt 0.406250 0.687500
vt 0.406250 0.187500
vt 0.406250 0.625000
vt 0.406250 0.125000
vt 0.375000 0.250000
vt 0.375000 0.750000
vt 0.375000 0.687500
vt 0.375000 0.187500
vt 0.375000 0.625000
vt 0.375000 0.125000
vt 0.375000 0.562500
vt 0.375000 0.062500
vt 0.375000 0.500000
vt 0.390625 1.000000
vt 0.375000 0.937500
vt 0.390625 0.000000
vt 0.375000 0.437500
vt 0.375000 0.875000
vt 0.375000 0.375000
vt 0.375000 0.812500
vt 0.375000 0.312500
vt 0.359375 1.000000
vt 0.343750 0.937500
vt 0.359375 0.000000
vt 0.343750 0.062500
vt 0.343750 0.437500
vt 0.343750 0.875000
vt 0.343750 0.375000
vt 0.343750 0.812500
vt 0.343750 0.312500
vt 0.343750 0.750000
vt 0.343750 0.250000
vt 0.343750 0.687500
vt 0.343750 0.187500
vt 0.343750 0.625000
vt 0.343750 0.125000
vt 0.343750 0.562500
vt 0.343750 0.500000
vt 0.312500 0.750000
vt 0.312500 0.687500
vt 0.312500 0.250000
vt 0.312500 0.187500
vt 0.312500 0.625000
vt 0.312500 0.125000
vt 0.312500 0.562500
vt 0.312500 0.062500
vt 0.312500 0.500000
vt 0.328125 1.000000
vt 0.312500 0.937500
vt 0.328125 0.000000
vt 0.312500 0.437500
vt 0.312500 0.875000
vt 0.312500 0.375000
vt 0.312500 0.812500
vt 0.312500 0.312500
vt 0.281250 0.437500
vt 0.281250 0.875000
vt 0.281250 0.375000
vt 0.281250 0.812500
vt 0.281250 0.312500
vt 0.281250 0.750000
vt 0.281250 0.250000
vt 0.281250 0.687500
vt 0.281250 0.187500
vt 0.281250 0.625000
vt 0.281250 0.125000
vt 0.281250 0.562500
vt 0.281250 0.062500
vt 0.281250 0.500000
vt 0.296875 1.000000
vt 0.281250 0.937500
vt 0.296875 0.000000
vt 0.250000 0.250000
vt 0.250000 0.187500
vt 0.250000 0.625000
vt 0.250000 0.125000
vt 0.250000 0.562500
vt 0.250000 0.062500
vt 0.250000 0.500000
vt 0.265625 1.000000
vt 0.250000 0.937500
vt 0.265625 0.000000
vt 0.250000 0.437500
vt 0.250000 0.875000
vt 0.250000 0.375000
vt 0.250000 0.812500
vt 0.250000 0.312500
vt 0.250000 0.750000
vt 0.250000 0.687500
vt 0.218750 0.375000
vt 0.218750 0.812500
vt 0.218750 0.312500
vt 0.218750 0.750000
vt 0.218750 0.250000
vt 0.218750 0.687500
vt 0.218750 0.187500
vt 0.218750 0.625000
vt 0.218750 0.125000
vt 0.218750 0.562500
vt 0.218750 0.062500
vt 0.218750 0.500000
vt 0.234375 1.000000
vt 0.218750 0.937500
vt 0.234375 0.000000
vt 0.218750 0.437500
vt 0.218750 0.875000
vt 0.187500 0.125000
vt 0.187500 0.625000
vt 0.187500 0.562500
vt 0.187500 0.062500
vt 0.187500 0.500000
vt 0.203125 1.000000
vt 0.187500 0.937500
vt 0.203125 0.000000
vt 0.187500 0.437500
vt 0.187500 0.875000
vt 0.187500 0.375000
vt 0.187500 0.812500
vt 0.187500 0.312500
vt 0.187500 0.750000
vt 0.187500 0.250000
vt 0.187500 0.687500
vt 0.187500 0.187500
vt 0.156250 0.812500
vt 0.156250 0.375000
vt 0.156250 0.312500
vt 0.156250 0.750000
vt 0.156250 0.250000
vt 0.156250 0.687500
vt 0.156250 0.187500
vt 0.156250 0.625000
vt 0.156250 0.125000
vt 0.156250 0.562500
vt 0.156250 0.062500
vt 0.156250 0.500000
vt 0.171875 1.000000
vt 0.156250 0.937500
vt 0.171875 0.000000
vt 0.156250 0.437500
vt 0.156250 0.875000
vt 0.125000 0.625000
vt 0.125000 0.562500
vt 0.125000 0.125000
vt 0.125000 0.062500
vt 0.125000 0.500000
vt 0.140625 1.000000
vt 0.125000 0.937500
vt 0.140625 0.000000
vt 0.125000 0.437500
vt 0.125000 0.875000
vt 0.125000 0.375000
vt 0.125000 0.812500
vt 0.125000 0.312500
vt 0.125000 0.750000
vt 0.125000 0.250000
vt 0.125000 0.687500
vt 0.125000 0.187500
vt 0.093750 0.375000
vt 0.093750 0.312500
vt 0.093750 0.750000
vt 0.093750 0.250000
vt 0.093750 0.687500
vt 0.093750 0.187500
vt 0.093750 0.625000
vt 0.093750 0.125000
vt 0.093750 0.562500
vt 0.093750 0.062500
vt 0.093750 0.500000
vt 0.109375 1.000000
vt 0.093750 0.937500
vt 0.109375 0.000000
vt 0.093750 0.437500
vt 0.093750 0.875000
vt 0.093750 0.812500
vt 0.062500 0.062500
vt 0.062500 0.562500
vt 0.062500 0.500000
vt 0.078125 1.000000
vt 0.062500 0.937500
vt 0.078125 0.000000
vt 0.062500 0.437500
vt 0.062500 0.875000
vt 0.062500 0.375000
vt 0.062500 0.812500
vt 0.062500 0.312500
vt 0.062500 0.750000
vt 0.062500 0.250000
vt 0.062500 0.687500
vt 0.062500 0.187500
vt 0.062500 0.625000
vt 0.062500 0.125000
vt 0.031250 0.750000
vt 0.031250 0.250000
vt 0.031250 0.687500
vt 0.031250 0.187500
vt 0.031250 0.625000
vt 0.031250 0.125000
vt 0.031250 0.562500
vt 0.031250 0.062500
vt 0.031250 0.500000
vt 0.046875 1.000000
vt 0.031250 0.937500
vt 0.046875 0.000000
vt 0.031250 0.437500
vt 0.031250 0.875000
vt 0.031250 0.375000
vt 0.031250 0.812500
vt 0.031250 0.312500
vt 0.000000 0.562500
vt 0.000000 0.500000
vt 0.015625 1.000000
vt 0.000000 0.937500
vt 0.015625 0.000000
vt 0.000000 0.062500
vt 0.000000 0.437500
vt 0.000000 0.875000
vt 0.000000 0.375000
vt 0.000000 0.812500
vt 0.000000 0.312500
vt 0.000000 0.750000
vt 0.000000 0.250000
vt 0.000000 0.687500
vt 0.000000 0.187500
vt 0.000000 0.625000
vt 0.000000 0.125000
vt 1.000000 0.312500
vt 0.968750 0.250000
vt 1.000000 0.250000
vt 1.000000 0.687500
vt 0.968750 0.750000
vt 0.968750 0.687500
vt 1.000000 0.187500
vt 0.968750 0.187500
vt 0.968750 0.625000
vt 1.000000 0.625000
vt 0.968750 0.125000
vt 1.000000 0.125000
vt 1.000000 0.562500
vt 0.968750 0.562500
vt 0.968750 0.062500
vt 1.000000 0.062500
vt 1.000000 0.500000
vt 0.968750 0.500000
vt 1.000000 0.937500
vt 0.984375 1.000000
vt 0.968750 0.937500
vt 0.984375 0.000000
vt 0.968750 0.437500
vt 1.000000 0.437500
vt 0.968750 0.875000
vt 1.000000 0.875000
vt 0.968750 0.375000
vt 1.000000 0.375000
vt 1.000000 0.812500
vt 0.968750 0.812500
vt 0.968750 0.312500
vt 1.000000 0.750000
vt 0.953125 0.000000
vt 0.937500 0.062500
vt 0.937500 0.437500
vt 0.937500 0.875000
vt 0.937500 0.375000
vt 0.937500 0.812500
vt 0.937500 0.312500
vt 0.937500 0.750000
vt 0.937500 0.250000
vt 0.937500 0.687500
vt 0.937500 0.187500
vt 0.937500 0.625000
vt 0.937500 0.125000
vt 0.937500 0.562500
vt 0.937500 0.500000
vt 0.953125 1.000000
vt 0.937500 0.937500
vt 0.906250 0.250000
vt 0.906250 0.187500
vt 0.906250 0.625000
vt 0.906250 0.125000
vt 0.906250 0.562500
vt 0.906250 0.062500
vt 0.906250 0.500000
vt 0.921875 1.000000
vt 0.906250 0.937500
vt 0.921875 0.000000
vt 0.906250 0.437500
vt 0.906250 0.875000
vt 0.906250 0.375000
vt 0.906250 0.812500
vt 0.906250 0.312500
vt 0.906250 0.750000
vt 0.906250 0.687500
vt 0.875000 0.875000
vt 0.875000 0.375000
vt 0.875000 0.812500
vt 0.875000 0.312500
vt 0.875000 0.750000
vt 0.875000 0.250000
vt 0.875000 0.687500
vt 0.875000 0.187500
vt 0.875000 0.625000
vt 0.875000 0.125000
vt 0.875000 0.562500
vt 0.875000 0.062500
vt 0.875000 0.500000
vt 0.890625 1.000000
vt 0.875000 0.937500
vt 0.890625 0.000000
vt 0.875000 0.437500
vt 0.843750 0.625000
vt 0.843750 0.125000
vt 0.843750 0.562500
vt 0.843750 0.062500
vt 0.843750 0.500000
vt 0.859375 1.000000
vt 0.843750 0.937500
vt 0.859375 0.000000
vt 0.843750 0.437500
vt 0.843750 0.875000
vt 0.843750 0.375000
vt 0.843750 0.812500
vt 0.843750 0.312500
vt 0.843750 0.750000
vt 0.843750 0.250000
vt 0.843750 0.687500
vt 0.843750 0.187500
vt 0.812500 0.375000
vt 0.812500 0.812500
vt 0.812500 0.312500
vt 0.812500 0.750000
vt 0.812500 0.250000
vt 0.812500 0.687500
vt 0.812500 0.187500
vt 0.812500 0.625000
vt 0.812500 0.125000
vt 0.812500 0.562500
vt 0.812500 0.062500
vt 0.812500 0.500000
vt 0.828125 1.000000
vt 0.812500 0.937500
vt 0.828125 0.000000
vt 0.812500 0.437500
vt 0.812500 0.875000
vt 0.781250 0.125000
vt 0.781250 0.625000
vt 0.781250 0.562500
vt 0.781250 0.062500
vt 0.781250 0.500000
vt 0.796875 1.000000
vt 0.781250 0.937500
vt 0.796875 0.000000
vt 0.781250 0.437500
vt 0.781250 0.875000
vt 0.781250 0.375000
vt 0.781250 0.812500
vt 0.781250 0.312500
vt 0.781250 0.750000
vt 0.781250 0.250000
vt 0.781250 0.687500
vt 0.781250 0.187500
vt 0.765625 1.000000
vt 0.765625 0.000000
s 0
f 487/560/518 498/561/518 960/562/518
f 483/563/519 490/564/519 491/565/519
f 960/562/520 499/566/520 488/567/520
f 958/568/521 491/565/521 492/569/521
f 488/567/522 500/570/522 961/571/522
f 484/572/523 492/569/523 493/573/523
f 961/571/524 501/574/524 962/575/524
f 959/576/525 493/573/525 494/577/525
f 962/575/526 502/578/526 963/579/526
f 485/580/527 494/577/527 495/581/527
f 963/579/528 503/582/528 964/583/528
f 485/580/529 496/584/529 486/585/529
f 956/586/530 564/587/530 489/588/530
f 790/589/531 964/583/531 503/582/531
f 486/585/532 497/590/532 487/560/532
f 957/591/533 489/588/533 490/564/533
f 496/584/534 512/592/534 497/590/534
f 489/588/535 505/593/535 490/564/535
f 497/590/536 513/594/536 498/561/536
f 490/564/537 506/595/537 491/565/537
f 498/561/538 514/596/538 499/566/538
f 491/565/539 507/597/539 492/569/539
f 499/566/540 515/598/540 500/570/540
f 492/569/541 508/599/541 493/573/541
f 501/574/542 515/598/542 516/600/542
f 493/573/543 509/601/543 494/577/543
f 502/578/544 516/600/544 517/602/544
f 495/581/545 509/601/545 510/603/545
f 502/578/546 518/604/546 503/582/546
f 495/581/547 511/605/547 496/584/547
f 489/588/548 564/606/548 504/607/548
f 790/608/549 503/582/549 518/604/549
f 516/600/550 530/609/550 531/610/550
f 508/599/551 524/611/551 509/601/551
f 516/600/552 532/612/552 517/602/552
f 509/601/553 525/613/553 510/603/553
f 517/602/554 533/614/554 518/604/554
f 510/603/555 526/615/555 511/605/555
f 504/607/556 564/616/556 519/617/556
f 790/618/557 518/604/557 533/614/557
f 512/592/558 526/615/558 527/619/558
f 504/607/559 520/620/559 505/593/559
f 512/592/560 528/621/560 513/594/560
f 505/593/561 521/622/561 506/595/561
f 514/596/562 528/621/562 529/623/562
f 506/595/563 522/624/563 507/597/563
f 515/598/564 529/623/564 530/609/564
f 507/597/565 523/625/565 508/599/565
f 519/617/566 535/626/566 520/620/566
f 527/619/567 543/627/567 528/621/567
f 520/620/568 536/628/568 521/622/568
f 529/623/569 543/627/569 544/629/569
f 521/622/570 537/630/570 522/624/570
f 529/623/571 545/631/571 530/609/571
f 522/624/572 538/632/572 523/625/572
f 530/609/573 546/633/573 531/610/573
f 523/625/574 539/634/574 524/611/574
f 531/610/575 547/635/575 532/612/575
f 525/613/576 539/634/576 540/636/576
f 532/612/577 548/637/577 533/614/577
f 526/615/578 540/636/578 541/638/578
f 519/617/579 564/639/579 534/640/579
f 790/641/580 533/614/580 548/637/580
f 526/615/581 542/642/581 527/619/581
f 538/632/582 554/643/582 539/634/582
f 547/635/583 561/644/583 562/645/583
f 540/636/584 554/643/584 555/646/584
f 547/635/585 563/647/585 548/637/585
f 541/638/586 555/646/586 556/648/586
f 534/640/587 564/649/587 549/650/587
f 790/651/588 548/637/588 563/647/588
f 541/638/589 557/652/589 542/642/589
f 534/640/590 550/653/590 535/626/590
f 542/642/591 558/654/591 543/627/591
f 535/626/592 551/655/592 536/628/592
f 544/629/593 558/654/593 559/656/593
f 536/628/594 552/657/594 537/630/594
f 544/629/595 560/658/595 545/631/595
f 538/632/596 552/657/596 553/659/596
f 545/631/597 561/644/597 546/633/597
f 557/652/598 574/660/598 558/654/598
f 550/653/599 567/661/599 551/655/599
f 559/656/600 574/660/600 575/662/600
f 551/655/601 568/663/601 552/657/601
f 559/656/602 576/664/602 560/658/602
f 553/659/603 568/663/603 569/665/603
f 560/658/604 577/666/604 561/644/604
f 553/659/605 570/667/605 554/643/605
f 561/644/606 578/668/606 562/645/606
f 555/646/607 570/667/607 571/669/607
f 563/647/608 578/668/608 579/670/608
f 556/648/609 571/669/609 572/671/609
f 549/650/610 564/672/610 565/673/610
f 790/674/611 563/647/611 579/670/611
f 556/648/612 573/675/612 557/652/612
f 549/650/613 566/676/613 550/653/613
f 578/668/614 592/677/614 593/678/614
f 571/669/615 585/679/615 586/680/615
f 578/668/616 594/681/616 579/670/616
f 572/671/617 586/680/617 587/682/617
f 565/673/618 564/683/618 580/684/618
f 790/685/619 579/670/619 594/681/619
f 572/671/620 588/686/620 573/675/620
f 566/676/621 580/684/621 581/687/621
f 573/675/622 589/688/622 574/660/622
f 566/676/623 582/689/623 567/661/623
f 575/662/624 589/688/624 590/690/624
f 567/661/625 583/691/625 568/663/625
f 575/662/626 591/692/626 576/664/626
f 569/665/627 583/691/627 584/693/627
f 577/666/628 591/692/628 592/677/628
f 569/665/629 585/679/629 570/667/629
f 590/690/630 604/694/630 605/695/630
f 582/689/631 598/696/631 583/691/631
f 590/690/632 606/697/632 591/692/632
f 584/693/633 598/696/633 599/698/633
f 592/677/634 606/697/634 607/699/634
f 584/693/635 600/700/635 585/679/635
f 592/677/636 608/701/636 593/678/636
f 586/680/637 600/700/637 601/702/637
f 593/678/638 609/703/638 594/681/638
f 587/682/639 601/702/639 602/704/639
f 580/684/640 564/705/640 595/706/640
f 790/707/641 594/681/641 609/703/641
f 587/682/642 603/708/642 588/686/642
f 581/687/643 595/706/643 596/709/643
f 588/686/644 604/694/644 589/688/644
f 581/687/645 597/710/645 582/689/645
f 608/701/646 624/711/646 609/703/646
f 602/704/647 616/712/647 617/713/647
f 595/706/648 564/714/648 610/715/648
f 790/716/649 609/703/649 624/711/649
f 602/704/650 618/717/650 603/708/650
f 595/706/651 611/718/651 596/709/651
f 603/708/652 619/719/652 604/694/652
f 596/709/653 612/720/653 597/710/653
f 605/695/654 619/719/654 620/721/654
f 597/710/655 613/722/655 598/696/655
f 605/695/656 621/723/656 606/697/656
f 599/698/657 613/722/657 614/724/657
f 607/699/658 621/723/658 622/725/658
f 599/698/659 615/726/659 600/700/659
f 607/699/660 623/727/660 608/701/660
f 601/702/661 615/726/661 616/712/661
f 612/720/662 628/728/662 613/722/662
f 620/721/663 636/729/663 621/723/663
f 614/724/664 628/728/664 629/730/664
f 622/725/665 636/729/665 637/731/665
f 614/724/666 630/732/666 615/726/666
f 622/725/667 638/733/667 623/727/667
f 616/712/668 630/732/668 631/734/668
f 624/711/669 638/733/669 639/735/669
f 617/713/670 631/734/670 632/736/670
f 610/715/671 564/737/671 625/738/671
f 790/739/672 624/711/672 639/735/672
f 617/713/673 633/740/673 618/717/673
f 610/715/674 626/741/674 611/718/674
f 618/717/675 634/742/675 619/719/675
f 612/720/676 626/741/676 627/743/676
f 620/721/677 634/742/677 635/744/677
f 632/736/678 646/745/678 647/746/678
f 625/738/679 564/747/679 640/748/679
f 790/749/680 639/735/680 654/750/680
f 632/736/681 648/751/681 633/740/681
f 625/738/682 641/752/682 626/741/682
f 633/740/683 649/753/683 634/742/683
f 627/743/684 641/752/684 642/754/684
f 635/744/685 649/753/685 650/755/685
f 627/743/686 643/756/686 628/728/686
f 635/744/687 651/757/687 636/729/687
f 629/730/688 643/756/688 644/758/688
f 637/731/689 651/757/689 652/759/689
f 629/730/690 645/760/690 630/732/690
f 637/731/691 653/761/691 638/733/691
f 631/734/692 645/760/692 646/745/692
f 638/733/693 654/750/693 639/735/693
f 650/755/694 666/762/694 651/757/694
f 644/758/695 658/763/695 659/764/695
f 651/757/696 667/765/696 652/759/696
f 644/758/697 660/766/697 645/760/697
f 653/761/698 667/765/698 668/767/698
f 646/745/699 660/766/699 661/768/699
f 654/750/700 668/767/700 669/769/700
f 647/746/701 661/768/701 662/770/701
f 640/748/702 564/771/702 655/772/702
f 790/773/703 654/750/703 669/769/703
f 647/746/704 663/774/704 648/751/704
f 640/748/705 656/775/705 641/752/705
f 648/751/706 664/776/706 649/753/706
f 641/752/707 657/777/707 642/754/707
f 650/755/708 664/776/708 665/778/708
f 642/754/709 658/763/709 643/756/709
f 655/772/710 564/779/710 670/780/710
f 790/781/711 669/769/711 684/782/711
f 662/770/712 678/783/712 663/774/712
f 656/775/713 670/780/713 671/784/713
f 663/774/714 679/785/714 664/776/714
f 657/777/715 671/784/715 672/786/715
f 665/778/716 679/785/716 680/787/716
f 657/777/717 673/788/717 658/763/717
f 665/778/718 681/789/718 666/762/718
f 659/764/719 673/788/719 674/790/719
f 667/765/720 681/789/720 682/791/720
f 659/764/721 675/792/721 660/766/721
f 667/765/722 683/793/722 668/767/722
f 661/768/723 675/792/723 676/794/723
f 668/767/724 684/782/724 669/769/724
f 662/770/725 676/794/725 677/795/725
f 674/790/726 688/796/726 689/797/726
f 682/791/727 696/798/727 697/799/727
f 674/790/728 690/800/728 675/792/728
f 682/791/729 698/801/729 683/793/729
f 676/794/730 690/800/730 691/802/730
f 683/793/731 699/803/731 684/782/731
f 677/795/732 691/802/732 692/804/732
f 670/780/733 564/805/733 685/806/733
f 790/807/734 684/782/734 699/803/734
f 677/795/735 693/808/735 678/783/735
f 670/780/736 686/809/736 671/784/736
f 678/783/737 694/810/737 679/785/737
f 671/784/738 687/811/738 672/786/738
f 680/787/739 694/810/739 695/812/739
f 672/786/740 688/796/740 673/788/740
f 680/787/741 696/798/741 681/789/741
f 692/804/742 708/813/742 693/808/742
f 685/806/743 701/814/743 686/809/743
f 693/808/744 709/815/744 694/810/744
f 686/809/745 702/816/745 687/811/745
f 695/812/746 709/815/746 710/817/746
f 687/811/747 703/818/747 688/796/747
f 695/812/748 711/819/748 696/798/748
f 689/797/749 703/818/749 704/820/749
f 697/799/750 711/819/750 712/821/750
f 689/797/751 705/822/751 690/800/751
f 697/799/752 713/823/752 698/801/752
f 691/802/753 705/822/753 706/824/753
f 698/801/754 714/825/754 699/803/754
f 692/804/755 706/824/755 707/826/755
f 685/806/756 564/827/756 700/828/756
f 790/829/757 699/803/757 714/825/757
f 712/821/758 726/830/758 727/831/758
f 704/820/759 720/832/759 705/822/759
f 712/821/760 728/833/760 713/823/760
f 706/824/761 720/832/761 721/834/761
f 714/825/762 728/833/762 729/835/762
f 707/826/763 721/834/763 722/836/763
f 700/828/764 564/837/764 715/838/764
f 790/839/765 714/825/765 729/835/765
f 707/826/766 723/840/766 708/813/766
f 700/828/767 716/841/767 701/814/767
f 708/813/768 724/842/768 709/815/768
f 702/816/769 716/841/769 717/843/769
f 710/817/770 724/842/770 725/844/770
f 702/816/771 718/845/771 703/818/771
f 710/817/772 726/830/772 711/819/772
f 704/820/773 718/845/773 719/846/773
f 723/840/774 739/847/774 724/842/774
f 716/841/775 732/848/775 717/843/775
f 725/844/776 739/847/776 740/849/776
f 717/843/777 733/850/777 718/845/777
f 725/844/778 741/851/778 726/830/778
f 719/846/779 733/850/779 734/852/779
f 727/831/780 741/851/780 742/853/780
f 719/846/781 735/854/781 720/832/781
f 727/831/782 743/855/782 728/833/782
f 721/834/783 735/854/783 736/856/783
f 728/833/784 744/857/784 729/835/784
f 722/836/785 736/856/785 737/858/785
f 715/838/786 564/859/786 730/860/786
f 790/861/787 729/835/787 744/857/787
f 722/836/788 738/862/788 723/840/788
f 715/838/789 731/863/789 716/841/789
f 742/853/790 758/864/790 743/855/790
f 736/856/791 750/865/791 751/866/791
f 743/855/792 759/867/792 744/857/792
f 737/858/793 751/866/793 752/868/793
f 730/860/794 564/869/794 745/870/794
f 790/871/795 744/857/795 759/867/795
f 737/858/796 753/872/796 738/862/796
f 731/863/797 745/870/797 746/873/797
f 738/862/798 754/874/798 739/847/798
f 731/863/799 747/875/799 732/848/799
f 740/849/800 754/874/800 755/876/800
f 732/848/801 748/877/801 733/850/801
f 740/849/802 756/878/802 741/851/802
f 734/852/803 748/877/803 749/879/803
f 742/853/804 756/878/804 757/880/804
f 734/852/805 750/865/805 735/854/805
f 746/873/806 762/881/806 747/875/806
f 755/876/807 769/882/807 770/883/807
f 747/875/808 763/884/808 748/877/808
f 755/876/809 771/885/809 756/878/809
f 749/879/810 763/884/810 764/886/810
f 757/880/811 771/885/811 772/887/811
f 749/879/812 765/888/812 750/865/812
f 757/880/813 773/889/813 758/864/813
f 751/866/814 765/888/814 766/890/814
f 759/867/815 773/889/815 774/891/815
f 752/868/816 766/890/816 767/892/816
f 745/870/817 564/893/817 760/894/817
f 790/895/818 759/867/818 774/891/818
f 752/868/819 768/896/819 753/872/819
f 746/873/820 760/894/820 761/897/820
f 753/872/821 769/882/821 754/874/821
f 766/890/822 780/898/822 781/899/822
f 774/891/823 788/900/823 789/901/823
f 767/892/824 781/899/824 782/902/824
f 760/894/825 564/903/825 775/904/825
f 790/905/826 774/891/826 789/901/826
f 767/892/827 783/906/827 768/896/827
f 760/894/828 776/907/828 761/897/828
f 768/896/829 784/908/829 769/882/829
f 761/897/830 777/909/830 762/881/830
f 770/883/831 784/908/831 785/910/831
f 762/881/832 778/911/832 763/884/832
f 770/883/833 786/912/833 771/885/833
f 764/886/834 778/911/834 779/913/834
f 772/887/835 786/912/835 787/914/835
f 764/886/836 780/898/836 765/888/836
f 772/887/837 788/900/837 773/889/837
f 785/910/838 800/915/838 801/916/838
f 777/909/839 794/917/839 778/911/839
f 785/910/840 802/918/840 786/912/840
f 779/913/841 794/917/841 795/919/841
f 787/914/842 802/918/842 803/920/842
f 779/913/843 796/921/843 780/898/843
f 787/914/844 804/922/844 788/900/844
f 781/899/845 796/921/845 797/923/845
f 788/900/846 805/924/846 789/901/846
f 782/902/847 797/923/847 798/925/847
f 775/904/848 564/926/848 791/927/848
f 790/928/849 789/901/849 805/924/849
f 782/902/850 799/929/850 783/906/850
f 775/904/851 792/930/851 776/907/851
f 783/906/852 800/915/852 784/908/852
f 776/907/853 793/931/853 777/909/853
f 804/922/854 820/932/854 805/924/854
f 798/925/855 812/933/855 813/934/855
f 791/927/856 564/935/856 806/936/856
f 790/937/857 805/924/857 820/932/857
f 798/925/858 814/938/858 799/929/858
f 791/927/859 807/939/859 792/930/859
f 799/929/860 815/940/860 800/915/860
f 792/930/861 808/941/861 793/931/861
f 801/916/862 815/940/862 816/942/862
f 793/931/863 809/943/863 794/917/863
f 801/916/864 817/944/864 802/918/864
f 794/917/865 810/945/865 795/919/865
f 803/920/866 817/944/866 818/946/866
f 795/919/867 811/947/867 796/921/867
f 803/920/868 819/948/868 804/922/868
f 797/923/869 811/947/869 812/933/869
f 808/941/870 824/949/870 809/943/870
f 816/942/871 832/950/871 817/944/871
f 810/945/872 824/949/872 825/951/872
f 818/946/873 832/950/873 833/952/873
f 810/945/874 826/953/874 811/947/874
f 818/946/875 834/954/875 819/948/875
f 812/933/876 826/953/876 827/955/876
f 819/948/877 835/956/877 820/932/877
f 813/934/878 827/955/878 828/957/878
f 806/936/879 564/958/879 821/959/879
f 790/960/880 820/932/880 835/956/880
f 813/934/881 829/961/881 814/938/881
f 806/936/882 822/962/882 807/939/882
f 814/938/883 830/963/883 815/940/883
f 807/939/884 823/964/884 808/941/884
f 816/942/885 830/963/885 831/965/885
f 828/957/886 842/966/886 843/967/886
f 821/959/887 564/968/887 836/969/887
f 790/970/888 835/956/888 850/971/888
f 828/957/889 844/972/889 829/961/889
f 821/959/890 837/973/890 822/962/890
f 829/961/891 845/974/891 830/963/891
f 823/964/892 837/973/892 838/975/892
f 831/965/893 845/974/893 846/976/893
f 823/964/894 839/977/894 824/949/894
f 831/965/895 847/978/895 832/950/895
f 825/951/896 839/977/896 840/979/896
f 833/952/897 847/978/897 848/980/897
f 825/951/898 841/981/898 826/953/898
f 833/952/899 849/982/899 834/954/899
f 827/955/900 841/981/900 842/966/900
f 834/954/901 850/971/901 835/956/901
f 846/983/902 862/984/902 847/985/902
f 840/986/903 854/987/903 855/988/903
f 848/989/904 862/984/904 863/990/904
f 840/986/905 856/991/905 841/992/905
f 848/989/906 864/993/906 849/994/906
f 842/995/907 856/991/907 857/996/907
f 849/994/908 865/997/908 850/998/908
f 843/999/909 857/996/909 858/1000/909
f 836/1001/910 564/1002/910 851/1003/910
f 790/1004/911 850/998/911 865/997/911
f 843/999/912 859/1005/912 844/1006/912
f 836/1001/913 852/1007/913 837/1008/913
f 844/1006/914 860/1009/914 845/1010/914
f 838/1011/915 852/1007/915 853/1012/915
f 846/983/916 860/1009/916 861/1013/916
f 838/1011/917 854/987/917 839/1014/917
f 790/1015/918 865/997/918 880/1016/918
f 858/1000/919 874/1017/919 859/1005/919
f 851/1003/920 867/1018/920 852/1007/920
f 859/1005/921 875/1019/921 860/1009/921
f 853/1012/922 867/1018/922 868/1020/922
f 861/1013/923 875/1019/923 876/1021/923
f 853/1012/924 869/1022/924 854/987/924
f 861/1013/925 877/1023/925 862/984/925
f 855/988/926 869/1022/926 870/1024/926
f 863/990/927 877/1023/927 878/1025/927
f 855/988/928 871/1026/928 856/991/928
f 863/990/929 879/1027/929 864/993/929
f 857/996/930 871/1026/930 872/1028/930
f 864/993/931 880/1016/931 865/997/931
f 858/1000/932 872/1028/932 873/1029/932
f 851/1003/933 564/1030/933 866/1031/933
f 878/1025/934 892/1032/934 893/1033/934
f 870/1024/935 886/1034/935 871/1026/935
f 879/1027/936 893/1033/936 894/1035/936
f 872/1028/937 886/1034/937 887/1036/937
f 879/1027/938 895/1037/938 880/1016/938
f 873/1029/939 887/1036/939 888/1038/939
f 866/1031/940 564/1039/940 881/1040/940
f 790/1041/941 880/1016/941 895/1037/941
f 873/1029/942 889/1042/942 874/1017/942
f 867/1018/943 881/1040/943 882/1043/943
f 874/1017/944 890/1044/944 875/1019/944
f 867/1018/945 883/1045/945 868/1020/945
f 876/1021/946 890/1044/946 891/1046/946
f 868/1020/947 884/1047/947 869/1022/947
f 876/1021/948 892/1032/948 877/1023/948
f 870/1024/949 884/1047/949 885/1048/949
f 881/1040/950 897/1049/950 882/1043/950
f 889/1042/951 905/1050/951 890/1044/951
f 883/1045/952 897/1049/952 898/1051/952
f 891/1046/953 905/1050/953 906/1052/953
f 883/1045/954 899/1053/954 884/1047/954
f 891/1046/955 907/1054/955 892/1032/955
f 885/1048/956 899/1053/956 900/1055/956
f 893/1033/957 907/1054/957 908/1056/957
f 885/1048/958 901/1057/958 886/1034/958
f 893/1033/959 909/1058/959 894/1035/959
f 887/1036/960 901/1057/960 902/1059/960
f 894/1035/961 910/1060/961 895/1037/961
f 888/1038/962 902/1059/962 903/1061/962
f 881/1040/963 564/1062/963 896/1063/963
f 790/1064/964 895/1037/964 910/1060/964
f 888/1038/965 904/1065/965 889/1042/965
f 900/1055/966 916/1066/966 901/1057/966
f 908/1056/967 924/1067/967 909/1058/967
f 902/1059/968 916/1066/968 917/1068/968
f 909/1058/969 925/1069/969 910/1060/969
f 903/1061/970 917/1068/970 918/1070/970
f 896/1063/971 564/1071/971 911/1072/971
f 790/1073/972 910/1060/972 925/1069/972
f 903/1061/973 919/1074/973 904/1065/973
f 896/1063/974 912/1075/974 897/1049/974
f 904/1065/975 920/1076/975 905/1050/975
f 898/1051/976 912/1075/976 913/1077/976
f 906/1052/977 920/1076/977 921/1078/977
f 898/1051/978 914/1079/978 899/1053/978
f 906/1052/979 922/1080/979 907/1054/979
f 900/1055/980 914/1079/980 915/1081/980
f 908/1056/981 922/1080/981 923/1082/981
f 919/1074/982 935/1083/982 920/1076/982
f 912/1075/983 928/1084/983 913/1077/983
f 921/1078/984 935/1083/984 936/1085/984
f 913/1077/985 929/1086/985 914/1079/985
f 921/1078/986 937/1087/986 922/1080/986
f 915/1081/987 929/1086/987 930/1088/987
f 923/1082/988 937/1087/988 938/1089/988
f 915/1081/989 931/1090/989 916/1066/989
f 923/1082/990 939/1091/990 924/1067/990
f 917/1068/991 931/1090/991 932/1092/991
f 924/1067/992 940/1093/992 925/1069/992
f 918/1070/993 932/1092/993 933/1094/993
f 911/1072/994 564/1095/994 926/1096/994
f 790/1097/995 925/1069/995 940/1093/995
f 918/1070/996 934/1098/996 919/1074/996
f 912/1075/997 926/1096/997 927/1099/997
f 938/1089/998 954/1100/998 939/1091/998
f 932/1092/999 946/1101/999 947/1102/999
f 940/1093/1000 954/1100/1000 955/1103/1000
f 933/1094/1001 947/1102/1001 948/1104/1001
f 926/1096/1002 564/1105/1002 941/1106/1002
f 790/1107/1003 940/1093/1003 955/1103/1003
f 933/1094/1004 949/1108/1004 934/1098/1004
f 926/1096/1005 942/1109/1005 927/1099/1005
f 934/1098/1006 950/1110/1006 935/1083/1006
f 928/1084/1007 942/1109/1007 943/1111/1007
f 936/1085/1008 950/1110/1008 951/1112/1008
f 928/1084/1009 944/1113/1009 929/1086/1009
f 936/1085/1010 952/1114/1010 937/1087/1010
f 930/1088/1011 944/1113/1011 945/1115/1011
f 938/1089/1012 952/1114/1012 953/1116/1012
f 930/1088/1013 946/1101/1013 931/1090/1013
f 943/1111/1014 957/591/1014 483/563/1014
f 950/1110/1015 488/567/1015 951/1112/1015
f 944/1113/1016 483/563/1016 958/568/1016
f 951/1112/1017 961/571/1017 952/1114/1017
f 945/1115/1018 958/568/1018 484/572/1018
f 953/1116/1019 961/571/1019 962/575/1019
f 946/1101/1020 484/572/1020 959/576/1020
f 953/1116/1021 963/579/1021 954/1100/1021
f 947/1102/1022 959/576/1022 485/580/1022
f 954/1100/1023 964/583/1023 955/1103/1023
f 948/1104/1024 485/580/1024 486/585/1024
f 941/1106/1025 564/1117/1025 956/586/1025
f 790/1118/1026 955/1103/1026 964/583/1026
f 948/1104/1027 487/560/1027 949/1108/1027
f 942/1109/1028 956/586/1028 957/591/1028
f 949/1108/1029 960/562/1029 950/1110/1029
f 487/560/518 497/590/518 498/561/518
f 483/563/519 957/591/519 490/564/519
f 960/562/520 498/561/520 499/566/520
f 958/568/521 483/563/521 491/565/521
f 488/567/522 499/566/522 500/570/522
f 484/572/523 958/568/523 492/569/523
f 961/571/524 500/570/524 501/574/524
f 959/576/525 484/572/525 493/573/525
f 962/575/526 501/574/526 502/578/526
f 485/580/527 959/576/527 494/577/527
f 963/579/528 502/578/528 503/582/528
f 485/580/529 495/581/529 496/584/529
f 486/585/532 496/584/532 497/590/532
f 957/591/533 956/586/533 489/588/533
f 496/584/534 511/605/534 512/592/534
f 489/588/535 504/607/535 505/593/535
f 497/590/536 512/592/536 513/594/536
f 490/564/537 505/593/537 506/595/537
f 498/561/538 513/594/538 514/596/538
f 491/565/539 506/595/539 507/597/539
f 499/566/540 514/596/540 515/598/540
f 492/569/541 507/597/541 508/599/541
f 501/574/542 500/570/542 515/598/542
f 493/573/543 508/599/543 509/601/543
f 502/578/544 501/574/544 516/600/544
f 495/581/545 494/577/545 509/601/545
f 502/578/546 517/602/546 518/604/546
f 495/581/547 510/603/547 511/605/547
f 516/600/550 515/598/550 530/609/550
f 508/599/551 523/625/551 524/611/551
f 516/600/552 531/610/552 532/612/552
f 509/601/553 524/611/553 525/613/553
f 517/602/554 532/612/554 533/614/554
f 510/603/555 525/613/555 526/615/555
f 512/592/558 511/605/558 526/615/558
f 504/607/559 519/617/559 520/620/559
f 512/592/560 527/619/560 528/621/560
f 505/593/561 520/620/561 521/622/561
f 514/596/562 513/594/562 528/621/562
f 506/595/563 521/622/563 522/624/563
f 515/598/564 514/596/564 529/623/564
f 507/597/565 522/624/565 523/625/565
f 519/617/566 534/640/566 535/626/566
f 527/619/567 542/642/567 543/627/567
f 520/620/568 535/626/568 536/628/568
f 529/623/569 528/621/569 543/627/569
f 521/622/570 536/628/570 537/630/570
f 529/623/571 544/629/571 545/631/571
f 522/624/572 537/630/572 538/632/572
f 530/609/573 545/631/573 546/633/573
f 523/625/574 538/632/574 539/634/574
f 531/610/575 546/633/575 547/635/575
f 525/613/576 524/611/576 539/634/576
f 532/612/577 547/635/577 548/637/577
f 526/615/578 525/613/578 540/636/578
f 526/615/581 541/638/581 542/642/581
f 538/632/582 553/659/582 554/643/582
f 547/635/583 546/633/583 561/644/583
f 540/636/584 539/634/584 554/643/584
f 547/635/1030 562/645/1030 563/647/1030
f 541/638/586 540/636/586 555/646/586
f 541/638/589 556/648/589 557/652/589
f 534/640/1031 549/650/1031 550/653/1031
f 542/642/591 557/652/591 558/654/591
f 535/626/592 550/653/592 551/655/592
f 544/629/593 543/627/593 558/654/593
f 536/628/594 551/655/594 552/657/594
f 544/629/595 559/656/595 560/658/595
f 538/632/596 537/630/596 552/657/596
f 545/631/597 560/658/597 561/644/597
f 557/652/598 573/675/598 574/660/598
f 550/653/599 566/676/599 567/661/599
f 559/656/600 558/654/600 574/660/600
f 551/655/601 567/661/601 568/663/601
f 559/656/602 575/662/602 576/664/602
f 553/659/603 552/657/603 568/663/603
f 560/658/604 576/664/604 577/666/604
f 553/659/605 569/665/605 570/667/605
f 561/644/606 577/666/606 578/668/606
f 555/646/607 554/643/607 570/667/607
f 563/647/608 562/645/608 578/668/608
f 556/648/609 555/646/609 571/669/609
f 556/648/612 572/671/612 573/675/612
f 549/650/613 565/673/613 566/676/613
f 578/668/614 577/666/614 592/677/614
f 571/669/615 570/667/615 585/679/615
f 578/668/616 593/678/616 594/681/616
f 572/671/617 571/669/617 586/680/617
f 572/671/620 587/682/620 588/686/620
f 566/676/621 565/673/621 580/684/621
f 573/675/622 588/686/622 589/688/622
f 566/676/623 581/687/623 582/689/623
f 575/662/624 574/660/624 589/688/624
f 567/661/625 582/689/625 583/691/625
f 575/662/626 590/690/626 591/692/626
f 569/665/627 568/663/627 583/691/627
f 577/666/628 576/664/628 591/692/628
f 569/665/629 584/693/629 585/679/629
f 590/690/630 589/688/630 604/694/630
f 582/689/631 597/710/631 598/696/631
f 590/690/632 605/695/632 606/697/632
f 584/693/633 583/691/633 598/696/633
f 592/677/634 591/692/634 606/697/634
f 584/693/635 599/698/635 600/700/635
f 592/677/636 607/699/636 608/701/636
f 586/680/637 585/679/637 600/700/637
f 593/678/638 608/701/638 609/703/638
f 587/682/639 586/680/639 601/702/639
f 587/682/642 602/704/642 603/708/642
f 581/687/643 580/684/643 595/706/643
f 588/686/644 603/708/644 604/694/644
f 581/687/645 596/709/645 597/710/645
f 608/701/646 623/727/646 624/711/646
f 602/704/647 601/702/647 616/712/647
f 602/704/650 617/713/650 618/717/650
f 595/706/651 610/715/651 611/718/651
f 603/708/652 618/717/652 619/719/652
f 596/709/653 611/718/653 612/720/653
f 605/695/654 604/694/654 619/719/654
f 597/710/655 612/720/655 613/722/655
f 605/695/656 620/721/656 621/723/656
f 599/698/657 598/696/657 613/722/657
f 607/699/658 606/697/658 621/723/658
f 599/698/659 614/724/659 615/726/659
f 607/699/660 622/725/660 623/727/660
f 601/702/661 600/700/661 615/726/661
f 612/720/1032 627/743/1032 628/728/1032
f 620/721/663 635/744/663 636/729/663
f 614/724/664 613/722/664 628/728/664
f 622/725/1033 621/723/1033 636/729/1033
f 614/724/666 629/730/666 630/732/666
f 622/725/667 637/731/667 638/733/667
f 616/712/668 615/726/668 630/732/668
f 624/711/669 623/727/669 638/733/669
f 617/713/670 616/712/670 631/734/670
f 617/713/673 632/736/673 633/740/673
f 610/715/674 625/738/674 626/741/674
f 618/717/675 633/740/675 634/742/675
f 612/720/676 611/718/676 626/741/676
f 620/721/677 619/719/677 634/742/677
f 632/736/678 631/734/678 646/745/678
f 632/736/681 647/746/681 648/751/681
f 625/738/682 640/748/682 641/752/682
f 633/740/683 648/751/683 649/753/683
f 627/743/684 626/741/684 641/752/684
f 635/744/685 634/742/685 649/753/685
f 627/743/686 642/754/686 643/756/686
f 635/744/687 650/755/687 651/757/687
f 629/730/688 628/728/688 643/756/688
f 637/731/689 636/729/689 651/757/689
f 629/730/690 644/758/690 645/760/690
f 637/731/691 652/759/691 653/761/691
f 631/734/692 630/732/692 645/760/692
f 638/733/693 653/761/693 654/750/693
f 650/755/694 665/778/694 666/762/694
f 644/758/695 643/756/695 658/763/695
f 651/757/696 666/762/696 667/765/696
f 644/758/697 659/764/697 660/766/697
f 653/761/698 652/759/698 667/765/698
f 646/745/699 645/760/699 660/766/699
f 654/750/700 653/761/700 668/767/700
f 647/746/701 646/745/701 661/768/701
f 647/746/704 662/770/704 663/774/704
f 640/748/705 655/772/705 656/775/705
f 648/751/706 663/774/706 664/776/706
f 641/752/707 656/775/707 657/777/707
f 650/755/708 649/753/708 664/776/708
f 642/754/709 657/777/709 658/763/709
f 662/770/712 677/795/712 678/783/712
f 656/775/713 655/772/713 670/780/713
f 663/774/714 678/783/714 679/785/714
f 657/777/715 656/775/715 671/784/715
f 665/778/716 664/776/716 679/785/716
f 657/777/717 672/786/717 673/788/717
f 665/778/718 680/787/718 681/789/718
f 659/764/719 658/763/719 673/788/719
f 667/765/720 666/762/720 681/789/720
f 659/764/721 674/790/721 675/792/721
f 667/765/722 682/791/722 683/793/722
f 661/768/723 660/766/723 675/792/723
f 668/767/724 683/793/724 684/782/724
f 662/770/725 661/768/725 676/794/725
f 674/790/726 673/788/726 688/796/726
f 682/791/727 681/789/727 696/798/727
f 674/790/728 689/797/728 690/800/728
f 682/791/729 697/799/729 698/801/729
f 676/794/730 675/792/730 690/800/730
f 683/793/731 698/801/731 699/803/731
f 677/795/732 676/794/732 691/802/732
f 677/795/735 692/804/735 693/808/735
f 670/780/736 685/806/736 686/809/736
f 678/783/737 693/808/737 694/810/737
f 671/784/738 686/809/738 687/811/738
f 680/787/739 679/785/739 694/810/739
f 672/786/740 687/811/740 688/796/740
f 680/787/741 695/812/741 696/798/741
f 692/804/742 707/826/742 708/813/742
f 685/806/743 700/828/743 701/814/743
f 693/808/744 708/813/744 709/815/744
f 686/809/745 701/814/745 702/816/745
f 695/812/746 694/810/746 709/815/746
f 687/811/747 702/816/747 703/818/747
f 695/812/748 710/817/748 711/819/748
f 689/797/749 688/796/749 703/818/749
f 697/799/750 696/798/750 711/819/750
f 689/797/751 704/820/751 705/822/751
f 697/799/752 712/821/752 713/823/752
f 691/802/753 690/800/753 705/822/753
f 698/801/754 713/823/754 714/825/754
f 692/804/755 691/802/755 706/824/755
f 712/821/758 711/819/758 726/830/758
f 704/820/759 719/846/759 720/832/759
f 712/821/760 727/831/760 728/833/760
f 706/824/761 705/822/761 720/832/761
f 714/825/762 713/823/762 728/833/762
f 707/826/763 706/824/763 721/834/763
f 707/826/766 722/836/766 723/840/766
f 700/828/767 715/838/767 716/841/767
f 708/813/768 723/840/768 724/842/768
f 702/816/769 701/814/769 716/841/769
f 710/817/770 709/815/770 724/842/770
f 702/816/771 717/843/771 718/845/771
f 710/817/772 725/844/772 726/830/772
f 704/820/773 703/818/773 718/845/773
f 723/840/774 738/862/774 739/847/774
f 716/841/775 731/863/775 732/848/775
f 725/844/776 724/842/776 739/847/776
f 717/843/777 732/848/777 733/850/777
f 725/844/778 740/849/778 741/851/778
f 719/846/779 718/845/779 733/850/779
f 727/831/780 726/830/780 741/851/780
f 719/846/781 734/852/781 735/854/781
f 727/831/782 742/853/782 743/855/782
f 721/834/783 720/832/783 735/854/783
f 728/833/784 743/855/784 744/857/784
f 722/836/785 721/834/785 736/856/785
f 722/836/788 737/858/788 738/862/788
f 715/838/789 730/860/789 731/863/789
f 742/853/790 757/880/790 758/864/790
f 736/856/791 735/854/791 750/865/791
f 743/855/792 758/864/792 759/867/792
f 737/858/793 736/856/793 751/866/793
f 737/858/796 752/868/796 753/872/796
f 731/863/797 730/860/797 745/870/797
f 738/862/798 753/872/798 754/874/798
f 731/863/799 746/873/799 747/875/799
f 740/849/800 739/847/800 754/874/800
f 732/848/801 747/875/801 748/877/801
f 740/849/802 755/876/802 756/878/802
f 734/852/803 733/850/803 748/877/803
f 742/853/804 741/851/804 756/878/804
f 734/852/805 749/879/805 750/865/805
f 746/873/806 761/897/806 762/881/806
f 755/876/807 754/874/807 769/882/807
f 747/875/808 762/881/808 763/884/808
f 755/876/809 770/883/809 771/885/809
f 749/879/810 748/877/810 763/884/810
f 757/880/811 756/878/811 771/885/811
f 749/879/812 764/886/812 765/888/812
f 757/880/813 772/887/813 773/889/813
f 751/866/814 750/865/814 765/888/814
f 759/867/815 758/864/815 773/889/815
f 752/868/816 751/866/816 766/890/816
f 752/868/819 767/892/819 768/896/819
f 746/873/820 745/870/820 760/894/820
f 753/872/821 768/896/821 769/882/821
f 766/890/822 765/888/822 780/898/822
f 774/891/823 773/889/823 788/900/823
f 767/892/824 766/890/824 781/899/824
f 767/892/827 782/902/827 783/906/827
f 760/894/828 775/904/828 776/907/828
f 768/896/829 783/906/829 784/908/829
f 761/897/830 776/907/830 777/909/830
f 770/883/831 769/882/831 784/908/831
f 762/881/832 777/909/832 778/911/832
f 770/883/833 785/910/833 786/912/833
f 764/886/834 763/884/834 778/911/834
f 772/887/835 771/885/835 786/912/835
f 764/886/836 779/913/836 780/898/836
f 772/887/837 787/914/837 788/900/837
f 785/910/838 784/908/838 800/915/838
f 777/909/839 793/931/839 794/917/839
f 785/910/840 801/916/840 802/918/840
f 779/913/841 778/911/841 794/917/841
f 787/914/842 786/912/842 802/918/842
f 779/913/843 795/919/843 796/921/843
f 787/914/844 803/920/844 804/922/844
f 781/899/845 780/898/845 796/921/845
f 788/900/1034 804/922/1034 805/924/1034
f 782/902/847 781/899/847 797/923/847
f 782/902/850 798/925/850 799/929/850
f 775/904/851 791/927/851 792/930/851
f 783/906/852 799/929/852 800/915/852
f 776/907/853 792/930/853 793/931/853
f 804/922/854 819/948/854 820/932/854
f 798/925/855 797/923/855 812/933/855
f 798/925/858 813/934/858 814/938/858
f 791/927/859 806/936/859 807/939/859
f 799/929/860 814/938/860 815/940/860
f 792/930/861 807/939/861 808/941/861
f 801/916/862 800/915/862 815/940/862
f 793/931/863 808/941/863 809/943/863
f 801/916/864 816/942/864 817/944/864
f 794/917/865 809/943/865 810/945/865
f 803/920/866 802/918/866 817/944/866
f 795/919/867 810/945/867 811/947/867
f 803/920/868 818/946/868 819/948/868
f 797/923/869 796/921/869 811/947/869
f 808/941/870 823/964/870 824/949/870
f 816/942/871 831/965/871 832/950/871
f 810/945/872 809/943/872 824/949/872
f 818/946/873 817/944/873 832/950/873
f 810/945/874 825/951/874 826/953/874
f 818/946/875 833/952/875 834/954/875
f 812/933/876 811/947/876 826/953/876
f 819/948/877 834/954/877 835/956/877
f 813/934/878 812/933/878 827/955/878
f 813/934/881 828/957/881 829/961/881
f 806/936/882 821/959/882 822/962/882
f 814/938/883 829/961/883 830/963/883
f 807/939/884 822/962/884 823/964/884
f 816/942/885 815/940/885 830/963/885
f 828/957/886 827/955/886 842/966/886
f 828/957/889 843/967/889 844/972/889
f 821/959/890 836/969/890 837/973/890
f 829/961/891 844/972/891 845/974/891
f 823/964/892 822/962/892 837/973/892
f 831/965/893 830/963/893 845/974/893
f 823/964/894 838/975/894 839/977/894
f 831/965/895 846/976/895 847/978/895
f 825/951/896 824/949/896 839/977/896
f 833/952/897 832/950/897 847/978/897
f 825/951/898 840/979/898 841/981/898
f 833/952/899 848/980/899 849/982/899
f 827/955/900 826/953/900 841/981/900
f 834/954/901 849/982/901 850/971/901
f 846/983/902 861/1013/902 862/984/902
f 840/986/903 839/1014/903 854/987/903
f 848/989/904 847/985/904 862/984/904
f 840/986/905 855/988/905 856/991/905
f 848/989/906 863/990/906 864/993/906
f 842/995/907 841/992/907 856/991/907
f 849/994/908 864/993/908 865/997/908
f 843/999/909 842/995/909 857/996/909
f 843/999/912 858/1000/912 859/1005/912
f 836/1001/913 851/1003/913 852/1007/913
f 844/1006/914 859/1005/914 860/1009/914
f 838/1011/915 837/1008/915 852/1007/915
f 846/983/916 845/1010/916 860/1009/916
f 838/1011/917 853/1012/917 854/987/917
f 858/1000/919 873/1029/919 874/1017/919
f 851/1003/920 866/1031/920 867/1018/920
f 859/1005/921 874/1017/921 875/1019/921
f 853/1012/922 852/1007/922 867/1018/922
f 861/1013/923 860/1009/923 875/1019/923
f 853/1012/924 868/1020/924 869/1022/924
f 861/1013/925 876/1021/925 877/1023/925
f 855/988/926 854/987/926 869/1022/926
f 863/990/927 862/984/927 877/1023/927
f 855/988/928 870/1024/928 871/1026/928
f 863/990/929 878/1025/929 879/1027/929
f 857/996/930 856/991/930 871/1026/930
f 864/993/931 879/1027/931 880/1016/931
f 858/1000/932 857/996/932 872/1028/932
f 878/1025/934 877/1023/934 892/1032/934
f 870/1024/935 885/1048/935 886/1034/935
f 879/1027/936 878/1025/936 893/1033/936
f 872/1028/937 871/1026/937 886/1034/937
f 879/1027/938 894/1035/938 895/1037/938
f 873/1029/939 872/1028/939 887/1036/939
f 873/1029/942 888/1038/942 889/1042/942
f 867/1018/943 866/1031/943 881/1040/943
f 874/1017/944 889/1042/944 890/1044/944
f 867/1018/945 882/1043/945 883/1045/945
f 876/1021/946 875/1019/946 890/1044/946
f 868/1020/947 883/1045/947 884/1047/947
f 876/1021/948 891/1046/948 892/1032/948
f 870/1024/949 869/1022/949 884/1047/949
f 881/1040/950 896/1063/950 897/1049/950
f 889/1042/951 904/1065/951 905/1050/951
f 883/1045/952 882/1043/952 897/1049/952
f 891/1046/953 890/1044/953 905/1050/953
f 883/1045/954 898/1051/954 899/1053/954
f 891/1046/955 906/1052/955 907/1054/955
f 885/1048/956 884/1047/956 899/1053/956
f 893/1033/957 892/1032/957 907/1054/957
f 885/1048/958 900/1055/958 901/1057/958
f 893/1033/959 908/1056/959 909/1058/959
f 887/1036/960 886/1034/960 901/1057/960
f 894/1035/961 909/1058/961 910/1060/961
f 888/1038/962 887/1036/962 902/1059/962
f 888/1038/965 903/1061/965 904/1065/965
f 900/1055/966 915/1081/966 916/1066/966
f 908/1056/967 923/1082/967 924/1067/967
f 902/1059/968 901/1057/968 916/1066/968
f 909/1058/969 924/1067/969 925/1069/969
f 903/1061/970 902/1059/970 917/1068/970
f 903/1061/973 918/1070/973 919/1074/973
f 896/1063/974 911/1072/974 912/1075/974
f 904/1065/975 919/1074/975 920/1076/975
f 898/1051/976 897/1049/976 912/1075/976
f 906/1052/977 905/1050/977 920/1076/977
f 898/1051/978 913/1077/978 914/1079/978
f 906/1052/979 921/1078/979 922/1080/979
f 900/1055/980 899/1053/980 914/1079/980
f 908/1056/981 907/1054/981 922/1080/981
f 919/1074/982 934/1098/982 935/1083/982
f 912/1075/983 927/1099/983 928/1084/983
f 921/1078/984 920/1076/984 935/1083/984
f 913/1077/985 928/1084/985 929/1086/985
f 921/1078/986 936/1085/986 937/1087/986
f 915/1081/987 914/1079/987 929/1086/987
f 923/1082/988 922/1080/988 937/1087/988
f 915/1081/989 930/1088/989 931/1090/989
f 923/1082/990 938/1089/990 939/1091/990
f 917/1068/991 916/1066/991 931/1090/991
f 924/1067/992 939/1091/992 940/1093/992
f 918/1070/993 917/1068/993 932/1092/993
f 918/1070/996 933/1094/996 934/1098/996
f 912/1075/997 911/1072/997 926/1096/997
f 938/1089/998 953/1116/998 954/1100/998
f 932/1092/999 931/1090/999 946/1101/999
f 940/1093/1000 939/1091/1000 954/1100/1000
f 933/1094/1001 932/1092/1001 947/1102/1001
f 933/1094/1004 948/1104/1004 949/1108/1004
f 926/1096/1005 941/1106/1005 942/1109/1005
f 934/1098/1006 949/1108/1006 950/1110/1006
f 928/1084/1007 927/1099/1007 942/1109/1007
f 936/1085/1008 935/1083/1008 950/1110/1008
f 928/1084/1009 943/1111/1009 944/1113/1009
f 936/1085/1010 951/1112/1010 952/1114/1010
f 930/1088/1011 929/1086/1011 944/1113/1011
f 938/1089/1012 937/1087/1012 952/1114/1012
f 930/1088/1013 945/1115/1013 946/1101/1013
f 943/1111/1014 942/1109/1014 957/591/1014
f 950/1110/1015 960/562/1015 488/567/1015
f 944/1113/1016 943/1111/1016 483/563/1016
f 951/1112/1017 488/567/1017 961/571/1017
f 945/1115/1018 944/1113/1018 958/568/1018
f 953/1116/1019 952/1114/1019 961/571/1019
f 946/1101/1020 945/1115/1020 484/572/1020
f 953/1116/1021 962/575/1021 963/579/1021
f 947/1102/1022 946/1101/1022 959/576/1022
f 954/1100/1035 963/579/1035 964/583/1035
f 948/1104/1024 947/1102/1024 485/580/1024
f 948/1104/1027 486/585/1027 487/560/1027
f 942/1109/1036 941/1106/1036 956/586/1036
f 949/1108/1029 487/560/1029 960/562/1029
o Cube.001
v 0.131002 0.093472 -0.288635
v -0.046766 -0.163106 -0.234637
v 0.099596 0.093472 -0.392024
v -0.078171 -0.163106 -0.338025
v 0.046229 0.157626 -0.262885
v -0.131538 -0.098952 -0.208886
v 0.014823 0.157626 -0.366273
v -0.162944 -0.098952 -0.312274
v 0.097034 0.005807 -0.259496
v 0.025928 -0.096824 -0.237897
v -0.015946 -0.096824 -0.375748
v 0.055161 0.005807 -0.397347
v -0.128977 -0.011287 -0.341414
v -0.057870 0.091344 -0.363013
v -0.087103 -0.011287 -0.203562
v -0.015996 0.091344 -0.225162
v 0.114758 0.092536 -0.373665
v 0.128483 0.092066 -0.344334
v 0.133391 0.092536 -0.312325
v 0.008017 -0.121847 -0.232859
v -0.010693 -0.143062 -0.229997
v -0.030367 -0.157173 -0.230979
v -0.049562 -0.171527 -0.256751
v -0.057075 -0.175757 -0.287969
v -0.068195 -0.171527 -0.318091
v 0.072466 0.031288 -0.402201
v 0.086941 0.055707 -0.403777
v 0.096168 0.077724 -0.399622
v 0.032365 0.148323 -0.379838
v 0.057664 0.131172 -0.391660
v 0.082660 0.110261 -0.395115
v -0.100293 -0.153803 -0.339541
v -0.127894 -0.136652 -0.335294
v -0.150589 -0.115741 -0.324264
v -0.039960 0.116367 -0.368051
v -0.021249 0.137582 -0.370913
v -0.001575 0.151693 -0.369931
v 0.036253 0.166047 -0.282818
v 0.025133 0.170277 -0.312941
v 0.017620 0.166047 -0.344159
v -0.165333 -0.098016 -0.288585
v -0.160425 -0.097546 -0.256575
v -0.146700 -0.098016 -0.227244
v 0.001690 0.116367 -0.230937
v 0.018832 0.137582 -0.238965
v 0.034635 0.151693 -0.250724
v 0.118646 0.110261 -0.276646
v 0.095952 0.131172 -0.265615
v 0.068351 0.148323 -0.261368
v -0.114602 -0.115741 -0.205794
v -0.089606 -0.136652 -0.209250
v -0.064307 -0.153803 -0.221072
v 0.132378 0.077724 -0.280414
v 0.127021 0.055707 -0.271829
v 0.114115 0.031288 -0.265088
v 0.079258 -0.019851 -0.254096
v 0.061481 -0.045509 -0.248696
v 0.043704 -0.071167 -0.243296
v -0.066577 -0.157173 -0.350186
v -0.050774 -0.143062 -0.361945
v -0.033632 -0.121847 -0.369972
v 0.001830 -0.071167 -0.381148
v 0.019607 -0.045509 -0.386548
v 0.037384 -0.019851 -0.391947
v -0.164320 -0.083204 -0.320495
v -0.158964 -0.061187 -0.329081
v -0.146057 -0.036768 -0.335822
v -0.111200 0.014371 -0.346813
v -0.093423 0.040029 -0.352213
v -0.075646 0.065687 -0.357613
v -0.128110 -0.083204 -0.201288
v -0.118883 -0.061187 -0.197133
v -0.104408 -0.036768 -0.198709
v -0.069326 0.014371 -0.208962
v -0.051549 0.040029 -0.214362
v -0.033773 0.065687 -0.219762
v 0.015501 0.071965 -0.213555
v 0.048371 0.048576 -0.216482
v 0.077314 0.025186 -0.232332
v -0.055606 -0.030666 -0.191956
v -0.022736 -0.054056 -0.194882
v 0.006208 -0.077445 -0.210732
v -0.064278 0.103373 -0.326953
v -0.058126 0.107383 -0.287650
v -0.041378 0.103373 -0.251566
v -0.135384 0.000742 -0.305353
v -0.129233 0.004751 -0.266050
v -0.112485 0.000742 -0.229966
v 0.023664 0.025186 -0.408954
v -0.009206 0.048576 -0.406027
v -0.038150 0.071965 -0.390177
v -0.047443 -0.077445 -0.387354
v -0.080313 -0.054056 -0.384428
v -0.109257 -0.030666 -0.368578
v 0.103442 -0.006222 -0.295556
v 0.097291 -0.010231 -0.334859
v 0.080542 -0.006222 -0.370944
v 0.032335 -0.108853 -0.273957
v 0.026184 -0.112863 -0.313260
v 0.009436 -0.108853 -0.349344
v 0.014371 -0.133850 -0.268722
v -0.004720 -0.154879 -0.264476
v -0.025889 -0.168281 -0.261517
v 0.008231 -0.137875 -0.307807
v -0.010775 -0.159011 -0.302033
v -0.032003 -0.172596 -0.295585
v -0.008405 -0.133850 -0.343703
v -0.026633 -0.154879 -0.336613
v -0.045870 -0.168281 -0.327298
v -0.064987 -0.102605 -0.381514
v -0.081136 -0.124782 -0.373035
v -0.094129 -0.141632 -0.359547
v -0.097705 -0.079377 -0.378600
v -0.112789 -0.102679 -0.370206
v -0.123394 -0.122130 -0.356659
v -0.126468 -0.056079 -0.362839
v -0.140284 -0.080021 -0.355068
v -0.148065 -0.100815 -0.343164
v -0.152459 -0.024835 -0.299945
v -0.165320 -0.049924 -0.294485
v -0.170234 -0.074167 -0.289521
v -0.146368 -0.020879 -0.260846
v -0.159650 -0.046347 -0.256811
v -0.165376 -0.071664 -0.255072
v -0.129683 -0.024835 -0.224964
v -0.143408 -0.049924 -0.222348
v -0.150252 -0.074167 -0.223741
v -0.073101 -0.056079 -0.187152
v -0.088905 -0.080021 -0.185925
v -0.101993 -0.100815 -0.191491
v -0.040431 -0.079377 -0.190052
v -0.057636 -0.102679 -0.188638
v -0.073984 -0.122130 -0.193998
v -0.011621 -0.102605 -0.205828
v -0.029757 -0.124782 -0.203892
v -0.048057 -0.141632 -0.207874
v 0.097926 0.117017 -0.370755
v 0.072470 0.140011 -0.364941
v 0.044323 0.157582 -0.354473
v 0.111174 0.118639 -0.339076
v 0.084983 0.142971 -0.331121
v 0.054691 0.161383 -0.321919
v 0.117784 0.117017 -0.305382
v 0.093395 0.140011 -0.296055
v 0.064181 0.157582 -0.289099
v -0.149727 -0.122497 -0.295528
v -0.125337 -0.145491 -0.304855
v -0.096124 -0.163062 -0.311811
v -0.143116 -0.124119 -0.261833
v -0.116925 -0.148451 -0.269789
v -0.086633 -0.166863 -0.278990
v -0.129869 -0.122497 -0.230154
v -0.104412 -0.145491 -0.235969
v -0.076266 -0.163062 -0.246437
v 0.062187 0.136152 -0.241362
v 0.049194 0.119302 -0.227874
v 0.033045 0.097125 -0.219395
v 0.091452 0.116650 -0.244251
v 0.080847 0.097199 -0.230704
v 0.065763 0.073897 -0.222310
v 0.116123 0.095335 -0.257746
v 0.108342 0.074541 -0.245841
v 0.094525 0.050599 -0.238071
v -0.002276 0.046307 -0.208156
v -0.020052 0.020649 -0.202756
v -0.037829 -0.005009 -0.197356
v 0.030594 0.022918 -0.211082
v 0.012817 -0.002740 -0.205682
v -0.004960 -0.028398 -0.200282
v 0.059538 -0.000471 -0.226932
v 0.041761 -0.026129 -0.221532
v 0.023984 -0.051787 -0.216132
v -0.006053 0.162801 -0.339392
v -0.027222 0.149399 -0.336434
v -0.046313 0.128370 -0.332188
v 0.000060 0.167116 -0.305325
v -0.021167 0.153531 -0.298876
v -0.040174 0.132395 -0.293103
v 0.013928 0.162801 -0.273612
v -0.005309 0.149399 -0.264297
v -0.023537 0.128370 -0.257207
v -0.082054 0.077715 -0.321553
v -0.099831 0.052057 -0.316153
v -0.117608 0.026400 -0.310753
v -0.075903 0.081725 -0.282250
v -0.093679 0.056067 -0.276850
v -0.111456 0.030409 -0.271450
v -0.059154 0.077715 -0.246166
v -0.076931 0.052057 -0.240766
v -0.094708 0.026400 -0.235366
v 0.070051 0.095335 -0.409419
v 0.056963 0.074541 -0.414984
v 0.041159 0.050599 -0.413758
v 0.042042 0.116650 -0.406912
v 0.025694 0.097199 -0.412272
v 0.008489 0.073897 -0.410858
v 0.016115 0.136152 -0.393035
v -0.002185 0.119302 -0.397017
v -0.020322 0.097125 -0.395082
v 0.005887 -0.000471 -0.403554
v -0.011890 -0.026129 -0.398154
v -0.029667 -0.051787 -0.392754
v -0.026983 0.022918 -0.400627
v -0.044759 -0.002740 -0.395228
v -0.062536 -0.028398 -0.389828
v -0.055926 0.046307 -0.384777
v -0.073703 0.020649 -0.379378
v -0.091480 -0.005009 -0.373978
v 0.138292 0.068687 -0.311389
v 0.133378 0.044444 -0.306425
v 0.120517 0.019355 -0.300965
v 0.133433 0.066184 -0.345838
v 0.127708 0.040867 -0.344099
v 0.114425 0.015399 -0.340064
v 0.118310 0.068687 -0.377169
v 0.111466 0.044444 -0.378562
v 0.097740 0.019355 -0.375946
v 0.085666 -0.031880 -0.290156
v 0.067889 -0.057538 -0.284757
v 0.050112 -0.083195 -0.279357
v 0.079514 -0.035889 -0.329459
v 0.061737 -0.061547 -0.324060
v 0.043960 -0.087205 -0.318660
v 0.062766 -0.031880 -0.365544
v 0.044989 -0.057538 -0.360144
v 0.027212 -0.083195 -0.354744
vn 0.7350 -0.6765 -0.0458
vn 0.5356 -0.8445 0.0066
vn 0.6363 -0.6765 -0.3708
vn 0.4414 -0.8445 -0.3033
vn 0.8011 -0.5104 0.3126
vn 0.7297 -0.5991 0.3298
vn 0.5330 -0.7678 0.3555
vn 0.2444 -0.8992 0.3629
vn 0.1535 -0.9834 0.0970
vn 0.0736 -0.9834 -0.1660
vn 0.0012 -0.8992 -0.4375
vn 0.2452 -0.7678 -0.5919
vn 0.4230 -0.5991 -0.6799
vn 0.4919 -0.5104 -0.7054
vn 0.7044 -0.5914 -0.3926
vn 0.8038 -0.5914 -0.0655
vn -0.2266 -0.1995 -0.9533
vn -0.3557 -0.4013 -0.8440
vn -0.4836 -0.0050 -0.8753
vn -0.6012 -0.2155 -0.7695
vn 0.1477 -0.3204 -0.9357
vn 0.0825 -0.4103 -0.9082
vn -0.0653 -0.5946 -0.8013
vn -0.2314 -0.7691 -0.5958
vn -0.4980 -0.6648 -0.5568
vn -0.7073 -0.5064 -0.4932
vn -0.8700 -0.2858 -0.4018
vn -0.8198 -0.0236 -0.5721
vn -0.7210 0.1977 -0.6641
vn -0.6624 0.2927 -0.6896
vn -0.4252 0.0845 -0.9011
vn -0.1665 -0.1113 -0.9797
vn -0.8783 0.4697 0.0893
vn -0.9672 0.2215 0.1245
vn -0.7796 0.4697 0.4143
vn -0.8730 0.2215 0.4345
vn -0.8204 0.4826 -0.3068
vn -0.8778 0.3853 -0.2848
vn -0.9633 0.1467 -0.2248
vn -0.9773 -0.1587 -0.1403
vn -0.9693 -0.1942 0.1508
vn -0.8894 -0.1942 0.4138
vn -0.7342 -0.1587 0.6602
vn -0.6755 0.1467 0.7226
vn -0.5711 0.3853 0.7249
vn -0.5111 0.4826 0.7112
vn -0.7232 0.5642 0.3984
vn -0.8226 0.5642 0.0712
vn 0.0849 -0.0050 0.9964
vn -0.0717 -0.2155 0.9739
vn 0.3420 -0.1995 0.9183
vn 0.1739 -0.4013 0.8993
vn -0.1669 0.2927 0.9415
vn -0.2298 0.1977 0.9529
vn -0.3631 -0.0236 0.9315
vn -0.4995 -0.2858 0.8178
vn -0.3135 -0.5064 0.8033
vn -0.1042 -0.6648 0.7397
vn 0.1391 -0.7691 0.6238
vn 0.3914 -0.5946 0.7023
vn 0.5737 -0.4103 0.7089
vn 0.6432 -0.3204 0.6955
vn 0.4066 -0.1113 0.9068
vn 0.1478 0.0845 0.9854
vn 0.6026 0.7435 -0.2901
vn 0.4478 0.8610 -0.2410
vn 0.6621 0.7435 -0.0941
vn 0.5063 0.8610 -0.0488
vn 0.6700 0.5232 -0.5266
vn 0.5089 0.6927 -0.5111
vn 0.3408 0.8199 -0.4601
vn 0.1635 0.8994 -0.4054
vn 0.2312 0.9551 -0.1852
vn 0.2952 0.9551 0.0253
vn 0.3614 0.8994 0.2460
vn 0.5391 0.8199 0.1928
vn 0.7072 0.6927 0.1417
vn 0.8497 0.5232 0.0650
vn 0.8181 0.5593 -0.1337
vn 0.7542 0.5593 -0.3439
vn -0.6621 -0.7435 0.0941
vn -0.5063 -0.8610 0.0488
vn -0.6026 -0.7435 0.2901
vn -0.4478 -0.8610 0.2410
vn -0.8497 -0.5232 -0.0650
vn -0.7072 -0.6927 -0.1417
vn -0.5391 -0.8199 -0.1928
vn -0.3614 -0.8994 -0.2460
vn -0.2952 -0.9551 -0.0253
vn -0.2312 -0.9551 0.1852
vn -0.1635 -0.8994 0.4054
vn -0.3408 -0.8199 0.4601
vn -0.5089 -0.6927 0.5111
vn -0.6700 -0.5232 0.5266
vn -0.7542 -0.5593 0.3439
vn -0.8181 -0.5593 0.1337
vn 0.3498 0.3994 0.8474
vn 0.2240 0.1987 0.9541
vn 0.6019 0.2087 0.7708
vn 0.4837 0.0022 0.8752
vn 0.2880 0.7762 0.5609
vn 0.0624 0.5939 0.8021
vn -0.0853 0.4094 0.9083
vn -0.1479 0.3203 0.9357
vn 0.1663 0.1113 0.9798
vn 0.4252 -0.0847 0.9011
vn 0.6623 -0.2929 0.6896
vn 0.7208 -0.2006 0.6635
vn 0.8200 0.0206 0.5720
vn 0.8523 0.3491 0.3895
vn 0.7067 0.5063 0.4942
vn 0.4984 0.6640 0.5575
vn 0.1572 0.0979 0.9827
vn 0.4160 -0.0979 0.9041
vn -0.1573 0.3066 0.9388
vn 0.6529 -0.3066 0.6927
vn -0.5410 0.8409 -0.0094
vn -0.7368 0.6747 0.0445
vn -0.4444 0.8409 0.3088
vn -0.6371 0.6747 0.3728
vn -0.1928 0.9249 -0.3278
vn -0.5348 0.7659 -0.3570
vn -0.7309 0.5967 -0.3311
vn -0.8011 0.5103 -0.3127
vn -0.8039 0.5912 0.0654
vn -0.7044 0.5912 0.3928
vn -0.4919 0.5103 0.7055
vn -0.4233 0.5967 0.6817
vn -0.2459 0.7659 0.5941
vn 0.0221 0.9249 0.3797
vn -0.0747 0.9834 0.1657
vn -0.1542 0.9834 -0.0961
vn -0.8133 0.5779 0.0683
vn -0.7139 0.5779 0.3956
vn -0.8108 0.4966 -0.3097
vn -0.5016 0.4966 0.7084
vn 0.0715 0.2087 -0.9754
vn -0.0849 0.0022 -0.9964
vn -0.1806 0.3994 -0.8988
vn -0.3445 0.1987 -0.9175
vn 0.4917 0.3491 -0.7977
vn 0.3633 0.0206 -0.9314
vn 0.2300 -0.2006 -0.9523
vn 0.1669 -0.2929 -0.9415
vn -0.1478 -0.0847 -0.9854
vn -0.4068 0.1113 -0.9067
vn -0.6433 0.3203 -0.6954
vn -0.5761 0.4094 -0.7074
vn -0.3942 0.5939 -0.7013
vn -0.0727 0.7762 -0.6263
vn 0.1041 0.6640 -0.7405
vn 0.3125 0.5063 -0.8038
vn -0.1572 -0.0979 -0.9827
vn -0.4160 0.0979 -0.9041
vn 0.1573 -0.3066 -0.9388
vn -0.6529 0.3066 -0.6927
vn 0.9665 -0.2268 -0.1198
vn 0.8775 -0.4715 -0.0873
vn 0.8699 -0.2268 -0.4380
vn 0.7778 -0.4715 -0.4155
vn 0.9748 0.2039 0.0903
vn 0.9625 -0.1486 0.2271
vn 0.8764 -0.3868 0.2870
vn 0.8203 -0.4827 0.3069
vn 0.8225 -0.5643 -0.0711
vn 0.7231 -0.5643 -0.3984
vn 0.5110 -0.4827 -0.7113
vn 0.5687 -0.3868 -0.7259
vn 0.6736 -0.1486 -0.7240
vn 0.7599 0.2039 -0.6172
vn 0.8899 0.1932 -0.4133
vn 0.9694 0.1932 -0.1515
vn 0.8133 -0.5779 -0.0683
vn 0.7139 -0.5779 -0.3956
vn 0.8108 -0.4966 0.3097
vn 0.5016 -0.4966 -0.7084
vn 0.7368 -0.6747 -0.0445
vn 0.5410 -0.8409 0.0094
vn 0.6371 -0.6747 -0.3728
vn 0.4444 -0.8409 -0.3088
vn 0.8011 -0.5103 0.3127
vn 0.7309 -0.5967 0.3311
vn 0.5348 -0.7659 0.3570
vn 0.1928 -0.9249 0.3278
vn 0.1542 -0.9834 0.0961
vn 0.0747 -0.9834 -0.1657
vn -0.0221 -0.9249 -0.3797
vn 0.2459 -0.7659 -0.5941
vn 0.4233 -0.5967 -0.6817
vn 0.4919 -0.5103 -0.7055
vn 0.7044 -0.5912 -0.3928
vn 0.8039 -0.5912 -0.0654
vn -0.2240 -0.1987 -0.9541
vn -0.3498 -0.3994 -0.8474
vn -0.4837 -0.0022 -0.8752
vn -0.6019 -0.2087 -0.7708
vn 0.1479 -0.3203 -0.9357
vn 0.0853 -0.4094 -0.9083
vn -0.0624 -0.5939 -0.8021
vn -0.2880 -0.7762 -0.5609
vn -0.4984 -0.6640 -0.5575
vn -0.7067 -0.5063 -0.4942
vn -0.8523 -0.3491 -0.3895
vn -0.8200 -0.0206 -0.5720
vn -0.7208 0.2006 -0.6635
vn -0.6623 0.2929 -0.6896
vn -0.4252 0.0847 -0.9011
vn -0.1663 -0.1113 -0.9798
vn -0.8775 0.4715 0.0873
vn -0.9665 0.2268 0.1198
vn -0.7778 0.4715 0.4155
vn -0.8699 0.2268 0.4380
vn -0.8203 0.4827 -0.3069
vn -0.8764 0.3868 -0.2870
vn -0.9625 0.1486 -0.2271
vn -0.9748 -0.2039 -0.0903
vn -0.9694 -0.1932 0.1515
vn -0.8899 -0.1932 0.4133
vn -0.7599 -0.2039 0.6172
vn -0.6736 0.1486 0.7240
vn -0.5687 0.3868 0.7259
vn -0.5110 0.4827 0.7113
vn -0.7231 0.5643 0.3984
vn -0.8225 0.5643 0.0711
vn 0.0849 -0.0022 0.9964
vn -0.0715 -0.2087 0.9754
vn 0.3445 -0.1987 0.9175
vn 0.1806 -0.3994 0.8988
vn -0.1669 0.2929 0.9415
vn -0.2300 0.2006 0.9523
vn -0.3633 -0.0206 0.9314
vn -0.4917 -0.3491 0.7977
vn -0.3125 -0.5063 0.8038
vn -0.1041 -0.6640 0.7405
vn 0.0727 -0.7762 0.6263
vn 0.3942 -0.5939 0.7013
vn 0.5761 -0.4094 0.7074
vn 0.6433 -0.3203 0.6954
vn 0.4068 -0.1113 0.9067
vn 0.1478 0.0847 0.9854
vn 0.6046 0.7424 -0.2887
vn 0.4488 0.8599 -0.2433
vn 0.6630 0.7424 -0.0964
vn 0.5083 0.8599 -0.0474
vn 0.6368 0.5412 -0.5492
vn 0.5088 0.6927 -0.5111
vn 0.3408 0.8199 -0.4600
vn 0.1487 0.9177 -0.3683
vn 0.2312 0.9551 -0.1851
vn 0.2951 0.9551 0.0252
vn 0.3284 0.9177 0.2234
vn 0.5391 0.8199 0.1927
vn 0.7072 0.6927 0.1418
vn 0.8346 0.5412 0.1022
vn 0.8181 0.5594 -0.1336
vn 0.7541 0.5594 -0.3440
vn -0.6630 -0.7424 0.0964
vn -0.5083 -0.8599 0.0474
vn -0.6046 -0.7424 0.2887
vn -0.4488 -0.8599 0.2433
vn -0.8346 -0.5412 -0.1022
vn -0.7072 -0.6927 -0.1418
vn -0.5391 -0.8199 -0.1927
vn -0.3284 -0.9177 -0.2234
vn -0.2951 -0.9551 -0.0252
vn -0.2312 -0.9551 0.1851
vn -0.1487 -0.9177 0.3683
vn -0.3408 -0.8199 0.4600
vn -0.5088 -0.6927 0.5111
vn -0.6368 -0.5412 0.5492
vn -0.7541 -0.5594 0.3440
vn -0.8181 -0.5594 0.1336
vn 0.3557 0.4013 0.8440
vn 0.2266 0.1995 0.9533
vn 0.6012 0.2155 0.7695
vn 0.4836 0.0050 0.8753
vn 0.2314 0.7691 0.5958
vn 0.0653 0.5946 0.8013
vn -0.0825 0.4103 0.9082
vn -0.1477 0.3204 0.9357
vn 0.1665 0.1113 0.9797
vn 0.4252 -0.0845 0.9011
vn 0.6624 -0.2927 0.6896
vn 0.7210 -0.1977 0.6641
vn 0.8198 0.0236 0.5721
vn 0.8700 0.2858 0.4018
vn 0.7073 0.5064 0.4932
vn 0.4980 0.6648 0.5568
vn -0.5356 0.8445 -0.0066
vn -0.7350 0.6765 0.0458
vn -0.4414 0.8445 0.3033
vn -0.6363 0.6765 0.3708
vn -0.2444 0.8992 -0.3629
vn -0.5330 0.7678 -0.3555
vn -0.7297 0.5991 -0.3298
vn -0.8011 0.5104 -0.3126
vn -0.8038 0.5914 0.0655
vn -0.7044 0.5914 0.3926
vn -0.4919 0.5104 0.7054
vn -0.4230 0.5991 0.6799
vn -0.2452 0.7678 0.5919
vn -0.0012 0.8992 0.4375
vn -0.0736 0.9834 0.1660
vn -0.1535 0.9834 -0.0970
vn 0.0717 0.2155 -0.9739
vn -0.0849 0.0050 -0.9964
vn -0.1739 0.4013 -0.8993
vn -0.3420 0.1995 -0.9183
vn 0.4995 0.2858 -0.8178
vn 0.3631 0.0236 -0.9315
vn 0.2298 -0.1977 -0.9529
vn 0.1669 -0.2927 -0.9415
vn -0.1478 -0.0845 -0.9854
vn -0.4066 0.1113 -0.9068
vn -0.6432 0.3204 -0.6955
vn -0.5737 0.4103 -0.7089
vn -0.3914 0.5946 -0.7023
vn -0.1391 0.7691 -0.6238
vn 0.1042 0.6648 -0.7397
vn 0.3135 0.5064 -0.8033
vn 0.9672 -0.2215 -0.1245
vn 0.8783 -0.4697 -0.0893
vn 0.8730 -0.2215 -0.4345
vn 0.7796 -0.4697 -0.4143
vn 0.9773 0.1587 0.1403
vn 0.9633 -0.1467 0.2248
vn 0.8778 -0.3853 0.2848
vn 0.8204 -0.4826 0.3068
vn 0.8226 -0.5642 -0.0712
vn 0.7232 -0.5642 -0.3984
vn 0.5111 -0.4826 -0.7112
vn 0.5711 -0.3853 -0.7249
vn 0.6755 -0.1467 -0.7226
vn 0.7342 0.1587 -0.6602
vn 0.8894 0.1942 -0.4138
vn 0.9693 0.1942 -0.1508
vt 0.583333 0.062500
vt 0.562500 0.125000
vt 0.562500 0.062500
vt 0.604167 0.062500
vt 0.583333 0.125000
vt 0.583333 0.187500
vt 0.562500 0.187500
vt 0.604167 0.187500
vt 0.562500 0.000000
vt 0.541667 0.062500
vt 0.541667 0.000000
vt 0.583333 0.000000
vt 0.604167 0.000000
vt 0.625000 0.000000
vt 0.625000 0.125000
vt 0.604167 0.125000
vt 0.625000 0.250000
vt 0.604167 0.250000
vt 0.583333 0.250000
vt 0.562500 0.250000
vt 0.541667 0.187500
vt 0.541667 0.250000
vt 0.541667 0.125000
vt 0.583333 0.312500
vt 0.562500 0.375000
vt 0.562500 0.312500
vt 0.604167 0.312500
vt 0.583333 0.375000
vt 0.583406 0.437500
vt 0.562509 0.437500
vt 0.604320 0.437500
vt 0.541667 0.312500
vt 0.625000 0.375000
vt 0.604167 0.375000
vt 0.625000 0.500000
vt 0.605903 0.500000
vt 0.583912 0.500000
vt 0.562572 0.500000
vt 0.541667 0.437500
vt 0.541667 0.500000
vt 0.541667 0.375000
vt 0.584997 0.562500
vt 0.562789 0.625000
vt 0.562708 0.562500
vt 0.610343 0.562500
vt 0.585648 0.625000
vt 0.584997 0.687500
vt 0.562708 0.687500
vt 0.610343 0.687500
vt 0.541667 0.562500
vt 0.648148 0.625000
vt 0.612558 0.625000
vt 0.625000 0.750000
vt 0.605903 0.750000
vt 0.583912 0.750000
vt 0.562572 0.750000
vt 0.541667 0.687500
vt 0.541667 0.750000
vt 0.541667 0.625000
vt 0.583406 0.812500
vt 0.562500 0.875000
vt 0.562509 0.812500
vt 0.604320 0.812500
vt 0.583333 0.875000
vt 0.583333 0.937500
vt 0.562500 0.937500
vt 0.604167 0.937500
vt 0.541667 0.812500
vt 0.625000 0.875000
vt 0.604167 0.875000
vt 0.625000 1.000000
vt 0.604167 1.000000
vt 0.583333 1.000000
vt 0.562500 1.000000
vt 0.541667 0.937500
vt 0.541667 1.000000
vt 0.541667 0.875000
vt 0.187301 0.562500
vt 0.247685 0.625000
vt 0.187211 0.625000
vt 0.306478 0.562500
vt 0.248409 0.562500
vt 0.187301 0.687500
vt 0.306478 0.687500
vt 0.248409 0.687500
vt 0.125000 0.500000
vt 0.125000 0.562500
vt 0.187500 0.500000
vt 0.312500 0.500000
vt 0.250000 0.500000
vt 0.375000 0.500000
vt 0.355867 0.562500
vt 0.304109 0.625000
vt 0.355867 0.687500
vt 0.375000 0.750000
vt 0.312500 0.750000
vt 0.250000 0.750000
vt 0.187500 0.750000
vt 0.125000 0.750000
vt 0.125000 0.687500
vt 0.125000 0.625000
vt 0.693522 0.562500
vt 0.752315 0.625000
vt 0.695891 0.625000
vt 0.812699 0.562500
vt 0.751592 0.562500
vt 0.693522 0.687500
vt 0.812699 0.687500
vt 0.751591 0.687500
vt 0.644133 0.562500
vt 0.687500 0.500000
vt 0.812500 0.500000
vt 0.750000 0.500000
vt 0.875000 0.500000
vt 0.875000 0.562500
vt 0.812789 0.625000
vt 0.875000 0.687500
vt 0.875000 0.750000
vt 0.812500 0.750000
vt 0.750000 0.750000
vt 0.687500 0.750000
vt 0.644133 0.687500
vt 0.395680 0.812500
vt 0.416667 0.875000
vt 0.395833 0.875000
vt 0.416594 0.812500
vt 0.437500 0.875000
vt 0.395833 0.937500
vt 0.416667 0.937500
vt 0.375000 0.812500
vt 0.394097 0.750000
vt 0.416088 0.750000
vt 0.437491 0.812500
vt 0.437428 0.750000
vt 0.458333 0.812500
vt 0.458333 0.875000
vt 0.437500 0.937500
vt 0.458333 0.937500
vt 0.437500 1.000000
vt 0.416667 1.000000
vt 0.395833 1.000000
vt 0.375000 1.000000
vt 0.375000 0.937500
vt 0.375000 0.875000
vt 0.500000 0.812500
vt 0.479167 0.875000
vt 0.479167 0.812500
vt 0.520833 0.812500
vt 0.500000 0.875000
vt 0.479167 0.937500
vt 0.520833 0.875000
vt 0.500000 0.937500
vt 0.479167 0.750000
vt 0.458333 0.750000
vt 0.500000 0.750000
vt 0.520833 0.750000
vt 0.520833 0.937500
vt 0.520833 1.000000
vt 0.500000 1.000000
vt 0.479167 1.000000
vt 0.458333 1.000000
vt 0.389658 0.562500
vt 0.414352 0.625000
vt 0.387442 0.625000
vt 0.415003 0.562500
vt 0.437211 0.625000
vt 0.389657 0.687500
vt 0.415003 0.687500
vt 0.394097 0.500000
vt 0.416088 0.500000
vt 0.437292 0.562500
vt 0.437428 0.500000
vt 0.458333 0.562500
vt 0.458333 0.625000
vt 0.437292 0.687500
vt 0.458333 0.687500
vt 0.351852 0.625000
vt 0.500000 0.562500
vt 0.479167 0.625000
vt 0.479167 0.562500
vt 0.520833 0.562500
vt 0.500000 0.625000
vt 0.479167 0.687500
vt 0.520833 0.625000
vt 0.500000 0.687500
vt 0.479167 0.500000
vt 0.458333 0.500000
vt 0.520833 0.500000
vt 0.500000 0.500000
vt 0.541667 0.625000
vt 0.520833 0.687500
vt 0.520833 0.750000
vt 0.500000 0.750000
vt 0.458333 0.750000
vt 0.458333 0.625000
vt 0.395833 0.312500
vt 0.416667 0.375000
vt 0.395833 0.375000
vt 0.416667 0.312500
vt 0.437500 0.375000
vt 0.395680 0.437500
vt 0.416594 0.437500
vt 0.375000 0.250000
vt 0.375000 0.312500
vt 0.395833 0.250000
vt 0.416667 0.250000
vt 0.437500 0.312500
vt 0.437500 0.250000
vt 0.458333 0.312500
vt 0.458333 0.375000
vt 0.437491 0.437500
vt 0.458333 0.437500
vt 0.375000 0.437500
vt 0.375000 0.375000
vt 0.500000 0.312500
vt 0.479167 0.375000
vt 0.479167 0.312500
vt 0.520833 0.312500
vt 0.500000 0.375000
vt 0.479167 0.437500
vt 0.520833 0.375000
vt 0.500000 0.437500
vt 0.479167 0.250000
vt 0.458333 0.312500
vt 0.458333 0.250000
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.541667 0.312500
vt 0.541667 0.375000
vt 0.520833 0.437500
vt 0.541667 0.437500
vt 0.458333 0.437500
vt 0.458333 0.375000
vt 0.395833 0.062500
vt 0.416667 0.125000
vt 0.395833 0.125000
vt 0.416667 0.062500
vt 0.437500 0.125000
vt 0.395833 0.187500
vt 0.416667 0.187500
vt 0.375000 0.000000
vt 0.375000 0.062500
vt 0.395833 0.000000
vt 0.416667 0.000000
vt 0.437500 0.062500
vt 0.437500 0.000000
vt 0.458333 0.062500
vt 0.458333 0.125000
vt 0.437500 0.187500
vt 0.458333 0.187500
vt 0.375000 0.187500
vt 0.375000 0.125000
vt 0.500000 0.062500
vt 0.479167 0.125000
vt 0.479167 0.062500
vt 0.520833 0.062500
vt 0.500000 0.125000
vt 0.479167 0.187500
vt 0.520833 0.125000
vt 0.500000 0.187500
vt 0.458333 0.000000
vt 0.500000 0.000000
vt 0.479167 0.000000
vt 0.520833 0.000000
vt 0.520833 0.187500
vt 0.520833 0.250000
vt 0.500000 0.250000
vt 0.479167 0.250000
vt 0.458333 0.250000
vt 0.625000 0.062500
vt 0.625000 0.187500
vt 0.625000 0.312500
vt 0.625000 0.437500
vt 0.625000 0.812500
vt 0.625000 0.937500
vt 0.875000 0.625000
vt 0.458333 0.500000
vt 0.541667 0.500000
vt 0.541667 0.750000
s 0
f 1066/1119/1037 1068/1120/1037 1065/1121/1037
f 1067/1122/1038 1069/1123/1038 1066/1119/1038
f 1068/1120/1039 1072/1124/1039 1071/1125/1039
f 1069/1123/1040 1073/1126/1040 1072/1124/1040
f 984/1127/1041 1062/1128/1041 974/1129/1041
f 985/1130/1042 1065/1121/1042 984/1127/1042
f 986/1131/1043 1066/1119/1043 985/1130/1043
f 966/1132/1044 1067/1122/1044 986/1131/1044
f 1067/1122/1045 988/1133/1045 1070/1134/1045
f 988/1133/1046 1073/1126/1046 1070/1134/1046
f 1073/1126/1047 968/1135/1047 1023/1136/1047
f 1072/1124/1048 1023/1136/1048 1024/1137/1048
f 1071/1125/1049 1024/1137/1049 1025/1138/1049
f 1064/1139/1050 1025/1138/1050 975/1140/1050
f 1063/1141/1051 1071/1125/1051 1064/1139/1051
f 1065/1121/1052 1063/1141/1052 1062/1128/1052
f 1075/1142/1053 1077/1143/1053 1074/1144/1053
f 1076/1145/1054 1078/1146/1054 1075/1142/1054
f 1077/1143/1055 1081/1147/1055 1080/1148/1055
f 1078/1146/1056 1082/1149/1056 1081/1147/1056
f 1025/1138/1057 1056/1150/1057 975/1140/1057
f 1024/1137/1058 1074/1144/1058 1025/1138/1058
f 1023/1136/1059 1075/1142/1059 1024/1137/1059
f 968/1135/1060 1076/1145/1060 1023/1136/1060
f 1076/1145/1061 997/1151/1061 1079/1152/1061
f 997/1151/1062 1082/1149/1062 1079/1152/1062
f 1082/1149/1063 972/1153/1063 1029/1154/1063
f 1081/1147/1064 1029/1154/1064 1030/1155/1064
f 1080/1148/1065 1030/1155/1065 1031/1156/1065
f 1058/1157/1066 1031/1156/1066 977/1158/1066
f 1057/1159/1067 1080/1148/1067 1058/1157/1067
f 1074/1144/1068 1057/1159/1068 1056/1150/1068
f 1084/1160/1069 1086/1161/1069 1083/1162/1069
f 1085/1163/1070 1087/1164/1070 1084/1160/1070
f 1086/1161/1071 1090/1165/1071 1089/1166/1071
f 1087/1164/1072 1091/1167/1072 1090/1165/1072
f 1031/1156/1073 1050/1168/1073 977/1158/1073
f 1030/1155/1074 1083/1162/1074 1031/1156/1074
f 1029/1154/1075 1084/1160/1075 1030/1155/1075
f 972/1153/1076 1085/1163/1076 1029/1154/1076
f 1085/1163/1077 1006/1169/1077 1088/1170/1077
f 1006/1169/1078 1091/1167/1078 1088/1170/1078
f 1091/1167/1079 970/1171/1079 1035/1172/1079
f 1090/1165/1080 1035/1172/1080 1036/1173/1080
f 1089/1166/1081 1036/1173/1081 1037/1174/1081
f 1052/1175/1082 1037/1174/1082 979/1176/1082
f 1051/1177/1083 1089/1166/1083 1052/1175/1083
f 1083/1162/1084 1051/1177/1084 1050/1168/1084
f 1093/1178/1085 1095/1179/1085 1092/1180/1085
f 1094/1181/1086 1096/1182/1086 1093/1178/1086
f 1095/1179/1087 1099/1183/1087 1098/1184/1087
f 1096/1182/1088 1100/1185/1088 1099/1183/1088
f 1037/1174/1089 1044/1186/1089 979/1176/1089
f 1036/1173/1090 1092/1180/1090 1037/1174/1090
f 1035/1172/1091 1093/1178/1091 1036/1173/1091
f 970/1171/1092 1094/1181/1092 1035/1172/1092
f 1094/1181/1093 1015/1187/1093 1097/1188/1093
f 1015/
gitextract_sa0or0_n/

├── CONDITION.md
├── README.md
├── blender_utils/
│   └── render_proxy.py
├── configs/
│   ├── coin3d_train.yaml
│   ├── ctrldemo.yaml
│   ├── nerf.yaml
│   ├── neus.yaml
│   └── syncdreamer.yaml
├── example/
│   ├── panda/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   ├── pumpkin/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   ├── teddybear/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   ├── toycar/
│   │   ├── mesh.obj
│   │   └── proxy.txt
│   └── turtle/
│       ├── mesh.obj
│       └── proxy.txt
├── externs/
│   ├── __init__.py
│   └── pvcnn/
│       └── modules/
│           ├── __init__.py
│           ├── ball_query.py
│           ├── frustum.py
│           ├── functional/
│           │   ├── __init__.py
│           │   ├── backend.py
│           │   ├── ball_query.py
│           │   ├── devoxelization.py
│           │   ├── grouping.py
│           │   ├── interpolatation.py
│           │   ├── loss.py
│           │   ├── sampling.py
│           │   ├── src/
│           │   │   ├── ball_query/
│           │   │   │   ├── ball_query.cpp
│           │   │   │   ├── ball_query.cu
│           │   │   │   ├── ball_query.cuh
│           │   │   │   └── ball_query.hpp
│           │   │   ├── bindings.cpp
│           │   │   ├── cuda_utils.cuh
│           │   │   ├── grouping/
│           │   │   │   ├── grouping.cpp
│           │   │   │   ├── grouping.cu
│           │   │   │   ├── grouping.cuh
│           │   │   │   └── grouping.hpp
│           │   │   ├── interpolate/
│           │   │   │   ├── neighbor_interpolate.cpp
│           │   │   │   ├── neighbor_interpolate.cu
│           │   │   │   ├── neighbor_interpolate.cuh
│           │   │   │   ├── neighbor_interpolate.hpp
│           │   │   │   ├── trilinear_devox.cpp
│           │   │   │   ├── trilinear_devox.cu
│           │   │   │   ├── trilinear_devox.cuh
│           │   │   │   └── trilinear_devox.hpp
│           │   │   ├── sampling/
│           │   │   │   ├── sampling.cpp
│           │   │   │   ├── sampling.cu
│           │   │   │   ├── sampling.cuh
│           │   │   │   └── sampling.hpp
│           │   │   ├── utils.hpp
│           │   │   └── voxelization/
│           │   │       ├── vox.cpp
│           │   │       ├── vox.cu
│           │   │       ├── vox.cuh
│           │   │       └── vox.hpp
│           │   └── voxelization.py
│           ├── loss.py
│           ├── pointnet.py
│           ├── pvconv.py
│           ├── se.py
│           ├── shared_mlp.py
│           └── voxelization.py
├── foreground_segment.py
├── generate.py
├── ldm/
│   ├── DPMPPScheduler.py
│   ├── base_utils.py
│   ├── data/
│   │   ├── __init__.py
│   │   ├── base.py
│   │   ├── coco.py
│   │   ├── control_sync_dreamer.py
│   │   ├── dummy.py
│   │   ├── imagenet.py
│   │   ├── inpainting/
│   │   │   ├── __init__.py
│   │   │   └── synthetic_mask.py
│   │   ├── laion.py
│   │   ├── lsun.py
│   │   ├── nerf_like.py
│   │   ├── simple.py
│   │   └── sync_dreamer.py
│   ├── lr_scheduler.py
│   ├── models/
│   │   ├── autoencoder.py
│   │   └── diffusion/
│   │       ├── __init__.py
│   │       ├── ctrldemo_sync_dreamer.py
│   │       ├── sync_dreamer.py
│   │       ├── sync_dreamer_attention.py
│   │       ├── sync_dreamer_network.py
│   │       └── sync_dreamer_utils.py
│   ├── modules/
│   │   ├── attention.py
│   │   ├── diffusionmodules/
│   │   │   ├── __init__.py
│   │   │   ├── model.py
│   │   │   ├── openaimodel.py
│   │   │   └── util.py
│   │   ├── distributions/
│   │   │   ├── __init__.py
│   │   │   └── distributions.py
│   │   ├── encoders/
│   │   │   ├── __init__.py
│   │   │   └── modules.py
│   │   └── x_transformer.py
│   ├── thirdp/
│   │   └── psp/
│   │       ├── helpers.py
│   │       ├── id_loss.py
│   │       └── model_irse.py
│   ├── typing.py
│   └── util.py
├── meta_info/
│   └── camera-16.pkl
├── misc.ipynb
├── raymarching/
│   ├── __init__.py
│   ├── backend.py
│   ├── raymarching.py
│   ├── setup.py
│   └── src/
│       ├── bindings.cpp
│       ├── raymarching.cu
│       └── raymarching.h
├── renderer/
│   ├── agg_net.py
│   ├── cost_reg_net.py
│   ├── dummy_dataset.py
│   ├── feature_net.py
│   ├── neus_networks.py
│   ├── ngp_renderer.py
│   └── renderer.py
├── requirements.txt
├── train_diffusion.py
├── train_renderer.py
└── workflow/
    ├── Coin3D_condition_workflow.json
    ├── Coin3D_condition_workflow_api.json
    └── inference_comfyui_api.py
SYMBOL INDEX (1033 symbols across 70 files)

FILE: blender_utils/render_proxy.py
  function load_object (line 23) | def load_object(object_path: str) -> None:
  function az_el_to_points (line 34) | def az_el_to_points(azimuths, elevations):
  function points_to_az_el_dist (line 40) | def points_to_az_el_dist(location):
  function set_camera (line 48) | def set_camera(camera, az, el, dist):
  function get_camera (line 60) | def get_camera(camera_name):
  function init_global (line 90) | def init_global(context):
  function reset_scene (line 126) | def reset_scene() -> None:
  function scene_root_objects (line 142) | def scene_root_objects():
  function scene_meshes (line 147) | def scene_meshes():
  function selected_root_objects (line 152) | def selected_root_objects():
  function selected_meshes (line 157) | def selected_meshes():
  function selected_objects_bbox (line 162) | def selected_objects_bbox(single_obj=None, ignore_matrix=False):
  function scene_bbox (line 178) | def scene_bbox(single_obj=None, ignore_matrix=False):
  function normalize_scene (line 194) | def normalize_scene():

FILE: externs/pvcnn/modules/ball_query.py
  class BallQuery (line 9) | class BallQuery(nn.Module):
    method __init__ (line 10) | def __init__(self, radius, num_neighbors, include_coordinates=True):
    method forward (line 16) | def forward(self, points_coords, centers_coords, points_features=None):
    method extra_repr (line 32) | def extra_repr(self):

FILE: externs/pvcnn/modules/frustum.py
  class FrustumPointNetLoss (line 11) | class FrustumPointNetLoss(nn.Module):
    method __init__ (line 12) | def __init__(self, num_heading_angle_bins, num_size_templates, size_te...
    method forward (line 27) | def forward(self, inputs, targets):
  function get_box_corners_3d (line 92) | def get_box_corners_3d(centers, headings, sizes, with_flip=False):

FILE: externs/pvcnn/modules/functional/ball_query.py
  function ball_query (line 8) | def ball_query(centers_coords, points_coords, radius, num_neighbors):

FILE: externs/pvcnn/modules/functional/devoxelization.py
  class TrilinearDevoxelization (line 8) | class TrilinearDevoxelization(Function):
    method forward (line 10) | def forward(ctx, features, coords, resolution, is_training=True):
    method backward (line 30) | def backward(ctx, grad_output):

FILE: externs/pvcnn/modules/functional/grouping.py
  class Grouping (line 8) | class Grouping(Function):
    method forward (line 10) | def forward(ctx, features, indices):
    method backward (line 25) | def backward(ctx, grad_output):

FILE: externs/pvcnn/modules/functional/interpolatation.py
  class NeighborInterpolation (line 8) | class NeighborInterpolation(Function):
    method forward (line 10) | def forward(ctx, points_coords, centers_coords, centers_features):
    method backward (line 30) | def backward(ctx, grad_output):

FILE: externs/pvcnn/modules/functional/loss.py
  function kl_loss (line 7) | def kl_loss(x, y):
  function huber_loss (line 13) | def huber_loss(error, delta):

FILE: externs/pvcnn/modules/functional/sampling.py
  class Gather (line 10) | class Gather(Function):
    method forward (line 12) | def forward(ctx, features, indices):
    method backward (line 28) | def backward(ctx, grad_output):
  function furthest_point_sample (line 37) | def furthest_point_sample(coords, num_samples):
  function logits_mask (line 51) | def logits_mask(coords, logits, num_points_per_object):

FILE: externs/pvcnn/modules/functional/src/ball_query/ball_query.cpp
  function ball_query_forward (line 6) | at::Tensor ball_query_forward(at::Tensor centers_coords,

FILE: externs/pvcnn/modules/functional/src/bindings.cpp
  function PYBIND11_MODULE (line 10) | PYBIND11_MODULE(_pvcnn_backend, m) {

FILE: externs/pvcnn/modules/functional/src/grouping/grouping.cpp
  function grouping_forward (line 6) | at::Tensor grouping_forward(at::Tensor features, at::Tensor indices) {
  function grouping_backward (line 26) | at::Tensor grouping_backward(at::Tensor grad_y, at::Tensor indices,

FILE: externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cpp
  function three_nearest_neighbors_interpolate_forward (line 6) | std::vector<at::Tensor>
  function three_nearest_neighbors_interpolate_backward (line 42) | at::Tensor three_nearest_neighbors_interpolate_backward(at::Tensor grad_y,

FILE: externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cpp
  function trilinear_devoxelize_forward (line 18) | std::vector<at::Tensor>
  function trilinear_devoxelize_backward (line 67) | at::Tensor trilinear_devoxelize_backward(const at::Tensor grad_y,

FILE: externs/pvcnn/modules/functional/src/sampling/sampling.cpp
  function gather_features_forward (line 6) | at::Tensor gather_features_forward(at::Tensor features, at::Tensor indic...
  function gather_features_backward (line 25) | at::Tensor gather_features_backward(at::Tensor grad_y, at::Tensor indices,
  function furthest_point_sampling_forward (line 43) | at::Tensor furthest_point_sampling_forward(at::Tensor coords,

FILE: externs/pvcnn/modules/functional/src/voxelization/vox.cpp
  function avg_voxelize_forward (line 17) | std::vector<at::Tensor> avg_voxelize_forward(const at::Tensor features,
  function avg_voxelize_backward (line 54) | at::Tensor avg_voxelize_backward(const at::Tensor grad_y,

FILE: externs/pvcnn/modules/functional/voxelization.py
  class AvgVoxelization (line 8) | class AvgVoxelization(Function):
    method forward (line 10) | def forward(ctx, features, coords, resolution):
    method backward (line 27) | def backward(ctx, grad_output):

FILE: externs/pvcnn/modules/loss.py
  class KLLoss (line 8) | class KLLoss(nn.Module):
    method forward (line 9) | def forward(self, x, y):

FILE: externs/pvcnn/modules/pointnet.py
  class PointNetAModule (line 11) | class PointNetAModule(nn.Module):
    method __init__ (line 12) | def __init__(self, in_channels, out_channels, include_coordinates=True):
    method forward (line 32) | def forward(self, inputs):
    method extra_repr (line 45) | def extra_repr(self):
  class PointNetSAModule (line 49) | class PointNetSAModule(nn.Module):
    method __init__ (line 50) | def __init__(self, num_centers, radius, num_neighbors, in_channels, ou...
    method forward (line 80) | def forward(self, inputs):
    method extra_repr (line 91) | def extra_repr(self):
  class PointNetFPModule (line 95) | class PointNetFPModule(nn.Module):
    method __init__ (line 96) | def __init__(self, in_channels, out_channels):
    method forward (line 100) | def forward(self, inputs):

FILE: externs/pvcnn/modules/pvconv.py
  class PVConv (line 11) | class PVConv(nn.Module):
    method __init__ (line 12) | def __init__(self, in_channels, out_channels, kernel_size, resolution,...
    method forward (line 33) | def forward(self, inputs):
  class ProxyVoxelConv (line 41) | class ProxyVoxelConv(nn.Module):
    method __init__ (line 42) | def __init__(self, in_channels, out_channels, kernel_size, resolution,...
    method forward (line 51) | def forward(self, inputs):

FILE: externs/pvcnn/modules/se.py
  class SE3d (line 6) | class SE3d(nn.Module):
    method __init__ (line 7) | def __init__(self, channel, reduction=8):
    method forward (line 16) | def forward(self, inputs):

FILE: externs/pvcnn/modules/shared_mlp.py
  class SharedMLP (line 6) | class SharedMLP(nn.Module):
    method __init__ (line 7) | def __init__(self, in_channels, out_channels, dim=1):
    method forward (line 29) | def forward(self, inputs):

FILE: externs/pvcnn/modules/voxelization.py
  class Voxelization (line 9) | class Voxelization(nn.Module):
    method __init__ (line 10) | def __init__(self, resolution, normalize=True, eps=0):
    method forward (line 16) | def forward(self, features, coords):
    method extra_repr (line 28) | def extra_repr(self):

FILE: foreground_segment.py
  class BackgroundRemoval (line 9) | class BackgroundRemoval:
    method __init__ (line 10) | def __init__(self, device='cuda'):
    method __call__ (line 26) | def __call__(self, image):
  function process (line 33) | def process(image_path, mask_path):

FILE: generate.py
  function load_model (line 18) | def load_model(cfg,ckpt,strict=True):
  function main (line 27) | def main():

FILE: ldm/DPMPPScheduler.py
  class DPMPPSchedulerOutput (line 13) | class DPMPPSchedulerOutput(BaseOutput):
  class DPMPPScheduler (line 29) | class DPMPPScheduler(DPMSolverMultistepScheduler):
    method step (line 30) | def step(
    method reinit (line 105) | def reinit(self):

FILE: ldm/base_utils.py
  function save_pickle (line 7) | def save_pickle(data, pkl_path):
  function read_pickle (line 12) | def read_pickle(pkl_path):
  function draw_epipolar_line (line 16) | def draw_epipolar_line(F, img0, img1, pt0, color):
  function draw_epipolar_lines (line 29) | def draw_epipolar_lines(F, img0, img1,num=20):
  function compute_F (line 45) | def compute_F(K1, K2, Rt0, Rt1=None):
  function compute_dR_dt (line 58) | def compute_dR_dt(Rt0, Rt1):
  function concat_images (line 65) | def concat_images(img0,img1,vert=False):
  function concat_images_list (line 79) | def concat_images_list(*args,vert=False):
  function pose_inverse (line 87) | def pose_inverse(pose):
  function project_points (line 92) | def project_points(pts,RT,K):
  function draw_keypoints (line 104) | def draw_keypoints(img, kps, colors=None, radius=2):
  function output_points (line 116) | def output_points(fn,pts,colors=None):
  function mask_depth_to_pts (line 125) | def mask_depth_to_pts(mask,depth,K,rgb=None):
  function transform_points_pose (line 135) | def transform_points_pose(pts, pose):
  function pose_apply (line 141) | def pose_apply(pose,pts):
  function downsample_gaussian_blur (line 144) | def downsample_gaussian_blur(img, ratio):

FILE: ldm/data/base.py
  class Txt2ImgIterableBaseDataset (line 7) | class Txt2ImgIterableBaseDataset(IterableDataset):
    method __init__ (line 11) | def __init__(self, num_records=0, valid_ids=None, size=256):
    method __len__ (line 20) | def __len__(self):
    method __iter__ (line 24) | def __iter__(self):
  class PRNGMixin (line 28) | class PRNGMixin(object):
    method prng (line 35) | def prng(self):

FILE: ldm/data/coco.py
  class CocoBase (line 11) | class CocoBase(Dataset):
    method __init__ (line 13) | def __init__(self, size=None, dataroot="", datajson="", onehot_segment...
    method year (line 95) | def year(self):
    method __len__ (line 98) | def __len__(self):
    method preprocess_image (line 101) | def preprocess_image(self, image_path, segmentation_path=None):
    method __getitem__ (line 145) | def __getitem__(self, i):
  class CocoImagesAndCaptionsTrain2017 (line 166) | class CocoImagesAndCaptionsTrain2017(CocoBase):
    method __init__ (line 168) | def __init__(self, size, onehot_segmentation=False, use_stuffthing=Fal...
    method get_split (line 175) | def get_split(self):
    method year (line 178) | def year(self):
  class CocoImagesAndCaptionsValidation2017 (line 182) | class CocoImagesAndCaptionsValidation2017(CocoBase):
    method __init__ (line 184) | def __init__(self, size, onehot_segmentation=False, use_stuffthing=Fal...
    method get_split (line 193) | def get_split(self):
    method year (line 196) | def year(self):
  class CocoImagesAndCaptionsTrain2014 (line 201) | class CocoImagesAndCaptionsTrain2014(CocoBase):
    method __init__ (line 203) | def __init__(self, size, onehot_segmentation=False, use_stuffthing=Fal...
    method get_split (line 212) | def get_split(self):
    method year (line 215) | def year(self):
  class CocoImagesAndCaptionsValidation2014 (line 218) | class CocoImagesAndCaptionsValidation2014(CocoBase):
    method __init__ (line 220) | def __init__(self, size, onehot_segmentation=False, use_stuffthing=Fal...
    method get_split (line 231) | def get_split(self):
    method year (line 234) | def year(self):

FILE: ldm/data/control_sync_dreamer.py
  class ControlSyncDreamerTrainData (line 23) | class ControlSyncDreamerTrainData(SyncDreamerTrainData):
    method __init__ (line 24) | def __init__(self, target_dir, input_dir, proxy_dir, uid_set_pkl, imag...
    method get_data_for_index (line 41) | def get_data_for_index(self, index):
  class ControlSyncDreamerEvalData (line 69) | class ControlSyncDreamerEvalData(Dataset):
    method __init__ (line 70) | def __init__(self, image_dir, proxy_dir, uid_set_pkl):
    method __len__ (line 81) | def __len__(self):
    method get_data_for_index (line 84) | def get_data_for_index(self, index):
    method __getitem__ (line 92) | def __getitem__(self, index):
  class ControlSyncDreamerDataset (line 95) | class ControlSyncDreamerDataset(SyncDreamerDataset):
    method __init__ (line 96) | def __init__(self, target_dir, input_dir, validation_dir, proxy_dir, b...
    method setup (line 112) | def setup(self, stage):

FILE: ldm/data/dummy.py
  class DummyData (line 6) | class DummyData(Dataset):
    method __init__ (line 7) | def __init__(self, length, size):
    method __len__ (line 11) | def __len__(self):
    method __getitem__ (line 14) | def __getitem__(self, i):
  class DummyDataWithEmbeddings (line 21) | class DummyDataWithEmbeddings(Dataset):
    method __init__ (line 22) | def __init__(self, length, size, emb_size):
    method __len__ (line 27) | def __len__(self):
    method __getitem__ (line 30) | def __getitem__(self, i):

FILE: ldm/data/imagenet.py
  function synset2idx (line 20) | def synset2idx(path_to_yaml="data/index_synset.yaml"):
  class ImageNetBase (line 26) | class ImageNetBase(Dataset):
    method __init__ (line 27) | def __init__(self, config=None):
    method __len__ (line 39) | def __len__(self):
    method __getitem__ (line 42) | def __getitem__(self, i):
    method _prepare (line 45) | def _prepare(self):
    method _filter_relpaths (line 48) | def _filter_relpaths(self, relpaths):
    method _prepare_synset_to_human (line 66) | def _prepare_synset_to_human(self):
    method _prepare_idx_to_synset (line 74) | def _prepare_idx_to_synset(self):
    method _prepare_human_to_integer_label (line 80) | def _prepare_human_to_integer_label(self):
    method _load (line 93) | def _load(self):
  class ImageNetTrain (line 134) | class ImageNetTrain(ImageNetBase):
    method __init__ (line 145) | def __init__(self, process_images=True, data_root=None, **kwargs):
    method _prepare (line 150) | def _prepare(self):
  class ImageNetValidation (line 197) | class ImageNetValidation(ImageNetBase):
    method __init__ (line 211) | def __init__(self, process_images=True, data_root=None, **kwargs):
    method _prepare (line 216) | def _prepare(self):
  class ImageNetSR (line 272) | class ImageNetSR(Dataset):
    method __init__ (line 273) | def __init__(self, size=None,
    method __len__ (line 336) | def __len__(self):
    method __getitem__ (line 339) | def __getitem__(self, i):
  class ImageNetSRTrain (line 375) | class ImageNetSRTrain(ImageNetSR):
    method __init__ (line 376) | def __init__(self, **kwargs):
    method get_base (line 379) | def get_base(self):
  class ImageNetSRValidation (line 386) | class ImageNetSRValidation(ImageNetSR):
    method __init__ (line 387) | def __init__(self, **kwargs):
    method get_base (line 390) | def get_base(self):

FILE: ldm/data/inpainting/synthetic_mask.py
  function gen_segment_mask (line 56) | def gen_segment_mask(mask, start, end, brush_width):
  function gen_box_mask (line 66) | def gen_box_mask(mask, masked):
  function gen_round_mask (line 72) | def gen_round_mask(mask, masked, radius):
  function gen_large_mask (line 85) | def gen_large_mask(prng, img_h, img_w,

FILE: ldm/data/laion.py
  class DataWithWings (line 24) | class DataWithWings(torch.utils.data.IterableDataset):
    method __init__ (line 25) | def __init__(self, min_size, transform=None, target_transform=None):
    method _compute_hash (line 50) | def _compute_hash(url, text):
    method _add_tags (line 58) | def _add_tags(self, x):
    method _punsafe_to_class (line 64) | def _punsafe_to_class(self, punsafe):
    method _filter_predicate (line 67) | def _filter_predicate(self, x):
    method __iter__ (line 73) | def __iter__(self):
  function dict_collation_fn (line 77) | def dict_collation_fn(samples, combine_tensors=True, combine_scalars=True):
  class WebDataModuleFromConfig (line 108) | class WebDataModuleFromConfig(pl.LightningDataModule):
    method __init__ (line 109) | def __init__(self, tar_base, batch_size, train=None, validation=None,
    method make_loader (line 125) | def make_loader(self, dataset_config, train=True):
    method filter_size (line 188) | def filter_size(self, x):
    method filter_keys (line 205) | def filter_keys(self, x):
    method train_dataloader (line 211) | def train_dataloader(self):
    method val_dataloader (line 214) | def val_dataloader(self):
    method test_dataloader (line 217) | def test_dataloader(self):
  class AddLR (line 224) | class AddLR(object):
    method __init__ (line 225) | def __init__(self, factor, output_size, initial_size=None, image_key="...
    method pt2np (line 231) | def pt2np(self, x):
    method np2pt (line 235) | def np2pt(self, x):
    method __call__ (line 239) | def __call__(self, sample):
  class AddBW (line 250) | class AddBW(object):
    method __init__ (line 251) | def __init__(self, image_key="jpg"):
    method pt2np (line 254) | def pt2np(self, x):
    method np2pt (line 258) | def np2pt(self, x):
    method __call__ (line 262) | def __call__(self, sample):
  class AddMask (line 273) | class AddMask(PRNGMixin):
    method __init__ (line 274) | def __init__(self, mode="512train", p_drop=0.):
    method __call__ (line 280) | def __call__(self, sample):
  class AddEdge (line 294) | class AddEdge(PRNGMixin):
    method __init__ (line 295) | def __init__(self, mode="512train", mask_edges=True):
    method __call__ (line 304) | def __call__(self, sample):
  function example00 (line 365) | def example00():
  function example01 (line 397) | def example01():
  function example02 (line 438) | def example02():
  function example03 (line 457) | def example03():
  function example04 (line 505) | def example04():

FILE: ldm/data/lsun.py
  class LSUNBase (line 9) | class LSUNBase(Dataset):
    method __init__ (line 10) | def __init__(self,
    method __len__ (line 36) | def __len__(self):
    method __getitem__ (line 39) | def __getitem__(self, i):
  class LSUNChurchesTrain (line 62) | class LSUNChurchesTrain(LSUNBase):
    method __init__ (line 63) | def __init__(self, **kwargs):
  class LSUNChurchesValidation (line 67) | class LSUNChurchesValidation(LSUNBase):
    method __init__ (line 68) | def __init__(self, flip_p=0., **kwargs):
  class LSUNBedroomsTrain (line 73) | class LSUNBedroomsTrain(LSUNBase):
    method __init__ (line 74) | def __init__(self, **kwargs):
  class LSUNBedroomsValidation (line 78) | class LSUNBedroomsValidation(LSUNBase):
    method __init__ (line 79) | def __init__(self, flip_p=0.0, **kwargs):
  class LSUNCatsTrain (line 84) | class LSUNCatsTrain(LSUNBase):
    method __init__ (line 85) | def __init__(self, **kwargs):
  class LSUNCatsValidation (line 89) | class LSUNCatsValidation(LSUNBase):
    method __init__ (line 90) | def __init__(self, flip_p=0., **kwargs):

FILE: ldm/data/nerf_like.py
  function cartesian_to_spherical (line 11) | def cartesian_to_spherical(xyz):
  function get_T (line 21) | def get_T(T_target, T_cond):
  function get_spherical (line 32) | def get_spherical(T_target, T_cond):
  class RTMV (line 43) | class RTMV(Dataset):
    method __init__ (line 44) | def __init__(self, root_dir='datasets/RTMV/google_scanned',\
    method __len__ (line 52) | def __len__(self):
    method __getitem__ (line 55) | def __getitem__(self, idx):
    method blend_rgba (line 79) | def blend_rgba(self, img):
  class GSO (line 84) | class GSO(Dataset):
    method __init__ (line 85) | def __init__(self, root_dir='datasets/GoogleScannedObjects',\
    method __len__ (line 95) | def __len__(self):
    method __getitem__ (line 98) | def __getitem__(self, idx):
    method blend_rgba (line 123) | def blend_rgba(self, img):
  class WILD (line 127) | class WILD(Dataset):
    method __init__ (line 128) | def __init__(self, root_dir='data/nerf_wild',\
    method __len__ (line 136) | def __len__(self):
    method __getitem__ (line 139) | def __getitem__(self, idx):
    method blend_rgba (line 163) | def blend_rgba(self, img):
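  NOTE: the `cartesian_to_spherical` / `get_T` helpers above convert camera positions into spherical coordinates for relative-pose conditioning. A minimal self-contained sketch of the standard conversion (axis conventions here are an assumption, not necessarily the repo's):

```python
import math

def cartesian_to_spherical(x, y, z):
    """Convert a 3D point to (radius, polar angle, azimuth).

    Convention assumed here: polar angle theta is measured from the
    +z axis, azimuth phi from the +x axis in the xy-plane.
    """
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / r) if r > 0 else 0.0  # in [0, pi]
    phi = math.atan2(y, x)                      # in (-pi, pi]
    return r, theta, phi
```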

FILE: ldm/data/simple.py
  function make_transform_multi_folder_data (line 29) | def make_transform_multi_folder_data(paths, caption_files=None, **kwargs):
  function make_nfp_data (line 33) | def make_nfp_data(base_path):
  class VideoDataset (line 42) | class VideoDataset(Dataset):
    method __init__ (line 43) | def __init__(self, root_dir, image_transforms, caption_file, offset=8,...
    method __len__ (line 62) | def __len__(self):
    method __getitem__ (line 65) | def __getitem__(self, index):
    method _load_sample (line 73) | def _load_sample(self, index):
  function make_tranforms (line 103) | def make_tranforms(image_transforms):
  function make_multi_folder_data (line 113) | def make_multi_folder_data(paths, caption_files=None, **kwargs):
  class NfpDataset (line 136) | class NfpDataset(Dataset):
    method __init__ (line 137) | def __init__(self,
    method __len__ (line 151) | def __len__(self):
    method __getitem__ (line 155) | def __getitem__(self, index):
    method _load_im (line 164) | def _load_im(self, filename):
  class ObjaverseDataModuleFromConfig (line 168) | class ObjaverseDataModuleFromConfig(pl.LightningDataModule):
    method __init__ (line 169) | def __init__(self, root_dir, batch_size, total_view, train=None, valid...
    method train_dataloader (line 191) | def train_dataloader(self):
    method val_dataloader (line 197) | def val_dataloader(self):
    method test_dataloader (line 203) | def test_dataloader(self):
  class ObjaverseData (line 208) | class ObjaverseData(Dataset):
    method __init__ (line 209) | def __init__(self,
    method __len__ (line 245) | def __len__(self):
    method cartesian_to_spherical (line 248) | def cartesian_to_spherical(self, xyz):
    method get_T (line 257) | def get_T(self, target_RT, cond_RT):
    method load_im (line 274) | def load_im(self, path, color):
    method __getitem__ (line 287) | def __getitem__(self, index):
    method process_im (line 328) | def process_im(self, im):
  class FolderData (line 332) | class FolderData(Dataset):
    method __init__ (line 333) | def __init__(self,
    method __len__ (line 376) | def __len__(self):
    method __getitem__ (line 382) | def __getitem__(self, index):
    method process_im (line 410) | def process_im(self, im):
  class TransformDataset (line 415) | class TransformDataset():
    method __init__ (line 416) | def __init__(self, ds, extra_label="sksbspic"):
    method __getitem__ (line 426) | def __getitem__(self, index):
    method __len__ (line 444) | def __len__(self):
  function hf_dataset (line 447) | def hf_dataset(
  class TextOnly (line 473) | class TextOnly(Dataset):
    method __init__ (line 474) | def __init__(self, captions, output_size, image_key="image", caption_k...
    method __len__ (line 490) | def __len__(self):
    method __getitem__ (line 493) | def __getitem__(self, index):
    method _load_caption_file (line 498) | def _load_caption_file(self, filename):
  class IdRetreivalDataset (line 507) | class IdRetreivalDataset(FolderData):
    method __init__ (line 508) | def __init__(self, ret_file, *args, **kwargs):
    method __getitem__ (line 513) | def __getitem__(self, index):

FILE: ldm/data/sync_dreamer.py
  class SyncDreamerTrainData (line 21) | class SyncDreamerTrainData(Dataset):
    method __init__ (line 22) | def __init__(self, target_dir, input_dir, uid_set_pkl, image_size=256):
    method __len__ (line 36) | def __len__(self):
    method load_im (line 39) | def load_im(self, path):
    method process_im (line 47) | def process_im(self, im):
    method load_index (line 52) | def load_index(self, filename, index):
    method get_data_for_index (line 57) | def get_data_for_index(self, index):
    method __getitem__ (line 76) | def __getitem__(self, index):
  class SyncDreamerEvalData (line 80) | class SyncDreamerEvalData(Dataset):
    method __init__ (line 81) | def __init__(self, image_dir):
    method __len__ (line 92) | def __len__(self):
    method get_data_for_index (line 95) | def get_data_for_index(self, index):
    method __getitem__ (line 100) | def __getitem__(self, index):
  class SyncDreamerDataset (line 103) | class SyncDreamerDataset(pl.LightningDataModule):
    method __init__ (line 104) | def __init__(self, target_dir, input_dir, validation_dir, batch_size, ...
    method setup (line 116) | def setup(self, stage):
    method train_dataloader (line 123) | def train_dataloader(self):
    method val_dataloader (line 127) | def val_dataloader(self):
    method test_dataloader (line 131) | def test_dataloader(self):

FILE: ldm/lr_scheduler.py
  class LambdaWarmUpCosineScheduler (line 4) | class LambdaWarmUpCosineScheduler:
    method __init__ (line 8) | def __init__(self, warm_up_steps, lr_min, lr_max, lr_start, max_decay_...
    method schedule (line 17) | def schedule(self, n, **kwargs):
    method __call__ (line 32) | def __call__(self, n, **kwargs):
  class LambdaWarmUpCosineScheduler2 (line 36) | class LambdaWarmUpCosineScheduler2:
    method __init__ (line 41) | def __init__(self, warm_up_steps, f_min, f_max, f_start, cycle_lengths...
    method find_in_interval (line 52) | def find_in_interval(self, n):
    method schedule (line 59) | def schedule(self, n, **kwargs):
    method __call__ (line 77) | def __call__(self, n, **kwargs):
  class LambdaLinearScheduler (line 81) | class LambdaLinearScheduler(LambdaWarmUpCosineScheduler2):
    method schedule (line 83) | def schedule(self, n, **kwargs):
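  NOTE: `LambdaWarmUpCosineScheduler` follows the common warm-up-then-cosine-decay pattern suggested by its parameter names (`warm_up_steps`, `lr_min`, `lr_max`, `lr_start`, `max_decay_steps`). A self-contained sketch of that pattern — an assumed formulation, not the repo's verbatim code:

```python
import math

def warmup_cosine_lr(n, warm_up_steps, lr_min, lr_max, lr_start, max_decay_steps):
    """Linear warm-up from lr_start to lr_max, then cosine decay to lr_min."""
    if n < warm_up_steps:
        # linear ramp over the warm-up phase
        return lr_start + (lr_max - lr_start) * n / warm_up_steps
    # progress through the decay phase, clamped to [0, 1]
    t = (n - warm_up_steps) / max(1, max_decay_steps - warm_up_steps)
    t = min(t, 1.0)
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(t * math.pi))
```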

FILE: ldm/models/autoencoder.py
  class VQModel (line 14) | class VQModel(pl.LightningModule):
    method __init__ (line 15) | def __init__(self,
    method ema_scope (line 64) | def ema_scope(self, context=None):
    method init_from_ckpt (line 78) | def init_from_ckpt(self, path, ignore_keys=list()):
    method on_train_batch_end (line 92) | def on_train_batch_end(self, *args, **kwargs):
    method encode (line 96) | def encode(self, x):
    method encode_to_prequant (line 102) | def encode_to_prequant(self, x):
    method decode (line 107) | def decode(self, quant):
    method decode_code (line 112) | def decode_code(self, code_b):
    method forward (line 117) | def forward(self, input, return_pred_indices=False):
    method get_input (line 124) | def get_input(self, batch, k):
    method training_step (line 142) | def training_step(self, batch, batch_idx, optimizer_idx):
    method validation_step (line 164) | def validation_step(self, batch, batch_idx):
    method _validation_step (line 170) | def _validation_step(self, batch, batch_idx, suffix=""):
    method configure_optimizers (line 197) | def configure_optimizers(self):
    method get_last_layer (line 230) | def get_last_layer(self):
    method log_images (line 233) | def log_images(self, batch, only_inputs=False, plot_ema=False, **kwargs):
    method to_rgb (line 255) | def to_rgb(self, x):
  class VQModelInterface (line 264) | class VQModelInterface(VQModel):
    method __init__ (line 265) | def __init__(self, embed_dim, *args, **kwargs):
    method encode (line 269) | def encode(self, x):
    method decode (line 274) | def decode(self, h, force_not_quantize=False):
  class AutoencoderKL (line 285) | class AutoencoderKL(pl.LightningModule):
    method __init__ (line 286) | def __init__(self,
    method init_from_ckpt (line 313) | def init_from_ckpt(self, path, ignore_keys=list()):
    method encode (line 324) | def encode(self, x):
    method decode (line 330) | def decode(self, z):
    method forward (line 335) | def forward(self, input, sample_posterior=True):
    method get_input (line 344) | def get_input(self, batch, k):
    method training_step (line 351) | def training_step(self, batch, batch_idx, optimizer_idx):
    method validation_step (line 372) | def validation_step(self, batch, batch_idx):
    method configure_optimizers (line 386) | def configure_optimizers(self):
    method get_last_layer (line 397) | def get_last_layer(self):
    method log_images (line 401) | def log_images(self, batch, only_inputs=False, **kwargs):
    method to_rgb (line 417) | def to_rgb(self, x):
  class IdentityFirstStage (line 426) | class IdentityFirstStage(torch.nn.Module):
    method __init__ (line 427) | def __init__(self, *args, vq_interface=False, **kwargs):
    method encode (line 431) | def encode(self, x, *args, **kwargs):
    method decode (line 434) | def decode(self, x, *args, **kwargs):
    method quantize (line 437) | def quantize(self, x, *args, **kwargs):
    method forward (line 442) | def forward(self, x, *args, **kwargs):

FILE: ldm/models/diffusion/ctrldemo_sync_dreamer.py
  class ControlSpatialVolumeNet (line 24) | class ControlSpatialVolumeNet(SpatialVolumeNet):
    method __init__ (line 25) | def __init__(self, time_dim, view_dim, view_num,
    method construct_spatial_volume (line 41) | def construct_spatial_volume(self, x, t_embed, v_embed, target_poses, ...
  class CtrlDemo (line 100) | class CtrlDemo(SyncMultiviewDiffusion):
    method __init__ (line 101) | def __init__(self, unet_config, scheduler_config,
    method _init_sampler (line 136) | def _init_sampler(self, latent_size, sample_steps):
    method prepare (line 139) | def prepare(self, batch):
    method inference (line 145) | def inference(self, sampler, batch, cfg_scale, batch_view_num, return_...
    method decode_latents (line 152) | def decode_latents(self, x_sample):
    method get_target_view_feats (line 156) | def get_target_view_feats(self, x_input, spatial_volume, clip_embed, t...
    method training_step (line 182) | def training_step(self, batch):
    method configure_optimizers (line 213) | def configure_optimizers(self):
  class CtrlDemoSampler (line 226) | class CtrlDemoSampler:
    method __init__ (line 227) | def __init__(self, model: CtrlDemo, scheduler_steps, scheduler_name='d...
    method parameterization (line 243) | def parameterization(self):
    method from_pkl (line 254) | def from_pkl(cls, model, pkl_dir):
    method set_ctrl3D_params (line 258) | def set_ctrl3D_params(self, ctrl3D_params_list, strength: float=1.0):
    method concat_proxy (line 262) | def concat_proxy(self, proxys, inferenc_step):
    method denoise_apply_impl (line 276) | def denoise_apply_impl(self, x_target_noisy, time_steps, noise_pred, i...
    method denoise_apply (line 285) | def denoise_apply(self, x_target_noisy, input_info, v_embed, clip_embe...
    method inference (line 328) | def inference(self, input_info, clip_embed, unconditional_scale=1.0, l...
    method get_clip_feature (line 372) | def get_clip_feature(self, x_input, clip_embed, v_embed, target_index):

FILE: ldm/models/diffusion/sync_dreamer.py
  function disabled_train (line 20) | def disabled_train(self, mode=True):
  function disable_training_module (line 25) | def disable_training_module(module: nn.Module):
  function repeat_to_batch (line 32) | def repeat_to_batch(tensor, B, VN):
  class UNetWrapper (line 38) | class UNetWrapper(nn.Module):
    method __init__ (line 39) | def __init__(self, diff_model_config, drop_conditions=False, drop_sche...
    method drop (line 47) | def drop(self, cond, mask):
    method get_trainable_parameters (line 53) | def get_trainable_parameters(self):
    method get_drop_scheme (line 56) | def get_drop_scheme(self, B, device):
    method forward (line 67) | def forward(self, x, t, clip_embed, volume_feats, x_concat, is_train=F...
    method predict_with_unconditional_scale (line 104) | def predict_with_unconditional_scale(self, x, t, clip_embed, volume_fe...
    method predict_for_threestudio (line 123) | def predict_for_threestudio(self, x, t, clip_embed, volume_feats, x_co...
    method predict_with_unconditional_scale_mv (line 138) | def predict_with_unconditional_scale_mv(self, x, t, clip_embed, volume...
  class SpatialVolumeNet (line 146) | class SpatialVolumeNet(nn.Module):
    method __init__ (line 147) | def __init__(self, time_dim, view_dim, view_num,
    method construct_spatial_volume (line 168) | def construct_spatial_volume(self, x, t_embed, v_embed, target_poses, ...
    method construct_view_frustum_volume (line 214) | def construct_view_frustum_volume(self, spatial_volume, t_embed, v_emb...
  class SyncMultiviewDiffusion (line 247) | class SyncMultiviewDiffusion(pl.LightningModule):
    method __init__ (line 248) | def __init__(self, unet_config, scheduler_config,
    method _init_clip_projection (line 287) | def _init_clip_projection(self):
    method _init_multiview (line 296) | def _init_multiview(self):
    method get_viewpoint_embedding (line 309) | def get_viewpoint_embedding(self, batch_size, elevation_ref):
    method _init_first_stage (line 329) | def _init_first_stage(self):
    method _init_clip_image_encoder (line 354) | def _init_clip_image_encoder(self):
    method _init_schedule (line 358) | def _init_schedule(self):
    method _init_time_step_embedding (line 382) | def _init_time_step_embedding(self):
    method encode_first_stage (line 390) | def encode_first_stage(self, x, sample=True):
    method decode_first_stage (line 398) | def decode_first_stage(self, z):
    method prepare (line 403) | def prepare(self, batch):
    method embed_time (line 421) | def embed_time(self, t):
    method get_target_view_feats (line 426) | def get_target_view_feats(self, x_input, spatial_volume, clip_embed, t...
    method training_step (line 451) | def training_step(self, batch):
    method add_noise (line 482) | def add_noise(self, x_start, t):
    method sample (line 498) | def sample(self, sampler, batch, cfg_scale, batch_view_num, return_int...
    method decode_latents (line 520) | def decode_latents(self, x_sample):
    method inference (line 524) | def inference(self, sampler, batch, cfg_scale, batch_view_num, return_...
    method log_image (line 530) | def log_image(self,  x_sample, batch, step, output_dir):
    method validation_step (line 543) | def validation_step(self, batch, batch_idx):
    method configure_optimizers (line 554) | def configure_optimizers(self):
  class SyncDDIMSampler (line 575) | class SyncDDIMSampler:
    method __init__ (line 576) | def __init__(self, model: SyncMultiviewDiffusion, ddim_num_steps, ddim...
    method _make_schedule (line 584) | def _make_schedule(self,  ddim_num_steps, ddim_discretize="uniform", d...
    method denoise_apply_impl (line 602) | def denoise_apply_impl(self, x_target_noisy, index, noise_pred, is_ste...
    method denoise_apply (line 628) | def denoise_apply(self, x_target_noisy, input_info, clip_embed, time_s...
    method sample (line 669) | def sample(self, input_info, clip_embed, unconditional_scale=1.0, log_...
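  NOTE: `SyncDDIMSampler.denoise_apply_impl` presumably performs the standard DDIM update under the epsilon-prediction parameterization. A minimal scalar sketch of the deterministic (eta = 0) update rule — an illustration, not the repo's code:

```python
import math

def ddim_step(x_t, eps_pred, alpha_bar_t, alpha_bar_prev):
    """One deterministic DDIM update (eta = 0) for epsilon parameterization."""
    # predicted clean sample x_0 from the current noise estimate
    x0 = (x_t - math.sqrt(1 - alpha_bar_t) * eps_pred) / math.sqrt(alpha_bar_t)
    # re-noise x_0 to the previous (less noisy) timestep
    return math.sqrt(alpha_bar_prev) * x0 + math.sqrt(1 - alpha_bar_prev) * eps_pred
```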

FILE: ldm/models/diffusion/sync_dreamer_attention.py
  class DepthAttention (line 8) | class DepthAttention(nn.Module):
    method __init__ (line 9) | def __init__(self, query_dim, context_dim, heads, dim_head, output_bia...
    method forward (line 26) | def forward(self, x, context):
  class DepthTransformer (line 50) | class DepthTransformer(nn.Module):
    method __init__ (line 51) | def __init__(self, dim, n_heads, d_head, context_dim=None, checkpoint=...
    method forward (line 75) | def forward(self, x, context=None):
    method _forward (line 78) | def _forward(self, x, context):
  class DepthWiseAttention (line 87) | class DepthWiseAttention(UNetModel):
    method __init__ (line 88) | def __init__(self, volume_dims=(5,16,32,64), *args, **kwargs):
    method forward (line 117) | def forward(self, x, timesteps=None, context=None, source_dict=None, *...
    method get_trainable_parameters (line 140) | def get_trainable_parameters(self):

FILE: ldm/models/diffusion/sync_dreamer_network.py
  class Image2DResBlockWithTV (line 6) | class Image2DResBlockWithTV(nn.Module):
    method __init__ (line 7) | def __init__(self, dim, tdim, vdim):
    method forward (line 21) | def forward(self, x, t, v):
  class NoisyTargetViewEncoder (line 25) | class NoisyTargetViewEncoder(nn.Module):
    method __init__ (line 26) | def __init__(self, time_embed_dim, viewpoint_dim, run_dim=16, output_d...
    method forward (line 39) | def forward(self, x, t, v):
  class SpatialUpTimeBlock (line 52) | class SpatialUpTimeBlock(nn.Module):
    method __init__ (line 53) | def __init__(self, x_in_dim, t_in_dim, out_dim):
    method forward (line 61) | def forward(self, x, t):
  class SpatialTimeBlock (line 65) | class SpatialTimeBlock(nn.Module):
    method __init__ (line 66) | def __init__(self, x_in_dim, t_in_dim, out_dim, stride):
    method forward (line 74) | def forward(self, x, t):
  class SpatialTime3DNet (line 78) | class SpatialTime3DNet(nn.Module):
    method __init__ (line 79) | def __init__(self, time_dim=256, input_dim=128, dims=(32, 64, 128, 256)):
    method forward (line 103) | def forward(self, x, t, control=None):
  class ControlSpatialTime3DNet (line 135) | class ControlSpatialTime3DNet(nn.Module):
    method __init__ (line 136) | def __init__(self, time_dim=256, input_dim=128, proxy_input_dim=3, dim...
    method forward (line 169) | def forward(self, x, t, proxy_feature):
    method make_zero_conv (line 198) | def make_zero_conv(self, dims, channels):
  class FrustumTVBlock (line 201) | class FrustumTVBlock(nn.Module):
    method __init__ (line 202) | def __init__(self, x_dim, t_dim, v_dim, out_dim, stride):
    method forward (line 211) | def forward(self, x, t, v):
  class FrustumTVUpBlock (line 215) | class FrustumTVUpBlock(nn.Module):
    method __init__ (line 216) | def __init__(self, x_dim, t_dim, v_dim, out_dim):
    method forward (line 225) | def forward(self, x, t, v):
  class FrustumTV3DNet (line 229) | class FrustumTV3DNet(nn.Module):
    method __init__ (line 230) | def __init__(self, in_dim, t_dim, v_dim, dims=(32, 64, 128, 256)):
    method forward (line 247) | def forward(self, x, t, v):

FILE: ldm/models/diffusion/sync_dreamer_utils.py
  function project_and_normalize (line 5) | def project_and_normalize(ref_grid, src_proj, length):
  function project_ (line 22) | def project_(ref_grid, src_proj):
  function construct_project_matrix (line 37) | def construct_project_matrix(x_ratio, y_ratio, Ks, poses):
  function get_warp_coordinates (line 54) | def get_warp_coordinates(volume_xyz, warp_size, input_size, Ks, warp_pose):
  function get_proxy_warp_coordinates (line 61) | def get_proxy_warp_coordinates(proxy_xyz, warp_size, input_size, Ks, war...
  function create_target_volume (line 68) | def create_target_volume(depth_size, volume_size, input_image_size, pose...
  function near_far_from_unit_sphere_using_camera_poses (line 106) | def near_far_from_unit_sphere_using_camera_poses(camera_poses):
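  NOTE: `project_and_normalize` / `get_warp_coordinates` implement the usual pinhole projection followed by normalization into [-1, 1] grid-sample coordinates. A self-contained sketch of that operation for a single point (matrix conventions here are assumptions):

```python
def project_point(K, R, t, X, width, height):
    """Project a 3D point with a pinhole camera and normalize to [-1, 1].

    K: 3x3 intrinsics, R: 3x3 rotation, t: 3-vector translation,
    X: 3D point in world coordinates (all as nested lists).
    """
    # camera-space point: Xc = R @ X + t
    Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # pixel coordinates: uvw = K @ Xc, then perspective divide
    uvw = [sum(K[i][j] * Xc[j] for j in range(3)) for i in range(3)]
    u, v = uvw[0] / uvw[2], uvw[1] / uvw[2]
    # normalize to [-1, 1] as expected by grid_sample-style lookups
    return 2.0 * u / (width - 1) - 1.0, 2.0 * v / (height - 1) - 1.0
```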

FILE: ldm/modules/attention.py
  function exists (line 12) | def exists(val):
  function uniq (line 16) | def uniq(arr):
  function default (line 20) | def default(val, d):
  function max_neg_value (line 26) | def max_neg_value(t):
  function init_ (line 30) | def init_(tensor):
  class GEGLU (line 38) | class GEGLU(nn.Module):
    method __init__ (line 39) | def __init__(self, dim_in, dim_out):
    method forward (line 43) | def forward(self, x):
  class ConvGEGLU (line 47) | class ConvGEGLU(nn.Module):
    method __init__ (line 48) | def __init__(self, dim_in, dim_out):
    method forward (line 52) | def forward(self, x):
  class FeedForward (line 57) | class FeedForward(nn.Module):
    method __init__ (line 58) | def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.):
    method forward (line 73) | def forward(self, x):
  function zero_module (line 77) | def zero_module(module):
  function Normalize (line 86) | def Normalize(in_channels):
  class LinearAttention (line 90) | class LinearAttention(nn.Module):
    method __init__ (line 91) | def __init__(self, dim, heads=4, dim_head=32):
    method forward (line 98) | def forward(self, x):
  class SpatialSelfAttention (line 109) | class SpatialSelfAttention(nn.Module):
    method __init__ (line 110) | def __init__(self, in_channels):
    method forward (line 136) | def forward(self, x):
  class CrossAttention (line 162) | class CrossAttention(nn.Module):
    method __init__ (line 163) | def __init__(self, query_dim, context_dim=None, heads=8, dim_head=64, ...
    method forward (line 180) | def forward(self, x, context=None, mask=None):
  class BasicSpatialTransformer (line 210) | class BasicSpatialTransformer(nn.Module):
    method __init__ (line 211) | def __init__(self, dim, n_heads, d_head, context_dim=None, checkpoint=...
    method forward (line 233) | def forward(self, x, context=None):
    method _forward (line 236) | def _forward(self, x, context):
  class BasicTransformerBlock (line 253) | class BasicTransformerBlock(nn.Module):
    method __init__ (line 254) | def __init__(self, dim, n_heads, d_head, dropout=0., context_dim=None,...
    method forward (line 267) | def forward(self, x, context=None):
    method _forward (line 270) | def _forward(self, x, context=None):
  class ConvFeedForward (line 276) | class ConvFeedForward(nn.Module):
    method __init__ (line 277) | def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.):
    method forward (line 292) | def forward(self, x):
  class SpatialTransformer (line 296) | class SpatialTransformer(nn.Module):
    method __init__ (line 304) | def __init__(self, in_channels, n_heads, d_head,
    method forward (line 330) | def forward(self, x, context=None):
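  NOTE: `CrossAttention` is standard multi-head scaled dot-product attention, softmax(Q K^T / sqrt(d)) V. A single-head, pure-Python sketch of the core computation (the real module adds learned projections, heads, and masking):

```python
import math

def attention(Q, K, V):
    """softmax(Q K^T / sqrt(d)) V for small nested-list matrices."""
    d = len(Q[0])
    out = []
    for q in Q:
        # similarity of this query against every key, scaled by sqrt(d)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        # softmax with max-subtraction for numerical stability
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        Z = sum(exps)
        w = [e / Z for e in exps]
        # attention-weighted sum of value rows
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out
```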

FILE: ldm/modules/diffusionmodules/model.py
  function get_timestep_embedding (line 12) | def get_timestep_embedding(timesteps, embedding_dim):
  function nonlinearity (line 33) | def nonlinearity(x):
  function Normalize (line 38) | def Normalize(in_channels, num_groups=32):
  class Upsample (line 42) | class Upsample(nn.Module):
    method __init__ (line 43) | def __init__(self, in_channels, with_conv):
    method forward (line 53) | def forward(self, x):
  class Downsample (line 60) | class Downsample(nn.Module):
    method __init__ (line 61) | def __init__(self, in_channels, with_conv):
    method forward (line 72) | def forward(self, x):
  class ResnetBlock (line 82) | class ResnetBlock(nn.Module):
    method __init__ (line 83) | def __init__(self, *, in_channels, out_channels=None, conv_shortcut=Fa...
    method forward (line 121) | def forward(self, x, temb):
  class LinAttnBlock (line 144) | class LinAttnBlock(LinearAttention):
    method __init__ (line 146) | def __init__(self, in_channels):
  class AttnBlock (line 150) | class AttnBlock(nn.Module):
    method __init__ (line 151) | def __init__(self, in_channels):
    method forward (line 178) | def forward(self, x):
  function make_attn (line 205) | def make_attn(in_channels, attn_type="vanilla"):
  class Model (line 216) | class Model(nn.Module):
    method __init__ (line 217) | def __init__(self, *, ch, out_ch, ch_mult=(1,2,4,8), num_res_blocks,
    method forward (line 316) | def forward(self, x, t=None, context=None):
    method get_last_layer (line 364) | def get_last_layer(self):
  class Encoder (line 368) | class Encoder(nn.Module):
    method __init__ (line 369) | def __init__(self, *, ch, out_ch, ch_mult=(1,2,4,8), num_res_blocks,
    method forward (line 434) | def forward(self, x):
  class Decoder (line 462) | class Decoder(nn.Module):
    method __init__ (line 463) | def __init__(self, *, ch, out_ch, ch_mult=(1,2,4,8), num_res_blocks,
    method forward (line 535) | def forward(self, z):
  class SimpleDecoder (line 571) | class SimpleDecoder(nn.Module):
    method __init__ (line 572) | def __init__(self, in_channels, out_channels, *args, **kwargs):
    method forward (line 594) | def forward(self, x):
  class UpsampleDecoder (line 607) | class UpsampleDecoder(nn.Module):
    method __init__ (line 608) | def __init__(self, in_channels, out_channels, ch, num_res_blocks, reso...
    method forward (line 641) | def forward(self, x):
  class LatentRescaler (line 655) | class LatentRescaler(nn.Module):
    method __init__ (line 656) | def __init__(self, factor, in_channels, mid_channels, out_channels, de...
    method forward (line 680) | def forward(self, x):
  class MergedRescaleEncoder (line 692) | class MergedRescaleEncoder(nn.Module):
    method __init__ (line 693) | def __init__(self, in_channels, ch, resolution, out_ch, num_res_blocks,
    method forward (line 705) | def forward(self, x):
  class MergedRescaleDecoder (line 711) | class MergedRescaleDecoder(nn.Module):
    method __init__ (line 712) | def __init__(self, z_channels, out_ch, resolution, num_res_blocks, att...
    method forward (line 722) | def forward(self, x):
  class Upsampler (line 728) | class Upsampler(nn.Module):
    method __init__ (line 729) | def __init__(self, in_size, out_size, in_channels, out_channels, ch_mu...
    method forward (line 741) | def forward(self, x):
  class Resize (line 747) | class Resize(nn.Module):
    method __init__ (line 748) | def __init__(self, in_channels=None, learned=False, mode="bilinear"):
    method forward (line 763) | def forward(self, x, scale_factor=1.0):
  class FirstStagePostProcessor (line 770) | class FirstStagePostProcessor(nn.Module):
    method __init__ (line 772) | def __init__(self, ch_mult:list, in_channels,
    method instantiate_pretrained (line 807) | def instantiate_pretrained(self, config):
    method encode_with_pretrained (line 816) | def encode_with_pretrained(self,x):
    method forward (line 822) | def forward(self,x):
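  NOTE: `get_timestep_embedding` produces the sinusoidal timestep embedding from the DDPM/Transformer literature. A pure-Python sketch of the common formulation (the repo's exact frequency spacing may differ slightly):

```python
import math

def timestep_embedding(t, dim):
    """Sinusoidal embedding of a scalar timestep into a dim-vector."""
    half = dim // 2
    # geometrically spaced frequencies from 1 down to ~1/10000
    freqs = [math.exp(-math.log(10000) * i / half) for i in range(half)]
    args = [t * f for f in freqs]
    # concatenate sin and cos components
    return [math.sin(a) for a in args] + [math.cos(a) for a in args]
```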

FILE: ldm/modules/diffusionmodules/openaimodel.py
  function convert_module_to_f16 (line 25) | def convert_module_to_f16(x):
  function convert_module_to_f32 (line 28) | def convert_module_to_f32(x):
  class AttentionPool2d (line 33) | class AttentionPool2d(nn.Module):
    method __init__ (line 38) | def __init__(
    method forward (line 52) | def forward(self, x):
  class TimestepBlock (line 63) | class TimestepBlock(nn.Module):
    method forward (line 69) | def forward(self, x, emb):
  class TimestepEmbedSequential (line 75) | class TimestepEmbedSequential(nn.Sequential, TimestepBlock):
    method forward (line 81) | def forward(self, x, emb, context=None):
  class Upsample (line 92) | class Upsample(nn.Module):
    method __init__ (line 101) | def __init__(self, channels, use_conv, dims=2, out_channels=None, padd...
    method forward (line 110) | def forward(self, x):
  class TransposedUpsample (line 122) | class TransposedUpsample(nn.Module):
    method __init__ (line 124) | def __init__(self, channels, out_channels=None, ks=5):
    method forward (line 131) | def forward(self,x):
  class Downsample (line 135) | class Downsample(nn.Module):
    method __init__ (line 144) | def __init__(self, channels, use_conv, dims=2, out_channels=None,paddi...
    method forward (line 159) | def forward(self, x):
  class ResBlock (line 164) | class ResBlock(TimestepBlock):
    method __init__ (line 180) | def __init__(
    method forward (line 244) | def forward(self, x, emb):
    method _forward (line 256) | def _forward(self, x, emb):
  class AttentionBlock (line 279) | class AttentionBlock(nn.Module):
    method __init__ (line 286) | def __init__(
    method forward (line 315) | def forward(self, x):
    method _forward (line 319) | def _forward(self, x):
  function count_flops_attn (line 328) | def count_flops_attn(model, _x, y):
  class QKVAttentionLegacy (line 348) | class QKVAttentionLegacy(nn.Module):
    method __init__ (line 353) | def __init__(self, n_heads):
    method forward (line 357) | def forward(self, qkv):
    method count_flops (line 376) | def count_flops(model, _x, y):
  class QKVAttention (line 380) | class QKVAttention(nn.Module):
    method __init__ (line 385) | def __init__(self, n_heads):
    method forward (line 389) | def forward(self, qkv):
    method count_flops (line 410) | def count_flops(model, _x, y):
  class UNetModel (line 414) | class UNetModel(nn.Module):
    method __init__ (line 444) | def __init__(
    method convert_to_fp16 (line 729) | def convert_to_fp16(self):
    method convert_to_fp32 (line 737) | def convert_to_fp32(self):
    method forward (line 745) | def forward(self, x, timesteps=None, context=None, y=None,**kwargs):
  class EncoderUNetModel (line 780) | class EncoderUNetModel(nn.Module):
    method __init__ (line 786) | def __init__(
    method convert_to_fp16 (line 959) | def convert_to_fp16(self):
    method convert_to_fp32 (line 966) | def convert_to_fp32(self):
    method forward (line 973) | def forward(self, x, timesteps):

FILE: ldm/modules/diffusionmodules/util.py
  function make_beta_schedule (line 21) | def make_beta_schedule(schedule, n_timestep, linear_start=1e-4, linear_e...
  function make_ddim_timesteps (line 46) | def make_ddim_timesteps(ddim_discr_method, num_ddim_timesteps, num_ddpm_...
  function make_ddim_sampling_parameters (line 63) | def make_ddim_sampling_parameters(alphacums, ddim_timesteps, eta, verbos...
  function betas_for_alpha_bar (line 77) | def betas_for_alpha_bar(num_diffusion_timesteps, alpha_bar, max_beta=0.9...
  function extract_into_tensor (line 96) | def extract_into_tensor(a, t, x_shape):
  function checkpoint (line 102) | def checkpoint(func, inputs, params, flag):
  class CheckpointFunction (line 119) | class CheckpointFunction(torch.autograd.Function):
    method forward (line 121) | def forward(ctx, run_function, length, *args):
    method backward (line 131) | def backward(ctx, *output_grads):
  function timestep_embedding (line 151) | def timestep_embedding(timesteps, dim, max_period=10000, repeat_only=Fal...
  function zero_module (line 174) | def zero_module(module):
  function scale_module (line 183) | def scale_module(module, scale):
  function mean_flat (line 192) | def mean_flat(tensor):
  function normalization (line 199) | def normalization(channels):
  class SiLU (line 209) | class SiLU(nn.Module):
    method forward (line 210) | def forward(self, x):
  class GroupNorm32 (line 214) | class GroupNorm32(nn.GroupNorm):
    method forward (line 215) | def forward(self, x):
  function conv_nd (line 218) | def conv_nd(dims, *args, **kwargs):
  function linear (line 231) | def linear(*args, **kwargs):
  function avg_pool_nd (line 238) | def avg_pool_nd(dims, *args, **kwargs):
  class HybridConditioner (line 251) | class HybridConditioner(nn.Module):
    method __init__ (line 253) | def __init__(self, c_concat_config, c_crossattn_config):
    method forward (line 258) | def forward(self, c_concat, c_crossattn):
  function noise_like (line 264) | def noise_like(shape, device, repeat=False):
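
The `timestep_embedding` helper listed above follows the standard sinusoidal scheme used by diffusion UNets. As a rough dependency-free sketch of that scheme (the repo's version is vectorised in torch; this scalar rewrite, including its argument names, is illustrative only):

```python
import math

def timestep_embedding(timesteps, dim, max_period=10000):
    # Sinusoidal embeddings: for each timestep t, concatenate cos(t * f_k)
    # and sin(t * f_k) over geometrically spaced frequencies
    # f_k = max_period^(-k / half), k = 0 .. half-1.
    half = dim // 2
    freqs = [math.exp(-math.log(max_period) * k / half) for k in range(half)]
    out = []
    for t in timesteps:
        args = [t * f for f in freqs]
        row = [math.cos(a) for a in args] + [math.sin(a) for a in args]
        if dim % 2:  # pad odd dims with a zero column
            row.append(0.0)
        out.append(row)
    return out
```

At t = 0 every cosine channel is 1 and every sine channel is 0, which is a quick sanity check for any implementation of this embedding.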

FILE: ldm/modules/distributions/distributions.py
  class AbstractDistribution (line 5) | class AbstractDistribution:
    method sample (line 6) | def sample(self):
    method mode (line 9) | def mode(self):
  class DiracDistribution (line 13) | class DiracDistribution(AbstractDistribution):
    method __init__ (line 14) | def __init__(self, value):
    method sample (line 17) | def sample(self):
    method mode (line 20) | def mode(self):
  class DiagonalGaussianDistribution (line 24) | class DiagonalGaussianDistribution(object):
    method __init__ (line 25) | def __init__(self, parameters, deterministic=False):
    method sample (line 35) | def sample(self):
    method kl (line 39) | def kl(self, other=None):
    method nll (line 53) | def nll(self, sample, dims=[1,2,3]):
    method mode (line 61) | def mode(self):
  function normal_kl (line 65) | def normal_kl(mean1, logvar1, mean2, logvar2):
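
`DiagonalGaussianDistribution` is the VAE posterior used by latent-diffusion autoencoders: the encoder emits a mean and log-variance per latent dimension, sampling uses the reparameterisation trick, and the KL against a standard normal has a closed form. A minimal scalar sketch of that idea (the class name and list-based layout here are mine; the repo's version operates on torch tensors):

```python
import math
import random

class DiagonalGaussian:
    # Per-dimension (mean, logvar) parameterisation of a diagonal Gaussian.
    def __init__(self, means, logvars):
        self.means = means
        self.logvars = logvars
        self.stds = [math.exp(0.5 * lv) for lv in logvars]

    def sample(self, rng=random):
        # Reparameterisation trick: mean + std * eps, with eps ~ N(0, 1),
        # so gradients can flow through mean and std.
        return [m + s * rng.gauss(0.0, 1.0) for m, s in zip(self.means, self.stds)]

    def kl(self):
        # Closed-form KL( N(mean, var) || N(0, 1) ), summed over dimensions:
        # 0.5 * (mean^2 + var - 1 - logvar) per dimension.
        return 0.5 * sum(
            m * m + math.exp(lv) - 1.0 - lv
            for m, lv in zip(self.means, self.logvars)
        )
```

The KL is zero exactly when the posterior already equals the prior (mean 0, logvar 0), which makes a convenient unit test.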

FILE: ldm/modules/encoders/modules.py
  class AbstractEncoder (line 12) | class AbstractEncoder(nn.Module):
    method __init__ (line 13) | def __init__(self):
    method encode (line 16) | def encode(self, *args, **kwargs):
  class IdentityEncoder (line 19) | class IdentityEncoder(AbstractEncoder):
    method encode (line 21) | def encode(self, x):
  class FaceClipEncoder (line 24) | class FaceClipEncoder(AbstractEncoder):
    method __init__ (line 25) | def __init__(self, augment=True, retreival_key=None):
    method forward (line 31) | def forward(self, img):
    method encode (line 54) | def encode(self, img):
  class FaceIdClipEncoder (line 61) | class FaceIdClipEncoder(AbstractEncoder):
    method __init__ (line 62) | def __init__(self):
    method forward (line 69) | def forward(self, img):
    method encode (line 84) | def encode(self, img):
  class ClassEmbedder (line 91) | class ClassEmbedder(nn.Module):
    method __init__ (line 92) | def __init__(self, embed_dim, n_classes=1000, key='class'):
    method forward (line 97) | def forward(self, batch, key=None):
  class TransformerEmbedder (line 106) | class TransformerEmbedder(AbstractEncoder):
    method __init__ (line 108) | def __init__(self, n_embed, n_layer, vocab_size, max_seq_len=77, devic...
    method forward (line 114) | def forward(self, tokens):
    method encode (line 119) | def encode(self, x):
  class BERTTokenizer (line 123) | class BERTTokenizer(AbstractEncoder):
    method __init__ (line 125) | def __init__(self, device="cuda", vq_interface=True, max_length=77):
    method forward (line 133) | def forward(self, text):
    method encode (line 140) | def encode(self, text):
    method decode (line 146) | def decode(self, text):
  class BERTEmbedder (line 150) | class BERTEmbedder(AbstractEncoder):
    method __init__ (line 152) | def __init__(self, n_embed, n_layer, vocab_size=30522, max_seq_len=77,
    method forward (line 163) | def forward(self, text):
    method encode (line 171) | def encode(self, text):
  function disabled_train (line 178) | def disabled_train(self, mode=True):
  class FrozenT5Embedder (line 184) | class FrozenT5Embedder(AbstractEncoder):
    method __init__ (line 186) | def __init__(self, version="google/t5-v1_1-large", device="cuda", max_...
    method freeze (line 194) | def freeze(self):
    method forward (line 200) | def forward(self, text):
    method encode (line 209) | def encode(self, text):
  class FrozenFaceEncoder (line 215) | class FrozenFaceEncoder(AbstractEncoder):
    method __init__ (line 216) | def __init__(self, model_path, augment=False):
    method forward (line 237) | def forward(self, img):
    method encode (line 251) | def encode(self, img):
  class FrozenCLIPEmbedder (line 254) | class FrozenCLIPEmbedder(AbstractEncoder):
    method __init__ (line 256) | def __init__(self, version="openai/clip-vit-large-patch14", device="cu...
    method freeze (line 264) | def freeze(self):
    method forward (line 270) | def forward(self, text):
    method encode (line 279) | def encode(self, text):
  class ClipImageProjector (line 284) | class ClipImageProjector(AbstractEncoder):
    method __init__ (line 288) | def __init__(self, version="openai/clip-vit-large-patch14", max_length...
    method get_null_cond (line 301) | def get_null_cond(self, version, max_length):
    method preprocess (line 307) | def preprocess(self, x):
    method forward (line 317) | def forward(self, x):
    method encode (line 327) | def encode(self, im):
  class ProjectedFrozenCLIPEmbedder (line 330) | class ProjectedFrozenCLIPEmbedder(AbstractEncoder):
    method __init__ (line 331) | def __init__(self, version="openai/clip-vit-large-patch14", device="cu...
    method forward (line 336) | def forward(self, text):
    method encode (line 340) | def encode(self, text):
  class FrozenCLIPImageEmbedder (line 343) | class FrozenCLIPImageEmbedder(AbstractEncoder):
    method __init__ (line 348) | def __init__(
    method preprocess (line 363) | def preprocess(self, x):
    method forward (line 373) | def forward(self, x):
    method encode (line 381) | def encode(self, im):
  class FrozenCLIPImageMutliEmbedder (line 387) | class FrozenCLIPImageMutliEmbedder(AbstractEncoder):
    method __init__ (line 392) | def __init__(
    method preprocess (line 409) | def preprocess(self, x):
    method forward (line 423) | def forward(self, x):
    method encode (line 440) | def encode(self, im):
  class SpatialRescaler (line 443) | class SpatialRescaler(nn.Module):
    method __init__ (line 444) | def __init__(self,
    method forward (line 462) | def forward(self,x):
    method encode (line 471) | def encode(self, x):
  class LowScaleEncoder (line 479) | class LowScaleEncoder(nn.Module):
    method __init__ (line 480) | def __init__(self, model_config, linear_start, linear_end, timesteps=1...
    method register_schedule (line 490) | def register_schedule(self, beta_schedule="linear", timesteps=1000,
    method q_sample (line 517) | def q_sample(self, x_start, t, noise=None):
    method forward (line 522) | def forward(self, x):
    method decode (line 532) | def decode(self, z):

FILE: ldm/modules/x_transformer.py
  class AbsolutePositionalEmbedding (line 25) | class AbsolutePositionalEmbedding(nn.Module):
    method __init__ (line 26) | def __init__(self, dim, max_seq_len):
    method init_ (line 31) | def init_(self):
    method forward (line 34) | def forward(self, x):
  class FixedPositionalEmbedding (line 39) | class FixedPositionalEmbedding(nn.Module):
    method __init__ (line 40) | def __init__(self, dim):
    method forward (line 45) | def forward(self, x, seq_dim=1, offset=0):
  function exists (line 54) | def exists(val):
  function default (line 58) | def default(val, d):
  function always (line 64) | def always(val):
  function not_equals (line 70) | def not_equals(val):
  function equals (line 76) | def equals(val):
  function max_neg_value (line 82) | def max_neg_value(tensor):
  function pick_and_pop (line 88) | def pick_and_pop(keys, d):
  function group_dict_by_key (line 93) | def group_dict_by_key(cond, d):
  function string_begins_with (line 102) | def string_begins_with(prefix, str):
  function group_by_key_prefix (line 106) | def group_by_key_prefix(prefix, d):
  function groupby_prefix_and_trim (line 110) | def groupby_prefix_and_trim(prefix, d):
  class Scale (line 117) | class Scale(nn.Module):
    method __init__ (line 118) | def __init__(self, value, fn):
    method forward (line 123) | def forward(self, x, **kwargs):
  class Rezero (line 128) | class Rezero(nn.Module):
    method __init__ (line 129) | def __init__(self, fn):
    method forward (line 134) | def forward(self, x, **kwargs):
  class ScaleNorm (line 139) | class ScaleNorm(nn.Module):
    method __init__ (line 140) | def __init__(self, dim, eps=1e-5):
    method forward (line 146) | def forward(self, x):
  class RMSNorm (line 151) | class RMSNorm(nn.Module):
    method __init__ (line 152) | def __init__(self, dim, eps=1e-8):
    method forward (line 158) | def forward(self, x):
  class Residual (line 163) | class Residual(nn.Module):
    method forward (line 164) | def forward(self, x, residual):
  class GRUGating (line 168) | class GRUGating(nn.Module):
    method __init__ (line 169) | def __init__(self, dim):
    method forward (line 173) | def forward(self, x, residual):
  class GEGLU (line 184) | class GEGLU(nn.Module):
    method __init__ (line 185) | def __init__(self, dim_in, dim_out):
    method forward (line 189) | def forward(self, x):
  class FeedForward (line 194) | class FeedForward(nn.Module):
    method __init__ (line 195) | def __init__(self, dim, dim_out=None, mult=4, glu=False, dropout=0.):
    method forward (line 210) | def forward(self, x):
  class Attention (line 215) | class Attention(nn.Module):
    method __init__ (line 216) | def __init__(
    method forward (line 268) | def forward(
  class AttentionLayers (line 370) | class AttentionLayers(nn.Module):
    method __init__ (line 371) | def __init__(
    method forward (line 481) | def forward(
  class Encoder (line 541) | class Encoder(AttentionLayers):
    method __init__ (line 542) | def __init__(self, **kwargs):
  class TransformerWrapper (line 548) | class TransformerWrapper(nn.Module):
    method __init__ (line 549) | def __init__(
    method init_ (line 595) | def init_(self):
    method forward (line 598) | def forward(

FILE: ldm/thirdp/psp/helpers.py
  class Flatten (line 12) | class Flatten(Module):
    method forward (line 13) | def forward(self, input):
  function l2_norm (line 17) | def l2_norm(input, axis=1):
  class Bottleneck (line 23) | class Bottleneck(namedtuple('Block', ['in_channel', 'depth', 'stride'])):
  function get_block (line 27) | def get_block(in_channel, depth, num_units, stride=2):
  function get_blocks (line 31) | def get_blocks(num_layers):
  class SEModule (line 58) | class SEModule(Module):
    method __init__ (line 59) | def __init__(self, channels, reduction):
    method forward (line 67) | def forward(self, x):
  class bottleneck_IR (line 77) | class bottleneck_IR(Module):
    method __init__ (line 78) | def __init__(self, in_channel, depth, stride):
    method forward (line 93) | def forward(self, x):
  class bottleneck_IR_SE (line 99) | class bottleneck_IR_SE(Module):
    method __init__ (line 100) | def __init__(self, in_channel, depth, stride):
    method forward (line 118) | def forward(self, x):

FILE: ldm/thirdp/psp/id_loss.py
  class IDFeatures (line 7) | class IDFeatures(nn.Module):
    method __init__ (line 8) | def __init__(self, model_path):
    method forward (line 16) | def forward(self, x, crop=False):

FILE: ldm/thirdp/psp/model_irse.py
  class Backbone (line 11) | class Backbone(Module):
    method __init__ (line 12) | def __init__(self, input_size, num_layers, mode='ir', drop_ratio=0.4, ...
    method forward (line 46) | def forward(self, x):
  function IR_50 (line 53) | def IR_50(input_size):
  function IR_101 (line 59) | def IR_101(input_size):
  function IR_152 (line 65) | def IR_152(input_size):
  function IR_SE_50 (line 71) | def IR_SE_50(input_size):
  function IR_SE_101 (line 77) | def IR_SE_101(input_size):
  function IR_SE_152 (line 83) | def IR_SE_152(input_size):

FILE: ldm/util.py
  function normalize (line 26) | def normalize(vec):
  function look_at (line 31) | def look_at(cam_location, point):
  function az_el_to_points (line 54) | def az_el_to_points(azimuths, elevations):
  function get_3x4_RT_matrix_from_az_el (line 60) | def get_3x4_RT_matrix_from_az_el(az, el, distance):
  function pil_rectangle_crop (line 78) | def pil_rectangle_crop(im):
  function add_margin (line 97) | def add_margin(pil_img, color=0, size=256):
  function create_carvekit_interface (line 104) | def create_carvekit_interface():
  function load_and_preprocess (line 121) | def load_and_preprocess(interface, input_im):
  function log_txt_as_img (line 147) | def log_txt_as_img(wh, xc, size=10):
  function ismap (line 171) | def ismap(x):
  function isimage (line 177) | def isimage(x):
  function exists (line 183) | def exists(x):
  function default (line 187) | def default(val, d):
  function mean_flat (line 193) | def mean_flat(tensor):
  function count_params (line 201) | def count_params(model, verbose=False):
  function instantiate_from_config (line 208) | def instantiate_from_config(config):
  function get_obj_from_str (line 218) | def get_obj_from_str(string, reload=False):
  class AdamWwithEMAandWings (line 226) | class AdamWwithEMAandWings(optim.Optimizer):
    method __init__ (line 228) | def __init__(self, params, lr=1.e-3, betas=(0.9, 0.999), eps=1.e-8,  #...
    method __setstate__ (line 249) | def __setstate__(self, state):
    method step (line 255) | def step(self, closure=None):
  function prepare_inputs (line 335) | def prepare_inputs(image_input, elevation_input, crop_size=-1, image_siz...
  function prepare_proxy (line 364) | def prepare_proxy(proxy_path, start_view_index=0):
  function save_pickle (line 376) | def save_pickle(data, pkl_path):
  function read_pickle (line 381) | def read_pickle(pkl_path):
  class Ctrl3DParams (line 387) | class Ctrl3DParams:
  function sample_proxy (line 393) | def sample_proxy(object_dir, num_proxy=256, overwrite=False):
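
`instantiate_from_config` / `get_obj_from_str` above implement the config-driven construction pattern common to latent-diffusion codebases: a YAML node names a dotted `target` class plus `params` kwargs. A self-contained sketch of that pattern (simplified relative to the repo's version, which also handles a few special-case targets):

```python
import importlib

def get_obj_from_str(string, reload=False):
    # "pkg.module.ClassName" -> the class (or function) object.
    module, cls = string.rsplit(".", 1)
    mod = importlib.import_module(module)
    if reload:
        importlib.reload(mod)
    return getattr(mod, cls)

def instantiate_from_config(config):
    # config: a dict with a dotted "target" path and optional "params" kwargs.
    if "target" not in config:
        raise KeyError("Expected key `target` to instantiate.")
    return get_obj_from_str(config["target"])(**config.get("params", dict()))
```

This is what lets the `target:` entries in `configs/*.yaml` (e.g. `ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo`) be swapped without touching the training script.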

FILE: raymarching/backend.py
  function find_cl_path (line 17) | def find_cl_path():

FILE: raymarching/raymarching.py
  class _near_far_from_aabb (line 19) | class _near_far_from_aabb(Function):
    method forward (line 22) | def forward(ctx, rays_o, rays_d, aabb, min_near=0.2):
  class _sph_from_ray (line 52) | class _sph_from_ray(Function):
    method forward (line 55) | def forward(ctx, rays_o, rays_d, radius):
  class _morton3D (line 83) | class _morton3D(Function):
    method forward (line 85) | def forward(ctx, coords):
  class _morton3D_invert (line 106) | class _morton3D_invert(Function):
    method forward (line 108) | def forward(ctx, indices):
  class _packbits (line 129) | class _packbits(Function):
    method forward (line 132) | def forward(ctx, grid, thresh, bitfield=None):
  class _march_rays_train (line 161) | class _march_rays_train(Function):
    method forward (line 164) | def forward(ctx, rays_o, rays_d, bound, density_bitfield, C, H, nears,...
  class _composite_rays_train (line 238) | class _composite_rays_train(Function):
    method forward (line 241) | def forward(ctx, sigmas, rgbs, deltas, rays, T_thresh=1e-4):
    method backward (line 273) | def backward(ctx, grad_weights_sum, grad_depth, grad_image):
  class _march_rays (line 297) | class _march_rays(Function):
    method forward (line 300) | def forward(ctx, n_alive, n_step, rays_alive, rays_t, rays_o, rays_d, ...
  class _composite_rays (line 351) | class _composite_rays(Function):
    method forward (line 354) | def forward(ctx, n_alive, n_step, rays_alive, rays_t, sigmas, rgbs, de...

FILE: raymarching/setup.py
  function find_cl_path (line 18) | def find_cl_path():

FILE: raymarching/src/bindings.cpp
  function PYBIND11_MODULE (line 5) | PYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {

FILE: renderer/agg_net.py
  function weights_init (line 5) | def weights_init(m):
  class NeRF (line 11) | class NeRF(nn.Module):
    method __init__ (line 12) | def __init__(self, vol_n=8+8, feat_ch=8+16+32+3, hid_n=64):
    method forward (line 27) | def forward(self, vox_feat, img_feat_rgb_dir, source_img_mask):
  class Agg (line 44) | class Agg(nn.Module):
    method __init__ (line 45) | def __init__(self, feat_ch):
    method masked_mean_var (line 58) | def masked_mean_var(self, img_feat_rgb, source_img_mask):
    method forward (line 66) | def forward(self, img_feat_rgb_dir, source_img_mask):

FILE: renderer/cost_reg_net.py
  class ConvBnReLU3D (line 3) | class ConvBnReLU3D(nn.Module):
    method __init__ (line 4) | def __init__(self, in_channels, out_channels, kernel_size=3, stride=1,...
    method forward (line 10) | def forward(self, x):
  class CostRegNet (line 13) | class CostRegNet(nn.Module):
    method __init__ (line 14) | def __init__(self, in_channels, norm_act=nn.BatchNorm3d):
    method forward (line 44) | def forward(self, x):
  class MinCostRegNet (line 60) | class MinCostRegNet(nn.Module):
    method __init__ (line 61) | def __init__(self, in_channels, norm_act=nn.BatchNorm3d):
    method forward (line 84) | def forward(self, x):

FILE: renderer/dummy_dataset.py
  class DummyDataset (line 5) | class DummyDataset(pl.LightningDataModule):
    method __init__ (line 6) | def __init__(self,seed):
    method setup (line 9) | def setup(self, stage):
    method train_dataloader (line 16) | def train_dataloader(self):
    method val_dataloader (line 19) | def val_dataloader(self):
    method test_dataloader (line 22) | def test_dataloader(self):
  class DummyData (line 25) | class DummyData(Dataset):
    method __init__ (line 26) | def __init__(self,is_train):
    method __len__ (line 29) | def __len__(self):
    method __getitem__ (line 35) | def __getitem__(self, index):

FILE: renderer/feature_net.py
  class ConvBnReLU (line 4) | class ConvBnReLU(nn.Module):
    method __init__ (line 5) | def __init__(self, in_channels, out_channels, kernel_size=3, stride=1,...
    method forward (line 11) | def forward(self, x):
  class FeatureNet (line 14) | class FeatureNet(nn.Module):
    method __init__ (line 15) | def __init__(self, norm_act=nn.BatchNorm2d):
    method _upsample_add (line 28) | def _upsample_add(self, x, y):
    method forward (line 31) | def forward(self, x):

FILE: renderer/neus_networks.py
  class Embedder (line 10) | class Embedder:
    method __init__ (line 11) | def __init__(self, **kwargs):
    method create_embedding_fn (line 15) | def create_embedding_fn(self):
    method embed (line 39) | def embed(self, inputs):
  function get_embedder (line 43) | def get_embedder(multires, input_dims=3):
  class SDFNetwork (line 60) | class SDFNetwork(nn.Module):
    method __init__ (line 61) | def __init__(self, d_in, d_out, d_hidden, n_layers, skip_in=(4,), mult...
    method forward (line 113) | def forward(self, inputs):
    method sdf (line 132) | def sdf(self, x):
    method sdf_hidden_appearance (line 135) | def sdf_hidden_appearance(self, x):
    method gradient (line 138) | def gradient(self, x):
    method sdf_normal (line 152) | def sdf_normal(self, x):
  class SDFNetworkWithFeature (line 166) | class SDFNetworkWithFeature(nn.Module):
    method __init__ (line 167) | def __init__(self, cube, dp_in, df_in, d_out, d_hidden, n_layers, skip...
    method forward (line 221) | def forward(self, points):
    method sdf (line 247) | def sdf(self, x):
    method sdf_hidden_appearance (line 250) | def sdf_hidden_appearance(self, x):
    method gradient (line 253) | def gradient(self, x):
    method sdf_normal (line 267) | def sdf_normal(self, x):
  class VanillaMLP (line 282) | class VanillaMLP(nn.Module):
    method __init__ (line 283) | def __init__(self, dim_in, dim_out, n_neurons, n_hidden_layers):
    method forward (line 295) | def forward(self, x):
    method make_linear (line 299) | def make_linear(self, dim_in, dim_out, is_first, is_last):
    method make_activation (line 320) | def make_activation(self):
  class SDFHashGridNetwork (line 327) | class SDFHashGridNetwork(nn.Module):
    method __init__ (line 328) | def __init__(self, bound=0.5, feats_dim=13):
    method forward (line 358) | def forward(self, x):
    method sdf (line 369) | def sdf(self, x):
    method gradient (line 372) | def gradient(self, x):
    method sdf_normal (line 386) | def sdf_normal(self, x):
  class RenderingFFNetwork (line 400) | class RenderingFFNetwork(nn.Module):
    method __init__ (line 401) | def __init__(self, in_feats_dim=12):
    method forward (line 422) | def forward(self, points, normals, view_dirs, feature_vectors):
  class RenderingNetwork (line 433) | class RenderingNetwork(nn.Module):
    method __init__ (line 434) | def __init__(self, d_feature, d_in, d_out, d_hidden,
    method forward (line 463) | def forward(self, points, normals, view_dirs, feature_vectors):
  class SingleVarianceNetwork (line 488) | class SingleVarianceNetwork(nn.Module):
    method __init__ (line 489) | def __init__(self, init_val, activation='exp'):
    method forward (line 494) | def forward(self, x):
    method warp (line 501) | def warp(self, x, inv_s):

FILE: renderer/ngp_renderer.py
  function custom_meshgrid (line 17) | def custom_meshgrid(*args):
  function sample_pdf (line 24) | def sample_pdf(bins, weights, n_samples, det=False):
  function plot_pointcloud (line 61) | def plot_pointcloud(pc, color=None):
  class NGPRenderer (line 73) | class NGPRenderer(nn.Module):
    method __init__ (line 74) | def __init__(self,
    method forward (line 115) | def forward(self, x, d):
    method density (line 119) | def density(self, x):
    method color (line 122) | def color(self, x, d, mask=None, **kwargs):
    method reset_extra_state (line 125) | def reset_extra_state(self):
    method run (line 137) | def run(self, rays_o, rays_d, num_steps=128, upsample_steps=128, bg_co...
    method run_cuda (line 268) | def run_cuda(self, rays_o, rays_d, dt_gamma=0, bg_color=None, perturb=...
    method mark_untrained_grid (line 369) | def mark_untrained_grid(self, poses, intrinsic, S=64):
    method update_extra_state (line 434) | def update_extra_state(self, decay=0.95, S=128):
    method render (line 529) | def render(self, rays_o, rays_d, staged=False, max_ray_batch=4096, **k...
  class _trunc_exp (line 543) | class _trunc_exp(Function):
    method forward (line 546) | def forward(ctx, x):
    method backward (line 552) | def backward(ctx, g):
  class NGPNetwork (line 558) | class NGPNetwork(NGPRenderer):
    method __init__ (line 559) | def __init__(self,
    method forward (line 636) | def forward(self, x, d):
    method density (line 669) | def density(self, x):
    method color (line 691) | def color(self, x, d, mask=None, geo_feat=None, **kwargs):
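
The `sample_pdf` helper listed here is NeRF-style hierarchical sampling: weights from a coarse pass are treated as an unnormalised pdf over depth bins, and the inverse CDF is used to draw fine samples where density was found. A pure-Python sketch of that inverse-transform step (the repo's version is vectorised in torch; argument names follow the listing, the internals are simplified):

```python
import bisect
import random

def sample_pdf(bins, weights, n_samples, det=True, eps=1e-5):
    # bins: N+1 edges along the ray; weights: N non-negative interval weights.
    w = [x + eps for x in weights]          # avoid an all-zero pdf
    total = sum(w)
    cdf = [0.0]
    for x in w:                             # running cumulative distribution
        cdf.append(cdf[-1] + x / total)
    if det:                                 # deterministic stratified u's
        us = [(i + 0.5) / n_samples for i in range(n_samples)]
    else:
        us = sorted(random.random() for _ in range(n_samples))
    samples = []
    for u in us:
        # Invert the cdf: find the interval containing u, then lerp inside it.
        i = min(bisect.bisect_right(cdf, u) - 1, len(w) - 1)
        denom = cdf[i + 1] - cdf[i]
        t = (u - cdf[i]) / denom if denom > 0 else 0.0
        samples.append(bins[i] + t * (bins[i + 1] - bins[i]))
    return samples
```

With all the weight on one bin, essentially every sample lands inside that bin, which is the behaviour the fine pass relies on.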

FILE: renderer/renderer.py
  function sample_pdf (line 25) | def sample_pdf(bins, weights, n_samples, det=True):
  function near_far_from_sphere (line 59) | def near_far_from_sphere(rays_o, rays_d, radius=DEFAULT_RADIUS):
  class BackgroundRemoval (line 67) | class BackgroundRemoval:
    method __init__ (line 68) | def __init__(self, device='cuda'):
    method __call__ (line 84) | def __call__(self, image):
  class BaseRenderer (line 92) | class BaseRenderer(nn.Module):
    method __init__ (line 93) | def __init__(self, train_batch_num, test_batch_num):
    method render_impl (line 99) | def render_impl(self, ray_batch, is_train, step):
    method render_with_loss (line 103) | def render_with_loss(self, ray_batch, is_train, step):
    method render (line 106) | def render(self, ray_batch, is_train, step):
  class NeuSRenderer (line 124) | class NeuSRenderer(BaseRenderer):
    method __init__ (line 125) | def __init__(self, train_batch_num, test_batch_num, lambda_eikonal_los...
    method get_vertex_colors (line 144) | def get_vertex_colors(self, vertices):
    method upsample (line 165) | def upsample(self, rays_o, rays_d, z_vals, sdf, n_importance, inv_s):
    method cat_z_vals (line 198) | def cat_z_vals(self, rays_o, rays_d, z_vals, new_z_vals, sdf, last=Fal...
    method sample_depth (line 215) | def sample_depth(self, rays_o, rays_d, near, far, perturb):
    method compute_sdf_alpha (line 243) | def compute_sdf_alpha(self, points, dists, dirs, cos_anneal_ratio, step):
    method get_anneal_val (line 270) | def get_anneal_val(self, step):
    method get_inner_mask (line 276) | def get_inner_mask(self, points):
    method render_impl (line 279) | def render_impl(self, ray_batch, is_train, step):
    method render_with_loss (line 323) | def render_with_loss(self, ray_batch, is_train, step):
  class NeRFRenderer (line 351) | class NeRFRenderer(BaseRenderer):
    method __init__ (line 352) | def __init__(self, train_batch_num, test_batch_num, bound=0.5, use_mas...
    method render_impl (line 364) | def render_impl(self, ray_batch, is_train, step):
    method render_with_loss (line 379) | def render_with_loss(self, ray_batch, is_train, step):
  class RendererTrainer (line 398) | class RendererTrainer(pl.LightningModule):
    method __init__ (line 399) | def __init__(self, image_path, total_steps, warm_up_steps, log_dir, tr...
    method _construct_ray_batch (line 438) | def _construct_ray_batch(self, images_info):
    method load_model (line 464) | def load_model(cfg, ckpt):
    method _init_dataset (line 473) | def _init_dataset(self):
    method _shuffle_train_batch (line 512) | def _shuffle_train_batch(self):
    method _shuffle_train_fg_batch (line 518) | def _shuffle_train_fg_batch(self):
    method training_step (line 525) | def training_step(self, batch, batch_idx):
    method _slice_images_info (line 545) | def _slice_images_info(self, index):
    method validation_step (line 549) | def validation_step(self, batch, batch_idx):
    method configure_optimizers (line 576) | def configure_optimizers(self):
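
`near_far_from_sphere` above bounds each ray's sampling range against a bounding sphere. A common way to do this (a sketch of the usual formulation, assuming a sphere centred at the origin; the repo's exact variant may differ) is to project the origin onto the ray and pad by the radius, giving conservative near/far values rather than exact intersections:

```python
def near_far_from_sphere(ray_o, ray_d, radius=1.0):
    # Closest approach of the ray o + t*d to the origin occurs at
    # t_mid = -(o . d) / (d . d); pad by the sphere radius on each side.
    a = sum(d * d for d in ray_d)
    b = sum(o * d for o, d in zip(ray_o, ray_d))
    mid = -b / a
    return mid - radius, mid + radius
```

For a camera at (0, 0, -2) looking down +z with radius 1, this yields near = 1 and far = 3, bracketing the unit sphere.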

FILE: train_diffusion.py
  function rank_zero_print (line 22) | def rank_zero_print(*args):
  function get_parser (line 25) | def get_parser(**parser_kwargs):
  function trainer_args (line 46) | def trainer_args(opt):
  class SetupCallback (line 52) | class SetupCallback(Callback):
    method __init__ (line 53) | def __init__(self, resume, logdir, ckptdir, cfgdir, config):
    method on_fit_start (line 61) | def on_fit_start(self, trainer, pl_module):
  class ImageLogger (line 74) | class ImageLogger(Callback):
    method __init__ (line 75) | def __init__(self, batch_frequency, max_images, log_images_kwargs=None):
    method log_to_logger (line 82) | def log_to_logger(self, pl_module, images, split):
    method log_to_file (line 91) | def log_to_file(self, save_dir, split, images, global_step, current_ep...
    method log_img (line 105) | def log_img(self, pl_module, batch, split="train"):
    method check_frequency (line 128) | def check_frequency(self, check_idx):
    method on_train_batch_end (line 134) | def on_train_batch_end(self, trainer, pl_module, outputs, batch, batch...
    method on_validation_batch_end (line 138) | def on_validation_batch_end(self, trainer, pl_module, outputs, batch, ...
  class CUDACallback (line 144) | class CUDACallback(Callback):
    method on_train_epoch_start (line 146) | def on_train_epoch_start(self, trainer, pl_module):
    method on_train_epoch_end (line 152) | def on_train_epoch_end(self, trainer, pl_module):
  function get_node_name (line 166) | def get_node_name(name, parent_name):
  class ResumeCallBacks (line 174) | class ResumeCallBacks(Callback):
    method on_train_start (line 175) | def on_train_start(self, trainer, pl_module):
  function load_pretrain_stable_diffusion (line 178) | def load_pretrain_stable_diffusion(new_model, finetune_from):
  function get_optional_dict (line 206) | def get_optional_dict(name, config):

FILE: train_renderer.py
  class ResumeCallBacks (line 23) | class ResumeCallBacks(Callback):
    method __init__ (line 24) | def __init__(self):
    method on_train_start (line 27) | def on_train_start(self, trainer, pl_module):
  function render_images (line 30) | def render_images(model, output,):
  function extract_fields (line 78) | def extract_fields(bound_min, bound_max, resolution, query_func, batch_s...
  function extract_geometry (line 98) | def extract_geometry(bound_min, bound_max, resolution, threshold, query_...
  function extract_mesh (line 108) | def extract_mesh(model, output, resolution=512):
  function main (line 119) | def main():

FILE: workflow/inference_comfyui_api.py
  function queue_prompt (line 16) | def queue_prompt(prompt):
  function get_image (line 22) | def get_image(filename, subfolder, folder_type):
  function get_history (line 28) | def get_history(prompt_id):
  function get_images (line 32) | def get_images(ws, prompt):
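
`queue_prompt` in this workflow script talks to ComfyUI's HTTP API, which accepts a workflow graph as JSON POSTed to the `/prompt` endpoint (per ComfyUI's published API examples). A sketch of just the request construction, so it can be inspected without a running server (the helper name and default address are mine):

```python
import json
import urllib.request

def build_queue_prompt_request(prompt, server="127.0.0.1:8188"):
    # `prompt` is the workflow graph dict; wrapping it under the "prompt"
    # key matches the payload shape ComfyUI's API examples use.
    data = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"http://{server}/prompt",
        data=data,
        headers={"Content-Type": "application/json"},
    )
```

Sending it is then `urllib.request.urlopen(build_queue_prompt_request(graph))`; results are fetched afterwards via the history/image endpoints that `get_history` and `get_image` wrap.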
Condensed preview — 126 files, each showing path, character count, and a content snippet.
[
  {
    "path": "CONDITION.md",
    "chars": 4237,
    "preview": "### Prepare condition image for inference\n1. Rendering the proxy image\n\nFirst you need to download [Blender](https://www"
  },
  {
    "path": "README.md",
    "chars": 6290,
    "preview": "# Coin3D: Controllable and Interactive 3D Assets Generation with Proxy-Guided Conditioning\n### [Project Page](https://zj"
  },
  {
    "path": "blender_utils/render_proxy.py",
    "chars": 7619,
    "preview": "import argparse\nimport json\nimport math\nimport os\nimport random\nimport sys\nimport time\nimport urllib.request\nfrom pathli"
  },
  {
    "path": "configs/coin3d_train.yaml",
    "chars": 2051,
    "preview": "model:\n  base_learning_rate: 5.0e-05\n  target: ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo\n  params:\n    view_nu"
  },
  {
    "path": "configs/ctrldemo.yaml",
    "chars": 1135,
    "preview": "model:\n  base_learning_rate: 5.0e-05\n  target: ldm.models.diffusion.ctrldemo_sync_dreamer.CtrlDemo\n  params:\n    view_nu"
  },
  {
    "path": "configs/nerf.yaml",
    "chars": 441,
    "preview": "model:\n  base_lr: 1.0e-2\n  target: renderer.renderer.RendererTrainer\n  params:\n    total_steps: 2000\n    warm_up_steps: "
  },
  {
    "path": "configs/neus.yaml",
    "chars": 478,
    "preview": "model:\n  base_lr: 5.0e-4\n  target: renderer.renderer.RendererTrainer\n  params:\n    total_steps: 2000\n    warm_up_steps: "
  },
  {
    "path": "configs/syncdreamer.yaml",
    "chars": 1141,
    "preview": "model:\n  base_learning_rate: 5.0e-05\n  target: ldm.models.diffusion.sync_dreamer.SyncMultiviewDiffusion\n  params:\n    vi"
  },
  {
    "path": "example/panda/mesh.obj",
    "chars": 320831,
    "preview": "# Blender 3.6.1\n# www.blender.org\no Sphere\nv -0.090264 0.462556 -0.155620\nv -0.090264 0.401258 -0.226042\nv -0.090264 0.3"
  },
  {
    "path": "example/panda/proxy.txt",
    "chars": 19623,
    "preview": "-5.550776502986174560e-02 8.231279402941545087e-02 -1.258323406064429106e-01\n-3.998699747023472251e-02 3.020038030973061"
  },
  {
    "path": "example/pumpkin/mesh.obj",
    "chars": 81747,
    "preview": "# Blender 3.6.1\n# www.blender.org\no Cube.003\nv -0.229940 -0.199760 0.283841\nv -0.174661 -0.026964 0.192628\nv -0.229940 -"
  },
  {
    "path": "example/pumpkin/proxy.txt",
    "chars": 19650,
    "preview": "-3.305625800631857847e-01 -3.582112260109803631e-02 1.225265727345806049e-01\n-2.891790282159942826e-01 -2.95431763102793"
  },
  {
    "path": "example/teddybear/mesh.obj",
    "chars": 320831,
    "preview": "# Blender 3.6.1\n# www.blender.org\no Sphere\nv -0.090264 0.462556 -0.155620\nv -0.090264 0.401258 -0.226042\nv -0.090264 0.3"
  },
  {
    "path": "example/teddybear/proxy.txt",
    "chars": 19623,
    "preview": "-5.550776502986174560e-02 8.231279402941545087e-02 -1.258323406064429106e-01\n-3.998699747023472251e-02 3.020038030973061"
  },
  {
    "path": "example/toycar/mesh.obj",
    "chars": 79581,
    "preview": "# Blender v3.6.1 OBJ File: 'car.blend'\n# www.blender.org\nmtllib 0000_Collection.mtl\no Cylinder_Cylinder.004\nv 0.115303 -"
  },
  {
    "path": "example/toycar/proxy.txt",
    "chars": 19603,
    "preview": "-3.008041751643567713e-02 -1.599077930202584819e-01 3.288800368815070208e-01\n1.013531047142105096e-01 -1.094049703092424"
  },
  {
    "path": "example/turtle/mesh.obj",
    "chars": 288812,
    "preview": "# Blender v3.6.1 OBJ File: 'turbo.blend'\n# www.blender.org\nmtllib 0000_Collection.mtl\no Plane\nv -0.119246 -0.093755 0.34"
  },
  {
    "path": "example/turtle/proxy.txt",
    "chars": 19625,
    "preview": "-1.860431563537060518e-01 -9.375499933958053589e-02 -8.959774656980318275e-02\n-2.498418255269513610e-01 -9.3754999339580"
  },
  {
    "path": "externs/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "externs/pvcnn/modules/__init__.py",
    "chars": 431,
    "preview": "# from modules.ball_query import BallQuery\n# from modules.frustum import FrustumPointNetLoss\n# from modules.loss import "
  },
  {
    "path": "externs/pvcnn/modules/ball_query.py",
    "chars": 1404,
    "preview": "import torch\nimport torch.nn as nn\n\nimport modules.functional as F\n\n__all__ = ['BallQuery']\n\n\nclass BallQuery(nn.Module)"
  },
  {
    "path": "externs/pvcnn/modules/frustum.py",
    "chars": 7340,
    "preview": "import numpy as np\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\n\nimport modules.functional as PF\n\n"
  },
  {
    "path": "externs/pvcnn/modules/functional/__init__.py",
    "chars": 479,
    "preview": "# from modules.functional.ball_query import ball_query\nfrom externs.pvcnn.modules.functional.devoxelization import trili"
  },
  {
    "path": "externs/pvcnn/modules/functional/backend.py",
    "chars": 957,
    "preview": "import os\n\nfrom torch.utils.cpp_extension import load\n\n_src_path = os.path.dirname(os.path.abspath(__file__))\n_backend ="
  },
  {
    "path": "externs/pvcnn/modules/functional/ball_query.py",
    "chars": 752,
    "preview": "from torch.autograd import Function\n\nfrom modules.functional.backend import _backend\n\n__all__ = ['ball_query']\n\n\ndef bal"
  },
  {
    "path": "externs/pvcnn/modules/functional/devoxelization.py",
    "chars": 1499,
    "preview": "from torch.autograd import Function\n\nfrom externs.pvcnn.modules.functional.backend import _backend\n\n__all__ = ['trilinea"
  },
  {
    "path": "externs/pvcnn/modules/functional/grouping.py",
    "chars": 978,
    "preview": "from torch.autograd import Function\n\nfrom modules.functional.backend import _backend\n\n__all__ = ['grouping']\n\n\nclass Gro"
  },
  {
    "path": "externs/pvcnn/modules/functional/interpolatation.py",
    "chars": 1452,
    "preview": "from torch.autograd import Function\n\nfrom modules.functional.backend import _backend\n\n__all__ = ['nearest_neighbor_inter"
  },
  {
    "path": "externs/pvcnn/modules/functional/loss.py",
    "chars": 484,
    "preview": "import torch\nimport torch.nn.functional as F\n\n__all__ = ['kl_loss', 'huber_loss']\n\n\ndef kl_loss(x, y):\n    x = F.softmax"
  },
  {
    "path": "externs/pvcnn/modules/functional/sampling.py",
    "chars": 3656,
    "preview": "import numpy as np\nimport torch\nfrom torch.autograd import Function\n\nfrom modules.functional.backend import _backend\n\n__"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/ball_query/ball_query.cpp",
    "chars": 943,
    "preview": "#include \"ball_query.hpp\"\n#include \"ball_query.cuh\"\n\n#include \"../utils.hpp\"\n\nat::Tensor ball_query_forward(at::Tensor c"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/ball_query/ball_query.cu",
    "chars": 2039,
    "preview": "#include <math.h>\n#include <stdio.h>\n#include <stdlib.h>\n\n#include \"../cuda_utils.cuh\"\n\n/*\n  Function: ball query\n  Args"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/ball_query/ball_query.cuh",
    "chars": 225,
    "preview": "#ifndef _BALL_QUERY_CUH\n#define _BALL_QUERY_CUH\n\nvoid ball_query(int b, int n, int m, float r2, int u,\n                c"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/ball_query/ball_query.hpp",
    "chars": 276,
    "preview": "#ifndef _BALL_QUERY_HPP\n#define _BALL_QUERY_HPP\n\n#include <torch/extension.h>\n\nat::Tensor ball_query_forward(at::Tensor "
  },
  {
    "path": "externs/pvcnn/modules/functional/src/bindings.cpp",
    "chars": 1694,
    "preview": "#include <pybind11/pybind11.h>\n\n#include \"ball_query/ball_query.hpp\"\n#include \"grouping/grouping.hpp\"\n#include \"interpol"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/cuda_utils.cuh",
    "chars": 1385,
    "preview": "#ifndef _CUDA_UTILS_H\n#define _CUDA_UTILS_H\n\n#include <ATen/ATen.h>\n#include <ATen/cuda/CUDAContext.h>\n#include <cmath>\n"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/grouping/grouping.cpp",
    "chars": 1325,
    "preview": "#include \"grouping.hpp\"\n#include \"grouping.cuh\"\n\n#include \"../utils.hpp\"\n\nat::Tensor grouping_forward(at::Tensor feature"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/grouping/grouping.cu",
    "chars": 3056,
    "preview": "#include <stdio.h>\n#include <stdlib.h>\n\n#include \"../cuda_utils.cuh\"\n\n/*\n  Function: grouping features of neighbors (for"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/grouping/grouping.cuh",
    "chars": 301,
    "preview": "#ifndef _GROUPING_CUH\n#define _GROUPING_CUH\n\nvoid grouping(int b, int c, int n, int m, int u, const float *features,\n   "
  },
  {
    "path": "externs/pvcnn/modules/functional/src/grouping/grouping.hpp",
    "chars": 264,
    "preview": "#ifndef _GROUPING_HPP\n#define _GROUPING_HPP\n\n#include <torch/extension.h>\n\nat::Tensor grouping_forward(at::Tensor featur"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cpp",
    "chars": 2367,
    "preview": "#include \"neighbor_interpolate.hpp\"\n#include \"neighbor_interpolate.cuh\"\n\n#include \"../utils.hpp\"\n\nstd::vector<at::Tensor"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cu",
    "chars": 6580,
    "preview": "#include <math.h>\n#include <stdio.h>\n#include <stdlib.h>\n\n#include \"../cuda_utils.cuh\"\n\n/*\n  Function: three nearest nei"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.cuh",
    "chars": 819,
    "preview": "#ifndef _NEIGHBOR_INTERPOLATE_CUH\n#define _NEIGHBOR_INTERPOLATE_CUH\n\nvoid three_nearest_neighbors_interpolate(int b, int"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/neighbor_interpolate.hpp",
    "chars": 661,
    "preview": "#ifndef _NEIGHBOR_INTERPOLATE_HPP\n#define _NEIGHBOR_INTERPOLATE_HPP\n\n#include <torch/extension.h>\n#include <vector>\n\nstd"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cpp",
    "chars": 3401,
    "preview": "#include \"trilinear_devox.hpp\"\n#include \"trilinear_devox.cuh\"\n\n#include \"../utils.hpp\"\n\n/*\n  Function: trilinear devoxel"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cu",
    "chars": 6396,
    "preview": "#include <stdio.h>\n#include <stdlib.h>\n\n#include \"../cuda_utils.cuh\"\n\n/*\n  Function: trilinear devoxlization (forward)\n "
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.cuh",
    "chars": 536,
    "preview": "#ifndef _TRILINEAR_DEVOX_CUH\n#define _TRILINEAR_DEVOX_CUH\n\n// CUDA function declarations\nvoid trilinear_devoxelize(int b"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/interpolate/trilinear_devox.hpp",
    "chars": 628,
    "preview": "#ifndef _TRILINEAR_DEVOX_HPP\n#define _TRILINEAR_DEVOX_HPP\n\n#include <torch/torch.h>\n#include <vector>\n\nstd::vector<at::T"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/sampling/sampling.cpp",
    "chars": 1966,
    "preview": "#include \"sampling.hpp\"\n#include \"sampling.cuh\"\n\n#include \"../utils.hpp\"\n\nat::Tensor gather_features_forward(at::Tensor "
  },
  {
    "path": "externs/pvcnn/modules/functional/src/sampling/sampling.cu",
    "chars": 5802,
    "preview": "#include <stdio.h>\n#include <stdlib.h>\n\n#include \"../cuda_utils.cuh\"\n\n/*\n  Function: gather centers' features (forward)\n"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/sampling/sampling.cuh",
    "chars": 449,
    "preview": "#ifndef _SAMPLING_CUH\n#define _SAMPLING_CUH\n\nvoid gather_features(int b, int c, int n, int m, const float *features,\n   "
  },
  {
    "path": "externs/pvcnn/modules/functional/src/sampling/sampling.hpp",
    "chars": 414,
    "preview": "#ifndef _SAMPLING_HPP\n#define _SAMPLING_HPP\n\n#include <torch/extension.h>\n\nat::Tensor gather_features_forward(at::Tensor"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/utils.hpp",
    "chars": 760,
    "preview": "#ifndef _UTILS_HPP\n#define _UTILS_HPP\n\n#include <ATen/cuda/CUDAContext.h>\n#include <torch/extension.h>\n\n#define CHECK_CU"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/voxelization/vox.cpp",
    "chars": 2540,
    "preview": "#include \"vox.hpp\"\n#include \"vox.cuh\"\n\n#include \"../utils.hpp\"\n\n/*\n  Function: average pool voxelization (forward)\n  Arg"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/voxelization/vox.cu",
    "chars": 4358,
    "preview": "#include <stdio.h>\n#include <stdlib.h>\n\n#include \"../cuda_utils.cuh\"\n\n/*\n  Function: get how many points in each voxel g"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/voxelization/vox.cuh",
    "chars": 367,
    "preview": "#ifndef _VOX_CUH\n#define _VOX_CUH\n\n// CUDA function declarations\nvoid avg_voxelize(int b, int c, int n, int r, int r2, i"
  },
  {
    "path": "externs/pvcnn/modules/functional/src/voxelization/vox.hpp",
    "chars": 471,
    "preview": "#ifndef _VOX_HPP\n#define _VOX_HPP\n\n#include <torch/torch.h>\n#include <vector>\n\nstd::vector<at::Tensor> avg_voxelize_forw"
  },
  {
    "path": "externs/pvcnn/modules/functional/voxelization.py",
    "chars": 1389,
    "preview": "from torch.autograd import Function\n\nfrom externs.pvcnn.modules.functional.backend import _backend\n\n__all__ = ['avg_voxe"
  },
  {
    "path": "externs/pvcnn/modules/loss.py",
    "chars": 163,
    "preview": "import torch.nn as nn\n\nimport modules.functional as F\n\n__all__ = ['KLLoss']\n\n\nclass KLLoss(nn.Module):\n    def forward(s"
  },
  {
    "path": "externs/pvcnn/modules/pointnet.py",
    "chars": 4581,
    "preview": "import torch\nimport torch.nn as nn\n\nimport modules.functional as F\nfrom modules.ball_query import BallQuery\nfrom modules"
  },
  {
    "path": "externs/pvcnn/modules/pvconv.py",
    "chars": 2617,
    "preview": "import torch.nn as nn\n\nimport externs.pvcnn.modules.functional as F\nfrom externs.pvcnn.modules.voxelization import Voxel"
  },
  {
    "path": "externs/pvcnn/modules/se.py",
    "chars": 522,
    "preview": "import torch.nn as nn\n\n__all__ = ['SE3d']\n\n\nclass SE3d(nn.Module):\n    def __init__(self, channel, reduction=8):\n       "
  },
  {
    "path": "externs/pvcnn/modules/shared_mlp.py",
    "chars": 923,
    "preview": "import torch.nn as nn\n\n__all__ = ['SharedMLP']\n\n\nclass SharedMLP(nn.Module):\n    def __init__(self, in_channels, out_cha"
  },
  {
    "path": "externs/pvcnn/modules/voxelization.py",
    "chars": 1068,
    "preview": "import torch\nimport torch.nn as nn\n\nimport externs.pvcnn.modules.functional as F\n\n__all__ = ['Voxelization']\n\n\nclass Vox"
  },
  {
    "path": "foreground_segment.py",
    "chars": 1534,
    "preview": "import cv2\nimport argparse\nimport numpy as np\n\nimport torch\nfrom PIL import Image\n\n\nclass BackgroundRemoval:\n    def __i"
  },
  {
    "path": "generate.py",
    "chars": 3716,
    "preview": "import argparse\nfrom pathlib import Path\n\nimport numpy as np\nimport torch\nfrom omegaconf import OmegaConf\nfrom skimage.i"
  },
  {
    "path": "ldm/DPMPPScheduler.py",
    "chars": 4557,
    "preview": "import math\nfrom typing import List, Optional, Tuple, Union\nfrom dataclasses import dataclass\nimport numpy as np\nimport "
  },
  {
    "path": "ldm/base_utils.py",
    "chars": 4837,
    "preview": "import pickle\nimport numpy as np\nimport cv2\nfrom skimage.io import imread\n\n\ndef save_pickle(data, pkl_path):\n    # os.sy"
  },
  {
    "path": "ldm/data/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "ldm/data/base.py",
    "chars": 1204,
    "preview": "import os\nimport numpy as np\nfrom abc import abstractmethod\nfrom torch.utils.data import Dataset, ConcatDataset, ChainDa"
  },
  {
    "path": "ldm/data/coco.py",
    "chars": 11339,
    "preview": "import os\nimport json\nimport albumentations\nimport numpy as np\nfrom PIL import Image\nfrom tqdm import tqdm\nfrom torch.ut"
  },
  {
    "path": "ldm/data/control_sync_dreamer.py",
    "chars": 5150,
    "preview": "import pytorch_lightning as pl\nimport numpy as np\nimport torch\nimport PIL\nimport os\nfrom skimage.io import imread\nimport"
  },
  {
    "path": "ldm/data/dummy.py",
    "chars": 887,
    "preview": "import numpy as np\nimport random\nimport string\nfrom torch.utils.data import Dataset, Subset\n\nclass DummyData(Dataset):\n "
  },
  {
    "path": "ldm/data/imagenet.py",
    "chars": 15565,
    "preview": "import os, yaml, pickle, shutil, tarfile, glob\nimport cv2\nimport albumentations\nimport PIL\nimport numpy as np\nimport tor"
  },
  {
    "path": "ldm/data/inpainting/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "ldm/data/inpainting/synthetic_mask.py",
    "chars": 5121,
    "preview": "from PIL import Image, ImageDraw\nimport numpy as np\n\nsettings = {\n    \"256narrow\": {\n        \"p_irr\": 1,\n        \"min_n_"
  },
  {
    "path": "ldm/data/laion.py",
    "chars": 19135,
    "preview": "import webdataset as wds\nimport kornia\nfrom PIL import Image\nimport io\nimport os\nimport torchvision\nfrom PIL import Imag"
  },
  {
    "path": "ldm/data/lsun.py",
    "chars": 3274,
    "preview": "import os\nimport numpy as np\nimport PIL\nfrom PIL import Image\nfrom torch.utils.data import Dataset\nfrom torchvision impo"
  },
  {
    "path": "ldm/data/nerf_like.py",
    "chars": 6534,
    "preview": "from torch.utils.data import Dataset\nimport os\nimport json\nimport numpy as np\nimport torch\nimport imageio\nimport math\nim"
  },
  {
    "path": "ldm/data/simple.py",
    "chars": 18705,
    "preview": "from typing import Dict\nimport webdataset as wds\nimport numpy as np\nfrom omegaconf import DictConfig, ListConfig\nimport "
  },
  {
    "path": "ldm/data/sync_dreamer.py",
    "chars": 5093,
    "preview": "import pytorch_lightning as pl\nimport numpy as np\nimport torch\nimport PIL\nimport os\nfrom skimage.io import imread\nimport"
  },
  {
    "path": "ldm/lr_scheduler.py",
    "chars": 3882,
    "preview": "import numpy as np\n\n\nclass LambdaWarmUpCosineScheduler:\n    \"\"\"\n    note: use with a base_lr of 1.0\n    \"\"\"\n    def __in"
  },
  {
    "path": "ldm/models/autoencoder.py",
    "chars": 17619,
    "preview": "import torch\nimport pytorch_lightning as pl\nimport torch.nn.functional as F\nfrom contextlib import contextmanager\n\nfrom "
  },
  {
    "path": "ldm/models/diffusion/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "ldm/models/diffusion/ctrldemo_sync_dreamer.py",
    "chars": 19230,
    "preview": "from pathlib import Path\n\nimport pytorch_lightning as pl\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional a"
  },
  {
    "path": "ldm/models/diffusion/sync_dreamer.py",
    "chars": 32795,
    "preview": "from pathlib import Path\n\nimport pytorch_lightning as pl\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional a"
  },
  {
    "path": "ldm/models/diffusion/sync_dreamer_attention.py",
    "chars": 5807,
    "preview": "import torch\nimport torch.nn as nn\n\nfrom ldm.modules.attention import default, zero_module, checkpoint\nfrom ldm.modules."
  },
  {
    "path": "ldm/models/diffusion/sync_dreamer_network.py",
    "chars": 9689,
    "preview": "import torch\nimport torch.nn as nn\nfrom ldm.modules.attention import default, zero_module, checkpoint\nfrom ldm.modules.d"
  },
  {
    "path": "ldm/models/diffusion/sync_dreamer_utils.py",
    "chars": 5230,
    "preview": "import torch\nfrom kornia import create_meshgrid\n\n\ndef project_and_normalize(ref_grid, src_proj, length):\n    \"\"\"\n\n    @p"
  },
  {
    "path": "ldm/modules/attention.py",
    "chars": 11691,
    "preview": "from inspect import isfunction\nimport math\nimport torch\nimport torch.nn.functional as F\nfrom torch import nn, einsum\nfro"
  },
  {
    "path": "ldm/modules/diffusionmodules/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "ldm/modules/diffusionmodules/model.py",
    "chars": 33409,
    "preview": "# pytorch_diffusion + derived encoder decoder\nimport math\nimport torch\nimport torch.nn as nn\nimport numpy as np\nfrom ein"
  },
  {
    "path": "ldm/modules/diffusionmodules/openaimodel.py",
    "chars": 37257,
    "preview": "from abc import abstractmethod\nfrom functools import partial\nimport math\nfrom typing import Iterable\n\nimport numpy as np"
  },
  {
    "path": "ldm/modules/diffusionmodules/util.py",
    "chars": 9561,
    "preview": "# adopted from\n# https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py\n# and\n#"
  },
  {
    "path": "ldm/modules/distributions/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "ldm/modules/distributions/distributions.py",
    "chars": 2970,
    "preview": "import torch\nimport numpy as np\n\n\nclass AbstractDistribution:\n    def sample(self):\n        raise NotImplementedError()\n"
  },
  {
    "path": "ldm/modules/encoders/__init__.py",
    "chars": 0,
    "preview": ""
  },
  {
    "path": "ldm/modules/encoders/modules.py",
    "chars": 21114,
    "preview": "import torch\nimport torch.nn as nn\nimport numpy as np\nfrom functools import partial\nimport kornia\n\nfrom ldm.modules.x_tr"
  },
  {
    "path": "ldm/modules/x_transformer.py",
    "chars": 20168,
    "preview": "\"\"\"shout-out to https://github.com/lucidrains/x-transformers/tree/main/x_transformers\"\"\"\nimport torch\nfrom torch import "
  },
  {
    "path": "ldm/thirdp/psp/helpers.py",
    "chars": 3604,
    "preview": "# https://github.com/eladrich/pixel2style2pixel\n\nfrom collections import namedtuple\nimport torch\nfrom torch.nn import Co"
  },
  {
    "path": "ldm/thirdp/psp/id_loss.py",
    "chars": 847,
    "preview": "# https://github.com/eladrich/pixel2style2pixel\nimport torch\nfrom torch import nn\nfrom ldm.thirdp.psp.model_irse import "
  },
  {
    "path": "ldm/thirdp/psp/model_irse.py",
    "chars": 2883,
    "preview": "# https://github.com/eladrich/pixel2style2pixel\n\nfrom torch.nn import Linear, Conv2d, BatchNorm1d, BatchNorm2d, PReLU, D"
  },
  {
    "path": "ldm/typing.py",
    "chars": 253,
    "preview": "# Basic types\nfrom typing import (\n    Any,\n    Callable,\n    Dict,\n    Iterable,\n    List,\n    Literal,\n    NamedTuple,"
  },
  {
    "path": "ldm/util.py",
    "chars": 14068,
    "preview": "import importlib\n\nimport torchvision\nimport torch\nfrom torch import optim\nimport numpy as np\nimport pickle\n\nfrom inspect"
  },
  {
    "path": "misc.ipynb",
    "chars": 105176,
    "preview": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\":"
  },
  {
    "path": "raymarching/__init__.py",
    "chars": 26,
    "preview": "from .raymarching import *"
  },
  {
    "path": "raymarching/backend.py",
    "chars": 1361,
    "preview": "import os\nfrom torch.utils.cpp_extension import load\n\n_src_path = os.path.dirname(os.path.abspath(__file__))\n\nnvcc_flags"
  },
  {
    "path": "raymarching/raymarching.py",
    "chars": 14712,
    "preview": "import numpy as np\nimport time\n\nimport torch\nimport torch.nn as nn\nfrom torch.autograd import Function\nfrom torch.cuda.a"
  },
  {
    "path": "raymarching/setup.py",
    "chars": 2062,
    "preview": "import os\nfrom setuptools import setup\nfrom torch.utils.cpp_extension import BuildExtension, CUDAExtension\n\n_src_path = "
  },
  {
    "path": "raymarching/src/bindings.cpp",
    "chars": 903,
    "preview": "#include <torch/extension.h>\n\n#include \"raymarching.h\"\n\nPYBIND11_MODULE(TORCH_EXTENSION_NAME, m) {\n    // utils\n    m.de"
  },
  {
    "path": "raymarching/src/raymarching.cu",
    "chars": 31326,
    "preview": "#include <cuda.h>\n#include <cuda_fp16.h>\n#include <cuda_runtime.h>\n\n#include <ATen/cuda/CUDAContext.h>\n#include <torch/t"
  },
  {
    "path": "raymarching/src/raymarching.h",
    "chars": 2283,
    "preview": "#pragma once\n\n#include <stdint.h>\n#include <torch/torch.h>\n\n\nvoid near_far_from_aabb(const at::Tensor rays_o, const at::"
  },
  {
    "path": "renderer/agg_net.py",
    "chars": 3741,
    "preview": "import torch.nn.functional as F\nimport torch.nn as nn\nimport torch\n\ndef weights_init(m):\n    if isinstance(m, nn.Linear)"
  },
  {
    "path": "renderer/cost_reg_net.py",
    "chars": 3565,
    "preview": "import torch.nn as nn\n\nclass ConvBnReLU3D(nn.Module):\n    def __init__(self, in_channels, out_channels, kernel_size=3, s"
  },
  {
    "path": "renderer/dummy_dataset.py",
    "chars": 1048,
    "preview": "import pytorch_lightning as pl\nfrom torch.utils.data import Dataset\nimport webdataset as wds\nfrom torch.utils.data.distr"
  },
  {
    "path": "renderer/feature_net.py",
    "chars": 1766,
    "preview": "import torch.nn as nn\nimport torch.nn.functional as F\n\nclass ConvBnReLU(nn.Module):\n    def __init__(self, in_channels, "
  },
  {
    "path": "renderer/neus_networks.py",
    "chars": 17899,
    "preview": "import math\n\nimport numpy as np\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport tinycudann as "
  },
  {
    "path": "renderer/ngp_renderer.py",
    "chars": 29447,
    "preview": "import math\nimport trimesh\nimport numpy as np\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom p"
  },
  {
    "path": "renderer/renderer.py",
    "chars": 27082,
    "preview": "import abc\nimport os\nfrom pathlib import Path\n\nimport cv2\nimport numpy as np\nimport pytorch_lightning as pl\nimport torch"
  },
  {
    "path": "requirements.txt",
    "chars": 344,
    "preview": "pytorch_lightning==1.9.0\nPillow==10.0.0\nopencv-python\ntransformers\ntaming-transformers-rom1504 \ntqdm\nnumpy\nkornia\nwebdat"
  },
  {
    "path": "train_diffusion.py",
    "chars": 13663,
    "preview": "import argparse, os, sys\nimport numpy as np\nimport time\nimport torch\nimport torch.nn as nn\nimport torchvision\nimport pyt"
  },
  {
    "path": "train_renderer.py",
    "chars": 7724,
    "preview": "import argparse\n\nimport imageio\nimport numpy as np\nimport torch\nimport torch.nn.functional as F\nfrom pathlib import Path"
  },
  {
    "path": "workflow/Coin3D_condition_workflow.json",
    "chars": 16529,
    "preview": "{\n  \"last_node_id\": 189,\n  \"last_link_id\": 685,\n  \"nodes\": [\n    {\n      \"id\": 5,\n      \"type\": \"EmptyLatentImage\",\n    "
  },
  {
    "path": "workflow/Coin3D_condition_workflow_api.json",
    "chars": 4598,
    "preview": "{\n  \"3\": {\n    \"inputs\": {\n      \"seed\": 934049932825402,\n      \"steps\": 20,\n      \"cfg\": 7,\n      \"sampler_name\": \"dpmp"
  },
  {
    "path": "workflow/inference_comfyui_api.py",
    "chars": 3333,
    "preview": "import websocket #NOTE: websocket-client (https://github.com/websocket-client/websocket-client)\nimport uuid\nimport json\n"
  }
]

// ... and 1 more file (omitted from this preview)

About this extraction

This page contains the full source code of the zju3dv/Coin3D GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction covers 126 files (1.9 MB, approximately 930.8k tokens) and includes a symbol index of 1,033 extracted functions, classes, methods, constants, and types. It can be used with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input; the full output can be copied to the clipboard or downloaded as a .txt file.

Extracted by GitExtract, a free GitHub-repo-to-text converter for AI, built by Nikandr Surkov.