[
  {
    "path": ".gitignore",
    "content": "models\nwandb\n*.pyc\n.ipynb_checkpoints\nfast_td3.egg-info/\n"
  },
  {
    "path": ".pre-commit-config.yaml",
    "content": "repos:\n-   repo: https://github.com/psf/black\n    rev: stable\n    hooks:\n    -   id: black "
  },
  {
    "path": "LICENSE",
    "content": "This software is part of the BAIR Commons HIC Repository as of calendar year 2025.\n\n--------------------------------------------------------------------------------\n\nMIT License\n\nCopyright (c) 2025 Younggyo Seo\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n--------------------------------------------------------------------------------\n\nMIT License\n\nCopyright (c) 2024 LeanRL developers\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n--------------------------------------------------------------------------------\nCode in `cleanrl/ddpg_continuous_action.py` and `cleanrl/td3_continuous_action.py` are adapted from https://github.com/sfujim/TD3\n\nMIT License\n\nCopyright (c) 2020 Scott Fujimoto\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n--------------------------------------------------------------------------------\nCode in `cleanrl/sac_continuous_action.py` is inspired and adapted from [haarnoja/sac](https://github.com/haarnoja/sac), [openai/spinningup](https://github.com/openai/spinningup), [pranz24/pytorch-soft-actor-critic](https://github.com/pranz24/pytorch-soft-actor-critic), [DLR-RM/stable-baselines3](https://github.com/DLR-RM/stable-baselines3), and [denisyarats/pytorch_sac](https://github.com/denisyarats/pytorch_sac).\n\n- [haarnoja/sac](https://github.com/haarnoja/sac/blob/8258e33633c7e37833cc39315891e77adfbe14b2/LICENSE.txt)\n\nCOPYRIGHT\n\nAll contributions by the University of California:\nCopyright (c) 2017, 2018 The Regents of the University of California (Regents)\nAll rights reserved.\n\nAll other contributions:\nCopyright (c) 2017, 2018, the respective contributors\nAll rights reserved.\n\nSAC uses a shared copyright model: each contributor holds copyright over\ntheir contributions to the SAC codebase. The project versioning records all such\ncontribution and copyright details. If a contributor wants to further mark\ntheir specific copyright on a particular contribution, they should indicate\ntheir copyright solely in the commit message of the change when it is\ncommitted.\n\nLICENSE\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met: \n\n1. Redistributions of source code must retain the above copyright notice, this\n   list of conditions and the following disclaimer. \n2. 
Redistributions in binary form must reproduce the above copyright notice,\n   this list of conditions and the following disclaimer in the documentation\n   and/or other materials provided with the distribution. \n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\" AND\nANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED\nWARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR\nANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES\n(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;\nLOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND\nON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\nSOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n\nCONTRIBUTION AGREEMENT\n\nBy contributing to the SAC repository through pull-request, comment,\nor otherwise, the contributor releases their content to the\nlicense and copyright terms herein.\n\n- [openai/spinningup](https://github.com/openai/spinningup/blob/038665d62d569055401d91856abb287263096178/LICENSE)\n\nThe MIT License\n\nCopyright (c) 2018 OpenAI (http://openai.com)\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY 
KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n\n- [DLR-RM/stable-baselines3](https://github.com/DLR-RM/stable-baselines3/blob/44e53ff8115e8f4bff1d5218f10c8c7d1a4cfc12/LICENSE)\n\nThe MIT License\n\nCopyright (c) 2019 Antonin Raffin\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE.\n\n- [denisyarats/pytorch_sac](https://github.com/denisyarats/pytorch_sac/blob/81c5b536d3a1c5616b2531e446450df412a064fb/LICENSE)\n\nMIT License\n\nCopyright (c) 2019 Denis Yarats\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n- [pranz24/pytorch-soft-actor-critic](https://github.com/pranz24/pytorch-soft-actor-critic/blob/master/LICENSE)\n\nMIT License\n\nCopyright (c) 2018 Pranjal Tandon\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\n---------------------------------------------------------------------------------\nThe CONTRIBUTING.md is adopted from https://github.com/entity-neural-network/incubator/blob/2a0c38b30828df78c47b0318c76a4905020618dd/CONTRIBUTING.md\nand https://github.com/Stable-Baselines-Team/stable-baselines3-contrib/blob/master/CONTRIBUTING.md\n\nMIT License\n\nCopyright (c) 2021 Entity Neural Network developers\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\n\nMIT License\n\nCopyright (c) 2020 Stable-Baselines Team\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n\n\n---------------------------------------------------------------------------------\nThe cleanrl/ppo_continuous_action_isaacgym.py is contributed by Nvidia\n\nSPDX-FileCopyrightText: Copyright (c) 2022 NVIDIA CORPORATION & AFFILIATES. 
All rights reserved.\nSPDX-License-Identifier: MIT\n\nPermission is hereby granted, free of charge, to any person obtaining a\ncopy of this software and associated documentation files (the \"Software\"),\nto deal in the Software without restriction, including without limitation\nthe rights to use, copy, modify, merge, publish, distribute, sublicense,\nand/or sell copies of the Software, and to permit persons to whom the\nSoftware is furnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in\nall copies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL\nTHE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING\nFROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER\nDEALINGS IN THE SOFTWARE.\n\n--------------------------------------------------------------------------------\n\nCode in `cleanrl/qdagger_dqn_atari_impalacnn.py` and `cleanrl/qdagger_dqn_atari_jax_impalacnn.py` are adapted from https://github.com/google-research/reincarnating_rl\n\n**NOTE: the original repo did not fill out the copyright section in their license\nso the following copyright notice is copied as is per the license requirement.\nSee https://github.com/google-research/reincarnating_rl/blob/a1d402f48a9f8658ca6aa0ddf416ab391745ff2c/LICENSE#L189\n\n\nCopyright [yyyy] [name of copyright owner]\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n    http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the 
License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.\n"
  },
  {
    "path": "README.md",
    "content": "# FastTD3 - Simple and Fast RL for Humanoid Control\n\n[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)\n[![arXiv](https://img.shields.io/badge/arXiv-2505.22642-b31b1b.svg)](https://arxiv.org/abs/2505.22642)\n\n\nFastTD3 is a high-performance variant of the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm, optimized for complex humanoid control tasks. FastTD3 can solve various humanoid control tasks with dexterous hands from HumanoidBench in just a few hours of training. Furthermore, FastTD3 achieves similar or better wall-time-efficiency to PPO in high-dimensional control tasks from popular simulations such as IsaacLab and MuJoCo Playground.\n\nFor more information, please see our [project webpage](https://younggyo.me/fast_td3)\n\n\n## ❗ Updates\n\n- **[Sep/20/2025]** Added `data` directory that contains training logs used for plotting in the report.\n\n- **[Sep/17/2025]** Fixed an issue where `std_min` and `std_max` were not included in Actor config (credit: [@ningyuanz](https://github.com/ningyuanz)).\n\n- **[Aug/07/2025]** If you encounter an issue in reproducing the performance, try disabling `torch.compile`. Please use `--no_compile` in argument.\n\n- **[Jul/07/2025]** Added support for multi-GPU training! See [Multi-GPU Training](#multi-gpu-training) section for details. \n\n- **[Jul/02/2025]** Optimized codebase to speed up training around 10-30% when using a single RTX 4090 GPU.\n\n- **[Jun/20/2025]** Added support for [MTBench](https://github.com/Viraj-Joshi/MTBench) with the help of [Viraj Joshi](https://viraj-joshi.github.io/).\n\n- **[Jun/15/2025]** Added support for FastTD3 + [SimbaV2](https://dojeon-ai.github.io/SimbaV2/)! It's faster to train, and often achieves better asymptotic performance. 
We recommend using FastTD3 + SimbaV2 for most cases.\n\n- **[Jun/06/2025]** Thanks to [Antonin Raffin](https://araffin.github.io/) ([@araffin](https://github.com/araffin)), we fixed the issues when using `n_steps` > 1, which stabilizes training with n-step returns quite a lot!\n\n- **[Jun/01/2025]** Updated the figures in the technical report to report deterministic evaluation for IsaacLab tasks.\n\n\n## ✨ Features\n\nFastTD3 offers researchers a significant speedup in training complex humanoid agents.\n\n- Ready-to-go codebase with detailed instructions and pre-configured hyperparameters for each task\n- Support for popular benchmarks: HumanoidBench, MuJoCo Playground, and IsaacLab\n- User-friendly features that can accelerate your research, such as rendering rollouts, torch optimizations (AMP and compile), and saving and loading checkpoints\n\n## ⚙️ Prerequisites\n\nBefore you begin, ensure you have the following installed:\n- Conda (for environment management)\n- Git LFS (Large File Storage) -- for IsaacLab\n- CMake -- for IsaacLab\n\nAnd the following system packages:\n```bash\nsudo apt install libglfw3 libgl1-mesa-glx libosmesa6 git-lfs cmake\n```\n\n## 📖 Installation\n\nThis project requires different Conda environments for different sets of experiments.\n\n### Common Setup\nFirst, ensure the common dependencies are installed as mentioned in the [Prerequisites](#prerequisites) section.\n\n### Environment for HumanoidBench\n\n```bash\nconda create -n fasttd3_hb -y python=3.10\nconda activate fasttd3_hb\npip install --editable git+https://github.com/carlosferrazza/humanoid-bench.git#egg=humanoid-bench\npip install -r requirements/requirements.txt\n```\n\n### Environment for MuJoCo Playground\n```bash\nconda create -n fasttd3_playground -y python=3.10\nconda activate fasttd3_playground\npip install -r requirements/requirements_playground.txt\n```\n\n**⚠️ Note:** Our `requirements_playground.txt` specifies `Jax==0.4.35`, which we found to be stable for the latest GPUs in 
certain tasks such as `LeapCubeReorient` or `LeapCubeRotateZAxis`.\n\n**⚠️ Note:** The current FastTD3 codebase uses a customized MuJoCo Playground that supports saving the last observations into the info dictionary. We will work on incorporating this change into the official repository soon.\n\n### Environment for IsaacLab\n```bash\nconda create -n fasttd3_isaaclab -y python=3.10\nconda activate fasttd3_isaaclab\n\n# Install IsaacLab (refer to official documentation for the latest steps)\n# Official Quickstart: https://isaac-sim.github.io/IsaacLab/main/source/setup/quickstart.html\npip install 'isaacsim[all,extscache]==4.5.0' --extra-index-url https://pypi.nvidia.com\ngit clone https://github.com/isaac-sim/IsaacLab.git\ncd IsaacLab\n./isaaclab.sh --install\ncd ..\n\n# Install project-specific requirements\npip install -r requirements/requirements.txt\n```\n\n### Environment for MTBench\nMTBench does not support humanoid experiments, but it is a useful multi-task benchmark with massively parallel simulation. This could be useful for users who want to use FastTD3 for their multi-task experiments.\n\n```bash\nconda create -n fasttd3_mtbench -y python=3.8  # Note the Python version\nconda activate fasttd3_mtbench\n\n# Install IsaacGym -- we recommend following the instructions in https://github.com/BoosterRobotics/booster_gym\n...\n\n# Install MTBench\ngit clone https://github.com/Viraj-Joshi/MTBench.git\ncd MTBench\npip install -e .\npip install skrl\ncd ..\n\n# Install project-specific requirements\npip install -r requirements/requirements_isaacgym.txt\n```\n\n### (Optional) Accelerate headless GPU rendering in cloud instances\n\nIn some cloud VM images the NVIDIA kernel driver is present but the user-space OpenGL/EGL/Vulkan libraries aren't, so MuJoCo falls back to the CPU renderer. 
You can install just the NVIDIA user-space libraries (and skip rebuilding the kernel module) with:\n\n```bash\nsudo apt install -y kmod\nsudo sh NVIDIA-Linux-x86_64-<your_driver_version>.run -s --no-kernel-module --ui=none --no-questions\n```\n\nAs a rule of thumb, if you're running experiments and rendering is taking longer than 5 seconds, it is very likely that the GPU renderer is not being used.\n\n## 🚀 Running Experiments\n\nActivate the appropriate Conda environment before running experiments.\n\nPlease see `fast_td3/hyperparams.py` for information regarding hyperparameters!\n\n### HumanoidBench Experiments\n```bash\nconda activate fasttd3_hb\n# FastTD3\npython fast_td3/train.py \\\n    --env_name h1hand-hurdle-v0 \\\n    --exp_name FastTD3 \\\n    --render_interval 5000 \\\n    --seed 1\n# FastTD3 + SimbaV2\npython fast_td3/train.py \\\n    --env_name h1hand-hurdle-v0 \\\n    --exp_name FastTD3 \\\n    --render_interval 5000 \\\n    --agent fasttd3_simbav2 \\\n    --batch_size 8192 \\\n    --critic_learning_rate_end 3e-5 \\\n    --actor_learning_rate_end 3e-5 \\\n    --weight_decay 0.0 \\\n    --critic_hidden_dim 512 \\\n    --critic_num_blocks 2 \\\n    --actor_hidden_dim 256 \\\n    --actor_num_blocks 1 \\\n    --seed 1\n```\n\n### MuJoCo Playground Experiments\n```bash\nconda activate fasttd3_playground\n# FastTD3\npython fast_td3/train.py \\\n    --env_name T1JoystickFlatTerrain \\\n    --exp_name FastTD3 \\\n    --render_interval 5000 \\\n    --seed 1\n# FastTD3 + SimbaV2\npython fast_td3/train.py \\\n    --env_name T1JoystickFlatTerrain \\\n    --exp_name FastTD3 \\\n    --render_interval 5000 \\\n    --agent fasttd3_simbav2 \\\n    --batch_size 8192 \\\n    --critic_learning_rate_end 3e-5 \\\n    --actor_learning_rate_end 3e-5 \\\n    --weight_decay 0.0 \\\n    --critic_hidden_dim 512 \\\n    --critic_num_blocks 2 \\\n    --actor_hidden_dim 256 \\\n    --actor_num_blocks 1 \\\n    --seed 1\n```\n\n### IsaacLab Experiments\n```bash\nconda activate 
fasttd3_isaaclab\n# FastTD3\npython fast_td3/train.py \\\n    --env_name Isaac-Velocity-Flat-G1-v0 \\\n    --exp_name FastTD3 \\\n    --render_interval 0 \\\n    --seed 1\n# FastTD3 + SimbaV2\npython fast_td3/train.py \\\n    --env_name Isaac-Repose-Cube-Allegro-Direct-v0 \\\n    --exp_name FastTD3 \\\n    --render_interval 0 \\\n    --agent fasttd3_simbav2 \\\n    --batch_size 8192 \\\n    --critic_learning_rate_end 3e-5 \\\n    --actor_learning_rate_end 3e-5 \\\n    --weight_decay 0.0 \\\n    --critic_hidden_dim 512 \\\n    --critic_num_blocks 2 \\\n    --actor_hidden_dim 256 \\\n    --actor_num_blocks 1 \\\n    --seed 1\n```\n\n### MTBench Experiments\n```bash\nconda activate fasttd3_mtbench\n# FastTD3\npython fast_td3/train.py \\\n    --env_name MTBench-meta-world-v2-mt10 \\\n    --exp_name FastTD3 \\\n    --render_interval 0 \\\n    --seed 1\n# FastTD3 + SimbaV2\npython fast_td3/train.py \\\n    --env_name MTBench-meta-world-v2-mt10 \\\n    --exp_name FastTD3 \\\n    --render_interval 0 \\\n    --agent fasttd3_simbav2 \\\n    --batch_size 8192 \\\n    --critic_learning_rate_end 3e-5 \\\n    --actor_learning_rate_end 3e-5 \\\n    --weight_decay 0.0 \\\n    --critic_hidden_dim 1024 \\\n    --critic_num_blocks 2 \\\n    --actor_hidden_dim 512 \\\n    --actor_num_blocks 1 \\\n    --seed 1\n```\n\n**Quick note:** For boolean arguments, you can set them to False by adding `no_` in front of the argument name. For instance, to disable Clipped Double Q-learning, specify `--no_use_cdq` in your command.\n\n## 💡 Performance-Related Tips\n\nWe used a single NVIDIA A100 80GB GPU for all experiments. Here are some remarks and tips for improving performance in your setup or troubleshooting your machine configuration.\n\n- *Sample-efficiency* tends to improve with larger `num_envs`, `num_updates`, and `batch_size`, but this comes at the cost of *time-efficiency*. Our default settings are optimized for wall-time efficiency on a single A100 80GB GPU. 
If you're using a different setup, consider tuning hyperparameters accordingly.\n- When FastTD3 gets stuck in a local minimum early in training:\n  - First consider increasing `num_updates`. This usually happens when the agent fails to exploit its value functions. We also find that higher `num_updates` tends to help on relatively easy tasks or tasks with low-dimensional action spaces.\n  - If the agent is completely stuck or much worse than expected, try using `num_steps=3` or disabling `use_cdq`.\n  - For tasks that have penalty reward terms (e.g., torques, energy, action_rate, ...), consider lowering them for initial experiments, then tune the values. In some cases, curriculum learning with lower penalty terms followed by fine-tuning with stronger terms is effective.\n- When you encounter an out-of-memory error on your GPU, reduce (i) `buffer_size`, then (ii) `batch_size`, and then (iii) `num_envs`. Because our codebase stores the whole replay buffer on the GPU to avoid a CPU-GPU transfer bottleneck, the buffer usually has the largest GPU memory footprint, and shrinking it is usually the least harmful.\n- Consider using `--compile_mode max-autotune` if you plan to run for many training steps. This may speed up training by up to 10% at the cost of a few additional minutes of heavy compilation.\n\n## Multi-GPU Training\nWe support multi-GPU training. If your machine has multiple GPUs (or you specify several via `CUDA_VISIBLE_DEVICES`), run `train_multigpu.py` and it will automatically use all of them to scale up training.\n\n**Important:** Our multi-GPU implementation launches the **same experiment independently on each GPU** rather than distributing parameters across GPUs. 
This means:\n- Effective number of environments: `num_envs × num_gpus`\n- Effective batch size: `batch_size × num_gpus`\n- Effective buffer size: `buffer_size × num_gpus`\n\nEach GPU runs a complete copy of the training process, which scales up data collection and training throughput proportionally to the number of GPUs.\n\nFor instance, running IsaacLab experiments with 4 GPUs and `num_envs=1024` will yield results similar to experiments with 1 GPU and `num_envs=4096`.\n\n## 🛝 Playing with FastTD3 training\n\nA Jupyter notebook (`training_notebook.ipynb`) is available to help you get started with:\n- Training FastTD3 agents.\n- Loading pre-trained models.\n- Visualizing agent behavior.\n- Potentially, re-training or fine-tuning models.\n\n## 🤖 Sim-to-Real RL with FastTD3\n\nWe provide a [walkthrough](sim2real.md) for training deployable policies with FastTD3.\n\n## Contributing\n\nWe welcome contributions! Please feel free to submit issues and pull requests.\n\n## License\n\nThis project is licensed under the MIT License -- see the [LICENSE](LICENSE) file for details. Note that the repository relies on third-party libraries subject to their respective licenses.\n\n## Acknowledgements\n\nThis codebase builds upon the [LeanRL](https://github.com/pytorch-labs/LeanRL) framework. 
\n\nWe would like to thank the people who have helped throughout the project:\n\n- We thank [Kevin Zakka](https://kzakka.com/) for help in setting up MuJoCo Playground.\n- We thank [Changyeon Kim](https://changyeon.site/) for testing an early version of this codebase.\n\n## Citations\n\n### FastTD3\n```bibtex\n@article{seo2025fasttd3,\n  title={FastTD3: Simple, Fast, and Capable Reinforcement Learning for Humanoid Control},\n  author={Seo, Younggyo and Sferrazza, Carmelo and Geng, Haoran and Nauman, Michal and Yin, Zhao-Heng and Abbeel, Pieter},\n  journal={arXiv preprint arXiv:2505.22642},\n  year={2025}\n}\n```\n\n### TD3\n```bibtex\n@inproceedings{fujimoto2018addressing,\n  title={Addressing function approximation error in actor-critic methods},\n  author={Fujimoto, Scott and Hoof, Herke and Meger, David},\n  booktitle={International conference on machine learning},\n  pages={1587--1596},\n  year={2018},\n  organization={PMLR}\n}\n```\n\n### SimbaV2\n```bibtex\n@article{lee2025hyperspherical,\n  title={Hyperspherical normalization for scalable deep reinforcement learning},\n  author={Lee, Hojoon and Lee, Youngdo and Seno, Takuma and Kim, Donghu and Stone, Peter and Choo, Jaegul},\n  journal={arXiv preprint arXiv:2502.15280},\n  year={2025}\n}\n```\n\n### LeanRL\n\nFollowing [LeanRL](https://github.com/pytorch-labs/LeanRL)'s recommendation, we put CleanRL's bibtex here:\n\n```bibtex\n@article{huang2022cleanrl,\n  author  = {Shengyi Huang and Rousslan Fernand Julien Dossa and Chang Ye and Jeff Braga and Dipam Chakraborty and Kinal Mehta and João G.M. 
Araújo},\n  title   = {CleanRL: High-quality Single-file Implementations of Deep Reinforcement Learning Algorithms},\n  journal = {Journal of Machine Learning Research},\n  year    = {2022},\n  volume  = {23},\n  number  = {274},\n  pages   = {1--18},\n  url     = {http://jmlr.org/papers/v23/21-1342.html}\n}\n```\n\n### Parallel Q-Learning (PQL)\n```bibtex\n@inproceedings{li2023parallel,\n  title={Parallel $ Q $-Learning: Scaling Off-policy Reinforcement Learning under Massively Parallel Simulation},\n  author={Li, Zechu and Chen, Tao and Hong, Zhang-Wei and Ajay, Anurag and Agrawal, Pulkit},\n  booktitle={International Conference on Machine Learning},\n  pages={19440--19459},\n  year={2023},\n  organization={PMLR}\n}\n```\n\n### HumanoidBench\n```bibtex\n@inproceedings{sferrazza2024humanoidbench,\n  title={Humanoidbench: Simulated humanoid benchmark for whole-body locomotion and manipulation},\n  author={Sferrazza, Carmelo and Huang, Dun-Ming and Lin, Xingyu and Lee, Youngwoon and Abbeel, Pieter},\n  booktitle={Robotics: Science and Systems},\n  year={2024}\n}\n```\n\n### MuJoCo Playground\n```bibtex\n@article{zakka2025mujoco,\n  title={MuJoCo Playground},\n  author={Zakka, Kevin and Tabanpour, Baruch and Liao, Qiayuan and Haiderbhai, Mustafa and Holt, Samuel and Luo, Jing Yuan and Allshire, Arthur and Frey, Erik and Sreenath, Koushil and Kahrs, Lueder A and others},\n  journal={arXiv preprint arXiv:2502.08844},\n  year={2025}\n}\n```\n\n### IsaacLab\n```bibtex\n@article{mittal2023orbit,\n   author={Mittal, Mayank and Yu, Calvin and Yu, Qinxi and Liu, Jingzhou and Rudin, Nikita and Hoeller, David and Yuan, Jia Lin and Singh, Ritvik and Guo, Yunrong and Mazhar, Hammad and Mandlekar, Ajay and Babich, Buck and State, Gavriel and Hutter, Marco and Garg, Animesh},\n   journal={IEEE Robotics and Automation Letters},\n   title={Orbit: A Unified Simulation Framework for Interactive Robot Learning Environments},\n   year={2023},\n   volume={8},\n   number={6},\n   
pages={3740-3747},\n   doi={10.1109/LRA.2023.3270034}\n}\n```\n\n### MTBench\n```bibtex\n@inproceedings{\njoshi2025benchmarking,\ntitle={Benchmarking Massively Parallelized Multi-Task Reinforcement Learning for Robotics Tasks},\nauthor={Viraj Joshi and Zifan Xu and Bo Liu and Peter Stone and Amy Zhang},\nbooktitle={Reinforcement Learning Conference},\nyear={2025},\nurl={https://openreview.net/forum?id=z0MM0y20I2}\n}\n```\n\n### Getting SAC to Work on a Massive Parallel Simulator\n```bibtex\n@article{raffin2025isaacsim,\n  title   = \"Getting SAC to Work on a Massive Parallel Simulator: An RL Journey With Off-Policy Algorithms\",\n  author  = \"Raffin, Antonin\",\n  journal = \"araffin.github.io\",\n  year    = \"2025\",\n  month   = \"Feb\",\n  url     = \"https://araffin.github.io/post/sac-massive-sim/\"\n}\n```\n\n### Speeding Up SAC with Massively Parallel Simulation\n```bibtex\n@article{shukla2025fastsac,\n  title   = \"Speeding Up SAC with Massively Parallel Simulation\",\n  author  = \"Shukla, Arth\",\n  journal = \"https://arthshukla.substack.com\",\n  year    = \"2025\",\n  month   = \"Mar\",\n  url     = \"https://arthshukla.substack.com/p/speeding-up-sac-with-massively-parallel\"\n}\n```\n"
  },
  {
    "path": "data/humanoidbench_result.json",
    "content": "{\n  \"h1hand_walk\": {\n    \"time\": [\n      0.0,\n      327.4388405844,\n      654.8776811688,\n      982.3165217531999,\n      1309.7553623376,\n      1637.194202922,\n      1964.6330435063999,\n      2292.0718840908,\n      2619.5107246752,\n      2946.9495652596,\n      3274.388405844\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0\n    ],\n    \"return\": [\n      0.0,\n      18.961605707804363,\n      21.807273546854656,\n      48.724927266438804,\n      94.65358352661133,\n      178.67118326822916,\n      501.64170328776044,\n      711.7367757161459,\n      901.3774210611979,\n      928.8246256510416,\n      928.2600301106771\n    ],\n    \"return_min\": [\n      0.0,\n      14.858372081304717,\n      12.96401166123347,\n      41.67681180605566,\n      54.95595528974483,\n      94.37760201041398,\n      279.43906161791847,\n      545.4619319542776,\n      877.1935439686172,\n      921.6425265540179,\n      906.3836415708065\n    ],\n    \"return_max\": [\n      0.0,\n      23.06483933430401,\n      30.65053543247584,\n      55.77304272682195,\n      134.35121176347783,\n      262.96476452604435,\n      723.8443449576024,\n      878.0116194780142,\n      925.5612981537786,\n      936.0067247480654,\n      950.1364186505477\n    ]\n  },\n  \"h1hand_stand\": {\n    \"time\": [\n      0.0,\n      325.5565317698,\n      651.1130635396,\n      976.6695953094,\n      1302.2261270792,\n      1627.782658849,\n      1953.3391906188,\n      2278.8957223886,\n      2604.4522541584,\n      2930.0087859282003,\n      3255.565317698\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0\n    ],\n    \"return\": [\n      
0.0,\n      31.879506429036457,\n      42.487387339274086,\n      53.94635899861654,\n      71.63724517822266,\n      109.86285400390625,\n      255.9549357096354,\n      628.9070638020834,\n      776.9771118164062,\n      887.54248046875,\n      911.9251912434896\n    ],\n    \"return_min\": [\n      0.0,\n      21.445549292274528,\n      22.163247088006734,\n      45.241289229102975,\n      64.28271961875848,\n      76.26589758620838,\n      191.1954024340911,\n      393.33148063655887,\n      656.7962055436002,\n      851.7788577992328,\n      880.8326840828662\n    ],\n    \"return_max\": [\n      0.0,\n      42.313463565798386,\n      62.81152759054144,\n      62.6514287681301,\n      78.99177073768683,\n      143.45981042160412,\n      320.7144689851797,\n      864.4826469676079,\n      897.1580180892123,\n      923.3061031382672,\n      943.017698404113\n    ]\n  },\n  \"h1hand_run\": {\n    \"time\": [\n      0.0,\n      306.4421410813333,\n      612.8842821626666,\n      919.326423244,\n      1225.7685643253333,\n      1532.2107054066666,\n      1838.652846488,\n      2145.0949875693336,\n      2451.5371286506665,\n      2757.9792697320004,\n      3064.4214108133333,\n      3370.8635518946667,\n      3677.305692976,\n      3983.747834057334,\n      4290.189975138667,\n      4596.63211622\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0\n    ],\n    \"return\": [\n      0.0,\n      6.890847841898601,\n      10.521868069966635,\n      9.320551554361979,\n      17.587395985921223,\n      31.98662503560384,\n      46.11953608194987,\n      74.0572992960612,\n      121.98380025227864,\n      210.85086059570312,\n      506.30312093098956,\n      524.246826171875,\n      775.0013631184896,\n      
811.9727172851562,\n      886.3514200846354,\n      902.0192260742188\n    ],\n    \"return_min\": [\n      0.0,\n      5.2453829775246215,\n      9.646580055146266,\n      6.430592906343505,\n      12.881361282168676,\n      26.82687314527471,\n      41.16444668808793,\n      62.03005710131488,\n      100.789247637638,\n      183.66015405980465,\n      427.02318928000045,\n      408.7644706774459,\n      737.4298733834557,\n      774.6940229887994,\n      879.065567239731,\n      892.5835266487527\n    ],\n    \"return_max\": [\n      0.0,\n      8.536312706272579,\n      11.397156084787003,\n      12.210510202380451,\n      22.29343068967377,\n      37.14637692593297,\n      51.07462547581181,\n      86.08454149080752,\n      143.17835286691928,\n      238.0415671316016,\n      585.5830525819787,\n      639.729181666304,\n      812.5728528535235,\n      849.2514115815131,\n      893.6372729295398,\n      911.4549254996848\n    ]\n  },\n  \"h1hand_reach\": {\n    \"time\": [\n      0.0,\n      372.80056838400003,\n      745.6011367680001,\n      1118.401705152,\n      1491.2022735360001,\n      1864.00284192,\n      2236.803410304,\n      2609.603978688,\n      2982.4045470720002,\n      3355.2051154560004,\n      3728.00568384,\n      4100.806252224001,\n      4473.606820608,\n      4846.407388992,\n      5219.207957376,\n      5592.00852576,\n      5964.8090941440005,\n      6337.609662528001,\n      6710.410230912001,\n      7083.210799296,\n      7456.01136768,\n      7828.811936064,\n      8201.612504448001,\n      8574.413072832,\n      8947.213641216,\n      9320.0142096,\n      9692.814777984,\n      10065.615346368,\n      10438.415914752,\n      10811.216483136,\n      11184.01705152\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n 
     8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      941.600341796875,\n      1135.4913330078125,\n      1474.2825520833333,\n      2313.754109700521,\n      3213.2079264322915,\n      4587.813883463542,\n      5064.874186197917,\n      5511.294270833333,\n      5576.633138020833,\n      6172.243489583333,\n      7008.345377604167,\n      6894.714680989583,\n      7536.7392578125,\n      7494.722819010417,\n      7460.627278645833,\n      7870.114095052083,\n      8164.121256510417,\n      8070.581217447917,\n      8084.484700520833,\n      8364.873697916666,\n      8254.95654296875,\n      8171.631673177083,\n      8403.752604166666,\n      8585.269368489584,\n      8388.333170572916,\n      8449.694010416666,\n      8608.694986979166,\n      8495.815755208334,\n      8748.194661458334,\n      8146.561197916667\n    ],\n    \"return_min\": [\n      0.0,\n      681.8662271797355,\n      1021.0384348384862,\n      1397.7603324284514,\n      1976.0678889906676,\n      2504.978510893123,\n      3452.5915196733913,\n      4091.507234596569,\n      4614.513790451145,\n      4584.797954406994,\n      5541.806414474982,\n      6259.515364983769,\n      6343.318836978547,\n      6854.539919974239,\n      6980.611797856333,\n      6705.693438934862,\n      7218.966063142879,\n      7807.798913658527,\n      7519.692863858716,\n      7372.049219729361,\n      7880.456149452872,\n      7570.937682802555,\n      7367.755272289276,\n      7989.806520979583,\n      8098.439330032555,\n      7945.363999112206,\n      8019.104933439191,\n      8156.065950744518,\n      7987.549893436531,\n      8366.520573819495,\n      7681.10435210547\n    ],\n    \"return_max\": [\n      
0.0,\n      1201.3344564140145,\n      1249.9442311771388,\n      1550.804771738215,\n      2651.4403304103744,\n      3921.4373419714602,\n      5723.036247253693,\n      6038.241137799265,\n      6408.074751215521,\n      6568.468321634672,\n      6802.680564691684,\n      7757.175390224565,\n      7446.110525000619,\n      8218.938595650761,\n      8008.8338401645005,\n      8215.561118356805,\n      8521.262126961288,\n      8520.443599362306,\n      8621.469571037118,\n      8796.920181312305,\n      8849.29124638046,\n      8938.975403134946,\n      8975.50807406489,\n      8817.69868735375,\n      9072.099406946612,\n      8831.302342033627,\n      8880.283087394142,\n      9061.324023213814,\n      9004.081616980136,\n      9129.868749097173,\n      8612.018043727863\n    ]\n  },\n  \"h1hand_hurdle\": {\n    \"time\": [\n      0.0,\n      361.5054530611936,\n      723.0109061223872,\n      1084.5163591835806,\n      1446.0218122447743,\n      1807.5272653059678,\n      2169.032718367161,\n      2530.538171428355,\n      2892.0436244895486,\n      3253.549077550742,\n      3615.0545306119357,\n      3976.559983673129,\n      4338.065436734322,\n      4699.570889795516,\n      5061.07634285671,\n      5422.581795917904,\n      5784.087248979097,\n      6145.5927020402905,\n      6507.098155101484,\n      6868.603608162677,\n      7230.109061223871,\n      7591.614514285065,\n      7953.119967346258,\n      8314.625420407452,\n      8676.130873468644,\n      9037.636326529839,\n      9399.141779591033,\n      9760.647232652227,\n      10122.15268571342,\n      10483.658138774614,\n      10845.163591835808,\n      11206.669044897\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      
10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0\n    ],\n    \"return\": [\n      0.0,\n      6.492611567179362,\n      10.588607947031656,\n      18.610817273457844,\n      29.661352157592773,\n      49.00843048095703,\n      85.58958435058594,\n      115.55878194173177,\n      173.2019500732422,\n      218.64263916015625,\n      319.6346893310547,\n      377.22975667317706,\n      496.73987833658856,\n      490.29189046223956,\n      610.6637573242188,\n      658.8320719401041,\n      689.0982869466146,\n      710.2075602213541,\n      735.3473714192709,\n      778.7053426106771,\n      762.7415568033854,\n      804.9999593098959,\n      830.8478800455729,\n      852.31982421875,\n      861.3894653320312,\n      814.1538696289062,\n      871.5404663085938,\n      876.4856160481771,\n      852.7265625,\n      889.6229248046875,\n      887.1525065104166,\n      886.8323160807291\n    ],\n    \"return_min\": [\n      0.0,\n      6.225973005334275,\n      8.412515126451241,\n      15.738143769806989,\n      21.76995365926382,\n      31.328784883872007,\n      56.83435619834776,\n      79.92347525493489,\n      112.07297356902194,\n      138.09470906365925,\n      216.08342306974944,\n      259.30040536744895,\n      389.9528713112492,\n      378.8689856869445,\n      480.1631532334461,\n      565.0675789124052,\n      598.9966247159923,\n      618.670444534875,\n      659.2942580856163,\n      729.4083980816131,\n      708.118056513654,\n      737.1563442724927,\n      785.5860491473233,\n      822.1971602106087,\n      853.2213237200906,\n      798.9233122130538,\n      848.562144860611,\n      867.3290162790252,\n      808.398990047507,\n      877.0273627520126,\n      880.6210154175322,\n      878.8996722928737\n    ],\n    \"return_max\": 
[\n      0.0,\n      6.7592501290244495,\n      12.764700767612071,\n      21.4834907771087,\n      37.55275065592173,\n      66.68807607804206,\n      114.34481250282411,\n      151.19408862852865,\n      234.33092657746244,\n      299.19056925665325,\n      423.18595559235996,\n      495.1591079789052,\n      603.526885361928,\n      601.7147952375346,\n      741.1643614149914,\n      752.596564967803,\n      779.1999491772369,\n      801.7446759078333,\n      811.4004847529254,\n      828.0022871397412,\n      817.3650570931168,\n      872.8435743472991,\n      876.1097109438225,\n      882.4424882268913,\n      869.5576069439719,\n      829.3844270447587,\n      894.5187877565764,\n      885.6422158173291,\n      897.054134952493,\n      902.2184868573624,\n      893.683997603301,\n      894.7649598685846\n    ]\n  },\n  \"h1hand_crawl\": {\n    \"time\": [\n      0.0,\n      351.0455393933077,\n      702.0910787866154,\n      1053.136618179923,\n      1404.1821575732308,\n      1755.2276969665386,\n      2106.273236359846,\n      2457.318775753154,\n      2808.3643151464616,\n      3159.4098545397696,\n      3510.455393933077,\n      3861.5009333263847,\n      4212.546472719692,\n      4563.592012113\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0\n    ],\n    \"return\": [\n      0.0,\n      352.52854410807294,\n      560.5812377929688,\n      552.0515441894531,\n      642.7318115234375,\n      714.5783081054688,\n      806.6729939778646,\n      860.4519856770834,\n      892.0007731119791,\n      920.3323160807291,\n      924.2599487304688,\n      945.3890991210938,\n      948.0378011067709,\n      956.2367553710938\n    ],\n    \"return_min\": [\n      0.0,\n      324.2043638986621,\n      545.5137620982706,\n      483.4934811974669,\n   
   635.1979678248894,\n      683.6522397156278,\n      727.5537564989331,\n      784.8352581603768,\n      839.2658061728478,\n      887.326843488466,\n      905.17119209457,\n      927.7121326375325,\n      932.2496491343232,\n      945.4639700799313\n    ],\n    \"return_max\": [\n      0.0,\n      380.85272431748376,\n      575.6487134876669,\n      620.6096071814393,\n      650.2656552219856,\n      745.5043764953097,\n      885.7922314567961,\n      936.0687131937899,\n      944.7357400511105,\n      953.3377886729922,\n      943.3487053663675,\n      963.066065604655,\n      963.8259530792185,\n      967.0095406622562\n    ]\n  },\n  \"h1hand_maze\": {\n    \"time\": [\n      0.0,\n      373.7336517353571,\n      747.4673034707142,\n      1121.2009552060715,\n      1494.9346069414285,\n      1868.6682586767856,\n      2242.401910412143,\n      2616.1355621475,\n      2989.869213882857,\n      3363.6028656182143,\n      3737.3365173535713,\n      4111.070169088928,\n      4484.803820824286,\n      4858.537472559643,\n      5232.271124295\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0\n    ],\n    \"return\": [\n      0.0,\n      140.10218302408853,\n      145.8614705403646,\n      147.9732869466146,\n      182.3691864013672,\n      233.79485066731772,\n      329.2705078125,\n      352.1261393229167,\n      357.1931559244792,\n      357.9311014811198,\n      298.9239857991536,\n      358.42406209309894,\n      350.0440165201823,\n      353.48012288411456,\n      358.6817118326823\n    ],\n    \"return_min\": [\n      0.0,\n      124.88956930471686,\n      135.93298616192584,\n      142.26980814418167,\n      163.01049749771548,\n      169.52534951133262,\n      312.57792152579833,\n      341.7712339158121,\n      
350.91098197823186,\n      354.19958286243155,\n      211.5847442406885,\n      346.7498400009268,\n      342.2860426951425,\n      344.2272969649359,\n      354.4987809321401\n    ],\n    \"return_max\": [\n      0.0,\n      155.31479674346022,\n      155.78995491880335,\n      153.67676574904752,\n      201.7278753050189,\n      298.0643518233028,\n      345.96309409920167,\n      362.48104473002127,\n      363.4753298707265,\n      361.66262009980807,\n      386.26322735761875,\n      370.0982841852711,\n      357.8019903452221,\n      362.73294880329325,\n      362.8646427332245\n    ]\n  },\n  \"h1hand_sit_simple\": {\n    \"time\": [\n      0.0,\n      352.13274472366663,\n      704.2654894473333,\n      1056.398234171,\n      1408.5309788946665,\n      1760.6637236183333,\n      2112.796468342,\n      2464.9292130656668,\n      2817.061957789333,\n      3169.194702513\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0\n    ],\n    \"return\": [\n      0.0,\n      139.5462900797526,\n      653.7905069986979,\n      863.320068359375,\n      937.2736612955729,\n      917.0751342773438,\n      925.4646809895834,\n      934.3729451497396,\n      931.1214192708334,\n      929.0094401041666\n    ],\n    \"return_min\": [\n      0.0,\n      39.1384600509537,\n      561.3652721120882,\n      840.7680414731517,\n      928.7285812113073,\n      882.5101152312143,\n      910.9714846547159,\n      910.6220791649206,\n      912.940393490759,\n      909.0497473763742\n    ],\n    \"return_max\": [\n      0.0,\n      239.95412010855148,\n      746.2157418853076,\n      885.8720952455983,\n      945.8187413798385,\n      951.6401533234732,\n      939.9578773244508,\n      958.1238111345587,\n      949.3024450509077,\n      948.9691328319591\n    ]\n  },\n  \"h1hand_sit_hard\": {\n    \"time\": [\n      0.0,\n      
355.06120608526663,\n      710.1224121705333,\n      1065.1836182558,\n      1420.2448243410665,\n      1775.3060304263333,\n      2130.3672365116,\n      2485.4284425968663,\n      2840.489648682133,\n      3195.5508547674003,\n      3550.6120608526667,\n      3905.673266937933,\n      4260.7344730232,\n      4615.795679108467,\n      4970.8568851937325,\n      5325.918091279,\n      5680.979297364266,\n      6036.040503449533,\n      6391.101709534801,\n      6746.162915620066,\n      7101.224121705333,\n      7456.2853277906,\n      7811.346533875866,\n      8166.407739961132,\n      8521.4689460464,\n      8876.530152131665,\n      9231.591358216934,\n      9586.6525643022,\n      9941.713770387465,\n      10296.774976472732,\n      10651.836182558\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      95.34047635396321,\n      195.36861419677734,\n      318.97853597005206,\n      396.64198811848956,\n      375.91650390625,\n      488.83921305338544,\n      548.6920267740885,\n      574.7945251464844,\n      566.6195780436198,\n      583.4102783203125,\n      590.7933756510416,\n      600.908457438151,\n      594.9339090983073,\n      610.3661804199219,\n      616.0959269205729,\n      610.0286865234375,\n      601.2849934895834,\n      611.8743286132812,\n      634.0820617675781,\n      632.5235493977865,\n      669.2791239420573,\n      701.1968180338541,\n      683.4253133138021,\n    
  706.3424275716146,\n      741.2285461425781,\n      699.800537109375,\n      749.2343546549479,\n      752.3892517089844,\n      734.4484252929688,\n      726.0768636067709\n    ],\n    \"return_min\": [\n      0.0,\n      2.5724704693975298,\n      93.58268813745721,\n      233.71247327846993,\n      368.7017851150339,\n      332.2081460365896,\n      340.63550191707344,\n      329.6445384869926,\n      351.03336679701863,\n      343.03801256893826,\n      363.99036303185073,\n      360.8858716647696,\n      369.6746990033322,\n      375.6881012599298,\n      384.3501951686797,\n      391.8689483225843,\n      385.04690037041826,\n      394.37185439974064,\n      381.77529344069296,\n      411.24672410522163,\n      405.85462333371214,\n      459.9114779257408,\n      504.43995094660033,\n      483.4763839716003,\n      499.2238858422869,\n      538.4201845543151,\n      470.58627615990247,\n      533.4576585016802,\n      541.3393913458676,\n      524.7727664466893,\n      470.0900042127373\n    ],\n    \"return_max\": [\n      0.0,\n      188.1084822385289,\n      297.15454025609745,\n      404.2445986616342,\n      424.58219112194524,\n      419.6248617759104,\n      637.0429241896975,\n      767.7395150611844,\n      798.5556834959501,\n      790.2011435183013,\n      802.8301936087743,\n      820.7008796373136,\n      832.1422158729698,\n      814.1797169366847,\n      836.3821656711641,\n      840.3229055185615,\n      835.0104726764567,\n      808.1981325794261,\n      841.9733637858695,\n      856.9173994299347,\n      859.1924754618608,\n      878.6467699583736,\n      897.9536851211079,\n      883.374242656004,\n      913.4609693009423,\n      944.0369077308411,\n      929.0147980588475,\n      965.0110508082156,\n      963.4391120721011,\n      944.1240841392482,\n      982.0637230008044\n    ]\n  },\n  \"h1hand_balance_simple\": {\n    \"time\": [\n      0.0,\n      175.717594372775,\n      351.43518874555,\n      527.152783118325,\n      
702.8703774911,\n      878.587971863875,\n      1054.30556623665,\n      1230.023160609425,\n      1405.7407549822,\n      1581.458349354975,\n      1757.17594372775,\n      1932.8935381005253,\n      2108.6111324733,\n      2284.328726846075,\n      2460.04632121885,\n      2635.763915591625,\n      2811.4815099644,\n      2987.1991043371754,\n      3162.91669870995,\n      3338.6342930827254,\n      3514.3518874555,\n      3690.0694818282755,\n      3865.7870762010507,\n      4041.5046705738255,\n      4217.2222649466,\n      4392.939859319375,\n      4568.65745369215,\n      4744.375048064925,\n      4920.0926424377,\n      5095.8102368104755,\n      5271.52783118325,\n      5447.245425556025,\n      5622.9630199288,\n      5798.680614301576,\n      5974.398208674351,\n      6150.115803047125,\n      6325.8333974199,\n      6501.550991792676,\n      6677.268586165451,\n      6852.986180538226,\n      7028.703774911\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0,\n      20480000.0,\n      21120000.0,\n      21760000.0,\n      22400000.0,\n      23040000.0,\n      23680000.0,\n      24320000.0,\n      24960000.0,\n      25600000.0\n    ],\n    \"return\": [\n      0.0,\n      52.67643197377523,\n      77.08096504211426,\n      100.20963668823242,\n      118.43112309773763,\n      157.09767150878906,\n      208.31290181477866,\n      215.8346757888794,\n      295.65764872233075,\n      373.72972615559894,\n    
  395.9920959472656,\n      326.1277338663737,\n      487.8640950520833,\n      412.4946568806966,\n      481.59149678548175,\n      489.4783935546875,\n      555.3950398763021,\n      549.860850016276,\n      587.4115397135416,\n      623.5388387044271,\n      538.5026601155599,\n      437.4591878255208,\n      563.7863718668619,\n      591.0351969401041,\n      635.8733622233073,\n      531.7674204508463,\n      557.8404591878256,\n      567.099183400472,\n      504.17720794677734,\n      611.8459065755209,\n      568.1315002441406,\n      531.0865529378256,\n      719.8113403320312,\n      617.6780344645182,\n      661.9119974772135,\n      701.4621276855469,\n      612.2174580891927,\n      698.0676879882812,\n      607.3912162780762,\n      701.1986897786459,\n      788.4394226074219\n    ],\n    \"return_min\": [\n      0.0,\n      34.03153521489918,\n      67.34714304892007,\n      93.43179041853405,\n      108.3451803854909,\n      131.60774027010135,\n      186.807812935322,\n      121.50917666317413,\n      243.0843846608089,\n      314.6589689026109,\n      335.93710664802484,\n      199.6418475452607,\n      421.46378419781684,\n      271.96722431041565,\n      419.1856732133755,\n      384.7529814203755,\n      496.47070280779275,\n      468.9030197426357,\n      520.0155579034114,\n      555.5749497533352,\n      406.5754864025145,\n      275.34879196835936,\n      498.0399353927207,\n      550.1356761025373,\n      516.3616959889615,\n      320.4064699483651,\n      442.36312091556783,\n      326.04143673322324,\n      303.6744638373668,\n      485.79766687848047,\n      303.55332555045385,\n      466.7197913683005,\n      625.171113211483,\n      395.48405865968545,\n      542.6871855684739,\n      615.614612025825,\n      499.65604515073494,\n      548.4941266431731,\n      350.1081528445121,\n      573.6862052997205,\n      653.9984031207549\n    ],\n    \"return_max\": [\n      0.0,\n      71.32132873265128,\n      86.81478703530844,\n      
106.9874829579308,\n      128.51706580998433,\n      182.58760274747678,\n      229.8179906942353,\n      310.1601749145847,\n      348.2309127838526,\n      432.80048340858696,\n      456.0470852465064,\n      452.61362018748673,\n      554.2644059063498,\n      553.0220894509775,\n      543.997320357588,\n      594.2038056889995,\n      614.3193769448116,\n      630.8186802899163,\n      654.8075215236719,\n      691.502727655519,\n      670.4298338286053,\n      599.5695836826823,\n      629.5328083410031,\n      631.9347177776709,\n      755.385028457653,\n      743.1283709533275,\n      673.3177974600833,\n      808.1569300677209,\n      704.6799520561879,\n      737.8941462725612,\n      832.7096749378275,\n      595.4533145073506,\n      814.4515674525795,\n      839.872010269351,\n      781.1368093859531,\n      787.3096433452688,\n      724.7788710276506,\n      847.6412493333894,\n      864.6742797116402,\n      828.7111742575712,\n      922.8804420940888\n    ]\n  },\n  \"h1hand_balance_hard\": {\n    \"time\": [\n      0.0,\n      181.3574838361809,\n      362.7149676723618,\n      544.0724515085427,\n      725.4299353447236,\n      906.7874191809045,\n      1088.1449030170854,\n      1269.5023868532662,\n      1450.8598706894472,\n      1632.2173545256283,\n      1813.574838361809,\n      1994.9323221979898,\n      2176.289806034171,\n      2357.647289870352,\n      2539.0047737065324,\n      2720.3622575427135,\n      2901.7197413788945,\n      3083.0772252150755,\n      3264.4347090512565,\n      3445.792192887437,\n      3627.149676723618,\n      3808.507160559799,\n      3989.8646443959797,\n      4171.222128232161,\n      4352.579612068342,\n      4533.937095904523,\n      4715.294579740704,\n      4896.652063576884,\n      5078.009547413065,\n      5259.367031249247,\n      5440.724515085427,\n      5622.081998921608,\n      5803.439482757789,\n      5984.79696659397,\n      6166.154450430151,\n      6347.511934266332,\n      6528.869418102513,\n 
     6710.226901938693,\n      6891.584385774874,\n      7072.941869611055,\n      7254.299353447236,\n      7435.656837283417,\n      7617.014321119598,\n      7798.371804955779,\n      7979.729288791959,\n      8161.08677262814,\n      8342.444256464321,\n      8523.801740300501,\n      8705.159224136683,\n      8886.516707972865,\n      9067.874191809045,\n      9249.231675645227,\n      9430.589159481407,\n      9611.946643317588,\n      9793.304127153768,\n      9974.66161098995,\n      10156.01909482613,\n      10337.376578662312,\n      10518.734062498494,\n      10700.091546334672,\n      10881.449030170854,\n      11062.806514007034,\n      11244.163997843216,\n      11425.521481679396,\n      11606.878965515578,\n      11788.236449351758,\n      11969.59393318794,\n      12150.951417024122,\n      12332.308900860302,\n      12513.666384696484,\n      12695.023868532664,\n      12876.381352368846,\n      13057.738836205026,\n      13239.096320041204,\n      13420.453803877386,\n      13601.811287713566,\n      13783.168771549748,\n      13964.526255385928,\n      14145.88373922211,\n      14327.24122305829,\n      14508.598706894472,\n      14689.956190730652,\n      14871.313674566834,\n      15052.671158403016,\n      15234.028642239196,\n      15415.386126075378,\n      15596.743609911558,\n      15778.10109374774,\n      15959.458577583919,\n      16140.816061420099,\n      16322.17354525628,\n      16503.53102909246,\n      16684.888512928643,\n      16866.245996764825,\n      17047.603480601003,\n      17228.960964437185,\n      17410.318448273367,\n      17591.67593210955,\n      17773.03341594573,\n      17954.39089978191,\n      18135.74838361809,\n      18317.105867454273,\n      18498.463351290455,\n      18679.820835126633,\n      18861.178318962815,\n      19042.535802798993,\n      19223.893286635175,\n      19405.250770471357,\n      19586.608254307535,\n      19767.965738143717,\n      19949.3232219799,\n      20130.68070581608,\n      
20312.03818965226,\n      20493.39567348844,\n      20674.753157324623,\n      20856.110641160805,\n      21037.468124996987,\n      21218.825608833165,\n      21400.183092669344,\n      21581.540576505526,\n      21762.898060341708,\n      21944.25554417789,\n      22125.613028014068,\n      22306.97051185025,\n      22488.32799568643,\n      22669.685479522614,\n      22851.042963358792,\n      23032.400447194974,\n      23213.757931031156,\n      23395.115414867338,\n      23576.472898703516,\n      23757.830382539698,\n      23939.18786637588,\n      24120.54535021206,\n      24301.902834048244,\n      24483.260317884422,\n      24664.617801720604,\n      24845.975285556786,\n      25027.332769392968,\n      25208.690253229146,\n      25390.047737065328,\n      25571.40522090151,\n      25752.762704737692,\n      25934.12018857387,\n      26115.477672410052,\n      26296.83515624623,\n      26478.19264008241,\n      26659.55012391859,\n      26840.907607754772,\n      27022.265091590954,\n      27203.622575427133,\n      27384.980059263315,\n      27566.337543099497,\n      27747.69502693568,\n      27929.052510771857,\n      28110.40999460804,\n      28291.76747844422,\n      28473.124962280403,\n      28654.48244611658,\n      28835.839929952763,\n      29017.197413788945,\n      29198.554897625127,\n      29379.912381461305,\n      29561.269865297487,\n      29742.62734913367,\n      29923.98483296985,\n      30105.342316806033,\n      30286.69980064221,\n      30468.057284478393,\n      30649.414768314575,\n      30830.772252150757,\n      31012.129735986935,\n      31193.487219823117,\n      31374.8447036593,\n      31556.20218749548,\n      31737.55967133166,\n      31918.917155167837,\n      32100.27463900402,\n      32281.632122840198,\n      32462.98960667638,\n      32644.34709051256,\n      32825.70457434874,\n      33007.06205818492,\n      33188.4195420211,\n      33369.777025857285,\n      33551.13450969347,\n      33732.49199352965,\n      
33913.84947736583,\n      34095.206961202006,\n      34276.56444503819,\n      34457.92192887437,\n      34639.27941271055,\n      34820.63689654673,\n      35001.994380382916,\n      35183.3518642191,\n      35364.70934805528,\n      35546.06683189146,\n      35727.424315727636,\n      35908.78179956382,\n      36090.1392834\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0,\n      20480000.0,\n      21120000.0,\n      21760000.0,\n      22400000.0,\n      23040000.0,\n      23680000.0,\n      24320000.0,\n      24960000.0,\n      25600000.0,\n      26240000.0,\n      26880000.0,\n      27520000.0,\n      28160000.0,\n      28800000.0,\n      29440000.0,\n      30080000.0,\n      30720000.0,\n      31360000.0,\n      32000000.0,\n      32640000.0,\n      33280000.0,\n      33920000.0,\n      34560000.0,\n      35200000.0,\n      35840000.0,\n      36480000.0,\n      37120000.0,\n      37760000.0,\n      38400000.0,\n      39040000.0,\n      39680000.0,\n      40320000.0,\n      40960000.0,\n      41600000.0,\n      42240000.0,\n      42880000.0,\n      43520000.0,\n      44160000.0,\n      44800000.0,\n      45440000.0,\n      46080000.0,\n      46720000.0,\n      47360000.0,\n      48000000.0,\n      48640000.0,\n      49280000.0,\n      49920000.0,\n      50560000.0,\n      51200000.0,\n      51840000.0,\n      52480000.0,\n      53120000.0,\n      53760000.0,\n      54400000.0,\n      55040000.0,\n      
55680000.0,\n      56320000.0,\n      56960000.0,\n      57600000.0,\n      58240000.0,\n      58880000.0,\n      59520000.0,\n      60160000.0,\n      60800000.0,\n      61440000.0,\n      62080000.0,\n      62720000.0,\n      63360000.0,\n      64000000.0,\n      64640000.0,\n      65280000.0,\n      65920000.0,\n      66560000.0,\n      67200000.0,\n      67840000.0,\n      68480000.0,\n      69120000.0,\n      69760000.0,\n      70400000.0,\n      71040000.0,\n      71680000.0,\n      72320000.0,\n      72960000.0,\n      73600000.0,\n      74240000.0,\n      74880000.0,\n      75520000.0,\n      76160000.0,\n      76800000.0,\n      77440000.0,\n      78080000.0,\n      78720000.0,\n      79360000.0,\n      80000000.0,\n      80640000.0,\n      81280000.0,\n      81920000.0,\n      82560000.0,\n      83200000.0,\n      83840000.0,\n      84480000.0,\n      85120000.0,\n      85760000.0,\n      86400000.0,\n      87040000.0,\n      87680000.0,\n      88320000.0,\n      88960000.0,\n      89600000.0,\n      90240000.0,\n      90880000.0,\n      91520000.0,\n      92160000.0,\n      92800000.0,\n      93440000.0,\n      94080000.0,\n      94720000.0,\n      95360000.0,\n      96000000.0,\n      96640000.0,\n      97280000.0,\n      97920000.0,\n      98560000.0,\n      99200000.0,\n      99840000.0,\n      100480000.0,\n      101120000.0,\n      101760000.0,\n      102400000.0,\n      103040000.0,\n      103680000.0,\n      104320000.0,\n      104960000.0,\n      105600000.0,\n      106240000.0,\n      106880000.0,\n      107520000.0,\n      108160000.0,\n      108800000.0,\n      109440000.0,\n      110080000.0,\n      110720000.0,\n      111360000.0,\n      112000000.0,\n      112640000.0,\n      113280000.0,\n      113920000.0,\n      114560000.0,\n      115200000.0,\n      115840000.0,\n      116480000.0,\n      117120000.0,\n      117760000.0,\n      118400000.0,\n      119040000.0,\n      119680000.0,\n      120320000.0,\n      120960000.0,\n      
121600000.0,\n      122240000.0,\n      122880000.0,\n      123520000.0,\n      124160000.0,\n      124800000.0,\n      125440000.0,\n      126080000.0,\n      126720000.0,\n      127360000.0\n    ],\n    \"return\": [\n      0.0,\n      38.196160634358726,\n      58.770947774251304,\n      62.07254155476888,\n      67.4906005859375,\n      70.74937947591145,\n      74.7848612467448,\n      77.75097147623698,\n      79.08344014485677,\n      80.12428538004558,\n      84.40036265055339,\n      84.25612894694011,\n      89.841064453125,\n      86.2732442220052,\n      92.54730733235677,\n      95.28934987386067,\n      95.8080342610677,\n      97.19317372639973,\n      107.09626007080078,\n      110.65089670817058,\n      114.67183176676433,\n      110.9939193725586,\n      112.79729715983073,\n      113.6125971476237,\n      116.34737141927083,\n      126.48547871907552,\n      133.89915466308594,\n      109.39595286051433,\n      134.32386779785156,\n      115.43318176269531,\n      143.85650126139322,\n      119.99575297037761,\n      132.2745564778646,\n      135.0002670288086,\n      187.80313873291016,\n      159.67255147298178,\n      146.672732035319,\n      158.2981389363607,\n      136.65037282307944,\n      128.09097798665366,\n      169.46741231282553,\n      161.30030314127603,\n      164.15653228759766,\n      137.65259297688803,\n      136.9373321533203,\n      206.19168599446616,\n      159.12822469075522,\n      179.7809041341146,\n      172.49272664388022,\n      250.76256306966147,\n      194.66214497884116,\n      195.08185323079428,\n      235.58951314290366,\n      166.8591283162435,\n      238.91442362467447,\n      245.55697631835938,\n      207.1026153564453,\n      255.12977600097656,\n      217.94174194335938,\n      166.34827168782553,\n      149.34825642903647,\n      183.07278696695963,\n      218.01471964518228,\n      235.31849161783853,\n      195.28186543782553,\n      210.81011454264322,\n      170.80133565266928,\n      
211.78978983561197,\n      218.24620564778647,\n      197.42310587565103,\n      182.58748881022134,\n      152.4922637939453,\n      197.9974594116211,\n      201.38032023111978,\n      190.21263122558594,\n      240.82181294759116,\n      286.7570495605469,\n      188.37657165527344,\n      187.61556498209634,\n      203.70641072591147,\n      269.580815633138,\n      177.3887939453125,\n      252.92713419596353,\n      267.0267028808594,\n      240.3121795654297,\n      239.61591593424478,\n      207.05109151204428,\n      235.18618774414062,\n      177.18890889485678,\n      202.33190409342447,\n      160.5045623779297,\n      185.72308349609375,\n      188.78704833984375,\n      277.3094177246094,\n      272.63533782958984,\n      224.07214864095053,\n      232.78923543294272,\n      184.01846822102866,\n      225.31009419759116,\n      225.0649668375651,\n      304.5735168457031,\n      210.20008341471353,\n      201.0149180094401,\n      243.43156941731772,\n      215.55621337890625,\n      195.11417643229166,\n      200.63154093424478,\n      250.17939249674478,\n      166.86117553710938,\n      239.04779052734375,\n      202.9445546468099,\n      230.7897491455078,\n      200.33321634928384,\n      166.21000162760416,\n      196.61333719889322,\n      262.797119140625,\n      172.99969991048178,\n      223.8022206624349,\n      207.91385396321616,\n      194.3153839111328,\n      213.40963745117188,\n      237.738525390625,\n      221.23444112141928,\n      197.73790486653647,\n      180.7091064453125,\n      154.77271016438803,\n      241.7831827799479,\n      224.94592793782553,\n      185.94915771484375,\n      201.07649739583334,\n      219.7017618815104,\n      167.22635904947916,\n      170.35607401529947,\n      163.76718393961588,\n      225.6471710205078,\n      183.0110321044922,\n      192.79330444335938,\n      243.40941365559897,\n      207.83230590820312,\n      187.21705118815103,\n      254.02437337239584,\n      182.39125061035156,\n      
186.51834615071616,\n      198.46961466471353,\n      173.48248799641928,\n      169.77811686197916,\n      194.39813232421875,\n      229.33489481608072,\n      197.5059356689453,\n      195.6003621419271,\n      193.1749471028646,\n      201.3477986653646,\n      149.3585459391276,\n      160.55817159016928,\n      262.8863830566406,\n      194.60286458333334,\n      155.88604227701822,\n      155.73231506347656,\n      185.68701171875,\n      232.66948445638022,\n      191.00763956705728,\n      184.53072102864584,\n      238.246337890625,\n      183.43255615234375,\n      208.6761220296224,\n      198.46814473470053,\n      182.61422729492188,\n      219.0441640218099,\n      272.5106150309245,\n      228.05343627929688,\n      161.55077107747397,\n      202.0955810546875,\n      226.70318094889322,\n      252.42306518554688,\n      248.66568501790366,\n      242.6952362060547,\n      189.46682739257812,\n      151.86629740397134,\n      180.39802042643228,\n      214.47354125976562,\n      190.78692626953125,\n      205.54802958170572,\n      258.9932454427083,\n      253.63004557291666,\n      211.98404439290366,\n      211.90413411458334,\n      191.16903686523438,\n      219.9814249674479,\n      173.84603881835938,\n      207.7472915649414,\n      268.8358154296875,\n      219.39684041341147,\n      195.6284434000651,\n      185.613037109375,\n      184.69058227539062,\n      175.6501668294271,\n      221.70868937174478,\n      179.950927734375,\n      196.78107706705728,\n      243.99713134765625\n    ],\n    \"return_min\": [\n      0.0,\n      34.45428217803158,\n      55.41737184120804,\n      61.41869831871158,\n      66.43249621420325,\n      69.62003501316896,\n      73.62000081838235,\n      74.60206213497288,\n      76.32229807409914,\n      77.34785807783203,\n      81.51511554114175,\n      79.76769534189077,\n      85.01361415766722,\n      81.90230639389048,\n      89.69955042492953,\n      90.6178239151312,\n      88.02047685300616,\n      
96.07391744623853,\n      90.61242490649848,\n      101.45223290545653,\n      95.02559927014411,\n      100.60899040578748,\n      95.09311835848013,\n      97.23245698211139,\n      94.93065587379392,\n      96.36739335426385,\n      105.43736833701234,\n      97.00441945236639,\n      98.35959134250493,\n      98.28940804777696,\n      92.67936441640799,\n      96.88782462350221,\n      104.22682035772064,\n      118.07620764500636,\n      122.03861283666923,\n      77.70721064239504,\n      123.05194666169182,\n      90.49265642797565,\n      109.53681253072628,\n      100.79331677933696,\n      132.82121169040914,\n      118.10119067918774,\n      128.9584653469691,\n      130.8071722988806,\n      134.49261515372942,\n      92.08135912203397,\n      124.19046377359581,\n      98.81203426788495,\n      112.60570730312568,\n      191.6848021718945,\n      169.2558995325841,\n      164.44067226472907,\n      162.58134839465347,\n      101.12928192118886,\n      150.588229166145,\n      186.23761923215693,\n      150.1613026052525,\n      164.03645279537557,\n      155.29520585208192,\n      156.8134730705425,\n      110.8007256006957,\n      89.45774057195948,\n      183.66333080851112,\n      182.8045774261979,\n      147.439039041815,\n      192.50448931520535,\n      153.81440899229477,\n      159.37387265718547,\n      156.8847591896903,\n      140.9422189870913,\n      133.79386046844553,\n      121.2208783359017,\n      117.78878466109666,\n      162.0891325977859,\n      142.91152152327456,\n      145.66650485354245,\n      181.36210061214183,\n      172.0549260337872,\n      129.77998980595356,\n      184.15597889815282,\n      226.4602045321107,\n      139.90753198856368,\n      212.27614858542546,\n      258.7678969036728,\n      128.43190294932754,\n      155.87710678035796,\n      124.28137596113937,\n      212.97156265828514,\n      144.41096078962323,\n      124.89275940718964,\n      141.10913428610715,\n      166.79807120535114,\n      
182.2678255457014,\n      174.8131193671947,\n      117.43064036520028,\n      186.94050341879202,\n      159.0381560034768,\n      155.60826522621846,\n      183.84612565643005,\n      137.57668363877298,\n      271.9786662677094,\n      160.17609619913318,\n      146.13779258497067,\n      175.88971848479008,\n      173.54095007784315,\n      134.59203053917872,\n      166.65528346229607,\n      180.4588111039666,\n      145.9365899929819,\n      151.56289219435445,\n      133.81831701051803,\n      191.35533174986944,\n      157.67126588854296,\n      130.87317243361662,\n      178.3746080438651,\n      229.5139109960802,\n      149.1997992438169,\n      160.45883006247314,\n      160.1759590836749,\n      125.17163497894785,\n      175.2298130949002,\n      187.0507229628084,\n      199.16419394083135,\n      141.84808456873083,\n      120.67866534173005,\n      141.746042432658,\n      220.1337018131785,\n      158.57603198020888,\n      148.14785461734908,\n      184.8981647429553,\n      148.06760980619828,\n      122.53421916003592,\n      130.70419158023753,\n      119.64487618156373,\n      154.3790765581566,\n      152.53780110145092,\n      178.43447367032533,\n      194.91483762218263,\n      149.96549429029432,\n      154.4239946844543,\n      169.34695408115252,\n      175.13164904117792,\n      134.66237652341292,\n      172.8040557403638,\n      153.92361892495038,\n      132.77755809968215,\n      158.73452883959555,\n      178.48066951373008,\n      158.17790159002567,\n      184.34224660737877,\n      151.50959380362832,\n      181.05786787718358,\n      128.44826901669964,\n      135.85806372404804,\n      224.19958239539173,\n      159.54753154225168,\n      139.97302598768363,\n      136.08539724634934,\n      169.96330045285345,\n      214.40018491853647,\n      138.97055888555082,\n      174.08251732565873,\n      188.24750174073284,\n      154.5361398840479,\n      147.65977634084902,\n      150.8830394081585,\n      148.54773711493283,\n  
    195.06914279689272,\n      249.31286395949382,\n      148.8014394989193,\n      128.945207717944,\n      171.02724765762608,\n      164.33448317784035,\n      173.62293096123312,\n      211.63861571735728,\n      163.8461708101563,\n      170.99422189105155,\n      127.6976483325336,\n      166.47019281749013,\n      182.54807929613827,\n      124.39573212425708,\n      162.95733679973253,\n      218.6751066764981,\n      200.5109498309493,\n      169.39523590474533,\n      139.30340693319937,\n      164.06790645736223,\n      182.94378495233704,\n      142.95961881523903,\n      146.49638746199017,\n      233.57057465778408,\n      165.27324945760324,\n      171.61117351294516,\n      142.4697787158965,\n      160.95787142437774,\n      163.002228594134,\n      167.43590048100222,\n      160.83785550209495,\n      151.80602190528384,\n      205.61825791496244\n    ],\n    \"return_max\": [\n      0.0,\n      41.93803909068587,\n      62.12452370729457,\n      62.72638479082619,\n      68.54870495767175,\n      71.87872393865395,\n      75.94972167510724,\n      80.89988081750109,\n      81.8445822156144,\n      82.90071268225913,\n      87.28560975996503,\n      88.74456255198945,\n      94.66851474858278,\n      90.64418205011992,\n      95.395064239784,\n      99.96087583259015,\n      103.59559166912925,\n      98.31243000656094,\n      123.58009523510309,\n      119.84956051088463,\n      134.31806426338454,\n      121.3788483393297,\n      130.50147596118134,\n      129.99273731313602,\n      137.76408696474772,\n      156.6035640838872,\n      162.36094098915953,\n      121.78748626866226,\n      170.2881442531982,\n      132.57695547761367,\n      195.03363810637845,\n      143.103681317253,\n      160.32229259800854,\n      151.92432641261084,\n      253.56766462915107,\n      241.63789230356852,\n      170.29351740894617,\n      226.10362144474573,\n      163.7639331154326,\n      155.38863919397033,\n      206.11361293524192,\n      
204.49941560336433,\n      199.35459922822622,\n      144.49801365489546,\n      139.3820491529112,\n      320.30201286689834,\n      194.06598560791463,\n      260.74977400034425,\n      232.37974598463475,\n      309.84032396742845,\n      220.06839042509822,\n      225.7230341968595,\n      308.5976778911538,\n      232.58897471129814,\n      327.2406180832039,\n      304.8763334045618,\n      264.0439281076381,\n      346.22309920657756,\n      280.5882780346368,\n      175.88307030510856,\n      187.89578725737724,\n      276.68783336195975,\n      252.36610848185344,\n      287.83240580947916,\n      243.12469183383607,\n      229.1157397700811,\n      187.7882623130438,\n      264.20570701403847,\n      279.60765210588266,\n      253.90399276421076,\n      231.38111715199716,\n      183.7636492519889,\n      278.20613416214553,\n      240.67150786445367,\n      237.51374092789732,\n      335.97712104163986,\n      392.1519985089519,\n      204.69821727675966,\n      245.45114015823913,\n      223.2568425536701,\n      312.7014267341653,\n      214.87005590206132,\n      293.5781198065016,\n      275.28550885804594,\n      352.19245618153184,\n      323.3547250881316,\n      289.8208070629492,\n      257.4008128299961,\n      209.96685700009033,\n      279.7710487796593,\n      179.89999046975223,\n      204.64809578683636,\n      195.3062711339861,\n      379.80571608202405,\n      427.84003529397944,\n      261.203793863109,\n      306.54031486240865,\n      212.42867121583885,\n      266.77406273875226,\n      312.5532500363572,\n      337.16836742369685,\n      260.2240706302939,\n      255.89204343390952,\n      310.97342034984536,\n      257.57147667996935,\n      255.6363223254046,\n      234.6077984061935,\n      319.8999738895229,\n      187.78576108123684,\n      326.53268886033305,\n      272.0707922831018,\n      270.2241665411462,\n      242.99516681002473,\n      201.5468308215917,\n      214.85206635392134,\n      296.0803272851698,\n      
196.79960057714666,\n      287.1456112623967,\n      255.6517488427574,\n      263.4591328433178,\n      251.58946180744354,\n      288.42632781844156,\n      243.3046883020072,\n      253.6277251643421,\n      240.73954754889496,\n      167.79937789611807,\n      263.4326637467173,\n      291.3158238954422,\n      223.75046081233842,\n      217.2548300487114,\n      291.33591395682254,\n      211.9184989389224,\n      210.0079564503614,\n      207.889491697668,\n      296.915265482859,\n      213.48426310753345,\n      207.15213521639342,\n      291.9039896890153,\n      265.69911752611193,\n      220.01010769184776,\n      338.70179266363914,\n      189.6508521795252,\n      238.3743157780194,\n      224.13517358906327,\n      193.04135706788819,\n      206.77867562427616,\n      230.06173580884195,\n      280.18912011843133,\n      236.83396974786496,\n      206.85847767647542,\n      234.84030040210087,\n      221.6377294535456,\n      170.26882286155555,\n      185.25827945629052,\n      301.5731837178895,\n      229.658197624415,\n      171.7990585663528,\n      175.37923288060378,\n      201.41072298464655,\n      250.93878399422397,\n      243.04472024856375,\n      194.97892473163296,\n      288.24517404051716,\n      212.3289724206396,\n      269.6924677183958,\n      246.05325006124255,\n      216.68071747491092,\n      243.0191852467271,\n      295.70836610235517,\n      307.30543305967444,\n      194.15633443700395,\n      233.16391445174892,\n      289.0718787199461,\n      331.2231994098606,\n      285.69275431845006,\n      321.54430160195307,\n      207.9394328941047,\n      176.0349464754091,\n      194.32584803537443,\n      246.39900322339298,\n      257.17812041480545,\n      248.1387223636789,\n      299.3113842089185,\n      306.749141314884,\n      254.57285288106198,\n      284.5048612959673,\n      218.27016727310652,\n      257.0190649825588,\n      204.73245882147972,\n      268.9981956678926,\n      304.10105620159095,\n      
273.5204313692197,\n      219.64571328718503,\n      228.7562955028535,\n      208.4232931264035,\n      188.29810506472018,\n      275.98147826248737,\n      199.06399996665505,\n      241.75613222883072,\n      282.37600478035006\n    ]\n  },\n  \"h1hand_stair\": {\n    \"time\": [\n      0.0,\n      387.40268051806663,\n      774.8053610361333,\n      1162.2080415542,\n      1549.6107220722665,\n      1937.013402590333,\n      2324.4160831084,\n      2711.818763626466,\n      3099.221444144533,\n      3486.6241246626,\n      3874.026805180666,\n      4261.4294856987335,\n      4648.8321662168,\n      5036.234846734867,\n      5423.637527252932,\n      5811.040207771,\n      6198.442888289066,\n      6585.845568807133,\n      6973.2482493252,\n      7360.650929843266,\n      7748.053610361332,\n      8135.4562908794,\n      8522.858971397467,\n      8910.261651915533,\n      9297.6643324336,\n      9685.067012951667,\n      10072.469693469733,\n      10459.872373987799,\n      10847.275054505864,\n      11234.677735023932,\n      11622.080415542\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      23.89271863301595,\n      34.71700096130371,\n      60.09582392374674,\n      73.17438507080078,\n      151.34715779622397,\n      224.9233856201172,\n      271.56304423014325,\n      316.3719889322917,\n      371.5347391764323,\n      382.11834716796875,\n      427.6658426920573,\n      
422.29393513997394,\n      487.94639078776044,\n      474.13771565755206,\n      548.6403503417969,\n      592.8809407552084,\n      625.5245564778646,\n      660.0660400390625,\n      652.6300048828125,\n      654.2977498372396,\n      667.5510660807291,\n      689.77783203125,\n      690.3555501302084,\n      692.7436726888021,\n      714.7565511067709,\n      700.9408976236979,\n      730.7681477864584,\n      744.6813557942709,\n      703.8641560872396,\n      689.6199951171875\n    ],\n    \"return_min\": [\n      0.0,\n      15.890752302265494,\n      29.305759036971846,\n      45.07623410366991,\n      64.0256023149527,\n      108.29035638002946,\n      168.69137872905253,\n      235.03404534904712,\n      268.15860258395946,\n      351.4744388725385,\n      372.16143158111464,\n      411.6028455251883,\n      342.44605901687726,\n      425.6015045319289,\n      438.91894798410647,\n      466.1055596018814,\n      559.5168597506888,\n      596.1932083138497,\n      610.9713671046555,\n      592.2205757048379,\n      609.8229299953937,\n      623.5360991628968,\n      650.7099879959824,\n      643.435216307602,\n      640.6507078943843,\n      684.2973876188141,\n      674.313749484506,\n      702.2090532268096,\n      720.8903004478105,\n      644.9576463371507,\n      627.955726274373\n    ],\n    \"return_max\": [\n      0.0,\n      31.894684963766405,\n      40.12824288563557,\n      75.11541374382357,\n      82.32316782664886,\n      194.40395921241847,\n      281.15539251118184,\n      308.0920431112394,\n      364.5853752806239,\n      391.59503948032614,\n      392.07526275482286,\n      443.72883985892634,\n      502.1418112630706,\n      550.2912770435919,\n      509.35648333099766,\n      631.1751410817124,\n      626.2450217597279,\n      654.8559046418795,\n      709.1607129734695,\n      713.0394340607871,\n      698.7725696790856,\n      711.5660329985615,\n      728.8456760665176,\n      737.2758839528148,\n      744.83663748322,\n      
745.2157145947276,\n      727.5680457628897,\n      759.3272423461071,\n      768.4724111407312,\n      762.7706658373286,\n      751.284263960002\n    ]\n  },\n  \"h1hand_slide\": {\n    \"time\": [\n      0.0,\n      309.2993865171333,\n      618.5987730342666,\n      927.8981595514,\n      1237.1975460685333,\n      1546.4969325856666,\n      1855.7963191028,\n      2165.0957056199336,\n      2474.3950921370665,\n      2783.6944786542,\n      3092.9938651713333,\n      3402.293251688467,\n      3711.5926382056,\n      4020.8920247227334,\n      4330.191411239867,\n      4639.490797757,\n      4948.790184274133,\n      5258.089570791267,\n      5567.3889573084,\n      5876.688343825534,\n      6185.9877303426665,\n      6495.2871168598,\n      6804.586503376934,\n      7113.885889894067,\n      7423.1852764112,\n      7732.484662928333,\n      8041.784049445467,\n      8351.0834359626,\n      8660.382822479734,\n      8969.682208996868,\n      9278.981595514\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      25.91161886850993,\n      46.50613911946615,\n      69.50710932413737,\n      125.7647476196289,\n      170.76056416829428,\n      238.53360493977866,\n      265.1405283610026,\n      287.49407958984375,\n      301.4013264973958,\n      266.9022521972656,\n      286.79445393880206,\n      311.7269694010417,\n      334.2634684244792,\n      393.5237731933594,\n      
413.90247599283856,\n      468.1215108235677,\n      478.6412353515625,\n      521.7706604003906,\n      597.3247273763021,\n      682.8544311523438,\n      815.7204182942709,\n      869.786865234375,\n      870.6855061848959,\n      924.8785807291666,\n      915.4677937825521,\n      915.6895751953125,\n      901.403076171875,\n      910.4904378255209,\n      935.8196411132812,\n      904.0150146484375\n    ],\n    \"return_min\": [\n      0.0,\n      17.31227752983613,\n      41.81182919734604,\n      57.14489937214505,\n      96.5149593569322,\n      163.1428876300595,\n      222.31254684790358,\n      231.41575879824072,\n      255.4749652490164,\n      243.84204384859126,\n      249.15111161639513,\n      265.7686367908954,\n      283.11169371305834,\n      310.17930949833135,\n      345.48766050051773,\n      399.79618209856903,\n      452.1210687697244,\n      462.9765296109677,\n      495.99326194394064,\n      558.9139653990013,\n      668.4022813113013,\n      795.2447221786368,\n      830.8790856292156,\n      823.1121698948539,\n      915.6907611174105,\n      893.0342590385319,\n      869.1971213119441,\n      878.6902612410832,\n      894.2164153448738,\n      921.1641786848095,\n      859.1733242686221\n    ],\n    \"return_max\": [\n      0.0,\n      34.51096020718373,\n      51.20044904158625,\n      81.8693192761297,\n      155.01453588232562,\n      178.37824070652906,\n      254.75466303165373,\n      298.86529792376456,\n      319.5131939306711,\n      358.9606091462004,\n      284.6533927781361,\n      307.8202710867087,\n      340.342245089025,\n      358.347627350627,\n      441.559885886201,\n      428.0087698871081,\n      484.121952877411,\n      494.3059410921573,\n      547.5480588568406,\n      635.7354893536029,\n      697.3065809933862,\n      836.1961144099049,\n      908.6946448395344,\n      918.2588424749379,\n      934.0664003409228,\n      937.9013285265723,\n      962.1820290786809,\n      924.1158911026668,\n      
926.764460306168,\n      950.475103541753,\n      948.8567050282529\n    ]\n  },\n  \"h1hand_pole\": {\n    \"time\": [\n      0.0,\n      385.46620238036667,\n      770.9324047607333,\n      1156.3986071411002,\n      1541.8648095214667,\n      1927.331011901833,\n      2312.7972142822005,\n      2698.2634166625667,\n      3083.7296190429333,\n      3469.1958214233,\n      3854.662023803666,\n      4240.128226184033,\n      4625.594428564401,\n      5011.060630944767,\n      5396.526833325133,\n      5781.9930357055,\n      6167.459238085867,\n      6552.925440466233,\n      6938.3916428466,\n      7323.857845226967,\n      7709.324047607332,\n      8094.7902499877,\n      8480.256452368067,\n      8865.722654748433,\n      9251.188857128802,\n      9636.655059509167,\n      10022.121261889533,\n      10407.5874642699,\n      10793.053666650267,\n      11178.519869030633,\n      11563.986071411\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      86.94035212198894,\n      315.2128092447917,\n      259.81410725911456,\n      438.83277384440106,\n      453.3154805501302,\n      472.5972188313802,\n      600.6257120768229,\n      602.0543416341146,\n      604.5667724609375,\n      625.5578002929688,\n      697.9258626302084,\n      730.2250773111979,\n      770.3667805989584,\n      756.2709757486979,\n      817.2851765950521,\n      814.6104939778646,\n      834.628662109375,\n      
836.6917317708334,\n      836.28759765625,\n      802.5843302408854,\n      828.5123494466146,\n      889.4725545247396,\n      927.0976359049479,\n      918.7555135091146,\n      918.3529663085938,\n      929.7622884114584,\n      927.1256306966146,\n      924.934814453125,\n      935.3818562825521,\n      867.1444905598959\n    ],\n    \"return_min\": [\n      0.0,\n      34.151917472845525,\n      169.25709369786907,\n      195.91231211958,\n      309.209006533762,\n      333.5828818070501,\n      457.3834555761806,\n      538.0321487054368,\n      556.7912159567086,\n      546.9315820056133,\n      562.6627573317008,\n      625.2555533079764,\n      698.4326425070107,\n      708.3732902267687,\n      687.8694938402527,\n      787.8280779660413,\n      789.6780869058325,\n      772.5260295300255,\n      792.1538079552,\n      783.306104588348,\n      739.7921613627637,\n      715.3343740103686,\n      847.4289290942377,\n      887.3489508402633,\n      877.1328123681296,\n      863.362262954571,\n      880.872949690255,\n      877.8235128123039,\n      882.9430126525199,\n      900.4706760458522,\n      743.7438399173179\n    ],\n    \"return_max\": [\n      0.0,\n      139.72878677113235,\n      461.1685247917143,\n      323.7159023986491,\n      568.4565411550401,\n      573.0480792932102,\n      487.81098208657977,\n      663.2192754482089,\n      647.3174673115207,\n      662.2019629162617,\n      688.4528432542367,\n      770.5961719524404,\n      762.0175121153851,\n      832.360270971148,\n      824.672457657143,\n      846.742275224063,\n      839.5429010498967,\n      896.7312946887245,\n      881.2296555864667,\n      889.269090724152,\n      865.376499119007,\n      941.6903248828606,\n      931.5161799552416,\n      966.8463209696324,\n      960.3782146500996,\n      973.3436696626165,\n      978.6516271326617,\n      976.4277485809254,\n      966.9266162537301,\n      970.293036519252,\n      990.5451412024738\n    ]\n  },\n  \"h1hand_push\": {\n    
\"time\": [\n      0.0,\n      312.6167574498032,\n      625.2335148996063,\n      937.8502723494095,\n      1250.4670297992127,\n      1563.083787249016,\n      1875.700544698819,\n      2188.3173021486223,\n      2500.9340595984254,\n      2813.5508170482285,\n      3126.167574498032,\n      3438.784331947835,\n      3751.401089397638,\n      4064.0178468474414,\n      4376.634604297245,\n      4689.251361747049,\n      5001.868119196851,\n      5314.484876646654,\n      5627.101634096457,\n      5939.718391546261,\n      6252.335148996064,\n      6564.951906445867,\n      6877.56866389567,\n      7190.1854213454735,\n      7502.802178795276,\n      7815.418936245081,\n      8128.035693694883,\n      8440.652451144686,\n      8753.26920859449,\n      9065.885966044292,\n      9378.502723494097,\n      9691.119480943898,\n      10003.736238393702,\n      10316.352995843505,\n      10628.969753293308,\n      10941.586510743113,\n      11254.203268192914,\n      11566.820025642717,\n      11879.436783092522,\n      12192.053540542325,\n      12504.670297992128,\n      12817.287055441931,\n      13129.903812891735,\n      13442.520570341538,\n      13755.13732779134,\n      14067.754085241144,\n      14380.370842690947,\n      14692.987600140748,\n      15005.604357590551,\n      15318.221115040358,\n      15630.837872490161,\n      15943.454629939963,\n      16256.071387389766,\n      16568.68814483957,\n      16881.304902289372,\n      17193.921659739175,\n      17506.53841718898,\n      17819.15517463878,\n      18131.771932088584,\n      18444.38868953839,\n      18757.005446988194,\n      19069.622204437994,\n      19382.238961887797,\n      19694.8557193376,\n      20007.472476787403,\n      20320.089234237206,\n      20632.70599168701,\n      20945.322749136812,\n      21257.939506586616,\n      21570.55626403642,\n      21883.173021486225,\n      22195.78977893603,\n      22508.406536385828,\n      22821.02329383563,\n      23133.640051285434,\n      
23446.25680873524,\n      23758.873566185044,\n      24071.490323634847,\n      24384.10708108465,\n      24696.723838534454,\n      25009.340595984257,\n      25321.95735343406,\n      25634.574110883863,\n      25947.190868333666,\n      26259.80762578347,\n      26572.424383233272,\n      26885.041140683075,\n      27197.65789813288,\n      27510.27465558268,\n      27822.891413032485,\n      28135.508170482288,\n      28448.12492793209,\n      28760.741685381894,\n      29073.358442831694,\n      29385.975200281497,\n      29698.5919577313,\n      30011.208715181103,\n      30323.825472630913,\n      30636.442230080716,\n      30949.05898753052,\n      31261.675744980323,\n      31574.292502430126,\n      31886.909259879925,\n      32199.52601732973,\n      32512.14277477953,\n      32824.759532229335,\n      33137.37628967914,\n      33449.99304712894,\n      33762.609804578744,\n      34075.22656202855,\n      34387.84331947835,\n      34700.46007692815,\n      35013.07683437796,\n      35325.69359182776,\n      35638.31034927756,\n      35950.927106727366,\n      36263.54386417717,\n      36576.16062162697,\n      36888.77737907678,\n      37201.394136526585,\n      37514.01089397639,\n      37826.62765142619,\n      38139.24440887599,\n      38451.86116632579,\n      38764.477923775594,\n      39077.0946812254,\n      39389.7114386752,\n      39702.328196125,\n      40014.944953574806,\n      40327.56171102461,\n      40640.17846847441,\n      40952.795225924216,\n      41265.41198337402,\n      41578.02874082382,\n      41890.645498273625,\n      42203.26225572343,\n      42515.87901317323,\n      42828.495770623034,\n      43141.11252807284,\n      43453.72928552265,\n      43766.34604297245,\n      44078.962800422254,\n      44391.57955787206,\n      44704.19631532185,\n      45016.813072771656,\n      45329.42983022146,\n      45642.04658767126,\n      45954.663345121066,\n      46267.28010257087,\n      46579.89686002067,\n      46892.51361747048,\n    
  47205.130374920285,\n      47517.74713237009,\n      47830.36388981989,\n      48142.980647269695,\n      48455.5974047195,\n      48768.2141621693,\n      49080.830919619104,\n      49393.44767706891,\n      49706.06443451871,\n      50018.68119196851,\n      50331.29794941832,\n      50643.91470686812,\n      50956.53146431792,\n      51269.148221767726,\n      51581.76497921753,\n      51894.38173666733,\n      52206.998494117135,\n      52519.61525156694,\n      52832.23200901674,\n      53144.848766466544,\n      53457.46552391635,\n      53770.08228136615,\n      54082.699038815954,\n      54395.31579626576,\n      54707.93255371556,\n      55020.54931116536,\n      55333.166068615166,\n      55645.78282606497,\n      55958.39958351477,\n      56271.016340964576,\n      56583.63309841438,\n      56896.24985586418,\n      57208.866613313985,\n      57521.48337076379,\n      57834.10012821359,\n      58146.71688566339,\n      58459.33364311319,\n      58771.95040056299\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0,\n      20480000.0,\n      21120000.0,\n      21760000.0,\n      22400000.0,\n      23040000.0,\n      23680000.0,\n      24320000.0,\n      24960000.0,\n      25600000.0,\n      26240000.0,\n      26880000.0,\n      27520000.0,\n      28160000.0,\n      28800000.0,\n      29440000.0,\n      30080000.0,\n      30720000.0,\n      31360000.0,\n      32000000.0,\n      32640000.0,\n      
33280000.0,\n      33920000.0,\n      34560000.0,\n      35200000.0,\n      35840000.0,\n      36480000.0,\n      37120000.0,\n      37760000.0,\n      38400000.0,\n      39040000.0,\n      39680000.0,\n      40320000.0,\n      40960000.0,\n      41600000.0,\n      42240000.0,\n      42880000.0,\n      43520000.0,\n      44160000.0,\n      44800000.0,\n      45440000.0,\n      46080000.0,\n      46720000.0,\n      47360000.0,\n      48000000.0,\n      48640000.0,\n      49280000.0,\n      49920000.0,\n      50560000.0,\n      51200000.0,\n      51840000.0,\n      52480000.0,\n      53120000.0,\n      53760000.0,\n      54400000.0,\n      55040000.0,\n      55680000.0,\n      56320000.0,\n      56960000.0,\n      57600000.0,\n      58240000.0,\n      58880000.0,\n      59520000.0,\n      60160000.0,\n      60800000.0,\n      61440000.0,\n      62080000.0,\n      62720000.0,\n      63360000.0,\n      64000000.0,\n      64640000.0,\n      65280000.0,\n      65920000.0,\n      66560000.0,\n      67200000.0,\n      67840000.0,\n      68480000.0,\n      69120000.0,\n      69760000.0,\n      70400000.0,\n      71040000.0,\n      71680000.0,\n      72320000.0,\n      72960000.0,\n      73600000.0,\n      74240000.0,\n      74880000.0,\n      75520000.0,\n      76160000.0,\n      76800000.0,\n      77440000.0,\n      78080000.0,\n      78720000.0,\n      79360000.0,\n      80000000.0,\n      80640000.0,\n      81280000.0,\n      81920000.0,\n      82560000.0,\n      83200000.0,\n      83840000.0,\n      84480000.0,\n      85120000.0,\n      85760000.0,\n      86400000.0,\n      87040000.0,\n      87680000.0,\n      88320000.0,\n      88960000.0,\n      89600000.0,\n      90240000.0,\n      90880000.0,\n      91520000.0,\n      92160000.0,\n      92800000.0,\n      93440000.0,\n      94080000.0,\n      94720000.0,\n      95360000.0,\n      96000000.0,\n      96640000.0,\n      97280000.0,\n      97920000.0,\n      98560000.0,\n      99200000.0,\n      99840000.0,\n      
100480000.0,\n      101120000.0,\n      101760000.0,\n      102400000.0,\n      103040000.0,\n      103680000.0,\n      104320000.0,\n      104960000.0,\n      105600000.0,\n      106240000.0,\n      106880000.0,\n      107520000.0,\n      108160000.0,\n      108800000.0,\n      109440000.0,\n      110080000.0,\n      110720000.0,\n      111360000.0,\n      112000000.0,\n      112640000.0,\n      113280000.0,\n      113920000.0,\n      114560000.0,\n      115200000.0,\n      115840000.0,\n      116480000.0,\n      117120000.0,\n      117760000.0,\n      118400000.0,\n      119040000.0,\n      119680000.0,\n      120320000.0\n    ],\n    \"return\": [\n      0.0,\n      -465.70371500651044,\n      -209.66437276204428,\n      -215.35665893554688,\n      -79.81609725952148,\n      -97.29651769002278,\n      -171.10615030924478,\n      -58.52375920613607,\n      35.39680035909017,\n      67.48361078898112,\n      17.984004974365234,\n      46.02268600463867,\n      106.70371754964192,\n      52.096457163492836,\n      87.57223892211914,\n      60.093360900878906,\n      202.40312703450522,\n      35.78864034016927,\n      77.78755187988281,\n      199.13873799641928,\n      161.0906524658203,\n      -6.331198374430339,\n      227.29449462890625,\n      225.0338363647461,\n      311.14427693684894,\n      233.39718119303384,\n      389.0839029947917,\n      355.971435546875,\n      262.3876698811849,\n      351.2383626302083,\n      315.8093973795573,\n      400.12449137369794,\n      398.5304260253906,\n      408.3233947753906,\n      473.21187082926434,\n      377.5498504638672,\n      485.5910339355469,\n      400.6451822916667,\n      458.1425069173177,\n      439.06847127278644,\n      326.82744852701825,\n      403.84735616048175,\n      480.24944559733075,\n      495.53029378255206,\n      483.7826232910156,\n      533.0876261393229,\n      472.01613362630206,\n      496.1668395996094,\n      512.2647094726562,\n      514.5851847330729,\n      
470.1879475911458,\n      544.6070658365885,\n      360.79221598307294,\n      496.0656331380208,\n      504.4897104899089,\n      546.812978108724,\n      533.8811543782552,\n      535.4517415364584,\n      471.8639500935872,\n      557.7514444986979,\n      629.8241882324219,\n      534.7682189941406,\n      511.1558583577474,\n      564.1106160481771,\n      494.4066518147786,\n      652.6817423502604,\n      521.490966796875,\n      520.7968419392904,\n      562.0110270182291,\n      456.7043991088867,\n      605.8467203776041,\n      640.2967529296875,\n      603.9107055664062,\n      574.7682291666666,\n      611.0949910481771,\n      631.2831217447916,\n      605.4075927734375,\n      546.4880421956381,\n      651.6212768554688,\n      625.2835998535156,\n      702.7826131184896,\n      600.3868408203125,\n      652.3099568684896,\n      490.0066731770833,\n      593.0624084472656,\n      541.475596110026,\n      587.3904622395834,\n      612.0767822265625,\n      577.2430826822916,\n      571.9033610026041,\n      625.8723347981771,\n      488.1465759277344,\n      592.8579711914062,\n      510.37445576985675,\n      523.4727376302084,\n      603.7941080729166,\n      547.8638305664062,\n      583.2688598632812,\n      570.062510172526,\n      527.5540873209635,\n      582.3120727539062,\n      656.2349243164062,\n      508.43341064453125,\n      607.9522298177084,\n      621.880859375,\n      636.3435770670573,\n      665.6842244466146,\n      588.9774169921875,\n      636.5202026367188,\n      662.1762288411459,\n      561.2283732096354,\n      598.4204813639323,\n      546.9343566894531,\n      626.6971232096354,\n      652.875732421875,\n      466.8674825032552,\n      524.4137674967448,\n      582.6043701171875,\n      534.4446411132812,\n      614.892588297526,\n      673.6151529947916,\n      547.5692138671875,\n      635.2989908854166,\n      611.1135660807291,\n      512.3421630859375,\n      652.3751627604166,\n      667.9252522786459,\n      
649.40771484375,\n      579.6252848307291,\n      612.9027201334635,\n      615.0734354654948,\n      541.9854431152344,\n      587.8748372395834,\n      592.4884033203125,\n      755.3507893880209,\n      699.7528279622396,\n      717.8043619791666,\n      616.4128214518229,\n      610.3042602539062,\n      576.1018473307291,\n      669.9233601888021,\n      662.5718587239584,\n      601.7003580729166,\n      619.9717000325521,\n      616.17236328125,\n      647.919189453125,\n      720.2536214192709,\n      676.1502685546875,\n      638.7950236002604,\n      725.2923787434896,\n      708.4577229817709,\n      657.550038655599,\n      606.4547119140625,\n      720.4335225423177,\n      638.5685424804688,\n      644.5047810872396,\n      675.3748372395834,\n      545.7748819986979,\n      775.0440673828125,\n      696.5569864908854,\n      684.9332377115885,\n      675.4559529622396,\n      626.1231486002604,\n      670.3891398111979,\n      633.9583841959635,\n      662.5113728841146,\n      625.4878743489584,\n      723.1873982747396,\n      747.1727294921875,\n      709.0388793945312,\n      723.6645100911459,\n      625.6194051106771,\n      593.6909993489584,\n      727.9166056315104,\n      689.2951049804688,\n      630.6175944010416,\n      730.9366251627604,\n      689.8072102864584,\n      696.1072387695312,\n      611.1871439615885,\n      745.7859090169271,\n      636.9237670898438,\n      793.2918090820312,\n      752.7328694661459,\n      767.5932413736979,\n      738.7565307617188,\n      687.819091796875,\n      690.4653930664062,\n      769.5697631835938\n    ],\n    \"return_min\": [\n      0.0,\n      -651.7377904873346,\n      -218.19490330067313,\n      -230.40671483223093,\n      -105.68158272497024,\n      -210.8956213842916,\n      -233.1911752316335,\n      -161.58889381026177,\n      26.55360975511062,\n      4.289192049179803,\n      -94.47579688203697,\n      4.849184562816511,\n      69.07344829711946,\n      -22.40429315765919,\n      
45.086476580251066,\n      -54.71491217125336,\n      125.38167282396118,\n      -98.6124742090824,\n      -38.373892937135324,\n      145.39119983240533,\n      134.49273414913256,\n      -59.93287233128325,\n      171.49033469732623,\n      133.5127446996148,\n      206.89851474999222,\n      177.69275565837683,\n      239.91640177783498,\n      218.16699335525163,\n      125.28320106593608,\n      177.69565455876588,\n      91.22711231593178,\n      174.67941277233874,\n      224.61917253986607,\n      176.72156917446412,\n      178.20974585184655,\n      162.2428865098692,\n      308.5236004576717,\n      101.43215919483697,\n      205.59336156515505,\n      218.07057710774632,\n      76.95591916101498,\n      102.58860226185669,\n      256.5846713119894,\n      279.8230541236833,\n      288.91774124081064,\n      285.3080260060623,\n      376.76426537108176,\n      292.3935583245744,\n      229.1453409203956,\n      294.52281767361126,\n      302.78698861571456,\n      356.1194735219518,\n      69.45947274408792,\n      383.9554921423231,\n      298.3401702958506,\n      321.4518209660676,\n      315.3049143709103,\n      346.0651825103665,\n      151.19494125741988,\n      357.1257353560337,\n      390.8240639442132,\n      348.42316335307174,\n      298.91965969292187,\n      462.3853769182599,\n      251.6649901787817,\n      479.78588898797443,\n      397.877011500767,\n      215.596613915609,\n      323.2564485248071,\n      186.6042427741824,\n      426.8233101559147,\n      362.0302869053308,\n      404.98571774036856,\n      359.2847078708686,\n      484.5054633019436,\n      493.2501176605723,\n      411.44129488977313,\n      256.5276751004272,\n      454.29501648725363,\n      413.8562263314633,\n      573.720797985042,\n      416.46692181466364,\n      467.0122085781444,\n      258.0562843972043,\n      395.2243701392048,\n      368.5233705526356,\n      452.636496904995,\n      464.5032990675227,\n      436.79629123776,\n      
374.69121493690363,\n      466.7821493928696,\n      263.51951290052034,\n      429.315411959895,\n      304.9247130991566,\n      419.90325565387303,\n      432.2364118278375,\n      338.1089545341084,\n      364.2132868278239,\n      393.04972714186283,\n      359.9071598690796,\n      533.942556029381,\n      468.2753658460946,\n      261.8154182211118,\n      522.5536947167411,\n      557.584211053626,\n      510.90186706032495,\n      524.9047980503793,\n      487.37465424235245,\n      467.86209848079557,\n      545.6308974446529,\n      377.0755931355628,\n      422.04626971453774,\n      406.3920283244486,\n      444.8626825076674,\n      540.525108305069,\n      420.5410566062819,\n      489.81667849244985,\n      441.43925917296247,\n      289.6483089231782,\n      516.6815783827745,\n      591.9147876948792,\n      376.4286871094273,\n      534.641886307782,\n      388.33717493137976,\n      322.51834109876495,\n      523.6089601382292,\n      557.0252517096708,\n      537.192194230057,\n      479.63953220680753,\n      489.9297848053796,\n      456.3615203978124,\n      452.74782656646664,\n      437.8910113628925,\n      444.34476831068963,\n      622.8334792297339,\n      544.8195929605185,\n      599.7323369153378,\n      512.7902955981406,\n      406.1789579431253,\n      452.78874036721595,\n      582.1919827487237,\n      594.584856475092,\n      518.5465718482806,\n      598.0693065627131,\n      472.0681568588566,\n      485.4507956994522,\n      590.2059351368265,\n      586.9153657898868,\n      589.259669009053,\n      635.5611082804211,\n      623.3885305260387,\n      452.57285763522157,\n      565.053967031291,\n      539.3486244195427,\n      504.29379132207185,\n      558.5882700286743,\n      508.762833637914,\n      356.62391796920565,\n      629.1636777085658,\n      556.0355628400487,\n      444.2008257553892,\n      524.9059378555166,\n      404.75813990024164,\n      517.9978694633127,\n      513.871824831959,\n      
612.3382451960599,\n      516.9922775142898,\n      680.363477847471,\n      627.5686018024139,\n      600.6839979902597,\n      614.6357271298568,\n      381.38851880783363,\n      525.7235252030711,\n      636.9343357735295,\n      617.0474412446536,\n      520.9154402643403,\n      551.0778985607511,\n      616.0456681349963,\n      608.2566427500684,\n      506.0818447569322,\n      715.3050290539916,\n      469.6640181908483,\n      710.4375439178509,\n      677.2274044637059,\n      715.0402021473672,\n      669.0276233195295,\n      590.3275579897734,\n      592.3499843504039,\n      654.5575435877125\n    ],\n    \"return_max\": [\n      0.0,\n      -279.6696395256863,\n      -201.13384222341543,\n      -200.30660303886282,\n      -53.95061179407273,\n      16.30258600424601,\n      -109.02112538685606,\n      44.54137539798962,\n      44.23999096306972,\n      130.67802952878245,\n      130.44380683076744,\n      87.19618744646084,\n      144.3339868021644,\n      126.59720748464485,\n      130.05800126398722,\n      174.90163397301117,\n      279.42458124504924,\n      170.18975488942095,\n      193.94899669690096,\n      252.88627616043323,\n      187.68857078250807,\n      47.27047558242258,\n      283.09865456048624,\n      316.5549280298774,\n      415.39003912370566,\n      289.10160672769086,\n      538.2514042117484,\n      493.77587773849837,\n      399.4921386964337,\n      524.7810707016507,\n      540.3916824431828,\n      625.5695699750571,\n      572.4416795109152,\n      639.9252203763172,\n      768.2139958066821,\n      592.8568144178652,\n      662.658467413422,\n      699.8582053884963,\n      710.6916522694803,\n      660.0663654378266,\n      576.6989778930215,\n      705.1061100591069,\n      703.9142198826721,\n      711.2375334414209,\n      678.6475053412206,\n      780.8672262725835,\n      567.2680018815224,\n      699.9401208746443,\n      795.3840780249169,\n      734.6475517925345,\n      637.5889065665771,\n      
733.0946581512252,\n      652.1249592220579,\n      608.1757741337185,\n      710.6392506839671,\n      772.1741352513804,\n      752.4573943856002,\n      724.8383005625502,\n      792.5329589297546,\n      758.377153641362,\n      868.8243125206305,\n      721.1132746352096,\n      723.3920570225729,\n      665.8358551780943,\n      737.1483134507755,\n      825.5775957125463,\n      645.104922092983,\n      825.9970699629719,\n      800.7656055116511,\n      726.804555443591,\n      784.8701305992936,\n      918.5632189540443,\n      802.835693392444,\n      790.2517504624647,\n      737.6845187944107,\n      769.316125829011,\n      799.3738906571018,\n      836.448409290849,\n      848.9475372236839,\n      836.710973375568,\n      831.8444282519372,\n      784.3067598259613,\n      837.6077051588348,\n      721.9570619569623,\n      790.9004467553265,\n      714.4278216674164,\n      722.1444275741717,\n      759.6502653856023,\n      717.6898741268233,\n      769.1155070683046,\n      784.9625202034847,\n      712.7736389549484,\n      756.4005304229174,\n      715.824198440557,\n      627.0422196065438,\n      775.3518043179957,\n      757.6187065987041,\n      802.3244328987387,\n      747.0752932031892,\n      695.2010147728474,\n      630.6815894784315,\n      844.1944827867179,\n      755.0514030679507,\n      693.3507649186756,\n      686.177507696374,\n      761.7852870737895,\n      806.4636508428499,\n      690.5801797420226,\n      805.1783067926419,\n      778.7215602376389,\n      745.3811532837079,\n      774.7946930133268,\n      687.4766850544577,\n      808.5315639116034,\n      765.226356538681,\n      513.1939084002285,\n      559.0108565010396,\n      723.7694810614125,\n      779.2409733033843,\n      713.1035982122775,\n      755.315518294704,\n      718.7097406249477,\n      735.9560954630513,\n      833.8899572300785,\n      702.16598507311,\n      781.141365382604,\n      778.825252847621,\n      761.623235457443,\n      
679.6110374546507,\n      735.8756554615474,\n      773.785350533177,\n      631.2230596640021,\n      737.8586631162742,\n      740.6320383299353,\n      887.8680995463078,\n      854.6860629639607,\n      835.8763870429955,\n      720.0353473055052,\n      814.4295625646872,\n      699.4149542942423,\n      757.6547376288805,\n      730.5588609728247,\n      684.8541442975527,\n      641.8740935023911,\n      760.2765697036434,\n      810.3875832067978,\n      850.3013077017152,\n      765.3851713194882,\n      688.3303781914677,\n      815.0236492065582,\n      793.526915437503,\n      862.5272196759764,\n      647.855456796834,\n      901.5184206650928,\n      772.8432936388656,\n      730.421292145805,\n      841.9868408412527,\n      734.9258460281901,\n      920.9244570570592,\n      837.0784101417221,\n      925.6656496677879,\n      826.0059680689626,\n      847.4881573002791,\n      822.7804101590831,\n      754.044943559968,\n      712.6845005721693,\n      733.983471183627,\n      766.0113187020082,\n      866.7768571819611,\n      817.3937607988028,\n      832.6932930524349,\n      869.8502914135206,\n      661.6584734948457,\n      818.8988754894913,\n      761.542768716284,\n      740.319748537743,\n      910.7953517647696,\n      763.5687524379205,\n      783.9578347889941,\n      716.2924431662449,\n      776.2667889798627,\n      804.1835159888392,\n      876.1460742462116,\n      828.2383344685859,\n      820.1462806000286,\n      808.485438203908,\n      785.3106256039766,\n      788.5808017824086,\n      884.581982779475\n    ]\n  },\n  \"h1hand_cabinet\": {\n    \"time\": [\n      0.0,\n      482.0103869639,\n      964.0207739278,\n      1446.0311608917,\n      1928.0415478556,\n      2410.0519348195,\n      2892.0623217834,\n      3374.0727087472997,\n      3856.0830957112,\n      4338.0934826751,\n      4820.103869639,\n      5302.114256602899,\n      5784.1246435668,\n      6266.1350305307,\n      6748.145417494599,\n      
7230.155804458499,\n      7712.1661914224,\n      8194.1765783863,\n      8676.1869653502,\n      9158.197352314099,\n      9640.207739278\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      48.33211135864258,\n      41.21469751993815,\n      59.57719930013021,\n      75.67262268066406,\n      56.34848594665527,\n      81.90048472086589,\n      80.5737673441569,\n      81.59727350870769,\n      73.00228627522786,\n      88.43795522054036,\n      141.72611236572266,\n      114.48175557454427,\n      148.1658706665039,\n      112.07423400878906,\n      155.38145955403647,\n      130.3720957438151,\n      148.42277018229166,\n      155.57244873046875,\n      131.13333638509116,\n      188.2349853515625\n    ],\n    \"return_min\": [\n      0.0,\n      35.922316004492785,\n      28.06647614718972,\n      37.90235693360684,\n      44.75038850207668,\n      36.40830307855872,\n      61.24509401202533,\n      63.87340072139955,\n      61.64813090949677,\n      32.291284324269796,\n      66.49554026294187,\n      118.72944532889935,\n      74.6534328246156,\n      99.11059146542257,\n      80.17006584290101,\n      130.34182639754522,\n      98.22037494029969,\n      132.83954440041933,\n      144.75589448674302,\n      102.16349649420323,\n      136.03014741728717\n    ],\n    \"return_max\": [\n      0.0,\n      60.74190671279237,\n      54.36291889268658,\n      81.25204166665358,\n      106.59485685925145,\n      76.28866881475183,\n      102.55587542970645,\n      97.27413396691426,\n      101.5464161079186,\n      113.71328822618592,\n      110.38037017813885,\n      
164.72277940254597,\n      154.31007832447293,\n      197.22114986758524,\n      143.97840217467711,\n      180.42109271052772,\n      162.5238165473305,\n      164.005995964164,\n      166.38900297419448,\n      160.10317627597908,\n      240.43982328583783\n    ]\n  },\n  \"h1hand_door\": {\n    \"time\": [\n      0.0,\n      474.1971419775,\n      948.394283955,\n      1422.5914259325,\n      1896.78856791,\n      2370.9857098875,\n      2845.182851865,\n      3319.3799938425,\n      3793.57713582,\n      4267.7742777975,\n      4741.971419775,\n      5216.1685617525,\n      5690.36570373,\n      6164.5628457075,\n      6638.759987685,\n      7112.9571296625,\n      7587.15427164,\n      8061.3514136175,\n      8535.548555595,\n      9009.7456975725,\n      9483.94283955\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      113.92080434163411,\n      225.69124348958334,\n      249.13692220052084,\n      276.68773396809894,\n      287.51111857096356,\n      287.70652262369794,\n      298.76854451497394,\n      303.0335388183594,\n      305.23622639973956,\n      311.20665486653644,\n      312.6994222005208,\n      316.5781656901042,\n      321.4351806640625,\n      323.1549580891927,\n      328.4767761230469,\n      331.2190246582031,\n      332.8224283854167,\n      333.1351013183594,\n      330.49583943684894,\n      331.7462666829427\n    ],\n    \"return_min\": [\n      0.0,\n      83.71072308019035,\n      208.12097190097407,\n      239.92148547332886,\n      268.6843518582533,\n      280.06576497283703,\n      282.01917265202314,\n      295.1558484227942,\n      
298.3069802027448,\n      300.2730247321967,\n      303.5733979888646,\n      300.2010388008713,\n      303.3949109736265,\n      311.38583689703665,\n      308.21839791204366,\n      310.94946877443516,\n      310.7380146256782,\n      314.98049407945916,\n      316.6951429319435,\n      312.89852900846034,\n      317.0532202601144\n    ],\n    \"return_max\": [\n      0.0,\n      144.13088560307787,\n      243.26151507819262,\n      258.3523589277128,\n      284.69111607794457,\n      294.9564721690901,\n      293.3938725953727,\n      302.3812406071537,\n      307.76009743397395,\n      310.19942806728244,\n      318.8399117442083,\n      325.19780560017034,\n      329.76142040658186,\n      331.48452443108835,\n      338.0915182663417,\n      346.0040834716586,\n      351.70003469072805,\n      350.6643626913742,\n      349.57505970477524,\n      348.09314986523754,\n      346.43931310577096\n    ]\n  },\n  \"h1hand_truck\": {\n    \"time\": [\n      0.0,\n      356.2344639081333,\n      712.4689278162666,\n      1068.7033917244,\n      1424.9378556325332,\n      1781.1723195406666,\n      2137.4067834488,\n      2493.6412473569335,\n      2849.8757112650665,\n      3206.1101751732,\n      3562.3446390813333,\n      3918.5791029894667,\n      4274.8135668976,\n      4631.048030805733,\n      4987.282494713867,\n      5343.516958622,\n      5699.751422530133,\n      6055.985886438267,\n      6412.2203503464,\n      6768.454814254534,\n      7124.689278162667,\n      7480.9237420708,\n      7837.158205978933,\n      8193.392669887067,\n      8549.6271337952,\n      8905.861597703333,\n      9262.096061611466,\n      9618.3305255196,\n      9974.564989427734,\n      10330.799453335867,\n      10687.033917244,\n      11043.268381152133,\n      11399.502845060266,\n      11755.7373089684,\n      12111.971772876534,\n      12468.206236784667,\n      12824.4407006928,\n      13180.675164600932,\n      13536.909628509067,\n      13893.1440924172,\n      
14249.378556325333,\n      14605.613020233466,\n      14961.8474841416,\n      15318.081948049734,\n      15674.316411957867,\n      16030.550875866,\n      16386.785339774135,\n      16743.019803682266,\n      17099.2542675904,\n      17455.48873149853,\n      17811.723195406666,\n      18167.957659314798,\n      18524.192123222932,\n      18880.426587131067,\n      19236.6610510392,\n      19592.89551494733,\n      19949.129978855468,\n      20305.364442763603,\n      20661.598906671734,\n      21017.833370579865,\n      21374.067834488\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0,\n      20480000.0,\n      21120000.0,\n      21760000.0,\n      22400000.0,\n      23040000.0,\n      23680000.0,\n      24320000.0,\n      24960000.0,\n      25600000.0,\n      26240000.0,\n      26880000.0,\n      27520000.0,\n      28160000.0,\n      28800000.0,\n      29440000.0,\n      30080000.0,\n      30720000.0,\n      31360000.0,\n      32000000.0,\n      32640000.0,\n      33280000.0,\n      33920000.0,\n      34560000.0,\n      35200000.0,\n      35840000.0,\n      36480000.0,\n      37120000.0,\n      37760000.0,\n      38400000.0\n    ],\n    \"return\": [\n      0.0,\n      744.8859558105469,\n      796.5100860595703,\n      910.9717559814453,\n      975.1925201416016,\n      1047.324951171875,\n      1099.5369873046875,\n      1148.0682067871094,\n      1201.0522766113281,\n      1233.505615234375,\n      
1255.5220031738281,\n      1290.6919860839844,\n      1311.6923828125,\n      1338.3526000976562,\n      1359.3213195800781,\n      1323.95654296875,\n      1365.9458312988281,\n      1372.3952026367188,\n      1371.4238586425781,\n      1380.5548706054688,\n      1387.2607421875,\n      1377.4356384277344,\n      1405.5323791503906,\n      1398.5776977539062,\n      1386.4181823730469,\n      1399.1890563964844,\n      1414.0697021484375,\n      1413.0535888671875,\n      1426.5853576660156,\n      1436.8981628417969,\n      1447.1106567382812,\n      1452.8192138671875,\n      1458.5155639648438,\n      1482.8174133300781,\n      1488.4780883789062,\n      1509.2735900878906,\n      1529.7430114746094,\n      1542.8193664550781,\n      1523.0067443847656,\n      1534.5915832519531,\n      1547.8734130859375,\n      1558.2222290039062,\n      1559.4790344238281,\n      1521.8999328613281,\n      1579.9296264648438,\n      1568.0743103027344,\n      1592.0011596679688,\n      1607.4393005371094,\n      1601.7018432617188,\n      1628.3120422363281,\n      1632.6797180175781,\n      1637.2363586425781,\n      1640.4541015625,\n      1648.7554626464844,\n      1637.6644287109375,\n      1660.0960693359375,\n      1661.5935363769531,\n      1654.3347778320312,\n      1661.7125244140625,\n      1665.4105834960938,\n      1686.1488037109375\n    ],\n    \"return_min\": [\n      0.0,\n      728.8708482176052,\n      762.0585259369802,\n      843.073411732124,\n      889.465099012336,\n      998.3990660977275,\n      1063.9706960921915,\n      1116.6410384966878,\n      1161.0117972409382,\n      1195.7086321791687,\n      1226.7667966679335,\n      1246.3428404936096,\n      1272.6613742372551,\n      1296.5317436158239,\n      1329.0087405178306,\n      1273.7531020924337,\n      1335.3961578228586,\n      1342.6771680749928,\n      1338.9942289987519,\n      1354.398859758841,\n      1352.2997701371219,\n      1322.036434338705,\n      1362.2402065457197,\n      
1348.2375712433736,\n      1328.8611382153329,\n      1353.1487389210438,\n      1362.2909765642328,\n      1378.9425298602578,\n      1373.9676375490837,\n      1387.940960280719,\n      1394.5923024990384,\n      1392.6599873879725,\n      1397.4042979437356,\n      1416.8406937577124,\n      1436.3985979041825,\n      1451.1102549614811,\n      1462.6237026144347,\n      1480.086465539093,\n      1446.3469016733263,\n      1463.1031736967145,\n      1480.826246455255,\n      1507.2520940243835,\n      1484.954383969524,\n      1427.5705673711657,\n      1514.958677668042,\n      1495.3212565792096,\n      1524.5236672425701,\n      1493.734800162037,\n      1512.489867922995,\n      1521.317076862614,\n      1501.4009697149443,\n      1512.9679254210876,\n      1523.2808539998027,\n      1516.759711486079,\n      1521.9907509338339,\n      1541.900269758106,\n      1540.4093855281021,\n      1540.2998088326399,\n      1549.106670304241,\n      1539.2902805167103,\n      1557.2959035745841\n    ],\n    \"return_max\": [\n      0.0,\n      760.9010634034886,\n      830.9616461821604,\n      978.8701002307666,\n      1060.9199412708672,\n      1096.2508362460223,\n      1135.1032785171835,\n      1179.495375077531,\n      1241.092755981718,\n      1271.3025982895813,\n      1284.2772096797228,\n      1335.0411316743591,\n      1350.7233913877449,\n      1380.1734565794886,\n      1389.6338986423257,\n      1374.1599838450663,\n      1396.4955047747976,\n      1402.1132371984447,\n      1403.8534882864044,\n      1406.7108814520966,\n      1422.2217142378781,\n      1432.8348425167637,\n      1448.8245517550615,\n      1448.9178242644389,\n      1443.975226530761,\n      1445.229373871925,\n      1465.8484277326422,\n      1447.1646478741172,\n      1479.2030777829475,\n      1485.8553654028747,\n      1499.629010977524,\n      1512.9784403464025,\n      1519.6268299859519,\n      1548.7941329024438,\n      1540.55757885363,\n      1567.4369252143001,\n      
1596.862320334784,\n      1605.5522673710632,\n      1599.666587096205,\n      1606.0799928071917,\n      1614.92057971662,\n      1609.192363983429,\n      1634.0036848781322,\n      1616.2292983514906,\n      1644.9005752616456,\n      1640.8273640262591,\n      1659.4786520933674,\n      1721.1438009121819,\n      1690.9138186004425,\n      1735.3070076100423,\n      1763.958466320212,\n      1761.5047918640687,\n      1757.6273491251973,\n      1780.7512138068898,\n      1753.3381064880411,\n      1778.291868913769,\n      1782.7776872258041,\n      1768.3697468314226,\n      1774.318378523884,\n      1791.5308864754772,\n      1815.0017038472909\n    ]\n  },\n  \"h1hand_cube\": {\n    \"time\": [\n      0.0,\n      383.7138327153,\n      767.4276654306,\n      1151.1414981459,\n      1534.8553308612,\n      1918.5691635765002,\n      2302.2829962918,\n      2685.9968290071,\n      3069.7106617224,\n      3453.4244944377,\n      3837.1383271530003,\n      4220.8521598683,\n      4604.5659925836,\n      4988.2798252989,\n      5371.9936580142,\n      5755.7074907295,\n      6139.4213234448,\n      6523.1351561600995,\n      6906.8489888754,\n      7290.5628215907,\n      7674.276654306001\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      13.080046653747559,\n      17.67292849222819,\n      21.029696782430012,\n      22.918834050496418,\n      24.50208346048991,\n      26.14434051513672,\n      35.192575454711914,\n      47.09134292602539,\n      60.91011047363281,\n      78.56637700398763,\n      158.25643920898438,\n      161.40049743652344,\n      
135.81275049845377,\n      156.24981689453125,\n      225.83556111653647,\n      166.00001017252603,\n      213.50342814127603,\n      196.95799255371094,\n      205.33394368489584,\n      221.19602966308594\n    ],\n    \"return_min\": [\n      0.0,\n      11.60227465184281,\n      16.627264717234375,\n      20.016001116689015,\n      22.27803754584949,\n      23.950662436438055,\n      23.630521485969602,\n      23.826143545159375,\n      31.20715640142832,\n      43.19868665817663,\n      46.30510159872897,\n      130.83772424309208,\n      124.81023014354234,\n      58.897895471155806,\n      106.57978648931135,\n      219.94269343957157,\n      148.87724150076664,\n      193.22438191225328,\n      179.88170809603196,\n      178.60399498737277,\n      210.30042873135127\n    ],\n    \"return_max\": [\n      0.0,\n      14.557818655652307,\n      18.718592267222007,\n      22.04339244817101,\n      23.559630555143347,\n      25.053504484541765,\n      28.658159544303835,\n      46.55900736426445,\n      62.97552945062246,\n      78.621534289089,\n      110.82765240924628,\n      185.67515417487667,\n      197.99076472950452,\n      212.72760552575173,\n      205.91984729975115,\n      231.72842879350137,\n      183.12277884428542,\n      233.7824743702988,\n      214.03427701138992,\n      232.0638923824189,\n      232.0916305948206\n    ]\n  },\n  \"h1hand_bookshelf_simple\": {\n    \"time\": [\n      0.0,\n      607.6057830167999,\n      1215.2115660335999,\n      1822.8173490504,\n      2430.4231320671997,\n      3038.028915084,\n      3645.6346981008,\n      4253.2404811176,\n      4860.8462641343995,\n      5468.4520471512,\n      6076.057830168,\n      6683.663613184801,\n      7291.2693962016,\n      7898.875179218399,\n      8506.4809622352,\n      9114.086745252,\n      9721.692528268799,\n      10329.298311285598,\n      10936.9040943024,\n      11544.5098773192,\n      12152.115660336\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      
1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      72.59014129638672,\n      632.5331013997396,\n      482.05072021484375,\n      417.5148289998372,\n      487.9495900472005,\n      716.89990234375,\n      602.3393147786459,\n      723.6810099283854,\n      746.5069580078125,\n      680.5458170572916,\n      555.5288289388021,\n      762.16845703125,\n      761.0180053710938,\n      791.8561401367188,\n      807.6997273763021,\n      812.0292765299479,\n      816.4413655598959,\n      816.0659993489584,\n      819.2063598632812,\n      821.7978922526041\n    ],\n    \"return_min\": [\n      0.0,\n      31.586202163025966,\n      591.3560061650909,\n      333.63710951300334,\n      182.78757131870665,\n      272.4163623170391,\n      704.8062688039831,\n      466.3347552851721,\n      701.2404371259585,\n      723.2593912345505,\n      616.9144495010835,\n      362.17470111103154,\n      737.8159270729626,\n      749.7294135850707,\n      778.164806689675,\n      801.360166533209,\n      804.7945123252052,\n      799.3652871282668,\n      800.0447951141915,\n      800.9554924523965,\n      804.5781225227375\n    ],\n    \"return_max\": [\n      0.0,\n      113.59408042974746,\n      673.7101966343884,\n      630.4643309166842,\n      652.2420866809678,\n      703.4828177773619,\n      728.9935358835169,\n      738.3438742721196,\n      746.1215827308123,\n      769.7545247810745,\n      744.1771846134998,\n      748.8829567665728,\n      786.5209869895374,\n      772.3065971571168,\n      805.5474735837626,\n      814.0392882193952,\n      819.2640407346905,\n      833.5174439915249,\n      832.0872035837252,\n      837.457227274166,\n    
  839.0176619824707\n    ]\n  },\n  \"h1hand_bookshelf_hard\": {\n    \"time\": [\n      0.0,\n      543.7358112938999,\n      1087.4716225877999,\n      1631.2074338817,\n      2174.9432451755997,\n      2718.6790564694998,\n      3262.4148677634,\n      3806.1506790573,\n      4349.886490351199,\n      4893.6223016450995,\n      5437.3581129389995,\n      5981.0939242329005,\n      6524.8297355268,\n      7068.5655468207,\n      7612.3013581146,\n      8156.0371694085,\n      8699.772980702399,\n      9243.5087919963,\n      9787.244603290199,\n      10330.9804145841,\n      10874.716225877999,\n      11418.452037171899,\n      11962.187848465801,\n      12505.9236597597,\n      13049.6594710536,\n      13593.395282347501,\n      14137.1310936414,\n      14680.8669049353,\n      15224.6027162292,\n      15768.3385275231,\n      16312.074338817\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      115.04310989379883,\n      414.3397623697917,\n      392.4563496907552,\n      384.3321024576823,\n      445.73602294921875,\n      444.00734456380206,\n      507.5265604654948,\n      502.8351338704427,\n      563.632080078125,\n      479.8868408203125,\n      540.4158935546875,\n      504.34425862630206,\n      576.1414082845052,\n      565.2438761393229,\n      577.5482991536459,\n      557.5013427734375,\n      589.2844645182291,\n      630.3101806640625,\n      653.0324300130209,\n      
625.0805460611979,\n      712.5912882486979,\n      699.0077311197916,\n      691.9513346354166,\n      665.5681762695312,\n      712.416259765625,\n      694.9320068359375,\n      723.7599080403646,\n      704.8312174479166,\n      718.2322794596354,\n      723.0150553385416\n    ],\n    \"return_min\": [\n      0.0,\n      43.59245386852548,\n      332.4218818003097,\n      299.83306064986766,\n      259.7826089424531,\n      356.7439063459728,\n      399.02623029313395,\n      478.43961173592857,\n      480.8452493513443,\n      529.9966373726622,\n      431.6378949090861,\n      494.5170640027713,\n      422.71565001450017,\n      521.9929747241146,\n      534.9803965407618,\n      513.9383974775631,\n      536.6718386033316,\n      543.0112313037442,\n      612.9851065450109,\n      646.1570432133583,\n      611.3883743554222,\n      687.0628606877813,\n      676.3860733803567,\n      672.6699943456148,\n      627.8278602083814,\n      698.0179459658565,\n      658.5159895805781,\n      707.7512852510698,\n      689.8015943037292,\n      706.8565054687476,\n      705.8978014554126\n    ],\n    \"return_max\": [\n      0.0,\n      186.4937659190722,\n      496.2576429392737,\n      485.0796387316427,\n      508.8815959729115,\n      534.7281395524647,\n      488.9884588344702,\n      536.613509195061,\n      524.8250183895411,\n      597.2675227835878,\n      528.1357867315388,\n      586.3147231066037,\n      585.972867238104,\n      630.2898418448959,\n      595.5073557378839,\n      641.1582008297287,\n      578.3308469435434,\n      635.557697732714,\n      647.6352547831141,\n      659.9078168126834,\n      638.7727177669735,\n      738.1197158096145,\n      721.6293888592265,\n      711.2326749252185,\n      703.3084923306811,\n      726.8145735653935,\n      731.3480240912969,\n      739.7685308296594,\n      719.8608405921041,\n      729.6080534505231,\n      740.1323092216707\n    ]\n  },\n  \"h1hand_basketball\": {\n    \"time\": [\n      0.0,\n      
302.93849988592,\n      605.87699977184,\n      908.8154996577599,\n      1211.75399954368,\n      1514.6924994295998,\n      1817.6309993155198,\n      2120.56949920144,\n      2423.50799908736,\n      2726.44649897328,\n      3029.3849988591996,\n      3332.32349874512,\n      3635.2619986310397,\n      3938.20049851696,\n      4241.13899840288,\n      4544.0774982888,\n      4847.01599817472,\n      5149.95449806064,\n      5452.89299794656,\n      5755.83149783248,\n      6058.769997718399,\n      6361.708497604321,\n      6664.64699749024,\n      6967.58549737616,\n      7270.523997262079,\n      7573.462497147999,\n      7876.40099703392,\n      8179.33949691984,\n      8482.27799680576,\n      8785.216496691679,\n      9088.1549965776,\n      9391.09349646352,\n      9694.03199634944,\n      9996.970496235359,\n      10299.90899612128,\n      10602.847496007198,\n      10905.78599589312,\n      11208.72449577904,\n      11511.66299566496,\n      11814.60149555088,\n      12117.539995436799,\n      12420.478495322719,\n      12723.416995208641,\n      13026.35549509456,\n      13329.29399498048,\n      13632.232494866399,\n      13935.17099475232,\n      14238.10949463824,\n      14541.047994524159,\n      14843.98649441008,\n      15146.924994295998\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0,\n      20480000.0,\n      21120000.0,\n      21760000.0,\n      22400000.0,\n      23040000.0,\n      
23680000.0,\n      24320000.0,\n      24960000.0,\n      25600000.0,\n      26240000.0,\n      26880000.0,\n      27520000.0,\n      28160000.0,\n      28800000.0,\n      29440000.0,\n      30080000.0,\n      30720000.0,\n      31360000.0,\n      32000000.0\n    ],\n    \"return\": [\n      0.0,\n      15.71937624613444,\n      20.09008280436198,\n      28.070403416951496,\n      37.13821029663086,\n      48.140185038248696,\n      80.68272908528645,\n      124.40814971923828,\n      126.71758778889973,\n      161.22200520833334,\n      202.98166147867838,\n      204.3378473917643,\n      293.4285481770833,\n      315.3683675130208,\n      264.8266194661458,\n      278.62744140625,\n      291.4298909505208,\n      324.5673472086589,\n      380.23256429036456,\n      394.94849650065106,\n      298.89170837402344,\n      292.83021036783856,\n      401.73631795247394,\n      375.27061971028644,\n      386.8580322265625,\n      303.90362548828125,\n      453.88189697265625,\n      383.60216267903644,\n      323.7223765055339,\n      264.7662048339844,\n      407.06358846028644,\n      348.29319254557294,\n      415.5738118489583,\n      385.26619720458984,\n      416.16399637858075,\n      491.7182312011719,\n      459.73570760091144,\n      295.7273661295573,\n      305.8920593261719,\n      537.0791524251302,\n      566.8707377115885,\n      511.0398864746094,\n      371.42539469401044,\n      502.34424845377606,\n      437.50067138671875,\n      296.82629648844403,\n      467.6920878092448,\n      347.05420939127606,\n      577.7560831705729,\n      553.7951253255209,\n      528.9782918294271\n    ],\n    \"return_min\": [\n      0.0,\n      14.283715681857537,\n      16.631100081870674,\n      24.850849036315818,\n      25.682231879991985,\n      40.36236009161064,\n      62.133789054744106,\n      84.60940729497763,\n      81.26086178598888,\n      90.6152240448114,\n      104.67304686229286,\n      94.84658618782778,\n      198.80811217235924,\n      
260.1644753571341,\n      154.88538218147525,\n      223.0036764790103,\n      261.8258052334307,\n      207.3873303370363,\n      286.2910705112556,\n      381.83799263632784,\n      245.92185692409578,\n      247.45123796782846,\n      364.1263718288159,\n      353.57045366003666,\n      364.68906452491575,\n      209.5885532905373,\n      390.04425163109977,\n      314.08592505260197,\n      213.68833766761594,\n      214.40978111414398,\n      329.92800186753425,\n      229.6240291107526,\n      337.0526110087683,\n      199.91591366084674,\n      268.50189022268603,\n      431.10496585660894,\n      434.0950085790193,\n      121.5927500915191,\n      259.7480611862015,\n      473.60034355255095,\n      467.59375605440545,\n      416.2188627932545,\n      249.31592931518838,\n      411.0434362283788,\n      261.3459217634894,\n      176.60671178439534,\n      300.33125904403,\n      221.02134041025272,\n      518.1882874852209,\n      470.53052483418503,\n      443.62534877420796\n    ],\n    \"return_max\": [\n      0.0,\n      17.155036810411342,\n      23.549065526853287,\n      31.289957797587174,\n      48.59418871326973,\n      55.91800998488675,\n      99.2316691158288,\n      164.20689214349892,\n      172.1743137918106,\n      231.8287863718553,\n      301.29027609506386,\n      313.82910859570086,\n      388.0489841818074,\n      370.5722596689075,\n      374.76785675081635,\n      334.2512063334897,\n      321.03397666761094,\n      441.74736408028144,\n      474.1740580694735,\n      408.0590003649743,\n      351.8615598239511,\n      338.2091827678487,\n      439.346264076132,\n      396.9707857605362,\n      409.02699992820925,\n      398.21869768602517,\n      517.7195423142127,\n      453.1184003054709,\n      433.7564153434518,\n      315.12262855382477,\n      484.19917505303863,\n      466.9623559803933,\n      494.09501268914835,\n      570.616480748333,\n      563.8261025344755,\n      552.3314965457348,\n      485.3764066228036,\n      
469.86198216759556,\n      352.03605746614227,\n      600.5579612977095,\n      666.1477193687715,\n      605.8609101559642,\n      493.5348600728325,\n      593.6450606791733,\n      613.6554210099481,\n      417.04588119249274,\n      635.0529165744597,\n      473.0870783722994,\n      637.3238788559248,\n      637.0597258168567,\n      614.3312348846463\n    ]\n  },\n  \"h1hand_window\": {\n    \"time\": [\n      0.0,\n      375.65864049644,\n      751.31728099288,\n      1126.97592148932,\n      1502.63456198576,\n      1878.2932024822,\n      2253.95184297864,\n      2629.61048347508,\n      3005.26912397152,\n      3380.9277644679596,\n      3756.5864049644,\n      4132.24504546084,\n      4507.90368595728,\n      4883.56232645372,\n      5259.22096695016,\n      5634.8796074466,\n      6010.53824794304,\n      6386.196888439479,\n      6761.855528935919,\n      7137.51416943236,\n      7513.1728099288,\n      7888.831450425239,\n      8264.49009092168,\n      8640.14873141812,\n      9015.80737191456,\n      9391.466012411,\n      9767.12465290744,\n      10142.78329340388,\n      10518.44193390032,\n      10894.10057439676,\n      11269.7592148932,\n      11645.41785538964,\n      12021.07649588608,\n      12396.73513638252,\n      12772.393776878958,\n      13148.052417375398,\n      13523.711057871838,\n      13899.369698368278,\n      14275.02833886472,\n      14650.68697936116,\n      15026.3456198576,\n      15402.00426035404,\n      15777.662900850479,\n      16153.321541346919,\n      16528.98018184336,\n      16904.6388223398,\n      17280.29746283624,\n      17655.95610333268,\n      18031.61474382912,\n      18407.27338432556,\n      18782.932024822\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      
9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0,\n      19840000.0,\n      20480000.0,\n      21120000.0,\n      21760000.0,\n      22400000.0,\n      23040000.0,\n      23680000.0,\n      24320000.0,\n      24960000.0,\n      25600000.0,\n      26240000.0,\n      26880000.0,\n      27520000.0,\n      28160000.0,\n      28800000.0,\n      29440000.0,\n      30080000.0,\n      30720000.0,\n      31360000.0,\n      32000000.0\n    ],\n    \"return\": [\n      0.0,\n      9.344390233357748,\n      52.42061551411947,\n      165.07393646240234,\n      269.02171834309894,\n      336.8362223307292,\n      303.15257771809894,\n      354.14097086588544,\n      428.8276774088542,\n      233.65752283732095,\n      354.2726542154948,\n      446.46274820963544,\n      386.8551534016927,\n      488.44984944661456,\n      527.6175537109375,\n      499.61435953776044,\n      520.1758015950521,\n      533.7708740234375,\n      517.2260945638021,\n      615.2128092447916,\n      586.9806111653646,\n      631.5748697916666,\n      625.1402994791666,\n      522.4225260416666,\n      661.9009399414062,\n      621.3815714518229,\n      593.1093546549479,\n      609.5584716796875,\n      645.6303304036459,\n      606.4934692382812,\n      629.7203369140625,\n      619.4031575520834,\n      616.0424601236979,\n      619.2044474283854,\n      614.5390421549479,\n      615.4394938151041,\n      560.7685038248698,\n      474.59820556640625,\n      470.9849739074707,\n      668.3463948567709,\n      682.6567586263021,\n      464.4889322916667,\n      636.9969889322916,\n      627.0051981608073,\n      670.6390177408854,\n      633.2178141276041,\n      626.3368123372396,\n      667.2381591796875,\n      550.3759155273438,\n      638.7027587890625,\n  
    613.3283081054688\n    ],\n    \"return_min\": [\n      0.0,\n      7.793400754588625,\n      26.974647549456396,\n      96.90356411186578,\n      228.96287551323448,\n      309.31272125827473,\n      286.3705291802933,\n      326.8671206873417,\n      392.2532160264861,\n      84.46841388823327,\n      304.0752715628187,\n      367.63128755485377,\n      198.75912330174802,\n      418.10049013236585,\n      495.99406990234695,\n      466.59337810178283,\n      426.0956847450194,\n      528.027321153031,\n      427.74345783940487,\n      588.4670108977814,\n      572.1403582244072,\n      621.8291590940654,\n      580.3812827338901,\n      411.5124914178696,\n      624.5023420369643,\n      602.0108924461694,\n      561.3460675594077,\n      584.0818549693116,\n      585.6577994132044,\n      524.2934694401896,\n      556.6414078895589,\n      597.6325510119733,\n      578.4353616396595,\n      535.9672354385903,\n      548.0507414116869,\n      510.63892367992145,\n      495.25338547915845,\n      273.7842564520836,\n      166.61370163419144,\n      594.1346175295835,\n      654.6452652746367,\n      289.90572041974156,\n      555.2077996580139,\n      529.1499181682586,\n      639.152777939848,\n      553.0639730276913,\n      538.3629737339184,\n      595.1634994947292,\n      330.96572081605984,\n      596.1039158315787,\n      502.0886536727603\n    ],\n    \"return_max\": [\n      0.0,\n      10.895379712126871,\n      77.86658347878254,\n      233.24430881293893,\n      309.0805611729634,\n      364.35972340318364,\n      319.93462625590456,\n      381.4148210444292,\n      465.4021387912223,\n      382.84663178640864,\n      404.4700368681709,\n      525.2942088644171,\n      574.9511835016374,\n      558.7992087608633,\n      559.241037519528,\n      532.635340973738,\n      614.2559184450848,\n      539.514426893844,\n      606.7087312881994,\n      641.9586075918019,\n      601.820864106322,\n      641.3205804892679,\n      669.8993162244432,\n      
633.3325606654637,\n      699.2995378458482,\n      640.7522504574764,\n      624.872641750488,\n      635.0350883900634,\n      705.6028613940873,\n      688.6934690363729,\n      702.7992659385661,\n      641.1737640921934,\n      653.6495586077363,\n      702.4416594181805,\n      681.0273428982089,\n      720.2400639502868,\n      626.2836221705811,\n      675.4121546807289,\n      775.35624618075,\n      742.5581721839583,\n      710.6682519779675,\n      639.0721441635918,\n      718.7861782065694,\n      724.8604781533559,\n      702.1252575419228,\n      713.371655227517,\n      714.3106509405609,\n      739.3128188646458,\n      769.7861102386277,\n      681.3016017465463,\n      724.5679625381772\n    ]\n  },\n  \"h1hand_spoon\": {\n    \"time\": [\n      0.0,\n      322.2585738862,\n      644.5171477724,\n      966.7757216586,\n      1289.0342955448,\n      1611.292869431,\n      1933.5514433172,\n      2255.8100172034,\n      2578.0685910896,\n      2900.3271649758,\n      3222.585738862\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0\n    ],\n    \"return\": [\n      0.0,\n      122.02554003397624,\n      186.70377604166666,\n      261.64577229817706,\n      300.80299886067706,\n      328.19361368815106,\n      339.11742146809894,\n      356.2308349609375,\n      356.4991963704427,\n      361.5694986979167,\n      369.06699625651044\n    ],\n    \"return_min\": [\n      0.0,\n      34.19382261198194,\n      91.68614757119323,\n      220.42312853910647,\n      278.48433900289217,\n      320.38500523770017,\n      325.4230361788222,\n      331.20817004445007,\n      354.7223789896127,\n      349.114697321542,\n      364.1910965162153\n    ],\n    \"return_max\": [\n      0.0,\n      209.85725745597054,\n      281.7214045121401,\n      302.86841605724766,\n      323.12165871846196,\n     
 336.00222213860195,\n      352.8118067573757,\n      381.25349987742493,\n      358.27601375127267,\n      374.0243000742914,\n      373.9428959968056\n    ]\n  },\n  \"h1hand_package\": {\n    \"time\": [\n      0.0,\n      340.7822499229,\n      681.5644998458,\n      1022.3467497687001,\n      1363.1289996916,\n      1703.9112496145,\n      2044.6934995374002,\n      2385.4757494603,\n      2726.2579993832,\n      3067.0402493061,\n      3407.822499229,\n      3748.6047491519,\n      4089.3869990748003,\n      4430.1692489977,\n      4770.9514989206,\n      5111.7337488435005,\n      5452.5159987664,\n      5793.2982486893,\n      6134.0804986122,\n      6474.862748535101,\n      6815.644998458\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      -8347.962727864584,\n      -7953.91357421875,\n      -7639.56884765625,\n      -7962.0322265625,\n      -8035.910807291667,\n      -7031.6279296875,\n      -7409.0107421875,\n      -6930.870279947917,\n      -7567.949869791667,\n      -8071.77685546875,\n      -6989.133951822917,\n      -7464.599609375,\n      -7308.195963541667,\n      -7369.9765625,\n      -7056.262532552083,\n      -7344.883463541667,\n      -6283.265950520833,\n      -7419.428873697917,\n      -7106.29931640625,\n      -7487.24951171875\n    ],\n    \"return_min\": [\n      0.0,\n      -9291.683999288682,\n      -8680.473541162266,\n      -7798.53664407206,\n      -8377.834608732313,\n      -8635.369594101,\n      -7305.690398586479,\n      -7625.490297935072,\n      -7398.502723648596,\n      -7928.4496191424105,\n      -8369.074711589812,\n      
-7316.171947275012,\n      -8169.863535365709,\n      -7698.752542894817,\n      -7491.752086731574,\n      -7174.744292881087,\n      -7563.93196904456,\n      -6491.269627424492,\n      -7715.537928880829,\n      -7283.65067627879,\n      -7745.076350527821\n    ],\n    \"return_max\": [\n      0.0,\n      -7404.241456440486,\n      -7227.353607275234,\n      -7480.60105124044,\n      -7546.229844392687,\n      -7436.452020482336,\n      -6757.565460788521,\n      -7192.531186439928,\n      -6463.237836247238,\n      -7207.4501204409235,\n      -7774.478999347688,\n      -6662.095956370822,\n      -6759.335683384291,\n      -6917.6393841885165,\n      -7248.201038268426,\n      -6937.780772223079,\n      -7125.834958038774,\n      -6075.262273617174,\n      -7123.319818515005,\n      -6928.94795653371,\n      -7229.422672909679\n    ]\n  },\n  \"h1hand_powerlift\": {\n    \"time\": [\n      0.0,\n      330.8597732899667,\n      661.7195465799334,\n      992.5793198699,\n      1323.4390931598668,\n      1654.2988664498334,\n      1985.1586397398,\n      2316.0184130297666,\n      2646.8781863197337,\n      2977.7379596097003,\n      3308.597732899667,\n      3639.4575061896335,\n      3970.3172794796,\n      4301.177052769567,\n      4632.036826059533,\n      4962.896599349499,\n      5293.756372639467,\n      5624.616145929434,\n      5955.475919219401,\n      6286.335692509367,\n      6617.195465799334,\n      6948.055239089301,\n      7278.915012379267,\n      7609.774785669233,\n      7940.6345589592,\n      8271.494332249167,\n      8602.354105539134,\n      8933.2138788291,\n      9264.073652119067,\n      9594.933425409034,\n      9925.793198698999\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n    
  10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0,\n      13440000.0,\n      14080000.0,\n      14720000.0,\n      15360000.0,\n      16000000.0,\n      16640000.0,\n      17280000.0,\n      17920000.0,\n      18560000.0,\n      19200000.0\n    ],\n    \"return\": [\n      0.0,\n      113.51462300618489,\n      121.9887212117513,\n      114.68843332926433,\n      136.49109395345053,\n      133.07645670572916,\n      150.48657735188803,\n      147.2944081624349,\n      160.43098958333334,\n      160.48809814453125,\n      171.36279296875,\n      173.65789794921875,\n      175.5028839111328,\n      205.1602579752604,\n      244.93697102864584,\n      211.19544474283853,\n      298.95766194661456,\n      304.22186279296875,\n      269.803716023763,\n      286.91443888346356,\n      311.9042460123698,\n      286.8816223144531,\n      310.73230997721356,\n      300.2124938964844,\n      296.96824137369794,\n      301.94849650065106,\n      325.45009358723956,\n      312.3852844238281,\n      329.8019307454427,\n      309.90221150716144,\n      324.70689900716144\n    ],\n    \"return_min\": [\n      0.0,\n      98.60086063853973,\n      114.55728915382741,\n      95.01551471352975,\n      131.31232655463052,\n      132.52894643766788,\n      145.7905631000291,\n      133.60270472550508,\n      157.63405153985775,\n      159.3252445285353,\n      161.9094348876847,\n      170.73693424256538,\n      159.80489922254142,\n      193.6200625634648,\n      219.08587047631957,\n      191.63435289037298,\n      270.29798851974755,\n      286.12178639622397,\n      237.38621982654573,\n      260.68960010790164,\n      304.83943049232437,\n      273.19842889543116,\n      303.7634891685769,\n      274.43171223168747,\n      285.8977429185736,\n      278.42592868276324,\n      313.4876563980677,\n      295.64470743599946,\n      324.5820886809485,\n      283.7021672876653,\n      312.297612561625\n    ],\n    \"return_max\": [\n      0.0,\n    
  128.42838537383005,\n      129.42015326967518,\n      134.3613519449989,\n      141.66986135227054,\n      133.62396697379043,\n      155.18259160374697,\n      160.98611159936473,\n      163.22792762680893,\n      161.6509517605272,\n      180.8161510498153,\n      176.57886165587212,\n      191.2008685997242,\n      216.700453387056,\n      270.7880715809721,\n      230.75653659530408,\n      327.6173353734816,\n      322.32193918971353,\n      302.2212122209803,\n      313.1392776590255,\n      318.96906153241525,\n      300.5648157334751,\n      317.70113078585024,\n      325.9932755612813,\n      308.03873982882226,\n      325.4710643185389,\n      337.4125307764114,\n      329.1258614116568,\n      335.02177280993686,\n      336.1022557266576,\n      337.1161854526979\n    ]\n  },\n  \"h1hand_room\": {\n    \"time\": [\n      0.0,\n      465.0374399698,\n      930.0748799396,\n      1395.1123199094,\n      1860.1497598792,\n      2325.187199849,\n      2790.2246398188,\n      3255.2620797886,\n      3720.2995197584,\n      4185.3369597282,\n      4650.374399698,\n      5115.4118396678,\n      5580.4492796376,\n      6045.4867196074,\n      6510.5241595772,\n      6975.561599547,\n      7440.5990395168,\n      7905.6364794866,\n      8370.6739194564,\n      8835.711359426201,\n      9300.748799396\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      13.283664067586264,\n      16.206466674804688,\n      15.741510391235352,\n      17.81355921427409,\n      17.636091868082683,\n      20.84869448343913,\n      21.28005091349284,\n      29.286340077718098,\n      
36.723029454549156,\n      55.41912968953451,\n      85.37582143147786,\n      111.54248301188152,\n      156.0373992919922,\n      161.06229146321616,\n      165.90640258789062,\n      175.1955108642578,\n      181.9734141031901,\n      174.88094584147134,\n      183.4038848876953,\n      176.96692911783853\n    ],\n    \"return_min\": [\n      0.0,\n      10.70380901646449,\n      15.550924139783536,\n      13.495577650634877,\n      15.432328173343322,\n      15.783839514520572,\n      17.184290172785886,\n      20.759621257856587,\n      14.243299280874174,\n      26.23099362535826,\n      21.424434200721606,\n      46.457612262779534,\n      76.395700351806,\n      142.86692395305536,\n      149.31479438123523,\n      150.45741972383985,\n      170.95027780126063,\n      173.86462173894353,\n      169.2223202556131,\n      179.6609978535562,\n      165.3778977118618\n    ],\n    \"return_max\": [\n      0.0,\n      15.863519118708037,\n      16.86200920982584,\n      17.98744313183583,\n      20.194790255204857,\n      19.488344221644795,\n      24.51309879409237,\n      21.800480569129093,\n      44.32938087456202,\n      47.21506528374005,\n      89.41382517834741,\n      124.29403060017619,\n      146.68926567195703,\n      169.207874630929,\n      172.80978854519708,\n      181.3553854519414,\n      179.440743927255,\n      190.08220646743666,\n      180.53957142732958,\n      187.14677192183441,\n      188.55596052381526\n    ]\n  },\n  \"h1hand_insert_small\": {\n    \"time\": [\n      0.0,\n      315.23139559215,\n      630.4627911843,\n      945.6941867764501,\n      1260.9255823686,\n      1576.15697796075,\n      1891.3883735529002,\n      2206.6197691450498,\n      2521.8511647372,\n      2837.08256032935,\n      3152.3139559215,\n      3467.5453515136496,\n      3782.7767471058005,\n      4098.0081426979505,\n      4413.2395382900995,\n      4728.47093388225,\n      5043.7023294744,\n      5358.93372506655,\n      5674.1651206587,\n      
5989.39651625085,\n      6304.627911843\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      25.803948720296223,\n      65.2731081644694,\n      110.42922719319661,\n      124.65948994954427,\n      151.74050903320312,\n      158.40826924641928,\n      164.9043426513672,\n      167.84961954752603,\n      170.26196797688803,\n      165.7256317138672,\n      150.51225789388022,\n      159.7950897216797,\n      169.25322469075522,\n      170.7041982014974,\n      162.28902689615884,\n      171.52276102701822,\n      173.49874369303384,\n      164.4512736002604,\n      162.95905049641928,\n      171.48112996419272\n    ],\n    \"return_min\": [\n      0.0,\n      16.259126578045933,\n      27.20639444461152,\n      103.35436633108236,\n      114.25403470486306,\n      144.04370182263935,\n      153.86552855398466,\n      161.79437375930542,\n      166.41660146284676,\n      168.28469510106487,\n      161.68870399170868,\n      121.0084021069551,\n      141.81689245511308,\n      166.0688410087526,\n      167.85661523998058,\n      155.22548705200677,\n      168.48752737496002,\n      171.0301362849067,\n      156.70273979774515,\n      149.2293200021275,\n      169.0827938103662\n    ],\n    \"return_max\": [\n      0.0,\n      35.348770862546516,\n      103.33982188432729,\n      117.50408805531086,\n      135.0649451942255,\n      159.4373162437669,\n      162.9510099388539,\n      168.01431154342896,\n      169.2826376322053,\n      172.2392408527112,\n      169.7625594360257,\n      180.01611368080532,\n      177.7732869882463,\n      172.43760837275784,\n      
173.55178116301423,\n      169.35256674031092,\n      174.55799467907642,\n      175.967351101161,\n      172.19980740277566,\n      176.68878099071105,\n      173.87946611801922\n    ]\n  },\n  \"h1hand_insert_normal\": {\n    \"time\": [\n      0.0,\n      353.9304402593,\n      707.8608805186,\n      1061.7913207779,\n      1415.7217610372,\n      1769.6522012965002,\n      2123.5826415558,\n      2477.5130818151,\n      2831.4435220744,\n      3185.3739623336996,\n      3539.3044025930003,\n      3893.2348428523,\n      4247.1652831116,\n      4601.0957233709,\n      4955.0261636302,\n      5308.9566038895,\n      5662.8870441488,\n      6016.8174844080995,\n      6370.747924667399,\n      6724.6783649267,\n      7078.608805186001\n    ],\n    \"env_step\": [\n      0.0,\n      640000.0,\n      1280000.0,\n      1920000.0,\n      2560000.0,\n      3200000.0,\n      3840000.0,\n      4480000.0,\n      5120000.0,\n      5760000.0,\n      6400000.0,\n      7040000.0,\n      7680000.0,\n      8320000.0,\n      8960000.0,\n      9600000.0,\n      10240000.0,\n      10880000.0,\n      11520000.0,\n      12160000.0,\n      12800000.0\n    ],\n    \"return\": [\n      0.0,\n      24.10168711344401,\n      54.070421854654946,\n      99.81527709960938,\n      135.47543080647787,\n      130.91323852539062,\n      152.1724065144857,\n      162.59126790364584,\n      164.9534657796224,\n      168.67947896321616,\n      168.1095987955729,\n      168.2154744466146,\n      174.81605529785156,\n      176.54434204101562,\n      179.82152811686197,\n      179.19852193196616,\n      182.65686543782553,\n      184.3785146077474,\n      187.4399668375651,\n      185.9150187174479,\n      191.98021443684897\n    ],\n    \"return_min\": [\n      0.0,\n      6.769507504418215,\n      38.73171288128932,\n      81.49871730558638,\n      113.11107571013979,\n      99.97948537079725,\n      130.3948495447914,\n      159.98263262262466,\n      163.80269122995844,\n      
166.81903211046978,\n      163.45338885046095,\n      160.9069368380094,\n      169.2923240593455,\n      169.42940571808268,\n      168.66802504056085,\n      164.02913103172816,\n      165.1513139904467,\n      169.5900424483381,\n      170.5853867351582,\n      166.09199645001894,\n      172.27076893820444\n    ],\n    \"return_max\": [\n      0.0,\n      41.43386672246981,\n      69.40913082802058,\n      118.13183689363237,\n      157.83978590281595,\n      161.846991679984,\n      173.94996348417996,\n      165.19990318466702,\n      166.10424032928637,\n      170.53992581596253,\n      172.76580874068486,\n      175.5240120552198,\n      180.33978653635762,\n      183.65927836394857,\n      190.9750311931631,\n      194.36791283220415,\n      200.16241688520435,\n      199.16698676715671,\n      204.29454693997198,\n      205.73804098487687,\n      211.6896599354935\n    ]\n  }\n}"
  },
  {
    "path": "data/isaaclab_result.json",
    "content": "{\n  \"Isaac-Velocity-Flat-G1-v0\": {\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        426.030303030303,\n        852.060606060606,\n        1278.090909090909,\n        1704.121212121212,\n        2130.151515151515,\n        2556.181818181818,\n        2982.212121212121,\n        3408.242424242424,\n        3834.2727272727275,\n        4260.30303030303,\n        4686.333333333333,\n        5112.363636363636,\n        5538.393939393939,\n        5964.424242424242,\n        6390.454545454545,\n        6816.484848484848,\n        7242.515151515152,\n        7668.545454545455,\n        8094.575757575758,\n        8520.60606060606,\n        8946.636363636364,\n        9372.666666666666,\n        9798.69696969697,\n        10224.727272727272,\n        10650.757575757576,\n        11076.787878787878,\n        11502.818181818182,\n        11928.848484848484,\n        12354.878787878788,\n        12780.90909090909,\n        13206.939393939394,\n        13632.969696969696,\n        14059.0\n      ],\n      \"env_step\": [\n        0,\n        20480000.0,\n        40960000.0,\n        61440000.0,\n        81920000.0,\n        102400000.0,\n        122880000.0,\n        143360000.0,\n        163840000.0,\n        184320000.0,\n        204800000.0,\n        225280000.0,\n        245760000.0,\n        266240000.0,\n        286720000.0,\n        307200000.0,\n        327680000.0,\n        348160000.0,\n        368640000.0,\n        389120000.0,\n        409600000.0,\n        430080000.0,\n        450560000.0,\n        471040000.0,\n        491520000.0,\n        512000000.0,\n        532480000.0,\n        552960000.0,\n        573440000.0,\n        593920000.0,\n        614400000.0,\n        634880000.0,\n        655360000.0,\n        675840000.0\n      ],\n      \"return\": [\n        0.0,\n        8.399159749348959,\n        9.912333170572916,\n        14.84921375910441,\n        18.012757539749146,\n        22.504486719767254,\n        
27.125051498413086,\n        28.701950073242188,\n        30.197755813598633,\n        31.469010670979817,\n        32.14909235636393,\n        32.57625071207682,\n        32.17204030354818,\n        32.481005350748696,\n        30.93257013956706,\n        32.5035769144694,\n        32.88990084330241,\n        31.94870122273763,\n        32.73742167154948,\n        32.58588981628418,\n        32.65592384338379,\n        30.25904655456543,\n        32.79962031046549,\n        32.60353914896647,\n        32.388885498046875,\n        32.74258168538412,\n        30.75256411234538,\n        32.113457997639976,\n        30.90188471476237,\n        30.882394790649414,\n        31.751652399698894,\n        32.93280283610026,\n        32.285691579182945,\n        32.544677734375\n      ],\n      \"return_min\": [\n        0.0,\n        -0.4342708013469121,\n        0.15131576165535776,\n        3.056790615527323,\n        4.690397268276504,\n        13.184447519334512,\n        21.25279365145474,\n        22.70949840620638,\n        25.157422436625925,\n        27.96643827593808,\n        29.69758316302085,\n        30.56877892180152,\n        29.76787859060772,\n        30.696770112276443,\n        28.644023779533903,\n        30.465339016626782,\n        30.92349475362472,\n        29.35783947935273,\n        30.895364911595788,\n        31.14117205982312,\n        31.353408955374725,\n        27.429461255702815,\n        31.54753402208512,\n        31.30661763069416,\n        31.116279961417973,\n        31.529114984883087,\n        29.048792317035463,\n        30.636109312623667,\n        29.16179985474695,\n        29.983837058445577,\n        29.91569292592529,\n        32.48186031918378,\n        31.36364585744209,\n        32.12861466393829\n      ],\n      \"return_max\": [\n        0.0,\n        17.23259030004483,\n        19.673350579490474,\n        26.641636902681498,\n        31.335117811221785,\n        31.824525920199996,\n        32.99730934537143,\n        
34.694401740277996,\n        35.238089190571344,\n        34.97158306602155,\n        34.60060154970701,\n        34.58372250235212,\n        34.57620201648864,\n        34.265240589220944,\n        33.221116499600214,\n        34.54181481231202,\n        34.85630693298009,\n        34.539562966122524,\n        34.57947843150316,\n        34.03060757274524,\n        33.95843873139285,\n        33.08863185342805,\n        34.05170659884587,\n        33.900460667238775,\n        33.66149103467578,\n        33.95604838588515,\n        32.456335907655294,\n        33.590806682656286,\n        32.6419695747778,\n        31.78095252285325,\n        33.5876118734725,\n        33.383745353016735,\n        33.207737300923796,\n        32.96074080481171\n      ]\n    },\n    \"null\": {\n      \"time\": [\n        0.0,\n        4.206,\n        8.412,\n        12.618,\n        16.824,\n        21.03,\n        25.236,\n        29.442,\n        33.648,\n        37.854,\n        42.06,\n        46.266,\n        50.472,\n        54.678,\n        58.884,\n        63.09,\n        67.296,\n        71.502,\n        75.708,\n        79.914,\n        84.12,\n        88.326,\n        92.532,\n        96.738,\n        100.944,\n        105.15,\n        109.356,\n        113.562,\n        117.768,\n        121.974,\n        126.18,\n        130.386,\n        134.592,\n        138.798,\n        143.004,\n        147.21,\n        151.416,\n        155.622,\n        159.828,\n        164.034,\n        168.24,\n        172.446,\n        176.652,\n        180.858,\n        185.064,\n        189.27,\n        193.476,\n        197.682,\n        201.888,\n        206.094,\n        210.3,\n        214.506,\n        218.712,\n        222.918,\n        227.124,\n        231.33,\n        235.536,\n        239.742,\n        243.948,\n        248.154,\n        252.36,\n        256.566,\n        260.772,\n        264.978,\n        269.184,\n        273.39,\n        277.596,\n        281.802,\n        
286.008,\n        290.214,\n        294.42,\n        298.626,\n        302.832,\n        307.038,\n        311.244,\n        315.45,\n        319.656,\n        323.862,\n        328.068,\n        332.274,\n        336.48,\n        340.686,\n        344.892,\n        349.098,\n        353.304,\n        357.51,\n        361.716,\n        365.922,\n        370.128,\n        374.334,\n        378.54,\n        382.746,\n        386.952,\n        391.158,\n        395.364,\n        399.57,\n        403.776,\n        407.982,\n        412.188,\n        416.394,\n        420.6,\n        424.806,\n        429.012,\n        433.218,\n        437.424,\n        441.63,\n        445.836,\n        450.042,\n        454.248,\n        458.454,\n        462.66,\n        466.866,\n        471.072,\n        475.278,\n        479.484,\n        483.69,\n        487.896,\n        492.102,\n        496.308,\n        500.514,\n        504.72,\n        508.926,\n        513.132,\n        517.338,\n        521.544,\n        525.75,\n        529.956,\n        534.162,\n        538.368,\n        542.574,\n        546.78,\n        550.986,\n        555.192,\n        559.398,\n        563.604,\n        567.81,\n        572.016,\n        576.222,\n        580.428,\n        584.634,\n        588.84,\n        593.046,\n        597.252,\n        601.458,\n        605.664,\n        609.87,\n        614.076,\n        618.282,\n        622.488,\n        626.694,\n        630.9,\n        635.106,\n        639.312,\n        643.518,\n        647.724,\n        651.93,\n        656.136,\n        660.342,\n        664.548,\n        668.754,\n        672.96,\n        677.166,\n        681.372,\n        685.578,\n        689.784,\n        693.99,\n        698.196,\n        702.402,\n        706.608,\n        710.814,\n        715.02,\n        719.226,\n        723.432,\n        727.638,\n        731.844,\n        736.05,\n        740.256,\n        744.462,\n        748.668,\n        752.874,\n        
757.08,\n        761.286,\n        765.492,\n        769.698,\n        773.904,\n        778.11,\n        782.316,\n        786.522,\n        790.728,\n        794.934,\n        799.14,\n        803.346,\n        807.552,\n        811.758,\n        815.964,\n        820.17,\n        824.376,\n        828.582,\n        832.788,\n        836.994,\n        841.2,\n        845.406,\n        849.612,\n        853.818,\n        858.024,\n        862.23,\n        866.436,\n        870.642,\n        874.848,\n        879.054,\n        883.26,\n        887.466,\n        891.672,\n        895.878,\n        900.084,\n        904.29,\n        908.496,\n        912.702,\n        916.908,\n        921.114,\n        925.32,\n        929.526,\n        933.732,\n        937.938,\n        942.144,\n        946.35,\n        950.556,\n        954.762,\n        958.968,\n        963.174,\n        967.38,\n        971.586,\n        975.792,\n        979.998,\n        984.204,\n        988.41,\n        992.616,\n        996.822,\n        1001.028,\n        1005.234,\n        1009.44,\n        1013.646,\n        1017.852,\n        1022.058,\n        1026.264,\n        1030.47,\n        1034.676,\n        1038.882,\n        1043.088,\n        1047.294,\n        1051.5,\n        1055.706,\n        1059.912,\n        1064.118,\n        1068.324,\n        1072.53,\n        1076.736,\n        1080.942,\n        1085.148,\n        1089.354,\n        1093.56,\n        1097.766,\n        1101.972,\n        1106.178,\n        1110.384,\n        1114.59,\n        1118.796,\n        1123.002,\n        1127.208,\n        1131.414,\n        1135.62,\n        1139.826,\n        1144.032,\n        1148.238,\n        1152.444,\n        1156.65,\n        1160.856,\n        1165.062,\n        1169.268,\n        1173.474,\n        1177.68,\n        1181.886,\n        1186.092,\n        1190.298,\n        1194.504,\n        1198.71,\n        1202.916,\n        1207.122,\n        1211.328,\n        
1215.534,\n        1219.74,\n        1223.946,\n        1228.152,\n        1232.358,\n        1236.564,\n        1240.77,\n        1244.976,\n        1249.182,\n        1253.388,\n        1257.594,\n        1261.8,\n        1266.006,\n        1270.212,\n        1274.418,\n        1278.624,\n        1282.83,\n        1287.036,\n        1291.242,\n        1295.448,\n        1299.654,\n        1303.86,\n        1308.066,\n        1312.272,\n        1316.478,\n        1320.684,\n        1324.89,\n        1329.096,\n        1333.302,\n        1337.508,\n        1341.714,\n        1345.92,\n        1350.126,\n        1354.332,\n        1358.538,\n        1362.744,\n        1366.95,\n        1371.156,\n        1375.362,\n        1379.568,\n        1383.774,\n        1387.98,\n        1392.186,\n        1396.392,\n        1400.598,\n        1404.804,\n        1409.01,\n        1413.216,\n        1417.422,\n        1421.628,\n        1425.834,\n        1430.04,\n        1434.246,\n        1438.452,\n        1442.658,\n        1446.864,\n        1451.07,\n        1455.276,\n        1459.482,\n        1463.688,\n        1467.894,\n        1472.1,\n        1476.306,\n        1480.512,\n        1484.718,\n        1488.924,\n        1493.13,\n        1497.336,\n        1501.542,\n        1505.748,\n        1509.954,\n        1514.16,\n        1518.366,\n        1522.572,\n        1526.778,\n        1530.984,\n        1535.19,\n        1539.396,\n        1543.602,\n        1547.808,\n        1552.014,\n        1556.22,\n        1560.426,\n        1564.632,\n        1568.838,\n        1573.044,\n        1577.25,\n        1581.456,\n        1585.662,\n        1589.868,\n        1594.074,\n        1598.28,\n        1602.486,\n        1606.692,\n        1610.898,\n        1615.104,\n        1619.31,\n        1623.516,\n        1627.722,\n        1631.928,\n        1636.134,\n        1640.34,\n        1644.546,\n        1648.752,\n        1652.958,\n        1657.164,\n        
1661.37,\n        1665.576,\n        1669.782,\n        1673.988,\n        1678.194,\n        1682.4,\n        1686.606,\n        1690.812,\n        1695.018,\n        1699.224,\n        1703.43,\n        1707.636,\n        1711.842,\n        1716.048,\n        1720.254,\n        1724.46,\n        1728.666,\n        1732.872,\n        1737.078,\n        1741.284,\n        1745.49,\n        1749.696,\n        1753.902,\n        1758.108,\n        1762.314,\n        1766.52,\n        1770.726,\n        1774.932,\n        1779.138,\n        1783.344,\n        1787.55,\n        1791.756,\n        1795.962,\n        1800.168,\n        1804.374,\n        1808.58,\n        1812.786,\n        1816.992,\n        1821.198,\n        1825.404,\n        1829.61,\n        1833.816,\n        1838.022,\n        1842.228,\n        1846.434,\n        1850.64,\n        1854.846,\n        1859.052,\n        1863.258,\n        1867.464,\n        1871.67,\n        1875.876,\n        1880.082,\n        1884.288,\n        1888.494,\n        1892.7,\n        1896.906,\n        1901.112,\n        1905.318,\n        1909.524,\n        1913.73,\n        1917.936,\n        1922.142,\n        1926.348,\n        1930.554,\n        1934.76,\n        1938.966,\n        1943.172,\n        1947.378,\n        1951.584,\n        1955.79,\n        1959.996,\n        1964.202,\n        1968.408,\n        1972.614,\n        1976.82,\n        1981.026,\n        1985.232,\n        1989.438,\n        1993.644,\n        1997.85,\n        2002.056,\n        2006.262,\n        2010.468,\n        2014.674,\n        2018.88,\n        2023.086,\n        2027.292,\n        2031.498,\n        2035.704,\n        2039.91,\n        2044.116,\n        2048.322,\n        2052.528,\n        2056.734,\n        2060.94,\n        2065.146,\n        2069.352,\n        2073.558,\n        2077.764,\n        2081.97,\n        2086.176,\n        2090.382,\n        2094.588,\n        2098.794,\n        2103.0\n      ],\n      
\"env_step\": [\n        0,\n        294912,\n        393216,\n        491520,\n        589824,\n        786432,\n        884736,\n        983040,\n        1081344,\n        1376256,\n        1671168,\n        1867776,\n        1966080,\n        2260992,\n        2457600,\n        2850816,\n        3145728,\n        3342336,\n        3735552,\n        3833856,\n        4521984,\n        5013504,\n        5210112,\n        5308416,\n        5603328,\n        5898240,\n        6094848,\n        6389760,\n        6488064,\n        6684672,\n        7274496,\n        7372800,\n        7569408,\n        7667712,\n        7962624,\n        8749056,\n        9043968,\n        9633792,\n        9830400,\n        9928704,\n        10223616,\n        10321920,\n        10518528,\n        10715136,\n        10813440,\n        11010048,\n        11206656,\n        11599872,\n        11698176,\n        12582912,\n        12976128,\n        13271040,\n        13664256,\n        13762560,\n        13860864,\n        14057472,\n        14155776,\n        14352384,\n        14843904,\n        15237120,\n        15335424,\n        15630336,\n        15728640,\n        16416768,\n        16613376,\n        17104896,\n        17301504,\n        17793024,\n        17891328,\n        17989632,\n        18087936,\n        18284544,\n        19169280,\n        19267584,\n        19365888,\n        19562496,\n        19857408,\n        19955712,\n        20545536,\n        20643840,\n        20742144,\n        20938752,\n        21037056,\n        21331968,\n        21430272,\n        21528576,\n        21626880,\n        22216704,\n        22315008,\n        22413312,\n        23101440,\n        23199744,\n        23691264,\n        24084480,\n        24772608,\n        24969216,\n        25165824,\n        25559040,\n        25853952,\n        25952256,\n        26247168,\n        26345472,\n        26443776,\n        26935296,\n        27033600,\n        27328512,\n        27623424,\n   
     27918336,\n        28213248,\n        28311552,\n        28409856,\n        28704768,\n        28901376,\n        29294592,\n        30375936,\n        30572544,\n        30670848,\n        31260672,\n        31457280,\n        31555584,\n        31752192,\n        32047104,\n        32440320,\n        32538624,\n        32636928,\n        32833536,\n        33030144,\n        33619968,\n        33718272,\n        33816576,\n        33914880,\n        35192832,\n        35389440,\n        35487744,\n        35586048,\n        36274176,\n        36372480,\n        36470784,\n        36667392,\n        37158912,\n        37650432,\n        38043648,\n        38731776,\n        38928384,\n        39223296,\n        39616512,\n        39714816,\n        40009728,\n        40206336,\n        40894464,\n        41091072,\n        41189376,\n        41680896,\n        41779200,\n        41975808,\n        42172416,\n        42270720,\n        42369024,\n        42467328,\n        42958848,\n        43155456,\n        43352064,\n        43548672,\n        43745280,\n        43941888,\n        45023232,\n        45711360,\n        46104576,\n        46202880,\n        46301184,\n        46792704,\n        47480832,\n        47579136,\n        47677440,\n        47874048,\n        48168960,\n        48758784,\n        49053696,\n        49250304,\n        49348608,\n        49446912,\n        49643520,\n        49938432,\n        50331648,\n        50429952,\n        50626560,\n        51019776,\n        52002816,\n        52297728,\n        52887552,\n        53084160,\n        53182464,\n        53477376,\n        53772288,\n        53870592,\n        54067200,\n        54460416,\n        54558720,\n        55640064,\n        55934976,\n        56328192,\n        56426496,\n        56623104,\n        57212928,\n        58884096,\n        59375616,\n        59670528,\n        60260352,\n        60850176,\n        60948480,\n        61734912,\n        61833216,\n        
62029824,\n        62521344,\n        62619648,\n        63406080,\n        63504384,\n        63799296,\n        64094208,\n        64290816,\n        64585728,\n        65077248,\n        65372160,\n        65470464,\n        65667072,\n        65863680,\n        66453504,\n        66650112,\n        66846720,\n        67239936,\n        67436544,\n        68517888,\n        68812800,\n        69304320,\n        69402624,\n        69894144,\n        70287360,\n        70385664,\n        70483968,\n        70582272,\n        70975488,\n        71073792,\n        71172096,\n        71270400,\n        71663616,\n        71958528,\n        72253440,\n        72548352,\n        72646656,\n        72843264,\n        73039872,\n        73236480,\n        73334784,\n        73728000,\n        74219520,\n        74317824,\n        74514432,\n        74907648,\n        75005952,\n        75399168,\n        75595776,\n        75988992,\n        76382208,\n        76578816,\n        76677120,\n        77070336,\n        77365248,\n        77561856,\n        77758464,\n        78053376,\n        78544896,\n        78643200,\n        78839808,\n        78938112,\n        79429632,\n        79724544,\n        79822848,\n        79921152,\n        80019456,\n        80510976,\n        80707584,\n        81297408,\n        81494016,\n        81690624,\n        82378752,\n        82870272,\n        84049920,\n        84148224,\n        84443136,\n        85917696,\n        86212608,\n        86310912,\n        86409216,\n        86507520,\n        86605824,\n        86704128,\n        86802432,\n        86999040,\n        87097344,\n        87392256,\n        87490560,\n        87588864,\n        87785472,\n        88276992,\n        88768512,\n        88866816,\n        89063424,\n        89456640,\n        89554944,\n        89849856,\n        89948160,\n        90046464,\n        90144768,\n        90439680,\n        90931200,\n        91029504,\n        91127808,\n        
91619328,\n        91815936,\n        92110848,\n        92209152,\n        92504064,\n        92798976,\n        93388800,\n        93487104,\n        93683712,\n        93880320,\n        93978624,\n        94470144,\n        95256576,\n        95354880,\n        95649792,\n        95748096,\n        95846400,\n        96141312,\n        96239616,\n        96337920,\n        96829440,\n        96927744,\n        97320960,\n        97517568,\n        97615872,\n        97714176,\n        98304000,\n        98697216,\n        99090432,\n        99287040,\n        99385344,\n        99778560,\n        99876864,\n        100073472,\n        100171776,\n        100368384,\n        100564992,\n        100958208,\n        101154816,\n        101253120,\n        101744640,\n        101842944,\n        102137856,\n        102334464,\n        102432768,\n        102825984,\n        102924288,\n        103120896,\n        103219200,\n        103710720,\n        103809024,\n        103907328,\n        104103936,\n        104398848,\n        105185280,\n        105381888,\n        105480192,\n        105971712,\n        106070016,\n        106659840,\n        106758144,\n        106856448,\n        106954752,\n        107053056,\n        107249664,\n        107347968,\n        107446272,\n        107839488,\n        108232704,\n        108429312,\n        108822528,\n        109314048,\n        109707264,\n        109805568,\n        110100480,\n        110198784,\n        110788608,\n        110985216,\n        111083520,\n        111280128,\n        111378432,\n        111575040,\n        112164864,\n        112656384,\n        112754688,\n        112852992,\n        113541120,\n        114229248,\n        114327552,\n        114524160,\n        114622464,\n        114917376,\n        115015680,\n        115212288,\n        115900416,\n        116195328,\n        116490240,\n        117080064,\n        117178368,\n        117374976,\n        117768192,\n        118456320,\n 
       119734272,\n        120127488,\n        120324096,\n        120422400,\n        121012224,\n        121405440,\n        121602048,\n        121700352,\n        122191872,\n        122388480,\n        123076608,\n        123568128,\n        123666432,\n        124256256,\n        124354560,\n        124649472,\n        124747776,\n        124944384,\n        125239296,\n        125337600,\n        125632512,\n        125927424,\n        126418944,\n        126517248,\n        126713856,\n        127401984,\n        128188416,\n        128286720,\n        128581632,\n        128876544,\n        129564672,\n        130056192,\n        131137536,\n        131334144,\n        131530752,\n        132120576,\n        132218880,\n        132513792,\n        133005312,\n        133890048,\n        134086656,\n        134184960,\n        134774784,\n        135364608,\n        135561216,\n        136249344,\n        136347648,\n        136544256,\n        136740864,\n        137527296,\n        137625600,\n        138215424,\n        138412032,\n        138608640,\n        139001856,\n        139100160,\n        139198464,\n        139296768,\n        139395072,\n        139493376,\n        139984896,\n        140181504,\n        140279808,\n        141066240,\n        141656064,\n        141852672,\n        142049280,\n        142540800,\n        142737408,\n        143032320,\n        143228928,\n        143818752,\n        143917056,\n        144113664,\n        144211968,\n        144900096,\n        145391616,\n        146178048,\n        146276352,\n        146374656,\n        146767872,\n        146964480\n      ],\n      \"return\": [\n        0.0,\n        -4.244887483882465,\n        -6.062188769976298,\n        -5.946233414014181,\n        -5.934175154368082,\n        -5.931420200665792,\n        -5.926097372372946,\n        -5.890506885846456,\n        -5.845593395233155,\n        -5.745785452524821,\n        -5.6373031155268345,\n        
-5.537198613484701,\n        -5.437902429898579,\n        -5.354564905166626,\n        -5.315855964024862,\n        -5.305097824732463,\n        -5.233588047027588,\n        -5.21569048722585,\n        -5.142245790163676,\n        -5.169298741022746,\n        -5.144494528770447,\n        -5.092627458572388,\n        -5.067925599416097,\n        -5.0944153579076135,\n        -5.075692962010701,\n        -5.080377626419067,\n        -5.094903222719828,\n        -5.125536025365193,\n        -5.115845588048299,\n        -5.232694799105326,\n        -5.326773929595947,\n        -5.540815292994181,\n        -5.565265787442525,\n        -5.767993720372519,\n        -6.0354754002889,\n        -6.422926289240519,\n        -6.6024469566345205,\n        -7.196066060066222,\n        -7.374582826296488,\n        -7.664462378819784,\n        -8.25899796485901,\n        -8.364871314366658,\n        -8.789411764144896,\n        -8.797979049682617,\n        -9.422961320877077,\n        -9.468292330106099,\n        -9.801179609298705,\n        -9.881031123797099,\n        -10.184216540654502,\n        -10.079860796928406,\n        -9.81778653462728,\n        -9.665794699986776,\n        -9.04004868030548,\n        -7.917503507932028,\n        -6.944561026891073,\n        -6.476619219779969,\n        -6.198991281986236,\n        -5.729001940091451,\n        -4.624928730726242,\n        -3.9058311102787653,\n        -3.415297171473503,\n        -3.0760760113100214,\n        -2.5505499145636956,\n        -1.384848330305734,\n        -0.2722055218198026,\n        0.37754478539495423,\n        0.9910718053362021,\n        1.9497537283304458,\n        2.0353074125107375,\n        2.7150473963220914,\n        3.57238434004287,\n        4.393549767136574,\n        5.22253050327301,\n        5.594326151609422,\n        5.980829033851624,\n        6.248111243247986,\n        6.301437652111052,\n        6.819287878672282,\n        7.668490665753683,\n        7.9056279023488365,\n        
8.528459188143414,\n        8.9157959318161,\n        9.004425404866536,\n        9.240382947921754,\n        10.005460894107818,\n        10.388603275616964,\n        10.506964916388194,\n        11.041883727709452,\n        11.13649615128835,\n        11.725313129425048,\n        12.456134204069775,\n        12.757048015594483,\n        13.031923011144002,\n        13.684242639541628,\n        14.265029729207356,\n        14.391678101221721,\n        14.820610736211142,\n        15.028353486855826,\n        15.586650098164876,\n        16.098702065149944,\n        16.480080706278486,\n        16.47846872806549,\n        16.60223277727763,\n        16.996064742406208,\n        16.962005354563395,\n        17.414829864501954,\n        17.879654438495635,\n        18.075921009381613,\n        18.59587174097697,\n        18.768117705980938,\n        19.066952754656473,\n        19.140114987691245,\n        19.385992137591046,\n        19.82112768173218,\n        20.07494811375936,\n        20.38481765429179,\n        20.64120300292969,\n        20.944377085367837,\n        21.046505676905316,\n        21.219120070139567,\n        21.259444128672282,\n        21.437351309458418,\n        21.490676719347636,\n        21.614972871144612,\n        21.73858602523804,\n        21.836027539571123,\n        21.938353093465167,\n        22.12277064641317,\n        22.096696147918703,\n        22.379330412546793,\n        22.3636573155721,\n        22.667728048960367,\n        22.804457734425863,\n        22.989772860209147,\n        22.81014023224513,\n        22.752846819559732,\n        22.87626227140427,\n        23.268132722377782,\n        23.450477498372397,\n        23.49831963221232,\n        23.642868830362957,\n        23.749657942454018,\n        23.829254946708676,\n        23.89071097215017,\n        23.95269861221313,\n        23.80049831072489,\n        23.97088320493698,\n        24.035058199564617,\n        23.987095406850177,\n        24.143753560384113,\n   
     24.161667313575744,\n        24.20282023747762,\n        24.515907077789308,\n        24.48859402338664,\n        24.65326049168905,\n        24.695785293579103,\n        24.68093146642049,\n        24.64084663550059,\n        24.774629052480062,\n        24.721785335540773,\n        24.80927684466044,\n        24.64293538570404,\n        24.611320269902546,\n        24.890653877258302,\n        25.03125279744466,\n        25.04939617792765,\n        25.056911741892495,\n        25.012861070632937,\n        25.006108971436817,\n        25.19971148808797,\n        25.244278527895606,\n        25.243163836797077,\n        25.20921319325765,\n        25.25394191265106,\n        25.238686189651492,\n        25.24318765004476,\n        25.405525760650633,\n        25.469966049194337,\n        25.493687133789063,\n        25.4357274723053,\n        25.59193923314412,\n        25.664110005696614,\n        25.589963239034017,\n        25.78192579905192,\n        25.539709254900615,\n        25.59874127229055,\n        25.71211055119832,\n        25.83485471089681,\n        25.849721884727483,\n        25.78178211212158,\n        25.905887908935544,\n        26.03030003229777,\n        25.978182222048442,\n        26.085983327229815,\n        26.114759820302325,\n        26.203529485066735,\n        26.11474272966385,\n        26.242303619384767,\n        26.12391219774882,\n        25.88732498884201,\n        26.3737934366862,\n        26.40230457305908,\n        26.32623062769572,\n        26.447194538116452,\n        26.39948532104492,\n        26.48303238550822,\n        26.58627212524414,\n        26.62195227940877,\n        26.305497008959453,\n        26.5571306848526,\n        26.676981925964355,\n        26.715099360148113,\n        26.7491853205363,\n        26.77513189315796,\n        26.77031274795532,\n        26.699470434188843,\n        26.697296330134076,\n        26.761435826619465,\n        26.773332411448163,\n        26.817168051401776,\n        
26.812312013308212,\n        26.730486590067546,\n        26.980856755574546,\n        26.768192443052925,\n        26.75973533630371,\n        26.726683316230776,\n        26.629371005694072,\n        26.737455078760785,\n        27.04755596160889,\n        26.996712977091473,\n        27.15843843460083,\n        27.056537364323933,\n        27.07920110066732,\n        27.063170181910195,\n        26.999029286702477,\n        26.95050971984863,\n        27.028388187090556,\n        27.162973842620847,\n        27.03351854960124,\n        26.927428197860717,\n        27.040847080548605,\n        27.197179768880208,\n        27.07626317977905,\n        27.213608862559,\n        27.158017387390135,\n        27.26690890630086,\n        27.162100307146705,\n        27.21130386193593,\n        27.205319883028665,\n        27.24621044794718,\n        27.22654726187388,\n        27.380611095428467,\n        27.297355159123736,\n        27.317657216389975,\n        27.35010398864746,\n        27.37887262980143,\n        27.204053241411845,\n        27.41668393452962,\n        27.38345681508382,\n        27.39409785588582,\n        27.458933315277097,\n        27.5122291692098,\n        27.437938969930016,\n        27.426795717875166,\n        27.50698534965515,\n        27.471194496154784,\n        27.481666673024495,\n        27.474861726760864,\n        27.534579620361328,\n        27.485984223683676,\n        27.63851408004761,\n        27.601514314015706,\n        27.468947197596233,\n        27.463055524826046,\n        27.63535165786743,\n        27.540319763819372,\n        27.61072525024414,\n        27.618154637018836,\n        27.742102012634277,\n        27.669274317423504,\n        27.588603134155274,\n        27.560866799354557,\n        27.558601020177207,\n        27.617678836186727,\n        27.334658497174576,\n        27.753905919392903,\n        27.710659249623617,\n        27.830444075266524,\n        27.792324670155846,\n        27.715141557057695,\n   
     27.80116782506307,\n        27.822862389882403,\n        27.925147520701092,\n        27.828692003885905,\n        27.774182446797692,\n        27.82865638097127,\n        27.87010919570923,\n        27.88524030049642,\n        27.780755894978842,\n        27.869576110839844,\n        27.8847115389506,\n        27.88112828572591,\n        27.95226758956909,\n        27.9632572110494,\n        27.98185961405436,\n        27.900285018285114,\n        27.749773004849754,\n        27.95440636952718,\n        27.897387890815736,\n        27.774074436823526,\n        27.851776852607728,\n        27.88595195611318,\n        27.957040735880536,\n        28.01861265818278,\n        28.072463569641116,\n        27.927986801465355,\n        28.054341169993084,\n        28.04882469813029,\n        28.08775482813517,\n        28.013588581085205,\n        27.971034762064615,\n        27.939692214330037,\n        27.961951595942182,\n        27.91120104789734,\n        27.912012211481727,\n        27.946552640597023,\n        28.13232330322266,\n        28.155323626200357,\n        28.21534303029378,\n        28.105512018203736,\n        28.12738781611124,\n        28.18438720703125,\n        28.129107367197673,\n        28.202164802551266,\n        28.183705285390218,\n        28.11675905227661,\n        28.143270753224694,\n        28.115149936676023,\n        27.962960600852966,\n        28.082834936777747,\n        28.17713045756022,\n        28.071439940134685,\n        28.036277154286704,\n        28.015016555786133,\n        27.74030689318975,\n        28.143797952334086,\n        28.083258043924967,\n        28.143300120035807,\n        28.22578639984131,\n        28.189498087565102,\n        28.28595506032308,\n        28.153611634572346,\n        28.271268889109294,\n        28.1683358446757,\n        28.215656159718833,\n        28.146134676933286,\n        28.001276264190675,\n        27.983188803990686,\n        27.911292719841004,\n        28.1467857392629,\n   
     28.130204054514564,\n        28.11707345326742,\n        28.18377244313558,\n        28.046776949564617,\n        28.17007304509481,\n        28.181098283131917,\n        28.141192302703857,\n        28.183331216176352,\n        28.045621960957845,\n        28.206850026448564,\n        28.094409753481546,\n        28.100912955602013,\n        28.095957455635073,\n        28.08301568508148,\n        27.989888830184935,\n        27.994859463373817,\n        28.122985763549806,\n        28.221680157979325,\n        28.20320109049479,\n        28.07082467556,\n        28.08045124053955,\n        28.234146792093913,\n        28.24658082962036,\n        28.167981278101603,\n        28.19567700703939,\n        28.181134967803956,\n        28.086012459596002,\n        28.175039056142168,\n        28.172370891571045,\n        28.007196400960286,\n        28.25803017298381,\n        28.184969975153606,\n        28.25297903696696,\n        28.171354506810506,\n        28.238894797960914,\n        28.285828113555908,\n        27.95717629591624,\n        28.18456648508708,\n        28.211361389160157,\n        28.155993226369223,\n        28.133868006070454,\n        28.174914951324464,\n        28.19500195185343,\n        28.166361468633017,\n        27.803325554529824,\n        27.919510776996614,\n        28.102442877292635,\n        28.161700401306152,\n        28.24873664855957,\n        28.270584068298344,\n        28.14114588101705,\n        28.200696325302122,\n        28.242460556030277,\n        28.08400403817495,\n        28.200781688690185,\n        28.170070962905882,\n        28.197558924357097,\n        28.121695067087813,\n        27.968536138534546,\n        28.11471136728922,\n        28.243860149383547,\n        28.14787934621175,\n        28.204611466725666,\n        28.119738744099934,\n        28.031441315015158,\n        27.99424771149953,\n        28.260030975341795,\n        28.055403108596803,\n        28.15147446632385,\n        
28.236702639261882,\n        28.21238270441691,\n        28.137156661351522,\n        28.181380929946897,\n        28.15697519938151,\n        28.175251019795738,\n        28.244781970977783,\n        28.278363844553628,\n        28.293849919637044,\n        28.27718382517497,\n        28.177998784383135,\n        28.196655069986978,\n        28.221141552925115,\n        28.249198462168376,\n        28.194035822550457,\n        28.266193669637047,\n        28.199925409952797,\n        28.2498019917806,\n        28.1529238319397,\n        28.29481091181437,\n        28.126179259618127,\n        28.218130435943603,\n        28.13479091326396,\n        28.208430185317994,\n        28.240509795347847,\n        28.256366844177247,\n        28.31536688486735,\n        28.207336940765384,\n        28.345287787119545,\n        28.27900697072347,\n        27.943899685541783,\n        27.870618518193563,\n        28.060490986506142,\n        28.1964542388916,\n        28.145226942698162,\n        28.134871492385866,\n        28.11652568499247,\n        28.205403124491372,\n        28.25891178131103,\n        28.224900798797606,\n        28.18699549039205,\n        28.245219739278156,\n        27.95408559640249,\n        28.04483809630076,\n        28.184945538838704,\n        28.124226392110188,\n        28.126491333643596,\n        28.222030410766603,\n        28.277731647491454,\n        28.204816850026447,\n        28.326336828867596,\n        28.27530563990275,\n        28.337517782847087,\n        28.272301177978516,\n        28.169770604769386,\n        28.10294930299123,\n        28.11485408782959,\n        28.196449371973674,\n        28.230150807698564,\n        28.27503276189168,\n        28.39388942718506,\n        28.159951156775154,\n        28.471651503245038,\n        28.29022908528646,\n        28.39702470143636,\n        28.33332508722941,\n        28.344238732655842,\n        28.264009912808735,\n        28.330615717569987,\n        28.14671749909719,\n     
   28.083581444422403,\n        28.162535241444903\n      ],\n      \"return_min\": [\n        0.0,\n        -6.643669384840878,\n        -6.182021783308077,\n        -5.991857294835561,\n        -6.01209477450521,\n        -5.9483798211704855,\n        -5.999782261850072,\n        -5.916902296139923,\n        -5.886733559999383,\n        -5.808524865018686,\n        -5.767673070588338,\n        -5.656879782210418,\n        -5.516316668851651,\n        -5.4007619174378085,\n        -5.372779064718162,\n        -5.3428969204069,\n        -5.253152293681659,\n        -5.233328113111429,\n        -5.161262128656727,\n        -5.182595984476208,\n        -5.185557630198057,\n        -5.129352151903135,\n        -5.092037379164627,\n        -5.131577815601254,\n        -5.093579424569167,\n        -5.093719788086104,\n        -5.128770606785492,\n        -5.164726254562823,\n        -5.158627605712601,\n        -5.348331700371459,\n        -5.433969219755715,\n        -5.692475061982907,\n        -5.661098991347332,\n        -6.019793748866466,\n        -6.3892772787774526,\n        -6.618471670782988,\n        -6.93623582771524,\n        -7.783379441949263,\n        -7.772743356571364,\n        -8.374310306500192,\n        -8.383183613218646,\n        -8.739301357337679,\n        -9.011180633645404,\n        -9.167613726500848,\n        -9.546097485889467,\n        -9.830183506287723,\n        -10.401676641230377,\n        -10.383695511002582,\n        -10.622962101795936,\n        -10.417617680809428,\n        -10.212604744756401,\n        -10.430187174747314,\n        -10.060513521247342,\n        -8.386746572083084,\n        -7.483890396900349,\n        -6.769170298951451,\n        -6.563200095765459,\n        -6.363015656832963,\n        -5.314321983518964,\n        -4.321166917783934,\n        -3.8361483530772995,\n        -3.3285683618568154,\n        -2.997812024835769,\n        -2.1293781494344417,\n        -0.5802740736090363,\n        -0.2522531694659173,\n   
     0.44993455353870715,\n        1.1269001717729363,\n        1.2624771886189383,\n        2.242960541089087,\n        2.4963479947590113,\n        3.0845735431593404,\n        3.9279124633414924,\n        4.763775894213173,\n        5.049968982159641,\n        5.389919698911818,\n        5.48248416097672,\n        6.229347854311914,\n        6.521668988071955,\n        6.659136206112889,\n        7.427314258358753,\n        7.959931882256278,\n        8.028050845313581,\n        8.24168761513716,\n        9.345870248729108,\n        9.602207436283068,\n        9.661819458045514,\n        10.36769504799724,\n        10.479651576379243,\n        10.918097477003705,\n        11.49926771582258,\n        11.726100767383969,\n        12.017986155957812,\n        13.050080426953622,\n        13.706833546223123,\n        13.771967969075195,\n        13.830807728673413,\n        14.210788794303577,\n        14.74650412027818,\n        15.154901166861874,\n        15.427889498076768,\n        15.445720007976437,\n        15.42139226603878,\n        15.934492457847714,\n        15.834368998414789,\n        16.39698386013562,\n        16.947691591248272,\n        16.807375863238427,\n        17.612806679277735,\n        17.842858557229146,\n        17.8214759601972,\n        17.97937647179555,\n        18.227829245354723,\n        18.7644055402232,\n        19.325366890517444,\n        19.559545910473766,\n        19.630990604684314,\n        20.052528700425622,\n        20.07969423110993,\n        20.24338282719667,\n        20.316218292810056,\n        20.54437276683007,\n        20.61879250350746,\n        20.737601855773278,\n        20.842643896008965,\n        20.960295805924897,\n        21.05675283045511,\n        21.38516463895171,\n        21.29067986749454,\n        21.551385627548925,\n        21.53450581143342,\n        22.106338660705283,\n        22.277585385218103,\n        22.343014598444217,\n        22.336837392663952,\n        21.891858139567134,\n       
 21.873949149291946,\n        22.42005171861463,\n        22.791150529503188,\n        22.91568567873898,\n        23.103328667240632,\n        23.274285177687627,\n        23.381478304066786,\n        23.500560268083092,\n        23.567086975556506,\n        23.514825951037885,\n        23.69441051835588,\n        23.67786951187934,\n        23.70868130951907,\n        23.82123286847361,\n        23.921364629682458,\n        23.864665666102454,\n        24.240071066132046,\n        24.171078317636052,\n        24.294435932577716,\n        24.315215024411746,\n        24.429722664880202,\n        24.309891323003807,\n        24.358354925393265,\n        24.293342169139848,\n        24.524251045267125,\n        24.410533051064895,\n        24.259007268772475,\n        24.68031418108018,\n        24.6622536418055,\n        24.7900871663593,\n        24.839123934450377,\n        24.604940968948597,\n        24.678304904444722,\n        24.83199777097151,\n        24.893924012460136,\n        25.025322863275832,\n        25.006304955399667,\n        24.927603177713035,\n        24.870433418062774,\n        24.8206808457444,\n        25.042394627792078,\n        25.12909070901515,\n        25.127701577023146,\n        24.909385724601925,\n        25.26391796949775,\n        25.36489524550621,\n        25.262896738352463,\n        25.4065724808923,\n        25.235586585718753,\n        25.25165776554294,\n        25.21133184941777,\n        25.382838458944565,\n        25.291931799688506,\n        25.27953037736325,\n        25.45371787444146,\n        25.53356545650795,\n        25.659911682303985,\n        25.61246706370948,\n        25.67222761174437,\n        25.72419865409862,\n        25.691071648603913,\n        25.770755480440684,\n        25.503872557466057,\n        25.09439470110851,\n        25.94571608744351,\n        25.953143316854963,\n        25.715659588443316,\n        25.933383297310503,\n        25.944354522810634,\n        26.08021549674279,\n       
 26.162014760788978,\n        26.199008962618258,\n        25.834289088400073,\n        25.974639007126676,\n        26.167551596823426,\n        26.250286619672007,\n        26.287575915584835,\n        26.28088931506064,\n        26.365906335455993,\n        26.081784122112968,\n        26.129397360813545,\n        26.259239788514925,\n        26.150394042457304,\n        26.291752596593632,\n        26.313943131522063,\n        26.13410198199206,\n        26.55317314286874,\n        26.216099134936506,\n        26.380888321629403,\n        26.255848946469214,\n        25.980031272338643,\n        26.045063632168027,\n        26.6120241458033,\n        26.665698145218652,\n        26.699119141354032,\n        26.569793952169537,\n        26.58967941825657,\n        26.739340476993025,\n        26.330084638992602,\n        26.42094707654985,\n        26.519086340714452,\n        26.68647767657777,\n        26.502398302370402,\n        26.35836957803332,\n        26.500074767531167,\n        26.756825197135797,\n        26.61465102248258,\n        26.883696729797684,\n        26.721789269674048,\n        26.79997215000162,\n        26.507070394509714,\n        26.713321001102937,\n        26.675860686076554,\n        26.789316392947274,\n        26.734816868011468,\n        26.925030375521263,\n        26.78185686006352,\n        26.828669365171432,\n        26.898265564136672,\n        26.997670098960125,\n        26.58346725765291,\n        26.98360963815846,\n        26.949306773332108,\n        26.866127011603215,\n        26.904331611316476,\n        27.018117921387102,\n        26.913934278398166,\n        26.942785909195926,\n        27.0572525358541,\n        26.85422753198156,\n        26.802436434084658,\n        26.912846528111146,\n        26.953078963074674,\n        26.871379332530296,\n        27.133054445759683,\n        27.06329828105225,\n        26.86718681382005,\n        26.784242612220062,\n        27.129768449971873,\n        
26.899663029998653,\n        27.092409754576504,\n        26.943686185842243,\n        27.126004030091234,\n        27.075997107782985,\n        27.00478223988364,\n        26.954881553754195,\n        27.128120737317083,\n        27.26391501026856,\n        26.62878925470602,\n        27.312878507222614,\n        27.209355420020994,\n        27.262608107937105,\n        27.206222758165307,\n        27.0120863356424,\n        27.192331535808393,\n        27.290430172994057,\n        27.383470471744634,\n        27.256051277576322,\n        27.192774668712882,\n        27.236724746496336,\n        27.286242265193405,\n        27.33987699486662,\n        27.134929513717836,\n        27.33610965800927,\n        27.342853607308985,\n        27.230874367930866,\n        27.32615844944727,\n        27.32991611463294,\n        27.49000317352162,\n        27.216721211062875,\n        27.2486755565496,\n        27.509170514821,\n        27.211931598018147,\n        27.182387761535466,\n        27.24112250621991,\n        27.222793413598858,\n        27.38562263124806,\n        27.445904418318488,\n        27.447368426403077,\n        27.37308873644017,\n        27.44656335043385,\n        27.377184098808225,\n        27.547445306617423,\n        27.4595219676506,\n        27.53880062124529,\n        27.625638695312645,\n        27.5723796241114,\n        27.136062660248417,\n        27.226319032746265,\n        27.29633140116729,\n        27.631505760426187,\n        27.674954931232374,\n        27.728625374929177,\n        27.591904928550058,\n        27.576594539421116,\n        27.65714041002535,\n        27.646333039333086,\n        27.665661961035934,\n        27.68377053604231,\n        27.692063291596124,\n        27.649428242882358,\n        27.534870946917117,\n        27.32500127104943,\n        27.567980196771007,\n        27.681384583879268,\n        27.433536908880416,\n        27.378598361090592,\n        27.700985322336567,\n        27.209901228897266,\n      
  27.678789740723424,\n        27.723840546913753,\n        27.614518330308616,\n        27.688873802362004,\n        27.62059429477935,\n        27.801427670180885,\n        27.670245048845366,\n        27.763636926151346,\n        27.70675009152503,\n        27.64940661773334,\n        27.4335089027518,\n        27.349631559524077,\n        27.39953333193096,\n        27.055275052523694,\n        27.54209773920211,\n        27.62560556671412,\n        27.71488282497128,\n        27.664155463406473,\n        27.47051615110866,\n        27.654433053969747,\n        27.63343809662031,\n        27.606025816266033,\n        27.609445142855385,\n        27.528261681132612,\n        27.64719501683486,\n        27.55462395513759,\n        27.4532317127279,\n        27.543209949117035,\n        27.420699812464836,\n        27.296464677297084,\n        27.363041088328057,\n        27.564845376439095,\n        27.647633517497994,\n        27.649257684453776,\n        27.564422745631916,\n        27.479429303596643,\n        27.70740389878346,\n        27.75087319581631,\n        27.67755023901446,\n        27.605301469431296,\n        27.607315751349333,\n        27.5710917990152,\n        27.589205013009703,\n        27.651066017265375,\n        27.462911639913305,\n        27.790245977010994,\n        27.575784425550502,\n        27.761994983293686,\n        27.607845196094743,\n        27.911244281136735,\n        27.74855794849233,\n        27.75745937747287,\n        27.75661492897514,\n        27.675667899653263,\n        27.699200621330775,\n        27.54465793316485,\n        27.661892369297806,\n        27.6970472029542,\n        27.52901228320329,\n        26.743230879922507,\n        26.990685311555517,\n        27.34539146443403,\n        27.5018051654596,\n        27.644637825924267,\n        27.723576723331842,\n        27.55805454031313,\n        27.690556460208317,\n        27.597992245732584,\n        27.4626199942016,\n        27.627549588268174,\n        
27.67728166542562,\n        27.596896737593983,\n        27.434762114958083,\n        27.08431125696573,\n        27.458241416131465,\n        27.57888616087306,\n        27.43995994393739,\n        27.518333940197355,\n        27.390749467406465,\n        27.35386087119415,\n        27.525564231159255,\n        27.602347628922445,\n        27.23495997563261,\n        27.416147599069753,\n        27.50096654763,\n        27.62275519246325,\n        27.400365606658994,\n        27.433065480233566,\n        27.42234033103539,\n        27.51588385631785,\n        27.61102157868808,\n        27.629601057877036,\n        27.71414576486763,\n        27.638847101139316,\n        27.447860172670634,\n        27.59945093674042,\n        27.515394478263605,\n        27.5893211887083,\n        27.587017995854602,\n        27.57611290440029,\n        27.546331425135435,\n        27.559619073896844,\n        27.412269601066168,\n        27.691375433619754,\n        27.555580339779763,\n        27.673951594656256,\n        27.609451507297706,\n        27.621196011145194,\n        27.506598606210353,\n        27.707851219599547,\n        27.818420700582276,\n        27.51418845748706,\n        27.7430355760454,\n        27.723090472019116,\n        27.478483040480732,\n        27.317902256390436,\n        27.372293722132902,\n        27.686639453016497,\n        27.65690428848761,\n        27.395440888850807,\n        27.52792304343521,\n        27.52742960163641,\n        27.675669095177042,\n        27.758840469215265,\n        27.58115012938146,\n        27.81812255636417,\n        27.190757018784822,\n        27.375990397421095,\n        27.75719466939146,\n        27.623855557976576,\n        27.58026164356946,\n        27.749223344499686,\n        27.89389499473171,\n        27.608558293475895,\n        27.98247433110001,\n        27.800560273039114,\n        27.871598481486938,\n        27.67552564582866,\n        27.637531189824664,\n        27.672534330671002,\n        
27.537895380241416,\n        27.7179898811072,\n        27.71181110280576,\n        27.75002013063051,\n        27.917199121877417,\n        27.612882484909445,\n        28.020438773362084,\n        27.785153456248054,\n        27.8962429311074,\n        27.862730599851073,\n        27.906056862893415,\n        27.8207261742784,\n        27.763832370520205,\n        27.582263225940892,\n        27.412546214292902,\n        27.39258011828865\n      ],\n      \"return_max\": [\n        0.0,\n        -1.8461055829240518,\n        -5.942355756644519,\n        -5.9006095331928,\n        -5.856255534230954,\n        -5.914460580161098,\n        -5.85241248289582,\n        -5.864111475552989,\n        -5.804453230466926,\n        -5.6830460400309555,\n        -5.506933160465331,\n        -5.4175174447589844,\n        -5.359488190945507,\n        -5.308367892895443,\n        -5.2589328633315615,\n        -5.267298729058026,\n        -5.214023800373518,\n        -5.198052861340272,\n        -5.123229451670625,\n        -5.156001497569283,\n        -5.103431427342837,\n        -5.055902765241641,\n        -5.043813819667567,\n        -5.057252900213973,\n        -5.057806499452236,\n        -5.06703546475203,\n        -5.061035838654163,\n        -5.086345796167563,\n        -5.073063570383997,\n        -5.117057897839193,\n        -5.219578639436179,\n        -5.389155524005456,\n        -5.469432583537717,\n        -5.516193691878572,\n        -5.681673521800347,\n        -6.22738090769805,\n        -6.268658085553801,\n        -6.608752678183182,\n        -6.976422296021613,\n        -6.954614451139375,\n        -8.134812316499373,\n        -7.990441271395638,\n        -8.567642894644388,\n        -8.428344372864386,\n        -9.299825155864687,\n        -9.106401153924475,\n        -9.200682577367033,\n        -9.378366736591616,\n        -9.745470979513067,\n        -9.742103913047384,\n        -9.422968324498157,\n        -8.901402225226239,\n        
-8.019583839363618,\n        -7.448260443780973,\n        -6.405231656881797,\n        -6.1840681406084865,\n        -5.834782468207012,\n        -5.094988223349939,\n        -3.93553547793352,\n        -3.490495302773597,\n        -2.994445989869707,\n        -2.8235836607632274,\n        -2.103287804291622,\n        -0.6403185111770264,\n        0.035863029969431026,\n        1.0073427402558257,\n        1.532209057133697,\n        2.7726072848879553,\n        2.8081376364025368,\n        3.1871342515550958,\n        4.6484206853267285,\n        5.7025259911138075,\n        6.517148543204527,\n        6.4248764090056705,\n        6.911689085543606,\n        7.106302787584154,\n        7.120391143245384,\n        7.409227903032649,\n        8.81531234343541,\n        9.152119598584784,\n        9.629604117928073,\n        9.871659981375924,\n        9.980799964419491,\n        10.239078280706348,\n        10.665051539486528,\n        11.17499911495086,\n        11.352110374730874,\n        11.716072407421665,\n        11.793340726197457,\n        12.532528781846391,\n        13.41300069231697,\n        13.787995263804998,\n        14.045859866330192,\n        14.318404852129634,\n        14.823225912191589,\n        15.011388233368248,\n        15.810413743748871,\n        15.845918179408075,\n        16.42679607605157,\n        17.042502963438015,\n        17.5322719144802,\n        17.511217448154543,\n        17.783073288516484,\n        18.057637026964702,\n        18.089641710712,\n        18.432675868868287,\n        18.811617285742997,\n        19.3444661555248,\n        19.578936802676207,\n        19.69337685473273,\n        20.312429549115745,\n        20.30085350358694,\n        20.54415502982737,\n        20.877849823241156,\n        20.824529337001273,\n        21.210089398109815,\n        21.651415401175065,\n        21.83622547031005,\n        22.013317122700702,\n        22.194857313082466,\n        22.202669964534508,\n        
22.330329852086766,\n        22.362560935187812,\n        22.492343886515947,\n        22.634528154467112,\n        22.71175927321735,\n        22.819953356475224,\n        22.860376653874628,\n        22.902712428342866,\n        23.20727519754466,\n        23.192808819710784,\n        23.22911743721545,\n        23.331330083633624,\n        23.636531121974077,\n        23.283443071826305,\n        23.61383549955233,\n        23.87857539351659,\n        24.116213726140934,\n        24.109804467241606,\n        24.080953585685656,\n        24.182408993485282,\n        24.22503070722041,\n        24.277031589350567,\n        24.28086167621725,\n        24.338310248869753,\n        24.086170670411896,\n        24.247355891518083,\n        24.392246887249893,\n        24.265509504181285,\n        24.466274252294614,\n        24.40196999746903,\n        24.540974808852788,\n        24.79174308944657,\n        24.806109729137226,\n        25.012085050800383,\n        25.07635556274646,\n        24.932140267960776,\n        24.971801947997374,\n        25.19090317956686,\n        25.150228501941697,\n        25.09430264405376,\n        24.875337720343182,\n        24.963633271032617,\n        25.100993573436423,\n        25.400251953083824,\n        25.308705189496003,\n        25.274699549334613,\n        25.420781172317277,\n        25.33391303842891,\n        25.567425205204433,\n        25.594633043331076,\n        25.46100481031832,\n        25.412121431115633,\n        25.58028064758909,\n        25.60693896124021,\n        25.66569445434512,\n        25.76865689350919,\n        25.810841389373525,\n        25.85967269055498,\n        25.962069220008672,\n        25.91996049679049,\n        25.963324765887016,\n        25.91702973971557,\n        26.157279117211537,\n        25.843831924082476,\n        25.94582477903816,\n        26.212889252978872,\n        26.286870962849054,\n        26.40751196976646,\n        26.284033846879908,\n        26.35805794342963,\n  
      26.52703460808759,\n        26.2964527617929,\n        26.55949959075015,\n        26.55729202886028,\n        26.68286031603485,\n        26.538413810723785,\n        26.71385175832885,\n        26.74395183803158,\n        26.68025527657551,\n        26.801870785928887,\n        26.851465829263198,\n        26.93680166694812,\n        26.961005778922402,\n        26.854616119279207,\n        26.88584927427365,\n        27.0105294896993,\n        27.044895596199282,\n        26.776704929518832,\n        27.13962236257852,\n        27.186412255105285,\n        27.179912100624218,\n        27.210794725487762,\n        27.269374471255276,\n        27.174719160454647,\n        27.31715674626472,\n        27.265195299454607,\n        27.263631864724005,\n        27.39627078043902,\n        27.34258350620992,\n        27.31068089509436,\n        27.32687119814303,\n        27.408540368280352,\n        27.320285751169344,\n        27.138582350978016,\n        27.197517685992338,\n        27.2787107390495,\n        27.429846525353543,\n        27.48308777741448,\n        27.327727808964294,\n        27.617757727847625,\n        27.54328077647833,\n        27.56872278307807,\n        27.386999886827365,\n        27.667973934412352,\n        27.480072363147407,\n        27.53769003346666,\n        27.639470008663924,\n        27.564638796832078,\n        27.496486817688115,\n        27.581619393566044,\n        27.63753434062462,\n        27.537875337075523,\n        27.543520995320314,\n        27.594245505106223,\n        27.7338456626001,\n        27.817130219783696,\n        27.709286722768926,\n        27.734779079980775,\n        27.703104502947088,\n        27.71827765573629,\n        27.83619181533567,\n        27.81285345818395,\n        27.806645067608518,\n        27.80194241315825,\n        27.76007516064274,\n        27.82463922517078,\n        27.84975823090078,\n        27.817606856835535,\n        27.922068700168428,\n        28.013535019237718,\n       
 28.006340417032497,\n        27.961943661461866,\n        27.910805526554405,\n        27.9567181634562,\n        28.088161460328006,\n        28.160896911964333,\n        28.03687692541058,\n        28.11608027764798,\n        28.100589114837057,\n        28.143973714335537,\n        28.13973034697916,\n        28.070707581372414,\n        28.14186843743203,\n        28.14093486576299,\n        28.180976497640092,\n        28.129040745911777,\n        28.29262308819543,\n        28.35819999517732,\n        28.262551527064023,\n        28.17242402842691,\n        28.16685204495492,\n        27.98908130303733,\n        27.971442662104895,\n        28.04052773964313,\n        28.194933331563192,\n        28.21196307922624,\n        28.398280042595943,\n        28.378426582146385,\n        28.41819677847299,\n        28.41000411431775,\n        28.355294606770748,\n        28.46682456965755,\n        28.401332730195488,\n        28.3555902248825,\n        28.420588015446206,\n        28.453976126225054,\n        28.43060360612622,\n        28.42658227623985,\n        28.403042563670418,\n        28.426569470592217,\n        28.531382203520955,\n        28.57837672969091,\n        28.596598307465857,\n        28.4737160545871,\n        28.583848825507353,\n        28.250870453149908,\n        28.39964222423336,\n        28.582844183613325,\n        28.365761112111585,\n        28.462431198995546,\n        28.5491104986275,\n        28.52845884051301,\n        28.59132089804707,\n        28.697558712879154,\n        28.48288486649054,\n        28.66211898955232,\n        28.720465297452353,\n        28.62806434965292,\n        28.567655194519812,\n        28.40326890288394,\n        28.25374573334743,\n        28.351523567772965,\n        28.686339435546262,\n        28.59770539021719,\n        28.596773880026756,\n        28.63314084601913,\n        28.63569232116834,\n        28.702060685658385,\n        28.619119107857415,\n        28.678181092801367,\n        
28.71163400403715,\n        28.61188169506226,\n        28.7386676440666,\n        28.683640034738126,\n        28.541454812957095,\n        28.63711326356703,\n        28.69542892643493,\n        28.600919930656502,\n        28.597689676784487,\n        28.672876331241174,\n        28.709342971388953,\n        28.693955947482817,\n        28.329047789235698,\n        28.270712557482234,\n        28.60880616394475,\n        28.442675540936182,\n        28.672081909762998,\n        28.76269899732062,\n        28.758401880350853,\n        28.770482450465273,\n        28.636978220299326,\n        28.778900852067242,\n        28.629921597826367,\n        28.781905701704325,\n        28.858760451114772,\n        28.652920968857273,\n        28.56684427605041,\n        28.767310387158314,\n        28.75147373932369,\n        28.634802542315008,\n        28.51926408156356,\n        28.70338942286469,\n        28.62303774802057,\n        28.685713036219873,\n        28.728758469643523,\n        28.67635878914168,\n        28.75721728949732,\n        28.56298224078308,\n        28.766505036062266,\n        28.634195551825503,\n        28.748594198476127,\n        28.64870496215311,\n        28.745331557698126,\n        28.683312983072785,\n        28.626677838419578,\n        28.681126150660518,\n        28.795726798460656,\n        28.757144496535805,\n        28.577226605488086,\n        28.681473177482456,\n        28.760889685404365,\n        28.74228846342441,\n        28.658412317188745,\n        28.786052544647482,\n        28.75495418425858,\n        28.600933120176805,\n        28.760873099274633,\n        28.693675765876716,\n        28.551481162007267,\n        28.725814368956623,\n        28.79415552475671,\n        28.743963090640232,\n        28.734863817526268,\n        28.566545314785092,\n        28.823098278619486,\n        28.156893214359613,\n        28.612518041199017,\n        28.74705487866705,\n        28.61278583140767,\n        28.72307807897606,\n 
       28.68793753335112,\n        28.692956700752664,\n        28.803710654062744,\n        28.86342022913714,\n        28.84833624243771,\n        28.85949429015124,\n        28.821595637152704,\n        28.852835471194876,\n        28.817591413264847,\n        28.72423722172097,\n        28.710836190395927,\n        28.88692886632797,\n        28.7053880821483,\n        28.774013789112196,\n        28.662860260386143,\n        28.79822111112021,\n        28.808628019217544,\n        28.85276102010336,\n        28.771181318446978,\n        28.908834137894033,\n        28.85579874848611,\n        28.890888993253977,\n        28.848728020793402,\n        28.709021758836165,\n        28.462931191839804,\n        28.917714321761146,\n        28.875846241560996,\n        28.88680133357795,\n        28.972438730893764,\n        28.802010216370572,\n        28.87394771604405,\n        28.92969637966023,\n        28.89161006772763,\n        28.834618183273626,\n        28.878542363267485,\n        28.92712663123022,\n        28.873554074406456,\n        28.915520549210623,\n        28.908137396095636,\n        28.793859203233534,\n        28.926888627586624,\n        28.909075735628452,\n        28.801053649246313,\n        28.956274434873805,\n        28.85351939477016,\n        28.939984909664357,\n        28.89357806281323,\n        28.898246390008985,\n        28.69677817945649,\n        28.76230927723095,\n        28.660130319230213,\n        28.795664359490793,\n        28.97442098448534,\n        28.804882468754947,\n        28.81231306915242,\n        28.90048542404371,\n        28.94753999819369,\n        28.83492346942782,\n        28.409316330602834,\n        28.42333477999669,\n        28.748688250879383,\n        28.706269024766705,\n        28.633549596908715,\n        28.874302095920925,\n        28.705128326549733,\n        28.883376647346335,\n        28.84215446744502,\n        28.690961128379946,\n        28.79284085140264,\n        
28.67231692219214,\n        28.717414174020156,\n        28.713685795180428,\n        28.61269640828595,\n        28.6245972262438,\n        28.67272102371773,\n        28.69483747703352,\n        28.661568300251197,\n        28.801075406577,\n        28.67019932663518,\n        28.750051006766387,\n        28.803437084207236,\n        28.869076710128372,\n        28.70201001971411,\n        28.53336427531146,\n        28.69181279541776,\n        28.674908862840148,\n        28.748490512591367,\n        28.80004539315285,\n        28.8705797324927,\n        28.707019828640863,\n        28.922864233127992,\n        28.795304714324864,\n        28.897806471765325,\n        28.80391957460775,\n        28.78242060241827,\n        28.70729365133907,\n        28.89739906461977,\n        28.711171772253486,\n        28.754616674551905,\n        28.932490364601158\n      ]\n    }\n  },\n  \"Isaac-Velocity-Rough-G1-v0\": {\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        473.25,\n        946.5,\n        1419.75,\n        1893.0,\n        2366.25,\n        2839.5,\n        3312.75,\n        3786.0\n      ],\n      \"env_step\": [\n        0,\n        20480000.0,\n        40960000.0,\n        61440000.0,\n        81920000.0,\n        102400000.0,\n        122880000.0,\n        143360000.0,\n        163840000.0\n      ],\n      \"return\": [\n        0.0,\n        18.40044911702474,\n        27.28994305928548,\n        33.946146647135414,\n        35.17380905151367,\n        36.433878580729164,\n        37.3122189839681,\n        38.439432779947914,\n        37.77874247233073\n      ],\n      \"return_min\": [\n        0.0,\n        17.450108886655634,\n        25.337943364273244,\n        32.46642651481381,\n        32.07014682726799,\n        34.59316721803161,\n        34.879719062140424,\n        37.541568810964186,\n        36.28540258902592\n      ],\n      \"return_max\": [\n        0.0,\n        19.350789347393842,\n        29.241942754297717,\n        
35.42586677945702,\n        38.27747127575935,\n        38.27458994342672,\n        39.74471890579578,\n        39.33729674893164,\n        39.27208235563553\n      ]\n    },\n    \"null\": {\n      \"time\": [\n        0.0,\n        9.782,\n        19.564,\n        29.346,\n        39.128,\n        48.91,\n        58.692,\n        68.474,\n        78.256,\n        88.038,\n        97.82,\n        107.602,\n        117.384,\n        127.166,\n        136.948,\n        146.73,\n        156.512,\n        166.294,\n        176.076,\n        185.858,\n        195.64,\n        205.422,\n        215.204,\n        224.986,\n        234.768,\n        244.55,\n        254.332,\n        264.114,\n        273.896,\n        283.678,\n        293.46,\n        303.242,\n        313.024,\n        322.806,\n        332.588,\n        342.37,\n        352.152,\n        361.934,\n        371.716,\n        381.498,\n        391.28,\n        401.062,\n        410.844,\n        420.626,\n        430.408,\n        440.19,\n        449.972,\n        459.754,\n        469.536,\n        479.318,\n        489.1,\n        498.882,\n        508.664,\n        518.446,\n        528.228,\n        538.01,\n        547.792,\n        557.574,\n        567.356,\n        577.138,\n        586.92,\n        596.702,\n        606.484,\n        616.266,\n        626.048,\n        635.83,\n        645.612,\n        655.394,\n        665.176,\n        674.958,\n        684.74,\n        694.522,\n        704.304,\n        714.086,\n        723.868,\n        733.65,\n        743.432,\n        753.214,\n        762.996,\n        772.778,\n        782.56,\n        792.342,\n        802.124,\n        811.906,\n        821.688,\n        831.47,\n        841.252,\n        851.034,\n        860.816,\n        870.598,\n        880.38,\n        890.162,\n        899.944,\n        909.726,\n        919.508,\n        929.29,\n        939.072,\n        948.854,\n        958.636,\n        968.418,\n        978.2,\n       
 987.982,\n        997.764,\n        1007.546,\n        1017.328,\n        1027.11,\n        1036.892,\n        1046.674,\n        1056.456,\n        1066.238,\n        1076.02,\n        1085.802,\n        1095.584,\n        1105.366,\n        1115.148,\n        1124.93,\n        1134.712,\n        1144.494,\n        1154.276,\n        1164.058,\n        1173.84,\n        1183.622,\n        1193.404,\n        1203.186,\n        1212.968,\n        1222.75,\n        1232.532,\n        1242.314,\n        1252.096,\n        1261.878,\n        1271.66,\n        1281.442,\n        1291.224,\n        1301.006,\n        1310.788,\n        1320.57,\n        1330.352,\n        1340.134,\n        1349.916,\n        1359.698,\n        1369.48,\n        1379.262,\n        1389.044,\n        1398.826,\n        1408.608,\n        1418.39,\n        1428.172,\n        1437.954,\n        1447.736,\n        1457.518,\n        1467.3,\n        1477.082,\n        1486.864,\n        1496.646,\n        1506.428,\n        1516.21,\n        1525.992,\n        1535.774,\n        1545.556,\n        1555.338,\n        1565.12,\n        1574.902,\n        1584.684,\n        1594.466,\n        1604.248,\n        1614.03,\n        1623.812,\n        1633.594,\n        1643.376,\n        1653.158,\n        1662.94,\n        1672.722,\n        1682.504,\n        1692.286,\n        1702.068,\n        1711.85,\n        1721.632,\n        1731.414,\n        1741.196,\n        1750.978,\n        1760.76,\n        1770.542,\n        1780.324,\n        1790.106,\n        1799.888,\n        1809.67,\n        1819.452,\n        1829.234,\n        1839.016,\n        1848.798,\n        1858.58,\n        1868.362,\n        1878.144,\n        1887.926,\n        1897.708,\n        1907.49,\n        1917.272,\n        1927.054,\n        1936.836,\n        1946.618,\n        1956.4,\n        1966.182,\n        1975.964,\n        1985.746,\n        1995.528,\n        2005.31,\n        2015.092,\n        
2024.874,\n        2034.656,\n        2044.438,\n        2054.22,\n        2064.002,\n        2073.784,\n        2083.566,\n        2093.348,\n        2103.13,\n        2112.912,\n        2122.694,\n        2132.476,\n        2142.258,\n        2152.04,\n        2161.822,\n        2171.604,\n        2181.386,\n        2191.168,\n        2200.95,\n        2210.732,\n        2220.514,\n        2230.296,\n        2240.078,\n        2249.86,\n        2259.642,\n        2269.424,\n        2279.206,\n        2288.988,\n        2298.77,\n        2308.552,\n        2318.334,\n        2328.116,\n        2337.898,\n        2347.68,\n        2357.462,\n        2367.244,\n        2377.026,\n        2386.808,\n        2396.59,\n        2406.372,\n        2416.154,\n        2425.936,\n        2435.718,\n        2445.5,\n        2455.282,\n        2465.064,\n        2474.846,\n        2484.628,\n        2494.41,\n        2504.192,\n        2513.974,\n        2523.756,\n        2533.538,\n        2543.32,\n        2553.102,\n        2562.884,\n        2572.666,\n        2582.448,\n        2592.23,\n        2602.012,\n        2611.794,\n        2621.576,\n        2631.358,\n        2641.14,\n        2650.922,\n        2660.704,\n        2670.486,\n        2680.268,\n        2690.05,\n        2699.832,\n        2709.614,\n        2719.396,\n        2729.178,\n        2738.96,\n        2748.742,\n        2758.524,\n        2768.306,\n        2778.088,\n        2787.87,\n        2797.652,\n        2807.434,\n        2817.216,\n        2826.998,\n        2836.78,\n        2846.562,\n        2856.344,\n        2866.126,\n        2875.908,\n        2885.69,\n        2895.472,\n        2905.254,\n        2915.036,\n        2924.818,\n        2934.6,\n        2944.382,\n        2954.164,\n        2963.946,\n        2973.728,\n        2983.51,\n        2993.292,\n        3003.074,\n        3012.856,\n        3022.638,\n        3032.42,\n        3042.202,\n        3051.984,\n        
3061.766,\n        3071.548,\n        3081.33,\n        3091.112,\n        3100.894,\n        3110.676,\n        3120.458,\n        3130.24,\n        3140.022,\n        3149.804,\n        3159.586,\n        3169.368,\n        3179.15,\n        3188.932,\n        3198.714,\n        3208.496,\n        3218.278,\n        3228.06,\n        3237.842,\n        3247.624,\n        3257.406,\n        3267.188,\n        3276.97,\n        3286.752,\n        3296.534,\n        3306.316,\n        3316.098,\n        3325.88,\n        3335.662,\n        3345.444,\n        3355.226,\n        3365.008,\n        3374.79,\n        3384.572,\n        3394.354,\n        3404.136,\n        3413.918,\n        3423.7,\n        3433.482,\n        3443.264,\n        3453.046,\n        3462.828,\n        3472.61,\n        3482.392,\n        3492.174,\n        3501.956,\n        3511.738,\n        3521.52,\n        3531.302,\n        3541.084,\n        3550.866,\n        3560.648,\n        3570.43,\n        3580.212,\n        3589.994,\n        3599.776,\n        3609.558,\n        3619.34,\n        3629.122,\n        3638.904,\n        3648.686,\n        3658.468,\n        3668.25,\n        3678.032,\n        3687.814,\n        3697.596,\n        3707.378,\n        3717.16,\n        3726.942,\n        3736.724,\n        3746.506,\n        3756.288,\n        3766.07,\n        3775.852,\n        3785.634,\n        3795.416,\n        3805.198,\n        3814.98,\n        3824.762,\n        3834.544,\n        3844.326,\n        3854.108,\n        3863.89,\n        3873.672,\n        3883.454,\n        3893.236,\n        3903.018,\n        3912.8,\n        3922.582,\n        3932.364,\n        3942.146,\n        3951.928,\n        3961.71,\n        3971.492,\n        3981.274,\n        3991.056,\n        4000.838,\n        4010.62,\n        4020.402,\n        4030.184,\n        4039.966,\n        4049.748,\n        4059.53,\n        4069.312,\n        4079.094,\n        4088.876,\n        
4098.658,\n        4108.44,\n        4118.222,\n        4128.004,\n        4137.786,\n        4147.568,\n        4157.35,\n        4167.132,\n        4176.914,\n        4186.696,\n        4196.478,\n        4206.26,\n        4216.042,\n        4225.824,\n        4235.606,\n        4245.388,\n        4255.17,\n        4264.952,\n        4274.734,\n        4284.516,\n        4294.298,\n        4304.08,\n        4313.862,\n        4323.644,\n        4333.426,\n        4343.208,\n        4352.99,\n        4362.772,\n        4372.554,\n        4382.336,\n        4392.118,\n        4401.9,\n        4411.682,\n        4421.464,\n        4431.246,\n        4441.028,\n        4450.81,\n        4460.592,\n        4470.374,\n        4480.156,\n        4489.938,\n        4499.72,\n        4509.502,\n        4519.284,\n        4529.066,\n        4538.848,\n        4548.63,\n        4558.412,\n        4568.194,\n        4577.976,\n        4587.758,\n        4597.54,\n        4607.322,\n        4617.104,\n        4626.886,\n        4636.668,\n        4646.45,\n        4656.232,\n        4666.014,\n        4675.796,\n        4685.578,\n        4695.36,\n        4705.142,\n        4714.924,\n        4724.706,\n        4734.488,\n        4744.27,\n        4754.052,\n        4763.834,\n        4773.616,\n        4783.398,\n        4793.18,\n        4802.962,\n        4812.744,\n        4822.526,\n        4832.308,\n        4842.09,\n        4851.872,\n        4861.654,\n        4871.436,\n        4881.218,\n        4891.0\n      ],\n      \"env_step\": [\n        0,\n        2064384,\n        2654208,\n        3833856,\n        4128768,\n        5308416,\n        5701632,\n        5898240,\n        5996544,\n        6586368,\n        7471104,\n        7667712,\n        7864320,\n        10125312,\n        10321920,\n        10420224,\n        11010048,\n        11108352,\n        11304960,\n        11698176,\n        12582912,\n        12976128,\n        13565952,\n        
13860864,\n        14057472,\n        14352384,\n        15335424,\n        15925248,\n        17301504,\n        18481152,\n        19759104,\n        19857408,\n        21528576,\n        21921792,\n        22020096,\n        22216704,\n        22609920,\n        22806528,\n        23592960,\n        24379392,\n        24576000,\n        24772608,\n        26443776,\n        26542080,\n        28508160,\n        29294592,\n        29687808,\n        30375936,\n        30474240,\n        31358976,\n        31653888,\n        31948800,\n        32342016,\n        32735232,\n        33325056,\n        34701312,\n        34996224,\n        35487744,\n        37060608,\n        37748736,\n        40108032,\n        40402944,\n        40501248,\n        40796160,\n        41582592,\n        42860544,\n        42958848,\n        43253760,\n        44531712,\n        44826624,\n        44924928,\n        45023232,\n        45711360,\n        45809664,\n        46202880,\n        46301184,\n        46989312,\n        47185920,\n        47579136,\n        47775744,\n        48365568,\n        50135040,\n        50528256,\n        50724864,\n        52199424,\n        53182464,\n        53379072,\n        53673984,\n        54067200,\n        54558720,\n        55050240,\n        56328192,\n        56721408,\n        57311232,\n        57507840,\n        58982400,\n        59768832,\n        60260352,\n        62128128,\n        62324736,\n        62619648,\n        63012864,\n        63111168,\n        63307776,\n        65077248,\n        65175552,\n        65372160,\n        65863680,\n        66355200,\n        66945024,\n        67436544,\n        68517888,\n        69697536,\n        69894144,\n        70483968,\n        71368704,\n        71467008,\n        71565312,\n        72548352,\n        74121216,\n        74416128,\n        75005952,\n        75300864,\n        75399168,\n        76185600,\n        76775424,\n        79527936,\n        79921152,\n        
80314368,\n        80805888,\n        82182144,\n        83558400,\n        83656704,\n        83951616,\n        84541440,\n        84738048,\n        85524480,\n        85622784,\n        86409216,\n        86802432,\n        87392256,\n        87490560,\n        87588864,\n        87687168,\n        87785472,\n        87883776,\n        88375296,\n        88965120,\n        89161728,\n        89260032,\n        89849856,\n        90341376,\n        91422720,\n        91815936,\n        91914240,\n        92209152,\n        92700672,\n        93290496,\n        93487104,\n        94175232,\n        94371840,\n        94961664,\n        95059968,\n        95354880,\n        95748096,\n        95944704,\n        96239616,\n        96436224,\n        96632832,\n        98107392,\n        98304000,\n        98500608,\n        98697216,\n        98893824,\n        99090432,\n        99385344,\n        100073472,\n        100270080,\n        100761600,\n        101056512,\n        101842944,\n        102039552,\n        103219200,\n        104005632,\n        104693760,\n        104890368,\n        105086976,\n        105676800,\n        106266624,\n        107544576,\n        108724224,\n        109510656,\n        110198784,\n        110297088,\n        110788608,\n        111083520,\n        111476736,\n        111771648,\n        112656384,\n        114524160,\n        114622464,\n        116883456,\n        117080064,\n        117276672,\n        117374976,\n        117964800,\n        118358016,\n        118849536,\n        119046144,\n        119537664,\n        122486784,\n        122585088,\n        122978304,\n        123863040,\n        124059648,\n        124846080,\n        125927424,\n        126124032,\n        126615552,\n        127205376,\n        127500288,\n        129466368,\n        130351104,\n        130547712,\n        132710400,\n        133300224,\n        133791744,\n        134086656,\n        134873088,\n        135168000,\n        
135757824,\n        136544256,\n        137723904,\n        138018816,\n        139001856,\n        139395072,\n        140574720,\n        140771328,\n        141656064,\n        141950976,\n        142245888,\n        142540800,\n        142639104,\n        143228928,\n        144113664,\n        144211968,\n        144310272,\n        145195008,\n        145981440,\n        146374656,\n        147554304,\n        147652608,\n        147947520,\n        149028864,\n        150110208,\n        150601728,\n        150798336,\n        151388160,\n        152469504,\n        152567808,\n        152666112,\n        153255936,\n        153845760,\n        154238976,\n        154435584,\n        156106752,\n        156303360,\n        157384704,\n        157581312,\n        157974528,\n        158564352,\n        159645696,\n        160628736,\n        161021952,\n        161120256,\n        162791424,\n        164265984,\n        165249024,\n        165937152,\n        166232064,\n        166526976,\n        166920192,\n        167018496,\n        167116800,\n        167313408,\n        167510016,\n        167804928,\n        167903232,\n        168198144,\n        168296448,\n        168886272,\n        169181184,\n        169279488,\n        170950656,\n        171343872,\n        171835392,\n        172720128,\n        173899776,\n        174391296,\n        174784512,\n        175177728,\n        175767552,\n        175964160,\n        176259072,\n        176357376,\n        176455680,\n        176553984,\n        177438720,\n        177635328,\n        178028544,\n        178520064,\n        178618368,\n        178913280,\n        179109888,\n        179404800,\n        180387840,\n        180977664,\n        182157312,\n        182353920,\n        184418304,\n        185204736,\n        185303040,\n        185892864,\n        186679296,\n        187072512,\n        187170816,\n        187564032,\n        187662336,\n        188841984,\n        189136896,\n        
189628416,\n        190119936,\n        191004672,\n        192479232,\n        192774144,\n        193069056,\n        193855488,\n        194641920,\n        194740224,\n        195723264,\n        196313088,\n        197296128,\n        198475776,\n        199163904,\n        201031680,\n        201424896,\n        201916416,\n        202113024,\n        202604544,\n        202997760,\n        204079104,\n        205750272,\n        206241792,\n        206635008,\n        206929920,\n        207618048,\n        207716352,\n        209092608,\n        209879040,\n        210272256,\n        210370560,\n        210763776,\n        210862080,\n        211746816,\n        211943424,\n        213811200,\n        214892544,\n        216858624,\n        217055232,\n        217743360,\n        218628096,\n        218923008,\n        219217920,\n        219316224,\n        219414528,\n        219906048,\n        221577216,\n        222461952,\n        223739904,\n        223838208,\n        224428032,\n        224821248,\n        225116160,\n        226000896,\n        226983936,\n        227278848,\n        227475456,\n        228556800,\n        228851712,\n        228950016,\n        229244928,\n        230129664,\n        230227968,\n        230817792,\n        231309312,\n        231899136,\n        232783872,\n        233078784,\n        233275392,\n        233373696,\n        233766912,\n        234848256,\n        235536384,\n        237109248,\n        237305856,\n        237404160,\n        237699072,\n        238190592,\n        238780416,\n        239468544,\n        240451584,\n        240746496,\n        242122752,\n        242221056,\n        243105792,\n        243400704,\n        244187136,\n        245268480,\n        245563392,\n        245760000,\n        245858304,\n        246841344,\n        247136256,\n        247529472,\n        247922688,\n        249004032,\n        249200640,\n        249495552,\n        250773504,\n        250970112,\n        
252641280,\n        253427712,\n        253526016,\n        255688704,\n        256868352,\n        257359872,\n        258342912,\n        258637824,\n        259620864,\n        259719168,\n        260210688,\n        260603904,\n        261095424,\n        261292032,\n        261685248,\n        262176768,\n        263651328,\n        263749632,\n        265224192,\n        265912320,\n        266010624,\n        266108928,\n        266502144,\n        266797056,\n        268271616,\n        268959744,\n        269844480,\n        270237696,\n        270925824,\n        271024128,\n        271319040,\n        271417344,\n        271515648,\n        272007168,\n        272302080,\n        272498688,\n        273186816,\n        274268160,\n        274366464,\n        274563072,\n        274759680,\n        275644416,\n        276529152,\n        278102016,\n        278495232,\n        280559616,\n        280756224,\n        280854528,\n        281542656,\n        282230784,\n        282820608,\n        283115520,\n        283705344,\n        284786688,\n        284884992,\n        285769728,\n        286457856,\n        286654464,\n        286851072,\n        286949376,\n        288423936,\n        288522240,\n        290586624,\n        290881536,\n        291471360,\n        292159488,\n        292552704,\n        292749312,\n        293437440,\n        294027264,\n        294420480\n      ],\n      \"return\": [\n        0.0,\n        -5.6881421136856085,\n        -5.5180525922775265,\n        -5.366918431917827,\n        -5.325606632232667,\n        -5.190784207979838,\n        -5.038557925224304,\n        -4.9782898871103916,\n        -4.919183246294657,\n        -4.905417499542236,\n        -4.753361112276713,\n        -4.771880612373352,\n        -4.804907844861348,\n        -5.218108415603638,\n        -5.452975128491719,\n        -5.459520988464355,\n        -5.728807779947917,\n        -6.293901631037394,\n        -6.285780429840088,\n        
-7.291187626520792,\n        -7.811659456888834,\n        -7.735299712816873,\n        -7.949773249626159,\n        -8.23085147778193,\n        -8.023751058578492,\n        -7.658783731460571,\n        -7.70408089876175,\n        -7.493596713145574,\n        -6.802546091179053,\n        -6.02310235808293,\n        -5.458186463266611,\n        -4.963028981884321,\n        -4.4428239252169925,\n        -3.7779746002703902,\n        -3.5969980679887037,\n        -3.5128461730852725,\n        -2.607398334058623,\n        -1.886164164994067,\n        -2.217925976102706,\n        -1.715551134304454,\n        -1.1327023444014288,\n        -0.2374029954833287,\n        0.16780850405494366,\n        0.42279706716537485,\n        1.096491112957398,\n        1.278363335741063,\n        1.5731980310132105,\n        1.7684402238329249,\n        2.3747797200921923,\n        2.706895433639487,\n        3.013346167653799,\n        2.790748157600562,\n        3.2253397530379395,\n        3.0303994294193886,\n        3.191454022967567,\n        3.9127346877443294,\n        4.167857200068732,\n        4.537999338172376,\n        4.813405925780535,\n        5.180030785401662,\n        5.09416914695253,\n        5.239751322865486,\n        5.4623762649297705,\n        5.56985650156935,\n        5.447801009019216,\n        5.8951625898480415,\n        5.910035707950591,\n        6.284413039684296,\n        5.8277599749962485,\n        5.393726678490638,\n        5.226986187398434,\n        5.363927096525828,\n        6.0447085696458815,\n        6.022956263919671,\n        5.796698542435965,\n        5.830887189706167,\n        6.911331553782026,\n        6.52881139477094,\n        6.529581413269042,\n        5.943345656866828,\n        6.304789720376333,\n        6.2125239910682035,\n        6.334142784873644,\n        6.165451030731202,\n        6.199949367642403,\n        6.212140672504901,\n        6.043669728835424,\n        6.123819414041936,\n        6.673933910727501,\n        
6.168897040734688,\n        5.858888900627693,\n        6.185403854250907,\n        6.2324638844529785,\n        6.3189695264895755,\n        5.429789771735668,\n        5.867755074054003,\n        5.9698123177016775,\n        6.2788647194703415,\n        6.108241503039996,\n        6.194222606966893,\n        5.6370969935754935,\n        5.860745827356975,\n        5.627046625390649,\n        5.9018260749181115,\n        5.5154518715043865,\n        5.629180502370001,\n        5.838229054013888,\n        5.671999172481397,\n        5.293166875243187,\n        5.526465571175019,\n        5.529769248962403,\n        5.878975370128949,\n        5.542135804857438,\n        4.932354421118895,\n        5.269491405735414,\n        5.4556330054998385,\n        5.799737800732255,\n        5.829726587931315,\n        5.968212634523709,\n        5.584529157479604,\n        5.912035822945957,\n        5.757281076312065,\n        6.098358916342259,\n        6.190662302970885,\n        6.169422122637431,\n        6.15962440357233,\n        6.283701638529698,\n        6.239324719111124,\n        6.476538527409236,\n        6.383258503476779,\n        6.16934560338656,\n        6.73461858527114,\n        6.619796413257718,\n        7.001251819133759,\n        6.74785814344883,\n        6.963990779990951,\n        7.004205753803254,\n        6.493811826705933,\n        6.724569251736004,\n        6.9595607940355935,\n        7.377430408398311,\n        7.391885064045589,\n        7.427037688096365,\n        7.614314963817596,\n        7.211228293528159,\n        7.1433285713195795,\n        7.39753812789917,\n        8.07717345317205,\n        7.722901459336281,\n        7.50667692820231,\n        7.629707463184992,\n        7.9553125536441796,\n        8.055245854854585,\n        7.869338385264079,\n        7.858060536384582,\n        8.241977485815683,\n        8.327039570013682,\n        8.404392083485922,\n        8.476991500854492,\n        8.417776710490386,\n        
8.851075089772543,\n        8.378894633452099,\n        8.568723888397217,\n        8.807038741906483,\n        8.106232002973558,\n        9.009860292275748,\n        8.951988192399343,\n        8.541935040950776,\n        8.760698347091676,\n        9.00345556338628,\n        9.382102955182392,\n        9.348882626692456,\n        9.577981095314026,\n        9.366613285541534,\n        9.75678734322389,\n        9.434628633260727,\n        9.696422396500905,\n        9.92948784351349,\n        10.117260546286902,\n        10.056760675112406,\n        9.992847542762755,\n        10.470781871875127,\n        10.06678469657898,\n        10.482964709599813,\n        10.941076284249624,\n        10.520150343577066,\n        10.739750129381816,\n        10.971128408114117,\n        10.290067638556163,\n        11.149061189492542,\n        11.524184123675028,\n        11.633825011253357,\n        11.01133109887441,\n        11.827612968285878,\n        11.392092748483023,\n        11.764327395757041,\n        11.70821429570516,\n        11.733774793148042,\n        12.074558056990305,\n        11.616416401068369,\n        11.907227954069773,\n        12.560407905578614,\n        12.36894993464152,\n        12.091517403125764,\n        12.747315545082094,\n        12.558328438599903,\n        12.342031540075938,\n        12.96610297203064,\n        12.804973107973737,\n        12.630402886867524,\n        13.08221505800883,\n        13.075428587595622,\n        13.208603329658509,\n        12.54065419991811,\n        13.22744600137075,\n        13.474049444993335,\n        13.782562464872996,\n        13.70558207432429,\n        13.664181585311889,\n        13.819338383674621,\n        14.275129564603171,\n        14.416780421733856,\n        13.759321572780609,\n        14.259545510609945,\n        14.518339014053346,\n        14.214087019761402,\n        14.85833446741104,\n        14.629418731530507,\n        14.74431406259537,\n        14.71238970677058,\n        
15.07072572628657,\n        14.685916913350423,\n        14.83047775665919,\n        14.68453181187312,\n        14.993108842372893,\n        15.29401432434718,\n        15.640210219224295,\n        15.341570507685345,\n        15.120413868427278,\n        15.916163510481516,\n        15.719678219159443,\n        15.838848996957141,\n        15.478066034317017,\n        15.486124574343364,\n        15.805122829278309,\n        15.930275870958965,\n        15.642760404745738,\n        16.525526213645936,\n        16.194698879718782,\n        16.2467432085673,\n        16.22657354513804,\n        16.767331422170002,\n        16.556936605771384,\n        16.73060949246089,\n        16.508015065987905,\n        16.879581843217213,\n        16.68470219373703,\n        16.331015097300213,\n        17.1932918492953,\n        17.274409716129302,\n        17.57423019170761,\n        17.57290551662445,\n        17.42105441729228,\n        17.225453462600708,\n        17.133760628700255,\n        17.36790020386378,\n        17.69404323577881,\n        17.407497894763946,\n        17.805010588963825,\n        17.68197588523229,\n        17.84539549589157,\n        17.520849231878916,\n        17.575091780821484,\n        18.307277234395347,\n        17.58764229933421,\n        17.703922248681383,\n        18.099228777885436,\n        18.280002871354423,\n        17.969462502002717,\n        18.29429830869039,\n        18.262509983380635,\n        18.301710220972698,\n        18.00693675438563,\n        18.199882732232414,\n        18.125358258883157,\n        18.519552324612935,\n        17.990541600386305,\n        18.569027196566264,\n        19.178397612571718,\n        18.605370388031005,\n        18.528695232868195,\n        18.21270858923594,\n        18.763140827814738,\n        18.657966167926787,\n        18.7651668437322,\n        18.790467648506166,\n        18.698187935352326,\n        18.84629309415817,\n        19.124954300721484,\n        18.934716939926147,\n   
     18.991049098173775,\n        19.78828253428141,\n        19.01165625333786,\n        18.79346280097961,\n        19.511103755633037,\n        19.065227732658386,\n        19.421975706418355,\n        19.211517736117045,\n        18.913628775278728,\n        19.251926871935524,\n        19.194325835704806,\n        19.08125954469045,\n        19.474459347724917,\n        19.06925933043162,\n        19.73238619248072,\n        19.493218274116515,\n        19.7938659175237,\n        20.104998388290408,\n        20.18069859266281,\n        20.11826090415319,\n        19.140096864700315,\n        19.855108708540598,\n        19.700749299526212,\n        20.264274895191193,\n        20.147352376778922,\n        19.73275642077128,\n        19.819068230787913,\n        20.011384061177573,\n        20.02887071212133,\n        20.07820870399475,\n        19.71256620407104,\n        19.83414210796356,\n        20.391562454700473,\n        20.002346297105152,\n        20.657952380180358,\n        20.258645397027333,\n        20.86577610492706,\n        20.674734957218167,\n        20.29822519461314,\n        20.035884976387024,\n        20.60575511932373,\n        19.918341357707977,\n        20.589110221862796,\n        20.634883064428966,\n        20.818063621520995,\n        20.816770624319712,\n        20.601503771940866,\n        20.502998891671496,\n        21.004771229426066,\n        20.736558705965678,\n        21.25983631690343,\n        21.269482579231262,\n        21.138610611756643,\n        20.938844652970634,\n        20.940916749636333,\n        21.20807293574015,\n        20.98027428865433,\n        21.046181119283045,\n        21.32889621814092,\n        21.05542133967082,\n        21.157513887882235,\n        21.59118415673574,\n        21.552832601865134,\n        21.455512606302893,\n        21.077143347263334,\n        21.365520514647162,\n        21.446350107192995,\n        21.994762991269425,\n        22.047921177546186,\n        
21.408621869087217,\n        21.42010664701462,\n        21.38248774766922,\n        21.897231921354926,\n        21.796455845832824,\n        22.00268397172292,\n        21.341353923479716,\n        22.258859420617423,\n        21.87750239610672,\n        22.012093653678892,\n        22.11572533210119,\n        22.070228363672893,\n        21.90432560443878,\n        21.88233465353648,\n        21.868809100786848,\n        22.011972271601355,\n        21.727508841355643,\n        22.31119052171707,\n        22.12301502863566,\n        22.116760693391168,\n        22.425632123152415,\n        22.14697431723277,\n        22.17153032461802,\n        22.668648889859515,\n        22.57969790617625,\n        22.410574590365087,\n        22.001601606210073,\n        21.817777061462404,\n        22.229211839834846,\n        22.151013066768645,\n        22.43489840984344,\n        22.271503342787426,\n        21.90029133637746,\n        22.330113309224444,\n        22.955066961447397,\n        21.912163372834527,\n        22.186071017583213,\n        22.38241572300593,\n        22.47269917170207,\n        22.804330983956657,\n        22.403881865342456,\n        22.417255460421245,\n        22.50846732934316,\n        22.436016081174216,\n        22.510304973125457,\n        22.433432060082755,\n        22.458809146881105,\n        22.923311399618786,\n        22.47686696211497,\n        22.402629868189496,\n        22.616998944282532,\n        22.77077663580577,\n        22.905668086210884,\n        22.748367921511335,\n        22.205995389620465,\n        22.907470565636952,\n        22.8850816377004,\n        22.782681437333423,\n        22.988625747362775,\n        23.00508107582728,\n        22.886291638215383,\n        22.84144187609355,\n        23.04126522064209,\n        23.382145015398663,\n        22.824609417120616,\n        23.142978974183396,\n        23.523871717453,\n        23.504992637634277,\n        23.42341079155604,\n        23.660274498462673,\n      
  23.1919059920311,\n        23.286985676288605,\n        23.697115731239318,\n        23.35182003657023,\n        23.776714529196422,\n        23.88528642098109,\n        23.77788284619649,\n        23.53571937243144,\n        23.429716286659243,\n        23.593821457227076,\n        23.6001250298818,\n        24.131974957784013,\n        23.962793831030528,\n        23.7850101741155,\n        23.81359586238861,\n        23.871057611306508,\n        23.643791459401445,\n        23.712499034404757,\n        24.161554266611734,\n        24.0301451587677,\n        24.00370688756307,\n        24.046240677833556,\n        24.134283207257592,\n        23.779774011770883,\n        24.4099445605278,\n        24.606129792531334,\n        24.044053098360695,\n        24.392736365000406,\n        24.15341828107834,\n        24.489894301096598,\n        24.132509734630585,\n        24.23703903357188,\n        23.809118733406066,\n        24.21431887785594,\n        24.55282241980235,\n        24.52900070508321,\n        24.679437781969707,\n        24.63161761045456,\n        24.952858759562176,\n        24.57767979303996,\n        24.37862017631531,\n        24.57373389085134,\n        24.632339243094126,\n        24.571167072455086,\n        24.859973764419554,\n        24.99239943822225,\n        24.529701237678527,\n        24.770756055514017,\n        24.961983086268106,\n        24.882949465115868,\n        24.77448396841685,\n        25.166602612336476,\n        25.016854931513468,\n        25.510536619822187,\n        25.088715693155923,\n        25.07602248350779,\n        25.324519255956016,\n        24.968988726933798,\n        25.13067001819611,\n        25.30761011282603,\n        24.8148122882843,\n        24.975149160226184\n      ],\n      \"return_min\": [\n        0.0,\n        -5.847385083804003,\n        -5.7513117592921645,\n        -5.748779817936011,\n        -5.728821428054697,\n        -5.559080329213756,\n        -5.3636133316322265,\n        
-5.302210290559181,\n        -5.251333814058662,\n        -5.194289546721069,\n        -4.8739384283057285,\n        -4.842630461534321,\n        -4.9151453492215245,\n        -5.843299921732156,\n        -6.3329078817675795,\n        -6.219128552891097,\n        -6.481890776537796,\n        -6.802709925873941,\n        -6.810748831780586,\n        -8.054595643731085,\n        -8.723781704520437,\n        -8.580540940926463,\n        -8.575722894918862,\n        -9.220375253800077,\n        -8.827859993796615,\n        -8.938540268579562,\n        -9.537985418663462,\n        -9.839037998870781,\n        -9.260156556661503,\n        -8.357282452989063,\n        -8.319941544401631,\n        -7.793454754107815,\n        -7.859164931975359,\n        -6.597464244259629,\n        -6.4995116443969785,\n        -6.33825469881865,\n        -5.805293320023743,\n        -5.415640063856721,\n        -5.345139400172988,\n        -5.100149198443003,\n        -4.775884997246164,\n        -3.901154927530331,\n        -3.5430011561290904,\n        -2.929028364818739,\n        -2.4767586717301917,\n        -1.8265270804764393,\n        -1.6176343798866804,\n        -1.5852250643868577,\n        -0.7284182326456308,\n        -0.44979643620616727,\n        -0.2577023392133886,\n        -0.23194055700112637,\n        0.07253895591730597,\n        -0.15651519592880758,\n        0.7055207887916808,\n        1.6360583348580673,\n        1.9420826897659431,\n        2.473776496516727,\n        3.0785349575773653,\n        3.9844120773625447,\n        3.441431652994045,\n        3.620430832636923,\n        4.100560387951199,\n        3.930868276930429,\n        4.550770792569821,\n        4.336923129658898,\n        4.599446487157081,\n        5.373622367155295,\n        5.176054051043158,\n        4.666238701619989,\n        3.951051968725962,\n        4.258857327615415,\n        5.173113526915158,\n        5.073355247708585,\n        4.371943558498154,\n        4.54636012262562,\n        
5.808916391380994,\n        5.551197830898223,\n        5.34759574306736,\n        5.0277504587736,\n        5.306229214765686,\n        5.073083524975472,\n        5.491268758513117,\n        5.176724814298337,\n        5.573712277277728,\n        5.888213169839131,\n        5.000021083204806,\n        5.256854984893558,\n        5.665206042050714,\n        5.461860319919166,\n        5.175830911510187,\n        5.179716016850846,\n        5.417536568039349,\n        5.359956086682328,\n        4.491544768033436,\n        5.161442954113744,\n        5.31481914992216,\n        5.355294443273641,\n        5.662462702702301,\n        5.6259969021338865,\n        4.447962590494374,\n        5.167120718722103,\n        4.842777828917456,\n        5.4036097300877675,\n        4.706745617468813,\n        4.8632407453674995,\n        5.030492081517986,\n        5.312459879968527,\n        4.742696950444269,\n        4.907279705908248,\n        4.9511379993283064,\n        4.982742827487655,\n        4.956854429855504,\n        4.301009798833486,\n        4.838506825396664,\n        4.992781515720493,\n        5.290940471382674,\n        5.14348241722087,\n        5.579096637106402,\n        5.168243772759865,\n        5.5843474503410535,\n        4.996058413060161,\n        5.761075259812566,\n        6.067030634385872,\n        5.710991308717619,\n        5.431620018314759,\n        5.867838818613855,\n        5.807803655471879,\n        6.06987462127813,\n        6.016871892402834,\n        5.381815690518922,\n        6.535831944081462,\n        5.8180260636282695,\n        6.419818415073631,\n        6.423334427947558,\n        6.251856128022345,\n        6.676194921207104,\n        5.896413605071722,\n        6.127789900798275,\n        6.504789139261375,\n        7.125364290761334,\n        6.82233672990498,\n        7.183872097738285,\n        7.573518841732584,\n        6.668143500018438,\n        6.800361653251276,\n        7.354062609207972,\n        
7.814342488447513,\n        7.348284773553699,\n        7.027655732334908,\n        7.164310953205494,\n        7.324909554131923,\n        7.79977177857154,\n        7.507403710366009,\n        7.445613236257481,\n        7.949163965051599,\n        8.029579861760306,\n        8.049631732191134,\n        7.98473374738377,\n        8.09692933562954,\n        8.595150747693317,\n        7.889534769486059,\n        8.182050033875075,\n        8.543661957454564,\n        7.83448305630662,\n        8.854549640247985,\n        8.734680818048101,\n        8.03914865755524,\n        8.188374096405493,\n        8.70408444962382,\n        9.08980827510886,\n        8.676364682762179,\n        9.001981073268468,\n        9.080738296785063,\n        9.163175043725404,\n        8.826958364232896,\n        9.212786567279501,\n        9.491640818533266,\n        9.813652886857588,\n        9.663235392269925,\n        8.831743708382604,\n        9.444130141965037,\n        9.063594161553546,\n        9.684780684589581,\n        10.580158786557131,\n        9.732795426778067,\n        9.653901282414537,\n        10.326423482399091,\n        9.520840913025431,\n        10.355890488911198,\n        10.635577274552382,\n        10.414180755942379,\n        10.261674546782004,\n        11.18722024808477,\n        10.622391593094918,\n        10.994410335019602,\n        11.134014403610646,\n        10.664608004781611,\n        11.602299322770435,\n        10.367783660472115,\n        10.797210054066554,\n        11.815822853246,\n        11.480982140330964,\n        11.09412850857745,\n        11.98628863883241,\n        12.067312099605639,\n        11.848993921764682,\n        12.27060637972741,\n        11.958689462390439,\n        12.120182752071292,\n        12.259304748296094,\n        12.537474673178647,\n        12.453123537891061,\n        11.77096552979382,\n        12.56915001066292,\n        12.472330111732534,\n        13.177987976754496,\n        12.936951414977278,\n     
   12.52047362077546,\n        13.183660561178558,\n        12.95994516279078,\n        13.693192051103107,\n        12.795102644449155,\n        13.118756348528736,\n        13.149097145512771,\n        13.6082577654462,\n        13.899241828559406,\n        14.07114478898924,\n        13.950021395740468,\n        13.671012076467038,\n        14.427718747037764,\n        14.21831097874148,\n        14.149740627413518,\n        14.239057837801768,\n        14.322750650080883,\n        15.077346698203964,\n        14.881819807588435,\n        14.711101975059336,\n        14.653175754415862,\n        15.045519200395086,\n        14.859055954390353,\n        15.135940821200311,\n        15.028333599354786,\n        14.868744466774464,\n        15.226525571946476,\n        15.31503408437216,\n        14.7364282155483,\n        15.393723435935847,\n        15.221118103469363,\n        15.473782593262934,\n        15.38283804151289,\n        15.710453887247304,\n        15.186448409186136,\n        15.756452784424319,\n        15.430849440579916,\n        15.735974655408505,\n        15.385461297489352,\n        15.333921868051087,\n        16.501653952635927,\n        16.599003846243647,\n        16.893195703478842,\n        16.799448832304318,\n        16.33393092320314,\n        16.538116462206915,\n        16.23932883789974,\n        16.2392471269894,\n        16.859290630715464,\n        16.366514700387274,\n        16.96167137928475,\n        16.83021949534363,\n        16.93184653675907,\n        16.409533092068592,\n        16.91301269277289,\n        17.66389706800658,\n        16.52550778385674,\n        17.459393479765613,\n        17.50406546273049,\n        17.832717112063488,\n        16.734378055630582,\n        17.583295864709555,\n        17.237196356311454,\n        17.522937823995925,\n        17.06484780577894,\n        17.512659281323586,\n        17.696821198647786,\n        17.04809714010114,\n        17.227966220191867,\n        
18.19230346714556,\n        18.37196814867198,\n        17.557956314796975,\n        17.552119087203025,\n        17.0894637013022,\n        18.043170483807394,\n        17.935874151751484,\n        17.921508832666053,\n        18.489324193422146,\n        18.065332946132315,\n        18.56311195761933,\n        18.541413772237288,\n        18.322821215118474,\n        18.471863873121876,\n        19.07608052741781,\n        18.177953258137197,\n        18.69541772898632,\n        18.49791617502517,\n        18.566899710949116,\n        18.869836981612593,\n        18.515412419249127,\n        18.150184516257696,\n        18.774430720238293,\n        18.46886497647636,\n        18.34975519370064,\n        18.962458767076292,\n        18.417420290694835,\n        18.93657658048281,\n        18.87946945833734,\n        19.222341574933896,\n        18.911153114649522,\n        19.422345302867388,\n        19.256796418085795,\n        18.48861765420574,\n        19.39348874173251,\n        18.686595302739825,\n        19.865326190725906,\n        19.023511220811436,\n        19.48042699746907,\n        19.081363475407006,\n        19.168077325829774,\n        18.979960703588816,\n        18.960901993379544,\n        19.085202579524726,\n        18.999560969030274,\n        19.711606769702964,\n        18.959488752069426,\n        20.172440615902723,\n        19.29982897043603,\n        20.30167667282796,\n        20.204167564783216,\n        19.55270459351527,\n        19.489361101608488,\n        19.868227551195588,\n        19.63629873639086,\n        19.727042384361628,\n        19.650460912172175,\n        20.304064722344762,\n        20.530433882082416,\n        19.815647233599815,\n        20.10753847428461,\n        20.4396515278063,\n        20.306096779316388,\n        20.61319684217262,\n        20.79427306573855,\n        20.50030667587243,\n        20.6337940539087,\n        20.444683167378784,\n        20.72371245122131,\n        20.144890882169804,\n      
  20.36948294609023,\n        20.206079840247245,\n        20.502400121874505,\n        20.651558224409296,\n        21.28869615462403,\n        21.300729820560587,\n        20.8884738709366,\n        20.363674394481194,\n        21.263709922926456,\n        21.190644948408696,\n        21.67965625019779,\n        21.672157253802666,\n        20.828498976146875,\n        20.533388979541037,\n        20.787073606771063,\n        21.3215631638706,\n        21.45988745682205,\n        21.8200930560308,\n        20.78817167231566,\n        21.655696960233414,\n        21.714002809028038,\n        21.866736774087958,\n        21.496232647602827,\n        21.514589951285405,\n        21.652090762772087,\n        21.473138258549696,\n        21.755693431891835,\n        21.255858480721358,\n        21.143571918920127,\n        22.029002663461362,\n        21.64474379663598,\n        21.940025206676346,\n        22.165775704622984,\n        22.04905783589783,\n        21.59529946870306,\n        22.254059528916766,\n        22.043365619383987,\n        22.1456767933878,\n        21.618004891876733,\n        21.03774232298335,\n        21.839062277998448,\n        21.76475316203705,\n        21.911048228507198,\n        22.079604134598345,\n        21.752365020342125,\n        21.822249546209413,\n        22.35343603606794,\n        21.148138818179262,\n        21.85643268659679,\n        22.121406400168098,\n        22.085430097085883,\n        22.653894448701333,\n        21.977986147875118,\n        21.921911937890975,\n        22.297120078238695,\n        22.066393283390376,\n        21.940620483870475,\n        22.235573940667088,\n        22.283730896453243,\n        22.634673881619737,\n        22.184573864106852,\n        22.203408337928114,\n        22.389830706449683,\n        22.43042728311132,\n        22.446823918539195,\n        22.044060302395227,\n        21.844482147140617,\n        22.77543008355557,\n        22.599038030903625,\n        
22.57242381413829,\n        22.555253775533405,\n        22.48516432179059,\n        22.47232352416741,\n        22.425519482395455,\n        22.42933711405455,\n        22.754756160177056,\n        22.481979415628512,\n        22.614384920296025,\n        23.406009885583067,\n        22.81771388369111,\n        23.29872607532363,\n        23.416096087931436,\n        22.965717407618822,\n        22.929384674771033,\n        23.174763366150852,\n        23.146237793971597,\n        23.323793709515748,\n        23.595784120655967,\n        23.306981854777174,\n        23.198029117791275,\n        22.63874471439163,\n        23.15110418633385,\n        23.11791809286721,\n        23.598176818256775,\n        23.558965419102005,\n        23.276426473355116,\n        23.379731300823266,\n        23.43913676383632,\n        23.488234499067303,\n        23.282094444838254,\n        23.726945372178676,\n        23.525413020785443,\n        23.754440998475417,\n        23.436217257418015,\n        23.829225328503867,\n        22.961255437753874,\n        23.969318481298654,\n        24.117145710800806,\n        23.861927483847584,\n        23.966705747354556,\n        23.85984075903328,\n        24.153896945938225,\n        23.342786406280453,\n        23.429045950457304,\n        23.186800448251468,\n        23.67177143368521,\n        24.023554060603047,\n        24.078081394249793,\n        24.15116228716688,\n        24.29868128398168,\n        24.37892108053382,\n        24.16634732334588,\n        23.78905968944604,\n        23.99682474205593,\n        24.238707910184257,\n        23.613639579649146,\n        23.82335433971291,\n        24.63875722333858,\n        24.03743156203353,\n        24.080775971265105,\n        24.315146075914914,\n        24.590043843689905,\n        24.216200725563397,\n        24.76149381518966,\n        24.876524565210026,\n        24.85678605216847,\n        24.49169589110832,\n        24.59421602505435,\n        24.40681221171436,\n    
    24.375352695958586,\n        24.33337034631693,\n        25.01862482532535,\n        24.24420512893595,\n        23.777740104262747\n      ],\n      \"return_max\": [\n        0.0,\n        -5.528899143567214,\n        -5.2847934252628885,\n        -4.985057045899643,\n        -4.922391836410636,\n        -4.82248808674592,\n        -4.713502518816382,\n        -4.654369483661602,\n        -4.587032678530652,\n        -4.616545452363403,\n        -4.632783796247698,\n        -4.701130763212384,\n        -4.694670340501172,\n        -4.592916909475119,\n        -4.573042375215858,\n        -4.6999134240376135,\n        -4.975724783358038,\n        -5.7850933362008465,\n        -5.76081202789959,\n        -6.5277796093105005,\n        -6.89953720925723,\n        -6.890058484707282,\n        -7.323823604333456,\n        -7.241327701763785,\n        -7.219642123360369,\n        -6.379027194341581,\n        -5.8701763788600365,\n        -5.148155427420367,\n        -4.344935625696603,\n        -3.688922263176798,\n        -2.5964313821315916,\n        -2.1326032096608274,\n        -1.0264829184586266,\n        -0.9584849562811515,\n        -0.6944844915804289,\n        -0.6874376473518944,\n        0.590496651906498,\n        1.6433117338685868,\n        0.9092874479675763,\n        1.6690469298340955,\n        2.510480308443307,\n        3.426348936563674,\n        3.8786181642389774,\n        3.7746224991494883,\n        4.669740897644988,\n        4.3832537519585655,\n        4.764030441913102,\n        5.122105512052707,\n        5.477977672830015,\n        5.863587303485141,\n        6.284394674520987,\n        5.813436872202251,\n        6.378140550158573,\n        6.217314054767584,\n        5.677387257143454,\n        6.189411040630592,\n        6.393631710371521,\n        6.602222179828026,\n        6.548276893983705,\n        6.37564949344078,\n        6.746906640911015,\n        6.85907181309405,\n        6.824192141908342,\n        7.208844726208271,\n   
     6.34483122546861,\n        7.453402050037185,\n        7.220624928744101,\n        7.195203712213297,\n        6.479465898949339,\n        6.121214655361287,\n        6.502920406070905,\n        6.468996865436241,\n        6.9163036123766055,\n        6.972557280130757,\n        7.221453526373775,\n        7.115414256786714,\n        8.013746716183057,\n        7.506424958643656,\n        7.711567083470725,\n        6.858940854960057,\n        7.30335022598698,\n        7.351964457160935,\n        7.1770168112341715,\n        7.154177247164066,\n        6.8261864580070775,\n        6.536068175170672,\n        7.087318374466043,\n        6.9907838431903135,\n        7.682661779404288,\n        6.875933761550209,\n        6.541946889745199,\n        7.191091691650969,\n        7.047391200866608,\n        7.277982966296823,\n        6.368034775437899,\n        6.574067193994262,\n        6.624805485481195,\n        7.202434995667042,\n        6.5540203033776905,\n        6.762448311799899,\n        6.826231396656613,\n        6.554370935991846,\n        6.411315421863841,\n        6.400042419748456,\n        6.32415812553996,\n        6.395120259372502,\n        6.6459660265097895,\n        6.0315384649942665,\n        5.843636800042105,\n        6.14565143644179,\n        6.108400498596499,\n        6.775207912770243,\n        6.127417179859371,\n        5.563699043404305,\n        5.700475986074164,\n        5.918484495279184,\n        6.308535130081836,\n        6.51597075864176,\n        6.357328631941017,\n        6.000814542199343,\n        6.239724195550861,\n        6.518503739563969,\n        6.435642572871952,\n        6.314293971555899,\n        6.627852936557242,\n        6.887628788829901,\n        6.699564458445542,\n        6.670845782750369,\n        6.883202433540342,\n        6.749645114550724,\n        6.956875516254199,\n        6.933405226460819,\n        7.421566762887166,\n        7.582685223193887,\n        7.072381858950101,\n        
7.676125431959557,\n        7.332216586399404,\n        7.091210048340144,\n        7.3213486026737336,\n        7.414332448809812,\n        7.629496526035288,\n        7.961433398186197,\n        7.670203278454445,\n        7.6551110859026075,\n        7.754313087037881,\n        7.486295489387883,\n        7.441013646590369,\n        8.340004417896585,\n        8.097518145118864,\n        7.985698124069713,\n        8.095103973164491,\n        8.585715553156435,\n        8.31071993113763,\n        8.231273060162149,\n        8.270507836511683,\n        8.534791006579768,\n        8.624499278267058,\n        8.75915243478071,\n        8.969249254325213,\n        8.738624085351233,\n        9.106999431851769,\n        8.868254497418137,\n        8.95539774291936,\n        9.070415526358403,\n        8.377980949640495,\n        9.165170944303512,\n        9.169295566750584,\n        9.044721424346312,\n        9.333022597777859,\n        9.30282667714874,\n        9.674397635255923,\n        10.021400570622733,\n        10.153981117359585,\n        9.652488274298006,\n        10.350399642722374,\n        10.04229890228856,\n        10.18005822572231,\n        10.367334868493714,\n        10.420868205716216,\n        10.450285957954886,\n        11.153951377142906,\n        11.497433601785218,\n        11.069975231604413,\n        11.281148734610044,\n        11.301993781942116,\n        11.307505260376066,\n        11.825598976349095,\n        11.615833333829142,\n        11.059294364086895,\n        11.942231890073886,\n        12.412790972797673,\n        12.853469266564336,\n        11.760987650966815,\n        12.468005688486986,\n        12.161793903871128,\n        12.53424445649448,\n        12.282414187799674,\n        12.802941581514473,\n        12.546816791210176,\n        12.865049141664622,\n        13.017245854072993,\n        13.304992957911228,\n        13.256917728952077,\n        13.088906297674077,\n        13.508342451331778,\n        
13.049344777594166,\n        12.835069158387194,\n        13.661599564333871,\n        13.651256753557035,\n        13.140623021663755,\n        13.905125367721565,\n        13.613382502012598,\n        13.964083121425956,\n        13.310342870042401,\n        13.885741992078579,\n        14.475768778254137,\n        14.387136952991495,\n        14.474212733671303,\n        14.807889549848317,\n        14.455016206170685,\n        15.590313966415563,\n        15.140368792364605,\n        14.723540501112062,\n        15.400334672691153,\n        15.887580882593921,\n        14.819916274076604,\n        15.817427106262675,\n        15.187692674071773,\n        15.538606729450272,\n        15.753767337074121,\n        15.713732705535376,\n        15.153522847959367,\n        15.511214885904863,\n        15.130005785944471,\n        15.663467034664903,\n        15.510681950490394,\n        16.398600630860155,\n        15.972039040311355,\n        15.587651982438693,\n        16.786807820567944,\n        16.580300483928532,\n        16.54175717271397,\n        15.927798469279248,\n        16.103504681912266,\n        16.38372008661014,\n        16.545517657545773,\n        16.549092593943175,\n        17.657328991356025,\n        17.168279655968202,\n        17.019703823871666,\n        17.070309048763193,\n        17.8242089570927,\n        17.927424802356633,\n        17.70476620049746,\n        17.585180691395895,\n        18.02318903102592,\n        17.983943089984713,\n        17.328108326549337,\n        17.884929745954672,\n        17.949815586014957,\n        18.255264679936378,\n        18.34636220094458,\n        18.508177911381416,\n        17.9127904629945,\n        18.02819241950077,\n        18.49655328073816,\n        18.528795840842157,\n        18.448481089140618,\n        18.6483497986429,\n        18.53373227512095,\n        18.758944455024068,\n        18.63216537168924,\n        18.237170868870077,\n        18.950657400784113,\n        
18.64977681481168,\n        17.948451017597154,\n        18.69439209304038,\n        18.727288630645358,\n        19.20454694837485,\n        19.005300752671225,\n        19.287823610449816,\n        19.08048261794947,\n        18.949025702992323,\n        18.887106183141242,\n        18.553895319118528,\n        19.991007509124728,\n        18.753116980580742,\n        18.945750925986967,\n        19.984827076471458,\n        19.652784461265036,\n        19.505271378533365,\n        19.33595347716968,\n        19.48311117182208,\n        19.38005818410209,\n        19.608824854798346,\n        19.091611103590186,\n        19.331042924572337,\n        19.12947423069701,\n        19.70849482920568,\n        19.54661266473382,\n        19.510234323225674,\n        20.500484541145013,\n        19.845359248538525,\n        18.891507872972902,\n        20.524291336240903,\n        19.563555754367655,\n        19.974114431224116,\n        19.907623052984963,\n        19.67707303429976,\n        19.729423023632755,\n        19.91978669493325,\n        19.81276389568026,\n        19.98645992837354,\n        19.721098370168406,\n        20.52819580447863,\n        20.10696708989569,\n        20.365390260113504,\n        21.298843661931294,\n        20.939051882458234,\n        20.979725390220583,\n        19.79157607519489,\n        20.316728675348685,\n        20.7149032963126,\n        20.66322359965648,\n        21.27119353274641,\n        19.98508584407349,\n        20.55677298616882,\n        20.85469079652537,\n        21.07778072065384,\n        21.195515414609957,\n        20.339929828617358,\n        20.66872324689685,\n        21.071518139697982,\n        21.04520384214088,\n        21.143464144457994,\n        21.217461823618635,\n        21.429875537026163,\n        21.145302349653118,\n        21.043745795711008,\n        20.58240885116556,\n        21.343282687451875,\n        20.200383979025094,\n        21.451178059363965,\n        21.619305216685756,\n      
  21.332062520697228,\n        21.10310736655701,\n        21.387360310281917,\n        20.89845930905838,\n        21.56989093104583,\n        21.167020632614967,\n        21.906475791634243,\n        21.744692092723973,\n        21.776914547640857,\n        21.243895252032566,\n        21.437150331893882,\n        21.692433420258993,\n        21.815657695138853,\n        21.72287929247586,\n        22.451712596034593,\n        21.608442557467132,\n        21.663469551355174,\n        21.893672158847448,\n        21.80493538316968,\n        22.022551341669185,\n        21.790612300045474,\n        21.46733110636787,\n        21.702055265977293,\n        22.30986973234106,\n        22.423685101289706,\n        21.988744762027558,\n        22.306824314488203,\n        21.977901888567377,\n        22.472900678839252,\n        22.1330242348436,\n        22.18527488741504,\n        21.894536174643772,\n        22.862021881001432,\n        22.041001983185403,\n        22.157450533269827,\n        22.73521801659955,\n        22.62586677606038,\n        22.156560446105477,\n        22.29153104852326,\n        21.98192476968186,\n        22.768086062481352,\n        22.31144576379116,\n        22.593378379972776,\n        22.60128626063534,\n        22.29349618010599,\n        22.685488541681845,\n        22.244890798567713,\n        22.747761180532976,\n        23.083238250802264,\n        23.116030192968513,\n        22.675472387342374,\n        22.385198320543413,\n        22.59781179994146,\n        22.619361401671245,\n        22.53727297150024,\n        22.958748591179685,\n        22.463402550976507,\n        22.048217652412795,\n        22.837977072239475,\n        23.556697886826854,\n        22.676187927489792,\n        22.515709348569636,\n        22.643425045843763,\n        22.85996824631826,\n        22.95476751921198,\n        22.829777582809793,\n        22.912598982951515,\n        22.719814580447622,\n        22.805638878958057,\n        
23.07998946238044,\n        22.63129017949842,\n        22.633887397308968,\n        23.211948917617836,\n        22.769160060123085,\n        22.60185139845088,\n        22.84416718211538,\n        23.11112598850022,\n        23.364512253882573,\n        23.452675540627443,\n        22.567508632100314,\n        23.039511047718335,\n        23.171125244497173,\n        22.992939060528556,\n        23.421997719192145,\n        23.52499782986397,\n        23.300259752263354,\n        23.257364269791648,\n        23.653193327229626,\n        24.00953387062027,\n        23.16723941861272,\n        23.671573028070767,\n        23.641733549322936,\n        24.192271391577446,\n        23.548095507788453,\n        23.90445290899391,\n        23.418094576443377,\n        23.644586677806178,\n        24.219468096327784,\n        23.557402279168866,\n        24.229635348877096,\n        24.17478872130621,\n        24.248783837615807,\n        23.873409627071606,\n        24.220687858926855,\n        24.036538728120302,\n        24.082331966896387,\n        24.665773097311252,\n        24.36662224295905,\n        24.29359387487588,\n        24.247460423953953,\n        24.302978458776696,\n        23.799348419735587,\n        24.14290362397126,\n        24.596163161044792,\n        24.534877296749954,\n        24.252972776650722,\n        24.656264098249096,\n        24.439341086011318,\n        24.598292585787892,\n        24.85057063975695,\n        25.095113874261862,\n        24.226178712873807,\n        24.818766982646256,\n        24.446995803123404,\n        24.82589165625497,\n        24.922233062980716,\n        25.045032116686453,\n        24.431437018560665,\n        24.756866322026674,\n        25.082090779001653,\n        24.979920015916626,\n        25.207713276772534,\n        24.964553936927437,\n        25.52679643859053,\n        24.989012262734036,\n        24.968180663184576,\n        25.15064303964675,\n        25.025970576003996,\n        
25.528694565261027,\n        25.896593189126197,\n        25.34604165310592,\n        25.021970913323525,\n        25.46073613976293,\n        25.608820096621297,\n        25.17585508654183,\n        25.3327672112703,\n        25.571711409483292,\n        25.15718529781691,\n        26.164287187475903,\n        25.685735495203524,\n        25.557828941961226,\n        26.24222630019767,\n        25.56262475790901,\n        25.92796969007529,\n        25.59659540032671,\n        25.385419447632653,\n        26.17255821618962\n      ]\n    }\n  },\n  \"Isaac-Repose-Cube-Shadow-Direct-v0\": {\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        330.68918918918916,\n        661.3783783783783,\n        992.0675675675676,\n        1322.7567567567567,\n        1653.445945945946,\n        1984.1351351351352,\n        2314.824324324324,\n        2645.5135135135133,\n        2976.2027027027025,\n        3306.891891891892,\n        3637.5810810810813,\n        3968.2702702702704,\n        4298.959459459459,\n        4629.648648648648,\n        4960.3378378378375,\n        5291.027027027027,\n        5621.716216216216,\n        5952.405405405405,\n        6283.094594594595,\n        6613.783783783784,\n        6944.472972972973,\n        7275.1621621621625,\n        7605.851351351352,\n        7936.540540540541,\n        8267.22972972973,\n        8597.918918918918,\n        8928.608108108108,\n        9259.297297297297,\n        9589.986486486487,\n        9920.675675675675,\n        10251.364864864865,\n        10582.054054054053,\n        10912.743243243243,\n        11243.432432432432,\n        11574.121621621622,\n        11904.81081081081,\n        12235.5,\n        12566.18918918919,\n        12896.878378378378,\n        13227.567567567568,\n        13558.256756756757,\n        13888.945945945947,\n        14219.635135135135,\n        14550.324324324325,\n        14881.013513513513,\n        15211.702702702703,\n        15542.391891891892,\n        
15873.081081081082,\n        16203.77027027027,\n        16534.45945945946,\n        16865.14864864865,\n        17195.837837837837,\n        17526.527027027027,\n        17857.216216216217,\n        18187.905405405407,\n        18518.594594594593,\n        18849.283783783783,\n        19179.972972972973,\n        19510.662162162163,\n        19841.35135135135,\n        20172.04054054054,\n        20502.72972972973,\n        20833.41891891892,\n        21164.108108108107,\n        21494.797297297297,\n        21825.486486486487,\n        22156.175675675677,\n        22486.864864864863,\n        22817.554054054053,\n        23148.243243243243,\n        23478.932432432433,\n        23809.62162162162,\n        24140.31081081081,\n        24471.0\n      ],\n      \"env_step\": [\n        0,\n        20480000.0,\n        40960000.0,\n        61440000.0,\n        81920000.0,\n        102400000.0,\n        122880000.0,\n        143360000.0,\n        163840000.0,\n        184320000.0,\n        204800000.0,\n        225280000.0,\n        245760000.0,\n        266240000.0,\n        286720000.0,\n        307200000.0,\n        327680000.0,\n        348160000.0,\n        368640000.0,\n        389120000.0,\n        409600000.0,\n        430080000.0,\n        450560000.0,\n        471040000.0,\n        491520000.0,\n        512000000.0,\n        532480000.0,\n        552960000.0,\n        573440000.0,\n        593920000.0,\n        614400000.0,\n        634880000.0,\n        655360000.0,\n        675840000.0,\n        696320000.0,\n        716800000.0,\n        737280000.0,\n        757760000.0,\n        778240000.0,\n        798720000.0,\n        819200000.0,\n        839680000.0,\n        860160000.0,\n        880640000.0,\n        901120000.0,\n        921600000.0,\n        942080000.0,\n        962560000.0,\n        983040000.0,\n        1003520000.0,\n        1024000000.0,\n        1044480000.0,\n        1064960000.0,\n        1085440000.0,\n        1105920000.0,\n        
1126400000.0,\n        1146880000.0,\n        1167360000.0,\n        1187840000.0,\n        1208320000.0,\n        1228800000.0,\n        1249280000.0,\n        1269760000.0,\n        1290240000.0,\n        1310720000.0,\n        1331200000.0,\n        1351680000.0,\n        1372160000.0,\n        1392640000.0,\n        1413120000.0,\n        1433600000.0,\n        1454080000.0,\n        1474560000.0,\n        1495040000.0,\n        1515520000.0\n      ],\n      \"return\": [\n        0.0,\n        382.9488016764323,\n        3325.7025553385415,\n        5983.295735677083,\n        7569.175944010417,\n        8402.328776041666,\n        8895.064453125,\n        9108.5126953125,\n        9351.854166666666,\n        9457.5625,\n        9600.7568359375,\n        9676.693684895834,\n        9690.444010416666,\n        9769.830729166666,\n        9926.238606770834,\n        9924.583658854166,\n        9927.7548828125,\n        9980.768880208334,\n        10041.905598958334,\n        10035.136067708334,\n        10072.431640625,\n        10218.255533854166,\n        10219.3720703125,\n        10140.8955078125,\n        10195.677734375,\n        10264.537760416666,\n        10275.744466145834,\n        10346.177734375,\n        10295.432291666666,\n        10356.6630859375,\n        10349.065755208334,\n        10308.400716145834,\n        10315.165364583334,\n        10341.683268229166,\n        10255.400716145834,\n        10431.327799479166,\n        10433.752604166666,\n        10452.818684895834,\n        10408.553385416666,\n        10441.5556640625,\n        10436.6494140625,\n        10437.878255208334,\n        10553.256510416666,\n        10468.160481770834,\n        10498.621419270834,\n        10488.468424479166,\n        10543.875651041666,\n        10547.175455729166,\n        10521.928059895834,\n        10505.654947916666,\n        10564.725260416666,\n        10582.248046875,\n        10550.1796875,\n        10497.415690104166,\n        
10528.411458333334,\n        10508.069986979166,\n        10514.443033854166,\n        10597.4140625,\n        10602.786783854166,\n        10624.103515625,\n        10591.981119791666,\n        10728.904622395834,\n        10649.1533203125,\n        10522.616536458334,\n        10581.818033854166,\n        10641.3125,\n        10679.970703125,\n        10570.703125,\n        10605.524739583334,\n        10627.1142578125,\n        10650.056966145834,\n        10705.324869791666,\n        10595.880859375,\n        10693.9228515625,\n        10580.137044270834\n      ],\n      \"return_min\": [\n        0.0,\n        359.41059655147757,\n        2949.856590942412,\n        5776.541036955849,\n        7395.004803407598,\n        8248.132074280249,\n        8828.972871967186,\n        8958.659157152422,\n        9303.277804746735,\n        9412.228024899145,\n        9468.185128938925,\n        9569.948018047084,\n        9621.435716430296,\n        9699.129445877223,\n        9816.969484996353,\n        9769.014548711579,\n        9800.333162033327,\n        9923.72243183005,\n        10007.683047995282,\n        10021.136837813956,\n        10014.480328880763,\n        10184.788847313212,\n        10133.861623316618,\n        10067.86992411191,\n        10167.232640558608,\n        10193.601492220183,\n        10155.176689746373,\n        10245.377468169529,\n        10250.017587387996,\n        10272.533376427831,\n        10282.273485433534,\n        10234.121208538385,\n        10256.477859592766,\n        10269.329184206632,\n        10134.271242258248,\n        10374.883823081536,\n        10396.353590283215,\n        10322.026966766209,\n        10373.607044323328,\n        10369.999796834647,\n        10364.3489023407,\n        10367.793788095061,\n        10471.49904773048,\n        10392.126377871275,\n        10455.048923201117,\n        10359.731093856988,\n        10457.410493575546,\n        10457.966017755905,\n        10424.317180105267,\n        
10399.902885386984,\n        10495.50258255835,\n        10495.068429035728,\n        10444.166750515762,\n        10321.152995551243,\n        10445.516438211675,\n        10460.457192528489,\n        10451.086833492674,\n        10475.111650363446,\n        10487.424596133797,\n        10618.090402098596,\n        10435.907537302917,\n        10626.064610097928,\n        10564.51645361193,\n        10429.779904509182,\n        10460.358847702342,\n        10615.205139810138,\n        10551.641075842563,\n        10454.799861652631,\n        10486.209465501834,\n        10502.08255684319,\n        10509.412334038356,\n        10596.807083458754,\n        10527.152332367086,\n        10618.574418628925,\n        10455.838224661382\n      ],\n      \"return_max\": [\n        0.0,\n        406.48700680138705,\n        3701.548519734671,\n        6190.050434398317,\n        7743.347084613236,\n        8556.525477803083,\n        8961.156034282814,\n        9258.366233472578,\n        9400.430528586598,\n        9502.896975100855,\n        9733.328542936075,\n        9783.439351744584,\n        9759.452304403036,\n        9840.53201245611,\n        10035.507728545315,\n        10080.152768996753,\n        10055.176603591673,\n        10037.815328586617,\n        10076.128149921386,\n        10049.135297602712,\n        10130.382952369237,\n        10251.72222039512,\n        10304.882517308382,\n        10213.92109151309,\n        10224.122828191392,\n        10335.47402861315,\n        10396.312242545295,\n        10446.978000580471,\n        10340.846995945336,\n        10440.792795447169,\n        10415.858024983134,\n        10382.680223753283,\n        10373.852869573902,\n        10414.0373522517,\n        10376.53019003342,\n        10487.771775876796,\n        10471.151618050117,\n        10583.61040302546,\n        10443.499726510005,\n        10513.111531290353,\n        10508.9499257843,\n        10507.962722321607,\n        10635.013973102852,\n        
10544.194585670393,\n        10542.193915340551,\n        10617.205755101344,\n        10630.340808507786,\n        10636.384893702427,\n        10619.538939686401,\n        10611.407010446348,\n        10633.947938274981,\n        10669.427664714272,\n        10656.192624484238,\n        10673.678384657089,\n        10611.306478454993,\n        10555.682781429843,\n        10577.799234215658,\n        10719.716474636554,\n        10718.148971574536,\n        10630.116629151404,\n        10748.054702280415,\n        10831.74463469374,\n        10733.79018701307,\n        10615.453168407486,\n        10703.27722000599,\n        10667.419860189862,\n        10808.300330407437,\n        10686.606388347369,\n        10724.840013664834,\n        10752.14595878181,\n        10790.701598253312,\n        10813.842656124578,\n        10664.609386382914,\n        10769.271284496075,\n        10704.435863880286\n      ]\n    },\n    \"null\": {\n      \"time\": [\n        0.0,\n        27.434,\n        54.868,\n        82.302,\n        109.736,\n        137.17,\n        164.604,\n        192.038,\n        219.472,\n        246.906,\n        274.34,\n        301.774,\n        329.208,\n        356.642,\n        384.076,\n        411.51,\n        438.944,\n        466.378,\n        493.812,\n        521.246,\n        548.68,\n        576.114,\n        603.548,\n        630.982,\n        658.416,\n        685.85,\n        713.284,\n        740.718,\n        768.152,\n        795.586,\n        823.02,\n        850.454,\n        877.888,\n        905.322,\n        932.756,\n        960.19,\n        987.624,\n        1015.058,\n        1042.492,\n        1069.926,\n        1097.36,\n        1124.794,\n        1152.228,\n        1179.662,\n        1207.096,\n        1234.53,\n        1261.964,\n        1289.398,\n        1316.832,\n        1344.266,\n        1371.7,\n        1399.134,\n        1426.568,\n        1454.002,\n        1481.436,\n        1508.87,\n        1536.304,\n     
   1563.738,\n        1591.172,\n        1618.606,\n        1646.04,\n        1673.474,\n        1700.908,\n        1728.342,\n        1755.776,\n        1783.21,\n        1810.644,\n        1838.078,\n        1865.512,\n        1892.946,\n        1920.38,\n        1947.814,\n        1975.248,\n        2002.682,\n        2030.116,\n        2057.55,\n        2084.984,\n        2112.418,\n        2139.852,\n        2167.286,\n        2194.72,\n        2222.154,\n        2249.588,\n        2277.022,\n        2304.456,\n        2331.89,\n        2359.324,\n        2386.758,\n        2414.192,\n        2441.626,\n        2469.06,\n        2496.494,\n        2523.928,\n        2551.362,\n        2578.796,\n        2606.23,\n        2633.664,\n        2661.098,\n        2688.532,\n        2715.966,\n        2743.4,\n        2770.834,\n        2798.268,\n        2825.702,\n        2853.136,\n        2880.57,\n        2908.004,\n        2935.438,\n        2962.872,\n        2990.306,\n        3017.74,\n        3045.174,\n        3072.608,\n        3100.042,\n        3127.476,\n        3154.91,\n        3182.344,\n        3209.778,\n        3237.212,\n        3264.646,\n        3292.08,\n        3319.514,\n        3346.948,\n        3374.382,\n        3401.816,\n        3429.25,\n        3456.684,\n        3484.118,\n        3511.552,\n        3538.986,\n        3566.42,\n        3593.854,\n        3621.288,\n        3648.722,\n        3676.156,\n        3703.59,\n        3731.024,\n        3758.458,\n        3785.892,\n        3813.326,\n        3840.76,\n        3868.194,\n        3895.628,\n        3923.062,\n        3950.496,\n        3977.93,\n        4005.364,\n        4032.798,\n        4060.232,\n        4087.666,\n        4115.1,\n        4142.534,\n        4169.968,\n        4197.402,\n        4224.836,\n        4252.27,\n        4279.704,\n        4307.138,\n        4334.572,\n        4362.006,\n        4389.44,\n        4416.874,\n        4444.308,\n        
4471.742,\n        4499.176,\n        4526.61,\n        4554.044,\n        4581.478,\n        4608.912,\n        4636.346,\n        4663.78,\n        4691.214,\n        4718.648,\n        4746.082,\n        4773.516,\n        4800.95,\n        4828.384,\n        4855.818,\n        4883.252,\n        4910.686,\n        4938.12,\n        4965.554,\n        4992.988,\n        5020.422,\n        5047.856,\n        5075.29,\n        5102.724,\n        5130.158,\n        5157.592,\n        5185.026,\n        5212.46,\n        5239.894,\n        5267.328,\n        5294.762,\n        5322.196,\n        5349.63,\n        5377.064,\n        5404.498,\n        5431.932,\n        5459.366,\n        5486.8,\n        5514.234,\n        5541.668,\n        5569.102,\n        5596.536,\n        5623.97,\n        5651.404,\n        5678.838,\n        5706.272,\n        5733.706,\n        5761.14,\n        5788.574,\n        5816.008,\n        5843.442,\n        5870.876,\n        5898.31,\n        5925.744,\n        5953.178,\n        5980.612,\n        6008.046,\n        6035.48,\n        6062.914,\n        6090.348,\n        6117.782,\n        6145.216,\n        6172.65,\n        6200.084,\n        6227.518,\n        6254.952,\n        6282.386,\n        6309.82,\n        6337.254,\n        6364.688,\n        6392.122,\n        6419.556,\n        6446.99,\n        6474.424,\n        6501.858,\n        6529.292,\n        6556.726,\n        6584.16,\n        6611.594,\n        6639.028,\n        6666.462,\n        6693.896,\n        6721.33,\n        6748.764,\n        6776.198,\n        6803.632,\n        6831.066,\n        6858.5,\n        6885.934,\n        6913.368,\n        6940.802,\n        6968.236,\n        6995.67,\n        7023.104,\n        7050.538,\n        7077.972,\n        7105.406,\n        7132.84,\n        7160.274,\n        7187.708,\n        7215.142,\n        7242.576,\n        7270.01,\n        7297.444,\n        7324.878,\n        7352.312,\n        
7379.746,\n        7407.18,\n        7434.614,\n        7462.048,\n        7489.482,\n        7516.916,\n        7544.35,\n        7571.784,\n        7599.218,\n        7626.652,\n        7654.086,\n        7681.52,\n        7708.954,\n        7736.388,\n        7763.822,\n        7791.256,\n        7818.69,\n        7846.124,\n        7873.558,\n        7900.992,\n        7928.426,\n        7955.86,\n        7983.294,\n        8010.728,\n        8038.162,\n        8065.596,\n        8093.03,\n        8120.464,\n        8147.898,\n        8175.332,\n        8202.766,\n        8230.2,\n        8257.634,\n        8285.068,\n        8312.502,\n        8339.936,\n        8367.37,\n        8394.804,\n        8422.238,\n        8449.672,\n        8477.106,\n        8504.54,\n        8531.974,\n        8559.408,\n        8586.842,\n        8614.276,\n        8641.71,\n        8669.144,\n        8696.578,\n        8724.012,\n        8751.446,\n        8778.88,\n        8806.314,\n        8833.748,\n        8861.182,\n        8888.616,\n        8916.05,\n        8943.484,\n        8970.918,\n        8998.352,\n        9025.786,\n        9053.22,\n        9080.654,\n        9108.088,\n        9135.522,\n        9162.956,\n        9190.39,\n        9217.824,\n        9245.258,\n        9272.692,\n        9300.126,\n        9327.56,\n        9354.994,\n        9382.428,\n        9409.862,\n        9437.296,\n        9464.73,\n        9492.164,\n        9519.598,\n        9547.032,\n        9574.466,\n        9601.9,\n        9629.334,\n        9656.768,\n        9684.202,\n        9711.636,\n        9739.07,\n        9766.504,\n        9793.938,\n        9821.372,\n        9848.806,\n        9876.24,\n        9903.674,\n        9931.108,\n        9958.542,\n        9985.976,\n        10013.41,\n        10040.844,\n        10068.278,\n        10095.712,\n        10123.146,\n        10150.58,\n        10178.014,\n        10205.448,\n        10232.882,\n        10260.316,\n       
 10287.75,\n        10315.184,\n        10342.618,\n        10370.052,\n        10397.486,\n        10424.92,\n        10452.354,\n        10479.788,\n        10507.222,\n        10534.656,\n        10562.09,\n        10589.524,\n        10616.958,\n        10644.392,\n        10671.826,\n        10699.26,\n        10726.694,\n        10754.128,\n        10781.562,\n        10808.996,\n        10836.43,\n        10863.864,\n        10891.298,\n        10918.732,\n        10946.166,\n        10973.6,\n        11001.034,\n        11028.468,\n        11055.902,\n        11083.336,\n        11110.77,\n        11138.204,\n        11165.638,\n        11193.072,\n        11220.506,\n        11247.94,\n        11275.374,\n        11302.808,\n        11330.242,\n        11357.676,\n        11385.11,\n        11412.544,\n        11439.978,\n        11467.412,\n        11494.846,\n        11522.28,\n        11549.714,\n        11577.148,\n        11604.582,\n        11632.016,\n        11659.45,\n        11686.884,\n        11714.318,\n        11741.752,\n        11769.186,\n        11796.62,\n        11824.054,\n        11851.488,\n        11878.922,\n        11906.356,\n        11933.79,\n        11961.224,\n        11988.658,\n        12016.092,\n        12043.526,\n        12070.96,\n        12098.394,\n        12125.828,\n        12153.262,\n        12180.696,\n        12208.13,\n        12235.564,\n        12262.998,\n        12290.432,\n        12317.866,\n        12345.3,\n        12372.734,\n        12400.168,\n        12427.602,\n        12455.036,\n        12482.47,\n        12509.904,\n        12537.338,\n        12564.772,\n        12592.206,\n        12619.64,\n        12647.074,\n        12674.508,\n        12701.942,\n        12729.376,\n        12756.81,\n        12784.244,\n        12811.678,\n        12839.112,\n        12866.546,\n        12893.98,\n        12921.414,\n        12948.848,\n        12976.282,\n        13003.716,\n        13031.15,\n        
13058.584,\n        13086.018,\n        13113.452,\n        13140.886,\n        13168.32,\n        13195.754,\n        13223.188,\n        13250.622,\n        13278.056,\n        13305.49,\n        13332.924,\n        13360.358,\n        13387.792,\n        13415.226,\n        13442.66,\n        13470.094,\n        13497.528,\n        13524.962,\n        13552.396,\n        13579.83,\n        13607.264,\n        13634.698,\n        13662.132,\n        13689.566,\n        13717.0\n      ],\n      \"env_step\": [\n        0,\n        1441792,\n        2883584,\n        6422528,\n        11141120,\n        12976128,\n        13238272,\n        13631488,\n        21757952,\n        21889024,\n        24248320,\n        32112640,\n        35782656,\n        39976960,\n        46137344,\n        47185920,\n        51511296,\n        52297728,\n        54525952,\n        55705600,\n        57409536,\n        59244544,\n        59899904,\n        65142784,\n        73531392,\n        73793536,\n        74055680,\n        80740352,\n        87031808,\n        88473600,\n        88735744,\n        91488256,\n        95551488,\n        96731136,\n        103022592,\n        103284736,\n        104595456,\n        108527616,\n        110493696,\n        111280128,\n        120061952,\n        121503744,\n        125173760,\n        130154496,\n        130285568,\n        131858432,\n        135659520,\n        141295616,\n        144048128,\n        147587072,\n        152567808,\n        153616384,\n        156237824,\n        156499968,\n        157417472,\n        158990336,\n        160432128,\n        162267136,\n        163315712,\n        163708928,\n        167510016,\n        170393600,\n        170655744,\n        172621824,\n        173801472,\n        175374336,\n        176816128,\n        178388992,\n        182190080,\n        182845440,\n        184156160,\n        185860096,\n        187695104,\n        187957248,\n        190447616,\n        192806912,\n      
  192937984,\n        198967296,\n        202375168,\n        203423744,\n        206176256,\n        209453056,\n        220463104,\n        221380608,\n        221511680,\n        222298112,\n        223870976,\n        226361344,\n        228720640,\n        232259584,\n        233570304,\n        240779264,\n        252444672,\n        253755392,\n        254935040,\n        266076160,\n        268697600,\n        269090816,\n        270794752,\n        272498688,\n        274202624,\n        274333696,\n        278921216,\n        284688384,\n        287834112,\n        289538048,\n        296222720,\n        296484864,\n        300023808,\n        301072384,\n        309592064,\n        310378496,\n        313393152,\n        315228160,\n        315621376,\n        316014592,\n        319029248,\n        319160320,\n        319684608,\n        322568192,\n        324403200,\n        325058560,\n        325582848,\n        329646080,\n        331087872,\n        331743232,\n        333053952,\n        335020032,\n        336461824,\n        343015424,\n        345767936,\n        346030080,\n        348127232,\n        348651520,\n        353370112,\n        356384768,\n        358219776,\n        362938368,\n        369098752,\n        374210560,\n        375652352,\n        376438784,\n        379846656,\n        380502016,\n        385875968,\n        386793472,\n        391118848,\n        394395648,\n        395051008,\n        396492800,\n        399376384,\n        403570688,\n        405405696,\n        406847488,\n        407240704,\n        408944640,\n        409862144,\n        411172864,\n        417333248,\n        426639360,\n        431226880,\n        433717248,\n        433979392,\n        436469760,\n        439091200,\n        439353344,\n        440532992,\n        440926208,\n        442630144,\n        445513728,\n        447873024,\n        462684160,\n        465698816,\n        466092032,\n        467927040,\n        468975616,\n      
  469106688,\n        469368832,\n        470024192,\n        477888512,\n        478937088,\n        481558528,\n        482869248,\n        486146048,\n        490995712,\n        492044288,\n        493617152,\n        494272512,\n        496762880,\n        497549312,\n        497942528,\n        498860032,\n        500039680,\n        504365056,\n        510656512,\n        511967232,\n        513409024,\n        515244032,\n        516423680,\n        517865472,\n        518389760,\n        520355840,\n        520880128,\n        524288000,\n        532938752,\n        534249472,\n        537395200,\n        537526272,\n        537919488,\n        538181632,\n        540278784,\n        544735232,\n        547094528,\n        548143104,\n        549847040,\n        551288832,\n        563216384,\n        564264960,\n        567017472,\n        567672832,\n        569114624,\n        570163200,\n        570687488,\n        571211776,\n        572129280,\n        575537152,\n        581566464,\n        583008256,\n        594804736,\n        596639744,\n        597557248,\n        603193344,\n        606601216,\n        607387648,\n        609353728,\n        610533376,\n        612368384,\n        615120896,\n        616038400,\n        617480192,\n        619970560,\n        622067712,\n        627965952,\n        628359168,\n        628490240,\n        629932032,\n        633733120,\n        636092416,\n        638189568,\n        638713856,\n        639893504,\n        642121728,\n        643694592,\n        649592832,\n        650510336,\n        653262848,\n        660340736,\n        662831104,\n        662962176,\n        664535040,\n        664928256,\n        666501120,\n        666632192,\n        669384704,\n        682229760,\n        683147264,\n        683409408,\n        686817280,\n        688652288,\n        691142656,\n        694943744,\n        695599104,\n        695730176,\n        708837376,\n        712376320,\n        713293824,\n      
  714604544,\n        715784192,\n        719716352,\n        720109568,\n        721813504,\n        722337792,\n        725090304,\n        726138880,\n        741998592,\n        747896832,\n        757989376,\n        760610816,\n        761135104,\n        762183680,\n        762707968,\n        762970112,\n        764149760,\n        768868352,\n        775028736,\n        782893056,\n        783286272,\n        786432000,\n        788529152,\n        789184512,\n        798097408,\n        801243136,\n        804388864,\n        809238528,\n        810156032,\n        810287104,\n        810549248,\n        812646400,\n        821035008,\n        821821440,\n        821952512,\n        826015744,\n        826540032,\n        827064320,\n        830734336,\n        833224704,\n        834928640,\n        835977216,\n        836501504,\n        837550080,\n        839516160,\n        844234752,\n        845152256,\n        845283328,\n        848035840,\n        848953344,\n        853934080,\n        858259456,\n        861929472,\n        865992704,\n        866779136,\n        866910208,\n        870055936,\n        874905600,\n        875560960,\n        876216320,\n        876347392,\n        877527040,\n        880279552,\n        884080640,\n        886571008,\n        889716736,\n        892993536,\n        893911040,\n        895614976,\n        899284992,\n        899809280,\n        903872512,\n        905707520,\n        908984320,\n        917897216,\n        919076864,\n        921436160,\n        928251904,\n        929431552,\n        929693696,\n        933101568,\n        933756928,\n        933888000,\n        934412288,\n        937426944,\n        940965888,\n        941096960,\n        942014464,\n        944766976,\n        945815552,\n        949354496,\n        953679872,\n        954204160,\n        955121664,\n        955645952,\n        958791680,\n        959578112,\n        961019904,\n        962199552,\n        963641344,\n      
  971112448,\n        978321408,\n        978452480,\n        991952896,\n        995622912,\n        998113280,\n        1012662272,\n        1013055488,\n        1013317632,\n        1014497280,\n        1014890496,\n        1021968384,\n        1027866624,\n        1028784128,\n        1030488064,\n        1033371648,\n        1034944512,\n        1035993088,\n        1037434880,\n        1040187392,\n        1040711680,\n        1044512768,\n        1049231360,\n        1050411008,\n        1050673152,\n        1054605312,\n        1056309248,\n        1057619968,\n        1058275328,\n        1062731776,\n        1064173568,\n        1065353216,\n        1068236800,\n        1087504384,\n        1091567616,\n        1093402624,\n        1099300864,\n        1104150528,\n        1105068032,\n        1105723392,\n        1105985536,\n        1106771968,\n        1107034112,\n        1111752704,\n        1114243072,\n        1114898432,\n        1115947008,\n        1117650944,\n        1118175232,\n        1118568448,\n        1118961664,\n        1120796672,\n        1124728832,\n        1126170624,\n        1126432768,\n        1127612416,\n        1129447424,\n        1130102784,\n        1134166016,\n        1138229248,\n        1138753536,\n        1139015680,\n        1140064256,\n        1140195328,\n        1140850688,\n        1143472128,\n        1144782848,\n        1152647168,\n        1157890048,\n        1160249344,\n        1160773632,\n        1163264000,\n        1168113664,\n        1171521536,\n        1175715840,\n        1176109056,\n        1177157632,\n        1177288704,\n        1178861568,\n        1180696576,\n        1181483008,\n        1184235520,\n        1186201600,\n        1186463744,\n        1186856960,\n        1193541632,\n        1199046656,\n        1207566336,\n        1212547072,\n        1220935680,\n        1231421440,\n        1232338944,\n        1234173952,\n        1234829312,\n        1236402176,\n        
1237975040,\n        1241513984,\n        1247281152,\n        1249247232,\n        1260519424,\n        1260650496,\n        1260912640,\n        1265106944,\n        1266024448,\n        1266679808,\n        1267466240,\n        1267597312,\n        1270743040,\n        1271267328,\n        1274281984,\n        1275592704,\n        1279918080,\n        1281490944,\n        1282670592,\n        1282932736,\n        1283063808,\n        1286995968,\n        1289224192,\n        1294073856,\n        1294467072,\n        1294729216,\n        1295253504,\n        1302331392,\n        1303117824,\n        1308622848\n      ],\n      \"return\": [\n        0.0,\n        5.236283680185677,\n        102.00346561616661,\n        135.75974839633702,\n        217.9203052608967,\n        274.8097470428944,\n        280.9905634007454,\n        299.1841593341827,\n        320.27578971481324,\n        432.8313995814324,\n        447.9990608375072,\n        584.088671979189,\n        699.7305512046814,\n        878.0466557469368,\n        1019.8984030554295,\n        1002.1022647475005,\n        1077.601406442404,\n        1216.5086271886826,\n        1344.7774969083666,\n        1438.1453649656773,\n        1527.9062448343634,\n        1598.799950851202,\n        1724.6216583614346,\n        1799.3939088706968,\n        1875.027166177392,\n        1938.843445581436,\n        1990.5927048807146,\n        2046.135383161068,\n        2154.9252163674832,\n        2328.1289598841663,\n        2251.4501824004647,\n        2331.8998207616805,\n        2466.500129586458,\n        2404.7145573153493,\n        2583.091887678146,\n        2663.2294233992097,\n        2617.075662468195,\n        2711.554448129177,\n        2837.6077219905856,\n        2781.6240523614883,\n        2832.8479318242075,\n        2925.1859030513765,\n        2881.0791418402196,\n        3072.1628783736232,\n        2981.710749827385,\n        3235.595831243992,\n        3119.2624001512527,\n        
3235.403108201504,\n        3242.492927195132,\n        3126.3234116115573,\n        3290.1375757427218,\n        3258.472759049416,\n        3309.0437541728015,\n        3313.4140671682353,\n        3432.629024629116,\n        3499.5061689734457,\n        3488.21505269146,\n        3543.742109605789,\n        3562.137778020858,\n        3650.741189673424,\n        3654.8448517746924,\n        3567.923550039291,\n        3579.626926459789,\n        3639.0841007270815,\n        3597.3003351850507,\n        3683.1284365525244,\n        3722.5531045532234,\n        3735.0221378602982,\n        3698.455936802864,\n        3723.6692370138167,\n        3766.410806018829,\n        3831.3407022600177,\n        3729.1116201820378,\n        3781.410314613342,\n        3944.3908525352476,\n        3908.5141741094594,\n        3913.3661565508837,\n        3948.811793651581,\n        3896.9180006217953,\n        3912.389421329498,\n        3971.1608142261503,\n        3826.356984316826,\n        4092.6435207028394,\n        4122.543938989639,\n        4019.705203544617,\n        4088.232182945013,\n        3999.3917568225866,\n        3998.7158728313443,\n        4117.989953351974,\n        4136.921306218624,\n        4137.75292036438,\n        4140.776959014892,\n        4129.997290261269,\n        4199.470038002968,\n        4129.829729902267,\n        4184.820166660308,\n        4154.497287431716,\n        4140.801594711304,\n        4223.778666554451,\n        4230.720368682862,\n        4335.208910928726,\n        4320.587505472183,\n        4279.5891404767035,\n        4372.656096599103,\n        4213.4231868834495,\n        4271.961008087159,\n        4305.597461331367,\n        4328.0909260568615,\n        4414.040591306687,\n        4414.380168487549,\n        4329.981696447373,\n        4426.483426190853,\n        4378.840885532379,\n        4353.974874118805,\n        4407.2164109535215,\n        4410.12230081749,\n        4318.541604546071,\n        
4506.371503078461,\n        4457.252771657944,\n        4510.632806407929,\n        4467.406568637848,\n        4413.122340072632,\n        4427.9682405395515,\n        4320.175341485977,\n        4497.8624214315405,\n        4578.464052150726,\n        4498.392016014099,\n        4478.23864899826,\n        4521.771654136657,\n        4512.799011329651,\n        4485.047106071472,\n        4448.143535078049,\n        4536.469063568115,\n        4482.167032763004,\n        4461.829647455215,\n        4554.236630702972,\n        4481.976551002503,\n        4594.135609493256,\n        4576.309885606766,\n        4533.668542879104,\n        4593.2214253082275,\n        4597.23950636673,\n        4558.801636837006,\n        4625.431267238617,\n        4764.541600046157,\n        4643.302667323589,\n        4543.5092151412955,\n        4569.829136842728,\n        4691.1080415382385,\n        4614.56410823822,\n        4712.355486083985,\n        4627.576166591644,\n        4757.001649238587,\n        4754.4851669654845,\n        4628.6072655735015,\n        4665.539616868973,\n        4598.129826223374,\n        4725.939671382905,\n        4612.051948223114,\n        4712.190674921036,\n        4647.940960597992,\n        4706.48961642456,\n        4784.072993350982,\n        4740.444469493867,\n        4751.443206726074,\n        4626.085614406586,\n        4828.636562986374,\n        4830.9071949653635,\n        4888.817148040771,\n        4799.730001449585,\n        4803.863621171951,\n        4859.824459934234,\n        4789.379549774169,\n        4779.189342220307,\n        4762.207045719147,\n        4787.940696491241,\n        4791.509553516387,\n        4779.069134399413,\n        4833.9439957618715,\n        4925.727799835206,\n        4792.0102535820015,\n        4803.0068109359745,\n        4868.051529243469,\n        4796.148416542053,\n        4738.990540931702,\n        4707.000625541687,\n        4889.161682231903,\n        4860.448832752228,\n        
4938.442969347,\n        4888.24940235138,\n        4962.321301048278,\n        4835.466524139405,\n        4850.661842075348,\n        4886.989757774352,\n        4896.520996544838,\n        4915.5101806182865,\n        4850.981204780578,\n        4777.362599323273,\n        4771.4760668907165,\n        4934.89265607071,\n        4974.275241218567,\n        4727.025384067535,\n        4763.875608978271,\n        4932.105666618347,\n        4788.006986774444,\n        4963.5356856842045,\n        4975.0135208892825,\n        4855.027117713928,\n        4849.923748680115,\n        4722.474940223694,\n        4853.057108760834,\n        4936.012247238159,\n        4823.776972335816,\n        4981.17521820259,\n        4858.281689052582,\n        4863.421200771332,\n        4954.844341381073,\n        4927.6710865173345,\n        4907.429337055206,\n        4927.645297805787,\n        4883.977951461792,\n        4842.6132472763065,\n        4913.963537078857,\n        4923.129209091187,\n        4895.095619495392,\n        4766.5804323959355,\n        4873.155703453063,\n        4971.745718772889,\n        4847.09561015892,\n        5030.245681488037,\n        4974.522681049347,\n        4934.959723098755,\n        5034.568326183319,\n        4819.898979125976,\n        5042.744435153962,\n        5037.858683376312,\n        4964.680161628724,\n        5014.581190307617,\n        4910.504225143433,\n        5046.842536304474,\n        4956.216326187134,\n        4993.427708992004,\n        5011.969644168854,\n        5013.753195144653,\n        5012.076093692779,\n        5007.031513072967,\n        4977.757820075988,\n        4918.5938954772955,\n        5027.504535961151,\n        5029.7741360549935,\n        5097.065293293,\n        5031.401218261719,\n        5118.090171936035,\n        4980.158747291565,\n        5087.3497388954165,\n        4997.21746483612,\n        5049.89016718483,\n        5052.48202145338,\n        4998.701796848297,\n        
4999.208668247224,\n        4886.12294643402,\n        5026.853234748841,\n        4984.986180892944,\n        4995.789886810303,\n        5111.35556067276,\n        4959.245058212281,\n        5026.2039459877005,\n        5023.929447631836,\n        5109.746718658448,\n        5074.376200294495,\n        5043.018234619141,\n        4994.723159210205,\n        5033.370203350068,\n        4994.979048362731,\n        5050.698421276093,\n        4970.567650695801,\n        5142.736372253418,\n        5206.550512470245,\n        4957.855958114624,\n        5110.360929222106,\n        4998.667523242951,\n        5110.177480525972,\n        5062.622328849792,\n        4984.220653633118,\n        5052.763286636353,\n        5011.880573486328,\n        4979.447981903077,\n        5115.19174407959,\n        5032.8902960281375,\n        4902.994464187623,\n        5029.657383789063,\n        5048.658683044434,\n        5028.910795043946,\n        4909.834355426789,\n        4911.035541084289,\n        5085.230002365113,\n        5012.631169448853,\n        5113.218371482849,\n        5160.849894561767,\n        5167.585114463806,\n        5095.002285575867,\n        5172.043252494812,\n        4730.221512008667,\n        4915.24465864563,\n        4942.3944827651985,\n        5003.76211641693,\n        5107.787829620362,\n        5158.120981674195,\n        4982.206357467652,\n        5077.543367221832,\n        4966.418243301392,\n        5100.223124847412,\n        4913.418493736267,\n        4970.639212806702,\n        5088.850089622498,\n        5057.40702806759,\n        5014.9243317947385,\n        5020.501419372559,\n        4948.997114341736,\n        5052.762624259949,\n        5032.860914924621,\n        5054.642137229919,\n        4985.218594497681,\n        5033.177627876282,\n        5066.468012580872,\n        5092.042655509949,\n        5112.60638760376,\n        5105.836066616058,\n        5162.45666355896,\n        5147.268983337402,\n        
5008.682528160095,\n        5037.204464515687,\n        5168.267694854736,\n        5033.9065711059575,\n        5066.976074508666,\n        5196.279511566162,\n        5093.294189910889,\n        5123.048552562714,\n        5061.140061874389,\n        5010.1269136505125,\n        5056.934467567444,\n        5041.947351013184,\n        4975.713252967835,\n        5049.654645835876,\n        4962.487987426758,\n        5010.011297958375,\n        4965.694996154785,\n        4945.400082908631,\n        5028.638770584106,\n        4975.87925856781,\n        5109.705345397949,\n        5117.070566131592,\n        5078.346026298523,\n        5131.825363212586,\n        5016.282830970764,\n        5116.730377670288,\n        5131.330189277649,\n        4940.001313926697,\n        5057.009353103637,\n        5024.000999679566,\n        5004.674883605958,\n        5155.328939086914,\n        5043.259490737915,\n        5130.654353294372,\n        5165.902773330688,\n        5011.435378509521,\n        5092.643320243836,\n        5026.348706832885,\n        4994.030126960754,\n        5007.96946346283,\n        4934.862240600586,\n        5056.537859802245,\n        5079.81287551117,\n        5065.511166244507,\n        5100.1370931091315,\n        5057.302691589355,\n        5006.724618148803,\n        4968.527329902649,\n        5003.266977737427,\n        5129.967458953857,\n        5059.039959884643,\n        4958.431204872131,\n        5055.456167999268,\n        4921.641325973511,\n        5108.192240783692,\n        5031.896564933777,\n        5050.812078147888,\n        5052.395368637085,\n        5183.077568122863,\n        5005.689804779053,\n        5095.717711608887,\n        4981.185117408752,\n        5037.46228678894,\n        5057.316320030212,\n        4998.430526168823,\n        4994.4445332031255,\n        4908.080332427979,\n        5160.852257247925,\n        4942.386556732178,\n        4977.80005708313,\n        4999.5668801116935,\n        
5007.867194953918,\n        4988.167340057373,\n        5165.724187385559,\n        5148.104109390259,\n        4887.678792877197,\n        5108.603537368775,\n        5116.676084091187,\n        5039.772944686889,\n        5130.552591812134,\n        5052.179794326782,\n        5101.057332191468,\n        5050.516229759216,\n        5028.3649343185425,\n        5132.270079299927,\n        5086.8566754608155,\n        5134.982005775452,\n        5245.009395843505,\n        5210.982839965821,\n        5080.490185150146,\n        4979.681203170776,\n        5064.320907157898,\n        5119.925566406249,\n        5057.879759536743,\n        5102.072025909424,\n        5031.215700546264,\n        5019.1513743896485,\n        5086.748718887329,\n        5121.011468109131,\n        4908.945695343017,\n        5117.348231658935,\n        4964.579453414916,\n        4975.248231781006,\n        4773.869886505127,\n        5031.6520500183115,\n        5036.789762901307,\n        5068.320360412597,\n        5040.455801651001,\n        5126.994954360962,\n        5097.743839080811,\n        5033.6443821945195,\n        5011.765733551025,\n        5089.503886322022,\n        5084.362827529907,\n        5128.805971763611,\n        4963.232673416138,\n        4993.680429214478,\n        5017.836319595337,\n        4977.133294456481,\n        5038.540714752197,\n        4958.906798588752,\n        4985.946089561463,\n        5016.781194366455,\n        5072.254841171265,\n        4999.833683868407,\n        5056.062237426758,\n        5017.829210861206,\n        5101.638251464844,\n        4970.891885879517,\n        5022.780729690552,\n        5060.428104476929,\n        4972.043661186219,\n        5053.362578033448,\n        5031.0058487396245,\n        4934.2498914031985,\n        5110.03906829834,\n        4950.636026023864,\n        4875.610343353272,\n        4976.717636108398,\n        5058.674133804321,\n        5123.543363433839,\n        5034.8297764587405,\n        
5080.173645599365,\n        5067.5694812011725,\n        5051.071068786621,\n        5086.857314743042,\n        5068.077889968872,\n        5049.054003616333,\n        5146.766270660401,\n        4886.542053894043,\n        5008.992813140869,\n        4992.11653074646,\n        4966.736727882385,\n        5051.737206115723,\n        5065.692213798522,\n        5092.993588012695,\n        5089.436850601196,\n        5091.006365074158,\n        4989.2801470794675,\n        4894.385232406616,\n        4994.612436340332,\n        5031.422633369445,\n        5046.974989196778,\n        5007.333500549317,\n        4881.702481468201,\n        4976.65591708374,\n        4950.086229003907,\n        5119.464210418701,\n        4998.693757247925,\n        4954.041024032593,\n        5041.931761985778\n      ],\n      \"return_min\": [\n        0.0,\n        -19.14411247696058,\n        -121.16046486277807,\n        -111.67575958425746,\n        -64.1436141826434,\n        -17.015392825342985,\n        -22.940368605642163,\n        -3.986633963480813,\n        16.45188670459504,\n        6.712078908638375,\n        23.85082779635968,\n        86.26909947922843,\n        2.3765774458348687,\n        44.0580940604425,\n        51.74676530953434,\n        94.09635517895913,\n        133.5802702677007,\n        118.54627781556928,\n        230.43897668253862,\n        149.94855524555805,\n        164.4603191881199,\n        294.78872684605767,\n        387.44056976074944,\n        404.0936291535736,\n        441.647768772071,\n        467.92491306047145,\n        497.5399510083257,\n        528.3376665133649,\n        568.4442803620686,\n        593.6743880036522,\n        638.3961789571385,\n        646.1292398391315,\n        726.1854735064071,\n        760.678543075496,\n        813.7057680064918,\n        811.1814907472349,\n        806.9400518326188,\n        885.9346247741264,\n        974.0043227807846,\n        943.5976664970865,\n        970.6241361595141,\n        
1023.7487332620542,\n        1004.06703673836,\n        1058.59503174645,\n        1038.537140980236,\n        1112.7265178803164,\n        1108.2450790246705,\n        1162.0126130055623,\n        1140.1430752161418,\n        1127.3474259063005,\n        1209.4124047888185,\n        1227.0148951328126,\n        1254.5907390533848,\n        1259.2118933961092,\n        1304.3821379264937,\n        1341.8510404397261,\n        1334.1638840733267,\n        1387.7054534142812,\n        1388.493792272047,\n        1406.9151574162347,\n        1411.3257125421574,\n        1402.4307723224233,\n        1365.9277026352697,\n        1423.9381890492577,\n        1436.6417972550985,\n        1526.7958849973243,\n        1490.2378308979005,\n        1524.4179032350703,\n        1508.60452650845,\n        1571.3854383959533,\n        1671.377841916741,\n        1602.3149013798893,\n        1606.342585436565,\n        1610.0229047545126,\n        1687.299419306462,\n        1692.328175549469,\n        1700.26400600844,\n        1733.556046192612,\n        1698.785655133484,\n        1726.406747475267,\n        1778.1068786005849,\n        1740.454015587948,\n        1771.6557512560412,\n        1776.77397064449,\n        1775.8776709323652,\n        1776.3228369000517,\n        1792.5858563221318,\n        1774.7354070048045,\n        1800.8583504193862,\n        1816.794544878237,\n        1823.984431059709,\n        1896.8176100192845,\n        1897.2129955425148,\n        1853.6753946156741,\n        1835.3116948934676,\n        1941.9333416585014,\n        1910.544436016929,\n        1880.463058796557,\n        1867.5256232219508,\n        1878.544475364241,\n        1931.870927414434,\n        1973.7462233945084,\n        1920.783859465787,\n        1999.0382888402146,\n        1938.3399576500383,\n        1955.910168943888,\n        1939.1607659918436,\n        1953.6593699900172,\n        2012.3137686933655,\n        2057.846870538191,\n        1998.6500577518632,\n       
 2024.7936330469051,\n        2016.4967539773697,\n        2084.3824818781127,\n        2068.8351000636003,\n        2058.411771772281,\n        1999.7633880198582,\n        2131.007436038327,\n        2067.0917258319623,\n        2048.4664728104117,\n        2079.7951743338467,\n        2095.6410366821565,\n        2080.528611406117,\n        2095.2630976565965,\n        2191.3286345660763,\n        2185.63603676565,\n        2150.691076199537,\n        2136.198161291987,\n        2188.951956508231,\n        2180.4250874574514,\n        2169.7256105084693,\n        2154.997344072441,\n        2213.5608371490653,\n        2210.2927707706976,\n        2153.256393948539,\n        2235.717427827676,\n        2194.490043243389,\n        2264.51700837704,\n        2173.7039562347713,\n        2173.804066878409,\n        2244.1961785150534,\n        2234.912982382214,\n        2225.976085695263,\n        2242.2668054749147,\n        2329.0418085188944,\n        2305.4428641971654,\n        2223.966626169306,\n        2318.948059745077,\n        2283.2458759037136,\n        2248.311774422755,\n        2284.8161171890733,\n        2323.620721687716,\n        2276.211854106748,\n        2374.589542585502,\n        2369.418336175175,\n        2328.881011003339,\n        2348.1115426429883,\n        2378.4947609359847,\n        2333.725166696244,\n        2269.919633914457,\n        2325.4606795781906,\n        2343.6328501169664,\n        2430.4667807822657,\n        2421.2777490354506,\n        2364.3030357536536,\n        2371.5663799555314,\n        2438.1595922788797,\n        2407.628486342669,\n        2448.5114309384257,\n        2427.3816029917816,\n        2427.8041511026536,\n        2426.303159435416,\n        2396.4504228395253,\n        2464.8680439436444,\n        2531.8481655959376,\n        2459.342234503615,\n        2429.461720649014,\n        2443.596154596039,\n        2490.218512085239,\n        2537.18964757073,\n        2481.942569079252,\n        
2489.58739873738,\n        2544.8305272005277,\n        2434.2977132683172,\n        2546.201156270302,\n        2481.0448799529127,\n        2508.2578790292987,\n        2506.982385477666,\n        2543.4425662020126,\n        2526.6750452729334,\n        2535.2401693878924,\n        2420.4033001284233,\n        2529.9318837139185,\n        2464.7313204897723,\n        2455.062815013207,\n        2499.7594748315646,\n        2596.718039013545,\n        2498.3330112661365,\n        2501.0188623383533,\n        2599.0403583265006,\n        2607.1146976611967,\n        2439.0988796102047,\n        2526.370944260332,\n        2576.28989670563,\n        2511.707113022177,\n        2583.7177867994187,\n        2596.082599644324,\n        2554.6982037986077,\n        2529.74288248526,\n        2557.2585855898205,\n        2548.3530088858483,\n        2645.7672715369263,\n        2530.241552844906,\n        2654.4528062686954,\n        2586.362923416898,\n        2559.3633733143765,\n        2657.8307338445657,\n        2559.81400870566,\n        2592.6661708658503,\n        2602.55990188937,\n        2538.8301559972024,\n        2526.818866221739,\n        2585.3523332974496,\n        2554.961569267399,\n        2618.709741830747,\n        2605.355065797593,\n        2533.366643731315,\n        2603.488801614006,\n        2623.9244136739503,\n        2604.5956721182524,\n        2582.0186102727416,\n        2488.195110192989,\n        2678.5429039689307,\n        2629.859266587499,\n        2681.822785369794,\n        2673.388099328283,\n        2631.2053841972024,\n        2683.1040794644287,\n        2658.8158941097913,\n        2625.0378821215463,\n        2618.371798162222,\n        2614.0333241682824,\n        2716.2779532738177,\n        2715.674905375105,\n        2608.987316665234,\n        2690.6974059137247,\n        2701.197499910661,\n        2603.51085242943,\n        2619.8276987693166,\n        2693.6704249315353,\n        2693.6398598498545,\n        
2643.599226087435,\n        2706.0174013939577,\n        2683.98035596259,\n        2644.515138217409,\n        2647.779150677115,\n        2714.6857252242958,\n        2667.5344769851313,\n        2743.8487219943686,\n        2703.994709429201,\n        2652.7106753653225,\n        2687.259255540809,\n        2689.8341723747,\n        2702.3172134851657,\n        2690.619898558037,\n        2690.20159914257,\n        2731.2882072746784,\n        2724.5897084520916,\n        2793.4886357198225,\n        2786.153795751157,\n        2752.4751036105013,\n        2727.4717608288397,\n        2731.200797820936,\n        2731.4992825409627,\n        2743.995331826182,\n        2726.718163468208,\n        2791.2976764991945,\n        2873.345345243874,\n        2689.543749833033,\n        2698.1604599580855,\n        2717.14020210079,\n        2802.7555602870093,\n        2662.787734710689,\n        2733.5043919293325,\n        2703.9124319961306,\n        2754.7889111106833,\n        2688.629707960803,\n        2799.943871259781,\n        2761.855562336642,\n        2702.762397210271,\n        2766.4033078959274,\n        2746.765063949556,\n        2703.0380361993293,\n        2677.6219217340554,\n        2678.0154902951144,\n        2730.7819082837295,\n        2675.980785801202,\n        2769.8035378145146,\n        2751.2892434738205,\n        2766.09438145931,\n        2838.9154928833927,\n        2846.060846359105,\n        2756.848066909525,\n        2682.546426218882,\n        2775.3927221554013,\n        2737.9923712545597,\n        2805.0775929216393,\n        2799.4732623761683,\n        2733.8937176473405,\n        2831.7684297898013,\n        2685.4213352603974,\n        2788.835019436861,\n        2732.175812834023,\n        2747.4124471873465,\n        2795.2861736509262,\n        2762.6546285933678,\n        2802.362123603685,\n        2773.0122277888067,\n        2786.386136822623,\n        2719.684115864333,\n        2696.0847392216106,\n        
2729.5592388262608,\n        2749.530823626408,\n        2816.6736458246664,\n        2804.7249710771875,\n        2906.6257654562905,\n        2849.055185139825,\n        2792.8976333702376,\n        2830.1563676677547,\n        2862.338290963305,\n        2752.3066554958996,\n        2740.868848732521,\n        2835.277596142402,\n        2798.7544825399314,\n        2851.664996594126,\n        2876.4141655376375,\n        2769.61919969799,\n        2793.185397097183,\n        2854.2205718089917,\n        2805.046206408135,\n        2856.312368522653,\n        2771.9291952890603,\n        2783.2311234464614,\n        2829.6024385859,\n        2722.508448202175,\n        2786.129787489262,\n        2754.204349514493,\n        2758.812791317388,\n        2734.931164664823,\n        2790.801199668432,\n        2825.8404851348037,\n        2785.41183130212,\n        2766.1152077464644,\n        2820.4563379891065,\n        2794.0924159205956,\n        2832.681822424446,\n        2778.744410842281,\n        2748.4515817251945,\n        2846.286648490259,\n        2762.645957826851,\n        2739.2733696640116,\n        2887.434781031315,\n        2838.746726928752,\n        2857.8076543181005,\n        2829.741328850746,\n        2731.8624506729475,\n        2809.194877964095,\n        2835.639991983081,\n        2774.2901464782617,\n        2812.890136567504,\n        2782.458862314639,\n        2735.865533018469,\n        2815.348098915428,\n        2805.9535033565508,\n        2890.763881300486,\n        2848.1588593900888,\n        2833.473149169562,\n        2787.2073612284817,\n        2795.9358155848436,\n        2941.596198936999,\n        2833.9879722064406,\n        2796.524481611566,\n        2784.61578363909,\n        2767.3057501974986,\n        2814.1997308881514,\n        2819.9634090838476,\n        2822.0487443886145,\n        2846.601672510604,\n        2847.951669243141,\n        2773.4273529098605,\n        2791.740390265389,\n        
2862.543566549477,\n        2796.9048118651504,\n        2821.260893453963,\n        2841.843039744259,\n        2848.5834357883973,\n        2798.8378551984883,\n        2897.1628474778718,\n        2777.0892532593875,\n        2747.3025608182193,\n        2884.5188328905856,\n        2805.6558784292642,\n        2873.661394292537,\n        2907.3751075268256,\n        2893.325621942176,\n        2747.0951412659056,\n        2896.585398318787,\n        2813.064564004543,\n        2844.616281909878,\n        2863.9378095266284,\n        2856.5618401472434,\n        2872.102570091827,\n        2940.6296522612597,\n        2785.87743103792,\n        2890.401273625524,\n        2806.926249812004,\n        2800.5105874257015,\n        2934.333051948449,\n        2861.8742741290494,\n        2786.2141788882514,\n        2807.6331076032425,\n        2877.682585162351,\n        2907.9847977626364,\n        2928.9358913186143,\n        2935.2800486093333,\n        2792.3580558379067,\n        2857.7672668995774,\n        2844.386286244289,\n        2918.7957425843156,\n        2863.844621429787,\n        2858.7225352463993,\n        2867.658150496989,\n        2812.5856823295476,\n        2752.0894152911765,\n        2799.511261192273,\n        2832.7366098620546,\n        2876.003235962489,\n        2833.2982203386555,\n        2864.5488689222852,\n        2829.771138514546,\n        2840.602454991052,\n        2866.285854070573,\n        2882.464991637063,\n        2938.8973156252555,\n        2898.082892254109,\n        2697.687819643127,\n        2899.413144777709,\n        2874.372861938143,\n        2847.475277422531,\n        2900.2066438553893,\n        2825.795888362836,\n        2839.4757763190655,\n        2826.958567352137,\n        2876.644334335877,\n        2818.6940827226636,\n        2912.043561819364,\n        2818.903337286878,\n        2960.633181371939,\n        2771.5312069174724,\n        2832.1639711050643,\n        2896.1997997205617,\n        
2841.616172046931,\n        2965.283974920245,\n        2835.2115810197247,\n        2839.6763936115753,\n        2905.8665082890457,\n        2816.410968088986,\n        2821.6035482367897,\n        2840.024586564322,\n        2852.1119461610997,\n        2958.3247730333446,\n        2904.3779799890126,\n        2910.740656152374,\n        2904.1957497786334,\n        2868.9946188248155,\n        2958.624178704684,\n        2896.141774405019,\n        2875.93737045916,\n        2916.411133918859,\n        2736.410530263688,\n        2833.689915605084,\n        2823.6615401054737,\n        2844.3455128566343,\n        2896.0081045476854,\n        2898.4796963828626,\n        2872.743197070322,\n        2888.9444025657967,\n        2874.813433677915,\n        2855.044082230879,\n        2902.575874793305,\n        2956.667727405208,\n        2946.343838304082,\n        2819.605031631867,\n        2926.1013529820198,\n        2804.995174389852,\n        2939.3965595095196,\n        2869.770046678058,\n        2920.6278785991376,\n        2891.905681590925,\n        2900.453764117853,\n        2908.1465660190843\n      ],\n      \"return_max\": [\n        0.0,\n        29.61667983733193,\n        325.16739609511126,\n        383.1952563769315,\n        499.98422470443677,\n        566.6348869111318,\n        584.9214954071329,\n        602.3549526318462,\n        624.0996927250314,\n        858.9507202542263,\n        872.1472938786548,\n        1081.9082444791497,\n        1397.084524963528,\n        1712.0352174334312,\n        1988.0500408013247,\n        1910.1081743160419,\n        2021.6225426171072,\n        2314.470976561796,\n        2459.1160171341944,\n        2726.3421746857966,\n        2891.352170480607,\n        2902.8111748563465,\n        3061.8027469621197,\n        3194.69418858782,\n        3308.406563582713,\n        3409.7619781024005,\n        3483.6454587531034,\n        3563.9330998087707,\n        3741.406152372898,\n        
4062.5835317646806,\n        3864.5041858437908,\n        4017.6704016842295,\n        4206.814785666509,\n        4048.7505715552024,\n        4352.4780073498005,\n        4515.277356051185,\n        4427.2112731037705,\n        4537.174271484228,\n        4701.2111212003865,\n        4619.65043822589,\n        4695.071727488901,\n        4826.623072840699,\n        4758.091246942079,\n        5085.730725000796,\n        4924.884358674533,\n        5358.465144607668,\n        5130.279721277835,\n        5308.793603397446,\n        5344.8427791741215,\n        5125.299397316814,\n        5370.8627466966245,\n        5289.930622966019,\n        5363.496769292218,\n        5367.616240940361,\n        5560.875911331738,\n        5657.161297507166,\n        5642.266221309594,\n        5699.778765797297,\n        5735.78176376967,\n        5894.5672219306125,\n        5898.363991007227,\n        5733.416327756158,\n        5793.326150284309,\n        5854.230012404905,\n        5757.958873115003,\n        5839.460988107725,\n        5954.868378208546,\n        5945.626372485526,\n        5888.307347097278,\n        5875.95303563168,\n        5861.443770120917,\n        6060.366503140146,\n        5851.8806549275105,\n        5952.7977244721715,\n        6201.482285764034,\n        6124.70017266945,\n        6126.468307093328,\n        6164.067541110549,\n        6095.050346110107,\n        6098.3720951837295,\n        6164.214749851715,\n        5912.259953045705,\n        6413.631290149638,\n        6468.313907334788,\n        6263.532736156869,\n        6400.141528989974,\n        6206.197657323041,\n        6222.696338657885,\n        6435.121556284561,\n        6457.04806755901,\n        6451.521409669051,\n        6384.736308010499,\n        6362.781584980023,\n        6545.264681390262,\n        6424.347764911066,\n        6427.706991662115,\n        6398.450138846503,\n        6401.14013062605,\n        6580.031709886951,\n        6582.896262001483,\n        
6738.546894443019,\n        6667.428787549858,\n        6638.39442148762,\n        6746.273904357991,\n        6488.506416116861,\n        6588.011847230429,\n        6672.034156670891,\n        6702.522482123706,\n        6815.767413920008,\n        6770.913466436907,\n        6661.313335142882,\n        6828.1732193348,\n        6741.185017087388,\n        6623.567266359498,\n        6745.597721843443,\n        6761.832829862699,\n        6637.319821072283,\n        6881.735570118594,\n        6847.413817483926,\n        6972.799140005446,\n        6855.017962941849,\n        6730.603643463108,\n        6775.407869672987,\n        6545.0875853153575,\n        6804.396208297005,\n        6971.292067535803,\n        6846.092955828661,\n        6820.279136704533,\n        6854.591351765083,\n        6845.17293520185,\n        6800.3686016344745,\n        6741.289726083658,\n        6859.377289987166,\n        6754.0412947553095,\n        6770.402900961892,\n        6872.755833578269,\n        6769.463058761616,\n        6923.7542106094725,\n        6978.915814978762,\n        6893.5330188798,\n        6942.246672101402,\n        6959.566030351246,\n        6891.627187978749,\n        7008.595729002319,\n        7200.041391573421,\n        6981.162470450013,\n        6863.0518041132855,\n        6820.71021394038,\n        7098.970207172763,\n        6980.816442053685,\n        7139.894854978897,\n        6931.531611495573,\n        7237.791444370425,\n        7134.3807913454675,\n        6887.796194971828,\n        7002.198222734607,\n        6848.14810980376,\n        7073.384581829825,\n        6890.378729749984,\n        7154.461715927615,\n        6970.4212416177925,\n        7069.346382732154,\n        7137.679205919698,\n        7059.611189952282,\n        7138.583377698495,\n        6880.60484885764,\n        7219.113533693868,\n        7254.185903588058,\n        7329.122865143116,\n        7172.078399907388,\n        7179.9230912412495,\n        
7293.345760433052,\n        7182.308676708813,\n        7093.510640496968,\n        6992.565925842356,\n        7116.539158478868,\n        7153.557386383761,\n        7114.542114202787,\n        7177.669479438504,\n        7314.265952099681,\n        7102.077938084751,\n        7116.426223134569,\n        7191.272531286411,\n        7157.9991198157895,\n        6931.779925593101,\n        6932.956371130462,\n        7270.065485434507,\n        7213.915280026789,\n        7333.443372491987,\n        7249.823759429826,\n        7389.402432708663,\n        7250.529748150387,\n        7171.391800436779,\n        7309.248195058932,\n        7337.97917807647,\n        7331.260886405009,\n        7105.244370547611,\n        7056.392187380408,\n        7041.93327144308,\n        7270.7449538149185,\n        7341.435784775938,\n        7014.951888524865,\n        7001.38027369621,\n        7287.921436531064,\n        7064.306860526711,\n        7343.35358456899,\n        7353.944442134241,\n        7155.35603162925,\n        7170.10461487497,\n        6887.691294857567,\n        7157.76120863582,\n        7226.257222939392,\n        7117.3123918267265,\n        7307.8976301364855,\n        7130.200454688265,\n        7167.479028228288,\n        7251.85794891758,\n        7295.528164329009,\n        7222.1925032445615,\n        7252.730693722204,\n        7229.125746926382,\n        7158.407628330874,\n        7242.574740860266,\n        7291.296848914973,\n        7171.481497160037,\n        6927.805798994278,\n        7212.944763174812,\n        7340.002635931771,\n        7070.266806643889,\n        7455.895690857822,\n        7367.026751825952,\n        7381.72433600452,\n        7390.593748397708,\n        7009.938691664454,\n        7403.66608493813,\n        7402.329267424341,\n        7298.154939060245,\n        7346.058301150806,\n        7162.192556177075,\n        7468.647190487402,\n        7294.060854212045,\n        7372.8220938157265,\n        
7307.66133506389,\n        7311.831484914201,\n        7415.164870720324,\n        7323.365620232209,\n        7254.318140241316,\n        7233.676938525161,\n        7435.1813731529855,\n        7365.877847178452,\n        7500.490726736145,\n        7419.203210436002,\n        7530.162942478113,\n        7276.337138620541,\n        7530.184339573424,\n        7346.655778995124,\n        7385.094609145364,\n        7437.42956592163,\n        7253.554871702225,\n        7294.422627065246,\n        7119.535217502718,\n        7366.4472139568725,\n        7280.138189411187,\n        7289.26256013544,\n        7532.0912227874815,\n        7228.2885172819915,\n        7321.119684700723,\n        7323.269186811582,\n        7426.004801597073,\n        7362.598604837833,\n        7333.561365627782,\n        7261.974557591571,\n        7335.5396088791995,\n        7258.4588141845,\n        7357.401510726004,\n        7214.417137923394,\n        7494.175068007642,\n        7539.755679696616,\n        7226.1681663962145,\n        7522.561398486127,\n        7280.1948443851115,\n        7417.599400764935,\n        7462.456922988895,\n        7234.9369153369025,\n        7401.6141412765755,\n        7268.972235861973,\n        7270.26625584535,\n        7430.439616899398,\n        7303.925029719633,\n        7103.226531164974,\n        7292.911459682198,\n        7350.552302139312,\n        7354.783553888562,\n        7142.046789119524,\n        7144.055591873464,\n        7439.678096446497,\n        7349.2815530965045,\n        7456.633205151184,\n        7570.410545649713,\n        7569.0758474683025,\n        7351.089078268342,\n        7498.025658630519,\n        6703.594957107808,\n        7147.942891072378,\n        7109.396243374996,\n        7269.531861579301,\n        7410.498066319084,\n        7516.768700972221,\n        7230.518997287963,\n        7323.318304653863,\n        7247.415151342387,\n        7411.611230257962,\n        7094.6611746385115,\n        
7193.865978426057,\n        7382.41400559407,\n        7352.159427541812,\n        7227.486539985792,\n        7267.990610956312,\n        7111.6080918608495,\n        7385.841132655565,\n        7369.637090627632,\n        7379.725035633577,\n        7220.906365368954,\n        7249.681609927897,\n        7328.211054084556,\n        7277.459545563608,\n        7376.157590067694,\n        7418.774499861878,\n        7494.756959450166,\n        7432.199675711498,\n        7265.05840082429,\n        7333.5400802988515,\n        7501.25779356707,\n        7269.058659671984,\n        7282.287152423206,\n        7516.144857594687,\n        7416.969180123788,\n        7452.911708028245,\n        7268.059551939787,\n        7215.20762089289,\n        7257.556566612235,\n        7311.965506737308,\n        7168.195382489208,\n        7269.706853085852,\n        7202.467526651341,\n        7233.892808427487,\n        7177.185642795077,\n        7131.987374499873,\n        7322.346376503389,\n        7160.957317467188,\n        7393.5702056610935,\n        7448.729300961065,\n        7390.5768448505805,\n        7443.194388436066,\n        7238.4732460209325,\n        7400.7789329161305,\n        7483.915967713017,\n        7131.551046128199,\n        7267.732057717016,\n        7285.356041532281,\n        7270.076397547904,\n        7423.223097142512,\n        7247.772254547079,\n        7403.501052270644,\n        7502.064217810631,\n        7291.008306346094,\n        7376.091762523576,\n        7217.057421682689,\n        7213.770107443247,\n        7203.048790358156,\n        7087.265618886533,\n        7377.210186586022,\n        7344.277652106912,\n        7325.068829132463,\n        7309.510304917777,\n        7266.446523788622,\n        7179.976087128043,\n        7149.847298576817,\n        7210.598139890009,\n        7318.3387189707155,\n        7284.091947562846,\n        7120.337928132696,\n        7326.296552359445,\n        7075.976901749524,\n        
7402.184750679233,\n        7243.829720783706,\n        7279.575411907163,\n        7258.189064763566,\n        7518.203467002586,\n        7237.952256648245,\n        7399.695032952384,\n        7099.826668268028,\n        7278.0197617127305,\n        7293.371746606462,\n        7155.0180125933875,\n        7140.305630617853,\n        7017.322809657469,\n        7424.541667017978,\n        7107.683860204968,\n        7208.29755334804,\n        7114.614927332801,\n        7210.078511478572,\n        7102.6732858222085,\n        7424.073267244293,\n        7402.882596838341,\n        7028.26244448849,\n        7320.621676418762,\n        7420.28760417783,\n        7234.9296074639,\n        7397.167374097639,\n        7247.79774850632,\n        7330.012094291109,\n        7160.402807257173,\n        7270.852437599166,\n        7374.138884974329,\n        7366.787101109627,\n        7469.453424125203,\n        7555.685739738562,\n        7560.091405802592,\n        7374.766191412042,\n        7151.72929873831,\n        7250.9592291534445,\n        7331.866335049863,\n        7186.823627754872,\n        7268.864003209515,\n        7270.073345254621,\n        7180.53548187972,\n        7329.111151530369,\n        7323.227193633946,\n        6954.046769256247,\n        7375.973928071471,\n        7061.500756332844,\n        7137.9107812324655,\n        6795.650357719078,\n        7263.79283884435,\n        7240.842915940559,\n        7260.637484862706,\n        7247.613382963346,\n        7389.441039799638,\n        7365.716539647076,\n        7226.686309397986,\n        7157.245613031477,\n        7296.54278100698,\n        7229.828339434558,\n        7359.529051273112,\n        7228.777527189149,\n        7087.947713651247,\n        7161.2997772525305,\n        7106.791311490431,\n        7176.874785649005,\n        7092.017708814668,\n        7132.41640280386,\n        7206.603821380772,\n        7267.865348006653,\n        7180.973285014151,\n        
7200.080913034151,\n        7216.755084435534,\n        7242.643321557749,\n        7170.252564841561,\n        7213.397488276039,\n        7224.656409233296,\n        7102.471150325507,\n        7141.441181146651,\n        7226.800116459524,\n        7028.823389194822,\n        7314.211628307634,\n        7084.861083958743,\n        6929.617138469754,\n        7113.410685652475,\n        7265.236321447543,\n        7288.761953834333,\n        7165.281572928468,\n        7249.606635046356,\n        7230.943212623712,\n        7233.147518748427,\n        7215.090450781399,\n        7240.014005532725,\n        7222.170636773506,\n        7377.121407401943,\n        7036.673577524399,\n        7184.2957106766535,\n        7160.571521387446,\n        7089.127942908135,\n        7207.4663076837605,\n        7232.904731214181,\n        7313.243978955068,\n        7289.929298636595,\n        7307.1992964704,\n        7123.516211928056,\n        6886.194590019926,\n        7032.557145275457,\n        7116.501428434808,\n        7274.344946761688,\n        7088.565648116615,\n        6958.40978854655,\n        7013.91527465796,\n        7030.4024113297555,\n        7318.300542238265,\n        7105.481832904925,\n        7007.628283947332,\n        7175.7169579524725\n      ]\n    }\n  },\n  \"Isaac-Repose-Cube-Allegro-Direct-v0\": {\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        487.19444444444446,\n        974.3888888888889,\n        1461.5833333333333,\n        1948.7777777777778,\n        2435.972222222222,\n        2923.1666666666665,\n        3410.3611111111113,\n        3897.5555555555557,\n        4384.75,\n        4871.944444444444,\n        5359.138888888889,\n        5846.333333333333,\n        6333.527777777777,\n        6820.722222222223,\n        7307.916666666667,\n        7795.111111111111,\n        8282.305555555555,\n        8769.5,\n        9256.694444444445,\n        9743.888888888889,\n        10231.083333333334,\n        
10718.277777777777,\n        11205.472222222223,\n        11692.666666666666,\n        12179.861111111111,\n        12667.055555555555,\n        13154.25,\n        13641.444444444445,\n        14128.638888888889,\n        14615.833333333334,\n        15103.027777777777,\n        15590.222222222223,\n        16077.416666666666,\n        16564.61111111111,\n        17051.805555555555,\n        17539.0\n      ],\n      \"env_step\": [\n        0,\n        20480000.0,\n        40960000.0,\n        61440000.0,\n        81920000.0,\n        102400000.0,\n        122880000.0,\n        143360000.0,\n        163840000.0,\n        184320000.0,\n        204800000.0,\n        225280000.0,\n        245760000.0,\n        266240000.0,\n        286720000.0,\n        307200000.0,\n        327680000.0,\n        348160000.0,\n        368640000.0,\n        389120000.0,\n        409600000.0,\n        430080000.0,\n        450560000.0,\n        471040000.0,\n        491520000.0,\n        512000000.0,\n        532480000.0,\n        552960000.0,\n        573440000.0,\n        593920000.0,\n        614400000.0,\n        634880000.0,\n        655360000.0,\n        675840000.0,\n        696320000.0,\n        716800000.0,\n        737280000.0\n      ],\n      \"return\": [\n        0.0,\n        122.10221354166667,\n        435.0882975260417,\n        1421.8710123697917,\n        2348.2281901041665,\n        3163.7948404947915,\n        3698.861083984375,\n        4025.7603352864585,\n        4337.995442708333,\n        4589.761393229167,\n        4782.493001302083,\n        4890.22412109375,\n        5030.248046875,\n        5155.931477864583,\n        5282.59326171875,\n        5324.516927083333,\n        5457.94775390625,\n        5515.32958984375,\n        5564.52880859375,\n        5673.402506510417,\n        5739.943033854167,\n        5763.798502604167,\n        5847.9658203125,\n        5859.408203125,\n        5951.106282552083,\n        5957.567057291667,\n        
6028.259602864583,\n        6083.15185546875,\n        6115.253580729167,\n        6183.505045572917,\n        6220.779947916667,\n        6284.12353515625,\n        6349.08984375,\n        6387.808756510417,\n        6371.83056640625,\n        6479.999348958333,\n        6477.792643229167\n      ],\n      \"return_min\": [\n        0.0,\n        112.10858028361118,\n        408.6617848585944,\n        1304.8137768436577,\n        2207.112529998192,\n        3054.42341419639,\n        3651.647282857428,\n        3973.3270545893097,\n        4298.076884293968,\n        4527.783274664672,\n        4740.259300399453,\n        4875.8400900253755,\n        5014.454607297292,\n        5066.5720863274855,\n        5229.377519512968,\n        5267.243130768987,\n        5421.939380663636,\n        5453.997084027583,\n        5459.271770361508,\n        5589.086964125713,\n        5681.091427120193,\n        5680.027501991008,\n        5702.091715209337,\n        5709.277469015035,\n        5830.487430823852,\n        5819.911405529101,\n        5917.187135276544,\n        5921.141539621999,\n        5981.998730611572,\n        5966.944434412144,\n        6069.596032579598,\n        6084.438338052321,\n        6161.312687749657,\n        6146.352347118702,\n        6086.806821737842,\n        6166.503157669386,\n        6221.325351242079\n      ],\n      \"return_max\": [\n        0.0,\n        132.09584679972215,\n        461.514810193489,\n        1538.9282478959258,\n        2489.343850210141,\n        3273.1662667931932,\n        3746.074885111322,\n        4078.1936159836073,\n        4377.914001122698,\n        4651.739511793662,\n        4824.726702204713,\n        4904.6081521621245,\n        5046.041486452708,\n        5245.2908694016805,\n        5335.809003924532,\n        5381.790723397679,\n        5493.956127148864,\n        5576.662095659917,\n        5669.785846825992,\n        5757.718048895121,\n        5798.794640588141,\n        5847.569503217326,\n      
  5993.839925415663,\n        6009.538937234965,\n        6071.725134280314,\n        6095.222709054233,\n        6139.332070452622,\n        6245.162171315501,\n        6248.508430846762,\n        6400.06565673369,\n        6371.963863253736,\n        6483.808732260179,\n        6536.866999750343,\n        6629.265165902132,\n        6656.854311074658,\n        6793.49554024728,\n        6734.259935216255\n      ]\n    },\n    \"null\": {\n      \"time\": [\n        0.0,\n        33.32,\n        66.64,\n        99.96,\n        133.28,\n        166.6,\n        199.92,\n        233.24,\n        266.56,\n        299.88,\n        333.2,\n        366.52,\n        399.84,\n        433.16,\n        466.48,\n        499.8,\n        533.12,\n        566.44,\n        599.76,\n        633.08,\n        666.4,\n        699.72,\n        733.04,\n        766.36,\n        799.68,\n        833.0,\n        866.32,\n        899.64,\n        932.96,\n        966.28,\n        999.6,\n        1032.92,\n        1066.24,\n        1099.56,\n        1132.88,\n        1166.2,\n        1199.52,\n        1232.84,\n        1266.16,\n        1299.48,\n        1332.8,\n        1366.12,\n        1399.44,\n        1432.76,\n        1466.08,\n        1499.4,\n        1532.72,\n        1566.04,\n        1599.36,\n        1632.68,\n        1666.0,\n        1699.32,\n        1732.64,\n        1765.96,\n        1799.28,\n        1832.6,\n        1865.92,\n        1899.24,\n        1932.56,\n        1965.88,\n        1999.2,\n        2032.52,\n        2065.84,\n        2099.16,\n        2132.48,\n        2165.8,\n        2199.12,\n        2232.44,\n        2265.76,\n        2299.08,\n        2332.4,\n        2365.72,\n        2399.04,\n        2432.36,\n        2465.68,\n        2499.0,\n        2532.32,\n        2565.64,\n        2598.96,\n        2632.28,\n        2665.6,\n        2698.92,\n        2732.24,\n        2765.56,\n        2798.88,\n        2832.2,\n        2865.52,\n        2898.84,\n      
  2932.16,\n        2965.48,\n        2998.8,\n        3032.12,\n        3065.44,\n        3098.76,\n        3132.08,\n        3165.4,\n        3198.72,\n        3232.04,\n        3265.36,\n        3298.68,\n        3332.0,\n        3365.32,\n        3398.64,\n        3431.96,\n        3465.28,\n        3498.6,\n        3531.92,\n        3565.24,\n        3598.56,\n        3631.88,\n        3665.2,\n        3698.52,\n        3731.84,\n        3765.16,\n        3798.48,\n        3831.8,\n        3865.12,\n        3898.44,\n        3931.76,\n        3965.08,\n        3998.4,\n        4031.72,\n        4065.04,\n        4098.36,\n        4131.68,\n        4165.0,\n        4198.32,\n        4231.64,\n        4264.96,\n        4298.28,\n        4331.6,\n        4364.92,\n        4398.24,\n        4431.56,\n        4464.88,\n        4498.2,\n        4531.52,\n        4564.84,\n        4598.16,\n        4631.48,\n        4664.8,\n        4698.12,\n        4731.44,\n        4764.76,\n        4798.08,\n        4831.4,\n        4864.72,\n        4898.04,\n        4931.36,\n        4964.68,\n        4998.0,\n        5031.32,\n        5064.64,\n        5097.96,\n        5131.28,\n        5164.6,\n        5197.92,\n        5231.24,\n        5264.56,\n        5297.88,\n        5331.2,\n        5364.52,\n        5397.84,\n        5431.16,\n        5464.48,\n        5497.8,\n        5531.12,\n        5564.44,\n        5597.76,\n        5631.08,\n        5664.4,\n        5697.72,\n        5731.04,\n        5764.36,\n        5797.68,\n        5831.0,\n        5864.32,\n        5897.64,\n        5930.96,\n        5964.28,\n        5997.6,\n        6030.92,\n        6064.24,\n        6097.56,\n        6130.88,\n        6164.2,\n        6197.52,\n        6230.84,\n        6264.16,\n        6297.48,\n        6330.8,\n        6364.12,\n        6397.44,\n        6430.76,\n        6464.08,\n        6497.4,\n        6530.72,\n        6564.04,\n        6597.36,\n        6630.68,\n        
6664.0,\n        6697.32,\n        6730.64,\n        6763.96,\n        6797.28,\n        6830.6,\n        6863.92,\n        6897.24,\n        6930.56,\n        6963.88,\n        6997.2,\n        7030.52,\n        7063.84,\n        7097.16,\n        7130.48,\n        7163.8,\n        7197.12,\n        7230.44,\n        7263.76,\n        7297.08,\n        7330.4,\n        7363.72,\n        7397.04,\n        7430.36,\n        7463.68,\n        7497.0,\n        7530.32,\n        7563.64,\n        7596.96,\n        7630.28,\n        7663.6,\n        7696.92,\n        7730.24,\n        7763.56,\n        7796.88,\n        7830.2,\n        7863.52,\n        7896.84,\n        7930.16,\n        7963.48,\n        7996.8,\n        8030.12,\n        8063.44,\n        8096.76,\n        8130.08,\n        8163.4,\n        8196.72,\n        8230.04,\n        8263.36,\n        8296.68,\n        8330.0,\n        8363.32,\n        8396.64,\n        8429.96,\n        8463.28,\n        8496.6,\n        8529.92,\n        8563.24,\n        8596.56,\n        8629.88,\n        8663.2,\n        8696.52,\n        8729.84,\n        8763.16,\n        8796.48,\n        8829.8,\n        8863.12,\n        8896.44,\n        8929.76,\n        8963.08,\n        8996.4,\n        9029.72,\n        9063.04,\n        9096.36,\n        9129.68,\n        9163.0,\n        9196.32,\n        9229.64,\n        9262.96,\n        9296.28,\n        9329.6,\n        9362.92,\n        9396.24,\n        9429.56,\n        9462.88,\n        9496.2,\n        9529.52,\n        9562.84,\n        9596.16,\n        9629.48,\n        9662.8,\n        9696.12,\n        9729.44,\n        9762.76,\n        9796.08,\n        9829.4,\n        9862.72,\n        9896.04,\n        9929.36,\n        9962.68,\n        9996.0,\n        10029.32,\n        10062.64,\n        10095.96,\n        10129.28,\n        10162.6,\n        10195.92,\n        10229.24,\n        10262.56,\n        10295.88,\n        10329.2,\n        10362.52,\n    
    10395.84,\n        10429.16,\n        10462.48,\n        10495.8,\n        10529.12,\n        10562.44,\n        10595.76,\n        10629.08,\n        10662.4,\n        10695.72,\n        10729.04,\n        10762.36,\n        10795.68,\n        10829.0,\n        10862.32,\n        10895.64,\n        10928.96,\n        10962.28,\n        10995.6,\n        11028.92,\n        11062.24,\n        11095.56,\n        11128.88,\n        11162.2,\n        11195.52,\n        11228.84,\n        11262.16,\n        11295.48,\n        11328.8,\n        11362.12,\n        11395.44,\n        11428.76,\n        11462.08,\n        11495.4,\n        11528.72,\n        11562.04,\n        11595.36,\n        11628.68,\n        11662.0,\n        11695.32,\n        11728.64,\n        11761.96,\n        11795.28,\n        11828.6,\n        11861.92,\n        11895.24,\n        11928.56,\n        11961.88,\n        11995.2,\n        12028.52,\n        12061.84,\n        12095.16,\n        12128.48,\n        12161.8,\n        12195.12,\n        12228.44,\n        12261.76,\n        12295.08,\n        12328.4,\n        12361.72,\n        12395.04,\n        12428.36,\n        12461.68,\n        12495.0,\n        12528.32,\n        12561.64,\n        12594.96,\n        12628.28,\n        12661.6,\n        12694.92,\n        12728.24,\n        12761.56,\n        12794.88,\n        12828.2,\n        12861.52,\n        12894.84,\n        12928.16,\n        12961.48,\n        12994.8,\n        13028.12,\n        13061.44,\n        13094.76,\n        13128.08,\n        13161.4,\n        13194.72,\n        13228.04,\n        13261.36,\n        13294.68,\n        13328.0,\n        13361.32,\n        13394.64,\n        13427.96,\n        13461.28,\n        13494.6,\n        13527.92,\n        13561.24,\n        13594.56,\n        13627.88,\n        13661.2,\n        13694.52,\n        13727.84,\n        13761.16,\n        13794.48,\n        13827.8,\n        13861.12,\n        13894.44,\n        
13927.76,\n        13961.08,\n        13994.4,\n        14027.72,\n        14061.04,\n        14094.36,\n        14127.68,\n        14161.0,\n        14194.32,\n        14227.64,\n        14260.96,\n        14294.28,\n        14327.6,\n        14360.92,\n        14394.24,\n        14427.56,\n        14460.88,\n        14494.2,\n        14527.52,\n        14560.84,\n        14594.16,\n        14627.48,\n        14660.8,\n        14694.12,\n        14727.44,\n        14760.76,\n        14794.08,\n        14827.4,\n        14860.72,\n        14894.04,\n        14927.36,\n        14960.68,\n        14994.0,\n        15027.32,\n        15060.64,\n        15093.96,\n        15127.28,\n        15160.6,\n        15193.92,\n        15227.24,\n        15260.56,\n        15293.88,\n        15327.2,\n        15360.52,\n        15393.84,\n        15427.16,\n        15460.48,\n        15493.8,\n        15527.12,\n        15560.44,\n        15593.76,\n        15627.08,\n        15660.4,\n        15693.72,\n        15727.04,\n        15760.36,\n        15793.68,\n        15827.0,\n        15860.32,\n        15893.64,\n        15926.96,\n        15960.28,\n        15993.6,\n        16026.92,\n        16060.24,\n        16093.56,\n        16126.88,\n        16160.2,\n        16193.52,\n        16226.84,\n        16260.16,\n        16293.48,\n        16326.8,\n        16360.12,\n        16393.44,\n        16426.76,\n        16460.08,\n        16493.4,\n        16526.72,\n        16560.04,\n        16593.36,\n        16626.68,\n        16660.0\n      ],\n      \"env_step\": [\n        0,\n        1835008,\n        10878976,\n        11141120,\n        11403264,\n        11534336,\n        12189696,\n        12713984,\n        16252928,\n        20185088,\n        23199744,\n        25952256,\n        29097984,\n        31981568,\n        33554432,\n        33947648,\n        36306944,\n        37355520,\n        40763392,\n        45219840,\n        46923776,\n        48496640,\n      
  54919168,\n        58458112,\n        60424192,\n        62783488,\n        72351744,\n        73400320,\n        76283904,\n        77594624,\n        81788928,\n        82182144,\n        82575360,\n        83623936,\n        87293952,\n        88866816,\n        89653248,\n        90439680,\n        93454336,\n        94371840,\n        97910784,\n        98435072,\n        98697216,\n        100401152,\n        100663296,\n        103022592,\n        105512960,\n        115867648,\n        117047296,\n        123994112,\n        125173760,\n        125304832,\n        130023424,\n        134086656,\n        135790592,\n        136839168,\n        139460608,\n        141164544,\n        141819904,\n        143523840,\n        144310272,\n        146800640,\n        147062784,\n        150077440,\n        152043520,\n        155058176,\n        163840000,\n        165281792,\n        171573248,\n        174325760,\n        174456832,\n        175505408,\n        177733632,\n        178126848,\n        179961856,\n        181403648,\n        181927936,\n        184549376,\n        188088320,\n        193331200,\n        193855488,\n        204472320,\n        205651968,\n        206176256,\n        206569472,\n        210763776,\n        211156992,\n        214433792,\n        214827008,\n        215351296,\n        219676672,\n        221118464,\n        222429184,\n        224657408,\n        224919552,\n        226492416,\n        228982784,\n        234094592,\n        244187136,\n        245366784,\n        251920384,\n        257687552,\n        258080768,\n        260308992,\n        260964352,\n        261226496,\n        263979008,\n        270139392,\n        274333696,\n        284164096,\n        285868032,\n        295043072,\n        297795584,\n        299761664,\n        299892736,\n        303955968,\n        304218112,\n        307757056,\n        311820288,\n        316145664,\n        319291392,\n        322306048,\n        322699264,\n       
 323747840,\n        326369280,\n        336592896,\n        336855040,\n        339738624,\n        343146496,\n        345243648,\n        346947584,\n        347602944,\n        349700096,\n        353370112,\n        358744064,\n        359399424,\n        365166592,\n        372899840,\n        376963072,\n        378273792,\n        383254528,\n        389021696,\n        390463488,\n        391905280,\n        394657792,\n        395444224,\n        396361728,\n        406847488,\n        407109632,\n        407896064,\n        412745728,\n        417333248,\n        420216832,\n        422051840,\n        422313984,\n        424148992,\n        427425792,\n        428867584,\n        429654016,\n        430440448,\n        437125120,\n        440270848,\n        443940864,\n        448397312,\n        452329472,\n        452722688,\n        455606272,\n        457572352,\n        459669504,\n        464125952,\n        468320256,\n        469106688,\n        471334912,\n        474087424,\n        474349568,\n        475004928,\n        476315648,\n        479723520,\n        480903168,\n        481689600,\n        483917824,\n        484048896,\n        484573184,\n        485883904,\n        486801408,\n        488898560,\n        489553920,\n        492306432,\n        502530048,\n        511311872,\n        512884736,\n        519307264,\n        519569408,\n        525074432,\n        525205504,\n        529793024,\n        529924096,\n        530186240,\n        531496960,\n        533331968,\n        535035904,\n        536215552,\n        539885568,\n        540016640,\n        541327360,\n        548274176,\n        558628864,\n        560857088,\n        565444608,\n        570294272,\n        572260352,\n        573833216,\n        576847872,\n        577896448,\n        589824000,\n        591921152,\n        592052224,\n        592314368,\n        604241920,\n        605814784,\n        607125504,\n        608174080,\n        609484800,\n       
 610402304,\n        613285888,\n        613941248,\n        615120896,\n        620625920,\n        621150208,\n        625082368,\n        626786304,\n        628883456,\n        637534208,\n        638320640,\n        638713856,\n        639238144,\n        640679936,\n        642121728,\n        643039232,\n        643563520,\n        646447104,\n        648544256,\n        649592832,\n        652214272,\n        654180352,\n        654704640,\n        655884288,\n        656670720,\n        662175744,\n        662306816,\n        663224320,\n        668598272,\n        670040064,\n        670957568,\n        671875072,\n        673447936,\n        674365440,\n        677773312,\n        684982272,\n        686424064,\n        691273728,\n        692584448,\n        692846592,\n        695468032,\n        701366272,\n        702021632,\n        705036288,\n        705953792,\n        706347008,\n        706740224,\n        707133440,\n        714604544,\n        716177408,\n        719716352,\n        720633856,\n        720896000,\n        724041728,\n        724303872,\n        724697088,\n        725876736,\n        728236032,\n        730988544,\n        731512832,\n        734658560,\n        737411072,\n        740425728,\n        740950016,\n        742522880,\n        742785024,\n        743047168,\n        745013248,\n        745799680,\n        746848256,\n        747110400,\n        747896832,\n        752091136,\n        756940800,\n        763887616,\n        765984768,\n        766377984,\n        767295488,\n        767819776,\n        776077312,\n        778960896,\n        781451264,\n        785907712,\n        788660224,\n        792592384,\n        797573120,\n        797835264,\n        798621696,\n        799670272,\n        802553856,\n        803078144,\n        805044224,\n        807272448,\n        809893888,\n        811728896,\n        812384256,\n        814612480,\n        814874624,\n        815398912,\n        821035008,\n       
 828243968,\n        828506112,\n        832831488,\n        832962560,\n        835059712,\n        835977216,\n        838205440,\n        838860800,\n        842661888,\n        843448320,\n        848560128,\n        850132992,\n        850657280,\n        861011968,\n        861667328,\n        865730560,\n        870580224,\n        872284160,\n        882114560,\n        883425280,\n        883556352,\n        890372096,\n        892338176,\n        896139264,\n        900726784,\n        904134656,\n        907149312,\n        908984320,\n        909901824,\n        912916480,\n        915013632,\n        916979712,\n        918159360,\n        918814720,\n        919339008,\n        920256512,\n        921960448,\n        923795456,\n        926679040,\n        928514048,\n        935198720,\n        936116224,\n        939261952,\n        942276608,\n        942669824,\n        947519488,\n        949092352,\n        949223424,\n        951844864,\n        951975936,\n        956432384,\n        958398464,\n        961544192,\n        962985984,\n        965476352,\n        974782464,\n        974913536,\n        981598208,\n        983433216,\n        984743936,\n        985661440,\n        987365376,\n        990773248,\n        992608256,\n        996802560,\n        998506496,\n        998768640,\n        1002831872,\n        1013579776,\n        1014104064,\n        1016987648,\n        1017118720,\n        1017643008,\n        1018298368,\n        1027735552,\n        1027866624,\n        1028784128,\n        1033764864,\n        1034682368,\n        1046478848,\n        1050279936,\n        1054343168,\n        1056178176,\n        1058930688,\n        1066139648,\n        1067843584,\n        1068498944,\n        1070989312,\n        1072693248,\n        1072824320,\n        1075838976,\n        1076494336,\n        1079246848,\n        1081212928,\n        1089208320,\n        1093795840,\n        1095892992,\n        1097465856,\n        
1100611584,\n        1103364096,\n        1105723392,\n        1106378752,\n        1107296256,\n        1108869120,\n        1110704128,\n        1111621632,\n        1111752704,\n        1112408064,\n        1115553792,\n        1115947008,\n        1116602368,\n        1117126656,\n        1120272384,\n        1126563840,\n        1127219200,\n        1131413504,\n        1134952448,\n        1135214592,\n        1141768192,\n        1151598592,\n        1153302528,\n        1154613248,\n        1156317184,\n        1156972544,\n        1158021120,\n        1162084352,\n        1163264000,\n        1163788288,\n        1167589376,\n        1172307968,\n        1176764416,\n        1179123712,\n        1182007296,\n        1184628736,\n        1190133760,\n        1193541632,\n        1193803776,\n        1196294144,\n        1196949504,\n        1199439872,\n        1199702016,\n        1210712064,\n        1211105280,\n        1212022784,\n        1212284928,\n        1215168512,\n        1215430656,\n        1217134592,\n        1218969600,\n        1222639616,\n        1225785344,\n        1226178560,\n        1234305024,\n        1238499328,\n        1239547904,\n        1242169344,\n        1242300416,\n        1245315072,\n        1246887936,\n        1259470848,\n        1260257280,\n        1264713728,\n        1265500160,\n        1269039104,\n        1273626624,\n        1277820928,\n        1279787008,\n        1280180224,\n        1283457024,\n        1284767744,\n        1287520256,\n        1289224192,\n        1293418496,\n        1296171008,\n        1305214976,\n        1306918912,\n        1307049984,\n        1309409280\n      ],\n      \"return\": [\n        0.0,\n        -15.208062119086582,\n        27.985837462544442,\n        33.065118919610974,\n        63.739390674432116,\n        73.38403714617094,\n        81.76407511790593,\n        83.41257841587067,\n        94.50324208100636,\n        87.02840101520219,\n        
96.66484376668932,\n        105.62438798487187,\n        103.97968959172567,\n        123.7255840688944,\n        130.92081548810003,\n        138.31118696252506,\n        143.94612252871195,\n        139.69248750011124,\n        161.23692795117697,\n        175.8585472281774,\n        158.11115952809652,\n        178.34551179766655,\n        195.95468923449516,\n        213.11414700110754,\n        189.21305813153586,\n        210.28728338837627,\n        275.5193917051951,\n        290.2456347779433,\n        328.01072924931844,\n        333.3376958012581,\n        364.03000196417173,\n        401.8089190347989,\n        394.4687174904347,\n        417.68140043894454,\n        426.2444740418593,\n        444.2910240530967,\n        434.3765281144779,\n        448.3476926716169,\n        524.3667556663355,\n        501.29599927862483,\n        509.64772761027024,\n        526.7205876469612,\n        538.6500786121686,\n        625.3346225706736,\n        607.9900910063585,\n        599.0540671960513,\n        685.6463563728333,\n        641.4686959608397,\n        668.2085113616785,\n        683.5394006729126,\n        660.5089496715864,\n        724.7123481567701,\n        721.6675941574573,\n        724.0352485704421,\n        779.669870207707,\n        760.2905054322878,\n        845.789764096737,\n        833.1766210738818,\n        924.196205678781,\n        830.4869041840235,\n        895.8986332496006,\n        844.670979039669,\n        856.6293645552795,\n        918.985225797097,\n        891.081965850989,\n        856.4227460622787,\n        907.1667261115709,\n        917.6800900681814,\n        966.6861300317446,\n        1008.8145752668382,\n        955.3873797392845,\n        959.3507392517726,\n        957.2452503728867,\n        1013.6096401135125,\n        1013.7457611942291,\n        1038.3559891557695,\n        1040.9363767464956,\n        1014.6830427742003,\n        1033.165902372996,\n        1059.8773714057604,\n        
1054.6472192327183,\n        1128.236418112119,\n        1109.1773519515991,\n        1101.6388582499821,\n        1083.090611615181,\n        1195.6467939186095,\n        1115.4643084677061,\n        1157.6829477691651,\n        1120.246314201355,\n        1091.018876689275,\n        1148.0232208383084,\n        1154.231260531346,\n        1117.6589318927129,\n        1163.7312007880212,\n        1187.4146699810028,\n        1178.052467587789,\n        1234.0955010000864,\n        1209.0325522549947,\n        1230.0507287740709,\n        1282.1594580292701,\n        1234.0622125005723,\n        1271.1317810726166,\n        1225.4457596842446,\n        1270.9370497743287,\n        1265.8080000972748,\n        1285.0425640010833,\n        1273.9300214743614,\n        1239.6171310869852,\n        1258.1802626220385,\n        1325.5853050804137,\n        1297.6688449207943,\n        1369.975601091385,\n        1344.9264648564656,\n        1358.1098932425182,\n        1362.4708606147767,\n        1349.7314314222338,\n        1346.8320728611945,\n        1401.8542354472477,\n        1355.3846225547788,\n        1387.9516682084402,\n        1368.1288306109109,\n        1388.3219243033727,\n        1406.3560511612893,\n        1421.1573902893067,\n        1403.0136197344461,\n        1477.9739192326863,\n        1441.3298983828226,\n        1404.8318265914916,\n        1472.0646509361268,\n        1450.5080246194204,\n        1433.188861509959,\n        1486.0691950257617,\n        1499.02348812898,\n        1496.1851874955494,\n        1478.6511289024354,\n        1513.782597764333,\n        1502.1337704817454,\n        1531.7802909342447,\n        1463.9768899472554,\n        1446.483530351321,\n        1510.6605901590983,\n        1529.3808396275838,\n        1572.5691393343607,\n        1531.6558720620476,\n        1523.744317480723,\n        1494.7289040152234,\n        1560.7028294316926,\n        1585.1348492209115,\n        1537.608029677073,\n        
1577.6323381169634,\n        1532.9813650194803,\n        1546.102662928899,\n        1572.8515084838866,\n        1600.6941162077585,\n        1576.739197918574,\n        1615.7113013267517,\n        1603.171879494985,\n        1576.099616212845,\n        1523.232181816101,\n        1565.5702744865418,\n        1609.7783724792798,\n        1703.4605189323427,\n        1615.7065868759157,\n        1594.436960922877,\n        1675.8799238904319,\n        1656.2481291834513,\n        1627.7645898501078,\n        1656.3225352541606,\n        1694.707741953532,\n        1655.415842622121,\n        1658.082688509623,\n        1713.2894029490153,\n        1588.20122118632,\n        1667.4801043764749,\n        1676.096830698649,\n        1687.85254655838,\n        1708.2096050071716,\n        1754.2827956136068,\n        1712.5367916679381,\n        1640.9438634514809,\n        1689.144550711314,\n        1664.0739923032124,\n        1683.9244037755332,\n        1696.7922770563762,\n        1768.1795618375143,\n        1727.5779999160768,\n        1736.8503835614522,\n        1719.941689491272,\n        1686.0338038508096,\n        1725.988723818461,\n        1738.6104576237994,\n        1740.7765754508973,\n        1705.1388481903075,\n        1670.4937730471295,\n        1658.214816932678,\n        1695.121711362203,\n        1792.4769620259603,\n        1745.4210001754761,\n        1760.5531140454611,\n        1781.594201936722,\n        1723.959014670054,\n        1671.6467679278055,\n        1757.5837479400634,\n        1741.0655566724142,\n        1774.9067363675433,\n        1751.9776115671793,\n        1750.5093571726482,\n        1795.1821841748554,\n        1745.589820652008,\n        1840.589610424042,\n        1717.551601155599,\n        1794.2092125193278,\n        1743.430915406545,\n        1779.547692756653,\n        1831.2570528539025,\n        1783.0511383438109,\n        1758.9477525520324,\n        1770.414341278076,\n        1782.2672223154705,\n     
   1818.8416625086466,\n        1744.8974377123516,\n        1782.1652972284953,\n        1861.500615870158,\n        1812.3222024790448,\n        1802.887804667155,\n        1756.6487368011476,\n        1830.3497687530519,\n        1783.0421785227456,\n        1770.1143108113608,\n        1762.8640670140585,\n        1797.7952985254924,\n        1739.315527311961,\n        1925.4324143981933,\n        1844.565812117259,\n        1859.4592933972679,\n        1782.93512550354,\n        1849.921053543091,\n        1896.0195062383016,\n        1852.6285049819944,\n        1870.3044067255653,\n        1812.883816680908,\n        1843.5550619125363,\n        1834.9702418899535,\n        1875.8230379358927,\n        1864.2732896296184,\n        1803.783723742167,\n        1781.7357353655498,\n        1851.7352653630576,\n        1828.711000213623,\n        1857.986676165263,\n        1830.465396830241,\n        1835.2933151626587,\n        1848.049705619812,\n        1831.3830486679078,\n        1903.9784999211631,\n        1841.445189297994,\n        1774.7733537991842,\n        1857.219704844157,\n        1853.6428640619913,\n        1827.793980585734,\n        1823.921475461324,\n        1918.2105669784548,\n        1919.4746598943075,\n        1856.3971628570555,\n        1857.3752022743222,\n        1857.5700806744892,\n        1898.651220486959,\n        1823.3593325424197,\n        1915.0531467819212,\n        1811.474196472168,\n        1920.302176755269,\n        1835.0142211023967,\n        1995.462091623942,\n        1945.341724770864,\n        1967.6135405858356,\n        1895.7059489440917,\n        1927.3008184973398,\n        1878.2231207148234,\n        1885.3370151138306,\n        1885.9712224833172,\n        1913.0469807688396,\n        1917.5231856409707,\n        1890.7723024622599,\n        1894.288302523295,\n        1817.398354326884,\n        1903.3093048095704,\n        1896.6997822443645,\n        1938.755055440267,\n        
1874.6556552886962,\n        1941.1620425669353,\n        1896.1112930552165,\n        1945.2559768676758,\n        1922.5934006500245,\n        1906.1691541703542,\n        1938.1057643127442,\n        1862.2512243143717,\n        1945.4173630015055,\n        1937.4000315475462,\n        1943.181752243042,\n        1997.07544312795,\n        1915.3227619679772,\n        2009.8460191726688,\n        1928.3586402511598,\n        1915.2570816675823,\n        1958.9155800882975,\n        1925.594481455485,\n        1903.2088528315228,\n        1948.8292271677656,\n        1903.9699386978148,\n        1866.7248757425943,\n        1931.2256560770668,\n        1949.2665031051638,\n        1971.5855646133423,\n        2036.839611829122,\n        1957.3344420369467,\n        1971.0834051767986,\n        1930.8539343388875,\n        1910.7293909327188,\n        1963.4615940475462,\n        1936.6964011383054,\n        1920.6460943349202,\n        1964.1139424006142,\n        1939.6381277974444,\n        1983.1162591298423,\n        1969.4077895991006,\n        1916.6050984573365,\n        1989.2113303375245,\n        1848.2459172439576,\n        1952.5798476282755,\n        1950.3160333887736,\n        2034.1265435791017,\n        1957.3808938471477,\n        1975.661287091573,\n        1952.5201565678915,\n        1955.1099046198526,\n        1849.4640664164226,\n        2036.2135239664713,\n        1902.6708364105223,\n        1976.565142110189,\n        2000.1304290771484,\n        1970.1112725194296,\n        1999.9440422439575,\n        1968.6209470717113,\n        1991.952435569763,\n        1978.2394967905682,\n        2020.874125760396,\n        2021.6539390945434,\n        1970.0303056081136,\n        1953.0815971247357,\n        1943.7658596038818,\n        2006.5767294057212,\n        2002.7899671554562,\n        1949.61315633138,\n        1911.2338114929198,\n        1962.1358880996704,\n        2024.1895262781782,\n        2029.7549323145547,\n        
1971.429596494039,\n        2015.836051381429,\n        1989.5375478617352,\n        1955.2756966527304,\n        2022.758341852824,\n        2020.1011663182574,\n        1985.6239275360106,\n        1995.0723993174233,\n        1984.3480288060507,\n        1990.6294145965576,\n        1941.2070553334554,\n        2012.1252802149454,\n        1954.0151485188799,\n        1966.6849535624187,\n        1956.8263164011635,\n        2009.279321454366,\n        1971.528699051539,\n        1971.0075210444131,\n        1974.9195355733236,\n        2014.5018423970541,\n        1963.5272241973878,\n        2062.865896174113,\n        2064.069284261068,\n        2032.2438666407268,\n        2045.7047826004027,\n        1994.6503591664634,\n        1944.566607913971,\n        2000.897696100871,\n        2038.6879258982342,\n        1946.1669205729165,\n        1967.0156407292682,\n        1937.578423576355,\n        2037.6293779500327,\n        1968.8212608210245,\n        2010.168547808329,\n        2016.3574635696411,\n        2048.5558566919963,\n        1975.3541211573283,\n        2061.4655352020263,\n        1937.0920262018838,\n        2034.1998028818764,\n        2086.1933650080364,\n        2010.8763928985597,\n        2010.5087301635742,\n        2027.8863892364504,\n        1981.2459899648031,\n        2012.5796449931465,\n        2036.9558761215212,\n        1977.7446804300944,\n        2063.926620445252,\n        2099.370090967814,\n        2031.5188530985515,\n        2079.7016563924153,\n        2036.4256989796959,\n        1980.0338309478757,\n        1954.092360877991,\n        2026.7211258061727,\n        2042.9697960281374,\n        2037.1259219614665,\n        2016.3184883499146,\n        2117.8120729827883,\n        1992.8279691696168,\n        1965.9154495493574,\n        1946.7313609409332,\n        2057.197262687683,\n        2033.9488484064739,\n        2022.2097825749715,\n        2064.514777202606,\n        2023.224939028422,\n        
2095.5899105580647,\n        2000.7877479807537,\n        1995.3137570063275,\n        2066.0285115305583,\n        2057.512433611552,\n        2072.2311502329508,\n        2018.1963609441125,\n        2038.0414914067585,\n        2055.4949537150064,\n        2100.3472706476846,\n        2047.5307428741453,\n        2046.3145210266111,\n        2070.8851332473755,\n        2031.5462546221415,\n        2070.7231659825643,\n        2043.2447252909342,\n        2080.842988020579,\n        2043.9892267862954,\n        2013.1029631296794,\n        2077.845181528727,\n        1999.9314056777955,\n        2024.9522764460246,\n        2041.5528373972575,\n        2076.5959600957235,\n        2093.144813868205,\n        1954.2837760353088,\n        2075.3821962610878,\n        2055.6548400624592,\n        2082.33140557607,\n        2025.3550900268554,\n        2046.7053001149495,\n        2069.434745763143,\n        2030.7184680811563,\n        2129.1889733250937,\n        2045.801846707662,\n        2053.877966028849,\n        2099.7886859385176,\n        2071.1038087495167,\n        2113.2374559020996,\n        2142.6716127904256,\n        2041.0685925801592,\n        2022.7078050359094,\n        2069.1886087226867,\n        2069.757692337036,\n        2128.6494229634604,\n        2141.4252186838785,\n        2052.935356267293,\n        2093.082716585795,\n        2135.0062626902263,\n        2099.3032523727416,\n        2047.962077682813,\n        2055.616015446981,\n        2096.431504268646,\n        2165.0426854324337,\n        2071.234339103699,\n        2026.6400920359295,\n        2068.65818195343,\n        2060.2222735087075,\n        2063.7241380437213,\n        2022.5148383331298,\n        2068.477814381917,\n        2088.7389111423495,\n        2072.564161783854,\n        2097.643168818156,\n        2138.8968943659465,\n        2132.4746655718486,\n        2145.5754326375322,\n        2101.0678678512572,\n        2086.687912394206,\n        2057.179594141642,\n 
       2086.038907839457,\n        2093.507007738749,\n        2101.1903778711953,\n        2098.7112515767417,\n        2140.860017267863\n      ],\n      \"return_min\": [\n        0.0,\n        -27.161190098873938,\n        -22.596889943332886,\n        -18.04884047175878,\n        34.96802557070835,\n        42.14144832782458,\n        42.18764349919466,\n        55.902530695070986,\n        77.36366604239011,\n        71.03807036708037,\n        81.47330597845952,\n        97.73028509544966,\n        97.97264168524211,\n        105.96846207393762,\n        110.22956162854999,\n        128.14846576261468,\n        108.559148752248,\n        123.92340874979348,\n        132.78944635044505,\n        159.42855628495957,\n        144.46347678148942,\n        157.8610229525654,\n        158.35143035502105,\n        197.7479254473252,\n        160.54953800824737,\n        188.7994952359248,\n        217.39242615649243,\n        244.56688681087465,\n        301.11488470145423,\n        256.17273889754574,\n        292.8235861221529,\n        314.92021021487477,\n        306.4235374628608,\n        325.3520406500228,\n        344.34044697638694,\n        368.11517927384756,\n        336.8252980067012,\n        333.9395555124325,\n        427.2549358603718,\n        371.37608381527093,\n        420.9938648544428,\n        389.18023388585493,\n        388.7416321479917,\n        472.426062403483,\n        420.6928762920395,\n        424.0957463022822,\n        507.2579861115297,\n        482.9738704260956,\n        514.3924576752916,\n        515.5455037645223,\n        505.47987268452863,\n        549.1951941961452,\n        576.1318275055664,\n        554.7743324166245,\n        578.684664797232,\n        592.2225573434476,\n        624.6161211833768,\n        633.7916668340124,\n        680.684790989311,\n        628.6966472885151,\n        661.4334250216607,\n        693.722886939662,\n        632.8989605489518,\n        736.02060626199,\n        709.4836056376417,\n 
       656.7443508943927,\n        726.4691582918115,\n        733.7766667428998,\n        772.0049538980603,\n        771.1210966295357,\n        687.8146653746687,\n        772.2877183570141,\n        757.2506362258152,\n        831.5431251001323,\n        764.1121868299346,\n        774.9296975673354,\n        779.6430454129754,\n        760.2656831857466,\n        837.0739666126292,\n        834.701324203895,\n        820.945902380164,\n        918.2923429426099,\n        912.5397683243333,\n        865.2231092682929,\n        851.5707559534205,\n        961.1212342855194,\n        841.2827250647913,\n        908.2660836637016,\n        853.0973778229541,\n        842.277914583651,\n        895.7067200072679,\n        903.8746022376938,\n        909.0731031912112,\n        899.3540128471468,\n        892.1310636941934,\n        914.4578585600933,\n        1000.6894379726076,\n        930.2474086813293,\n        928.192620748513,\n        982.6184080156437,\n        985.0062257377579,\n        972.0931290747644,\n        1020.8499845695702,\n        1043.6865484733269,\n        962.5154794990744,\n        995.2043902628145,\n        1016.0871129267222,\n        999.3151881903785,\n        988.0216683911464,\n        1076.0456266138542,\n        1038.413815810871,\n        1134.5676517335849,\n        1127.0582451565451,\n        1029.4210202300158,\n        1008.7807577978716,\n        1131.6949019203973,\n        1084.653325585139,\n        1178.7731229642109,\n        1113.7457074972601,\n        1113.0967302489153,\n        1074.7978853578866,\n        1061.731097782584,\n        1118.770170470866,\n        1100.9872796639804,\n        1153.3560738865303,\n        1179.5669987881379,\n        1131.3296485345286,\n        1115.279419330474,\n        1150.983972038616,\n        1150.3776378371845,\n        1091.8778342513265,\n        1174.5594091944524,\n        1144.1578655765995,\n        1226.8169439788194,\n        1206.4039871117989,\n        
1190.3003022235698,\n        1144.5405407138749,\n        1232.5913778660001,\n        1154.2737017656432,\n        1170.5795243711475,\n        1220.6887059803785,\n        1205.7898655331019,\n        1351.4844213259557,\n        1217.0863485837851,\n        1227.702084485864,\n        1179.059482340776,\n        1271.70303368492,\n        1248.3406054138006,\n        1248.1741466642331,\n        1277.1677626249261,\n        1214.558260797336,\n        1197.5053085935224,\n        1285.9107737681893,\n        1303.3717898497944,\n        1226.9067480283068,\n        1281.0673349128906,\n        1270.0320762435354,\n        1194.3613478912507,\n        1213.3823682848417,\n        1185.1487094475583,\n        1310.7679868182706,\n        1370.9720882821025,\n        1268.202556716447,\n        1242.343764031949,\n        1346.877691688978,\n        1317.495343061811,\n        1314.568714119066,\n        1361.3424888032723,\n        1339.0235495620177,\n        1267.3943711979912,\n        1301.7229607501765,\n        1368.062464069575,\n        1236.568057266568,\n        1365.4843721048537,\n        1267.4246117227956,\n        1380.7292223209784,\n        1404.7881334910667,\n        1420.0921517039255,\n        1361.3780125494404,\n        1332.0237658028361,\n        1376.9324775195712,\n        1341.9790275047662,\n        1318.1740906143439,\n        1328.6053661732908,\n        1424.480831565852,\n        1373.1092791573649,\n        1374.772976307375,\n        1324.893876172093,\n        1286.3563379195516,\n        1346.6836717547092,\n        1369.9218004416691,\n        1330.8701694224194,\n        1280.8059137701707,\n        1254.5575411723346,\n        1335.9852094244845,\n        1364.3224842003256,\n        1385.5057623830635,\n        1380.023729056787,\n        1432.4021606571796,\n        1358.2536781852564,\n        1359.5747690286767,\n        1284.2510806986868,\n        1455.7353569125598,\n        1388.2634609261072,\n        
1381.3655357175812,\n        1374.0751343798888,\n        1411.685858399476,\n        1387.5794869635397,\n        1375.6519713514167,\n        1458.1457389245174,\n        1345.079579841159,\n        1388.019191836721,\n        1386.2673140956695,\n        1373.1435768897766,\n        1483.1654724490288,\n        1419.1127589141586,\n        1422.6441999617675,\n        1394.316212442754,\n        1435.0119399069597,\n        1486.8789219870775,\n        1388.1124059768877,\n        1408.8235776500576,\n        1388.8400768587592,\n        1381.4120399968901,\n        1465.4034592263815,\n        1400.4350416301975,\n        1504.3549284270698,\n        1365.002290992044,\n        1419.4738236188787,\n        1385.7303798776138,\n        1380.7123758037292,\n        1374.325530176849,\n        1546.4586301789147,\n        1416.9020671146745,\n        1451.6460060492382,\n        1395.8395607612806,\n        1516.8886349128268,\n        1566.2220417641367,\n        1443.3438756018134,\n        1497.0517409211652,\n        1439.7744408874278,\n        1461.1655404900193,\n        1431.6120416100669,\n        1482.632285336264,\n        1445.61722471427,\n        1373.9022471610704,\n        1383.4778009699876,\n        1514.5874970761788,\n        1449.3897922200754,\n        1459.2718640980195,\n        1445.673988689869,\n        1481.867102547193,\n        1416.5571935452258,\n        1428.6376018618348,\n        1471.5909162454504,\n        1437.9651272477709,\n        1376.6714314274057,\n        1435.2779243640286,\n        1460.0146592348506,\n        1444.573448109354,\n        1425.1645320221062,\n        1589.7575798981375,\n        1537.6307690887043,\n        1481.7225802245484,\n        1456.5958771890605,\n        1430.5105954220862,\n        1460.8339212615228,\n        1442.4455362521799,\n        1487.3489052326277,\n        1450.4672045571854,\n        1477.3002344345819,\n        1395.6450137317802,\n        1568.0512429525927,\n        
1491.1838236247427,\n        1565.7123137775534,\n        1484.6256077306602,\n        1515.8810622500755,\n        1472.0829419518004,\n        1472.7142075134593,\n        1505.0820518975652,\n        1497.5566648524955,\n        1498.4511328410176,\n        1466.658886426278,\n        1543.8937521195107,\n        1450.2672900390025,\n        1430.1587646486926,\n        1498.9568595144442,\n        1447.542614495117,\n        1447.4056109371613,\n        1507.8509163027904,\n        1455.4372825667892,\n        1452.9341609485696,\n        1498.1143864181686,\n        1515.1634447227061,\n        1468.4866831933964,\n        1442.4412955931375,\n        1527.5556465426737,\n        1536.0565024663938,\n        1514.247800150507,\n        1583.536500617152,\n        1474.5626088776535,\n        1509.91882996999,\n        1528.1334189297806,\n        1470.9582023660391,\n        1578.043905306006,\n        1520.5414630516445,\n        1441.6468159267697,\n        1470.1766185048043,\n        1482.3683791048722,\n        1467.4125032446327,\n        1524.607720528038,\n        1503.150773887472,\n        1486.3705340005833,\n        1559.1060328345488,\n        1569.8997699314025,\n        1520.4322467032387,\n        1396.8354230278119,\n        1453.6314931495442,\n        1517.509476294477,\n        1450.7759682630508,\n        1474.1624315114384,\n        1480.0016272160174,\n        1532.2355047038743,\n        1520.3996238733514,\n        1519.2821826630918,\n        1528.9508599691799,\n        1543.4562284675474,\n        1432.0963105506623,\n        1442.7383662102407,\n        1495.1000309817264,\n        1596.8286355215296,\n        1548.0623781368479,\n        1513.473194333779,\n        1445.623128198912,\n        1558.6842115517425,\n        1470.387713310401,\n        1577.066097963253,\n        1454.9469317209691,\n        1526.2121612927629,\n        1574.327076638486,\n        1594.8228845777896,\n        1576.1779926049167,\n        
1483.476359836326,\n        1506.2643467472722,\n        1488.250470296554,\n        1560.3696060920358,\n        1631.5524746290182,\n        1494.4993851391039,\n        1565.1822489015794,\n        1449.5369646032502,\n        1569.2371285379234,\n        1563.1984140915442,\n        1521.9330510275392,\n        1544.174194066606,\n        1558.2547751910918,\n        1561.1145405838733,\n        1552.731538143909,\n        1578.208912640424,\n        1516.625577093806,\n        1628.3454945911135,\n        1479.4409458937757,\n        1580.0587191038408,\n        1563.635343391275,\n        1519.502138217722,\n        1513.6321589265149,\n        1498.35854524867,\n        1501.401110032914,\n        1544.7897019286283,\n        1540.6201225300201,\n        1497.3815390956493,\n        1569.4663899211246,\n        1512.6643267583618,\n        1535.621112398052,\n        1495.5567046492806,\n        1429.5073148231895,\n        1557.4443777879355,\n        1567.0962056198289,\n        1530.7736527134039,\n        1584.7302307758193,\n        1560.4982829956334,\n        1596.559082695724,\n        1641.3211833407408,\n        1517.8796661373995,\n        1450.5012692957116,\n        1517.66185857882,\n        1623.861862632764,\n        1524.60548796018,\n        1501.9337103757932,\n        1446.0298615019626,\n        1606.0463985254053,\n        1583.0037423751878,\n        1606.8359237011962,\n        1579.0443105882025,\n        1666.6902098590833,\n        1476.5094490138526,\n        1579.4924227062668,\n        1502.4046419676313,\n        1581.0602606282112,\n        1643.893308246401,\n        1534.4804488328232,\n        1537.0412616431054,\n        1580.3659515002705,\n        1496.6012652120808,\n        1659.6623235855254,\n        1633.5241005729476,\n        1509.08508107819,\n        1611.0853749851426,\n        1629.3447044536451,\n        1577.0238591286475,\n        1741.3789317090202,\n        1604.8568264986948,\n        
1464.4603215632667,\n        1528.850724150617,\n        1601.3909147640104,\n        1563.432721183317,\n        1563.117337356342,\n        1534.0604064009335,\n        1682.1950946507377,\n        1589.7012207264347,\n        1529.3202351849368,\n        1562.783527089386,\n        1596.4480706175289,\n        1612.1582084256165,\n        1556.2473492890576,\n        1572.5127956006552,\n        1579.9003587354428,\n        1603.3815175879731,\n        1496.2912121768586,\n        1547.1989400956572,\n        1529.7008966898647,\n        1591.9319267874932,\n        1620.66586739163,\n        1524.8975652680986,\n        1596.7308901438619,\n        1578.509760776493,\n        1644.9447567025031,\n        1608.9701122356018,\n        1584.3660853819813,\n        1583.958221531358,\n        1547.7429354654662,\n        1588.0843985723182,\n        1620.043228572627,\n        1624.8077049713972,\n        1535.407323620321,\n        1615.0586351498025,\n        1677.2807431240822,\n        1575.592527627532,\n        1644.6257283034101,\n        1612.8801666411941,\n        1630.8985728703,\n        1615.2113297627213,\n        1546.6288675826943,\n        1664.8153291931917,\n        1553.9245644556117,\n        1572.1012552448246,\n        1562.6978368470532,\n        1604.6422087335768,\n        1613.4301426983723,\n        1637.4686336857023,\n        1654.803314447553,\n        1572.5107533638277,\n        1648.6667519103562,\n        1605.3503187941099,\n        1585.1654769940812,\n        1650.9726708678616,\n        1699.881191757293,\n        1544.4039259615927,\n        1650.2981905111487,\n        1643.155559833605,\n        1621.1338818819904,\n        1745.929102319576,\n        1666.0085948359579,\n        1633.82751805364,\n        1645.624210375258,\n        1662.397404348184,\n        1662.432444994714,\n        1590.101139389238,\n        1675.457243838382,\n        1622.1487061811008,\n        1713.4838525418695,\n        1662.250600129536,\n    
    1619.9537068312998,\n        1580.1483505469207,\n        1660.4698799430842,\n        1613.5100304468401,\n        1530.5004914938645,\n        1684.3522602101496,\n        1580.6234154752178,\n        1613.4683769892192,\n        1589.010467444366,\n        1694.1019780951112,\n        1667.2330748321283,\n        1708.554258745362,\n        1575.2434393823205,\n        1606.5804405155193,\n        1654.5279706504311,\n        1603.8763502603792,\n        1600.9958365716125,\n        1654.6551724197448,\n        1690.9247311841996,\n        1644.1095798874194\n      ],\n      \"return_max\": [\n        0.0,\n        -3.254934139299225,\n        78.56856486842177,\n        84.17907831098073,\n        92.51075577815588,\n        104.6266259645173,\n        121.34050673661721,\n        110.92262613667035,\n        111.64281811962262,\n        103.01873166332402,\n        111.85638155491911,\n        113.51849087429409,\n        109.98673749820924,\n        141.4827060638512,\n        151.61206934765008,\n        148.47390816243544,\n        179.3330963051759,\n        155.461566250429,\n        189.68440955190889,\n        192.28853817139523,\n        171.75884227470362,\n        198.8300006427677,\n        233.55794811396927,\n        228.48036855488988,\n        217.87657825482435,\n        231.77507154082772,\n        333.6463572538978,\n        335.92438274501194,\n        354.90657379718266,\n        410.5026527049705,\n        435.23641780619056,\n        488.69762785472307,\n        482.5138975180086,\n        510.0107602278663,\n        508.14850110733164,\n        520.4668688323459,\n        531.9277582222546,\n        562.7558298308014,\n        621.4785754722992,\n        631.2159147419787,\n        598.3015903660977,\n        664.2609414080675,\n        688.5585250763455,\n        778.2431827378641,\n        795.2873057206775,\n        774.0123880898204,\n        864.034726634137,\n        799.9635214955838,\n        822.0245650480654,\n        
851.5332975813029,\n        815.5380266586442,\n        900.229502117395,\n        867.2033608093482,\n        893.2961647242598,\n        980.6550756181821,\n        928.3584535211279,\n        1066.9634070100972,\n        1032.5615753137513,\n        1167.707620368251,\n        1032.2771610795319,\n        1130.3638414775405,\n        995.619071139676,\n        1080.359768561607,\n        1101.949845332204,\n        1072.6803260643364,\n        1056.1011412301648,\n        1087.8642939313304,\n        1101.583513393463,\n        1161.367306165429,\n        1246.5080539041408,\n        1222.9600941039002,\n        1146.4137601465309,\n        1157.2398645199582,\n        1195.6761551268926,\n        1263.3793355585235,\n        1301.7822807442035,\n        1302.2297080800158,\n        1269.100402362654,\n        1229.2578381333628,\n        1285.0534186076259,\n        1288.3485360852726,\n        1338.180493281628,\n        1305.8149355788648,\n        1338.0546072316713,\n        1314.6104672769413,\n        1430.1723535516996,\n        1389.645891870621,\n        1407.0998118746288,\n        1387.395250579756,\n        1339.759838794899,\n        1400.3397216693488,\n        1404.587918824998,\n        1326.2447605942145,\n        1428.1083887288955,\n        1482.6982762678122,\n        1441.6470766154844,\n        1467.5015640275653,\n        1487.8176958286601,\n        1531.9088367996287,\n        1581.7005080428967,\n        1483.1181992633867,\n        1570.1704330704688,\n        1430.041534798919,\n        1498.1875510753305,\n        1569.100520695475,\n        1574.880737739352,\n        1531.7729300220008,\n        1479.9190739835917,\n        1528.3388568529308,\n        1575.1249835469732,\n        1556.9238740307176,\n        1605.3835504491851,\n        1562.794684556386,\n        1686.7987662550206,\n        1716.160963431682,\n        1567.7679609240702,\n        1609.01082013725,\n        1624.9353479302845,\n        1597.0235376122976,\n      
  1662.8066061679651,\n        1661.4597758639352,\n        1714.9127508241615,\n        1693.9419318517125,\n        1741.327500914633,\n        1652.671165582362,\n        1776.3808396772347,\n        1751.3301482311167,\n        1694.3842338525092,\n        1793.1453298336376,\n        1750.6384114016564,\n        1774.4998887685915,\n        1797.578980857071,\n        1853.8891106813605,\n        1765.5534310122794,\n        1750.898270693072,\n        1837.2648933050964,\n        1859.727000249616,\n        1830.9692040024893,\n        1773.6800781288675,\n        1722.3875363314944,\n        1800.6324743378182,\n        1852.9718137220657,\n        1793.6538573427656,\n        1846.22539554031,\n        1819.7865504755823,\n        1810.3983256896709,\n        1849.7026251784653,\n        1921.9290930280224,\n        1827.041912689913,\n        1878.0969136090007,\n        1851.4044692416246,\n        1894.7000172642756,\n        1859.792243199584,\n        1898.0164425657226,\n        1926.5716478088414,\n        1950.3552677406128,\n        1936.3116827464346,\n        1957.8378845344391,\n        1833.0819953473604,\n        1945.9918395255254,\n        1908.788758140289,\n        2035.9489495825828,\n        1963.2106170353845,\n        1946.530157813805,\n        2004.8821560918857,\n        1995.0009153050917,\n        1940.9604655811495,\n        1951.3025817050489,\n        2050.391934345046,\n        2043.437314046251,\n        2014.4424162690696,\n        2058.5163418284556,\n        1939.834385106072,\n        1969.475836648096,\n        2084.7690496745026,\n        1994.9758707957817,\n        2011.6310765232765,\n        2088.473439523288,\n        2063.695570786436,\n        1949.8639611001256,\n        2001.3566239030567,\n        1986.1689571016586,\n        2049.6747169367227,\n        2064.9791879394616,\n        2111.8782921091765,\n        2082.046720674789,\n        2098.927790815529,\n        2114.989502810451,\n        
2085.7112697820676,\n        2105.2937758822127,\n        2107.29911480593,\n        2150.682981479375,\n        2129.4717826104443,\n        2086.4300049219246,\n        1980.4444244408717,\n        2025.9209385240806,\n        2199.448161668857,\n        2110.8182712941652,\n        2088.7040674337427,\n        2204.9347256881874,\n        2088.3432603114315,\n        2059.042455156924,\n        2059.432138967567,\n        2093.8676524187213,\n        2168.4479370175054,\n        2129.88008875447,\n        2089.33285594582,\n        2202.784881386171,\n        2115.5276699525994,\n        2223.0334819235663,\n        2090.023622470039,\n        2200.3992332019347,\n        2100.5945167174204,\n        2185.9518086235294,\n        2179.348633258776,\n        2146.989517773463,\n        2095.2513051422975,\n        2146.512470113398,\n        2129.5225047239815,\n        2150.804403030216,\n        2101.6824694478155,\n        2155.5070168069333,\n        2334.1611548815567,\n        2243.2323649611994,\n        2140.3721501079285,\n        2112.8624319720975,\n        2156.344609079034,\n        2201.0820660534473,\n        2120.754798003843,\n        2139.997754150503,\n        2214.8782212472556,\n        2104.305524447073,\n        2304.4061986174715,\n        2272.2295571198433,\n        2267.2725807452975,\n        2170.030690245799,\n        2182.9534721733553,\n        2225.8169707124666,\n        2261.9131343621757,\n        2243.5570725299654,\n        2185.9931924743883,\n        2225.9445833350533,\n        2238.32844216984,\n        2269.0137905355214,\n        2282.929354544967,\n        2233.665200323264,\n        2179.993669761112,\n        2188.8830336499364,\n        2208.0322082071707,\n        2256.701488232506,\n        2215.256804970613,\n        2188.7195277781243,\n        2279.542217694398,\n        2234.128495473981,\n        2336.366083596876,\n        2244.925251348217,\n        2172.875276170963,\n        2279.161485324285,\n        
2247.271068889132,\n        2211.014513062114,\n        2222.6784189005416,\n        2246.663554058772,\n        2301.3185506999107,\n        2231.0717454895625,\n        2258.154527359584,\n        2284.629565926892,\n        2336.468519712395,\n        2204.2731288326595,\n        2342.7573883312148,\n        2172.4811883871503,\n        2363.3041190759564,\n        2274.3834284730133,\n        2422.872940295291,\n        2399.499625916985,\n        2369.514767394118,\n        2306.786290157523,\n        2338.720574744604,\n        2284.3632994778463,\n        2297.959822714202,\n        2266.8603930690692,\n        2328.5372966851837,\n        2336.595238440924,\n        2314.8857184982417,\n        2244.6828529270792,\n        2184.5294186147653,\n        2376.4598449704486,\n        2294.442704974285,\n        2429.967496385417,\n        2301.905699640231,\n        2374.4731688310803,\n        2336.7853035436437,\n        2437.577792786782,\n        2347.0724148818804,\n        2297.1748636180023,\n        2407.724845432092,\n        2282.061153035606,\n        2363.2790794603375,\n        2338.7435606286986,\n        2372.115704335577,\n        2410.614385638748,\n        2356.082915058301,\n        2509.7732083753476,\n        2328.583861572539,\n        2359.5559609691254,\n        2339.787254870589,\n        2330.6474998593253,\n        2364.770889736276,\n        2427.481835830727,\n        2325.571498290757,\n        2266.037248240556,\n        2337.8435916260955,\n        2395.3822323228555,\n        2456.8005952261014,\n        2514.5731908236953,\n        2344.769114142491,\n        2421.7345636503587,\n        2464.872445649963,\n        2367.8272887158937,\n        2409.413711800615,\n        2422.61683401356,\n        2367.129757158402,\n        2448.226257585211,\n        2347.040750891015,\n        2445.8328943863335,\n        2419.5333965351097,\n        2304.259336945493,\n        2434.966432207502,\n        2264.3955239372526,\n        
2462.4213290463103,\n        2405.532035795821,\n        2471.424451636674,\n        2366.6994095574473,\n        2437.8493798493673,\n        2459.417184936871,\n        2351.5355976879628,\n        2228.540419522444,\n        2495.3609499696895,\n        2350.3947411000754,\n        2426.9181229276155,\n        2425.9337815158106,\n        2345.3996604610697,\n        2423.7100918829983,\n        2453.7655343070965,\n        2477.640524392254,\n        2468.2285232845825,\n        2481.3786454287565,\n        2411.7554035600683,\n        2445.5612260771236,\n        2340.980945347892,\n        2437.9947546045137,\n        2443.9163302735187,\n        2442.381520219368,\n        2377.293261635221,\n        2278.2934289192335,\n        2366.017001008249,\n        2487.264511972483,\n        2506.7783264852005,\n        2364.650280347654,\n        2515.046525669052,\n        2350.729601132357,\n        2431.110447411685,\n        2465.4579646018074,\n        2476.5669892452397,\n        2451.745716854299,\n        2476.512639708332,\n        2470.3375123634314,\n        2479.857719160201,\n        2337.6244087382825,\n        2483.6304378998707,\n        2410.6487579421105,\n        2363.903517203713,\n        2400.988306043965,\n        2482.93753051068,\n        2447.5006934537973,\n        2512.5077272656367,\n        2392.3946933587117,\n        2461.9074791742796,\n        2396.2807956813717,\n        2541.0015615724064,\n        2567.6402855265023,\n        2467.9286505857294,\n        2450.0883818600646,\n        2471.4210521955274,\n        2438.6319465322304,\n        2484.1335336229217,\n        2453.5139891637045,\n        2367.7283531856533,\n        2432.0975710827433,\n        2429.126985650747,\n        2469.2123573746603,\n        2354.638779266861,\n        2413.501171915462,\n        2453.67061655108,\n        2430.421503524909,\n        2474.198793300804,\n        2543.438647697786,\n        2371.7794104361365,\n        2487.3393451355414,\n       
 2528.493421769672,\n        2487.2723369642963,\n        2483.976198684043,\n        2475.4068269726304,\n        2465.8907147175255,\n        2365.4969664007676,\n        2440.387651670095,\n        2446.404279781999,\n        2516.7678659053613,\n        2569.395477481983,\n        2486.0138470684556,\n        2418.0243810758107,\n        2467.994571460697,\n        2495.6073403324845,\n        2379.3339976053653,\n        2452.0513368483353,\n        2522.5068708729577,\n        2511.134506566591,\n        2498.5765702988956,\n        2553.429051314839,\n        2395.954717612799,\n        2402.510663913778,\n        2330.6791947924803,\n        2517.9464547578377,\n        2455.739488387331,\n        2488.1722158608854,\n        2556.516758804557,\n        2466.5495193214015,\n        2587.7983035281563,\n        2505.284283784649,\n        2443.4285739169977,\n        2602.356126371252,\n        2523.09294043561,\n        2523.7964330742716,\n        2511.4951566201266,\n        2479.352092669655,\n        2532.48014665352,\n        2555.749784592866,\n        2486.0913735126887,\n        2508.262956671241,\n        2557.812044963393,\n        2515.3495737788166,\n        2553.3619333928104,\n        2466.4462220092414,\n        2536.8782710697606,\n        2552.5711299522695,\n        2411.1472911095566,\n        2478.409619933372,\n        2424.270283728059,\n        2405.278824588639,\n        2470.225508153321,\n        2522.293347321147,\n        2571.0782979736887,\n        2361.938684487923,\n        2485.949063328984,\n        2557.3851156693067,\n        2592.561555907316,\n        2488.0123432066575,\n        2488.7683914963222,\n        2525.439348827914,\n        2423.96830247661,\n        2603.5746322026343,\n        2519.0929400514965,\n        2459.089180147342,\n        2594.2270530829255,\n        2557.042140504952,\n        2575.502240936338,\n        2585.462033823558,\n        2537.7332591987256,\n        2395.11741956067,\n        
2495.2216576117685,\n        2518.381502792082,\n        2511.369743607345,\n        2616.841842531799,\n        2472.043194480946,\n        2540.541222796332,\n        2607.6151210322687,\n        2536.174059750769,\n        2505.823015976388,\n        2435.7747870555795,\n        2570.7143023561916,\n        2616.601518322998,\n        2480.2180780778617,\n        2433.326477240559,\n        2557.168013359939,\n        2459.9746670743307,\n        2513.9382456406024,\n        2514.529185172395,\n        2452.603368553685,\n        2596.854406809481,\n        2531.659946578489,\n        2606.275870191946,\n        2583.691810636782,\n        2597.716256311569,\n        2582.5966065297025,\n        2626.892296320194,\n        2566.795384272892,\n        2459.831217632853,\n        2568.2014654185346,\n        2586.0181789058856,\n        2547.725583322646,\n        2506.497771969284,\n        2637.6104546483066\n      ]\n    }\n  },\n  \"Isaac-Velocity-Flat-H1-v0\": {\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        306.64285714285717,\n        613.2857142857143,\n        919.9285714285714,\n        1226.5714285714287,\n        1533.2142857142858,\n        1839.857142857143,\n        2146.5,\n        2453.1428571428573,\n        2759.785714285714,\n        3066.4285714285716,\n        3373.0714285714284,\n        3679.714285714286,\n        3986.3571428571427,\n        4293.0\n      ],\n      \"env_step\": [\n        0,\n        20480000.0,\n        40960000.0,\n        61440000.0,\n        81920000.0,\n        102400000.0,\n        122880000.0,\n        143360000.0,\n        163840000.0,\n        184320000.0,\n        204800000.0,\n        225280000.0,\n        245760000.0,\n        266240000.0,\n        286720000.0\n      ],\n      \"return\": [\n        0.0,\n        17.756866772969563,\n        27.37510617574056,\n        31.223042170206707,\n        31.729761759440105,\n        32.67671267191569,\n        32.91831970214844,\n        
33.834747314453125,\n        33.87761561075846,\n        33.86462910970052,\n        35.029486338297524,\n        35.26934560139974,\n        35.934749603271484,\n        35.113051096598305,\n        35.61514027913412\n      ],\n      \"return_min\": [\n        0.0,\n        14.716606314864778,\n        25.070346544765535,\n        31.068635948141047,\n        30.948657101492984,\n        31.904495164552372,\n        31.709390777382133,\n        33.20864308639823,\n        33.359869135298865,\n        33.35168347791627,\n        34.427511220703074,\n        34.53568214692868,\n        35.450113096166,\n        34.36188582932133,\n        35.525983562100734\n      ],\n      \"return_max\": [\n        0.0,\n        20.79712723107435,\n        29.679865806715583,\n        31.377448392272367,\n        32.51086641738723,\n        33.448930179279,\n        34.12724862691474,\n        34.46085154250802,\n        34.39536208621806,\n        34.37757474148478,\n        35.63146145589197,\n        36.003009055870805,\n        36.41938611037697,\n        35.86421636387528,\n        35.7042969961675\n      ]\n    },\n    \"null\": {\n      \"time\": [\n        0.0,\n        1.938,\n        3.876,\n        5.814,\n        7.752,\n        9.69,\n        11.628,\n        13.566,\n        15.504,\n        17.442,\n        19.38,\n        21.318,\n        23.256,\n        25.194,\n        27.132,\n        29.07,\n        31.008,\n        32.946,\n        34.884,\n        36.822,\n        38.76,\n        40.698,\n        42.636,\n        44.574,\n        46.512,\n        48.45,\n        50.388,\n        52.326,\n        54.264,\n        56.202,\n        58.14,\n        60.078,\n        62.016,\n        63.954,\n        65.892,\n        67.83,\n        69.768,\n        71.706,\n        73.644,\n        75.582,\n        77.52,\n        79.458,\n        81.396,\n        83.334,\n        85.272,\n        87.21,\n        89.148,\n        91.086,\n        93.024,\n        94.962,\n        
96.9,\n        98.838,\n        100.776,\n        102.714,\n        104.652,\n        106.59,\n        108.528,\n        110.466,\n        112.404,\n        114.342,\n        116.28,\n        118.218,\n        120.156,\n        122.094,\n        124.032,\n        125.97,\n        127.908,\n        129.846,\n        131.784,\n        133.722,\n        135.66,\n        137.598,\n        139.536,\n        141.474,\n        143.412,\n        145.35,\n        147.288,\n        149.226,\n        151.164,\n        153.102,\n        155.04,\n        156.978,\n        158.916,\n        160.854,\n        162.792,\n        164.73,\n        166.668,\n        168.606,\n        170.544,\n        172.482,\n        174.42,\n        176.358,\n        178.296,\n        180.234,\n        182.172,\n        184.11,\n        186.048,\n        187.986,\n        189.924,\n        191.862,\n        193.8,\n        195.738,\n        197.676,\n        199.614,\n        201.552,\n        203.49,\n        205.428,\n        207.366,\n        209.304,\n        211.242,\n        213.18,\n        215.118,\n        217.056,\n        218.994,\n        220.932,\n        222.87,\n        224.808,\n        226.746,\n        228.684,\n        230.622,\n        232.56,\n        234.498,\n        236.436,\n        238.374,\n        240.312,\n        242.25,\n        244.188,\n        246.126,\n        248.064,\n        250.002,\n        251.94,\n        253.878,\n        255.816,\n        257.754,\n        259.692,\n        261.63,\n        263.568,\n        265.506,\n        267.444,\n        269.382,\n        271.32,\n        273.258,\n        275.196,\n        277.134,\n        279.072,\n        281.01,\n        282.948,\n        284.886,\n        286.824,\n        288.762,\n        290.7,\n        292.638,\n        294.576,\n        296.514,\n        298.452,\n        300.39,\n        302.328,\n        304.266,\n        306.204,\n        308.142,\n        310.08,\n        312.018,\n        313.956,\n  
      315.894,\n        317.832,\n        319.77,\n        321.708,\n        323.646,\n        325.584,\n        327.522,\n        329.46,\n        331.398,\n        333.336,\n        335.274,\n        337.212,\n        339.15,\n        341.088,\n        343.026,\n        344.964,\n        346.902,\n        348.84,\n        350.778,\n        352.716,\n        354.654,\n        356.592,\n        358.53,\n        360.468,\n        362.406,\n        364.344,\n        366.282,\n        368.22,\n        370.158,\n        372.096,\n        374.034,\n        375.972,\n        377.91,\n        379.848,\n        381.786,\n        383.724,\n        385.662,\n        387.6,\n        389.538,\n        391.476,\n        393.414,\n        395.352,\n        397.29,\n        399.228,\n        401.166,\n        403.104,\n        405.042,\n        406.98,\n        408.918,\n        410.856,\n        412.794,\n        414.732,\n        416.67,\n        418.608,\n        420.546,\n        422.484,\n        424.422,\n        426.36,\n        428.298,\n        430.236,\n        432.174,\n        434.112,\n        436.05,\n        437.988,\n        439.926,\n        441.864,\n        443.802,\n        445.74,\n        447.678,\n        449.616,\n        451.554,\n        453.492,\n        455.43,\n        457.368,\n        459.306,\n        461.244,\n        463.182,\n        465.12,\n        467.058,\n        468.996,\n        470.934,\n        472.872,\n        474.81,\n        476.748,\n        478.686,\n        480.624,\n        482.562,\n        484.5,\n        486.438,\n        488.376,\n        490.314,\n        492.252,\n        494.19,\n        496.128,\n        498.066,\n        500.004,\n        501.942,\n        503.88,\n        505.818,\n        507.756,\n        509.694,\n        511.632,\n        513.57,\n        515.508,\n        517.446,\n        519.384,\n        521.322,\n        523.26,\n        525.198,\n        527.136,\n        529.074,\n        531.012,\n        
532.95,\n        534.888,\n        536.826,\n        538.764,\n        540.702,\n        542.64,\n        544.578,\n        546.516,\n        548.454,\n        550.392,\n        552.33,\n        554.268,\n        556.206,\n        558.144,\n        560.082,\n        562.02,\n        563.958,\n        565.896,\n        567.834,\n        569.772,\n        571.71,\n        573.648,\n        575.586,\n        577.524,\n        579.462,\n        581.4,\n        583.338,\n        585.276,\n        587.214,\n        589.152,\n        591.09,\n        593.028,\n        594.966,\n        596.904,\n        598.842,\n        600.78,\n        602.718,\n        604.656,\n        606.594,\n        608.532,\n        610.47,\n        612.408,\n        614.346,\n        616.284,\n        618.222,\n        620.16,\n        622.098,\n        624.036,\n        625.974,\n        627.912,\n        629.85,\n        631.788,\n        633.726,\n        635.664,\n        637.602,\n        639.54,\n        641.478,\n        643.416,\n        645.354,\n        647.292,\n        649.23,\n        651.168,\n        653.106,\n        655.044,\n        656.982,\n        658.92,\n        660.858,\n        662.796,\n        664.734,\n        666.672,\n        668.61,\n        670.548,\n        672.486,\n        674.424,\n        676.362,\n        678.3,\n        680.238,\n        682.176,\n        684.114,\n        686.052,\n        687.99,\n        689.928,\n        691.866,\n        693.804,\n        695.742,\n        697.68,\n        699.618,\n        701.556,\n        703.494,\n        705.432,\n        707.37,\n        709.308,\n        711.246,\n        713.184,\n        715.122,\n        717.06,\n        718.998,\n        720.936,\n        722.874,\n        724.812,\n        726.75,\n        728.688,\n        730.626,\n        732.564,\n        734.502,\n        736.44,\n        738.378,\n        740.316,\n        742.254,\n        744.192,\n        746.13,\n        748.068,\n        
750.006,\n        751.944,\n        753.882,\n        755.82,\n        757.758,\n        759.696,\n        761.634,\n        763.572,\n        765.51,\n        767.448,\n        769.386,\n        771.324,\n        773.262,\n        775.2,\n        777.138,\n        779.076,\n        781.014,\n        782.952,\n        784.89,\n        786.828,\n        788.766,\n        790.704,\n        792.642,\n        794.58,\n        796.518,\n        798.456,\n        800.394,\n        802.332,\n        804.27,\n        806.208,\n        808.146,\n        810.084,\n        812.022,\n        813.96,\n        815.898,\n        817.836,\n        819.774,\n        821.712,\n        823.65,\n        825.588,\n        827.526,\n        829.464,\n        831.402,\n        833.34,\n        835.278,\n        837.216,\n        839.154,\n        841.092,\n        843.03,\n        844.968,\n        846.906,\n        848.844,\n        850.782,\n        852.72,\n        854.658,\n        856.596,\n        858.534,\n        860.472,\n        862.41,\n        864.348,\n        866.286,\n        868.224,\n        870.162,\n        872.1,\n        874.038,\n        875.976,\n        877.914,\n        879.852,\n        881.79,\n        883.728,\n        885.666,\n        887.604,\n        889.542,\n        891.48,\n        893.418,\n        895.356,\n        897.294,\n        899.232,\n        901.17,\n        903.108,\n        905.046,\n        906.984,\n        908.922,\n        910.86,\n        912.798,\n        914.736,\n        916.674,\n        918.612,\n        920.55,\n        922.488,\n        924.426,\n        926.364,\n        928.302,\n        930.24,\n        932.178,\n        934.116,\n        936.054,\n        937.992,\n        939.93,\n        941.868,\n        943.806,\n        945.744,\n        947.682,\n        949.62,\n        951.558,\n        953.496,\n        955.434,\n        957.372,\n        959.31,\n        961.248,\n        963.186,\n        965.124,\n        
967.062,\n        969.0\n      ],\n      \"env_step\": [\n        0,\n        98304,\n        393216,\n        491520,\n        688128,\n        786432,\n        983040,\n        1081344,\n        1179648,\n        1376256,\n        1572864,\n        1966080,\n        2457600,\n        2752512,\n        3244032,\n        3342336,\n        3735552,\n        4128768,\n        4227072,\n        4521984,\n        4620288,\n        5013504,\n        5406720,\n        5701632,\n        5799936,\n        5996544,\n        6193152,\n        6782976,\n        6979584,\n        7077888,\n        7274496,\n        7569408,\n        7667712,\n        7766016,\n        7962624,\n        8060928,\n        8159232,\n        8257536,\n        8355840,\n        8454144,\n        8552448,\n        8650752,\n        8847360,\n        8945664,\n        9043968,\n        9142272,\n        9240576,\n        9338880,\n        9437184,\n        9535488,\n        10027008,\n        10518528,\n        10813440,\n        10911744,\n        11304960,\n        11599872,\n        11698176,\n        11894784,\n        12189696,\n        12288000,\n        12582912,\n        13074432,\n        13369344,\n        13467648,\n        14057472,\n        14155776,\n        14254080,\n        14450688,\n        14647296,\n        14745600,\n        14843904,\n        15138816,\n        15335424,\n        15433728,\n        15532032,\n        15925248,\n        16023552,\n        16220160,\n        16318464,\n        16416768,\n        16711680,\n        16809984,\n        16908288,\n        17006592,\n        17203200,\n        17301504,\n        17399808,\n        17694720,\n        18087936,\n        18284544,\n        18481152,\n        18579456,\n        18677760,\n        18874368,\n        19169280,\n        19267584,\n        19464192,\n        19759104,\n        19857408,\n        19955712,\n        20348928,\n        20447232,\n        20840448,\n        20938752,\n        21135360,\n        
21233664,\n        21430272,\n        21528576,\n        21823488,\n        21921792,\n        22020096,\n        22413312,\n        22708224,\n        23592960,\n        23691264,\n        23789568,\n        23887872,\n        23986176,\n        24182784,\n        24674304,\n        24772608,\n        24870912,\n        24969216,\n        25165824,\n        25460736,\n        25559040,\n        25657344,\n        25755648,\n        26050560,\n        26247168,\n        26443776,\n        26738688,\n        26836992,\n        26935296,\n        27033600,\n        27230208,\n        27623424,\n        28016640,\n        28114944,\n        28409856,\n        28508160,\n        28606464,\n        28803072,\n        28901376,\n        29097984,\n        29392896,\n        29491200,\n        29786112,\n        29982720,\n        30277632,\n        30375936,\n        30572544,\n        30670848,\n        30867456,\n        31162368,\n        31457280,\n        31555584,\n        31653888,\n        31850496,\n        31948800,\n        32636928,\n        32931840,\n        33030144,\n        33128448,\n        33325056,\n        33423360,\n        33521664,\n        33619968,\n        33914880,\n        34209792,\n        34504704,\n        34897920,\n        34996224,\n        35192832,\n        35389440,\n        35487744,\n        35586048,\n        35782656,\n        35979264,\n        36077568,\n        36372480,\n        36864000,\n        36962304,\n        37060608,\n        37257216,\n        37355520,\n        37453824,\n        37552128,\n        37650432,\n        37847040,\n        38043648,\n        38141952,\n        38338560,\n        38633472,\n        38731776,\n        38830080,\n        38928384,\n        39026688,\n        39124992,\n        39223296,\n        39419904,\n        39518208,\n        39714816,\n        39813120,\n        39911424,\n        40108032,\n        40206336,\n        40501248,\n        40599552,\n        40697856,\n        
40796160,\n        41091072,\n        41189376,\n        41287680,\n        41385984,\n        41484288,\n        41582592,\n        41680896,\n        41877504,\n        42074112,\n        42467328,\n        42565632,\n        42663936,\n        42860544,\n        42958848,\n        43155456,\n        43253760,\n        43450368,\n        43548672,\n        43646976,\n        43745280,\n        43843584,\n        44138496,\n        44236800,\n        44335104,\n        44531712,\n        45023232,\n        45121536,\n        45219840,\n        45318144,\n        45514752,\n        45613056,\n        46006272,\n        46497792,\n        46596096,\n        46989312,\n        47087616,\n        47185920,\n        47284224,\n        47480832,\n        47775744,\n        47972352,\n        48758784,\n        48857088,\n        49152000,\n        49545216,\n        49840128,\n        50036736,\n        50135040,\n        50233344,\n        50626560,\n        50724864,\n        51019776,\n        51609600,\n        51904512,\n        52101120,\n        52199424,\n        52396032,\n        52494336,\n        52690944,\n        52887552,\n        52985856,\n        53182464,\n        53379072,\n        53870592,\n        54165504,\n        54263808,\n        54362112,\n        54460416,\n        54558720,\n        55050240,\n        55148544,\n        55246848,\n        55345152,\n        55443456,\n        55541760,\n        56033280,\n        56131584,\n        56426496,\n        56918016,\n        57016320,\n        57212928,\n        57409536,\n        57606144,\n        57704448,\n        57901056,\n        57999360,\n        58097664,\n        58195968,\n        58294272,\n        58490880,\n        58687488,\n        58785792,\n        58884096,\n        59080704,\n        59375616,\n        59473920,\n        59670528,\n        59867136,\n        59965440,\n        60063744,\n        60162048,\n        60260352,\n        60456960,\n        60555264,\n        
60751872,\n        61046784,\n        61145088,\n        61341696,\n        61636608,\n        61931520,\n        62029824,\n        62128128,\n        62619648,\n        62717952,\n        62816256,\n        62914560,\n        63012864,\n        63111168,\n        63209472,\n        63307776,\n        63406080,\n        63504384,\n        63700992,\n        63799296,\n        64094208,\n        64192512,\n        64487424,\n        64585728,\n        64782336,\n        64880640,\n        64978944,\n        65077248,\n        65372160,\n        65470464,\n        65667072,\n        65765376,\n        66158592,\n        66256896,\n        66453504,\n        66650112,\n        66748416,\n        66846720,\n        67239936,\n        67829760,\n        68026368,\n        68419584,\n        68911104,\n        69107712,\n        69402624,\n        69500928,\n        69697536,\n        69795840,\n        70090752,\n        70287360,\n        70385664,\n        70582272,\n        70778880,\n        71368704,\n        71467008,\n        71565312,\n        71761920,\n        71958528,\n        72056832,\n        72351744,\n        72843264,\n        73039872,\n        73138176,\n        73236480,\n        73334784,\n        73433088,\n        73629696,\n        73728000,\n        73826304,\n        74022912,\n        74121216,\n        74317824,\n        74711040,\n        74809344,\n        74907648,\n        75005952,\n        75300864,\n        75399168,\n        75497472,\n        75890688,\n        76087296,\n        76382208,\n        76972032,\n        77070336,\n        77168640,\n        77266944,\n        77365248,\n        77463552,\n        77955072,\n        78544896,\n        79036416,\n        79134720,\n        79233024,\n        79527936,\n        79724544,\n        79822848,\n        80117760,\n        80314368,\n        80412672,\n        81002496,\n        81100800,\n        81395712,\n        81592320,\n        81690624,\n        81788928,\n        
82280448,\n        82378752,\n        82477056,\n        82575360,\n        82968576,\n        83558400,\n        83755008,\n        84148224,\n        84344832,\n        84443136,\n        84836352,\n        85229568,\n        85327872,\n        85426176,\n        85524480,\n        86016000,\n        86409216,\n        86507520,\n        86605824,\n        86704128,\n        86802432,\n        86900736,\n        86999040,\n        87293952,\n        87588864,\n        87687168,\n        87785472,\n        87982080,\n        88080384,\n        88178688,\n        88276992,\n        88375296,\n        88473600,\n        88768512,\n        89456640,\n        89653248,\n        90046464,\n        90243072,\n        90439680,\n        90636288,\n        90734592,\n        90931200,\n        91127808,\n        91226112,\n        91422720,\n        91521024,\n        91914240,\n        92209152,\n        92307456,\n        92405760,\n        92504064,\n        92602368,\n        92897280,\n        92995584,\n        93192192,\n        93388800,\n        93487104,\n        93585408,\n        93683712,\n        93880320,\n        93978624,\n        94076928,\n        94273536,\n        94568448,\n        94961664,\n        95059968,\n        95453184,\n        95649792,\n        95748096,\n        96141312,\n        96239616,\n        96337920,\n        96632832,\n        96829440,\n        96927744,\n        97026048,\n        97124352,\n        97517568,\n        97615872,\n        97714176,\n        98009088\n      ],\n      \"return\": [\n        0.0,\n        -1.8211565176363214,\n        -5.450758711894353,\n        -5.3709365749359135,\n        -5.429918484687806,\n        -5.405227454503378,\n        -5.405241473515829,\n        -5.415232059160869,\n        -5.403986158370972,\n        -5.387823613484701,\n        -5.385273424784342,\n        -5.385051577885946,\n        -5.3242382780710855,\n        -5.249849046071371,\n        -5.189958149592082,\n        
-5.174289337793986,\n        -5.123459353446961,\n        -5.08548603216807,\n        -5.016271096865336,\n        -5.009552868207295,\n        -4.954833366076151,\n        -5.0061487213770555,\n        -5.083908321062723,\n        -5.117522551218669,\n        -5.198800565401712,\n        -5.252866994539897,\n        -5.271668980916341,\n        -5.396079812049866,\n        -5.497399250666301,\n        -5.5650003163019806,\n        -5.655306051572164,\n        -5.701566495895386,\n        -5.728630830446879,\n        -5.84670994758606,\n        -5.9707754500706995,\n        -6.021102941830953,\n        -6.134735550880432,\n        -6.214331653912862,\n        -6.284563399950663,\n        -6.213578990697861,\n        -6.246251831650734,\n        -6.054963727196058,\n        -5.333051507820685,\n        -5.382347539166609,\n        -5.251159350425005,\n        -5.389291501591603,\n        -5.257902181896691,\n        -5.346594299984474,\n        -5.132231690908472,\n        -5.119642965160311,\n        -4.369317753314972,\n        -2.551518141825994,\n        -2.0427605971507727,\n        -1.654090012193968,\n        -0.45424261227250096,\n        0.30654167612083255,\n        0.9049611513409764,\n        1.4666259219994149,\n        1.8304008860948187,\n        2.1126247743951776,\n        2.523588375778248,\n        4.109312493679268,\n        4.620685461089015,\n        5.1200420312893895,\n        6.284545958700279,\n        6.4868240279083444,\n        6.8971409591659905,\n        7.536932006279627,\n        8.15484679798285,\n        8.331846399108569,\n        8.431811542312305,\n        8.856677153110503,\n        9.335615858236949,\n        9.78888226588567,\n        10.49026326974233,\n        10.818796298503875,\n        11.156456514994304,\n        12.019008894761404,\n        12.639701535701752,\n        13.121436796983083,\n        13.421463690598806,\n        13.684878850777944,\n        14.299875932534535,\n        14.655534369150798,\n        
15.455265655517579,\n        15.615330264568328,\n        15.952052096525827,\n        16.416555580298105,\n        17.00739818652471,\n        17.326351670424142,\n        17.74453566392263,\n        17.81917943954468,\n        18.01390697956085,\n        18.690608501434326,\n        19.088207910855612,\n        19.423237595558167,\n        19.538911277453106,\n        19.90228245576223,\n        20.2472456741333,\n        21.422647728919983,\n        22.148719353675844,\n        22.370413347880046,\n        22.562456116676334,\n        22.841183287302652,\n        23.412538000742597,\n        23.730912761688234,\n        23.912889324824018,\n        24.026660318374635,\n        24.433674205144246,\n        24.443872968355816,\n        24.66891985098521,\n        24.82599383831024,\n        25.119243364334107,\n        25.31676951805751,\n        26.091290965080262,\n        26.447999830245973,\n        26.64739019076029,\n        26.733827587763468,\n        27.08850084940592,\n        27.234617513020833,\n        27.478463020324707,\n        27.659565181732177,\n        27.78176722844442,\n        28.15255584081014,\n        28.325472304026288,\n        28.51744202931722,\n        28.56866101582845,\n        28.694656810760502,\n        29.01594970703125,\n        29.116026492913566,\n        29.186051642894743,\n        29.267378375530242,\n        29.347289582093556,\n        29.509137900670368,\n        29.575214611689248,\n        29.494310462474825,\n        29.566665900548298,\n        29.60818713188171,\n        29.639113759994505,\n        29.53559107542038,\n        29.904576396942144,\n        30.098937937418622,\n        30.19250619252523,\n        30.24227956771851,\n        30.431443729400637,\n        30.525310249328612,\n        30.656077963511148,\n        30.735889282226562,\n        30.977694892883303,\n        30.981900826295217,\n        31.003630145390826,\n        31.060925975640615,\n        31.31851980845133,\n        
31.361310170491535,\n        31.415920890967055,\n        31.278630374272666,\n        31.255216824213665,\n        31.50279481649399,\n        31.523300778865813,\n        31.72419583002726,\n        31.805352369944256,\n        31.853926067352294,\n        31.89563065846761,\n        31.944775962829592,\n        31.964195365905763,\n        32.005060046513876,\n        32.09825029055278,\n        32.20195636113485,\n        32.22373517990113,\n        32.28627243677775,\n        32.30005905151367,\n        32.32767866770426,\n        32.29544750054677,\n        32.26161565621694,\n        32.263696993192035,\n        32.373338920275366,\n        32.55905792236328,\n        32.5971901512146,\n        32.6483419418335,\n        32.577331740061446,\n        32.65967615763346,\n        32.50604391256968,\n        32.524837381045025,\n        32.453570529619846,\n        32.473360339800514,\n        32.50699528217316,\n        32.506180488268534,\n        32.38849087794622,\n        32.69574457168579,\n        32.88326619466146,\n        32.86996450424194,\n        32.8433007367452,\n        32.78393250147502,\n        32.71742320140203,\n        32.82113723754883,\n        32.95492044448852,\n        33.014233360290525,\n        32.95029862721761,\n        32.85574058214823,\n        32.63808204094568,\n        32.99278465588887,\n        33.014845549265544,\n        32.9728839302063,\n        32.93397596995035,\n        32.93905897140503,\n        32.82802639643351,\n        33.13534347216288,\n        33.17514415105184,\n        33.158193543752034,\n        33.1641508102417,\n        33.12397045771281,\n        33.163712355295814,\n        33.11274451891581,\n        33.1312927087148,\n        33.143696921666454,\n        33.16906797726949,\n        33.178442191282905,\n        32.782245856126146,\n        32.80315121809642,\n        32.75034557580948,\n        32.67346092542013,\n        32.794016884167995,\n        32.83101738452911,\n        
32.991992235183716,\n        33.21010211626689,\n        33.21421767870585,\n        33.16107631365458,\n        33.34934869766235,\n        33.3692709350586,\n        33.232156912485756,\n        33.22941769282023,\n        33.397237396240236,\n        33.35016419092814,\n        33.349866479237875,\n        33.06404978672663,\n        33.11358634869257,\n        33.121176311969755,\n        33.3388022740682,\n        33.27921138127645,\n        33.40164766311646,\n        33.26413485129674,\n        33.38225418726603,\n        33.25256678104401,\n        33.34821179548899,\n        33.353779506683345,\n        33.49222758611044,\n        33.44428084691366,\n        33.51035448074341,\n        33.359815897146866,\n        33.3635134824117,\n        33.3504612382253,\n        33.269343312581384,\n        33.32479614496231,\n        33.34401448726654,\n        33.41709674676259,\n        33.360723201433814,\n        33.26504530906678,\n        33.209413143793746,\n        33.18987674395243,\n        33.44505291938781,\n        33.48292690277099,\n        33.544238618214926,\n        33.54053416570028,\n        33.54602517445882,\n        33.44048034191132,\n        33.470887223879494,\n        33.45609283606212,\n        33.60205269495646,\n        33.60698293685913,\n        33.64999708175659,\n        33.42452413479487,\n        33.21609922568003,\n        33.38258064905802,\n        33.48653169314067,\n        33.47724048535029,\n        33.42847904761632,\n        33.446080621083574,\n        33.46206173578898,\n        33.45817431767781,\n        33.49116609573364,\n        33.59252418518067,\n        33.59685064315796,\n        33.64596748987834,\n        33.60812362670899,\n        33.535508891741436,\n        33.64438659667969,\n        33.54005140622457,\n        33.40192108074824,\n        33.46623891750971,\n        33.573429889678955,\n        33.59540475845337,\n        33.57583349863688,\n        33.54388561884562,\n        33.60371445973714,\n        
33.51048959970475,\n        33.26232818206151,\n        33.357575463453934,\n        33.67110483805339,\n        33.843606103261315,\n        33.799716498057045,\n        33.775785026550295,\n        33.812021497090655,\n        33.82620253245036,\n        33.807082901000975,\n        33.81073764165242,\n        33.63764720280965,\n        33.485613335768385,\n        33.65504789034525,\n        33.64657088915507,\n        33.74490778446198,\n        33.688238808314004,\n        33.65053201357524,\n        33.63879053115844,\n        33.75179484367371,\n        33.647144407431284,\n        33.66565582354864,\n        33.70479655663172,\n        33.741632859706876,\n        33.691917811234795,\n        33.69865835030873,\n        33.71498268286387,\n        33.65212141513825,\n        33.829328810373944,\n        33.85409680366516,\n        33.76617241859436,\n        33.86793506940206,\n        33.95968713760376,\n        33.957069365183514,\n        33.96092315673828,\n        33.9038141822815,\n        33.78778015931447,\n        33.802784686088565,\n        33.772612570126846,\n        33.9072968451182,\n        33.70608182668686,\n        33.72268221696218,\n        33.55177603403727,\n        33.914818897247315,\n        33.96780169169108,\n        33.953882675170895,\n        33.93014724095662,\n        33.98231894810994,\n        34.00627389907837,\n        33.90533470153809,\n        33.743881049950915,\n        33.74382516463597,\n        33.882653643290205,\n        33.64851376930873,\n        33.627794663111366,\n        33.58648926973343,\n        33.86447109540304,\n        34.00397662480672,\n        33.97151560465495,\n        33.960879128774,\n        34.00438674926758,\n        33.868241135279334,\n        34.008157377243045,\n        33.93994318962097,\n        33.86863050699234,\n        33.942175194422404,\n        33.71918567577998,\n        33.833626924355826,\n        33.925179407596595,\n        33.953650324344636,\n        
33.954390884240475,\n        33.76647464593251,\n        33.7064058415095,\n        33.85868651946385,\n        33.66485804319382,\n        33.75230008284251,\n        33.7514514986674,\n        33.88039131800334,\n        33.99921775182088,\n        33.79016396840413,\n        33.70870203018188,\n        33.6108287191391,\n        34.0640186325709,\n        34.005900454521175,\n        34.13911101659139,\n        34.08947592894236,\n        34.20443028767904,\n        34.22977528889974,\n        33.92905995845794,\n        33.909010221163435,\n        33.91740820089976,\n        33.95200910568237,\n        34.14121877988179,\n        34.023240098953245,\n        33.9469082959493,\n        34.020559568405154,\n        34.05597455660502,\n        34.06671770413717,\n        34.19973275502523,\n        33.96253667593002,\n        33.83099495649338,\n        33.4363557044665,\n        33.69214943011602,\n        34.0144168249766,\n        34.00501637458802,\n        34.17815904617309,\n        34.2427860959371,\n        34.245318311055506,\n        34.188450533548995,\n        34.22512925465902,\n        34.15011143684387,\n        34.2105561765035,\n        34.13866239150366,\n        34.12881470441818,\n        34.110780564149216,\n        34.22185877482096,\n        34.2055456161499,\n        34.165921246210736,\n        34.14928177833557,\n        33.96859579165777,\n        33.83083196242651,\n        33.84601767460506,\n        33.855398262341815,\n        33.90395102659861,\n        33.87031110127767,\n        33.7126193745931,\n        33.87908462524414,\n        33.95480376561483,\n        34.25536927223205,\n        34.22514481544495,\n        34.27022196451823,\n        34.18469769001007,\n        34.00267321586609,\n        34.32991437276204,\n        34.30865136464437,\n        34.28107172648112,\n        34.19567479451497,\n        34.210505580902094,\n        34.00116219679514,\n        34.07368288199107,\n        34.046332208315526,\n        
33.97249688625336,\n        34.081047983169555,\n        34.074682232538855,\n        34.0766791232427,\n        33.994779232343035,\n        33.904674633344015,\n        33.83213842709859,\n        34.186360073089595,\n        34.22037427902222,\n        34.04206380049387,\n        33.92758392572403,\n        33.86605194489161,\n        34.140858041445405,\n        33.71333369255066,\n        34.19078587214151,\n        34.12247137387593,\n        33.897727811336516,\n        34.093153146107994,\n        34.081556427478795,\n        34.225092843373616,\n        34.18845344543457,\n        34.03204092979431,\n        34.113009688059485,\n        33.65711380243301,\n        33.61201909224192,\n        33.91202318350474,\n        34.027429844538375,\n        34.03296609560649,\n        33.95445609887441,\n        33.90101886590322,\n        34.025549514293665,\n        33.72658471345901,\n        34.018566299279534,\n        33.888727053006484,\n        33.98073536554972,\n        33.990113363266,\n        34.07285011450449,\n        34.21228943506876,\n        34.15036308288574,\n        34.02376880645752,\n        33.992461017767596,\n        34.11304693222046,\n        34.16727955500285,\n        34.13363500595093,\n        33.661870261828106,\n        33.74703230381012,\n        34.067941786448166,\n        34.21645594914754,\n        34.21975947697957,\n        34.18952447255453,\n        34.21511004765828,\n        34.25453131357828,\n        34.22559996922811,\n        34.182696437835695,\n        34.18072349866231,\n        34.169769096374516,\n        34.24306580225627,\n        34.274355023701986,\n        34.28168207804362,\n        34.30049002329508,\n        34.11715464274088,\n        34.11595391432444,\n        34.07781842867533,\n        33.51906168699264,\n        33.571848865350084,\n        33.712630089918775\n      ],\n      \"return_min\": [\n        0.0,\n        -4.209529659930448,\n        -5.628355577413527,\n        -5.454971782483552,\n     
   -5.463996139367627,\n        -5.432741154225297,\n        -5.426888002217753,\n        -5.436807539475557,\n        -5.425299966313791,\n        -5.412182911469524,\n        -5.4104728792417385,\n        -5.42808231221047,\n        -5.375521999410365,\n        -5.3230288484886135,\n        -5.2797338855391915,\n        -5.240507263917413,\n        -5.19926127769744,\n        -5.179219142521742,\n        -5.07908284410958,\n        -5.032903823359099,\n        -4.969668282106384,\n        -5.094362818841234,\n        -5.281996883458076,\n        -5.3920096236993045,\n        -5.553269647597237,\n        -5.729036621436148,\n        -5.778985218016288,\n        -6.0301838321880625,\n        -6.217390434009163,\n        -6.3835459993152615,\n        -6.435836116763705,\n        -6.501016284407782,\n        -6.518235181677481,\n        -6.641257321926678,\n        -6.804782635415524,\n        -6.825958611170952,\n        -6.900034629882504,\n        -7.004061243688619,\n        -7.070126037618526,\n        -6.943216421817451,\n        -6.772073465644133,\n        -6.451086699481175,\n        -6.637737980770315,\n        -7.133531965887451,\n        -7.235613664375382,\n        -7.620630142315411,\n        -7.732041885493856,\n        -7.93043079664659,\n        -8.039439052465882,\n        -8.052169475312821,\n        -8.145489222359249,\n        -6.533113114668902,\n        -6.194064854392917,\n        -5.821998196115499,\n        -4.269998114915388,\n        -3.060062102494689,\n        -2.1766498666046306,\n        -1.522520611941957,\n        -1.0960023869981923,\n        -0.8067221361568175,\n        -0.34875043312814347,\n        -0.24051924615993325,\n        -0.030125249299234547,\n        0.6147006250639313,\n        1.6705156621163555,\n        1.8978773573042886,\n        2.492487197096148,\n        3.2078728538939396,\n        3.9325072151810927,\n        4.1495417987583405,\n        4.464200048218766,\n        4.695257760930558,\n        
5.106385058253912,\n        5.80625788052178,\n        6.747859303388516,\n        7.033520115548896,\n        7.61384687106856,\n        9.121095072409364,\n        10.247541656339465,\n        11.164436568949185,\n        11.350218779574417,\n        11.715560912007636,\n        12.267298712031716,\n        12.49084894245549,\n        12.617885254881479,\n        12.860896229018945,\n        13.149930335939148,\n        13.545043920619051,\n        14.029415433170419,\n        14.37584588528084,\n        14.804220948047497,\n        14.792772486155016,\n        14.983113341048067,\n        15.783149829958043,\n        15.928902106852796,\n        16.3501276896744,\n        16.427507281469122,\n        16.841994678140587,\n        17.335166265053587,\n        19.849475169109517,\n        20.568988023923914,\n        20.92521922217223,\n        21.098380853856582,\n        21.237686399862632,\n        21.481328160428717,\n        21.861707905672635,\n        21.946872710964136,\n        22.15615356250457,\n        22.480182586951475,\n        22.439113334324347,\n        22.783054452309194,\n        22.71390440698288,\n        23.02964898305132,\n        23.057338734037277,\n        24.1364739924029,\n        24.55653930143314,\n        24.790787121035002,\n        24.863631804668003,\n        25.544529906995944,\n        25.892309377370864,\n        26.34160171474906,\n        26.628026989254643,\n        26.781689166899664,\n        26.97167769178579,\n        27.025717843397775,\n        27.279217143675528,\n        27.345248394639587,\n        27.534701973589357,\n        28.028468767921982,\n        28.12209195976707,\n        28.158240120581077,\n        28.17333204523116,\n        28.265092400875353,\n        28.47092830171811,\n        28.51324744549357,\n        28.60718218296158,\n        28.404079499641472,\n        28.433513656812547,\n        28.48298083773327,\n        28.20777281229254,\n        28.83806026401171,\n        29.095672022051513,\n       
 29.176605133554403,\n        29.227176012444364,\n        29.51045156047436,\n        29.55934002827877,\n        29.725652398442254,\n        29.857791131590933,\n        30.250474949875066,\n        30.097892827836176,\n        30.1205720272998,\n        30.222571934983538,\n        30.593196755308323,\n        30.67311152561452,\n        30.657967491293633,\n        30.58703137205276,\n        30.578546793645796,\n        30.716421153479097,\n        30.756165798926283,\n        31.130321696185632,\n        31.180588597373752,\n        31.250815806957338,\n        31.274866022851175,\n        31.360237967449972,\n        31.344060491523454,\n        31.399367117675258,\n        31.514803511527273,\n        31.678545248104303,\n        31.724162265731543,\n        31.718766100096886,\n        31.820441167887104,\n        31.809748956778147,\n        31.667314786205985,\n        31.665149156142952,\n        31.62035110631372,\n        31.742718209367506,\n        32.022972224180734,\n        32.04248009692774,\n        32.197398524591215,\n        32.03430305853228,\n        32.237484974198075,\n        32.100151691025054,\n        32.09026116382551,\n        31.987826541323436,\n        31.984515127439128,\n        31.933867257918262,\n        31.919322252655554,\n        31.77659611423544,\n        32.206594745672824,\n        32.45533158655721,\n        32.39492503630701,\n        32.304099971179696,\n        32.25262729610579,\n        32.41908143160763,\n        32.30487923098815,\n        32.5582240553896,\n        32.60160459883723,\n        32.431401203565294,\n        32.369493417243255,\n        32.21161546726209,\n        32.510942233125974,\n        32.55640827577592,\n        32.48728961375227,\n        32.446934766045196,\n        32.43471770493697,\n        32.20878724298071,\n        32.76947491920594,\n        32.80921502629607,\n        32.79006888503141,\n        32.781752366577294,\n        32.731922917885335,\n        32.82623178897912,\n     
   32.6462664405936,\n        32.63425745048648,\n        32.632413252321314,\n        32.65034100784082,\n        32.57447847932977,\n        31.77427950571114,\n        31.909927808121,\n        31.618161890837925,\n        31.47261618272682,\n        31.717342494918874,\n        31.838050338276254,\n        32.41965951145684,\n        32.749675875368965,\n        32.79157376861183,\n        32.70116923350964,\n        32.8588465396553,\n        32.9674953250711,\n        32.811215486513525,\n        32.717401451010154,\n        32.95060249716797,\n        32.87874931891394,\n        32.84635712663109,\n        32.143342025778594,\n        32.289241920374245,\n        32.33099643505135,\n        32.76786138310686,\n        32.72909474247403,\n        32.93045251612466,\n        32.8030939787735,\n        32.87775578006752,\n        32.70376760973,\n        32.81084849537366,\n        32.780997041413244,\n        32.968767569051465,\n        32.93412799801414,\n        32.98463241141619,\n        32.69498579548257,\n        33.03098130574189,\n        32.6656808730222,\n        32.56637424042163,\n        32.65086773570051,\n        32.67490052230798,\n        32.73366487570359,\n        32.709750099740184,\n        32.365409665479326,\n        32.34194315152263,\n        32.23646837486713,\n        32.75996267344055,\n        32.870196870809686,\n        33.093166501517324,\n        33.07097673853406,\n        33.106031609170124,\n        33.02547366270539,\n        33.07608886836146,\n        33.078466212870545,\n        33.07309098432332,\n        33.05535620081123,\n        33.15630201158901,\n        32.80892911401169,\n        32.493294814600226,\n        32.718674498279675,\n        32.835207144874474,\n        32.84555438935561,\n        32.760458185436136,\n        32.642590885496105,\n        32.645517500011216,\n        32.787408521131695,\n        32.803128271257215,\n        32.96766573915857,\n        33.02182458196768,\n        33.06764542818848,\n  
      33.04412100149526,\n        33.021516566445484,\n        33.02732691408033,\n        32.98825196500852,\n        32.751573189623244,\n        32.92157085654797,\n        33.008391144829616,\n        33.080099670691325,\n        33.06197665637975,\n        33.04223494190235,\n        33.13707536866676,\n        32.989384983307176,\n        32.478441922327704,\n        32.809178853591355,\n        33.25661037163764,\n        33.39445979675766,\n        33.405295153552395,\n        33.35775763402796,\n        33.32191098609057,\n        33.35619586350611,\n        33.299655769009085,\n        33.30764757278204,\n        32.8644155605529,\n        32.88229689011347,\n        32.91915262551825,\n        33.17257336994728,\n        33.14484868109328,\n        33.18764081460977,\n        33.158712628442274,\n        33.1068222921623,\n        33.26054504412447,\n        33.32513415225772,\n        33.372564664651954,\n        33.42742018698553,\n        33.48416863808679,\n        33.436095550397106,\n        33.126944711982034,\n        33.46067212999888,\n        33.51442993553725,\n        33.540470684181045,\n        33.46254666887379,\n        33.20444097781898,\n        33.377362989405796,\n        33.60318391569346,\n        33.57344715637073,\n        33.586223732418226,\n        33.60310619733873,\n        33.291651688605775,\n        33.25409801387752,\n        33.214976891334466,\n        33.52219575845811,\n        33.15037514066368,\n        33.15326696850124,\n        32.87370710954431,\n        33.54568352000057,\n        33.61704161008727,\n        33.62752242483598,\n        33.56393179559753,\n        33.65882495209487,\n        33.660115425241685,\n        33.67286110422097,\n        33.30801149644509,\n        33.300742198021766,\n        33.60665529291178,\n        32.859103411917665,\n        32.914552223565174,\n        32.95125362838702,\n        33.39018207803091,\n        33.61013390260663,\n        33.612453490604345,\n        
33.48235920338519,\n        33.47339237810102,\n        33.191781660525834,\n        33.52863233851444,\n        33.620556089757095,\n        33.57614865826481,\n        33.58555271932166,\n        33.303261304921946,\n        33.508115073184634,\n        33.41952419674288,\n        33.43345984596744,\n        33.43200662672882,\n        33.14624532362674,\n        33.04859363863694,\n        33.29341880859891,\n        32.918686567270065,\n        33.17985651291829,\n        32.965548127877376,\n        33.10163278964282,\n        33.363579769434914,\n        32.83360904204991,\n        32.711381927941886,\n        32.70752628894768,\n        33.7298202959727,\n        33.63457119738028,\n        33.64793008978906,\n        33.56577284885361,\n        33.690202662460585,\n        33.76106324297055,\n        33.306808336901106,\n        33.32180686687506,\n        33.275599433915005,\n        33.349055974790524,\n        33.595424980213885,\n        33.42629543250327,\n        33.38586942035939,\n        33.510834643329005,\n        33.54593447676686,\n        33.538963472683065,\n        33.72586910080425,\n        33.22731798357096,\n        32.823112523161,\n        32.46374112984253,\n        32.526541578874195,\n        33.31524413846802,\n        33.29824960721062,\n        33.661277133487346,\n        33.743415806420096,\n        33.71825778666474,\n        33.64058746221431,\n        33.637353039854816,\n        33.47049646526259,\n        33.60852705127985,\n        33.570249232057954,\n        33.53589359925487,\n        33.51646917475235,\n        33.652598212485174,\n        33.65615853904905,\n        33.451178705396565,\n        33.434563643244886,\n        33.21332901949298,\n        32.95450014870425,\n        33.01047405583394,\n        32.76524800040569,\n        32.79024886950684,\n        32.856638338699675,\n        32.476051180352556,\n        32.92826454418564,\n        33.07934915126772,\n        33.50815317466772,\n        
33.61016219502737,\n        33.67094222164079,\n        33.5974662406585,\n        33.33226295166716,\n        33.73789944265289,\n        33.661757322902666,\n        33.6498226778196,\n        33.49991993987689,\n        33.58044768148721,\n        33.09412398739152,\n        33.27651517632219,\n        33.119493786965954,\n        33.142376454589446,\n        33.61579164819622,\n        33.60038353053637,\n        33.21038260677685,\n        33.03290612718018,\n        32.79867687333837,\n        32.8516254366356,\n        33.48160374972705,\n        33.61470766029705,\n        33.37146528384432,\n        33.20071362052474,\n        33.040557209674404,\n        33.48061965929551,\n        33.157542261300485,\n        33.62163890121192,\n        33.34363454006072,\n        33.19473801655624,\n        33.314548215535616,\n        33.28270404732501,\n        33.54525858592337,\n        33.50697891926656,\n        33.417634181268,\n        33.572607856603355,\n        32.58939569769712,\n        32.58494234591918,\n        32.86144232437622,\n        33.05700141705454,\n        33.08225605251803,\n        33.01093363603906,\n        32.812148900160935,\n        33.29221635477259,\n        32.796024803236065,\n        33.28556066972077,\n        32.91004207668372,\n        33.14638977414718,\n        33.104234306250206,\n        33.34592715103986,\n        33.50639581739729,\n        33.60880215573404,\n        33.35797515442404,\n        33.21328473787432,\n        33.313652344537516,\n        33.448086993150646,\n        33.41096578399726,\n        32.33001209193853,\n        32.52109783963744,\n        33.29527984774868,\n        33.58159214390531,\n        33.51072681904591,\n        33.56232171762343,\n        33.51557845368608,\n        33.56718987218156,\n        33.550278943889005,\n        33.443680915948306,\n        33.45137163124387,\n        33.59035406831639,\n        33.501152248875314,\n        33.61012691014657,\n        33.62100590719376,\n        
33.68242704066349,\n        33.20934171056824,\n        33.40098750245513,\n        33.59525360877404,\n        32.5350500516155,\n        32.648825969698215,\n        33.04150141687526\n      ],\n      \"return_max\": [\n        0.0,\n        0.5672166246578054,\n        -5.273161846375179,\n        -5.286901367388275,\n        -5.395840830007986,\n        -5.3777137547814595,\n        -5.383594944813906,\n        -5.393656578846181,\n        -5.3826723504281535,\n        -5.363464315499878,\n        -5.360073970326945,\n        -5.342020843561422,\n        -5.272954556731806,\n        -5.176669243654128,\n        -5.100182413644972,\n        -5.108071411670558,\n        -5.047657429196482,\n        -4.991752921814399,\n        -4.953459349621092,\n        -4.98620191305549,\n        -4.939998450045919,\n        -4.917934623912877,\n        -4.88581975866737,\n        -4.843035478738034,\n        -4.8443314832061874,\n        -4.776697367643646,\n        -4.764352743816394,\n        -4.761975791911669,\n        -4.777408067323439,\n        -4.7464546332887,\n        -4.874775986380623,\n        -4.90211670738299,\n        -4.9390264792162775,\n        -5.052162573245442,\n        -5.136768264725875,\n        -5.216247272490953,\n        -5.36943647187836,\n        -5.4246020641371056,\n        -5.4990007622828,\n        -5.483941559578271,\n        -5.720430197657335,\n        -5.6588407549109405,\n        -4.028365034871055,\n        -3.6311631124457673,\n        -3.2667050364746277,\n        -3.1579528608677947,\n        -2.7837624782995265,\n        -2.762757803322358,\n        -2.2250243293510614,\n        -2.1871164550077995,\n        -0.5931462842706945,\n        1.430076831016914,\n        2.108543660091372,\n        2.5138181717275634,\n        3.3615128903703857,\n        3.673145454736354,\n        3.986572169286583,\n        4.455772455940787,\n        4.75680415918783,\n        5.031971684947173,\n        5.3959271846846395,\n        
8.459144233518469,\n        9.271496171477263,\n        9.625383437514849,\n        10.898576255284201,\n        11.0757706985124,\n        11.301794721235833,\n        11.865991158665313,\n        12.377186380784607,\n        12.514150999458797,\n        12.399423036405844,\n        13.018096545290447,\n        13.564846658219984,\n        13.77150665124956,\n        14.232667236096145,\n        14.604072481458854,\n        14.699066158920047,\n        14.916922717113444,\n        15.03186141506404,\n        15.078437025016981,\n        15.492708601623194,\n        15.654196789548251,\n        16.332453153037356,\n        16.820219795846107,\n        18.29264605615368,\n        18.369764300117712,\n        18.754173857112505,\n        19.28806723997716,\n        19.985380939879004,\n        20.27685745556744,\n        20.684850379797762,\n        20.845586392934344,\n        21.044700618073634,\n        21.59806717291061,\n        22.247513714858428,\n        22.49634750144193,\n        22.65031527343709,\n        22.96257023338387,\n        23.159325083213012,\n        22.99582028873045,\n        23.728450683427774,\n        23.81560747358786,\n        24.026531379496085,\n        24.444680174742672,\n        25.343747841056476,\n        25.600117617703834,\n        25.8789059386839,\n        25.8971670742447,\n        26.387165823337018,\n        26.448632602387285,\n        26.554785249661222,\n        26.9380832696376,\n        27.208837745616893,\n        27.57620030207774,\n        28.046107937757622,\n        28.339460359058805,\n        28.50399326048558,\n        28.604023370858933,\n        28.6324717918159,\n        28.5769256486708,\n        28.615324325900353,\n        28.69110337420971,\n        28.781845289989175,\n        29.333433989834493,\n        29.6252267646548,\n        29.755666914958912,\n        29.792073637017314,\n        29.854611647931648,\n        30.00343064614052,\n        30.109961026060063,\n        30.21386316520841,\n        
30.361424705829325,\n        30.429486763311758,\n        30.547347499622624,\n        30.637181777884926,\n        30.38143874198807,\n        30.729252301455123,\n        30.782860606950873,\n        30.79524668225574,\n        30.863409338548223,\n        30.97109252987258,\n        31.10220385278573,\n        31.208407251496055,\n        31.257383122992653,\n        31.352435898326917,\n        31.491280470378456,\n        31.586503528580042,\n        31.613987432862192,\n        31.70491483589154,\n        31.865908824754257,\n        31.88668826348185,\n        31.89928001629769,\n        32.04384286159434,\n        32.049508815368554,\n        32.17387429064048,\n        31.970229376492572,\n        31.931886854781535,\n        32.28916847950888,\n        32.290435758805344,\n        32.318069963868886,\n        32.430116142514755,\n        32.45703632774725,\n        32.51639529408404,\n        32.52931395820921,\n        32.584330240288075,\n        32.6107529753525,\n        32.681697069578284,\n        32.725367474165395,\n        32.72330809407071,\n        32.853778773458615,\n        32.77967693514024,\n        32.84560837863038,\n        32.92358021488756,\n        32.858082156290926,\n        32.90704288007035,\n        33.00395963118322,\n        33.09514362054583,\n        33.151900205501455,\n        33.099285359075786,\n        33.12036042159061,\n        33.08186734106885,\n        32.911936134114306,\n        32.95941359826454,\n        32.919314517916256,\n        32.9622055521619,\n        33.08012330642805,\n        33.09303872388151,\n        33.000385641657,\n        33.18489439769876,\n        33.31120080276571,\n        33.345003972176876,\n        33.3825015023107,\n        33.31523770684425,\n        33.015764971196425,\n        33.337395244109516,\n        33.35161683358744,\n        33.426862121743824,\n        33.46919605086993,\n        33.3419877470532,\n        33.064548614629274,\n        33.474627078651764,\n        
33.473282822755166,\n        33.458478246660334,\n        33.42101717385551,\n        33.44340023787309,\n        33.4472655498863,\n        33.50121202511982,\n        33.541073275807605,\n        33.52631820247266,\n        33.54654925390611,\n        33.51601799754028,\n        33.50119292161251,\n        33.57922259723802,\n        33.628327966943125,\n        33.654980591011594,\n        33.68779494669816,\n        33.78240590323604,\n        33.790212206541156,\n        33.69637462807184,\n        33.88252926078104,\n        33.874305668113436,\n        33.870691273417115,\n        33.82398443078197,\n        33.56432495891059,\n        33.67052835716481,\n        33.63686158879987,\n        33.62098339379953,\n        33.8398508556694,\n        33.771046545046104,\n        33.65309833845799,\n        33.7414339346303,\n        33.8438722953125,\n        33.82157906294234,\n        33.85337583184466,\n        33.98475754767466,\n        33.9379307770109,\n        33.91135618888816,\n        33.90974316502954,\n        33.82932802007886,\n        33.87284281010825,\n        33.72517572381998,\n        33.88675259446454,\n        33.80136595235802,\n        33.885575095604324,\n        33.92656197195345,\n        34.01568760316941,\n        33.95443369581317,\n        34.036076550070625,\n        34.02464599881116,\n        33.696045659081506,\n        34.0352416034284,\n        33.97231238474114,\n        33.99872455422412,\n        34.013128452225104,\n        34.10052861782159,\n        34.011696303127444,\n        34.16468095265423,\n        34.076883136064865,\n        34.143285113037734,\n        34.13014316533508,\n        34.0956569347323,\n        33.99531073491253,\n        34.0100915928665,\n        33.986018739747514,\n        33.85548702111725,\n        33.86568557939753,\n        33.83371945925369,\n        34.1310144055896,\n        34.15860967290703,\n        34.143692151924164,\n        34.040119155578054,\n        33.93890363675984,\n        
34.046486799836366,\n        34.137856241406865,\n        34.108926581344974,\n        34.096499909796506,\n        34.24957035667104,\n        34.27860597156674,\n        34.12894011422392,\n        34.179203920210064,\n        34.217382631202774,\n        34.171876704348236,\n        34.2242895515682,\n        34.17212625192271,\n        34.04950121703739,\n        34.26144627927905,\n        34.09185084744062,\n        34.05226897187323,\n        34.01090697847145,\n        34.138468634528294,\n        34.110709846215414,\n        34.089690340894016,\n        34.04553629578889,\n        34.070353550807525,\n        34.03159421610232,\n        34.046214441795314,\n        33.90597207331651,\n        34.085599304469135,\n        34.29275240976497,\n        34.194137842561695,\n        34.19381241907263,\n        34.30213200809074,\n        34.29620920139461,\n        34.314510032992864,\n        34.31382771052281,\n        34.41087884506641,\n        34.088929781423296,\n        34.39094315517225,\n        34.120568408362864,\n        34.34496688783068,\n        34.18883680201824,\n        34.1423513987082,\n        34.170758770154585,\n        34.243044643222945,\n        33.96915466260485,\n        33.95874698244533,\n        33.98217292627791,\n        33.99909708132696,\n        33.947740072072484,\n        34.27037198863543,\n        33.969293235728856,\n        33.78981289473924,\n        34.11818693656684,\n        34.245646938456524,\n        34.327903859369734,\n        34.35850714939832,\n        34.31619035951406,\n        34.340691573996295,\n        34.33562258105834,\n        34.20452216722427,\n        34.28390863002317,\n        34.35147135829961,\n        34.330248248919226,\n        34.2923979317783,\n        34.26178851271005,\n        34.29209746542311,\n        34.22984495853022,\n        34.28395427449406,\n        34.31856177329489,\n        34.28024292550581,\n        34.29636268631571,\n        34.305812944125016,\n        
34.35243237291506,\n        34.137808298855205,\n        34.17975060345674,\n        34.18690813125018,\n        34.15865199366863,\n        34.43792412669979,\n        34.34103710265756,\n        34.22172491107983,\n        34.33876011277517,\n        34.39781934700681,\n        34.33057771870555,\n        34.43939905416282,\n        34.535381120434145,\n        34.544700610032834,\n        34.48768241597165,\n        34.25933028948484,\n        34.161112355719865,\n        34.29879766952315,\n        34.13511004663802,\n        34.15913877552702,\n        34.43083461845031,\n        34.47384080272183,\n        34.47677514175213,\n        34.38670396823829,\n        34.36421804438206,\n        34.423954230328796,\n        34.41102951911757,\n        34.32474365276673,\n        34.537354869457424,\n        34.65914984636386,\n        34.634855734206845,\n        34.74671889475835,\n        34.70602213242188,\n        34.51413114933052,\n        34.3982169691691,\n        34.37722971166207,\n        34.63029194339373,\n        34.613179009031114,\n        34.71865791289749,\n        34.698487334828926,\n        34.551311580014776,\n        34.49621357545181,\n        34.55921696788451,\n        34.55496223657422,\n        34.687012579549695,\n        34.62018476540322,\n        34.50794717153921,\n        34.530284493481304,\n        34.566014636443185,\n        34.59447193559127,\n        34.67359640924621,\n        34.69775536828907,\n        34.838877389825754,\n        34.40897027909047,\n        34.85775728135784,\n        34.71358951148518,\n        34.71178314196541,\n        34.69504095885884,\n        34.7421563854541,\n        34.77237883544627,\n        34.73631360488368,\n        34.81290546946322,\n        34.82972640842515,\n        34.812585301727154,\n        34.70707555094936,\n        34.721735809581496,\n        34.70509195354608,\n        34.79111933715674,\n        34.754932693250744,\n        34.88066378702491,\n        34.863999913426255,\n    
    34.72386256382256,\n        34.70716377614877,\n        34.68156129337618,\n        34.94554852427794,\n        35.017653183690385,\n        34.88398386385567,\n        34.94918756883365,\n        34.82990470630264,\n        34.83025837996193,\n        35.002585369796385,\n        34.84012743586253,\n        34.86950170739566,\n        34.77192913936164,\n        34.67308348006502,\n        34.92192930287119,\n        34.95554540638607,\n        34.912320775142646,\n        34.891429649153054,\n        34.84056348031698,\n        34.908200406198766,\n        34.870850587659945,\n        34.9731706296651,\n        34.802617317917274,\n        34.54630431814289,\n        34.54898093454134,\n        34.94297563970855,\n        34.956652337505886,\n        35.01067239334966,\n        34.81265141756158,\n        34.89111639645214,\n        34.82604089774738,\n        34.71266231714342,\n        34.65445423092332,\n        34.691546680108814,\n        34.8010964235953,\n        34.26912512380084,\n        34.759932843071105,\n        34.901308207691145,\n        34.600717606116795,\n        34.87175807668037,\n        34.88040880763258,\n        34.90492710082386,\n        34.86992797160258,\n        34.64644767832063,\n        34.653411519515615,\n        34.7248319071689,\n        34.639095838564664,\n        34.96260404263327,\n        34.99785827202221,\n        34.98367613869495,\n        34.897978561709756,\n        34.9898888316455,\n        34.75888267381474,\n        34.65714462368196,\n        34.751571928838295,\n        34.867412029329245,\n        34.81508095695226,\n        34.87599242028179,\n        34.79977307796912,\n        34.91818305274023,\n        34.69192401003744,\n        34.689562458491,\n        34.771637297660874,\n        34.91244151990341,\n        34.88647211685505,\n        34.856304227904594,\n        34.993728431717685,\n        34.972966767982804,\n        34.84060372514765,\n        34.85131975438976,\n        34.92879213491323,\n 
       34.81672722748563,\n        34.914641641630475,\n        34.94187275497501,\n        34.90092099456721,\n        34.92171195972308,\n        34.91007536608075,\n        34.749184124432645,\n        34.98497935563722,\n        34.9385831372574,\n        34.942358248893484,\n        34.918553005926675,\n        35.02496757491352,\n        34.830920326193755,\n        34.56038324857662,\n        34.50307332236979,\n        34.49487176100195,\n        34.38375876296229\n      ]\n    }\n  },\n  \"Isaac-Velocity-Rough-H1-v0\": {\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        364.7,\n        729.4,\n        1094.1,\n        1458.8,\n        1823.5,\n        2188.2,\n        2552.9,\n        2917.6,\n        3282.3,\n        3647.0\n      ],\n      \"env_step\": [\n        0,\n        20480000.0,\n        40960000.0,\n        61440000.0,\n        81920000.0,\n        102400000.0,\n        122880000.0,\n        143360000.0,\n        163840000.0,\n        184320000.0,\n        204800000.0\n      ],\n      \"return\": [\n        0.0,\n        0.8428554534912109,\n        10.589672724405924,\n        18.404813448588055,\n        20.45177714029948,\n        22.65296808878581,\n        23.864246368408203,\n        24.028223673502605,\n        23.937875747680664,\n        24.01940409342448,\n        25.180285771687824\n      ],\n      \"return_min\": [\n        0.0,\n        -2.6961081957202064,\n        6.380003830010861,\n        15.739199224866137,\n        19.15368035970884,\n        22.281677949442358,\n        23.391415568764668,\n        23.599606396166806,\n        23.64279778449942,\n        23.07866341290395,\n        24.741632156502256\n      ],\n      \"return_max\": [\n        0.0,\n        4.381819102702629,\n        14.799341618800987,\n        21.07042767230997,\n        21.74987392089012,\n        23.02425822812926,\n        24.33707716805174,\n        24.456840950838405,\n        24.232953710861906,\n        24.96014477394501,\n        
25.618939386873393\n      ]\n    },\n    \"null\": {\n      \"time\": [\n        0.0,\n        7.016,\n        14.032,\n        21.048,\n        28.064,\n        35.08,\n        42.096,\n        49.112,\n        56.128,\n        63.144,\n        70.16,\n        77.176,\n        84.192,\n        91.208,\n        98.224,\n        105.24,\n        112.256,\n        119.272,\n        126.288,\n        133.304,\n        140.32,\n        147.336,\n        154.352,\n        161.368,\n        168.384,\n        175.4,\n        182.416,\n        189.432,\n        196.448,\n        203.464,\n        210.48,\n        217.496,\n        224.512,\n        231.528,\n        238.544,\n        245.56,\n        252.576,\n        259.592,\n        266.608,\n        273.624,\n        280.64,\n        287.656,\n        294.672,\n        301.688,\n        308.704,\n        315.72,\n        322.736,\n        329.752,\n        336.768,\n        343.784,\n        350.8,\n        357.816,\n        364.832,\n        371.848,\n        378.864,\n        385.88,\n        392.896,\n        399.912,\n        406.928,\n        413.944,\n        420.96,\n        427.976,\n        434.992,\n        442.008,\n        449.024,\n        456.04,\n        463.056,\n        470.072,\n        477.088,\n        484.104,\n        491.12,\n        498.136,\n        505.152,\n        512.168,\n        519.184,\n        526.2,\n        533.216,\n        540.232,\n        547.248,\n        554.264,\n        561.28,\n        568.296,\n        575.312,\n        582.328,\n        589.344,\n        596.36,\n        603.376,\n        610.392,\n        617.408,\n        624.424,\n        631.44,\n        638.456,\n        645.472,\n        652.488,\n        659.504,\n        666.52,\n        673.536,\n        680.552,\n        687.568,\n        694.584,\n        701.6,\n        708.616,\n        715.632,\n        722.648,\n        729.664,\n        736.68,\n        743.696,\n        750.712,\n        757.728,\n        
764.744,\n        771.76,\n        778.776,\n        785.792,\n        792.808,\n        799.824,\n        806.84,\n        813.856,\n        820.872,\n        827.888,\n        834.904,\n        841.92,\n        848.936,\n        855.952,\n        862.968,\n        869.984,\n        877.0,\n        884.016,\n        891.032,\n        898.048,\n        905.064,\n        912.08,\n        919.096,\n        926.112,\n        933.128,\n        940.144,\n        947.16,\n        954.176,\n        961.192,\n        968.208,\n        975.224,\n        982.24,\n        989.256,\n        996.272,\n        1003.288,\n        1010.304,\n        1017.32,\n        1024.336,\n        1031.352,\n        1038.368,\n        1045.384,\n        1052.4,\n        1059.416,\n        1066.432,\n        1073.448,\n        1080.464,\n        1087.48,\n        1094.496,\n        1101.512,\n        1108.528,\n        1115.544,\n        1122.56,\n        1129.576,\n        1136.592,\n        1143.608,\n        1150.624,\n        1157.64,\n        1164.656,\n        1171.672,\n        1178.688,\n        1185.704,\n        1192.72,\n        1199.736,\n        1206.752,\n        1213.768,\n        1220.784,\n        1227.8,\n        1234.816,\n        1241.832,\n        1248.848,\n        1255.864,\n        1262.88,\n        1269.896,\n        1276.912,\n        1283.928,\n        1290.944,\n        1297.96,\n        1304.976,\n        1311.992,\n        1319.008,\n        1326.024,\n        1333.04,\n        1340.056,\n        1347.072,\n        1354.088,\n        1361.104,\n        1368.12,\n        1375.136,\n        1382.152,\n        1389.168,\n        1396.184,\n        1403.2,\n        1410.216,\n        1417.232,\n        1424.248,\n        1431.264,\n        1438.28,\n        1445.296,\n        1452.312,\n        1459.328,\n        1466.344,\n        1473.36,\n        1480.376,\n        1487.392,\n        1494.408,\n        1501.424,\n        1508.44,\n        1515.456,\n        
1522.472,\n        1529.488,\n        1536.504,\n        1543.52,\n        1550.536,\n        1557.552,\n        1564.568,\n        1571.584,\n        1578.6,\n        1585.616,\n        1592.632,\n        1599.648,\n        1606.664,\n        1613.68,\n        1620.696,\n        1627.712,\n        1634.728,\n        1641.744,\n        1648.76,\n        1655.776,\n        1662.792,\n        1669.808,\n        1676.824,\n        1683.84,\n        1690.856,\n        1697.872,\n        1704.888,\n        1711.904,\n        1718.92,\n        1725.936,\n        1732.952,\n        1739.968,\n        1746.984,\n        1754.0,\n        1761.016,\n        1768.032,\n        1775.048,\n        1782.064,\n        1789.08,\n        1796.096,\n        1803.112,\n        1810.128,\n        1817.144,\n        1824.16,\n        1831.176,\n        1838.192,\n        1845.208,\n        1852.224,\n        1859.24,\n        1866.256,\n        1873.272,\n        1880.288,\n        1887.304,\n        1894.32,\n        1901.336,\n        1908.352,\n        1915.368,\n        1922.384,\n        1929.4,\n        1936.416,\n        1943.432,\n        1950.448,\n        1957.464,\n        1964.48,\n        1971.496,\n        1978.512,\n        1985.528,\n        1992.544,\n        1999.56,\n        2006.576,\n        2013.592,\n        2020.608,\n        2027.624,\n        2034.64,\n        2041.656,\n        2048.672,\n        2055.688,\n        2062.704,\n        2069.72,\n        2076.736,\n        2083.752,\n        2090.768,\n        2097.784,\n        2104.8,\n        2111.816,\n        2118.832,\n        2125.848,\n        2132.864,\n        2139.88,\n        2146.896,\n        2153.912,\n        2160.928,\n        2167.944,\n        2174.96,\n        2181.976,\n        2188.992,\n        2196.008,\n        2203.024,\n        2210.04,\n        2217.056,\n        2224.072,\n        2231.088,\n        2238.104,\n        2245.12,\n        2252.136,\n        2259.152,\n        
2266.168,\n        2273.184,\n        2280.2,\n        2287.216,\n        2294.232,\n        2301.248,\n        2308.264,\n        2315.28,\n        2322.296,\n        2329.312,\n        2336.328,\n        2343.344,\n        2350.36,\n        2357.376,\n        2364.392,\n        2371.408,\n        2378.424,\n        2385.44,\n        2392.456,\n        2399.472,\n        2406.488,\n        2413.504,\n        2420.52,\n        2427.536,\n        2434.552,\n        2441.568,\n        2448.584,\n        2455.6,\n        2462.616,\n        2469.632,\n        2476.648,\n        2483.664,\n        2490.68,\n        2497.696,\n        2504.712,\n        2511.728,\n        2518.744,\n        2525.76,\n        2532.776,\n        2539.792,\n        2546.808,\n        2553.824,\n        2560.84,\n        2567.856,\n        2574.872,\n        2581.888,\n        2588.904,\n        2595.92,\n        2602.936,\n        2609.952,\n        2616.968,\n        2623.984,\n        2631.0,\n        2638.016,\n        2645.032,\n        2652.048,\n        2659.064,\n        2666.08,\n        2673.096,\n        2680.112,\n        2687.128,\n        2694.144,\n        2701.16,\n        2708.176,\n        2715.192,\n        2722.208,\n        2729.224,\n        2736.24,\n        2743.256,\n        2750.272,\n        2757.288,\n        2764.304,\n        2771.32,\n        2778.336,\n        2785.352,\n        2792.368,\n        2799.384,\n        2806.4,\n        2813.416,\n        2820.432,\n        2827.448,\n        2834.464,\n        2841.48,\n        2848.496,\n        2855.512,\n        2862.528,\n        2869.544,\n        2876.56,\n        2883.576,\n        2890.592,\n        2897.608,\n        2904.624,\n        2911.64,\n        2918.656,\n        2925.672,\n        2932.688,\n        2939.704,\n        2946.72,\n        2953.736,\n        2960.752,\n        2967.768,\n        2974.784,\n        2981.8,\n        2988.816,\n        2995.832,\n        3002.848,\n        3009.864,\n 
       3016.88,\n        3023.896,\n        3030.912,\n        3037.928,\n        3044.944,\n        3051.96,\n        3058.976,\n        3065.992,\n        3073.008,\n        3080.024,\n        3087.04,\n        3094.056,\n        3101.072,\n        3108.088,\n        3115.104,\n        3122.12,\n        3129.136,\n        3136.152,\n        3143.168,\n        3150.184,\n        3157.2,\n        3164.216,\n        3171.232,\n        3178.248,\n        3185.264,\n        3192.28,\n        3199.296,\n        3206.312,\n        3213.328,\n        3220.344,\n        3227.36,\n        3234.376,\n        3241.392,\n        3248.408,\n        3255.424,\n        3262.44,\n        3269.456,\n        3276.472,\n        3283.488,\n        3290.504,\n        3297.52,\n        3304.536,\n        3311.552,\n        3318.568,\n        3325.584,\n        3332.6,\n        3339.616,\n        3346.632,\n        3353.648,\n        3360.664,\n        3367.68,\n        3374.696,\n        3381.712,\n        3388.728,\n        3395.744,\n        3402.76,\n        3409.776,\n        3416.792,\n        3423.808,\n        3430.824,\n        3437.84,\n        3444.856,\n        3451.872,\n        3458.888,\n        3465.904,\n        3472.92,\n        3479.936,\n        3486.952,\n        3493.968,\n        3500.984,\n        3508.0\n      ],\n      \"env_step\": [\n        0,\n        491520,\n        786432,\n        1179648,\n        2162688,\n        2457600,\n        2654208,\n        3342336,\n        3440640,\n        3637248,\n        4128768,\n        5505024,\n        5701632,\n        6488064,\n        6684672,\n        6782976,\n        6979584,\n        8159232,\n        8355840,\n        9535488,\n        10518528,\n        10616832,\n        10911744,\n        11108352,\n        11304960,\n        11796480,\n        12288000,\n        12386304,\n        12582912,\n        13172736,\n        13369344,\n        14155776,\n        14548992,\n        14843904,\n        14942208,\n 
       15728640,\n        15826944,\n        16613376,\n        17694720,\n        18677760,\n        18874368,\n        18972672,\n        19267584,\n        20054016,\n        20348928,\n        20742144,\n        23101440,\n        23199744,\n        23396352,\n        23494656,\n        23986176,\n        24576000,\n        25264128,\n        25952256,\n        26148864,\n        26640384,\n        26836992,\n        27033600,\n        27131904,\n        27328512,\n        27525120,\n        28999680,\n        29687808,\n        29982720,\n        30081024,\n        30179328,\n        30375936,\n        30965760,\n        31162368,\n        32145408,\n        32440320,\n        32735232,\n        32931840,\n        34701312,\n        35684352,\n        36175872,\n        37158912,\n        37453824,\n        37945344,\n        38338560,\n        38436864,\n        38633472,\n        39321600,\n        40009728,\n        40796160,\n        41189376,\n        41680896,\n        41877504,\n        42270720,\n        44138496,\n        44236800,\n        45318144,\n        45416448,\n        46497792,\n        46596096,\n        46694400,\n        47087616,\n        47185920,\n        47579136,\n        47775744,\n        47972352,\n        48070656,\n        48562176,\n        49152000,\n        49643520,\n        50823168,\n        51609600,\n        53477376,\n        53870592,\n        55934976,\n        57704448,\n        60260352,\n        60555264,\n        62816256,\n        63700992,\n        64290816,\n        64389120,\n        65568768,\n        65765376,\n        67534848,\n        68419584,\n        69697536,\n        69894144,\n        70189056,\n        72646656,\n        72941568,\n        74907648,\n        75202560,\n        76087296,\n        77561856,\n        78544896,\n        78839808,\n        79331328,\n        80019456,\n        80412672,\n        80805888,\n        81002496,\n        81297408,\n        81494016,\n        81592320,\n      
  81887232,\n        82083840,\n        83361792,\n        84738048,\n        84934656,\n        85426176,\n        85622784,\n        86310912,\n        86409216,\n        86802432,\n        87293952,\n        88276992,\n        88965120,\n        90636288,\n        90832896,\n        91029504,\n        91226112,\n        93192192,\n        94076928,\n        94568448,\n        95158272,\n        95453184,\n        95944704,\n        96239616,\n        96534528,\n        96927744,\n        97320960,\n        98009088,\n        98304000,\n        98598912,\n        98992128,\n        99287040,\n        99385344,\n        100958208,\n        102039552,\n        102629376,\n        103317504,\n        104103936,\n        104398848,\n        104497152,\n        104792064,\n        104988672,\n        106070016,\n        106463232,\n        106758144,\n        107544576,\n        110690304,\n        111476736,\n        111575040,\n        112164864,\n        112558080,\n        112852992,\n        113639424,\n        113737728,\n        114327552,\n        115212288,\n        115703808,\n        116490240,\n        117473280,\n        118259712,\n        118554624,\n        118652928,\n        118849536,\n        118947840,\n        119144448,\n        122585088,\n        122880000,\n        123469824,\n        124452864,\n        124551168,\n        125042688,\n        126025728,\n        126124032,\n        126812160,\n        127008768,\n        128679936,\n        128778240,\n        129466368,\n        129957888,\n        130056192,\n        130744320,\n        131334144,\n        132612096,\n        132907008,\n        133398528,\n        133496832,\n        134873088,\n        136642560,\n        136740864,\n        136937472,\n        137428992,\n        137625600,\n        139001856,\n        140476416,\n        140574720,\n        141754368,\n        142147584,\n        142344192,\n        143228928,\n        143622144,\n        143818752,\n        
144506880,\n        144900096,\n        145195008,\n        147259392,\n        147456000,\n        148045824,\n        148733952,\n        149323776,\n        149422080,\n        150896640,\n        151388160,\n        152764416,\n        153550848,\n        154730496,\n        155516928,\n        156794880,\n        156893184,\n        156991488,\n        158957568,\n        159645696,\n        160432128,\n        161021952,\n        161316864,\n        161513472,\n        161906688,\n        162299904,\n        162791424,\n        163577856,\n        163774464,\n        164560896,\n        165150720,\n        165347328,\n        165937152,\n        166133760,\n        167706624,\n        168296448,\n        168394752,\n        168493056,\n        169476096,\n        169967616,\n        170065920,\n        170164224,\n        170262528,\n        170360832,\n        170459136,\n        170557440,\n        170754048,\n        172916736,\n        173211648,\n        173506560,\n        173604864,\n        175079424,\n        175767552,\n        175964160,\n        176160768,\n        176455680,\n        177143808,\n        178323456,\n        178716672,\n        179208192,\n        180682752,\n        180879360,\n        181174272,\n        183238656,\n        184123392,\n        184221696,\n        185106432,\n        185892864,\n        186384384,\n        187465728,\n        188645376,\n        189530112,\n        189628416,\n        189726720,\n        189923328,\n        190316544,\n        190808064,\n        191397888,\n        191987712,\n        192282624,\n        192380928,\n        193658880,\n        194445312,\n        194740224,\n        197394432,\n        197492736,\n        198082560,\n        198672384,\n        199065600,\n        199655424,\n        200245248,\n        201228288,\n        201818112,\n        201916416,\n        202309632,\n        202407936,\n        203194368,\n        205848576,\n        206438400,\n        207028224,\n        
207618048,\n        207912960,\n        208109568,\n        208601088,\n        209584128,\n        209682432,\n        209879040,\n        210272256,\n        210468864,\n        211845120,\n        212041728,\n        212828160,\n        213221376,\n        213909504,\n        214302720,\n        214499328,\n        214695936,\n        215384064,\n        215482368,\n        215678976,\n        215973888,\n        216072192,\n        216170496,\n        216563712,\n        217350144,\n        218136576,\n        218333184,\n        218431488,\n        218628096,\n        218726400,\n        219021312,\n        220102656,\n        220397568,\n        220495872,\n        220790784,\n        221577216,\n        221872128,\n        223051776,\n        223346688,\n        224329728,\n        224919552,\n        227082240,\n        227475456,\n        228753408,\n        228851712,\n        229933056,\n        230326272,\n        230522880,\n        230621184,\n        231112704,\n        232488960,\n        232980480,\n        233373696,\n        233472000,\n        234160128,\n        234258432,\n        234356736,\n        234651648,\n        234848256,\n        236224512,\n        237895680,\n        238092288,\n        238288896,\n        238583808,\n        239370240,\n        240254976,\n        241434624,\n        242515968,\n        242712576,\n        243400704,\n        244088832,\n        244187136,\n        245170176,\n        245268480,\n        245465088,\n        245956608,\n        246841344,\n        247431168,\n        247529472,\n        247627776,\n        249200640,\n        249987072,\n        250085376,\n        251953152,\n        252248064,\n        252837888,\n        253526016,\n        253624320,\n        253820928,\n        254017536,\n        254509056,\n        256180224,\n        256868352,\n        256966656,\n        257949696,\n        258146304,\n        258637824,\n        258834432,\n        259227648,\n        259620864,\n        
259719168,\n        259915776,\n        261980160,\n        262569984,\n        262766592,\n        263454720,\n        264339456,\n        265224192,\n        265322496,\n        265814016,\n        265912320,\n        266207232,\n        266403840,\n        266797056,\n        266895360,\n        267681792,\n        268369920,\n        269746176,\n        270041088,\n        270237696,\n        270434304,\n        270729216,\n        271220736,\n        271613952,\n        272007168,\n        272203776,\n        273186816,\n        273580032,\n        273776640,\n        273973248,\n        274464768,\n        275152896,\n        275644416,\n        276037632,\n        277118976,\n        278495232,\n        279281664,\n        279478272,\n        279674880,\n        280756224,\n        281542656,\n        281935872,\n        283213824,\n        283312128,\n        283803648,\n        285474816,\n        285573120,\n        286556160,\n        286949376,\n        288718848,\n        289505280,\n        289996800,\n        290291712,\n        291864576,\n        292061184,\n        292257792,\n        292651008,\n        293535744,\n        293732352,\n        294027264\n      ],\n      \"return\": [\n        0.0,\n        -5.422170765797297,\n        -5.365042141278585,\n        -5.321201788584392,\n        -5.282661674817402,\n        -5.279570868810018,\n        -5.099006303151449,\n        -4.985642132759094,\n        -5.039014817873636,\n        -5.05856496334076,\n        -5.127120292981465,\n        -5.301984883944193,\n        -5.787255485852559,\n        -6.492295300165812,\n        -6.859130134582519,\n        -6.679243364930152,\n        -6.585359064737955,\n        -6.638607020725806,\n        -6.56383727868398,\n        -5.579464244066426,\n        -4.730856805543104,\n        -4.85022839024663,\n        -4.08018668456624,\n        -3.7235442843676236,\n        -2.5684121332814294,\n        -2.3172017341541746,\n        -1.8826996049284936,\n        
-1.7406521982420236,\n        -1.3593260787675778,\n        2.4418186768361676,\n        2.5145354398091633,\n        3.53728093676269,\n        3.7485122829924027,\n        5.24840559720993,\n        5.905099849700928,\n        6.319293590386709,\n        6.558245106538137,\n        6.420299232800801,\n        7.086359784603118,\n        7.828594415982565,\n        8.177236615816753,\n        8.169828134377797,\n        8.109048763116201,\n        7.960519586404165,\n        7.632579886913301,\n        8.5222447570165,\n        9.480246132214864,\n        9.057840726375579,\n        8.536546057065328,\n        8.322958382765451,\n        8.593875254789987,\n        9.066962721347808,\n        9.4220951239268,\n        9.672696344852447,\n        9.42037155866623,\n        9.068991316159567,\n        9.227477009296416,\n        8.801373164653777,\n        9.659750367005666,\n        9.834801010290782,\n        9.733246921698251,\n        9.725076502164205,\n        9.575205939610798,\n        9.757781678835551,\n        9.515549600919087,\n        10.32995819648107,\n        10.743536387284598,\n        10.337343724568685,\n        10.239077248573302,\n        9.3617884238561,\n        9.82903660694758,\n        10.205209770997365,\n        10.302206721305849,\n        10.8276744333903,\n        11.209485573768616,\n        10.820306483109794,\n        10.813732381661731,\n        11.153387095133462,\n        11.57680031220118,\n        11.863303298155467,\n        12.104669983386993,\n        12.226192903518674,\n        12.226238845984142,\n        12.321544722716013,\n        12.010425718625386,\n        11.959592173894245,\n        12.702534379164378,\n        12.588660320440928,\n        12.730267083644867,\n        13.09765745639801,\n        13.098441476821899,\n        13.007480210463207,\n        12.896890660127005,\n        13.57533088763555,\n        14.103462287584941,\n        14.065941929022472,\n        13.571870381832122,\n        
13.765269615650178,\n        13.634288456439974,\n        13.655183283686638,\n        13.933596647580465,\n        14.153377745151522,\n        13.484311122894288,\n        13.991338650385538,\n        14.217809696992239,\n        14.372374130090078,\n        14.48803258895874,\n        13.902689254283905,\n        14.008463487625122,\n        15.318258340358733,\n        14.727367254892984,\n        15.613157649834951,\n        15.31281984011332,\n        16.40384552160899,\n        16.804359534581504,\n        16.20739706039429,\n        16.820533531506857,\n        16.80816020488739,\n        17.16210698922475,\n        17.353576091925305,\n        17.84310720125834,\n        17.64714405377706,\n        17.666645275751748,\n        17.592155150572456,\n        17.889433806737262,\n        18.332248009840647,\n        18.998913583755495,\n        18.876606470743813,\n        19.217209668159484,\n        19.138889632225034,\n        18.948357582092285,\n        19.382302405039468,\n        19.401501762866975,\n        19.0712877702713,\n        19.468732802073163,\n        19.555791912873588,\n        19.585630633036295,\n        19.76214089870453,\n        19.80392255862554,\n        19.832017685572307,\n        19.642031608422595,\n        20.049005517959596,\n        20.15443610191345,\n        20.743607517878214,\n        20.581281551520032,\n        21.05371557156245,\n        20.423204619089763,\n        20.687201137542726,\n        21.01858803431193,\n        21.266472924550374,\n        21.371525878906255,\n        21.652281880378723,\n        21.299677049318948,\n        21.71703642209371,\n        21.522530011336006,\n        21.78698669195175,\n        22.034059658050538,\n        21.718550532658895,\n        21.89046088218689,\n        21.815517278512317,\n        22.191909686724347,\n        22.510205643971762,\n        22.344761204719543,\n        22.288901374340057,\n        22.643796286582944,\n        22.330036065578458,\n        
22.251962281068163,\n        22.40083763678869,\n        22.013201440970104,\n        22.518215183417,\n        22.72313869396845,\n        22.826243489583334,\n        22.859257675011957,\n        23.06479694366455,\n        22.927606552441915,\n        22.74173924446106,\n        22.974355036417638,\n        23.50650741418203,\n        23.338268404006957,\n        22.919516321818037,\n        23.388370493253074,\n        23.163635557492572,\n        23.164315322240196,\n        23.40869496822357,\n        23.341194401582083,\n        23.485880421797432,\n        23.504843918482464,\n        23.721208157539365,\n        23.993358154296875,\n        23.909098584651947,\n        23.956194926897684,\n        24.3459050989151,\n        24.185533266067505,\n        24.134012846946717,\n        24.43199095249176,\n        24.177146474520367,\n        24.436910311381023,\n        24.54009437640508,\n        24.16632465362549,\n        24.412582944234213,\n        24.380601506233216,\n        24.468062558174136,\n        24.40877560297648,\n        24.71468245188395,\n        24.60827238559723,\n        24.740209692319237,\n        24.53396530866623,\n        24.663431402842207,\n        24.959084293047585,\n        24.908331915537516,\n        24.947622075080872,\n        24.6909338927269,\n        24.940318908691406,\n        25.16483751296997,\n        24.60695726076762,\n        25.112809979120886,\n        25.086977144877114,\n        24.773433555761972,\n        24.926412556966145,\n        24.998008828957875,\n        25.06103755235672,\n        24.86278089841207,\n        24.97040135542552,\n        25.0428440785408,\n        25.258395306269325,\n        25.210753719011944,\n        25.279651983579,\n        25.36194952170054,\n        25.692520167032878,\n        25.59342387040456,\n        25.554213938713076,\n        25.520990746816,\n        25.31104542016983,\n        25.52819642861684,\n        25.75904115041097,\n        25.72125477631887,\n        
25.69705542564392,\n        25.73389612515767,\n        25.833797233899435,\n        25.576571871439615,\n        25.42343479792277,\n        25.95107562383016,\n        26.04738268693288,\n        25.781439321835837,\n        25.82269303480784,\n        25.62690583546956,\n        25.771284255981442,\n        26.133092199961343,\n        26.049315468470255,\n        26.092107651233675,\n        25.754562322298685,\n        26.115973978042604,\n        25.99166533311208,\n        26.16707617441813,\n        25.920664300918578,\n        26.06877758344014,\n        26.161833999951682,\n        26.205389365355174,\n        26.313472315470378,\n        26.228369690577185,\n        26.16665622870127,\n        26.187024978796643,\n        26.11963609695435,\n        26.04467580159505,\n        26.249091874758403,\n        26.569917356967924,\n        26.703698792457583,\n        26.39212519009908,\n        26.452600940068564,\n        26.32757260640462,\n        26.345079108079275,\n        26.594597864151,\n        26.496056710879007,\n        26.45632796843847,\n        26.45881276925405,\n        26.322761637369794,\n        26.41735316912333,\n        26.37774245738983,\n        26.58394108136495,\n        26.462264091173807,\n        26.70522584835688,\n        26.743710562388102,\n        26.41018081347148,\n        26.82354289849599,\n        26.584163082440696,\n        26.812084256807964,\n        26.887239375114444,\n        26.643198828697205,\n        26.901577798525494,\n        26.89036826769511,\n        27.12406840324402,\n        27.15825298945109,\n        26.974571345647174,\n        26.76065761725108,\n        26.80236604690552,\n        26.789501201311747,\n        27.014291658401486,\n        27.125630804697675,\n        27.15856007575989,\n        27.129440813859304,\n        27.022346736590066,\n        27.000964590708417,\n        26.63156608104706,\n        27.112519771258036,\n        27.229807084401447,\n        26.898604194323223,\n        
26.96032204310099,\n        27.331697724660234,\n        26.889267352422078,\n        27.212157440185546,\n        27.202150770823163,\n        27.021636052926382,\n        27.016969119707742,\n        27.01914302031199,\n        27.37581664721171,\n        27.32651935577393,\n        27.31468152999878,\n        27.005629046758017,\n        27.286985462506617,\n        27.205869766076404,\n        27.2348338898023,\n        27.08270854314168,\n        27.12608816782634,\n        27.383097707430522,\n        27.283648891448976,\n        27.06018231074015,\n        27.234840971628827,\n        27.476725266774494,\n        27.707652390797932,\n        27.227288559277852,\n        27.3107603931427,\n        27.011848467191058,\n        27.36976047515869,\n        27.391914099057516,\n        27.385174859364827,\n        27.42166522105535,\n        27.630024264653525,\n        27.380434902509055,\n        27.47323336760203,\n        27.651270182927448,\n        27.44848692893982,\n        27.483901700973508,\n        27.21515275557836,\n        27.497027911345167,\n        27.572442706425985,\n        27.560447527567543,\n        27.36793696244558,\n        27.42006613175074,\n        27.493619120915735,\n        27.689073079427086,\n        27.50264252583186,\n        27.43011033058167,\n        27.64782487869263,\n        27.20738600730896,\n        27.465467980702716,\n        27.282119348843892,\n        27.665611066818233,\n        27.81447112083435,\n        27.87130407969157,\n        27.64211650212606,\n        27.7080734316508,\n        27.621387904485065,\n        27.702936627070113,\n        27.598064007759092,\n        27.722659085591633,\n        27.64586271524429,\n        27.999647210439047,\n        27.964578876495363,\n        27.837010904947917,\n        27.638748474121098,\n        27.76256209055583,\n        28.027448720932004,\n        27.709779580434162,\n        27.66845738093058,\n        27.72518820444743,\n        27.65982426007589,\n        
27.579961166381835,\n        27.47355571269989,\n        27.76356635570526,\n        27.914256649017332,\n        27.394945340156553,\n        27.68763256072998,\n        27.689757191340124,\n        27.835959854125974,\n        27.943186308542888,\n        27.47778695344925,\n        27.718737327257788,\n        27.678874394098916,\n        27.8374245039622,\n        27.850753422578176,\n        27.432888934612276,\n        27.755647220611568,\n        27.72031798203786,\n        27.529903690020245,\n        27.607865818341576,\n        27.712867733637495,\n        27.99885702927907,\n        27.869108432133988,\n        27.792950577735905,\n        27.947116510073347,\n        27.803222284317016,\n        27.81898798942566,\n        27.745415393511454,\n        27.990131341616316,\n        28.14864422162374,\n        27.884647294680278,\n        27.716099699338276,\n        27.833606998125713,\n        28.014787572224932,\n        27.519190567334494,\n        27.7539791226387,\n        27.726883935928345,\n        27.769253171284998,\n        28.20607573827108,\n        27.807977307637533,\n        27.813251864115397,\n        27.978915383021036,\n        27.882311402956645,\n        28.148458941777548,\n        27.912579143047335,\n        27.93072050094604,\n        27.767308457692465,\n        28.174651209513346,\n        28.172889951070147,\n        28.343097937901813,\n        28.38390079498291,\n        28.22494049390157,\n        28.322374693552657,\n        28.16988698323568,\n        28.241848163604732,\n        28.278563607533773,\n        27.97595597028732,\n        28.320658842722576,\n        28.24419835249583,\n        28.260422398249307,\n        28.164151094754533,\n        28.21596943537394,\n        28.46122231165568,\n        28.27712041378021,\n        28.590335140228273,\n        28.311755402882895,\n        28.67214571317037,\n        28.50228882789612,\n        28.595824553171795,\n        28.50311995824178,\n        28.401127497355144,\n   
     28.80892203013102,\n        28.64995643933614,\n        28.63968591054281,\n        28.514695514043172,\n        28.670809841156004,\n        28.587760623296102,\n        28.8155946191152,\n        28.869128182729085,\n        28.65603721936544,\n        28.823503991762795,\n        28.95586130619049,\n        28.869864171346027,\n        28.76503180821737,\n        29.12373401006063,\n        28.938447481791183,\n        28.85428058942159,\n        28.565257782936097,\n        28.63410388787587,\n        28.925285924275716,\n        29.037555669148762,\n        28.76260529677073,\n        28.680053356488543,\n        28.91645269393921,\n        28.756759365399677,\n        28.825944765408835,\n        28.823520730336508,\n        28.880934257507324,\n        28.690467605590822,\n        28.81139595031738,\n        28.592766869862874,\n        28.87614825725555,\n        28.741094710032144,\n        29.01044699350993,\n        29.127510652542117,\n        28.686315304438278,\n        28.4821040948232,\n        28.787544911702472,\n        28.696746519406634,\n        28.59993433316549,\n        28.86362829049428,\n        28.638428608576458,\n        28.810294100443524,\n        28.65783554712931,\n        28.748908634185792,\n        28.889724987347922,\n        28.775232067108153,\n        28.812776306470237,\n        28.68773955980937,\n        28.682406692504884,\n        28.710577583312986,\n        28.798399791717532,\n        28.751555989583334,\n        28.43867567062378,\n        28.622998971939086,\n        28.5418660291036\n      ],\n      \"return_min\": [\n        0.0,\n        -5.520963165181699,\n        -5.398309007261076,\n        -5.350411727864705,\n        -5.300563709063288,\n        -5.373167151286887,\n        -5.244743047488927,\n        -5.018510029574243,\n        -5.23146215874125,\n        -5.253785732489721,\n        -5.446553555244509,\n        -5.580799970293974,\n        -6.26151833649266,\n        -6.979740932840051,\n        
-7.505812459772718,\n        -7.092135843312693,\n        -7.39920407445033,\n        -7.7452636222094355,\n        -7.9050134285622855,\n        -6.48987129549158,\n        -5.654353417920056,\n        -5.815269690086543,\n        -4.9033229608684366,\n        -4.364460007935154,\n        -2.7990416494641397,\n        -2.9495417390998684,\n        -2.1731720117375652,\n        -1.8570615821223686,\n        -1.683727363340683,\n        1.3986503374892052,\n        1.578396193379838,\n        3.2064730591978994,\n        3.58337034294805,\n        4.303947198498285,\n        4.23811368968946,\n        4.844166448777771,\n        5.027521112944505,\n        5.054222594896141,\n        6.455040957762881,\n        6.603025119237945,\n        7.240161268213682,\n        7.445770366101291,\n        7.321740412287392,\n        6.651607739669598,\n        5.521256497526231,\n        6.277467043395728,\n        8.574633125834227,\n        7.974521417804013,\n        7.55285509943894,\n        7.299916074228484,\n        7.504092070112076,\n        7.389169200708295,\n        7.971362274313357,\n        8.948853512556664,\n        9.05238518322198,\n        8.173627072810628,\n        8.597720236513965,\n        8.279687455077365,\n        8.446574822859874,\n        8.99332873454503,\n        9.243581089298994,\n        8.82941824395879,\n        8.975877725575135,\n        9.256436597948978,\n        8.90494565678453,\n        9.57158849729407,\n        10.188809105616095,\n        9.77360360644471,\n        9.337660824915222,\n        8.051044333340174,\n        8.864180010138652,\n        8.744804824430856,\n        8.49640108291722,\n        10.067010098473812,\n        10.014871957505976,\n        9.64823068915522,\n        9.15242180940713,\n        10.082828645171604,\n        10.776856899541066,\n        10.577554678856938,\n        10.922870535973447,\n        10.903899618788715,\n        11.110146302869207,\n        11.234899446013943,\n        
10.664980188331834,\n        10.61023834917109,\n        10.977294363326532,\n        11.188408920116812,\n        11.418770133334487,\n        11.80776633603974,\n        11.658676561647457,\n        11.498703604312505,\n        10.969441397737207,\n        12.203737145900309,\n        12.891263068433968,\n        12.743336304240811,\n        12.151705716844223,\n        12.767198778851697,\n        12.792299101983268,\n        12.17240064854574,\n        12.148095266100462,\n        12.432768169811734,\n        11.365202841854662,\n        12.16684822910994,\n        12.495021032412778,\n        13.26835609727346,\n        13.050785858650338,\n        12.281424819974456,\n        12.449084199784531,\n        13.625030204012711,\n        13.21897266304786,\n        14.48782746105462,\n        14.103399327134131,\n        15.519930829855001,\n        15.97552373557178,\n        15.529220441611425,\n        15.74885704151496,\n        15.016924988239797,\n        15.989010107035773,\n        16.442932816788804,\n        16.99508371654349,\n        16.53486178808719,\n        16.706773643149774,\n        16.661555294093077,\n        17.555512654693615,\n        18.112669203895777,\n        18.17118170021899,\n        18.065946631978374,\n        18.42934931637301,\n        18.676209657546792,\n        17.903826141385156,\n        18.6034161292777,\n        18.48889474575812,\n        17.758579913985656,\n        18.625405819750064,\n        19.23532065084414,\n        19.04211715145168,\n        18.877847871623374,\n        19.14924260166623,\n        18.998339729549528,\n        18.571789285071276,\n        19.417446402769226,\n        19.584450623471515,\n        20.181799769130308,\n        20.17112513383781,\n        20.51215091710531,\n        19.469530447820137,\n        20.424742202150163,\n        20.575052829757396,\n        20.784339971244243,\n        20.79814880451078,\n        21.561650860339824,\n        20.769088555281552,\n        
21.583207375785705,\n        21.41970488153784,\n        21.66693042236608,\n        21.86798628928739,\n        21.316749097027397,\n        21.436188053015243,\n        21.134490993196177,\n        21.871763789661642,\n        22.315875755910508,\n        22.09950101356021,\n        22.12789918481529,\n        22.271840277606675,\n        22.060546728262114,\n        21.934839827743634,\n        22.100503142963884,\n        21.245196426043726,\n        21.869711692631675,\n        22.132582526880245,\n        22.46149806767977,\n        22.481397577260815,\n        22.770352157659595,\n        22.421027680957025,\n        22.252393067221245,\n        22.333097419765267,\n        22.991255548588548,\n        22.674374393471915,\n        22.535454298444176,\n        23.123628328353366,\n        22.69643708364719,\n        22.5819198826207,\n        22.94708603291372,\n        23.07638785760627,\n        22.74560562166574,\n        23.06932107625478,\n        23.300721860075413,\n        23.78194955286657,\n        23.523293205955508,\n        23.525998829282933,\n        23.76377698261631,\n        23.62537850586962,\n        23.864635305032632,\n        23.875682383296933,\n        23.46097954015071,\n        23.82543754032398,\n        23.761480614083627,\n        23.255379816835923,\n        23.54560044868967,\n        23.668944427175553,\n        23.832584937777096,\n        23.425595245832536,\n        24.05916452843633,\n        23.455597327213653,\n        24.038986145333908,\n        23.56369203440597,\n        24.171526792323046,\n        24.545470005081768,\n        24.484432222472666,\n        24.334504603248885,\n        23.39817352631335,\n        24.033054551448778,\n        24.482760384790556,\n        23.846917945112985,\n        24.390506037773623,\n        24.27431765022773,\n        24.061599451017777,\n        24.33728896947999,\n        24.144899762326954,\n        23.99540815574204,\n        23.57496830841703,\n        23.930048473254217,\n    
    23.94523741034942,\n        24.258384314268703,\n        24.37110129883553,\n        24.412283448059235,\n        24.324307472463204,\n        24.5954491279622,\n        24.899447987634463,\n        24.558637573616835,\n        24.818269162062435,\n        24.08092939132887,\n        24.275233238542537,\n        24.93987956500964,\n        24.822554121729205,\n        25.0395148262323,\n        25.136228336368635,\n        24.872140743082323,\n        24.757863129395144,\n        24.30237863460298,\n        24.730835786257032,\n        24.91673874558004,\n        25.09881997330028,\n        25.11666242969545,\n        24.928856844013023,\n        25.00190933529314,\n        24.950489014452607,\n        25.148173576425023,\n        24.952507670890725,\n        24.841307217586042,\n        25.71003983841862,\n        25.26704538169371,\n        25.224044727235064,\n        25.220263130793953,\n        25.21835000241302,\n        25.290933540632018,\n        25.181038007064192,\n        25.301149522880877,\n        25.62093761422375,\n        25.432852276965008,\n        25.342523079209098,\n        25.269339675082783,\n        24.949805414134993,\n        25.233401834020132,\n        25.576083550231427,\n        26.339662357104462,\n        25.792150727045904,\n        25.70682197517077,\n        25.807590150340896,\n        25.894729706751228,\n        25.837302977147562,\n        25.78135554745692,\n        25.68552533534089,\n        25.954483650340283,\n        25.94591635682661,\n        25.97817717944112,\n        25.75997075579898,\n        26.297399027884342,\n        25.80107864935811,\n        25.989724615648488,\n        25.984221681740333,\n        26.06569266026917,\n        26.481365883267824,\n        26.164847513213495,\n        26.603717323908448,\n        26.35348614473745,\n        26.30406394025493,\n        26.35116010418455,\n        26.23870062654923,\n        26.471481913277074,\n        26.72865827913775,\n        26.456025977567716,\n    
    26.509809123334087,\n        26.135543174587916,\n        26.414447583473727,\n        26.847739914839327,\n        26.986797064210627,\n        26.619696131921607,\n        26.740811445864274,\n        26.601182567396776,\n        26.63685962355374,\n        26.202893309498105,\n        26.53260006050866,\n        26.81746777977994,\n        26.35970819265938,\n        26.873575006909427,\n        26.999221390449513,\n        26.41511857693682,\n        26.750161047320116,\n        26.847929553691117,\n        26.322235246867464,\n        26.229498529666962,\n        26.46977801746138,\n        26.892038673702228,\n        26.58452566559043,\n        26.72107694201021,\n        26.389049676052693,\n        27.013425431358826,\n        26.902429688294113,\n        26.802812226360995,\n        26.385575632648816,\n        26.369492487494366,\n        26.920060112200925,\n        26.998724796791922,\n        26.68365616862227,\n        26.77927573299712,\n        27.14768838044163,\n        27.466859888550175,\n        26.89888082937872,\n        27.12633609842479,\n        26.714824263923475,\n        27.08128409218404,\n        26.98816964606872,\n        26.923163448198604,\n        26.80857466695888,\n        27.13636985361538,\n        26.978532109223686,\n        27.299038517001165,\n        27.52869762405963,\n        27.23234788711246,\n        27.250432316649512,\n        26.60946317727994,\n        27.10714554669029,\n        27.46608859685352,\n        27.26461963307082,\n        27.069754957120214,\n        27.31017870242129,\n        27.065506717686876,\n        27.195062973283548,\n        27.047814098820695,\n        27.272618294455032,\n        27.346293947512688,\n        26.68929354412294,\n        26.833985415803223,\n        26.550998418048945,\n        26.988439648514994,\n        27.333751302384293,\n        27.418736622351446,\n        26.98571833055701,\n        27.385679063968748,\n        27.334479780511817,\n        27.53131813266607,\n 
       27.09134479874994,\n        27.455784867981503,\n        27.108790568073573,\n        27.7333053434198,\n        27.735471393711826,\n        27.46263765149636,\n        26.99981107374089,\n        27.086095394451846,\n        27.660553475563162,\n        27.47148725177635,\n        27.462223492396287,\n        27.47444853382932,\n        27.28271224520508,\n        27.030280701112197,\n        26.953127198486797,\n        27.07038773963107,\n        27.40335757841962,\n        26.828224693044927,\n        27.455776077350375,\n        27.459088288760473,\n        27.55987625989684,\n        27.667771051731005,\n        27.306320572769852,\n        27.42247721506064,\n        27.363389568960162,\n        27.2417683309204,\n        27.16002617417197,\n        26.879341156026854,\n        27.409533505672982,\n        27.258284186429318,\n        27.13185402158389,\n        27.22422988865852,\n        27.107298046733188,\n        27.540015302529557,\n        27.252735638909673,\n        26.872490744618894,\n        27.202678057764224,\n        27.40268095166156,\n        27.28055555351517,\n        27.170128347919047,\n        27.69789697407816,\n        27.642380106133988,\n        27.332501783443227,\n        27.10840513223893,\n        27.31736408059001,\n        27.46624924375133,\n        26.64193874196349,\n        27.50888790519094,\n        27.229939764753325,\n        26.98551956035765,\n        27.744788462148396,\n        27.149869593429152,\n        27.484981796277946,\n        27.606668302204987,\n        27.255173815148787,\n        27.456134913716546,\n        27.099531472422207,\n        27.393945264619013,\n        27.25108337760589,\n        27.76944243898332,\n        27.733599140160784,\n        27.84646082153513,\n        27.933202163430472,\n        27.73045705324124,\n        28.041684326913252,\n        27.692758796177635,\n        27.746313479577946,\n        27.628974429062982,\n        27.369484454792406,\n        27.606020764457973,\n 
       27.75161067029577,\n        27.70126964798436,\n        27.649431270243365,\n        27.454127809238894,\n        27.842896762811133,\n        27.544721561443446,\n        27.835889016851528,\n        27.298496196673447,\n        28.035712672520734,\n        27.992795921610345,\n        28.12532764605974,\n        27.78349937488175,\n        27.672714149185374,\n        28.198158361700223,\n        28.114355241871845,\n        28.23306676740723,\n        28.19400077454502,\n        28.325819368964144,\n        28.409198407665194,\n        28.661559949465364,\n        28.534057885334462,\n        28.220808826082358,\n        28.480647238129848,\n        28.816288333143056,\n        28.823603017711733,\n        28.72982788893736,\n        28.891935847236173,\n        28.686997229285602,\n        28.75076735412921,\n        28.415471057558182,\n        28.44960991011972,\n        28.623370633354085,\n        28.8170532416532,\n        28.496819790952056,\n        28.313081375883748,\n        28.761926257981262,\n        28.570715940030333,\n        28.71350487001399,\n        28.64412790707295,\n        28.68453951723116,\n        28.415278606195162,\n        28.421916550091932,\n        28.176255415956366,\n        28.62514978950491,\n        28.498932264171632,\n        28.8257088770386,\n        29.01524324193152,\n        28.559692033382177,\n        28.30321338178444,\n        28.491747264076793,\n        28.542804594692967,\n        28.22614934251985,\n        28.106067120710527,\n        28.232353672890493,\n        28.643067026302873,\n        28.433486131362784,\n        28.47990484587543,\n        28.390362508890856,\n        28.348291888933517,\n        28.412395453613676,\n        28.177133127616443,\n        28.24077424802352,\n        28.172266605923447,\n        28.225951816403256,\n        28.400378623201224,\n        28.023487501372532,\n        28.417555112851705,\n        28.373084049809865\n      ],\n      \"return_max\": [\n        0.0,\n   
     -5.3233783664128955,\n        -5.331775275296094,\n        -5.291991849304079,\n        -5.264759640571516,\n        -5.18597458633315,\n        -4.953269558813971,\n        -4.952774235943945,\n        -4.8465674770060225,\n        -4.863344194191798,\n        -4.807687030718422,\n        -5.023169797594413,\n        -5.312992635212458,\n        -6.004849667491572,\n        -6.212447809392319,\n        -6.266350886547611,\n        -5.77151405502558,\n        -5.531950419242177,\n        -5.222661128805675,\n        -4.669057192641273,\n        -3.807360193166153,\n        -3.8851870904067165,\n        -3.2570504082640426,\n        -3.082628560800093,\n        -2.337782617098719,\n        -1.6848617292084809,\n        -1.5922271981194223,\n        -1.6242428143616785,\n        -1.0349247941944726,\n        3.48498701618313,\n        3.4506746862384885,\n        3.8680888143274808,\n        3.9136542230367555,\n        6.192863995921575,\n        7.572086009712397,\n        7.794420731995648,\n        8.088969100131768,\n        7.786375870705461,\n        7.717678611443356,\n        9.054163712727185,\n        9.114311963419823,\n        8.893885902654302,\n        8.89635711394501,\n        9.269431433138731,\n        9.74390327630037,\n        10.767022470637272,\n        10.385859138595501,\n        10.141160034947145,\n        9.520237014691716,\n        9.346000691302418,\n        9.683658439467898,\n        10.744756241987321,\n        10.872827973540243,\n        10.396539177148231,\n        9.788357934110481,\n        9.964355559508505,\n        9.857233782078868,\n        9.323058874230188,\n        10.872925911151457,\n        10.676273286036535,\n        10.222912754097509,\n        10.62073476036962,\n        10.174534153646462,\n        10.259126759722124,\n        10.126153545053644,\n        11.088327895668069,\n        11.298263668953101,\n        10.90108384269266,\n        11.140493672231383,\n        10.672532514372024,\n        
10.79389320375651,\n        11.665614717563875,\n        12.108012359694477,\n        11.588338768306787,\n        12.404099190031255,\n        11.992382277064369,\n        12.475042953916333,\n        12.22394554509532,\n        12.376743724861296,\n        13.149051917453995,\n        13.286469430800539,\n        13.548486188248633,\n        13.342331389099076,\n        13.408189999418083,\n        13.35587124891894,\n        13.308945998617402,\n        14.427774395002224,\n        13.988911720765044,\n        14.041764033955246,\n        14.387548576756283,\n        14.53820639199634,\n        14.51625681661391,\n        14.824339922516803,\n        14.946924629370791,\n        15.315661506735914,\n        15.388547553804132,\n        14.992035046820021,\n        14.76334045244866,\n        14.47627781089668,\n        15.137965918827536,\n        15.719098029060468,\n        15.87398732049131,\n        15.603419403933914,\n        15.815829071661136,\n        15.9405983615717,\n        15.476392162906697,\n        15.925279319267142,\n        15.523953688593354,\n        15.567842775465712,\n        17.011486476704757,\n        16.23576184673811,\n        16.73848783861528,\n        16.52224035309251,\n        17.28776021336298,\n        17.633195333591228,\n        16.885573679177153,\n        17.892210021498755,\n        18.599395421534982,\n        18.335203871413725,\n        18.264219367061806,\n        18.691130685973192,\n        18.759426319466925,\n        18.62651690835372,\n        18.522755007051835,\n        18.22335495878091,\n        18.551826815785518,\n        19.826645467292,\n        19.68726630950925,\n        20.005070019945958,\n        19.601569606903276,\n        19.992889022799414,\n        20.161188680801235,\n        20.31410877997583,\n        20.383995626556946,\n        20.31205978439626,\n        19.876263174903034,\n        20.12914411462091,\n        20.646433925785686,\n        20.45860251558485,\n        20.665695641595086,\n  
      20.712273931773915,\n        20.680564633149967,\n        20.724421580355386,\n        21.30541526662612,\n        20.991437969202256,\n        21.59528022601959,\n        21.376878790359388,\n        20.949660072935288,\n        21.46212323886646,\n        21.748605877856505,\n        21.944902953301728,\n        21.742912900417622,\n        21.830265543356344,\n        21.850865468401715,\n        21.625355141134172,\n        21.90704296153742,\n        22.200133026813685,\n        22.120351968290393,\n        22.344733711358536,\n        22.496543563828457,\n        22.512055583787053,\n        22.704535532033017,\n        22.590021395878875,\n        22.449903563864822,\n        23.015752295559214,\n        22.599525402894802,\n        22.569084734392693,\n        22.701172130613497,\n        22.78120645589648,\n        23.166718674202322,\n        23.313694861056657,\n        23.190988911486897,\n        23.2371177727631,\n        23.359241729669506,\n        23.434185423926806,\n        23.23108542170087,\n        23.61561265307001,\n        24.02175927977551,\n        24.002162414542,\n        23.303578345191898,\n        23.653112658152782,\n        23.630834031337955,\n        23.746710761859692,\n        23.870303903533422,\n        23.606000945557895,\n        24.226155221929126,\n        23.94036676071015,\n        24.141694455003318,\n        24.204766755727178,\n        24.294903963348386,\n        24.386391024512434,\n        24.92803321521389,\n        24.74568802626539,\n        24.403390388860803,\n        24.988299521686585,\n        24.893313408890023,\n        25.048383082438068,\n        25.31870813872653,\n        25.07726949041506,\n        25.279565439778757,\n        25.092258585290878,\n        25.103540178571176,\n        25.391955960120423,\n        25.370200375331567,\n        25.760947443980804,\n        25.441433239304565,\n        25.504238582926487,\n        25.155336013361367,\n        25.372698581013402,\n        
25.332231608602367,\n        25.56073954691286,\n        25.98369425914045,\n        25.847583265934034,\n        25.846914641149386,\n        25.366996576422252,\n        25.83511392046815,\n        25.8996366395265,\n        25.485267660506167,\n        25.5155361444523,\n        25.851117895588796,\n        26.126666948971398,\n        26.150593488407107,\n        26.01075423759682,\n        26.140450746732178,\n        26.258406298269946,\n        26.050406139188357,\n        26.147020519098767,\n        26.39959157093788,\n        26.789591206103555,\n        26.287399753174657,\n        26.549790303809317,\n        26.223712331569565,\n        26.54116144901079,\n        26.781159618691145,\n        26.578202735812297,\n        26.619955430908533,\n        26.354596025055542,\n        26.331563913946706,\n        26.79545372471655,\n        26.395280613484086,\n        26.54449096124256,\n        27.17131546140329,\n        27.17802662828572,\n        26.464058670371394,\n        26.528723639920234,\n        26.3249548269261,\n        26.540659176669745,\n        27.31569538547008,\n        26.950457360515486,\n        27.231707631576626,\n        26.66781742701133,\n        26.521908117666587,\n        26.716285284530453,\n        27.1101076216012,\n        26.621065471043202,\n        26.919205164467265,\n        27.032734459271346,\n        27.229740723646156,\n        27.32579510805988,\n        26.835801766930622,\n        26.900460180437534,\n        27.031526878384188,\n        26.969932518825914,\n        27.13954618905511,\n        27.264781915496673,\n        27.56375116370442,\n        27.067735227810704,\n        26.992099653152255,\n        27.19837990496636,\n        26.84755506246834,\n        26.795428509407323,\n        27.35189275115444,\n        27.210757874301095,\n        27.227130601536047,\n        26.963141888167815,\n        26.699606917912977,\n        26.856529158805543,\n        26.995514158980676,\n        26.87048313484556,\n     
   27.123449532989504,\n        27.42072708106527,\n        27.50319944303587,\n        26.75466896667379,\n        27.165719913724157,\n        27.003478651667898,\n        27.02045118970748,\n        27.42099260549144,\n        26.98233371713948,\n        27.451995492866438,\n        27.542035908840987,\n        27.776654893210967,\n        27.587847699764428,\n        27.493116713726632,\n        27.011506111168075,\n        27.46918891922312,\n        27.164554819149767,\n        27.180843401963646,\n        27.264464545184723,\n        27.697424019598174,\n        27.518070181854334,\n        27.443510905783356,\n        27.365069557863094,\n        27.06023885259601,\n        27.692439482007412,\n        27.642146389022955,\n        27.437500195987067,\n        27.047069079292555,\n        27.664174058870955,\n        27.363416127907335,\n        27.674153833050976,\n        27.55637198795521,\n        27.7210368589853,\n        27.80443970974852,\n        27.5685080231626,\n        27.859594620721193,\n        28.068513045957427,\n        27.90828611798735,\n        27.62220841746334,\n        27.56054549365441,\n        27.509309843858695,\n        27.666855553243604,\n        27.779841453634546,\n        27.88268384815831,\n        27.84613530266012,\n        27.56857298610603,\n        27.436708452858028,\n        27.690406210260534,\n        27.80576215310736,\n        27.94844489304569,\n        27.555696289176986,\n        27.49518468786061,\n        27.30887267045864,\n        27.65823685813334,\n        27.795658552046312,\n        27.84718627053105,\n        28.03475577515182,\n        28.123678675691668,\n        27.782337695794425,\n        27.647428218202897,\n        27.773842741795264,\n        27.664625970767176,\n        27.717371085297504,\n        27.820842333876776,\n        27.886910276000044,\n        27.67879681599845,\n        27.856275422064268,\n        27.666118967770945,\n        27.52995356108019,\n        27.921731524144594,\n    
    28.183083185570624,\n        27.957470952843025,\n        27.587602366708307,\n        27.949355809872575,\n        27.725478470494977,\n        28.09695054560221,\n        28.01324027963884,\n        28.34278248512147,\n        28.29519093928441,\n        28.323871537031692,\n        28.298514673695113,\n        28.03046779933285,\n        27.908296028458313,\n        27.874555121474156,\n        28.104783216768244,\n        27.989533303201764,\n        28.18293486241501,\n        28.265989077458293,\n        28.1936863592789,\n        28.211384158399472,\n        28.277685874501305,\n        28.43902878665981,\n        28.394343966300845,\n        27.948071909091976,\n        27.874691269464872,\n        27.97592787506554,\n        28.0369362749467,\n        28.129641631651474,\n        27.993984226912985,\n        28.456744971779454,\n        28.425155719615045,\n        27.96166598726818,\n        27.919489044109586,\n        27.920426093919776,\n        28.11204344835511,\n        28.21860156535477,\n        27.649253334128648,\n        28.014997439454937,\n        27.99435921923767,\n        28.433080677004,\n        28.54148067098438,\n        27.986436713197698,\n        28.101760935550153,\n        28.1823517776464,\n        27.927953358456598,\n        27.991501748024632,\n        28.3184374205418,\n        28.45769875602858,\n        28.485481225358303,\n        28.713410410852916,\n        28.69155496238247,\n        28.203763616972473,\n        28.35742042533615,\n        28.32070243910386,\n        28.282365709154472,\n        28.654908337113493,\n        28.436792805917328,\n        28.323794266437623,\n        28.349849915661416,\n        28.563325900698533,\n        28.3964423927055,\n        27.999070340086462,\n        28.223828107103365,\n        28.552986782212347,\n        28.66736301439376,\n        28.466085021845913,\n        28.14152193195285,\n        28.351162463837085,\n        28.509448990764504,\n        28.84078296983855,\n       
 28.725626813672463,\n        28.46749573727307,\n        28.28353353777904,\n        28.579859980043373,\n        28.61218076197951,\n        28.8397350542685,\n        28.83459942653535,\n        28.719423934561902,\n        28.603065060192062,\n        28.647015170293727,\n        28.73738284763152,\n        28.928152786004564,\n        28.582427485782233,\n        29.03529692098718,\n        28.73678603469589,\n        28.819575148514254,\n        28.6788709192657,\n        28.977811061508984,\n        29.079547860500224,\n        29.009519266116975,\n        29.344781263605018,\n        29.325014609092342,\n        29.308578753820004,\n        29.011781734181895,\n        29.06632146028385,\n        29.22274054160181,\n        29.129540845524915,\n        29.419685698561814,\n        29.185557636800436,\n        29.046305053678388,\n        28.835390253541323,\n        29.015800313347864,\n        28.76632283892701,\n        28.969629288765034,\n        29.204198480123708,\n        29.091265612648524,\n        29.166360745395743,\n        29.09543427923792,\n        28.91612532498032,\n        28.800235727497377,\n        29.355532172885088,\n        29.189897734296764,\n        28.957793824713974,\n        28.715044508314012,\n        28.81859786563202,\n        29.227201215197347,\n        29.258058096644326,\n        29.028390802589403,\n        29.047025337093338,\n        29.070979129897157,\n        28.942802790769022,\n        28.93838466080368,\n        29.002913553600067,\n        29.07732899778349,\n        28.96565660498648,\n        29.200875350542827,\n        29.009278323769383,\n        29.12714672500619,\n        28.983257155892655,\n        29.19518510998126,\n        29.239778063152716,\n        28.81293857549438,\n        28.66099480786196,\n        29.083342559328152,\n        28.8506884441203,\n        28.973719323811128,\n        29.621189460278032,\n        29.044503544262422,\n        28.977521174584176,\n        28.882184962895835,\n   
     29.017912422496153,\n        29.389087465804987,\n        29.20217224528279,\n        29.213157159326798,\n        29.198345992002295,\n        29.124039136986248,\n        29.248888560702525,\n        29.370847767031808,\n        29.102733355965444,\n        28.85386383987503,\n        28.828442831026468,\n        28.71064800839733\n      ]\n    }\n  }\n}"
  },
  {
    "path": "data/playground_result.json",
    "content": "{\n  \"G1JoystickFlatTerrain\": {\n    \"PPO\": {\n      \"time\": [\n        0.0,\n        213.46350627565,\n        426.9270125513,\n        640.3905188269499,\n        853.8540251026,\n        1067.31753137825,\n        1280.7810376538998,\n        1494.2445439295498,\n        1707.7080502052,\n        1921.1715564808499,\n        2134.6350627565,\n        2348.0985690321495,\n        2561.5620753077997,\n        2775.02558158345,\n        2988.4890878590995,\n        3201.9525941347497,\n        3415.4161004104,\n        3628.8796066860496,\n        3842.3431129616997,\n        4055.8066192373494,\n        4269.270125513\n      ],\n      \"env_step\": [\n        0,\n        0.0,\n        10649600.0,\n        21299200.0,\n        31948800.0,\n        42598400.0,\n        53248000.0,\n        63897600.0,\n        74547200.0,\n        85196800.0,\n        95846400.0,\n        106496000.0,\n        117145600.0,\n        127795200.0,\n        138444800.0,\n        149094400.0,\n        159744000.0,\n        170393600.0,\n        181043200.0,\n        191692800.0,\n        202342400.0\n      ],\n      \"return\": [\n        0.0,\n        -3.5786768595377603,\n        -1.73488183816274,\n        -1.4672497908274333,\n        -1.0114571849505107,\n        2.8664944966634116,\n        8.109216849009195,\n        12.15112050374349,\n        14.619683265686035,\n        16.250600814819336,\n        17.415241877237957,\n        19.337273279825848,\n        19.362972259521484,\n        20.73096466064453,\n        20.419541041056316,\n        20.557310740152996,\n        21.584625244140625,\n        22.650484720865887,\n        23.268614451090496,\n        23.964858373006184,\n        23.381336212158203\n      ],\n      \"return_min\": [\n        0.0,\n        -3.633624033793396,\n        -1.8136003478544724,\n        -1.5840279132431287,\n        -1.1807665772933178,\n        1.9221867482777824,\n        7.587271126532166,\n        10.834839595162432,\n      
  13.544779870637864,\n        15.040593747413778,\n        16.337773770375044,\n        18.42818295768993,\n        18.613121548035885,\n        19.99613895549954,\n        18.939780920828404,\n        19.84353432250872,\n        20.260494181371044,\n        22.000288411591143,\n        22.748610923303467,\n        22.875291623275718,\n        22.801813358049884\n      ],\n      \"return_max\": [\n        0.0,\n        -3.5237296852821247,\n        -1.6561633284710076,\n        -1.350471668411738,\n        -0.8421477926077037,\n        3.810802245049041,\n        8.631162571486225,\n        13.467401412324548,\n        15.694586660734206,\n        17.460607882224895,\n        18.49270998410087,\n        20.246363601961765,\n        20.112822971007084,\n        21.465790365789523,\n        21.899301161284228,\n        21.27108715779727,\n        22.908756306910206,\n        23.30068103014063,\n        23.788617978877525,\n        25.05442512273665,\n        23.960859066266522\n      ]\n    },\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        433.3400245197778,\n        866.6800490395556,\n        1300.0200735593332,\n        1733.3600980791111,\n        2166.700122598889,\n        2600.0401471186665,\n        3033.3801716384446,\n        3466.7201961582223,\n        3900.060220678\n      ],\n      \"env_step\": [\n        0,\n        5120000.0,\n        10240000.0,\n        15360000.0,\n        20480000.0,\n        25600000.0,\n        30720000.0,\n        35840000.0,\n        40960000.0,\n        46080000.0\n      ],\n      \"return\": [\n        0.0,\n        4.487485567728679,\n        16.74320411682129,\n        23.278841654459637,\n        27.637049357096355,\n        29.066091537475586,\n        30.403479894002277,\n        30.677172978719074,\n        30.934303283691406,\n        31.36344337463379\n      ],\n      \"return_min\": [\n        0.0,\n        2.9314155707645613,\n        16.25554819781311,\n        22.555713738285238,\n        
27.491040766131707,\n        28.865730935672705,\n        30.389128000549142,\n        30.534967621827455,\n        30.650468173578087,\n        31.16558190517613\n      ],\n      \"return_max\": [\n        0.0,\n        6.043555564692796,\n        17.23086003582947,\n        24.001969570634035,\n        27.783057948061003,\n        29.266452139278467,\n        30.417831787455412,\n        30.819378335610693,\n        31.218138393804725,\n        31.56130484409145\n      ]\n    }\n  },\n  \"G1JoystickRoughTerrain\": {\n    \"PPO\": {\n      \"time\": [\n        0.0,\n        430.02984852145,\n        860.0596970429,\n        1290.08954556435,\n        1720.1193940858,\n        2150.14924260725,\n        2580.1790911287,\n        3010.2089396501497,\n        3440.2387881716,\n        3870.26863669305,\n        4300.2984852145,\n        4730.32833373595,\n        5160.3581822574,\n        5590.38803077885,\n        6020.417879300299,\n        6450.44772782175,\n        6880.4775763432,\n        7310.507424864651,\n        7740.5372733861,\n        8170.56712190755,\n        8600.596970429\n      ],\n      \"env_step\": [\n        0,\n        0.0,\n        10649600.0,\n        21299200.0,\n        31948800.0,\n        42598400.0,\n        53248000.0,\n        63897600.0,\n        74547200.0,\n        85196800.0,\n        95846400.0,\n        106496000.0,\n        117145600.0,\n        127795200.0,\n        138444800.0,\n        149094400.0,\n        159744000.0,\n        170393600.0,\n        181043200.0,\n        191692800.0,\n        202342400.0\n      ],\n      \"return\": [\n        0.0,\n        -3.6238108476003013,\n        -1.7691378990809123,\n        -1.5785710016886394,\n        -1.4046337604522705,\n        -1.2989702622095745,\n        0.9209129015604655,\n        5.599383354187012,\n        9.098564147949219,\n        10.36640707651774,\n        11.69961961110433,\n        12.815311431884766,\n        13.983243624369303,\n        14.600811004638672,\n     
   15.401162465413412,\n        16.72524897257487,\n        17.59372901916504,\n        17.17877769470215,\n        18.038379033406574,\n        19.438427607218426,\n        19.256649017333984\n      ],\n      \"return_min\": [\n        0.0,\n        -3.6888348904174544,\n        -1.8035371110148877,\n        -1.6518320374166224,\n        -1.4434835026307684,\n        -1.3658170492283774,\n        -0.3835055360183316,\n        3.5933339888782276,\n        8.425358069779085,\n        9.000362013173968,\n        10.886633777692076,\n        12.64601019994388,\n        13.083222241676461,\n        14.4930801524455,\n        14.552314554875068,\n        15.949481202226977,\n        16.67476213765229,\n        16.62720103040347,\n        17.008479664699678,\n        18.59841010178635,\n        19.188480973348607\n      ],\n      \"return_max\": [\n        0.0,\n        -3.558786804783148,\n        -1.734738687146937,\n        -1.5053099659606564,\n        -1.3657840182737726,\n        -1.2321234751907715,\n        2.2253313391392626,\n        7.605432719495796,\n        9.771770226119353,\n        11.732452139861513,\n        12.512605444516584,\n        12.98461266382565,\n        14.883265007062144,\n        14.708541856831843,\n        16.250010375951756,\n        17.501016742922765,\n        18.512695900677787,\n        17.730354359000827,\n        19.06827840211347,\n        20.278445112650502,\n        19.32481706131936\n      ]\n    },\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        599.3992147375556,\n        1198.798429475111,\n        1798.1976442126668,\n        2397.596858950222,\n        2996.996073687778,\n        3596.3952884253335,\n        4195.794503162889,\n        4795.193717900444,\n        5394.592932638\n      ],\n      \"env_step\": [\n        0,\n        5120000.0,\n        10240000.0,\n        15360000.0,\n        20480000.0,\n        25600000.0,\n        30720000.0,\n        35840000.0,\n        40960000.0,\n        46080000.0\n  
    ],\n      \"return\": [\n        0.0,\n        3.2808993657430015,\n        13.874738057454428,\n        20.431355794270832,\n        24.597570419311523,\n        26.355728149414062,\n        27.401456197102863,\n        27.81001917521159,\n        28.1074701944987,\n        27.748165130615234\n      ],\n      \"return_min\": [\n        0.0,\n        2.8048553516036847,\n        13.351663266422506,\n        19.902439898270348,\n        24.322450701046584,\n        25.83280810079747,\n        27.064759880116778,\n        27.480682414578805,\n        28.029204257881428,\n        27.266035552178657\n      ],\n      \"return_max\": [\n        0.0,\n        3.756943379882318,\n        14.39781284848635,\n        20.960271690271316,\n        24.872690137576463,\n        26.878648198030653,\n        27.73815251408895,\n        28.139355935844375,\n        28.18573613111597,\n        28.23029470905181\n      ]\n    }\n  },\n  \"T1JoystickFlatTerrain\": {\n    \"PPO\": {\n      \"time\": [\n        0.0,\n        156.0373824072,\n        312.0747648144,\n        468.1121472216,\n        624.1495296288,\n        780.186912036,\n        936.2242944432,\n        1092.2616768504,\n        1248.2990592576,\n        1404.3364416647998,\n        1560.373824072,\n        1716.4112064792,\n        1872.4485888864,\n        2028.4859712936,\n        2184.5233537008,\n        2340.5607361079997,\n        2496.5981185152,\n        2652.6355009224,\n        2808.6728833295997,\n        2964.7102657368,\n        3120.747648144\n      ],\n      \"env_step\": [\n        0,\n        0.0,\n        10649600.0,\n        21299200.0,\n        31948800.0,\n        42598400.0,\n        53248000.0,\n        63897600.0,\n        74547200.0,\n        85196800.0,\n        95846400.0,\n        106496000.0,\n        117145600.0,\n        127795200.0,\n        138444800.0,\n        149094400.0,\n        159744000.0,\n        170393600.0,\n        181043200.0,\n        191692800.0,\n        
202342400.0\n      ],\n      \"return\": [\n        0.0,\n        0.327275812625885,\n        0.8326747417449951,\n        2.144671678543091,\n        9.289753913879395,\n        14.297985553741455,\n        18.105278968811035,\n        20.841957092285156,\n        25.58073329925537,\n        28.61104106903076,\n        29.105724334716797,\n        31.17124652862549,\n        31.661392211914062,\n        33.493133544921875,\n        31.919815063476562,\n        34.29528045654297,\n        34.032718658447266,\n        33.67773628234863,\n        35.22429275512695,\n        35.6923885345459,\n        34.44869613647461\n      ],\n      \"return_min\": [\n        0.0,\n        0.2993045151233673,\n        0.763859748840332,\n        1.678368091583252,\n        8.580310821533203,\n        14.120144844055176,\n        17.573272705078125,\n        18.319969177246094,\n        23.739049911499023,\n        26.93745994567871,\n        28.4345703125,\n        30.444316864013672,\n        29.525779724121094,\n        32.53502655029297,\n        31.550701141357422,\n        33.862396240234375,\n        33.20738220214844,\n        33.50837326049805,\n        34.951812744140625,\n        35.534149169921875,\n        33.58796691894531\n      ],\n      \"return_max\": [\n        0.0,\n        0.3552471101284027,\n        0.9014897346496582,\n        2.6109752655029297,\n        9.999197006225586,\n        14.475826263427734,\n        18.637285232543945,\n        23.36394500732422,\n        27.42241668701172,\n        30.284622192382812,\n        29.776878356933594,\n        31.898176193237305,\n        33.79700469970703,\n        34.45124053955078,\n        32.2889289855957,\n        34.72816467285156,\n        34.858055114746094,\n        33.84709930419922,\n        35.49677276611328,\n        35.85062789916992,\n        35.309425354003906\n      ]\n    },\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        336.3940926118889,\n        672.7881852237778,\n        
1009.1822778356667,\n        1345.5763704475555,\n        1681.9704630594445,\n        2018.3645556713334,\n        2354.758648283222,\n        2691.152740895111,\n        3027.546833507\n      ],\n      \"env_step\": [\n        0,\n        5120000.0,\n        10240000.0,\n        15360000.0,\n        20480000.0,\n        25600000.0,\n        30720000.0,\n        35840000.0,\n        40960000.0,\n        46080000.0\n      ],\n      \"return\": [\n        0.0,\n        17.497735182444256,\n        23.35838508605957,\n        29.219560623168945,\n        32.47132873535156,\n        33.60111618041992,\n        34.1003163655599,\n        34.494296391805015,\n        34.647412618001304,\n        34.80020268758138\n      ],\n      \"return_min\": [\n        0.0,\n        14.391559420284299,\n        20.08936997440851,\n        27.716625485294703,\n        29.593702042397194,\n        30.287767241900532,\n        30.65928257092611,\n        31.33219802426975,\n        31.594485160912818,\n        31.565288339653822\n      ],\n      \"return_max\": [\n        0.0,\n        20.603910944604213,\n        26.62740019771063,\n        30.722495761043188,\n        35.34895542830593,\n        36.91446511893931,\n        37.54135016019369,\n        37.65639475934028,\n        37.70034007508979,\n        38.03511703550894\n      ]\n    }\n  },\n  \"T1JoystickRoughTerrain\": {\n    \"PPO\": {\n      \"time\": [\n        0.0,\n        224.59065696369998,\n        449.18131392739997,\n        673.7719708911,\n        898.3626278547999,\n        1122.9532848185,\n        1347.5439417822,\n        1572.1345987459,\n        1796.7252557095999,\n        2021.3159126733,\n        2245.906569637,\n        2470.4972266007,\n        2695.0878835644,\n        2919.6785405281,\n        3144.2691974918,\n        3368.8598544554998,\n        3593.4505114191998,\n        3818.0411683828997,\n        4042.6318253466,\n        4267.2224823103,\n        4491.813139274\n      ],\n      \"env_step\": 
[\n        0,\n        0.0,\n        10649600.0,\n        21299200.0,\n        31948800.0,\n        42598400.0,\n        53248000.0,\n        63897600.0,\n        74547200.0,\n        85196800.0,\n        95846400.0,\n        106496000.0,\n        117145600.0,\n        127795200.0,\n        138444800.0,\n        149094400.0,\n        159744000.0,\n        170393600.0,\n        181043200.0,\n        191692800.0,\n        202342400.0\n      ],\n      \"return\": [\n        0.0,\n        0.35334428151448566,\n        0.8989856839179993,\n        1.4808813730875652,\n        2.67240309715271,\n        4.544658501942952,\n        5.027948538462321,\n        6.222784360249837,\n        7.626932621002197,\n        8.986979166666666,\n        9.412737210591635,\n        12.002973874409994,\n        12.44714609781901,\n        14.011113484700521,\n        15.020503044128418,\n        16.263832092285156,\n        17.344195048014324,\n        17.939491907755535,\n        19.251351674397785,\n        18.859820048014324,\n        18.648751576741535\n      ],\n      \"return_min\": [\n        0.0,\n        0.31176972443433465,\n        0.8396267360691291,\n        1.2663937149181501,\n        2.530283852092835,\n        4.464613059374979,\n        4.9106046317547305,\n        5.38566465616166,\n        5.950525768996994,\n        8.09050058904016,\n        7.377376393544287,\n        8.834169037903324,\n        9.362149864454587,\n        10.808991493684232,\n        12.488085054344273,\n        13.79303938444693,\n        16.140531526516416,\n        16.37832635466966,\n        18.58600524039474,\n        18.461735559677493,\n        17.944450849685\n      ],\n      \"return_max\": [\n        0.0,\n        0.39491883859463667,\n        0.9583446317668695,\n        1.6953690312569802,\n        2.814522342212585,\n        4.624703944510926,\n        5.145292445169912,\n        7.0599040643380135,\n        9.303339473007401,\n        9.883457744293173,\n        
11.448098027638983,\n        15.171778710916664,\n        15.532142331183433,\n        17.21323547571681,\n        17.55292103391256,\n        18.734624800123385,\n        18.547858569512233,\n        19.50065746084141,\n        19.91669810840083,\n        19.257904536351155,\n        19.35305230379807\n      ]\n    },\n    \"FastTD3\": {\n      \"time\": [\n        0.0,\n        368.47968223911107,\n        736.9593644782221,\n        1105.4390467173334,\n        1473.9187289564443,\n        1842.3984111955554,\n        2210.878093434667,\n        2579.3577756737777,\n        2947.8374579128886,\n        3316.317140152\n      ],\n      \"env_step\": [\n        0,\n        5120000.0,\n        10240000.0,\n        15360000.0,\n        20480000.0,\n        25600000.0,\n        30720000.0,\n        35840000.0,\n        40960000.0,\n        46080000.0\n      ],\n      \"return\": [\n        0.0,\n        9.457830508550009,\n        15.376138687133789,\n        18.36476771036784,\n        23.326606432596844,\n        26.596385320027668,\n        28.654596010843914,\n        29.314904530843098,\n        30.264007886250813,\n        30.40758482615153\n      ],\n      \"return_min\": [\n        0.0,\n        6.853786206432801,\n        13.629757692263793,\n        15.874774229828427,\n        21.370459278573758,\n        25.307348278642664,\n        26.545756306321362,\n        26.61948520643785,\n        27.26373076116701,\n        27.204618859278554\n      ],\n      \"return_max\": [\n        0.0,\n        12.061874810667216,\n        17.122519682003784,\n        20.854761190907254,\n        25.28275358661993,\n        27.885422361412672,\n        30.763435715366466,\n        32.010323855248345,\n        33.264285011334614,\n        33.61055079302451\n      ]\n    }\n  }\n}"
  },
  {
    "path": "fast_td3/__init__.py",
    "content": "\"\"\"\nFast TD3 is a high-performance implementation of Twin Delayed Deep Deterministic Policy Gradient (TD3)\nwith distributional critics for reinforcement learning.\n\"\"\"\n\n# Core model components\nfrom fast_td3.fast_td3 import Actor, Critic, DistributionalQNetwork\nfrom fast_td3.fast_td3_utils import EmpiricalNormalization, SimpleReplayBuffer\nfrom fast_td3.fast_td3_deploy import Policy, load_policy\n\n__all__ = [\n    # Core model components\n    \"Actor\",\n    \"Critic\",\n    \"DistributionalQNetwork\",\n    \"EmpiricalNormalization\",\n    \"SimpleReplayBuffer\",\n    \"Policy\",\n    \"load_policy\",\n]\n"
  },
  {
    "path": "fast_td3/environments/humanoid_bench_env.py",
    "content": "from __future__ import annotations\n\nimport gymnasium as gym\n\nimport humanoid_bench\nfrom gymnasium.wrappers import TimeLimit\nfrom stable_baselines3.common.vec_env import SubprocVecEnv\nimport numpy as np\nimport torch\nfrom loguru import logger as log\n\n# Disable all logging below CRITICAL level\nlog.remove()\nlog.add(lambda msg: False, level=\"CRITICAL\")\n\n\ndef make_env(env_name, rank, render_mode=None, seed=0):\n    \"\"\"\n    Utility function for multiprocessed env.\n\n    :param rank: (int) index of the subprocess\n    :param seed: (int) the inital seed for RNG\n    \"\"\"\n\n    if env_name in [\n        \"h1hand-push-v0\",\n        \"h1-push-v0\",\n        \"h1hand-cube-v0\",\n        \"h1cube-v0\",\n        \"h1hand-basketball-v0\",\n        \"h1-basketball-v0\",\n        \"h1hand-kitchen-v0\",\n        \"h1-kitchen-v0\",\n    ]:\n        max_episode_steps = 500\n    else:\n        max_episode_steps = 1000\n\n    def _init():\n        import humanoid_bench\n\n        env = gym.make(env_name, render_mode=render_mode)\n        env = TimeLimit(env, max_episode_steps=max_episode_steps)\n        env.unwrapped.seed(seed + rank)\n\n        return env\n\n    return _init\n\n\nclass HumanoidBenchEnv:\n    \"\"\"Wraps HumanoidBench environment to support parallel environments.\"\"\"\n\n    def __init__(self, env_name, num_envs=1, render_mode=None, device=None):\n        # NOTE: HumanoidBench action space is already normalized to [-1, 1]\n        device = device or torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\n        self.sim_device = device\n        self.num_envs = num_envs\n\n        # Create the base environment\n        self.envs = SubprocVecEnv(\n            [make_env(env_name, i, render_mode=render_mode) for i in range(num_envs)]\n        )\n\n        if env_name in [\n            \"h1hand-push-v0\",\n            \"h1-push-v0\",\n            \"h1hand-cube-v0\",\n            \"h1cube-v0\",\n            
\"h1hand-basketball-v0\",\n            \"h1-basketball-v0\",\n            \"h1hand-kitchen-v0\",\n            \"h1-kitchen-v0\",\n        ]:\n            self.max_episode_steps = 500\n        else:\n            self.max_episode_steps = 1000\n\n        # For compatibility with MuJoCo Playground\n        self.asymmetric_obs = False  # For comptatibility with MuJoCo Playground\n        self.num_obs = self.envs.observation_space.shape[-1]\n        self.num_actions = self.envs.action_space.shape[-1]\n\n    def reset(self):\n        \"\"\"Reset the environment.\"\"\"\n        observations = self.envs.reset()\n        observations = torch.from_numpy(observations).to(\n            device=self.sim_device, dtype=torch.float\n        )\n        return observations\n\n    def render(self):\n        assert (\n            self.num_envs == 1\n        ), \"Currently only supports single environment rendering\"\n        return self.envs.render()\n\n    def step(self, actions):\n        assert isinstance(actions, torch.Tensor)\n        actions = actions.cpu().numpy()\n\n        observations, rewards, dones, raw_infos = self.envs.step(actions)\n\n        # This will be used for getting 'true' next observations\n        infos = dict()\n        infos[\"observations\"] = {\"raw\": {\"obs\": observations.copy()}}\n        truncateds = np.zeros_like(dones)\n        for i in range(self.num_envs):\n            if raw_infos[i].get(\"TimeLimit.truncated\", False):\n                truncateds[i] = True\n                infos[\"observations\"][\"raw\"][\"obs\"][i] = raw_infos[i][\n                    \"terminal_observation\"\n                ]\n\n        observations = torch.from_numpy(observations).to(\n            device=self.sim_device, dtype=torch.float\n        )\n        rewards = torch.from_numpy(rewards).to(\n            device=self.sim_device, dtype=torch.float\n        )\n        dones = torch.from_numpy(dones).to(device=self.sim_device)\n        truncateds = 
torch.from_numpy(truncateds).to(device=self.sim_device)\n        infos[\"observations\"][\"raw\"][\"obs\"] = torch.from_numpy(\n            infos[\"observations\"][\"raw\"][\"obs\"]\n        ).to(device=self.sim_device, dtype=torch.float)\n        infos[\"time_outs\"] = truncateds\n\n        return observations, rewards, dones, infos\n"
  },
  {
    "path": "fast_td3/environments/isaaclab_env.py",
    "content": "from typing import Optional\n\nimport gymnasium as gym\nimport torch\n\n\nclass IsaacLabEnv:\n    \"\"\"Wrapper for IsaacLab environments to be compatible with MuJoCo Playground\"\"\"\n\n    def __init__(\n        self,\n        task_name: str,\n        device: str,\n        num_envs: int,\n        seed: int,\n        action_bounds: Optional[float] = None,\n    ):\n        from isaaclab.app import AppLauncher\n\n        app_launcher = AppLauncher(headless=True, device=device)\n        simulation_app = app_launcher.app\n\n        import isaaclab_tasks\n        from isaaclab_tasks.utils.parse_cfg import parse_env_cfg\n\n        env_cfg = parse_env_cfg(\n            task_name,\n            device=device,\n            num_envs=num_envs,\n        )\n        env_cfg.seed = seed\n        self.seed = seed\n        self.envs = gym.make(task_name, cfg=env_cfg, render_mode=None)\n\n        self.num_envs = self.envs.unwrapped.num_envs\n        self.max_episode_steps = self.envs.unwrapped.max_episode_length\n        self.action_bounds = action_bounds\n        self.num_obs = self.envs.unwrapped.single_observation_space[\"policy\"].shape[0]\n        self.asymmetric_obs = \"critic\" in self.envs.unwrapped.single_observation_space\n        if self.asymmetric_obs:\n            self.num_privileged_obs = self.envs.unwrapped.single_observation_space[\n                \"critic\"\n            ].shape[0]\n        else:\n            self.num_privileged_obs = 0\n        self.num_actions = self.envs.unwrapped.single_action_space.shape[0]\n\n    def reset(self, random_start_init: bool = True) -> torch.Tensor:\n        obs_dict, _ = self.envs.reset()\n        # NOTE: decorrelate episode horizons like RSL‑RL\n        if random_start_init:\n            self.envs.unwrapped.episode_length_buf = torch.randint_like(\n                self.envs.unwrapped.episode_length_buf, high=int(self.max_episode_steps)\n            )\n        return obs_dict[\"policy\"]\n\n    def 
reset_with_critic_obs(self) -> tuple[torch.Tensor, torch.Tensor]:\n        obs_dict, _ = self.envs.reset()\n        return obs_dict[\"policy\"], obs_dict[\"critic\"]\n\n    def step(\n        self, actions: torch.Tensor\n    ) -> tuple[torch.Tensor, torch.Tensor, torch.Tensor, dict]:\n        if self.action_bounds is not None:\n            actions = torch.clamp(actions, -1.0, 1.0) * self.action_bounds\n        obs_dict, rew, terminations, truncations, infos = self.envs.step(actions)\n        dones = (terminations | truncations).to(dtype=torch.long)\n        obs = obs_dict[\"policy\"]\n        critic_obs = obs_dict[\"critic\"] if self.asymmetric_obs else None\n        info_ret = {\"time_outs\": truncations, \"observations\": {\"critic\": critic_obs}}\n        # NOTE: There's really no way to get the raw observations from IsaacLab\n        # We just use the 'reset_obs' as next_obs, unfortunately.\n        # See https://github.com/isaac-sim/IsaacLab/issues/1362\n        info_ret[\"observations\"][\"raw\"] = {\n            \"obs\": obs,\n            \"critic_obs\": critic_obs,\n        }\n        return obs, rew, dones, info_ret\n\n    def render(self):\n        raise NotImplementedError(\n            \"We don't support rendering for IsaacLab environments\"\n        )\n"
  },
  {
    "path": "fast_td3/environments/mtbench_env.py",
    "content": "from __future__ import annotations\n\nimport torch\nfrom omegaconf import OmegaConf\n\nimport isaacgym\nimport isaacgymenvs\n\n\nclass MTBenchEnv:\n    def __init__(\n        self,\n        task_name: str,\n        device_id: int,\n        num_envs: int,\n        seed: int,\n    ):\n        # NOTE: Currently, we only support Meta-World-v2 MT-10/MT-50 in MTBench\n        task_config = MTBENCH_MW2_CONFIG.copy()\n        if task_name == \"meta-world-v2-mt10\":\n            # MT-10 Setup\n            assert num_envs == 4096, \"MT-10 only supports 4096 environments (for now)\"\n            self.num_tasks = 10\n            task_config[\"env\"][\"tasks\"] = [4, 16, 17, 18, 28, 31, 38, 40, 48, 49]\n            task_config[\"env\"][\"taskEnvCount\"] = [410] * 6 + [409] * 4\n        elif task_name == \"meta-world-v2-mt50\":\n            # MT-50 Setup\n            self.num_tasks = 50\n            assert num_envs == 8192, \"MT-50 only supports 8192 environments (for now)\"\n            task_config[\"env\"][\"tasks\"] = list(range(50))\n            task_config[\"env\"][\"taskEnvCount\"] = [164] * 42 + [163] * 8  # 6888 + 1304\n        else:\n            raise ValueError(f\"Unsupported task name: {task_name}\")\n        task_config[\"env\"][\"numEnvs\"] = num_envs\n        task_config[\"env\"][\"numObservations\"] = 39 + self.num_tasks\n        task_config[\"env\"][\"seed\"] = seed\n\n        # Convert dictionary to OmegaConf object\n        env_cfg = {\"task\": task_config}\n        env_cfg = OmegaConf.create(env_cfg)\n\n        self.env = isaacgymenvs.make(\n            task=env_cfg.task.name,\n            num_envs=num_envs,\n            sim_device=f\"cuda:{device_id}\",\n            rl_device=f\"cuda:{device_id}\",\n            seed=seed,\n            headless=True,\n            cfg=env_cfg,\n        )\n\n        self.num_envs = num_envs\n        self.asymmetric_obs = False\n        self.num_obs = self.env.observation_space.shape[0]\n        assert (\n         
   self.num_obs == 39 + self.num_tasks\n        ), \"MTBench observation space is 39 + num_tasks (one-hot vector)\"\n        self.num_privileged_obs = 0\n        self.num_actions = self.env.action_space.shape[0]\n        self.max_episode_steps = self.env.max_episode_length\n\n    def reset(self) -> torch.Tensor:\n        \"\"\"Reset the environment.\"\"\"\n        # TODO: Check if we need no_grad and detach here\n        with torch.no_grad():  # do we need this?\n            self.env.reset_idx(torch.arange(self.num_envs, device=self.env.device))\n            self.env.cumulatives[\"rewards\"][:] = 0\n            self.env.cumulatives[\"success\"][:] = 0\n            obs_dict = self.env.reset()\n            return obs_dict[\"obs\"].detach()\n\n    def step(\n        self, actions: torch.Tensor\n    ) -> tuple[torch.Tensor, torch.Tensor, torch.Tensor, dict]:\n        \"\"\"Step the environment.\"\"\"\n        assert isinstance(actions, torch.Tensor)\n\n        # TODO: Check if we need no_grad and detach here\n        with torch.no_grad():\n            obs_dict, rew, dones, infos = self.env.step(actions.detach())\n            truncations = infos[\"time_outs\"]\n            info_ret = {\"time_outs\": truncations.detach()}\n            if \"episode\" in infos:\n                info_ret[\"episode\"] = infos[\"episode\"]\n            # NOTE: There's really no way to get the raw observations from IsaacGym\n            # We just use the 'reset_obs' as next_obs, unfortunately.\n            info_ret[\"observations\"] = {\"raw\": {\"obs\": obs_dict[\"obs\"].detach()}}\n            return obs_dict[\"obs\"].detach(), rew.detach(), dones.detach(), info_ret\n\n    def render(self):\n        raise NotImplementedError(\n            \"We don't support rendering for MTBench environments\"\n        )\n\n\nMTBENCH_MW2_CONFIG = {\n    \"name\": \"meta-world-v2\",\n    \"physics_engine\": \"physx\",\n    \"env\": {\n        \"numEnvs\": 1,\n        \"envSpacing\": 1.5,\n        
\"episodeLength\": 150,\n        \"enableDebugVis\": False,\n        \"clipObservations\": 5.0,\n        \"clipActions\": 1.0,\n        \"aggregateMode\": 3,\n        \"actionScale\": 0.01,\n        \"resetNoise\": 0.15,\n        \"tasks\": [0],\n        \"taskEnvCount\": [4096],\n        \"init_at_random_progress\": True,\n        \"exemptedInitAtRandomProgressTasks\": [],\n        \"taskEmbedding\": True,\n        \"taskEmbeddingType\": \"one_hot\",\n        \"seed\": 42,\n        \"cameraRenderingInterval\": 5000,\n        \"cameraWidth\": 1024,\n        \"cameraHeight\": 1024,\n        \"sparse_reward\": False,\n        \"termination_on_success\": False,\n        \"reward_scale\": 1.0,\n        \"fixed\": False,\n        \"numObservations\": None,\n        \"numActions\": 4,\n    },\n    \"enableCameraSensors\": False,\n    \"sim\": {\n        \"dt\": 0.01667,\n        \"substeps\": 2,\n        \"up_axis\": \"z\",\n        \"use_gpu_pipeline\": True,\n        \"gravity\": [0.0, 0.0, -9.81],\n        \"physx\": {\n            \"num_threads\": 4,\n            \"solver_type\": 1,\n            \"use_gpu\": True,\n            \"num_position_iterations\": 8,\n            \"num_velocity_iterations\": 1,\n            \"contact_offset\": 0.005,\n            \"rest_offset\": 0.0,\n            \"bounce_threshold_velocity\": 0.2,\n            \"max_depenetration_velocity\": 1000.0,\n            \"default_buffer_size_multiplier\": 10.0,\n            \"max_gpu_contact_pairs\": 1048576,\n            \"num_subscenes\": 4,\n            \"contact_collection\": 0,\n        },\n    },\n    \"task\": {\"randomize\": False},\n}\n"
  },
  {
    "path": "fast_td3/environments/mujoco_playground_env.py",
    "content": "from mujoco_playground import registry\nfrom mujoco_playground import wrapper_torch\n\nimport jax\nimport mujoco\n\n\nclass PlaygroundEvalEnvWrapper:\n    def __init__(\n        self,\n        eval_env,\n        max_episode_steps,\n        env_name,\n        num_eval_envs,\n        seed,\n        device_rank=None,\n    ):\n        \"\"\"\n        Wrapper used for evaluation / rendering environments.\n        Note that this is different from training environments that are\n        wrapped with RSLRLBraxWrapper.\n        \"\"\"\n        self.env = eval_env\n        self.env_name = env_name\n        self.num_envs = num_eval_envs\n        self.jit_reset = jax.jit(jax.vmap(self.env.reset))\n        self.jit_step = jax.jit(jax.vmap(self.env.step))\n\n        if isinstance(self.env.unwrapped.observation_size, dict):\n            self.asymmetric_obs = True\n        else:\n            self.asymmetric_obs = False\n\n        self.key = jax.random.PRNGKey(seed)\n\n        if device_rank is not None:\n            gpu_devices = jax.devices(\"gpu\")\n            self.key = jax.device_put(self.key, gpu_devices[device_rank])\n\n        self.key_reset = jax.random.split(self.key, num_eval_envs)\n        self.max_episode_steps = max_episode_steps\n\n    def reset(self):\n        self.state = self.jit_reset(self.key_reset)\n        if self.asymmetric_obs:\n            obs = wrapper_torch._jax_to_torch(self.state.obs[\"state\"])\n        else:\n            obs = wrapper_torch._jax_to_torch(self.state.obs)\n        return obs\n\n    def step(self, actions):\n        self.state = self.jit_step(self.state, wrapper_torch._torch_to_jax(actions))\n        if self.asymmetric_obs:\n            next_obs = wrapper_torch._jax_to_torch(self.state.obs[\"state\"])\n        else:\n            next_obs = wrapper_torch._jax_to_torch(self.state.obs)\n        rewards = wrapper_torch._jax_to_torch(self.state.reward)\n        dones = wrapper_torch._jax_to_torch(self.state.done)\n        
return next_obs, rewards, dones, None\n\n    def render_trajectory(self, trajectory):\n        scene_option = mujoco.MjvOption()\n        scene_option.flags[mujoco.mjtVisFlag.mjVIS_TRANSPARENT] = False\n        scene_option.flags[mujoco.mjtVisFlag.mjVIS_PERTFORCE] = False\n        scene_option.flags[mujoco.mjtVisFlag.mjVIS_CONTACTFORCE] = False\n\n        frames = self.env.render(\n            trajectory,\n            camera=\"track\" if \"Joystick\" in self.env_name else None,\n            height=480,\n            width=640,\n            scene_option=scene_option,\n        )\n        return frames\n\n\ndef make_env(\n    env_name,\n    seed,\n    num_envs,\n    num_eval_envs,\n    device_rank,\n    use_tuned_reward=False,\n    use_domain_randomization=False,\n    use_push_randomization=False,\n):\n    # Make training environment\n    train_env_cfg = registry.get_default_config(env_name)\n    is_humanoid_task = env_name in [\n        \"G1JoystickRoughTerrain\",\n        \"G1JoystickFlatTerrain\",\n        \"T1JoystickRoughTerrain\",\n        \"T1JoystickFlatTerrain\",\n    ]\n\n    if use_tuned_reward and is_humanoid_task:\n        # NOTE: Tuned reward for G1. 
Used for producing Figure 7 in the paper.\n        # Somehow it works reasonably for T1 as well.\n        # However, see `sim2real.md` for sim-to-real RL with Booster T1\n        train_env_cfg.reward_config.scales.energy = -5e-5\n        train_env_cfg.reward_config.scales.action_rate = -1e-1\n        train_env_cfg.reward_config.scales.torques = -1e-3\n        train_env_cfg.reward_config.scales.pose = -1.0\n        train_env_cfg.reward_config.scales.tracking_ang_vel = 1.25\n        train_env_cfg.reward_config.scales.tracking_lin_vel = 1.25\n        train_env_cfg.reward_config.scales.feet_phase = 1.0\n        train_env_cfg.reward_config.scales.ang_vel_xy = -0.3\n        train_env_cfg.reward_config.scales.orientation = -5.0\n\n    if is_humanoid_task and not use_push_randomization:\n        train_env_cfg.push_config.enable = False\n        train_env_cfg.push_config.magnitude_range = [0.0, 0.0]\n    randomizer = (\n        registry.get_domain_randomizer(env_name) if use_domain_randomization else None\n    )\n    raw_env = registry.load(env_name, config=train_env_cfg)\n    train_env = wrapper_torch.RSLRLBraxWrapper(\n        raw_env,\n        num_envs,\n        seed,\n        train_env_cfg.episode_length,\n        train_env_cfg.action_repeat,\n        randomization_fn=randomizer,\n        device_rank=device_rank,\n    )\n\n    # Make evaluation environment\n    eval_env_cfg = registry.get_default_config(env_name)\n    if is_humanoid_task and not use_push_randomization:\n        eval_env_cfg.push_config.enable = False\n        eval_env_cfg.push_config.magnitude_range = [0.0, 0.0]\n    eval_env = registry.load(env_name, config=eval_env_cfg)\n    eval_env = PlaygroundEvalEnvWrapper(\n        eval_env,\n        eval_env_cfg.episode_length,\n        env_name,\n        num_eval_envs,\n        seed,\n        device_rank=device_rank,\n    )\n\n    render_env_cfg = registry.get_default_config(env_name)\n    if is_humanoid_task and not use_push_randomization:\n        
render_env_cfg.push_config.enable = False\n        render_env_cfg.push_config.magnitude_range = [0.0, 0.0]\n    render_env = registry.load(env_name, config=render_env_cfg)\n    render_env = PlaygroundEvalEnvWrapper(\n        render_env,\n        render_env_cfg.episode_length,\n        env_name,\n        1,\n        seed,\n        device_rank=device_rank,\n    )\n\n    return train_env, eval_env, render_env\n"
  },
  {
    "path": "fast_td3/fast_td3.py",
    "content": "import torch\nimport torch.nn as nn\nimport torch.nn.functional as F\n\n\nclass DistributionalQNetwork(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        num_atoms: int,\n        v_min: float,\n        v_max: float,\n        hidden_dim: int,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.net = nn.Sequential(\n            nn.Linear(n_obs + n_act, hidden_dim, device=device),\n            nn.ReLU(),\n            nn.Linear(hidden_dim, hidden_dim // 2, device=device),\n            nn.ReLU(),\n            nn.Linear(hidden_dim // 2, hidden_dim // 4, device=device),\n            nn.ReLU(),\n            nn.Linear(hidden_dim // 4, num_atoms, device=device),\n        )\n        self.v_min = v_min\n        self.v_max = v_max\n        self.num_atoms = num_atoms\n\n    def forward(self, obs: torch.Tensor, actions: torch.Tensor) -> torch.Tensor:\n        x = torch.cat([obs, actions], 1)\n        x = self.net(x)\n        return x\n\n    def projection(\n        self,\n        obs: torch.Tensor,\n        actions: torch.Tensor,\n        rewards: torch.Tensor,\n        bootstrap: torch.Tensor,\n        discount: float,\n        q_support: torch.Tensor,\n        device: torch.device,\n    ) -> torch.Tensor:\n        delta_z = (self.v_max - self.v_min) / (self.num_atoms - 1)\n        batch_size = rewards.shape[0]\n\n        target_z = (\n            rewards.unsqueeze(1)\n            + bootstrap.unsqueeze(1) * discount.unsqueeze(1) * q_support\n        )\n        target_z = target_z.clamp(self.v_min, self.v_max)\n        b = (target_z - self.v_min) / delta_z\n        l = torch.floor(b).long()\n        u = torch.ceil(b).long()\n\n        is_int = (l == u)\n        l_mask = is_int & (l > 0)\n        u_mask = is_int & (l == 0)\n\n        l = torch.where(l_mask, l - 1, l)\n        u = torch.where(u_mask, u + 1, u)\n\n        next_dist = F.softmax(self.forward(obs, actions), dim=1)\n        
proj_dist = torch.zeros_like(next_dist)\n        offset = (\n            torch.linspace(\n                0, (batch_size - 1) * self.num_atoms, batch_size, device=device\n            )\n            .unsqueeze(1)\n            .expand(batch_size, self.num_atoms)\n            .long()\n        )\n        proj_dist.view(-1).index_add_(\n            0, (l + offset).view(-1), (next_dist * (u.float() - b)).view(-1)\n        )\n        proj_dist.view(-1).index_add_(\n            0, (u + offset).view(-1), (next_dist * (b - l.float())).view(-1)\n        )\n        return proj_dist\n\n\nclass Critic(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        num_atoms: int,\n        v_min: float,\n        v_max: float,\n        hidden_dim: int,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.qnet1 = DistributionalQNetwork(\n            n_obs=n_obs,\n            n_act=n_act,\n            num_atoms=num_atoms,\n            v_min=v_min,\n            v_max=v_max,\n            hidden_dim=hidden_dim,\n            device=device,\n        )\n        self.qnet2 = DistributionalQNetwork(\n            n_obs=n_obs,\n            n_act=n_act,\n            num_atoms=num_atoms,\n            v_min=v_min,\n            v_max=v_max,\n            hidden_dim=hidden_dim,\n            device=device,\n        )\n\n        self.register_buffer(\n            \"q_support\", torch.linspace(v_min, v_max, num_atoms, device=device)\n        )\n        self.device = device\n\n    def forward(self, obs: torch.Tensor, actions: torch.Tensor) -> torch.Tensor:\n        return self.qnet1(obs, actions), self.qnet2(obs, actions)\n\n    def projection(\n        self,\n        obs: torch.Tensor,\n        actions: torch.Tensor,\n        rewards: torch.Tensor,\n        bootstrap: torch.Tensor,\n        discount: float,\n    ) -> torch.Tensor:\n        \"\"\"Projection operation that includes q_support directly\"\"\"\n        q1_proj = 
self.qnet1.projection(\n            obs,\n            actions,\n            rewards,\n            bootstrap,\n            discount,\n            self.q_support,\n            self.q_support.device,\n        )\n        q2_proj = self.qnet2.projection(\n            obs,\n            actions,\n            rewards,\n            bootstrap,\n            discount,\n            self.q_support,\n            self.q_support.device,\n        )\n        return q1_proj, q2_proj\n\n    def get_value(self, probs: torch.Tensor) -> torch.Tensor:\n        \"\"\"Calculate value from logits using support\"\"\"\n        return torch.sum(probs * self.q_support, dim=1)\n\n\nclass Actor(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        num_envs: int,\n        init_scale: float,\n        hidden_dim: int,\n        std_min: float = 0.05,\n        std_max: float = 0.8,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.n_act = n_act\n        self.net = nn.Sequential(\n            nn.Linear(n_obs, hidden_dim, device=device),\n            nn.ReLU(),\n            nn.Linear(hidden_dim, hidden_dim // 2, device=device),\n            nn.ReLU(),\n            nn.Linear(hidden_dim // 2, hidden_dim // 4, device=device),\n            nn.ReLU(),\n        )\n        self.fc_mu = nn.Sequential(\n            nn.Linear(hidden_dim // 4, n_act, device=device),\n            nn.Tanh(),\n        )\n        nn.init.normal_(self.fc_mu[0].weight, 0.0, init_scale)\n        nn.init.constant_(self.fc_mu[0].bias, 0.0)\n\n        noise_scales = (\n            torch.rand(num_envs, 1, device=device) * (std_max - std_min) + std_min\n        )\n        self.register_buffer(\"noise_scales\", noise_scales)\n\n        self.register_buffer(\"std_min\", torch.as_tensor(std_min, device=device))\n        self.register_buffer(\"std_max\", torch.as_tensor(std_max, device=device))\n        self.n_envs = num_envs\n        self.device = device\n\n    def 
forward(self, obs: torch.Tensor) -> torch.Tensor:\n        x = obs\n        x = self.net(x)\n        action = self.fc_mu(x)\n        return action\n\n    def explore(\n        self, obs: torch.Tensor, dones: torch.Tensor = None, deterministic: bool = False\n    ) -> torch.Tensor:\n        # If dones is provided, resample noise for environments that are done\n        if dones is not None and dones.sum() > 0:\n            # Generate new noise scales for done environments (one per environment)\n            new_scales = (\n                torch.rand(self.n_envs, 1, device=obs.device)\n                * (self.std_max - self.std_min)\n                + self.std_min\n            )\n\n            # Update only the noise scales for environments that are done\n            dones_view = dones.view(-1, 1) > 0\n            self.noise_scales.copy_(\n                torch.where(dones_view, new_scales, self.noise_scales)\n            )\n\n        act = self(obs)\n        if deterministic:\n            return act\n\n        noise = torch.randn_like(act) * self.noise_scales\n        return act + noise\n\n\nclass MultiTaskActor(Actor):\n    def __init__(self, num_tasks: int, task_embedding_dim: int, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n        self.num_tasks = num_tasks\n        self.task_embedding_dim = task_embedding_dim\n        self.task_embedding = nn.Embedding(\n            num_tasks, task_embedding_dim, max_norm=1.0, device=self.device\n        )\n\n    def forward(self, obs: torch.Tensor) -> torch.Tensor:\n        # TODO: Optimize the code to be compatible with cudagraphs\n        # Currently in-place creation of task_indices is not compatible with cudagraphs\n        task_ids_one_hot = obs[..., -self.num_tasks :]\n        task_indices = torch.argmax(task_ids_one_hot, dim=1)\n        task_embeddings = self.task_embedding(task_indices)\n        obs = torch.cat([obs[..., : -self.num_tasks], task_embeddings], dim=-1)\n        return 
super().forward(obs)\n\n\nclass MultiTaskCritic(Critic):\n    def __init__(self, num_tasks: int, task_embedding_dim: int, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n        self.num_tasks = num_tasks\n        self.task_embedding_dim = task_embedding_dim\n        self.task_embedding = nn.Embedding(\n            num_tasks, task_embedding_dim, max_norm=1.0, device=self.device\n        )\n\n    def forward(self, obs: torch.Tensor, actions: torch.Tensor) -> torch.Tensor:\n        # TODO: Optimize the code to be compatible with cudagraphs\n        # Currently in-place creation of task_indices is not compatible with cudagraphs\n        task_ids_one_hot = obs[..., -self.num_tasks :]\n        task_indices = torch.argmax(task_ids_one_hot, dim=1)\n        task_embeddings = self.task_embedding(task_indices)\n        obs = torch.cat([obs[..., : -self.num_tasks], task_embeddings], dim=-1)\n        return super().forward(obs, actions)\n\n    def projection(\n        self,\n        obs: torch.Tensor,\n        actions: torch.Tensor,\n        rewards: torch.Tensor,\n        bootstrap: torch.Tensor,\n        discount: torch.Tensor,\n    ) -> torch.Tensor:\n        task_ids_one_hot = obs[..., -self.num_tasks :]\n        task_indices = torch.argmax(task_ids_one_hot, dim=1)\n        task_embeddings = self.task_embedding(task_indices)\n        obs = torch.cat([obs[..., : -self.num_tasks], task_embeddings], dim=-1)\n        return super().projection(obs, actions, rewards, bootstrap, discount)\n"
  },
  {
    "path": "fast_td3/fast_td3_deploy.py",
    "content": "import math\n\nimport torch\nimport torch.nn as nn\nfrom .fast_td3_utils import EmpiricalNormalization\nfrom .fast_td3 import Actor\nfrom .fast_td3_simbav2 import Actor as ActorSimbaV2\n\n\nclass Policy(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        args: dict,\n        agent: str = \"fasttd3\",\n    ):\n        super().__init__()\n\n        self.args = args\n\n        num_envs = args[\"num_envs\"]\n        init_scale = args[\"init_scale\"]\n        actor_hidden_dim = args[\"actor_hidden_dim\"]\n\n        actor_kwargs = dict(\n            n_obs=n_obs,\n            n_act=n_act,\n            num_envs=num_envs,\n            device=\"cpu\",\n            init_scale=init_scale,\n            hidden_dim=actor_hidden_dim,\n        )\n\n        if agent == \"fasttd3\":\n            actor_cls = Actor\n        elif agent == \"fasttd3_simbav2\":\n            actor_cls = ActorSimbaV2\n\n            actor_num_blocks = args[\"actor_num_blocks\"]\n            actor_kwargs.pop(\"init_scale\")\n            actor_kwargs.update(\n                {\n                    \"scaler_init\": math.sqrt(2.0 / actor_hidden_dim),\n                    \"scaler_scale\": math.sqrt(2.0 / actor_hidden_dim),\n                    \"alpha_init\": 1.0 / (actor_num_blocks + 1),\n                    \"alpha_scale\": 1.0 / math.sqrt(actor_hidden_dim),\n                    \"expansion\": 4,\n                    \"c_shift\": 3.0,\n                    \"num_blocks\": actor_num_blocks,\n                }\n            )\n        else:\n            raise ValueError(f\"Agent {agent} not supported\")\n\n        self.actor = actor_cls(\n            **actor_kwargs,\n        )\n        self.obs_normalizer = EmpiricalNormalization(shape=n_obs, device=\"cpu\")\n\n        self.actor.eval()\n        self.obs_normalizer.eval()\n\n    @torch.no_grad\n    def forward(self, obs: torch.Tensor) -> torch.Tensor:\n        norm_obs = self.obs_normalizer(obs)\n        
actions = self.actor(norm_obs)\n        return actions\n\n    @torch.no_grad\n    def act(self, obs: torch.Tensor) -> torch.distributions.Normal:\n        actions = self.forward(obs)\n        return torch.distributions.Normal(actions, torch.ones_like(actions) * 1e-8)\n\n\ndef load_policy(checkpoint_path):\n    torch_checkpoint = torch.load(\n        checkpoint_path, map_location=\"cpu\", weights_only=False\n    )\n    args = torch_checkpoint[\"args\"]\n\n    agent = args.get(\"agent\", \"fasttd3\")\n    if agent == \"fasttd3\":\n        n_obs = torch_checkpoint[\"actor_state_dict\"][\"net.0.weight\"].shape[-1]\n        n_act = torch_checkpoint[\"actor_state_dict\"][\"fc_mu.0.weight\"].shape[0]\n    elif agent == \"fasttd3_simbav2\":\n        # TODO: Too hard-coded, maybe save n_obs and n_act in the checkpoint?\n        n_obs = (\n            torch_checkpoint[\"actor_state_dict\"][\"embedder.w.w.weight\"].shape[-1] - 1\n        )\n        n_act = torch_checkpoint[\"actor_state_dict\"][\"predictor.mean_bias\"].shape[0]\n    else:\n        raise ValueError(f\"Agent {agent} not supported\")\n\n    policy = Policy(\n        n_obs=n_obs,\n        n_act=n_act,\n        args=args,\n        agent=agent,\n    )\n    policy.actor.load_state_dict(torch_checkpoint[\"actor_state_dict\"])\n\n    if len(torch_checkpoint[\"obs_normalizer_state\"]) == 0:\n        policy.obs_normalizer = nn.Identity()\n    else:\n        policy.obs_normalizer.load_state_dict(torch_checkpoint[\"obs_normalizer_state\"])\n\n    return policy\n"
  },
  {
    "path": "fast_td3/fast_td3_simbav2.py",
    "content": "import torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport math\n\n\ndef l2normalize(\n    tensor: torch.Tensor, axis: int = -1, eps: float = 1e-8\n) -> torch.Tensor:\n    \"\"\"Computes L2 normalization of a tensor.\"\"\"\n    return tensor / (torch.linalg.norm(tensor, ord=2, dim=axis, keepdim=True) + eps)\n\n\nclass Scaler(nn.Module):\n    \"\"\"\n    A learnable scaling layer.\n    \"\"\"\n\n    def __init__(\n        self,\n        dim: int,\n        init: float = 1.0,\n        scale: float = 1.0,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.scaler = nn.Parameter(torch.full((dim,), init * scale, device=device))\n        self.forward_scaler = init / scale\n\n    def forward(self, x: torch.Tensor) -> torch.Tensor:\n        return self.scaler.to(x.dtype) * self.forward_scaler * x\n\n\nclass HyperDense(nn.Module):\n    \"\"\"\n    A dense layer without bias and with orthogonal initialization.\n    \"\"\"\n\n    def __init__(self, in_dim: int, hidden_dim: int, device: torch.device = None):\n        super().__init__()\n        self.w = nn.Linear(in_dim, hidden_dim, bias=False, device=device)\n        nn.init.orthogonal_(self.w.weight, gain=1.0)\n\n    def forward(self, x: torch.Tensor) -> torch.Tensor:\n        return self.w(x)\n\n\nclass HyperMLP(nn.Module):\n    \"\"\"\n    A small MLP with a specific architecture using HyperDense and Scaler.\n    \"\"\"\n\n    def __init__(\n        self,\n        in_dim: int,\n        hidden_dim: int,\n        out_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        eps: float = 1e-8,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.w1 = HyperDense(in_dim, hidden_dim, device=device)\n        self.scaler = Scaler(hidden_dim, scaler_init, scaler_scale, device=device)\n        self.w2 = HyperDense(hidden_dim, out_dim, device=device)\n        self.eps = eps\n\n    def forward(self, x: 
torch.Tensor) -> torch.Tensor:\n        x = self.w1(x)\n        x = self.scaler(x)\n        # `eps` is required to prevent zero vector.\n        x = F.relu(x) + self.eps\n        x = self.w2(x)\n        x = l2normalize(x, axis=-1)\n        return x\n\n\nclass HyperEmbedder(nn.Module):\n    \"\"\"\n    Embeds input by concatenating a constant, normalizing, and applying layers.\n    \"\"\"\n\n    def __init__(\n        self,\n        in_dim: int,\n        hidden_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        c_shift: float,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        # The input dimension to the dense layer is in_dim + 1\n        self.w = HyperDense(in_dim + 1, hidden_dim, device=device)\n        self.scaler = Scaler(hidden_dim, scaler_init, scaler_scale, device=device)\n        self.c_shift = c_shift\n\n    def forward(self, x: torch.Tensor) -> torch.Tensor:\n        new_axis = torch.full(\n            (*x.shape[:-1], 1), self.c_shift, device=x.device, dtype=x.dtype\n        )\n        x = torch.cat([x, new_axis], dim=-1)\n        x = l2normalize(x, axis=-1)\n        x = self.w(x)\n        x = self.scaler(x)\n        x = l2normalize(x, axis=-1)\n        return x\n\n\nclass HyperLERPBlock(nn.Module):\n    \"\"\"\n    A residual block using Linear Interpolation (LERP).\n    \"\"\"\n\n    def __init__(\n        self,\n        hidden_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        alpha_init: float,\n        alpha_scale: float,\n        expansion: int = 4,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.mlp = HyperMLP(\n            in_dim=hidden_dim,\n            hidden_dim=hidden_dim * expansion,\n            out_dim=hidden_dim,\n            scaler_init=scaler_init / math.sqrt(expansion),\n            scaler_scale=scaler_scale / math.sqrt(expansion),\n            device=device,\n        )\n        self.alpha_scaler = Scaler(\n      
      dim=hidden_dim,\n            init=alpha_init,\n            scale=alpha_scale,\n            device=device,\n        )\n\n    def forward(self, x: torch.Tensor) -> torch.Tensor:\n        residual = x\n        mlp_out = self.mlp(x)\n        # The original paper uses (x - residual) but x is the residual here.\n        # This is interpreted as alpha * (mlp_output - residual_input)\n        x = residual + self.alpha_scaler(mlp_out - residual)\n        x = l2normalize(x, axis=-1)\n        return x\n\n\nclass HyperTanhPolicy(nn.Module):\n    \"\"\"\n    A policy that outputs a Tanh action.\n    \"\"\"\n\n    def __init__(\n        self,\n        hidden_dim: int,\n        action_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.mean_w1 = HyperDense(hidden_dim, hidden_dim, device=device)\n        self.mean_scaler = Scaler(hidden_dim, scaler_init, scaler_scale, device=device)\n        self.mean_w2 = HyperDense(hidden_dim, action_dim, device=device)\n        self.mean_bias = nn.Parameter(torch.zeros(action_dim, device=device))\n\n    def forward(self, x: torch.Tensor) -> torch.Tensor:\n        # Mean path\n        mean = self.mean_w1(x)\n        mean = self.mean_scaler(mean)\n        mean = self.mean_w2(mean) + self.mean_bias.to(mean.dtype)\n        mean = torch.tanh(mean)\n        return mean\n\n\nclass HyperCategoricalValue(nn.Module):\n    \"\"\"\n    A value function that predicts a categorical distribution over a range of values.\n    \"\"\"\n\n    def __init__(\n        self,\n        hidden_dim: int,\n        num_bins: int,\n        scaler_init: float,\n        scaler_scale: float,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.w1 = HyperDense(hidden_dim, hidden_dim, device=device)\n        self.scaler = Scaler(hidden_dim, scaler_init, scaler_scale, device=device)\n        self.w2 = HyperDense(hidden_dim, num_bins, 
device=device)\n        self.bias = nn.Parameter(torch.zeros(num_bins, device=device))\n\n    def forward(self, x: torch.Tensor) -> torch.Tensor:\n        logits = self.w1(x)\n        logits = self.scaler(logits)\n        logits = self.w2(logits) + self.bias.to(logits.dtype)\n        return logits\n\n\nclass DistributionalQNetwork(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        num_atoms: int,\n        v_min: float,\n        v_max: float,\n        hidden_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        alpha_init: float,\n        alpha_scale: float,\n        num_blocks: int,\n        c_shift: float,\n        expansion: int,\n        device: torch.device = None,\n    ):\n        super().__init__()\n\n        self.embedder = HyperEmbedder(\n            in_dim=n_obs + n_act,\n            hidden_dim=hidden_dim,\n            scaler_init=scaler_init,\n            scaler_scale=scaler_scale,\n            c_shift=c_shift,\n            device=device,\n        )\n\n        self.encoder = nn.Sequential(\n            *[\n                HyperLERPBlock(\n                    hidden_dim=hidden_dim,\n                    scaler_init=scaler_init,\n                    scaler_scale=scaler_scale,\n                    alpha_init=alpha_init,\n                    alpha_scale=alpha_scale,\n                    expansion=expansion,\n                    device=device,\n                )\n                for _ in range(num_blocks)\n            ]\n        )\n\n        self.predictor = HyperCategoricalValue(\n            hidden_dim=hidden_dim,\n            num_bins=num_atoms,\n            scaler_init=1.0,\n            scaler_scale=1.0,\n            device=device,\n        )\n        self.v_min = v_min\n        self.v_max = v_max\n        self.num_atoms = num_atoms\n\n    def forward(self, obs: torch.Tensor, actions: torch.Tensor) -> torch.Tensor:\n        x = torch.cat([obs, actions], 1)\n        x = self.embedder(x)\n  
      x = self.encoder(x)\n        x = self.predictor(x)\n        return x\n\n    def projection(\n        self,\n        obs: torch.Tensor,\n        actions: torch.Tensor,\n        rewards: torch.Tensor,\n        bootstrap: torch.Tensor,\n        discount: float,\n        q_support: torch.Tensor,\n        device: torch.device,\n    ) -> torch.Tensor:\n        delta_z = (self.v_max - self.v_min) / (self.num_atoms - 1)\n        batch_size = rewards.shape[0]\n\n        target_z = (\n            rewards.unsqueeze(1)\n            + bootstrap.unsqueeze(1) * discount.unsqueeze(1) * q_support\n        )\n        target_z = target_z.clamp(self.v_min, self.v_max)\n        b = (target_z - self.v_min) / delta_z\n        l = torch.floor(b).long()\n        u = torch.ceil(b).long()\n\n        is_int = (l == u)\n        l_mask = is_int & (l > 0)\n        u_mask = is_int & (l == 0)\n\n        l = torch.where(l_mask, l - 1, l)\n        u = torch.where(u_mask, u + 1, u)\n\n        next_dist = F.softmax(self.forward(obs, actions), dim=1)\n        proj_dist = torch.zeros_like(next_dist)\n        offset = (\n            torch.linspace(\n                0, (batch_size - 1) * self.num_atoms, batch_size, device=device\n            )\n            .unsqueeze(1)\n            .expand(batch_size, self.num_atoms)\n            .long()\n        )\n        proj_dist.view(-1).index_add_(\n            0, (l + offset).view(-1), (next_dist * (u.float() - b)).view(-1)\n        )\n        proj_dist.view(-1).index_add_(\n            0, (u + offset).view(-1), (next_dist * (b - l.float())).view(-1)\n        )\n        return proj_dist\n\n\nclass Critic(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        num_atoms: int,\n        v_min: float,\n        v_max: float,\n        hidden_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        alpha_init: float,\n        alpha_scale: float,\n        num_blocks: int,\n        c_shift: float,\n        
expansion: int,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.qnet1 = DistributionalQNetwork(\n            n_obs=n_obs,\n            n_act=n_act,\n            num_atoms=num_atoms,\n            v_min=v_min,\n            v_max=v_max,\n            scaler_init=scaler_init,\n            scaler_scale=scaler_scale,\n            alpha_init=alpha_init,\n            alpha_scale=alpha_scale,\n            num_blocks=num_blocks,\n            c_shift=c_shift,\n            expansion=expansion,\n            hidden_dim=hidden_dim,\n            device=device,\n        )\n        self.qnet2 = DistributionalQNetwork(\n            n_obs=n_obs,\n            n_act=n_act,\n            num_atoms=num_atoms,\n            v_min=v_min,\n            v_max=v_max,\n            scaler_init=scaler_init,\n            scaler_scale=scaler_scale,\n            alpha_init=alpha_init,\n            alpha_scale=alpha_scale,\n            num_blocks=num_blocks,\n            c_shift=c_shift,\n            expansion=expansion,\n            hidden_dim=hidden_dim,\n            device=device,\n        )\n\n        self.register_buffer(\n            \"q_support\", torch.linspace(v_min, v_max, num_atoms, device=device)\n        )\n        self.device = device\n\n    def forward(self, obs: torch.Tensor, actions: torch.Tensor) -> torch.Tensor:\n        return self.qnet1(obs, actions), self.qnet2(obs, actions)\n\n    def projection(\n        self,\n        obs: torch.Tensor,\n        actions: torch.Tensor,\n        rewards: torch.Tensor,\n        bootstrap: torch.Tensor,\n        discount: float,\n    ) -> torch.Tensor:\n        \"\"\"Projection operation that includes q_support directly\"\"\"\n        q1_proj = self.qnet1.projection(\n            obs,\n            actions,\n            rewards,\n            bootstrap,\n            discount,\n            self.q_support,\n            self.q_support.device,\n        )\n        q2_proj = self.qnet2.projection(\n            obs,\n   
         actions,\n            rewards,\n            bootstrap,\n            discount,\n            self.q_support,\n            self.q_support.device,\n        )\n        return q1_proj, q2_proj\n\n    def get_value(self, probs: torch.Tensor) -> torch.Tensor:\n        \"\"\"Calculate value from logits using support\"\"\"\n        return torch.sum(probs * self.q_support, dim=1)\n\n\nclass Actor(nn.Module):\n    def __init__(\n        self,\n        n_obs: int,\n        n_act: int,\n        num_envs: int,\n        hidden_dim: int,\n        scaler_init: float,\n        scaler_scale: float,\n        alpha_init: float,\n        alpha_scale: float,\n        expansion: int,\n        c_shift: float,\n        num_blocks: int,\n        std_min: float = 0.05,\n        std_max: float = 0.8,\n        device: torch.device = None,\n    ):\n        super().__init__()\n        self.n_act = n_act\n\n        self.embedder = HyperEmbedder(\n            in_dim=n_obs,\n            hidden_dim=hidden_dim,\n            scaler_init=scaler_init,\n            scaler_scale=scaler_scale,\n            c_shift=c_shift,\n            device=device,\n        )\n        self.encoder = nn.Sequential(\n            *[\n                HyperLERPBlock(\n                    hidden_dim=hidden_dim,\n                    scaler_init=scaler_init,\n                    scaler_scale=scaler_scale,\n                    alpha_init=alpha_init,\n                    alpha_scale=alpha_scale,\n                    expansion=expansion,\n                    device=device,\n                )\n                for _ in range(num_blocks)\n            ]\n        )\n        self.predictor = HyperTanhPolicy(\n            hidden_dim=hidden_dim,\n            action_dim=n_act,\n            scaler_init=1.0,\n            scaler_scale=1.0,\n            device=device,\n        )\n\n        noise_scales = (\n            torch.rand(num_envs, 1, device=device) * (std_max - std_min) + std_min\n        )\n        
self.register_buffer(\"noise_scales\", noise_scales)\n\n        self.register_buffer(\"std_min\", torch.as_tensor(std_min, device=device))\n        self.register_buffer(\"std_max\", torch.as_tensor(std_max, device=device))\n        self.n_envs = num_envs\n        self.device = device\n\n    def forward(self, obs: torch.Tensor) -> torch.Tensor:\n        x = obs\n        x = self.embedder(x)\n        x = self.encoder(x)\n        x = self.predictor(x)\n        return x\n\n    def explore(\n        self, obs: torch.Tensor, dones: torch.Tensor = None, deterministic: bool = False\n    ) -> torch.Tensor:\n        # If dones is provided, resample noise for environments that are done\n        if dones is not None and dones.sum() > 0:\n            # Generate new noise scales for done environments (one per environment)\n            new_scales = (\n                torch.rand(self.n_envs, 1, device=obs.device)\n                * (self.std_max - self.std_min)\n                + self.std_min\n            )\n\n            # Update only the noise scales for environments that are done\n            dones_view = dones.view(-1, 1) > 0\n            self.noise_scales.copy_(\n                torch.where(dones_view, new_scales, self.noise_scales)\n            )\n\n        act = self(obs)\n        if deterministic:\n            return act\n\n        noise = torch.randn_like(act) * self.noise_scales\n        return act + noise\n\n\nclass MultiTaskActor(Actor):\n    def __init__(self, num_tasks: int, task_embedding_dim: int, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n        self.num_tasks = num_tasks\n        self.task_embedding_dim = task_embedding_dim\n        self.task_embedding = nn.Embedding(\n            num_tasks, task_embedding_dim, max_norm=1.0, device=self.device\n        )\n\n    def forward(self, obs: torch.Tensor) -> torch.Tensor:\n        # TODO: Optimize the code to be compatible with cudagraphs\n        # Currently in-place creation of task_indices is not 
compatible with cudagraphs\n        task_ids_one_hot = obs[..., -self.num_tasks :]\n        task_indices = torch.argmax(task_ids_one_hot, dim=1)\n        task_embeddings = self.task_embedding(task_indices)\n        obs = torch.cat([obs[..., : -self.num_tasks], task_embeddings], dim=-1)\n        return super().forward(obs)\n\n\nclass MultiTaskCritic(Critic):\n    def __init__(self, num_tasks: int, task_embedding_dim: int, *args, **kwargs):\n        super().__init__(*args, **kwargs)\n        self.num_tasks = num_tasks\n        self.task_embedding_dim = task_embedding_dim\n        self.task_embedding = nn.Embedding(\n            num_tasks, task_embedding_dim, max_norm=1.0, device=self.device\n        )\n\n    def forward(self, obs: torch.Tensor, actions: torch.Tensor) -> torch.Tensor:\n        # TODO: Optimize the code to be compatible with cudagraphs\n        # Currently in-place creation of task_indices is not compatible with cudagraphs\n        task_ids_one_hot = obs[..., -self.num_tasks :]\n        task_indices = torch.argmax(task_ids_one_hot, dim=1)\n        task_embeddings = self.task_embedding(task_indices)\n        obs = torch.cat([obs[..., : -self.num_tasks], task_embeddings], dim=-1)\n        return super().forward(obs, actions)\n\n    def projection(\n        self,\n        obs: torch.Tensor,\n        actions: torch.Tensor,\n        rewards: torch.Tensor,\n        bootstrap: torch.Tensor,\n        discount: torch.Tensor,\n    ) -> torch.Tensor:\n        task_ids_one_hot = obs[..., -self.num_tasks :]\n        task_indices = torch.argmax(task_ids_one_hot, dim=1)\n        task_embeddings = self.task_embedding(task_indices)\n        obs = torch.cat([obs[..., : -self.num_tasks], task_embeddings], dim=-1)\n        return super().projection(obs, actions, rewards, bootstrap, discount)\n"
  },
  {
    "path": "fast_td3/fast_td3_utils.py",
    "content": "import os\n\nfrom typing import Optional\n\nimport torch\nimport torch.nn as nn\nimport torch.distributed as dist\n\nfrom tensordict import TensorDict\n\n\nclass SimpleReplayBuffer(nn.Module):\n    def __init__(\n        self,\n        n_env: int,\n        buffer_size: int,\n        n_obs: int,\n        n_act: int,\n        n_critic_obs: int,\n        asymmetric_obs: bool = False,\n        playground_mode: bool = False,\n        n_steps: int = 1,\n        gamma: float = 0.99,\n        device=None,\n    ):\n        \"\"\"\n        A simple replay buffer that stores transitions in a circular buffer.\n        Supports n-step returns and asymmetric observations.\n\n        When playground_mode=True, critic_observations are treated as a concatenation of\n        regular observations and privileged observations, and only the privileged part is stored\n        to save memory.\n\n        TODO (Younggyo): Refactor to split this into SimpleReplayBuffer and NStepReplayBuffer\n        \"\"\"\n        super().__init__()\n\n        self.n_env = n_env\n        self.buffer_size = buffer_size\n        self.n_obs = n_obs\n        self.n_act = n_act\n        self.n_critic_obs = n_critic_obs\n        self.asymmetric_obs = asymmetric_obs\n        self.playground_mode = playground_mode and asymmetric_obs\n        self.gamma = gamma\n        self.n_steps = n_steps\n        self.device = device\n\n        self.observations = torch.zeros(\n            (n_env, buffer_size, n_obs), device=device, dtype=torch.float\n        )\n        self.actions = torch.zeros(\n            (n_env, buffer_size, n_act), device=device, dtype=torch.float\n        )\n        self.rewards = torch.zeros(\n            (n_env, buffer_size), device=device, dtype=torch.float\n        )\n        self.dones = torch.zeros((n_env, buffer_size), device=device, dtype=torch.long)\n        self.truncations = torch.zeros(\n            (n_env, buffer_size), device=device, dtype=torch.long\n        )\n        
self.next_observations = torch.zeros(\n            (n_env, buffer_size, n_obs), device=device, dtype=torch.float\n        )\n        if asymmetric_obs:\n            if self.playground_mode:\n                # Only store the privileged part of observations (n_critic_obs - n_obs)\n                self.privileged_obs_size = n_critic_obs - n_obs\n                self.privileged_observations = torch.zeros(\n                    (n_env, buffer_size, self.privileged_obs_size),\n                    device=device,\n                    dtype=torch.float,\n                )\n                self.next_privileged_observations = torch.zeros(\n                    (n_env, buffer_size, self.privileged_obs_size),\n                    device=device,\n                    dtype=torch.float,\n                )\n            else:\n                # Store full critic observations\n                self.critic_observations = torch.zeros(\n                    (n_env, buffer_size, n_critic_obs), device=device, dtype=torch.float\n                )\n                self.next_critic_observations = torch.zeros(\n                    (n_env, buffer_size, n_critic_obs), device=device, dtype=torch.float\n                )\n        self.ptr = 0\n\n    @torch.no_grad()\n    def extend(\n        self,\n        tensor_dict: TensorDict,\n    ):\n        observations = tensor_dict[\"observations\"]\n        actions = tensor_dict[\"actions\"]\n        rewards = tensor_dict[\"next\"][\"rewards\"]\n        dones = tensor_dict[\"next\"][\"dones\"]\n        truncations = tensor_dict[\"next\"][\"truncations\"]\n        next_observations = tensor_dict[\"next\"][\"observations\"]\n\n        ptr = self.ptr % self.buffer_size\n        self.observations[:, ptr] = observations\n        self.actions[:, ptr] = actions\n        self.rewards[:, ptr] = rewards\n        self.dones[:, ptr] = dones\n        self.truncations[:, ptr] = truncations\n        self.next_observations[:, ptr] = next_observations\n        if 
self.asymmetric_obs:\n            critic_observations = tensor_dict[\"critic_observations\"]\n            next_critic_observations = tensor_dict[\"next\"][\"critic_observations\"]\n\n            if self.playground_mode:\n                # Extract and store only the privileged part\n                privileged_observations = critic_observations[:, self.n_obs :]\n                next_privileged_observations = next_critic_observations[:, self.n_obs :]\n                self.privileged_observations[:, ptr] = privileged_observations\n                self.next_privileged_observations[:, ptr] = next_privileged_observations\n            else:\n                # Store full critic observations\n                self.critic_observations[:, ptr] = critic_observations\n                self.next_critic_observations[:, ptr] = next_critic_observations\n        self.ptr += 1\n\n    @torch.no_grad()\n    def sample(self, batch_size: int):\n        # we will sample n_env * batch_size transitions\n\n        if self.n_steps == 1:\n            indices = torch.randint(\n                0,\n                min(self.buffer_size, self.ptr),\n                (self.n_env, batch_size),\n                device=self.device,\n            )\n            obs_indices = indices.unsqueeze(-1).expand(-1, -1, self.n_obs)\n            act_indices = indices.unsqueeze(-1).expand(-1, -1, self.n_act)\n            observations = torch.gather(self.observations, 1, obs_indices).reshape(\n                self.n_env * batch_size, self.n_obs\n            )\n            next_observations = torch.gather(\n                self.next_observations, 1, obs_indices\n            ).reshape(self.n_env * batch_size, self.n_obs)\n            actions = torch.gather(self.actions, 1, act_indices).reshape(\n                self.n_env * batch_size, self.n_act\n            )\n\n            rewards = torch.gather(self.rewards, 1, indices).reshape(\n                self.n_env * batch_size\n            )\n            dones = 
torch.gather(self.dones, 1, indices).reshape(\n                self.n_env * batch_size\n            )\n            truncations = torch.gather(self.truncations, 1, indices).reshape(\n                self.n_env * batch_size\n            )\n            effective_n_steps = torch.ones_like(dones)\n            if self.asymmetric_obs:\n                if self.playground_mode:\n                    # Gather privileged observations\n                    priv_obs_indices = indices.unsqueeze(-1).expand(\n                        -1, -1, self.privileged_obs_size\n                    )\n                    privileged_observations = torch.gather(\n                        self.privileged_observations, 1, priv_obs_indices\n                    ).reshape(self.n_env * batch_size, self.privileged_obs_size)\n                    next_privileged_observations = torch.gather(\n                        self.next_privileged_observations, 1, priv_obs_indices\n                    ).reshape(self.n_env * batch_size, self.privileged_obs_size)\n\n                    # Concatenate with regular observations to form full critic observations\n                    critic_observations = torch.cat(\n                        [observations, privileged_observations], dim=1\n                    )\n                    next_critic_observations = torch.cat(\n                        [next_observations, next_privileged_observations], dim=1\n                    )\n                else:\n                    # Gather full critic observations\n                    critic_obs_indices = indices.unsqueeze(-1).expand(\n                        -1, -1, self.n_critic_obs\n                    )\n                    critic_observations = torch.gather(\n                        self.critic_observations, 1, critic_obs_indices\n                    ).reshape(self.n_env * batch_size, self.n_critic_obs)\n                    next_critic_observations = torch.gather(\n                        self.next_critic_observations, 1, 
critic_obs_indices\n                    ).reshape(self.n_env * batch_size, self.n_critic_obs)\n        else:\n            # Sample base indices\n            if self.ptr >= self.buffer_size:\n                # When the buffer is full, there is no protection against sampling across different episodes\n                # We avoid this by temporarily setting self.pos - 1 to truncated = True if not done\n                # https://github.com/DLR-RM/stable-baselines3/blob/b91050ca94f8bce7a0285c91f85da518d5a26223/stable_baselines3/common/buffers.py#L857-L860\n                # TODO (Younggyo): Change the reference when this SB3 branch is merged\n                current_pos = self.ptr % self.buffer_size\n                curr_truncations = self.truncations[:, current_pos - 1].clone()\n                self.truncations[:, current_pos - 1] = torch.logical_not(\n                    self.dones[:, current_pos - 1]\n                )\n                indices = torch.randint(\n                    0,\n                    self.buffer_size,\n                    (self.n_env, batch_size),\n                    device=self.device,\n                )\n            else:\n                # Buffer not full - ensure n-step sequence doesn't exceed valid data\n                max_start_idx = max(1, self.ptr - self.n_steps + 1)\n                indices = torch.randint(\n                    0,\n                    max_start_idx,\n                    (self.n_env, batch_size),\n                    device=self.device,\n                )\n            obs_indices = indices.unsqueeze(-1).expand(-1, -1, self.n_obs)\n            act_indices = indices.unsqueeze(-1).expand(-1, -1, self.n_act)\n\n            # Get base transitions\n            observations = torch.gather(self.observations, 1, obs_indices).reshape(\n                self.n_env * batch_size, self.n_obs\n            )\n            actions = torch.gather(self.actions, 1, act_indices).reshape(\n                self.n_env * batch_size, self.n_act\n   
         )\n            if self.asymmetric_obs:\n                if self.playground_mode:\n                    # Gather privileged observations\n                    priv_obs_indices = indices.unsqueeze(-1).expand(\n                        -1, -1, self.privileged_obs_size\n                    )\n                    privileged_observations = torch.gather(\n                        self.privileged_observations, 1, priv_obs_indices\n                    ).reshape(self.n_env * batch_size, self.privileged_obs_size)\n\n                    # Concatenate with regular observations to form full critic observations\n                    critic_observations = torch.cat(\n                        [observations, privileged_observations], dim=1\n                    )\n                else:\n                    # Gather full critic observations\n                    critic_obs_indices = indices.unsqueeze(-1).expand(\n                        -1, -1, self.n_critic_obs\n                    )\n                    critic_observations = torch.gather(\n                        self.critic_observations, 1, critic_obs_indices\n                    ).reshape(self.n_env * batch_size, self.n_critic_obs)\n\n            # Create sequential indices for each sample\n            # This creates a [n_env, batch_size, n_step] tensor of indices\n            seq_offsets = torch.arange(self.n_steps, device=self.device).view(1, 1, -1)\n            all_indices = (\n                indices.unsqueeze(-1) + seq_offsets\n            ) % self.buffer_size  # [n_env, batch_size, n_step]\n\n            # Gather all rewards and terminal flags\n            # Using advanced indexing - result shapes: [n_env, batch_size, n_step]\n            all_rewards = torch.gather(\n                self.rewards.unsqueeze(-1).expand(-1, -1, self.n_steps), 1, all_indices\n            )\n            all_dones = torch.gather(\n                self.dones.unsqueeze(-1).expand(-1, -1, self.n_steps), 1, all_indices\n            )\n            
all_truncations = torch.gather(\n                self.truncations.unsqueeze(-1).expand(-1, -1, self.n_steps),\n                1,\n                all_indices,\n            )\n\n            # Create masks for rewards *after* first done\n            # This creates a cumulative product that zeroes out rewards after the first done\n            all_dones_shifted = torch.cat(\n                [torch.zeros_like(all_dones[:, :, :1]), all_dones[:, :, :-1]], dim=2\n            )  # First reward should not be masked\n            done_masks = torch.cumprod(\n                1.0 - all_dones_shifted, dim=2\n            )  # [n_env, batch_size, n_step]\n            effective_n_steps = done_masks.sum(2)\n\n            # Create discount factors\n            discounts = torch.pow(\n                self.gamma, torch.arange(self.n_steps, device=self.device)\n            )  # [n_steps]\n\n            # Apply masks and discounts to rewards\n            masked_rewards = all_rewards * done_masks  # [n_env, batch_size, n_step]\n            discounted_rewards = masked_rewards * discounts.view(\n                1, 1, -1\n            )  # [n_env, batch_size, n_step]\n\n            # Sum rewards along the n_step dimension\n            n_step_rewards = discounted_rewards.sum(dim=2)  # [n_env, batch_size]\n\n            # Find index of first done or truncation or last step for each sequence\n            first_done = torch.argmax(\n                (all_dones > 0).float(), dim=2\n            )  # [n_env, batch_size]\n            first_trunc = torch.argmax(\n                (all_truncations > 0).float(), dim=2\n            )  # [n_env, batch_size]\n\n            # Handle case where there are no dones or truncations\n            no_dones = all_dones.sum(dim=2) == 0\n            no_truncs = all_truncations.sum(dim=2) == 0\n\n            # When no dones or truncs, use the last index\n            first_done = torch.where(no_dones, self.n_steps - 1, first_done)\n            first_trunc = 
torch.where(no_truncs, self.n_steps - 1, first_trunc)\n\n            # Take the minimum (first) of done or truncation\n            final_indices = torch.minimum(\n                first_done, first_trunc\n            )  # [n_env, batch_size]\n\n            # Create indices to gather the final next observations\n            final_next_obs_indices = torch.gather(\n                all_indices, 2, final_indices.unsqueeze(-1)\n            ).squeeze(\n                -1\n            )  # [n_env, batch_size]\n\n            # Gather final values\n            final_next_observations = self.next_observations.gather(\n                1, final_next_obs_indices.unsqueeze(-1).expand(-1, -1, self.n_obs)\n            )\n            final_dones = self.dones.gather(1, final_next_obs_indices)\n            final_truncations = self.truncations.gather(1, final_next_obs_indices)\n\n            if self.asymmetric_obs:\n                if self.playground_mode:\n                    # Gather final privileged observations\n                    final_next_privileged_observations = (\n                        self.next_privileged_observations.gather(\n                            1,\n                            final_next_obs_indices.unsqueeze(-1).expand(\n                                -1, -1, self.privileged_obs_size\n                            ),\n                        )\n                    )\n\n                    # Reshape for output\n                    next_privileged_observations = (\n                        final_next_privileged_observations.reshape(\n                            self.n_env * batch_size, self.privileged_obs_size\n                        )\n                    )\n\n                    # Concatenate with next observations to form full next critic observations\n                    next_observations_reshaped = final_next_observations.reshape(\n                        self.n_env * batch_size, self.n_obs\n                    )\n                    next_critic_observations = 
torch.cat(\n                        [next_observations_reshaped, next_privileged_observations],\n                        dim=1,\n                    )\n                else:\n                    # Gather final next critic observations directly\n                    final_next_critic_observations = (\n                        self.next_critic_observations.gather(\n                            1,\n                            final_next_obs_indices.unsqueeze(-1).expand(\n                                -1, -1, self.n_critic_obs\n                            ),\n                        )\n                    )\n                    next_critic_observations = final_next_critic_observations.reshape(\n                        self.n_env * batch_size, self.n_critic_obs\n                    )\n\n            # Reshape everything to batch dimension\n            rewards = n_step_rewards.reshape(self.n_env * batch_size)\n            dones = final_dones.reshape(self.n_env * batch_size)\n            truncations = final_truncations.reshape(self.n_env * batch_size)\n            effective_n_steps = effective_n_steps.reshape(self.n_env * batch_size)\n            next_observations = final_next_observations.reshape(\n                self.n_env * batch_size, self.n_obs\n            )\n\n        out = TensorDict(\n            {\n                \"observations\": observations,\n                \"actions\": actions,\n                \"next\": {\n                    \"rewards\": rewards,\n                    \"dones\": dones,\n                    \"truncations\": truncations,\n                    \"observations\": next_observations,\n                    \"effective_n_steps\": effective_n_steps,\n                },\n            },\n            batch_size=self.n_env * batch_size,\n        )\n        if self.asymmetric_obs:\n            out[\"critic_observations\"] = critic_observations\n            out[\"next\"][\"critic_observations\"] = next_critic_observations\n\n        if self.n_steps > 1 and 
self.ptr >= self.buffer_size:\n            # Roll back the truncation flags introduced for safe sampling\n            self.truncations[:, current_pos - 1] = curr_truncations\n        return out\n\n\nclass EmpiricalNormalization(nn.Module):\n    \"\"\"Normalize mean and variance of values based on empirical values.\"\"\"\n\n    def __init__(self, shape, device, eps=1e-2, until=None):\n        \"\"\"Initialize EmpiricalNormalization module.\n\n        Args:\n            shape (int or tuple of int): Shape of input values except batch axis.\n            eps (float): Small value for stability.\n            until (int or None): If this arg is specified, the link learns input values until the sum of batch sizes\n            exceeds it.\n        \"\"\"\n        super().__init__()\n        self.eps = eps\n        self.until = until\n        self.device = device\n        self.register_buffer(\"_mean\", torch.zeros(shape).unsqueeze(0).to(device))\n        self.register_buffer(\"_var\", torch.ones(shape).unsqueeze(0).to(device))\n        self.register_buffer(\"_std\", torch.ones(shape).unsqueeze(0).to(device))\n        self.register_buffer(\"count\", torch.tensor(0, dtype=torch.long).to(device))\n\n    @property\n    def mean(self):\n        return self._mean.squeeze(0).clone()\n\n    @property\n    def std(self):\n        return self._std.squeeze(0).clone()\n\n    @torch.no_grad()\n    def forward(\n        self, x: torch.Tensor, center: bool = True, update: bool = True\n    ) -> torch.Tensor:\n        if x.shape[1:] != self._mean.shape[1:]:\n            raise ValueError(\n                f\"Expected input of shape (*,{self._mean.shape[1:]}), got {x.shape}\"\n            )\n\n        if self.training and update:\n            self.update(x)\n        if center:\n            return (x - self._mean) / (self._std + self.eps)\n        else:\n            return x / (self._std + self.eps)\n\n    @torch.jit.unused\n    def update(self, x):\n        if self.until is not None and 
self.count >= self.until:\n            return\n\n        if dist.is_available() and dist.is_initialized():\n            # Calculate global batch size arithmetically\n            local_batch_size = x.shape[0]\n            world_size = dist.get_world_size()\n            global_batch_size = world_size * local_batch_size\n\n            # Calculate the stats\n            x_shifted = x - self._mean\n            local_sum_shifted = torch.sum(x_shifted, dim=0, keepdim=True)\n            local_sum_sq_shifted = torch.sum(x_shifted.pow(2), dim=0, keepdim=True)\n\n            # Sync the stats across all processes\n            stats_to_sync = torch.cat([local_sum_shifted, local_sum_sq_shifted], dim=0)\n            dist.all_reduce(stats_to_sync, op=dist.ReduceOp.SUM)\n            global_sum_shifted, global_sum_sq_shifted = stats_to_sync\n\n            # Calculate the mean and variance of the global batch\n            batch_mean_shifted = global_sum_shifted / global_batch_size\n            batch_var = (\n                global_sum_sq_shifted / global_batch_size - batch_mean_shifted.pow(2)\n            )\n            batch_mean = batch_mean_shifted + self._mean\n\n        else:\n            global_batch_size = x.shape[0]\n            batch_mean = torch.mean(x, dim=0, keepdim=True)\n            batch_var = torch.var(x, dim=0, keepdim=True, unbiased=False)\n\n        new_count = self.count + global_batch_size\n\n        # Update mean\n        delta = batch_mean - self._mean\n        self._mean.copy_(self._mean + delta * (global_batch_size / new_count))\n\n        # Update variance\n        delta2 = batch_mean - self._mean\n        m_a = self._var * self.count\n        m_b = batch_var * global_batch_size\n        M2 = m_a + m_b + delta2.pow(2) * (self.count * global_batch_size / new_count)\n        self._var.copy_(M2 / new_count)\n        self._std.copy_(self._var.sqrt())\n        self.count.copy_(new_count)\n\n    @torch.jit.unused\n    def inverse(self, y):\n        return y * 
(self._std + self.eps) + self._mean\n\n\nclass RewardNormalizer(nn.Module):\n    def __init__(\n        self,\n        gamma: float,\n        device: torch.device,\n        g_max: float = 10.0,\n        epsilon: float = 1e-8,\n    ):\n        super().__init__()\n        self.register_buffer(\n            \"G\", torch.zeros(1, device=device)\n        )  # running estimate of the discounted return\n        self.register_buffer(\"G_r_max\", torch.zeros(1, device=device))  # running-max\n        self.G_rms = EmpiricalNormalization(shape=1, device=device)\n        self.gamma = gamma\n        self.g_max = g_max\n        self.epsilon = epsilon\n\n    def _scale_reward(self, rewards: torch.Tensor) -> torch.Tensor:\n        var_denominator = self.G_rms.std[0] + self.epsilon\n        min_required_denominator = self.G_r_max / self.g_max\n        denominator = torch.maximum(var_denominator, min_required_denominator)\n\n        return rewards / denominator\n\n    def update_stats(\n        self,\n        rewards: torch.Tensor,\n        dones: torch.Tensor,\n    ):\n        self.G = self.gamma * (1 - dones) * self.G + rewards\n        self.G_rms.update(self.G.view(-1, 1))\n\n        local_max = torch.max(torch.abs(self.G))\n\n        if dist.is_available() and dist.is_initialized():\n            dist.all_reduce(local_max, op=dist.ReduceOp.MAX)\n\n        self.G_r_max = max(self.G_r_max, local_max)\n\n    def forward(self, rewards: torch.Tensor) -> torch.Tensor:\n        return self._scale_reward(rewards)\n\n\nclass PerTaskEmpiricalNormalization(nn.Module):\n    \"\"\"Normalize mean and variance of values based on empirical values for each task.\"\"\"\n\n    def __init__(\n        self,\n        num_tasks: int,\n        shape: tuple,\n        device: torch.device,\n        eps: float = 1e-2,\n        until: int = None,\n    ):\n        \"\"\"\n        Initialize PerTaskEmpiricalNormalization module.\n\n        Args:\n            num_tasks (int): The total number of tasks.\n       
     shape (int or tuple of int): Shape of input values except batch axis.\n            eps (float): Small value for stability.\n            until (int or None): If specified, learns until the sum of batch sizes\n                                 for a specific task exceeds this value.\n        \"\"\"\n        super().__init__()\n        if not isinstance(shape, tuple):\n            shape = (shape,)\n        self.num_tasks = num_tasks\n        self.shape = shape\n        self.eps = eps\n        self.until = until\n        self.device = device\n\n        # Buffers now have a leading dimension for tasks\n        self.register_buffer(\"_mean\", torch.zeros(num_tasks, *shape).to(device))\n        self.register_buffer(\"_var\", torch.ones(num_tasks, *shape).to(device))\n        self.register_buffer(\"_std\", torch.ones(num_tasks, *shape).to(device))\n        self.register_buffer(\n            \"count\", torch.zeros(num_tasks, dtype=torch.long).to(device)\n        )\n\n    def forward(\n        self, x: torch.Tensor, task_ids: torch.Tensor, center: bool = True\n    ) -> torch.Tensor:\n        \"\"\"\n        Normalize the input tensor `x` using statistics for the given `task_ids`.\n\n        Args:\n            x (torch.Tensor): Input tensor of shape [num_envs, *shape].\n            task_ids (torch.Tensor): Tensor of task indices, shape [num_envs].\n            center (bool): If True, center the data by subtracting the mean.\n        \"\"\"\n        if x.shape[1:] != self.shape:\n            raise ValueError(f\"Expected input shape (*, {self.shape}), got {x.shape}\")\n        if x.shape[0] != task_ids.shape[0]:\n            raise ValueError(\"Batch size of x and task_ids must match.\")\n\n        # Gather the stats for the tasks in the current batch\n        # Reshape task_ids for broadcasting: [num_envs] -> [num_envs, 1, ...]\n        view_shape = (task_ids.shape[0],) + (1,) * len(self.shape)\n        task_ids_expanded = task_ids.view(view_shape).expand_as(x)\n\n        
mean = self._mean.gather(0, task_ids_expanded)\n        std = self._std.gather(0, task_ids_expanded)\n\n        if self.training:\n            self.update(x, task_ids)\n\n        if center:\n            return (x - mean) / (std + self.eps)\n        else:\n            return x / (std + self.eps)\n\n    @torch.jit.unused\n    def update(self, x: torch.Tensor, task_ids: torch.Tensor):\n        \"\"\"Update running statistics for the tasks present in the batch.\"\"\"\n        unique_tasks = torch.unique(task_ids)\n\n        for task_id in unique_tasks:\n            if self.until is not None and self.count[task_id] >= self.until:\n                continue\n\n            # Create a mask to select data for the current task\n            mask = task_ids == task_id\n            x_task = x[mask]\n            batch_size = x_task.shape[0]\n\n            if batch_size == 0:\n                continue\n\n            # Update count for this task\n            old_count = self.count[task_id].clone()\n            new_count = old_count + batch_size\n\n            # Update mean\n            task_mean = self._mean[task_id]\n            batch_mean = torch.mean(x_task, dim=0)\n            delta = batch_mean - task_mean\n            self._mean[task_id].copy_(task_mean + (batch_size / new_count) * delta)\n\n            # Update variance using Chan's parallel algorithm\n            if old_count > 0:\n                batch_var = torch.var(x_task, dim=0, unbiased=False)\n                m_a = self._var[task_id] * old_count\n                m_b = batch_var * batch_size\n                M2 = m_a + m_b + (delta**2) * (old_count * batch_size / new_count)\n                self._var[task_id].copy_(M2 / new_count)\n            else:\n                # For the first batch of this task\n                self._var[task_id].copy_(torch.var(x_task, dim=0, unbiased=False))\n\n            self._std[task_id].copy_(torch.sqrt(self._var[task_id]))\n            self.count[task_id].copy_(new_count)\n\n\nclass 
PerTaskRewardNormalizer(nn.Module):\n    def __init__(\n        self,\n        num_tasks: int,\n        gamma: float,\n        device: torch.device,\n        g_max: float = 10.0,\n        epsilon: float = 1e-8,\n    ):\n        \"\"\"\n        Per-task reward normalizer, motivation comes from BRC (https://arxiv.org/abs/2505.23150v1)\n        \"\"\"\n        super().__init__()\n        self.num_tasks = num_tasks\n        self.gamma = gamma\n        self.g_max = g_max\n        self.epsilon = epsilon\n        self.device = device\n\n        # Per-task running estimate of the discounted return\n        self.register_buffer(\"G\", torch.zeros(num_tasks, device=device))\n        # Per-task running-max of the discounted return\n        self.register_buffer(\"G_r_max\", torch.zeros(num_tasks, device=device))\n        # Use the new per-task normalizer for the statistics of G\n        self.G_rms = PerTaskEmpiricalNormalization(\n            num_tasks=num_tasks, shape=(1,), device=device\n        )\n\n    def _scale_reward(\n        self, rewards: torch.Tensor, task_ids: torch.Tensor\n    ) -> torch.Tensor:\n        \"\"\"\n        Scales rewards using per-task statistics.\n\n        Args:\n            rewards (torch.Tensor): Reward tensor, shape [num_envs].\n            task_ids (torch.Tensor): Task indices, shape [num_envs].\n        \"\"\"\n        # Gather stats for the tasks in the batch\n        std_for_batch = self.G_rms._std.gather(0, task_ids.unsqueeze(-1)).squeeze(-1)\n        g_r_max_for_batch = self.G_r_max.gather(0, task_ids)\n\n        var_denominator = std_for_batch + self.epsilon\n        min_required_denominator = g_r_max_for_batch / self.g_max\n        denominator = torch.maximum(var_denominator, min_required_denominator)\n\n        # Add a small epsilon to the final denominator to prevent division by zero\n        # in case g_r_max is also zero.\n        return rewards / (denominator + self.epsilon)\n\n    def update_stats(\n        self, rewards: 
torch.Tensor, dones: torch.Tensor, task_ids: torch.Tensor\n    ):\n        \"\"\"\n        Updates the running discounted return and its statistics for each task.\n\n        Args:\n            rewards (torch.Tensor): Reward tensor, shape [num_envs].\n            dones (torch.Tensor): Done tensor, shape [num_envs].\n            task_ids (torch.Tensor): Task indices, shape [num_envs].\n        \"\"\"\n        if not (rewards.shape == dones.shape == task_ids.shape):\n            raise ValueError(\"rewards, dones, and task_ids must have the same shape.\")\n\n        # === Update G (running discounted return) ===\n        # Gather the previous G values for the tasks in the batch\n        prev_G = self.G.gather(0, task_ids)\n        # Update G for each environment based on its own reward and done signal\n        new_G = self.gamma * (1 - dones.float()) * prev_G + rewards\n        # Scatter the updated G values back to the main buffer\n        self.G.scatter_(0, task_ids, new_G)\n\n        # === Update G_rms (statistics of G) ===\n        # The update function handles the per-task logic internally\n        self.G_rms.update(new_G.unsqueeze(-1), task_ids)\n\n        # === Update G_r_max (running max of |G|) ===\n        prev_G_r_max = self.G_r_max.gather(0, task_ids)\n        # Update the max for each environment\n        updated_G_r_max = torch.maximum(prev_G_r_max, torch.abs(new_G))\n        # Scatter the new maxes back to the main buffer\n        self.G_r_max.scatter_(0, task_ids, updated_G_r_max)\n\n    def forward(self, rewards: torch.Tensor, task_ids: torch.Tensor) -> torch.Tensor:\n        \"\"\"\n        Normalizes rewards. 
During training, it also updates the running statistics.\n\n        Args:\n            rewards (torch.Tensor): Reward tensor, shape [num_envs].\n            task_ids (torch.Tensor): Task indices, shape [num_envs].\n        \"\"\"\n        return self._scale_reward(rewards, task_ids)\n\n\ndef cpu_state(sd):\n    # detach & move to host without locking the compute stream\n    return {k: v.detach().to(\"cpu\", non_blocking=True) for k, v in sd.items()}\n\n\ndef save_params(\n    global_step,\n    actor,\n    qnet,\n    qnet_target,\n    obs_normalizer,\n    critic_obs_normalizer,\n    args,\n    save_path,\n):\n    \"\"\"Save model parameters and training configuration to disk.\"\"\"\n\n    def get_ddp_state_dict(model):\n        \"\"\"Get state dict from model, handling DDP wrapper if present.\"\"\"\n        if hasattr(model, \"module\"):\n            return model.module.state_dict()\n        return model.state_dict()\n\n    os.makedirs(os.path.dirname(save_path), exist_ok=True)\n    save_dict = {\n        \"actor_state_dict\": cpu_state(get_ddp_state_dict(actor)),\n        \"qnet_state_dict\": cpu_state(get_ddp_state_dict(qnet)),\n        \"qnet_target_state_dict\": cpu_state(get_ddp_state_dict(qnet_target)),\n        \"obs_normalizer_state\": (\n            cpu_state(obs_normalizer.state_dict())\n            if hasattr(obs_normalizer, \"state_dict\")\n            else None\n        ),\n        \"critic_obs_normalizer_state\": (\n            cpu_state(critic_obs_normalizer.state_dict())\n            if hasattr(critic_obs_normalizer, \"state_dict\")\n            else None\n        ),\n        \"args\": vars(args),  # Save all arguments\n        \"global_step\": global_step,\n    }\n    torch.save(save_dict, save_path, _use_new_zipfile_serialization=True)\n    print(f\"Saved parameters and configuration to {save_path}\")\n\n\ndef get_ddp_state_dict(model):\n    \"\"\"Get state dict from model, handling DDP wrapper if present.\"\"\"\n    if hasattr(model, 
\"module\"):\n        return model.module.state_dict()\n    return model.state_dict()\n\n\ndef load_ddp_state_dict(model, state_dict):\n    \"\"\"Load state dict into model, handling DDP wrapper if present.\"\"\"\n    if hasattr(model, \"module\"):\n        model.module.load_state_dict(state_dict)\n    else:\n        model.load_state_dict(state_dict)\n\n\n@torch.no_grad()\ndef mark_step():\n    # call this once per iteration *before* any compiled function\n    torch.compiler.cudagraph_mark_step_begin()\n"
  },
  {
    "path": "fast_td3/hyperparams.py",
    "content": "import os\nfrom dataclasses import dataclass\nfrom typing import Optional\n\nimport tyro\n\n\n@dataclass\nclass BaseArgs:\n    # Default hyperparameters -- specifically for HumanoidBench\n    # See MuJoCoPlaygroundArgs for default hyperparameters for MuJoCo Playground\n    # See IsaacLabArgs for default hyperparameters for IsaacLab\n    env_name: str = \"h1hand-stand-v0\"\n    \"\"\"the id of the environment\"\"\"\n    agent: str = \"fasttd3\"\n    \"\"\"the agent to use: currently supports [fasttd3, fasttd3_simbav2]\"\"\"\n    seed: int = 1\n    \"\"\"seed of the experiment\"\"\"\n    torch_deterministic: bool = True\n    \"\"\"if toggled, sets `torch.backends.cudnn.deterministic=True`\"\"\"\n    cuda: bool = True\n    \"\"\"if toggled, cuda will be enabled by default\"\"\"\n    device_rank: int = 0\n    \"\"\"the rank of the device\"\"\"\n    exp_name: str = os.path.basename(__file__)[: -len(\".py\")]\n    \"\"\"the name of this experiment\"\"\"\n    project: str = \"FastTD3\"\n    \"\"\"the project name\"\"\"\n    use_wandb: bool = True\n    \"\"\"whether to use wandb\"\"\"\n    checkpoint_path: Optional[str] = None\n    \"\"\"the path to the checkpoint file\"\"\"\n    num_envs: int = 128\n    \"\"\"the number of environments to run in parallel\"\"\"\n    num_eval_envs: int = 128\n    \"\"\"the number of evaluation environments to run in parallel (only valid for MuJoCo Playground)\"\"\"\n    total_timesteps: int = 150000\n    \"\"\"total timesteps of the experiments\"\"\"\n    critic_learning_rate: float = 3e-4\n    \"\"\"the learning rate of the critic\"\"\"\n    actor_learning_rate: float = 3e-4\n    \"\"\"the learning rate for the actor\"\"\"\n    critic_learning_rate_end: float = 3e-4\n    \"\"\"the learning rate of the critic at the end of training\"\"\"\n    actor_learning_rate_end: float = 3e-4\n    \"\"\"the learning rate for the actor at the end of training\"\"\"\n    buffer_size: int = 1024 * 50\n    \"\"\"the replay memory buffer size\"\"\"\n    num_steps: int = 1\n    \"\"\"the 
number of steps to use for the multi-step return\"\"\"\n    gamma: float = 0.99\n    \"\"\"the discount factor gamma\"\"\"\n    tau: float = 0.1\n    \"\"\"target network smoothing coefficient\"\"\"\n    batch_size: int = 32768\n    \"\"\"the batch size of samples from the replay memory\"\"\"\n    policy_noise: float = 0.001\n    \"\"\"the scale of policy noise\"\"\"\n    std_min: float = 0.001\n    \"\"\"the minimum scale of noise\"\"\"\n    std_max: float = 0.4\n    \"\"\"the maximum scale of noise\"\"\"\n    learning_starts: int = 10\n    \"\"\"timestep to start learning\"\"\"\n    policy_frequency: int = 2\n    \"\"\"the frequency of training policy (delayed)\"\"\"\n    noise_clip: float = 0.5\n    \"\"\"noise clip parameter of the Target Policy Smoothing Regularization\"\"\"\n    num_updates: int = 2\n    \"\"\"the number of updates to perform per step\"\"\"\n    init_scale: float = 0.01\n    \"\"\"the scale of the initial parameters\"\"\"\n    num_atoms: int = 101\n    \"\"\"the number of atoms\"\"\"\n    v_min: float = -250.0\n    \"\"\"the minimum value of the support\"\"\"\n    v_max: float = 250.0\n    \"\"\"the maximum value of the support\"\"\"\n    critic_hidden_dim: int = 1024\n    \"\"\"the hidden dimension of the critic network\"\"\"\n    actor_hidden_dim: int = 512\n    \"\"\"the hidden dimension of the actor network\"\"\"\n    critic_num_blocks: int = 2\n    \"\"\"(SimbaV2 only) the number of blocks in the critic network\"\"\"\n    actor_num_blocks: int = 1\n    \"\"\"(SimbaV2 only) the number of blocks in the actor network\"\"\"\n    use_cdq: bool = True\n    \"\"\"whether to use Clipped Double Q-learning\"\"\"\n    measure_burnin: int = 3\n    \"\"\"Number of burn-in iterations for speed measurement.\"\"\"\n    eval_interval: int = 5000\n    \"\"\"the interval to evaluate the model\"\"\"\n    render_interval: int = 5000\n    \"\"\"the interval to render the model\"\"\"\n    compile: bool = True\n    \"\"\"whether to use 
torch.compile.\"\"\"\n    compile_mode: str = \"reduce-overhead\"\n    \"\"\"the mode of torch.compile.\"\"\"\n    obs_normalization: bool = True\n    \"\"\"whether to enable observation normalization\"\"\"\n    reward_normalization: bool = False\n    \"\"\"whether to enable reward normalization\"\"\"\n    use_grad_norm_clipping: bool = False\n    \"\"\"whether to use gradient norm clipping.\"\"\"\n    max_grad_norm: float = 0.0\n    \"\"\"the maximum gradient norm\"\"\"\n    amp: bool = True\n    \"\"\"whether to use amp\"\"\"\n    amp_dtype: str = \"bf16\"\n    \"\"\"the dtype of the amp\"\"\"\n    disable_bootstrap: bool = False\n    \"\"\"Whether to disable bootstrap in the critic learning\"\"\"\n\n    use_domain_randomization: bool = False\n    \"\"\"(Playground only) whether to use domain randomization\"\"\"\n    use_push_randomization: bool = False\n    \"\"\"(Playground only) whether to use push randomization\"\"\"\n    use_tuned_reward: bool = False\n    \"\"\"(Playground only) Use tuned reward for G1\"\"\"\n    action_bounds: float = 1.0\n    \"\"\"(IsaacLab only) the bounds of the action space (-action_bounds, action_bounds)\"\"\"\n    task_embedding_dim: int = 32\n    \"\"\"the dimension of the task embedding\"\"\"\n\n    weight_decay: float = 0.1\n    \"\"\"the weight decay of the optimizer\"\"\"\n    save_interval: int = 5000\n    \"\"\"the interval to save the model\"\"\"\n\n\ndef get_args():\n    \"\"\"\n    Parse command-line arguments and return the appropriate Args instance based on env_name.\n    \"\"\"\n    # First, parse all arguments using the base Args class\n    base_args = tyro.cli(BaseArgs)\n\n    # Map environment names to their specific Args classes\n    # For tasks not here, default hyperparameters are used\n    # See below links for available task list\n    # - HumanoidBench (https://arxiv.org/abs/2403.10506)\n    # - IsaacLab (https://isaac-sim.github.io/IsaacLab/main/source/overview/environments.html)\n    # - MuJoCo Playground 
(https://arxiv.org/abs/2502.08844)\n    env_to_args_class = {\n        # HumanoidBench\n        # NOTE: These tasks are not the full list of HumanoidBench tasks\n        \"h1hand-reach-v0\": H1HandReachArgs,\n        \"h1hand-balance-simple-v0\": H1HandBalanceSimpleArgs,\n        \"h1hand-balance-hard-v0\": H1HandBalanceHardArgs,\n        \"h1hand-pole-v0\": H1HandPoleArgs,\n        \"h1hand-truck-v0\": H1HandTruckArgs,\n        \"h1hand-maze-v0\": H1HandMazeArgs,\n        \"h1hand-push-v0\": H1HandPushArgs,\n        \"h1hand-basketball-v0\": H1HandBasketballArgs,\n        \"h1hand-window-v0\": H1HandWindowArgs,\n        \"h1hand-package-v0\": H1HandPackageArgs,\n        # MuJoCo Playground\n        # NOTE: These tasks are not the full list of MuJoCo Playground tasks\n        \"G1JoystickFlatTerrain\": G1JoystickFlatTerrainArgs,\n        \"G1JoystickRoughTerrain\": G1JoystickRoughTerrainArgs,\n        \"T1JoystickFlatTerrain\": T1JoystickFlatTerrainArgs,\n        \"T1JoystickRoughTerrain\": T1JoystickRoughTerrainArgs,\n        \"LeapCubeReorient\": LeapCubeReorientArgs,\n        \"LeapCubeRotateZAxis\": LeapCubeRotateZAxisArgs,\n        \"Go1JoystickFlatTerrain\": Go1JoystickFlatTerrainArgs,\n        \"Go1JoystickRoughTerrain\": Go1JoystickRoughTerrainArgs,\n        \"Go1Getup\": Go1GetupArgs,\n        \"CheetahRun\": CheetahRunArgs,  # NOTE: Example config for DeepMind Control Suite\n        # IsaacLab\n        # NOTE: These tasks are not the full list of IsaacLab tasks\n        \"Isaac-Lift-Cube-Franka-v0\": IsaacLiftCubeFrankaArgs,\n        \"Isaac-Open-Drawer-Franka-v0\": IsaacOpenDrawerFrankaArgs,\n        \"Isaac-Velocity-Flat-H1-v0\": IsaacVelocityFlatH1Args,\n        \"Isaac-Velocity-Flat-G1-v0\": IsaacVelocityFlatG1Args,\n        \"Isaac-Velocity-Rough-H1-v0\": IsaacVelocityRoughH1Args,\n        \"Isaac-Velocity-Rough-G1-v0\": IsaacVelocityRoughG1Args,\n        \"Isaac-Repose-Cube-Allegro-Direct-v0\": 
IsaacReposeCubeAllegroDirectArgs,\n        \"Isaac-Repose-Cube-Shadow-Direct-v0\": IsaacReposeCubeShadowDirectArgs,\n        # MTBench\n        \"MTBench-meta-world-v2-mt10\": MetaWorldMT10Args,\n        \"MTBench-meta-world-v2-mt50\": MetaWorldMT50Args,\n    }\n    # If the provided env_name has a specific Args class, use it\n    if base_args.env_name in env_to_args_class:\n        specific_args_class = env_to_args_class[base_args.env_name]\n        # Re-parse with the specific class, maintaining any user overrides\n        specific_args = tyro.cli(specific_args_class)\n        return specific_args\n\n    if base_args.env_name.startswith(\"h1hand-\") or base_args.env_name.startswith(\"h1-\"):\n        # HumanoidBench\n        specific_args = tyro.cli(HumanoidBenchArgs)\n    elif base_args.env_name.startswith(\"Isaac-\"):\n        # IsaacLab\n        specific_args = tyro.cli(IsaacLabArgs)\n    elif base_args.env_name.startswith(\"MTBench-\"):\n        # MTBench\n        specific_args = tyro.cli(MTBenchArgs)\n    else:\n        # MuJoCo Playground\n        specific_args = tyro.cli(MuJoCoPlaygroundArgs)\n    return specific_args\n\n\n@dataclass\nclass HumanoidBenchArgs(BaseArgs):\n    # See HumanoidBench (https://arxiv.org/abs/2403.10506) for available task list\n    total_timesteps: int = 100000\n\n\n@dataclass\nclass H1HandReachArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-reach-v0\"\n    v_min: float = -2000.0\n    v_max: float = 2000.0\n\n\n@dataclass\nclass H1HandBalanceSimpleArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-balance-simple-v0\"\n    total_timesteps: int = 200000\n\n\n@dataclass\nclass H1HandBalanceHardArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-balance-hard-v0\"\n    total_timesteps: int = 1000000\n\n\n@dataclass\nclass H1HandPoleArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-pole-v0\"\n    total_timesteps: int = 150000\n\n\n@dataclass\nclass H1HandTruckArgs(HumanoidBenchArgs):\n    env_name: str = 
\"h1hand-truck-v0\"\n    v_min: float = -1000.0\n    v_max: float = 1000.0\n    total_timesteps: int = 500000\n\n\n@dataclass\nclass H1HandMazeArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-maze-v0\"\n    v_min: float = -1000.0\n    v_max: float = 1000.0\n\n\n@dataclass\nclass H1HandPushArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-push-v0\"\n    v_min: float = -1000.0\n    v_max: float = 1000.0\n    total_timesteps: int = 1000000\n\n\n@dataclass\nclass H1HandBasketballArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-basketball-v0\"\n    v_min: float = -2000.0\n    v_max: float = 2000.0\n    total_timesteps: int = 250000\n\n\n@dataclass\nclass H1HandWindowArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-window-v0\"\n    total_timesteps: int = 250000\n\n\n@dataclass\nclass H1HandPackageArgs(HumanoidBenchArgs):\n    env_name: str = \"h1hand-package-v0\"\n    v_min: float = -10000.0\n    v_max: float = 10000.0\n\n\n@dataclass\nclass MuJoCoPlaygroundArgs(BaseArgs):\n    # Default hyperparameters for many of the Playground environments\n    v_min: float = -10.0\n    v_max: float = 10.0\n    buffer_size: int = 1024 * 10\n    num_envs: int = 1024\n    num_eval_envs: int = 1024\n    gamma: float = 0.97\n\n\n@dataclass\nclass MTBenchArgs(BaseArgs):\n    # Default hyperparameters for MTBench\n    reward_normalization: bool = True\n    v_min: float = -10.0\n    v_max: float = 10.0\n    buffer_size: int = 2048  # 2K is usually enough for MTBench\n    num_envs: int = 4096\n    num_eval_envs: int = 4096\n    gamma: float = 0.97\n    num_steps: int = 8\n    compile_mode: str = \"default\"  # Multi-task training is not compatible with cudagraphs\n\n\n@dataclass\nclass MetaWorldMT10Args(MTBenchArgs):\n    # This config achieves 97 ~ 98% success rate within 10k steps (15-20 mins on A100)\n    env_name: str = \"MTBench-meta-world-v2-mt10\"\n    num_envs: int = 
4096\n    num_eval_envs: int = 4096\n    num_steps: int = 8\n    gamma: float = 0.97\n    compile_mode: str = \"default\"  # Multi-task training is not compatible with cudagraphs\n\n\n@dataclass\nclass MetaWorldMT50Args(MTBenchArgs):\n    # FastTD3 + SimbaV2 achieves >90% success rate within 20k steps (80 mins on A100)\n    # Performance further improves with more training steps, slowly.\n    env_name: str = \"MTBench-meta-world-v2-mt50\"\n    num_envs: int = 8192\n    num_eval_envs: int = 8192\n    num_steps: int = 8\n    gamma: float = 0.99\n    compile_mode: str = \"default\"  # Multi-task training is not compatible with cudagraphs\n\n\n@dataclass\nclass G1JoystickFlatTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"G1JoystickFlatTerrain\"\n    total_timesteps: int = 100000\n\n\n@dataclass\nclass G1JoystickRoughTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"G1JoystickRoughTerrain\"\n    total_timesteps: int = 100000\n\n\n@dataclass\nclass T1JoystickFlatTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"T1JoystickFlatTerrain\"\n    total_timesteps: int = 100000\n\n\n@dataclass\nclass T1JoystickRoughTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"T1JoystickRoughTerrain\"\n    total_timesteps: int = 100000\n\n\n@dataclass\nclass T1LowDofJoystickFlatTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"T1LowDofJoystickFlatTerrain\"\n    total_timesteps: int = 1000000\n\n\n@dataclass\nclass T1LowDofJoystickRoughTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"T1LowDofJoystickRoughTerrain\"\n    total_timesteps: int = 1000000\n\n\n@dataclass\nclass CheetahRunArgs(MuJoCoPlaygroundArgs):\n    # NOTE: This config will work for most DMC tasks, though we haven't tested DMC extensively.\n    # Future research can consider using LayerNorm as we find it sometimes works better for DMC tasks.\n    env_name: str = \"CheetahRun\"\n    num_steps: int = 3\n    v_min: float = -500.0\n    v_max: float = 500.0\n    std_min: float = 
0.1\n    policy_noise: float = 0.1\n\n\n@dataclass\nclass Go1JoystickFlatTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"Go1JoystickFlatTerrain\"\n    total_timesteps: int = 50000\n    std_min: float = 0.2\n    std_max: float = 0.8\n    policy_noise: float = 0.2\n    num_updates: int = 8\n\n\n@dataclass\nclass Go1JoystickRoughTerrainArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"Go1JoystickRoughTerrain\"\n    total_timesteps: int = 50000\n    std_min: float = 0.2\n    std_max: float = 0.8\n    policy_noise: float = 0.2\n    num_updates: int = 8\n\n\n@dataclass\nclass Go1GetupArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"Go1Getup\"\n    total_timesteps: int = 50000\n    std_min: float = 0.2\n    std_max: float = 0.8\n    policy_noise: float = 0.2\n    num_updates: int = 8\n\n\n@dataclass\nclass LeapCubeReorientArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"LeapCubeReorient\"\n    num_steps: int = 3\n    gamma: float = 0.99\n    policy_noise: float = 0.2\n    v_min: float = -50.0\n    v_max: float = 50.0\n    use_cdq: bool = False\n\n\n@dataclass\nclass LeapCubeRotateZAxisArgs(MuJoCoPlaygroundArgs):\n    env_name: str = \"LeapCubeRotateZAxis\"\n    num_steps: int = 1\n    policy_noise: float = 0.2\n    gamma: float = 0.99\n    v_min: float = -10.0\n    v_max: float = 10.0\n    use_cdq: bool = False\n\n\n@dataclass\nclass IsaacLabArgs(BaseArgs):\n    v_min: float = -10.0\n    v_max: float = 10.0\n    buffer_size: int = 1024 * 10\n    num_envs: int = 4096\n    num_eval_envs: int = 4096\n    action_bounds: float = 1.0\n    std_max: float = 0.4\n    num_atoms: int = 251\n    render_interval: int = 0  # IsaacLab does not support rendering in our codebase\n    total_timesteps: int = 100000\n\n\n@dataclass\nclass IsaacLiftCubeFrankaArgs(IsaacLabArgs):\n    # Value learning is unstable for the Lift Cube task due to brittle reward shaping\n    # Therefore, we need to disable bootstrap from 'reset_obs' in IsaacLab\n    # Higher UTD works better for 
manipulation tasks\n    env_name: str = \"Isaac-Lift-Cube-Franka-v0\"\n    num_updates: int = 8\n    v_min: float = -50.0\n    v_max: float = 50.0\n    std_max: float = 0.8\n    num_envs: int = 1024\n    num_eval_envs: int = 1024\n    action_bounds: float = 3.0\n    disable_bootstrap: bool = True\n    total_timesteps: int = 20000\n\n\n@dataclass\nclass IsaacOpenDrawerFrankaArgs(IsaacLabArgs):\n    # Higher UTD works better for manipulation tasks\n    env_name: str = \"Isaac-Open-Drawer-Franka-v0\"\n    v_min: float = -50.0\n    v_max: float = 50.0\n    num_updates: int = 8\n    action_bounds: float = 3.0\n    total_timesteps: int = 20000\n\n\n@dataclass\nclass IsaacVelocityFlatH1Args(IsaacLabArgs):\n    env_name: str = \"Isaac-Velocity-Flat-H1-v0\"\n    num_steps: int = 8\n    num_updates: int = 4\n    total_timesteps: int = 75000\n\n\n@dataclass\nclass IsaacVelocityFlatG1Args(IsaacLabArgs):\n    env_name: str = \"Isaac-Velocity-Flat-G1-v0\"\n    num_steps: int = 8\n    num_updates: int = 4\n    total_timesteps: int = 50000\n\n\n@dataclass\nclass IsaacVelocityRoughH1Args(IsaacLabArgs):\n    env_name: str = \"Isaac-Velocity-Rough-H1-v0\"\n    num_steps: int = 8\n    num_updates: int = 4\n    buffer_size: int = 1024 * 5  # To reduce memory usage\n    total_timesteps: int = 50000\n\n\n@dataclass\nclass IsaacVelocityRoughG1Args(IsaacLabArgs):\n    env_name: str = \"Isaac-Velocity-Rough-G1-v0\"\n    num_steps: int = 8\n    num_updates: int = 4\n    buffer_size: int = 1024 * 5  # To reduce memory usage\n    total_timesteps: int = 50000\n\n\n@dataclass\nclass IsaacReposeCubeAllegroDirectArgs(IsaacLabArgs):\n    env_name: str = \"Isaac-Repose-Cube-Allegro-Direct-v0\"\n    total_timesteps: int = 100000\n    v_min: float = -500.0\n    v_max: float = 500.0\n\n\n@dataclass\nclass IsaacReposeCubeShadowDirectArgs(IsaacLabArgs):\n    env_name: str = \"Isaac-Repose-Cube-Shadow-Direct-v0\"\n    total_timesteps: int = 100000\n    v_min: float = -500.0\n    v_max: float = 500.0\n"
  },
  {
    "path": "fast_td3/train.py",
    "content": "import os\nimport sys\n\nos.environ[\"TORCHDYNAMO_INLINE_INBUILT_NN_MODULES\"] = \"1\"\nos.environ[\"OMP_NUM_THREADS\"] = \"1\"\nif sys.platform != \"darwin\":\n    os.environ[\"MUJOCO_GL\"] = \"egl\"\nelse:\n    os.environ[\"MUJOCO_GL\"] = \"glfw\"\nos.environ[\"XLA_PYTHON_CLIENT_PREALLOCATE\"] = \"false\"\nos.environ[\"JAX_DEFAULT_MATMUL_PRECISION\"] = \"highest\"\n\nimport random\nimport time\nimport math\n\nimport tqdm\nimport wandb\nimport numpy as np\n\ntry:\n    # Required for avoiding IsaacGym import error\n    import isaacgym\nexcept ImportError:\n    pass\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nfrom torch.amp import autocast, GradScaler\n\nfrom tensordict import TensorDict\n\nfrom fast_td3_utils import (\n    EmpiricalNormalization,\n    RewardNormalizer,\n    PerTaskRewardNormalizer,\n    SimpleReplayBuffer,\n    save_params,\n    mark_step,\n)\nfrom hyperparams import get_args\n\ntorch.set_float32_matmul_precision(\"high\")\n\ntry:\n    import jax.numpy as jnp\nexcept ImportError:\n    pass\n\n\ndef main():\n    args = get_args()\n    print(args)\n    run_name = f\"{args.env_name}__{args.exp_name}__{args.seed}\"\n\n    amp_enabled = args.amp and args.cuda and torch.cuda.is_available()\n    amp_device_type = (\n        \"cuda\"\n        if args.cuda and torch.cuda.is_available()\n        else \"mps\" if args.cuda and torch.backends.mps.is_available() else \"cpu\"\n    )\n    amp_dtype = torch.bfloat16 if args.amp_dtype == \"bf16\" else torch.float16\n\n    scaler = GradScaler(enabled=amp_enabled and amp_dtype == torch.float16)\n\n    if args.use_wandb:\n        wandb.init(\n            project=args.project,\n            name=run_name,\n            config=vars(args),\n            save_code=True,\n        )\n\n    random.seed(args.seed)\n    np.random.seed(args.seed)\n    torch.manual_seed(args.seed)\n    torch.backends.cudnn.deterministic = args.torch_deterministic\n\n    if not 
args.cuda:\n        device = torch.device(\"cpu\")\n    else:\n        if torch.cuda.is_available():\n            device = torch.device(f\"cuda:{args.device_rank}\")\n        elif torch.backends.mps.is_available():\n            device = torch.device(f\"mps:{args.device_rank}\")\n        else:\n            raise ValueError(\"No GPU available\")\n    print(f\"Using device: {device}\")\n\n    if args.env_name.startswith(\"h1hand-\") or args.env_name.startswith(\"h1-\"):\n        from environments.humanoid_bench_env import HumanoidBenchEnv\n\n        env_type = \"humanoid_bench\"\n        envs = HumanoidBenchEnv(args.env_name, args.num_envs, device=device)\n        eval_envs = envs\n        render_env = HumanoidBenchEnv(\n            args.env_name, 1, render_mode=\"rgb_array\", device=device\n        )\n    elif args.env_name.startswith(\"Isaac-\"):\n        from environments.isaaclab_env import IsaacLabEnv\n\n        env_type = \"isaaclab\"\n        envs = IsaacLabEnv(\n            args.env_name,\n            device.type,\n            args.num_envs,\n            args.seed,\n            action_bounds=args.action_bounds,\n        )\n        eval_envs = envs\n        render_env = envs\n    elif args.env_name.startswith(\"MTBench-\"):\n        from environments.mtbench_env import MTBenchEnv\n\n        env_name = \"-\".join(args.env_name.split(\"-\")[1:])\n        env_type = \"mtbench\"\n        envs = MTBenchEnv(env_name, args.device_rank, args.num_envs, args.seed)\n        eval_envs = envs\n        render_env = envs\n    else:\n        from environments.mujoco_playground_env import make_env\n\n        # TODO: Check if re-using same envs for eval could reduce memory usage\n        env_type = \"mujoco_playground\"\n        envs, eval_envs, render_env = make_env(\n            args.env_name,\n            args.seed,\n            args.num_envs,\n            args.num_eval_envs,\n            args.device_rank,\n            use_tuned_reward=args.use_tuned_reward,\n            
use_domain_randomization=args.use_domain_randomization,\n            use_push_randomization=args.use_push_randomization,\n        )\n\n    n_act = envs.num_actions\n    n_obs = envs.num_obs if type(envs.num_obs) == int else envs.num_obs[0]\n    if envs.asymmetric_obs:\n        n_critic_obs = (\n            envs.num_privileged_obs\n            if type(envs.num_privileged_obs) == int\n            else envs.num_privileged_obs[0]\n        )\n    else:\n        n_critic_obs = n_obs\n    action_low, action_high = -1.0, 1.0\n\n    if args.obs_normalization:\n        obs_normalizer = EmpiricalNormalization(shape=n_obs, device=device)\n        critic_obs_normalizer = EmpiricalNormalization(\n            shape=n_critic_obs, device=device\n        )\n    else:\n        obs_normalizer = nn.Identity()\n        critic_obs_normalizer = nn.Identity()\n\n    if args.reward_normalization:\n        if env_type in [\"mtbench\"]:\n            reward_normalizer = PerTaskRewardNormalizer(\n                num_tasks=envs.num_tasks,\n                gamma=args.gamma,\n                device=device,\n                g_max=min(abs(args.v_min), abs(args.v_max)),\n            )\n        else:\n            reward_normalizer = RewardNormalizer(\n                gamma=args.gamma,\n                device=device,\n                g_max=min(abs(args.v_min), abs(args.v_max)),\n            )\n    else:\n        reward_normalizer = nn.Identity()\n\n    actor_kwargs = {\n        \"n_obs\": n_obs,\n        \"n_act\": n_act,\n        \"num_envs\": args.num_envs,\n        \"device\": device,\n        \"init_scale\": args.init_scale,\n        \"hidden_dim\": args.actor_hidden_dim,\n        \"std_min\": args.std_min,\n        \"std_max\": args.std_max,\n    }\n    critic_kwargs = {\n        \"n_obs\": n_critic_obs,\n        \"n_act\": n_act,\n        \"num_atoms\": args.num_atoms,\n        \"v_min\": args.v_min,\n        \"v_max\": args.v_max,\n        \"hidden_dim\": args.critic_hidden_dim,\n        
\"device\": device,\n    }\n\n    if env_type == \"mtbench\":\n        actor_kwargs[\"n_obs\"] = n_obs - envs.num_tasks + args.task_embedding_dim\n        critic_kwargs[\"n_obs\"] = n_critic_obs - envs.num_tasks + args.task_embedding_dim\n        actor_kwargs[\"num_tasks\"] = envs.num_tasks\n        actor_kwargs[\"task_embedding_dim\"] = args.task_embedding_dim\n        critic_kwargs[\"num_tasks\"] = envs.num_tasks\n        critic_kwargs[\"task_embedding_dim\"] = args.task_embedding_dim\n\n    if args.agent == \"fasttd3\":\n        if env_type in [\"mtbench\"]:\n            from fast_td3 import MultiTaskActor, MultiTaskCritic\n\n            actor_cls = MultiTaskActor\n            critic_cls = MultiTaskCritic\n        else:\n            from fast_td3 import Actor, Critic\n\n            actor_cls = Actor\n            critic_cls = Critic\n\n        print(\"Using FastTD3\")\n    elif args.agent == \"fasttd3_simbav2\":\n        if env_type in [\"mtbench\"]:\n            from fast_td3_simbav2 import MultiTaskActor, MultiTaskCritic\n\n            actor_cls = MultiTaskActor\n            critic_cls = MultiTaskCritic\n        else:\n            from fast_td3_simbav2 import Actor, Critic\n\n            actor_cls = Actor\n            critic_cls = Critic\n\n        print(\"Using FastTD3 + SimbaV2\")\n        actor_kwargs.pop(\"init_scale\")\n        actor_kwargs.update(\n            {\n                \"scaler_init\": math.sqrt(2.0 / args.actor_hidden_dim),\n                \"scaler_scale\": math.sqrt(2.0 / args.actor_hidden_dim),\n                \"alpha_init\": 1.0 / (args.actor_num_blocks + 1),\n                \"alpha_scale\": 1.0 / math.sqrt(args.actor_hidden_dim),\n                \"expansion\": 4,\n                \"c_shift\": 3.0,\n                \"num_blocks\": args.actor_num_blocks,\n            }\n        )\n        critic_kwargs.update(\n            {\n                \"scaler_init\": math.sqrt(2.0 / args.critic_hidden_dim),\n                \"scaler_scale\": 
math.sqrt(2.0 / args.critic_hidden_dim),\n                \"alpha_init\": 1.0 / (args.critic_num_blocks + 1),\n                \"alpha_scale\": 1.0 / math.sqrt(args.critic_hidden_dim),\n                \"num_blocks\": args.critic_num_blocks,\n                \"expansion\": 4,\n                \"c_shift\": 3.0,\n            }\n        )\n    else:\n        raise ValueError(f\"Agent {args.agent} not supported\")\n\n    actor = actor_cls(**actor_kwargs)\n\n    if env_type in [\"mtbench\"]:\n        # Python 3.8 doesn't support 'from_module' in tensordict\n        policy = actor.explore\n    else:\n        from tensordict import from_module\n\n        actor_detach = actor_cls(**actor_kwargs)\n        # Copy params to actor_detach without grad\n        from_module(actor).data.to_module(actor_detach)\n        policy = actor_detach.explore\n\n    qnet = critic_cls(**critic_kwargs)\n    qnet_target = critic_cls(**critic_kwargs)\n    qnet_target.load_state_dict(qnet.state_dict())\n\n    q_optimizer = optim.AdamW(\n        list(qnet.parameters()),\n        lr=torch.tensor(args.critic_learning_rate, device=device),\n        weight_decay=args.weight_decay,\n    )\n    actor_optimizer = optim.AdamW(\n        list(actor.parameters()),\n        lr=torch.tensor(args.actor_learning_rate, device=device),\n        weight_decay=args.weight_decay,\n    )\n\n    # Add learning rate schedulers\n    q_scheduler = optim.lr_scheduler.CosineAnnealingLR(\n        q_optimizer,\n        T_max=args.total_timesteps,\n        eta_min=torch.tensor(args.critic_learning_rate_end, device=device),\n    )\n    actor_scheduler = optim.lr_scheduler.CosineAnnealingLR(\n        actor_optimizer,\n        T_max=args.total_timesteps,\n        eta_min=torch.tensor(args.actor_learning_rate_end, device=device),\n    )\n\n    rb = SimpleReplayBuffer(\n        n_env=args.num_envs,\n        buffer_size=args.buffer_size,\n        n_obs=n_obs,\n        n_act=n_act,\n        n_critic_obs=n_critic_obs,\n        
asymmetric_obs=envs.asymmetric_obs,\n        playground_mode=env_type == \"mujoco_playground\",\n        n_steps=args.num_steps,\n        gamma=args.gamma,\n        device=device,\n    )\n\n    policy_noise = args.policy_noise\n    noise_clip = args.noise_clip\n\n    def evaluate():\n        num_eval_envs = eval_envs.num_envs\n        episode_returns = torch.zeros(num_eval_envs, device=device)\n        episode_lengths = torch.zeros(num_eval_envs, device=device)\n        done_masks = torch.zeros(num_eval_envs, dtype=torch.bool, device=device)\n\n        if env_type == \"isaaclab\":\n            obs = eval_envs.reset(random_start_init=False)\n        else:\n            obs = eval_envs.reset()\n\n        # Run for a fixed number of steps\n        for i in range(eval_envs.max_episode_steps):\n            with torch.no_grad(), autocast(\n                device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n            ):\n                obs = normalize_obs(obs, update=False)\n                actions = actor(obs)\n\n            next_obs, rewards, dones, infos = eval_envs.step(actions.float())\n\n            if env_type == \"mtbench\":\n                # We only report success rate in MTBench evaluation\n                rewards = (\n                    infos[\"episode\"][\"success\"].float() if \"episode\" in infos else 0.0\n                )\n            episode_returns = torch.where(\n                ~done_masks, episode_returns + rewards, episode_returns\n            )\n            episode_lengths = torch.where(\n                ~done_masks, episode_lengths + 1, episode_lengths\n            )\n            if env_type == \"mtbench\" and \"episode\" in infos:\n                dones = dones | infos[\"episode\"][\"success\"]\n            done_masks = torch.logical_or(done_masks, dones)\n            if done_masks.all():\n                break\n            obs = next_obs\n\n        return episode_returns.mean().item(), episode_lengths.mean().item()\n\n    def 
render_with_rollout():\n        # Quick rollout for rendering\n        if env_type == \"humanoid_bench\":\n            obs = render_env.reset()\n            renders = [render_env.render()]\n        elif env_type in [\"isaaclab\", \"mtbench\"]:\n            raise NotImplementedError(\n                \"We don't support rendering for IsaacLab and MTBench environments\"\n            )\n        else:\n            obs = render_env.reset()\n            render_env.state.info[\"command\"] = jnp.array([[1.0, 0.0, 0.0]])\n            renders = [render_env.state]\n        for i in range(render_env.max_episode_steps):\n            with torch.no_grad(), autocast(\n                device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n            ):\n                obs = normalize_obs(obs, update=False)\n                actions = actor(obs)\n            next_obs, _, done, _ = render_env.step(actions.float())\n            if env_type == \"mujoco_playground\":\n                render_env.state.info[\"command\"] = jnp.array([[1.0, 0.0, 0.0]])\n            if i % 2 == 0:\n                if env_type == \"humanoid_bench\":\n                    renders.append(render_env.render())\n                else:\n                    renders.append(render_env.state)\n            if done.any():\n                break\n            obs = next_obs\n\n        if env_type == \"mujoco_playground\":\n            renders = render_env.render_trajectory(renders)\n        return renders\n\n    def update_main(data, logs_dict):\n        with autocast(\n            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n        ):\n            observations = data[\"observations\"]\n            next_observations = data[\"next\"][\"observations\"]\n            if envs.asymmetric_obs:\n                critic_observations = data[\"critic_observations\"]\n                next_critic_observations = data[\"next\"][\"critic_observations\"]\n            else:\n                
critic_observations = observations\n                next_critic_observations = next_observations\n            actions = data[\"actions\"]\n            rewards = data[\"next\"][\"rewards\"]\n            dones = data[\"next\"][\"dones\"].bool()\n            truncations = data[\"next\"][\"truncations\"].bool()\n            if args.disable_bootstrap:\n                bootstrap = (~dones).float()\n            else:\n                bootstrap = (truncations | ~dones).float()\n\n            clipped_noise = torch.randn_like(actions)\n            clipped_noise = clipped_noise.mul(policy_noise).clamp(\n                -noise_clip, noise_clip\n            )\n\n            next_state_actions = (actor(next_observations) + clipped_noise).clamp(\n                action_low, action_high\n            )\n            discount = args.gamma ** data[\"next\"][\"effective_n_steps\"]\n\n            with torch.no_grad():\n                qf1_next_target_projected, qf2_next_target_projected = (\n                    qnet_target.projection(\n                        next_critic_observations,\n                        next_state_actions,\n                        rewards,\n                        bootstrap,\n                        discount,\n                    )\n                )\n                qf1_next_target_value = qnet_target.get_value(qf1_next_target_projected)\n                qf2_next_target_value = qnet_target.get_value(qf2_next_target_projected)\n                if args.use_cdq:\n                    qf_next_target_dist = torch.where(\n                        qf1_next_target_value.unsqueeze(1)\n                        < qf2_next_target_value.unsqueeze(1),\n                        qf1_next_target_projected,\n                        qf2_next_target_projected,\n                    )\n                    qf1_next_target_dist = qf2_next_target_dist = qf_next_target_dist\n                else:\n                    qf1_next_target_dist, qf2_next_target_dist = (\n                        
qf1_next_target_projected,\n                        qf2_next_target_projected,\n                    )\n\n            qf1, qf2 = qnet(critic_observations, actions)\n            qf1_loss = -torch.sum(\n                qf1_next_target_dist * F.log_softmax(qf1, dim=1), dim=1\n            ).mean()\n            qf2_loss = -torch.sum(\n                qf2_next_target_dist * F.log_softmax(qf2, dim=1), dim=1\n            ).mean()\n            qf_loss = qf1_loss + qf2_loss\n\n        q_optimizer.zero_grad(set_to_none=True)\n        scaler.scale(qf_loss).backward()\n        scaler.unscale_(q_optimizer)\n\n        if args.use_grad_norm_clipping:\n            critic_grad_norm = torch.nn.utils.clip_grad_norm_(\n                qnet.parameters(),\n                max_norm=args.max_grad_norm if args.max_grad_norm > 0 else float(\"inf\"),\n            )\n        else:\n            critic_grad_norm = torch.tensor(0.0, device=device)\n        scaler.step(q_optimizer)\n        scaler.update()\n\n        logs_dict[\"critic_grad_norm\"] = critic_grad_norm.detach()\n        logs_dict[\"qf_loss\"] = qf_loss.detach()\n        logs_dict[\"qf_max\"] = qf1_next_target_value.max().detach()\n        logs_dict[\"qf_min\"] = qf1_next_target_value.min().detach()\n        return logs_dict\n\n    def update_pol(data, logs_dict):\n        with autocast(\n            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n        ):\n            critic_observations = (\n                data[\"critic_observations\"]\n                if envs.asymmetric_obs\n                else data[\"observations\"]\n            )\n\n            qf1, qf2 = qnet(critic_observations, actor(data[\"observations\"]))\n            qf1_value = qnet.get_value(F.softmax(qf1, dim=1))\n            qf2_value = qnet.get_value(F.softmax(qf2, dim=1))\n            if args.use_cdq:\n                qf_value = torch.minimum(qf1_value, qf2_value)\n            else:\n                qf_value = (qf1_value + qf2_value) / 2.0\n    
        actor_loss = -qf_value.mean()\n\n        actor_optimizer.zero_grad(set_to_none=True)\n        scaler.scale(actor_loss).backward()\n        scaler.unscale_(actor_optimizer)\n        if args.use_grad_norm_clipping:\n            actor_grad_norm = torch.nn.utils.clip_grad_norm_(\n                actor.parameters(),\n                max_norm=args.max_grad_norm if args.max_grad_norm > 0 else float(\"inf\"),\n            )\n        else:\n            actor_grad_norm = torch.tensor(0.0, device=device)\n        scaler.step(actor_optimizer)\n        scaler.update()\n        logs_dict[\"actor_grad_norm\"] = actor_grad_norm.detach()\n        logs_dict[\"actor_loss\"] = actor_loss.detach()\n        return logs_dict\n\n    @torch.no_grad()\n    def soft_update(src, tgt, tau: float):\n        src_ps = [p.data for p in src.parameters()]\n        tgt_ps = [p.data for p in tgt.parameters()]\n\n        torch._foreach_mul_(tgt_ps, 1.0 - tau)\n        torch._foreach_add_(tgt_ps, src_ps, alpha=tau)\n\n    if args.compile:\n        compile_mode = args.compile_mode\n        update_main = torch.compile(update_main, mode=compile_mode)\n        update_pol = torch.compile(update_pol, mode=compile_mode)\n        policy = torch.compile(policy, mode=None)\n        normalize_obs = torch.compile(obs_normalizer.forward, mode=None)\n        normalize_critic_obs = torch.compile(critic_obs_normalizer.forward, mode=None)\n        if args.reward_normalization:\n            update_stats = torch.compile(reward_normalizer.update_stats, mode=None)\n        normalize_reward = torch.compile(reward_normalizer.forward, mode=None)\n    else:\n        normalize_obs = obs_normalizer.forward\n        normalize_critic_obs = critic_obs_normalizer.forward\n        if args.reward_normalization:\n            update_stats = reward_normalizer.update_stats\n        normalize_reward = reward_normalizer.forward\n\n    if envs.asymmetric_obs:\n        obs, critic_obs = envs.reset_with_critic_obs()\n        critic_obs 
= torch.as_tensor(critic_obs, device=device, dtype=torch.float)\n    else:\n        obs = envs.reset()\n    if args.checkpoint_path:\n        # Load checkpoint if specified\n        torch_checkpoint = torch.load(\n            f\"{args.checkpoint_path}\", map_location=device, weights_only=False\n        )\n        actor.load_state_dict(torch_checkpoint[\"actor_state_dict\"])\n        obs_normalizer.load_state_dict(torch_checkpoint[\"obs_normalizer_state\"])\n        critic_obs_normalizer.load_state_dict(\n            torch_checkpoint[\"critic_obs_normalizer_state\"]\n        )\n        qnet.load_state_dict(torch_checkpoint[\"qnet_state_dict\"])\n        qnet_target.load_state_dict(torch_checkpoint[\"qnet_target_state_dict\"])\n        global_step = torch_checkpoint[\"global_step\"]\n    else:\n        global_step = 0\n\n    dones = None\n    pbar = tqdm.tqdm(total=args.total_timesteps, initial=global_step)\n    start_time = None\n    desc = \"\"\n\n    while global_step < args.total_timesteps:\n        mark_step()\n        logs_dict = TensorDict()\n        if (\n            start_time is None\n            and global_step >= args.measure_burnin + args.learning_starts\n        ):\n            start_time = time.time()\n            measure_burnin = global_step\n\n        with torch.no_grad(), autocast(\n            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n        ):\n            norm_obs = normalize_obs(obs)\n            actions = policy(obs=norm_obs, dones=dones)\n\n        next_obs, rewards, dones, infos = envs.step(actions.float())\n        truncations = infos[\"time_outs\"]\n\n        if args.reward_normalization:\n            if env_type == \"mtbench\":\n                task_ids_one_hot = obs[..., -envs.num_tasks :]\n                task_indices = torch.argmax(task_ids_one_hot, dim=1)\n                update_stats(rewards, dones.float(), task_ids=task_indices)\n            else:\n                update_stats(rewards, dones.float())\n\n     
   if envs.asymmetric_obs:\n            next_critic_obs = infos[\"observations\"][\"critic\"]\n        # Compute 'true' next_obs and next_critic_obs for saving\n        true_next_obs = torch.where(\n            dones[:, None] > 0, infos[\"observations\"][\"raw\"][\"obs\"], next_obs\n        )\n        if envs.asymmetric_obs:\n            true_next_critic_obs = torch.where(\n                dones[:, None] > 0,\n                infos[\"observations\"][\"raw\"][\"critic_obs\"],\n                next_critic_obs,\n            )\n\n        transition = TensorDict(\n            {\n                \"observations\": obs,\n                \"actions\": torch.as_tensor(actions, device=device, dtype=torch.float),\n                \"next\": {\n                    \"observations\": true_next_obs,\n                    \"rewards\": torch.as_tensor(\n                        rewards, device=device, dtype=torch.float\n                    ),\n                    \"truncations\": truncations.long(),\n                    \"dones\": dones.long(),\n                },\n            },\n            batch_size=(envs.num_envs,),\n            device=device,\n        )\n        if envs.asymmetric_obs:\n            transition[\"critic_observations\"] = critic_obs\n            transition[\"next\"][\"critic_observations\"] = true_next_critic_obs\n        rb.extend(transition)\n\n        obs = next_obs\n        if envs.asymmetric_obs:\n            critic_obs = next_critic_obs\n\n        if global_step > args.learning_starts:\n            for i in range(args.num_updates):\n                data = rb.sample(max(1, args.batch_size // args.num_envs))\n                data[\"observations\"] = normalize_obs(data[\"observations\"])\n                data[\"next\"][\"observations\"] = normalize_obs(\n                    data[\"next\"][\"observations\"]\n                )\n                if envs.asymmetric_obs:\n                    data[\"critic_observations\"] = normalize_critic_obs(\n                        
data[\"critic_observations\"]\n                    )\n                    data[\"next\"][\"critic_observations\"] = normalize_critic_obs(\n                        data[\"next\"][\"critic_observations\"]\n                    )\n                raw_rewards = data[\"next\"][\"rewards\"]\n                if env_type in [\"mtbench\"] and args.reward_normalization:\n                    # Multi-task reward normalization\n                    task_ids_one_hot = data[\"observations\"][..., -envs.num_tasks :]\n                    task_indices = torch.argmax(task_ids_one_hot, dim=1)\n                    data[\"next\"][\"rewards\"] = normalize_reward(\n                        raw_rewards, task_ids=task_indices\n                    )\n                else:\n                    data[\"next\"][\"rewards\"] = normalize_reward(raw_rewards)\n\n                logs_dict = update_main(data, logs_dict)\n                if args.num_updates > 1:\n                    if i % args.policy_frequency == 1:\n                        logs_dict = update_pol(data, logs_dict)\n                else:\n                    if global_step % args.policy_frequency == 0:\n                        logs_dict = update_pol(data, logs_dict)\n\n                soft_update(qnet, qnet_target, args.tau)\n\n            if global_step % 100 == 0 and start_time is not None:\n                speed = (global_step - measure_burnin) / (time.time() - start_time)\n                pbar.set_description(f\"{speed: 4.4f} sps, \" + desc)\n                with torch.no_grad():\n                    logs = {\n                        \"actor_loss\": logs_dict[\"actor_loss\"].mean(),\n                        \"qf_loss\": logs_dict[\"qf_loss\"].mean(),\n                        \"qf_max\": logs_dict[\"qf_max\"].mean(),\n                        \"qf_min\": logs_dict[\"qf_min\"].mean(),\n                        \"actor_grad_norm\": logs_dict[\"actor_grad_norm\"].mean(),\n                        \"critic_grad_norm\": 
logs_dict[\"critic_grad_norm\"].mean(),\n                        \"env_rewards\": rewards.mean(),\n                        \"buffer_rewards\": raw_rewards.mean(),\n                    }\n\n                    if args.eval_interval > 0 and global_step % args.eval_interval == 0:\n                        print(f\"Evaluating at global step {global_step}\")\n                        eval_avg_return, eval_avg_length = evaluate()\n                        if env_type in [\"humanoid_bench\", \"isaaclab\", \"mtbench\"]:\n                            # NOTE: Hacky way of evaluating performance, but just works\n                            obs = envs.reset()\n                        logs[\"eval_avg_return\"] = eval_avg_return\n                        logs[\"eval_avg_length\"] = eval_avg_length\n\n                    if (\n                        args.render_interval > 0\n                        and global_step % args.render_interval == 0\n                    ):\n                        renders = render_with_rollout()\n                        render_video = wandb.Video(\n                            np.array(renders).transpose(\n                                0, 3, 1, 2\n                            ),  # Convert to (T, C, H, W) format\n                            fps=30,\n                            format=\"gif\",\n                        )\n                        logs[\"render_video\"] = render_video\n                if args.use_wandb:\n                    wandb.log(\n                        {\n                            \"speed\": speed,\n                            \"frame\": global_step * args.num_envs,\n                            \"critic_lr\": q_scheduler.get_last_lr()[0],\n                            \"actor_lr\": actor_scheduler.get_last_lr()[0],\n                            **logs,\n                        },\n                        step=global_step,\n                    )\n\n            if (\n                args.save_interval > 0\n                and global_step > 
0\n                and global_step % args.save_interval == 0\n            ):\n                print(f\"Saving model at global step {global_step}\")\n                save_params(\n                    global_step,\n                    actor,\n                    qnet,\n                    qnet_target,\n                    obs_normalizer,\n                    critic_obs_normalizer,\n                    args,\n                    f\"models/{run_name}_{global_step}.pt\",\n                )\n\n        global_step += 1\n        actor_scheduler.step()\n        q_scheduler.step()\n        pbar.update(1)\n\n    save_params(\n        global_step,\n        actor,\n        qnet,\n        qnet_target,\n        obs_normalizer,\n        critic_obs_normalizer,\n        args,\n        f\"models/{run_name}_final.pt\",\n    )\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "fast_td3/train_multigpu.py",
    "content": "import os\nimport sys\n\nos.environ[\"TORCHDYNAMO_INLINE_INBUILT_NN_MODULES\"] = \"1\"\nos.environ[\"OMP_NUM_THREADS\"] = \"1\"\nif sys.platform != \"darwin\":\n    os.environ[\"MUJOCO_GL\"] = \"egl\"\nelse:\n    os.environ[\"MUJOCO_GL\"] = \"glfw\"\nos.environ[\"XLA_PYTHON_CLIENT_PREALLOCATE\"] = \"false\"\nos.environ[\"JAX_DEFAULT_MATMUL_PRECISION\"] = \"highest\"\n\nimport random\nimport time\nimport math\n\nimport tqdm\nimport wandb\nimport numpy as np\n\ntry:\n    # Required for avoiding IsaacGym import error\n    import isaacgym\nexcept ImportError:\n    pass\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nfrom torch.amp import autocast, GradScaler\nfrom torch.nn.parallel import DistributedDataParallel as DDP\nimport torch.multiprocessing as mp\n\nfrom tensordict import TensorDict\n\nfrom fast_td3_utils import (\n    EmpiricalNormalization,\n    RewardNormalizer,\n    PerTaskRewardNormalizer,\n    SimpleReplayBuffer,\n    save_params,\n    get_ddp_state_dict,\n    load_ddp_state_dict,\n    mark_step,\n)\nfrom hyperparams import get_args\n\ntorch.set_float32_matmul_precision(\"high\")\n\ntry:\n    import jax.numpy as jnp\nexcept ImportError:\n    pass\n\n\ndef setup_distributed(rank: int, world_size: int):\n    os.environ[\"MASTER_ADDR\"] = os.getenv(\"MASTER_ADDR\", \"localhost\")\n    os.environ[\"MASTER_PORT\"] = os.getenv(\"MASTER_PORT\", \"12355\")\n    is_distributed = world_size > 1\n    if is_distributed:\n        print(\n            f\"Initializing distributed training with rank {rank}, world size {world_size}\"\n        )\n        torch.distributed.init_process_group(\n            backend=\"nccl\", init_method=\"env://\", world_size=world_size, rank=rank\n        )\n        torch.cuda.set_device(rank)\n    return is_distributed\n\n\ndef main(rank: int, world_size: int):\n    is_distributed = setup_distributed(rank, world_size)\n\n    args = get_args()\n    if rank == 0:\n        
print(args)\n    run_name = f\"{args.env_name}__{args.exp_name}__{args.seed}\"\n\n    amp_enabled = args.amp and args.cuda and torch.cuda.is_available()\n    amp_device_type = (\n        f\"cuda:{rank}\"\n        if args.cuda and torch.cuda.is_available()\n        else \"mps\" if args.cuda and torch.backends.mps.is_available() else \"cpu\"\n    )\n    amp_dtype = torch.bfloat16 if args.amp_dtype == \"bf16\" else torch.float16\n\n    scaler = GradScaler(enabled=amp_enabled and amp_dtype == torch.float16)\n\n    if args.use_wandb and rank == 0:\n        wandb.init(\n            project=args.project,\n            name=run_name,\n            config=vars(args),\n            save_code=True,\n        )\n\n    # Use different seeds per rank to avoid synchronization issues\n    random.seed(args.seed + rank)\n    np.random.seed(args.seed + rank)\n    torch.manual_seed(args.seed + rank)\n    torch.backends.cudnn.deterministic = args.torch_deterministic\n\n    if not args.cuda:\n        device = torch.device(\"cpu\")\n    else:\n        if torch.cuda.is_available():\n            device = torch.device(f\"cuda:{rank}\")\n        elif torch.backends.mps.is_available():\n            device = torch.device(f\"mps:{rank}\")\n        else:\n            raise ValueError(\"No GPU available\")\n    print(f\"Using device: {device}\")\n\n    if args.env_name.startswith(\"h1hand-\") or args.env_name.startswith(\"h1-\"):\n        from environments.humanoid_bench_env import HumanoidBenchEnv\n\n        env_type = \"humanoid_bench\"\n        envs = HumanoidBenchEnv(args.env_name, args.num_envs, device=device)\n        eval_envs = envs\n        render_env = HumanoidBenchEnv(\n            args.env_name, 1, render_mode=\"rgb_array\", device=device\n        )\n    elif args.env_name.startswith(\"Isaac-\"):\n        from environments.isaaclab_env import IsaacLabEnv\n\n        env_type = \"isaaclab\"\n        envs = IsaacLabEnv(\n            args.env_name,\n            f\"cuda:{rank}\",\n            
args.num_envs,\n            args.seed + rank,\n            action_bounds=args.action_bounds,\n        )\n        eval_envs = envs\n        render_env = envs\n    elif args.env_name.startswith(\"MTBench-\"):\n        from environments.mtbench_env import MTBenchEnv\n\n        env_name = \"-\".join(args.env_name.split(\"-\")[1:])\n        env_type = \"mtbench\"\n        envs = MTBenchEnv(env_name, rank, args.num_envs, args.seed + rank)\n        eval_envs = envs\n        render_env = envs\n    else:\n        from environments.mujoco_playground_env import make_env\n\n        # TODO: Check if re-using same envs for eval could reduce memory usage\n        env_type = \"mujoco_playground\"\n        envs, eval_envs, render_env = make_env(\n            args.env_name,\n            args.seed + rank,\n            args.num_envs,\n            args.num_eval_envs,\n            rank,\n            use_tuned_reward=args.use_tuned_reward,\n            use_domain_randomization=args.use_domain_randomization,\n            use_push_randomization=args.use_push_randomization,\n        )\n\n    n_act = envs.num_actions\n    n_obs = envs.num_obs if type(envs.num_obs) == int else envs.num_obs[0]\n    if envs.asymmetric_obs:\n        n_critic_obs = (\n            envs.num_privileged_obs\n            if type(envs.num_privileged_obs) == int\n            else envs.num_privileged_obs[0]\n        )\n    else:\n        n_critic_obs = n_obs\n    action_low, action_high = -1.0, 1.0\n\n    if args.obs_normalization:\n        obs_normalizer = EmpiricalNormalization(shape=n_obs, device=device)\n        critic_obs_normalizer = EmpiricalNormalization(\n            shape=n_critic_obs, device=device\n        )\n    else:\n        obs_normalizer = nn.Identity()\n        critic_obs_normalizer = nn.Identity()\n\n    if args.reward_normalization:\n        if env_type in [\"mtbench\"]:\n            reward_normalizer = PerTaskRewardNormalizer(\n                num_tasks=envs.num_tasks,\n                
gamma=args.gamma,\n                device=device,\n                g_max=min(abs(args.v_min), abs(args.v_max)),\n            )\n        else:\n            reward_normalizer = RewardNormalizer(\n                gamma=args.gamma,\n                device=device,\n                g_max=min(abs(args.v_min), abs(args.v_max)),\n            )\n    else:\n        reward_normalizer = nn.Identity()\n\n    actor_kwargs = {\n        \"n_obs\": n_obs,\n        \"n_act\": n_act,\n        \"num_envs\": args.num_envs,\n        \"device\": device,\n        \"init_scale\": args.init_scale,\n        \"hidden_dim\": args.actor_hidden_dim,\n        \"std_min\": args.std_min,\n        \"std_max\": args.std_max,\n    }\n    critic_kwargs = {\n        \"n_obs\": n_critic_obs,\n        \"n_act\": n_act,\n        \"num_atoms\": args.num_atoms,\n        \"v_min\": args.v_min,\n        \"v_max\": args.v_max,\n        \"hidden_dim\": args.critic_hidden_dim,\n        \"device\": device,\n    }\n\n    if env_type == \"mtbench\":\n        actor_kwargs[\"n_obs\"] = n_obs - envs.num_tasks + args.task_embedding_dim\n        critic_kwargs[\"n_obs\"] = n_critic_obs - envs.num_tasks + args.task_embedding_dim\n        actor_kwargs[\"num_tasks\"] = envs.num_tasks\n        actor_kwargs[\"task_embedding_dim\"] = args.task_embedding_dim\n        critic_kwargs[\"num_tasks\"] = envs.num_tasks\n        critic_kwargs[\"task_embedding_dim\"] = args.task_embedding_dim\n\n    if args.agent == \"fasttd3\":\n        if env_type in [\"mtbench\"]:\n            from fast_td3 import MultiTaskActor, MultiTaskCritic\n\n            actor_cls = MultiTaskActor\n            critic_cls = MultiTaskCritic\n        else:\n            from fast_td3 import Actor, Critic\n\n            actor_cls = Actor\n            critic_cls = Critic\n\n        if rank == 0:\n            print(\"Using FastTD3\")\n    elif args.agent == \"fasttd3_simbav2\":\n        if env_type in [\"mtbench\"]:\n            from fast_td3_simbav2 import 
MultiTaskActor, MultiTaskCritic\n\n            actor_cls = MultiTaskActor\n            critic_cls = MultiTaskCritic\n        else:\n            from fast_td3_simbav2 import Actor, Critic\n\n            actor_cls = Actor\n            critic_cls = Critic\n\n        if rank == 0:\n            print(\"Using FastTD3 + SimbaV2\")\n        actor_kwargs.pop(\"init_scale\")\n        actor_kwargs.update(\n            {\n                \"scaler_init\": math.sqrt(2.0 / args.actor_hidden_dim),\n                \"scaler_scale\": math.sqrt(2.0 / args.actor_hidden_dim),\n                \"alpha_init\": 1.0 / (args.actor_num_blocks + 1),\n                \"alpha_scale\": 1.0 / math.sqrt(args.actor_hidden_dim),\n                \"expansion\": 4,\n                \"c_shift\": 3.0,\n                \"num_blocks\": args.actor_num_blocks,\n            }\n        )\n        critic_kwargs.update(\n            {\n                \"scaler_init\": math.sqrt(2.0 / args.critic_hidden_dim),\n                \"scaler_scale\": math.sqrt(2.0 / args.critic_hidden_dim),\n                \"alpha_init\": 1.0 / (args.critic_num_blocks + 1),\n                \"alpha_scale\": 1.0 / math.sqrt(args.critic_hidden_dim),\n                \"num_blocks\": args.critic_num_blocks,\n                \"expansion\": 4,\n                \"c_shift\": 3.0,\n            }\n        )\n    else:\n        raise ValueError(f\"Agent {args.agent} not supported\")\n\n    actor = actor_cls(**actor_kwargs)\n    if is_distributed:\n        actor = DDP(actor, device_ids=[rank])\n    if env_type in [\"mtbench\"]:\n        # Python 3.8 doesn't support 'from_module' in tensordict\n        policy = actor.module.explore if hasattr(actor, \"module\") else actor.explore\n    else:\n        from tensordict import from_module\n\n        actor_detach = actor_cls(**actor_kwargs)\n        # Copy params to actor_detach without grad\n        from_module(actor.module if hasattr(actor, \"module\") else actor).data.to_module(\n            
actor_detach\n        )\n        policy = actor_detach.explore\n\n    qnet = critic_cls(**critic_kwargs)\n    if is_distributed:\n        qnet = DDP(qnet, device_ids=[rank])\n    qnet_target = critic_cls(**critic_kwargs)  # Create a separate instance\n    qnet_target.load_state_dict(get_ddp_state_dict(qnet))\n\n    q_optimizer = optim.AdamW(\n        list(qnet.parameters()),\n        lr=torch.tensor(args.critic_learning_rate, device=device),\n        weight_decay=args.weight_decay,\n    )\n    actor_optimizer = optim.AdamW(\n        list(actor.parameters()),\n        lr=torch.tensor(args.actor_learning_rate, device=device),\n        weight_decay=args.weight_decay,\n    )\n\n    # Add learning rate schedulers\n    q_scheduler = optim.lr_scheduler.CosineAnnealingLR(\n        q_optimizer,\n        T_max=args.total_timesteps,\n        eta_min=torch.tensor(args.critic_learning_rate_end, device=device),\n    )\n    actor_scheduler = optim.lr_scheduler.CosineAnnealingLR(\n        actor_optimizer,\n        T_max=args.total_timesteps,\n        eta_min=torch.tensor(args.actor_learning_rate_end, device=device),\n    )\n\n    rb = SimpleReplayBuffer(\n        n_env=args.num_envs,\n        buffer_size=args.buffer_size,\n        n_obs=n_obs,\n        n_act=n_act,\n        n_critic_obs=n_critic_obs,\n        asymmetric_obs=envs.asymmetric_obs,\n        playground_mode=env_type == \"mujoco_playground\",\n        n_steps=args.num_steps,\n        gamma=args.gamma,\n        device=device,\n    )\n\n    policy_noise = args.policy_noise\n    noise_clip = args.noise_clip\n\n    def evaluate():\n        num_eval_envs = eval_envs.num_envs\n        episode_returns = torch.zeros(num_eval_envs, device=device)\n        episode_lengths = torch.zeros(num_eval_envs, device=device)\n        done_masks = torch.zeros(num_eval_envs, dtype=torch.bool, device=device)\n\n        if env_type == \"isaaclab\":\n            obs = eval_envs.reset(random_start_init=False)\n        else:\n            obs = 
eval_envs.reset()\n\n        # Run for a fixed number of steps\n        for i in range(eval_envs.max_episode_steps):\n            with torch.no_grad(), autocast(\n                device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n            ):\n                obs = normalize_obs(obs, update=False)\n                actions = actor(obs)\n\n            next_obs, rewards, dones, infos = eval_envs.step(actions.float())\n\n            if env_type == \"mtbench\":\n                # We only report success rate in MTBench evaluation\n                rewards = (\n                    infos[\"episode\"][\"success\"].float() if \"episode\" in infos else 0.0\n                )\n            episode_returns = torch.where(\n                ~done_masks, episode_returns + rewards, episode_returns\n            )\n            episode_lengths = torch.where(\n                ~done_masks, episode_lengths + 1, episode_lengths\n            )\n            if env_type == \"mtbench\" and \"episode\" in infos:\n                dones = dones | infos[\"episode\"][\"success\"]\n            done_masks = torch.logical_or(done_masks, dones)\n            if done_masks.all():\n                break\n            obs = next_obs\n\n        return episode_returns.mean(), episode_lengths.mean()\n\n    def render_with_rollout():\n        # Quick rollout for rendering\n        if env_type == \"humanoid_bench\":\n            obs = render_env.reset()\n            renders = [render_env.render()]\n        elif env_type in [\"isaaclab\", \"mtbench\"]:\n            raise NotImplementedError(\n                \"We don't support rendering for IsaacLab and MTBench environments\"\n            )\n        else:\n            obs = render_env.reset()\n            render_env.state.info[\"command\"] = jnp.array([[1.0, 0.0, 0.0]])\n            renders = [render_env.state]\n        for i in range(render_env.max_episode_steps):\n            with torch.no_grad(), autocast(\n                
device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n            ):\n                obs = normalize_obs(obs, update=False)\n                actions = actor(obs)\n            next_obs, _, done, _ = render_env.step(actions.float())\n            if env_type == \"mujoco_playground\":\n                render_env.state.info[\"command\"] = jnp.array([[1.0, 0.0, 0.0]])\n            if i % 2 == 0:\n                if env_type == \"humanoid_bench\":\n                    renders.append(render_env.render())\n                else:\n                    renders.append(render_env.state)\n            if done.any():\n                break\n            obs = next_obs\n\n        if env_type == \"mujoco_playground\":\n            renders = render_env.render_trajectory(renders)\n        return renders\n\n    def update_main(data, logs_dict):\n        with autocast(\n            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n        ):\n            observations = data[\"observations\"]\n            next_observations = data[\"next\"][\"observations\"]\n            if envs.asymmetric_obs:\n                critic_observations = data[\"critic_observations\"]\n                next_critic_observations = data[\"next\"][\"critic_observations\"]\n            else:\n                critic_observations = observations\n                next_critic_observations = next_observations\n            actions = data[\"actions\"]\n            rewards = data[\"next\"][\"rewards\"]\n            dones = data[\"next\"][\"dones\"].bool()\n            truncations = data[\"next\"][\"truncations\"].bool()\n            if args.disable_bootstrap:\n                bootstrap = (~dones).float()\n            else:\n                bootstrap = (truncations | ~dones).float()\n\n            clipped_noise = torch.randn_like(actions)\n            clipped_noise = clipped_noise.mul(policy_noise).clamp(\n                -noise_clip, noise_clip\n            )\n\n            next_state_actions = 
(actor(next_observations) + clipped_noise).clamp(\n                action_low, action_high\n            )\n            discount = args.gamma ** data[\"next\"][\"effective_n_steps\"]\n\n            with torch.no_grad():\n                qf1_next_target_projected, qf2_next_target_projected = (\n                    qnet_target.projection(\n                        next_critic_observations,\n                        next_state_actions,\n                        rewards,\n                        bootstrap,\n                        discount,\n                    )\n                )\n                qf1_next_target_value = qnet_target.get_value(qf1_next_target_projected)\n                qf2_next_target_value = qnet_target.get_value(qf2_next_target_projected)\n                if args.use_cdq:\n                    qf_next_target_dist = torch.where(\n                        qf1_next_target_value.unsqueeze(1)\n                        < qf2_next_target_value.unsqueeze(1),\n                        qf1_next_target_projected,\n                        qf2_next_target_projected,\n                    )\n                    qf1_next_target_dist = qf2_next_target_dist = qf_next_target_dist\n                else:\n                    qf1_next_target_dist, qf2_next_target_dist = (\n                        qf1_next_target_projected,\n                        qf2_next_target_projected,\n                    )\n\n            qf1, qf2 = qnet(critic_observations, actions)\n            qf1_loss = -torch.sum(\n                qf1_next_target_dist * F.log_softmax(qf1, dim=1), dim=1\n            ).mean()\n            qf2_loss = -torch.sum(\n                qf2_next_target_dist * F.log_softmax(qf2, dim=1), dim=1\n            ).mean()\n            qf_loss = qf1_loss + qf2_loss\n\n        q_optimizer.zero_grad(set_to_none=True)\n        scaler.scale(qf_loss).backward()\n        scaler.unscale_(q_optimizer)\n\n        if args.use_grad_norm_clipping:\n            critic_grad_norm = 
torch.nn.utils.clip_grad_norm_(\n                qnet.parameters(),\n                max_norm=args.max_grad_norm if args.max_grad_norm > 0 else float(\"inf\"),\n            )\n        else:\n            critic_grad_norm = torch.tensor(0.0, device=device)\n        scaler.step(q_optimizer)\n        scaler.update()\n\n        logs_dict[\"critic_grad_norm\"] = critic_grad_norm.detach()\n        logs_dict[\"qf_loss\"] = qf_loss.detach()\n        logs_dict[\"qf_max\"] = qf1_next_target_value.max().detach()\n        logs_dict[\"qf_min\"] = qf1_next_target_value.min().detach()\n        return logs_dict\n\n    def update_pol(data, logs_dict):\n        with autocast(\n            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n        ):\n            critic_observations = (\n                data[\"critic_observations\"]\n                if envs.asymmetric_obs\n                else data[\"observations\"]\n            )\n\n            qf1, qf2 = qnet(critic_observations, actor(data[\"observations\"]))\n            qf1_value = (\n                qnet.module.get_value(F.softmax(qf1, dim=1))\n                if hasattr(qnet, \"module\")\n                else qnet.get_value(F.softmax(qf1, dim=1))\n            )\n            qf2_value = (\n                qnet.module.get_value(F.softmax(qf2, dim=1))\n                if hasattr(qnet, \"module\")\n                else qnet.get_value(F.softmax(qf2, dim=1))\n            )\n            if args.use_cdq:\n                qf_value = torch.minimum(qf1_value, qf2_value)\n            else:\n                qf_value = (qf1_value + qf2_value) / 2.0\n            actor_loss = -qf_value.mean()\n\n        actor_optimizer.zero_grad(set_to_none=True)\n        scaler.scale(actor_loss).backward()\n        scaler.unscale_(actor_optimizer)\n        if args.use_grad_norm_clipping:\n            actor_grad_norm = torch.nn.utils.clip_grad_norm_(\n                actor.parameters(),\n                max_norm=args.max_grad_norm if 
args.max_grad_norm > 0 else float(\"inf\"),\n            )\n        else:\n            actor_grad_norm = torch.tensor(0.0, device=device)\n        scaler.step(actor_optimizer)\n        scaler.update()\n        logs_dict[\"actor_grad_norm\"] = actor_grad_norm.detach()\n        logs_dict[\"actor_loss\"] = actor_loss.detach()\n        return logs_dict\n\n    @torch.no_grad()\n    def soft_update(src, tgt, tau: float):\n        # Handle DDP module by accessing .module attribute\n        src_module = src.module if hasattr(src, \"module\") else src\n        tgt_module = tgt.module if hasattr(tgt, \"module\") else tgt\n\n        src_ps = [p.data for p in src_module.parameters()]\n        tgt_ps = [p.data for p in tgt_module.parameters()]\n\n        torch._foreach_mul_(tgt_ps, 1.0 - tau)\n        torch._foreach_add_(tgt_ps, src_ps, alpha=tau)\n\n    if args.compile:\n        compile_mode = args.compile_mode\n        update_main = torch.compile(update_main, mode=compile_mode)\n        update_pol = torch.compile(update_pol, mode=compile_mode)\n        policy = torch.compile(policy, mode=None)\n        normalize_obs = torch.compile(obs_normalizer.forward, mode=None)\n        normalize_critic_obs = torch.compile(critic_obs_normalizer.forward, mode=None)\n        if args.reward_normalization:\n            update_stats = torch.compile(reward_normalizer.update_stats, mode=None)\n        normalize_reward = torch.compile(reward_normalizer.forward, mode=None)\n    else:\n        normalize_obs = obs_normalizer.forward\n        normalize_critic_obs = critic_obs_normalizer.forward\n        if args.reward_normalization:\n            update_stats = reward_normalizer.update_stats\n        normalize_reward = reward_normalizer.forward\n\n    if envs.asymmetric_obs:\n        obs, critic_obs = envs.reset_with_critic_obs()\n        critic_obs = torch.as_tensor(critic_obs, device=device, dtype=torch.float)\n    else:\n        obs = envs.reset()\n    if args.checkpoint_path:\n        # Load 
checkpoint if specified\n        torch_checkpoint = torch.load(\n            f\"{args.checkpoint_path}\", map_location=device, weights_only=False\n        )\n        load_ddp_state_dict(actor, torch_checkpoint[\"actor_state_dict\"])\n        if torch_checkpoint[\"obs_normalizer_state\"] is not None:\n            obs_normalizer.load_state_dict(torch_checkpoint[\"obs_normalizer_state\"])\n        if torch_checkpoint[\"critic_obs_normalizer_state\"] is not None:\n            critic_obs_normalizer.load_state_dict(\n                torch_checkpoint[\"critic_obs_normalizer_state\"]\n            )\n        load_ddp_state_dict(qnet, torch_checkpoint[\"qnet_state_dict\"])\n        qnet_target.load_state_dict(torch_checkpoint[\"qnet_target_state_dict\"])\n        global_step = torch_checkpoint[\"global_step\"]\n    else:\n        global_step = 0\n\n    dones = None\n    pbar = tqdm.tqdm(total=args.total_timesteps, initial=global_step)\n    start_time = None\n    desc = \"\"\n\n    while global_step < args.total_timesteps:\n        mark_step()\n        logs_dict = TensorDict()\n        if (\n            start_time is None\n            and global_step >= args.measure_burnin + args.learning_starts\n        ):\n            start_time = time.time()\n            measure_burnin = global_step\n\n        with torch.no_grad(), autocast(\n            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\n        ):\n            norm_obs = normalize_obs(obs)\n            actions = policy(obs=norm_obs, dones=dones)\n\n        next_obs, rewards, dones, infos = envs.step(actions.float())\n        truncations = infos[\"time_outs\"]\n\n        if args.reward_normalization:\n            if env_type == \"mtbench\":\n                task_ids_one_hot = obs[..., -envs.num_tasks :]\n                task_indices = torch.argmax(task_ids_one_hot, dim=1)\n                update_stats(rewards, dones.float(), task_ids=task_indices)\n            else:\n                update_stats(rewards, 
dones.float())\n\n        if envs.asymmetric_obs:\n            next_critic_obs = infos[\"observations\"][\"critic\"]\n        # Compute 'true' next_obs and next_critic_obs for saving\n        true_next_obs = torch.where(\n            dones[:, None] > 0, infos[\"observations\"][\"raw\"][\"obs\"], next_obs\n        )\n        if envs.asymmetric_obs:\n            true_next_critic_obs = torch.where(\n                dones[:, None] > 0,\n                infos[\"observations\"][\"raw\"][\"critic_obs\"],\n                next_critic_obs,\n            )\n\n        transition = TensorDict(\n            {\n                \"observations\": obs,\n                \"actions\": torch.as_tensor(actions, device=device, dtype=torch.float),\n                \"next\": {\n                    \"observations\": true_next_obs,\n                    \"rewards\": torch.as_tensor(\n                        rewards, device=device, dtype=torch.float\n                    ),\n                    \"truncations\": truncations.long(),\n                    \"dones\": dones.long(),\n                },\n            },\n            batch_size=(envs.num_envs,),\n            device=device,\n        )\n        if envs.asymmetric_obs:\n            transition[\"critic_observations\"] = critic_obs\n            transition[\"next\"][\"critic_observations\"] = true_next_critic_obs\n        rb.extend(transition)\n\n        obs = next_obs\n        if envs.asymmetric_obs:\n            critic_obs = next_critic_obs\n\n        if global_step > args.learning_starts:\n            for i in range(args.num_updates):\n                data = rb.sample(max(1, args.batch_size // args.num_envs))\n                data[\"observations\"] = normalize_obs(data[\"observations\"])\n                data[\"next\"][\"observations\"] = normalize_obs(\n                    data[\"next\"][\"observations\"]\n                )\n                if envs.asymmetric_obs:\n                    data[\"critic_observations\"] = normalize_critic_obs(\n  
                      data[\"critic_observations\"]\n                    )\n                    data[\"next\"][\"critic_observations\"] = normalize_critic_obs(\n                        data[\"next\"][\"critic_observations\"]\n                    )\n                raw_rewards = data[\"next\"][\"rewards\"]\n                if env_type in [\"mtbench\"] and args.reward_normalization:\n                    # Multi-task reward normalization\n                    task_ids_one_hot = data[\"observations\"][..., -envs.num_tasks :]\n                    task_indices = torch.argmax(task_ids_one_hot, dim=1)\n                    data[\"next\"][\"rewards\"] = normalize_reward(\n                        raw_rewards, task_ids=task_indices\n                    )\n                else:\n                    data[\"next\"][\"rewards\"] = normalize_reward(raw_rewards)\n\n                logs_dict = update_main(data, logs_dict)\n                if args.num_updates > 1:\n                    if i % args.policy_frequency == 1:\n                        logs_dict = update_pol(data, logs_dict)\n                else:\n                    if global_step % args.policy_frequency == 0:\n                        logs_dict = update_pol(data, logs_dict)\n\n                soft_update(qnet, qnet_target, args.tau)\n\n            if global_step % 100 == 0 and start_time is not None:\n                speed = (global_step - measure_burnin) / (time.time() - start_time)\n                if rank == 0:\n                    pbar.set_description(f\"{speed: 4.4f} sps, \" + desc)\n                with torch.no_grad():\n                    logs = {\n                        \"actor_loss\": logs_dict[\"actor_loss\"].mean(),\n                        \"qf_loss\": logs_dict[\"qf_loss\"].mean(),\n                        \"qf_max\": logs_dict[\"qf_max\"].mean(),\n                        \"qf_min\": logs_dict[\"qf_min\"].mean(),\n                        \"actor_grad_norm\": logs_dict[\"actor_grad_norm\"].mean(),\n              
          \"critic_grad_norm\": logs_dict[\"critic_grad_norm\"].mean(),\n                        \"env_rewards\": rewards.mean(),\n                        \"buffer_rewards\": raw_rewards.mean(),\n                    }\n\n                    if args.eval_interval > 0 and global_step % args.eval_interval == 0:\n                        local_eval_avg_return, local_eval_avg_length = evaluate()\n                        eval_results = torch.tensor(\n                            [local_eval_avg_return, local_eval_avg_length],\n                            device=device,\n                        )\n                        if is_distributed:\n                            torch.distributed.all_reduce(\n                                eval_results, op=torch.distributed.ReduceOp.AVG\n                            )\n\n                        if rank == 0:\n                            global_avg_return = eval_results[0].item()\n                            global_avg_length = eval_results[1].item()\n                            print(\n                                f\"Evaluating at global step {global_step}: Avg Return={global_avg_return:.2f}\"\n                            )\n                            logs[\"eval_avg_return\"] = global_avg_return\n                            logs[\"eval_avg_length\"] = global_avg_length\n\n                        if env_type in [\"humanoid_bench\", \"isaaclab\", \"mtbench\"]:\n                            # NOTE: Hacky way of evaluating performance, but just works\n                            obs = envs.reset()\n\n                    if (\n                        args.render_interval > 0\n                        and global_step % args.render_interval == 0\n                    ):\n                        renders = render_with_rollout()\n                        render_video = wandb.Video(\n                            np.array(renders).transpose(\n                                0, 3, 1, 2\n                            ),  # Convert to (T, C, H, W) 
format\n                            fps=30,\n                            format=\"gif\",\n                        )\n                        logs[\"render_video\"] = render_video\n\n                if args.use_wandb and rank == 0:\n                    wandb.log(\n                        {\n                            \"speed\": speed,\n                            \"frame\": global_step * args.num_envs,\n                            \"critic_lr\": q_scheduler.get_last_lr()[0],\n                            \"actor_lr\": actor_scheduler.get_last_lr()[0],\n                            **logs,\n                        },\n                        step=global_step,\n                    )\n\n            if (\n                args.save_interval > 0\n                and global_step > 0\n                and global_step % args.save_interval == 0\n                and rank == 0\n            ):\n                print(f\"Saving model at global step {global_step}\")\n                save_params(\n                    global_step,\n                    actor,\n                    qnet,\n                    qnet_target,\n                    obs_normalizer,\n                    critic_obs_normalizer,\n                    args,\n                    f\"models/{run_name}_{global_step}.pt\",\n                )\n\n        global_step += 1\n        actor_scheduler.step()\n        q_scheduler.step()\n        if rank == 0:\n            pbar.update(1)\n\n    save_params(\n        global_step,\n        actor,\n        qnet,\n        qnet_target,\n        obs_normalizer,\n        critic_obs_normalizer,\n        args,\n        f\"models/{run_name}_final.pt\",\n    )\n\n    # Cleanup distributed training\n    if is_distributed:\n        torch.distributed.destroy_process_group()\n\n\nif __name__ == \"__main__\":\n    world_size = torch.cuda.device_count()\n    mp.spawn(main, args=(world_size,), nprocs=world_size)\n"
  },
  {
    "path": "fast_td3/training_notebook.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# FastTD3 Training Notebook\\n\",\n    \"\\n\",\n    \"Welcome! This notebook will let you execute a series of code blocks that enables you to experience how FastTD3 works -- each block will import packages, define arguments, create environments, create FastTD3 agent, and train the agent.\\n\",\n    \"\\n\",\n    \"This notebook also provide the same functionalities as `train.py` -- you can use this notebook to train your own agents, upload logs to wandb, render rollouts, and fine-tune pre-trained agents with more environment steps!\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Set environment variables and import packages\\n\",\n    \"\\n\",\n    \"import os\\n\",\n    \"\\n\",\n    \"os.environ[\\\"TORCHDYNAMO_INLINE_INBUILT_NN_MODULES\\\"] = \\\"1\\\"\\n\",\n    \"os.environ[\\\"OMP_NUM_THREADS\\\"] = \\\"1\\\"\\n\",\n    \"if sys.platform != \\\"darwin\\\":\\n\",\n    \"    os.environ[\\\"MUJOCO_GL\\\"] = \\\"egl\\\"\\n\",\n    \"else:\\n\",\n    \"    os.environ[\\\"MUJOCO_GL\\\"] = \\\"glfw\\\"\\n\",\n    \"os.environ[\\\"XLA_PYTHON_CLIENT_PREALLOCATE\\\"] = \\\"false\\\"\\n\",\n    \"os.environ[\\\"JAX_DEFAULT_MATMUL_PRECISION\\\"] = \\\"highest\\\"\\n\",\n    \"\\n\",\n    \"import random\\n\",\n    \"import time\\n\",\n    \"\\n\",\n    \"import tqdm\\n\",\n    \"import wandb\\n\",\n    \"import numpy as np\\n\",\n    \"\\n\",\n    \"import torch\\n\",\n    \"import torch.nn as nn\\n\",\n    \"import torch.nn.functional as F\\n\",\n    \"import torch.optim as optim\\n\",\n    \"from torch.amp import autocast, GradScaler\\n\",\n    \"from tensordict import TensorDict, from_module\\n\",\n    \"\\n\",\n    \"torch.set_float32_matmul_precision(\\\"high\\\")\\n\",\n    \"\\n\",\n    \"from fast_td3_utils import (\\n\",\n    \"    
EmpiricalNormalization,\\n\",\n    \"    SimpleReplayBuffer,\\n\",\n    \"    save_params,\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"from fast_td3 import Critic, Actor\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Set checkpoint if you want to fine-tune from existing checkpoint\\n\",\n    \"# e.g., set checkpoint to \\\"models/h1-walk-v0_notebook_experiment_30000.pt\\\"\\n\",\n    \"checkpoint_path = None\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Customize arguments as needed\\n\",\n    \"# However, IsaacLab may not work in Notebook Setup.\\n\",\n    \"# We recommend using HumanoidBench or MuJoCo Playground for notebook experiments.\\n\",\n    \"\\n\",\n    \"# For quick experiments, let's use a task without dexterous hands\\n\",\n    \"# But for your research, we recommend using `h1hand` tasks in HumanoidBench!\\n\",\n    \"from hyperparams import HumanoidBenchArgs\\n\",\n    \"\\n\",\n    \"args = HumanoidBenchArgs(\\n\",\n    \"    env_name=\\\"h1-walk-v0\\\",\\n\",\n    \"    total_timesteps=20000,\\n\",\n    \"    render_interval=5000,\\n\",\n    \"    eval_interval=5000,\\n\",\n    \")\\n\",\n    \"run_name = f\\\"{args.env_name}_notebook_experiment\\\"\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: GPU-Related Configurations\\n\",\n    \"\\n\",\n    \"amp_enabled = args.amp and args.cuda and torch.cuda.is_available()\\n\",\n    \"amp_device_type = (\\n\",\n    \"    \\\"cuda\\\"\\n\",\n    \"    if args.cuda and torch.cuda.is_available()\\n\",\n    \"    else \\\"mps\\\" if args.cuda and torch.backends.mps.is_available() else \\\"cpu\\\"\\n\",\n    \")\\n\",\n    \"amp_dtype = torch.bfloat16 if args.amp_dtype == \\\"bf16\\\" else 
torch.float16\\n\",\n    \"\\n\",\n    \"scaler = GradScaler(enabled=amp_enabled and amp_dtype == torch.float16)\\n\",\n    \"\\n\",\n    \"if not args.cuda:\\n\",\n    \"    device = torch.device(\\\"cpu\\\")\\n\",\n    \"else:\\n\",\n    \"    if torch.cuda.is_available():\\n\",\n    \"        device = torch.device(f\\\"cuda:{args.device_rank}\\\")\\n\",\n    \"    elif torch.backends.mps.is_available():\\n\",\n    \"        device = torch.device(f\\\"mps:{args.device_rank}\\\")\\n\",\n    \"    else:\\n\",\n    \"        raise ValueError(\\\"No GPU available\\\")\\n\",\n    \"print(f\\\"Using device: {device}\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Define Wandb if needed\\n\",\n    \"\\n\",\n    \"# Set use_wandb to True if you want to use Wandb\\n\",\n    \"use_wandb = True\\n\",\n    \"\\n\",\n    \"if use_wandb:\\n\",\n    \"    wandb.init(\\n\",\n    \"        project=\\\"FastTD3\\\",\\n\",\n    \"        name=run_name,\\n\",\n    \"        config=vars(args),\\n\",\n    \"        save_code=True,\\n\",\n    \"    )\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Initialize Environment and Related Variables\\n\",\n    \"\\n\",\n    \"if args.env_name.startswith(\\\"h1hand-\\\") or args.env_name.startswith(\\\"h1-\\\"):\\n\",\n    \"    from environments.humanoid_bench_env import HumanoidBenchEnv\\n\",\n    \"\\n\",\n    \"    env_type = \\\"humanoid_bench\\\"\\n\",\n    \"    envs = HumanoidBenchEnv(args.env_name, args.num_envs, device=device)\\n\",\n    \"    eval_envs = envs\\n\",\n    \"    render_env = HumanoidBenchEnv(\\n\",\n    \"        args.env_name, 1, render_mode=\\\"rgb_array\\\", device=device\\n\",\n    \"    )\\n\",\n    \"elif args.env_name.startswith(\\\"Isaac-\\\"):\\n\",\n    \"    from environments.isaaclab_env import 
IsaacLabEnv\\n\",\n    \"\\n\",\n    \"    env_type = \\\"isaaclab\\\"\\n\",\n    \"    envs = IsaacLabEnv(\\n\",\n    \"        args.env_name,\\n\",\n    \"        device.type,\\n\",\n    \"        args.num_envs,\\n\",\n    \"        args.seed,\\n\",\n    \"        action_bounds=args.action_bounds,\\n\",\n    \"    )\\n\",\n    \"    eval_envs = envs\\n\",\n    \"    render_envs = envs\\n\",\n    \"else:\\n\",\n    \"    from environments.mujoco_playground_env import make_env\\n\",\n    \"    import jax.numpy as jnp\\n\",\n    \"\\n\",\n    \"    env_type = \\\"mujoco_playground\\\"\\n\",\n    \"    envs, eval_envs, render_env = make_env(\\n\",\n    \"        args.env_name,\\n\",\n    \"        args.seed,\\n\",\n    \"        args.num_envs,\\n\",\n    \"        args.num_eval_envs,\\n\",\n    \"        args.device_rank,\\n\",\n    \"        use_tuned_reward=args.use_tuned_reward,\\n\",\n    \"        use_domain_randomization=args.use_domain_randomization,\\n\",\n    \"    )\\n\",\n    \"\\n\",\n    \"n_act = envs.num_actions\\n\",\n    \"n_obs = envs.num_obs if type(envs.num_obs) == int else envs.num_obs[0]\\n\",\n    \"if envs.asymmetric_obs:\\n\",\n    \"    n_critic_obs = (\\n\",\n    \"        envs.num_privileged_obs\\n\",\n    \"        if type(envs.num_privileged_obs) == int\\n\",\n    \"        else envs.num_privileged_obs[0]\\n\",\n    \"    )\\n\",\n    \"else:\\n\",\n    \"    n_critic_obs = n_obs\\n\",\n    \"action_low, action_high = -1.0, 1.0\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Initialize Normalizer, Actor, and Critic\\n\",\n    \"\\n\",\n    \"if args.obs_normalization:\\n\",\n    \"    obs_normalizer = EmpiricalNormalization(shape=n_obs, device=device)\\n\",\n    \"    critic_obs_normalizer = EmpiricalNormalization(shape=n_critic_obs, device=device)\\n\",\n    \"else:\\n\",\n    \"    obs_normalizer = nn.Identity()\\n\",\n    \"    
critic_obs_normalizer = nn.Identity()\\n\",\n    \"\\n\",\n    \"normalize_obs = obs_normalizer.forward\\n\",\n    \"normalize_critic_obs = critic_obs_normalizer.forward\\n\",\n    \"\\n\",\n    \"# Actor setup\\n\",\n    \"actor = Actor(\\n\",\n    \"    n_obs=n_obs,\\n\",\n    \"    n_act=n_act,\\n\",\n    \"    num_envs=args.num_envs,\\n\",\n    \"    device=device,\\n\",\n    \"    init_scale=args.init_scale,\\n\",\n    \"    hidden_dim=args.actor_hidden_dim,\\n\",\n    \")\\n\",\n    \"actor_detach = Actor(\\n\",\n    \"    n_obs=n_obs,\\n\",\n    \"    n_act=n_act,\\n\",\n    \"    num_envs=args.num_envs,\\n\",\n    \"    device=device,\\n\",\n    \"    init_scale=args.init_scale,\\n\",\n    \"    hidden_dim=args.actor_hidden_dim,\\n\",\n    \")\\n\",\n    \"# Copy params to actor_detach without grad\\n\",\n    \"from_module(actor).data.to_module(actor_detach)\\n\",\n    \"policy = actor_detach.explore\\n\",\n    \"\\n\",\n    \"qnet = Critic(\\n\",\n    \"    n_obs=n_critic_obs,\\n\",\n    \"    n_act=n_act,\\n\",\n    \"    num_atoms=args.num_atoms,\\n\",\n    \"    v_min=args.v_min,\\n\",\n    \"    v_max=args.v_max,\\n\",\n    \"    hidden_dim=args.critic_hidden_dim,\\n\",\n    \"    device=device,\\n\",\n    \")\\n\",\n    \"qnet_target = Critic(\\n\",\n    \"    n_obs=n_critic_obs,\\n\",\n    \"    n_act=n_act,\\n\",\n    \"    num_atoms=args.num_atoms,\\n\",\n    \"    v_min=args.v_min,\\n\",\n    \"    v_max=args.v_max,\\n\",\n    \"    hidden_dim=args.critic_hidden_dim,\\n\",\n    \"    device=device,\\n\",\n    \")\\n\",\n    \"qnet_target.load_state_dict(qnet.state_dict())\\n\",\n    \"\\n\",\n    \"q_optimizer = optim.AdamW(\\n\",\n    \"    list(qnet.parameters()),\\n\",\n    \"    lr=args.critic_learning_rate,\\n\",\n    \"    weight_decay=args.weight_decay,\\n\",\n    \")\\n\",\n    \"actor_optimizer = optim.AdamW(\\n\",\n    \"    list(actor.parameters()),\\n\",\n    \"    lr=args.actor_learning_rate,\\n\",\n    \"    
weight_decay=args.weight_decay,\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"rb = SimpleReplayBuffer(\\n\",\n    \"    n_env=args.num_envs,\\n\",\n    \"    buffer_size=args.buffer_size,\\n\",\n    \"    n_obs=n_obs,\\n\",\n    \"    n_act=n_act,\\n\",\n    \"    n_critic_obs=n_critic_obs,\\n\",\n    \"    asymmetric_obs=envs.asymmetric_obs,\\n\",\n    \"    playground_mode=env_type == \\\"mujoco_playground\\\",\\n\",\n    \"    n_steps=args.num_steps,\\n\",\n    \"    gamma=args.gamma,\\n\",\n    \"    device=device,\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Define Evaluation & Rendering Functions\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def evaluate():\\n\",\n    \"    obs_normalizer.eval()\\n\",\n    \"    num_eval_envs = eval_envs.num_envs\\n\",\n    \"    episode_returns = torch.zeros(num_eval_envs, device=device)\\n\",\n    \"    episode_lengths = torch.zeros(num_eval_envs, device=device)\\n\",\n    \"    done_masks = torch.zeros(num_eval_envs, dtype=torch.bool, device=device)\\n\",\n    \"\\n\",\n    \"    if env_type == \\\"isaaclab\\\":\\n\",\n    \"        obs = eval_envs.reset(random_start_init=False)\\n\",\n    \"    else:\\n\",\n    \"        obs = eval_envs.reset()\\n\",\n    \"\\n\",\n    \"    # Run for a fixed number of steps\\n\",\n    \"    for _ in range(eval_envs.max_episode_steps):\\n\",\n    \"        with torch.no_grad(), autocast(\\n\",\n    \"            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\\n\",\n    \"        ):\\n\",\n    \"            obs = normalize_obs(obs)\\n\",\n    \"            actions = actor(obs)\\n\",\n    \"\\n\",\n    \"        next_obs, rewards, dones, _ = eval_envs.step(actions.float())\\n\",\n    \"        episode_returns = torch.where(\\n\",\n    \"            ~done_masks, episode_returns + rewards, episode_returns\\n\",\n    \"        )\\n\",\n    \"        
episode_lengths = torch.where(~done_masks, episode_lengths + 1, episode_lengths)\\n\",\n    \"        done_masks = torch.logical_or(done_masks, dones)\\n\",\n    \"        if done_masks.all():\\n\",\n    \"            break\\n\",\n    \"        obs = next_obs\\n\",\n    \"\\n\",\n    \"    obs_normalizer.train()\\n\",\n    \"    return episode_returns.mean().item(), episode_lengths.mean().item()\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def render_with_rollout():\\n\",\n    \"    obs_normalizer.eval()\\n\",\n    \"\\n\",\n    \"    # Quick rollout for rendering\\n\",\n    \"    if env_type == \\\"humanoid_bench\\\":\\n\",\n    \"        obs = render_env.reset()\\n\",\n    \"        renders = [render_env.render()]\\n\",\n    \"    elif env_type == \\\"isaaclab\\\":\\n\",\n    \"        raise NotImplementedError(\\n\",\n    \"            \\\"We don't support rendering for IsaacLab environments\\\"\\n\",\n    \"        )\\n\",\n    \"    else:\\n\",\n    \"        obs = render_env.reset()\\n\",\n    \"        render_env.state.info[\\\"command\\\"] = jnp.array([[1.0, 0.0, 0.0]])\\n\",\n    \"        renders = [render_env.state]\\n\",\n    \"    for i in range(render_env.max_episode_steps):\\n\",\n    \"        with torch.no_grad(), autocast(\\n\",\n    \"            device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\\n\",\n    \"        ):\\n\",\n    \"            obs = normalize_obs(obs)\\n\",\n    \"            actions = actor(obs)\\n\",\n    \"        next_obs, _, done, _ = render_env.step(actions.float())\\n\",\n    \"        if env_type == \\\"mujoco_playground\\\":\\n\",\n    \"            render_env.state.info[\\\"command\\\"] = jnp.array([[1.0, 0.0, 0.0]])\\n\",\n    \"        if i % 2 == 0:\\n\",\n    \"            if env_type == \\\"humanoid_bench\\\":\\n\",\n    \"                renders.append(render_env.render())\\n\",\n    \"            else:\\n\",\n    \"                renders.append(render_env.state)\\n\",\n    \"        if 
done.any():\\n\",\n    \"            break\\n\",\n    \"        obs = next_obs\\n\",\n    \"\\n\",\n    \"    if env_type == \\\"mujoco_playground\\\":\\n\",\n    \"        renders = render_env.render_trajectory(renders)\\n\",\n    \"\\n\",\n    \"    obs_normalizer.train()\\n\",\n    \"    return renders\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Define Update Functions\\n\",\n    \"\\n\",\n    \"policy_noise = args.policy_noise\\n\",\n    \"noise_clip = args.noise_clip\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def update_main(data, logs_dict):\\n\",\n    \"    with autocast(device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled):\\n\",\n    \"        observations = data[\\\"observations\\\"]\\n\",\n    \"        next_observations = data[\\\"next\\\"][\\\"observations\\\"]\\n\",\n    \"        if envs.asymmetric_obs:\\n\",\n    \"            critic_observations = data[\\\"critic_observations\\\"]\\n\",\n    \"            next_critic_observations = data[\\\"next\\\"][\\\"critic_observations\\\"]\\n\",\n    \"        else:\\n\",\n    \"            critic_observations = observations\\n\",\n    \"            next_critic_observations = next_observations\\n\",\n    \"        actions = data[\\\"actions\\\"]\\n\",\n    \"        rewards = data[\\\"next\\\"][\\\"rewards\\\"]\\n\",\n    \"        dones = data[\\\"next\\\"][\\\"dones\\\"].bool()\\n\",\n    \"        truncations = data[\\\"next\\\"][\\\"truncations\\\"].bool()\\n\",\n    \"        if args.disable_bootstrap:\\n\",\n    \"            bootstrap = (~dones).float()\\n\",\n    \"        else:\\n\",\n    \"            bootstrap = (truncations | ~dones).float()\\n\",\n    \"\\n\",\n    \"        clipped_noise = torch.randn_like(actions)\\n\",\n    \"        clipped_noise = clipped_noise.mul(policy_noise).clamp(-noise_clip, noise_clip)\\n\",\n    \"\\n\",\n    \"        next_state_actions = 
(actor(next_observations) + clipped_noise).clamp(\\n\",\n    \"            action_low, action_high\\n\",\n    \"        )\\n\",\n    \"\\n\",\n    \"        with torch.no_grad():\\n\",\n    \"            qf1_next_target_projected, qf2_next_target_projected = (\\n\",\n    \"                qnet_target.projection(\\n\",\n    \"                    next_critic_observations,\\n\",\n    \"                    next_state_actions,\\n\",\n    \"                    rewards,\\n\",\n    \"                    bootstrap,\\n\",\n    \"                    args.gamma,\\n\",\n    \"                )\\n\",\n    \"            )\\n\",\n    \"            qf1_next_target_value = qnet_target.get_value(qf1_next_target_projected)\\n\",\n    \"            qf2_next_target_value = qnet_target.get_value(qf2_next_target_projected)\\n\",\n    \"            if args.use_cdq:\\n\",\n    \"                qf_next_target_dist = torch.where(\\n\",\n    \"                    qf1_next_target_value.unsqueeze(1)\\n\",\n    \"                    < qf2_next_target_value.unsqueeze(1),\\n\",\n    \"                    qf1_next_target_projected,\\n\",\n    \"                    qf2_next_target_projected,\\n\",\n    \"                )\\n\",\n    \"                qf1_next_target_dist = qf2_next_target_dist = qf_next_target_dist\\n\",\n    \"            else:\\n\",\n    \"                qf1_next_target_dist, qf2_next_target_dist = (\\n\",\n    \"                    qf1_next_target_projected,\\n\",\n    \"                    qf2_next_target_projected,\\n\",\n    \"                )\\n\",\n    \"\\n\",\n    \"        qf1, qf2 = qnet(critic_observations, actions)\\n\",\n    \"        qf1_loss = -torch.sum(\\n\",\n    \"            qf1_next_target_dist * F.log_softmax(qf1, dim=1), dim=1\\n\",\n    \"        ).mean()\\n\",\n    \"        qf2_loss = -torch.sum(\\n\",\n    \"            qf2_next_target_dist * F.log_softmax(qf2, dim=1), dim=1\\n\",\n    \"        ).mean()\\n\",\n    \"        qf_loss = qf1_loss + 
qf2_loss\\n\",\n    \"\\n\",\n    \"    q_optimizer.zero_grad(set_to_none=True)\\n\",\n    \"    scaler.scale(qf_loss).backward()\\n\",\n    \"    scaler.unscale_(q_optimizer)\\n\",\n    \"\\n\",\n    \"    critic_grad_norm = torch.nn.utils.clip_grad_norm_(\\n\",\n    \"        qnet.parameters(),\\n\",\n    \"        max_norm=args.max_grad_norm if args.max_grad_norm > 0 else float(\\\"inf\\\"),\\n\",\n    \"    )\\n\",\n    \"    scaler.step(q_optimizer)\\n\",\n    \"    scaler.update()\\n\",\n    \"\\n\",\n    \"    logs_dict[\\\"buffer_rewards\\\"] = rewards.mean()\\n\",\n    \"    logs_dict[\\\"critic_grad_norm\\\"] = critic_grad_norm.detach()\\n\",\n    \"    logs_dict[\\\"qf_loss\\\"] = qf_loss.detach()\\n\",\n    \"    logs_dict[\\\"qf_max\\\"] = qf1_next_target_value.max().detach()\\n\",\n    \"    logs_dict[\\\"qf_min\\\"] = qf1_next_target_value.min().detach()\\n\",\n    \"    return logs_dict\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def update_pol(data, logs_dict):\\n\",\n    \"    with autocast(device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled):\\n\",\n    \"        critic_observations = (\\n\",\n    \"            data[\\\"critic_observations\\\"] if envs.asymmetric_obs else data[\\\"observations\\\"]\\n\",\n    \"        )\\n\",\n    \"\\n\",\n    \"        qf1, qf2 = qnet(critic_observations, actor(data[\\\"observations\\\"]))\\n\",\n    \"        qf1_value = qnet.get_value(F.softmax(qf1, dim=1))\\n\",\n    \"        qf2_value = qnet.get_value(F.softmax(qf2, dim=1))\\n\",\n    \"        if args.use_cdq:\\n\",\n    \"            qf_value = torch.minimum(qf1_value, qf2_value)\\n\",\n    \"        else:\\n\",\n    \"            qf_value = (qf1_value + qf2_value) / 2.0\\n\",\n    \"        actor_loss = -qf_value.mean()\\n\",\n    \"\\n\",\n    \"    actor_optimizer.zero_grad(set_to_none=True)\\n\",\n    \"    scaler.scale(actor_loss).backward()\\n\",\n    \"    scaler.unscale_(actor_optimizer)\\n\",\n    \"    actor_grad_norm = 
torch.nn.utils.clip_grad_norm_(\\n\",\n    \"        actor.parameters(),\\n\",\n    \"        max_norm=args.max_grad_norm if args.max_grad_norm > 0 else float(\\\"inf\\\"),\\n\",\n    \"    )\\n\",\n    \"    scaler.step(actor_optimizer)\\n\",\n    \"    scaler.update()\\n\",\n    \"    logs_dict[\\\"actor_grad_norm\\\"] = actor_grad_norm.detach()\\n\",\n    \"    logs_dict[\\\"actor_loss\\\"] = actor_loss.detach()\\n\",\n    \"    return logs_dict\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Compile Functions if Needed\\n\",\n    \"\\n\",\n    \"if args.compile:\\n\",\n    \"    mode = None\\n\",\n    \"    update_main = torch.compile(update_main, mode=mode)\\n\",\n    \"    update_pol = torch.compile(update_pol, mode=mode)\\n\",\n    \"    policy = torch.compile(policy, mode=mode)\\n\",\n    \"    normalize_obs = torch.compile(normalize_obs, mode=mode)\\n\",\n    \"    normalize_critic_obs = torch.compile(normalize_critic_obs, mode=mode)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 11,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Load Checkpoint if Needed\\n\",\n    \"if checkpoint_path is not None:\\n\",\n    \"    torch_checkpoint = torch.load(\\n\",\n    \"        f\\\"{checkpoint_path}\\\", map_location=device, weights_only=False\\n\",\n    \"    )\\n\",\n    \"\\n\",\n    \"    actor.load_state_dict(torch_checkpoint[\\\"actor_state_dict\\\"])\\n\",\n    \"    obs_normalizer.load_state_dict(torch_checkpoint[\\\"obs_normalizer_state\\\"])\\n\",\n    \"    critic_obs_normalizer.load_state_dict(\\n\",\n    \"        torch_checkpoint[\\\"critic_obs_normalizer_state\\\"]\\n\",\n    \"    )\\n\",\n    \"    qnet.load_state_dict(torch_checkpoint[\\\"qnet_state_dict\\\"])\\n\",\n    \"    qnet_target.load_state_dict(torch_checkpoint[\\\"qnet_target_state_dict\\\"])\\n\",\n    \"    global_step = 
torch_checkpoint[\\\"global_step\\\"]\\n\",\n    \"else:\\n\",\n    \"    global_step = 0\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 12,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Utility functions for displaying videos in notebook\\n\",\n    \"\\n\",\n    \"from IPython.display import display, HTML\\n\",\n    \"import base64\\n\",\n    \"import imageio\\n\",\n    \"import tempfile\\n\",\n    \"import os\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def frames_to_video_html(frames, fps=30):\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    Convert a list of numpy arrays to an HTML5 video element.\\n\",\n    \"\\n\",\n    \"    Args:\\n\",\n    \"        frames (list): List of numpy arrays representing video frames\\n\",\n    \"        fps (int): Frames per second for the video\\n\",\n    \"\\n\",\n    \"    Returns:\\n\",\n    \"        HTML object containing the video element\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    # Create a temporary file to store the video\\n\",\n    \"    with tempfile.NamedTemporaryFile(suffix=\\\".mp4\\\", delete=False) as temp_file:\\n\",\n    \"        temp_filename = temp_file.name\\n\",\n    \"\\n\",\n    \"    # Save frames as video\\n\",\n    \"    imageio.mimsave(temp_filename, frames, fps=fps)\\n\",\n    \"\\n\",\n    \"    # Read the video file and encode it to base64\\n\",\n    \"    with open(temp_filename, \\\"rb\\\") as f:\\n\",\n    \"        video_data = f.read()\\n\",\n    \"    video_b64 = base64.b64encode(video_data).decode(\\\"utf-8\\\")\\n\",\n    \"\\n\",\n    \"    # Create HTML video element\\n\",\n    \"    video_html = f\\\"\\\"\\\"\\n\",\n    \"    <video width=\\\"640\\\" height=\\\"480\\\" controls>\\n\",\n    \"        <source src=\\\"data:video/mp4;base64,{video_b64}\\\" type=\\\"video/mp4\\\">\\n\",\n    \"        Your browser does not support the video tag.\\n\",\n    \"    </video>\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"\\n\",\n    \"    # 
Clean up the temporary file\\n\",\n    \"    os.unlink(temp_filename)\\n\",\n    \"\\n\",\n    \"    return HTML(video_html)\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def update_video_display(frames, fps=30):\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    Display video frames as an embedded HTML5 video element.\\n\",\n    \"\\n\",\n    \"    Args:\\n\",\n    \"        frames (list): List of numpy arrays representing video frames\\n\",\n    \"        fps (int): Frames per second for the video\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    video_html = frames_to_video_html(frames, fps=fps)\\n\",\n    \"    display(video_html)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# NOTE: Main Training Loop\\n\",\n    \"\\n\",\n    \"if envs.asymmetric_obs:\\n\",\n    \"    obs, critic_obs = envs.reset_with_critic_obs()\\n\",\n    \"    critic_obs = torch.as_tensor(critic_obs, device=device, dtype=torch.float)\\n\",\n    \"else:\\n\",\n    \"    obs = envs.reset()\\n\",\n    \"pbar = tqdm.tqdm(total=args.total_timesteps, initial=global_step)\\n\",\n    \"\\n\",\n    \"dones = None\\n\",\n    \"while global_step < args.total_timesteps:\\n\",\n    \"    logs_dict = TensorDict()\\n\",\n    \"    with torch.no_grad(), autocast(\\n\",\n    \"        device_type=amp_device_type, dtype=amp_dtype, enabled=amp_enabled\\n\",\n    \"    ):\\n\",\n    \"        norm_obs = normalize_obs(obs)\\n\",\n    \"        actions = policy(obs=norm_obs, dones=dones)\\n\",\n    \"\\n\",\n    \"    next_obs, rewards, dones, infos = envs.step(actions.float())\\n\",\n    \"    truncations = infos[\\\"time_outs\\\"]\\n\",\n    \"\\n\",\n    \"    if envs.asymmetric_obs:\\n\",\n    \"        next_critic_obs = infos[\\\"observations\\\"][\\\"critic\\\"]\\n\",\n    \"\\n\",\n    \"    # Compute 'true' next_obs and next_critic_obs for saving\\n\",\n    \"    true_next_obs = torch.where(\\n\",\n    \"        
dones[:, None] > 0, infos[\\\"observations\\\"][\\\"raw\\\"][\\\"obs\\\"], next_obs\\n\",\n    \"    )\\n\",\n    \"    if envs.asymmetric_obs:\\n\",\n    \"        true_next_critic_obs = torch.where(\\n\",\n    \"            dones[:, None] > 0,\\n\",\n    \"            infos[\\\"observations\\\"][\\\"raw\\\"][\\\"critic_obs\\\"],\\n\",\n    \"            next_critic_obs,\\n\",\n    \"        )\\n\",\n    \"    transition = TensorDict(\\n\",\n    \"        {\\n\",\n    \"            \\\"observations\\\": obs,\\n\",\n    \"            \\\"actions\\\": torch.as_tensor(actions, device=device, dtype=torch.float),\\n\",\n    \"            \\\"next\\\": {\\n\",\n    \"                \\\"observations\\\": true_next_obs,\\n\",\n    \"                \\\"rewards\\\": torch.as_tensor(rewards, device=device, dtype=torch.float),\\n\",\n    \"                \\\"truncations\\\": truncations.long(),\\n\",\n    \"                \\\"dones\\\": dones.long(),\\n\",\n    \"            },\\n\",\n    \"        },\\n\",\n    \"        batch_size=(envs.num_envs,),\\n\",\n    \"        device=device,\\n\",\n    \"    )\\n\",\n    \"    if envs.asymmetric_obs:\\n\",\n    \"        transition[\\\"critic_observations\\\"] = critic_obs\\n\",\n    \"        transition[\\\"next\\\"][\\\"critic_observations\\\"] = true_next_critic_obs\\n\",\n    \"\\n\",\n    \"    obs = next_obs\\n\",\n    \"    if envs.asymmetric_obs:\\n\",\n    \"        critic_obs = next_critic_obs\\n\",\n    \"\\n\",\n    \"    rb.extend(transition)\\n\",\n    \"\\n\",\n    \"    batch_size = args.batch_size // args.num_envs\\n\",\n    \"    if global_step > args.learning_starts:\\n\",\n    \"        for i in range(args.num_updates):\\n\",\n    \"            data = rb.sample(batch_size)\\n\",\n    \"            data[\\\"observations\\\"] = normalize_obs(data[\\\"observations\\\"])\\n\",\n    \"            data[\\\"next\\\"][\\\"observations\\\"] = normalize_obs(data[\\\"next\\\"][\\\"observations\\\"])\\n\",\n    \"       
     if envs.asymmetric_obs:\\n\",\n    \"                data[\\\"critic_observations\\\"] = normalize_critic_obs(\\n\",\n    \"                    data[\\\"critic_observations\\\"]\\n\",\n    \"                )\\n\",\n    \"                data[\\\"next\\\"][\\\"critic_observations\\\"] = normalize_critic_obs(\\n\",\n    \"                    data[\\\"next\\\"][\\\"critic_observations\\\"]\\n\",\n    \"                )\\n\",\n    \"            logs_dict = update_main(data, logs_dict)\\n\",\n    \"            if args.num_updates > 1:\\n\",\n    \"                if i % args.policy_frequency == 1:\\n\",\n    \"                    logs_dict = update_pol(data, logs_dict)\\n\",\n    \"            else:\\n\",\n    \"                if global_step % args.policy_frequency == 0:\\n\",\n    \"                    logs_dict = update_pol(data, logs_dict)\\n\",\n    \"\\n\",\n    \"            for param, target_param in zip(qnet.parameters(), qnet_target.parameters()):\\n\",\n    \"                target_param.data.copy_(\\n\",\n    \"                    args.tau * param.data + (1 - args.tau) * target_param.data\\n\",\n    \"                )\\n\",\n    \"\\n\",\n    \"        if global_step > 0 and global_step % 100 == 0:\\n\",\n    \"            with torch.no_grad():\\n\",\n    \"                logs = {\\n\",\n    \"                    \\\"actor_loss\\\": logs_dict[\\\"actor_loss\\\"].mean(),\\n\",\n    \"                    \\\"qf_loss\\\": logs_dict[\\\"qf_loss\\\"].mean(),\\n\",\n    \"                    \\\"qf_max\\\": logs_dict[\\\"qf_max\\\"].mean(),\\n\",\n    \"                    \\\"qf_min\\\": logs_dict[\\\"qf_min\\\"].mean(),\\n\",\n    \"                    \\\"actor_grad_norm\\\": logs_dict[\\\"actor_grad_norm\\\"].mean(),\\n\",\n    \"                    \\\"critic_grad_norm\\\": logs_dict[\\\"critic_grad_norm\\\"].mean(),\\n\",\n    \"                    \\\"buffer_rewards\\\": logs_dict[\\\"buffer_rewards\\\"].mean(),\\n\",\n    \"                    
\\\"env_rewards\\\": rewards.mean(),\\n\",\n    \"                }\\n\",\n    \"\\n\",\n    \"                if args.eval_interval > 0 and global_step % args.eval_interval == 0:\\n\",\n    \"                    eval_avg_return, eval_avg_length = evaluate()\\n\",\n    \"                    if env_type in [\\\"humanoid_bench\\\", \\\"isaaclab\\\"]:\\n\",\n    \"                        # NOTE: Hacky way of evaluating performance, but just works\\n\",\n    \"                        obs = envs.reset()\\n\",\n    \"                    logs[\\\"eval_avg_return\\\"] = eval_avg_return\\n\",\n    \"                    logs[\\\"eval_avg_length\\\"] = eval_avg_length\\n\",\n    \"\\n\",\n    \"                if args.render_interval > 0 and global_step % args.render_interval == 0:\\n\",\n    \"                    renders = render_with_rollout()\\n\",\n    \"                    print_logs = {\\n\",\n    \"                        k: v.item() if isinstance(v, torch.Tensor) else v\\n\",\n    \"                        for k, v in logs.items()\\n\",\n    \"                    }\\n\",\n    \"                    for k, v in print_logs.items():\\n\",\n    \"                        print(f\\\"{k}: {v:.4f}\\\")\\n\",\n    \"                    update_video_display(renders, fps=30)\\n\",\n    \"                    if use_wandb:\\n\",\n    \"                        wandb.log(\\n\",\n    \"                            {\\n\",\n    \"                                \\\"render_video\\\": wandb.Video(\\n\",\n    \"                                    np.array(renders).transpose(\\n\",\n    \"                                        0, 3, 1, 2\\n\",\n    \"                                    ),  # Convert to (T, C, H, W) format\\n\",\n    \"                                    fps=30,\\n\",\n    \"                                    format=\\\"gif\\\",\\n\",\n    \"                                )\\n\",\n    \"                            },\\n\",\n    \"                            
step=global_step,\\n\",\n    \"                        )\\n\",\n    \"            if use_wandb:\\n\",\n    \"                wandb.log(\\n\",\n    \"                    {\\n\",\n    \"                        \\\"frame\\\": global_step * args.num_envs,\\n\",\n    \"                        **logs,\\n\",\n    \"                    },\\n\",\n    \"                    step=global_step,\\n\",\n    \"                )\\n\",\n    \"\\n\",\n    \"        if (\\n\",\n    \"            args.save_interval > 0\\n\",\n    \"            and global_step > 0\\n\",\n    \"            and global_step % args.save_interval == 0\\n\",\n    \"        ):\\n\",\n    \"            save_params(\\n\",\n    \"                global_step,\\n\",\n    \"                actor,\\n\",\n    \"                qnet,\\n\",\n    \"                qnet_target,\\n\",\n    \"                obs_normalizer,\\n\",\n    \"                critic_obs_normalizer,\\n\",\n    \"                args,\\n\",\n    \"                f\\\"models/{run_name}_{global_step}.pt\\\",\\n\",\n    \"            )\\n\",\n    \"\\n\",\n    \"    global_step += 1\\n\",\n    \"    pbar.update(1)\\n\",\n    \"\\n\",\n    \"save_params(\\n\",\n    \"    global_step,\\n\",\n    \"    actor,\\n\",\n    \"    qnet,\\n\",\n    \"    qnet_target,\\n\",\n    \"    obs_normalizer,\\n\",\n    \"    critic_obs_normalizer,\\n\",\n    \"    args,\\n\",\n    \"    f\\\"models/{run_name}_final.pt\\\",\\n\",\n    \")\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"fasttd3_hb\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.17\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "requirements/requirements.txt",
    "content": "gymnasium<1.0.0\njax-jumpy==1.0.0 ; python_version >= \"3.8\" and python_version < \"3.11\"\nmatplotlib\nmoviepy\nnumpy<2.0\npandas\nprotobuf\npygame\nstable-baselines3\ntqdm\nwandb\ntorchrl==0.7.2\ntensordict==0.7.2\ntyro\nloguru\ntorch==2.6.0 --index-url https://download.pytorch.org/whl/cu124\ntorchvision==0.21.0 --index-url https://download.pytorch.org/whl/cu124\ntorchaudio==2.6.0 --index-url https://download.pytorch.org/whl/cu124\n"
  },
  {
    "path": "requirements/requirements_isaacgym.txt",
    "content": "gymnasium<1.0.0\njax-jumpy==1.0.0 ; python_version >= \"3.8\" and python_version < \"3.11\"\nmatplotlib\nmoviepy\nnumpy<2.0\npandas\nprotobuf\npygame\nstable-baselines3\ntqdm\nwandb\ntorchrl==0.5.0\ntensordict==0.5.0\ntyro\nloguru"
  },
  {
    "path": "requirements/requirements_playground.txt",
    "content": "gymnasium<1.0.0\njax-jumpy==1.0.0 ; python_version >= \"3.8\" and python_version < \"3.11\"\nmatplotlib\nmoviepy\nnumpy<2.0\npandas\nprotobuf\npygame\nstable-baselines3\ntqdm\nwandb\ntorchrl==0.7.2\ntensordict==0.7.2\ntyro\nloguru\ngit+https://github.com/younggyoseo/mujoco_playground.git\ntorch==2.6.0 --index-url https://download.pytorch.org/whl/cu124\ntorchvision==0.21.0 --index-url https://download.pytorch.org/whl/cu124\ntorchaudio==2.6.0 --index-url https://download.pytorch.org/whl/cu124\njax[cuda12]==0.4.35\nnvidia-cublas-cu12==12.4.5.8\nnvidia-cuda-cupti-cu12==12.4.127\nnvidia-cuda-nvcc-cu12==12.8.93\nnvidia-cuda-nvrtc-cu12==12.4.127\nnvidia-cuda-runtime-cu12==12.4.127\nnvidia-cudnn-cu12==9.1.0.70\nnvidia-cufft-cu12==11.2.1.3\nnvidia-curand-cu12==10.3.5.147\nnvidia-cusolver-cu12==11.6.1.9\nnvidia-cusparse-cu12==12.3.1.170\nnvidia-cusparselt-cu12==0.6.2\nnvidia-nccl-cu12==2.21.5\nnvidia-nvjitlink-cu12==12.4.127\nnvidia-nvtx-cu12==12.4.127\n"
  },
  {
    "path": "setup.py",
    "content": "from setuptools import setup, find_packages\n\nsetup(\n    name=\"fast_td3\",\n    version=\"0.1.0\",\n    description=\"FastTD3 implementation\",\n    author=\"\",\n    author_email=\"\",\n    url=\"\",\n    packages=find_packages(),\n)\n"
  },
  {
    "path": "sim2real.md",
    "content": "# Guide for Sim2Real Training & Deployment\n\nThis guide provides guide to run sim-to-real experiments using FastTD3 and BoosterGym.\n\n**⚠️ Warning:** Deploying RL policies to real hardware can be sometimes very dangerous. Please make sure that you understand everything, check your policies work well in simulation, set every robot configuration correct (e.g., damping, stiffness, torque limits, etc), and follow proper safety protocols. **Simply copy-pasting commands in this README is not safe**.\n\n## ⚙️ Prerequisites\n\nInstall dependencies for Playground experiments (see `README.md`)\n\nThen, install `fast_td3` package with `pip install -e .` so you can import its classes in BoosterGym (see `fast_td3_deploy.py`).\n\n**⚠️ Note:** Our sim-to-real experiments depend on our customized MuJoCo Playground that supports `T1LowDimJoystick` tasks for 12-DOF T1 control instead of 23-DOF T1 control in `T1Joystick` tasks.\n\n## 🚀 Training in simulation\n\nUsers can train deployable policies for Booster T1 with FastTD3 using the below script:\n\n```bash\npython fast_td3/train.py --env_name T1LowDimJoystickRoughTerrain --exp_name FastTD3 --use_domain_randomization --use_push_randomization --total_timesteps 1000000 --render_interval 0 --seed 2\n```\n\n**⚠️ Note:** There is no 'guaranteed' number of training steps that can ensure safe real-world deployment. Usually, the gait becomes more stable with longer training. Please check the quality of gaits via sim-to-sim transfer, and fine-tune the policy to fix the issues. Use the checkpoints in `models` directory for sim-to-sim or sim-to-real transfer.\n\n**⚠️ Note:** We set `render_interval` to 0 to avoid dumping a lot of videos into wandb. 
Make sure to set it to a non-zero value if you want to render videos during training.\n\n### (Optional) 2-Stage Training\n\nFor faster convergence, users can consider introducing a curriculum to the training so that the robot first learns to walk on flat terrain without push perturbations. For this, train policies with the script below:\n\n```bash\nSTAGE1_STEPS=100000\nSTAGE2_STEPS=300000  # Effective Stage 2 steps: 300000 - 100000 = 200000\nSEED=2\nCHECKPOINT_PATH=models/T1LowDimJoystickFlatTerrain__FastTD3-Stage1__${SEED}_final.pt\n\nconda activate fasttd3_playground\n\n# Stage 1 training\npython fast_td3/train.py --env_name T1LowDimJoystickFlatTerrain --exp_name FastTD3-Stage1 --use_domain_randomization --no_use_push_randomization --total_timesteps ${STAGE1_STEPS} --render_interval 0 --seed ${SEED}\n\n# Stage 2 training\npython fast_td3/train.py --env_name T1LowDimJoystickRoughTerrain --exp_name FastTD3-Stage2 --use_domain_randomization --use_push_randomization --total_timesteps ${STAGE2_STEPS} --render_interval 0 --checkpoint_path ${CHECKPOINT_PATH} --seed ${SEED}\n```\n\nAgain, 100K and 200K steps do not guarantee safe real-world deployment. Please check the quality of gaits via sim-to-sim transfer, and fine-tune the policy to fix any issues. 
Use the final checkpoint (`models/T1LowDimJoystickRoughTerrain__FastTD3-Stage2__${SEED}_final.pt`) for sim-to-sim or sim-to-real transfer.\n\n## 🛝 Deployment with BoosterGym\n\nWe use a customized version of [BoosterGym](https://github.com/BoosterRobotics/booster_gym) for deployment with FastTD3.\n\nFirst, clone our fork of BoosterGym:\n\n```bash\ngit clone https://github.com/carlosferrazza/booster_gym.git\n```\n\nThen, follow the [guide](https://github.com/carlosferrazza/booster_gym) to install the dependencies for BoosterGym.\n\n### Sim-to-Sim Transfer\n\nYou can check whether the trained policy transfers to the non-MJX version of MuJoCo.\nUse the following commands on a machine that supports rendering to test sim-to-sim transfer:\n\n```bash\ncd <YOUR_WORKSPACE>/booster_gym\n\n# Activate your BoosterGym virtual environment\n\n# Launch MuJoCo simulation\npython play_mujoco.py --task=T1 --checkpoint=<CHECKPOINT_PATH>\nmjpython play_mujoco.py --task=T1 --checkpoint=<CHECKPOINT_PATH>  # for Mac\n```\n\n### Sim-to-Real Transfer\n\nFirst, prepare a JIT-scripted checkpoint:\n\n```python\n# Python snippet for JIT-scripting a checkpoint\nimport torch\nfrom fast_td3 import load_policy\npolicy = load_policy(<CHECKPOINT_PATH>)\nscripted_policy = torch.jit.script(policy)\nscripted_policy.save(<JIT_CHECKPOINT_PATH>)\n```\n\nThen, deploy the JIT-scripted checkpoint by following the guide on [Booster T1 Deployment](https://github.com/carlosferrazza/booster_gym/tree/main/deploy).\n\n**⚠️ Warning:** Please double-check that every value in the robot configuration (`booster_gym/deploy/configs/T1.yaml`) is set correctly! If position-control values such as `damping` or `stiffness` are set incorrectly, your robot may perform dangerous behaviors.\n\n**⚠️ Warning:** You may want to use a different configuration (e.g., `damping` and `stiffness`) for your own experiments. Just make sure to test it thoroughly in simulation and to set the values correctly.\n\n---\n\n🚀 That's it! 
Hope everything went smoothly, and stay safe."
  }
]
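
The soft target-network update in the notebook's training loop (`target_param <- tau * param + (1 - tau) * target_param`, driven by `args.tau`) can be sketched in plain Python without PyTorch. This is a minimal illustration only; the scalar lists below are hypothetical stand-ins for `qnet.parameters()` and `qnet_target.parameters()`:

```python
def polyak_update(online, target, tau):
    """Soft (Polyak) update: move each target parameter a fraction
    tau of the way toward its online counterpart."""
    return [tau * p + (1.0 - tau) * t for p, t in zip(online, target)]

# Hypothetical scalar "parameters" standing in for network weights.
online = [1.0, 2.0]
target = [0.0, 0.0]
for _ in range(3):
    target = polyak_update(online, target, tau=0.1)
# After three updates each target entry has covered 1 - 0.9**3 = 27.1%
# of the gap to its online value.
```

Smaller values of `tau` make the target network lag further behind the online critic, which stabilizes the bootstrapped targets used in the distributional critic loss.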