Repository: uber-research/atari-model-zoo
Branch: master
Commit: 18dc816c3fb9
Files: 108
Total size: 28.3 MB

Directory structure:
gitextract_j3665a4o/

├── .gitignore
├── LICENSE
├── NOTICE
├── README.md
├── atari_zoo/
│   ├── __init__.py
│   ├── activation_movie.py
│   ├── atari_wrappers.py
│   ├── config.py
│   ├── dopamine_preprocessing.py
│   ├── game_lists/
│   │   ├── a2c_game_list
│   │   ├── apex_game_list
│   │   └── dopamine_game_list
│   ├── log.py
│   ├── model_maker.py
│   ├── rollout.py
│   ├── scores.py
│   ├── synthetic_inputs.py
│   ├── top_patches.py
│   ├── translate.py
│   └── utils.py
├── colab/
│   └── AtariZooColabDemo.ipynb
├── dimensionality_reduction/
│   ├── README.md
│   ├── process.py
│   ├── process_helper.py
│   ├── ram_reduce.json
│   ├── representation_reduce.json
│   ├── visualize.py
│   ├── visualize_helper.py
│   ├── viz_ram_2d.json
│   └── viz_representation_2d.json
├── docs/
│   ├── RGraph/
│   │   └── libraries/
│   │       ├── RGraph.bar.js
│   │       ├── RGraph.bipolar.js
│   │       ├── RGraph.common.annotate.js
│   │       ├── RGraph.common.context.js
│   │       ├── RGraph.common.core.js
│   │       ├── RGraph.common.csv.js
│   │       ├── RGraph.common.deprecated.js
│   │       ├── RGraph.common.dynamic.js
│   │       ├── RGraph.common.effects.js
│   │       ├── RGraph.common.key.js
│   │       ├── RGraph.common.resizing.js
│   │       ├── RGraph.common.sheets.js
│   │       ├── RGraph.common.tooltips.js
│   │       ├── RGraph.common.zoom.js
│   │       ├── RGraph.cornergauge.js
│   │       ├── RGraph.drawing.background.js
│   │       ├── RGraph.drawing.circle.js
│   │       ├── RGraph.drawing.image.js
│   │       ├── RGraph.drawing.marker1.js
│   │       ├── RGraph.drawing.marker2.js
│   │       ├── RGraph.drawing.marker3.js
│   │       ├── RGraph.drawing.poly.js
│   │       ├── RGraph.drawing.rect.js
│   │       ├── RGraph.drawing.text.js
│   │       ├── RGraph.drawing.xaxis.js
│   │       ├── RGraph.drawing.yaxis.js
│   │       ├── RGraph.fuel.js
│   │       ├── RGraph.funnel.js
│   │       ├── RGraph.gantt.js
│   │       ├── RGraph.gauge.js
│   │       ├── RGraph.hbar.js
│   │       ├── RGraph.hprogress.js
│   │       ├── RGraph.line.js
│   │       ├── RGraph.meter.js
│   │       ├── RGraph.modaldialog.js
│   │       ├── RGraph.odo.js
│   │       ├── RGraph.pie.js
│   │       ├── RGraph.radar.js
│   │       ├── RGraph.rose.js
│   │       ├── RGraph.rscatter.js
│   │       ├── RGraph.scatter.js
│   │       ├── RGraph.semicircularprogress.js
│   │       ├── RGraph.svg.bar.js
│   │       ├── RGraph.svg.bipolar.js
│   │       ├── RGraph.svg.common.ajax.js
│   │       ├── RGraph.svg.common.core.js
│   │       ├── RGraph.svg.common.csv.js
│   │       ├── RGraph.svg.common.fx.js
│   │       ├── RGraph.svg.common.key.js
│   │       ├── RGraph.svg.common.sheets.js
│   │       ├── RGraph.svg.common.tooltips.js
│   │       ├── RGraph.svg.funnel.js
│   │       ├── RGraph.svg.gauge.js
│   │       ├── RGraph.svg.hbar.js
│   │       ├── RGraph.svg.line.js
│   │       ├── RGraph.svg.pie.js
│   │       ├── RGraph.svg.radar.js
│   │       ├── RGraph.svg.rose.js
│   │       ├── RGraph.svg.scatter.js
│   │       ├── RGraph.svg.semicircularprogress.js
│   │       ├── RGraph.svg.waterfall.js
│   │       ├── RGraph.thermometer.js
│   │       ├── RGraph.vprogress.js
│   │       └── RGraph.waterfall.js
│   ├── css/
│   │   ├── bootstrap-theme.css
│   │   └── bootstrap.css
│   ├── js/
│   │   ├── bootstrap.js
│   │   └── npm.js
│   ├── video.html
│   └── video2.html
├── examples/
│   ├── classify_state.py
│   └── demo.py
├── notebooks/
│   ├── Basic visualization.ipynb
│   ├── Filter Analysis.ipynb
│   ├── Training log visualization.ipynb
│   └── Walkthrough.ipynb
├── requirements.txt
└── setup.py

================================================
FILE CONTENTS
================================================

================================================
FILE: .gitignore
================================================
*.swp
__pycache__
.ipynb_checkpoints

data


================================================
FILE: LICENSE
================================================
 Copyright (c) 2018 Uber Technologies, Inc.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.


================================================
FILE: NOTICE
================================================
Atari Model Zoo includes derived work from Dopamine (https://github.com/google/dopamine) under the Apache License 2.0:

Copyright 2018 The Dopamine Authors.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
 limitations under the License.

The derived work can be found in the files:  atari_zoo/dopamine_preprocessing.py


================================================
FILE: README.md
================================================
# Atari Zoo

The aim of this project is to disseminate deep reinforcement learning agents trained by a variety of algorithms, and to enable easy analysis, comparison, and visualization of them. The hope is to reduce friction for further
research into understanding reinforcement learning agents.
This project makes use of the excellent [Lucid](https://github.com/tensorflow/lucid) neural network visualization library, and integrates with the [Dopamine](https://github.com/google/dopamine) [model release](https://github.com/google/dopamine/tree/master/docs#downloads).

A paper introducing this work was published at IJCAI 2019: [An Atari Model Zoo for Analyzing, Visualizing, and Comparing Deep Reinforcement Learning Agents](https://arxiv.org/abs/1812.07069).

## About

This software package is accompanied by a binary release, hosted online, of (1) frozen models trained on Atari games by a variety of deep reinforcement learning methods, and (2) cached gameplay experience of those agents in their
training environments.

## Installation and Setup

Dependencies:
* [tensorflow](https://github.com/tensorflow/tensorflow) (version >0.8 and <2.0; *TF 2.x is not currently supported*)
* [lucid](https://github.com/tensorflow/lucid) (version
* [matplotlib](https://matplotlib.org/) for some visualizations
* [moviepy](https://zulko.github.io/moviepy/) (optional for making movies) 
* [gym](https://github.com/openai/gym) (installed with support for Atari; optional for generating new rollouts)
* [opencv-python](https://pypi.org/project/opencv-python/) (optional for generating new rollouts)
* [tensorflow-onnx](https://github.com/onnx/tensorflow-onnx) (optional for exporting to [ONNX](https://onnx.ai/) format)

To install, run ```python setup.py install``` after installing dependencies.

## Examples

```python

import atari_zoo
from atari_zoo import MakeAtariModel
from pylab import *

algo = "a2c"
env = "ZaxxonNoFrameskip-v4"
run_id = 1
tag = "final"
m = MakeAtariModel(algo,env,run_id,tag)()

# get observations, frames, and ram state from a representative rollout
obs = m.get_observations()
frames = m.get_frames()
ram = m.get_ram()

# visualize first layer of convolutional weights
session = atari_zoo.utils.get_session()

m.load_graphdef()
m.import_graph()

conv_weights = m.get_weights(session,0)
atari_zoo.utils.visualize_conv_w(conv_weights)
show()

```

From the command line you can run: ```python -m atari_zoo.activation_movie --algo rainbow --environment PongNoFrameskip-v4 --run_id 1 --output ./pong_rainbow1_activation.mp4```

## Notebooks

Jupyter notebooks in the notebooks directory give further examples of how this library can be used.

A [starter colab notebook](https://colab.research.google.com/github/uber-research/atari-model-zoo/blob/master/colab/AtariZooColabDemo.ipynb) enables you to check out the library without downloading and installing it.

## Web tools

* A tool for viewing videos of trained agents is available [here](https://uber-research.github.io/atari-model-zoo/video.html); note that it is possible to link to specific videos,
e.g. [https://uber-research.github.io/atari-model-zoo/video.html?algo=apex&game=Seaquest&tag=final&run=2](https://uber-research.github.io/atari-model-zoo/video.html?algo=apex&game=Seaquest&tag=final&run=2).

* A tool for viewing videos of trained agents alongside their neural activations is available [here](https://uber-research.github.io/atari-model-zoo/video2.html).

## Source code for training algorithms that produced zoo models

We trained five algorithms ourselves:

* [A2C](https://arxiv.org/abs/1602.01783) - we used the [baselines package from OpenAI](https://github.com/openai/baselines)
* [GA](https://arxiv.org/abs/1712.06567) - we used the [fast GPU implementation version released by Uber](https://github.com/uber-research/deep-neuroevolution)
* [ES](https://arxiv.org/abs/1703.03864) - we used the [fast GPU version released by Uber](https://github.com/uber-research/deep-neuroevolution)
* [Ape-X](https://arxiv.org/abs/1803.00933) - we used the [replication released by Uber](https://github.com/uber-research/ape-x)
* [IMPALA](https://arxiv.org/abs/1802.01561) - we used the [released code from DeepMind](https://github.com/deepmind/scalable_agent)

We took trained final models from two algorithms (DQN and Rainbow) from the [Dopamine model release](https://ai.googleblog.com/2018/08/introducing-new-framework-for-flexible.html):

* [DQN](https://arxiv.org/abs/1312.5602) - [implementation here](https://github.com/google/dopamine)
* [Rainbow](https://arxiv.org/abs/1710.02298) - [implementation here](https://github.com/google/dopamine)

## Citation

To cite this work in publications, please use the following BibTeX entry:

```
@inproceedings{such2019atari,
title = {An Atari Model Zoo for Analyzing, Visualizing, and Comparing Deep Reinforcement Learning Agents},
author = {Felipe Such and Vashish Madhavan and Rosanne Liu and Rui Wang and Pablo Castro and Yulun Li and Jiale Zhi and Ludwig Schubert and Marc G. Bellemare and Jeff Clune and Joel Lehman},
booktitle = {Proceedings of IJCAI 2019},
year = {2019},
}
```

## Contact Information

For questions, comments, and suggestions, email [joel.lehman@uber.com](mailto:joel.lehman@uber.com).


================================================
FILE: atari_zoo/__init__.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.


run_cnt = {'apex':3,'ga':3,'es':3,'a2c':3,'rainbow':5,'dqn':5,'impala':3}

analysis_subset_games = ['AmidarNoFrameskip-v4', 'AssaultNoFrameskip-v4', 'AsterixNoFrameskip-v4', 'AsteroidsNoFrameskip-v4', 'AtlantisNoFrameskip-v4','EnduroNoFrameskip-v4', 'FrostbiteNoFrameskip-v4', 'GravitarNoFrameskip-v4', 'KangarooNoFrameskip-v4',"SeaquestNoFrameskip-v4",'SkiingNoFrameskip-v4','VentureNoFrameskip-v4','ZaxxonNoFrameskip-v4']

game_list = ['AlienNoFrameskip-v4', 'AmidarNoFrameskip-v4', 'AssaultNoFrameskip-v4', 'AsterixNoFrameskip-v4', 'AsteroidsNoFrameskip-v4', 'AtlantisNoFrameskip-v4', 'BankHeistNoFrameskip-v4', 'BattleZoneNoFrameskip-v4', 'BeamRiderNoFrameskip-v4', 'BerzerkNoFrameskip-v4', 'BowlingNoFrameskip-v4', 'BoxingNoFrameskip-v4', 'BreakoutNoFrameskip-v4', 'CentipedeNoFrameskip-v4', 'ChopperCommandNoFrameskip-v4', 'CrazyClimberNoFrameskip-v4', 'DemonAttackNoFrameskip-v4', 'DoubleDunkNoFrameskip-v4', 'EnduroNoFrameskip-v4', 'FishingDerbyNoFrameskip-v4', 'FreewayNoFrameskip-v4', 'FrostbiteNoFrameskip-v4', 'GopherNoFrameskip-v4', 'GravitarNoFrameskip-v4', 'HeroNoFrameskip-v4', 'IceHockeyNoFrameskip-v4', 'JamesbondNoFrameskip-v4', 'KangarooNoFrameskip-v4', 'KrullNoFrameskip-v4', 'KungFuMasterNoFrameskip-v4', 'MontezumaRevengeNoFrameskip-v4', 'MsPacmanNoFrameskip-v4', 'NameThisGameNoFrameskip-v4', 'PhoenixNoFrameskip-v4', 'PitfallNoFrameskip-v4', 'PongNoFrameskip-v4', 'PrivateEyeNoFrameskip-v4', 'QbertNoFrameskip-v4', 'RiverraidNoFrameskip-v4', 'RoadRunnerNoFrameskip-v4', 'RobotankNoFrameskip-v4', 'SeaquestNoFrameskip-v4', 'SkiingNoFrameskip-v4', 'SolarisNoFrameskip-v4', 'SpaceInvadersNoFrameskip-v4', 'StarGunnerNoFrameskip-v4', 'TennisNoFrameskip-v4', 'TimePilotNoFrameskip-v4', 'TutankhamNoFrameskip-v4', 'UpNDownNoFrameskip-v4', 'VentureNoFrameskip-v4', 'VideoPinballNoFrameskip-v4', 'WizardOfWorNoFrameskip-v4', 'YarsRevengeNoFrameskip-v4', 'ZaxxonNoFrameskip-v4']

game_action_counts = {'DemonAttackNoFrameskip-v4': 6, 'BowlingNoFrameskip-v4': 6, 'QbertNoFrameskip-v4': 6, 'GopherNoFrameskip-v4': 8, 'PongNoFrameskip-v4': 6, 'BattleZoneNoFrameskip-v4': 18, 'VideoPinballNoFrameskip-v4': 9, 'FrostbiteNoFrameskip-v4': 18, 'BeamRiderNoFrameskip-v4': 9, 'YarsRevengeNoFrameskip-v4': 18, 'RoadRunnerNoFrameskip-v4': 18, 'JamesbondNoFrameskip-v4': 18, 'GravitarNoFrameskip-v4': 18, 'IceHockeyNoFrameskip-v4': 18, 'FishingDerbyNoFrameskip-v4': 18, 'BerzerkNoFrameskip-v4': 18, 'CrazyClimberNoFrameskip-v4': 9, 'ChopperCommandNoFrameskip-v4': 18, 'WizardOfWorNoFrameskip-v4': 10, 'ZaxxonNoFrameskip-v4': 18, 'AlienNoFrameskip-v4': 18, 'PitfallNoFrameskip-v4': 18, 'KrullNoFrameskip-v4': 18, 'KangarooNoFrameskip-v4': 18, 'BankHeistNoFrameskip-v4': 18, 'SpaceInvadersNoFrameskip-v4': 6, 'RobotankNoFrameskip-v4': 18, 'AmidarNoFrameskip-v4': 10, 'EnduroNoFrameskip-v4': 9, 'AsterixNoFrameskip-v4': 9, 'MontezumaRevengeNoFrameskip-v4': 18, 'VentureNoFrameskip-v4': 18, 'DoubleDunkNoFrameskip-v4': 18, 'KungFuMasterNoFrameskip-v4': 14, 'TimePilotNoFrameskip-v4': 10, 'CentipedeNoFrameskip-v4': 18, 'BreakoutNoFrameskip-v4': 4, 'SeaquestNoFrameskip-v4': 18, 'PhoenixNoFrameskip-v4': 8, 'FreewayNoFrameskip-v4': 3, 'AtlantisNoFrameskip-v4': 4, 'PrivateEyeNoFrameskip-v4': 18, 'NameThisGameNoFrameskip-v4': 6, 'TutankhamNoFrameskip-v4': 8, 'TennisNoFrameskip-v4': 18, 'AssaultNoFrameskip-v4': 7, 'SolarisNoFrameskip-v4': 18, 'StarGunnerNoFrameskip-v4': 18, 'AsteroidsNoFrameskip-v4': 14, 'SkiingNoFrameskip-v4': 3, 'HeroNoFrameskip-v4': 18, 'BoxingNoFrameskip-v4': 18, 'MsPacmanNoFrameskip-v4': 9, 'UpNDownNoFrameskip-v4': 6, 'RiverraidNoFrameskip-v4': 18}

game_action_meanings = {'AsteroidsNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE'], 'SkiingNoFrameskip-v4': ['NOOP', 'RIGHT', 'LEFT'], 'AtlantisNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHTFIRE', 'LEFTFIRE'], 'BoxingNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'IceHockeyNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'DoubleDunkNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'StarGunnerNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'RoadRunnerNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'TutankhamNoFrameskip-v4': ['NOOP', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE'], 'MontezumaRevengeNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'AsterixNoFrameskip-v4': ['NOOP', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT'], 'BowlingNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'DOWN', 'UPFIRE', 
'DOWNFIRE'], 'PongNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'GravitarNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'KungFuMasterNoFrameskip-v4': ['NOOP', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'DOWNRIGHT', 'DOWNLEFT', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'AssaultNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'SeaquestNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'PhoenixNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'DOWN', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE'], 'RiverraidNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'UpNDownNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'DOWN', 'UPFIRE', 'DOWNFIRE'], 'FreewayNoFrameskip-v4': ['NOOP', 'UP', 'DOWN'], 'WizardOfWorNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE'], 'EnduroNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'DOWN', 'DOWNRIGHT', 'DOWNLEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'PrivateEyeNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'BattleZoneNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 
'DOWNLEFTFIRE'], 'BankHeistNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'VentureNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'TimePilotNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE'], 'ZaxxonNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'CentipedeNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'PitfallNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'BerzerkNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'BreakoutNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT'], 'NameThisGameNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'CrazyClimberNoFrameskip-v4': ['NOOP', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT'], 'JamesbondNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 
'DOWNLEFTFIRE'], 'TennisNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'ChopperCommandNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'AlienNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'AmidarNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE'], 'QbertNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN'], 'MsPacmanNoFrameskip-v4': ['NOOP', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT'], 'FrostbiteNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'DemonAttackNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'KangarooNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'VideoPinballNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE'], 'RobotankNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'SolarisNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 
'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'HeroNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'FishingDerbyNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'GopherNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE'], 'BeamRiderNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'UPRIGHT', 'UPLEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'SpaceInvadersNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'RIGHTFIRE', 'LEFTFIRE'], 'KrullNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE'], 'YarsRevengeNoFrameskip-v4': ['NOOP', 'FIRE', 'UP', 'RIGHT', 'LEFT', 'DOWN', 'UPRIGHT', 'UPLEFT', 'DOWNRIGHT', 'DOWNLEFT', 'UPFIRE', 'RIGHTFIRE', 'LEFTFIRE', 'DOWNFIRE', 'UPRIGHTFIRE', 'UPLEFTFIRE', 'DOWNRIGHTFIRE', 'DOWNLEFTFIRE']}
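As a quick illustration of how the two tables above fit together, the snippet below maps a discrete action index to its human-readable ALE meaning. It uses a small excerpt of the dictionaries rather than the full tables, and `describe_action` is a hypothetical helper, not part of the module; in real use you would import `game_action_counts` and `game_action_meanings` from `atari_zoo`.

```python
# Excerpt of the atari_zoo action tables above (the module covers every game
# in game_list).
game_action_counts = {'PongNoFrameskip-v4': 6, 'FreewayNoFrameskip-v4': 3}
game_action_meanings = {
    'PongNoFrameskip-v4': ['NOOP', 'FIRE', 'RIGHT', 'LEFT', 'RIGHTFIRE', 'LEFTFIRE'],
    'FreewayNoFrameskip-v4': ['NOOP', 'UP', 'DOWN'],
}

def describe_action(env, action):
    """Map a discrete action index to its human-readable ALE meaning."""
    if not 0 <= action < game_action_counts[env]:
        raise ValueError("action %d out of range for %s" % (action, env))
    return game_action_meanings[env][action]
```

Note that `game_action_counts[env]` always equals `len(game_action_meanings[env])`, so either table can be used to size a network's output layer.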

from atari_zoo.model_maker import MakeAtariModel


================================================
FILE: atari_zoo/activation_movie.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

import argparse
import numpy as np
import moviepy.editor as mpy
from moviepy.video.io.ffmpeg_writer import FFMPEG_VideoWriter
import gym
from lucid.optvis.render import import_model
from atari_zoo import MakeAtariModel
from atari_zoo.utils import get_session
from atari_zoo.utils import conv_activations_to_canvas
from atari_zoo.utils import fc_activations_to_canvas
from atari_zoo.utils import get_activation_scaling
import tensorflow as tf

def gather_activations(m,obs,activations_tensor,session,X_t,batch_size=200,add_observations=True):
    #gather activations over entire trajectory
    obs_idx = 0
    length = obs.shape[0]

    collected_reps = []

    while obs_idx < length:
        rep = session.run(activations_tensor,{X_t:obs[obs_idx:obs_idx+batch_size]})
        collected_reps.append(rep)
        obs_idx += batch_size

    #collate representations
    compiled_rep = {}
    for layer in range(len(collected_reps[0])):
        collected = np.vstack([r[layer] for r in collected_reps])
        #print(collected.shape)
        layer_name = m.layers[layer]['name']
        compiled_rep[layer_name] = collected

    return compiled_rep
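# The batch-then-vstack pattern used by gather_activations above can be
# sketched in framework-free NumPy. Here batched_forward and the toy
# forward function are hypothetical stand-ins for session.run and the
# per-layer activation tensors:

```python
import numpy as np

def batched_forward(forward_fn, obs, batch_size=200):
    # Run forward_fn over obs in mini-batches, then stack each layer's
    # per-batch outputs back into one array per layer.
    collected = []
    idx = 0
    while idx < len(obs):
        collected.append(forward_fn(obs[idx:idx + batch_size]))
        idx += batch_size
    n_layers = len(collected[0])
    return [np.vstack([batch[layer] for batch in collected])
            for layer in range(n_layers)]

# Toy two-"layer" network: doubles the input and sums each row.
forward = lambda x: [x * 2.0, x.sum(axis=1, keepdims=True)]
obs = np.ones((5, 3))
layers = batched_forward(forward, obs, batch_size=2)
```

# Batching this way bounds peak memory by batch_size regardless of
# trajectory length, which matters for long cached rollouts.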

def activations_to_frames(m,activations):
    obs_idx = 0
    frames = []
    length = activations.shape[0]

    if len(activations.shape)==4:
        scaling = get_activation_scaling(m,activations)

    for obs_idx in range(length):
        if len(activations.shape)==4:
            frame = conv_activations_to_canvas(m,activations,padding=1,idx=obs_idx,scaling=scaling)
        elif len(activations.shape)==2:
            frame = fc_activations_to_canvas(m,activations,padding=1,idx=obs_idx)  
        frames.append(frame)
    return frames
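# conv_activations_to_canvas (defined in atari_zoo/utils.py) tiles each
# channel of a conv layer's (H, W, C) activation volume into one 2-D grid
# image. A simplified sketch of that tiling, omitting the per-layer scaling
# the real helper applies (tile_activations is a hypothetical name):

```python
import numpy as np

def tile_activations(act, padding=1):
    # Arrange the C channels of an (H, W, C) volume into a near-square grid,
    # separated by `padding` pixels of black.
    h, w, c = act.shape
    cols = int(np.ceil(np.sqrt(c)))
    rows = int(np.ceil(c / cols))
    canvas = np.zeros((rows * (h + padding), cols * (w + padding)))
    for i in range(c):
        r, col = divmod(i, cols)
        canvas[r * (h + padding):r * (h + padding) + h,
               col * (w + padding):col * (w + padding) + w] = act[:, :, i]
    return canvas

# 10 channels of an 8x8 activation map -> a 3x4 grid of tiles.
canvas = tile_activations(np.random.rand(8, 8, 10), padding=1)
```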

def make_clips_from_activations(m,_frames,obs,activations_tensor,session,X_t,fps=60):
    clip_dict = {}
    activations = gather_activations(m,obs,activations_tensor=activations_tensor,
                                     session=session,X_t=X_t,batch_size=1)
    
    for layer_idx in range(len(m.layers)):
        layer_name = m.layers[layer_idx]['name']
        print(layer_name)
        frames = activations_to_frames(m,activations[layer_name])
        clip = mpy.ImageSequenceClip([frame*255 for frame in frames], fps=fps)
        clip_dict[layer_name] = clip
        
    #create observation movie
    n_obs = m.native_activation_representation(obs)
    frames = activations_to_frames(m,n_obs)
    clip = mpy.ImageSequenceClip([frame*255 for frame in frames], fps=fps)
    clip_dict['observations'] = clip
    
    #create raw rollout movie
    clip = mpy.ImageSequenceClip([frame for frame in _frames], fps=fps)
    clip_dict['frames'] = clip
    
    return clip_dict

def side_by_side_clips(clip1,clip2):
    #calculate size of background canvas
    total_size_x = clip1.size[0] + clip2.size[0]
    total_size_y = max(clip1.size[1],clip2.size[1])

    #create background canvas
    bg_clip = mpy.ColorClip(size=(total_size_x,total_size_y), color=(255,255,255))

    duration = clip2.duration

    #align clips on canvas
    clip1=clip1.set_position(pos=(0,"center"))
    clip2=clip2.set_position(pos=((total_size_x-clip2.size[0],"center")))

    clip_list = [bg_clip,clip1,clip2]

    #composite together
    cc = mpy.CompositeVideoClip(clip_list,(total_size_x,total_size_y)).subclip(0,duration)
    return cc
    

def _MakeActivationVideoOneLayer(m,clip_dict,layer_no):
    labels = ["conv1","conv2","conv3","fc","output"]
    scales = [1.5,2.0,2.0,0.5,1.5]

    #get game frames
    clip1 = clip_dict['frames']

    #get activations from one layer
    layer_name = m.layers[layer_no]['name']
    clip2 = clip_dict[layer_name]
    clip2_scale = scales[layer_no]
    clip2 = clip2.resize(clip2_scale)
    return side_by_side_clips(clip1,clip2)


def _MakeActivationVideo(m,clip_dict):
    composite_size = (550,1000)

    clip_list = []
    clip_list.append(mpy.ColorClip(size=composite_size, color=(255,255,255)))

    labels = ["obs","conv1","conv2","conv3","fc","output"]
    scales = [1.0, 1.5,2.0,2.0,0.5,1.5]

    x_pos = 350
    y_pos = 25
    padding = 50
    label_fontsize = 20

    layers = m.layers.copy()
    layers.insert(0,{'name':'observations'})

    for layer_idx in range(len(labels)):
        layer_name = layers[layer_idx]['name']
    
        #get clip and resize it
        clip = clip_dict[layer_name]
        clip = clip.resize(scales[layer_idx])
    
        #calculate where to place it
        _x_pos = x_pos - 0.5 * clip.size[0]
        _y_pos = y_pos
        clip = clip.set_position((_x_pos,_y_pos))
    
        txtClip = mpy.TextClip(labels[layer_idx],color='black', fontsize=label_fontsize)
        txtPos = (x_pos - 0.5 * txtClip.size[0],y_pos - txtClip.size[1])
        clip_list.append(txtClip.set_position(txtPos))
    
        #offset coordinates
        y_pos += clip.size[1]
        y_pos += padding
        clip_list.append(clip)
    
    duration = clip.duration

    clip_list.append(clip_dict['frames'].set_position((50,580)))
    #clip_list.append(clip_dict['observations'].set_position((0,50)))

    cc = mpy.CompositeVideoClip(clip_list,composite_size).subclip(0,duration)
    #cc.ipython_display()
    return cc 

"""
Take a model and create a dictionary of MoviePy clips
for all the activations of the NN given a cached evaluation.
"""
def MakeClipDict(m):
    tf.reset_default_graph()

    m.load_graphdef()
    m.import_graph()
    obs = m.get_observations()
    frames = m.get_frames()
    
    #get a tf session
    session = get_session()

    #create a placeholder input to the network
    X_t = tf.placeholder(tf.float32, [None] + m.image_shape)

    #now get access to a dictionary that grabs output layers from the model
    T = import_model(m,X_t,X_t)
    activations = [T(layer['name']) for layer in m.layers]
    
    clip_dict = make_clips_from_activations(m,frames,obs,activations,session=session,X_t=X_t,fps=60)

    return clip_dict

"""
Take a model and a layer number (0=conv1,1=conv2,2=conv3) and
generate a side-by-side video of agent and activations on that
layer.
"""
def MakeActivationVideoOneLayer(m,layer_no,out_file=None):
    clip_dict = MakeClipDict(m)
    clip = _MakeActivationVideoOneLayer(m,clip_dict,layer_no)

    if out_file is not None:
        clip.write_videofile(out_file)

    return clip
    


"""
Take a model m and generate a side-by-side video of agent and activations 
"""
def MakeActivationVideo(m,video_fn=None):
    clip_dict = MakeClipDict(m)
    clip = _MakeActivationVideo(m,clip_dict)

    if video_fn is not None:
        clip.write_videofile(video_fn)

    return clip

def main():
    """
    Generates an activation movie for a rollout with a particular model
    """

    parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    parser.add_argument('--algo', help='choose from [ga, es, a2c, apex, rainbow, dqn, impala]', type=str,default="ga")
    parser.add_argument('--environment', type=str,default="SeaquestNoFrameskip-v4")
    parser.add_argument('--run_id',type=int,default=1)
    parser.add_argument('--output', type=str, default="output.mp4")

    args = parser.parse_args()
    
    m = MakeAtariModel(args.algo,args.environment,args.run_id)()

    cc = MakeActivationVideo(m)
    cc.write_videofile(args.output)

if __name__=="__main__":
    main()


================================================
FILE: atari_zoo/atari_wrappers.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

import numpy as np
from collections import deque
from PIL import Image
import gym
from gym import spaces
import tensorflow as tf
from pdb import set_trace as bb

class NoopResetEnv(gym.Wrapper):
    def __init__(self, env, noop_max=30):
        """Sample initial states by taking random number of no-ops on reset.
        No-op is assumed to be action 0.
        """
        gym.Wrapper.__init__(self, env)
        self.noop_max = noop_max
        self.override_num_noops = None
        assert env.unwrapped.get_action_meanings()[0] == 'NOOP'

    def reset(self):
        """ Do no-op action for a number of steps in [1, noop_max]."""
        self.env.reset()
        if self.override_num_noops is not None:
            noops = self.override_num_noops
        else:
            noops = self.unwrapped.np_random.randint(1, self.noop_max + 1) #pylint: disable=E1101
        assert noops > 0
        obs = None
        for _ in range(noops):
            obs, _, done, _ = self.env.step(0)
            if done:
                obs = self.env.reset()
        return obs

class FireResetEnv(gym.Wrapper):
    def __init__(self, env):
        """Take action on reset for environments that are fixed until firing."""
        gym.Wrapper.__init__(self, env)
        assert env.unwrapped.get_action_meanings()[1] == 'FIRE'
        assert len(env.unwrapped.get_action_meanings()) >= 3

    def reset(self):
        self.env.reset()
        obs, _, done, _ = self.env.step(1)
        if done:
            self.env.reset()
        obs, _, done, _ = self.env.step(2)
        if done:
            self.env.reset()
        return obs

class EpisodicLifeEnv(gym.Wrapper):
    def __init__(self, env):
        """Make end-of-life == end-of-episode, but only reset on true game over.
        Done by DeepMind for the DQN and co. since it helps value estimation.
        """
        gym.Wrapper.__init__(self, env)
        self.lives = 0
        self.was_real_done  = True

    def step(self, action):
        obs, reward, done, info = self.env.step(action)
        self.was_real_done = done
        # check current lives, make loss of life terminal,
        # then update lives to handle bonus lives
        lives = self.env.unwrapped.ale.lives()
        if lives < self.lives and lives > 0:
            # for Qbert we sometimes stay in the lives == 0 condition for a few
            # frames, so it's important to require lives > 0 so that we only
            # reset once the environment advertises done.
            done = True
        self.lives = lives
        return obs, reward, done, info

    def reset(self):
        """Reset only when lives are exhausted.
        This way all states are still reachable even though lives are episodic,
        and the learner need not know about any of this behind-the-scenes.
        """
        if self.was_real_done:
            obs = self.env.reset()
        else:
            # no-op step to advance from terminal/lost life state
            obs, _, _, _ = self.env.step(0)
        self.lives = self.env.unwrapped.ale.lives()
        return obs

class MaxAndSkipEnv(gym.Wrapper):
    def __init__(self, env, skip=4):
        """Return only every `skip`-th frame"""
        gym.Wrapper.__init__(self, env)
        # most recent raw observations (for max pooling across time steps)
        self._obs_buffer = deque(maxlen=2)
        self._skip       = skip
        self.viewer = None

    def step(self, action):
        """Repeat action, sum reward, and max over last observations."""
        total_reward = 0.0
        done = None
        for _ in range(self._skip):
            obs, reward, done, info = self.env.step(action)
            self._obs_buffer.append(obs)
            total_reward += reward
            if done:
                break
        max_frame = np.max(np.stack(self._obs_buffer), axis=0)

        return max_frame, total_reward, done, info

    def reset(self):
        """Clear past frame buffer and init. to first obs. from inner env."""
        self._obs_buffer.clear()
        obs = self.env.reset()
        self._obs_buffer.append(obs)
        return obs

    def _render(self, mode='human', close=False):
        if close:
            return
        if mode == 'human':
            from gym.envs.classic_control import rendering
            if self.viewer is None:
                self.viewer = rendering.SimpleImageViewer()
            self.viewer.imshow(np.max(np.stack(self._obs_buffer), axis=0))
            return np.max(np.stack(self._obs_buffer), axis=0)
        else:
            return np.max(np.stack(self._obs_buffer), axis=0)
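# MaxAndSkipEnv's element-wise maximum over the last two raw frames exists to
# defeat Atari sprite flicker (objects drawn only on alternate frames). A
# minimal numpy sketch of just the pooling step, using hypothetical 2x2 frames:

```python
import numpy as np
from collections import deque

buf = deque(maxlen=2)  # mirrors self._obs_buffer
buf.append(np.array([[0, 10], [0, 0]], dtype=np.uint8))  # sprite A visible only here
buf.append(np.array([[0, 0], [20, 0]], dtype=np.uint8))  # sprite B visible only here
max_frame = np.max(np.stack(buf), axis=0)  # element-wise max across the buffer
# both sprites survive the pooling: [[0, 10], [20, 0]]
```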

# class ClipRewardEnv(gym.RewardWrapper):
#     def _reward(self, reward):
#         """Bin reward to {+1, 0, -1} by its sign."""
#         return np.sign(reward)



class WarpFrameTF(gym.ObservationWrapper):
    def __init__(self, env, show_warped=False,warp_size=(84,84)):
        """Warp frames to 84x84 as done in the Nature paper and later work."""
        gym.ObservationWrapper.__init__(self, env)
        self.res = 84
        self.observation_space = spaces.Box(low=0, high=255, shape=(self.res, self.res, 1))
        self.viewer = None
        self.show_warped = show_warped

        self.inp_shape = [None]+list(env.observation_space.shape[:2])+[1,]
        self.x_t = tf.placeholder(tf.float32, self.inp_shape,name='warp_ph')
        self.warp_size = warp_size
        self.transform_op = self.transform(self.x_t)

    def transform(self,obs):
        obs = tf.image.resize_bilinear(obs, self.warp_size, align_corners=True)
        obs = tf.reshape(obs, self.warp_size + (1,))
        return obs


    def observation(self, obs):
        frame = np.dot(obs.astype('float32'), np.array([0.299, 0.587, 0.114], 'float32'))
        frame = frame[np.newaxis,:]
        frame = frame[...,np.newaxis]

        return self.transform_op.eval({self.x_t:frame}) 

    def _render(self, mode='human', close=False):
        if close:
            return
        if mode == 'human' and self.show_warped:
            from gym.envs.classic_control import rendering
            if self.viewer is None:
                self.viewer = rendering.SimpleImageViewer()
            img = self.observation(self.env._render('rgb_array', close)) * np.ones([1, 1, 3], dtype=np.uint8)
            self.viewer.imshow(img)
            return img
        else:
            return self.env._render(mode, close)


class WarpFrame(gym.ObservationWrapper):
    def __init__(self, env, show_warped=False):
        """Warp frames to 84x84 as done in the Nature paper and later work."""
        gym.ObservationWrapper.__init__(self, env)
        self.res = 84
        self.observation_space = spaces.Box(low=0, high=255, shape=(self.res, self.res, 1))
        self.viewer = None
        self.show_warped = show_warped

    def observation(self, obs):
        frame = np.dot(obs.astype('float32'), np.array([0.299, 0.587, 0.114], 'float32'))
        frame = np.array(Image.fromarray(frame).resize((self.res, self.res),
            resample=Image.BILINEAR), dtype=np.uint8)
        return frame.reshape((self.res, self.res, 1))

    def _render(self, mode='human', close=False):
        if close:
            return
        if mode == 'human' and self.show_warped:
            from gym.envs.classic_control import rendering
            if self.viewer is None:
                self.viewer = rendering.SimpleImageViewer()
            img = self.observation(self.env._render('rgb_array', close)) * np.ones([1, 1, 3], dtype=np.uint8)
            self.viewer.imshow(img)
            return img
        else:
            return self.env._render(mode, close)
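# WarpFrame converts RGB to grayscale with the standard BT.601 luma weights
# (0.299, 0.587, 0.114) before resizing to 84x84. A numpy-only sketch of just
# the grayscale step (the 2x2 test image is hypothetical):

```python
import numpy as np

rgb = np.zeros((2, 2, 3), dtype=np.uint8)
rgb[0, 0] = [255, 255, 255]  # one white pixel; the rest stay black
gray = np.dot(rgb.astype('float32'), np.array([0.299, 0.587, 0.114], 'float32'))
# gray has shape (2, 2); white maps to ~255.0, black to 0.0
```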

class FrameStack(gym.Wrapper):
    def __init__(self, env, k):
        """Buffer observations and stack across channels (last axis)."""
        gym.Wrapper.__init__(self, env)
        self.k = k
        self.frames = deque([], maxlen=k)
        shp = env.observation_space.shape
        assert shp[2] == 1  # can only stack 1-channel frames
        self.observation_space = spaces.Box(low=0, high=255, shape=(shp[0], shp[1], k))

    def reset(self):
        """Clear buffer and re-fill by duplicating the first observation."""
        ob = self.env.reset()
        for _ in range(self.k): self.frames.append(ob)
        return self.observation()

    def step(self, action):
        ob, reward, done, info = self.env.step(action)
        self.frames.append(ob)
        return self.observation(), reward, done, info

    def observation(self):
        assert len(self.frames) == self.k
        return np.concatenate(self.frames, axis=2)
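# FrameStack's reset() fills the deque by duplicating the first observation, so
# the stacked observation has a valid shape from the very first step. A numpy
# sketch with 84x84 single-channel frames and k=4, as in the DQN setup:

```python
import numpy as np
from collections import deque

k = 4
frames = deque([], maxlen=k)
ob = np.zeros((84, 84, 1), dtype=np.uint8)
for _ in range(k):  # reset(): duplicate the first observation k times
    frames.append(ob)
stacked = np.concatenate(frames, axis=2)  # stack along the channel axis
# stacked.shape == (84, 84, 4)
```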

class ScaledFloatFrame(gym.ObservationWrapper):
    def __init__(self,env,scale=(1/255.0)):
        gym.ObservationWrapper.__init__(self, env)
        self.scale = scale
    def observation(self, obs):
        # careful! This undoes the memory optimization; use
        # with smaller replay buffers only.
        return np.array(obs).astype(np.float32) * self.scale

class DiscretizeActions(gym.Wrapper):
    def __init__(self, env):
        """Buffer observations and stack across channels (last axis)."""
        gym.Wrapper.__init__(self, env)
        self.temp_action = env.action_space
        self.action_space = spaces.Discrete(5 ** int(np.prod(env.action_space.shape)))

    def step(self, action):
        cont_action = self.temp_action.low.copy()
        for i in range(cont_action.size):
            cont_action[i] += (self.temp_action.high[i] - self.temp_action.low[i]) * float(int(action) % 5) / 4.0
            action = int(action / 5)
        return self.env.step(cont_action)
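# DiscretizeActions.step decodes a flat discrete index into a continuous action
# vector by treating the index as a base-5 number, with 5 evenly spaced values
# per action dimension. A self-contained sketch with hypothetical bounds
# low=[-1,-1], high=[1,1]:

```python
import numpy as np

low = np.array([-1.0, -1.0])
high = np.array([1.0, 1.0])

def decode(action):
    """Mirror of DiscretizeActions.step's base-5 decoding."""
    cont = low.copy()
    for i in range(cont.size):
        cont[i] += (high[i] - low[i]) * float(int(action) % 5) / 4.0
        action = int(action / 5)
    return cont

# index 0 maps every dimension to its lower bound; index 24 (= 4 + 4*5)
# maps every dimension to its upper bound
```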


# def wrap_deepmind(env, episode_life=True, clip_rewards=True):
def wrap_deepmind(env, episode_life=False, skip=4, stack_frames=4, noop_max=30, noops=None, show_warped=False,preproc='tf'):
    """Configure environment for DeepMind-style Atari.

    Note: this does not include frame stacking!"""
    import gym.envs.atari
    if isinstance(env.unwrapped, gym.envs.atari.AtariEnv):
        if episode_life:
            env = EpisodicLifeEnv(env)
        env = NoopResetEnv(env, noop_max=noop_max)
        if noops:
            env.override_num_noops = noops
        if skip > 1:
            assert 'NoFrameskip' in env.spec.id  # required for DeepMind-style skip
            env = MaxAndSkipEnv(env, skip=skip)
        if 'FIRE' in env.unwrapped.get_action_meanings():
            env = FireResetEnv(env)
    if preproc=='np':
        env = WarpFrame(env, show_warped=show_warped)
    elif preproc=='tf':
        env = WarpFrameTF(env, show_warped=show_warped)
    #elif preproc=='dopamine':
    #    env = DopamineAtariPreprocessing(env)
    # if clip_rewards:
    #     env = ClipRewardEnv(env)
    if stack_frames > 1:
        env = FrameStack(env, stack_frames)
    env = ScaledFloatFrame(env)
    return env


================================================
FILE: atari_zoo/config.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

def dopamine_url_formatter(base_url,agent,game,run,tag=None):
    game_proc = game.split("NoFrameskip")[0]
    return "gs://download-dopamine-rl/lucid/{agent}/{game}/{run}/graph_def.pb".format(agent=agent,game=game_proc,run=run)


#remote lookup table
datadir_remote_dict = {'apex':"https://dgqeqexrlnkvd.cloudfront.net/zoo/apex",
                         'es':"https://dgqeqexrlnkvd.cloudfront.net/zoo/es",
                         'ga':"https://dgqeqexrlnkvd.cloudfront.net/zoo/ga",
                         'a2c':"https://dgqeqexrlnkvd.cloudfront.net/zoo/a2c",
                         'rainbow':"https://dgqeqexrlnkvd.cloudfront.net/zoo/rainbow",
                       'dqn':"https://dgqeqexrlnkvd.cloudfront.net/zoo/dqn",
                       'impala':"https://dgqeqexrlnkvd.cloudfront.net/zoo/impala"}

url_formatter_dict = {('rainbow','remote'):dopamine_url_formatter,('dqn','remote'):dopamine_url_formatter}


#local lookup table
datadir_local_dict = {'apex':"/space/rlzoo/apex",
                        'es':"/space/rlzoo/es",
                        'ga':"/space/rlzoo/ga",
                        'a2c':'/space/rlzoo/a2c',
                        'rainbow':'/space/rlzoo/rainbow',
                        'dqn':'/space/rlzoo/dqn',
                      'impala':'/space/rlzoo/impala' }

debug = True


================================================
FILE: atari_zoo/dopamine_preprocessing.py
================================================
# Modifications Copyright (c) 2018 Uber Technologies, Inc.

# Copyright 2018 The Dopamine Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#      http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""A class implementing minimal Atari 2600 preprocessing.

This includes:
  . Emitting a terminal signal when losing a life (optional).
  . Frame skipping and color pooling.
  . Resizing the image before it is provided to the agent.
"""

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

from gym.spaces.box import Box
import numpy as np
import cv2

#@gin.configurable

class AtariPreprocessing(object):
  """A class implementing image preprocessing for Atari 2600 agents.

  Specifically, this provides the following subset from the JAIR paper
  (Bellemare et al., 2013) and Nature DQN paper (Mnih et al., 2015):

    * Frame skipping (defaults to 4).
    * Terminal signal when a life is lost (off by default).
    * Grayscale and max-pooling of the last two frames.
    * Downsample the screen to a square image (defaults to 84x84).

  More generally, this class follows the preprocessing guidelines set down in
  Machado et al. (2018), "Revisiting the Arcade Learning Environment:
  Evaluation Protocols and Open Problems for General Agents".
  """

  def __init__(self, environment, frame_skip=4, terminal_on_life_loss=False,
               screen_size=84):
    """Constructor for an Atari 2600 preprocessor.

    Args:
      environment: Gym environment whose observations are preprocessed.
      frame_skip: int, the frequency at which the agent experiences the game.
      terminal_on_life_loss: bool, If True, the step() method returns
        is_terminal=True whenever a life is lost. See Mnih et al. 2015.
      screen_size: int, size of a resized Atari 2600 frame.

    Raises:
      ValueError: if frame_skip or screen_size are not strictly positive.
    """
    if frame_skip <= 0:
      raise ValueError('Frame skip should be strictly positive, got {}'.
                       format(frame_skip))
    if screen_size <= 0:
      raise ValueError('Target screen size should be strictly positive, got {}'.
                       format(screen_size))

    self.environment = environment
    self.unwrapped = environment
    self.terminal_on_life_loss = terminal_on_life_loss
    self.frame_skip = frame_skip
    self.screen_size = screen_size

    obs_dims = self.environment.observation_space
    # Stores temporary observations used for pooling over two successive
    # frames.
    self.screen_buffer = [
        np.empty((obs_dims.shape[0], obs_dims.shape[1]), dtype=np.uint8),
        np.empty((obs_dims.shape[0], obs_dims.shape[1]), dtype=np.uint8)
    ]

    self.game_over = False
    self.lives = 0  # Will need to be set by reset().

  @property
  def observation_space(self):
    # Return the observation space adjusted to match the shape of the processed
    # observations.
    return Box(low=0, high=255, shape=(self.screen_size, self.screen_size, 1),
               dtype=np.uint8)

  @property
  def action_space(self):
    return self.environment.action_space

  @property
  def reward_range(self):
    return self.environment.reward_range

  @property
  def metadata(self):
    return self.environment.metadata

  def reset(self):
    """Resets the environment.

    Returns:
      observation: numpy array, the initial observation emitted by the
        environment.
    """
    self.environment.reset()
    self.lives = self.environment.ale.lives()
    self._fetch_grayscale_observation(self.screen_buffer[0])
    self.screen_buffer[1].fill(0)
    return self._pool_and_resize()

  def render(self, mode):
    """Renders the current screen, before preprocessing.

    This calls the Gym API's render() method.

    Args:
      mode: Mode argument for the environment's render() method.
        Valid values (str) are:
          'rgb_array': returns the raw ALE image.
          'human': renders to display via the Gym renderer.

    Returns:
      if mode='rgb_array': numpy array, the most recent screen.
      if mode='human': bool, whether the rendering was successful.
    """
    return self.environment.render(mode)

  def step(self, action):
    """Applies the given action in the environment.

    Remarks:

      * If a terminal state (from life loss or episode end) is reached, this may
        execute fewer than self.frame_skip steps in the environment.
      * Furthermore, in this case the returned observation may not contain valid
        image data and should be ignored.

    Args:
      action: The action to be executed.

    Returns:
      observation: numpy array, the observation following the action.
      reward: float, the reward following the action.
      is_terminal: bool, whether the environment has reached a terminal state.
        This is true when a life is lost and terminal_on_life_loss, or when the
        episode is over.
      info: Gym API's info data structure.
    """
    accumulated_reward = 0.

    for time_step in range(self.frame_skip):
      # We bypass the Gym observation altogether and directly fetch the
      # grayscale image from the ALE. This is a little faster.
      _, reward, game_over, info = self.environment.step(action)
      accumulated_reward += reward

      if self.terminal_on_life_loss:
        new_lives = self.environment.ale.lives()
        is_terminal = game_over or new_lives < self.lives
        self.lives = new_lives
      else:
        is_terminal = game_over

      if is_terminal:
        break
      # We max-pool over the last two frames, in grayscale.
      elif time_step >= self.frame_skip - 2:
        t = time_step - (self.frame_skip - 2)
        self._fetch_grayscale_observation(self.screen_buffer[t])

    # Pool the last two observations.
    observation = self._pool_and_resize()

    self.game_over = game_over
    return observation, accumulated_reward, is_terminal, info

  def _fetch_grayscale_observation(self, output):
    """Returns the current observation in grayscale.

    The returned observation is stored in 'output'.

    Args:
      output: numpy array, screen buffer to hold the returned observation.

    Returns:
      observation: numpy array, the current observation in grayscale.
    """
    self.environment.ale.getScreenGrayscale(output)
    return output

  def _pool_and_resize(self):
    """Transforms two frames into a Nature DQN observation.

    For efficiency, the transformation is done in-place in self.screen_buffer.

    Returns:
      transformed_screen: numpy array, pooled, resized screen.
    """
    # Pool if there are enough screens to do so.
    if self.frame_skip > 1:
      np.maximum(self.screen_buffer[0], self.screen_buffer[1],
                 out=self.screen_buffer[0])

    transformed_image = cv2.resize(self.screen_buffer[0],
                                   (self.screen_size, self.screen_size),
                                   interpolation=cv2.INTER_AREA)
    int_image = np.asarray(transformed_image, dtype=np.uint8)
    return np.expand_dims(int_image, axis=2)


================================================
FILE: atari_zoo/game_lists/a2c_game_list
================================================
AirRaidNoFrameskip-v4
AlienNoFrameskip-v4
AmidarNoFrameskip-v4
AssaultNoFrameskip-v4
AsterixNoFrameskip-v4
AsteroidsNoFrameskip-v4
AtlantisNoFrameskip-v4
BankHeistNoFrameskip-v4
BattleZoneNoFrameskip-v4
BeamRiderNoFrameskip-v4
BerzerkNoFrameskip-v4
BowlingNoFrameskip-v4
BoxingNoFrameskip-v4
BreakoutNoFrameskip-v4
CarnivalNoFrameskip-v4
CentipedeNoFrameskip-v4
ChopperCommandNoFrameskip-v4
CrazyClimberNoFrameskip-v4
DemonAttackNoFrameskip-v4
DoubleDunkNoFrameskip-v4
ElevatorActionNoFrameskip-v4
EnduroNoFrameskip-v4
FishingDerbyNoFrameskip-v4
FreewayNoFrameskip-v4
FrostbiteNoFrameskip-v4
GopherNoFrameskip-v4
GravitarNoFrameskip-v4
HeroNoFrameskip-v4
IceHockeyNoFrameskip-v4
JamesbondNoFrameskip-v4
JourneyEscapeNoFrameskip-v4
KangarooNoFrameskip-v4
KrullNoFrameskip-v4
KungFuMasterNoFrameskip-v4
MontezumaRevengeNoFrameskip-v4
MsPacmanNoFrameskip-v4
NameThisGameNoFrameskip-v4
PhoenixNoFrameskip-v4
PitfallNoFrameskip-v4
PongNoFrameskip-v4
PooyanNoFrameskip-v4
PrivateEyeNoFrameskip-v4
QbertNoFrameskip-v4
RiverraidNoFrameskip-v4
RoadRunnerNoFrameskip-v4
RobotankNoFrameskip-v4
SeaquestNoFrameskip-v4
SkiingNoFrameskip-v4
SolarisNoFrameskip-v4
SpaceInvadersNoFrameskip-v4
StarGunnerNoFrameskip-v4
TennisNoFrameskip-v4
TimePilotNoFrameskip-v4
TutankhamNoFrameskip-v4
UpNDownNoFrameskip-v4
VentureNoFrameskip-v4
VideoPinballNoFrameskip-v4
WizardOfWorNoFrameskip-v4
YarsRevengeNoFrameskip-v4
ZaxxonNoFrameskip-v4


================================================
FILE: atari_zoo/game_lists/apex_game_list
================================================
alien
amidar
assault
asterix
asteroids
atlantis
bank_heist
battle_zone
beam_rider
berzerk
bowling
boxing
breakout
centipede
chopper_command
crazy_climber
demon_attack
double_dunk
enduro
fishing_derby
freeway
frostbite
gopher
gravitar
hero
ice_hockey
jamesbond
kangaroo
krull
kung_fu_master
montezuma_revenge
ms_pacman
name_this_game
phoenix
pitfall
pong
private_eye
qbert
riverraid
road_runner
robotank
seaquest
skiing
solaris
space_invaders
star_gunner
tennis
time_pilot
tutankham
up_n_down
venture
video_pinball
wizard_of_wor
yars_revenge
zaxxon


================================================
FILE: atari_zoo/game_lists/dopamine_game_list
================================================
AirRaid
Alien
Amidar
Assault
Asterix
Asteroids
Atlantis
BankHeist
BattleZone
BeamRider
Berzerk
Bowling
Boxing
Breakout
Carnival
Centipede
ChopperCommand
CrazyClimber
DemonAttack
DoubleDunk
ElevatorAction
Enduro
FishingDerby
Freeway
Frostbite
Gopher
Gravitar
Hero
IceHockey
Jamesbond
JourneyEscape
Kangaroo
Krull
KungFuMaster
MontezumaRevenge
MsPacman
NameThisGame
Phoenix
Pitfall
Pong
Pooyan
PrivateEye
Qbert
Riverraid
RoadRunner
Robotank
Seaquest
Skiing
Solaris
SpaceInvaders
StarGunner
Tennis
TimePilot
Tutankham
UpNDown
Venture
VideoPinball
WizardOfWor
YarsRevenge
Zaxxon


================================================
FILE: atari_zoo/log.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

import numpy as np
import pandas as pd
import json
import atari_zoo
from atari_zoo.utils import load_json_from_url

#TODO: Potentially refactor into class structure

"""
Helper function to parse checkpoint log to expose available 'tagged' checkpoints and their information
"""
def parse_checkpoint_info(json_data):
    ckt_points = {}
    for entry in json_data:
        tag = entry['criteria']
        if entry['best_checkpoint']!=None:
            ckt_points[tag]=entry['best_checkpoint']
    return ckt_points
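# parse_checkpoint_info keeps only entries whose 'best_checkpoint' is set,
# keyed by their 'criteria' tag. A sketch with hypothetical checkpoint data:

```python
json_data = [
    {'criteria': 'best_score', 'best_checkpoint': 'ckpt_100'},
    {'criteria': 'final', 'best_checkpoint': None},  # dropped: no checkpoint
]
ckt_points = {}
for entry in json_data:  # same filtering as parse_checkpoint_info
    if entry['best_checkpoint'] is not None:
        ckt_points[entry['criteria']] = entry['best_checkpoint']
# ckt_points == {'best_score': 'ckpt_100'}
```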

"""
Load checkpoint json file from path, where
path can be either a local address or a web
address
"""
def load_checkpoint_info(path):
    fname = path+".json"

    if fname.find('http') != -1:
        return load_json_from_url(fname)
    else:
        return json.load(open(fname))

"""
Helper function to transform json log format to pandas data frames for plotting
"""
def get_dataframe_from_training_log(_data=None,_file=None,algo='default',run=1):
    if(not _data):
        log = json.load(_file)
    else:
        log = _data
    
    assert len(log)>0
    
    column_names = log[0].keys()
    data_dict = {}

    for column in column_names:
        data_dict[column] = []
            
    for entry in log:
        for key in column_names:
            data_dict[key].append(entry[key])
            
    data_dict['algo'] = [algo] * len(log)
    data_dict['run'] = [run] * len(log)
            
    df = pd.DataFrame(data_dict)
    
    if 'initial' in column_names:
        df = df[df['initial']==0]
        
    df = df.sort_values(by=['time'])

    """
    clean-up stage: some Ape-X runs were restarted from checkpoints
    due to a network outage. which creates a big timedelta that needs 
    to be cleaned up: i.e. 
    """
    clean=False
    threshold = 60*60*3 #assume >3 hour gap means restart

    while not clean:
        clean=True

        time_diffs = np.diff(df['time'])
        if np.max(time_diffs)>threshold:
            clean=False
            idx = np.argmax(time_diffs)
            amt = np.max(time_diffs)

            #TODO: use loc instead (don't operate on copy)
            #df['time'][idx+1:]-=amt
            df.loc[idx+1:,('time')]-=amt
    
    return df
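# The restart-cleanup loop above finds any time gap larger than the threshold
# and shifts all later timestamps back by the gap size. A numpy-only sketch of
# one iteration (the timestamps are hypothetical, with a fake 4-hour restart gap):

```python
import numpy as np

threshold = 60 * 60 * 3  # a gap over 3 hours implies a restart
times = np.array([0, 100, 200, 200 + 4 * 3600, 300 + 4 * 3600], dtype=np.int64)

time_diffs = np.diff(times)
if np.max(time_diffs) > threshold:
    idx = np.argmax(time_diffs)   # index just before the gap
    amt = np.max(time_diffs)      # size of the gap
    times[idx + 1:] -= amt        # shift later entries back by the gap
# times is now monotone without the 4-hour jump: [0, 100, 200, 200, 300]
```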

"""
Helper function to gather logs for runs of a particular algo/game combo
"""
def gather_logs_across_runs(algo,game,runs,local=False):
    results = []

    for run in runs:
        k= atari_zoo.MakeAtariModel(algo,game,run,local=local)()
        log = k.get_log()
        results.append(get_dataframe_from_training_log(_data=log,algo=algo,run=run))

    df = pd.concat(results)
    return df


"""
Helper function to gather logs across algorithms for a particular game
"""
def gather_logs_across_algos(algos,game,local=False):
    results = []

    for algo in algos:
        results.append(gather_logs_across_runs(algo,game,range(1,atari_zoo.run_cnt[algo]+1),local=local))

    df = pd.concat(results)

    return df


if __name__=='__main__':
    import seaborn as sns
    from pylab import *
    algo = "apex"
    game = "AmidarNoFrameskip-v4"
    
    """
    apex_df = gather_logs_across_runs("apex",game,range(1,6),local=True)
    ga_df = gather_logs_across_runs("ga",game,range(1,6),local=True)
    a2c_df = gather_logs_across_runs("a2c",game,range(1,4),local=True)
    es_df = gather_logs_across_runs("es",game,range(1,4),local=True)

    df = pd.concat((apex_df,ga_df,a2c_df,es_df))
    """

    df = gather_logs_across_algos(['apex','ga','a2c','es'],game,local=True)

    sns.lineplot(x="time", y="score",
                              style="run", hue='algo',
                              data=df)

    show()


================================================
FILE: atari_zoo/model_maker.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.


# Contains wrapper classes over Lucid that enable loading frozen graphs
# into the Lucid framework
import json
import lucid
from lucid.modelzoo.vision_base import Model
from lucid.misc.io.loading import load
from lucid.misc.io.reading import read, local_cache_path
import tensorflow as tf
import numpy as np
from pdb import set_trace as bb
import atari_zoo.config
from atari_zoo.config import datadir_local_dict,datadir_remote_dict,url_formatter_dict 
from atari_zoo import game_action_counts 
from atari_zoo.utils import *
import atari_zoo.log


"""
Basic RL model class that extends Lucid's Model class

Implements extra methods:

get_observations (Loads precomputed observations if they exist)
get_frames (Loads precomputed RGB frames if they exist)
get_ram (loads precomputed 128-integer RAM snapshots if they exist)
ram_state_to_bits (change 128-integer RAM into 1024-bit RAM)

Implements additional class variables:
channel_order: "NHWC" or "NCHW"
preprocess_style: 'tf' for tensorflow preprocessing, 'np' for numpy, 'dopamine' for dopamine-style
"""
class RL_model(Model):
    channel_order = "NHWC" #typical channel order
    dataset = 'RL'
    valid_run_range = (1,3)

    #if the model exposes other interesting layers
    #e.g. A2C exposes a value and a policy head
    additional_layers = {}

    #minutiae -- whether atari preprocessing is done in
    #tensorflow or numpy; the only difference is in implementation
    #details of bilinear downsampling; but of course
    #RL agents overfit to it!
    #
    #tf = tensorflow; np = numpy
    preprocess_style = 'tf'

    image_shape = [84, 84, 4]
    input_scale = 1.0
    image_value_range = (0, 1) 
    input_name = 'X_t'
    ph_type = 'float32'

    """ 
    overwrite input creating function to handle different
    datatypes; Dopamine models want uint8 placeholders
    """
    def create_input(self, t_input=None, forget_xy_shape=True):
        if t_input == None and self.ph_type=='uint8':
            t_input = tf.placeholder(tf.uint8,self.image_shape)

        #return super().create_input(t_input,forget_xy_shape)
        return super(RL_model,self).create_input(t_input,forget_xy_shape)

    #TODO integrate these file loads with Lucid's cache mechanism
    def get_log(self):
        fname = self.log_path+"_log.json"

        if(fname.find('http')!=-1):
            return load_json_from_url(fname)
        else:
            return json.load(open(fname))

    def get_checkpoint_info(self):
        return atari_zoo.log.load_checkpoint_info(self.log_path)

    def get_observations(self):
        fname = self.data_path
        return load(fname)['observations']

    def get_frames(self):
        fname = self.data_path
        return load(fname)['frames']

    def get_ram(self):
        fname = self.data_path
        return load(fname)['ram']

    def get_scores(self):
        fname = self.data_path
        return load(fname)['score']

    def get_representation(self):
        fname = self.data_path
        return load(fname)['representation']

    def get_episode_rewards(self):
        fname = self.data_path
        return load(fname)['ep_rewards']

    #TODO make more efficient
    def ram_state_to_bits(self,state):
        binary = ['{0:08b}'.format(k) for k in state]
        binary = ''.join(binary)
        return binary
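The TODO above flags this conversion as slow; a vectorized sketch using `np.unpackbits` (a suggestion, not part of the zoo API; the name `ram_state_to_bits_np` is hypothetical) produces the same MSB-first bit string:

```python
import numpy as np

def ram_state_to_bits_np(state):
    """Vectorized conversion: 128 RAM bytes -> 1024-character bit string."""
    bits = np.unpackbits(np.asarray(state, dtype=np.uint8))  # MSB-first, like '{0:08b}'
    return ''.join(bits.astype(str))

# agrees with the string-formatting implementation above
assert ram_state_to_bits_np([0, 255, 170]) == '000000001111111110101010'
```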

    #what processing must be done to extract the
    #right distribution of actions from the output
    #of the network
    def get_action(self,model):
        raise NotImplementedError

    #transform weight tensor to be of canonical style
    def preprocess_weight(self,x):
        #default is identity
        return x
   
    #grab weights from model given current session 
    def get_weights(self,session,layer_no):
        weights_name = self.weights[layer_no]['name']
        weights = session.graph.get_tensor_by_name("import/%s:0" % weights_name)
        weights = self.preprocess_weight(weights)
        return session.run(weights)

    #transform activations into canonical tensor
    def canonical_activation_representation(self,act):
        if self.channel_order=='NHWC':
            return act
        else:
            #print("Current:",act.shape)
            return np.transpose(act,axes=[0,2,3,1])

    #transform activations into canonical tensor
    def native_activation_representation(self,act):
        if self.channel_order=='NHWC':
            return act
        else:
            return np.transpose(act,axes=[0,3,1,2])
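The two helpers above are inverse permutations of each other; a standalone numpy check with random data (shapes chosen to resemble an 11x11x64 conv activation) illustrates the round trip:

```python
import numpy as np

act_nchw = np.random.rand(1, 64, 11, 11)              # native NCHW activation
act_nhwc = np.transpose(act_nchw, axes=[0, 2, 3, 1])  # canonical_activation_representation
back = np.transpose(act_nhwc, axes=[0, 3, 1, 2])      # native_activation_representation

assert act_nhwc.shape == (1, 11, 11, 64)
assert np.array_equal(back, act_nchw)  # the permutations invert each other
```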
  
#OpenAI's evolution strategy algorithm
class RL_ES(RL_model):
  weights = [
      {'name':'es/layer1/conv1/w'},
      {'name':'es/layer2/conv2/w'},
      {'name':'es/layer3/conv3/w'},
  ]

  layers = [
     {'type': 'conv', 'name': 'es/layer1/Relu', 'size': 32},
     {'type': 'conv', 'name': 'es/layer2/Relu', 'size': 64},
     {'type': 'conv', 'name': 'es/layer3/Relu', 'size': 64},
     {'type': 'dense', 'name': 'es/layer4/Relu', 'size': 512},
     {'type': 'dense', 'name': 'es/layer5/out/out', 'size':18}
   ]

  def preprocess_weight(self,x):
      return x[0]

  def get_action(self,model):
        policy = model(self.layers[-1]['name']) 
        action_sample = tf.argmax(policy, axis=-1)
        return action_sample

#Uber's Deep GA
class RL_GA(RL_model):
  layers = [
     {'type': 'conv', 'name': 'ga/conv1/relu', 'size': 32},
     {'type': 'conv', 'name': 'ga/conv2/relu', 'size': 64},
     {'type': 'conv', 'name': 'ga/conv3/relu', 'size': 64},
     {'type': 'dense', 'name': 'ga/fc/relu', 'size': 512},
     {'type': 'dense', 'name': 'ga/out/signal', 'size':18}
   ]

  weights = [
      {'name':'ga/conv1/w'},
      {'name':'ga/conv2/w'},
      {'name':'ga/conv3/w'},
  ]

  def get_action(self,model):
        policy = model(self.layers[-1]['name'])
        action_sample = tf.argmax(policy, axis=-1)
        return action_sample

  def preprocess_weight(self,x):
      return x[0]

#Ape-X (recent high-performing DQN variant)
class RL_Apex(RL_model):
  channel_order = "NCHW"

  #note: action_value/Relu also worth considering...
  layers = [
     {'type': 'conv', 'name': 'deepq/q_func/convnet/Conv/Relu', 'size': 32},
     {'type': 'conv', 'name': 'deepq/q_func/convnet/Conv_1/Relu', 'size': 64},
     {'type': 'conv', 'name': 'deepq/q_func/convnet/Conv_2/Relu', 'size': 64},
     {'type': 'dense', 'name': 'deepq/q_func/state_value/Relu', 'size': 512},
     {'type': 'dense', 'name': 'deepq/q_func/q_values', 'size':18}
   ]

  weights = [
      {'name':'deepq/q_func/convnet/Conv/weights'},
      {'name':'deepq/q_func/convnet/Conv_1/weights'},
      {'name':'deepq/q_func/convnet/Conv_2/weights'}
  ]
 
  def get_action(self,model):
        policy = model(self.layers[-1]['name'])
        action_sample = tf.argmax(policy, axis=-1)
        return action_sample

#DQN from dopamine model dump
class RL_DQN_dopamine(RL_model):
  #ph_type = 'uint8'
  input_scale = 255.0
  preprocess_style = 'dopamine'
  image_value_range = (0, 255) 
  input_name = 'Online/Cast'
  valid_run_range = (1,3)

  weights = [
      {'name':'Online/Conv/weights'},
      {'name':'Online/Conv_1/weights'},
      {'name':'Online/Conv_2/weights'}
  ]

  layers = [
     {'type': 'conv', 'name': 'Online/Conv/Relu', 'size': 32},
     {'type': 'conv', 'name': 'Online/Conv_1/Relu', 'size': 64},
     {'type': 'conv', 'name': 'Online/Conv_2/Relu', 'size': 64},
     {'type': 'dense', 'name': 'Online/fully_connected/Relu', 'size': 512},
     {'type': 'dense', 'name': 'Online/fully_connected_1/BiasAdd', 'size':18}
   ]
 
  def get_action(self,model):
        policy = model(self.layers[-1]['name']) 
        action_sample = tf.argmax(policy, axis=1)
        return action_sample

  def get_log(self):
    raise NotImplementedError("Integration with Dopamine log formatting not yet complete.")

  def get_checkpoint_info(self):
    raise NotImplementedError("Dopamine models include only the final checkpoint.")

#Rainbow (slightly older high-performing DQN variant)
class RL_Rainbow_dopamine(RL_model):
  #ph_type = 'uint8'
  valid_run_range = (1,5)
  preprocess_style = 'dopamine'
  input_scale = 255.0
  image_value_range = (0, 255) 
  #input_name = 'state_ph'
  input_name = 'Online/Cast'

  weights = [
      {'name':'Online/Conv/weights'},
      {'name':'Online/Conv_1/weights'},
      {'name':'Online/Conv_2/weights'}
  ]

  layers = [
     {'type': 'conv', 'name': 'Online/Conv/Relu', 'size': 32},
     {'type': 'conv', 'name': 'Online/Conv_1/Relu', 'size': 64},
     {'type': 'conv', 'name': 'Online/Conv_2/Relu', 'size': 64},
     {'type': 'dense', 'name': 'Online/fully_connected/Relu', 'size': 512},
     #{'type': 'dense', 'name': 'Online/fully_connected_1/BiasAdd', 'size':18}
     {'type': 'dense', 'name': 'Online/Sum', 'size':18}
   ]

  additional_layers={'c51':{'type':'dense','name': 'Online/fully_connected_1/BiasAdd', 'size':18*51}}

 
  def get_action(self,model):
        policy = model(self.layers[-1]['name'])
        action_sample = tf.argmax(policy, axis=1)
        return action_sample

  def get_log(self):
    raise NotImplementedError("Integration with Dopamine log formatting not yet complete.")

  def get_checkpoint_info(self):
    raise NotImplementedError("Dopamine models include only the final checkpoint.")

#A2C -- policy gradient algorithm
class RL_A2C(RL_model):
  weights = [
      {'name':'a2c/conv1/weights'},
      {'name':'a2c/conv2/weights'},
      {'name':'a2c/conv3/weights'}
  ]

  layers = [
     {'type': 'conv', 'name': 'a2c/conv1/Relu', 'size': 32},
     {'type': 'conv', 'name': 'a2c/conv2/Relu', 'size': 64},
     {'type': 'conv', 'name': 'a2c/conv3/Relu', 'size': 64},
     {'type': 'dense', 'name': 'a2c/fc/Relu', 'size': 512},
     #TODO: enable accessing a2c's value head as well!
     #{'type': 'dense', 'name': 'a2c/value/BiasAdd', 'size':18},
     {'type': 'dense', 'name': 'a2c/policy/BiasAdd', 'size':18}
   ]
  
  def get_action(self,model):
        policy = model(self.layers[-1]['name']) 
        rand_u = tf.random_uniform(tf.shape(policy))
        action_sample = tf.argmax(policy - tf.log(-tf.log(rand_u)), axis=-1)
        return action_sample

class RL_IMPALA(RL_model):
  input_name = 'agent_1/agent/unroll/batch_apply/truediv'
  preprocess_style = 'np'

  weights = [
      {'name':'agent/batch_apply/convnet/conv_2d/w'},
      {'name':'agent/batch_apply/convnet/conv_2d_1/w'},
      {'name':'agent/batch_apply/convnet/conv_2d_2/w'},
  ]

  layers = [
     {'type': 'conv', 'name': 'agent_1/agent/unroll/batch_apply/convnet/Relu', 'size': 32},
     {'type': 'conv', 'name': 'agent_1/agent/unroll/batch_apply/convnet/Relu_1', 'size': 64},
     {'type': 'conv', 'name': 'agent_1/agent/unroll/batch_apply/convnet/Relu_2', 'size': 64},
     {'type': 'dense', 'name': 'agent_1/agent/unroll/batch_apply/Relu', 'size': 512},
     {'type': 'dense', 'name': 'agent_1/agent/unroll/batch_apply_1/policy_logits/add', 'size': 18},
   ]

  def get_action(self,model):
        policy_logits = model(self.layers[-1]['name'])
        new_action = tf.multinomial(policy_logits, num_samples=1,
                output_dtype=tf.int32)
        new_action = tf.squeeze(new_action, 1, name='new_action')
        return new_action

### Instantiate concrete models using python magic
class_map = {'ga':RL_GA,'es':RL_ES,'apex':RL_Apex,'a2c':RL_A2C,'dqn':RL_DQN_dopamine,'rainbow':RL_Rainbow_dopamine, 'impala':RL_IMPALA}

#helper utility to make new python model classes
def _MakeAtariModel(model_class,name,environment,model_path,run_id,algorithm,log_path,data_path):
    #find number of actions in this particular game
    num_actions = game_action_counts[environment]

    #change last layer size to reflect available actions;
    #copy the layer dicts so the shared class-level definitions aren't mutated
    layers = [dict(layer) for layer in model_class.layers] #python2.7 compatible
    layers[-1]['size']=num_actions

    #create inherited class with correct properties (hack?)
    return type('Atari'+name,(model_class,),{'model_path':model_path,'environment':environment,'layers':layers,'run_id':run_id,'algorithm':algorithm,'log_path':log_path,'data_path':data_path})
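The three-argument `type(name, bases, dict)` call above builds the concrete model subclass at runtime; a minimal standalone illustration of the same pattern (toy class and attribute names, not the real model classes):

```python
class Base(object):
    layers = [{'size': 18}]

# copy the layer dicts before resizing so the shared
# class-level definition is not mutated
layers = [dict(layer) for layer in Base.layers]
layers[-1]['size'] = 4

Derived = type('AtariToy', (Base,), {'layers': layers, 'environment': 'PongNoFrameskip-v4'})

assert issubclass(Derived, Base)
assert Derived.layers[-1]['size'] == 4
assert Base.layers[-1]['size'] == 18  # base class untouched
```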

"""
Helper function to get paths to model, rollout data, and log
for a particular algo/env/run combo
"""
def GetFilePathsForModel(algo,environment,run_no,tag='final',local=False):

    #if loading off of local disk (rare; only for development)
    if local:
        data_root = datadir_local_dict[algo]
        if tag==None:
            model_path = "%s/%s/model%d.pb" % (data_root,environment,run_no)
            data_path = "%s/%s/model%d_rollout.npz" % (data_root,environment,run_no)
        else:
            model_path = "%s/%s/model%d_%s.pb" % (data_root,environment,run_no,tag)
            data_path = "%s/%s/model%d_%s_rollout.npz" % (data_root,environment,run_no,tag)

        log_path = "%s/checkpoints/%s_%d" % (data_root,environment,run_no)

    #otherwise if loading off the canonical remote server (most common)
    else:
        data_root = datadir_remote_dict[algo]
        if tag==None:
            model_path = "%s/%s/model%d.pb" % (data_root,environment,run_no)
            data_path = "%s/%s/model%d_rollout.npz" % (data_root,environment,run_no)
        else:
            model_path = "%s/%s/model%d_%s.pb" % (data_root,environment,run_no,tag)
            data_path = "%s/%s/model%d_%s_rollout.npz" % (data_root,environment,run_no,tag)

        if (algo,'remote') in url_formatter_dict:
            model_path = url_formatter_dict[(algo,'remote')](data_root,algo,environment,run_no)
   
        log_path = "%s/checkpoints/%s_%d" % (data_root,environment,run_no)

    return model_path,data_path,log_path
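For a hypothetical `data_root` (the real roots come from `datadir_remote_dict`), the `%`-format strings above yield paths like the following (standalone sketch):

```python
data_root = "https://example.com/models/a2c"  # hypothetical root, for illustration only
environment, run_no, tag = "SeaquestNoFrameskip-v4", 1, "final"

model_path = "%s/%s/model%d_%s.pb" % (data_root, environment, run_no, tag)
data_path = "%s/%s/model%d_%s_rollout.npz" % (data_root, environment, run_no, tag)
log_path = "%s/checkpoints/%s_%d" % (data_root, environment, run_no)

assert model_path == "https://example.com/models/a2c/SeaquestNoFrameskip-v4/model1_final.pb"
assert data_path.endswith("model1_final_rollout.npz")
assert log_path.endswith("checkpoints/SeaquestNoFrameskip-v4_1")
```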

"""
Function to query for available checkpoints for a model
"""
def GetAvailableTaggedCheckpoints(algo,environment,run_no,local=False):
    _,_,log_path = GetFilePathsForModel(algo,environment,run_no,local=local)
    json_data = atari_zoo.log.load_checkpoint_info(log_path) 
    chkpoint_info = atari_zoo.log.parse_checkpoint_info(json_data)
    return chkpoint_info



"""
Function to load model from the model zoo

algo: Algorithm (ga,es,apex,a2c,dqn,rainbow)
environment: Atari gym environment (e.g. SeaquestNoFrameskip-v4)
run_no: which run of the algorithm
tag: which tag to search for (e.g. 1HR, human, 1B, final)
local: boolean, whether to get the model from a local archive or from the remote server
"""
def MakeAtariModel(algo,environment,run_no,tag='final',local=False):

    model_path,data_path,log_path = GetFilePathsForModel(algo,environment,run_no,tag,local)

    if atari_zoo.config.debug:
        print('Model path:',model_path)
        print('Data path:',data_path)
        print('Log path:',log_path)

    name = "%s_%s_%d_%s" % (algo,environment,run_no,tag)

    model_class = class_map[algo]
   
    valid_run_range = model_class.valid_run_range
    if run_no < valid_run_range[0] or run_no > valid_run_range[1]:
        raise ValueError("Requested run %d out of range (%d,%d)"%(run_no,valid_run_range[0],valid_run_range[1]))

    return _MakeAtariModel(class_map[algo],name,environment,model_path,run_no,algo,log_path,data_path)

if __name__=='__main__':
    #easy!
    SeaquestRainbow = MakeAtariModel('rainbow','SeaquestNoFrameskip-v4',2,tag="final",local=False)
    model = SeaquestRainbow()
    model.load_graphdef()
    model.import_graph()
    print("Done")


================================================
FILE: atari_zoo/rollout.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

from pdb import set_trace as bb
import argparse
#from utils import *
#from models import *
#from ga_vis import create_ga_model 
import pickle

import lucid
from lucid.modelzoo.vision_base import Model
from lucid.misc.io import show
import lucid.optvis.objectives as objectives
import lucid.optvis.param as param
import lucid.optvis.transform as transform
import lucid.optvis.render as render
import tensorflow as tf

from atari_zoo import MakeAtariModel
from lucid.optvis.render import import_model
import gym
import atari_zoo.atari_wrappers as atari_wrappers
import numpy as np
import random
from atari_zoo.dopamine_preprocessing import AtariPreprocessing as DopamineAtariPreprocessing	
from atari_zoo.atari_wrappers import FireResetEnv, NoopResetEnv, MaxAndSkipEnv,WarpFrameTF,FrameStack,ScaledFloatFrame 

class dotdict(dict):
    """dot.notation access to dictionary attributes"""
    __getattr__ = dict.get
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__
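Because `__getattr__` is bound to `dict.get`, missing keys resolve to `None` instead of raising `AttributeError`, which is what lets `generate_rollout` probe optional flags like `args.cpu` safely. A standalone check (class redefined here so the snippet runs on its own):

```python
class dotdict(dict):
    """dot.notation access to dictionary attributes"""
    __getattr__ = dict.get
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

args = dotdict({'render': False, 'max_frames': 2500})
assert args.max_frames == 2500
assert args.cpu is None        # absent key -> None, not AttributeError
args.verbose = True            # attribute assignment writes through to the dict
assert args['verbose'] is True
```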

def generate_rollout(model,args=None,action_noise=0.0,parameter_noise=0.0,observation_noise=0.0,test_eps=1,max_frames=2500,min_frames=2500,output='',sticky_action_prob=0.0,render=False,cpu=False,streamline=False,verbose=False):

    if args is None:
        arg_dict = {'parameter_noise':parameter_noise,
                    'observation_noise':observation_noise,
                    'test_eps':test_eps,
                    'max_frames':max_frames,
                    'min_frames':min_frames,
                    'output':output,
                    'sticky_action_prob':sticky_action_prob,
                    'render':render,
                    'cpu':cpu,
                    'streamline':streamline,
                    'action_noise':action_noise,
                    'verbose':verbose
                    }
        args = dotdict(arg_dict)
    
    #sticky actions, following Machado et al. (2018)
    sticky_action_prob = args.sticky_action_prob
    
    m = model 

    preprocessing = m.preprocess_style

    m.load_graphdef()

    #modify graphdef with gaussian noise
    if args.parameter_noise > 0.0:
        perturb_count = 0
        layer_names = [z['name'] for z in m.weights]

        #search for known-named nodes
        for node in m.graph_def.node:
            if node.name in layer_names:
                #black magic
                tensor = node.attr.get('value').tensor
                array = np.frombuffer(tensor.tensor_content,np.float32).copy()
                array += np.random.normal(0,args.parameter_noise,array.shape)
                tensor.tensor_content = array.tobytes()
                perturb_count+=1

        #print(perturb_count)
        #[n.name for n in m.graph_def.node if n.name.find("conv")!=-1]
        #bb()

        #should hit 3 conv layers
        assert perturb_count == 3
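The "black magic" above rewrites a frozen graph's constant tensors in place through their raw `tensor_content` bytes; the byte-level round trip it relies on can be checked in isolation with pure numpy (no TensorFlow, stand-in weights):

```python
import numpy as np

weights = np.arange(6, dtype=np.float32)  # stand-in for tensor.tensor_content
raw = weights.tobytes()

# .copy() is required: np.frombuffer returns a read-only view
array = np.frombuffer(raw, np.float32).copy()
array += np.random.default_rng(0).normal(0, 0.01, array.shape)

restored = np.frombuffer(array.tobytes(), np.float32)
assert restored.shape == weights.shape
assert restored.dtype == np.float32
assert np.allclose(restored, weights, atol=0.1)  # small perturbation survives the round trip
```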

    dev_cnt = 1
    if args.cpu:
        dev_cnt = 0
    #for rollouts maybe don't use GPU?
    config = tf.ConfigProto(
            device_count = {'GPU': dev_cnt}
        )
    config.gpu_options.allow_growth=True

    with tf.Graph().as_default() as graph, tf.Session(config=config) as sess:
 
        if preprocessing == 'dopamine': #dopamine-style preprocessing
            env = gym.make(m.environment)
            if hasattr(env,'unwrapped'):
                env = env.unwrapped
            env = DopamineAtariPreprocessing(env)
            env = FrameStack(env, 4)
            env = ScaledFloatFrame(env,scale=1.0/255.0)
        elif preprocessing == 'np': #use numpy preprocessing
            env = gym.make(m.environment)
            env = atari_wrappers.wrap_deepmind(env, episode_life=False,preproc='np')
        else:  #use tensorflow preprocessing
            env = gym.make(m.environment)
            env = atari_wrappers.wrap_deepmind(env, episode_life=False,preproc='tf')

        nA = env.action_space.n
        X_t = tf.placeholder(tf.float32, [None] + list(env.observation_space.shape))

        T = import_model(m,X_t,X_t)
        action_sample = m.get_action(T)

        #get intermediate level representations
        activations = [T(layer['name']) for layer in m.layers]
        high_level_rep = activations[-2] #not output layer, but layer before

        sample_observations = []
        sample_frames = []
        sample_ram = []
        sample_representation = []
        sample_score = []

        obs = env.reset()

        ep_count = 0
        rewards = []; ep_rew = 0.
        frame_count = 0
    
        prev_action = None

        # Evaluate policy over test_eps episodes
        while ep_count < args.test_eps or frame_count<=args.min_frames:
            if args.render:
                env.render()

            #potentially add observation noise
            if args.observation_noise>0.0:
                obs += np.random.normal(0,args.observation_noise,obs.shape)


            train_dict = {X_t:obs[None]}
            if args.streamline:
                results = sess.run([action_sample], feed_dict=train_dict)
                #grab action
                act = results[0]
            else:
                results = sess.run([action_sample,high_level_rep], feed_dict=train_dict)

                #grab action
                act = results[0]

                #get high-level representation
                representation = results[1][0]

            if not args.streamline:
                frame = env.render(mode='rgb_array')
                sample_frames.append(np.array(frame,dtype=np.uint8))
                sample_ram.append(env.unwrapped._get_ram())
                sample_representation.append(representation)
                sample_observations.append(np.array(obs))

            sample_score.append(ep_rew)

            if args.action_noise > 0.0:
                if random.random() < args.action_noise:
                    act = random.randint(0,nA-1)

            if prev_action is not None and random.random() < sticky_action_prob:
                act = prev_action

            prev_action = act

            obs, rew, done, info = env.step(np.squeeze(act))

            ep_rew += rew
            frame_count+=1

            if frame_count >= args.max_frames:
                done=True

            if done:
                obs = env.reset()
                ep_count += 1
                rewards.append(ep_rew)
                ep_rew = 0.

        if args.verbose:
            print("Avg. Episode Reward: ", np.mean(rewards))
            print("rewards:",rewards)
            print("frames:",frame_count)

        results = {'observations':sample_observations,'frames':sample_frames,'ram':sample_ram,'representation':sample_representation,'score':sample_score,'ep_rewards':rewards}

        if args.output!='':
            np.savez_compressed(args.output + "_rollout",**results)

        return results 


#TODO wrap this as a function call, so you can do multiple rollouts
def main():
    """
    Rolls out a model in the atari emulator -- can render it to screen, and also
    can save out image and observation sequences.
    """
    parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
    parser.add_argument('--test_eps', help='number of test episodes', default=1, type=int)
    parser.add_argument('--algo', help='choose from [es, a2c, dqn]', type=str)
    parser.add_argument('--environment', type=str)
    parser.add_argument('--run_id',type=int, default=1)
    parser.add_argument('--render', action='store_true')
    parser.add_argument('--output', type=str, default="")
    parser.add_argument('--max_frames', type=int, default=int(1e8))
    parser.add_argument('--min_frames', type=int, default=0)
    parser.add_argument('--observation_noise', type=float, default=0.0)
    parser.add_argument('--parameter_noise', type=float, default=0.0)
    parser.add_argument('--tag', type=str, default=None)
    parser.add_argument('--cpu', action='store_true')
    parser.add_argument('--streamline', action='store_true')
    parser.add_argument('--local', action='store_true')
    parser.add_argument('--sticky_action_prob', type=float,default=0.0)
    parser.add_argument('--action_noise', type=float,default=0.0)
    parser.add_argument('--verbose', action="store_true")


    args = parser.parse_args()
    
    m = MakeAtariModel(args.algo,args.environment,args.run_id,tag=args.tag,local=args.local)()
   
    results = generate_rollout(model=m,args=args)

    exit()

if __name__=="__main__":
    #generate_rollout(blah="blah2")
    main()


================================================
FILE: atari_zoo/scores.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

from atari_zoo.translate import translate_game_name

def get_random_agent_scores(game):
    global random_play_scores

    translated_name = translate_game_name(game,'canonical','apex')

    return random_play_scores[translated_name]

def get_human_scores(game):
    global human_scores

    translated_name = translate_game_name(game,'canonical','apex')

    return human_scores[translated_name]

human_scores = {
    "alien": 6875,
    "amidar": 1676,
    "assault": 1496,
    "asterix": 8503,
    "asteroids": 13157,
    "atlantis": 29028,
    "bank_heist": 734.4,
    "battle_zone": 37800,
    "beam_rider": 5775,
    "bowling": 154.8,
    "boxing": 4.3,
    "breakout": 31.8,
    "centipede": 11963,
    "chopper_command": 9882,
    "crazy_climber": 35411,
    "demon_attack": 3401,
    "double_dunk": -15.5,
    "enduro": 309.6,
    "fishing_derby": 5.5,
    "freeway": 29.6,
    "frostbite": 4335,
    "gopher": 2321,
    "gravitar": 2672,
    "hero": 25763,
    "ice_hockey": 0.9,
    "jamesbond": 406.7,
    "kangaroo": 3035,
    "krull": 2395,
    "kung_fu_master": 22736,
    "montezuma_revenge": 4367,
    "ms_pacman": 15693,
    "name_this_game": 4076,
    "pong": 9.3,
    "private_eye": 69571,
    "qbert": 13455,
    "riverraid": 13513,
    "road_runner": 7845,
    "robotank": 11.9,
    "seaquest": 20182,
    "space_invaders": 1652,
    "star_gunner": 10250,
    "tennis": -8.9,
    "time_pilot": 5925,
    "tutankham": 167.6,
    "up_n_down": 9082,
    "venture": 1188,
    "video_pinball": 17298,
    "wizard_of_wor": 4757,
    "zaxxon": 9173,
}

random_play_scores = {
    "alien": 227.8,
    "amidar": 5.8,
    "assault": 222.4,
    "asterix": 210,
    "asteroids": 719.1,
    "atlantis": 12850,
    "bank_heist": 14.2,
    "battle_zone": 2360,
    "beam_rider": 363.9,
    "bowling": 23.1,
    "boxing": 0.1,
    "breakout": 1.7,
    "centipede": 2091,
    "chopper_command": 811,
    "crazy_climber": 10781,
    "demon_attack": 152.1,
    "double_dunk": -18.6,
    "enduro": 0,
    "fishing_derby": -91.7,
    "freeway": 0,
    "frostbite": 65.2,
    "gopher": 257.6,
    "gravitar": 173,
    "hero": 1027,
    "ice_hockey": -11.2,
    "jamesbond": 29,
    "kangaroo": 52,
    "krull": 1598,
    "kung_fu_master": 258.5,
    "montezuma_revenge": 0,
    "ms_pacman": 307.3,
    "name_this_game": 2292,
    "pong": -20.7,
    "private_eye": 24.9,
    "qbert": 163.9,
    "riverraid": 1339,
    "road_runner": 11.5,
    "robotank": 2.2,
    "seaquest": 68.4,
    "space_invaders": 148,
    "skiing":-16679.9,
    "star_gunner": 664,
    "tennis": -23.8,
    "time_pilot": 3568,
    "tutankham": 11.4,
    "up_n_down": 533.4,
    "venture": 0,
    "video_pinball": 16257,
    "wizard_of_wor": 563.5,
    "zaxxon": 32.5,
}
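These two tables are usually combined into a human-normalized score, `(agent - random) / (human - random)`, the convention from the DQN literature; the helper below is an illustration only (hypothetical function name, tiny excerpt of the tables above):

```python
human_scores = {"pong": 9.3}
random_play_scores = {"pong": -20.7}

def human_normalized(game, agent_score):
    """0.0 = random play, 1.0 = the human baseline."""
    lo, hi = random_play_scores[game], human_scores[game]
    return (agent_score - lo) / (hi - lo)

assert human_normalized("pong", -20.7) == 0.0  # random-level play
assert human_normalized("pong", 9.3) == 1.0    # human-level play
assert human_normalized("pong", 21.0) > 1.0    # a perfect Pong agent is superhuman
```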



================================================
FILE: atari_zoo/synthetic_inputs.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

import tensorflow as tf
import lucid
import atari_zoo
from atari_zoo import MakeAtariModel

from lucid.misc.io import show
import lucid.optvis.objectives as objectives
import lucid.optvis.param as param
import lucid.optvis.transform as transform
import lucid.optvis.render as render

from lucid.optvis.param.color import to_valid_rgb
from lucid.optvis.param.spatial import naive, fft_image
import pylab
from lucid.optvis.objectives import wrap_objective, Objective
import matplotlib
import numpy as np
from lucid.misc.io import load, save, show
from lucid.misc.io.showing import images

#call to create raw image
def image(shape, add_noise=False):
  if add_noise:
    raw_frames = lucid.optvis.param.spatial.naive(shape, sd=0.5)
  else:
    raw_frames = lucid.optvis.param.spatial.naive(shape)
  processed_frames = tf.nn.sigmoid(raw_frames)
  return processed_frames

#optimize only the current frame, leaving the past 3 frames zero
def only_current_frame(shape):
    shape_1 = shape[:-1]+[1,]
    
    shape_2 = shape[:]
    shape_2[-1] -= 1
    
    print(shape_1,shape_2)
    
    current_frame = lucid.optvis.param.spatial.naive(shape_1)
    zero_frames = tf.zeros(shape_2)
       
    processed_current = tf.nn.sigmoid(current_frame)
    processed_frames = tf.concat([zero_frames,processed_current],-1)
    return processed_frames

#create lucid objective functions that work with different channel orderings
@wrap_objective
def channel(layer, n_channel, ordering="NHWC"):
  """Tensor-order aware version of channel lucid objective"""
  if ordering=='NCHW':
    return lambda T: tf.reduce_mean(tf.transpose(T(layer),perm=[0,2,3,1])[...,n_channel])
  else:
    return lambda T: tf.reduce_mean(T(layer)[..., n_channel])

#an L2 penalty only for a specific channel of the input image
@wrap_objective
def L2c(layer="input", constant=0, epsilon=1e-6, batch=None,channel=0):
  """L2 norm of layer. Generally used as penalty."""
  if batch is None:
    return lambda T: tf.sqrt(epsilon + tf.reduce_sum((T(layer)[...,channel] - constant) ** 2))
  else:
    return lambda T: tf.sqrt(epsilon + tf.reduce_sum((T(layer)[batch,...,channel] - constant) ** 2))

@wrap_objective
def direction_cossim(layer, vec, ordering="NHWC"):
  """Visualize a direction (cosine similarity)"""
  def inner(T):
    #bring channels to the last axis so vec broadcasts over them
    if ordering=='NCHW':
        _layer = tf.transpose(T(layer),perm=[0,2,3,1])
    else:
        _layer = T(layer)

    act_mags = tf.sqrt(tf.reduce_sum(_layer**2, -1, keepdims=True))
    vec_mag = tf.sqrt(tf.reduce_sum(vec**2))

    mags = act_mags * vec_mag
    return tf.reduce_mean(_layer * vec.reshape([1, 1, 1, -1]) / mags)

  return inner
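The objective averages, per spatial location, the cosine similarity between the activation vector and `vec`; the underlying quantity for a single location, in plain numpy (standalone sketch, not part of the module):

```python
import numpy as np

def cosine_similarity(act, vec):
    """Cosine of the angle between one activation vector and a direction."""
    return float(np.dot(act, vec) / (np.linalg.norm(act) * np.linalg.norm(vec)))

a = np.array([1.0, 2.0, 2.0])
assert cosine_similarity(a, a) == 1.0                                # parallel
assert cosine_similarity(a, -a) == -1.0                              # anti-parallel
assert abs(cosine_similarity(a, np.array([2.0, -1.0, 0.0]))) == 0.0  # orthogonal
```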

@wrap_objective
def direction_neuroncossim(layer, vec, ordering="NHWC"):
  """Visualize a direction at the center neuron (cosine similarity)"""
  def inner(T):
    #bring channels to the last axis so vec broadcasts over them
    if ordering=='NCHW':
        _layer = tf.transpose(T(layer),perm=[0,2,3,1])
    else:
        _layer = T(layer)

    act_mags = tf.sqrt(tf.reduce_sum(_layer[:,5:6,5:6,:]**2, -1, keepdims=True))
    vec_mag = tf.sqrt(tf.reduce_sum(vec**2))

    mags = act_mags * vec_mag
    return tf.reduce_mean(_layer[:,5:6,5:6,:] * vec.reshape([1, 1, 1, -1]) / mags)

  return inner


def make_regularization(L1=0.0,L2=0.0,TV=0.0):
    return -L1*objectives.L1()-L2*objectives.L2()-TV*objectives.total_variation()


def visualize_neuron(algo='apex',env='SeaquestNoFrameskip-v4',run_id=1,tag="final",param_f=lambda: image([1,84,84,4]),do_render=False,
                     transforms=[transform.jitter(3),],layer_no=0,neuron=0,regularization=0,**params):
    tf.reset_default_graph()
    
    m = MakeAtariModel(algo,env,run_id,tag,local=False)()
    m.load_graphdef()
   
    if(m.layers[layer_no]['type']=='dense'):
        obj = objectives.channel(m.layers[layer_no]['name'],neuron)
    else:
        obj = channel(m.layers[layer_no]['name'],neuron,ordering=m.channel_order)

    out = optimize_input(obj+regularization,m,param_f,transforms,do_render=do_render,**params)
    return out


#differentiable image parameterizations
from tensorflow.contrib import slim
import numpy as np

#CPPN setup
def composite_activation(x):
  x = tf.atan(x)
  # Coefficients computed by:
  #   def rms(x):
  #     return np.sqrt((x*x).mean())
  #   a = np.arctan(np.random.normal(0.0, 1.0, 10**6))
  #   print(rms(a), rms(a*a))
  return tf.concat([x/0.67, (x*x)/0.6], -1)


def composite_activation_unbiased(x):
  x = tf.atan(x)
  # Coefficients computed by:
  #   a = np.arctan(np.random.normal(0.0, 1.0, 10**6))
  #   aa = a*a
  #   print(a.std(), aa.mean(), aa.std())
  return tf.concat([x/0.67, (x*x-0.45)/0.396], -1)


def relu_normalized(x):
  x = tf.nn.relu(x)
  # Coefficients computed by:
  #   a = np.random.normal(0.0, 1.0, 10**6)
  #   a = np.maximum(a, 0.0)
  #   print(a.mean(), a.std())
  return (x-0.40)/0.58
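Each of these activation functions divides by empirically measured statistics so that activations stay roughly zero-mean and unit-variance through the CPPN's depth; the `relu_normalized` constants can be verified directly in numpy (seeded, so deterministic):

```python
import numpy as np

rng = np.random.default_rng(0)
a = np.maximum(rng.normal(0.0, 1.0, 10**6), 0.0)  # relu of a unit gaussian
b = (a - 0.40) / 0.58                             # the normalization used above

# rescaled activations are approximately zero-mean, unit-variance
assert abs(b.mean()) < 0.02
assert abs(b.std() - 1.0) < 0.02
```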


def image_cppn(
    size,
    num_output_channels=1,
    num_hidden_channels=24,
    num_layers=8,
    activation_fn=composite_activation,
    normalize=False):
  r = 3.0**0.5  # std(coord_range) == 1.0
  coord_range = tf.linspace(-r, r, size)
  y, x = tf.meshgrid(coord_range, coord_range, indexing='ij')
  net = tf.expand_dims(tf.stack([x, y], -1), 0)  # add batch dimension

  with slim.arg_scope([slim.conv2d], kernel_size=1, activation_fn=None):
    for i in range(num_layers):
      in_n = int(net.shape[-1])
      net = slim.conv2d(
          net, num_hidden_channels,
          # this is untruncated version of tf.variance_scaling_initializer
          weights_initializer=tf.random_normal_initializer(0.0, np.sqrt(1.0/in_n)),
      )
      if normalize:
        net = slim.instance_norm(net)
      net = activation_fn(net)
      
    rgb = slim.conv2d(net, num_output_channels, activation_fn=tf.nn.sigmoid,
                      weights_initializer=tf.zeros_initializer())
  
  return rgb
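The `r = 3.0**0.5` coordinate range above is chosen so that the CPPN's input coordinates have approximately unit standard deviation, as the inline comment claims. A standalone NumPy check of the same grid:

```python
import numpy as np

r = 3.0 ** 0.5
coords = np.linspace(-r, r, 84)  # same grid as tf.linspace(-r, r, size) above
print(coords.std())  # close to 1.0 (exactly sqrt((n+1)/(n-1)) for n points)
```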

def render_feature(
    model,
    cppn_f = lambda: image_cppn(84),
    optimizer = None,
    objective = objectives.channel('noname', 0),transforms=[]):
  # default optimizer is created lazily so no TF op is built at import time
  if optimizer is None:
    optimizer = tf.train.AdamOptimizer(0.001)
  vis = render.render_vis(model, objective, param_f=cppn_f, optimizer=optimizer, transforms=transforms, thresholds=[2**i for i in range(5,10)], verbose=False)
  return vis

#video rendering code...
from lucid.misc.io.serialize_array import _normalize_array
from lucid.misc.tfutil import create_session
from moviepy.video.io.ffmpeg_writer import FFMPEG_VideoWriter
from IPython.display import clear_output, Image, display, HTML
import moviepy.editor as mpy
from lucid.modelzoo import vision_models
from lucid.misc.io import show, save, load
from lucid.optvis import objectives
from lucid.optvis import render

@wrap_objective
def all_activation(layer, batch=None):
  """Mean activation of a layer (optionally restricted to one batch element)"""
  if batch is None:
    return lambda T: tf.reduce_mean(T(layer))
  else:
    return lambda T: tf.reduce_mean(T(layer)[batch, ...])

cppn_default_f = lambda: image_cppn( 
          size=84, num_layers=8,num_hidden_channels=16,normalize=True, 
          activation_fn=relu_normalized, num_output_channels=4)

#alternative activation_fn choices: composite_activation,
#relu_normalized, composite_activation_unbiased
def optimize_input(obj, model, param_f, transforms, lr=0.05, step_n=512,num_output_channels=4,do_render=False,out_name="out"):

  sess = create_session()

  # Set up optimization problem
  size = 84
  # t_size only affects output resolution if param_f builds its image from
  # this placeholder (e.g. a size-agnostic CPPN); otherwise the feed is a no-op
  t_size = tf.placeholder_with_default(size, [])
  T = render.make_vis_T(
      model, obj, 
      param_f=param_f,
      transforms = transforms,
      optimizer=tf.train.AdamOptimizer(lr),
  )

  tf.global_variables_initializer().run()
 
  if do_render:
      video_fn = out_name + '.mp4'
      writer = FFMPEG_VideoWriter(video_fn, (size, size*4), 60.0)
  
  # Optimization loop
  try:
    for i in range(step_n):
      _, loss, img = sess.run([T("vis_op"), T("loss"), T("input")])

      if do_render:
          if num_output_channels==1:
              #single channel: tile it to RGB
              img=img[...,-1:]
              img=np.tile(img,3)
          else:
              #stack the 4 channels vertically, then tile to RGB
              img=img.transpose([0,3,1,2])
              img=img.reshape([84*4,84,1])
              img=np.tile(img,3)
          writer.write_frame(_normalize_array(img))
          if i > 0 and i % 50 == 0:
              clear_output()
              print("%d / %d  score: %f"%(i, step_n, loss))
              show(img)

  except KeyboardInterrupt:
    pass
  finally:
    if do_render:
        print("closing...")
        writer.close()
  
  # Save trained variables
  if do_render:
      train_vars = sess.graph.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES)
      params = np.array(sess.run(train_vars), object)
      save(params, out_name + '.npy')
  
      # Save final image
      final_img = T("input").eval({t_size: 600})[...,-1:] #change size
      save(final_img, out_name+'.jpg', quality=90)

  out = T("input").eval({t_size: 84})
  sess.close()
  return out

 


================================================
FILE: atari_zoo/top_patches.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.




# Code to extract images patches that maximally activate particular neurons
# in an Atari convnet. Note that right now this code is specifically fit
# to the Atari convnet structure, and would need adaptation and generalization
# to fit to arbitrary structures. This would likely be non-trivial because it
# requires some reflection on the structure of the network (reasoning about
# pooling / convs) to calculate receptive fields at particular layers, etc.

import sys

import tensorflow as tf
import lucid
import numpy as np
import atari_zoo
from atari_zoo import MakeAtariModel
from atari_zoo.rollout import generate_rollout

from lucid.misc.io import show
import lucid.optvis.objectives as objectives
import lucid.optvis.param as param
import lucid.optvis.transform as transform
import lucid.optvis.render as render

import atari_zoo.utils
from atari_zoo.utils import conv_activations_to_canvas
from atari_zoo.utils import fc_activations_to_canvas

from lucid.optvis.render import import_model
from matplotlib.pyplot import *

#from IPython import embed


# receptive field at conv3 is 36
# receptive field at conv2 is 20
# receptive field at conv1 is 8 (8x8 conv...)
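Those receptive-field numbers follow from the usual composition rule for stacked convolutions, `rf_k = rf_{k-1} + (kernel-1)*jump` with `jump_k = jump_{k-1}*stride`, applied to this convnet's stack (kernels 8/4/3, strides 4/2/1). A quick standalone sketch of that arithmetic:

```python
def receptive_fields(kernels=(8, 4, 3), strides=(4, 2, 1)):
    """Receptive field and jump (stride in input pixels) per conv layer."""
    rf, jump = 1, 1
    out = []
    for k, s in zip(kernels, strides):
        rf = rf + (k - 1) * jump   # field widens by (kernel-1) input-spaced taps
        jump = jump * s            # spacing of adjacent units, in input pixels
        out.append((rf, jump))
    return out

print(receptive_fields())  # [(8, 4), (20, 8), (36, 8)]
```

The result matches the (receptive field, stride) pairs used by `get_obs_patch` below.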

def pad_image(image, padSize, pad_values=0.):
    """ 
    Pad an image by the same amount on all four sides,
    simulating a receptive field that can extend beyond the original image.
    image: shape (batch, h, w, c) or (h, w, c)
    padSize: integer, number of pixels to pad on each side
    pad_values: value to pad with
    """
    if len(image.shape) == 4: # (batch, h, w, c)
        pads = ((0,0), (padSize,padSize),(padSize,padSize), (0,0))
    elif len(image.shape) == 3: # (h, w, c)
        pads = ((padSize,padSize),(padSize,padSize), (0,0))
    else: 
        raise ValueError('Unsupported representation shape {}'.format(image.shape))
    ret = np.pad(image, pads, 'constant', constant_values=pad_values)
    return ret
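For example, with the default padding of 20 used below, an (84, 84, 4) observation grows to (124, 124, 4). A standalone `np.pad` sketch equivalent to `pad_image` for the 3-d case:

```python
import numpy as np

obs = np.zeros((84, 84, 4), dtype=np.float32)
padSize = 20  # 4 + 2*4 + 1*8, the default pad_each_side used below
padded = np.pad(obs, ((padSize, padSize), (padSize, padSize), (0, 0)),
                'constant', constant_values=0.)
print(padded.shape)  # (124, 124, 4)
```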

def get_obs_patch(observation, ii, jj, receptive_stride=(36,8), pad_each_side=4+2*4+1*8,plot=False):
    """ Get a patch from an observation matrix corresponding to
    an (ii, jj) location at a layer higher up
    observation: (batch, h, w, c), normally (batch, 84, 84, 4)
    ii: integer index in the h dimension
    jj: integer index in the w dimension
    receptive_stride: a tuple (receptive field size, stride) describing how the higher-up
        layer where (ii, jj) is located maps back onto the observation.
        For the networks used in this application, the three conv layers have, respectively, 
        (8,4), (20,8), (36,8)
        onto the original observation.
    pad_each_side: how much the observation should be padded, since receptive fields near
        the border extend outside the original image. The three conv layers have filter
        sizes of 8, 4, and 3 and strides of 4, 2, and 1; under "same" padding the eventual
        padding is 4 + 2*4 + 1*8 = 20
    """
    repp = pad_image(observation, pad_each_side)  # pads (84,84,4) to (124,124,4) by default
    
    (rec_size, stride) = receptive_stride

    # the field to look at in observation
    top = int(ii*stride-rec_size/2)
    bot = int(ii*stride+rec_size/2)
    left = int(jj*stride-rec_size/2)
    right = int(jj*stride+rec_size/2)
    #print('Before pad: ', top, bot, left, right)
    print('bottom left location in original obs: ({},{})'.format(bot, left))
    [new_top, new_bot, new_left, new_right] = [k+pad_each_side for k in [top,bot,left,right]]
    #print('After pad: ', new_top, new_bot, new_left, new_right)
    #figure(figsize=(10,4))
    if plot:
        for cc in range(observation.shape[-1]):
            subplot(101+observation.shape[-1]*10+cc)
            #print('bottom left location in padded obs: ({},{})'.format(bot+pad_each_side, left+pad_each_side))
            matshow(repp[new_top:new_bot,new_left:new_right,cc], fignum=0)
    #print(repp[new_top:new_bot,new_left:new_right,cc].shape)
    return repp[new_top:new_bot,new_left:new_right,observation.shape[-1]-1], (top, left)

def build_model_get_act(algo, env, run_id=1, tag='final', local=True, which_layer=2):
    """ Function that builds/loads a model given algorithm algo and environment env, etc.,
    and obtain activations at a specific layer.
    which_layer: the index into layers. 0->Conv1, 1->Conv2, 2->Conv3, 3->FC
    """
    # Activation map shapes: 
    # 0 Online/Conv/Relu (21, 21, 32)
    # 1 Online/Conv_1/Relu (11, 11, 64)
    # 2 Online/Conv_2/Relu (11, 11, 64)
    # 3 Online/fully_connected/Relu (512)
    # 

    #TODO
    # load model
    m = MakeAtariModel(algo, env, run_id, tag=tag)()
    nA = atari_zoo.game_action_counts[env]
    acts_shapes = [(0,21,21,32), (0,11,11,64), (0,11,11,64), (0,512),(0,nA)] 
    # getting frames, observations
    obs = m.get_observations()
    frames = m.get_frames()

    # get the flow ready from observation to the layer activation you want
    m.load_graphdef()
    #get a tf session
    session = atari_zoo.utils.get_session()
    #create a placeholder input to the network
    X_t = tf.placeholder(tf.float32, [None] + m.image_shape)
    #now get access to a dictionary that grabs output layers from the model
    T = import_model(m,X_t,X_t)
    
    # the activation tensor we want
    acts_T = T(m.layers[which_layer]['name'])
    try:
        acts = session.run(acts_T, {X_t: obs})
    except Exception:
        # some models do not allow batch size > 1, so run one observation at a time
        acts = np.empty(acts_shapes[which_layer])
        for obs_1 in obs:
            obs_1 = np.expand_dims(obs_1, axis=0)
            rep_1 = session.run(acts_T, {X_t: obs_1})
            acts = np.append(acts, rep_1, axis=0)
    if m.channel_order=='NCHW':
        acts = np.transpose(acts, axes=[0,2,3,1])

    print('Layer {} {} activations obtained. Shape {}'.format(which_layer, 
                                m.layers[which_layer]['name'], acts.shape))   
    return obs, acts, frames

def plot_topN_patches(activations, observations, which_filter=38, which_layer=2, which='top',n=3,plot=True):
    """ Plot the top/bottom/random-n activating patches
    activations: activations across all observations, e.g. (2501, 11, 11, 64)
    which_filter: the filter of interest, an integer in e.g. [0, 64) for conv3
    The top-n and bottom-n samples are determined by the activation values in activations
    Plots show the activations first and then the corresponding observation patches
    """
   
    #last two are fc layers
    receptive_stride = [(8,4), (20,8), (36,8),(84,0),(84,0)][which_layer]
    pad_each_side = [4, 4+4*2, 4+4*2+1*8,0,0][which_layer]

    # Find the maximum value in each channel of activation
    acts_filter = activations[..., which_filter]  # e.g. (2501, 11, 11)
    max_per_sample = []
    for act in acts_filter:     # each (11,11)
        max_per_sample.append(act.max())
    max_per_sample = np.array(max_per_sample)  
    top3 = max_per_sample.argsort()[::-1][:n]
    #print(max_per_sample)
    bot3 = max_per_sample.argsort()[:n]
    rand3 = np.random.choice(len(max_per_sample), n)

    if which.startswith('top'):
        picks = top3
    elif which.startswith('bot'):
        picks = bot3
    elif which.startswith('rand'):
        picks = rand3
    else:
        raise ValueError('which={"top", "bot", "rand"}')

    def plot_things(picks,plot=True):
        patches = []
        bottleft = []
        #figure(figsize=(10,4))
        for cc, sample_pick in enumerate(picks):
            
            if len(activations.shape)==2: #fc
                rep_pick = np.zeros((5,5))
                rep_pick[0,0] = activations[sample_pick,which_filter]+1e-6
                ii, jj = 0, 0  # fc layers have no spatial location
            else:
                rep_pick = activations[sample_pick,:,:,which_filter]
                [ii, jj] = [int(x) for x in np.where(rep_pick == np.max(rep_pick))]
            if plot:
                figure(0,figsize=(10,4))
                subplot(1,n,1+cc)
                imshow(rep_pick)
                title('Maximum activation loc: ({},{})'.format(ii,jj))
            
                figure(cc+2,figsize=(12,4))

            if len(activations.shape)==2: #fc:
                _patches=observations[picks[cc]]
                if plot:
                    figure()
                    for k in range(4):
                        subplot(141+k)
                        matshow(_patches[...,k],fignum=0)
                _bl = (0,0)
            else:
                _patches, _bl = get_obs_patch(observations[picks[cc]], ii, jj, receptive_stride, pad_each_side)
            patches.append(_patches)
            bottleft.append(_bl)
        return np.array(patches), bottleft

    if plot:
        gray()
    patches, bottleft = plot_things(picks,plot=plot)
    
    return patches, picks, bottleft

if __name__=='__main__':
    algos = ['a2c','es','ga','apex','rainbow','dqn']
    game_list_local = ['AmidarNoFrameskip-v4',
                       'AtlantisNoFrameskip-v4',
                       'KangarooNoFrameskip-v4',
                       'ZaxxonNoFrameskip-v4',
                       'AssaultNoFrameskip-v4',
                       'EnduroNoFrameskip-v4',
                       'SeaquestNoFrameskip-v4',
                       'AsterixNoFrameskip-v4',
                       'FrostbiteNoFrameskip-v4',
                       'SkiingNoFrameskip-v4',
                       'AsteroidsNoFrameskip-v4',
                       'GravitarNoFrameskip-v4',
                       'VentureNoFrameskip-v4']
    
    algo = algos[-1] 
    env = 'SeaquestNoFrameskip-v4'
    
    observations, activations, frames = build_model_get_act(algo, env, which_layer=2)
    plot_topN_patches(activations, observations, which_filter=38, which_layer=2)



================================================
FILE: atari_zoo/translate.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.

import os


def module_path():
    return os.path.dirname(__file__)

path = module_path()

dopamine_game_list = open(os.path.join(path,"game_lists/dopamine_game_list")).read().split("\n")[:-1]
es_apex_game_list = open(os.path.join(path,"game_lists/apex_game_list")).read().split("\n")[:-1]
a2c_game_list = open(os.path.join(path,"game_lists/a2c_game_list")).read().split("\n")[:-1]

#align game lists by taking out these games
blacklist = ['AirRaid','Carnival','ElevatorAction','JourneyEscape','Pooyan']

dopamine_game_list = [k for k in dopamine_game_list if k not in blacklist]
a2c_game_list = [k for k in a2c_game_list if (k[:k.find("NoFrameskip-v4")]) not in blacklist]


def grab_list(mode):
    if mode=='a2c' or mode=='canonical':
        return a2c_game_list
    if mode=='es' or mode=='apex':
        return es_apex_game_list
    if mode=='dopamine':
        return dopamine_game_list
    raise ValueError('unknown mode: {}'.format(mode))

def translate_game_name(inp_name,inp_mode,out_mode):
    inp_list = grab_list(inp_mode)
    out_list = grab_list(out_mode)

    inp_idx = inp_list.index(inp_name)

    return out_list[inp_idx]
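The translation is purely positional: the per-algorithm game lists are parallel, so a name is looked up by index in the source list and read out of the target list at the same index. A toy illustration of the same mechanism (hypothetical two-entry lists, not the real game-list files):

```python
# Toy parallel lists standing in for the aligned per-algorithm game lists.
apex_names = ['ice_hockey', 'seaquest']
canonical_names = ['IceHockeyNoFrameskip-v4', 'SeaquestNoFrameskip-v4']

def toy_translate(name, src, dst):
    # same index-based lookup as translate_game_name
    return dst[src.index(name)]

print(toy_translate('ice_hockey', apex_names, canonical_names))
# IceHockeyNoFrameskip-v4
```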


if __name__=='__main__':

    for k in range(len(dopamine_game_list)):
        print(dopamine_game_list[k],es_apex_game_list[k],a2c_game_list[k])

    print(translate_game_name('ice_hockey','apex','canonical'))

    print(a2c_game_list)


================================================
FILE: atari_zoo/utils.py
================================================
# Copyright (c) 2018 Uber Technologies, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
#     Unless required by applicable law or agreed to in writing, software
#     distributed under the License is distributed on an "AS IS" BASIS,
#     WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#     See the License for the specific language governing permissions and
#     limitations under the License.


import tensorflow as tf
import pylab
import numpy as np
import json
from lucid.misc.io.reading import read_handle

try:
    import tf2onnx
except ImportError:
    print('tf2onnx not installed, you will not be able to export to onnx')

"""
Helper function to load json from a url
(lucid.misc.io.reading.load chokes on a decoding issue)
"""
def load_json_from_url(url,cache=None,encoding='utf-8'):
    with read_handle(url,cache=cache) as handle:
        res = handle.read().decode(encoding=encoding)
        return json.loads(res)

"""
Helper function to generate a new session
"""
def get_session():
    tf.reset_default_graph()
    tf_config = tf.ConfigProto(
        inter_op_parallelism_threads=0,
        intra_op_parallelism_threads=0)
    tf_config.gpu_options.allow_growth=True
    session = tf.Session(config=tf_config)
    return session

"""
Render a layer of conv weights to an
RGB numpy array canvas 
"""
def conv_weights_to_canvas(w):
    fx = w.shape[0]
    fy = w.shape[1]
    in_ch = w.shape[2]
    out_ch = w.shape[3]

    scale = 1
    padding = 1

    x_leap = (fx+padding)
    y_leap = (fy+padding)

    c_sz_x = padding + x_leap * in_ch
    c_sz_x *= scale
    c_sz_y = padding + y_leap * out_ch
    c_sz_y *= scale

    w_max = w.max()
    w_min = w.min()
    print(w_min,w_max)

    #first, cheap rescale
    w_scaled = (w-w_min)/(w_max-w_min)

    canvas = np.zeros((c_sz_x,c_sz_y,3))
    for i in range(in_ch):
        for j in range(out_ch):
            x_idx = padding + i*x_leap
            y_idx = padding + j*y_leap
        
            filt = w_scaled[:,:,i,j]
        
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,0] = filt
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,1] = filt
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,2] = filt
        
    canvas = canvas.transpose([1,0,2])
    return canvas

"""
Use Matplotlib
"""
def visualize_conv_w(w,title=None,subsample=None):
    if subsample is not None:
        w=w[:,:,:,:subsample]
    canvas = conv_weights_to_canvas(w)
    #pylab.figure(figsize = (10,20))
    pylab.imshow(canvas)
    if title:
        pylab.title(title,fontsize=20)
    return canvas

#save model out to onnx format
def to_onnx(model,fname="./frozen_out.onnx",scope=""):
    tf.reset_default_graph()
    model.load_graphdef()
    model.import_graph(scope=scope)

    tf.import_graph_def(
            model.graph_def, {}, name=scope)

    graph = tf.get_default_graph()
    onnx_graph = tf2onnx.tfonnx.process_tf_graph(graph)

    inp_name = model.input_name+":0"
    out_name = model.layers[-1]['name']+":0"
    
    print(inp_name,out_name)
    model_proto = onnx_graph.make_model("", [inp_name], [out_name])

    with open(fname, "wb") as f:
        f.write(model_proto.SerializeToString())

    print("Done...")

#convert fc-level activations to a canvas representation
def fc_activations_to_canvas(m,act,scale=8,padding=1,width=32,idx=0):
    
    if len(act.shape)==2:
        act=act[idx]
    
    channels = act.shape[0]

    fx = fy = scale
    
    if width>channels:
        width=channels
    in_ch = width
    out_ch = int(channels / width)

    x_leap = (fx+padding)
    y_leap = (fy+padding)

    c_sz_x = padding + x_leap * in_ch
    #c_sz_x *= scale
    c_sz_y = padding + y_leap * out_ch
    #c_sz_y *= scale

    #print(c_sz_x,c_sz_y)

    a_max = act.max()
    a_min = act.min()
    #print(a_max,a_min)

    #first, cheap rescale
    a_scaled = (act-a_min)/(a_max-a_min)

    canvas = np.zeros((c_sz_x,c_sz_y,3))
    canvas[:,:,0]=1.0
    for i in range(in_ch):
        for j in range(out_ch):
            x_idx = padding + i*x_leap
            y_idx = padding + j*y_leap

            filt = a_scaled[i+j*width]

            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,0] = filt
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,1] = filt
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,2] = filt

    canvas = canvas.transpose([1,0,2])
    return canvas

def get_activation_scaling(model,act):

        act = model.canonical_activation_representation(act)
        #print("Processed shape",act.shape)

        act_max_ch = act.max((0,1,2))
        act_min_ch = act.min((0,1,2))
        return act_max_ch,act_min_ch

#convert conv-level activations to a canvas representation
def conv_activations_to_canvas(model,act,scale=1,padding=1,width=8,idx=0,scaling=None):

    act_max_ch = None
    act_min_ch = None

    if scaling is not None:
        act_max_ch,act_min_ch = scaling

    if len(act.shape)==4:
        #handle NCHW and NHWC
        act = model.canonical_activation_representation(act)
        #print("Processed shape",act.shape)

        act = act[idx]

    fx = act.shape[0]
    fy = act.shape[1]
    channels = act.shape[2]

    #no blank squares
    if width>channels:
        width = channels
        
        
    in_ch = width
    out_ch = int(channels / width)
    
    x_leap = (fx+padding)
    y_leap = (fy+padding)

    c_sz_x = padding + x_leap * in_ch
    c_sz_x *= scale
    c_sz_y = padding + y_leap * out_ch
    c_sz_y *= scale
    

    #global max/min
    a_max = act.max()
    a_min = act.min()

    #first, cheap rescale
    a_scaled = (act-a_min)/(a_max-a_min)

    canvas = np.zeros((c_sz_x,c_sz_y,3))
    canvas[:,:,0]=1.0
    for i in range(in_ch):
        for j in range(out_ch):
            x_idx = padding + i*x_leap
            y_idx = padding + j*y_leap

            if act_max_ch is None:
                filt = a_scaled[:,:,i+j*width]
            else:
                channel = i+j*width
                filt = (act[:,:,channel] - act_min_ch[channel])/(act_max_ch[channel]-act_min_ch[channel]+1e-8)

            #flip x & y
            filt = np.transpose(filt,[1,0])

            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,0] = filt
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,1] = filt
            canvas[x_idx:x_idx+fx,y_idx:y_idx+fy,2] = filt


    canvas = canvas.transpose([1,0,2])
    return canvas
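As a sanity check on the layout arithmetic above: an (11, 11, 64) conv activation map with `width=8`, `scale=1`, `padding=1` tiles into an 8x8 grid of 12-pixel cells plus a one-pixel border, i.e. a 97x97 canvas. A standalone sketch of the same arithmetic (a hypothetical `canvas_size` helper mirroring `conv_activations_to_canvas`, not a call into this module):

```python
def canvas_size(fx, fy, channels, width=8, scale=1, padding=1):
    """Mirror of the grid arithmetic in conv_activations_to_canvas."""
    in_ch = min(width, channels)      # no blank squares
    out_ch = channels // in_ch
    c_sz_x = (padding + (fx + padding) * in_ch) * scale
    c_sz_y = (padding + (fy + padding) * out_ch) * scale
    return c_sz_x, c_sz_y

print(canvas_size(11, 11, 64))  # (97, 97)  -> conv3 activations
print(canvas_size(21, 21, 32))  # (177, 89) -> conv1 activations
```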

try:
    import moviepy.editor as mpy
    from moviepy.video.io.ffmpeg_writer import FFMPEG_VideoWriter
except ImportError:
    print("Moviepy not installed, movie generation features unavailable.")

from lucid.misc.io.serialize_array import _normalize_array
import numpy as np

def MakeVideo(m,fps=60.0,skip=1,video_fn='./tmp.mp4'):
    frames = m.get_frames()
    size_x,size_y = frames.shape[1:3]

    writer = FFMPEG_VideoWriter(video_fn, (size_y, size_x), fps)
    for x in range(0,frames.shape[0],skip):
        writer.write_frame(frames[x])
    writer.close()

def load_clip_from_cache(algo,env,run_id,tag="final",video_cache="."):

    i_video_fn ="%s/%s-%s-%d-%s.mp4" % (video_cache,algo,env,run_id,tag)

    return  mpy.VideoFileClip(i_video_fn)


def movie_grid(clip_dict,x_labels,y_labels,grid_sz_x,grid_sz_y,label_padding=50,padding=5,label_fontsize=20):
    key = list(clip_dict.keys())[0]
    exemplar = clip_dict[key]
    size_x,size_y = exemplar.size
    duration = exemplar.duration

    x_step = (size_x+padding)
    y_step = (size_y+padding)

    composite_size = (label_padding + x_step * grid_sz_x), (label_padding + y_step * grid_sz_y)

    #load in all the movie clips
    for _x in range(grid_sz_x):
        for _y in range(grid_sz_y):
            pos =(label_padding + _x*x_step,label_padding + _y*y_step)
            clip_dict[(_x,_y)] = clip_dict[(_x,_y)].set_position(pos)
            #clip.write_gif(o_video_fn)

    clip_list = []
    #add background clip
    clip_list.append(mpy.ColorClip(size=composite_size, color=(255,255,255)))

    #now add x and y labels
    l_idx = 0
    if y_labels is not None:
        for label in y_labels:
            txtClip = mpy.TextClip(label,color='black', fontsize=label_fontsize).set_position((0,label_padding+y_step*l_idx+(y_step/2)))
            l_idx+=1
            clip_list.append(txtClip)

    l_idx = 0
    if x_labels is not None:
        for label in x_labels:
            txtClip = mpy.TextClip(label,color='black', fontsize=label_fontsize).set_position((label_padding+x_step*l_idx,label_padding/2))
            l_idx+=1
            clip_list.append(txtClip)
    
    for key in clip_dict:
        clip_list.append(clip_dict[key])
    
    cc = mpy.CompositeVideoClip(clip_list,composite_size)
    return cc




def rollout_grid(env,algos,run_ids,tag='final',clip_resize=0.5,label_fontsize=20,out_fn="composite.mp4",video_cache=".",length=None):

    clip_dict = {}
    key = None
    for algo in algos:
        for run_id in run_ids:
            key = (algo,run_id)
            clip_dict[key] = load_clip_from_cache(algo,env,run_id,tag,video_cache).resize(clip_resize)
            
    exemplar = clip_dict[key]
    size_x,size_y = exemplar.size
    duration = exemplar.duration

    #labels for grid
    y_labels = [("R%d"% r) for r in run_ids] 
    x_labels= algos

    label_padding = 50
    padding = 5

    num_runs = len(run_ids)

    x_step = (size_x+padding)
    y_step = (size_y+padding)

    composite_size = (label_padding + x_step * len(algos), label_padding + y_step * num_runs)

    algo_idx = 0

    #load in all the movie clips
    for algo in algos:
        for run_id in run_ids:
            pos =(label_padding + algo_idx*x_step,label_padding + (run_id-1)*y_step)
            clip_dict[(algo,run_id)] = clip_dict[(algo,run_id)].set_position(pos)
            #clip.write_gif(o_video_fn)
        
            print(env,algo,run_id)
        
        algo_idx+=1
    

    clip_list = []
    #add background clip
    clip_list.append(mpy.ColorClip(size=composite_size, color=(255,255,255)))

    #now add x and y labels
    l_idx = 0
    for label in y_labels:
        txtClip = mpy.TextClip(label,color='black', fontsize=label_fontsize).set_position((0,label_padding+y_step*l_idx+(y_step/2)))
        l_idx+=1
        clip_list.append(txtClip)

    l_idx = 0
    for label in x_labels:
        txtClip = mpy.TextClip(label,color='black', fontsize=label_fontsize).set_position((label_padding+x_step*l_idx,label_padding/2))
        l_idx+=1
        clip_list.append(txtClip)

    
    for key in clip_dict:
        clip_list.append(clip_dict[key])
    
    cc = mpy.CompositeVideoClip(clip_list,composite_size)

    if length is not None:
        duration = length

    cc = cc.resize(1.0).subclip(0,duration)

    if out_fn is not None:
        cc.write_videofile(out_fn)

    return cc,clip_dict


================================================
FILE: colab/AtariZooColabDemo.ipynb
================================================
{
  "nbformat": 4,
  "nbformat_minor": 0,
  "metadata": {
    "colab": {
      "name": "AtariZooColabDemo.ipynb",
      "version": "0.3.2",
      "provenance": [],
      "collapsed_sections": []
    },
    "kernelspec": {
      "name": "python3",
      "display_name": "Python 3"
    },
    "accelerator": "GPU"
  },
  "cells": [
    {
      "metadata": {
        "id": "8HWEO12ygohP",
        "colab_type": "code",
        "outputId": "3d5a3e29-e11b-48e6-f421-1df434f589bd",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 254
        }
      },
      "cell_type": "code",
      "source": [
        "!git clone https://github.com/uber-research/atari-model-zoo\n",
        "import os\n",
        "os.chdir(\"/content/atari-model-zoo\")\n",
        "import atari_zoo"
      ],
      "execution_count": 1,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Cloning into 'atari-model-zoo'...\n",
            "remote: Enumerating objects: 126, done.\u001b[K\n",
            "remote: Counting objects: 100% (126/126), done.\u001b[K\n",
            "remote: Compressing objects: 100% (95/95), done.\u001b[K\n",
            "remote: Total 159 (delta 41), reused 104 (delta 30), pack-reused 33\u001b[K\n",
            "Receiving objects: 100% (159/159), 20.00 MiB | 27.31 MiB/s, done.\n",
            "Resolving deltas: 100% (46/46), done.\n",
            "tf2onnx not installed, you will not be able to export to onnx\n",
            "Imageio: 'ffmpeg-linux64-v3.3.1' was not found on your computer; downloading it now.\n",
            "Try 1. Download from https://github.com/imageio/imageio-binaries/raw/master/ffmpeg/ffmpeg-linux64-v3.3.1 (43.8 MB)\n",
            "Downloading: 8192/45929032 bytes (0.0%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b3047424/45929032 bytes (6.6%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b6823936/45929032 bytes (14.9%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b10690560/45929032 bytes (23.3%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b14573568/45929032 bytes (31.7%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b18399232/45929032 bytes (40.1%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b22315008/45929032 bytes (48.6%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b26206208/45929032 bytes (57.1%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b29335552/45929032 bytes (63.9%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b32645120/45929032 bytes (71.1%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b36446208/45929032 bytes (79.4%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b40312832/45929032 bytes (87.8%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b44179456/45929032 bytes (96.2%)\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b\b45929032/45929032 bytes (100.0%)\n",
            "  Done\n",
            "File saved as /root/.imageio/ffmpeg/ffmpeg-linux64-v3.3.1.\n"
          ],
          "name": "stdout"
        }
      ]
    },
    {
      "metadata": {
        "id": "135P9vp2siaA",
        "colab_type": "text"
      },
      "cell_type": "markdown",
      "source": [
        "## Download trained model and precomputed rollout data"
      ]
    },
    {
      "metadata": {
        "id": "KypUsPKqh_br",
        "colab_type": "code",
        "outputId": "6cb960df-3cf4-4853-a73c-87d6b3cc7a41",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 72
        }
      },
      "cell_type": "code",
      "source": [
        "import atari_zoo\n",
        "from atari_zoo import MakeAtariModel\n",
        "from pylab import *\n",
        "\n",
        "algo = \"rainbow\" #or try.... es, ga, dqn, a2c, apex\n",
        "env = \"SeaquestNoFrameskip-v4\"  #or try... ZaxxonNoFrameSkip-v4\n",
        "run_id = 2\n",
        "tag = \"final\"\n",
        "m = MakeAtariModel(algo,env,run_id,tag)()\n",
        "\n",
        "# get observations, frames, ram state, and neural representations from a representative rollout\n",
        "obs = m.get_observations()\n",
        "frames = m.get_frames()\n",
        "ram = m.get_ram()\n",
        "rep = m.get_representation()"
      ],
      "execution_count": 2,
      "outputs": [
        {
          "output_type": "stream",
          "text": [
            "Model path: gs://download-dopamine-rl/lucid/rainbow/Seaquest/2/graph_def.pb\n",
            "Data path: https://dgqeqexrlnkvd.cloudfront.net/zoo/rainbow/SeaquestNoFrameskip-v4/model2_final_rollout.npz\n",
            "Log path: https://dgqeqexrlnkvd.cloudfront.net/zoo/rainbow/checkpoints/SeaquestNoFrameskip-v4_2\n"
          ],
          "name": "stdout"
        }
      ]
    },
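    {
      "metadata": {
        "id": "rollout_shapes",
        "colab_type": "code"
      },
      "cell_type": "code",
      "source": [
        "# Sanity-check the rollout data just downloaded. This is a sketch; the exact\n",
        "# shapes are an assumption based on standard Atari preprocessing\n",
        "# (84x84 observations with 4 stacked frames, 128 RAM bytes per step).\n",
        "import numpy as np\n",
        "for name, arr in [('obs', obs), ('frames', frames), ('ram', ram), ('rep', rep)]:\n",
        "    print(name, np.array(arr).shape)"
      ],
      "execution_count": null,
      "outputs": []
    },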
    {
      "metadata": {
        "id": "IH4uNHcGnFda",
        "colab_type": "text"
      },
      "cell_type": "markdown",
      "source": [
        "## Display one RGB frame and the corresponding pre-processed observation fed into the deep NN\n"
      ]
    },
    {
      "metadata": {
        "id": "KS9W6tNKggV1",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 562
        },
        "outputId": "b53cec2c-bbbc-41ea-fb5f-afa620de287c"
      },
      "cell_type": "code",
      "source": [
        "from IPython.display import Image, display\n",
        "import numpy as np\n",
        "import PIL\n",
        "\n",
        "idx = 100\n",
        "frame = frames[idx]\n",
        "# observations are stored (height, width, stack); move the stacked-frame axis first\n",
        "_obs = obs[idx].transpose((2,0,1))\n",
        "\n",
        "# display the full-resolution RGB frame\n",
        "display(PIL.Image.fromarray(frame))\n",
        "\n",
        "# display the T-3, T-2, T-1, and current frames of the observation fed into the deep NN\n",
        "for i in range(4):\n",
        "  # tile grayscale to 3 channels and rescale from [0,1] to [0,255] for display\n",
        "  _obsf = (np.tile(_obs[i,...,np.newaxis],(1,1,3)) * 255).astype(np.uint8)\n",
        "  display(PIL.Image.fromarray(_obsf))"
      ],
      "execution_count": 3,
      "outputs": [
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAKAAAADSCAIAAABCR1ywAAAE/UlEQVR4nO3dP4tUVxjH8Rux18KA\nhSCCnRMCVkIKhwFhsRUrLUx8B1PpO9jqvoPoQrKlbVgQhi0tFCymtFEsBLENCm5IceRwc+eemXvN\nec6f334/hGW8mZ095rvPmbt3djdNAwAAAADo+CH0L366fpRyHTByJvcCYOts7gWU6PBg39++9+DR\nzuMlY4vucxV9v8ODfXe7e7x3n5KxRe9QRcUt2KKH+d249sBM8LB7Dx51d+N6EVgcgcVxFj2AL5NQ\nDbZocQQWR2BxBBZXd+DQVYipx4VVHJi6Y1QcGGNUFthN4eHBvh9Hf2TS8dOjssANO/NE9QXGJNUH\nDl0TnnpcVX0v+FN0El5sEFf9Fo3tCCwu+Bx8pTlJuQ4YYYLFBSf4538+p1wHjDDB4oITfK15l3Id\nMMIEiwtO8Pkzf6RcB4wwweKCE3xy5e+U64ARJlhccII/nfuSch0wEgz8/urXlOuAkeDLhRd/+T3l\nOmCE52BxBBYXfA7+8PZjynXASDDwxcs/plwHjDDB4ngOFkdgcTwHi2OCgZoFL1U2l071D+XJYIsW\nR2BxBBZHYHEEFkdgcQQWR2BxBDaxmM1zL+EbrmSJY4JNlDPBBI5vMZuv1sdN09x/ss69FgIbGKyb\nKzaBo/Hb8v0na5/T3cg4ygQ25+r++dssy0cncDSr9fFiNl/M5rlaDiJwHC6tu90bWXeD5+C6rdbH\n7h/3Rxe11zgLLnTE50bZx86LCY7PpS3kWgcTbMhf8ciIwOLYosURWByBxRFYHIHFEVgcgcURWByB\nxRFYHIFNFPJKQ0NgebzYII4JjqyczdkhsDgCR1PUd+p4BBbHSVZMBQ5x8HdVYpLSzq08Jjiy0oaY\nwPF1pzl7aQKbKOEbZh3Ook24H0TLvYqmIbCdQhqzRYtjgsURWByBxRFYHIHFEVgcgcURWByBxRFY\nHIHFEVgcgcURWByBxRFYHIHF5Qn8/Omz50+fZfnQp03qb3yna2Js0eIILI7A4lJ82+yY591bv96J\n9eHQZXWSxclUIXL++Kib2mI/FUILq2uzsd2ix8er679aRfJv0aQ1ZbtFEy87fvhMHF8HiyOwOAKL\nI7C4aIFL+HUF2MQEi+PLJHGRJ5iNujRs0eIiBy7qf36NhgmWF/nFhtJ+1ypMJngxm7NLFyLmBDO+\nBYoQuKhfj4yeaBc6yvkNyeiK8xxM3WJFCEzdknEtWhwXOsSFz6IvvEm4DFhhgsWFJ/j8y4TLgBUm\nWFx4gs++TrcKmGGCxRFYHIHFjXo1qb3dDh5f/rWMsgge3+7xg5cq23b4QUOm/mVCi+bx4z5+tMBj\nPtjURfP4///xTQKjHJxkiSOwOAKLI7A4Aoub9m2zy+W38++2bZfLZeht9z7dO7tH2Hl+vre3N/nv\nkcOrhzc3D368+zj9SraYPMHdPL5l73bbtv5u/kb3Dkjm+7doV27zreNzhj4JkMb3B3a1djbbHGWk\nNDlwbyveeZ8x94cdLlWK48skcQQWR2BxBBZHYHEEFkdgcQQWR2BxBBZHYHEEBgAgj/DvyarWYu/F\n4PHV0Y3t9+neAcXZbNY90rsd+iTQI3IW7YL5t6G0/o9uWN09N9+3e2TzEeoiErj57wYb2mx9Xd9s\ndXSjd+edO3lddALvjNGd3e7B7e9Ye2OdwGNOkTaHdft7CZx26QTuGnmSPDi+vSMCjQEAAAAAAAAA\nMPUv9Fqtl5lgVM4AAAAASUVORK5CYII=\n",
            "text/plain": [
              "<PIL.Image.Image image mode=RGB size=160x210 at 0x7FA7BEB08278>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAFQAAABUCAIAAACTCYeWAAAFmUlEQVR4nO2cvW/TTBzHz/bZ8cV2\n6jZ2arlxgVILFdQOlLUTbRc2FiohsVas/CP8KzAgFtgqVUKRWCrxUql1lCYpdURDXuy0Mc4znB6r\nhtC3xJc6yWdyTvbvvt873/le7AAwZjShzv5IJpMMwwxKCnng2R8LCwuSJA1KCmZ+fl6W5Xw+b9u2\nruu6rtu2nc/no8iLjiLotTEM48mTJ6VSSZZlhmE2Nzc/f/68sbEhy3IU2d0s8xhJktbW1ghkdLPM\nHx8fW5aVTqe/f//u+34ul3v48OHXr18dx4kiu1CHt7y8PPA2TxISNZ9KpS48HgiRm19aWlpZWcHH\nuq4/ffoUH7Msu7m5GXXu5wMvPuW6rK+v37lzR9d1AECtVltYWJiZmQEArKysBMerq6sfPnyITsP5\nRGgeAFAqlUqlUqVScV23Wq2Wy2XHcfb39+fn53O5HADg06dPkQo4n5Hu8EI1z3FcIpEYlBTyhMxL\nkhTRWOpmEjIPIeQ4blBSyDO+7f9HEISBDzxIEurt7927l0wmByWFPKGaRwgJgjAoKeQJmXddl6Ko\nf506fNysKS1hQvU8MTEBYbQD3hvFn739SD3nR/q2H5uPGxRFTU5OTk9PP3/+fGpqSlXV6203hDo8\nXddj0eZFUfR9/+XLlwAAx3GKxeK7d+9+//591TjxM69pWq1Ww+u5mUzGMAzHcfb29k5OTq4aKmbm\nIYTpdDqdTu/v77uu22O0mJnH0DR969atk5OTUqnUU5x+CSKJ7/uWZfW+kxFL8wCATqdTrVZ7DBJX\n831hbH5UiaV5mqYVRek9TvwmsHitqVKp9B4qZuZpmk6lUhDCZrPZh2i9hyBMrVbzPG92dlZV1R5D\nxazmfd9vNpvNZlMUxd4XnWJmPqDRaPQ+Eo/fbR9wenraY4QYm++dsflR5TrmIYTZbFYUxb6rIcyV\ne/tMJvPixQtN0968ebO1tRWFJmJcwTzP84qiZDIZTdMAALIsZ7PZer3+69evyORFS2gZa319fWpq\n6tu3b8fHx2fTFxcXHz9+jM0HiYVCwTCMjx8/vn//npDYfhOq+Z2dna4jh2az2Wq1WJY9ODgIEimK\n2t7e3tnZiVxjZFxqAVMQhMnJyR8/frTbbVLCSHCpNo+H01FLIc/4OT+qXGxeVVWaHs4yutiVbdsI\noVjs5FyVS1UpQmgoX0i+wLwgCIqiSJJE0/Twva5zgZ9kMklRlGVZnU6HjCCSnGeeoijHcYbyCY8J\nmWdZ9o+Ord1uD2VXhwmZZxhm+Br2OYSs+r7v+/6gpJAnZF6WZVEUDcMIVmkqlYpt261Wq9FodL0e\nT3JN0wxumYODg3q93mg0Wq1Wl/wglGUZQmiaZpC4u7vreV61WvU87+9LeJ4XRVGSpGw2i1M8z9vd\n3QX/3rQSRZHneVVVgzl4o9EoFAo4l+C00Kzu1atXQQZnwebr9XowpQ3U/2vDEJu3bTvQh4sVm+96\nCTZfKBSCglYURVVVbL7rJTg4Ljucks1mJUnC5v8+PyhiXHaXMj+sDOeg/ZJ079t//vz59u3b2dlZ\ny7JM07QsS9O0o6Ojubm5vb29Bw8e5HK5VCr17Nmzs1cpivLo0aMoVL5+/Rq3VUEQlpaW+hX2UjVv\nmiZ2VSgUcIqmaYeHh/0SMSjGbX5UGZsfVUbafKjDm56eZll2UFLIQ+4rOoRQp9PRdR0h5LpuIpH4\n8uULz/Nzc3N4FuC6brlcJqYHEDOvadrdu3eLxSLLsjzPu66LECoWi/fv38/n8+l0ulwuS5KUSCQY\nhjk9PbVtWxTFiP4YKYBQm+c4rtPp8DyPN7wQQgzD3L59e2trS5Zl3/cNw0AIsSwLIUQImaZJ4Htu\nQjXPcRxe/Oc4Ltjww59K4HVx
z/N834cQUhTVbrdbrVYikeg6KR4zZsyYa/Ifi2XoigW88HoAAAAA\nSUVORK5CYII=\n",
            "text/plain": [
              "<PIL.Image.Image image mode=RGB size=84x84 at 0x7FA7BEB083C8>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAFQAAABUCAIAAACTCYeWAAAFcElEQVR4nO2bP2/aTBzHj/gPd9jG\njrCDYyBKKVZFIqVS2zVbi1Rl69K8gmTPG+nbyJopS6ssGaoWCXVq1UgJhGBXNaIhQZjWlt3h9CCc\n5AkNYIOBz2ROvrvv17/z/fMBwJzZJNL7IxaLEQQxLinBQ/b+yOfzHMeNSwoml8sJglCpVAzDUBRF\nURTDMCqVih91LfhR6MBkMpmtrS1N0wRBIAhid3e3VCptb28LguBHdZNlHsNx3KtXrwKoaLLM//r1\nq1wuJxKJ79+/O45TLBafPXv27du3drvtR3WeDu/58+djf+eDJIjIx+PxvtdjwXfzGxsbm5ub+FpR\nlDdv3uBriqJ2d3f9rv1+yP63DEqhUHj06JGiKACAq6urfD6fSqUAAJubm93rly9fvn//3j8N9+Oj\neQCApmmaptXrddM0Ly8vdV1vt9tnZ2e5XK5YLAIAPn365KuA+5npDs8TeZqmo9HouKQEj8c8x3E+\nzaUmE495kiRpmh6XlOCZN/v/YBhm7BOPIPH09k+ePInFYuOSEjyeyCOEGIYZl5Tg8Zg3TTMSifzf\nrdPHZC1pA8YTZ57nSdLfCe9EcbO3n6lxfqab/dx82IAQxmKxp0+fDllO+Lo3giB4ni8UCktLS8lk\nEgBwdHRkWdYARYXMPISQ47irq6svX77IssyybKvVGrg0z1CnKMqE9/Y8zy8uLrqui7/hzJZ5DM/z\nkiRpmjbkfn7Imj2m2Wx2Op3hv6mG0jwA4Pfv38MXEsqhblTMzc8qoTTPcRxCaPhyQtbh4f1ly7I6\nnc7wpYUs8hDCpaWlUW06hMy8bduNRkOWZUmSht9rDVmz73Q6nU6n1WolEonBFjO9hCzyGMdx6vX6\n8OWE0jwAwHXdGY38qJibn1UGNx+JRNLpNM/zI1QTMAMOdQzD7OzspFKpDx8+HB4ejlZTYDzAPIRQ\nFMVms8nzPMMw+DhVPB5Pp9PtdrvRaPgm0i8esI21vLz8+vXrVCrV29RrtZosy6VSaX9/n+f5fD5/\nI5eu6z6dmR6eBzd7y7IuLi56U0ql0ufPnwEAzWbz48ePI5PmPw+IPG729Xp9JCuqSSCUu7ejYj7O\nzyr9zcdisWk9k9rffLvdtiyLpumFhWlrJv/qJ5FITN+JlT7mKYpiGEZRFNd1R7JhOlH0D6Yoipqm\nTc3Y3ksf8zRNn5+fu64bjJqA8ZinKOrGJMeyLIqigpUUHB7zBEFMX692Dx6rjuM4jjMuKcHjMS8I\nAsuymUyGZVmcUq/XDcPAW+V35hdFEQCgqmq3yVxcXFxfX7darTv7SJIkBUEgSVJV1W7iycmJbduX\nl5e2bd/OAiFkWZbjuHQ6jVNs2z45OcHy7lTFsiyEUJIkLA8A0Gq1qtUqrqV7m2dhs7e3162gF2z+\n+vq6u5jtqu+WfgNs3jCMrj78WLH5O7Ng89VqtfugRVGUJAmbvzMLLhw/O5ySTqc5jsPmb9/ffcT4\n2f2T+Wll2masD+Luvr3RaBwcHKysrJTLZVVVy+WyLMs/f/7MZrOnp6fr6+vFYjEej799+7Y3lyiK\nL1688EPlu3fv8LvKMMzGxsaoiv2nyKuqil1Vq1WcIsvyjx8/RiViXMzf+Vllbn5WmWnzng4vmUxO\n8RruNsH9iw4h5LquoigIIdM0o9Ho169fIYTZbBavAkzT1HU9MD0gMPOyLD9+/LhWq1EUBSE0TRMh\nVKvV1tbWKpVKIpHQdZ3juGg0ShDEnz9/DMNgWdbvj3wBvfM0TbuuCyHEB2kQQgRBrK6uHh8fC4Lg\nOE4mk0EIURRFkiRCSFXVAP7PHVDkuzvf+PwkTjRNEwCAEKJp2rZtx3FIkoxEIviAZTQancqNwzlz\n5oyNv4eC4dAakGmbAAAAAElF
TkSuQmCC\n",
            "text/plain": [
              "<PIL.Image.Image image mode=RGB size=84x84 at 0x7FA7BE56B278>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAFQAAABUCAIAAACTCYeWAAAFdUlEQVR4nO2cz0/iTBjHp/Y3baUB\nZJtqjcvSg5Ig0T148eaabPZg4kUve/buP+HRP8STx032uPGgJHvcjfgDgoCxZEULFoW272HyNnZ1\nFaEtFPo5lUn7zPOdmT4zfToFgIDRBHn8IxQKoSjaL1e8B3v8Y3Z2luO4frkCSSaTPM8XCgVFUURR\nFEVRUZRCoeBGXWNuGO0aSZK+fPlSLpd5nkdRdGtr6+fPn5ubmzzPu1HdYImHcBz36dMnDyoaLPHX\n19f5fD4ajR4fHxuGkc1mFxYWfv/+fXd350Z1toC3uLjY93veS7zo+fHx8VeP+4Lr4tPp9PLyMjwW\nRXF9fR0e4zi+tbXldu0vg71+Sresrq6+f/9eFEUAwO3t7ezs7OTkJABgeXnZOl5ZWfn+/bt7PryM\ni+IBAOVyuVwuV6tVTdNqtVqlUrm7uzs/P08mk9lsFgBweHjoqgMvM9IBz9bzBEGQJNkvV7zHJp7j\nOJfWUoOJTTyGYQRB9MsV7wmG/f8wDNP3hYeX2KJ9JpNhWbZfrniPrec1TUMQ5F+nDh828aqq3t/f\n98sV77GJj0QioVCoX654j22Qh8NhDHN3wTtQ/B3tR2qeH6xMjscE4v1GPB5HEIRhmB7t+E98PB4n\nSfLz588rKyvxeHxxcbFrUz6L7YIg3NzciKIoSRLLsoIg7O3tdW3NNtWJojjI0R7H8UgkEolEcBw/\nPT1NJpMnJyeiKOZyue4M+kk8BEXR6enpZrNZqVR6NOW/e17X9Xw+r2la76b8Jx4AYJpmrVbr3Y4v\nxTtFIN5v9L68gfhPvIPzkc/EMwzDMEyj0XDEmp/EoygaDocdTLf4STwA4ObmxjRNSZJisVjv1vy0\nttd1vdFoNBoNjuPGxhzoNj+Jt1BV1ZGw57Nhb/Hw8NC7Eb+Kd4RA/KgSiB9V3jbVra2tLSwsHBwc\nfPv2zSWHvORtPS9JEsMwmUzGHWe8ptMc3tevX+fn5x+X6Lp+dHSUSqV2dnY6nHXj8bhlv1ar1ev1\nrnx2jE6HvSRJf5WgKLq0tKSq6lurpCgqk8lY4o+OjnRdf6sRR3hD9jadTj99oiqVSsVi0RXX3Md/\nqWsH6SjgDWuLdCTeqZzZoNGR+Ovr68eBemjodJ7XdX34tui9Lj4UCkWj0XA4jCAIjuMe+OQZr8/z\n8EvD8/Nz0zQ9cMhLXhevaZpTqeJBwyYex/GnUa3Vag1fqIPYxKMoOrr78AzDMAyjX654j008z/Ms\ny8LtLrCkWq0qitJsNv/1BAZfHsiybA2Zi4sLVVXr9Xqz2XymPgzjeR7DMFmWrcJcLtdut2u1Wrvd\nfnoJRVEsy3IcNzU1BUva7TbcilKtVp/1imVZiqImJiasdxv1er1YLMJarNNsa/vt7W2rgsdA8aqq\nXlxcWBqg9/96cwLFK4pi+QebFYp/9hIovlgsWg0di8UmJiag+GcvgcZh28GSqakpjuOg+KfnW00M\n264j8cNKkMN7wp8/f/b396enp/P5vCzL+XxeEISrq6tEInF2dpZKpbLZ7Pj4+MbGxuOrYrHYx48f\n3fByd3cX3qsMw6TTaafMdtTzsixDVVbeQhCEy8tLp5zoF8E9P6oE4keVkRZvC3jv3r0bsnTFy3j3\nCSFN06ZpiqJI07SmaSRJ/vr1i6KoRCIBnwI0Tet9I/Wb8Ei8IAgfPnwolUo4jlMUpWkaTdOlUmlu\nbq5QKESj0UqlwnEcSZIoij48PCiKwrKsS3+MZOHRPU8QhGmaFEW1Wi0AAE3TKIrOzMz8+PGD53nD\nMCRJomkax3EMw2ialmXZg++5Pep5giDg5jGCIKB+AADcM0/TNEEQ7XbbMAwMwxAEabVazWaTJMln\nH4oDAgICuuQ/DafRR0dmKlQA
AAAASUVORK5CYII=\n",
            "text/plain": [
              "<PIL.Image.Image image mode=RGB size=84x84 at 0x7FA7BE5D5F60>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAFQAAABUCAIAAACTCYeWAAAFeElEQVR4nO2bQU/bPBiAXRInNk7a\ndG21ktENGBEr26qJMW7cgMtu0yT2Dzhx2R/ht3CFHXeYpkqTpkmbQNpStbRA2FpoRwKE9DtYX0RY\ngUKTtGnznFIrsd8ndhzHdgEIGUwiF38MDw8zDNOtUPyHvfgjm82KotitUCiTk5OSJBUKBU3TZFmW\nZVnTtEKh4EVZQ15kemcymczr16/L5bIkSQzDrKysfPny5d27d5IkeVFcb8lTRFFcXFz0oaDekq9W\nq6qqJhKJra0ty7Ly+fzMzMyPHz+Oj4+9KM7R4b18+bLrz7yf+FHz0Wj0xuOu4Ll8Lpebn5+nx7Is\nv3nzhh5DCFdWVrwu/XrYm0+5K0tLS+Pj47IsAwCOjo6y2eyDBw8AAPPz8/bxwsLChw8fvIvhejyU\nBwCUy+VyuXxwcKDreq1Wq1Qqx8fHv379mpyczOfzAIDPnz97GsD1DHSH56h5juN4nu9WKP7jkBdF\n0aOxVG/ikGdZluO4boXiP2Gz/x9CSNcHHn7i6O1fvHghCEK3QvEfR83ruh6JRK46tf9wyNfr9ZOT\nk26F4j8O+Xv37g0PD3crFP9xNPJYLMay3g54e4rLvf1Aved7aybHZ0L5QAEhdGswEjx5hNDIyMjU\n1BQAIJ1Od5JVkPp2QghCaGZmJpfLWZY1Nze3ubnZSYZBqnkIYTQa5Xl+Y2Njf3//69evHda84z0v\ny3Lvv+ri8TghZG9v7+zsrMOsgtTsKdVqtVaruTIYC1Kzt2k2m51XOwiovFuE8oGCYRiEkCtZBUx+\naGjIxe/OIPX2EEKe5y3LcmvFOkg1jzFOJpMuZhgk+dPT02q1KstyKpXCGHeeYZCavWEYhmE0Go14\nPG6aZucZBqnmKefn579//3Ylq+DJg3CE5wqh/KASyg8qd5RnGObRo0dv37599uyZuwH5yR0HOYSQ\n1dVVAMDfv3+/ffvmakj+cQt5hFAymSSEPH/+3J7qe/LkCSFEVVW6tSxY3EI+Ho8vLS1NT09fSn/1\n6hXLsp3Lx2KxbDZ7KbFSqXi02R7cSh5jHI1GNzc3Dw8PL6bv7u6qqtp5KIeHh58+feo8n/YJ3tS1\ni7TV27Ms25fbVdqSxxj35R+P2pKv1+uEEELI0FBfDYralTEMQxRFCKGn0fjMzfIQQkKILMuRSMSV\nyaPeoa1XXSKR2Nvb8+hfPl3kZnkIYalUsizLh2h8xiEPIfz3PW+aZr/uT3NYMQzTr54tcahaltWX\nzfsqHPKSJAmCkMlk7O1OBwcHmqbR2fKW19MlFEVR7CZTKpXq9Xqj0TAMo0V5LCtJEsuyiqLYidvb\n26Zp1mq1lrPxCCFBEERRHB0dpSmmaW5vb9PwWkYlCAJCKJVK2Ss8jUajWCzSUuzTHIPW9+/f2wVc\nhMrX6/VSqWQ70OivWj+i8pqm2fHR20rlW15C5YvFon2jk8lkKpWi8i0voZnTe0dTRkdHRVGk8v+e\nb99ieu/aku9X+mq4elta9+1//vxZX19/+PChqqqKoqiqmk6n9/f3JyYmfv78+fTp03w+H41Gl5eX\nL16VTCZnZ2e9iHJtbY0+q4SQXC7nVrZt1byiKNSqWCzSlHQ6vbu761YQ3SJ85geVUH5QGWh5R4d3\n//79PpuruR7/5mQxxs1mU5ZljLGu6zzPf//+HSE0MTFBvwJ0Xa9UKr7FA3yTT6fTjx8/3tnZgRAi\nhHRdxxjv7OxMT08XCoVEIlGpVERR5HmeYZjT01NN0wRB8G6thuLTM89xXLPZRAjRvTR0LnxsbOzj\nx4+SJFmWlclkMMYQQpZlMcaKovjwf26fap7jODrtzXGcvZdI13UAAMaY4zjTNC3LoqsjZ2dnhmHw\nPN/yozgkJCTkjvwHJtXSTKZ1
uxkAAAAASUVORK5CYII=\n",
            "text/plain": [
              "<PIL.Image.Image image mode=RGB size=84x84 at 0x7FA7BE56B278>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "metadata": {
        "id": "AUSGKCZOrYsm",
        "colab_type": "text"
      },
      "cell_type": "markdown",
      "source": [
        "## Visualize RAM state across this rollout\n"
      ]
    },
    {
      "metadata": {
        "id": "rSZDq6L7rfYb",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 635
        },
        "outputId": "79d8e679-f6d6-4f62-964b-2a77b9cb4f86"
      },
      "cell_type": "code",
      "source": [
        "ram = np.array(ram)\n",
        "# Atari RAM is 128 bytes per step; unpack each byte into 8 bits\n",
        "bin_array = np.zeros((ram.shape[0],128*8),dtype=np.uint8)\n",
        "\n",
        "# convert each integer RAM state to its binary representation\n",
        "for idx in range(ram.shape[0]):\n",
        "    state = ram[idx]\n",
        "    binary = m.ram_state_to_bits(state)\n",
        "    bin_array[idx] = [int(k) for k in binary]\n",
        "\n",
        "gray()\n",
        "matshow(bin_array)"
      ],
      "execution_count": 62,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<matplotlib.image.AxesImage at 0x7f866026ac50>"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 62
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<matplotlib.figure.Figure at 0x7f86603da9b0>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAREAAAJGCAYAAAB897FcAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJztvX2MX8V1//9ee1ltnKzlrOsPgoik\nUUoDqlyDZURt8xACdtTQ0pbUqKAFVQoqlIc6CRW4yGqpkGpjcBW+EIknoyKgLY2DWrdCC0Ixkqtu\nXIWVLBOpQkSlpZDYu9iwZFnjtXN/f+zvri+X+zBz75k577k7L8my97P3c++58/CeM2fOjPuSJEkQ\niUQiDVmkbUAkEgmbKCKRSKQVUUQikUgroohEIpFWRBGJRCKtiCISiURa0a9tQJa/+Zu/wYEDB9DX\n14e7774bv/mbv6lt0jw7duzAq6++ihMnTuCmm27CypUrceedd+LkyZNYsWIF7r//fgwMDGDPnj14\n6qmnsGjRIlxzzTXYtGmTir3Hjh3D7/zO7+CWW27B2rVrqW3ds2cPnnjiCfT39+PP/uzP8OUvf5nS\n3unpadx11114//33MTs7i1tvvRUrVqzAPffcAwD48pe/jL/+678GADzxxBMYHR1FX18fbrvtNlx6\n6aXe7Hz99ddxyy234I//+I8xMjKCn/3sZ8blOTs7iy1btuCdd97B4sWLsW3bNpx11lnVD0xI2L9/\nf/Inf/InSZIkyRtvvJFcc801yhadYmxsLLnxxhuTJEmSI0eOJJdeemmyZcuW5IUXXkiSJEl27tyZ\nPPvss8n09HSycePGZGpqKpmZmUmuvPLK5OjRoyo2/+3f/m1y9dVXJz/4wQ+obT1y5EiycePG5IMP\nPkgOHTqUbN26ldbep59+OnnggQeSJEmSn//858nXvva1ZGRkJDlw4ECSJEnyne98J3nllVeS//3f\n/03+4A/+IPnoo4+Sd999N/na176WnDhxwouN09PTycjISLJ169bk6aefTpIksSrP559/PrnnnnuS\nJEmSffv2JZs3b659Js10ZmxsDFdccQUA4Etf+hLef/99/OIXv1C2ao4LLrgADz74IABg6dKlmJmZ\nwf79+3H55ZcDAC677DKMjY3hwIEDWLlyJYaGhjA4OIjVq1djfHzcu70//elP8cYbb+ArX/kKAFDb\nOjY2hrVr1+Izn/kMer0e7r33Xlp7P/vZz+K9994DAExNTWHZsmV4++235z3m1Nb9+/fj4osvxsDA\nAIaHh/G5z30Ob7zxhhcbBwYG8Pjjj6PX681/ZlOeY2Nj2LBhAwBg3bp1RmVMIyKTk5P47Gc/O//z\n8PAwJiYmFC06xeLFi7FkyRIAwO7du3HJJZdgZmYGAwMDAIDly5djYmICk5OTGB4env+e1jvcd999\n2LJly/zPzLb+3//9H44dO4abb74Z1113HcbGxmjtvfLKK/HOO+9gw4YNGBkZwZ133omlS5fO/57B\n1v7+fgwODn7sM5vyzH6+aNEi9PX14fjx49XPFH4HMRLCbPyXX34Zu3fvxpNPPomNGzfOf15mq8Y7\n/PM//zPOO++80nksk60p7733Hh5++GG88847uOGGGz5mC5O9//Iv/4IzzzwTu3btwn/913/h1ltv\nxdDQUK1NTG3Z1kYT22lEpNfrYXJycv7nw4cPY8WKFYoWfZx9+/bhkUcewRNPPIGhoSEsWbIEx44d\nw+DgIA4dOoRer1f4Duedd55XO1955RW89dZbeOWVV/Dzn/8cAwMDtLYCcyPj+eefj/7+fnz+85/H\npz/9aSxevJjS3vHxcVx00UUAgHPOOQcfffQRTpw4Mf/7rK3//d///YnPtbCp/16vh4mJCZxzzjmY\nnZ1FkiTzXkwZNNOZ9evX48UX
XwQA/OQnP0Gv18NnPvMZZavm+OCDD7Bjxw48+uijWLZsGYC5+WJq\n70svvYSLL74Yq1atwsGDBzE1NYXp6WmMj49jzZo1Xm397ne/ix/84Af4p3/6J2zatAm33HILra0A\ncNFFF+FHP/oRfvnLX+Lo0aP48MMPae39whe+gAMHDgAA3n77bXz605/Gl770Jfz4xz/+mK2/9Vu/\nhVdeeQXHjx/HoUOHcPjwYfzar/2aV1uz2JTn+vXrMTo6CgDYu3cvLrzwwtr79yVEvtYDDzyAH//4\nx+jr68Nf/dVf4ZxzztE2CQDw3HPP4aGHHsIXv/jF+c+2b9+OrVu34qOPPsKZZ56Jbdu24bTTTsPo\n6Ch27dqFvr4+jIyM4KqrrlKz+6GHHsLnPvc5XHTRRbjrrrtobf3Hf/xH7N69GwDwp3/6p1i5ciWl\nvdPT07j77rvx7rvv4sSJE9i8eTNWrFiBv/zLv8Qvf/lLrFq1Cn/xF38BAHj66afxr//6r+jr68O3\nvvUtrF271ouNr732Gu677z68/fbb6O/vx+mnn44HHngAW7ZsMSrPkydPYuvWrXjzzTcxMDCA7du3\n44wzzqh8JpWIRCKR8KCZzkQikTCJIhKJRFoRRSQSibQiikgkEmmFlzwR5o11kUikHc5F5D//8z/x\nP//zP3juuefw05/+FHfffTeee+4514+NRCKecD6dYd5YF4lE2uNcRJg31kUikfZ4D6ya5LYdO3bM\ngyWRSEQC5zGRJhvrBgcH0dfXB2BOdNJ/p6SfpYKUvTYl/X3+u2X3yH+n6N5Fz8l+N/u7Kvtdk3+n\n/O+yn1W9iw/qysemTjXJl6Omjfm2Wtd/AHziZxuceyJtNtalL8VUQVlY7Ij4J+4WOYVzT2T16tX4\njd/4DfzRH/3R/MY6U5hHHiA2pIWMaZtkbr9S0G7Ay7vcRT9rT2eqbIjTGTNCnc7UlWOczpCQ75Ap\nDI0IiJ7IQsbGE+k61CLSRh0lKbPBpiFl/9Td18aGLlJUVibfaXtd/nd115o8M23DXa4/ahEpC6z6\nem5K1ZTIlLppxUKFuQyqbOvr6zO23ebaEKEWEfbCt7GNaSRqMtK7tKUO05iJT0zLT2sg9Am1iGg1\ndBfz3awgLoSGZUrdaN/m+y6xHeC6HEOhFhEWL6RtTKToOyzvFilHIiZiS4jtglpEAJ0pjUSQru5a\nlhGHccpYlRmcR6scGctNC2oR0Zq7my4pN3Fns8vWDI2QJTbCCGMshhFqEWHoZFXEJVq3sMdEInNQ\ni0hZspmv50qSfQcbd92EhSBQEsvsIRDi+1CLSOrya8cTJJ6n/Q5az6zDNv7UdkBhX5YP0cOhFhHN\noFnVz03vqbW/Io17MDfQfCavabayy3eSvLdppiyj0NdBLSJMG67akh9Ju/BOkoSYRyG1ilc01Q0J\nahHRIq7/+8PkoKcqbOtKsm5t0t67DLWIlHkhriulaJ9LW4piIkyjqjaMHbKufmL9zUEtImU7ILXz\nRpreo+sjUhOKtgDYrF7ZlqmLVPWFDrWIpJ6I7/NETEWqiR3ZIKevRlpWhgydxDZG1HZAkVxW9y1I\nrJ4PtYiwn8XQhbR3ZhhErgxX068Q2wW1iJSpveuCNo2JtB2JGBqMqUfCmteiJTSuyoNZOMugFhGG\nk82q3NYmngjbEi+zp+eijKTuadM2WctXCmoRSWFNe2/iqobaoHwnxqX/Nrlegzan2nUNahGRPMfD\nBleHEmX/3ZUkOmlCKRObwHTIg4cJ1CJSVlE+K6TKbWXfh8Fogw0mS72aMZHojcxBLSJlbi3LEm+T\nRpT9m2VvEBsuDyWSKnMbT8TFDmQm74ZaRBjc/qrn23oi2etZGgBLvkgRbEHoLBKdmPG9mkAtIjYB\nNklcx0Rsn+MS7RHNdHdrGZqHEjHUHwPUIpJWlGZlScdEtA5aYqUqJ6coGM2C5sY/NqhFJMV3tq
fr\n9X+NQ5VCbMQuYiKSeSKmhFj2NlCLiFbjz8cupDbg5fH1bkXeHIuwmK68aO3ojtRDLSLaUxlJ2PbO\nsJYto01laNchC9QiohU/8HEoEWsn9k1VGZgE1iVjE66OFair69DFiFpEGIJpUjZkBZHhvVio8tCY\nl3gBuZgN6/uZQi0i6TzYd7JZaDGRkLebt91B7PJQoogZ1CJSNmIzdQIbypYvNdG2I9S6jJyCWkRS\nWE82a3JPhk7TJHPW50qSDZr5GtoCzAK1iJRNZ1xjmuTU9lCi2AjNyqDqmlA24HUZahFh6GTShxI1\n+W5ELsAtmWzG0D4ZoBYR9iXepp6I76kB86jZdv+Ry/eSWlpmLXspqEUkHX00R3Gp9f28IPqcpkmM\nmj5PNiv6TKqspM4ACXlFTBpqEdE6BqBqU1jVdTb3jMjgslzrPBEXHmuIUItI2Wjtu1K6HBNhm9sz\nBaDrPBGmctOEWkRSTD0DKXyMMBqNsCymwCJoTZC03Ubobcuty4JDLSJaOzddH0qkuSyZ/ZsB5oxV\nqZhI6EJdB7WIZFcXNJE8lCj7s9Z7MY+KLF5m3bU28bq667Tbd1uoRUSrcF1kceanLywdOfS5vcs2\nUueJaK30sEEtImlF+Y6JFNlh83kR+amE1spTHm1Xm3kUlvBEGOrYNdQiwrCLt4qmIxFzx2Gkylty\n2Ra67D1IQi0iKZqeSOzwbjHdXFlWD7b1I9n5paa92d+H2N6oRUQrsJrfsi99KJHkfRcCoZdTnXAx\nxspsoBaRdL6umU/h8lCiSHtiuepDLSLpXDj0kQg4JYguAqs+ysfVM/KufEh1bRM7C+m9bKEWEYn8\nDMnnSsGwrNqVhh3CyhJDfbuEWkQYCl76HIsQtucX4aouWI95MIGhfTJALSJa54m4iImEJBghodmR\nY53OQS0iQPn5Er6fWYTtPow4ctXj+pyONlsV8r+L9TkHtYiwB1ab7sNIG2BshNwnhEluwOsy1CLC\nsEGtSsSiCLTH5uQ4iR3dkhvwpDzW0EWGWkTK9s5o2FFEk7T37BIvQ+Nh84h8rIxJXCtZbkzl3wRq\nEdE6T8RFTMT3/h9TWMSsKba2S3oikTmoRaQs7Z1ltGq7FZxBSJg8kSbHJWgeSmRS/9JtlVH0qUVE\na4m3zI48TebjmsHiInu1G6XUxjoXSK3O2JQxw3vbQi0iZWiLSkrb1ZlI+7oM5VCiKkIUjizUIsKw\nizdrR54mrjFLYJVlGlPl5Wl3Ll95InEXr0MYdvFWYeuJMDUQbRFLaVsmLpd4pfJEWATbFdQiUhY/\n0PZMUtoE6bresKTRKKu4OmMGtYhojZaug2AhioeGcJscKakZE7Ghy6JDLSL5na8auNyAxxATYfWI\nTOME2m3DFMYylqJf24AqNNPeXT1bK1icR/v5dbiKIUnes8vCYAO1J9IlimIikfZxB3YxBHiC2K6g\nFhGGJV4pryS7tJv+HfKBPFJULZ+zeG0SdHnQoBYRrS3zLg4lMgkSuqZNp3RVB23LQjLA6bKddXml\nh1pEUi8g9EIG4n8ZUYaL/TG+kNqoyfp+plCLCPsuXhuYshK1ny+J9gpXhFxEtJZ4XRxKVORRuT6P\nos51D60TFL2P5hTERMAWgsdJLSIsUxmJQ4ny39F+t+yWgphrYf88m3Kru5ahjbeBWkTKcF3oLg4l\nYgisFqEtZm07l63tUkIkWW6heYR5qEWEfYmvjV2hNxwpWOu2DhtPJNR3NIVaRBiWeFM7TK6rwvd8\nfqEQyrJsl4WEWkTKglIsna/JfJ7du/JN230xWp25zQ7urkEtIlpncLg4lKjonixHGmjCaFNKXbyG\nOSjsE2oRKRu1XTc8F3Pd7LtoNajQGrLpEqoNUqskktNsZiE1gVpEtE42K7KjiDaeiOZ7hdZoteyV\nyBMxuVa7fbeFWkQA7gIO1Z1tYotGR3YRE5FCMiYSmqjnoR
YRiVhEE1zERPLf8Skq2rkgEmhMHeqm\nPlJCwjTANIFaRNhHGRv7ihpd6B3bF1JbDyRZCOnsplCLiFZg1TW+PQOJICDDKlnTa7KEPuozQi0i\n+e3zvp9b9nMT8oLIEt3X3jvDTOiDlS+oRYTFZZToZAzvERraZ43EejeDWkTSOILvinDRMIviIQwN\njMWOSLhQi0hKF9zt2FmLMUn+Cv1owa7XPbWIsCRjSawMMJ1sxoTvM1ZtqBOvWI9zUItImYK7rrz8\n/SUPJcpOz2Ij5KYutyPUZENpqEWEveDbbsCLhLuLNwrIKahFhGUXb9vrgE8upXZ9ntx1ui4MNlCL\nCMsuXrZsyYWCi128Us+3HeC6PGBQiwhD7KCqsdimvWfvF5O8+IkxETOoRSSF9fCeLjcMBph38XbZ\ns7CFWkRYMlbLCHkk6oonFA940odaRLQCq/mcDslDibJxHk0PiyWw6zuRTGu/EkNZu4JaRLroiWie\n1lb0TFOPZCEcSmSbPBgPa56jv8mX9u/fj82bN+Pss88GAPz6r/86brzxRtx55504efIkVqxYgfvv\nvx8DAwPYs2cPnnrqKSxatAjXXHMNNm3aZPwchvlu1f4dG5HLC4d2xmPZypeWHUW4sE2q3G2D/tr1\n7ZSkAT/60Y+S22+//WOfbdmyJXnhhReSJEmSnTt3Js8++2wyPT2dbNy4MZmamkpmZmaSK6+8Mjl6\n9KjRMwDU/slfm/9+1X2y1xR9p+peVfaa/GzybmX2mvyuqhyL3q/sXXz8qXteVT3Y3MfHe7guQ9O6\nqmtvRW2grk1UITad2b9/Py6//HIAwGWXXYaxsTEcOHAAK1euxNDQEAYHB7F69WqMj48b3zP5/13/\nRHG0lIyJJJnRSPu9WDAtQ6mprVSZ29Rf1+u60XQGAN544w3cfPPNeP/993HbbbdhZmYGAwMDAIDl\ny5djYmICk5OTGB4env/O8PAwJiYmrJ7D7AKG0jBSO5nLsgiT8s0KswlSZeDqmbbvw0AjEfnVX/1V\n3Hbbbfjt3/5tvPXWW7jhhhtw8uTJ+d+XVb5Np0uvNW1ITb6f/13Rd8r+bXqvpteZfNf2eaZlyiiO\nVfVg01Z80cYWiTZR127btMM8jUTk9NNPx9e//nUAwOc//3n8yq/8Cg4ePIhjx45hcHAQhw4dQq/X\nQ6/Xw+Tk5Pz3Dh8+jPPOO8/oGan7mv07Jft5em36edH3i8jfo+iZ+XtXBSOLjj/M/1z2PiaYvEv2\n57xd+WurGo3PkbDsvcrKuOp6jWM0y9pE0yB6XT3nqSuLovtVtf0mNIqJ7NmzB7t27QIATExM4N13\n38XVV1+NF198EQDw0ksv4eKLL8aqVatw8OBBTE1NYXp6GuPj41izZo3xc6pWRUIjP1oyjZqa+N7F\nK4mLeE6IbbsvaVALv/jFL/Dnf/7nmJqawuzsLG677Tace+65uOuuu/DRRx/hzDPPxLZt23Daaadh\ndHQUu3btQl9fH0ZGRnDVVVeZGVaglCmuPZEyOyQ8EVP7yuw1+Z2JJ1J0bdl3XNLWE7H17lx5LFKe\niM0z8s8pep4PT6SRiPjAphBci0jejiYiUnQvTRFhmc7UYVqnzCIi9VxWEaHOWE3xrXM2bmqTe6b/\ndr1SUCc6TGLRFNu2YbtSUvU702eTjtNiUItIflRnw1cEXhrW8mSjbbymybUhQi0iWqOl6fKX7agm\nuawmRegN3KX9dZ6IDaGXcxXUIpJ2PO3pjM2SW9092bwrBiFjpc4TYalDbahFRDp2IE3T+XUqjL7e\nS0OIfeHyvSQ8ka6WexZqEdGqABcNJC+IrMLIikZ5ScREFkI9U4uIVgUULYmZXFdF0T0WwijlGvaY\nyEKoY2oRYccmC3EhjEiu8S3E0RMxg1pEyuIGriumLnFIwg6NESrkUZGtvKIncgpqEanKEGXAxq6i\nLNKFMEpJ4ULImz7T9X
NDg1pEfK9ipLiIibhe4m0qaKFQZ7PN9gGJZ5queC0EsaEWEfZYQtsGGWJn\nlqZtXMll0pdEnkhqX5frmlpEtPIbTGMiNqSNjn2KFhqag0yTJf4uQi0iaafznS7ucpdn1Q7btvc2\npclzuyJ6WoNSV8qviMZnrEbsYBuJQmrUVWWnETOzhd2+tlB7Iiz7TCQ6XNEUKaSOzABbeWm3Sxao\nRURjWQ9wFxPJHgaj0SHavIerMm+7qmJrV51XY4NNrojkjmA2qEUkxM5WRpEwxZHMDNZOJrnSEzLU\nIiK1rNcGF1MZqfuawixYNiO0dlto+mzWspeCWkS01thNz560HYlMk9gWEm07mOYGvK6LgynUIpKN\nI2hR9fy2ae8RbiGtm4Iw2+4TahFJ0T7ZrO11+WtTYYpiUo1NRqgLpDyRrosNtYiwF36oBzUzIeHl\n2SCVZRoPaj4FtYjkU8V94SImkv9edIfnaLv0KbnEa/N826lsl48OoBaRtPBd5G1UUTT1KLPPloWw\nIUsCjcEj/zypmEjdu4QeL6MWEfZzN9omb/l6r1D3bph4hOwxEZPkwvygFRrUIlI2arsu6K65nhKC\npbnMnv4sccqdZExEyhMxfSYr1CKi5YlknycVE8mORnFKUw/D6lVcnTGDWkRSWCsh1PMkWMszi+kU\nzOW7SOaJhFDmTaEWESkXtslz62ywtaMoeKYlLE2eq70BT+N5UqszTAOIC6hFhL3wY0Nyh+l0RjMm\nEnNF5qAWkS7v4g11xYSNEDbgdb2eqUWEQb2lksJMc098wRC4ZKPrnd0V1CKitcfENCbS9J5FP7eh\nLhkuRK8nb3OZmLvMWJW6V9ezk6lFJK0k33kiLkSrKBvSlziWHUMQUsN2Jea+7tVlr49aRLoEwxSm\nCOZpjYtdvBqeSNehFhGG0dJFTKTMw9JAu4wZyqAMLa8lNKhFpGyUZMkTaXrPyClMN6Y1+X7Teza9\n90KFWkS0Op7rQ4kA3umNb9rWsdZ0Jg4Kp6AWEe3MzjrapD2zvJN2TISlHGyx3TcT6nuaQC0iWvN1\nF4cS5fNEJN+t7QlrmntUTLfIt7mm6fPq0t4Z6o8BahFhOAFM6lAiTU9EO3gqhXSAu821th5cW7Fk\nhlpEgOJOzJIn0rZBauaJNLmHC6o2N5oIr21bkPQeoicyB7WIaJ274WJPRL7RaTcc7ViIFFoZqzbl\nVxdED70eqEUkrSjtwJ90TIRhmsZCqG6+rSfS5bru1zagirLItuvGZdrBmzYihgbFYAPQPk/Edqlc\namk9HgNwCnpPpAifHUDqUCLNmEio+NjD1BStWAwj1CKilZDlKibChPY0MaXt6WEuM1alnmvTjtna\niQnUIqK1xyQfv3AREwF0A8YhLPsyiFwZ7GXnE2oRSfEdEzHF1hPRHv1Zys0Uk6Q/rbR3m8B41+Mn\n1CLCkLEKyFZsvmNokV35Ym24+SxftrhS3IczB7WIlLn9IRc6W4dlntaw2pWiEV9hhFpEtAKrrp6Z\n3zOj1Umk9+64wsWhRJLvHHrnl4JaRFgqqcsNT9uetnkimtisHrG/SxuoRSQdLX03dNcxEe0jDprE\nQrqyi1ezzLsKtYgwLPGGdG8bQpjSMNsWOQW1iJTB0hFtyK80ZP+OFNOVmEjX65laRFgyViWSzeru\n5RvJXaht7ah6bpvv29JmV3YVXd9sSS0iLB6Hqz08LO+nSai7eG1F2JQQxYZaRMoKNMRDiUJsHCGg\nvUxuQp3gsCQgNoVaRLRXMeqwacDZhiQdMG5TPswZqy7QOJTI5F4hQy0iLGnvEjERtvNEUrRXaNp2\nIK0OaFNuddeytg1TqEUkhXWJ19YTafqcLtO2brU6na0nYno8YohtglpEWApU4lAilwlsEf/YxkS6\nDLWIlB2P6Ou5UtcBxWeUsASIQyaUDNQQpymmUIsIS9BPOiaSfpfh3Vjs8IXNknLbU9eq
7t0lqEVE\nMsnLBh8xEZaRKZTAqqtcnSbPTH/XZWGwgVpEWM4TcRUTYRESEzQ24Pl4ftNn2todUl3bQi0iXYuJ\nZBtUlxuVDb6XeF2sqEg/NzSoRQTQUXAXFV70Hpo5Dl1BcwOei3T2EOuGWkTK5p2snkkVTCNRE1uY\n7O8yIZYztYhozSdd5HS4jIk0uU/2wCfNhtt2p26Ina5rUItI2d6ZEF0+1neI8RkuQqwLahHR8kRc\nxUSyoqjtAbDQ9uAhzXwNqSBt3MXrEImlVQkk8lWKrg1x1JGmbeq4y8zROvGSCpaGKBxZqEWEZYlX\nItFJUzCYpyymIzTbKp2NJxm6SNRBLSJAcQfwuedE6uQtzeQy5qlT28CqyzKVnEZ1GWoRYW34KU09\nkezKSKQeTfGt+p3UeSKhQy0iDNOZqsbSJibCsqzKLGQmJ9u5tF/KE2EuYwmoRYQh2azKY2gTE9Ec\nnUynayxI2iiZ9m6zUdPFpk4WqEWEoZNJXFd0bddHJwlM6t9l2rukJ9LlA4yoRcTX4T1NaZqjYOKm\nu8Y22Mq2OmJzjYvnmwqDSV2ztm9TqEVEK/joIibCBrM452HcqCjpAYXcjgByEQGKhcTnEm8VbV1j\nX52YWTBMRui0DUiUoUZMxPa5oUEtIiz5DS6yJX12aolydFUPNsuk2kmH+d+xCrNv+rUNqEJrKdLH\noUTa4lh2apyWHWW/00wZl1pa1kpV8AW9J8KAREzEd9ZtHQt1JNXYO8PSjl1BLSJlI7brSnERE4kb\n8IppWwahxES6DLWIMCyFuny+9nt1AZe7eH3eK2SoRUSrkrraOIreS3tEjUIaPtQiAvAmOdmSPYpQ\nq+Nkn5vaoR0bYZ7OSNJlsaQWEe1RMkWik+VXQySPBmAoo6b4PpNDax9Ol6EWkXSU9D1Smh5KZHvP\nbKDY5ztpexsSdDFruCtQi0iZJ+K7A5bRNtLvqwOweHRF+J7OSN07dFGWhFpEtEZQ12nvsQGGgdSh\nREC365xaRBhGT1eHEvmgzG6mBl11VotJpif7oURdz1YFyEWkrAJcV0jea5A+lEh7etHk2SGv4GSJ\nyWbyUItIGSy7eG0bJFOjs50qamzAM3mmy0OJ6u7D5NFpQi0iUrtnXRHKUQDMsNSlLaHa7QJqEdGa\n07s4lCh7bRQPM2xODvP9/FiHpzASkddffx1XXHEFnnnmGQDAz372M1x//fW47rrrsHnzZhw/fhwA\nsGfPHnzjG9/Apk2b8P3vfx8AMDs7izvuuAPXXnstRkZG8NZbbxkbV5ZTwTIK2HoiprGWhYRvb06q\nzEM4XtIXtSLy4Ycf4t5778XatWvnP/t//+//4brrrsPf//3f4wtf+AJ2796NDz/8EN/73vfwd3/3\nd3j66afx1FNP4b333sO//durpn5gAAAgAElEQVS/YenSpfiHf/gH3Hzzzdi5c6excWWBVZ8xkarO\n3iZIFwVkDuYVrrpT1xhiSgzUisjAwAAef/xx9Hq9+c/279+Pyy+/HABw2WWXYWxsDAcOHMDKlSsx\nNDSEwcFBrF69GuPj4xgbG8OGDRsAAOvWrcP4+LixcewF32Z1RjIw19VRzqT+tXbxRk/kFLUi0t/f\nj8HBwY99NjMzg4GBAQDA8uXLMTExgcnJSQwPD89fMzw8/InPFy1ahL6+vvnpTx1aLr/rmIjtd6Xp\ncoOuQ6rj2wwCXS/v1scj2gY/bQve5D5Vc2Wb4FjdnLvq+rp7ldkjmfUotQLEspJkWoY+bTD9va8y\nbNs/6u5jQiMRWbJkCY4dO4bBwUEcOnQIvV4PvV4Pk5OT89ccPnwY5513Hnq9HiYmJnDOOedgdnYW\nSZLMezFV1M1Hsx5CUQA2/X1Voljey8h+p+peRQVetEu36Ofs8+ves8jest8VPbPsuXUp22XfcYFN\nMl9dnbZ9XhvKgv+SGatV7S5/Xfb5Jm2gza7yRku8
69atw4svvggAeOmll3DxxRdj1apVOHjwIKam\npjA9PY3x8XGsWbMG69evx+joKABg7969uPDCC42fo+UG+hgxtOM92bNNWLNnXdgleU/TdqJd167p\nS2pK4rXXXsN9992Ht99+G/39/Tj99NPxwAMPYMuWLfjoo49w5plnYtu2bTjttNMwOjqKXbt2oa+v\nDyMjI7jqqqtw8uRJbN26FW+++SYGBgawfft2nHHGGfWGVSi5a0+kyBZJT6TJ6CTtieS/W/QuJs92\niYknYmub5LtUtYkqT6DN8/IweCK1IqJFfmpRVljptdmf898vwnQ6UydW2e/W2WRiVxltRCT/LlVl\nmb2PybNdU9dxNG3LUiciEnayigh1xqrpXNklLufQvt6jaspiagdDYLXoHdhXuRgEzjXUIiIZlGry\nXEmYlnhDgb2MbILiXYZaRACdCnAdvdfuHKE0ahcekuS7x8DqHNQiUhWD8GlD2fPbJJtpzuWbPNfl\nEm+bZ9ra5TLI6ftaFuj/L16NeXC2IqvyKkJrHAw22OBqqV2i/bQZQKSuZYHeEykK/LF0hraNg+U9\nWDEpX81cotAGEVdQi0g6avhWZxcBs7x3o0Foo5zUEmST59Vh2y5DK3sbqEWkzBPRsKHsdzb3afP9\nNlSVIXPjrkuKy1/jE+12yQS1iGi6qtl/S+SrsDY4bbuqRCC/xM8keNrbBZigFhFAp7JMn2friRSt\n0EjY1KR8ujKShhIT6TLUIsKu9DZ5Akx5IqGQLaeyMtNcJmcO/PqEWkS01N40oBdKTCRPiA1bux3Y\n/C7LQhgwqEWEIdmsii7ERJgxEXP2la6FUO/UIsKwxFsVWI15It3GJOjb5h5dgVpEypZ4WSomtDyR\n0ESLeYk3eiKnoBaRlC7u4i37LFJO9OY4oRaRqn0rLsmvCkg83/U7VB1CEyo253W42mfTlq7URRXU\nIqIhIHkkN2zFZd7maJSXxDPTOpdY6WGFWkS6FDsoEpDQG49rXAiH5D01kwWZoBaRLqUWMwlH0wxX\nbRhsyGITXGWzXRJqEdEqfF8xEdZ0funvVmHi5nfJI617TohiQy0iDJ6Ii0NsUnEMscFoIBkbk5qC\n2FB3H+023hZqEQGKK53lXIkm7qzJjlTp92MQ4zJMYgX5snP1PBtsc4RcbOpkgVpEtJYrXSQSFXXk\nEBuMNKGuWth4R12vZ2oRYSh8qQ14rEc8asNQx2WEKnC+oRYRSTe2yXPTf0scSlS0rT02xPanvbuk\ny8uyklCLCHtFNfFEtIQxIovNVLvru72pRaRL27yLTjWL54lUw2yn7Q7uLk+NqEWEIe3dRUxEO/+B\niTZxJdfUdXxTe+quMznBjRlqEUnxfRSAVOMoou2hw206UpuplKsObBMTYepgNsvmzEvsElCLiFb8\nwOWhRBqb8NJRkzFjtQrfO3NtREsyWTB0j5RaRCRWRVzS1I4207QmnbnrI6EGpmUqdQ0z1CLCsGfC\nVZ5I2WeRORjORZGKiXQdahFJ1b4LxyOm79E2JhLhgDkg7BtqESkrfF+nhKXPctHhGUZagHuqo935\npDbOsZavFNQi0qVRm3WJN7rlzbFZxetyGVOLSArrEq8NLnakdh32MpLyREIXGGoRKVua9Dmd6dKh\nREwekAl5wWVKNrO9T5f34VCLSFlg1ScuDiXy/U4S7rQre5vk2vhC8lCiUIS7CdQiwrDE6/KeTZ7T\nNEuWdbSrWj5ntbkJXXqXPNQikl8WzX4eGmmn0AgWF00LbTupdplLegW+6bIXApCLSNlUgiUm0iaw\npr3Ey7JiYFIGWrZK7ryNu3iVSDuwZrKZ1KFE2fdgCHCyTBeYO5CvYChDPbSBWkTKcN3wTO/f5NyR\ntku8C3HvjKTtkisu0u0kvW9oUItI6gX4jon4OJTIByEIRwg2FuHK7hDLg1pEyvIDfKq1iw14ZQHj\nhYjvkVfKKzCt
v4WQXEgtImUuI0vGahNPRFNAGBu0VsKdCVIxkbpnhjiFyUItIvllUQ2kDiViOAqg\nyF6G8mVF6iiAumQzJlFvArWIpBXlezrjMiaSXZlhaDzaS70MZZBiswooKb5FgfeQoBYRCQ9AwgYX\nMRGfaAuFBFKp45IxEalpbz4vKTSoRSSFdd7cplE37RRdS3s3oWwjpmt8eSKhQy0iZUlZPqczUjER\nzUbXBU8E4NrFK7m8H3rdUIsIQ2C1CtuMVeDjWbhNE8eafKcLZSjxDpKrMxorPYz0axtQhVYcwcXI\nwLSJMPSRjwXb+EroYlEGvSeS/VvLBolOx7DEm9KkPKPwfBKb2FlVmcfVGYdIrIo0wTQm0uSemku8\nITZQabQ2zpnmiYTorVCLCPNc3haGDtymLOM5HJ/EJtmsK+24CGoR0cJlTCQbLGboJMwN3IVdXc4c\n1YJaRLSWJvPupYtplc/3qipH5uVfF0v7vjfgLQSoRYRhxHZxKFGKrwAns7fBjMSybNlO9C5BLSJl\no2RohxLl791mFFtIYuBiWbxJbk/Z73xkNocAtYiwj6Btk420GlfXG7UUUmnv7O24LdQi0qWYSNH8\nXqthhZYnorldoOp3kl6NyXWsUIsIwy5eqZhIXphCGplc2su0qc7mWlf5JiG1ixRqEUlhTcpq4olk\nA20hjjrSmIzQjNM+W08yRHEwhVpEqpYlXeLyUCLbZ0iS34sUSsOWtFNrA16XoRaRsobus/G7iInk\nvRLXFK0IhegJuc4TsclLCa3sXEItIloN3eXSXVsXfSHliWSnfhr2S63OdF1wqEWkLFHHd7anq0OJ\nmgqCBq7K3KYjsuWJmBKigNtALSK+3f4UFyNMUTyia4LQBN+2SK7OMJWjJtQiwuCGS8VEtN8jD0PZ\npnawIrU603WxoRYRQCcu4iImorHEayJ+tntAfOGqfKTuaeOJ1JVd6CJDLSKp2vteHs2vYrh4nrYn\nkHZS7QYc6nTGhq7vs6EWEYbAqhTMbrsmvrJCtSgbCLPEjFWHlCm464J27UazeAEMmK6ASAUyq+7h\nsl11ua6pRSStVE11lmq8IY4wWbQ7gdS0ss2myab3qrM9bsBzCEPau3R+QjbAyrCbVjs2YzOdYY6f\ntLlPnM44RmM93sUGvOx7pP9mSDYztcNV4/adtCXVlmyDpSF6GKZQi0ja2VjVWSr70TdMtlRlAzPZ\nmcfGg2Ntv1JQi0gqIJpurFSyWf47mtMIpkbNPEWpC/raeCIxJqIEw2hU1QDauOLa78UC8xKvlG3a\n7+EaahGR9AAkntvGjvw9mWM9PvG93V4yJiLl1cTAqkMkPACXhLaTk8GGrmA7CNicYxIa1CJStovX\nZ9q7VEwkG99J/4TUqTUauvYU0JeXFFI7KIJaRIDiTsyi3E1GouzfDO9hGuB11dB9xxYkcz9C7/xS\nUIuIVkcril8UYeuJMIhGStoJ2OzKoh0TkZqCsJavFNQiolX4Ls6JKJoiMezi1UZrL0tbJI8CCB1q\nEWHYOwPIrBIxdNgUJluqcGGn5PZ+zWkgE9QiUlZRLEu8TTyRfOq7BulUhr2BM9tnW3++l7J9Qi0i\nZeeJaNkhRdmqk2uKjiPQxvfSp8aqitRuYFaoRUSrs7l4HnuHrYNhiZeNGBOZg1pEgOIRk6FD2pLf\nM6PhCYRYbild74ghQy0iZXN33zER1ydquaYouc02JqLRiX0HVrXqKGRxB8hFJJvdmf/cJS5Eqyiw\n2uS+Tc8gKcr+ZG+8WZsl97xIISXCoXtZ1CLCsILAlt6sXR7SmKxaSL6z5L2kVvHYxbwOahHpkntZ\ntDKi2XgYBLoOF6tzGoNC3XXs9VAHtYgwHNrTxUOJgDCmMy6QTDZjnGJpQC0iDLtdXR1KpEnWbia7\nytCYgtRha1Oo6f0mUIsIUOwJsGSstkl79+EJlG2lz656ucjOtcGkDLXiUnXxGv
bVLV9Qi0hZI2dx\n/9p6IiEl0bmylaUui6hbUbERYG2xdomRiLz++uu44oor8MwzzwAAtmzZgt/93d/F9ddfj+uvvx6v\nvPIKAGDPnj34xje+gU2bNuH73/8+AGB2dhZ33HEHrr32WoyMjOCtt94yNo5hF6/koUTp31rxkBAb\naIrUXiP2/S4heiz9dRd8+OGHuPfee7F27dqPff6d73wHl1122ceu+973vofdu3fjtNNOwx/+4R9i\nw4YN2Lt3L5YuXYqdO3fi3//937Fz505897vfNTJO4hyPtqSdvq1HVGSzr3gPw6a/tlTVgw02+RoS\nnoiU3czUeiIDAwN4/PHH0ev1Kq87cOAAVq5ciaGhIQwODmL16tUYHx/H2NgYNmzYAABYt24dxsfH\njY3Ljt4+cRkTabtsKZVsxoLUMqkLYkzEjFoR6e/vx+Dg4Cc+f+aZZ3DDDTfg29/+No4cOYLJyUkM\nDw/P/354eBgTExMf+3zRokXo6+vD8ePHjYwrO0/Ed0CyDFtPRGJK0+XGWAXbBkDJ1ZnQqZ3OFPF7\nv/d7WLZsGc4991w89thjePjhh3H++ed/7JqyQnOR5Vc11bCZ15Z9p829yu7hMoux7hmhZlC2fS+f\nFLVP10mMdc+v+l4b2xqtzqxduxbnnnsuAOCrX/0qXn/9dfR6PUxOTs5fc/jwYfR6PfR6PUxMTACY\nC7ImSYKBgYHaZ+S9kPxmvLrNZEXfq7pH0TNNnlP2u6Kfq75XZFuTP3XvXlSWJjZp/TGtU1O7Xb1f\nVXuQem6T9lN0TVF7LLu/CY1E5Pbbb59fZdm/fz/OPvtsrFq1CgcPHsTU1BSmp6cxPj6ONWvWYP36\n9RgdHQUA7N27FxdeeKHxc/r69A9qrpr72hR80Xu0qbg2NClTplHeF3We50IskyL6kpqSeO2113Df\nfffh7bffRn9/P04//XSMjIzgsccew6c+9SksWbIE27Ztw/LlyzE6Oopdu3ahr68PIyMjuOqqq3Dy\n5Els3boVb775JgYGBrB9+3acccYZ9YblBKSow1UFK9Pvl3XUoqh59jtV96oSBJMAqqltTcjaX1Zm\ndeKsJW5FmNapaZm5WhErS4isqw8b+0wHonxbzV9T1cabUCsiWlQVeL4QQhCRInvadoi69+qKiOTr\npcsiYvOM7HPy1/kUEeqM1bKAlOtGLlW4Zfdk0u2q+TYbEuVWNzjZYLs611WoRURKwdsg9fyy0cAH\nVfN37bm9yYpX3rtjgc0eLahFpMz1dt3ofbi7ZZ81uU8dIXkbZWgF2SP1UItIl0g7cttRNaQVHROq\n3sckfhQC2t6ea6hFhGGJ10VMpGyt3hddbtB1aLx7FzzBKqhFRGv0cTECFsVEtMjawtzAF7LYhQS1\niDAEVqtokmyW9UK0OonvGFNTXNR72wTB7O9Myo217UpCLSJlgVWfS7wS1wEcnkhRPMbUDo3OoC1u\ndfEaKUEKHWoR0Rqt8+5+mQ1tOhbzNMInzElvEp5I3X3aXMsCtYhUZSdq22BrR1Gj891gtDtlW5jK\ny6YsbQaMEOuIWkRS8gXLMp2xbUjZ+/sWwqIpIUODtelcvu2t80RcTGeiJyKMhAfgygZbO4r2+/jq\nFEXPYsldMMlYlZzWSqWq29Zd9ESU0GrkpkHQpp5IxIyqDY8+kIqJdB1qEdFuRHU0DZilIyvre/nE\nt7hKCb+t3V2ua2oRSTua9ly47PmheiJpmTLYxNy5JGMiptcyl0cZ1CKi5Ym4qPCiZDOtTswgHiFQ\n54m4aJch1g21iJRtVtP2TJrYwZCsFeIoB3DuMXLlIYdYR9QiAlQfi6hNGzs0YiJFZWnqEbmw1aYj\n+hAS20HBRRJZ9ESEYdjFm9oheU/fcZ6yRDcWMS4jX2Z110gjWUdNthqEArWIlO2dcU2Io0EkogW1\niHRpF28kktK1dkMtIiy7eLvmfkbMiPVuBr
WIsCg2ix0Rv8R6N4NaRLSSzVzkieT3zfikaAUm/Uwz\nwFoX72LPWI4iMwe1iJTB2qgikTwLoa1Siwi70rPbF3FHl/M+bKEWEXYVZ7cv4o4u74WxhVpEWDaJ\nlcFsW8Qt0RM5BbWIpEE/VjVntSvinuiJnIJaRCS24LuExY6IfxaCOJhCLSIA95SmyQaspo1voTVa\nUw9Usm00XbKvu67rUIsI81QGaH8UgGYDYy9bdlwd1Bwi1CKilWxkmvbe5lAilkbFPFJq21aXCGdS\nh1oJhj6hFhGtY/xMD0Fq+n+PtJ3aSKJtA8sBU0XUnfTfZWGwgVpE2CupzaE07O/mC5P/MkISqXLX\nOuuGEWoR0RqxXeQAsMVEsjYw2OELyf+/ZiGVWxXUIlJ2nohrUTF9XhM7sscbaPyHTEXfZR1Rs/XA\ncEZtliggp6AWEaDYbQyxAjXfIZ+0xyoaWfL1HoLNCxVqESk7CsB1g3LR4aWmM03ePZ2yZP+7iib3\ncIF20LzNtaZ1UXdd6AJJLSJsy6FtyE8bfL6T5rPrMJ0yhOh9ZmEqc2moRaRsjd11g5L2fPLBy5Cm\nFV1D8r95kPrvLkIXSGoRKVs5CK3zZT2BNlOKpt+rOtlMm7pcjKJ/+0JqOtN1qEVEq5JMYyK2nTDv\njTB0YmbaCq5LbOuP8R2koBYRrYJ3dcZq/mfNs01t0BZzyedrJJvZBFZD9G6oRQQoLlQWVW+yASvG\nQz5OqLkYkp5k6AFkahHJJmZp22HzeRGsGavMaNd7FTbtsuv1TC0iZRXFkifSdCs4S1CTAeYNeBL3\nMXk/ZrE0gVpEylxG30u8ZbT1REJvPBL4zhPRuE/d1oLQBxRqESnbO8NCE09EI4EulP0xeXwHHLUO\nGYqBVYew7OKV3ICXfq/plCbERlaFb09Eqvxs6s8m2Yx1wKyCWkTK8gR8d6S2MRG2pLlQhIjZTlvb\nTD2uEKEWEUAnn0I6JpL1qNp6VxIZq6F4QEVbBSTv2fY+bDZpQS0iZXtnWGg7h9Z6L6aRz/cIrXUo\nEVOZS0MtIlo5Ii5iIkxCyHQUQKirFrZtM05nlChz/X0v8UrniVR91oSm0xOGxst8nkgVrjJWQ4Ra\nRNiTspqMRCybytKy1S5j7ZW3pkh6yQxi3gZqESkbLX1nrErfkyn/hcUjqYKhnFwS+vtRi4jUFvy2\nz5XoZBpLvNnENlahYJrOxCX3ZlCLCFA8pQm90IHmU7Um7y4xZXFV5r43sWkdKxBqANkEahFJR1DN\nebOLaUeb92E/Y8UVMSeDF2oR6VKF54810JxiMHlyoe7itX1el9pyHmoR0aoAqT0RVddqNiqmvRpM\ngpanyjbbQYD5PdtCLSLpXF6zAqqe37QRsa3ORIqpqh/bumParSwNtYiUFajrzmcaE2mT+amRZ8DY\nQH0f2iMVDLUZBOrsZ/IMm0AtIlojtq9DiXyT39DGICoMNpRR54nE6cwc1CJSBkuFNB3V0g7sS1iK\nVrm0M1Vt0ZjWSnkioZW1LdQiUlZRPitEKiaSTfzSoo0nwiDcLHum0t8xlAkD1CKSVhRrnkgTT6Rt\nTEQr2Ux7JNVuB0W/k4yvmFzHCrWIpBXlO2PVNNBl64lkhaTpO2h3Zp+42L0tFceyqcMYWFVEyxMx\npe1IpNVgmMqTNWgOxJiIKdQiopVLkW88LmIimnNqpgbNvKckrs6YQS0iZRXFMnq1WZ2JzKEtFFVI\nxkQYdw9LQS0iKSyikafN/JolY5UZ1+e61OHLEwm9HVCLSJc2qGWnMSHOkUMfLbWJnogSLJ1NopKL\n9gGF1Hhc1YPv80R8Y2J3qO+WQi0i2W3zPvFRqU3fKSThaUs+OY6N0Du/FNQiwrLb1VV+QpsNfAuJ\nLryzabJZiFCLSJkn4rrQXdw/K4jZv7XRnjKyBs1NkAzShky/tgFVaBWui9gFg2CkMMVlGDzNpoRq\ntzTUnk
iKdkOXgGmJdyE3fsm25OI8mBDbOrWIlKW8+zyUqOp5Tewo2w/kG1sbXDVu7XKowtcGPOYy\nMIFaRBg6WxVtk42kOmZTMQvtUCXfSCabxTwRJfLb531hs7GqyT0ZNhWyC3Qehj1U+d9FT2QOahFh\nWcGQ2IBX1Oh8NR4G0SqDOQdIwxMJUVCoRURrKdRFTKRIQFwfSpRPtQ+JvPC5CGK2vY9UmYZWN3mo\nRaSso/k8lMhlklBMNvNP7PjyUItICmuF2c6Jsysz2tML5ilOSv4MFiZsy4/xHaSgFhGG6UzV89vM\niTWFhKlBsw4QgFwwlKm8XUAtIgwNrGruu5Aakiv7mfeUSCzLhpyRawq1iJQt8bquFBeHEuW/w5Kx\nahp0dWWr7zKQzM2RWukJHWoRSV1+3xvwTOlCI2KOjbgoM6l72pYZaxlLQC0iZaO1z7R3qZhIfrlS\nMibSxQaaLR/JAKvN9KkuJuIiTyTEuqQWEcBtungZPkZAn9MZZm/DxC7JzGXJMpea9uanl6FBLSJp\ngWoWrFRn1/Q6Qkw2Az7ZuZizW+vuwxxAbgu1iGgFH12kIacdOSuMTQVBA+2GzrjKIbVCY5rcyAq1\niLAkG0nPxbU7A8txBHVo13sdNhs1Td+FvU6KoBYRrUbUtZhIkS0sRwGYjtCMsIucL6hFJG3s2o1J\nOibC8E4smMQKulBWcXVGibKlUJ9LvFLkp2YaKyb5KVUonVOynCTv5WIzXyh1koVaRFKKpgI+n9f2\nOqD4eAHfDSYfwGMY9XxPZySTzVysuDHUiS1Gp73v2LEDr776Kk6cOIGbbroJK1euxJ133omTJ09i\nxYoVuP/++zEwMIA9e/bgqaeewqJFi3DNNddg06ZNmJ2dxZYtW/DOO+9g8eLF2LZtG8466ywj47Kj\nt6ZaS23A03LPQ50WuOhQUlNJySXn0D0RJDWMjY0lN954Y5IkSXLkyJHk0ksvTbZs2ZK88MILSZIk\nyc6dO5Nnn302mZ6eTjZu3JhMTU0lMzMzyZVXXpkcPXo0ef7555N77rknSZIk2bdvX7J58+a6RybJ\nXO187N9Ff/K/L/p+1XfLvpP/LH+vKhurfq66rsi2Mnurrq8ru7L3LHrnKnt8/amqhyY2Vl1n8575\na8vKULLsTOqqrD3UtYG6NlFF7XTmggsuwIMPPggAWLp0KWZmZrB//35cfvnlAIDLLrsMY2NjOHDg\nAFauXImhoSEMDg5i9erVGB8fx9jYGDZs2AAAWLduHcbHx+seOU/Z2Rv5n7WwsUPT5qTA9Q4hJsJS\nz0XYTnmZ36UttSKyePFiLFmyBACwe/duXHLJJZiZmcHAwAAAYPny5ZiYmMDk5CSGh4fnvzc8PPyJ\nzxctWoS+vj4cP37cyDitgjdN/rHdgJefnjXpxE2/UxRXYmjYpuXLYGsWSXvY3s0W4/8B7+WXX8bu\n3bvx5JNPYuPGjfOflxWA7edl1+X/LrpP0T3LvldlS92zbGyw/bnOtrrvlNlie5+m32lD0zbR9j5t\n7tHkPlLPNfmebRtoU79GIrJv3z488sgjeOKJJzA0NIQlS5bg2LFjGBwcxKFDh9Dr9dDr9TA5OTn/\nncOHD+O8885Dr9fDxMQEzjnnHMzOziJJknkvpoqqHJH08/TFi1zGuhyT/D2y36mzJfvcMhvqfs7e\nu8y2us+KfldWNmX3KXtf02e3pe698nbVtYm2z7PZVFdVji7K0KSu8s+qaksmbd+E2unMBx98gB07\nduDRRx/FsmXLAMzFNl588UUAwEsvvYSLL74Yq1atwsGDBzE1NYXp6WmMj49jzZo1WL9+PUZHRwEA\ne/fuxYUXXmhloGnBSSK1O7PqWu2YBMt0pgrmJV7b+7DHn9pQ64m88MILOHr0KL71rW/Nf7Z9+3Zs\n3boVzz33HM4880z8/u//Pk477TTccccd+OY3v4m+vj7ceuutGBoawte/
/nX8x3/8B6699loMDAxg\n+/btxsZVqSkDIe+JWAgBP5eUeXpl19ZlrLK39Sr6EtJWlHVdi1xHX9OZuudkr8v+ruznIjvLbKv7\nzOS9TFzZsnexfXYTTDpY1q6205k2tlRdXzWdkSq/YKczmpQ1Gp9KXdUAmmSssow06XTKdFqlPdZI\nJXfZiGfT+xTd10VQlQXj1RkNtDqcaYO1bUhVP7ukyvUOpQFL2ckYE2EYVNpA74kAsstRmuTdTNfv\nkT5LIojL0NAZbGiCzZJ6iG2bWkRsglcunps+W9qN1novRkIuA6mckmwZhFge1CKihY+lRe3lXRZ8\n2yL5vBA7vAuoRYRhGdJlZJ2pM2sRckdsExOTuCcL1CKSxg18j+IuKrIso1QDpo4bYqexJeR8IhOo\nRSQNCGp3OFdLi5qxnpBg7Vg2S+Os7yABtYiUrWD4WtVweU+tpWuf39O+ty2hiqw21CKSXaIs+twH\nUslm2e+k4ijVaE2WEIuWyUPpNJJTQKmEspDKzzX0yWbMz227xNdEhJoEeouuL0qF1qBpOr8Pupwg\nJgm1J1KGz+mMZEOSyAeQarxZbyiOqMVIpap3vYypRYRhibfq+U2X+BjeSyKLtevUDSA2Ky5dLmtq\nEUldWO1VDOkNeEXP8AtMjEcAAAvTSURBVEn0QMzo8qY5SahFxGSbtSZtN+BpvUfTWIwLTI8B0CDG\nRMygFhGtRmTaQFxtHXcFgw15GG1KqfNEQqt/V1CLSNkSL8t+iyZ2+NzFW/bs/GcmnSGkvBap50nF\nROruFTrUIqI1b3cRE8ley5TFyBwbye+mlr5nm2ttPBHmMpaAWkTKcN35XGyW0ozrFDVilhUD5o1p\n0RMxg1pEGJZCq7D1RCQOV2o6hSqaEjKUK4MNZUh5Il0WEIBcRFiWeCViItmlXd+ZlyyCUYTpCghb\nTKbrwmADtYiUdTbWDmFDU3Fs+p2imExISNks2flNbFoIYkMtIlIHAtmS73ASNjA1pvwIz2RbSMRy\nm4NaRNhjIjbkV2Q0phhF2bKmdriyNeS69bVRkx1qEUkbuObJZi62oGst8Zbl3dh8N3QkO6mPc2dC\ngFpEgOJCDfFQIobU/bw4MjRYBhtc4zuQ7htqEWFoYFUxEdsl3rQxZQ8mkqDJUiPzio1LJA8lisxB\nLSJlFeVzOiNxne210jAnm9nAsE0g+7soJHNQiwh7YNW2E2a9mqadWCrZLDS0p39Fv7PJtmXOzG0L\ntYiUBSB9xkSqRpxQ0p4X6tSlLVLJZq6uZYFaRFhcRomYSJY2hwI1mdNLeCLa9aCVM1SGbXlol59L\nqEVEI0U8fa7kdbbXuobJFt9IBrND8URdQy0iKdrnSrS9LnutpHcQ0WMhC3EeahFh6WSSG/A0xaNN\nw2epizwuO3PVvduUR9cEiFpEunQoUSQ8quq3a0LQBmoRiZ00wkqbttm1dk0tImUReZaTzSILl+iJ\nnIJaRFiSzbSfH9HBVUyka1CLiNZuVxeb5TQ2EpbZwNQBmFLZba5ly1vRhFpEUiTOJnUBix0R/3Q5\njd0WahFhyViNRJqyELwVahGJLmMkdBbCIEgtItETiYTOQhgEqUUkCkgkdBZCG+7XNqCKhaDikUjo\nUHsiZWgv+TaxI8Z33NDkYCiJa023ZNhOyUP0XKhFRKtATU+Xb7sVPIoK7w7tumttNlLGQ4kUYQ+s\nNjkgqOj/ftEkxEbLgPb/18MEtYiwHEokdTwio5Bo26Bdt02vjefBnIJaRLroiTT5boSPeG7tKahF\nJMZEug9zGUjFRLoOtYiUbRhjGQGa/neU0v95FUt5NMHUdrYOGw9qPgW1iKT4ngr4OKhZ8/9RaXPa\nvDShbmSzrT82EZSEWkS0jgIw9XzaLt35+s+r8s9LvaEuN2wJ4n+jaQa1iJQ1dJ8VWJUk1sSOtisz\nEgcEM3kioZ4nIhn0Z6iHNlCLiFZj
dzFPz4oR0xKvNizZx1r3AcKf6lCLCONJXFmarM5ovhPjf3MQ\nqpDG6eApqEWEYYnXVa4Ky14aF6nbzLTJ7bH9ve11oUItIgz/34tUZ89PyzQT6bLvo93AmcWpzraF\nJsBlUIuIFi46Fms8RNstZymHJoRsuyTUIqLl8rt4Zt4LcfUcW1hWaepg3wJRhU0Zh/iO1CLC8P/O\nSDVe01T6hYZNshnbSo6JPSYDRojCkYVaRFIF9532nr2/ZEzE5LMm92mD9nTGBuackrr71O0Iln6m\nT6hFhKGBSzak7N8ayWZFhDKdAWTeXfJd227A7ArUIqLVwF1E3Yu8Gy33nMEDCh2WJXoGqEUku+NV\nE6lDidp8X4I2AV1XHUa7brOYHkYF6OSbsEItIlq4Snsv+lsDpkYb6mhu4yWH+o6mUItI2S5e15Xi\n6lAill20+UBe1xt5FskNeNLxslChFhHJ3bMuYNyLYoqLw5FcIdnJJDfgsZebL6hFpKyifB9KJBET\nyQdWbb9fRV15FJUjSydgsKGMuphI6B6EFNQiwl5RTVzjttmqTTpdUTmylK3v/SdS92EWP99Qi0jZ\niO26AvMb1FxMq5ouEWpNobSPAmBb6rddNay6tshLDQlqEUkrimHELEJymc8FrOWWhdlG047flpix\n6gHWk81sV2eyafwa+S9tsmU180QYjkyw+V0emzyR6IkIw9B4qjo7uycSaUedJyIVz4meiEO08ykA\nvpiINNo2aD+/ComYSFmuU9E1oUItIhIegORz25AVRO1Gkw8ca8I6VTWBbcVIC2oRYehwUkgdBdA1\nbKYELs51aUOsvzmoRYRhiddF42WYptmi3WFYpn9ZpGJi2mXbFmoRaZuYJW1HG0JvKF3ZxcsQrLf5\nXQhQi0hKFxpafnlVI+08dCGTIE5n5KEWES0X1sUzGRpcPqDKZlOXYShrV1CLCEsDk4qJZL0QyZiI\nrX1aJ6sxoLE6wxjPkYRaRLRGSx+HzWiOTKEcAaBNXTBUqp2EXg/UIlJW+L5VXSLZLDsatUk/l8LW\nE2EWcxu0DiUKXSiqoBYRLbfbxfb0oqMAtDNx2Ru2dmxKyhOpuy70qQ61iDCM2FXPl2qQIRB6Q2+C\npCdimvYeYjuhFpEyXBe065hIOoqF2GCkYU7CkrItbsBTRKujFZ0CVkSThmSyIcsV2p2yCNapat21\nNpnMjOUuCbWIlLmMLGrdNrDG3IF8UTVV1LZX4igAk02XoYsMtYhIxCIkn9vGjmxDYthYqB3YrYMh\nzV4yVT2mvSvBsnfG1aFEWkKykGMykpvmbAabLk99qEWkSzGR/H3YPYE8ruqhqmy1O5SU92BT1yG1\niRRqESmbT7IUdNMgHXPnkPyOCSZloSW4knGM6IkoohFgc1Hh+VUZnyNt1bOYPSIXZcUYE2lzXwao\nRaTsUCLX5CvS1aFETWhiS9XztKcNplsbGDtXTHufg1pE0sbvolPb2tEWzSVe5kAqo03S1L1jzFh1\nCMMuXqnnFx1K1OY+tt/JizHzNKYI6XrwSQysKhJigZbBsMSbF8cQRz0mXMbOQoJaRLSyFvMjtnRM\nRHp3sukSdNYbWqhb2avew3baLOVdhL53pt/koh07duDVV1/FiRMncNNNN+GHP/whfvKTn2DZsmUA\ngG9+85v4yle+gj179uCpp57CokWLcM0112DTpk2YnZ3Fli1b8M4772Dx4sXYtm0bzjrrLCPj8o1e\ng/TZbTtR0TtovxfAkT3LSl3dMKTmU5DUMDY2ltx4441JkiTJkSNHkksvvTS56667kh/+8Icfu256\nejrZuHFjMjU1lczMzCRXXnllcvTo0eT5559P7rnnniRJkmTfvn3J5s2b6x6ZJHOtuvZP/tr896vu\nk70m/50iW8qeU2ZD2bVFzyyzre4zk/cqK7Mqmj5b+k9dPbQpHyn7qmxt8syqa03qqsiuujZS9LMN\n
tdOZCy64AA8++CAAYOnSpZiZmcHJkyc/cd2BAwewcuVKDA0NYXBwEKtXr8b4+DjGxsawYcMGAMC6\ndeswPj5e98h5EpKM1bbX5a/VTudnGj2r6teFnVL3TCy8EBOPxvRaRmpFZPHixViyZAkAYPfu3bjk\nkkuwePFiPPPMM7jhhhvw7W9/G0eOHMHk5CSGh4fnvzc8PIyJiYmPfb5o0SL09fXh+PHjRsblVzQ0\nqHL3bezSfIcqtFdpbDqi7zKsEzgTe0yuCVE4shjFRADg5Zdfxu7du/Hkk0/itddew7Jly3Duuefi\nsccew8MPP4zzzz//Y9e37Xj566q+V/S79DOb7xV9p+zfTe1ter+6622eYfN722e7wqTcXNhYd8+2\n9dyWps+36V91GK3O7Nu3D4888ggef/xxDA0NYe3atTj33HMBAF/96lfx+uuvo9frYXJycv47hw8f\nRq/XQ6/Xw8TEBABgdnYWSZJgYGCgscGRSISLWhH54IMPsGPHDjz66KPzqzG333473nrrLQDA/v37\ncfbZZ2PVqlU4ePAgpqamMD09jfHxcaxZswbr16/H6OgoAGDv3r248MILHb5OJBLxTV9S48c899xz\neOihh/DFL35x/rOrr74azzzzDD71qU9hyZIl2LZtG5YvX47R0VHs2rULfX19GBkZwVVXXYWTJ09i\n69atePPNNzEwMIDt27fjjDPOcP5ikUjED7UiEolEIlVQZ6xGIhF+oohEIpFWRBGJRCKtiCISiURa\nEUUkEom0IopIJBJpRRSRSCTSiigikUikFf8fhgHagJ0vCjgAAAAASUVORK5CYII=\n",
            "text/plain": [
              "<matplotlib.figure.Figure at 0x7f86604ad5c0>"
            ]
          },
          "metadata": {
            "tags": []
          }
        }
      ]
    },
    {
      "metadata": {
        "id": "GGGFATfmtr8l",
        "colab_type": "text"
      },
      "cell_type": "markdown",
      "source": [
        "## Visualize neural representation across this rollout\n",
        "\n",
        "Activations of the 512 fully-connected neurons, plotted for each observation in the rollout."
      ]
    },
    {
      "metadata": {
        "id": "Puo55z-Us0mU",
        "colab_type": "code",
        "colab": {
          "base_uri": "https://localhost:8080/",
          "height": 984
        },
        "outputId": "0d42e3ba-bfa8-4a28-b1dc-94ad7a044057"
      },
      "cell_type": "code",
      "source": [
        "hot()  # switch to the 'hot' colormap (pylab inline)\n",
        "matshow(rep, vmax=3, vmin=0)  # one column per neuron, one row per observation"
      ],
      "execution_count": 63,
      "outputs": [
        {
          "output_type": "execute_result",
          "data": {
            "text/plain": [
              "<matplotlib.image.AxesImage at 0x7f86601c9cc0>"
            ]
          },
          "metadata": {
            "tags": []
          },
          "execution_count": 63
        },
        {
          "output_type": "display_data",
          "data": {
            "text/plain": [
              "<matplotlib.figure.Figure at 0x7f86602d7518>"
            ]
          },
          "metadata": {
            "tags": []
          }
        },
        {
          "output_type": "display_data",
          "data": {
            "image/png": "iVBORw0KGgoAAAANSUhEUgAAAOcAAAOiCAYAAACLicBiAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMS4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvNQv5yAAAIABJREFUeJzsnXvgV/P9x58f8l06CqVv1BCWeyT3\n3HJXxIZcWhRjaJkhhCa3IZK7udVKlpkYMctlNEbC2iizXzRZKJXLmndFa+f3x/d7vt/zOZ/3+7zv\n73Pe53Mef/T9nPd5X159vt/XOe/L61IJwzBESUlJ7lgrawFKSkrolMpZUpJTSuUsKckppXKWlOSU\nUjlLSnJKqZwlJTmlTdYCxLnuuuvw9ttvo1Kp4LLLLsNOO+3kbOx58+Zh2LBhGDp0KAYPHoxFixbh\n4osvxpo1a9C5c2fcdNNNaGhowLRp0zBp0iSstdZaOOGEEzBw4EDjstx44434y1/+gv/+978466yz\n0LNnz8xkWblyJUaOHInPP/8c33zzDYYNG4Ztt902M3kAYNWqVTjqqKMwbNgw7L333pnJMmvWLJx3\n3nno0aMHAGDrrbfGGWecYU6eMCfMmjUr/PGPfxyGYRh+8MEH4QknnOBsbEJIOHjw4HDUqFHh5MmT\nwzAMw5EjR4bPPPNMGIZhePPNN4e//vWvQ0JIeNhhh4XLly8PV65cGR555JHhl19+aVSWmTNnhmec\ncUYYhmH4xRdfhAcccEBmsoRhGP7+978P77vvvjAMw/Djjz8ODzvssEzlCcMwHDduXHjssceGjz32\nWKayvP766+G5555bVWZSntxMa2fOnIlDDjkEALDVVlvh3//+N77++msnYzc0NOD+++9HY2NjS9ms\nWbNw8MEHAwAOPPBAzJw5E2+//TZ69uyJ9u3bo23btujduzdmz55tVJbdd98dt912GwCgQ4cOWLly\nZWayAED//v1x5plnAgAWLVqELl26ZCrP/Pnz8cEHH6Bv374Asvs9sTApT26Uc9myZdhwww1brjt2\n7IilS5c6GbtNmzZo27ZtVdnKlSvR0NAAAOjUqROWLl2KZcuWoWPHjlZlXHvttdGuXTsAwNSpU7H/\n/vtnJkuck046CSNGjMBll12WqTxjxozByJEjW66z/m4++OADnH322Tj55JPx6quvGpUnV2vOOGGO\nrApZstiU8YUXXsDUqVMxYcIEHHbYYZnKAgC/+c1v8N577+Giiy6qGsulPE888QR69eqFTTfdVGpM\nW99N9+7dMXz4cPTr1w8LFy7EqaeeijVr1hiTJzfK2djYiGXLlrVcL1myBJ07d85Mnnbt2mHVqlVo\n27YtPvvsMzQ2NlJl7NWrl/GxX3nlFdxzzz144IEH0L59+0xlmTt3Ljp16oRNNtkE2223HdasWYMg\nCDKRZ8aMGVi4cCFmzJiBxYsXo6GhIdPvpkuXLujfvz8AYLPNNsNGG22EOXPmGJMnN9PaffbZB88+\n+ywA4N1330VjYyPWW2+9zOTp06dPizzPPfcc9ttvP+y8886YM2cOli9fDkIIZs+ejd12283ouP/5\nz39w44034t5778UGG2yQqSwA8NZbb2HChAkAmpYeK1asyEyeW2+9FY899hh++9vfYuDAgRg2bFim\n3820adMwfvx4AMDSpUvx+eef49hjjzUmTyXM0fxx7NixeOutt1CpVDB69Ghsu+22TsadO3cuxowZ\ng08++QRt2rRBly5dMHbsWIwcORLffPMNunbtiuuvvx7rrLMOpk+fjvHjx6NSqWDw4ME4+uijjcry\nyCOP4I477sAWW2zRUnbDDTdg1KhRzmUBmo4tLr/8cixatAirVq3C8OHDseOOO+KSSy7JRJ6IO+64\nA926dcO+++6bmSxff/01RowY
geXLl2P16tUYPnw4tttuO2Py5Eo5S0pKWsnNtLakpKSaUjlLSnJK\nqZwlJTmlVM6Skpzi5JwzS4P2khJfsa6cb7zxBj766CM88sgjmD9/Pi677DI88sgjtoctKfEe69Pa\nLA3aS0p8xrpyZmnQXlLiM843hERsHlatWuVAkpISt/SuVKTqW19zqhi0t23bFoHkf8QUZEcgmJso\nC8PM5EmSJ1kAdXk6Alh4OxD8NKXvoUAwUW6sPH0/urJYf3PmzaCdy5zSmlGEtvwqqRwOAL/gVPrV\nlpqj+I31N2fv3r2xww474KSTTmoxaM83j2Uy6jUAfp7JyGqMB/BDjfadAdz2WXUZORIIfh8vmQ8g\nH2/BLMit4buJqckZAB6QbEP+CQSJB7aLqdLHAL4rUC9NlvMB3GJSqBTuAHAuR540yHeAY78Bno2X\n/QEI+sWuw+kIKkfErv2f1hIJdSu0hdAClUY3GhZCkA378evwuPZH+n2Iou0Gfzylj78nrr8+Ilmj\nrii0ch6v0uh3pqUQ5HwDfRxuoA9BtJWzKzArWdYjcf093UH8ptDKeahCm799xq9jhQUG+tjGQB+C\ndORXSacX0CdZ1rX6Msjqd5ETCq2cG+8s3+ZP5sUQ4/8M9LHaQB+CbK37IPgEqNmLXajZZ8EotHLi\nHvkml5mXQgwT09q1DfQhyCu6D5OjgKOSZZ9q9lkwiq2cew3IWgJxul2q38dX+l2Isl8XzQ62u7X2\nj6+dZp8Fo9jKiWlZCyDBdfpdbKDfhTDa69vv1m4INej2WSwKrpwesb3+2dx+u6i3nS9ZP3hZfSwA\nwKLjcXaybFAHzU6LRamcOSF4T78PnWwgGwuaJ43TGCMO6UorPcZQ724RnbDI7k96rZynZS1Akbha\nzHLF1FHqSdRSl/Nyc4gu9d+W7Ndr5azzY7BM2MFQP/+klt5uqPd8ImsU47VyPtP8U/tAvMQ5F1NL\neX68Y8wL4hD6/5mN18oZ8UXWApRIM2RfWuk+6Y1mjEy/n3N2+IFc/UIo579YN6bnwzuhhML6lLIz\n0re0FhxoRxRnHCBXvRDK2WkLevn5Bjw9SizxNMXymbMVbGq9mxmSobMKoZz3fkgvv8+tGHVLf6VW\nlCns+5qCZMRVohX/I9dvIZRTxeqLuuQpUUJyKdVMcX4De4tWfE2u30Iop8rZ2/+MS1G/qO2Wv15b\nJHsQmBN2F624uVy/hVDORqq1STqbmhejblELZHpFbVFnYGNNWbKg4WnBivW45pyv4Gqk4E3mPWpr\nQz7J6CJCzKfMXboAi3WFyYJugvUkXfoKoZwqaZEa8hnXTApZJ7Nn+FWUGNH8UyqQ5VaU738PP7yt\nlyQLeg0Ua3iH3DiFUE4V7s9JhDYdrs9agGYaPmr6STfJY7E+yL+TZSLxB81AmVQLEySsDIPKo2Lt\ndpQbp26V82dZC1DvNCwHOpyS2fBX6zS2tT5IULfKScLS7d4U10ruQgIAvu0HjJtsXBZRSHireuOt\n3OxY1K1yBpUVWYtQGEapxP5Z9geKUfSTqU3IzQrjMAgq6nOnPpUaN3Er1K1ylqjxMaUsUDjKwk9B\ncYSkGdzGuEB/E0/SvJWKq+NYJ2nnS4qDsS2bB4Fv10kW9jXVO5PMQp8qUIg3p5r5WIkIvW113KY3\nGk6QbHO3/zvsMhRCOYdnLUCB0YlLlMr82ZSoJOkZ3shPbAmTTwoxrXWYv6fEFDsB2D5ZeFxqk7G2\nZMkphXhzvptRZrCSamQi861cAel4XvX2ay6EcuIijTOrEmNcIFF33S2AeS9YE6UQFEM5p8ifWRXH\nm1Cc7pb6VUnRMuVD4NfGJcmG5yz1Wwzl5ByP0ZhuXorcY+uBJGXwXkD2sXRcUAzlPPIh6SYaBiLe\nIv8tiaHyR/QkAFouJEdmq2YRDoUgRyUM8+k7FUh4jQwBMMmeKCBhKCWPTWzJotqvart1APwewG
EG\n+jIhjw1oshAJdSvEm9OmYpaYoXviejWADzKQwycKoZwqXJ61AHXGAkpZlmtVHb8SAzmnhKhb5fxF\n1gIUFCKRVLetPTG46PiVbGdMinTqVjlLzEG+1/o5kMgutburv/IYf3Q/pDKFUs4hgvVsnUv5hG7u\n2zgbKC4e/+ZqfhhjLwP+XibczkQolHKKbgzt80OrYjiDhA8ot93foBxfKc4RlaL2aRLIZrClEPd0\nUzHAEKUQRymyfAiAkV6FSt6357PsV0cesgsQ/NVMXyb7MEV5lKJAuRmUD677K79OPVMI5VwkWV99\nMlhikvIhmU4hlLODwm+5j3kxSkqMUgjlxGUN0k0mmpeixADCSYHqgGIo57vfSjfpJpp8psQpb2Yt\nQI4ohnLuIL/h/OhRFuTIOeS3WUtQzU8pZcT31PIGKYZyKjA0awGyYOC5WUtQxe20QsnQJao4GkaL\nQihnXs618k5QEUtzdZplOdIIfme3/+htXRPPOocUQjlLzPKrrAWwCPVtnVPqVjnbZy2AY+LG6abx\nKUyJiSO0ww30IULdKufifFot2uP9Dta6lsvLmS2vNf8kh6VWS+VxhZhVKtStctbbOnVIZXnWIuSK\nQMM1KahJ+msHr5VzTOK6cyZS+MHUrAUokcZr5bwkcf2hYDvykmlJ6pvjsxZAAmLAZcwVXitnkqNF\nKxbE4lrG/cgm6Slv80XgKrmmAQqlnK+IVrQUZ9Q5V+Rj3fxVof6K+BziaJxCfa2rRSvOsimFQ67O\nx0ryqf9lLYE85Ab1tk8KT9H08DoSwh0AXBik5d27Pst+TcrjSySELSF2fFTXkRAixxJSJhkrccg5\njsbxWjlb7CP/Jt/W1RdcUjxc5Qn1Wjlblo73yrcdzCjfVFGWkvphqaNxvFbOFhrkl829tqGXE01R\n8owP+TDJfVlLkB+KoZwqMF6RX7iVwinf18lB4ApbeQo9pH6V8ztZC5ABv8zlxnw1f2qXtQRckhHf\nO1oapxDKqbJ1vunv6eW9NWXJM6Lf02TLcqQRVFZkOLoYf0pc25ptFUI5VVjIeHOOcytGLnkxawFy\njlg8CX28Vk4th+l36MXP6/RZEIocCcEEp4duoh57rZz/0Wm8NT1O/Ec6fVrgy6wFyBH5iV6hcLCu\ngNfKqco1AILKJtR7eYsEv6Fgva4GxwwM9mUSrYexQXZnrIvJNLPjeK2cqnlwLkjZELxTsc+s+dRg\nX2lnvbZ2Jn3yCV3FumE4epjXyrmLYrvlKRuCeX1r5AVbO5P58K8Rg2X0PvoFs+N4rZyqdEhJnptF\nQldVyNysJWgiy6OXLJjIKF9jeJy6VM7XU+zYlvRzJ4cuwY5ZS9DEKY7GISc7GojD/zHKbzE8TiGU\nU9Zj7OC0mzn5g7dN26wFUGDsw1lL0MQxjsYphHKeuYfBzvKyJWgZ5qZGjslL0uOeJrfGUyiEcmKE\nwb582jYsIGun3FvoTIp0JpjcGk+hGMqpoFDbs274lEyjgJjeVLGB4U1ZJsVQzor8PIMZCcHRlKXE\nX1yFAi2Gcs6Qn2ecnnQtiHhQT5QSPYijVAc6EFuWGAkKoZyBQjbkIOmUF+EqKGmOuTvDsQNHSYJ0\neNyRR34hlLPELM9mLYAD9tVou9iYFOnUrXK+zLpROjMWOo5SxJ812g7TCEgtQ90q564/p5fP+Jrf\nlqxjVpa8cVDWAuSdS5h7/UapW+XE1QOoxZ8JNA2E8z7khyck6vrqmSODXiDyLqbESMXrdAyq3ATg\nIon6PqRj6Ay9eKplOgZxhgCYRBmrO4AFHFnqJh2DKmdkLYAhZsQ+n5+VEHVI66pmn6py0wEdvVNO\nEwbbZxroIw/0jX2+zNGYWQamJmE+JtyRjW9Qea2qnOWtoop3yhk32O6p2MfuJgSpU1JcYR3goy+N\nOt4pZ5w5zT9lQxWOMi1IiRuuK8qCRAyvlTPi9CPl6q9Gmb
DIJK6i4n1+uaOBckIhlFPFB7OHeSnq\nFlcusC4SJeeJYijnWfJNipJ5vp5w5Q2SF4qhnIPkj2p/YUGMvONDBnDy3awlyA+FUM4NFA6df2ZB\njrwTCP6nye/sypFG8HF2Y+eNQiinh9Z0+ebqrAUoAQqinCWGyVvCmGaOyloAx3itnIwUmyWa3JvT\n9N5PZy1AMxs4Gsdr5fQxvKMPXJByz5a3HNnIUscWcBV33GvlPE6x3SEAyEiTktjDVNYzcpd43cdS\n7tla3/dZZqljBjo5ca4yJkU6dekyRtoBgUR2cx9cxrLq16XLWFvwZ0t5/12VLmMcJkkoZkl+qLdl\nTF0qZ1HSoZAwH16cF2ctgGM6OxqnLqe1suR1qvQhgC0s9Ouina2+8vq7ipeJUog3p2x+SDfhmexj\nSjGzYlzWAuScQijnsUfL1fcpQW6ROSsn+TbzSiGUE09uJt2EmSulwFyRtQBJyqdkKsVQTqwn3UIn\n4revZGn+RvOf/fZt52J4RUGU813pFvVoLG8yx7As71PKGtrVlpE6SV4sQiGUU2V3bqh5MQoDCa81\n3ueW1IFqtTNwFfPEAwqhnCWmudF4jxNohZ39tAZxtTwolbOklt7LjXfZl1K2u2N7WlP81NE4Xivn\nc1kLUFDIX92MQ1uH+sBLjsbxWjlV/ep6G5WieAxzNM4xjsYxjatNZq+VU3X3cbZRKbLDVryuqZL1\n71EcJ+7TPUSxjyxwFT7Xa+VUZTCyTa0uQ5qpoa0gZbLpKs5WHCeep9iVgcSlBvro9aCBTgSoS+V8\nCP4Yp5iSc55E3Tcl+/6jZH0aWxnoQ4TrTXQiO7VQpFDKOVyibj7yVblja4t9i2TDIFfWlqlOh7Pm\nH9PcjOO1ci5JXI8JxUJAfWheFO8g3zPXl4gTdHClWrs8sq2tQEoJvFbOIHwvUSJ2eN44HjjRvDhe\n8dQHWUvQtPbPCq3I8o4OOr1WTmDb6sseL4s127I+Dd/jnJS1AHB3XkhloUaMAUcxOr1Wzsim9pro\nWvBt0OHA+stYlUf6b5fd2DrREs4zncKagdfKGXGBZFCgNXbEKJFkSHJV4gmuZl2FUE7MyWUYpBIO\ntMDyZH/nYkgz1NE4xVDOnvkI6FQUyPoabW8Qr1trafNfBILbBvVAIZTz0rlZS+AH/UUrylgsJJjE\niKS/KaVsrxr7y6/VB3aIK5fTQijn7VkL4AnPiFbUsHxnmRsupBXWaKyrFEF63OFonEIop++0zVqA\nBH3TkqXw2krUfV5jnCxxlRW9VM4EB2QwZt4sZVz9URyaZVAjDUqvlIxYnLUAOWCWo3FGveFoIMN8\n5Wgcr5VT1rVJhP0s9FlCJ4tZignO1NjNlsFr5ZR1bYroDoA8TL/HKC6xAHWTyAemuxnGa+VUZQEA\nnES37CQuBRFAy0A75+RtI0wYR4LXpXI20TdrAVpIO0AIPnYmRguuUpjs6Wgc4/RyY5FWKOV8VLDe\niWAbPpMMUne52mAQRXZqTy5SG+c1tWZamAj62c1RisFCKedAwXoTUg6q7i09seW58UulZrVREG/T\nlYTLaAN9xJecNtMXF0o5hXmWfessjxw9ZRKxWuWMDZWa1aZoOE9XEi63GOij55/M9seiLpUz1bg6\ni7mWIotzksE5GK/Wbt1dkiWeJLA/xM0wXitntKFgMh6OT7sUriLWxSEbGeys5lD5RrP9W+JbRynq\nvFbOFkuW9+Wndz1ZN7w9fHNDYDK/SU1G8lO9eHmu7Wgcr5VTB1a09NECRxf/MiqJXVzZgSpRc4b0\nIHBR/nPRr604jZfFa+U8qOXT5tJt97qLXi7iq5fFdFIVVx4USlB8/T6peGCj9QM3w3itnFE4/2Mq\n8u+y4Cf08hH9+G2LnhWbSP7xxauzQrr+mFY4aWX19fKK1eDXxijTMYjzgsnO8mYRkAH3/06ufrw6\n68F1H7X0gurLDgfKDa
xAVxOdnGCiEz6FUE6j/CdrAbLn15L1OyqP9EjiupdyT6L0MdFJuVtrj52R\nEt7wGtaN+kHW24cWRU+MPyWuGRsBBknL2iZMOxOd8KlL5dwYwD9ZNwOHghQQuWOGxI7Qf781KAmd\nWqskBdSfRlLUpXI+C+Ao1s0LWDdKRJAL2N26Er0HANrYd5QceKGBTrb+uYFO+NSlcj4H4JaQ7s/5\nSkHDbJKcr6VHAsBtR1gfJ7i56Sf5q04vV5sQhUtdKudhAIIKfWdwqVtRnBFIBFuNZ0lwZQ3zFQAo\nup6pENTY9UowkW7TbDqYYKGUcxvBep0N9FFk4vmFRKapxxsad4EvB8hDn6AWm86qWCjlFE3+lPZ2\n9OnNSTSydA0xJwYmGuonebBik3Farem7ykZ2gmMUSjlFSfsS+zraiTNBoJGla5I5MbCeoX54KznZ\n89c0tPb9Jj9PLX5Ip08KhVDOCZL1/552c0MP0lwZwIiljGMyTbYb51N6sek3fyGU80STXgJX1Eea\nK8bfV67ZLWsBmvmSkazJdIKjQignTpdfKTIj3jEdPUtcQFIstM52J0YqrHeB6dOqYignDpduwcxy\nlzMfK/LbrCVwy6tuzve1GOEox0shlDOozJZu08i6oXU4bZ7AkQdEHPIj92NGHJbd0MLMd5TjpRDK\naZQyTAkWWPD0L9JqYasr3YxTKmcSVym2MkA0i8CZFsaeY6HPzBjsZphSOZP4YqWiwM6C9RbYFKII\nOLJUKZUzifzy1RtYIUSSFPj5ZIbSn9MudOtICIUp0TP9yo4/C9bLwoQxzd7ZJKL5dFJxtICuW+U8\nlLW2FHhz+ujySagRtugYcUiWxNUDIcqnc49OJ5VFBiThU7fKGTAiuz+fs6MUUwT0CFtUmLOKAqFl\n0DBmE2rxRJ0+KXitnMmEYOSn4m2PYZRvqipMgdgpawHyDiNIoOmcul4rZ2Myy5bojgeAJxnl5ypL\n4xe16ffqC/IzjcYMD3TT62avlTOZADcKQaFDvThbF3hTWoiAlY9DA9NeKW1UGs2aNQvnnXceevTo\nAQDYeuutccYZZ+Diiy/GmjVr0LlzZ9x0001oaGjAtGnTMGnSJKy11lo44YQTMHCgaIpbcU6EuS+m\nN4BfGeqrpKDsuj1ojocSy3ohlJQTAPbYYw/cfntraMNLL70UgwYNQr9+/TBu3DhMnToV3//+93HX\nXXdh6tSpWGeddXD88cfj0EMPxQYbMH1ClDgN5pTTA7vrkqxZSfcIPgC1kXh1MDatnTVrFg4++GAA\nwIEHHoiZM2fi7bffRs+ePdG+fXu0bdsWvXv3xuzZ5idU+4XyMcdZ6wMTkRNL1PFis5xxFm5SMQEN\n5fzggw9w9tln4+STT8arr76KlStXoqGhAQDQqVMnLF26FMuWLUPHjq2K07FjRyxdauNES9QwrZUz\nGOWuDsOzgExNu/u1KzFSyU20gzSMJudhozSt7d69O4YPH45+/fph4cKFOPXUU7FmTWuctjC5i8op\np0Ek6jbVl6oOABjFKD+F2r/CAJbQkSXte1LtV0+e2rZnKffG7tM0hPZHQqsXkyW5gclDSTm7dOmC\n/v37AwA222wzbLTRRpgzZw5WrVqFtm3b4rPPPkNjYyMaGxuxbFlrKuQlS5agVy+xZDWy/xFTtEet\nRzsJw8zkSUKThb49od4vOQ4IBIOwkjBEn0oFbzdfHwXgaYUxTeHid0X+So97S34ABLGUa7qyKE1r\np02bhvHjm5z+li5dis8//xzHHnssnn32WQDAc889h/322w8777wz5syZg+XLl4MQgtmzZ2O33fQi\nwZg66D2IUZ7zwOhUdBWzidbYdhMkoyPH47VK2IH4y/uMcsPuhpVQZq7ZzNdff40RI0Zg+fLlWL16\nNYYPH47tttsOl1xyCb755ht07doV119/PdZZZx1Mnz4d48ePR6VSweDBg3H00UcLjSHy
xOkO8+5N\nXVEb/Crvb04j/d4ABM2BqzaFuM95Up53IG5h5O2b82f0c1LyKyA4LV0WmSm3knK6ICtlGIza+KNF\nVM7OqDY2Jz8EguaX574Q92BJyvNjiJ/3eaucU4GAEuY+Od3VVU6vLYR0eJ1RfoBTKbKjZs/8oVY/\njQEa/aoexO+uMaZzjqP77dypk3+FQt0qZ8/v0svr1vD9vFY/jb0zGF42Ya8K5HJDHV1Lz+463PCG\nRd0qJxbOpBbnJXCxc5a3ftzVzA5T/jD1yx3FmJqeaqj/ZgqlnCJG66Q5zVz/Cv39sG54qTmBmjEd\nCdwGwcTWz3tJZuSJkiLJpsVwTfADQx09xVjT3mSo/2YKpZzHCtT5S/MXyDS1eu96Q9K04tvxjGyk\nvLv7Nf0ssnVVFSwD7BfNDlMo5RRRq494FbYzHxiSWPJDy4vV0lZ/aPrpwwwhIt2UkcPfOtDL+2v0\nSaFQyinCsdz1lPlj9EA0cahsv5aODGQdsRc3/+yLprNnH6AdhYjyr8py+o1b1PukUXfKyc8N/kPj\nQ55ovEe76PgN9TMmRXZszLnPfL4bjilaGiEkYB0cF80IwRSi8owAMFaxrQ150ugIQCWHMlkfCP6d\nLktphFCSO5KKmefZhGpy8x3+za8jQ6mcJZkwIZ8TNiFYaRkXGB6nVM6SElkGnsavY4BSOUtyA1kv\nawnyRamcNXTLWoA64r/Vl74kobnOTXzGUjmTPJr05iyxx4Lqyx6ZCCGPQLIrE5TKmWD3DNK81y8X\nV1/6opyOAnyVypnAsAVWSRqDfld93e3kbOSQxc1+UKmcSTbPWoB64h+J6/8+nIkY0pjOWMSgVM4E\n9ZLIKBccnri+MRMp5JF0qVOlVM6S7Ehm62JFtcsb3d0MUypnSXYk35T/y0QKeRa4GaZUzpKM+C8u\nTXpxTMqHTwt31vqaCylKrxQh8uQJkidZAHV5SEcgSFiYm/i/OQmNGb6CoLKfkiylV0qdcH7WAmiQ\nVEyvGFetmOdYGqZUTo/5cdYC1CsXVGeDsbV5Wyqnx2wW3pm1CHXKgqorXuQEVbxWTkfHTbklqAy3\nPkaenaIzI6z+3vt/bGcYr5WzqLGPXcBKHpxkgljeKW8YbKKT0Ynrbnb2VL1WTh2CrAUwzP2S9R8Q\nrfgk/w9PxQ9zz2QfrFiwhkkmqVLi6mtN9MKlbpVzSbusJWjFxIPCfLRdccYpZKyvSWVJz46RCfxd\n8O6JaztZSQuhnORn8m0GrDAvhyokawEAEA3rHBOeXoEjNywRuOFnt62eHC+v3GFFjkIoJ26Rf+zq\nxGb1ldSQvRX1ddMvlVsm+Rp3G+vLHgMSQcI3sTROMZQTU6RbfBI2WpDDLSQ8UKr+EfTMddow885I\n8xaGhI6cJTVYx9E4hVDOQGEO56mBAAAgAElEQVRaEVSWWJDELUHlJbn69JyvNZDbWz/3kDSFI9vR\ny/sItf4ugoqb+Dw6aGRykKIQyllilnmx/Y2kVxePp96jlz8vlLj2e5KjZcMgR6ZZXitnGVLEDlfG\nPi+UbMty2Ah+oSZLnO76XZjhEDfDeK2cj+qkcSth8qRGW5u/kgUW+5bCUYBGr5VTJ41bESB3ZS1B\nLb4EFt1Xp/F5h5oSIxWvlbPeCX6StQT+8medxlOeNyVGKl4rp6h9aIk7XB0zZIqjfSuvlTOyD7Xv\nm1EiiuH8sfnEkceF18oZUXo1+s8EAMBbGUshyNAyy5hVSDgnaxGEuEmwXjIEbBJytviYO4tXNUZT\nMO/dnI2nN9uaYEiKdAqhnOQi+TZBpad5QSwg+l/j/W+Ce8THjL8XHAU3x8GOxonQm201WSnvbkKQ\nFAqhnLhR/r8xUWM4W2EpdBD5BpYL9hV38lmlIIsO5DnHA6rwZdMZ3puWhymGcs6T93caGDYoD7dY\nuaU9RB4Ya8+1LoY2Sw7LWgIBHO16FUI5799Gvk1Q
+da8IBmSjJxBI9hRrC/yV3U5WL6dRNC5fQv1\noZ3QFQAam/4zgn4EyhRCORV8rQvHfwz29dQu6m2ZRksFOWNpsoBqyiv6keWxCqGcKrja6PCRTTXa\nfsMoDwqinE2cBwBYY3kUr5Uz6TUvEyTK9UZH3uiecm8fjX4HaLT1hkUbOhnGa+UckjyzHpKJGF6y\nIGsBfGYTN2fkXisndk3EvbkkGzFK/IP8Vqe14M6aJl4rZzKDU5AawaqkpJXgBI3GJ7rJ8ua1cpaU\nZMIjpW2tMFdkLUBJnbHMySiFUM6rFdqMMS5FiQ5k76wlkOECJ6MUQjlVGB52yFqE3JKFS8DyHKVj\n4NPXySiFUE7yA/k2QUXUDNwPRNKVEEHbuCyc6aKo6cSLYN8K9qIKFEI5VaL8msjvkSeEEn3ZSelh\nlrV9CPatY6YhTiGUs8MP5du8b16M3BMcJVaPWMjJ+aFgvUAjoZIzBrmJSl8I5bRt41hvnDVNve0M\nRnnjNep95o4ppYWQVY7JWoAcoxMYmvUHtZWj5LhuKHdrrfJs1gLkGB2ngIcZ5WIO6nl0Y6dRmu9Z\npQheKeQ76m1NGm7Eg5Dp5erMYwAYCv/lptc1Qt0qZyFYpZ7wVsVwg4XtQFc20PFZRRv1EDcylMrp\nMUnD/6zoa6APrdwlCshmT6tikZsQN6VyluQCrdwlrrEdn6QZr5Wze9YClNQnjhLCeK2cCzTaEkdP\nv7xAbhCve4A9MZhkEWVemaSTvyW8Vk4tRGJJFomvxKv+yZ4UTN52PB5xlJ1ah0Iop4q7UTDRuBiZ\nwjMvDiTenCdryOHB3zwAIHhBp/XapsRIpRDKidfkLWV9iPovg4hVDzlSrC+WIYEIyb95WUss4i6X\nkQaznIxSDOXcT97HZJ+lFuTIEKFn+dPTbYtRw5OyDQbbkMI0bp4ghVDOQGEfPuhsXo4sETH+DypH\nCPWlkzFcI5MDACBICd+fH2OH8U5GKYRyqlA0f06T6Ngdb23B3SziE3tdS/IjJ6N4rZwjNNrWoz+n\nKDrWM/tpuJvxuNRe15KUAb64jM1agIwhOYm7Ew+wMNviOOda7FuOjZyM4rVy1jtBTiLWmVMaDxKI\nAsABZVBpLidmLUBJDV21Wrvxk9TGUaBkr5XzkawFKCg6pqOfGpOiFp09BqM48tT3WjnbZy1AQclr\nKs2czOKdGR97rZw62ZwdORaUIMoDnY5I1M7jdAUxRX83w3itnN012n71T1NS2EXUitNktNeb+FWk\nuFGgzheGx7RKpfRK4XKYTuNbTUlhF9Gwn2mWNbL8zlxXwtSbk5AIXivnfTqNbzvQlBj5YHtzXb1m\nrqsSDbxWzgii4P4TVF4yL0iGBD/m15lnXwxtbMv4uolOlpTnnOJsK9+EvGVejCzpI1CnW3iKdTn0\nuALdLM+p9zLRSWO55hQm+K5CGy/8BsUR8YALKpOF+lLIC9XCl4zy84VaX41AIWNcUSmEcpaYNeTX\nyQRyPaM8i9An9nCTetlr5UzmpCQKb9CSWv5Poy3r2MSmQbxz3h3pZBivlbMSJpyIdOZjdcaWWQvg\nMzuUa04Brqu6+taTs8s84IkNhjVIFoe5knitnMl0BBtmJEfR2FOjrZ5Xijv0Np7+ZkqMVLxWTuJV\nJGJ/SIst91NOW1WvlN6K7bLhVSejeK2ca1xHIi7RcjZIw68No/lORvFaObdq/lmvTteOnCOqKE37\nAOBUJ6N4rZzRwXu9Ol0/E/s83GC/abkrdY5Z0iBaXgyu+a+TUbxWTh2IJ1Ni0fQGCwyO+WuDfYky\nzqsQ/GVQaWHI/vJtAk82k0Rt+q8xOKbC16nNz5t/mvRLtYebOCWFUE6cI9/kQ/NSZMouWQuQgkwA\nbzeHFLq0cTJKMZTzJPkIto3hThYEKaEhY/e7T9hoTQ5zHFx11dPSKIVQzqAiH5ctqLxjQZJiQPZQ\nb8s6r/yjYPug
Yndia8bpZZuqKx1HgTQKoZwlZhnxhnrbxxjlBzPKXWPGas/WnnU1pXJ6zHJL/f5S\nsn78rbiFSUFyi0SacA1K5fSYtXNy/NA2awEU4JkhprOBISnSqYRh6Mb/RZKkUXuWkDDMjTy2ZFHt\n16Q8Jvpy8bsSHYNWj0ioWyHenHdnLUBJiQUKoZzb8KuUlHhHIZSzxCyiUeZpHK418ularYtGIZRT\nZZueFSWuBFhfo62eYdsErdYy+OA/Wgjl/ItCmzJqAhudvCXETewrbfT8R8voe8Kkee6XyKNjjnbW\nDcbEyDGXOBmlEMo5JDSYKESAo5yO5h4dc7SHNMf+q2Z7NyxzMkohlHNG5e/SbYiGkeXT6k1LOOTZ\nu6aVjZyMUgjlPFKhTeBBaMSsIBo2eKxEuVeod1m3FEI5PfGb9obJGs6uo79DLy9y7KEolGhguF+v\nlZM0vzKjiCPl01mctMRCZ2v0G3xDL1fI0ugN8osqMbxWTjxdbad4yUUZyeEhL2YtQMacYbCvyPCC\nGOwT8F05k9wuVm1tlFNhT+KbWeO2y831dU/zz/bmugTguXImLf5ZU6oka1D+ceaBLI+kgl+Y6yva\nuzUdcNtr5YxQyZh1snEpioNABnsj+OgH6pJCKKdKxqxL+VXqlu6OxonHE/DB1tU1hVBOFbb6WdYS\n5JcnHY0Tt+H1K1eKGwqhnJMV2gRlLk8mtkI9JvF3aeHm8VUI5XwlawEKxgMabRdJ1HUTw84GxzgZ\npRDKecvP+XVK3LCJRF1aol2VWZB7SsN3cRTc73c3L0WJJAspZcd6YfleGr6L82f5JjPcpFgsSYH2\n5lzsh8+YE+o2NOYGEA8NXG+hMd8AIJqRISkP+S4QfKwvgypuQmPOQVDhb5uVoTEVcROz20+0Uie4\n2urNlB2djOK1ciajdosmyylJR8cMLfgDvVwsfKkvv0GFdZQCXivn9eGBVdd7HaffZ73EwGU5RdtC\n7NgkL+mOeOxbdaViPiqC18qZdHz6ByvFlQR76XfhBbqxfnxHZu3HQ8V8VASvlXOHxGJ7VwN9TjLQ\nhw/8T7FdUYzV9TaN3OTf9lo5FzT/PCBLITxlsWK7VUal8JVeTkbxWjkj/qTQRi9tQIkJdNI+1AOF\nUE5ResQ+Pz43MzFyz6uOxlnjaBxfqSvlfD/2OXBzVOUl+2QtQAmAgiinSdejHvwqJSkMZ5TTlhF7\nUspKWimEcj7wa3N9FdnkluzLr6PLnYxyWvaxQwTL6pVCKCdukW/CskV5SUuQnPNKv6wlqOJBStl1\njsY2HQDaBoVQzuAt+TYsW5SnNHKo5J2gwrCtS7Bcst/4soJ0odehlde4jL1TETa418V0jFkbFEI5\njeJg6pd3xkrWfzh+0ZFe58XPBDra6VzJkYuN18q5gY1OV9vo1C+u1mnMyLw7QKjxsTojFw6vlfOT\nhGOukeARb5roJP8MttRvIPKGZPK4KTEKgdfKiV7VxsvTTPRZF/6IQD4zILrzCTJp+G4Lr5Uzabx8\nuolOR59mopfcE98Q6ZyZFK1cEfvXBTaiJZgOVO61ctqhPrKoxPdtlhrs9yDFdk3rXDeBs2xxveH+\n6lY5mZZA8+oj9jhj30Yb1dSCEwCo5SgvLnWrnExjAxUXF4uITjkZJxi5ZuPY53sB1E8cCjG8Vk6d\nfIgbPs24kbMUzKJTTpNvQlcqEs+R2uQnasJdvjh4rZxRICqVY4GAlRyyj6IwBeIjjbZkO/G68UBi\nTSv93TRGLh5eK2fEvUbOUJo5b3+DnfmJTrSD4D3xurUWfa+DrKMxeMEohHI+cbR8G5YNKJa8rCVL\niTj9a0rexf2lhVYLhVDO76v8L1i+SY3lm9MVg25MlhxcJtGNUQjlrNpZEOQ6lg/oO+Wb0xm/Shb8\nVMX7r7AUQjkDheQ3v2DdsBWE1COIxr4MYYSLn08rfD1xPfypnJoVZkMhlNMo5Y
YELlXwj40IGOdb\nG9NsWYclrmk5AQU4Q61Z7vFaOZ9KXBvJHlcGwcXtFvrsT7NlTVqCjFPrWycTd57xWjkPCs+vulbM\nPFfNeqbNl/MJcWwpRzO8Ip8mCj5xIUkTT7gbShmvlTP5qD1IYqOV/ZCuExOyblkLAASHJQp+5m7s\nQ09wN5YqXitn0u0nkNhovYB5pz4SDgT32elXyunr+NaPZHM48V2Lcr0Ev7U/li5eK6cOv2fdWHS2\nSzG4ZDHJ1olxJhXiJB4ZbDGc5Mjw6dFbCOVUybnBXHJtMlVDEvOo+ggqHP22oBPChMg4h8biUwbf\nAMGFGgML8i/7QxijEMppNueGq8ipdtlPo61ODuLHZaamikcnOmzmfkhlCqGctTaafC5n3fiyGM7W\nqk7PqkSpFaSOiTcrQ2GmUQjlVNlf3Z11Y0Ofnq1s/k4pE/V/lZ3WtgUwS7JNE39RamWCPMRN4lEI\n5bw2PFS6DduGk7/fqOPk7YotKWWLvxFrK5uSPr7JcpJUy81rSoijjBGspXGeIkoUQjnx6vPSTZ75\nOesOLYNHNQzzUSuoxgKkmgg3nKIhiTps//UpNSXzxDJGWMNWbCUVhJRz3rx5OOSQQ/DQQ03P1EWL\nFuGUU07BoEGDcN555+Hbb78FAEybNg3HHXccBg4ciEcffRQAsHr1alx44YU4+eSTMXjwYCxcaH4X\nIFBIoRBcw7gxIl9eKTWOGxoElclC9QjTK0CgLSXv6WtgZbm4uKZkF/WhCwdXOVesWIFrrrkGe++9\nd0vZ7bffjkGDBmHKlCnYfPPNMXXqVKxYsQJ33XUXJk6ciMmTJ2PSpEn46quv8PTTT6NDhw54+OGH\ncfbZZ+Pmm2+2+h/S5ZN8i+eEMczdMgEYD0r6Gr9Mv5AGVzkbGhpw//33o7GxsaVs1qxZOPjgpjxd\nBx54IGbOnIm3334bPXv2RPv27dG2bVv07t0bs2fPxsyZM3HooU1rwj59+mD27HzvhnYryKN7e422\nOpF759xDL6ev8fdq+aRyVq2CzvfiGq5ytmnTBm3btq0qW7lyJRoaGgAAnTp1wtKlS7Fs2TJ07Ni6\nnO7YsWNN+VprrYVKpdIyDc4lm2YtgDiPpdyj7daK8qRkfRKbe+/FrkZhRssns2fVbN50aL+rSxvd\nDkJGzgnZ8iRZ5rKg5W7MU26NuCwm80wq/x+HhiBDHY+p0ScxEG5BVO54PdkUEErK2a5dO6xatQpt\n27bFZ599hsbGRjQ2NmLZstY8X0uWLEGvXr3Q2NiIpUuXYtttt8Xq1asRhmHLWzcNG7ksIsYAuIRx\n78cAkjbhJAytyiMDTRaazCb6tdlOt63LPlXH0JVF6SilT58+ePbZZwEAzz33HPbbbz/svPPOmDNn\nDpYvXw5CCGbPno3ddtsN++yzD6ZPnw4AeOmll7Dnnnumde2ESwCcw7j3gUtBDCF/ypvOxvwqUojs\nEbtKYUTWczRQjEWK7SohZ545d+5cjBkzBp988gnatGmDLl26YOzYsRg5ciS++eYbdO3aFddffz3W\nWWcdTJ8+HePHj0elUsHgwYNx9NFHY82aNRg1ahQWLFiAhoYG3HDDDdhkk024gtl/+vUTTsOe9zfn\ndQAuM9jvEgCN6dVT5VEZczDkjR9My2N6DFo9mWk8VzmzIi/KAORfOclnQMCKw6vRr0q7qwCMtjym\nTp/9ATxjeYy0ejLKWQgLoZP5VYpN4w1GuztRo+0IiXQMWaCrmC4phHKy1o9ptGWU+5k6V3vTvYpH\ndBqbXrDWMYVQzl3/LN+G5REvkg0g7XwxG1iPmmrI3vw6ugTM3IpikB+akaMIFGLNSQ4BAsnUfWQF\nELQTrJv3NaeBrf28HKWQcCcElXeU+tOVx/QY5ZoT8ooJiCumH1wlVEv0j3aG5Ohxc1pWtHjRpYeO\nYhaNQihnidnDu9ck68cDu8xgRIv/paowdU
ypnEXgyxFGu6M5aqcRDxfsOFZ1oSmV02NIFBh5Q7NH\nKXLRDIDhRkcviSiV02NaAyOzLIXdkK9goukck7UAEpTK6TFmPTqW8aswcOXulaSnQhtZd7gsKZXT\nY8weGWwkVItQzJFl4kgz+w2ZQZ2YzDEwbp6pW+VkpmPwBPM+4eu3fBqSUiswGh1vccun1yusoE71\nS90qZ98tspZAD+Nh0sYsb/noLn9wq61fd2djukfVVrlulRP/7JC1BAbRCJfXzLsjWz/fdpd2d4Ls\n0PJpV1dDZoCqrbJ3yklG0ssfFWwfRTMPKstT6/nFq9o97BG/0AirQEKJHGVft0Y6+kp9yAxwE6LR\nK+VsD2BA7EhvRuxef8HMfT6lgBPmN2YjMQcp4ffm8dpWfic+UAYRnM1kGXOQDg2eKefiw6oT9HSN\n3/zlh0J9ZBxQXJqnRCo5zAfU7Tlzfa2szcbAxJQrX5QJZ6JWL7UTVRvbWblVTupuZCIYzdZVV92F\n+p3LuS86PXbFAIE6gfoRpTQdkqniNVh3vHjdO8VOeoTh/R2kUzsJv8CCc1dulZO6G9nI/gJEz/x4\ns9+4XSkJzZrFFQGTO7nkRxJ1DT+Axuo0Prz2r2iyBTc1r/w5XftVkj835WHJuz9nlrgKjUm+BwQC\noRFdfD+vAthHoF7pz2kRlQRJeSbgV3HMQcI1RRTTFbc7GqdUzoJCywliMjq8ESanxzQh4U6OBJHD\nVeLdUjkTZJdruRYdDwqdXCmuePxUXo38RWsCgDHfdTNOqZwJXslagBiyHhTsJLX5ZBK3xvccSKHA\nQjffdKmcCXK0tJHmqKwFkIQb+inMz8ZXNW7+SkrlTHBn1gLE6C5Zf5YNIbLk3qwFYPGsk1E8U059\nA2+fWCBZf8rTNqSwBzeRbW6zjPdyMopfytl/VNYSOIWsz69TxZGXWpHDFtyTqtwaQrsx0/dKOWf4\nZhiryZB/SzaYf70VOWzxJ8794GMnYiiwgZNRvFJOH44HTDJJwsEDAJ7K6eYmi//LWoCc45X5XlbY\nMgnbAOkTJJ3UC66oR3nKdAx1wCcSfskl+cXWSr9UzgwJfsfJLTrGzRtJNsK7KeJx6kkYCuZKsw+R\n9KW2tdIvlTNjHk67ecl/nMjwz5R75BD2vSeaf16sOG6129ay3GzOBpJHOMTSWt9r5eydtQCWeabS\nPmsRgJ3Zt77f/HN0KOISnk5QcWVOboHf2OnWa+WcnbUAlhnIue9EdcdWb2B0p1Y6TnsYEh6o3Udm\n7M+vooLXymmKPTXaTjQlhAJuJr3VLKCWnq7db1DRTImdJVxTJzVK5YSeTWonY1LY5XzOfa0oIOf/\nT6c1ADsBslzxOiMnqS7eK+chMJ3QRw7d1ZZJ2dMMam7htF2XJccQ/o5xcCu3CpcoU4qLNBkvG+5v\nMb+KEt4r55OKO2VrmxUjF8R9gEXTvPMIHgQON9SXCH0NRvdjsatG0Gwa37fkcOC9cgYfALhC/jww\nq7R1gHnLzOh8MK5ED2r0NyNxzXOQikfkYx29iG5eBQbj4tLoCAC/Um9PPZc2MHOg4b1yAsBwzxYs\npn0aVqFJQeNKpPOL7StZf3X8Yjd6HbGQ32qQbcTrfgFgykz2/SjkGEvfqOfSqYfV6hRCOW1E9VfN\nDCXCPRb6TB7gZ7GTC4AZ2HbdX1scU/IU5sWUe1HOz5/JdGjJgt9r5YyecrRYOypZj+N00Wyfhukg\nFz1QG2TS0u4+lSpjkD/T6wQ/TO9DJ91CIPm0S3OVXhC+Ly/AmfJNRPBaOZ9KOfzVzXpsMzbpbVVX\n+kZrx6P2bSCUY0WQ+C4wzUE6bgxyvuJxpe7DVIZLUu+KhIsG4hOBIe9V3zNl6+S1cgam98QdUWWL\n2n5dah2RN190DEMLoZmWnVqWeDBq2ovxjthn1Y1Q1RyWptmrskSoXnwiMDVxb6khWbxWTl+pioT0\nH/o7Xs
gitextract_j3665a4o/

├── .gitignore
├── LICENSE
├── NOTICE
├── README.md
├── atari_zoo/
│   ├── __init__.py
│   ├── activation_movie.py
│   ├── atari_wrappers.py
│   ├── config.py
│   ├── dopamine_preprocessing.py
│   ├── game_lists/
│   │   ├── a2c_game_list
│   │   ├── apex_game_list
│   │   └── dopamine_game_list
│   ├── log.py
│   ├── model_maker.py
│   ├── rollout.py
│   ├── scores.py
│   ├── synthetic_inputs.py
│   ├── top_patches.py
│   ├── translate.py
│   └── utils.py
├── colab/
│   └── AtariZooColabDemo.ipynb
├── dimensionality_reduction/
│   ├── README.md
│   ├── process.py
│   ├── process_helper.py
│   ├── ram_reduce.json
│   ├── representation_reduce.json
│   ├── visualize.py
│   ├── visualize_helper.py
│   ├── viz_ram_2d.json
│   └── viz_representation_2d.json
├── docs/
│   ├── RGraph/
│   │   └── libraries/
│   │       ├── RGraph.bar.js
│   │       ├── RGraph.bipolar.js
│   │       ├── RGraph.common.annotate.js
│   │       ├── RGraph.common.context.js
│   │       ├── RGraph.common.core.js
│   │       ├── RGraph.common.csv.js
│   │       ├── RGraph.common.deprecated.js
│   │       ├── RGraph.common.dynamic.js
│   │       ├── RGraph.common.effects.js
│   │       ├── RGraph.common.key.js
│   │       ├── RGraph.common.resizing.js
│   │       ├── RGraph.common.sheets.js
│   │       ├── RGraph.common.tooltips.js
│   │       ├── RGraph.common.zoom.js
│   │       ├── RGraph.cornergauge.js
│   │       ├── RGraph.drawing.background.js
│   │       ├── RGraph.drawing.circle.js
│   │       ├── RGraph.drawing.image.js
│   │       ├── RGraph.drawing.marker1.js
│   │       ├── RGraph.drawing.marker2.js
│   │       ├── RGraph.drawing.marker3.js
│   │       ├── RGraph.drawing.poly.js
│   │       ├── RGraph.drawing.rect.js
│   │       ├── RGraph.drawing.text.js
│   │       ├── RGraph.drawing.xaxis.js
│   │       ├── RGraph.drawing.yaxis.js
│   │       ├── RGraph.fuel.js
│   │       ├── RGraph.funnel.js
│   │       ├── RGraph.gantt.js
│   │       ├── RGraph.gauge.js
│   │       ├── RGraph.hbar.js
│   │       ├── RGraph.hprogress.js
│   │       ├── RGraph.line.js
│   │       ├── RGraph.meter.js
│   │       ├── RGraph.modaldialog.js
│   │       ├── RGraph.odo.js
│   │       ├── RGraph.pie.js
│   │       ├── RGraph.radar.js
│   │       ├── RGraph.rose.js
│   │       ├── RGraph.rscatter.js
│   │       ├── RGraph.scatter.js
│   │       ├── RGraph.semicircularprogress.js
│   │       ├── RGraph.svg.bar.js
│   │       ├── RGraph.svg.bipolar.js
│   │       ├── RGraph.svg.common.ajax.js
│   │       ├── RGraph.svg.common.core.js
│   │       ├── RGraph.svg.common.csv.js
│   │       ├── RGraph.svg.common.fx.js
│   │       ├── RGraph.svg.common.key.js
│   │       ├── RGraph.svg.common.sheets.js
│   │       ├── RGraph.svg.common.tooltips.js
│   │       ├── RGraph.svg.funnel.js
│   │       ├── RGraph.svg.gauge.js
│   │       ├── RGraph.svg.hbar.js
│   │       ├── RGraph.svg.line.js
│   │       ├── RGraph.svg.pie.js
│   │       ├── RGraph.svg.radar.js
│   │       ├── RGraph.svg.rose.js
│   │       ├── RGraph.svg.scatter.js
│   │       ├── RGraph.svg.semicircularprogress.js
│   │       ├── RGraph.svg.waterfall.js
│   │       ├── RGraph.thermometer.js
│   │       ├── RGraph.vprogress.js
│   │       └── RGraph.waterfall.js
│   ├── css/
│   │   ├── bootstrap-theme.css
│   │   └── bootstrap.css
│   ├── js/
│   │   ├── bootstrap.js
│   │   └── npm.js
│   ├── video.html
│   └── video2.html
├── examples/
│   ├── classify_state.py
│   └── demo.py
├── notebooks/
│   ├── Basic visualization.ipynb
│   ├── Filter Analysis.ipynb
│   ├── Training log visualization.ipynb
│   └── Walkthrough.ipynb
├── requirements.txt
└── setup.py
SYMBOL INDEX (231 symbols across 45 files)

FILE: atari_zoo/activation_movie.py
  function gather_activations (line 28) | def gather_activations(m,obs,activations_tensor,session,X_t,batch_size=2...
  function activations_to_frames (line 50) | def activations_to_frames(m,activations):
  function make_clips_from_activations (line 66) | def make_clips_from_activations(m,_frames,obs,activations_tensor,session...
  function side_by_side_clips (line 90) | def side_by_side_clips(clip1,clip2):
  function _MakeActivationVideoOneLayer (line 111) | def _MakeActivationVideoOneLayer(m,clip_dict,layer_no):
  function _MakeActivationVideo (line 126) | def _MakeActivationVideo(m,clip_dict):
  function MakeClipDict (line 177) | def MakeClipDict(m):
  function MakeActivationVideoOneLayer (line 204) | def MakeActivationVideoOneLayer(m,layer_no,out_file=None):
  function MakeActivationVideo (line 218) | def MakeActivationVideo(m,video_fn=None):
  function main (line 227) | def main():

FILE: atari_zoo/atari_wrappers.py
  class NoopResetEnv (line 23) | class NoopResetEnv(gym.Wrapper):
    method __init__ (line 24) | def __init__(self, env, noop_max=30):
    method reset (line 33) | def reset(self):
  class FireResetEnv (line 48) | class FireResetEnv(gym.Wrapper):
    method __init__ (line 49) | def __init__(self, env):
    method reset (line 55) | def reset(self):
  class EpisodicLifeEnv (line 65) | class EpisodicLifeEnv(gym.Wrapper):
    method __init__ (line 66) | def __init__(self, env):
    method step (line 74) | def step(self, action):
    method reset (line 88) | def reset(self):
  class MaxAndSkipEnv (line 101) | class MaxAndSkipEnv(gym.Wrapper):
    method __init__ (line 102) | def __init__(self, env, skip=4):
    method step (line 110) | def step(self, action):
    method reset (line 124) | def reset(self):
    method _render (line 131) | def _render(self, mode='human', close=False):
  class WarpFrameTF (line 150) | class WarpFrameTF(gym.ObservationWrapper):
    method __init__ (line 151) | def __init__(self, env, show_warped=False,warp_size=(84,84)):
    method transform (line 164) | def transform(self,obs):
    method observation (line 170) | def observation(self, obs):
    method _render (line 177) | def _render(self, mode='human', close=False):
  class WarpFrame (line 191) | class WarpFrame(gym.ObservationWrapper):
    method __init__ (line 192) | def __init__(self, env, show_warped=False):
    method observation (line 200) | def observation(self, obs):
    method _render (line 206) | def _render(self, mode='human', close=False):
  class FrameStack (line 219) | class FrameStack(gym.Wrapper):
    method __init__ (line 220) | def __init__(self, env, k):
    method reset (line 229) | def reset(self):
    method step (line 235) | def step(self, action):
    method observation (line 240) | def observation(self):
  class ScaledFloatFrame (line 244) | class ScaledFloatFrame(gym.ObservationWrapper):
    method __init__ (line 245) | def __init__(self,env,scale=(1/255.0)):
    method observation (line 248) | def observation(self, obs):
  class DiscretizeActions (line 253) | class DiscretizeActions(gym.Wrapper):
    method __init__ (line 254) | def __init__(self, env):
    method step (line 260) | def step(self, action):
  function wrap_deepmind (line 269) | def wrap_deepmind(env, episode_life=False, skip=4, stack_frames=4, noop_...

FILE: atari_zoo/config.py
  function dopamine_url_formatter (line 15) | def dopamine_url_formatter(base_url,agent,game,run,tag=None):

FILE: atari_zoo/dopamine_preprocessing.py
  class AtariPreprocessing (line 34) | class AtariPreprocessing(object):
    method __init__ (line 50) | def __init__(self, environment, frame_skip=4, terminal_on_life_loss=Fa...
    method observation_space (line 89) | def observation_space(self):
    method action_space (line 96) | def action_space(self):
    method reward_range (line 100) | def reward_range(self):
    method metadata (line 104) | def metadata(self):
    method reset (line 107) | def reset(self):
    method render (line 120) | def render(self, mode):
    method step (line 137) | def step(self, action):
    method _fetch_grayscale_observation (line 186) | def _fetch_grayscale_observation(self, output):
    method _pool_and_resize (line 200) | def _pool_and_resize(self):

FILE: atari_zoo/log.py
  function parse_checkpoint_info (line 26) | def parse_checkpoint_info(json_data):
  function load_checkpoint_info (line 39) | def load_checkpoint_info(path):
  function get_dataframe_from_training_log (line 50) | def get_dataframe_from_training_log(_data=None,_file=None,algo='default'...
  function gather_logs_across_runs (line 104) | def gather_logs_across_runs(algo,game,runs,local=False):
  function gather_logs_across_algos (line 119) | def gather_logs_across_algos(algos,game,local=False):

FILE: atari_zoo/model_maker.py
  class RL_model (line 47) | class RL_model(Model):
    method create_input (line 74) | def create_input(self, t_input=None, forget_xy_shape=True):
    method get_log (line 82) | def get_log(self):
    method get_checkpoint_info (line 90) | def get_checkpoint_info(self):
    method get_observations (line 93) | def get_observations(self):
    method get_frames (line 97) | def get_frames(self):
    method get_ram (line 101) | def get_ram(self):
    method get_scores (line 105) | def get_scores(self):
    method get_representation (line 109) | def get_representation(self):
    method get_episode_rewards (line 113) | def get_episode_rewards(self):
    method ram_state_to_bits (line 118) | def ram_state_to_bits(self,state):
    method get_action (line 126) | def get_action(self,model):
    method preprocess_weight (line 130) | def preprocess_weight(self,x):
    method get_weights (line 135) | def get_weights(self,session,layer_no):
    method canonical_activation_representation (line 142) | def canonical_activation_representation(self,act):
    method native_activation_representation (line 150) | def native_activation_representation(self,act):
  class RL_ES (line 157) | class RL_ES(RL_model):
    method preprocess_weight (line 172) | def preprocess_weight(self,x):
    method get_action (line 175) | def get_action(self,model):
  class RL_GA (line 181) | class RL_GA(RL_model):
    method get_action (line 196) | def get_action(self,model):
    method preprocess_weight (line 201) | def preprocess_weight(self,x):
  class RL_Apex (line 205) | class RL_Apex(RL_model):
    method get_action (line 223) | def get_action(self,model):
  class RL_DQN_dopamine (line 229) | class RL_DQN_dopamine(RL_model):
    method get_action (line 251) | def get_action(self,model):
    method get_log (line 256) | def get_log(self):
    method get_checkpoint_info (line 260) | def get_checkpoint_info(self):
  class RL_Rainbow_dopamine (line 265) | class RL_Rainbow_dopamine(RL_model):
    method get_action (line 292) | def get_action(self,model):
    method get_log (line 297) | def get_log(self):
    method get_checkpoint_info (line 301) | def get_checkpoint_info(self):
  class RL_A2C (line 306) | class RL_A2C(RL_model):
    method get_action (line 323) | def get_action(self,model):
  class RL_IMPALA (line 329) | class RL_IMPALA(RL_model):
    method get_action (line 347) | def get_action(self,model):
  function _MakeAtariModel (line 358) | def _MakeAtariModel(model_class,name,environment,model_path,run_id,algor...
  function GetFilePathsForModel (line 374) | def GetFilePathsForModel(algo,environment,run_no,tag='final',local=False):
  function GetAvailableTaggedCheckpoints (line 408) | def GetAvailableTaggedCheckpoints(algo,environment,run_no,local=False):
  function MakeAtariModel (line 425) | def MakeAtariModel(algo,environment,run_no,tag='final',local=False):

FILE: atari_zoo/rollout.py
  class dotdict (line 40) | class dotdict(dict):
  function generate_rollout (line 46) | def generate_rollout(model,args=None,action_noise=0.0,parameter_noise=0....
  function main (line 213) | def main():

FILE: atari_zoo/scores.py
  function get_random_agent_scores (line 17) | def get_random_agent_scores(game):
  function get_human_scores (line 24) | def get_human_scores(game):

FILE: atari_zoo/synthetic_inputs.py
  function image (line 36) | def image(shape, add_noise=False):
  function only_current_frame (line 45) | def only_current_frame(shape):
  function channel (line 62) | def channel(layer, n_channel, ordering="NHWC"):
  function L2c (line 71) | def L2c(layer="input", constant=0, epsilon=1e-6, batch=None,channel=0):
  function direction_cossim (line 79) | def direction_cossim(layer, vec, ordering="NHWC"):
  function direction_neuroncossim (line 96) | def direction_neuroncossim(layer, vec, ordering="NHWC"):
  function make_regularization (line 113) | def make_regularization(L1=0.0,L2=0.0,TV=0.0):
  function visualize_neuron (line 117) | def visualize_neuron(algo='apex',env='SeaquestNoFrameskip-v4',run_id=1,t...
  function composite_activation (line 138) | def composite_activation(x):
  function composite_activation_unbiased (line 148) | def composite_activation_unbiased(x):
  function relu_normalized (line 157) | def relu_normalized(x):
  function image_cppn (line 166) | def image_cppn(
  function render_feature (line 195) | def render_feature(
  function all_activation (line 215) | def all_activation(layer, batch=None):
  function optimize_input (line 229) | def optimize_input(obj, model, param_f, transforms, lr=0.05, step_n=512,...

FILE: atari_zoo/top_patches.py
  function pad_image (line 54) | def pad_image(image, padSize, pad_values=0.):
  function get_obs_patch (line 72) | def get_obs_patch(observation, ii, jj, receptive_stride=(36,8), pad_each...
  function build_model_get_act (line 110) | def build_model_get_act(algo, env, run_id=1, tag='final', local=True, wh...
  function plot_topN_patches (line 159) | def plot_topN_patches(activations, observations, which_filter=38, which_...

FILE: atari_zoo/translate.py
  function module_path (line 21) | def module_path():
  function grab_list (line 38) | def grab_list(mode):
  function translate_game_name (line 46) | def translate_game_name(inp_name,inp_mode,out_mode):

FILE: atari_zoo/utils.py
  function load_json_from_url (line 32) | def load_json_from_url(url,cache=None,encoding='utf-8'):
  function get_session (line 40) | def get_session():
  function conv_weights_to_canvas (line 53) | def conv_weights_to_canvas(w):
  function visualize_conv_w (line 95) | def visualize_conv_w(w,title=None,subsample=None):
  function to_onnx (line 106) | def to_onnx(model,fname="./frozen_out.onnx",scope=""):
  function fc_activations_to_canvas (line 129) | def fc_activations_to_canvas(m,act,scale=8,padding=1,width=32,idx=0):
  function get_activation_scaling (line 176) | def get_activation_scaling(model,act):
  function conv_activations_to_canvas (line 186) | def conv_activations_to_canvas(model,act,scale=1,padding=1,width=8,idx=0...
  function MakeVideo (line 262) | def MakeVideo(m,fps=60.0,skip=1,video_fn='./tmp.mp4'):
  function load_clip_from_cache (line 272) | def load_clip_from_cache(algo,env,run_id,tag="final",video_cache="."):
  function movie_grid (line 279) | def movie_grid(clip_dict,x_labels,y_labels,grid_sz_x,grid_sz_y,label_pad...
  function rollout_grid (line 325) | def rollout_grid(env,algos,run_ids,tag='final',clip_resize=0.5,label_fon...

FILE: dimensionality_reduction/process.py
  function main (line 24) | def main(path_to_config, download):

FILE: dimensionality_reduction/process_helper.py
  function download_data (line 23) | def download_data(data):
  function assemble_data (line 43) | def assemble_data(data, ext=".", dict_key='ram'):
  function reduce_dim (line 87) | def reduce_dim(X, method):
  function disassemble (line 130) | def disassemble(X, files, dims, dr_method, dict_key='ram'):

FILE: dimensionality_reduction/visualize.py
  function main (line 24) | def main(path_to_config, global_max_min):

FILE: dimensionality_reduction/visualize_helper.py
  function color_index (line 45) | def color_index(fitness, minfit, maxfit):
  function color_list (line 55) | def color_list(color_idx, scores, global_limit=None):
  function checkb_click (line 67) | def checkb_click(label):
  class figure_control (line 81) | class figure_control:
    method __init__ (line 82) | def __init__(self, config, global_max_min):
    method pop_frame (line 143) | def pop_frame(self, algo, rollout, index):
    method onpick (line 154) | def onpick(self, event):

FILE: docs/RGraph/libraries/RGraph.bar.js
  function iterator (line 199) | function iterator()
  function iterator (line 206) | function iterator()
  function myOnresizebeforedraw (line 238) | function myOnresizebeforedraw(obj)

FILE: docs/RGraph/libraries/RGraph.bipolar.js
  function iteratorLeft (line 159) | function iteratorLeft()
  function iteratorRight (line 163) | function iteratorRight()

FILE: docs/RGraph/libraries/RGraph.common.core.js
  function drawUpArrow (line 161) | function drawUpArrow(x,y)
  function drawDownArrow (line 163) | function drawDownArrow(x,y)
  function domtext (line 269) | function domtext()
  function splitstring (line 412) | function splitstring(p)

FILE: docs/RGraph/libraries/RGraph.common.effects.js
  function iterator (line 15) | function iterator()
  function iterator (line 18) | function iterator()
  function iterator (line 22) | function iterator()
  function iterator (line 25) | function iterator()
  function iterator (line 28) | function iterator()
  function iterator (line 31) | function iterator()
  function iterator (line 47) | function iterator()
  function iterator (line 86) | function iterator()

FILE: docs/RGraph/libraries/RGraph.common.key.js
  function DrawKey_graph (line 7) | function DrawKey_graph(obj,key,colors)
  function DrawKey_gutter (line 20) | function DrawKey_gutter(obj,key,colors)

FILE: docs/RGraph/libraries/RGraph.fuel.js
  function iterator (line 57) | function iterator()

FILE: docs/RGraph/libraries/RGraph.gantt.js
  function iterator (line 73) | function iterator()

FILE: docs/RGraph/libraries/RGraph.hbar.js
  function iterator (line 135) | function iterator()
  function iterator (line 157) | function iterator()

FILE: docs/RGraph/libraries/RGraph.hprogress.js
  function iterator (line 79) | function iterator()

FILE: docs/RGraph/libraries/RGraph.line.js
  function Spline (line 193) | function Spline(t,P0,P1,P2,P3)
  function iterator (line 230) | function iterator()
  function iterator (line 233) | function iterator()
  function iterator (line 236) | function iterator()
  function unfoldFromCenter (line 246) | function unfoldFromCenter()

FILE: docs/RGraph/libraries/RGraph.meter.js
  function iterator (line 75) | function iterator()

FILE: docs/RGraph/libraries/RGraph.odo.js
  function iterator (line 71) | function iterator()

FILE: docs/RGraph/libraries/RGraph.pie.js
  function iterator (line 137) | function iterator()

FILE: docs/RGraph/libraries/RGraph.radar.js
  function iterator (line 107) | function iterator()

FILE: docs/RGraph/libraries/RGraph.rose.js
  function iterator (line 102) | function iterator()
  function iterator (line 106) | function iterator()
  function iterator (line 109) | function iterator()
  function iterator (line 112) | function iterator()

FILE: docs/RGraph/libraries/RGraph.scatter.js
  function iterator (line 172) | function iterator()

FILE: docs/RGraph/libraries/RGraph.semicircularprogress.js
  function iterator (line 60) | function iterator()

FILE: docs/RGraph/libraries/RGraph.svg.bar.js
  function iterator (line 72) | function iterator()

FILE: docs/RGraph/libraries/RGraph.svg.bipolar.js
  function iteratorLeft (line 94) | function iteratorLeft()
  function iteratorRight (line 99) | function iteratorRight()

FILE: docs/RGraph/libraries/RGraph.svg.common.core.js
  function isMonth (line 241) | function isMonth(str)

FILE: docs/RGraph/libraries/RGraph.svg.common.fx.js
  function iterator (line 13) | function iterator()
  function iterator (line 16) | function iterator()
  function iterator (line 19) | function iterator()
  function iterator (line 22) | function iterator()
  function iterator (line 25) | function iterator()
  function iterator (line 28) | function iterator()

FILE: docs/RGraph/libraries/RGraph.svg.hbar.js
  function iterator (line 60) | function iterator()

FILE: docs/RGraph/libraries/RGraph.svg.line.js
  function spline (line 64) | function spline(t,P0,P1,P2,P3)

FILE: docs/RGraph/libraries/RGraph.svg.pie.js
  function iterator (line 45) | function iterator()

FILE: docs/RGraph/libraries/RGraph.svg.rose.js
  function iterator (line 76) | function iterator()
  function iterator (line 79) | function iterator()

FILE: docs/RGraph/libraries/RGraph.thermometer.js
  function iterate (line 63) | function iterate()

FILE: docs/RGraph/libraries/RGraph.vprogress.js
  function iterator (line 82) | function iterator()

FILE: docs/RGraph/libraries/RGraph.waterfall.js
  function iterator (line 88) | function iterator()

FILE: docs/js/bootstrap.js
  function transitionEnd (line 34) | function transitionEnd() {
  function removeElement (line 127) | function removeElement() {
  function Plugin (line 143) | function Plugin(option) {
  function Plugin (line 252) | function Plugin(option) {
  function Plugin (line 478) | function Plugin(option) {
  function getTargetFromTrigger (line 705) | function getTargetFromTrigger($trigger) {
  function Plugin (line 717) | function Plugin(option) {
  function getParent (line 784) | function getParent($this) {
  function clearMenus (line 797) | function clearMenus(e) {
  function Plugin (line 890) | function Plugin(option) {
  function Plugin (line 1234) | function Plugin(option, _relatedTarget) {
  function complete (line 1603) | function complete() {
  function Plugin (line 1779) | function Plugin(option) {
  function Plugin (line 1888) | function Plugin(option) {
  function ScrollSpy (line 1931) | function ScrollSpy(element, options) {
  function Plugin (line 2051) | function Plugin(option) {
  function next (line 2160) | function next() {
  function Plugin (line 2206) | function Plugin(option) {
  function Plugin (line 2365) | function Plugin(option) {
Condensed preview — 108 files, each showing path, character count, and a content snippet (2,887K chars total).
[
  {
    "path": ".gitignore",
    "chars": 43,
    "preview": "*.swp\n__pycache__\n.ipynb_checkpoints\n\ndata\n"
  },
  {
    "path": "LICENSE",
    "chars": 569,
    "preview": " Copyright (c) 2018 Uber Technologies, Inc.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not"
  },
  {
    "path": "NOTICE",
    "chars": 761,
    "preview": "Atari Model Zoo includes derived work from Dopamine (https://github.com/google/dopamine) under the Apache License 2.0:\n\n"
  },
  {
    "path": "README.md",
    "chars": 5231,
    "preview": "# Atari Zoo\n\nThe aim of this project is to disseminate deep reinforcement learning agents trained by a variety of algori"
  },
  {
    "path": "atari_zoo/__init__.py",
    "chars": 13583,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/activation_movie.py",
    "chars": 7864,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/atari_wrappers.py",
    "chars": 11356,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/config.py",
    "chars": 1926,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/dopamine_preprocessing.py",
    "chars": 7557,
    "preview": "# Modifications Copyright (c) 2018 Uber Technologies, Inc.\n\n# Copyright 2018 The Dopamine Authors.\n#\n# Licensed under th"
  },
  {
    "path": "atari_zoo/game_lists/a2c_game_list",
    "chars": 1415,
    "preview": "AirRaidNoFrameskip-v4\nAlienNoFrameskip-v4\nAmidarNoFrameskip-v4\nAssaultNoFrameskip-v4\nAsterixNoFrameskip-v4\nAsteroidsNoFr"
  },
  {
    "path": "atari_zoo/game_lists/apex_game_list",
    "chars": 548,
    "preview": "alien\namidar\nassault\nasterix\nasteroids\natlantis\nbank_heist\nbattle_zone\nbeam_rider\nberzerk\nbowling\nboxing\nbreakout\ncentip"
  },
  {
    "path": "atari_zoo/game_lists/dopamine_game_list",
    "chars": 575,
    "preview": "AirRaid\nAlien\nAmidar\nAssault\nAsterix\nAsteroids\nAtlantis\nBankHeist\nBattleZone\nBeamRider\nBerzerk\nBowling\nBoxing\nBreakout\nC"
  },
  {
    "path": "atari_zoo/log.py",
    "chars": 4266,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/model_maker.py",
    "chars": 15893,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/rollout.py",
    "chars": 9022,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/scores.py",
    "chars": 3355,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/synthetic_inputs.py",
    "chars": 9515,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/top_patches.py",
    "chars": 10409,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/translate.py",
    "chars": 2102,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "atari_zoo/utils.py",
    "chars": 10913,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "colab/AtariZooColabDemo.ipynb",
    "chars": 210862,
    "preview": "{\n  \"nbformat\": 4,\n  \"nbformat_minor\": 0,\n  \"metadata\": {\n    \"colab\": {\n      \"name\": \"AtariZooColabDemo.ipynb\",\n      "
  },
  {
    "path": "dimensionality_reduction/README.md",
    "chars": 432,
    "preview": "Install requirements:\n\n```\npip install click\npip install matplotlib==2.0.2\npip install mpldatacursor\n```\n\nFirst download"
  },
  {
    "path": "dimensionality_reduction/process.py",
    "chars": 1574,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "dimensionality_reduction/process_helper.py",
    "chars": 5553,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "dimensionality_reduction/ram_reduce.json",
    "chars": 380,
    "preview": "{\n  \"data\": {\n    \"path\": \"/Users//data/rollout2\",\n    \"game\": \"SeaquestNoFrameskip-v4\",\n    \"key\": \"ram\",\n    \"algos\": "
  },
  {
    "path": "dimensionality_reduction/representation_reduce.json",
    "chars": 271,
    "preview": "{\n  \"data\": {\n    \"path\": \"/Users/ailabs/data/rollout\",\n    \"game\": \"FrostbiteNoFrameskip-v4\",\n    \"key\": \"representatio"
  },
  {
    "path": "dimensionality_reduction/visualize.py",
    "chars": 1287,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "dimensionality_reduction/visualize_helper.py",
    "chars": 5793,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "dimensionality_reduction/viz_ram_2d.json",
    "chars": 315,
    "preview": "{\n  \"data\": {\n    \"path\": \"/data/rollout2\",\n    \"game\": \"SeaquestNoFrameskip-v4\",\n    \"key\": \"ram_2d\",\n    \"algos\": {\n  "
  },
  {
    "path": "dimensionality_reduction/viz_representation_2d.json",
    "chars": 207,
    "preview": "{\n  \"data\": {\n    \"path\": \"/Users//data/rollout\",\n    \"game\": \"SeaquestNoFrameskip-v4\",\n    \"key\": \"representation_2d\",\n"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.bar.js",
    "chars": 64272,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Bar=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='object'"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.bipolar.js",
    "chars": 46133,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Bipolar=function(conf)\n{if(typeof conf==='object'&&typeof conf.left==='obj"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.annotate.js",
    "chars": 6049,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.ann"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.context.js",
    "chars": 13426,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.con"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.core.js",
    "chars": 73477,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.Hig"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.csv.js",
    "chars": 3271,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};if(!RGraph.AJAX)RGraph.AJAX=function(url,callback)\n{if(window.XMLHttpRequest){var"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.deprecated.js",
    "chars": 4508,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.tex"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.dynamic.js",
    "chars": 19459,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.ins"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.effects.js",
    "chars": 28571,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Effects=RGraph.Effects||{};RGraph.Effects.Common={};(function(win,doc,unde"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.key.js",
    "chars": 12275,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.HTML=RGraph.HTML||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=naviga"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.resizing.js",
    "chars": 11517,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math,active"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.sheets.js",
    "chars": 3591,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{RGraph.Sheets=function(key)\n{var worksheet,callback"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.tooltips.js",
    "chars": 5753,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.too"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.common.zoom.js",
    "chars": 4058,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigator.userAgent,ma=Math;RG.zoo"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.cornergauge.js",
    "chars": 17099,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.CornerGauge=function(conf)\n{if(typeof conf==='object'&&typeof conf.min==='"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.background.js",
    "chars": 6892,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Background=function(conf)\n{if(ty"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.circle.js",
    "chars": 5502,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Circle=function(conf)\n{if(typeof"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.image.js",
    "chars": 8683,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Image=function(conf)\n{if(typeof "
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.marker1.js",
    "chars": 7226,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Marker1=function(conf)\n{if(typeo"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.marker2.js",
    "chars": 6828,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Marker2=function(conf)\n{if(typeo"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.marker3.js",
    "chars": 5189,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Marker3=function(conf)\n{if(typeo"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.poly.js",
    "chars": 5710,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Poly=function(conf)\n{if(typeof c"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.rect.js",
    "chars": 5549,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Rect=function(conf)\n{if(typeof c"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.text.js",
    "chars": 7057,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.Text=function(conf)\n{if(typeof c"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.xaxis.js",
    "chars": 9517,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.XAxis=function(conf)\n{if(typeof "
  },
  {
    "path": "docs/RGraph/libraries/RGraph.drawing.yaxis.js",
    "chars": 9812,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Drawing=RGraph.Drawing||{};RGraph.Drawing.YAxis=function(conf)\n{if(typeof "
  },
  {
    "path": "docs/RGraph/libraries/RGraph.fuel.js",
    "chars": 13519,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Fuel=function(conf)\n{if(typeof conf==='object'&&typeof conf.id==='string')"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.funnel.js",
    "chars": 12920,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Funnel=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='obje"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.gantt.js",
    "chars": 20328,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Gantt=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='objec"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.gauge.js",
    "chars": 24227,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Gauge=function(conf)\n{if(typeof conf==='object'&&typeof conf.id==='string'"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.hbar.js",
    "chars": 43062,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.HBar=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='object"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.hprogress.js",
    "chars": 20069,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.HProgress=function(conf)\n{if(typeof conf==='object'&&typeof conf.value!=='"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.line.js",
    "chars": 68881,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Line=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='object"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.meter.js",
    "chars": 23711,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Meter=function(conf)\n{if(typeof conf==='object'&&typeof conf.value!=='unde"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.modaldialog.js",
    "chars": 5592,
    "preview": "\nModalDialog={dialog:null,background:null,offset:50,events:[],Show:function(id,width)\n{ModalDialog.id=id;ModalDialog.wid"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.odo.js",
    "chars": 21578,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Odometer=function(conf)\n{if(typeof conf==='object'&&typeof conf.value!=='u"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.pie.js",
    "chars": 34231,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Pie=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='object'"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.radar.js",
    "chars": 29343,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Radar=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='objec"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.rose.js",
    "chars": 30232,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Effects=RGraph.Effects||{};RGraph.Effects.Rose=RGraph.Effects.Rose||{};RGr"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.rscatter.js",
    "chars": 22929,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.RScatter=RGraph.Rscatter=function(conf)\n{if(typeof conf==='object'&&typeof"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.scatter.js",
    "chars": 48929,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Scatter=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='obj"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.semicircularprogress.js",
    "chars": 15019,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SemiCircularProgress=function(conf)\n{if(typeof conf==='object'&&typeof con"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.bar.js",
    "chars": 31729,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.bipolar.js",
    "chars": 44550,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.ajax.js",
    "chars": 2118,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true,isRGraphSVG:true};RGraph.SVG=RGraph.SVG||{};RGraph.SVG.AJAX=RGraph.SVG.AJAX||{};(f"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.core.js",
    "chars": 59562,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true,isRGraphSVG:true};RGraph.SVG=RGraph.SVG||{};RGraph.SVG.FX=RGraph.SVG.FX||{};(funct"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.csv.js",
    "chars": 3265,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};if(!RGraph.SVG.AJAX)RGraph.SVG.AJAX=function(url,callba"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.fx.js",
    "chars": 22482,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};RGraph.SVG.FX=RGraph.SVG.FX||{};(function(win,doc,undef"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.key.js",
    "chars": 5337,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};RGraph.SVG.HTML=RGraph.SVG.HTML||{};(function(win,doc,u"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.sheets.js",
    "chars": 3591,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};(function(win,doc,undefined)\n{RGraph.Sheets=function(key)\n{var worksheet,callback"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.common.tooltips.js",
    "chars": 3238,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true,isRGraphSVG:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=R"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.funnel.js",
    "chars": 9461,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.gauge.js",
    "chars": 16034,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.hbar.js",
    "chars": 21753,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.line.js",
    "chars": 26789,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.pie.js",
    "chars": 17019,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.radar.js",
    "chars": 17288,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.rose.js",
    "chars": 25525,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.scatter.js",
    "chars": 20056,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.semicircularprogress.js",
    "chars": 12623,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.svg.waterfall.js",
    "chars": 14850,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.SVG=RGraph.SVG||{};(function(win,doc,undefined)\n{var RG=RGraph,ua=navigato"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.thermometer.js",
    "chars": 14383,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Thermometer=function(conf)\n{if(typeof conf==='object'&&typeof conf.id==='s"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.vprogress.js",
    "chars": 21056,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.VProgress=function(conf)\n{if(typeof conf==='object'&&typeof conf.id==='str"
  },
  {
    "path": "docs/RGraph/libraries/RGraph.waterfall.js",
    "chars": 25590,
    "preview": "\nRGraph=window.RGraph||{isRGraph:true};RGraph.Waterfall=function(conf)\n{if(typeof conf==='object'&&typeof conf.data==='o"
  },
  {
    "path": "docs/css/bootstrap-theme.css",
    "chars": 25682,
    "preview": "/*!\n * Bootstrap v3.4.0 (https://getbootstrap.com/)\n * Copyright 2011-2018 Twitter, Inc.\n * Licensed under MIT (https://"
  },
  {
    "path": "docs/css/bootstrap.css",
    "chars": 145933,
    "preview": "/*!\n * Bootstrap v3.4.0 (https://getbootstrap.com/)\n * Copyright 2011-2018 Twitter, Inc.\n * Licensed under MIT (https://"
  },
  {
    "path": "docs/js/bootstrap.js",
    "chars": 70815,
    "preview": "/*!\n * Bootstrap v3.4.0 (https://getbootstrap.com/)\n * Copyright 2011-2018 Twitter, Inc.\n * Licensed under the MIT licen"
  },
  {
    "path": "docs/js/npm.js",
    "chars": 484,
    "preview": "// This file is autogenerated via the `commonjs` Grunt task. You can require() this file in a CommonJS environment.\nrequ"
  },
  {
    "path": "docs/video.html",
    "chars": 7181,
    "preview": "<!DOCTYPE html> \n<html> \n\t<head>\n\n<!-- Latest compiled and minified CSS -->\n<link rel=\"stylesheet\" href=\"css/bootstrap.m"
  },
  {
    "path": "docs/video2.html",
    "chars": 19681,
    "preview": "<!DOCTYPE html >\n<html>\n<head>\n<!-- Latest compiled and minified CSS -->\n<link rel=\"stylesheet\" href=\"css/bootstrap.min."
  },
  {
    "path": "examples/classify_state.py",
    "chars": 4544,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "examples/demo.py",
    "chars": 1176,
    "preview": "# Copyright (c) 2018 Uber Technologies, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you m"
  },
  {
    "path": "notebooks/Basic visualization.ipynb",
    "chars": 403083,
    "preview": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n "
  },
  {
    "path": "notebooks/Filter Analysis.ipynb",
    "chars": 479660,
    "preview": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n "
  },
  {
    "path": "notebooks/Training log visualization.ipynb",
    "chars": 95267,
    "preview": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n "
  },
  {
    "path": "requirements.txt",
    "chars": 51,
    "preview": "tensorflow-gpu\nmatplotlib\nmoviepy\ngym\nlucid\npandas\n"
  },
  {
    "path": "setup.py",
    "chars": 206,
    "preview": "from distutils.core import setup\n\nsetup(\n        name='AtariZoo',\n        version='0.1dev',\n        packages=['atari_zoo"
  }
]

// ... and 1 more file (download for full content)

About this extraction

This page contains the full source code of the uber-research/atari-model-zoo GitHub repository, extracted and formatted as plain text for AI agents and large language models (LLMs). The extraction includes 108 files (28.3 MB), approximately 715.7k tokens, and a symbol index with 231 extracted functions, classes, methods, constants, and types. Use this with OpenClaw, Claude, ChatGPT, Cursor, Windsurf, or any other AI tool that accepts text input.

Extracted by GitExtract — free GitHub repo to text converter for AI. Built by Nikandr Surkov.